Hmm... I wonder to what extent a bi-directional linking architecture creates a computationally intractable problem... you've got to have a really rapid reactive process maintaining consistency... even before you run into CAP theorem stuff in larger DBs
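To make that reactive step concrete, here's a minimal sketch assuming a single-process, in-memory store (the `links`/`backlinks` names are made up for illustration, not any real tool's API). Mirroring the two indexes is O(1) per edge locally; the hard part only starts once the forward and backward indexes live on different machines.

```python
# Hypothetical in-memory note store: every forward link must be
# mirrored by a backlink entry, kept in sync on every write.
from collections import defaultdict

links = defaultdict(set)      # note -> notes it links to
backlinks = defaultdict(set)  # note -> notes linking to it

def add_link(src: str, dst: str) -> None:
    """Add src -> dst and immediately mirror the backlink."""
    links[src].add(dst)
    backlinks[dst].add(src)   # the 'reactive' consistency step: O(1) here

def remove_link(src: str, dst: str) -> None:
    """Remove src -> dst and its mirrored backlink together."""
    links[src].discard(dst)
    backlinks[dst].discard(src)
```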
Yeah, it gets expensive fast. At human-scale datasets, though, it's generally fine. Dunno how far sufficiently clever algorithms can take this though
I imagine the right way to approach such problems is to hold the largest possible subset of the graph in memory on a single instance?
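Rough back-of-envelope (all numbers illustrative guesses, not measurements) for why a human-scale graph plausibly fits on one box:

```python
# Hypothetical sizes for a 'human-scale' knowledge graph held in RAM.
notes = 100_000          # nodes
avg_links = 20           # edges per node
bytes_per_edge = 100     # id pair + set/dict overhead (rough guess)

# Forward links plus mirrored backlinks, hence the factor of 2.
total_bytes = notes * avg_links * bytes_per_edge * 2
print(f"~{total_bytes / 1e9:.1f} GB")  # ~0.4 GB: comfortably one instance
```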
Consistency is probably not crucial; eventual consistency will be fine. Also, PageRank is a pretty efficient algorithm for ranking the whole web's graph structure
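For reference, the core of PageRank is just power iteration and fits in a few lines; this is a toy version over an adjacency dict, not the web-scale implementation:

```python
# Power-iteration PageRank on a small adjacency dict (every node
# must appear as a key). d is the usual damping factor.
def pagerank(graph: dict[str, list[str]], d: float = 0.85, iters: int = 50):
    nodes = list(graph)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1 - d) / n for v in nodes}  # teleport mass
        for v, outs in graph.items():
            if not outs:                       # dangling node: spread evenly
                for u in nodes:
                    new[u] += d * rank[v] / n
            else:                              # split rank among out-links
                share = d * rank[v] / len(outs)
                for u in outs:
                    new[u] += share
        rank = new
    return rank

print(pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]}))
```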


