Summer project: “PageRank” my antilibrary to surface the top books I should actually read. For each pair of books, I will score the strength of the link using a sort of Hebbian heuristic of whether they evoke correlated unconscious intentions. Then cluster, label, rank...
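If I ever get as far as an affinity matrix, the ranking step itself is just power iteration over the normalized graph. A minimal Python sketch, assuming the pairwise scores sit in a symmetric numpy array called affinity (the name and the 0-5 scale are placeholders, nothing here is settled):

    import numpy as np

    def pagerank(affinity, damping=0.85, iters=100, tol=1e-9):
        """Power iteration over a symmetric, non-negative affinity matrix."""
        n = affinity.shape[0]
        # Column-normalize so each book spreads its "vote" across its neighbours.
        col_sums = affinity.sum(axis=0)
        col_sums[col_sums == 0] = 1.0          # isolated books keep a uniform vote
        transition = affinity / col_sums
        rank = np.full(n, 1.0 / n)
        for _ in range(iters):
            new_rank = (1 - damping) / n + damping * transition @ rank
            if np.abs(new_rank - rank).sum() < tol:
                break
            rank = new_rank
        return rank

    # titles[i] corresponds to row/column i of the affinity matrix
    # top = sorted(zip(titles, pagerank(affinity)), key=lambda t: -t[1])[:10]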
Shit why didn’t I think of this 20y ago 🤬
I’d guess I have maybe 300 books. That’s 300 × 299 / 2 = 44,850 pairwise scorings. At 10s a pair that works out to about 124 hours of tedium. Plus I’d need to enter all my books and write a little program to present them in all possible pairs.
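The little program is the easy part. A rough sketch, assuming one title per line in a books.txt and scores appended to a scores.csv (both filenames and the 0-5 scale are made up here):

    import csv
    import itertools
    import random

    # One title per line; file name is a placeholder.
    with open("books.txt") as f:
        books = [line.strip() for line in f if line.strip()]

    pairs = list(itertools.combinations(books, 2))   # 300 books -> 44,850 pairs
    random.shuffle(pairs)                            # avoid fatigue bias from alphabetical order

    with open("scores.csv", "a", newline="") as out:
        writer = csv.writer(out)
        for a, b in pairs:
            score = input(f"{a}  <->  {b}  [0-5, q to quit]: ").strip()
            if score.lower() == "q":
                break
            writer.writerow([a, b, score])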
Still feels like it would be a worthwhile effort. Antilibraries are neural metadata.
Hmm. Algorithm: use previous scores of a book to auto-score other links? There’s got to be a way to simplify this.
If there are 100 books and if book 1’s vector of 99 affinities is matched by book 2’s vector for say the first 20, could we auto-fill the rest to match? Maybe manually score x% of the books, then estimate the rest via interpolation among most similar vectors based on partials? 🤔
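One way to make the autofill concrete: treat each book’s row of affinities as a partially observed vector, measure similarity on the entries two rows share, and fill each hole from the most similar rows. A plain k-nearest-neighbours imputation sketch, just one possible estimator; the argument names and k=5 are placeholders:

    import numpy as np

    def impute_affinities(partial, k=5):
        """Fill NaN entries of a books x books affinity matrix from similar rows.

        partial: square array with manual scores filled in and np.nan elsewhere.
        """
        filled = partial.copy()
        n = partial.shape[0]
        for i in range(n):
            missing = np.isnan(partial[i])
            if not missing.any():
                continue
            # Similarity to every other book = negative mean absolute disagreement
            # on the columns both rows have been scored on.
            sims = np.full(n, -np.inf)
            for j in range(n):
                if j == i:
                    continue
                overlap = ~np.isnan(partial[i]) & ~np.isnan(partial[j])
                if overlap.sum() >= 3:           # need some shared evidence
                    sims[j] = -np.mean(np.abs(partial[i, overlap] - partial[j, overlap]))
            neighbours = np.argsort(sims)[::-1][:k]
            for col in np.where(missing)[0]:
                donor_vals = [partial[j, col] for j in neighbours
                              if np.isfinite(sims[j]) and not np.isnan(partial[j, col])]
                if donor_vals:
                    filled[i, col] = np.mean(donor_vals)
        return filled

With something like this, manually scoring a random x% of pairs and letting the rest be estimated becomes a dial between effort and accuracy.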
There’s a product idea here. Step 1: take pictures of the bookshelf. Step 2: MTurk HITs to turn the photos into a spreadsheet. Step 3: estimate the affinity graph via full scoring of a random subset and partial scoring with autocomplete of the rest.

