The new "Rinzler" big data archiving and indexing system (written purely in Go) was just tested on a 64-core machine and can encode and index a whopping 500,000 records per second. It takes JSON data, indexes key fields, compresses with zstd, and applies ... #bigdata #datascience
Reed-Solomon redundant data packets for each row. Even with redundancy, each row of data is compressed to around 25-30% of its original size. The goal is to reach one million encodings per second.
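The per-row pipeline described above (parse JSON, index a key field, compress, add redundancy) can be sketched in Go. This is a hypothetical illustration, not Rinzler's actual code: the `record` schema and `encodeRow` function are invented for the example, stdlib `flate` stands in for zstd (which needs a third-party package), and a toy XOR parity byte stands in for real Reed-Solomon packets.

```go
package main

import (
	"bytes"
	"compress/flate"
	"encoding/json"
	"fmt"
)

// record is a hypothetical row shape; the real schema is not described in the post.
type record struct {
	ID   string `json:"id"`
	Name string `json:"name"`
	Data string `json:"data"`
}

// encodeRow sketches the described pipeline: parse the JSON row, pull out an
// index key, compress the payload, and attach redundancy. flate stands in
// for zstd, and a single XOR parity byte stands in for Reed-Solomon parity
// shards, since neither is in the Go standard library.
func encodeRow(raw []byte) (indexKey string, packet []byte, parity byte, err error) {
	var r record
	if err = json.Unmarshal(raw, &r); err != nil {
		return
	}
	indexKey = r.ID // index a key field

	// compress the row (zstd in the real system; flate here as a stand-in)
	var buf bytes.Buffer
	w, werr := flate.NewWriter(&buf, flate.BestSpeed)
	if werr != nil {
		err = werr
		return
	}
	w.Write(raw)
	w.Close()
	packet = buf.Bytes()

	// toy redundancy: XOR parity over the compressed packet
	// (the real system generates Reed-Solomon parity packets per row)
	for _, b := range packet {
		parity ^= b
	}
	return
}

func main() {
	row := []byte(`{"id":"row-1","name":"alpha","data":"payload"}`)
	key, pkt, par, err := encodeRow(row)
	if err != nil {
		panic(err)
	}
	fmt.Printf("key=%s compressed=%d bytes parity=%#x\n", key, len(pkt), par)
}
```

In a real encoder the parity step would split the compressed packet into data shards and compute parity shards (e.g. with a Reed-Solomon library), so any few lost shards can be reconstructed.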