Hacker News

Hashes can be memory-efficient because Redis stores small hashes in a compact encoding (a ziplist) when they stay under configurable thresholds (e.g. fewer than 1000 entries, with each entry under 1000 characters, under tuned settings), and the same applies to lists, sets, and zsets.
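The thresholds mentioned above are controlled per data type in redis.conf. A minimal sketch of what a tuned configuration might look like (directive names are real for ziplist-era Redis; newer versions rename ziplist to listpack, and the 1000/1000 values here are illustrative, not defaults):

```
# Use the compact ziplist encoding for hashes with at most 1000
# entries whose fields/values are each at most 1000 bytes.
hash-max-ziplist-entries 1000
hash-max-ziplist-value 1000

# The same idea applies to sorted sets (zsets).
zset-max-ziplist-entries 1000
zset-max-ziplist-value 1000
```

Once a key exceeds either threshold, Redis silently converts it to the full (larger, but O(1)-access) encoding; shrinking it back below the threshold does not re-compress existing keys until they are rewritten.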

The author may have inadvertently activated this compression when he pruned his zsets, which would explain why the savings are so pronounced.



The primary reason for the reduction was going from ~7k items per ZSET to 100, across several thousand ZSETs. That's not to say the built-in compression had no effect, but obviously less data == less memory.
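The kind of trim described above ("keep only the newest 100 entries of each sorted set") is typically done with Redis's ZREMRANGEBYRANK. A minimal sketch, simulating that rank-based trim on a plain Python list standing in for a ZSET (the sizes and the helper name are illustrative, not the author's actual code):

```python
def zremrangebyrank(zset, start, stop):
    """Simulate Redis ZREMRANGEBYRANK: remove members whose rank
    (0 = lowest score) falls in the inclusive range [start, stop].
    Negative ranks count back from the highest score, as in Redis."""
    n = len(zset)
    # Normalize negative ranks the way Redis does.
    if start < 0:
        start = max(n + start, 0)
    if stop < 0:
        stop = n + stop
    if start > stop or start >= n:
        return 0  # nothing to remove
    stop = min(stop, n - 1)
    removed = stop - start + 1
    del zset[start:stop + 1]
    return removed

# A ZSET with ~7k items, scored by insertion order (oldest = lowest score).
feed = list(range(7000))

# Keep only the 100 highest-scoring (newest) items: remove ranks 0
# through -101, i.e. everything except the top 100.
zremrangebyrank(feed, 0, -101)
```

Run against a real server, the equivalent would be `ZREMRANGEBYRANK key 0 -101` applied to each of the several thousand keys; once a zset is at or below the configured ziplist threshold, rewriting it also lets it take the compact encoding.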



