
A slow-to-compress format that does a good job sounds great for physical media that gets distributed, but bad where the things being compressed are millions of individual images uploaded by users: there you want compression fast enough that you can start returning the compressed image to the user within milliseconds.

That sounds to me like what happened to it, although I can't say for sure. Did it also have other properties that would make running lots of concurrent fractal-compression processes on a server a problem?




I don't know - the machine I played with it on was a 16 MHz 386, and the IFS software wasn't optimized. But it was far slower to compress than DCT-based JPEG.

The odd thing about it, though, was that it could upscale smoothly. A 320x200 source could be decompressed at 640x400, and it would sort of intelligently "fill in the gaps" on textures etc. (see the sketch below for why).
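For anyone curious what that looks like mechanically, here's a minimal sketch of a PIFS-style decoder with a made-up toy transform format (not the format any of the commercial IFS products used): each transform copies a downsampled, contrast/brightness-adjusted domain block onto a range block, and decoding just iterates that from an arbitrary starting image. Because the transforms are defined over block coordinates rather than pixels, you can decode the same transforms at double the size by scaling everything up, which is where the "filled-in" detail comes from:

    import numpy as np

    def decode(transforms, size, block=8, iters=10, scale=1):
        # scale=2 decodes the same transforms at double resolution
        b = block * scale
        img = np.zeros((size * scale, size * scale))
        for _ in range(iters):
            out = np.zeros_like(img)
            for (dy, dx, ry, rx, s, o) in transforms:
                # domain block is 2x the range block; average-pool it down
                dom = img[dy*scale:dy*scale + 2*b, dx*scale:dx*scale + 2*b]
                dom = dom.reshape(b, 2, b, 2).mean(axis=(1, 3))
                out[ry*scale:ry*scale + b, rx*scale:rx*scale + b] = s * dom + o
            img = out
        return img

(The iteration only converges if the transforms are contractive, |s| < 1; real codecs also search over rotations and flips of the domain block, omitted here.)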


A comparison of fractal compression and JPEG here: https://dl.acm.org/doi/fullHtml/10.5555/364682.364685


Slow to compress in the 1990s is probably not that slow in 2024.

And you can always do fast initial compression on images needed immediately, and run background compression to replace them with smaller versions.
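A minimal sketch of that two-pass idea, using Pillow's JPEG encoder at different quality settings as a stand-in for both the fast codec and the slow one (the function names and file-naming scheme are made up for illustration):

    from concurrent.futures import ThreadPoolExecutor
    from PIL import Image

    pool = ThreadPoolExecutor(max_workers=2)

    def handle_upload(path):
        # Encode quickly and return this version to the user right away.
        img = Image.open(path).convert("RGB")
        fast = path + ".fast.jpg"
        img.save(fast, "JPEG", quality=85)
        # Queue a slower, better-compressing pass for later.
        pool.submit(recompress, path)
        return fast

    def recompress(path):
        # Stand-in for the slow codec (e.g. fractal compression);
        # swap the result in for the fast version once it's done.
        img = Image.open(path).convert("RGB")
        img.save(path + ".small.jpg", "JPEG", quality=60, optimize=True)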


Hmm, yeah, I'd never thought about that (the second part).

But "slow to compress" is always relative to "fast to compress", and "fast" is tied to user expectations of speed, which are calibrated to your competitors' speed. So if something is slow to compress, it will probably stay relatively slow even as its absolute speed increases.



