It's a shame unpackerr is needed. BitTorrent works better on large uncompressed files.


I suppose Usenet is still in use and distributors can use the same release regardless of medium.


Interesting, why is that?


- Smaller torrent filesize

- No need for clients to juggle forced per-file priorities when there's only one file

- Checksum checking is built in (see the sketch after this list)

- Less work per client [0]

- Redundant decompression required by everyone that grabs the file

[0] - https://blog.codinghorror.com/everybody-loves-bittorrent/
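To expand on the built-in checksums: the v1 metainfo carries a SHA-1 hash for every fixed-size piece, and clients verify each piece against those hashes as it arrives, so an archive's own CRCs add nothing. Here's a minimal sketch of that per-piece verification; the piece length and expected hashes are placeholders for what a real client would read from the .torrent's "piece length" and "pieces" fields.

    import hashlib

    # Hypothetical piece size; real torrents commonly use 256 KiB to a few MiB.
    PIECE_LENGTH = 256 * 1024

    def piece_hashes(path, piece_length=PIECE_LENGTH):
        """Yield the SHA-1 digest of each fixed-size piece of the file."""
        with open(path, "rb") as f:
            while True:
                piece = f.read(piece_length)
                if not piece:
                    break
                yield hashlib.sha1(piece).digest()

    def bad_pieces(path, expected, piece_length=PIECE_LENGTH):
        """Return indices of pieces whose digests don't match the expected list."""
        return [
            i
            for i, digest in enumerate(piece_hashes(path, piece_length))
            if i >= len(expected) or digest != expected[i]
        ]

    # Hypothetical usage: record hashes of an intact copy, then verify later.
    # expected = list(piece_hashes("big-release.mkv"))
    # assert bad_pieces("big-release.mkv", expected) == []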


Why does the torrent treat a compressed vs. an uncompressed file differently? I would have imagined the torrent framework would be agnostic to the actual file encoding.


To clarify, it works better with big files vs. small files. Compression has no impact.
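Here's a minimal sketch of why file count matters while encoding doesn't: a v1 multi-file torrent lays its files end to end into one byte stream and hashes fixed-size pieces of that stream, so the protocol never looks inside the bytes, but with many small files a single piece can straddle several of them. The piece length and file sizes below are made-up numbers chosen just to make the straddling visible.

    # A v1 multi-file torrent concatenates its files into one byte stream and
    # hashes fixed-size pieces of that stream; what the bytes encode is irrelevant.
    PIECE_LENGTH = 64

    files = [  # hypothetical small-file payload, e.g. a split archive
        ("release.r00", 100),
        ("release.r01", 100),
        ("release.r02", 100),
    ]

    # Byte range each file occupies in the concatenated stream.
    offsets, pos = [], 0
    for name, size in files:
        offsets.append((name, pos, pos + size))
        pos += size

    # Which files each piece touches: the middle pieces span two files at once.
    for start in range(0, pos, PIECE_LENGTH):
        end = min(start + PIECE_LENGTH, pos)
        spanned = [name for name, lo, hi in offsets if lo < end and start < hi]
        print(f"piece covering bytes [{start}, {end}) touches {spanned}")

With one big file, every piece maps to exactly one file, which keeps verification and piece scheduling simpler for clients.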


Is that just efficiency (per byte, etc) or would the network actually work better (faster real downloads for larger files vs smaller)? I’m imagining there’s some tradeoff at some point.


From what I've read, the swarm works better on one large file vs. many smaller ones. I'm passing along verbatim what I've read and have no data to substantiate that claim. Hopefully someone with a better understanding of the protocol and client implementations can weigh in.



