Hacker News

> If you access an http site, a government controlled exit node could inject js and eventually gather enough info through profiling mouse movements and browsing habits to ID you.

It's impossible for Tor to magically encrypt the whole Internet. With any network that lets you access the clearnet, the exit will see the plaintext if the website you're communicating with doesn't use HTTPS. You're criticizing Tor for something that's impossible to solve. If you think it is possible, then please open a ticket at https://trac.torproject.org/ outlining the solution. (Note that the Tor network is scanned to detect bad exits, and the directory authorities mark them with the BadExit flag, but they're still used for onion services and such.)

Also, HTTPS usage is now around 70% according to Firefox telemetry, and onion services are end-to-end encrypted.

> If you access an http site, a government controlled exit node could inject js and eventually gather enough info through profiling mouse movements and browsing habits to ID you.

Do you realize that the Tor Browser comes with loads of patches to Firefox that seek to minimize the amount of entropy leaked by your browser fingerprint? To be specific,[1]

> Timing-based Side Channels

> Attacks based on timing side channels are nothing new in the browser context. Cache-based, cross-site timing, and pixel stealing, to name just a few, got investigated in the past. While their fingerprinting potential varies, all timing-based attacks have in common that they need sufficiently fine-grained clocks.

> Design Goal: Websites MUST NOT be able to fingerprint a Tor Browser user by exploiting timing-based side channels.

> Implementation Status: The cleanest solution to timing-based side channels would be to get rid of them. This has been proposed in the research community. However, we remain skeptical as it does not seem to be trivial even considering just a single side channel and more and more potential side channels are showing up. Thus, we rely on disabling all possible timing sources or making them coarse-grained enough in order to render timing side channels unsuitable as a means for fingerprinting browser users.

> We set dom.enable_user_timing and dom.enable_resource_timing to false to disable these explicit timing sources. Furthermore, we clamp the resolution of explicit clocks to 100ms with two Firefox patches. This includes performance.now(), new Date().getTime(), audioContext.currentTime, canvasStream.currentTime, video.currentTime, audio.currentTime, new File([], "").lastModified, new File([], "").lastModifiedDate.getTime(), animation.startTime, animation.currentTime, animation.timeline.currentTime, and document.timeline.currentTime.
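To make the clamping concrete, here is a minimal sketch of rounding a high-resolution timestamp down to a 100ms grid, in the spirit of the patches described above. The helper name `clampTime` is illustrative, not Tor Browser's actual code.

```javascript
// Sketch: clamp a high-resolution timestamp to a coarse 100 ms grid.
// clampTime is a hypothetical helper, not the real Firefox patch.
const RESOLUTION_MS = 100;

function clampTime(ms) {
  // Round down to the nearest multiple of the resolution, so all
  // timestamps within one 100 ms window are indistinguishable.
  return Math.floor(ms / RESOLUTION_MS) * RESOLUTION_MS;
}

// A wrapped performance.now() would return clampTime(realNow):
console.log(clampTime(1234.567)); // → 1200
console.log(clampTime(1299.999)); // → 1200
```

Every explicit clock listed above gets this treatment, so a script sampling any of them can't tell two events apart unless they land in different 100ms windows.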

> While clamping the clock resolution to 100ms is a step towards neutering the timing-based side channel fingerprinting, it is by no means sufficient. It turns out that it is possible to subvert our clamping of explicit clocks by using implicit ones, e.g. extrapolating the true time by running a busy loop with a predictable operation in it. We are tracking this problem in our bug tracker and are working with the research community and Mozilla to develop and test a proper solution to this part of our defense against timing-based side channel fingerprinting risks.
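The implicit-clock subversion mentioned above can be sketched as follows: even with every explicit clock clamped, a script can count iterations of a predictable busy loop between two ticks of the coarse clock and use that count as a much finer timer. All names here are illustrative; `makeCoarseClock` just simulates a clamped clock for the demo.

```javascript
// Sketch of an "implicit clock": count busy-loop iterations between
// edges of a coarse (clamped) clock to recover fine-grained timing.
// makeCoarseClock simulates a clock clamped to the given resolution.
function makeCoarseClock(resolutionMs) {
  return () => Math.floor(Date.now() / resolutionMs) * resolutionMs;
}

function estimateIterationsPerTick(coarseNow) {
  // Spin until the coarse clock ticks, so we start at a tick edge,
  // then count loop iterations until it ticks again: that count is
  // a fine-grained timer despite the 100 ms clamping.
  const start = coarseNow();
  while (coarseNow() === start) {}   // align to a tick edge
  const edge = coarseNow();
  let iterations = 0;
  while (coarseNow() === edge) iterations++;
  return iterations;                 // iterations per 100 ms window
}

const coarseNow = makeCoarseClock(100);
const perTick = estimateIterationsPerTick(coarseNow);
// perTick / 100 estimates iterations per millisecond, i.e. a clock
// far finer than the clamped 100 ms resolution.
```

This is why the design doc calls clamping alone insufficient: the defense has to address implicit timing sources too, not just the explicit APIs.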

[1] : https://www.torproject.org/projects/torbrowser/design/



> You're criticizing Tor for something that's impossible to solve. If you think that's possible then please open a ticket

Not criticizing Tor, just pointing out a couple of ways a Tor user could give themselves up by doing stupid things while using it.

Interesting stuff on timing-based attacks. Thanks.



