
> Too bad onions was compromised by the NSA. That makes speaking up against the Bad Guys a bit more dangerous.

Any references you would want to provide for this claim?




If you access an http site, a government controlled exit node could inject js and eventually gather enough info through profiling mouse movements and browsing habits to ID you.

Even with HTTPS, if the feds are in cahoots with the certificate authorities, you'd be just as vulnerable to this sort of injection, right?

However, Tor hidden services are another story. I think this is where a bad actor would hit a wall.


I keep toying with the idea of building a system that allows services to authenticate with you. It's not going to be useful for the general population, but for people who have a clue it would help detect CA compromises. Theoretically it's not so hard: you send the service a key when you first start using it, and the service signs challenges to prove that it holds the key. Now an attacker must both hack the CA and steal the private key to impersonate the service. I tried to figure out a way to do it with a plugin, but unfortunately it looks like it requires modifications to the browser to work. Thinking pragmatically, I suspect it would never make it into a mainstream browser (for the same reasons that things like Persona never did).
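The challenge-response idea above can be sketched in a few lines. The comment describes public-key signatures; for a self-contained stdlib sketch, this version uses an HMAC over a shared key established on first use (trust-on-first-use), which gives the same property: an attacker who only compromises a CA still can't answer challenges without the enrolled key. All function names here are illustrative, not from any real browser API.

```python
import hmac
import hashlib
import secrets

def enroll():
    """First use: generate a key and share it with the service (TOFU)."""
    return secrets.token_bytes(32)

def sign_challenge(key, challenge):
    """Service side: prove possession of the enrolled key."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify(key, challenge, response):
    """Client side: a CA-level MITM without the enrolled key
    cannot produce a valid response to a fresh challenge."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

key = enroll()
challenge = secrets.token_bytes(16)        # fresh random challenge per login
response = sign_challenge(key, challenge)
assert verify(key, challenge, response)
```

A fresh random challenge per session is what prevents replay; with signatures instead of an HMAC, the service would never need to learn a client secret at all.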


You're probably at more risk of a criminal-controlled exit node stealing your login details than you are of a government-controlled exit node working out your physical location.


Yes, anonymity and security are separate but related issues.


> If you access an http site, a government controlled exit node could inject js and eventually gather enough info through profiling mouse movements and browsing habits to ID you.

That's why Tor Browser comes with NoScript.


And why it's so annoying that it nonetheless sets it to ALLOW JS by default.


> If you access an http site, a government controlled exit node could inject js and eventually gather enough info through profiling mouse movements and browsing habits to ID you.

It's impossible for Tor to magically encrypt the whole Internet. With any network that lets you access the clearnet, the exit point will see the plaintext if the website you're communicating with doesn't use HTTPS. You're criticizing Tor for something that's impossible to solve; if you think it is possible, please open a ticket at https://trac.torproject.org/ outlining the solution. (Note that the Tor network is scanned to detect bad exits, and the directory authorities mark them with a BadExit flag, though such relays can still be used in other circuit positions, e.g. for onion services.)

Also, HTTPS usage is now around 70% according to Firefox telemetry, and onion services are end-to-end encrypted.

> If you access an http site, a government controlled exit node could inject js and eventually gather enough info through profiling mouse movements and browsing habits to ID you.

Do you realize that Tor Browser ships loads of patches to Firefox that minimize the amount of entropy leaked by your browser fingerprint? To be specific, from the design document:[1]

> Timing-based Side Channels

> Attacks based on timing side channels are nothing new in the browser context. Cache-based, cross-site timing, and pixel stealing, to name just a few, got investigated in the past. While their fingerprinting potential varies, all timing-based attacks have in common that they need sufficiently fine-grained clocks.

> Design Goal: Websites MUST NOT be able to fingerprint a Tor Browser user by exploiting timing-based side channels.

> Implementation Status: The cleanest solution to timing-based side channels would be to get rid of them. This has been proposed in the research community. However, we remain skeptical as it does not seem to be trivial even considering just a single side channel and more and more potential side channels are showing up. Thus, we rely on disabling all possible timing sources or making them coarse-grained enough in order to render timing side channels unsuitable as a means for fingerprinting browser users.

> We set dom.enable_user_timing and dom.enable_resource_timing to false to disable these explicit timing sources. Furthermore, we clamp the resolution of explicit clocks to 100ms with two Firefox patches. This includes performance.now(), new Date().getTime(), audioContext.currentTime, canvasStream.currentTime, video.currentTime, audio.currentTime, new File([], "").lastModified, new File([], "").lastModifiedDate.getTime(), animation.startTime, animation.currentTime, animation.timeline.currentTime, and document.timeline.currentTime.

> While clamping the clock resolution to 100ms is a step towards neutering the timing-based side channel fingerprinting, it is by no means sufficient. It turns out that it is possible to subvert our clamping of explicit clocks by using implicit ones, e.g. extrapolating the true time by running a busy loop with a predictable operation in it. We are tracking this problem in our bug tracker and are working with the research community and Mozilla to develop and test a proper solution to this part of our defense against timing-based side channel fingerprinting risks.

[1]: https://www.torproject.org/projects/torbrowser/design/
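The 100ms clamping described in the quoted design doc is conceptually simple; here's a minimal sketch (in Python rather than the browser's C++, with `clamp` as an illustrative name, not a real Firefox function):

```python
import math

RESOLUTION_MS = 100  # Tor Browser clamps explicit clocks to 100ms

def clamp(t_ms, resolution_ms=RESOLUTION_MS):
    # Round a high-resolution timestamp down to the clamp resolution,
    # so e.g. performance.now() can no longer act as a fine-grained timer.
    return math.floor(t_ms / resolution_ms) * resolution_ms

print(clamp(1234.567))  # -> 1200
```

As the quote notes, this alone isn't sufficient: a script can still build an implicit clock by counting iterations of a busy loop between two clamped readings, which is why the coarser resolution is only one part of the defense.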


> You're criticizing Tor for something that's impossible to solve. If you think that's possible then please open a ticket

Not criticizing Tor, just pointing out a couple of ways in which a Tor user could give themselves up by doing stupid things while using it.

Interesting stuff on timing-based attacks. Thanks.


I'd be interested too. I'm aware of the Yasha Levine theory, and complaints about JS in Firefox ESR, but not much else.


Yasha Levine just released a book which is supposed to provide a lot of documentation for his claims. It sounds compelling from a few interviews I listened to, but I'm not that far into the book.



