
> Unbeknownst to him at the time, Google Authenticator by default also makes the same codes available in one’s Google account online.

This sounded absolutely crazy to me so I went to open Authenticator on my phone and lo and behold it offered me the option of linking to my account and "backing up my codes in the cloud" to which I declined.

But I had never seen this behavior before, so is this new? It did not seem to be enabled by default in my case.



What's crazy to me is that Google would allow access to a foreign device from a single click. It would be easy for a person to accidentally click it, or for a kid playing on their parent's device to click it when it popped up. I really can't understand why they wouldn't send a code that would have to be entered instead; it would be far less prone to those kinds of problems.


"foreign device" based on IP geolocation is pretty tricky and annoying.

My home in Texas had an IP address which a lot of databases had as supposedly being in Montreal. It was like that for years. Gotta love so many sites trying to default to French.


As a network admin I have found that whitelisting only US address space for my company's IPs drastically reduces how many attacks we get.


As a person who had to deal with clients, I have found whitelisting to only "US address space" lead to lots of clients being unable to access the services until they were whitelisted.

As a person who had to deal with other associates, I also found whitelisting only US address space led to a number of people being unable to connect from their homes.

As a person who had this happen to them, I had quite a lot of frustrations with services insisting they couldn't provide me service because Texas is in Canada apparently.


Of course, before implementing this I log all IPs and verify that we don't have any legitimate traffic coming from non-US IPs, and whitelisting a few IPs isn't a big deal. A medium-sized manufacturing company in the Midwest isn't going to have much need for people connecting to us from outside the US.

I'm actually working to get rid of any public IPs that aren't VPN access points.


> any legitimate traffic coming from non-US IPs.

If it's not actually reaching you to log in and what not, how do you know it's legit or not?

How do you know it's US traffic or not in the end?

I'm not saying it's not something anyone can reasonably do, but I've both been the gatekeeper required to implement/support such a policy and been someone burned by it. It shouldn't be assumed the block lists are actually that good.


This is an argument over the accuracy of georeferencing IP addresses and in my experience it is adequate for my needs.


Je suppose que le Texas est au Québec. [I suppose Texas is in Quebec.]


What was the point of replying in French?


> My home in Texas had an IP address which a lot of databases had as supposedly being in Montreal.

J'ai dû apprendre le français parce que les bases de données géo-IP sont des déchets. [I had to learn French because geo-IP databases are garbage.]

So many sites defaulted to French due to shit geo-ip databases. So many account lockouts because of fears credentials got hijacked due to shit geo-ip databases. So many "sorry this isn't available in your country" messages because of shit geo-ip databases. So many stores defaulting to Canadian dollars because of shit geo-ip databases.

They're so annoying to be on the receiving end of.


How would a code help? The victim has already bought into the social engineering. If the person on the phone asks the user to read out a code, they will. If the person on the phone asks them to enter a code (i.e. the version of this kind of prompt where the user needs to enter a code on the phone matching the one showing on the login page), they will.


Every step you make someone who is being socially engineered jump through is an extra chance for them to realize what is happening, especially if those steps contain warnings.


Google only added this feature recently. I am really conflicted about this feature. Without it you need to either save every TOTP code when you first set up the account or manually disable 2FA on every account and then enable it again so you can enroll it on a new phone. I used it when migrating to my most recent cell phone but then disabled it. Of course you have to trust that Google actually deletes the codes from your account.


Same with me: I had set up MFA using Google Auth for an important account I use.

Next day the phone broke, and I lost that account forever. I had not written the backup codes down anywhere.


Generating and storing your passwords, OTPs, and passkeys in a fully E2EE system like 1Password is effectively a root of trust, although you also have to trust (a) the password manager company, (b) whatever third-party systems and devices they use to build and deliver their software, (c) the quality of their cryptosystem, and (d) whatever device you use to decrypt/access secrets in your vault.


I trust 1Password. They are very open about how they encrypt data and how the key is derived. I like how they store your encrypted data locally in a SQLite DB. My only real concern is with storing passkeys because they cannot be stored locally yet and you are granting 1Password control over your access to any site you need a passkey stored with them. They are working on a passkey exporting process. I would feel better if I could have the same Passkey stored by 1Password and Azure and Google.


What is the advantage of passkeys compared to managing unique passwords with 1pw? Is there any tangible benefit to switching, besides that Google et al will stop hounding you to do so?


Passkeys are asymmetric keys, so a hacked site cannot leak the hash or even the plaintext of a passkey. And the private key is never exported to insecure hardware. Funny how so many Linux gurus have been shitting on using passwords for SSH for decades in favor of SSH keys, and now that there is an actual effort to use what are essentially SSH keys tied to a specific domain, they are rejecting it.


Sorry, I'm still not clear what the advantage is, compared to storing unique passwords in 1pw. If a site is hacked, the only thing at risk is my data on that specific site, which would be the case either way. I definitely understand how they would be easier and more secure for people who don't use a pw manager, but that's not my question.


There are some obvious, significant benefits I can think of off the top of my head:

- Passkeys give the website no secret to keep.

Breach of the passkey public key is not an event worthy of credential rotation.

- Passkey authentication is submitted via a rigorously-defined mechanism intended for machine-to-machine communication.

Ever had your password manager try to fill the wrong field with your login credentials? Passkeys cannot make that mistake. There's no heuristic mechanism at play trying to figure out where to insert the passkey.

- Passkeys are immune to credential theft via MITM

Sure the MITM could hijack the session, but not the credential. (I know this one is a stretch, but you asked for anything)


An actual API to use when authenticating is a real advantage of passkeys I hadn't considered.


They aren't that much more secure than a random 256 bit unique password for every site stored in a secure password manager. They are designed to raise the security for the average user, not the most security conscious.

https://www.computest.nl/en/knowledge-platform/blog/advantag...


This is a weird take. The passkey can be up to 1400 bits in length which makes it significantly more difficult to brute force than a 256 bit password. Not to mention some sites won’t even let you type in a password that long, and then ofc rainbow tables.

Passkeys are significantly more secure for everybody.


A truly random 256-bit password would require more energy to brute force than the sun will emit during its entire lifetime. A 1400-bit random password is not any more secure in practice.

Passkeys are normally 256 bit ECC keys.
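A rough calculation backs this up. Here is a back-of-envelope sketch, assuming the Landauer limit (~kT ln 2 per irreversible bit operation at room temperature) and a solar output of roughly 3.8e26 W over roughly 10 billion years; all figures are order-of-magnitude estimates, not precise physics:

```python
# Back-of-envelope check: compare the minimum thermodynamic energy to
# enumerate 2^256 keys (Landauer limit) against a rough estimate of the
# Sun's total lifetime energy output.
import math

k_B = 1.380649e-23                      # Boltzmann constant, J/K
landauer = k_B * 300 * math.log(2)      # ~2.9e-21 J per bit operation at 300 K

guesses = 2 ** 256                      # exhaustive search of a 256-bit key
min_energy = guesses * landauer         # lower bound: one bit op per guess

sun_lifetime = 3.8e26 * 1e10 * 3.15e7   # watts * years * seconds/year ~ 1.2e44 J

ratio = min_energy / sun_lifetime
print(ratio)                            # ~3e12: the Sun falls short by
                                        # roughly twelve orders of magnitude
```

Even the theoretical minimum energy for the search exceeds the Sun's entire output by about twelve orders of magnitude, so the extra bits of a 1400-bit credential buy nothing in practice.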


You’re totally right, more bits doesn’t mean more security. /s

I seriously hope you don’t work in any security field.


I don’t trust 1Password, but not for technical reasons. They like to play subscription games and hold accounts hostage. I’m moving to apple passwords myself.


I'm going to try running vaultwarden myself.


Yup. If you DON'T have this feature, you're depending on every user who has TOTP 2FA to actually save their backup codes somewhere they can retrieve ~years later or back up their TOTP some other way manually. Naturally, most users will fail to do this, so you'll have to deal with how to securely reset the accounts of people whose phones got lost or destroyed.

But then if you DO have it, you have to deal with the situation in this story, where if you can compromise their one key account, you get all of their TOTP codes too.


There is a big gap in the greater security landscape here. I personally use hardware authenticators for this reason, but I have to manually enrol each security key for each account.

Really what I would like is a root of trust, maybe a ciphertext I can store in several physical locations, with my security keys derived from that root of trust. Then when I set up 2FA with a service, it would use the root of trust and verify that my security keys are derived from it. This would let me register the root of trust only once and then use any key derived from it.


Some cryptocurrency hardware wallets such as Trezor's are usable exactly how you want: they support fido2/webauthn and derive their keys from the recovery seed phrase. You can write down the recovery seed phrase, initialize other hardware wallets with the same recovery seed later on, and they will present to a computer as the same fido2/webauthn token.


If it's hardware it can break or be lost or stolen.


As I said, you can write down the recovery seed and initialize other security keys from it, so you're able to deal with a hardware wallet breaking, unlike most fido2/webauthn security keys. Hardware wallets also require a pin to be entered, so they're more secure against being lost or stolen too than security keys that don't need a pin.


Just checked and Google authenticator seems to be synced to my account, which is a huge SPOF and not what I want. It's possible that I did this without realising, but does anyone know of a way to revert authenticator to local-only? I don't see anything obvious.


> It's possible that I did this without realising

IIRC on my platform, when they added the feature they turned it on by default, as an auto-installed update.

And if you're logged into the gmail app on the same device that also logs you into authenticator.

You didn't do anything wrong.


FWIW, I still remember recoiling in horror when I was asked whether I wanted to sync my Google Authenticator stuff.


I remember getting prompted for it on iOS when they added it. I still have it turned off.


Better option is to not use Google's TOTP app. Use something else


You can't revert; the keys were sent, they have them. They can't un-have them. You'll need to rotate your MFA.


> You can't revert; the keys were sent, they have them. They can't un-have them. You'll need to rotate your MFA.

Not true. See https://news.ycombinator.com/item?id=42471459


You've missed the point entirely. The point is not that you can't recover the codes. The point is that if you are concerned about uploading codes due to the security implications (which most people on here are) then you need to do more than just disabling uploading, you also have to go rotate all the secrets that were uploaded.


I understood the point, thanks. But I'm concerned about the scenario in the article, where someone did a device recovery and got access to the cloud synced auth codes.

I don't particularly like that my codes were apparently synced to Google's cloud without my being aware, or the ux that prevented me from noticing. But I'm pretty confident that, having disabled the cloud sync, Google no longer has my codes

(And in fact I verified this by installing the authenticator on a tablet before turning off sync on my phone. The codes vanished from the tablet.)

In principle, yes I should rotate all the secrets. Because google may have borked their data retention, or is just outright lying and keeping my secrets. In practice, though, for my personal account, I'm content that nothing has been compromised.


> But I'm pretty confident that, having disabled the cloud sync, Google no longer has my codes

Based on just your intuition. Since you don't have access to the backend specs or code, assuming this isn't a responsible security practice. It is a shortcut you can choose to take personally but should never take with any professional credentials.

I'm going to point out that you responded "Not true." instead of adding a caveat about how you personally choose to ignore security best practices for personal accounts.


> I'm going to point out that you responded "Not true."

I could have been clearer, but that was in response to the assertion of "you can't revert".


> does anyone know of a way to revert authenticator to local-only?

To answer my own question: tap the profile pic (top right on Android) and choose the Use Without an Account option. Removes codes from cloud storage and any _other_ devices. Mentioned in TFA.


I am literally mind f** by the wording “Use Authenticator without an Account”. This is one of the most tortured and cryptic phrases I have seen. Government legalese is more straightforward than Google.


I use Authy and it does this too. I like that I can get the code on my phone or tablet. I also keep paper copies of the original QR codes in a safe place.


The trick with Authy is to disable multi-device access unless you're in the process of adding another device, so hackers and scammers can't add their own devices to your account without your aid. If you leave the setting enabled, someone may get your TOTP secrets from Authy before you can stop them.


If there is a trick to doing something securely, then that is already an automatic fail.


No. That's not "the trick". As soon as it's in the cloud, it's over, it's gone, you've lost the game.


I've been using Authy for around ten years now, so I lost the game a decade ago and the consequences have been nothing and the benefits have been something. Not a bad loss IMHO.


Good for you. Just wait and see...


You can just decode the QR code and use whatever secret is in there to generate the OTP codes. TOTP isn't that complicated, it's really just a second password that the system generates.
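The whole algorithm fits in a few lines of standard-library Python: HMAC the current 30-second time counter with the base32 secret from the QR code, then truncate per the spec. This is a minimal sketch of RFC 6238 TOTP, verified against the RFC's own SHA-1 test vector:

```python
# Minimal TOTP generator (RFC 6238) using only the Python standard
# library -- the "secret" is just the base32 string from the QR code.
import base64, hashlib, hmac, struct, time

def totp(secret_b32, when=None, digits=6, step=30):
    """Compute the TOTP code for a base32 secret at a given Unix time."""
    key = base64.b32decode(secret_b32.upper() + "=" * (-len(secret_b32) % 8))
    counter = int((time.time() if when is None else when) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII key "12345678901234567890", T=59 s,
# SHA-1, 8 digits -> expected code 94287082.
seed = base64.b32encode(b"12345678901234567890").decode()
print(totp(seed, when=59, digits=8))  # -> 94287082
```

Once you have the decoded secret, any generator that implements this math will produce the same codes as the original app.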


While true, I haven't seen an authenticator app that lets you just dump the TOTP secret yet...


1Password can show the whole URI with the seed, and I have used it in the past to tediously restore seeds to my other 2FA apps.


It is at least relatively new. Years ago I had to try the Google “hard landing” account recovery process because it wasn’t happening, which is how I learned that they had that form going to an email address which had been deleted. Fortunately I had paper recovery codes in my safe.


Google rolled out that hare-brained "improvement" in an update to Google Authenticator a few months ago, with the nice extra that for some users, when you dared unselecting the new cloud backup checkbox, the secrets stored in the app were instantly corrupted in some way, so you were locked out of your Google accounts immediately as a bonus <chef's kiss>. Happened to a family member, luckily they had a working emergency access method. We will never use Google Authenticator again.

Recommended alternative: 2FAS (https://play.google.com/store/apps/details?id=com.twofasapp) which allows you to import the secrets from Google Authenticator via QR codes, and has a local backup feature (e.g. to a USB drive).


As a side question: how do I, as a novice, vet a 2FA app?

This has all the "looks nice", but I have no reason to trust this recommendation over any other social engineering.


My first impulse after ruling out Google Authenticator was to simply switch to Microsoft's Authenticator app (which I already had to use for a work-related thing anyway), thinking "of course MS would not make the same stupid mistake". Turns out they would, and they did. So alternatives from smaller vendors were the only option. In evaluating them, I focused on popular open-source solutions that had the features I deemed important (notably, local backup), and looked into the history, provenance and reputation of their vendors. Nevertheless, some risk will always remain.


I was one of the fools who installed the iOS 7 beta onto a phone that I depended on with Google Authenticator. The app had a compatibility issue with that beta release that caused it to disappear all my 2FA seeds except, very fortunately, for my Gmail. There was a bit of a ruckus about this here https://news.ycombinator.com/item?id=6112077.

Since then, I always use at least two 2FA apps at the same time.


I used andOTP for years, until the author stopped working on it. While it still likely works fine, I've switched to Stratum, which likewise supports import from the Google Authenticator export QR codes as well as from andOTP, authy, and others.


Ugh, yeah, that update.

You didn't have to do anything, either, the update just instantly corrupted some 2FAs. How can an app not do a TOTP? It's literally just math.

I had to recover a few MFAs from backup codes due to that.


I'm shocked how often one of my ~50 colleagues asks me to reset their 2FA. It's every 6-8 weeks or so.

Their personal accounts will be affected in the same way (lost phone, new phone etc).


Was about to say this but yeah.

Big brains at google didn't understand the number '2' in 2FA


They added this recently, because lots of people complained to Google that they lose their tokens; Authy and others started to gain traction because they did synchronization. Google was pretty much forced.

I know, 2FA loses the entire point when it's synchronized. But, well. People lose their stuff all the time!


I've had customers tell me that they cannot use email verification to meet a 2FA compliance requirement because it's not a second factor, but somehow SMS is. I always push back with "why not just good old TOTP" and the answer is that it's too easy for a customer to lose because it is only on their device. Like yeah... that's what makes it a real second factor.


It’s possible to synchronise secrets without sharing them with a third party: just encrypt them locally, transmit to third party, download to other device, decrypt.

This could be made easy for users by having each device share a public key with the third party (Google, in this case), then the authenticator app on one device could encrypt secrets for the other devices.

This would be vulnerable to Google lying about what a device’s public key is, of course, but enduring malice is less likely (and potentially more detectable) than one-time misbehaviour.
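The flow described above can be sketched in a few lines. This is a deliberately toy illustration: each device holds a Diffie-Hellman keypair, one device derives a shared key from a peer's public key, and the TOTP seed is encrypted before it ever reaches the sync server. The tiny prime, generator, and hash-based stream cipher here are stand-ins for illustration only; a real implementation would use something like X25519 plus an AEAD from a vetted library:

```python
# Toy sketch of "encrypt locally, let the third party relay ciphertext".
# NOT real cryptography: toy DH group and a SHA-256 counter-mode stream
# cipher are used purely to show the data flow.
import hashlib, secrets

P = 2**127 - 1          # toy prime modulus (NOT a safe real-world group)
G = 5                   # toy generator

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(my_priv, their_pub):
    s = pow(their_pub, my_priv, P)          # classic Diffie-Hellman
    return hashlib.sha256(s.to_bytes(16, "big")).digest()

def xor_stream(key, data):
    # SHA-256 in counter mode as a toy stream cipher (illustration only)
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(key + i.to_bytes(4, "big")).digest()
        out += bytes(a ^ b for a, b in zip(data[i:i + 32], block))
    return bytes(out)

# Phone A encrypts a TOTP seed for Phone B; the server only sees `ct`.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()
ct = xor_stream(shared_key(a_priv, b_pub), b"JBSWY3DPEHPK3PXP")
pt = xor_stream(shared_key(b_priv, a_pub), ct)
print(pt)  # -> b'JBSWY3DPEHPK3PXP'
```

The sync provider never holds anything it can decrypt, which is exactly the property the parent comment is asking for; the remaining weak point is trusting the provider to report device public keys honestly.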


> It’s possible to synchronise secrets without sharing them with a third party

Sadly the problem Google is actually trying to solve is providing security for the dumbest people you've ever met. Dumbasses are entitled to security too!

I'm talking people who've lost access to their e-mail, and their phone number, and their 2FA all at once. Then they've also forgotten their password.

No password manager, no backup phone, no yubikeys, no printed codes, no recovery contacts, nothing.


You're describing the majority of my extended family. Some of whom are well educated and tech illiterate.


The active ingredient in 2FA as practically implemented for nearly everyone has never been the 2. It's mostly just not letting humans choose their entire password.


Most people wouldn't realise they can't recover their TOTP codes. But the hacker would still need to know your password surely


...so you agree that this is missing the '2' in 2FA?


For "something you have" to be true to its purpose it has to be something that has one and only one copy - so either only you have it, or you don't, but nothing in between. The second you have "cloud backup", or activate an additional device, or "transfer to a new device" then you turn the attack into "phishing with extra steps".


You can support transferring to a new device without increasing the phishing risk, the transferral just needs to be done via a physical cable rather than via the cloud.


I'll grant you that it's a better option but by no means good if you want to stand on the 2FA hill and put security first (only?). That "just" does a lot of heavy lifting.

The only time I'd consider transferring a secret like this is secure is within an HSM cluster. But these are exceptionally hardened devices, operating in very secure environments, managed by professionals.

Your TOTP seed on the other hand is stored on any of the thousands of types of phones, most of which can be (and are) outdated and about as secure as a sieve. These devices also have no standard protocol to transfer. Allowing the extraction via cable is still allowing the extraction, the cable "helps" with the transfer. Once you have the option to extract, as I said, you add some extra steps to an attack. Many if not most attacks would maybe be thwarted but a motivated attacker (and a potential payoff in the millions is a hell of a motivator) will find ways to exfiltrate the copy of the keys from the device even without a cable.

This is plain security vs. convenience. The backup to cloud exists because people lose/destroy the phones and with that their access to everything. The contactless transfer exists because there's no interoperability between phones, they used different connectors, etc. No access to the phone is a more pressing risk than phishing for most people, hence the convenience over security.


I think this is also the main drawback of physical U2F/FIDO2/Webauthn tokens: security-wise they are by far the best 2FA option out there, but in practice it quickly becomes quite awkward to use because it assumes you only own a single token which you permanently carry around.

Sure, when I make a new account I can easily enroll the token hanging on my keychain, but what about the backup token lying in my safe? Why can't I easily enroll that one as well? It's inconvenient enough that I don't think I could really recommend it to the average user...


I don't quite get this "I need to add every possible authenticator I have at account creation or I'm not doing it" kind of mentality I see a lot.

When I make an account, if I have at least two authenticators around me, I'll set up the hardware authenticators or make sure it's got a decent recovery set up. As time goes on I'll add the rest of them when it's convenient. If I don't have at least two at account creation or I don't trust their recovery workflow, I guess I'll just wait to add them. No big deal.

If I'm out and I make an account with $service but I only have my phone, I'll probably wait to add any authenticators. When I'm with my keys, I'll add my phone and my keyring authenticator to it. When I sit down at my desktop sometime in the next few days and I use $service I'll add my desktop and the token in my desk drawer to it. Next time I sit down with my laptop and use $service, I'll add that device too. Now I've got a ton of hardware authenticators to the account in question.

It's not like I want to make an account to $service, gotta run home and have all my devices around so I can set this up only this one time!


>When I make an account, if I have at least two authenticators around me

If you do, you're in a tiny minority of users. Well, even if you have one you're in a tiny minority, but having two laying around is extremely unusual.


Only because I bothered to buy a few. If they're making a new account they're probably on a device which can be an authenticator, i.e. a passkey. Is it rare for people to be far away from their keyring where they potentially have a car key and a house key and what not?

Do most people with hardware authenticators not also have laptops, desktops, or phones? They just have an authenticator, no other computers?

This person I replied to already has two hardware tokens. They probably also have a phone that can be used with passkeys, they probably also have a laptop which can be used with passkeys, they might also have a tablet or desktop which can be used with passkeys. That person probably has 3-6 authenticators, and is probably with two of them often if they carry keys regularly.


I don't understand the existence of an HSM cluster. I thought HSM was meant to be a very "chain-of-custody" object, enabling scenarios like: cryptographically guarantee one can only publish firmware updates via the company processes.


The HSM is more generic than that - a Hardware Security Module. It's just a device, usually hardware (software security modules exist too), that securely stores your secret cryptographic material, like certificate private keys. The devices are exceptionally hardened, both physically and in software. In theory any attempt to attack them (physically opening them, or even turning them upside down to investigate, leaving them unpowered for longer than some hours, attempting too many wrong passwords, etc.) results in the permanent deletion of all the cryptographic material inside. These can be server-sized or pocket-sized; the concept is the same.

Their point is to ensure the private keys cannot be extracted, not even by the owner. So when you need to sign that firmware update, or log into a system, or decrypt something, you don't use a certificate (private key) file lying around that someone can just copy, you have the HSM safely handling that for you without the key ever leaving the HSM.

You can already guess the point of a cluster now. With only one HSM there's a real risk that a maintenance activity, malfunction, accident, or malicious act will lead to temporary unavailability or permanently losing all the keys. So you have many more HSMs duplicating the functionality and keys. So by design there must be a way to extract a copy and sync it to the other HSMs in the cluster. But again, these are exceptionally hardened HW and SW, so this is incomparably more secure than any other transfer mechanism you'd run into day to day.


Ah, got it. So in the event someone managed to get access, they are limited to signing things in that moment on that infrastructure. I can see how that would reduce the blast radius of a hack.


Ideally this would destroy the initial copy too - but forcing physical access would indeed be a great start.


Even so, if you have a copy even for a fraction of a second then you can have two copies, or skip the deletion, or keep the temporary copy that was used during the transfer. Even the transfer process could fail and leave a temporary file behind with your secrets.


I quite like Apple’s Advanced Data Protection, I set it up with two physical yubikeys recently. To login to iCloud/Apple on a new device that’s not part of your trusted devices, you must use the hardware token.


They'd have to know your password, and get you to click your 2FA accept button, that's 2 factors still


It's because everybody wants to put everything in 2FA protocols, because people just can't use passwords...

And the fact that one of those doesn't lead to the other passes way over their heads.



