
This question has been in my head recently. How feasible is it really? The answer in the link isn’t comprehensive. Is it really out of the question for manufacturers to ship a particular version of a device and software for a target country? Nation states have a history of backdooring or weakening particular technologies.



Baseband backdoor. No need to target the OS or the primary CPU.


Are basebands not sandboxed at all? There's no conceivable reason my baseband should be able to access my camera, microphone, or the contents of my display in normal production use, since all of that typically goes through the CPU. Why not have an IOMMU that limits the baseband's DMA to a specific chunk of memory and dramatically reduce the attack surface? It's not just effective against nation states: with such a protection, 0-click OTA attacks targeting the baseband would have a much smaller blast radius.
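
Roughly what I have in mind (a conceptual sketch, not real firmware; the aperture addresses and names below are made up): the IOMMU would only pass baseband DMA that lands inside one fixed window of shared memory, so everything else - framebuffers, mic buffers, kernel memory - is unreachable by construction.

    /* Conceptual model of an IOMMU-style check confining baseband DMA to a
     * single shared-memory aperture. A real IOMMU enforces this per
     * transaction in hardware; this only illustrates the policy. */
    #include <stdbool.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical aperture reserved for modem <-> application-CPU messages. */
    #define BASEBAND_APERTURE_BASE 0x80000000u
    #define BASEBAND_APERTURE_SIZE 0x00200000u  /* 2 MiB */

    static bool baseband_dma_allowed(uint32_t addr, uint32_t len)
    {
        uint32_t end;
        if (len == 0 || __builtin_add_overflow(addr, len, &end))
            return false;  /* reject zero-length or wrapping requests */
        return addr >= BASEBAND_APERTURE_BASE &&
               end <= BASEBAND_APERTURE_BASE + BASEBAND_APERTURE_SIZE;
    }

    int main(void)
    {
        /* A legitimate write into the shared ring buffer vs. an attempt
         * to reach memory outside the window. */
        printf("%d\n", baseband_dma_allowed(0x80001000u, 4096)); /* 1: allowed */
        printf("%d\n", baseband_dma_allowed(0xC0000000u, 4096)); /* 0: blocked */
        return 0;
    }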


Historically, the baseband was the primary processor with full control and the application CPU was subordinate. The baseband code was developed by the chip manufacturer, so they gave themselves control over the whole system because it was easier for them.

This may no longer be the case, as the primacy of the application CPU has become increasingly obvious, but it should still be the default assumption: keeping the baseband in control lowers costs for the chip manufacturer, and low cost is their lifeblood.


Exactly, and we're talking about governments, not competing companies. "You wanna sell phones or build infrastructure here? Fine, here's a truckload of appliances to put in the middle of each pipe; no questions please." There are many ways a government can ruin a business without swatting its offices or raising public anger; it just needs to apply bureaucratic pressure where needed, so that, for example, a permit or a tax installment or reduction that would otherwise take 6 months suddenly takes 5 years or more.


Any signs of these in the wild?

I know it is a valid threat, but even in the cases that set this precedent there was a team of 140 and they did not leverage a baseband exploit.


I only have my own experience with this. It requires a phone that is off and either without a battery or in a Faraday (foil) shielded bag. Go to an area your government doesn’t want regular people to be (an unacknowledged military base) and turn on the phone.

I’ve done this many times, so I know how long it takes both my iPhone and my Android to power on to a “usable” state.

I can’t take my phone inside where I work; they have mobile phone detectors that set off alarms if you bring one near any door or the inner facility fence. I put my phones inside a foil cooler bag with ice packs so they won’t overheat in the car.

My guess is that there was a cell site simulator set up to take over any phone that comes into the area. I got the same result with my Android and my iPhone: the phone boots, then there’s a weird hang where all the indicators appear but I can’t interact with the phone. After waiting at least a minute, I can use it.

I think this is why governments don’t like Chinese-developed 5G technology: it doesn’t have their default backdoors.


Absence of evidence is not evidence of absence, especially when searching for evidence left behind by competent adversaries (e.g. NSA, GCHQ, etc) who have a strong motivation to remain undetected.
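
To make the "especially" concrete (my framing in Bayesian terms, not something from the thread): a fruitless search only lowers the probability of a backdoor in proportion to how likely the search was to find one in the first place.

    % B = "a baseband backdoor exists", E = "a search turns up evidence of it"
    P(B \mid \neg E) = \frac{P(\neg E \mid B)\, P(B)}
                            {P(\neg E \mid B)\, P(B) + P(\neg E \mid \neg B)\, (1 - P(B))}

Against a competent adversary, P(E | B) is close to zero, so P(not-E | B) is close to one and the posterior stays near the prior: absence of evidence is at best very weak evidence of absence here.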


> Absence of evidence is not evidence of absence

But it is also not evidence of the thing for which there is absence of evidence.

EDIT:

> especially when searching for evidence left behind by competent adversaries (e.g. NSA, GCHQ, etc) who have a strong motivation to remain undetected.

No, there is no “especially”; absence of evidence means no basis for any affirmative belief, period, and that holds equally for any factual proposition. Arguing for “especially...” is exactly arguing for a case where absence of evidence is evidence for the thing for which there is an absence of evidence.


I'm not asserting that it is.

In risk management, you shouldn't ignore known unknowns like that; you should either adapt your threat model or formally accept the risk, not simply treat that risk as nonexistent until proven.


How could we know for sure? Basebands are 100% proprietary, we have no idea how they operate and even less of an idea of how their operation might be subverted.


This is why I'm an open source advocate. It's not that open source automatically makes software/firmware trustworthy; it's that closed source effectively guarantees the software/firmware can never be deemed trustworthy.


And yet there have been plenty of long standing security issues in Linux…

Why would you think that a bunch of people volunteering their time would be more motivated to look for security issues? And even those that are found, how many would be disclosed responsibly instead of being sold to places like Pegasus?


>And yet there have been plenty of long standing security issues in Linux…

• See the first half of my second sentence.

>Why would you think that a bunch of people volunteering their time would be more motivated to look for security issues

• So they're not harmed by the vulnerabilities. I'm on a big tech red team. I routinely look for (and report) vulns in open source software that I use - for my own selfish benefit.

>and even those that are found, how many would be disclosed responsibly instead of being sold to places like Pegasus?

• Not all of them, that's a fair point. But I'd rather have the ability to look for them in source than need to look for them in assembly.

• Keep in mind that the alternative you're proposing (that proprietary code can be more trustworthy than open source code) is pretty much immediately undermined by the fact that the entities who produce proprietary code are known to actively cooperate and collaborate with the adversary - look no further than PRISM for an example. Microsoft, for instance, didn't reluctantly accept - they were the first ones on board and had fully integrated years before the second service provider joined (Yahoo, IIRC).

• If you want to start a leaderboard for "most prolific distributor of vulnerable code", let's see how the Linux project stacks up against Adobe and Microsoft. I wouldn't even need to research that one to place a financial bet against "team proprietary".


> Why would you think that a bunch of people volunteering their time would be more motivated to look for security issues

I don't. I trust that bad actors are less motivated to insert malicious code, and I trust that transparency enforces good practices. All sufficiently complex code has unintended behavior; what matters to me is how you stop third parties from using my device beyond my control.

> and even those that are found, how many would be disclosed responsibly instead of being sold to places like Pegasus?

What do you think everyone else does with their no-click exploits? Send them to Santa?


FOSS doesn't mean "volunteers." FOSS means that the source is viewable, legally usable, and that changes can be made and redistributed without permission from the author(s).

Volunteers can make closed source software, massive corporations and governments can make FOSS.


Nobody said that FOSS was perfect, only better.


Seems like some people really believe that FOSS is basically perfect when it comes to security. "It's FOSS so people would find any serious vulnerabilities". Heartbleed, anyone?

As an aside, I wonder if there's a term for this kind of "nobody says...but some do" thing. Everyone sees their own reality, blah blah. I trust that you're speaking in good faith, but that doesn't account for everyone, and good faith doesn't magically resolve arguments.


> Is it really out of the question for manufacturers to ship a particular version of a device and software for a target country?

Another avenue: is it really infeasible for a nation state to intercept and modify devices that are being sent to a specific country or person?



