I can believe that this particular case wasn't Meta being nefarious and trying to get specific people in trouble over a possible abortion. But I don't think that was ever the real concern here.
The press release only reinforces my concern about how easy it seems for governments to get private conversations between a mother and daughter. Also that apparently Facebook stores the conversations in plain (or easily decryptable) text.
> my concern about how easy it seems for governments to get private conversations between a mother and daughter.
I struggle to understand how anyone can still think that facebook (and any other major social media platform) has any issue (moral, technical, or otherwise) with giving away any and all data from their systems, to anyone willing to pay enough or who can threaten them with fines. Why would they? Their business model is selling user data to advertisers. Do people really think they're just leaving money on the table and saying "well, that's private, people might not like that data being sold." They don't give a shit!
I also struggle to understand how anyone can look at the current social media landscape, where seemingly everyone has accounts on multiple platforms and uses them to post/message about their personal lives, and not think the us government/law enforcement is involved somehow. Does anyone really think the microphones we all willingly put inside our homes that are controlled by one of the largest corporations in the entire world aren't readily available for intelligence agencies?
I admit I am very cynical, but it's very difficult not to be. It feels like every month we hear about how some massive platform was actually lying about handling data that they totally promised they were keeping safe and private, which is then followed up by some new expansion of their ability to collect said data. But we all continue to give them the benefit of the doubt, and act surprised when things like this happen.
When I was in middle and high school, "private messages" got shared all the time. The first time it happens to someone, myself included, it's a bit of a rude awakening.
But once it happens, you learn that what you say online is a persistent written record which outsiders will occasionally access. I don't think that people can credibly claim to be unaware that all of their social media activity exists in a grey area between "public" and "private".
Is it possible that most people just don't care? It seems like the "I'm not doing anything wrong, so I don't care" impulse is very beguiling. Maybe events like this will be enough to snap people out of it, but I wouldn't bet on it.
> Also that apparently Facebook stores the conversations in plain (or easily decryptable) text.
This. Even with E2EE enabled, that only protects your conversations as they travel between you and facebook's servers. It does not mean that the messages are protected from facebook being able to see them. People should have zero expectation of privacy on facebook's platform(s).
That is not true. For both messenger and whatsapp, e2ee messages are not only encrypted between you and facebook servers, they are encrypted end-to-end and only decryptable on the devices. Please reconsider your level of confidence in your understanding of this.
I do not have any information about the current state of messenger, so I cannot comment.
Here is my issue with WhatsApp though:
How will I know that Meta is still shipping an application based on an uncompromised version of the Signal protocol, without malicious modifications?
Auditing is the normal answer.
Sadly, Meta is not ISO27001 certified, so there's no trustworthy external audit trail.
Barring that, who is capable of auditing Meta to confirm this? Who can see the client and server sources to confirm that there is no MITM? Only Meta, on both counts.
I have to take their word for it, and I'm incapable of trusting them.
The problem is not what people read, but what they do not read - journalists have an axe to grind against Facebook, so they write titles that make it sound like Facebook willingly gave the data to the police, and most people don't go beyond reading the title.
What makes you say journalists have an axe to grind against Facebook? If anything, I think they spent more than a decade being too generous to Facebook, and even now are overly credulous a lot of the time.
Facebook was a media darling until they became profitable and ate the media’s lunch. When it became clear that FB would swallow a large chunk of online ad dollars that the media companies coveted, their attitude changed rather suddenly.
They did willingly give the data to the police. Compare Apple v. FBI. "But the iPhone was encrypted" - nothing is stopping Facebook from encrypting Messenger.
Whether you think Facebook is right or not, this story gives pause about what kind of information you share online. It's one more reason to participate less or not participate at all in these services.
That seems a good thing, as it will tarnish Facebook both with being bad and with being involved in the abortion issue, which is super divisive no matter what.
And it is not like Facebook doesn't have options here. They can fight the warrant and they can publish the private information of the people involved with the warrant.
For sure. What troubles me here is that Facebook didn't fight it. What I'd like them to do is to fight overly intrusive requests all the way to the Supreme Court. Maybe this is trying to say that they feel they got hoodwinked? If so, they need to say that very clearly. Because I'm not at all confident that Facebook is particularly committed to user privacy, especially when they are eager to suck up to politicians.
Facebook has, among many choices, two clear and easy territories in which to advance user privacy: technical and legal.
Technically, Facebook could implement E2E for private messages. Doing so would make disclosure not something they were unwilling to do, but unable to do.
The other, and this is much more juicy, would be to lobby the California legislature to make it illegal under state law to disclose private conversations re: abortion to states where abortion is prohibited.
The fact that these are possible, but not likely to happen is another issue altogether.
You're not kidding. Doing state-by-state battle over jurisdiction and the right of each state to create its own laws would be ... interesting. I'd expect it to be very destructive overall. Not saying it would necessarily lead to an end of the union, but it's a step.
Seems to be a pattern, lately, and specific to abortion as well. The ends justifying the means, regardless of collateral damage.
What we have now has the same legal-political texture as the Fugitive Slave Act of 1850. A state like California is compelled to assist states like Nebraska in punishing an act which is legal in California but illegal in Nebraska.
Make such assistance illegal so that Facebook doesn't have a choice in the matter. Facebook certainly has the political power[money] to successfully lobby for such a legislative change, but it doesn't have the will to do so.
It's likely that the abortion in this case would have been illegal in California. It's hard to say for sure, because California prohibits abortion after medical viability rather than setting a strict week limit, but it happened at week 23 when many fetuses are viable.
I don't want to say "similar", because chattel slavery was uniquely bad, but even in areas that wouldn't enforce the Fugitive Slave Act, forced prison labor was pretty common at the time. There was also the question of what should happen to people who conspired to violate the Fugitive Slave Act, although my understanding is that in practice it was impossible to get northern juries to convict them.
It is hard to deny that this is a pretty common thought these days. I hear a lot of people making sounds in that direction, regardless of their more traditional ideological views.
We'd have to figure out how the nukes are divided up, however. Though what is it they say about possession being 9/10ths of the law...
We'll try to stay serene and calm
When Alabama gets the bomb.
~ Tom Lehrer, "Who's Next?"
Well, the USSR broke up without too much fuss about who got the bombs; as far as I can see, only Russia was interested. But I can imagine a Divided States of America in which both successor states have nukes. I suspect they'd be reluctant to attack one another. They might even both choose to disarm, depending on the nature of the break-up.
Sure. You could abolish the Union entirely, and just have states. I don't think it's obvious that that would be a terrible thing - either for USAians or for the rest of us. My sense is that a lot of USAians don't really want a Union. I suspect that many USAians think the Union was effectively a coup.
The question remains: how do you spread the nukes around?
My guess is that the residents of Wyoming could do without nukes. They have a lot of land-based nukes, that serve no defensive purpose (for Wyoming); they just make Wyoming a target. I wonder which state might decide that they need nukes, to defend themselves from someone outside North America?
Second question: how do you divide up the military equipment that isn't nuclear - ships, tanks, rockets, planes, etc.? All that equipment carries a heavy cost - it has to be maintained, replaced, tested, etc.
Belgium and the Netherlands and Luxembourg were the same country, at least for some time, then split up and are good countries to live in (probably beating the US on most quality of life metrics). Same goes for the Nordics which were at different times in various combos (Denmark-Norway, Sweden-Finland, Sweden-Norway, Iceland was a part of Denmark).
Also a bunch of other examples where the countries aren't necessarily the best to live in, but pretty much everyone is happy they split - Yugoslavia, India/Pakistan, West Pakistan/East Pakistan, Singapore/Malaysia (to be fair, Singapore was kicked out).
A split of the US among the crazily politicised duopoly wouldn't be the end of the world. One part would be reactionary as hell and move time back (on social and ecological issues), the other.. who knows.
Fantasizing about the end of the world/country/whatever is a popular pastime in the US. It is reasonable to suspect that the talk is primarily the 1% loudest people and that if anything starts to look like it's getting real, the remaining majority will step on the brakes.
There's a time bomb baked into the conceptualization of the mythos of the United States.
The US in many ways legitimizes itself as a government in the tradition of the "western European project," by evoking the ritual, esthetic, ideals, and historical lineage of ancient Greece and Rome. From names and architecture to values and ritual, there is an establishment of ties, real or imagined.
On one hand, this works fantastically to mentally cement the US among the greatest of greats, but we cannot ignore the visibly obvious - that ancient Greece and Rome declined and fell. By drawing a parallel to ourselves (US-centric, obvs), we draw a parallel to our own decline. If we're honest with this comparison, we have to be honest about the demise of those governments too. It's not possible to separate the two.
"End of the Union" is a catastrophically bad idea and it disturbs me that otherwise intelligent people would take it seriously. I find it stunning how people who might otherwise decry things like "racism" can be in a mindstate of not understanding how close of an idea this is to "Maybe we should do the Civil War" again.
There's a fair argument that it never ended; the South was defeated, but never gave in. Or at least, part of the South didn't. Come on, isn't that the meaning of the Confederate flag? Doesn't it mean "Fuck you, Union"?
/me a Brit, I don't have a dog in this race. I'd just feel massively more comfortable if the most powerful country in the world knew who it was, what its policies are, and who its friends are. If they can't figure that out, and have to break up, that's miles better than the USA being balanced 49% to 51% on every policy and every federal court judgement.
I'd prefer it if the USA split, rather than having a schizophrenic USA, with the biggest army in the world, that can't agree on who it is, what its economic policy is, or what its foreign policy is. Schizo USA is screwing everyone else up.
I like the goal you're looking at; I just don't think that's a viable path from where we're standing. Which is to say, in reality a broken US gives you more potential chaos, not less?
I've come to believe that "nation" is somewhat of a weird and artificial concept and so that in many ways it almost doesn't make sense to compare e.g. the US and Russia. And so, while "no lopsidedly strong nation" would be optimal, from here I do think that US in its current form is a pretty good hedge against the others, for the most part.
It's going to be a real bad thing for some people in some of the states. E.g., the last time the Union was split up it was over the "right" to treat people as livestock. Or look at the use of federal civil rights laws in the ending of Jim Crow.
Yes, to me (not from USA) it looks as if would get worse for most USAians if the Union broke up. Especially marginalized people. But I think that for the rest of us, it would be cool if the US government would agree to work together, or agree to divorce. It's shit, being in the blast-radius of a marital crisis that's lasted 30 years.
They tried, but governments and organizations like NCMEC (and many other people) got up in arms about how it would protect child abusers (which is not incorrect). AIUI, they've been working on solutions that would alleviate some of those concerns before switching to enable e2ee by default.
There was a warrant, which is pretty much the bar. If you can get a warrant to find evidence for a crime, I don't know what the court battle would be about. You don't have a right to privacy to conspire about crimes on facebook. Even if you don't agree it should be a crime.
Yep. Facebook Messenger isn’t E2E encrypted by default, it takes a bit of digging to make it that way, and Facebook obeys warrants for unencrypted messages. This may be important if you’re doing something recently made illegal.
The stories were accurate, but most of the headlines said Facebook "gave" data to law enforcement, when it's more accurate to say they were legally obligated to provide data. The villains in this story are the law enforcement agencies, legislators and judges who set the rules. Facebook did nothing wrong.
I've been on the other end of law enforcement requests for data, and they were not legally obligated to provide the data right away. They had the option to fight the subpoena.
Story says it was a warrant, not a request or a subpoena. Fighting a warrant isn't really possible. Defense can move to suppress it at pretrial, but FB has no standing to refuse to comply with a warrant.
Could you say a bit more? IANAL, but I only know about warrants in the context of them coming in and taking stuff. I don't know what a warrant could even mean here, especially given that the information isn't physically in Nebraska.
Ok, I just double-checked that, and apparently Facebook has a Nebraska data center. So the information could possibly be there. But I have a hard time imagining Nebraska PD going into a 10-acre data center and just grabbing servers.
Warrants don't need to be for physical items. Even pre-internet, PD could get a warrant for phone records, and it doesn't matter where or how the records are stored. They just need the records to be turned over for inspection. Nebraska can sanction anyone who doesn't comply with a warrant.
They are. Roughly, a warrant is issued to a law enforcement agent and grants them powers not normally granted (arrest, search, etc). A subpoena is issued to a 3rd party to produce evidence or appear in court.
In this case, the warrant appears to have been issued to an officer in the Norfolk PD, who then asked FB for any messages between the two suspects.
What's not clear to me (IANAL) is what power FB had to refuse to comply with the warrant. If they refused to comply, would the police start seizing servers? Or, would FB be held in contempt (usually applies to subpoenas, not sure if that would apply with a warrant).
All that said, the "easy" answer for FB (Apple, Google, etc) is E2E encryption of messages. Then all they can do it produce the encrypted "gibberish" and the PD has to hack it or move on.
Facebook absolutely did something wrong. Perhaps, if they weren't so intent on hoovering up our data for advertisers, they might have built their systems such that they couldn't provide that data to law enforcement.
Is there a good (for the end user) reason that Messenger does not have E2EE enabled by default?
From The Verge's article[1]:
> However, campaigners note that Meta always has to comply with legal requests for data, and that the company can only change this if it stops collecting that data in the first place. In the case of Celeste and Jessica Burgess, this would have meant making end-to-end encryption (E2EE) the default in Facebook Messenger. This would have meant that police would have had to gain access to the pair’s phones directly to read their chats. (E2EE is available in Messenger but has to be toggled on manually. It’s on by default in WhatsApp.)
From Meta's perspective, in all the ways that matter, the advertiser is the end user. Non-advertisers' impressions and data are simply inventory that can be sold to end users. And it would be bad for the "end user" if that inventory was stored by default in a form that could not be easily indexed for cost-efficient packaging and delivery.
It doesn't matter whether end (you) to end (facebook) encryption is enabled or not. That only protects data "in transit". The information is still accessible to facebook "at rest". Enabling E2EE should give you absolutely no sense of privacy from Facebook because it doesn't exist.
This is contrary to the universally understood meaning of E2EE (as in, end to end between the two participants in the conversation). I'm not one to blindly take Facebook's PR statements at face value, but if you're making the claim that Facebook is deliberately advertising E2EE while secretly redefining the term to mean non-E2EE, you should have some strong evidence. Those sorts of linguistic gotchas don't work in real life or in a courtroom.
It's mostly not enabled by default due to uproar from politicians and organizations like NCMEC on how it would protect child abusers. I expect that they are currently working on features to help address that and will enable it by default when those are ready.
Drop the web app, make a native one like Signal does if they even bother with desktop. They clearly don't want people to use it anyway, they've been implementing dark patterns to push the phone version of Messenger for years.
Encryption keys could themselves be encrypted with a password that the user would type, that is only ever saved in browser local storage, or even only in memory and needs to be retyped on each pageload.
There's nothing preventing the government from forcing Meta to implement a backdoor that exfiltrates the unencrypted key, of course, but that's true of non-web-based systems as well.
I am not sure how that would prevent them from having access to the key and subsequently the data? Is there any platform which implements what you are suggesting and prevents the platform from accessing the data in a web application?
Genuinely asking as I would love to implement something for my customers which gives them control over their data while it resides on my servers.
Your parent poster proposes that the key itself is protected by a password that the user needs to enter, and that the unlocked key is only stored on the user's device (local storage for browsers, …).
The server only serves encrypted data that gets decoded in the browser.
The primary usability problem with that approach is that there's no way to recover the data (messages) if the user ever forgets the key's passphrase.
Another problem is that all of the rendering that uses such encrypted data needs to happen client side in JS, WASM or similar.
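To make it concrete, here's a minimal sketch of that approach using the browser's Web Crypto API (TypeScript; the function names and parameters are my own illustration of the idea, not anything Messenger actually does). The key is derived from the passphrase in the browser and held only in memory, and the server would only ever see the ciphertext:

    // Derive an AES key from the user's passphrase; nothing here ever
    // leaves the browser. (Illustrative sketch, not Facebook's code.)
    async function deriveKey(passphrase: string, salt: Uint8Array): Promise<CryptoKey> {
      const material = await crypto.subtle.importKey(
        "raw", new TextEncoder().encode(passphrase), "PBKDF2", false, ["deriveKey"]);
      return crypto.subtle.deriveKey(
        { name: "PBKDF2", salt, iterations: 600_000, hash: "SHA-256" },
        material,
        { name: "AES-GCM", length: 256 },
        false,                      // non-extractable: scripts can't export it
        ["encrypt", "decrypt"]);
    }

    // Encrypt a message client-side; only { iv, ciphertext } would be
    // uploaded to and stored by the server.
    async function encryptMessage(key: CryptoKey, plaintext: string) {
      const iv = crypto.getRandomValues(new Uint8Array(12));
      const ciphertext = await crypto.subtle.encrypt(
        { name: "AES-GCM", iv }, key, new TextEncoder().encode(plaintext));
      return { iv, ciphertext: new Uint8Array(ciphertext) };
    }

    // Decrypt in the browser after fetching the ciphertext from the server.
    async function decryptMessage(key: CryptoKey, iv: Uint8Array, ciphertext: Uint8Array) {
      const plaintext = await crypto.subtle.decrypt({ name: "AES-GCM", iv }, key, ciphertext);
      return new TextDecoder().decode(plaintext);
    }

Of course, since the server still ships the JavaScript that runs this, a compromised or coerced operator could serve a version that exfiltrates the key, as noted above.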
Ah. I misinterpreted this, thinking the user's account password would be used, but in this case there's a separate password which the user would have to re-enter each time.
I am not in security but think that XSS might be a concern here with something so sensitive.
And UX problems that come with it. Sounds interesting though to at least discuss with customers to see if the benefits are worth the costs to them.
I am at a loss for words if people expect Facebook of all companies to not access the data on their platform. Of course they will access the data on their platform. Texts and apps like Signal are a different story.
It's part of the "act like a chicken running around with its head cut off" antipattern.
In many organizations there is a lot of pressure after a dramatic event to come out with a statement right away and this denies people the chance to deliberate about what the message should be.
A big problem with Facebook's PR is that they reflexively deny that Facebook has any negative effects, that they had responsibility for anything, that anything is their fault, etc. It's a major reason why people think they are out of touch and why Zuckerberg has such low favorables. You're better off keeping your mouth shut and letting people think you don't give a f--k as opposed to writing a press release like that and proving it.
They've been outmaneuvered, for one thing, because even before the Supreme Court struck down Roe there had been a drumbeat about the risk that people could have social media and other digital communications used against them in an abortion case, and those people have been waiting for a real case to turn up.
I for one think this was a very well-executed PR statement, with the caveat that I lack context: I was not aware of the reporting being referred to, and still am not. As someone without preconceptions to set straight, I can look up the backstory to fill in the blanks elsewhere should I be so inclined.
It's concise and to the point, mostly factual with a twist of putting in their understanding and framing. No fluff.
This clarification is either very poorly written, or very duplicitous. They say what court documents in part indicate, but give no clarity on whether that portion of the court documents were, or were not, part of what was under the secrecy order, nor do they reveal whether or not they had access to those portions of the documents at the time. The entire statement seems designed to confuse the issue.
Laying that aside, are they saying that they will not turn over user data when they know the request for data relates to a person being prosecuted for seeking or helping with an abortion? I don’t think they are saying that, but they are trying to get good will as if they would fight such a request, even though they wouldn’t.
So Facebook's point here is that their legal team is so utterly clownshoes that when receiving a warrant about burning a stillbirth they couldn't clue in that it might be about abortion. Their press release is designed to convince us that a local NE PD completely snowed this international corporate powerhouse.
I read the piece as a barking tantrum: that they, a company so magnanimously capable of eschewing everything from taxation to even cursory efforts at regulation, could be tricked into delivering a very public, very negative publicity event by not a nation, but a middling state that generates less GDP than the annual revenue of Meta.
It also felt like a shot across the bow for Nebraska that a correction was mentioned at all, and so tersely, as I'm sure legislators in the state of two million were expecting business as usual.
Either way, this is a $117bn company that owns the eyes and ears of nearly the entire planet. Piss in their cheerios and 'the algorithm' might make your next re-election a lost cause.
Was anyone mad at Facebook for responding to a valid warrant? Whether the warrant mentioned abortion or not is irrelevant. It was a warrant. They had to comply.
However, this should be a fairly huge signal that Facebook needs to move to E2EE by default for Messenger.
So they both played along, and now they're saying that they will be forced to play along every single time and there's no way to stop it. This is even worse than them being knowingly complicit.
They are forced to comply with legal warrants. They can try to fight some things, but if the warrant doesn't even mention everything (like abortion suspicions) then what do you expect?
Though I would say, they appear to be choosing to not use end to end encryption. I don't know if that's due to technical limitations or because they don't want to, but I imagine they could figure something out if they wanted to.
I don't expect anything. I'm merely stating how things are. Unless it affects Facebook's income, I don't expect the situation to change.
The solution is obvious. Individual humans should stop using corporate controlled communication services for personal, or private, communications. Corporations both cannot and will not protect you from authoritarian government abuses.
I've been saying this since the early 2000s, but like with most things, people will never care until it affects them personally.
This is the first I'm hearing about this case. Taking FB's text at face value, my comments:
> We received valid legal warrants from local law enforcement ... investigating the alleged illegal burning and burial of a stillborn infant.
Law enforcement should not be investigating this IMHO; and the availability of people's Facebook interactions record lets the police access additional information they would not otherwise have had in their misplaced investigation.
> The warrants were accompanied by non-disclosure orders, which prevented us from sharing information about them.
That's just dandy, the government will order large corporations to keep everything secret from you, and of course Facebook is a law-abiding corporation, so of course they will hide this from you.
Secret investigations, constant mass surveillance used in secret investigations, extremely carcerally-minded legal system, and a trigger-happy police in case you somehow get the wrong idea about resisting anything. Verily, the USA is the land of the free.
Everyone talking about not having E2EE in messenger, am I missing something? How would one implement E2EE on a web application? How would the key sharing work on such a transient platform?
Well, there _is_ E2EE in messenger, it is just off by default. I am not sure how they manage key sharing, but likely in some way that would allow them to add another device to your session relatively transparently, which means that they could always bypass it in that way. Of course, they could also bypass it by shipping a targeted software update to force your app to resubmit all the messages unencrypted.
And of course, all those ways to bypass e2ee are properties of e2ee itself (or of common features required for usable e2ee like being able to add devices), not properties of facebook or facebook's implementation. e2ee generally should not be considered to be protecting you from a malicious messaging service.
My biggest take-away and concern from this is that non-disclosures can be part of warrants. What? I thought this was unique to national security letters.
Gag orders. This has long been a standard practice in the US, for better and worse. When Eliot Ness was investigating Al Capone, do you think the warrants for wiretapping phones were issued in open court (edit: or that AT&T would be free to tell Al Capone that his phones are wiretapped)?
That makes sense, despite me not liking it - but what's the logic in a gag order in this instance? There's nothing the 17-year-old or any other user could do to change or remove their messaging/post history, which I'm sure Facebook keeps in perpetuity regardless of any warrant.
Guess not. This must be why some organizations maintain warrant canaries. Of course Meta probably receives enough search warrants to make a canary useless, but it begs the question: if they had one, how many times per day would it need updated?
I've long given up on this fight - it seems to be that it's acceptable to say "begs the question" as long as it's followed up with the question that it's "begging". I think using "begs the question" without the follow-up question is still problematic though when they're not referring to the fallacy that "begs the question" originally meant.
I apologise for shaking that hornet's nest. It's just that there's a right way of saying that something raises a question; it doesn't involve the use of the word "beg".
If you really want to say that something "begs the question", then you could say what the question is supposed to be. For "begging the question", the question is one or other of the propositions in an argument, and "begging" it means assuming its truth or falsehood.
I find it pretentious to use terms like "beg the question" incorrectly. Instead of using these erudite-sounding formulations, one could just say what one means. Apart from anything else, I think simple, straight talk is more convincing. Especially if your audience might include people for whom English is a second language.
It annoys me in the same way the word "utilize" in place of "use" does and for the same reasons. And yes I agree on how it should be used.
The only hill I'm willing to die on anymore is the now-accepted use of the word "literally" when the writer/speaker would be more accurate in saying "figuratively".
People love the "language evolves!" argument, but my thought is that language evolutions shouldn't be accepted if they actively make communication more difficult or less clear.
> The only hill I'm willing to die on anymore is the now-accepted use of the word "literally"
Oh, $DEITY. That one really pisses me off; but that hill is almost a guaranteed death sentence. It's in Merriam-Webster, saying it means one thing as well as its negation. I'm sure there are better hills to fight for.
You can call me a deserter or a fugitive, if you like. I don't care. I won't die on the hill called "literally".
Great of FB to correct the record on this. I hope they correct the record on millions of instances of misinformation spread on their platform as well. They didn't seem too eager to do that in the past.
I'm not sure what people are expecting from Facebook/Meta here. I mean they have to comply with legal warrants and the laws of the land. The only unanswered question here is could Facebook have fought this search warrant? We still don't know that.
But would any other company have done anything different?
The only thing you, as a user, can really do is use a messaging service that has end-to-end encryption so the company doesn't have that ability. But do they really not have that ability? It's harder to say. Signal? WhatsApp? Telegram?