Journalists say they don't write clickbait, but they do.
Wait until journalists find out about the chat export feature of WhatsApp. Or the share button. Think how much juicy clickbait could be written about those features.
I think you're grasping the wrong end of the stick.
If someone reports you, they (the people receiving the report) will be able to read the messages. This is the same if you use iMessage, Signal or any other system. (I.e., if you take your phone to the police and show them the messages, they will see your messages, obviously.)
The key difference is that Facebook has a button to report the message to them.
If you have access to the message, you can do whatever you want with it. There is no way around that. End-to-end encryption is like a mathematically sealed envelope: it only prevents seeing the message in transit. What the sender or receiver chooses to do with it is always up to them.
If the recipient then sends those messages to the cops, that doesn't mean it's not E2E.
Of course there's a bit of a sliding scale here. If the recipient automatically and unknowingly sends all past messages to the cops when they try to report a single abusive message, it's not E2E but it sure is a back door. It's not clear from the article just how much history is sent with each report.
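A minimal sketch of that point, using PyNaCl (the libsodium bindings); the encryption only protects the wire, and the recipient always ends up holding plaintext:

```python
# pip install pynacl
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair; the relay never sees private keys.
alice_sk = PrivateKey.generate()
bob_sk = PrivateKey.generate()

# Alice encrypts for Bob. Whoever carries the ciphertext (the "medium")
# cannot read it -- that is the whole of the E2E guarantee.
ciphertext = Box(alice_sk, bob_sk.public_key).encrypt(b"meet at noon")

# Bob decrypts. From here on, the plaintext is ordinary data on his device;
# nothing cryptographic stops him from forwarding it to the cops, to
# Facebook's moderators, or to anyone else.
plaintext = Box(bob_sk, alice_sk.public_key).decrypt(ciphertext)
print(plaintext)  # b'meet at noon'
```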
But why are we so desensitized that we just move on with life and let a company lie about what its product offers? Isn’t there a law somewhere about false claims? I’m sure it’s covered somewhere in the policies, but if a bread company said its bread was gluten-free and it wasn’t, they would be in deep shit, no?
I'm not a proponent of many of the practices Facebook regularly engages in, but this, as described, is quite harmless. I'm not sure what the purpose of this article is other than to try to paint a false narrative.
Are people really unaware that encryption doesn't limit the information they send to only ever being viewed by the intended recipient? That's not how encryption works and never has been; it's not a Mission Impossible letter that self-destructs.
The point of E2E encryption is that you trust a recipient and don't trust the medium data is sent through to the recipient. Once the recipient has the information, how well you assessed your trust in the recipient is what matters regarding the security of the information you sent. Nothing has ever prevented the recipient from breaking that trust and sharing information sent to them.
There are of course ways of reducing the recipient's access to the information through specific technology (view once, time expirations, "self destruction", highly controlled viewing areas, etc.), making it more difficult for the recipient to "prove" you sent the information and forcing them to rely more on their own word when showing it to others, but even that has never been very secure (Snapchat is a simple example).
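To make the weakness concrete: expiry is enforced by the recipient's client, not by the cryptography. A toy sketch (all names here are invented for illustration):

```python
import time
from dataclasses import dataclass

@dataclass
class EphemeralMessage:
    body: str
    expires_at: float  # unix time the *client* is asked to honor

def honest_client_view(msg: EphemeralMessage) -> str | None:
    # The official client hides the message once it expires...
    if time.time() > msg.expires_at:
        return None
    return msg.body

def modified_client_view(msg: EphemeralMessage) -> str:
    # ...but a patched client (or a screenshot, or a second phone aimed
    # at the screen) simply keeps it. Nothing in the protocol can prevent
    # this once the plaintext is on the recipient's device.
    return msg.body
```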
I wonder if Business Insider was paid by Facebook to report a story so absolutely obvious and benign, and package it as if it were part of the serious privacy requests formulated by the public and security/privacy NGOs, in order to discredit them.
It's a little clickbaity. Here is the core of the article:
> A Facebook representative told Insider that it allows users to report abuse, and those reports are then reviewed by contractors. When a user reports abuse, WhatsApp moderators are sent "the most recent messages sent to you by the reported user or group," according to WhatsApp's FAQ.
> WhatsApp is founded on so-called "end-to-end" encryption, which means that messages are scrambled before being sent and only unscrambled when they're received by the intended user. But when a user reports abuse, unencrypted versions of the message are sent to WhatsApp's moderation contractors, ProPublica reports.
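Mechanically there's nothing mysterious required for this. A plausible sketch of what the report button can do entirely client-side (the endpoint, field names, and message count here are hypothetical, not WhatsApp's actual API):

```python
import requests  # ordinary HTTPS; no key material involved

# Made-up moderation endpoint, for illustration only.
REPORT_URL = "https://moderation.example.com/report"

def report_user(local_history: list[dict], reported_user: str) -> None:
    # The client already holds every message in plaintext (it decrypted
    # them in order to display them), so it can simply bundle the most
    # recent ones from the reported sender and upload them over TLS.
    recent = [m for m in local_history if m["sender"] == reported_user][-5:]
    requests.post(REPORT_URL, json={
        "reported_user": reported_user,
        "messages": [m["plaintext"] for m in recent],
    })
    # The E2E channel between the two users is untouched throughout.
```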
I read somewhere that WhatsApp uses some AI to detect CSAM and send content automatically and without user consent to Facebook for review. Has this been confirmed?
^ This is the Head of WhatsApp at Facebook specifically calling out Apple for their client-side scanning, pointing out that Apple should instead have a way to report content; so, while he technically could be lying, I doubt that's true.
The Wired/Gizmodo article tries its best to imply that it does.
The main claim comes from one of the moderators saying: "the AI keeps sending us mundane pictures, like kids in bathtubs". This has allowed people to claim that Facebook is scanning every image/message. Whereas, obviously, given the paltry number of moderators, you're going to use AI to triage reported messages; otherwise you'd have to spend more cash on staff.
I think there is another bit where someone says they can trace the journey of the image. It's pretty trivial to do this with metadata; given that WhatsApp leaks a fair bit of metadata, it's not surprising.
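For what it's worth, "AI triage" here would just mean ranking the user-reported queue, something like the following shape (the classifier and threshold are invented for illustration):

```python
def triage(reported_items: list[dict], score_image) -> list[dict]:
    # score_image: any classifier returning the probability that an image
    # is abusive. Only items *already reported by users* reach this code;
    # the point is to order the moderators' queue, not to scan every
    # message on the network.
    scored = [(score_image(item["image"]), item) for item in reported_items]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    # A blunt threshold like this is exactly how "mundane pictures, like
    # kids in bathtubs" end up in front of human moderators: model false
    # positives, not evidence of blanket scanning.
    return [item for score, item in scored if score > 0.5]
```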
Many years ago I read how Facebook was developing technology where they would hash content before encrypting it, check the hash against a list, and then notify. Basically current CSAM tech.
That's when I understood that the whole E2E thing is for outsiders. Essentially: you can't have this data, only we do. First dibs and all.
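The scheme described is simple to sketch (the hash list and notify hook are hypothetical; real deployments like PhotoDNA use perceptual hashes rather than plain SHA-256):

```python
import hashlib

# A hash list distributed with the client (hypothetical contents).
KNOWN_BAD_HASHES: set[str] = set()

def send_image(image_bytes: bytes, encrypt_and_send, notify_vendor) -> None:
    # Hash the plaintext *before* it is encrypted for the recipient.
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in KNOWN_BAD_HASHES:
        # The vendor learns of the match even though the message itself
        # travels end-to-end encrypted: the check happened client-side,
        # before any encryption.
        notify_vendor(digest)
    encrypt_and_send(image_bytes)
```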
My question is whether that functionality can be triggered remotely (rather than a user creating an unencrypted message digest and sending it to Facebook).
If you think it's crazy that a user who you send messages to can send your messages to Facebook, wait until you realize they can send your messages to literally everyone and anyone.
E2E only protects between the endpoints, not after.
What I find the most shocking about all this is that people believe Facebook when they said it was E2E. If you can't inspect for yourself the default assumption should be that they're lying.
I even recall recommending Signal to friends -- nope: "We use WhatsApp it's E2E so even safer than Signal or Telegram that are deceitful as you have to use secret chat to get E2E!" All these FB Messenger advocates seem to have the same arguments, and yet none of the knowledge about how it works, plus this ludicrous "trust" in Facebook.
At least authoritarian regimes are distrustful of Telegram and Signal -- banning them.
> We use WhatsApp it's E2E so even safer than Signal or Telegram that are deceitful as you have to use secret chat to get E2E!
Just a minor correction, Telegram is the one that is only E2E encrypted via their secret chat feature. Signal is E2E encrypted at all times, and open source.
There is no indication that Facebook can read messages sent encrypted; the headline might give the wrong impression. But of course Facebook can read messages that are sent to it in clear text, to inspect them for abuse.
The article is written confusingly. It makes it look as if Facebook holds the private keys and WhatsApp is not end-to-end encrypted.
But considering this:
>> Those contractors, which Facebook acknowledges, reportedly spend their days sifting through content that WhatsApp users and the service's own algorithms flag.
Does Facebook use automated tools and perform "client-side scanning"?
That would also defeat the purpose of end-to-end encryption.
Nothing stops the client from sending the unencrypted version when someone reports a message they can read; it just grabs the plaintext and sends it to moderation.
Completely abusable... So E2E encryption doesn't really save us here.
https://news.ycombinator.com/item?id=28448593