The BMJ itself tightly controls who can say what on its platform, and that is precisely why it gained its reputation.
Granted, Facebook is supposed to be more open for everyone, but it’s still their platform, their reputation, and they can decide what to filter - and people can decide to take their discussions somewhere else.
The BMJ does not claim protections as a platform, whether or not you call it one. The BMJ is a publisher, and it takes responsibility for the content it publishes. The argument here is that Facebook is now also a publisher.
The distinction matters the moment FB (or Twitter) has a conflict of interest: their revenue depends on content being engaging, regardless of its factuality.
In my view, if they do any filtering or ordering of content, they have to be accountable for that content, since they are exerting a de facto editorial line.
I don't really understand how a newspaper can be held accountable for publishing an incendiary op-ed, while Facebook isn't held accountable when the same piece goes viral as a FB post. The only difference is that an editor picks the former and an algorithm picks the latter.
Is that it? Does delegating responsibility to a computer make us any less responsible for the results?
The issue is not whether Facebook, as a private company, can filter whatever it likes, but whether it does so reliably and truthfully in a way its users can trust.
Facebook can indeed decide what to filter, but it's also permissible for other organizations to make users aware that Facebook may filter and display misleading messages based on inaccurate fact-checking, and that the fact-checking organization behind the filtering defamed a reputable publication by falsely alleging that its content was a hoax.