Yeah because if it wasn’t for COVID YouTube, Facebook, et al would never have removed any content on their platform, unlike what they had been doing all this while…
There are so many issues with this.
Being able to pick what content they host is fundamental to freedom of speech for private entities.
The real problem is twofold.
1. A few platforms hold monopoly positions. Who else can compete with YouTube? And the reason isn’t necessarily that YouTube has a particularly better UI that keeps viewers and content creators on it. The reason YT has all the content creators is that it leverages Google’s ad monopoly and is able to help creators make money. A decently functioning anti-trust system would have split Google Ads from the rest of the company by now.
2. The devastation of the promise of the open internet. VCs have spent hundreds of billions of dollars to ensure we remain in walled gardens. Open-source, self-hosted software, on the other hand, where the benefits are shared rather than concentrated in a few hands that can then spend billions to preserve that concentration, has suffered.
We need govt funding for open source and self hosted alternatives that are easy and safe for people to set up.
Combine the two and instead of YT getting to choose what videos are seen and not seen on the internet, major and small content creators would self-host and be the decision makers, and still make similar amounts of money because they could plug in the openly available Google AdSense (kind of like how you can on blogs…).
I think their real edge is a practically free and practically infinite bandwidth/capacity global CDN setup. There's no fundamental technical reason this has to remain the case, but bandwidth costs are significant for people relying on other services to provide it. Or those services are cheap but slow/capped.
This is the main reason I think alternative sites have a hard time competing. Play anything on YouTube from anywhere, and if it's buffering or slow then it's probably your internet connection that's the problem. By contrast, do the same on competing streaming sites and buffering is more or less expected, especially if you aren't in certain geographic areas.
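To put rough numbers on that bandwidth point, here's a back-of-the-envelope sketch. The bitrate and egress price are illustrative assumptions, not quoted rates from any provider:

```python
# Rough egress cost for self-hosting video, under assumed numbers:
# a 1080p stream at ~5 Mbps and cloud egress at ~$0.08/GB.
# Both figures are illustrative assumptions, not real quotes.

def egress_cost_usd(viewers, avg_watch_hours, mbps=5.0, usd_per_gb=0.08):
    gb_per_hour = mbps * 3600 / 8 / 1000  # Mbit/s -> GB per viewer-hour
    total_gb = viewers * avg_watch_hours * gb_per_hour
    return total_gb * usd_per_gb

# 10,000 viewers each watching one hour works out to roughly $1,800
# at these assumed rates -- trivial for YouTube, painful for a hobbyist.
cost = egress_cost_usd(10_000, 1.0)
```

Even if the per-GB price is off by a factor of a few, the shape of the problem is the same: costs scale linearly with audience, which is exactly the edge a free CDN removes.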
Monetization on YouTube is mostly just a carrot on a stick. The overwhelming majority of content creators will never make anything more than pocket change off of it. That carrot might still work as an incentivization system, but I don't think it's necessarily the driving force.
I'm not really disagreeing with you, but I have a 700/700 fiber connection that generally works perfectly for anything I do, and YouTube craps out pretty frequently. It'll just fail to load videos and I have to refresh multiple times before it starts working properly.
Also the frontend is generally very wonky; I'm wondering if it's severely over-engineered or something. It seems very simple, but it's failing at all kinds of stuff all the time. Shorts fail to load when scrolling, the scrolling just stops working, and sometimes it keeps playing the previous video's audio while the current video is frozen.
Sometimes if I write a comment and try to highlight and delete some of it, when I hit backspace it deletes the part that wasn't highlighted. A normal <input type="text" /> does not do that. Have they implemented their own text inputs in JS or something?
All you need for that component is a form with a textfield and a submit button. As far as I know that won't behave this way so I'm not sure what they're doing but it doesn't seem great.
I went and checked, it's a div. No idea why they would do that for that simple comment form.
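For reference, a native form gets correct selection and backspace behavior for free from the browser. A minimal sketch (the field names and action URL are placeholders, not YouTube's actual markup):

```html
<!-- Minimal comment form using native controls. The browser handles
     selection, backspace, and undo itself; no custom JS required.
     Names and endpoint are illustrative. -->
<form action="/comments" method="post">
  <input type="text" name="comment" placeholder="Add a comment…">
  <button type="submit">Comment</button>
</form>
```

Sites typically reach for a `contenteditable` div instead to support rich text, mentions, or emoji pickers, at the cost of reimplementing editing behavior (selection, deletion, undo) in JS, which is where bugs like the one described tend to creep in.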
Why should "YouTube" as an entity enjoy freedom of speech? They're a platform for user-generated content. Outside of outright illegal content (which is even tenuous sometimes; I'd like to reserve this for the worst of things), they shouldn't be able to pick and choose which UGC they are willing to allow. They're the modern "town square". They're effectively a monopoly in this day and age (yes, there are other video hosting platforms, but YouTube has the largest share of all by far, and is de facto the place people expect to find video UGC).
Serving video with high availability to millions of people is hard. Few organizations that aren't already flush with capital are going to be able to replicate that at any sort of scale.
I'm tired of big corporations using their might to override individual freedom of speech. Once you reach a certain size, you should have to make moderation a more personal thing. Instead of taking videos that aren't illegal in and of themselves down, they should have to empower the user to moderate their own feed. Of course, this is incompatible with the modern drive to use these platforms to push content in front of people, instead of letting them curate their own experience.
I don't have all the answers, but the "corporations = people, and thus corporations have freedom of speech" angle has done a lot of damage to the rights of individuals.
I think one thing that we should be more cognizant about in general is that corporations are a legal construct to begin with, and as such, there's no natural right to incorporate - it's strictly a privilege. So society attaching even very heavy strings to that is not unreasonable so long as they are applied consistently to all corporations. Which is to say, if corporations don't do what we as a society want them to do, beating them with a large and heavy stick until they start doing that is not wrong, and we should be doing more of it.
And if people really want their freedoms, well, they can go and run their business as individuals, with no corporate liability shield etc. Then I'm fine with saying that their freedom of speech etc overrides everything else.
> Being able to pick what content they host is fundamental to freedom of speech for private entities
I simply don’t think this applies to places like YouTube.
But if it does, then they also must be responsible for the content. It makes no sense that curating content is their free speech, but at the same time it's not their speech when the content could have legal repercussions for them.
The argument that removing videos is their speech implies that hosting videos is their speech. So they should be liable for all content they post.
They are two different things, though. One is actually producing content, and the other is deciding which content to host and share. And there are all kinds of legal and illegal combinations here. For instance, maybe they decide that it's okay to host Nazi content, something that is absolutely protected under the First Amendment. Or maybe they decide that it's not okay to host Nazi content, even though it's definitely protected under the First Amendment.
Also see Gonzales v. Google.
But really the most dangerous thing here is telling a company that they are legally liable for everything their users post. A large company like Google has the legal firepower to handle the massive onslaught of lawsuits that will instantly occur. A smaller startup thing? Not a chance. They're DOA.
Heck, even on my tiny traffic personal website, I would take the comment section down because there's no way I can handle a lawsuit over something somebody posted there.
I should not be required to host content I do not wish to host. And at the same time I must be shielded from liability from comments that people make on my website, if we are to have a comment section at all.
I think using the example of Nazi content and the first amendment is a distraction. What’s relevant is speech that is not legally protected.
Should the New York Times have civil libel liability for what they publish in a newspaper? Should Google have civil libel liability for what they publish on YouTube?
> The argument that removing videos is their speech implies that hosting videos is their speech.
There is no such implication because the first is an affirmative act based on their knowledge of the actual content and the other is a passive act not based on knowledge of that content.
The solution would be to revoke Section 230 protection from any platform which acts as a digital public square if it does moderation beyond removing illegal content.
Of course they would try their best to be excluded, to have their cake and eat it too.
The entire point of section 230 is to allow platforms to remove non-illegal content [1].
Basically there were two lawsuits about platforms hosting user content (Cubby v. CompuServe and Stratton Oakmont v. Prodigy). One platform tried to curate content to create a family-friendly environment; the other just didn't take anything down. The curating platform lost its lawsuit while the hands-off one won. Congress wanted to allow platforms to create family-friendly environments online, so Section 230 was written.
If something like that were put in place, any platforms acting as a “public square” should also be required to disable all recommendation and content surfacing features aside from search, algorithmic or otherwise.
Those recommendation features already do plenty of damage even with platforms having the ability to remove anything they like. If platforms are restricted to only removing illegal content, that damage would quickly become much greater.
* When a bot farm spams ads for erectile dysfunction pills into every comment thread on your blog... That's "legal content"!
* When your model-train hobbyist site is invaded by posters sharing swastikas and planning neo-nazi rallies, that too is "legal content"--at least outside Germany.
All sorts of deceptive, off-topic, and horribly offensive things are "legal content."
Sadly it turns out that the biggest driving force is politics, and the inability of our institutions to win with boring facts against fast-and-loose engaging content.
The idea is that in a competitive marketplace of ideas, the better idea wins. The reality is that if you don't compete on accuracy but on engagement, you can earn enough revenue to stay cash-flow positive.
I would say that as the cost of making and publishing content went down, the competition for attention went up. The result is that expensive-to-produce information cannot compete with cheap-to-produce content.
Your premise is incomplete. When someone posts illegal content on YouTube they are not liable if they are not aware of the illegality of that content. Once they learn that they are hosting illegal content they lose their safe harbor if they don't remove it.
Let me rephrase, since saying they lose their safe harbor was a poor choice of words. The safe harbor does indeed prevent them from being treated as the publisher of the illegal content. However illegal content can incur liability for acts other than publishing or distributing and section 230's safe harbor won't protect them from that.
The reason we're having this discussion on this particular post is that YT's AI is not infallible. There isn't a "standard rubric", just automated correlation-based scoring derived from labeled training data. In this case, the AI learned that media piracy and self-hosted setups are correlated, but without actual judgement or a sense of causality. So YT doesn't truly "know" anything about the videos despite the AI augmentation.
I am curious what you consider to be a "standard rubric": would that be based on the presence of keywords, or would it require a deeper understanding of meaning to differentiate the study/analysis of a topic from the promotion of it?
> Interesting position - when somebody posts illegal content on YouTube, they are not liable, it’s not their speech.
> But when I want to post something they don’t like, suddenly it’s their freedom of speech to remove it.
There is no contradiction there.
Imagine a forum about knitting. Someone, who has it in for the owners of this knitting forum (or perhaps even just a SPAM bot) starts posting illegal, or even just non-knitting content on this forum.
The entire purpose of the forum is to be a community about knitting.
Why is it the legal or moral responsibility of the knitting forum to host SPAM content? And why should they be legally liable for someone else posting content on their platform?
You're equating specific pieces of content with the platform as a whole.
There is no reality where I will accept that if I create something, spend and risk my money on web hosting, write the code, and put something out there, other people get to dictate what content I have to distribute. That's an evil reality to contemplate. I don't want to live in that world. I certainly won't do business under those terms.
You're effectively trying to give other people an ultimatum in order to extract value from them that you did not earn and have no claim to. You're saying that if they don't host content that they don't want to distribute that they should be legally liable for anything that anyone uploads.
The two don't connect at all. Anyone is, and should be free to create any kind of online service where they pick and choose what is or is not allowed. That shouldn't then subject them to criminal or civil liability because of how others decide to use that product or service.
Imagine if that weird concept were applied to offline things, like kitchen knives. A kitchen knife manufacturer is perfectly within their rights to say "This product is intended to be used for culinary purposes and no other. If we find out that you are using it to do other things, we will stop doing business with you forever." That doesn't then make them liable for people who use their product for other purposes.
This isn’t really what’s being argued. We’re not talking about a knitting forum. We’re talking about content neutral hosting platforms. There is a distinction in the law. If you want to not be liable for the content posted to your platform then you may not moderate or censor it; that seems like a fair compromise to me. Either you are a knitting forum carefully cultivating your content and thus liable for what people see there, or you are a neutral hosting service provider. Right now we let platforms be whichever favors their present goal or narrative, without considering the impact such duplicity has on the public users.
> We’re talking about content neutral hosting platforms.
There is no such thing as a "content neutral hosting platform." I know that people like to talk about social media services in the same umbrella as the concept of "common carrier", which is reserved for things like mail service and telecommunications infrastructure. And that might be what you're conflating here. If you're not, then please point me to the law, in any country even, where "content neutral hosting platform" is a legal term defined.
> If you want to not be liable for the content posted to your platform then you may not moderate or censor it seems like a fair compromise to me.
Compensation for what? The "platform" built something themselves. They made it. They are offering it on the market. If anyone is due compensation, it is them. No matter how much you don't like them. You didn't build it. You could have, maybe. But you didn't. I bet you didn't even try. But they did. And they succeeded at it. So where does anyone get off demanding "compensation" from them just for bringing something useful and valuable into existence?
That is a pretty messed up way of looking at things IMO. It is the mindset of a thief.
> Either you are a knitting forum carefully cultivating your content and thus liable for what people see there,
Thank you for conceding my argument and shining a spotlight on how ridiculous this is. You agree that, according to your worldview, the knitting forum should be liable for the content others post on it just because it enforces that things stay on topic. Even just removing SPAM bot posts would expose it to this liability.
> Right now we let people platforms be whichever favors their present goal or narrative without considering the impact such duplicity has on the public users.
The beautiful thing about freedom is that as long as people don't infringe upon the rights of others, they don't need your permission to just go build things and exist.
The YouTube creators didn't have to ask you to "allow" them to build something useful and valuable. They just went and did it. And that's how it should be.
I get that certain creators run into trouble with the TOS. Hell, I've tried to create an Instagram account on several occasions and it gets suspended before I can even use it. And when I appeal or try to ask "why?" I never get answers. It's frustrating.
But the difference between you and me, is I don't think that people who build and create things and bring valuable shit into existence owe me something just by virtue of their existence.
> The beautiful thing about freedom is that as long as people don't infringe upon the rights of others, they don't need your permission to just go build things and exist
This is hollow sophistry, and it’s not how things actually are.
You don’t have freedom for self-dealing, price fixing, collusion, bribery, false marketing, antitrust violations, selling baby powder with lead, and many other things.
In some states you can’t even legally collect rainwater.
Also the government will come after you with guns and throw you in jail if you violate some bogus and fictitious “intellectual property rights” that last for 70 years after creator has died.
It’s unhelpful to pretend we live in a Wild West of liberty.
> You don’t have freedom for self-dealing, price fixing, collusion, bribery, false marketing, antitrust violations, selling baby powder with lead and many other things.
It's funny how often people will not read what you wrote, and instead read what they want to read.
Not only did my comment preempt that specific reply of yours in the very sentence you quoted, but you seem to have a warped working definition of the word "freedom": where you think that if someone uses it they mean "freedom to do literally whatever the hell they want to no matter who they hurt."
That means that your mental model of the word "freedom", at least when you hear others say it, begins with a straw-man.
No discussion is possible under those conditions.
I'll help you out: my personal operating definition of "liberty" is "An environment in which all interpersonal relations are consensual."
That's why, as long as you are not infringing upon the rights of others (the part of my quote that you just completely dropped and ignored so that you could react to what you wanted to read instead of what I actually wrote) you don't need the permission of others to build something. You can just go and do it.
So then, your actual opinion is: yes, a "content neutral hosting platform" does exist?
It seems very obvious here that people are saying the laws that apply to common carriers could be changed so they apply to social media platforms.
Problem/confusion solved, and the world doesn't fall apart. We already have these laws, and the world didn't fall apart before.
> So then, your actual opinion is: yes, a "content neutral hosting platform" does exist?
No. Common carrier and "hosting platform" are not the same thing. If someone wanted to apply common carrier status to broadband infrastructure, it might make sense. Applying common carrier to knitting forum does not. They are two very different things. One facilitates discrete communication between two distinct parties while the other publishes and distributes content to a wide audience. Conflating the two is an exercise in mental gymnastics that only makes sense if you have a political agenda and don't care about being intellectually honest.
I honestly don’t know what you are spewing off about. At one point you quote me saying “compromise”, then proceed to argue as if I had said “compensation”. I’m not going to respond to a mischaracterization.
To your challenge:
> In the United States, companies that offer web hosting services are shielded from liability for most content that customers or malicious users place on the websites they host. Section 230 of the Communications Decency Act, 47 U.S.C. § 230 ("Section 230"), protects hosting providers from liability for content placed on these websites by their customers or other parties. The statute states that "[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." Most courts find that a web hosting provider qualifies as a "provider" of an "interactive computer service."

> Although this protection is usually applied to defamatory remarks, most federal circuits have interpreted Section 230 broadly, providing "federal immunity to any cause of action that would make service providers liable for information originating with a third-party user of the service."
There is clear legal handling in the US beyond common carrier provisions for hosting providers on the internet.
The nuance here is an argument over what constitutes a hosting provider and how far we extend legal immunity.
My “worldview” is that if you want to claim your business is a hosting provider so that you are granted legal protection from content liability, then you have a responsibility, which I’d argue we should codify more formally, to remain a neutral hosting provider in spirit, because that is in line with the type of liberty (freedom of expression) we aim to protect in the US. You are saying “legally I’m a neutral hosting provider”, and we already tolerate removal of spam and legally obscene/objectionable content, so your point there is moot. So if you are making that claim legally, then it’s two-faced to turn around and say “I’m a private entity, I can do whatever I want to curate the content on my platform because I’m responsible for the brand and image and experience I want to cultivate in my house”.
I’m okay with hosting providers not being liable for user content, and I’m okay with yarn forums deleting any post that doesn't reference yarn. It’s the mix of both that I feel is partly responsible for the poor state we’re in now where users get demonetized on YT for questioning the efficacy of new vaccine technology.
Hopefully it’s clear what the nuance is here. And if you don’t think there’s a whole conversation that has been happening here, read up on Cloudflare’s philosophy and what Prince has written about the topic, because they were faced with the same dilemma with The Daily Stormer (though not quite as flagrant as Google/YT trying to play both sides for profit).
The issue is that the knitting forum is a different beast from youtube. The latter is a platform. Its scale makes it QUALITATIVELY different. And there's network effects, there's dumping behaviour, there's preinstalls on every phone, there's integration with the ad behemoth, all to make sure it remains a platform.
This is correct. In the US, TikTok is currently being sued for feeding kids choking-game content through its algorithm, the same algorithmic feed that was earlier judged to be free speech.
Curation and promotion, even if done by a machine (LOL, why would that matter at all?), need to come with significant liability.
It should be possible to protect content hosting services from extensive liability while not protecting companies from the consequences of what they choose to promote and present to people. Those are two separate and very different activities that aren't even necessarily connected: you can curate and promote without hosting, and in fact this happens all the time; you can host without curating and promoting, which also happens all the time. As far as third-party content goes, the two are typically not mixed outside of social media companies with their damned "algorithms".
Unfortunately, it's a tall order in the current political environment, for the same reason open source funding isn't forthcoming; these are just parts of a bigger problem which is best discussed elsewhere.
With that said, you're absolutely right in your assessment; this is approximately what needs to happen in order to improve the current sorry state of media and public discourse. Sadly, as evidenced by the other replies to your comment, the public at large simply doesn't get it, and the situation is even worse with the structural changes needed to make a real solution possible.
It's a vicious cycle that results in ever worse media, and not only media. The current public spat between the two smartest people in the world (by mass media metrics), garnished with public blackmail attempts and private social media channels, is jaw-dropping proof of dysfunction, but of course the media presents it as casual entertainment.
> Sadly, as evidenced by the other replies to your comment, the public at large simply doesn't get it and the situation is even worse with the structural changes needed to make a real solution possible.
The ones with money and power (which are effectively the same thing) want it to be this way, as it makes them richer and more powerful. The masses are just pawns literally being moved around on the chessboard of society.
One thing I really wish is that more people volunteered to moderate things. It’s a volunteer position, it’s needed for most of the communities we are part of, and doing it raises the floor of conversations across the board.
The distance between the average viewpoint on how free speech works and the reality that content moderation forces you to contend with is frankly gut-wrenching. We need to shorten that distance so that when we discuss it online, we have ways to actually make sense of it, and so the creativity of others’ ideas can be brought to bear.
Otherwise, we’re doomed to reinvent the wheel over and over again, our collective intuitions advancing at a snail’s pace.
I dislike people promoting extensions to the formerly liberal moderation and content controls on the net because the current status quo was entirely predictable.
And your statement is wrong. There was a culture in which such content wasn't removed, and when it was, there was a backlash, even on platforms like Facebook. There certainly was a time when such removals generated pushback.
But activists demanded censorship and everything degenerated into some stupid partisan shit about bullshit topics that do not matter.
It doesn't take much to comprehend that the demands for censorship normalized it in the end.
And open solutions like ActivityPub also had to suffer the insufferable and made the openness a moot point.
Govt funding would make everything even worse because people would demand even more content controls and there are numerous leverages where public officials could be pressured to enact more content controls.
> Being able to pick what content they host is fundamental to freedom of speech for private entities.
Here's some text from Section 230 of the CDA:
> (c) (2) Civil liability
> No provider or user of an interactive computer service shall be held liable on account of—
> (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
> (B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1)
...
> (e) (1) No effect on criminal law
> Nothing in this section shall be construed to impair the enforcement of section 223 or 231 of this title, chapter 71 (relating to obscenity) or 110 (relating to sexual exploitation of children) of title 18, or any other Federal criminal statute.
Now in this case, you have YouTube, a service with obvious market power, taking down content promoting a competitor to YouTube. There are Federal criminal antitrust statutes.
Mhmm, so would it be fine for a private platform to not allow, say, Muslims on their website? Especially a platform as big as YouTube? I mean, it's essential to their rights to be able to do that, I guess?
Like, I understand your point, but this argument is usually not actually useful, especially since it's usually not coming from "free speech absolutist" types, so it always comes off as a bit disingenuous. Unless you are arguing for big corporations having an absolute right to free speech, which I would disagree with but which would at least make the argument consistent.
> Mhmm, so would it be fine for a private platform not allowing say, Muslims on their website?
Depends on the sense of “private”.
If it is, private in the sense that it is a platform run by a Christian Church for the use of organizations affiliated with that Church, and not offering information dissemination to the general public, sure.
If it's a private business offering platform services to the public at large but specifically excluding Muslims, then it is potentially engaging in prohibited religious discrimination in a public accommodation. Unlike religion, political viewpoint is not, federally, a protected class in public accommodations, though state law may vary.
(OTOH, under the federal Religious Freedom Restoration Act and similar laws in many states, and case law based on and in line with the general motivation of such laws, laws including state public accommodation laws, are being looked at more skeptically when they prohibit religious and religiously-motivated discrimination, as an impairment of the religious freedom of the discriminating party, in theory irrespective of the religions on each side, but in practice favoring discrimination by Christians and against non-Christians, so possibly the Muslim exclusion would succeed even in a public accommodation.)
I don't think anyone would argue that it would violate freedom of speech; however, it would still be illegal, as it would violate the Civil Rights Act by discriminating based on religion. There's more than one right involved in your hypothetical, basically.
We don’t need the government to throw money at “open source”. That’s silly. YouTube used to be a means to an end: you need to send a friend or a teacher a video, but email has a 25MB attachment limit, so you'd use YouTube or ImageShack. These days you can just use a text message or whatever platform you’re using to communicate. So YouTube has now become a platform for “content creators”. It’s a different beast. To compete with YouTube you have to not only make the video stuff work but also break the network effect and figure out how to pay creators.
Further, plenty of VCs don’t give two shits whether your thing is open source or not; they just want ROI. In my experience it’s tech law (or lack thereof) that missed the infusion of “internet maker ethos”. The depth of the average startup legal advice is “here’s a privacy policy and EULA that maximally protect your company at the expense of users”. “Here’s an employment contract template that tries to fuck your employees.” “It’s safest not to share your source code and keep it a trade secret.” “Go have fun.” If you want to see more open source then you need to cultivate that ethos among the people in power running the companies. So often I see the prevailing sentiment, even here, to be anti-GPL. The GPL may be imperfect, but if you care at all about the proliferation of open source in a western copyright regime, then pissing on the GPL as “the brainchild of crackpot Stallman” is not the way to get there.
If you want more open source then founders need to come to fundamentally understand that their source code is not what makes their business valuable; it’s the time and effort they put in to provide a service that others aren't providing or that is better than the competition. Too many founders are living the delusion that at a software level their engineers are writing novel, patentable, or trade-secret-level code that gives them a true algorithmic leg up. Nine times out of ten their shit is just new and fresh and disruptive. I understand that in rare cases people are doing truly novel things with software, but that certainly isn’t the default case.