Tim Wu Is Out of Control (techdirt.com)
22 points by hn_acker on July 3, 2024 | 31 comments


The featured article is Mike Masnick's response to a different article by Tim Wu titled "The First Amendment Is Out of Control".


> As Justice Robert Jackson put it in 1949, “if the court does not temper its doctrinaire logic with a little practical wisdom, it will convert the constitutional Bill of Rights into a suicide pact.”

This is similar to the frequent misuse of the "free speech does not include the right to shout 'fire' in a crowded theater" quote. Oliver Wendell Holmes said that in a decision upholding punishment for distributing anti-World War I leaflets, under the same law that sent Eugene Debs to prison. In other words, it was not something to be proud of.

The "suicide pact" quote just said that protecting public safety has always tempered free speech rights. Note that Justice Jackson didn't condemn hate speech, or hurting people's feelings, or supporting dangerous political candidates.

In other words, the First Amendment is not out of control. We already had all the limits we need.



I've always been confused by this First Amendment argument with regards to TikTok: they're an organization that has been tied directly with an adversarial foreign state. How is this a rational take? Using this logic, Russia should be allowed to foment unrest through fake hate groups on Facebook (which they've done).

People should familiarize themselves with Gresham's Law: bad actors will always beat good actors if bad actors suffer no penalty. If bad actors leverage the rights and freedoms of a democracy to perform attacks without repercussion, we're toast.


>I've always been confused by this First Amendment argument with regards to TikTok: they're an organization that has been tied directly with an adversarial foreign state.

The First Amendment doesn't contain exceptions for adversarial foreign states; it's that simple. If it's acceptable to foment disinformation, hatred, and conspiracy domestically, and all free speech advocates will say that it is, then the same speech coming from foreign adversaries must also be acceptable.

And let's be clear - the premise that TikTok is some kind of nefarious CCP mind control platform is entirely speculative, and based primarily on Sinophobia. Elon Musk is driving far-right, white supremacist, and anti-vax content all the time, but people are losing their shit about something TikTok isn't even doing.

>People should familiarize themselves with Gresham's Law: bad actors will always beat good actors if bad actors suffer no penalty.

This is true, but as far as the First Amendment is concerned, it's the job of society to penalize bad actors in the marketplace of ideas, not the government. Which is still further than many free speech advocates are willing to accept, but in practice means that it's up to TikTok (and every other platform) to decide what speech to carry, and what speech not to carry, and under whatever terms they choose, within legal limits of course.


TikTok is banned in China.

TikTok has gone on record that they would rather shut down than be forced to sell. No normal business or business owner would do that.


> TikTok is banned in China.

ByteDance made Douyin for inside China and TikTok for outside China.


The first time I fired up TikTok I was subjected to a video of an older, "surgically-enhanced" woman in a Trump bikini (intention obvious), and someone putting a bumper sticker about killing pedophiles on their car (intention less obvious: a dog whistle for QAnon).

Maybe it's not a nefarious CCP mind control platform. Maybe it is even doing this sort of thing totally blindly based on "engagement". But there's definitely propaganda being served up by default.


I struggle to see how limiting free speech would rein in big tech power.

Government should make it illegal for companies to algorithmically discriminate against, censor, or throttle speech... Technically, it should already be illegal under existing discrimination laws. An explicit law against algorithm-based discrimination is as far as the government should intervene, IMO. Also, the government should support many alternative platforms by taxing platform monopolies or through other regulations.


Limiting free speech only makes sense if you're a bureaucrat. The government wants to control information, and how citizens use information. They derive power from this, and use it to manipulate the public.

I'm saddened by Tim Wu's arguments; I thought he was one of the good guys.


> his nonsense pro-authoritarian theories in support of Texas’ social media law that would have stripped companies of editorial discretion rights and enabled authoritarian politicians to force companies to host messages they had no interest in associating with.

This right here, I think, is the absolutely fundamental tension. There are no good answers here because two people's abilities to speak are at odds. And while the 1A provides an answer that spells out whose rights win out, it's not clear whether this outcome is actually good.

We have public accommodation laws that very much violate a business's right to (non-)association because of the burdensome effects discrimination has when no one will do business with you. And I think you can make a very strong case that the same laws could be necessary in a world where, if a few large businesses refuse to do business with you, it's functionally impossible to speak in any way that matters.

I don't think a 1A that upholds the speech of individual real humans above the speech of corporate entities, when the two are in conflict, would be a bad one.

> Social media is not that. It’s nothing like a common carrier.

The whole argument in the latter half of this article hinges on this, and well… it's kind of arbitrary. Social media could be called a common carrier just as strongly as an airline could be said to curate a flying experience. Do I think the comments section of a blog post is a common carrier? No, not really. But Facebook or Twitter have more in common with an ISP than with a newspaper.


I interviewed at a non-profit that was not looked upon favorably by various big tech orgs. Microsoft had revoked their non-profit discount, and Salesforce would not renew their contract. This org was operating completely legally, just supporting causes disliked by these groups.

We’ve seen the same with banks refusing to do business with people because of their political or social affiliations.

These situations seem ridiculous. Of course a business should have the ability to choose who they affiliate with, but if all businesses disassociate simultaneously, what recourse does an individual or organization have?

At the same time, who would want to hold accounts for organizations universally considered “bad”? When the label of “bad” changes at social media speed, who is safe?


I completely agree with everything said here, this really is the heart of it. And I think folks feel so strongly about the right to what is essentially blacklisting certain people and organizations because it is one of the few ways you can "hurt" someone in polite society.

As strange as it is, I think we lost something when dueling and other violent extrajudicial justice stopped really being a thing. I don't want it back, certainly, but there is something to be said for the mediating effect of knowing that if you piss off enough people, they might drag you out of your house and beat you half or fully to death, or say literally burn down your house or business.

There's a very real asymmetry: when someone materially hurts you or your community and you don't have one of the means that society has deemed acceptable to "hit back" with, you're just stuck having to put up with it.


Extrajudicial justice tends to be more extrajudicial than just.

> literally burn down your house or business

See https://en.wikipedia.org/wiki/16th_Street_Baptist_Church_bom...


Since the original article mentioned was tweeted/Xed by Elon Musk, it gained a lot of traction and criticism as well [1].

[1] https://x.com/elonmusk/status/1808168603721650364


There are two problems around free speech in America, as I see it:

1. There is a small but very loud minority of people on the far left who are authoritarian and want to suppress all speech that they don't like (there are such people on the far right, but they are much less politically influential than they were a couple of decades ago). At this moment, far-left authoritarians are trying to ban dissent from their preferred points of view via claims of "hate speech" and "disinformation". The First Amendment was designed precisely to protect unpopular opinions, even odious ones, and there is no requirement in the First Amendment for truthfulness.

2. Speech has moved online, increasingly to a very small number of social media platforms and mass media outlets controlled by a very small number of owners. This provides a very small number of choke points where an enterprising authoritarian, whether private citizen or government actor, can suppress speech that they dislike. The "algorithms" argument is a red herring; the most basic algorithm is "publishers I choose to follow, in reverse chronological order, excluding publishers that I block" (a minimal sketch follows below), but the social media companies can't resist making it more complex, and they don't provide this basic capability as a selectable option.
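
For concreteness, here is a minimal sketch of that content-neutral baseline feed. This is illustrative Python only; the Post type and the followed/blocked names are hypothetical and not drawn from any platform's actual API.

    # Minimal sketch of the content-neutral baseline described above:
    # posts from publishers I follow, newest first, minus publishers I block.
    # All names here (Post, basic_feed, followed, blocked) are hypothetical.
    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Post:
        publisher: str
        created_at: datetime
        text: str

    def basic_feed(posts, followed, blocked):
        # Keep only posts from followed, non-blocked publishers,
        # then sort in reverse chronological order.
        visible = [p for p in posts
                   if p.publisher in followed and p.publisher not in blocked]
        return sorted(visible, key=lambda p: p.created_at, reverse=True)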

The solution to #1 is to fix #2.

The solution to #2 is to end corporatism; let's stop treating companies exactly the same as we treat humans with respect to rights. Companies don't act like humans when they are run well (they are amoral entities that exist to serve limited business purposes for the benefit of investors). When poorly run, they magnify the power of the people at the controls of the company. And "incorporation" is not a natural construct; it's a legal construct. Laws should, IMO, benefit the public first. If we're going to create a legal structure that bestows limited immunity from liability on investors, then why not attach some pro-social strings?

For example, we could change Section 230 (the provision that grants immunity from liability for user-created content) to explicitly state that the immunity applies only when the company acts as a common carrier, i.e. it does not have the right to dissociate from specific customers, and it by default shows all content using the content-neutral algorithm above. The company may ALSO provide its own algorithms for displaying and prioritizing content, but those must be explicitly opt-in and able to be opted out of.

If this seems radical, consider that under this suggestion a company could still moderate however they want, but the moderation must be optional; alternatively, they could choose to act like newspapers or TV stations, forgo the liability protection, and moderate and associate with whomever they want.

I'm simply saying that I think it's a fair trade for society to say "if you want all these immunities from liabilities that normal people don't enjoy, then you have to serve all people, even people you don't like".


Ultimately I think this doesn't work because, politically, it's about power, not fairness. Your solution might work if it were about fairness, but most people will want to use the moderated feed, and there will still be arguments about why person A's content is not in the moderated feed. What worries me about "taking companies' free speech away" is that the ultimate goal for a lot of people is to be unmoderatable, which sort of fundamentally means they don't want Twitter to be a fair town square; they just want it to go away.

Ultimately I think we'll start to see harder pushes on this; for example, I think when conservatives get enough power they'll try to age-wall all LGBT content on the internet. And I suspect that if liberals felt they weren't mostly winning the moderation battle, they'd push harder.


Normal people do enjoy the same immunity from liability. Put up a WordPress blog and the same rights that protect Facebook and Twitter protect you when someone posts a comment.


Not true: as an individual you are liable for everything you do.

Incorporation shields personal assets from corporate liability and usually shields corporate officers, and in any case corporate officers usually have directors & officers insurance, so they are largely unaffected by lawsuits.

Not to mention the sheer scale difference: having to fight a lawsuit can cost hundreds of thousands of dollars; most individuals can't afford to fight legally, either offensively or defensively, the way most corporations can.


Unless the hosting company or a company like Cloudflare decides to cancel the WordPress site. I guess in theory, as long as your Internet service provider is willing to give you a connection, you could host a local website. I guess.


You make it seem as if this is a common occurrence, but I'm only aware of it happening with extremist content; it isn't a problem for 99% of people. And if it does happen, extremists can find other hosts, as they often do. That isn't a first amendment issue, it's a business relationship issue. In a free society, businesses can't be forced to have you as a customer.


Last I saw, the word “extremist” wasn’t in the first amendment. And in any case it’s highly subjective.


You're the last person who should be making an argument from the first amendment, given that the first amendment explicitly allows private platforms to do what you're opposed to them doing. You literally started this subthread with a thesis that the first amendment as written wasn't sufficient for modern society, and I don't see the words "corporation" or "algorithm" in it, either.


No, the first amendment allows PEOPLE the right of free speech. The core of my argument is that corporations themselves are NOT people. The individuals who work for (or own stock in) a corporation are people and can do what they want; they just shouldn’t be allowed to use the money and power of the corporation to advance their own agendas.


I have no idea what the man's reasons are, but supporting a ban on TikTok isn't necessarily against the first amendment.

TikTok is a fountain of hot garbage. It drowns out legitimate speech with algorithmically driven propaganda and memes. TikTok is to free speech what waterboarding is to hydration.


Hot garbage, algorithmically driven propaganda and memes are all protected by the First Amendment.


A rule that undermines itself is a very stupid rule; or in the case of the Constitution, a very stupid interpretation.


¯\_(ツ)_/¯ Change the Constitution then. If we have to tolerate spree killers and school shooters because the founding fathers wanted to make sure plantation owners could put slave revolts down, then we have to put up with flat earthers and memes on the internet because the First Amendment was written when speech never traveled faster than horseback, at a time when only the speech of rich white Christian men ever made it into print to begin with.

It would be nice to not have to deal with the negative externalities and technical debt that comes with being wedded to a 200 year old document written by a pre-industrial society that still believed in witches and vampires and killing the "savage Indian," but it is what it is.

Government is not allowed to police speech, full stop. It still does, but in some very limited and generally universally agreed upon ways (such as having laws against libel and slander, or labeling laws regulated by the FDA, or the FCC regulating the broadcast spectrum.) But having the government regulate social media because of the quality or nature of speech would obviously be beyond the pale.


Or, you know, interpret it in a way that actually makes sense in the time we live in, like we always have.


Exactly. The Second Amendment doesn’t protect the right to own biological or nuclear weapons despite the fact that they’re obviously “arms”.

The First Amendment has for a long time been regarded as being subject to reasonable limitation. This new wave of extremist interpretation is alarming and counterproductive to a civil society.

Unfortunately the judiciary has been overrun by an infestation of staunch originalists clinging to literal textual interpretation that would only work centuries ago.


No can do. The majority of justices on the Supreme Court were hand selected by the Federalist Society specifically to prevent that, and they have the rest of their lives to exercise their absolute and uncheckable power towards that end.



