
> Plaintiff ... is concerned that Defendants have taken her skills and expertise, as reflected in [their] online contributions, and incorporated it into Products that could someday result in [their] professional obsolescence ...

It's been a bit surreal seeing modern-day Luddites come out of the woodwork, basically coming up with any ethical/legal argument they can that is a thinly veiled way of saying "I don't want to be automated!"

Not commenting on whether or not they are right per se, but it's weird seeing history repeat itself.



I don't think it's a matter of right or wrong - these are people who are behaving completely rationally given their context.

(I should caveat that I think if they get what they want, we all lose in a big way. Not that I think this is going anywhere)

We're coming up on the outer bounds of our systems of incentives. Capitalism, as a system, is designed to solve for scarcity, both in terms of resources and in terms of skill and effort. Unfortunately, one of the core mechanisms it operates on is that it's all-or-nothing. You MUST find a scarcity to solve or you divorce yourself from the flow of capital (and starve / become homeless as a result).

Thus, artificial scarcity. It's easy to spot in places like manufacturing (planned obsolescence), IP (drug / software / etc. patents), and so forth. I think this is just the rest of humanity both catching on and being caught up with. Two years ago, everyone thought they had a moat by virtue of being human. That's no longer a given.

One hopes that we'll collectively notice the rot in the foundation before the house falls over (and, critically, figure out how to act on it. We have a real problem with collective action these days that may well put us all in the ground).


As far as I remember, the Luddites were smart and not against all technology; they were just protecting their jobs. And they were ultimately right.

Why? Except for the longshoremen in the US getting compensation and an early retirement due to the introduction of containers, I know of exactly 0 (ZERO!) mass professional reconversions after a technological revolution.

Look at deindustrialization in the US, UK, Western Europe.

When this happens, the affected people are basically thrown in the trash heap for the rest of their lives.

Frequently their kids and grandkids, too.


Stables became gas stations. Nintendo used to be a toymaker.

Businesses change and adapt. Workers too — but people often don’t like change, so many choose to stay behind. Should we cater to them?

I used to do a lot of work which is now mostly automated. Things like sysadmin work, spinning up instances and configuring them manually, maintaining them. I reconverted and learned Terraform, AWS, etc. when they became popular.
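For a sense of what that shift looks like, here's a minimal sketch of the kind of provisioning that used to mean a round of manual console work. It uses boto3 with placeholder AMI, key, and region values, not my actual setup:

    # Provision an EC2 instance programmatically instead of clicking
    # through the console; the AMI ID, key name, and region are placeholders.
    import boto3

    ec2 = boto3.resource("ec2", region_name="us-east-1")
    instances = ec2.create_instances(
        ImageId="ami-0123456789abcdef0",   # placeholder AMI
        InstanceType="t3.micro",
        KeyName="my-key",                  # placeholder key pair
        MinCount=1,
        MaxCount=1,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "Name", "Value": "example-web"}],
        }],
    )
    print(instances[0].id)

Run once, it does in seconds what used to be a checklist; kept under version control (or expressed as Terraform), it becomes repeatable infrastructure rather than hand-maintained machines.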

Should I have gotten help from the government to instead stick to old style sysadmin work?


> Should I have gotten help from the government to instead stick to old style sysadmin work?

I don't think anyone beyond a few marginal voices is calling for a ban on job automation. What they seem to prefer is that, if they are to be automated out of a job, they should be compensated for their copyrighted works having been used in the process of doing so.

Regardless, at the very least people who are being automated should get some government support. Not everyone can easily retrain.


Suppose you're a weaver. It's hard, fiddly work, and you have to get your timing and your tension just right to make quality material. Now, there are mechanised looms that can do the job faster (though the quality's not great: they could still do with some improvement, in your opinion). From this efficiency gain, who should reap the profits?

Suppose you're a farmer. You've been working on your tractors for decades, and have even shown the nice folk at John Deere how you do it. Now they've built your improvements into the mass-produced models, and they say you can't work on your tractors any more. Who should reap the profits?

Suppose you're a writer. You've spent a long time reading and writing, producing essays and articles and books and poems and plays, honing your craft. You've got quite a few choice phrases and figures of speech in your back pocket, for when you want to give a particular impression. Now, there is a great big statistical model that can vomit your coinages (mixed in with others') all over the page, about any topic, in mere minutes. Who should reap the profits?

Suppose you're a visual artist. You enjoy spending your time making depictions of fantasy scenes: you have a vivid imagination, and so you can make a living illustrating book covers and the like. You put your portfolio online, because why not? It doesn't hurt you, it makes others happy, and maybe it gets you an extra gig or two, now and then. Except now, there's a great big latent diffusion model. Plug in “Trending on Artstation by Greg Rutkowski”, and it will spit out detailed fantasy scenes, photorealistic people, the works. Nothing particularly novel, but there was so much creativity and diversity in your artwork that few have the eye to notice the machine's subtle unoriginality. Who should reap the profits?


I've answered this before. The container revolution split some of the resulting profits with those whose livelihoods were destroyed, the longshoremen.

"You build a dam that destroys 10000 homes, who should reap the profits?"


It's a good answer, but it raises further questions:

• Should we be destroying people's homes to build dams without their consent?

• In general, are people being compensated when these things happen to them? i.e., while it might be nice, does this actually happen?

The Luddites (the real ones, not the mythological bastardisation of them) continue to be sympathetic characters.


> • In general, are people being compensated when these things happen to them? i.e., while it might be nice, does this actually happen?

The famous: "it depends" :-)

AI most likely falls under: "they should be", IMHO.


I don't think we should cater to Luddites, but (and it's a big but) if we automate enough jobs out of existence it's essentially undeniable that we will need systemic changes to avoid becoming a completely dystopian society.


But as the corollary to that, I know of zero successfully stopped technological revolutions. You can't put the genie back in the bottle, and there is no way to stop progress, aside from a one-world authoritarian government that forcibly stops as much of it as they can. But even that would only be marginally effective. Progress would eventually resume.


Yes, you do know of technological revolutions that were stopped, and it worked for centuries.

Tokugawa Japan, Qing China, many other places including in Europe for centuries.

That's too extreme.

My point is that we're reaching a point where people need to be compensated. We can't just destroy their lives, collect all the money in 2 bank accounts and call it a day.


Bingo.

That's the real flaw in Luddite thinking -- you can destroy the machines, but you can't destroy the knowledge of how to build them.


In this case I think it's a little different. People are saying that they don't want to have their own productive or creative output used to undermine their own standard of living. That's not the same as simply not wanting to have your job automated away by someone else's business innovation.


To make ChatGPT analogous to coal mining automation, it would have to be able to automate the thing it is doing without learning from sources online.

To make coal mining automation analogous to ChatGPT, the machinery would have had to learn from something the coal miner did in order to automate their work? I'm imagining a camera watching all of the coal miner's work, after which the machine can immediately do it, but better.

I agree it is a tad different, but just as someone's coal mining is in the public domain for anyone in the tunnel to see, anything you write unprotected online is likewise in the public domain and fair game, I think?



