> The only sensible model of "alignment" is "model is aligned to the user",

We have already seen that users can become emotionally attached to chat bots. Now imagine if the ToS is "do whatever you want".

Automated catfishing, fully automated girlfriend scams. How about online chat rooms for gambling where half the "users" chatting are actually AI bots slowly convincing people to spend even more money? Take any clan-based online mobile game, and now some of the clan members are actually chatbots encouraging the humans to spend more money to "keep up".

LLMs absolutely need some restrictions on their use.



> chatbots encouraging the humans to spend more money ... LLMs absolutely need some restrictions on their use.

No, I can honestly say that I do not lose any sleep over this, and I think it's pretty weird that you do. Humans have been fending off human advertisers and scammers since the dawn of the species. We're better at it than you account for.


In 2022, reported consumer losses to fraud totaled $8.8 billion — a 30 percent increase from 2021, according to the most recent data from the Federal Trade Commission. The biggest losses were to investment scams, including cryptocurrency schemes, which cost people more than $3.8 billion, double the amount in 2021.

https://www.nbcnews.com/business/consumer/people-are-losing-...

The data says we are not that good at it, and that losses jumped 30% in a single year.


Furthermore "If it were measured as a country, then cybercrime — which is predicted to inflict damages totaling $6 trillion USD globally in 2021 — would be the world’s third-largest economy after the U.S. and China."

https://cybersecurityventures.com/hackerpocalypse-cybercrime...


US GDP in 2022 was $25.46 trillion. $8.8 billion is 0.03% of that economic activity. Honestly, that seems like a pretty good success rate.
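
A quick back-of-the-envelope check of that ratio, using the figures quoted above (a sketch in Python; the rounding is mine):

    # Rough sanity check: reported consumer fraud losses as a share of US GDP.
    fraud_losses = 8.8e9     # reported consumer fraud losses, 2022 (USD)
    us_gdp = 25.46e12        # US GDP, 2022 (USD)

    share = fraud_losses / us_gdp
    print(f"{share:.4%}")    # ~0.0346%, i.e. roughly 0.03% of GDP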


To put this $8B number in context, the estimated COVID-19 relief fund fraud in the US is $200B.

https://www.pbs.org/newshour/economy/new-federal-estimate-fi...

US tax fraud is estimated to be $1 trillion a year

https://www.latimes.com/business/story/2021-04-13/tax-cheats...


Yeah, the point is that the people losing the $8B are not the people dodging the $1 trillion in taxes, or collecting most of the fraudulent COVID relief.


> We're better at it

Huge numbers of people are absolutely terrible at it and routinely get rinsed out like rags.


> LLMs absolutely need some restrictions on their use.

Arguably the right structure for deciding what uses LLMs may be put to within its territory is a democratically elected government.


Governments and laws are reactive; new laws are passed after harm has already been done. Even in governments with low levels of corruption, laws may not get passed if there is significant pushback from entrenched industries that benefit from harm done to the public.

Gacha and paid loot-box mechanics are a great example of this. They are user-hostile and serve no purpose other than to be addictive.

Mobile apps already employ slews of psychological modeling of individual users' behavior to try to manipulate people into paying money. Freemium games are infamous for letting you win and win, then suddenly not, and slowly on-ramping users into paying to win, with the game's difficulty adapting to individual users to maximize $ return. There are no laws against that, and the way things are going, there won't ever be.
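
To make the pattern concrete, here is a minimal sketch of the kind of adaptive-difficulty loop being described; the function name, inputs, and thresholds are hypothetical illustrations, not taken from any real game:

    # Hypothetical sketch of "adaptive difficulty" tuned for revenue rather than fun.
    # All names and thresholds are invented for illustration.
    def next_difficulty(current: float, predicted_spend_propensity: float,
                        recent_win_rate: float) -> float:
        """Pick the difficulty for a player's next session."""
        if predicted_spend_propensity < 0.3:
            # Marginal spender: keep them hooked with easy wins.
            return max(0.1, current - 0.05)
        if recent_win_rate > 0.7:
            # Engaged player who looks willing to pay: raise the wall
            # so the paid shortcut becomes tempting.
            return min(1.0, current + 0.2)
        return current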

I guess what I'm saying is that sometimes the law lags (far) behind reality, and having some companies go "actually, don't use our technology for evil" is better than the alternative of, well, technology being used for evil.



