> Killing 70,000 as revenge for 1,200 is not particularly blurry
It’s not. It’s horribly wrong. But lines are drawn from both sides.
In my opinion, there isn’t a terribly sympathetic party in that war. Just two sides that deserve to live and don’t acknowledge that right in the other, with one having overwhelming firepower to prosecute its belief.
To make an analogy, the nuking of Japan is morally blurry even if the Japanese Empire (and the use of nuclear weapons) were each, individually, clearly morally wrong.
> maybe we shouldn't go out of our way to support one of those sides
Yes. We shouldn’t be providing any financial or military aid to Israel. And we should pass into law a process, subject to judicial oversight, that bars even weapons sales to countries systematically engaging in war crimes.
There is bipartisan agreement on the first point. I think there could be on the second. The problem is both sides’ activists are so polarized on this issue that everyone in the middle must choose between being vilified by both sides (Exhibit A: the angry dude in this thread) or ignoring the issue. (Admittedly, I’ve taken the second path.)
> There is bipartisan agreement on the first point.
I don't see how this can be true. The last initiatives to stop weapon sales all died in the House (although with shrinking majorities). Meanwhile weapon sales, military and intelligence cooperation and diplomatic protection continue with no change.
By now a majority among the US population has changed their mind on Israel - but the actual decisionmakers haven't and I don't see that they will in the future either.
> And we should pass into law a process, subject to judicial oversight, that bars even weapons sales to countries systematically engaging in war crimes.
> don't see how this can be true. The last initiatives to stop weapon sales all died in the House
Sorry, I meant among likely voters. Barring Trump and Netanyahu personally falling out, MAGA can’t piss the latter off. That locks what can happen until ‘28 (‘26 if November is wild).
> a majority among the US population has changed their mind on Israel
Yes and no. Yes, in that the balance has shifted. (And markedly in the Democratic party.) No, as in this isn’t a deciding issue in almost all districts. Put another way, the polarity has shifted, but the magnitude has not, even if this is rating pretty high for a foreign policy issue.
That’s why I think cutting aid is doable. You unite the isolationists on the right with the majority of the left, thereby turning the usual weakness of foreign policy—voters’ apathy towards it—into a strength. (Gutting foreign aid is usually popular by default.)
We will not let you "both sides" a genocide, and I'm frankly tired of your Zionist gaslighting rhetoric. You really think people are stupid enough to still take this bait? I don't think so. You stick to that selective outrage about Oct 7 while the entire history of Israel is founded upon a multitude of massacres and mass graves of Palestinians in the Nakba, Tantura and co, yet you're still trying to pretend that history started at some recent date convenient for your Zionist victim narrative. No one buys your deceptive faux-neutral rhetoric; it's ineffective and outright embarrassing.
Some combination of America being Beirut’s security patron, not wanting to get into a shooting match with the emerging regional hegemon and being underpowered relative to both Israel and Hezbollah.
The LAF is kitting up to disarm Hezbollah, which is now a de facto Iranian occupying force. It could then, with foreign assistance, most likely, start working toward securing its southern border. (Lebanon probably needs a new constitution first. The current one doesn’t empower anyone to negotiate with anyone.)
> the one million Lebanese citizens who just permanently lost their homes just had bad luck
No, the folks up north traded their homes and security for keeping Beirut more or less intact.
Lebanon doesn’t have a great security solution so long as it contains Hezbollah, a force that no longer fights for the Lebanese people but entirely for a foreign leader. For Tehran, keeping a low-boil conflict going in southern Lebanon is useful. Same for Hezbollah. Israel and Lebanon should both prefer no conflict, though Netanyahu clearly prefers it for personal political reasons.
Israel has to rid itself of Likud. Lebanon of Hezbollah. Fortunately, the LAF has a built-in confidence-building exercise in disarming Hezbollah (initially within a green zone).
I think they answer that question pretty convincingly: if what you're looking at is already on the screen, it's much easier to point to it and say "that" than to describe it.
(And if it's an abstract entity like a file, it might not even be possible to describe it, short of rattling off the entire file path)
> This is a good instinct: one of the virtues of democracy is the way that it gives people a feeling of control over their own lives. People who believe that they can rein in AI companies through votes and laws and regulations will be much less likely to turn to violence.
I like how this is entirely put in terms of "feelings" and "beliefs" with the ultimate goal being to keep people from resorting to violence. It doesn't seem to play any role how much control people actually have.
> "jobs are going to still exist you just can't imagine them!"
Ironically, this makes even less sense.
If the goal of developing LLMs was (ostensibly) so we can all create more while working less, but he also assures us there will be just as much work in the future, then what was the point of this tech in the first place?
I am by no means defending Sam Altman here, but it's roughly the same value proposition as every productivity enhancing technology. Creating more even if you don't end up working less means at the end of the day we all still have more. There are certainly potential problems when it comes to how that "more" is distributed, among other issues, but things that increase human productivity tend to go along with increases in quality of life even if it doesn't mean you get a bunch more free time to sit on the beach drinking Mai Tais.
And truthfully those productivity enhancements mean that you probably could indeed work less, as long as you're willing to also forgo the standard of living improvements that go along with them. The idea of the digital nomad living in some incredibly cheap but less than advanced country is based on exactly this concept. But a lot of people aren't willing to do that, nor should they feel compelled to. Working the same 40 hours a week while making more stuff seems perfectly reasonable.
What's the point of listening to purely AI-generated music?
I don't mean music that has AI-generated stems as part of an arrangement, where a human actually created it and used AI for bits and pieces. I see absolutely no point in listening to purely AI-generated music. The fundamental essence of music is emotion; listening to something generated without emotion is pointless. It might sound good, but it's hollow and devoid of meaning.
I've tried to listen to it; it doesn't even make me "sad", it makes me feel... nothing. I'm a hobby musician, and I incorporated some AI-generated parts in some tracks where I mangled/processed them, but my idea was exactly to express how hollow AI-generated music is without the human aspect.
> What's the point of listening to purely AI-generated music?
For formulaic music-as-a-product (McMusic™) it arguably makes no difference whatsoever whether it is totally machine-made or assembled out of vat-grown parts in the muzak factory. This says far more about this category of music than it does about the value of machine-made music. Insta-pop, a large fraction of hiphop, supermarket country, plastic metal: there's plenty of formulaic trash made by man and machine alike. Even the supposedly man-made stuff was often half machine-made before the advent of generative models, so the other half did not make much of a difference.
If you're looking for music which makes you feel things (other than 'comfortably numb', to borrow a phrase from some real musicians), you're probably looking in the wrong area. It is the new music for airports: elevator music, hold-the-line music, slide-show music, acoustic filler.
A lot of the music on autoplay on Spotify is AI, and I literally didn't know until I checked; the emotion was triggered successfully. I don't really see why only a human could trigger an emotion in you. Say I'm at a party, I don't know the artist, everything is AI-made, and everybody is vibing: then what's "wrong" with it?
I think this is more of a musician's perspective, which I respect, but a lot of people would simply not care who (or what) created it.
Most people don't care about music, as most don't care about art in general. People like entertainment though.
What you are describing is more akin to a form of hollow entertainment through the medium of music; a lot of pop music can also fall into that category (no, not all: there is also a lot of artistry in many pop artists/songs).
If AI-generated music triggers emotions in you then keep consuming it, but know that it's a hollow form of the art; there's no one on the other side communicating with you. It's basically like having a conversation with a chatbot: it might sound human, but you know there's no one on the other side listening to you. AI music is the same in reverse: there's no one on the other side telling you a story, or a feeling they went through. It's just a mimesis of it.
Music has served various roles throughout history. The whole notion of music being "art" and "invoking feelings" has not been consistently true across its history in various cultures. Painting, drawing, sculpting, and the other visual arts have a similar history as well.
We can take examples from famous composers: much of Haydn's work, some pieces from Handel, Bach, Mozart, etc. Some of their works were commissioned pieces for particular functions, whether for courts, dances, aristocratic displays, churches, or other events. Even on the battlefield, music has been used to direct troops, relay orders, and handle other forms of communication. My point is that there is not always a story to be told. Music can also be used to disrupt one's sense of time: while on hold on the phone, in elevators, etc. I would not say the music in those instances is really telling me a story either.
Much like the visual arts: emotion can be expressed in a piece, but pieces can also be functional in nature. There is a difference between the figures in an instruction manual, portrait paintings, and a van Gogh piece.
Not to mention that this debate has been had countless times throughout history as well. It's always the same No True Scotsman fallacy. For example, some critics of electronic music made a similar argument well before AI.
"It's not real music if there are no instruments."
"It's not real music if <racial/cultural demographic> creates or plays it."
"It's not real music if the music does not adhere to contrapuntal rules."
I think what angers people most is that as technology progresses, the gap between effort and accomplishment decreases. Thus some cling to a sort of sunk cost fallacy, as if something being easy to create devalues all the effort one has put into it. Maybe it does? I do not personally think so. If anything, it allows greater access for people to participate in the arts, something the arts have also historically tried, with mixed success, to gatekeep.
The invention of the camera did not make painting irrelevant. It even opened a new door to the world of visual arts. I do not think AI music will make musicians irrelevant either, and perhaps new doors might open too.
There, patterns of electrical signals are said to be good, and not bad, but at the same time "morality" gets dismissed. But ideas of good and bad are morality!
Note that moral ideas don't have to be correct. "I should burgle a house" is a moral idea, an idea in the domain of morality. 1920s disapproval of the seductive decadent jungle rhythms of jazz was moral (I guess we say "moralistic" to indicate that we don't agree). The opposite attitude, praising jazz, is also moral. Treating Dylan as a traitor for going electric was moral, and attending the metal love-in that was Ozzy's farewell concert was also moral.
Then, a couple of posts up the thread from you, there's an imagined scene of people "vibing" to music at a party where everything is AI made. This sounds disgusting, somewhere between vaping and using a vibrator, and so I think I have to grudgingly give it my full approval. These imaginary young people are enjoying the vibe that they have vaguely selected. Maybe they had some input about the genre, maybe implicitly. They're choosing not to turn it off, anyway, because they like it, they think the vibe is good.
You imply that everybody saying "It's not real music" is wrong. OK, kind of, but they're not completely wrong. It doesn't follow that just because of our long history of snobbery, everything is therefore real music. The snobs are doing gatekeeping, but they're also doing discernment, and participating in the kind of moral ideas that music and art are made of. It's such a pain to define art that I'm liable to be downvoted for trying: some people are certain that relativism is the way forward, and that it's a brilliant insight to throw our hands in the air and give up. You're quite right that it has to encompass lots of different things, and no one defining feature will withstand counterexamples, but it can still be defined in a vague way as a collection of optional qualities, under which we could say that an instruction manual is not really art, but arguably artistic or artfully made.
So, I'm not judging the AI music as art or not-art right now, but I'm saying that it's amenable to being judged as such. Anybody claiming that it's good music is admitting the possibility that it might be bad music, and this is a moral matter, about the value of feelings, meanings, and affections. That even applies to good or bad elevator music: it's trivial background sound, but approval or disapproval of it is moral. This is not about its worth as patterns of signals, because that's reductive. Those patterns mean things, or matter to us in ways that we have preferences about, which are value judgments.
I have. It's overly polished, formulaic and dull. It's devoid of any of the qualities that make music interesting. There's nothing a human is trying to communicate. Perhaps it could be used as elevator or hold music.
I agree, it's shockingly good these days; we can argue about morality etc, fine, but burying one's head in the sand and claiming it's bad puts you at odds with reality, which isn't a good place to be.
It's pretty silly that so many people take as an axiom that the human brain basically has a monopoly on certain patterns of electrical signals, and have semi-religious beliefs that this will always be the case.
It's not that AI can't convince a novice that what comes out is passable.
It's that experts in a field generally agree that what comes out is insidiously hollow garbage.
This isn't a "semi-religious" belief. It's linear token soup and diffusion bakes running headfirst into actual expertise, second and third order effects, refined skill and taste, and so on.
If you actually want to see civilization advance, you cannot rely on machines that merely mash up existing intellectual output while pretending to have expertise.
We already had that in the form of art school avant-gardism. AI is just style transfer of that, with corporate sycophancy and valley hyperbole as a veneer.
But do you really believe it will stay that way? What do you think models will be like 10 years from now? (And not only models; we must include processes and tools too.) Developers were thinking this until recently; then there is some sort of sudden switch of "shit, it's good enough", and after passing through a 50x loop it suddenly becomes "shit, it's actually great". Which proves, imo, that it's only a matter of time before it's not hollow garbage but actually innovative and expert in its field.
I still think you are entirely missing the point about music, or any art in general.
It doesn't matter how technically innovative a model is, or how much expertise it has: as long as an AI is not a consciousness that can express itself, its output will be hollow. There's no way around that.
If some form of AI becomes conscious and can express itself through whatever art form it conjures, why would it even use music? Music is human; it's tuned to how our brains work and perceive sounds. I'd be much more interested to discover what art forms another form of consciousness that we can communicate with could come up with on its own.
I can't fully agree with the hollow part. When AI resonates with me about real-life issues (I understand it's just a machine without thoughts), it's pretty expressive and spot-on, and genuinely useful. I don't really see why it couldn't be the same with music; it can already write completely unique pieces that are very entertaining and full of emotion (even though it's "fake")...
The brain perceiving sounds a certain way is, in the end, just data that can be mapped as well. An AI can make us laugh precisely because it understands speech really well (and will be a thousand times better someday), so what's the actual difference with music?
Let me give you another example: there is a meme about older folks getting bamboozled by AI images (especially doomsday stuff), which proves that those images do trigger genuine emotions in them. What's the difference whether that image actually exists or not (or whether, say, a human photographed it)?
What if that does not matter to someone? I know my opinion can't be common, but I cannot stand live music. I dislike the sound quality, the differences from the recording, the crowds, the cost, and more.
I know not everyone enjoys concerts, but it’s fundamental to my listening experience. That aside, I have no interest in music or art of any kind generated by AI. Other folks might, but I’ll have nothing to do with it.
The difference is the indelible reality behind it.
You are confusing the surface shape of it with the substance. What's the point of something without substance? Without meaning? It's just fake. Whenever you point out to someone that an image that brought them joy is fake, generated by AI, it immediately changes the feeling they had. It doesn't bring the same awe anymore; awe is reserved for what is real. It might bring awe in the sense of "woah, a computer can do that", but that's a different feeling than being in awe of the story the image created.
How can it be full of emotion if it's created by something without emotion? It's just a mimicry of emotion. I really cannot understand how you cannot feel that, knowing it's not created by another being. Being real is the whole point; an emotion triggered by something not real, not experienced, transformed, and communicated by someone else is inevitably hollow.
Like: how can AI know what it is to be in love? Or to feel the loss of a loved one? Or to feel despair about something? Or to feel depressed? Or to feel extreme joy? Why would you listen to a song telling you a story meant to evoke an emotion about an experience that simply never existed? There is no experience being transmitted; it's purely a hollow amalgamated mimicry of the experiences that were ingested, and the output has absolutely no emotion, just a synthetic mimesis of it.
You are enjoying the mimicry, it's entertaining, but I really would like for you to ask yourself deeper questions about this rather than be impressed by the surface of it.
> The brain perceiving sounds a certain way in the end is just data, that can be mapped as well
“In producing textiles but has there been actual positive impact in other sectors?”
I’m sure the Industrial Revolution didn’t just happen all at once, it started somewhere and crept.
Support of all kinds (including voice), marketing, real estate, finance... yes, a ton of fields are being heavily impacted right now. But right now doesn't even matter; what matters is what we know it will reach as theory becomes practice.
Generally, people don't care about "fields being impacted", and the students certainly don't. People care about the impact certain technology has on their daily lives, on their welfare and the ability to pay off their mortgage and provide a decent life for their children.
The AI as it is today isn't really doing any of those things. At most, it's a sort of reliable replacement for Google Search. Worse even, it's being presented as a threat to all those things the people care about.