
Well, yeah. I like getting computers to automate things and solve problems. Typing in boilerplate and syntax is just a means to that end, and not even remotely the most interesting part. I don't like managing my own memory either, so I prefer to use tools with garbage collection that handle that for me.

I mean, I guess when I was really early in my career I'd get a kick out of writing a clever loop or whatever, and I drank deep from all the low-level coding wisdom that was available, but the scope of what I care about these days has long since expanded outward.


I don't think this is why. I listened to a recent podcast with Michael Truell and he made clear they think the TUI is an inferior form factor. This feels more like a reactionary move than a visionary one.


I'm constantly disappointed by how little I'm able to delegate to AI after the unending promises that I'll be able to delegate nearly 100% of what I do now "in the not too distant future". It's tired impatience and merited skepticism that you mistake for fear and coping. Just because people aren't on the hype train with you doesn't mean they're afraid.


Personally, I am. Many of the unusual skills I have have already been taken over by AI. That's not to say I think I'm in trouble, but it's sad that I can't apply skills I learned just a couple of years ago, like audio editing, because AI does it now. Nor do I want to work as an AI operator, which I find boring and depressing. So I've moved on to something else, but it's still discouraging.

Also, so many people said the same thing about chess when the first chess programs came out. "It will never beat an international master." Then, "it will never beat a grandmaster." And Kasparov said, "it would never beat me or Karpov."

Look where we are today. Can humanity adapt? Yes, probably. But that new world is, IMO, worse than the one we have today; rather lacking in dignity, I'd say.


I don't acquire skills and apply them just to be able to apply them. I use them to solve problems and create things. My learned skills for processing audio are for the purpose of getting the audio sounding the way I want it to sound. If an AI can do that for me instead, that's amazing and frees up my time to do other things, or to do a lot more audio work. None of this is scary to me or impacts my personal dignity. I'm actually constantly wishing that AI could help me do even more. Honestly, I'm not even sure what you mean by AI doing audio editing. Can I get some of that? That is some grunt work I don't need more of.


I acquire skills to enjoy applying them, period. I'm less concerned about the final result than about the process of getting there. That's the difference between technical types and artist types, I suppose.

Edit: I also should say, we REALLY should distinguish between tasks that you find enjoyable and tasks you find just drudgery to get where you want to go. For you, audio editing might be a drudgery but for me it's enjoyable. For you, debugging might be fun but I hate it. Etc.

But the point is, if AI takes away everything that people find enjoyable, then no one can pick and choose to earn a living from the subset of tasks they find enjoyable, because AI can do everything.

Programmers tend to assume that AI will just take the boring tasks, because high-level software engineering is what they enjoy and it seems unlikely to be automated, but there's a WHOLE world of people out there who enjoy other tasks that can be automated by AI.


I'm with you, I enjoy the craftsmanship of my trade. I'm not relieved that I may not have to do it in the future, I'm bummed that it feels like something I'm good at, and is/was worth something, is being taken away.

I realize how lucky I am to even have a job that I thoroughly enjoy, do well, and get paid well for. So I'm not going to say "It's not fair!", but ... I'm bummed.


I can't tell whether I'm supposed to be the technical type or the artist type in this analogy. In my music making hobby, I'd like a good AI to help me mix, master, or any number of things under my direction. I'm going to be very particular about every aspect of the beat, but maybe it could suggest some non-boring chord progressions and I'll decide if I like one of them. My goal as an artist is to express myself, and a good AI that can faithfully take directions from me would help.

As a software engineer, I need to solve business problems, and much of this requires code changes, testing, deployments, all that stuff we all know. Again, if a good AI could take on a lot of that work, maybe that means I don't have to sit there in dependency hell and fight arcane missing symbol errors for the rest of my fucking career.


> Again, if a good AI could take on a lot of that work, maybe that means I don't have to sit there in dependency hell and fight arcane missing symbol errors for the rest of my fucking career.

My argument really had nothing to do with you and your hobby. It was that AI is significantly modifying society so that it will be hard for people to make a living doing what they like, because AI can do it.

If AI can solve some boring tasks for you, that's fine, but the world doesn't revolve around your job or your hobby. I'm talking about a large mass of people who enjoy doing different things, who once were able to do those things to make a living, but are finding it harder to do so because tech companies, leveraging their economies of scale and massive resource pools, have found a way to automate all of it.

You are in a privileged position, no doubt about it. But plenty of people are talented and skilled at doing a certain sort of creative work, and the main thrust of their work can be automated. It's not like your cushy job, where you can automate part of it and simply become more efficient; it's that these people just won't have a job.

It's amazing that you can be so myopic as to think only of yourself and what AI can do for you, when you are probably in the top 5% of the world, rather than spare one minute to think of what AI is doing to others who don't have the luxuries you have.


Everyone should do the tasks where they provide unique value. You could make the same arguments you just made about recorded music, automobiles, and computers in general, in fact.


The difference, though, is that AI does it much faster and concentrates the service in far fewer central sources. Speed and magnitude matter too, just as a crash at 20 km/h is different from a crash at 100 km/h. And those other inventions WERE also harmful. Cars -> global warming.


My point is every invention has pros and cons, and tends to displace people who were very tied to the previous way.


You can still do those tasks, but their market value will drop. Automatable work should always be automated, because we're better off focusing on things that can't be automated yet, and those gain more market value. Supply and demand and all that. I do hope we have a collective plan for what we do when everything is automated at some point. Some form of UBI?


What do you mean that AI can do audio editing? I don't think all sound engineers have been replaced.


Yes. I know what you’re referring to, but you can’t ignore the pace of improvement. I think within 2-3 years we will have AI coding that can do anything a senior-level coder can do.


Exactly how broke do you think this guy is that taking a career break is a sign of instability?


You can get burned real bad by this if the specific product SKU gets discontinued and the price spikes 10x because now they're called Tide Pods Original.


I mostly agree with you, or at least I think you're providing important pushback that needs to be considered. Porn is largely a weird religious boogeyman, and scary stranger-kidnapping stories seem to be a form of lucrative fear porn to attract conspiracy-minded types. My access to the internet as a teen fueled my curiosity and computer technology exploration in a way that was crucial for where I've ended up today. I actually installed a keylogger to get the internet password that my parents guarded, and I've never regretted it. Blanket luddite rules for kids seem to me to be lazy at best.

That being said, here are some of the things I worry about:

- The internet is no longer a niche playground for nerds; much of it has become a mainstream entertainment megahub, highly cultivated for your bland engagement. When I was growing up, I had to constantly fiddle with and troubleshoot several layers of software in order to explore, interact with friends online, and play games. It was almost a barrier to entry. These days, I'm not entirely sure I would have fiddled with anything and might have just skipped to the gaming & media consumption part. After all, it just works now, and the media is more engaging than ever. There seem to be fewer incentives for learning and creativity.

- I'm more concerned about bad behavior modeling than I am about the moral panic nonsense. I want to make sure that whatever personalities my kids are having a social/parasocial relationship with aren't encouraging trollish and abusive behavior.

- I'm also concerned about misinformation. Most people are very bad at gauging the trustworthiness of information online. Ironically, even the people who cry the most about how media distorts your worldview tend to have that exact problem. I want to teach my kids critical thinking and how to evaluate information against several important criteria. This will have to be an involved process, and I want to be able to contextualize heavy sources of misinformation while they're being exposed to it.

None of these problems are well addressed with a luddite approach, but they do need careful attention.


Exactly. Actually funny jokes from amateurs tend to rise out of an order of magnitude more failed attempts. You can't encourage the funny ones without inviting a ton of bad ones.


The idea that politics in academia are so fierce because the stakes are so low is a meme that has been around for many years, so perhaps this is just the first time you've heard it?

I'm not convinced of the meme despite its cleverness or the many blog articles discussing it. I don't think it would get less politically intense if the stakes were raised.


How about: I'd like to know what's happening during an active shooter event at my son's school. Is there a loudspeaker for that?

This actually happened to me, and guess where all of the official police updates were posted.


Do you think the tradeoffs discussed in that article are unproven?


The tradeoffs are not unproven but the soundbite suggests that the tradeoffs are a necessary precondition. That's the crux of my issue with the soundbite.


So the tradeoffs are proven, but not necessary?


The tradeoffs happen, but they aren’t the reason to say “the optimal amount of fraud is non-zero”.

The correct framing is “since we accept some fraud, the optimal amount of fraud must be non-zero”.

As written, it sounds like society can only function if there is some amount of fraud.

I gave this analogy to a sibling comment of yours. Maybe it will help clarify my point. We accept some adulteration in coffee (let’s say 2%, for conversation’s sake). That’s not the same as saying “the optimal purity of coffee is 98%”. The optimal purity of coffee is 100%, but since we can’t guarantee that no non-coffee stuff has mixed in, we live with 98%.


This is somewhat pedantic. With this logic, you could say that the purpose of a car is to get somewhere, and therefore the optimal speed is c.

But that would be a completely useless thing to say, because we can’t get your Camry to c. There is a whole system of tradeoffs that we all know exists.

Google could easily lower “bad things” to 0 by shutting down tomorrow. The government could easily eliminate racism by nuking everyone. None of those are interesting or intellectually stimulating discussions.

“Optimal” is different than “perfect”. Everyone else here is talking about it from a system view.


How is it pedantic? The framing is fully wrong.

We can't eliminate murders, but do we say the optimal amount of murders is non-zero? Instead, the right framing is "we live with some number of murders in aggregate because eliminating all murders is impossible". It's not like we need some number of murders for society to survive.

The use of the word "optimal" implies that some murder (or fraud) is necessary for society (or financial transactions) to function.

