Ask HN: What will the future of software development be?
8 points by alismayilov on Jan 30, 2024 | 10 comments
As we can all see, GenAI has changed a lot of things and is evolving rapidly. How will software development practices change? Will designing software become more important? Will the work shift more toward maintaining large code bases, most of which were generated by AI? How could things change? What is your vision of the future?


Whenever I think about the future of software engineering, I'm reminded of the history of fourth-generation and model-driven programming languages, and their failure to gain much of a foothold in the industry outside of a few small niches. We develop software using the current paradigm because we want to. It's what makes sense to software engineers. As an industry, we've already been shown alternate, arguably superior ways of writing reliable software, and we rejected them in favour of largely the same paradigm we've had since the beginning. I know this point doesn't have anything to do with AI. It's just something to think about whenever people foretell massive paradigm shifts in the industry.


GenAI isn't super magical; it helps senior engineers with boilerplate and certain types of method discovery.

I think the simple answer is to imagine life before Google and general purpose search engines that could find answers in documentation.

How did life change for developers after that? More information became more easily available, the complexity of the domain increased, and comprehensive documentation (manuals) went away.

GenAI/LLMs really are a better search engine, in so many ways, so just imagine this trajectory.


Except it is not exactly the same as a search engine. With a search engine, the results were generated by other people. Now they are generated by a machine. I think that is a big difference.


I don't think a search engine is asking humans for its results.

I get what you are trying to say, but materially LLMs are fed by humans and human content (such as documentation).


I don't think that software development as a profession will cease to exist for a long while, but I do think that in a decade or so, it will look very different to development today, and it won't be an improvement (from the perspective of a developer who likes writing software).

Based on my own feelings toward the profession, which I see echoed in online and offline discussions, there are many undesirable elements of this career that give it low overall job satisfaction, and I can only see an increased AI presence making that worse. So at some point the good won't outweigh the bad, as one might argue it does now, and there'll be a bit of an exodus from the field, leaving few but well-paid developers who either enjoy or tolerate AI's involvement. All just my own speculation, but that's the nature of the question.


Also, to expand on my comment above: I spent a few years working in schools, and I was surprised to see a trend I had noticed before reflected in the students: tech skills don't seem to be as prevalent as in prior generations. My very casual theory is that the transition in UIs over the years, from powerful and flexible to dumb and barely configurable, has led to fewer kids exploring software and getting a knack for (and confidence in) tinkering. Again, I don't think AI will improve this long term, so it will be interesting to see what new developers are like in 20 years, assuming that many of today's devs share this background of 'software curiosity', if I can call it that.

And before someone well-ackshuallys me: yes, I recognise that this technology progression isn't unique to software.


I think that software development has peaked. Not because of GenAI but because of saturation and lack of innovation.

The primary value prop of tech has always been automation, and there has been very little new automation lately.

Google and Facebook have already perfected the automation of surveillance and advertising. Streaming services have already solved the automation of content delivery. Cloud services are stagnating, and in some cases we see the pendulum swinging back with companies opting for on-prem hosting due to pricing. Most "innovation" now is just competing for market share and IP.

GenAI is kind of the last and greatest promise of automation. But the thing is, either way it goes, it's the end of automation. Either it fulfills the promise and delivers full AGI, rendering all software devs obsolete, or it's a dud, undermining the value proposition of automation. That would mean there's a ceiling to what can be automated, and we've hit it.

Here's a thought experiment: it takes 100 engineers to build a bridge, but only 10 to maintain it. The bridge is now complete; what happens to the other 90 engineers?

I think the future will look something like the history of automobile manufacturing. In the beginning, there were assembly lines where cars were assembled by hand. The jobs were numerous and lucrative for the time. However, as the tech evolved, humans were gradually removed from the loop. Now nearly all of car manufacturing is automated. The only human input is design and oversight, performed by the exceptional few.


> Either it fulfills the promise and delivers full AGI, rendering all software devs obsolete. Or, its a dud, undermining the value proposition of automation.

It isn't binary. There is a LOT of territory between those two extremes.


Of course. The bridge still needs to be built, with all its on-ramps and off-ramps. That's where all the current excitement and "innovation" is right now. I don't think software development as a profession is disappearing tomorrow, or anytime soon. But I do think it signals the peak.


I notice a lot of XY problems with GenAI if you're not careful. Thus I can picture more senior engineers writing (with GenAI) really specific business requirements as unit tests, with intermediate and junior engineers (with GenAI) writing the actual code.
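To make that concrete, here's a minimal sketch in Python, where the requirement, the billing module, and apply_discount are all hypothetical names made up for illustration:

    # Hypothetical example: a specific business requirement pinned down as
    # an executable test. The senior engineer (with GenAI) writes the test;
    # a junior engineer (with GenAI) implements apply_discount until it passes.
    from decimal import Decimal

    from billing import apply_discount  # hypothetical module under test

    def test_loyalty_discount_caps_at_20_percent():
        # Requirement: 2% off per year of customer tenure, capped at 20%.
        price = Decimal("100.00")
        assert apply_discount(price, tenure_years=5) == Decimal("90.00")
        assert apply_discount(price, tenure_years=50) == Decimal("80.00")  # cap

The test is the spec: it pins down the exact behaviour (including the cap), so the "what" and the "how" can be split between people and their GenAI tooling.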

A lot more work on building these internally (e.g. Azure OpenAI and Hugging Face), with business analysts or others vetting the outputs, or the sentiment of the outputs.

I can also picture a lot of work on designing more complex LLM architectures. For example, if you have the money, could you have a rough loop that reviews issues, roadmaps, and/or todos and proposes changes to the code itself? If so, see the importance of writing good tests and solid devops practices.
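As a rough sketch of what such a loop might look like (generate_patch stands in for whatever model call you'd actually make, e.g. Azure OpenAI or a Hugging Face model, and every name here is illustrative, not a real tool):

    # Illustrative only: a loop that turns open issues into proposed patches,
    # gated on the test suite. generate_patch is a stand-in for a real model
    # call; the git plumbing is deliberately simplified.
    import subprocess

    def generate_patch(issue: str) -> str:
        """Stand-in for an LLM call that returns a unified diff for the issue."""
        raise NotImplementedError("wire up your model provider here")

    def tests_pass() -> bool:
        return subprocess.run(["pytest", "-q"]).returncode == 0

    def propose_changes(issues: list[str]) -> None:
        for i, issue in enumerate(issues):
            patch = generate_patch(issue)
            subprocess.run(["git", "checkout", "main"])       # start clean
            subprocess.run(["git", "checkout", "-b", f"ai/issue-{i}"])
            subprocess.run(["git", "apply"], input=patch, text=True)
            if tests_pass():
                print(f"issue {i}: tests pass; open a PR for human review")
            else:
                subprocess.run(["git", "checkout", "main"])   # discard the dud

The whole thing stands or falls on the quality of the gating test suite, which is exactly the point about good tests and solid devops above.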

Conversely, integrate GenAI into PRs. Think "making AI" a team member. And even more so if you think about mythical man month problems. Software engineering makes you a manager for AIs.
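One hedged sketch of what that team member could look like as a CI script, with review_with_model standing in for the actual model call and the base branch assumed to be main:

    # Illustrative only: GenAI as a "team member" that reviews each PR.
    # review_with_model is a stand-in for a real model call; run this from
    # CI on every pull request and post the output as a comment.
    import subprocess

    def review_with_model(diff: str) -> str:
        """Stand-in for an LLM call that returns review comments for a diff."""
        raise NotImplementedError("wire up your model provider here")

    if __name__ == "__main__":
        diff = subprocess.run(
            ["git", "diff", "origin/main...HEAD"],
            capture_output=True, text=True, check=True,
        ).stdout
        print(review_with_model(diff))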



