Yes, in that the tools are massively better. So is the hardware that it all runs on.
No, in that you still have to tell the computer precisely and unambiguously exactly what you want it to do, and how, mostly in text. The level of detail required today is somewhat less, due to better tools, but at a high level the work hasn't changed.
That has changed significantly too, though. Sorry about this being a long post, but having programmed through most of the last 50 years, I've seen a massive shift in the way people code, even from a language perspective:
1) There's a massive reliance on reusable libraries these days. Don't get me wrong, this is a good thing, but it means people spend less time rewriting the "boring" stuff (for want of a lazy description) and more time writing their actual program logic (there's a small sketch of this after the list).
2) Most people are coding in languages and/or language features that sit several abstractions higher than they did 50 years ago. Even putting aside web development - which is probably one of the most widely used areas of development these days - modern languages, and even modern standards of old languages, have templates/generics, complex object systems, and all sorts of other advanced features that a compiler needs to convert into a runtime stack (there's a short generics sketch after this list too). Comparatively few people write code that maps as closely to the hardware as it did 50 years ago.
3) And expanding on my previous point, a great many languages these days compile to their own runtime environment (at least in their de facto standard implementations): Java, JavaScript, Python, Scala, Perl, PHP, Ruby, etc. You just couldn't do that on old hardware.
4) Multi-threaded / concurrent programming is another big area people write code in that didn't exist 50 years ago. Whether that's writing POSIX threads in C, using runtime concurrency in languages like Go (goroutines, which don't map one-to-one onto OS threads), or even clustering across multiple servers using whatever libraries you prefer for distributed processing, none of this was available in the 60s when servers were monolithic machines and CPUs were single core (see the goroutine sketch after this list). That's also why time sharing on those servers was expensive, and why many programmers used to write their code out by hand before giving it to operators to punch once computing time was allocated.
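To put point 1 in concrete terms, here's a rough sketch (in Go, purely as an example; the route and port are made up for illustration). Serving HTTP used to mean hand-rolling the socket handling and protocol parsing yourself; today the standard library does all of that and you only write the handler:

```go
// A minimal sketch of point 1: the "boring" plumbing (sockets, HTTP parsing,
// connection handling) lives in the standard library, so the programmer
// only writes the application logic. The route and port are placeholders.
package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	// One handler of our own; everything underneath is library code.
	http.HandleFunc("/hello", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "Hello from a few lines of application logic")
	})

	// ListenAndServe deals with sockets, concurrency and HTTP parsing for us.
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```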
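And a sketch of point 2, again in Go and with purely illustrative names: a single generic (template-like) function that the compiler specialises for whatever types it's used with. Fifty years ago you'd have written a separate routine per type, much closer to the machine:

```go
// A rough sketch of point 2: a generic function the compiler instantiates
// for concrete types on your behalf - the sort of abstraction the toolchain
// now handles rather than the programmer writing per-type copies by hand.
package main

import (
	"fmt"
	"strconv"
)

// Map applies fn to every element of in and returns the results.
// T and U are type parameters; the compiler generates the concrete code.
func Map[T, U any](in []T, fn func(T) U) []U {
	out := make([]U, 0, len(in))
	for _, v := range in {
		out = append(out, fn(v))
	}
	return out
}

func main() {
	nums := []int{1, 2, 3}
	// The same Map works for ints-to-strings, floats-to-bools, whatever.
	fmt.Println(Map(nums, strconv.Itoa)) // [1 2 3]
}
```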
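Finally, a minimal sketch of point 4 using goroutines (the worker count and the "work" itself are placeholders): the runtime multiplexes these onto OS threads for you, which is a far cry from anything you could express on 1960s hardware:

```go
// A minimal sketch of point 4: concurrent workers via goroutines, which the
// Go runtime schedules across OS threads on the programmer's behalf.
package main

import (
	"fmt"
	"sync"
)

func main() {
	var wg sync.WaitGroup
	results := make(chan string, 4) // buffered so workers never block

	// Launch four goroutines; the runtime handles the scheduling.
	for i := 1; i <= 4; i++ {
		wg.Add(1)
		go func(id int) {
			defer wg.Done()
			results <- fmt.Sprintf("worker %d done", id)
		}(i)
	}

	wg.Wait()
	close(results)

	for r := range results {
		fmt.Println(r)
	}
}
```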
So while you're right that we still write statements instructing the computer, that's essentially the minimum you'd expect to do. Even in Star Trek, with its voice-operated computers, the users are commanding the computer with a series of statements. One could argue that is a highly intuitive REPL environment which mostly fits your "you still have to tell the computer precisely and unambiguously exactly what you want it to do..." statement, yet is worlds apart from the way we program today.
Expanding on your above quote, "mostly in text": even that is such a broad generalisation that it overlooks quite a few interesting edge cases that didn't exist 50 years ago:
1) Web development with GUI-based tools (I know some people will argue that web development isn't "proper" programming, but it is one of the biggest areas in which people write computer code these days, so it can't really be ignored). There are a lot of GUI tools that write much of that code for the developer / designer. Granted, hand-crafted code is almost always better, but the fact remains that these tools exist.
2) GUI mock-ups with application-orientated IDEs. I'm talking about Visual Basic, QtCreator, Android Studio, etc, where you can mock up the design of the UI in the IDE using drawing tools rather than creating the UI objects manually in code.
3) GUI-based programming languages (eg Scratch). Granted, these are usually aimed at teaching, but they're still an interesting alternative to the "in text" style of programming language. There's also at least one esoteric language (Piet) which you program with coloured pixels.
So your generalisation is accurate, but perhaps not fair given the number of exceptions.
Lastly: "The level of detail required today is somewhat less, due to better tools, but at a high level the work hasn't changed.":
The problem with taking things to that high a level is that it then becomes comparable with any instruction-based field. For example, cookbooks have a list of required "includes" at the start of the "program", and then a procedural stack of instructions afterwards. Putting aside joke esoteric languages like "Chef", you wouldn't class cooking instructions as a programming language, yet it precisely fits the high-level description you gave.
I think, as programming is a science, it pays to look at things a little more in depth when comparing how things have changed, rather than saying "to a lay person the raw text dump of a non-compiled program looks broadly the same as it did 50 years ago". While it's true that things haven't changed significantly from a high-level overview, they have moved on massively in every specific way.
One final point: many will point out that languages like C and even assembly (if you'll excuse the looser definition) are still used today, which is true. But equally, punch cards et al were still in widespread use 50 years ago. So if we're going to hold up the most "traditional" edge case of modern development, then at least compare it to the oldest traditional edge case of development 50 years ago to keep the comparison fair, rather than comparing the newest of the old with the oldest of the new. And once you start comparing ANSI C to punch-inputted machine code, the differences between then and now become even more pronounced :P