
I could never really understand the enthusiasm. Why are we still dealing with over half a century of cruft? I get that this is a core piece of technology lots of stuff is built upon, and I'm not arguing to get rid of the classic terminal emulation altogether. But I wish there was an effort of building a new, modern, textual interface to computers with modern assumptions and integration bindings. We shouldn't need to be concerned with obscure escape sequences to print color; tooling shouldn't need to parse strings to do something useful; and junior developers shouldn't need to waste hours scrolling an obtuse man page until they resort to a half-assed SO response with broken parameters to extract a tar archive.

There is more to shells and text interfaces than working within constraints set 50 years ago.




In my opinion, the Lindy Effect[0] makes a lot of sense in scenarios like this. Personally, I love the fact that the slower evolution of command-line tools gives my personal skillset a longer shelf life. I can add additional capabilities without having to constantly re-learn how to do established work.

[0] https://en.wikipedia.org/wiki/Lindy_effect


The problem is that breaking away from ANSI escape sequences and the like also means rewriting 50 years of command line tools.

Like with the modern web, there’s just too much momentum behind the current design to make it practical to reinvent it from scratch

That doesn’t mean that things cannot improve though. It just makes it massively more difficult if you want to retain backwards compatibility. And if you don’t, then people probably won’t adopt it because they won’t be able to get their job done, regardless of how good the UX is.

As it happens, I am trying to solve these problems, in part with my alternative shell. To take your SO example, I’ve recently been playing around with integrating ChatGPT to help with hints (on top of the automatic man page parsing which already happens).

I’m also writing a new terminal emulator that aims to bring interactive widgets (like support for sorting and filtering anything that looks like a table). But that term is very much alpha at the moment.


Why can't we make it backward compatible, with each client application saying "I support the new interface" if it does? You would probably need to add kernel support to handle the fork/execve case, but I don't think there is an intrinsic limitation.


fork et al are managed by the shell, not the terminal emulator. And what you're suggesting is exactly what feature flagging in ANSI escape codes achieves. 50 year old teletypes already had a way to announce support for new interfaces.
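To make the feature-flagging point concrete, here is a sketch (in Python) of the Primary Device Attributes query, a capability-announcement mechanism that dates back to the DEC VT terminals. The sample reply bytes are illustrative of one VT220-class terminal, not a guarantee of what any given emulator sends:

```python
# Sketch: how a program asks a terminal what it supports via the
# Primary Device Attributes (DA1) query, a mechanism as old as the VT100.
DA1_QUERY = b"\x1b[c"          # application -> terminal
SAMPLE_REPLY = b"\x1b[?62;4c"  # terminal -> application (illustrative: VT220-level, sixel)

def parse_da1_reply(reply: bytes) -> list[int]:
    """Extract the capability parameters from a DA1 response like ESC [ ? Ps ; Ps c."""
    assert reply.startswith(b"\x1b[?") and reply.endswith(b"c")
    body = reply[3:-1]  # strip "ESC [ ?" prefix and trailing "c"
    return [int(p) for p in body.split(b";") if p]

print(parse_da1_reply(SAMPLE_REPLY))  # -> [62, 4]
```

In a real program the query would be written to the terminal and the reply read back from the same stream, which is exactly the in-band negotiation being described here.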


> The problem is to break away from ANSI escape sequences and the like means also rewriting 50 years of command line tools.

Are we really that reliant on those command line tools? I only use a handful of them, and any time it gets more complicated, I reach for a real programming language to do the scripting. Those tools just do a bunch of string parsing, and the user interfaces are usually incredibly esoteric, inconsistent, and obtuse.


In a word: Yes.

The long tail is very long, and it's not just the bare text tools (including tools that were written 30 years ago and haven't changed since, and tools that barely even count as CLI tools), but also anything that ever did anything more interesting. You're not just talking about rewriting coreutils; you're also talking about redoing vim and emacs and top and all the *stat programs and tmux and screen and ...


Only in regard to UNIX CLI tools; computing history is full of other kinds of command line interfaces.


Sure. And aside from Powershell, which itself had to recreate a bunch of common UNIX idioms, which of those other command line interfaces are still in widespread usage?

The status quo is ugly but reinventing it is at least an order of magnitude more work than improving upon it.


Amiga DOS doesn't seem to go away no matter what.

IBM and Unisys mainframes and micros.

Smalltalk and Common Lisp REPL environments.

Although probably debatable to consider any of them mainstream.


The fact that you opened your rebuttal with DOS basically proves the point I’m making.

> Although probably debatable to consider any of them mainstream.

The only person debating that point is you.

We could be online until the sun rises debating about different command line environments but if AWS (for example) haven’t released an official CLI utility for Amiga DOS then your position is ultimately just an academic one.


Amiga DOS !== UNIX CLI, and if you don't get why, well, so be it; let's worship 1970s printer hardware instead.


I'm giving you the benefit of the doubt that you're just mistaken rather than trolling:

1. I never said Amiga DOS was the same as UNIX CLI. In fact I never even compared the two, that was all you

2. I do get why different command line interfaces are different -- I author a significant amount of code towards terminal emulators, shells, command line tools and maintain a hell of a lot of retro systems. So I'm definitely experienced on this subject. In fact I bet I could teach you a thing or two on this topic too ;) But that wasn't the point of what was being discussed. We were talking about the state of the status quo, not how some niche interface that nobody has used for serious work in nearly 30 years compares to the entrenched standard.

3. I never once said the UNIX CLI was peak command line design either. In fact I actually said the exact opposite. What I actually said was that it was dominant. Dominance != well designed

4. ANSI escape sequences, the $TERM env var, and all the other terminal UX stuff that are being discussed here, came about with hardware terminals like the VT-series. Yes, teletypes are part of mainframe history, but they're not relevant to this specific discussion here. Terminal emulators don't emulate a teletype, the POSIX kernel does that. Terminal emulators ostensibly just open a file and emulate how VTs interpreted ANSI escape sequences. This is the same reason why you can have terminal emulators on Windows (like PuTTY, Microsoft Terminal, and my own terminal emulator) despite Windows never having a concept of a PTY. Though, perhaps ironically, Windows now does support an approximation of a PTY.

---

My point was very clear: the current status quo sucks but it would be an order of magnitude more work re-implementing everything from scratch and it ultimately wouldn't likely gain adoption anyway because of the momentum behind the current status quo. Microsoft understood this and ended up re-implementing some of the concepts despite literally decades fighting against it.

History is littered with examples of sub-par technologies becoming dominant because they work just good enough to maintain any initial momentum they have behind them. And people would sooner use what they're familiar with than learn something entirely new just because it is technically better.

And the fact that you keep harping on about Amiga DOS is, frankly, absurd. I love the Amiga, I honestly do. I have one sat next to me right now. But if there was one example in history of a command line interface that sucked more than the Bourne shell, it would be DOS. Mentioning Lisp machines might have earned you a little kudos yet you chose to lead with Amiga DOS.....


You were the one that was so eager to reply that didn't even bother to go past the first line of my comment, and that says it all.

All the non-UNIX platforms eventually had to come up with POSIX support, because apparently people can't stop wanting to see UNIX on everything that has a CPU. Other platforms be damned; they'd better come up with POSIX support, including terminal escape codes.

The only reason to use PuTTY on Windows was, and is, UNIX software running under mingw/cygwin.

Same with Windows Terminal, and naturally Microsoft <3 Linux with WSL, again UNIX like software.


I made reference to stuff past your first line. Maybe you missed it because you didn’t read the entirety of my comment ;)

Anyway, you’re now just repeating exactly the sentiment I made that we were originally arguing against.


VMS used ANSI escapes too, did it not? Windows is moving that direction. Amiga as well? CP/M? Not much left.


There is an effort, but it’s mostly built on cruft (which libraries like ncurses or the projects at charm.sh try abstracting away)

Also, what would we use besides escape sequences? They are about as small as can be for what they do.

Terminals deal with streams of bytes. They don’t (normally) download entire views and render them wholesale. This usually works to their advantage.

There’s also the ansi standard control sequences, which imo should have more focus on them as opposed to the terminfo db.


> Terminals deal with streams of bytes.

But is this a fundamental constraint, or something we can challenge? Is there any good reason why terminals can't deal with structured data instead?

> There’s also the ansi standard control sequences, which imo should have more focus on them as opposed to the terminfo db.

Everything in that sentence \033[31;1;4munderlines\033[0m the problem to me: If we didn't need to nest metadata into the byte stream, none of us would need to wrestle with escape sequences. And that such a thing as the terminfo db even needs to exist speaks volumes about the lengths we have to go to to accommodate decades of cruft…

Applications should neither be concerned with what color codes the output device can render, nor should the terminal itself have to support hundreds of emulation targets.
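The "tooling shouldn't need to parse strings" complaint is easy to demonstrate: any tool that wants to know the visible width of styled output first has to strip the in-band metadata back out. A sketch in Python, handling only SGR (color/style) sequences; a real parser would need to cover far more:

```python
import re

# Sketch of the cost of in-band metadata: to get the *visible* width of a
# styled line, a tool must first strip the escape sequences back out.
CSI_SGR_RE = re.compile(r"\x1b\[[0-9;]*m")  # SGR sequences only; real parsers need more

def visible_width(s: str) -> int:
    return len(CSI_SGR_RE.sub("", s))

styled = "\033[31;1;4munderlines\033[0m"
print(len(styled), visible_width(styled))  # -> 23 10 (raw length vs. what the user sees)
```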


I think it’s a fundamental design choice. It’s a double edged sword for sure, but it’s important.

It allows terminals to respond very quickly because they never need to wait for an entire data structure to download before displaying information.

It’s such a simple protocol that anyone can hack on it (which is a strength and weakness)

Asking to change that, in my mind, is like asking to change how packets move across a network. It’s all byte streams that get treated as structured data higher up on the abstraction layers.

TUI frameworks like the ones from charm.sh are built atop the base protocol and provide support for structured data much like how our network stack works. That way, as you were saying, application programmers don’t need to worry too much about the underlying “wire protocol”

The issue isn’t that terminals are stream-native or that they operate on byte codes instead of characters.

The problem is lack of standardization of said codes. Which leads to a mountain of edge cases in supporting every known terminal. That’s the cruft, not the minimal protocol


> But is this a fundamental constraint, or something we can challenge? Is there any good reason why terminals can't deal with structured data instead?

Plenty already do:

- Powershell

- Murex https://murex.rocks

- Elvish https://elv.sh/

- https://ngs-lang.org/

> Applications should neither be concerned with what color codes the output device can render, nor should the terminal itself have to support hundreds of emulation targets.

If you have colour codes (et al) sent out-of-band then you need a new kind of terminal emulator which the application then also needs to support. So you do effectively create yet another standard.

Whereas the status quo, as much as it sucks, is largely just vt100 with a few extra luxuries, some of which are as old as xterm. We aren't really talking about having to deal with hundreds of emulation targets, nor even more than one, in most cases.

Where things get a little more challenging is if you want stuff like squiggly underlines or inlined images. There is the beginnings of some de facto standardisation there but it's still a long way from being standardised.


> But is this a fundamental constraint, or something we can challenge? Is there any good reason why terminals can't deal with structured data instead?

What do you mean by structured data? The escape codes are structured. And any structured data would be sent as a stream of bytes.

Also consider that if everything had to be JSON messages, programs that didn't care about controlling a terminal would still need to determine whether their stdout was a terminal or a pipe, and properly format content if it was a terminal.

I think the concept of escape sequences is fine.

Although, I do wish there were a way for the terminal and application to communicate out of band. In particular, that would allow querying the terminal without having to worry about unexpected input arriving before the response, and you could send graphics or clipboard data out of band without tying up the terminal itself.

> Applications should neither be concerned with what color codes the output device can render

I mostly agree with that. Although it brings up the question of how an rgb color should be converted to a more limited color set. Should it approximate the nearest color for some definition of nearest, or should it ignore the color setting altogether. Though I think it would be better for that complexity to be centralized in the terminal instead of having to be implemented in every application.
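The "nearest colour" approach can be centralised in the terminal with nothing more than a distance search over its palette. A sketch, using illustrative RGB values for a classic 8-colour palette (real terminals vary, and some weight the channels perceptually rather than using plain Euclidean distance):

```python
# Sketch: approximating a 24-bit RGB colour against a limited palette --
# the downconversion a terminal could centralise instead of every
# application reimplementing it. Palette values are illustrative.
PALETTE = {
    "black": (0, 0, 0), "red": (205, 0, 0), "green": (0, 205, 0),
    "yellow": (205, 205, 0), "blue": (0, 0, 238), "magenta": (205, 0, 205),
    "cyan": (0, 205, 205), "white": (229, 229, 229),
}

def nearest(rgb: tuple[int, int, int]) -> str:
    # squared Euclidean distance in RGB space; crude but simple
    return min(PALETTE, key=lambda n: sum((a - b) ** 2 for a, b in zip(rgb, PALETTE[n])))

print(nearest((200, 30, 30)))    # -> red
print(nearest((250, 250, 250)))  # -> white
```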

> nor should the terminal itself have to support hundreds of emulation targets

I definitely agree with this.

I think we could use a standard terminal control spec that standardizes the things that modern terminals mostly agree on, and removes things that aren't really used anymore. And have a standard mechanism to query for optional or vendor specific capabilities. And write some decent documentation for the control codes (which if it already exists I haven't been able to find, at least in a single place).


There are libraries and frameworks that abstract this for you: ncurses, bubbletea, etc. At some point, every interface is low-level: GPU geometry is a stream of vertices, a program is a stream of bytes. This isn't some terminal-specific tech problem, it's just kind of how computers are.


Abstractions are layered a specific way, though. Layering upon something suboptimal will always be limited by the suboptimal layer.

Ncurses or bubbletea et al. may hide it from you, but they still only paint over the bumpy legacy wall below without really being able to improve on it.


"\033[31;1;4munderlines\033[0m" is (again) no worse than a stream of vertices or a stream of object code. Everything is a stream of bytes (well, a stream of bits anyway). Do you want CSS? Lipgloss is not too far off [0].

I read your objection basically as "escape sequences and control codes are noisy garbage"; are you saying something more like "the functionality you can achieve with escape sequences and control codes is fundamentally limited"? If that's the case, I don't see how, especially in the context of a character-based display.

[0]: https://github.com/charmbracelet/lipgloss?tab=readme-ov-file...


I don’t think it’s suboptimal at all. (How many bytes does it take to change the color of a single word in the terminal protocols vs the web for example)

It’s literally a wire protocol.

It’s minimal and designed to be easily abstracted away.

The suboptimal part is the lack of standardization.


It may be a wire protocol, but it’s also the primary method of interaction with the computer for technical folk. Do we really have to constrain ourselves to a wire protocol here, or is there maybe room for something more user-friendly?


We, as users, do not interact with the wire protocol. We type at generally very user friendly terminals, which abstract the wire protocol.

Even TUI devs don’t usually interact with the protocol directly, they use ncurses or something from charm.sh to abstract away the protocol.

Moreover, what “constraint” does using a wire protocol like this impose?

And how would one interact with any computer without a low level layer of byte-encoded information? (You can’t. It’s how computers work)


Is there? Let's put a few reasonable requirements down:

- Self-sufficient commands that you can store in history, put on wiki page, put inside a script file, Slack to your friends, store in configuration, etc...

- Variety of execution methods: local computer, remote via ssh, jupyter-like notebook, remote via something else (like AWS SSM), CI runner, ssh which launches SSM session which connects over serial port, cron-like periodic schedulers, starting commands in your Go program etc... Every method should be supported automatically with no effort from developer.

- Output could be interactive (possibly with some reprocessing like tmux/screen does) or stored (like CI/cron execution). It is possible to parse output to get relevant details.

Even if you start from scratch, what can you design to fit this pattern? You'll get something very close to existing state - applications that take command lines, character-based input/output streams with some sort of formatting sequences.

Sure, if I could design from scratch I'd standardize on _one_ TERM, make escape sequences easier to parse (longer and common start/end chars), fix extended keys/numpad mess, redesign the CLI defaults of the few tools, create common command-line completion interfaces... but the overall idea will be the same.

(And if you don't care about the three requirements above, then go with HTML or native UI! There are tons of them and they are well supported. But I don't see the point of retrofitting HTML-like functionality onto existing terminals)


There definitely are more options than applications reading and writing character streams, you’re just not even considering them because you’re too entrenched in the environment you’re familiar with. Take Powershell for example; the syntax may be horrible, but the way it passes structured data is truly different and way ahead of classic UNIX shells. The design space is huge; just dismissing any possible improvements before even sitting down and thinking properly is frankly not a good strategy.


If the design space is so huge, how about giving some examples, instead of insulting people who disagree with you? Surely you'll be able to do so, while satisfying the requirements I listed?

Because Powershell doesn't satisfy them. You are not going to save a Powershell object stream to CI output or provisioner logs. You are not going to send objects over SSH, or over a web console, or over a BMC-emulated serial port. In all of those cases it's going to be a good old stream of characters, with the occasional control sequence from more advanced programs.

You can have whatever rich datatypes inside your application (Powershell isn't the first; I remember reading about LISP shells back in the 2000s), but at some moment you have to talk with other systems, and that's when you will have to switch to character streams.
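The boundary described here can still carry structure if both ends agree on a serialisation. The common compromise is newline-delimited JSON over the plain character stream. A toy sketch of that round trip (this is not how Powershell actually serialises objects for remoting):

```python
import json

# Sketch: rich objects inside the shell, flattened to a character stream
# (newline-delimited JSON) at the boundary -- SSH, CI logs, serial --
# and rehydrated on the other side. A toy, not Powershell's remoting.
processes = [{"pid": 1, "name": "init"}, {"pid": 42, "name": "sshd"}]

wire = "\n".join(json.dumps(p) for p in processes)        # what crosses the pipe
rehydrated = [json.loads(line) for line in wire.splitlines()]

assert rehydrated == processes
print(wire)
```

Note that the wire format is still just a character stream; the structure survives only because both ends opted into the same convention, which is exactly the parent's point.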

(that said, the current escape sequences could use lots of improvement. That's not going to change fundamental concept though)


> There is more to shells and text interfaces than working within constraints set 50 years ago.

Of course.

With some minor changes, you could make the same argument about many, many technologies we use every day, from IPv4 to SQL to C.

Then there is the old saw -- likely apocryphal -- about how railroads are the width they are because they evolved from standards around Roman roads. Even if it is a fairy tale, the moral of that story rings true: existing tools are built on top of old ones, because it was better to have a standard then, even if we may know better now.


IPv4 is a great example, because the transition to IPv6, albeit slow, is happening. And that is an all-renewed protocol with a lot of previously impossible, awesome features. There is a way to do this right.


It’s also exponentially more complicated, has addresses that are a mile long and impossible to memorise, and has enabled such marvels as a toaster that outright refuses to make you breakfast unless it’s connected to the internet with an active subscription and all the telemetry that comes with it. Truly a great example of a way to do things right.


While we're at it, can we get rid of staggered keyboards? We don't need to accommodate mechanical linkages from the bottom rows any longer, yet here we are.

As other commenters have pointed out, once conventions and standards have been adopted, there is almost no way of dislodging them. It's a coordination problem. New standards and approaches usually need to be at least a 10x improvement over the status quo in order to be adopted organically.

The same story applies to all areas of human endeavor.


I’m not so sure. We have HTTP/3 now, which is binary. We use different image formats than we used to, and we switched from horses to cars, too. Why shouldn’t we be able to innovate in this space? Building new walls doesn’t automatically imply tearing old ones down.


> Building new walls doesn’t automatically imply tearing old ones down.

But often in this space they seem to want to do that anyway. I can only imagine that they are attempting to force people to adopt their pet project. There's literally no reason a pair of sequences can't be specified that allows clients to query support for an advanced terminal mode and get a response. After the client and the terminal have agreed on advanced mode, switch to a byte stream/command blocks/whatever which allows rich text, graphics, sound, access to UI devices, etc. When you're done you've recreated X11, but they can do whatever they like I guess.

There's no need to fuck up terminfo along the way though.


> But I wish there was an effort of building a new, modern, textual interface to computers with modern assumptions and integration bindings.

Would it be different from a web browser? Isn't the webstack already the modern textual interface everyone is using? I mean, that's also the direction they all are walking in, with adding images to terminals and having more complex layout with TUIs. I think I remember some TUI toolkit which even used a limited CSS for describing its interface. And there are some terminals and alternatives which are built with NodeJS and the webstack, IIRC. So would it make more sense to find some proper standards for webstack-based CLI interfaces?


> Isn't the webstack already the modern textual interface everyone is using?

I would argue that the web stack is really more of a virtual machine, or OS-agnostic application runtime, but it is not a textual interface per se (though it certainly can be used that way).

On the other hand, I'm looking for solid primitives to make applications talk to each other, but with structured data instead of plain text open to liberal interpretation; for terminals that don't emulate a teletype device, but make use of modern operating systems. Think of what we had before systemd—the wild west of shell scripts, layers upon layers of arcane compatibility hacks—and what we have now—a standardised, documented, structured, system management interface that works the same way everywhere (please don't lets discuss systemd here, it's just an analogy).

So rather than trying to paint NodeJS-lipstick on the TTY-pig, I'd like to see a new kind of terminal protocol that solves the UX issues of old.


> I'm looking for solid primitives to make applications talk to each other, but with structured data instead of plain text open to liberal interpretation

JSON exists. YAML exists. XML exists. The lack of proper exchange-formats is not a problem. The lack of tooling is the problem.

> Think of what we had before systemd—the wild west of shell scripts

Liberty. And this gives you an answer why terminals, shells and cli in general remain this way. People have the liberty to do what they need, and they have the liberty to go as simple or complex as they want. And this works very well. Don't forget that not all output is structured.

So unless you have someone creating a whole new set of tools and environment which can compete with the established solutions, beat them on ALL aspects, and still be compatible... until then people will not support it. I mean, there are more than enough alternatives that are barely popular for one reason or another. Speaks for itself.

> I'd like to see a new kind of terminal protocol that solves the UX issues of old.

Which are the issues? Is there some solid documentation on them?


Chesterton's Fence is in full effect here. Quite a few people have tried to do exactly what you wish. They haven't really succeeded, obviously.


Me neither. Having started to program when this was all we had, and graphical displays cost a fortune, I really don't get this desire to live in the past.


> tooling shouldn't need to parse strings to do something useful; and junior developers shouldn't need to waste hours scrolling an obtuse man page until they resort to a half-assed SO response

Absolutely. Glad we fixed all that with CSS...


Could you share what you mean by that? I suppose it was sarcasm, but I didn’t get the point.


Ha sorry, that was probably overly snarky. I was thinking of CSS's multitude of string-valued properties that effectively have their own DSLs, ranging from the simple "5px 10px 0px 10px" for padding/margins to much more complicated expressions for gradients and animations. And it's all stringly-typed so (without additional linters) you don't get told when you format values incorrectly or set mutually exclusive properties. But there are also lots of advantages in terms of flexibility and ease of use (the "API" is just "style.foo = bar"), and that's led to CSS being ubiquitous and extremely successful.

I think this applies to some of your complaints about terminals as well. Yes escape sequences are ugly, but it means I can just emit the appropriate bytes from any program written in any language rather than figuring out what library I need to import to generate a correctly formatted structured message, and operating on the byte stream level lets it transparently work across remote connections. I'm all for experimenting with new approaches, but we shouldn't lose sight of the existing benefits.
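The "just emit the appropriate bytes" benefit is worth making concrete. The styling "API" is plain string concatenation, which is why it works from any language with a print statement, and why the same bytes survive any pipe or SSH hop unchanged. A sketch, reusing the `31;1;4` example from upthread:

```python
# Sketch: the terminal styling "API" is just string concatenation --
# no library needed, and the bytes pass through pipes and SSH untouched.
def sgr(*params: int) -> str:
    """Build a Select Graphic Rendition sequence like ESC [ 31;1;4 m."""
    return "\x1b[" + ";".join(map(str, params)) + "m"

RED, BOLD, UNDERLINE, RESET = 31, 1, 4, 0
print(sgr(RED, BOLD, UNDERLINE) + "underlines" + sgr(RESET))
```

A structured out-of-band protocol would instead require a client library in every language before a program could so much as colour a word.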


CSS doesn't fix junior developers [wasting] hours scrolling an obtuse man page until they resort to a half-assed SO response


Arcan is one such effort, but I am not sure what state the project is in.


I'm bewildered that the "user friendly" flavors of linux still use ancient terminals. If the fact that these distros still lean heavily on terminal use wasn't bad enough, you also need to have a computer intuition from 1985 to feel comfortable using it. At least capitulate to ctrl+v and ctrl+c.


> I'm bewildered that the "user friendly" flavors of linux still use ancient terminals.

Do they? I was given to believe that you can use modern Ubuntu/Fedora/OpenSUSE without needing to open a terminal.

> If the fact that these distros still lean heavily on terminal use wasn't bad enough, you also need to have a computer intuition from 1985 to feel comfortable using it. At least capitulate to ctrl+v and ctrl+c.

Okay, let's say we're going to break backwards-compatibility; how should the user kill the running program, and how should they input character literals, and how are we going to implement your change?


Just to get it out of the way - Linux is great if you are a grandma or linux junkie. It sucks for everyone in between, especially for those who come from Windows or MacOS.

--

This is the fundamental problem with the terminal, it is extremely powerful when you are intimately familiar with using it. And it's unsurprising that people building distros and maintaining linux fall into that camp.

What they are completely blind to is how incredibly user-hostile the environment is for people from a GUI-only background. What is the first thing every mainstream Unix-based OS does? They take the terminal and hide it. They make menus and menus of GUI elements that cover 80-90% of the things you would typically use the terminal for, leaving the rest to the 5% of ultra power users.

My gripe is primarily that Linux is _desperately_ needed now more than ever as an escape from Windows. But the people who are working on Linux distros are so lost in their egos that they are arrogantly trapped in the idea that everyone needs to be driving stick shift in 2024 because look at how much more control you have compared to an automatic, if that analogy makes sense.


> My gripe is primarily that Linux is _desparetly_ needed now more than ever as an escape from windows.

If all you see Linux as is a crappy free clone of Windows with less user abuse, you'll certainly be left disappointed when it doesn't deliver.

For all their faults, Microsoft pours a ton of money into Windows's usability and backwards compatibility. If anything, it's impressive that Linux DEs come close with a fraction of the funding and organizational structure of a literal tech giant.

> But the people who are working on linux distros are so lost in their egos that they are arrogantly trapped in this idea that everyone needs to be driving stick shift in 2024 because look at how much more control you have compared to an automatic, if that analogy makes sense.

Is it really "ego" if the people working on these distros simply like it better this way? Are they wrong for developing the program (oftentimes in their free time) to accommodate the way they, and most people who contribute to the project, enjoy using it?

Sure, it would be great if we could accommodate everyone. But as most FOSS projects are chronically understaffed and underfunded, people prioritize creating a product they can enjoy using.


>Is it really "ego" if the people working on these distros simply like it better this way? Are they wrong for developing the program (oftentimes in their free time) to accommodate the way they, and most people who contribute to the project, enjoy using it?

From Ubuntu's missions statement:

>We believe that bringing free software to the widest audience will empower individuals and communities to innovate, experiment and grow.

I'd say they have been failing catastrophically at that for the last 20 years. And so do the statistics.

If they could just dedicate two releases to snuffing out as much terminal use as possible, they could probably double their market share in a month.


This is a silly thread, beginners don’t even know the terminal exists, and 99% don’t need to.

The situation is the same in Windows, though they might have a few more duplicated GUIs, they don’t solve the long tail of troubleshooting. You can also use a Mac Keyboard with Linux.

It doesn’t make any sense to remove terminals either, when available and tiny in resource use. Use an OS modeled on the original Mac OS if you want to be prevented from seeing a terminal.

Utopia never existed in the computing world. But the war on general purpose computing sure has. Soon there will be no place to turn for an experienced user who is not a slave to bigcorp interests. Unfortunately the bondage you recommend leads here, they are now coupled.


To extend, this issue affects an incredibly small number of computer users. The 1% that knows it exists, divided by the large fraction that is too busy (or lazy?) to work around it:

https://news.ycombinator.com/item?id=40390656

The reason it falls between the cracks is because it doesn't affect enough people, especially normal folks. But let's not kid ourselves that integrating one of the existing fixes would affect the industry significantly.


> Okay, let's say we're going to break backwards-compatibility; how should the user kill the running program, and how should they input character literals, and how are we going to implement your change?

Please don’t take it personally, but I find it a bit sad that your imagination ends here. These two questions depend entirely on the chosen implementation. If we’re talking about a new implementation, should we really start by accepting the limitations and constraints of the existing system as requirements for the improvement?


> I find it a bit sad that your imagination ends here.

This isn't the limit of what's possible, it's a lower bound of questions that you have to answer if you want to build a replacement.

> These two questions depend entirely on the chosen implementation.

Okay? Feel free to include multiple answers, but you need at least one.

> If we’re talking about a new implementation, should we really start by accepting the limitations and constraints of the existing system as requirements for the improvement?

If you ditch all the behaviors of the old system without figuring out how to shim them in, then you can build a beautiful new system that's free of all the problems of the old system, and that's also free of all users because you just jettisoned all the programs that let people actually do useful work with their computers. We're stuck with 50 years of backwards compatibility not because we love it, but because people need to be able to interact with their machines without relearning everything, and people need to be able to use the machine to run the programs they're using.


Instead of a specific key combination that a) clashes with the way the vast majority of computer users enter text, b) is neither obvious nor easily discoverable, and c) is not available, or is awkward to enter, depending on the input device, we could also separate the monitoring of running processes from the command prompt and let the terminal application offer a control to cancel them, or match modern user expectations and do so using the Escape key. Or something entirely different.

The point is that we don't use ^C because it's ergonomic or obvious, but because of conventions inherited from a time long gone. Keeping legacy software running does not imply imposing the same UX constraints on users.


> Instead of a specific key combination that a) clashes with the way the vast majority of computer users enter text, b) is neither obvious nor easily discoverable, and c) not available or awkward to enter depending on the input device,

I grant that a) it's different from everything else, but b) CUA shortcuts aren't obvious or discoverable either, they're just more common, and c) I can't think of any device that has a terminal where ctrl isn't readily available. Nonetheless, I actually agree that if we could do it without breaking everything, moving everything to the same convention is compelling on its own. (Of course, I don't think we can do it without breaking everything, but that's life.)

> we could also separate the monitoring of running processes from the command prompt and let the terminal application offer a control to cancel them,

When I want a process dead, I want it dead now, not after I've found the mouse and hit a button. I also have concerns about how that button would actually work under the hood; the only options I can think of are that it sends a simulated ctrl-c, or that the terminal emulator suddenly has to track child processes and decide what to kill (which strikes me as a bad idea).

> or match modern user expectations and do so using the Escape key.

And in so doing break anything that was using escape, like vi. Incidentally, the plan to intercept ctrl-c will break emacs, too. Which is why I strongly disagree with

> Keeping legacy software running does not imply imposing the same UX constraints on users.

because breaking legacy software is exactly the trade you're proposing.


GUI task managers have existed for decades.

If this issue is really bugging you:

- Buy a cheap Mac-style keyboard and configure it as such. See Kinto.

- Use a terminal with “smart paste” for a decent workaround.


Would Ctrl-c work conditionally, based on whether there is a foreground process? And what if I want to paste into the process?

The command key on macOS really comes in handy here.

The “windows” key on most Linux desktop environments I have used is usually pretty under-utilized, perhaps there’s a case to be made that super-v should default to paste.


<S-Ins>


While I don't disagree with your sentiment I have a few comments:

"Why are we still dealing with over half a century of cruft?". IMHO that's because we have software running that's over half a century old, with organizations depending on it, and so it needs to continue running. This means we need to continue providing that software with the environment it expects. For over a decade, a significant part of my recurring revenue came from helping companies make sure their ancient software would continue to run on newer machines (and they only got newer machines because it was no longer possible to keep the older hardware running). And this was on PC hardware; from my short experience in finance, I reckon there's a bunch of System/390 software still running emulated on modern z/OS machines.

"tooling shouldn't need to parse strings to do something useful" The thing is, tooling either needs to process some binary format, which is more efficient but also more obscure, or to parse strings. All the tooling that deals with JSON, YAML, etc. is still parsing strings, and I prefer that to a binary format in most cases. I know a defined format like YAML is simpler to parse than free-form text, but it keeps some of the same constraints (notably the need to escape things).

Your comment about escape sequences to print color is very relevant. I feel two concurrent yet opposite things here: I get all warm inside from nostalgia, since I spent what was probably an unhealthy amount of time learning about ANSI sequences back in the day of BBSs, and I also despise them completely and would love to have color on a textual computer interface without the need for that.

I believe that's possible, but most likely we don't have a new textual computer interface because the effort to achieve it is significantly bigger than the effort to keep hacking on what we have. Challenges I can imagine: while I despise dealing with escape sequences (because I invariably got them wrong the first time, always), it's either something like that, or a side channel to convey formatting information. Or we move to HTML altogether; I guess this is the approach of text-intensive applications based on web frameworks (like, I suppose, most modern editors).

There's also the option emacs takes, which is to just have the text be plain text (whatever that really means; we like to think it's a simple format because we can cat it and wysiwyg, but for the computer it's all a bunch of bytes anyway) and use modes to alter how the editor shows you that text. As an emacs user I'm biased to think that's a better approach than HTML or a side channel, but I don't see it becoming the norm in the short term.

"and junior developers shouldn't need to waste hours scrolling an obtuse man page until they resort to a half-assed SO response with broken parameters to extract a tar archive" Now that is one hill I'm willing to die on: if someone can't figure out how to extract a tar archive just from looking at the synopsis on the man page and scrolling through the options, unless the archive was maliciously named to hide the fact that it's gzipped or something, I think that person would be a pre-junior developer and still have a way to go to become junior. Note I'm not saying all developers must use the command line: if you prefer to open archives with a GUI tool that's fine, but if for whatever reason you need to open it from the command line and you can't figure that out from the man page, I think it says more about them than about tar. Perhaps it's just a poor choice of example, as I agree there are very poor man pages out there, but for the basic use cases, I think tar is as understandable as it can get.

"There is more to shells and text interfaces than working within constraints set 50 years ago." Completely agree, which is why I often use the emacs shell, which is not a POSIX shell, and you know what? I don't care. I go to zsh when I need that, but oftentimes the emacs shell gets me what I need with less friction. And I'm sure the same would be possible with other shells. I wish we had more new shells that attempt to keep what seems useful about 'the old shells' but discard all that's not strictly needed and add new, useful features.


> Now that is one hill I'm willing to die on: if someone can't figure out how to extract a tar archive just from looking at the synopsis on the man page and scrolling through the options, unless the archive was maliciously named to hide the fact that it's gzipped or something, I think that person would be a pre-junior developer and still have a way to go to become junior.

There’s “let them figure it out” and there’s “this is unnecessarily complicated because of archaic legacy compatibility”. It should not take multiple parameters to extract a gzipped tar file.

There is absolutely no reason why, if you provide tar a single .tar or .tar.gz with no other parameters, it doesn’t just say “ok this is a tar/gzip file and you probably want to extract it”.


> There is absolutely no reason why, if you provide tar a single .tar or .tar.gz with no other parameters, it doesn’t just say “ok this is a tar/gzip file and you probably want to extract it”.

The default behaviour is held back by backwards compatibility: tar options can be passed as bare arguments (e.g. `tar xzvf foo.tar` functions the same as `tar -xzvf foo.tar`), though the program could just check whether the argument is a valid path to a tar file.

While we're at it though, the most annoying aspect of `tar` has got to be requiring the `-f` option. I don't see why it's required instead of just taking the first file passed as an argument as the input/output tar file.
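To see the backwards compatibility in question, here is a small sketch (assuming GNU or BSD tar; `old.tar`/`new.tar` are just illustrative names) showing that the old bundled "key letters" and the modern dashed flags behave identically:

```shell
# Old style: bare key letters, with "f" consuming the next argument.
# Modern style: the same letters as dashed options.
mkdir -p d && echo hi > d/a.txt
tar cf old.tar d        # old style, no dash
tar -cf new.tar d       # modern style, same result
tar tf old.tar > l1
tar -tf new.tar > l2
cmp l1 l2               # the two listings are identical
```

This dual parsing is exactly why tar cannot safely reinterpret a lone filename argument: it already has a meaning as a bundle of option letters.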


> if someone can't figure out how to extract a tar archive just from looking at the synopsis on the man page and scrolling through the options, unless the archive was maliciously named to hide the fact that it's gzipped or something

At least with GNU tar (and I think Darwin and some others), compression doesn't matter any more; `tar -xf foo.tar` correctly autodetects the compression and extracts even when foo.tar is gzipped.
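A quick way to verify the autodetection (a minimal sketch, assuming GNU or BSD tar; file names are illustrative):

```shell
# Create a gzipped tarball, then extract it WITHOUT specifying -z:
# modern tar sniffs the compression format on extraction.
mkdir -p demo && echo hello > demo/file.txt
tar -czf demo.tar.gz demo     # -z needed only when creating
rm -r demo
tar -xf demo.tar.gz           # no -z: gzip is detected automatically
cat demo/file.txt             # prints hello
```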


> IMHO that's because we have software running that's over half a century, with organizations depending on it, and so it needs to continue running.

Well; as I said, I'm absolutely not arguing for somehow removing that feature—having the existing terminals continue working is definitely unavoidable, for the reasons you stated. However, that shouldn't mean there's zero movement in the space. We have several coexisting approaches to everything from operating systems to processors to web browsers; keeping legacy software functional shouldn't be an argument to hinder innovation in the entire space.

> All the tooling that deals with json, yaml, etc is still parsing strings, and I prefer that to a binary format in most cases

There is a subtle difference here: a tool reading JSON from standard input and parsing it is just parsing strings too, yes. However, what about a platform that passed messages from one tool to another on some kind of channel designated exclusively for JSON messages? I don't think we should just have a --json parameter for the GNU utils, but really a formalised way for unrelated applications to exchange data, without each of them having to fend off parsing errors. And reality right now is even worse, with every one of those utils having its own bespoke output format.
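As a rough sketch of what such a channel could look like with today's tools (python3 here is only a stand-in for hypothetical JSON-aware utilities; nothing like this is standardised), two pipeline stages can already exchange one JSON object per line instead of bespoke text:

```shell
# Stage 1 emits newline-delimited JSON records; stage 2 consumes typed
# fields by key instead of re-parsing a human-oriented column layout.
printf '%s\n' notes.txt build.log | python3 -c '
import json, sys
for line in sys.stdin:
    print(json.dumps({"name": line.strip(), "kind": "file"}))
' | python3 -c '
import json, sys
for line in sys.stdin:
    rec = json.loads(line)      # a schema, not free-form text
    print(rec["name"])
'
# prints notes.txt and build.log
```

The missing piece the comment argues for is making such a schema-carrying channel a first-class convention rather than an ad-hoc agreement between two scripts.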

Re: ANSI escaping: This ties into the same point. Applications should have more at their disposal than just reading and writing strings. If there were a way to attach metadata to input and output, we could pass on stuff like what should be coloured how, without requiring escape-code parsing downstream. I'm inclined to agree with you on all of this being intensive effort-wise, but that hasn't deterred us in other cases either.

> if someone can't figure out how to extract a tar archive just from looking at the synopsis on the man page and scrolling through the options, unless the archive was maliciously named to hide the fact that it's gzipped or something, I think that person would be a pre-junior developer and still have a way to go to become junior.

I can see where you're coming from, but that's just gatekeeping. The terminal is a great interface, but does it really have to be this undiscoverable? IDEs offer so much introspection into source code; why can't we have even that for the terminal (shell-specific completion hacks excepted, which are just a poor approximation of what we should have)? I'm not saying we should dumb it down to enable any fool to extract that archive, but UX-wise, it could be a lot easier to learn how something works. And yes, tar was a bad example. How about, say, `ip`? I've been using it for years and still cannot remember how to do basic stuff with it without looking it up.


There are tons of efforts like that. https://xkcd.com/927/


First, don't like it? Don't use it. No need to cry over it.

Second, don't like it? Write something better, showcase it, and maybe people will like it and start using it.

Third, not everyone needs to use 100s of MB of memory to render some idiotic emoji along the text using GPU-accelerated routines. That old legacy stuff is lightweight and it's everywhere, so I can run stuff on platforms that probably existed before you were even born.

I myself love the CLI. It's a reliable, easy to use, fast!! interface. I can't really imagine a computer without a CLI. It's like? WTF? :)


While no one is entitled to other's time for free software, your comment is needlessly dismissive and discouraging. People can have legitimate complaints and try to rally others to change the status quo. No one is encroaching on your right to use the software you choose. You also seem to have preconceived notions of what GP envisions. You can stick to your own thing, but don't stifle the creativity and fervor of others.


But I don't see it as being creative. He is just complaining, not even offering any new ideas.


> First, don't like it? Don't use it. No need to cry over it.

I have to use it. That's the point. The CLI is an important and great piece of technology; I just don't love the way things are, and would prefer to see them improved instead.

> Second, don't like it? Write something better, showcase it, and maybe people will like it and start using it.

This is not how stuff like that works. Terminal interfaces are an ecosystem; there are a lot of different projects attempting new approaches, but there's no shared sense of the existing problems, as your comment proves. I wrote my original comment to make more people aware of those issues.

> Third, not everyone needs to use 100s of MB of memory to render some idiotic emoji along the text using GPU-accelerated routines. That old legacy stuff is lightweight and it's everywhere, so I can run stuff on platforms that probably existed before you were even born.

I didn't argue for that; in fact, I didn't even argue for removing the existing legacy stuff. Maybe try reading the original comment again? Just because we have a working solution doesn't mean we should stop innovating. There are tons of possible UX improvements beyond rendering emojis, such as proper inter-process message passing via standardised interfaces: just imagine piping from `ls` to any other process, and having that process know it just got a list of file references instead of a bunch of characters (PowerShell does something similar). Let terminals offer completions like IDEs do. Let them render proper progress bars without abusing braille characters. There are so many things people do just because they've been done this way since forever; they stop questioning whether there's a better way.
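The "typed `ls`" idea can only be approximated today, roughly like this (a sketch; `-printf` is GNU find, used here purely to illustrate the shape of the idea):

```shell
# Emit tab-separated typed fields (name, bytes, kind) so the consumer
# selects columns by position instead of re-parsing ls(1)'s human layout.
find . -maxdepth 1 -printf '%f\t%s\t%y\n' \
    | awk -F'\t' '$3 == "f" { print $1 }'   # keep regular files only
```

Even this is fragile (tabs or newlines in filenames break it), which is the point: without a real structured channel, every consumer re-invents parsing.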


Well, innovating, or starting to bloat? Usually I see things being overengineered because we MUST innovate. No innovation = going backward.

A progress bar? What for? Is something like this not enough?

Transmitting files: 3/20 (12%), 675kB/s

Or something like that. Far more informative than just a progress bar. And it can render anywhere, easily, even over 9600 bps.
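For what it's worth, that status line is cheap to redraw in place; a minimal POSIX sketch (the counts are made up for illustration):

```shell
# Redraw one status line with a plain carriage return: no escape
# sequences, so it renders on any terminal, even over a slow link.
total=20
for count in 3 11 20; do
    pct=$(( count * 100 / total ))
    printf '\rTransmitting files: %d/%d (%d%%)' "$count" "$total" "$pct"
done
printf '\n'
```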


Sure, that is a progress bar. And it's something countless devs have wasted hundreds of man-years reimplementing over and over, instead of delegating to the terminal application. Because all the terminal gives us is a transparent character stream.


This makes no sense to me.

There is only one terminal a user is going to use, and as an app author, you cannot choose which one it is. There is a good chance that your app will run in a terminal that was last updated many years ago, and you cannot do anything about it.

On the other hand, you can link to any library you want, and as an app author, you have full (or almost full) control over it. There is a good chance your app will use the libraries you want (especially if it's in a modern language), and in the worst case you can vendor the library locally.

It follows that we should have as much functionality as possible in libraries, and as little as possible in terminals. Your progress bar? Should be a library. If there is a terminal support for it, it should be an optional low-level "eye candy" (some might even call it "bloat"), with all the real logic (like ETA estimation, rate limiting, and formatting) implemented in libraries.

This applies to all your other ideas, too.

"proper inter-process message passing via standardised interfaces" - that exists, and has nothing to do with terminals. Depending on the exact details, it is called JSON, DBus, XML, etc. If you want PowerShell-like "ls | select .name", then you don't need to mess with the terminal either - it's all shell functionality (maybe with some env var conventions).

Terminals already offer completion like IDEs do; try hitting TAB. I agree it might be nice to use a different font for those, but there is no need for a redesign; you only need a new escape sequence like "start/stop floating window" (but again, some might call this "bloat").


> First, don't like it? Don't use it. No need to cry over it.

> Second, don't like it? Write something better, showcase it, and maybe people will like it and start using it.

Anytime I use a graphical tool, command-line aficionados practically reel in disgust. So I'd say it's those people who need to get over the fact that some people don't like typing in a terminal to do their work, and want to work differently, and often more efficiently.


The GUI is more efficient? Joke.. JOKE :) GUIs have their uses, for example data presentation and visualization. Oh, I have nice tools here too. But data manipulation and queries? CLI only, or mixed: query in the CLI, get graph output.



