
I have a long break over the holiday and am thinking of putting a software project out.

Do you think I should consider making it a priority to code in a fairly low-level programming language (e.g. Rust) without overhead, and count cycles so that most tasks are done within a single screen refresh?

I can't make the rest of users' systems more responsive but I could make my own software as fast and efficient as possible.

The main alternative is Python, which would make it easier to ship large features but comes with significant overhead (a full interpreter), since it's an interpreted language.



This is usually the wrong approach. Make sure you don’t do computations that block the UI, avoid accidentally quadratic behavior, and make sure you process UI events promptly. You can do this in pretty much any language, with minor caveats:

Some languages (e.g. Haskell) make it easy to write code that does more computation than intended. If you use one of these languages, make sure you know what you’re doing.

If you use a language with a truly horrible GC, you might experience excessively long pauses. Similarly, if you produce too much garbage, you might have issues.

If you use a language that can’t multithread properly (sigh, Python), moving tasks off thread is a mess.

Otherwise, one can write perfectly responsive software in just about any language.


I'd do it in C, but that's just me. If this is a one-person project you can hold your code to a high standard. Clean standard C11 with all warnings turned on (-Wall -Wextra -Wpedantic -Wconversion). Write unit tests. Run them with Valgrind/sanitizers. Use clang-format. Build with multiple compilers (GCC/clang/MSVC). ...

EDIT: Not many hackers on Hacker News, apparently.


I’m curious as to the upside of C, is it mostly just a familiarity thing?

Holding your own code to a high standard is great, but wouldn’t it be nicer if you could offload more of that into the tooling and spend more time on the problem or making the code even cleaner?

Also as a side note: I always find it funny that a flag that represents all warnings doesn’t actually turn on all warnings.


For my own projects I use C because I know it, but also because I know and trust the ecosystem. I seem to have developed something of a software-survivalist mentality, so I like to know that I can build my project five years from now without worrying about whether some remotely-hosted dependency isn't there any more. (I'm not claiming that newer, more trendy languages necessarily fail this test - just that I don't know and trust their ecosystems well enough to be sure.)

C also has the advantage that there are many, many different compilers targeting many, many, /many/ different architectures. My own favourite is VBCC, which is lightweight enough that I was able to write my own backend for my own toy CPU project, and even build the entire toolchain - assembler, linker and compiler - under AmigaOS.


I think the remotely hosted dependency problem and the ecosystem thing aren't that big a deal most of the time, but if the project involves atypical OSes and architectures, then C makes sense. It's still possible to do these in other languages, but that's just another familiarity gap to add to the existing set.

For the remotely hosted dependency thing, I think it’s pretty easy to vendor dependencies in most realistic C contenders, and a really simple litmus test is just yanking your network cable and doing a fresh build.

The ecosystem thing can be a bigger deal, but again it really depends on the problem domain. There are too many high-quality non-C libraries for C to always be a clear winner. I think it's more important to take a step back and make sure whatever language you pick is well suited to the task, rather than assuming any individual one always will be. Knowing multiple languages is handy for this, since that's more ecosystems you can pick from, rather than tying yourself to a single one.


> The ecosystem thing can be a bigger deal, but again it really depends on the problem domain.

Yes, absolutely - and of course C isn't immune to ecosystem problems, either. I remember the pain of working with GNU autotools back in the mid 2000s - in fact it's probably that experience (plus trying to use bleeding-edge tools written in Python!) that left me so cautious about external dependencies today.


Out of curiosity, when you considered the compilers to use for your CPU project, did you look at pcc? And if so, how would you say vbcc compares, in terms of ease of porting to a new arch?


I looked at a number of different options (http://retroramblings.net/?p=1277) but didn't spend a great deal of time looking at pcc. I was amazed to discover how many different options there were, actually!

VBCC's backend interface is well documented, which helps a lot - and there's a skeleton "generic RISC" backend which is trivial to copy and use as a starting point - I found it very useful to be able to tweak a working backend and observe how the generated code changes, while I was getting a feel for how it all hangs together.

VBCC does have an unusual license, however - commercial usage requires permission from the author.


I found the opposite in a very real-world analysis of my hobby project to build a digital dashboard for my DeLorean. I spent 5 years making very slow progress with C++, and then switched to perl, started mostly from scratch, and finished in a year. I gave a talk about this exact topic at YAPC 2014: https://youtu.be/SERH3_gZOTo?t=1018 The CPU usage on the embedded PC went from 15% to 40%, but in the grand scheme I'd rather have it finished and pay a little more for the hardware.


> Do you think I should consider making it a priority to code in a fairly low-level programming language (e.g. Rust)

Maybe. Premature optimization is said to be the root of all evil...

But at the same time, the assumption that everything will be easier and faster to write in Python than in Rust or C++ is often invalid. Sure, for smaller scripts it almost always holds, but once your app grows, this may stop being the case.

Start with a language that's convenient for you and with which you can release an initial version. Then get an understanding of how it behaves in terms of performance, and draw your conclusions.

> as fast and efficient as possible.

Responsiveness is not the same as speed or efficiency. Of course it's important to be fast and efficient, but it is even more important to not just start crunching numbers and ignore the user and the rest of the system.

Do your hard lifting asynchronously and have a thread attending to user input and your UI (or even different threads for these two tasks). And this is easier said than done!

Also remember you'll have to try and work around delays and slowdowns due to other apps and the (non-realtime) OS. Specifically, you might have to play with thread scheduling and I/O priority (although - that's usually the user's rather than the app's job).

Additional notes:

* Also consider C++; it has some advantages and disadvantages relative to Rust (which I obviously will not get into), but it has seen a whole lot of progress in recent years, in particular w.r.t. the ease of doing many things which used to be painful.

* If you think of Rust as low-level, then your head must be in the clouds... :-P


> Maybe. Premature optimization is said to be the root of all evil...

The full quote, because it always gets butchered to "premature optimization is the root of all evil":

Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%.


I think this quote is pretty funny in hindsight. Since it was said, we've had an entire generation of OOP muppets, and now a second one of modern web developers going that way, that add layers of pointless abstraction and complete bullshit on top of everything they write, just for fun.

In the 70s you might've needed a reason like optimisation to commit evil. These days apathy and cargo culting do it perfectly fine without optimisation even coming into the picture.


Cycle-counting is useless on modern workstations, since each one has a different processor. I don't see how the choice of language is significant either, as long as it meets your goals for responsiveness. A better goal would be to limit latency where possible (measuring the time from click to completed action is easy with a framework), or to communicate to the user when an action will take a noticeable amount of time.


Agreed. Optimizing the algorithm at the one or two choke points usually recoups the performance you lose by choosing an otherwise better language/environment for productivity.

Sometimes it doesn't, though. Startup time is one example, but there aren't many use cases where that counts (1Password, say, starts slower with each release).



