Hacker News

I disagree with lots of advice here.

My strategy to solidify my Linux skills was simple: commit to the Linux desktop. We spend most of our time at our desktops; may as well let that time earn you skill points. I picked Ubuntu and use the same on servers. You naturally learn a bit every day through daily use, and that really adds up over time, without the extra effort of maintaining a side system. Getting good at Windows or macOS is a waste of your time unless you are specifically into those platforms.

As for a language, pick one that is (1) easy to start with, (2) covers advanced use cases, (3) sound in its design, (4) general purpose, and (5) concise. That rules out C (due to 1), JS (3 and 4), PHP (2, 3, 4) and most things, really.

That left me with Ubuntu and Python, and I never looked back. (Yes, I know many other things, but that's my home base.)



It is trendy to hate C now, but if you want to understand a Unix-like OS I think it is mandatory and they go hand in hand. Otherwise, you will have no idea what kind of interface your higher level language is abstracting.

It's ok to not like it, but don't let that dislike be due to lack of understanding.


Sure, drop into C sometimes. But telling beginners to start there is mean.

I spent a lot of time in assembler too, and yes, it has benefits for learning. I'm also not recommending a high-level toy like Logo.


I don't know if I understand that. Everyone was also a beginner at some point. I was a beginner at both C and Unix when I learned them together. I wrote a lot of bad C code before I got good at it.

I also say hand-in-hand as a bidirectional thing. I wouldn't suggest learning C without learning about Unix too. And I say that as someone who has worked extensively outside of Unix-like platforms.


Indeed. All the foundational common ground between languages on Unix is designed for C. I'm talking about the system call interface. Every general-purpose language has a wrapper for this interface, but the primary documentation is written for C.

Besides that, libraries tend to be written in C as well, because if it's in C, it's easy to make it available to other languages. Most general-purpose languages have some kind of "foreign function interface" for calling C code, which means the primary documentation for those libraries is written for C too.
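To make that concrete, here's a minimal sketch of what such an FFI looks like from Python's side, using the stdlib `ctypes` module to call straight into libc (assumes a Unix-like system with glibc or similar):

```python
import ctypes
import os

# Load the C library already linked into this process (glibc on Linux)
libc = ctypes.CDLL(None)

# Call the C function getpid() directly -- its documentation lives in
# "man 2 getpid", written for C, yet usable from Python unchanged
pid = libc.getpid()
print(pid == os.getpid())  # True: both go through the same syscall
```

The same `CDLL` mechanism is how Python wraps any C shared library, which is why the C documentation stays the reference.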


JS and PHP are perfectly suitable general-purpose programming languages these days. And given they're both Turing complete, they implicitly cover advanced use cases too.

So that leaves us with "sound in its design", and given the absolute unmitigated disaster of Python 2 vs 3, the numerous competing package managers, and the abundance of really bad code examples due to the influx from the mathematics world (just look at R), I fully disagree that Python is a good choice.

But then again, that's just my opinion. Still, setting people off on an unnecessarily biased path with the assumption that there's only one way to do things definitely sounds like a Pythonic behavioral trait.


I agree, and in addition, at the early stages things like "sound design" don't even matter. You're busy understanding loops, trying to interact with the file system, fetching stuff from the web; you're inevitably googling and copying a lot of bad code. Python 2 vs 3 doesn't matter at that point. PHP might even be the best choice here, since you don't need to remember to import anything: just start off with your code, all the functions are there.

Once you reach a point where you want to do more advanced stuff, or just want to try out other languages, you can still learn about include, import, etc., and start to structure your code better by splitting it into compilation units, classes, modules, whatnot.


Commit to using, breaking and fixing a Linux desktop to learn quickly. I learned a lot running Debian Unstable back when men were men and unstable was really unstable; so the 2010s.

If minor things are constantly going wrong and getting fixed, the only way that can happen is by the user building up a good mental model of how the system works.

An update that breaks X11 will teach you a world of information about terminals: "How do I use a messaging client in a terminal?", "What text editors work in a terminal?", "How do I browse the web from a terminal?", "What games are there for terminals?", "How do I watch a movie in the terminal?" (ASCII).

Get forced to live like that for a week and you also learn how to fix X11 (e.g. learning about the difference between 2D and 3D graphics drivers, OSS vs non-OSS drivers, kernel vs userspace). Valuable life skills. Godspeed to the Wayland devs.


I would agree with this, but this is underspecified. A couple of examples:

When I was in college, a kid came to our *nix User Group having fscked up his Ubuntu install by, I kid you not, running `cd /; sudo rm -rf *`. He wanted help fixing it.

Another kid, who was quite a bit smarter, learned about fork bombs and ran one on a shared CS Linux server. It was offline for a few days. When it came back, `who` informed us that the sysadmin had spent quite a bit of time running `man limits.conf`. (After graduating, I learned I could still log in to that server using my SSH key, despite my account being deactivated in LDAP.)
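For anyone curious what that sysadmin was reading up on: `/etc/security/limits.conf` (read by pam_limits) is where you cap per-user resources, so a fork bomb exhausts its own process quota instead of the whole machine. A sketch with illustrative values (the `student` account name is made up):

```
# /etc/security/limits.conf syntax: <domain> <type> <item> <value>
# Cap how many processes any single user may have
*        hard    nproc    2048
# A tighter cap for one particular account
student  hard    nproc    256
```

The limits take effect at login via PAM, so existing sessions keep their old limits until the user logs in again.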

It's a bit like Murphy's Law: anything that can be done on a *nix machine will be done by a learner, given sufficient time.


For context, this was ~2008. I got my first personal laptop in middle school the only way I could manage: buying a two-and-a-half-year-old hand-me-down Toshiba from my uncle for $200. It was too slow to be updated to a newer Windows OS, and I couldn't afford the license anyway. My neighbor, a govt. systems programmer, told me about Linux. I looked it up, found Ubuntu, installed it, and encountered all of the problems the parent comment described.

I learned a _lot_ for a 13-year-old, thanks to a great many very patient people on IRC. I spent most of my time alternating between debugging audio drivers so I could listen to music and trying to get Wine to work so I could play games. When things were really running well, I'd mess with fancy window managers, which resulted in a lot of broken 3D drivers. Although I'll admit I was interested in the naughty stuff, I either wasn't motivated enough or wasn't dumb enough to try it.

Without a doubt, I screwed things up weekly. But 99% of my problems came from `sudo apt-get upgrade` rather than some other source. On some level, it blows me away to hear stories of people intentionally borking things like this. Perhaps I did stuff like this, quietly fixed it, and then forgot about it. Alternatively, when people taught me dangerous commands, I guess they just did a good job of explaining why and how they were dangerous.

Regardless, even if most will eventually do the worst, isn't it worth it? Perhaps not in the case of the server, but arguably even then one could (perhaps facetiously) say that the sysadmin learned as much as the student did. For me, one thing is certain: it made a huge impact on my future.


> Commit to using, breaking and fixing a Linux desktop to learn quickly. I learned a lot running Debian Unstable back when men were men and unstable was really unstable; so the 2010s.

Disagree. There's nothing worse than needing to use a machine, but you can't since it's broken because some package maintainer pushed a bad release upstream.

There will be plenty to fix and tweak on any standard desktop or laptop when you install Linux, especially Macs. Might as well use a distro and release channel that lets you get your work done.

IMO, the hours I spent learning how to fix XFree86 were simply wasted.


My machine being broken in various ways pushed me to find workarounds. I would never have gotten as comfortable in Emacs or the shell without breaking X11 and my login manager so many times.


Some people are harshing on you for recommending Python because C is the unavoidable end-game in Unix. But I agree with the advice that Python is a great place to start.

Learn to write good Python. After that, learning to write good C will be much faster, easier and far less frustrating. Ultimately, on Unix you end up at C for at least some things (Python is implemented in C, after all). But these days you can kick the can pretty darn far down the road before needing C.


Counterpoint: I’ve been a Linux sysadmin and developer for close to 10 years at this point and I’ve never used Linux as a desktop. I’ve only used MacBooks.

The skills you need to run Linux on the laptop aren’t particularly close to what you’d need to run Linux as a server, IMO, particularly Linux as a VM.


I remember I used Ubuntu Desktop in 2008 for a year or two. But Ubuntu is point-and-click; anyone can use it. I did not learn much about Linux using it.

I felt that I really started understanding how things work when I switched to something requiring more reading: Arch Linux.

Getting a running desktop not only required doing all kinds of things I did not understand back then; the distro also provided a great wiki for getting there. I still sometimes end up in that wiki to quickly get up to speed on some tool or concept.

And the fact that it is a rolling-release distro that always runs the latest upstream software versions meant that at least the UI part frequently broke down, and that forced me to learn even more things. :)

That was only the beginning, but now it feels like it opened many possibilities and career paths for me. It probably was a bit frustrating at first, but it was worth it.


Exactly the same experience can be had by installing Ubuntu server as your base and going from there.


Ubuntu Server is the same distro as Ubuntu Desktop, just with some differences in the preinstalled packages.

On Ubuntu Server the installer will still automatically install the bootloader and configure the initramfs, full-disk encryption, etc. And after you install the desktop environment packages, it will configure, enable and start all of the services automatically.

While it is easier and the result looks the same, you are less involved in the process and thus potentially learn and understand less.

Just try setting up Gentoo or Arch Linux and then tell me that it was the same experience as Ubuntu Server. ;)


OS X exposes you to a decent amount of Unix stuff anyway. It’s quite a good compromise imo.


> , (5) concise. That rules out C (due to 1)

That's interesting; most people I've started on C have found it easier to start out in than, say, Java, which has been on a hell of a lot of introductory courses for a decade now.


C is way more concise than Java. It's one of the core 5 in any journeyman's toolbelt.


My high school went from drawing flow charts to C, and I find it very accessible to beginners, to be honest.


Based on personal experience, I can totally echo this comment. I'm a business undergrad (senior year), but I work as a JavaScript developer for a local start-up. On my Windows laptop, my workflow included MS Office applications as well as PowerShell and Vim. At the beginning of this semester, my laptop's screen broke and I couldn't afford a repair, so I dug up an old ThinkPad T430 that I had installed Ubuntu on a couple of years back. It was an old machine (2012?) that I bought used; after some searching on my phone, I upgraded the OS and got to work.

Based on my experience with the switch, I've learned miles more about Linux and how computers work than I would have otherwise. I might even say I like the experience more than I did before. On top of that, as the parent comment says, I've learned more about shell scripting, the directory tree, regular expressions, etc., than I think I would've by self-teaching alone. It might (read: "definitely will") be uncomfortable at first, but the Linux community is big and helpful, and there's plenty of great open-source software that can replace your old workflows.

As far as picking a language goes, I started with Python for the reasons above, but really, starting anywhere is better than being stuck deciding. Common starting points are Python, Java, JavaScript and C#, but honestly, just pick any language with a large community to help you out. When you think you know enough, come up with ideas for a personal project; that's where you'll likely learn the most. After that, keep learning, and if you decide you want to learn something else, you'll realize how much easier it is with one language already under your belt.


And FYI, I'm pretty sure you can run a Hackintosh on that computer.


100% agree with the 'just use it' part. I've been using Linux machines since the mid-2000s and the knowledge compounds over many years.

For programming language, I'd say Python is decent for a beginner. I've used Python as a language to teach some people programming, and the fact that they can do a wide variety of things (even games with Pygame) helps them stay engaged.

I would always recommend exploring other options as well though, to see what else is out there.


There's no such thing as Linux programming without C. You will have to deal with it eventually regardless of how many wrappers Python etc. puts on top of it.


Just because a C programming experience is possibly going to happen far off in the future doesn’t mean that beginners should commit to having a traumatizing ‘70s computing flashback as their first real programming experience.


I second this. Yes, you will make mistakes; yes, you will break things; but that's the way of learning. (BTW, always have a live ISO of your distribution of choice on hand.) Additionally, there's linuxjourney.com, which was an invaluable resource for me (I did the complete journey two or three times).


You can use a Linux desktop today without being exposed to the important stuff.

What is the lifecycle of a process and its states? What is a signal? What is a file descriptor, a socket? Etc.

Can you say you know about Linux without being able to answer these questions?
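Agreed, and these are exactly the things a quick experiment teaches better than a desktop session ever will. A hedged sketch in Python (the concepts are the kernel's; Python's `os` module is just a thin wrapper over fork(2), pipe(2) and waitpid(2)):

```python
import os

# A file descriptor is a small integer handle; pipe() returns two of them
r, w = os.pipe()

pid = os.fork()            # process lifecycle: parent spawns a child
if pid == 0:               # child runs from here, with copies of the fds
    os.close(w)
    msg = os.read(r, 64)   # blocks until the parent writes
    os._exit(0 if msg == b"hi" else 1)
else:                      # parent
    os.close(r)
    os.write(w, b"hi")
    _, status = os.waitpid(pid, 0)  # reap the child (else: zombie state)
    print(os.WEXITSTATUS(status))   # 0
```

Signals are the same story: `os.kill(pid, signal.SIGTERM)` delivers one, and `waitpid` reports whether the child exited normally or was killed. A socket is just another file descriptor, obtained from `socket()` instead of `pipe()`.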


That's how I learned: on a good ol' fashioned Gentoo box.

Learned so much about OS stuff and kernel drivers, etc.

It was like diving in the deep end of a pool though.


Gentoo was the best Linux class until their wiki nuked itself (can't remember what happened exactly). I started with Ubuntu around 2007, got tired of it and moved to Gentoo right away. My Pentium 4 was heating up my whole room compiling things all night. Good times.


That's how I learned as well. I started using Ubuntu Linux on my laptop and was forced to use the command line to install programs.


Yes. Run a Linux desktop and only stay in it.


Completely agree with everything you said - great advice.


I agree with this recommendation, but Ubuntu is a PITA for a desktop IMO. Manjaro is much more stable, easier to set up and maintain, and the default UI (XFCE) is simpler and very similar to Windows.

Ubuntu is good for a server. Using 2 different distros would be good to learn how not all Unixes/Linuxes are the same and how they might differ.

I would also add: Whatever programming language you learn, you're probably going to want to learn some auxiliary languages as well. Don't get stuck thinking you can do everything with one language. Learn JavaScript, SQL, HTML and CSS as well.


I agree you need to use it daily but disagree that an Ubuntu desktop provides any real insight.

Here are some BASIC questions an Ubuntu user might never encounter:

Which processes are currently hogging the CPU?

Which ports are currently open?

How do you setup a remote git repo and/or add a user for limited ssh?

How do you increase an existing swap partition without locking up the system during writes at peak load?

How do you exclude/include packages from being auto-updated on apt upgrade?

How do you ensure opaque crontabs aren't silently failing?
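For the record, the first two have one-liner answers, and the apt one is a single command. A hedged sketch (tool availability varies by distro; `ss` ships with iproute2, and the package name in the `apt-mark` lines is only an example):

```shell
# Top CPU consumers, highest first (first line is the ps header)
ps aux --sort=-%cpu | head -n 6

# Listening TCP ports (run as root to also see the owning processes)
ss -tln 2>/dev/null || echo "ss not installed (iproute2 package)"

# Exclude a package from `apt upgrade`, and release the hold later:
#   apt-mark hold linux-image-generic
#   apt-mark unhold linux-image-generic
```

For the crontab one: give each job a `MAILTO=` address or redirect its output to a log you actually read; a job that writes nothing anywhere fails silently by design.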

My advice: set up a Linux server on AWS or wherever for a paying customer with medium load. Only the stress of responding to actual downtime will teach you anything useful.


I do not think my beginner advice would include "find someone who will pay you to do the thing you can't yet do"


I disagree. That's the way all business works. In the beginning.



