Ask HN: What are the “lost arts” of our field?
12 points by Gabrielfair on Aug 14, 2023 | 23 comments
I've been thinking a lot about the good old simple days of the internet and programming. Lots of technology and tools are killed right out of the gate now b/c of how powerful established players have gotten. Your hobby project to track bots on Twitter: killed by an overnight API change.

So I'm looking for a list of the "lost arts".

For example, it's near impossible to find documentation related to reverse engineering Widevine DRM.

Anything published is scrubbed from internet search results lightning fast.



To paraphrase your own complaint - one "lost art" is self-contained software that can be shipped on a CD, including release notes and documentation that form part of the software/release itself, and can be used on a standalone computer with no internet access, or only occasional internet access.

Lots of software these days is written with the naive assumption of continuous, unlimited bandwidth / zero-latency internet connectivity, and falls apart whenever that assumption turns out to be incorrect.


> Lots of software these days is written with the naive assumption of continuous, unlimited bandwidth / zero-latency internet connectivity, and falls apart whenever that assumption turns out to be incorrect.

I’ve been stuck tethering to my phone for connection for a couple weeks with… volatile mobile networks, and what kills me is when the UI in some application stops responding when connection is bad. “Yeah here’s an intrusive popup, sorry you can’t close it because you don’t have a fast enough internet connection”


Writing low level things from scratch, utilizing CPUs fully, not relying on high level things that abstract away all the underlying hardware, OS and other low level details.

The Handmade [1] community is really into that stuff, originally inspired by Casey Muratori's Handmade Hero series [2].

[1] https://handmade.network/

[2] https://handmadehero.org/


User research, UI design. Windows 95 was a big leap in that regard, at least in the PC space [1]

Nowadays it's "move fast & break things", following the fashion, and asshole design ("dark patterns").

[1] https://dl.acm.org/doi/fullHtml/10.1145/238386.238611


Jakob Nielsen 4-ever <3


We used to have tools that could be used to build a CRUD application from scratch in a few hours (minutes for a trivial one). Delphi and Visual Basic 6 were widely used by domain experts to hammer out a workable tool.

Then Microsoft got obsessed with .NET because they thought everyone was going to bail on the X86, and used it as an excuse to kill off VB6. The shift to "professional" programming tools really just added a ton of boilerplate.

Since then, we've been forced to send UI through a soda straw to a random web browser, and make it all work.


Microsoft Access was great for this. You could build a quick app in a couple of hours and deploy it on a local network with a shared database.


Analog video encryption and decryption, perhaps. See https://en.wikipedia.org/wiki/Videocipher

Knuth used to have a section on "sorting with tapes" that he removed, but the problems there might still generalize to slower access methods even when they're not as strictly sequential as tapes. Maximizing bandwidth for GPU workloads and searching through 10TB+ HDDs with only relatively tiny access speeds both have similarities to some of those problems.
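
The underlying technique is external merge sorting: sort chunks that fit in fast memory, then stream-merge the sorted runs so each slow device is only ever read sequentially. A toy Python sketch of the idea (my own illustration, not Knuth's; a real implementation would keep each run on its own file or tape):

    import heapq

    def external_sort(stream, chunk_size=1000):
        # Phase 1: pull chunks that fit in fast memory and sort each one into a "run".
        # (In a real external sort each run lives on its own tape/file and is
        # read back strictly front to back.)
        runs, chunk = [], []
        for item in stream:
            chunk.append(item)
            if len(chunk) == chunk_size:
                runs.append(sorted(chunk))
                chunk = []
        if chunk:
            runs.append(sorted(chunk))
        # Phase 2: k-way merge of the runs -- each run is consumed sequentially.
        yield from heapq.merge(*runs)

    print(list(external_sort([9, 3, 7, 1, 8, 2, 6, 5, 4], chunk_size=3)))
    # [1, 2, 3, 4, 5, 6, 7, 8, 9]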


Well-known game developer Jonathan Blow has a lot to say about software and technology more generally being in decline due to skills being lost when older generations retire [1]. In my own experience, writing code with pointers in it used to be a skill expected of entry level programmers, and getting the hang of it was something of a rite of passage. Around the time Java started being used as a teaching language, it became best practice to rely on the language's standard libraries instead of rolling one's own linked list or whatever, and I instructed university students accordingly. As things have developed, the design effort put into Rust seems to imply that pointers are now regarded as a grand challenge problem. These days I don't tie my own neck ties any more but wear the clip-on kind because the knots have been developed and tested by experts.

[1] https://www.youtube.com/watch?v=pW-SOdj4Kkk


Writing good Mac applications is a quite rare skill these days. Not extinct – but the share of devs that can do it is probably quite small.


Docs, tutorials, and other educational resources for learning how to write native macOS applications are pretty limited. I can understand why a lot of devs choose to go for Electron instead.


Got any good examples of outstanding mac apps? I think I'd throw "Things 3" into that bucket for sure.


Acorn is one of my favorites.


As an outsider to web dev, I think there's a lot of wheel reinventing going on in web technologies over the past 10 years or so, and we still haven't fully caught up to where we were.

XMPP was a federated instant messenger protocol that worked extremely well and was widely used. It generally worked better than Element/Matrix or MS Teams in my experience. It was used by over 10 million people by 2003.

Adobe Flash. It had a lot of security and usability issues, but HTML5 has still not caught up to it. Any bright 15 year old could spend a day and make their own game.

Makefiles. People hate on them, but if you get around their weird syntax and other eccentricities, it's hard to find a better build system. They've been around since the 70s, and I have not seen another build system achieve feature parity.


> It generally worked better than [...] MS Teams in my experience

That's a very very very low bar to cross.


> Anything published is scrubbed from the internet search results lighting fast.

Try alternative search engines like Yandex etc.


Back in my day... probably not technically a lost art, but an interesting lesson in math.

I remember I was in computer science class. I was 17 years old.

And the teacher taught us how to read binary and how to calculate it. He said binary was math, which is a universal language that even aliens could understand, and that it would allow us to communicate, though we'd still run into the struggle of converting that binary to actual meaning, such as converting binary to English or vice versa.

We often don't think about it, but our programming technology is based on binary calculations.

I'd put conversion to or from binary in the lost arts category because I highly doubt it is ever taught, not even in computer science classes nowadays, because it's just slow and boring to do the conversions.

But if you do think about it, he wasn't wrong.

For those who are curious about it:

Converting binary to English or math involves interpreting the binary representation of numbers or characters into their corresponding decimal values or ASCII representations. Here's how you can do it:

Binary to Decimal (Math): Each digit in a binary number represents a power of 2. To convert binary to decimal, start from the rightmost digit and multiply each digit by 2 raised to its position, then sum up the results.

Example: Convert binary 1101 to decimal.

1 * 2^3 + 1 * 2^2 + 0 * 2^1 + 1 * 2^0 = 8 + 4 + 0 + 1 = 13

Binary to ASCII (English): ASCII (American Standard Code for Information Interchange) is a character encoding standard where each character is represented by a unique numeric value. In ASCII, each character is assigned a decimal value, which can be converted from binary.

Example: Convert binary 01001000 01100101 01101100 01101100 01101111 to ASCII (which represents the word "Hello").

01001000 -> 72 (H)

01100101 -> 101 (e)

01101100 -> 108 (l)

01101100 -> 108 (l)

01101111 -> 111 (o)

So, to convert binary to English or math, follow these steps:

For English (ASCII characters):

Divide the binary into groups of 8 bits (1 byte).

Convert each group of 8 bits to decimal.

Find the ASCII character associated with the decimal value.

For Math (Decimal):

Write down the binary number.

Multiply each binary digit by 2 raised to its position (starting from the rightmost position).

Sum up the results to get the decimal equivalent.

Keep in mind that this process is straightforward for numbers and ASCII characters, but other types of binary data (like images or files) may require different methods to interpret.
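
If you want to play with it, here's a rough Python sketch of both conversions described above (the function names are just my own, for illustration):

    # Binary to decimal: multiply each digit by 2 raised to its position
    # (rightmost digit = position 0) and sum the results.
    def binary_to_decimal(bits):
        return sum(int(d) * 2**i for i, d in enumerate(reversed(bits)))

    # Binary to ASCII: split into 8-bit groups, convert each group to decimal,
    # then look up the character for that value.
    def binary_to_ascii(bits):
        groups = [bits[i:i+8] for i in range(0, len(bits), 8)]
        return "".join(chr(binary_to_decimal(g)) for g in groups)

    print(binary_to_decimal("1101"))  # 13
    print(binary_to_ascii("0100100001100101011011000110110001101111"))  # Hello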


Tangentially related: storing a bitmask in your database (as a single 32-bit value) to track many flags, and using bitwise math to manipulate/interpret it. Much more space efficient than what everyone would do now (a bunch of tinyints, enums, or booleans) but obviously you’d be rightly fired out of a cannon now for introducing something so difficult to grasp for young devs, just to save “cheap” memory and storage.
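
For the curious, a tiny Python sketch of the idea (the flag names are made up for the example):

    # Each flag gets its own bit inside a single integer column.
    IS_ADMIN   = 1 << 0   # 0b001
    IS_BANNED  = 1 << 1   # 0b010
    HAS_AVATAR = 1 << 2   # 0b100

    flags = 0
    flags |= IS_ADMIN | HAS_AVATAR     # set two flags
    flags &= ~IS_BANNED                # clear one (a no-op here, it was never set)
    print(bool(flags & HAS_AVATAR))    # test a flag -> True
    print(flags)                       # 5 -- the single value you'd store in the column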


Bit masks are cool.

They are also very fragile.

Far better to have something maintainable and extendable (what happens when you exceed 32 binary flags?) than to be overly clever for the sake of saving a couple of bytes of storage.


No dispute there, it would be silly now, I just think it was kind of cool how programmers used to understand how binary worked, and appreciate that it was worth it at one point to use 1 actual bit to store a true/false value.


I have a few simple bit questions in my coding interviews, and most programmers simply don't know how to set a bit in a byte (or word), rotate bits in a byte (or word), or what endianness is.
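
For reference, this is the kind of thing I mean, sketched in Python (assuming 8-bit bytes):

    import struct
    import sys

    def set_bit(byte, n):
        # Set bit n (0 = least significant) in an 8-bit value.
        return (byte | (1 << n)) & 0xFF

    def rotate_left(byte, n, width=8):
        # Rotate an 8-bit value left by n positions (bits that fall off wrap around).
        n %= width
        return ((byte << n) | (byte >> (width - n))) & 0xFF

    print(bin(set_bit(0b00000001, 3)))      # 0b1001
    print(bin(rotate_left(0b10000001, 1)))  # 0b11

    # Endianness is just the byte order of multi-byte values in memory.
    print(sys.byteorder)          # 'little' on x86
    print(struct.pack('<I', 1))   # b'\x01\x00\x00\x00' (little-endian 32-bit int)
    print(struct.pack('>I', 1))   # b'\x00\x00\x00\x01' (big-endian 32-bit int)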


In the 2010s binary was taught in high school intro to CS and in university CS classes. I don't know about bootcamps though, I doubt it is taught there.

What isn't really taught is binary representations and what things look like in memory. I think there was one class in university that covered it, but just briefly.


Binary is used often by embedded programming/FPGA dev types. I use binary and hexadecimal every day.



