
We take it for granted now with the prevalence and acceptance of interpreted languages like Python and Perl, but the radical accessibility of BASIC can't be over-emphasized. C, Pascal, FORTRAN, and friends all had a compile step which meant that you had to operate a text editor and command-line... BASIC was so much more immediate and straightforward to learn on.

The other great reason so many of us 40-somethings got our start on the BASIC-wielding micros was that our universe was pretty small and the simple things we could make our machines do were still fancy. These days it seems that you won't interest kids with anything short of a 40fps multiplayer battle royale game with millions of polygons on screen. Random number guessing games are lame, Dad.

The early web was a lot like the early micros -- you had to make your own entertainment as you quickly ran out of things to do, and the things you were able to make were comparable to the things built by well-funded teams. The opportunity to add something new and useful to the web is largely past, and any window is even smaller if you're not a venture-backed team.

Old man out. <mic drop>



Considering the RAM and storage constraints of those machines, hosting a compiler would have been quite challenging. Where would it store object files or the final executable? On Datasette? And the 6502's CPU stack is limited to a whopping 256 bytes! With an interpreter burned into ROM, we only needed the BASIC code to reside in RAM, plus the occasional assembly routine to speed things up. I remember how entering a hex dumper (and a disassembler, if possible) was always among the first things I did on such a box. I hacked the ROM of the Atari 800 to find out where the BCD floating-point subroutines were located, so I wouldn't have to run the computation loop of my Mandelbrot fractal generator from BASIC. That made a hell of a difference. It got to the point where I could enter 6502 instruction codes in hex from memory: the instruction set is so small you could actually have it memorized, just as a side effect of daily use. The sport is a bit more elaborate these days.
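
For anyone who never wrote one: the Mandelbrot inner loop is only a few multiply-adds per iteration, which is exactly why per-statement interpreter overhead dominated it and hand-written 6502 code paid off so well. A minimal escape-time sketch (in Python here, purely as illustration; the original was BASIC plus assembly):

    # Minimal Mandelbrot escape-time loop: the hot path is a handful of
    # multiply-adds per iteration, so per-statement interpreter overhead
    # easily swamps the actual arithmetic.
    def mandel_iters(cr, ci, max_iter=32):
        zr = zi = 0.0
        for n in range(max_iter):
            zr, zi = zr * zr - zi * zi + cr, 2.0 * zr * zi + ci
            if zr * zr + zi * zi > 4.0:
                return n          # escaped: the point is outside the set
        return max_iter           # assumed inside the set

    # Coarse ASCII rendering of the classic region.
    for row in range(20):
        ci = -1.0 + row * 0.1
        print("".join("#" if mandel_iters(-2.0 + col * 0.05, ci) == 32 else "."
                      for col in range(60)))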

I still remember the elation I felt when I finally got my hands on a machine that could actually run a C compiler and had “high-resolution” graphics, not to mention an operating system that was documented. It felt like changing from a horse cart to a racing car. Today's computers are science-fiction space ships compared to that.

Another old man, out.


I always had contempt for BASIC languages in the '80s. Being a tough demo coder in assembler, I thought they weren't the real stuff.

Today I agree with you.

After re-entering the world of 8-bit and 16-bit Commodores, I finally put those BASICs to use. Indeed, you can do quite a few tricks with them, and they are quite powerful.

There is beauty in the ease of use of these languages. And to be honest, there is a lot of hardship in languages like assembler.

The whole IDE idea for Java, Kotlin, etc. reminds me more of BASIC than of C. I especially appreciate that BASIC is somewhat forgiving when you make mistakes, a.k.a. bugs. If you make a mistake in assembler on an Amiga 500/1200, for example, you most often have to reset the machine. If you forgot to save your work before testing, you suffer even more. That is not the coolest of workflows, even on emulators like WinUAE, where you need good timing to hit the debug key.

Assembler is complicated (a word operation accessing an odd memory address is a bug that's hard to find) and only useful in very constrained areas, like older computers.

BASIC is simple, and therefore fast.

With computers doing the heavy lifting nowadays, I leave the speed to the machine and focus on fast iteration. Except on the Amiga, where I think BASIC and assembler can coexist for me these days. I should have used BASIC back then for the simple things.


'80s BASIC's “REPL” and lack of local context (except for BBC BASIC) made observing and debugging programs much easier than in any modern system, despite the amazing debuggers we now have.

Break/modify/continue was the norm. It somehow worked much better than e.g. the JavaScript console on a modern browser even though reading a description of it wouldn’t give any indication of that.
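
For readers who never used it: since classic BASIC variables were all global, you could hit BREAK mid-run, print or reassign anything at the prompt, and then CONT from where you stopped. The nearest everyday analogue is probably something like Python's pdb (a rough analogy, not a claim the workflows are equivalent):

    import pdb

    total = 0
    for i in range(10):
        total += i
        if i == 5:
            pdb.set_trace()  # "BREAK": drops you to an interactive prompt
            # At the (Pdb) prompt you can inspect and modify live state:
            #   (Pdb) p total       -> 15
            #   (Pdb) total = 100   # "modify"
            #   (Pdb) c             # "CONT": resume the loop from here
    print(total)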


The nice thing is that such beautiful interfaces get rediscovered by the younger generations. The rise of the REPL interface for Python is a great example; it makes the language feel a lot like BASIC on steroids. And the notebook interface was kind of a game changer for open-source scientific computing. Although it was pioneered by some software from the '80s and '90s (thinking of the young Mathematica and MATLAB GUIs), it only really became widespread recently, as far as I can tell. The distributed nature of these tools over the web is probably boosting the experience.

Furthermore, the small joys of programming can always be around the corner. Take for instance https://tixy.land/, a small creative coding environment. Or take https://d3js.org/, a bigger ecosystem that makes animations and SVG really fun and easy with JavaScript.
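
If you haven't seen it: tixy.land asks you for a single function of (t, i, x, y) and animates a 16x16 grid of dots with it. A rough ASCII imitation of the idea (a loose sketch in Python; the real site is JavaScript and renders sized, colored dots):

    # Rough ASCII imitation of tixy.land's idea: one function of
    # (t, i, x, y) decides how every cell of a 16x16 grid looks.
    import math

    def f(t, i, x, y):
        # expanding ripple around the center of the grid
        return math.sin(t + math.hypot(x - 7.5, y - 7.5))

    for frame in range(3):                      # a few animation "frames"
        t = frame * 0.5
        for y in range(16):
            print("".join("#" if f(t, y * 16 + x, x, y) > 0 else "."
                          for x in range(16)))
        print()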


This use case is part of why I'm so disappointed in some Python 3 changes. Python 2 was definitely more suitable for a newcomer experimenting in a REPL, by virtue of its greater simplicity and intuitiveness.


It’s really surprising how converting print from a statement to a function made REPL debugging much less convenient.


I don't see "print(x)" being in any way worse than "print x". If you want to save some characters, you can always just type the expression at the REPL prompt:

    >>> x=42
    >>> x
    42
This already works in the standard Python REPL, no need for IPython. You can literally just omit the print call/statement:

    >>> print("the answer to life is %d" % x, "not 0")
   the answer to life is 42 not 0
   >>> "the answer to life is %d" % x, "not 0"
   ('the answer to life is 42', 'not 0')


How so?


I can’t give a logical explanation, but after three years, “print(x)” in the REPL still feels like cognitive dissonance, whereas “print x” is natural.

(It doesn’t bother me in the editor, only in the REPL.)


> C, Pascal, FORTRAN, and friends all had a compile step which meant that you had to operate a text editor and command-line... BASIC was so much more immediate and straightforward to learn on.

A "text editor and command line" workflow requires a notion of files which just wasn't a thing in those early machines (except as part of custom disk-mamagement commands). What they had was a sort of line editor as part of the UI itself - if you typed a line number at the prompt, that line would be replaced in memory. Of course this worked really well with early BASIC where control flow is expressed entirely via those line numbers. Some primitive visual editing was also allowed - you could LIST some lines of code, then change a line on the screen and press the enter key - the ROM would then read the complete line of code from the screen and update it in the actual listing. It could get confusing at times, but mostly it worked quite well.


A notion of files was certainly there for the Commodore 64, and many programming language environments for the C64 included code editors that supported saving/loading files on disk directly, including macro-assemblers, for that matter.


> The early web was a lot like the early micros -- you had to make your own entertainment as you quickly ran out of things to do, and the things you were able to make were comparable to the things built by well-funded teams.

I'd never made this connection before but it seems both obvious and profound in retrospect. Thanks.


> Random number guessing games are lame, Dad.

Drawing a circle on the screen is still pretty cool for kids. Not, like, super cool, but QBasic is a lot more accessible than anything more modern (other than maybe Scratch, but that's weird).
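
For what it's worth, the closest thing in a stock modern install is probably Python's standard-library turtle module (a sketch; no claim it matches the immediacy of QBasic's one-line CIRCLE):

    # Drawing a circle with Python's standard-library turtle module,
    # roughly the modern counterpart of QBasic's CIRCLE statement.
    import turtle

    pen = turtle.Turtle()
    pen.penup()
    pen.goto(0, -100)   # start at the bottom edge so the circle is
    pen.pendown()       # centered on the origin
    pen.circle(100)     # radius 100
    turtle.done()       # keep the window open until it is closed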


I have a 12-year-old. The killer app for learning programming for him has been Minecraft bots. He's on Discord with a group of friends from all over the world, and they're learning to program by writing bots in JavaScript. Those bots watch what happens on the server and react. His current project is writing a bot that can play connect-4 with the user. He's slowly reinventing the minimax algorithm and having a lot of fun doing it.
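
The idea he's converging on fits in a dozen lines. A runnable minimax sketch (in Python, and using a tiny take-1-2-or-3 counters game instead of connect-4 so it stays self-contained; the board game swaps in board states and a scoring heuristic):

    # Bare-bones minimax on a tiny game: n counters, each player removes
    # 1-3 per turn, and whoever takes the last counter wins. The value of
    # a position is +1 if the maximizing player wins with perfect play.
    def minimax(n, maximizing):
        if n == 0:
            # the previous player took the last counter and won
            return -1 if maximizing else 1
        scores = [minimax(n - take, not maximizing)
                  for take in (1, 2, 3) if take <= n]
        return max(scores) if maximizing else min(scores)

    def best_move(n):
        return max((take for take in (1, 2, 3) if take <= n),
                   key=lambda take: minimax(n - take, False))

    print(best_move(10))  # -> 2: leaving a multiple of 4 is a won position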


There was a thread about a month back where some of these points came out - where I shared the exact frustration you feel!

I think these days the interesting building blocks may have to be more "intelligent".

https://news.ycombinator.com/item?id=24706571


I started programming on an Amiga. After some AmigaBASIC and later AMOS, I wanted to learn a real language. I got a book about C, but the compiler was a pain: it didn't fit on one floppy, and I had no hard drive.

Learning assembly was the natural way out.


There was that weird period on the Amiga when using assembler felt easier than C, especially with a proper macro assembler. I remember having a C library for some user-interface component that I compiled and then disassembled, so I could optimise it by hand, because the C compiler's output was "useless" and easy to improve on.

Of course, the Amiga got other interesting languages, like AmigaE (whose author is probably better known for FlatBuffers these days, but AmigaE was a usability revelation at the time).


I had a Pascal interpreter for the ZX Spectrum, in book form, that I had to type in. Languages and implementations are orthogonal.


Really, with JavaScript on a web page you can do the same thing, except there’s much more structure and power.


Node's REPL is not very comfortable for creating and running snippets. The BASIC environment has the magic LIST command that shows you the current state of the program.

I think the Smalltalk environments with the object browsers were (and are) the ultimate fiddling environments.


You could even pass LIST a line number or a range, and it would show only that part of the program.


I mean in a web browser, not Node.



