wkjagt's comments

I heard about this but for some reason my 4a was never affected. Still works great and I still use it daily.

I miss it too. I just bought a 90s laptop, and put Windows 98 on it. And Borland C++. I'm sure a good part of it is nostalgia, but there just was something special about computers back then that I miss today.

I recently bought an HP Jornada that I want to explore a little further. Cool little Windows CE palmtop with great battery life.

> OS/2, Microsoft’s latest addition to its operating system line

Wasn't it mostly an IBM product, with Microsoft being involved only in the beginning?


The article is from December 1987, when nobody yet knew that it would end up that way. The Compaq Deskpro 386 had just been released in 1986 (thus unmooring the “IBM PC clones” from IBM), the September 1987 release of Windows/386 2.01 was only a couple of months ago (less if you account for print turnaround), and development of what would initially be called NT OS/2 would only start in 1988, with the first documents in the NT Design Workbook dated 1989. Even OS/2 1.1, the first GUI version, would only come out in October 1988 (on one hand, really late; on the other, how the hell did they release things so fast then?..).

While the NT OS/2 effort started earlier, Windows 3.0 was apparently an unsanctioned, originally rogue effort started by one developer, initially masquerading as an update to the "Windows-as-Embedded-Runtime" that multiple graphical products were shipping with, not just Microsoft's.

Even once marketing people and others got enthused enough that the project received official support and a release, it wasn't expected to be such a hit early on, and the expectation was that the OS/2 effort would continue, if perhaps with a different kernel.


Microsoft was only interested in fulfilling the contracts, and in some networking components such as NetBIOS and LAN Manager, then winding down. This was because Microsoft had already been in discussions with David Cutler, and had hired him in October 1988 to essentially bring the VMS design over to what became Windows NT. Windows NT 3.1 appeared in July 1993.

https://archive.org/details/showstopperbreak00zach


Microsoft unwrote a lot of the code that IBM needlessly wrote.

I worked as a trainer at a commercial training company that used the Glockenspiel C++ compiler that required OS/2. It made me sad. NT made me happy.


This is from 1987; the IBM / Microsoft joint development agreement for OS/2 didn't fall apart until around 1990, and there was a lot of Microsoft work in early OS/2 (conversely, the non-multitasking MS-DOS 4.0 was largely IBM work).

Windows NT originally shipped with an OS/2 compatibility layer, along with POSIX and Win32.

I'm assuming that all of it was written mainly, if not solely, by Microsoft.


If you count the beginning as the period from OS/2 1.0 up until MS released Windows 3.0, then it makes sense. IBM understood that Microsoft would continue to collaborate on OS/2 more or less forever.

As an outsider to all the history and lore… IBM is probably one of the most confusing companies I can think of.

Also, any fanciness you add in your product is something you need to then maintain. Even after the developer that built it leaves the company.

It takes thousands of years for the stars to change positions in a noticeable way, and my best guess is that customers will not use their car GPS for long enough for this to bother them.

Very funny, but in case you're serious, it's not the stars changing...

...it's the software frameworks. A new screen size. A different color depth. A bug when the graphics library is upgraded for antialiasing. Etc.


That's not going to be an issue with devices already sold. And if developers of future devices can't handle it, they should probably be fired from their jobs.

It's not a question of whether future developers can "handle" it. It's a question of whether the additional time required to maintain it is worth the cost. Maintenance isn't free. It takes time, and every little bit adds up.

Also, devices already sold often get updates, so it's not even just about future devices.


And if you completely discharge the Powkiddy you can't charge it anymore, unless you open it up, physically disconnect the battery, plug the charger in, and then plug the battery back in.

It's a way of signaling agreement.


They were correcting my initial "1+"


Shouldn't we programmers be just typing "++"?


i++


I want to get into Rust and OS development, so this sounds like a great series. However, the articles were written in 2018-2020. Would they still be mostly relevant? Or has Rust moved so fast that too much has changed?


Earlier this year I was able to struggle-bus together my own bootable Rust binary (and bootloader in assembly!) on QEMU in a couple of hours, using his blog as a guide. It seems the LLMs have definitely scraped his blog, because if you go off the happy path they're absolutely useless in this endeavor, and few other resources on the topic exist.
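
For reference, the freestanding binary from the first post is only a handful of lines; this is a from-memory sketch of that starting point (the linker/target setup that actually makes it bootable lives elsewhere):

    #![no_std]   // no standard library on bare metal
    #![no_main]  // no Rust runtime entry point either

    use core::panic::PanicInfo;

    // Called on panic; with no OS underneath, the simplest thing is to spin.
    #[panic_handler]
    fn panic(_info: &PanicInfo) -> ! {
        loop {}
    }

    // The linker looks for `_start` by default; this replaces `main`.
    #[no_mangle]
    pub extern "C" fn _start() -> ! {
        loop {}
    }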


You can have a look at this project, which shows how, with proper abstractions, you can minimize unsafe Rust in low-level firmware. Specifically, look at the page token, page allocator, and page table implementations: https://github.com/IBM/ACE-RISCV
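
Not the project's actual API, but the general shape of the idea is something like this: a page type that safe code can only obtain from the allocator, with the unsafe confined to a couple of small, audited spots (names below are made up for illustration):

    // Illustrative sketch only; the real ACE-RISCV types are richer than this.
    // The "token": the only way to get one is from the allocator, so safe
    // code can never fabricate an arbitrary physical address.
    pub struct Page4K(usize);

    pub struct PageAllocator {
        next_free: usize, // next unallocated physical address
        end: usize,       // end of the managed region
    }

    impl PageAllocator {
        pub fn alloc(&mut self) -> Option<Page4K> {
            if self.next_free + 4096 <= self.end {
                let page = Page4K(self.next_free);
                self.next_free += 4096;
                Some(page)
            } else {
                None
            }
        }

        /// Zero a page before handing it out. The single unsafe block is
        /// confined here, behind a safe signature that requires a token.
        pub fn zero(&mut self, page: &Page4K) {
            unsafe { core::ptr::write_bytes(page.0 as *mut u8, 0, 4096) }
        }
    }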


Still relevant. CPUs don't change much. The biggest differences will be the build tooling.

My kernel is at [0]; feel free to reference its config for inspiration.

[0] https://github.com/oro-os/kernel
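
The build tooling point is the main caveat: the 2018-era posts used a custom target JSON with cargo-xbuild and bootimage, whereas today you'd more likely reach for cargo's built-in build-std. Roughly (details vary by project), something like:

    # .cargo/config.toml -- rough modern equivalent of the blog's cargo-xbuild setup
    [unstable]
    build-std = ["core", "compiler_builtins"]
    build-std-features = ["compiler-builtins-mem"]

    [build]
    target = "x86_64-unknown-none"   # or a custom target JSON, as in the blog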


I went through it recently and I don't think anything in it was out of date. The blog mentions having to do some hacky things because certain Rust features didn't exist yet, but the hacky solutions provided still worked.


I have an old version (I don't remember which one) of Word running on Windows 3.11 on a 486 DX2-66.


Beautiful. I had that exact computer model 30 years ago.


From what I understood, these are supposedly bad because they look like video games instead of photographs. I'm not sure what the problem with that is, though. I'm fine with video games looking like video games.


I also thought that, but then I scrolled down to the Breath of the Wild shot and got (part of) it: BotW has an awesomely rendered sky, whereas the CoD and, to a lesser extent, HZD ones are a desaturated, largely overexposed mess (despite all my affection for HZD). The Smaug shot is flawed in a similar manner.

And the photography comparison does come to mind immediately, because that kind of thing is in fact what you’ll get from a DSLR on a sunny day if you don’t know what you’re doing, and to some extent from a film camera too (I’m speaking about the sky only; the HZD shot has much too large a dynamic range to capture on a real camera without compositing). Photographers have a huge bag of tricks to deal with the problem, from taking photos in early morning light, to darkening parts of a shot with a graduated ND filter, to underexposing and fixing it up in post (before digital, that meant chemistry).

I think it is fair to hold games to this standard. It’s not that they have to look like photos. It’s that they shouldn’t have flaws in their look that have been recognized and solved for photos for more than a century.


I don't see how photo solutions apply to real-time rendering. Unless your games are slideshows.


You... apply the photo solution but in real time, to every frame.

Games are interactive movies (not just like them; they are), and movies are moving pictures. Of course techniques applicable to pictures are largely applicable, frame-wise-ish, to movies and games. Siloing knowledge stifles achievement.
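
To make that concrete (a toy sketch, not how any of the games above actually do it): the per-frame version of the photographer's trick is just an exposure adjustment plus a tone-mapping curve applied to every pixel, so highlights like a bright sky roll off instead of clipping to white:

    // Toy per-pixel exposure + Reinhard tone map; names are made up for the example.
    // `hdr` holds linear HDR channel values, output is display-ready [0, 1].
    fn tone_map(hdr: &[f32], exposure: f32) -> Vec<f32> {
        hdr.iter()
            .map(|&c| {
                let e = c * exposure;   // "underexpose and fix it in post", every frame
                let t = e / (1.0 + e);  // Reinhard curve: compresses highlights, keeps sky detail
                t.powf(1.0 / 2.2)       // gamma for display
            })
            .collect()
    }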

