Just to add my experience to the pile: when I went to college I was able to convince my parents to get me a custom PC from a company called GamePC. Among the specs in 1998:
400MHz Pentium II
128MB RAM
Nvidia Riva TNT
3dfx Voodoo2
CD-RW (4x4x24, I think)
SyQuest SparQ (awesome, but had major issues)
Internal Zip drive
Just a ridiculous system for the time. Quake 2 and Starsiege: Tribes were really popular in our dorm, and that system was just perfect for them. Also popular was burning lots of pirated games, so we'd order CD-Rs in bulk from this really random site overseas. High-quality "gold" CD-Rs, and they were far more reliable than any of the ones you'd find in stores in the US, at about half the cost.
Halfway through my freshman year I decided to swap the motherboard and CPU for a crazy motherboard/CPU combo. There was a brief moment where Intel Celerons didn't really prevent you from using them in a dual CPU setup, so I had two 366MHz Celerons overclocked to ~433MHz (sometimes up to 533MHz, but that was less stable) and started playing around with OSes like Linux and BeOS to actually take advantage of them.
> Halfway through my freshman year I decided to swap the motherboard and CPU for a crazy motherboard/CPU combo. There was a brief moment where Intel Celerons didn't really prevent you from using them in a dual CPU setup, so I had two 366MHz Celerons overclocked to ~433MHz (sometimes up to 533MHz, but that was less stable) and started playing around with OSes like Linux and BeOS to actually take advantage of them.
Half of HN alive at the time probably had that motherboard - ABIT BP6 was the reason I got my hands on a copy of W2K, and also started playing with Linux.
I'm still bummed that CPU manufacturers basically killed off the third party chipset industry. Maybe it was inevitable when memory controllers moved on-die, but I remember when there was actual competition in the chipset business.
Chipset, not CPU. For example, Nvidia was a well known chipset manufacturer around this time, shortly before memory controllers went on package and 3rd party chipsets died off.
Speak for yourself, mate! Many fun times were had with machines built by each. I have particularly fond memories of the SiS 630/730, VIA's KT133A, and lots of old ALi, OPTi, and ULi gear from the 286, 386, and 486 era.
It was a cool board. I didn't technically have one, but I built my dad a W2K server on a BP6. I always wanted to hack on it and overclock with it, but after I handed it over I wasn't allowed to touch it: "you'll burn up my processors." Since he didn't care about overclocking, he had dual P2-400s, or maybe 450s. It was a beast. He could run SQL Server and compile Delphi apps so hard.
I got my kicks though with a BF6 and a 300A. Those were the times; at least until the Athlon XPs (AXIA -- anybody?) were released.
The first PC I built myself had an Athlon XP Barton 2500+ and 2x 256MB sticks of DDR-400. It wasn't top of the line by any means, but great bang-for-buck in 2003.
Had pretty much the same thing... but only one Celeron, overclocked to 433MHz. It was an amazing upgrade from my Pentium 133 with a Matrox Millennium, which I had somehow used to complete Half-Life in low-FPS agony.
I still have distinct memories of "playing" CS in 640x480 on a PII with the same card, which didn't do 3D at all IIRC. 12-15 fps with the software renderer, depending on how many bots you had.
Agree -- that dual Celeron setup (often with a Peltier cooler) was suuuper common. I knew so many people who rushed out to get them and run at 500MHz.
It was my second exposure to SMP though: the first was a dual-socket Pentium Pro at 200MHz, which ran NT 4.0 for the longest time (I still keep that hefty CPU around on my desk for laughs).
I'm slightly confused, how would games of that era benefit from a dual CPU setup?
Old games were decidedly single-threaded and built for a single-core world. It was only in the mid-to-late 2000s that games started to be more optimized for multi-core CPUs. And multi-CPU is even more difficult because there isn't any cache sharing.
Games didn't (not directly), but your workload did. If you ran an NT OS (2000/XP, primarily), the system would gain from being able to run background tasks without needing to preempt your game's logic.
It's impossible to run Windows as an exokernel, so you would inevitably gain some benefit (even if negligible).
I wonder if there's something like this today that you'd have in college? Some low-cost graphics card rigs? Or is it more like Cloudflare-style setups today?
It was just single-CPU, but I had the ABIT BH6 with a Celeron 300A, one of the most overclockable setups ever. Mine was stable at 450MHz without any special cooling.
Similar experience, I had a Cyrix PR200 which really underperformed the equivalent Intel CPU.
Convinced my parents to buy a new PC; they arranged with a local computer store for me to go in and sit with the tech and actually build the PC. Almost identical specs in 1998: 400MHz Pentium II, Voodoo2, no Zip drive, but it had a Sound Blaster Live! ($500 AUD for this at the time).
I distinctly remember the invoice being $5k AUD in 1998 dollars, which is $10k AUD in 2024 dollars. This was A LOT of money for my parents (~7% of their pretax annual income), and I'm eternally grateful.
I was in grade 8 at the time (middle school equivalent in the USA) and it was the PC I learnt to code on (QBasic -> C -> C++), spent many hours installing Linux and re-compiling kernel drivers (learning how to use the command line), used SoftICE to reverse engineer shareware and write keygens (learning x86 assembly), and created Counter-Strike wallhacks by writing MiniGL proxy DLLs (learning OpenGL).
So glad there weren't infinite pools of time-wasting (YouTube, TikTok, etc.) back then, and I was forced to occupy myself with productive learning.
I could share almost exactly the same story. So grateful my parents could afford, and were willing to spend, the money on a nice PC that I entirely monopolised.
That and high speed internet. I played for a couple of years on 28.8K. The day I got a better graphics card was great. No more choppiness. The day I got cable internet was life changing in Tribes (and Tribes 2)!
I think I still have a pic somewhere of the infamous NoFix scolding "LPBs"
I remember when cable internet started showing up... I'd cart my computer to a friend's house once a month for a weekend LAN party and to run updates.
Back then, updates over modem took hours to run, it was kind of crazy considering how many easily exploited bugs existed back then.
> it was kind of crazy considering how many easily exploited bugs existed back then.
Anyone on IRC learned this pretty quick.
I thought my computer was up to date on everything; I ran Win2K with a well-configured ZoneAlarm firewall. Someone on IRC said they had a script with all known exploits and I invited them to run it against me… they still managed to crash my computer.
True enough... ironically, I ran a firewall/router with an early Linux distro in the mid-to-late '90s, before I got high-speed internet. So, no open inbound ports to the internet. I had a script to set/forward game ports while playing and close them out after.
Win2k had a few file/print services and similar on by default.
There was a moniker for the few people with high speed back then: LPB, low ping bastards. All those fortunate enough to live in a city with ADSL or cable high speed in the early days (or gaming at work or university on the T1).
Interestingly enough, these days it's often an advantage to have high ping, because modern games make client-side hit detection authoritative. With Apex Legends, Respawn uses the argument that playing against laggers with client-side hit detection makes the bullshit that happens "symmetrical", and they want to keep the game accessible for people with poor connections, but anyone who plays against laggers knows that is absolutely not the case.
I wish modern games would just include a Ping Lock toggle in the matchmaking. "Do not match me with anyone with poor connection quality" (>100 ping, >1% packet loss). With a big fat pop-up warning that it'll increase matchmaking times.
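A ping-lock filter like that is simple to express. Here's a minimal sketch; the class, function names, and thresholds are all made up for illustration, not any real game's API:

```python
# Hypothetical "ping lock" matchmaking filter: exclude players whose
# connection quality falls below a configurable bar before forming a match.
from dataclasses import dataclass

@dataclass
class Player:
    name: str
    ping_ms: int
    packet_loss_pct: float

def ping_lock(players, max_ping=100, max_loss=1.0):
    """Keep only players meeting the connection-quality thresholds."""
    return [p for p in players
            if p.ping_ms <= max_ping and p.packet_loss_pct <= max_loss]

lobby = [Player("a", 45, 0.1), Player("b", 180, 0.0), Player("c", 60, 2.5)]
print([p.name for p in ping_lock(lobby)])  # ['a']
```

The tradeoff the pop-up warning would acknowledge: a stricter filter shrinks the candidate pool, so matchmaking times grow.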
It was deeper than that. That was just the way we were all classified back then: hpb (high), lpb (low), slpb (super-low?). When we got a cable modem in '99, I felt like hot shit to leave the hpb shame behind.
vgh! Except that texture transparency worked with Glide (Voodoo) cards and not with OpenGL or software rendering. So if you made a custom skin with a transparent texture, there was a brief window in the Tribes 1.1 to 1.2 era where you could be somewhat invisible to people with Voodoo cards (if your skin was in a skin pack that everyone had).
It's too bad that the Tribes games' learning curve is too steep for people now. Tribes: Ascend was pretty well made but died quickly, and Tribes 3 seems to have died even faster.
Very few people who didn't already play the earlier games have much stomach to figure out how to even move effectively across the map or hit anything moving at high speed, let alone do proper cap routes, chase, etc. I played Tribes: Ascend for a while, and on most random servers you could play the first 2 minutes as a chaser to verify "yup, there is nobody here who knows how to move quickly", kill yourself to switch roles, and then end the game in like 4 more minutes when everyone else is just slowly messing around in midfield doing nothing productive. And I wasn't even any good lol, when I went into any semi-organized pug I would get destroyed.
Ah, a fellow Tribes player. Just so you know, we still play Tribes. Join us! http://playt1.com/ - the community maintains the master server and clients these days. There are good pick-up games on Fridays and weekends.
I love this game. It's also amazing to me how the concept of "skiing" was foreign to me when I first played T1 and T2, and now it's a core game mechanic.
> There was a brief moment where Intel Celerons didn't really prevent you from using them in a dual CPU setup, so I had two 366MHz Celerons overclocked to ~433MHz
Was that the BP6 motherboard from ABIT?
I had that board, those processors and used to overclock them too.
Also ran Linux and BeOS on it (though IIRC you had to patch BeOS for SMP support).
Quake 3 ran so smooth on that machine, even without Q3's experimental SMP support enabled.
That was actually my all time favourite computer, even to this day.
I also had a TNT2 in an earlier machine, but the BP6 machine had a GeForce 3.
Dual 300As overclocked to just over 500MHz each on a BP6 with a GeForce 256 here too! Fastest, smoothest machine I ever used until the M1 MacBook. The Quake 3 multiplayer demo ran so fast it gave me motion sickness ^^ Years later I "upgraded" to a 1GHz Athlon and it felt like a downgrade a lot of the time.
> though IIRC you had to patch BeOS for SMP support
The board might have required a driver or patch, but SMP was BeOS's entire reason for being! The drawing of each window on the screen ran in a separate thread. It was their main selling point.
Reading the BeOS Bible talking about that is quite a throwback:
> As described elsewhere in this book, BeOS uses multiple processors with incredible efficiency. If you'll be running BeOS most of the time, you'll get more bang for your buck by getting two (or more) older processors than by installing one superfast CPU. Last year's 266 MHz CPUs will always be dirt cheap compared to today's 450 MHz CPU. Thus, when running BeOS, you could have 532 MHz for less than the cost of a single 450 MHz processor. The catch is that if you'll be dual-booting into operating systems that won't recognize a second CPU (such as Windows 95/98), you'll end up with half of your processor speed being wasted until you reboot into BeOS. Chances are that once you start using BeOS regularly, you won't want to use anything else, and you won't regret buying a multiprocessor machine.
Lack of SMP was an artificial limitation for the BeOS 5 Personal Edition (I think it was called). The idea being you’d get BeOS for free but you couldn’t use it as a proper multiprocessor workstation without paying for a license.
This was also the same BeOS release that targeted Intel and ran off a virtual disk stored on a Windows FAT32 partition.
Oh man, the Celeron A, which was basically a Pentium II with on-die L2 cache. Intel attempted to handicap it by limiting its FSB to 66 MHz, but any half-decent motherboard would let you bump that up to 100 MHz so long as you had the rest of the hardware to support it (i.e., PC-100 memory). This resulted in a pretty significant bump in CPU frequency.
Overclocking Celerons... those were the days. Intel binning down a bunch of processors capable of reaching higher clock rates but selling them as a lower-end part was a boon for college students everywhere.
Nvidia RIVA TNT, which used the AGP bus on the Intel 440LX mobo.
A whopping 128MB of RAM and an 8GB HDD.
I recall using a program called WinSplit to split the Nvidia driver over several floppy disks on my boss's Win3.1 machine in the office. I didn't have internet at home and really wanted to play Jedi Knight and Battlezone.
I recall the legendary Celeron being the 300A. It was 300MHz, but was easily overclocked to 450MHz. There were higher clocked versions, but regardless of which CPU you got, they ultimately were only able to overclock to about the same frequencies.
Also, the Celerons of that generation did not have unlocked multipliers. The only way to overclock them was to overclock the front-side bus, which also controlled memory speed. The "standard" FSB speed was 66MHz. By overclocking a 300MHz CPU to 450MHz, you got a 100MHz memory speed. By overclocking a 366MHz CPU to 433MHz, you "only" got a 78MHz memory speed.
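The arithmetic above is easy to sketch. This is a toy calculation using the multipliers those parts actually shipped with (4.5x for the 300A, 5.5x for the 366):

```python
# With a locked multiplier, core clock = FSB x multiplier, so the FSB is the
# only overclocking knob, and it drags the memory clock along with it.

def fsb_for_target(multiplier: float, target_mhz: float) -> float:
    """FSB frequency needed to reach a target core clock with a locked multiplier."""
    return target_mhz / multiplier

# Celeron 300A: 4.5x multiplier. A 100MHz FSB gives the famous 450MHz overclock,
# with memory running at the 100MHz sweet spot for free.
assert fsb_for_target(4.5, 450) == 100.0

# Celeron 366: 5.5x multiplier. Reaching ~433MHz needs only a ~78.7MHz FSB,
# so memory runs well short of 100MHz.
print(round(fsb_for_target(5.5, 433), 1))  # 78.7
```

This is why the 300A was the bin to hunt for: its lower multiplier meant the overclock landed exactly on the 100MHz FSB that the rest of the platform was designed around.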
My friend in college had one. Windows 98 didn't support SMP, so he had to run Windows 2000, which was based on Windows NT, and would be the basis for XP. Compatibility with games was sometimes...interesting. Windows ME came out about that time, but was absolute garbage. All of us either stuck with 98SE or experimented with 2k. None of us actually bought it of course...
So the story originally started with the cacheless 266MHz Celeron. CPUs were delivered as AICs (add-in cards) at the time, with separate cache chips, so to deliver a budget processor Intel shipped the same silicon, but without the cache chips added. Removing the cache drastically tanked performance, especially on integer workloads (typically productivity software), but didn't really affect floating-point workloads. However, it had the side benefit of removing the part of the AIC that was most sensitive to overclocking (the cache). It used a 66MHz clock with a fixed 4x multiplier, and upping the clock to 100MHz got the Celeron running at 400MHz, which had performance roughly equivalent to a 266MHz Pentium II with cache for integer workloads; for games, it was almost as fast as the fastest Pentium II of the time (which topped out at 450MHz).
In order to stop the overclocking, Intel decided to add some cache back to the CPU, but to save money, rather than using cache chips, they stuck a relatively tiny amount of cache directly on the CPU die and released the now-infamous Celeron 300A.
Because the cache was on-die, it could overclock just as well as the previous Celeron, but this time the 300A was faster than the equivalent Pentium II, because the on-die cache ran at twice the clock speed of the external caches.
> By overclocking a 366MHz CPU to 433MHz, you "only" got a 78MHz memory speed.
I think the PCI bus probably also typically ran at some fraction of the front-side bus. The common FSB frequencies around those times were 66 or 100 MHz which gave a standard ~33 MHz PCI bus frequency with a multiplier of 1/2 or 1/3. FSB frequencies that weren't close to a multiple of 33 MHz might have caused trouble with some PCI cards. Might have depended on how the motherboard or chipset handled the bus frequencies, too.
Of course the PCI bus should probably always run at 33 MHz but I think I saw it being modified with the FSB speed at least on some motherboards.
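The divider behavior described above can be sketched as a toy model. The /2 and /3 choices are the common ones mentioned; this isn't modeled on any specific chipset:

```python
# PCI is specced at ~33.3MHz and was usually derived from the FSB by a fixed
# divider. On a board offering only /2 and /3, odd FSB speeds leave the PCI
# bus running off-spec, which could upset some cards.

PCI_SPEC_MHZ = 100 / 3  # ~33.33

def pci_clock(fsb_mhz: float, dividers=(2, 3)) -> float:
    """Pick the divider that lands PCI closest to spec; return the resulting clock."""
    best = min((fsb_mhz / d for d in dividers),
               key=lambda f: abs(f - PCI_SPEC_MHZ))
    return round(best, 2)

print(pci_clock(66))    # 33.0  -> in spec (66/2)
print(pci_clock(100))   # 33.33 -> in spec (100/3)
print(pci_clock(78.7))  # 39.35 -> the overclocked-366 case: ~18% over spec
```

Which matches the observation above: 66 and 100MHz FSBs map cleanly onto a ~33MHz PCI clock, while in-between FSB speeds push PCI well outside spec with either divider.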
It was crazy how fast things moved back then. A friend of mine had a 233MHz P2 with 32MB and a 2D card, and within two years it was a dinosaur, being shown up by machines like yours: 400-450MHz, 3D cards, way more memory...
I think the other commenter is right...you're thinking of DVD-R vs DVD+R, possibly even DVD-RW and DVD+RW.
Based on the specs listed, OP was in college just before me, or we may have overlapped. The big gold CD-R stacks (you could buy them in jewel cases, on spindles, or in just gross bulk stacks, which were nice and cheap) were a huge thing with my group, who encoded to FLAC and MP3 -V0 and burned audio CDs relentlessly. We felt we were archiving our liberal arts college's music library and radio library for the future! Who knows. Some of that "future" is still backed up and on hard disks, and I should migrate it to SSD or tape just on principle.
At that point CD-Rs were cheaper than CD-RWs, and because most archiving/distributing didn't require rewriting (not return-on-investment-wise, anyway), we just shared programs on CD-Rs as well. In some ways it was a beautiful technology! Particularly its fidelity to a spec everyone tried to bend and break for a profit angle when, honestly, there was no point for many of us using CD-R.
edit: corrected the amount of memory
/end reminiscing about a simpler time