For me, the 486 was right between my (actually my Dad's) first computer, a 386, and my first personal computer (Pentium MMX). During those couple of years my friends had 486s and I was always jealous. I used to drool at the Best Buy catalog that came every Sunday in the mail.
Nowadays, 486 computers are getting rare and relatively expensive. CPUs themselves are 25, 30, 40, sometimes 50 bucks on eBay. Whole working systems are in the low hundreds, and fully working 486 laptops can fetch 400 or 500 bucks.
The 486's killer app was DOOM. It was butter-smooth at 20 fps if you also had a VLB graphics card.
The 486 DX2 66MHz was the target platform for gaming for almost two years (1992-1994). That was a huge achievement back in the day, staying at the top that long.
The DX/2 66 is a true legend of a chip. It was so good. The final nail in the coffin for the Amiga and for 68k. I love the Amiga, but it just didn’t Doom.
Before it, you could claim that a 68040 was kinda-sorta keeping up with the 486 and that the nicer design and better operating systems of other computers made up for the delta in raw performance, but the DX/2 66 running Doom was the final piece of proof that the worse-is-better approach of using raw CPU grunt to blast pixels at screen memory instead of relying on clever custom circuitry was winning.
Faced with overwhelming evidence, everyone sold their Amiga 1200s and jumped ship to that hated Wintel platform.
I remember arguments (and benchmarks) around all the variations of the 486, since the bus speed and core clock speed were decoupled (the /2 means clock doubling). For some applications, a 50 MHz 486 with a 50 MHz bus would beat a DX/2 66 MHz with a 33 MHz bus.
And sometimes the DX/4 100 MHz would be the slowest of all of those, on a 25 MHz bus.
Especially since, when actual clock-quadrupled chips eventually came out, they had to call themselves ridiculous things like "5x86" instead of DX/4. (The Am5x86 133 runs at 4x33 MHz.)
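The decoupling works out like this (a quick sketch; the chip list is illustrative, and nominal "33 MHz" buses actually ran at 33.3 MHz, which marketing rounded to 66/100/133):

```python
# Core clock = bus clock x multiplier; the "/2" in DX/2 is the multiplier.
def core_clock(bus_mhz: float, multiplier: int) -> float:
    return bus_mhz * multiplier

chips = [
    ("486 DX-50",   50.0, 1),  # fast 50 MHz bus, no multiplier
    ("486 DX2-66",  33.3, 2),  # clock-doubled
    ("486 DX4-100", 33.3, 3),  # "DX4" was actually clock-TRIPLED
    ("Am5x86-133",  33.3, 4),  # true quadrupling, renamed "5x86"
]
for name, bus, mult in chips:
    print(f"{name}: {bus:g} MHz bus x {mult} = {core_clock(bus, mult):.1f} MHz core")
```

The DX-50's fast bus is why it could win some benchmarks against the DX2-66: memory and VLB peripherals ran half again as fast, even though the core was slower.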
As I noted in my other comment (1), in 1985 Amiga OCS bitplane graphics (each bit of a pixel's index stored in a separate area of memory) was a huge boon in 2D capability, since it lowered bandwidth needs to 6/8ths, but it made 3D rendering a major pain in the ass.
The Aga chipset of the 1200/4000 stupidly only added 2 more bitplanes. The CD32 chip actually had byte-per-pixel (chunky) graphics modes but the omission from the 1200 was fatal.
In hindsight there were probably too many structural issues for Commodore to remain competitive anyhow, but an alt-history where they'd seen the need for 3D rendering is tantalizing.
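The 6/8ths figure falls straight out of the bitplane layout (back-of-the-envelope arithmetic; the 320-pixel line width is just an example):

```python
# Bytes per 320-pixel scanline: planar (one bit per pixel per plane)
# versus chunky (one byte per pixel). Pure arithmetic, not hardware code.
WIDTH = 320

def planar_bytes(bitplanes: int) -> int:
    # each plane packs 8 pixels per byte
    return bitplanes * (WIDTH // 8)

chunky_bytes = WIDTH  # one byte per pixel
for planes in (5, 6, 8):
    p = planar_bytes(planes)
    print(f"{planes} planes: {p} bytes vs {chunky_bytes} chunky = {p / chunky_bytes:.0%}")
```

At 6 planes you fetch 240 bytes per line instead of 320 (75%, i.e. 6/8ths); at 8 planes, as with AGA, the saving is gone entirely.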
> The Aga chipset of the 1200/4000 stupidly only added 2 more bitplanes. The CD32 chip actually had byte-per-pixel (chunky) graphics modes but the omission from the 1200 was fatal.
The intention was good, but the Akiko chip was functionally almost useless; it was soon surpassed by CPU-based chunky-to-planar algorithms. I don't think it was ever used in any serious way by any released game (though it might have helped with FMV).
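For the curious, the transform those CPU routines compute is just a bit-transpose: take a run of 8 chunky pixels and slice each bit position into its own plane byte. A naive Python sketch for clarity (real 68k C2P code used clever word-merging tricks, not per-bit loops):

```python
def chunky_to_planar(pixels, planes):
    """Convert 8 chunky pixel values (colour indices) into one byte
    per bitplane. pixels[0] becomes the most significant bit."""
    assert len(pixels) == 8
    out = []
    for p in range(planes):
        byte = 0
        for pix in pixels:
            byte = (byte << 1) | ((pix >> p) & 1)
        out.append(byte)
    return out

# 8 pixels of colour 1: only plane 0 is set
print(chunky_to_planar([1] * 8, planes=4))     # [255, 0, 0, 0]
# alternating colours 0 and 3: planes 0 and 1 get the 01010101 pattern
print(chunky_to_planar([0, 3] * 4, planes=4))  # [85, 85, 0, 0]
```

A renderer would draw into a chunky buffer and run this over every 8-pixel group per frame, which is exactly the overhead chunky-native hardware avoids.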
> The CD32 chip actually had byte-per-pixel (chunky) graphics modes but the omission from the 1200 was fatal.
I agree. Unfortunately, even with chunky graphics and/or 3D foresight, 68k would still have been a dead end and Commodore would still have been mismanaged into death. It’s fun to dream though…
Was it necessarily a dead end? Consider the ways Intel and later AMD managed to upgrade and re-invent x86, which until x64 still retained so much of the original x86 instruction encoding and heritage (heck, even x64 retains some of the encoding characteristics).
Had the Amiga retained relevance for longer, and without the push for PowerPC, I don't see a reason why 68k wouldn't have been extended. Heck, the FPGA-based Apollo 68080 would've matched late-1990s Pentium IIs, and FPGAs aren't speed monsters to begin with.
There were no tech problems IMHO; it was all management problems. They could have chosen any of a handful of completely different tech paths and still won, but instead they chose to do almost nothing except bleed the company dry.
Commodore so slowly and ineffectually improving on the OCS didn't help, but the original sin of the Amiga was committed in the beginning, with planar graphics (i.e., slow and hard to work with, even setting aside HAM) and TV-oriented resolutions/refresh rates (i.e., users needing to buy a "flicker fixer"). It's like they looked at one of the most important reasons for the PC and Mac's success—a gorgeous, rock-solid monochrome display—and said "Let's do exactly the opposite!"
IIRC, interlaced display and 6 bitplanes were a compromise to allow color graphics in 1985 with the memory bandwidths available at the time.
Whether it's a sin or a feature can of course be debated, but I remember playing games on an Amiga in the early '90s, and until Doom the graphics capabilities didn't look outdated.
By 1992 with AGA, however, I agree: flicker and planar graphics (with 8 bitplanes, any total memory bandwidth gains were gone) were a downside/sin that should've been fixed to stay relevant.
Yeah, it does alright, and it's a significant step up from a DX/2, but Quake came out in '96 and the P60 came out as a super expensive workstation-class CPU in '93. If you were a gamer in '96, it's unlikely you were rocking a P60, because it was never good value for money.
The original Pentiums (Socket 4, 60 or 66 MHz) had the infamous floating-point division bug, had underwhelming performance for anything not FP-bound (most things), ran hot, and were too expensive for what you got. A DX/4 100 was nearly always the more rational choice.
Second gen Pentiums, starting with the 75 MHz, were great.
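The canonical reproduction of that division bug used the operands 4195835 and 3145727: on a correct FPU the expression below is essentially zero, while a flawed early Pentium famously left a residue of about 256.

```python
# Pentium FDIV bug check: x - (x / y) * y should be ~0 on a correct FPU.
# A flawed early Pentium computed x / y slightly wrong for these operands,
# leaving a residue of roughly 256 instead.
x, y = 4195835.0, 3145727.0
residue = x - (x / y) * y
print(abs(residue) < 1.0)  # True on any correct IEEE-754 FPU
```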
Doom was released at the end of '93. In 1992, most of us were in the 286 -> 386 upgrade wave, and a 486-33 was easily $2.5k+ ($5.5k in today's terms). The 486 DX2 66 was a good choice even in 1994-1996.
Yes, the latest chips were very expensive back then, and out of reach for most people who would continue buying new computers with older chips. (As opposed to how most people today buy an iPhone or a Mac or whatever with the latest semiconductor technology.) I got my 25MHz 386 in 1991, over two years after the 486 was announced, and I had one of the fastest computers of anybody in school... for a short time.
Hard to imagine now, but this was a huge turning point: a genuinely powerful CPU in a "Pee-Cee", available for less than RISC workstation money. I had to wait a while; mine was an AMD DX2-66, since I didn't have the budget for Intel... add Slackware and countless hours messing with XF86Config, and I had a poor man's Sun workstation.
It comes across as so incredibly insane to me that people from the late '80s (people working with computers! Reporting on them!) would look at their current technology stack and basically go: "I have no idea whatsoever what else we can do with these things; we've reached the end."
On the other end, you have people who have no idea how insanely fast computers are today, and how little computing power is "really" needed for most things that computer users do - or how much you can do with one average machine ("Oh no, 1000 requests per second - let's erect another Rube Goldberg machine to handle that!").
The 80s and 90s were filled with new things computers could do - spreadsheets, wysiwyg word processors, games - things that simply were impossible before (or not done).
In the 2000s through now we've mostly had improvements - 4k Youtube is much better than realplayer, but it's still just "online video". AI is definitely a "new" thing and it's somewhat awoken a similar spirit to the 80s/90s - but not the same breadth. Dad bringing home a computer because he wants to do spreadsheets and you finding it can run DooM or even play music.
The first 80286-based system (the IBM PC AT), the first 80386 system (the Compaq Deskpro 386), and the 80486 all had people writing about their suitability as servers, with the consensus implying that normal people didn't need them.
The Pentium is the first one, I think, that this didn't happen, because by then it turned out that people need a computer that can do what they are currently doing—but faster—much more often than they need servers.
We ran a 3-line BBS (Renegade and then Wildcat) on OS/2 on a 486-33 with 12 MB RAM. This was in 1994 or so. Great way to multitask several dos applications!
I too have one sitting on my desk, a 486DX2 66 MHz. I've had it for probably 25 years now, bringing it from job to job like the magical lost artifact it is. I remember how much more capable it was for playing Doom and Descent than the 33 MHz parts, or heaven forbid the SX. Of course, shortly after, the Pentium came out and blew everything away. The good ol' days of giant Gateway 2000 towers.
Great throwback... they were awesome procs. With a few SIMMs (4-16 MB) it could do multimedia madness never seen before (play a CD-ROM game with MPEG-1 video). The 486 DX4 100 was the last Intel chip I had before going to Pentium clones (the AMD K series and the shitty Cyrix 6x86).
I remember getting my first 486 33 MHz computer and being able to play Ultima VII: The Black Gate, and later Ultima VII Part Two. This was a turning point for me, as the game was way ahead of anything on the console side of things. DOS 6!
I think the turning point was that flat framebuffers and plenty of CPU power for the first time eclipsed specialized 2D hardware (Amiga, Mega Drive, SNES, etc.).
Flat framebuffers and "powerful" CPUs also enabled easier software rendering of 3D (Doom/Duke), compared to the Amiga, where writing a textured renderer is a PITA due to the video memory layout: separate bitplanes spread the bits of each pixel into different memory locations (the total-memory-bandwidth reduction that made sense in 1985 with 5 or 6 bitplanes became a fatal bottleneck at this point).
It wasn't always full framerate, though, and the 2D chipsets did help in the "classic" action games that were still all the rage.
The Pentium further widened the gap, but at the same time consoles gained hardware 3D acceleration (PSX/Saturn/Jaguar), yet the Pentium could still do graphics better in some respects (as Quake showed).
Since 3D accelerators landed, PCs have more or less constantly been ahead, apart from when it comes to price (and comfort/ease).
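To make the planar-rendering pain concrete: plotting a pixel in a flat framebuffer is one byte write, while in planar memory it's a read-modify-write on every plane. A sketch over plain Python bytearrays (helper names are mine; assumes the width is divisible by 8):

```python
WIDTH, HEIGHT, PLANES = 320, 200, 6

def put_pixel_chunky(fb, x, y, colour):
    fb[y * WIDTH + x] = colour            # one single byte write

def put_pixel_planar(planes, x, y, colour):
    idx = y * WIDTH + x
    byte, mask = idx // 8, 1 << (7 - idx % 8)
    for p, plane in enumerate(planes):    # read-modify-write EVERY plane
        if (colour >> p) & 1:
            plane[byte] |= mask
        else:
            plane[byte] &= ~mask

chunky_fb = bytearray(WIDTH * HEIGHT)
planar_fb = [bytearray(WIDTH * HEIGHT // 8) for _ in range(PLANES)]
put_pixel_chunky(chunky_fb, 10, 0, 5)
put_pixel_planar(planar_fb, 10, 0, 5)    # colour 5 = 0b101: planes 0 and 2
```

Multiply that per-plane bit-twiddling by every textured pixel per frame and the appeal of the PC's dumb linear framebuffer for Doom-style renderers is obvious.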
Ahhh, but it gave me the opportunity to run real programs, coming from an XT!
*Edited to add an example: I could for the first time use AutoCAD.
The price difference between a 286 and a 386SX was negligible, but the software I could use was in another league.
Yeah, by the time we were getting into it the 486 was already out, but we wanted the real 32-bit bus and had to be a bit careful when looking at used computers (as by that time 386SX and DX machines were about the same price).
A switch from the exponential regime to something immensely slower was a qualitative change. The difference is so vast that it's completely reasonable to say that clock speeds haven't changed a single bit since 2006 or so (and even for raw ops/s speeds, which have improved much more, it's debatable).
Clock speeds used to go up in a straight line (the common "interpretation" of Moore's law), but once the P4 hit a (kind of useless) 3.8 GHz, we leveled off for decades.
> Nowadays, 486 computers are getting rare and relatively expensive. CPUs themselves are 25, 30, 40, sometimes 50 bucks on eBay. Whole working systems are in the low hundreds, and fully working 486 laptops can fetch 400 or 500 bucks.
sigh
Suddenly, it was possible to imagine running advanced software on a PC, and not have to spend 25,000 USD on a workstation.
1: https://news.ycombinator.com/item?id=47717334
(Also, a Pentium 60 is barely faster than a DX/2 66 at many tasks — it is a Bad Processor — but that’s another conversation ;)
https://www.silverstonetek.com/en/product/info/computer-chas...
Back then, 10 years of technological advancement made a huge difference. Today, you can get by just fine with a 2016-era laptop.
The lack of imagination is just disturbing.
The 386SX was crap; it had a 16-bit wide external bus, IIRC.
Played some awesome games, like DOOM and Wolfenstein. Later, Duke3D was the shit. But I can't remember if I ran them on the same setup or something newer.