• My first computer was a 486SX 25MHz [1]. The rig (tower, monitor, etc.) cost around $3,000. We got the SX instead of the DX because it was $500 cheaper, and I wanted a 16-bit sound card. (Note that this is in 1992 dollars. Today it would cost over $7,000.)

    My parents didn't have a lot of money, but my great-grandfather passed away and they used some of the inheritance to buy the computer. I was instantly hooked. In hindsight I see how much of a gift my family gave me.

    The announcement reminded me of an article John Dvorak [2] wrote around the same time. 1GB hard drives had just come out, and he asked what all the extra space would be used for. Even as a young teenager, I remember thinking how short-sighted that comment was. That was before I realized how the tech press tends to get stuck in local optimizations and can't understand the bigger picture.

    It's all a good reminder that cutting edge today doesn't stay cutting edge very long, and the world figures out how to squeeze every ounce of power out of hardware. (Also, yes, that leads to bloat...)

    [1] https://en.wikipedia.org/wiki/I486SX

    [2] https://en.wikipedia.org/wiki/John_C._Dvorak

    • > In hindsight I see how much of a gift my family gave me.

      True for many, many of us, I suspect. My family bought a 286 in the early 90s and it cost something like $2000 CAD then, which is nearly $4000 now; but salaries were lower then, so it would have been something like 5-6% of my single-income family's post-tax earnings for the year, and thought of as a share of "disposable" income it was probably more like 60%.

      Obviously it paid off in that it set me on the path for my career; it's hard to make any other investment as good as that, but who could have known at the time? I'm glad that there were so many ads positioning computers as being educational and not just game machines; even though in reality I think it was learning about the computer to make the games work that taught me way more than any educational software ever did.

    • similar, but I got the 486 DX2-66.

      I’ve been thinking a lot about these inflation-adjusted prices due to the big Apple Computer anniversary — an Apple // cost $5000 in 2026 dollars, meanwhile a $600 Macbook Neo cost $150 in 1980 cash!

      What helped me reconcile this was an observation that we’ve inverted the prices of necessities and luxury goods. Rent and mortgage in particular were a much smaller slice of income back then, but luxury goods were very expensive, so one would save up for a year or two to buy a new TV or a computer for the kids.

      Now the necessities take a much larger slice of our income, but TVs and computers are incredibly cheap. It takes very little money to get a nice computer, and not-buying it barely makes a dent in the bills. This isn’t a good thing.

      I do disagree a little with your observation regarding the industry “squeezing every ounce of power out of hardware”. Beyond local LLM stuff, there’s basically nothing a modern computer can comfortably do that any laptop since the mainstreaming of SSDs can’t.

    • > In hindsight I see how much of a gift my family gave me.

      Gotta tack on to this thread showing appreciation for parents. We could never afford new computers in the 90s, but luckily my dad could bring home obsolete equipment from work. We were thus always at least a generation behind. I remember my friend's Pentium feeling like sci-fi compared to our 386, but my goodness it completely molded my life!

      Later, towards the end of the 90s, those sci-fi Pentiums were obsolete, so I got a few to run "that weird Linux stuff" on. Since it was considered junk, nobody cared what I did with it. To this day, if I happen to hear Metallica play and there's early winter's first smell of snow in the air, my mind will be transported back to that school night I secretly stayed up wayyy too late and discovered SSH for the first time. Haven't looked back.

      Thank you, dad! I just hope general computing devices owned by regular people are still the norm by the time my children come of age.

  • The 486 and https://www.delorie.com/djgpp/history.html changed everything.

    Suddenly, it was possible to imagine running advanced software on a PC, and not have to spend 25,000 USD on a workstation.

    • It's hard to convey to today's generation, who think Ivy Bridge to Haswell was a big jump or whatever, how awesome the 286 -> 386 -> 486 changes were to personal computing. It felt like going from an NES to a Super Nintendo to an N64. The improvements were astounding.
    • Amazing to see a webpage "Updated Dec 1998" still up, running and displaying correctly.
      • Without fancy JS or CSS, sites can last decades easily
        • With JS and CSS sites can last decades easily.
          • Agreed, it's not those, it's the fact that we went from JS being a little sprinkling of dynamism on a document to an entire build process with massive numbers of dependencies and browser shims. The web feels like a mistake as a platform...
    • I remember trying to run a game, Rise of the Triad, which was built with an improved Wolfenstein engine iirc, and having it struggle on my 386 unless I made the viewport as small as possible. At which point it told me to buy a 486... well I did eventually, I guess it worked.
      • Had the same experience with Doom II. Got it to run surprisingly well on a brand new Tandy 486DX2 + 4MB RAM, though I seem to recall having issues with SoundBlaster compatibility.
    • and don't forget the _legendary_ RHIDE dev environment!

      https://ftp.gwdg.de/pub/gnu/www/directory/all/rhide.html

      :-)

      And you could use a VESA linear framebuffer above 256KB - this was a breakthrough back then :-))
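
      For the youngsters, here's a hedged sketch of why that mattered, in DJGPP-flavoured C (function names and the 640x480x8 mode are mine, details simplified; assumes 64KB window granularity). With a banked SVGA mode, all of video memory is reached through a single 64KB window at 0xA0000, so drawing needs a slow BIOS call every time it crosses a bank boundary:

        #include <string.h>
        #include <dpmi.h>
        #include <sys/nearptr.h>  /* call __djgpp_nearptr_enable() first */

        static int current_bank = -1;

        static void set_bank(int bank)
        {
            __dpmi_regs r;
            memset(&r, 0, sizeof r);
            r.x.ax = 0x4F05;      /* VBE: display window control       */
            r.x.bx = 0;           /* window A, "set position"          */
            r.x.dx = bank;        /* position in granularity units     */
            __dpmi_int(0x10, &r); /* real-mode BIOS round trip = slow  */
            current_bank = bank;
        }

        void putpixel_banked(int x, int y, unsigned char c)
        {
            unsigned long off = (unsigned long)y * 640 + x;
            if ((int)(off >> 16) != current_bank)
                set_bank((int)(off >> 16));
            *(unsigned char *)(0xA0000 + __djgpp_conventional_base
                               + (off & 0xFFFF)) = c;
        }

        /* With a VBE 2.0 linear framebuffer, the whole screen is one
           flat pointer and the inner loop is just a memory write: */
        void putpixel_linear(unsigned char *lfb, int x, int y, unsigned char c)
        {
            lfb[(unsigned long)y * 640 + x] = c;
        }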

    • It was really the 386 that was the beginning of modern computing, since it had a mmu.
      • Several operating systems on the 286 (e.g. Xenix, Coherent, OS/2) used its MMU for multitasking and memory protection. See https://en.wikipedia.org/wiki/Intel_80286#Protected_mode
        • The 286 protected mode did not allow for a 32-bit flat address space and was heavily half-baked in other ways, e.g. no inbuilt way to return the CPU to real mode without a slow and fiddly CPU-reset.
          • It was architecturally a 16-bit CPU, so a flat 32-bit address space would be a non sequitur. If you wanted flat 32-bit addressing, there was a contemporary chip that could do it with virtual memory: the Motorola 68010 + the optional external MMU. (Or if you were willing to jump through some hoops, even a 68000... see the Sun-1.)
        • Coherent was the first Unix-like OS I ran, on a 386SX box. I think it was Coherent 4.x.
      • Except the 486 had hardware floating point, essential for technical work.
        • An MMU is pretty much necessary for robust multitasking. Without one, you are at the whim of how well software behaves, and it is more difficult for developers to create well-behaved software in the first place. That also assumes good intentions from programmers, since an MMU is necessary for memory protection (and thus security).

          While emulating an FPU carries a huge performance penalty, it is only required in certain domains. In the world of IBM PCs, it was also possible to upgrade your system with an FPU after the fact; I don't recall seeing a comparable MMU option for IBM compatibles. While I have seen socketed MMUs on other systems, I don't know whether they were intended as upgrade options.

        • By the way, "the i486SX was a microprocessor originally released by Intel in 1991. It was a modified Intel i486DX microprocessor with its floating-point unit (FPU) disabled." (https://en.wikipedia.org/wiki/I486SX)
        • You could buy an 8087 for your 8086 or 8088; the 486DX just moved it on-chip.
        • That's an advancement, but it's a matter of speed and simplicity. An MMU is a huge before-and-after; it's still the biggest separator of CPUs today. The most important detail for understanding a CPU is whether it has an MMU.
  • The 486 killer app was DOOM. It was butter-smooth at 20 fps if you also had a VLB graphics card.

    The 486 DX2 66MHz was the target platform for gaming for almost two years (1992-1994). That was a huge achievement back in the day, staying at the top that long.

    • The DX/2 66 is a true legend of a chip. It was so good. The final nail in the coffin for the Amiga and for 68k. I love the Amiga, but it just didn’t Doom.

      Before it, you could claim that a 68040 was kinda-sorta keeping up with the 486 and that the nicer design and better operating systems of other computers made up for the delta in raw performance, but the DX/2 66 running Doom was the final piece of proof that the worse-is-better approach of using raw CPU grunt to blast pixels at screen memory instead of relying on clever custom circuitry was winning.

      Faced with overwhelming evidence, everyone sold their Amiga 1200s and jumped ship to that hated Wintel platform.

      • I remember arguments (and benchmarks) around all the variations of the 486, since bus speed and clock speed were decoupled (the /2 is clock doubling). For some applications, a 50MHz 486 with a 50MHz bus would beat a DX/2 66MHz with a 33MHz bus.

        And sometimes the DX/4 100MHz would be the slowest of them all, on a 25MHz bus.

        • I remember being so excited when I figured out how to jumper my DX/4 100 and operate it with clock doubling and a 50 MHz front side bus speed. Same core speed, faster memory and I/O.

          My peripherals seemed to take it. My graphics output showed some slight glitches, which I was OK with for the speed.

          However, I think it was a bit unstable and would fail a correctness challenge like compiling XFree86 or the Linux kernel, which were overnight-long runs back then. Must have been some bit flips in there occasionally. I seem to recall that once that reality settled into my brain, I went back to the clock-tripler config.

        • Nearly correct. The DX/4 100MHz had a 33MHz bus. The DX/4 75MHz had the 25MHz bus. I remember well because I had both.
          • Now I remember being annoyed that it wasn't the DX/3 as it should have been!
            • Especially since when actual clock-quadrupled chips eventually came out they had to call themselves ridiculous things like "5x86" instead of DX/4. (The Am5x86 133 runs at 4x33 MHz)
              • I think 5x86 had more to do with marketing than anything else, because the Pentium had already been on the market for a while when the Am5x86 came out.
                • I think it's a bit of both. It absolutely tried very hard to pretend that it was a "586" (Pentium class), but also "5x" is right there and implies that if the DX4 is 4, this is 5.

                  The full name on the chip on some of them is "Am5x86-P75 DX5-133", which implies a lot of things, some of which are flat-out misleading (it does not get very close to "P75" performance).

      • As I noted in my other comment (1), in 1985 Amiga OCS bitplane graphics (each bit of a pixel's colour index stored in a separate area of memory) were a huge boon for 2D capability, since using 6 bitplanes instead of 8 bits per pixel lowered bandwidth to 6/8ths, but they made 3D rendering a major pain in the ass (see the sketch below).

        The AGA chipset of the 1200/4000 stupidly only added 2 more bitplanes. The CD32 chip actually had byte-per-pixel (chunky) graphics modes but the omission from the 1200 was fatal.

        Reading in hindsight, there were probably too many structural issues for Commodore to remain competitive anyhow, but an alt-history where they saw the need for 3D rendering coming is tantalizing.

        1: https://news.ycombinator.com/item?id=47717334
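
        To make the pain concrete, here's a minimal sketch in plain C (not real Amiga code; the 320-wide, 6-plane mode is just illustrative) of what a CPU-side chunky-to-planar pass has to do. Note how every pixel written scatters bits across six separate memory areas, which is exactly what makes textured 3D rendering on planar memory so painful:

          #include <stdint.h>

          #define WIDTH  320            /* pixels per row           */
          #define PLANES 6              /* OCS low-res, 6 bitplanes */

          /* Convert one row of 8-bit "chunky" pixels into PLANES
             separate bitplane rows; planes[] must be zeroed first. */
          void chunky_row_to_planar(const uint8_t *chunky,
                                    uint8_t planes[PLANES][WIDTH / 8])
          {
              for (int x = 0; x < WIDTH; x++) {
                  int byte = x >> 3;      /* byte within a plane row */
                  int bit  = 7 - (x & 7); /* MSB-first pixel order   */
                  for (int p = 0; p < PLANES; p++)
                      if (chunky[x] & (1u << p))  /* bit p of index  */
                          planes[p][byte] |= (uint8_t)(1u << bit);
              }
          }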

        • > The AGA chipset of the 1200/4000 stupidly only added 2 more bitplanes. The CD32 chip actually had byte-per-pixel (chunky) graphics modes but the omission from the 1200 was fatal.

          The intention was good, but the Akiko chip was functionally almost useless. It was soon surpassed by CPU chunky-to-planar algorithms. I don't think it was ever used in any serious way by any released games (though it might have been used to help with FMV).

          • Ah, I was under the impression that it had a native chunky mode, but it was a built-in C2P routine? Anyhow, it seems it was useful (1) when running on stock CD32s, but not in conjunction with faster machines.

            1: https://forum.amiga.org/index.php?topic=51616.msg544232#msg5...

            • Which brings me to my pet peeve: the already slow 68020 (68EC020) at 14MHz was crippled because, even though it had a 32-bit bus, it was only connected to a 16-bit RAM bus (chip RAM).

              This 16-bit memory (2 megs) is also where the framebuffer and audio live, so the stock CPU in the A1200 has to share bandwidth with display signal generation and the graphics and audio processing.

              All in all, it meant the Amiga 1200 had only about twice the memory throughput of the Amiga 500 (about 10 megabytes/s vs. about 5 megabytes/s).

              If the A1200 had at least some extra 32-bit memory (it existed as a third-party add-on), the CPU could have had its own uncontested memory with a throughput of about 20-40 megabytes/s.

              Imagine the difference it would have made if the machine had just a little extra memory.

              That's just a tiny detail. That the chipset wasn't 32-bit was another disappointment.

              The bigger problem was that Commodore as a company was aimless.

              • Yeah, and it took ~7 years to make those marginal improvements over the earlier Amiga chipset! I'm ignoring ECS, since it barely added anything over OCS for the average user.
        • > The CD32 chip actually had byte-per-pixel (chunky) graphics modes but the omission from the 1200 was fatal.

          I agree. Unfortunately, even with chunky graphics and/or 3D foresight, 68k would still have been a dead end and Commodore would still have been mismanaged into death. It’s fun to dream though…

          • Was it necessarily a dead end? Consider the ways Intel and later AMD managed to upgrade/reinvent x86, which until x64 still retained so much of the original instruction encoding/heritage (heck, even x64 retains some of those encoding characteristics).

            Had the Amiga retained relevance for longer, and without a push for PowerPC, I don't see a reason why 68k wouldn't have been extended. Heck, the FPGA-based Apollo 68080 would've matched late-1990s P-IIs, and FPGAs aren't speed monsters to begin with.

            • The 68060 is pretty good to be fair, but it never ended up being widely used and Motorola definitely saw PPC as the future.

              Maybe if these theoretical new 68k Amigas had become a huge market hit they could have taken the arch further and it could have remained competitive, but all the other 68k shops had pretty much given up or moved on already (Apple was already going PPC, Sun went SPARC, NeXT gave up on their 68k hardware, Atari was exiting the computer business entirely, etc.), so I don't know that the market would have been there to support development against the vast amount of competition from the huge x86 bastion on one hand and the multitude of RISC newcomers on the other.

            • The argument is that 68k is "CISCier" than x86, the addressing modes in particular, so making a performant modern out-of-order superscalar core that uses it would be harder than for x86.
              • I believe that. But Commodore could have plunked a cheap 68020 in their machines for backwards compatibility (like how the MSX2 had an MSX1 SoC inside, the PS2 had a PS1 SoC, the PS3 had a PS2 SoC, and so on) and put another "real" socketed CPU in as a co-processor. Or made big-box machines with CPUs on PCI cards, for infinite expansion options. "True" multitasking, perfect for CAD, 3D rendering and non-linear video editing. It would have been very cool with an architecture where the UI could be rendered with almost hard realtime guarantees and heavy processing happened elsewhere.
                • This is almost exactly what the plan was, until C= went out of business:

                  https://en.wikipedia.org/wiki/Amiga_Hombre_chipset

                  It was going to be HP PA-RISC based and have an AGA Amiga SoC, including a 68k core.

                  • How much of Hombre is myth and legend? Given how little progress was made with OCS -> ECS -> AGA, it seems unlikely they could even have built an Amiga SoC, never mind designed a new 64-bit chipset.
          • There were no tech problems IMHO; it was all management problems. They could have chosen a handful of completely different (edit: mutually exclusive, even!) tech paths and still have won, but instead they chose to do almost nothing except bleed the company dry.

            Edit: I don't mean that their success was certain if they executed better. I mean they did almost nothing and got the guaranteed outcome: failure. (And their engineers were brilliant but had very little resources to work with.)

        • Commodore so slowly and ineffectually improving on the OCS didn't help, but the original sin of the Amiga was committed in the beginning, with planar graphics (i.e., slow and hard to work with, even setting aside HAM) and TV-oriented resolutions/refresh rates (i.e., users needing to buy a "flicker fixer"). It's like they looked at one of the most important reasons for the PC and Mac's success—a gorgeous, rock-solid monochrome display—and said "Let's do exactly the opposite!"
          • IIRC, interlaced display and 6 bitplanes were a compromise to allow color graphics in 1985 with the memory bandwidths available at the time.

            Whether it's a sin or a feature can of course be debated, but I remember playing games on an Amiga in the early 90s, and until Doom the graphics capabilities didn't look outdated.

            By 1992 with AGA, however, I agree: flicker and planar graphics (with 8 bitplanes, any total memory bandwidth gains were gone) were a downside/sin that should've been fixed to stay relevant.

      • At that point in time I would not have called it Wintel yet. That started after Windows 95, IIRC.
      • Yep. 486DX/2 was when I started seriously looking at moving on from the Amiga. I wound up with a DX/4 100 sometime in 1994.
      • My classmate kept his Amiga 1200 a bit longer! ...eventually he got a PC with a 60 MHz Pentium.
        • Yeah, there were holdouts of course but the DX/2 really seems like the breaking point.

          (Also, a Pentium 60 is barely faster than a DX/2 66 at many tasks — it is a Bad Processor — but that’s another conversation ;)

          • Pentium is a bad processor? It's way faster than 486, especially on FP it's not even close.
            • The original Pentiums (Socket 4, 60 or 66 MHz) had the infamous floating-point division bug, had underwhelming perf for anything not FP-bound (most things), ran hot, and were too expensive for what you got. A DX/4 100 was nearly always a more rational choice.

              Second gen Pentiums, starting with the 75 MHz, were great.

              • I had a P60 that had the F0 0F bug; Windows would crash for weird reasons on it, but Linux ran like a champ because it actually had a workaround. Luckily my chip was already recalled for the FDIV bug so it wasn't a total boat anchor. Loved that machine. I had BeOS, QNX, and one time I made Linux look like Solaris with all the Open Look stuff - really enjoyed that aesthetic.

                Now we have these amazing displays and graphics cards and there's literally no way to make my Mac have different window titlebars or anything. So boring

              • I don't know if the 75 was really that great though, mostly because it had a 50MHz FSB rather than the 60 or 66MHz of most other parts.

                Another factor for the later P1s being better IIRC was improved chipsets.

                • We had a 90 overclocked to 100MHz that served as the family computer. I inherited it when the family computer was upgraded to a K6-II, and it chugged along as my personal computer until ~2001 thanks to Linux, while the GHz barrier had already been broken for a while in the Intel world.

                  I think my next computer came with an AMD Duron 900MHz, entry-level at the time, but the jump from the Pentium 100MHz was such a huge gap that it still felt like a Formula 1.

                • To be more exact, I think the first great Pentium was the 133, but the 75 is the first that was a real, proper jump in performance from a fast 486 and represented decent price/performance.
              • It didn't help that the earliest P5 Pentiums ran on a 5V rail. Newer revisions starting with the P54 core used 3.3V and helped with keeping the chips cool.
            • The Pentium was great, but the 60 and 66MHz versions were not liked, they ran way too hot.
              • I think given the price, people also expected a performance boost similar to going from a 386 to a 486. What made the Pentium also confusing is that during this time Intel introduced PCI.

                From a 486 with VLB to a Pentium with PCI everything became a lot nicer.

          • Many tasks perhaps, but running Quake was not one of them.
            • Yeah, it does alright and is a significant difference to a DX/2, but Quake came out in ’96 and the P60 came out as a super expensive workstation class CPU in ’93. If you were a gamer in ’96 it is unlikely you were rocking a P60 because it was not ever good value for money.
    • Slightly before DOOM came out, the killer 486 app for me was Fractint (https://en.wikipedia.org/wiki/Fractint)
    • I distinctly remember having a Strike Commander poster in my bedroom saying “Strike really flies on a 486 DX/2”. Fond memories indeed.
    • Doom was released at the end of '93. In 1992 most of us were in the 286 -> 386 upgrade wave, and a 486-33 was easily $2.5k+ ($5.5k in today's terms). The 486 DX2 66 was a good choice even in 1994-1996.
      • My boss then - who's still a very dear friend - purchased a work computer to play Doom. He was already mentally checked out of that job and was looking for his next opportunity. Spent a lot of time at work playing Doom and got quite good at it.

        I think it was 1994. It was a loaded 486 with the best 17" CRT monitor money could buy at the time. I think he spent over $7000.

      • Yes, the latest chips were very expensive back then, and out of reach for most people who would continue buying new computers with older chips. (As opposed to how most people today buy an iPhone or a Mac or whatever with the latest semiconductor technology.) I got my 25MHz 386 in 1991, over two years after the 486 was announced, and I had one of the fastest computers of anybody in school... for a short time.
    • I wonder, I wonder where one could find a good book about the software architecture of that game… oh, well
    • They need to bring back the turbo button.
    • ...and with 8 MB (-eight- for the youngsters ;-) of RAM you were absolutely the king :-D
  • I didn't have access to a 486 until around 1999. I was making do with a hand-me-down 8088 and then a 386SX.

    Back then, 10 years of technological advancement made a huge difference. Today, you can get by just fine with a 2016-era laptop.

  • 486 was my dream. Unfortunately, my parents didn't have money for it. I bought my first PC in 1999 - a Pentium 2. I invested a lot of money in the monitor; computers become obsolete very quickly, while a monitor can serve for many years. Surprisingly, flat monitors appeared soon after...
  • We ran a 3-line BBS (Renegade and then Wildcat) on OS/2 on a 486-33 with 12 MB RAM. This was in 1994 or so. Great way to multitask several dos applications!
  • Linux kernel version 7.1 will drop support for 486: "Linux devs think even one second spent on 486 support is a second too many." https://arstechnica.com/gadgets/2026/04/linux-kernel-maintai...
    • I can understand running an old 486 machine for nostalgia reasons, or because you have some old industrial equipment that relies on it and even one second spent replacing it is a second too many, but I struggle to imagine why you'd want or need to run a modern Linux kernel on it.
    • >This chip was originally introduced in 1989, was replaced by the first Intel Pentium in 1993, and was fully discontinued in 2007

      That's really long compared to the 1-year refresh cycles we have today with phones etc.

  • Heh, I remember using my first machine, a 486, for a long time after it was obsolete, and reading system requirements like: what do you mean "Pentium recommended", and why the hell do you need 16MB of RAM? It's interesting to reflect that old games like Settlers, HoMM 2 or Warcraft 2, which are no worse than modern ones gameplay-wise, used to run on something so vastly underpowered by modern standards that the numbers don't even feel like a real spec.
  • • Ran my first Linux at home on an i486-DX2 (33 MHz, 4 MB RAM), which supported decent X11R6 performance in color in 1992, with a 14" CRT.

    • Ran my first real UNIX at home on a PA-RISC (HP 9000-715/75 with HP-UX 9.03 and 96 MB RAM) in 1997, 20" color CRT.

    • Today, Linux is still here, but on a 2-CPU, 140-core AMD server with 2 TB RAM, hundreds of TB NAS and a 40" TFT... (and it still takes too long to open the bloated Web browser! - keenly awaiting Ladybird to the rescue in August.)

    • Wouldn't the DX2 be 66 MHz? Or did you intentionally run it at 33 MHz?
    • > and it still takes too long to open the bloated Web browser! - keenly awaiting Ladybird to the rescue in August

      Chromium browsers launch pretty fast. If you're talking about memory usage, Ladybird isn't aimed at minimal memory usage, from what I've seen.

  • Funny, I'm working with an Intel 686 right now. It's brutal to get stuff to build, e.g. Rust/Cargo-related things (missing deps, but mostly the hardware being slow). Recently I've been trying to fix a maturin problem I ran into. But the backwards compatibility is cool: Python 3.11 on 32-bit Debian 12.

    The CPU I'm working with is a Celeron M 900MHz, single core, no HT, struggling to build Python wheels (several hours).

  • I remember getting my first 486 33MHz computer and being able to play Ultima VII: The Black Gate, and later Ultima VII Part Two. This was a turning point for me, as the game was way ahead of what was on the console side of things. DOS 6!
    • I think the turning point was that flat framebuffers and plenty of CPU power for the first time eclipsed specialized 2D hardware (Amiga, Mega Drive, SNES, etc.).

      Flat framebuffers and "powerful" CPUs also enabled easier software rendering of 3D (Doom/Duke), compared to the Amiga, where writing textured rendering is a PITA due to the video memory layout, with separated bitplanes spreading the bits of each pixel into different memory locations (the total memory bandwidth reduction gained in 1985 by using 5 or 6 bitplanes became a fatal bottleneck at this point).

      It wasn't really always full framerate though, and the 2D chipsets did help in "classic" action games that were still all the rage.

      The Pentium further widened the gap, but at the same time consoles gained hardware 3D acceleration (PSX/Saturn/Jaguar); yet the Pentium could do graphics better in some respects (as shown with Quake).

      Once 3D accelerators landed, PCs have more or less constantly been ahead, apart from when it comes to price (and comfort/ease).

  • And of course, support for this venerable processor will be dropped in Linux kernel 7.1 in a couple of months time.
  • Raise your hand if you have been there and:

    - tinkered for HOURS to get enough EMS/XMS memory by tweaking CONFIG.SYS & co. to get whatever game running (and had dedicated boot options configured, because you could unload some drivers from memory and could then run other games - see the sketch below)

    :-D
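
    For the youngsters, a hedged sketch of what that looked like (driver paths and menu names here are invented, but the [menu] blocks are the real DOS 6 multi-config mechanism):

      [menu]
      menuitem=GAMES,  Games (max conventional memory)
      menuitem=NORMAL, Normal (CD-ROM + mouse + sound)

      [common]
      DEVICE=C:\DOS\HIMEM.SYS
      DOS=HIGH,UMB
      FILES=30

      [GAMES]
      REM no EMM386 at all - some games hated any memory manager

      [NORMAL]
      DEVICE=C:\DOS\EMM386.EXE NOEMS
      DEVICEHIGH=C:\CDROM\CDROM.SYS /D:MSCD001
      DEVICEHIGH=C:\MOUSE\MOUSE.SYS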

  • Hard to imagine now, but this was a huge turning point. A genuinely powerful CPU in a "Pee-Cee", available for less than RISC-workstation money. I had to wait a while; mine was an AMD DX2-66, since I didn't have the budget for Intel... add Slackware... and countless hours messing with XF86Config, and I had a poor man's Sun workstation.
  • Microsoft was and still is the reason why average people needed more powerful chips lol, maybe with the exception of browser bloat.
  • I've got one sitting on the shelf above my desk, a 33MHz DX. I don't even remember what machine it came out of.
    • I've got an AMD-branded 286 chip, from my first owned-by-me PC, blu-tacked to the case of my home desktop PC, which is powered by a Ryzen something-or-other from a few years ago (with a 1060 6GB card from a few years before that, because I wasn't gaming enough to justify a new graphics card along with the other upgrades at the time).
    • I too have one sitting on my desk, a 486DX2 66MHz. I've had it for probably 25 years now, bringing it from job to job like the magical lost artifact it is. I remember how much more capable it was for playing Doom and Descent than the 33MHz, or heaven forbid the SX. Of course, shortly after, the Pentium came out and blew everything away. The good ol' days of giant Gateway 2000 towers.
  • I got a paper route just to get hold of the DX2.

    It was a life-changing machine.

    Ordered, I believe, from the depths of a Computer Shopper magazine.

  • I still have a 486 Linux system from those days - it has not been turned on in this century, but I'll try it some day, together with a glass of whisky :-)
  • I loved my 486DX2 66MHz-based IBM PS/1 (2168), which had a whopping 8MB of RAM. Not only did it really enable me to experience the fullness of PC gaming of the era, but it was the first computer I was able to install an internal modem into, and the computer I used to get SLIP dial-in access to the state university mainframe and thus to the Internet (prior to that I was limited to the Prodigy walled garden). It was this computer that let me play early MUDs via telnet, let me play my first graphical MMORPG (Ultima Online), and introduced me to real visual programming (Visual Basic).

    To a significant degree, the 486DX2 was the primary computing platform that created the foundation I needed to learn computing in depth, enabled my later career, and set many of the formative moments in my life. Thanks, Intel; even though you're now a shadow of your former self, you were a beast in the 90s.

  • For me, the 486 was right between my (actually my Dad's) first computer, a 386, and my first personal computer (Pentium MMX). During those couple of years my friends had 486s and I was always jealous. I used to drool at the Best Buy catalog that came every Sunday in the mail.

    Nowadays, 486 computers are getting rare and relatively expensive. CPUs themselves are 25, 30, 40, sometimes 50 bucks on eBay. Whole working systems are in the low hundreds, and fully working 486 laptops can fetch 400 or 500 bucks.

    sigh

  • How was the person incorrect that speed increases wouldn't continue forever? The Pentium 4 hit 3.8GHz, and a Ryzen 7 has 4.7GHz some 20-odd years later.
    • > How was the person incorrect that speed increases won't continue forever?

      By actually saying something different, which really did end up being proven incorrect. From the blog post above, verbatim, italicizing the relevant bits:

      > Writing in the May 8, 1989 issue of Infoworld, Michael Slater warned that the sixfold speed increase seen from 1981 to 1989, going from 5 MHz to 33 MHz, would not be repeated.

    • While the speed increases weren't as dramatic, do note that even in single-core speed, unlike what the clocks would suggest, the Ryzen 7 is much, much more than 1.23x faster than the P4. The P4 was a particularly fragile architecture, and its achieved IPC on real code was typically well below 1, often closer to 0.5. The X3D variants of Ryzen have been measured running above 3 average IPC on real, complex loads. So the single-core uplift from that P4 to a modern AMD core is about the same as from that 300MHz Pentium to the 3.8 P4; it just took 20 years, not 8. Of course, now we also have 8 times the cores.
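
      Back-of-the-envelope with those (admittedly rough) IPC figures:

        \[ \frac{4.7\,\mathrm{GHz} \times 3\ \mathrm{IPC}}{3.8\,\mathrm{GHz} \times 0.5\ \mathrm{IPC}} \approx 7.4 \]

      so roughly a 7-8x single-core uplift, not the 1.23x the clocks alone suggest.
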
    • Clock speeds used to go up in a straight line (the common "interpretation" of Moore's law), but once the P4 hit a (kind of useless) 3.8GHz, we leveled off for decades.
    • A switch from the exponential regime to something immensely slower was a qualitative change. The difference is so vast that it's completely reasonable to say that clock speeds haven't changed a single bit since 2006 or so (and even for raw ops/s speeds, which have improved much more, it's debatable).
  • > But when Word 97 arrived with real-time spelling and grammar checking and Clippy, the 486 couldn’t keep up. You really needed a Pentium or equivalent to do all three at once without noticeable lag as you typed.

    In other words, faster hardware was needed because the quality and performance of the software dropped. I was doing spell-checking with WordStar on a CP/M Apple II with zero lag -- and WordStar fit on one side of a 5.25" floppy.

    • WordStar originally didn't have a spell checker. It was an add-on product. And even after SpellStar was integrated (a response to the NewStar clone's built-in spell checker), it was never as-you-type spell checking, which is what we got in Word 97, and what consumed the cycles on a 486.

      Word 97 also had as-you-type grammar checking, which WordStar never had. WordStar did have an add-on, extra-cost grammar checker whose name escapes me at the moment. But again, it was never real-time.

      Yes, programs have become bloated, but it is worth it to compare apples to apples.

      One might argue that real time isn’t necessary, and one might be right. But that’s different from poorly written.

  • Great throwback... they were awesome procs. With a few SIMMs (4-16 MB) it could do multimedia madness never seen before (play a CD-ROM game with MPEG-1 video). The 486DX4 100 was the last Intel chip I had before going to Pentium clones (the AMD K series and the shitty Cyrix 6x86).
  • Hard to convey these days how the 486 felt like an absolute quantum leap in computing power.

    I built a 486 Compaq Novell server for the company I worked for and named it Godzilla - gives a sense of how the 486 was seen.

  • 486 SX 33MHz, could not afford the DX
    • My experience too, as I dimly remember it.
    • The 486 SX was a fine chip, just no math copro.

      The 386 SX was crap: 16-bit-wide bus, IIRC.

      • For a time, systems with a 386SX were significantly cheaper than those with a 386DX, because the 16-bit data bus meant cheaper motherboards could be used.

        If you were running 16-bit software, they were little slower than a 386DX at the same clock, and significantly faster than a 286. That was mostly because of higher clocks (286s usually topped out at 12MHz, though there were some 16MHz options; the slowest 386s ran at 16MHz, with some as fast as 40MHz), but also, in part, thanks to the instruction pipeline (albeit small by modern standards) that the 286 lacked, when not blocked by instruction-ordering issues.

        32-bit software was a lot slower than on a DX, because 32-bit data reads and writes took two trips over the 16-bit data bus, but you could at least run the code, as it was a full 386 core otherwise (full enhanced protected mode, page-based virtual memory, v8086 mode, etc.).

        The SX also only used 24 bits of the address bus, limiting it to 16MB of RAM compared to the original's 4GB range, though this was not a big issue for most at the time.
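
        As a rough back-of-the-envelope (using the 386's minimum 2-clock bus cycle and ignoring wait states), peak data-bus bandwidth at the same 33MHz clock works out to:

          \[ \mathrm{DX}: \frac{33\,\mathrm{MHz}}{2} \times 4\,\mathrm{bytes} \approx 66\,\mathrm{MB/s} \qquad \mathrm{SX}: \frac{33\,\mathrm{MHz}}{2} \times 2\,\mathrm{bytes} \approx 33\,\mathrm{MB/s} \]

        since every 32-bit transfer on the SX takes two bus cycles instead of one.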

      • I can't remember, could you buy a math coprocessor for it?

        I know you could pair my 286 with a 287 next to it... not sure if it really made a difference you could discern outside of hyper-specific uses though.

        • There were 387 co-pros, just like the 287s (and 8087s). You could actually use a 287 to provide floating-point instructions to a 386, albeit more slowly than a 387.

          Very little, if any, “home” or small-business software would make use of a floating-point unit though (maybe some spreadsheet apps did?). The most common use for them was CAD/CAM, and those doing scientific modelling without a budget that would allow for less consumer-grade kit.

        • I believe so: the 487, which had a full 486 on board and disabled your main CPU.
      • Ahhh, but it gave me the opportunity to run real programs, coming from an XT! *Edited to add an example: I could use AutoCAD for the first time. The price difference between a 286 and a 386SX was negligible, but the software I could use was in another league.
        • Yeah, by the time we were getting into it the 486 was already out, but we wanted the real 32-bit bus and had to be a bit careful when looking at used computers (as by that time 386SX and DX machines were about the same price).
  • Uuh! I recall I had this setup, not in '89, but sometime in the early 90s.

    Played some awesome games, like DOOM and Wolfenstein. Later, Duke3D was the shit. But I can't remember if I ran them on the same setup or something newer.

  • It comes across as so incredibly insane to me that people from the late 80s (people working with computers! Reporting on them!) would look at their current technology stack and basically go: "I have no idea whatsoever what else we can do with these things, we've reached the end."

    The lack of imagination is just disturbing.

    • On the other end, you have people who have no idea how insanely fast computers are today, and how little computing power is "really" needed for most things that computer users do - or how much you can do with one average machine ("Oh no, 1000 requests per second - let's erect another Rube Goldberg machine to handle that!").
      • The 80s and 90s were filled with new things computers could do - spreadsheets, wysiwyg word processors, games - things that simply were impossible before (or not done).

        In the 2000s through now we've mostly had improvements - 4K YouTube is much better than RealPlayer, but it's still just "online video". AI is definitely a "new" thing, and it has somewhat awoken a similar spirit to the 80s/90s - but not the same breadth. Dad bringing home a computer because he wants to do spreadsheets, and you finding out it can run DooM or even play music.

    • That's not so different than today, wherein:

      All we really have to look forward to in the future of increasing-performance personal computing is doing the same things as yesterday, but doing them faster.

      The future after today will probably turn out more interesting than that, of course, but we can't know that until it happens.

      And the future after 1988 certainly turned out to be a very interesting time in computing -- but they had no idea what was in store. Perhaps you can use your time machine to go back and let them know?

    • It's easy to mock in hindsight, but the failure mode isn't lack of imagination. It's extrapolating linearly from physical limits that were real at the time. In 1989, DRAM refresh cycles and bus bandwidth genuinely were bottlenecks that seemed fundamental. What nobody predicted was that the industry would sidestep those walls entirely (caches, pipelines, out-of-order execution, then multicore). Architectural innovation tends to appear orthogonally to wherever the current wall is.
    • The first 80286-based system (the IBM PC AT), the first 80386 system (the Compaq Deskpro 386), and the 80486 all had people writing about their suitability as servers, the implied consensus being that normal people didn't need them.

      The Pentium is the first one, I think, where this didn't happen, because by then it turned out that people need a computer that can do what they are currently doing—but faster—much more often than they need servers.