In 1998 I had a summer job as a programmer in a little shop that built computers. At the time there was a shortage of Pentium II 300 MHz chips. One day the logistics guy mentioned they had gotten a batch of Pentium II 300 MHz parts marked with an operating voltage of 2.0V instead of the usual 2.8V. Hmm.
Intel had taken a bunch of their most expensive CPUs, PII-450 MHz parts built on the brand new 250nm process, and rebranded them as the cheapest CPUs, which at the time were built on a 350nm process. You could just take one, stick it in a motherboard, change the base frequency, and you'd have the best CPU Intel could make.
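For anyone curious about the mechanics: the multiplier on those parts was effectively fixed, so "change the base frequency" meant raising the front side bus. A quick sketch of the arithmetic (the 66 and 100 MHz figures are the standard Slot 1 bus speeds of the era):

    # Core clock = front side bus x multiplier; with the multiplier fixed,
    # the overclock comes entirely from raising the FSB from 66 to 100 MHz.
    MULTIPLIER = 4.5  # shared by the PII-300 (at 66 MHz) and PII-450 (at 100 MHz)

    for fsb_mhz in (66.6, 100.0):
        print(f"{fsb_mhz:5.1f} MHz FSB -> {fsb_mhz * MULTIPLIER:.0f} MHz core")
    # 66.6 MHz FSB -> 300 MHz core (as labeled)
    # 100.0 MHz FSB -> 450 MHz core (what the die was actually binned for)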
We never sold any of those CPUs to customers. We all bought ourselves new CPUs that day.
Something similar might have happened with Celeron 300 CPUs; I recall they could be overclocked to 450 MHz as well without trouble. I think they didn't have as much cache as the Pentium II, but the increased clock rate meant you could get a pretty fast CPU for cheap.
Not only could you overclock the Celeron 300A, with the right motherboard you could run two of them in a dual CPU configuration.[0] This was my first high performance computer! Initially I dual booted Windows NT 4 and Windows 98, but soon enough Windows 2000 was released and allowed me to make practical everyday use of the two cores. It was an extremely satisfying computer to own.
The ABIT BP6 was a legend, and it was kind of amazing that Intel hadn't done their "licensing" or whatever to make it impossible to build; good thing clones were around.
Of course now every CPU is its own multiprocessor machine, but back then it was a real screamer, and unless you could get multi-CPU Windows it was a real reason to poke around with Linux.
I had one of those with dual Celeron 366s running at 533 MHz.
It worked great from about 1999-2002 when I upgraded to an Athlon 1700, then an Athlon 64. I finally had 2 CPU cores again with the Athlon 64 X2 around 2005.
The 440BX chipset supports two CPUs; it is organised so that the northbridge is shared between them[1]. I found a datasheet[2] and from a skim it looks like the front side bus is also shared between the CPUs.
[1] This is from the era before the memory controller / northbridge was integrated with the CPU.
I remember drooling over the price list of a local computer store back in the day. A dual-socket board for a Pentium Pro cost DM 2,000, and a Pentium Pro 200 MHz cost about as much, so just the board and two CPUs were DM 6,000 (roughly € 3,000, not adjusted for inflation).
A friend had that setup... it was definitely fast for the price at the time... I ran a Duron OC'd to 1.1 GHz about a year later IIRC (2000 era)... also ran Win2000, switched from OS/2 Warp and never really looked back. Win2k was very nice, and I mostly stuck with it until Win7; the couple times I tried XP it sucked, though MCE2005 was very nice, shame MS and CableCARD pretty much killed it.
IIRC the Celeron chips coming out of the fabs tested better than Intel expected, with far more of each batch rated stable at higher-spec speeds. This led to a shortage of the cheaper units that people were trying to buy and an excess of faster units that people didn't want to pay for. To avoid both missing sales and having a lot of stock sitting in warehouses, many batches of Celeron chips that had tested fine to be sold as the faster models were binned with those that were only rated for the slower speeds. This meant that if you were lucky you could safely overclock the cheaper units by officially extravagant amounts, without extra special cooling arrangements or other hacks, and many did.
You could tell (at least initially, I'm not sure if Intel changed this later) from part of the product code that you definitely had one of the units from a batch that tested fine at the higher speeds, so it wasn't that much of a gamble either.
The Celeron 300A might still be the best bargain in CPU history. I put together a bunch of PCs with them for myself & friends/family. Not all of them could get to 450 MHz, but all of them could be clocked much higher than 300. And with the L2 cache on the chip, it was already better clock-for-clock than contemporary products.
While 50-150 MHz is peanuts in absolute terms today, it was a huge step up relative to the original speed.
I did the same with a Celeron 300A at 450. The other great thing about that era was that the 440BX chipset supported so many CPUs. I upgraded the same Slot 1 board from that 300A@450 to a Celeron 900 in a "slotket" socket-to-Slot 1 adapter and used it for another 3 years.
The 5820K and 5960X hit broadly similar overclocks in recent memory: chips with 3.4 and 3.3 GHz all-core turbo had a majority of samples (~75-79%) going to 4.5 GHz. 4.5/3.3 = a 36% overclock, which is quite high by modern standards.
Sandy Bridge and Sandy Bridge-E were fantastic overclockers too with many samples going to 4.7 GHz (even on HEDT!).
Broadwell-E, too, could hit some solid clocks if you were willing to push more voltage... a lot of people got later samples to 4.7 GHz despite the numbers being closer to 4.3 in SiliconLottery's tests. Probably an uncomfortable amount of voltage and heat, but, Broadwell itself wasn't as bad as people took it to be, just the prices, and the L4 cache on Broadwell-C limiting the core clocks.
This is somewhat of a function of Intel not really pushing clocks in those days. 3.3 GHz all-core turbo is nowhere near what the silicon could deliver - generally I would punch in 4.1 GHz all-core on my 5820K and that would be stable without any other (explicit) settings adjustments.
And that is a function of the TDPs: in those days desktop TDP was expected to cover your max boost under a (reasonably) worst-case load. Boosting past the box TDP had been a thing for a long time in laptops, but the desktop market was different. AMD was actually the one who introduced the "boost TDP is higher than box TDP" concept (aka PPT) to the desktop market with Ryzen 1000, and then Intel followed suit once they started ramping up core counts and clocks with the 8700K/9900K.
But 140W is actually plenty for a hexacore or even octocore running at 3.3 GHz and even under a worst-case load scenario... and 91W is plenty for a quad-core running at max turbo too. The numbers weren't as phony back then.
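Rough numbers back this up if you apply the classic CMOS dynamic power relation P ≈ C·V²·f. A sketch, with made-up baseline wattage and voltages for illustration, and ignoring leakage:

    # CMOS dynamic (switching) power scales roughly as P ~ C * V^2 * f.
    # The baseline figures below are hypothetical, picked only to show
    # why a stock-clock TDP can't cover what the silicon does when pushed.
    def scaled_power(base_watts: float, f_ratio: float, v_ratio: float) -> float:
        return base_watts * f_ratio * v_ratio ** 2

    # Hypothetical hexacore drawing ~120 W at 3.3 GHz / 1.05 V,
    # pushed to 4.1 GHz at 1.20 V as described above:
    print(round(scaled_power(120, 4.1 / 3.3, 1.20 / 1.05)))  # ~195 W

The frequency bump alone is linear; it's the voltage squared term that blows past the old box TDP.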
I seem to remember that the 300A overclock was first published by Tom's Hardware - a site started by a German medical doctor, as implausible as that sounds.
The first person to do it was probably the first person to touch the CPU, given how established overclocking was by that era. So it could have been Tom's, but it's just not that significant whether it was or wasn't.
Yeah, "let's see if I can overclock this a bit" was the first thing many people did as soon as they got any new chip. The 300A was notable both because of how far they could be pushed but also how it worked for basically all of them & not outliers.
No. It was a proper Pentium II chip, originally binned as a 450 MHz chip. There was a shortage of the low end chip and they re-badged a bunch of high end chips to fill a hole in the market.
I did that. My first gaming computer was a Celeron 300 overclocked to 400 with a 3dfx card. At the time Celerons were just getting off the ground and there was virtually no difference in performance, just a much cheaper price!
I believe the smaller cache was the reason it could overclock so high; a similar story happened later in the NetBurst era, when the ~2.4 GHz Celerons could go over 4 GHz with sufficient cooling and sometimes a slight increase in voltage.
Intel is still doing something similar: the cheaper CPUs from the 13th gen have cores from the previous 12th generation, but some OEM processors come with the newer Raptor Lake cores.
If this surprises you, I've got some bad news for you. Everyone does binning. Go buy 1000 5% resistors and test them. Instead of a normal distribution of tolerances, you'll find an interesting trough around -2% to +2%. That's because all of the resistors are made using the same process, and the tighter-tolerance ones are binned out and sold at a higher price.
Same goes for silicon. The dies are largely the same. Only later in the process do they get binned to a lower spec CPU by selectively disabling features. Generally this is because the chip doesn't meet some spec, but there are cases where it is simply done to meet demand.
Not talking about binning, but getting previous generation of core in your "new" CPU.
>Go buy 1000 5% resistors and test them. Instead of a normal distribution of tolerances, you'll find that there's an interesting trough around -2% to +2%.
The top comment in the video mentions someone's experience with resistor binning where a batch of 10% resistors had no samples under 5% error. Anyway, the resistors binned for tighter tolerances aren't going to have a trough in the center of the distribution; they'll have cutoffs on either side. Binning can be repeated for any tolerance desired, so carbon vs metal film doesn't have anything to do with it.
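A toy simulation makes both shapes visible; a sketch assuming a nominally normal manufacturing spread, which real processes only approximate:

    import random

    # Toy model: values spread normally around nominal, then binned.
    # Parts within +/-2% get pulled and sold as tighter-tolerance parts;
    # what's left inside +/-5% is sold as the 5% bin, with a hole in the
    # middle of its distribution, unlike the 2% bin's hard cutoffs.
    NOMINAL = 1000.0  # ohms
    lot = [random.gauss(NOMINAL, NOMINAL * 0.02) for _ in range(100_000)]

    def err_pct(r: float) -> float:
        return abs(r - NOMINAL) / NOMINAL * 100

    two_pct_bin = [r for r in lot if err_pct(r) <= 2]        # cutoffs both sides
    five_pct_bin = [r for r in lot if 2 < err_pct(r) <= 5]   # trough in the middle

    print(len(two_pct_bin), len(five_pct_bin))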
Out of all the things they have done, this is the straw that broke the camel's back for you? If it has the advertised number of lanes, performance, etc., then it wouldn't affect me if I bought it, no? And if it did not have the advertised features, it would be fraud and I would have recourse? It's the same socket after all, and I will never see how many nanometers the chips are. The i3s are locked down hard anyway.
There's no guarantee that the processor you used was actually equivalent to the 450 MHz one [0]. It might have worked like that for your use cases and lifetime, but if you have issues, or it's underperforming, Intel isn't going to help or support you.
Lots of quirky stuff in the same time period. Like the 487. Marketed as a math coprocessor for the 486SX, meaning it was supposedly the same idea as the 387, 287, etc. But it wasn't that. It was a full-on 486DX that disabled your installed 486SX and took over.
> Contrary to industry speculation, the 80386 multiplier errors result from a layout problem, not from an error in logic design. "We didn't allow enough margin to catch the worst-case pattern in the multiplier at the corners of our process," explains Dana Krelle, Intel's 80386 marketing manager. "As a result, some chips, at some temperature/voltage/frequency points, will produce errors from particular combinations of 32-bit operands." The error, apparently due to unintentional coupling between adjacent cells in the multiplier, escaped Intel's simulation and chip verification process until it was spotted in a subsequent stress-testing program.
It wasn't just CPUs; at the time it was also common to pay a premium for faster RAM chips. In many cases a chip marked at 70ns could easily be reconfigured for 50ns by adding or removing a surface mount resistor or cutting a solder pad.
I have a weird nostalgia for all things 486 because the PC my father had while I was growing up was a 386, and the 486 was "one better", running all the games silky smooth...and EVEN WINDOWS! Other people seem to share this weird nostalgia because the prices of crappy, broken old laptops and desktops from this era are surprisingly high, and working ones are actually expensive!
I think part of it comes down to emulation being less than great for some use cases. I'm fine with emulation, but some want the real experience. The difference with CRT displays vs modern displays is particularly noticeable...
I seem to recall a video running Control, maxed out, on a 22" flat CRT at 1280x1024 that looked better than a 4K display sitting next to it, let alone the performance differences.
I know I held on to my pro grade CRTs until I had to move them 3x one summer... man they were heavy. Had a permanent bow in my desk.
That was probably DigitalFoundry with a Sony FW900, one of the last high-end CRT monitors. IIRC it does 120 Hz at 1920x1200, which purely in terms of latency and lack of sample-and-hold blur remains unbeaten by flat-panel displays.
> broken old laptops and desktops from this era are surprisingly high, and working ones are actually expensive!
What do you think, could it be factories or corporations that have critical processes running on ancient software that requires very specific hardware to run, and which nobody wants to pay to rewrite for modern hardware? Young me would have said "don't be silly", but I've seen some crazy things, man.
The draw of laptops is easy to understand. I have a cardboard box full of old motherboards and things, but without all the requisite drives and video cards, they are junk. I'm not going to buy a bulky 486 desktop for nostalgia. A laptop is a pretty compact, fully self-contained item; it just needs an AC power adapter. Easy to have a box full of them. All of my old laptops still work, especially that 12" aluminum PowerBook G4.
The total number of 486s sold is utterly dwarfed by the huge numbers of Pentiums sold after Windows 95. It grew the market astronomically.
I had an IBM motherboard that had an odd enameled wire on it that stuck out like a sore thumb. It was a 486/33 that someone had hotwired to be more like a 486/100. Local computer shop showed it to an IBM rep that said he'd never seen anything similar. It was not as fast as a real /100, but was quite fast for the money.
My first (childhood) computer was a 486 (DX I think?). Pentiums were already out, even PIIs maybe. The LEDs on the front (remember those?) said "100", but I was always suspicious of these, because my computer seemed a _lot_ slower (at playing games – what else) than my friend's 133MHz... But his was a Pentium(!), so I never got to the bottom of it.
For the record, I needed to downscale Doom to about a 1/2 window (running in DOS) to have it run at a decent framerate. Can't have been 100 MHz, right? Does anyone here have a comparable benchmark of a "true" 100 MHz 486 I can compare with?
Thus began my fascination with computers. Booting up DOS in some low-memory mode in order to squeeze every CPU cycle out of that thing. Working within constraints taught me a lot.
I didn't have internet at the time, but this thing had a 14.4k modem that I tried to get running. When the modem was in use, the mouse froze (and vice versa). They were on the same IRQ, set by a jumper, I think. I ended up frying the motherboard trying to fix this issue. I didn't have a computer for a while, but after lots of pleading, and my parents seeing that I was serious about this, they eventually got me a Pentium II (450 MHz or so!). But alas, it had a Voodoo Banshee video card (which had notoriously bad drivers that often simply hard crashed). Alas, working within another constraint...
The Pentium was superscalar and could often run two operations per cycle. The difference between a 133 MHz Pentium and a 100 MHz 486 was way more than just 33 MHz. That said, I'm pretty sure I recall running Doom with more than a half window, if not an almost full window, on a 486.
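A crude upper bound on that gap (a sketch only; real code never sustains the peak, pairing rules limit which instructions can dual-issue, and the 486 averaged well under one instruction per cycle on typical workloads):

    # Peak instruction throughput, ignoring memory stalls and pairing rules.
    # The Pentium's U and V pipes could retire up to two instructions per
    # cycle; the 486's single pipeline peaked at one.
    pentium_133_peak = 133e6 * 2
    i486_100_peak = 100e6 * 1
    print(pentium_133_peak / i486_100_peak)  # ~2.7x in the ideal case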
I only have my unreliable memory to go on, but on a 486DX/33 I recall playing Doom 2 at least at the second-from-top view size without noticeable lag. But what sound card did you have? In the late 90s hardware manufacturers made budget cards with processing done in software rather than dedicated chips. If you had one of those it may have been too much for a 486 to handle. It's also probably the cause of your modem problems, as "soft modems" were notorious for being unreliable.
Obviously these cards were meant for use in Pentiums with their extra processing power. But some stingy integrators coughPackardBellcough would slap together the lowest cost components without regard to if it'd actually work properly.
> My first (childhood) computer was a 486 (DX I think?). Pentiums were already out, even PIIs maybe. The LEDs on the front (remember those?) said "100", but I was always suspicious of these, because my computer seemed a _lot_ slower (at playing games – what else) than my friend's 133MHz... But his was a Pentium(!), so I never got to the bottom of it.
A Pentium is much faster than a 486 at the same clock speed, so that shouldn't be surprising.
To be clear, this is how most chips work today. Intel, Nvidia and AMD make entire lineups of CPUs and GPUs with only a handful of different chips, the only difference being features fused off (either because the silicon doesn't handle them well or because they need market segmentation).
Mine did. I also did the pencil trick with a Duron back in the day (bridging the four L1 links) to unlock it, and those overclocked easily, so you'd get more bang for your buck.
I mean, the most recent case was the 1600AF, which was not a new stepping of the Zen 1 based 1600 but an underclocked 2600 (including the Zen+ IPC improvements, cache changes, etc.)