Microsoft’s original source code (gatesnotes.com)
511 points by EvgeniyZh 1 day ago | 277 comments

The source code is such a fun read (for the comments). I found some source code for GW-BASIC, and here are two of my favorites:

  ;WE COULD NOT FIT THE NUMBER INTO THE BUFFER DESPITE OUR VALIENT
  ;EFFORTS WE MUST POP ALL THE CHARACTERS BACK OFF THE STACK AND
  ;POP OFF THE BEGINNING BUFFER PRINT LOCATION AND INPUT A "%" SIGN THERE

  ;CONSTANTS FOR THE RANDOM NUMBER GENERATOR FOLLOW
  ;DO NOT CHANGE THESE WITHOUT CONSULTING KNUTH VOL 2
  ;CHAPTER 3 FIRST
Edit: GW-BASIC, not QBASIC (https://github.com/microsoft/GW-BASIC)

Fun fact, GW-BASIC was a descendant of the original Altair BASIC. The "Translation created 10-Feb-83" headers on each source file refer to tooling Microsoft had that automatically translated the 8080 assembly to 8086 (it shouldn't be taken as a build date since they were manually modified after that point). Besides GW-BASIC, source code for the 6502 and 6809 rewrites of Microsoft BASIC were available up to this point (see https://www.pagetable.com/?p=774 and https://github.com/davidlinsley/DragonBasic) but I believe this is the first public release of the original 8080 BASIC code.

Shouldn't it be "valiant" ?

Sure, but in those days spellcheckers were separate apps - the most popular at the time being CorrectStar from MicroPro.

They weren't integrated into programming-oriented editors, and it would have been unusual to run them against code.


I still haven't seen anyone using a spellchecker in code outside of IntelliJ


I recently found https://github.com/tekumara/typos-lsp which uses https://github.com/crate-ci/typos. Plenty of GH stars, so likely a solid user base. Works great in NeoVim with the built-in spellchecker.

Eclipse has had an integrated spell-checker, which I believe is on by default for most file types, for like approximately forever. Now maybe everybody turns it off, but I gotta imagine there are some people who like it and keep it on.

Emacs has the ability to do spellcheck inline, both as a run through the buffer (old-school style) and as an as-you-type live feature. That said, I do most of my coding in JetBrains IDEs these days.

For Vim/Neovim users, there is one built in that is pretty good, and once you've added frequent custom words to the dictionary it is great. You can turn it on with `:set spell` or off with `:set nospell`. Add custom words by pressing `zg` on the target word:

I have this in my vimrc file so it's on by default for certain file types:

    " Turn on spellcheck for certain filetypes and word completion.                                                                                                                                                                               
    " words can be added to the dict by pressing 'zg' with cursor on word.                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                 
    autocmd Filetype markdown setlocal spell                                                                                                                                                                                                      
    autocmd Filetype gitcommit setlocal spell                                                                                                                                                                                                     
    set complete+=kspell                                                                                                                                                                                                                          
                                                                                                                                                                                                                                              
    " Don't highlight in red an underscore (_) in markdown                                                                                                                                                                                        
    " https://vi.stackexchange.com/q/18471/17441                                                                                                                                                                                                  
    autocmd Filetype markdown syn match markdownIgnore "\v\w_\w"
Custom additions to the dictionary will go to a simple text file (one word per line) in `~/.vim/spell/en.utf-8.add` (depending on your settings) where it is easy to edit or back up.

Some people use VSCode extensions

The best programmers I’ve known have all been deficient at spelling. I don’t know why it so uniformly appears among them.


Humans in general, even writers, are deficient at spelling. This is the reason we need spellcheckers.

Steve Jobs used to say the problem with Microsoft is they don’t have taste.

The font-shimmering effect on scroll immediately reminded me of that, it is really distracting. And you can’t use reader mode to disable it.

(FWIW, I’m a fan of Bill Gates and all he’s done for the world)


The design is fun and gave me a lot of nostalgia, but I admit they overdid it. They could have made that piece feel the same without so much distraction. And please people, support reader mode. It's not hard and it shouldn't be optional.

EDIT: Good god they animated EVERYTHING. It's not even readable... also... not one inline code sample? This is the designer trying to get an awwwards site of the day without any interest in the actual content. It's like a guitar player that solos over everyone else's solos.


On top of the poor readability, my 2-year-old laptop can't even navigate through the page without CPU and GPU going insane, and my fans blasting at max speed. It's the poorest, choppiest web performance I can recall, all for what should be a simple blog post.

That's the fault of modern websites being massive JavaScript ad-playing behemoths instead of sub-1kB served HTML as god intended.

Funny cause just today this made it to the front page of HN

https://animejs.com/

It has way fancier animations and scrolls like butter


Tim Berners-Lee has been elevated to many things, but an ascension to deity must be a new reach.

I don't know, did you see the 2012 Olympic opening ceremony?

Kernighan & Ritchie deserve company

FWIW the spinning scrolling effects of Apple release announcements are nearly as bad.

Personally I like it :) Tastes differ.

Get your hands on DONKEY.BAS, you will love it!

Makes me wonder: did Bill write all of this text? Did he decide this effect is cool and must go in? Did he even know about that text effect?

Yes, I was shocked that Bill Gates's personal blog seems to have that "500 WordPress plugins" kinda vibe. Kinda reminds me of my old MySpace profile.

I think it's pretty cool

“All he’s done for the world” by copyrighting Covid vaccine, eh?

I've written an Intel 8080 emulator that was portable between Dec10/VAX/IBM VM CMS. That was easy - the 8080 can be done quite simply with a 256 value switch - I did mine in FORTRAN77.
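
A minimal sketch of that dispatch style (in Python rather than FORTRAN77, and with only a handful of the 256 opcodes filled in):

    # Fetch the opcode byte, then dispatch on its value - the Python
    # equivalent of a 256-value switch. Opcode values are from the
    # standard i8080 table.
    mem = bytearray(65536)   # 64 KB address space
    pc = 0                   # program counter
    a = 0                    # accumulator
    running = True

    def step():
        global pc, a, running
        op = mem[pc]
        pc = (pc + 1) & 0xFFFF
        if op == 0x00:       # NOP
            pass
        elif op == 0x3E:     # MVI A, d8
            a = mem[pc]
            pc = (pc + 1) & 0xFFFF
        elif op == 0x3C:     # INR A
            a = (a + 1) & 0xFF
        elif op == 0x76:     # HLT
            running = False
        else:
            raise NotImplementedError(f"opcode {op:#04x}")

    mem[0:4] = bytes([0x3E, 0x41, 0x3C, 0x76])  # MVI A,41h / INR A / HLT
    while running:
        step()
    assert a == 0x42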

Writing a BASIC interpreter, with floating point, is much harder. Gates, Allen and other collaborators' BASIC was pretty damned good.


Fun fact: according to Jobs, for some unknown reason Wozniak refused to add floating point support to Apple BASIC, so they had to license a BASIC with floating point numbers from Microsoft [1].

[1] Bill & Steve (Jobs!) reminisce about floating point BASIC:

https://devblogs.microsoft.com/vbteam/bill-steve-jobs-remini...


Writing a floating point emulator (I've done it) is not too hard. First, write it in a high level language, and debug the algorithm. Then hand-assembling it is not hard.

What is hard is skipping the high level language step, and trying to do it in assembler in one step.


Also, though, how big was Apple Integer BASIC? As I understand it, you had an entire PDP-10 at your disposal when you wrote the Fortran version of Empire.

I did learn how to program on the -10. A marvelous experience.

Looking backwards, writing an integer basic is a trivial exercise. But back in the 70s, I had no idea how to write such a thing.

Around 1978, Hal Finney (yes, that guy) wrote an integer basic for the Mattel Intellivision (with its wacky 10 bit microprocessor) that fit in a 2K EPROM. Of course, Hal was (a lot) smarter than the average bear.


Interesting, I didn't know that! I didn't know him until the 90s, and didn't meet him in person until his CodeCon presentation.

What I was trying to express—perhaps poorly—is that maybe floating-point support would have been more effort than the entire Integer BASIC. (Incidentally, as I understand it, nobody has found a bug in Apple Integer BASIC yet, which makes it a nontrivial achievement from my point of view.)


I've never understood floating point :-)

Fixed point is where the number has a predetermined number of bits for the integer and fraction, like 8.8, where you have 0-255 for the integer and the fraction goes from 1/256 to 255/256 in steps of 1/256.

Floating point at its simplest just makes that a variable. So the (.) position is stored as a separate number. Now instead of being fixed, it floats around.

This way you can put more in the integer or more in the fraction.

The Microsoft BASIC here used 23 bits for the number (the mantissa), 1 sign bit, and 8 bits to say where the point should be placed.

Of course in practice you have to deal with a lot of details depending on how robust you want your system. This BASIC was not as robust as modern IEEE 754, but it did the job.
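
For the curious, here is a rough sketch of decoding that format. The byte order and the "exponent 0 means zero" rule are my reading of the Microsoft Binary Format, so treat the details as assumptions:

    # Decode a 32-bit Microsoft Binary Format float. Assumed layout:
    # mantissa low byte, mid byte, high byte (top bit is the sign and
    # its slot doubles as an implicit leading 1), then an exponent
    # byte biased by 128, where exponent 0 means the value is zero.
    def decode_mbf32(b: bytes) -> float:
        lo, mid, hi, exp = b
        if exp == 0:
            return 0.0
        sign = -1.0 if hi & 0x80 else 1.0
        mantissa = ((hi | 0x80) << 16) | (mid << 8) | lo  # restore leading 1
        return sign * (mantissa / (1 << 24)) * 2.0 ** (exp - 128)

    # 2.0 is mantissa 0.5 with exponent 130: 0.5 * 2**2 == 2.0
    assert decode_mbf32(bytes([0x00, 0x00, 0x00, 0x82])) == 2.0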

Reading more about IEEE 754 is a fascinating way to learn about modern floating point. I also recommend Bruce Dawson's observations on his Random ASCII blog.


Let's say we want to store numbers in computer memory but we are not allowed to use decimal point or any characters except for digits. We need to make some system to encode and decode real numbers as a sequence containing only digits.

With fixed point numbers, you write the digits into the memory and have a convention that the decimal point is always after N-th digit. For example, if we agree that the point is always after the 2nd digit then a string 000123 is interpreted as 00.0123 and 123000 means 12.3. Using this system with 6 digits we can represent numbers from 0 to 99.9999 to a precision of 0.0001.
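
In code, the whole fixed-point convention is a single division (a toy sketch of the scheme just described):

    # 6 decimal digits, point always after the 2nd digit,
    # so there are always 4 fractional digits.
    def decode_fixed(s: str) -> float:
        return int(s) / 10 ** 4

    print(decode_fixed("000123"))  # 0.0123
    print(decode_fixed("123000"))  # 12.3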

With floating point, you write both decimal point position (which we call "exponent") and digits (called "mantissa"). Let's agree that the first two digits are the exponent (point position) and the rest four is mantissa. Then this number:

    020123 
means 01.23, i.e. 1.23 (the exponent is 2, meaning the decimal point is after the 2nd digit of the mantissa). Now using the same 6 digits we can represent numbers from 0 to 9999·10⁹⁶ with a relative precision of 1/10000.

That's all you need to know, and the rest should be easy to figure out.
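
A toy sketch of that scheme (nothing like a real format, just the convention above):

    # First two digits are the exponent (decimal point position),
    # the remaining four are the mantissa.
    def decode_float(s: str) -> float:
        exp, mantissa = int(s[:2]), s[2:]
        # Place the point after `exp` digits of the mantissa.
        return int(mantissa) * 10.0 ** (exp - len(mantissa))

    print(decode_float("020123"))  # 1.23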


In other words, a floating point number consists of 2 numbers and a sign bit:

1. the digits

2. the exponent

3. a sign bit

If you're familiar with scientific notation, yes, it's the same thing.

https://en.wikipedia.org/wiki/Scientific_notation

The rest is just the inevitable consequences of that.


I like "decimal point position" more than "exponent". Also, if I remember correctly, "mantissa" is the significand (the digits of the number).

And by the way engineering notation (where exponent must divide by 3) is so much better. I hate converting things like 2.234·10¹¹ into billions in my head.

And by the way (unrelated to floating point) mathematicians could make better names for things, for example instead of "numerator" and "denominator" they could use "upper" and "lower number". So much easier!


I do get significand and mantissa mixed up. I solved that by just removing them!

Wrote floating point routines in assembler back in college. When you get it, it's one of those aha moments.

The specs for it are indeed hard to read. But the implementation isn't that bad. Things like the sticky bit and the guard bit are actually pretty simple.

However, crafting an algorithm that uses IEEE arithmetic and avoids the limitations of IEEE is hard.
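
To show how simple the guard/sticky part can be, here is a minimal round-to-nearest-even sketch (my illustration, not any particular implementation):

    # Round an integer mantissa down to `keep` bits. The first dropped
    # bit is the guard; the OR of all the other dropped bits is sticky.
    def round_mantissa(bits: int, keep: int) -> int:
        extra = bits.bit_length() - keep
        if extra <= 0:
            return bits
        kept = bits >> extra
        guard = (bits >> (extra - 1)) & 1
        sticky = (bits & ((1 << (extra - 1)) - 1)) != 0
        if guard and (sticky or (kept & 1)):  # >1/2 rounds up; ties go to even
            kept += 1
        return kept

    assert round_mantissa(0b10011, 4) == 0b1010  # 1001.1 ties up to even 1010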


Floating point math was a key feature on these early machines, since it opened up the "glorified desk calculator" use case. This was one use for them (along with gaming and use as a remote terminal) that did not require convenient data storage, which would've been a real challenge before disk drives became a standard. And the float implementation included in BASIC was the most common back in the day. (There are even some subtle differences between it and the modern IEEE variety that we'd be familiar with today.)

I agree - it's a useful BASIC that can do math and fits in 4 or 8 kilobytes of memory.

And Bill Gates complaining about pirating $150 Altair BASIC inspired the creation of Tiny BASIC, as well as the coining of "copyleft".


I still have a cassette tape with Microsoft Basic for the Interact computer. It's got an 8080.

I remember my old Tandy Color Computer booting up and referencing Microsoft BASIC:

https://tinyurl.com/2jttvjzk

The computer came with some pretty good books with example BASIC programs to type in.


I have a MS Extended Basic cassette for the Sol-20, also 8080 based.

You should upload the audio to the Internet Archive!

>Writing a BASIC interpreter, with floating point, is much harder. Gates, Allen and other collaborators' BASIC was pretty damned good.

The floating point routines are Monte Davidoff's work. But yes, Gates and Allen writing Altair BASIC on the Harvard PDP-10 without ever actually seeing a real Altair, then having it work on the first try after laboriously entering it with toggle switches at MITS in Albuquerque, was a remarkable achievement.


What Allen did was write an 8080 emulator that ran on the -10. The 8080 is a simple CPU, so writing an emulator for it isn't hard.

https://pastraiser.com/cpu/i8080/i8080_opcodes.html

Then, their BASIC was debugged by running it on the emulator.

The genius was not the difficulty of doing that; it wasn't hard. The genius was the idea of writing an 8080 emulator. Wozniak, in comparison, wrote Apple code all by hand in assembler and then hand-assembled it to binary, a very tedious and error-prone method.

In the same time period, I worked at Aph, and we were developing code that ran on the 6800 and other microprocessors. We used full-fledged macro assemblers running on the PDP-11 to assemble the code into binary, and then download binary into an EPROM which was then inserted into the computer and run. Having a professional macro assembler and text editors on the -11 was an enormous productivity boost, with far fewer errors. (Dan O'Dowd wrote those assemblers.)

(I'm doing something similar with my efforts to write an AArch64 code generator. First I wrote a disassembler for it, testing it by generating AArch64 code via gcc, disassembling that with objdump and then comparing the results with my disassembler. This helps enormously in verifying that the correct binary is being generated. Since there are thousands of instructions in the AArch64, this is a much scaled-up version of the 8080.)


The Wozniak method was how I used to write 6502 assembler programs in high school since I didn’t have the money to buy a proper assembler. I wrote everything out longhand on graph paper in three columns. Addresses on the left, a space for the code in the middle and the assembler opcodes on the right, then I’d go through and fill in all the hex codes for what I’d written. When you work like that, it really focuses the mind because there’s not much margin for error and making a big change in logic requires a lot of manual effort.

I started Z80 assembler (on a ZX80 computer) that way. But I soon got fed up looking up opcodes and especially calculating relative jumps (especially backwards ones) by hand, as I often seemed to make off-by-one errors causing my program to crash.

So I wrote my own assembler in BASIC :)
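
For anyone who hasn't done it by hand, the fiddly arithmetic looks like this (the Z80's JR stores the target minus the address just past the 2-byte instruction, as a signed byte):

    # Compute the displacement byte for a Z80 relative jump (JR).
    # It's relative to the address *after* the 2-byte instruction,
    # which is exactly where hand-assembled off-by-one errors creep in.
    def jr_displacement(instr_addr: int, target: int) -> int:
        disp = target - (instr_addr + 2)
        if not -128 <= disp <= 127:
            raise ValueError("target out of range for JR")
        return disp & 0xFF  # two's-complement byte

    print(hex(jr_displacement(0x8000, 0x8000)))  # jump-to-self: 0xfe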


Allen had to write the loader in machine code, which was toggled in on the Altair console. The BASIC interpreter itself was loaded from paper tape via the loader and a tape reader. The first BASIC program Allen ran on the Altair was apparently "2 + 2", which worked - i.e. it printed "4". I'd like to have such confidence in my own code, particularly the I/O, which must have been tricky to emulate on the Dec10.

> which must have been tricky to emulate on the Dec10

I don't see why it would be tricky. I don't know how Allen's 8080 emulator on the PDP-10 worked, but it seems straightforward to emulate 8080 I/O.
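
Something like this, say (port numbers are made up for illustration): the emulator traps the IN (0xDB) and OUT (0xD3) opcodes and maps port numbers to host-side handlers.

    import sys

    STATUS_PORT, DATA_PORT = 0x10, 0x11   # hypothetical console device

    def io_in(port: int) -> int:          # called for the IN instruction
        if port == STATUS_PORT:
            return 0x02                   # pretend a character is always ready
        if port == DATA_PORT:
            ch = sys.stdin.read(1)
            return ord(ch) & 0x7F if ch else 0
        return 0xFF

    def io_out(port: int, value: int) -> None:  # called for OUT
        if port == DATA_PORT:
            sys.stdout.write(chr(value & 0x7F))
            sys.stdout.flush()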


Well, I found it a bit hard on my Dec10-based emulator. I never got the memory-mapped stuff to work properly - I just mocked up some of the I/O instructions. But it was actually a spare-time project, intended to let my students do stuff like sorting, searching in strings, so I didn't feel too guilty. It had an assembler, debugger and other stuff. And it was portable - completely standard FORTRAN77!

I'm not as smart as you guys, but I figured that I'm going to try and write, line for line, every single thing on GitHub unless someone has done it already, so that we could try to compile and build this thing directly on our own computers.

Tried to open this page and the music I was streaming started to stutter so hard I just exed out. Is this a preposterously heavy page, or just very heavy?

Flipping through the source code is like a time machine tour of tech's evolution over the past 50 years. It made me wonder: will our 2025 code look as ancient by 2075?

And, btw, great infographics within the post.


That's interesting to consider. Some of the GNU code is getting quite old and looking through it is a blast from the past. I'm frankly amazed that it continues to work so well. I suspect there is a relatively small group of GNU hackers out there rocking gray and white beards that are silently powering the foundations of our modern world, and I worry what's going to happen when they start retiring. Maybe we'll get rust rewrites of everything and a new generation will take over, but frankly I'm pretty worried about it.

Has there ever been a moment in human history where we’ve (as a society, not as individuals) looked back and were envious?

So my money is that the code I wrote today is the joke of tomorrow - for all involved.

Also, I for one don’t want to go back to punch cards ;)


> Has there ever been a moment in human history where we’ve (as a society, not as individuals) looked back and were envious?

I am guessing the generation that transitioned from the Pax Romana to the early Middle Ages in Europe.


I doubt that, since knowledge and education weren't widespread - beyond cloisters, people didn't generally know how well the Romans had it.

Remember it took until the Renaissance until ancient texts (Greek and Roman) were “rediscovered” by European scholars.


In all their cities they could see buildings that they did not know how to build. And before that, public services would have broken down. It would have become impossible to find people who knew how to repair your heated floor (if you were rich), etc. The city of Rome declined from 1 million people to something like 20,000. In the late 500s, Pope Gregory the Great thought that the world was ending because of all the trouble (including vicious barbarian invasions). Monks (and presumably anyone educated) had access to a lot of ancient texts, it was only some that got lost in the West. I think they would have had a distinct sense that that past was more advanced.

I think to most (90+%?) software developers out there in the world, Assembler might as well be hieroglyphics. They/we can guess at the concepts involved of course, but actually being able to read the code end to end and have a mental model of what is happening is not really going to happen. Not without some sort of Rosetta Stone. (Comments :) )

I think 2075 developers will feel the same way about modern Java, C#, TypeScript, etc.

They will think of themselves as software developers but they won't be writing code the same way, they'll be giving guided instructions to much higher level tools (perhaps AIs that themselves have a provenance back to modern LLMs)

Just as today, there will still be those that need to write low level critical code. There are still lots of people today that have to write Assembler, though they end up expressing it via C or Rust. And there will be people still working on AI technology. But even those will be built off other AIs.


I wonder who the handwritten notes on page 98 are by?

Starts with "confirm plane reservation on Tue. Sept 2 or Wed. Sept 3" which is correct for 1975


Some luck, and willingness to take risks paid off in ways that could never be anticipated. Not sure I'll see something like the pc era in my lifetime. Perhaps mobile phones, or the Internet.

Having lived through PCs, internet, mobile, social, crypto and AI, I'd say mobile or social has been the biggest so far, and AI is likely to have a vastly larger impact. Of course they build on each other. But the global impact of mobile and social vastly exceeds that of the PC era.

The Internet?

I mean… The AI?

Consider that nobody ever sat in countless meetings asking "How can we use the PC?" They either saw the vision and went for it, or eventually ran up against the limitations of working without a PC and bought in.

Well, apparently, the guys in Xerox did sit in meetings not knowing what to do, until Steve Jobs visited PARC and saw what was possible.

Actually, there was about a 15-year period where many people didn't think PCs were good for anything, because they had access to much better (shared) computers. That's the context where http://catb.org/jargon/html/B/bitty-box.html comes from. See also http://canonical.org/~kragen/tao-of-programming.html#book8. Throughout the 01980s PC Magazine worked hard to convince business decisionmakers that IBM PCs weren't merely game machines; if you look at old issues you'll see that computer games were completely missing from the abundant advertisements in the magazine, presumably due to an explicit policy decision.

I personally encountered people arguing that using PCs (as opposed to VAXen or mainframes) was a waste of time as late as 01992. And I actually even sort of joined them; although I'd been using PCs since before the IBM PC, once I got access to the internet in 01992, I pretty much stopped using PCs as anything but a terminal or a game machine for years, spending virtually 100% of my computer time on VMS or Ultrix. When I was using PCs again, it was because I could run BSD/386 and Linux on them, in 01994.

(Maybe you'd assume from my own story of enthusiastic adoption that "nobody ever sat in countless meetings asking[,] "How can we use the internet?"', but if so, you'd be extremely wrong. In 01992 and even in 01994 there were lots of people who thought the internet was useless or a fad. Bill Gates's The Road Ahead, published November 01995, barely mentioned the internet, instead treating it as a sort of failed experiment that would be supplanted by the Information Superhighway. Metcalfe predicted in 01996 that it would collapse. David Isenberg was still arguing against "Bellheads" and their "Advanced Intelligent Network" in 01997: https://isen.com/stupid.html)

It can be easy looking back in retrospect to oversimplify events like these with the benefit of hindsight, imagining that the things that seem obvious now were obvious then. But not only weren't they obvious—in many cases, they could have turned out differently. I think it was Alan Kay that argued that, without the invention of the sort of graphical user interface used by most non-cellphone personal computers today, the personal computer as we know it never would have become a mass-market phenomenon (though video game consoles were) and therefore Moore's Law would have stalled out decades ago. I'm not sure he was right, but it seems like a plausible alternate history to me.

Of course, there were "killer apps" as early as VisiCalc for the Apple ][. Accountants and corporate executives were willing to read through the manual and take the time to learn how to use it, because it was such a powerful tool for what they were doing. But it was designed for specialists; it's not a UI that rewards casual use the way Excel or MacPaint or NCSA Mosaic is. Without the GUI, or if the GUI had come much later, plausibly personal computers would have remained a niche hobbyist thing for much longer, while somebody like Nintendo would have locked down the unwashed-masses platform—as we now see happening with Android. And (maybe this is obvious) that would have made it immensely less useful.


I offer this as constructive feedback, but I found your highly unusual style of adding a zero in front of all your years was very distracting while I was reading your comment, in a sense it “derailed my parsing” of what you were trying to say.

Keep in mind that persevering with this style in your writing may mostly serve to detract from what you’re actually trying to communicate to others.


Yeah, if you deliver a history lesson while wearing a mohawk, there are always certain people who will only remember the mohawk, no matter how good the history lesson was. Some places, they'll even beat you up for the mohawk.

I'm perfectly happy to deny those people the perhaps-dubious benefit of my viewpoint. I'm sharing knowledge, not making a sales pitch, and I'm not especially worried about getting beaten up anymore.


That came out of millions of dollars and man-hours of investment by Google and OpenAI.

VS

Some college students selling software they didn't have and getting it ready from 0 to sellable in 2 months which led to a behemoth that still innovates to this day.


It doesn't sound that different from Alex Krizhevsky training AlexNet on a pair of gaming GPUs in his bedroom, winning ImageNet, and launching the current wave of deep learning / AI.

The big difference is that Bill's dad was one of the best corporate lawyers in America. Microsoft might not have amounted to much if they hadn't struck some extraordinarily prescient licensing deals at the right time and place.

No difference really, just google who Bill Gates' mom was and how he got the IBM DOS deal... It wasn't BASIC that made MS big, it was DOS.

Great point, I was thinking more on the Transformer architecture, but I stand corrected.

Google started similarly with PageRank as far as I remember.


Grad students, but yeah. CUDA was also basically invented by a grad student.

Many undergrad examples as well in the web era, from Excite to Facebook to Snapchat.

(Note the unanticipated consequences aren't always good.)


This website froze my phone, not joking.

Yeah, there's sort of a glitchy virus matrixy thing going on with the text as I scroll and it's really weird.

Everything Bill touches gets frozen at some point of time...

A recent disassembly of, I think the same code. https://github.com/option8/Altair-BASIC/blob/master/BASIC%20...

I checked in a few places at the start and towards the end (the sin function) and they matched.


The source code is linked at the end (warning: it's a 100 MB PDF).

https://images.gatesnotes.com/12514eb8-7b51-008e-41a9-512542...


The printout is dated 10-SEP-75 and is labeled "VERSION 3.0 -- MORE FEATURES TO GO".

Curiously this isn't the oldest extant version of the source code. The Harvard archives have a copy of version 1.1, printed on 30 April 75. http://altairbasic.org/other%20versions/ian.htm


The printout also contains dates 6-SEP-64 below it; any idea what those are?

Thank you for the warning. I once used up my Internet package's entire monthly quota by following a similar link on Hacker News.

Ironic for something designed to take up only 4KB on its target machine :)

(It's a high-res image of the printed code.)

Nice one. Has anyone OCRed this back into text?

I attempted OCR with OCRmyPDF / Tesseract. It's not great, but it's under 1% the size, at least. https://github.com/pronoiac/altair-basic-source-code

Maybe you should try something like EasyOCR instead: https://github.com/JaidedAI/EasyOCR

Feel free to run EasyOCR against it and submit a PR

It's interesting reading this after finishing Palo Alto by Malcolm Harris.

added to my must-read list.

I notice his interview on Democracy Now : https://www.youtube.com/watch?v=j7jPzzjbVuk

This guy's mental map is impressive, as are the colors of his book titles: https://www.goodreads.com/author/show/16872611.Malcolm_Harri...


Thanks for the Democracy Now interview! His description of "tech layoffs" is the most concise framing I've heard to describe what I've felt about it:

"Cosmetic offering to the financial markets to show that Silicon Valley still can control its labor costs... It's less the future flow of funds is improved ... than that they're signaling something to the markets ..."

https://youtu.be/j7jPzzjbVuk?si=YSbUW8h2mNktzj_9&t=634


Total sidenote, but "Gates Notes" has to be one of the most exotic personal blogs I've ever seen. At this point, would you even consider this a personal blog?

Yea, well I would consider it that in the sense that it seems like a mix of his personal interests, history, and promotion of stuff he cares about (his biography and foundation and various projects he's on). It's a unique site because he has the cash to hire people who put a great UX experience on top of it all. I think that's the main difference.

Not that he's unique in this, but I do really appreciate his book lists. I usually grab a few books during the year based on his recommendations.


Maybe Bill has really taken an interest in Javascript. /s

What's compelling is that he basically starts off saying that they lied... to MITS.

I sort of knew the story, but the way Gates presents it in his article makes it pretty blunt. There is no contrition; rather it is a story of glorious success, a story of hard work to be proud of, all started by the lie. In fact, the lie is presented as the nucleating event, as positive thing that spurred them to turn the lie into truth.

To me it felt consonant with the ethics of Harvard, and more saliently, the fact that their founding event was a lie seems consonant with the trajectory of the company. The summary of the book makes it sound like the real title is "A Glorious Life", and I would expect no contrition about DR DOS, Netscape, and other Microsoft ruthlessness under Gates.

(To be fair, I loathe Microsoft and their products, which help me accomplish my goals the way a spoon helps me cut a steak, and I have never seen Gates as virtuous. So I am hardly unbiased.)


Fake it till you make it.

Oracle did the same.


Trial balloon is the euphemism used in the Wiki article.

Damn this is cool. I think text is an underutilized medium for design.

I would say, "Looking forward to the github repo with this code in ASCII" but I realize Microsoft would likely not allow that.

Why would they not allow it? They've published the source to numerous old products (including MS-DOS[1]), and they own GitHub...

[1]: https://github.com/microsoft/MS-DOS


Funny enough, last night I was hoping more old 90s Microsoft code would get open sourced and I somehow was living under a rock. Maybe one day I'll get to legally dig through NT 3.51 code, especially since it was ported to MIPS and Power iirc. I went on a huge tangent reading about how someone ported leaked ntvdm code to x64. They didn't provide code and I didn't go hunting for the leaked stuff and won't... but I think it's super neat how forward portable some of the stuff NT has is.

I attempted OCR, and while it's not great, it's a start. I considered adding a reference to "software wants to be free!" or the Open Letter, but I'm winding down for the night. https://github.com/pronoiac/altair-basic-source-code

Note that the constants in the PDF are in Octal!

Microsoft (and maybe even Bill Gates personally) generated a strong "dislike" sentiment in the hacker community. But we can't deny that he and Paul Allen were purebred hackers and helped the development of technology a lot. Of course, we all prefer OSS and we'd pick Linus (or insert OSS dev name here) 100 times over one of the "evil capitalists"/s, but nevertheless they have to be recognized.

I’m a 90s kid (born in 1989), and I remember the days of the anti-trust lawsuit, “Internet Exploder,” the Slashdot Borg icon, and resentment from Mac users, WordPerfect users, Netscape users, and others who strongly disliked the Microsoft monopoly.

Still, there’s something about Microsoft of that era. Bill Gates was “one of us,” a passionate nerd. This was an era where nerds like Jobs, Woz, and Gates ruled. The 1990s and the 2000s felt exciting, and it felt like technology was making the world a better place.

I must admit, even though I was firmly in the Jobs and Woz camp in the 2000s, I also fondly remember Windows 2000, Visual Studio 6, and pre-ribbon Microsoft Office. Contrary to Steve Jobs’ opinion, I believe Microsoft has occasionally exhibited great taste :). For better or for worse, the 1990s was peak Microsoft.

Something happened in the 2010s. It seems like the tech industry has become just like any other industry that has gotten entrenched, and today’s tech leaders simply don’t inspire me like how the leaders of previous eras did. Today’s Web media companies are far scarier than 1990’s Microsoft ever was.

Then again, I was a mere child in the 1990s, and I became an adult in the 2010s, and so I could be looking at the 1990s through childhood memories.


As a fellow 90s kid... I feel the same. I remember when Sony Ericsson launched their first camera phone and how we used to go through PC upgrades like crazy. My dad would go to the bookstore to buy magazines with new Linux distros included for free. Now I have a laptop that's 4 years old and I'm not excited to buy my next (heck, I don't even need to buy my next... I can run llama.cpp just fine on my current).

I do think the barrier to entry in tech has significantly increased. There was a wave of internet companies like Uber (and their global equivalents) that benefited massively from providing local internet services. In the 2000s and 2010s the tech companies benefited massively from global poverty alleviation efforts to get users in remote regions online. The push to get people online meant that millions of people in poor countries had access to social media and ads but not basic needs like toilets. As the tech companies saturated the emerging markets, COVID began to hit. The stark inequalities began to be rubbed in. The big tech companies also don't really have any real material asset to fight over anymore. Their markets have been largely captured. As a big tech firm the game is now to maintain your lead. The industry is now run by MBAs, not hackers anymore.


Now those poor people are online globally and can scroll Instagram.

I think what you are remembering is just nostalgia, people tend to remember the good things and shut out the bad ones.

I still remember how Microsoft, under Gates, acted like a robber baron to the whole tech community. You had a nice product? It was instantly copied by Microsoft, and they pulled the rug out from under you because they could.

You wanted open standards? It was a war purely because Microsoft wanted it to be. It was either Microsoft's way or the highway.

I consider pre-2008 and pre-iPhone launch to be the peak of the Internet, but it's all downhill from that year onwards.


Yes, agree. Bill Gates was never ”one of us”. He came from extreme privilege and used his advantage to kill off much more innovative technologies. BeOS, anyone?

There's a throwaway quote about the school Gates was attending spending a few thousand dollars a year on a terminal and computer time.

The inflation factor is around 5X, so that's maybe $15k to $20k in modern money.

There were very few schools in the world with a five figure budget for computer experiments for a handful of pupils in the early 1970s.


and?

To be fair, much of the coding community is highly educated - especially in the top companies, which generally hire from top schools - and therefore likely to be privileged.

>It seems like the tech industry has become just like any other industry that has gotten entrenched, and today’s tech leaders simply don’t inspire me like how the leaders of previous eras did. Today’s Web media companies are far scarier than 1990’s Microsoft ever was.

Three letters: MBA

When the MBAs came into the tech industry everything got stale, 'safe' and unexciting as they want to leech their fucking hands over everything in the name of maximal profit.

Private Equity follows MBAs so you see more PE firms getting into tech during the same period. Same story, fucking leeches leeching makes the leeches happy at the expense of society. In fact, it seems PE firms and MBA grads love making the world an actively terrible place

I hate business bros. They ruin god damn everything.


Gates showed his true colors right up front with the "Open Letter to Hobbyists", and pursued the rest of his career in like fashion. It's not just about Microsoft versus open source: many of us already resented their strong-arming, dominance-oriented, rent-seeking, ownership-hungry monopolistic approach to computing before the free software movement had really gotten going, or the term "open source" had even been invented.

It is interesting, especially in the context of Gates' childhood upbringing and his extremely rare access to computers and computer training.

Something that maybe one or two dozen other children had access to in the entire country during that time (60s/70s).

You have to also remember that computers were also seen as a public good for a large swath of users during this time too.

Makes you wonder how different this industry would be if we replaced Bill Gates singular childhood privilege with that of Bill Joy's (which looks like your typical middle class experience)? Only instead of one child, you could probably help thousands of children.


Berkeley's Willard Jr. High School bussed 7th grade students up to the Lawrence Hall of Science in the fall of 1970. I was the 3rd grade younger brother that started to print all the code, so I could walk through it. There were at least 70 to 80 kids there, and only two years later, they added two more 30-person labs. Dartmouth BASIC and HP BASIC were at most universities, while punched-card FORTRAN was at most engineering schools.

Yes, you're talking about getting access via public education. Bill Gates, as a child, had nearly 24/7 personal access to these machines.

Something most professionals didn't even have.


This is consistent with the parent comment. You can have a hacker mindset and be totally against open source. They are orthogonal qualities.

> You can have a hacker mindset and be totally against open source. They are orthogonal qualities.

You can write that, but I don't see it. FOSS is built for hacking, designed to empower and enable hacking. Proprietary closed-source software prevents it.


How were they "purebred hackers"? Was Gates especially proficient with code? I've never heard that. From what I read, they were the enemies of hackers. This really seems like looking back with rose-colored glasses.

My understanding of Microsoft's success was it came from marketplace maneuvers, many ranging from unethical to illegal, not from quality or innovative hacking. Compare Windows with any contemporaneous MacOS, for example. They took over the office productivity software market by illegally leveraging their Windows monopoly. Their initial and core success - getting DOS on IBM PCs, which led to the Windows monopoly - was simply leaping at a business opportunity, I think even before they began developing the product.

Didn't they generate fake errors for Windows running on DR-DOS, or something like that, even though it ran fine? Do you mind that they tried to destroy and monopolize the open web (thank you Mozilla!)?


> My understanding of Microsoft's success was it came from marketplace maneuvers, many ranging from unethical to illegal, not from quality or innovative hacking. Compare Windows with any contemporaneous MacOS, for example.

So it's 1992, and OS/2 still isn't happening.

But you can get a 386 at 16 or 25 MHz complete with maybe a 40 MB hard drive, color monitor, 256-color VGA, a couple megabytes of memory, and licenses for MS-DOS and Windows 3.1 for $1000 or less. This will let you do a lot of computer things.

If you want to run Mac OS, the very cheapest Macintosh you can get is the Mac Classic, and it costs $1695 for a 7 MHz 68000, a single floppy drive, no hard drive, and a 1-bit black and white display. This will enable you to do a lot fewer computer things, much more slowly.

Macs were very expensive. Windows was good enough. It wasn't better, necessarily, but it wasn't strong-armed onto the market by shady maneuvers either -- at the time of Windows 3 and 95 it was genuinely good "product-market fit". Microsoft, from its earliest days, was good at leveraging mass-market hardware to deliver "good enough" software that worked for the majority of people. Of course they did shady stuff that increased their dominance, but Windows would have sold like hotcakes either way.

> Didn't they generate fake errors for Windows running on DR-DOS, or something like that, even though it ran fine?

IIRC that code existed, but was commented out in the final build.


It was strong-armed because Gates used family connections to negotiate a preferential deal for DOS with IBM, and then forced PC manufacturers to bundle DOS and/or Windows.

That was then leveraged into attempts to force Internet Explorer onto Internet users. Which was when the antitrust suit happened.

Meanwhile IE and Windows were notorious for being terrible pieces of software.

Windows was always horrifically buggy and crash prone - far behind even the most basic standards of professional reliability. 3.x was sort of usable but extremely simple, 9x was just horrific, and it wasn't until XP that it became almost reliable.

Both IE and Windows were also a security disaster.

Between the bugs and the security flaws Microsoft wasted countless person-centuries for its users.

The one thing that MS did right was create a standard for PC software. That was the real value of Windows - not the awfulness of the product but the ecosystem around it, which created Visual Basic for beginner devs and Windows C++ classes for more experienced devs, and kick-started a good number of bedroom/small-scale startup businesses.

For context, PCs at this time were also extremely expensive. The price of a Mac Classic got you a brain damaged 80286 and not much RAM. You had to spend $3k or more to get the newer 80386, and the 486/66 was just starting to become available.


> Windows was always horrifically buggy and crash prone

At the time Mac OS didn't have memory protection -- Netscape would make your whole computer go BOOM at regular intervals.

IE was even a hell of a lot more stable (and faster) than Netscape.

I put a fresh copy of Redhat on the Internet in 90s and it was p0wned in 5 minutes.

That's just the way things were.


> Meanwhile IE and Windows were notorious for being terrible pieces of software.

My feeling of IE3 to IE6 (at its release time) is that (anti-competitive strategies aside), many (most?) average consumers would very likely choose IE over Netscape if they gave both a bit of a test drive.

In 1996 (maybe 1997) I was 14/15 at the time and remember coming to the conclusion that IE3 ran much faster on Windows 95 compared to Netscape.

It being (anticompetitively) free helped, but on the 100Mhz Pentiums with 8MB of RAM in our computer lab, you’d be a masochist to choose Netscape over it for random web browsing.

IE4 was quite resource intensive, but because MS anticompetitively pre-loaded it on OS startup, it still started faster than Netscape.

IE6 I found pleasant to use and it wasn’t until Firefox came out with tabs (Opera had them earlier, but you would often encounter websites it wouldn’t render properly, probably due to IE targeted design), that IE lost its sheen for me.

Firefox was popular enough that developers started caring about standards compliant websites at which point IE started entering the “despised” category, but it may not have actually been displaced from its top spot were it not for Chrome.


> IIRC that code existed, but was commented out in the final build.

I've never heard that and IIRC, DR-DOS's owners sued successfully (or DoJ sued successfully). People certainly saw the errors.


> This really seems like looking back with rose-colored glasses.

It works both ways. It's hard to look back at the time while ignoring all the paths the road has taken since then.

Microsoft has always been a company that is very good at building software compared to their competition at the time. Their office productivity software, for example, is what made Windows popular (Windows is useless without apps). It's easy to give more weight to their flaws because, in many ways, their successes just seem obvious now.


> Microsoft has always been a company that is very good at building software compared to their competition at the time.

I have never, ever heard that. (Edit: Name such software today.)

> Their office productivity software, for example, is what made Windows popular (Windows is useless without apps).

Completely false. Windows was already a monopoly, and the US government successfully sued Microsoft for using their Windows monopoly to leverage sales for Office. They told manufacturers: If you want Windows (which was essential) for the computer, you must pay for an Office license too.

Where do you get this stuff or why are you posting it?


> Completely false. Windows was already a monopoly, and the US government successfully sued Microsoft for using their Windows monopoly to leverage sales for Office.

The government lawsuit was specifically about Internet Explorer, not Office. At no time were manufacturers forced to pay for Office licenses. Go ahead, look it up, I'll wait.

Where do you get your stuff and why are you posting it? You do know that Office applications existed before Windows, right? Excel came out for Mac OS first.


> Was Gates especially proficient with code?

Well the article is obviously a biased source, but surely developing a) an ALTAIR emulator for PDP-10s (Allen) and b) a pretty much full-fledged BASIC interpreter that was exclusively tested on top of said emulator (Gates) in two months, in the 70s was not the kind of stuff an average coder would have done.


This is also how I read the story: they were 'basically' salesmen/marketing guys with a good investor storytime. The hacking part was hacking together code on the plane before the meeting to rake in the cash?

Simply untrue. They were hacking in highschool for fun. Complete nerds. They were _also_ ruthless business people.

Most high school hackers and nerds don't become good professional coders.

And then all the folks that used to write M$ served the open Web on a plate to Google; now, with the exception of Safari, what we have is ChromeOS, in browser, and being packaged in "native" apps.

Gates was obviously a proficient coder. I think you're experiencing a time compression phenomenon here: this was the mid 70s. Microsoft, the big bad Microsoft that everyone knows about, didn't appear until around the mid 90s, 20 years later, although from the perspective of 2025 those two eras seem pretty much adjacent.

I don't mean proficient, I mean elite, exceptional, legendary.

BASIC was written as a team in Albuquerque. Altair had good reason to support their efforts. They then purchased DOS from Seattle Computer Products after they made a deal with IBM to sell it. To be fair Xerox gave away the office suite and the hardware to anyone who asked.

BASIC was written as a team in Bellevue. Altair did nothing to support them until they traveled to Albuquerque and proved the code worked.

A pretty limited version of it was written there, the only purpose of which was to get the contract. The majority of actual BASIC development happened afterwards. In any case, it was commentary on the "purebred hackers" question, so I was trying to highlight the commercial aspect of it. The work in Bellevue was only to achieve this outcome.

Allen wrote an 8080 emulator on a time shared PDP-10 in order for Gates to write the assembly code that implemented a BASIC interpreter - complete with I/O and editor - for a sight-unseen system, all in 4 kilobytes. And it worked the first time it was run.

I've been in the industry for 30 years and I couldn't do all that without serious Googling (or AI help nowadays).

Doing it as 20-somethings in the mid 70s definitely qualifies them as purebred hackers to me.


As a kid of the late 90s I feel like it was kinda unfair.

Back in the day (70s? 80s?) computers shipped with the programming language manual. All I got was a CD-ROM of ENCARTA and a slip to mail in for a restore set of MS-DOS / WIN 3.1 diskettes (which was sorely needed, I might add).


I wish Microsoft would bring back Encarta!!

Microsoft Dinosaurs was also awesome

In the mid 70s you got a badly mimeographed copy of the schematics and a bag of parts.

In the late 70s to early 80s you got a programming manual, but you had to save your programs on cassette tapes.

In the late 80s, you got glossy manuals which showed you how to turn on the computer, hook up a printer and load a program from DOS.

In the early 90s, the manuals were plain paper, smaller, and had instructions on how to use a mouse, and explained what a window is. Plus the mail-ins.

Mid-90s (CD-ROM "multimedia machines") you got a sheet of paper which told you to load the interactive tutorial from the included CD.

Late 90s you got 5000 hours of AOL. Plus another CD filled with co-branded crapware like CorelDraw Lite for Dell.

2000s+ crapware pre-installed, driver CD and a warranty card.

So really, the time period with the included programming manual was just a few years. And mostly all you did is print Hello World over and over again on the screen. So don't be too jealous.


Yeah. At least you got a good MSDN CD in 1999 with tons of example code and all the info you'd want on Windows.

Now we get: {{ Fill in the Description }}

https://learn.microsoft.com/en-us/powershell/module/storageb...


Good programming manuals delivered with the computers and with the compilers/interpreters existed for about the entire time MS-DOS was dominant, i.e. from the launch of the IBM PC in 1981 (which always had things like a commented BIOS listing, which was very instructive, and detailed documentation of all its hardware peripherals) until the mid nineties, i.e. until Windows 95.

Until the early nineties, the compilers and interpreters from companies like Borland and Microsoft came with big excellent programming manuals demonstrating how to use them.

Also any complex commercial application for MS-DOS, e.g. AutoCAD, Lotus 1-2-3, the BRIEF editor for programmers etc., would have voluminous manuals, including sections on how to write scripts in whatever embedded scripting language they were using.

Only for the users of pirated copies of MS-DOS, compilers etc., the access to manuals was more difficult and some of them may have even not been aware of what manuals were normally available for the legitimate owners. Most IBM PC clones also did not have much documentation delivered with them. Since they were made to be compatible with IBM, it was supposed that anyone who needs them will buy the original IBM manuals.

Since Windows 95, the vendors of hardware PC peripherals have stopped providing documentation for them, providing closed-source Windows device drivers instead, but before that, whenever I was buying some PC add-on card, it typically came with a manual providing enough information about control registers etc., that I was able to write an MS-DOS device driver myself, if necessary.


The pure breed hacker just published source code in a 100 MB PDF.

Yes, it's called pulling the ladder up behind you. I don't think "he was a hacker" mitigates anything whatsoever.

Meh, I don't prefer OSS. I prefer tools that work well, whatever they may be. For a long time, that was Windows. Microsoft went to hell, so now it's Linux. I'll happily use commercial solutions so long as they're good.

In "In the Beginning was the Command Line" Neal Stephenson used a car analogy to describe consumer operating systems, I always thought his analogy was pretty apt:

To paraphrase him a little bit:

Microsoft sells Family Station Wagons. Spare parts are cheap and plentiful, and if they break down there is a huge network of dealerships with mechanics on staff.

Apple sells Luxury Sedans - nicer to drive than the station wagons but spare parts are uncommon and the oil changes are expensive.

Linux is represented by a group of volunteer hackers organized by consensus giving away tanks for free, made from sophisticated space-age materials.

The observation he makes is 90% of people go straight to the biggest dealership and buy a station wagon without ever looking at any of the other options. They will make a bunch of excuses like "I Don't know how to maintain a tank" and get angry when told "You don't know how to maintain a station wagon either", in the end their argument boils down to "can't you see everyone else is buying a station wagon"...


Yup! Which is why I use Linux but you better believe I've got Sublime Text installed (and licensed!)

It’s also kind of difficult to hate on a guy that devoted his remaining decades to literally saving tens of thousands of lives around the world.

It's very easy to hate on him for that very reason. He's just buying a good reputation for the fraction of his wealth that is completely insignificant.

If I could buy that kind of reputation by tossing a few coins into the void, why not? Especially after I've stolen billions from others.


It's possible. He is following the same tactics as when he was head of MS [1].

1. https://www.wired.com/story/opinion-the-world-loses-under-bi...


[flagged]


Yeah, corporations have the resources to do that kind of investment in Linux which random hobbyists don't.

But why do they do it in the first place, instead of investing in their own obviously superior, massively-invested-in OSes? Because Linux IS better, and the whole idea of it is better than some closed source crap. By nature of the GPL license it will snowball and everyone else will be left behind.


This is wrong, Linux had plenty of momentum before RedHat specifically was purchased by IBM.

I'm sure that helped its momentum in the corporate space, where it was already very present, but the whole family of Linux was very well established in servers, firewalls (more BSD than Linux here), mobile devices, embedded hardware, etc


> This is wrong, Linux had plenty of momentum before RedHat specifically was purchased by IBM.

I’m not defending their overall argument, but I don’t think they are talking about the 2018 Red Hat acquisition, rather IBM’s 2000 announcement they were investing a billion dollars in Linux: https://www.cnet.com/tech/tech-industry/ibm-to-spend-1-billi...

IBM has been a big contributor to Linux long before buying Red Hat


I don’t have an opinion on the issue, cause it’s outta my wheelhouse. Well, aside from civility, which you need to work on.

What stands out to me about Gates and Allen is the serious technical chops: writing an 8080 emulator on the PDP-10, and then an interpreter, line editor, and I/O system all in 4KB of memory. The code is worth reading, and in addition to that they had a very solid business sense and a pretty serious work ethic for people who were 20 years old.

It stands in real contrast to the "fake it till you make it", "if it works you shipped too late" hustle culture that took hold of the industry, with entire products just being API wrappers. I really hope we see more companies that start out like Microsoft again.


> It stands to me in real contrast to the "fake it till you make it"

They are the all-time greatest in fake-it-til-you-make-it. They got the IBM PC OS contract without having an OS, which they bought from someone else (iirc).

> What stands out to me about Gates and Allen is the serious technical chops: writing an 8080 emulator on the PDP-10 and then an interpreter, line editor, and I/O system all in 4KB of memory.

Is that really so impressive? Everything then was in 4K; every coder worked under that constraint.


To be fair, they definitely faked it: they said they had source code for a program they hadn't even written yet! They were just also very serious about the "making it" part.

True, but "fake it and then immediately proceed to make it" is definitely more appreciated than just burning through deals by lying for a long time, which "fake it till you make it" usually boils down to.

IMO although it was complex, the human brain could still manage the complexity back then. Reading Woz's autobiography, it feels he knew what every logic gate on the original Apple computer did.

The PDP-10 probably worked at "human speed" too...


I thought they started by writing traffic control software. Where's that source code? :)

The fact that Microsoft has a $2.77 trillion market cap despite being terrible at virtually everything it tries to do proves large swaths of the economy are fake.

> despite being terrible at virtually everything it tries to do

Oh, MSFT ain't even "terrible" compared to some other players. Try Salesforce. Or ADP. Or even Atlassian. I can't believe we're actually paying money to use them and OMG, the software... I feel like when going to conferences, I'd be like that guy from the cigarettes ad in Idiocracy https://www.youtube.com/watch?v=OzUcoZdfCOY ... "You work there? Fuck you!" :)


I really don't like Microsoft products (notable exceptions include F# and Age of Empires). But they are really good at getting companies to spend large amounts of money on their products. Slack is strictly better than Teams, but a company that already has Windows, Outlook, and Office really needs a good reason to spend $20/user (or whatever it is) for Slack over Teams. I don't think Azure is as good as AWS or GCP, but for a lot of businesses it's "we are already on Azure with Office 365, so why not?"

> F# and Age of Empires

;-) I have never disliked MS games, or Xbox, or Game Pass.

I also dislike Teams, but Microsoft has integration, which means that it works with Outlook's calendar, with Office documents, etc. It's mediocre but full-featured.

I wonder what would have happened if Google Docs had evolved into a credible MS Office competitor? It's also amazing that Skype (and Hangouts/Meet for that matter) had such a head start over Zoom.


Google Docs is a competitor, but that doesn't necessarily mean it can take significant market share from Microsoft, especially among customers deeply embedded in the Microsoft ecosystem.

The reverse is also true: companies that are heavily invested in Google Workspace, GCP, and related tools are unlikely to switch to Office 365.

That said, there are exceptions. Legal professionals, for instance, often require the standard: Microsoft Word. And for advanced tasks, Google Sheets falls short of what Excel can do.


> terrible at virtually everything it tries

Microsoft things I think are pretty OK and don't really mind using:

Xbox, especially Game Pass; Azure; BASIC (particularly classic Microsoft BASICs and SmallBASIC)

Microsoft things that I think are not completely terrible and sometimes kind of useful:

Hyper-V; WSL; VSCode; C# and .NET; Visual BASIC; Excel and PowerPoint


Imho it just speaks to the importance of first-mover advantage, land grab, and most importantly distribution, distribution, distribution.

It's not fake, it's reality. And things have always been this way.


I recently left a company that was spending $10 million on Salesforce licenses that apparently no one was using. When the re-org happened, heads rolled.

How common do you think that story is? Overpaying for software that doesn't actually make users more productive?


I don't think so; products are not 100% of a business.

At the end of the day they are pretty good at making money, with ~$100 billion/yr in profits. Their P/E is only 30, which isn't outrageously overpriced.

What strikes you as fake?


Source code published as PDF? Come on, this should be published on Github.

pretty slick

Nice design

I met Bill Gates briefly a few years ago. Nice guy. Definitely buying his book.

The screenshot of the source code at the end of the article shows a ton of printed code.

How was it entered into the Altair, then? Did someone have to retype it? Or was there some storage medium that predated floppies?


It was distributed on paper tape. You needed a teletype with a paper tape reader to run it. Basically you would manually enter a bootloader using the switches on the Altair's front panel, and the bootloader would read BASIC off the tape and into RAM. If the checksum passed, it would then jump into BASIC. Here's a video of the process if you're interested: https://youtu.be/TxU_3dEJ2nM?t=1013
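
For the curious, here's roughly the shape of such a hand-toggled loader in 8080 assembly. This is a minimal sketch, not the actual MITS loader: the port numbers, the status-bit polarity, and the addresses are all assumptions.

  ;MINIMAL SKETCH OF AN ALTAIR-STYLE BOOTSTRAP LOADER (DETAILS ASSUMED)
  ;TOGGLED IN NEAR THE TOP OF A 4K MACHINE; BASIC LOADS FROM ADDRESS 0
          ORG     0F80H
  START:  LXI     H,0         ;HL = LOAD ADDRESS FOR BASIC
  WAIT:   IN      0           ;READ SERIAL BOARD STATUS (PORT 0 ASSUMED)
          ANI     1           ;TEST RECEIVE-READY BIT (BIT AND POLARITY ASSUMED)
          JNZ     WAIT        ;NO BYTE YET, KEEP POLLING
          IN      1           ;READ ONE BYTE FROM THE TAPE READER (PORT 1 ASSUMED)
          MOV     M,A         ;STORE IT AT (HL)
          INX     H           ;ADVANCE THE LOAD POINTER
          JMP     WAIT        ;REPEAT; THE REAL LOADER ALSO COUNTED BYTES,
                              ;VERIFIED THE CHECKSUM, THEN JUMPED INTO BASIC

Even a dozen instructions like these had to be deposited byte by byte with the front-panel address and data switches, which is why the bootstrap had to be as tiny as possible.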


Paul Allen entered it in front of the customer for the first run

https://paulallen.com/Futurist/Microsoft.aspx

I expect it was distributed on tape as well.


"he’d forgotten to write the bootstrap loader" He didn't load the whole program from the switches on the face, just the bootstrap that would let them feed the paper tape through the teletype/paper tape reader that was common at the time. It would take a very very long time to load the whole program by hand. See this video of a demo. https://www.youtube.com/watch?v=TxU_3dEJ2nM

Damn, that's a crazy process -- thanks for the video link!

Love how absolutely engorged and broken this web page is to dramatically depict a style that - were the article actually just published in plain text - would be what... a millionth the size? Should have known better than to be surprised that the "source code" one can "download" and "look through" is in a goddamned PDF.

I do truly wonder if the fact that he was publishing a PDF as downloadable "code" even caused him any pause lol.


Shipping highly optimized assembler for a program made to work on computers with 4KB RAM as a ~100 MB PDF is quite the flex.

I must admit that while it's computationally quite wasteful, the web page does look quite neat.


Regardless of what anyone thinks of the website, it's likely that the only way the code exists is that ream of paper. While Bill Gates could easily have bought an OCR reader to make a text file of it with the loose change in his couch, I don't think it's entirely unreasonable to just scan it in and provide that scan.

The article rendering hurt my eyes, and then it was a pdf of the source code! :-(

If only Microsoft owned a place to post source code...

That would be either OneDrive or for the real l337 adminz: B:\

Git is for Linux and other cancers.


OneDrive? Look at mister corporate moneybags here. Sharepoint!

SharePoint is where the real money and fun stuff is at.

How do you think the likes of Delta and McDonalds manage their intranet and document storage? OneDrive is just a glorified SharePoint feature.

P.S. Joking only partially, and not much at all.


Yeah, it's pretty awesome, right?

REAL windows enterprise companies worth their salt use a shared drive on \\global.


You can’t even use reader mode on the site because of the text effect. It will cut off after the first few paragraphs since the others have the effect applied.

There's something rather cringeworthy about the heavy and painful animations etc. on this website trying to create a 1970s computer technology vibe but instead just giving me a headache. I'd much prefer the same information, and the same vibe, delivered with much less fancy, lightweight, easy-to-read web tech that actually simulates an authentic 1970s experience (I remember that era well! I'm an 8080 programmer myself from way, way back).

I thought it was pretty neat and think they did a good job of creating that vibe. I have fond memories of that time and the computers and the electronics magazines.

As for the heaviness of the page… My 8 year old iPad loaded it just fine, so it couldn’t have been all that heavy.


The page design is distracting and makes it hard to browse. Pressing the Page Up/Down keys does not work! Such a design is not UX friendly.

I tried viewing it on a Windows 10 machine connected to a physical keyboard. Even scrolling with the mouse feels laggy: you have to wait for the animation to play before you can read.

I spent hardly a minute reading the top before jumping back here to make this comment, something I have never done before.


I had the same reaction to the site - but I could've been won over if there had been a link to E1ite and C@@L BASIC source for the effects (at least the text effects, which could've fit in 4K).

Steve Jobs quote: "The problem with Microsoft is that they just have no taste."

But I actually would prefer the pre-XP Windows desktop to Apple's flattened UIs of today.


To be fair Jobs is dead so his ability to veto UI changes is limited.

Bummer.

The animation is very reminiscent of Sneakers - wouldn't be surprised if that was the inspiration for it. It's a little distracting, but pretty cute imo.

But totally Microsoft, ain’t it? Elegance was never their thing.

So you had to be "that" guy. I think it looks pretty cool.

This is HN. I would be surprised if that guy was not here. ;-)

I think the guidelines actually say not to post comments criticizing the website layout, etc.

This website is the biggest missed opportunity to use win98.css ever

I had never heard of this, but its description in its Git repo is what I hope and dream for any time I look at a project that has (or relates to) a GUI. Reference: https://jdan.github.io/98.css/

Guys, even reading this article could land you in jail!! Reading the code will forever taint your knowledge and cause every line you write to be subject to a lawsuit !! Stay safe !11

(Anyone else remember 2004, how scared everyone was when the Windows 2000 source was leaked?)


It's like how you see blogs with "not my company's words" and comments online with "I'm not a lawyer" disclaimers. They serve no purpose other than telling you that person has a miscalibrated sense of risk.

> and cause every line you write to be subject to a lawsuit

See: Oracle v Google.

> Anyone else remember 2004

Remember John Ashcroft? The legal system was not as sophisticated then as it is now and juries were unlikely to penetrate even the basic issues of a case.


[flagged]


I wouldn't exactly use the word evil, but I do remember a time when desktop hardware and software were not so massively dominated by one or two companies. I could buy a 386 or 486 computer from any number of vendors, buy expansion cards (graphics, sound, MIDI, etc.) from various other vendors, buy hard disks and optical drives from yet more vendors, and even buy DIMM memory modules from yet more vendors, and put it all together myself. Yes, the machine would run DOS or Windows, but most software outside the Office suite came from a variety of vendors (remember Norton, Borland, Corel?).

Not blaming MS per se (many of my examples above are hardware), but the type of "consolidation" that companies such as MS engaged in killed a lot of small and medium computer hardware and software businesses.


I don't see how they're worse than Nvidia, Broadcom, or Intel. At least you can remove Windows from a computer.

Do they still do EEE? I'm not a huge fan of MS, but I haven't really heard of any EEE stuff in quite a while.

Most Linux development is corporate now, WSL makes Linux easy to "use" without ever leaving Windows, and the lock-in effect if you are using Office/Azure/Teams/BI/etc. is almost perfect. You can't leave it, basically. It's easier to start a new subsidiary from scratch using something else than to try to migrate off the Total Microsoft Stack.

Office has a port to Mac that is perfectly fine. Teams has a port to both Mac and Linux. Azure is a cloud service, but most of its development tools that I've used had Mac clients. I don't know anything about BI so I can't speak to that.

Office even has a web version that generally works fine; I ran it in the Brave browser on Linux last week. Teams in the browser also works fine; I use it to talk to my parents.

I don't think your examples are good on this.


The client almost doesn't matter anymore. The real lock is on the server side, at least if you are a company.

It feels like you shifted the goalpost, since you were initially complaining about WSL making it easier to stay within Windows.

Even still, I don't know that I agree with your updated point. I've imported docx files into Google Docs, LibreOffice, Pages, and OnlyOffice. There are varying levels of success, but generally they all work fine. It's really not that hard to migrate from Azure to other platforms.

Even if I granted the lock-in here, I'd argue that it's different than the EEE thing that Microsoft is infamous for. I'm not a fan of vendor-lock-in either, but it's different than actively trying to kill standards.


What an ugly blog site: everything covered in bells and whistles showing off FrEaKiNg CoOl WeB TeCnOlOgIeS, unopenable in reading mode, unselectable text, everything jumping around. The whole page is like something out of a madhouse.

Did they make it specifically so I have to take funny quotes out via DevTools?

>Five decades later, Microsoft continues to innovate new ways to make life easier and work more productive. Making it 50 years is a huge accomplishment, and we couldn’t have done it without incredible leaders like Steve Ballmer

Ahahahahah, 50 years of innovation in telemetry; now the PC doesn't work without the internet. Really funny, Bill, keep up the good jokes!


> Celebrate 50 years of Microsoft

Maybe vomit. So many days lost trying to use Windows, Office, and other "apps"[1] from Microsoft.

[1] They were never able to write programs.


It's written for people who know nothing about computers, but most of the people who will read it know loads.

Why do I need to enable JS to view this website?

Since the site is an art project and not a site tuned for pure functionality.

I've seen many an art project that eventually stops being updated and is used to serve up malware -- sometimes with a bonus expired or nonexistent cert.

It should never be a requirement to enable JS to download a binary file like a PDF.

If you're concerned about scraping, put in a robots.txt and/or give it to an entity like the Internet Archive to host.


All nice and good, but Bill Gates decided he wanted some fancy visual effects there. If you visit it with a JavaScript-enabled graphical browser (I haven't checked accessibility etc.) or read through some of the discussions here, you will notice that functionality isn't the purpose of the site. So yeah, for a company that wants to do business it wouldn't be right, but this is some retired dude doing something that's fun to him (of course he didn't implement it himself, he hired some agency).

Cool, Bill. But do you have what it takes to fix the OneDrive shared folder bug that has been open for more than a year?

Have you tried emailing random people who appear in the Windows 3.1 development team credits page? Maybe Daniel Stenberg, he definitely wrote some of the code that goes into Windows!

Gates pivoting back to being a "computer genius" reflects how badly his philanthropic reputation laundering operation is going.

Microsoft got its start with Bill Gates doing some dumpster diving. Back then software wasn't seen as a valuable thing; only hardware was. Source code wasn't something to be protected, so printouts of code would be thrown in the trash. And that's where Bill Gates found the source code for a BASIC interpreter, which he ported, and it became the first Microsoft product.

https://americanhistory.si.edu/comphist/gates.htm

https://paulallen.com/Futurist/Microsoft.aspx


> "...so printouts of code would be thrown in trash. And that's where Bill Gates found the source code for Basic interpreter, which he ported and it became the first Microsoft product"

Both sources you link to say Allen and Gates pulled listings of the PDP-10 operating system (probably DEC's TOPS-10?) out of the trash. BASIC is not an operating system. So your claim is debunked by your own sources.

"...digging out the operating system listings from the trash and studying those. Really not just banging away to find bugs like monkeys[laughs], but actually studying the code to see what was wrong."

https://americanhistory.si.edu/comphist/gates.htm

"...He and Bill would go “dumpster diving” in C-Cubed’s garbage to find discarded printouts with source code for the machine’s operating system..."

https://paulallen.com/Futurist/Microsoft.aspx


And Apple stole a UI from Xerox PARC. OpenAI stole everyone's content.

This is how the industry innovates.


This is a myth. Jobs negotiated access to PARC technology as part of a deal in which Xerox bought shares in Apple at $10/share[0], selling about a year later at $22/share. Those shares would be worth around $5 billion today.

Xerox did later sue Apple for IP infringement; however, most of their claims were dismissed[1].

[0] https://web.stanford.edu/dept/SUL/sites/mac/parc.html

[1] https://arlingtonmnnews.com/articles/bits-and-bytes/xerox-ve...


> Xerox bought shares in Apple at $10/share[0], selling about a year later at $22/share.

> [0] https://web.stanford.edu/dept/SUL/sites/mac/parc.html

I searched the cited page for 'share', '10', '22', 'sold', 'sell', 'bought', 'buy', and 'purchase', and found nothing. ?


Apologies, I was juggling multiple sources. The Xerox VC investment in Apple is a matter of public record; the figure of $10/share is widely quoted, including in the Walter Isaacson biography of Steve Jobs[0].

Exactly how and when Xerox disposed of its shares is not public record, but it's known to be around that timeframe and certainly Xerox made a profit. The book _Dealers of Lightning_ goes into more detail about the deal if you're interested[1].

[0] https://www.cnbc.com/2018/05/21/why-your-computer-has-a-mous...

[1] https://www.goodreads.com/book/show/1101290.Dealers_of_Light...


Now AIs are stealing from AIs.

[flagged]


That article is a bit confusing because it's using the term "BASIC" to refer to both the language and Microsoft's implementation. But what it's trying to say is that Microsoft's BASIC implementation was licensed by many computer companies (including Commodore and Atari) and that those companies changed and extended it in incompatible ways.

[flagged]


Bill Gates did not write it by himself; Paul Allen and Monte Davidoff also worked on it. And they did not have a finished product after 8 weeks -- only a demo. The first commercial release was "version 2", half a year later.

Podcast with Monte Davidoff

https://floppydays.libsyn.com/floppy-days-113-monte-davidoff...

Starts after about the first 15 minutes.


How would Bill Gates copy source code from a 36-bit minicomputer with 32 kilowords (no byte addressing) of memory and a time-sharing operating system to an 8-bit microprocessor with a completely different instruction set and 4 kilobytes of memory and no operating system, just bare metal? Even if he and Allen had had the source code for BASIC-10, which you haven't provided evidence of, it would be closer to a reimplementation than a port.

And DEC was in Massachusetts, Bill Gates went to high school in Washington. That would be one hell of a road trip to dig into DEC's trash.


I think it was C-Cubed's trash, but it was DEC's IP. See: https://www.theregister.com/2000/06/29/bill_gates_roots/

None of us write programs from first principles; it's all based on code we've read before. If I were going to write a BASIC interpreter, I'd read up on the basics of interpreters (literature which would include sample code) and look at other interpreters' code.

No matter where you think the code came from, the impact of Microsoft BASIC was huge, and they were first to market.


BASIC was " BASIC, developed at Dartmouth College, was initially designed for and ran on a GE-225 mainframe computer paired with a Datanet-30 processor, which handled communications with Teletype terminals. " I got into the game on HP BASIC, also with teletype ASR-33s, I was only 9.

That's not what that says at all. It says that the language was slightly different depending on the platform.

Microsoft BASIC wasn't the first BASIC interpreter, which is a different claim than saying the Microsoft BASIC source was copied from another interpreter.


What part of that paragraph you quoted suggests that Microsoft BASIC wasn't original work?

Those were their own ports, as per the page you just linked. They developed Microsoft BASIC.

"The Altair BASIC interpreter was developed by Microsoft founders Paul Allen and Bill Gates using a self-written Intel 8080 emulator running on a PDP-10 minicomputer."


Gates and Allen wrote and copyrighted the first Microsoft BASIC, and the DEC-10 8080 emulator needed to run it (I've written one of these -- a bit later, as it happens).

Allen wrote a loader (in machine code) for it on the aircraft flying down to sell it to MITS, the Altair's maker.

Whatever you might say about them, they were not dim.


They were not dim, but Microsoft copied a lot, and didn't innovate. This aspect of Microsoft hasn't changed.

In the 1990s, during the competition between Microsoft and Sun Microsystems, Sun's CEO, Scott McNealy, compared Bill Gates to Ginger Rogers. This analogy suggested that, like Rogers, who danced everything Fred Astaire did but backward and in high heels, Gates was adept at following and adapting competitors' innovations. This comparison was part of Sun's broader critique of Microsoft's business practices at the time.

"It has been noted that everything Astaire did, Rogers was able to do -- backwards and in high heels. That's high praise for the nimble Ms. Rogers. But for a would-be visionary, following someone else's lead -- no matter how skillfully -- simply doesn't cut it."

https://web.archive.org/web/19991013082222/www.sun.com/dot-c...


Yes, well Scott McNealy will never be my idea of a brilliant man. Or Sun of a particularly good company - where are they now?

I remember starting at one investment bank I worked for:

IT tech: Would you like a Sun workstation?

Me: Nope, I would like a top of range Windows PC, with two or more screens.

IT tech: Yeah, OK, all the traders say that too. We're throwing those Suns in the dumpster.


Sun made incredibly good hardware and software. They were incredibly good technologists, responsible for lots of innovations, but they were bad at business. So in that sense they were the opposite of Microsoft.

Some quite good hardware, I must admit - their servers were good. Workstations less so, and ludicrously expensive for what they were.

Just yesterday I personally witnessed pallets of Sun/Oracle equipment being unloaded. I’ll admit, it made me nostalgic!

They're still out there. Maybe not visible to normal folks, but I know for a fact that until very recently the Chicago Mercantile Exchange used their hardware in great quantities, maybe even as the underlying hardware for their matching engines, though I admit this is conjecture on my part. They don't exactly let exchange customers in those rooms!

I miss their 10k & 15k chassis. Solid kit for their day.


The spiritual successor for Sun machines is Oxide (lots of ex-Sun folks). And Sun got acquired by Oracle, so it's still technically around on the software side via VirtualBox and Java.

That's the point though.

What's left of Sun is basically a startup founded by a few ex-employees, some open-source software, and the rest of their IP being milked by Larry Ellison.


Neither SunOS nor Solaris was open source, or based on open source.

Wasn't SunOS essentially a flavor or distro of Unix?

I'm not talking about SunOS or Solaris. I'm talking about Java, DTrace, OpenZFS, and various other random bits of Sun legacy still floating around in modern open-source systems.

I love Oxide's podcast. I've checked its careers page a few times, but they are only hiring for field sales.

And I should have said (and did say) "With a Kingfisher X server installed and configured".

"This aspect of Microsoft hasn't changed." Now that is quite a dig, but I am going to have to completely agree, until they got Coulter but after that it is pretty much Microshaft.

Seems that Ginger got the last laugh though.

When I look back at that era now, I am amazed at how Gary Kildall failed to capitalize on his amazing position as the creator of CP/M, which was the dominant 8-bit OS and ran on numerous popular processors: the 8080, the Z80, and later (as CP/M-86 and CP/M-68K) the 8086 and 68000. When IBM entered the PC market, Kildall and IBM could not come to an agreement, so MS stepped in and licensed, then purchased, an imitation of CP/M called 86-DOS, which IBM offered as PC DOS. Kildall's company created an 8086 OS called CP/M-86, but it was more expensive than IBM's PC DOS and never took off. IBM did not want the liability of having contested code, so they let MS hold that bag, and the rest is history.

I couldn't find the precise reference saying they found the source code for the BASIC interpreter and just "copied/ported" it. I did read that they'd go "dumpster diving" to learn assembly, but not that they found and simply ported the source code. Where is it?

I think it comes from a misreading of the text in the Gates interview linked in the comment:

"r. We were moving ahead very rapidly: BASIC, FORTRAN, LISP, PDP-10 machine language, digging out the operating system listings from the trash and studying those. Really not just banging away to find bugs like monkeys[laughs], but actually studying the code to see what was wrong."

My understanding is that they saw the source implementation of other BASICs (on mainframes, or whatever they were called at the time), but their code is mostly their own. Few if any programmers spring fully formed from the head of Zeus (although Paul Allen was close), and plenty of valuable intellectual property was originally created elsewhere.


"The listings evidently included Basic for the PDP-10, but it was Allen who did the Assembler programming to simulate the Altair, while Gates, Monte Davidoff and later Allen worked on a Basic interpreter for the machine."

See https://www.theregister.com/2000/06/29/bill_gates_roots/


"Just porting" is doing some seriously heavy lifting, if it's referring to porting something from a mainframe to one of the micros of the day.

Don't forget the infamous Open Letter to Hobbyists that followed:

https://en.wikipedia.org/wiki/An_Open_Letter_to_Hobbyists


One minor thing to consider is that hobbyists weren't distributing the source code (as posted in the OP) but trading the paper tape of the executable interpreter. They wanted the interpreter so they could write their own software, which was probably unrelated to BASIC itself; BASIC was just a means to an end.

The industry pretty quickly moved to incorporate BASIC in ROM on many platforms, and Microsoft was able to capitalize on that integration through licensing. I don't think his letter did much other than antagonize hobbyists, but they made a lot of money licensing to the hardware manufacturers later on (and the hardware was truly more valuable with BASIC on board).

(One of my all-time favorite computers from that era is the TRS-80 Model 100. I don't remember if Microsoft provided its entire software stack, but I believe it was the last product whose software development Bill Gates personally contributed to.)


According to Gates, he wrote the Model 100's software himself. It was indeed his final major software project as a coder.

Licensing programming tools was staple MS, since it also provided lock-in. The letter comes off as the complete opposite of the open-source approach.

And he won that argument. The steady movement away from Free Software licenses toward shared source is because developers want to get paid by the people using the code they created, just as Gates describes in the letter. Even Bruce Perens is trying to hammer out a Post-Open Source license that's proprietary in all but name.

For his goals at the time, but not really in the long run. Open development ecosystems like Rust's are thriving far better than any closed ones.


