God, here we go again. Yet another example of companies stealing from Apple. This is clearly similar enough to the Macintosh (1984) that I'm surprised Apple didn't sue the hell out of AT&T. They certainly sued the hell out of Microsoft[1], so AT&T must have been lucky.
Around the same time, Myron Krueger had the gall to demonstrate pinch-to-zoom[2], nearly 23 years before Apple patented it. So many people piggy-backing off of Cupertino's innovation :/
Development of the Lisa started in 1978 and it was released in 1983. The Blit was started in 1981 and released as the 5620 in 1984, the same year as the Mac.
So unfortunately the facts render your sarcasm rather hollow.
Anyway, the Blit doesn't infringe on Apple as it doesn't include any of the things Apple did genuinely develop independently - like pull-down menus, resizable and movable windows, overlapping windows, directly manipulable file and document names, desk accessories, control panels, internationalisation, multiple views of the file system, and drag-and-drop file manipulation. So in comparison to the Apple tech of the time it's so primitive it wouldn't be worth it anyway.
I'm fairly confident that many (most?) of the GUI elements you listed were invented by Xerox and willfully copied by Apple. Certainly stacking resizable movable windows, pull-down menus, and manipulable desktop items.
The grandparent to your post says "Apple did genuinely develop independently". Not "Apple did legally license from Xerox in exchange for stock". So the point still stands.
The amount of misinformation surrounding how much of the modern computing experience Apple independently invented is also dizzying.
The difference is that AT&T has several patents covering the Blit. It's harder to sue someone for stealing your idea when they own the patent, not you.
I'm surprised AT&T didn't sue the hell out of Apple, because the Blit is clearly older than the Macintosh.
The Blit is actually from 1982, as shown in the video at 3:50 and in the copyright notice a few seconds later.
The Wikipedia article also gives 1982.
> When did Apple patent pinch-to-zoom? This video is from 1988.
The video is from 1988, but it appears the work was around 1983. The pinch-to-zoom (on 'portable communication devices') patent[1] was filed in 2006, issued in 2010.
If that is the case, what about the Jeff Han video which came out in 2006? Jeff Han was obviously working on it for at least a year before he gave that talk. Shouldn't that be obvious prior art? He talks about pinch and much more!
The patents didn't cover the touch-screen sensors. The sensors and their drivers (not covered) just give you points corresponding to fingers and pressure. The same goes for this touch table.
You could take the exact same software written for the touch table and feed it input from the (again, not patented) touch sensor. Under the ruling, this would be found to infringe.
Think about that: software written before the patent in question, fed data from a different device, and it's infringement. It would be like me filing a patent for elements of the standard desktop GUI running on an LCD instead of a CRT and suddenly getting a free pass against all prior art.
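To make the layering concrete, here's a rough sketch in C (all names here are made up; this is not taken from any actual driver or from the patent): the gesture code only ever sees abstract points with pressure, so it doesn't care whether a capacitive phone screen or a camera-based touch table produced them.

    #include <math.h>
    #include <stdio.h>

    /* What a (hypothetical) sensor driver hands up to the software layer:
       just points with pressure, nothing gesture-specific. */
    struct touch_point {
        float x, y;
        float pressure;
    };

    static double finger_distance(struct touch_point a, struct touch_point b)
    {
        return hypot(a.x - b.x, a.y - b.y);
    }

    /* Toy pinch detector: returns >1.0 when two fingers spread apart and
       <1.0 when they pinch in. It never asks which hardware produced the
       points, so a phone screen or a camera touch table could feed it. */
    double pinch_scale(struct touch_point prev[2], struct touch_point cur[2])
    {
        double before = finger_distance(prev[0], prev[1]);
        double after  = finger_distance(cur[0], cur[1]);
        return before > 0.0 ? after / before : 1.0;
    }

    int main(void)
    {
        struct touch_point earlier[2] = { {10, 10, 1.0f}, {20, 10, 1.0f} };
        struct touch_point later[2]   = { { 5, 10, 1.0f}, {25, 10, 1.0f} };
        /* Distance goes from 10 to 20, so this prints 2.00 (a spread/zoom). */
        printf("scale: %.2f\n", pinch_scale(earlier, later));
        return 0;
    }

Whether code layered like this, written before 2006 and fed a different sensor's points, should count as prior art is exactly the question above.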
Ah, early GUIs. For comparison, take a look at Xerox's Cedar and Smalltalk, or Wirth's Lilith and Oberon, all from pretty much the same era.
The Lilith systems are often overlooked. They predate the Blit and are programmed in Modula-2, compiled to bytecode. Oberon is a bit better known, but still not as much as both the language and the OS deserve.
Which in turn took quite some influence from Cedar/Mesa at Xerox. But one might as well say that KDE 4.9 (2012) is based on MacOS (1984); that doesn't make it an "early GUI" in my eyes.
The point is that this is an editable document serving as a user interface. If I remember correctly, some people once envisioned that for the web (cf. Amaya[1]), but it never really took hold.
That's a smart dude; I'm surprised I hadn't heard of him earlier. But I can't help but think he would seem twice as brilliant if he had a neckbeard :)
"Unix compilers are slow. So to entertain myself while I'm waiting, I can play asteroids! You see? Compiler errors print out even while asteroids is running!" (2:05)
This was pretty revolutionary at the time, but for some reason those lines made me crack up. Also:
"Is graphics good for anything other than playing games?" (2:19)
The question sounded sort of facetious then, as it does now, but for different reasons.
I was also entertained by the line towards the end of the video: "I've always been able to think about multiple things at once but the terminal held me back."
Nowadays it seems like we all very much doubt that was ever true.
If by your last line you mean "everyone works in terminal emulators all the time", that's not what he was going for. The intent of the statement you quote was that previous terminal hardware, such as the VT-100, gave you access to exactly one program at a time. GNU Screen wouldn't be invented until 1987, and I haven't heard of any earlier terminal multiplexer.
Sorry that was unclear. I was referring to the modern backlash against multitasking. Specifically, I'm sure that I can't think productively about more than one thing at a time.
There's no question, though, that this system was a technological achievement, and that having access to more than one running program at a time is a good thing.
I had one of these on my desk in 1984. It was very usable.
As for having more than one on a desk, you'd better have a really well-built desk. As I recall, each one weighed about 75 lbs.
I could work on this system. Give me a couple of terminals with vim, and a web browser in a different layer (of course the system predates the web), and I could do 90% of what I'm doing today to get work done.
Except for the green tinge, that's pretty much what my setup looks like: borderless terminals in a tiling window manager with vim and assorted CLI programs. Except, of course, for Iceweasel, because there's only so far w3m will take you (not very).
Well, if you're not that tied to terminal apps, there's always xombrero, jumanji, dwb or luakit.
(Rob Pike himself is pretty proud that he never wrote a program with cursor addressing; there are definitely differences between modern Linux "minimalists" and proper 9fans.)
I remember playing with one at Bell Labs in '82 or '83. The mouse was gigantic but looked AWESOME (black orb with red buttons). It was slow, especially when spawning a new layer; I'm not sure whether the commercial version had the same issue. Favorite thing: the "wait" cursor was a cute little coffee cup with steam rising from it (i.e. "this is going to take a while, so grab yourself some coffee").
A piece of humor: the graphics workstation for Plan 9 was a 68020 board put into a 630/730 MTG chassis, but with a DMD5620 keyboard. People would walk up to it and ask if it was a 630. To which the response was: it's not. It was later simply named the Gnot.
The Blit even had some capability for terminal-side software, so it wasn't entirely a "dumb terminal" [1]. Once you logged in, the host could upload code that would run on the terminal during the session but would be gone after a power cycle. In a sense you could compare it to a web browser of today.
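A loose sketch of the idea in C, not the actual Blit download protocol (the function and framing here are entirely made up): the host streams a program image down the line, the terminal stashes it in RAM for the session, and nothing survives a power cycle because it never touches anything but RAM.

    #include <stdio.h>
    #include <stdlib.h>

    /* Conceptual only: pull a program image from the host over a serial-like
       stream into terminal RAM. The real Blit had its own protocol and then
       jumped into the downloaded 68000 code; this just shows the
       "download into volatile memory" half. */
    unsigned char *download_program(FILE *line, size_t *out_len)
    {
        size_t cap = 4096, len = 0;            /* arbitrary starting size */
        unsigned char *ram = malloc(cap);
        if (!ram)
            return NULL;

        int c;
        while ((c = fgetc(line)) != EOF) {     /* until the host stops sending */
            if (len == cap) {
                unsigned char *bigger = realloc(ram, cap *= 2);
                if (!bigger) { free(ram); return NULL; }
                ram = bigger;
            }
            ram[len++] = (unsigned char)c;
        }

        *out_len = len;
        return ram;   /* lives only in RAM, so it's gone after a power cycle */
    }

    int main(void)
    {
        size_t n;
        unsigned char *image = download_program(stdin, &n);
        if (image)
            printf("downloaded %zu bytes of terminal-side code\n", n);
        free(image);
        return 0;
    }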
I dig the terminology they use. Layer is a great term. Also loved the distinct lack of chrome around layers. The invisible interface is still the future.
I dunno, affordances are pretty nifty. (http://en.wikipedia.org/wiki/Affordance) It's a tradeoff of clutter vs. learning curve. And I have to imagine the future will, as always, be a tradeoff between the two.
It's a great point, and I love a good set of affordances. That said, all it took was seeing the four-finger gestures on the iPad once and I could never forget them. The four-finger pull up for switching and the full-hand movement for closing the app are so natural.
OS X has gotten rid of the little handles on the edge too; I imagine it's all going to disappear.
RISC OS and Arthur (Acorn Computers, UK) had a three-button mouse, and the middle button brought up a local menu for the window under the pointer. I often wondered where that came from...
It's interesting to see that the focus in this video is on showing as much information on the screen at one time, a sort of "swing of the pendulum" that has now gone the other way with mobile UIs such as iOS, Android and (to a lesser extent) Windows 8.
Ah, the AT&T 3B2. I learned heavy-duty assembly language programming on the WE32000; that was a really nice processor. Shame it never took off in the era of the 80286 and 68000...
Heh, maybe you're right. It was the first true 32-bit iron I got to use (halfword address exceptions! woo!), and it also had strcpy() built into a single opcode. I thought that was pretty cool...for 1986....
For one, better monitors, sharp enough that you could actually make out images on them. Basically, CRTs better than those required for TVs.
But the biggest issue was display RAM. To get a decent resolution you need an insane amount of memory, often 50+ KB! And then you need this for every terminal hooked up to the minicomputer/mainframe (don't even think about double-buffering).
This was even an issue for the early text terminals. 72x20 characters was bad enough. Compare this to a line printer, where you just send a character and it stays there, without any backing memory. If I recall correctly, there were some experiments with persistent CRTs, and then of course terminals as we know them, with some memory of their own to store the characters sent, which after a while could even interpret quite elaborate escape sequences (compare sending one "^L" command to sending 80x24 spaces to clear a screen).
Once RAM got cheaper and you had semi-decent monitors, you could build some pretty high resolution monochrome terminals. If you wanted color, too, you had to stay low-res, which is why this was mostly used for gaming, not for actual work. It took quite a while until you could build high-res color workstations, and those were pretty expensive at first.
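To put rough numbers on that (just arithmetic in C; the 800x1024x1 figure is the Blit's own display from the spec quoted further down, and the 4-bit colour row is only there for comparison):

    #include <stdio.h>

    /* Back-of-the-envelope framebuffer sizes, to show why display RAM was
       the bottleneck and why colour pushed you toward low resolution. */
    static long framebuffer_bytes(long width, long height, long bits_per_pixel)
    {
        return width * height * bits_per_pixel / 8;
    }

    int main(void)
    {
        /* The Blit's monochrome display: 800x1024 at 1 bit per pixel. */
        printf("800x1024x1 (Blit, mono): %ld bytes\n",
               framebuffer_bytes(800, 1024, 1));   /* 102,400 bytes */

        /* The same resolution at even 4 bits of colour quadruples that. */
        printf("800x1024x4 (colour):     %ld bytes\n",
               framebuffer_bytes(800, 1024, 4));   /* 409,600 bytes */

        /* And the clear-screen point: one ^L byte versus retransmitting
           a whole 80x24 screen of spaces. */
        printf("blank 80x24 screen:      %d characters\n", 80 * 24);  /* 1,920 */
        return 0;
    }

So the display alone eats roughly 100 KB, which makes the "50+ KB per terminal" complaint above look conservative.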
The Blit* is a graphics terminal characterized more by the software it runs than the hardware itself. The hardware is simple and inexpensive (Figure 1): 256K bytes of memory dual-ported between an 800x1024x1 bit display and a Motorola MC68000 microprocessor, with 24K of ROM, an RS-232 interface, a mouse and a keyboard. Unlike many graphics terminals, it has no special-purpose graphics hardware; instead, the microprocessor executes all graphical operations in software. The reasons for and consequences of this design are discussed elsewhere.
It used the same processor and 'only' twice the RAM.
Even assuming the RAM is the most expensive component, the materials cost no more than roughly twice that of the original Macintosh (about US$2,500 in 1984).
Of course, the Blit is just a terminal pulling applications down from a Unix host over the serial port; when you consider that the Blit is just a display, it does indeed seem a bit pricey.
[1] http://en.wikipedia.org/wiki/Apple_Computer,_Inc._v._Microso...
[2] http://www.youtube.com/watch?v=dmmxVA5xhuo (skip to 4:32)