The article makes clear that the orientation of the lettering has changed over time, which counts against the idea that what it is now necessarily reflects the original intent.
To me the evidence in the article still suggests that “hard correctness” is probably not historically appropriate…hand lettering is not a typeface.
That’s really where I am coming from — the perspective of historical architecture, historical architectural practice, and historical methods of delivering buildings.
In particular, today’s mythological Wright is not the historical Wright of a 1908 commercial jobsite. And the contractual relationships of a 1908 construction project were not delineated the way they are on current construction projects.
And yet the article shows Wright's original sketches for the building, with the asymmetrical H's whose bars align with the bars on the E's (i.e., on the upper half), in virtually identical lettering to what was eventually installed.
I don't really see how you can come away with the conclusion that this suggests a lack of intent. At most, it seems like you had already formed the opinion that there was no intent and didn't find the evidence to the contrary convincing. I don't think your take is necessarily wrong, but I don't think it's fair to characterize the evidence as suggesting what you're saying.
No, there's nothing special about the spec's secure boot variables as far as boot services go; you can modify those at runtime as well. We use boot-services variables to protect the MOK key in Shim, but that's outside what the spec defines as secure boot.
I really don't understand why people keep misunderstanding this post so badly. It's not a complaint about C as a programming language. It's a complaint that, due to so much infrastructure being implemented in C, anyone who wants to interact with that infrastructure is forced to deal with some of the constraints of C. C has moved beyond merely being a programming language and become the most common interface for in-process interoperability between languages[1], and that means everyone working at that level needs to care about C even if they have no intention of writing C.
It's understandable how we got here, but it's an entirely legitimate question - could things be better if we had an explicitly designed interoperability interface? Given my experiences with cgo, I'd be pretty solidly on the "Fuck yes" side of things.
(Of course, any such interface would probably end up being designed by committee and embodying chunks of the ALGOL ABI or something instead, so this may not be the worst possible world, but that doesn't mean we have to like it)
[1] I absolutely buy the argument that HTTP probably wins out for out-of-process.
I don't see that as a problem. C has been the bedrock of computing since the 1970s because it is the most minimal way of speaking to the hardware in a mostly portable way. Anything can be done in C, from writing hardware drivers, to GUI applications and scientific computing. In fact I deplore the day people stopped using C for desktop applications and moved to bloated, sluggish Web frameworks to program desktop apps. Today's desktop apps are slower than Windows 95 era GUI programs because of that.
Ok you're still missing the point. This isn't about C being good or bad or suitable or unsuitable. It's about whether it's good that C has, through no deliberate set of choices, ended up embodying the interface that lets us build rust that can be called by go.
Sure, history is great and all, but in C it's hard to reliably say "define this int as 64 bits wide," because of the wobbly type system. Plus there's the whole historical baggage of not having 128-bit-wide ints, or sane strings (not null-terminated).
> in C it's hard to say reliably define this int is 64-bit wide
That isn't really a problem any more (since c99). You can define it as uint64_t.
But we have a ton of existing APIs that are defined using the wobbly types, so we're kind of stuck with them. And even new APIs use the wobbly types, because their authors didn't use the fixed-width ones for whatever reason.
But that is far from the only issue.
128-bit ints are definitely a problem, though: you don't even get agreement between different compilers on the same OS on the same hardware.
You're still thinking of C as a programming language but the blogpost is not about this, it's about using C to describe interfaces between other languages.
> because it is the most minimal way of speaking to the hardware in a mostly portable way.
C is not really the most minimal way to do so, and a lot of C is not portable anyway, unless you want to go mad. It's just the most minimal and portable thing that we settled on. It's "good enough," but it still has a ton of resolvable problems.
> could things be better if we had an explicitly designed interoperability interface?
Yes, we could define a language-agnostic binary interoperability standard with its own interface definition language, or IDL. Maybe call it something neutral like the component object model, or just COM[1]. :)
Of course things could be better. That doesn’t mean that we can just ignore the constraints imposed by the existing software landscape.
It’s not just C. There are a lot of things that could be substantially better in an OS than Linux, for example, or in client-server software and UI frameworks than the web stack. It nevertheless is quite unrealistic to ditch Linux or the web stack for something else. You have to work with what’s there.
It's a somewhat weird product. There's no real access to any of the hardware that made the Amiga impressive at the time, without an add-on graphics card you're going to have a bad time in X, and it replaces AmigaOS entirely, so you don't have any ability to run Amiga software at the same time (it's not like A/UX in that regard). It's an extremely generic Unix, and I don't know who Commodore really thought they were selling it to. But despite all this it was cheaper than a comparable Sun? Extremely confusing.
Wasn't there some government procurement rule that required any computers they bought be able to run UNIX? At least, that's the reason commonly cited for why Apple created A/UX, their Unix for 68k Macs, originally released in 1989.
Well, that sounds disappointing. These days you're probably better off just running Linux or NetBSD on your old Amigas. But the ability to run true multiuser Unix on cheap desktop hardware was probably immensely valuable to businesses at the time, so it might've been worth it, even if you forgo much of the Amiga's Amiganess. The Tandy Model 16 family was not an Amiga by any stretch, but they had 68000 CPUs and were Unix-capable in the form of Xenix. So I'm guessing they ran a lot of small-business back-office stuff until well into the '90s, despite first coming out in 1982.
Atari Corp was doing the same thing around the same time as Commodore was, with their own branded SysV fork. Both were trying to get into the later stages of the workstation market because it was seen as a new revenue source at a time when the "home computer" market was disappearing.
But I distinctly remember an editorial in UnixWorld magazine (yes, we had magazines like that back then, which you could buy in, like... a drug store...) with the headline "Up from toyland" talking about the Atari TT030 + SysV. Not exactly flattering.
The reality is by 1992, 93, 94 the workstation market was already being heavily disrupted by Linux (or other x86 *nix/BSD) on 386/486. The 68k architecture wasn't compelling anymore (despite being awesome), as Motorola was already pulling the rug out from under it.
And, yeah, many people just ran NetBSD on their Atari TTs or Falcon030s anyways.
>The reality is by 1992, 93, 94 the workstation market was already being heavily disrupted by Linux
From Wikipedia:
> Linux (/ˈlɪnʊks/ LIN-uuks)[16] is a family of open source Unix-like operating systems based on the Linux kernel,[17] a kernel first released on September 17, 1991, by Linus Torvalds
I personally bought a 486 (and left the Atari ST world) in the winter of 1992 precisely so I could run the earliest versions of Linux. Which were next to useless, but so was running most of the Unix-ish stuff on 68k platforms.
I imagine every home computer manufacturer looked at 68000 workstations like Sun's and said, "we have the same CPU; if we have a Unix, we can market our computers as workstations at a fraction of the cost." You also had Apple release A/UX for their 68k Macs.
Sun and NeXT also sold 68k Unix workstations at the time. IMHO, the thing about the Amiga was that it was not seen as a business machine. Commodore in general was seen as a home-computer company, and really one aimed at gaming first. AFAIK they didn't even have computers with the specs to compete with what Sun, SGI, HP, and others were doing.
The Sun and NeXT machines were pricey. Commodore may well have been trying to break into the business market by releasing an affordable business-attractive OS for the Amiga. They were also starting to sell PCs around this time. It certainly tracks with their scattershot marketing efforts late in their history.
There were video and multimedia applications at the time that could ONLY be tackled by an Amiga unless you wanted to pay $10,000 or more for specialized equipment. Besides the Video Toaster, which 'nuff said, Amigas also provided teletext-like TV information services in the USA, such as weather forecasts and the Prevue Channel (a cable channel that scrolled your cable system's program listings). Teletext itself never really caught on here.
Anime fan subtitling was also done almost exclusively on Amiga hardware.
Amiga gained a reputation as a glorified game console in the wider market, but those who knew... knew.
The vast majority of it is just recompiled AT&T code. The Amiga specific stuff is provided in object form and largely shipped with debug symbols so it'd be pretty easy to get something approximating the original.
What software? The Apricot ran DOS but didn't implement a full PC-compatible BIOS, so some software would work and some wouldn't. Even back in 1984 people didn't call it a PC compatible.
The transition enabled faster and more frequent service, which is something you probably do care about if you need to get into the office and are deciding how to get there.
/ has to be writeable (or have separate writeable mounts under it); /usr doesn't. The reasons for unifying under /usr are clearly documented and make sense, and it's incredibly tedious seeing people complain about it without putting any effort into understanding it.
> Improved compatibility [...] That means scripts/programs written for other Unixes or other Linuxes and ported to your distribution will no longer need fixing for the file system paths of the binaries called, which is otherwise a major source of frustration. [..]
Script authors should use the binary name without a path and let the user's $PATH choose which binary to use, and from where.
This union denies me the choice of using the statically linked busybox in /bin as a fallback if the "full" binaries in /usr are corrupted or segfault after some library update.
> Improved compatibility with other Unixes (in particular Solaris) in appearance [...]
I don't care about appearances and I care even less about what Solaris looks like.
Did they take a survey of what Linux users care about, or did they just impose their view on all of us because they simply know better? Or were they paid to "know better"? I never exclude corruption.
> Improved compatibility with GNU build systems. The biggest part of Linux software is built with GNU autoconf/automake (i.e. GNU autotools), which are unaware of the Linux-specific /usr split.
Yeah, right. Please explain to me how GNU, the userspace of 99% of all Linux distributions, isn't aware of the Linux-specific /usr split.
And how is this any different from #1?
> Improved compatibility with current upstream development
AKA devs decided and users' opinion is irrelevant. This explains why GNU isn't aware of Linux /usr split - they simply don't want to be aware.
A meaningful gamble IBM made at the time was whether the BIOS was copyrightable - Williams v. Artic wasn't a thing until 1982, and it was really Apple v. Franklin in 1983 that left the industry concluding they couldn't just copy IBM's ROMs.