In the UK and a few other countries (Norway, Hungary, Canada, ???), EVs will have a green "flash" on the number plate. Makes it a bit easier to identify!
Where did I mention being an amazing programmer? If that's the requirement, then why not? The comment was replying specifically about environments where you have to sit through hour-long meetings, and that is what I wrote about.
Maybe there is a company where being an amazing programmer is enough. I've worked with a capable but depressed programmer who never delivered and was too shy to delegate anything, a capable psycho programmer whom no one wanted to work with, and a bad programmer who worked crazy hours, carried the project, and interacted nicely with customers when needed. The last one was probably the most valuable.
If you are an amazing programmer but can't function in the one-hour sit-down meeting that is part of your job activities, then you are de facto a worse candidate than the next amazing programmer who can; that's just how it is.
In the specific case here, 7z is your friend for all zips and compressed files in general, not sure I've ever used unzip on Linux.
Related to that, the Unix philosophy of simple tools that each do one job and do it well also applies here a bit. A more typical workflow would be one utility to tarball something, then another to gzip it, then a third to encrypt it, leading to file extensions like .tar.gz.pgp, all from piping commands together.
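That pipeline workflow can be sketched like this (a minimal demo; the `project/` directory name is made up, and the commented-out gpg stage assumes gpg is installed with a recipient key configured):

```shell
# One tool per job, glued together with pipes.
# "project/" here is a stand-in directory created just for the demo.
mkdir -p project && echo "hello" > project/file.txt

# tar archives, gzip compresses; neither tool knows about the other.
tar -cf - project/ | gzip -9 > project.tar.gz

# A further stage could encrypt the stream, yielding .tar.gz.pgp
# (assumes gpg and a recipient key, so it is left commented out):
#   tar -cf - project/ | gzip -9 | gpg -e -r alice@example.com > project.tar.gz.pgp
```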
As for versioning, I'm not entirely sure why your Debian and Ubuntu installs both claim version 6.00, but that's not typical. If this is for a personal machine, I might recommend switching to a rolling-release distro like Arch or Manjaro, which at least give up-to-date packages on a consistent basis, tracking the upstream version. However, this does come with its own set of maintenance issues and an increased expectation of managing it all yourself.
My usual bugbear about Linux (or rather OSS) versioning is that people are far too reluctant to declare v1.00 of their library. This leads to major useful libraries and programs being embedded in the ecosystem while only reaching something like v0.2 or v0.68 and staying that way for years on end, which can be confusing for people just starting out in the Linux world. They are usually very stable and almost feature-complete, but because they aren't finished to perfection according to the original design, people hold off on that final v1 declaration.
Info-Zip Unzip 6.00 was released in 2009 and has not been updated since. Most Linux distros (and Apple) just ship that 15-plus-year-old code with their own patches on top to fix bugs and improve compatibility with still-maintained but non-free (or less-free) competing implementations. Unfortunately, while the Info-Zip license is pretty liberal when it comes to redistribution and patching, it makes it hard to fork the project; furthermore, anyone who wanted to do so would face the difficult decision of either dropping or trying to continue to support dozens of legacy platforms. Therefore, nobody has stepped up to take charge and unify the many wildly disparate mini-forks.
The "Unix Philosophy" is a bankrupt, romanticized, after-the-fact rationalization, making up excuses and justifications for ridiculous ancient vestigial historical baggage like the lack of shared libraries and decent scripting languages, where you had to shell out THREE heavyweight processes -- "[" and "expr" and a sub-shell -- with an inexplicable flurry of punctuation, [ "$(expr 1 + 1)" -eq 2 ], just to test if 1 + 1 = 2, even though the processor has single-cycle instructions to add two numbers and test for equality.
??? This complaint seems more than 20 years too late
Arithmetic is built into POSIX shell, and it's universally implemented. The following works in basically every shell, and starts 0 new processes, not 2:
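(A sketch using POSIX arithmetic expansion; `$((...))` is evaluated by the shell itself, and `[` is a builtin in every modern sh, so no external process is spawned.)

```shell
# Arithmetic expansion runs in the current shell: no expr binary,
# no command-substitution subshell.
[ "$((1 + 1))" -eq 2 ] && echo "equal"   # prints "equal"
```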
20 years doesn't even get you back to the last century; it's more like 48 years, since 1977 when Bourne wrote sh. As one of the authors of the Unix-Haters Handbook, published relatively recently in 1994, and as someone who's used many versions of Unix since the 1980s, of course I'm fully aware that those problems are a hell of a lot more than 20 years old, and that's the whole point: we're still suffering from their "vestigial historic baggage", arcane syntax and semantics originally intended to fork processes and pipe text to solve trivial tasks instead of using shared libraries and machine instructions to perform simple math operations, and people are still trying to justify all that claptrap as the "Unix Philosophy".
Care to explain to me how all the problems of X-Windows have been solved so it's no longer valid to criticize the fallout from its legacy vestigial historic baggage we still suffer from even today? How many decades ago did they first promise the Year of the Linux Desktop?
The X-Windows Disaster: This is Chapter 7 of the UNIX-HATERS Handbook. The X-Windows Disaster chapter was written by Don Hopkins.
Why it took THREE processes and a shitload of context switches and punctuation that we are still stuck with to simply test if 1 + 1 = 2 in classic Unix [TM]:
[ "$(expr 1 + 1)" -eq 2 ]
Breakdown:

- `expr 1 + 1`: an external program used to perform arithmetic.
- `$(...)` (command substitution): runs `expr` in a subshell to capture its output.
- `[ ... ]`: in early shells, `[` (aka `test`) was also an external binary.

It took THREE separate processes because:

- Unix lacked built-in arithmetic.
- The shell couldn't do math.
- Even conditionals (`[`) were external.
- Everything was glued together with fragile text and subprocesses.
All of this just to evaluate a single arithmetic expression by ping-ponging in and out of user and kernel space so many times -- despite the CPU being able to do it in a single cycle.
That’s exactly the kind of historical inefficiency the "Unix Philosophy" retroactively romanticizes.
> The X-Windows Disaster: This is Chapter 7 of the UNIX-HATERS Handbook. The X-Windows Disaster chapter was written by Don Hopkins.
This gave me a big laugh, I love the UNIX-haters Handbook despite loving UNIXy systems. Thank you for decades of enjoyment and learning, especially in my late-90s impressionable youth.
UNIX is dead, no one cares anymore. It's just Linux now. Your examples and complaints are both outdated and not in good faith.
For all the weirdos smashing that downvote button: how about you name me some UNIX distros you have run in the past year? Other than Linux, OpenBSD (~0.1% market share, btw) and ostensibly macOS (which we all know dropped any pretense of caring to be UNIX-like many years ago), that is.
macOS is absolutely Unix, and a lot more like mainstream Unix than many of the other vastly different Unix systems of the past and present, so exactly when did the definition of Unix suddenly tighten up so much that it somehow excludes macOS? And how does your arbitrary gatekeeping and delusional denial of the ubiquity and popularity of macOS, and ignorance of the Unix 03 certification, the embedded, real time, and automotive space, and many other Unix operating systems you've never heard of or used, suddenly change the actual definition of Unix that the rest of the world uses?
Have you ever even attended or presented at a Usenix conference? Or worked for a company like UniPress who ports cross platform software to many extremely different Unix systems? Maybe then you'd be more qualified to singlehandedly change the definition of the word, and erase Unix 03 certification from existence, and shut down all the computers and devices running it, but you're not. Who do you think you are, one of Musk's DOGE script kiddies? Because you sound as overconfident and factually incorrect as one.
>The "no true Scotsman" fallacy is committed when the arguer satisfies the following conditions:
>1) not publicly retreating from the initial, falsified a posteriori assertion: CHECK
>2) offering a modified assertion that definitionally excludes a targeted unwanted counterexample: DOUBLE CHECK
>3) using rhetoric to signal the modification: TRIPLE CHECK
macOS, AIX, HP-UX, Solaris (still technically certified), Inspur K-UX, EulerOS, etc.
POSIX-compliant and Unix-alike OSes (e.g., FreeBSD, QNX, etc.) are very active in many common domains (networking, firewalls, embedded, industrial).
Mission-critical infrastructure, telco, financial systems, military/spacecraft, automotive, and embedded still widely use non-Linux Unix or Unix-like systems.
QNX in cars, AIX in banks, Illumos in storage, RTEMS in space systems.
You have no clue what you're talking about, you're completely incapable of and afraid to respond to any of my points, and you've been just making shit up and throwing around random buzzwords you don't understand for quite some time now, incoherently unable to complete a sentence, like you're on ketamine. Nobody's falling for any of it. All you've done is make ad hominem attacks, no-true-Scotsman defenses, move the goalposts, then hypocritically accuse other people of doing exactly what you just did: textbook psychological projection. Every single leaf of this argument is you being unable to take the L or counter any of the valid arguments other people have made, implicitly admitting defeat because you can't defend anything you said or counter anything anyone else has.
macOS is certified Unix, widely used and extremely popular, and there's absolutely nothing you can do or say that will change that fact, and everyone knows it.
I'll update my examples when your examples of how it's been fixed don't use the same arcane syntax and semantics as the 48 year old Bourne shell. That's the whole point, which you're still missing.
> $ bash -c '[ $((1 + 1)) = 2 ]; echo $?'
Not even Perl uses that much arcane punctuation to test if 1 + 1 = 2. As if [] isn't enough, you've got to throw in two more levels of (()), plus enough grawlix profanity for a Popeye comic strip. And people complain Lisp has too many parens. Sheez.
I prefer ITS DDT (aka HACTRN), with its built-in PDP-10 assembler and disassembler, that lets you do things like your login file customizing your prompt in assembly code to print the time by making system calls, without actually spawning any sub-jobs to merely print the time:
..PROMPT
holds the instruction which DDT uses to type out a "*".
You can replace it with any other instruction.
To use "%" instead of "*", deposit $1#%$> in that location
($> to avoid clobbering the opcode without having
to know what it is)
If you have to use arcane syntax and grawlix profanity, you should at least have direct efficient access to the full power of the CPU and operating system.
I keep submitting PRs to get my assembler extensions into Fish and ZSH, but so far to no avail. Ideally all scripting should be done in single-clock-cycle assembly statements.
I mean, it makes write-only languages like Perl look like beautiful prose, but it's hard to argue with efficiently setting the 20 environment variables used by my terraform jobs in a mere 20 clock cycles. It may seem silly, but every clock cycle truly matters.
I love "The Unix-Haters Handbook", just as I love "Worse is Better", but this ship sailed 30 years ago, as you mentioned. Your "old man yelling at clouds" rant reminds me of Bjarne Stroustrup's quip: "there are two types of languages, those everyone complains about and those nobody uses". I mean, run your nice, coherent, logical LISP machine or Plan9 system or whatever it is that you prefer, but let us enjoy our imperfect tools and their philosophy :)
The Unix philosophy really comes down to: "I have a hammer, and everything is a nail."
ESR's claptrap book The Art of Unix Programming turns Unix into philosophy-as-dogma, where flaws are reframed as virtues. His book romanticizes history and ignores inconvenient truths. He's a self-appointed, self-aggrandizing PR spokesperson, not a designer, and definitely not a hacker, and he overstates and over-idealizes the Unix way, as well as his own skills and contributions. Plus he's an insufferable, unrepentant racist bigot.
Don't let historical accident become sacred design. Don’t confuse an ancient workaround with elegant philosophy. We can, and should, do better.
Philosophies need scrutiny, not reverence.
Tools should evolve, not stagnate.
And sometimes, yelling at clouds stirs the winds of change.
>In a 1981 article entitled "The truth about Unix: The user interface is horrid" published in Datamation, Don Norman criticized the design philosophy of Unix for its lack of concern for the user interface. Writing from his background in cognitive science and from the perspective of the then-current philosophy of cognitive engineering, he focused on how end-users comprehend and form a personal cognitive model of systems—or, in the case of Unix, fail to understand, with the result that disastrous mistakes (such as losing an hour's worth of work) are all too easy.
Donald A. Norman: The truth about Unix: The user interface is horrid:
>In the podcast On the Metal, game developer Jonathan Blow criticised UNIX philosophy as being outdated. He argued that tying together modular tools results in very inefficient programs. He says that UNIX philosophy suffers from similar problems to microservices: without overall supervision, big architectures end up ineffective and inefficient.
>Well, the Unix philosophy for example it has been inherited by Windows to some degree even though it's a different operating system, right? The Unix philosophy of you have all these small programs that you put together in two like Waves, I think is wrong. It's wrong for today and it was also picked up by Plan Nine as well and so -
>It's micro services, micro services are an expression of Unix philosophy, so the Unix philosophy, I've got a complicated relationship with Unix philosophy. Jess, I imagine you do too, where it's like, I love it, I love a pipeline, I love it when I want to do something that is ad hoc, that is not designed to be permanent because it allows me- and you were getting inside this earlier about Rust for video games and why maybe it's not a fit in terms of that ability to prototype quickly, Unix philosophy great for ad hoc prototyping.
>[...] All this Unix stuff, it's the sort of the same thing, except instead of libraries or crates, you just have programs, and then you have like your other program that calls out to the other programs and pipes them around, which is, as far from strongly typed as you can get. It’s like your data coming in a stream on a pipe. Other things about Unix that seemed cool, well, in the last point there is just to say- we've got two levels of redundancy that are doing the same thing. Why? Get rid of that. Do that do the one that works and then if you want a looser version of that, maybe you can have a version of a language that just doesn't type check and use that for your crappy spell. There it is.
>[...] It went too far. That's levels of redundancy that where one of the levels is not very sound, but adds a great deal of complexity. Maybe we should put those together. Another thing about Unix that like- this is maybe getting more picky but one of the cool philosophical things was like, file descriptors, hey, this thing could be a file on disk or I could be talking over the network, isn't it so totally badass, that those are both the same thing? In a nerd kind of way, like, sure, that's great but actually, when I'm writing software, I need to know whether I'm talking over the network or to a file. I'm going to do very different things in both of those cases. I would actually like them to be different things, because I want to know what things that I could do to one that I'm not allowed to do to another, and so forth.
>Yes, and I am of such mixed mind. Because it's like, it is a powerful abstraction when it works and when it breaks, it breaks badly.
No tool is perfect. The Unix philosophy is a philosophy, not a dogma. It serves well in some use cases, and in the other use cases you're perfectly fine to put the whole domain in a single program. The hammer has been there for millennia, but once we invented the screw, we had to invent the screwdriver.
The point is that Unix philosophy is mostly a retroactive justification of why things are the way they are, and not really a coherent philosophy that drove the design of those things, even though it is now often represented as such.
> And sometimes, yelling at clouds stirs the winds of change.
> "The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore, all progress depends on the unreasonable man."
George Bernard Shaw.
Man, I'm with you, but I'll put my efforts elsewhere :)
Based on the account name, bio, and internal evidence you should assume this is Don Hopkins. His Wikipedia entry at https://en.wikipedia.org/wiki/Don_Hopkins includes:
> He inspired Richard Stallman, who described him as a "very imaginative fellow", to use the term copyleft. ... He ported the SimCity computer game to several versions of Unix and developed a multi player version of SimCity for X11, did much of the core programming of The Sims, ... He is also known for having written a chapter "The X-Windows Disaster" on X Window System in the book The UNIX-HATERS Handbook.
I hope this experience helps you realize that jumping immediately to contempt can easily backfire.
If you're going to emphasize that it's two processes, at least make sure it's actually two processes. `[` is a shell builtin.
> `eval` being heavy
If you want a more lightweight option, `calc` is available and generally better-suited.
> inexplicable flurry of punctuation
It's very explicable. It's actually exceptionally well-documented. Shell scripting isn't syntactically easy, which is an artifact of its time plus standardization. The Bourne shell dates back to 1979, and POSIX has made backwards compatibility a priority between editions.
In this case:
- `[` and `]` delimit a test expression
- `"..."` ensures that the result of an expansion is always treated as a single-token string rather than being split into multiple tokens on spaces, which is the default behaviour (and an artifact of sh and bash's basic type system)
- `$(...)` denotes that the expression between the parens gets run in a subshell
- `-eq` is used for numerical comparison since POSIX shells default to string comparison using the normal `=` equals sign (which is, again, a limitation of the type system and a practical compromise)
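The quoting point is easy to demonstrate (a small sketch; the variable name is made up):

```shell
# An unquoted expansion undergoes word splitting on whitespace;
# a quoted one is kept as a single token.
val="1  2"
set -- $val        # unquoted: splits into two words
echo "$#"          # prints 2
set -- "$val"      # quoted: stays one word
echo "$#"          # prints 1
```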
> even though the processor has single cycle instructions to add two numbers and test for equality
I don't really understand what this argument is trying to argue for; shell scripting languages are, for practical reasons, usually interpreted, and in the POSIX case, they usually don't have to be fast since they're usually just used to delegate operations off to other code for performance. Their main priority is ease of interop with their domain.
If I wanted to test if one plus one equals two at a multi-terabit-per-second bandwidth I'd write a C program for it that forces AVX512 use via inline assembly, but at that point I think I'd have lost the plot a bit.
I was quite clear that this is HISTORICAL baggage whose syntax and semantics we're still suffering from. I corrected it from TWO to THREE and wrote a step by step description of why it was three processes in the other comment. That's the whole point: it was originally a terrible design, but we're still stuck with the syntactic and semantic consequences even today, in the name of "backwards compatibility".
> they usually don't have to be fast since they're usually just used to delegate operations off to other code for performance
Even now you're bending over backwards to make ridiculous rationalizations for the bankrupt "Unix Philosophy". And you're just making my point for me. Does the Unix Philosophy say that the shell should be designed to be slow and inefficient and syntactically byzantine on purpose, or are you just making excuses? Maybe you don't think YOUR shell scripts have to be fast, or easy to write, read, and maintain, or perform simple arithmetic, or not have arsenals of pre-loaded foot guns, but speak for yourself.
When my son was six he found a girly magazine at a friend's house and was sneaking away to look at it. When my wife caught him, she told him the magazine was bad and he should not be looking at it. His simple reply was, "But I like it, Mom."
I actually didn't mention the Unix philosophy once in my comment, I just explained why the shell snippet you posted is the way it is. As far as I can tell, nobody in this thread's making long-winded ideological arguments about the Unix philosophy except you.
I think it's a perfectly reasonable assessment to think of shell scripts as a glue layer between more complex software. It does a few things well, including abstracting away stuff like pipelining software, navigating file systems, dispatching batch jobs, and exposing the same interface to scripts as you'd use to navigate a command line as a human, interactively.
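A typical glue-layer one-liner looks like this (a sketch; it assumes a *nix host with a readable /etc/passwd):

```shell
# Tally the login shells in /etc/passwd: four single-purpose tools,
# each doing one job, composed with pipes.
cut -d: -f7 /etc/passwd | sort | uniq -c | sort -rn
```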
> Maybe you don't think YOUR shell scripts have to be fast, or easy to write, read, and maintain, or perform simple arithmetic, or not have arsenals of pre-loaded foot guns, but speak for yourself.
This is the opinion of the vast majority of sysadmins, devops people, and other shell-adjacent working professionals I've encountered during my career. None of them, including myself when I'm wearing a sysadmin hat, deny the shortcomings of bash and friends, but none of us have found anything as stable or ubiquitous that fits this domain remotely as well.
I also reject the idea that faster or more full-featured alternatives lack footguns, pre-loaded or otherwise.
- C has a relatively limited type system by modern standards, no memory safety, no bounds checking, a slew of non-reentrant stdlib functions, UB, and relies on the user to account for all of that to benefit from its speed.
- C++ offers some improvements, but, being a near superset of C, it still has the footguns of its predecessor, to say nothing of the STL and the bloat issues caused by it.
- Rust improves upon C++ by miles, but the borrow checker can bite you in nontrivial ways, the type system can be obtuse under some circumstances, cargo can introduce issues in the form of competing dependency versions, and build times can be very slow. Mutable global state is also, by design, difficult to work with.
- Python offers ergonomic and speed improvements over POSIX shells in some cases, and a better type system than anything in POSIX shells, but it can't compete with most serious compiled languages for speed. It's also starting to have a serious feature bloat issue.
Pick your poison. The reality is that all tools will suck if you use them wrong enough, and most tools are designed to serve a specific domain well. Even general-purpose programming languages like the ones I mentioned have specializations -- you can use C to build an MVC website, yes, but there are better tools out there for most real-world applications in that domain. You can write an optimizing compiler in Ruby, but if you do that, you should reevaluate what life choices led you to do that.
Bash and co. are fine as shell languages. Their syntax is obtuse, but it's everywhere, which means it's worth learning, because a bash script that works on one host should, within reason, work on almost any other *nix host (plus or minus things like relying on a specific host's directory structure or some such). I'd argue the biggest hurdle when learning is the difference between pure POSIX shell scripting idioms and bashisms, which are themselves very widely available, but that's a separate topic.
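One small illustration of the POSIX-vs-bashism gap (the variable and patterns are hypothetical; both forms test the same thing):

```shell
# Bash-only pattern test (a "bashism") would be:  [[ $answer == y* ]]
# The portable POSIX equivalent uses case, which any sh understands:
answer="yes"
case "$answer" in
  y*) echo "matched" ;;   # prints "matched"
  *)  echo "no match" ;;
esac
```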
C was already limited by 1960s standards when compared to PL/I, NEWP and JOVIAL, and by 1970s standards when compared to Mesa and Modula-2...
It got lucky riding the UNIX adoption wave: an OS that got adopted over the others thanks to having its source available for almost the symbolic price of a tape copy, along with a book commenting its source code. Had it been available as a commercial AT&T product at VMS, MVS, et al. price points, no one would be talking about the UNIX philosophy.
> - C has a relatively limited type system by modern standards, no memory safety, no bounds checking, a slew of non-reentrant stdlib functions, UB, and relies on the user to account for all of that to benefit from its speed.
That is a feature, not a bug. Add your own bound checks if you want it, or use Ada or other languages that add a lot of fluff (Ada has options to disable the addition of bound checks, FWIW).
I am fine with Bash too (and I use shellcheck all the time), but I try to aim to be POSIX-compliant by default. Additionally, sometimes I just end up using Perl or Lua (LuaJIT).
I never said it wasn't a feature. There was a time, and there are still certain specific domains, where bit bashing the way C lets you is a big benefit to have. But bug or not, I think it's reasonable to call these limitations as far as general-purpose programming goes.
My argument was that C puts the onus on the user to work within those limitations. Implementing your own bounds checks, doing shared memory management, all that stuff, is extra work that you either have to do yourself or know and trust a library enough to use it, and in either case carry around the weight of having to know that nonstandard stuff.
We’re stuck with plenty of non-optimal stuff because of path dependency and historical baggage. So what? Propose something better. Show that the benefits of following the happy path of historical baggage don’t outweigh the outrageously “arcane” and byzantine syntax of…double quotes, brackets, dollar signs, and other symbols that pretty much every other language uses too.
>I don't really understand what this argument is trying to argue for; shell scripting languages are, for practical reasons, usually interpreted, and in the POSIX case, they usually don't have to be fast since they're usually just used to delegate operations off to other code for performance. Their main priority is ease of interop with their domain.
DDT is a hell of a lot older than Bourne shell, is not interpreted, does have full efficient access to the machine instructions and operation system, and it even features a built-in PDP-10 assembler and disassembler, and lets you use inline assembly in your login file to customize it, like I described here:
And even the lowly Windows PowerShell is much more recent, and blows Bourne shell out of the water along so many dimensions, by being VASTLY more interoperable, powerful, usable, learnable, maintainable, efficient, and flexible, with a much better syntax, as I described here:
>When even lowly Windows PowerShell blows your Unix shell out of the water along so many dimensions of power, usability, learnability, maintainability, efficiency, and flexibility, you know for sure that your Unix shell and the philosophy it rode in on totally sucks, and self-imposed ignorance and delusional denial is your only defense against realizing how bankrupt the Unix Philosophy really is.
>It's such a LOW BAR to lose spectacularly to, and then still try to carry the water and make excuses for the bankrupt "Unix Philosophy" cargo cult. Do better.
Shell != Unix (philosophy) as I’m sure you are aware. The unix philosophy is having a shell and being able to replace it, not its particular idiosyncrasies at any moment in time.
This is like bashing Windows for the look of its buttons.
I realized the hype for the Unix Philosophy was overblown around 1993 when I learned Perl and almost immediately stopped using a dozen different command-line tools.
I realized the hype for composing $thing$s was overblown around 1993 when I learned I could just have "A Grand Unified $thing$" and almost immediately stopped using a dozen different $thing$s.
Then, a decade or two later, I realized the Grand Unified $thing$ was itself composed, but not by me so I had no control over it. Then I thought to myself, how great would it be if we decompose this Grand Unified $thing$ into many reusable $thing$s? That way we can be optimally productive by not being dependent on the idiosyncrasies of Grand Unified $thing$.
And so it was written and so it was done. We built many a $thing$ and the world was good, excellent even. But then one of the Ancients realized we could increase our productivity dramatically if we would compose our many $thing$s into one Grand Unified $thing$ so we wouldn't have to learn to use all these different $thing$s.
And so it was written and so it was done. Thus goes the story of the Ancients and their initiation of the most holy of cycles.
There is a world outside of Perl. There really is.
It's a general observation of how we become infatuated with composability, then tire of it and unify, and then learn to love it again because the unifications grow stale and weird, of which Perl is an excellent example.
I switched to Python in 1998, and I haven't gone back to the Unix philosophy of decomposition into small command-line tools which interoperate via text and pipes, nor the COM/DCOM/CORBA approach, nor microservices, nor even Erlang processes, so I'm really not the target audience for your joke.
Ken Thompson and the Unix folks agree with you. The point is... Perl was a solution to the former Unix (BSD/GNU) bloat.
When you have a look at Plan 9 (now 9front) with rc as a shell, awk, and the power of rio/acme scripting and namespaces, among aux/listen... Perl feels bloated, with the same terse syntax as sh-derived shells.
Not much; what makes AWK shine is the I/O in Plan 9; it's trivial to spawn sockets (literally from the command line), either plain text or encrypted.
Also, rc is much simpler than Bash.
I don't see what crusty implementation details have to do with a philosophy. In fact, UNIX itself is a poor implementation of the "UNIX" philosophy, which is why Plan 9 exists.
The idea of small composable tools doing one thing and doing it well may have been mostly an ideal (and now pretty niche), but I don't think it was purely invented after the fact. Just crippled by the "worse is better".
The "Unix Philosophy" is some cargo cult among FOSS folks who never used commercial UNIX systems; since Xenix, I haven't used any that doesn't have endless options on its man pages.
Well, if we went by your "Windows philosophy" (and forget NT being a VMS rehash), we would still be using the crappy W9x designs with DOS crap back and forth.
Even RISC OS seems to do better, even though it doesn't have memory protection either (I think it hasn't; I didn't try it for more than a few days).
Thing is, there is no "Windows philosophy" cargo cult, and I don't worship OSes or languages; all have their pluses and minuses, and I use any of them when the situation calls for it. It is a disservice to oneself to identify with technology stacks like football club memberships given at birth.
Neither am I a sole Unix user; I have RISC OS Open (Apache 2.0?) running on an RPi to experiment with something beyond Unix/C.
But Windows is too heavyweight; from 8 onward it has been a disaster. And the NT kernel + Explorer can be really slim (look at ReactOS, or XP, or a debloated W7).
The problem is that Apple and MS (and Red Hat) are just selling shiny turds, wasting tons of cycles to do trivial tasks.
Worse, you can't slim down your install so it behaves like a sane system in 1GB of RAM.
I can watch 720p@30FPS videos on an N270 netbook with MPV. That's something even native players for WXP, with low-level DirectDraw calls, can't do well enough.
The post-XP Windows philosophy among Red Hat and Apple is: let our OSes bloat and crap out with unnecessary services and XML crap (and interpreted languages such as JS and C#) for the desktop, until hardware vendors idolize us and the average user has to buy new crap to do the same task over and over.
Security? And why the fuck does Gnome 3 need JS in the first place? Where's Vala, which could shine here? Mutter could get a big boost, and memory leaks could be a thing of the past.
C# is a compiled language at all levels (source into bytecode, then bytecode into machine code, either JIT or AOT). V8 has JIT compilation for hot paths. As a result, JS is significantly faster than interpreted languages like Python, Ruby and Erlang/Elixir/Gleam.
No one in GTK/Gnome land uses plain C; they use GLib as a wrapper. Plain ANSI C might be 'unusable' for modern UI needs, but, as I said, just have a look at WebKitGTK 4.
GLib everywhere; WebKitSettings is a breeze to set up.
Vala is a toy because Miguel de Icaza went full MS with C# back in the Ximian days. With more support from Red Hat, Gnome 4 could have had Vala as its main language. JS? Lua and LuaJIT would be a better choice for Mutter scripting. If you look at how Luakit and Vimb behave, the difference is almost nil.
Even an operating system as brain-damaged as Windows still has PowerShell, which lets you easily and efficiently perform all kinds of operations: dynamically link in libraries ("cmdlets") and call them directly, call functions with typed non-string parameters, and pipe live OBJECTS between code running in the SAME address space, without copying, context switching, and serializing and deserializing everything as text.
PowerShell even has a hosting API that lets you embed it inside other applications; try doing that with bash. At least you can do that with Python!
When even lowly Windows PowerShell blows your Unix shell out of the water along so many dimensions of power, usability, learnability, maintainability, efficiency, and flexibility, you know for sure that your Unix shell, and the philosophy it rode in on, totally sucks, and that self-imposed ignorance and delusional denial are your only defense against realizing how bankrupt the Unix Philosophy really is.
It's such a LOW BAR to lose to so spectacularly, and then to still carry the water and make excuses for the bankrupt "Unix Philosophy" cargo cult. Do better.
>PowerShell implements the concept of a pipeline, which enables piping the output of one cmdlet to another cmdlet as input. As with Unix pipelines, PowerShell pipelines can construct complex commands, using the | operator to connect stages. However, the PowerShell pipeline differs from Unix pipelines in that stages execute within the PowerShell runtime rather than as a set of processes coordinated by the operating system. Additionally, structured .NET objects, rather than byte streams, are passed from one stage to the next. Using objects and executing stages within the PowerShell runtime eliminates the need to serialize data structures, or to extract them by explicitly parsing text output.[47] An object can also encapsulate certain functions that work on the contained data, which become available to the recipient command for use.[48][49] For the last cmdlet in a pipeline, PowerShell automatically pipes its output object to the Out-Default cmdlet, which transforms the objects into a stream of format objects and then renders those to the screen.[50][51]
>Because all PowerShell objects are .NET objects, they share a .ToString() method, which retrieves the text representation of the data in an object. In addition, PowerShell allows formatting definitions to be specified, so the text representation of objects can be customized by choosing which data elements to display, and in what manner. However, in order to maintain backward compatibility, if an external executable is used in a pipeline, it receives a text stream representing the object, instead of directly integrating with the PowerShell type system.[52][53][54]
> Hosting
>One can also use PowerShell embedded in a management application, which uses the PowerShell runtime to implement the management functionality. For this, PowerShell provides a managed hosting API. Via the APIs, the application can instantiate a runspace (one instantiation of the PowerShell runtime), which runs in the application's process and is exposed as a Runspace object.[12] The state of the runspace is encased in a SessionState object. When the runspace is created, the Windows PowerShell runtime initializes the instantiation, including initializing the providers and enumerating the cmdlets, and updates the SessionState object accordingly. The Runspace then must be opened for either synchronous processing or asynchronous processing. After that it can be used to execute commands. [...]
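As a toy illustration of the distinction the quoted passage draws (byte streams that every stage must re-parse vs. structured objects passed directly between stages), here is a hedged sketch in Python with made-up data standing in for both models:

```python
# Text pipeline: stages exchange a byte/text stream, so every stage
# must parse it again (and hope the format survives spaces, etc.).
text_stream = "report.txt 120\nnotes with space.txt 30\n"
parsed = [line.rsplit(" ", 1) for line in text_stream.splitlines()]
big_text = [name for name, size in parsed if int(size) > 100]

# Object pipeline: structured records flow between stages unchanged,
# no serialization or re-parsing needed at any step.
records = [{"name": "report.txt", "size": 120},
           {"name": "notes with space.txt", "size": 30}]
big_obj = [r["name"] for r in records if r["size"] > 100]

print(big_text)  # ['report.txt']
print(big_obj)   # ['report.txt']
```

Both arrive at the same answer here, but the text version only survives the filename with a space because `rsplit` was chosen carefully; the object version has no parsing step to get wrong.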
9front is the truest Unix philosophy since Unix v6, and it does it much better: proper devices and network connections as files, plus namespaces and aux/listen and friends. It makes AWK better than Perl, and rc is much simpler, without the bullshit of sh: you only have functions, not aliases, and the syntax is much saner.
On PowerShell/C#: TCL/Tk might not be as powerful, but it works under Windows XP with IronTCL, unlike MS's own newest C# implementations (>= 4.5). Double irony there.
TCL can help you write useful software such as a Gopher/Gemini client with embedded TLS support.
And the resource usage will still be far lower.
On embedding, TCL wins here, hands down. It's everywhere.
That is, if we forget that the authors moved on to Inferno and Limbo, redoing all the Plan 9 decisions they had had to roll back, like Alef as the main userspace language.
>Because all PowerShell objects are .NET objects, they share a .ToString() method,
Congrats, PSH, you did what TCL did ~30 years ago, but worse. In TCL everything is a string, even numbers. Yes, it sucks that you need [expr] for math operations, but the advantages outnumber the quirks.
If you come from Lisp, you will feel at home on the spot. Use the l* commands as you would Lisp lists, but without juggling car, cdr, caar, cddr and so on.
And there's Expect, which is utterly underrated.
Yes, I hate upvar sometimes, but with namespaces you can almost avoid that issue.
On using TCL for serious stuff: if people have been using Excel with millions of rows for COVID patients and census data, TCL/Tk with SQLite would outperform that by a huge margin.
PowerShell is the opposite of TCL and bash. You pass objects directly, NOT strings. I have no idea what you're trying to say. And yes I've written and shipped and open sourced shitloads of TCL/Tk.
Objects are not my thing; they are just good for Inform 6, since a Z-machine game maps really well onto OOP: a text adventure based on events tied to attributes is the ideal case.
Now you're making even less sense than before, with incoherent grammar and random buzzwords, which is an impressive leap. I don't think "your thing", whatever that is, has any bearing on this conversation. Are you an LLM?
I played the original Zork on MIT-DM, and read the original source code written in MDL, which is essentially Lisp with angled brackets and data types, and it's neither object nor text oriented, so I have no idea what point you're trying to make about its descendant ZIL, because it makes no sense and has no bearing on this discussion.
You're arguing with a well-vetted, factually correct, evidence-based Wikipedia page, so if you disagree, go try to edit it, and see how long your hallucinations and vandalism last without citations to reality or coherent sentences.
At least my code doesn't shit its pants when you pass it a filename with a space in it.
I am not an LLM. I am talking about Inform 6, an OOP language born in the '90s, in which people created games far more powerful than the Infocom ones. Inform 6 maps pretty well to MDL. Both compile to Z-machine games, but Inform 6 is far easier.
On games, have a look at Anchorhead, Spider and Web, Curses, Jigsaw... in these kinds of games OOP makes tons of sense.
Wow it's really sad that you're not an LLM. That would have been a great excuse. Too bad you've been superseded and displaced by computers. My condolences.
> Related to that, the Unix philosophy of simple tools that do one job and do it well, also applies here a bit. More typical workflow would be a utility to tarball something, then another utility to gzip it, then finally another to encrypt it. Leading to file extensions like .tar.gz.pgp, all from piping commands together.
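The workflow the quote describes can be sketched concretely; a minimal example, with the gpg stage commented out since it assumes a configured key or passphrase:

```shell
# One tool per job: tar bundles, gzip compresses, gpg would encrypt.
mkdir -p demo && echo "hello" > demo/file.txt

# tar writes the archive to stdout (-); gzip compresses that stream:
tar cf - demo | gzip > demo.tar.gz

# Encryption would be just one more pipe stage (needs gpg configured):
# tar cf - demo | gzip | gpg --symmetric -o demo.tar.gz.pgp

gzip -t demo.tar.gz && echo "archive OK"
```

Each stage is replaceable independently, which is the point: swap gzip for xz or zstd and nothing else in the pipeline changes.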
I do this for my own files, but half of the time I zip something, it’s to send it to a Windows user, in which case zip is king.
Was there some problem with 7z a few years ago? I feel like I've been actively avoiding it because I have the feeling I read something bad about it, but I can't remember what. Then again, I could be mixing it up with something else; that happens to me sometimes.
Ah, I think I remember now: a couple of RCEs they had... [0]
So for Windows use I then started recommending a fork called NanaZip [1], which enables some Windows security features (CFG, CET, Package Integrity Check...) and adds support for additional formats that other forks already had [2] [3].
If you use `atool`, there is no need to use different tools either – it wraps all the different compression tools behind a single interface (`apack`, `aunpack`, `als`) and chooses the right one based on file extensions.
Seems to me like many research institutions, but well funded, attracting great talent and able to operate on a long timescale. They asked a lot of wrong questions too; we just highlight the great ones.
It is more like "Don't burden talented people with 15 hours of mindless grant-related bureaucracy per week". Current models of science funding sap precious time away from the researchers in the name of red tape.
The brains are the most precious resource, not the money, and they should not be bothered with trivialities.
The Institute for Advanced Study follows this philosophy. The results are mixed: putting people away with no urgency or engagement with real-world problems hasn't done that well.
The Bell system was rapidly growing and expanding across the world. This gave it capital to spend but it also gave it incredible access across our entire society to government, universities, and business.
It allows more people to do it as a career if the funding goes partially towards salaries. That goes a long way towards making more progress in more fields.
I'm starting to come to the idea that the rate of progress is more driven by the rate of adoption than by number of researchers.
It takes a whole lot of implementation and organisation to make any innovation tangible, so in some ways a bigger overall society slows things down as much as you gain from the total number of researchers.
Competition from foreign societies is also a part of this.
I'd say that's true after a certain point in any specific field. But there are so many fields that show some future promise with only a few people doing research at any one time. I think a bit more breadth of search will do us well.
Certainly, as does any open-ended research organization. But when there are innumerable questions to consider and maybe 1% are worth-while, a 25% success rate might be incredibly impressive.
Think of it as if ChatGPT (or other models) didn't just have the embedded unstructured knowledge in their weights from learning, but also an extra DB on the side with specific structured knowledge that it can lookup on the fly.
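That "extra DB on the side" idea can be sketched in a few lines; everything here is hypothetical (made-up keys, a dict standing in for the structured store) and only illustrates the lookup-on-the-fly shape, not any real model's internals:

```python
# Hypothetical side store of structured facts, consulted at query time
# instead of relying on whatever the model's weights "remember".
knowledge_db = {
    "boiling_point_water_c": 100,
    "speed_of_light_m_s": 299_792_458,
}

def answer(question_key: str) -> str:
    # Look up the structured fact on the fly; fall back to the model's
    # embedded (unstructured) knowledge only when the DB has nothing.
    fact = knowledge_db.get(question_key)
    if fact is None:
        return "not in the side DB, fall back to model weights"
    return f"looked up: {fact}"

print(answer("speed_of_light_m_s"))  # looked up: 299792458
```

The point is the division of labor: fuzzy general knowledge in the weights, precise and updatable facts in the structured store.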
Has saved us a number of times when having to deploy at a remote client with limited on-prem customisation for security reasons (i.e. no installing a big Postgres or other RDBMS solution).
Powerful tooling; all local to the environment and the data being worked on; SQL, so it's pretty close to a drop-in replacement compared to our old solution. Really great stuff and I was very happy to see the project gain the confidence to hit 1.0 a while back and now 1.1.
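For readers unfamiliar with the embedded model being praised here: the engine runs in-process with no server daemon, so "deploying" is just shipping the library. A rough analogy using Python's built-in sqlite3 (not the project discussed, but the same zero-install, in-process pattern):

```python
import sqlite3

# In-process database: no server to install, connect to memory or a file.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (name TEXT, count INTEGER)")
con.executemany("INSERT INTO events VALUES (?, ?)",
                [("deploy", 3), ("rollback", 1)])

# Plain SQL against local data, all inside the application's process.
total = con.execute("SELECT SUM(count) FROM events").fetchone()[0]
print(total)  # 4
```

That in-process design is exactly what makes it viable on locked-down on-prem machines where installing a Postgres server would be a non-starter.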
And don't forget that SpaceX had to sue the government/Boeing/Lockheed/ULA, multiple times, just in order to be allowed to compete for these contracts rather than having it locked down to the usual suspects.
Fortnite: July 25, 2017 (Battle Royale mode launched September 26, 2017)
Apex Legends: February 4, 2019
Valorant: June 2, 2020
Overwatch: May 24, 2016
Call of Duty: 2003, Annual release
League of Legends: October 27, 2009
Dota 2: July 9, 2013
Roblox: 2006 (initially as DynaBlocks, rebranded to Roblox the same year)
Blame Claude 4 if any date is wrong...