It's actually quite the opposite. Scouring the internet is what you have to do to find Windows software. With UNIX-like systems the software is more organised and more often than not packaged for you in repositories.
There is no need for Flash. There never was. With a UNIX-like system you can download any video and play it at your leisure. No slow starts, hiccups, timeouts, or whatever annoyances people tolerate with Flash.
The video is a file. You get the file and play it. Simple.
I would guess there are some video formats that Windows Media Player, not to mention other Windows video players, will still choke on. This does not happen with Mplayer (which is also available for Windows, but is truly a UNIX-style program).
And of course audio can be extracted from any video. Youtube is like a giant Napster. But the mp3's are hidden in flv/mp4/3gp/webm/etc, and Youtube has better lawyers.
My youtube downloader is 30 lines of sed and can use any tcp client (wget, curl, whatever -- there are so many). It takes me about 15 minutes to write and could probably be smaller. But there are plenty of more complex solutions, e.g., there's a nice one done in Lua called quvi.
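For the curious, the core of such a sed-based downloader is just pattern extraction. The sample "page" below is made up (the real watch-page markup changes constantly), so treat this as a sketch of the shape of the approach, not a working downloader:

```shell
# In real use the input would come from a tcp client, e.g.:
#   curl -s "https://www.youtube.com/watch?v=VIDEOID" | sed ...
# Here a made-up sample stands in for the fetched page.
page='<script>var x = {"url":"http%3A%2F%2Fexample.invalid%2Fvideo.mp4"};</script>'

# Step 1: pull out the quoted url field.
# Step 2: undo the percent-encoding (only the two escapes in the sample).
url=$(printf '%s' "$page" \
  | sed -n 's/.*"url":"\([^"]*\)".*/\1/p' \
  | sed 's/%3A/:/g; s|%2F|/|g')

echo "$url"
```

The second sed pass is the usual trick: the page hides the download URL behind percent-encoding, and a couple of `s///g` commands are all it takes to undo it.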
It is trivial to set up a server that takes the youtube watch?v= url and returns the download url for the video file.
Obviously this probably makes some people uncomfortable. But this is how the web works. Anything that is uploaded can be downloaded. IMO, it's more respectable to be honest than to lie to people that video can be "protected" from download by using some convoluted Flash scheme.
Flash is fading into obsolescence. Youtube is getting stronger every day. And the lesson from that is clear, at least to me.
And as for all those people who love the concept of "streaming", youtube still uses progressive _download_.
If I want "streaming" I download to a file on a ramdisk and let Mplayer read from that file as the download progresses. With a fast connection, using ffmpeg to do the download will allow you to do transcoding on the fly if you need it. Mplayer gives flawless playback, every time.
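A local simulation of that setup, with a background writer standing in for wget/ffmpeg and a follower standing in for Mplayer (the file name, chunk count, and `timeout` guard are all just for the demo):

```shell
#!/bin/sh
# "Streaming" as progressive download: one process writes the file,
# another reads it as it grows. In practice the file would live on
# tmpfs, e.g. /dev/shm/video.mp4; mktemp is used here for portability.
F=$(mktemp)

# The "downloader": appends 5 chunks over about a second.
( i=1; while [ $i -le 5 ]; do echo "chunk $i" >> "$F"; i=$((i+1)); sleep 0.2; done ) &

sleep 0.3                      # let the buffer fill a little first

# The "player": follow the file from the start as it grows,
# stop after the 5th chunk arrives (timeout is a safety net for the demo).
got=$(timeout 3 tail -n +1 -f "$F" | head -n 5 | wc -l)
wait
echo "played $got chunks"
rm -f "$F"
```

The point is that nothing about "streaming" requires a special protocol: a growing file and a reader that follows it are enough.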
Because if so, we now find ourselves discussing the resource requirements just to scan/tokenise and parse it to get it back into a human-readable form. Why did we translate it to a non-readable form in the first place? What were we trying to achieve?
Maybe we should let JSON be something the receiver translates text to (if they want that sort of format), not the sender. The receiver knows what resources she has to work with; the sender has to guess. The same principle applies to XML. By all means, play around with these machine-readable formats to your heart's content. But do it on the receiver side. No need to impose some particular format on everyone.
The "universal format" is plain text. The UNIX people realised this long ago. People read data as plain text, not JSON and not XML, not even HTML. No matter how many times you translate it into something else, using a machine to help you, it will, if humans are to read it, be translated back to plain text.
As for the "plain text haters", let us be reminded that UNIX can do typesetting. Professional quality typesetting. But that's the receiver's job.[1] There's a learning curve, sure, but what the receiver can produce using typesetting utilities on her own machine is worlds better than what a silly web browser can produce from markup.
1. I am so tired of dumping PDF's to text and images. PDF makes it seemingly impossible to scan through a large number of documents quickly. Ever been tasked with reading through 100 documents all in PDF format (i.e., scanned images from a photocopier)? What could be accomplished in minutes with BRE takes hours or even days to accomplish. This is a problem that persists year after year. OCR is a hack. In most cases, the text should never have been scanned to an image in the first place. The documents are being created on computers, not typewriters!
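For comparison, once the documents have been dumped to text (pdftotext from poppler, say), the BRE scan itself really is a matter of seconds. Sample files stand in for the dump here, and the pattern (a 4-digit year) is just an example:

```shell
# Stand-in for a directory of pdftotext output.
mkdir -p /tmp/pdfdump && cd /tmp/pdfdump
printf 'invoice 2012\ntotal: 100\n' > a.txt
printf 'memo\nno numbers here\n'    > b.txt

# BRE scan: which documents mention a 4-digit year?
# grep -l prints only matching file names -- ideal for triage.
grep -l '[0-9][0-9][0-9][0-9]' *.txt
```

Scale that glob to 100 files and the scan still finishes before a PDF viewer has opened the first document.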
So, as I see it, if you were a plain text hater, and you were really sincere about making things look nice, then you would be a proponent of educating people how to do typesetting and sending them plain text, the universal format, that they can easily work with.
My solution to JSON and XML is sed. It works in all resource conditions and most times is just as fast as any RAM hungry parser. If I need to do complex things, that's what lex and yacc are there for. Pipes and filters; small buffers. 'Nuf said.
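A minimal sketch of the sed approach on a flat JSON record (the field names are invented for the example); anything nested or escaped is where lex and yacc come in:

```shell
# Extract one field from flat, well-behaved JSON with sed -- no parser,
# no object tree in RAM, just a pattern and a backreference.
json='{"name":"mplayer","version":"1.0","license":"GPL"}'

version=$(printf '%s' "$json" \
  | sed -n 's/.*"version":"\([^"]*\)".*/\1/p')

echo "$version"
```

One small buffer, one pass over the input. For streams, the same expression works as a filter in a pipe.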
In my biased opinion, once you have gotten over the learning curve, nothing beats daemontools for running services. It is a fantastic set of tools. Why some OS doesn't just embrace djbware I'll never understand. It compiles, smoothly, in seconds. (There's no need for distributing binaries.) And the chances of the author initiating lawsuits (as some Linux foundations are known to do), over something placed in the "public domain" are close to nil. He's got better things to do.
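For anyone who hasn't seen it: a daemontools service is just a directory containing a run script. A minimal sketch follows, with the service name and binary made up, and /tmp standing in for the usual /service directory that svscan watches:

```shell
# Create a hypothetical service directory. In real use this would be
# linked into /service, where svscan picks it up automatically.
mkdir -p /tmp/service/myapp

cat > /tmp/service/myapp/run <<'EOF'
#!/bin/sh
exec 2>&1
exec /usr/local/bin/myapp -f   # hypothetical binary; must stay in the foreground
EOF

chmod +x /tmp/service/myapp/run
# real use: ln -s /tmp/service/myapp /service/
```

That's the whole contract: supervise runs the script, restarts it if it dies, and svc -d / svc -u stop and start it. No pidfiles, no init script boilerplate.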
BSD's rc system is fine. Sometimes the scripts are too verbose. But the whole idea is the system is simple enough to understand that you can write your own scripts -- more concisely, if you wish. You don't need to read a book (e.g. Linux from Scratch), keep most things disabled by default and let the user turn stuff on as they need it.
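To illustrate how little an rc-style script really needs, here is the control logic reduced to a function (a real FreeBSD script would be a standalone file sourcing /etc/rc.subr and calling run_rc_command; the daemon name here is invented):

```shell
# Minimal rc-style dispatch, the shape of a hand-written service script.
# A real script would be a file invoked as: /etc/rc.d/mydaemon start
rc_cmd() {
  name="mydaemon"
  case "$1" in
    start) echo "Starting ${name}." ;;   # would launch the daemon here
    stop)  echo "Stopping ${name}." ;;   # would signal the daemon here
    *)     echo "Usage: rc_cmd {start|stop}"; return 1 ;;
  esac
}

rc_cmd start
```

Nothing here is beyond a first-week shell user, which is the point: the system is simple enough that writing your own, more concise scripts is a reasonable afternoon's work.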
I recently used Debian's live USB, the rescue version, for a little while and was amazed at how much stuff is turned on by default. I guess if you understand each and every choice that's been made for you it's OK. But if not, that approach is not very conducive to learning.
As for Apple, never mind all the XML fluff, good luck trying to understand what's going on behind the scenes with their computers anymore. They can't even manage to let you have an nsswitch.conf or equivalent.
Debian (and other apt-based systems) are the Lego blocks systems of the Linux world. If you install a service, the assumption is that you want it to run (if you don't want it to run, you can either uninstall it or deactivate it). Bootable / live versions tend to have more comprehensive lists of installed packages to allow for greater utility/flexibility -- though some (Knoppix) actually allow you to install additional packages (yes, booted RAM-only) into the booted system.
This is not the case on BSD systems (generally an integrated whole, though they've got package management) or RPM-based systems (poorer package management, leading very frequently to a "kitchen sink" installation paradigm).
Yeah. RHEL's even got a package you can install to enable/disable postfix vs ... oh, whatever the default MTA is, I can't keep track (smail still? I know they've moved off of sendmail, right? Right?).
Interesting comments. Like in so many other instances where an article criticises, explicitly or implicitly, the way some nerds make money, the reaction is to downplay it.
There's nothing really shocking here. As Cringely pointed out in an earlier blog post, it's common for publishers to overcharge for advertising. This was true long before the web existed.
But the thing that's different is nerds have the power to change the way things are done. Everyone knows how annoying (and generally ineffective) ads are. The question is, are you going to make the situation better, or worse?
Steve Jobs actually sought and was granted a patent on "serving ads upon booting". No internet required. Sad, really.
Can we say the same for Windows?