For anyone who hasn't been following Dave Winer's work or thought streams recently, the reference to blogging like it's 1999 isn't about the tech. Its primary point is going back to the roots of openness. Instead of blogging on all the closed platforms, Dave has always advocated for blogging on your own terms. Whether it's WordPress, Ghost, or his new tool, they're all equally important in keeping the open web alive.
A lot of his focus in this product is likely to be on ease of editing and ease of getting your content out there. For example, it comes with support for Instant Articles built in. I imagine things like AMP page support will also get built in.
Re the Twitter login: it's not something I personally believe in, but I know that Dave has a strong belief in Twitter's potential to be much better as a dev platform of sorts, from identity management to message delivery. Debatable, but for another time. Just thought I could help provide some context here.
Whether this platform is better than WP, Drupal, or Ghost is all a matter of personal preference honestly. I'm not sure Dave really cares about it that way either. He just wants to keep making the open web more appealing than the closed.
Of everything in this announcement so far, the most intriguing part is his idea of interop with WP, Drupal, and other platforms. One thing I can connect it with is how his posts cross-post to FB and Medium, and he's talked about live editing a bit over there. If I recall right, Dave imagines a future where the editor and the server are separate, so I might edit things on my 1999.io server while the updates are all served on a WP or Drupal site. A bit more context: in the past Dave also wrote about wishing the Medium.com experience worked that way, where you could use their incredible editor and publish anywhere you wanted instead of just medium.com.
The one thing, and "maybe" criticism, I have of Dave's stuff is that it's very high-level/abstract at times. I've followed his work for years, and it's always taken me a while to digest his ideas because a lot of the imagination goes unsaid. In many ways this is like the wonderful work he did in creating RSS. RSS alone is simply the germ of an idea, which can then be used in so many creative ways. Dave's ideas are very similar: a germ of an idea that he hopes others will pick up on and push forward. New frontiers!
I have lost count of how many blogs I've lost, or have had shut down, due to acquisitions/closures and losing track of an email account. The only blog I still have? The one I wrote in text files and then posted places.
No platform is forever, except maybe the one you yourself help keep that way.
That was the year we did the first browser-based blogging software. It's when Edit This Page was invented. The first wave of bloggers, the name "weblog": it all basically came together in 1999.
That was how I did it, too. Monthly, I'd archive posts to allow perusal by month. I wasn't tied to any particular software. I got hosting with friends who set up their own blogs at their own domains. All of us were in or just out of high school, with no real plans for the future and lots of free time to post and talk over AIM/MSN.
In retrospect it was primitive, and it worked. There was a huge scene around all this - everyone had their blogs, and those with an eye for colour and design were doing cutting-edge presentation in idiosyncratic ways. I really miss this when set against today's single-page, "FOL.IO IS A PLATFORM FOR MANAGING WORMS" type of web.
mc's online journal was what drew me into all of this. It's been offline for probably 20 years now, but it was deeply personal and intense, and wonderfully written.
We used to append content to a single page because the alternatives were A) independent post files plus a CGI script to concatenate them dynamically on each view, or B) re-creating the static page from individual post files.
Since using a database was total overkill, static + individual posts wasted precious megabytes, and a new dynamic process per page load was way too CPU-intensive, we just curated a single static page. That saves disk and CPU. When posts got past a certain number, the process would juggle some content into a new archive file.
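If it helps to see the shape of it, here's a rough sketch of that scheme in Python. Everything here is a made-up assumption for illustration (the file names, the post delimiter, the 20-post cutoff), not anyone's actual code: new posts get prepended to the one static page, and the overflow gets juggled into a dated archive file.

    from datetime import date
    from pathlib import Path

    MAX_POSTS = 20                      # arbitrary cutoff for illustration
    DELIM = "\n<!-- post -->\n"         # made-up marker between posts

    def add_post(html_fragment):
        index = Path("index.html")
        posts = index.read_text().split(DELIM) if index.exists() else []
        posts.insert(0, html_fragment)  # newest first, one static page
        if len(posts) > MAX_POSTS:      # juggle the oldest into an archive
            archive = Path("archive-{:%Y-%m}.html".format(date.today()))
            with archive.open("a") as f:
                f.write(DELIM.join(posts[MAX_POSTS:]) + DELIM)
            posts = posts[:MAX_POSTS]
        index.write_text(DELIM.join(posts))

No database, no per-request process: one write when you post, zero work when someone reads.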
Methinks you are selling older computers short. I ran my blog (up through 2005, maybe to 2006) on a 50MHz 486 with 32MB RAM (maybe 64MB? It's been a decade) and a gig hard drive.
That same server was also my email server.
Then again, I wrote my own blogging engine in C [1] (and still use it).
Sure, you could do plenty with old 486s. But not hosting thousands of weblogs generating dynamic content. Not without load shooting up to 100, anyway. For a paltry few visitors it wasn't so bad, but the more users and the higher the traffic, the worse it got.
I should add that I had completely forgotten (has it been that long?) about the hacks we'd use to speed up dynamic content. Since there were no competent threading models for PHP or Perl at the time, we would use either FastCGI, or mod_php or mod_perl. The former would let you build app servers that handled multiple requests without terminating, to speed up initialization and share/cache memory. The latter would embed a PHP or Perl interpreter in the Apache process, eliminating lots of overhead, enabling better communication, and of course allowing you to prefork multiple Apache processes, execute scripts as soon as a request came in, and optionally stay resident in memory. But that's just execution speed; if you're reading flat files and spitting them out one by one, that's still a very I/O-bound operation and eats up more CPU than necessary.
And ultimately, very few web hosts allowed mod_perl or mod_php, and hardly any weblog maintainer ever wrote FastCGI apps (nor were the servers configured for it). So everything was forked at run-time, leading to very slow page loads for any CGI script - unless you ran your own server, of course. It wasn't until later that LAMP development became more common and people started putting the kitchen sink into MySQL and PHP.
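For anyone who never touched this era, the difference between the two models is easiest to see in code. Here's a toy sketch in Python - nothing to do with the real FastCGI wire protocol, just the shape of a persistent app server: startup cost is paid once, and each hit only pays for the request itself, instead of fork+exec'ing a fresh interpreter per request the way plain CGI did.

    import socketserver

    CACHE = {"templates": "parsed once, reused forever"}  # init cost paid once

    class Handler(socketserver.StreamRequestHandler):
        def handle(self):
            request_line = self.rfile.readline().strip()  # e.g. b"GET / HTTP/1.0"
            self.wfile.write(b"HTTP/1.0 200 OK\r\nContent-Type: text/plain\r\n\r\n")
            self.wfile.write(b"served from a warm process: " + request_line + b"\n")

    # One long-lived process serves every request; no per-hit interpreter start.
    with socketserver.TCPServer(("127.0.0.1", 8080), Handler) as srv:
        srv.serve_forever()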
Nah, it was too CPU-intensive. In addition, even though disks haven't gotten significantly faster since then, filesystems were a lot less advanced, and disk, virtual memory, and CPU caches were much smaller.
PHP did develop quickly in order to optimize the performance of serving scripted web content. Besides its own performance enhancements it also relied heavily on database storage for content, which significantly improved performance for complex websites.
But it was mostly useless to people with just a homepage and a guestbook; most web hosts would not provide database access without charging extra, and using a database demanded a higher level of technical competency than most web devs had. A lot of websites were run by people who only knew front-end programming and markup languages.
I don't know if this is some type of cynical irony, but calling your software "blogging like it's 1999" whilst using a .io domain creates many feelings for me.
This confuses me. Several months ago Dave Winer wrote a post titled 'Anywhere but Medium' where he railed against closed platforms such as Medium that could disappear at a moment's notice and where the editorial voice was controlled by a company (as an individual gets to decide who gets noticed).
Now Dave Winer is launching a platform that could also disappear at a moment's notice, where the best chance of being 'noticed' is likely to be getting reposted by Dave Winer.
What's more, it seems to go against some of the Open Web principles that Winer espouses (seems to require JavaScript to view any content, doesn't seem to render anything if you have Safari Content Blockers running, requires a Twitter account even to get up and running with the thing). All in all it seems to have very little to do with 1999 or any kind of 'golden age' for the Web.
Developers already have tons of options for getting a blog up and running, from GitHub Pages to Posthaven and Ghost - what gives with this?
> Now Dave Winer is launching a platform that could also disappear at a moment's notice,
Looks like you can set up your own server, so assuming you point your own domain at this, if 1999 goes belly-up, you can just switch to another host seamlessly:
> (seems to require JavaScript to view any content
This is true of the "About" page, but it looks like that is not true of actual blog entries. If you enable JS on the about page, they explicitly state that one of their goals is a graceful fallback when JS is not present.
My work proxy is blocking most of the page's JS/CSS resources, which are being served from a different domain that our firewall doesn't approve of for some inane reason.
In 1999, I could view a simple page containing nothing but text without requiring me to download, compile and execute a massive wad of code in a huge bloated VM with a JIT. There's no need for any javascript there, try having a web page.
I don't know where you were in 1999, but where I was in 1999 IE4's DHTML was all the rage among the elite wannabe high school web developers. Early explorations into XMLHttpRequest (via this awesome ActiveX technology, wow) and DHTML Components (which perhaps unsurprisingly don't seem all that different from today's Web Components under the surface).
(I'm only remembering some of this because I recently dug up some of my actual circa 1999 website work.)
I was on the same web you were. The one where doc ozone was a bizarre novelty and well over 99.999% of text-based sites had no javascript at all, much less were entirely created from javascript.
It may have been more of a novelty back then, but I remember a lot more than 0.001% of websites having some JS or some Flash (or both). Maybe not many were entirely JS (though there were some I'm trying to recall), but there were plenty of websites at the time that were 100% Flash, including the websites of some major corporations.
Again, certainly from the perspective of a high school web developer at the time, my "budget" consisted of just about as much JS as I wanted and could get my hands on, whereas I was mostly on static web hosts, which constrained me from doing as much as I would have liked at the time in server-side code... I know I wrote some very heavy JS sites at the time, and I know I was not alone in that constraint.
Again, sure, it was mostly for novelty, but the web where websites had basically no JS at all was several years before 1999.
When I go to their "about" page, this is the entirety of the content that I see:
> How is 1999.io different from other blogging platforms?
Looking at the source, I see a little more content.
"Create a test site" is almost completely blank (no text, just a drop shadow and some pull-down arrows near the top of the page). Their other links seem to work as designed.
My work blocks some domains, and is stricter about ones it doesn't know about (like 1999.io), but I didn't have trouble accessing any of the fargo.io CSS or JS files.
For this to be 1999-era journaling, it would need:
- To be written in Perl
- To support .php, .pl, & .cgi in the user's /cgi-bin/
- To host about 10000 accounts per physical machine
- To use FTP, or a CGI form, for remote file management
- To use HTML 4.01/XHTML
- What's CSS?
- What's Twitter?
- Features: A user profile/bio! User comments! Subscriptions! Communities!
- Up to 10 megabytes of FREE storage
- Free add-ons like a hit counter and a feedback submission e-mail form
- One free e-mail address and five free e-mail aliases
- EXTRAS: Virtual host name and domain name support, up to 5 e-mail addresses & 20 e-mail aliases, up to 1000* megabytes of storage, and No Advertising Banners!!!
* actual space may vary based on how badly we over-committed storage
Just HTML and your own assets. No node.js hosting, no JSON+XML backend storage (that we can see), no sweat.
Granted, Neocities isn't built on 1999-era technology either, but I think it captures the aesthetic of "web sandbox that I get to play with in my own webspace" a bit better than this does.
Actually, CSS 1 is quite old, but support wasn't good even in IE4/Netscape 4. IE5 in 2000 was the first browser with almost complete support, and that was one of the reasons it wiped out Netscape.
We've been doing passwords on the web since before 1999 and we weren't storing them in plaintext even then. I bet there is a proven authentication framework they could use in this project.
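For reference, the shape of that proven approach is just salted, iterated hashing instead of plaintext. A minimal sketch in Python using only the standard library (the function names and iteration count here are illustrative, not from any particular framework):

    import hashlib, hmac, os

    def hash_password(password):
        salt = os.urandom(16)                       # unique salt per user
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600000)
        return salt, digest                         # store these, never the password

    def verify_password(password, salt, digest):
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600000)
        return hmac.compare_digest(candidate, digest)  # constant-time comparison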
HTTPS was there in 1999 too, but luckily we get to use the current version, which is more secure.
I understand, but maybe if you're blogging for the purpose of political dissidence you could look elsewhere? Home-grown authentication adds a lot of overhead to web apps, so it makes sense to hand that off to one or more standardized identity management systems.
Sure, that's when the term was coined, but no one talked about having "blogs" in 1999. Terms I did hear back then were "personal website", "finger", or "zine".
1999? XML is our savior! The future of the web will be interconnected SOAP services! It's the year of the Linux desktop! Itanium will revolutionize software if someone would just make a sufficiently smart compiler!
To be fair, the neat part of SOAP, WSDL, is still pretty neat. XML is mostly hated because named closing tags make it stupidly verbose - a big mistake. On top of that it was extra complicated, which is why platforms like PHP had crappy implementations, I guess.
But having to rewrite lots of boilerplate code for everyone's JSON or "REST" API is annoying. There are even projects to describe JSON schemas and APIs, because that's actually useful. Maybe this time it'll be simple.
Transactions over SOAP (WS-AtomicTransaction, I think) are also sort of neat, I guess, but too complicated to be useful on the Internet?
> To edit a blog you have to enable JavaScript because the blogging software is written in JS.
> It would be possible to create a client that didn't, but that doesn't exist today.
I think the client exists: it's a web browser using forms. Not as nice as what we all expected we'd have by now back in the 90s, but it works. A blog entry's just a title and some text. Throw in an input field for the title and a textarea for some Markdown, and baby, you've got a blog!
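To make that concrete, the whole "client" can be a couple of form handlers. A minimal sketch using Python's standard library (handler and file names are made up, and real software would escape the HTML and actually render the Markdown):

    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import parse_qs

    FORM = (b'<form method="post"><input name="title" placeholder="title"><br>'
            b'<textarea name="body"></textarea><br><button>Post</button></form>')

    class BlogHandler(BaseHTTPRequestHandler):
        def do_GET(self):                     # serve the form: that's the editor
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(FORM)

        def do_POST(self):                    # append the entry, bounce back home
            length = int(self.headers["Content-Length"])
            fields = parse_qs(self.rfile.read(length).decode())
            title = fields.get("title", [""])[0]
            body = fields.get("body", [""])[0]
            with open("posts.txt", "a") as f:
                f.write(title + "\n" + body + "\n---\n")
            self.send_response(303)
            self.send_header("Location", "/")
            self.end_headers()

    HTTPServer(("127.0.0.1", 8000), BlogHandler).serve_forever()

No JS required to write, none required to read.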
I feel like a lot of people are harshing on the fact that this is not like, I dunno, geocities or something, but I just want to express some appreciation that an effort is made for those of us who do not want to execute arbitrary code just to read a blog entry.
One note - on that page, it seems that while the content is there independent of JavaScript, the navigation is loaded (from three different domains) with JavaScript. I suspect there's a more graceful way to fall back there.
> If I have to enable javascript to view parts of your website at all, then it isn't anything like 1999.
JS is older than that, and early uses of it were notorious for not having graceful fallbacks, so, no, I think that description is not at all unlike 1999.
Do people nowadays really not care? Or are they completely oblivious to the fact that images can be huge? 2MB is still a lot of data even today. I could see it loading (on 100Mbit, no less).
Seeing how Communicator 4.75 came out in September 2000, I guess this site is really making a go at authenticity, even with non-existent point releases in '99. [0]
A big problem for machines of this vintage is HTTPS everywhere. Google still works pretty well and will let you access it over HTTP, but the rendering of the search results is a bit janky.