1999.io: Blogging like it's 1999 (1999.io)
148 points by gk1 on June 8, 2016 | 107 comments



For anyone who might not have been following Dave Winer's work or thought streams recently, the reference to blogging like it's 1999 isn't about the tech. Its primary concept is a return to the roots of openness. Instead of blogging on all the closed platforms, Dave has always advocated for blogging on your own terms. Whether it's WordPress, Ghost, or his new tool, they all matter equally in keeping the open web alive.

A lot of his focus in this product is likely to be on ease of editing and ease of getting your content out there. For example, it comes with support for Instant Articles built in. I imagine things like AMP support will also get built in.

Re the Twitter login: it's not something I personally believe in, but I know Dave has a strong belief in Twitter's potential to be much better as a dev platform of sorts, from identity management to message delivery. Debatable, but for another time. Just thought I could help provide some context here.

Whether this platform is better than WP, Drupal, or Ghost is honestly all a matter of personal preference. I'm not sure Dave really cares about it that way either. He just wants to keep making the open web more appealing than the closed one.

Of everything in this announcement so far, the most intriguing part is his idea of interop with WP, Drupal, and other platforms. One thing I can connect it with is how his posts cross-post to FB and Medium. And he's talked about live editing a bit over there. If I recall right, Dave imagines a future where the editor and the server are separate: I might edit things on my 1999.io server while the updates are all served from a WP or Drupal site. A bit more context: in the past, Dave also wrote about wishing the Medium.com experience were like that, where he could use their incredible editor and publish anywhere he wanted instead of just to medium.com.

The one "maybe" criticism I have of Dave's stuff is that it is very high-level/abstract at times. I've followed his work for years, and it's always taken me a while to digest his ideas because there's a lot of imagination that goes unsaid. In many ways this is like the wonderful work he did in creating RSS. RSS alone is simply the germ of an idea, which can then be used in so many creative ways. Dave's ideas are similar: a germ of an idea that he hopes others will pick up on and push forward. New frontiers!


Thank you. That's pretty much exactly right.


Forever a fan, Dave! Thank you so much for teaching me about the open web through all your blog posts :)


I have lost count of how many blogs I've lost or had shut down due to acquisition/closure or to losing track of an email account. The only blog I still have? The one I wrote in text files and then posted places.

No platform is forever, except maybe the one you yourself help keep that way.


When I was blogging in 1999, it involved appending each new blog entry to the top of the HTML page that went in the content frame.

And I don't think I'd heard of "blogging" yet.


That was the year we did the first browser-based blogging software. It's when Edit This Page was invented. The first wave of bloggers. The name "weblog". It basically came together in 1999.


I served my first webpages via UserLand Frontier — thanks for building such a neat product all that long ago!


That was how I did it, too. Monthly I'd archive posts, to allow perusal of the archive by month. I wasn't tied to any particular software. I got hosting with friends who set up their own blogs at their own domains. All of us were in or just out of high school, with no real plans for the future and lots of free time to post and talk over AIM/MSN.

In retrospect it was primitive, but it worked. There was a huge scene around all this - everyone had their blogs, and those with an eye for colour and design were doing cutting-edge presentation in idiosyncratic ways. I really miss this when set against today's single-page, "FOL.IO IS A PLATFORM FOR MANAGING WORMS" type of web.

mc's online journal was what drew me into all of this. It's been offline for probably 20 years now, but it was deeply personal and intense, and wonderfully written.


Oh my god. Frames. The horror...

We used to append content into a single page because the alternatives were A) independent post files and a CGI script to concatenate them dynamically on each view, or B) re-creating the static page from individual post files.

Since using a database was total overkill, static pages plus individual posts wasted precious megabytes, and a new dynamic process per page load was way too CPU-intensive, we just curated a single static page. Saves disk and CPU. When posts got past a certain number, the process would juggle some content into a new archive file.
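
For the curious, the whole scheme fits in a few lines. Here's a minimal sketch of the idea in modern Node.js (purely illustrative; the file names, post marker, and archive threshold are all made up):

  // Prepend a post to a single static page; roll overflow into an archive.
  const fs = require('fs');

  const INDEX = 'index.html';
  const MARKER = '<!-- post -->';
  const MAX_POSTS = 20; // juggle older posts into an archive past this count

  function addPost(html) {
    const page = fs.existsSync(INDEX) ? fs.readFileSync(INDEX, 'utf8') : '';
    const posts = page.split(MARKER).filter(p => p.trim());
    posts.unshift(html); // newest entry goes on top

    if (posts.length > MAX_POSTS) {
      const overflow = posts.splice(MAX_POSTS); // everything past the cutoff
      const name = 'archive-' + new Date().toISOString().slice(0, 7) + '.html';
      fs.appendFileSync(name, overflow.join('\n'));
    }

    fs.writeFileSync(INDEX, posts.map(p => MARKER + '\n' + p).join('\n'));
  }

  addPost('<h2>New entry</h2><p>Hello from 1999!</p>');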


Methinks you are selling older computers short. I ran my blog (up through 2005, maybe to 2006) on a 50MHz 486 with 32MB RAM (maybe 64MB? It's been a decade) and a gig hard drive.

That same server was also my email server.

Then again, I wrote my own blogging engine in C [1] (and still use it).

[1] https://github.com/spc476/mod_blog


Sure, you could do plenty with old 486s. But not hosting thousands of weblogs generating dynamic content - not without load shooting up to 100, anyway. For a paltry few visitors it was not so bad, but the more users and the higher the traffic, the worse it got.

I should add that I had completely forgotten (has it been that long?) about the hacks we'd use to speed up dynamic content. Since there were no competent threading models for PHP and Perl at the time, we would use either FastCGI, or mod_php/mod_perl. The former let you build app servers that handled multiple requests without terminating, speeding up initialization and sharing/caching memory. The latter embedded a PHP or Perl interpreter in the Apache process, eliminating a lot of overhead, enabling better communication, and of course allowing you to prefork multiple Apache processes and execute scripts as soon as a request came in, optionally staying resident in memory. But that's just execution speed; if you're reading flat files and spitting them out one by one, that's still a very I/O-bound operation and eats up more CPU than necessary.

And ultimately, very few web hosts allowed using mod_perl or mod_php, and no weblog maintainer ever wrote FastCGI (nor were the servers configured for it). So everything was forked at run-time, leading to very slow page loads for any CGI script - unless you ran your own server, of course. It wasn't until later that LAMP development became more common and people started putting the kitchen sink into MySQL and PHP.
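
To make the contrast concrete: the whole win of FastCGI/mod_php over plain CGI was keeping one process alive across requests instead of forking per hit. A minimal sketch of that persistent-process model in Node.js, since that's what 1999.io itself runs on (the port and markup are arbitrary):

  // One long-lived process serves every request: startup cost is paid once,
  // and state can be cached in memory, unlike a CGI script forked per hit.
  const http = require('http');

  let hits = 0; // survives across requests in the same process

  http.createServer((req, res) => {
    hits += 1;
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end('<p>Served ' + hits + ' requests from one process.</p>');
  }).listen(8080);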


+1 for GPLv2 or later. :D


A CGI script to cat the posts together sounds like the better option.

Wasn't this the sort of thing that PHP was literally invented to do? Having said that, I have no idea what the state of PHP was in 1999.


Nah, it was too CPU-intensive. In addition, even though disks haven't gotten significantly faster since then, filesystems were a lot less advanced, and disk, virtual-memory, and CPU caches were much smaller.

PHP did develop quickly in order to optimize the performance of serving scripted web content. Besides its own performance enhancements it also relied heavily on database storage for content, which significantly improved performance for complex websites.

But it was mostly useless to people with just a homepage and guestbook; most web hosters would not provide database access without charging, and most web devs needed a higher level of technical competency to use them. A lot of websites were run by people who only knew front-end programming and markup languages.


> most web hosters would not provide database access without charging

> most web devs needed a higher level of technical competency

> websites were run by people who only knew front-end programming and markup languages

That was exactly my situation.


Yes, PHP had already largely supplanted Perl CGI scripts by 1999.


The front page tells me nothing. Go and read this blog post about it instead: http://scripting.com/2016/06/08/1311.html


Yeah, what is it with people using front page designs like that these days?


Well this site does not look very '99, nor does "Sign up with Twitter". Never mind.


This post, written with 1999.io, might help explain the context.

http://scripting.com/2016/06/08/1311.html

Dave


I read that and I sort of get it but still feel that I do not quite get it.


Doesn't work without Javascript either :/


http://tilde.club/ is what feels like 1999 to me, because it's just a plain old multi-user Linux box. That's it.


I don't understand what this is about. Is it just a poorly implemented super-common blog engine?


It's a very poorly done welcome page, that's for sure.


I don't know if this is some type of cynical irony, but calling your software "blogging like it's 1999" whilst using a .io domain creates many feelings for me.


Should have been a .nu domain!


sounded interesting, until I selected 'view page source' on one of the pages and saw a screen full of .js

not really very 1999ish.


The '90s were all about JS; how could we forget Doc Ozone? (http://www.ozones.com/)


I thought this was going to be bringing back the .plan file :(


Disappointed. Where are the spinning 3D button gifs?


Ironically, it loads as fast as a 1999 web page on dialup.


What does this have to do with 1999 exactly? RSS 2.0?


This confuses me. Several months ago Dave Winer wrote a post titled 'Anywhere but Medium' where he railed against closed platforms such as Medium that could disappear at a moment's notice and where the editorial voice was controlled by a company (as an individual gets to decide who gets noticed).

Now Dave Winer is launching a platform that could also disappear at a moment's notice, where the best chance of being 'noticed' is likely to be getting reposted by Dave Winer.

What's more, it seems to go against some of the Open Web principles that Winer espouses (seems to require JavaScript to view any content, doesn't seem to render anything if you have Safari Content Blockers running, requires a Twitter account even to get up and running with the thing). All in all it seems to have very little to do with 1999 or any kind of 'golden age' for the Web.

Developers already have tons of options for getting a blog up and running, from GitHub Pages to Posthaven and Ghost - what gives with this?


> Now Dave Winer is launching a platform that could also disappear at a moment's notice,

Looks like you can set up your own server, so assuming you point your own domain at this, if 1999 goes belly-up, you can just switch to another host seamlessly:

https://github.com/scripting/1999-project/blob/master/docs/s...

> (seems to require JavaScript to view any content

This is true of the "About" page, but it looks like that is not true of actual blog entries. If you enable JS on the about page, they explicitly state that one of their goals is a graceful fallback when JS is not present.


You're just making this up. It's completely wrong.


the deps for 1999 consist of request and a date/time lib ... cool.

someone else mentioned ghost, and i'm posting mainly to link that project

https://github.com/TryGhost/Ghost

and thank the author, because i learned a lot about node development from it.


What? No visit counter?


Just a blank gray page for me?


That's brutalist


My work proxy is blocking most of the page's JS/CSS resources, which are being served from a different domain that our firewall doesn't approve of for some inane reason.


It should really be tiled fake woodchip wallpaper for the genuine 1999 look.


Working fine from here.


@davewiner, you mention interop with other CMS software. What kinds of interop are you looking for?


In 1999, I could view a simple page containing nothing but text without having to download, compile, and execute a massive wad of code in a huge bloated VM with a JIT. There's no need for any JavaScript there; try having a web page.


I don't know where you were in 1999, but where I was, IE4's DHTML was all the rage among the elite wannabe high school web developers. Early explorations into XMLHttpRequest (via this awesome ActiveX technology, wow) and DHTML Components (which, perhaps unsurprisingly, don't seem all that different from today's Web Components under the surface).

(I'm only remembering some of this because I recently dug up some of my actual circa 1999 website work.)


I was on the same web you were. The one where Doc Ozone was a bizarre novelty and well over 99.999% of text-based sites had no JavaScript at all, much less were created entirely from JavaScript.


It may have been more of a novelty back then, but I remember a lot more than 0.001% of websites having some JS or some Flash (or both). Maybe not many were entirely JS (though there were some I'm trying to recall), but there were plenty of websites at the time that were 100% Flash, including the sites of some major corporations.

Again, certainly from the perspective of a high school web developer at the time, my "budget" consisted of just about as much JS as I wanted and could get my hands on, whereas I was mostly on static web hosts, which constrained me from doing as much as I would have liked in server-side code... I know I wrote some very JS-heavy sites at the time, and I know I was not alone in that constraint.

Again, sure, it was mostly for novelty, but the web where websites had basically no JS at all was several years before 1999.


text based sites

Just because "web designers" had shitty Flash pages doesn't mean "personal home pages" that were just text and images did.


When I go to their "about" page, this is the entirety of the content that I see:

> How is 1999.io different from other blogging platforms?

Looking at the source, I see a little more content.

"Create a test site" is almost completely blank (no text, just a drop shadow and some pull-down arrows near the top of the page). Their other links seem to work as designed.

My work blocks some domains, and is stricter about ones that it doesn't know about (like 1999.io), but I didn't have trouble with accessing any of the fargo.io css or js files.


This amuses me, partly because I recently uploaded some backup archives of my actual 1999 websites and blogs to GitHub.

http://blog.worldmaker.net/2016/06/07/portrait-web-developer...

The web in 1999 was a very different place, and not a lot of it survives in even the Internet Wayback Machine (web.archive.org).


Let me cross-post my comment from http://my.1999.io/users/NKCSS/2016/06/09/0001.html

Nick Kusters (@NKCSS): Checking the editor to see how this works... It seems too bare-bones to me.

Nick Kusters (@NKCSS): I expected a lot more, to be honest; and no edit button seems like an omission.


why do you need to be able to post to my twitter timeline?


We don't. And it doesn't.


Twitter's permissions stink. I've refused to connect with a few interesting services because there's insufficient granularity.


Was the `nano` command a joke? I know it's a fine editor, but it seemed like sarcasm.


How can I delete my "test" account? Since the test content stays there even after I signed off from the "test", this should be a mandatory feature.


> You can link an MP3 or video file to a post, and 1999.io will generate standard RSS 2.0 enclosure code in your site's RSS feed.

What? No Atom? No NewsML? That is so 1999.
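
For reference, the enclosure the docs describe is a single element inside an RSS <item>, straight out of the RSS 2.0 spec (the URL and byte length here are invented for the example):

  <item>
    <title>My first podcast post</title>
    <link>http://example.com/2016/06/08/post.html</link>
    <enclosure url="http://example.com/audio/episode1.mp3"
               length="24986239" type="audio/mpeg"/>
  </item>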


For this to be 1999-era journaling, it would need:

  - To be written in Perl
  - To support .php, .pl, & .cgi in the user's /cgi-bin/
  - To host about 10000 accounts per physical machine
  - To use FTP, or a CGI form, for remote file management
  - To use HTML 4.01/XHTML
  - What's CSS?
  - What's Twitter?
  - Features: A user profile/bio! User comments! Subscriptions! Communities!
  - Up to 10 megabytes of FREE storage
  - Free add-ons like a hit counter and a feedback submission e-mail form
  - One free e-mail address and five free e-mail aliases
  - EXTRAS: Virtual host name and domain name support, up to 5 e-mail addresses &
    20 e-mail aliases, up to 1000* megabytes of storage, and No Advertising Banners!!!
  
  * actual space may vary based on how badly we over-committed storage


Some of these ideas live on in Neocities: https://neocities.org/

Just HTML and your own assets. No node.js hosting, no JSON+XML backend storage (that we can see), no sweat.

Granted, Neocities isn't built on 1999-era technology either, but I think it captures the aesthetic of "web sandbox that I get to play with in my own webspace" a bit better than this does.


Actually, CSS 1 is quite old, but support wasn't good even in IE4/Netscape 4. IE5 in 2000 was the first browser with almost complete support, and that was one of the reasons it wiped out Netscape.


Sign my guestbook!


Don't forget Webrings!


I'm hoping to make them fashionable again! http://webring.club/ :)


  - A visit counter with configurable styles.


The technology is 2016, not 1999.


Then why is your catchphrase "Blogging like it's 1999"? What makes your site anything like 1999?


After re-reading my comment, I think I sound rude so I apologize.

If you are going to use that catchphrase, maybe your site should explain how it's like 1999 because I don't think it's obvious.


> We use Twitter for identity, so creating a connection to Twitter is part of the setup process.

That must be as far from "blogging like it's 1999" as one will ever get. :-/


Reminds me of the "frozen in time" NBA Jam site ... with Google analytics.


Do you mean the SpaceJam website?

http://www.warnerbros.com/archive/spacejam/movie/jam.htm

I go here every now and then. Not sure why.


I see "frozen in time" Omniture (Adobe) analytics tags, but no Google.


Strange, I remember it being there just a few months ago when someone posted it.


Oops, yeah that one.


Beats asking for a password over HTTP and storing it as plaintext, at least.


It's better to let them leak my throwaway email and 9IB2?iDnDaFt0cf6jwXQp_ than to give them access to my tweets, or even my Twitter ID.


We've been doing passwords on the web since before 1999 and we weren't storing them in plaintext even then. I bet there is a proven authentication framework they could use in this project.

HTTPS was there in 1999 too, but luckily we can use the current version, which is more secure.
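
For what it's worth, the "proven" approach is a few lines with any maintained bcrypt binding. A minimal sketch in Node.js using the bcryptjs package (illustrative only - 1999.io uses Twitter for identity instead):

  // Store only a salted hash, never the password itself.
  const bcrypt = require('bcryptjs');

  // On signup:
  const hash = bcrypt.hashSync('hunter2', 10); // 10 = cost factor

  // On login: compare the submitted password against the stored hash.
  console.log(bcrypt.compareSync('hunter2', hash)); // true
  console.log(bcrypt.compareSync('wrong', hash));   // false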


Implying Twitter won't just hand it over to governments.


"Governments" aren't anywhere near the complete scope of risk you bring on yourself by storing passwords badly.


I'd rather do that than risk my identity being compromised in a place like Iran, where my life is at stake if I speak out against the government.


I understand, but maybe if you're blogging for the purpose of political dissidence you could look elsewhere? Home-grown authentication adds a lot of overhead to web apps, so it makes sense to hand that off to one or more standardized identity management systems.


Clarus the Dogcow favicon is pretty 1999


an anti-aliased clarus the dogcow... not very 1999...


moof!


...and the fact that no one called it blogging in 1999.


http://essaysfromexodus.scripting.com/whatIsScriptingNews

"My name is Dave Winer. Scripting News is my weblog, started on April 1, 1997. It's the longest continuously running weblog on the Internet."


Sure, that's when the term was coined, but no one talked about having blogs in 1999. The words I did hear back then were "personal website", "finger", or "zine".


1999? XML is our savior! The future of the web will be interconnected SOAP services! It's the year of the Linux desktop! Itanium will revolutionize software if someone would just make a sufficiently smart compiler!


To be fair, the neat parts of SOAP, like WSDL, are still pretty neat. XML is mostly hated because named closing tags make it stupidly verbose -- a big mistake. On top of that, it was extra complicated, which meant platforms like PHP had crappy implementations, I guess.

But having to rewrite lots of boilerplate code for everyone's JSON or "REST" API is annoying. There are even projects to describe JSON schemas and APIs, because that's actually useful. Maybe this time it'll be simple.

Transactions over SOAP (WS-AtomicTransaction, I think) are also sort of neat, I guess, but too complicated to be useful on the Internet?


> On top of that, it was extra complicated, which meant platforms like PHP had crappy implementations, I guess.

If the implementations are bad, the specification may not be simple enough.


That is funny, considering Dave Winer's long hands-on history with RSS and XML-RPC (and his close seat to watch the disaster of SOAP).


> Blogging like it's 1999

> 3. It's written in JavaScript and runs under Node.js.


If I have to enable javascript to view parts of your website at all, then it isn't anything like 1999.


To edit a blog you have to enable JavaScript because the blogging software is written in JS.

It would be possible to create a client that didn't, but that doesn't exist today.


> To edit a blog you have to enable JavaScript because the blogging software is written in JS.

> It would be possible to create a client that didn't, but that doesn't exist today.

I think the client exists: it's a web browser using forms. Not as nice as what we all expected we would have by now back in the 90s, but it works. A blog entry's just a title and some text. Throw in an input field for the title, and a textarea for some Markdown and baby you've got a blog!
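
In that spirit, the entire "client" could be a form like this (hypothetical markup; 1999.io exposes no such endpoint that I know of):

  <form method="post" action="/newpost">
    <input type="text" name="title" placeholder="Post title">
    <textarea name="body" rows="10" cols="60"></textarea>
    <input type="submit" value="Publish">
  </form>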


You have to enable JavaScript (from fargo.io) to view posts (or at least to view the About page.)


The About page should NOT have required JS. That was a mistake. I'm going to fix it, but not tonight.

Here's an example of a blog post written in 1999.io.

http://scripting.com/2016/06/08/1311.html

You do NOT need JS on to read it, by design.


I feel like a lot of people are harshing on the fact that this is not like, I dunno, geocities or something, but I just want to express some appreciation that an effort is made for those of us who do not want to execute arbitrary code just to read a blog entry.

One note - on that page, it seems that while the content is there independent of JavaScript, the navigation is loaded (from three different domains) with JavaScript. I suspect there's a more graceful way to fall back there.
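
The conventional graceful fallback is to ship the nav as plain links in the HTML and let the script enhance it, e.g. (hypothetical markup, not what the site actually serves):

  <!-- Works with JS disabled; a script can progressively enhance it. -->
  <nav id="site-nav">
    <a href="/archive.html">Archive</a>
    <a href="/about.html">About</a>
  </nav>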


> If I have to enable javascript to view parts of your website at all, then it isn't anything like 1999.

JS is older than that, and early uses of it were notorious for not having graceful fallbacks, so, no, I think that description is not at all unlike 1999.


We need the little text+logos at the bottom of the page telling us which browsers we can expect it to work on.


Websites in 1999 didn't use 2MB PNG files as background images.


Do people nowadays really not care? Or are they completely oblivious to the fact that images can be huge? 2MB is still a lot of data, even today. I could see it loading (on 100Mbit, no less).


Just for the record, I want to point out that it does not render well on Netscape Communicator 4.75.

http://grab.by/QH2u


Seeing how Communicator 4.75 came out in Sept 2000, I guess this site is really making a go for authenticity, even on non-existent point releases in '99. [0]

Ok fine, I get what you were trying to say. :)

[0] https://en.wikipedia.org/wiki/Netscape_(web_browser)#Release...


In a way, I feel amazed that it even recognized that TLD.


A big problem for machines of this vintage is HTTPS everywhere. Google still works pretty well and will let you access it over HTTP, but the rendering of the search results is a bit janky.


Nice, and on an SGI too...



