It's Time for Server-Side Browsers (camendesign.com)
47 points by necolas on Feb 21, 2011 | hide | past | favorite | 43 comments


Could someone translate what this is actually trying to say into something clear? This blog post reads like someone on weed trying to relate a profound idea to you, rambling on in stream-of-consciousness style. He wants there to be no 'centralization' (that is, servers) and to have things run on a cloud of people's web browsers, yet at the same time it still talks to MySQL somewhere? We won't need any one site to hold data, and things will just magically work by sending them out into the ether?


What I understood is that the author wants the browser to run on the server and then stream the video, VNC-style. That's a profoundly silly idea, since you'd have to basically get one server per user for any semi-CPU-intensive rendering, and to what benefit? I don't know.


What I’m proposing is that we replace PHP with a web browser so that the code we run on the server could be the same code we run on the client. To make it so developers only have to learn one language, one set of technologies.

That will eventually lead to decentralised applications; in the meantime it will simply lay the groundwork whilst people continue with the centralised model.

There will always be a need for centralised systems; the world is not going to change its entire toolset and paradigm overnight. We must first make centralised apps use pretty much the same technologies as decentralised ones, then coax developers over.


With desktop applications you can use the same language/code on both the client and the server side. That did not lead to decentralization. "Well, that isn't cross-platform," you might say. However, Java mostly covers that, and still it did not happen. There are also numerous tools that let you write in one language on the server side (Java with GWT, some Lisp one I cannot name) and automatically generate equivalent JavaScript for the client.

There is no reason (and you provide no argument) why using the same code on the client and server would result in decentralization. Furthermore, the capability already exists and still has not led to decentralization.


Presumably he/she envisions a Facebook/BitTorrent hybrid with content replicated across the social graph.

You'd get drafted into storing images for some subset of the intersection of your social graph and several others, and the rest of your images would be served by someone(s) with whom your social graph intersects.
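A replication scheme like the one described could be sketched roughly as follows. This is purely illustrative — the function names, the hash, and the replication factor are all invented here, not taken from the post — but it shows how every peer could compute the same storage assignment for a photo without any central coordination:

```javascript
// Tiny deterministic string hash (djb2 variant) so every peer in the
// social graph computes the same assignment independently.
function hash(str) {
  let h = 5381;
  for (let i = 0; i < str.length; i++) {
    h = ((h * 33) ^ str.charCodeAt(i)) >>> 0;
  }
  return h;
}

// Rank the owner's friends by hash(friend + contentId) and take the
// top `replicas` entries -- a rendezvous-hashing style assignment,
// so each photo is "drafted" onto a different subset of friends.
function replicaHolders(friends, contentId, replicas) {
  return friends
    .slice()
    .sort((a, b) => hash(a + contentId) - hash(b + contentId))
    .slice(0, replicas);
}

const friends = ["alice", "bob", "carol", "dave"];
const holders = replicaHolders(friends, "photo-123", 2);
console.log(holders); // two friends, the same two on every run
```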

Presumably invitations would be via e-mail, or, if you wanted to be even more decentralized, QR code.


The best I have been able to make of it is that he wants to make a protocol where anyone can make an application/service and distribute arbitrary data via the browser. Basically, the internet at large except all of the 'computers' are actually web browsers running on actual computers.


... Which assumes people keep their machines running 24/7 on unlimited data connections. It's a nice idea from a tech point of view, but for the life of me I cannot see how it's superior to what we already have.


Well, except some degree of redundancy would be built in to the storage scheme à la BitTorrent.

As far as data volume, I've got to imagine the data usage per user would be significantly less than BitTorrent.

The desirable aspect of this system is obvious - the only people who see your data are people who've seen a QR code from someone in your social graph - your FaceTorrent profile will never, ever be indexed by Google.


> The desirable aspect of this system is obvious - the only people who see your data are people who've seen a QR code from someone in your social graph - your FaceTorrent profile will never, ever be indexed by Google.

What if Google started offering, say, free Google Coupons (assuming they roll their own groupon) in exchange for befriending them on FaceTorrent? Not all of my friends and relatives value privacy as much as I do.


I'm left wondering whether the idea is server-side browsers or browser-side servers. Plus, what's the problem with dealing with JS client-side and something else server-side? And as for this...

"I therefore cannot redistribute and decentralise the server-side parts without expecting the end user to understand how to configure and run a server"

You can. It's called a desktop app.


This is not a layer 7 problem, it's a layer 1 problem. Get your ISP to provide unfiltered, nearly-synchronous connectivity with pervasive IPv6 (no NAT), and then we can start getting rid of the client-server model.


That's at least layer 2 (in the IP model) or layer 3 (in the OSI model). /nitpick


The time for "server-side browsers" came a long time ago - check out headless web-application testing frameworks (HtmlUnit, for example).


Another one is "Envjs": http://www.envjs.com/

Quote: "Envjs is a simulated browser environment written in javascript"


This discussion covers headless browsers: http://news.ycombinator.com/item?id=2142104


I built a proof of concept that sort of does this:

https://github.com/shazow/relay.js

Someone hosts a hub which just acts as a dumb message-passing relay between browsers via websockets. When a server connects, it sets a code payload. Then when a client connects to the same hub and requests the server, it receives the code payload and a streaming 2-way connection to the server.

There are a few live examples, like a collaborative whiteboard in ~40 lines of JS.
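The hub protocol described above can be modelled in a few lines. This is an in-memory sketch, not the actual relay.js code (which runs over websockets); the class and method names are invented here, but the flow is the one described — a server publishes a code payload, a client receives it plus a 2-way channel, and the hub just passes messages blindly:

```javascript
class Hub {
  constructor() {
    this.payload = null;       // code payload published by the "server" peer
    this.serverHandler = null; // server's callback for incoming messages
  }

  // A server connects: it sets the code payload clients will receive,
  // plus a handler for messages relayed from clients.
  registerServer(payload, onMessage) {
    this.payload = payload;
    this.serverHandler = onMessage;
  }

  // A client connects: it gets the payload and a streaming 2-way
  // connection to the server. The hub never interprets the messages.
  connectClient(onMessage) {
    const hub = this;
    return {
      payload: hub.payload,
      send(msg) {
        // dumb relay: client -> server, and server's reply -> client
        hub.serverHandler(msg, reply => onMessage(reply));
      },
    };
  }
}

const hub = new Hub();
hub.registerServer("draw(x, y)", (msg, reply) => reply("echo:" + msg));

let got = null;
const client = hub.connectClient(msg => { got = msg; });
client.send("hello");
console.log(client.payload); // "draw(x, y)"
console.log(got);            // "echo:hello"
```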


Isn't it how mobile Opera works already?


It’s how Opera Mini works (sort of) but not how Opera Mobile works.


Going by the title, yes, that's (sort of) how Opera Mini works, but going by what he wrote, no, not really. He seems to be talking about browsers as servers on the client side, not turning browsers into glorified image viewers.


Every Opera can use Opera Turbo to get recompression of pages. The author is talking about something more than that, though; he's basically talking about VNCing to a server.


PhantomJS is quite nice -- headless javascript which leverages webkit for rendering.

http://code.google.com/p/phantomjs/


Making browsers servers as well makes much more sense - at least allowing communication that is peer-to-peer. This would dramatically reduce network costs for the application owner.

There is a proposal for peer-to-peer browser communication. See: http://www.whatwg.org/specs/web-apps/current-work/multipage/...


An interesting tangent to this discussion is OTOY (http://www.otoy.com/), where your browser simply functions as a display and input device; all the heavy lifting of 3D rendering and game logic is processed server-side.


I believe this is what Aptana's Jaxer (http://jaxer.org/) tried to do. It didn't exactly take the world by storm...but perhaps it was too early...


aka Browser as a Service - this is something I thought about a few years ago. It's becoming a reality with Opera Mini & Skyfire. http://virteal.com/UltraLightBrowser


It's much more powerful and smarter to make the clients into servers. Aka P2P.


Sorry, I believe this was done ages ago - remote desktop, such as VNC.


Way before that — the X window system.


It's not seamless though, and that makes it a very different idea.

He's talking about it working at the same level as a browser where it sees if you have HTML5, and if not falls back to flash, dynamically.
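That kind of dynamic fallback could be sketched like this. In a real page the capabilities would be probed from the DOM (e.g. by creating a <video> or <canvas> element); here they're passed in as a plain object so the decision logic stands alone — the function name and the capability fields are illustrative assumptions, not from the thread:

```javascript
// Pick a rendering path: HTML5 if the browser supports it, Flash as
// a fallback, and a static image as a last resort.
function pickRenderer(capabilities) {
  if (capabilities.html5Video && capabilities.canvas) return "html5";
  if (capabilities.flashVersion >= 10) return "flash";
  return "static-image";
}

console.log(pickRenderer({ html5Video: true, canvas: true })); // "html5"
console.log(pickRenderer({ flashVersion: 10.1 }));             // "flash"
console.log(pickRenderer({}));                                 // "static-image"
```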

I can't imagine it being a high priority - fiddly, low return, and it creates huge performance issues. There are lots of basic problems: you don't want someone supplying input to a game over a (high-latency) network, and you don't want to be running instances of performance-intensive games on your web server.

Still - the proposed idea is distinct from VNC or X.


You can implement this (somewhat wastefully) using VirtualBox and VNC.


My point above is that there's a vast difference between (1) something being possible by a tech-savvy user spending time to jury-rig a solution, and doing it on a case-by-case basis and (2) the same kind of thing being done as a commodity - something that runs commonly on the platform with no effort on the part of end-users.


Maybe there's not as much difference as you might think. Look up Paul Farnell and Litmus. You can listen to his Mixergy interview from January.


High-level nap talk.


Or we could stop treating the HTTP/browser combination as an application delivery platform.

It's unpopular here, but it's becoming exceptionally obvious as the correct choice.


The term "exceptionally obvious" might be overstating the case a bit. Could you please expand on why you think that?


> Could you please expand on why you think that?

In 1993, I was the only person I knew with a web browser. 18 years later, we have applied hack after hack to make it into an "application platform." And it's still the same old thing: a way to deliver hypertext pages to people.

If you want to allow people to sync their files, share large popular binaries or listen to the perfect mix of music, you don't do it over HTTP. Not because you don't understand HTTP, but because you do.


It's possible that the reason the web's still "the same old thing" is that it has proven an effective platform for distributing services and is, _in practice_, a system that's extremely scalable and evolvable. In fact, it's so successful there's nothing to compare it to - which means no value judgement or comparison is at all "obvious".

HTTP is also extensible (by design), which has led to things like WebDAV - stuff that is used at large scale, over the web. So the web is not 'just' HTTP.

There are deliberate trade-offs in the web's architecture, which means, unfortunately, you can keep coming up with edge-case applications that are challenging (though often not impossible) to implement using web technology. The reality is that the large majority of applications are information-centric, distributed in nature, and need to evolve over extended periods of time - and they therefore benefit _greatly_ from the trade-offs the web has made.

The web's not perfect but it's evolving and - importantly - it doesn't have the benefit of being completely fictitious and living inside the mind of a jilted, chippy geek. No offense. ;)


How should we treat it?

What is the alternative application delivery platform?

Where is it becoming exceptionally obvious?

Thanks in advance.


We should treat it as a document delivery and linking system, as it was designed to be.

Currently, the alternative application delivery platform is the App Store, the runaway success of which is neatly proving the point that the web just isn't up to the job of serving applications in the way we've been trying to hack it to become.

That's also where it's becoming exceptionally obvious, as people struggle mightily (even Apple themselves) to create web apps that have a look and feel anywhere close to those of native apps. And users feel the difference, sense the request/response bubbling underneath and the JS churning on top, and reject them.


Thank you for your answer. It's a valid example, but I don't think it proves the point that the web isn't up to the job of serving applications. Maybe in mobile that's true; maybe mobile is the future. For now I would counter-argue that the web-app market is much bigger than the mobile-app market.

For whoever downmodded me, RTFM: http://paulgraham.com/road.html. If there is a new road ahead, write a "new road ahead" essay close to that form so I can understand. I won't stand for empty vanguardism.


I don't think your lack of understanding entitles you to demand someone write a 2000-word essay for you.

But, briefly, many of the points in Road can still be right with the conclusion being wrong. Yes, data is more important than the computer, and server-side computation and storage of data are increasingly important.

But the web as the client interface? Network-aware native apps are trumping all over it. These aren't the desktop apps PG speaks of, they're not shipped in boxes and they're not static. They're downloaded instantly, updated easily, network enabled and fully responsive.

Convenience over all is the thing for users, and apps are just more convenient. They don't load themselves from network every time (Gmail, Twitter), they don't have UI elements that fill in slowly as their graphics load over the network (Posterous and a million others), they don't transform without warning (Facebook) and they don't pretend that a way of sharing research papers can support an interactive session without you noticing.

As for the size of the market, I've already spent way more on desktop and mobile software than I've ever spent on SaaS/site subscriptions, and I don't see that changing. Where there's a native app I'll almost always prefer it.

Why? Because native apps are better software - a better experience. And when they're supported by (synced/cached) server-side data (and ideally a fallback to a web client if I find myself at some random terminal somewhere) I get the best of both worlds.


I don't feel entitled to demand anything, I was just asking :).


Server side browsers are good for making screen snapshots.



