Google launches Portable Native Client (thenextweb.com)
125 points by hackhackhack on Nov 12, 2013 | 111 comments



I think this and asm.js are both extremely cool. I look forward to the competition this will inspire, and the improvement of both approaches as a result. I say this honestly not knowing which one I personally think will be better in a year or two.

I often think of software as "walking a path." What I mean by this is that there are often different approaches to the same problem, and these start out as just basic, undeveloped ideas. But put smart people on it (and I have high regard for both the Google and Mozilla engineers on these projects) and let them refine these ideas over time and you'll learn more about all of the consequences, the fundamental strengths and limitations, of these approaches.

I first had this thought when working at Amazon. I remember hearing an off-hand comment about how S3 and Dynamo, both Amazon projects that started around the same time, started with very different consistency goals (Dynamo: very weak, S3: very strong) but as the details were fleshed out, ended up coming towards each other and both became more moderate.

I think that's extremely valuable knowledge, because it can become part of the body of resources that any future designer can look at. Whenever you have an idea, chances are other people have "walked the path" of that idea (or some variant of it) before, and reading about their experience can help you see down the road and anticipate what practical problems you'll run up against. You might have an idea for how to improve on what the previous designers did, but by reading about their experience you can save yourself a lot of the work they already did.

For this reason, I very much look forward to watching these two projects evolve and compete, and in doing so, to learning more about the fundamental properties and tradeoffs of this design space.


Thanks for bringing some positivity to the debate. PNaCl and asm.js are timely and they are technically very different. It will be interesting indeed to see how it goes from here.


The introduction of pepper.js is particularly interesting. It's a re-implementation of (some of?) the Pepper API on top of the standard web APIs, so you can write stuff for PNaCl but also compile it using Emscripten for other browsers. Which means you could target both PNaCl and asm.js at the same time.

That also means it's effectively encouraging asm.js adoption, which might give Apple and Microsoft enough of a kick to optimize that. Google seem to already be planning on it (they added some asm.js tests to their Octane benchmark).

It also might be useful if, at some point, other browsers want to implement a PNaCl runtime. One of the main sticking points (aside from the use of LLVM bitcode as a distribution format, which is not something LLVM was designed for) was that it requires the Pepper API, which is huge and duplicates much of the existing web APIs. Google has the spare engineers to do that, but Mozilla certainly doesn't.

A reverse of that would be useful too - an implementation of the standard web APIs on top of Pepper, so you could re-target an asm.js app to PNaCl. Or some kind of tooling that lets you transparently target both.

Maybe we'll see some kind of convergence thing going on? Wouldn't surprise me too much if they started evolving slightly towards one another. I think it's probable that asm.js will eventually get some kind of compact bytecode representation, especially if it can be implemented as a shim to allow it to load in existing browsers.


Yes, I find the possibility of convergence hopeful.

I'd prefer the inverse of pepper.js and deprecation of Pepper. Way back in the plugin-futures days, every other browser vendor rejected Pepper because it duplicated Web APIs, and a C binding to the Web APIs would be more useful. To me, Emscripten and pepper.js just show that this was true. Emscripten provides C bindings to Web APIs, and pepper.js shows that Pepper is effectively equivalent to the functionality the Web APIs provide. So I see no need for Pepper, and I hope that Google will eventually eliminate this unnecessary duplication of functionality.


> pepper.js shows that Pepper is effectively equivalent to the functionality the Web APIs provide

From my reading of it, pepper.js can only support a subset of Pepper's complete capabilities. At a high level, it is missing:

    - threads
    - memory mapping
    - memory protection
There's also stuff like this in pepper.js that clearly demonstrates a lack of parity:

    // The Web Audio API currently does not allow user-specified sample rates.
    var supportedSampleRate = function() {
      return createAudioContext().sampleRate;
    }
Also pepper.js is filled with chunks of code like:

    var FileRef_GetFileSystemType = function() {
      throw "FileRef_GetFileSystemType not implemented";
    };

    var FileRef_GetName = function() {
      throw "FileRef_GetName not implemented";
    };

    var FileRef_GetPath = function() {
      throw "FileRef_GetPath not implemented";
    };

    var FileRef_GetParent = function() {
      throw "FileRef_GetParent not implemented";
    };

    var FileRef_MakeDirectory = function() {
      throw "FileRef_MakeDirectory not implemented";
    };

    var FileRef_Touch = function() {
      throw "FileRef_Touch not implemented";
    };
I don't know enough about these APIs to say whether this is just incomplete or whether these Pepper calls can't be reasonably implemented on Web APIs, but pepper.js seems far from a demonstration that Pepper is "effectively equivalent" to Web APIs.


My problem with PNaCl is that it's basically defined as an LLVM module (snapshotted as whatever LLVM's implementation looks like today), but with a bunch of limitations on what you can do ("The only calling convention supported by PNaCl bitcode is ccc - the C calling convention", "PNaCl bitcode does not support visibility styles."[1]).

I understand not wanting to re-invent the wheel, but I don't think we should bake a bunch of LLVM-isms into something that should be designed to run on browsers for the next 20 years. Basically, the whole PNaCl spec feels way too half-baked for something that should be a standard -- sort of how I felt about embedding SQLite into web standards as the spec and reference implementation (Web SQL Database, shiver).

I prefer asm.js's approach over PNaCl. I'd also be happy if the browser vendors came up with a new bytecode from scratch as well, designed for the web.

[1] https://developers.google.com/native-client/dev/reference/pn...


And the Web SQL initiative failed for just this reason; SQL itself does not define enough to write portable SQL, so they just had to say "well, SQL as implemented by SQLite as of this particular version", and that was pretty much a non-starter. Web SQL had even gotten a little more buy-in from multiple vendors (Safari, Chrome, and Opera), but due to these problems Mozilla and Microsoft refused to implement it.

The same is likely to happen to PNaCl. It will probably survive for a while in Chrome, as it may be useful for writing ChromeOS-specific apps, but it will never be adopted by any of the other browser vendors. ASM.js runs just fine in other browsers; the only difference is that they don't implement the stricter subset and the same optimizations that Firefox does. So it's a lot more likely to catch on, as people can write cross-platform ASM.js right now.


Correction: you can write cross-platform PNaCl code; there's a project called pepper.js that compiles the PNaCl application using emscripten. To quote the GitHub README: "Native Pepper applications can now be run in Chrome, Firefox, Internet Explorer, Safari, and more."

This leaves them in roughly the same place: write code that will be heavily optimized for one of the browsers, then include a compatibility layer that will make it runnable in all of the others. Without making an argument as to which underlying approach is better, I see no clear upper hand from the perspective of compatibility.

What somebody needs to do now is define another language that compiles to both PNaCl and asm.js-suitable languages, and offer a deployment mechanism that correctly uses one or the other when in a browser-optimizable situation. Nothing beats too many standards like adding another standard to unify them. ;)


Yeah, and you can also run Linux in a browser and run Chrome within that and then everything will be portable everywhere. Except, well, for performance.

One of the big differences between ASM.js and PNaCl is that ASM.js just exposes ordinary standardized Web APIs, while PNaCl gives you the Pepper API, which is basically an entirely Chrome-specific API. That means that rather than just using Emscripten and compiling with two output targets (PNaCl and ASM.js) which call the same APIs, you need a wrapper layer on one or the other (emulating PNaCl with standard Web APIs, or vice versa, or having some other intermediate layer that abstracts over both). The whole point of ASM.js is to leverage existing standardized infrastructure, while PNaCl just exposes what's convenient to expose in Chrome, while likely being considerably more expensive for other browser vendors to implement.

This is a lot like Filter Effects in IE. Rather than specifying a reasonably portable syntax, they defined something that was built specifically on some random subset of DirectX filters, and pretty much impossible to implement anywhere else without implementing a large amount of DirectX. While other browsers eventually released most of the same things in a portable manner, there was a while where you had to either implement both or use some kind of wrapper of one over the other.

While I think that it's a good thing that Google has spent time experimenting with and building NaCl and PNaCl (in general, you need ad-hoc single engine experimental implementations that people can play around with in part to figure out what will actually work for authors and is actually implementable, like canvas was, or various CSS3 effects over the years), it's a standardization dead end, and it would be nice if they would spend their effort on trying to extract something that could be standardized from it and ASM.js, rather than enabling it for web pages so it'll become another single-vendor technology that leaves others as second-class citizens.

This kind of vendor lock-in is out of line with the promises of openness and support for standards that Google has made in the past, and it's especially concerning since they're selling hardware that only runs Chrome. Between locking people into a single browser on that platform, and a single browser if they target PNaCl (with maybe second-class support for other browsers via Emscripten and pepper.js), this is starting to remind me of what a certain other company did when they had the sleek, fast new browser that was trouncing the big buggy slow incumbent.


Is it really cross-platform PNaCl code if making it cross-platform requires converting the PNaCl LLVM IR into an actually-portable language (Javascript) using asm.js technology and a reimplementation of Pepper? That sounds more like someone went to the trouble of writing an emulator/translator for a foreign platform, treating PNaCl the same way they'd treat old game consoles or ARM for android simulators.


>there's a project called pepper.js that compiles the PNaCl application using emscripten.

That made me laugh. You write code that is supposed to compile to native code, only to have it compiled to javascript instead. It's a crazy world we live in.


It is the hipster world, where an application created to show documents has been subverted into a virtual machine.


> I'd also be happy if the browser vendors came up with a new bytecode from scratch as well, designed for the web.

I would actually really like this, if defined as an isomorphism to asm.js. (asm.js core operations are pretty simple, it's only the JavaScript syntax that is funny.) Anyone can do it. Write a compiler in JS that compiles this WebBytecode to asm.js (and PNaCl if you want, though I'm not sure if you can dynamically generate PNaCl at runtime from JS) and evals it. You could then ship apps with this bytecode today and have them work in all browsers, and it would be, effectively, completely standards-based since the asm.js semantics would be normative and the asm.js semantics are defined by the ES6 draft spec. You'd be able to ship a clean bytecode representation, unsaddled with the legacy baggage of JavaScript or LLVM, and cross-browser compatibility with what could be, if implemented properly, a tiny stub loader.
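To make "only the syntax is funny" concrete, here is a minimal sketch of an asm.js-style module (the names are purely illustrative): the "use asm" prologue and the coercion annotations are the odd-looking part, while the operations underneath are plain integer and double arithmetic.

    // A tiny asm.js-style module: "use asm" plus the coercion
    // annotations (x|0 for int, +x for double) are the funny syntax;
    // the operations themselves are ordinary JS.
    function MiniModule(stdlib) {
      "use asm";
      function add(a, b) {
        a = a | 0;           // parameter a is an int
        b = b | 0;           // parameter b is an int
        return (a + b) | 0;  // int result
      }
      function scale(x) {
        x = +x;              // parameter x is a double
        return +(x * 2.5);   // double result
      }
      return { add: add, scale: scale };
    }

    // Runs in any JS engine; engines that recognize "use asm" can
    // validate and ahead-of-time compile it.
    var m = MiniModule(this);
    console.log(m.add(2, 3), m.scale(4.0));

A bytecode that round-trips to and from this form would only need to encode the types and operations, not the JavaScript spelling of them.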

The biggest hurdle to getting it working in PNaCl would be wrapping the Web APIs in the Pepper APIs as an "inverse pepper.js", but that should be done anyway as any app that wants to target the standards-based Web APIs and the Google-specific, nonstandard Pepper APIs will want this.


I don't think the problem with asm.js is the on-the-wire format, it's the semantics. Someone writing native C code wants threads and low-level concurrency primitives. Javascript is fundamentally unable to model that, and Javascript as designed for the browser does not model synchronous blocking either (sync XHR and dispatchEvent notwithstanding).

I feel like both PNaCl and asm.js are a kludge in search of a problem. I just don't see either of these winning over, say, DICE to implement Battlefield 4 in a browser, or Valve for the next Half-Life franchise. Triple-A developers don't want to leave performance on the table, and running your code inside a browser virtual machine, even a C-based sandboxed one, virtually guarantees that.

Asm.js might be vitally important for a platform like Firefox OS, but I doubt either PNaCl or asm.js will ever be something the drive-by web bothers with, because the kinds of applications that need native levels of performance are also the ones whose developers are most likely to do native, platform-specific ports. That is, having to recompile and port from Win32 to OSX to Linux is a worthwhile tradeoff vs. losing 20-50% performance off the bat, not to mention the different purchasing behaviors of Web users vs. console or native mobile users.


I'm so glad to see Google finally make a mobile version of Active X.

Seriously though. Stuff like this is what made me quit Chrome. Too much non-standard Google nonsense in what is supposed to be a standards-compliant web-browser.

It just feels wrong.


Unlike ActiveX, PNaCl runs on all major OSes (desktop and ChromeOS so far, not mobile yet) and on all major CPU architectures.

ActiveX, AFAIR, was portable in true Henry Ford fashion - running everywhere, as long as it's a Wintel box with Internet Explorer.


I believe Microsoft learned that lesson a decade ago. Since then, they've pushed Silverlight as an open, standardized, cross-platform programming environment for the web with virtually no success. This is the relevant comparison.


Silverlight was a big success up until Sinofsky decided to kill it (with Silverlight, you could create rich apps and bypass Windows Store). Worst decision ever, I hope PNaCl takes over the world now.


Yes because the market always picks the superior technology.


Wasn't Silverlight discontinued though?


> Microsoft Delivers ActiveX on the Macintosh
> Oct. 17, 1996

http://www.microsoft.com/en-us/news/press/1996/oct96/macpr.a...


Marginally off topic, but calling a Windows box a Wintel box implies that there is a functional, if not fundamental, difference between Windows/Intel machines and Windows/AMD machines. There isn't.

I ran all the same ActiveX controls on my WAMD box that you did on your Wintel.

Or am I missing something about the Windows/Intel combination?

Anecdote: I switched to Mac (Mintel? Mactel?) in 2007. Prior to that, my 486DX4 100 (circa 1995) was the last Intel I used.

</offtopic>


Wintel just refers to the instruction set, not the actual processor. The reason for using it in this case is that since ActiveX are compiled for x86, they won't run on other platforms that Windows runs on. In the past, Windows NT ran on PowerPC, Alpha, MIPS, and Itanium platforms; now Windows runs on ARM in the form of Windows Phone and Surface. So it's not just IE on Windows that is relevant for ActiveX, it's IE on Windows on x86. But that's a mouthful to say, so people abbreviate it to "Wintel".


For a while there were non-x86 Windows NT boxes. In fact, there are again with Windows RT. Wintel versus WinDEC or Wintanium or WinMIPS, for example, though there was never enough market share for any of those to justify a neologism.


From Wikipedia:

Wintel is a portmanteau of Windows and Intel, referring to personal computers using Intel x86 compatible processors running Microsoft Windows.

http://en.wikipedia.org/wiki/Wintel


I believe the term was coined this way because x86 was originally invented by Intel.

Although, indeed, Intel lost that control when AMD introduced AMD64. :) But this is really getting offtopic.


Additionally, it is Open Source.


So what? It needs to be adopted by other vendors, otherwise it is just literature for sleepless nights.


I believe it's open, and not patented (or am I wrong?), an attempt to bring LLVM IR (quite standardized) to the browser side. It's redundant (it competes with asm.js, which is the same kind of "non-standard nonsense" with a WIP spec) and has its own pros and cons, but that's about it.

Every browser vendor experiments with new ideas; it's only because of this that we get nice things as a result.


I have no axe to grind against LLVM or PNaCl, but I think you may have a novel definition of "quite standardized".


You're right. It's widespread, but I was certainly wrong with "standardized" part.


It's easy to say that when you're not providing any examples yourself of where PNaCl's standardization efforts are lacking, and where asm.js is doing a good job of it...


I'm using the standard definition, of course.

Seriously, the absence of a published standard from a credible and relevant standards body would be my primary objection to the phrase. Barring that, de facto standardization would be indicated by multiple interoperable implementations from more than one vendor and widespread adoption. LLVM IR and PNaCl are, at best, documented. That's better than undocumented, but it's a long way from standardized.

I never said asm.js is standardized. ECMA/JavaScript is, though, and asm.js is just a well-defined conventional subset of JS. There doesn't need to be a standard.


If we take the definition of "standardized" as "there exist one or more cross-vendor (draft) standards that, when correctly implemented in a user agent, allow asm.js to run properly" then asm.js is standardized.


At least they are trying to innovate. I am tired of seeing the n-th JS MVC framework posed as something fundamentally new.


What I'm really looking forward to with this is not PNaCl on the web, but PNaCl as a way of writing high performance portable native apps by using a Chromium wrapper.

I'm very excited by the idea of being able to write an app that "just works" across a ton of platforms using JS and HTML5, with compiled code sprinkled in for performance-critical portions. Projects like AppJS have been working toward this already, but with PNaCl, a big missing piece can finally be filled in.


Have you considered XULRunner?


Could we just standardize and complete what is missing from Firefox OS's proposed standard, WebAPI?


I am unaware of any part of WebAPI which is geared toward native code performance. And cool as it is, asm.js is never going to have the same potential that PNaCl has. Primarily this falls under one reason: multi-threading.

Web Workers are a step in the right direction, but they're incredibly limited by design because of the inability to share memory across threads. The requirement of copying data across threads means that parallelizable tasks involving small operations over large chunks of data often don't benefit from using Web Workers. Unfortunately those tasks make up a large percentage of situations where parallel processing is useful.
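A rough sketch of the copying problem described above (worker.js and the array size are hypothetical): postMessage clones a typed array by default, and the only copy-free alternative, transferring the buffer, detaches it on the sending side, so there is still no way for two threads to work on the same memory.

    // Hypothetical worker; "worker.js" is a placeholder file name.
    var worker = new Worker("worker.js");
    var pixels = new Float32Array(4 * 1024 * 1024); // ~16 MB of data

    // Default: structured clone. The whole buffer is copied into the
    // worker, so a "parallel" pass over it pays for a full copy first.
    worker.postMessage({ cmd: "process", data: pixels });

    // Transferable: ownership of the underlying ArrayBuffer moves to
    // the worker with no copy, but the buffer is detached on this side
    // afterwards -- still no shared access from both threads.
    worker.postMessage({ cmd: "process", data: pixels }, [pixels.buffer]);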


One day, I will be able to compile my golang apps to work on browsers without losing considerable performance, multiple threads and concurrency. That day, I'll be very happy.


You'll be interested in this project, then: http://blog.awilkins.id.au/2012/12/go-in-browser-llgo-does-p...

It's important to emphasize that PNaCl is a platform. We currently provide front-ends for C and C++, but nothing prevents you from writing your own, for your own language (or an existing one). As long as you emit PNaCl bitcode, Chrome will run it for you. The PNaCl bitcode has a definition here - https://developers.google.com/native-client/dev/reference/pn... - and we're working on making it more precise.
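For anyone curious what "Chrome will run it for you" looks like from the page's side, here is a rough sketch (the hello.nmf manifest and the module behind it are hypothetical): a finalized .pexe is referenced from a .nmf manifest, an embed element of type application/x-pnacl points at that manifest, and messages posted by the module arrive as DOM events.

    // Minimal sketch: load a (hypothetical) PNaCl module from JS.
    // hello.nmf is a manifest that points at the finalized hello.pexe.
    var embed = document.createElement("embed");
    embed.setAttribute("type", "application/x-pnacl");
    embed.setAttribute("src", "hello.nmf");
    embed.setAttribute("width", 0);
    embed.setAttribute("height", 0);

    // Messages the module posts from the Pepper side arrive as
    // ordinary DOM events on the embed element.
    embed.addEventListener("message", function (e) {
      console.log("from PNaCl module:", e.data);
    }, true);

    document.body.appendChild(embed);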


Are you working on a PNaCl spec, or did I miss it and that was released too?


Can you clarify what you mean by "spec"? The bitcode ABI document I pointed to is the spec of the IR used by PNaCl. The official documentation covers the Pepper APIs which are PNaCl's interface with the browser and DOM.


A document with which someone could create an alternative implementation, like another browser, for example. Do I take this to mean that there isn't one?


There is no real "spec" of the Pepper APIs, just API docs. While those might be of high quality, it still depends on many implementation details, so it seems like it would be challenging for another browser to duplicate.


Bueller...


> One day, I will be able to compile my golang apps to work on browsers without losing considerable performance, multiple threads and concurrency. That day, I'll be very happy.

One day, the browser will be used again just for documents. That day, I'll be very happy.


Native in-browser CLR, JVM and LLVM support would be the best world to live in.


I don't know, that sounds like a very large surface to secure.


How does PNaCl differ beyond intermediate representation from asm.js? They both seem to focus on cross-compiling into a browser environment, with the latter already compatible across the entire set of JavaScript interpreters. What does it offer that hasn't already been at least partly solved?


The point of both projects is to allow a larger class of programs (and programming languages) to run on web platforms.

The point of PNaCl was to overcome or remove some of the constraints that come with the JavaScript language, like the lack of a concurrency model. This necessitates a new runtime. Since a new runtime was required anyways, Google decided to expend the engineering effort to make it as similar to native code as possible in terms of capabilities and performance, while also maintaining security.

The point of asm.js was to not include another runtime in the browser, but still meet the needs of the programs that PNaCl was trying to serve. Because it plays within the bounds of JavaScript it doesn't meet all the needs that PNaCl does (again, like concurrency), but it still allows native-like programming.


At the moment, PNaCl supports pthreads-style threads and asm.js does not.

Also, Native Client, but apparently not PNaCl, supports SIMD, while asm.js does not.

Of course, both PNaCl and asm.js are evolving, so asm.js may be able to eventually support those features somehow, and PNaCl may get additional features.

Some information about the features PNaCl supports is here: https://developers.google.com/native-client/dev/faq


> How does PNaCl differ beyond intermediate representation from asm.js?

Both are basically intended as a target for LLVM-compiled code, so there are lots of similarities, but the main difference is that asm.js is a subset of JS, so it runs in any JS engine, while PNaCl is different. Aside from that, there are lots of technical differences, but it's hard to say which actually matter in the long run. To quickly summarize, right now asm.js tends to run a little more slowly than PNaCl but start up a little more quickly. But engineers on PNaCl and on JS engines intend to shrink those differences over time, and there is no reason in principle why they won't succeed.

To judge for yourself, you can see some comparisons between PNaCl and asm.js in these two sites:

http://www.flohofwoe.net/demos.html

http://trypepperjs.appspot.com/

My impression is that the perf differences are not that noticeable already. For example, the bullet demo in the second one seems to run slightly faster in PNaCl than asm.js. However, profiling shows that 65% of time is spent in three.js rendering code, not in asm.js, so perhaps rendering differences account for most of the disparity, and it mostly isn't comparing asm.js to PNaCl.


We were seeing some significant slowdowns when running lua in emscripten (testing on repl.it) vs. PNaCl.

Take this naive fibonacci function:

    function fib(n) return n<2 and n or fib(n-1)+fib(n-2) end
    print(fib(30))

On my machine, it completes in less than a second on PNaCl, but takes nearly 10 seconds to complete in emscripten.


repl.it contains code from a few years ago. That's way before a huge amount of general optimizations in emscripten, as well as asm.js. Here is a more up to date Lua VM running in JS:

http://kripken.github.io/lua.vm.js/repl.html

But even that is already out of date ;) just this week I found that I was building Lua with a bad choice of optimization flags. We include Lua VM benchmarks in the emscripten test suite, so for the latest numbers (with the proper optimization flags), see

https://docs.google.com/spreadsheet/ccc?key=0AkuGewEm05tZdFd...

It's possible to run the Lua VM in JS at only about 50% slower than a native build.

edit: fix link


Ah, thanks for that! I was wondering if repl.it was out of date. Might be worthwhile to get the owner to update... :)


Regarding repl.it, I'm not sure how active it is. Their Python port (the C one), for example, hasn't been updated in 2 years. I'll ping them to see.

Regarding Lua, I updated the lua.vm.js project and tried your fibonacci function from before in the repl:

http://kripken.github.io/lua.vm.js/repl.html

Looks like in both Firefox and Chrome it runs in about a second in JS, which feels about the same as the time it takes in PNaCl in Chrome on

http://gonativeclient.appspot.com/demo/lua


They are basically approaching the same problem from two different directions (PNaCl is the older project, despite having a later stable release; asm.js was a response to PNaCl.)

EDIT: And pepper.js is essentially a response back to asm.js and its "run, albeit with less performance optimization, on any modern JS engine".


Using the LLVM toolchain, thus probably better performance and language support?

I'd like to see the benchmarks before assuming that, but if this leads to letting me use Haskell in the browser, I'll be all over it.


Most asm.js is generated using Emscripten, which is also based on LLVM. I guess the backend optimizations will be different.


The PNaCl bitcode can serve as the output of any compiler. Google currently provides toolchains for C and C++, but folks have been working on Go ports, Lua, Python, etc. Haskell, in fact, already has an LLVM backend, so porting that to emit PNaCl bitcode should not, in theory, be overly hard.


Doesn't Haskell's LLVM backend depend on a custom LLVM calling convention to do tail calls? Does PNaCl bitcode support tail calls?


This could also be a fertile pasture for distributed computing on the fly. Suppose you have a website that attracts 1k visitors per day on Chrome; how many MIPS could you harvest? Could you let them mine bitcoins or fold proteins in the background without attracting attention?


How is this different from existing client-side JavaScript code? PNaCl is significantly faster than JS, but conceptually it's similar - you can run code on the user's machine.


The "significantly faster" is your answer. A 10 second visit on JS vs PNaCl is a whole new game.

Edit: to elaborate: do a simple simulated annealing pass on a 200x200 array in JS - it's measured in seconds on my laptop. I wrote a program in JS that did this (image segmentation, fg/bg separation for recognition) and abandoned it because of speed issues.


JS should not be that slow in my experience. Do you still have the code to your benchmark somewhere? Perhaps you ran into a performance bug that we can file for the relevant browser(s).


I have it somewhere, also still have that laptop. It wasn't a benchmark but a poc for an algorithm.

I did some serious optimization when I wrote it. There was a "best practices" guide for V8 that shed some light on the background of JS variable allocation; I followed it and got a serious boost. I also used lookup tables instead of runtime math with exp/log functions. But in the end, I hit several worst-case images that took more than 12s, which was a gap that couldn't be reduced further through optimization.
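For what it's worth, a minimal sketch of the lookup-table trick mentioned above (the table size, value range, and annealing acceptance use are my assumptions, not the original code): precompute exp over the range you actually need, then index into the table instead of calling Math.exp in the inner loop.

    // Hypothetical: an annealing acceptance test needs exp(-delta/T)
    // for arguments in a bounded range. Precompute the curve once.
    var TABLE_SIZE = 4096;
    var MAX_ARG = 20;                         // treat exp(-x) as 0 beyond this
    var expTable = new Float64Array(TABLE_SIZE);
    for (var i = 0; i < TABLE_SIZE; i++) {
      expTable[i] = Math.exp(-i * MAX_ARG / TABLE_SIZE);
    }

    // Approximates Math.exp(-x) for x >= 0 with one multiply and a load.
    function fastExpNeg(x) {
      var idx = (x * TABLE_SIZE / MAX_ARG) | 0;
      return idx < TABLE_SIZE ? expTable[idx] : 0;
    }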

But you're absolutely right, I should redo it in PNaCl.


I never said that ;)


ouch, sorry, misread your last sentence.


As azakai said, real benchmarks run by third parties are great and very welcome. They can go a long way toward improving all the involved platforms, and thus the web in general.


I'll allocate some time for it. Where should I submit?


Bandwidth may be a problem then. I mean, your gains in mining would be seriously hurt by the bandwidth costs. Also, if more people start mining this way, there will be more competition and your earnings will decline. There's also the case of some users maliciously feeding you false results. So you'll have to introduce redundancy to the system (I'm not sure if you get punished for false results apart from simply not getting paid though).


I sure hope this doesn't become a thing, and if it does, that existing ad blocking and/or anti-malware tools pick it up and warn the user against it.

When you start appropriating other people's computers for your own purposes, secretly, without their approval, that's called a botnet.


yeah, but think of the pro: "you can help pay for this website by pressing this button and leaving this window open for an hour". You could help SETI by just visiting a URL.

By fertile pasture I mean that both malevolent and benevolent communities will support this. This could see a lot of optimization if open sourced (is it?).


If it's voluntary on the user's part, then I'd be all for it. I was more concerned about the "in the background without attracting attention" part, which implies that it is done behind the user's back. I could see users opting into distributed computing tasks if given the choice.


I can't be the only one who doesn't think this is cool.

From an application security point of view, PNaCl -- and the general trend of packing everything into browsers -- is not good. The first rule of security is to keep things small and simple. A browser with a built-in compiler is no longer a browser.

If Google had its way, phones would just be dumb bootloaders for operating systems downloaded on the fly from their cloud.


Do note that PNaCl uses Native Client as a base. Native Client is a very secure sandbox that's been out in the wild for some time, had security contests run against it and has in general proven itself to be very secure. The PNaCl translator (backend of the compiler) runs within a NaCl sandbox itself, so nothing in the Native Client security model is compromised.


So JS shouldn't be JIT-ed either?


You have it backwards.

PNaCl actually makes it more realistic to install a "Web App" inside Chrome and use it independent of an internet connection. The browser is lagging well behind Android and iOS in terms of user experience. Efforts like PNaCl and asm.js help to restore the balance.

All browsers include compilers already, by the way.


My guess is this gives Google a clear migration path away from Java to a unified Chrome OS for phone, tablet, notebook etc. PNaCl could coexist with Android apps until Google finally phases Java out.


Using the NDK [1], an Android app can already run C/C++ code. If Google really wanted to phase out Java (where did that come from, anyway?), the biggest task would be to port the whole SDK, especially the UI, not to run C code.

[1] http://developer.android.com/tools/sdk/ndk/index.html


Am I missing something here? When was this about Java and phasing it out of Android? I don't think that's a likely plan, but if you have any evidence to the contrary I would love to see it. Java is pretty central to Android; to think that it's somehow going to be "phased out" would be more like destroying the entire platform.

I know everyone loves to hate on Java, but I'm always amazed by some of the wacky things people say ...


I really don't hate Java at all. It's had a generally beneficial impact in all sorts of places. But I wonder with the change-over in leadership for Android[1] if we aren't seeing signals that Google wants to consolidate all of its apps under Chrome. One possible motivation is that they may not want to leave a core technology under the control of Oracle forever.

1. http://thenextweb.com/google/2013/07/17/google-is-holding-a-...


Maybe a new virtual machine along with a high-level language will magically materialize out of thin air? GoVM perhaps, that was secretly being developed all this time?


> Maybe a new virtual machine along with a high-level language will magically materialize out of thin air?

Dart[1] is high-level, and has a clean, fast VM implementation. Also, ARM support[2].

[1] https://www.dartlang.org/ [2] https://code.google.com/p/dart/source/browse/branches/bleedi...


You probably wanted to say JVM. Please read some Wikipedia for background info before posting.


Nope. I said what I meant. Java is a lot of things. I was talking about all of them. I used to be part of Sun's Java Center.


Does this abstraction of native OS environments mean less-supported platforms like Linux will see more apps? Of course, only if the app dev decides to port to PNaCl.

One thing I am concerned about is, since PNaCl is essentially yet another platform, how Google (or anyone) is going to manage the host of supporting tools required. Tools like dependency management, libraries, process control etc.

By the way, Google has already proven their "platform-creation" skills with Android.


I think NaCl is already portable across different OSes on the same architecture, since it exposes its own API.

The new part of PNaCl is that you don't have to recompile for each target architecture. Among other things, if they push hard for this for desktop developers on the Chrome App Store, then suddenly they have a large number of things that work on Chromebooks without demanding that developers recompile.


I'm under the impression that once a game dev compiled for NaCl x86/x86_64 at all, it ran on all of the Chrome binaries on x86/x86_64, i.e. Mac, Windows, Linux, and Chromebook.

There are already a few games on NaCl, like Don't Starve[0] that work on x86/x86_64-based chromebooks. They just don't run on ARM-based chromebooks -- which is one of the incentives for PNaCl.

IIRC, NaCl could only run in Chrome Web Apps precisely because it was platform-dependent. It looks like PNaCl will also allow ActiveX/Flash-style embedding.

[0] http://www.dontstarvegame.com/


You are right. But even with NaCl, the original question remained: would a game dev compile an NaCl app for Linux (even though code changes weren't required)?

PNaCl eliminates the need for the dev to recompile at all.


NaCl -- even the non-portable form -- doesn't require recompiling for different OS's. It only requires recompilation for different machine architecture (x86 vs. ARM.)

PNaCl removes the need to compile different x86 vs. ARM binaries for NaCl.


This is correct.

A more significant difference is, however, that NaCl is restricted to the Chrome Web Store, while PNaCl is available on the open web. You can have a PNaCl module in your web app, on your own website/domain, today, and Chrome 31 (and later) will run it.


The two are tied -- the reason Google didn't enable NaCl outside of the Web Store is to keep architecture-tied NaCl code from becoming common on the web while PNaCl was in the pipeline.


I did not realise this. That is awesome.


Is this how Google plans to bridge the JS performance gap to get web apps running at "native" speeds?


It's one of several approaches in that direction. Google's not particularly shy about pursuing multiple approaches in parallel for important priorities.


So sort of .NET for Chrome? Cool. I like the language agnostic approaches we are seeing these days.



If I want to write a portable C++ GUI app, this codebase would be a good free option to fork.

Anyway, running inside Chrome's product sandbox is not attractive. For the web, I would bet on Emscripten + asm.js, which is a lot more portable.


Qt, wxWidgets, Gtkmm, Juce


The GPL requires you to PAY with your freedom to choose a license in exchange for the freedom to access source code. So it's a trade of freedoms, not free.


What does the GPL have to do with the listed frameworks?

Not all of them are GPL-licensed, and besides, it is only fair to pay people for their work.


Maybe there's a miscommunication here. By "free" I meant no price and no need to trade away my copyright; maybe you mean the freedom to access source code, in the strict GPL sense.

Also, I never argued that payment itself is bad. Go back and read my reply carefully. Fair trade is good, and I would gladly pay for a good product.


I still don't get your point.

You mentioned targeting the browser with C++ for portable applications.

For me, the place for native code is at the OS level, and the browser should be left alone for plain interactive documents instead of becoming a Frankenstein VM.

Hence my short reply with a list of native frameworks.

It has nothing to do with licenses.


OK I see your point.

My original intention was to use HTML as a GUI toolkit layer, and I realize now that this is the wrong idea with PNaCl; it's exactly the opposite concept. That was my mistake, sorry about that.

Wow, now I see that PNaCl is really completely useless except for Google's API dominance.


NaCl is not new; I remember reading about it, I guess, over a year ago?

What actually happened, did they reach a stable version or something?


Does Google use this internally? For example are they using it as an optional faster frontend to their apps?


They sure took their sweet time. Didn't they say it was supposed to arrive like a year ago? NaCl should've worked on ARM from day one.


Embrace, Extend, Extinguish... now in version 2.0 beta!


Another brick in Google's walled garden. I will never use it.



