As far as I can remember, Flex 1 was a "server". This (I suppose) was an excuse for the outrageous price. Users would upload their MXML and AS files, then the server would spit out .swf files to client requests. A compiled SWF was cached until new MXML or AS files were uploaded. It was a one-licence-per-company approach, rather than one licence per developer.
Flex 1.x was a joke, but Flex Builder 2.0 was only $500. Making the Flex SDK (but not the IDE) open appears to be a case of openwashing, since most Flex developers are probably still buying the IDE.
by "supporting" other languages they mean they can compile the interpreter's c/c++ source code using their LLVM->actionscript bridge (http://blog.digitalbackcountry.com/?p=1095) they came up with and then run the script on top of that actionscriptified interpreter.
which, even for people like me who drink the interpreted-languages-are-fast-enough koolaid, seems like a lot of levels of indirection :) i think the CLR approach seems a little more straightforward: "native" interpreters that compile e.g. python->the CLR IL (instead of python->interpreter c source->llvm->actionscript).
and i doubt you could just take e.g. the python interpreter and throw it into a SWF -- certainly pieces of the interpreter along with C modules that interact with the filesystem, etc. would need to be tweaked to deal with the AIR runtime.
Cool idea though, and they even got Quake 1 (!) running in a SWF (see the video below in this thread... wild stuff).
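To make that filesystem point concrete, here is a minimal sketch, assuming a hypothetical FLASH_TARGET build flag and a made-up read_file/g_vfs pair -- none of this is Adobe's actual API, it just illustrates the kind of rework a filesystem-touching C/C++ module would need when retargeted at the Flash/AIR sandbox:

    // Hypothetical sketch: FLASH_TARGET, read_file and g_vfs are invented
    // names, used only to show the shape of the shim a C/C++ module that
    // touches the filesystem would need inside the Flash/AIR sandbox.
    #include <cstddef>
    #include <cstdio>
    #include <map>
    #include <string>
    #include <vector>

    #ifdef FLASH_TARGET
    // No real filesystem inside the SWF sandbox: back "files" with an
    // in-memory table that the surrounding ActionScript shell would fill in.
    static std::map<std::string, std::vector<char>> g_vfs;

    std::size_t read_file(const std::string& path, std::vector<char>& out) {
        auto it = g_vfs.find(path);
        if (it == g_vfs.end()) return 0;
        out = it->second;
        return out.size();
    }
    #else
    // Ordinary build: go straight to the OS through stdio.
    std::size_t read_file(const std::string& path, std::vector<char>& out) {
        std::FILE* f = std::fopen(path.c_str(), "rb");
        if (!f) return 0;
        std::fseek(f, 0, SEEK_END);
        long size = std::ftell(f);
        std::fseek(f, 0, SEEK_SET);
        if (size <= 0) { std::fclose(f); return 0; }
        out.resize(static_cast<std::size_t>(size));
        std::size_t n = std::fread(&out[0], 1, out.size(), f);
        std::fclose(f);
        return n;
    }
    #endif

    int main() {
        std::vector<char> buf;
        std::printf("read %zu bytes\n", read_file("pak0.pak", buf));
    }

The interpreter core (parsing, bytecode evaluation, memory management) is pure computation and could in principle go through untouched; it's every module shaped like the #else branch above that has to be rewritten against whatever the AIR runtime exposes.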
Personally, I absolutely LOATHE Flash/Flex-only sites. They behave horribly on the usability side, lack simple stuff like the back button and text resizing, and they tax/heat up the CPU like you wouldn't believe (especially on a Mac).
Flash has a purpose on the web... you can use it for various widgets that are impossible to do in HTML (say, recording and playing back video, animations and audio). But I'd never use it as a general presentation layer.
Hm. This may make Flash suddenly relevant to me and many others. ActionScript itself is weird, so I've avoided it. But Python or Ruby, that's workable. I wonder if they're going to do it at the bytecode level for Java? That would open up a variety of other interesting languages.
What I want to know is: what's the value of this to Adobe? It's been apparent for ages that they're trying to claim platform share. But I don't understand what this buys them. Sales of developer tools? That can't possibly be very much. I can imagine it being 'strategic', but I don't know how.
Perhaps this is why I'm in a cubicle, not a boardroom. :)
How about Flash that I can run outside proprietary Adobe players? This move is unimpressive, unsurprising and totally self-serving. Its goal is to suck more code into vendor-locked Adobe space.
Adobe Flash is aiming to become the Applet of the 2000's. They very well might pull it off - except that Flash content is not discoverable by search engine crawlers (startup idea anyone?)
A web spider that's run by outsourcing work to China. We could have hundreds of thousands of people going to Flash websites and entering the contents into a database. Would work flawlessly.
Maybe I'm missing something, but what's wrong with ActionScript? I've never used it, so I don't have much to go on besides the fact that it's based on ECMAScript/JavaScript. Being a superset of JS can hardly be considered a bad thing.
You still get the prototypes, dynamism, and functional programming of JS with AS, right?
ActionScript 3.0 is basically Java. We're using it at work, and, compared to JavaScript (with a good library), it's pretty uncomfortable. They've made it (optionally) classical and static. Prototypes, dynamism, and FP are pushed into the corner. It's really gross. The modularity seems kind of nice at first, but then you realize they wrote a bunch of mediocre libraries instead of building a good lang. At least it has E4X, which is very nice, but then, I've found JSON to often be the better approach anyway.
The game is a Far Cry (pun intended) from DirectX 10. The demo was also on a Windows machine. Let's see the same demo on Linux.
The original poster is absolutely correct.
C++ written for KDE is going to run on Windows now. Well, having worked at Adobe when I was just out of college, I can say that they have VERY bright people there.
Bright enough to get C++ code written for Linux to run on Windows? In a VM? Which itself would be in a browser?
I have to think that VMWare or Parallels would have done that already if it was doable. Or, if it is doable, and just hard, VMWare and Parallels will do it better than Adobe in any case. Be assured, with the value inherent in such a product they WOULD release their own versions.
I can hear the Flash VM talking now:
Scenario 1:
Oh C++ code . . . you want me to run some Intel vectorized math routines? No problem . . . those Microsoft templates you got there are REALLY cool . . . uhh . . . well let me just get this Objective C library here on this shiny new MacBook Pro and . . . Oh Wait? . . . Well . . . Maybe I can try this GCC library and . . . SLASH!!!
<Leopard kills Flash VM process, and drags carcass home to feed cubs.>
Moral of the story: Predatory cats always target the slow ones.
Scenario 2:
Ahh C++ code . . . I get it now . . . you want to use this really cool GeForce 9xxx series card to do instancing. That is AWESOME!!! Let me get started on that for you . . . uh . . . ahh . . . OK so I'm an OpenGL context and you want me to make a DirectX call . . . Oh OK no problem . . . ahh . . . . umm . . . OK so I'm on Linux, can I ask a quick question? Your developer didn't supply you with a software DirectX 10 implementation did he?
<C++ code makes sad realization that both developer AND Flash VM are idiots>
Moral of the story: Even intelligent things, like C++, can be put in situations where they look really stupid.
Scenario 3:
Oh mapped memory? SURE, I'll just go over here to this pool of unused hard drive space and . . . THWOCK!!! THACK!!! WOCK!!! WHACK!!!
<Flash VM realizes that VISTA is actually a Dominatrix, and he is now being beaten about the head, neck and chest with the interrupt controller he asked for.>
Moral of the story: Something like unused hard drive space, or cross compilation, may sound REALLY good . . . but make sure you know who has a claim on it . . . because she might be a REAL bitch. Or worse . . . Microsoft.
Scenario 4:
Oh . . . multicore . . . umm . . . why don't I go ahead and kill myself now so you can get to the blue screen faster.
<Single gunshot is heard from the other room, system hangs without getting to the bluescreen, because Flash VM had problems even managing to commit suicide. The bullet went through its brain, but its vital organs were untouched.>
Moral of the story: Actually test claims before you start making plans. A lot of the potential technologies you want to use, or potential spouses you will meet, will make claims of intelligence, but there may not be much between the ears.
I don't think you fully understand what Adobe is doing -- they are compiling down to AVM2 bytecode using LLVM. And DirectX is abstracted out of ActionScript, so you gain the benefits of DirectX when it's available, but the AVM will resort to other means when it's not.

That Quake II demo is a completely self-contained Flash animation, and will run anywhere Flash runs (including Linux).

Edit: It sounds like some people think Adobe is promising to run ANY C++/Java/whatever application in Flash. They're not. Some C applications, however, like Quake II, can be coerced to run in Flash's single-threaded, platform-agnostic environment. The developer who gave that presentation at MAX described the work he had to do to emulate threading in order for Quake II to run. It wasn't easy, but I'm sure it was far easier than porting the whole codebase.
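For a rough idea of what "emulating threading" can mean in practice -- this is a generic sketch, not the actual Quake II/Alchemy code, and MixerTask is a made-up example task -- a blocking worker thread typically gets rewritten as a resumable step function that the single host thread calls once per frame:

    // Generic sketch of replacing a worker thread with cooperative,
    // per-frame slices of work -- not the actual Quake II/Alchemy code.
    #include <algorithm>
    #include <cstddef>
    #include <cstdio>

    class MixerTask {                 // hypothetical example task: audio mixing
    public:
        explicit MixerTask(std::size_t totalSamples) : total_(totalSamples) {}

        // Instead of looping forever on its own thread, the task does a
        // bounded amount of work and returns, so the single Flash/host
        // thread can move on and render the next frame.
        bool step(std::size_t budget) {
            std::size_t end = std::min(done_ + budget, total_);
            for (; done_ < end; ++done_) {
                // ... mix one sample ...
            }
            return done_ < total_;    // true while work remains
        }

    private:
        std::size_t done_ = 0;
        std::size_t total_;
    };

    int main() {
        MixerTask mixer(100000);
        // The host's per-frame callback stands in for the missing threads.
        while (mixer.step(4096)) {
            // render frame, poll input, etc.
        }
        std::puts("done");
    }

The per-frame budget is what keeps the animation responsive: no task ever owns the one thread for longer than a single slice.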
There is no SUITABLE replacement for DirectX instancing under OpenGL that would be available on other platforms. To say that you will get the benefits of DirectX where it is not available is to be disingenuous. Microsoft goes to great lengths to make sure that, even with a VM doing the work for you, game developers will find it difficult to "gain the benefits of DirectX when it's available, but the AVM will resort to other means when it's not", as you put it.
Over the years, hundreds of millions of dollars have been spent to ensure that you CAN'T do that. If you talk to Parallels or VMWare, they will tell you that they can get DirectX 8 to work at times, DirectX 9 seldom, and then they will have a good belly laugh while pointing at you when you ask them about virtualizing DirectX 10. This is the way things are for now, until the requisite extensions have been developed for OpenGL.

This is not just me spouting off at the mouth. The entire gaming industry has grudgingly accepted this for quite some time. Steve Jobs has been looking for ways around it forever. If Adobe has people who can do this, they won't have them for long, because Steve is sending over a limo, I guarantee it.
And what about proprietary templates? Adobe is going to translate all of those too.
Multicore programming? Gonna translate between the Intel multicore stuff and . . . say . . . the Cell processor stuff?
Taking it further, we all know that CUDA code will work beautifully if the machine you're on happens to have an NVidia card, but what if you have an ATI card?
Don't even get me started on some of these templates for mapped memory management . . .
RESPONSE TO PARENT EDIT:
Thanks for clarifying the assertions you were making.
I assume you mean it's pointless because C++ sacrifices ease-of-coding for performance, but when C++ is compiled to bytecode it won't be appreciably faster than other languages.
While it would be odd for someone to write new C++ code solely to target bytecode, I think that a C++ bytecode compiler would still be useful in situations where, for example, you have an existing C++ codebase that you would like to reuse in a web client.
No, I mean it's pointless because I think there are very few use cases for it. You're pretty much limited to libraries which do not rely on any OS-specific calls, or which only do very generic OS calls like file I/O.

FWIW, I'm the author of GCC-CIL, which provides a backend to GCC to emit .NET bytecode, and I still don't see the point. I mean, what are you going to recompile for Flash or .NET? BLAS and LINPACK? :P At least with .NET you have access to libc and other native code libraries via P/Invoke, but you don't even get that from Flash...
Ah, I wasn't sure what you meant. I agree that there aren't too many use cases. Personally, I think some of the existing computational geometry libraries could be useful, but I am thinking of a niche set of applications. More useful, perhaps, might be existing libraries to read/write file formats. But I'd be surprised if there was anyone looking to port their huge C++ application wholesale to Flash.
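As an illustration of the kind of library code that could plausibly survive such a port -- a self-contained toy routine, not taken from any particular geometry library -- here's a function that touches nothing but memory and arithmetic, so in principle it could be recompiled for any bytecode target (CLR, AVM2, ...) unchanged:

    // Example of "portable by construction" library code: a point-in-polygon
    // test with no OS, threading, or graphics dependencies at all.
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    struct Point { double x, y; };

    // Ray-casting test: count crossings of a horizontal ray from p
    // against each polygon edge; an odd count means p is inside.
    bool pointInPolygon(const Point& p, const std::vector<Point>& poly) {
        bool inside = false;
        for (std::size_t i = 0, j = poly.size() - 1; i < poly.size(); j = i++) {
            const Point& a = poly[i];
            const Point& b = poly[j];
            bool crosses = (a.y > p.y) != (b.y > p.y);
            if (crosses &&
                p.x < (b.x - a.x) * (p.y - a.y) / (b.y - a.y) + a.x) {
                inside = !inside;
            }
        }
        return inside;
    }

    int main() {
        std::vector<Point> square = {{0, 0}, {4, 0}, {4, 4}, {0, 4}};
        std::printf("%d\n", pointInPolygon({2, 2}, square));  // prints 1
        std::printf("%d\n", pointInPolygon({5, 5}, square));  // prints 0
    }

The moment a library like this starts opening files or sockets, though, you're back to writing the kind of shims sketched earlier in the thread.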
If so, this is a good example of competition "waking up" age-old platforms (like Flash).