Everyone on this thread is so dismissive of the language that they forget to credit the accomplishments of these wonderful boards.
See, Javascript may be a 'bad' language according to many of you, but it has massive adoption unlike other languages. These board creators just want to ease the path for most web developers to become hardware developers. It not only opens up a whole new industry to work in, it also acts as a good 'filter' that weeds out the bad ones. I will explain.
The thing about hardware products is that most people don't care about internals. Most of them care about the experience. I am NOT an Apple fanboy, but on this occasion I would like to cite the iPhone's sales as a good example. If you suck at programming in Javascript, it will show, and quickly, in the hardware world, and you/your product will be rejected.
Also, when you develop, say, a DSLR quadcopter[1] with this board, people aren't going to ask you "What language is it running on?" or "How slow is your language?". People are going to ask about the footage you're going to film with it. Let's not lose ourselves in hatred of a language. Instead, let's take the time to appreciate what these developers have achieved and what we can build out of these boards.
Cheers.
[1] A sample DSLR quadcopter for reference: http://farm7.staticflickr.com/6225/6868828438_e6d798c68d_b.j...
> These board creators just want to ease the path for most web developers to become hardware developers. It not only opens up a whole new industry to work in, it also acts as a good 'filter' that weeds out the bad ones. I will explain.
What exactly happened to the programmers who, upon realizing that they had to use a new technology, learned the new technology instead of feature-creeping something until it fit into their familiar JS boat?
I'm not going to rant that this is never going to fly. Were success based on technical merit, the current incarnation of Web 2.0 would have sunk like a rock, let alone flown. But this is so horribly inefficient and backwards that encouraging it seems incredibly detrimental to our field.
Come on, people, it's not that hard. If you want to do embedded development, a reasonable subset of C is all you need to know, and it certainly takes less to learn than it took to learn half a gazillion JS frameworks. Ask your boss for a one or two-week leave to give you a break from your 70-hour-a-week streak, and learn something that's actually new for a change.
The key draw of running Node.js code in the embedded environment, for me, is that it works out of the box with all the web libraries that I'm already familiar with. With Node.js I can easily leverage existing modules to do things like build a device that responds to social media: it watches my Twitter account and, when it sees a new tweet mentioning my company, makes a light flash, or something like that.
I would hate to try to code such a program in C for connecting to the web. Sure, I could definitely do it if I took a couple of weeks off, but by the looks of it I'll be able to code that program in under an hour using the Node.js I already know and the libraries that the open source community has already developed.
Anyway, that's why I ordered a Tessel and am looking forward to developing with it. If it turns out that I really fall in love with embedded programming then sure I'll bust out the C compiler and learn the low level coding. But in the meantime I welcome the chance to learn about Node.js in a familiar environment where I can get stuff built quickly using a toolset I already know.
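To make that concrete, here's a minimal sketch of the tweet-to-light idea. It assumes the 'twit' npm module for the Twitter streaming API; the LED calls are my guess at a Tessel-style interface, since I don't have the board's actual docs in front of me:

  // Flash a light when a tweet mentions the company.
  // 'twit' is a real npm module; the 'tessel' LED API below is assumed.
  var Twit = require('twit');
  var tessel = require('tessel'); // hypothetical board module

  var T = new Twit({
    consumer_key: '...',
    consumer_secret: '...',
    access_token: '...',
    access_token_secret: '...'
  });

  // Stream tweets matching a keyword and pulse the LED for two seconds.
  var stream = T.stream('statuses/filter', { track: 'mycompany' });
  stream.on('tweet', function (tweet) {
    tessel.led[0].write(1);                 // light on  (assumed API)
    setTimeout(function () {
      tessel.led[0].write(0);               // light off (assumed API)
    }, 2000);
  });

That's more or less the whole program, give or take credentials, which is the point: the Twitter plumbing is somebody else's solved problem.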
> I would hate to try to code such a program in C for connecting to the web.
Why would you want to connect a wall switch to the web in the first place? The Internet of Things doesn't necessarily have to mean the WWW of things.
For instance, you can always expose low-power devices through a low-overhead, low-power & short-range communication protocol to a <that-protocol-enabled> router. It also makes sense to have all of those devices configured from there (indeed, via a web interface exposed on the gateway) rather than having a web server on each of them.
For what it's worth, this is one of the upcoming stretch goals:
nRF24 – low power wireless communication with mesh capabilities (good for tying lots of Tessels together without WiFi)
So one Tessel can be the web server to connect a cluster of Tessels communicating via nRF24 to the web. Or you could use one Tessel as the router to connect a bunch of nRF24 Arduino devices to the web.
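In rough Node terms, the gateway pattern could look something like the sketch below; the 'nrf24-mesh' module name and its send() API are hypothetical placeholders for whatever library the stretch goal actually ships:

  // One web-facing Tessel relays commands to radio-only mesh nodes.
  var http = require('http');
  var radio = require('nrf24-mesh'); // hypothetical nRF24 wrapper

  http.createServer(function (req, res) {
    // e.g. GET /toggle/3 asks node 3 on the mesh to toggle its output
    var match = req.url.match(/^\/toggle\/(\d+)$/);
    if (match) {
      radio.send(Number(match[1]), { cmd: 'toggle' }); // assumed API
      res.end('ok\n');
    } else {
      res.writeHead(404);
      res.end();
    }
  }).listen(80);

Only the gateway pays the WiFi and web-server cost; the mesh nodes stay cheap and low-power.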
Come on, people, it's not hard: if you want to program computers, reasonably structured assembler is all you need to know, and it certainly takes less to learn than it took to learn half a gazillion C libraries.
That's a caricatured view of it, from which I gather you haven't done much embedded programming. A subset of the (famously thin) standard library is quite sufficient.
Of course it's a caricature. That's the whole point of a reductio ad absurdum argument. You're basically making the same argument as the old guy yelling at kids to get off his lawn.
Out of curiosity, what do you think about Lego Mindstorms? Especially the later versions which allow a large multitude of programming languages to be used?
Leveraging the great Node ecosystem comes at a significant cost in power consumption, board size, and the complexity of both the manufacturing process and the system itself.
Power consumption and, to some extent, board size have been very significant drags on the Internet of Everything. Mindlessly throwing libraries at these devices just to help people who are too lazy to learn a new programming technology only adds to that drag.
I certainly don't want to drag this into the mud. It's certainly a good learning platform, one that can provide exposure to a range of devices for people who would otherwise not even hear about them, or for whom the initial technological barriers would be too high to overcome in a single evening. But it is far, far from adding any kind of value to the struggle towards universal Internet connectivity.
I'm not disagreeing with you completely. I don't think I would be thrilled about creating a consumer product with this kit. However, you have to admit that it would be ideal for rapid prototyping. Unlike you, I have never programmed hardware. What I would do with this product is build a prototype or two or three, just to the point where I get a proof of concept nailed down. From there, I'll be extra motivated to reduce the size and improve power consumption, reliability, and responsiveness by learning C in the context of embedded programming.
If what you say is true, it's a bad sign for the future of hardware. If we fundamentally can't leverage existing libraries, we can't build the standing-on-the-shoulders-of-giants complexity pyramid that allows for such reliable, rapid development in other spaces. It will always remain a niche field.
Of course we can leverage existing libraries and standing-on-the-shoulders-of-giants complexity pyramids. Slapping a webserver on them isn't necessarily a good approach though.
I think this is a great idea for getting more people to experiment with hardware projects. Then they can move on to learning more things. But for a first try the barrier to entry needs to be as low as possible.
I quote: "leverage the great node ecosystem". That ecosystem was not written with embedded performance in mind, so here it's useless.
Embedded dev is all about performance, power consumption, etc. It is about focusing on the hardware, not about how many libraries you can throw at a problem; you usually don't throw any, because of limited memory and computation power.
I'm getting really tired of this "it has massive adoption" trope.
Let's be clear about it, the merits of the language have nothing to do with its adoption. It's widely adopted because Javascript has a stranglehold on the browsers and people don't have a choice. Nowhere in the software space would such a monopolistic position be acceptable, but hey on the web for some reason it's OK.
So yeah, it's the dominant language, but not because it's oh so good or because people think it's awesome (though some of them do, and I respect that). It's only because there are absolutely no alternatives.
> Let's be clear about it, the merits of the language have nothing to do with its adoption.
Having a platform that is already out there, one you can execute against without even an install, might not say much about the merits of the language itself, but it is a merit in its own right.
Of course, in this particular case... JavaScript isn't exactly widely deployed in hardware, and this device is what provides the platform so... yeah.
> It's widely adopted because Javascript has a stranglehold on the browsers and people don't have a choice.
That's not entirely true. People have had lots of choices. There was Java. There was Flash. There was even VBScript on IE (I bet you didn't know that), not to mention all the Emscripten fun that is now available. Time and again, people choose JavaScript, if for no other reason than for its lowest-common-denominator qualities.
> Nowhere in the software space would such a monopolistic position be acceptable, but hey on the web for some reason it's OK.
If you consider the list of widely used languages for which there is an approved standard by an independent standards body and multiple independent implementations, you end up with a surprisingly short list (and some qualify but by the skin of their teeth). I think, sadly, this is actually quite widely accepted.
Come on. VBScript and Flash? Really? What I meant, and what you pretend not to understand, is that there is no way to use another language the way you use Javascript when scripting a web application. I can't open a console and go
  @document.query_selector_all(".fancy").each do |element|
    ...
  end
I find that annoying. That a few companies have tried to introduce proprietary extensions is another topic; it doesn't mean that there is a real choice for developers.
The point is that some people decided some time ago that scripting the web and accessing everything browser-ish (DOM, Canvas, SVG, WebSockets, Web Workers, etc.) meant Javascript, and that was it. The lack of choice and variety at hand is absolutely ridiculous. In no other software space would people tolerate being forced into a technology like that.
> That a few companies have tried to introduce proprietary extensions is another topic; it doesn't mean that there is a real choice for developers.
I spoke in the past tense because those were choices available to developers... JavaScript seems to have won out as the preferred choice. Emscripten opens up a lot of possibilities though...
> In no other software space would people tolerate being forced into a technology like that.
Good arguments. We had Silverlight, and Adobe AIR / Flex, that may or may not have been trying to displace JS, but they were ways to program in the browser. Sure, they were put forward as solutions by companies that had an agenda.
Such a language is also dependent on HTML standards.
We should see interpreters for other languages written in JS, or is that far-fetched?
Acting like Java or Flash are, or ever were, alternatives to javascript displays a very profound ignorance of what javascript is for. How do I manipulate the DOM in Java? Oh, I don't? Then it is not an alternative. Javascript absolutely is the only option, and people have no choice. That is why people invest so much time in writing LANGUAGE_X-to-javascript compilers: so they can write code in a less terrible language even though it has to be deployed as javascript.
Flash has similar (arguably much better) APIs for this as well. There are graveyards filled with other attempts.
> Javascript absolutely is the only option, and people have no choice.
Being the most broadly supported and most integrated solution doesn't mean developers have no choices. It just means that JavaScript might be their best choice. On almost any platform there are going to be certain programming languages that are more integrated and better supported.
> That is why people invest so much time in writing LANGUAGE_X-to-javascript compilers: so they can write code in a less terrible language even though it has to be deployed as javascript.
And there you have it. JavaScript wins by virtue of being a better deployment platform in the browser space. At one time Java enjoyed an even broader advantage. Acrobat, Flash, VBA, Bourne Shell, MSI, Unix DBM, SQL, sendmail, PHP, MySQL, Windows, Linux, C, POSIX, PostScript, X Windows, etc. have all ridden these kinds of waves. Some are more successful than others, and certainly the web browser is probably the most ubiquitously deployed runtime environment ever, but really, if anything, this is normal and broad diversity is not.
It's not unfair or unusual, so much as inevitable.
> So, writing javascript to trigger a java applet that changes the dom is a reasonable alternative to writing javascript to change the dom?
Depends on what you mean. These days people often have Applets disabled. But if you have an Applet with the access to do so, you can manipulate the DOM as much from it as you could from JavaScript.
> Yes, it literally does. When presented with one option, you have no choice.
There are lots of options; it's just that one is better than the others.
>> And there you have it. JavaScript wins by virtue of being a better deployment platform in the browser space
> Are you serious?
Absolutely. People made a go of it with Applets. Once they realized they could get the job done with JavaScript, they dropped Applets like a hot potato, to the point where Applet support itself is now being dumped.
Most other in-browser programming environments are platforms in their own right that talk to the browser platform. JavaScript's platform is the browser, and that turns out to make a big difference.
> Just that javascript sucks and we're stuck with it against our will.
You really think pulling the "you are too dumb to understand my bullshit" card is effective? Javascript is used to make changes to the DOM in response to the user doing things. User clicks a button, stuff changes. Have you ever actually tried doing that with a Java applet? Have you noticed how 75% of the API doesn't actually work in any major browser?
...and there you go again about lack of support. Have you noticed that Java Applets just generally aren't supported anymore? We tried it (and yes, all the event handling worked). No one used it, and it exacerbated security problems.
People arguing about programming languages are like people who focus more on cameras than on the art of taking good photographs.
Those of us defending JS or PHP or VB (in discussions which aren't about programming languages) aren't suggesting we should take a point-and-shoot (or a Leica) to an action game.
> People arguing about programming languages are like people who focus more on cameras than on the art of taking good photographs.
Hmm... just to play on that metaphor: there are certainly camera choices that can make the process of learning to take good photographs easier or harder. Isn't that a relevant point?
Learning to take good photographs is far more contingent on learning a few things about composition and picking an interesting place to take your first pictures than it is on making sure you have the best camera.
You don't need to have the best camera, it's true. But you do need to have an adequate camera, where adequacy relates to the purposes you have.
For example, I used to have a little digital compact. It had autofocus; the autofocus wasn't perfect, or even particularly great, and there was no way at all to focus manually. I have countless pictures which were ruined by being out of focus, and there was nothing I could do about it. I now have a camera which lets me focus manually, which means that whether a picture is in focus or not is now entirely in my hands. The former camera was not adequate for the photographs I wanted to take; this one is.
An interesting point is that a 40-year-old film camera would also have been adequate, although much less helpful in other ways.
This feels like it could be a good metaphor. C is a 40-year-old SLR covered in dials and switches, enormously capable but a nightmare to work with to anyone but a master; JavaScript is a digital compact which automates everything whether you like it or not. Java is a modern dSLR, capable and more automated than C, but still clunky. Rust is a Leica M9, still manual but modern in other respects. Go is a bafflingly-horrendous-to-outsiders Lomo camera. PHP is a Fisher-Price toy camera. Clojure is a Lytro, weird but capable of amazing things (but weird). Scala fans think their language an E-M5, but it's really an EOS M.
> The former camera was not adequate for the photographs I wanted to take; this one is.
Right, but that doesn't necessarily comment on its effectiveness as a tool to assist in learning how to take good photographs.
> An interesting point is that a 40-year-old film camera would also have been adequate, although much less helpful in other ways.
This is more in line with what I'm getting at. There are perfectly good (even great) cameras out there, but there are cameras that are better suited to facilitating the learning process, and others that will hamper it. While perhaps not the most essential component of the learning process, they still matter, and they are a perfectly reasonable aspect for someone to discuss.
The language shapes the mind; it defines the boundaries and the kinds of solutions that can and can't be built in a reasonable time.
It looks like feigned enlightenment to say that languages don't matter, that only the people behind them matter. Well, if that is true, do the people behind the languages mean nothing? Does only the work created by the ones using the languages matter, and not the work that made THAT possible?
The tool matters. You could build a city with only a hammer, but that would be stupid. Some languages ARE better than others. Some ARE faster. Some ARE more legible. Some ARE more performant. Some ARE safer. Some ARE more productive.
Maybe two languages with very similar objectives offer only small returns over each other, but surely order-of-magnitude improvements exist between different groups...
Agreed. These guys, like Raspberry Pi, are making a bet on Moore's Law. That's been a good bet for forty years and will keep being a good bet for a long while yet, certainly long enough to make the cost of interpreted JS insignificant.
And yet our operating systems are still written in C(++).
Javascript web code epitomizes the "long tail" of random ideas being articulated. The code is written quickly, needs to change a lot in response to user behavior and designer inspiration. It's usually < 10k lines, so its terrible medium- and long-term maintenance characteristics are manageable. It's written by millions of independent teams to bring to life millions of small ideas used by (typically) only a few users. And that's great, we really need languages for that.
But, humor me for a minute, and let's define "the C law": anything important enough to be used broadly will eventually be replaced by something written in C(++). Why? Because solution X not written in C is always vulnerable to replacement by solution Y written in C with the equivalent feature set, and the demand is now high enough to provide the time and talent. See: every Python module ever used by more than 1M people. See: every programming language implementation (incl. javascript and LuaJIT!) ever used by more than 100k people. See every operating system, every database used by more than 1 million. See every web server running the top 1000 websites. See (almost) every tech startup that becomes a Fortune XXXX company and rolls through its infrastructure rewriting its Ruby or its Python or its Perl in C/C++/Java. See why we're not all using jitted PyPy yet (hint: those pesky C modules make real-world CPython often just as fast or faster, and more memory efficient).
Languages that trade performance and correctness for productivity are fantastic when the project is young, small, and of dubious value (yet), or narrowly targeted and not generally interesting. Or the browser gives you no other choice.
Consumer hardware product development just doesn't align with this kind of thinking. It races right past the C law threshold. Physical stuff introduces serious economies of scale, design costs, significant difficulty of change, etc, where shipping a few hundred of something doesn't make a lot of sense (as a business; maybe as a hobby project).
So, if you're going to (aspire to) ship a million of something, investing in the software side to keep costs down, maximize battery life, mitigate risks related to a misapplied language runtime (not designed for or heavily tested on embedded), guarantee performance and latency characteristics, etc., makes sense. The compromises that are appropriate for a website, where you want to make a little gamble fast, don't apply... you're making a pretty big gamble, it will take a while to get right anyway, and your capital needs are higher in general, so doing software "right" to save on COGS is just practical.
BTW, "C" here is usually C or C++, but it can very occasionally be Java [see: zookeeper] or some other JVM thing like Scala. Regardless, it basically represents the final state reached by the tool/language/platform/project race. If your project could be replaced by a version that says "like $project, but fast/battery efficient/cheap!", you are not yet at the end game, and the C law could always be invoked (if the demand was sufficient), and your project will probably ultimately lose the lion's share of the market (or open source mindshare, or some other version of "market").
So, maybe this project is betting on our lives transforming into everyone paying more for their hardware, and everyone using a lot more small/local/boutique type stuff created by small hardware teams. This doesn't really jibe with the way technology products are currently marketed, hardened, and distributed, so I'm operating on the assumption that change doesn't happen anytime soon.
It could be useful/fun for hobbyists or prototyping, though.
You are right: for any given expected unit sales, there is a level above which using C from the start is sensible.
But ultimately there is always a market below that point. People are selling Arduinos now for surveillance or monitoring simply because the market is too small for anyone skilled enough in C to bother.
As the price/performance of the hardware improves, more of these markets will open up. Today the "stuff it, use C" point is maybe a hundred units, or a thousand. Tomorrow, a million. Then 10 million.
Forgive my hardware ignorance, but imagine a system on a chip with the CPU, memory, and buses of, say, today's MacBook Air. Everything you want, on a thumbnail; just hook up electricity. If I could buy that for 2 cents, I would be foolish to write almost anything in C until I got real-time stats from my first 5 million users.
Twitter did not drop Ruby for Java till they were at the billions of messages level. There are an awful lot of markets and price points between 1000 units and a billion units.
(Especially now, when we can realistically talk about every human being having a handheld in X years, which blows my mind, but for good reasons.)
PS: No, I do not think Moore's Law will hold in terms of transistors on a chip forevermore, but we have barely scratched the surface of "everything on a chip".
(I would be interested in the feasibility of literally a PC on a chip. If we took a literal count of the transistors in, say, a two-year-old model and then looked at the price to fab that many transistors today, what would we see?)
Edit: rereading, this seems to indicate that there is hardware out there where C is the sensible option. I am just trying to say there is a spectrum (8-bit assembler / embedded C-like / DSLs / anything a PC might recognise) and that climbing that spectrum is inevitable based on hardware price/performance.
Moore's law doesn't apply to batteries. There's nothing stopping you from putting the MBA chip in a phone... except energy consumption.
If your Javascript program takes 14 times as long to run as the equivalent C version and still manages to feel performant from a user-experience perspective, then you'd better be close to an outlet, because we're talking about a beefy and energy-hungry processor.
There's a sweet spot related to programming effort and power consumption. I don't think Javascript can hit that sweet spot yet for most devices.
I'd argue that the same problem, but to a lesser degree, is present in other non-mobile devices. I'll buy the one that costs $20 more but costs me $20 less per year on my energy bill.
I think we may even see a bit of a reversal in the current trend of programmer productivity over program efficiency. Clusters of cheap multi-core servers consume significant energy.
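To put illustrative numbers on that 14x figure (every value here is an assumption for the sake of the arithmetic, not a measurement):

  // Back-of-envelope: energy per event if the core must stay awake
  // 14x longer to do the same work. All figures are made up for scale.
  var activePowerMw = 100;            // assumed draw while the core is awake
  var cTimeMs = 1;                    // assumed C time per event
  var jsTimeMs = cTimeMs * 14;        // the 14x slowdown from above
  var cEnergyUj = activePowerMw * cTimeMs;    // 100 microjoules per event
  var jsEnergyUj = activePowerMw * jsTimeMs;  // 1400 microjoules per event

Race-to-sleep means the slower runtime burns roughly proportionally more battery per event, before you even count the RAM the interpreter needs keeping a bigger chip powered.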
I remember mobile phones that had to be kept in the car because the batteries were like dumbbells. We lived with it because it was what we wanted.
There are going to be physical limits to battery technology, to processing power, to all of these things. But those limits are not upon us yet, and I have a suspicion the true physical limits lie in a place that will make us 20th-century simpletons look like slack-jawed savages.
So, your conjecture is that physics will continue, but our appetite for new features will plateau? I don't buy that, personally; in fact I think it is exactly the opposite. I want a calendar program that is smarter than me. I want search that understands my tastes in restaurants, shoes, books. I want software that immediately translates, perfectly and idiomatically, any language. I want to talk to my phone. I want it to compose music for me. I want a pony (okay, that last one is a different list).
Now, we do know one limit: the human brain fits in an X-sized area and requires Y-sized plumbing and energy stores to accomplish what it does. My feature requests exceed what any one brain can do. I'd be astonished if we could shrink that to phone size, but maybe, just maybe.
At that time we can renew this conversation. Until then, burning batteries, and running data centers, matters. Until then, C (or a safer version thereof) matters.
> ultimately there is always a market below that point. People are selling Arduinos now for surveillance or monitoring simply because the market is too small for anyone skilled enough in C to bother.
I agree, and this could be interesting for some of those small volume domains--but their copy in the slide deck is broadly encompassing and far reaching, implying mass-market products like Nest. I'm just taking them at their word and addressing that application.
> And yet our operating systems are still written in C(++).
The last time a mainstream OS was developed from scratch was the late '80s/early '90s (1989 to 1993), with Windows NT. Linux is just a kernel for an OS from the early '80s (GNU). OS X is a derivative of NeXTSTEP, also developed in the second half of the '80s. C was the state of the art back then.
The investment needed to build a competitive mainstream OS from scratch in a new language is huge: probably 10x to 100x the effort of building Windows NT, which took 250 people 5 years. I think there would be a lot of value in developing an OS in a programming language that deals intrinsically with security and multi-processing, but at this point it's too expensive to do that and match other OSes feature for feature.
Most of that effort would be spent writing drivers. Writing a scheduler (how many has Linux had so far, 3?), process manager, virtual memory system, simple filesystem and networking (TCP/IP) stack is not that much work. Supporting most existing hardware, however, takes years.
EDIT: for an example, compare linux/drivers/ to linux/kernel/ (core kernel code, including process management and scheduling) and linux/mm/ (memory management) in Linux. The former is huge.
> So, maybe this project is betting on our lives transforming into everyone paying more for their hardware, and everyone using a lot more small/local/boutique type stuff created by small hardware teams. This doesn't really jibe with the way technology products are currently marketed, hardened, and distributed, so I'm operating on the assumption that change doesn't happen anytime soon.
So, didn't and won't. At least not mass-market. Maybe very specialized, low unit volume kind of applications.
This concept is of course not going to amount to anything more than a hobbyist or prototyping/POC tool for quite a while, purely from a cost perspective. You can hire a master C programmer and pay him $100 an hour for a year, or an advanced JS programmer and pay him $40. Well after launch, you may have saved around $120k in salary, but your hardware costs you $50 extra per unit to produce, requires more power to run, and has lower performance and timing precision (not real time). Your break-even point is now 2,400 units. This is a non-starter: other engineering, design, and marketing costs will likely land you in the red if you are only producing a few thousand units. Perhaps in some niche markets this concept might be viable (like $500k plug-in medical devices), but I doubt it.
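The arithmetic, spelled out (every figure is an assumption carried over from the paragraph above):

  // Break-even sketch for the salary-vs-BOM trade-off described above.
  var hourlyDelta = 100 - 40;     // $/hour saved hiring JS over C
  var hoursPerYear = 2000;        // roughly one work-year
  var salarySavings = hourlyDelta * hoursPerYear;   // $120,000
  var extraUnitCost = 50;         // assumed extra hardware cost per unit
  var breakEven = salarySavings / extraUnitCost;    // 2,400 units

Beyond 2,400 units, the extra per-unit cost has eaten the entire salary saving.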
There is no way of really getting around learning hardware. Who is going to write the hardware interface drivers, or the driver/JavaScript bridge?
Power consumption will rule out many portable devices, loss of performance will rule out many options, and so will the latency caused by the garbage-collected (non-real-time) nature of the runtime.
This is the 'write everything in assembly' argument.
Productivity and prototyping gains that let you build a proof of concept, then manufacture and sell small runs of (expensive) toys, enable the large-scale investment needed to manufacture more mature products.
There's absolutely nothing wrong with this approach, and, honestly, the 'big bang' approach of writing it all in C and producing 500,000 units before actually getting them out to anyone is extremely risky.
That's why hardware doesn't get investment.
Just look at kickstarter; great ideas popping up, people wanting them, people getting them.
If you only ever end up making 5,000 units for your 5,000 backers, so what? Great idea. Prototyped a thing. People who wanted one got one. Not a mass-market thing? OK. We're not $4,000,000 in debt.
> That's why hardware doesn't get investment. Just look at kickstarter; great ideas popping up, people wanting them, people getting them.
What exactly leads you to believe that hardware doesn't get investment? Besides Kickstarter, I mean. Hardware development tends to be massively funded. You get less exposure for a lot more money, which is why Kickstarter is obviously not a good place to look, but in most projects I've seen it was software, not hardware, that tended to be underfunded.
I've practically never heard of people getting funded with hardware ideas outside of Kickstarter.
If you've got some links to incubators/hardware startup scenes, please share.
I honestly can't think of any hardware startups which have been funded, off the top of my head; maybe Nest? MakerBot (weren't they privately funded by the founders)?
> I've practically never heard of people getting funded with hardware ideas outside of Kickstarter.
You mean you've never heard of small startups getting funded with hardware ideas, which is unsurprising considering that it takes significantly more money to develop a working prototype than it takes to develop a web application.
Kickstarter is not the only place where people get funds, and startups with four people working 80-hour weeks are not the only places where innovation happens.
There are things like littleBits, of course. But the fact that new electronic gadgets keep hitting the shops is a clear enough indication that they get funding from somewhere.
It's not just about the money it costs to write the code. I mean, as a master C programmer I am not without sympathy for your suggested approach, but if you follow the philosophy of 'real men write the entire stack in hand-optimized C' and as a result it takes an extra six months before you're shipping, you might miss a key market window, and then all the other considerations might end up not mattering.
It's not like we haven't seen this play out before. Over and over again, flexibility, easy debugging, time-to-market and Moore's law have gone up against the 'real men' approach - and won hands down. History doesn't always repeat itself - but it's usually the way to bet.
>These board creators just want to ease the path for most web developers to become hardware developers
Most web developers don't know javascript though. Even among the ones who write javascript, I'd estimate fewer than 1 in 10 know the language at even a basic level. Most people are just grabbing messes of jQuery-infested crap and copy-pasting it, then making random changes until it seems to work. That's why so much javascript out there is invalid according to the language spec and doesn't work in less popular browsers.