When it comes to Rust, there's no stable version of the language at this point. There's no stable version of the standard libraries. There's no reliable production-grade compiler available. As the Rust home page itself states, "Rust is a work-in-progress and may do anything it likes up to and including eating your laundry."
Maybe Rust will offer such stability in the future. But that's of no use to people and organizations who need to develop software today, and who need to be able to trust that the code they write now will compile and work tomorrow, a month from now, a year from now, and perhaps even decades from now.
C++ does offer stable, standardized, well-supported versions of the language. C++ does offer stable, standardized, well-supported standard libraries. There are numerous high-quality free and commercial C++ implementations available, for just about every platform imaginable. It provides a robust and predictable platform that serious and massive software systems can be built upon.
The theoretical benefits that Rust may bring are pretty much irrelevant as long as it isn't a production-ready language in the way that C++ is.
But now you aren't talking about Rust the Language but Rust the Ecosystem.
Not that I am disagreeing with your points; I am not. However, when people talk about C++'s problems, I immediately assume they mean C++'s problems as a language rather than its ecosystem.
Rust isn't out there to tackle C++'s ecosystem, tooling, legacy code or professional workforce, but rather Rust aims somewhere near C++ and fixes many of the language flaws which are inherent in C and C++, while still being competitive in performance and low-level control.
Ecosystem really is the whole problem, though. You can never make it big without having lots of friends.
Rust is a pretty cool language, but I can't help being reminded about another very well-designed but ultimately unsuccessful C++ challenger, D. The parallels are really hard to ignore.
D, like Rust, had great syntax and was a breath of fresh air after coding with C++98. Neither Rust nor D has a sponsor with really deep pockets to encourage adoption. Neither came out of a standards process. Both have had compiler and standard library issues. The big difference between the two at this point seems to be momentum and where the two are in their parabolic trajectories.
There are important differences: D threw away one of the most important features of C/C++ (for the target audience): usability of the language without a garbage collector.
Also, Mozilla has much deeper pockets than Digital Mars.
Still I agree with you that it is very likely that Rust will follow a parabolic trajectory. The advantages as perceived by the industry compared with C++11/14 will be too few. At the same time it does not have the ecosystem.
Go succeeded because of Google's deep pockets and, perhaps more importantly, because it filled a large niche that even the authors did not anticipate: a faster language for Python and Ruby aficionados.
> Still I agree with you that it is very likely that Rust will follow a parabolic trajectory. The advantages as perceived by the industry compared with C++11/14 will be too few.
Rust is memory- and type-safe (as in: the compiler will not let you write a dangling reference, invalidate an iterator, or write exception-unsafe code without opting into an unsafe dialect). The security benefits of that alone are enough to justify the language for our use cases, and, from what we've seen, for many others. Safe zero-cost abstractions are a niche all their own.
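For anyone who hasn't seen it, here's a minimal sketch (my own illustrative example, not anything from the parent comment) of the reference-invalidation class of bug the borrow checker catches at compile time:

```rust
fn main() {
    let v = vec![1, 2, 3];
    let first = &v[0]; // immutable borrow of an element of `v`

    // The line below would not compile: the borrow checker rejects
    // mutating `v` while `first` still borrows from it, which is
    // exactly the kind of iterator/reference invalidation that
    // compiles silently in C++.
    // v.push(4); // error[E0502]: cannot borrow `v` as mutable

    assert_eq!(*first, 1);
    println!("first element: {}", first);
}
```

The equivalent C++ (calling `push_back` on a `std::vector` while holding a reference to one of its elements) compiles cleanly and is undefined behavior at runtime.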
Could you be specific about what in D isn't useable without the GC? You can use manual memory management in D. It has Unique (unique_ptr) and RefCounted (shared_ptr). It has Array. It has malloc and free. None of these use the GC.
People complain about the GC being available to use while simultaneously complaining that to avoid it you have to manage your memory yourself. You can't have it both ways. There is no memory allocation strategy that works best in every situation. Sometimes ref counting is best, sometimes stack allocation, sometimes RAII, sometimes memory pools, sometimes it's the GC.
Rust has done some cool work with memory but even it doesn't free the programmer from having to consider and choose which memory allocation/ownership option is going to deliver the best performance on a case-by-case basis.
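To illustrate that point: even in safe Rust the programmer picks the ownership strategy explicitly, case by case. A hedged sketch (`Box`, `Rc`, and stack arrays are real standard-library tools; the analogies to C++/D types are my own):

```rust
use std::rc::Rc;

fn main() {
    // Stack allocation: cheapest option, lifetime tied to the scope.
    let on_stack = [0u8; 16];

    // Box: unique heap ownership, freed deterministically via RAII,
    // roughly analogous to C++'s unique_ptr or D's Unique.
    let unique: Box<[u8]> = vec![0u8; 1024].into_boxed_slice();

    // Rc: shared ownership via reference counting, roughly analogous
    // to shared_ptr or D's RefCounted; chosen when lifetimes overlap
    // and no single owner can be identified.
    let shared = Rc::new(String::from("config"));
    let another_handle = Rc::clone(&shared);

    assert_eq!(on_stack.len() + unique.len(), 16 + 1024);
    assert_eq!(Rc::strong_count(&shared), 2);
    drop(another_handle);
    assert_eq!(Rc::strong_count(&shared), 1);
}
```

None of these involve a GC, but none of them free you from deciding which one fits each allocation.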
You're right about Rust and Go having much deeper pockets. D isn't backed by any corporation. It's 100% a community project. Maybe it won't ever gain a significant market share because of this. I don't know.
I would also say that D did itself irreparable harm with the whole v1-v2 standard library debacle. Right when it was receiving the most attention, it came right out and said to anyone who might have considered it, "we have two incompatible versions of the standard library: one which we don't support anymore and one you can't use yet."
>At the same time it [Rust] does not have the ecosystem
Try watching new projects pop up at Rust CI [0] for a few days. With the possible exception of Node (which is not even a PL), I've never seen a language ecosystem grow this fast, and I'm a PL aficionado.
I think in a year's time, the question of Rust's stability and ecosystem will be entirely moot. It's a tough wait meanwhile, but I'm still investing the time in learning Rust (and it's a significant investment).
Is Rust CI really the best evidence to use in this case?
When I last looked at it, probably 40% to 50% of the projects listed had builds that were in the "failing" or "error" statuses.
That indicates that one or more of at least a few things are happening:
1. The Rust language and its standard libraries are changing at a pace that results in previously-compiling code needing to be modified before it will compile again with a newer version of the language/implementation, perhaps a very short time after the code was initially written.
2. The Rust compiler or other tooling is crashing or failing in some way while compiling these projects.
3. The projects themselves aren't being maintained on an ongoing basis.
4. The projects themselves were never building properly in the first place.
5. The projects' developers are targeting different versions of Rust (which probably means there will be interoperability problems for anyone trying to use them in a larger project, especially when it comes to libraries).
And while there may be a lot of these projects, I've never found the quality to be very good. Many of them are extremely limited or incomplete. Many of them are little more than casual experimentation. Many of them are only developed by a single person, who often has appeared to have lost interest.
Those factors are disconcerting, especially for somebody who wants to use Rust for serious product development. It does no good if there are hundreds of libraries available for use, but half of them don't even build, and the ones that do are very incomplete.
>Many of them are extremely limited or incomplete. Many of them are little more than casual experimentation. Many of them are only developed by a single person, who often has appeared to have lost interest.
Yes, I completely agree, and this is an entirely normal part of the language ecosystem development cycle. Rust is at the tail end of the experimentation stage, and as it converges on 1.0, more people will undertake serious projects.
I'm not at all worried about the quality of Rust projects. I'm just happy to see so much enthusiasm. I have no doubt that all this enthusiasm will transfer into some powerful libraries as Rust continues to stabilize and reaches 1.0.
But yes, it's still too early to use Rust for production unless you're willing and able to write your own libraries.
Then again, the "batteries included" approach of Rust's standard library leaves little to be desired outside of domain-specific libraries.
Given the Mesa/Cedar system at Xerox PARC, Oberon at the Swiss Federal Institute of Technology, and Modula-3/SPIN at Olivetti, doing OS work in GC-enabled systems programming languages is quite possible.
The problem is how to move OS vendors away from C's influence.
A large or well-known company merely using a programming language, and maybe even contributing back to it and its community, isn't the same as the company truly supporting or championing it.
What you describe is very different from, say, how Sun pushed Java, or Microsoft pushed C#, or how Apple will likely push Swift, or how a huge portion of the entire software industry pushed C and C++.
Facebook's Hack language is probably a much better example than D of a language that they're actively supporting. It's a creation of theirs, rather than just somebody else's creation that they find useful in some limited cases.
> Both have had compiler and standard library issues.
What are the compiler/standard library issues with Rust? Sure, it's taken time to get to a high level of quality, but no language compiler and library is going to spring out of thin air fully complete. In particular, I'm confident that the Rust compiler is of high quality for its age, especially in the quality of the generated code.
> In particular, I'm confident that the Rust compiler is of high quality for its age
I'd like to concur. I've used the Rust 0.10 compiler and the only bug I encountered was that generating debug binaries was somehow broken. Apart from that, it was one of the smoothest experiences ever and the error messages are amazing.
Rust error messages are perhaps the best thing about the compiler. For me, it's a new experience altogether. Instead of having to Google obscure error messages, Rust error messages actually tell me how to solve the problem!
I don't think that the two can be separated in any meaningful way. Of course they're different things, and we can discuss each on their own, but when it comes to which language to use as a business decision then the line becomes blurred if not completely transparent.
What makes you so sure that there was ever only one memory leak? What makes you so sure that one or more new ones haven't been introduced since then?
I've seen and heard a lot of reports from many different Firefox users about Firefox using an unreasonably large amount of memory, even when using fresh installations of the most recent version, and when engaging in very reasonable browsing patterns.
As a software developer faced with a large and frequent volume of reports of such a nature, the only responsible thing to do is to assume that there is truly a problem. This should be assumed even if the developers themselves may be having trouble reproducing the problem. Denying that the problem exists is usually the most counterproductive thing that can be done, because the problem likely does actually exist, and it doesn't get fixed.
By the way, I don't believe that there ever was a Firefox 2.5 release. Perhaps you mean Firefox 3.5?
Once again, nobody is denying anything about memory problems. But there are two things at play here:
1. One person's experience is not universal. With a web browser as customizable as Firefox, there will be certain configurations causing problems that aren't caught by tests or by dogfooding. Some people are probably seeing leaks that users with a default configuration don't see. For example: https://blog.mozilla.org/nnethercote/2014/08/15/the-story-of...
2. People need to report their problems in ways that are actionable. HN is not a bug tracker. If somebody on HN (Hacker News! News for programmers who presumably know how a bug tracker works!) is having memory issues, (s)he should have no problems filing a bug in Bugzilla. Firefox has had about:memory for years now; save that report and attach it to your bug.
Users have been reporting memory leaks in Firefox up to and including last week. But users should not be used as a source for what a memory leak is, because 80% of users do not know what a memory leak truly is, much less have the technical competence to recognize and report one.
I often make the error of saying the fix was in version 2.5, but perhaps it was in v1.5 or v2.0. It doesn't matter; my point is that they were fixed long ago.
This is absurd. So rdxm is a "shill" merely because he or she pointed out some very real and important issues surrounding this sort of technology, and then also pointed out the fact that established players are best prepared to handle such challenges? Huh?
If you're going to make such accusations, or even just hint at them, please provide us with at least some real evidence to show that rdxm is being directly compensated by one or more of the industry incumbents for posting that comment. Since I doubt very much that you can provide that evidence, I think it would be appropriate for you to apologize to rdxm and to the rest of the community here.
Eh, I'd expect somebody shilling for Epic or any of the other major vendors to sound the same way--which is exactly what I said, no more no less. I'm not going to apologize.
The problem with bandying about "HIPPA [sic]" and random ISO security standards is that it only serves to dampen enthusiasm for fixing the staggeringly pervasive issues of mismanagement and technological obsolescence.
Anybody can come up with a "no" or a problem--"but but but HIPAAaaaaa" is a common refrain from people who want to sound like they know something but who lack the talent or skill to fix the fucking thing.
Based on the comments here and those that appear under the article, people in general are not at all impressed by these ideas.
I don't think the hatred is as universal as it is for, say, Australis, but it's close enough that it should be discomforting.
What is your response to this? Do you think it's right to continue work on a project that the majority of people dislike for a variety of very legitimate reasons?
"Do you think it's right to continue work on a project that the majority of people dislike for a variety of very legitimate reasons?"
Almost certainly yes. How else can you find new, previously unknown products and ideas of value? Don't you ever just brainstorm crazy ideas because you don't know where an idea will take you? You know... think different... here's to the crazy ones.
Also, "majority of people" might need to be edited to say "majority of a subset of people who are HN readers/commenters and who are so interested in browsers that they watch nearly 10 minutes of video on the topic."
"Based on the comments here and those that appear under the article, people in general are not at all impressed by these ideas."
I wouldn't say that is a fair assessment, many people took it for what it was (a UI/UX experiment), and quite a few people who did take to it badly mistakenly thought it was a Firefox design concept.
Well, to be fair, the last 2 Firefox "design concepts" got almost verbatim into the mainline despite almost unanimous complaints, and now I have 2 different plugins installed to correct the mess they created.
It's only a "mess" to some; I personally have no beef with Australis, and the beauty of Firefox is that it allows you to customise it to work how you want (hence you were able to 'fix' it). This flexibility means there's still plenty of room for UI experimentation. Besides, this Lightspeed design is explicitly not for Firefox: the designers acknowledge that part of what makes Firefox what it is is its customisability, and this design is exploring what could be done with an alternative browser.
I don't see any such suggestion. The fact is that updates fix many known flaws. Those flaws are, or soon will be, exploited once an update is released and their details are fully disclosed (to the world, including the bad guys), so not taking an update is the same as being zero-day'd by the software providing the update. It's not a good idea to skip fixes for known and already (or soon to be) exploited flaws just because the update may or may not have introduced new, as yet unknown flaws.
That sensible and effective approach has become yet another victim of the failed quest to "improve the user experience" by throwing away or changing stuff that was working perfectly fine before. The new approach is almost always far worse than whatever minor flaws might have existed with the earlier approach.
And allowing the old, working approach to be toggled back on through some preferences dialog, through about:config or through some extension doesn't justify the bad change. If anything, it's actually somewhat offensive, because it now requires users to engage in yet more fixing of things that just shouldn't be broken to begin with.
When I start using a fresh installation of Firefox, I have to spend at least a good 10 minutes installing various extensions and reconfiguring it just to get a minimally usable experience out of it. That's not acceptable, and it's not justifiable.
To me, browsers defaulting to the Downloads folder improved the user experience. You might care much about where exactly your files are going, but I generally don't. Of the two, I think "not caring much" is the default option. If you do care, right-click -> "Save link as...".
It's also incredibly arrogant to say that it's not acceptable or justifiable that Firefox does not exactly cater to what your personal opinion on "usability" is.
> When I start using a fresh installation of Firefox, I have to spend at least a good 10 minutes installing various extensions and reconfiguring it just to get a minimally usable experience out of it. That's not acceptable, and it's not justifiable.
You're assuming that this is widely applicable — very few people have spent a decade building a highly customized browsing experience and most people don't customize much at all. Ever walk around the office and notice how few people even removed Microsoft's default bookmarks from the toolbar so they spend all day staring at links to a service which they don't use?
I would also note that the history of computing is littered with loaded descriptions of things which were reflexively described as a "bad change" and a complete non-issue a year later. Mouse wheels, tabbed browsing, fonts and later CSS, JavaScript, all got the kind of grumbling you made above from a few people. Sometimes it's worth asking whether you really benefit from the old way or are just reacting to something being different.
I think your understanding of the history of computing is flawed.
As somebody who lived through the events you mentioned as an adult working in industry, I can assure you that the sentiment you believe was felt was actually not felt.
Mouse wheels were seen as a very good thing when they first came on the scene. They gave the power of the three-buttoned mouse, but also made scrolling much simpler.
The same goes for tabbed browsing. It was one of the best features of Opera for a long time. Everyone I showed it to at the time thought it was very useful. And it was one of the best features of Firefox, too, when it was still Phoenix.
And the same goes for fonts, and CSS (although to a lesser extent). Their benefits were obvious from the beginning, and I don't remember them facing really any resistance.
Contrary to popular belief today, JavaScript was not seen as good when it was first released, and it should not be considered good today. In the mid-1990s it was generally seen as a rather bad and limited language. That's why it didn't see much use until the mid-2000s. The first generation of developers who experienced it found it inferior to existing technologies and generally refused to use it. Even today, it's still a very flawed language (the problems with it are well known; I'm not going to regurgitate them here).
The problem with Firefox lately isn't that there has been change. Of course change can be good. In the case of Firefox, though, the change has been utterly horrible, causing far more problems than it brings benefits, for a huge number of people. This is reflected very well in Firefox's ever-dropping market share.
> As somebody who lived through the events you mentioned as an adult working in industry, I can assure you that the sentiment you believe was felt was actually not felt.
My comment was based on my experience as someone who also lived through that period as an adult working in the industry.
> Mouse wheels were seen as a very good thing when they first came on the scene. They gave the power of the three-buttoned mouse, but also made scrolling much simpler.
That's easy to assume now, but at the time there were people who complained that they required more precision to use, were inconsistently supported by existing software, etc. I remember people complaining that clicking the wheel was less reliable than using a proper third button — no doubt true for the people who did a lot of pasting in X11, but that number was an increasingly minuscule fraction of the computing world, and no doubt most of them adjusted after they stopped grumbling.
> The same goes for tabbed browsing. It was one of the best features of Opera for a long time. Everyone I showed it to at the time thought it was very useful.
… and yet other people complained that it was confusing to have tabs when you also had windows, duplicated with the OS window management, made it easy to accidentally forget you already had something open, etc. I'm sure all of those people use tabs now without even thinking about it but that doesn't mean that they didn't grumble first and learn how to use them second.
> And the same goes for fonts, and CSS (although to a lesser extent). Their benefits were obvious from the beginning, and I don't remember them facing really any resistance.
Outside of your corner of the web, there were impassioned rants about how the font tag overrode the user's font selection – that was one of the early selling points for CSS! Some people complained about CSS because it was harder to use than the font tag while others complained that it made pages slow or required downloading more data, etc. Some people complained about both because they made it easy to make pages which were hard to read on the wrong browser, operating system, or if you had a very small or very large display, or were color blind or visually impaired.
> Contrary to popular belief today, JavaScript was not seen as good when it was first released, and it should not be considered good today. In the mid-1990s it was generally seen as a rather bad and limited language.
I started writing JavaScript back when it was called LiveScript (oh, those heady days of downloading Netscape 2 betas when their FTP server wasn't overwhelmed). Then, as now, people complained about JavaScript being bloated or slow and there's a long tradition continuing down to your comment of complaining about the technical merits of the language. This wasn't wrong – even Brendan Eich is apologetic about most of it – and yet here we are in a world where the one language you can assume will be taught in 10 years is JavaScript because a billion people interact with JavaScript programs constantly.
Again, I'm not saying that the complaints are entirely without merit – only that there's a long tradition of people who overestimated either how serious a problem was, the degree to which their reaction was representative of the general computing public, or both. I remember plenty of advocacy that pages should work without JavaScript – and personally engaged in a fair amount – but much of the web today assumes JavaScript without any of the predicted disasters.
> The problem with Firefox lately isn't that there has been change. Of course change can be good. In the case of Firefox, though, the change has been utterly horrible, causing far more problems than it brings benefits, for a huge number of people. This is reflected very well in Firefox's ever-dropping market share.
As they say, citation needed. That trend started well before the new UI and appears to have rather a lot more to do with Google's successful promotion of Chrome. Even in technical forums, there will be a ton of messages but they always seem to be posted by a small percentage of highly vocal users who assume everyone who isn't commenting agrees with them.
When I start using a fresh installation of Firefox, I don't need to do anything to get my normal experience back, because I already restored my profile folder before opening Firefox for the first time.
yarou obviously didn't say that they should focus everyone only on reducing Firefox's memory usage and performance problems. I'm not sure how you mistakenly got that impression, because that's clearly not what that comment suggests.
The main issue here is that we've been hearing that these problems will be fixed, or even that they supposedly have been fixed, yet they're still present years later.
Whatever work is being done clearly isn't having much of an impact. Users are still reporting problems with Firefox's performance and memory usage, even if those within the Mozilla community wish to deny these problems exist, or claim that they'll be fixed "soon".
Users can only take so much of this. With Chrome and Firefox offering UIs that are pretty much identical these days, but Chrome offering significantly better performance and significantly lower memory usage, any reasonable user will obviously consider switching from Firefox to Chrome. Many have done so already, and many will continue to do so as time goes on.
yarou's comment is in response to a presentation about unrelated features, and his/her response suggests knowledge that Mozilla isn't focusing on memory and performance problems.
Many of the memory leaks have been fixed over the years; benchmarks suggest Firefox can compete with Chrome in terms of speed (www.tomshardware.com/reviews/chrome-27-firefox-21-opera-next,3534-12.html); and Mozilla, realizing its UI still feels sluggish, launched a project (called Snappy) to fight that sluggishness.
Many improvements have been made over the years on all counts.
It may just be a thought experiment at this point, but I think it's pretty clear now that all of the major browsers are unfortunately headed in this direction.
This rush to target "more simple Internet users" hasn't gone well for Mozilla and Firefox so far. All they've managed to do is create a dumbed-down UI that's harder and less efficient to use, and this has alienated a lot of Firefox's existing users. This is a big part of why we keep seeing Firefox's share of the market sliding lower and lower.
Users can forgive Chrome for its bad UI experience because it offers quite good performance and resource usage. Firefox, unfortunately, does not offer that (I know, I know, Mozilla has benchmark results that will show the opposite, but these aren't indicative of actual user experiences). When faced with two browsers offering basically the same flawed UI, users will use the one that offers the best runtime performance and the lowest resource usage.
Going forward with this sort of a design, or even continuing down the existing path that Mozilla has been taking, ultimately won't be successful. Users have very obviously been rejecting Firefox because it now no longer offers a usable UI, nor does it offer acceptable performance and resource usage.
>When faced with two browsers offering basically the same flawed UI, users will use the one that offers the best runtime performance and the lowest resource usage.
I don't think that Chrome wins decisively on resource usage. It cheats a lot, and if you use your browser in uncommon ways, it degenerates.
I think that Chrome will win over Firefox's 'be like Chrome' strategy because people like leaders rather than followers, and would rather have Chrome-like features now on Chrome rather than waiting a year for Firefox to copy them, slightly differently.
That behavior of Firefox on Android is just plain stupid and broken, even when it's working.
It's another great example of how the Firefox developers have added in functionality that just makes the browsing experience less efficient or more confusing, without any real benefit. Getting a sliver more of screen space just isn't worth the inconvenience this "feature" brings, especially on larger mobile devices.
Disabling it via 'Settings' -> 'Display' -> 'Scroll title bar' was the first thing I had to do when trying out Firefox on Android, and that was just to bring it up to a minimal level of usability.
Yes, Chrome on Android does appear to be broken in a similar way. I'm not sure about Safari on iOS, though. Regardless, two or even three of them being broken in the same way doesn't mean that they aren't broken. The state of being broken is independent of how many different systems are broken.
Firefox OS doesn't exist in a vacuum. It's facing some very strong competition from Android, iOS, BlackBerry OS, and numerous other mobile operating systems and platforms.
It makes absolutely no sense to consider Firefox OS by itself. It needs to be considered within the larger context. Unfortunately, like others have pointed out, Firefox OS is not competing very well against the alternatives at this point.
It currently provides a worse experience than its competitors in pretty much every way. I don't think this is something that can be denied. Mozilla and the Firefox OS community will need to face these problems.
And these "astroturfer" accusations from you are kind of weird. Pointing out obvious flaws with software and hardware does not mean that the people pointing them out are being paid by Apple, or Google, or any other company or organization. It probably just means that they'd like to see these problems resolved, for the sake of Firefox OS and its potential success.
This is an article about designing and developing for Firefox OS. This is not an article about competing with Android and others.
Discussion about which is a better option for the consumer is a very valid discussion and is important to the long term survival of the platform. But, in my opinion, it just doesn't have anything to do with the topic at hand.
It's like if there was an article about a release of the next version of Android and someone came in talking about how the smaller set of target screen sizes in iOS makes it a better platform to design for. It's a valid discussion, but it just doesn't have anything to do with the topic at hand.
The "astroturfer" comment is because I can't see any sane reason, beyond self interest or possibly just to troll, why anyone would bother to hop into a thread and just say Android is better when there was no mention about Android in the article at hand.