Was Steve Yegge Right? (regularlyexpressed.com)
75 points by nhashem on Aug 1, 2011 | hide | past | favorite | 41 comments


I think he's actually somewhat correct about multi-threaded programming, at least in some areas. Multi-threading works (aside from all the problems Yegge mentions) when you're dealing with multiple cores in a single box, but it doesn't scale at all well across a distributed system. As a counter-example, Map-Reduce has exploded since Yegge's post, as a way to get practical parallel speedups for certain kinds of tasks, without even having to think about race conditions, deadlocks, etc. Now Map-Reduce isn't useful for all the same problems as multi-threading, so one isn't going to replace the other, but it's just an example of an area where the multi-threading paradigm fails.
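The Map-Reduce model described above can be sketched in a few lines of Python. This is a single-process illustration of the idea, not a real distributed framework: each "map" runs independently over its own shard of the data, and the "reduce" step merges results with an associative operation, which is exactly why the programmer never has to think about locks or race conditions.

```python
from collections import Counter
from functools import reduce

# Map: each "node" independently turns its shard of documents into word counts.
def map_shard(docs):
    return Counter(word for doc in docs for word in doc.split())

# Reduce: merge partial counts. The merge is associative and commutative,
# so the framework is free to schedule it in any order on any machine.
def reduce_counts(a, b):
    return a + b

shards = [["the cat sat", "the mat"], ["the dog ran"]]
total = reduce(reduce_counts, (map_shard(s) for s in shards), Counter())
print(total["the"])  # 3
```

The shards never share mutable state, which is the property that lets the same code scale from one box to a cluster.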


Totally agree. "out of favor" != "gone".

Distributed message/actor models have exploded. There are better alternatives to multithreading now. Only niches (often front-end UI work) need multithreading.


Depends on what you mean by multithreading. Is it multithreading only when you use threads (and locks) directly, or whenever you're using multiple threads in your program in some way?

Actually using threads directly has probably declined because of actor frameworks, better understanding among developers, STM...


Multithreading is certainly popular to pee on these days. It requires a fairly high level of understanding of how your data flows to get right, and languages give you little help with that.

So CSP-based models have gained a lot of popularity as a way to kludge around it and get most of the benefit (though shared memory is a property of threading I don't yet see emerging in the CSP approaches).
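A minimal CSP-flavored sketch in Python, using `queue.Queue` as the channel. This is an illustration of the style the comment describes, not any particular framework: worker threads share no mutable state and communicate only by passing messages, with `None` as an assumed end-of-work sentinel.

```python
import threading
import queue

# Channels: workers share nothing mutable; they only pass messages.
tasks, results = queue.Queue(), queue.Queue()

def worker():
    while True:
        n = tasks.get()
        if n is None:          # sentinel: no more work for this worker
            break
        results.put(n * n)     # send the result back over the channel

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for n in range(10):
    tasks.put(n)
for _ in threads:
    tasks.put(None)            # one sentinel per worker
for t in threads:
    t.join()

total = sum(results.get() for _ in range(10))
print(total)  # 285 = 0 + 1 + 4 + ... + 81
```

Because all coordination happens through the queues, there are no locks in user code and no shared variables to race on, which is the "most of the benefit" the comment refers to.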


To the extent that multithreading is popular to pee on, it's precisely because of the shared memory, or more accurately, the shared mutable variables. There are a lot of interesting concurrency solutions being developed, but it looks to me like shared mutable variables aren't going to be in any of them, so don't hold your breath waiting for them to come back. (The video at [1] is an interesting overview of the various Haskell solutions, worth watching beyond just Haskell as a survey of concurrency approaches.)

[1]: http://skillsmatter.com/podcast/scala/talk-by-haskell-expert...


Shared memory can give strong performance advantages that the alternatives don't have. You pay for this in implementation complexity. Each can decide how much he or she wants to pay.

The price could be lowered by some trivially better support functions. Why, for example, C++0x doesn't have lock-free concurrent containers in the standard library is beyond me.


agreed. see also redis, node.js et al. evented is the new threaded.


  Prediction #9: Apple’s laptop sales will exceed those
  of HP/Compaq, IBM, Dell and Gateway combined by 2010.
  Reason for prediction: Macs rule. Windows laptops are as
  exciting as a shiny disco ball in the ceiling.
Honestly, to me, that's just absurd on a fundamental level. And the reasoning just reeks of fanboy-ism (note: I have no idea who Steve Yegge is - I just clicked through out of curiosity).

  I tried to actually Google the market share for Macbooks, 
  but it’s probably impossible to find an absolute number.
Not hard. Found the following with two searches in just a few minutes:

Circa Q2'08 (so, four years after the prediction) [1]: Apple laptop North American market share 10.6%. Dell and HP are both 21%, Acer is just under 15%, Toshiba is 9%, and "All Others" are 22%.

Circa Q1'11 Apple Notebooks are 5.7% global market share [2] and Apple is 8.5% of shipped notebooks + desktops [3].

At best, Apple COO Tim Cook says in Oct. 2010 that 20% of sold laptops + desktops are Macs [4].

[1] http://arstechnica.com/apple/news/2008/09/apple-gains-us-mar...

[2] http://www.ventureoutsource.com/contract-manufacturing/taiwa...

[3] http://www.notebookcheck.net/Acer-loses-its-Q1-2011-market-s...

[4] http://www.betanews.com/joewilcox/article/App-Store-comes-to...


However, he is correct if you just examine laptop sales among college students:

http://daringfireball.net/misc/2011/07/u-texas.png

When I worked IT for my college, we saw a similar pattern amongst residential computer registrations - almost all of students' machines were laptops, and the majority of those were running OS X.

This might not have been such a big deal back in the day (once they graduated, they probably got a job that forced them to use a PC, and so they stuck with one), but the PC landscape is changing dramatically. Nowadays, it's a lot easier to mainline a Mac after graduating.


Definitely true and I agree with all points. I would wonder how much of the overall laptop-buying population is made up of college students, though? Over time I could definitely see a significant effect (and as is stated elsewhere, Apple is growing much more in that market right now than others), but I would expect that to take upwards of a decade, as you can only 'convert' so many new college students each year.

Given the higher baseline cost of Macs, I would expect them to be more prevalent among college students because a) those people can afford to go to college and b) financial aid makes the difference between a $500 HP and a $1000 MacBook seem much less significant. I am still surprised by that infographic, though: 52% OSX wireless users! I probably would have guessed something more along the lines of 33%.


You're surprised because you haven't been on a college campus in the last five years. I'm amazed it's that low.


For most jobs, are there really any barriers left to using a Mac at home and PC at work? Considering computing for the vast majority of workers likely consists of email, excel, word, and powerpoint, I can't think of many. Certainly in-house apps would only work on Windows, but who uses those on their home computer?


Are we talking about work or home computers?

Overwhelming user pressure is slowly bringing Macs into the workplace, but they're only feasible because something like Citrix XenApp (aka Presentation Manager aka Metaframe) servers allow access to decades of in-house Windows applications.


But when those students arrive at Big Corp they're most likely to be issued a PC laptop. A more interesting stat would be the market share of purchases where the person paying is the person who'll use the machine.


It's hard to say what Steve meant by "laptop sales". I'm assuming that your interpretation (unit sales) is probably correct and Apple is definitely not that large. Their growth is certainly outpacing the industry, so maybe Steve is off in timing, but Apple doesn't keep on the low end which makes it harder to build up unit volume.

Even by revenue metrics, I doubt that Apple is bigger than all of those brands combined.

The thing is that Apple seems to maximize on profit (silly folks, they should know that market share is what you're going for). I couldn't find the figures, but I'd be curious to see what percentage of laptop profits belong to Apple. In the case of mobile phones, Apple now gets 60% of the profit in the industry.


> EDIT: And the downvotes are because...?

Ok, I'll bite. I downvoted you for not knowing who Steve Yegge is.


And that is valid in what way? Seriously. I wikipedia'd him before posting and nothing stood out as something that would change my opinion of his prediction.

His identity has no bearing on his being right or wrong. My knowledge of him (or lack thereof) doesn't mean that my criticism is invalid.

I'm sorry I'm not a programmer?

EDIT: So apparently my criticism is invalid? Would anyone care to explain why?


> And that is valid in what way?

Here's your quote: "And the reasoning just reeks of fanboy-ism"

You didn't call him a fanboy, but that's just a technicality. You implied it. Therefore, the objection is valid. Before you name-call, know who you are name-calling. (Or better yet, make a more thoughtful case.)

Personally, I downvoted you asking about the downvotes. Just suck it up and stop whining. (Incidentally, as is often the case, plenty of people came along after and, caring more about the rest of your post, upvoted accordingly.)


Alright, I removed "EDIT: And the downvotes are because...?" as per your request. It was there because I felt like there was nothing wrong with my comment (and still do) and so if someone could bring something to the discussion rather than just downvote me, I would appreciate it. So far, at least two of you have downvoted me for nothing involving what I actually said. More than that still have yet to produce actual criticism.

This all still doesn't explain how "Macs rule" is a valid and unbiased explanation of how MacBooks would come to account for over one-in-two laptops sold.[1] Doesn't that sound absurd if you say it out loud? "Every second person in the country owns a MacBook." That is quite the jump for a company whose cheapest (new, not used) laptop offering appears to start at $999 [2] and who only holds 5% to at best 20% of the market share.

> better yet, make a more thoughtful case

I would say the same to you: why should I know who I am "name-calling"? You haven't told me any reason that his opinion shouldn't be taken at face value.

[1] This is slightly simplified to a world where all laptops are either produced by Apple or are Windows machines produced by the four companies he mentioned.

[2] http://store.apple.com/us/browse/home/shop_mac?mco=OTY2ODQxN...


At face value, Yegge's comment lacks substance and the opinion is not well-supported, that's about the only valid criticism. Speculate about the author's motives at your own risk.

I think people downvote rather than comment to avoid uninteresting nit-picky arguments like this. I would love a way to attach an anonymous private message to a downvote but that's probably a non-trivial feature to implement.


This all still doesn't explain how "Macs rule" is a valid and unbiased explanation of how MacBooks would come to account for over one-in-two laptops sold.

Why on Earth do you want someone to defend "Macs rule" as unbiased? Who is claiming it is unbiased?

I would say the same to you: why should I know who I am "name-calling"?

Because then you'd realise why you are being downvoted. It's like you are downvoting Mr Burns for being biased in favour of nuclear power, or something. Yegge's style is humorous and pushy, not graphs and references. Lots of stuff he writes sounds a bit silly.


Thank you. All someone had to tell me was something like this. Now I understand much better.


"Prediction #1: XML databases will surpass relational databases in popularity by 2011."

WRONG.

Aren't we capable of calling "false" statements "false" anymore? I can't believe that the author defended this prediction.

I'm sure you can twist the words "popular", "database", and "XML" around enough to try to make some argument that he was almost right. But if you read the full original prediction it's even more clear that he was as wrong as it is possible to be in a prediction.


It seems instead of XML databases, we're seeing JSON databases. True, they're far from surpassing or even nearing relational databases in popularity. Relational databases are rather fundamental software, and there is such a vast amount of code and habit built around their use that predicting they would be 'surpassed' by something else only 6-7 years out was rather brash.


"It seems instead of XML databases, we're seeing JSON databases."

Even that one grain of truth doesn't hold up if you read his original prediction. He was talking about pretty XML-specific stuff like XPath and XQuery, and even things like DTDs and Schemas; so schema-less document stores just don't fit Yegge's story.

I don't have anything against Yegge for making such a prediction, and I'm sure he's not all that concerned that he was wrong. However, I don't think the author is being intellectually honest when he tries to twist it into a "partially right", and I do have something against that.


Spot on. At least half of these predictions are from marginally passable to blatantly wrong, with #1 and #5 taking the cake. Clojure's "emergence" as the language du jour in HN's echo chamber doesn't put Lisp "in the top 10 most popular programming languages", not even close.


Prediction #8: Someday I will voluntarily pay Google for one of their services.

Pretty close; there have been recent signs of this. App Engine is switching to a stricter payment plan. Google Apps for domains reduced its free user limit to 10. A lot of free APIs are shutting down, and some new ones require billing (Prediction, Storage, Search).

Although, in his particular case, it's Google paying him for his services...


Some things like the online photo albums have pretty low limits (1 GB storage). If I end up paying, it will probably be for that. Despite having my own web server, the link-in to my Android phone is a huge convenience factor.


Steve Yegge is always right; unfortunately, sometimes he's right too early. This was the case with his game, Wyvern, which was something akin to a beautifully retro meeting of Nethack and Everquest... with a client for the Sharp Zaurus.

If it'd been iPhone, it would have taken off. It was just way too early.


"Steve Yegge is always right; unfortunately, sometimes he's right too early."

What year do you think XML databases will take over?


I believe the point he's making (which I don't necessarily agree with) is that he's right about the changes in direction, but he estimates the result in the context of what currently exists.

So for example, he got it right that mobile games would be big, but they were big on iOS which didn't exist at the time he tried to make use of it.

And he was right that relational databases would decline. At the time, non-relational meant XML. He couldn't predict the NoSQL of today, because it just didn't exist.

Personally, I think that argument is just retroactively applying meanings that weren't there, however.


#7: The mobile/wireless/handheld market is still at least 5 years out.

This was written in 2004. iPhone App Store launched July 2008. Pretty good prediction.

Overall interesting to read these and see how many of them he nailed.


Concerning "Prediction #4": I have the impression that the author of the article looks at the global Java market share, whereas Steve Yegge speaks of the market share of Java on the JVM.


The global Java market share has never been above 50%. Moreover, the author conflates predictions #4 and #5.


"Prediction #10: In five years’ time, most programmers will still be average."

Just remember, half the population is below average intelligence.


You only have a 50% chance of this being correct.

Replace "average" with "median" and you're at 100%.


Isn't it normally distributed? In which case using average is fine.
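The mean/median distinction the thread is arguing about is easy to demonstrate. A quick sketch with Python's `statistics` module (the numbers are made up for illustration): for a symmetric sample the mean and median coincide, so "half are below average" holds, but one outlier pulls the mean away from the median and breaks it.

```python
import statistics

# Symmetric sample: mean equals median, so exactly half sit below "average".
symmetric = [80, 90, 100, 110, 120]
print(statistics.mean(symmetric) == statistics.median(symmetric))  # True

# One outlier drags the mean (116) above the median (100): now four of the
# five values are "below average", but still only two are below the median.
skewed = [80, 90, 100, 110, 200]
print(statistics.mean(skewed) > statistics.median(skewed))  # True
```

So "half are below average" is guaranteed only for the median; for the mean it holds just when the distribution happens to be symmetric, as the comment above notes.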


I enjoyed the article; a bit fluffy, but fun. However, oversee.net? The thing that comes to mind directly is "overseer", in the southern slaveocracy sense. (I say this as one who grew up in the southern US.) What an unfortunate name!


Really, you get "southern slaveocracy" from "oversee"? Oversee just means to watch over, to monitor.

Edit: Seriously, do a search on the term "oversee". It's a common term and it's related to slavery the same way "driver" is, which is to say basically not at all.

http://www.google.com/search?q=oversee


Really, I do. "Oversee" can be used in a sentence as a verb without the connotation. But when I see "oversee.net" -- the noun -- I think "overseer", and "overseer" is not a word that I can hear without thinking "slavery".


I don't get the slavery connection from "overseer", either. That's still a common term, and not closely tied to slavery. It just means someone who oversees, a manager/supervisor/foreman. Dictionaries (the ones I checked at least) don't even mention slavery, though I did notice a lot of slavery-related "related searches" on dictionary.com.

Obviously what you see is what you see, so you're not wrong, but I think the connection isn't something most people would see.



