
Four Main Roles of Government:

1. Uphold the rule of law

2. Maintain a stable & competitive economy

3. Turn taxes into public services (defense, roads, etc)

4. Lie


I can think of one great example of this. Charles Lindbergh won the Pulitzer Prize in 1954 for his Spirit of St. Louis autobiography. He worked on it for 15 years and refused to use a ghostwriter, and the result was a masterpiece. Between his poetic writing style and the mountains of fascinating detail leading up to the historic trans-Atlantic flight, I wouldn't hesitate to rank it as the most inspiring book I have ever read.

https://www.amazon.com/Spirit-St-Louis-Charles-Lindbergh/dp/...


I got a hold of my old copy of the book and wanted to share an excerpt from the first page:

> Searching memory might be compared to throwing the beam of a strong light, from your hilltop campsite, back over the road you traveled by day. Only a few of the objects you passed are clearly illuminated; countless others are hidden behind them, screened from the rays. There is bound to be some vagueness and distortion in the distance. But memory has advantages that compensate for its failings. By eliminating detail, it clarifies the picture as a whole. Like an artist's brush it finds higher value in life’s essence than in its photographic intricacy.

I can flip to any page and find a sentence or two that I've underlined for being as well written as the page above.


When you look back on the history of communism, you sure do see a lot of red flags.


Isn't it just the blood flow within the brain that we're observing?

We'll never be able to read a mind from fMRI-style data any more than we'll be able to judge the goings-on of America from the headlights on its highways.


Cool interpretation! I like that concept. But grandparent was definitely trying to deconstruct the word "meaningless", getting a little existential in the process.


Thanks for the link! Pretty exciting stuff.

Can anyone comment on the following quote:

> The list below shows some of the more important optimizations for GPUs... A few of them have not been upstreamed due to lack of a customizable target-independent optimization pipeline.

So the LLVM version of gpucc will be incomplete? Will there be a release of the original stand-alone gpucc?


Thanks for your interest, and hope you like it!

Yes, it is currently incomplete, but I'd say at least 80% of the optimizations are upstreamed already. Also, folks in the LLVM community are actively working on that. For example, Justin Lebar recently pushed http://reviews.llvm.org/D18626 that added the speculative execution pass to -O3.

Regarding performance, one thing worth noting is that missing one optimization does not necessarily cause significant slowdown on the benchmarks you care about. For example, the memory-space alias analysis only noticeably affects one benchmark in the Rodinia benchmark suite.

Regarding your second question, the short answer is no. The Clang/LLVM version uses a different architecture (as mentioned in http://wujingyue.com/docs/gpucc-talk.pdf) from the internal version. The LLVM version offers better functionality and faster compilation, and is much easier to maintain and improve going forward. It would cost even more effort to upstream the internal version than to make all the optimizations work with the new architecture.


In fact I think at the moment almost everything, other than the memory-space alias analysis and a few pass tuning tweaks, is in. I know the former will be difficult to land, and I suspect the latter may be as well.

I don't have a lot of benchmarks at the moment, so I can't say how important they are. And it of course depends on what you're doing.

Clang/LLVM's CUDA implementation shares most of the backend with gpucc, but the front-end is entirely new. It works for TensorFlow, Eigen, and Thrust, but I suspect that if you try hard enough you'll be able to find something nvcc accepts that we can't compile. At the moment we're pretty focused on making it work well for TensorFlow.
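
For anyone who wants to kick the tires, here is roughly what a minimal smoke test looks like. This is a hypothetical sketch, not something from the paper or from our tree: the file name, kernel, and exact flags are mine, and the flags may vary with your clang and CUDA versions.

    // axpy.cu -- hypothetical example, not from the gpucc paper or this thread.
    // A possible invocation (flags may differ by clang/CUDA version):
    //   clang++ -O3 --cuda-gpu-arch=sm_35 axpy.cu \
    //       -L/usr/local/cuda/lib64 -lcudart -o axpy
    #include <cstdio>
    #include <cuda_runtime.h>

    // Scalar a times x, plus y; one thread per element.
    __global__ void axpy(float a, const float* x, float* y, int n) {
      int i = blockIdx.x * blockDim.x + threadIdx.x;
      if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
      const int n = 1024;
      float *x, *y;
      cudaMallocManaged(&x, n * sizeof(float));  // unified memory keeps the example short
      cudaMallocManaged(&y, n * sizeof(float));
      for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

      axpy<<<(n + 255) / 256, 256>>>(3.0f, x, y, n);
      cudaDeviceSynchronize();

      printf("y[0] = %f\n", y[0]);  // expect 5.0
      cudaFree(x);
      cudaFree(y);
      return 0;
    }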


Thanks for the clarification! It's always a pleasure to get a direct response from the first author on something as awesome as this.

I'm definitely subscribing to the llvm-dev list[1] in case any discussion on this continues there. There are also the llvm-commits, clang-dev, and clang-commits lists, but llvm-dev seems like the right place for this.

Gpucc in LLVM is definitely a breath of fresh air for all of us nvcc users. Getting to see some compiler internals for CUDA feels like Christmas. A big thanks from me for all the upstreaming effort!

1: http://lists.llvm.org/mailman/listinfo/llvm-dev


I was wondering about this too, given the way they plugged the old K80s at the end for non-deep-learning applications. Either they're being clever about keeping multiple product lines alive (more profits!) or it's a big cop-out (they're hiding something about the P100 that makes it a bad choice for GPGPU, maybe price).


Only users with enough karma can see downvote buttons. I'm not there yet, either.


I highly doubt that Nvidia dropped the ball this hard with Pascal.

A much more obvious and sensible conclusion is that Nvidia is currently developing its next chip, called Volta. We already know that the Department of Energy contracted Nvidia and IBM (for lots and lots of money) to provide a good Volta GPU + POWER9 CPU combo for the new Summit and Sierra supercomputers, set for completion in 2017.[1] This means that Nvidia has known since at least 2014 that it would have very little time between the Pascal release and the more pressing Volta release. It's been on their roadmap for a while now.

The Fermi, Kepler, and Maxwell architectures each had two or three years between them. Pascal and Volta are set to have a year or less.

1: http://www.anandtech.com/show/8727/nvidia-ibm-supercomputers


I'd trust McDonald's buns and meat to contain FEWER preservatives and other bizarre compounds than a pack of burger buns and patties that I'd pick up at my local grocery store. The main reason is that McDonald's is a well-oiled supply-chain machine with (probably) short storage times for its various ingredients, whereas consumer grocery products can sit on a store shelf for weeks before purchase. The second reason (also just speculation) is that the intensity of regulatory agencies' scrutiny of McDonald's ingredients (especially the meat) would dwarf that of the small packages I'd buy for a barbecue.

I believe in staying away from foods high in lipids in general, but fast food burgers don't scare me. It's fries and pizzas that I abhor.


I wasn't talking about store-bought burgers. They may or may not be as bad as a McDonald's burger. Get the meat, add some spices (and maybe Worcestershire sauce), and you'll have your own delicious patty. The whole thing takes around 15 minutes, and you can customize any part of the process.

However, I agree with your point that there are healthier food options out there if you have the choice.

