
The overwhelming consensus led to a type 2 diabetes and obesity epidemic. Curious that when this consensus was absent, it wasn't nearly as much of a problem, but it became a huge problem once consensus was reached. Coincidence? Totally.


Well, no, bad eating habits and people with poor self-control led to the type 2 diabetes and obesity epidemic. (I'm not saying I have awesome self-control - it's a work in progress, but I'm not going to blame external factors when I'm the single largest determinant of what I put into my body.)

Sigh.

I get sick of reading this - that people think it's some grand conspiracy, or that big corporations are poisoning you. Look, for most people (read: you, me, and most people around us), it really does boil down to self-control. Eat less food and exercise more.

I'm always amazed by the portion sizes I get when I go out. And then when I don't shovel down food like my friends, they're like, "What's wrong? Are you sick? You should go see a doctor".

I'm like, no...err, I'm just full?

Yes, I have moments of weakness and I have my vices (liquorice...yummy), but if you just cut down on food in general, and the sugar and processed stuff, you will lose weight. I went from around 75kg to 68kg that way - sure, I suppose I could go lower, but I don't think I could maintain that level of self-control 24/7...haha.

And another poster is right - a lot of the success of these fad diets is probably just due to people being conscious of what they eat, which they normally aren't. The fact they're dieting probably means they realise exactly how much they're eating and, consciously or not, cut down on it.


Isn't it more like you give me money, and for that privilege you shall now have to pay me more money for as long as I have your money?


That's a distinction without a difference.


HAHA. Nice going, Newsweek. This idea has been around for over 50 years, and it's always just around the corner according to rags like you; like practical nuclear fusion, it's always 50 years in the future for anybody who's got a clue.

But just in case we actually can teach computers to understand us, "solving programming" will be one of the least exciting things that happens at that time. Other things may include, but are not limited to: extinction of the human species, technological singularities, deconstruction of the Earth and Moon to build a matrioshka brain, and so forth.


This post is factually wrong, and misguided. Here's why:

#Preamble: You cannot run Direct3D anywhere except on Windows. Unless you plan not to publish on Android, iOS, OS X, Linux, Steambox, PS4, etc., you will have to target OpenGL, no matter how much you dislike it.

#1: Yes, the lowest-common-denominator issue is annoying. However, in some cases you can make use of varying features by implementing different render paths, and in other cases it doesn't matter much. But it is factually wrong that there is something like a "restricted subset of GL4". Such a thing does not exist. You either have GL4 core with all its features, or you don't. Perhaps the author means that GL4 isn't available everywhere, and they have to fall back to GL3?

#2: Yes, driver quality for OpenGL is bad. It is getting better though, and rather than complaining about OpenGL, how about you complain about Microsoft, Dell, HP, Nvidia, AMD etc.?

#Compiler in the driver: Factually, this conclusion is completely backwards. First of all, the syntactic compile overhead isn't necessarily what makes compilation slow. GCC can compile dozens of megabytes of C source code in a very short time. Drivers may not implement their lexers etc. quite well, but that's not the failing of the specification. Secondly, Direct3D is also moving away from its intermediate bytecode compile target and is favoring delivery of HLSL source code more.
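To put rough numbers on that yourself, you can time the compile step directly. A minimal sketch, assuming a current GL context and entry points loaded via GLEW (my assumption, any loader works); note that some drivers defer the real work to link time, so timing glLinkProgram as well gives a fuller picture:

  #include <stdio.h>
  #include <time.h>
  #include <GL/glew.h>  /* assumed loader */

  /* Times a single fragment-shader compile. Requires a current GL context. */
  double time_shader_compile(const char *glsl_source)
  {
      GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
      glShaderSource(shader, 1, &glsl_source, NULL);

      clock_t start = clock();
      glCompileShader(shader);   /* lexing, parsing and codegen happen here */
      clock_t end = clock();

      GLint ok = GL_FALSE;
      glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
      if (!ok)
          fprintf(stderr, "shader failed to compile\n");

      glDeleteShader(shader);
      return (double)(end - start) / CLOCKS_PER_SEC;
  }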

#Threading: As the author mentions himself, DX11 didn't manage to solve this issue. In fact, the issue isn't with OpenGL at all; it's in the nature of GPUs and how drivers talk to them. Again, the author seems to be railing at the wrong target.

#Sampler state: Again, factually wrong information. This extension http://www.opengl.org/registry/specs/ARB/sampler_objects.txt allows you to decouple texture state from sampler state. This has been elevated to core functionality in GL4. The unit issue has not been resolved however, but nvidia did propose a DSA extension, which so far wasn't taken up by any other vendor. Suffice to say, most hardware does not support DSA, and underneath, it's all texture units, even in Direct3D, so railing against texture units is a complete red herring.
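For the curious, the decoupling looks roughly like this with sampler objects; just an illustrative sketch, assuming a context where the extension (or GL 3.3+/GL4 core) is available and `tex` is an existing texture object:

  /* One texture sampled two different ways via sampler objects. */
  GLuint samplers[2];
  glGenSamplers(2, samplers);

  /* Unit 0: nearest filtering, clamped */
  glSamplerParameteri(samplers[0], GL_TEXTURE_MIN_FILTER, GL_NEAREST);
  glSamplerParameteri(samplers[0], GL_TEXTURE_MAG_FILTER, GL_NEAREST);
  glSamplerParameteri(samplers[0], GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);

  /* Unit 1: trilinear filtering, repeating */
  glSamplerParameteri(samplers[1], GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
  glSamplerParameteri(samplers[1], GL_TEXTURE_MAG_FILTER, GL_LINEAR);
  glSamplerParameteri(samplers[1], GL_TEXTURE_WRAP_S, GL_REPEAT);

  /* Same texture bound to two units; the bound sampler object overrides
     whatever sampling state is stored in the texture itself. */
  glActiveTexture(GL_TEXTURE0);
  glBindTexture(GL_TEXTURE_2D, tex);
  glBindSampler(0, samplers[0]);

  glActiveTexture(GL_TEXTURE1);
  glBindTexture(GL_TEXTURE_2D, tex);
  glBindSampler(1, samplers[1]);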

#Many ways to do the same thing: Again, many factual errors. Most of the "many ways" the author is railing against are legacy functions that are not available in the core profile. It's considered extremely bad taste to run a compatibility profile (for backwards compatibility with earlier versions) and mix and match various strata of the API together. That'd be a bit like using Direct3D 8 and 11 functionality in the same program. The author seems to have failed to set up his GL context cleanly, or doesn't even know what "core profile" means.
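Setting up such a clean context is only a handful of lines. A sketch using GLFW (my choice here, not something the article uses; WGL/GLX/EGL expose equivalent attributes) that requests a 3.3 core profile, where none of the legacy entry points exist:

  #include <GLFW/glfw3.h>

  /* Requests a clean 3.3 core-profile context; returns NULL on failure. */
  GLFWwindow *create_core_window(void)
  {
      if (!glfwInit())
          return NULL;
      glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
      glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
      glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
      glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE); /* needed on OS X */
      return glfwCreateWindow(1280, 720, "core profile", NULL, NULL);
  }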

#Remainder: Lots of handwaving about various vaguely defined things, and objections to conditional jumps in the driver; again, the author seems to be railing at the wrong target.

Conclusion: Around 90% of the article is garbage. But sure, OpenGL isn't perfect, and it's got its warts, like everything, and it should be improved. But how about you get the facts right next time?


Every time I read the words "The extension," I cringe a little. In my experience, "There's an OpenGL extension for it" translates almost directly to "This will not work on a critical percentage of your customers' machines, and will work in a way that violates the extension's documentation on another critical percentage."

If key parts of the API that you need live in "the extensions," then the API itself is not a properly-tuned abstraction to serve your needs well.


> This has been elevated to core functionality in GL4.


Which many systems still do not support. You don't even need to get mad at drivers, there is plenty of graphics hardware (the latest being the Intel Sandy Bridge GPUs from three years ago) that only supported up to 3.3. And even if you say "well, if you don't have a tessellation capable GPU, get a new one" there is still the problem of the Linux drivers being at 3.3 right now, and older versions of OSX only supporting 2.1.

Hell, the pre-Sandy Bridge Intel hardware only supports 2.1. And it is only in the last 5 years that both mainstream GPU manufacturers have supported 4 at all; I have a GTX 285 that can't do tessellation either.
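Which is why, in practice, the render path ends up keyed on whatever version the machine actually gives you. A rough sketch (assumes a current GL 3.0+ context so the integer version queries exist, and a loader such as GLEW):

  #include <stdio.h>
  #include <GL/glew.h>  /* assumed loader */

  /* Pick a render path from the version the driver actually reports. */
  void report_render_path(void)
  {
      GLint major = 0, minor = 0;
      glGetIntegerv(GL_MAJOR_VERSION, &major);
      glGetIntegerv(GL_MINOR_VERSION, &minor);

      /* Tessellation is GL 4.0+ (or GL_ARB_tessellation_shader). */
      int tessellation = (major >= 4);
      printf("GL %d.%d reported, tessellation path: %s\n",
             major, minor, tessellation ? "on" : "off");
  }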


Sure, implementations are still playing catch-up, but you can't mark that as a failing of the standard.

The standard addressed the concern, and now it's up to implementors to do it.


But as the original article notes (in a bit of a roundabout way), OpenGL-the-standard vs. OpenGL-the-ecosystem is a bit of an empty debate. Unless you're one of the privileged few who only writes OpenGL code that targets one hardware platform... It doesn't much matter what the standard says if my customers don't install hardware that can take advantage of the new standardized features, if they never upgrade their graphics drivers for their older hardware, if the driver vendors never bother to update the old drivers to meet newer versions of the standard that their hardware could support but is no longer in their financial interests to support, etc., etc., etc.

"OpenGL is broken" refers to the market adoption of the standard, because when you're developing graphics software for consumers that's the aspect you care about.


> "OpenGL is broken" refers to the market adoption of the standard, because when you're developing graphics software for consumers that's the aspect you care about.

(This isn't the only point the OP is arguing about, but anyway.) What exactly would be the alternative? Either there's a standard, and adherents must follow its core features to get the compliance stamp, or there are no standards and each vendor goes its merry way, leaving it up to third parties to keep up with all the completely different APIs that result. As someone else said, there are core levels in the standard, which give guarantees to third parties. It hasn't always been like that, but now we do have them.

As for support of the latest features on my old Geforce 7600, I guess I should accept the fact that they cannot be implemented efficiently, and if I want to play the latest installment of Wolfenstein, I'll have to grab a new card. Or I could try getting a more modest game. There is clearly a commercial aspect to this whole upgrade mechanism too, but since upgrades are necessary for technical reasons, it's difficult to argue against the mercantile part.


> Either there's a standard... or there are no standards and each vendor goes its merry way

In my experience, OpenGL walks the middle line. There is certainly a core set of functions that (almost always, discounting buggy drivers) work. But the core set doesn't span all the critical functions you need for what modern game players would consider a performant game engine (such as hardware synchronization to eliminate "tearing"). So game engines will need to factor in the GL extensions, which put us in the "up to third parties to follow up" world. It's a frustrating environment to work in; you can't really ever trust that your code will either succeed or fail on any given hardware configuration, and you're stuck playing whack-a-mole on bug reports that you lack the hardware to reproduce.

> As for support of the latest features on my old Geforce 7600, I guess I should accept the fact that they cannot be implemented efficiently

I wish it were that simple. That, I could deal with.

I've worked with a card that had a bug in the GLSL compiler. A particular sum simply wasn't compiled, and we had to work around the problem by multiplying by 1.000000000000000001 to force the compiler to generate the bytecode for the whole calculation (the fact this trick works is something one of our veteran engineers "just knew would work," so we got lucky). There is functionally no chance of that software bug ever getting patched; card vendors don't care about older versions of their technology, and even if a driver version were out there that patched the bug, you can't trust machine owners to keep their drivers up-to-date.
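For illustration only (the real shader isn't in this thread, so the expression below is made up), the class of workaround looks like this: the multiply by an almost-exactly-one constant is numerically a no-op, but it stops the buggy optimizer from discarding the sum.

  /* Hypothetical GLSL embedded as a C string; only the trick itself is
     taken from the story above, the rest is invented. */
  static const char *frag_src =
      "#version 120\n"
      "uniform float a, b, c;\n"
      "void main() {\n"
      "    // buggy compiler dropped the sum without the no-op multiply\n"
      "    float s = (a + b + c) * 1.000000000000000001;\n"
      "    gl_FragColor = vec4(s);\n"
      "}\n";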

More frustratingly, as I mentioned elsewhere, I've worked with cards that implement things that should be high-performance (like stages of the shader pipeline) in software, just to claim they have the capability. Since OpenGL gives you no way in the API to query whether a feature is implemented in a reasonably-performant way, you either do some clever tricks to suss this stuff out (F.E.A.R. has a test mode where it runs a camera through a scene in the game and quietly tunes graphics pipeline features based upon actual framerate) or gather bug reports, blacklist certain card configurations, and keep going.
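In sketch form, that kind of tuning pass boils down to "render a canned scene with the feature on, time it, and back off if it tanks." Everything named below (the callbacks, the 33 ms threshold, the frame count) is a placeholder of mine, not how F.E.A.R. actually does it, and a real engine would time across buffer swaps or glFinish so the GPU work is included:

  #include <time.h>

  /* Disable a feature if the benchmark scene can't hold ~30 fps with it on. */
  void autotune_feature(int feature,
                        void (*render_benchmark_frame)(void),
                        void (*disable_feature)(int))
  {
      const int frames = 120;
      clock_t start = clock();
      for (int i = 0; i < frames; ++i)
          render_benchmark_frame();   /* feature under test is enabled */
      double ms_per_frame =
          1000.0 * (double)(clock() - start) / CLOCKS_PER_SEC / frames;

      /* A software fallback in the driver shows up as a huge number here. */
      if (ms_per_frame > 33.0)
          disable_feature(feature);
  }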

Old cards not being as powerful as good cards I can deal with; if we could simply say "Your card must be X or better to play," we'd be fine. New cards with bugs and under-performant cards that lie about their performance to clear some market hurdles are the maddening corners of the ecosystem.


> There is functionally no chance of that software bug ever getting patched; card vendors don't care about older versions of their technology,

Well, is that an OpenGL issue? Wouldn't you get that very same problem with D3D?

> reasonably-performant way

I don't understand. How can you objectively define such a thing? Doesn't it depend on the workload? If you're pushing 3 tris per frame, any feature can be labeled as reasonably performing, but if you have 300M, can any card these days maintain a reasonable framerate even on the most basic settings? I am exaggerating on purpose; some apps will require a very small amount of work at each stage of rendering and could reasonably afford an extra pass, even one implemented in software. And in other cases (which might be the majority), that doesn't cut it. I don't see how there could be an objective way of deciding whether a feature is sufficiently performant. Your example is telling: an application as complex as F.E.A.R. should clearly run its own benchmarks (or keep a database) to decide which features can be included without hurting performance. And even then, players have different perceptions of what constitutes playability.

I agree with you: multiple standards, multiple vendors, multiple products; the fallout is "struggling compatibility" at worst and "varying performance" at best. But that's a point D3D and OpenGL have in common, not a divergence. Am I missing something?


> There is functionally no chance of that software bug ever getting patched; card vendors don't care about older versions of their technology,

Isn't it interesting that more developers aren't pushing for open source drivers? Try finding a GLSL compiler bug in Mesa and asking in #dri-devel on freenode or filing it at bugs.freedesktop.org. There you'll most likely get a very quick and helpful reply.


> The unit issue has not been resolved however, but nvidia did propose a DSA extension, which so far wasn't taken up by any other vendor.

... which is graphic-developer-ese for "I hope you like branching your render engine some more."


> Suffice to say, most hardware does not support DSA, and underneath, it's all texture units, even in Direct3D, so railing against texture units is a complete red herring.


But Java does fine even though it needs extensions for everything. How do you justify that?


Java is actually cross-platform. OpenGL is just an API.

OpenGL extensions are typically optional parts of the spec, available on a per-implementation basis, and are not pluggable: you're at the mercy of the vendors, assuming your target audience even has up-to-date drivers at all.

Compare the following:

  Java class works on JVM on Windows, Linux, OS X...
vs

  OpenGL implementation for Intel HDs on Windows
  OpenGL for Radeons on Windows
  OpenGL for Geforces on Windows
  OpenGL for Intel HD on OS X
  OpenGL for Radeon on OS X
  OpenGL for Geforce on OS X
Now add Linux and various API-levels of Android to the mix and cry if any single one of these permutations is missing the extension you want.
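And every one of those permutations forces the same runtime check on you. A minimal sketch for core-profile contexts (GL 3.0+, where the indexed extension query exists); assumes a current context and a loader such as GLEW:

  #include <string.h>
  #include <GL/glew.h>  /* assumed loader */

  /* Returns 1 if the driver reports the named extension, 0 otherwise. */
  int has_gl_extension(const char *name)
  {
      GLint count = 0;
      glGetIntegerv(GL_NUM_EXTENSIONS, &count);
      for (GLint i = 0; i < count; ++i) {
          const char *ext = (const char *)glGetStringi(GL_EXTENSIONS, (GLuint)i);
          if (ext && strcmp(ext, name) == 0)
              return 1;
      }
      return 0;
  }

  /* usage: if (!has_gl_extension("GL_ARB_sampler_objects")) use_fallback_path(); */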


I see what you mean. My bad.


I don't develop in Java (outside of Android); I'm afraid I'd need you to clarify what you mean to answer the question. Are you referring to JNI?


"Like it or lump it" is not a great advertising strategy :)

I suspect the point about this subset-of-GL4 thing is that what you can rely on in practice is OpenGL3, plus enough extensions to bring it up to something closer to OpenGL4. Take a look at the Unity3D hardware stats (http://stats.unity3d.com/index.html) or the Steam hardware stats (http://store.steampowered.com/hwsurvey/) - ~50% of the Unity market and ~25% of the Steam market is pre-DX11, which I believe limits it to OpenGL3 at most.

I might agree that the author of this piece is a bit careless about distinguishing between OpenGL as it is implemented and OpenGL as it is specified. Aside from the bind points nonsense, and the massive pile of random 16-bit values that is GLenum, the OpenGL spec doesn't actually require everything to be this total clusterfuck. I'm not aware of any specified reason why the GLSL compiler couldn't just as easily be a shared component (as it is in Direct3D), for example, and I don't recall the bit where the spec insists on shitty drivers. Still, we have to work with what's there, not with what should be there, and when what's there is a bit of a mess, it's absolutely fair to call it what it is.


Also, regarding the "many ways" section, which bits are removed in the core profile? It looks like everything he complains about is still there in the 4.3 core profile...


Your point about Direct3D being only on Windows is true but I just wanted to point out that the PS4 does not in fact use OpenGL at all.


"Drivers may not implement their lexers etc. quite well, but that's not the failing of the specification."

Well, it could be, if the specification of the language requires a lot of work in the lexer, but that's probably not the case.


GLSL isn't anywhere near as hard to parse as C. Besides, the bytecode discussion is a complete red herring. It's been done for ages on the web too; if it's slow to compile, it's either sheer incompetence on the part of the driver writer, or the reason isn't the parsing.


It's a bit of an understatement to say that hardware vendors aren't known for being very good at software. NVidia and AMD aren't so bad, but look at how awful the mobile GPU vendors are: https://dolphin-emu.org/blog/2013/09/26/dolphin-emulator-and...

That's what made it a bad decision to compile the entire shader in the driver. They are incompetent, and the ARB picked a design that exacerbated their incompetence instead of working around it. Users don't care why their OpenGL implementation has bugs, they just care that it does.


The 'lore', if you will, around this decision is that 3Dlabs pushed for GLSL being handled in the driver instead of a bytecode model because they had the best driver developers and so would have a clear advantage. Who knows if that was actually their reasoning; perhaps they just wanted the flexibility?


Yeah, that's in line with my understanding/expectations. My point was just that this is an "in practice" thing, not "in principle".


> how about you complain about Microsoft, Dell, HP, Nvidia, AMD etc.?

These companies are businesses that need a business reason to support your platform. Until more people are playing triple-A games on platforms that use OpenGL, you can't really fault them for not spending money when it doesn't make sense. Apple designs its own chips for its mobile device so I'd think the OpenGL on iOS would have better driver support.


> Apple designs its own chips for its mobile device so I'd think the OpenGL on iOS would have better driver support.

Apple licenses Imagination Technologies' PowerVR GPUs; they do not design them themselves. The OpenGL driver also comes from Imagination.


Apple absolutely has graphics driver teams for iOS and OS X. They control a significant portion of the driver stack, not to mention the rendering APIs.


Source for that?


I interviewed with Apple, and many of the people I interviewed with were explicitly on their mobile driver team. We discussed hardware and driver-stack details. They made it clear that yes, they are partially responsible for the driver stack.

I also have explicitly had my statement confirmed by game & driver developers for desktop OS X; I can't say with 100% certainty that Apple owns the iOS stack, but I am relatively certain that it is true given that it is 100% known to be true on OS X.

Common-sense-wise, I find it highly unlikely that iOS would be bolted directly to the PowerVR graphics stack. It would make it too hard for them to move to another graphics vendor. So I think it is almost a certainty that they own some, if not all, of the graphics stack, even if it is derived from proprietary PowerVR code & hardware.


I've spoken to the driver teams for both Imagination and iOS. For anything regarding iOS, the Imagination team will always say "We can't answer that, speak to the iOS team." For anything that's not in the public docs, the iOS team will respond "We can't answer that." ;)


At least for large customers of PowerVR, the GPU design includes the source code for the drivers. This is why you can't rely on even the same GPU model, or even the exact same SoC, working similarly in the Android world: the devices come from different companies and so have different driver trees.


Rage and Wolfenstein: The New Order are both triple-A titles, and id Tech 4 & 5 are based on OpenGL.


But laptop manufacturers normally don't support driver updates over many years, which is necessary to stay current with OpenGL versions. Often the (semi-official) NVidia Verde drivers work when the laptop has an NVidia GPU; for AMD GPUs there's no such support.


OpenGL is the only thing you get on iOS. There is no Direct3D on iOS. Likewise, it's the only thing you get on PS4, Steambox, OSX etc.

But that's not my issue, I acknowledge freely that OpenGL drivers are bad. I just don't quite see how that's a failing of OpenGL, rather than the vendors who actually implement the drivers.


Well, no, Sony has its own low-level API that you can use on the PS4[0], and because all PS4s use the same GPU you don't have to worry about a lot of what OpenGL has to offer you in terms of abstracting away the underlying hardware, if all you care about is the PS4.

http://arstechnica.com/gaming/2013/03/sony-dives-deep-into-t...


PS4 doesn't use OpenGL. No game console I'm aware of has ever used OpenGL. (The PS3 is the closest example, since it used to let you run Linux, so you could run Mesa - but the GPU wasn't accessible to you.) I don't know why people keep claiming that a given console runs OpenGL.


People probably bring it up because you can often use OpenGL on a console, even if it's through a wrapper library instead of supported directly, or even if it's only "OpenGL-like" instead of fully compliant -- which still may be better than porting to the native format depending on the game. (Though the original XBox's D3D was pretty OpenGL friendly: http://www.emuxtras.net/forum/viewtopic.php?f=193&t=4009) So for the PS2, PS3, Wii, and handhelds like the PSP and DS, there are OpenGL or OpenGL-like wrappers available.


PS4 doesn't use 'OpenGL', just a low level api and a higher level api that has features suspiciously close to OpenGL 4.3...

Also uses Clang and a bunch of Unixy open source stuff...


Sure, but in practice this is not 'OpenGL' enough to count when talking about OpenGL making ports trivial. (I say this as someone who recently shipped a game with an OpenGL renderer that has a PS4 port in the works - there are a surprising number of differences!)

The core OpenGL feature set and API factoring are almost certainly things you can expect to be similar on console platforms, at least where the hardware matches. So in that sense 'It's OpenGL' is almost true!


Ouya, maybe?


The bigger problem is a failing of the ecosystem at large; there's no agency with enough teeth policing the vendors and no gold-seal certification that matters, leaving space for vendors to do whatever gets the card out the door before the competition. OpenGL's breadth as an API definitely doesn't help ("now that you've implemented direct-buffer rendering, let's go implement all that glVertex crap that lets you do the exact same thing, only slower! Your library coders have infinite time, right?"). But I doubt it's the root cause of the frustration; the "hit the benchmarks and beat ATI out the door this Christmas" ecosystem is my biggest gripe.

I've had to deal with cards that explicitly lie to the software about their capabilities by claiming they support a shader feature that's actually implemented in software without acceleration (!!!). There's no way to tell via the software that the driver is emulating the feature besides enabling it and noticing your engine now performs in the seconds-per-frame range. So we blacklist the card from that feature set and move on, because that's what you do when you're a game engine developer.
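The blacklist half of that workflow is mundane string matching against what the driver reports about itself. A sketch with invented placeholder entries (not a real blacklist), assuming a current GL context and a loader such as GLEW:

  #include <stddef.h>
  #include <string.h>
  #include <GL/glew.h>  /* assumed loader */

  /* Returns 1 if the reported renderer matches a known-bad entry. */
  int feature_blacklisted(void)
  {
      const char *renderer = (const char *)glGetString(GL_RENDERER);
      /* Placeholder entries only; a real list is built from bug reports. */
      static const char *bad_renderers[] = { "Example GPU 123", "OtherChip X" };

      if (!renderer)
          return 1;  /* be conservative if the driver won't even tell us */

      for (size_t i = 0; i < sizeof bad_renderers / sizeof bad_renderers[0]; ++i)
          if (strstr(renderer, bad_renderers[i]))
              return 1;
      return 0;
  }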


>Until more people are playing triple-A games on platforms that use OpenGL, you can't really fault them for not spending money when it doesn't make sense.

All platforms support OpenGL; not sure what you're getting at.


Sorry to break it to you, but until the recent introduction of Android devices branded as 'microconsoles', nothing resembling a game console ever ran OpenGL. And millions of people play their games on consoles (some instead of on PCs).


And how many of those run DirectX? Are we really trying to debate whether or not OpenGL is more portable than DirectX across platforms? Really?


I'm calling out the incredibly common lies that a) game consoles use OpenGL b) all major game platforms use OpenGL.

Like it or not, neither of those are true. It might be nice if they were.

OpenGL is obviously still a widely-supported choice on desktop PCs (and mobiles), so that's not in question.


You're aware that the PS4 is basically a PC running FreeBSD with an AMD GPU, right?


And its rendering APIs are nothing close to OpenGL.


Strange, I developed on Xbox/Xbox 360/Xbox One for many years. Never realized they supported OpenGL...


I didn't know this.


You cannot run Direct3D anywhere except on Windows. Unless you plan not to publish on Android, iOS, OS X, Linux, Steambox, PS4, etc., you will have to target OpenGL, no matter how much you dislike it.

Yeah? So? If the graphics API which provides reasonable guarantees of feature availability and performance exists only on one platform, you code for that platform if you need those guarantees. There is a reason why until recently virtually all PC games, and a great many other graphics applications, were Windows only.

Yes, driver quality for OpenGL is bad. It is getting better though, and rather than complaining about OpenGL, how about you complain about Microsoft, Dell, HP, Nvidia, AMD etc.?

The Khronos Group's failure to provide rigorous conformance tests and driver quality standards IS A FAILURE OF OPENGL. That's why Microsoft provided extensive support for OEMs and driver developers from the get-go with Direct3D: they wanted their API to be used.

If you care about graphical performance, then OpenGL is simply not up to the task. You should be using Direct3D, period.


> If you care about graphical performance, then OpenGL is simply not up to the task.

I'm always confused when people say stuff like that.

1. Random forum user says "OpenGL is not up to the task."

2. AAA game developers say "That the Linux version runs faster than the Windows version (270.6) seems a little counter-intuitive, given the greater amount of time we have spent on the Windows version. However, it does speak to the underlying efficiency of the kernel and OpenGL." http://blogs.valvesoftware.com/linux/faster-zombies/

Who do I believe?


I believe the AAA game developers that say things like "I’ve spend years porting games from DX to GL on the Mac including multiple Call of Duty franchisees, Civilization franchisees, and more. You are always taking it on the chin with GL where DX works." https://medium.com/@michael_marks/opengl-for-real-world-game...

Valve was testing a game that was already years old when the tests were run, based on an engine that was written for DX9, and presumably covering only the happy case in terms of hardware and driver support (meaning an NVIDIA card and proprietary driver). If you're developing a AAA game for release next Christmas, you're going to want it to look better than the state of the art from five or even three years ago. If you attempt that with OpenGL, you are going to run into holes in driver support for various GPU features, not to mention discrepancies in which rev of OpenGL is actually supported by the platform, which you will then have to work around with various vendor-specific extensions, which means more code paths and more things to debug. And then once all those holes have been painstakingly plugged, you can get to work on the performance hiccups...

Or you could use Direct3D, where everything Just Works on the same code path.

You can bet your ass that when HL3 is released as foreordained by the prophecies, it will be a Direct3D-only title at first. And remain so for at least a year.


During Steam Dev Days when Valve talked about Source 2 (which, if HL3 ever ships, will almost certainly be the engine used) they were almost always talking about their OpenGL backend or how they manage Direct3D and OpenGL without having to write everything twice. It'll ship with both on Windows and default to D3D, just like Source does now (and has for years iirc).


You've got to be kidding me...


The folly of premined cryptocurrencies is a well explored and established fact in the realm of altcoins. Somehow Ripple thought they were immune.


>The folly of premined cryptocurrencies is a well explored and established fact in the realm of altcoins.

I don't know if the folly of premining was established before, but it is now. Ripple will be an example to anyone who says this can't happen because nobody would depress the value of their own stake like McCaleb has done.

In the long run, it's good for Ripple that these large amounts of money will no longer be in one set of hands, if it survives, but it's not good for anyone who hoped to make a quick profit from owning XRP.

Bitcoin also has an issue with a large quantity of coins held by the developer.


Yes, but you've got to find your niche.


Clearly there's some unsolved problem there. Somebody should come up with a proof-of-data algorithm and associated blockchain...


Filed in the Northern District of Texas, of course, where every other IP troll goes to file suits. That's really all you need to know about the merits.


SOX is an immense drag on any US business, and on any business doing business with a US business.

I've seen this firsthand at two companies (Systor and Accenture). Adherence to SOX creates insane bureaucracies and makes an organization utterly inflexible. It introduces a plethora of glass ceilings, and most employees spend their time ticking off checkboxes on compliance forms rather than performing actual work. On top of that, it also erodes moral/ethical behavior, because at all levels of management it creates the impression that as long as you're in compliance, everything else goes.

SOX is the reason why companies like Valve will likely never go public. It's also the reason why so many small companies that go public fall on hard times. A previously successful, flexible culture suddenly finds itself siloed into the SOX (aka "sucks") work structure, where all flexibility and cross-departmental/hierarchical communication ceases, and where SOX creates the perfect breeding ground for disassociated upper management to replace the founders and the passionate people at the top.

