As mentioned in an older thread on this same article, many hardware APIs suffer from having to provide a C-compatible interface and memory model. I remember reading that a particular GPU's elegant memory model was butchered so that C programmers could work with it, but my memory of the details is hazy.