It's poor form to put the word "secure" (or any derivative of it) into the name of a product that aims to provide more security.
Reason: there's no solution to the security problem. Fil-C doesn't solve security. Does it make things more secure? Yes! But there will always be more security issues. So imagine if I had called it "Secure C" (and then had a sexy compiler), and 10 years from now someone finds a comprehensive solution to the string injection problem in Secure C. What do they call their thing? Securer C? Secure C Pro? Secure Secure C?
I agree. I'd add that security is always relative not just to an existing level of security but to a threat model. There are threat models relative to which Fil-C doesn't make things more secure. (I can't think of one under which Fil-C makes things less secure, though.)
A similar criticism applies to "new". Newcastle is named after a castle built 945 years ago. Neuchâtel is named after a castle built 1014 years ago. Xavier (from Basque "etxeberri", "new house") is named after a castle built in the 10th century. Windows NT ("New Technology"), etc.
Correct. Progressive MP4 also does the trick. Many open-source tools work in this area, such as ffmpeg (generalist) and gpac (specialized in this, and leveraging ffmpeg).
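As a minimal sketch of the progressive-MP4 route with ffmpeg (filenames are placeholders): `+faststart` moves the moov atom to the front of the file so playback can begin before the download finishes.

```shell
# Remux into a progressive ("fast start") MP4 without re-encoding.
# -c copy      : copy streams as-is, no transcode
# +faststart   : relocate the moov atom to the start of the file
ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4
```

gpac's MP4Box can do the equivalent with its interleaving/flat options.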
My Ghidra extension doesn't perform static recompilation; the bytes that are exported are the ones from the original program (except those targeted by a relocation). It's similar to a raw binary export, except you get a relocatable object file instead.
That being said, maybe it's possible to perform delinking and then run a static recompiler on the resulting object file. I don't know how that would compare to running a decompiler on the object file.
> I've heard from colleagues that this won't be possible with DASH due to the switch to fMP4 format.
That's incorrect. With DASH the latency depends on the fragment duration, not the segment duration. You can start sending a segment as soon as its first fragment is generated, using chunked HTTP transfer as mentioned in other comments.
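A minimal sketch of producing fMP4 with short fragments using ffmpeg (filenames and the 500 ms fragment duration are placeholder choices, not a recommendation): each fragment is independently deliverable, so a partially written segment can already be streamed with chunked transfer encoding.

```shell
# Remux into a fragmented MP4 (fMP4) suitable for low-latency delivery.
# +frag_keyframe+empty_moov+default_base_moof : standard fMP4 layout
# -frag_duration is in microseconds (500000 = 500 ms fragments)
ffmpeg -i input.mp4 -c copy \
  -movflags +frag_keyframe+empty_moov+default_base_moof \
  -frag_duration 500000 output.mp4
```

In a real DASH setup you'd point a packager at this, but the latency floor is set by that fragment duration, not by the segment duration.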
Sure, but in a "modern" codebase where everything is templated, more of the changes land in headers, so any given change touches far more translation units.
It's always weird to read codec commentary from broadcast people; their world and concerns are just so alien to me. I also rather wonder what he considers a raw bytestream that HEVC has and VP9 doesn't...
Anyway, I guess the only major concern from him is the DCT overflow? As for the extra precision, even HEVC has moved beyond 16-bit intermediates in the newer profiles. I think Daala's transforms would solve his issues with VP9's; I'd imagine they're being considered.