Reflections on Trusting Trust (1984) [pdf] (cmu.edu)
93 points by goranmoomin on June 19, 2022 | hide | past | favorite | 21 comments


A classic paper by Ken Thompson. Today I think the most important paragraph is this: "I picked on the C compiler. I could have picked on any program-handling program such as an assembler, a loader, or even hardware microcode. As the level of program gets lower, these bugs will be harder and harder to detect. A well-installed microcode bug will be almost impossible to detect."

Keep the above in mind when thinking about the proprietary microcode and ME hardware/firmware components that are built into nearly all modern processors. A textbook "supply chain" attack vector.
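For anyone who hasn't read the paper: the trick is a compiler that inserts a backdoor when compiling login, and reinserts its own trojan when compiling a clean copy of the compiler, so the hack survives even when every line of source is honest. A toy sketch in Python (invented illustration, not Thompson's actual C code; "compilation" here is just a source-to-source transformation):

```python
# Toy model of Thompson's self-reproducing compiler hack. The honest
# compiler is the identity transform; the evil one carries two hacks.

BACKDOOR = "# backdoor: accept the magic password\n"

EVIL_COMPILER_SRC = '''
def compile(source):
    # Hack 1: trojan any login program we are asked to compile.
    if "def login(" in source:
        return BACKDOOR + source
    # Hack 2: when asked to compile a clean compiler, emit this evil
    # compiler instead, so the trojan survives recompilation from
    # pristine source.
    if "def compile(" in source:
        return EVIL_COMPILER_SRC
    return source
'''

def build(compiler_src):
    """'Run' a compiler given as source text."""
    ns = {"BACKDOOR": BACKDOOR, "EVIL_COMPILER_SRC": EVIL_COMPILER_SRC}
    exec(compiler_src, ns)
    return ns["compile"]

clean_compiler_src = "def compile(source):\n    return source\n"
login_src = "def login(user, pw):\n    return pw == SECRET\n"

evil_v1 = build(EVIL_COMPILER_SRC)

# Recompile the clean, trojan-free compiler source with the evil
# compiler: the output is the evil compiler again.
evil_v2 = build(evil_v1(clean_compiler_src))

# The rebuilt compiler, whose source contained no trace of the hack,
# still trojans login.
print(BACKDOOR in evil_v2(login_src))  # True
```

Inspecting clean_compiler_src tells you nothing; the compromise lives only in the running "binary", which is exactly Thompson's point.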


I bet a FOIA request for information on any microcode backdoors would result in an impressive number of redactions.


It's more likely that such a request would not get a meaningful response because it would never be routed to anyone with knowledge of such activities. (SAP/SAR/UNACKNOWLEDGED/WAIVED)


I don't know if much would be reported. If you mean a FOIA request to the US Federal government, there are many exemptions, including:

Exemption 1: Information that is classified to protect national security.

...

Exemption 3: Information that is prohibited from disclosure by another federal law.

Exemption 4: Trade secrets or commercial or financial information that is confidential or privileged.

Source: https://www.foia.gov/faq.html

Feel free to try.


FOIA request with whom?

Intel had a ton of firmware leaked a while back, and no one found anything particularly juicy.


Many third party researchers have already found backdoors in other parts of Intel's CPUs, such as the Management Engine.

I very much suspect a well funded and motivated actor like the NSA could find mistakes in the microcode without needing to ask for a backdoor directly.


Related. Others?

Reflections on Trusting Trust (1984) [pdf] - https://news.ycombinator.com/item?id=23807168 - July 2020 (19 comments)

Reflections on Trusting Trust (1984) [pdf] - https://news.ycombinator.com/item?id=17459891 - July 2018 (1 comment)

Reflections on Trusting Trust (1984) [pdf] - https://news.ycombinator.com/item?id=13569275 - Feb 2017 (15 comments)

“Reflections on Trusting Trust” annotated - https://news.ycombinator.com/item?id=10698537 - Dec 2015 (15 comments)

Ken Thompson: Reflections on Trusting Trust (1984) - https://news.ycombinator.com/item?id=9183106 - March 2015 (9 comments)

Reflections on Trusting Trust (1984) - https://news.ycombinator.com/item?id=8662876 - Nov 2014 (3 comments)

Ken Thompson: Reflections on Trusting Trust (1984) [pdf] - https://news.ycombinator.com/item?id=7992172 - July 2014 (1 comment)

Ken Thompson 1984: Reflections on Trusting Trust [pdf] - https://news.ycombinator.com/item?id=3987710 - May 2012 (1 comment)

Reflections on Trusting Trust - https://news.ycombinator.com/item?id=2909357 - Aug 2011 (1 comment)

Reflections on Trusting Trust (Ken Thompson's Turing Award speech) - https://news.ycombinator.com/item?id=2642486 - June 2011 (34 comments)

Ken Thompson - Reflections on Trusting Trust - https://news.ycombinator.com/item?id=300350 - Sept 2008 (3 comments)


We can also get the requisite link to "David A. Wheeler’s Page on Fully Countering Trusting Trust through Diverse Double-Compiling (DDC)" out of the way: https://dwheeler.com/trusting-trust/
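For readers who haven't seen it: DDC rebuilds the suspect compiler's source with a second, independently trusted compiler and compares the results; under a deterministic build, any mismatch exposes a trojan of the kind Thompson describes. A toy sketch (all names invented, "binaries" modeled as source strings):

```python
# Toy sketch of diverse double-compiling (DDC).

def trusted_compile(source):
    """A second, independently written compiler we trust (honest)."""
    return source  # honest "compilation" = identity transform

def suspect_compiler(source):
    """The distributed compiler binary, secretly trojaned."""
    if "def login(" in source:
        return "# backdoor\n" + source
    if "def compile(" in source:
        return "<trojaned compiler binary>"
    return source

claimed_source = "def compile(source):\n    return source\n"

# DDC: build the claimed source with the trusted compiler (stage 1),
# then use stage 1 to build the claimed source again (stage 2).
ns = {}
exec(trusted_compile(claimed_source), ns)
stage2 = ns["compile"](claimed_source)

# Under an honest, deterministic build, the suspect compiler's rebuild
# of its own claimed source would be bit-identical to stage 2.
rebuild = suspect_compiler(claimed_source)
print("MISMATCH" if stage2 != rebuild else "matches source")  # MISMATCH
```

The attacker now has to subvert both compilers identically, which is the "diverse" part of the defense.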


As the linked-to author, thanks for the "requisite" link :-).


Thanks for the work! :)


> The modified version of Xcode, the researchers claimed, could enable spies to steal passwords and grab messages on infected devices. Researchers also claimed the modified Xcode could “force all iOS applications to send embedded data to a listening post.”

https://9to5mac.com/2015/03/10/cia-apple-encryption/

https://en.wikipedia.org/wiki/XcodeGhost


These days this should be required reading by age 12 or 14 ...


Required reading at the upper levels of a computer science undergrad curriculum maybe, but I don’t know any 12 or 14 year olds equipped to handle this paper. Your experience may vary.


Existence of this problem is often used as a defeatist argument to do nothing about software security.

1. I don't want to run any software I don't trust, and I don't want to trust anyone.

2. But it's impossible to verify anything, because even my CPU could be lying to me.

The result is that any improvement in security gets shot down, because nothing short of digging sand for silicon with your bare hands is safe from Ken Thompson.


More than half the people I knew in computer science who took the road of security turned out as concerted weasels, on both sides of that "bright line" of the law. Meanwhile, lots of worthy people doing worthy things are effectively in the same camp as dolts and illiterates, having nothing to do with security at all.


Dunno why you are getting voted down. This matches my experience, down to the "concerted weasels" description.


What does that actually mean?


seL4 is a system formally verified up to machine code.

https://sel4.systems/About/home.pml

It won't protect you against unprotected apps doing what they want, nor against CPU backdoors, but it reduces your attack surface by a lot.


Should we trust the formal verifier program that was used to verify seL4? (Remember: being "open source" does not help in such cases.)


I suppose a trojan designed for a specific compiler is hard enough to make.

Having it hijack a model checker as well, in order to conceal itself from resulting models, would be a giant leap more difficult.

So, you are still risking it if you have nation-state adversaries, but not if they are script kiddies.


Shameless plug, but here's a short video covering the paper: https://youtu.be/Ow9yMxJ8ez4



