The article (the readme on github) takes the following quote wildly out of context:
> A recent high-profile forensic investigation reported that “due to end-to-end encryption employed by WhatsApp, it is virtually impossible to decrypt the contents of the downloader [.enc file]”
This quote clearly means it is virtually impossible without the key. OF COURSE if you have full access to the device as a logged in user, then you can get access to the key and decrypt things that cannot be decrypted by others who do not have the key. Nothing to see here.
At least to the author’s credit the FAQ answers below clarify this, but only after the lead-in, which is all most people read, has already done the damage of dramatically planting the incorrect impression that someone has figured out how to break WhatsApp encryption.
That quote is from the data forensics report on Bezos' phone, so the missing context is that they stated that even with a full logical file data acquisition, they couldn't decrypt the .enc file. That is what inspired me to write this tool: to demonstrate that end-to-end encryption is only impossible to decrypt for intermediate hosts; it's very possible with full access to the filesystem of one of the endpoints.
I could likely make that part of the README clearer by adding more context from the original forensics report that was published.
If you read the referenced forensic investigation, you'll note the obvious incompetence of the forensic investigator involved. The whole thing is a clueless "I ran some forensics tools" job.
This tool is a response to that: the statement by the forensic investigator is nonsense, because if you have full access to the device (as they did), you can easily extract the decryption key for media files present on the device.
The quote isn't out of context, the quote is just nonsense and wrong. They had the key.
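For reference, here is roughly what that looks like in practice: a minimal sketch that decrypts a .enc media file given the 32-byte mediaKey pulled from the device, following the key-expansion scheme documented by several open-source implementations. The HKDF info string and the 10-byte MAC trailer are assumptions that can differ by media type and app version.

```python
# Minimal sketch: decrypt a WhatsApp .enc media file given the 32-byte mediaKey
# extracted from the message database on the device. Key expansion and layout
# follow what open-source implementations document; the info string and the
# 10-byte MAC trailer are assumptions that may vary by media type/version.
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes, hmac, padding

def decrypt_enc_file(enc_path: str, media_key: bytes, out_path: str,
                     info: bytes = b"WhatsApp Image Keys") -> None:
    # Expand the 32-byte mediaKey into iv | cipherKey | macKey | refKey
    expanded = HKDF(algorithm=hashes.SHA256(), length=112, salt=None,
                    info=info).derive(media_key)
    iv, cipher_key, mac_key = expanded[:16], expanded[16:48], expanded[48:80]

    blob = open(enc_path, "rb").read()
    ciphertext, mac_trailer = blob[:-10], blob[-10:]   # trailing truncated HMAC

    # Verify the truncated HMAC-SHA256 over iv + ciphertext before decrypting
    h = hmac.HMAC(mac_key, hashes.SHA256())
    h.update(iv + ciphertext)
    if h.finalize()[:10] != mac_trailer:
        raise ValueError("MAC mismatch: wrong key or wrong info string")

    # AES-256-CBC decrypt, then strip PKCS#7 padding
    decryptor = Cipher(algorithms.AES(cipher_key), modes.CBC(iv)).decryptor()
    padded = decryptor.update(ciphertext) + decryptor.finalize()
    unpadder = padding.PKCS7(128).unpadder()
    open(out_path, "wb").write(unpadder.update(padded) + unpadder.finalize())
```

The point being: none of this requires breaking any cryptography, only reading the key that is already sitting on the endpoint.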
I think the "end-to-end" statement is also misleading as you don't really know whether the communication is indeed encrypted end to end.
If you want true end-to-end encryption you can grab a tool that will generate a public/private key pair, send your public key to your contact, and ask them for theirs. Then use the tool to encrypt your outgoing messages with your contact's public key and decrypt received messages with your own private key.
Then it doesn't matter whether the underlying transport is encrypted or not. Things like WhatsApp give a false sense of security. I am pretty sure the messaging can be accessed, but nobody will ever admit it.
You can make it less obvious that you are sending encrypted messages by using a two-way binary-to-English text transcoder, so your messages appear to be an English conversation (although a censor could easily pick them out as gibberish).
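For illustration, a minimal sketch of the manual keypair approach described above, using RSA-OAEP from Python's cryptography library (the transcoding-to-English step is left out):

```python
# Minimal sketch of the DIY approach above: generate a keypair, share the public
# key with your contact out-of-band, and encrypt outgoing messages to the
# contact's public key so only their private key can open them.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Each party runs this once and sends the public key to the other.
my_private_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
my_public_key = my_private_key.public_key()

def encrypt_for(contact_public_key, message: str) -> bytes:
    # Only the contact's private key can reverse this, regardless of transport.
    return contact_public_key.encrypt(message.encode(), OAEP)

def decrypt_received(ciphertext: bytes) -> str:
    return my_private_key.decrypt(ciphertext, OAEP).decode()
```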
It's entirely possible to examine the compiled app and understand whether it sends the private key or unencrypted messages off the device; there's no need to treat the behavior of WhatsApp like a mystery.
The crux of the matter is verifying that the public key you're told belongs to the other person actually is their public key. If you do not verify this out-of-band, then the server can generate a new keypair for each recipient and it would go like this (a toy simulation follows the list):
- Alice encrypts a message using the FAKE public key of Bob (she was told it is Bob's key by the server) and sends it to the Whatseverapp server so it can be delivered to Bob when he signs on.
- The Whatseverapp server has the FAKE private key for Bob (it generated that one itself, after all), decrypts the message, and notifies the corporate overlords/government/Russians/AdServers/Illuminati of the contents.
- The Whatseverapp server encrypts the message using the REAL public key of Bob and sends it to Bob.
- Bob decrypts the message using his REAL private key and reads it.
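To make the failure mode concrete, here is a toy simulation of that key-substitution attack using RSA-OAEP from Python's cryptography library. This is illustrative only and not WhatsApp's actual protocol:

```python
# Toy simulation of the key-substitution attack described above: the server hands
# Alice a FAKE public key for Bob, reads the message, then re-encrypts it to
# Bob's REAL public key so neither endpoint notices anything wrong.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

bob_real = rsa.generate_private_key(public_exponent=65537, key_size=2048)
bob_fake = rsa.generate_private_key(public_exponent=65537, key_size=2048)  # made by the server

# Alice asks the server for Bob's key and is handed the FAKE one.
ct_to_server = bob_fake.public_key().encrypt(b"meet at 6", OAEP)

# The server decrypts with the FAKE private key it kept for itself...
plaintext = bob_fake.decrypt(ct_to_server, OAEP)
print("server reads:", plaintext)

# ...then re-encrypts to Bob's REAL public key; Bob decrypts with his REAL private key.
ct_to_bob = bob_real.public_key().encrypt(plaintext, OAEP)
print("bob reads:", bob_real.decrypt(ct_to_bob, OAEP))
```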
This out-of-band verification is nothing you can automate or do once-and-for-all, like analyzing the binary for bad stuff. Each user has to verify the keys of the other users, which very few people actually do.
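For what it's worth, out-of-band verification usually amounts to both users comparing a short fingerprint of each public key over a channel the server doesn't control. A minimal sketch of the idea (not WhatsApp's actual safety-number format):

```python
# Fingerprint of a public key for manual out-of-band comparison: both users read
# the digits out over a trusted channel (in person, on a call) and check they match.
import hashlib
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

def fingerprint(public_key) -> str:
    der = public_key.public_bytes(encoding=serialization.Encoding.DER,
                                  format=serialization.PublicFormat.SubjectPublicKeyInfo)
    digest = hashlib.sha256(der).hexdigest()
    return " ".join(digest[i:i + 4] for i in range(0, 40, 4))  # first 40 hex chars, grouped

# If this doesn't match what the other side computed locally, someone swapped the key.
print(fingerprint(rsa.generate_private_key(public_exponent=65537, key_size=2048).public_key()))
```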
Also, you make it sound so easy to analyze an app binary. You can relatively easily prove it does certain things; but proving a negative, namely that it does NOT do certain things, is far more difficult at best and outright infeasible or even impossible in other cases (compare with the halting problem). This is why security vulnerabilities happen, and a backdoor is more or less just a security vulnerability that didn't happen by accident or incompetence but deliberately - which may mean whoever put it there spent some time obfuscating it further to make it harder to find.
It is possible but extremely unlikely that anyone would ever bother to do this. Anyone that cared that much would just use something with source available and save themselves a whole lot of pointless effort that would have to be repeated every time a new binary was generated.
TL;DR: This program decrypts encrypted media files you yourself have received through WhatsApp, "in the same way that the WhatsApp app does to display it on the screen." It doesn't decrypt other people's media files, as the title could suggest.
> It doesn't decrypt other people's media files, as the title could suggest.
It can be used to decrypt others' files as long as there is physical access to the device and the user's passcode. This was published as a result of the Bezos phone examination report, where the author stated it was impossible to decrypt the media file - a claim that was largely refuted by the digital forensics community.
This does not completely contradict what you are saying, but you can only export a limited number of messages (10k or 40k, I can't remember the exact amount).
There is an app on the Google Play store that claims to be an “antivirus/cleaner” app and abused Android's accessibility APIs to access WhatsApp's media files. The developer called it a “WhatsApp cleaner”. The app was removed from the Play store for several weeks but was allowed to return. The developers now claim to PROTECT against unauthorized access to WhatsApp's media files. The app is affiliated with China's Qihoo, which was booted from the Play store long ago for hijacking users' WebView with fake virus warnings to boost installs - something this app has been doing every day since 2013.
I thought I’d take this opportunity to describe my recent experience submitting a bug report for WhatsApp on iOS.
When you export a chat, you get a zip file containing the messages as plain text, plus any media files referenced in the chat. The .txt file unfortunately only contains the text-only messages, not the text captions for media items. I reported this as a bug and was told this was functioning as intended.
So this is a warning to anyone who thinks they are backing up their WhatsApp chats via the export feature that their backups are incomplete.
As a workaround, you can get hold of the ChatStorage.sqlite file from an iTunes backup of your phone. All text is in there but you obviously have to query the database and format it into a readable sequence of messages.
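For anyone attempting that workaround, a rough sketch of the query side is below. The table and column names (ZWAMESSAGE, ZTEXT, ZMESSAGEDATE) and the Core Data epoch are assumptions based on older versions of the database and may differ in current builds:

```python
# Rough sketch: dump text messages from a ChatStorage.sqlite pulled out of an
# iTunes backup. Table/column names (ZWAMESSAGE, ZTEXT, ZMESSAGEDATE) and the
# Core Data epoch are assumptions that may differ between WhatsApp versions.
import sqlite3
from datetime import datetime, timedelta

CORE_DATA_EPOCH = datetime(2001, 1, 1)  # Core Data stores seconds since this date

def dump_messages(db_path: str) -> None:
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT ZMESSAGEDATE, ZTEXT FROM ZWAMESSAGE "
        "WHERE ZTEXT IS NOT NULL ORDER BY ZMESSAGEDATE"
    )
    for ts, text in rows:
        when = CORE_DATA_EPOCH + timedelta(seconds=ts)
        print(f"{when:%Y-%m-%d %H:%M}  {text}")
    conn.close()
```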
This really, really sucks as a workflow and I hope if any WhatsApp engineers ever read this they start working on a real export feature.
It seems that WhatsApp has some oddity in how it handles captions on media items.
I forward messages between various groups, but when you forward a media item like an image, it doesn't forward the caption with it. This is incredibly frustrating - sometimes you don't realize you've forwarded a photo with no context, and my recipients are often clueless until I clarify.
That’s ok for me - I only consider my iPhone a data capture device (at which it is excellent), not a general-purpose computer (at which it is... hobbled).
I only use apps for which I can get data out of the walled garden and into a sane Linux environment. Ironically, considering the generally oppressive lockdown of the iOS environment, the prevalent use of SQLite databases means you can frequently get a good machine-readable extract.
The weak link is iTunes. I know that one day they will kill it and make backups totally opaque, and that will be the end of me using iDevices.
Somewhat disheartening that the author believes by default that the encryption wasn't tampered with on the proprietary server side of this proprietary client.
There's nothing "by default" about it, E2E encryption by definition can't be tampered with by anything on the server side, and the encryption mechanisms of WhatsApp are well understood
Which definition? WhatsApp, like a lot of things that claim E2EE, encrypts and decrypts at the clients. The problem is that we have no idea what the client programs are doing. They have not yet performed the required step of showing us the source code in a form that can be compiled to be the same as the distributed binary.
We know exactly what the client programs are doing, the bits are right on your device and you can audit them if you want (and it has been done already for WhatsApp by many researchers). Have you audited any of the open source software that you use?
No, what is well understood is the Signal protocol (from Open Whisper Systems), which is the protocol WhatsApp and Signal _claim_ to have implemented. There is no proof the server side doesn't have a means to acquire the data. And when the context is proprietary software, chances are _always_ against the weak link in the chain, the end user.
> No. WhatsApp uses iOS Data Protection to encrypt user data files (including ChatStorage.sqlite) using the device-specific and unrecoverable hardware UID key as well as a key derived from the user's passcode. It may not be decrypted without physical access to the specific iOS device that created the file as well as knowledge of the user's passcode.
Didn't we learn that not to be the case, since presumably the device can still flash a new Apple-signed firmware that would override this?
No. You would still need a signed, modified firmware that makes use of the passcode-derived key from the original device, but the point is that an iPhone can accept a new firmware while locked, which shouldn't be the case if Apple was serious about non-compromised security.
Are you talking about DFU mode? I think it would be very difficult for Apple to remove that without causing a lot of issues for people. Being able to completely reset the device if you forget the password is useful, especially since you still need the user's iCloud credentials to activate it (so as to discourage theft).
I'm talking about the time Apple was asked to provide an alternative firmware to unlock the iPhone of the San Bernardino shooter, which suggests that at least Apple itself can put a new firmware on a locked device without having it wiped.
It may be the case you can put a new OS on the device without it being wiped, because the OS partition is separate from the user data partition, and probably unencrypted.
Anyway, a hypothetical alternative firmware couldn't just magically bypass the encryption. What it could have done - and I think this may only apply to older iPhone models, as it's now handled by the Secure Enclave(?) - is make it easier to brute-force the pincode (no lockout, less delay).
Can you do an OS upgrade on a locked device without unlocking it first? If so, that is terrible regardless of whether the OS partition is encrypted or not.
It also doesn't really matter whether the compromise is a direct key extraction or just defeating the anti-bruteforce protections; the root flaw here is the phone accepting new privileged software while locked and still retaining its state.
Well, the existence of a bypass for that is what is being considered here; such a bypass would constitute a backdoor contrary to Apple's security/privacy posturing.
> a hypothetical alternative firmware couldn't just magically bypass the encryption
No, but it could presumably brute-force the PIN (unless the rate limiting is hardware-controlled?) or wait until the user enters it, then decrypt everything.
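Back-of-the-envelope numbers for why removing the rate limiting matters, assuming the roughly 80 ms per passcode-derivation attempt that Apple's security documentation describes:

```python
# Rough arithmetic: with lockouts and escalating delays removed, the remaining
# cost is the ~80 ms key-derivation time per passcode attempt (assumption taken
# from Apple's platform security documentation).
PER_ATTEMPT_S = 0.08

for digits in (4, 6):
    worst_case_s = (10 ** digits) * PER_ATTEMPT_S
    print(f"{digits}-digit passcode: ~{worst_case_s / 3600:.1f} hours worst case")
# 4 digits: well under an hour; 6 digits: on the order of a day.
```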
I'm talking about device data being preserved and made accessible, while the device is screen-locked, without having any credentials or the passcode - all three at once.
Yes. If the FBI had a signed system image that disabled disk encryption, they would have applied it through Recovery mode and the data would not have been lost. They couldn’t get Apple to make or sign the software, however.
Not the point. If installing such software on a locked device is technically feasible, that's a pretty glaring hole in the entire security approach, essentially making Apple a trusted party to your phone data with the ability to create other such parties.
You already trust them not to make compromised software updates anyway. While I agree that this is definitely a potential issue, I don’t think it changes the threat model.
You can because (presumably) the backup is encrypted with a key that is derived from your iThing credentials, not with the phone's hardware encryption key, which is derived among other things from the screen-lock passcode.
But if you lose the passcode to your screen-locked or unpowered device, any way of getting data (that has not been backed up) out of it while it's locked or turned off implies a security flaw, even if done via a firmware signed by Apple.