Signal is an end-to-end encrypted messaging app. People keep breathlessly mentioning the lack of database encryption as a problem, but it was never a real security issue: Signal's job is not, and has never been, dissuading an attacker who has local access to one of the ends, especially because that is an incoherent security boundary (just like the people who were very upset about Signal using the system keyboard, which is potentially backdoored - if your phone is compromised, of course someone will be able to read your Signal messages).
Database encryption isn't comparable to the keyboard drama. You can protect against malware in your keyboard by using a different keyboard, so that is reasonably out of scope.
But if my phone gets taken and an exploit is used to get root access to it, I don't want my messages to be readable, and there's nothing I can do to prevent it. It's not like I can just swap in a different storage backend.
It's also a very simple fix - just let me set an encryption password. It's not an open-ended problem like protecting against malware running on the device while you're using it.
If someone has root access to your apparently unencrypted phone, then they can just launch the Signal app directly and it'll decrypt the database for them.
Which is to say this is an incoherent security boundary: you're not encrypting your phone's storage in a meaningful way, but you plan to rely on entering a PIN every time you launch Signal to secure it? (Which in turn is also not secure, because a PIN is only secure with hardware able to enforce lockouts and tamper resistance... which in this scenario you just indicated has been bypassed.)
Any modern Android phone is encrypted at rest, but if it's taken after first unlock, the attacker gets access to the plaintext storage. That's the attack vector.
A passphrase can be long, not just a short numeric PIN. It can be different from the one that unlocks the phone. It could even be different for different chats.
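To make that concrete, here is a minimal sketch of what a passphrase-protected local database could look like, using Python's cryptography package: derive a key from the passphrase with a memory-hard KDF, then encrypt the database blob with an AEAD. The function names and parameters here are mine, purely for illustration, not Signal's actual scheme.

    # Hypothetical sketch: passphrase-derived encryption for a local DB blob.
    # Requires the 'cryptography' package; not Signal's actual implementation.
    import os
    from cryptography.hazmat.primitives.kdf.scrypt import Scrypt
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def derive_key(passphrase: bytes, salt: bytes) -> bytes:
        # Memory-hard KDF so brute-forcing the passphrase stays expensive.
        return Scrypt(salt=salt, length=32, n=2**15, r=8, p=1).derive(passphrase)

    def encrypt_db(db_bytes: bytes, passphrase: bytes) -> bytes:
        salt, nonce = os.urandom(16), os.urandom(12)
        key = derive_key(passphrase, salt)
        return salt + nonce + AESGCM(key).encrypt(nonce, db_bytes, None)

    def decrypt_db(blob: bytes, passphrase: bytes) -> bytes:
        salt, nonce, ciphertext = blob[:16], blob[16:28], blob[28:]
        return AESGCM(derive_key(passphrase, salt)).decrypt(nonce, ciphertext, None)

The point is that the decryption key only exists while the passphrase is being entered, so an image of the device taken after first unlock doesn't contain it.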
They're vulnerable to "High-S" malleable signatures, while ed25519 isn't. No one is claiming they're backdoored (well, some people somewhere probably are), but they do have failure modes that ed25519 doesn't, which is the GP's point.
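For the curious, the malleability is easy to demonstrate: in ECDSA, if (r, s) verifies then so does (r, n - s), where n is the curve order, so anyone can mint a second valid signature without the private key. A quick sketch using the python-ecdsa package (the package choice is mine, just for illustration):

    # Demonstrates ECDSA "high-S" malleability with the python-ecdsa package:
    # flipping s to n - s yields a distinct signature that still verifies.
    from ecdsa import SigningKey, SECP256k1
    from ecdsa.util import sigdecode_string, sigencode_string

    sk = SigningKey.generate(curve=SECP256k1)
    vk = sk.get_verifying_key()
    msg = b"same message, two valid signatures"

    sig = sk.sign(msg)                              # raw r||s encoding
    r, s = sigdecode_string(sig, SECP256k1.order)
    sig_flipped = sigencode_string(r, SECP256k1.order - s, SECP256k1.order)

    assert sig_flipped != sig
    assert vk.verify(sig, msg)
    assert vk.verify(sig_flipped, msg)              # the flipped signature also verifies

Ed25519 as specified in RFC 8032 rejects out-of-range s values outright, rather than leaving it to an ecosystem convention like Bitcoin's low-S rule.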
CodeQL compiles to the Souffle datalog engine, and I use it for static analysis. I've also used ascent for a few random side projects in Rust, which is very convenient.
The human brain is just really bad at evaluating risk, especially over long periods of time. A lot of people are wanted overseas for years or even decades without anything happening, which makes it hard to maintain the mindset of being at risk without falling back to "eh, I've been fine this long"; a lot of them do foreign travel anyway and get away with it, which makes it hard not to fall into "what's one more vacation to an extradition-friendly country".
...And this is how I learn that the line "Steele Dakota's sandhill crane" from mewithoutYou's Nine Stories is talking about a bird species and not a literal mechanical crane. Apparently they have the largest sculpture of a sandhill crane in the world at 40ft (which makes more sense in the context of the song than a mechanical one!) https://www.ndtourism.com/steele/attractions-entertainment/f...
Adacore is doing great work. They have a Rust compiler, but static analysis and formal verification aren't quite there for the Rust toolchain, and it doesn't have anywhere near the legacy for high-integrity, mission-critical applications. Not that it isn't heading that way, but for us it is not there by a long shot to select it for our product.
They run the game servers in Docker. Going multi-tenant is a weaker security boundary and makes it easier to steal places from other users, which Roblox takes pretty seriously, since places represent all the time game studios have invested and millions of dollars in revenue.
Running multiple game servers in Docker is a multi-tenant environment, because Docker is not a serious security boundary unless you're applying significant kernel hardening to your kconfig, to the tune of grsecurity patches or similar.