Do we really need 128 code points just to express an alphabet of 26 letters?
I think we should use a 4-bit encoding.
0 - NUL
1-7 - aeiouwy
8 - space
9-11 - rst
12-15 - modifiers
When a modifier nibble appears, the next nibble is interpreted against a different table, covering the rest of the alphabet, digits, symbols, etc., depending on which modifier was used.
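For fun, here's a rough sketch of what a decoder for this scheme could look like. The base table follows the list above (assuming rst sits at 9-11 and modifiers at 12-15, so the ranges add up); the shifted table for the remaining consonants is purely hypothetical, since the layout past the modifiers isn't spelled out:

```python
# Base table: 0 = NUL, 1-7 = aeiouwy, 8 = space, 9-11 = rst.
BASE = {0: "\0", 8: " "}
for i, ch in enumerate("aeiouwy", start=1):
    BASE[i] = ch
for i, ch in enumerate("rst", start=9):
    BASE[i] = ch

# Hypothetical: modifier 12 shifts the next nibble into the
# 16 remaining consonants. Modifiers 13-15 would select digits,
# symbols, etc. in the same way.
SHIFT = {12: dict(enumerate("bcdfghjklmnpqvxz"))}

def decode(nibbles):
    """Decode a sequence of 4-bit values into text."""
    out = []
    mod = None
    for n in nibbles:
        if mod is not None:
            # Previous nibble was a modifier: use its shifted table.
            out.append(SHIFT[mod].get(n, "?"))
            mod = None
        elif n in SHIFT:
            mod = n
        else:
            out.append(BASE.get(n, "?"))
    return "".join(out)
```

With that layout, common vowel-heavy text packs two characters per byte, and only the rarer consonants pay the two-nibble modifier cost.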
As someone who has repeatedly had to deal with Unicode nonsense, I wholeheartedly agree. Also, you don’t need accents. You just need to know how to read and have context. See: live and live, read and read, etc.