
As far as iOS is concerned, Apple got rid of 32-bit support in the processor itself, allowing it to improve the processor.

Keeping old code around increases the security vulnerability surface. For instance, there are at least a half dozen ways of representing a string in Windows. One of the earliest widespread Windows vulnerabilities was caused by improper handling of string encoding: anyone could run commands on a web server running IIS just by sending a specially encoded URL from the browser.

https://www.sans.org/reading-room/whitepapers/threats/unicod...
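
For illustration only, here's a minimal Python sketch of the decode-after-check pattern behind that class of bug. The handler, WEB_ROOT path, and function names are made up (this is not IIS code), and the real exploit used overlong UTF-8 forms like %c0%af where this sketch uses a plain %2f to keep the decoding simple. The point is the ordering: the server scans the raw URL for "../", then percent-decodes it afterwards, so an encoded traversal sails through the check.

    from urllib.parse import unquote

    WEB_ROOT = "/inetpub/wwwroot"  # hypothetical document root

    def is_safe_naive(raw_url):
        # Flawed check: looks for traversal sequences in the *encoded* URL only.
        return "../" not in raw_url and "..\\" not in raw_url

    def resolve_naive(raw_url):
        # Bug: percent-decoding happens *after* the safety check, so an
        # encoded '/' ("%2f" here, an overlong UTF-8 form in the real exploit)
        # only turns into "../" once the check can no longer see it.
        return WEB_ROOT + unquote(raw_url)

    # Modeled on the shape of the widely published exploit URLs.
    raw = "/scripts/..%2f..%2fwinnt/system32/cmd.exe?/c+dir"
    if is_safe_naive(raw):
        print("check passed, would serve:", resolve_naive(raw))

The fix for that class of bug is to decode and canonicalize first and only then validate the final path, which is exactly the kind of invariant that gets harder to hold when many legacy string representations and decoders coexist.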



> As far as iOS is concerned, Apple got rid of 32-bit support in the processor itself, allowing it to improve the processor.

That is slightly different because Apple is designing their own mobile CPUs. And indeed, by dropping 32-bit ARM support they can simplify and improve their CPU designs.

OTOH, Intel isn't gonna drop 32-bit x86 support from their chips just because Apple isn't making use of it.


Yeah but maybe dropping 32-bit support is a necessary step before dropping Intel chips...

They will face some* backlash now, but if/when they switch the Mac to their own ARM chips they might achieve a painless transition.

*They announced 32-bit deprecation like a decade ago. Will legacy users be pissed off? Yes! Is that an excuse for developers who still relied on 32-bit support over the last decade? NO!


Maybe, but I remember reading at the time that it was so the OS didn't need to load two versions of every library into memory.
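
If you want to see what that dual-architecture duplication looks like on disk, here's a small Python sketch of my own (not from Apple's docs beyond the published Mach-O magic numbers) that checks whether a library file is a universal "fat" binary carrying multiple architecture slices or a single-architecture one:

    import struct, sys

    # Published Mach-O magic numbers (<mach-o/loader.h>, <mach-o/fat.h>).
    FAT_MAGIC = 0xCAFEBABE      # universal binary: fat header is big-endian
    FAT_MAGIC_64 = 0xCAFEBABF
    MH_MAGIC = 0xFEEDFACE       # thin 32-bit Mach-O
    MH_MAGIC_64 = 0xFEEDFACF    # thin 64-bit Mach-O

    def classify(path):
        with open(path, "rb") as f:
            raw = f.read(4)
        if len(raw) < 4:
            return "too short to be Mach-O"
        big = struct.unpack(">I", raw)[0]
        little = struct.unpack("<I", raw)[0]
        if big in (FAT_MAGIC, FAT_MAGIC_64):
            return "universal (fat): multiple architecture slices in one file"
        if MH_MAGIC_64 in (big, little):
            return "thin 64-bit Mach-O"
        if MH_MAGIC in (big, little):
            return "thin 32-bit Mach-O"
        return "not a Mach-O file"

    if __name__ == "__main__":
        # Usage: python3 fatcheck.py /path/to/some.dylib
        # (Pick the path yourself; on recent macOS many system libraries live
        # only inside the dyld shared cache, so try an app's own frameworks.)
        for p in sys.argv[1:]:
            print(p, "->", classify(p))

As I understand it, while 32-bit apps were still supported the system frameworks shipped with both i386 and x86_64 slices, and running even one 32-bit process meant the 32-bit copies got mapped into memory alongside the 64-bit ones.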


Your example is from 2000/2001, the same year OS X was first publicly released.


That’s kind of the point - it’s gotten worse since then. Windows has become more bloated as they’ve added more layers and refused to drop backwards compatibility.



