The funny thing is Hammerbacher worked on trading, which is full of super fun CS problems even if making rich guys richer is a bit nihilistic ... then he founded Cloudera, which, I dunno, seems pretty "beige" as far as CS goes. Didn't realize he did a stint at FB. That could suck the life out of anyone.
Physics has the same problem. Upwards of 20k people in the APS... most working on very obscure problems spraying "shittonium on silicon 111" or making no progress in various theories of everything.
Still plenty of important and interesting problems in CS, some of which I've been lucky enough to work on.
Blockchain stuff is pretty useful and interesting, though it could all fall apart with some breakthrough in crypto.
Machine learning is mostly used for stupid stuff like getting people to click on ads, but there are lots of interesting use cases for it that haven't been explored yet.
There's also a lot of work to be done on the core tools used in ML; while everyone babbles about deep learning, techniques like boosting, PGMs, and topological data analysis still have a lot of low-hanging fruit IMO.
One that people don't think about enough: non-standard computing architectures. Quantum computing hasn't produced anything of note yet, but it's hardly the only potential area of research here. Simply using stuff like old school Harvard architectures has tremendous implications for security (no more buffer overflows, yo), but nobody bothers thinking these things through and implementing them.
> most working on very obscure problems spraying "shittonium on silicon 111" or making no progress in various theories of everything.
That's the problem with foundational research: It always looks obscure and impractical until suddenly there's a huge breakthrough out of nowhere. Same dynamic as with startups, where 99% will never make a significant mark of any kind, while 1% change the world forever. And you cannot know in advance which startup (or which foundational research) falls into which category.
Surface science (shittonium on silicon 111) has been promising to explain catalysis for 40 years now... it's still important to do, makes the chips run faster, but it's usually not considered truly foundational.
>Blockchain stuff is pretty useful and interesting, though it could all fall apart with some breakthrough in crypto.
At this point it seems unlikely that a breakthrough in crypto could kill blockchains without killing nearly all of modern cryptography. For instance, we can build blockchains which are secure even against quantum computers. We can build blockchains that make no number-theoretic assumptions at all, using only secure hash functions. Any technique that could break every method we have of constructing hard-to-solve problems (i.e. cryptography) would have a massive impact on all of technology, not just blockchains.
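To make the "only secure hash functions" point concrete, here's a minimal hash-chain sketch in Python. It's a toy, not a real blockchain (no signatures, no consensus, made-up block fields); it just shows that the tamper-evidence property rests on nothing but the hash function:

```python
import hashlib
import json

def block_hash(block):
    # Deterministic serialization, then SHA-256. No number theory involved.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(data, prev_hash):
    # Each block commits to its predecessor by embedding that block's hash.
    return {"data": data, "prev": prev_hash}

def verify(chain):
    # Tampering with any block changes its hash and breaks every later link.
    for prev, cur in zip(chain, chain[1:]):
        if cur["prev"] != block_hash(prev):
            return False
    return True

genesis = make_block("genesis", "0" * 64)
b1 = make_block("tx: alice -> bob", block_hash(genesis))
b2 = make_block("tx: bob -> carol", block_hash(b1))
chain = [genesis, b1, b2]

print(verify(chain))          # True
genesis["data"] = "tampered"
print(verify(chain))          # False: b1's stored hash no longer matches
```

Breaking this requires finding collisions or preimages for the hash function itself, which is exactly the "break nearly all of modern cryptography" scenario.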
A breakthrough in crypto that would destroy blockchains is much more likely to be a technology that is significantly better than a blockchain at what we want blockchains to do. That would also be an exciting result.
I was thinking something along the lines that hashing proofs aren't very rigorous. I mean, it all looks pretty good, and I have no idea how to break this stuff (not my department), but some weird topology guy could wake up one day and discover that hashes aren't as good as they looked.