The bitter lesson, that data beats understanding, reminds me of the "more energy beats efficiency" idea in "Review: Energy and Civilization" (https://news.ycombinator.com/item?id=36013309), which notes that the pursuit of efficiency suffers from diminishing returns, whereas abundant energy does not.
In a similar way, parsimonious concepts and knowledge, while personally satisfying to researchers, are inherently limited; whereas massive data (including self-generated data, as with AlphaZero) are not.
Why would more data not enable better understanding in general? The only problem is storing that data and making it accessible. As storage and processing throughput grow, generality of computation can too.
You can make Formula 1 cars faster by putting more powerful engines into them, but only to a degree. It's a constrained problem, and the solution space is full of tradeoffs.
In the same way, I'm sure there are limits to energy, and at some point "abundant energy" (whatever that means) suffers from diminishing returns as well -- at least until new technologies make it accessible, distributable, etc.
Yea, there are a ton of limits to energy. Use too much in a confined spot and you lose the thermal gradient needed to do useful work. Use too much for too long, and you've turned your local universe into entropy. Put too much in one spot, and you create a black hole.
One interesting thing about the human mind: early in life it uses a huge portion of our total energy, somewhere around 60% of our total energy flux. Later in life it uses only around 20%, and it does this by taking a lot of shortcuts and reducing connectivity except where it matters. Of course this makes it much more difficult for us to learn new things at the speed we did when we were young, but as long as our environment doesn't change too much, it lets us conserve a ton of energy to survive lean times.
> [...] which notes that the pursuit of efficiency suffers from diminishing returns, whereas abundant energy does not.
But it does? For example, if we put more and more energy into making trains move faster, the returns will diminish. It's simply not worth it, considering cost and energy grid stability. I haven't read the book you're referring to, so maybe the context there is different.
That energy must come from somewhere, and at some point it is going to catch up with us when we use too much of it, in the short run or the long.
I was quite taken by these ideas, but I've since come to a more balanced viewpoint. Just look around and see how much of the modern world is built using interpretable physical and mathematical models. No one would suggest that we build skyscrapers by training a neural net to empirically predict the probability of collapse. But this is hindsight bias, because we already solved skyscrapers through centuries of accumulated foundational knowledge. Looking to the future, it's unclear which classes of problems are tractable engineering challenges and which ones need the hammer of deep learning -- including AGI.
> In a similar way, parsimonious concepts and knowledge, while personally satisfying to researchers, are inherently limited; whereas massive data (including self-generated data, as with AlphaZero) are not.
Knuth sides with actual understanding:
> [I shall continue] to devote my time to developing concepts that are authentic and trustworthy (https://news.ycombinator.com/item?id=36012360)