Last November, DeepMind released the results of a generative AI model that predicted theoretical crystal structures for over 2 million undiscovered synthetic materials. Within days, materials science researchers were able to confirm around 700 of them in the lab. The number of potential new materials discovered exceeds everything created in the rest of human history combined. These are materials that could be used in manufacturing, energy production, and other applications that are critical not just for advancing human society, but for avoiding the impending crisis we are already facing.
Similar AI endeavors have been underway for medicine and human health.
The author is making extremely shallow, flawed arguments that hinge on an ignorant (or perhaps deliberately narrow-minded) understanding of what generative AI is, how it is already being used, and the magnitude of what is already being achieved with it.
It will be interesting to see how many of those, if any, pan out to have a meaningful use at scale. If I remember right, the 700 or so were synthesized in a lab, but I don't think we know much beyond that.
We'll see if any of them end up being viable as far as manufacturing and material availability go, and whether they're better replacements for existing tech like batteries. The hope is that we'll have Jarvis inventing a new material for Iron Man's suit, but we could always end up with an endless pile of technically feasible but functionally useless materials.
> We urgently need the expertise of social scientists to be able to make much-needed collective decisions about the future of generative AI that we want; we can’t leave it to business, markets or technologists.
> Kean Birch is director of the Institute for Technoscience & Society at York University.
Academic sociologist argues that AI should be controlled by academic sociologists. Color me surprised.
Southern Indiana has lots of hills and forested areas, including the Hoosier National Forest. The Great American Rail-Trail, however, mostly runs through the upper third of the state, so you miss all that and instead ride through the corn belt. It's very flat and mostly farmland, which is likely why the railways were built along that route in the first place.
I personally can find flat farmland quite beautiful, but I can imagine biking through it for days might get dull.
Must vary widely, because I went in August and there was no line at all. A lot of places in the US are on spring break right now, so that may contribute to your experience.
I'm not sure millennials or Gen-Xers are any less materially obsessed; I think we just generally have less money to spend on the things Boomers did. It isn't like we aren't spending thousands on tech, clothing, etc. And I hesitate to condemn Boomers too harshly for their lifestyle choices with the benefit of hindsight, when I'm sure my generation will also be pilloried for the aspects of our lifestyle that will later be seen as moronic.
More like lol at assuming your rent doesn't cover the other expenses of owning a house. You think your landlord is just taking the loss? When you rent, you are paying, if not directly, for every expense associated with the house, plus extra for the landlord's profit margin.
The rental market and the home-buying market are separate and not perfectly correlated. In SF right now, it's cheaper to rent than to buy, so yes, if a landlord bought today and rented it out, they'd be taking a loss.
The thing is that landlords may have purchased the place a long time ago. They can still make a profit with rent below what a mortgage plus expenses would cost today.
Landlords absolutely incorporate home repair costs into rent. I'm shocked at the number of people who assume that landlords just eat the cost of maintenance. Just because you aren't forking over the money directly to the repairman doesn't mean you aren't paying.
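To put rough numbers on it (every figure below is hypothetical, just to show the arithmetic), here's the kind of back-of-the-envelope a landlord runs. Maintenance is a line item in the rent either way, and it also shows how both upthread comments can be true at once: a landlord who locked in a cheap mortgage years ago profits at a rent that would be a loss for someone buying today.

    # Back-of-the-envelope rent math; every number below is made up.
    def break_even_rent(mortgage, taxes, insurance, maintenance, vacancy_reserve):
        """Monthly rent at which the landlord neither gains nor loses."""
        return mortgage + taxes + insurance + maintenance + vacancy_reserve

    # Landlord A locked in a cheap mortgage years ago (hypothetical figures).
    old_landlord = break_even_rent(mortgage=1200, taxes=350, insurance=100,
                                   maintenance=250, vacancy_reserve=100)  # = 2000

    # Landlord B buys the identical unit at today's prices and rates.
    new_landlord = break_even_rent(mortgage=3200, taxes=350, insurance=100,
                                   maintenance=250, vacancy_reserve=100)  # = 4000

    market_rent = 2800  # whatever the local market will bear
    print(market_rent - old_landlord)  # +800/mo: the early buyer profits
    print(market_rent - new_landlord)  # -1200/mo: today's buyer takes a loss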
> I’m not sure if this is just an attempt to down play their results or if it’s more academic jealousy because the funding goes to the “cool stuff” like AI/ML in the CS dept. and the Stats dept. is seen as old and boring.
No one is trying to downplay the legitimately impressive results of AI/ML. Deep learning, convolutional neural networks, and GANs have had incredible success in fields like computer vision and image/speech recognition. But outside of those areas, the "results" for the current fads in AI/ML have been grossly overstated. You have academic computer scientists like Judea Pearl decrying the "backward thinking" of statistics and championing a "causal revolution", despite not actually doing anything revolutionary. You have modern machine learning touted ad nauseam as a panacea for any predictive problem, only for systematic reviews to show it doesn't actually outperform traditional statistical methods [1]. And you have industry giants like IBM and countless consulting companies promising AI solutions to every business problem that turn out to be more style than substance, with "machine learning" algorithms that are just regression.
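On the "just regression" point, it's worth spelling out: a single sigmoid "neuron" trained with cross-entropy loss is literally logistic regression under a different name. A minimal numpy sketch (toy data, made up for illustration, not from the cited review):

    # A one-neuron "neural network": sigmoid activation, cross-entropy loss,
    # gradient descent -- i.e. logistic regression. All data here is synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))                   # toy features
    true_w = np.array([1.5, -2.0, 0.5])
    noise = rng.normal(scale=1.0, size=500)
    y = (X @ true_w + noise > 0).astype(float)      # toy binary labels

    w, b, lr = np.zeros(3), 0.0, 0.5
    for _ in range(5000):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))      # sigmoid "activation"
        w -= lr * X.T @ (p - y) / len(y)            # cross-entropy gradient
        b -= lr * (p - y).mean()

    print(w, b)  # matches what a plain logistic regression fit gives
                 # (up to regularization), because it's the same model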
There's a reason why AI research has gone through multiple winters, and why another is looming: those in AI/ML seem more prone, or more willing, to overpromise and underdeliver.
God, I hate this paper. Perhaps it was relevant in its time, but that was 18 years ago. The dichotomy it describes between the "two cultures" isn't nearly as pronounced today, if it even still exists. There are few statisticians today who adhere entirely to the "data modeling culture" as Breiman described it.
I'm surprised how often this paper continues to get trotted out. In my experience it seems to be a favorite of non-statisticians, who use it as evidence that statistics is a dying dinosaur of a field about to be superseded by X (usually machine learning). Perhaps they think that if it's repeated enough, it will be spoken into existence?