A chillingly moving article that reminds me of what is important in life... to want to spend time with those I love that still live, and to remember those that don't.
As an aside- I'm also an academic and resent this sudden nonsense push to 'embrace AI' from people that have no clue what it even is. Some of my research is close enough to be getting renewed interest and funding from this... but I find it offensive that my same research ideas weren't interesting on their own merits a few years ago but suddenly are now because they involve 'AI.' It suggests a frivolous trend-following unbecoming of scientists.
I don't know what this author researches, but she deeply understands why technology needs to be aligned with human interests, and how to make sure it is. That seems to me something academic departments interested in AI research need now more than ever.
I've also always found this poem darkly captivating, because it imagines that positive, humanizing technology will be possible, yet the vision it presents seems infantilizing, and the author leaves it unclear whether they notice this, and whether it is serious or dark satire.
Utterly painful and deeply humiliating gynecology - which I would guess every woman begrudgingly deals with - is not a product of AI, but of MS: Medical Stupidity. Certainly a place the marketing department deems 'unprofitable'.
I hope her inbox is filled with sympathy, and promises to destroy the dehumanizing parts of women's health.
We think in terms of modern medicine, yet it took 150 years for surgeons to wash their hands. Now it's a commonplace assumption, while not washing hands is ancient history. Let's have women's health care advance much, much faster.
As a founder and engineer primarily focused on AI right now, it's strange to see all of this going on. Neither I nor anyone in my immediate bubble goes along with the fanaticism; it's an exciting new tool with lots of potential, but a tool among many nonetheless.
I want to keep everyone else from getting distracted by the shininess of it all, because the world really needs these scientists and organizations to keep advancing the substantive work they are best at.
The recent Nobel in physics was the perfect example, and I think HN echoed the general feeling that we mightily respect the field, probably above what we are doing, and that it is an outrage for it to get overshadowed like that.
It's also weird to see how not being focused on AI is disrupting so many people, yet being focused on AI doesn't help you that much either. It is definitely not the trump card people think it is with investors, and even less so with the market. Sometimes it is better not to emphasize it, because it cheapens the whole thing! Among other things, people (understandably) do not understand the spectrum of possibilities for using LLMs; in their minds it all boils down to asking ChatGPT, which is obviously not very substantive. At the end of the day, it's all just technical details; people want results and value.
I think they are referring to the author of the article losing their job because they weren't focused on AI:
> the Dean of my college told me...that I should look for long-term academic employment elsewhere. My research and practice was not centered enough on “AI” and “emerging technology” to fit within the institution...
"As an aside- I'm also an academic and resent this sudden nonsense push to 'embrace AI' from people that have no clue what it even is."
Follow the money.
If you're used to "follow the money" and have done it a lot, you may find this money trail doesn't lead to the usual places. But look at the size of the AI bubble. There's a lot of money available to fund PR and push "influencers" to run around telling people that they need to get on board or get left behind... because if people don't get on board, that AI bubble is going to pop. And I think people underestimate the amount of control a donation to an educational institution gets the donor, unless they are very cynical.
I do not claim or believe that "AI" is useless. I do firmly believe that on top of the foundation of useful stuff it can do, some of it if anything underutilized and understudied (I agree with an article on HN a few days back that embeddings are really underutilized and understudied versus all this generative stuff, which is a lot less interesting than meets the eye), there is a staggeringly enormous bubble. And with that bubble come a lot of well-resourced people motivated to keep it inflated, as there always are.
You'd think by 2024, the techno-fetishist "If it's NEW technology it's GOOD technology!" attitude would be dying down in the general consciousness. Tech is not automatically good and helpful just by virtue of being new.