I agree with your take, but I'll emphasize that the recent wave of AI progress has me questioning how much of human intelligence just reduces to pattern matching. There are certainly a lot of things, like painting, that most people wouldn't have called "pattern matching" a few years ago and that now seem to clearly fall into that category.
This reminds me of how I felt when I was 14 years old and I discovered what oxytocin was on an episode of Boston Legal.
The fact that feelings of love and closeness could be prompted by a mere chemical was deeply saddening to me. It wrecked my worldview.
"Love is just the result of some chemical? Then it's not even real!" I thought to myself.
Fast-forward about 20 years, and that's proven to be an obvious (and massive, and useless) oversimplification.
Of course love isn't "just a reaction caused by a chemical." It's a fantastically complex emergent property of our biological system that we still absolutely do not understand.
It's the same with thinking: are parts of it analogous to pattern matching? Sure! Is this the whole story? Not even close.
There's one rather extreme difference. Humanity went from a domain where there was literally no such thing as painting to the Mona Lisa. Once there is an extremely large and well-established body of work, one can of course create, in literally any field, solely by mimicry, but "intelligence" is what enables us to go from nothing to something. And that remains completely absent in any sort of "AI" today.
Contrarian view: I think you need to be critical about which patterns to match. E.g., if my inputs are a book on astronomy and one on conspiracy theories, how do I answer "Is the Earth flat?"
Now, contrarian to the contrarian view: many of us live in echo chambers and go with the popular opinion instead of thinking critically, so maybe that's too high a bar even for humans.
I agree. Try formulating a sentence backwards in your head and you'll realize that most of the speaking that HUMANS do is just figuring out the next token.
This painting was revolutionary. When it was first exhibited in Paris, people were shocked. It was rejected from the Salon (the most prominent art exhibition at the time). Yet, 10 years later, every painting in the Salon resembled it. And you can draw a line from this painting, to Monet, from which you can draw a line to Picasso, from which you can draw a line to Pollock....
Obviously, none of these are totally new innovations; they all came from somewhere. Pattern making.
The only difference between this and today's language models is that Manet and artists like him used their rich sensory experience obtained outside of painting to make new paintings. But it's all fundamentally pattern matching in the end. As long as you can obtain the patterns, there's no difference between a human and a machine in this regard.
Duchamp, quoted on why he wrote what he wrote on Fountain:
> Mutt comes from Mott Works, the name of a large sanitary equipment manufacturer. But Mott was too close so I altered it to Mutt, after the daily cartoon strip "Mutt and Jeff" which appeared at the time, and with which everyone was familiar. Thus, from the start, there was an interplay of Mutt: a fat little funny man, and Jeff: a tall thin man... I wanted any old name. And I added Richard [French slang for money-bags]. That's not a bad name for a pissotière. Get it? The opposite of poverty. But not even that much, just R. MUTT.
Why did he choose "Mutt" after reading the strip, and not before? Why did he make the piece after moving to the US, and not before? Why was Fountain made only a few short years after economies were industrialized, and not before (or 100 years later)?
The point is: can an AI pick out novel things well? All these little things add up to make a work novel, and the search space of possible combinations of little things is effectively infinite, while only a select few will click with the public at any given time.
I remember reading the biography of a 20th century musician/composer, who said something to the effect of -- "Sure, I can sit down and write 4-part cantatas like Bach did, but that doesn't mean that I'm as great of a composer as Bach. What made Bach so great was that he was the one who figured out how to put these things together in the first place. Once he did that, copying the approach is no big deal."
It seems to me we're at a similar place now with AI tools. If you provided an AI tool with all music written _prior to_ Bach, would that tool take those inputs and create something new along the lines of what Bach did?
Or if provided input of all music up through the 1920s, would it create bebop? Or if provided music through the 1940s, would it create hard bop? Or if provided music through the 1970s, would it create music like Pat Metheny?
On one hand, being able to create more of the same sort of music that already exists is a very respectable thing, and what today's AI tools can do is utterly amazing. It takes human composers time and effort to learn to write music that is not innovative but merely matches the state of the art. And there's certainly a commercial market for churning out more of the same.
But when it comes to asking how close these tools are to human intelligence, I think this is one legitimate area to bring up.
Granted, these are exceptional humans, but they are extreme examples of a capability that all humans have and no machine has: coming up with something new.
People underestimate the impact that innovations, true ones and not the Silicon Valley buzzwords, have had on the world. Einstein's theories were not inevitable, and neither were Plato, democracy, or most of the other big, impactful ideas of history. But we're all conditioned to accept the lie of inevitable scientific progress, without justifying why things must always get better and more advanced. On the contrary, the collapse of many great civilizations shows that things often get much worse, quickly.
Can you explain how this is a whole different ballgame?
It seems to me that making art people like is a combination of pattern matching, luck, the zeitgeist, and other factors. However, it doesn't seem like there's some kind of unbridgeable gap between "making similar art" and "making innovations in art that people like". I'm of the opinion that all art is in some sense derivative, in that the human mind integrates everything it has seen and produces something based on those inputs.
Luck and the zeitgeist are pretty important. Without those, you have a lot of noise and are basically throwing things at the wall until something sticks.
A urinal and some supermarket soup cans represent pretty pivotal art movements. It's not clear what makes those two things more art than anything else, and even to people at the time it wasn't super clear.