As someone (probably like many here) who graduated from a university which taught computer vision and peripheral neuroscience courses, with titles such as "Computational Neuroscience of Vision", I always felt that trying to understand the human brain as a kind of algorithm was a bit of an artefact of how computer scientists approach biology.
The truth is that the visual cortex is vast, and still not sufficient to explain how humans visually classify and perceive objects. Never mind individual neurons or edge perception. Edge detection is an interesting isolated example for study and learning, but you will never come close to explaining human recognition and cognition in such simple terms.
(Incidentally, there's a fairly deep literature from historians of science who have carefully documented that we describe ourselves as analogous to the most sophisticated technology of the day: see "to lose one's temper", "to blow a gasket", "I got my wires crossed", "sorry, cache miss", ... as metaphors and idioms of mental state through the centuries that reflect the cool tech of the time in which they were coined.)
The universe itself is also often seen through the lens of contemporary technology.
Are we living on an island that floats on a giant turtle’s back? Or are the heavens like giant clockworks? Or maybe it’s all a computer simulation?
These cosmological speculations are separated by thousands of years, but they are all simply a reflection of what the person finds most awe-inspiring in their everyday life.
Reasoning by analogy is one of the ways we solve the framing problem.
So, when explaining the universe we imagine it's an act of will by a conscious entity (i.e., like how we invent). When explaining the mind we suppose it's like one of our inventions.
Absent an analogy of some kind it's quite hard to determine what features are salient. Objects have an essentially infinite number of properties.
Each of these are true in a sense. There's no turtle, but we are living on an "island" floating through space. The heavens do follow predictable, clockwork rules. Computer simulations are at least a good way to describe the universe.
Was it? What evidence do you have for that? If anything it sounds like the kind of verbal slapdown someone in authority would deliver to someone trying to be a smart alec. It is short. Easy to understand. And it shuts down that kind of questioning.
I would be very surprised if anyone considered it the literal truth, but of course I have seen stranger things.
> heavenly clockwork and computer program perspectives are clear metaphors.
I don't know about the clockwork. You will need to find someone who talks about that and ask them whether they meant it as a metaphor or not.
On the other hand I know about the computer simulation one. That for me is not a metaphor. I seriously think that it is within the realm of possibilities that this universe we live in (including us) is a literal simulation.
There would be possible physical experiments which depending on their results could make me increase or decrease my confidence in that statement. But I don’t consider it a metaphor.
Now of course that is only the viewpoint of a single human, at a single point of time. So it might not matter much. But it shows that it is not that “clear” that everyone considers that view only as a metaphor.
> I seriously think that it is within the realm of possibilities that this universe we live in (including us) is a literal simulation.
Notice how you dropped the “computer” part. Without that qualifier, the “universe is a simulation” hypothesis goes back at least to Descartes and his evil demon [1].
That’s the GP’s point. The “demon,” “clockwork,” and “computer” are just metaphors to help illustrate the point. Hundreds of years ago it was a trickster demon, now it’s computers - the simulation part is the same.
(The world floating on a turtle idea traces to the world turtle and several different creation myths, so it’s safe to say their believers took them a bit more literally)
> now it’s computers - the simulation part is the same.
I don't think so. I do understand this lineage of thought, and I agree with you that they are somewhat similar. But I must insist in saying that what I'm talking about is different.
The trickster demon metaphor talks about a being (you) whose senses are replaced by the demon. But that means there is a you outside of the demon/computer simulation.
I believe that if this universe is a simulation, then I am part of that simulation. My mind is not an external observer hooked up to the simulation, but just matter simulated by the simulation according to the rules of the simulation. The thing Descartes was talking about is a Matrix situation. (Or rather, the creators of The Matrix were paraphrasing Descartes.) Neo thinks he is living his life, but in truth his body is lying in a pod in the goo. I don't believe in that; I don't think that is likely true. If this is a simulation, then I (and you, and this computer, and all of the people, and the butterflies, and the stars) am in the simulation. And not the way Neo is in there, but the way a cubic meter of Minecraft sand is inside a Minecraft world. Inside the Minecraft world it is a cubic meter of sand; outside of it, it is just a few bytes in the memory of some program.
Let me illustrate what I mean when I say that I don't speak about the universe being a computer simulation as a metaphor. Imagine that it is a simulation. What does this computer simulation have to do? Well, it seems that there are particles, and there are forces between them (gravity, electromagnetism, the weak/strong nuclear forces). In every iteration of this simulation it would seem that you need to work out which particles are close to which others, so you can update the forces on them and calculate their new state.
To do this you need to inspect the distance between every pair of particles. That scales as O(N^2) with the number of particles N. If the universe is a computer simulation it probably runs on a computer of immense power, but even then N^2 scaling is not good news in a hot path. The funny thing is that if the universe you want to simulate is relatively sparse (as ours is), and has an absolute speed limit (as ours seems to have), then you can shard your workload into parallel processes. Then you can run the separate shards relatively independently, and you only need to pass information from one shard to another periodically.
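To make the scaling argument concrete, here is a minimal Python sketch (the function names and the toy force law are mine, purely illustrative, not a claim about how any real simulator would work): a naive O(N^2) pairwise pass, and a cell-based sharded pass that exploits a finite interaction range so each cell only compares itself with its neighbours.

    import numpy as np
    from collections import defaultdict

    def naive_forces(pos):
        # pos: (N, d) float array. O(N^2): inspect every pair of particles.
        n = len(pos)
        forces = np.zeros_like(pos)
        for i in range(n):
            for j in range(i + 1, n):
                d = pos[j] - pos[i]
                r2 = d @ d + 1e-12             # softening avoids division by zero
                forces[i] += d / r2 ** 1.5     # toy inverse-square attraction
                forces[j] -= d / r2 ** 1.5
        return forces

    def sharded_forces(pos, reach=1.0):
        # Bin particles into cells at least as wide as the interaction range.
        # Each cell is a "shard" that only needs to see its immediate
        # neighbours, so shards can run in parallel and exchange boundary
        # data periodically.
        cells = defaultdict(list)
        for idx, p in enumerate(pos):
            cells[tuple((p // reach).astype(int))].append(idx)
        offsets = [np.array(o) - 1 for o in np.ndindex(*(3,) * pos.shape[1])]
        forces = np.zeros_like(pos)
        for key, members in cells.items():
            nearby = [j for off in offsets
                      for j in cells.get(tuple(np.array(key) + off), [])]
            for i in members:
                for j in nearby:
                    if j <= i:
                        continue               # count each pair once
                    d = pos[j] - pos[i]
                    r2 = d @ d
                    if 0.0 < r2 <= reach ** 2: # out of reach: no interaction
                        forces[i] += d / r2 ** 1.5
                        forces[j] -= d / r2 ** 1.5
        return forces

    # Toy usage: 100 particles in 3-D. Note that the sharded pass deliberately
    # ignores interactions beyond `reach`; that cutoff is exactly what makes
    # the sharding legal.
    pos = np.random.default_rng(0).uniform(0.0, 5.0, size=(100, 3))
    f_naive, f_sharded = naive_forces(pos), sharded_forces(pos, reach=1.0)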
Now if our universe is a simulation, and it is sharded this way, then you would expect anomalies to crop up on the shard boundaries, where the simulated matter is moved from one executor "node" to another. We could construct small spacecraft and send them far away (perhaps to other solar systems?). We would furnish these small automated spacecraft with sensitive experiments: microscopic versions of Newton's cradle, or some sort of subatomic oscillator, or a very precisely measured interferometric experiment. The craft would constantly and autonomously check that the laws of physics are unchanged and work without glitches.
If we don't see any glitches, then we shrug. Either we don't live in a computer simulation, or the computer simulation is not sharded this way, or the edge cases are very well handled, or the instruments were not sensitive enough, or the shards are even bigger (perhaps we should have sent the same experiments to a different galaxy?). If we do see glitching, then we should try to map out exactly where and how it happens, and that would be very interesting. Glitching of this kind would increase my confidence that we are living in a computer simulation.
Does this make sense? You cannot design an experiment to test a metaphor; it doesn't even make sense. But I think of this as a possible literal truth, in which case you can formulate hypotheses based on it and check those with experiments.
> so it’s safe to say their believers took them a bit more literally
I believe you. Did anyone ever propose to solve a famine by sending a hunting party to cut a chunk of the turtle's flesh? Or to send gatherers to collect the dung of the turtle to fertilise the land? Or to send holy people to the edge of the world, to peer down at the turtle to predict earthquakes? If the turtles are meant to be literal turtles, these are all straightforward consequences. If nobody ever proposed anything like these, then perhaps the turtles were more of a metaphor?
The "we're living in a simulation" theory is silly and self indulgent. If it's a simulation, then a simulation of WHAT that is REAL and exists OUTSIDE of a simulation? You still have to explain THAT. It's just as stupid and self-justifying and needlessly complex and arbitrarily made-up as any religion.
That is different from "we're living in a computational medium", which doesn't claim it's simulating something else, and is the only level of existence. (i.e. Fredkin et al)
I’m sorry. Writing select words with all-caps and calling the idea names is not making your point more persuasive.
> You still have to explain THAT.
I see your point there. Sadly the universe is not obliged to be easy to understand. "If X, then I have further questions; therefore not X" is not a form of reasoning I recognise.
What I am saying is that you can’t argue that we are not in a computer because that would bring up a host of questions.
> That is different from "we're living in a computational medium" which doesn't claim it's simulating something else
Interesting. The way I use these, they are synonymous in my mind. I don't claim that there is something else out there which the simulation mimics. If you have some state representation and some rules to describe how the state propagates, then I would call a computer program which calculates new states based on the old one a simulator. This is the sense in which I use the word when I say "we might be living in a simulation". If this bothers you, feel free to imagine that I am saying "we might be living in a computational medium".
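A minimal sketch of that usage of the word, in Python (the update rule here is a made-up toy, not a claim about our universe's actual rules): some state representation, plus a rule that computes the new state from the old one.

    def propagate(state):
        # Toy rule: each cell's value diffuses to its two neighbours on a ring.
        n = len(state)
        return [0.5 * state[(i - 1) % n] + 0.5 * state[(i + 1) % n]
                for i in range(n)]

    state = [0.0] * 10
    state[5] = 1.0         # some initial state
    for _ in range(3):     # each iteration is one "tick" of the simulation
        state = propagate(state)

In this sense a simulator needn't mimic anything external; the state and the rule are all there is.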
> and is the only level of existence
Now, why exactly do you believe that? Why not 2 levels? Or 3? Why do you feel that believing there is only one level of existence is more justified than those other arbitrary numbers?
There is one key difference between reality and simulation. In reality you have to spend energy to remove noise. In simulation you have to spend energy to add noise. Or perhaps more accurately, all objects interact in reality and energy needs to be spent to prevent interaction, while simulation requires energy to make objects interact.
But it's even worse than it sounds at first, because you don't just spend energy on calculating the interactions, which is superlinear in the number of objects; you must also spend energy to make it possible for the objects to interact in the first place.
This is an incredibly deep observation that essentially points to the problem with the representations we use to understand the Universe. It feels like the universe is essentially showing us that there is a non-supra-linear representation it uses (based on the kinds or fields of interactions?), and that calculating within this representation (between fields?) is somehow equivalent to calculating all of the interactions for the objects across all of the fields simultaneously.
Almost feels like it's related to P=NP or logic and meta-logic. Is it fundamentally impossible to use the same 'Universe'-al representation inside the Universe, a Gödel-like result limiting us only to the real? Or can we represent and run subsets of smaller universes within without a computational explosion? If so, does it eventually revert back to becoming fundamentally impossible at some limit, and if so, are we there yet? Can we measure how far from the limit we are, somehow?
Fun questions. Thanks for the provocative clarification.
Perhaps a foolish question but does “simulation” necessarily imply calculation or is that just an extension of our current evolution of computing technology as an analogy for what a simulation would be? I’m not convinced the one necessitates the other.
Oh, I don’t know. I mean conceptually a simulation is just a model that changes over some axis, time being a prime candidate. I’ve seen some goofy models that use an axis other than time to create some interesting visuals. There are definitely game makers playing with some of this stuff.
Calculation may be the wrong word for what's necessary for a simulation, but I don't think you can have a simulation without something analogous to computing. The computation may look foreign, though; think analog vs digital computers. I mean, what would it mean to simulate something if you weren't interested in finding some measurable thing? How do you separate being able to observe the simulation from being able to measure anything in it? I may be too steeped in engineering to answer this, since the last thing I simulated was an analog circuit. But I also studied artificial life, and even there the goal was to learn something about life.
What I wonder about, given your explanation, is how a simulation knows where the noise is coming from. My feeling is that inside the simulation one is unable to differentiate the source of the noise.
You're not wrong. But I suspect you'd find inconsistencies if you looked hard enough: situations where two things don't interact in some obvious expected way. And that's just the simple case. If you've played enough video games, you'd know that devs can easily create scenarios where there is no way to get the correct behavior between two objects without making some pretty drastic changes to the game engine. (I play a lot of simulation-centric games.) Basically, the number of ways you can poorly implement objects interacting with one another explodes pretty quickly. So the bar is pretty high for something living in a simulation never to notice irregularities quickly enough for the simulation's runner to fix them, assuming the runner is able to fix them at all.
I think about this a lot, and sometimes wonder if the edges of science can't be solved until some meta-being comes along and implements that edge case. And then the edge cases get weirder and weirder. But really, I'm relying on my intuition about superlinearity when I think about this stuff, and I can see certain problems with simulations going to infinity faster than, say, the infinity of the infinite-time argument that we must be in a simulation.
I'm curious if there is a way that I could phrase a polite request to you to ask if you're a human (that just happened to create your account 30 minutes ago to post this within the same minute) or if this comment was auto-generated.
Quote Investigator traces variants back as far as 1626, though the phrasing evolves over time. A "rocks all the way down" variant dates to 1838, "tortoises all the way down" to 1854.
The version I'd first heard attributed the story to a lecture by Bertrand Russell and an audience Q&A, though it seems clear that that couldn't have been the first instance.
Agreed, though I don't believe that's the context I first heard it.
I've run across the Vonnegut variant more recently. I don't recall where or when I heard the earlier version for the first time, though I suspect it came up in conversation without attribution. Likely sometime ~1980 -- 1999.
That variant may well trace to Vonnegut, though I suspect it had been passed through numerous mouths and ears by the time I heard it.
>Seuss has stated that the titular character Yertle represented Adolf Hitler, with Yertle's despotic rule of the pond and takeover of the surrounding area parallel to Hitler's regime in Germany and invasion of various parts of Europe.[3][4] Though Seuss made a point of not beginning the writing of his stories with a moral in mind, stating that "kids can see a moral coming a mile off", he was not against writing about issues; he said "there's an inherent moral in any story" and remarked that he was "subversive as hell".[5][6] "Yertle the Turtle" has variously been described as "autocratic rule overturned",[7] "a reaction against the fascism of World War II",[8] and "subversive of authoritarian rule".[9]
How could at least two cultures without a communication line between them both come up with such a quirky idea? There must be some underlying truth to it... I'm sold. World turtle is the answer.
Linguistic stuff like this is fun to find; these days I mostly spot it via learning German as a second language, so the artifice in artificial intelligence becomes “Künstliches Intelligenz” where “Kunst” is artist and “Kunststoff” is plastic, and in Middle Low German “kunst” is knowledge and ability.
> coined
Deliberate choice to exemplify your point, or accidental because it’s almost impossible to avoid examples like this in modern English?
Something I've heard a few times is that computer "logs" refer to ships' log books, but log books themselves refer to the actual wooden logs that would be thrown off the back of ships to help determine their speed.
Modern submarines still have a "log", which is a pole and sensor that extends outside the hull to measure speed through the water and other important quantities.
Just an interesting connection: in English, "plastic" comes from Greek, via Latin (and Medieval Italian), meaning "to mold". We see this meaning show up in phrases like "neural plasticity," which refers to the brain's capacity to learn, (re)grow, and make new connections (e.g., knowledge and abilities).
> where “Kunst” is artist and “Kunststoff” is plastic, and in Middle Low German “kunst” is knowledge and ability.
Kunst means art, and an artist is a "Künstler" in German. (And Intelligenz is grammatically feminine, so there is no trailing s on "künstlich" in "künstliche Intelligenz".) It's a difficult language.
I think in the case of "to lose one's temper", the obvious match isn't with the technology of the day: the idiom traces to the early medical theory of the four humours, not to metal-working.
Origin of Temper, New Oxford American Dictionary:
Old English temprian 'bring something into the required condition by mixing it with something else', from Latin temperare 'mingle, restrain'.
Sense development was probably influenced by Old French temper 'to temper, moderate'.
The noun originally denoted a proportionate mixture of elements or qualities, also the combination of the four bodily humours, believed in medieval times to be the basis of temperament, hence temper (sense 1 of the noun) (late Middle English). Compare with temperament.
So tempered steel is well-balanced steel, just as a tempered person is mild-mannered.
See also the well-tempered clavier. In music, the temperament is an aspect of the tuning system relating to how the dissonances of different notes are balanced.
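The balancing act is easy to put numbers on. A toy calculation, assuming twelve-tone equal temperament (just one tuning system; which temperament Bach actually intended for the WTC is still debated):

    import math

    # Equal temperament makes every semitone the ratio 2**(1/12), so no
    # interval except the octave is acoustically pure.
    tempered_fifth = 2 ** (7 / 12)   # ~1.4983
    just_fifth = 3 / 2               # 1.5 exactly: the "pure" fifth

    # The mismatch in cents (1200 cents per octave), i.e. the small
    # dissonance that this temperament spreads evenly across all keys:
    error = 1200 * math.log2(just_fifth / tempered_fifth)   # ~1.96 cents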
How did the WTC not come to mind? It is one of my favorite works! Maybe because it drives my spouse up the wall, so I listen to it less often than I'd like. My favorite recording is the Ishikawa from OpenGoldberg [0].
Reflecting on my response to "lose one's temper", I can see how a straight-line reading of the etymology (as I proposed) might be misleading if the specific idiom did come (or return) from steel or strings as an enhancement or extension of the original meaning.
I think it is more that as a technology grows, it spreads its terms and contexts to the point of entering pop culture. I'm not sure if the populace uses terms simply because they have heard them before in the same context or due to an actual understanding.
I have to point out that P.G. Wodehouse is often used as an example of this style in recent "literature". I can't even figure out the words to describe the sources of his terms. Wodehouse used terms from anywhere in English-language culture (including French, I think). The odd part about it is that Wodehouse's writings are so old that I find it easy to miss the references.
I don't doubt we do this, but I expect it is no different from my love being as deep as the ocean.
It's hard to think of any concrete examples, but I will set the scene as best I can and recommend listening to some of the 6 or 7 hour audiobooks narrated by Jonathan Cecil.
I just went searching through a bunch of quote lists to try to find examples.
I grabbed some just to show the breadth of metaphors and analogies used. What I think I realized is that some of the best examples are probably descriptions of the scenes.
I'm too young to know but apparently he was pioneering.
I certainly find him funny, with his old-world elocution.
I hope you feel satisfied.
His most famous character is Jeeves, the personal gentleman's gentleman of Bertie Wooster. Askjeeves.com was named for Jeeves. The stories are set in post-Great War England (and the Continent), and Bertie is young and part of the leisure class.
Bertie is over-educated and deep into nightlife, pop culture, and sporting.
The settings are always over-privileged people trying to work out their issues while Jeeves is the observer and advisor.
<snip> This one I thought was good because of the use of "props", "underpinnings", "bird", "orphanage", "payoff".
Bertie Wooster: I was standing on Eden-Roc in Antibes last month, and a girl I know slightly pointed to this fellow diving into the water and asked me if I didn't think that his legs were about the silliest-looking pair of props ever issued to a human being. Well, I agreed that indeed they were and, for perhaps a couple of minutes, I was extraordinarily witty and satirical about this bird's underpinnings. And guess what happened next.
Jeeves: I am agog to learn, sir.
Bertie Wooster: A cyclone is what happened next, Jeeves, emanating from this girl. She started on my own legs, saying that they weren't much to write home about, and then she moved on to dissect my manners, morals, intellect, general physique and method of eating asparagus. By the time she'd finished, the best that could be said about poor old Bertram was that, so far as was known, he hadn't actually burnt down an orphanage.
Jeeves: A most illuminating story, sir.
Bertie Wooster: No, no, no, no, no, Jeeves, Jeeves, you haven't had the payoff yet!
Jeeves: Oh, I'm so sorry, sir! The structure of your tale deceived me, for a moment, into thinking that it was over.
Bertie Wooster: No, no, no, the point is that she was actually engaged to this fellow with the legs. They'd had some minor disagreement the night before, but there they were the following night, dining together, their differences made up and the love light once more in their eyes. And I expect much the same results with my cousin Angela.
Jeeves: I look forward to it with lively anticipation, sir.
<snip>
Jeeves: I hope you won't take it amiss, sir, but I've been giving some attention to what might be called the "amatory entanglements" at Brinkley. It seems to me that drastic measures may be called for.
Bertie Wooster: [sighs audibly] Drastic away, Jeeves. The prospect of being united for life with a woman who talks about "little baby bunnies" fills me with an unnamed dread.
<snip> Note the use of "chip in".
Bertie Wooster: Oh, very well, then. If you're not going to chip in and save a fellow creature, I suppose I can't make you. You're going to look pretty silly, though, when I get old Biffy out of the soup without your assistance.
<snip>
<snip> This one has a few, but it is a good example of using the reaction of a character in a movie to describe oneself.
“I felt most awfully braced. I felt as if the clouds had rolled away and all was as it used to be. I felt like one of those chappies in the novels who calls off the fight with his wife in the last chapter and decides to forget and forgive. I felt I wanted to do all sorts of other things to show Jeeves that I appreciated him.”
― P.G. Wodehouse, My Man Jeeves
<snip> This one is good because he uses Shakespeare.
Bertie Wooster: Well, let me tell you, Mr. Mangelhoffer, that the man that hath no music in himself is fit for... hang on a minute. [goes into the other room, where Jeeves is peeling potatoes] Jeeves, what was it Shakespeare said the man that hadn't music in himself was fit for?
Jeeves: Treasons, stratagems, and spoils, sir.
Bertie Wooster: [returning] Treasons, stratagems, and spoils.
Mr. Mangelhoffer: What?
Bertie Wooster: That's what he's fit for, the man that hath no music in himself.
<snip>
Aunt Dahlia: Oh, Bertie, if magazines had ears, Milady's Boudoir would be up to them in debt. I've got nasty little men in bowler hats knocking at my door.
Thank you very much for finding these! I find it surprising that I remember these quotes from the 90s Hugh Laurie and Stephen Fry television adaptation 'Jeeves and Wooster' - few TV programmes are faithful enough to their original book to include the dialogue verbatim!
That series was my introduction, and it was truly brilliantly done. I consider it must-watch TV. They managed to preserve everything but the exact context of the story. Fry and Laurie are what make that show work; I'm not sure anyone else could pull it off. I have seen other adaptations, B&W movies and such, and the magic is lost. That series is an exception that proves the rule.
I once tried to read him and found it difficult, but the audiobooks ended up being a good listen, and they're available on YouTube.
What the books seem to reveal is just how central Wodehouse is to modern comedy. There are obvious Wodehouse references in Seinfeld, such as the surname VanDelay.
>as metaphors and idioms of mental state through the centuries
After listing a bunch of things from the previous century, that barely stretches a bit further back. I would have been impressed if you had come up with a phrase from medieval tech, or Roman/Greek/Egyptian. Hell, I'd settle for pioneer-days tech to allow for "centuries". Otherwise, it just feels like modern-day analogies.
Prior to the scientific age, most theory of mind was religious and/or philosophical, and models typically ran to mind/body duality (Descartes), ideals and essences (Plato & Aristotle, generally), or "spirit" which had numerous associations, many of them textual, which was itself the great technology of the Axial Age in which that concept emerged.
Otherwise, in the scientific and technical era, you have the brain as computer, as AI, as homunculus (which really doesn't explain much), as mechanism or clockwork, as composed of parts (much as a factory or assembly line, I suppose), and the like.
I was thinking of something more like "tied up in knots" or "wolf in sheep's clothing", things relevant before the 1800s. "Open book" might be a little older, but surely there were phrases older than that.
Abstract algebraic equations are not closer to biology than the steam engine or clockwork watch.
The latter are at least physical. Gradient descent and inference don't resemble the physical mechanisms that drive neurons at all. Floats and integers aren't even capable of representing a continuous voltage potential.
There is zero relation between a "neuron" in neural networks and a real neuron. It's entirely marketing. The field of "AI" has always been quick to market their work as magic rather than be open and honest about the reality, which is why we have been through at least 3 AI winters already.
>trying to understand the human brain as a kind of algorithm was a bit of an artefact of how computer scientists approach biology
I think the advances in neural networks over the past few years have shown that the failure of such an approach was mostly a matter of scale. Trying to reduce the visual system into a few kilobytes of code is of course a fool's errand, but trying to emulate it with ~10^11 parameters is looking much less foolish.
"The brain is a computer" is a stupid analogy if you think of a computer as a scalar or vector machine, but it's much less stupid if you're thinking in tensors.
Especially not with purely RGB cameras. There's a reason why you automatically focus on something that's moving or fluttering. I think DVS cameras would have bridged a huge gap in perception sensing, but unfortunately there wasn't enough demand for them to scale, so most manufacturers dropped them.
As someone who minored in neuroscience, I took away that edge detection is actually quite important to the way your vision works. Google search "center-surround receptive field of retinal ganglion cells". This happens in your eye, before the signal even enters the optic nerve to go to the brain. The brain itself is not detecting the edges; its input already has that information.
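A crude sketch of that center-surround behaviour, using a difference of Gaussians (a standard textbook approximation of the ganglion-cell receptive field; the kernel widths below are arbitrary):

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def center_surround(image, sigma_center=1.0, sigma_surround=3.0):
        # Narrow excitatory center minus wide inhibitory surround:
        # uniform regions cancel out, while edges survive.
        img = image.astype(float)
        return gaussian_filter(img, sigma_center) - gaussian_filter(img, sigma_surround)

    # A step edge: flat regions respond near zero, the edge responds strongly.
    img = np.zeros((32, 32))
    img[:, 16:] = 1.0
    response = center_surround(img)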
I was also struck by the similarity between the way your cochlea (the organ in your ear that picks up sound waves) functions, and the way a Fourier transform works. They both transform the signal into the frequency domain, but your cochlea does it via its mechanical properties rather than by convolving the signal with a bunch of sine waves.
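The "convolving with sine waves" picture takes only a few lines, for comparison (a crude digital stand-in for what the cochlea does mechanically; the test signal is made up):

    import numpy as np

    fs = 8000                     # sample rate in Hz, chosen arbitrarily
    t = np.arange(fs) / fs        # one second of samples
    signal = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 1000 * t)

    def band_energy(signal, freq, fs):
        # Correlate the signal with a sine and a cosine at one frequency,
        # i.e. one "place" along a digital basilar membrane.
        t = np.arange(len(signal)) / fs
        s = np.sum(signal * np.sin(2 * np.pi * freq * t))
        c = np.sum(signal * np.cos(2 * np.pi * freq * t))
        return np.hypot(s, c) / len(signal)

    # Peaks appear at 440 Hz and 1000 Hz, much as specific regions of the
    # basilar membrane resonate with specific frequencies.
    spectrum = [band_energy(signal, f, fs) for f in range(100, 2000, 20)]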
>The truth is that the visual cortex is vast, and still not sufficient to explain how humans visually classify and perceive objects.
There have been experiments which have located exactly the individual neurons and sets of neurons responsible for the first layers of image recognition, e.g., a neuron that fires when a specific spot on a retina is stimulated (a single "pixel"), and neurons that fire to detect lines. This is not theoretical but actual probing of living brains. I'll find the paper(s) later.