Hacker News

Fascinating analysis! The section "Building devices that approach physical limits" reminds me of a rock Kurzweil mentions in "The Singularity is Near" [1]. Although it's not at a cosmic scale, it's a similarly interesting exploration of the limits of computation!

>“How Smart Is a Rock? To appreciate the feasibility of computing with no energy and no heat, consider the computation that takes place in an ordinary rock. Although it may appear that nothing much is going on inside a rock, the approximately 10^25 (ten trillion trillion) atoms in a kilogram of matter are actually extremely active. Despite the apparent solidity of the object, the atoms are all in motion, sharing electrons back and forth, changing particle spins, and generating rapidly moving electromagnetic fields. All of this activity represents computation, even if not very meaningfully organized. We’ve already shown that atoms can store information at a density of greater than one bit per atom, such as in computing systems built from nuclear magnetic-resonance devices. University of Oklahoma researchers stored 1,024 bits in the magnetic interactions of the protons of a single molecule containing nineteen hydrogen atoms. Thus, the state of the rock at any one moment represents at least 10^27 bits of memory.”
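The ~10^25 atoms-per-kilogram figure in the quote is easy to sanity-check. A rough sketch, assuming the rock is pure quartz (SiO2) — an illustrative composition the quote doesn't actually specify:

```python
# Order-of-magnitude check of the "~10^25 atoms in a kilogram" claim,
# assuming a quartz (SiO2) rock. The composition is an assumption.
AVOGADRO = 6.022e23          # atoms (formula units) per mole
MOLAR_MASS_SIO2 = 60.08      # grams per mole of SiO2
ATOMS_PER_FORMULA_UNIT = 3   # one Si plus two O per SiO2

moles_per_kg = 1000.0 / MOLAR_MASS_SIO2
atoms_per_kg = moles_per_kg * ATOMS_PER_FORMULA_UNIT * AVOGADRO
print(f"{atoms_per_kg:.1e} atoms per kg")  # ~3e25, consistent with ~10^25
```

So the atom count is in the right ballpark; whether that count translates into usable memory is exactly what's disputed below.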

[1] https://www.goodreads.com/quotes/1284270-how-smart-is-a-rock...



That seems to deeply confuse "a thing that can be modeled by computer" with "a computer." It's trying to commensurate the incommensurable. How much more efficient is a rock than your phone when it comes to computing the sum of two 64 bit integers? How much more efficient when it comes to storing and retrieving a megabit of JPEG data? I'd accept answers in the form of joules per operation.
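For a floor on "joules per operation," Landauer's principle gives the minimum energy to erase one bit at temperature T: E = kT ln 2. A quick sketch, assuming room temperature (the 300 K is my choice, not from the thread):

```python
import math

BOLTZMANN = 1.380649e-23  # J/K, exact by the 2019 SI definition
T = 300.0                 # assumed room temperature in kelvin

landauer_j_per_bit = BOLTZMANN * T * math.log(2)
print(f"{landauer_j_per_bit:.2e} J per irreversible bit erasure")
# roughly 2.9e-21 J -- any real device, rock or phone, sits far above this floor
```

That's the theoretical minimum for irreversible computing, not an efficiency anyone has achieved; it just gives the question "joules per operation" a concrete reference point.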

The very questions are faulty. A factory-stock rock is not a programmable logic or memory device. If it computes at all, it only computes being-a-rock.

This rock analogy is the sort of sloppy thinking that takes someone from "it takes at least 38 petaflops to simulate a conscious human brain" (currently unknown, but not obviously incorrect) to "the human brain is a petaflop-class supercomputer" (WTF, no, category error).


This interesting theoretical tangent raises a good question about the way we measure information.

A laptop only computes being-a-laptop. What makes it useful is that we have assigned meaning to the physical output. That is, there is a homomorphism mapping the computer's physical state to a logical program.

So any physical system is only computationally useful to the extent that we can define homomorphisms from its physical state to a logical system of interest.
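That mapping can be made concrete with a toy model. Here the "physical" states are voltages, the interpretation map is a threshold function, and the homomorphism property is that interpreting-then-computing agrees with computing-then-interpreting. The gate model, the 1.65 V threshold, and all names are invented for illustration:

```python
def to_logic(voltage):
    """Interpretation map: physical voltage -> logical bit (threshold assumed)."""
    return 1 if voltage > 1.65 else 0

def physical_nand(v_a, v_b):
    """Crude analog model of a NAND gate: output swings low only
    when both inputs are above threshold."""
    return 0.2 if (v_a > 1.65 and v_b > 1.65) else 3.3

def logical_nand(a, b):
    return 1 - (a & b)

# Homomorphism check: the diagram commutes for all test inputs.
for v_a in (0.1, 3.2):
    for v_b in (0.1, 3.2):
        assert to_logic(physical_nand(v_a, v_b)) == \
               logical_nand(to_logic(v_a), to_logic(v_b))
```

A rock fails not because it lacks physical state, but because nobody has exhibited such a commuting map from its state transitions to a logical system anyone cares about.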

The OP is about the physical limitations of the information stored in the underlying system as measured by entropy. I wonder if entropy is the operating definition here because it implicitly defines what logical system we use to identify the physical system.


It's basically saying that the universe is a special kind of computer that computes state transitions according to the rules of physics, presumably at the Planck scale, which is far beyond what our simplified models of said rules can manage.

This view of the universe seems to be fairly important and taken seriously, since black hole thermodynamics, gravity as an entropic force, and similar concepts are derived from it.


The digital physics everything-is-computation view of the universe is also not obviously wrong; it is taken seriously, as you say. But even if the universe is computational there's presently no example of running a human-defined instruction sequence at that fundamental level.

The quote I was reacting to emphasizes feasibility:

"To appreciate the feasibility of computing with no energy and no heat, consider the computation that takes place in an ordinary rock."

There's precious little to suggest feasible computing with a rock. (Or to suggest other feasible ways to run computations of interest to humans with no input energy and no waste.) It's like reading "to appreciate the feasibility of supplying civilization's electricity sustainably, consider the energy released when two neutron stars collide." Both are strong contenders for "among the least feasible engineering programs not yet proven outright impossible."


This thread might not be particularly relevant for you. The parent link is discussing theoretical computational capacity at a cosmic level - which "presently has no example of running a human-defined instruction sequence at that level".



