Hacker News
Limits of computation (wikipedia.org)
121 points by diziet on April 28, 2017 | hide | past | favorite | 18 comments


>Landauer's principle defines a lower theoretical limit for energy consumption: kT ln 2 joules consumed per irreversible state change, where k is the Boltzmann constant and T is the operating temperature of the computer.[2] Reversible computing is not subject to this lower bound. T cannot, even in theory, be made lower than 3 kelvins, the approximate temperature of the cosmic microwave background radiation, without spending more energy on cooling than is saved in computation.

Well, technically, you just have to wait as the universe continues expanding if you want the temperature of the CMB to drop.
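For scale, a quick back-of-envelope check of the per-bit cost at the CMB temperature (assuming T ≈ 2.725 K, a commonly quoted value):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T_cmb = 2.725        # approximate CMB temperature, K

# Landauer limit: minimum energy dissipated per irreversible bit
# erasure is k * T * ln(2)
e_min = k_B * T_cmb * math.log(2)
print(f"{e_min:.3e} J per bit erased at the CMB temperature")
```

That's on the order of 10^-23 joules per erased bit, roughly a hundred times below the same limit at room temperature.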



There are noncomputable functions. For example: https://en.wikipedia.org/wiki/Busy_beaver


If anyone is interested, I attempted to plot the first few Busy Beaver iterations: https://github.com/cslarsen/busy-beaver


>Busy beaver machines are, by definition, non-halting machines

Is this an error? I thought busy beaver must halt?


Thanks, I see now that the whole section is very clumsily written.

IIRC, a candidate Busy Beaver machine may halt or not. You only care about those that _do_ halt, though, and the _champion_ is the one that leaves the greatest number of 1s on the tape. The whole problem is determining whether a given machine halts. By the halting theorem, we know this can sometimes be proven on a case-by-case basis, but not in general (i.e., you cannot create an algorithm that decides it for every machine).

So you resort to heuristics. E.g., if a machine's transition table never transitions to the halt state, you know it can't ever halt. But as the machines grow, you'd need an unbounded supply of such heuristics, hence the uncomputable nature of the function.
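A minimal sketch of that "never transitions to the halt state" heuristic (the encoding here is my own, hypothetical one; a real Busy Beaver search needs many more filters than this):

```python
# Transition table maps (state, symbol) -> (write, move, next_state).
# 'H' is the halt state.
def can_ever_halt(table):
    """Cheap heuristic: if no transition enters the halt state, the
    machine provably never halts. (The converse is undecidable.)"""
    return any(nxt == 'H' for (_, _, nxt) in table.values())

# The 2-state busy beaver champion (halts, leaving four 1s on the tape):
bb2 = {
    ('A', 0): (1, 'R', 'B'), ('A', 1): (1, 'L', 'B'),
    ('B', 0): (1, 'L', 'A'), ('B', 1): (1, 'R', 'H'),
}

# A trivial machine with no halt transition at all:
loop = {('A', 0): (0, 'R', 'A'), ('A', 1): (1, 'R', 'A')}
```

The heuristic passes `bb2` (which does halt) and rejects `loop`; machines it can't rule out still have to be run or attacked with stronger arguments.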


About reversible computation and the Landauer limit: http://www.nipslab.org/node/5097


I think the key paragraph from the paper is this:

> We would like to emphasize that the output of our logic gate can be fed directly as an input to a second similar device (as shown below) and so on, to perform all the calculations desired. Clearly, our logic gate is a combinational device; thus, the removal of the inputs makes the system revert to the initial state S0 (regardless of whether the final state was S0 or S1). If we want to remember the final state, we need to couple this device to a sequential device where a Landauer reset might be required and a minimum dissipation of kBT log 2 is needed.

In other words, the Landauer limit applies to state machines, but not to combinational logic blocks. I guess in retrospect this should not be surprising: the derivation of the principle considers a computing machine which is in thermal equilibrium with its environment, so implicitly we model the machine as having some memory unit which has time to thermalize after each operation.
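The standard illustration of sidestepping Landauer's per-erasure cost is reversible logic. A toy sketch (not from the paper, just the textbook Toffoli gate):

```python
# The Toffoli (CCNOT) gate is reversible and universal for classical
# logic: no input information is destroyed, so no bit erasure is forced.
def toffoli(a, b, c):
    """Flip c iff a and b are both 1; a and b pass through unchanged."""
    return a, b, c ^ (a & b)

# Reversibility: applying the gate twice restores the original bits.
for bits in [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]:
    assert toffoli(*toffoli(*bits)) == bits

# AND via Toffoli: with c = 0, the third output is (a AND b).
a, b, _ = 1, 1, 0
assert toffoli(a, b, 0)[2] == (a & b)
```

The catch, as the quoted paragraph says, is that *remembering* a result in a sequential device eventually forces a reset, and that's where the kT ln 2 bill comes due.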


I found out about Landauer's principle last week and am fascinated by the mental shift it requires.


Fascinating analysis! The section "Building devices that approach physical limits" reminds me of a rock Kurzweil mentions in "The Singularity is Near" [1]. Although it's not at a cosmic scale, it's a similarly interesting exploration of the limits of computation!

>“How Smart Is a Rock? To appreciate the feasibility of computing with no energy and no heat, consider the computation that takes place in an ordinary rock. Although it may appear that nothing much is going on inside a rock, the approximately 10^25 (ten trillion trillion) atoms in a kilogram of matter are actually extremely active. Despite the apparent solidity of the object, the atoms are all in motion, sharing electrons back and forth, changing particle spins, and generating rapidly moving electromagnetic fields. All of this activity represents computation, even if not very meaningfully organized. We’ve already shown that atoms can store information at a density of greater than one bit per atom, such as in computing systems built from nuclear magnetic-resonance devices. University of Oklahoma researchers stored 1,024 bits in the magnetic interactions of the protons of a single molecule containing nineteen hydrogen atoms. Thus, the state of the rock at any one moment represents at least 10^27 bits of memory.”

[1] https://www.goodreads.com/quotes/1284270-how-smart-is-a-rock...
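Out of curiosity, a quick sanity check of the quote's arithmetic (my own back-of-envelope, not from the book), combining the ~10^25 atoms per kilogram with the cited NMR result of 1,024 bits across nineteen hydrogen atoms:

```python
# Rough bit budget implied by the quote's own numbers
bits_per_atom = 1024 / 19          # ~54 bits per atom from the NMR result
rock_bits = 1e25 * bits_per_atom   # ~5 x 10^26 bits in a 1 kg rock
print(f"~{rock_bits:.1e} bits")
```

That lands around 5 × 10^26, the same order of magnitude as the "at least 10^27 bits" claim.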


That seems to deeply confuse "a thing that can be modeled by computer" with "a computer." It's trying to commensurate the incommensurable. How much more efficient is a rock than your phone when it comes to computing the sum of two 64 bit integers? How much more efficient when it comes to storing and retrieving a megabit of JPEG data? I'd accept answers in the form of joules per operation.

The very questions are faulty. A factory-stock rock is not a programmable logic or memory device. If it computes at all, it only computes being-a-rock.

This rock analogy is the sort of sloppy thinking that takes someone from "it takes at least 38 petaflops to simulate a conscious human brain" (currently unknown, but not obviously incorrect) to "the human brain is a petaflop-class supercomputer" (WTF, no, category error).


This interesting theoretical tangent raises a good question about the way we measure information.

A laptop only computes being-a-laptop. What makes it useful is that we have assigned meaning to the physical output. That is, there is a homomorphism mapping the physical state of the computer to a logical program.

So any physical system is only computationally useful to the extent that we can define homomorphisms from its physical state to a logical system of interest.

The OP is about the physical limitations of the information stored in the underlying system as measured by entropy. I wonder if entropy is the operating definition here because it implicitly defines what logical system we use to identify the physical system.


It's basically saying that the universe is a special kind of computer that computes state transitions according to the rules of physics, presumably at the Planck scale, which is far beyond what our simplified models of said rules can manage.

This view of the universe seems to be fairly important and taken seriously, since black hole thermodynamics, gravity as an entropic force, and similar concepts are derived from it.


The digital physics everything-is-computation view of the universe is also not obviously wrong; it is taken seriously, as you say. But even if the universe is computational there's presently no example of running a human-defined instruction sequence at that fundamental level.

The quote I was reacting to emphasizes feasibility:

"To appreciate the feasibility of computing with no energy and no heat, consider the computation that takes place in an ordinary rock."

There's precious little to suggest feasible computing with a rock. (Or to suggest other feasible ways to run computations of interest to humans with no input energy and no waste.) It's like reading "to appreciate the feasibility of supplying civilization's electricity sustainably, consider the energy released when two neutron stars collide." Both are strong contenders for "among the least feasible engineering programs not yet proven outright impossible."


This thread might not be particularly relevant for you. The parent link is discussing theoretical computational capacity at a cosmic level - which "presently has no example of running a human-defined instruction sequence at that level".


https://arxiv.org/abs/1301.4504 — quantum computational complexity prevents Hawking radiation from being decoded before the black hole evaporates!


> A cold degenerate star could conceivably be used as a giant data storage device, [...] purposes.

The link cited for this [1] is a fascinating email thread/essay on what they call "Femtotech". I don't even understand most of the stuff, but it seems we're far, far from it.

[1] https://web.archive.org/web/20041025030505/http://www.cs.usu...


I remember watching some documentaries and reading some articles where Hugo de Garis talked about femtotech. I thought he was a pioneer in the technology, but it seems he's just a bystander.

He and Ben Goertzel present themselves as pioneers but never seem to produce anything tangible.



