> For your proposed example, a cycle detection program executed on a computer with finite memory cannot analyze all programs the computer can execute
That is correct. A cycle detection program that can analyze all input programs with a memory bound of "N bytes" would need more than N bytes of memory to run for some input programs.
For example, the tortoise and hare algorithm needs at most 2*N bytes of memory to analyze any program that runs with a memory limit of N bytes, since it keeps two simulated copies of the program's N-byte state.
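A minimal sketch of what that looks like (my own illustration, not from the thread, with a toy `step`/`is_halted` interface): model the analyzed program as a deterministic step function over its finite state, and run Floyd's tortoise and hare on the trajectory. Only two state snapshots are held at once, which is where the 2*N bound comes from.

```python
# Toy sketch: decide halting for a deterministic program modeled as a step
# function over a finite state space, via Floyd's tortoise-and-hare.
# Convention (an assumption of this sketch): step(s) == s once s is halted.

def halts(step, start, is_halted):
    """step: state -> next state; is_halted: state -> bool."""
    tortoise, hare = start, start
    while True:
        if is_halted(hare):
            return True              # the faster copy reached a halt state
        tortoise = step(tortoise)
        hare = step(step(hare))
        if not is_halted(hare) and tortoise == hare:
            return False             # the trajectory cycles without halting

# A counter that stops at 10: halts.
print(halts(lambda x: x + 1 if x < 10 else x, 0, lambda x: x == 10))  # True
# A pure 7-cycle that never halts.
print(halts(lambda x: (x + 1) % 7, 0, lambda x: False))               # False
```

Because the state space is finite, the two copies must either reach a halt state or meet inside a cycle, so this always terminates.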
> I'd be curious whether this suffices to convince you the halting problem exists for computers in the real world as well. If not, I'd be curious to read your objections.
It does not. That only means that if you, say, want to analyze how a program behaves when run on a computer with 16 GB of state, you need to analyze it on a computer that has more than 16 GB of state.
It would be an incredibly useful analysis if we could discover time-efficient algorithms to accomplish this.
> It does not. That only means that if you, say, want to analyze how a program behaves when run on a computer with 16 GB of state, you need to analyze it on a computer that has more than 16 GB of state.
Then we must differ on what the "halting problem" refers to. I would say that a computer suffers from the halting problem if there is no program it can run that can determine whether arbitrary programs it can run halt or not. You seem to agree that this is the case for a computer with a fixed amount of memory (as you say, a computer with more memory is needed to analyze all programs for such a computer).
I can then only ask what is this thing that is not the halting problem (as I understand it) that you think does not exist for computers in the real world?
> Then we must differ on what the "halting problem" refers to. I would say that a computer suffers from the halting problem if there is no program it can run that can determine whether arbitrary programs it can run halt or not.
Well, even if you define it your way, computers don't necessarily suffer from that problem, do they?
Because you can limit how much memory a program is allowed to use. For example, if you have a computer that can handle 16 GB of state, you can analyze an arbitrary program X on that computer, with the analyzer free to use the entire 16 GB of state.
The analyzer can say: this program X will halt if you give it 8 GB of state to run.
So now you run program X with an operating-system-enforced RAM+disk limit of 8 GB and you'd be quite happy to know that it will always halt.
This would be a very interesting and useful thing to do.
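For concreteness, here is one way (on Linux, using Python's standard `resource` module) to impose such an operating-system-enforced memory limit on a child process. The cap and the demo child are illustrative; a real deployment would point this at the actual program X.

```python
# Sketch: run a child process under an OS-enforced address-space cap, so
# allocations past the cap fail instead of consuming more of the machine.
import resource
import subprocess
import sys

def run_with_memory_cap(argv, limit_bytes):
    """Run argv as a child process with RLIMIT_AS set to limit_bytes."""
    def cap():
        resource.setrlimit(resource.RLIMIT_AS, (limit_bytes, limit_bytes))
    return subprocess.run(argv, preexec_fn=cap, capture_output=True, text=True)

# Demo child: tries to allocate ~2 GB; under a 1 GB cap this fails cleanly.
child = (
    "try:\n"
    "    b = bytearray(2 * 1024**3)\n"
    "    print('fit')\n"
    "except MemoryError:\n"
    "    print('out of memory')\n"
)
result = run_with_memory_cap([sys.executable, "-c", child], 1 * 1024**3)
print(result.stdout.strip())  # expected 'out of memory' under the 1 GB cap
```

Note that `RLIMIT_AS` caps virtual address space, not RAM+disk as described above; it is just the closest readily scriptable stand-in for that kind of enforced limit.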
The analyzer would also always be able to tell if the program crashes because it ran out of memory (and why). So you could tell that 8 GB is not enough to run this program, which would also be very useful to know, since it implies the program would crash in production.
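A toy sketch of such an analyzer (my own illustration, assuming a simplified model where a program is a deterministic step function over byte-string states): under a memory cap the reachable state space is finite, so every run must either halt, repeat a state (and thus loop forever), or try to grow past the cap.

```python
# Toy analyzer: classify a deterministic program as halting, looping, or
# exceeding its memory cap, by tracking every state seen so far.

def analyze(step, start, limit_bytes):
    """step(state) -> next state, or None when the program halts."""
    seen = set()
    state = start
    while True:
        if len(state) > limit_bytes:
            return "out of memory"   # the cap would be exceeded
        if state in seen:
            return "loops forever"   # a repeated state means a cycle
        seen.add(state)
        state = step(state)
        if state is None:
            return "halts"

# A counter that stops at 5: halts well within the cap.
print(analyze(lambda s: None if s == b"5" else b"%d" % (int(s) + 1), b"0", 64))
# A two-state blinker: never halts.
print(analyze(lambda s: b"b" if s == b"a" else b"a", b"a", 64))
```

The `seen` set makes this memory-hungry (it stores every visited state); the tortoise-and-hare approach discussed earlier trades extra time for the 2*N space bound.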
What is not clear to me is the scenario you're imagining:
Let's say your computer has 16 GB of state.
But your analyzer is not able to tell what happens to the program if it needs 16 GB of state or more to run.
But... your computer can't even run programs with more than 16 GB of state!
So... how is not knowing this a problem? Why is this version of the Halting problem more useful than my version?
> I can then only ask what is this thing that is not the halting problem (as I understand it) that you think does not exist for computers in the real world?
The Halting problem is not usually defined the way I define it or the way you seem to have defined it.
It's usually defined assuming that the program being analyzed can run with literally unbounded memory, which is not relevant to most real-world programs, since they cannot use an unbounded amount of memory.