
There is a quite readable paper [0] that explains everything; I will only share a few responses to your concerns.

"Our puny brains" have no problem constructing O(n^100) or similarly-ridiculous problems. Here is an informal survey [1] of several good examples. In my previous message, I indicated that matrix multiplication and CFG parsing are good examples; they are both cubic-time but we have come up with even cheaper algorithms for special cases, like sparse matrices or LR(k) grammars.
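To make the sparse-matrix point concrete, here is a minimal sketch (function names are mine, not from any library): the textbook cubic-time multiply beside a variant that skips zero entries, one of the "cheaper algorithms for special cases" mentioned above.

```python
def matmul_naive(A, B):
    """Textbook cubic-time multiply of two n x n matrices (lists of lists)."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            C[i][j] = sum(A[i][k] * B[k][j] for k in range(n))
    return C

def matmul_sparse(A, B):
    """Skip zero entries: cost scales with the nonzeros, not with n^3."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for k in range(n):
            if A[i][k]:  # the structure of the instance lets us skip work
                for j in range(n):
                    if B[k][j]:
                        C[i][j] += A[i][k] * B[k][j]
    return C
```

On a dense input both do the same work; the sparser the instance, the more the second one wins.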

Whether P equals NP is not only a fundamental question; it's the first of infinitely many questions about the "polynomial hierarchy," or PH, a tower of ever-more-complex classes. And part of why the question is so important is that deciding P=NP is itself bound up with classes further up in PH! P!=NP is the first of an infinite family of very difficult meta-questions about complexity.

"RSA wouldn't break" because PRIMES is already known to be in P, via a famous paper titled "PRIMES is in P", and RSA keys are already sized to have impractical exponents.

What P=NP implies is much more powerful. First, let's consider the actual cost of a poly-time reduction. P is self-low, which means that P with a P oracle is equivalent to P. So if P=NP, then the cost of a poly-time reduction can be rolled in, and the entire problem is still in P.
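The "rolled in" step can be made explicit. Assuming the reduction runs in time O(n^a) and the hypothetical solver in time O(m^b) on inputs of size m:

```latex
% A poly-time reduction composed with a poly-time solver stays polynomial:
% the reduction's output has size at most c_1 n^a, so
T_{\mathrm{total}}(n)
  \;\le\; \underbrace{c_1 n^a}_{\text{reduction}}
  \;+\; \underbrace{c_2\,(c_1 n^a)^b}_{\text{solve the image}}
  \;=\; O(n^{ab}).
```

The exponent grows under composition, but it stays a constant, which is all membership in P requires.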

Now, which problems would be transformable via P=NP? Well, we typically assume that the solution would include a poly-time reduction for 3SAT instances into some P-complete problem. Okay, great. However, we know that 3SAT can't be transformed opaquely; if this transformation exists, it must be able to examine and specialize the structure within each specific 3SAT instance. So we'd get one powerful theorem about logic (specifically, about SAT). But that wouldn't be the end.

We'd also get transformations for the following NP-complete problems, and each transformation would embody a free theorem about the nature of the structure of all instances of the problem ([2] for a bigger list):

* Graph theory: CLIQUE, HAMILTONIAN, K-COLORING (faster compilers!), COVER

* Number theory: TSP (faster packet routing!), KNAPSACK, SUBSET-SUM, PARTITION

* Formal systems: BOUNDED-POST-CORRESPONDENCE (holy shit, bounded Turing-complete analysis in P!?)
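To illustrate how tightly these problems are linked, here's a toy sketch (names are mine): the standard reduction from PARTITION to SUBSET-SUM, with a brute-force solver standing in for the hypothetical fast algorithm that P=NP would supply.

```python
from itertools import combinations

def partition_to_subset_sum(nums):
    """PARTITION(nums) is a yes-instance iff SUBSET-SUM(nums, total/2) is."""
    total = sum(nums)
    if total % 2:
        return nums, None  # odd total: trivially a no-instance
    return nums, total // 2

def subset_sum_bruteforce(nums, target):
    """Exponential-time reference solver (the part P=NP would speed up)."""
    if target is None:
        return False
    return any(sum(c) == target
               for r in range(len(nums) + 1)
               for c in combinations(nums, r))

# [1, 5, 11, 5] splits into {1, 5, 5} and {11}: a yes-instance.
print(subset_sum_bruteforce(*partition_to_subset_sum([1, 5, 11, 5])))  # → True
```

The reduction itself is linear time; all the hardness lives in the solver, which is why a polynomial solver for any one NP-complete problem transfers to the rest.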

That's some serious free theorems! There's no reason to expect that whatever would give us so much stuff for free would be easy to prove. In fact, it's quite the opposite: The fact that any NP-complete problem, if it yielded, would give us deep insight into all of the others, should give us extreme pause before hoping that P=NP.

[0] https://www.scottaaronson.com/papers/pnp.pdf

[1] https://cs.stackexchange.com/questions/87073/what-are-the-ex...

[2] https://en.wikipedia.org/wiki/List_of_NP-complete_problems




Thanks for the explanation.

But, I do have some comments:

1)

As far as I can tell, the fact that PRIMES is in P doesn't mean breaking RSA is. The problem in RSA is integer factorization, for which we do not know whether it is in P. (But we also do not know it to be NP-complete.)
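A quick sketch of that distinction (functions are illustrative, not from a library): primality testing is cheap even for large n, while the obvious factoring method is exponential in the number of digits.

```python
def is_probable_prime(n, bases=(2, 3, 5, 7, 11, 13, 17)):
    """Miller-Rabin: polynomial in the number of digits of n."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17):
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:
        d, r = d // 2, r + 1
    for a in bases:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def factor_trial(n):
    """Trial division: exponential in the number of digits of n."""
    p = 2
    while p * p <= n:
        if n % p == 0:
            return p, n // p
        p += 1
    return n, 1

print(is_probable_prime(1000003))      # → True, fast even for huge n
print(factor_trial(1000003 * 999983))  # → (999983, 1000003), already slow
```

At RSA key sizes the first call still returns instantly while the second would outlive the universe, which is exactly the gap the cryptosystem relies on.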

But here you are actually supporting my point: even in that case we could choose sufficiently large keys.

2)

And yes, the fact that P is self-low does seem to be the best justification we have to give such importance to P vs NP. That's also somewhat related to what I was saying about Turing machines and random access machines having polynomial reductions between them. But that's just because we decided to ignore polynomial reductions.

Continuing my earlier example with even numbers: we can see that multiplying an even number by any number yields an even number. Does that in any way justify ignoring the multiplication altogether?

P with a P oracle really is equivalent to P, but O(n) with an O(n) oracle is not O(n); it's O(n^2).

3)

We have many NP-complete problems with practical algorithms that perform fast on real-world inputs. I don't see how getting a free theorem about K-COLORING would practically mean faster compilers if the polynomial exponent were galactic.
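A sketch of the kind of practical algorithm I mean (names are mine): greedy graph coloring runs in roughly O(V + E) and usually does fine on real interference graphs in register allocation, with no optimality guarantee, even though optimal K-COLORING is NP-complete.

```python
def greedy_coloring(adj):
    """adj: dict vertex -> set of neighbours. Returns vertex -> color.

    A real register allocator would order the vertices more carefully,
    but the principle is the same: linear-time, heuristic, no guarantee.
    """
    color = {}
    for v in adj:
        taken = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in taken:  # smallest color unused by colored neighbours
            c += 1
        color[v] = c
    return color

# A 4-cycle is 2-colorable, and greedy finds a proper 2-coloring here.
cycle = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {2, 0}}
print(greedy_coloring(cycle))  # → {0: 0, 1: 1, 2: 0, 3: 1}
```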


Last reply; the sun's up and work starts soon. I'm leaving breadcrumbs.

On RSA: You're right, but this is only the tip of the iceberg. There's a concept, "Impagliazzo's five worlds," which lays out in fine detail how the various P!=NP and P=NP scenarios impact cryptography. Basically, whether P equals NP could be an easier question than another open question: whether one-way functions even exist. There are practical cryptographic implications.

On multiplication: You are correct that DTIME(n) ⊆ DTIME(n^2). However, P = DTIME(n) ∪ DTIME(n^2) ∪ DTIME(n^3) ∪ …

If you would like to consider that only some small sections of P are tractable, then that's an understandable stance to take. There is, as I've mentioned before, an entire subfield of computer science dedicated to trying to move matrix multiplication from DTIME(n^3) to DTIME(n^2).
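As a concrete taste of that subfield: Strassen's algorithm does an n×n multiply with seven (rather than eight) recursive half-size multiplies, giving O(n^log2(7)) ≈ O(n^2.81). A minimal sketch (assumes n is a power of two; a real implementation would fall back to the naive method for small blocks):

```python
def add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def sub(A, B):
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def split(M):
    """Split an n x n matrix into four n/2 x n/2 quadrants."""
    n = len(M) // 2
    return ([r[:n] for r in M[:n]], [r[n:] for r in M[:n]],
            [r[:n] for r in M[n:]], [r[n:] for r in M[n:]])

def strassen(A, B):
    """Multiply square matrices of power-of-two size with 7 recursive calls."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    a, b, c, d = split(A)
    e, f, g, h = split(B)
    p1 = strassen(a, sub(f, h))
    p2 = strassen(add(a, b), h)
    p3 = strassen(add(c, d), e)
    p4 = strassen(d, sub(g, e))
    p5 = strassen(add(a, d), add(e, h))
    p6 = strassen(sub(b, d), add(g, h))
    p7 = strassen(sub(a, c), add(e, f))
    # Reassemble the quadrants: C11, C12 on top; C21, C22 on the bottom.
    top = [l + r for l, r in zip(add(sub(add(p5, p4), p2), p6), add(p1, p2))]
    bot = [l + r for l, r in zip(add(p3, p4), sub(sub(add(p1, p5), p3), p7))]
    return top + bot
```

The record exponent has since been pushed well below 2.81, and whether it can reach 2 is exactly the open question mentioned above.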

On approximation: Integer programming is simply harder than approximation. That's all. The fact that the approximations are tractable doesn't slake our thirst for the precise answers. Moreover, we don't yet know how BPP and NP relate, so it could be that there are ways to solve NP problems exactly with randomized algorithms. We've found a few hints of this already, but derandomization is a hard process. (We don't even know whether BPP=P, although P ⊆ BPP and BPP is self-low. The holy grail of derandomization would be to collapse BPP to P.)
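One concrete example of that randomized route (a sketch; names are mine): polynomial identity testing. By the Schwartz-Zippel lemma, two distinct low-degree polynomials agree at a uniformly random point only rarely, so a handful of random evaluations distinguishes them with high confidence; no deterministic polynomial-time algorithm is known for the general arithmetic-circuit version, and derandomizing it is a major open problem tied to BPP.

```python
import random

def probably_identical(p, q, trials=20, field=10**9 + 7):
    """Randomized test: do p and q compute the same polynomial mod `field`?

    If the polynomials differ and have degree d, each random evaluation
    catches the difference with probability at least 1 - d/field
    (Schwartz-Zippel), so 20 trials make a false "identical" vanishingly rare.
    """
    for _ in range(trials):
        x = random.randrange(field)
        if p(x) % field != q(x) % field:
            return False  # one disagreement is a proof of difference
    return True

# (x+1)^2 vs x^2 + 2x + 1: the same polynomial in disguise.
print(probably_identical(lambda x: (x + 1) ** 2,
                         lambda x: x * x + 2 * x + 1))  # → True
```

Expanding the expressions symbolically can blow up exponentially, which is why the randomized test is the fast known route.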

Finally, I really encourage you to take a bit of time to read the Aaronson paper I previously linked, because it explains all this in much more detail. You are free to continue to think that the problem is not interesting, but I encourage you to at least get a sense of why it is hard.


I don't understand why you are responding to me in such a condescending tone, but thanks anyway for the discussion.

I'm well aware that P vs NP is a very hard question (I have studied quite some complexity theory, although I am a bit rusty), I'm saying that it does not seem as fundamental (or interesting) to computer science as people believe it to be.



