The original poster probably meant to write "sub-exponential". The commenter to whom you're responding may or may not have known what the correct term was; we'll never know, because they chose snarky incivility instead of correction.
> factoring numbers is deterministically achievable in sub polynomial time
Am I missing something? I thought that the whole point of basing the strength of things like RSA on the difficulty of integer factorization was that it's not known to be solvable in polynomial time. Whether P == NP is one of the unsolved problems of computer science.
I think that the state of the art is that we can factor in sub-exponential time.
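For reference, the usual figure quoted for the general number field sieve is heuristic time

    L_N[1/3, c] = exp((c + o(1)) * (ln N)^(1/3) * (ln ln N)^(2/3)),   c = (64/9)^(1/3) ≈ 1.92

which is sub-exponential but still super-polynomial in the bit length of N.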
You may not mean it this way, but for any reader who might be mistaken:
P == NP => integer factorization can be done in polynomial time on a classical computer, but not vice versa: factoring turning out to be in P would not imply P == NP.
In fact, many people believe both that P != NP and that integer factorization can be done in polynomial time, since FACTOR lies in this awkward space: the intersection of NP and co-NP. (Most other problems in this space are also in P, but we think P != NP, so...)
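To make that concrete, here's a minimal sketch of the decision version ("does N have a nontrivial divisor <= k?") in which both answers come with short, efficiently checkable certificates. It uses sympy.isprime as a stand-in for a genuine polynomial-time primality certificate (Pratt/AKS), and sympy being available is an assumption:

    from math import prod
    from sympy import isprime  # stand-in for a real primality certificate

    def verify_yes(N, k, d):
        # YES-certificate: a single nontrivial divisor d <= k; checking it is one division.
        return 2 <= d <= k and d < N and N % d == 0

    def verify_no(N, k, primes):
        # NO-certificate: the complete prime factorization of N (with multiplicity).
        # By unique factorization, every nontrivial divisor of N is divisible by one
        # of these primes, so if they all exceed k, no nontrivial divisor <= k exists.
        if prod(primes) != N or not all(isprime(p) for p in primes):
            return False
        return all(p > k for p in primes)

    # 221 = 13 * 17
    print(verify_no(221, 12, [13, 17]))   # True: no nontrivial divisor <= 12
    print(verify_yes(221, 13, 13))        # True: 13 divides 221

The asymmetry with NP-complete problems is exactly that here the NO instances also have short witnesses.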
Now of course, if you put quantum computers into the mix, Shor's algorithm can break both RSA and ECDSA in polynomial time.
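For a sense of scale (rough, commonly cited figures; exact gate counts depend heavily on which arithmetic circuits you assume):

    classical GNFS:     exp(~1.92 * (ln N)^(1/3) * (ln ln N)^(2/3))   -- sub-exponential in n = log2(N)
    Shor, factoring:    polynomial in n, on the order of n^3 gates with schoolbook modular arithmetic
    Shor, discrete log: similarly polynomial, which is what breaks ECDSA

The obstacle there is fault-tolerant hardware at cryptographic sizes, not the asymptotics.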
By "things like RSA", you pretty just much mean "RSA". A lot of progress was made on integer factorization over the last 35 years, so that key sizes we'd have thought totally reasonable in 1995 are totally inadequate today. That is not similarly true of non-factoring systems, and so it's legit to point out the relative weakness of IFP crypto compared to other approaches.
> key sizes we'd have thought totally reasonable in 1995 are totally inadequate today.
Not really. The last big jump in complexity was in 1991, when the first 512-bit modulus, 2^512+1, was factored by NFS [1]. What has progressed the most, by far, has been the available computational power.
The last big theoretical jump in complexity was NFS, but is your argument that implementations of NFS haven't improved much, apart from the raw computational power of the sort available to general-purpose computers (i.e., non-vectorized instruction sets)?
You know much more than I do about this stuff, but I think I'm prepared to challenge you on this. :)
NFS implementations have certainly improved, and certain sub-algorithms as well, e.g. lattice sieving and polynomial selection. But these things don't significantly tilt the exponent you expect in a factorization. As in, you expected around 2^80 "operations" for RSA-1024 back in 1995, and that's still roughly what you expect today.
The practical NFS improvements since then might bring that down to 2^78 or even 2^72 (and that's a lot of money saved), but when you're choosing a key size it doesn't matter nearly as much as the fact that a 2^80 computation is reasonably practical today and was unthinkable in 1995.
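If anyone wants to see where exponents like that come from, here's a quick back-of-the-envelope script using only the leading GNFS term; the o(1) and real-world constants are exactly the calibration being discussed, so treat the output as ballpark exponents, not operation counts:

    import math

    def gnfs_log2_cost(bits):
        # log2 of the leading GNFS term exp(c * (ln N)^(1/3) * (ln ln N)^(2/3)),
        # with c = (64/9)^(1/3) ~ 1.92, for a modulus N ~ 2^bits.
        ln_n = bits * math.log(2)
        c = (64 / 9) ** (1 / 3)
        return c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3) / math.log(2)

    for bits in (512, 1024, 2048):
        print(f"RSA-{bits}: ~2^{gnfs_log2_cost(bits):.0f}")

This prints roughly 2^64, 2^87, and 2^117; the leading term overshoots the calibrated ~2^80 figure for RSA-1024, and that gap is the practical-improvement territory above.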
I take your point. ECC probably wouldn't cost Nintendo more to implement, but would provide a system that probably isn't being attacked as successfully, hopefully giving their system more time on the market before it's cracked.
It didn't make a huge difference here, I should be clear. I don't like RSA, and my dislike for it comes almost entirely from something orthogonal to key sizes --- public key systems provide one or more of three primitives: direct encryption, signature verification, and key exchange; RSA provides all three, and the direct encryption primitive is a giant foot cannon.
If the original commenter's point was that RSA directly harmed the 3DS in some fundamental way, that would be a legit thing to push back on, I think.
Apologies. I took your typo/mistake, combined with the odd phrasing about different difficulties, to imply that you really thought factorization could be done in sub-polynomial time by some actors (the NSA or something), rather than reading it as a misunderstanding.
I should have phrased my response differently regardless.
> deterministically achievable in sub polynomial time
it's already been stated that 'sub polynomial' was a typo; the original sentence discussed the sub-exponential state of the art, but i changed it to discuss my research instead and failed to remove the 'sub'
'deterministic' can also fairly be called out as redundant alongside 'polynomial time', since it's implied there, but i wanted to be explicit
> my intended inference
inference is defined as 'a conclusion reached on the basis of evidence and reasoning'
this is a phrasing i use in place of faith-based wording (i think, i believe, etc.) to establish a respect for evidence-based conclusions
part of my research is aimed at a polynomial-time algo for integer factorisation, but before i conclude that such a thing exists i require evidence; hence it is the intended inference of my research rather than a direct inference
> 'it's hard for me it must be impossible for you'
this is what you will very often hear from many in the know if you tell them you're interested in a poly-time factorisation algo, even though the problem is open
except in the case of one of the people who fanned all this interest:
Richard Karp (Berkeley, unsure, P =/= NP): My intuitive belief is that P is unequal to NP, but the only supporting arguments I can offer are the failure of all efforts to place specific NP-complete problems in P by constructing polynomial-time algorithms. I believe that the traditional proof techniques will not suffice. Something entirely novel will be required. My hunch is that the problem will be solved by a young researcher who is not encumbered by too much conventional wisdom about how to attack the problem.
I remember that paper. I believe P =/= NP, but man, it would be amazing what you could accomplish if it were! (For example, you could answer in polynomial time whether a proof of a sentence S exists with N symbols or fewer. For sufficiently large N, that's essentially a universal theorem prover for a given theory T! Incredible!)
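Roughly why that follows: the bounded question

    BOUNDED-PROOF_T = { (S, 1^N) : there is a proof of S in theory T using at most N symbols }

is in NP, because a proof of length <= N is itself a certificate that a proof checker can verify in time polynomial in N (the 1^N padding just makes N part of the input size). So P = NP would give a deterministic algorithm deciding it in time polynomial in N, i.e. proof search up to length N stops being exponential.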
what exactly do you mean by this?