Depends on what you mean by "time". Going by algorithmic complexity, it's at least linear, yes. Going by wall-clock time (the more important one for interview code), it's constant.
The time to spawn the threads is inconsequential compared to the time spent waiting for them to exit, for small N. But doubling the number of values to sort also doubles the wall-clock time it takes to spawn those threads. O(N + 10000000) is still O(N).
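For reference, a minimal sleepsort sketch in Python (the 0.1-second scale factor is an arbitrary choice to keep the demo short). The sequential spawn loop is the O(N) part being discussed; the wait afterwards depends on the largest value, not on N:

```python
import threading
import time

def sleep_sort(nums):
    """Sleep sort: one thread per value; each sleeps proportionally
    to its value, then appends it to a shared result list.
    The spawn loop alone is O(N); the wait is O(max(nums))."""
    result = []
    lock = threading.Lock()

    def worker(n):
        time.sleep(n * 0.1)  # arbitrary scale factor
        with lock:
            result.append(n)

    threads = [threading.Thread(target=worker, args=(n,)) for n in nums]
    for t in threads:   # sequential loop over all inputs: already O(N)
        t.start()
    for t in threads:
        t.join()        # total wall-clock wait is roughly max(nums) * 0.1 s
    return result

print(sleep_sort([3, 1, 2]))  # usually [1, 2, 3]; values that are close
                              # together can be misordered by scheduler jitter
```

Note that correctness here rests entirely on the OS timer resolution, which is exactly where the "hidden" sorting work lives.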
Not to mention that handling N threads would likely cost the OS some kind of non-linear time, at least N^2 if I had to take a shot in the dark. But I imagine it depends on the OS and how the hardware handles interrupts. Even a sleeping thread takes resources to schedule.
You're forgetting that algorithmic complexity deals with asymptotic behavior.
An algorithm that takes N inputs, does N operations, and then sleeps for 10 seconds is not constant time, because in the asymptotic case the N operations will take longer than the sleep.
It's constant with regards to the length of the input array, which is the usual metric for sort algorithm performance. That is, d(runtime)/d(len(input)) can be constant, even though d(runtime)/d(max(input)) is not, if you assume that len(input) is independent from max(input) (which is inconsistent with input having uniformly distributed random members, which is probably the more common assumption).
I don't see how it is. Different elements, even in the same number, give a different runtime. There's no constant time across [0, 1] and [0, 10], both of which have two elements.
The main "value" in the domain of sorting problems is the number of elements in your collection.
A subclass of the problem also treats the elements as integers; then they become another "value" domain. (But in general, sorting problems only need their elements to be comparable, not necessarily integers.)
I don't think that sleepsort is any more constant time than any other sorting algorithm. For sleepsort to be constant time, you must
- have an upper bound on the input (the largest number)
- not count certain arbitrary things, like reading and processing the input, or spawning threads
But if you are allowed to do these things, all the other sorts become constant time too. (I believe either one alone is enough.)
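To make the first concession concrete: once you fix an upper bound k on the values, counting sort runs in O(N + k), and if you additionally stop counting the pass over the input (as the sleepsort argument does), all that remains is the O(k) output pass, a constant. A quick sketch:

```python
def counting_sort(nums, k):
    """Counting sort for integers in [0, k): O(N + k) time.
    With a fixed bound k, that's O(N); if you also stop counting
    the pass over the input (as the sleepsort argument does),
    what's left is O(k), i.e. constant."""
    counts = [0] * k
    for n in nums:        # O(N): the pass sleepsort's defenders don't count
        counts[n] += 1
    out = []
    for v in range(k):    # O(k): constant once k is fixed
        out.extend([v] * counts[v])
    return out

print(counting_sort([3, 1, 2, 1], 4))  # [1, 1, 2, 3]
```

So the same accounting tricks that make sleepsort "constant" make a perfectly ordinary linear-time sort constant as well.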
Well, they were indeed sleeping through the interview. Otherwise I don't know how one could possibly claim that a program whose first statement is a sequential loop over all values of the input has "constant time" asymptotic complexity.
As with everything in algorithm analysis, it depends on your model of computation. That is, it depends what you're counting.
For comparison sorts, one fairly typical model is to count only the comparisons you do. By that measure, this does none (kind of, not really: they're just implicit, hidden in the OS).
It's just playing around with computational models, not a serious proposal. It's either a joke, a troll, or a parable about the need to be careful about what metric you're measuring. Or some combination of those.