It certainly puts a ceiling on a career. And I'd argue it probably gave him a pretty rough shelf life. At some point he has to understand what he's doing.
Unless he's so good at selling his services he can consistently find new clients. And if that's the case, he'd probably kill it in sales.
Sales engineers have to be good enough to bluff their way through the layers of hyperbole/minor exaggeration/utter bullshit (delete as applicable) the sales team have spun. Whether their conscience gets involved before the deal closes is a different question.
Not at my work. Around here sales engineers just say "this is a proof of concept, X will be different in the final version". Then, after they close the deal, they hand us a half-implemented feature they developed that none of us had heard about before, and tell us that we need to finish it and include it in the next release.
He may have made a good living, but his customer / employer bought low quality code with lots of tech debt.
That business model only works until customers are sophisticated enough to understand tech debt. In the future, more customers will be less willing to pay the same good wages for low quality code.
Yeah, and the business people could not care less. I am on a team taking in millions of dollars from a Delphi Windows app from 1997. Zero tests, horribly mangled business logic embedded in UI handlers. Maintaining that app is not feasible. I'm rebuilding a modern version of it only because it is embarrassing to demo and is such a UX nightmare that our distributor made us commit to a new app.
I'm not sure it's really a misunderstanding, when in 99% of cases it has turned out that if a problem is in P, then it has a polynomial-time algorithm with a quite small exponent.
It's certainly not true in principle that most problems in P have a small exponent. There is no shortage of graph-related algorithms in P with absolutely absurd exponents, like on the order of 10^100. It is true that practical decision problems in P have small exponents, but that's exactly the point being made, namely that problems in P with large exponents are not practical and hence don't get much attention.
What are examples of natural graph-related problems in P with absurd exponents? I think the reason they don't get attention is not that the algorithms are not practical, but that the problems are not natural. Really, the only examples of such problems I can think of are something like "Finding a clique of size 222222 is in P because we can try all possibilities in n^222222 time".
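To make the n^k point concrete, here is a minimal sketch (my illustration, not something from the thread) of that brute-force approach for a fixed clique size k. It enumerates every k-subset of vertices, so it runs in roughly O(n^k) time: technically polynomial for any constant k, but hopeless long before k gets anywhere near 222222.

    # Brute-force search for a clique of a fixed size k.
    # For constant k this is O(n^k) -- polynomial in n, yet utterly
    # impractical once k is large.
    from itertools import combinations

    def has_clique_of_size(adj, k):
        """adj: dict mapping each vertex to the set of its neighbours."""
        vertices = list(adj)
        for candidate in combinations(vertices, k):        # about n^k subsets
            if all(v in adj[u] for u, v in combinations(candidate, 2)):
                return True
        return False

    # Tiny example: a triangle plus an isolated vertex.
    graph = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}, 3: set()}
    print(has_clique_of_size(graph, 3))  # True
    print(has_clique_of_size(graph, 4))  # False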
As the amount of data grows, exponents that used to be small become large.
In many applications, the data grows at least as quickly as computer performance. If the fastest known algorithm is superlinear, today's computers take longer to solve today's problems than yesterday's computers took to solve yesterday's problems. While O(n^2) time algorithms used to be pretty fast in the 80s, today they are often unusable.
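A back-of-the-envelope sketch of that point (my numbers, purely illustrative): if the input grows 1000x while the hardware gets 1000x faster, a linear algorithm breaks even, but a quadratic one ends up 1000x slower in wall-clock time.

    # Toy scaling arithmetic: data grows 1000x, hardware gets 1000x faster.
    data_growth = 1_000
    hardware_speedup = 1_000
    linear_slowdown = data_growth / hardware_speedup          # 1.0  -> unchanged
    quadratic_slowdown = data_growth ** 2 / hardware_speedup  # 1000.0 -> 1000x slower
    print(linear_slowdown, quadratic_slowdown)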
I completely agree with you. My point was that the P vs NP distinction matters in practice, but of course also the subquadratic vs quadratic time distinction matters a lot.
For binary-encoded input it is in 2-EXPTIME, by trying all graphs of size exponential in the input number and testing all subsets of the given size. It would be surprising if any hardness result for standard complexity classes were known.
I'm not sure what you're referring to here. Ramsey's theorem is constructive enough that you can extract an upper bound of 4^k on the kth diagonal Ramsey number, and the (j,k)th Ramsey number is bounded from above by the max(j,k)th diagonal number.
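For reference, a sketch of where those bounds come from (the standard textbook argument, not the parent's own derivation): the usual inductive proof of Ramsey's theorem gives a recurrence on the two-color Ramsey numbers, which yields a binomial-coefficient bound and hence the 4^k estimate, while monotonicity gives the off-diagonal bound.

    R(s,t) \le R(s-1,t) + R(s,t-1)
      \;\Longrightarrow\;
    R(k,k) \le \binom{2k-2}{k-1} \le 2^{2k-2} \le 4^k,
    \qquad
    R(j,k) \le R(\max(j,k),\,\max(j,k)).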
For any positive integers n, k, m, such that m ≥ n, one can find N with the following property: if we color each of the n-element subsets of S = {1, 2, 3,..., N} with one of k colors, then we can find a subset Y of S with at least m elements, such that all n-element subsets of Y have the same color, and the number of elements of Y is at least the smallest element of Y.
The function mapping (n, k, m) to the smallest such N grows so fast that Peano arithmetic cannot prove it is total.
I read the essay, and my response to "There are, for example, four fundamental forces. Are we going to find another four?" would be:
Maybe not, but we could find out that the model of fundamental forces wasn't really the final answer, and explain the universe much better with some alternative theory. In your comment you assume that science is more or less settled, and all there is to do is to tweak the present understanding a bit and fill some gaps. The same sentiment was shared by many prominent physicists at the end of the 1800s, who were obviously proved wrong by Einstein and others in the 1900s.
I tend to agree that rumors of the end of discovery have been greatly exaggerated; however, in the case of theoretical physics, we have the unsatisfying problem of having several alternative theories, but severely constrained access to the energy/technology required to test them.
From my personal observations, Google indeed hires a lot of generalists.
Source: I've been an SWE in a hardware org there for the past year, and my team is a mix of about 1-2 people with plenty of previous hardware experience and focus, with the rest being generalists without much previous related experience (including me) who can pick things up and resolve them quickly, whatever they are. I've observed quite a similar pattern on other teams in the org as well, with the only exception (to a degree) being a few research teams filled with PhDs.
Specialist and fungible are contradictory. It's true that they don't hire specialists - but there are only 7-8 different software profiles they look for.
1. ML
2. Systems
3. Product
4. Data engineer
5. SRE ....
Are you implying most companies have 37 profiles? Even this article was giving multiple examples of a little web shop that probably has fewer than 7 profiles.
Well, it also hasn't been proved not to be NP-hard, so for all we currently know it could be NP-hard. But you are right that researchers believe that it's not NP-hard.