Hacker News | new | past | comments | ask | show | jobs | submit | fjeifisjf's comments

That's a very naive view of computer science. That is the attitude of people who have given up and decided that computers are too big for science now.


You're right: there are plenty of papers focusing on real-world performance. I chose not to capture that nuance because I wasn't sure how to express it succinctly.


How I see it: computer science as an academic field is roughly split between theory and systems. The theory folks tend to evaluate progress through proofs. The systems folks tend to evaluate progress through experiments.


I’ve vouched for this because it brings up an interesting point (albeit one that has proven provocative and has been expressed before). Also, many of your comments are dead.

Big O/algorithmic complexity is interesting in that it tends to abstract away the architecture of the underlying processor. A copy is a copy is a copy. Arithmetic is arithmetic is arithmetic. Regardless of what instructions/processing units the underlying processor has. Regardless of data access. Regardless of parallelization. Regardless of memory complexity. All we use are brutish units of “work” — despite not all “workers” being equal.

It reminds me a bit of scientific management: treating your “workers” as interchangeable units that carry out whatever has been decided to be the most efficient movements and processes, completely disregarding the individual characteristics of each worker. For a humanist example, consider the quirks of individual physiology: differences in the size, length, and composition of muscles, bones, tendons, and so on mean that specific movements and workloads suit some bodies better than others. The same holds for psychology: each brain is its own work-producing part, uniquely suited to, and most efficient at, particular workloads. Yet scientific management ignores all of this in favor of an abstract, averaged “best workstyle,” judged only by external metrics like “time to pick up a box using form X, Y, or Z.”

The same parallel can be drawn to computers: different processors and collections of hardware (essentially different “bodies and brains”) have different quirks and different workloads they perform best at. GPUs, for example, are much more useful for vectorized/parallel workloads where the individual operations are relatively simple (e.g. matrix arithmetic). You can run an iterative matrix multiplication algorithm on a CPU in O(n^3) time, but your data-access costs will be significant. Running a parallel algorithm on a GPU instead, with memory very close to the compute units, you can achieve O(log^2 n) depth (given enough processors).
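To make the point concrete, here is a minimal sketch (not from the thread; the `matmul` helper and loop-order names are illustrative). Both loop orders below perform exactly n^3 multiply-adds, so Big O treats them as identical, yet the "ikj" order walks B and C row by row while "ijk" strides down columns of B. On real hardware with a C-style row-major layout, that access-pattern difference alone can change wall-clock time by a large factor, even though the "units of work" are the same.

```python
import random

def matmul(A, B, order="ijk"):
    """Multiply n x n matrices A and B using the given loop order.

    Both orders do exactly n^3 multiply-adds (identical Big O), but
    "ikj" streams through the rows of B and C, while "ijk" strides
    down the columns of B -- a much less cache-friendly pattern.
    Both accumulate each C[i][j] over k = 0..n-1 in the same order,
    so they produce bit-identical results.
    """
    n = len(A)
    C = [[0.0] * n for _ in range(n)]
    if order == "ijk":
        for i in range(n):
            for j in range(n):
                s = 0.0
                for k in range(n):
                    s += A[i][k] * B[k][j]  # B accessed column-wise
                C[i][j] = s
    else:  # "ikj"
        for i in range(n):
            for k in range(n):
                a = A[i][k]
                row_b, row_c = B[k], C[i]
                for j in range(n):
                    row_c[j] += a * row_b[j]  # B and C accessed row-wise
    return C

n = 64
A = [[random.random() for _ in range(n)] for _ in range(n)]
B = [[random.random() for _ in range(n)] for _ in range(n)]
```

In pure Python the interpreter overhead masks much of the effect, but in C, Rust, or NumPy-backed code the same loop-order swap is a classic large speedup, which is exactly the kind of hardware reality that pure operation-counting abstracts away.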

This is where CS research really shines: not running away from the realities of physical systems, but embracing them.


Obviously yes, rats don’t deserve tumors.


Tech companies have IC promotions.


If you look at age-adjusted percentiles, the 1% income cutoff is much lower than $750K/yr. 30-year-olds making $350K/yr are 1%ers, and by the time they reach 40 or 50 they have substantial investment gains bolstering their compensation.


What are you talking about? Enforcement of law is handled by the DOJ.


(d) just means that enforcement is handled by the Justice Department, as it should be. This new council is for policy.


Many things that are hard to prevent are still dangerous and wrong.


I hope you charge a good fee for your evaluations.


Governments in the US love putting people in jail for inability to pay fines. But mostly for fines under $1,000, and mostly for poor people.


Yes.

