thisoneisreal's comments | Hacker News

I had the pleasure of working with a handful of Pivots for about 2 years, and I have to say it felt like the closest I ever got to a healthy engineering culture. Delightful people, superb engineers, always focused on working and learning together. I feel really privileged to have worked in that environment.

I had the same experience and also dropped out after my MA. It's pretty sad. One of my professors told me, "You should have been here in the 70s, you would have loved it."

An older CS professor (whose book, I’m guessing, about half of HN posters have read) told me essentially the same thing.

He’s one of the best people to talk to in the department. Kind, passionate and compassionate, interested first and foremost in ideas and people. No ego, doesn’t care about telling anyone he’s smarter than them (he is though), just wants to figure things out together.

The junior faculty can’t afford to be that way.


I agree that this is very important. The flip side is that you will also have entrenched lazies who refuse to keep up with new knowledge, get comfy in their chairs, grow big egos, etc. It's a tradeoff.

You have to give breathing room for creativity to unfold, but the breathing room can also be taken advantage of.

Also, it used to be more accepted to play elite inside baseball: hiring based on prestige, gut feel, and recommendation. Today it's not too different in reality, but we expect more egalitarianism and objectivity, so literature metrics become emphasized. And therefore those must be chased.

Similar to the test-prep grind more broadly. More egalitarianism and accountability lead to tougher competition and more justice, but also less breathing room, more grind, and less time for creative freedom.


What was it like in the 70s that we are now missing?

In the 70s, academia in general was still growing so there were opportunities for many of the people who wanted a career in that field. Now that the field is shrinking due to demographic changes the competition has become much more vicious.

The baby boomers were going to college, ergo colleges and universities were expanding. The Ph.D. from a Tier-N school who didn't catch on there could find a tenure-track position in a Tier-N+M school.

Back in those years, at I suppose a Tier-3 school, I went to some academic ceremony where the professors wore their robes. I was impressed at how spiffy the crimson Harvard robes looked. Somebody more sociologically aware would have thought, Hmmm, there sure are a lot of Harvard Ph.D.s on the faculty here, and considered why.


How was it before then? Surely you can't expect that N PhDs minted by one doctoral advisor will each be able to take an equivalent spot at the same institution as the doctoral advisor. Or did people expect that? Unless the population is growing, the steady state is that one prof can only mint one prof-descendant in their lifetime on average. That means, maybe some can create more, but then some will not have any mentees that ever become professors. It is very basic math, but the emotions and egos seem to make this discussion "complex".

>Unless the population is growing, the steady state is that one prof can only mint one prof-descendant in their lifetime on average. That means, maybe some can create more, but then some will not have any mentees that ever become professors. It is very basic math

Yes, and the US population went from about 130 million in 1940 to 330 million in 2020, while the percent of adults with a college degree went from about 5% to about 40%. There were a few decades of particularly rapid growth.
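A rough back-of-the-envelope sketch of that growth, using the approximate figures above and assuming (simplistically) that faculty headcount scales with the college-educated population over a 40-year career:

  # Back-of-the-envelope sketch of the "one prof mints one prof" argument.
  # Figures are the rough ones from the comment above; treating faculty
  # headcount as proportional to the college-educated population is a big
  # simplification, but it shows the shape of the effect.

  pop_1940, share_1940 = 130e6, 0.05   # ~130M people, ~5% of adults with a degree
  pop_2020, share_2020 = 330e6, 0.40   # ~330M people, ~40% of adults with a degree

  educated_1940 = pop_1940 * share_1940            # ~6.5M
  educated_2020 = pop_2020 * share_2020            # ~132M
  total_growth = educated_2020 / educated_1940     # ~20x over 80 years

  annual_growth = total_growth ** (1 / 80)         # ~3.8% per year
  career_years = 40
  growth_per_career = annual_growth ** career_years  # ~4.5x

  print(f"~{total_growth:.0f}x growth overall, ~{growth_per_career:.1f}x per 40-year career")
  # During the expansion each professor could, on average, place roughly 4-5
  # students in faculty jobs; in a steady state that average drops back to one.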


I think that the American college and university system had previously been expanding slowly. The GI Bill and then the baby boom greatly increased the rate of expansion. Expansion still goes on, but maybe at quite a low rate.

Growth in the percentage of the population going to college. And from a research reputation point of view it's very important to create a lot of mini-mes.

Colleges and universities have, out of necessity, started thinking more like companies. Part of that is often new accounting models. One such way of modeling costs ascribes indirect costs to programs (utilities, building maintenance, etc.). Low-enrollment graduate and doctoral programs look really bad on a balance sheet when you factor in these indirect costs, and they will never look good. In fact they will always lose millions per year under this model. It is frankly an inappropriate budgeting model for colleges to adopt because academic programs are not product lines, but here we are.
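To make that concrete, here is a toy sketch with entirely hypothetical numbers of how allocating campus-wide indirect costs (by space, in this case) can flip a small program from roughly break-even to a loss on paper:

  # Toy example (all numbers hypothetical) of indirect-cost allocation making a
  # small doctoral program look unprofitable even when it covers its direct costs.

  students           = 12
  revenue            = students * 35_000 + 150_000   # tuition plus grants: $570k
  direct_costs       = 520_000                       # faculty time, stipends, supplies

  # Allocate a share of campus overhead (utilities, maintenance, admin) to the
  # program based on the space it occupies.
  program_sqft       = 6_000
  overhead_per_sqft  = 45
  allocated_overhead = program_sqft * overhead_per_sqft   # $270k

  before = revenue - direct_costs                       # +$50k: looks fine
  after  = revenue - direct_costs - allocated_overhead  # -$220k: looks like a money pit

  print(f"Before allocation: {before:+,}  After allocation: {after:+,}")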

This was in fact exactly what he cited, that they had adopted a corporate culture at the expense of the university culture he so loved.

It seems like it's just poor management. I understand they are not product lines, but a university has bills to pay. They have to pay people salaries and benefits, and maintain those buildings, labs, libraries, etc. The money to do that has to come from somewhere, and in hard times the fields least likely to generate revenue to keep the university afloat will see hits. It seems like the university, though, has put itself in hard times by taking on a large amount of debt: https://chicagomaroon.com/43960/news/get-up-to-date-on-the-u.... It seems less malicious and more like risk-taking gone wrong.

It's not that different in the corporate world. Lots of companies make bad bets that then lead to layoffs, but not always in the orgs that actually were part of the bad bet. I've seen many startups take on too much risk, then have to perform layoffs in orgs like marketing, recruiting, sales, HR, etc. even if those orgs weren't responsible for the issues that the company is facing.


Sex, drugs and rock 'n roll. Get it on, man!!

When I first heard Jimi Hendrix's Purple Haze blasted out as I walked in darkness down the hillside to the women's dorms, I realized it was a new age and a good time to be alive! 8-))


Meanwhile, capitalists took control over the world.

Funding.

I am a contract-drafting Em,

The loyalest of lawyers...


You didn't ask me, but I can respond as someone with a similar background. I grew up in a religious household. I spent about a decade from ages 18-28 studying philosophy both academically and casually, then entered a technical field and got reacquainted with science.

Many philosophical problems informed my view of religion, but probably the most profound were Carl Sagan's invisible dragon and the problem that there are so many differing and incompatible religions. Many religious people will freely admit their beliefs have no evidence, and yet from my point of view, if that's true, how can anyone claim their particular religion is correct? Why should I believe in Hinduism instead of Catholicism? I never got a satisfactory answer to that in any of my philosophy classes or reading. (The problem of evil is another strong one, but didn't have as big of an impact on me as the first two.)

As far as science goes, the two main contributors to that were biology and physics (although there are some countervailing forces there: the order of the universe truly does seem miraculous). Stephen Jay Gould's essay "Nonmoral Nature," where he describes parasites that lay their eggs on paralyzed victims, after which the eggs hatch and the larvae eat the victims alive, was probably the first thing I read that had bearing on religion. But if you look at nature generally there is obviously an overwhelming amount of suffering. Kids who die from random genetic mutations, animals that cry out as they are eaten alive. I could never square any of that with God.

As far as physics, what really gets me is the sheer immensity and seeming indifference to human scales. Because of the speed of light, we are basically trapped within our own universe. Space is mostly an enormous empty void, and there is no sign that any other planet would be especially hospitable to our species. On a more mundane level, human beings have been killed in incredibly stupid ways, like the guy who was irradiated to death because of a software bug in the x-ray machine. So you put all of that together and it just doesn't suggest any sort of divine guidance to everything going on around us. (Which isn't to say there aren't counterarguments, but that's the sort of evidence and thought processes I imagine the parent was referring to.)


Not the person who asked either but I appreciate the effort.

My sentiment is very similar.

To the science part I will add that, at least, it has some explanatory power that is useful. Its findings check out and make many areas of our lives better and more comfortable. No religion can come close to the benefits of science, especially when you consider that humanity was actually doing science before it was even called that (in a cruder way, but nonetheless).

Religion is systematically about imposing the morals and superiority of one group upon the others while offering very little in return. It has been the justification for plenty of domination and suffering and that alone should tell you that something is wrong.

If God existed, he would have killed the religious zealots creating the suffering or at the very least prevented their actions.


Thanks for the thoughtful response!

Not necessarily trying to debate or anything—clearly you've put a lot of intellectual effort into this over the years already—but I find one point you made particularly interesting. (Disclaimer: I am a Christian.) Namely, that "religious people will freely admit their beliefs have no evidence." There are some (many?) religions where this is the case, but I honestly don't think Christianity is one of them—the Bible puts a strong emphasis on evidence. For example:

- The gospels themselves are composed of three primary sources as well as a secondary source.

- Jesus made specific prophetic claims (famously, the destruction of the Second Temple in Mark 13:2, or that he would be crucified in Matthew 20:18-19).

- 1 Corinthians 15:6 references more than five hundred eyewitnesses, most of whom were claimed to be still living.

- Acts 17:17 describes Paul as "reasoning" with secular Greek philosophers (instead of merely, say, "moralizing" or "persuading"), although I suppose these discussions may have been more philosophical than empirical given the Greeks' philosophical bent.

- The gospels claim that even the Pharisees did not deny Jesus' miracles, but merely attributed them to malign influence (Mark 3:22) or just decided to kill him (Matthew 12:14).

- Jesus' parable in Luke 16:19–31 implies that for some people, getting more evidence will not actually change their minds, regardless of how persuasive it would be.

Of course one could (and should) argue that an emphasis on historicity is not itself evidence; but I just wanted to point out that Christianity is not one of the religions where you just have to believe blindly. On the contrary, the Bible presents unbelief in the face of evidence as a main obstacle between us and God (cf. Romans 1:18–20).


I think you're hitting on the fact that there are multiple variables that contribute to "quickness." Having digested a lot of background material is definitely part of it, and ties into the posts higher up about e.g. Churchill. It's also one way intelligence can correspond to it, in the sense that more intelligent people have often digested more topics. But there also seem to be people who are less distractible, more tuned in to what is going on, and more able to tie current happenings to their body of knowledge and make a joke or whatever.


A great book in this vein is "Language vs. Reality." The main thesis of the book is that language evolved to support approximate, ad hoc collaboration, and is woefully inadequate for doing the kind of work that e.g. scientists do, which requires incredible specificity and precision (hence the amount of effort devoted to definitions and quantification).


This strikes me as a perfect description of the core problem. Whenever I think about this, what sticks out to me is that other animals do all sorts of things that look like "intelligence," or at least cognition, and they do them totally without language. My cat clearly recognizes objects, assigns them different values ("scary," "tasty," "fun to play with"), interacts with them in some kind of loop, even predicts their behavior to some extent and acts curious about them (it was really fun to watch her try to figure out the construction guys when I had some work done on my house over a period of a few days). These strike me as much more foundational aspects of intelligence than language. Language has of course contributed immeasurably to what makes human cognition and intelligence what they are, but it's almost certainly built on these pre-linguistic foundations. Another very good hint in this direction is all of the non-verbal thinking that humans have done. Einstein has a famous quote about thinking visually and physically, without using language at all. All of these are powerful suggestions that something else is going on, and most likely some aspect of these things is necessary for true intelligence.


I’ve always thought everyone agreed language was a lossy but useful method of compression for sharing inner concepts and ideas. That my conscious thoughts are “in a language” doesn’t mean my reasoning and entire being interacts with the world using language.

I’m only “thinking in language” when I’m practicing compressing my intent into a shareable format. I don’t think about the majority of highly complex interactions I have with the physical world throughout the day.

As a child did you need to be able to explain in language how the physics of a swing works to be able to use it? Did other kids have to explain it to you in detailed language for you to pick up on how to move your body to do complex tasks?

No. In fact exactly because our compression and decompression of language is even more limited as children, we rely more heavily on raw observation and mimicry of actions occurring in reality itself.

The very idea that a language model can recreate everything we do from the lossy and compressed languages we use to share limited descriptions of much more complex intentions and actions is fundamentally flawed and oversimplified.


That thought randomly hits me all the time when I'm taking out the trash or whatever and just happen to look up. That and the fact that the Bootes Void and Phoenix A* exist out there.


That's why the XP book arranges itself into values, principles, and practices. The best line in the book is about how practices without underlying values are dead, while values without practices are wishy-washy abstractions. What he's really advocating for at the highest level is skilled teams, who are given ownership, that are actively defining their own processes, and executing them with discipline to produce well-designed and reliable software. The book is a "grab bag" (very legitimate point) because those are the sorts of techniques that those kinds of teams use.


"Future Shock" by Toffler. It was published in 1970, but the predictions are remarkably accurate and (most of) it could easily have been written today. The core idea is about how the rate of change in industrialized societies is accelerating, plus the implications of that and strategies for dealing with it.

