
I have worked at three universities. This is pretty insanely accurate.

You left out one item though. The main role US institutions are trying to fill right now is: hire new faculty, have them pull in $5+ million in grants in order to get tenure. After tenure, pull in multi-million-dollar grants each year.

Teaching? Never heard of it. OH! That thing the actual customers are paying us for... yes, have graduate students or temporary emergency hires do it since all of our "real" faculty are bringing in research dollars.



The grants are even crazier than that.

Most of the actual work is done by trainees who are a) learning how to do <cutting-edge thing> for the first time and b) may never actually do it again, because they're going to move on to something else. The profs are, at best, managing them, and even that's often a bit iffy.

Everybody's obviously got to learn somewhere, but universities and research funders are oddly inconsistent about experience and expertise.


The professor job is an absurd ask:

- Enterprise sales rep with a $500k-$1M annual quota (aka grants).

- R&D manager of a team composed almost entirely of junior employees.

- Lecturer and course administrator for teaching.

- Administrator & volunteer within the department, chair roles for key academic conferences, and editor on major journals -- all to check the "service" box for tenure.

- Because they aren't a proper CEO, they don't even have the ability to push back on obscene G&A costs (i.e., 50%+ of grant contracts going to university overhead!).

It's a labor of love (esp in STEM!), and I feel bad that many are caught in this morass.


As has been previously noted, HN seems to be convinced 50% overhead means that 50% of the grant is taken to cover overhead.

This is inaccurate. That would be 100% overhead, which is vanishingly rare (I've only encountered one institution above that rate, and they have to maintain research ships).

50% overhead means that the indirect costs of a grant are 50% of the direct costs, i.e. one-third of the total grant. Notably, a lot of grant mechanisms (e.g. NIH R01s) are discussed at the direct cost rate, instead of the totals, so this doesn't "sting" nearly as badly as you all seem to think it does.
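The distinction this comment draws can be sketched in a few lines; a minimal illustration (the dollar amounts are hypothetical, the rate convention is as described above):

```python
# Overhead (indirect cost) rates are quoted as a percentage of DIRECT
# costs, not of the total award. This sketch shows why a 50% rate
# means one-third of the total, not half.

def split_award(direct_costs: float, overhead_rate: float):
    """Return (direct, indirect, total) for a grant with the given
    overhead rate expressed as a fraction of direct costs."""
    indirect = direct_costs * overhead_rate
    return direct_costs, indirect, direct_costs + indirect


direct, indirect, total = split_award(1_000_000, 0.50)
print(f"direct:   ${direct:,.0f}")    # $1,000,000
print(f"indirect: ${indirect:,.0f}")  # $500,000
print(f"total:    ${total:,.0f}")     # $1,500,000

# Overhead's share of the TOTAL award at a 50% rate:
print(f"share of total: {indirect / total:.1%}")  # 33.3%

# For overhead to consume half the total award, the quoted rate
# would have to be 100% of direct costs:
print(f"rate implying half the total: {0.50 / (1 - 0.50):.0%}")  # 100%
```

In other words, the "50% of the grant goes to overhead" reading corresponds to a 100% rate, which, as noted, is vanishingly rare.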

Additionally:

- I've seen private industry overhead rates. They're not lower. Indeed, many of them are quite a bit higher.

- A funder can specify a cap on overhead costs, and as long as that's public and known, every university I've ever been at has accepted it. The Gates Foundation is notorious for this, as are most charities. Hell, I had a grant from Merck that didn't pay "full freight".

- We push back on overhead costs all the time. Faculty are wily creatures. We have lots of ways.


I saw a "private grant" project funded at $500k; the budget that hit the floor, where work actually happened, was $25k. The accountants went to absurd lengths to turn that into "$75k spent" in the year-end statements, but it didn't help: the CEO met with the grant givers the next year on the porch of his brand-new house, to explain why the actual goals they'd funded hadn't been realized yet. The people doing the work weren't invited, but we heard the story later about how much blame we took for the failure.


I think the problem, though, is that whatever the actual overhead costs are, the indirects exceed them, while it's the direct costs that cover the important things. So the indirects are being used as profit at what is supposed to be a nonprofit.

That might be fine except then it leads to massive distortion in incentives, so faculty are being driven to bring in money rather than do research per se.

It's kind of like the classic stereotype of a drug company not putting any R&D money into something where it can't control the profit, despite efficacy, only now you're talking about this happening at a nonprofit that's supposed to be committed to research for its own sake.


These rates are negotiated with the government, and aren't exceeding the actual costs. Indeed, someone I know crunched the numbers for my institution, and the "margin" is exceedingly tight, and for some projects, negative.

I am faculty, and while yes, we are incentivized to bring in money, we also have output too. And believe me, when talking to funders, "gee, I didn't do any work because I was bringing in more grants" is a pretty dangerous strategy.

Indirects also cover important things. On my current grants, the following are covered by indirect costs, just as examples.

- The data center and networking infrastructure where my particular nodes on our cluster live.

- My actual office, including things like the facilities folks that keep it clean, lighting and heating, IT infrastructure again, as well as things like paper for the printer because I prefer to read on actual paper.

- The library and journal subscriptions that provide me the material I actually need to read.

- The admins who are helping shepherd a position I'm hiring for through the process of doing that, posting the position descriptions on the relevant sites, etc. The one who put in a bunch of orders for me for an international project where it's cheaper to buy the stuff here, set it up, and then carry it with me than it is to get it there. The one who knows the reporting requirements I don't. The IRB admin who helped us figure out some human subjects issues around doing research that involves university employees in their capacity as employees. The admin that helped me actually prepare and submit the proposal.

- The attorney that helped review and sort out several fairly complex agreements, as well as a whole host of fairly simple ones that still needed to be checked.

- The senior-level administrator (the usual boogeyman in these things) who is spearheading an effort to consolidate several individual-level efforts into something that's coherent at the university level, so we can actually act on things that don't require individual researchers like me to have the bandwidth to keep a relationship alive, and which persists past turnover either at the university or at the organizations we are working with.

- The startup funding for a new disease ecologist before I can put them on some new grants.

All of these help propel my research forward. My indirects pay for my share of these people's time and expertise. And importantly, they help cover things that I can't foresee, or are enough layers of removed from me that I don't even really know who we're paying as much as "This thing gets done".

Is there waste in our indirects? Sure. There's waste everywhere. But it's both hardly the terribly oppressive burden HN seems to think it is (even leaving aside that HN consistently does the math wrong) and something that exists in industry as well - it's just hidden in the total price.

Incidentally, my institution is also perfectly happy to just quote folks a fully burdened price.

What happens when we don't charge full indirect costs?

The taxpayers of my state end up making up the difference, because the lights, and the reporting bits, and the IRB, etc. still have to exist to get the work done. And for some projects one might be able to justify that, but for some others (waves at industry) I have a problem with it.


> (ie 50%+ of grant contracts going to university overhead!).

This is why a lot of grants are made in-kind rather than in cash. Give the research $750K in equipment instead of $1M in cash where the university pockets $500k of it.


As noted elsewhere, that's not how overhead works, and vanishingly few grants are in-kind.

Also, expressly, large capital expenditures like equipment are exempt from indirect costs at most universities.

HN's notion of how universities work is probably about as accurate as my notions of how VC funding works.


In-kind grants are fairly uncommon and generally aren't a "substitute" for cash: salaries, even being what they are, are a fairly major expense for most labs.

In my experience, in-kind grants mostly occur when the "giver" has/gets the "gift" cheaply. For example, NVIDIA gave out tons of graphics cards, but they certainly don't pay full retail for their own product. This lets them give you a grant "worth" $5k, but costs them substantially less. Ditto pharma companies, which may not even sell the thing given.

It seems clever to give the funder a shopping list and then get an "in-kind" grant for exactly what you want. However, many places will waive/reduce indirect costs on equipment anyway (which, per the NIH definition, is anything costing $5,000 or more with a useful life of more than one year).


also

- getting paid below (barely above if you are lucky) 6 figures

- requires 3+ years of postdoc training which paid even worse


Yeah. 6 figures would put me back on ramen.


The successful professors that I've seen outsource everything but number 1 and get a healthy cut of number 5.


I remember (this would have been in the 90s) my father getting crap from his peers because he listed his grad students before himself on research papers.

In his words, "Why would I not? They did most of the work."


That must depend on the field. In my field (operating systems), the primary professor was always listed last; so you knew the first author was the grad student who did the most work, and the last author was the prof who oversaw the whole thing.


This. It depends very much on field. Pretty much every paper I didn't solo write in the past three or four years has had my grad students first (Epidemiology).


I recall the foundation I worked for was getting charged rent by the University as if they were doing us a favor, when they'd given us one of the worst buildings on that side of campus, after they knocked down the actual worst buildings to build a library. Then they took a cut of all grants and a very large chunk of patent and IP royalties.

I don't think the fact that they were making money on either was what surprised me or anyone else, but that they were making money on all of them at the same time.

Apparently that building is still standing, 20 years later, though I don't know who is using it now.


Once you have tenure, what’s stopping you from just getting off the grant treadmill?

A friend of mine got on it after tenure (art school) because he could finance cool projects of his own design that way.

But even in a research university isn’t tenure supposed to protect you from being pushed around by the administration?


Salary mainly. You may keep the job if you're tenured but without the grants, you may not get much in the way of salary. Also, your grad students, postdocs, and resources (lab equipment, computing time, etc) are being paid through grant money so without it you don't have a lab group. Finally, without funding, your department and/or the university can make life exceedingly difficult. Extra teaching, more admin tasks, etc.


It is common for the university to pay a partial salary (9 months) and then the faculty can pay themselves another 3 months from grants if they have the funding. At the University of Michigan the 9-month salary is spread out over 12 months, but you get a nice summer bonus if you have grant money.

The publicly listed salaries are the 9-month salaries, so faculty pay isn't quite as bad as it looks at first glance. If you have funding, most grants also REQUIRE the professor to take salary money from them; otherwise the funder assumes you aren't actually putting in any effort on the funded project.
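The arithmetic of the 9-month arrangement described above can be sketched roughly as follows (the salary figure and the number of grant-funded summer months are hypothetical, and institutions vary in the details):

```python
# A 9-month academic salary is often spread across 12 paychecks, and
# summer months can be charged to grants at the academic-year monthly
# rate (1/9 of the 9-month salary per month).

def faculty_pay(nine_month_salary: float, grant_summer_months: int = 0):
    """Return (regular monthly paycheck, total summer salary) for a
    9-month salary spread over 12 months, plus grant-funded summer."""
    base_monthly = nine_month_salary / 12       # spread over the year
    academic_monthly = nine_month_salary / 9    # rate charged to grants
    summer_total = academic_monthly * grant_summer_months
    return base_monthly, summer_total


base, summer = faculty_pay(90_000, grant_summer_months=3)
print(f"regular paycheck: ${base:,.0f}/month")        # $7,500/month
print(f"summer salary from grants: ${summer:,.0f}")   # $30,000
# The publicly listed $90k understates a fully funded year:
print(f"funded annual total: ${90_000 + summer:,.0f}")  # $120,000
```

So a fully funded faculty member with three summer months on grants takes home a third more than the listed figure, which is the point being made about public salary lists.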


You cannot be fired with tenure, which is nice. But you can still move up and down in status. The grants provide supplemental income. You can do work you love enough to take a lower wage than you probably could get in the private sector. There are benefits.


Important to note that you have a STEM perspective. Nothing's changed in that respect, since that's always been the case in grant fields.

However... that's not at all a good representation of the rest of the university. Maybe the very best research universities don't care about teaching, but there aren't many of those, especially if you're not in a grant field.

> That thing the actual customers are paying us for

If you're in a grant field, the grant agency is the customer, not the students. Teaching is something you do on the side. And not very much compared to non-grant fields.


At my institution, arguably my customers are the People of $State, followed by my grant funders.

Indeed, "The student is the customer" is the source of a lot of really pretty flawed reasoning about how universities can and should work, including things like "Am I obligated to pass you because you really, really want to go to med school?"


> Teaching? Never heard of it.

This view is starting to become outdated. Plenty of schools now have a role known as "Teaching Professor" that focuses on teaching, with no research expectation. They have all the same qualifications as any faculty hired for the role, like a Ph.D. in the field. At UPenn at least they can even get tenure.


Indeed I have a position without a teaching obligation (though I do teach a bit because I want to), and it's vanishingly rare to find one of those in my field outside of a medical school.


They've just added this position to our school called "Research Faculty". They are soft money supported and have no teaching load. Often they will be attached to some research institute.


Soft money supported versions are fairly common - mine's got a substantial hard money component, which is what makes it rare and honestly a little bit wonderful.


What’s soft vs hard money ?


Soft Money: A position where a substantial portion (usually 50%+) of one's salary is expected to be brought in by grants. Failing to do so is, at best, going to impact your salary, and at worst, is a good way to no longer have a job.

Hard Money: University-level funds (from the university, the state if a state school, etc.) provide a portion of your salary that doesn't have to be tied to a grant. This often comes with other obligations (i.e. teaching).

Most positions are a mix of the two in some form. Purely soft money positions, or >50% soft money positions, are common at some research institutes, medical schools, etc. Hard money positions that don't come with substantial non-research obligations are very rare.


Thanks.


How does that work now at UPenn?


So this is based on what a colleague told me who left my institution for UPenn. The position is "Teaching Professor" or equivalent where the idea is you spend most of your time teaching, with zero expectation of research, although you might want to engage in research about teaching. That kind of introspection comes naturally. Usually this role is without tenure, but universities are trying to make this position more attractive and so some, like UPenn, have attached the potential promise of tenure to the role (although I'm not familiar with the specifics of the promotion process there).


Thanks, this is a helpful explanation since I'm considering a UPenn degree programme at the moment.


There are schools that don't allow grad students to teach.


Primarily at what are known as PUIs (primarily undergraduate institutions): colleges and universities that either don't have grad students or have such a tiny graduate program that relying on them for teaching roles is unfeasible. In theory this is better for the students because the professors just care about teaching, but the downside is that such institutions do little to no research, so the professors may be decades out of touch with current developments in their fields.


Not doing research doesn’t mean they are decades out of touch. What gives you that conclusion?


I am (temporarily) a prof at such an institution. Most profs in the STEM depts are getting more and more out of date with each passing year. There is no external incentive to learn the latest technical developments, and no time, because the teaching load is 5 courses/year, unlike the 2-3/year at research institutions.


If you are doing research you have to be up to date or you don't get things published. I'm sure there are some professors who don't do research who make it a point of personal pride or something to keep themselves updated, but there's zero pressure to do so.


No, there is tremendous pressure to be up to date with your teaching because if you’re not, then your students don’t get accepted to highly ranked graduate programs, and that is a significant factor in undergrad college rankings.

> If you are doing research you have to be up to date or you don't get things published.

Only within the boundaries of your particular field of study, which in most cases is far narrower than the scope of undergrad classes.


I personally think the truth is somewhere in the middle, but do you see how this explanation is based on a very loose relationship between action and evaluation?


That’s a shame, because sometimes graduate students are better teachers than research faculty.


You can also go to a school where professors are employed for their ability to teach.

Choosing to go to a research institution seems like such an obviously poor choice that it's strange people get suckered into it. But it's clear education is a low priority for both students and colleges.


Not really. Small liberal arts colleges offer the chance to be friends with your professors and small classes. Research universities can offer classes that just aren't going to happen at a SLAC. For example, majors in our department have the option to take graduate courses before they graduate.


Many small to midsized universities with masters programs aren’t research institutions.

Higher education is very competitive with a wide range of business models. One of them is to have tight collaboration with industry to have undergrad and masters programs around Mining, Manufacturing, Medicine, etc focused on employability. Some are focused on night classes for government employees who get an automatic pay raise with a higher degree irrespective of what it’s in. Others are focused on continuing education requirements.

The high prestige research institutions can be a good fit, but aren’t the only option outside of small liberal arts schools.


> Choosing to go to a research institution seems like such an obviously poor choice it’s strange people get suckered into it.

The best educations happen in research labs. Undergraduates should get a good foundation years 1 and 2, and then find their way as a junior research assistant years 3 and 4 if they want to make the best of their University years. These days you can follow along with any course you want on YouTube. At University you are paying to get access to top researchers and the work they do.


I find people from research institutions are significantly more likely to misunderstand the fundamentals. Statistics for example is based on very specific assumptions and it’s really important to understand what they are to avoid making huge mistakes.

There’s this unfortunate idea that tests are there for the school rather than the student. Avoiding misunderstanding is critical and getting something wrong is one way to discover them. But that’s really inefficient, the basic question and answers with a professor are why your not simply reading from a textbook or watching a YouTube video.

That said, if you went to a horrible collage I can see why you would think YouTube was just as good and doing research was the real secret to learning.


Strongly disagree with this for CS. At top universities, you’re paying for access to similarly interested/capable students and top internship programs. Having done undergrad research, it was both underwhelming and worse experience than my friends who did internships at FAANG.


I think GP's perspective is not wrong, but assumes (or maybe hopes) that students are at least considering graduate+ level education. Like you suggest, I think this is mostly not true of undergrad CS students. If you have no intention of going to grad school, industry internships are a much better use of time outside of class. As a bonus, the internship is a lot more likely to pay enough that you can actually pay bills while in school.


> (or maybe hopes) that students are at least considering graduate+ level education.

I'm not necessarily assuming that. I think there seems to be an assumption here that the choice is research or internship, but not both. That is not the case. Students do research with me during a typical semester, and when they are done the school year, they often go and get industry internships over the summer at top companies around the world. We train them to pass the Google-style whiteboard coding interviews; our students are very prepared for those. I don't know of any students who do internships during the school year. Most of my students who do research with me do not go on to do graduate studies.


I guess I'm just saying there's an opportunity cost to doing research in undergrad. At the school I attended, it was pretty common for students to have part-time internships during the semester. These were not Google-tier companies, but rather local companies that liked hiring from that particular school. The time expectation (and pay) made it a good alternative to a campus job. That's what I did, and I got a fairly good offer from one of the places where I interned. Once I got involved in hiring there, I found that the number one criterion for new grads was quantity/quality of internships. Research was irrelevant unless it happened to be directly relevant to the business, and even then was probably no better than having one more solid internship on the resume.

Maybe if you're the kind of undergrad that has a good shot at Google straight out of college, this doesn't make as much sense. I guess if you do two summer internships at FAANGs, you already look really good and may as well diversify your experience. My guess is that does not apply to the vast majority of CS majors who are just in it to get a decent job.

I'll have to defer to your experience on undergrad research though. What do you see as the value-add of doing research for a student who has no intention of continuing school after undergrad?


They may be better at teaching textbook content. But if someone wants to learn how to do research, they should learn it from people who actually do research every day.



