diffeomorphism's comments

Will we? The hardware you want for AI and the hardware you want for supercomputing seem to have different priorities, e.g. concerning floating point precision.
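A minimal sketch of the precision point, using numpy's float16 as a stand-in for the low-precision formats AI hardware favors (the loop and the 0.001 increment are just illustration):

    import numpy as np

    # AI accelerators push fp16/bf16/fp8 for throughput; most HPC codes
    # assume fp64. A naive running sum shows what low precision costs:
    total16 = np.float16(0.0)
    total64 = np.float64(0.0)
    for _ in range(10_000):
        total16 += np.float16(0.001)
        total64 += np.float64(0.001)

    print(total64)  # ~10.0, as expected
    print(total16)  # 4.0: past that magnitude, +0.001 is below half the
                    # fp16 spacing and the sum stops changing at all

Accelerators tuned for low-precision throughput tend to trade away exactly the fp64 performance that supercomputing workloads rely on.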


Compute is compute, yes. Some older models might have to be rewritten to take advantage of a GPU farm's parallel processing capabilities, but humanity will ultimately benefit even if the current AI boom fizzles.


Any launcher on any OS uses Alt+Space or Win+Space. Weird that you think this is somehow a PowerToys thing.


> retina resolution

That just means 3024x1964. With other laptops you can either go up a step to 4K or down to an OLED at 2880xsomething.
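Concretely, pixel density is the relevant number; a quick Python sketch (the diagonal sizes here are assumptions for illustration):

    import math

    def ppi(w_px, h_px, diag_in):
        # pixels along the diagonal divided by diagonal length in inches
        return math.hypot(w_px, h_px) / diag_in

    print(round(ppi(3024, 1964, 14.2)))  # ~254, e.g. a 14" MacBook Pro panel
    print(round(ppi(3840, 2160, 14.0)))  # ~315 for a 14" 4K panel
    print(round(ppi(2880, 1800, 14.0)))  # ~243 for a common 14" OLED

By that measure the 4K option is noticeably denser and the OLED slightly less dense than the 3024x1964 panel.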


Unfortunately it also requires a software stack that can properly scale everything for such a display. Windows and Linux both have... issues around UI scaling that make this kind of a pain.


As far as I understand, the terminology says "linear" but means compositions of affine maps (with cutoffs like ReLU etc.). Depending on the nonlinearity, that gives you piecewise affine functions or polynomials, which are dense in most classes of interest.

Of course, in practice you don't get arbitrary-degree polynomials (or arbitrarily many affine pieces) but only finitely many, so the approximation might still be quite bad or inefficient.
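A minimal numpy sketch of that density claim: one hidden ReLU layer with knots placed by hand (illustrative, not a trained network):

    import numpy as np

    x = np.linspace(-3, 3, 500)
    target = np.sin(x)

    # Hidden layer: ReLU(x - b) at fixed knots b. Fitting only the output
    # weights by least squares already gives a decent piecewise affine fit.
    knots = np.linspace(-3, 3, 20)
    features = np.maximum(0.0, x[:, None] - knots[None, :])
    features = np.hstack([features, np.ones((x.size, 1))])  # affine bias term
    w, *_ = np.linalg.lstsq(features, target, rcond=None)

    print(np.max(np.abs(features @ w - target)))  # max error ~1e-2 with 20 pieces

Adding knots shrinks the error, which is the "dense" part; the finite-degree caveat above is the same issue in the piecewise affine picture: you only ever get finitely many pieces.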


Basic questions: what does a GDPR request get you? Wouldn't providers like you to switch to them?

Just look at the smartphone market.


It does cause the same problem, but seemingly somewhat less frequently.


No, that was some of the initial speculation, but turned out to be wrong.


No. That is what roundabouts, curved roads, etc. are for. Left turns are generally more problematic because they cross oncoming traffic, so planners avoid them for good reason, and there are many more right turns.


Obvious blockers: manpower, expertise, incompatibilities, ...


Most 32-bit dependencies would not need to be upgraded. The only one I can think of off the top of my head is for the graphics driver.


Which is typically closed source, right?


Only for NVIDIA cards, which are the worst option for a Linux system anyway.


That's a them problem...


This is not limited to CS or LaTeX in any way. Plenty of students spend a lot of time fiddling with Word, PowerPoint, note-taking systems, citation management (which is surprisingly horrible in MS Word), Adobe software, etc.

Obvious reasons:

- Your thesis is a major output of years of work. Of course you want it to look good.

- You might think it superficial, but if the presentation looks bad, many people (subconsciously) interpret this as a lack of care and attention. Just like an email with typos feels unprofessional even if the content is otherwise fine.

- Spending time on tooling feels productive even if, past a certain point, it is not.

- People who are into typesetting now have an excuse to spend time on it.

That said, in my experience people spent a few hours learning "enough" LaTeX several years ago and almost never write any macros. The reason is simple: you work with other people and with different journal templates, so the less custom code, the better.

