Hacker News | orlp's comments

In university? No, absolutely not straight away.

The point of a CS degree is to know the fundamentals of computing, not the latest best practices in programming that abstract the fundamentals.


My university also taught best practices alongside that, every time. I am very grateful for that.

Absolutely the wrong take. You can teach CS with just pencil and paper, but that doesn’t advance the technology, it might only benefit academia in a narrow sense. CS students should be actively engineering software in addition to doing science.

Why do you find it interesting that someone chose something mainstream? Isn't that the definition of mainstream, that it's a common choice?

Anecdotally, I'm a fan and generally pick the rarer ones. E.g. I had a pair of socks and a few stickers made for my kid which had rarer pokemon.

Since OP got a tattoo of Pikachu, it was interesting to me that he picked the most mainstream pokemon (I assumed he was a huge fan).

But from the response, my sample of 1 assumption checks out: "When I was planning it was obvious there'd be a Pokémon due to his (continuing) interest in those and to me as a non-fan that's the most immediately-recognizable one."


In modern usage (e.g. in gaming communities) "carries" has become not only ambitransitive but also a noun.

If something "carries" or is "a carry", it means it is so strong it metaphorically carries the rest of the setup with it. For example:

> This card carries.

> These two are the carries of the team.


> N tokens looking at N tokens is quadratic

Convolving two arrays can be done perfectly accurately in O(n log n), despite every element being combined with every other element.
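A quick sketch of that claim (the arrays and sizes here are just illustrative): the FFT route implicitly combines all pairs of elements, yet runs in O(n log n) and matches the direct O(n^2) convolution up to floating-point rounding.

```python
import numpy as np

# Direct convolution is O(n^2): every element of a meets every element of b.
rng = np.random.default_rng(42)
a = rng.standard_normal(200)
b = rng.standard_normal(300)
direct = np.convolve(a, b)

# FFT-based convolution is O(n log n): transform, multiply pointwise, invert.
m = len(a) + len(b) - 1  # length of the full convolution
fast = np.fft.irfft(np.fft.rfft(a, m) * np.fft.rfft(b, m), m)

print(np.allclose(direct, fast))
```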

Or consider the even more basic sum of products a[i] * b[j] for all possible i, j:

    total = 0
    for i in range(len(a)):
        for j in range(len(b)):
            total += a[i] * b[j]
This can be computed in linear time as sum(a) * sum(b).
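A tiny self-contained check of that identity, with made-up inputs: the nested loop and the factored form agree because sum over all i, j of a[i]*b[j] equals sum(a) * sum(b) by distributivity.

```python
a = [2, -3, 5]
b = [7, 1]

# O(n*m): accumulate every pairwise product.
total = 0
for i in range(len(a)):
    for j in range(len(b)):
        total += a[i] * b[j]

# O(n+m): the same quantity via distributivity.
assert total == sum(a) * sum(b)  # both equal 32
```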

Your logic that 'the result contains terms of all pairs, therefore the algorithm must be quadratic' simply doesn't hold.


One of my favorite bits of my PhD dissertation was factoring an intractable 3-dimensional integral

\iiint f(x, y, z) dx dy dz = \int [\int g(x, y) dx]*[\int h(y, z) dz] dy

which greatly accelerated numerical integration (O(n^2) rather than O(n^3)).
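The integrand and grid below are invented for illustration; they only assume the structure above, f(x, y, z) = g(x, y) * h(y, z), in which case integrating out x and z first reduces the triple sum from O(n^3) to O(n^2) work.

```python
import numpy as np

# Hypothetical separable integrand: f(x,y,z) = g(x,y) * h(y,z)
n = 48
xs = ys = zs = np.linspace(0.0, 1.0, n)
dx = dy = dz = xs[1] - xs[0]

g = np.exp(-(xs[:, None] - ys[None, :]) ** 2)  # g(x, y), shape (n, n)
h = np.cos(ys[:, None] * zs[None, :])          # h(y, z), shape (n, n)

# Naive O(n^3): sum f over the full 3-D grid.
naive = (g[:, :, None] * h[None, :, :]).sum() * dx * dy * dz

# Factored O(n^2): inner 1-D integrals over x and z, then one over y.
gx = g.sum(axis=0) * dx  # \int g(x, y) dx for each y
hz = h.sum(axis=1) * dz  # \int h(y, z) dz for each y
factored = (gx * hz).sum() * dy

print(np.allclose(naive, factored))
```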

My advisor was not particularly impressed and objectively I could have skipped it and let the simulations take a bit longer (quite a bit longer--this integration was done millions of times for different function parameters in an inner loop). But it was clever and all mine and I was proud of it.


That's like saying sorting can be done in O(n) because radix sort exists. If you assume some structure, you lose generality, i.e. there'll be some problems it's no longer able to solve. It can no longer approximate any arbitrary function that needs perfect memory over the sequence.


This brings me back to DSP class. Man, learning about the FFT was eye-opening.


Convolution is a local operation.

Attention is a global operation.


An interesting follow-up question is, what is the smallest number unable to be encoded in 64 bits of binary lambda calculus?


BLC can output any literal 60 bit string x as the 64-bit (delimited) program 0010 x, so in that sense it would be some 61 bit number. But if we ask about just lambda calculus terms without the binary input, then I think it would be some small number of at most 10 bits. BBλ looks at the normal form size so it cannot even reach the numbers 0, 1, 2, 3, and 5.


Bloom filters also become full.

As it fills up, the false positive rate goes up. Once the false positive rate reaches the threshold of unacceptability, the Bloom filter is full, and you can no longer insert into it.

That most interfaces still let you do something that looks like an insert is an interface failure, not a Bloom filter feature.

If you find this controversial and want to reply "I don't have a threshold of unacceptability", I'll counter that a false positive rate of 100% will be reached eventually. And if you still find that acceptable, you can trivially modify any probabilistic filter to "never become full" by replacing the "is full" error condition with setting a flag that all future queries should return a false positive.
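A toy sketch of the "filling up" effect (the bit count M, hash count K, and hashing scheme are invented for illustration, not a production design): as more items go in, queries for never-inserted items return true more and more often.

```python
import hashlib

M = 1024  # bits in the filter
K = 4     # hash functions per item

def hashes(item: str):
    # Derive K bit positions from SHA-256 of "i:item".
    for i in range(K):
        h = hashlib.sha256(f"{i}:{item}".encode()).digest()
        yield int.from_bytes(h[:8], "big") % M

bits = [False] * M

def insert(item: str):
    for pos in hashes(item):
        bits[pos] = True

def query(item: str) -> bool:
    return all(bits[pos] for pos in hashes(item))

def fp_rate(trials: int = 1000) -> float:
    # Probe items that were never inserted; any hit is a false positive.
    return sum(query(f"absent-{i}") for i in range(trials)) / trials

rates = []
for n in (50, 200, 800):
    for i in range(n):
        insert(f"present-{n}-{i}")
    rates.append(fp_rate())
    print(f"after {n} more inserts: false positive rate {rates[-1]:.3f}")
```

With these numbers the rate climbs from near zero toward 1 as the filter saturates, which is the "full" condition described above.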


Same author.


> Long-running projects that converge on high-quality, complex projects

In my experience agents don't converge on anything. They diverge into low-quality monstrosities which at some point become entirely unusable.


Yeah, I don't think they're built for that either; you need a human to steer the convergence, otherwise they indeed end up building monstrosities.


I work in a 400k+ LOC codebase in Rust for my day job. Besides compile times being suboptimal, Rust makes working in a large codebase a breeze with good tooling and strong typechecking.

I almost never even think about the borrow checker. If you have a long-lived shared reference you just Arc it. If it's a circular ownership structure like a graph you use a SlotMap. It's by no means any harder for this codebase than for small ones.


The person you replied to stated:

> how productive power users in different [fields] can be with their tools

There are a lot more tools in programming than your text editor. Linters, debuggers, AI assistants, version control, continuous integration, etc.

I personally know I'm terrible at using debuggers. Is this a shortcoming of mine? Probably. But I also feel debuggers could be a lot, lot better than they are right now.

I think for a lot of us, reflecting on our workflow and spotting things we do that could be done more efficiently with better (usage of) tooling could pay off.

