
In reference to memorizing material, the OP writes: In the “real world,” having a copy of your notes is called being prepared. Instead, university exams expect us to tie one hand behind our backs and master a skill we’ll seldom if ever use again.

The truth is in the middle of the dichotomy you set up. Yes, having notes, reference materials, etc., in the real world is being prepared. BUT, even in the real world, there's a reasonable expectation that you'll store and retrieve as much of that information as possible in and from your own memory. For example, I'm pretty adept with Python. I occasionally need to look at the standard library reference, yet I try to commit as much as I can to memory. If I didn't, I would spend the majority of my time searching instead of doing. It goes without saying that a person who knows something off the top of his or her head is more efficient than someone who doesn't. You're correct that in the real world you'll rarely be in a situation where you don't have access to a reference of some kind. However, to say at the same time that memorization is a skill you'll seldom if ever use again is incorrect at best and reckless at worst.
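
(A small illustration of what I mean; these snippets are hypothetical, not from any particular project, but they're the kind of standard-library knowledge that's much faster to use from memory than to go searching for.)

    # A few standard-library idioms that are quicker to use from memory
    # than to rediscover by searching (hypothetical examples).
    from bisect import insort
    from collections import Counter
    from itertools import groupby

    words = ["spam", "eggs", "spam", "ham", "spam"]

    # Most common element without writing a counting loop by hand.
    print(Counter(words).most_common(1))    # [('spam', 3)]

    # Lengths of runs of equal items (input must be sorted first).
    runs = [(key, len(list(group))) for key, group in groupby(sorted(words))]
    print(runs)                             # [('eggs', 1), ('ham', 1), ('spam', 3)]

    # Keep a list sorted as you insert into it.
    nums = [1, 4, 9]
    insort(nums, 5)
    print(nums)                             # [1, 4, 5, 9]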



The counterpoint here would be that memorization happens more naturally through practice, since frequently used knowledge is memorized automatically.

I remember school courses in history and chemistry where memorization requirements were the largest component of the course credit. What I suspect is the real reason: measuring student understanding and engagement is a hard problem, and memorization is an easily measured proxy for both. The problem is that it is easily gamed; learning decays into this gaming process, and students promptly forget the material after the exams. In my experience, though, universities mostly do allow notes in exams.


Precisely. My math courses never tested my memorization outright.

But if pouring 40-50 hours a week into a class for a couple of months didn't result in memorizing the important stuff as a byproduct, you probably weren't going to do well anyway.

I'd like to be optimistic and say that this is how memorization began to be tested in schools: instructors noticed that the best students seemed to have things memorized, so they started testing for it, since it's an easy thing to test.

For some reason, the analogy of a doctor treating symptoms rather than the cause of the illness comes to mind.


I had a statistical machine learning course whose exam was mostly factual questions, closed-notes, and oddly enough I think it was reasonably relevant, despite the fact that I usually dislike pure memorization. It didn't ask for specific formulas, but more like concepts and terminology, and how they'd be applied. It's not that these are specific things you should memorize, but that it's at least a necessary condition: if you can't, without notes, say what an expectation is, what a loss function is, what nonparametric regression is, etc., and when you might use some of these things, then you probably didn't pay attention in class or work any of the problem sets, because after a semester of actually doing the course you should definitely know all that without even really thinking.

So even if an A doesn't guarantee you actually know statistics, it's at least, imo, justifiable to say that a low grade means you definitely don't know statistics. You can always argue that you'd look things up if it was open book, but past some point if you don't know any of the material or even the basic terminology of the field, saying you could look it up amounts to saying that you could learn statistics from scratch if you needed to. It's sort of a test of, "can you hold a reasonably intelligent conversation on the topic without constantly checking Wikipedia on your smartphone for basic definitions".

(That kind of exam is probably also particularly suited to statistical ML because not knowing those things is the most common kind of real-world mistake... the details of an algorithm you can always get from an R package or Weka, but not knowing how to analyze a problem or what the main issues even are can't be solved by open-source code.)
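
(To make that more concrete, here's a toy sketch of the kind of vocabulary in question; the squared-error loss and the Monte Carlo average are just examples I'm picking, not anything from the actual exam.)

    # Toy sketch of the vocabulary in question: an expectation estimated
    # by averaging draws, and a loss function scoring predictions.
    # (Squared-error loss is just an assumed example.)
    import random

    def expectation(sample):
        """Monte Carlo estimate of E[X]: the average of observed draws."""
        return sum(sample) / len(sample)

    def squared_error_loss(y_true, y_pred):
        """A loss function: mean squared difference between targets and predictions."""
        return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

    draws = [random.gauss(3.0, 1.0) for _ in range(10_000)]
    print(expectation(draws))                                    # close to 3.0
    print(squared_error_loss([1.0, 2.0, 3.0], [1.1, 1.9, 3.2]))  # about 0.02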


> It's not that these are specific things you should memorize, but that it's at least a necessary condition: if you can't, without notes, say what an expectation is, what a loss function is, what nonparametric regression is, etc., and when you might use some of these things, then you probably didn't pay attention in class or work any of the problem sets, because after a semester of actually doing the course you should definitely know all that without even really thinking.

Terminology is easy to remember once you understand the concept, and the things you mentioned are not things you memorize; they are things you have to understand. You can memorize a formula, and you can memorize a list of applications of a given concept, but both are worthless if you don't understand, on a gut level, what the concept is and thus where to apply it.


That's a good theory.

I often hear this kind of complaint about algorithmic tests in interviews, that you have to "memorize" these algorithms, but I never understood that position: if you're constantly programming, these kinds of algorithms are easy enough to work out on the spot, and you don't need to memorize them.
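
(Binary search is the example I'd point to; it's the kind of interview staple you can re-derive on the spot rather than recall from flash cards.)

    # Binary search, picked here only as an illustration: the kind of
    # interview staple you can re-derive rather than memorize.
    def binary_search(items, target):
        """Return the index of target in a sorted list, or -1 if absent."""
        lo, hi = 0, len(items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if items[mid] == target:
                return mid
            if items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1

    print(binary_search([2, 3, 5, 7, 11, 13], 7))   # 3
    print(binary_search([2, 3, 5, 7, 11, 13], 6))   # -1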



