johngossman's comments | Hacker News

It depends on who and what you read. Since they became controversial, I notice them more. Charles Dickens used them both regularly--most pages seem to have both.

Virginia Woolf's writing has the most semi-colons I've seen and almost as many em-dashes. It fits her stream of consciousness style where there are very few hard stops.

Jack Vance used semi-colons in almost the opposite fashion, increasing the tempo with short clauses strung together without conjunctions. His action scenes are sometimes almost staccato.

Just today I was reading Patricia McKillip and noticed she also uses a lot of em-dashes.


Imagine you were expecting one of those and read the other!


I suggest you go meet some PhD mathematicians and have that discussion.


Having a PhD in mathematics myself, I have been surrounded by such and had this discussion a few times. Some even like the ideas suggested!

I would say the most common counterargument is cultural: classical mathematics is the norm in the field, hence one must use it to participate in research.

But that seems to me a rather intellectually unsatisfying argument, if one cares about the meaning of the work.


Newton and Gauss and Euler did just fine without such solid foundations. If you get a PhD, very likely even an undergraduate degree in mathematics, you cover this stuff; then (unless you choose foundations as your field) you go about doing statistics, or algebra (the higher kind), or analysis, knowing you're working on solid foundations. It would be crazy if every time you proved something in one of those fields you had to state which construction of the real numbers you were using. And I guarantee at least 90% of PhD mathematicians could do so if they really needed to.


We are not talking about having to return to foundational axioms in every argument! Just that what axioms one chooses has an impact on which arguments are valid, and thus in turn what truths there are.


Not to mention that a lot of Japanese cafes don't open until late morning or early afternoon.


Sometime in the 80s, I implemented the core of the Mandelbrot set calculation in assembly on an 8087. As the article mentions, the compilers did math very inefficiently on this stack architecture. For example, if you multiplied two numbers together and then added a third, they would push the first two numbers, multiply, pop the result, push the result back onto the stack (perhaps clearing the stack? after 40 years I don't remember), push the third number, add, and pop the result. For the Mandelbrot loop this was even worse, as it never kept the loop's intermediate results on the FPU stack between iterations. My assembly kept all the intermediate results on the stack for a 100x speedup.
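For reference, here is roughly what that inner loop computes, as a plain C sketch (the variable names, bailout test, and iteration limit are my own choices, not the original assembly):

    /* Escape-time iteration for one point c = (cr, ci).
       An x87-aware version keeps zr, zi, and the temporaries on the
       FPU register stack for the whole loop instead of spilling them
       to memory on every iteration. */
    int mandel_iters(double cr, double ci, int max_iter)
    {
        double zr = 0.0, zi = 0.0;
        int i;
        for (i = 0; i < max_iter; i++) {
            double zr2 = zr * zr;
            double zi2 = zi * zi;
            if (zr2 + zi2 > 4.0)       /* |z| > 2: the point escapes */
                break;
            zi = 2.0 * zr * zi + ci;   /* imaginary part of z*z + c */
            zr = zr2 - zi2 + cr;       /* real part of z*z + c */
        }
        return i;
    }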

Running this code, the 8087 emitted a high-pitched whine. I could tell when my code was broken and it had gone into an infinite loop by the sound. Which was convenient because, of course, there was no debugger.

Thanks for bringing back this memory.


Ah, lots of supposedly solid state computer stuff, including CPUs, did that. I, too, used it for debugging. This wasn't very conscious on my part, but if some whine became unusual and constant, it was often a sign of something hanging.

As I got older, not only did computers stop doing that, my hearing also got worse (entirely normal for my age, but still), so that's mostly a thing of the past.


I used to hear the 16 kHz whistle of CRT monitors. Of course, there is no whistle with LED monitors, but I stopped hearing the CRT whistle before they went obsolete. It was my first sign my hearing was declining.

I thought I was protecting my ears from loud noises like rock concerts and gunshots. But I didn't know that driving with the window down damages your hearing. I crossed the country many times with the window down. I'm pretty sure that was the cause, as my left ear is much worse off than my right.

I don't need a hearing aid yet, but I'm pretty careful to wear ear plugs whenever there are loud noises.


16 kHz is very high on the spectrum. Just the normal age-related decline of hearing makes that inaudible pretty quickly; you don’t need to drive with the window down for that.


You're right, but it was coincident with my realizing I had trouble hearing my watch tick with my left ear.


The sound usually comes from inductors and capacitors in the power-supply circuitry, not the ICs themselves: the chips draw pulses of power in patterns at audible frequencies, and those passive components vibrate in response. Modern CPUs and GPUs will still whine audibly if given a suitable load; the amount of current they consume is amazingly high, dozens to hundreds of amps, and it also changes extremely quickly.


I had a Radeon 5850 that did it. I ran someone's simple test Unity project with vsync disabled, was getting around 3000 fps, and heard a tone that was probably around 3000 Hz. Supposedly the 5090 FEs are pretty bad too.


The compilers available at the time that the 8087 was commonplace were overall horrible and easily beaten anyway.

On the other hand, skilled humans can do very very well with the x87; this 256-byte demo makes use of it excellently: https://www.pouet.net/prod.php?which=53816


Oh boy...more memories. About a decade later at work I identified a bottleneck in our line-drawing code. The final step was to cast two floats (a point) to integers, which the compiler turned into ftol() calls. Unfortunately, ftol changed and restored the floating-point control word in order to set the rounding behavior (the Intel default rounding did not match what the C standard requires for a cast). Even more unfortunately, this stalled the Pentium's instruction pipeline. Replacing the casts with a simple fld/fist pair was another 100x speedup. A few years later I noticed the compilers started adding optimization flags controlling this behavior.
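For anyone curious what the fuss was about: a C cast must truncate toward zero, while the x87's default mode is round-to-nearest, which is why the helper had to rewrite the control word on every call. A small sketch of the difference, using the standard lrint() as a stand-in for a bare fld/fist (it rounds with the current mode):

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double x = 2.7;
        /* A cast must truncate toward zero, so the x87 helper had to
           switch the control word to truncation and back every call. */
        int truncated = (int)x;   /* 2 */
        /* lrint() uses the current rounding mode (round-to-nearest by
           default), which is what a bare fld/fist sequence does. */
        long nearest = lrint(x);  /* 3 */
        printf("(int)%.1f = %d, lrint(%.1f) = %ld\n",
               x, truncated, x, nearest);
        return 0;
    }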


Yah, I never did a very good job with x87 code generation. I'm a bit embarrassed by that.


The idea of listening to hardware running a program to tell what it is doing is surprisingly old. On the EDSAC computer[0] a little speaker was connected across one of the serial data lines, which allowed the progress of a program to be listened to. Skilled operators could immediately tell when a program had triggered a bug and either gone off into the weeds or entered a tight loop.

[0]: https://en.wikipedia.org/wiki/EDSAC


With the PDP-8E we used to do this with an AM radio.


- You can do the Mandelbrot set with integers (a fixed-point sketch follows below). In Forth it's 6 lines.

- Coincidentally, Forth promotes a fixed point philosophy.

- Forth people defined the IEEE754 standard on floating point, because they knew how to do that well in software.
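For illustration of the first point, a minimal fixed-point sketch in C (the Q8.24 format is an arbitrary choice here, and this is not the 6-line Forth version):

    #include <stdint.h>

    /* Q8.24 fixed point: 1.0 is represented by 1 << 24. */
    #define FRAC_BITS 24
    #define ONE ((int32_t)1 << FRAC_BITS)

    /* Fixed-point multiply: widen to 64 bits, then shift back down. */
    static int32_t fixmul(int32_t a, int32_t b)
    {
        return (int32_t)(((int64_t)a * b) >> FRAC_BITS);
    }

    int mandel_fixed(int32_t cr, int32_t ci, int max_iter)
    {
        int32_t zr = 0, zi = 0;
        int i;
        for (i = 0; i < max_iter; i++) {
            int32_t zr2 = fixmul(zr, zr);
            int32_t zi2 = fixmul(zi, zi);
            if (zr2 + zi2 > 4 * ONE)   /* |z|^2 > 4: escaped */
                break;
            zi = 2 * fixmul(zr, zi) + ci;
            zr = zr2 - zi2 + cr;
        }
        return i;
    }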


> - Forth people defined the IEEE754 standard on floating point, because they knew how to do that well in software.

IEEE 754 was principally developed by Kahan (in collaboration with his grad student, Coonen, and a visiting professor, Stone, whence the name KCS draft), none of whom were involved with Forth in any way that I am aware. And the history is pretty clear that the greatest influence on IEEE 754 before its release was Kahan's work with Intel developing the 8087.


I'm a big fan of Kahan's work. I am just sad that the NaN remains terribly misunderstood.

The signalling NaN, however, turned out to be quite useless and I abandoned it.

I think the Zortech C++ compiler was the first one to fully support NaN with the Standard library.


I think the 1985 standard/proposal from the Forth Vendor Group set a precedent.


Citation?


Mathematics is such an old field, older than anything except arguably philosophy, that it's too broad and deep for anyone to really understand everything. Even in graduate school I often took classes in things discovered by Gauss or Euler centuries before. A lot of the mathematical topics the HN crowd seems to like--things like the Collatz conjecture or Busy Beavers--are 60, 80 years old. So, you end up having to spend years specializing and then struggle to find others with the same background.

All of which is compounded by the desire to provide minimal "proofs from the book" and leave out the intuitions behind them.


> A lot of the mathematical topics the HN crowd seems to like--things like the Collatz conjecture or Busy Beavers--are 60, 80 years old.

Do you know the reason for that? The reason is that those problems are open and easy to understand. For most other open problems, you need an expert even to understand the problem statement.


I'll argue for astronomy being the oldest. Minimal knowledge would help pre-humans navigate and keep track of the seasons. Birds are known to navigate by the stars.


I would argue that some form of mathematics is necessary for astronomy, at least if “astronomy” is defined as anything more than simply recognizing and following stars.


The desire to hide all traces of where a proof comes from is really a problem, and having more context would often be very helpful. I think some modern authors/teachers are getting better at giving more context. But mostly you have to be thankful that the people from the minimalist era (Bourbaki, ...) at least gave precise consistent definitions for basic terminology.

Mathematics is old, but a lot of basic terminology is surprisingly young. Nowadays everyone agrees on what an abelian group is. But if you look into some old books from around 1900, you can find authors who used the word abelian for something completely different (e.g. orthogonal groups).

Reading a book that uses "abelian" to mean "orthogonal" is confusing, at least until you finally understand what is going on.


>>[...] at least gave precise consistent definitions for basic terminology.

Hopefully interactive proof assistants like Lean or Rocq will help to mitigate at least this issue for anybody trying to learn a new (sub)field of mathematics.
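As a tiny illustration (Lean 4 syntax with hypothetical names; Mathlib's actual definitions differ), a proof assistant forces terminology to be pinned down as a precise structure rather than prose:

    -- One way to pin down "abelian group" as a structure; the axioms
    -- listed (associativity, right identity, right inverse,
    -- commutativity) are enough to recover the usual ones.
    structure AbelianGroup (α : Type) where
      add : α → α → α
      zero : α
      neg : α → α
      add_assoc : ∀ a b c, add (add a b) c = add a (add b c)
      add_zero  : ∀ a, add a zero = a
      add_neg   : ∀ a, add a (neg a) = zero
      add_comm  : ∀ a b, add a b = add b a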


Actually, a lot of minimal proofs expose more intuition than the older proofs people found first. Counterintuitively, I usually don't find reading the first proofs of results to be extremely enlightening.


> Mathematics is such an old field, older than anything except arguably philosophy

If we are already venturing outside the scientific realm with philosophy, I'm sure fields of literature or politics are older. Especially since philosophy is just a subset of literature.


> I'm sure fields of literature or politics are older.

As far as anybody can tell, mathematics is way older than literature.

The oldest known proper accounting tokens are from 7000ish BCE, and show proper understanding of addition and multiplication.

The people who made the Ishango bone 25k years ago were probably aware of at least rudimentary addition.

The earliest writings are from the 3000s BCE, and are purely administrative. Literature, by definition, appeared later than writing.


> As far as anybody can tell, mathematics is way older than literature.

That depends what you mean by "literature". If you want it to be written down, then it's very recent because writing is very recent.

But it would be normal to consider cultural products to be literature regardless of whether they're written down. Writing is a medium of transmission. You wouldn't study the epic of Gilgamesh because it's written down. You study it to see what the Sumerians thought about the topics it covers, or to see which god some iconography that you found represents, or... anything that it might plausibly tell you. But the fact that it was written down is only the reason you can study it, not the reason you want to.


> That depends what you mean by "literature". If you want it to be written down

That is what literature means: https://en.wiktionary.org/wiki/literature#Noun


Well, then poetry is not literature.


No, the argument is even dumber than that. The person who writes a poem hasn't created any literature.

The person who hears that poem in circulation and records it in his notes has created literature; an anthology is literature but an original work isn't.


> No, the argument is even dumber than that. The person who writes a poem hasn't created any literature.

Sure they have, by virtue of writing it down. It becomes literature when it hits the paper (or computer screen, as it were).

(Unless you mean to imply that formulating an original poem in your mind counts as "writing", in which case I guess we've illustrated the overarching point: there is value in shared symbols and language, and it's a waste of time to state our own definitions for every statement we want to make.)


> Unless you mean to imply that formulating an original poem in your mind counts as "writing"

You're close. I'm making the point that, in modern English, no other verb is available for the act of creating a poem.

Here's a quote from the fantasy novel The Way of Kings that always appealed to me:

>> "Many of our nuatoma -- this thing, it is the same as your lighteyes, only their eyes are not light--"

>> "How can you be a lighteyes without light eyes?" Teft said with a scowl.

>> "By having dark eyes," Rock said, as if it were obvious. "We do not pick our leaders this way. Is complicated. But do not interrupt story."

For an example from reality, I am forced to tell people who ask me that the English translation of 姓 is "last name", despite the fact that the 姓 comes first.

Similarly, the word for writing a poem is "write", whether this creates a written artifact or not. And the poem is literature whether a written artifact currently exists, used to exist, or never existed.

(Though you've made me curious: if the Iliad wasn't literature until someone wrote it down, do you symmetrically believe that Sophocles' Sisyphus is no longer literature because it is no longer written down?)


> I'm making the point that, in modern English, no other verb is available for the act of creating a poem.

Make, Create, Formulate.

> I am forced to tell people who ask me that the English translation of 姓 is "last name", despite the fact that the 姓 comes first.

"Family name" is availabe, commonly used and a better traslation than "last name" here, no?

> Similarly...

You're probably pretty alone in this thinking.

I don't think the metaphysical argument about Sisyphus is interesting or relevant.

I don't consider all movies to be literature. Do you consider all movies to be literature by definition?

I also write computer programs and banking checks. Does that make them literature to you?


> I'm making the point that, in modern English, no other verb is available for the act of creating a poem.

You literally used another perfectly acceptable verb in modern English besides “writing” for the act of creating a poem in the very sentence making this claim, which somewhat undermines the claim.


If it’s not written down, then that’s true.

Once someone writes it down, it is.


Sure in the context that you mean it’s an oral tradition.


> Literature, by definition, appeared later than writing.

Literature, by strict definition, appeared no earlier than writing; but the claim that it appeared later than writing is only a tentative conclusion, drawn from which surviving writing has been found and understood.


And you don't think she knows this? She's clearly fascinated with the Romans, despite all she finds unappealing about them. Which can easily be said about a lot (most?) of history. Based on books and TV, WW2 is possibly the historical period that draws the most attention, which doesn't mean the historians (or their readers) "love WW2."


I have a b&w photo of my (considerably) older brother, from the early 1960s, reading a pile of comic books a foot high. The only cover visible is Spiderman #4. When I was a kid I used to stare at that picture and dream.

Needless to say, I kept all my old comics.


I used to collect baseball cards as a kid in the 80s, and I remember seeing comics at card shows.

I had seen football cards take off in value and really wanted to get into comic collecting. What I remember is that the big comic books were just slightly outside the price range of a 10-year-old cutting the lawn, unlike baseball cards, where HOF rookie cards cost thousands.

I don't know how accurate it is, but ChatGPT gives prices that sound about right from what I remember of looking at comic price guides in the 80s.

X-Men #1 $60–$150

The Incredible Hulk #1 $60–$120

Avengers #1 $80–$180

I never got to start my collection, and then I remember as a teenager thinking what a stupid idea it was anyway. Who the hell is ever going to be that into comic book characters?


I had a British edition of Star Wars #1 at my parents' house that an English friend gave me when we were kids back in the early 80s. I always wondered what it was worth, as I could only find price guides for the US edition. But when I finally got around to going to get it a couple of years ago, it was nowhere to be found. So the question became purely academic.


I kept all my old comics too, and check the value occasionally on eBay. The most valuable one has yet to top $40.

Turns out a lot of 80s kids had the same idea!


I think you're right that the inelegant part is how AI seems to just consist of endless loops of multiplication. I say this as a graphics programmer who realized years ago that all those beautiful images were just lots of MxN matrix multiplies, and AI takes this to a whole new level. When I was in college they told us most of computing resources were used doing Linear Programming. I wonder when that crossed over to graphics or AI (or some networking operation like SSL)?
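If it helps make "endless loops of multiplication" concrete: essentially all of it reduces to variants of this triple loop (a naive C sketch; real kernels tile, vectorize, and batch it):

    /* Naive dense matrix multiply: C = A (MxK) times B (KxN).
       Graphics transforms and neural-network layers are, at bottom,
       heavily optimized variants of this loop nest. */
    void matmul(int M, int N, int K,
                const float *A, const float *B, float *C)
    {
        for (int m = 0; m < M; m++)
            for (int n = 0; n < N; n++) {
                float acc = 0.0f;
                for (int k = 0; k < K; k++)
                    acc += A[m * K + k] * B[k * N + n];
                C[m * N + n] = acc;
            }
    }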


What could any complex phenomenon possibly be other than small “mundane” components combined together in a variety of ways and in immense quantities?

All such things are like this.

For me, this is fascinating, mind-boggling, non-sensical, and unsurprising, all at once.

But I wouldn’t call it inelegant.


> When I was in college they told us most of computing resources were used doing Linear Programming.

I seriously doubt that was ever true, except perhaps for a very brief time in the 1950s or 60s.

Linear programming is an incredibly niche application of computing, used so infrequently that I've never seen it utilised anywhere despite being a consultant who has visited hundreds of varied customers, including big businesses.

It's like Wolfram Mathematica. I learned to use it at university, I became proficient at it, and I've used it about once a decade "in industry", because most jobs are targeted at the median worker. The median worker is, practically speaking, innumerate: unable to read a graph or understand a curve fit, and even when they can, their knowledge won't extend to confidence intervals or non-linear fits such as log-log graphs.

Teachers who are exposed to the same curriculum year after year, seeing the same topics over and over, assume that industry must be the same as their lived experience. I've lost count of the number of papers I've seen about Voronoi diagrams or Delaunay triangulations, neither of which I've ever seen applied anywhere outside of a tertiary education setting. I mean, seriously, who uses this stuff!?

In the networking course in my computer science degree I had to use matrix exponentiation to calculate the maximum throughput of an arbitrary network topology. If I were to even suggest something like this at any customer, even those spending millions on their core network infrastructure, I would be either laughed at openly, or their staff would gape at me in wide-eyed horror and back away slowly.


The first two results from Google for "Voronoi astro" turned up two uses different from the one I knew about (sampling fibre bundles): https://galaxyproject.org/news/2025-06-11-voronoi-astronomy/ https://arxiv.org/abs/2511.14697


Astronomy is pure research and is performed almost exclusively by academics.

I’m not saying these things have zero utility; it’s just that they’re used far less frequently in industry than academics imagine.


And astronomy tends to throw up technology that becomes widely used (WiFi being the obvious example) or becomes of "interest" to governments. I expect that AMR code will be used/ported to nuclear simulations if it proves to be useful. Do I expect it to be used in a CRUD app? Obviously not, but use by most software shops isn't a measure of importance.


I have not only used linear programming in industry, I have also had to write my own solver because the existing ones (even commercial) were too slow. (This was possible only because I only cared about a very approximate solution.)

The triangulations you mention are important in the group I'm currently working in.


I'm curious to hear what you specifically use these algorithms for!

PS: My point is not that these things are never used; they clearly are. I'm saying that the majority of CPU cycles globally goes toward "idle", then to pushing pixels around with simple bitblt-like algorithms for 2D graphics, then to whatever it is that browsers do on the inside, then to operating system internals, and specialised and more interesting algorithms like linear programming are a vanishingly small slice of whatever is left of that pie chart.
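(By "bitblt-like" I mean little more than copying rectangles of pixels between buffers, roughly the following; a minimal C sketch, not any particular API:)

    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    /* Copy a w-by-h rectangle of 32-bit pixels from src into dst.
       Pitches are row strides in pixels; no clipping or blending. */
    void blit(uint32_t *dst, size_t dst_pitch,
              const uint32_t *src, size_t src_pitch,
              size_t w, size_t h)
    {
        for (size_t y = 0; y < h; y++)
            memcpy(dst + y * dst_pitch, src + y * src_pitch,
                   w * sizeof *src);
    }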


3d modelers would like to have a word with you.

Part of the reason why linear programming doesn't get used as often is that there is no industry-standard software implementation that is not atrociously priced. Same deal with Mathematica.


3D modelling is mostly linear algebra, not linear programming, which is an entirely different set of algorithms.


Oh, I mentioned it in the context of mesh geometries and tessellations.

