
This post feels like recent movie trailers. The way you wrote it makes me believe you know what you are talking about but I have no idea. Care to give the curious reader pointers to the topics you mention? I mean the "printing press to furnace" analogy of the economy, or what you mean by 1971 or the gold bug or the 50 year transistor omg what?


I’m just a nerd whose nerd hobbies came to encompass finance when I moved to Manhattan. I was also a party monster back then (please don’t read the book) which for a childless bachelor is a pretty reasonable way to get the most knowledge NYC has to give in a year.

If you have particular questions about the monetary system, I know enough to either answer or know that I don’t know and refer you to someone who does.


> If you have particular questions

There were four fairly direct questions in the post you replied to, to which you might like to give a response:

>> [what do you] mean [by] the "printing press to furnace" analogy of the economy

>> what you mean by 1971

>> or the gold bug

>> or the 50 year transistor

The first of those I'm pretty sure I understand, though if I'm right about that I don't think it is a good analogy; the others I have no context for.


1971 (and/or 1973) is a year people often point to as when we “went off the gold standard”. Neither year is particularly great as a signal example of the collapse of what is loosely called the “Bretton Woods” monetary system, but either is a pretty good nickname for it.

By “printing press” I mean the set of mechanisms in which fractional reserve deficit spending creates the money supply to notionally value future growth, and by “furnace” I mean the set of mechanisms by which we prevent arbitrary inflation as a result. You can measure the money supply in plenty of ways, but for someone who needs a glossary on my original comment, TLDR you want a number called “M2”.

A “gold bug” is someone who either does or advocates the strategy of holding “precious” metals as a uniquely good asset class (no one turns down a free pile of gold, many of us think the market is pricing gold as well as we are), in particular an asset class uniquely resistant to inflation and/or the “government”. I can assure you from painful experience that “the government” gives no fucks if you have gold in your back yard when they make a clerical error.

50 years is roughly the period of time it took for at least two separate inventions of “the transistor” by a bunch of Bell Labs people, but overwhelmingly associated with a guy called Shockley, to culminate in a useful computer that a middle class person could afford to have in their living room. I learned to code on an IBM PC my distinctly middle-class grandfather owned, and it is a substantially better computer than no computer, a harder claim to make about the Apple I for example.


> "printing press to furnace" analogy of the economy

Very roughly, modern economies work by printing money in the form of government treasuries (the government borrows $ to run the country) and destroying money by charging taxes (taxes go to the IRS and are used to pay off the outstanding treasuries).


This really seems like php. I also think that mixing UI with logic is a recipe for disaster and there is already php (and several others?) for that. I suppose the creators of such a thing have a decent knowledge of compilers and related domains, knowledge which really seems wasted on a project like this.


I almost peed.


Never heard of Level1Tech, just looked them up. Man... These people are so likeable! Great content!


The author seems to be very knowledgeable about the different aspects of programming. Whether you agree with his opinions or not, this article I think is a great starting point for learning about many interesting topics. Definitely bookmarked for later.


They sound like someone that is attracted to chasing the high of the next technology to learn rather than picking a language like C/C++ that would have worked and given them way more career stability.

Rather than mastering a bunch of programming languages they should have focused on one and built clout as a problem solver that doesn’t see a new shiny tool as a way to label their career. Boring popular language + focusing on building stuff and solving problems are the key to actual career growth.


That assumes a whole lot about their goals. If everyone stuck with C++ we would never have nice things like Zig/Nim/Rust that are advancing the conversation about how we should be doing systems programming tasks with code. C++ is quite influenced by a lot of this. Like, let 'em hack and learn. Also I think Nim is a very productive language to be hacking in and much easier to get going with than C++.


As they say, let people enjoy things


That take detracts from the amazing work people have done over the decades to research and implement improvements to C and C++. We can innovate without throwing things away, even if it may seem boring to the uninitiated.


People have done (and are still doing) amazing work in Fortran too. Why should we use C or C++ when Fortran was already perfectly fine?


Fortran is better than C and even C++ for its domain, high-performance numerical code. The guaranteed lack of aliasing allows for optimizations not always possible in C code. Built-in handling of stuff like complex numbers and even range types in modern Fortran (that is, supported for the last 25-30 years), along with support for parallelization and robust structured/procedural programming, makes it just the right tool for some jobs.

BTW if you run fancy modern stuff like numpy, you're running quite a bit of Fortran, because it includes, for instance, BLAS.


None of this is true or very important anymore.

For aliasing, in C, there's the restrict keyword, although it isn't used that often. If you take a look at BLIS or even BLAS, restrict isn't used. C++ doesn't have restrict, but that hasn't stopped the development of great numerical linear algebra libraries like Eigen or Armadillo.

C++ has had std::complex for... forever. And C99 has support for complex numbers.

Range types are handy, but not having them typically isn't a show stopper when designing libraries of numerical kernels. They are more useful for high-level prototyping, a la MATLAB. Nevertheless, it's not that hard to emulate the behavior using a well-designed API in C, or operator overloading in C++.

The only parallelism that Fortran has that C or C++ lack is co-array Fortran. I haven't used it myself, so can't speak to how useful it is. Both C and C++ have many options for parallelism: pthreads, OpenMP, MPI, C++'s parallel stuff, and tons of other libraries.

I'm not sure what "robust structured/procedural programming" is. I think Fortran is at a clear disadvantage compared to C and C++, here. The amount of boilerplate needed to define a function in Fortran is pretty painful. Fortran's control flow is a subset of C and C++'s, so any style of structured or procedural programming you would do in Fortran can obviously be done in C and C++.

And of course, numpy (and similar) call out to Fortran, but they call out to loads of C and C++, too!

This isn't to say Fortran is a bad language. I work with people who use it and prefer it. But the reasons you pointed out simply aren't valid. A good reason to use Fortran: you work in a group where Fortran is the language that's used and you want to be able to contribute to what's going on!


Exactly. They are just arbitrarily drawing the line of which language is sensible based upon what was hot when they made their choice.


It doesn't detract from that at all, chief. What a weird thing to say. Particularly in the case of Nim -- it doesn't "throw away" C or C++, it compiles to them!


...decades of research dedicated to handling problems introduced by C and amplified by C++.

Sometimes it makes sense to simply reset things.


Sure. But we can also innovate with throwing things away, which has its own set of advantages.


I have been learning Rust so it is top of mind for me. Rust started with some really tough constraints:

1.) How can we eliminate undefined behavior?

2.) without sacrificing any performance

From there, they built a powerful programming language that has always honored zero cost abstractions and elimination of undefined behavior. This is like a 15 year development and the result is a little hard to understand at first but one of the most innovative things in the realm of systems programming [that people are actually using at scale]. I know Rust borrows a lot and is built on the shoulders of giants and is not a singular invention, but it is slowly winning hearts and minds in places where languages like Ocaml and Haskell never could. You really could not do something like this by saying "Hey, how could we eliminate UB in C++?" Starting anew was the solution :)


> That take detracts from the amazing work people have done over the decades to research and implement improvements to C and C++.

What kind of argument is that?

> We can innovate without throwing things away, even if it may seem boring to the uninitiated.

You can’t innovate without breaking backwards compatibility. C++ is an evolutionary dead end that collapses under its own complexity, deal with it.


Ugh, as a C++ programmer - for the love of god don't choose the language for career stability. Both the language and the ecosystem are horrible. Choose it if the domain of your choosing needs it. For me it happens to be the case. If you want to choose a career language go for any of the other TIOBE usual suspects.

I completely agree that if you want to be a career programmer, focus on solving problems for business. But it can be done in any language nowadays.


What's wrong with C++ stability? You think things are looking better in JS/Python world? They both changed tons and the knowledge you accumulate constantly needs to be tweaked and relearned.


I perhaps used words in a confusing combination. There are two connotations to stability here - 1. language and ecosystem standardization and 2. having a stable and predictable career. C++ has both aplenty (for 2., if you find your niche you are nigh irreplaceable, though perhaps not at FAANG rates).

In C++, your knowledge basically never gets old.

The problem with C++ is that it is a horrible language in several ways. I guess the root cause is that it's basically a sugared C preprocessor with an infinite amount of special rules and special cases, that attempts to retain backward compatibility while adding new features. If my domain did not need it I would not write it. But it does, so I do.

My advice not to choose C++ because of career stability meant - "It's a bloody awful thing! Don't choose it if you have options unless you absolutely want to!"


> You think things are looking better in JS/Python world?

Yes.


Let's face it: C++ is pain. Even with all the improvements of the last 15 years.

Choosing something which is not C++ may be a long-term goal by itself.


Have you considered there may be a reason why so many large tech companies are so eager to move away from C++?


Perhaps they care more about exploration and learning rather than career growth and solving business problems.


I'd like to hear about this too!


Ken Thompson wrote in his famous paper [1] about quines:

> If you have never done this, I urge you to try it on your own. The discovery of how to do it is a revelation that far surpasses any benefit obtained by being told how to do it

Every once in a while I give them a try, but I haven't yet managed to create one and it frustrates me very much. Afraid of being denied that "revelation", I never dared to read his paper past that point. I'm afraid I might never read it because of my ego.

1: https://www.cs.cmu.edu/~rdriley/487/papers/Thompson_1984_Ref...


It's easier in some languages than others.

There is a delight in doing it, but don't overdo the anticipation. I've found greater delights in programming.


> It's easier in some languages than others.

Yes, it seems Lisp wouldn't be appropriate for this.


I tried typing `1` into the REPL. Am I doing this right?


Maybe. What sound did it make?


It was blue.


I believe that solution is possible for the Lisp-modified task description:

Write a cyclic quine that reproduces itself through 128 Lisp languages

because we easily have that many, or more.


I mean, isn't generating quines and twines one of the examples used to show off miniKanren?

William Byrd's presentations of MK are absolutely delightful, by the way.


Yes, it's quite easy in sh:

[update: same program also works in python and ruby]


Like refactoring someone else's code? So much fun.


You're teasing, but I understand the question so I'll answer.

I've had a good deal of joyous fun exploring quirky ideas with mathematics and software. It's old hat now, but back in 2011 I repurposed an algorithm primarily used for document classification by creating my own tokenization of features and stuffing it in there then running it through a ton of photographs, but forcing it to only 200 dimensions of freedom.

Since the algorithm didn't have room to fully separate dimensions, it had to start grouping photos together along dimensions that had to account for multiple topics. I'll never forget bursting out laughing as I explored one particular dimension that was "motorcycles...... and photos of women taken from street level aimed at their scantily clad behinds."

Collaborative filtering proved to me that some stereotypes really do have grounding in reality.

As for the quine stuff? I actually had more fun repurposing[0] some of the ideas for a letter to my MP about the dangers of electronic voting back in 2011 when it seemed possible that Canada was going to allow it at the federal level. I chuckled more at literally sending self-eating code to a presumably less technical politician than I did at getting something to repeat itself.

[0] https://github.com/zachaysan/darth_vader_wins_election


I hadn't written one until ~30 mins ago [1]. I cheated and looked at a Java quine (not particularly elegant, but easy to see what is going on), but I wrote one for Virgil. Just think string substitution: a string with a hole in it, and you substitute a copy of the string, quoted, into the hole. Just one substitution suffices.

[1] https://github.com/titzer/virgil/blob/master/apps/Quine/Quin...
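The string-substitution idea above can be sketched as a minimal Python quine (just an illustration; this is not the linked Virgil program). The `%r` conversion substitutes the quoted string into its own hole, and `%%` becomes a literal `%` after substitution:

```python
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Running it prints exactly its own two lines. (No comments inside the program, of course, or the output would no longer match the source.)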


You should put SPOILER tags on that.


Nice, this Go version satisfies gofmt:

https://github.com/62726164/quine


In every paper there comes a line at which I completely lose what they are talking about. If I'm lucky, this happens after the abstract/intro. In this case, this was it: "A real-time symbolic processing system on a single processor should have the following four features". What?


real-time - being concerned with predictable latency rather than overall throughput

symbolic processing - symbolic expressions (s-expressions) being the building blocks of lisp programs

single processor - not considering multitasking or multiprocessor concerns


I appreciate the clarification but what I didn't get is what "symbolic processing" has to do with "hard real-time" (who cares what the language looks like?) and similarly, how is "single processor" related to "hard real-time"


This terminology was common in the 1990s, before AI meant "machine learning" (aka adjusting parameters). Symbolic processing involves having an object in memory for things in the real world, and manipulating them. There's some vague relation to object oriented programming. The point is, it involves creating & destroying objects, memory allocation, that sort of thing. As opposed to purely numerical computing: for a neural net with a fixed architecture, there are a fixed number of operations (multiplying, adding) to get an answer. It's easy to do that in real time, since the computation is not only bounded, but of constant time. Symbolic manipulations can take variable amounts of time, and depend on the particulars of the inputs & problem being solved.

Lisp was championed by people in AI working on symbolic reasoning back in the 1990s.


Symbolic reasoning based AI goes back to at least the 1950s. You're missing almost a half-century of history there.


"A real-time symbolic processing system on a single processor should have the following four features"

The language they're developing in/for is Lisp, which is used (a lot) for symbolic processing. That's the "symbolic processing" part. "real-time" is the constraint they're trying to achieve and relates to system performance. "single processor" is just another constraint. If you permit multiple processors then some of the problems making Lisp real-time go away or are mitigated as you can do things like shove the GC bit into a second processor and execute it in parallel. Being constrained to a single processor means you can't do this, when the GC is going it's using the full CPU and taking time away from other computations (symbolic processing, in their case).


> who cares what the language looks like

I would guess, the paper cares what the language looks like since it's about Lisp being real-time.

Lisp has been dubbed a "symbolic processing language" since its early days, since McCarthy used that term in some papers.*

Basically it means the ability to juggle complex data structures that can represent formulas; e.g. solving algebraic equations (something easily hacked up in Lisp) is an example of symbolic processing.

This is entirely relevant to the topic because the dynamic allocation used in symbolic processing presents challenges to real-time processing, like garbage collection pauses, and whatnot. What if you want asynchronous interrupts to be able to allocate cons cells and other objects?

In a Lisp program, memory-allocating expressions casually appear throughout a program. Something like (list a b c), if we translate it to C, would call a malloc-like routine three times. Some people doing some kinds of embedded work even today eschew dynamic allocation.

---

* Like: Recursive Functions of Symbolic Expressions and Their Computation by Machine. Memo 8, Artificial Intelligence Project, RLE and MIT Computation Center, March 13, 1959, 19 pages. https://dspace.mit.edu/handle/1721.1/6096


"symbolic processing" means computing with symbols, not numbers.

Like: (simplify '(2 * a + 4 * a)) -> (6 * a).
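As a toy sketch of that idea (in Python rather than Lisp, with a hypothetical `simplify` that handles only this one pattern), representing expressions as nested tuples:

```python
# Expressions are nested tuples: ('+', left, right) or ('*', coefficient, symbol).
def simplify(expr):
    # Collapse (c1 * x) + (c2 * x) into ((c1 + c2) * x); leave anything else alone.
    if (isinstance(expr, tuple) and expr[0] == '+'
            and isinstance(expr[1], tuple) and expr[1][0] == '*'
            and isinstance(expr[2], tuple) and expr[2][0] == '*'
            and expr[1][2] == expr[2][2]):
        return ('*', expr[1][1] + expr[2][1], expr[1][2])
    return expr

print(simplify(('+', ('*', 2, 'a'), ('*', 4, 'a'))))  # ('*', 6, 'a')
```

A real symbolic system applies rules like this recursively over arbitrary expression trees, which is exactly the kind of allocation-heavy tree juggling Lisp is built for.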

An application for symbolic processing could be a rule-based system. The application then consists of a lot of rules and facts. Facts also could be symbols, not just numbers. Like a physical pressure value could be a number, but it could also be a qualitative value like low-pressure.

A relatively old application (originating in the mid 80s) is real-time control with a rule-based system in technical systems. Technical systems could be a fleet of satellites, a chemical plant, a cement production plant, ...

The software might get sensor inputs and based on rules may cause physical actions like opening or closing a valve, while getting more live sensor inputs, then possibly triggering other rules.

Typically Lisp systems have some kind of GC and most of the time this will not allow real-time applications. There are some specialized Lisp systems which have no GC and the application code is written in a way to not dynamically allocate memory during runtime - one says that the style is non-consing, where CONS is the basic operation to allocate a new list cell. This way they avoid garbage collections. This allows some real-time applications, like the mentioned rule-based process control software. This is rare and exotic stuff. It's also considered 'old AI' technology.

The paper also describes something exotic: a concurrent Garbage Collector for a Japanese Lisp Machine, which runs a Lisp operating system - that was a thing when Japan was investing a lot into Artificial Intelligence in the 'Fifth Generation Project'. Such a GC would be useful so that the machine's network stack could be written directly in Lisp, but also for applications (similar to above control of external physical systems): robots, machines, production plants, ... It was thought that an important application domain for expert systems could be in industrial control systems.


To be clear for OP: hard real-time means that missing a response deadline constitutes a total system failure.



Thanks a bunch!

You got any more of these "Real-time Programming" courses? Or books?


Reading through the comments here (and very much feeling the pain they describe), this idea came: Shouldn't we have a search engine that heavily favours the types of websites that we typically look for? You know, the classic 90s style tech blogs, the plain HTML documentation pages. Ignoring websites with ads, sites with lots of baggage (fonts, scripts, whatnot), sites with lots of images. Maybe increase the ranking of pages that don't change much in their look and content as time goes by. I don't know. Would it be useful? How would it pay for itself?


What you're describing is an internet that hasn't existed since 2006, or whenever jQuery and Ajax started becoming mainstream.

Sites containing "fonts and scripts" are not baggage. They have just been updated for modern times. Sites have to measure traffic somehow, and it's not going to be via a "Guest Counter" widget like they had back in the Geocities days.

You might be looking for an alt-web like that hosted on the Gemini network. It's all text and HTML-only as I understand.


Are you talking about Marginalia? [https://search.marginalia.nu/]


Remember Hotbot? They had a way to limit searches to those that used JavaScript. ha!

https://wikieducator.org/File:Hotbot.gif


millionshort.com used to be good but it recently joined the mega-corp search pack and hid its search engine behind required code execution that blocks non-megacorp browsers.

https://searchmysite.net/ is a curated search for exactly what you describe though. It's been on HN a couple times.


This resonates with me so freaking much. I am disgusted by how every bit of information on webpages seems to be buried between paragraphs of empty EMPTY talk, and even when they seem to get to the point, most of the time there is no real information there. Sorry, this had to come out. We really need better search engines.


My understanding from reading a number of articles about this "helpful content" update from Google is that they want to get rid of pages like what you describe and instead prioritize pages that get right to the point. So perhaps you'll get what you are asking for.


This is true of most non-fiction literature as well (self-help and business books especially), so I'm not sure if Google can necessarily discourage people from adding unnecessary padding.

Google is already scraping bits of website content and showing it to the user as a "featured snippet". Nobody is going to write short pages if Google can already rob you of a click so easily.


I think it's going to require a social rating system. Which is why people append "reddit" to searches these days.


It’s safe to assume that a social ranking system covering the whole web will be gamed.


Yes, but if it's truly social it won't be easy. There need to be ways to see who did what and filter them out and various groups / spheres of influence. Just like in real life.

If it is a hidden algorithmically social function then of course it will be gamed.

But IRL there are certain people whose advice and recommendations you value and those whose you ignore. And in other cases you can easily ask the source of other information to find out if it is high value or not. MLM is the gamification of the IRL social structure and it's fairly easy to opt-out.

That's what search needs, a way to see the path information took to be presented to you and a way to filter it.

Unfortunately right now, so much of the best information is in Facebook groups, post and comments. The interface there is absolutely horrible though and not designed to provide you information, but to maximize the amount of ads that come across your screen.

The same is true for video information. It's not easily searchable or digestible. The web peaked when information was predominantly text form and not fragmented into walled gardens.


> Yes, but if it's truly social it won't be easy. There need to be ways to see who did what and filter them out and various groups / spheres of influence. Just like in real life.

Appending site:news.ycombinator.com instead of Reddit?


There is an extent they can go to with account verification where gaming is not very feasible.

Can you create a fake account with a verified credit card number, verified phone number, passport, drivers license, account history consistent with human usage, Google One subscription, etc.? You probably can, but doing it at-scale is going to be quite costly.


Can you get a lot of people to install a piece of software and then use their account instead? Could you even pay those people per hour of usage of their account?

Quite cheaply


I don't think we need this that bad so as to jeopardize our privacy and freedom over it. So I'll pass on this idea.

