Welcome to HN. Please don't post memes here; post only things that add value. If you want to post memes, try Reddit or Digg. But if you have additional context or information to add, then your input is appreciated.
I like "TIL"s for quick references. HashRocket's TIL page[1], for example, often comes up in my searches. I don't know if it counts as a TIL in the way OP means...
A house is an asset that can grow in value over time and build wealth, while a car loses value as you use it. Housing costs are higher but can provide long-term stability and access to better jobs and schools. A car helps you get to opportunities but doesn’t create wealth.
The first time I used a flare with their support agents, it truly felt like magic. It's such a clever way to collect data for a specific, imperative need without running a dragnet of constant usage telemetry (as far as I'm aware).
Performance (fitness functions are usually high-volume, CPU-intensive operations), state management (mutable state isn't Prolog's strong suit), and integration (you'll have to set up communication between Prolog and whatever language/framework you're using).
More generally, if you could completely specify what makes a solution "fit" using logic, you could just ask Prolog to find a solution that satisfies those predicates. The search would then be handled by Prolog's built-in resolution and backtracking mechanisms rather than needing a GA.
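To make that concrete, here is a rough sketch of letting Prolog do the search itself. It assumes SWI-Prolog plus the pyswip bridge (neither is mentioned above), and the slot/solution predicates are just a toy stand-in for real fitness criteria:

    # Hypothetical sketch: encode "what makes a solution fit" as Prolog
    # predicates and let resolution + backtracking enumerate solutions,
    # instead of scoring candidates with a GA fitness function.
    from pyswip import Prolog  # assumes SWI-Prolog is installed

    prolog = Prolog()

    # Toy declarative constraints: a "good" assignment picks two distinct slots.
    prolog.assertz("slot(1)")
    prolog.assertz("slot(2)")
    prolog.assertz("slot(3)")
    prolog.assertz("solution(A, B) :- slot(A), slot(B), A \\= B")

    # Prolog's built-in search produces every satisfying assignment directly.
    for binding in prolog.query("solution(A, B)"):
        print(binding["A"], binding["B"])

This also illustrates the integration point above: the host language only asserts facts and consumes query bindings, while Prolog owns the search.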
So, I would like to create/discover patentable algorithms.
One would not even have to run the logic programs in the normal sense - just interpret the declarative constraints in a direct way as a fitness function in the GP system...
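One way to picture that (all names here are hypothetical, not from the comment): express each declarative constraint as a predicate over a candidate, and let fitness be the fraction of constraints satisfied, which a GA/GP loop can then maximize without ever running a logic program.

    # Hypothetical sketch: declarative constraints reinterpreted directly
    # as a fitness function for an evolutionary search.
    from typing import Callable, List, Sequence

    Candidate = Sequence[int]               # e.g. a genome of integers
    Constraint = Callable[[Candidate], bool]

    constraints: List[Constraint] = [
        lambda c: c[0] < c[1],              # first pair is ordered
        lambda c: sum(c) % 2 == 0,          # total is even
        lambda c: len(set(c)) == len(c),    # all genes are distinct
    ]

    def fitness(candidate: Candidate) -> float:
        """Score in [0, 1]: the share of constraints the candidate satisfies."""
        satisfied = sum(1 for check in constraints if check(candidate))
        return satisfied / len(constraints)

    print(fitness([1, 3, 2]))   # 1.0   -> all three constraints hold
    print(fitness([3, 1, 2]))   # ~0.67 -> the ordering constraint fails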
If you look at GP research, it almost all focuses on symbolic regression, because that problem space is easier thanks to commutativity and associativity. Generating programs is much harder because of state-space issues, and genetic programming quickly falls into premature convergence. If you actually survey the breadth of research in this field, it is almost all efforts to combat premature convergence. As an algorithm, I don't think GP is all that great an optimization technique. My PhD turned from GP research to developing an alternative called Prioritized Grammar Enumeration (PGE). PGE could be adapted to generate programs, but I suspect similar scaling and correctness issues from state-space complexity. PGE does offer more opportunity to create local generators that ensure some amount of correctness. Both algorithms still primarily benefit from noticing and ignoring bad solutions, which dominate the search space (the search space is sparse w.r.t. good solutions).
I agree with the GP that you'll likely have to use cross-language techniques to build a system with reasonable performance. I used a mix of Go & Python with REST APIs to communicate between the processes. It was actually more performant overall that way, as certain subtasks had bad performance in a single language (small-object allocation in Python was a bottleneck; Go lacks a SymPy equivalent for mathematical simplification).
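For a flavor of what such a bridge can look like (a minimal sketch, not the commenter's actual code), a small Python service can wrap SymPy's simplify() so the Go side only has to make HTTP calls:

    # Hypothetical sketch of a REST bridge: Python exposes SymPy's
    # simplification, and a Go process POSTs expressions to it.
    from flask import Flask, jsonify, request
    from sympy import simplify, sympify

    app = Flask(__name__)

    @app.post("/simplify")
    def simplify_expr():
        expr = request.get_json()["expr"]        # e.g. "x**2 + 2*x + 1 - (x + 1)**2"
        result = simplify(sympify(expr))         # SymPy does the symbolic work
        return jsonify({"expr": expr, "simplified": str(result)})

    if __name__ == "__main__":
        app.run(port=8000)                       # the Go side just issues HTTP POSTs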
They go into significant detail about their sample handling as well as documenting potential sources of contamination here: https://www.plasticlist.org/methodology
For those interested, see Daniel Shiffman's Nature of Code[1], a book in which you go from simple "ant" simulations to "boid" flocking behaviors, and from physics simulation to machine learning, neuroevolution, and NEAT using p5.js for graphics.
Ruby and Rails will continue to grow in prominence thanks to improvements in YJIT, GC, and Ractor/Fibers, and to the reemergence of SQLite as a production-grade tool.
GenAI tools will "replace" developers in the same way that no/low-code tools allowed anyone to make an app. These tools will be tied to specific vendors, meaning you can either go all-in on AWS/Google/OpenAI as an LLM app platform or hire a developer to build the app. Developers who augment their tooling with LLMs will learn faster and become stronger generalists overall. Fast-growing companies will hire less than they otherwise would, but subject-matter experts will keep the lights on, and those who can reach across bureaucracies to get things done will remain.
Consumer appetite for products that use LLMs for traditional workflows will sour: chatbots and human-computer interfaces will frustrate, but novel applications like improved search and last-mile customization might take hold. "AI powered" will drop out of the marketing lexicon for segments whose consumers want more privacy and just want to buy new shoes online.
We won't see the return of high-demand positions with high pay and lots of perks. Companies have been incentivized, and permitted, to run lean and to increase performance demands on remaining staff. Teams have been understaffed for months, but growth remains steady.
Daniel Shiffman is brilliant. I haven't watched The Coding Train for a while now, but back in the day I really enjoyed his attitude. I think "creative coding" is the optimal path to get people into programming, and he's an excellent teacher of it.
https://bsky.app/profile/sciguyspace.bsky.social/post/3lfmbm...
https://x.com/sciguyspace/status/1878713938109776032?s=46&t=...