The development of steam technology is a great metaphor. The basic understanding of steam as a thing that could yield some kind of mechanical force almost certainly predated even the Romans. That said, it was the synthesis of other technologies with these basic concepts that started yielding really interesting results.

Put another way, the advent of industrialized steam power wasn't so much about steam per se, but rather the intersection of a number of factors (steam itself obviously being an important one). That intersection became far more likely as the pace of innovation accelerated with the Enlightenment, and as it became easier to collect and synthesize information.

I suspect that the LLM itself may also prove to be less significant than the density of innovation and information in the world it's developed in. It's not a certainty that there's a killer app on the scale of mechanized steam, but the odds of such significant inventions arguably increase as the fundamentals of modern AI become common knowledge for more and more people.


It's mostly metallurgy. The fact that we became so much better and more precise at metallurgy is what enabled us to make use of steam engines. Of course a lot of other things helped (glassmaking and whale oil immediately come to mind), but mostly, metallurgy.


I remember reading an article that argued it was basically a matter of path dependence. The earliest steam engines that could do useful work were notoriously large and fuel-inefficient, which is why their first application was pumping water out of coal mines: there the fuel problem was effectively moot and their other limitations didn't matter much, while rising wages in the UK made even those inefficient engines cheaper than manual labor. Their use in that very narrow niche then allowed them to be gradually improved to the point where they became suitable for other contexts as well.

But if that analogy holds, then LLM use in software development is the "new coal mine": the niche where the technology gets perfected until it spills over into other areas. We're definitely not at the "Roman stage" anymore.


If we go by that analogy, I think LLMs (and all of our current programming automation, like compilers) are just different mechanical parts. They will improve in quality and precision, and the surrounding products will make them even more effective (MCP is vulcanized rubber here? :D), but they aren't coal or even the steam engine.


There's a point in the article that mentions allowing the model to ask questions. I've found this to be especially helpful in avoiding the bad or incomplete assumptions that so often lead to lousy code and debugging.

The (occasionally) surprising part is that there are times when the generated clarifying questions actually spawn questions of my own. Making the process more interactive works as a sort of pseudo rubber-duck process: forcing yourself to specifically articulate ideas serves to solidify and improve them.
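To make that concrete, here's a minimal sketch of the idea against the OpenAI chat API. The model name and the prompt wording are just placeholders I made up, not anything from the article:

    # Minimal sketch: instruct the model to ask clarifying questions
    # before writing any code. Model name and prompt text are
    # illustrative placeholders.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    system = ("Before writing any code, ask numbered clarifying questions "
              "about requirements, edge cases, and constraints. Only "
              "produce code once they've been answered.")

    resp = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": system},
            {"role": "user", "content": "Write a rate limiter for our API."},
        ],
    )
    # The first reply should be questions, not code; answering them in a
    # follow-up turn is where the rubber-duck effect kicks in.
    print(resp.choices[0].message.content)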


I think the progression of sentiment is basically the same. There were lots of folks pushing the idea that connecting us all would somehow bring about the evolution of the human race by putting information at our fingertips. That was eventually followed by concern about kids getting obsessed and porn-saturated.

The same cycle happened (is happening) with crypto and AI, just in more compressed timeframes. In both cases, an initial period of optimism transitioned into growing concern about the negative effects on our societies.

The optimistic view would be that the cycle shortens so much that the negatives of a new technology are widely understood before that tech becomes widespread. Realistically, we'll just see the amorality and cynicism on display and sweep it under the rug anyway.


It's only a matter of time until private enterprises figure out they can monetize a lot of otherwise useless datasets by tagging them and selling them (likely via a broker) to organizations building models.

The implications for valuation of 'legacy' businesses are potentially significant.


Already happening.


The other side of this argument is that we're constantly fed lots of extraneous information along with the actually interesting content. The point about listening to the storyteller is completely valid, but that storyteller wasn't full of advertisements, links to other stories, or entreaties to smash a like button.

To an extent we're becoming wired to skim content because that content has been so deeply interleaved with items that aren't just extraneous, they're not even from the storyteller. I'd suggest this capability is even a kind of survival skill, akin to not only being able to spot motion in a dense jungle but to also instinctively focus on certain kinds of motion.


I'm curious: why were the performance gains mentioned so much more substantial for Qwen than for Llama?


It looks like llama.cpp has some performance issues with bf16.


I had the same experience with Digital Ocean. Thankfully there were several other providers happy to take my money immediately.


DO will sign off in minutes (or did for me).


FWIW, it's 100% true that BGP itself doesn't *use* multicast, but it can *propagate* multicast routing information. It's certainly technically possible to support multicast on the Internet (...thus the invention of MBGP), but in practice it has been a non-starter for a whole bunch of reasons.
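For the curious, a rough sketch of what carrying multicast routes over MBGP looks like in Cisco-style configuration (addresses and AS numbers are made up). Note these routes don't forward multicast traffic themselves; PIM uses them for RPF checks:

    router bgp 65000
     neighbor 192.0.2.1 remote-as 65001
     !
     address-family ipv4 multicast
      neighbor 192.0.2.1 activate
      network 203.0.113.0 mask 255.255.255.0
     exit-address-family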


It's kind of funny: a fair amount of the major network vendors' hardware (e.g. Cisco, Arista, Juniper, HPE) isn't that much better than what MikroTik has produced at a fraction of the cost. Having a better and faster processor is great, but I don't think it's going to move the needle very much.

This really highlights how much the OS on network hardware is actually the biggest barrier to entry to the larger market. It's arguably one of the market segments where open source has traditionally had the least adoption. Things have certainly been changing in recent years for certain use-cases (e.g. SONiC and similar for DC switching), but it remains true that the OS itself (and the associated supporting infrastructure) is what drives both adoption and stickiness, not the newest/biggest/fastest speeds and feeds.

It's been true for a while that if RouterOS could be enhanced and made more attractive (manageability, support, QA, feature roadmap, 3rd-party ecosystem, etc.), MikroTik could become a major market disruptor.


This really does change the interaction with art. As a future expansion it might be neat to recognize scenes on camera that would make for interesting art (e.g. detection of people/animals, or recognition of certain styles of composition), as well as being able to choose among different styles.

It seems akin to some modern art that incorporated TV screens and video to make dynamic installations, like the work of Nam June Paik.

