
so by that logic is transaction data allowed without a warrant?


I think the big win with AI is being able to work around jargon. Don't know what a word means? Ask AI. What's the history on it? No problem. Don't understand a concept? Have it explained at a high school reading level.


what if cool new tech is just slowing down and AI is masking it?


Not a "what if". Can you name 3 new cool technologies that have come out in the last 5 years?


1. Copilot for Microsoft PowerPoint

2. Copilot for Windows Notepad

3. Copilot for Windows 11 Start Menu


Nah man, I’m still waiting for Copilot for vim.


Yeah. Where are all the great new Mac native apps putting Electron to shame, the avalanche of new JS frameworks, and affordable SaaS to automate more of life? AI can write decent code, so why am I not benefiting from that as a consumer?


Well, if you're a consumer of code, then technically you benefit. Otherwise, you probably won't notice it as much.


It's almost like a lot of our technologies were pretty mature already and an AI trained on 'what has been' has little to offer with respect to 'what could be'.


oof that's profound. Really nice closing thought for 2025.


LLMs, Apple Silicon, self-driving cars just off the top of my head without really thinking about it.


GPT-2 was 6 years ago, the first Apple silicon (though not branded as such at the time) was 15 years ago, and the first public riders in autonomous vehicles happened around 10 years ago. Also, 2/3 of those are "AI".


> the first Apple silicon (though not branded as such at the time) was 15 years ago

Nobody, not even Apple, was using the term "Apple Silicon" in 2010.

The first M series Macs shipped November 2020.


Quibbling over 1 year is being pedantic. Apple Silicon is clearly referring to the M series chips, which have disrupted and transformed the desktop/laptop market. Self-driving also refers to the recent boom and ubiquity of self-driving vehicles.


M series is an iteration of A series, "disrupting markets" since 2010. LLMs are an iteration of SmarterChild. "Self driving vehicles" are an iteration of the self-parking and lane-assist vehicles of the last decade.

I'm bored.


Damn, what an LLM roast. SmarterChild couldn't even recall past 3 messages.


I would be bored too if I was disingenuous. Everything is an iteration of ENIAC right? Things haven't changed at all since then right?


All of those things are more than 5 years old.


I could not get in a Waymo and travel across San Francisco five years ago, are you serious?


Neuralink and quantum computers are hitting interesting milestones, with Microsoft releasing a processor chip for quantum computing. Green steel is another interesting one, though not as 'sexy' as the previous two.


didn't believe the quantum stuff, so I googled it. I'm shocked how far it's come. Even China has some kind of photonic quantum chips now.


Wait so quantum is going to actually deliver something useful within the next 10-20 years??


Incredibly cheaper batteries and solar panels. Much better induction stoves.


I was thinking computer-related, but those are good, and better battery technology helps with computing.


Uhhh, LLMs? The shit computers can do now is absurd compared to 2020. If you showed engineers from 2020 Claude, Cursor, and Stable Diffusion and didn't tell them how they worked their minds would be fucking exploding.


So LLMs exist therefore nothing else is worth the time? That’s sort of the gist of HN these past few years


Moreover: people’ve been crowing about LLM-enabled productivity for longer than it took a tiny team to conceive and build goddamn Doom. In a cave! With a box of scraps!

Isn’t the sales pitch that they greatly expand accessibility and reduce cost of a variety of valuable work? Ok, so where’s the output? Where’s the fucking beef? Shit’s looking all-bun at the moment, unless you’re into running scams, astroturfing, spammy blogs, or want to make ELIZA your waifu.


No, I was just skeptical of the GP's assertion that tech hasn't produced anything "cool" in the last 5 years when it has been a nonstop barrage of insane shit that people are achieving with LLMs.

Like the ability for computers to generate images/videos/songs so reliably that we are debating if it is going to ruin human artists... whether you think that is terrible or good it would be dumb to say "nothing is happening in tech".


This was posted earlier today:

https://www.danshapiro.com/blog/2025/12/i-made-the-xkcd-impo...

The xkcd comic is from 11 years ago (September 2014).


Surely you have realized by now that a large portion of the HN userbase is here for get rich quick schemes.


ahh, brings me back to the blockchain days and the many excuses people made for using blockchains instead of a SQL database, for whatever reason


It’s really incredible how quickly people take things for granted.


LLMs are one, granted. GP asked for three, though.


GGP's question doesn't make sense, though. What does it mean for a technology to "come out"?

Also what does three prove? Is three supposed to be a benchmark of some kind?

I would wager every year there are dozens, probably hundreds, of novel technologies being successfully commercialized. The rate is exponentially increasing.

New procedural generation methods for designing parking garages.

New manufacturing approaches for fuselage assembly of aircraft.

New cold-rolled steel shaping and folding methods.

New solid state battery assembly methods.

New drug discovery and testing methods.

New mineral refinement processes.

New logistics routing software.

New heat pump designs.

New robotics actuators.

See what I mean?


Great list, and most of those don't involve big tech. I think what your list illustrates is that progress is being made, but it requires deep domain expertise.


Technology advances like a fractal stain, ever increasing the diversity of jobs to be done to decrease entropy locally while increasing it globally.

I would wager we are very far from peak complexity, and as long as complexity keeps increasing there will always be opportunities to do meaningful innovative work.


1. We may be at the peak complexity that our population will support. As the population stops growing, and then starts declining, we may not have the number of people to maintain this level of specialization.

2. We may be at the peak complexity that our sources of energy will support. (Though the transition to renewables may help with that.)

3. We may be at the peak complexity that humans can stand without too many of them becoming dehumanized by their work. I could see evidence for this one already appearing in society, though I'm not certain that this is the cause.


1. Human potential may be orders of magnitude greater than what people are capable of today. Population projections may be wrong.

2. Kardashev? You think we are at peak energy production? Fusion? Do you see energy usage slowing down, speeding up, or staying constant?

3. Is the evidence you're seeing appear in society just evidence you're seeing appear in media? If media is an industry that competes for attention, and the best way to get and keep attention is not telling truth but novel threats + viewpoint validation, could it be that the evidence isn't actually evidence but misinformation? What exactly makes people feel dehumanized? Do you think people felt more or less dehumanized during the great depression and WW2? Do you think the world is more or less complex now than then?

From the points you're making you seem young (maybe early-mid 20s), and I wonder if you feel this way because you're early in your career and haven't experienced what makes work meaningful.

In my early career I worked jobs like front-line retail and maintenance. Those jobs were not complex, and they felt dehumanizing. I was not appreciated. The more complex my work has become, the more creative I get to be, the more I'm appreciated for doing it, and the more human I feel. I can't speak for "society" but this has been a strong trend for me.

Maybe it's because I work directly for customers and I know the work I do has an impact. Maybe people who are caught up in huge complex companies, tossed around doing valueless, meaningless work, feel dehumanized. That makes sense to me, but I don't think the problem is complexity; I think the problem is getting paid to be performative instead of creating real value for other people. Integrity misalignment. Being paid to act in ways that aren't in alignment with personal values is dehumanizing (literally dissolves our humanity).


Not even close. I'm 63. You would be nearer the mark if you guessed that I was old, tired, and maybe burned out.

I've had meaningful work, and I've enjoyed it. But I'm seeing more and more complexity that doesn't actually add anything, or at least doesn't add enough value to be worth the extra effort to deal with it all. I've seen products get more and more bells and whistles added that fewer and fewer people cared about, even as they made the code more and more complex. I've seen good products with good teams get killed because management didn't think the numbers looked right. (I've seen management mess things up several other ways, too.)

You say "Maybe it's because I work directly for customers and I know the work I do has an impact". And that's real! But see, the more complex things get, the more the work gets fragmented into different specialties, and the (relative) fewer of us work directly with customers, and so the fewer of us get to enjoy that.


Ah my bad, that was a silly deduction on my part.

Yes, I see your point better now; however, I still think this is temporary. It's probably something like: accidental/manufactured complexity is friction, and in this example the friction is dehumanizing jobs. You're right that this is a limiting factor. My theory is that something will get shaken up and refactored, a bunch of the accidental complexity that doesn't effectively increase global entropy will fall off, and then real complexity will continue to rise.

I'm kind of thinking out loud here and conflating system design with economics, sociology, antitrust, organizational design, etc. Not sure if this makes sense but maybe in this context real complexity increases global entropy and manufactured complexity doesn't.

Manufactured complexity oscillates and real complexity increases over longer time horizons.

So what you see as approaching a limit (in the context of our lifetimes) is the manufactured complexity, and I agree.

My point is that real complexity is far from its limit.

I'm a lot less confident, but suspect, that if real complexity rises and manufactured complexity decreases we will see jobs on average become better aligned with human qualities. (Drop in dehumanizing jobs)

Not sure how long this will take. Maybe a generation?


I see your point better also. I'd like to think you're right, especially about the accidental complexity getting removed. That would do much to make me feel more positive about the way work is.

And in fact, if you have multiple firms in competition, the one that can decrease accidental complexity the most has a competitive advantage. Maybe such firms will survive and firms with more accidental complexity will die out.


That sounds right to me. It also makes me wonder whether artificially low cost of capital (artificially low interest rates) would result in more manufactured complexity.


that doesn't even consider buying the competitor across the street and paying lobbyists to have Congress ignore you, "for the benefit of the consumer," because with the combined stores you can gain market efficiencies. of course, ignore the price gouging that actually happens.


if you've ever shopped at Giant Eagle instead of Kroger you'll understand the flaw in the logic.

higher prices usually equal better service and less busy shopping. get in and get out.

so if your time is worth more than your money, you aren't sensitive to price at the widget scale. most widgets are bundled with some kind of service.

I bought some printing and it was super cheap, but no service: not an email, not a phone number, nothing. not ordering from them again.


So the products are not the same


I think if you built some kind of game state server, it would make a great front end for it. It could even generate the "rooms" as some kind of graph, with items, foes, descriptions, and directions between the rooms. Items might need actions to transform or use them.
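
A very rough sketch of what I mean, in Python (Room, World, and connect are made-up names, not from any existing engine): the server owns the graph, and the LLM narrates on top of it and proposes moves against it.

    # Hypothetical room graph for a text-adventure state server.
    # The LLM would fill in descriptions, items, and foes; the
    # server keeps the structure consistent between turns.
    from dataclasses import dataclass, field

    @dataclass
    class Room:
        name: str
        description: str
        items: list = field(default_factory=list)
        foes: list = field(default_factory=list)
        exits: dict = field(default_factory=dict)  # direction -> room name

    @dataclass
    class World:
        rooms: dict = field(default_factory=dict)

        def add_room(self, room):
            self.rooms[room.name] = room

        def connect(self, a, direction, b, back):
            # two-way edge between rooms a and b
            self.rooms[a].exits[direction] = b
            self.rooms[b].exits[back] = a

    world = World()
    world.add_room(Room("cellar", "A damp cellar.", items=["rusty key"]))
    world.add_room(Room("hall", "A torch-lit hall.", foes=["goblin"]))
    world.connect("cellar", "up", "hall", "down")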


it's annoying to me that there's not a doc store with vectors. seems like the vector DBs just store the vectors, I think.


Elasticsearch, MongoDB Atlas, PostgreSQL, and SQLite all have vector indexes these days.
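
For example, with PostgreSQL plus the pgvector extension, the document text and its embedding can live in the same row, so search returns the docs directly. A sketch, assuming pgvector is installed (the connection string, table name, and tiny 3-dim embeddings are placeholders):

    # Doc store and vector index in one Postgres table via pgvector.
    import psycopg2

    conn = psycopg2.connect("dbname=docs")  # placeholder connection
    cur = conn.cursor()
    cur.execute("CREATE EXTENSION IF NOT EXISTS vector")
    cur.execute("""
        CREATE TABLE IF NOT EXISTS documents (
            id bigserial PRIMARY KEY,
            body text,
            embedding vector(3)  -- real models use far more dimensions
        )
    """)
    cur.execute(
        "INSERT INTO documents (body, embedding) VALUES (%s, %s::vector)",
        ("hello world", "[0.1, 0.2, 0.3]"),
    )
    # nearest neighbors by L2 distance; the doc text comes back with the hit
    cur.execute(
        "SELECT body FROM documents ORDER BY embedding <-> %s::vector LIMIT 5",
        ("[0.1, 0.2, 0.3]",),
    )
    print(cur.fetchall())
    conn.commit()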


> MongoDB Atlas

It took a while, but eventually open source dies.


My search service Lens returns exact spans from search, while having the best performance in terms of both latency and precision/recall within a budget. I'm just working on release cleanup and final benchmark validation, so hopefully I can get it into your hands soon.


Pinecone allows 40 KB of metadata with each vector, which is often enough.
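
If it helps, with the newer Pinecone Python client you can stash the document text right in that metadata, so the "doc store" rides along with the vector (the index name, key, and tiny embedding below are placeholders):

    # Sketch: document text stored as Pinecone metadata (<= 40 KB per vector).
    from pinecone import Pinecone

    pc = Pinecone(api_key="YOUR_KEY")  # placeholder key
    index = pc.Index("docs")           # placeholder index name
    index.upsert(vectors=[{
        "id": "doc-1",
        "values": [0.1, 0.2, 0.3],  # your real embedding
        "metadata": {"text": "full document body goes here"},
    }])
    hits = index.query(vector=[0.1, 0.2, 0.3], top_k=5, include_metadata=True)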


Elasticsearch and Vespa both fit the bill for this, if your scale grows beyond the purpose-built vector stores.


chroma stores both
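
For what it's worth, a minimal example: one Chroma collection holds the ids, the documents, and the embeddings together (Chroma can also compute embeddings for you if you omit them):

    import chromadb

    client = chromadb.Client()  # in-memory; use a persistent client in practice
    col = client.create_collection("docs")
    col.add(
        ids=["doc-1"],
        documents=["hello world"],
        embeddings=[[0.1, 0.2, 0.3]],
    )
    res = col.query(query_embeddings=[[0.1, 0.2, 0.3]], n_results=1)
    print(res["documents"])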


As does Azure's AI search.


I just use sqlite
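
At small scale that works fine: keep the text and the embedding in one table and rank in Python. A rough sketch of that approach (brute force, no ANN index, so it only holds up for modest row counts):

    import json, math, sqlite3

    db = sqlite3.connect("docs.db")
    db.execute("CREATE TABLE IF NOT EXISTS docs (body TEXT, emb TEXT)")
    db.execute("INSERT INTO docs VALUES (?, ?)",
               ("hello world", json.dumps([0.1, 0.2, 0.3])))
    db.commit()

    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (math.sqrt(sum(x * x for x in a)) *
                      math.sqrt(sum(x * x for x in b)))

    query = [0.1, 0.2, 0.3]
    rows = db.execute("SELECT body, emb FROM docs").fetchall()
    best = sorted(rows, key=lambda r: -cosine(query, json.loads(r[1])))[:5]
    print([body for body, _ in best])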


I always love that the solution thrown out is that we're all going to be plumbers...


a few possibilities for me are: over-engineering, rabbit holes, trying to build on top of something that only 80% works, and trying to fix something when you don't understand how it works. also, integrating with existing code bases: it will ignore field name capitalization, forget about fields, and other things like that.

I prefer working with AI, but it ain't perfect, for sure.


so was the article conflating the two, or was it just pointing out that software developer roles typically involve many hats that can be, and sometimes are, held by others in the process? i.e. systems analyst, business analyst, architect, designer, PM, QA, etc.

