nazgul17's comments

Could it be the pendulum swinging hard before settling in the middle?

Once we learn from our mistakes, we can find the frequencies that do yield the best outcome while consuming (say) ¼ the energy of an incandescent bulb.

Or the minimum set of species yielding optimal outcomes, without the answer being "all of them"


At my previous workplace, we were developing a greenfield project, years in the making and kinda already brownish. Our managers were using our estimates to choose the right amount of work to fit into a sprint (fortnight).

Am I misinterpreting things, or is there no overlap with the circumstances described in the OP? Also, in that case, how do we make quality tradeoffs when all features are necessary for the end product?


The thing is, when you copy-paste a bibliography entry from the publisher or from Google Scholar, the authors won't be wrong. In this case, they are. If I were to write a paper with AI, I would at least manage the bibliography by hand, conscious of hallucinations. The fact that the hallucination is in the bibliography is a pretty strong indicator that the paper was written entirely with AI.

Google Scholar provides imperfect citations in my experience - very often the wrong article type (e.g. article versus conference paper), and sometimes even missing authors.

I've had the same experience. Also papers will often have multiple entries in Google Scholar, with small differences between them (enough that Scholar didn't merge them into one entry).

I'm not sure I agree... while I don't ever see myself writing papers with AI, I hate wrangling a bibtex bibliography.

I wouldn't trust today's GPT-5-with-web-search to turn a bullet point list of papers into proper citations without checking them myself, but maybe I will trust GPT-X-plus-agent to do this.


Reference managers have existed for decades now and they work deterministically. I paid for one when writing my doctoral thesis because it would have been horrific to do by hand. Any of the major tools like Zotero or Mendeley (I used Papers) will export a bibtex file for you, and they will accept a RIS or similar format that most journals export.

This seems solvable today if you treat it as an architecture problem rather than relying on the model's weights. I'm using LangGraph to force function calls to Crossref or OpenAlex for a similar workflow. As long as you keep the flow rigid and only use the LLM for orchestration and formatting, the hallucinations pretty much disappear.
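
A rough sketch of that rigid flow, in plain Python rather than the parent's actual LangGraph graph (which I haven't seen); format_with_llm is a hypothetical stub, not a real library call. The point is the shape: citation facts always come from a deterministic Crossref lookup, and the model only gets to format what the lookup returned.

    # Sketch of the "rigid flow" idea. The Crossref lookup is the sole
    # source of facts; the LLM step (stubbed) only renders them.
    import requests

    def lookup_crossref(free_text_ref: str) -> dict:
        # Deterministic step: resolve the reference against Crossref.
        resp = requests.get(
            "https://api.crossref.org/works",
            params={"query.bibliographic": free_text_ref, "rows": 1},
            timeout=10,
        )
        resp.raise_for_status()
        item = resp.json()["message"]["items"][0]
        return {
            "title": (item.get("title") or [""])[0],
            "authors": [f"{a.get('given', '')} {a.get('family', '')}".strip()
                        for a in item.get("author", [])],
            "doi": item.get("DOI", ""),
            "venue": (item.get("container-title") or [""])[0],
        }

    def format_with_llm(record: dict) -> str:
        # LLM step (stub): render the verified record, e.g. as a BibTeX
        # entry. It never gets to invent fields, only to format them.
        raise NotImplementedError

    def cite(free_text_ref: str) -> str:
        record = lookup_crossref(free_text_ref)  # facts come from here
        return format_with_llm(record)           # wording comes from here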

I'm not sure I follow; are you disputing that social media cause harm to mental health, particularly in teenagers?


Is this available to replicate? I've been thinking about this for some time, for music albums, specifically.


Is there anything like this but for music selection? I mean, for adults. Say I want to have a dozen "albums" on my coffee table (NFC, QR, whatever), and insert one in a box to listen to it. Like an audio CD, but without the risk of ruining it, leveraging Spotify or my MP3 collection. Something like in the OP, but using something less prone to stop working than a floppy disk (I was there, I remember).


Yes! PhonieBox - but you build it yourself [0]. You make your own cards with NFC/RFID stickers in them, put an NFC/RFID reader somewhere nice, and hook it up to a PhonieBox RPi with Spotify and a nice sound system.

https://github.com/MiczFlor/RPi-Jukebox-RFID


You can also buy ready-made PhonieBoxes on some marketplace sites.


Dunno, not deleting the posts would be a good start.


Exactly. They're just acting like Trump during the pandemic - "no testing - no cases..." Why not just keep the posts and allow people to exchange ideas for workarounds?


This was also my experience with certain algorithms in the realm of scheduling.


All abstractions drop some details. If you're unlucky, the details that were dropped actually matter in some context. You can only make educated guesses.

Another aspect is that some abstractions are too... abstract. The concept they represent is not immediately obvious. Maybe it's a useful concept, but if it's new to someone, it takes time to internalize.


Loose typing makes you really fast at writing code, as long as you can keep all the details in your head. Python is great for smaller stuff. But past some threshold, the lack of a mechanism that has your back starts slowing you down.
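
A toy example (mine, not the parent's) of the kind of detail that slips once the codebase outgrows your head; with annotations, a checker like mypy flags it before anything runs:

    # Without annotations this only blows up at runtime, once the
    # arithmetic finally executes; with them, mypy catches it statically.
    def total_price(prices: list[float], discount: float) -> float:
        return sum(prices) * (1 - discount)

    # Elsewhere, a discount read from a config file arrives as a string:
    # total_price([9.99, 4.50], "0.1")
    # mypy: incompatible type "str"; expected "float"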


Sure, my language of choice is more flexible than that: I can type

   put "test abc999 this" into x
   add 1 to char 4 to 6 of word 2 of x
   put x -- puts "test abc1000 this"
But I'm still curious -- what's the better language?

