Hacker News | Seb-C's comments

I'm also in this camp, and that's why it does not work for me.

Natural language is just a terrible interface and fundamentally not an appropriate one to communicate with a computer.

I wonder if I'm in the minority here because I'm neurodivergent.


> an AI coder can do a lot of the boring typing a lot faster than you, leaving you right at the point of 'real implementation'

Vim and bash solved that for me a long time ago in a more reliable and efficient way (and it's certainly not the only tool capable of that).

> the same way IDEs/copy-paste/autocomplete/online documentation have radically changed our work

I was there before and got into the autocomplete/LSP thing pretty late (because Vim didn't have good LSP support for a long time, and Vim without it still made me more efficient than any other IDE with it). Those things didn't radically change our work as you claim; they just made us a bit more productive.


Blob URLs should work without much overhead.


That won't handle resizing properly. This API lets you hook directly into the painting phase of rendering in the browser, so you can both draw at the correct size without forcing a layout and handle resizing. It also gives the browser the flexibility to not paint off-screen content at all.


I'm working on a universal tool to generate and print custom dust jackets for books, with the goal of making prettier bookshelves.

https://www.jacket-lab.com/


cool idea, looking forward to seeing that on Show HN :)


Thanks! I'm waiting until it's a bit more mature before sharing it there.


Hit me up when you do, ben@shepherd.com. Happy to tweet it, and share on my reader newsletter.

It would be super cool to see different patterns etc for different genres.


Thanks, will do! I hope I can also provide different templates and some default art in the future, but I'm not entirely sure about what yet.

(nice website BTW ;) )


100% this, that is the easiest and least error-prone way to do it.

Even if the author still insisted on using a single interface, he could also do what he wants by relying on bytes.Buffer rather than bytes.Reader.


Getting an io.Reader over a byte slice is a useful tool, but the primary use case for io.Reader is streaming stuff from the network or file system.

In this context, you can either have the io.Reader do a copy without allocating anything (take in a slice managed by the caller), or allocate and return a slice. There isn't really a middle ground here.


> They help honest people to avoid potentially dangerous mistakes.

They can also prevent honest people from gathering evidence to cover or defend themselves: an abusive boss, illegal requests, harassment...


You can always use your phone to record it if there's some overriding reason to break the confidentiality protection. Or have a witness be present during a call (maybe out of sight).


I disagree. I write complex code, and it is essentially bug-free.

And no, I'm not Jesus, I just care a lot about quality and have spent the last 20 years finding ways and strategies to improve it.

Reducing the number of bugs does not mean being a god that writes bug-free code on the first draft. It means being able to detect and fix issues as early as possible. In my case I aim to always do that before letting myself push any code to git.

IMO, it only comes down to how much someone really cares about quality, but here are some examples of what can be done and is very effective:

- Plan your functional and technical design ahead of time

- Carefully research existing code to confirm the feasibility of the design

- Use a statically typed language

- Use advanced static-analysis tools

- Avoid magic, write explicit code. Especially avoid runtime checks such as reflection. Ideally, everything should be checked statically one way or another.

- Never let a code path/branch/corner case be unhandled, however unlikely it is (and go back to step one to refine the design if a code path has been forgotten in the current design)

- Always have automated testing. The bare minimum is to unit-test all business logic, including all possible code paths. E2e tests are nice to have, but not always a good investment. Tests must be 100% independent and never depend on an external environment, otherwise they're going to be flaky at some point.

- Always manually test every feature and path related to my changes (especially don't skip testing the ones that I think are going to be ok) before pushing anything to git.

- Warnings and "optional" notices are unacceptable and must always be fixed (or disabled), otherwise the list will just keep growing, which reduces the visibility of any issue and normalizes having problems.

- Have a CI integration that applies all the automated checks mentioned in this list and make everything mandatory.

Each one of those actions does, on its own, significantly reduce the number of bugs. If you combine them all, you can effectively reduce the number of bugs to pretty much zero. And since the earlier you find a bug, the cheaper it is to fix, I've also found out that in terms of productivity it's always worth the investment (despite many people pretending the opposite).
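The unit-testing point above can be sketched as a small table-driven check that exercises every branch of a piece of business logic (the classify function here is a made-up stand-in):

```go
package main

import "fmt"

// classify is a hypothetical piece of business logic with three branches.
func classify(n int) string {
	switch {
	case n < 0:
		return "negative"
	case n == 0:
		return "zero"
	default:
		return "positive"
	}
}

func main() {
	// Cover every code path, not just the happy one.
	cases := []struct {
		in   int
		want string
	}{
		{-5, "negative"},
		{0, "zero"},
		{3, "positive"},
	}
	for _, c := range cases {
		if got := classify(c.in); got != c.want {
			panic(fmt.Sprintf("classify(%d) = %q, want %q", c.in, got, c.want))
		}
	}
	fmt.Println("all paths covered") // prints "all paths covered"
}
```

The table format makes a forgotten branch visible as a missing row, which is exactly the "never let a code path be unhandled" rule applied to tests.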


IMO, this AI crap is just the next step of the "let's block criminal behavior with engineering" path we followed for decades. That might very well be the last straw, as it is very unlikely we can block this one efficiently and reliably.

It's high time we ramp up our justice systems to make people truly responsible for, and punished for, their bad behavior online, including all kinds of spam, scams, phishing, and disinformation.

That might involve the end of anonymity on the internet, and lately I feel that its downsides are getting smaller and smaller compared to its upsides.


> I believe that one day there will be great code examining security tools.

As for programming, I think that we will simply continue to have incrementally better tools based on sane and appropriate technologies, as we have had forever.

What I'm sure about is that no such tool can come out of anything based on natural language, because it's simply the worst possible interface to interact with a computer.


People have been trying various iterations of "natural language programming" since programming languages were a thing. Even COBOL was supposed to be more natural than the other languages of its era.

https://www.cs.utexas.edu/~EWD/transcriptions/EWD06xx/EWD667...

