
Stop being afraid of the future. This is experimental technology, and one day it will be the present, and normal. A vulnerability isn't fixed until it's patched: if you try to access /etc/foo, A3 will recognize you don't have access and block it, but if you do have access to /usr/bin, it won't block that. An attacker can then craft an exploit that only uses what they know you have access to, say netcat or wget in /usr/bin, to get a shell and carefully try to bypass your other security layers.
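To make that concrete, here's a minimal sketch in Python of the kind of path-based allow/deny policy described above. The prefixes and function name are made up for illustration; this is not how A3 actually implements its checks.

  # Hypothetical path-based policy (illustrative only; not A3's real mechanism).
  ALLOWED_PREFIXES = ("/usr/bin", "/tmp")
  BLOCKED_PREFIXES = ("/etc",)

  def is_access_allowed(path: str) -> bool:
      """Return True if this toy policy would permit access to the path."""
      if any(path.startswith(p) for p in BLOCKED_PREFIXES):
          return False
      return any(path.startswith(p) for p in ALLOWED_PREFIXES)

  # The obvious probe is blocked...
  assert not is_access_allowed("/etc/foo")
  # ...but an attacker who sticks to tools they already have access to is not:
  assert is_access_allowed("/usr/bin/nc")
  assert is_access_allowed("/usr/bin/wget")

The point being that the policy happily permits exactly the binaries a "living off the land" attacker needs, so path blocking alone doesn't close the hole.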


While I agree that the technology is impressive, I personally don't want to wade through a codebase full of machine-generated patches. Making sense of code when the original author is not available is bad enough; I don't want to know how bad it gets when there is no author at all.


I've got an old-ish SVN repo at work with a ~115k LOC AngularJS app, which is the end result of some very poorly thought through decisions to "allow all merges". It "works", but it's now completely unmaintainable (and has since been rewritten from scratch, which was the lowest-effort way to add new features). It's a real shame, because the original dev (who wasn't involved in the botched merge) was actually doing some really good work, but the business kinda let him down on AngularJS training for other staff, pulled him off the project, then let it crash and burn while not explicitly blaming him but making it clear they didn't blame anyone else... Not our proudest moment...


This was the thing that immediately leapt out at me as well, though I imagine you wouldn't just leave The Machine to manage things forever more, you'd use it as a first responder and then analyse and patch the vulnerability properly.

The idea of a self-patching machine going on forever and building up cruft would make a good premise for a story though :)


The intention isn't that you would carry the patch forward long-term - it's a short-term, throwaway fix, intended only to mitigate the problem whilst not breaking your application in your specific use case.

You'd still apply the proper patch from upstream when it's available and throw away the machine-generated one.


It somewhat reminds me of the things that genetic/evolutionary algorithms can produce; amazingly efficient and effective, yet nearly impossible to understand and reproduce systematically. I suppose the same might apply to "self-healing" code in the future.
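For anyone who hasn't played with these, a toy evolutionary loop looks something like the sketch below (Python, with made-up parameters, evolving a bitstring toward all ones). Even in this trivial case, the "why" behind any particular individual in the final population is buried in a long history of random mutation and crossover, which is the opacity being described.

  # Toy genetic algorithm (OneMax): evolves a bitstring toward all ones.
  # Purely illustrative; real self-healing/patching systems are far more involved.
  import random

  TARGET_LEN, POP_SIZE, GENERATIONS, MUTATION_RATE = 32, 50, 200, 0.02

  def fitness(bits):
      return sum(bits)  # number of ones; higher is better

  def mutate(bits):
      return [b ^ 1 if random.random() < MUTATION_RATE else b for b in bits]

  def crossover(a, b):
      cut = random.randrange(1, TARGET_LEN)  # single-point crossover
      return a[:cut] + b[cut:]

  population = [[random.randint(0, 1) for _ in range(TARGET_LEN)]
                for _ in range(POP_SIZE)]
  for _ in range(GENERATIONS):
      population.sort(key=fitness, reverse=True)
      parents = population[:POP_SIZE // 2]  # keep the fitter half
      children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                  for _ in range(POP_SIZE - len(parents))]
      population = parents + children

  best = max(population, key=fitness)
  print(fitness(best), "".join(map(str, best)))  # usually converges to all ones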


Assembly programmers who moved to C once said the same thing: "Oh, I don't want to wade through a codebase full of machine-generated optimizations. I just want the compiler to keep it simple, and then we can optimize." Now, on a large project, the average compiler will generate far more efficient code than a human can. Have you considered the possibility that in the future you won't have to wade through those codebases at all? The patch might live at a lower level where you never have to see it. Be imaginative.


It's kind of hard to believe we still have people like this here. People who're afraid of the future should be banned.



