
Reminds me of the evolutionary FPGA experiment where the evolved circuit depended on analog quirks of the specific chip, electromagnetic coupling between nominally unconnected cells or something like that. The same program wouldn't work on a different FPGA.



Would be interesting to hook up many FPGAs of the same model and train all of them at once. Programs with differing outputs on different individuals could be discarded (see the sketch below). The program may still not transfer to another batch of FPGAs, but at least you have a better chance of it working.
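Concretely, that consensus idea could live in the fitness function of the evolutionary loop: a candidate only gets scored if every chip in the batch produces identical outputs. A minimal Python sketch, where run_on_fpga, the device IDs, and the score callback are all hypothetical placeholders, not a real vendor API:

    from typing import Callable, Sequence

    # Hypothetical interface (an assumption, not a real API): program
    # `bitstream` onto FPGA `device_id` and return its output bits for
    # a fixed suite of test inputs.
    def run_on_fpga(device_id: int, bitstream: bytes) -> list[int]:
        raise NotImplementedError("replace with real hardware access")

    def consensus_fitness(
        bitstream: bytes,
        device_ids: Sequence[int],
        score: Callable[[list[int]], float],
    ) -> float:
        """Evaluate one evolved candidate on every FPGA in the batch.

        If any two chips disagree, the candidate gets fitness 0: it must
        be leaning on device-specific analog behavior rather than the
        documented digital logic. Only chip-independent programs are
        rewarded, so survivors have a better chance of transferring.
        """
        outputs = [run_on_fpga(d, bitstream) for d in device_ids]
        if any(o != outputs[0] for o in outputs[1:]):
            return 0.0  # outputs differ across chips: overfit to one die
        return score(outputs[0])

Nothing here is the original experiment's actual setup; it just shows where the discard step would slot into the selection loop.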

Another idea is to just train a whole bunch of them individually, like putting your chips in school. :-D


What they did was overfitting. We later found other ways of getting around the issue.



