
It's amazing how far we have come in the past decade. I took 6.034 with Professor Winston in Fall 2006 (and agree completely that he is an amazing teacher). I remember there being only one lecture covering neural networks, in which it was remarked that they were interesting in theory but disappointing in practice.



There have been four such revolutions so far, and I strongly feel this will be the fifth.

  - transistors
  - (micro)computers
  - (smart)mobile phones
  - the web
And now:

  - practical pattern recognition using neural nets
    aka deep learning


Agreed, though I draw the parallel elsewhere. The industrial revolution was about automating labor, though really it just scaled up repeatable processes. Likewise, computer programs don't really automate mental labor; they just scale up repeatable processes once the mental labor of figuring out the specification is done. Deep learning, on the other hand, promises the automation of actual mental labor: creating new information from other, unrelated information. If you look at it that way, its role and future seem pretty obvious.


Why the device-centric view? (Not that there's anything wrong with that, but, you know, it's just an incremental innovation curve.) Why not:

  - Cybernetics (incl. Szilard, Von Neumann)
  - Information theory (incl. Shannon, Turing) 
  - Whatever you might call LISP
  - Overlapping-window-based BLT GUIs
(for example)?





