
I don't think that there is a "balance" to tip; it's just that the previously theoretical work of the latter group is starting to become practical to the former in a business setting. Neural networks were a topic of research for many decades before consumer demand for graphical video games drove the GPU technology that made them practical. Backpropagation was first described as a solution for efficiently training multi-layer neural networks in the mid-1970s, but only in the last decade have we developed the infrastructure for non-CS researchers to train complex neural networks with a sub-$1000 teraflop GPU and an IPython notebook.
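
(To make that concrete: the core of backpropagation really does fit in a notebook cell. Below is a minimal sketch of training a two-layer network with plain numpy; the toy data, layer sizes, and learning rate are all made up for illustration, not taken from anything above.)

  import numpy as np

  # toy data: 4 samples, 3 features, binary target (illustrative only)
  X = np.array([[0., 0., 1.], [0., 1., 1.], [1., 0., 1.], [1., 1., 1.]])
  y = np.array([[0.], [1.], [1.], [0.]])

  sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

  rng = np.random.default_rng(0)
  W1 = rng.normal(size=(3, 4))   # input -> hidden weights
  W2 = rng.normal(size=(4, 1))   # hidden -> output weights

  for step in range(10000):
      # forward pass
      h = sigmoid(X @ W1)
      out = sigmoid(h @ W2)

      # backward pass: chain rule from the output error back through each layer
      d_out = (out - y) * out * (1 - out)
      d_h = (d_out @ W2.T) * h * (1 - h)

      # gradient descent update
      W2 -= 0.5 * h.T @ d_out
      W1 -= 0.5 * X.T @ d_h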

Off the top of my head, the closest analogy would be number theory, especially the study of prime numbers. Before information technology, number theory was esoteric and considered useless by many pure and applied mathematicians (I'm simplifying a bit for argument's sake), but all of that accumulated research proved massively useful once we started to communicate electronically. WWI and WWII cryptographers didn't become number theorists or absorb them as a group; they just adapted the knowledge to their field under the umbrella of electronic warfare. I think the same thing is happening with data scientists, who are starting to experiment with ML, but it's still just another tool in their toolbox. The media hype train focuses on the flashy AI contests and muddles the terminology, but the real work [1] is happening behind the scenes in data science.

[1] By "real work" I mean work that directly translates into value on a company balance sheet. The theoretical work is important in and of itself and has been happening for decades.


