The real problem is that software languages have not yet caught up with hardware. Back when a single instruction could keep a computer churning for a millisecond or so, that was slow, but it was still faster than having a person take two numbers and add them, or divide them, etc...
Now I find it hard to find 'useful' instructions that actually take 1 ms of processing time. Heck, TI's latest microcontrollers can power on, and stabilize analog circuits, in less than 1 ms now.
New approaches and neural networks are becoming more practical every day, as the flood of information and the abundance of processing power let us write code around sample sets and probabilities.
Soon, for loops will seem silly.
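A minimal sketch of what I mean (purely illustrative, in Python with NumPy; the function names are made up): the first version walks through the data one instruction-sized step at a time, the second hands the whole sample set to a batch primitive and lets the library and hardware decide how to grind through it.

```python
import numpy as np

# old mindset: one element, one step, one instruction at a time
def loop_sum(values):
    total = 0.0
    for v in values:
        total += v
    return total

# "sample set" mindset: one call over the whole batch,
# the runtime decides how to vectorize/parallelize it
def batch_sum(values):
    return np.sum(values)

if __name__ == "__main__":
    data = np.random.default_rng(0).random(1_000_000)
    print(loop_sum(data))   # explicit for loop
    print(batch_sum(data))  # whole-batch operation
```

Both give the same answer; the difference is who does the looping, you or the machine.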