It seems the author is down the 'deep learning' rabbit hole.
>> It does this at the cost of requiring many, many times the computing power. But much of this computing cost can be parallelized and accelerated effectively on the GPU, so with GPU cores still increasing exponentially, at some point it's likely to become more effective than CPUs.
So can any matrix operation. Sadly, not that many algorithms can be represented efficiently as one.
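For what it's worth, one of the few classic algorithms that does reduce cleanly to matrix operations is counting walks in a graph: the (i, j) entry of A^k is the number of length-k walks from node i to node j. A minimal sketch of that reformulation (numpy here just to keep it self-contained; on a GPU you'd swap in something like cupy or torch, and the graph below is made up purely for illustration):

```python
import numpy as np

# Adjacency matrix of a small directed graph (made up for illustration).
A = np.array([
    [0, 1, 0, 0],
    [0, 0, 1, 1],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
])

# Counting length-k walks reduces to repeated matrix multiplication --
# exactly the dense, parallel workload GPUs accelerate well.
walks = np.linalg.matrix_power(A, 3)
print(walks)  # walks[i, j] = number of length-3 walks from i to j
```

The hard part isn't running the multiplication fast; it's that most algorithms don't have a reformulation like this at all.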