I don't believe any apps use these algorithms, and for good reason. The main problem is that computing SOTA routes is very computationally intensive. In the example shown, plain Dijkstra takes a fraction of a second (A* is faster still, and over the years these classical methods have been sped up by a factor of millions, IIRC), whereas SOTA takes dozens of seconds to run. The gap widens quickly for larger networks and longer time budgets.
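To make the comparison concrete, here is a minimal point-to-point Dijkstra sketch on a hypothetical toy graph (the graph, node names, and weights are all made up for illustration). The key point is that each edge carries a single scalar travel time, so one pass with a priority queue suffices; this is the cheap deterministic baseline that SOTA has to beat.

```python
import heapq

def dijkstra(graph, source, target):
    # graph: {node: [(neighbor, travel_time), ...]}
    dist = {source: 0.0}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == target:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry, already settled with a shorter path
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")  # target unreachable

# Toy graph: best route A -> B -> C -> D with cost 2 + 1 + 1 = 4
toy = {
    "A": [("B", 2.0), ("C", 5.0)],
    "B": [("C", 1.0), ("D", 4.0)],
    "C": [("D", 1.0)],
    "D": [],
}
print(dijkstra(toy, "A", "D"))  # 4.0
```

A SOTA query, by contrast, cannot return a single scalar per node: it has to track a function of the remaining time budget at every node, which is where the orders-of-magnitude slowdown comes from.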
If you check out the paper I linked to, you'll get a better idea of just how computationally intensive this process is. If I recall correctly, classical point-to-point pathfinding has advanced to the point where, with some reasonable (albeit clever) preprocessing, queries on very large networks run in a matter of milliseconds on a mobile CPU. Compare that to this algorithm, which takes dozens of seconds on a network the size of San Francisco, or at best dozens to hundreds of milliseconds after practically prohibitive preprocessing (e.g. 30 CPU-days and tens or hundreds of gigabytes are pretty normal) on a Core i7 CPU. It's just not scalable yet, which is why we've been working on it as a research problem (as have many other people).
Edit: There are lots of obstacles; the above was just one. Another is that even the existing model assumes the same travel time distribution throughout the day, which is clearly unrealistic (and you can imagine that fixing this would blow up the running time even more). Yet another is the problem of getting enough traffic data to construct useful travel time distributions, which means only a major company with existing data (like Google) could release an app that works with real data from day one. Barring some kind of contract with a big company, everyone else would have to release an app just to gather data, and it would only become useful after a long time (if people are even willing to give away precise location data to random app creators).
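To give a feel for why the running time blows up, here is a rough sketch of the core operation in SOTA-style algorithms (the bin sizes, budgets, and edge distributions below are invented for illustration, not taken from the paper). Instead of one scalar per edge, each edge carries a travel-time *distribution*, and combining two consecutive edges means convolving those distributions over the whole discretized time budget.

```python
import numpy as np

budget_bins = 60  # a 60-minute budget at 1-minute resolution (made up)

# Hypothetical per-edge travel-time distributions (probability mass per bin).
edge1 = np.zeros(budget_bins)
edge1[3:8] = 0.2   # uniform over 3-7 minutes
edge2 = np.zeros(budget_bins)
edge2[5:10] = 0.2  # uniform over 5-9 minutes

# Travel time over the two-edge path = convolution of the two distributions,
# truncated to the budget (mass beyond the budget means "arrived late").
path = np.convolve(edge1, edge2)[:budget_bins]

# Probability of traversing both edges within the budget.
print(path.sum())  # ~1.0 here, since 7 + 9 minutes fits well inside 60
```

Each such convolution costs roughly O(budget^2) naively (or O(budget log budget) with FFTs), and a query repeats it across many node/budget pairs rather than doing one scalar addition per edge the way Dijkstra does. Making the distributions vary by time of day multiplies this again, since you'd need a separate set of distributions per departure window.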