You could potentially help millions of people by computing bite-sized versions of English Wikipedia each month: run PageRank over the article link graph, render a list of the top articles to HTML, and encode the result in a format that decompresses quickly on mobile.
Network connections are often slow, and English Wikipedia is really handy to have offline, but the full thing is too big for lots of people to store.
Additionally, Wikipedia has awesome stats on pageviews that need crunching - there is a wealth of cultural, zeitgeist info in them that could be parsed and used to prioritise articles with more than PageRank.
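As a rough illustration of the pageview-crunching idea, here is a minimal sketch that tallies views per English article from dump-style lines. It assumes the common hourly pageview layout of "project title views bytes" per line; the field positions and project code are assumptions, so check them against the actual dump you download.

```python
from collections import Counter

def top_viewed(lines, n=3):
    """Tally views for English Wikipedia titles and return the n most viewed."""
    counts = Counter()
    for line in lines:
        parts = line.split()
        # Assumed dump layout: project, title, view count, bytes transferred.
        if len(parts) == 4 and parts[0] == "en":
            counts[parts[1]] += int(parts[2])
    return counts.most_common(n)

sample = [
    "en Cat 120 0",
    "en Dog 300 0",
    "de Katze 500 0",  # non-English project, skipped
    "en Cat 80 0",     # counts for the same title are summed
]
print(top_viewed(sample, n=2))  # → [('Dog', 300), ('Cat', 200)]
```

The per-title totals could then be blended with PageRank scores to pick which articles make the cut for the offline bundle.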
I have some code based on Sean Harnett's work here: https://github.com/lukestanley/wiki_pagerank
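For the ranking step itself, a tiny pure-Python PageRank conveys the idea (the linked repo will differ in scale and details; the graph, damping factor, and iteration count here are illustrative assumptions):

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each article to the list of articles it links to.
    Returns a dict of article -> rank score (scores sum to 1)."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page gets a baseline "teleport" share of rank.
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                # Split this page's rank evenly among the pages it links to.
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new[t] += share
            else:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new[p] += damping * rank[page] / len(pages)
        rank = new
    return rank

# Toy link graph: three articles all link to "Python", so it ranks highest.
toy = {
    "Guido van Rossum": ["Python"],
    "Monty Python": ["Python"],
    "Django": ["Python"],
    "Python": ["Guido van Rossum"],
}
scores = pagerank(toy)
print(max(scores, key=scores.get))  # → Python
```

On the real dump the link graph would come from parsing article wikitext or the SQL pagelinks tables, and the top-scoring slice is what gets rendered to HTML.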