Hacker News

Google has indexed 38 pages so far: http://www.google.com.au/search?q=site%3Aianab.com%2Ftrillio...

Tackling the data quality & diversity of the traversed pages (the best option):

Producing English text from the 40 bits, by driving a generative grammar or a Markov/travesty generator, would make it harder for Google to detect that the pages are auto-generated. Google is unlikely to infer the function f(URL) -> text (or even to attempt it), but it would still limit the recursion for the other reasons you mention.
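A minimal sketch of the idea, with assumed names (`url_to_text`, the toy corpus): hash the URL down to 40 bits, seed a PRNG with them, and let that PRNG walk a small first-order Markov chain. The same URL then always yields the same text, so recrawls look consistent while the pages read as (rough) English.

```python
import hashlib
import random

# Tiny stand-in corpus; a real generator would train on much more text.
CORPUS = (
    "the quick brown fox jumps over the lazy dog and the dog barks at "
    "the fox while the moon rises over the quiet hill and the fox runs"
).split()

# First-order Markov transitions: word -> list of observed successors.
chain = {}
for a, b in zip(CORPUS, CORPUS[1:]):
    chain.setdefault(a, []).append(b)

def url_to_text(url, n_words=30):
    # Low 40 bits of SHA-256(url) play the role of the page's 40 bits.
    bits40 = int.from_bytes(hashlib.sha256(url.encode()).digest(), "big") & ((1 << 40) - 1)
    rng = random.Random(bits40)  # deterministic per URL
    word = rng.choice(list(chain))
    out = [word]
    for _ in range(n_words - 1):
        # Fall back to a fresh start if the word has no recorded successor.
        word = rng.choice(chain.get(word) or list(chain))
        out.append(word)
    return " ".join(out)
```

Because the text is a pure function of the URL, the site needs no storage: every "page" is regenerated on demand from its address.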

(guessing) sites like hackernews are indexed primarily by recursion (few direct inbound links to specific stories).




> (guessing) sites like hackernews are indexed primarily by recursion (few direct inbound links to specific stories).

Correct. Notice that it is difficult to find old HN comments on Google, since after a while there are no short paths from the home page to them. In practice & all else being equal (quality, length, spamminess, speed, age, uniqueness, PR, etc), the maximum depth a page can afford to have is about 6 or 7.





