
On the point of indexing what the user sees, we agree completely. On the statement that Google already does this, that's not actually true. As the article makes clear, my browser can do things that Google's crawler won't, namely AJAX, which is critical to truly scalable pages that cache aggressively and make only minimal delta requests. Even getting JavaScript-driven content crawled requires Google-specific webmaster tools.
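
For context, the Google-specific approach I mean is (as far as I understand it) the hashbang convention: the site exposes "#!" deep links and the crawler fetches an "_escaped_fragment_" version instead of running the JavaScript. A minimal sketch of that mapping, with illustrative URLs only:

  // Sketch: how a crawler rewrites a hashbang ("#!") deep link into the
  // "_escaped_fragment_" URL it actually fetches. Example URLs are made up.
  function toEscapedFragmentUrl(url: string): string {
    const bangIndex = url.indexOf("#!");
    if (bangIndex === -1) return url;              // not an AJAX deep link
    const base = url.slice(0, bangIndex);          // everything before "#!"
    const fragment = url.slice(bangIndex + 2);     // client-side state after "#!"
    const separator = base.includes("?") ? "&" : "?";
    return base + separator + "_escaped_fragment_=" + encodeURIComponent(fragment);
  }

  // e.g. "https://example.com/app#!/products/42"
  //   -> "https://example.com/app?_escaped_fragment_=%2Fproducts%2F42"

The point is that the site, not the crawler, has to do the extra work of serving a static snapshot at that rewritten URL.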

It's fairly clear that just by developing those webmaster tools, Google is effectively (and understandably, to a point) saying that they won't try to engineer solutions to certain problems, or that at this point it's not a sound financial investment, since people will come to them anyway because they're the biggest game in town.

If you're referring to my suggested alternate crawling strategy of mapping deep links onto a structured REST URL scheme (sketched below), that's just an optimization I'd love to see. Really, I just think that search indexers should index what my users see, regardless of my engineering decisions.
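
To make the REST-mapping idea concrete, here is roughly what I have in mind; every route and name below is hypothetical, just to show the shape of it:

  // Hypothetical sketch: each client-side deep link has a crawlable,
  // server-rendered twin, so an indexer can fetch what a user actually sees
  // without executing any JavaScript. Route patterns are invented examples.
  const deepLinkToRest: Array<[RegExp, (m: RegExpMatchArray) => string]> = [
    [/^#!\/products\/(\d+)$/, (m) => `/products/${m[1]}`],
    [/^#!\/users\/([\w-]+)\/posts$/, (m) => `/users/${m[1]}/posts`],
  ];

  function restUrlFor(deepLink: string): string | null {
    for (const [pattern, build] of deepLinkToRest) {
      const match = deepLink.match(pattern);
      if (match) return build(match);   // crawler-friendly canonical URL
    }
    return null;                        // no static twin for this link
  }

  // restUrlFor("#!/products/42") -> "/products/42"

The server would answer those REST URLs with the same content the AJAX view renders, so the index and the user experience stay in sync.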



