Pages With Too Many Ads Above The Fold Now Penalized By Google (searchengineland.com)
136 points by coupdegrace on Jan 20, 2012 | 77 comments



I count three ads above the fold. I believe the results below those top ads are still search results, just presented in different modalities.


Regardless of whether the product search stuff counts as an ad or not, the six items on the right definitely do.


The three in the center are in a box that says "Ads". I think they count. And all six down the right side count too. Targeted ads are still ads, even if they're relevant.


Yep, only Google's search results page gets to be top-heavy with ads.


Completely true.

http://www.google.com/search?btnG=1&pws=0&q=breast+r...

There are 11 paid links and 10 organic. At least 75% of the above-the-fold content is ads.



This is how it looks without AdBlock.

http://imgur.com/3pOyK


To be fair, I only see the three ads above and none on the right. Still, is this what Google refers to as the 'above the fold' location?


Yep, that is what they're referring to. The term comes from newspapers, which are shipped folded across the middle, so that only the half of the front page that's "above the fold" is immediately visible. The web design community adopted the term with essentially the same meaning: the part of the page that's visible when you first get there. So yes, those ads are decidedly above the fold.


Not in MA, with my Google cookies -- I only have one ad above the organic results. I have the stuff on the right, though.


No shock there — you're running AdBlock.


Sorry, I completely forgot about that. It's amazing how different everything looks with it off. I think I'm going to keep it on.


You use Adblock, yes? That's kinda the whole point.


Only one ad, no Adblock: http://i.imgur.com/35ZjK.jpg

I think Google likes me :P


Go here: http://www.google.com/ads/preferences and scroll down to demographics. Google thinks I'm a 35 y.o. male interested in sci-fi and gadgets.

I'm guessing Google thinks you won't be getting a breast reduction.


"No demographic categories are associated with your ads preferences so far."

I don't really know what this implies in terms of ads, but I'm not complaining ;)


Strange. I was like OP. 11 ads (3 top, 8 side), 9 results, 3 images.


This was my experience as well. Are you outside of the US?


Yes.


Google's results pages are also blocked by robots.txt, so feel free to load up on advertising on the pages you don't want indexed.


Google is penalizing sites that violate their standards of top-heaviness. I don't think it's fair to call them hypocrites because they don't meet your standards.


The article shows the "credit cards" results page with (at his resolution) 4 top ads, 4 full and 1 partial side ad, and 1 partial result shown. If 9 ads to 1 line of content isn't top-heavy with ads according to Google's standards, then their standards are worthless.


If that results page were even close to the average case for Google, I'd agree with you. But it's not, so it doesn't really prove much of anything. Google isn't going to penalize your whole site just because one page out of a million is a little over the line.


This argument sort of rubs me the wrong way. It makes perfect sense for a search engine to penalize pages that are ad-top-heavy; why should it matter what Google does on its own pages? I get it, practice what you preach, but I'm pretty sure Google is not concerned with the SEO of their results pages. The whole argument seems to be reaching for something that isn't there.


Disagree. I whitelist Google and all its subdomains in AdBlock because a moderate number of text-only ads is both reasonable and can even be useful. By contrast, many sites seem to spend 80%+ of their bandwidth on advertising cruft and serve pitifully thin content.


Google search has ads on the top, but it isn't top-heavy.


Yeah, this is awfully rich considering where G's money comes from.


Google's money comes from providing things to click that are relevant. Sometimes those are ads, but most times those are search results. If you click a search result and see nothing but ads, you're less inclined to continue to use Google. A balance has to be struck between advertising and non-advertising, and all ads is not the right balance.


This seems rather contradictory given AdSense's historical guidance to run up to 3 ad units and place them toward the top and left side of the page (per the various heatmap studies often cited). It also seems like this could be gamed pretty easily, depending on how much of a CSS wizard you are, since I suppose the algorithms are somewhat limited in analyzing CSS and Javascript as opposed to plain HTML layout.
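
To illustrate the kind of gaming I mean (a purely hypothetical sketch, not anything AdSense condones): keep the ad markup at the bottom of the HTML source, where a crawler reading raw markup would see a content-first page, then move it to the top with a few lines of Javascript once the page loads in a real browser.

    // Purely hypothetical sketch: the ad container sits at the very bottom
    // of the HTML source, so a crawler reading raw markup sees content first.
    window.addEventListener('load', function () {
      var ad = document.getElementById('ad-slot');       // id invented for the example
      var content = document.getElementById('content');  // id invented for the example
      // A real browser then moves the ad block above the content after load,
      // so human visitors see an ad-dominated first screenful.
      content.parentNode.insertBefore(ad, content);
    });

The element ids are invented for the example; the point is that only a crawler that actually renders the page and runs its scripts sees the layout users see.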


There was an article a while back about how Googlebot is likely a version of Chrome that fully renders a page to process it. [1] You're right: without a full browser engine you could pull a whole host of tricks to make the page appear clean and then put the ads in later. Since Googlebot is probably a full-featured browser, it becomes a lot harder to trick.

[1] http://ipullrank.com/googlebot-is-chrome/
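
To make that concrete, here's a rough sketch of the kind of check a rendering crawler makes possible. It uses a headless browser driver (Puppeteer) purely for illustration; the viewport size and the ad selectors are assumptions, and it's not a claim about what Googlebot actually runs.

    // Rough sketch only (Node + Puppeteer assumed): score above-the-fold ad coverage.
    const puppeteer = require('puppeteer');

    async function adShareAboveFold(url) {
      const browser = await puppeteer.launch();
      const page = await browser.newPage();
      await page.setViewport({ width: 1024, height: 768 }); // one guess at "the fold"
      await page.goto(url, { waitUntil: 'networkidle2' });

      const share = await page.evaluate(() => {
        const fold = window.innerHeight;
        let adArea = 0;
        // The selector list is invented for the example.
        document.querySelectorAll('iframe[id^="google_ads"], .ad, #tads').forEach(el => {
          const r = el.getBoundingClientRect();
          const visibleHeight = Math.max(0, Math.min(r.bottom, fold) - Math.max(r.top, 0));
          adArea += visibleHeight * r.width;
        });
        return adArea / (window.innerWidth * fold);
      });

      await browser.close();
      return share; // e.g. 0.4 means 40% of the first screenful is ads
    }

A crawler that only parses raw HTML can't compute any of this, which is presumably the point of rendering pages in something Chrome-like.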


There are definitely at least two different, shall we say, classes of crawlers: the Googlebot and the Google Web Preview crawler. I don't know the extent of Googlebot's javascript parsing, but the web preview crawler appears to parse javascript like plain WebKit.

Here's a link where they show the UA for the Google Web Preview crawler: http://www.webmasterworld.com/search_engine_spiders/4353651....

I believe this crawler renders your pages for the preview snippets you get in search results when you hover over the arrow that appears on the right side of a result.

The previews I've seen would only look that way if javascript was being rendered and allowed to run for ~10-20s, by my estimation, based on the progress of an animation in the preview.


I've seen a lot of evidence that Googlebot evaluates Javascript in a page. For one thing, I've got a site that has Facebook Connect on it, and I see Facebook's bot following on the heels of Googlebot. They work together just like those crabs that work in teams to cut up starfish...


I regularly get emails from Google's AdSense program encouraging me to add more ad blocks, and as you said, they include a template that encourages you to place those ads high on the page. Got one of these emails just yesterday, funnily enough.


I wouldn't assume that at all. What's keeping them from running a GUI-less version of WebKit on their server farm? In fact, I'd be surprised if they _didn't_.


Agreed. Four years ago I was offered a position at a Norwegian company that used a modified version of Firefox to read and compare various news sources, so this was definitely possible even back then.


Why would you assume that they don't fully analyze web pages? Google makes the most advanced Javascript engine and the fastest-growing browser; it's beyond belief that they aren't rendering every page. (For starters, how do you think they get all those preview images?)


The Googlebot doesn't fetch external css or javascript. How can it be rendering the full page if it doesn't have those resources?


Googlebot definitely fetches Javascript. How else would it get AJAX pages?

http://searchenginewatch.com/article/2122137/Googlebot-Learn...

It also grabs CSS and has for years.


My mistake. I just checked my server logs and can see Googlebot fetching these resources now.


Rather ironic to see this on a site so crowded with obnoxious social-media cruft that it actually took me about 5 seconds to find the article text...


Well, I guess they got penalized and got upset. But honestly, it's good if such pages are penalized. Time matters, and regularly having to wade through advertisements just to reach the content is genuinely annoying.


Prime example of a page which works better if you have NoScript+RequestPolicy+AdBlock installed. Just opened it in my vanilla Chrome installation so I could see what you meant.


This change also needs to penalize pages with full-screen ads that aren't always shown (the NY Times, among others, does this). Can anyone at Google confirm or deny whether these types of pages are being penalized?


God I hope so.


Since when did Google become the Internet police? Seriously, I don't think it's their role to penalize anything you find annoying online.


It's Google's service, it's Google's web-spiders, and it's Google's attempt to deliver useful content to their own customers. How is it NOT in Google's interest to police the links they present to visitors of Google's own properties?

If you don't like it, use Yahoo!.


But where does it stop? Is it OK, for example, for them to downgrade a site based on political content? Or religious content? It's not much of a jump from "we don't want you to visit this site because it has too many ads" to "we don't want you to visit this site because we don't believe in its views".

> If you don't like it, use Yahoo!.

Thanks for the tip, but I prefer DuckDuckGo.


Advertising company penalises others who sell advertising, news at 11.


Cool, I didn't know about the browser sizing tool:

http://browsersize.googlelabs.com/


I wonder if this isn't at least partly a sneaky way to penalize pirate sites without needing to wade into the intractable question of copyright. There seems to be a strong correlation between a site's sketchiness and the likelihood that it will plaster obnoxious ads all over the top of every page, and the timing is weirdly coincidental if not.

(I really hope this doesn't derail one of the few non-SOPA threads. But that's the most relevant motivation I can think of for this change.)


Any serious 'pirate site' has absolutely no ads. The only 'pirate sites' with ads are the ones open to the public -- and those are not the serious ones, but rather unreliable sites with no community. I'm guessing you are not a member of a private tracker.


I said pirate, not private. Private trackers may be where the most hardcore pirates hang out, but they're still only a fraction of all pirate sites. What you're saying here is precisely cognate to the No True Scotsman fallacy.


> But that's the most relevant motivation I can think of for this change.

Really? You think "a sneaky way to penalize pirate sites" is a more relevant motivation than "help users find information more easily by trying to filter out sites that have so many ads they obscure content"?


Yes. The latter motivation is not at all timely. I don't see any reason for Google to be more concerned about it than they have been for the last five years.


> Yes. The latter motivation is not at all timely. I don't see any reason for Google to be more concerned about it than they have been for the last five years.

So you're under the assumption that Google implemented every quality-related change to the core ranking algorithms - every change they ever wanted - 5 years ago, and now they just sit around and react to politics.

Gotcha.


So many issues with the internet seem to devolve into the same thing: a fight over who gets to show ads to the naive user.

Eventually Google itself will be showing "too many ads above the fold". Does anyone doubt it?

Gaming the search engine to be numero uno on the SERP is one thing. But proclaiming a penalty for websites that have "too many ads"? That seems like it's for users to decide, not Google. Not to mention hypocritical. Can we penalise Google for "too many ads"?


Huh? That's why you go to a search engine -- so that it decides what good content is and the user doesn't have to (or at least they only have to do so in a dramatically-reduced-dimensional space).

When Google sends me to the worst of these kinds of sites, I become extremely annoyed...at Google. So yes, we can and should "penalize" Google, but on metrics like quality of results (which includes the ads being shown). In an ideal world I'm replacing having to separate the wheat from the chaff of the entire internet with having to separate the wheat from the chaff of the search engine market, and I'm going to favor a search engine that does a better job.

In other words, as a user, "since Google is a website that uses ads, and they're going to favor websites that use fewer ads, aren't they hypocritical?" is not a question I care about even a little.

What I do care about, among other things, is having a search engine that doesn't show me useless crap.


The fact that they have to make changes to their system in order to not have useless crap appear at the top of the results tells us something: either people are searching for crap or the portion of the web Googlebot is crawling is full of crap.

Neither is something the search engine can fix for you.

With respect to the latter idea, the search engine may in fact be contributing to it by encouraging more crap to be created, because it easily percolates to the top of their "intelligent" results and users blindly click on result #1. And no doubt many users see these results as equivalent to "the web". Whatever Google returns, to them, that's "the web".

You can think about the web through the lens of "search engine results" and evaluate the web based on whatever is returned from your search engine queries.

Or you can think of the web as a huge mess of websites some of which are useful, most of which are crap and many of which an aggressive search engine might index.

Are you evaluating search results, or websites?

I'm evaluating websites, individually. Because that is what the web is. To me, Google is not the web. Google might give me some clues about some sites. They do an enormous amount of grunt work crawling them.

But it's up to me to do the final evaluation. To decide whether a site is useful or whether it is crap.

And there are other ways to discover websites besides using Google. How do you think Google learns about existing and new websites? Voluntary disclosure by the webmasters?

It sounds like you want someone to evaluate websites for you. I doubt you are alone in that regard.

This is not a new problem.

However, unlike you, I do not see Google as providing any viable solution.


> The fact that they have to make changes to their system in order to not have useless crap appear at the top of the results tells us something: either people are searching for crap or the portion of the web Googlebot is crawling is full of crap.

No, it means the ranking algorithm is evaluating the results wrongly. Which is what they're trying to fix.

> With respect to the latter idea, the search engine may in fact be contributing to it by encouraging more crap to be created, because it easily percolates to the top of their "intelligent" results and users blindly click on result #1. And no doubt many users see these results as equivalent to "the web". Whatever Google returns, to them, that's "the web".

But that's the point, isn't it? It shouldn't easily percolate to the top. That's what their algorithms are for. If it does, they need to be fixed.

> Are you evaluating search results, or websites?

> I'm evaluating websites, individually. Because that is what the web is. To me, Google is not the web. Google might give me some clues about some sites. They do an enormous amount of grunt work crawling them.

> But it's up to me to do the final evaluation. To decide whether a site is useful or whether it is crap.

I don't get what you mean by "Google being the web". Of course the final evaluation is up to the user. But if Google can rank the results more like you would, you're wasting less time clicking through the crap to get what you want.

> And there are other ways to discover websites besides using Google. How do you think Google learns about existing and new websites? Voluntary disclosure by the webmasters?

Actually, they do that too. But mostly by painstakingly loading every link recursively, something which is obviously impossible for a person to do unless they want to be limited to 0.0...01% of the web.


I'm pretty amazed that they didn't do this a long time ago... sites with no content above the fold have been a problem for ages...


I think they have been trying for a long time. It's possible to conceive of the idea in a second, but it can be much harder to find the right algorithms to do it effectively. They make changes every day and study user response by randomly serving new versions of the algorithm to a small group of users who are effectively beta testers without even knowing it. It's a known problem, but it can still be tricky. It's obvious what to do with pages built around keywords like "iphone" and "jailbreak" that carry nothing but ads, or that just want to sell you something without any real content on the page. But if you're searching for, say, discussion of a scientific paper on blogs, you don't want to penalize a good blogger who runs a few ads in favor of a crackpot with their own theory of the universe and no real science, just because the crackpot's page happens to have no ads.
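
To make the trade-off concrete (a toy heuristic, not anything Google has published): once you can measure the rendered layout, the scoring itself could be as crude as a thresholded ratio, and the hard part is choosing thresholds that catch the keyword-spam pages without hurting a blogger who runs a couple of ad units.

    // Toy heuristic only; thresholds and field names are invented.
    function topHeavyPenalty(layout) {
      // layout: { adAreaAboveFold, foldArea, adUnitsAboveFold }
      var ratio = layout.adAreaAboveFold / layout.foldArea;
      if (ratio > 0.5 || layout.adUnitsAboveFold > 5) return 1.0; // clearly ad-stuffed
      if (ratio > 0.3) return 0.5;                                // borderline
      return 0.0;                                                 // a few ads: no penalty
    }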


Well, I think Google also has a motivation to make the search results as bad as they can get away with.

A crappy page with nothing but ads is still going to get clicks, and since Google is the #1 ad network, that means more money for Google.

If Google's organic search results were perfect, people would never click on the ads. The worse the results are, the better the ads look by comparison, and you get trained to click... and ker-ching!

Google only needs to be good enough to (i) discourage mass defection to Bing and DuckDuckGo, (ii) not produce public outrage, as happened when A-list bloggers were getting outranked by duplicate content, and (iii) not get in trouble for antitrust. (Hint: if you want to run a spam farm, buy a second- or third-tier search engine.)


> "(Side note, that yellow color around the ads in the screenshot? It’s much darker in the screenshot than what I see with my eyes. In reality, the color is so washed-out that it might as well be invisible. That’s something some have felt has been deliberately engineered by Google to make ads less noticeable as ads)."

Sounds like the author needs to go through the process of correctly setting up his monitors using the Apple-provided tool in System Preferences. I have absolutely no problem seeing the yellow box in Google's search results.


My monitor is calibrated properly, so I checked.

The yellow in that screenshot is very faint. It's HTML color #fef5e4, which is about 10% yellow and little else.

Then I compared it with the ad background on a live SERP, which rendered as #fff, aka 100% white.

Sounds like the author was right.
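
For what it's worth, the arithmetic is easy to check. Treating "yellowness" roughly as the CMYK yellow component (how far the blue channel falls below the brightest channel), #fef5e4 really does come out at about 10%:

    // Rough RGB-to-CMYK check of the "10% yellow" claim:
    function yellowness(hex) {
      var n = parseInt(hex.slice(1), 16);
      var r = n >> 16, g = (n >> 8) & 0xff, b = n & 0xff;
      var max = Math.max(r, g, b);
      return (max - b) / max; // CMYK yellow component, 0..1
    }
    yellowness('#fef5e4'); // ~0.10 -> about 10% yellow (the screenshot)
    yellowness('#ffffff'); // 0.00  -> the live ad background, pure white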


This is what I am seeing when Googling "trash cans":

http://i.imgur.com/gJjCK.png

To me that yellow stands out extremely well; I have no trouble seeing it.

The background in Google's CSS for that div with id "tads" is: #FDF6E5

This is for the search "credit cards":

http://i.imgur.com/bPnkJ.png

The background colour is still the same. I just asked around the office whether anyone had been misled by these ads, and no one seems to have any trouble seeing the yellow or to have missed it.


Ah yes, that SERP does have an ad block with background color #FDF6E5. However, that color is only about 10% yellow. Objectively, that should present as a very faint yellow.

If you're seeing a gold-ish yellow background, then your monitor is set to low brightness or Gamma 2.2. That way, you won't see colors as they are.


My monitor is set for accurate colour representation. Gamma 2.2 is the norm these days and is what websites, photography software, and video software all expect.

Mac OS X used to be the lone hold-out at 1.8 gamma, but those days are long gone (Snow Leopard changed the default to 2.2 to be in line with the rest of the industry). Windows has always been at 2.2 gamma.


Let's closely monitor YouTube's rankings in that case.


I skimmed the article, but I didn't see where Google draws the arbitrary line it considers "the fold".


I thought "the fold" was always the line where you have to scroll to see below it


Yes, but for what combination of hardware and software?
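
That's exactly my point: there is no single fold. It's just the viewport height of whatever browser window and screen the visitor happens to be using, which is presumably why Google published the browser-size tool linked elsewhere in this thread. In the page it's nothing more than:

    // "The fold" is just the visible viewport height, which differs per device and window:
    var fold = window.innerHeight || document.documentElement.clientHeight;
    console.log('Anything below y=' + fold + 'px requires scrolling to see.');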


Why do people still browse the web without an ad blocker?


Seriously, if you were a web publisher whose livelihood depended on advertising revenue, you'd be kinda glad that most of them didn't.


Best way to find the content: don't use javascript.


Then I can't see 80% of the "Show HN" stuff that enters the newsfeed!


Exactly. You get content only. Do you want to read, or do you want to click on stuff? Most ads (stuff you click on) are javascript-driven. Most content (e.g. text, like what you're reading now) is not. Articles on sites like HN and similar render just fine without javascript. Converting them to nicely formatted text is easy once the effects of javascript are eliminated. Let the downvotes begin!



