This is the problem with using associative reasoning to detect bad actors. Bad actors are attempting to look like good actors, but good actors aren't trying to avoid looking like bad actors. Bad actors will target a stereotyped set of good actor attributes, and your filters will mostly catch unique or idiosyncratic good actors, who aren't targeting anything.
Unless your good actors are willing to conform to a published set of (nerfed) behaviors that don't have the possibility of being bad, or are willing to register and be vetted by you individually, you can't help but be overwhelmed by false positives. It's the same reason why the most intricate, pervasive, and technologically flexible surveillance system in history can't find a terrorist.
edit: I think the entire endeavor is doomed. The filter bubble associating your current searches with your past searches, and the attempts to eliminate spam and eHow through algorithms, have just resulted in eliminating most sites from the searchable internet. You don't realize how bad it's gotten in all of the search engines until you spend an hour on something like millionshort (which looks like it's down now.)
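To put rough numbers on the false-positive point: a back-of-the-envelope Bayes sketch, with every figure invented purely for illustration.

```python
# Back-of-the-envelope Bayes, with invented numbers: when bad actors
# are rare, even an accurate filter's flags are mostly false positives.
bad_rate = 1e-6              # assume 1 in a million actors is bad
sensitivity = 0.99           # filter catches 99% of bad actors
false_positive_rate = 0.01   # and wrongly flags 1% of good actors

p_flagged = sensitivity * bad_rate + false_positive_rate * (1 - bad_rate)
p_bad_given_flagged = sensitivity * bad_rate / p_flagged

print(f"P(bad | flagged) = {p_bad_given_flagged:.5f}")
# ~0.0001: roughly 9,999 of every 10,000 flags hit a good actor
```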
Reminds me of (forum) mafia: there's a set of heuristics for how "town" players behave. But real town players will break those heuristics occasionally, because they're trying to catch the mafia - so the players who conform exactly to the "town" heuristics all the time are actually more likely to be mafia players.
What it's resulting in is a 'news-stand' effect in which articles from the mainstream press will rank higher than anyone else on a given query if the match is close enough. Although I'm not a huge fan of the press as it is today, it could be worse.
You can actually identify opportunities by finding valuable queries that are being squatted on by the content mills. Lots of nice things in stuff like home improvement, insurance, and finance.
The thing is, in the context of content, if a bad actor acts sufficiently like a good actor, it is effectively a good actor. You just have to do a sufficiently good job of determining good or bad, and not some proxy of "like good" or "like bad".
I'm glad I experienced the old internet back in its heyday, when high quality sites linked to other high quality sites, and Google exposed this natural topology for all to explore. Google's success eventually led to the end of the old internet: AdSense captured its value and then SEO perverted it with noise.
I expect within 5 years some crotchety old programmer will have built a search engine that penalizes sites for anything more than basic markup. No JS, no CSS, no hints of a CMS. Maybe it's already been done. "Old Skool Search"
And it will still be a poor imitation, because if there is one thing you can't recreate and experience for yourself, it is an internet gone by.
I sincerely believe we need two Internets, or we need two Googles.
We need what started with ARPANET, but this time never let it commercialise. PageRank works for that network.
We also need the commercial network that works like a better Yellow Pages but actually gives valuable local focused information on businesses and services that can't be gamed and has different rules.
Trying to apply the same ranking rules to both types of searches pollutes the academic side and destroys every local small business unlucky enough not to make it to the magic first page, because nothing else counts and the winner takes all.
Google has taken on a lot of responsibility dictating what they think we are looking for and they are failing us (and especially small business) big time. Maybe they are thinking about this? Maybe Google Local will factor more than PageRank for commercial. Maybe they will change the layout so more than 10 lucky links get to page one and you never get to have multiple listings for the same company. But right now it is bad and only getting worse.
MetaFilter at least has the advantage of being old and respected enough in tech circles to get the attention of Hacker News and Matt Cutts himself when something like this happens. For the vast majority of legitimate websites who see their traffic drop or become non-existent when google pushes an update to their ranking algorithms, there is virtually no recourse. I understand it's a necessary evil and there will always be collateral damage when trying to combat spammy websites, but at the same time it's scary to know that there are people whose sole source of income is web traffic and overnight it can disappear without explanation.
It's amazing how crappy Google results are nowadays. It's all Buzzfeed, Crackle, Mashable, Y!A, WikiHow, eHow, and other garbage results akin to those garbage sites. True information like you'd find on MeFi or some of the more interesting subreddits is now completely buried. There are no real alternatives, either. DuckDuckGo pulls info from less popular sites, and you can find real gems with the right search parameters and !bangs, but the relevance is not nearly as good as Google's.
I hate to sound like a conspiracy theorist, but I think since we started getting blog/aggregator/quizlet-type sites on the web back in 2009-2011, the quality of search results has gone far, far down the river.
> It's amazing how crappy Google results are nowadays. It's all Buzzfeed, Crackle, Y!A, WikiHow, eHow, and other garbage results akin to those garbage sites.
I think this is just a sign of the fact that the web has come of age and everyone is on it -- the things that people are actually most likely to be looking for no longer correspond very well to the things people who were actively and heavily using the web in, say, 1998, were looking for.
Not only did it make it impossible for me to figure out what those graphs were supposed to show, it made it hard for me to read the paragraphs directly before & after because the giant vine image was so distracting.
This is what's so frustrating about Google. I don't want to spend time poking your black box to try to divine why my company's site has dropped in rankings. Especially when that means major architectural changes with no guarantee of any sort of payoff.
The small online retailer I work for experienced a 1/3 drop in Google traffic a year and a half ago. No discernible reason. Thin content? Maybe - we do have a lot of filters to make it easier to browse products.
Of course, these are things we built for our customers, not Google. I'm not going to spend a month or two rearchitecting them just to see if Google likes it.
"Advice online was to scale back the "top heavy" ads... I removed most ads to the absolute minimum of just 1 or 2 per page..."
He also mentions going from running ads that blended with the content to later running ads that stood out from it.
Funny thing is that if you are an AdSense publisher, you get emails from Google all the time telling you what you should do. If you aren't using your full allocation of 3 ads and 2 LUs per page, they will often email to remind you to run more ads. And they'll encourage you to run ads in prominent positions towards the top of the page. And to run ever-larger ads. And then change from text-only ads to image ads.
I have had AdSense-monetised sites for around 10 years and have had seemingly automated emails as well as personal emails from account managers about these things.
Funnily enough, I got an email from Google this morning:
"You could run 27 more ads on 15 pages of your website."
Apparently I should try to make more money by using the full ad complement on all pages... The 15 pages it's probably talking about are things like the contact form, about page, etc.
That's hilarious. Metafilter's content is great, especially AskMe.
The idea that pictures would improve things for metafilter is bizarre; it's a discussion site and the discussion is really deep on both MeFi proper and AskMe, which is where a lot of the search traffic went (very deep archives).
It's possible that people wouldn't "bounce" as much if there were lots of images, but it's not obvious that getting people to stay that way is a good thing. Google's algorithms shape the web, they don't just measure it, and I'd rather see more link-heavy, image-sparse pages.
lauradhamilton said Metafilter's content was thin, not that it was bad or wasn't great.
If you answer everything in two sentences, Google is still likely to punish you for it even if those are ideal answers. They punish thin content even if it's supposedly great.
I actually view the thin/thick content issue as strong proof that Google's search engine is incredibly dumb.
They have to constantly write edge-case algorithms because the overall 'intelligence' of their search system is very low. They're whittling their way into a corner in the process, because they don't know any other way to deal with problems than to further narrow what's defined as good with another specialty algorithm that someone inside Google hacks together to buy another day in the fight against spam.
The situation is: Google can't tell what's a good answer and what's not based on the content. They have to try to figure out what's good or bad based on every other measurement except the actual content. So if the ideal answer is a mere 47 words long, Google is too stupid to know the difference and understand there are many instances where short & concise is better than seven pages of verbosity. In the not so distant future, this is going to be laughable.
My opinion might be scoffed at today, but I think Google's search platform is little more than state of the art junk that scales well. It's the best junk we have right now, and that's not saying much.
Killing Google search should be on PG's list of hard things that someone should be tackling. They're a dinosaur.
You are talking about the difference between good content, and what is perceived to be good content.
Until Google fully develops AI that can actually analyse content, it has to judge content by metadata about it. This can be the length of the content. This can be things like images.
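As a toy illustration (every signal and weight below is invented), a metadata-only scorer inevitably rewards padded verbosity over an ideal 47-word answer:

```python
# Toy proxy scorer: judging content by metadata (length, images, links)
# instead of meaning. All signals and weights invented for illustration.
def proxy_quality_score(word_count: int, image_count: int,
                        inbound_links: int) -> float:
    score = 0.0
    score += min(word_count / 500, 2.0)    # "thin content" proxy: length
    score += min(image_count, 3) * 0.5     # "rich media" proxy: images
    score += min(inbound_links / 10, 3.0)  # reputation proxy: backlinks
    return score

# An ideal 47-word answer loses to padded verbosity every time:
print(proxy_quality_score(word_count=47, image_count=0, inbound_links=2))    # ~0.29
print(proxy_quality_score(word_count=2000, image_count=3, inbound_links=2))  # ~3.70
```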
Thanks, but I hope you are a bit wrong. To me, "lots of social shares" generally indicates that a site is not high quality, and the image thing just bugs me. I hate images that exist for no good reason.
Interesting idea, but it sounds too expensive. I do believe the image filename and alt/title attributes come into play, though those are easily spoofed, so they can't be taken too seriously by Google.
For example: Surrounding text, links to the page or image, topic analysis, color analysis, filename, alt attribute, OCR, advanced image recognition, Google Image Labeler (manual labeling).
Reverse image search is probably too expensive to do this for every image found on the web, but Google could surely do this for a subset. Then they could match alt attributes: If two sites use the same image, and one has an alt attribute of "automobile" and the other "car in street" you can ask Bayes how likely it is that these two sites both made fake alt text that does not describe the image.
Basically if Google Images can show you images relevant to your search term, then Google Search can find out if an image is relevant to a topic in much the same way.
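A hypothetical sketch of that cross-check; the word-overlap scoring here is my own stand-in, not Google's actual method:

```python
# Hypothetical alt-text cross-check: if several sites embed the same
# image (say, matched by a perceptual hash), their alt texts should
# broadly agree. Divergent alt text is weak evidence of spoofing.
from collections import Counter

def alt_text_agreement(alt_texts: list[str]) -> dict[str, float]:
    """Score each alt text by how many of its words other alts also use."""
    all_words = Counter(w for alt in alt_texts for w in alt.lower().split())
    scores = {}
    for alt in alt_texts:
        words = alt.lower().split()
        # fraction of this alt's words that also appear in another alt
        shared = sum(1 for w in words if all_words[w] > 1)
        scores[alt] = shared / len(words) if words else 0.0
    return scores

# Two honest descriptions overlap; a keyword-stuffed one stands out:
print(alt_text_agreement(["car in street", "automobile in a street",
                          "cheap viagra best price"]))
# {'car in street': 0.67, 'automobile in a street': 0.5,
#  'cheap viagra best price': 0.0}
```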
Yes, it can improve your rankings, both directly and indirectly.
Directly: It adds rich content to your posts (videos, audio, images). It adds keywords in the image alt attribute and image description. Good images will increase your ranking on Google Images.
Indirectly: If the images contribute to the topic/post (not merely decorative), then you get more user satisfaction, which will increase repeat visits, word-of-mouth and links.
Do the all-things-equal test: Two pages on the same topic (let's say 'Bonobo Monkey'), but one has a picture of the topic. Which one would you find more useful? Two pages of equal quality, yet one has a picture of the author with rich markup next to it, which one would you find more credible?
As with all things, don't overdo it, or use it to manipulate: adding images when they don't make sense for the topic, copying all your images from elsewhere, spamming the alt attribute with keywords, putting up a page full of unrelated images, etc. Basically, do not treat it as an SEO technique; treat it as enhancing your content with media to increase overall quality and user satisfaction.
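If you want to sanity-check your own pages, a minimal audit sketch along these lines (using BeautifulSoup; the thresholds are arbitrary) catches both missing and keyword-stuffed alt text:

```python
# Minimal image audit: flag images with missing or keyword-stuffed alt
# text, in the spirit of "enhance content, don't manipulate".
from bs4 import BeautifulSoup

def audit_images(html: str, max_alt_words: int = 8) -> list[str]:
    warnings = []
    for img in BeautifulSoup(html, "html.parser").find_all("img"):
        alt = (img.get("alt") or "").strip()
        src = img.get("src", "?")
        if not alt:
            warnings.append(f"{src}: missing alt text")
        elif len(alt.split()) > max_alt_words:
            warnings.append(f"{src}: alt text looks keyword-stuffed")
    return warnings

print(audit_images('<img src="bonobo.jpg">'
                   '<img src="x.jpg" alt="buy cheap best bonobo monkey '
                   'photo picture image free download hd">'))
# ['bonobo.jpg: missing alt text', 'x.jpg: alt text looks keyword-stuffed']
```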
Even if we assume, for the sake of argument, that everyone agrees on who the good guys and the bad guys are:
There's still no way for an algorithm to exclude all the bad guys while excluding none of the good guys. And trying to improve in one area often makes the other worse. (That's before even getting to the fact that it's a dynamic system in which the bad guys are constantly adapting to avoid exclusion.)
> Often, there is an inverse relationship between precision and recall, where it is possible to increase one at the cost of reducing the other. Brain surgery provides an obvious example of the tradeoff. Consider a brain surgeon tasked with removing a cancerous tumor from a patient’s brain. The surgeon needs to remove all of the tumor cells since any remaining cancer cells will regenerate the tumor. Conversely, the surgeon must not remove healthy brain cells since that would leave the patient with impaired brain function. The surgeon may be more liberal in the area of the brain she removes to ensure she has extracted all the cancer cells. This decision increases recall but reduces precision. On the other hand, the surgeon may be more conservative in the brain she removes to ensure she extracts only cancer cells. This decision increases precision but reduces recall. That is to say, greater recall increases the chances of removing healthy cells (negative outcome) and increases the chances of removing all cancer cells (positive outcome). Greater precision decreases the chances of removing healthy cells (positive outcome) but also decreases the chances of removing all cancer cells (negative outcome).
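The same tradeoff in miniature for a spam filter, with invented numbers:

```python
# Precision/recall bookkeeping, mirroring the surgeon analogy above.
# Making the filter more aggressive trades one error type for the
# other; you can rarely improve both at once.
def precision_recall(tp: int, fp: int, fn: int) -> tuple[float, float]:
    precision = tp / (tp + fp)  # of everything removed, how much was bad
    recall = tp / (tp + fn)     # of everything bad, how much was removed
    return precision, recall

# Liberal filter: catches all 100 spam sites, but takes 50 good ones too.
print(precision_recall(tp=100, fp=50, fn=0))  # (0.67, 1.0)
# Conservative filter: only 60 spam sites caught, but no good ones lost.
print(precision_recall(tp=60, fp=0, fn=40))   # (1.0, 0.6)
```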
Obviously, what happened to metafilter isn't right.
But it should also be obvious why Google doesn't say what needs to be done to remove a penalty. The sites that should receive a penalty will just use that information to further game the system.
How is it obvious that what happened to Metafilter isn't right? I'm inclined to be sympathetic when any non-spammy site is hit by a Google penalty, but what's the basis used for determining that Metafilter is deserving of a particularly high ranking above where it's at right now?
The only thing I've seen are a few opinions stating that Metafilter is great. That doesn't make it so to the majority of web users.
It's not just that MetaFilter's own ranking has dropped. If that were the case, then you might argue that it was just the algorithm doing its job.
But Google has clearly miscategorized MetaFilter as a content farm or linkspam site, and is actively telling other sites that it is penalizing them for outbound links from MetaFilter that it has categorized as spam, even in cases where those links are clearly not spam.
Excellent point. I didn't put together the link designation with the likelihood that Google had put Metafilter into a spam penalty box. I simply hadn't considered that a possibility given Metafilter's reputation.
Every time I read about this Metafilter situation, I cringe in thinking about how horrible Answers.com is with the abusive tactics they employ now when it comes to displaying content / answers. And yet they remain non-penalized; historically they're one of AdSense's biggest publishers, always found that interesting.
That's a reasonable argument against Google announcing what needs to be done. It's no argument at all against someone from Google contacting Matt Haughey directly, and telling him.
That's what surprises me most in all of this. Why hasn't Cutts picked up the phone and called Haughey? Haughey's a long-time and well-respected Web Dude. He's not a spammer, nor some 'random dude with a site.' Sure seems like a win/win for Google to help MetaFilter out of this jam.
I guess this is what Matt would say to Haughey: your site looks like a content farm; lots of posts have poor content, are filled with links, and have no backlinks, so it was hit by Panda. Even if there are a lot of good pages on the site according to some, having a lot of shallow pages as well is enough to get caught.
> Obviously, what happened to metafilter isn't right.
Why is that obvious?
One aspect of these types of stories that seems to get ignored is the value that these sites provide for searchers: Was Metafilter providing a good experience for users? Is their experience worse with the site ranked lower? Is it, perhaps, better?
That is the question that needs to be answered (e.g. what questions brought someone to metafilter? I can honestly say I've never been directed there), not whether MetaFilter is entitled to some set quantity of search traffic per month.
If Google et al. can manually punish you, they should be able to manually un-punish you also. That this feature doesn't exist is a serious design flaw. A human being should always be able to look at a site and say, "Nope, you're good," and let you be ranked naturally again.
But the successful bad actors are more noticeable because they are near the top of the search while those you have to unpunish are hard to find.
Even if they can easily find them, I think the real reason they want to stay away from it is that having such a tool would open a can of worms on the search-neutrality side of things (i.e. we know Google+ is a good actor, so we should remove link-out penalties; we know our largest advertising client is generally a good actor, so we should remove penalties from their site). Even if Google wasn't tempted to misuse the tool, you can easily see it generating a bunch of lawsuits from bad or unlucky actors accusing Google of favoritism.
We had a site get hit hard by Panda 4.0 but we are not surprised. About a 35% drop overnight May 20th.
An observation that we have made over the years is that significant changes to Google's algorithms always seem to soften in the subsequent months for us and we return to a high level in the Google SERPs.
The site in question is a site where we curate free craft projects and patterns, which my spouse and I began in the '90s with our family and friends. The idea was to gather together excellent craft projects on little-known, mostly small sites, with the criteria that they are free, complete, usually require no email/login, and are within two clicks of us. We still update it every week.
Over the years we have used user feedback to make design decisions. For example, when Pinterest became popular we got a lot of feedback to use masonry instead of tables for our images/links and lately we've been moving to make it fully responsive because we get a lot of feedback from tablet users.
The only time we were manually penalized by Google involved an issue where we had ignored our users' complaints, so they were right and we were wrong. We fixed it.
Our site looks thin to an algorithm and we almost always get hit by large algorithm changes, but over the subsequent months the site always moves back up and ranks very well. We can only imagine that this means the algorithm is somehow tempered by our visitors' behavior (we use AdSense and Analytics, so they see it all) and is not simply a switch that is thrown and left on.
I have to wonder how much MetaFilter has done to gather user feedback. The answer to their problem may be there.
Anyone with any experience in the SEO world knows that Danny Sullivan doesn't have a fucking clue anymore. He's no different than a talking head reporting on the daily ups and downs of the stock market on your local news.
Anyone with any experience in the SEO world knows that dchuck doesn't have a fucking clue anymore. He's no different than a talking head reporting on the daily ups and downs of the stock market on your local news.
Ad hominem.
If instead of saying "X is an idiot, anyone can tell " you'd say what's wrong with his argument, you wouldn't look like a talking head.
If you were active in the SEO world, you'd know what I said wasn't an Ad Hominem but instead an accurate description of his abilities.
Danny Sullivan does not actively "do SEO" anymore, a field that changes monthly. He reports on news that other people discover about happenings of the search engine industry.
Hence why he concluded that he didn't know what happened to MetaFilter. Because he has no fucking clue how modern SEO works anymore.
MetaFilter's real problem is that on a web that is currently built around sharing, its audience shuns social media. I've never seen a MetaFilter link in any of my feeds. I've never seen a story on one of the major blogs use MetaFilter as a source (except for this Google story).
I agree it's backwards / ridiculous, but that is apparently how Google prefers links to be treated now. It clearly breaks the notion of good will value sharing between sites. It seems these days all the best ranking sites are careful to nofollow across the board.
I used to see claims that Google gave sites a modest benefit for sharing pagerank via links to other high quality sites or similar sites. That doesn't seem to hold up under scrutiny when you look at the sites that have done well by optimizing SEO.
MetaFilter should definitely nofollow their external links in the comments. That makes sure that spammy/unnatural links do not count as a vote (which turns MetaFilter into a bad neighborhood as far as search engines are concerned, and attracts the spammers).
... <a href="#">Sample</a>, an Indian online pharmacy. I have not ordered from them myself. ...
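A minimal sketch of the nofollow fix being suggested, assuming the comment HTML can be rewritten server-side before rendering (the host list and example URL below are illustrative):

```python
# Mark external links in user comments as rel="nofollow" so they
# don't pass PageRank. Host list and example URL are illustrative.
from urllib.parse import urlparse
from bs4 import BeautifulSoup

OWN_HOSTS = {"metafilter.com", "ask.metafilter.com"}  # assumed host list

def nofollow_external_links(comment_html: str) -> str:
    soup = BeautifulSoup(comment_html, "html.parser")
    for a in soup.find_all("a", href=True):
        host = urlparse(a["href"]).netloc.lower()
        if host and host not in OWN_HOSTS:
            a["rel"] = "nofollow"
    return str(soup)

print(nofollow_external_links(
    '<a href="http://example-pharmacy.example">Sample</a>'))
# -> <a href="http://example-pharmacy.example" rel="nofollow">Sample</a>
```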
They should look real hard at their site structure and which pages they allow to be indexed. Too many vague subdomains, and 'posts tagged with...' in the index.
I wonder what their user metrics show to Google. I don't think Google is penalizing sites based on a mere hunch, or won't notice that a site like MetaFilter got hit by an algorithmic update.
This site is someone's child, so it's hard to be too critical, but the view that MetaFilter hosts the quality content of the internet is far too rosy.
Pages like http://www.metafilter.com/67307/Gampoumlmbampoumlc, which are veiled spam pages. Pages full of inane babble about medical issues or personal drama, with the credibility and authority of a Yahoo Answers page. Just because people paid MetaFilter to post, and a site is heavily moderated, does not mean it adds anything to the search results for most generic search terms.
Then most links or interesting stories are from elsewhere on the web. MetaFilter is kinda like trying to rank with the old Digg comments: not the best place on the internet to read a discussion surrounding a topic. Why should it deserve to rank for linking to a funny or controversial news article and riling up 15 comments or so?
In the end there may be far more natural reasons for this. The user signals showed something was wrong with the site. People bounced a lot (be it the design, be it because you can't comment without paying, be it because 15 random internet comments have little value). In the meantime Reddit and other sites grew to large communities. You don't see Reddit comment threads ranking all too high either, unless they are really special, popular or significant, like the president doing an AMA.
There's absolutely nothing spammy about http://www.metafilter.com/67307/Gampoumlmbampoumlc, and if it has to be excluded from the internet because it contains two links to the website of a product along with discussion of that product between two dozen people - your search is broken, not the page.
I'm not sure when it became alright to exclude "inane babble" from the internet, but I'm not comfortable with an algorithmic measure of lack of interestingness or quality being judged as an attack. I also think that it's moving the goalposts from detecting destructive or deceptive content to defining what all content on the internet should look like.
That page is commercial spam (or "veiled spam"). It serves no other purpose than to get a link to his own product site, no discussion, no nothing. Just: For the special price of 999 Euro you can get your own. How is that quality content?
It is also spammy in that it doesn't nofollow external links in comments. Those links should have been nofollowed like 6 years ago. That they still aren't is telling about MetaFilter's knowledge of SEO. You know that Google sees your site linking to bad sites, and what do you do? Complain about it? Fire people due to an update 2 years ago? Or do you simply take action to fix it?
MetaFilter doesn't have to be (and isn't) excluded from the internet. It just doesn't have to rank well for terms like "Gomboc".
It became alright to derank "inane babble" the moment it increased the satisfaction of search engine users. Which is pretty much from the beginning of search engines. It is user metrics that count. Statistics don't lie.
It does not matter that you are uncomfortable with an algorithm. I don't think I get your "judged as an attack" part. The alternative is not using a search engine that relies on algorithms.
I did a casual investigation of MetaFilter. It has not changed much over the years. The pages I did look at were all of the level of "inane babble". One line comments. Puns. Not adding much to an already thin topic. Perfectly alright to have a place on the internet to debate how to get your friend to take Viagra. But not much need to rank that for men's health topics, relationship advice or even Viagra.
Now would you want to find that page when you are searching for those symptoms? Or would you want credible medical advice?
If it is moving the goalposts, it is moving the goalposts towards better quality content. Those visitors not going to MetaFilter are not gone; they are catered to by other sites. Every tag on MetaFilter has a specialist site now. Ranking MetaFilter so high before 2012 was a present. MetaFilter doesn't deserve to be deranked any further, but it doesn't deserve huge pre-2012 rankings either. In my experience, today's SERPs are much better than ever before, and the complaints are biased, reminiscing about the good old times or influenced by the fact that some paid to post on that site (you'll defend it, as you don't want to back a losing horse, and your experiences with the site are probably good).
>It serves no other purpose than to get a link to his own product site
You've simply made this up. The poster (gleuschk) may have some undetectable connection to the product (just like you or I may have one), but it's the only post that he/she ever made about it. Judging by a skim of their thousand comment history on the site, what gleuschk seems to have is an interest in mathematics.
You could have figured that out with a click, though. Does the fact that you didn't bother make your comments inane enough to be filtered off of the internet as a spamlink for metafilter? Can you detect that with an algorithm?
Thanks for the link, though. I remember seeing that shape in a segment of a documentary on PBS a number of years ago, and it was nice to be reminded of it.
edit: From your examples of bad pages on metafilter, I know that I don't want you vetting what is or isn't quality content on the internet, yet I still trust you more than some associative algorithm.
> That page is commercial spam (or "veiled spam"). It serves no other purpose than to get a link to his own product site, no discussion, no nothing. Just: For the special price of 999 Euro you can get your own. How is that quality content?
Just for reference: MetaFilter bans self-links (i.e. links to content that the poster is involved with in any way) outright in posts to the main page. It's extremely unlikely that the poster has anything to do with the product in this post whatsoever. The fact that they're a member of fifteen years' standing with only 15 posts to their name makes it even more unlikely.
I wouldn't argue that this post deserves to rank highly in any search engine, but to downgrade the whole site because of it? Madness: it's a link to an interesting mathematical shape, with a bit of perfectly on-topic discussion underneath that contains a few other relevant links. Best of the web? Probably not. Spammy? Hardly.
> It is also spammy in that it doesn't nofollow external links in comments. Those links should have been nofollowed like 6 years ago. That they still aren't is telling about MetaFilter's knowledge of SEO. You know that Google sees your site linking to bad sites, and what do you do? Complain about it? Fire people due to an update 2 years ago? Or do you simply take action to fix it?
Links to spam sites will be flagged and then squashed by the moderators in short order. Well-moderated comments by an active user base shouldn't be nofollowed; they contain actual useful information that a search engine (like Google!) ought to find useful, if PageRank matters at all.
It's possible that metafilter might have a problem with links in old posts ending up pointing to domain squatter sites I suppose, but if this is a major problem then it ought to show up in metafilter's google webmaster tools.
I shouldn't have said that the poster owned that site; that was a mistake, and it detracts from my point. I should have left it as an example of a thin-content page that doesn't really add anything to the web, of which there are many at MetaFilter; that the view that MetaFilter pages are all quality and deserve to rank well is not realistic; and that it isn't a mistake to derank a site when user metrics and A/B tests show that doing so increases user satisfaction.
>but to downgrade the whole site because of it?
I do not think this happened. I do not think I found the single page on MetaFilter that deserved a downgrade of the whole site. It is but a symptom of a deeper underlying problem with the site's quality. Best of the web? Nope. Clear-cut spam? Nope. Somewhere in between? Yes. The rankings reflect that.
>Links to spam sites will be flagged & then squashed by the moderators in short order.
Except they aren't. Google reports followed links to spammy sites. I found a link to an Indian pharmacy in a few seconds of searching. Moderators ought to work. Nofollow will work.
The problem is not the old links, though they are accompanied by old stale content, which IS a problem. Also these are todays links:
- "Everyone On Wall Street Is A Dick."
- Is Using Lotion a Black Thing?
- #basketball #trickshots
These link to other websites like vine.co or b3ta.com. They are followed by some comments that new visitors can not join in on unless they pay. This is fluffy content that ranked well around 2006; it doesn't anymore. Why are these comments so special that they deserve a top 10? When hundreds of other sites also put up a link and have a small discussion? Where is the unique quality content of MetaFilter? In the comments?
Ah, I think you're operating under the misapprehension that it was metafilter.com that was attracting the Google clicks & so you're reasonably looking at posts to metafilter.com & wondering how they could ever have ranking in Google searches in the first place.
As I understand things, it's the ask.metafilter.com Q+A sub-site that was the main source of Google search clickthroughs & thus ad revenue (Yup, 90% according to Matt Haughey's essay on medium.com: https://medium.com/technology-musings/941d15ec96f0 ). The main metafilter.com site is pretty much irrelevant in revenue terms, although it may be affecting the overall ranking of ask.metafilter.com pages in Google search results for internal Google voodoo reasons of course.
NB. Let me know where that link to the Indian pharmacy is & I'll flag it.
This is why it happened. Mystery over. 99% of the time, "we didn't do nuffin" means that the site's moderation was not as tight as once thought.
Ignorance isn't really an excuse on this, because it takes five minutes to explain to people nofollow vs. followed links. However, when you explain it to people, 99% of the time, they will continue breaking the rules because they have time preference issues: they know that they benefit from their site passing rank in the present, and the penalty is an unknown quantity in the future.
If Google ends up fixing this, I hope they don't just add MetaFilter to a whitelist and call it a day. The results nowadays almost seem like a hand-curated list that would only help the person who made that list.
As someone who's not familiar with MetaFilter, my initial impression is that the design feels outdated and I can't quickly identify how the site works or the value it provides.
I'm sure that over time I could grow to love MetaFilter, and possibly even find value in the qualities that give me this impression. But perhaps Google has an algorithm that lowers the rankings of sites that scare away casual users who stumble upon it from a Google search result (High and/or quick bounce rate maybe?).
It is one of the better link blogs on the web. I've been reading it since 2005 or so, and I gave them my $5 back in 2008. Prior to the rise of the fractal advice subreddits, AskMetafilter was (and probably still remains) one of the best general-purpose advice forums on the web.
The point is, it has an incredibly high signal-to-noise ratio and it's a damned shame it's getting penalized for it.
Really? I've had several links posted to metafilter, in every instance the comment thread was full of petty, small minded nagging. Seems like just another 'intellectual' echo chamber to me.
> my initial impression is that the design feels outdated and I can't quickly identify how the site works or the value it provides
Part of the problem is that you're right, identifying the value it provides isn't a quick process, and it's less because of design problems and more because the kind of value it provides needs longer periods of evaluation to identify.
It can take more than a casual visit or two to run across the illuminating post or comment that's basically a small dissertation where someone drops the knowledge.
And it will probably take a little bit before someone offers to just send you some hardware they're not using, just because you expressed an interest in something like it.
Some examples that might be more amenable to evaluation:
* This went viral in January:
"Holy cow! 14 minutes to solve the back of the card that has been bugging my family for 20 years! That is amazing!"
Within a half hour on AskMeFi, someone picked out the answer to my problem.
It's probably notable that if you search for the exact title to my question (but not in quotes), you will get the SuperUser/StackExchange answer first in Google results (AskMe second). I did post the answer a Metafilter user helped me pull together on SU, so it's got the same content, but that's not where I got help.
I remember a few years ago, when I didn't know MetaFilter, I often left it pretty quickly when it showed up in search results because it looked like one of those spammy link-aggregator websites. (I had the same initial hurdle with StackOverflow because I thought it was similar to Yahoo Answers.) Maybe this is a common behavior among new visitors and Google lowered some bounce rate threshold?
Given the leaked Google Rating Guidelines file, Metafilter's design, and running multiple blended Adsense units, it would be marked as spam when manually reviewed.
Yes, I believe it's possible that bounce rate is a factor. I'd guess Google uses bounce rate as one factor in determining user interest in the target link.
Thus something like a dated design might have an impact on search ranking.
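For what it's worth, the raw metric is trivial to compute; a bounce is just a single-page session (the data below is invented):

```python
# Illustrative bounce-rate calculation: the fraction of sessions that
# view exactly one page. Session data structure is invented.
sessions = [
    ["/ask/12345"],                     # landed, left: a bounce
    ["/ask/12345", "/ask/67890", "/"],  # stuck around
    ["/67307/Gampoumlmbampoumlc"],      # another bounce
]

bounces = sum(1 for pages in sessions if len(pages) == 1)
bounce_rate = bounces / len(sessions)
print(f"bounce rate: {bounce_rate:.0%}")  # 67%
```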
Useless article by Danny Sullivan, his columns have offered zero advice for years now. He knows nothing special these days.
So MetaFilter has been getting some "help" from Google's public relations squad, as is usually the case with sites that hit a nerve with HN. What about the thousand or million other sites that lost their traffic?
Apparently not... Google, the fair and balanced search engine, decided that ads and Google's own properties are better.
Have you noticed Google's earnings reports that mention a 20+% increase in ad clicks every quarter? Where do you think that's coming from? From shifting traffic away from other sites to Google, so instead of going to MetaFilter, Google decides that a G+ post or a YouTube video is better.