Matt Cutts: "Having Ads Obscuring Your Content, You May Get Penalized"

Speaking at the Pubcon conference in Las Vegas, Matt Cutts, head of Google's webspam team, was joined by Amit Singhal to answer the audience's questions following their presentations.

Cutts said Google is testing algorithms that determine "what are the things that really matter, how much content is above the fold." He went on to warn:

"If you have ads obscuring your content, you might want to think about it," asking publishers to consider, "Do they see content or something else that's distracting or annoying?"

Cutts started out by mentioning a tweet he had seen the day before: "I don't know if search engines are relevant in 6 months."

Cutts encouraged the audience: "You do not want to go where search engines are; you want to go where search engines are going to be."

Pubcon: Q&A with Matt Cutts

Long-Term (10,000' View):
These are long-term items that will affect sites for the foreseeable future, and he recommends having a strategy in each of these areas.
Mobile - A cell phone is a computer you carry with you everywhere.
Social - Google can only crawl the open web. If the Googlebot is blocked, then Google cannot see it. However, social signals are an important part of how Google fights spam, and they will become a bigger and bigger portion of the algorithm.
Local - Most purchases take place locally, and local search will become bigger and bigger.

Near Future (1,000' View):
Better page understanding is important to Google in the near future. For example, Google is trying to understand how much content is above the fold so it can improve user experience.
Personalized search is steadily growing and will become bigger in the near future.
Google is going to continue rolling out better tools for searchers.
Google would like to become more transparent about changes, including algorithmic ones; they don't want Google to be a black box.
The webspam team is talking about the possibility of an alert being sent to Google when something is published - this would let Google know who the author is.

Here is the Q&A:

Why is ranking data not available in Analytics?

Cutts: Over 96% of sites get all of their search queries within the 1,000-query limit. Covering the last 4% of sites would require 2-3 times more data storage.

Since the Panda update, lower-quality sites have been outranking an authority site. Why?

Singhal: Google's preference is always algorithmic - it is scalable across all sites, countries, and languages. Overall, the Panda update has been a very positive change - the scientific measurements say the Google user experience is better than it used to be. However, they understand that no algorithm is perfect and want people to submit reports of instances like this so they can improve the algorithm.

Cutts: Google is listening. Unfortunately, the changes take time to implement. They use the aggregated reports to try to improve the algorithm. The algorithm is under active development and they want to get it right.

When we search for appliances, why do we only get Sears and other major stores?

Cutts: The web is one of the only places where a small business can move faster than the big guys. The big companies are often big for a reason, and as a result they can outrank other pages. However, search does give the small business a chance. Google Webmaster Tools is something of an equalizer, and small businesses should use it - for example, big businesses are more likely to put text in images or Flash, while small businesses that know better can avoid this. Also, small businesses should concentrate on a small niche.

Are they trying to make the algorithm so perfect that they are missing the user experience?

Singhal: All scientific measures and manual reviews indicate that the algorithm is getting better and that search quality is improving (improving search quality = more relevant, higher quality results).

A Google Places page got shut down by a competitor - is there a better process to stop this type of behavior?

Cutts: The web used to be the "wild west," and there is still a small element of this, especially in local. The local space is changing fast, and a combination of manual spam fighters and algorithmic changes will get this under control. They are open to ideas on how to prevent malicious deletions of other businesses. They are working on this.

Where is the balance between privacy and data with SSL encryption?

Cutts: The trend is that search is becoming more personal, and this should continue, which means this is important to Google. People are unhappy that they have lost some of their keyword data. However, if you download your data from Google Webmaster Tools, 96% of people can still see all of their keyword data. They will not back down on SSL - if anything they may move forward, and advertisers may not get the data in the future. People want to know that they are not being snooped on.

Are PRWeb and press releases considered black hat due to duplicate content?

Cutts: Press releases amount to going to other people and asking them to write about you. Instead, work hard to produce high-quality content on your site and people will want to write about you. It is harder to fake being natural than to be natural.
Singhal: The content must be high-quality and useful from a reader's perspective. If the content is high quality and you work hard for the users, it is OK.

If I do doorway pages will the whole site get penalized or just the doorway pages?

Cutts: [incredulously] Are you asking how to do doorway pages?! There is an answer though - it depends on the amount of spam. If there is a huge amount of great content, they will probably only penalize the portion of the site that is using doorway pages. However, if the site is mostly doorway pages, the whole site can be penalized.

Singhal: Don't do it, man.

Everyone says I need more links. How do links improve the quality of the site? I don't want to play this game and I don't want to do this.

Cutts: What matters is the bottom line. Links are a part of search - they represent online reputation. Although there are many tools that report links, none of them can tell you which links are trusted by Google (not even Google's own tools). While the link graph looks bad from the outside, the actual link graph that Google uses and trusts looks much better. When the New York Times complained about a site with 10,000 spammy links, Google investigated the site and found that not a single one of those links had slipped through Google's filter. Only the links Google trusts count.

Is Google going to give more data to webmasters?

Google can either give more data (e.g., 2,000 queries instead of 1,000) or a longer timeframe (e.g., 60 days). They are leaning toward more data - they figure people can download the data periodically and still have access to past data. An informal survey of the audience disagreed - 60% wanted the longer timeframe and 40% wanted more queries.