- Using schema.org markup for organization logos
- Introducing "x-default hreflang" for international landing pages
- 5 common mistakes with rel=canonical
- The Webmaster Academy goes international
- A new opt-out tool
- Easier management of website verifications
- Making search-friendly mobile websites — now in 11 more languages
- We created a first steps cheat sheet for friends & family
- New first stop for hacked site recovery
- A reminder about selling links that pass PageRank
- Make the most of Search Queries in Webmaster Tools
- A faster image search
- Webmaster Tools verification strategies
- Introducing Data Highlighter for event data
- Helping Webmasters with Hacked Sites
- Giving Tablet Users the Full-Sized Web
- A new tool to disavow links
- Make the web faster with mod_pagespeed, now out of Beta
- Rich snippets guidelines
- Google Webmaster Guidelines updated
- Keeping you informed of critical website issues
- Structured Data Testing Tool
- Answering the top questions from government webmasters
- Site Errors Breakdown
- Search Queries Alerts in Webmaster Tools
Webmaster level: all Today, we’re launching support for the schema.org markup for organization logos, a way to connect your site with an iconic image. We want you to be able to specify which image we use as your logo in Google search results. Using schema.org Organization markup, you can indicate to our algorithms the location of your preferred logo. For example, a business whose homepage is www.example.com can add the following markup using visible on-page elements on their homepage:
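The original example markup did not survive extraction; a sketch of schema.org Organization microdata follows. The logo URL and surrounding structure are illustrative assumptions, not a copy of the original snippet:

```html
<div itemscope itemtype="http://schema.org/Organization">
  <a itemprop="url" href="http://www.example.com/">Home</a>
  <!-- The logo property indicates the organization's preferred logo image -->
  <img itemprop="logo" src="http://www.example.com/logo.png" alt="Example Inc. logo" />
</div>
```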
This example indicates to Google that this image is designated as the organization’s logo image for the homepage also included in the markup, and, where possible, may be used in Google search results. Markup like this is a strong signal to our algorithms to show this image in preference over others, for example when we show Knowledge Graph on the right hand side based on users’ queries. As always, please ask us in the Webmaster Help Forum if you have any questions. Posted by RJ Ryan, Google Engineer
Webmaster Level: All The homepages of multinational and multilingual websites are sometimes configured to point visitors to localized pages, either via redirects or by changing the content to reflect the user’s language. Today we’re introducing a new rel-alternate-hreflang annotation, supported by both Google and Yandex, that webmasters can use to specify such homepages. To see this in action, let’s look at an example. The website example.com has content that targets users around the world as follows:
* http://example.com/en-gb: For English-speaking users in the UK
* http://example.com/en-us: For English-speaking users in the USA
* http://example.com/en-au: For English-speaking users in Australia
* http://example.com/: The homepage shows users a country selector and is the default page for users worldwide

In this case, the webmaster can annotate this cluster of pages using rel-alternate-hreflang using Sitemaps or using HTML link tags like this:
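The original link-tag example was lost in extraction; for the URLs above, a sketch of the annotation (placed on each page in the cluster) would look like this:

```html
<link rel="alternate" href="http://example.com/en-gb" hreflang="en-gb" />
<link rel="alternate" href="http://example.com/en-us" hreflang="en-us" />
<link rel="alternate" href="http://example.com/en-au" hreflang="en-au" />
<!-- x-default: the country-selector homepage, for all other users -->
<link rel="alternate" href="http://example.com/" hreflang="x-default" />
```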
The new x-default hreflang attribute value signals to our algorithms that this page doesn’t target any specific language or locale and is the default page when no other page is better suited. For example, it would be the page our algorithms try to show French-speaking searchers worldwide or English-speaking searchers on google.ca. The same annotation applies for homepages that dynamically alter their contents based on a user’s perceived geolocation or the Accept-Language headers. The x-default hreflang value signals to our algorithms that such a page doesn’t target a specific language or locale. As always, if you have any questions or feedback, please tell us in the Internationalization Webmaster Help Forum. Posted by Pierre Far, Webmaster Trends Analyst
Webmaster level: Intermediate to Advanced Including a rel=canonical link in your webpage is a strong hint to search engines about your preferred version to index among duplicate pages on the web. It’s supported by several search engines, including Yahoo!, Bing, and Google. The rel=canonical link consolidates indexing properties from the duplicates, like their inbound links, as well as specifies which URL you’d like displayed in search results. However, rel=canonical can be a bit tricky because it’s not very obvious when there’s a misconfiguration.
_While the webmaster sees the “red velvet” page on the left in their browser, search engines notice the webmaster’s unintended “blue velvet” rel=canonical on the right._

We recommend the following best practices for using rel=canonical:
* A large portion of the duplicate page’s content should be present on the canonical version. One test is to imagine you don’t understand the language of the content: if you placed the duplicate side-by-side with the canonical, does a very large percentage of the words of the duplicate page appear on the canonical page? If you need to speak the language to understand that the pages are similar (for example, if they’re only topically similar but not extremely close in exact words), the canonical designation might be disregarded by search engines.
* Double-check that your rel=canonical target exists (it’s not an error or “soft 404”).
* Verify the rel=canonical target doesn’t contain a noindex robots meta tag.
* Make sure you’d prefer the rel=canonical URL to be displayed in search results (rather than the duplicate URL).
* Include the rel=canonical link in either the <head> of the page or the HTTP header.
* Specify no more than one rel=canonical for a page. When more than one is specified, all rel=canonicals will be ignored.

Mistake 1: rel=canonical to the first page of a paginated series

Imagine that you have an article that spans several pages:
* example.com/article?story=cupcake-news&page=1
* example.com/article?story=cupcake-news&page=2
* and so on

Specifying a rel=canonical from page 2 (or any later page) to page 1 is not correct use of rel=canonical, as these are not duplicate pages. Using rel=canonical in this instance would result in the content on pages 2 and beyond not being indexed at all.
_ Good content (e.g., “cookies are superior nutrition” and “to vegetables”) is lost when specifying rel=canonical from component pages to the first page of a series._In cases of paginated content, we recommend either a rel=canonical from component pages to a single-page version of the article, or to use rel=”prev” and rel=”next” pagination markup.
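As a sketch of the first option, a component page can point at a single-page version of the article; the view-all URL parameter below is an assumption for illustration:

```html
<!-- In the <head> of example.com/article?story=cupcake-news&page=2 -->
<link rel="canonical" href="http://example.com/article?story=cupcake-news&view=all" />
```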
_ rel=canonical from component pages to the view-all page_
_If rel=canonical to a view-all page isn’t designated, paginated content can use rel=”prev” and rel=”next” markup._

Mistake 2: Absolute URLs mistakenly written as relative URLs
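The body of this mistake was lost in extraction; the pitfall can be sketched as follows. A rel=canonical written without its scheme is parsed as a relative URL, pointing somewhere unintended (the cupcake URL is illustrative):

```html
<!-- Intended target: http://example.com/cupcake.html -->
<!-- Missing "http://", so this is treated as a relative URL and may -->
<!-- resolve to something like http://example.com/example.com/cupcake.html -->
<link rel="canonical" href="example.com/cupcake.html" />

<!-- Correct: a fully qualified absolute URL -->
<link rel="canonical" href="http://example.com/cupcake.html" />
```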
_Check the behavior of plugins by looking at the page’s source code._

Mistake 4: Category or landing page specifies rel=canonical to a featured article

Let’s say you run a site about desserts. Your dessert site has useful category pages like “pastry” and “gelato.” Each day the category pages feature a unique article. For instance, your pastry landing page might feature “red velvet cupcakes.” Because the “pastry” category page has nearly all the same content as the “red velvet cupcake” page, you add a rel=canonical from the category page to the featured individual article. If we were to accept this rel=canonical, then your pastry category page would not appear in search results. That’s because the rel=canonical signals that you would prefer search engines display the canonical URL in place of the duplicate. However, if you want users to be able to find both the category page and featured article, it’s best to only have a self-referential rel=canonical on the category page, or none at all.
_Remember that the canonical designation also implies the preferred display URL. Avoid adding a rel=canonical from a category or landing page to a featured article._

Mistake 5: rel=canonical in the <body>

The rel=canonical link tag should only appear in the <head> of an HTML document. Additionally, to avoid HTML parsing issues, it’s good to include the rel=canonical as early as possible in the <head>. When we encounter a rel=canonical designation in the <body>, it’s disregarded. This is an easy mistake to correct. Simply double-check that your rel=canonical links are always in the <head> of your page, and as early as possible if you can.
_rel=canonical designations in the <head> are processed, not those in the <body>._

Conclusion

To create valuable rel=canonical designations:
* Verify that most of the main text content of a duplicate page also appears in the canonical page.
* Check that rel=canonical is only specified once (if at all) and in the <head> of the page.
* Check that rel=canonical points to an existent URL with good content (i.e., not a 404, or worse, a soft 404).
* Avoid specifying rel=canonical from landing or category pages to featured articles as that will make the featured article the preferred URL in search results.

And, as always, please ask any questions in our Webmaster Help forum. Written by Allan Scott, Software Engineer, Indexing Team
Webmaster level: All Since we launched the Webmaster Academy in English back in May 2012, its educational content has been viewed well over 1 million times. The Webmaster Academy was built to guide webmasters in creating great sites that perform well in Google search results. It is an ideal guide for beginner webmasters but also a recommended read for experienced users who wish to learn more about advanced topics. To support webmasters across the globe, we’re happy to announce that we’re launching the Webmaster Academy in 20 languages. So whether you speak Japanese or Italian, we hope we can help you to make even better websites! You can easily access it through Webmaster Central. We’d love to read your comments here and invite you to join the discussion in the help forums. Posted by Giacomo Gnecchi Ruscone, Search Quality
Webmasters have several ways to keep their sites’ content out of Google’s search results. Today, as promised, we’re providing a way for websites to opt out of having their content that Google has crawled appear on Google Shopping, Advisor, Flights, Hotels, and Google+ Local search. Webmasters can now choose this option through our Webmaster Tools, and crawled content currently being displayed on Shopping, Advisor, Flights, Hotels, or Google+ Local search pages will be removed within 30 days. Posted by Matt Cutts, Distinguished Engineer
Webmaster level: All To help webmasters manage the verified owners for their websites in Webmaster Tools, we’ve recently introduced three new features:
* Verification details view: You can now see the methods used to verify an owner for your site. In the Manage owners page for your site, you can now find the new Verification details link. This screenshot shows the verification details of a user who is verified using both an HTML file uploaded to the site and a meta tag: Where appropriate, the Verification details will have links to the correct URL on your site where the verification can be found to help you find it faster.
* Requiring the verification method be removed from the site before unverifying an owner: You now need to remove the verification method from your site before unverifying an owner from Webmaster Tools. Webmaster Tools now checks the method that the owner used to verify ownership of the site, and will show an error message if the verification is still found. For example, this is the error message shown when an unverification was attempted while the DNS CNAME verification method was still found on the DNS records of the domain:
* Shorter CNAME verification string: We’ve slightly modified the CNAME verification string to make it shorter to support a larger number of DNS providers. Some systems limit the number of characters that can be used in DNS records, which meant that some users were not able to use the CNAME verification method. We’ve now made the CNAME verification string use fewer characters. Existing CNAME verifications will continue to be valid.

We hope these changes make it easier for you to use Webmaster Tools. As always, please post in our Verification forum if you have any questions or feedback. Posted by Pierre Far, Webmaster Trends Analyst
Webmaster level: Intermediate As more and more users worldwide with mobile devices access the Internet, it’s fantastic to see so many websites making their content accessible and useful for those devices. To help webmasters optimize their sites we launched our recommendations for smartphones, feature-phones, tablets, and Googlebot-friendly sites in June 2012. We’re happy to announce that those recommendations are now also available in Arabic, Brazilian Portuguese, Dutch, French, German, Italian, Japanese, Polish, Russian, Simplified Chinese, and Spanish. US-based webmasters are welcome to read the UK-English version. We welcome you to go through our recommendations, pick the configuration that you feel will work best with your website, and get ready to jump on the mobile bandwagon! Thanks to the fantastic webmaster-outreach team in Dublin, Tokyo and Beijing for making this possible! Posted (but not translated) by John Mueller, Webmaster Trends Analyst, Zürich Switzerland
Webmaster level: Beginner Everyone knows someone who just set up their first blog on Blogger, installed WordPress for the first time, or maybe who has had a web site for some time but never gave search much thought. We came up with a first steps cheat sheet for just these folks. It’s a short how-to list with basic tips on search engine-friendly design that can help Google and others better understand the content and increase your site’s visibility. We made sure it’s available in thirteen languages. Please feel free to read it, print it, share it, copy and distribute it! We hope this content will help those who are just about to start their webmaster adventure or have so far not paid too much attention to search engine-friendly design. Over time, as you gain experience, you may want to have a look at our more advanced Google SEO Starter Guide. As always, we welcome all webmasters and site owners, new and experienced, to join discussions on our Google Webmaster Help Forum.
Posted by Kaspar Szymanski, Search Quality Strategist, Dublin
Webmaster Level: All We certainly hope you never have to use our new Help for hacked sites informational series. It’s a dozen articles and over an hour of videos dedicated to helping webmasters in the unfortunate event that their site is compromised.
_Overview: How and why sites are hacked_If you have further interest in why cybercriminals hack sites for spammy purposes, see Tiffany Oberoi’s explanation in Step 5: Assess the damage (hacked with spam).
_Tiffany Oberoi, a Webspam engineer, shares more information about sites hacked with spam_And if you’re curious about malware, Lucas Ballard from our Safe Browsing team, explains more about the topic in Step 5: Assess the damage (hacked with malware).
_Lucas Ballard, a Safe Browsing engineer, and I pretend to have a totally natural conversation about malware_ While we attempt to outline the necessary steps in recovery, each task remains fairly difficult for site owners unless they have advanced knowledge of system administrator commands and experience with source code. For helping fellow webmasters through the difficult recovery time, we’d like to thank the steady members in Webmaster Forum. Specifically, in the subforum Malware and hacked sites, we’d be remiss not to mention the amazing contributions of Redleg and Denis Sinegubko. How to avoid ever needing _Help for hacked sites_ Just as you focus on making a site that’s good for users and search-engine friendly, keeping your site secure -- for you and your visitors -- is also paramount. When site owners fail to keep their site secure, hackers may exploit the vulnerability. If a hacker exploits a vulnerability, then you might need _Help for hacked sites_. So, to potentially avoid this scenario:
* Be vigilant about keeping software updated
* Understand the security practices of all applications, plugins, third-party software, etc., before you install them on your server. A security vulnerability in one software application can affect the safety of your entire site
* Remove unnecessary or unused software
* Enforce creation of strong passwords
* Keep all devices used to log in to your servers secure (updated operating system and browser)
* Make regular, automated backups of your site

Help for hacked sites can be found at www.google.com/webmasters/hacked. We look forward to not seeing you there! Written by Maile Ohye, Developer Programs Tech Lead
Webmaster level: all Google has said for years that selling links that pass PageRank violates our quality guidelines. We continue to reiterate that guidance periodically to help remind site owners and webmasters of that policy. Please be wary if someone approaches you and wants to pay you for links or “advertorial” pages on your site that pass PageRank. Selling links (or entire advertorial pages with embedded links) that pass PageRank violates our quality guidelines, and Google does take action on such violations. The consequences for a link-selling site start with losing trust in Google’s search results, as well as a reduction of the site’s visible PageRank in the Google Toolbar. The consequences can also include lower rankings for that site in Google’s search results. If you receive a warning for selling links that pass PageRank in Google’s Webmaster Tools, you’ll see a notification message to look for “possibly artificial or unnatural links on your site pointing to other sites that could be intended to manipulate PageRank.” That’s an indication that your site has lost trust in Google’s index. To address the issue, make sure that any paid links on your site don’t pass PageRank. You can remove any paid links or advertorial pages, or make sure that any paid hyperlinks have the rel="nofollow" attribute. After ensuring that no paid links on your site pass PageRank, you can submit a reconsideration request, and if you had a manual webspam action on your site, someone at Google will review the request. After the request has been reviewed, you’ll get a notification back about whether the reconsideration request was granted or not. We do take this issue very seriously, so we recommend you avoid selling (and buying) links that pass PageRank in order to prevent loss of trust, lower PageRank in the Google Toolbar, lower rankings, or in an extreme case, removal from Google’s search results. Posted by Matt Cutts, Distinguished Engineer
Level: Beginner to Intermediate If you’re intrigued by the Search Queries feature in Webmaster Tools but aren’t sure how to make it actionable, we have a video that we hope will help!
_Maile shares her approach to Search Queries in Webmaster Tools_This video explains the vocabulary of Search Queries, such as: * Impressions * Average position (only the top-ranking URL for the user’s query is factored in our calculation) * Click * CTR The video also reviews an approach to investigating Top queries and Top pages: * Prepare by understanding your website’s goals and your target audience (then using Search Queries “filters” to support your knowledge) * Sort by clicks in Top queries to understand the top queries bringing searchers to your site (for the given time period) * Sort by CTR to notice any missed opportunities * Categorize queries into logical buckets that simplify tracking your progress and staying in touch with users’ needs * Sort Top pages by clicks to find the URLs on your site most visited by searchers (for the given time period) * Sort Top pages by impressions to find valuable pages that can be used to help feature your related, high-quality, but lower-ranking pages After you’ve watched the video and applied the knowledge of your site with the findings from Search Queries, you’ll likely have several improvement ideas to help searchers find your site. If you’re up for it, let us know in the comments what Search Queries information you find useful (and why!), and of course, as always, feel free to share any tips or feedback. Written by Maile Ohye, Developer Programs Tech Lead
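The sorting steps above can be sketched in a few lines of code. The query rows here are invented for illustration; they are not real Search Queries output, and the CSV columns you export may be named differently:

```python
# Hypothetical rows, as you might load them from a Top queries download.
rows = [
    {"query": "cupcake recipe", "impressions": 12000, "clicks": 900},
    {"query": "red velvet frosting", "impressions": 8000, "clicks": 80},
    {"query": "dessert blog", "impressions": 500, "clicks": 200},
]

# CTR is clicks divided by impressions, as reported in Webmaster Tools.
for row in rows:
    row["ctr"] = row["clicks"] / row["impressions"]

# Sort by clicks to see the top queries bringing searchers to the site...
top_by_clicks = sorted(rows, key=lambda r: r["clicks"], reverse=True)

# ...and sort by CTR (lowest first) to surface high-impression,
# low-CTR queries: the "missed opportunities".
missed = sorted(rows, key=lambda r: (r["ctr"], -r["impressions"]))
print(missed[0]["query"])  # prints "red velvet frosting"
```

The same two sorts apply to Top pages: clicks show what already works, while impressions without clicks point at pages worth improving.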
Webmaster level: all People looking for images on Google often want to browse through many images, looking both at the images and their metadata (detailed information about the images). Based on feedback from both users and webmasters, we redesigned Google Images to provide a better search experience. In the next few days, you’ll see image results displayed in an inline panel so it’s faster, more beautiful, and more reliable. You will be able to quickly flip through a set of images by using the keyboard. If you want to go back to browsing other search results, just scroll down and pick up right where you left off. As always, please ask on our Webmaster Help forum if you have questions. Posted by Hongyi Li, Associate Product Manager
Webmaster level: all Verifying ownership of your website is the first step towards using Google Webmaster Tools. To help you keep verification simple & reduce its maintenance to a minimum, especially when you have multiple people using Webmaster Tools, we’ve put together a small list of tips & tricks that we’d like to share with you:
* The method that you choose for verification is up to you, and may depend on your CMS & hosting providers. If you want to be sure that changes on your side don’t result in an accidental loss of the verification status, you may even want to consider using two methods in parallel.
* Back in 2009, we updated the format of the verification meta tag and file. If you’re still using the old format, we recommend moving to the newer version. The newer meta tag is called “google-site-verification”, and the newer file format contains just one line with the file name. While we’re currently supporting ye olde format, using the newer one ensures that you’re good to go in the future.
* When removing users’ access in Webmaster Tools, remember to remove any active associated verification tokens (file, meta tag, etc.). Leaving them on your server means that these users would be able to gain access again at any time. You can view the site owners list in Webmaster Tools under Configuration / Users.
* If multiple people need to access the site, we recommend using the “add users” functionality in Webmaster Tools. This makes it easier for you to maintain the access control list without having to modify files or settings on your servers.
* Also, if multiple people from your organization need to use Webmaster Tools, it can be a good policy to only allow users with email addresses from your domain. By doing that, you can verify at a glance that only users from your company have access. Additionally, when employees leave, access to Webmaster Tools is automatically taken care of when that account is disabled.
* Consider using “restricted” (read-only) access where possible. Settings generally don’t need to be changed on a daily basis, and when they do need to be changed, it can be easier to document them if they have to go through a central account. We hope these tips help you to simplify the situation around verification of your website in Webmaster Tools. For more questions about verification, feel free to drop by our Webmaster Help Forums. Posted by John Mueller, Webmaster Trends Analyst, Zurich
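For reference, the newer meta-tag format mentioned above looks like the following; the content token is a placeholder, since each site gets its own value from Webmaster Tools:

```html
<!-- Placed in the <head> of the site's homepage -->
<meta name="google-site-verification" content="your-unique-token-here" />
```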
Webmaster Level: All Update 19 February 2013: Data Highlighter for events structured markup is available in all languages in Webmaster Tools. At Google we’re making more and more use of structured data to provide enhanced search results, such as rich snippets and event calendars, that help users find your content. Until now, marking up your site’s HTML code has been the only way to indicate structured data to Google. However, we recognize that markup may be hard for some websites to deploy. Today, we’re offering webmasters a simpler alternative: Data Highlighter. At initial launch, it’s available in English only and for structured data about _events_, such as concerts, sporting events, exhibitions, shows, and festivals.
We’ll make Data Highlighter available for more languages and data types in the months ahead.
Data Highlighter is a point-and-click tool that can be used by anyone authorized for your site in Google Webmaster Tools. No changes to HTML code are required. Instead, you just use your mouse to highlight and "tag" each key piece of data on a typical event page of your website:
If your page lists multiple events in a consistent format, Data Highlighter will "learn" that format as you apply tags, and help speed your work by automatically suggesting additional tags. Likewise, if you have many pages of events in a consistent format, Data Highlighter will walk you through a process of tagging a few example pages so it can learn about their format variations. Usually, 5 or 10 manually tagged pages are enough for our sophisticated machine-learning algorithms to understand the other, similar pages on your site.
When you’re done, you can review a sample of all the event data that Data Highlighter now understands. If it’s correct, click "Publish."
From then on, as Google crawls your site, it will recognize your latest event listings and make them eligible for enhanced search results. You can inspect the crawled data on the Structured Data Dashboard, and unpublish at any time if you’re not happy with the results.
Here’s a short video explaining how the process works: To get started with Data Highlighter, visit Webmaster Tools, select your site, click the "Optimization" link in the left sidebar, and click "Data Highlighter". If you have any questions, please read our Help Center article or ask us in the Webmaster Help Forum. Happy Highlighting! Posted by Justin Boyan, Product Manager
Webmaster level: All Since we announced Google’s recommendations for building smartphone-optimized websites, a common question we’ve heard from webmasters is how to best treat tablet devices. This is a similar question Android app developers face, and for that the Building Quality Tablet Apps guide is a great starting point. Although we do not have specific recommendations for building search engine friendly tablet-optimized websites, there are some tips for building websites that serve smartphone and tablet users well. When considering your site’s visitors using tablets, it’s important to think about both the devices and what users expect. Compared to smartphones, tablets have larger touch screens and are typically used on Wi-Fi connections. Tablets offer a browsing experience that can be as rich as any desktop or laptop machine, in a more mobile, lightweight, and generally more convenient package. This means that, unless you offer tablet-optimized content, users expect to see your desktop site rather than your site’s smartphone site. Our recommendation for smartphone-optimized sites is to use responsive web design, which means you have one site to serve all devices. If your website uses responsive web design as recommended, be sure to test your website on a variety of tablets to make sure it serves them well too. Remember, just like for smartphones, there are a variety of device sizes and screen resolutions to test. Another common configuration is to have separate sites for desktops and smartphones, and to redirect users to the relevant version. If you use this configuration, be careful not to inadvertently redirect tablet users to the smartphone-optimized site too.
Telling Android smartphones and tablets apart

For Android-based devices, it’s easy to distinguish between smartphones and tablets using the user-agent string supplied by browsers: although both Android smartphones and tablets will include the word “Android” in the user-agent string, only the user-agent of smartphones will include the word “Mobile”. In summary, any Android device that does not have the word “Mobile” in the user-agent is a tablet (or other large screen) device that is best served the desktop site. For example, here’s the user-agent from Chrome on a Galaxy Nexus smartphone:
Mozilla/5.0 (Linux; Android 4.1.1; Galaxy Nexus Build/JRO03O) AppleWebKit/535.19 (KHTML, like Gecko) Chrome/18.0.1025.166 Mobile Safari/535.19

Or from Firefox on the Galaxy Nexus:

Mozilla/5.0 (Android; Mobile; rv:16.0) Gecko/16.0 Firefox/16.0

Compare those to the user-agent from Chrome on Nexus 7:

Mozilla/5.0 (Linux; Android 4.1.1; Nexus 7 Build/JRO03S) AppleWebKit/535.19 (KHTML, like Gecko) Chrome/18.0.1025.166 Safari/535.19

Or from Firefox on Nexus 7:

Mozilla/5.0 (Android; Tablet; rv:16.0) Gecko/16.0 Firefox/16.0

Because the Galaxy Nexus’s user agent includes “Mobile” it should be served your smartphone-optimized website, while the Nexus 7 should receive the full site. We hope this helps you build better tablet-optimized websites. As always, please ask on our Webmaster Help forums if you have more questions. Posted by Pierre Far, Webmaster Trends Analyst, and Scott Main, lead tech writer for developer.android.com
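The “Mobile” rule from the tablet post above can be sketched as a simple substring check. This is a minimal sketch of that one rule, not a full device-detection library:

```python
def serve_desktop_site(user_agent: str) -> bool:
    """Return True if an Android device should receive the desktop/full site.

    Per the rule above: Android user-agents WITHOUT the word "Mobile"
    are tablets (or other large-screen devices) and get the desktop site.
    """
    return "Android" in user_agent and "Mobile" not in user_agent

# The two Chrome user-agents quoted in the post:
galaxy_nexus = ("Mozilla/5.0 (Linux; Android 4.1.1; Galaxy Nexus Build/JRO03O) "
                "AppleWebKit/535.19 (KHTML, like Gecko) "
                "Chrome/18.0.1025.166 Mobile Safari/535.19")
nexus_7 = ("Mozilla/5.0 (Linux; Android 4.1.1; Nexus 7 Build/JRO03S) "
           "AppleWebKit/535.19 (KHTML, like Gecko) "
           "Chrome/18.0.1025.166 Safari/535.19")

print(serve_desktop_site(galaxy_nexus))  # False: smartphone-optimized site
print(serve_desktop_site(nexus_7))       # True: desktop site
```

In a real server this check would run in the redirect logic, before sending tablet users to the smartphone-optimized site.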
Webmaster level: Advanced Today we’re introducing a tool that enables you to disavow links to your site. If you’ve been notified of a manual spam action based on “unnatural links” pointing to your site, this tool can help you address the issue. If you haven’t gotten this notification, this tool generally isn’t something you need to worry about. First, a quick refresher. Links are one of the most well-known signals we use to order search results. By looking at the links between pages, we can get a sense of which pages are reputable and important, and thus more likely to be relevant to our users. This is the basis of PageRank, which is one of more than 200 signals we rely on to determine rankings. Since PageRank is so well-known, it’s also a target for spammers, and we fight linkspam constantly with algorithms and by taking manual action. If you’ve ever been caught up in linkspam, you may have seen a message in Webmaster Tools about “unnatural links” pointing to your site. We send you this message when we see evidence of paid links, link exchanges, or other link schemes that violate our quality guidelines. If you get this message, we recommend that you remove from the web as many spammy or low-quality links to your site as possible. This is the best approach because it addresses the problem at the root. By removing the bad links directly, you’re helping to prevent Google (and other search engines) from taking action again in the future. You’re also helping to protect your site’s image, since people will no longer find spammy links pointing to your site on the web and jump to conclusions about your website or business. If you’ve done as much as you can to remove the problematic links, and there are still some links you just can’t seem to get down, that’s a good time to visit our new Disavow links page. When you arrive, you’ll first select your site.
You’ll then be prompted to upload a file containing the links you want to disavow.
The format is straightforward. All you need is a plain text file with one URL per line. An excerpt of a valid file might look like the following:
# Contacted owner of spamdomain1.com on 7/1/2012 to
# ask for link removal but got no response
domain:spamdomain1.com

# Owner of spamdomain2.com removed most links, but missed these
http://www.spamdomain2.com/contentA.html
http://www.spamdomain2.com/contentB.html
http://www.spamdomain2.com/contentC.html
In this example, lines that begin with a pound sign (#) are considered comments and Google ignores them. The “domain:” keyword indicates that you’d like to disavow links from all pages on a particular site (in this case, “spamdomain1.com”). You can also request to disavow links on specific pages (in this case, three individual pages on spamdomain2.com). We currently support one disavowal file per site and the file is shared among site owners in Webmaster Tools. If you want to update the file, you’ll need to download the existing file, modify it, and upload the new one. The file size limit is 2MB. One great place to start looking for bad links is the “Links to Your Site” feature in Webmaster Tools. From the homepage, select the site you want, navigate to Traffic > Links to Your Site > Who links the most > More, then click one of the download buttons. This file lists pages that link to your site. If you click “Download latest links,” you’ll see dates as well. This can be a great place to start your investigation, but be sure you don’t upload the entire list of links to your site -- you don’t want to disavow all your links!
To learn more about the feature, check out our Help Center, and we’d welcome your comments and questions in our forum. You’ll also find a video about the tool and a quick Q&A below.
If your page is on the web, speed matters. For developers and webmasters, making your page faster shouldn’t be a hassle, which is why we introduced mod_pagespeed in 2010. Since then, the development team has been working to improve the functionality, quality and performance of this open-source Apache module that automatically optimizes web pages and their resources. Now, after almost two years and eighteen releases, we are announcing that we are taking off the Beta label. We’re committed to working with the open-source community to continue evolving mod_pagespeed, including more, better and smarter optimizations and support for other web servers. Over 120,000 sites are already using mod_pagespeed to improve the performance of their web pages using the latest techniques and trends in optimization. The product is used worldwide by individual sites, and is also offered by hosting providers such as DreamHost and Go Daddy, and by content delivery networks like EdgeCast. With the move out of beta, we hope that even more sites will soon benefit from the web performance improvements offered through mod_pagespeed. mod_pagespeed is a key part of our goal to help make the web faster for everyone. Users prefer faster sites and we have seen that faster pages lead to higher user engagement, conversions, and retention. In fact, page speed is one of the signals in search ranking and ad quality scores. Besides evangelizing for speed, we offer tools and technologies to help measure, quantify, and improve performance, such as Site Speed Reports in Google Analytics, PageSpeed Insights, and PageSpeed Optimization products.
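As a rough illustration of what incorporating the module looks like, an Apache configuration fragment might resemble the following sketch (the module path, cache path, and filter choices are assumptions that vary by platform and version; consult the mod_pagespeed documentation for your setup):

```apache
# Sketch only: load and enable mod_pagespeed in httpd.conf.
LoadModule pagespeed_module /usr/lib/apache2/modules/mod_pagespeed.so

ModPagespeed on
# CoreFilters is the default, broadly safe set of optimizations.
ModPagespeedRewriteLevel CoreFilters
# Optionally enable extra filters beyond the core set.
ModPagespeedEnableFilters collapse_whitespace,remove_comments
# Writable directory where rewritten resources are cached.
ModPagespeedFileCachePath /var/cache/mod_pagespeed/
```

Because rewriting happens in the server, pages are optimized automatically without changing the site’s source files.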
In fact, both mod_pagespeed and PageSpeed Service are based on our open-source PageSpeed Optimization Libraries project, and are important ways in which we help websites take advantage of the latest performance best practices. To learn more about mod_pagespeed and how to incorporate it in your site, watch our recent Google Developers Live session or visit the mod_pagespeed product page. Posted by Joshua Marantz and Ilya Grigorik, Google PageSpeed Team
Webmaster level: All Traditional, text-only search result snippets aim to summarize the content of a page in our search results. Rich snippets (shown above) allow webmasters to help us provide even better summaries using structured data markup that they can add to their pages. Today we’re introducing a set of guidelines to help you implement high quality structured data markup for rich snippets. Once you’ve correctly added structured data markup to your site, rich snippets are generated algorithmically based on that markup. If the markup on a page offers an accurate description of the page’s content, is up-to-date, and is visible and easily discoverable on your page and by users, our algorithms are more likely to decide to show a rich snippet in Google’s search results. Alternatively, if the rich snippets markup on a page is spammy, misleading, or otherwise abusive, our algorithms are much more likely to ignore the markup and render a text-only snippet. Keep in mind that, while rich snippets are generated algorithmically, we do reserve the right to take manual action (e.g., disable rich snippets for a specific site) in cases where we see actions that hurt the experience for our users. To illustrate these guidelines with some examples: * If your page is about a band, make sure you mark up concerts being performed by that band, not by related bands or bands in the same town. * If you sell products through your site, make sure reviews on each page are about that page’s product and not the store itself. * If your site provides song lyrics, make sure reviews are about the quality of the lyrics, not the quality of the song itself. In addition to the general rich snippets quality guidelines we’re publishing today, you’ll find usage guidelines for specific types of rich snippets in our Help Center. As always, if you have any questions or feedback, please tell us in the Webmaster Help Forum.
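As one hypothetical illustration of markup that follows these guidelines, a product page could annotate its own visible review content with schema.org microdata (the product name and numbers here are invented for the example):

```html
<!-- Hypothetical product page: the rating markup describes this page's
     own product, and every marked-up value is visible to users. -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Acme Anvil</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.2</span>/5
    based on <span itemprop="reviewCount">38</span> reviews
  </div>
</div>
```

The key point from the guidelines above: the rating describes this page’s product (not the store), and the marked-up values match what the user actually sees on the page.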
Posted by Jeremy Lubin, Consumer Experience Specialist, & Pierre Far, Webmaster Trends Analyst
Webmaster level: All Today we’re happy to announce an updated version of our Webmaster Quality Guidelines. Both our basic quality guidelines and many of our more specific articles (like those on link schemes or hidden text) have been reorganized and expanded to provide you with more information about how to create quality websites for both users and Google. The main message of our quality guidelines hasn’t changed: Focus on the user. However, we’ve added more guidance and examples of behavior that you should avoid in order to keep your site in good standing with Google’s search results. We’ve also added a set of quality and technical guidelines for rich snippets, as structured markup is becoming increasingly popular. We hope these updated guidelines will give you a better understanding of how to create and maintain Google-friendly websites. Posted by Betty Huang & Eric Kuan, Google Search Quality Team
Webmaster level: All Having a healthy and well-performing website is important, both to you as the webmaster and to your users. When we discover critical issues with a website, Webmaster Tools will now let you know by automatically sending an email with more information. We’ll only notify you about issues that we think have significant impact on your site’s health or search performance and which have clear actions that you can take to address the issue. For example, we’ll email you if we detect malware on your site or see a significant increase in errors while crawling your site. For most sites these kinds of issues will occur rarely. If your site does happen to have an issue, we cap the number of emails we send over a certain period of time to avoid flooding your inbox. If you don’t want to receive any email from Webmaster Tools you can change your email delivery preferences. We hope that you find this change a useful way to stay up-to-date on critical and important issues regarding your site’s health. If you have any questions, please let us know via our Webmaster Help Forum. Posted by John Mueller, Webmaster Trends Analyst, Google Zürich
Webmaster level: All Today we’re excited to share the launch of a shiny new version of the rich snippet testing tool, now called the Structured Data Testing Tool. The major improvements are: * We’ve improved how we display rich snippets in the testing tool to better match how they appear in search results. * The brand new visual design makes it clearer what structured data we can extract from the page, and how that may be shown in our search results. * The tool is now available in languages other than English to help webmasters from around the world build structured-data-enabled websites. Here’s what it looks like: The new structured data testing tool works with all supported rich snippets and authorship markup, including applications, products, recipes, reviews, and others. Try it yourself and, as always, if you have any questions or feedback, please tell us in the Webmaster Help Forum. Written by Yong Zhu on behalf of the rich snippets testing tool team
Webmaster level: Beginner - Intermediate Government sites, from city to state to federal agencies, are extremely important to Google Search. For one thing, governments have a lot of content — and government websites are often the canonical source of information that’s important to citizens. Around 20 percent of Google searches are for local information, and local governments are experts in their communities. That’s why I’ve spoken at the National Association of Government Webmasters (NAGW) national conference for the past few years. It’s always interesting speaking to webmasters about search, but the people running government websites have particular concerns and questions. Since some questions come up frequently I thought I’d share this FAQ for government websites. Question 1: How do I fix an incorrect phone number or address in search results or Google Maps? Although managing their agency’s site is plenty of work, government webmasters are often called upon to fix problems found elsewhere on the web too. By far the most common question I’ve taken is about fixing addresses and phone numbers in search results. In this case, government site owners really can do it themselves, by claiming their Google+ Local listing. Incorrect or missing phone numbers, addresses, and other information can be fixed by claiming the listing. Most locations in Google Maps have a Google+ Local listing — businesses, offices, parks, landmarks, etc. I like to use the San Francisco Main Library as an example: it has contact info, detailed information like the hours they’re open, user reviews and fun extras like photos. When we think users are searching for libraries in San Francisco, we may display a map and a listing so they can find the library as quickly as possible. If you work for a government agency and want to claim a listing, we recommend using a shared Google Account with an email address at your .gov domain if possible.
Usually, ownership of the page is confirmed via a phone call or post card. Question 2: I’ve claimed the listing for our office, but I have 43 different city parks to claim in Google Maps, and none of them have phones or mailboxes. How do I claim them? Use the bulk uploader! If you have 10 or more listings / addresses to claim at the same time, you can upload a specially-formatted spreadsheet. Go to www.google.com/places/, click the "Get started now" button, and then look for the "bulk upload" link. If you run into any issues, use the Verification Troubleshooter. Question 3: We’re moving from a .gov domain to a new .com domain. How should we move the site? We have a Help Center article with more details, but the basic process involves the following steps: * Make sure you have both the old and new domain verified in the same Webmaster Tools account. * Use a 301 redirect on all pages to tell search engines your site has moved permanently. * Don’t do a single redirect from all pages to your new home page — this gives a bad user experience. * If there’s no 1:1 match between pages on your old site and your new site (recommended), try to redirect to a new page with similar content. * If you can’t do redirects, consider cross-domain canonical links. * Make sure to check if the new location is crawlable by Googlebot using the Fetch as Google feature in Webmaster Tools. * Use the Change of Address tool in Webmaster Tools to notify Google of your site’s move. * Have a look at the Links to Your Site in Webmaster Tools and inform the important sites that link to your content about your new location. * We recommend not implementing other major changes at the same time, like large-scale content, URL structure, or navigational updates. * To help Google pick up new URLs faster, use the Fetch as Google tool to ask Google to crawl your new site, and submit a Sitemap listing the URLs on your new site.
* To prevent confusion, it’s best to retain control of your old site’s domain and keep redirects in place for as long as possible — at least 180 days. What if you’re moving just part of the site? This question came up too — for example, a city might move its "Tourism and Visitor Info" section to its own domain. In that case, many of the same steps apply: verify both sites in Webmaster Tools, use 301 redirects, clean up old links, etc. In this case you don’t need to use the Change of Address form in Webmaster Tools since only part of your site is moving. If for some reason you’ll have some of the same content on both sites, you may want to include a cross-domain canonical link pointing to the preferred domain. Question 4: We’ve done a ton of work to create unique titles and descriptions for pages. How do we get Google to pick them up? First off, that’s great! Better titles and descriptions help users decide to click through to get the information they need on your page. The government webmasters I’ve spoken with care a lot about the content and organization of their sites, and work hard to provide informative text for users. Google’s generation of page titles and descriptions (or "snippets") is completely automated and takes into account both the content of a page as well as references to it that appear on the web. Changes are picked up as we recrawl your site. But you can do two things to let us know about URLs that have changed: * Submit an updated XML Sitemap so we know about all of the pages on your site. * In Webmaster Tools, use the Fetch as Google feature on a URL you’ve updated. Then you can choose to submit it to the index. * You can choose to submit all of the linked pages as well — if you’ve updated an entire section of your site, you might want to submit the main page or an index page for that section to let us know about a broad collection of URLs. Question 5: How do I get into the YouTube government partner program?
For this question, I have bad news, good news, and then even better news. On the one hand, the government partner program has been discontinued. But don’t worry, because most of the features of the program are now available to your regular YouTube account. For example, you can now upload videos longer than 10 minutes. Did I say I had even better news? YouTube has added a lot of functionality useful for governments in the past year: * You can now broadcast live streaming video to YouTube via Hangouts On Air (requires a Google+ account). * You can link your YouTube account with your Webmaster Tools account, making it the "official channel" for your site. * Automatic captions continue to get better and better, supporting more languages. I hope this FAQ has been helpful, but I’m sure I haven’t covered everything government webmasters want to know. I highly recommend our Webmaster Academy, where you can learn all about making your site search-engine friendly. If you have a specific question, please feel free to add a question in the comments or visit our really helpful Webmaster Central Forum. Posted by Jason Morrison, Search Quality Team
Webmaster level: All Today we’re announcing more detailed Site Error information in Webmaster Tools. This information is useful when looking for the source of your Site Errors. For example, if your site suffers from server connectivity problems, your server may simply be misconfigured; then again, it could also be completely unavailable! Since each Site Error (DNS, Server Connectivity, and Robots.txt Fetch) comprises several unique issues, we’ve broken down each category into more specific errors to provide you with a better analysis of your site’s health. Site Errors will display statistics for each of your site-wide crawl errors from the past 90 days. In addition, it will show the failure rates for any category-specific errors that have been affecting your site. If you’re not sure what a particular error means, you can read a short description of it by hovering over its entry in the legend. You can find more detailed information by following the “More info” link in the tooltip. We hope that these changes will make Site Errors even more informative and helpful in keeping your site in tip-top shape. If you have any questions or suggestions, please let us know through the Webmaster Tools Help Forum. Written by Cesar Cuenca and Tiffany Wang, Webmaster Tools Interns
Webmaster level: All We know many of you check Webmaster Tools daily (thank you!), but not everybody has the time to monitor the health of their site 24/7. It can be time consuming to analyze all the data and identify the most important issues. To make it a little bit easier we’ve been incorporating alerts into Webmaster Tools. We process the data for your site and try to detect the events that could be most interesting for you. Recently we rolled out alerts for Crawl Errors and today we’re introducing alerts for Search Queries data. The Search Queries feature in Webmaster Tools shows, among other things, impressions and clicks for your top pages over time. For most sites, these numbers follow regular patterns, so when sudden spikes or drops occur, it can make sense to look into what caused them. Some changes are due to differing demand for your content, other times they may be due to technical issues that need to be resolved, such as broken redirects. For example, a steady stream of clicks which suddenly drops to zero is probably worth investigating. The alerts look like this: We’re still working on the sensitivity threshold of the messages and welcome your feedback in our help forums. We hope the new alerts will be useful. Don’t forget to sign up for email forwarding to receive them in your inbox. Posted by Javier Tordable, Tech Lead, Webmaster Tools
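The kind of sudden drop described above is also easy to flag yourself in downloaded Search Queries data. Google hasn’t published how its alert thresholds work, so the following Python sketch is purely illustrative (the window size and ratio are arbitrary choices, not Google’s logic):

```python
# Hypothetical sketch: flag days whose clicks fall far below the
# trailing average -- an illustration, not Google's actual alerting.
def sudden_drops(clicks, window=7, ratio=0.2):
    """Return indices where clicks drop below ratio * trailing mean."""
    drops = []
    for i in range(window, len(clicks)):
        trailing = sum(clicks[i - window:i]) / float(window)
        if trailing > 0 and clicks[i] < ratio * trailing:
            drops.append(i)
    return drops

daily_clicks = [120, 118, 125, 122, 119, 121, 117, 3]  # steady, then a crash
print(sudden_drops(daily_clicks))  # [7]
```

A flagged day like the one above is exactly the case worth investigating: it may reflect a genuine demand change, or a technical issue such as a broken redirect.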