Search Console (or Google Webmaster Tools as it used to be known) is a completely free and indispensable service offered by Google to all webmasters.
Although you certainly don’t have to be signed up to Search Console in order to be crawled and indexed by Google, it can definitely help with optimising your site and its content for search.
Search Console is where you can monitor your site’s performance, identify issues, submit content for crawling, remove content you don’t want indexed, view the search queries that brought visitors to your site, monitor backlinks… there’s lots of good stuff here.
Perhaps most importantly though, Search Console is where Google will communicate with you should anything go wrong (crawl errors, manual penalties, an increase in 404 pages, malware detected, etc.).
If you don’t have a Search Console account, then you should get one now. You may find that you won’t actually need some of the other fancier, more expensive tools that essentially do the same thing.
To get started, all you need is a Google sign-in, which you probably already have if you regularly use Google or Gmail; then just visit Search Console.
Then follow this complete guide which will take you through every tool and feature, as clearly and concisely as possible.
Please note: we published a guide to the old Webmaster Tools service, written by Simon Heseltine, back in 2014. This is an updated, rewritten version that reflects the changes and updates to Search Console since, but much of the credit should go to Simon for laying the original groundwork.
Quick Links:
- Add a property
- Verification
- Dashboard
- Settings
- Messages
- Search Appearance
- Search Traffic
- Google Index
- Crawl
- Security Issues
- Other Resources
Add a property
If you haven’t already, you will have to add your website to Search Console.
Just click on the big red Add a Property button, then add your URL to the pop-up box.
Verification
Before Search Console can access your site, you have to prove to Google that you’re an authorized webmaster. You don’t have to be in charge, but you do need permission from whoever is.
There are five methods of verification for Search Console. There’s no real preference as to which method you use, although Google does give prominence to its ‘recommended method’…
1) The HTML file upload: Google provides you with an HTML verification file that you need to upload to the root directory of your site. Once you’ve done that, you just click on the provided URL, hit the verify button and you’ll have full access to Search Console data for the site.
There are also four alternative methods if the above doesn’t suit…
2) HTML tag: Google gives you a meta tag to add to the <head> section of your homepage (see the example snippet after this list). If you make any further updates to the HTML of your homepage, make sure the tag is still in place, otherwise your verification will be revoked. If this does happen, you’ll just have to go through the process again.
3) Domain Name Provider: here you’re presented with a drop-down list of domain registrars or name providers, then Google will give you a step-by-step guide for adding a TXT record to your DNS configuration.
4) Google Analytics: assuming you’re using Google Analytics and your Google account is the same one you’re using for Search Console, then you can verify the site this way, as long as the GA code is in the <head> section of your home page (and remains there), and you have ‘edit’ permission.
5) Google Tag Manager: this option allows you to use your own Google Tag Manager account to verify your site, providing you’re using the ‘container snippet’ and you have ‘manage’ permission.
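As an illustration of the HTML tag method, the verification tag sits in the <head> of your homepage and looks something like the sketch below. The content value here is a made-up placeholder; Google generates the real token for you when you choose this method.

```html
<!DOCTYPE html>
<html>
  <head>
    <title>Your homepage</title>
    <!-- Verification tag supplied by Search Console; the token below is a placeholder -->
    <meta name="google-site-verification" content="example-token-1234567890" />
  </head>
  <body>
    ...
  </body>
</html>
```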
Now that you’re verified, you’ll be able to see your site on the Home screen (as well as any other sites you’re a webmaster for). Here you can access the site, add another property and see how many unread messages you’ve received from Google.
If you click on your site, you will be taken to its own unique Dashboard.
For the purposes of the following walk-throughs, I’ll be using my own website Methods Unsound, which means you can see all the things I need to fix and optimise in my own project.
Dashboard
Here’s where you can access all of your site’s data, adjust your settings and see how many unread messages you have.
The left-hand Dashboard Menu is where you can navigate to all the reports and tools at your disposal.
The three visualisations presented on the Dashboard itself (Crawl Errors, Search Analytics, and Sitemaps) are quick glimpses at your general site health and crawlability. These act as short-cuts to reports found in the left-hand menu, so we’ll cover them as we walk through the tools.
Also note that Google may communicate a message directly on the dashboard, if it’s deemed important enough to be pulled out of your Messages. As you can see I have errors on my AMP pages that need fixing, but we’ll look at this when we get to the Dashboard Menu section further down.
First let’s take a look at settings…
Settings
Clicking on the gear icon in the top right corner will give you access to a variety of simple tools, preferences and admin features.
Search Console Preferences
This is simply where you can set your email preferences. Google promises not to spam you with incessant emails, so it’s best to opt in.
Site Settings
Here’s where you can set your preferred domain and crawl rate.
- Preferred domain lets you set which version of your site you’d like indexed and whether your site shows up in search results with the www prefix or without it. Links may point to your site using http://www.example.com or http://example.com, but choosing a preference here will set how the URL is displayed in search. Google states that: “If you don’t specify a preferred domain, we may treat the www and non-www versions of the domain as separate references to separate pages”, thus cannibalising your search visibility. (See the snippet after this list for a related on-page signal.)
- Crawl rate lets you slow down the rate at which Googlebot crawls your site. You only need to do this if you’re having server issues and crawling is definitely responsible for slowing down your server. Google has pretty sophisticated algorithms to make sure your site isn’t hit by Googlebot too often, so this is a rare occurrence.
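The preferred domain itself is a setting made within Search Console, so there’s nothing to add to your pages for it. A related (but separate) on-page signal you can use alongside it is a canonical link pointing at your preferred version of each URL; the snippet below is a hypothetical sketch, not something Search Console requires.

```html
<head>
  <!-- Hypothetical example: declares the www version as the canonical URL for this page -->
  <link rel="canonical" href="http://www.example.com/some-page/" />
</head>
```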
Change of Address
This is where you tell Google if you’ve migrated your entire site to a new domain.
Once your new site is live and you’ve permanently 301 redirected the content from your old site to the new one, you can add the new site to Search Console (following the Add a Property instructions from earlier). You can then check the 301 redirects work properly, check all your verification methods are still intact on both old and new sites, then submit your change of address.
This will help Google index your new site more quickly than if you just left the Googlebots to detect all your 301 redirects of their own accord.
Google Analytics Property
If you want to see Search Console data in Google Analytics, you can use this tool to associate a site with your GA account and link it directly with your reports.
If you don’t have Google Analytics, there’s a link at the bottom of the page to set up a new account.
Users & Property Owners
Here you can see all the authorized users of the Search Console account, and their level of access.
You can add new users here and set their permission level.
- Anyone listed as an Owner will have permission to access every report and tool in Search Console.
- Full permission users can do everything except add users, link a GA account, and inform Google of a change of address.
- Those with Restricted permission have the same restrictions as Full permission users, plus they only have limited viewing capabilities on data such as crawl errors and malware infections. They also cannot submit sitemaps, URLs or reconsideration requests, or request URL removals.
Verification Details
This lets you see all the users of your Search Console account, their email addresses and how they were verified (including any unsuccessful attempts).
You can unverify individuals here (providing you’re the owner).
Associates
Another Google platform, such as a Google+ or AdWords account, can be associated (or connected) with your website through Search Console. If you allow an association request, it will grant the associate capabilities specific to the platform they are associating with you.
Here’s an example direct from Google: “Associating a mobile app with a website tells Google Search to show search result links that point to the app rather than the website when appropriate.”
If you add an associate, they won’t be able to see any data in Search Console, but they can do things like publish apps or extensions to the Chrome Web Store on behalf of your site.
Dashboard Menu
Here’s where you’ll find all the reports and tools available in Search Console.
Let’s look at each option one-by-one.
Messages
Here’s where Google communicates with webmasters.
Again, you won’t get spammed here, as Google promises not to bombard you with more than a couple of messages a month. You do need to pay attention when you receive one though, as this is where you’ll be informed if your site’s health is compromised.
This can be anything from a rise in 404 pages, to issues with crawling your site, or even more serious problems like your site being infected with malware.
Search Appearance
If you click on the ? icon to the right of ‘Search Appearance’, a handy pop-up will appear: the Search Appearance Overview, which breaks down and explains each element of the search engine results page (SERP).
By clicking on each individual element, an extra box of information will appear telling you how to optimise that element to influence click-through, and where to find extra optimisation guidance within Search Console.
Structured Data
Structured data is a way for a webmaster to add information to their site that informs Google about the context of any given webpage and how it should appear in search results.
For example, you can add star ratings, calorie counts, images or customer ratings to your webpage’s structured data and these may appear in the snippets of search results.
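As a rough illustration (all values below are invented, not taken from any real page), a recipe page’s star rating and calorie count could be marked up with schema.org JSON-LD placed in the page’s HTML like this:

```html
<!-- Illustrative schema.org Recipe markup; all values are made up for the example -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Chocolate Brownies",
  "image": "https://www.example.com/images/brownies.jpg",
  "nutrition": {
    "@type": "NutritionInformation",
    "calories": "350 calories"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "89"
  }
}
</script>
```

You can paste a snippet like this into Google’s Structured Data Testing Tool to check it parses correctly before adding it to a page.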
The Structured Data section in Search Console contains information about all the structured data elements Google has located on your site, whether from Schema markup or other microformats.
It will also show you any errors it has found while crawling your structured data. If you click on the individual ‘Data Types’ it will show you exactly which URLs contain that particular markup and when it was detected.
If you click one of the URLs listed, you can see a further breakdown of the data, as well as a tool to show exactly how it looks in live search results. Just click on ‘Test Live Data’ and it will fetch and validate the URL using Google’s Structured Data Testing Tool.
Data Highlighter
Data Highlighter is an alternative to adding structured data to your HTML. As the explainer video below says, it’s a point-and-click tool where you can load any webpage from your site then highlight various elements to tell Google how you want that page to appear in search results.
There’s no need to implement any code on the website itself and you can set the Data Highlighter so it tags similar pages for you automatically.
To begin, click on the big red ‘Start Highlighting’ button…
Then enter the URL you wish to markup…
Then start highlighting and tagging…
After you hit publish, Google will take your added structured data into account once it has recrawled your site. You can also remove any structured data by clicking ‘Unpublish’ on the same page if you change your mind.
HTML Improvements
This is where Search Console will recommend any improvements to your meta descriptions and title tags, as well as informing you of any non-indexable content.
This is a very handy, easy-to-use feature that gives you optimisation recommendations that you can action right away.
For instance, if I click on the ‘Short meta descriptions’ link, I’ll be able to see the 14 URLs and their respective meta descriptions. I can then go into each one of these pages in my own CMS and add lengthier, more pertinent text.
Title tags and meta descriptions should be unique for each page and fall within certain character lengths, so for the purposes of both user experience and keeping Google informed about your site, this is a worthwhile report.
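For reference, both elements live in the <head> of each page; a minimal sketch (with invented example text) looks like this:

```html
<head>
  <!-- A unique, descriptive title; roughly 50-60 characters tends to display in full -->
  <title>How to Use Google Search Console: A Complete Guide</title>
  <!-- A unique meta description of roughly 150-160 characters, often used as the search snippet -->
  <meta name="description" content="A walk-through of every report and tool in Search Console, from verification and sitemaps to crawl errors and search analytics." />
</head>
```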
Sitelinks
Sitelinks are the subcategories that appear under the main URL when you search for a brand or a publisher.
Sadly you can’t specify to Google which categories you want highlighted here, but if you’re popular enough and your site’s architecture is solid enough then these will occur organically.
However in the Sitelinks section of Search Console, you can tell Google to remove a webpage that you DON’T wish to be included as a sitelink in your search results.
Accelerated Mobile Pages
This is a brand new tool, as Google’s AMP programme has only been available since earlier this year. AMP is a way for webmasters to serve fast-loading, stripped down webpages specifically to mobile users. Site speed and mobile friendliness are considered ranking signals so this is an important feature, although some SEOs are slow to adopt it.
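In practice, the regular page and its AMP version point at each other with a pair of link tags so that Google can discover the AMP version. The URLs below are hypothetical:

```html
<!-- On the regular (canonical) page, in the <head>: -->
<link rel="amphtml" href="https://www.example.com/article/amp/" />

<!-- On the AMP page itself, in the <head>: -->
<link rel="canonical" href="https://www.example.com/article/" />
```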
As you can see from the report below, we’ve just started introducing AMP to our webpages and making a bit of a hash of it…
Accelerated Mobile Pages lets you see all the pages on your site with AMP implemented and which ones have errors. If you click on an error, you can see a list of your URLs with that error. Then, by clicking on a URL, you’ll be shown Google’s recommended fix.
Clearly we have some custom JavaScript issues on our site that need addressing. If you click on the ‘Open Page’ button, you can see exactly how your AMP content appears on mobile.
Search Traffic
Search Analytics
Search Analytics tells you how much traffic you get from search, revealing clicks and impressions delivered on SERPs. It will also work out your click-through rate (CTR) and reveal your average organic position for each page.
And here’s the *really* good stuff… you can also see the queries that searchers are using in order to be served your site’s content.
The data for this is collected differently from Google Analytics, so don’t expect it to tally; what this feature is really useful for is seeing which keywords and phrases are driving traffic to your site, as well as which individual pages are generating that traffic.
You can toggle between a variety of options, filters and date-ranges. I highly recommend looking at Impressions and CTR, to see which pages are generating high visibility but low click-through rate. Perhaps all these pages need is a tweak of a meta-description or some structured data?
Links to Your Site
Here’s where you can see the domains that link to your site and its content the most, as well as your most linked webpages.
This isn’t an exhaustive list, but a good indicator of where your content is appreciated enough to be linked. Clicking on the URLs on the right-hand side will show where each one is being linked from.
Internal Links
Here is where you can see how often each page on your site has been internally linked. Clicking on each ‘Target page’ will show a list of URLs where the internal link occurs.
There is a limit to how many ‘Target pages’ Search Console will show you, but if you have a small number of pages you can reverse the sort order and see which target pages have zero internal links. You can then go into your site and give these pages an internal link, or redirect them to somewhere else if they’re old legacy pages.
Manual Actions
This is where Google will inform you if it has administered a manual action to your site or specific webpage.
Google will offer any recommendations for you to act upon here, and will give you the chance to resubmit your site for reconsideration after you’ve fixed any problems.
Here’s a guide to what Google will most likely give you a manual penalty for and how you can avoid it.
International Targeting
Here you can target an audience based on language and country.
- Country: If you have a neutral top-level domain (.com or .org), geotargeting helps Google determine how your site appears in search results, particularly for geographic queries. Just pick your chosen country from the drop-down menu. If you don’t want your site associated with any country, select ‘Unlisted’.
- Language: If you manage a website for users speaking a different language, you need to make sure that search results display the correct version of your pages. To do this, insert hreflang tags in your site’s HTML, as this is what Google uses to match a user’s language preference to the right version of your pages. Or alternatively you can use sitemaps to submit language and regional alternatives for your pages.
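To illustrate the hreflang approach just described, a page with British English and German versions might carry annotations like these in its <head> (the URLs are invented):

```html
<!-- Each language/region version of the page should reference all the alternatives -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/page/" />
<link rel="alternate" hreflang="de-de" href="https://www.example.com/de-de/seite/" />
<!-- Fallback for users whose language or region isn't matched above -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```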
Mobile usability
As mobile has overtaken desktop for searches this year, obviously your site has to be mobile-friendly, otherwise you’re providing a poor user experience to potentially half your visitors.
This report tells you of any issues your site has with mobile usability. And you’ll really want to be seeing the following message, as Google explicitly states you’ll otherwise be demoted.
Possible errors that will be highlighted by Search Console here include:
- Flash usage: mobile browsers do not render Flash-based content, so don’t use it.
- Viewport not configured: visitors to your site use a variety of devices with differing screen sizes, so your pages should specify a viewport using the meta viewport tag (see the snippet after this list).
- Fixed-width viewport: viewports fixed to a pixel-size width will flag up errors. Responsive design should help solve this.
- Content not sized to viewport: if a user has to scroll horizontally to see words and images, this will come up as an error.
- Small font size: if your font size is too small to be legible and requires mobile users to ‘pinch to zoom’ this will need to be changed.
- Touch elements too close: tappable buttons that are too close together can be a nightmare for mobile visitors trying to navigate your site.
- Interstitial usage: Google will penalise you if you’re using a full-screen interstitial pop-up to advertise an app when a user visits your mobile site.
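As a quick reference for the viewport points above, the standard responsive viewport declaration goes in the <head> of every page:

```html
<head>
  <!-- Tells mobile browsers to match the layout width to the device's screen width -->
  <meta name="viewport" content="width=device-width, initial-scale=1" />
</head>
```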
Google Index
Index Status
This lets you know how many pages of your website are currently included in Google’s index.
You can quickly see any worrying trends from the last year (for instance that little dip in May 2015), as well as any pages that have been blocked by robots or removed.
Content Keywords
Here you can see the most common keywords found by the Googlebots as they last crawled your site.
If you click on each keyword, you’ll be able to see the other synonyms found for that keyword, as well as the number of occurrences.
As Simon Heseltine suggests, look out for unexpected, unrelated keywords showing up as it’s an indication your site may have been hacked and hidden keywords have been injected into your pages.
Blocked resources
This section lets you know of any images, CSS, JavaScript or other resources on your site that are blocked to Googlebots.
These are listed by hostname, then by specific page, and you can follow the steps provided to diagnose and resolve each issue.
Remove URLs
This is essentially where you can make your content disappear from Google’s search results.
This only acts as a temporary fix, but by the time you’ve done this and either deleted the offending webpage or 301 redirected it elsewhere, there should theoretically no longer be a record of it.
Just enter the URL then select whether you want it removed from the search results and the cache, just from the cache or if you want an entire directory removed.
Be warned: this request can take between two and 12 hours to be processed.
Crawl
Crawl Errors
This report shows all the errors that Google has found when crawling your site over the last 90 days.
Site errors: the top half of the screen shows three tabs; click on each to see any past problems with your DNS, your server connectivity, or whether a crawl had to be postponed. (Google will postpone a crawl rather than risk crawling URLs you don’t want indexed.)
URL errors: the bottom half of the screen shows URL errors for desktop, smartphone and feature phone (a phone that can access the internet, but doesn’t have the advanced features of a smartphone).
You’ll likely see reports for the following on all three device types:
- Server error: Google can’t access your site because the server is too slow to respond, or because your site is blocking Google.
- Soft 404: this occurs when your server returns a real page for a URL that doesn’t actually exist on your site. You should configure these pages to return a 404 (Not Found) or 410 (Gone) response code.
- Not found: these are all your 404 pages that occur when a Googlebot attempts to visit a page that doesn’t exist (because you deleted it or renamed it without redirecting the old URL, etc.) Generally 404 pages are fine and won’t harm your rankings, so only pay attention to the ones related to high-ranking content.
Crawl Stats
This section shows the progress of Googlebots crawling your site in the last 90 days.
You can see how fast your pages are being crawled, kilobytes downloaded per day and average time spent downloading pages on your site.
Spikes are perfectly normal, and there’s not very much you can do about them. But if you see a sustained drop in any of these charts then it might be worth investigating to see what’s dragging it down.
Fetch as Google
Here you can check how any page on your website is seen by Google once it’s been crawled.
You can also submit these webpages for indexing. You may find this is a quicker way to be crawled and indexed than if you were to let Google find the page automatically.
- When you ‘Fetch’ a page, Google will simulate a crawl and you can quickly check any network connectivity problems or security issues with your site.
- ‘Fetch and Render’ does the same as the above, but it also lets you check how the page itself looks on mobile or desktop, including all resources on the page (such as images and scripts) and will let you know if any of these are blocked to Googlebots.
Remember the crawler is meant to see the same page as the visitor would, so this is a good way to get a direct on-page comparison.
If the page is successfully fetched and rendered, you can submit it to the index. You are allowed 500 webpage fetches per week, but you can only submit a webpage and have Google crawl ALL the pages linked within it 10 times per month.
robots.txt Tester
A robots.txt file, placed in the root of your site, is where you can specify pages you don’t want crawled by search engines. Typically this is used because you don’t want your server overwhelmed by Googlebots, particularly if you want them to ignore script or style files, or if you want certain images not to appear in Google Image Search.
Here is where you can edit your robots.txt and check for errors. The bottom of the page reveals your errors and warnings.
Sitemaps
Sitemaps are hosted on the server of your website and basically inform search engines of every page on your site, including any new ones added. They’re a good way to help Google better crawl and understand your website.
Here’s where you can access all of the information about any sitemaps either submitted manually or found by Search Console. The blue bar represents pages or images submitted, the red bar represents actual pages and images indexed.
You can test a sitemap by clicking the ‘Add/Test sitemap’ button, and if it’s valid you can then add it to Search Console.
URL Parameters
As Simon Heseltine has previously commented, this section isn’t used much anymore since the introduction of canonical tags.
However you should use URL Parameters if, for instance, you need to tell Google to distinguish between pages targeted to different countries. These preferences can encourage Google to crawl a preferred version of your URL or prevent Google from crawling duplicate content on your site.
Security Issues
Although any security issues will be communicated with you in the Messages section and on the Dashboard screen, here’s where you can check on problems in more detail.
There’s also plenty of accessible information here about how to fix your site if it’s been hacked or been infected with malware.
Other Resources
Here’s where you can access all the tools provided by Google outside of Search Console, including the Structured Data Testing Tool and Markup Helper, which we went into greater detail about in earlier sections.
Other helpful resources here include Google My Business, which you can use to improve your business’s local search visibility, and the PageSpeed Insights tool, which will tell you exactly how well your site is performing on mobile and desktop in terms of loading time, and how to fix any issues.