For a leading search engine like Google to maintain its dominant market share, it must consistently provide the best search results to users. Sophisticated algorithms do most of the hard work – but did you know Google provides a tool to examine the “under the hood” website elements that can influence your site’s ranking? Here, we’ll help you verify your site in Search Console and learn how to interpret the most relevant reports for top search engine rank.
Google Search Console is a free yet powerful tool that gives you actionable insight into how Google crawls and indexes your website. In the past year, Google renamed the tool from Webmaster Tools to Search Console to better represent all the professions that can benefit from it, not just webmasters. The following professionals can also put Search Console data to use:
- Ecommerce business owner
- Web developer & designer
- SEO specialist
- App developer
- Marketing associate
- Online security professional
Sign-Up & Verify Search Console
When logged into a Google account, click on this link to visit Google Search Console. If you haven’t added a property yet, you’ll need to provide the site’s domain name and then click the “Add Property” button.
After entering the home page URL, you’ll need to verify that you have the authority to access the valuable Search Console data. There are five ways to verify ownership; we recommend using the alternate method: HTML meta tag. You’ll be provided with a single line of code to paste into the <head> section of your website’s HTML template (before the opening <body> tag).
If you’re not comfortable editing your template’s code directly, simply paste the tag into the Volusion Dashboard under Marketing> SEO>, in the “Globally Appended Meta Tags” field. Once you save, click the “Verify” button in the Search Console window.
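For reference, the verification tag is a single <meta> element that sits inside <head>. The snippet below is only a placement sketch – the content value shown is a placeholder, as Google generates a unique token for each property:

```html
<!DOCTYPE html>
<html>
<head>
  <title>Your Store</title>
  <!-- Google's verification tag; the content value is a placeholder for
       the unique token Search Console generates for your property -->
  <meta name="google-site-verification" content="XXXXXXXXXXXXXXXXXXXXXXXXXXXX" />
</head>
<body>
  ...
</body>
</html>
```

Keep the tag in place even after verification succeeds; removing it can cause the property to become unverified.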
Submit XML Sitemap
As a Volusion merchant, the first step to take after verifying ownership in Search Console is to submit your site’s XML sitemap. A sitemap lists all of your site’s URLs and allows Google to crawl and index the site quickly. All Volusion client sitemaps are generated and updated automatically, so this is a simple step: open Crawl> Sitemaps> Add/Test Sitemap, enter sitemap.xml in the field following yoursite.com/, and submit.
The index status will initially read “pending,” but check back in a couple of days for the ratio of indexed to submitted URLs. While 100% of URLs may not be indexed immediately, a low index ratio could be Google’s way of telling you something is wrong with the site. Common reasons for indexing issues include pages with no content, short and unhelpful content, or duplicate content (text copied and pasted from another site).
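If you’d like to sanity-check the submitted side of that ratio yourself, a short script can count the URLs a sitemap contains. This is a sketch assuming the standard sitemap.xml format; the sample content below is a stand-in for what yoursite.com/sitemap.xml would actually return:

```python
import xml.etree.ElementTree as ET

# Namespace used by standard XML sitemaps
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def count_sitemap_urls(xml_text):
    """Count the <loc> entries in a sitemap document."""
    root = ET.fromstring(xml_text)
    return len(root.findall(f"{{{SITEMAP_NS}}}url/{{{SITEMAP_NS}}}loc"))

# Stand-in sitemap; in practice you would fetch yoursite.com/sitemap.xml
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://yoursite.com/</loc></url>
  <url><loc>https://yoursite.com/category/widgets</loc></url>
  <url><loc>https://yoursite.com/product/blue-widget</loc></url>
</urlset>"""

print(count_sitemap_urls(sample))  # 3
```

Comparing that count against the indexed figure Search Console reports gives you the same ratio described above.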
Address Duplicate & Poorly Written Meta Tags
Each page on a website should exist to serve a distinct purpose, and a page’s meta tags reinforce this to potential visitors on the search results page. The Search Appearance> HTML Improvements section of Search Console lists the pages whose meta data can be improved to better communicate the value of that page. Click the “duplicate,” “long,” “missing” and “short” links beneath the Meta Description and Title Tag headings to see the pages flagged in each report. If you need a refresher on how to craft these tags that show up on the search results page, check out this post.
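As a quick illustration, a product page’s title and description tags might look like the fragment below. The store name and copy are placeholders – the point is that every page gets its own unique, descriptive pair rather than a duplicated or empty one:

```html
<head>
  <!-- Unique, descriptive title (roughly 60 characters or fewer) -->
  <title>Blue Ceramic Coffee Mugs | Example Store</title>
  <!-- Unique description (roughly 160 characters or fewer);
       this is often the snippet searchers see in results -->
  <meta name="description" content="Shop handmade blue ceramic coffee
    mugs in 12 oz and 16 oz sizes. Free shipping on orders over $50." />
</head>
```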
Obtain Traffic Driving Search Terms
Search Console’s Search Traffic> Search Analytics section is an interactive report you’ll use to learn more about your customers and how they arrived at your website. Compare and filter the search terms customers entered and the pages they landed on by clicks, impressions, click-through rate and average position. The data will give you a better idea of whether your site is appropriately optimized for relevant terms or whether it needs more accurately optimized content. This product categorization blog is a nice review of how an organized keyword hierarchy communicates to Google.
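The click and impression math behind the report is simple to reproduce. Here’s a sketch, using made-up query data in the shape Search Analytics presents, that computes click-through rate and sorts queries by impressions to surface terms that rank well but attract few clicks:

```python
# Hypothetical query data: (query, clicks, impressions)
queries = [
    ("blue widgets", 40, 1000),
    ("widget store", 5, 800),
    ("buy widgets online", 90, 1200),
]

def ctr(clicks, impressions):
    """Click-through rate: the share of impressions that became clicks."""
    return clicks / impressions if impressions else 0.0

# High impressions but a low CTR suggests the page ranks, yet its
# title and description tags aren't compelling enough to earn the click.
for query, clicks, impressions in sorted(queries, key=lambda q: q[2], reverse=True):
    print(f"{query}: {ctr(clicks, impressions):.1%} CTR")
```

In this example data, “widget store” shows 800 impressions but under a 1% click-through rate – exactly the kind of term whose meta tags deserve another look.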
Learn About Links
An influential ranking factor Google uses to gauge site authority is the number and quality of links pointing to your site. Think of an authentic backlink as a vote of confidence or endorsement for the specific topic at hand. Search Console’s Search Traffic> Links to Your Site report documents which sites link to you most often, which pages on your site are linked to, and the anchor text phrases visitors click to reach your site.
Link building best practices are an entire topic in themselves, and we’ve got you covered with a dedicated post available here. Networking to acquire these links should be a practice of quality over quantity; each link should help your customers. If you’ve taken shortcuts or paid for links in the past, the next paragraph may be for you.
Has Google Flagged Your Site?
Google’s vigilant manual webspam team exists to audit sites for spam, paid links and other black-hat tactics that its algorithms were unable to detect. More information on webspam can be found here. If your site has been flagged, this is a serious notification that must be addressed immediately; otherwise, your site could be removed from the Google index.
In Search Traffic> Manual Actions, most sites will have the “all-clear” message: No manual webspam actions found. If you have a more ominous note from Google, consult this post which covers how to clean up unnatural links and tackle manual penalties.
Check Crawl Errors
A site that Google can easily crawl is a precursor to earning organic search traffic. A site with excessive DNS, server and URL errors does not show Google that you’re a high-quality site deserving of valuable visits. Regularly checking Search Console’s Crawl> Crawl Errors report will help you catch broken pages returning 404 errors and take appropriate steps to fix the situation.
A 404 Not Found record is the URL of a page unable to be accessed by Google bots and visitors. While not every error is cause for concern (it’s normal for an ecommerce site to remove older products), this is valuable data. A 404 Not Found record can signal a botched platform migration, accidentally deleted products/categories, incorrectly coded links or an improperly coded 404 error page. If a page returning an error is still important to you, recreate the page or implement a 301 redirect to the most applicable live page so SEO value is passed from the old page to the target URL.
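Search Console lets you download crawl error lists, which makes it easy to triage the 404s in bulk. The sketch below assumes a simple CSV export – the column names here are assumptions, so adjust them to match your actual download:

```python
import csv
import io

# Stand-in CSV export; column names ("URL", "Response Code") are
# assumptions about the shape of a crawl errors download.
export = """URL,Response Code
https://yoursite.com/old-product,404
https://yoursite.com/category/widgets,200
https://yoursite.com/discontinued-sale,404
"""

def not_found_urls(csv_text):
    """Return the URLs whose crawled response code was 404."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["URL"] for row in reader if row["Response Code"] == "404"]

for url in not_found_urls(export):
    print(url)  # candidates for recreation or a 301 redirect
```

Each URL this surfaces is a candidate to recreate or 301-redirect, as described above.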
To wrap things up, Google Search Console is an invaluable tool for professionals who make their living on the Internet. This post focused on our favorite reports, but we recommend exploring the tool for yourself. Getting stuck? Every page has a “Help” button for assistance with the specific report. Search Console collects and reports the results of search bot crawls and, if interpreted and acted on correctly, will help you build and maintain a search-engine- and customer-friendly website. Feel free to post any questions or thoughts in the comments here.