Page-level tracking is more important than ever. Google has forced us to look at which pages are ranking and where, then pull in data from other sources. Here's how your PPC data can help guide your SEO strategy, along with other actionable next steps.
Articles by Ben Goodsell
There is a widespread belief that Google Webmaster Tools search query data can be dismissed as worthless and inaccurate. Wrong. Downloading, categorizing, and analyzing trends over time is the best way to get the most out of what Google is giving us.
Applying PPC data has been key in accurately estimating the impact of missing iOS 6 organic traffic data. Unfortunately, Google is now returning trackable referral information for PPC only, which skews the ability to apply this information accurately.
Users are being conditioned to check (and click) the area of Google's SERP design that prominently displays information pulled via Knowledge Graph, Maps, and (conveniently for revenue) product listing ads, depending on the intent of a search query.
Buried deep within Google’s best practices documentation (for mobile), a potentially explosive burden lies dormant — one that can negatively affect page load speed, bounce rates, and potentially rankings. Discover the problem and what you can do about it.
In iOS 6, Apple decided to default to Google’s secure search. But this traffic isn’t showing up as Not Provided (Google Analytics) or Keyword Unavailable (Omniture). Instead, the majority of iOS organic search gets bucketed as direct traffic.
Too often, waiting on the client or development team to implement website changes that would reduce excess crawling and indexation can hurt the bottom line. Here’s how to bypass all that with Google and Bing’s parameter handling tools.
SEO professionals are constantly coping with never-ending Google updates, but you can achieve enlightenment working in the search industry. Here are five lessons from Benjamin Franklin on temperance, silence, resolution, sincerity, and tranquility.
This article is a simple breakdown of how to use an SEO site crawler to quickly identify duplicate content. Screaming Frog is one of the most popular and powerful crawlers, and it is the spider of choice for this tutorial.
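The duplicate-content check the tutorial above describes can be sketched in a few lines. This is a minimal, hypothetical example, assuming a crawl export where each page row carries an `Address` (URL) and a content `Hash` column — as Screaming Frog's "Internal: HTML" export does — so pages sharing a hash are byte-identical duplicates. The sample rows here are made up for illustration:

```python
import csv
from collections import defaultdict

def find_duplicates(rows):
    """Group URLs by their content hash; keep only groups with
    more than one URL, i.e. exact-duplicate pages."""
    groups = defaultdict(list)
    for row in rows:
        groups[row["Hash"]].append(row["Address"])
    return {h: urls for h, urls in groups.items() if len(urls) > 1}

# Sample rows standing in for a real crawl export
# (in practice: rows = list(csv.DictReader(open("internal_html.csv"))))
sample = [
    {"Address": "http://example.com/page-a", "Hash": "abc123"},
    {"Address": "http://example.com/page-a?session=1", "Hash": "abc123"},
    {"Address": "http://example.com/page-b", "Hash": "def456"},
]

for page_hash, urls in find_duplicates(sample).items():
    print(page_hash, urls)
```

Grouping on the crawler's own hash column avoids re-fetching pages; near-duplicates (same body, different boilerplate) would still need a manual look.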
From gaining insight into how Google crawls and indexes your site, to getting notified of an outdated WordPress installation, here is some useful (and free) functionality Google offers webmasters after the domain verification process is complete.
Becoming efficient with keyboard shortcuts can help you navigate your computer fast enough to keep up with the real-time speed of SEO thought. Using these system and program shortcut tips will help save you some valuable time.
Not only has Google come out and supported a mobile strategy, it comes with specific SEO-friendly recommendations at no extra cost. Google supports three ways of serving mobile-specific content to users and has provided recommendations for each.
Enterprise SEO tools provide comprehensive site and competitor analysis – but some that provide incredible insight are also amazingly expensive. With a little manual effort and time you can create actionable reports at a fraction of the cost.
A successful negative SEO campaign will identify the weaknesses of your site and exploit them. So what’s the best defense? In addition to a solid SEO foundation and monitoring, here are some ways you can protect your website against these tactics.
While Chrome is super-fast and has a lot of great SEO extensions, Firefox still wins the overall battle because it displays metrics at a glance and offers greater functionality. Here’s a list of active SEO extensions and plugins to check out.
Google Webmaster Tools search query data is more accurate than currently perceived. Clicks seem to be correlated enough with visits to make a conservative estimate of (not provided) data and to show how to recover a percentage of this valuable data.
A quick overview of sitemap guidelines and limitations for Google and Bing, plus some sitemap optimization tips. Also learn a technique that will help you identify crawling and indexation issues using multiple sitemaps in Google Webmaster Tools.
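The multiple-sitemaps technique mentioned above works by splitting URLs into one sitemap per site section, so Webmaster Tools reports submitted-versus-indexed counts for each section separately and indexation gaps become visible. A minimal sketch, assuming URLs are bucketed by their first path segment (the section names, URLs, and file names here are hypothetical):

```python
from collections import defaultdict

def split_by_section(urls):
    """Bucket URLs by first path segment so each site section
    gets its own sitemap file."""
    sections = defaultdict(list)
    for url in urls:
        path = url.split("://", 1)[-1].split("/", 1)
        segment = path[1].split("/")[0] if len(path) > 1 and path[1] else "root"
        sections[segment].append(url)
    return sections

def sitemap_xml(urls):
    """Render one section's URLs as a sitemaps.org urlset."""
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>")

urls = [
    "http://example.com/",
    "http://example.com/blog/post-1",
    "http://example.com/blog/post-2",
    "http://example.com/products/widget",
]
for section, section_urls in split_by_section(urls).items():
    print(f"sitemap-{section}.xml: {len(section_urls)} URLs")
```

Each generated file would then be submitted individually (or referenced from a sitemap index) so that a section with a low indexed count stands out immediately.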
Unfortunately, there isn't a silver bullet that will compensate for losing a rising percentage of keyword data in analytics. There are several things you can do to help gain insight and work around this obstacle. Experimentation is essential.