The amount of change possible on a large business Web site can be limited by either company policies or outside oversight, such as Sarbanes-Oxley (SOX) compliance requirements. Compliance programs, both internal and external, are essential for maintaining quality and an approval process that keeps damaging requests from getting through. But these processes can cause problems when SEO changes need to be made.
The added layer of complexity these programs create can slow the Web development release process, or even prevent a release altogether. Additional problems arise when SEO changes are requested, since SEO is rarely well understood elsewhere in the organization and is thus commonly considered less important.
The amount of information that can be changed on a Web site varies from company to company. Most often, companies that carry financial information on their sites can change little or nothing without a thorough review and audit. The governing parties that watch these sites clearly dictate the changes that can be made. In many cases, your SEO-related changes won't make it out the door, losing out to tactics that may once have worked in the traditional marketing world.
There are a few ground rules to consider when undertaking a search marketing project at a large organization with compliance constraints:
Is the Site Crawlable?
You can determine this by checking whether the content of a unique page appears in the search results. Test this by searching for an exact sentence from a given page on the site (a page three clicks from the home page usually works best for this experiment).
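Before running that test, it's also worth ruling out a robots.txt block, since a disallowed page will never appear in the results no matter how good it is. Here's a minimal sketch using Python's standard `urllib.robotparser`; the robots.txt body and paths are made-up examples, and in practice you would fetch the real file from your site:

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt body; in practice, fetch it from your own site.
robots_txt = """\
User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# can_fetch() reports whether the named crawler may request a given URL
print(parser.can_fetch("Googlebot", "/product?type=1234"))  # True  (public page)
print(parser.can_fetch("Googlebot", "/admin/settings"))     # False (disallowed path)
```

If a page you expect to rank comes back blocked here, fix the robots.txt rule before investigating anything else.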
If you don't find any results, make sure there isn't a problem with the Web server. Sign up for Google Webmaster Tools to see samples of the errors reported for a specific site. Some errors are much more important to monitor than others.
For example, if you see a list of errors in the 5xx range (500, 501, 502), have your system administrators make sure no application is running that may be pushing away spiders. One example is mod_throttle, an Apache module that can limit a spider's ability to crawl a site.
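If you have access to raw crawl or server-log data, filtering for that 5xx range is a one-liner. A small sketch in Python; the log entries here are hypothetical placeholders, not a real log format:

```python
# Hypothetical crawl-log entries: (URL path, HTTP status code) pairs.
crawl_log = [
    ("/", 200),
    ("/products", 503),   # service unavailable -- a common sign of throttling
    ("/about", 200),
    ("/catalog", 500),    # internal server error
]

def server_errors(entries):
    """Return the entries whose status code falls in the 5xx range."""
    return [(path, status) for path, status in entries if 500 <= status <= 599]

for path, status in server_errors(crawl_log):
    print(f"{status} on {path} -- worth checking for throttling modules")
```

Recurring 5xx responses on the same paths are the pattern to hand your system administrators.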
How Complicated Are the URLs?
Depending on the configuration, many dynamic Web servers make it easy to display content from a database by simply requesting a specific stored record or combination of stored records. Usually the request is transmitted to the Web server as one or more dynamic values. An example URL with dynamic values might look like www.interestingwebsite.com/product?type=1234&category=1144. In this particular example, the two parameters trigger two distinct database calls.
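Counting the dynamic attributes in a URL like that one is easy to automate when auditing a large site. A short sketch using Python's standard `urllib.parse`, applied to the hypothetical URL above:

```python
from urllib.parse import urlparse, parse_qs

def dynamic_attribute_count(url):
    """Count the dynamic attributes (query-string parameters) in a URL."""
    return len(parse_qs(urlparse(url).query))

# The hypothetical example URL from above carries two dynamic attributes.
url = "http://www.interestingwebsite.com/product?type=1234&category=1144"
print(dynamic_attribute_count(url))  # 2
```

Run this across a crawl of your URLs and flag anything above your chosen threshold.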
Google recommends using no more than one dynamic attribute in any given URL. This recommendation may be fine for older sites that have been established for some time. However, new sites have little authority to pass along, so URLs of this type may actually cause problems and reduce the rank flowing to these pages. I recommend using no dynamic attributes in a URL at all.
Such pages commonly have little to no PageRank, the score Google assigns to any given page based on criteria it considers important to that page's relevance. Sometimes this score is consistent; other times it's very difficult to predict.
How Far Does a User Have to Click to Find What They Want?
A user will commonly click around a site to find what they're looking for. The flatter or shorter the click path, the better the perceived experience.
Search engine crawlers won't type random phrases into your site's search box, so content reachable only through site search will draw little to no traffic. It's important that the engines can find what they're looking for quickly and easily through internal links.
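Click depth can be measured directly from your internal-link structure with a breadth-first search from the home page. A sketch in Python; the link graph here is a toy, made-up site map:

```python
from collections import deque

# A toy internal-link graph: page -> pages it links to (hypothetical site).
links = {
    "/": ["/products", "/about"],
    "/products": ["/products/widget"],
    "/about": [],
    "/products/widget": [],
}

def click_depths(graph, start="/"):
    """Breadth-first search from the home page, recording each page's click depth."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

print(click_depths(links))
# {'/': 0, '/products': 1, '/about': 1, '/products/widget': 2}
```

Pages with high depths, or pages missing from the result entirely (no internal links pointing to them), are the ones crawlers struggle to reach.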
Long story short: if your site isn't being picked up by the search engine crawlers, then no matter what you do to each page, the changes won't be picked up either.