Tips On Across The Engines Ranking From SEO Book

People are beginning to remember again that there are search engines beyond Google. This reawakening is one reason why I added the “Can You Please Them All?” session to our upcoming Search Engine Strategies San Jose show next August. Aaron Wall over at SEO Book has also clearly seen the renewed interest in pleasing more than Google. Out today is his excellent Google vs Yahoo! vs MSN Search: Defining Search Engine Relevancy piece.

It’s great and deserves a better title, something that better explains the main focus: tips on what each of the major search engines seems to like when ranking web pages. You can skim the short tips up top or dig into the long, detailed tips further down.

Of course, one of the biggest challenges when you try to please different search engines’ ranking recipes is that it’s easy to get caught up in the perfect page trap: a lot of work trying to match what you think each search engine likes, often without getting the results you expect.

We went through this before, back in 1999-2000 or so. I did a long piece on it in 2000, In Pursuit Of The Perfect Page, for Search Engine Watch members, and plenty of its main points still hold up today:

Occasionally, I get questions about what “numbers” or “rules” should be
followed to construct the perfect page for each crawler-based search engine.
In other words, how many times should a term appear on a page for it to rank
in the top results? How often should a term be repeated in a meta tag to
attain success? How often can a term be repeated before a spam penalty ensues?

No one has these numbers, honest. Those that say they do are merely making
educated guesses at reverse engineering the crawler-based search engines. You
will always see exceptions to their perfect page formulas. Additionally, the
twin rise of a greater reliance on “off-the-page” ranking criteria and
human-compiled listings makes focusing on perfect page construction much less of an activity than in the past (see additional articles about this at the end of
this story). Those who are looking forward in the world of search engines are
not worrying about “keyword densities.” Instead, they are building content,
building links and doing other activities that will benefit them in the
future….

Enlarge your site to have real
content on the terms you want to be found for. If you sell shoes, have
articles about how to select different types of shoes. If you offer package
holidays, provide some tourist information about your destinations. Build this
“real” content and optimize it for your target terms. Then go out and link
build. Find sites that are non-competitive with yours but on related topics
and offer to swap links. These two activities are akin to building a house,
while concentrating on doorways is similar to renting. Renting is easy and
offers a lot of advantages, but at the end of the day, you don’t own anything.
Concentrate on building your house, and you should see traffic from search
engines and other publicity venues over the long term.
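
For anyone who never ran into the metric the quote dismisses: “keyword density” was simply the number of times a term appears on a page divided by the page’s total word count. Here’s a toy sketch of that naive calculation in Python; the function name is mine, not from any tool of the era:

def keyword_density(text, term):
    # Naive metric: occurrences of the term divided by total words.
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(term.lower()) / len(words)

# Example: keyword_density("buy shoes online shoes", "shoes") returns 0.5

The point of the quote stands: there was never a magic density value to hit, which is exactly why chasing such numbers was a dead end.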

I generally advised most people back then not to pursue a perfect page strategy, and I’ve stuck to that when the topic came up on our Search Engine Watch Forums recently:

I can remember (there he goes again) when WebPosition rolled out its page analyzer tool to help those who wanted to build pages targeting Infoseek, AltaVista, etc. I can’t say I miss the days when it was easy to tell what someone was doing because their URLs would all end in the initials of the search engine a page was targeted at (page-is.html, page-av.html, page-ex.html).

I always got a kick out of it when a page someone had clearly targeted at Excite would rank well on AltaVista, and so on. To me, it underscored how these tools that promised the “perfect” page often wouldn’t deliver; plus, the differences they recommended were often so slight as not to matter.

Still, I can see that coming back. In fact, the SES San Jose agenda just went up, and the beginning of the second day is a session called “Can You Please Them All?” Session descriptions haven’t been posted yet, but we’re going to revisit the issue. People again want that top ranking across the board without losing the rankings they already have. I think building three different sites and using robots.txt to ban all the untargeted search engines from each one (see the sketch after this quote) is a massive pain for most people, so I instead tend to recommend the “be happy ranking well with one of them” approach. But that’s not for everyone, I know.
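
To make the three-sites idea concrete, here’s a minimal robots.txt sketch for the copy of a site aimed only at Google, assuming the crawler user-agent names of the day (Slurp for Yahoo, msnbot for MSN); the other two copies would flip which crawlers get blocked:

# robots.txt for the Google-only copy of the site

# Block Yahoo's crawler
User-agent: Slurp
Disallow: /

# Block MSN's crawler
User-agent: msnbot
Disallow: /

# Everyone else, including Googlebot, may crawl everything
User-agent: *
Disallow:

An empty Disallow line means nothing is off limits, so Googlebot crawls freely while the other two stay out. Multiply that by three sites and keep it all in sync, and you can see why it’s a pain.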

Again, for most people, trying to craft the perfect page for each search engine across all your terms is probably going to be too much work. But some will want to make the effort, especially in a more selective manner, for their most important terms. That’s even more true when you see specific tips continuing to work over a long period, such as MSN still seeming to love keyword-rich URLs and domain names (think example.com/running-shoes.html rather than a string of query parameters), as Aaron notes.
