SEO News

  1. How to Use Twitter Lead Gen Cards to Increase Newsletter Sign-ups

    The file will come in Excel form, and then you can enter the data into your email list. The fans that regularly read your newsletter are often your most valuable brand advocates. They are the amplifiers that spread the good word about your brand.
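As a rough sketch of what working with that lead export might look like, the snippet below parses a lead file with Python's csv module and collects the email addresses. The column names and values are assumptions for illustration, not Twitter's actual export schema.

```python
import csv
import io

# Hypothetical Lead Gen Card export saved as CSV
# (the column names here are assumptions, not Twitter's real schema).
raw = io.StringIO(
    "name,screen_name,email\n"
    "Ann Example,@ann,ann@example.com\n"
    "Bob Example,@bob,bob@example.com\n"
)

# Pull out just the email column for the newsletter list.
emails = [row["email"] for row in csv.DictReader(raw)]
```

In practice you would open the downloaded file instead of an in-memory string, then feed the addresses into your email service's import tool.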

  2. How to Detect and Deal With Toxic Content (That Could Poison Your Entire Site)

    Dead content: Pages that serve a 404 error because that file/content no longer exists. Duplicate content: One of the quickest ways to spot duplicate content with Screaming Frog is to run a crawl report of your site and identify duplicated page...
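A minimal illustration of the duplicate-content idea (Screaming Frog surfaces the same thing via a hash column in its crawl report): fingerprint each page's normalized text and group URLs that share a fingerprint. The URLs and page text below are made up for the sketch.

```python
import hashlib

def content_fingerprint(page_text):
    # Normalize case and whitespace so trivially different copies hash the same.
    normalized = " ".join(page_text.split()).lower()
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

def find_duplicates(pages):
    """pages: dict of url -> page text. Returns fingerprint -> [urls] for dupes."""
    seen = {}
    for url, text in pages.items():
        seen.setdefault(content_fingerprint(text), []).append(url)
    return {h: urls for h, urls in seen.items() if len(urls) > 1}

# Hypothetical mini-site: the tracking-parameter URL duplicates the canonical page.
pages = {
    "/widgets": "Buy our widgets  today",
    "/widgets?ref=nav": "Buy our widgets today",
    "/about": "About our company",
}
dupes = find_duplicates(pages)
```

Any group with more than one URL is a candidate for canonicalization or consolidation.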

  3. 5 Googley SEO Terms – Do They Mean What You Think They Mean?

    Most people think the robots.txt file is used to block content from the search engines. Erroneously, webmasters and site owners will add pages and folders to their robots.txt file, thinking this means that page won't get indexed.
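To make the distinction concrete: robots.txt only asks crawlers not to fetch a URL; a disallowed page can still be indexed if other sites link to it. A sketch of the crawl-blocking directive (path is illustrative):

```text
# robots.txt — asks crawlers not to FETCH these URLs.
# It does NOT remove or keep them out of the index.
User-agent: *
Disallow: /private/
```

To keep a page out of the index, the page must instead be crawlable and carry a noindex directive, e.g. `<meta name="robots" content="noindex">` in its `<head>` — if robots.txt blocks the crawl, the engine never sees that tag.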

  4. 'Demerol,' 'Butt' & 'Preggers' Among 1,400 Strange Words Google Bans on Android

    The list, part of a file called Dictionaries in the OS code, discovered by Wired, is an odd one. Wired found the file in the Android 4.4 KitKat source code, and said that it contains 1,400 odd words. I try to Swype-type the word 'condom' and I get...

  5. Twitter Speak From @ to Z – Terms & Definitions

    If you "favorite" a tweet, it means that tweet will go into a file with all of your other favorite tweets so you can go back and read it whenever you wish. This refers to how many letters, spaces, and punctuation marks are in a tweet (just like you're...

  6. Ecommerce Product Pages: How to Fix Duplicate, Thin & Too Much Content

    Block faceted pages via the robots.txt file. To audit word count for every page on your site, crawl the site with Screaming Frog and look for potential trouble spots in the "Word Count" column. Ecommerce sites are particularly at risk, largely due...
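A sketch of the same word-count audit done by hand: strip tags from each page's HTML and flag anything under a threshold. The 250-word cutoff and the sample page here are arbitrary assumptions for illustration; Screaming Frog's "Word Count" column gives you the same signal at crawl scale.

```python
import re

def word_count(html):
    # Drop script/style blocks first, then strip the remaining tags.
    text = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"<[^>]+>", " ", text)
    return len(text.split())

def flag_thin_pages(pages, threshold=250):
    """pages: dict of url -> raw HTML. Returns URLs under the word-count threshold."""
    return [url for url, html in pages.items() if word_count(html) < threshold]

# Hypothetical thin product page: only three words of visible copy.
sample = "<html><body><h1>Widget</h1><p>Short description.</p></body></html>"
thin = flag_thin_pages({"/product": sample})
```

Pages that come back flagged are candidates for expanded copy, consolidation, or noindexing.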

  7. 5 New Website Vulnerabilities Straight from Black Hat & DEF CON

    When you visit the page, in the background and unbeknownst to you, you're also visiting one of these services, which are used to unlock an encrypted file that downloads the malicious attack. See if you can pull in a simple hello world JavaScript file...