Beginner’s Guide to SEO: Best Practices – Part 2/3

In the first post in this series, Beginner’s Guide to Search Engine Optimization, Part I, we laid the groundwork for developing our site and creating content. We looked at how to approach SEO in general and how to begin your keyword research.


As a quick reminder, it’s best to think of SEO as part of your marketing. Sometimes it will make sense to do something that doesn’t help your SEO efforts, and perhaps even hurts them a bit, because it helps your overall marketing. Remember – it’s about the big picture. You don’t want to get lost in the details.

Also keep in mind that the keywords you target set the stage for everything else. Pages don’t just rank. They rank for specific queries or keyword phrases. Your choice in keywords should be focused on what ultimately provides the most benefit for your business. Targeting a word like “free” may bring lots of traffic, but that traffic won’t be looking to spend any money.

With the above in mind, let’s look at some of the things we can and should do when developing our site. Let’s begin by making sure search engines can find our pages.

2. Search Engine Friendly Web Development

Duplicate Content

Search engines don’t want the same content littering their results. It makes no sense for them to present the exact same page multiple times for the same query. Unfortunately, most content management systems create multiple URLs for accessing the same content: category pages, tag pages, and internal search results can all expose the same content at different addresses.

You might want to block some of those URLs from being indexed through a robots.txt file, use the meta robots tag (noindex, follow, so the links on the page can still be crawled), or use a 301 (permanent) redirect to point the duplicate URLs to your URL of choice. If you let search engines decide which URL to index, it may not be the one you prefer. The canonical attribute on link tags is another option to help search engines determine which URL you want indexed.
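For example, a tag archive or internal search results page you’d rather keep out of the index could carry a meta robots tag in its <head>. This is just a sketch; the page it sits on is whatever duplicate URL you’ve decided not to index.

<!-- On a duplicate page you don't want indexed -->
<!-- noindex keeps the page out of the results, follow still lets its links be crawled -->
<meta name="robots" content="noindex, follow">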

You also want to make sure that every page on your site has unique content. Many ecommerce sites have very thin product information. For example, one product might come in several different sizes and each size gets its own page. The content on those pages will likely be exactly the same except for the size. Search engines are not likely to rank all of those pages; they’ll choose one. Better would be to create a single page and allow for a choice of size on that page. If each size must have its own page, rewrite some of the content so each page is more unique.

Canonical URLs & Duplicate Content

Canonical URLs (different from the canonical attribute mentioned above, but the same basic concept) are another example of duplicate content. Canonical URL issues are a fancy way of saying multiple URLs can lead to the same page. Your home page might be accessed via:

  • domain.com
  • www.domain.com
  • domain.com/index.html
  • www.domain.com/index.html

Those are 4 different pages in the eyes of search engines and again only one will be indexed. Just as important are the links pointing into those pages. Say one site links to domain.com and another links to www.domain.com/index.html. You might think that means your home page has 2 links pointing to it. Nope. From the perspective of a search engine that’s 1 link pointing to each of 2 different pages. You’ve effectively cut in half the benefit of those links.

If your server runs Apache with mod_rewrite enabled (more than likely it does), you can add the following to your .htaccess file to correct the canonical issue between the www and non-www versions of your domain. If not, don’t worry; there are a variety of ways to rewrite URLs. The key point is that the rewrite should be a 301 or permanent redirect. You want to tell search engines that the content over here should always be found at that URL over there.

# 301 (permanent) redirect from the non-www host to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC]
RewriteRule (.*) http://www.yourdomain.com/$1 [R=301,L]
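The rewrite above only handles the www versus non-www issue. For the index.html variants, one option (a sketch, using the same made-up domain) is the canonical link element mentioned earlier, added to the <head> of the home page and any duplicate of it:

<!-- Tells search engines which URL is the preferred, original version -->
<link rel="canonical" href="http://www.yourdomain.com/">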

Search Friendly URLs

Which of the following URLs tells you more about the content you’ll find on the page?

  • domain.com?id=3648373729&cat=12
  • domain.com/sports/baseball/statistics.php

The first tells you absolutely nothing about the page content, while the second clearly tells you the page will show baseball statistics of some kind. That’s much more usable for real people as well as search engines. It helps search engines identify what the page is about, makes use of keywords, and is easier to crawl.

Note: Search engines can crawl dynamic URLs fine. However, too many parameters can trip them up, especially when those parameters include session IDs. If you need to include parameters in your URLs, try to limit how many: 2 or 3 are ok, a dozen could cause crawling issues.

Notice in the second URL above that keywords have been used in the file and folder names. You don’t want to stuff keywords in there, but using them as above reveals a lot of information about your site and reinforces keyword themes. If your statistics page, your teams page, and your players page all link back up to the main baseball page, it helps reinforce the keyword theme “baseball” throughout that section. If you also have sections for football, basketball, and hockey all linking back up to your main sports page, it further reinforces the keyword theme “sports”.

The idea of creating these keyword themes is a concept known as theming or siloing.
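As a rough sketch of what that looks like in plain HTML links (the URLs are invented, following the baseball example above), each page in a section links back up its silo:

<!-- Section navigation on domain.com/sports/baseball/statistics.php -->
<ul>
  <li><a href="/sports/">Sports</a></li>
  <li><a href="/sports/baseball/">Baseball</a></li>
  <li><a href="/sports/baseball/teams.php">Baseball Teams</a></li>
  <li><a href="/sports/baseball/players.php">Baseball Players</a></li>
</ul>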

5 More Tips

Be found.
The most important aspect of building a search friendly site is to make sure your content can be found and indexed. If your pages aren’t indexed, they can’t appear in search results. Your first goal should be to remove any roadblocks to getting crawled and indexed. Build sites that are accessible and usable. The same principles you would follow for accessibility and usability will remove the roadblocks for search engine spiders.

Avoid Flash and JavaScript navigation.
Search engines are better at crawling text than anything else. Avoid Flash, JavaScript, and images for the navigation of your site. Progress has been made in crawling each, but best practice still suggests coding links as straight HTML. If your design calls for Flash or JavaScript in your main navigation, provide another navigation system for search spiders.

HTML Sitemap.
You can also help search engines find your content through HTML sitemaps. That’s HTML, not XML. XML sitemaps are meant to be a backup in case you’ve presented some roadblocks to being crawled. Create an HTML sitemap and link to it from your home page at the very least, and even better from all pages on your site. It’s easy enough to add a link to your sitemap in the secondary navigation you might add to your footer.
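A footer link plus a simple page of links is all it takes. Something like this sketch (the file name sitemap.html and the URLs are just examples) gives spiders a path to every section:

<!-- In the footer of every page -->
<p><a href="/sitemap.html">Sitemap</a></p>

<!-- sitemap.html itself is just a plain list of links -->
<ul>
  <li><a href="/sports/">Sports</a></li>
  <li><a href="/sports/baseball/">Baseball</a></li>
  <li><a href="/sports/baseball/statistics.php">Baseball Statistics</a></li>
</ul>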

Valid Code.
Develop with clean, valid code. Search engines don’t really care if your code is valid. In fact, none of the 4 major search engines have home pages that validate. However, since some coding errors can be show stoppers to getting crawled, it’s in your best interest to write valid code. Search engines are mainly interested in your content, and while they have little problem finding your content inside your code, the less code you make them wade through the better.

Speed matters.
Speed is now also a ranking factor, at least at Google. Use CSS instead of <table> layouts, and move CSS and JavaScript to external files. Keep HTML file sizes as small as possible. Use gzip compression. Minify files. While it’s likely a minor factor, anything you can do to speed up your site will help your rankings with Google and probably the other engines in the not too distant future.
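For instance, instead of inline styles and scripts, keep the HTML lean by referencing external files (the file names below are placeholders) that browsers can cache and that you can minify and serve with gzip compression:

<!-- External, minified stylesheet and script keep the HTML file small -->
<link rel="stylesheet" href="/css/style.min.css">
<script src="/js/site.min.js"></script>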

Some other thoughts about search friendly site development:

  • Semantic coding can reveal information about your pages and site to any application that understands the semantics. Search engines are making more use of microformats when determining what to rank for a particular query.
  • Internal links – links are an important part of SEO and that includes internal links. Most every page on your site should link to other pages on your site. You also have complete control over the anchor text of internal links.
  • Periodically test to make sure links on your site are working. Fix or remove any broken links you find.
  • Use breadcrumbs. Breadcrumbs help people and search engines understand the architecture of your site, and they naturally link back up through your sections. See keyword themes and silos above, and the sketch after this list.
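A breadcrumb trail on the statistics page from the earlier example might look something like this sketch (the URLs are illustrative):

<!-- Breadcrumbs link each page back up through its section -->
<p>
  <a href="/">Home</a> &raquo;
  <a href="/sports/">Sports</a> &raquo;
  <a href="/sports/baseball/">Baseball</a> &raquo;
  Statistics
</p>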

Resources

Again, there’s a lot more that can be said about SEO and site development. Here are a few checklists with additional tips.

3. On-Page SEO

Once upon a time this was SEO. People stuffed keywords everywhere they could and their pages ranked. Of course all that keyword stuffing was considered spam and no longer works as it once did. Today the idea is to write page content so that it reads well to real people. You also want to pay attention to a couple of key things.

Page Titles: <title></title>

Page titles are perhaps the most important thing you’ll write on the page for search engines, and they do play an important part in ranking. Keep your page titles short and include the main keyword phrase for the page. For low-competition phrases, a good page title alone is probably enough to generate a good ranking.

Include your brand in your page titles. If your brand is well known, you probably want to include it at the front. Keywords at the front of the page title are likely better for SEO, but a well known brand is going to induce more clicks. If your brand is not well known, it’s probably best to include it at the end.

Make sure every page title on your site is unique. Far too many sites use the exact same page title (often the domain) across the site, which misses the benefit page titles give.

Also remember that your page title is what people see as the link in search results. Write your titles in a way that makes people want to click on them. A good page title should contain your most important keyword phrase and make people want to read the page content.
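As a sketch (the keyword phrase and brand name are invented), a title for the baseball statistics page from earlier might be:

<!-- Main keyword phrase up front, brand at the end, kept short -->
<title>Baseball Statistics and Records | YourBrand</title>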

Page Headings: <h1> – <h6>

Page headings might not be as strong a ranking factor as they once were, but I think it’s still a good idea to include keywords and phrases in them. Create one <h1> tag as your main page heading and then use <h2> – <h6> tags to present a hierarchy for the rest of the content.

Ideally your hx headings will use variations of the main keyword phrase you used in your page title. Many CMS applications like WordPress will generate an h1 heading that’s exactly the same as your page title. Ideally there would be some variation, but again hx tags may not be as important as they once were.
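Here’s a minimal sketch of that hierarchy, using variations of the same invented phrase:

<!-- One h1 with the main phrase, h2s for subtopics using variations -->
<h1>Baseball Statistics Through the Years</h1>
<h2>Batting Statistics</h2>
<h2>Pitching Statistics</h2>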

Meta Tags: <meta>

Meta tags are not the be-all and end-all of SEO. There are 3 meta tags we’ll talk about here.

  • Meta keywords are pretty much useless. They’re far too easy to spam and are no longer considered a ranking signal. Google and Bing don’t even read them, and it’s highly unlikely the other engines pay any real attention to them. You can safely ignore them completely, but if you feel you must include them, use some common misspellings of your keywords. Seriously, if you spend more than 30 seconds writing meta keywords for a page you’re wasting your time. You probably wasted the 30 seconds too.
  • Meta descriptions likely have little if any effect on where your pages rank, for the same reasons meta keywords don’t. However, sometimes your meta description will show as the snippet below your link in the search results. Write meta descriptions in a way that entices clicks; use a strong call to action and maybe think of them as a mini-ad (see the sketch after this list).
  • Meta robots tags are used to tell search engines not to index a page or not to follow the links on a page. You never need to tell search engines to index or follow, since that’s their default behavior. Most of the time you won’t need to include these meta tags, but in the case of the duplicate content described above you sometimes don’t want a page indexed. Most of the time you’ll still want the links followed.
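Here’s the kind of meta description sketch mentioned in the list above, written as a mini-ad rather than a keyword list (the copy is invented):

<!-- May be shown as the snippet under your link, so write it to earn the click -->
<meta name="description" content="Browse batting and pitching statistics for every team since 1901. Compare players and build your own free stat reports.">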

ALT and others

ALT and other attributes and semantic tags have also been spammed to death; however, they can still be useful. ALT attributes in particular are one of the few signals you can give about images. Don’t stuff them full of keywords. Write them as they were meant to be written: as short descriptions for people who can’t see the image.

If an image is just “eye candy”, such as a gradient behind your navigation bar, leave the alt attribute blank (alt=""). What is there to describe? Stuffing attributes with keywords is more likely to get you flagged for spamming than it is to improve your ranking.
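For example (the image file names are invented), a meaningful image gets a short description while a decorative one gets an empty alt:

<!-- Describe images that carry meaning -->
<img src="/images/babe-ruth-1927.jpg" alt="Babe Ruth batting for the Yankees in 1927">
<!-- Leave alt empty for purely decorative images -->
<img src="/images/nav-gradient.png" alt="">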

The same is true for things like strong and em and any other tag or attribute you can think of. It’s highly doubtful any will play anything more than a minor role in how well your pages rank. Use them as they were intended to be used for real people reading your content. Use <strong> to add emphasis to a keyword or phrase if it makes sense, but understand that the more you use these tags the less impact they have with people reading your content.

I don’t want to leave you with the impression that adding keywords to tags and attributes won’t help at all. The point is not to obsess over small things that will have a minor impact. It’s certainly ok, and makes sense, to emphasize keywords and phrases where appropriate, but it makes no sense to add strong or em tags to every keyword mention on the page.

Ultimately, when writing page content it’s far more important to think about how well the content reads to real people than to search engines. Think about why you want the page to rank in the first place: so someone landing on it will absorb your content and take some action. So what if the page ranks well if it reads so poorly that people leave instantly?

Write a good page title, use <h1> – <h6> headings to organize your content and allow people to scan the page, and write your content for your readers. Write naturally. Don’t try to force keywords onto the page, and use variety in your language. Sometimes call it SEO, sometimes call it search engine optimization, sometimes just say optimization. The variety reads better and also opens up the page to ranking for a greater number of keyword phrases.

Resources

A few posts on writing page titles, page headings, page URLs, and other on-page content.

Summary

The most important aspect of building a search friendly site is to make sure search engines can find, crawl, and index your pages. You want to eliminate as many potential barriers as possible. Every time you do something that makes it harder for search engines to find and understand your content, you put up a roadblock. Put enough roadblocks on all the avenues leading to your business and, even if people want to get there, they won’t be able to.

You can also develop sites in ways that reinforce your keyword themes and help search engines understand what topics the site considers most important and should rank for. It doesn’t hurt that these things also help your visitors understand the site better too. A usable and accessible site is usually a search friendly site.

With that structure in place, you have some tips for creating the actual pages on your site. While once upon a time most SEO happened here, the impact of on-page content has been reduced over the years. There are a few key places you want to pay attention to, and the good news is that once you get a feel for it, on-page SEO becomes as much good habit as anything else.

In the next (and final) post, we’ll be looking into:

  • Link Building – What other sites say about your site by linking to it tells search engines a lot about why your pages should rank.
  • Analytics – Helps you determine what has and hasn’t been working so you can refocus your efforts and improve your site.

Stay tuned. Click here to read Part I of this article.
