Search Engine Optimization

Search Optimization Companies

As soon as your site is visible on the net, you will no doubt experience a deluge of e-mail from companies purporting to be Search Engine Optimization specialists. I urge you to consider a few things before giving your hard-earned cash to these companies.

The spammers have not been successful at optimizing their own sites for search engines. If they had, they wouldn’t use spam to drum up business; plenty of clients would have found them via their websites. If you do decide to employ a search engine optimization firm, I suggest you search for them with Google and choose from among the highest-ranked results.

Content

Content is still King, but other factors are significant. If you link to sites that are not relevant to your content, or if sites not relevant to your content link to yours, search engines consider your site to be low quality and reduce your ranking. If an irrelevant site links to yours, you can use Google Webmaster Tools to tell Google to ignore that link.

Since the goal of most major search engines is to provide the customer with the most relevant and desirable sites, make your site attractive to humans. Use complete, well-formed sentences and paragraphs. Make sure that images have proper alt tags describing them.

Organize your site so there is a reasonable number of links on each page. You don’t want too many links on one page, but you also don’t want to force users to go too deep to find the content they seek. As a rule of thumb, I try to keep menus to no more than 5-10 items, use only as many layers of depth as the content within that context requires, and keep the most important content as near to the entry page as possible.

Because few readers will read past the first hundred kilobytes of a document, search engines tend to rank keywords near the top of the document more highly. For this reason, it is a good idea to move JavaScript and stylesheets into external files. Another advantage of external stylesheets and JavaScript is that if the same code is used on more than one page, it only needs to be loaded once by the browser and is then served from cache for additional pages.
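
As a rough sketch, the head of a page might reference external files rather than embedding style and script blocks; the file names and page topic below are placeholders, not part of any particular site:

    <head>
      <title>Orchid Care Guide</title>
      <!-- External files keep the head short, so keyword-rich body content
           appears earlier in the document and the files can be cached
           and reused by other pages on the site -->
      <link rel="stylesheet" href="styles/main.css">
      <script src="scripts/menu.js" defer></script>
    </head>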

Search engines also tend to rank keywords found in the title, filename, and domain name more highly than keywords found elsewhere. Keywords set in larger fonts tend to be weighted more heavily, as do those found in <h1> through <h6> tags, with heavier weighting going to the lower numbers.
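
For example, on a hypothetical page about orchid care (the filename and headings are invented for illustration), the key phrase appears in the filename, the title, and the top-level heading:

    <!-- File: orchid-care.html -->
    <title>Orchid Care: Watering, Light, and Repotting</title>
    ...
    <h1>Orchid Care Basics</h1>
    <h2>How Often to Water</h2>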

Most search engines cannot index images, video, audio, or Flash content, although Google has some ability to index Flash and images. You should make sure to use descriptive alt tags with any images; this is important not only for search engines but also for visually impaired users.
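
A short example of a descriptive alt tag (the image path is a placeholder); both search engines and screen readers rely on this text:

    <img src="images/repotting-phalaenopsis.jpg"
         alt="Repotting a Phalaenopsis orchid into fresh bark medium">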

Mobile Friendly

Because approximately 40% of the traffic that comes to websites today originates from mobile devices such as smart phones, tablets, and phablets, Google now penalizes sites that are not mobile friendly when searches originate from a mobile device.

There are basically three approaches to mobile friendliness. The first is to create an entirely separate website for mobile devices, commonly with a ‘.mobi’ domain extension. The second is to dynamically serve pages with different markup depending upon the device. The third is to make the site responsive by using media queries to apply different CSS depending upon the device originating the request. Whichever approach you take, it is also important to set the viewport; otherwise small, high-resolution devices like iPhones with Retina displays will render text and buttons too small to read or operate.
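
A minimal sketch of the responsive approach: the viewport meta tag goes in the page head, and the media query goes in the stylesheet. The 600px breakpoint and the .sidebar class are arbitrary examples, not required values:

    <!-- In the page head -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    /* In the external stylesheet */
    .sidebar { float: right; width: 30%; }
    @media (max-width: 600px) {
      /* Stack the sidebar under the main content on narrow screens */
      .sidebar { float: none; width: 100%; }
    }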

Site Maps

You can help search engines find what is relevant to index by creating a sitemap, an XML file describing your site. For a hand-coded site there are numerous scripts available that you can run periodically from cron. For a WordPress site, WordPress automatically maintains RSS and Atom feeds at the URLs yoursite.com/feed/ and yoursite.com/feed/atom/ respectively, but you can also get plugins that create a standard sitemap.xml file, or that create several sitemap files based on different aspects of your website and tie them together with a sitemapindex.xml file. In Google Webmaster Tools, under the Crawl section, there is a Sitemaps subsection where you can submit sitemaps. Doing so will ensure that your content gets indexed promptly and more thoroughly than relying on a spider to come along and crawl your site.
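
A skeletal sitemap.xml, assuming a site with just two pages; the URLs and dates are placeholders, and a real file would list every page you want indexed:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://yoursite.com/</loc>
        <lastmod>2015-06-01</lastmod>
      </url>
      <url>
        <loc>http://yoursite.com/about/</loc>
        <lastmod>2015-05-20</lastmod>
      </url>
    </urlset>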

Links

Incoming links should be from relevant sites; otherwise search engines will penalize you for link spamming. Outbound links should be to relevant sites; otherwise search engines will consider your site low quality and penalize you. Avoid link farms like the plague: they might give you a temporary boost, but once discovered, search engines will severely penalize your site. The links you want are those that come naturally from good content.

Social Media

You can expand your site’s reach and engage more viewers by tying your site in with social media sites like Facebook, Twitter, LinkedIn, Pinterest, and others. There are WordPress plugins that allow both automatic posting to these sites and automatic inclusion of content from them.

Be Honest

Show search engines the same content you show everyone else. They will penalize you if they think you are trying to fool them into thinking your site is something it is not.

Maintain Your Site

Make sure you don’t have broken links or invalid HTML or CSS. We have an excellent tool on shellx which will crawl your website and find any broken links. It is called KLinkStatus; it is a graphical application, so you need to use a remote desktop to run it. It is located at Applications->Programming->KLinkStatus.

To check your site for syntactical correctness, I suggest you utilize the W3 validator at http://validator.w3.org/.

I suggest you code your site in HTML5. It is easier to code and more capable than earlier versions of HTML and XHTML. For now, I suggest avoiding new tags like <header>, <footer>, <section>, and <article> and sticking to generic <div> and <span> tags when they can provide the same functionality, even if the code is less readable, in order to maintain backward compatibility with older browsers and still pass HTML5 validation.
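
As a rough illustration, the generic markup below renders consistently even in older browsers that do not recognize the new structural tags; the class names are arbitrary:

    <!-- HTML5 structural tags -->
    <header>Site banner and navigation</header>
    <article>Main page content</article>

    <!-- Backward-compatible equivalent using generic tags -->
    <div class="header">Site banner and navigation</div>
    <div class="article">Main page content</div>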

The goal is to make your site technically correct, device independent, and attractive to as wide an audience as possible. If search engines determine that your site is high quality and likely to be viewable across a wide swath of browsers, they are likely to score it higher.

Meta Data

Some search engines ignore meta data entirely, others index nothing but meta data, and others correlate meta data with content and give you a better ranking if your meta data accurately describes your content. Use meta data, but use it wisely: make sure keywords in your meta data are found in the content and accurately reflect that content.
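
A hypothetical example for a page about orchid care; the description and keywords repeat terms that actually appear in the body text:

    <meta name="description"
          content="How to water, feed, and repot Phalaenopsis orchids indoors.">
    <meta name="keywords" content="orchid care, watering, repotting">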
