Thursday, October 7, 2010

SEO 101 techniques

Believe it or not, basic SEO is all about common sense and simplicity. The purpose of search engine optimization is to make a website as search engine friendly as possible. It's really not that difficult. SEO 101 doesn't require specialized knowledge of algorithms, programming or taxonomy but it does require a basic understanding of how search engines work.

For brevity, this piece starts with a few assumptions. The first assumption is that a single, small business site is being worked on. The second is that the site in question is written in a fairly standard mark-up language such as HTML (perhaps generated server-side by a language such as PHP). The last is that some form of keyword research has already taken place and the webmaster is confident in the selection of keyword targets.

There are two aspects of search engines to consider before jumping in. The first is how spiders work. The second is how search engines figure out what pages relate to which keywords and phrases.

In the simplest terms, search engines collect data about a website by sending an electronic spider to visit the site and copy its content, which is stored in the search engine's database. Generally known as 'bots', these spiders are designed to follow links from one page to the next. As they copy and assimilate content from one page, they record links and send other bots to make copies of content on the linked pages. This process continues ad infinitum. By sending out spiders and collecting information 24/7, the major search engines have built databases measured in the tens of billions of pages.

Knowing how spiders read information on a site is the technical end of basic SEO. Spiders are designed to read site content the way you and I read a newspaper. Starting in the top left-hand corner, a spider reads site content line by line from left to right. If columns are used (as they are on most sites), spiders follow the left-hand column to its conclusion before moving to the central and right-hand columns. If a spider encounters a link it can follow, it records that link and sends another bot to copy and record the data found on the destination page. The spider proceeds through the site until it has recorded everything it can possibly find there.

As spiders follow links and record everything in their paths, one can safely assume that if a link to a site exists, a spider will find that site. There is no need to manually or electronically submit your site to the major search engines. The search spiders are perfectly capable of finding it on their own, provided a link to your site exists somewhere on the web. Search engines have an uncanny ability to judge the topic or theme of the pages they examine, and they use that ability to judge the topical relationship of pages that are linked together. The most valuable incoming links come from sites that share topical themes.

Once a search spider finds your site, helping it get around is the first priority. One of the most important basic SEO tips is to provide clear paths for spiders to follow from point A to point Z in your website. This is easily accomplished by placing easy-to-follow text links to the most important pages in the navigation menu, or simply at the bottom of each page. One of these text links should lead to a text-based sitemap, which lists and links to every page in the site. The sitemap can be the most basic page in the site, as its purpose is more to direct spiders than to help lost visitors, though designers should keep site visitors in mind when creating it. Google also accepts more advanced, XML-based sitemaps, which you can read about in its Webmaster Help Center.
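As a sketch, a plain footer navigation block of the kind described above might look like the following (the page names and file paths are hypothetical examples, not a prescription):

```html
<!-- Plain text links at the bottom of each page give spiders a clear path
     to the site's most important pages. Page names are illustrative only. -->
<div id="footer-nav">
  <a href="index.html">Home</a> |
  <a href="products.html">Products</a> |
  <a href="about.html">About Us</a> |
  <a href="contact.html">Contact</a> |
  <a href="sitemap.html">Site Map</a>
</div>
```

Because these are ordinary text links, every spider can read and follow them, and the sitemap link gives bots a single page from which every other page is reachable.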

There will be cases where allowing spiders free access to every page on a site is not desirable. Therefore, you'll need to know how to tell spiders that some site content is off limits and should not be added to their databases, which is done with a "robots.txt" file. (To learn more about setting up your robots.txt file, start with Jennifer Laycock's article on robots.txt basics.)
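As a minimal sketch, a robots.txt file placed in the site's root directory might look like this (the directory names here are hypothetical; substitute whatever areas of your own site should stay private):

```
# Rules apply to all spiders
User-agent: *
# Keep these directories out of search engine databases
Disallow: /admin/
Disallow: /cgi-bin/
```

Note that robots.txt is a request, not a lock: well-behaved spiders from the major search engines honor it, but it does not actually secure the content.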

Offering spiders access to the areas of the site you want them to access is half the battle. The other half is found in the site content. Search engines are supposed to provide their users with lists of pages that relate to the search terms people enter in the search box. Search engines need to determine which of billions of pages are relevant to a small number of specific words. To do this, the search engine needs to know how your site relates to those words.

To begin with, there are a few elements a search engine looks at when examining a page. After the URL of a site, a search spider records the site title. It also examines the description meta tag. Both of these elements are found in the "head" section of the source code.

Titles should be written using the strongest keyword targets as the foundation. Some titles are written using two or three basic two-keyword phrases. A key to writing a good title is to remember that human readers will see the title as the reference link on the search engine results page. Don't overload your title with keyword phrases. Concentrate on the strongest keywords that best describe the topic of the page content.

The description meta tag is also fairly important. Search engines tend to use it to gather information on the topic or theme of the page. A well written description is phrased in two or three complete sentences with the strongest keyword phrases woven into each sentence. As with the title tag, some search engines will display the description on the search results pages, generally using it in whole or in part to provide the text that appears under the reference link.
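Tying the title and description advice together, the "head" section of a page selling construction widgets might look something like the sketch below (the company name, phrasing and keyword choices are illustrative only):

```html
<head>
  <!-- Title built on the strongest keyword phrases; this is also the
       reference link human searchers see on the results page -->
  <title>Blue Widgets - Strong Construction Widgets | Smith and Co.</title>

  <!-- Description written as complete sentences with keyword phrases
       woven in; engines may display it under the reference link -->
  <meta name="description" content="Smith and Co. manufactures blue widgets,
    the strongest construction widgets available. Builders and contractors
    trust our blue widgets for demanding construction projects.">
</head>
```

Both elements stay short and readable: the title leads with the page's strongest phrases rather than a keyword list, and the description reads naturally while still containing them.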

Due to abuse by webmasters, such as stuffing it with irrelevant terms, search engines place little (if any) weight on the keywords meta tag. As such, it is not necessary to spend a lot of time worrying about the keywords tag.

After reading information found in the "head" section of the source code, spiders continue on to examine site content. It is wise to remember that spiders read the same way we do, left to right and following columns.

Good content is the most important aspect of search engine optimization. The easiest and most basic SEO rule is that search engine spiders can be relied upon to read basic body text 100% of the time. By providing a search engine spider with basic text content, you offer the engines information in the easiest format for them to read. While some search engines can strip text and link content from Flash files, nothing beats basic body text when it comes to providing information to the spiders. You can almost always find a way to work basic body text into a site without compromising the designer's intended look, feel and functionality.

The content itself should be thematically focused. In other words, keep it simple. Pages that cover multiple topics are confusing for spiders. The basic SEO rule here is: if you need to express more than one topic, you need more pages. Fortunately, creating new pages with unique, topic-focused content is one of the most basic SEO techniques, and it makes a site simpler for both live users and electronic spiders.

When writing page content, try to use the strongest keyword targets early in the copy. For example, a site selling "Blue Widgets" might use the following as a lead sentence:

"Blue Widgets by Smith and Co. are the strongest construction widgets available and are trusted by leading builders and contractors."

The primary target is obviously construction applications for the blue widget. By placing the keyword phrases "blue widgets" and "construction widgets" alongside supporting words such as "strongest", "trusted", "builders" and "contractors", the sentence is crafted to help the search engine see a relationship between these words. Subsequent sentences would also have keywords and phrases woven into them. One thing to keep in mind when writing page copy is that unnecessary repetition of keywords (keyword stuffing) is often considered spam by search engines. Another is that, ultimately, the written copy is meant to be read by human eyes as well as search spiders. Read your copy out loud. Does it make sense and sound natural? If not, you've overdone the keyword phrases and need to make adjustments.

Another important element a spider examines when reading the site (and later relating the content to user queries), is the anchor text used in internal links. Using relevant keyword phrases in the anchor text is a basic SEO technique aimed at solidifying the search engine's perception of the relationship between pages and the words used in the link. For example... we also have a popular series of articles on the basics of SEO written by Stoney deGeyter. Linking the term "basics of SEO" is an example of using keyword phrases in the anchor text. Terms such as "SEO 101" or "SEO for beginners" could also have been used.
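In HTML, the difference between weak and keyword-rich anchor text is easy to see (the URL below is a placeholder for illustration):

```html
<!-- Weak: the anchor text tells the spider nothing about the target page -->
<a href="seo-basics.html">click here</a>

<!-- Better: the anchor text describes the linked page's topic -->
<a href="seo-basics.html">basics of SEO</a>
```

In the second link, the words inside the anchor reinforce the search engine's perception that the destination page is about SEO fundamentals.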

Remember, the foundation of successfully optimizing your site is simplicity. The goal is to make a site easy to find, easy to follow, and easy to read for search spiders and live-visitors, with well written topical content and relevant incoming links. While basic SEO can be time consuming in the early stages, the results are worth the effort and set the stage for more advanced future work.

Don't miss SEO 101: Everything You Need to Know About SEO (But Were Afraid to Ask) by Stoney deGeyter. This popular series will equip you with all you need to take the first steps toward making your site rank higher in the search engines.

How to Build Traffic and Backlinks with Content Marketing

If you have been anywhere near the Internet marketing space for any length of time, you have most certainly heard the old adage “content is king.”
Never has that been more true than today. If you plan on ranking in Google, you can no longer just slap up a couple of links on low-PR article and blog sites and call it a day.

This new era of SEO and link building requires you to build compelling content that people will naturally link to, link building as it is meant to be.
But creating compelling content that will bring in loads of links is much easier said than done. That raises the question: how can you use content strategy as the focal point of your link building efforts?

That’s one of the main points Vertical Measures President Arnie Kuenn answered in Thursday’s VM webinar entitled “How to Build Traffic & Backlinks with Content Marketing,” which you can view in its entirety on our free SEO webinars page.
Companies are just now starting to understand that information is one of their most important offerings. In this Internet Age, the world is no longer just about products and services.

Customers rely on companies to be a trusted source of information, so it's important for companies to provide relevant content that solves the tough problems their customers face and to position themselves as trusted solutions providers in their industry.
In the webinar, Arnie explained that storytelling gives you an advantage: if you can tell a story in a meaningful way, people will listen, read and ultimately link. This could involve hiring journalists to tell the story of your company or your product in a way that resonates with readers.

The crux of a content strategy boils down to this: you must create great content. Earth shattering, right? That involves building content around your keywords that adds real value to the reader by being unique, informative and entertaining.
For example, an REO/foreclosures corporate blog recently posted a piece of content on 13 homes that can’t foreclose, in which they picked out ridiculous “housing” choices such as a Monopoly house and a Gingerbread house. A piece like this incorporates the keyword (foreclosed homes) around a creative theme that can naturally attract links.

Such linkworthy content, as we like to call it here at VM, can come in a number of different forms. Arnie listed 13 popular ones:

1. A Blog
2. White Paper Series
3. eBooks
4. Case Studies
5. Interviews
6. Infographics
7. Online Quizzes
8. Contests
9. eNewsletters
10. Community Forums
11. Podcasts
12. Videos
13. Webcasts/Webinars

Finally, one of the biggest keys in creating interesting content that people will link to is to make it relevant. If you write the most interesting piece of content in the world on something people don’t care about, nobody will link to it. It’s the sad truth of the matter.
However, if you research what's trending, you can tailor that hot topic or event to one of your keywords. For example, you could do a post on the 10 most expensive car crashes in history, with the last one being Tiger Woods' crash, as one company in this space recently did. This example takes a national story and relates it to a client's industry, even though that industry normally has nothing to do with Tiger Woods and golf.
When looking for what’s current and hot when brainstorming for content, you should:
- Check Google Trends & Google Insights for Search
- Check trending topics on Twitter, Yahoo and MSN
- Look at answer sites like Yahoo Answers to figure out what people are asking about in your industry
- Check sites like Digg, StumbleUpon, Delicious, Tipd and Mixx.
Finally, Arnie discussed five steps in the content development process, which are as follows:
1. Brainstorm Concepts
2. Choose Most Powerful Idea
3. Wireframe Content
4. Create CSS/HTML/Code
5. Prepare Seeding and Launch Plan