Search engine optimization (SEO) is commonly defined as "the process of improving the volume or quality of traffic to a web site or a web page (such as a blog) from search engines via 'natural' or unpaid ('organic' or 'algorithmic') search results, as opposed to other forms of search engine marketing (SEM), which may deal with paid inclusion."
SEO (search engine optimization) is not a new fad. As long as there have been search engines, someone has been trying to exploit their weaknesses. Keyword stuffing, cloaking, link exchanges, link wheels, and spamming have been around for years. SEO has since evolved into something much more. Why? Because Google has evolved into something much more. Those strategies worked when search engines didn't do much. But that was back when MTV had music videos and celebrities actually needed a publicist to communicate with the public.
If you haven't heard, Google has branched out. They now offer many applications, services, and platforms, such as: AdWords, AdWords Editor, Analytics, Chrome, Desktop, Earth, Gmail, Picasa, Secure Access, SketchUp, Talk, Buzz, Docs, Calendar, Wave, Reader, Latitude, Maps, Blogger, YouTube, FeedBurner, Friend Connect, Gadgets, Profiles, Notebook, Orkut, Panoramio, Picnik, Sites, Voice, Sidewiki, Android OS development, App Engine, Code, OpenSocial, Webmaster Tools, Go, Goggles, Ride Finder, and that's not even close to everything!
Google handled over 6 billion searches in February of 2010. That's 65.2% of the search engine market. Beyond basic search there is image, video, blog, real-time, and personalized search. Google sends a super-fast bot out to your business's website. The bot, or spider, crawls your website and records every page it finds. Each recorded page is then fed into an algorithm (which is said to have over 200 variables), and your site's pages are ranked accordingly.
We are often asked, "How are rankings determined?" and "Why does my site rank so poorly?" The answer is simple: back links, title tags, keyword saturation, and fresh content.
Back Links
Back links are the sole input to Google's PageRank link-analysis algorithm. Every single page indexed in Google is assigned a PageRank.
- Google rewards are greater for natural link building.
- Links from older-authority sites with high PageRank are best.
- Relevancy plays a huge role between linking sites.
- Using your keywords as the text of the link (the anchor text) is key, but vary it. You would not want to use 'Brainerd Real Estate' for every link because that is not natural, and Google may see it as spam. Naturally, people link to a site using the bare URL or city + keyword, and profiles and forum activity might yield anchor text like 'my site', 'my blog', the page title, 'mydomain.com', and 'www.mydomain.com'. Use variety in your anchor text while staying focused on your keywords.
- Utilize a variety of sites for back links.
- Learn a few basic commands to monitor competitor links.
Google uses links pointing to your site as a way of establishing authority. If Site A links to Site B then Google can measure that as site A vouching for site B, and they do measure it.
For Google's ranking algorithm to keep working well, Google needs to filter out spammers. It does this by rewarding sites that became successful with little to no SEO. Those sites had extremely talented and creative writers, great tools or games, amazing videos, or something else that made people want to link to them. Not everyone has the skills to program a sweet game, write award-winning content, or think of "the next big thing." If you are still having trouble grasping this concept, think of viral videos: they are what we call link bait. You might ask yourself, "How do I produce link bait for my business?" It's a tough question, we know, but there is a light at the end of the tunnel: replicate this natural link-building process yourself.
Other warnings should be issued before you start building links to your site:
- Do not build an obscene number of links too quickly.
- Do not use one keyword for every back link you build.
- Do not buy links on sites that sell a lot of links or sites selling links openly.
- Do not waste time on building links on sites with no commonality.
Links from older, authoritative sites with high PageRank are best. The best way to stay tuned in to PageRank and site age is to install an SEO toolbar on your browser of choice (for Firefox: SEOBook and SEO Quake; Internet Explorer: SEO Inc.; Chrome: SEO Buttons and the PageRank extension).
Relevancy plays a huge role between linking sites. Finding relevant sites that are older and have high PageRank will help your SEO efforts in a big way. PageRank is passed through links; think of it as link equity. Even better, find sites that already rank well for your desired keyword, then do whatever you can to get a link on those high-ranking sites. The closer the site is to your exact keyword, the better. However, finding other 'Atlanta Real Estate' sites that will give you a link is very difficult. This is why many successful SEO campaigns build micro-sites, blogs, videos, articles, and press releases around the keyword. Think of it as making your own sites for those links. That is how important back links are.
Utilize your keywords in anchor text. Doing this is a huge factor in organic rankings. Using your primary keywords as the link text on high authority sites which are related to your site will have the biggest impact. Finding these potential back link sites can be time consuming. There are tools to help but tools which promise to automate the process could leave you in a world of regret. Google can penalize, drop your rankings, and even de-index your site for doing automated link building.
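As a rough sketch of the idea, the links below all point at the same page but vary the anchor text; the domain and the keyword are placeholders, not real examples:

```html
<!-- Varied anchor text pointing at the same page (domain and keyword are placeholders) -->
<a href="http://www.mydomain.com/">Brainerd Real Estate</a>  <!-- exact keyword: use sparingly -->
<a href="http://www.mydomain.com/">www.mydomain.com</a>      <!-- bare URL: looks natural -->
<a href="http://www.mydomain.com/">my blog</a>               <!-- generic: typical of forum profiles -->
```

A back link profile mixing all three styles looks far more natural to Google than hundreds of links that all repeat the exact keyword.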
Utilize a variety of sites for back links. Real-time, blog, image, video, and organic search mean there are a ton of ways to make impressions. Build links on:
- Your business contacts’ websites – links from sites in the same geographic location can be very influential in rankings.
- Relevant blogs – comment on dofollow blogs.
- Top article directories – submit articles on your keyword to the top ten article directories.
- Relevant or high authority directories – this is a good way to get a link on high authority domains like DMOZ & Yahoo.
- Social media sites – your Facebook, Twitter, and YouTube links. Most of them will not pass link equity, but they will help bring visitors to your site and get it indexed faster. Most importantly, you gain a new platform to connect with your target market. Ideally, your target market would be building links to your site, so invest plenty of time and resources here.
- Web 2.0 sites – sites that let you do more than just retrieve information: you can upload, write blog posts, and talk with others in an online community. Each industry has different ones. For real estate it's sites like Zillow, Real Estate Webmasters, and Top Seller Sites; for architecture it's Flickr, Squidoo, HubPages, and Scribd.
When you find a highly relevant, high-ranking, local site that would be perfect for you and your site, do not, I repeat, DO NOT just email the webmaster and say, "Can I have a link on your site?" Be specific. Be respectful: tell them why you think their site is a perfect match, how you have been a big fan of their site, and where exactly you want the link to appear. You will have much more success being as specific as possible.
On-Page SEO
- Use good title tags.
- Do not use the same title tag more than once.
- Use keywords in the URL (or web address).
- Keywords should be used in heading text (h1, h2, h3, h4).
- Keywords should be used in alternate text for images.
- Keywords should appear in the body of the page.
- Remove www or non-www version of your site.
- Use a robots.txt file.
- Set up a sitemap.
Do not overdo it with keywords. If the text does not read naturally, you have used too many. Meta descriptions can sometimes appear in the search results as your page's snippet. Google no longer uses the meta keywords tag, so do not spend a ton of time on it.
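To tie the checklist above together, here is a minimal sketch of a page with the on-page basics in place; the business name, file names, and copy are all placeholders:

```html
<html>
<head>
  <!-- Unique, descriptive title tag: never reuse this on another page -->
  <title>Brainerd Real Estate | Lakefront Homes for Sale</title>
  <!-- Meta description: may be shown as the snippet in search results -->
  <meta name="description" content="Search lakefront homes and cabins for sale in Brainerd, MN.">
</head>
<body>
  <h1>Brainerd Real Estate</h1>  <!-- keyword in the main heading -->
  <!-- keyword in the image's alternate text -->
  <img src="lakefront-home.jpg" alt="Lakefront home for sale in Brainerd">
  <!-- keyword used naturally in the body copy -->
  <p>Browse current Brainerd real estate listings, updated daily.</p>
</body>
</html>
```

Note how the keyword appears once in each element rather than being stuffed into all of them repeatedly.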
Duplicate title tags could mean fewer of your pages get indexed in Google. The spider can be turned off by duplicate title tags and may assume the pages are the same even if they only share a title.
Remove one version of your site. Can you view your site at both www.yoursitename.com and yoursitename.com? If so, Google may see two versions of your website. That means more duplicate content, more pages sharing the same amount of links, and some business owners build links to both versions thinking they are the same; they are not. Set up a 301 redirect from the less popular version to the preferred one. Most of the link equity you have built up will be transferred through the 301 redirect. You can set up the redirect fairly easily by adding a few lines to the .htaccess file on your web server.
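As a sketch, assuming an Apache server with mod_rewrite enabled, those few .htaccess lines might look like this (the domain is a placeholder; swap the condition and target around if you prefer the non-www version):

```apache
# 301-redirect the non-www version to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yoursitename\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursitename.com/$1 [R=301,L]
```

The `R=301` flag marks the redirect as permanent, which is what tells Google to pass the link equity to the surviving version.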
Robots.txt files can do a lot for your SEO campaign. Also known as the Robots Exclusion Standard, robots.txt can block bots, IP addresses, and spiders from visiting certain areas of your site. Think of a CMS like WordPress: you would not want Googlebot visiting your admin panel or plugins directory. You can even use a robots.txt generator to take some of the legwork out of it. Ultimately, you help Google figure out which pages matter most on your site by blocking Googlebot from the ones you do not want showing up in the index. This is often overlooked by web designers. Check whether your site has one by typing www.yoursitename.com/robots.txt into your web browser.
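Using the WordPress example above, a minimal robots.txt sketch could look like this (the paths are the WordPress defaults; the sitemap URL is a placeholder):

```text
# robots.txt - keep spiders out of the admin and plugin areas
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Sitemap: http://www.yoursitename.com/sitemap.xml
```

The file lives in your site's root directory, which is why the browser check above works.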
Set up a sitemap for your website. It is very easy these days; sitemap generators will do it for you. If you have a large site, Google has sitemap resources to help you generate sitemaps for sites with 50,000 pages or more. Sitemaps are another good way of telling Google which pages should have the highest priority.
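For a small site, the XML sitemap a generator produces is simple enough to write by hand. A minimal sketch, with placeholder URLs and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursitename.com/</loc>
    <lastmod>2010-03-01</lastmod>
    <priority>1.0</priority>  <!-- homepage gets top priority -->
  </url>
  <url>
    <loc>http://www.yoursitename.com/about/</loc>
    <priority>0.5</priority>  <!-- secondary page, lower priority -->
  </url>
</urlset>
```

The optional `priority` values are how the sitemap signals which pages you consider most important.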
Reports, Analytics & Keyword Research
Step #1: Utilize Google’s Resources to Your Advantage
Two must-have tools for every serious business owner are Google Webmaster Tools and Google Analytics. Use Google Analytics to track your visitors on a regular basis: check your top exit pages, bounce rate, visitors' browsers, which keywords are sending you traffic, and which keywords are converting into leads or sales. Webmaster Tools is a great way to see whether the Google spider is having any issues with your site. You can also get a snapshot of your back links, see current keyword rankings, point Google to a sitemap, set a preferred version of your site between www and non-www (though this does not set up a 301 redirect and is not as good a solution), and set your site's geographic location by country.
Step #2: Utilize Yahoo’s Openness to Your Advantage
Yahoo shows us a complete portrait of the back links to a site. All you have to do is type these into Yahoo search:
link:yoursite.com
site:yoursite.com
Keep an eye on the total number of links to competitor sites. Take all the links to the top ten competitors, drop them into a spreadsheet, and see which sites share the same back link sources. Go right out and get those! You'll be ranking with them in no time. If you find a competitor who is suddenly getting all kinds of links, see where they are coming from and find out how you can do whatever they are doing, only better.
How to Select the Keywords for Your Master Keyword List
Do your keyword due diligence; it is that important! You would not believe how often we refer to a master keyword list.
Use WordTracker, Keyword Corral, Keyword Elite, and AdWords to research keyword volume and programs like Market Samurai will show you competition levels. AdWords has a couple of keyword tools, Traffic Estimator and Suggestion Tool, which will help you find the right keywords too. If you have a blog or are writing articles, check out the WordTracker Keyword Questions Research Tool.
- Select keywords with low competition.
- Choose keywords with consistent traffic.
Competition levels can be gauged by the total number of sites that show up when searching for a keyword. If 100,000,000 sites show up, it might be too competitive. Also check the number of back links and indexed pages the top ten ranking sites have by typing "site:yourcompetitorsdomain.com" and "link:yourcompetitorsdomain.com" into Yahoo.
Most business owners want to target more than one keyword. This can be done in many ways; the most popular is to add new content on a blog. If you have not set up a blog yet, what are you waiting for? WordPress is completely free and takes five minutes to install. There are other options as well, such as Joomla, Blogger.com, and WordPress.com.
Although it has been debated in the past, SEO is here to stay, and done properly it could take your business into the big time. Follow the guidelines closely; if you don't, you will not succeed. Be prepared to make a commitment; if you don't, you will not succeed. Google has published a starter guide, and if you follow it you will rank well across Google's products. By taking a smart, scientific approach you will reap rewards beyond comprehension.
These figures are only estimates, but they give us some idea of the monetary value of top rankings on certain keywords: running ads on these keywords 24/7 would cost an estimated $313.01-$542.06.
If your site ranked in the organic results for those very same keywords, it would be free, and just as many people would see your site, if not more. Once again, it would not cost you a penny! The people searching for these keywords are exactly who you want to reach, and they would cost far more to reach through offline media.
Organic rankings are extremely valuable and still very obtainable. Get started before your competitors have a chance to become authority sites. Hurry up and get going: build some links, get a blog, and write lots and lots of content!