SEO OPERATIONS MANUAL Part 4: RELATIONSHIP




Our blog posting efforts cover these areas as appropriate:

1. Posts and submissions to the site's own domain-based blog

2. Posts and submissions to 3rd party blogs (blogger.com, blogspot)

3. Posts and submissions to the satellite portal sites

commenthut.com can be used to find appropriate dofollow blogs for posting to.

            
Article Content Submission
Keyword based articles are written and submitted to one or more of the following:
EzineArticles.com
SubmitYourArticle.com
Suite101.com
Helium.com
ArticleDashboard.com
ArticleBase.com
GoArticles.com
SearchWarp.com
Buzzle.com
Isnare.com
AmericanChronicle.com
ArticleCity.com
IdeaMarketers.com
Site-Reference.com
ArticleAlley.com
TheWhir.com
ArticlesFactory.com
Amazines.com
ArticleSnatch.com          
For article content and ideas, these sources are an option:
need-an-article.net
Unique Article Wizard (for spinning)

Press Releases
We write and submit press releases about the site's product or service.
 Press releases are submitted to:
free-press-release.com
i-newswire.com
pressreleasespider.com
pr-log.org

Video Submission
We can create a desktop video (Camtasia-style) and/or work with the client's existing presentation and submit to one or more of the following:
YouTube.com
Vimeo.com
Revver.com
Viddler.com
realpeoplerealstuff.com
TubeMogul.com


PowerPoint Submission
We can create a PowerPoint presentation (PPT file), store it on the main site for indexing purposes, and submit it to slideshare.com.

Photo Files
Client's photos are submitted to flickr.com. The client's photo gallery can be optimized by making the profile keyword rich. "Sets" or subcategories can be created that are keyword appropriate. Photo file names, tags, and descriptions must be keyword rich. Descriptions and title tags can also be edited accordingly. Descriptions can contain HTML code, so links to the main site can be included (see the sketch below).
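As a simple illustration (the keyword phrase, page name and domain here are placeholders, not client data), a photo description with a link back to the main site might read:

Spring obedience class at our studio - more <b>dog training tips</b> at <a href="http://www.yourdomain.com/dog-training-tips.html">www.yourdomain.com</a>.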


Yahoo Answers
This method requires us to create a profile and interact on Yahoo Answers by providing helpful content AND by asking questions of others. Yahoo will ban users who only use this service to create back links.  It must not be abused by overuse, and back links should NOT be included in every post.  Users with numerous posts WITHOUT back links are more credible and have more opportunity to mention sites and product links.


Wikipedia - A Non-Guaranteed SEO Technique
Companies that are mentioned or added to Wikipedia gain great credibility, but Wikipedia entries cannot be made by the site owner or anyone related to the site.  Also, external linking to one's own site is a violation.  Earning a Wikipedia entry takes time and patience and must be done by someone with no self-serving interest in the entry, but it can be worth it if accomplished correctly.
EDU links - A Non-Guaranteed SEO Technique
These are quite difficult to get, but requesting a backlink from the owner of an .edu site can be very valuable. It is a matter of making as many requests as appropriate.


Social Networks
Generally, we create profiles, fan pages, and social media accounts for clients. The types of social networks chosen vary according to the client's niche, needs, and existing accounts.  For most, we will at least start with these three major networks, but this technique is not limited to them.
                facebook.com
                twitter.com
                linkedin.com



STILL TO RESEARCH:
[Directory submissions
                - industry/authority]

Shopping Cart Feed Submission
For clients that use an ecommerce shopping cart, the XML product feed can be submitted to Froogle (owned by Google) for indexing individual product listings.


General Search Engine Submission
New websites/domains should be resubmitted to Google and Bing/Yahoo about every 90 days.
It's not a good idea to do it any sooner, but doing it around every 90 days makes sure that the sites stay up-to-date with their spiders.


RSS Feeds Submission
We submit all of our blogs' XML feeds to:
feedping.com
feedagg.com
feedbase.com

Social Book Marks
We submit to social bookmarks listed on socialmarker.com and use socialmarking.com to grab JavaScript code for viral bookmarking.  The code can be added to the main site and/or blogs and portals.
               
Onlywire.com can be used for multiple submission of content.
Major bookmarking sites include:
digg.com
scribd.com
hubpages.com
squidoo.com
stumbleupon.com
del.icio.us


STILL TO RESEARCH: Forum Marketing

Podcast Production and Submission
We can create a unique podcast and submit it to appropriate networks.  The production can be a short and simple podcast or interview that provides helpful content in an interesting, informative and entertaining way. These are the post-production steps:
1. Name the podcast show with a title rich in relevant keywords.
2. Make sure your MP3 files have good ID3 tags that are keyword rich. ID3v2 supports comment and URL fields. The major search engines may not pick up the ID3 tags now, but may in the future.  Some specialty engines and software tools already do.
3. The MP3 file can be stored anywhere - we use the Amazon S3 storage facility for faster streaming.
4. Write a synopsis for the show in text and post it on all appropriate blogs and social media. Put the most important keywords as high up in the postings as possible while keeping it readable and interesting.
5. Transcribe the podcast (or at least excerpts if it is too long) for use as search engine indexing. Break the transcript up into sections. Make sure each section is on a separate web page and each separate web page has a great keyword-rich title relating to that segment of the podcast. Also, link to the podcast MP3 from those web pages.
6. Create an RSS feed for the podcast, either manually or by using podcastblaster.com to generate the feed code (a minimal feed is sketched after the directory list below).
7.  Submit the podcast feed to:
rsspodcastdirectory.com
mefeedia.com
hardpodcafe.com
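
For reference, a minimal RSS 2.0 feed with a single audio enclosure follows the pattern below. The show name, S3 bucket URL, file size and date are placeholders, and iTunes-specific tags can be layered on top per Apple's podcast spec.

<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Keyword Rich Show Name</title>
    <link>http://www.yourdomain.com/podcast/</link>
    <description>Short keyword-rich description of the show.</description>
    <item>
      <title>Episode 1: Keyword Rich Episode Title</title>
      <description>Keyword-rich synopsis of this episode.</description>
      <enclosure url="http://yourbucket.s3.amazonaws.com/episode1.mp3" length="12345678" type="audio/mpeg" />
      <guid>http://yourbucket.s3.amazonaws.com/episode1.mp3</guid>
      <pubDate>Sat, 12 Nov 2011 00:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>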

To publish the Podcasts on iTunes:

    1. Open the iTunes application on your computer.
    2. Pull down the "Advanced" menu and select "Subscribe to Podcast...."
    3. Paste the URL for the RSS file in the dialogue box.
    4. iTunes should begin downloading the most recent episodes of your podcast.
    5. Open the latest episode of your podcast once it becomes available by double-clicking on its title.
    6. If you can hear your podcast, the RSS is fine, and you can continue. Otherwise, you'll need to check for mistakes in your RSS file. Once the feed checks out, submit it to the iTunes Store:
  1. Go to the iTunes Store in iTunes.
  2. Follow the link for "Podcasts."
  3. At the very bottom, on the left side of the page, follow the link for "Submit a Podcast."
  4. Enter the URL for your feed and click "Continue."
  5. Complete the requested information and follow any instructions to complete your submission.
  6. Wait several days, then search for your podcast in iTunes. Eventually, it should also show up in the iTunes category you provided.
  7. iTunes has further instructions on updating and troubleshooting your feed: http://www.apple.com/itunes/store/podcaststechspecs.html

To publish the Podcast on the Client's Website:
  1. Validate your RSS file. We can do this for free at a number of websites - simply enter the URL for your RSS file at one of these:
    1. RSS.scripting.com - http://rss.scripting.com/
    2. FEED Validator: http://feedvalidator.org/
    3. RSS Advisory Board: http://www.rssboard.org/rss-validator
  2. Create a link to the RSS file's URL on the website.

To publish the podcast on blogger.com
  1. Make sure that your blog has enclosure links enabled.
    (See: http://help.blogger.com/bin/answer.py?answer=80259&topic=12466)
    1. Go to "Formatting" in your blog's "Settings" tab.
    2. Choose "Yes" from the pull-down menu next to "Show Link fields."
    3. Click the "Save Settings" button at the bottom of the page.
  2. Create a new post for the podcast episode.
  3. Give the post a Keyword Rich title.
  4. Below the "Link" field, click on "Add enclosure link."
  5. Enter the URL for the actual audio file.
  6. Enter keyword rich text to accompany the audio podcast in the body of the post.
  7. Click on "Publish Post" at the bottom of the page to publish the podcast.


Other Content Publishing
Squidoo.com
A Squidoo lens is created with strategic keyword rich content and an appropriate link to the main site.
Zimbio.com


SEO OPERATIONS MANUAL Part 3: REFERENCE



Dynamic Satellite Portals

We create sites that act as third party references to the main site we want to help promote. These are NOT to be confused with "doorway" sites - which Google not only punishes, but bans from search results. These sites are set up with very strict parameters so as to not be confused with black hat SEO "tricks".

First, each portal is hosted on a keyword phrase domain (.net or .org) or a keyword rich .com domain, depending on what the keyword research results provide us with.

Each site is set up as a CMS (Content Management System) site - usually on the WordPress platform.
Each domain is hosted on a separate C-class IP, different from the one hosting the main site.  Each domain registration record is also privatized, or at least different from the main site's.  The concept is to create portals that are legally unrelated to the main site.
Next, each portal is optimized using:
1. Permalinks - This feature creates keyword rich URLs based on the category and title of each post.
2. Post Titles and Categories are based on major keywords
3. WP SEO plug-ins are installed, including: All in One SEO Pack, Contextual Related Posts and SEO Smart Links.
4. User generated comments are created using keywords
5. Social content is added (YouTube videos embedded)
6. Continually updated content.
7. Outbound content seems to help ranking to some degree. This can include, but is not limited to: Adsense code, Amazon Associates ads (when relevant to content) and external dynamic content feeds (wpdirect.com)
8. Embedded YouTube videos with content relevant tags.

It is crucial to create these portals correctly; done incorrectly, they can be detrimental to an entire campaign.  The content must be real, non-duplicated posts.  The outbound links must be legitimate and relevant.  Overuse of keywords is dangerous.  Back links to the main site must be added only where appropriate and usually only in one or two places on the site.
The primary purpose of these portals is not to produce back links to the main site (although they will produce a few). The purpose is for the portal to have its OWN high ranking position on Google so that it will get traffic and create a following that can be communicated with.
Once a fair amount of traffic is achieved, an opt-in subscriber form - with an incentive for the reader to subscribe - can be added to the portal.
Each portal should be looked at as a website and campaign all its own and not a site solely created to promote one main site.  Therefore the content must be real and constantly updated.
Over promotion of the main site will cause suspicions of spamming.

Squeeze Page (Lead Capture) - Email Marketing
Squeeze pages can be set up on a secondary domain - apart from the main site. This website has the sole purpose of capturing the visitor's name and email and placing them into a double opt-in (subscriber-confirmed) mailing list.  The page incentivizes the reader with a free report, free video or other free digital information product in an effort to gain trust and their subscription.
The follow up auto-responders associated with this capture system are crucial and are written to accomplish at least two major goals:
1. To provide the reader with excellent content to earn their favor and trust. Part of that content  can be references to the "blog" portals and their content.
2. To promote the main site and/or its products and services
There is no limit to the number of squeeze pages that can be created and promoted, each with its own subscriber incentive model.
It's a good idea to create at least two squeeze pages for the purposes of A/B split-testing.
Capture form offers can be incentivized using re-written and edited private label content and the design of an original or unique e-cover representing the product to be offered or given away.

Private Social Network
For some larger clients, the creation of a branded private social network can be developed using socialengine.com software or similar service. If the potential for a following is large enough and interest can be generated, the social site itself could be a considerable boost to the credibility and rank of a company's main site.


SEO OPERATIONS MANUAL Part 2: REPAIR


Overall Site Assessment

Search engines seem to care less about images, layout and site aesthetics. But the problem is, we also want to appeal to the humans who actually visit these sites - not just Google's algorithm and robots.
So a general assessment of the site needs to include the basic and hopefully obvious tactics of good navigation, clear content, attractive layout and an up-to-date look that is appropriate for the niche. Almost all of this is subjective opinion, and much of it should be provided by the client for that reason.

However, the structure of the site will matter to search engines, and that structure will affect site design. For example, sites need to be CSS based and have few or no tables within their structure.  This affects ranking AND visual layout. Further, page load speed is a huge factor in search engine ranking and also a major consideration for the designer. Sites designed with cascading style sheets will fare better in this area.
The bottom line is that older, passé sites need to be updated to match current trends in structure and layout.  Some of these decisions are dictated by the logistics of the code; others are matters of judgment.

Fast Indexing (for new sites/domains)
New sites can be indexed in approximately 24 hours by submitting the domain to statistic monitoring sites.  Also, a submission to Digg.com can start the process rather quickly.  Submitting to as many as possible is the safest way to ensure indexing.
These sites can simply be visited with the domain in question appended to the URL.  This can generate enough data to create the equivalent of a back link.  Since the monitoring sites are indexed more often, the sites submitted to them are as well.
For  example, visit:  http://www.statbrain.com/www.yourdomain.com (replace with actual domain) to generate this data.
Here is a list of such services and the proper syntax:
http://www.statbrain.com/www.yourdomain.com
http://www.websiteoutlook.com/www.yourdomain.com



http://www.builtwith.com/?yourdomain.com
http://snapshot.compete.com/yourdomain.com
http://searchanalytics.compete.com/site_referrals/yourdomain.com
http://aboutus.org/yourdomain.com
http://quantcast.com/yourdomain.com
http://cubestat.com/www.yourdomain.com
http://alexa.com/siteinfo/yourdomain.com
http://alexa.com/data/details/?url=yourdomain.com
http://siteadvisor.cn/sites/yourdomain.com/summary
http://aboutdomain.org/backlinks/yourdomain.com


               
Site Code Checklist
The following is a checklist of most attributes that need to be checked for repair, update, addition to, or deletion from the site's code.
Title Tags
Search engines rely on spiders to crawl websites and index pages appropriately.  When a spider (crawler) lands on your website, the first things it takes notice of are your domain name and your website's title tag.
A title tag should include a description of what your website is about rather than just your website's URL, and you should always incorporate your primary keyword phrase into the title tag of each webpage you own.
You also want to make sure that you use a different title tag on every single web page; that way you are able to rank for different terms rather than just one. The title tag appears in the top bar of your web browser. Based on the keywords in this tag, search engines list the site as relevant to a search made in their engines. It is one of the most important aspects of getting rankings in search engines. Since this title also appears in search engines' result pages, appropriate use of keywords in promotional language may result in more click-throughs.
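As a quick illustration (the keyword phrase is a placeholder borrowed from examples later in this manual), a keyword-based title tag looks like this:
<head>
<title>Dog Training Tips | Puppy Obedience Training Guide</title>
</head>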

H1 thru H6 Heading Tags
These tags are important to Google in the priority of their number. In other words, the H1 tag is looked at by Google, after the title tag, as an important attribute for search results.  The H2 tag is looked at next, then H3, etc.  Obviously, these tags need to be keyword based or keyword rich.  Not all 6 are needed, but a minimum of an H1 is strongly suggested.
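For example (keyword phrases again illustrative), the heading hierarchy might look like:
<h1>Dog Training Tips</h1>
<h2>House Training a New Puppy</h2>
<h3>Crate Training Basics</h3>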

Bold/Strong Tags
Google looks for these tags to find emphasized phrases on a site and see how they relate to the content.  With this attribute, overuse could be detrimental, but 1 or 2 appropriate strong or bold tags are recommended.
Basically, we never bold an entire page or paragraph. Whether using b or strong, it's important to bold ONLY keywords or links with anchor text.
Since everything on the web is related to semantics, to increase the chances of ranking better we should use <strong> rather than <b>; the rendering in all browsers will be the same for both tags, but search engines may give more importance to <strong> over <b>.
Some say an italicized keyword has some positive effect, but not as much as bold/strong, and therefore it should also be used wisely (i.e. not overused).
Bold Tags = <b>   </b>
Strong Tags = <strong>  </strong>

Image File Names
A simple and effective fix is to change image names to those that include or are exclusively keywords.  An image called header.jpg can easily be renamed keyword_header.jpg or even keyword.jpg when appropriate.  (As long as every reference to the original file name is also changed to match.)
The alt tag for each image should contain a short description that is keyword rich as well.  Missing alt attributes are considered an error by W3C validation tests.
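For instance (the file name and description are illustrative placeholders):
<img src="dog_training_header.jpg" alt="Dog training tips for new puppy owners" width="960" height="150" />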

Hyperlink/ Anchor Tags
Hyperlinks provide another keyword placement opportunity.  Of course, the linked file must have a matching file name.  Again, these need to be sensible so as not to appear as keyword "spam."
The same is true for the anchor text associated with these links.  The anchor phrase "Click Here for More Info" can be changed to "Click Here for More Info About [keyword]" as long as it appears to be an appropriate use of the site's language.
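Put together (the keyword and file name are placeholders), such a link might look like:
<a href="dog-training-tips.html">Click Here for More Info About Dog Training Tips</a>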


Keyword Titled Pages
This technique is directly connected to the previous regarding hyperlinks and anchor tags. Obviously, a keyword hyperlink would need to link to an actual page of the same name (keyword based.)
But, technically, it would still make sense to create and name several pages on a site with major keywords even if there were little internal linking to those pages; Google would still index them.  However, not linking to them would be impractical and might ultimately look odd to Google if it studies the site's structure.
So these techniques work in tandem for good reason.


Home Link Canonicalization
We add a canonical link element to the source code to avoid duplicate content issues. This tag, or link element, is added between the head tags.  All internal links to the index page on the site need to be in the following format:
http://www.domain.com/
Note the deliberate inclusion of the trailing slash.
Canonicalization also applies to other site pages.  Internal links should all be written in the same consistent format so that the search engine will recognize what an appropriate URL is for the site's navigation.
In other words, the link to "page 2" should not vary from page to page. It can't be "http://www.domain.com/page2.html" on one page and "http://domain.com/page2.html" on another.  It needs to stay consistent site wide, and is best in this format:
"http://www.domain.com/page2.html"
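The canonical link element mentioned above, placed between the head tags of a page, follows this pattern (domain and page name are placeholders):
<link rel="canonical" href="http://www.domain.com/page2.html" />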


Robots.txt
With the Robots Exclusion Protocol, we add or edit the robots.txt file to instruct search engines on what to index. The robots.txt file allows us to hide files or directories we don’t wish the search engine spiders to find.
 
Robots.txt syntax examples
 
To disallow the entire website:
User-agent: *
Disallow: /
 
To disallow Google from indexing the entire website:
User-agent: googlebot
Disallow: /
 
To disallow one specific file in a specific folder:
User-agent: *
Disallow: /folder/file.html
 
To disallow one specific folder and its contents AND any file or folder
that begins with the characters "folder":
User-agent: *
Disallow: /folder
 
To disallow all files and folders beginning with "subfolder" inside the folder directory:
User-agent: *
Disallow: /folder/subfolder
 
 
To disallow multiple files and directories:
User-agent: *
Disallow: /folder/post-
Disallow: /folder/posting
Disallow: /folder/search
Disallow: /forums/login
Disallow: /forums/memberlist

Additionally, search engines comply with the nofollow attribute when it is used in the page code for specific links.
The rel="nofollow" attribute value was created for this purpose. This gives us more specific control: instead of telling search engines and bots not to follow ANY links on a page, it lets us easily instruct robots not to crawl a specific link.
For example:
 <a href="login.php" rel="nofollow">Log in Here</a>


Sitemap
Google likes sitemaps and can accept them in a number of formats, but they recommend creating a Sitemap based on the Sitemap protocol because the same file can be submitted to the other search engines, such as Bing and Yahoo!, that are members of sitemaps.org.
Here’s an example of a basic Sitemap with a single entry for a URL that includes an image and a video (for convenience, only a subset of available video information is shown).
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.sitemaps.org/schemas/sitemap-image/1.1"
        xmlns:video="http://www.sitemaps.org/schemas/sitemap-video/1.1">
  <url> 
    <loc>http://www.example.com/foo.html</loc> 
    <image:image>
       <image:loc>http://example.com/image.jpg</image:loc> 
    </image:image>
    <video:video>     
      <video:content_loc>http://www.example.com/video123.flv</video:content_loc>
      <video:player_loc allow_embed="yes" autoplay="ap=1">http://www.example.com/videoplayer.swf?video=123</video:player_loc>
      <video:thumbnail_loc>http://www.example.com/thumbs/123.jpg</video:thumbnail_loc>
      <video:title>Grilling steaks for summer</video:title>  
      <video:description>Get perfectly done steaks every time</video:description>
    </video:video>
  </url>
</urlset>
Once we’ve created our Sitemap, we submit it to Google using Webmaster Tools. (Must have added the site to our Webmaster Tools account first.)
Google also accepts the following as Sitemaps:
RSS, mRSS, and Atom 1.0 Sitemaps
Google accepts RSS 2.0 and Atom 1.0 feeds. Most blog software creates a feed for us, although the feed may only provide information on recent URLs.
Text file Sitemap:
We can also provide Google with a simple text file that contains one URL per line. For example:
    http://www.example.com/file1.txt
    http://www.example.com/file2.txt
   
For best results, we need to ....
...specify URLs, as Google attempts to crawl them exactly as provided.
...use UTF-8 encoding.
...include nothing but the list of URLs.
...name the text file anything we want, giving it a .txt extension (for instance, sitemap.txt).
Sitemap files should be submitted to Yahoo/Bing as well - according to their own guidelines.

Keyword Meta  Tags
Keyword meta tags are the most commonly known HTML attribute that has an apparent connection with SEO.  Unfortunately, this is one of the reasons that Google doesn't pay much attention to them: because they are easy to edit, they are easily abused.  Rather than spend a lot of strategy on filling meta tags with keywords, most experts agree that it is more important to make sure this attribute has FEW values.  It would be more important to trim a site's meta tag listing that looks too full than it would be to add a meta tag value to a site that has none.
But the consensus seems to be that it doesn't hurt to have approximately 5 very appropriate keywords in the meta tag listing, although it may help very little, and it can actually cause harm if overused.
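A restrained keyword meta tag of roughly that size (the phrases are illustrative) would look like:
<meta name="keywords" content="dog training tips, puppy obedience, house training, leash training, clicker training" />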
               
               
W3c Validation


The Markup Validation Service by the World Wide Web Consortium (W3C) allows us  to check HTML documents for conformance to HTML or XHTML and is also a quick method to check for errors in code. We validate sites as per W3C guidelines.
 
This is a good test to run after completing as much onsite SEO as possible as a "double-checking" process for other on page errors we may have missed or would not otherwise know about.
 
The validation can be run from here:
 
http://validator.w3.org/

Redirects
For sites that must redirect visitors to another site, there are strict requirements for appealing to Google.  Using NO redirection is best, but when that is not possible, it is important to follow these guidelines.
Because we have control of the mod_rewrite extension in the Apache build of our servers, we can use it to dynamically change URLs using arguments on the fly - this is NOT a 301 redirect by itself, but rather related behavior.
For example, if we wanted to redirect .htm files from an old server to their equivalent .php files on a new one using a 301 redirect, we would use a combination of mod_rewrite and the Redirect directive to do the redirection plus the URL change (see the sketch below).
We could do it on a file by file basis by hand, making a really long list of redirects in the .htaccess file without mod_rewrite, but that might be a problem on a server with a lot of files, or on a completely dynamic system. Therefore these two functions are often used together.
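A minimal .htaccess sketch of that .htm-to-.php case (the new domain is a placeholder) would be:
RewriteEngine On
# Send each old .htm URL to its .php equivalent on the new domain with a 301 status.
RewriteRule ^(.*)\.htm$ http://www.newdomain.com/$1.php [R=301,L]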
The syntax for the redirect directive is:
Redirect /yourdirectory http://www.newdomain.com/newdirectory
If the client requests http://myserver/yourdirectory/foo.txt, it will be told to access http://www.newdomain.com/newdirectory/foo.txt instead.
Note: Redirect directives take precedence over Alias and ScriptAlias directives, irrespective of their ordering in the configuration file. Also, URL-path must be a fully qualified URL, not a relative path, even when used with .htaccess files or inside of <Directory> sections.
If we use the Redirect directive without the status argument, it will return a status code of 302 by default. This default behavior has caused problems, so it's important to remember to specify the status, like this:
Redirect permanent /one http://www.newdomain.com/two
or
Redirect 301 /two http://www.newdomain.com/other
Both will return the 301 status code. If we wanted to return a 302, we could either not specify a status, or use "302" or "temp" as the status argument above.
We can also use two other directives: RedirectPermanent URL-path URL (returns a 301 and works the same as Redirect permanent URL-path URL) and RedirectTemp URL-path URL (the same, but for a 302 status).
For more global changes, we would use RedirectMatch, with the same syntax:
RedirectMatch 301 ^(.*)$ http://www.newdomain.com
or
RedirectMatch permanent ^(.*)$ http://www.newdomain.com
These arguments will match any file requested at the old account, change the domain, and redirect it to the file of the same name at the new account.
These directives can be used in either the .htaccess file or the httpd.conf file. It's most common to do it in the .htaccess file because it's the easiest and doesn't require a restart, but the httpd.conf method has less overhead and works fine as well.

This following scenario assumes we have a new domain (with no working pages under it) and want it to redirect properly to a main domain.
1. Ensure that we have 2 accounts - the old site and the new site (they do not have to be on different IP's or different machines).
2. The main (proper or canonical) site should be pointed at the new site using DNS. All other domains should be pointed at the old site using DNS. Parking them there is fine at this point.
3. Edit the .htaccess file at the root of your old account with this code:
Redirect 301 / http://www.newdomain.com/

Public WhoIs Record
This technique has little to do with SEO and more to do with good marketing - especially in the corporate world.  We want to make sure that a client's domain record shows an up-to-date and legitimate physical, business address in its public record.  Some potential customers will research this data  (because it is easy to do)  in an effort to verify the validity of a company's website.

               
hCard/vCard Physical Address Microformat
Microformats are a way of adding simple markup to human-readable data items such as events, contact details or locations, on web pages, so that the information in them can be extracted by software and indexed, searched for, saved, cross-referenced or combined.

Microformats are a way to use (X)HTML for data, a logical next step in the evolution of web design and information architecture, and a way of thinking about data which combines HTML, content and presentation.

From the SEO perspective, the most commonly used microformat is the integration of hCard for the contact address on the website's contact page.

If we write the contact address of our company WITHOUT using the hCard, the HTML will be as follows:
<p> BlackWire Marketing, LLC<br>
220 E. 11th Ave - Suite 1<br>
Eugene, OR 97401</p>
<p>541-343-3653</p>

Written WITH the hCard microformat, the same address becomes:
<div id="" class="vcard">
 <a class="url fn n" href="http://www.blackwiremarketing.com/"> </a>
<div class="org">BlackWire Marketing, LLC</div>
 <a class="email"
href="mailto:support@blackwiremarketing.com">support@blackwiremarketing.com</a>
<div class="adr">
 <div class="street-address">220 E 11th Ave Suite 1</div>
<span class="locality">Eugene</span>,
<span class="region">OR</span>,
 <span class="postal-code">97401</span>
<span class="country-name">USA</span>
</div>
<div class="tel">541-343-3653</div>
</div>

The above code produces this result:
BlackWire Marketing, LLC
220 E 11th Ave Suite 1
Eugene , OR , 97401 USA
541-343-3653

Note: Publishing the email address will get it spidered by mail harvesters. Use that data with caution.


hCard code can be generated here:
Note: It is advisable to put only one address on a page.  But if we have to put more than one, then change the div id="" class="vcard" tag accordingly for every address.


Google Analytics
We add Google Analytics to all sites for monitoring SEO and traffic progress and for future research.
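As a reference, this is our recollection of the asynchronous ga.js tracking snippet Google Analytics issued at the time of writing; it goes just before the closing </head> tag, and the UA-XXXXXXXX-X value is a placeholder for the site's own property ID (always verify against the code generated inside the client's Analytics account):
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXXXXX-X']);
  _gaq.push(['_trackPageview']);
  (function() {
    var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
  })();
</script>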
               
Usability and Conversion Tools
Usability testing is the measurement of the quality of a user's experience when interacting with the website. If a visitor finds it difficult or frustrating to use the site, they will decide to exit.
We install a usability tool (JavaScript) on the site to track user behavior of visitors on the website. This helps track strengths & weaknesses as we record the visitor experience. 



One such tool is at: http://www.clixby.com
Clixby records visitor movements (mouse clicks, etc.) on the site and stores them in a database for researching the on-site user experience.

Favicon
Adding a favicon to the site adds some credibility - but mostly for the end user.  Aside from looking better, a bookmarked site is easier to recognize on the user's side.  Browser title tabs will look better and make the site more brandable.
http://www.favicon.cc   can be used to generate favicon files.
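Once the favicon.ico file is uploaded to the site root, it is referenced between the head tags like this:
<link rel="shortcut icon" href="/favicon.ico" type="image/x-icon" />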

Site Content Checklist:
Keyword Density (ratio)
There is no fixed formula for the ratio of keywords that should appear on a site when measured against non-keyword words (including code).  Some claim that 2% is low and that 18% is high.  There seems to be a lot of variation, but usually within those ranges.
The main point is to use good, relevant content that is written to be readable for humans rather than for search engine spiders.  Obviously, content with zero keywords is senseless, and probably hard to produce anyway if the content is naturally relevant; it would be hard to write an article about magic tricks without using the word "magic" at least a few times.
On the other hand, using a major keyword too often, and in a way that is less than pleasant to read, would be a red flag of keyword overuse.

Keyword  Proximity
We pay attention to keyword proximity and prominence - as well as the order of the content itself.
The prominence of the keyword is based on the first instance of where it appears within the content.  A keyword phrase that is used at the end of the content will be considered less relevant than a keyword phrase that appears in the first portion of the content or article.
This means that we need to ensure that we implement primary keyword phrases in the first half of our content, so that they are given more weight when search engine spiders index the website.

Keyword proximity is also very important in terms of search engine rankings because it indicates to the search engines that these keywords are related to one another.
For example, if we search for the keyword phrase 'dog training tips' we will pull up listings of websites that include the exact keyword phrase 'dog training tips'.
Without quotations, all websites that feature the keywords dog, training and tips separately will also appear in the search results. Since we want to rank for specific phrases, rather than just individual keywords, we need to do our best to keep close keyword proximity, so that search engine spiders crawling the website index entire phrases rather than singular keywords.

300 Word Plus Content
As a general rule of thumb, articles and content (especially blog posts) need to be at least 300 words.  Good quality content is more powerful than sheer quantity, but this is a reasonable starting point.  In an extreme case, if we found a blog with a thousand articles, all of which were only a couple of sentences, we would be suspicious.  We are not sure whether Google would be, depending on the content, but it could be a risk.  The 300 word rule is not set in stone, but it can apply in many cases.

General Content "Rules"
More important than density ratios or word count is content quality.  Google loves dynamic (changing) content that is readable by humans (not just appealing to spider formulas).  It hates patterns that look contrived.  Content that is actually helpful and valuable will always fare better.
Google despises duplicate content (content that appears on other sites exactly as it does on ours) and will punish it severely.  This is why pre-purchased articles require massive re-writes when used as website and blog content.
                               
               
Mobile Enabled  Site
Websites can be built with a mobile friendly alternate version. Browser checking code is used on the index page to detect the appropriate version; when a mobile browser is in use, the redirect can go to a subdomain set up as http://mobile.yourdomain.com (see the sketch below).
Mobile enabled websites can be ranked separately by Google when Google sees them as "mobile enabled", because they can be searched for separately.  Google classifies a website as mobile enabled based on certain signals: good page layout, the markup used, good semantic structure (H1 before H2, etc.), the right doctype being set, and the correct encoding being used.  A mobile enabled site has no iframes, tables or popups. Mobile websites get a boost if certain quality metrics hold.
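A minimal browser-check sketch for the index page (assuming a mobile.yourdomain.com subdomain exists; the user-agent list is only a starting point, not exhaustive):
<script type="text/javascript">
// Redirect common mobile user agents to the mobile subdomain.
if (/iPhone|iPod|Android|BlackBerry|Opera Mini|IEMobile/i.test(navigator.userAgent)) {
  window.location.replace("http://mobile.yourdomain.com/");
}
</script>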






SEO OPERATIONS MANUAL Part 1: RESEARCH



Competition to Search Volume Ratio (The Sweet Spot)
Keywords with low competition and high search volume are the goal when looking for effective keywords. But it's difficult to quantify what is "high" and what is "low". The parameters differ based on niche, geography and other unique factors for the business.

The industry uses this formula as an index or litmus test for Keyword Effectiveness:


KEI = SV² / C
C = the number of competing sites that have the KW in the title and in an anchor
SV = monthly searches
But again, no one seems to quantify what a "good" KEI is.  High KEI's are better than low ones, but that can only be measured when compared to other data like other keywords.  One source says a "good" KEI starts in the triple digits, but this is a generalization.
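For instance, working the formula with illustrative numbers: a phrase with 3,000 monthly searches and 1,000 qualifying competing pages scores KEI = 3,000² / 1,000 = 9,000, while a phrase with the same 3,000 monthly searches but 100,000 competitors scores only 90 - so the first keyword is the far better target even though the search volume is identical.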
Additionally, keywords with higher commercial intent scores (OCI) will generally be better choices.  Our tests have used keywords with OCI ratings of 50% or greater and we have avoided those with less than that.  OCI will not apply to every niche, but should be considered whenever possible for commercially driven sites.  
Our basic formula range for good keywords  is:
Search Volume = 100 per day minimum (3,000 monthly)
Competition Score of 100,000 or less (Counting only SE results of sites with the keyword in the title and at least one anchor tag)
OCI rating of 50% or greater.
A GREAT keyword fits this criteria:
Search Volume = 2,000  or more daily
Competition Score = 10,000 or less
OCI = 100%
Domain available (.com, .net and/or .org)
NOTE: These scores MUST be adjusted drastically, and in some cases do not apply, for local markets where competition is low and search volume is lower.

KW Domains (.net .org)
By far, the most valuable attribute for SEO is a phrase-matched keyword domain (keyword.com).  Finding the .com for a great keyword is rare now, but .net and .org suffice.  Although they will almost always be beaten by the .com in an organic search race, they are definitely second place contenders - which is still great search engine result placement.  A phrase-match keyword domain outranks almost all other attributes, including content.  We have discovered keyword.com sites that rank in #1 positions with blank HTML index pages.
We have tested phrase matched domains in specific niches and outranked all competing sites in less than 5 days with a brand new domain and basic blog content.  Therefore, this attribute is at least stronger than the age of a domain registration.

KW RICH DOMAINS (.com)
When an exact phrase match domain cannot be found in .com, .org or .net (all other TLDs do not count), the next level down is a keyword rich domain with a .com.  In other words, if we can't find keyword.com, keyword.net or keyword.org, we start looking for thekeyword.com, mykeyword.com or keywordonline.com.
Some testing still needs to be done to see if the variants in the domain work better when placed at the end or at the beginning.  Many SEO experts would say that keeping the keyword at the front is a safer bet.
Also, dashes in keyword rich domains seem to be less valuable than domains without them, so we avoid them (although their use is quite prevalent).  Again, testing is the best way to know for sure.

Google Analytics
When beginning KW research, the easiest starting point is the client's existing site traffic.  If the client is tracking visits with Google Analytics, the SEO tech can start by looking at the words the site is already getting traffic from.  This list can at least reveal the best seed word, if not a handful of good keywords to exploit, since they are already working.

cPanel  Web Stats for Existing KW
For clients without Google Analytics, the web stats available in cPanel can suffice. Although the data is cruder, all that is needed are basic search terms anyway.  AWStats or Webalizer should be able to uncover some info that creates a good starting point for research.

Study the Authority Site
A quick analysis of the top authority site(s) for the niche can help reveal more keyword possibilities as well as how they are organized.  We look for tags, page titles, categories and site map usage to see what is working for them.
quantcast.com, spyfu.com and alexa.com are some good sites we use for authority and competition research.

Google's Wonder Wheel
The suggested words that Google provides in its search bar (after typing the key phrase) will provide a list of keyword phrases that are obviously important to Google when performing that specific search. It makes sense to include as many of them as applicable.
But the Wonder Wheel's visual "mind map-style" graphic can be displayed and used to find connected and relevant keywords. This data provides us with a potentially perfect schematic for site navigation that Google will love.  When Google loves the site's navigation, there is potential for it to be displayed at the bottom of an organic listing.  This would greatly increase click-through rates.
There are a number of Wonder Wheel scrapers that can be used to derive the Wonder Wheel's level of data.  Each tree branch on the wheel is another level of related keywords that presents deeper research on any specific word.