Natural Search Optimization

Cherry Pick Rankings with the Best

Achieve better web rankings by hiring Dutchess Marketing to implement a customized optimization strategy. Let us fine-tune your website's content, optimize your metadata, and execute a comprehensive link building program so the search engines recognize your site's relevance.

An Overview of Our Process

Keyword Research

This step of the natural search engine optimization process can make or break a campaign. Keyword phrases are critical, and proper selection ensures qualified visitors. That's why we invest significant effort into researching keywords that make sense for your business. We'll evaluate your historical web analytics data and use market research tools to determine the best targets. After assessing the level of competition and the estimated effort required to obtain rankings, we will present you with a phased project plan outlining the low-hanging fruit and the long-term targets.

Keyword Mapping

Assigning keyword phrases to pages is just as important as the effort made to discover them. Collectively, how you group keywords will impact your ability to score rankings. Depending upon the size of your website, Dutchess Marketing recommends assigning similar keyword phrases to a single page, because consolidation allows you to maintain keyword density, as illustrated in the sketch below.
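
To make "keyword density" concrete, the sketch below (in Python, with a made-up block of copy and a hypothetical target phrase) counts how much of a page's word count the target phrase accounts for. It is purely illustrative; we do not treat any single percentage as a fixed rule.

    import re

    def keyword_density(page_text: str, phrase: str) -> float:
        """Return the share of the page's words taken up by the target phrase."""
        words = re.findall(r"[a-z0-9']+", page_text.lower())
        if not words:
            return 0.0
        phrase_words = phrase.lower().split()
        span = len(phrase_words)
        # Count every position where the full phrase occurs in sequence.
        hits = sum(
            1
            for i in range(len(words) - span + 1)
            if words[i:i + span] == phrase_words
        )
        return (hits * span) / len(words)

    # Hypothetical copy with two occurrences of the target phrase.
    copy = "Our organic coffee beans are roasted daily. Buy organic coffee beans online today."
    print(f"{keyword_density(copy, 'organic coffee beans'):.1%}")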

Content Creation

Search engine copywriting is the development of website content that meets the following criteria: it contains target keyword phrases for the search engines, provides valuable information to visitors, and encourages visitors to move closer to conversion. Our expert copywriting services will exceed the industry standards (a brief audit sketch follows the list):

  • Performance Keyword Density
  • Title Tag Creation
  • Optimized Meta Data
  • Catchy Keyword Rich Headings
  • Media with Alt Tags
  • Embedded Links with Title Text
  • Stylized Keyword Phrases
  • Call to Action
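
To give a sense of how several of these on-page elements can be verified, here is a minimal audit sketch. It assumes the third-party requests and BeautifulSoup libraries and uses a placeholder URL; it illustrates the checks rather than describing our full toolchain.

    import requests
    from bs4 import BeautifulSoup

    def audit_on_page_elements(url: str) -> list[str]:
        """Flag missing on-page elements: title, meta description, headings, alt text."""
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        issues = []

        if not soup.title or not soup.title.get_text(strip=True):
            issues.append("missing or empty <title> tag")
        if not soup.find("meta", attrs={"name": "description"}):
            issues.append("missing meta description")
        if not soup.find("h1"):
            issues.append("missing <h1> heading")
        for img in soup.find_all("img"):
            if not img.get("alt"):
                issues.append(f"image without alt text: {img.get('src', 'unknown')}")
        for link in soup.find_all("a", href=True):
            if not link.get("title") and not link.get_text(strip=True):
                issues.append(f"link without anchor or title text: {link['href']}")

        return issues

    # Placeholder URL used purely for illustration.
    for problem in audit_on_page_elements("http://www.domain.com/products/widget"):
        print(problem)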

Search Engine Index and Sitemaps

In order to rank for keyword phrases, your web pages must appear within the search engine index. Sitemaps are an excellent way to initiate placement in the index; however, they do not guarantee positioning. Search engines primarily perform hierarchical crawls to find new content and expand their index, starting from the homepage and descending down to the Nth-level product pages. Search engine spiders are not limited to using your homepage as an entry point: in many cases they will reach your internal pages via inbound links from other websites. Even so, your website should have a sitemap to ensure the search engines are aware of all of your content.
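
As an illustration of the sitemap itself, the sketch below builds a small XML sitemap with Python's standard library. The URLs, change frequencies, and output location are placeholders; a production sitemap would be generated from your site's actual page inventory.

    import xml.etree.ElementTree as ET

    # Placeholder page inventory; a real build would pull this from your CMS or crawl data.
    pages = [
        ("http://www.domain.com/", "daily"),
        ("http://www.domain.com/products/", "weekly"),
        ("http://www.domain.com/products/widget", "monthly"),
    ]

    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, changefreq in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "changefreq").text = changefreq

    # Write sitemap.xml locally; it belongs in the web root when deployed.
    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)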

Robots.txt Configuration

Robots.txt is a file that needs to reside in the root directory of your website. It essentially tells search engine spiders which areas of the site they are welcome to crawl. Search engine spiders and other well-behaved robots refer to this file before crawling in order to learn which sections are open to them and which are off limits. So, for obvious reasons, we will configure the robots.txt file to restrict the search engines from password-protected content.
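
As a quick illustration of how a crawler interprets the file, the sketch below uses Python's built-in urllib.robotparser against a placeholder domain. It assumes the site's robots.txt contains a Disallow rule for a hypothetical /members/ area.

    from urllib.robotparser import RobotFileParser

    # Placeholder domain; point this at your own site's robots.txt.
    parser = RobotFileParser("http://www.domain.com/robots.txt")
    parser.read()

    # Assuming robots.txt contains "Disallow: /members/", a well-behaved crawler
    # skips the protected area while public pages remain crawlable.
    print(parser.can_fetch("Googlebot", "http://www.domain.com/members/account"))
    print(parser.can_fetch("Googlebot", "http://www.domain.com/products/widget"))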

Duplicate Content Prevention, Canonical URLs, 404 Response Checks, and 301 Redirects

We will evaluate your website with attention to potential causes of duplicate content, which search engines devalue. We will check for canonical URL errors (http://www.domain.com vs. http://domain.com) and recommend steps for eliminating multiple instances of cross-domain duplicate content. We will study the behavior of your web pages' header response codes, confirm that missing pages return a proper 404, and determine whether implementing 301 redirects can benefit you.
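
To illustrate the kind of header checks involved, the sketch below (assuming the requests library and placeholder URLs) reports each URL's status code without following redirects, which makes canonical 301 redirects and proper 404 responses easy to spot.

    import requests

    # Placeholder URLs; in practice this list comes from a crawl of your site.
    urls = [
        "http://domain.com/",                  # should 301 to the www canonical host
        "http://www.domain.com/",              # should return 200
        "http://www.domain.com/no-such-page",  # should return 404, not a "soft" 200
    ]

    for url in urls:
        response = requests.head(url, allow_redirects=False, timeout=10)
        location = response.headers.get("Location", "")
        print(f"{url} -> {response.status_code} {location}")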

The Key to Successful Search Engine Optimization is Careful Planning and Persistence

Do you want to take your natural search engine optimization campaign to the next level? Take the first step towards SEO success and complete this form.