Meta Data Optimization For Professional Services Websites

Professional legal service websites are generally heavily Google centric and do not receive much organic search traffic from other search engines like BING or Yahoo. Another KPI, the conversion rate of organic traffic, was also lower than expected. A review of the top performing attorney websites in several major metro markets and diverse practice areas, including sites built by competitors like Justia, iLawyer Marketing, and SLS Marketing, highlighted the use of unique Meta data to maximize the organic audience, improve search engine indexation, and increase page relevance to search user queries. The websites that appeared in SERPs with greater visibility and frequency shared a common characteristic: unique Meta keywords, unique Meta descriptions, and unique title tags.

How Search Engines Use Meta Data

While it is common marketing knowledge that Google does not read Meta keywords for use in its algorithm, as Matt Cutts of Google has stated frequently, there are hundreds of other search engines that do use this information to help establish page relevance and improve indexation. Consider these two points about how search engines use Meta data.

  • While Google may not read Meta data, why does it stress it in Webmaster Tools? Under HTML Improvements, Google Webmaster Tools (WMT) commonly lists duplicative Meta data. Through this report, Google effectively recommends going “old school SEO” and improving your website by removing duplicative Meta data and providing unique titles, keywords, and descriptions for greater relevancy to user queries (a sketch of this kind of duplicate check follows this list).
  • Recently Google has gone back to indexing news sites with Meta data. Google recommends that news sites and news articles identify story topics by using unique Meta keywords to gain better relevancy and topic clarification.
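
To make the first point concrete, here is a minimal sketch, in Python using only the standard library, of the kind of duplicate-Meta check that the WMT HTML Improvements report performs. The sample pages, URLs, and firm name are hypothetical, not taken from any site reviewed above.

from collections import defaultdict
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    """Pulls the <title> text and Meta description/keywords out of a page."""
    def __init__(self):
        super().__init__()
        self.meta = {"title": "", "description": "", "keywords": ""}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            attrs = dict(attrs)
            name = attrs.get("name", "").lower()
            if name in ("description", "keywords"):
                self.meta[name] = attrs.get("content", "").strip()

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.meta["title"] += data.strip()

def find_duplicate_meta(pages):
    """pages: dict of url -> raw HTML. Returns Meta values shared by 2+ pages."""
    seen = defaultdict(list)
    for url, html in pages.items():
        parser = MetaExtractor()
        parser.feed(html)
        for field, value in parser.meta.items():
            if value:
                seen[(field, value)].append(url)
    return {key: urls for key, urls in seen.items() if len(urls) > 1}

# Two hypothetical practice-area pages sharing the same title and description,
# exactly the situation WMT flags under HTML Improvements.
pages = {
    "/personal-injury": "<html><head><title>Smith Law Firm</title>"
                        '<meta name="description" content="Contact us today."></head></html>',
    "/dui-defense":     "<html><head><title>Smith Law Firm</title>"
                        '<meta name="description" content="Contact us today."></head></html>',
}
for (field, value), urls in find_duplicate_meta(pages).items():
    print(f"Duplicate {field} {value!r} on: {', '.join(urls)}")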

Professional search engine marketers who follow consistent quality guidelines have continued to include unique Meta content in every site build. Many SEOs focused purely on rank tend to ignore quality guidelines while reacting to the latest SEO trend. All the domains that ignored the practice of building page relevancy now have to play catch up, and here’s why.

Algorithm Changes In Google

In April 2012, Matt Cutts mentioned in a blog post Google’s upcoming efforts to improve site quality and user experience. At that time, Google was looking to increase the quality of SERPs by penalizing sites that were openly over-optimizing their domains with questionable SEO practices. This algorithm update was called Penguin. The first Penguin update launched around April 24, 2012 and adjusted search results for a number of spam factors, including non-diverse backlink profiles, aggressive redundant anchor text, keyword stuffing, and over-optimized title tags, to name a few. This “over-optimization” of title tags includes titles that well exceed the standard character limits. One commonly misdiagnosed SEO problem from Penguin is that it is only a link problem, when in fact it might be a poor optimization or over-optimization problem.

Most Meta data has a standard number of characters that will be displayed across the search engines. For example, title tags of 70 characters or fewer will display fully in a Google search, and 72 characters or fewer will display fully in a BING or Yahoo search. When you research top performing sites, you will see that many of them use no more than 70 characters to identify what each page is about. Websites that exceed 70 characters and pack the title tag with keywords tend to perform at a less than optimal level and to be poorly indexed. A quick way to audit this is sketched below.
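
The short Python sketch below flags title tags longer than the 70- and 72-character display limits cited above; the sample titles and URLs are made up for illustration.

# Flag title tags that exceed the display limits cited above:
# 70 characters for Google, 72 for BING/Yahoo.
GOOGLE_LIMIT = 70
BING_YAHOO_LIMIT = 72

titles = {
    "/": "Denver Personal Injury Lawyer | Smith Law Firm",
    "/car-accidents": "Car Accident Attorney Denver CO | Free Consultation | Smith Law Firm | Call Now",
}

for url, title in titles.items():
    length = len(title)
    if length > GOOGLE_LIMIT:
        print(f"{url}: {length} chars - truncated in Google"
              + (" and BING/Yahoo" if length > BING_YAHOO_LIMIT else ""))
    else:
        print(f"{url}: {length} chars - OK")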

SEO Industry Standards “Old School” Style

When it comes to keyword usage in the Meta keywords, the general rule of thumb is 3 to 5 unique keywords that support the relevancy of the page. A common “black hat” SEO tactic is to stuff a long list of keywords into the string, repeat keywords within it, or simply ignore the keywords because “Google doesn’t read it”. All of these are bad ideas. The Meta keywords help explain what the page is about, especially when the primary keyword of the page topic is the first phrase listed. The remaining keywords need to be relevant to, and non-duplicative of, the primary keyword. Semantic indexing further reinforces the quality of the on-page topic and improves the search engines’ understanding of what the page is truly about. The outcome is a page that is easier to index because it is easier for search bots to understand. The byproduct of helping the search engines identify and understand the content on the page, combined with correct and consistent basic optimization, is better placement in the results pages.
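
As a rough illustration of that rule of thumb, the Python sketch below checks a Meta keywords string for the 3-to-5 count, duplicate entries, and primary-keyword-first ordering described above. The sample keyword string is hypothetical.

# Check a Meta keywords string against the "old school" rule of thumb:
# 3 to 5 unique keywords, primary page topic listed first, no repeats.
def check_meta_keywords(content, primary_keyword):
    keywords = [k.strip().lower() for k in content.split(",") if k.strip()]
    problems = []
    if not 3 <= len(keywords) <= 5:
        problems.append(f"expected 3-5 keywords, found {len(keywords)}")
    if len(set(keywords)) != len(keywords):
        problems.append("duplicate keywords in the string")
    if keywords and keywords[0] != primary_keyword.lower():
        problems.append("primary keyword should be listed first")
    return problems or ["OK"]

content = "denver dui lawyer, dui defense attorney, drunk driving charges denver"
print(check_meta_keywords(content, "denver dui lawyer"))  # ['OK']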