Google is doing it again: making sure content is king. In case you haven’t noticed, there’s a slow but steady sea change in the attitude of the internet’s content gatekeepers. The change will affect how everyone approaches doing business on the web.

The shift may, as one writer suggested, define the next incarnation of the web. The wide-open, everyone-is-welcome structure of Web 2.0 is shifting back to a form that emphasizes quality and accuracy. And it can’t happen too soon, especially when you consider the proliferation of keyword-stuffed junk that often passes for “content.”

Underscoring this shift is Google’s latest search algorithm update, which has caught the attention of webmasters and designers. Dubbed Panda, the new search algorithm was rolled out last year, and an update was announced at the end of March.

When originally announced, the update was specifically aimed at reducing rankings for low-quality sites and boosting rankings for those of higher quality. So what constitutes a high-quality site? For quality content creators, this wasn’t rocket science. Google defined these sites as having “original content and information such as research, in-depth reports, thoughtful analysis and so on.” Something most of us knew all along.

In defending what seemed to many to require no defense, Google said, “Google depends on the high-quality content created by wonderful websites around the world, and we do have a responsibility to encourage a healthy web ecosystem. Therefore, it is important for high-quality sites to be rewarded, and that’s exactly what this change does.”

Google is always updating its algorithms and changing things, which keeps web specialists busy keeping up—and content creators on their best behavior.

Website managers and marketing experts say the changes to Google Panda are more significant and place renewed emphasis on building better sites. The initial response has been to beef up content and to build better links. Google is even helping developers with a number of tips for building better websites to achieve better search results.

Here’s a sampling:

  • Include a clear hierarchy and text links. Every page should be reachable from at least one static text link.
  • Create a useful, information-rich site, and write pages that clearly and accurately describe your content.
  • Offer a site map and make sure the links point to important parts of the site. If the site map has too many links, break the site map into multiple pages.
  • Keep the links on a given page to a reasonable number.
  • Make sure your site includes the words users would type to find your pages.
  • Use text instead of images to display important names, content, or links. The Google crawler doesn’t recognize text contained in images.
  • Make sure that your <title> elements and ALT attributes are descriptive and accurate.
  • If you use dynamic pages (i.e., the URL contains a “?” character), know that not every search engine spider crawls dynamic pages as well as static pages.

While content has always been king, good content rules. So it’s no surprise that Google is increasingly focusing on bringing better quality to websites. As those who engage in print have long known, quality trumps quantity.
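
Several of the tips above come down to how a page is marked up. Here is a minimal sketch of a page that follows them; the page title, paths, and file names are hypothetical, used only to illustrate the pattern:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <!-- A descriptive, accurate <title> element -->
  <title>Handmade Leather Wallets – Care and Repair Guide</title>
</head>
<body>
  <!-- Static text links give crawlers a clear hierarchy to follow -->
  <nav>
    <a href="/">Home</a>
    <a href="/guides/">Guides</a>
    <a href="/sitemap.html">Site map</a>
  </nav>

  <!-- Important names and content rendered as text, not images -->
  <h1>Caring for a Handmade Leather Wallet</h1>

  <!-- Images carry descriptive ALT attributes, since the crawler
       cannot read text embedded in the image itself -->
  <img src="/img/wallet-conditioning.jpg"
       alt="Applying leather conditioner to a brown wallet">
</body>
</html>
```

Note that the heading and navigation are plain text links rather than image buttons or script-generated menus, which keeps every page reachable through at least one static text link.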

Tony Dokoupil’s recent article in The Daily Beast highlights the shift toward higher quality on the web, not the least of which is the increased use of experts to generate, edit and manage content on websites.

The return to expert-moderated web content will also influence how website experts, marketers and users approach the internet. Designers and creators will be focused on improving quality, and users will increasingly rely on search engines to return the best content for their searches. It’s a win-win for both quality content creators and those who seek it.

This post was written by Mike Sobol, who is Co-Founder and COO at