Paid Search or Not

The popular search engines use complex algorithms to determine what a searcher is looking for and to deliver a manageable set of results from the three-plus billion pages available. The algorithm works to establish how relevant a webpage is to the searcher's query. The more specific the query and the better "indexed" a webpage is, the higher its position within the listings. There are many techniques and strategies available to improve a webpage's position.

Cracking the Code

For years, webmasters have toiled to learn the "code" behind the algorithm to ensure their webpages appear at the top of the listings. If they ever fully succeeded, however, "paid search" would become unnecessary. The search engines are in business for one reason only: to generate revenue for their owners. Unfortunately, paid search can be expensive, especially for smaller enterprises with limited budgets. As a result, webmasters work diligently and endlessly to master search engine optimization (SEO) techniques. Yet as webmasters close in on cracking the "code," the architects of the search engines change the algorithm, sending webmasters back to square one. It is a game of cat and mouse.

Position Relevancy

In the world of search, positioning is everything. If a website does not appear on the first page of a search engine's listings, it effectively lacks relevancy. First-page placement depends on being relevant to the search criteria the viewer has entered into the search box. However, countless websites meet the same search criteria, and only a handful can make it to the first page. That is because the engines use varied criteria to determine which sites earn first-page ranking. By and large, the sites that earn the top listings are the most trafficked (most popular) sites. But becoming the most trafficked is very difficult if a site is not already on the first page, creating a "chicken and egg" scenario.

Webmaster’s Toolbox

Webmasters use a host of techniques to bring their websites onto an engine's first page. Incorporating tools such as LinkedIn, Facebook, and other mediums (social and otherwise) drives the popularity of a website. As popularity increases, the site's ranking within the search engine also increases. Although the use of various mediums does increase a website's popularity, these techniques alone will not produce the desired results.

Additional tools at the webmaster's disposal include the assignment of "keywords." These words are embedded in a website and are intended to match the words typed into the search inquiry. When search words match keywords, the combination produces a search result (a hit). Unfortunately, the same keywords belong to an abundance of sites, so using keywords alone will not generate the hits desired.
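To make the idea concrete, here is a minimal sketch of keyword matching in Python. It is a hypothetical illustration only; the page data and function names are invented for the example, and real search engines weigh many more signals than simple word overlap.

# Deliberately simplified illustration of keyword matching.
# The pages and keywords below are hypothetical examples.
pages = {
    "example.com/plumbing": {"plumber", "pipes", "repair", "emergency"},
    "example.com/bakery":   {"bread", "cakes", "bakery", "custom"},
    "example.com/roofing":  {"roof", "repair", "shingles", "emergency"},
}

def keyword_hits(query: str) -> list[str]:
    """Return the pages whose embedded keywords overlap the search words."""
    search_words = set(query.lower().split())
    hits = []
    for url, keywords in pages.items():
        if search_words & keywords:  # any shared word counts as a hit
            hits.append(url)
    return hits

print(keyword_hits("emergency repair"))
# Both the plumbing and roofing pages match, which illustrates the point:
# many sites share the same keywords, so keywords alone cannot rank them.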

The Changing Algorithm

With newer and more sophisticated algorithms, the webmaster must employ newer techniques to optimize relevance. Although the true formulas are not disclosed, the search engine architects do provide some hints about their mysterious logic. One tip they share for increasing relevancy is "content value." With the older algorithms, loading a page with keywords, "fluff content," and hyperlinks to other websites was a good way to achieve relevancy. However, the updated engines often downgrade a site for these tactics. When a web designer employs techniques such as these, the engines refer to them as "black hat" techniques. A black hat activity is essentially an attempt to "game" the engine, and such tactics actually lower a website's ranking.

In contrast to keyword loading and fluff is "content value." The contemporary algorithms favor high-value content: rich, well-written, authoritative material. The better the content, the more it is favored by the engines. However, the architects would never make it that easy. Many webmasters believe that to be favored by the engines a site must include academic-level content, broad use of various online mediums (LinkedIn, Facebook, etc.), evolving content, newsworthy articles, and the incorporation of blogs, video blogs, and other tools that elevate the value of a website. Techniques that use approved methods and do not attempt to "game" the engines are considered "white hat" practices.

The Faster Way to the Top

Ultimately, the search engines are businesses whose primary purpose is to generate revenue and profits for their shareholders. That revenue comes from paid advertisers who purchase space on the engines' pages (much like a Yellow Pages listing). If webmasters learned the secret code and rendered paid advertising unnecessary, the engines would cease to generate revenue. As such, the changing algorithm ensures that no webmaster can win against the "house."

Because the "playing rules" are always changing, the only way to "win" is to purchase paid advertising. Regardless of the tools, techniques, or content value a webmaster employs, the engines will always give preferential treatment (relevancy) to those who pay. It is hardly coincidental that entities using paid search achieve better ranking than entities relying on organic methods alone. Indeed, the more an entity spends on paid search, the better its relevancy (rank position).
