
Website Analytics Tips for SEO

Data is your friend and I don’t mean the guy from Star Trek

Ask any SEO and they will probably tell you that the data in your log files is vital to the success of your online venture. But did you know that many people are afraid of data?

Fear of Data, or FoD, is an affliction that can strike anyone, regardless of how intelligent they are. The symptoms include a look of confusion, a marked state of panic, or an uncontrollable shaking of the head while uttering the words “what the heck do all these numbers even mean?”. These symptoms are universal and can appear any time you need to look under the hood of a website and figure out where the traffic is coming from and what those visitors are doing during their stay.

To understand website analytics we first need to know what to look for and how it can help us. The following are the ten most important metrics and what they mean to your success as a website owner or SEO.
1. Keyword Search Referrals is probably the most important. Mining it for long-tail terms and other nuggets of insight is invaluable; I have had websites where 80% of my traffic strategy was gleaned from studying this data. Remember, data does not lie, so if you see something unexpected in these logs, make sure to follow up.

2. Top Referrals is another high-value metric for guiding SEO/SEM. It tells you which search engines refer the most visitors, and it breaks the numbers down further so you can see how many come from, say, Google Images versus Google Search.

3. Top Viewed URLs is a virtual goldmine of information because it shows you which content on your site is most popular, and it can help identify important SEO opportunities. I have seen client websites where the majority of traffic lands directly on blog category pages, even more than on the main page, and because of that relevance those visits convert extremely well.

4. Visit Duration has a ton of value because it tells you how happy visitors are with your site. If you notice a high incidence of short visits, say 95% staying for 30 seconds or less, then you have what is called a bounce-rate problem and need to work on the look, feel and/or content of your site. Sometimes a quick stay is all a visitor needs to view the picture they searched for and click on a money-earning advertisement, but generally speaking, the longer your visitors linger on your site, the better!

5. Total Visits is a useful statistic for judging the overall popularity of your website. Generally, though, we look more closely at unique visitors, because those are the ‘bread and butter’ of any growing site.

6. Visits per Host can tell you a little more about the make-up of your traffic. You might notice a high number of visits from certain host addresses, and this can mean many things. One month I had a slew of visitors from an IP address belonging to the local telephone company; it turned out one of my posts had gone viral via inter-office email.

7. File Type gives you information on how much bandwidth each type of file is using. It can help you target potential problems with high bandwidth costs, and even poor user experience if you notice that certain memory-intensive graphics are slowing the site down.

8. Visitors/Country is a good way to track where on the planet your visitors are coming from. This is useful when planning content: if you see that a large majority of your visitors come from a certain geographical area, you can leverage that knowledge and write more topical content to spread your website's appeal even further.

9. Total Data Transferred is helpful when you have hosting problems and need a quick idea of how heavy your content is, the most common issue being high bandwidth usage; perhaps a majority of your visitors are watching bandwidth-heavy videos or Flash files.

10. Total Pages Viewed is handy for determining whether your site has some stickiness (meaning visitors stick around and look at more than one page). The more pages each visitor views, the better: they obviously found your site compelling enough to spend time reading the content.
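For readers who prefer to see the numbers computed rather than described, here is a minimal Python sketch that tallies a few of the metrics above from a web server access log in the common “combined” format: unique visitors by host, top viewed URLs, bandwidth by file type, and keyword referrals pulled from the referrer field. The log path, regex and engine list are assumptions for illustration, not a replacement for a real analytics package.

import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Rough parser for the "combined" log format (host, timestamp, request, status, bytes, referrer, agent).
LINE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) (?P<bytes>\d+|-) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

hits_per_host = Counter()      # rough stand-in for "visits per host"
top_urls = Counter()           # "top viewed URLs"
bytes_per_type = Counter()     # "file type" bandwidth
search_keywords = Counter()    # "keyword search referrals" from the referrer query string

with open("access.log") as fh:                 # assumed log location
    for line in fh:
        m = LINE.match(line)
        if not m:
            continue
        hits_per_host[m["host"]] += 1
        top_urls[m["path"]] += 1
        ext = m["path"].rsplit(".", 1)[-1].lower() if "." in m["path"] else "(none)"
        if m["bytes"] != "-":
            bytes_per_type[ext] += int(m["bytes"])
        ref = urlparse(m["referrer"])
        if "google." in ref.netloc or "bing." in ref.netloc:
            q = parse_qs(ref.query).get("q", [""])[0]
            if q:
                search_keywords[q.lower()] += 1

print("Unique visitors (by host):", len(hits_per_host))
print("Top URLs:", top_urls.most_common(5))
print("Bandwidth by file type:", bytes_per_type.most_common(5))
print("Top search keywords:", search_keywords.most_common(5))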

So if you have FoD, like so many webmasters do, just remember how much value the data contains. Once you realize how valuable it is and experience how much it can help you and your clients, you will be like me and move on to a new addiction: LoD (Love of Data).


SEO (Search Engine Optimisation)

SEO (Search Engine Optimisation) is the study and business of improving search results. This section includes a summary of search engine and SEO history, common SEO protocols, information on international search engine markets, and the types of SEO.

Definition of SEO

SEO (Search Engine Optimisation, or Search Engine Optimiser) refers to the study and business of improving rankings in organic (unpaid) search results. Depending on the end users’ geographic location and interest/industry, SEO targets different search engines and types of search, including multimedia search (image, video, audio), industry-specific ‘vertical’ search, local search and social search. In most cases SEO is performed under the assumption that a better position in SERPs (Search Engine Results Pages) will increase traffic to a website.
SEO is considered an online marketing strategy for increasing website traffic from search engine users. It works under the assumption that the higher a site ranks for a particular query in the SERPs, the more visitors will follow links to it. Although there are many large networks related to SEO, such as SMX, there is no official body that represents the SEO industry.
Furthermore, search engines generally do not publish the rules of the search ‘algorithms’ used to determine the rank of a particular site or link. Extensive experimentation into SEO techniques, and into patents held by search engines, has been documented and publicly released by prominent SEOs including Rand Fishkin, Barry Schwartz, Aaron Wall and Jill Whalen. However, many SEO methods and metrics remain widely disputed.
Despite this, a few strategies are commonly used in SEO. Typically SEO strategies consider how search engine ranking algorithms ‘work’, based on observation of different techniques in search engines and an understanding of the types of algorithms search engines use. SEOs (Search Engine Optimisers) also work under the assumption that search engines use automated bots (also known as ‘spiders’ and ‘crawlers’) to index the contents of websites and related material like map data and multimedia resources. Therefore SEOs will generally try to reduce anything that might impede the ‘crawling’ of pages; for example, most search engines cannot index content behind forms, which leads to the ‘deep web’.

Types of SEO

There are two schools of SEO into which campaigns are typically classified, known as ‘white hat’ and ‘black hat’ SEO.

White hat SEO is a term associated with improving search rankings through increased ‘relevance’: fixing problems in a site’s information architecture, structure and coding, and removing obstacles to search engine bot activity. This type of SEO is usually performed with the intention of creating long-term results by improving site performance and relevance.

Black hat SEO may incorporate techniques used in white hat SEO, but it usually carries connotations of manipulative practices used to ‘trick’ search engines into ranking content highly. A common name for these practices is ‘spamdexing’, or the indexing of spam content. Examples of black hat techniques include ‘keyword stuffing’, the repeated injection of useless or unrelated keywords into a web page, and ‘link farming’, the ‘unnatural’ linking of websites together in order to build a large number of links to a page.

The main difference between white hat and black hat SEO is that white hat SEOs tend to follow the guidelines provided by search engines while black hat SEOs do not. As a result, search engines may ban sites which employ black hat techniques.

History

According to industry analyst Danny Sullivan, the earliest known use of the phrase “search engine optimization” was a spam message posted on Usenet on July 26, 1997. However, it is generally accepted that webmasters have been optimizing websites for search engines since the early 1990s, when the first internet search engines were created.

Early ‘SEO’ practices involved submitting a URL to a search engine in order to have the page crawled and indexed. Typically, search engines (still) collect this information into a cache on the search engine company’s servers, which is then processed by algorithms to determine the ‘weight’ and ‘relevance’ of the page to particular words used in search queries. Links found on these pages are then scheduled for re-crawling at a later date.

As search engines became more popular and more important to online marketing in the mid-1990s, webmasters realized that the higher their search result, the more traffic they were likely to receive. Many early search engines used metadata tags to rate a page’s relevance to a particular set of keywords, which made them open to manipulation: webmasters entered keywords in the meta tags and other site elements that were not truly relevant to the site’s actual content. In other cases, keywords were simply broken or incomplete. This over-reliance on factors within the webmaster’s control led to an increase in irrelevant pages and spam in search results. Search engines responded by creating new ranking algorithms which were more difficult for webmasters to manipulate.

One of the most significant early developments was “BackRub”, which used a mathematical algorithm to rate the prominence and popularity of a website. BackRub was created at Stanford University by Larry Page and Sergey Brin (the founders of Google) and calculated the popular PageRank metric, which estimates the likelihood that a user randomly surfing the web will visit a page, based on the links pointing to it. This means that some links are stronger than others, as users are more likely to stumble upon some pages than others based on their inbound links.

The pair founded Google in 1998, which utilised the PageRank system. Google developed a large, loyal following of users thanks to its simple design and because, by using PageRank and other online and offline factors to rank pages, it could avoid the types of manipulation seen in other search engines. Despite this, webmasters were still able to game the engine using techniques similar to those that had worked against the Inktomi search engine. “Link farming” techniques included the mass buying and exchanging of links (often referred to as reciprocal or triangular linking, depending on the context) and the creation of thousands of sites for the sole purpose of link building.

As search engines continue to develop, they rely increasingly on offline data, which is difficult for webmasters and SEOs to manipulate. This includes regional and geographic data such as zip codes (and even telephone verification) for local search, as well as profile data like the age, sex, location and search history of users, in an attempt to return more relevant results. Google now claims to rank sites based on more than 200 different factors.

Common SEO Protocols

There are several protocols that have become standard to major search engines.

robots.txt – this is a generic protocol to restrict the activity of search engine bots within websites. The file is placed at the root of the website and is read by search engines before they commence crawling. Instructions inside the file indicate to bots which files they can and can’t crawl.
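As a quick illustration of the convention, the following Python sketch (standard library only) parses a small, made-up robots.txt and asks whether two hypothetical URLs may be crawled; the file contents and URLs are examples, not a recommendation.

from urllib.robotparser import RobotFileParser

# An illustrative robots.txt: block /private/ for all bots, allow everything else.
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://www.example.com/private/report.html"))  # False
print(parser.can_fetch("Googlebot", "https://www.example.com/blog/post.html"))       # True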

sitemap.xml – many search engines, including Google, Yahoo and MSN, support the XML Sitemaps protocol. This is an XML-formatted list of URLs to be crawled, which can be submitted to search engines. The protocol is designed to allow pages that are not linked from pages already in a search engine’s index to be found by crawlers.
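A minimal sitemap in that format can be generated with a few lines of standard-library Python; the URLs and dates below are placeholders.

import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
pages = [
    ("https://www.example.com/", "2009-06-01"),
    ("https://www.example.com/services/", "2009-05-20"),
]

# Build <urlset><url><loc>...</loc><lastmod>...</lastmod></url>...</urlset>
urlset = ET.Element("urlset", xmlns=NS)
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)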

International Search Engine Share

The dominance of search engines varies across geographic regions. For example, as of 2006 Google held around 40% of the US market but around 85-90% in Germany, and it is generally considered the world’s dominant search engine. However, Yandex is the leading search engine in Russia, as Baidu is in China. Despite their dominance in different regions, it is generally accepted that these search engines use similar technologies to rank pages.


Keyword Research

What keywords are people searching for?
Use the SEO Book keyword research tool to search for popular and long-tail keywords related to your industry. The tool provides daily search estimates and cross-references the Google Keyword Tool, Wordtracker, and other popular keyword research tools.


Keyword research tools are better at providing a qualitative measure than a quantitative measure, so don’t be surprised if actual traffic volumes vary greatly from the numbers suggested by these tools. When in doubt you can also set up a Google AdWords account to test the potential size of a market.

In addition to looking up search volumes for what keywords you think are important, also take the time to ask past customers how they found you, why they chose you, and what issues were important to them in choosing you.

You can also get keyword ideas by doing things like

checking your web analytics or server logs
looking at page contents of competing websites
looking through topical forums and community sites to see what issues people frequently discuss

Site Structure

How should you structure your site?
Before drafting content, consider which keywords are most important to you and map out how to create pages to fit each important group of keywords within your site theme and navigational structure, based on

market value
logical breaks in market segmentation
importance of ranking in building credibility / improving conversion rates
You may want to use an Excel spreadsheet or some other program to help you visualize your site structure; for example, you might align keyword groups with target pages for a section of a site focused on home based businesses, start ups, and franchise opportunities, as in the sketch below.
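If a spreadsheet program is not handy, the same keyword-to-page map can be kept in a plain CSV file; this small Python sketch writes one with made-up rows along the lines of the example above (column names, pages and priorities are assumptions, not a prescribed format).

import csv

# Each row pairs a keyword group with the page intended to target it.
site_map = [
    {"keyword group": "home based business",     "target page": "/home-based-business/", "priority": "high"},
    {"keyword group": "start up ideas",          "target page": "/start-ups/",           "priority": "medium"},
    {"keyword group": "franchise opportunities", "target page": "/franchises/",          "priority": "medium"},
]

with open("site-structure.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=["keyword group", "target page", "priority"])
    writer.writeheader()
    writer.writerows(site_map)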

Make sure

your most important categories or pages are linked to sitewide
you link to every page on your site from at least one other page on your site
you use consistent anchor text in your navigation
you link to other content pages (and especially to action items) from within the content area of your website
If you are uncertain how deep to make a portion of the site, start by creating a few high-quality pages on the topic. Based on market feedback, create more pages in the sections that are most valuable to your business.

On Page Optimization

It is hard to rank for keywords that do not appear in your page content, so each page should be organized around the goal of ranking for a specific keyword phrase, with some related phrases and related keywords mixed into the page copy.
Unique, descriptive page titles play a crucial role in a successful search engine optimization campaign. Page titles appear in the search results, and many people link to pages using the page title as their link anchor text.
If possible, create hand-crafted meta description tags which complement the page title by reinforcing your offer. If the relevant keywords for a page have multiple formats, it may make sense to focus the meta description on versions you did not use in the page title.

As far as page content goes, write for humans, and use heading tags to break the content into logical sections; this improves scannability and helps structure the document. When possible, make sure your page content uses descriptive modifiers as well.

Each page also needs to be sufficiently unique from other pages on your site. Do not let search engines index printer friendly versions of your content, or other pages where content is duplicate or nearly duplicate.
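To make the on-page checklist concrete, here is a rough Python sketch that extracts the title, meta description and headings from a page so you can confirm each page targets a distinct phrase; the sample HTML is invented and the parser is deliberately simplistic.

from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Collects the title, meta description and h1-h3 headings of a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.headings = []
        self._current = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.meta_description = attrs.get("content", "")
        elif tag in ("title", "h1", "h2", "h3"):
            self._current = tag

    def handle_endtag(self, tag):
        self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data.strip()
        elif self._current in ("h1", "h2", "h3") and data.strip():
            self.headings.append((self._current, data.strip()))

audit = OnPageAudit()
audit.feed("""<html><head><title>Blue Widgets | Example Co</title>
<meta name="description" content="Hand-made blue widgets, shipped worldwide.">
</head><body><h1>Blue Widgets</h1><h2>Why choose us</h2></body></html>""")
print(audit.title, audit.meta_description, audit.headings, sep="\n")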


How Does SEO Work?

SEO isn’t rocket science. In fact, it’s based on 4 key principles:

1. Identify the Right Keywords
2. Optimize the website
3. Optimize Inbound Links (Backlinks)
4. Measure Results and Repeat

Each of those 4 key principles has a lot of detail behind it, but everything comes back to them.

Identifying the “right” keywords: This is probably the most important research you can do, and the most important part of understanding how SEO works. There is simply no point in entering a search optimization campaign unless you know which keywords are being searched, how competitive those keywords are (i.e. how likely you are to win them, and how long it will take), which keywords drive conversions, and which drive traffic but not conversions. We have a very mathematical approach to keyword identification. Factors we consider include search volume, search engine result counts, keyword phrase use in title tags, Alexa rankings, and the Google PageRank of competitors. These factors are then mathematically combined into a Keyword Effectiveness Index (KEI), Keyword Performance Index (KPI), BLKEI ™ and BWKPI ™ – Download a sample Keyword Report.
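The exact BLKEI ™ and BWKPI ™ formulas are proprietary and not published here, but the general idea behind such indexes can be illustrated with the widely cited KEI formula (searches squared divided by competing results). The following Python sketch uses made-up numbers purely for illustration.

def kei(monthly_searches: int, competing_results: int) -> float:
    """Keyword Effectiveness Index: rewards demand, penalizes supply."""
    return (monthly_searches ** 2) / max(competing_results, 1)

# Hypothetical candidate phrases: (estimated monthly searches, competing result count).
candidates = {
    "seo services": (12_000, 45_000_000),
    "seo services for dentists": (900, 350_000),
}
for phrase, (searches, results) in candidates.items():
    print(f"{phrase}: KEI = {kei(searches, results):.2f}")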

On-Page Optimization: When a search engine bot (or spider) visits your site and indexes a page, it cannot “see” images. All it can do is read the text that appears on the page and try to identify what the subject matter of the page is. What the text says and how it is formatted are exceptionally important. Text on the page includes the title tag, meta keywords tag, meta description tag, image alt text, link title text, link anchor text, and of course body text. Formatting factors are also taken into consideration, including the use of bold, italics, H1 and other H# tags. The density of keyword phrases also matters: if a particular keyword phrase is not used enough, the search engine will most likely not determine that the page is about that subject; however, if the keyword density is too high (i.e. the keyword is used too often, or “stuffed”), the search engine may mark the page as spam. Again, our on-page optimization reports are based heavily on these mathematical factors. – Download a sample On-Page Optimization Report
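Keyword density itself is easy to compute: occurrences of the phrase divided by total words on the page. The sketch below shows one straightforward way to do it in Python; the sample text is invented, and any “too low” or “too high” threshold you compare the result against is a judgment call rather than a published search engine rule.

import re

def keyword_density(text: str, phrase: str) -> float:
    """Percentage of the page's words accounted for by occurrences of the phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    if not words or not phrase_words:
        return 0.0
    hits = sum(
        words[i:i + len(phrase_words)] == phrase_words
        for i in range(len(words) - len(phrase_words) + 1)
    )
    return 100.0 * hits * len(phrase_words) / len(words)

body = "Blue widgets are our specialty. We build blue widgets by hand."
print(f"{keyword_density(body, 'blue widgets'):.1f}%")  # roughly 36.4%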

Site Structure: Many factors in the way a site is constructed can affect its overall performance from an SEO perspective. One of the most common problems is duplicate content. While Google and other search engines are getting better at recognizing and handling duplicate content, it is much better to solve the problem on the website side than to leave it to the search engines to “decide” for you. Examples of duplicate content problems are: dynamic sites that return the same web page under different URLs, URL tracking codes that are carried through the site as a visitor navigates, and sites whose link structure bounces between HTTP and HTTPS pages with the same content. URL rewriting techniques can be used to correct these issues.
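The rewriting itself normally lives in server configuration, but the effect it aims for can be sketched in a few lines of Python: collapse every variant of a URL onto one canonical form by fixing the scheme and host and stripping session/tracking parameters. The parameter names below are assumptions for illustration.

from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical parameters that create duplicate URLs without changing the content.
TRACKING_PARAMS = {"sessionid", "utm_source", "utm_medium", "utm_campaign", "ref"}

def canonical(url: str) -> str:
    """Return one canonical form: https, lowercase host, no tracking parameters."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k.lower() not in TRACKING_PARAMS]
    return urlunsplit(("https", parts.netloc.lower(), parts.path or "/", urlencode(query), ""))

print(canonical("http://WWW.Example.com/products?id=7&sessionid=abc123"))
# -> https://www.example.com/products?id=7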

Additional site structure problems include: internal links that “link everything to everything”, SEO hazards (black hat techniques such as tiny text, text in the same color as a near-matching background, and many others), excessive links per page, and excessive outbound links.

Off-Page Optimization: While some search engines place more weight on content, Google (the current king of search) places heavier importance on inbound links (backlinks). These are the links to your site (and its sub-pages) from other websites. The quality and quantity of these links is exceptionally important. To see just how important they are, do a search on Google for the phrase “click here” (http://www.google.com/search?hl=en&q=click+here); as you can see, Adobe Reader is the first result. If you look at that page you will see that the phrase “click here” doesn’t appear anywhere in its text. Why does it rank so well when the content is not relevant? The answer: anchor text, the text used on the link itself. A lot of sites use the phrase “click here to download Adobe Reader”, and in Adobe’s case Google recognizes nearly half a million backlinks to that specific page (http://www.google.com/search?sourceid=navclient&ie=UTF-8&rlz=1T4HPNW_en&q=link:http%3a%2f%2fget.adobe.com%2freader%2f). The acquisition of quality backlinks can be critical to your overall success in SEO.

Where we get links: There are a number of ways to build quality backlinks: directory submissions, off-site blogging, press releases, working with your current vendors and customers to get links from their websites, testimonials for products that you use, sponsoring non-profit events, and more. Another consideration is community building. Is your product or service conducive to an online community format with message boards and forums? If so, this can be a fantastic way to have content (postings, etc.) built for you naturally by a community of site users, and it can often be a great source of links. Social networking sites like Twitter, Facebook, MySpace, LinkedIn and others can also be a source of strong communication, brand followings and, ultimately, inbound links from natural sources. Lastly, direct email requests to sites and blogs that represent your industry, asking for links and/or link exchanges, are another method of link building. This brief overview just scratches the surface of a successful link building campaign.

Measure, Measure, Measure: Traffic is useless unless it converts into leads and sales. We believe our responsibility isn’t just to drive traffic to our clients’ sites; we have to drive the right traffic to the right experience at the right moment. We have to help your customers find exactly what they are looking for and take them through that experience, from search trigger to completed action. When customers find our clients, our clients see results. This can only be accomplished through a strong, dedicated approach to analytics review and performance tracking. We have to know why a visitor came to the site, where they went, and what path best creates the desired outcome, and then adjust our efforts to that end. Regular reporting on SEO campaign performance, link performance, and changes in competitors is a requirement for successful SEO.


SEO Basics

Search Engine Optimization Made Easy

Search engine optimization (SEO) is the art and science of publishing and marketing information that ranks well in search engines like Google, Yahoo! Search, and Microsoft Bing. If you run into any new jargon while reading this Knol, consider looking up the definitions using the Search Engine Marketing Glossary.

By default, many search engines show only 10 results per page. Most searchers tend to click on the top few results. If you rank at the top of the Search Engine Results Page (SERP), business is good, but if you are on the second or third page you might only get 1% of the search traffic that the top ranked site gets.
The two most powerful aspects of search engine marketing are:
users type what they want to find into search boxes, making search engines the most precisely targeted marketing medium in the history of the world
once you gain traction in the search results the incremental costs of gaining additional exposure are negligible when compared with the potential rewards, allowing individuals and small businesses to compete with (and perhaps eventually become) large corporations

While many people consider SEO to be complicated, I believe that SEO is nothing but an extension of traditional marketing. Search engine optimization consists of 9 main steps:
market research
keyword research
on-page optimization
site structure
link building
brand building
viral marketing
adjusting
staying up to date

Market Research

Do you have what it takes to compete in a market?
The first step is to search the major search engines to see what types of websites are ranking for words which you deem to be important. For example, if mostly colleges, media, and government institutions are ranking for your most important terms it may be difficult to rank for those types of queries. If, on the other hand, the market is dominated by fairly average websites which are not strongly established brands, it may be a market worth pursuing.
You can extend the research you get from the search results by using the SEO for Firefox extension with the Firefox browser. This places many marketing data points right in the search results, letting you see things like

site age
Google PageRank
inbound link count
if any governmental or educational sites link at their site
if they are listed in major directories
if bloggers link at their sites
The blue area under this CreditCards.com listing shows a wide array of marketing information.


What is Link Building?

Search engines view links as votes, with some votes counting more than others. To get high quality links (that help your site rank better) you need to participate in the social aspects of your community and give away valuable unique content that people talk about and share with others. The below Google TouchGraph image shows a small graphic representation of sites in the search field that are related to SeoBook.com based on linking patterns.

In this post Matt Cutts suggested that Google is getting better at understanding link quality. Search engines want to count quality editorial votes as links that help influence their relevancy algorithms.

Link building tips
try to link to your most relevant page when getting links (don’t point all the links at your home page)
mix your anchor text
use Yahoo! Site Explorer and other tools to analyze top competing backlinks
don’t be afraid to link out to relevant high quality resources
Link building strategies
submit your site to general directories like DMOZ, the Yahoo! Directory, and Business.com
submit your site to relevant niche directories
here is more background on directories and SEO
if you have a local site, submit to relevant local sites (like the local chamber of commerce)
join trade organizations
get links from industry hub sites
create content people would want to link at
here is a list of 101 useful link building strategies

Brand Building

Brand-related search queries tend to be some of the most targeted, best-converting, and most valuable keywords. As you gain mindshare, people will be more likely to search for your brand or keywords related to it. A high volume of brand-related search traffic may also be seen as a sign of quality by major search engines.
If you build a strong brand, then when people search for more information about it and other websites have good things to say about it, those interactions reinforce your brand image and improve your lead quality and conversion rates.

Things like advertising and community activity are easy ways to help improve your brand exposure, but obviously branding is a lot more complicated than that. One of my favorite books about branding is Rob Frankel’s The Revenge of Brand X.

Viral Marketing

Link building is probably the single hardest and most time consuming part of an effective SEO campaign, largely because it requires influencing other people. But links are nothing but a remark or citation. Seth Godin’s Purple Cow is a great book about being remarkable.

The beautiful thing about viral marketing is that creating one popular compelling idea can lead to thousands of free quality links. If your competitor is building one link at a time and you have thousands of people spreading your ideas for you for free then you are typically going to end up ranking better.
In SEO many people create content based around linking opportunities. Many of us refer to this as Link Baiting. You can learn link baiting tips from
SEO Book
Stuntdubl
Performancing
Copyblogger
Wolf Howl
You can search social news or social bookmarking sites like Digg or Del.icio.us to see what stories related to your topic became popular.

Measuring Results

Half the money I spend on advertising is wasted; the trouble is I don’t know which half.
– John Wanamaker

How can I tell if my SEO campaign is effective? The bottom line is what counts. Is your site generating more leads, higher quality leads, or more sales? What keywords are working? You can look at your server logs and an analytics program to track traffic trends and what keywords lead to conversion.
Outside of traffic, another good sign that you are on the right track is seeing more websites asking questions about you or talking about you. If you start picking up high-quality unrequested links, you might be near a tipping point where your marketing starts to build on itself.
Search engines follow people, but lag actual market conditions. It may take search engines a while to find all the links pointing at your site and analyze how well your site should rank. Depending on how competitive your marketplace is, it may take anywhere from a few weeks to a couple of years to establish a strong market position. Rankings can be a moving target, since at any point in time,

you are marketing your business
competitors are marketing their businesses and reinvesting profits into building out their SEO strategy
search engines may change their relevancy algorithms


What is Pay Per Click (PPC)?

Pay per click (PPC) (also called Cost per click) is an Internet advertising model used to direct traffic to websites, where advertisers pay the publisher (typically a website owner) when the ad is clicked. With search engines, advertisers typically bid on keyword phrases relevant to their target market. Content sites commonly charge a fixed price per click rather than use a bidding system. PPC “display” advertisements are shown on web sites with related content that have agreed to show ads. This approach differs from the “pay per impression” methods used in television and newspaper advertising.
In contrast to the generalized portal, which seeks to drive a high volume of traffic to one site, PPC implements the so-called affiliate model, that provides purchase opportunities wherever people may be surfing. It does this by offering financial incentives (in the form of a percentage of revenue) to affiliated partner sites. The affiliates provide purchase-point click-through to the merchant. It is a pay-for-performance model: If an affiliate does not generate sales, it represents no cost to the merchant. Variations include banner exchange, pay-per-click, and revenue sharing programs.
Websites that utilize PPC ads will display an advertisement when a keyword query matches an advertiser’s keyword list, or when a content site displays relevant content. Such advertisements are called sponsored links or sponsored ads, and appear adjacent to or above organic results on search engine results pages, or anywhere a web developer chooses on a content site.[1]
Among PPC providers, Google AdWords, Yahoo! Search Marketing, and Microsoft adCenter are the three largest network operators, and all three operate under a bid-based model.[1]
The PPC advertising model is open to abuse through click fraud, although Google and others have implemented automated systems[2] to guard against abusive clicks by competitors or corrupt web developers.[3]

There are two primary models for determining cost per click: flat-rate and bid-based. In both cases the advertiser must consider the potential value of a click from a given source. This value is based on the type of individual the advertiser is expecting to receive as a visitor to his or her website, and what the advertiser can gain from that visit, usually revenue, both in the short term as well as in the long term. As with other forms of advertising targeting is key, and factors that often play into PPC campaigns include the target’s interest (often defined by a search term they have entered into a search engine, or the content of a page that they are browsing), intent (e.g., to purchase or not), location (for geo targeting), and the day and time that they are browsing.
Flat-rate PPC
In the flat-rate model, the advertiser and publisher agree upon a fixed amount that will be paid for each click. In many cases the publisher has a rate card that lists the Cost Per Click (CPC) within different areas of their website or network. These various amounts are often related to the content on pages, with content that generally attracts more valuable visitors having a higher CPC than content that attracts less valuable visitors. However, in many cases advertisers can negotiate lower rates, especially when committing to a long-term or high-value contract.
The flat-rate model is particularly common to comparison shopping engines, which typically publish rate cards.[4] However, these rates are sometimes minimal, and advertisers can pay more for greater visibility. These sites are usually neatly compartmentalized into product or service categories, allowing a high degree of targeting by advertisers. In many cases, the entire core content of these sites is paid ads.
Bid-based PPC
In the bid-based model, the advertiser signs a contract that allows them to compete against other advertisers in a private auction hosted by a publisher or, more commonly, an advertising network. Each advertiser informs the host of the maximum amount that he or she is willing to pay for a given ad spot (often based on a keyword), usually using online tools to do so. The auction plays out in an automated fashion every time a visitor triggers the ad spot.
When the ad spot is part of a search engine results page (SERP), the automated auction takes place whenever a search for the keyword that is being bid upon occurs. All bids for the keyword that target the searcher’s geo-location, the day and time of the search, etc. are then compared and the winner determined. In situations where there are multiple ad spots, a common occurrence on SERPs, there can be multiple winners whose positions on the page are influenced by the amount each has bid. The ad with the highest bid generally shows up first, though additional factors such as ad quality and relevance can sometimes come into play (see Quality Score).
In addition to ad spots on SERPs, the major advertising networks allow for contextual ads to be placed on the properties of 3rd-parties with whom they have partnered. These publishers sign up to host ads on behalf of the network. In return, they receive a portion of the ad revenue that the network generates, which can be anywhere from 50% to over 80% of the gross revenue paid by advertisers. These properties are often referred to as a content network and the ads on them as contextual ads because the ad spots are associated with keywords based on the context of the page on which they are found. In general, ads on content networks have a much lower click-through rate (CTR) and conversion rate (CR) than ads found on SERPs and consequently are less highly valued. Content network properties can include websites, newsletters, and e-mails.[5]
Advertisers pay for each click they receive, with the actual amount paid based on the amount bid. It is common practice amongst auction hosts to charge a winning bidder just slightly more (e.g. one penny) than the next highest bidder or the actual amount bid, whichever is lower.[6] This avoids situations where bidders are constantly adjusting their bids by very small amounts to see if they can still win the auction while paying just a little bit less per click.
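That pricing rule is simple enough to sketch directly; the following Python snippet charges the winner a penny more than the runner-up, capped at the winner's own bid (real ad auctions also factor in quality scores, which this ignores, and the advertiser names and bids are made up).

def click_price(bids: dict[str, float]) -> tuple[str, float]:
    """Return (winner, price per click): next-highest bid + $0.01, capped at the winner's bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, winning_bid = ranked[0]
    runner_up = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, round(min(winning_bid, runner_up + 0.01), 2)

print(click_price({"advertiser_a": 1.50, "advertiser_b": 0.90, "advertiser_c": 0.25}))
# -> ('advertiser_a', 0.91)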
To maximize success and achieve scale, automated bid management systems can be deployed. These systems can be used directly by the advertiser, though they are more commonly used by advertising agencies that offer PPC bid management as a service. These tools generally allow for bid management at scale, with thousands or even millions of PPC bids controlled by a highly automated system. The system generally sets each bid based on the goal that has been set for it, such as maximize profit, maximize traffic at breakeven, and so forth. The system is usually tied into the advertiser’s website and fed the results of each click, which then allows it to set bids. The effectiveness of these systems is directly related to the quality and quantity of the performance data that they have to work with – low-traffic ads can lead to a scarcity of data problem that renders many bid management tools useless at worst, or inefficient at best.
