
Website Analytics Tips for SEO

Data is your friend, and I don't mean the guy from Star Trek

Ask any SEO and they will probably tell you that the data in your log files is vital to the success of your online venture. But did you know that many people are afraid of data?

Fear of Data, or FoD, is an affliction that can strike anyone, regardless of how intelligent they are. The symptoms include a look of confusion, a marked state of panic, or an uncontrollable shaking of the head while uttering the words "what the heck do all these numbers even mean". These symptoms are universal and can begin any time you need to look under the hood of a website and figure out where the traffic is coming from and what those guests are doing during their visits.

In order to understand website analytics we need to first know what to look for and how it can help us. The following are the top ten most important analytics and what they mean to your success as a website owner or SEO.
1. Keyword Search Referrals is probably the biggest. Mining this data for the long tail and other nuggets of insight is invaluable; I have had websites where 80% of my traffic strategy was gleaned from studying it. Remember, data does not lie: even if you see something unexpected in these logs, make sure to follow up. (A minimal log-parsing sketch covering items 1-3 appears after this list.)

2. Top Referrals is another high-value SEO/SEM guiding analytic. This tells you which search engines are referring the most visitors, and it even breaks the numbers down further so you can see, say, how many come from Google Images versus Google Search.

3. Top Viewed URLs is a virtual goldmine of information because it lets you see which content on your site is the most popular, and it can help identify important SEO opportunities. I have seen client websites that garner the majority of their traffic directly to categories on their blogs, even more than to the main page, and because of that relevance they convert extremely well.

4. Visit Duration is a tidbit of information with a ton of value, as it tells you how happy your visitors are with the site. If you notice a high instance of short visits, like 95% staying for 30 seconds or less, then you have what is called a bounce rate problem and you need to work on the look, feel and/or content of your site. Sometimes a quick stay is all a visitor needs to view a picture they searched for and click on a money-earning advertisement, but generally speaking, the longer your visitors linger on your site, the better!

5. Total Visits is a useful statistic for judging the overall popularity of your website and tells you how many visits your website receives. Generally, we like to look more closely at actual unique visitors, because those are the 'bread and butter' of any growing site.

6. Visits per Host can tell you a little more about the demographics of your traffic. You might notice a high instance of visitors from certain hosting addresses, and this can mean many things. One time I had a slew of visitors in a single month from an IP address belonging to the local telephone company; it turned out that one of my posts had gone viral via inter-office email.

7. File Type gives you information on how much bandwidth is being used by each file type that loads. It can help you target potential problems with high bandwidth costs, and even poor user experience, if you notice certain memory-intensive graphics slowing down the website.

8. Visitors/Country is a good way to track where on the planet your visitors are coming from. This can be useful when planning content: if you see that a high majority of your visitors come from a certain geographical area, you can leverage that knowledge and write more topical content to spread your website's appeal even further.

9. Total Data Transferred can be helpful when you have hosting problems and need a quick idea of how traffic-heavy your content is, the most common issue being high bandwidth usage. Perhaps a majority of your visitors are watching bandwidth-heavy movies or Flash files.

10. Total Pages Viewed is handy for determining whether your site has some stickiness (meaning visitors stick around and look at more than one page). The more pages each visitor looks at, the better it is for you, because they obviously found your site compelling enough to spend some time reading the content.
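
To make items 1-3 concrete, here is a minimal log-mining sketch. The log path and the Apache/Nginx "combined" log format are assumptions; adjust both to your own server.

```python
# Minimal sketch: mine keyword referrals and top pages from an access log.
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

LOG_PATH = "access.log"  # hypothetical path; point this at your own log
# Combined format: ... "GET /page HTTP/1.1" status size "referrer" "agent"
LINE_RE = re.compile(r'"(?:GET|POST) (?P<path>\S+) [^"]*" \d+ \S+ "(?P<ref>[^"]*)"')

keywords, pages = Counter(), Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        m = LINE_RE.search(line)
        if not m:
            continue
        pages[m.group("path")] += 1
        ref = urlparse(m.group("ref"))
        # Pull the search query out of engine referrer URLs (the "q" parameter).
        if "google." in ref.netloc or "bing." in ref.netloc:
            query = parse_qs(ref.query).get("q", [""])[0]
            if query:
                keywords[query.lower()] += 1

print("Top keyword referrals:", keywords.most_common(10))
print("Top viewed URLs:", pages.most_common(10))
```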

So if you have FoD, like so many webmasters do, just remember how much value the data contains. Once you realize how valuable it is, and experience how much it can help you and your clients, you will be like me and move on to a new addiction... LoD (Love of Data).


SEO (Search Engine Optimisation)

SEO (Search Engine Optimisation) is the study and business of improving search results. This article includes a summary of search engine and SEO history, common SEO protocols, information on international search engine markets, and the types of SEO.

Definition of SEO

SEO (Search Engine Optimisation, or Search Engine Optimiser) refers to the study and business of improving rankings in (unpaid) search engine results. Depending on the end users' geographic location and interest/industry, SEO tends to target different search engines and types of search, including multimedia search (image, video, audio), industry-specific 'vertical' search, local search and social search. In most cases SEO is performed under the assumption that a better position in SERPs (Search Engine Result Pages) will increase traffic to a website.
SEO is considered an online marketing strategy for increasing website traffic from search engine users. It works under the assumption that the higher a site ranks for a particular query in SERPs, the more search engine users will follow its links. Although there are many large networks related to SEO, such as SMX, there is no official body that represents the SEO industry.
Furthermore, search engines generally don't publish the rules of their search 'algorithms', which are used to determine the rank of a particular site or link. Extensive experimentation into SEO techniques and patents held by search engines has been documented and publicly released by prominent SEOs including Rand Fishkin, Barry Schwartz, Aaron Wall and Jill Whalen. However, many SEO methods and metrics are widely disputed.
Despite this, there are a few strategies that are commonly used in SEO. Typically, SEO strategies consider how search engine ranking algorithms 'work' based on observation of different techniques in search engines and an understanding of the types of algorithms used. SEOs (Search Engine Optimisers) also work under the assumption that search engines use automated bots (also known as 'spiders' and 'crawlers') to index the contents of websites and related material such as maps data and multimedia resources. Therefore, SEOs will generally try to reduce the factors that might impede the 'crawling' of pages; for example, most search engines cannot index content behind forms, leading to the 'deep web'.

Types of SEO

SEO campaigns are typically classified into two schools, known as 'white hat' and 'black hat' SEO.

White hat SEO is a term associated with improving search ranking through increased ‘relevance’ via fixing site problems in information architecture, structure, coding and problems for search engine bot activity. This type of SEO is usually performed with the intention of creating long term results through improving site performance and relevance.

Black hat SEO may incorporate techniques used in 'white hat' SEO; however, it usually has connotations of manipulative practices used to 'trick' search engines into ranking content highly. A common name for these practices is 'spamdexing', or the indexing of spam content. Examples of black hat SEO techniques include 'keyword stuffing', the repeated injection of useless or unrelated keywords into a web page, and 'link farming', the 'unnatural' linking of websites together in order to build a large number of links to a page.

The main difference between 'white hat' and 'black hat' SEOs is that white hat SEOs tend to follow the guidelines provided by search engines, while black hat SEOs do not. As a result, search engines may ban sites which employ black hat techniques.

History

According to industry analyst Danny Sullivan, the earliest known use of the phrase search engine optimization was a spam message posted on Usenet on July 26, 1997. However, it is generally accepted that webmasters have been optimizing websites for search engines since the early 1990s, when the first internet search engines were created.

Early 'SEO' practices involved submitting a URL to a search engine in order to have a page crawled and indexed. Typically, search engines (still) collect this information into a cache on the search engine company's servers, which is then processed using algorithms to determine the 'weight' and 'relevance' of the page to a particular set of words used in search queries. Links from these pages are then scheduled for 're-crawling' at a later date.

As search engines became more popular and important to online marketing in the mid 1990s, webmasters realized that the higher their search result, the more traffic they were likely to receive. Many early search engines used meta data tags to rate a page's relevance to a particular set of keywords, which made them open to manipulation, as webmasters entered keywords in the meta tag and other site elements that were not truly relevant to the site's actual content. In other cases, keywords were simply broken or incomplete. This over-reliance on factors within the webmaster's control led to an increase in irrelevant pages and spam in search results. Search engines responded by creating new ranking algorithms which would be more difficult for webmasters to manipulate.

One of the most significant early developments was "Backrub", which used a mathematical algorithm to rate the prominence and popularity of a website. Backrub was created by Larry Page and Sergey Brin (the creators of Google) at Stanford University, and calculated the popular PageRank metric, which estimates the likelihood of users visiting a page by randomly surfing the web, based on the number of links to that page. This means that some links are stronger than others, as users are more likely to reach some sites randomly based on the inbound links to those pages.

The pair founded Google in 1998, which utilised the PageRank system. Google developed a large, loyal following of users thanks to its simple design, and because, by using PageRank and other online and offline factors to rank pages, it could avoid the types of manipulation seen in other search engines. Despite this, webmasters were able to game the search engine using techniques similar to those used against the Inktomi search engine. "Link farming" techniques used included the mass buying and exchanging of links (often referred to as reciprocal or triangular, depending on the context) and the creation of thousands of sites for the sole purpose of link building.

As search engines continue to develop, they are increasingly reliant on offline data which is difficult for webmasters and SEOs to manipulate. This includes regional/geographic data such as zip codes (and even telephone verification) for local search. It also includes profile data like the age, sex, location and search history of users, in an attempt to return more relevant results. Google now claims to rank sites based on more than 200 different factors.

Common SEO Protocols

There are several protocols that have become standard to major search engines.

robots.txt – this is a generic protocol to restrict the activity of search engine bots within websites. The file is placed at the root of the website and is read by search engines before they commence crawling. Instructions inside the file indicate to bots which files they can and can’t crawl.
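
As a sketch of how a crawler interprets these rules, Python's standard library ships a robots.txt parser; the rules below are a made-up example.

```python
# A minimal sketch of the robots.txt protocol using Python's stdlib parser.
from urllib.robotparser import RobotFileParser

# Example rules: block all bots from /private/, allow everything else.
rules = """
User-agent: *
Disallow: /private/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/public.html"))        # True
```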

sitemap.xml – many search engines, including Google, Yahoo and MSN, support the XML sitemaps protocol. This is an XML-formatted list of files to be crawled, which can be submitted to search engines. The protocol is designed to allow crawlers to find pages that are not linked from pages already in a search engine's index.
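
A minimal sketch of generating such a file; the URLs are placeholders, while the namespace is part of the sitemaps protocol.

```python
# Generate a minimal sitemap.xml for a handful of URLs.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for loc in ["https://example.com/", "https://example.com/about"]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```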

International Search Engine Share

The dominance of search engines varies across geographic regions. For example, as of 2006 Google held around 40% of the US market but around 85-90% in Germany, and is generally considered the world's dominant search engine. However, in Russia Yandex is the leading search engine, as Baidu is in China. Despite their dominance in different regions, it is generally accepted that these search engines use similar technologies to rank pages.


Keyword Research

What keywords are people searching for?
Use the SEO Book Keyword research tool to search for popular and Long Tail keywords related to your industry. This tool cross references the Google Keyword Tool, Wordtracker, and other popular keyword research tools. Notice how our keyword tool provides daily search estimates and cross references other useful keyword research tools.


Keyword research tools are better at providing a qualitative measure than a quantitative measure, so don’t be surprised if actual traffic volumes vary greatly from the numbers suggested by these tools. When in doubt you can also set up a Google AdWords account to test the potential size of a market.

In addition to looking up search volumes for what keywords you think are important, also take the time to ask past customers how they found you, why they chose you, and what issues were important to them in choosing you.

You can also get keyword ideas by doing things like

checking your web analytics or server logs
looking at page contents of competing websites (a rough scraping sketch follows this list)
looking through topical forums and community sites to see what issues people frequently discuss
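
As a rough, hypothetical sketch of the second idea, you can count the most frequent terms in a competitor page's visible text. The URL and stop-word list are placeholders, and real crawling should respect robots.txt and rate limits.

```python
# Surface keyword ideas from the visible text of a competitor page.
import re
from collections import Counter
from urllib.request import urlopen

html = urlopen("https://example.com/").read().decode("utf-8", errors="replace")
# Strip scripts, styles, and tags; keep only the visible text.
text = re.sub(r"<script.*?</script>|<style.*?</style>|<[^>]+>", " ", html,
              flags=re.S | re.I)
words = re.findall(r"[a-z]{4,}", text.lower())
stop = {"with", "this", "that", "from", "your", "have", "more", "page"}
print(Counter(w for w in words if w not in stop).most_common(15))
```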

Site Structure

How should you structure your site?
Before drafting content, consider which keywords are most important to you and map out how to create pages to fit each important group of keywords within your site theme and navigational structure, based on

market value
logical breaks in market segmentation
importance of ranking in building credibility / improving conversion rates
You may want to use an Excel spreadsheet or some other program to help you visualize your site structure. This mini-screenshot from an Excel spreadsheet shows example data for how you might align keywords for a section of a site focused on home based businesses, start ups, and franchise opportunities.
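
If a spreadsheet feels heavyweight, a keyword-to-page map is also easy to generate programmatically. This sketch writes a CSV you can open in Excel; the sections and keywords are made-up examples.

```python
# Write a keyword-to-page map as a CSV file.
import csv

keyword_map = {
    "/home-business/": ["home based business", "work from home"],
    "/startups/":      ["start up ideas", "how to start a business"],
    "/franchises/":    ["franchise opportunities", "buy a franchise"],
}

with open("site_keyword_map.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["page", "target keywords"])
    for page, kws in keyword_map.items():
        writer.writerow([page, "; ".join(kws)])
```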

Make sure

your most important categories or pages are linked to sitewide
you link to every page on your site from at least one other page on your site
you use consistent anchor text in your navigation
you link to other content pages (and especially to action items) from within the content area of your website
If you are uncertain how deep to make a portion of the site, start by creating a few high quality pages on the topic. Based on market feedback create more pages in the sections that are most valuable to your business.

On Page Optimization

It is hard to rank for keywords that do not appear in your page content, so each page should be organized around the goal of ranking for a specific keyword phrase, with some related phrases and related keywords mixed into the page copy.
Unique, descriptive page titles play a crucial role in a successful search engine optimization campaign. Page titles appear in the search results, and many people link to pages using the page title as their link anchor text.
If possible create hand crafted meta description tags which complement the page title by reinforcing your offer. If the relevant keywords for a page have multiple formats it may make sense to help focus the meta description on versions you did not use in the page title.

As far as page content goes, make sure you write for humans, and use heading tags to help break the content into logical sections; this will improve scannability and help structure the document. When possible, make sure your page content uses descriptive modifiers as well.

Each page also needs to be sufficiently unique from other pages on your site. Do not let search engines index printer friendly versions of your content, or other pages where content is duplicate or nearly duplicate.
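
A quick, hedged sketch for auditing titles and meta descriptions so you can check they are unique and keyword-focused. The URL is a placeholder, and the meta regex assumes the name attribute precedes content.

```python
# Pull the title and meta description from a page for a uniqueness check.
import re
from urllib.request import urlopen

html = urlopen("https://example.com/").read().decode("utf-8", errors="replace")

title = re.search(r"<title[^>]*>(.*?)</title>", html, re.S | re.I)
desc = re.search(
    r'<meta[^>]+name=["\']description["\'][^>]+content=["\']([^"\']*)', html, re.I)

print("Title:", title.group(1).strip() if title else "MISSING")
print("Meta description:", desc.group(1).strip() if desc else "MISSING")
```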


How Does SEO Work?

SEO isn't rocket science. In fact it's based on 4 key principles:

1. Identify the Right Keywords
2. Optimize the website
3. Optimize Inbound Links (Backlinks)
4. Measure Results and Repeat

Each of those 4 key principles has a lot of details, but everything comes back to them.

Identifying the "right" keywords: This process is probably the most important research you can do, and the most important part of understanding how SEO works. There is simply no point in entering into a search optimization campaign unless you know which keywords are being searched, how competitive the keywords are (i.e. how likely and how long it will take to win them), which keywords drive conversions, and which keywords drive traffic but not conversions. We have a very mathematical approach to keyword identification. Factors we consider are search volume, search engine result counts, keyword phrase use in title tags, Alexa Rankings, and the Google PageRank of competitors. These factors are then mathematically combined into a Keyword Effectiveness Index (KEI), Keyword Performance Index (KPI), BLKEI ™ and a BWKPI ™ – Download a sample Keyword Report.
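
The BLKEI™/BWKPI™ blends are proprietary and not published, so as a stand-in illustration, here is the classic Keyword Effectiveness Index, which combines search volume and competing-result counts; all figures below are invented.

```python
# Classic KEI: (monthly searches)^2 / (competing result count).
def kei(monthly_searches: int, competing_pages: int) -> float:
    """Higher means more demand relative to competition."""
    return monthly_searches ** 2 / max(competing_pages, 1)

candidates = {  # made-up example figures
    "seo services": (5400, 2_000_000),
    "seo services chicago": (320, 45_000),
}
for phrase, (searches, comp) in sorted(
        candidates.items(), key=lambda kv: -kei(*kv[1])):
    print(f"{phrase}: KEI = {kei(searches, comp):.2f}")
```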

On-Page Optimization: When a search engine (bot or spider) visits your site and indexes a page, it cannot "see" images. All it can do is read the text that appears on the page to try to identify what the subject matter of the page is. What the text says and how that text is formatted is exceptionally important. Text on the page includes title tags, meta description tags, image alt text, link title text, link anchor text, and of course body text. Formatting factors are also taken into consideration, including the usage of bold, italics, H1 and other H# tags. The density of keyword phrases is also exceptionally important. If a particular keyword phrase is not used enough, then the search engine will most likely not determine that the page is about that subject. However, if the keyword density is too high (i.e. the keyword is used too often, or "stuffed"), then the search engine may mark the page as spam. Again, our on-page optimization reports are based heavily on these mathematical factors. – Download a sample On-Page Optimization Report
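
A minimal sketch of the density idea described above: the share of page words taken up by a target phrase. The thresholds anyone applies to this number are judgment calls, not published search engine rules.

```python
# Keyword density: occurrences of a phrase as a share of total page words.
import re

def keyword_density(text: str, phrase: str) -> float:
    words = re.findall(r"[a-z']+", text.lower())
    hits = len(re.findall(re.escape(phrase.lower()), " ".join(words)))
    phrase_len = len(phrase.split())
    return 100.0 * hits * phrase_len / max(len(words), 1)

body = "Fresh garden lettuce. Our lettuce is picked daily. Buy lettuce online."
print(f"{keyword_density(body, 'lettuce'):.1f}% density")  # ~27%: stuffed!
```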

Site Structure – Many factors in the way a site is constructed and structured can affect its overall performance from an SEO perspective. One of the most common problems is duplicate content. While Google and other search engines are getting better at recognizing and handling duplicate content, it is much better to solve the problem on the website construction side than to leave it to the search engines to "decide" for you. Examples of duplicate content problems are: dynamic sites that return the same web page under different URLs, URL tracking codes that are carried throughout the site as a visitor navigates, and sites whose link structure bounces between HTTP and HTTPS pages with the same content. URL rewriting techniques can be utilized to correct these issues.
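
As one illustration of solving this at the application level, here is a minimal URL-normalisation sketch; the tracking parameter names and the always-HTTPS policy are assumptions to adapt to your own site.

```python
# Normalise every URL to one canonical form to avoid duplicate-content URLs.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def canonicalise(url: str) -> str:
    parts = urlsplit(url)
    # Drop tracking parameters, sort the rest for a stable query string.
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if k.lower() not in TRACKING_PARAMS]
    return urlunsplit(("https", parts.netloc.lower(),
                       parts.path or "/", urlencode(sorted(query)), ""))

print(canonicalise("http://Example.com/page?utm_source=x&id=7"))
# -> https://example.com/page?id=7
```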

Additional site structure problems include internal links that "link everything to everything", SEO hazards (black hat techniques such as tiny text, text in the same color as a near-identical background, and many others), excessive links per page, and excessive outbound links.

Off-Page Optimization: While some search engines place heavier importance on content, Google (the current king of search) places heavier importance on inbound links (backlinks). These are the links to your site (and sub-pages) from other websites. The quality and quantity of these links is exceptionally important. To see just how important these inbound links are, do a search on Google for the phrase "Click Here" http://www.google.com/search?hl=en&q=click+here – as you can see, Adobe Reader is the first result. If you look at the page you will see that the phrase "click here" doesn't appear anywhere in the text. Why does it rank so well when the content is not relevant? The answer: anchor text. Anchor text is the text used on the link itself, and a lot of sites use the phrase "click here to download adobe reader". In the case of Adobe, Google recognizes nearly half a million backlinks to that specific page: http://www.google.com/search?sourceid=navclient&ie=UTF-8&rlz=1T4HPNW_en&q=link:http%3a%2f%2fget.adobe.com%2freader%2f . The acquisition of quality backlinks can be critical to your overall success in SEO.

Where we get links: There are a number of ways to build quality backlinks: directory submissions, off-site blogging, press releases, working with your current vendors and customers to get links from their websites, testimonials for products that you use, sponsoring non-profit events, and more. Another consideration is community building. Is your product or service conducive to an online community format with message boards and forums? If so, this can be a fantastic way to have content (postings, etc.) built for you naturally by a community of site users, and it can often be a great source of links. Social networking sites like Twitter, Facebook, MySpace, LinkedIn and others can also be a source of strong communications, building brand followings and, ultimately, inbound links from natural sources. Lastly, direct email requests to sites and blogs that represent your industry, asking for links and/or link exchanges, is the final method of link building. This brief overview just scratches the surface of a successful link building campaign.

Measure, Measure, Measure: Traffic is useless unless that traffic converts into leads and sales. We believe it isn't our responsibility just to drive traffic to our clients' sites; instead we have to drive the right traffic through to the right experience at the right moment. We have to help your customers find exactly what they are looking for, and take them through that experience, from a search trigger to a completed action. When customers find our clients, our clients see results. This can only be accomplished through a strong and dedicated approach to analytics review and performance tracking. We have to know why the visitor visited the site, where they went, and what pathing best creates the desired outcome, and then adapt our efforts to that end. Regular reporting on SEO campaign performance, link performance, and changes among competitors is a requirement of successful SEO.


SEO Basics

Search Engine Optimization Made Easy

Search engine optimization (SEO) is the art and science of publishing and marketing information that ranks well in search engines like Google, Yahoo! Search, and Microsoft Bing. If you run into any new jargon while reading this Knol, consider looking up definitions in the Search Engine Marketing Glossary.

By default, many search engines show only 10 results per page. Most searchers tend to click on the top few results. If you rank at the top of the Search Engine Results Page (SERP), business is good, but if you are on the second or third page you might only get 1% of the search traffic that the top-ranked site gets.
The two most powerful aspects of search engine marketing are:
users type what they want to find into search boxes, making search engines the most precisely targeted marketing medium in the history of the world
once you gain traction in the search results the incremental costs of gaining additional exposure are negligible when compared with the potential rewards, allowing individuals and small businesses to compete with (and perhaps eventually become) large corporations

While many people consider SEO to be complicated, I believe that SEO is nothing but an extension of traditional marketing. Search engine optimization consists of 9 main steps:
market research
keyword research
on-page optimization
site structure
link building
brand building
viral marketing
adjusting
staying up to date

Market Research

Do you have what it takes to compete in a market?
The first step is to search the major search engines to see what types of websites are ranking for words which you deem to be important. For example, if mostly colleges, media, and government institutions are ranking for your most important terms it may be difficult to rank for those types of queries. If, on the other hand, the market is dominated by fairly average websites which are not strongly established brands, it may be a market worth pursuing.
You can extend the research you get from the search results by using the SEO for Firefox extension with the Firefox browser. This places many marketing data points right in the search results, letting you see things like

site age
Google PageRank
inbound link count
if any governmental or educational sites link at their site
if they are listed in major directories
if bloggers link at their sites
The blue area under this CreditCards.com listing shows a wide array of marketing information.


What is Link Building?

Search engines view links as votes, with some votes counting more than others. To get high quality links (that help your site rank better) you need to participate in the social aspects of your community and give away valuable unique content that people talk about and share with others. The below Google TouchGraph image shows a small graphic representation of sites in the search field that are related to SeoBook.com based on linking patterns.

In this post Matt Cutts suggested that Google is getting better at understanding link quality. Search engines want to count quality editorial votes as links that help influence their relevancy algorithms.

Link building tips
try to link to your most relevant page when getting links (don’t point all the links at your home page)
mix your anchor text
use Yahoo! Site Explorer and other tools to analyze top competing backlinks
don’t be afraid to link out to relevant high quality resources
Link building strategies
submit your site to general directories like DMOZ, the Yahoo! Directory, and Business.com
submit your site to relevant niche directories
here is more background on directories and SEO
if you have a local site, submit to relevant local sites (like the local chamber of commerce)
join trade organizations
get links from industry hub sites
create content people would want to link at
here is a list of 101 useful link building strategies

Brand Building

Brand-related search queries tend to be some of the most targeted, best converting, and most valuable keywords. As you gain mindshare, people will be more likely to search for your brand or keywords related to your brand. A high volume of brand-related search traffic may also be seen as a sign of quality by major search engines.
If you build a strong brand, then when people search for more information about it, and other websites have good things to say about it, these interactions help to reinforce your brand image and improve your lead quality and conversion rates.

Things like advertising and community activity are easy ways to help improve your brand exposure, but obviously branding is a lot more complicated than that. One of my favorite books about branding is Rob Frankel’s The Revenge of Brand X.

Viral Marketing

Link building is probably the single hardest and most time consuming part of an effective SEO campaign, largely because it requires influencing other people. But links are nothing but a remark or citation. Seth Godin’s Purple Cow is a great book about being remarkable.

The beautiful thing about viral marketing is that creating one popular compelling idea can lead to thousands of free quality links. If your competitor is building one link at a time and you have thousands of people spreading your ideas for you for free then you are typically going to end up ranking better.
In SEO many people create content based around linking opportunities. Many of us refer to this as Link Baiting. You can learn link baiting tips from
SEO Book
Stuntdubl
Performancing
Copyblogger
Wolf Howl
You can search social news or social bookmarking sites like Digg or Del.icio.us to see what stories related to your topic became popular.

Measuring Results

Half the money I spend on advertising is wasted; the trouble is I don’t know which half.
– John Wanamaker

How can I tell if my SEO campaign is effective? The bottom line is what counts. Is your site generating more leads, higher quality leads, or more sales? What keywords are working? You can look at your server logs and an analytics program to track traffic trends and what keywords lead to conversion.
Outside of traffic, another good sign that you are on the right track is if you see more websites asking questions or talking about you. If you start picking up high quality unrequested links, you might be near a Tipping Point where your marketing starts to build on itself.
Search engines follow people, but lag actual market conditions. It may take search engines a while to find all the links pointing at your site and analyze how well your site should rank. Depending on how competitive your marketplace is, it may take anywhere from a few weeks to a couple of years to establish a strong market position. Rankings can be a moving target, since at any point in time,

you are marketing your business
competitors are marketing their businesses and reinvesting profits into building out their SEO strategy
search engines may change their relevancy algorithms


What is Pay Per Click (PPC)?

Pay per click (PPC) (also called Cost per click) is an Internet advertising model used to direct traffic to websites, where advertisers pay the publisher (typically a website owner) when the ad is clicked. With search engines, advertisers typically bid on keyword phrases relevant to their target market. Content sites commonly charge a fixed price per click rather than use a bidding system. PPC “display” advertisements are shown on web sites with related content that have agreed to show ads. This approach differs from the “pay per impression” methods used in television and newspaper advertising.
In contrast to the generalized portal, which seeks to drive a high volume of traffic to one site, PPC implements the so-called affiliate model, that provides purchase opportunities wherever people may be surfing. It does this by offering financial incentives (in the form of a percentage of revenue) to affiliated partner sites. The affiliates provide purchase-point click-through to the merchant. It is a pay-for-performance model: If an affiliate does not generate sales, it represents no cost to the merchant. Variations include banner exchange, pay-per-click, and revenue sharing programs.
Websites that utilize PPC ads will display an advertisement when a keyword query matches an advertiser’s keyword list, or when a content site displays relevant content. Such advertisements are called sponsored links or sponsored ads, and appear adjacent to or above organic results on search engine results pages, or anywhere a web developer chooses on a content site.[1]
Among PPC providers, Google AdWords, Yahoo! Search Marketing, and Microsoft adCenter are the three largest network operators, and all three operate under a bid-based model.[1]
The PPC advertising model is open to abuse through click fraud, although Google and others have implemented automated systems[2] to guard against abusive clicks by competitors or corrupt web developers.[3]

There are two primary models for determining cost per click: flat-rate and bid-based. In both cases the advertiser must consider the potential value of a click from a given source. This value is based on the type of individual the advertiser is expecting to receive as a visitor to his or her website, and what the advertiser can gain from that visit, usually revenue, both in the short term as well as in the long term. As with other forms of advertising, targeting is key, and factors that often play into PPC campaigns include the target's interest (often defined by a search term they have entered into a search engine, or the content of a page that they are browsing), intent (e.g., to purchase or not), location (for geo targeting), and the day and time that they are browsing.
Flat-rate PPC
In the flat-rate model, the advertiser and publisher agree upon a fixed amount that will be paid for each click. In many cases the publisher has a rate card that lists the Cost Per Click (CPC) within different areas of their website or network. These various amounts are often related to the content on pages, with content that generally attracts more valuable visitors having a higher CPC than content that attracts less valuable visitors. However, in many cases advertisers can negotiate lower rates, especially when committing to a long-term or high-value contract.
The flat-rate model is particularly common to comparison shopping engines, which typically publish rate cards.[4] However, these rates are sometimes minimal, and advertisers can pay more for greater visibility. These sites are usually neatly compartmentalized into product or service categories, allowing a high degree of targeting by advertisers. In many cases, the entire core content of these sites is paid ads.
Bid-based PPC
In the bid-based model, the advertiser signs a contract that allows them to compete against other advertisers in a private auction hosted by a publisher or, more commonly, an advertising network. Each advertiser informs the host of the maximum amount that he or she is willing to pay for a given ad spot (often based on a keyword), usually using online tools to do so. The auction plays out in an automated fashion every time a visitor triggers the ad spot.
When the ad spot is part of a search engine results page (SERP), the automated auction takes place whenever a search for the keyword that is being bid upon occurs. All bids for the keyword that target the searcher’s geo-location, the day and time of the search, etc. are then compared and the winner determined. In situations where there are multiple ad spots, a common occurrence on SERPs, there can be multiple winners whose positions on the page are influenced by the amount each has bid. The ad with the highest bid generally shows up first, though additional factors such as ad quality and relevance can sometimes come into play (see Quality Score).
In addition to ad spots on SERPs, the major advertising networks allow for contextual ads to be placed on the properties of 3rd-parties with whom they have partnered. These publishers sign up to host ads on behalf of the network. In return, they receive a portion of the ad revenue that the network generates, which can be anywhere from 50% to over 80% of the gross revenue paid by advertisers. These properties are often referred to as a content network and the ads on them as contextual ads because the ad spots are associated with keywords based on the context of the page on which they are found. In general, ads on content networks have a much lower click-through rate (CTR) and conversion rate (CR) than ads found on SERPs and consequently are less highly valued. Content network properties can include websites, newsletters, and e-mails.[5]
Advertisers pay for each click they receive, with the actual amount paid based on the amount bid. It is common practice amongst auction hosts to charge a winning bidder just slightly more (e.g. one penny) than the next highest bidder or the actual amount bid, whichever is lower.[6] This avoids situations where bidders are constantly adjusting their bids by very small amounts to see if they can still win the auction while paying just a little bit less per click.
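
A minimal sketch of that pricing rule, as a simplified generalised second-price auction; real networks also weigh in quality and relevance factors, which are ignored here.

```python
# Each winner pays one increment above the next-highest bid, capped at
# their own bid (the rule described above).
def settle(bids: dict[str, float], slots: int, increment: float = 0.01):
    ranked = sorted(bids.items(), key=lambda kv: -kv[1])
    results = []
    for i, (advertiser, bid) in enumerate(ranked[:slots]):
        runner_up = ranked[i + 1][1] if i + 1 < len(ranked) else 0.0
        results.append((advertiser, min(bid, runner_up + increment)))
    return results

print(settle({"A": 2.50, "B": 1.10, "C": 0.75}, slots=2))
# -> [('A', 1.11), ('B', 0.76)]
```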
To maximize success and achieve scale, automated bid management systems can be deployed. These systems can be used directly by the advertiser, though they are more commonly used by advertising agencies that offer PPC bid management as a service. These tools generally allow for bid management at scale, with thousands or even millions of PPC bids controlled by a highly automated system. The system generally sets each bid based on the goal that has been set for it, such as maximize profit, maximize traffic at breakeven, and so forth. The system is usually tied into the advertiser’s website and fed the results of each click, which then allows it to set bids. The effectiveness of these systems is directly related to the quality and quantity of the performance data that they have to work with – low-traffic ads can lead to a scarcity of data problem that renders many bid management tools useless at worst, or inefficient at best.


ROI (Return on Investment) and SEO (Search Engine Optimizer)

Estimating the ROI (Return on Investment) of SEO is extremely important if you want to be really successful. We are not only talking about investment as money, but as time, links and all those things that can be seen as costs in SEO. By estimating the ROI on specific keywords, we can make the site more profitable per hour/dollar/link spent.

How to estimate ROI

If we accept that you can reach the number 1 position for any keyword with any site if you invest enough, it becomes a question of selecting the best keywords. There are several factors we need to take into consideration when deciding how good a keyword is, and once we have set a value on the keyword we need to compare this to the investment.

(Chart: the percentage of clicks received by each position on the first page of a search result.)
Things you need to know to estimate the value of a keyword include the search demand, conversion rate and average value of a conversion. The calculation is pretty straightforward once we have this data: calculate the number of visitors you will get in the top position by multiplying search demand by ~40% (this is the share of searchers that will click on the top position if nothing else interferes). Take the number of visitors and multiply it by the conversion rate (if you have trouble estimating the conversion rate and the average value per conversion, you can measure them by running an AdWords campaign on the target keywords and getting the data from Google Analytics). Then multiply the number of conversions by their average value.
Search Demand x 40% x Conversion Rate x Average Value per Conversion = Value of Keyword

After this little procedure you know the value of the top position for the selected keyword. Now comes the tricky part: we have to estimate how much we must invest to reach that position. This can be very tricky, but the good part is that in most cases we don't need actual figures; we just need to be able to compare the competition on different keywords.
Start by trying to find out how tough the competition is. Useful data here includes the number and strength of their inbound links, the number of pages on the site, PageRank, the age of the site and the quality of their on-page SEO. You can of course use a number of other measurements, as you are in a way trying to backtrack Google's own algorithm. These are all pretty easy to find out, though, and they give a good enough picture, as we don't want to spend a lifetime measuring the competition.

How to find this data:

Number of inbound links – The best way is to use Yahoo!'s Site Explorer: just type linkdomain:thedomain.com into Yahoo!'s search field. Yahoo! doesn't show everything, but it shows more than Google's link: command.
Strength of inbound links – Visit a selection of the pages returned by the linkdomain query and make a quick estimate of them; PR might be the easiest thing to check.
Number of indexed pages – Do a site:thedomain.com query in Google's search field.
PageRank – Just visit the site with the Google Toolbar installed and you will see the public snapshot of the PR in the bar.
Quality of on-page SEO – Try to estimate how good the competitor's on-page SEO is by looking at it; see whether it has unique titles on all pages, and similar.
Compare the different competitors on these points; a good way is to select a number of competitors from each search result (I usually look at the competitors at positions 1, 3 and 10). Then put a value on the competition; this is the tricky part. You have to weigh all the factors and try to put a value on the competition. I usually use a value between 1 and 10, where 10 is the strongest. (With a few SEO projects behind you it will get easier and easier to put an actual monetary value on these numbers; sadly, the investment needed in time or money is not the same for everyone, and you need to find it out yourself.)
Note that PageRank is itself a value similar to this estimate, and if you want to do it faster you can look only at the PageRank of the main competitors and use that value (it's less exact, but a very fast measurement).

Putting a figure on the ROI

You can now divide the value of the keyword by the estimate of the competition and get a figure for the ROI. After this, you simply choose the keywords with the highest value per amount of work.
Value of Keyword / Estimate of Competition = ROI
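
The two formulas above as a small sketch; all inputs are the estimates described in this article, and the figures are invented.

```python
# Keyword value and ROI score, per the formulas above. The 40% top-position
# click share is the article's assumption, not a measured constant.
def keyword_value(search_demand, conversion_rate, value_per_conversion,
                  top_click_share=0.40):
    return search_demand * top_click_share * conversion_rate * value_per_conversion

def roi_score(value, competition_estimate):  # competition scored 1-10
    return value / competition_estimate

v = keyword_value(search_demand=10_000, conversion_rate=0.02,
                  value_per_conversion=50.0)   # = 4000.0 per month
print(v, roi_score(v, competition_estimate=8))  # 4000.0 500.0
```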

As you probably realize, this is not the actual ROI until you can put a value on your time and your links and make a more exact estimate of how much you actually need to invest to overtake the competitors. The only advice here is that, as you work with it, you will learn how much effort you have to put in. At the very least, we now have an estimate that tells us which keywords are most profitable.
Using ROI to know when to stop

Another use of ROI is that, by applying it to an ongoing SEO campaign, you can tell when it's time to stop. Since the average click ratio for each position in the SERP (Search Engine Results Page) is known, we know how much we will gain in profit by moving from position 3 to position 2 or 1. If you have reached position 4 on one of your target keywords and want to know whether you should invest more in that keyword or start working on another one, you simply do the ROI calculation for the new keyword and for the one where you already have a position: is there more to gain per invested hour or dollar from gaining a position on the old one, or on the new one?


Getting good at SEO

One of the most important things you need to do if you want your SEO to become really successful is choosing the proper keywords. This can be a bit time consuming, but it is definitely worth the time.

Choosing keywords

The first thing you should do is make a list of the words you think are most relevant for your site. Don't be afraid to use whole phrases, such as "a knol for SEO", if you think they are necessary to describe your site. Take that list of your most important words and phrases to Google's Keyword Tool. With the help of the keyword tool you can find out not only the amount of traffic on the targeted keywords but also get suggestions for more keywords. (Remember to set the match type to Exact and choose the country you will be targeting.)

Now that you know the search demand for all relevant keywords, you will have to choose between them. There are two other factors besides search demand that are important: the first is the keyword competition and the second is the relevance of the keyword. The lower the competition and the higher the relevance, the better the keyword. Relevance is the one requiring a bit more practice, as it is how well the keyword or phrase relates to your product or website. Higher relevance will get you more satisfied visitors and more conversions.

The competition is something that can be measured. Do a search on the keyword and see which sites show up. There are a few parameters you can check to see whether they are hard competitors or not (it is hard to get an exact estimate, but it is fairly easy to get one that is good enough). Check their PageRank to start with; this is not the most important factor, but it is the easiest to check if you have the Google Toolbar or a similar tool. Check their number of inbound links; the best way today is to use Yahoo!'s Site Explorer. Just type linkdomain:thedomain.com -domain:thedomain.com into Yahoo!'s search window and you will see the links Yahoo! has indexed pointing to that site, without the links from the site itself. The link: command in Google does not give you a clear picture here. Next you need to check the number of indexed pages on the site; this is easiest done with the site:thedomain.com command in your Google search form.

The higher the PageRank, the more inbound links and the more indexed pages, the harder the competitor. There are of course a large number of other factors, but these are easily checked and they give a good enough picture of the search competition. If you want to do it more thoroughly you can do a basic on-page check on them as well: see whether they have the keyword in their title and H1, and so on.

When you have this data, it's time to choose. Choose keywords that you believe you can get results on, that are close enough to your product or service, and that still have a high enough search demand.

Perfect crawl and well defined pages

Once you have decided which keywords you want to target, it's time to make the site work perfectly for them. The first thing you have to do is make the site as easy as possible for search spiders: the easier it is for them to index your site, the faster they will. The optimum is to have all sub-pages linked from the front page, and this works well if you have, say, 5-10 pages. If you have thousands, though, this is not only impossible but also stupid. The most important part here is building a link structure where pages that are close in content link to each other. The better your internal linking manages to tie related pages together, the better the results. One easy way of doing this is to first build a standard hierarchy and then link together pages that belong together. One example of where this works perfectly is a WordPress blog: first you have the hierarchy in the categories or archives, and then the tags link related pages together (via a tag page, but it's the same principle).

When the structure is working well, you need to define keywords for specific pages. You can't expect to get results for hundreds of keywords on your front page; you need to spread them out over sub-pages. If you have a vegetable site, you might want to target the front page at vegetables and have sub-pages for lettuce and cucumber. Cluster keywords based on the words themselves: even though "complimentary sandwiches" is close to "free snacks" in meaning, they shouldn't be on the same page, as they are far too different in syntax. "Free snacks" should actually go with "free drinks", as they are closer.
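
A toy sketch of that kind of syntax-based clustering, grouping keywords that share a word:

```python
# Cluster keywords by shared tokens: "free snacks" groups with "free drinks",
# not with "complimentary sandwiches", whatever the semantics.
from collections import defaultdict

keywords = ["free snacks", "free drinks", "complimentary sandwiches",
            "lettuce recipes", "grow lettuce"]

clusters = defaultdict(list)
for kw in keywords:
    for token in kw.split():
        clusters[token].append(kw)

for token, group in clusters.items():
    if len(group) > 1:
        print(token, "->", group)
# free -> ['free snacks', 'free drinks']
# lettuce -> ['lettuce recipes', 'grow lettuce']
```
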
When you write the text for the pages, keep in mind the keyword you are targeting. It's not necessary to keep a certain percentage of the keyword on the page, but it should definitely be there. It should be visible in the titles, the headlines, and even in the meta data (the description being the most important).

To further increase the definition of the pages, make sure that the targeted keyword is in the internal links pointing to the page. Why would search engines consider the page to be about lettuce if the button that takes you there has carrots written all over it?

If you want help with your on-page SEO, we have created this simple SEO Tool that can give you the pointers you need to target a page for a keyword.


What is SEO? (Search Engine Optimizer)

SEO, or Search Engine Optimization, is the practice of making a web page reach good positions for the proper keywords in the search engine results pages. Most people use search engines on a daily basis to search for almost anything; if you manage to get your site visible for the proper keywords, it can take it from a small site to a massive one.
Learning the basics in SEO

Give the search engines a chance to find your content

The first step in SEO should always be to make your site easy to read for the search engines. There are a few things you have to think about here, because what is easy for a human might be very hard for an automatic viewer.

For example, images are hard for search engines to categorize. Even if it's extremely easy for a human to see that the picture shows a pine tree, it is a complex nut for a machine to crack.

There are a lot of other things that can give a search engine problems with your site, including Flash, Java, frames and pictures. It is of course not a problem to have pictures on the site or to show a nice Flash game; just make sure there are ways for the search engines to find the content of the site. Think of it like this: the spiders that the engines send out to index your site can read text and follow HTML links, but they have trouble with the elements mentioned before (even though they get better at it even as we speak). If you can show all your content as text, and all of it can be reached by following HTML links, you have a good start.

Using frames to build your site can still cause problems even if these demands are met, because frames have a tendency to make the site fragmented. If possible, stay away from frames. If you have problems making standard links to all your pages (if you use Flash navigation or similar), a sitemap might help. There are two kinds of sitemaps that can be used: an XML one and an HTML one. Preferably you should have both, and the HTML one can even help your visitors if they get lost. If you use JavaScript or similar, you should make use of noscript tags to make the content available to users that don't accept scripts, and to the search engines as well.

Explain what your site is about

Search engines read text, but they are not very good readers. They have a hard time understanding metaphors, and they never read between the lines. Make sure your content is readable even for a stupid reader, like Google.

One of the usual mistakes we make when we write about something for our sites is that we don't really explain exactly what it is about. By using the words that we consider important, we make it easier for search engines to understand the articles on the site.

Take this article, for example: it's about SEO, and so far I haven't mentioned SEO in the text. Google is getting smarter every day and could probably figure it out, but why make it hard?

Don't overuse the words, but keep them in mind when you write. If you are writing about food, it wouldn't hurt to mention food once in a while instead of writing "something to still your hunger", "the old bangers and mash" or "grub".

Make sure to use proper headlines and highlights. If you use correct markup for your headlines, and if you are strict about mentioning the primary subject in them, the search engines will have a much easier job indexing your articles.

Use meta tags, but be careful. Write a proper description for every page on your site and set it as the description in the meta tags. Set keywords that are the right ones (the keywords tag carries less weight than in the early days of search engines) and stay away from stuffing the tag full of unrelated keywords.

Use alt and title attributes where possible. If you set proper alt and title attributes on your images, it makes them a lot easier to read, both for search engines and for people with impaired sight.

If people like you, Google will too

When your site is working perfectly and the little spiders can crawl it without problems, there is still some work to do. One of the most important inventions when it comes to search engine results is the foundation of Google: measuring inbound links. With a lot of links pointing to your site from trusted sites, Google (and the others) will start showing you in better positions.

For example, say my blog is an authority on web marketing and I link to you in my blogroll. If someone sees this, he might think: well, if this highly trusted blog links to that site, maybe it's a very good site. Google will think the same.
There are several ways of making this happen, and a reasonable place to start is the ODP. Submit your site to them, and if they think it's a good site that adds value, they might add you to the directory. This is, of course, a slow method, because ODP submissions take quite a while. There are other trusted directories out there; Yahoo! has one, for example.
If your site is good and informative, there is always the possibility of getting links by telling people who write about your topic that you have a good site that might interest them. If they really do find it interesting, they might write about it and link to you.
If a lot of people like your site and link to it, the search engines will find out and show your articles in the search results.