200

Status OK – the file request was successful. For example, a page or image was found and loaded properly in a browser. Some poorly developed content management systems return 200 status codes even when a file does not exist; the proper response for a file that is not found is a 404.

See also: W3C HTTP 1.1 Status Code Definitions


301

Moved Permanently – the file has been moved permanently to a new location. This is the preferred redirect method for most pages or websites, as it passes roughly 99% of link equity from the previous source. If you are going to move an entire site to a new location, you may want to test moving a file or folder first; if that ranks well, proceed with moving the entire site. Depending on your site authority and crawl frequency, it may take anywhere from a few days to a month or so for the 301 redirect to be picked up.

On Apache servers you can redirect URLs in a .htaccess file or via the headers of some dynamic pages; most web hosts run on Apache. On IIS servers you can redirect using ASP or ASP.NET, or from within the internet manager.

See also: W3C HTTP 1.1 Status Code Definitions
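As a sketch of the Apache approach mentioned above, a .htaccess file might contain 301 rules like these (the file and folder names are invented for illustration, and mod_alias must be available; copy your existing .htaccess before editing):

```apache
# Hypothetical examples only – adjust the paths to your own site.

# Permanently redirect a single moved file:
Redirect 301 /old-page.html /new-page.html

# Permanently redirect an entire folder, preserving the rest of the path:
RedirectMatch 301 ^/old-folder/(.*)$ /new-folder/$1
```

The second rule is how you would test moving a folder before committing to a full site move.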


302

Found – the file has been found, but is temporarily located at another URI. As it relates to SEO, it is typically best to avoid using 302 redirects. Some search engines struggle with redirect handling, and due to poor processing of 302 redirects some search engines have allowed competing businesses to hijack the listings of competitors.

See also: W3C HTTP 1.1 Status Code Definitions


404

Not Found – the server was unable to locate the URL. Some content management systems send 404 status codes when documents do exist. Ensure that files which exist give a 200 status code and that requests for files which do not exist give a 404 status code. You may also want to check with your host to see if you can set up a custom 404 error page, which makes it easy for site visitors to
  • view your most popular and/or most relevant navigational options
  • report navigational problems within your site
Search engines request a robots.txt file to see what portions of your site they are allowed to crawl. Many browsers request a favicon.ico file when loading your site. While neither of these files is necessary, creating them will help keep your log files clean so you can focus on whatever other errors your site might have.

See also: W3C HTTP 1.1 Status Code Definitions
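The 200-vs-404 behaviour described above can be checked programmatically. Here is a minimal sketch using Python's built-in http.server and http.client modules; the page content and paths are assumptions for illustration, not a real CMS:

```python
# Sketch: a server that returns 200 for files that exist and a proper 404
# for files that do not, plus a client that checks the status codes.
import http.client
import http.server
import threading

class Handler(http.server.BaseHTTPRequestHandler):
    PAGES = {"/": b"<html>home</html>"}          # pretend site content

    def do_GET(self):
        body = self.PAGES.get(self.path)
        if body is None:
            self.send_response(404)              # proper "not found" code
            self.end_headers()
        else:
            self.send_response(200)              # file exists and loads
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    def log_message(self, *args):                # silence request logging
        pass

# Bind to port 0 so the OS picks a free port.
server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

def status_of(path):
    conn = http.client.HTTPConnection("127.0.0.1", port)
    conn.request("GET", path)
    code = conn.getresponse().status
    conn.close()
    return code

print(status_of("/"))            # 200
print(status_of("/missing"))     # 404
```

A broken CMS would return 200 for "/missing" too, which is exactly the misconfiguration the entry warns about.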


.htaccess

Apache directory-level configuration file which can be used to password protect or redirect files. As a note of caution, make sure you copy your current .htaccess file before editing it, and do not edit it on a site that you can’t afford to have go down unless you know what you are doing.

See also: .htaccess, 301 Redirects & SEO



Above the Fold

A term traditionally used to describe the top portion of a newspaper. In email or web marketing it means the area of content viewable prior to scrolling. Some people also define above the fold as an ad location at the very top of the screen, but due to banner blindness, typical ad locations do not perform as well as ads that are well integrated into content; if ads look like the content they typically perform much better. Google launched a “top heavy” algorithm to penalize sites with excessive ad placement above the fold.

See also: Google AdSense heat map – shows ad clickthrough rate estimates based on ad positioning.

Absolute Link

A link which shows the full URL of the page being linked to. Some links only show relative link paths (relative to the file path of the website) instead of having the entire reference URL within the ‘a href’ tag. Due to canonicalization and hijacking related issues, it is typically preferred to use absolute links over relative links.
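The difference can be illustrated with Python's urllib.parse.urljoin, which resolves a link target against the page it appears on (the URLs here are invented for illustration):

```python
# Sketch: how a relative link's target depends on where the linking page
# lives, while an absolute link is unambiguous.
from urllib.parse import urljoin

page = "https://www.example.com/blog/post.html"

# Relative links resolve differently depending on the page's location:
print(urljoin(page, "about.html"))    # https://www.example.com/blog/about.html
print(urljoin(page, "/about.html"))   # https://www.example.com/about.html

# An absolute link resolves to itself no matter what page it sits on:
print(urljoin(page, "https://www.example.com/about.html"))
```

This ambiguity of relative paths is one reason canonicalization problems arise with relative links.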

Activity Bias

Any form of ad targeting tends to reach people who were already more likely to engage in the targeted activity, especially with ad retargeting – correlation does not mean causation.

See also: Here, There, and Everywhere: Correlated Online Behaviors Can Lead to Overestimates of the Effects of Advertising

Ad Heavy Page Layout

(see Top Heavy)

Ad Retargeting

(see Retargeting)

AdCenter

The old name for Microsoft’s cost per click ad network, later rebranded as Bing Ads. While it had some useful features (including dayparting and demographic based bidding), it was still quite new compared to Google AdWords. Due to Microsoft’s limited search market share (15–25%) and the program’s newness, many terms were vastly underpriced and presented a great arbitrage opportunity, much as when Google AdWords first started.

AdSense

Google’s contextual advertising network. Publishers large and small may automatically publish relevant advertisements near their content and share the profits from those ad clicks with Google. AdSense offers a highly scalable automated ad revenue stream which will help some publishers establish a baseline for the value of their ad inventory. In many cases AdSense will be underpriced, but that is the trade-off for automating ad sales.

AdSense ad auction formats include:
  • Cost per Click – advertisers are only charged when ads are clicked on
  • CPM – advertisers are charged a certain amount per ad impression. Advertisers can target sites based on keyword, category, or demographic information.
AdSense ad formats include:
  • Text
  • Graphic
  • Animated graphics
  • Videos
In some cases I have seen ads get a 2 or 3% click-through rate (CTR), while sites that are optimized for maximum CTR (through aggressive ad integration) can obtain as high as a 50 or 60% CTR, depending on how niche the site is, how commercially oriented it is, and the relevancy and depth of advertisers in the vertical. It is also worth pointing out that if you are too aggressive in monetizing your site before it has built up adequate authority, your site may never gain enough authority to become highly profitable. Depending on your vertical, your most efficient monetization model may be any of the following:
  • AdSense
  • Affiliate marketing
  • Direct ad sales
  • Selling your own products and services
  • or any mixture of the above
See also:
Google AdSense program – sign up as an ad publisher
Google AdSense heat map – shows ad clickthrough rate estimates based on ad positioning
Google AdWords – buy ads on Google search and/or contextually relevant web pages


AdWords

Google’s advertisement and link auction network. Most of Google’s ads are keyword targeted and sold on a cost per click basis in an auction which factors in bid price and ad relevance. Google AdWords has expanded its reach to include all Google partner sites, including YouTube, Google Play, Google Shopping, and Google Maps (including the Maps app). AdWords is an increasingly complex and ever-growing marketplace, so I would like to share some links for those interested in the AdWords program. Additionally, there are Google AdWords courses, provided by Google, which you may take to get certified for running ads effectively through their network.

See also:
Google AdWords – sign up for an advertiser account
Google Advertising Professional Program – program for qualifying as an AdWords expert
Google AdWords Learning Center – text and multimedia educational modules, with quizzes related to each section
AdWords Keyword Tool – shows related keywords, advertiser competition, and relative search volume estimates
Google Traffic Estimator – estimates bid prices and search volumes for keywords
Some PPC Tips and Tricks

AdWords Site

(MFA) Made-for-AdSense (or made-for-advertisement) sites are websites designed from the ground up as a venue for advertisements. This is usually, but not always, a bad thing – TV programming is usually made for advertisement. Such pages are specifically created as landing pages ready to take the clicks paid for from the ads, using arbitrage to make the page profitable.

Affiliate Marketing

Affiliate marketing programs allow merchants to expand their market reach and mindshare by paying independent agents on a cost per action (CPA) basis: affiliates only get paid if visitors complete an action. Some power affiliates make hundreds of thousands or millions of dollars per year because they are heavily focused on automation and/or tap large traffic streams. Typically, niche affiliate sites make more per unit of effort than overly broad ones because they are easier to focus (and thus have a higher conversion rate). Selling a conversion is typically harder than selling a click (as AdSense does, for instance). Search engines are increasingly looking to remove the noise that low-quality thin affiliate sites add to the search results through algorithms which detect thin affiliate sites and duplicate content, manual review, and landing page quality scores on their paid ads.

See also:
Top Affiliate Networks
Amazon Associates – Amazon’s affiliate program, one of the more popular affiliate programs


Age

Some social networks or search systems take site age, page age, user account age, and related historical data into account when determining how much to trust a person, website, or document. Some specialty search engines, like blog search engines, may also boost the relevancy of new documents, much as Google now does with fresh content.

Fresh content which is also cited on many other channels (like related blogs) will temporarily rank better than you might expect, because many of the channels which cite the content will cite it from their home page or a well trusted, high PageRank page. After those sites publish more content and the reference falls into their archives, those links typically come from pages which do not have as much link authority as the home pages did.

See also: Google Patent 20050071741: Information retrieval based on historical data – mentions that document age, link age, link bursts, and link churn may be used to help score the relevancy of a document.


AJAX

Asynchronous JavaScript and XML is a technique which allows a web page to request additional data from a server without requiring a new page to load.

See also: More AJAX Information


Alexa

An Amazon.com owned service which measures website traffic. Alexa is heavily biased toward sites that focus on marketing and webmaster communities. While not highly accurate, it is free. I tend not to use it because there is no consistency in its rankings, and it barely reflects a website’s ability to perform or its relevance.

See also: Alexa.com


Algorithm

(algo) A program used by search engines to determine what pages to suggest for a given search query.

See also: Algorithm information provided by Wikipedia

All The Web

A search engine which was created by Fast, then bought by Overture, which was in turn bought by Yahoo. Yahoo may use AllTheWeb as a test bed for new search technologies and features.

See also: AllTheWeb

Alt Attribute

Blind or visually impaired people, and most major search engines, are not able to easily distinguish what is in an image. Using an image alt attribute allows you to help screen readers and search engines understand the function of an image by providing a text equivalent for the object. Alt text is not usually displayed to the end user unless the graphic is undeliverable or the browser doesn’t display graphics. It is the one place where it is acceptable for the spider to get different content than the human user, but only because the alt text is accessible to the user and, when properly used, is an accurate description of the associated picture; special web browsers for visually challenged people rely on the alt text to make the content of graphics accessible. Including alt attributes on all of your images gives a great location for extra image context, and therefore has a positive correlation with search engine ranking positions – plus you get to help out impaired audience members (two birds, one stone).

See also: W3C QA: Alt Attribute
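To illustrate how a crawler or screen reader falls back on alt text in place of an image, here is a small sketch using Python's html.parser; the markup and alt text are invented for illustration:

```python
# Sketch: reading the alt attribute the way a text-only consumer of a
# page would, since the image pixels themselves carry no readable text.
from html.parser import HTMLParser

html = '<p>Our logo: <img src="logo.png" alt="Acme Widgets logo"></p>'

class AltReader(HTMLParser):
    def __init__(self):
        super().__init__()
        self.alts = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            # Fall back to an empty string when no alt attribute exists.
            self.alts.append(dict(attrs).get("alt", ""))

reader = AltReader()
reader.feed(html)
print(reader.alts)   # ['Acme Widgets logo']
```

An image without an alt attribute would contribute an empty string here, which is roughly what a screen reader user experiences.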


Amazon.com

The largest internet retailing website. Amazon.com is rich in consumer generated media. Amazon also owns a number of other popular websites, including IMDB and Alexa.

See also: Amazon.com – official site


Analytics

Software which allows you to track your page views, user paths, and conversion statistics based upon interpreting your log files or through including a JavaScript tracking code on your site. Ad networks are a game of margins; marketers who track user action will have a distinct advantage over those who do not.

See also:
Google Analytics – Google’s free analytics program
One of my favorites: SEMrush

Anchor Text

The text that a user clicks on to follow a link. If the link is an image, the image alt attribute may act in place of anchor text. Search engines assume that your page is authoritative for the words that people include in links pointing at your site. When links occur naturally they typically have a wide array of anchor text combinations; too much similar anchor text may be considered a sign of manipulation, and thus discounted or filtered out by search algorithms. When building links that you control, make sure you mix up your anchor text.

An example of anchor text: <a href="https://example.com/">Search Engine Optimization Blog</a> – the text between the opening and closing tag elements is the anchor text.

SEO experts are always asked: what does the perfect link profile look like? There is no single answer to this question, since every site moves in its own little “microcosm”. Factors which play a role in backlink profiles include the theme environment, the type of content on the website, the age of the domain, and competitors. Via the Penguin update, Google checks backlink profiles for how spammy, or conversely how natural, the link structure is in order to prevent spammy ranking influence. That said, there are some rules and basic policies that you should pay attention to and which are generally valid for all sites:
  1. The ratio between DoFollow and NoFollow links should be in a “healthy” relation. Too many backlinks with one or another property can stand out negatively.
  2. Instead of quantity, you should pay attention to the quality of the links. One backlink of a really strong site (e.g. Wikipedia) is clearly worth much more than 20 links from weak pages (e.g. blogs).
  3. High quality links carry substantial weight in the link profile because they are the most difficult to obtain – for example, backlinks from universities (.edu links) and official government offices (.gov links).
  4. The topic area from where it is linked always plays an important role. Backlinks of websites with a context having similar content are substantially more valuable than those of generic sites.
  5. As a rule, natural linking is mid-tail or long-tail, meaning several words are linked instead of just one single word.
  6. The number of referring links mostly increases continuously and roughly linearly. For new websites, a suddenly erratically increasing number of new backlinks can appear to be conspicuous and unnatural.
  7. In general, a link profile does not only contain text links, but also image links. Even when just a small icon is linked, this is already an image backlink.
  8. A perfect link profile contains backlinks from different sources: Domains, IP-addresses, subnet-classes and also operators of websites should vary as often as possible.
See also: Ahrefs’ Backlink Analyzer is probably the best in the industry
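To make the idea concrete, the sketch below uses Python's html.parser to pull the anchor text and target URL out of a link, roughly the way a search engine associates link text with a destination (the markup is invented for illustration):

```python
# Sketch: extracting the (target URL, anchor text) pair from a link.
from html.parser import HTMLParser

html = '<a href="https://example.com/seo/">Search Engine Optimization Blog</a>'

class AnchorReader(HTMLParser):
    def __init__(self):
        super().__init__()
        self.href = None
        self.text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.href = dict(attrs).get("href")

    def handle_data(self, data):
        if self.href is not None:
            self.text.append(data)     # collect the visible anchor text

reader = AnchorReader()
reader.feed(html)
print(reader.href, "->", "".join(reader.text))
```

The collected text is what an engine would treat as a descriptive "vote" for the target URL.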


Android

Google’s operating system, which powers cell phones and some other consumer electronics devices like TVs. While Google pitches Android as an “open” operating system, they are only open in markets they are losing; once they establish dominance they shift from open to closed by adding restrictions to “partners.”

See also: Android OS


API

Application Program Interface – a series of conventions or routines used to access software functions. Most major search products have an API program to interface with additional software features.

See also: An Example of the Google APIs


Arbitrage

Exploiting market inefficiencies by buying and reselling a commodity for a profit. As it relates to the search market, many thin content sites laced with an Overture feed or AdSense ads buy traffic from the major search engines and hope to send some percentage of that traffic clicking out on a higher priced ad. Shopping search engines generally draw most of their traffic through arbitrage. In short: buying low in one market and selling high in another.

See also:
Arbitrage – Investopedia
Arbitrage tips


Ask

Ask is a search engine owned by InterActive Corp. It was originally named Ask Jeeves, but dropped the lovable butler ‘Jeeves’ in early 2006. The search engine is powered by the Teoma search technology, which is largely reliant upon Kleinberg’s concept of hubs and authorities.

See also:
Ask
Kleinberg’s concept


ASP

Active Server Pages – a dynamic Microsoft programming language.

See also:
ASP.net official site
Learn ASP.net


Astroturfing

Attempting to advance a commercial or political agenda while pretending to be an impartial grassroots participant in a social group – for example, participating in a user forum with the secret purpose of branding, customer recruitment, or public relations.


Authorities

Topical authorities are sites which are well trusted and well cited by experts within their topical community. A topical authority is a page which is referenced by many topical experts and hub sites; a topical hub is a page which references many authorities.

Example potential topical authorities:
  • Largest brands in your field
  • Top blogger talking about your subject
  • Wikipedia page about your topic (These pages have expert citations at the bottom! Still questioning Wikipedia as a source? check the citations!)
  • Scientifically written articles or posts with plenty of highly trusted citations
See also: Although more specific to writing papers, these guidelines hold true for websites as well!


Authority

The gravity of relevance a web page or domain has. Domain authority is one of the factors that correlates most strongly with high rankings in search engines. Five large factors associated with site and page authority are link equity, site age, traffic trends, site history, and publishing unique original quality content. Search engines constantly tweak their algorithms to balance relevancy based on topical authority and overall authority across the entire web, and sites may be considered topical authorities or general authorities. For example, Wikipedia is considered a broad general authority site, while the HONE site (honedigitalmarketing.com) is a topical authority on SEO and inbound marketing, but not a broad general authority.

Automated Bid Management Software

Pay per click is growing increasingly complex; to help large advertisers scale with the increasing sophistication of these offerings, some search engines and third party software developers have created software which makes it easier to control your ad spend. Some of the more advanced tools can integrate with your analytics programs and help you focus on conversions, return on investment, and earnings elasticity – instead of just looking at cost per click.

See also: If you want to program internal bid management software you can get a developer token to use the Google AdWords API. A few popular bid management tools are




B2B

Stands for ‘Business to Business’, often relating to a sales procedure or the description of a business. Example: Hone is a business which sells to other businesses – therefore Hone sales are B2B.


B2C

Stands for ‘Business to Consumer’, often relating to a sales procedure or the description of a business. Example: Apple is a business which sells to consumers – therefore Apple sales are B2C.


Backlink

(inlink, incoming link) Any link to a page or site from any other page or site. (see Inbound Link)

Bait and Switch

Marketing technique where you make something look pure, or as though it has another purpose, to get people to believe in it or vote for it (by linking at it or sharing it with friends), then switch the intent or purpose of the website after you gain authority, steady traffic, or a strong email list. It is generally easier to get links to informational websites than to commercial sites, so some new sites might gain authority much quicker if they started as non-commercial and gained influence before trying to monetize their market position.

Banner Blindness

During the first web boom, many businesses were valued based on how many eyeballs saw the site rather than on actually building value for their audience. Many ads were quite irrelevant, and web users learned to ignore the most common ad types. In many ways text ads are successful because they are more relevant and look more like content, but with the surge in the popularity of text ads, some have speculated that in time people may become text ad blind as well.

Nick Denton stated:

“Imagine the internet with Google and Overture text ads everywhere. Not only above search results but next to every article and weblog post. Ubiquity breeds contempt. Text ads, coupled with content targeting, are more effective than graphic ads for many advertisers; but they too, like banners, will suffer reader burnout and eventually be tuned out by readers.”

Battelle, John

Popular search and media blogger who co-founded The Industry Standard and Wired, and authored a popular book on search called “The Search”.

See also:
Searchblog – blog about the intersection of search, media, and technology
The Search – John’s book about the history and future of search
The Database of Intentions – post about how search engines store many of our thoughts
Web 2.0 Conference – a conference run by John Battelle
Battelle’s Google speech

Behavioral Targeting

Ad targeting based on recent past behavior and/or the implied intent of search queries. For example, if I recently searched for mortgages and am later reading a book review, the page may still show me mortgage ads.


Bias

A prejudice based on experiences or a particular worldview. Any media channel, publishing format, organization, or person is biased by how and why they were created, their own experiences, the current set of social standards in which they exist, other markets they operate in, the need for self-preservation, how they interface with the world around them, and their capital, knowledge, status, or technological advantages and limitations.

Search engines aim to be relevant to users, but they also need to be profitable. Since search engines sell commercial ads, some of the largest search engines may bias their organic search results toward informational (i.e. non-commercial) websites. Some search engines are also biased toward information which has been published online for a great deal of time and is heavily cited. Search personalization biases our search results based on our own media consumption and searching habits.

Large news organizations tend to aim for widely acceptable neutrality rather than objectivity, while some of the most popular individual web authors/publishers tend to be quite biased in nature. Rather than bias hurting one’s exposure, the known/learned bias of a specific author may make their news more appealing than news from an organization that aims to seem arbitrarily neutral. Biased channels likely have a larger readership than unbiased channels, because most people prefer to subscribe to media which matches their own biased worldview. If more people read what you write and passionately agree with it, they are more likely to link to it, so things which are biased in nature are typically easier to get cited than things which are unbiased.

See also:
Alejandro M. Diaz’s Through the Google Goggles [PDF] – thesis paper on Google’s biases
A Thousand Years of Nonlinear History – looks at economic, biological, and linguistic history
Wikipedia: Bias
Wikipedia: Search neutrality

Bid Management Software

(see Automated Bid Management Software)


Bing

Microsoft’s search engine, which also powers the organic search results on Yahoo! Search. Bing handles about 20% of all search queries.

Bing Ads

Microsoft’s paid search program, which rivals Google AdWords and powers paid search results on Yahoo! Search.

See also:
Bing Ads
Bing Ads Intelligence

Black Hat SEO

Search engines set up guidelines that help deliver highly relevant, intent-driven search results for their users. Within that framework, search engines consider certain marketing techniques (or search engine influencing techniques) deceptive in nature and label them ‘black hat SEO’; those considered within their guidelines are called white hat SEO techniques. The search guidelines are not a static set of rules, and things that may be considered legitimate one day may be considered deceptive the next.

Search engines are not without flaws in their business models, but there is nothing immoral or illegal about testing search algorithms to understand how search engines work. People who have extensively tested search algorithms are probably more competent and more knowledgeable search marketers than those who give themselves the arbitrary label of white hat SEOs while calling others black hat SEOs. When making large investments in processes that are not entirely transparent, trust is important: rather than looking for reasons not to work with an SEO, it is best to look for signs of trust in a person you would like to work with.

See also: Black Hat SEO Blog – great for beginners to understand the difference between good/safe practices and deceptive practices

Block Level Analysis

A method used to break a page down into multiple points on the web graph by condensing the page into smaller blocks. Block level link analysis can be used to help determine whether content is page specific or part of a navigational system. It can also help determine whether a link is a natural editorial link, what other links that link should be associated with, and/or whether it is an advertisement. Search engines generally do not want to count advertisements as votes for authority.

See also: Microsoft Research: Block-level Link Analysis


Blog

A periodically updated journal, typically formatted in reverse chronological order. Many blogs not only archive and categorize information, but also provide a feed and allow simple user interaction, like leaving comments on posts. Most blogs tend to be personal in nature, and blogs are generally quite authoritative with heavy link equity because they give people a reason to frequently come back to the site, read the content, and link to whatever they think is interesting. Because of their informational nature, blogs also tend to rank well in search engines when keywords/phrases have an information gathering intent – great for the awareness step of the buyer’s journey.

The most popular blogging platforms are WordPress, Blogger, Movable Type, and TypePad.

Blog Comment Spam

Either manually or automatically (via a software program) adding low value or no value comments to other sites with the purpose of building links to another website.

Automated blog spam example:
“Nice post!”
– by Discreat Overnight Cialis Online Canadian Pharmacy Free Shipping

Manual blog spam example:
“I just wrote about this on my site. I don’t know you, but I thought I would add no value to your site other than linking through to mine. Check it out!!!!!”
– by seo101guy, a manual spammer (usually with keywords as the name)

As time passes, both manual and automated blog comment spam systems are evolving to look more like legitimate comments. I have seen some automated blog comment spam systems that have multiple fake personas that converse with one another.


Blogger

Blogger is a free blog platform owned by Google. It allows you to publish sites on a subdomain of Blogspot.com, or to FTP content to your own domain. If you are serious about building a brand or making money online, you should publish your content to your own domain, because it can be hard to reclaim a website’s link equity and age related trust if you have built years of link equity into a subdomain on someone else’s website. Blogger is probably the easiest blogging software tool to use, but it lacks many features present in other blog platforms.

See also: Blogger.com


Blogroll

A link list on a blog, usually linking to other blogs owned by the same company or friends of that blogger. Good for building internal links to previous content; putting relevant posts together on one page may greatly increase readership.


Bold

A way to make words appear in a bolder font. Words that appear in a bolder font are more likely to be read by humans who are scanning a page, since few people read an entire document – most take in snippets of the most important sections. A search engine may also place slightly higher weight on these words than on regular text, but if you write natural page copy and a word or phrase appears on a page many times, it probably does not make sense or look natural to bold every occurrence.

Example use: <b>words</b> or <strong>words</strong> – either would appear as bold words.


Bookmarks

Most browsers come with the ability to bookmark your favorite pages. Many web-based services have also been created to allow you to bookmark and share your favorite resources. The popularity of a document (as measured in terms of link equity, number of bookmarks, or usage data) is a signal for the quality of the information, and some search engines use bookmarks to help aid their search relevancy.

Social bookmarking sites are often called tagging sites. Del.icio.us is the most popular social bookmarking site. Yahoo! MyWeb also allows you to tag results, and Google allows you to share feeds and/or tag pages. There are also several meta news sites that allow you to tag interesting pages; if enough people vote for your story, it gets featured on the homepage. Slashdot is a tech news site primarily driven by central editors, while Digg covers the same type of news but is a bottom-up news site which allows readers to vote for what they think is interesting. Many forms of vertical search, like YouTube, also allow you to tag content.

Boolean Search

Many search engines allow you to perform searches using Boolean operators such as AND, OR, and NOT. By default most search engines include AND with your query, requiring results to be relevant for all the words in your query.

Examples:
  • A Google search for SEO Book will return results for SEO AND Book.
  • A Google search for “SEO Book” will return results for the phrase SEO Book.
  • A Google search for SEO Book -Jorge will return results containing SEO AND Book but NOT Jorge.
  • A Google search for ~SEO -SEO will find results with words related to SEO that do not contain the word SEO.
Some search engines also allow you to search for other unique patterns or filtering ideas.Examples:
  • A numerical range: 12..18 would search for numbers between 12 and 18.
  • Recently updated: seo {frsh=100} would find recently updated documents. MSN search also lets you place more weight on local documents
  • Related documents: related:www.threadwatch.org would find documents related to Threadwatch.
  • Filetype: AdWords filetype:PDF would search for PDFs that mentioned AdWords.
  • Domain Extension: SEO inurl:.edu
  • IP Address: IP:
See also: Search Engine Showdown Features Chart – Greg R. Notess’s comparison of features at major search engines.
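The default AND behaviour and the -term exclusion described above can be sketched as a toy matcher over a handful of documents (the documents and the simplified query syntax are assumptions; real engines are far more sophisticated):

```python
# Sketch: every plain term must appear (implicit AND); a "-term" must not.
docs = {
    1: "seo book review",
    2: "seo guide",
    3: "seo book by jorge",
}

def matches(query, text):
    words = set(text.split())
    for term in query.split():
        if term.startswith("-"):
            if term[1:] in words:      # NOT: excluded term is present
                return False
        elif term not in words:        # AND: a required term is missing
            return False
    return True

print([d for d, t in docs.items() if matches("seo book", t)])         # [1, 3]
print([d for d, t in docs.items() if matches("seo book -jorge", t)])  # [1]
```

Document 3 matches "seo book" but is dropped by "-jorge", mirroring the SEO Book -Jorge example above.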


Bot

Also known as a robot, spider, or crawler: a program which performs a task more or less autonomously. Search engines use bots to find and add web pages to their search indexes. Spammers often use bots to “scrape” content for the purpose of plagiarizing it, or to collect information from web pages.

Bounce rate

The percentage of users who enter a site and then leave it without viewing any other pages. Bounce rate can act as a relevancy ranking factor: if a lot of your traffic bounces from your website after arriving from a particular search keyword, search engines may see your page/website as less relevant and drop you down in the rankings. 
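The percentage described above is a simple ratio; a minimal sketch:

```python
def bounce_rate(single_page_sessions, total_sessions):
    """Bounce rate = sessions with exactly one pageview / all sessions, as a %."""
    if total_sessions == 0:
        return 0.0
    return 100.0 * single_page_sessions / total_sessions

# 420 of 1,000 sessions viewed only one page -> 42% bounce rate
```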


Brand

The emotional response associated with a company and/or product. A brand is built through controlling customer expectations and the social interactions between customers. Building a brand is what allows businesses to move away from commodity based pricing and toward higher margin value based pricing. This duality is demonstrated by recognizing a marketing/sales aspect of branding and a social aspect of branding. Search engines may look at signals like repeat website visits and visits to a site based on keywords associated with a known entity, and use them as relevancy signals in algorithms like Panda. See also: Rob Frankel – branding expert who provides free branding question answers every Monday. He also offers Frankel's Laws of Big Time Branding™, blogs, and wrote the branding book The Revenge of Brand X. 

Branded Keywords

 Keywords or keyword phrases associated with a brand or entity. Typically branded keywords occur late in the buying cycle, and are some of the highest value and highest converting keywords. These searches may be used as relevancy signals in algorithms like Panda. It is rumored that the majority of anchor text should be branded.

Breadcrumb Navigation

 A navigational technique used to help search engines and website users understand the relationship between pages. Example breadcrumb navigation: Home > SEO Tools > Free SEO Audit. Whatever page the user is on is unlinked, but the pages above it within the site structure are linked to, organized starting with the home page and proceeding through the site file structure. Breadcrumbs are great for building a silo structure for optimal internal linking, as well as being helpful to users. 

Brin, Sergey

 Co-founder of Google.See also:Wikipedia: Sergey Brin 

Broken Link

 A hyperlink or anchor which is not functioning; a link which does not lead to the desired location. Links may be broken for many reasons, but four of the most common are:
  • a website/web page going offline
  • linking to content which is temporary in nature (due to licensing structures or other reasons)
  • moving a page’s location
  • changing a domain’s content management system
Most large websites have some broken links, but if too many of a site's links are broken it may be an indication of outdated content, and it may provide website users with a poor user experience – both of which may cause search engines to rank a page as less relevant. I love using ScreamingFrog as my broken link checker, and if you are smart you can use broken links as one of the most scalable link building techniques for your website and content: by checking other people's websites you can identify broken links and offer a replacement page on your own website! 
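A minimal link checker can be sketched with just the Python standard library. ScreamingFrog (mentioned above) is a full tool; this is only an illustration of the underlying idea, and the function names and the 400-and-up cut-off for "broken" are my own assumptions.

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def status_of(url, timeout=10):
    """Fetch a URL's HTTP status code; None means the request never completed."""
    req = Request(url, method="HEAD", headers={"User-Agent": "link-check/0.1"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code   # 4xx/5xx error responses still carry a status code
    except URLError:
        return None     # DNS failure, connection refused, timeout, ...

def is_broken(status):
    """Treat error statuses and dead connections as broken links."""
    return status is None or status >= 400

# usage: is_broken(status_of("https://example.com/some-page"))
```

Note that, as the 200/404 entries at the top of this glossary point out, some misconfigured sites return 200 for missing pages, so status codes alone can under-report broken links.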


Browser

 A client used to view the world wide web. The most popular browsers include Google Chrome, Mozilla Firefox, and Microsoft Edge (the successor to Internet Explorer). 

Bush, Vannevar

 WWII-era scientist who wrote a seminal research paper on the concepts of hypertext and a memory extension device, titled As We May Think. 


Business.com

 A well-trusted directory of business websites and information. Business.com is also a large pay per click arbitrage player. See also: Business.com 

Buying Cycle

 Before making large purchases consumers typically research what brands and products fit their needs. Keyword-based search marketing allows you to reach consumers at any point in the buying cycle. In many markets, branded keywords tend to have high search volumes and high conversion rates. The buying cycle consists of the following stages:
  • Awareness Stage
  • Consideration Stage
  • Decision Stage
 See also: What is the Buyer’s Journey – Hone it! 




Cache

 A copy of a web page stored by a search engine. When you search the web you are not actively searching the whole web, but are searching files in the search engine index. Some search engines provide links to cached versions of pages in their search results, and allow you to strip some of the formatting from cached copies of pages. 

Calacanis, Jason

 Founder of Weblogs, Inc. Also pushed AOL to turn Netscape into a Digg clone & then launched longtail SEO play Mahalo.See also:Calacanis.com – Jason’s blog 

Canonical URL

 Many content management systems are configured with errors which cause duplicate or exceptionally similar content to get indexed under multiple URLs. Many webmasters use inconsistent link structures throughout their sites, causing the exact same content to get indexed under multiple URLs. The canonical version of any URL is the single most authoritative version indexed by major search engines. Search engines typically use PageRank or a similar measure to determine which version of a URL is the canonical URL. Webmasters should use consistent linking structures throughout their sites to ensure that they funnel the maximum amount of PageRank at the URLs they want indexed. When linking to the root level of a site or a folder index it is best to end the link location at a '/' instead of placing the index.html or default.asp filename in the URL. Examples of URLs which may contain the same information in spite of being at different web addresses:
  • http://www.seobook.com/
  • http://www.seobook.com/index.shtml
  • http://seobook.com/
  • http://seobook.com/index.shtml
  • http://www.seobook.com/?tracking-code
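Collapsing those duplicate variants onto one canonical form can be sketched as follows. The rules here (prefer www, strip directory-index filenames, drop tracking query strings) are illustrative defaults; the "right" canonical form is a per-site decision, and the `DEFAULT_DOCS` set is my own assumption.

```python
from urllib.parse import urlsplit, urlunsplit

# illustrative directory-index filenames to strip
DEFAULT_DOCS = {"index.html", "index.shtml", "index.php", "default.asp"}

def canonicalize(url, prefer_www=True):
    """Collapse common duplicate-URL variants onto one canonical form."""
    scheme, host, path, query, _frag = urlsplit(url.lower())
    if prefer_www and not host.startswith("www."):
        host = "www." + host
    # strip directory-index filenames so /index.shtml and / match
    last = path.rsplit("/", 1)[-1]
    if last in DEFAULT_DOCS:
        path = path[: -len(last)]
    if not path:
        path = "/"
    # drop tracking query strings and fragments for this sketch
    return urlunsplit((scheme, host, path, "", ""))

# all five example URLs from the entry above collapse to one address
urls = ["http://www.seobook.com/", "http://www.seobook.com/index.shtml",
        "http://seobook.com/", "http://seobook.com/index.shtml",
        "http://www.seobook.com/?tracking-code"]
assert len({canonicalize(u) for u in urls}) == 1
```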


CAN-SPAM

 A US law that dictates how email marketers can treat their email subscribers, obtain email addresses, and other rules regulating email submissions. This law is regularly violated because people scrape and sell emails; it is very important to understand this law, especially if you are doing some form of email marketing. See also: CAN-SPAM Act 


Catalog

 (see Index) 

Catch All Listing

 A listing used by pay per click search engines to monetize long tail terms that are not yet targeted by marketers. This technique may be valuable if you have very competitive keywords and are ok with targeting everyone to get ads in front of potential customers. It is not an ideal method, since most major search engines have editorial guidelines that prevent bulk untargeted advertising, and most of the places that allow catch all listings have low traffic quality. Catch all listings may be an attractive idea on theme specific search engines and directories, though, as those clicks are already pre-qualified. 


CGI

 Common Gateway Interface – interface software between a web server and other machines or software running on that server. Many CGI programs are used to add interactivity to a website. 


Chrome

 Primarily known as Google's web browser; there is also an OS by the same name. Google Chrome is the most popular web browser in use today. 

Click fraud

 Improper clicks on a PPC advertisement, usually by the publisher or their associates, for the purpose of undeserved profit. Click fraud is a huge issue for ad platforms like Google, because it lowers advertiser confidence that they will get fair value for their ad spend. 


Clickbait

 A piece of content designed solely to get people to click it, even if that means using an inaccurate headline or photograph. 


Client

 A program, computer, or process which makes information requests to another computer, process, or program. Often used to refer to software such as a web browser. 


Cloaking

 Displaying different content to search engines and searchers. Depending on the intent of the display discrepancy and the strength of the brand of the person/company cloaking, it may be considered reasonable or it may get a site banned from a search engine. Cloaking has many legitimate uses which are within search guidelines; for example, changing the user experience based on location is common on many popular websites. It is worth noting that the greatest conversions come from a personalized page. See also: Dan Kramer's guide to cloaking – The Definitive Guide to Cloaking 

Cluetrain Manifesto, The

 A book about how the web is a marketplace, and how it is different from traditional offline business. See also: The Cluetrain Manifesto website – offers the book for free online. 


Clustering

 In search results, the listings from any individual site are typically limited to a certain number and grouped together, both to make the search results appear neat and organized and to ensure diversity amongst the top ranked results. Clustering can also refer to a technique which allows search engines to group hubs and authorities on a specific topic together, further enhancing their value by showing their relationships. Clustering is super useful when doing keyword and competitor research: it helps catalog and determine relationships and the industry leaders to beat. See also: Google Touchgraph – this tool is a cluster creator, it's awesome, worth at least one use! 


CMS

 Content Management Systems are tools used to help update and add information to a website. Blog software programs are some of the most popular content management systems currently used on the web. Many content management systems have errors associated with them which make it hard for search engines to index content, due to issues such as duplicate content. The most popular CMS is WordPress.org. 


Co-citation

 In topical authority based search algorithms, links which appear near one another on a page may be deemed to be related to one another. In algorithms like latent semantic indexing, words which frequently appear near one another are often deemed to be related. 

Code swapping

 Changing the content after high rankings are achieved.See also:bait and switch 

Comment spam

 Posting blog comments for the purpose of generating an inlink to another site. The reason many blogs use link condoms. 


Comments

 Many blogs and other content management systems allow readers to leave user feedback. Leaving enlightening and thoughtful comments on someone else's related website is one way to get them to notice you. 

Comments Tag

Some web developers place comments in the source code of their work to help make it easy for people to understand the code. HTML comments appear in the source code as <!-- comment -->. They can be viewed if someone checks the source code of a document, but do not appear in the regular rendered HTML version of a document. In the past, SEOs would stuff keywords in comment tags to help increase the page keyword density, but search has evolved beyond that stage; at this point using comments to stuff keywords into a page adds to your risk profile and presents little ranking upside potential. 

Compacted Information

Information which is generally and widely associated with a product. For example, most published books have an ISBN.As the number of product databases online increases and duplicate content filters are forced to get more aggressive the keys to getting your information indexed are to have a site with enough authority to be considered the most important document on that topic, or to have enough non-compacted information (for example, user reviews) on your product level pages to make them be seen as unique documents. 

Concept Search

A search which attempts to conceptually match results with the query rather than with the literal query words. For example, if a search engine understands a phrase to be related to another word or phrase, it may return results relevant to that other word or phrase even if the words you searched for are not directly associated with a result. In addition, some search engines will place various types of vertical search results at the top of the search results based on implied query intent or prior search patterns by you or other searchers. 

Conceptual Links

Links which search engines attempt to understand beyond just the words in them. Some advanced search engines attempt to understand the concept behind a link rather than just matching the words of the anchor text to a specific word set. Some search algorithms may even look at co-citation and words near the link instead of focusing only on anchor text. 


Content

(text, copy) The part of a web page that is intended to build value for and increase interest to the user. Advertising, navigation, branding and boilerplate are not considered to be content.

Contextual Advertising

Advertising programs which generate relevant advertisements based on the content of a web page. Google AdSense is the most popular contextual advertising program.


Conversion

Many forms of online advertising are easy to track. A conversion is reached when a desired goal is achieved. Most offline ads have generally been much harder to track than online ads; some marketers use custom phone numbers or coupon codes to tie offline activity to online marketing. Here are a few common examples of desired goals:
  • a product sale
  • completing a lead form
  • a phone call
  • capturing an email
  • filling out a survey
  • getting a person to pay attention to you
  • getting feedback
  • having a site visitor share your website with a friend
  • having a site visitor link to your site
See also: Setup Conversion tracking; Google Website Optimizer – a free multivariable testing application

Conversion rate

 Percentage of users who convert – see conversion.
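The percentage described here is a simple ratio of conversions to visitors; a minimal sketch:

```python
def conversion_rate(conversions, visitors):
    """Share of visitors who completed the desired goal, as a percentage."""
    return 100.0 * conversions / visitors if visitors else 0.0

# 25 completed lead forms from 1,000 visits -> 2.5% conversion rate
```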


Cookie

 A small data file written to a user's local machine to track them and to store information used to personalize their web experience.


Copyright

 The legal rights to publish and reproduce a particular piece of work. See also: Copyright.gov


CPA

 Cost per action. Many forms of online advertising have their effectiveness measured on a cost per action basis. Many affiliate marketing programs and contextual ads are structured this way. An action may be anything from an ad click, to filling out a lead form, to buying a product.


CPC

 Cost per click. Many search ads and contextually targeted ads are sold in auctions where the advertiser is charged a certain price per click. See also: Google AdWords – Google's pay per click ad program which allows you to buy search and contextual ads; Google AdSense – Google's contextual ad program; Facebook Ads – huge bang for the buck with crazy targeting options; Yahoo! Search Marketing – Yahoo!'s pay per click ad platform


CPM

 Cost per thousand ad impressions. Many people use CPM as a measure of how profitable a website is or has the potential of becoming. CPM is also used to price ad campaigns and calculate publisher payouts.
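Because CPM is a rate per 1,000 impressions, spend and earnings are straightforward to compute. A minimal sketch (the function names are my own):

```python
def cpm_cost(impressions, cpm):
    """Advertiser spend for a CPM-priced buy: rate charged per 1,000 impressions."""
    return impressions / 1000 * cpm

def effective_cpm(earnings, impressions):
    """Publisher-side eCPM: what 1,000 impressions actually earned."""
    return earnings / impressions * 1000 if impressions else 0.0

# 50,000 impressions at a $2.40 CPM cost the advertiser $120
```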

Crawl Depth

 How deeply a website is crawled and indexed. Since searches which are longer in nature tend to be more targeted, it is important to get most or all of a site indexed, such that deeper pages have the ability to rank for relevant long tail keywords. A large site needs adequate link equity to get deeply indexed. Another thing which may prevent a site from being fully indexed is duplicate content issues.

Crawl Frequency

 How frequently a website is crawled. Sites which are well trusted or frequently updated may be crawled more frequently than sites with low trust scores and limited link authority. Sites with highly artificial link authority scores (i.e. mostly low-quality spammy links), or sites which are heavy in duplicate or near duplicate content (such as affiliate feed sites), may be crawled less frequently than sites with unique content which are well integrated into the web. See also: Matt Cutts confirms the Google Sandbox effect as being an accidental side effect of another part of the relevancy scoring algorithm (Part 1, Part 2); Matt Cutts' post on the indexing timeline – mentions sites with unnatural link profiles may not be crawled as frequently or deeply 


Crawler

 A program which moves through the world wide web or a website by way of the link structure to gather data. Also known as a bot or spider.


CSS

 Cascading Style Sheets is a method for adding styles to web documents. Note: Using external CSS files makes it easy to change the design of many pages by editing a single file. You can link to an external CSS file using code similar to <link rel="stylesheet" href="style.css"> in the head of your HTML documents. See also: Mezzo Blue CSS layout samples; Glish.com CSS techniques


CTR

 Click through rate is the percentage of people who click on an advertisement or listing they viewed, which is a way to measure how relevant a traffic source or keyword is to the target demographic. Search ads typically have a higher click through rate than traditional banner ads, due to being highly relevant to implied searcher demand and a history of questionable ad labeling disclosure by search engines. A search engine can determine whether a particular search query is navigational (branded) versus informational or transactional by analyzing the relative CTR of different listings on the search result page and the CTR of people who have repeatedly searched for a particular keyword term. A navigational search tends to have many clicks on the top organic listing, while the CTR curve is often far flatter on informational or transactional searches. See also: Consumer Ad Awareness in Search Results; Advanced Web Ranking CTR curve; A Taxonomy of Web Search [PDF]; Determining the Informational, Navigational & Transactional Intent of Web Queries [PDF]; Automatic Query Type Identification Based on Click Through Information [PDF]
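The two measurements in this entry can be sketched as follows. The 0.6 threshold in the steep-versus-flat click-curve heuristic is an assumed illustrative cut-off, not a value published by any search engine.

```python
def ctr(clicks, impressions):
    """Click through rate as a percentage of impressions."""
    return 100.0 * clicks / impressions if impressions else 0.0

def looks_navigational(result_clicks, threshold=0.6):
    """Crude intent heuristic from the entry above: if the top organic
    listing absorbs most clicks the query is likely navigational (branded);
    a flatter click curve suggests informational/transactional intent."""
    total = sum(result_clicks)
    return bool(total) and result_clicks[0] / total >= threshold

# steep curve -> navigational; flat curve -> informational
```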

Cutts, Matt

 Long-time head of Google's webspam team. He resigned from Google at the end of 2016 to continue working at the US Digital Service. See also: Matt Cutts blog; Interview of Matt Cutts; SEO Myths by Matt Cutts


Cybersquatting

 Registering domains related to other trademarks or brands in an attempt to cash in on the value created by said trademark or brand. Remember that time Google.com was briefly bought out from under them?



Dayparting

 Turning ad campaigns on or off, changing ad bid prices, or shifting budget so that you bid more when your target audience is available and less when they are less likely to be available. Using these techniques can save you money and increase the effectiveness of your ads. See also: Setting up automated bid rules


De-Listing

 Temporarily or permanently being removed from a directory or search engine. De-listing may be due to any of the following:
  • Pages on new websites (or sites with limited link authority relative to their size) may be temporarily de-indexed until the search engine does a deep crawl and re-cache of the web.
  • During some updates, search engines readjust crawl priorities.
  • You need a significant number of high-quality links to get a large website well indexed and keep it well indexed.
  • Duplicate content filters, inbound and outbound link quality or other information quality-related issues may also relate to re-adjusted crawl priorities.
  • Pages which have changed location and are not properly redirected or pages which are down when a search engine tries to crawl them may be temporarily de-indexed.
  • Search Spam:
    • If a website tripped an automatic spam filter it may return to the search index anywhere from a few days to a few months after the problem has been fixed.
    • If a website is editorially removed by a human (manual penalty) you may need to contact the search engine directly to request inclusion.

Dead Link

 A link which is no longer functional. Most large high-quality websites have at least a few dead links, but the ratio of good links to dead links can be seen as a sign of information quality.

Dedicated Server

 A server which only serves one website or a small collection of websites owned by a single person. Dedicated servers tend to be more reliable than shared (or virtual) servers. Dedicated servers usually run from $100 to $500 a month, while virtual servers typically run from $5 to $50 per month. There may be a weak ranking preference for websites on a dedicated server (most likely for safety or trust reasons).

Deep Link

 A link which points to an internal page within a website. When links grow naturally, most high-quality websites end up with many links pointing at interior pages. When you request links from other websites it makes sense to request a link from their most targeted relevant page to your most targeted relevant page. Some webmasters even create content based on easy linking opportunities they think up. Spreading link equity to deep levels of your website, or to highly specific content pages, is a great way to build total website Domain Authority.

Deep Link Ratio

 The ratio of links pointing at internal pages to overall links pointing at a website. A high deep link ratio is typically a sign of a legitimate natural link profile.
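The ratio can be computed directly from a list of inbound link targets; a minimal sketch that treats any link to a non-root path as deep:

```python
def deep_link_ratio(link_paths):
    """link_paths: iterable of target paths for a site's inbound links.
    Links to anything other than the root page count as deep links."""
    paths = list(link_paths)
    deep = sum(1 for path in paths if path not in ("", "/"))
    return deep / len(paths) if paths else 0.0

# 3 of these 5 inbound links point at interior pages -> ratio of 0.6
inbound = ["/", "/tools/", "/blog/post-1", "/", "/glossary"]
```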


Del.icio.us

 Popular social bookmarking website. See also: Del.icio.us


Demographics

 Statistical data or characteristics which define segments of a population or audience. Some internet marketing platforms, such as AdCenter and AdWords, allow you to target ads at websites or searchers who fit within a specific demographic. Some common demographic data points are gender, age, income, education, location, and job description. Currently, Facebook Ads offer the strongest demographic targeting options.


Description

 Directories and search engines provide a short description near each listing which aims to add context to the title (also known as a 'meta description'). High-quality directories typically prefer that the description mention what the site is about, rather than being overtly promotional in nature. In search results, the description shown is a snippet: a description of the page content which is relevant to the particular search query and ranking page. Snippets do not have to be text; they can be snapshots of inventory, video samples and much more.


Digg

 Digg is a news aggregator with a curated front page, aiming to select stories specifically for the Internet audience, such as science, trending political issues, and viral Internet issues. See also: Digg.com


Directory

 A categorized catalog of websites, typically manually organized by topical editorial experts. Some directories cater to specific niche topics, while others are more comprehensive in nature. Major search engines likely place significant weight on links from trusted directories, as they act as citations to your website, increasing site authority. Smaller and less established general directories likely pull less weight, though they may still pass referral traffic and increase local search signals.


Disavow Tool

 The link disavow tool is a way for a webmaster to state they do not vouch for a collection of inbound links to their website. Recovering from manual link penalties will often require removing some lower quality inbound links and disavowing other low-quality links. For automated link penalties, like Penguin penalties, the disavow tool should be sufficient for recovery; however, Google still has to crawl the disavowed pages before the disavow is applied to the links. It may still make sense to remove some lower quality links to diminish future risk of manual penalties. With the rise of negative SEO, publishers in spammy industries may be forced to use the disavow tool proactively. See also: How to use the Disavow tool; Disavow tool


DMOZ

 The Open Directory Project was the largest human edited directory of websites. DMOZ was owned by AOL and was primarily run by volunteer editors. DMOZ was shut down on March 17, 2017. See also: DMOZ.org


DNS

 Domain Name Server or Domain Name System. A naming scheme used to help resolve a domain name/host name to a specific TCP/IP address.
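The resolution step itself is available from any programming language via the system resolver. A minimal sketch using the Python standard library (the function name is my own):

```python
import socket

def resolve(hostname):
    """Resolve a hostname to an IPv4 address via the system resolver."""
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return None   # name does not resolve

# resolve("localhost") typically returns "127.0.0.1"
```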

Document Freshness

 (see fresh content)


Domain

 The scheme used for the logical or location organization of the web. Many people also use the word domain to refer to a specific website URL.

Doorway Pages

 Pages designed to rank for highly targeted search queries, typically designed to redirect searchers to a page with other advertisements, or to act as a landing page for a transaction to occur. Some webmasters cloak thousands of doorway pages on trusted domains and rake in a boatload of cash until they are caught and de-listed.

Duplicate Content

 Content which is duplicate or near duplicate in nature. Search engines do not want to index multiple versions of similar content. For example, printer friendly pages may be search engine unfriendly duplicates. Also, many automated content generation techniques rely on recycling content, so some search engines are somewhat strict in filtering out content they deem to be similar or nearly duplicate in nature. See also: Duplicate Content Detection – video where Matt Cutts talks about the process of duplicate content detection; Identifying and Filtering Near-Duplicate Documents; Search Engine Patents on Duplicated Content and Re-Ranking Methods; Stuntdubl: How to Remedy Duplicate Content – oldie, but still relevant
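One common near-duplicate detection technique (word shingling plus Jaccard similarity) can be sketched as follows. This is an illustration of the general idea, not the specific method any search engine uses; the 3-word shingle size is an assumed default.

```python
def shingles(text, k=3):
    """Set of k-word shingles: a standard near-duplicate fingerprint."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def jaccard(a, b):
    """Overlap of two shingle sets: 1.0 = identical, 0.0 = fully disjoint."""
    return len(a & b) / len(a | b) if a | b else 1.0

page = "the quick brown fox jumps over the lazy dog"
printer_friendly = "the quick brown fox jumps over the lazy dog"
other = "an entirely different page about something else here"

# identical body text scores 1.0; unrelated text scores near 0.0
```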

Dwell Time

 The amount of time a searcher spends on a website before leaving it. Some search queries might require significant time for a user to complete their information goals, while other queries might be quickly answered by a landing page, so dwell time in isolation may not be a great relevancy signal. However, search engines can also look at other engagement metrics, like repeat visits, branded searches, relative CTR, and whether users clicking on a particular listing have a high pogo rate (subsequently clicking on yet another search result), to get an idea of a user's satisfaction with a particular website, and fold these metrics into an algorithm like Panda. So does dwell time affect your search engine rankings? Yes: it is a factor in the engagement metrics which get bundled into a larger relevancy signal. See also: Dwell time, the most important metric you aren't measuring

Dynamic Content

 Content which changes over time or uses a dynamic language such as PHP to help render the page. In the past search engines were less aggressive at indexing dynamic content than they currently are. While they have greatly improved their ability to index dynamic content, it is still preferable to use URL rewriting to help make dynamic content look static in nature. In my experience, the fewer changes to a page during site load the better the rankings a page is able to receive – this may be due to keyword density fluctuations on the page or the uncertainty it presents to a crawler.

Dynamic Languages

 Programming languages such as PHP or ASP which build web pages upon request.


Earnings Per Click

 Many contextual advertising publishers estimate their potential earnings based on how much they make from each click. See also: Free internet marketing calculators

Editorial Link

 Search engines count links as votes of quality. They primarily want to count editorial links that were earned over links that were bought or bartered. Many paid links, such as those from quality directories, still count as votes as long as they are also associated with editorial quality standards. If links are from sites without editorial control, like link farms, they are not likely helping you rank well. Using an algorithm similar to TrustRank, some search engines may place more trust on well-known sites with strong editorial guidelines.


Emphasis

 An HTML tag used to emphasize text, which most browsers render as italics. Please note that it is more important that copy reads well to humans than any boost you may think you will get by tweaking it for bots. If every occurrence of a keyword on a page is emphasized, the page will be hard to read, convert poorly, and may look weird to search engines and users alike. It is worth remembering that search engines take into consideration higher level factors like dwell time, click through rate, pages per session, and bounce rate – it is in your best interest to consider the user. <em>emphasis</em> would appear as emphasis.


Engagement

 A measurement of how people responded to or interacted with a piece of content. Engagement can be clicks, likes, shares or opens, and is sometimes expressed as a percentage. Depending on the goals for a campaign, engagement may be the single most important factor for success.

Engagement Metrics

  • Daily active users, weekly, and monthly active users (active users are those who are performing the desired task like sharing, liking, or making a purchase)
  • Stickiness is generally calculated as the ratio of Daily Active Users to Monthly Active Users. A DAU/MAU ratio of 50% would mean that the average user of your app is using it 15 out of 30 days that month.
  • D1, D7 and D30 retentions are calculated as the percentage of users who are active at any time after 1 day, 7 days and 30 days of signing up or installing your app. This is often shown in tabular form in tools like Woopra, Mixpanel, and KISSmetrics cohorts analysis, but the most useful way to model the data though is a retention curve.
  • Brian Balfour (Hubspot) says that 7 day, or “Week 1 Retention (W1)” “has one of the biggest impacts on your retention over time” because improvements that you make in week 1 retention carry through the entire retention curve. For website and web-apps, typically 60-80% of new users are lost within the first week of signup. In mobile, it can be 70-80% users lost after the first day. Think of the untapped potential!
  • Use ‘red-flags’ or ‘Tripwires’ to engage visitors. Typically users don’t ask for help and tend to drift away if they don’t get value quickly. If you can spot those users and reach out, you may be able to engage them. To help spot users who aren’t engaged, you can use Tripwire Metrics to identify users who aren’t engaging well and use that data to quickly act to engage them during those vital first 7 days.
Search engines may analyze end user behavior to help refine and improve their rankings. Sites which get a high CTR and have a high proportion of repeat visits from brand related searches may get a ranking boost in algorithms like Panda.
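The stickiness and retention ratios described in the list above reduce to simple divisions; a minimal sketch (function names are my own):

```python
def stickiness(dau, mau):
    """DAU/MAU ratio: what fraction of monthly users show up on a given day."""
    return dau / mau if mau else 0.0

def retention(active_later, signups):
    """Dn/Wn retention: share of a signup cohort still active n days/weeks later."""
    return active_later / signups if signups else 0.0

# a 50% DAU/MAU ratio ~ the average user is active 15 of 30 days that month
# W1 retention example: 300 of 1,000 signups are still active after week one
```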


Entity

 People, places or things which search engines aim to know and present background information about. Brands are a popular form of entity, but many other forms of information, like songs or movies, are also entities. Information about entities may be shown in knowledge graph results.

Entry Page

 The page through which a user enters your site. If you are buying pay per click ads it is important to send visitors to the most appropriate and targeted page associated with the keyword they searched for (ad quality score increases the better the landing page correlates with the ad). If you are doing link building it is important to point links at your most appropriate page when possible: anyone clicking the link is sent to the most relevant page, you help search engines understand what the pages on your site are associated with, and the weight of the link is increased drastically. See also: One of my favorite tools in Google Analytics – check out the Behavior Flow reports

Ethical SEO

 Search engines like to paint SEO services which manipulate their relevancy algorithms as being unethical. In truth, no particular technique is inherently ethical or unethical; techniques are simply effective or ineffective. Some search marketers lacking in creativity tend to describe services sold by others as being unethical while their own services are ethical. The only ethics issues associated with SEO are generally business ethics issues. Two of the bigger frauds are:
  • Not disclosing risks: Some SEOs may use high-risk techniques when they are not needed. Some may make the situation even worse by not disclosing potential risks to clients.
  • Taking the money and doing nothing: Since selling SEO services has almost no start up costs, many of the people selling services may not actually know how to competently provide them. Some shady people claim to be SEOs and bilk money out of unsuspecting small businesses – I see it all the time when people switch over to us. Unfortunately, because of the general lack of education combined with how lucrative the SEO industry can be, bad SEOs can get away with what seems like murder.
As long as the client is aware of potential risks there is nothing unethical about being aggressive.


Everflux

 Major search indexes are constantly updating. Google refers to this continuous refresh as “everflux”.In the past, Google updated their index roughly once a month. Those updates were named ‘Google Dances’, but since Google shifted to a constantly updating index it no longer does what was traditionally called a Google Dance. Updates now happen very frequently and can stir up significant ranking changes (using MozCast you can see how volatile the algorithm is at any given time – it is up to good SEOs to determine what needs to change in order to ride the wave).See also:Matt Cutts Google Terminology Video – Matt talks about the history of Google updates and the shift from Google Dances to everflux.

Expert Document

 A quality page which links to many non-affiliated topical resources.See also:Hilltop: A Search Engine based on Expert Documents

External Link

 A link which references another domain.Some people believe in link hoarding, but linking out to other related resources is a good way to help search engines understand what your site is about. If you link out to lots of low-quality sites or rely primarily on low-quality reciprocal links, some search engines may not rank your site very well. Search engines are more likely to trust high-quality editorial links (both to and from your site). Just as citing credible sources builds credibility in academic writing, using high-authority, credible citations on the web builds credibility for both parties.


Fair Use

 The stated exceptions of allowed usage of work under copyright laws without requiring permission of the original copyright holder. Fair use is covered in section 107 of the Copyright code.See also:US Copyright Office Section 107


Favicon

 Favicon is short for “favorites icon”; it is the small icon which appears next to URLs in a web browser.Upload an image named favicon.ico to the root of your site to have your site associated with a favicon.

See also:Quick Favicon generator


Favorites

 (see bookmarks)


Feed

 Many content management systems, such as blogs, allow readers to subscribe to content update notifications via RSS or XML feeds. These act as a constant flow of content from the source; the content can take the form of blog posts, videos, podcasts, etc. Feeds can also refer to pay per click syndicated feeds, or merchant product feeds. Merchant product feeds have become less effective as content generators because duplicate content filters have become quite good.
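Under the hood a feed is just structured XML. As a rough sketch (the feed string, titles, and URLs below are invented for illustration), Python's standard library can pull the entry titles and links out of a minimal RSS 2.0 document:

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical RSS 2.0 feed such as a blog CMS might publish.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>Fresh Content Ideas</title><link>https://example.com/fresh</link></item>
    <item><title>Link Building Basics</title><link>https://example.com/links</link></item>
  </channel>
</rss>"""

def parse_feed(xml_text):
    """Return (title, link) pairs for every <item> in an RSS 2.0 feed."""
    root = ET.fromstring(xml_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

entries = parse_feed(SAMPLE_FEED)
```

A feed reader does essentially this, then polls the feed URL periodically for new items.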

Feed Reader

 Software or a website used to subscribe to feed update notifications. Many feed readers have been discontinued due to changing media preferences, such as Google Reader and Bloglines.See also:FeedDemon – desktop based feed reader


FFA

 Free-for-all pages are pages which allow anyone to add a link to them. These links do not pull much weight in search relevancy algorithms because many automated programs fill these pages with links pointing at low-quality websites; current search algorithms treat them as basically irrelevant.


Filter

 Certain activities or signatures which make a page or site appear unnatural may make search engines inclined to filter or remove it from the search results.For example, if a site publishes significant duplicate content it may get a reduced crawl priority and be filtered out of the search results. Some search engines also have filters based on link quality, link growth rate, and anchor text. Some pages are also penalized for spamming.


Firefox

 Popular open source web browser. Mozilla Firefox is the second most popular browser to date.See also:Download Firefox


Flash

 Vector graphics-based animation software which makes it easier to build interactive-looking websites.Search engines tend to struggle to index and rank Flash websites because Flash files typically contain very little relevant content which crawlers can understand. If you use Flash, ensure:
  • you embed flash files within HTML pages
  • you use a noembed element to describe what is in the flash player
  • you publish your flash content in multiple separate files such that you can embed appropriate flash files in relevant pages


Follower

 Someone who chooses to see updates from you on a social media site like Instagram, Facebook, or LinkedIn.


Following

 Often refers to a “social following”. The group of people who have elected to get updates from you across all social media channels.


Forum

 An online discussion group that meets on a website, similar to a chat room. A popular example is Reddit.

Forward Links

 (see Outbound Links)


Frames

 A technique created by Netscape used to display multiple smaller pages in a single display. This web design technique allows for consistent site navigation but makes it hard to deep link to relevant content.Given the popularity of server side includes, content management systems, and dynamic languages, there really is no legitimate reason to use frames to build a content site today.

Fresh Content

 Content which is dynamic in nature and gives people a reason to keep paying attention to your website, or a piece of content which was recently published.Many SEOs talk a lot about fresh content, but fresh content does not generally mean re-editing old content. It more often refers to creating new content. The primary advantages of fresh content are:
  • Maintain and grow mindshare: If you keep giving people a reason to pay attention to you more and more people will pay attention to you, and link to your site – Growing your audience, brand trust, and domain authority.
  • Faster idea spreading: If many people pay attention to your site, when you come out with good ideas they will spread quickly.
  • Growing archives: If you are a content producer then owning more content means you have more chances to rank. If you keep building additional fresh content eventually that gives you a large catalog of relevant content.
  • Frequent crawling: Frequently updated websites are more likely to be crawled frequently – there is currently a HUGE boost in rankings when fresh content is involved (proven by our clients performing well when combining our blog posting and SEO services – ask about our ‘Hone Hybrid’)
  • QDF: Google’s “query deserves freshness” algorithm may boost the rankings of recently published documents for search queries where they believe users are looking for recent information.
The big risk of creating lots of “fresh” content is that many low cost content sources will have poor engagement metrics, which in turn will lead to a risk of the site being penalized by Panda.A good litmus test on this front is: if you didn’t own your website would you still regularly visit it & read the new content published to it?


FTP

 File Transfer Protocol is a protocol for transferring files between computers.Many content management systems (such as blogging platforms) include FTP capabilities. Web development software such as Dreamweaver also comes with FTP capabilities. There are also a number of free or cheap FTP programs such as CuteFTP, Core FTP, and LeechFTP.

Fuzzy Search

 Search which finds matching terms when terms are misspelled (or fuzzy).Fuzzy search technology is similar to stemming technology, with the exception that fuzzy search corrects the misspellings at the user’s end and stemming searches for other versions of the same core word within the index.
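As a minimal sketch of the idea (not how any particular search engine implements it), Python's standard `difflib` module can correct a misspelled query term against a known vocabulary, which is the user-side correction half of fuzzy search described above:

```python
import difflib

# A tiny index vocabulary; in a real engine this would be the indexed lexicon.
VOCABULARY = ["optimization", "algorithm", "keyword", "ranking", "relevancy"]

def fuzzy_correct(term, vocabulary, cutoff=0.7):
    """Return the closest vocabulary term to a (possibly misspelled) query term.

    Falls back to the original term when nothing is similar enough."""
    matches = difflib.get_close_matches(term, vocabulary, n=1, cutoff=cutoff)
    return matches[0] if matches else term

corrected = fuzzy_correct("algoritm", VOCABULARY)  # "algorithm"
```

Stemming, by contrast, would expand "algorithm" into variants like "algorithms" on the index side rather than correcting the user's input.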



Google Advertising Professional

 Google Advertising Professional was a program which qualified marketers as being proficient AdWords marketers. It has since been replaced by the Google AdWords Certification program.See also:Google AdWords Certification

Gladwell, Malcolm

 The author of the book ‘The Tipping Point’, about how an idea, trend, or social behavior crosses a threshold into widespread popularity. Factors discussed are:
  • The law of the few
  • The stickiness factor
  • and The Power of Context
See also:The Tipping Point – book about how ideas spread via a network of influencers (Connectors, Mavens, and Salesmen)

Godin, Seth

 Popular blogger, author, viral marketer, and business consultant.See also:Seth’s blog – Seth talks about marketingPurple Cow – probably Seth’s most popular book, about how to be remarkable; links are citations or remarks, and this book is highly recommended for any SEOAll Marketers Are Liars – book about creating and marketing authentic brand-related stories in a low-trust worldThe Big Red Fez – a quick book about usability errors common to many websitesSeth Godin TED Talk


Google

 The world’s leading search engine in terms of reach and usage. Google pioneered search by analyzing linkage data via PageRank. Google was created by Stanford students Larry Page and Sergey Brin.See also:Google WikipediaGoogle Research – includes published papers, theories, data structures and more

Google AdSense

 (see AdSense)

Google AdWords

 (see AdWords)

Google Base

 Free database of semantically structured information created by Google.Google Base may also help Google better understand what types of information are commercial in nature, and how they should structure different vertical search products. Google Base is no longer available.

Google Bombing

 Making a page rank well for a specific search query by pointing hundreds or thousands of links at it with the target keywords in the anchor text. This technique is outdated and is now for the most part ignored by ranking algorithms.See also:Google search: miserable failure search query on George Bush

Google Bowling

 Knocking a competitor out of the search results by pointing hundreds or thousands of low-trust, low-quality links at their website.Typically it is easier to bowl new sites out of the results; older established sites are much harder to knock out of the search results. This is a negative SEO technique (not the most ethical, but as the stakes of ranking rise, some SEOs may be tempted to take a negative SEO approach to their services).

Google Content Analyzer

 An experiment setup wizard provided by Google to help run experiments on on-page content. It generates reports for each individual experiment, and you can also see experiment data in your Analytics view – an excellent program for split testing.See also:Google Content Analyzer 

Google Dance

 In the past, Google updated their index about once a month. Those updates were named Google Dances, but since Google shifted to a constantly updating index, Google no longer does what was traditionally called the ‘Google Dance’.Major search indexes are constantly updating. Google refers to this continuous refresh as everflux.The second meaning of Google Dance is a yearly party at Google’s corporate headquarters which Google holds for search engine marketers. This party coincides with the San Jose Search Engine Strategies conference.See also:Supplemental definition for everflux and why Google no longer abides by the Dance

Google Keyword Tool

 Keyword research tool provided by Google which estimates the competition for a keyword, recommends related keywords, and will tell you what keywords Google thinks are relevant to your site or a page on your site. This tool continues to grow and develop additional features, it is a great option for initial market research.See also:Google Keyword Tool

Google OneBox

 A portion of the search results page, above the organic search results, which Google sometimes uses to display vertical search results from Google News, Google Base, and other Google-owned vertical search services.

Google Sitelinks

 On some search results where Google thinks one result is far more relevant than other results (like navigational or brand related searches) they may list numerous deep links to that site at the top of the search results.

Google Sitemaps

 A program which webmasters can use to help Google index content and web pages.Please note that the best way to submit your site to search engines and to keep it in their search indexes is to build high-quality editorial links.See also:Google Webmaster Central – access to Google Sitemaps and other webmaster related tools

Google Supplemental Index

 The index where pages with lower trust scores are stored. Pages may be placed in Google’s Supplemental Index if they consist largely of duplicate content, if the URLs are excessively complex in nature, or the site which hosts them lacks significant trust.

Google Traffic Estimator

 A tool which estimates bid prices and how many Google searchers will click on an ad for a particular keyword.If you do not submit a bid price the tool will return an estimated bid price necessary to rank #1 for 85% of Google’s queries for a particular keyword.See also:Google Traffic Estimator

Google Trends

 An essential tool which allows you to see how Google search volumes for a particular keyword change over time. Particularly useful for upcoming popular search terms and trends.See also:Google Trends

Google Wallet

 A payment service provided by Google which helps Google better understand merchant conversion rates and the value of different keywords and markets. Its most prevalent use is as a customer-facing payment method.See also:Google Wallet

Google Webmaster Guidelines

 An arbitrary and ever-changing collection of specifications which can be used to justify penalizing any website.While some aspects of the guidelines are rather clear, other aspects are blurry and based on inferences which may be incorrect, like “Don’t deceive your users.” Many would argue (and indeed have) that Google’s ad labeling within their own search results is deceptive, that Google has run ads for illegal steroids and other shady offers, etc. The ultimate goal of the Webmaster Guidelines is to minimize the ROI of SEO and discourage active investment in SEO, especially SEO which deceives consumers.See also:Google Webmaster GuidelinesGoogle’s Advertisement LabelingFederal Trade Commission: letter to search engines [PDF]How a Career Con Man Led a Federal Sting That Cost Google $500 Million

Google Webmaster Tools

 Tools offered by Google which show recent search traffic trends, let webmasters set a target geographic market, enable them to request select pages be re-crawled, show manual penalty notifications, and allow webmasters to both disavow links and request a manual review from Google’s editorial team.While some of the Google Webmaster Tools may seem useful, it is worth noting Google uses webmaster registration data to profile & penalize other websites owned by the same webmaster. It is worth proceeding with caution when registering with Google, especially if your website is tied to a business model Google both hates and has cloned in their search results – like hotel affiliates.See also:Google webmaster tools (also known as the Search Console)

Google Website Optimizer

 A free multivariable testing platform used to help AdWords advertisers improve their conversion rates. Google Website Optimizer is no longer available – it is now Google Content AnalyzerSee also:Google Content Analyzer


Googlebot

 Google’s search engine spider.Google has a shared crawl cache between their various spiders, including vertical search spiders and spiders associated with ad targeting.See also:Google’s crawl caching proxy – straight from Matt Cutts

Guestbook Spam

 A type of low-quality automated link which search engines do not place much trust in. It occurs when automated messages with links to a website are posted to guestbook pages and similar open comment sections.



Heading

 The heading element briefly describes the subject of the section it introduces.Heading elements go from H1 to H6, with the lower-numbered headings being most important. You should use only a single H1 element on each page, and may want to use multiple other heading elements to structure a document. An H1 element in source code looks like:<h1>Your Topic</h1>Heading elements may be styled using CSS. Many content management systems place the same content in the main page heading and the page title, although in many cases it may be preferable to vary them where possible.See also:w3.org: Headings


Headline

 The title of an article or story. The most visited and shared content tends to have headlines with high Emotional Marketing Value.See also:Emotional Marketing ValueEMV Headline Analyzer I like usingAlternate EMV tool I like using

Hidden Text

 SEO technique used to show search engine spiders text that human visitors do not see.While some sites may get away with it for a while, generally the risk to reward ratio is inadequate for most legitimate sites to consider using hidden text.


Hijacking

 Making a search engine believe that another website exists at your URL. Typically done using techniques such as a 302 redirect or meta refresh. A duplicate content method which is frowned upon and ignored or penalized by modern search algorithms.


Hilltop

 An algorithm which ranks results largely based on unaffiliated expert citations.See also:Hilltop: A Search Engine based on Expert Documents


HITS

 Link-based algorithm which ranks relevancy scores based on citations from topical authorities.See also:Jon Kleinberg’s Authoritative Sources in a Hyperlinked Environment [PDF]

Home Page

 The main page of your website, which is largely responsible for helping develop your brand and setting up the navigational schemes used to help users and search engines navigate your website.As far as SEO goes, a home page is typically one of the easier pages to rank for some of your more competitive terms, largely because it is easy to build links to a home page. You should ensure your home page stays focused and reinforces your brand, and you should not assume that most visitors will come to your site via the home page. If your site is well structured, many pages on your site will likely be far more popular than your home page and rank better for relevant queries.


Host

 (see Server)


HTML

 HyperText Markup Language is the language in which pages on the World Wide Web are created.Some newer web pages are also formatted in XHTML.See also:HTML Mozilla Developer Network


HTTP

 HyperText Transfer Protocol is the primary protocol used to communicate between web servers and web browsers. It is the means by which data is transferred from its location on a server to an active browser.


Hubs

 Topical hubs are sites which link to well-trusted sites within their topical community. A topical authority is a page which is referenced by many topical hub pages. A topical hub is a page which references many authorities.See also:Mike Grehan on Topic Distillation [PDF]Jon Kleinberg’s Authoritative sources in a hyperlinked environment [PDF]Jon Kleinberg’s home page


Hummingbird

 A Google search algorithm update which better enabled conversational search.See also:FAQ: All About The Google “Hummingbird” AlgorithmThe Google Hummingbird Patent



IDF

 Inverse Document Frequency is a term weight used to help determine the position of a term in a vector space model.IDF = log ( total documents in database / documents containing the term )
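The formula above translates directly into code. This sketch assumes base-10 logarithms (the glossary leaves the base unspecified; any base preserves the ordering between terms):

```python
import math

def idf(total_docs, docs_containing_term):
    """Inverse Document Frequency: log(total documents / documents with the term)."""
    return math.log10(total_docs / docs_containing_term)

# A term appearing in 1,000 of 1,000,000 documents is far more discriminating
# than a term appearing in 500,000 of them, so it gets a much higher weight.
rare_idf = idf(1_000_000, 1_000)
common_idf = idf(1_000_000, 500_000)
```

Intuitively, rare terms get high IDF weights and common stop-words get weights near zero, which is why they contribute little to a relevancy score.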


Impression

 An advertising term for a single instance of an ad being seen (sometimes also used for organic search views). Advertisers can buy ads based on how many impressions they want or, as one alternative, under a pay per click model of advertising.

Inbound Link

 A link pointing to one website from another website.Most search engines allow you to see a sample of links pointing to a document by searching using the link: function. For example, using link:www.seobook.com would show pages linking to the homepage of this site (both internal links and inbound links). Due to canonical URL issues www.site.com and site.com may show different linkage data. Google typically shows a much smaller sample of linkage data than competing engines do, but Google still knows of and counts many of the links that do not show up when you use their link: function.


Index

 A collection of data used as a bank to search through to find a match to a user query. The larger search engines have billions of documents in their indexes.When search engines search, they search reverse indexes by words and return results based on matching relevancy vectors. Stemming and semantic analysis allow search engines to return near matches. Index may also refer to the root of a folder on a web server.
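A reverse (inverted) index of the kind described above can be sketched in a few lines: instead of storing documents and scanning them per query, it maps each word to the documents containing it. The page ids and text here are invented for illustration:

```python
from collections import defaultdict

# Three tiny "documents" standing in for crawled pages.
DOCS = {
    "page1": "seo link building basics",
    "page2": "link equity and anchor text",
    "page3": "keyword research basics",
}

def build_reverse_index(docs):
    """Map each word to the set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.split():
            index[word].add(doc_id)
    return index

index = build_reverse_index(DOCS)
```

A query for "link" then becomes a cheap dictionary lookup rather than a scan of every document, which is what makes web-scale search feasible.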


Influencer

 Someone, usually on social media, who has an unusually strong influence over other people’s opinions. A minor celebrity of sorts with a large social following.

Information Architecture

 Designing, categorizing, organizing, and structuring content in a useful and meaningful way.Good information architecture considers both how humans and search spiders access a website. Information architecture suggestions:
  • focus each page on a specific topic
  • use descriptive page titles and meta descriptions which describe the content of the page
  • use clean (few or no variables) descriptive file names and folder names
  • use headings to help break up text and semantically structure a document
  • use breadcrumb navigation to show page relationships
  • use descriptive link anchor text
  • link to related information from within the content area of your web pages
  • improve conversion rates by making it easy for people to take desired actions
  • avoid feeding search engines duplicate or near-duplicate content

Information Retrieval

 The field of science based on sorting or searching through large data sets to find relevant information.

Internal Link

 Link from one page on a site to another page on the same site.It is preferential to use descriptive internal linking to make it easy for search engines to understand what your website is about. Use consistent navigational anchor text for each section of your site, emphasizing other pages within that section. Place links to relevant related pages within the content area of your site to help further show the relationship between pages and improve the usability of your website.

Internal Navigation

 (see Navigation)


Internet

 A vast worldwide network of computers connected via TCP/IP.

Internet Explorer

 Microsoft’s web browser. After beating out Netscape’s browser in market share, Microsoft failed to innovate on any level for about 5 years, until Firefox forced them to. Internet Explorer has since been replaced by Microsoft Edge.See also:Download Microsoft Edge

Inverted File

 (see Reverse Index)

Invisible Web

 Portions of the web which are not easily accessible to crawlers due to search technology limitations, copyright issues, or information architecture issues.See also:Beyond Google: The Invisible Web

IP Address

 Internet Protocol address. Every computer connected to the internet has an IP address. Some websites and servers have unique IP addresses, but most web hosts place multiple websites on a single host.Many SEOs refer to unique C-class IP addresses. Every site is hosted on a numerical address like aa.bb.cc.dd. In some cases, many sites are hosted on the same IP address. Many SEOs believe that if links come from different IP ranges, with a different number somewhere in the aa.bb.cc part, the links may count more than links from the same local range and host.
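The aa.bb.cc comparison SEOs describe is a simple string operation on the first three octets. A minimal sketch (the addresses below are reserved documentation IPs, used purely as examples):

```python
def c_class(ip):
    """Return the aa.bb.cc portion of a dotted-quad IPv4 address."""
    return ".".join(ip.split(".")[:3])

def same_c_class(ip_a, ip_b):
    """True if two addresses share the aa.bb.cc range SEOs care about."""
    return c_class(ip_a) == c_class(ip_b)

# 203.0.113.7 and 203.0.113.99 share a C class; 198.51.100.7 does not.
```

Under the belief described above, links from hosts in distinct C classes would be treated as more independent votes than links from within one range.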

IP delivery

 (see cloaking)


ISP

 Internet Service Providers sell end users access to the web. Some of these companies also sell usage data to web analytics companies.


Italics

 (see emphasis)



JavaScript

 A client-side scripting language that can be embedded in HTML documents to add dynamic features.Search engines do not index most content rendered by JavaScript. In AJAX, JavaScript is combined with other technologies to make web pages even more interactive.


JPEG

 Stands for the Joint Photographic Experts Group, the organization that created and distributed this widely-used image file format.



Keyword

 A word or phrase which implies a certain mindset or demand that targeted prospects are likely to search for.Long tail and brand-related keywords are typically worth more than shorter and vaguer keywords because they typically occur later in the buying cycle and are associated with a greater level of implied intent.

Keyword Density

 An old measure of search engine relevancy based on how prominently keywords appeared within the content of a page. Keyword density is no longer a valid measure of relevancy over a broad open search index.When people use keyword-stuffed copy it tends to read mechanically (and thus does not convert well and is not link worthy); in addition, pages crafted with just the core keyword in mind often lack semantically related words and modifiers from the related vocabulary (which also causes those pages to rank poorly).See also:The Keyword Density of Non-SenseKeyword Density Analysis ToolSearch Engine Friendly Copywriting – What Does ‘Write Naturally’ Mean for SEO?
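For reference, the metric itself is trivial to compute, which is part of why it was so easy to game. A sketch of the old-school calculation (exact-match words only; real tools also handled phrases and stemming):

```python
def keyword_density(text, keyword):
    """Percent of words in the text that exactly match the keyword."""
    words = text.lower().split()
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

# "seo" appears 3 times in 8 words: a 37.5% density that would read
# mechanically to any human visitor.
copy = "seo tips and seo tricks for better seo"
density = keyword_density(copy, "seo")
```

The Keyword Stuffing entry below notes that densities above roughly 4% may be treated as stuffing; this example is an order of magnitude past that.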

Keyword Funnel

 Similar to a marketing funnel, the keyword funnel pertains to keywords and phrases used at certain points of a marketing and sales funnel. Certain keywords carry “looking for information” intent while others carry “ready to purchase” intent; mapping these develops a keyword funnel.[Figure: Business to Business Search Funnel Stages]See also:Keyword funnels to understand searchers’ intent

Keyword Research

 The process of discovering relevant keywords/phrases to focus your SEO and PPC marketing campaigns on.Example keyword discovery methods:
  • using keyword research tools
  • looking at analytics data or your server logs
  • looking at page copy on competing sites
  • reading customer feedback
  • placing a search box on your site and seeing what people are looking for
  • talking to customers to ask how and why they found and chose your business
This should be done at the start of any good online marketing campaign, as well as consistently updating and capitalizing on new opportunities.

Keyword Research Tools

 Tools which help you discover potential keywords based on past search volumes, search trends, bid prices, and page content from related websites.A short list of the most popular keyword research tools:SEMRush – one of my favorite tools because it tends to be conservative in estimatesSEO Book Keyword Research ToolBing Ad Intelligence – Requires an active Bing Ads advertiser account.Google AdWords Keyword Planner – powered from Google search data, but requires an active AdWords advertiser account for the complete version.**Note** most keyword research tools used alone are going to be very inaccurate at giving exact quantitative search volumes. These tools are better for qualitative measurements. To test the exact volume for a keyword it may make sense to set up a test Google AdWords campaign.

Keyword Stuffing

 Writing copy which includes excessive amounts of the core keyword. Generally, having more than 4% of the words on a page be the target keyword may be considered stuffing.See also:Keyword Stuffing – Google Webmaster Guide

Keyword Suggestion Tools

 (see Keyword Research Tools)

Kleinberg, Jon

 A computer scientist who is largely responsible for much of the research that went into hubs and authorities in relevancy-based search algorithms.See also:Jon Kleinberg’s Authoritative sources in a hyperlinked environment [PDF]Jon Kleinberg WikipediaHypersearching the WebJon Kleinberg Citations

Knowledge Graph

 Google scrapes third-party information and displays it in extended formats in the search results to enhance results where possible.The goals of the knowledge graph are largely:
  • Answering user questions quickly without requiring them to leave the search results, particularly for easy to answer questions about known entities or frequently asked questions about a keyword.
  • Displacing the organic search results by moving more relevant/intent-driven results up the page.
  • Some knowledge graph listings (like hotel search, book search, song search, lyric search) also include links to other Google properties or other forms with ads in them, increasing the monetization of the search result page.


Landing Page

 A page which visitors to a site arrive at after clicking a link or advertisement. This page is often geared to create some kind of conversion; whether it captures a lead, makes a sale, or collects a webinar sign-up, a landing page tries to be clear and concise and usually makes it very obvious to the visitor what action to take.

Landing Page Quality Scores

 A measure used by Google to help filter irrelevant ads out of their AdWords program.Quality score, along with bid, makes up Google’s 2-factor ad auction system.See also:Quality score and how it affects your PPC campaign

Lead magnet

 Usually a white paper, special offer, or piece of content that interests your target audience enough for them to exchange an email address for it.


Link

 A citation from one web page to another web page, or from one position to another within the same document.Most major search engines consider links as votes of trust. Domain authority is then built on top of this ‘vote of trust’ value.

Link Baiting

 The art of targeting, creating, and formatting information that provokes the audience to point links at your site. Many link baiting techniques are targeted at social media and bloggers.See also:SEO Book Search: Link Bait

Link Building

 Is the process of building high-quality linkage data that search engines will evaluate to trust your website as an authoritative, relevant, and trustworthy source.A few general link building tips/techniques:
  • build conceptually unique, link-worthy, high-quality content
  • create socially viral marketing ideas that spread and make your audience want to link to you.
  • mix your anchor text and try to keep it natural
  • get links to deep locations on your site (specific pages – not just your home page)
  • try to build at least a few quality links before actively obtaining any low-quality links
  • register your site in relevant high-quality directories
  • when possible try to focus your efforts on getting high-quality editorial links
  • create link bait (sounds scammy but it isn’t! Link bait can take the form of infographics which people want to link to!)
  • try to get bloggers to mention you on their blogs (guest blogging or networking with bloggers)
It takes a while to catch up with the competition, but if you work at it, eventually you can enjoy a self-reinforcing market position.See also:Link building strategyFilthy Linking Rich [PDF] – Mike Grehan article about how top rankings are self-reinforcing

Link Bursts

 A fast and large growth in the links pointing to a website. These bursts of links may be ignored by search engines, or may take much longer to properly affect rankings.When links occur naturally they generally develop over time (more of a snowball effect). In some cases viral articles receive many links very quickly, but in those cases there are typically other signs of quality which allow the content to take a leading position in search rankings. These additional factors include:
  • increased usage data
  • increase in brand related search queries
  • traffic from the link sources to the site being increased
  • many of the new links coming from new pages on trusted domains
See also:What makes a good link?

Link Churn

 The rate at which a site loses links. Poor quality links get dropped far more frequently than high-quality links.See also:Information retrieval based on historical data

Link Disavow

 (see Disavow)

Link Equity

 Often referred to as “link juice”, a measure of how strong a web page or site is based on its inbound links and the authority of the sites providing those links.

Link Farm

 Website or group of websites which exercises little or no editorial control when linking to other sites. FFA pages, or huge link aggregate/distribution sites are examples of link farms.

Link Hoarding

 A method of trying to keep all your link popularity by not linking out to other sites, or by linking out using JavaScript or through cheesy redirects. Generally, link hoarding is a bad idea because:
  • many authority sites were at one point hub sites that freely linked out to other relevant resources
  • if you are unwilling to link out to other sites, people are going to be less likely to link to your site (at least to start)
  • outbound links to relevant resources improve your credibility and boost your overall relevancy scores; if you want to keep crawlers within your website, nofollowing those links is a better option than not linking out at all

Link Popularity

 The number of links pointing to a website. For competitive search queries, link quality counts much more than link quantity. Google typically shows a smaller sample of known linkage data than the other engines do, even though Google still counts many of the links it does not show when you run a ‘link:’ search query.

Link Reputation

 The combination of link equity and link popularity. 

Link Rot

 A measure of how many, and what percentage, of a website’s links are broken. Links break for four common reasons:
  • a website going offline
  • linking to content which is temporary (due to licensing structures or other reasons)
  • a page moving location
  • changing a domain’s content management system
Most large websites have some broken links, but if too many of a site’s links are broken it may be an indication of outdated content, and it may provide website users with a poor user experience – both of which may cause search engines to rank a page lower for being less relevant.
See also: Xenu Link Sleuth – a free software program which crawls websites to find broken links; Screaming Frog – my personal favorite broken-link checker, which provides a wealth of additional data

Link Velocity

 The rate at which a page or website accumulates new inbound links. Pages or sites which receive a huge spike of new inbound links in a short duration may hit automated filters and/or be flagged for manual editorial review by search engineers.

Log Files

 Server files which show you what your leading sources of traffic are and what people are searching for to find your website. Log files typically do not show as much data as analytics programs, and what they do show is generally not in a format that is useful beyond a few basic stats.

Long Tail

 A phrase describing the fact that, for any category of product being sold, there is much more aggregate demand for the ‘non-hits’ than there is for the ‘hits’. How does the long tail apply to keywords? Long-tail keywords are more precise and specific, and therefore often more valuable.


LSI

 Latent Semantic Indexing is a way for search systems to mathematically model a language based on the similarity of pages and keyword co-occurrence. A relevant result may not even contain the search term; it may be returned solely because it contains many words similar to those appearing in relevant pages which do contain the search terms.
See also: LSI keyword generator; Patterns in Unstructured Data – a free explanation of how LSI works; Improve Keywords to Boost Relevancy – straight from Google
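Full LSI applies singular value decomposition to a term–document matrix; as a simplified illustration of the underlying idea, the sketch below scores documents against a query by cosine similarity over raw word-count vectors. The mini-corpus and document names are made up for illustration, and this is only the vector-similarity step, not Google's implementation:

```python
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    mag = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / mag if mag else 0.0

docs = {
    "a": Counter("seo search engine ranking links".split()),
    "b": Counter("ranking links crawling indexing".split()),
    "c": Counter("cooking recipes dinner pasta".split()),
}
query = Counter("search engine".split())
# Score every document against the query; unrelated docs score 0.
scores = {name: cosine(query, vec) for name, vec in docs.items()}
```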


LTV

 An acronym for “Lifetime Value”. This refers to how much money you can expect to make from an average customer over the term that they are your customer.


Manual Penalty

 Website penalties which are applied after a Google engineer determines that a site has violated the Google Webmaster Guidelines. Recoveries from manual penalties may time out years later, or a person can request a review in Google Webmaster Tools after fixing what they believe to be the problem. Sites with a manual penalty would typically have a warning shown in Google Webmaster Tools, whereas sites with an automated penalty like Panda or Penguin would not.
See also: Understanding the Major Google Algorithm Updates; Disavow & Link Removal: Understanding Google; Pandas, Penguins & Popsicles – eye-opening for newcomers; be aware that SEO is a buzzword and anyone can claim to know it, but have they ever done it?

Manual Review

 All major search engines combine a manual review process with their automated relevancy algorithms to help catch search spam and train their relevancy algorithms. Abnormal usage data or link growth patterns may flag sites for manual review.
See also: Inktomi Spam Database Left Open to Public – article about Inktomi’s spam database from 2001; Google General Guidelines; Google Image Labeler – example of how humans can be used to review content

Marketing Automation

 A popular marketing technique that creates automated systems to move people through the buying process. Autoresponders are a great example of marketing automation.

Mechanical Turk

 Amazon.com program which allows you to hire humans to perform easy tasks that computers are bad at. This works well for customer service chat boxes, or anything that requires an aspect of interaction which machines have not yet developed.
See also: Mechanical Turk


Meme

 In ‘The Selfish Gene’ Richard Dawkins defines a meme as “a unit of cultural transmission, or a unit of imitation.” Many people use the word meme to refer to self-spreading or viral ideas or parts of human culture.
See also: Techmeme – a meme tracker which shows technology ideas that are currently spreading on popular technology blogs; MemeGenerator


Mentions

 Usually refers to “social mentions”. These can be direct comments in a tweet, or shares, likes, retweets, and other forms of engagement.

Meta Description

 A meta description tag is a page-level tag containing a short text description of what the page is about. The recommended maximum length of a meta description is ~160 characters; longer meta description text is generally cut off in search engine result displays. A good meta description tag should:
  • be relevant and unique to the page;
  • reinforce the page title;
  • focus on including offers and secondary keywords and phrases to help add context to the page.
Relevant meta description tags may appear in search results as part of the page description below the page title.
The code for a meta description tag looks like this:
<meta name="description" content="Your best meta description here" />
Good meta descriptions increase page click-through rates.
See also: Free meta tag generator – offers a free formatting tool and advice on creating meta description tags
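Since descriptions beyond roughly 160 characters get cut off in result displays, it can be handy to trim them programmatically before publishing. A minimal sketch (the function name and the 160-character limit are illustrative assumptions, not a search engine rule):

```python
def truncate_meta_description(text: str, limit: int = 160) -> str:
    """Trim a meta description to the limit without cutting a word in half."""
    if len(text) <= limit:
        return text
    # Cut at the limit, then back up to the last full word.
    cut = text[:limit].rsplit(" ", 1)[0]
    return cut.rstrip(",;:") + "…"
```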

Meta Keywords

 The meta keywords tag can be used to highlight keywords and keyword phrases which the page is targeting. The code for a meta keywords tag looks like this:
<meta name="keywords" content="keyword target 1, keyword target 2, another one, and maybe one more">
Many people spammed meta keyword tags, so most search engines do not place much weight on them. Many SEO professionals no longer use meta keywords tags, although the tag can deliver additional context, and it doesn’t hurt to add one per page just to cover your bases.
See also: Free meta tag generator – offers a free formatting tool and advice on creating meta keywords tags

Meta Refresh

 A meta tag used to make a browser refresh to another URL location. A meta refresh looks like this:
<meta http-equiv="refresh" content="10;url=http://www.site.com/folder/page.htm">
Generally, it is preferred to use a 301 redirect over a meta refresh (a 301 will pass link equity to the destination page, while a meta refresh or 302 will not).

Meta Search

 A search engine which pulls top-ranked results from multiple other search engines and rearranges them into a new result set.
See also: Myriad Search – an ad-free meta search engine

Meta Tags

 People generally refer to ‘meta descriptions’ and ‘meta keywords’ as meta tags; some people also group the page title in with these. The page title is very important. The meta description tag is somewhat important. The meta keywords tag is not that important.


Microsoft

 The maker of the popular Windows operating system and Internet Explorer browser, owner of the search engine Bing, and one of the largest software developers in the world.


Mindshare

 A measure of the number of people who think of you or your product when thinking of products in your category. Sites with strong mindshare, top rankings, or a strong memorable brand are far more likely to be linked to than sites which are less memorable and have less search exposure. The link quality of mindshare-related links most likely exceeds the quality of the average link on the web. If you sell non-commodities, personal recommendations also typically carry far greater weight than search rankings alone.

Mirror Site

 A site which mirrors (or duplicates) the contents of another website. Generally, search engines prefer not to index duplicate content. One exception: if you are a hosting company, it might make sense to offer free hosting or a free mirror to a popular open-source software site to build significant link equity.

Movable Type

 Commercial blogging software which allows you to host a blog on your own website. Movable Type is typically much harder to install than WordPress.
See also: Movable Type

MultiDimensional Scaling

 The process of taking snapshots of documents in a database to discover topical clusters through the use of latent semantic indexing. Multidimensional scaling is more efficient than singular value decomposition, since only a rough approximation of relevance is necessary when combined with other ranking criteria.


Natural Language Processing

 Algorithms which attempt to understand the true intent of a search query rather than just matching results to keywords.

Natural Link

 (see Editorial Link)

Natural Search

 (see Organic Search Results)


Navigation

 A scheme to help website users understand where they are, where they have been, and how that relates to the rest of the website. It is best to use regular HTML navigation rather than coding your navigation in JavaScript, Flash, or some other format which search engines may not be able to easily index.

Navigation Search

 A search query made with the intent of visiting a specific website or business. An example would be searching for ‘Hone Digital Marketing’ to find our business website.

Negative SEO

 Attempting to harmfully influence the rank/traffic of a third-party site.Over time Google shifts many link building strategies from being considered white hat to gray hat to black hat. A competitor can point a bunch of low-quality links with aggressive anchor text at a page in order to try to get the page filtered from the search results. If these new links cause a manual penalty, then the webmaster who gets penalized may not only have to disavow the new spam links, but they may have to try to remove or disavow links which were in place for 5 or 10 years already which later became “black hat” ex-post-facto. There are also strategies to engage in negative SEO without using links.


Niche

 A topic or subject which a website is focused on. Search is a broad field, but as you drill down, each niche consists of many smaller niches. An example of drilling down to a niche market:
  • search
    • search marketing, privacy considerations, legal issues, history of, future of, different types of vertical search, etc.
      • search engine optimization, search engine advertising
        • link building, keyword research, reputation monitoring, and management, viral marketing, SEO copywriting, Google AdWords, information architecture, etc.
Generally, it is easier to compete in small, new, or underdeveloped niches than trying to dominate large verticals. As your brand and authority grow you can go after bigger markets.


NoFollow

 An attribute used to prevent a link from passing link authority to the destination site (done by giving the link a “nofollow” property, telling bots not to follow that link). Commonly used on sites with user-generated content, such as in blog comments, or on pages with many identical or unrelated links. The code to use nofollow on a link looks like this:
<a href="https://honedigitalmarketing.com" rel="nofollow">This takes you to the Hone Site</a>
Nofollow can also be used in a robots meta tag to prevent a search engine from counting any outbound links on a page. That code looks like this:
<META NAME="ROBOTS" CONTENT="INDEX, NOFOLLOW">
Google’s Matt Cutts also pushes webmasters to use nofollow on any paid links, but since Google is the world’s largest link broker, their advice on how other people should buy or sell links should be taken with a grain of salt.

Not Provided

 (see Keyword (Not Provided))



Ontology

 In philosophy, the study of ‘being’. As it relates to search, it is the attempt to create an exhaustive and rigorous conceptual schema about a domain. An ontology is typically a hierarchical data structure containing all the relevant entities, their relationships, and the rules within that domain.
See also: Wikipedia: Ontology (Information Science)

Open Source

 Software which is distributed with its source code so that developers can modify it as they see fit; publishing the code openly creates a breeding ground for ideas and improvements to the original product. On the web, open source is a great strategy for quickly building immense exposure and mindshare.


Opera

 A fast, standards-based web browser; currently the fourth most popular browser.
See also: Opera.com

Organic Search Results

 Most major search engines have results that consist of paid ads and unpaid listings. The unpaid/algorithmic listings are called the organic search results. Organic search results are usually organized by relevance, which is largely determined by linkage data, page content, usage data, and historical domain- and trust-related data. Most clicks on search results are on the organic results; some studies have shown that 80–90%+ of clicks go to the organic search results.

Outbound Link

 A link from one website pointing at another external website.Some webmasters believe in link hoarding, but linking out to useful relevant related documents is an easy way to help search engines understand what your website is about. If you reference other resources it also helps you build credibility and leverage the work of others without having to do everything yourself. Some webmasters track where their traffic comes from, so if you link to related websites they may be more likely to link back to your site.


Page Title

 (see Title)

Page, Larry

 Co-founder of Google.


PageRank

 A logarithmic scale based on link equity (among other signals) which estimates the importance of web documents. Since PageRank is widely bartered, Google’s relevancy algorithms had to move away from relying on PageRank alone and place more emphasis on trusted links via algorithms such as TrustRank.
The PageRank formula is:
PR(A) = (1 - d) + d * (PR(T1)/C(T1) + … + PR(Tn)/C(Tn))
where:
  • PR(A) = PageRank of page A
  • d = damping factor (~0.85)
  • C(Tn) = number of links on page Tn
  • PR(Tn)/C(Tn) = PageRank of page Tn divided by its total number of links (the PageRank it transfers)
In words: for any given page A, its PageRank equals one minus the damping factor, plus the damping factor multiplied by the sum of the partial PageRank passed by each page pointing at it.
See also: The Anatomy of a Large-Scale Hypertextual Web Search Engine; The PageRank Citation Ranking: Bringing Order to the Web
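The formula above can be applied iteratively (the "power method") until the scores stabilize. A minimal sketch using the non-normalized form of the formula given here, run on a hypothetical three-page link graph:

```python
def pagerank(links, d=0.85, iterations=50):
    """Iteratively apply PR(A) = (1-d) + d * sum(PR(T)/C(T)) over inlinking pages T.

    `links` maps each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    pr = {p: 1.0 for p in pages}  # start every page at 1.0
    for _ in range(iterations):
        new = {}
        for page in pages:
            # Sum the PageRank transferred by every page linking to `page`.
            inbound = sum(pr[src] / len(out) for src, out in links.items() if page in out)
            new[page] = (1 - d) + d * inbound
        pr = new
    return pr

# Hypothetical graph: a links to b and c, b links to c, c links back to a.
ranks = pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]})
# Page "c", with two inbound links, converges to the highest score.
```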

Paid Inclusion

 A method of allowing websites which pass editorial quality guidelines to buy relevant exposure. 

Paid Link

 (see Text Link Ads)

Panda Algorithm

 A Google algorithm which attempts to sort websites into buckets based on perceived quality. Signals referenced in the Panda patent include the link profile of the site and entity (or brand) related search queries. Sites which have broad, shallow content (like eHow), along with ad-heavy layouts, few repeat visits, and high bounce rates from search visitors, are likely to get penalized. Highly trusted and well-known brands (like Amazon.com) are likely to receive a ranking boost. In addition to the article databases and other content farms, many smaller ecommerce sites were torn down by Panda. Sites with a manual penalty would typically have a warning shown in Google Webmaster Tools, whereas sites with an automated penalty like Panda or Penguin would not.
See also: Google: More Guidance on Building High-Quality Sites; The Panda Patent

Pay for Performance

 A payment structure where affiliated sales workers are paid a commission for getting consumers to perform certain actions. Publishers publishing contextual ads are typically paid per ad click, while affiliate marketing programs pay affiliates for conversions – leads, downloads, or sales. Also known as pay-per-action.


PDF

 Portable Document Format. A file format that lets you save images, files, and layouts in a way that both reduces the file size and ensures the document will appear the same on multiple devices.


Penalty

 Search engines prevent some websites suspected of spamming from ranking highly in the results by banning or penalizing them. These penalties may be applied algorithmically or manually. If a site is penalized algorithmically, it may start ranking again after a certain period of time once the reason for the penalty is fixed. If a site is penalized manually, the penalty may last an exceptionally long time or require contacting the search engine with a re-inclusion request to remedy. Some sites are also filtered for various reasons.
See also: Reasons why you may get a site penalty

Penguin Algorithm

 Google algorithm which penalizes sites with unnatural link profiles.When Google launched the Penguin algorithm they reduced the emphasis on links by updating some on-page keyword stuffing classifiers at the same time. Initially they also failed to name the Penguin update and simply called it a spam update, only later naming it after plenty of blow-back due to false positives. In many cases they run updates on top of one another or in close proximity to obfuscate which update caused an issue for a particular site.Sites which were hit by later versions of Penguin could typically recover on the next periodic Penguin update after disavowing low quality links inside Google Webmaster Tools, but sites which were hit by the first version of Penguin had a much harder time recovering. Sites which had a manual penalty would typically have a warning show in Google Webmaster Tools, whereas sites which have an automated penalty like Panda or Penguin would not.See also:Another step to rewarding high-quality sitesUnderstanding the Google Penguin AlgorithmDisavow and Link Removal: Understanding Google and Bing


Permalink

 The permanent address for a given file on the Internet, e.g. www.honedigitalmarketing.com/hone-u/


Personalization

 Altering the search results based on a person’s location, search history, content they recently viewed, or other factors relevant to them on a personal level.


PHP

 A recursive acronym for PHP: Hypertext Preprocessor, an open-source server-side scripting language used to render web pages or add interactivity to them.
See also: PHP.net

Pigeon Update

 An algorithmic update to local search results on Google which tied in more signals that have been associated with regular web search.
See also: Google “Pigeon” Updates Local Search Algorithm With Stronger Ties to Web Search Signals

Piracy Update

 A Google search algorithm update which lowered the rankings of sites with an excessive number of DMCA takedown requests. Google has exempted both Blogspot and YouTube from this “relevancy” signal because they are such large aggregators of user-submitted content.


Plugin

 A piece of software that performs a specific task and works as an added component to another piece of software. Most plugins are used with a content management system like WordPress, but some browsers also support plugins, often referred to as extensions.


PNG

 Portable Network Graphics. An image file format.


Pogo Rate

 An expression which is synonymous with ‘bounce rate’: the percentage of users who click on a site listed in the search results only to quickly click back to the search results and click on another listing. A high pogo rate can be seen as a poor engagement metric, which in turn can flag a site to be ranked lower by an algorithm like Panda.

Poison Word

 Words which were traditionally associated with low-quality content and caused search engines to demote the rankings of a page. These words do not have much effect on rankings in the current algorithm.
See also: What Are Poison Words? Do They Matter?


Portal

 A website offering common consumer services such as news, email, other content, and search. By linking out to these sources or providing these services, the website becomes a portal.


PPC

 Pay Per Click is a pricing model through which most search ads and many contextual ad programs are sold. PPC ads only charge advertisers if a potential customer clicks on an ad.
See also: AdWords – Google’s PPC ad platform; adCenter – Microsoft’s PPC ad platform (Bing Ads); Yahoo! Search Marketing – Yahoo!’s PPC ad platform

Profit Elasticity

 A measure of the profit potential of different economic conditions based on adjusting the price, supply, or other variables to create a different profit potential where the supply and demand curves cross.


Proximity

 A measure of how close words are to one another. A page which has words near one another may be deemed more likely to satisfy a search query containing those terms. If keyword phrases are repeated an excessive number of times, and the proximity is close on all occurrences of both words, it may also be a sign of unnatural (and potentially low-quality) content.



QDF

 Query deserves freshness is an algorithmic signal, based on things like a burst in search volume and a burst in news publication on a topic, which tells Google that a particular search query should rank recent/fresh results. Newly published content may be seen as fresh, but older content may also be seen as fresh if it has been recently updated, has a big spike in readership, and/or has a large spike in its link velocity (the rate of growth of inbound links).

Quality Content

 Content which is link-worthy, brings value to the audience, and uses citations for the information provided.
See also: What is Quality Content?; Unique content

Quality Link

 Search engines count links as votes of trust, and quality links count for more than low-quality links. There are a variety of ways to define a quality link, but the following are characteristics of a high-quality link:
  • Trusted Source: If a link is from a page or website which seems like it is trustworthy then it is more likely to count more than a link from an obscure, rarely used, and rarely cited website. See TrustRank for one example of a way to find highly trusted websites.
  • Hard to Get: The harder a link is to acquire the more likely a search engine will be to want to trust it and the more work a competitor will need to do to try to gain that link.
  • Aged: Some search engines may trust links from older resources or links that have existed for a length of time more than they trust brand new links or links from newer resources.
  • Co-citation: Pages that link to competing sites which also link to your site make it easy for search engines to understand what community your website belongs to. See Hilltop for an example of an algorithm which looks for co-citation from expert sources.
  • Related: Links from related pages or related websites may count more than links from unrelated sites.
  • In Content: Links which are in the content area of a page are typically more likely to be editorial links than links that are not included in the editorial portion of a page. In-content placement also builds context for the link, which spiders factor in when determining rank.
While a link with appropriate anchor text may help you rank even better than one lacking it, it is worth noting that for competitive queries Google is more likely to place weight on a high-quality link whose anchor text does not match than to trust low-quality links whose anchor text matches.


Query

 The actual ‘search string’ a searcher enters into a search engine.

Query Refinement

 Some searchers may refine their search query if they deem the results irrelevant. Some search engines may aim to promote certain verticals or suggest other search queries if they deem other queries or vertical databases relevant to the goals of the searcher. Query refinement is both a manual and an automated process: if searchers do not find their results relevant they may search again, and search engines may also automatically refine queries using the following techniques:
  • Google OneBox: promotes a vertical search database near the top of the search result. For example, if image search is relevant to your search query images may be placed near the top of the search results.
  • Spell Correction: offers a ‘did you mean’ link with the correct spelling near the top of the results.
  • Inline Suggest: offers related search results in the search results. Some engines also suggest a variety of related search queries.
  • Some search toolbars also aim to help searchers auto complete their search queries by offering a list of most popular queries which match the starting letters that a searcher enters into the search box.



RankBrain

 A Google search relevancy algorithm signal which leverages user click path information to improve relevancy on less commonly searched terms.


Recall

 The proportion of all relevant documents that were actually retrieved.
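As a quick worked example (with hypothetical document ids), recall is simply the size of the intersection of the retrieved and relevant sets divided by the total number of relevant documents:

```python
def recall(retrieved: set, relevant: set) -> float:
    """Fraction of all relevant documents that were retrieved."""
    return len(retrieved & relevant) / len(relevant) if relevant else 0.0

relevant = {"d1", "d2", "d3", "d4"}   # everything that should be found
retrieved = {"d1", "d2", "d9"}        # what the search actually returned
# 2 of the 4 relevant documents were retrieved -> recall = 0.5
```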

Reciprocal Links

 Nepotistic link exchanges where websites try to build false authority by trading links, using three-way link trades, or other low-quality link schemes. When sites link naturally there is going to be some amount of cross-linking within a community, but if most or all of your links are reciprocal in nature it may be a sign of ranking manipulation. Also, sites that trade links off topic, or on links pages stashed away deep within their sites, probably do not pass much link authority and may add more risk than reward. Quality reciprocal link exchanges in and of themselves are not a bad thing, but most reciprocal link offers are of low quality. If too many of your links are of low quality it may make it harder for your site to rank for relevant queries, and some search engines may look at inlink and outlink ratios as well as link quality when determining how natural a site’s link profile is.
See also: Link Schemes by Google


Redirect

 A method of alerting browsers and search engines that a page has moved. 301 redirects are for a permanent change of location, and 302 redirects are for a temporary change. Other redirect methods exist but are generally avoided, because they do not pass link equity the way a 301 does.
See also: Redirects
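On Apache, both kinds of redirect can be issued from a .htaccess file with the mod_alias Redirect directive. A sketch with placeholder paths and a placeholder domain:

```apache
# Permanent (301) redirect: passes link equity to the new URL
Redirect 301 /old-page.html https://www.example.com/new-page.html

# Temporary (302) redirect: for short-lived moves only
Redirect 302 /seasonal-sale.html https://www.example.com/holding-page.html
```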

Referral Rate

 The percentage of people who forwarded or shared a piece of content. Referral rate almost always refers to email messages, describing how many people used the “forward to a friend” functionality in an email message.


Referrer

 The source from which a website visitor came.


Registrar

 A company which allows you to register domain names.


Reinclusion

 If a site has been penalized for spamming, the owners may fix the infraction and ask for reinclusion. Depending on the severity of the infraction and the brand strength of the site, it may or may not be added back to the search index.
See also: Google Reinclusion – sign up for Google Sitemaps and request reinclusion from within Google Sitemaps; Yahoo! Reinclusion – request a review via Yahoo! Search: URL Status – Second Review Request

Relative Link

 A link which specifies the target URL relative to the URL of the page on which the link appears. Some links only show relative link paths instead of including the entire reference URL within the <a> tag. Due to canonicalization and hijacking related issues, it is typically preferred to use absolute links over relative links.
Example relative link:
<a href="../folder/filename.html">Filename Page</a>
Example absolute link:
<a href="https://honedigitalmarketing.com/hone-u">Hone U</a>


Relevancy

 A measure of how useful searchers find search results. Many search engines may also bias organic search results toward informational resources, since commercial ads also show in the search results.
See also: Google vs Yahoo! vs Bing – compares the relevancy algorithms of the three major search engines


Remarketing

 (see Retargeting)

Repeat Visits

 Visitors to a website which have visited it in the recent past.Search engines look at signals like a strong stream of regular repeat visits to a site and many brand-related searches as signals of strong user engagement, which lead them to rank the site higher in algorithms like Panda. Sites which get few repeat visits and few brand-related searches compared to their overall search traffic footprint may be viewed as lower quality sites and have their rankings suppressed.

Reputation Management

 Ensuring your brand-related keywords display results which reinforce your brand. Many hate sites tend to rank highly for brand-related queries. Reputation management may also involve monitoring brand usage throughout the internet, or collecting and monitoring reviews and occasionally responding to customer-service-related reviews.


Resubmission

 Much like search engine submission, resubmission is a generally useless service offered by businesses taking advantage of naive consumers. Paid submission provides no value: search engines discover and index sites naturally, and the process cannot be bought.


Retargeting

 Advertising programs targeted at people who have previously visited a given website or channel, viewed a particular product, or added a particular product to their shopping cart. Due to activity bias, many retargeted ads overstate their effective contribution to conversions.

Reverse Index

 An index of keywords, each storing records of the matching documents that contain that keyword.
See also: Google: a Behind the Scenes Look – video where Jeff Dean talks about Google’s architecture
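A toy version of such an index can be built in a few lines: each keyword maps to the set of documents that contain it, so a keyword lookup returns matching documents directly instead of scanning every document. The document ids and text below are made up for illustration:

```python
from collections import defaultdict

def build_reverse_index(documents: dict) -> dict:
    """Map each keyword to the set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return dict(index)

index = build_reverse_index({
    1: "search engines build reverse indexes",
    2: "reverse indexes enable fast keyword lookup",
})
# index["reverse"] -> {1, 2}; index["search"] -> {1}
```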


Rewrite

 (see URL Rewrite)


Robots.txt

 A file which sits in the root of a site and tells search engines which files not to crawl. Some search engines will still list your URLs as URL-only listings even if you block them using a robots.txt file. It is accessible by visiting yourwebsite.com/robots.txt. Do not put files on a public server if you do not want search engines to index them!
See also: Robotstxt.org
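A typical robots.txt is just a series of User-agent and Disallow rules. A sketch (the directory path and bot name are placeholders):

```
# Allow all crawlers everywhere except the /private/ directory
User-agent: *
Disallow: /private/

# Block one specific crawler entirely
User-agent: BadBot
Disallow: /
```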


ROI

 Return on Investment is a measure of how much return you receive from each marketing dollar. While ROI is a useful measurement, some search marketers prefer to account for their marketing using more sophisticated profit elasticity calculations.
See also: Calculating ROI
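The basic calculation is net return divided by cost. A one-line sketch with made-up campaign numbers:

```python
def roi(revenue: float, cost: float) -> float:
    """Return on investment: net profit divided by cost."""
    return (revenue - cost) / cost

# Spend $1,000 on ads, earn $1,500 in attributable revenue -> 50% ROI
campaign_roi = roi(revenue=1500, cost=1000)
```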


RSS

 Rich Site Summary or Really Simple Syndication is a method of syndicating information to a feed reader or other software which allows people to subscribe to a channel they are interested in. Websites with RSS feeds usually locate them at domain.com/feed. Check out ours at honedigitalmarketing.com/feed



Safari

 A popular Apple browser.

Salton, Gerard

 The scientist who pioneered the information retrieval field.
See also: A Theory of Indexing – 1975 book by Gerard Salton


Scumware

 Intrusive software and programs which usually target ads, violate privacy, and are often installed without the computer owner knowing what the software does.



Search Engine

 A tool or device used to find relevant information. Search engines consist of a spider, an index, relevancy algorithms, and search results. As of 2017, use of search engines has grown ubiquitous, playing a part in most daily activities.

Search History

 Many search engines store user search history information. This data can be used for better ad targeting or to make old information more findable. Search engines may also determine what a document is about, and how much they trust a domain, based on aggregate usage data. A large number of brand-related search queries is a strong signal of quality.

Search Marketing

 Marketing a website in search engines. Typically via SEO, buying pay per click ads, and paid inclusion.

Search pogo

 (see pogo rate)


SEM

 Search engine marketing. Also known as Search Marketing.


SEO

 Search engine optimization is the art and science of publishing information and marketing it in a manner that helps search engines understand your information is relevant to relevant search queries.SEO consists largely of keyword research, SEO copywriting, information architecture, link building, brand building, building mindshare, reputation management, and viral marketing.SEO should be bundled into all marketing efforts for the simple fact that everyone is using search engines – that is frankly where the attention is. If you aren’t taking advantage of SEO in all aspects of your marketing you are not maximizing your efforts.

SEO Copywriting

 Writing and formatting copy in a way that will help make the documents appear relevant to a wide array of relevant search queries.There are two main ways to write titles and be SEO friendly:
  • Write literal titles that are well aligned with things people search for. This works well if you need backfill content for your site or already have an amazingly authoritative site.
  • Write page titles that are exceptionally compelling to link at. If enough people link to them then your pages and site will rank for many relevant queries even if the keywords are not in the page titles.
See also:Search Engine Friendly Copywriting – What Does ‘Write Naturally’ Mean for SEO?


SERP

 Search Engine Results Page is the page on which the search engines show the results for a search query. This term may also be used to reference the Search engine Ranking Position of a web page.


Server

 A computer used to host files and serve them to the world wide web.Dedicated servers usually run from $100 to $500 a month. Virtual servers typically run from $5 to $50 per month.

Server Logs

 Files hosted on servers which display website traffic, trends, and sources.Server logs typically do not show as much data and are not as user-friendly as analytics software. Not all hosts provide server logs.

Singular Value Decomposition

 The process of breaking down a large database to find the document vector (relevance) for various items by comparing them to other items and documents.Important steps:
  • Stemming: accounting for the various forms of a word on a page
  • Local Weighting: increasing the relevance of a given document based on the frequency with which a term appears in the document
  • Global Weighting: increasing the relevance of terms which appear in a small number of pages, as they are more likely to be on topic than words that appear in almost all documents
  • Normalization: penalizing long copy and rewarding short copy to allow them fair distribution in results. A good way of looking at this is as standardizing things to a scale of 100.
Multidimensional scaling is more efficient than singular value decomposition because it requires far less computation. When combined with other ranking factors only a rough approximation of relevance is necessary.
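The local and global weighting steps above are commonly approximated with TF-IDF. A minimal Python sketch, using invented toy documents:

```python
# Sketch of the local/global weighting steps above, approximated with
# TF-IDF. Local weight: how often a term appears in one document.
# Global weight: inverse document frequency, boosting terms that appear
# in few documents. The documents below are invented.
import math

docs = [
    "swim swim coach".split(),
    "coach travel".split(),
    "travel guide coach".split(),
]

def tf(term, doc):
    # Local weighting: frequency of the term within this document.
    return doc.count(term) / len(doc)

def idf(term, docs):
    # Global weighting: terms found in fewer documents score higher.
    containing = sum(1 for d in docs if term in d)
    return math.log(len(docs) / containing)

# "swim" appears in one document, "coach" in all three, so "swim"
# carries more global weight.
print(idf("swim", docs) > idf("coach", docs))  # True
```

A term appearing in every document (like "coach" here) gets a global weight of zero, matching the intuition that such words say little about topic.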


Siphoning

 Techniques used to steal another website's traffic, including the use of spyware or cybersquatting.

Site Map

 A page which can be used to give search engines a secondary route to navigate through your site, often providing useful additional context. I highly recommend submitting a sitemap to any search engine you wish to rank in; submission usually occurs in that engine's webmaster tools – search “(search engine of choice) webmaster tools”.Tips:
  • On large websites, the on page navigation should help search engines find all applicable web pages.
  • On large websites, it does not make sense to list every page on the site map, just the most important pages.
  • Site maps can be used to help redistribute internal link authority toward important pages or sections, or sections of your site that are seasonally important.
  • Site maps can use slightly different or more descriptive anchor text than other portions of your site to help search engines understand what your pages are about.
  • Site maps should be created such that they are useful to humans, not just search engines, or create two versions like I do.


Snippet

 (see Description)

Social Media

 Websites which allow users to create valuable content. A few examples of social media sites are social bookmarking sites and social news sites, such as facebook.com, instagram.com, twitter.com, digg.com, and pinterest.com.


Spam

 Unsolicited email messages. Or, in search engine terms: search engines also like to outsource their relevancy issues by calling low quality search results spam. They have vague, ever-changing guidelines which determine what marketing techniques are acceptable at any given time. Typically search engines try hard not to flag false positives as spam, so most algorithms are quite lenient, as long as you do not build lots of low quality links, host large quantities of duplicate content, or perform other actions that are considered widely outside of relevancy guidelines. If your site is banned from a search engine you may request reinclusion after fixing the problem.See also:Google Webmaster GuidelinesMicrosoft Live Search: Guidelines for successful indexing – Forum which links to the password protected guidelinesYahoo! Search Content Quality GuidelinesBMW Spamming – Matt Cutts posted about BMW using search spam. Due to their brand strength BMW was reincluded in Google quickly.


Spamming

 The act of creating and distributing spam.


Spider

 Search engine crawlers which scan or “crawl” the web for pages to include in the index.Many non-traditional search companies have different spiders which perform other applications. For example, TurnItInBot searches for plagiarism. Spiders should obey the robots.txt protocol.

Splash Page

 Feature rich or elegantly designed web page which typically offers poor usability and does not offer search engines much content to index.Make sure your home page has relevant content on it if possible.


Splog

 A spam blog, typically consisting of stolen or automated low-quality content.


Spyware

 Software programs which spy on web users, often used to collect consumer research and to behaviorally target ads.


SSI

 Server Side Includes are a way to call portions of a page in from another page. SSI makes it easier to update websites.To use a server side include you have to meet one of the following conditions:
  • end file names in a .shtml or .shtm extension
  • use PHP or some other language which makes it easy to include files via that programming language
  • change your .htaccess file to make .html or .htm files be processed as though they were .shtml files.
The code to create a server side include looks like this:<!--#include virtual="/includes/filename.html" -->

Static Content

 Content which does not change frequently. May also refer to content that does not have any social elements to it and does not use dynamic programming languages.Many static sites do well, but the reasons fresh content works great for SEO are:
  • If you keep building content every day you eventually build a huge archive of content
  • by frequently updating your content you keep building mindshare, brand equity, and give people fresh content worth linking at
  • the ‘Fresh Content’ signal is very strong right now, and may help pages which would otherwise not rank for anything, start ranking quickly.


Stemming

 Using the stem of a word to help satisfy search relevancy requirements. ex: searching for swimming can return results which contain swim. This usually enhances the quality of search results due to the extreme diversity of word forms used in the English language.
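A toy illustration of suffix stripping in Python. Real search engines use far more sophisticated algorithms, such as the Porter stemmer; this crude sketch only shows the idea:

```python
# A toy suffix-stripping stemmer, only to illustrate the idea.
# Production systems use algorithms like the Porter stemmer.
def crude_stem(word: str) -> str:
    for suffix in ("ming", "ing", "ed", "s"):
        # Only strip if a reasonably long stem remains.
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

print(crude_stem("swimming"))  # swim
print(crude_stem("searched"))  # search
```

With both the query and the indexed documents reduced to stems, a search for "swimming" can match pages that only contain "swim".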


Submission

 The act of making information systems and related websites aware of your website. In most cases you do not need to submit your website to large scale search engines; they follow links and index content naturally without influence. The best way to submit your site is to get others to link to it.Some topical or vertical search systems do require submission, but large scale search engines generally do not.

Sullivan, Danny

 Founder and lead editor of SearchEngineWatch.com, who later started SearchEngineLand.com.See also:Daggle – Danny’s personal blog 

Supplemental Results

 Documents which generally are trusted less and rank lower than documents in the main search index.Some search engines, such as Google, have multiple indices. Documents may end up in the supplemental index due to any of the following conditions:
  • limited link authority relative to the number of pages on the site
  • duplicate content or near duplication
  • exceptionally complex URLs
 Documents in the supplemental results are crawled less frequently than documents in the main index. Since documents in the supplemental results are typically considered to be trusted less than documents in the regular results, those pages probably carry less weight when they vote for other pages by linking to them. If you want to view ONLY your supplemental results you can use this command: site:www.yoursite.com *** -sljktf

Pages that are in the supplemental index are placed there because they are trusted less. Since they are crawled less frequently and have fewer resources diverted toward them, it makes sense that Google does not typically rank these pages as high as pages in the regular search index.

Just as cache date can be used to view the relative health of a page or site, the percent of the site stuck in supplemental results and the types of pages stuck there can tell you a lot about information architecture related issues and link equity related issues. To get your percentage of supplemental results you would divide your number of supplemental results by your total results count.

The size of the supplemental index and the pages included in it change as the web grows and Google changes their crawling priorities. It is a moving target, but one that still gives you a clue to the current relative health of your site. If none of your pages are supplemental then likely you have good information architecture, and can put up many more profitable pages for your given link equity. If some of your pages are supplemental that might be fine as long as those pages duplicate other content and/or are generally of lower importance.
If many of your key pages are supplemental you may need to look at improving your internal site architecture and/or marketing your site to improve your link equity.Comparing the size of your site and your supplemental ratio to similar sites in your industry may give you a good grasp on the upside potential of fixing common information architecture related issues on your site, what sites are wasting significant potential, and how much more competitive your marketplace may get if competitors fix their sites.
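The percentage calculation described above is simple division; a quick sketch with made-up counts:

```python
# Sketch: supplemental ratio as described above, with invented counts.
def supplemental_ratio(supplemental_count: int, total_count: int) -> float:
    return supplemental_count / total_count

# e.g. 150 supplemental pages out of 1,000 total indexed pages
print(f"{supplemental_ratio(150, 1000):.0%}")  # 15%
```

Tracking this ratio over time, and comparing it against competitors, is what gives the number meaning.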


Tagging, tags

 (see Bookmarks)


Taxonomy

 The classification system of controlled vocabulary used to organize topical subjects, usually hierarchical in nature. 


Technorati

 Blog search engine which tracks popular stories and link relationships.See also:Technorati.com 


Telnet

 Internet protocol allowing a local computer to log into a remote one for tasks such as script initialization or manipulation.


Teoma

 A topical, community-based search engine largely reliant upon Kleinberg’s concept of hubs and authorities. Teoma powers Ask.com.

Term Frequency

 A measure of how frequently a keyword appears amongst a collection of documents.

Term Vector Database

 A weighted index of documents which aims to understand the topic of documents based on how similar they are to other documents, and then match the most relevant documents to a search query based on vector length and angle.See also:A Theory of IndexingMi Islita: Term Vector Theory and Keyword WeightsThe Term Vector Database: fast access to indexing terms for Web pagesVector Space Model
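The vector length and angle comparison described above is typically computed as cosine similarity. A minimal sketch over toy term-weight vectors (the terms and weights are invented):

```python
# Sketch: the "vector length and angle" matching described above,
# computed as cosine similarity between sparse term-weight vectors.
import math

def cosine_similarity(a: dict, b: dict) -> float:
    shared = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in shared)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)

query = {"swim": 1, "coach": 1}
doc1 = {"swim": 3, "coach": 1, "pool": 2}  # overlaps the query
doc2 = {"travel": 2, "guide": 1}           # shares no terms

print(cosine_similarity(query, doc1) > cosine_similarity(query, doc2))  # True
```

Documents whose vectors point in nearly the same direction as the query vector are considered the most relevant.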

The Tragedy of the Commons

 An essay about how, in order to protect the commons, some people will have to give up some rights or care more for the commons. In marketing, attention is the commons, and Google largely won distribution because they found ways to make marketing less annoying.See also:The Tragedy of the Commons


Thesaurus

 A synonym directory search engines use to help increase the relevancy of returned results.Thesaurus tools can also be used for keyword research, helping search marketers find keywords related to a target term.


Title

 The title element is used to describe the contents of a document.The title is one of the most important aspects of on-page SEO. Each page title should be:
  • Unique to that page: Not the same for every page of a site!
  • Descriptive: What important ideas does that page cover?
  • Not excessively long: Typically page titles should be kept to 8 to 10 words or less, with some of the most important words occurring near the beginning of the page title.
 Page titles appear in search results as the links searchers click on. In addition, many people link to documents using the official document title as the link anchor text. Thus, by using a descriptive page title you are likely to gain descriptive anchor text and are more likely to have your listing clicked on.On some occasions, it also makes sense to use a title which is not literally descriptive but is easily associated with human emotions or a controversy such that your idea will spread further and many more people will point quality editorial links at your document.There are two main ways to write titles and be SEO friendly:
  • Write literal titles that are well aligned with things people search for. This works well if you need backfill content for your site or already have an amazingly authoritative site.
  • Write page titles that are exceptionally compelling to link at. If enough people link to them then your pages and site will rank for many relevant queries even if the keywords are not in the page titles.
See also:W3 Title ElementCopyblogger: Magnetic Headlines
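As a rough illustration, the length guideline above can be checked programmatically. A sketch using Python's standard-library HTML parser, with invented page markup:

```python
# Sketch: extracting a page title and checking the rough length
# guideline above (8-10 words or less). The page markup is invented.
from html.parser import HTMLParser

class TitleExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

page = "<html><head><title>Swim Coaching Tips for Beginners</title></head></html>"
extractor = TitleExtractor()
extractor.feed(page)

words = extractor.title.split()
print(extractor.title, "-", "ok" if len(words) <= 10 else "too long")
```

Running a check like this across a whole site also makes duplicate titles easy to spot.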

Title Tag

 The section of code from a web page that specifies what text will appear at the top of the browser window for that page and that will appear in the top line of text describing that page in the search results. The <TITLE> tag appears within the <HEAD> section of a web page.

Top Heavy

 Google algorithm which penalizes websites which have a high ad density above the fold and sites which make it hard to find the content a user searched for before landing on the page.See also:Page layout algorithm improvement

Topic-Sensitive PageRank

 A method of computing PageRank which instead of producing a single global score creates topic related PageRank scores.See also:Topic-Sensitive PageRank – official research paper on the topicHow To Prosper With The New Google – Dan Thies offers a downloadable PDF highlighting how TSPR may have been largely at play during the Google Florida update.
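A minimal sketch of the idea: ordinary PageRank teleports uniformly to all pages, while the topic-sensitive variant biases teleportation toward a topic-specific set of seed pages. The link graph and seed set below are invented:

```python
# Sketch: topic-sensitive PageRank biases the teleport vector toward
# topic seed pages instead of spreading it uniformly. The tiny link
# graph and seed set are invented for illustration.
def pagerank(links, teleport, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {}
        for p in pages:
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - damping) * teleport.get(p, 0) + damping * incoming
        rank = new
    return rank

links = {"a": ["b"], "b": ["a", "c"], "c": ["a"]}
uniform = {p: 1 / 3 for p in links}        # classic global PageRank
topic_biased = {"c": 1.0}                  # teleport only to topic seed page c

print(pagerank(links, topic_biased)["c"] > pagerank(links, uniform)["c"])  # True
```

Computing one such vector per topic, then blending them based on the query's topic, is what lets the same link graph yield topic-specific rankings.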


Trackback

 Automated notification, built into most popular blogging software programs, that another website mentioned your site.Due to the automated nature of trackbacks, they are typically quite easy to spam. Many publishers turn trackbacks off due to a low signal to noise ratio.


Trending

 When a topic or piece of content “goes viral” and is widely shared and discussed, especially within its niche, though it may spread across the whole web.See also:Google Trends – Where I go to check trend data (although Twitter and Facebook have some good info on trends as well)


TrustRank

 Search relevancy algorithm which places additional weighting on links from trusted seed websites that are controlled by major corporations, educational institutions, or governmental institutions.See also:TrustRank algorithm – Stanford's proposal to combat web spam with this algorithm


Typepad

 Hosted blogging platform provided by SixApart, who also makes Movable Type.It allows you to publish sites on a subdomain off of Typepad.com, or to publish content which appears as though it is on its own domain. If you are serious about building a brand or making money online you should publish your content to your own domain because it can be hard to reclaim a website’s link equity and age related trust if you have built years of link equity into a subdomain on someone else’s website.See also:Typepad.comDomain Mapping – how to have your Typepad hosted blog appear under a different URL.


Unethical SEO

 Some search engine marketers lacking in creativity try to market their services as being ethical, whereas services rendered by other providers are somehow unethical. SEO services are generally neither ethical nor unethical. They are either effective or ineffective, and either allowed by webmaster guidelines or not.SEO is an inherently risky business, but any quality SEO service provider should make clients aware of potential risks and rewards of different recommended techniques.

Unsubscribe rate

 Used for email marketing. The percentage of how many subscribers choose to click the unsubscribe link and no longer receive your email announcements. Typically calculated per email campaign sent.
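With made-up numbers, the per-campaign calculation looks like:

```python
# Sketch: unsubscribe rate for one email campaign, using invented
# numbers (12 unsubscribes out of 4,800 delivered emails).
def unsubscribe_rate(unsubscribes: int, delivered: int) -> float:
    return unsubscribes / delivered

print(f"{unsubscribe_rate(12, 4800):.2%}")  # 0.25%
```

A rising rate across campaigns is usually an early warning that content or send frequency needs adjusting.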


Update

 Search engines frequently update their algorithms and data sets to help keep their search results fresh and make their relevancy algorithms difficult to reverse engineer. Most major search engines are continuously updating both their relevancy algorithms and search index.See also:Google Terminology – Matt Cutts talks about the difference between an algorithm update and an index update


URL

 Uniform Resource Locator is the unique address of any web document.

URL Rewrite

 A technique used to help make URLs more unique and descriptive to help facilitate better sitewide indexing by major search engines.See also:Apache Module mod_rewrite – Apache module used to rewrite URLs on sites hosted on an Apache server.ISAPI Rewrite – software to rewrite URLs on sites hosted on a Microsoft Internet Information Server.


Usability

 How easy it is for customers to perform their desired actions.The structure and formatting of text and hyperlink based calls to action can drastically increase your website usability, and thus conversion rates.See also:Don’t Make Me Think – Steve Krug’s book about designing an easy to use website.The Big Red Fez – Seth Godin’s short book on common website usability errors.useit – Jakob Nielsen’s newsletter on usability and web design.A User Interface Usability Checklist for Ecommerce Websites – Kim Krause’s website usability checklist.

Usage Data

 Things like a large stream of traffic, a high percent of visitors as repeat visitors, long dwell time, multiple page views per visitor, a high clickthrough rate, or a high level of brand related search queries may be seen by some search engines as a sign of quality. Some search engines may leverage these signals to improve the rankings of high quality documents and high quality websites via algorithms like Panda.

User Engagement

 (see usage data)

User generated content

 Any kind of content that is not created by the publisher or brand. Examples of user generated content would be comments on blog posts or on social media updates, or any social content a consumer or user creates about or in connection to a brand.


Vector Space Model

 (see Term Vector Database)

Vertical Search

 A search service which is focused on a particular field, a particular type of information, or a particular information format.For example, Business.com would be a B2B vertical search engine, and YouTube would be a video based vertical search engine.

Viral Marketing

 Self propagating marketing techniques. Common modes of transmission are email, blogging, and word of mouth marketing channels.Many social news sites and social bookmarking sites also lead to secondary citations.See also:Unleashing the Ideavirus – free Seth Godin ebook about spreading ideas

Virtual Domain

 Website hosted on a virtual server.

Virtual Server

 A server which allows multiple top level domains to be hosted on a single computer.Using a virtual server can save money for smaller applications, but dedicated hosting should be used for large commercial platforms. Most domains are hosted on virtual servers, but using a dedicated server on your most important domains should add server reliability, and could be seen as a sign of quality. Dedicated servers usually run from $100 to $500 a month. Virtual servers typically run from $5 to $50 per month.



Weblog

 (see Blog)

Webmaster Tools

 (see Google Webmaster Tools)

White Hat SEO

 Search engines set up guidelines that help them extract billions of dollars of ad revenue from the work of publishers and the attention of searchers. Within that highly profitable framework search engines consider certain marketing techniques deceptive in nature, and label them as black hat SEO. Those which are considered within their guidelines are called white hat SEO techniques. The search guidelines are not a static set of rules, and things that may be considered legitimate one day may be considered deceptive the next.

Search engines are not without flaws in their business models, but there is nothing immoral or illegal about testing search algorithms to understand how search engines work. People who have extensively tested search algorithms are probably more competent and more knowledgeable search marketers than those who give themselves the arbitrary label of white hat SEOs while calling others black hat SEOs.

When making large investments in processes that are not entirely clear, trust is important. Rather than looking for reasons not to work with an SEO it is best to look for signs of trust in a person you would like to work with.See also:White Hat SEO.com – parody site about white hat SEOBlack Hat SEO.com – parody site about black hat SEO


Whois

 Each domain has an owner on record, and ownership data is stored in the Whois record for that domain.Some domain registrars allow you to hide the ownership data of your sites, and many large scale spammers use fake Whois data. The whois command-line tool may be used to determine a domain owner and their contact details, and there are also websites which let you run Whois lookups.See also:Whois Tool


Wiki

 Software which allows information to be published using collaborative editing.


WordNet

 A lexical database of English words which can be used to help search engines understand word relationships.See also:Wordnet – official site


WordPress

 A popular open source blogging software platform, offering both a downloadable blogging program and a hosted solution.If you are serious about building a brand or making money online you should publish your content to your own domain because it can be hard to reclaim a website’s link equity and age related trust if you have built years of link equity into a subdomain on someone else’s website.WordPress.org is my favorite CMS right now.See also:WordPress.org – download the softwareWordPress.com – offers free blog hosting



XHTML

 Extensible HyperText Markup Language is a class of specifications designed to move HTML to conform to XML formatting.See also:W3C: XHTML


XML

 Extensible Markup Language is a simple, very flexible text format derived from SGML, used to make it easy to syndicate or format information using technologies such as RSS.See also:W3C: XML



Yahoo!

 Internet portal company which was started with the popular Yahoo! Directory.See also:Yahoo.comThe History of Yahoo!

Yahoo! Answers

 A free question asking and answering service which allows Yahoo! to leverage social structures to create a ‘bottom-up network’ of free content.See also:Yahoo! Answers

Yahoo! Directory

 One of the original, most popular, and most authoritative web directories, started by David Filo and Jerry Yang in 1994.The Yahoo! Directory is one of a few places where most any legitimate site can pick up a trusted link. While the cost of $299 per year (It is free right now ;))may seem expensive to some small businesses, a Yahoo! Directory link will likely help boost your rankings in major search engines.See also:Yahoo! DirectoryNewly Added Sites

Yahoo! Search Marketing

 Yahoo!’s paid search platform, formerly known as Overture.See also:Yahoo! Search Marketing

Yahoo! Site Explorer

 Research tool which webmasters can use to see what pages Yahoo! has indexed from a website, and what pages link to those pages.See also:Yahoo! Site Explorer


YouTube

 Feature-rich amateur video upload and syndication website owned by Google. Some consider YouTube a social media site.See also:YouTube.com



Zeal

 Non-commercial directory which was bought by Looksmart for $20 million, then abruptly shut down with little warning.