Columbus, Ohio SEO Expert | Jacob Stoops

SEO Dictionary

A guide to many of the key terms in SEO, Social Media, and Internet Marketing.

Before you ask, I’ll come out and admit that I did not come up with this information on my own. Instead, I’ve collected information from several reputable sources and have tried to add my own thoughts and references where possible.

My objective is to compile the best information (and most valuable to my site’s users) from the most informative places around the web on commonly used terms in Search Engine Optimization, Marketing, and more. My end goal is to be a highly-valuable resource for all-things SEO-related.

Jump To: A | B | C | D | E | F | G | H | I | J | K | L | M | N | O | P | Q | R | S | T | U | V | W | X | Y | Z | #

A

Above the Fold - A traditional marketing term used to describe the top portion of a newspaper. On the web (or in email), it means the area of content in the browser viewable before scrolling.

Absolute Link – A hyperlink which includes the full URL of the page being linked to. By contrast, some links only show a relative path instead of the entire reference URL.

Example of Absolute link:

<a href="http://agent-seo.com/seo/the-file.html">totally awesome</a>

Example of Relative link:

<a href="../seo/the-file.html">totally awesome</a>

Affiliate Marketing – Affiliate marketing programs allow merchants to expand their market reach and mindshare by paying independent agents on a cost-per-action (CPA) basis. Affiliates only get paid if visitors complete an action.

Most affiliates make next to nothing because they are not aggressive marketers, have no real focus, waste money on instant-wealth programs that lead them to buy a bunch of unneeded garbage via others’ affiliate links, and do not attempt to create any real value.

Some power affiliates make hundreds of thousands or millions of dollars per year because they are heavily focused on automation and/or tap large traffic streams. Typically niche affiliate sites make more per unit effort than overtly broad ones because they are easier to focus (and thus have a higher conversion rate).

Age – Domain age, page age, user account age, and related historical data are sometimes taken into account when determining how much to trust a particular person, website, page, or document. Generally, older domains are more trusted and therefore deemed more authoritative. However, some search engines, like blog search engines, may boost the relevancy of newer documents.

Sometimes, fresh content may temporarily rank better than you might expect, because of citations from other channels which place the content or article on their home page or a well trusted (high PageRank) page on their site. After those sites publish more content and the referenced page falls deeper into their archives, those links typically become less authoritative causing a drop in rank.

Also see: Supplemental Index

AJAX – Asynchronous JavaScript and XML is a technique which allows a web page to request additional data from a server without requiring a new page to load.
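
As a rough illustration (the URL and element id below are made up), a page might use JavaScript to fetch extra data and insert it into the document without a full reload:

<div id="results"></div>
<script>
  var xhr = new XMLHttpRequest();
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      // Insert the server's response into the page without reloading
      document.getElementById("results").innerHTML = xhr.responseText;
    }
  };
  xhr.open("GET", "/more-results.html", true);
  xhr.send();
</script>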

Alexa – A California-based subsidiary of Amazon.com known for its toolbar and website. Once installed, the toolbar collects data on browsing behavior, which is transmitted to the website where it is stored and analyzed and forms the basis for the company’s web traffic reporting.

Visit Alexa.com

Algorithm – An algorithm is a set of finite, ordered steps for solving a mathematical problem. Each Search Engine uses a proprietary algorithm set to calculate the relevance of its indexed web pages to your particular query. The result of this process is a list of sites ranked in the order of relevance as deemed by each particular search engine.

It is important to note:

  • Each search engine has a different algorithm that may weigh various factors differently.
  • Search engine algorithms are closely guarded in order to prevent exploitation of algorithmic results.
  • Search algorithms are also changed frequently to incorporate new data and improve relevancy.

AlltheWeb – Search engine which was created by Fast, then bought by Overture, which was in turn bought by Yahoo!. Yahoo! may use AllTheWeb as a test bed for new search technologies and features.

Visit AlltheWeb

Alt Attribute/Alt Tag – An Alt Attribute (or Alt Tag) is a way for you to give descriptions to your site’s images. Search engines aren’t able to look at an image and tell what it is about. Therefore, it is very important that you provide descriptions within the Alt Attribute so that they can scan, index, and rank your image accordingly.

To a search engine webcrawler, an image represents a blank space on the canvas of a website. By giving each image a description, you provide search engines with a frame of reference for understanding what each image on your site is about.

In addition, Alt Tags are necessary from a usability perspective. Blind people (and people with other disabilities) are not able to easily distinguish what is in an image. Using an image alt attribute allows you to help screen readers and search engines understand the function of an image by providing a text equivalent for the object.

Example of an Alt Attribute:

<img src="http://www.example.com/images/picture2.jpg" alt="Pretty" width="50" height="50" />

AltaVista – Search engine bought out by Overture (although Overture was later bought by Yahoo!). AltaVista was an early powerhouse in search, but on October 25, 1999 they did a major algorithmic upgrade which caused them to dump many websites. Ultimately, the upgrade and brand mismanagement drove them toward irrelevance, and they lost market and mind share.

Visit AltaVista

Amazon.com – Amazon.com is the largest internet retailing website. Amazon.com is rich in consumer generated media. Amazon also owns a number of other popular websites, including IMDb and Alexa.

Visit Amazon.com

Anchor Text – The text that a user clicks on when following a hyperlink. Quality anchor text within inbound links can carry tons of SEO value back to your site. Search engines assume that your page is authoritative for the words that people include in links pointing at your site. When links occur naturally they typically have a wide array of anchor text combinations.

Example of anchor text:

<a href="http://www.example.com/">World's Best Gardening Tips Blog</a>

Beware: Too much similar anchor text may be considered a sign of manipulation, and thus discounted or filtered. When building links that you control, make sure you mix up your anchor text. Outside of your core brand terms, if you are targeting Google you probably do not want more than 10% to 20% of your anchor text to be the same.

Note: In cases where the link is an image, the image alt attribute may act in place of anchor text.
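
For example, an image link where the alt attribute stands in for anchor text might look like this (the URL and file name are illustrative):

<a href="http://www.example.com/"><img src="logo.jpg" alt="World's Best Gardening Tips Blog" /></a>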

AOL – Stands for America Online. AOL is a popular web portal which merged with Time Warner.

Visit AOL

API – Application Program Interface. A series of conventions or routines used to access software functions. Most major search products have an API program.

Arbitrage – Exploiting market inefficiencies by buying and reselling a commodity for a profit. As it relates to the search market, many thin content sites laced with an Overture feed or AdSense ads buy traffic from the major search engines and hope to send some percent of that traffic clicking out on a higher-priced ad. Shopping search engines generally draw most of their traffic through arbitrage.

Ask – Ask is a search engine owned by InterActive Corp. They were originally named Ask Jeeves, but they dumped Jeeves in early 2006. Their search engine is powered by Teoma.

Visit Ask.com

ASP – Active Server Pages, a dynamic server-side scripting technology from Microsoft.

Visit ASP.net

Authority – The ability of a page or domain to rank well in search engines. Five large factors associated with site and page authority are link equity, site age, traffic trends, site history, and publishing unique original quality content.

In a general sense, search engines try to create algorithms that rank websites that are deemed the most relevant authority on a given subject or query. The goal of every website owner should be to posture their website presence properly, so that they can be perceived as an “Authority” on whatever subject or field they are targeting.

Search engines constantly tweak their algorithms to try to balance relevancy algorithms based on topical authority and overall authority across the entire web. Sites may be considered topical authorities or general authorities. For example, Wikipedia and DMOZ are considered broad general authority sites. Sites such as SEOBook or SEOMoz have topical authority on SEO, but not a broad general authority.

Automated Bid Management Software – Software used to manage bids on pay-per-click search engines. Some search engines and third-party software developers have created software which makes it easier to control your ad spend. Some of the more advanced tools can integrate with your analytics program and help you focus on conversion, ROI, and earnings elasticity instead of just looking at cost per click.

If you want to program internal bid management software, you can get a developer token to use the Google AdWords API.

B

Backlink – see Inbound Link

Bad Request – An HTTP 400 status code indicating that the request could not be understood by the server due to malformed syntax. The client should not repeat the request without modifications.

Bait and Switch – Marketing technique where you make something look overtly pure or as though it has another purpose to get people to believe in it or vote for it (by linking at it or sharing it with friends), then switch the intent or purpose of the website after you gain authority.

It is generally easier to get links to informational websites than commercial sites. Some new sites might gain authority much quicker if they tried looking noncommercial and gaining influence before trying to monetize their market position.

Banned – Also known as delisted or blacklisted, a banned site is a URL that has been removed from a search engine’s Index, typically for engaging in Black Hat SEO. Banned sites are ignored by search engines.

Banner Ad – A banner ad is a rectangular graphic advertisement. Banner ads are one of the most common forms of online advertising. Their sizes vary, but most measure 468 pixels wide by 60 pixels high. Clicking on a banner ad will direct you to the advertiser’s website or a designated Landing Page.

Banner Blindness – Banner blindness is a usability phenomenon in which a website visitor completely overlooks a banner. Such a banner may either be an advertising banner from an external site, or a banner that the serving site intends to use to promote content or a navigation link.

During the first web boom many businesses were based on eyeballs more than actually building real value. Many ads were typically quite irrelevant, and web users learned to ignore the most common ad types.

In many ways text ads are successful because they are more relevant and look more like content, but with the recent surge in the popularity of text ads some have speculated that in time people may eventually become text ad blind as well.

Behavioral Targeting – Ad targeting based on a user’s recent behavior and/or implied intent. For example, if you recently searched online for restaurants and are later reading another website, that page may still show you restaurant ads.

Best of the Web – Best of the Web Directory is a commercial web directory founded in 1994 providing websites categorized topically and regionally. BOTW allows site owners to submit their websites for an expedited review, and commercial sites are required to pay for the review.

Visit BOTW.org

Bid Management Software – see Automated Bid Management Software

Bing – Formerly known as Live Search, Windows Live Search, and MSN Search, Bing is the current web search engine (advertised as a “decision engine”) from Microsoft. Unveiled by Microsoft CEO Steve Ballmer on May 28, 2009 at the All Things Digital conference in San Diego, Bing is a replacement for Live Search. It went fully online on June 3, 2009, with a preview version released on June 1, 2009. In its first few weeks Bing was successful in gaining some market share.

Visit Bing

Black-Hat SEO – Refers to SEO tactics that are deemed by search engines as deceptive in nature. Search engines set up guidelines that help them extract billions of dollars of ad revenue from the work of publishers and the attention of searchers.

Within that highly profitable framework, search engines consider certain marketing and optimization techniques deceptive in nature, and label them as Black-Hat SEO. Those tactics which are considered within their guidelines are called White-Hat SEO techniques.

Note: The search guidelines are not a static set of rules, and things that may be considered legitimate one day may be considered deceptive the next. Search engines are not without flaws in their business models, but there is nothing immoral or illegal about testing search algorithms to understand how search engines work.

As long as you’re not trying to be tricky or deceptive, you should be okay. A good rule of thumb to avoid accidentally using black-hat practices is to design the interface and content of your website with a user-centered focus first and a search-engine focus second.

Blacklisted – Also known as banned or delisted, a blacklisted site is a URL that has been removed from a search engine’s Index, typically for engaging in Black Hat SEO. Blacklisted sites are ignored by search engines.

Blog – Short for web-log. A periodically updated online journal typically formatted in reverse chronological order. Many blogs not only archive and categorize information, but also provide a feed and allow simple user interaction like leaving comments on the posts.

Most blogs tend to be personal in nature; however, more and more businesses are beginning to add and maintain blogs as core online competencies. Blogs can generally be quite authoritative with heavy link equity because they give people a reason to frequently come back to the site, read the content, and link to whatever they think is interesting.

Blog Comment Spam – Either manually or automatically (via a software program) adding low value or no value comments to other sites.

Automated blog spam:

“Really great tips that everyone should follow. Also easy to understand. Thanks.”
by
Discreet Adult XXX Hardcore Porn Free Shipping

Manual blog spam:

“I was just looking for info on this topic. Now I’m going to leave a generic comment that contributes no value to your post, while asking you to link to my site. Check it out!!!”
by
douchebag manual spammer (usually keywords used in name)

Blogger – (1) Blogger is a free blog platform owned by Google. It allows you to publish sites on a subdomain of Blogspot.com, or to FTP content to your own domain. Blogger is probably the easiest blogging software tool to use, but it lacks some features present in other blog platforms.

Note: If you are serious about building a brand or making money online you should publish your content to your own domain, because it can be hard to reclaim a website’s link equity and age related trust if you have built years of link equity into a subdomain on someone else’s website. By default, Blogger blogs go to http://somename.blogspot.com as a URL format. As stated above, this is not optimal for SEO and you should try to get the content to post to your own site.

Visit Blogger

(2) A blogger is someone (anyone) who posts to their own weblog. This could be you!

Blogroll – Link list on a blog, usually linking to other blogs owned by the same company, friends of that blogger, or blogs that a blogger likes. A blogroll can be used to do link exchanges.

Bold – A way to make words appear in a bolder font. Words that appear in a bolder font are more likely to be read by humans who are scanning a page.

A search engine may also place slightly greater weighting on these words than on regular text; but if you write natural page copy and a word or phrase appears on a page many times, it probably does not make sense or look natural to bold every occurrence. Word to the wise: don’t overuse this as an SEO technique. It will look bad from a user perspective and will dilute the effect from an SEO perspective.

Example:

<b>bold text</b>
<strong>bold text</strong>

Bookmarks – A bookmark is a feature in most Internet browsers that allows you to save the address of a web site you like. In Microsoft Internet Explorer, bookmarks are known as favorites, and a menu option is available for viewing your favorites in the browser. Most browsers come with the ability to bookmark your favorite pages. Many web-based services have also been created to allow you to bookmark.

Boolean Search – A search allowing the inclusion or exclusion of documents containing certain words through the use of operators such as AND, NOT and OR. Search engines like Google and Yahoo! make standard use of the AND operator.

For example:

  • A Google search for Agent SEO will return results for Agent AND SEO.
  • A Google search for “Agent SEO” will return results for the phrase Agent SEO.
  • A Google search for Agent SEO -Jake will return results containing Agent AND SEO but NOT Jake.
  • A Google search for ~SEO -SEO will find results with words related to SEO that do not contain SEO.

Brand – The emotional response associated with your company and/or products. A brand is built through controlling customer expectations and the social interactions between customers. Building a brand is what allows you to move away from commodity based pricing and move toward higher margin value based pricing.

Branded Keywords – Keywords or keyword phrases associated with a brand. Typically branded keywords occur late in the buying cycle, and are some of the highest value and highest converting keywords.

Some affiliate marketing programs prevent affiliates from bidding on the core brand related keywords, while others actively encourage it. Either way can work depending on your business model and marketing savvy, but it is important to ensure there is synergy between internal marketing and affiliate marketing programs.

Breadcrumb Navigation – Navigational technique used to help search engines and website users understand the relationship between pages and website hierarchy.

Example of breadcrumb navigation structure:

Home > Services > Search Engine Marketing
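
In HTML, a breadcrumb trail is often just a short list of links; a minimal sketch (the URLs are illustrative):

<a href="/">Home</a> &gt; <a href="/services/">Services</a> &gt; Search Engine Marketing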

Bridge – An entry into a site other than the homepage. It may be a legitimate landing page, used to measure the results of a specific promotion or campaign. But such pages are often created exclusively to spam search engine results, or misrepresent a site’s content. Search engines frown heavily on the use of doorway pages, and may penalize sites if the practice is reported or discovered.

See also:

  • Doorway pages

Broken Link – A hyperlink that is not functioning, or a link which does not lead to the desired location. Links may be broken for a number of reasons, but four of the most common reasons are:

  1. a website going offline
  2. linking to content which is temporary in nature (due to licensing structures or other reasons)
  3. linking to a page or site that has moved to another location
  4. linking to a site whose root domain name has changed

Most large websites have some broken links, but if too many of a site’s links are broken it may be an indication of outdated content and may provide website users with a poor user experience, both of which may cause search engines to rank a page as less relevant.

If you’re running a WordPress blog or website, I recommend using the Broken Link Checker plugin to monitor the status of your links.

Browser – Application used to view the World Wide Web. Some popular browsers include:

  • Internet Explorer
  • Firefox
  • Safari
  • Opera

Buying Cycle – Before purchasing goods, services, or ideas, customers typically go through a buying cycle: first showing interest, then conducting research, and finally making a purchase decision. The time it takes to complete the buying cycle varies depending on the complexity and price of the item being purchased.

C

Cache – Copy of a web page stored by a search engine. When you search the web you are not actively searching the whole web, but are searching files in the search engine index. Some search engines provide links to cached versions of pages in their search results, and allow you to strip some of the formatting from cached copies of pages.

Canonical URL – Many content management systems are configured with errors which cause duplicate or exceptionally similar content to get indexed under multiple URLs. Many webmasters use inconsistent link structures throughout their site that cause the exact same content to get indexed under multiple URLs.

The canonical version of any URL is the single most authoritative version indexed by major search engines. Search engines typically use PageRank or a similar measure to determine which version of a URL is the canonical URL.

Webmasters should use consistent linking structures throughout their sites to ensure that they funnel the maximum amount of PageRank at the URLs they want indexed. When linking to the root level of a site or a folder index it is best to end the link location at a / instead of placing the index.html or default.asp filename in the URL.

Examples of URLs which may contain the same information in spite of being at different web addresses:

  • http://agent-seo.com/
  • http://www.agent-seo.com/
  • http://agent-seo.com/index.php
  • http://www.agent-seo.com/index.php
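
Search engines also support a canonical link element, placed in the <head> of the duplicate versions, that points at the preferred URL; a minimal sketch using this site's homepage:

<link rel="canonical" href="http://agent-seo.com/" />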

Click-Fraud – The illegal practice of manipulating Cost-Per-Click (CPC) or Pay-Per-Click (PPC) revenue sharing agreements. There are numerous types of click fraud, but in a typical scenario the webmaster of a site that earns money from each click of the advertising links it publishes pays individuals a small fee to click those links.

Companies thus pay for advertising to clients who had no intention of buying from them. Some companies have filed class action lawsuits alleging that ad publishers such as Google and Yahoo! have failed to aggressively confront click fraud because they benefit from increased CPC revenue.

Click-Through – Refers to a single instance of a user clicking on an advertising link or site listing and moving to a Landing Page. A higher Click-Through Rate (CTR) is one of the primary goals of Search Engine Optimization.

Click-Through-Rate (CTR) – The percentage of users who click on an advertising link or search engine site listing out of the total number of people who see it, i.e. four click-throughs out of ten views is a 40% CTR.

Cloaking – Displaying different content to search engines and searchers. Depending on the intent of the display discrepancy and the strength of the brand of the person / company cloaking, it may be considered reasonable or it may get a site banned from a search engine. Cloaking is typically done to achieve a higher search engine position or to trick users into visiting a site. In such cases cloaking is considered to be Black Hat SEO and the offending URL could be Blacklisted.

Cloaking has many legitimate uses which are within search guidelines. For example, changing user experience based on location is common on many popular websites.

Clustering – In SERPs, clustering is limiting each represented website to one or two listings. This is done to make the search results appear neat and organized, as well as to ensure diversity amongst the top ranked search results.

Code Bloat – Code bloat is the production of code that is perceived as unnecessarily long, slow, or otherwise wasteful of resources. Code bloat can also be caused by inadequacies in the language in which the code is written, or inadequacies in the compiler used to compile the language.

Content Management System (CMS) – Tool used to help make it easy to update and add information to a website. Blog software programs such as WordPress are some of the most popular content management systems currently used on the web. In short, a good CMS will make it easier for non-HTML savvy users to create and maintain a website or blog.

Beware: Many content management systems have errors associated with them which make it hard for search engines to index content due to issues such as duplicate content.

Content-Rich Doorway – A doorway page dressed up with graphics, navigation, and linked to from a site map so that it looks like a normal part of a Web site. The copy is written to rank for a single keyword expression.

Contextual Link-Inventory (CLI) – Search engines/advertising networks use their contextual link inventory to match keyword-relevant text-link advertising with site content. CLI is generated based on listings of website pages with content that the ad-server deems a relevant keyword match. Ad networks further refine CLI relevancy by monitoring the Click-Through Rate of the displayed ads.

Conversion – Conversion is the term used for any significant action a user takes while visiting a site, i.e. making a purchase, requesting information, or registering for an account.

Conversion Rate – The key metric to evaluate the effectiveness of a conversion effort (accepting a free gift, setting an appointment), reflecting the percentage of people converted into buyers (or subscribers, or whatever action is desired) out of the total population exposed to the conversion effort.

For Web sites, the conversion rate is the number of visitors who took the desired action divided by the total number of visitors in a given time period (typically, per month). For example, the conversion rate of visitors that subscribe to a newsletter = number of subscribers divided by number of visitors. If a website has 10,000 visitors and 500 subscribe, the conversion rate equals 1 out of 20 (or 5%).

Cookie – Small data file written to a user’s local machine to track them. Cookies are used to help websites customize your user experience, can help your browser remember passwords, track search history, help affiliate program managers track conversions, and much more.

Cost-Per-Action (CPA) – In a cost-per-action advertising revenue system, advertisers are charged a conversion-based fee (i.e. each time a user buys a product, opens an account, or requests a free trial). Many affiliate marketing programs and contextual ads are structured on a cost per action basis.

Cost-Per-Click (CPC) – See Pay-Per-Click Advertising

Cost-Per-Impression – See Cost-Per-Thousand

Cost-Per-Thousand (CPM) – Also known as Cost-Per-Impression or CPM for cost-per-mille (mille is the Latin word for thousand), cost-per-thousand is an advertising revenue system used by search engines and ad networks in which advertising companies pay an agreed amount for every 1,000 users who see their ads, regardless of whether a click-through or conversion is achieved. CPM is typically used for Banner Ad sales, while Cost-Per-Click is typically used for text link advertising.

Crawlability – Refers to the ability of a search engine to crawl through the entire text content of your website, easily navigating to every one of your web pages, without encountering an unexpected dead-end or chained redirects.

Crawler – See Web Crawler

Crawl Depth – How deeply a website is crawled, and how many pages are indexed (or what percentage of the site’s pages).

Since searches which are longer in nature tend to be more targeted, it is important to try to get most or all of a site indexed such that the deeper pages have the ability to rank for relevant long-tail keywords.

A large site needs adequate link equity to get deeply indexed. Another thing which may prevent a site from being fully indexed is duplicate content issues.

Crawl Frequency – How frequently a website is crawled. Sites which are well trusted or frequently updated may be crawled more frequently than sites with low Trust Scores and limited Link Authority.

On the flip side, websites with highly artificial link authority scores (i.e. mostly low quality spammy links) or sites which are heavy in duplicate content or near duplicate content (such as affiliate feed sites) may be crawled less frequently than sites with unique content which are well integrated into the web.

Crawl Page – A document consisting of links to other pages, provided for the sole purpose of giving crawlers (robots) links to follow. Spammers used to submit these pages to the search engines en masse.

CSS – Stands for Cascading Style Sheets, which is a method for adding styles to web documents. Using external CSS files makes it easy to change the design of many pages by editing a single file. You can link to an external CSS file using code similar to the following in the head of your HTML documents:

<link rel="stylesheet" href="http://www.seobook.com/style.css" type="text/css" />

Tip: Try to stay away from using inline CSS styling, as it causes Code Bloat, which is bad for SEO.

Cybersquatting – Registering domains related to other trademarks or brands in an attempt to cash in on the value created by said trademark or brand.

D

Dead Link – A link which is no longer functional. Most large, high-quality websites have at least a few dead links in them, but the ratio of good links to dead links can be seen as a sign of solid information quality.

Dedicated Server – Server which is limited to serving one website or a small collection of websites owned by a single person. Dedicated servers tend to be more reliable than shared (or virtual) servers. Dedicated servers usually run from $100 to $500 a month. Virtual servers typically run from $5 to $50 per month.

Deep Link – A link which points to an interior page within a website. When links grow naturally typically most high quality websites have many links pointing at many different interior pages.

When you request links from other websites it makes sense to request a link from their most targeted relevant page to your most targeted relevant page.

Del.icio.us – Highly popular social bookmarking website.

Visit Del.icio.us

Delisting – A URL that has been removed from a search engine’s index. Delisted sites are ignored by search engines. A site may become delisted for many reasons:

  • Engaging in Black-Hat SEO tactics
  • Pages on new websites (or sites with limited link authority relative to their size) may be temporarily de-indexed until the search engine does a deep spidering and re-cache of the web.
  • Pages which have changed location and are not properly redirected, or pages which are down when a search engine tries to crawl them may be temporarily de-indexed.
  • If a website tripped an automatic spam filter it may return to the search index anywhere from a few days to a few months after the problem has been fixed.
  • If a website is editorially removed by a human you may need to contact the search engine directly to request re-inclusion.
  • During some updates search engines readjust crawl priorities.

Digg – Social news site where users vote (or “Digg”) on which stories get the most exposure and become the most popular.

Visit Digg.com

Directory – A search site whose index is compiled by human editors (as opposed to web spiders). Although editors may proactively include sites they consider to be of value, most inclusions are the result of submitted requests. The decision to include a site, and its subsequent ranking and categorization, is one of editorial judgment rather than being computed by an algorithm.

Some directories cater to specific niche topics, while others are more comprehensive in nature. Major search engines likely place significant weight on links from DMOZ and the Yahoo! Directory. Smaller and less established general directories likely pull less weight. If a directory does not exercise editorial control over listings, search engines will be far less likely to trust their links.

DMOZ – The Open Directory Project is the largest, most trusted human edited directory of websites. DMOZ is owned by AOL, and is primarily run by volunteer editors. It is likely that search engines place significant weight on links from this directory.

Visit DMOZ

DNS – Domain Name Server or Domain Name System. A naming scheme mechanism used to help resolve a domain name / host name to a specific TCP/IP Address.

DNS Propagation – When a new domain name is registered (or an existing one is transferred to a new DNS), the information must make its way around the entire internet. This process usually takes around 24 hours, during which time the domain will be inaccessible to many or all users.

Domain – Scheme used for logical or location organization of the web. Many people also use the word domain to refer to a specific website (common domain extensions include .com, .net, .org, .edu, and .gov).

Domain Age – See Age

Domain Authority – See Authority

Domain Mirror – A domain mirror for your website enables you to host two domain names using the same web content. Typically, a domain mirror will be used to cheaply register variants of a domain name (i.e. – the .com.au and .com) and host them both using the same website. They can also be used to assist in the case of changing your domain since the old domain can be mirrored to the new domain once everything is set up.

Doorway – A document with a small amount of text (usually coherent but sometimes gibberish) intended to rank well specifically for one targeted expression. In the old days, people created as many doorways as they had targeted keywords and search engines to work with.

Doorway Page – Pages designed to rank for highly targeted search queries, typically designed to redirect searchers to a page with other advertisements. Some webmasters cloak thousands of doorway pages on trusted domains, and rake in a boatload of cash until they are caught and delisted.

If the page would have a unique purpose outside of search then search engines are generally fine with it, but if the page only exists because search engines exist then search engines are more likely to frown on the behavior.

Dreamweaver – Popular web development and editing software offering a WYSIWYG interface.

Visit Dreamweaver: Official Site

Duplicate Content – Content which is duplicate or near duplicate in nature. Search engines do not want to index multiple versions of similar content.

For example, printer friendly pages may be search engine unfriendly duplicates. Also, many automated content generation techniques rely on recycling content, so some search engines are somewhat strict in filtering out content they deem to be similar or nearly duplicate in nature.

Dynamic Content – Content which changes over time or uses a dynamic language such as PHP to help render the page. In the past search engines were less aggressive at indexing dynamic content than they currently are. While they have greatly improved their ability to index dynamic content it is still preferable to use URL rewriting to help make dynamic content look static in nature.

Dynamic URL – A dynamic URL is the address of a Web page with content that depends on variable parameters that are provided to the server that delivers it.

The parameters may be already present in the URL itself or they may be the result of user input. A dynamic URL can often be recognized by the presence of certain characters or character strings that appear in the URL (visible in the address bar of your browser).

You might use URL parameters in your site to perform various functions apart from modifying the content of the page, such as:

  • Session ids – for tracking user sessions
  • Source trackers – for tracking the sources which are sending referrals to your pages and site
  • Format modifiers – for print formats etc

Example of dynamic URL (the parameters appear after the question mark):

http://www.example.com/video/videoPage.jsp?detailId=14800042&subNavId=page900090&navId=300030&parentId=100006

It is generally my belief that dynamic URLs, with their long strings of random numbers and variables, are not as SEO-friendly as Static URLs.

E

Editorial Links – Search engines count links as “votes” for a website’s quality. They primarily want to count editorial links – i.e. those links that were earned, rather than links that were bought or bartered.

Many paid links, such as those from quality directories, still count as signs of votes as long as they are also associated with editorial quality standards. If they are from sites without editorial control, like link farms, they are not likely to help you rank well. Using an algorithm similar to TrustRank, some search engines may place more trust on well known sites with strong editorial guidelines.

EMR – A search result that precisely matches a user’s search term.

Entry Page – The page on which a user enters your website. If you’re running pay-per-click ads, it is very important to send visitors to the most appropriate and targeted page associated with the keyword they searched for (this strategy is called “deep linking”). If you are doing link building it is important to point links at your most appropriate page when possible so that:

  • If anyone clicks the link they are sent to the most appropriate/relevant page.
  • You help search engines understand what the pages and content on your site are associated with.

Ethical SEO – Search engines like to paint SEO services which manipulate (or take advantage of) their relevancy algorithms as being unethical.

Some search marketers lacking in creativity tend to describe services sold by others as being unethical while calling their own services ethical. In reality, any particular technique is generally not associated with ethics; it is either effective or ineffective.

The only ethics issues associated with SEO are generally business ethics related issues. Two of the bigger frauds are:

  • Not disclosing risks: Some SEOs may use high risk techniques when they are not needed. Some may make that situation even worse by not disclosing potential risks to clients. These techniques may be referred to as gray-hat or black-hat SEO.
  • Taking money & doing nothing: Many people selling SEO services may not actually know how to competently provide them. Some shady people claim to be SEOs and bilk money out of unsuspecting small businesses.

As long as the client is aware of potential risks there is nothing unethical about being aggressive.

External Link – Link which references another domain. Some people believe in link hoarding, but linking out to other related resources is a good way to help search engines understand what your site is about.

However, if you link out to lots of low quality sites or primarily rely on low quality reciprocal links some search engines may not rank your site very well. Search engines are more likely to trust high quality editorial links (both to and from your site).

F

Facebook – Facebook is a highly-popular social networking website that is operated and privately owned by Facebook, Inc. Users can add friends and send them messages, and update their personal profiles to notify friends about themselves. Additionally, users can join networks organized by city, workplace, and school or college.

A January 2009 Compete.com study ranked Facebook as the most used social network by worldwide monthly active users, followed by MySpace.

Visit Facebook.com

Favicon – Favorites Icon is a small icon which appears next to URLs in a web browser. Upload an image named favicon.ico in the root of your site to have your site associated with a favicon.
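
You can also reference the icon explicitly from the <head> of your pages; a minimal sketch:

<link rel="shortcut icon" href="/favicon.ico" type="image/x-icon" />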

Feed – Many content management systems, such as blogs, allow readers to subscribe to content update notifications via RSS or XML feeds.

Feeds can also refer to pay per click syndicated feeds, or merchant product feeds. Merchant product feeds have become less effective as a means of content generation due to improving duplicate content filters.

Feed Aggregator – See Feed Reader

Feed Reader – Software or website used to subscribe to feed update notifications.

FFA – Free for all pages are pages which allow anyone to add a link to them. Generally these links do not pull much weight in search relevancy algorithms because many automated programs fill these pages with links pointing at low quality websites.

Firefox – Popular extensible open source web browser from Mozilla.

Download Mozilla’s Firefox

Flash – Vector graphics-based animation software which makes it easier to make websites look rich and interactive in nature.

Search engines tend to struggle indexing and ranking Flash websites because Flash files typically expose very little crawlable text content. If your website uses Flash, please ensure that:

  • You embed Flash files within HTML pages
  • You use a noembed element to describe what is in the Flash (see the sketch after this list)
  • You publish your Flash content in multiple separate files such that you can embed appropriate Flash files in relevant pages
  • You supplement the Flash with noscript content
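
A rough sketch of embedding a Flash file with a noembed fallback (the file name and description are illustrative):

<embed src="/flash/intro.swf" width="400" height="300"></embed>
<noembed>
  <p>Text description of the Flash content for search engines and users without Flash.</p>
</noembed>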

Frames – A technique created by Netscape used to display multiple smaller pages on a single display. This web design technique allows for consistent site navigation, but makes it hard to deep link to relevant content.

Given the popularity of Server Side Includes, Content Management Systems, and Dynamic Languages there really is no legitimate reason to use frames to build a content site today.

See also:

  • iFrames

Fresh Content – Content which is dynamic in nature and gives people a reason to keep paying attention to your website. Blogs are a good example of fresh content.

Many SEOs talk up fresh content, but fresh content does not generally mean re-editing old content. It more often refers to creating new content.

The primary advantages to fresh content are:

  • Maintain and grow mindshare: If you keep giving people a reason to pay attention to you more and more people will pay attention to you, and link to your site.
  • Faster idea spreading: If many people pay attention to your site, when you come out with good ideas they will spread quickly through such things as social networking.
  • Growing archives: If you are a content producer then owning more content means you have more chances to rank. If you keep building additional fresh content eventually that gives you a large catalog of relevant content.
  • Frequent crawling: Frequently updated websites are more likely to be crawled frequently.

FTP – File Transfer Protocol is a protocol for transferring data between computers. Many content management systems (such as blogging platforms) include FTP capabilities.

Web development software such as Dreamweaver also comes with FTP capabilities. There are also a number of free or cheap FTP programs such as Cute FTP, Core FTP, and Leech FTP.

Fuzzy Search – Search which will find matching terms when terms are misspelled (or fuzzy).

Fuzzy search technology is similar to stemming technology, with the exception that fuzzy search corrects the misspellings at the user’s end, while stemming searches for other versions of the same core word within the index.

G

Gateway Page – Also known as a doorway page or jump page, a gateway page is a URL with minimal content designed to rank highly for a specific keyword and redirect visitors to a homepage or designated Landing Page.

Some search engines frown on gateway pages as a softer form of Cloaking or Spam. However, gateway pages may be legitimate landing pages designed to measure the success of a promotional campaign, and they are commonly allowed in Paid Listings.

See also:

  • Doorway Pages

Geographical Targeting – Geographical targeting is the focusing of Search Engine Marketing efforts on specific states, counties, cities and neighborhoods that are important to a company’s business.

One basic aspect of geographical targeting is adding the names of relevant cities or streets to a site’s keywords, i.e. Hyde Street Chicago apartments. Another important element of geo-targeting is increasing your site’s presence on local search engines such as Google Maps and/or Yahoo! Local.

Geographical Segmentation – Geographic segmentation is the use of analytics to categorize a site’s web traffic by the physical locations from which it originated.

Google – The world’s leading search engine in terms of reach. Google was created in 1996 by Stanford students Larry Page and Sergey Brin, and actually started as a research project.

Since its incorporation in 1998, Google has been on the cutting edge of the search industry. They’ve been famously innovative, pioneering such things as PageRank and Google Earth, as well as acquiring media giants YouTube and DoubleClick.

Search with Google

Googlebot – Google’s search engine spider (or webcrawler). Google has a shared crawl cache between their various spiders, including vertical search spiders and spiders associated with ad targeting.

Googleplex – Nickname for Google’s headquarters in Mountain View, California.

Google AdWords – Google’s advertisement and link auction network. Most of Google’s ads are keyword targeted and sold on a cost per click basis in an auction which factors in ad clickthrough rate as well as max bid.

AdWords is also the program that determines the advertising rates and keywords used in the Google AdSense program. Advertisers bid on the keywords that are relevant to their businesses. Ranked ads then appear as sponsored links on Google Search Engine Results Pages (SERPs) and Google AdSense host sites.

See:

  • Google AdWords
  • Google AdWords Keyword Tool

Google AdSense – Google’s contextual advertising network. Publishers large and small may automatically publish relevant advertisements near their content and share the profits from those ad clicks with Google.

AdSense offers a highly scalable automated ad revenue stream which will help some publishers establish a baseline for the value of their ad inventory. In many cases AdSense will be underpriced, but that is the trade off for automating ad sales.

AdSense ad auction formats include:

  • Cost Per Click (CPC) – advertisers are only charged when ads are clicked on
  • Cost Per Impression (CPM) – advertisers are charged a certain amount per ad impression. Advertisers can target sites based on keyword, category, or demographic information.

AdSense ad formats include:

  • Text
  • Graphic
  • Animated Graphics
  • Videos

In some cases, I have seen ads which got a 2 or 3% click through rate (CTR); while sites that are optimized for maximum CTR (through aggressive ad integration) can obtain as high as a 50 or 60% CTR depending on:

  • How niche their website is
  • How commercially oriented their website is
  • The relevancy and depth of advertisers in their vertical

It is also worth pointing out that if you are too aggressive in monetizing your site before it has built up adequate authority your site may never gain enough authority to become highly profitable. Depending on your vertical your most efficient monetization model may be any of the following:

  • Adsense
  • Affiliate Marketing
  • Direct Ad Sales
  • Selling unique products and services
  • Mixture of all of the above

Google Analytics – A free service offered by Google that generates detailed statistics about the visitors to a website. Its main highlight is that the product is aimed at marketers, as opposed to the webmasters and technologists from whose ranks the web analytics industry originally grew.

It is the most widely used website statistics service, currently in use at around 40% of the 10,000 most popular websites. Google Analytics can track visitors from all referrers, including search engines, display advertising, pay-per-click networks, email marketing and digital collateral such as links within PDF documents.

Visit Google Analytics

Google Base – Free database of semantically structured information created by Google. Google Base may also help Google better understand what types of information are commercial in nature, and how they should structure different vertical search products.

In addition, Google Base is an excellent way to get your product inventory to show up within Google’s search engine results pages.

Visit Google Base

Google Bombing – Making a page rank well for a specific search query by pointing hundreds or thousands of links at it with the keywords in the anchor text.

Google Bowling – Knocking a competitor out of the search results by pointing hundreds or thousands of low-trust, low-quality links at their website.

Typically it is easier to bowl new sites out of the results. Older established sites are much harder to knock out of the search results.

Google Checkout – Payment service provided by Google which helps Google better understand merchant conversion rates and the value of different keywords and markets.

Google Dance – In the past Google updated their index roughly once a month. Those updates were named Google Dances, but since Google shifted to a constantly updating index, Google no longer does what was traditionally called a Google Dance.

Major search indexes are constantly updating. Google refers to this continuous refresh as Everflux.

The second meaning of Google Dance is a yearly party at Google’s corporate headquarters which Google holds for search engine marketers. This party coincides with the San Jose Search Engine Strategies conference.

Google Earth – Google Earth is a virtual globe, map and geographic information program that was originally called EarthViewer 3D, and was created by Keyhole, Inc, a company acquired by Google in 2004.

It maps the Earth by the superimposition of images obtained from satellite imagery, aerial photography, and GIS data onto a 3D globe.

Visit Google Earth

Google Keyword Tool – A keyword research tool provided by Google which estimates the competition and search volume for a keyword, recommends related keywords, and will tell you what keywords Google thinks are relevant to your site or a page on your site.

Try the Google Keyword Tool

Google Maps – Google Maps (for a time named Google Local) is a basic web mapping service application and technology provided by Google, free (for non-commercial use), that powers many map-based services, including the Google Maps website, Google Ride Finder, Google Transit, and maps embedded on third-party websites via the Google Maps API.

It offers street maps, a route planner for traveling by foot, car, or public transport, and an urban business locator for numerous countries around the world.

Visit Google Maps

Google OneBox – Portion of the search results page above the organic search results which Google sometimes uses to display local and vertical search results from Google Maps, Google News, Google Base, and other Google owned vertical search services.

Google Supplemental Index – Index where pages with lower trust scores are stored. Pages may be placed in Google’s Supplemental Index if they consist largely of duplicate content, if the URLs are excessively complex in nature, if the site which hosts them lacks significant trust, or if they are new and still untrusted by Google.

Google Traffic Estimator – Tool which estimates bid prices and how many Google searchers will click on an ad for a particular keyword.

If you do not submit a bid price the tool will return an estimated bid price necessary to rank #1 for 85% of Google’s queries for a particular keyword.

Try Google Traffic Estimator

Google Trends – Tool which allows you to see how Google search volumes for a particular keyword change over time.

Try Google Trends

Google Webmaster Tools – Google Webmaster Tools is a no-charge web service by Google for webmasters. It allows webmasters to check indexing status and optimize visibility of their websites.

It has tools that let the webmaster:

  • Check and set the crawl rate.
  • List internal and external pages that link to the site.
  • See what keyword searches on Google led to the site being listed in the SERPs, and the click through rates of such listings.
  • View statistics about how Google indexes the site.
  • Submit and check a sitemap.
  • Generate and check a robots.txt file.
  • Set a preferred domain (i.e. prefer agent-seo.com over www.agent-seo.com, or vice versa).

Visit Google Webmaster Tools

Graphical Search Inventory – Non-text-based advertising that is displayed based on the relevance of surrounding content. Includes banners, pop-ups, toolbars and rich media.

Google Website Optimizer – Free multivariate testing platform used to help AdWords advertisers improve their conversion rates.

Try Google Website Optimizer

Gray-Hat SEO – Search Engine Optimization tactics that fall in between Black-Hat SEO and White-Hat SEO. Gray-hat SEO techniques can be legitimate in some cases and illegitimate in others. Such techniques include Doorway Pages, Gateway Pages, Cloaking, and duplicate content.

Guestbook Spam – A type of low quality automated link which search engines do not want to place much trust on.

H

Heading Tags – A heading element briefly describes the subject of the section it introduces, and headings are generally used to denote places of importance on any given web page. If your website were a book, heading tags would closely resemble chapter headings.

Heading elements go from H1 to H6 with the lower numbered headings being most important (H1 = most important, H6 = least important).

An H1 element source would look like:

<h1>Important Subject</h1>

Heading elements may be styled using CSS. Many content management systems place the same content in the main page heading and the page title, although in many cases it may be preferable to mix them up if possible.

Tip: You should only use a single H1 element on each page, and may want to use multiple other heading elements to structure a document.
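
For example, a page about gardening tips might use one H1 for the page topic and H2s for its sections (the headings are illustrative):

<h1>Gardening Tips</h1>
<h2>Planting</h2>
<h2>Watering</h2>
<h2>Pruning</h2>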

Hidden Text – SEO technique used to show search engine spiders text that human visitors do not see.

While some sites may get away with it for a while, generally the risk to reward ratio is inadequate for most legitimate sites to consider using hidden text.

Hits – This generally means all requests made to a web server, including requests by a web browser for HTML pages, JPEGs, GIFs, and other images. “Hits” is an outdated term often thrown around when referring to website traffic, but it is generally not very meaningful in quantifying actual search engine traffic.

Homepage – The main page on your website, which is largely responsible for helping develop your brand and setting up the navigational schemes that will be used to help users and search engines navigate your website.

As far as SEO goes, a home page is typically going to be one of the easier pages to rank for some of your more competitive terms, largely because it is easy to build links at a home page.

You should ensure your homepage stays focused and reinforces your brand though, and do not assume that most of your visitors will come to your site via the home page. If your site is well structured many pages on your site will likely be far more popular and rank better than your home page for relevant queries.

Home Directory – The directory in which your site’s main index page is located, usually named /public_html/, /www/, or /web/.

.htaccess – Apache directory-level configuration file which can be used to password protect or redirect files.

As a note of caution, make sure you copy your current .htaccess file before editing it, and do not edit it on a site that you can’t afford to have go down unless you know what you are doing.
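As a minimal sketch (the file names and domain are hypothetical), a permanent redirect added to .htaccess might look like:

Redirect 301 /old-page.html http://www.example.com/new-page.html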

See also:

HTML – HyperText Markup Language is the language in which pages on the World Wide Web are created.

Some newer web pages are also formatted in XHTML.

See also:

HTTP – HyperText Transfer Protocol is the primary protocol used to communicate between servers and web browsers. It is the means by which data is transferred from its residing location on a server to an active browser.

Hubs – A document that links out to many other documents devoted to a single topic. Think of any category page in a major directory like Yahoo! or DMOZ. All the documents linked to are assumed to be authorities (sort of a circular logic).

Topical hubs are sites which link to well-trusted resources within their topical community. A topical authority is a page which is referenced from many topical hub sites; a topical hub is a page which references many authorities.

See also:

Hyperlink – Also known as link or HTML link, a hyperlink is an image or portion of text that when clicked on by a user opens another web page or jumps the browser to a different portion of the current page.

Inbound links [link:] with keyword-relevant anchor text [link:] are an important part of SEO strategy.

Code for a hyperlink looks like this:

<a href="/yourpage.html">Anchor Text</a>

I

iFrames – An HTML structure that allows another HTML document to be inserted into an HTML page. The iFrame is set up as a window frame of a specified size that scrolls along with the rest of the page, but the iFrame’s content can itself be scrolled if it is larger than the iFrame window.

Unlike the regular HTML frames function, which is used to divide the screen into multiple windows, the iFrame is typically used to insert an ad or small amount of text in the middle of a page.

iFrames have been widely used within the automotive industry as a way to “frame in” inventory which is typically generated and maintained by a third-party source. Using iFrames is not a solid SEO practice, because the framed content does not actually reside on the website displaying it (thus that website gets no credit for the content).
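A bare-bones example of the markup (the inventory URL and dimensions are hypothetical):

<iframe src="http://inventory.example.com/listings" width="600" height="400"></iframe>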

Inbound Link – Link pointing to one website from another website.

Most search engines allow you to see a sample of links pointing to a document by searching using the link: function. For example, using link:www.agent-seo.com would show pages linking to the homepage of this site (both internal links [link:] and inbound links). Due to canonical URL [link:] issues www.agent-seo.com and agent-seo.com may show different linkage data.

Google typically shows a much smaller sample of linkage data than competing engines do, but Google still knows of and counts many of the links that do not show up when you use their link: function.

Index – The collection of data a search engine searches through to find matches for a user’s query. The larger search engines have billions of documents in their catalogs.

When a query is run, search engines look terms up in a reverse index and return search results based on matching relevancy vectors. Stemming and semantic analysis allow search engines to return near matches.

(2) The term ‘Index’ may also refer to the default index file at the root of a folder on a web server.

Information Architecture – the art of expressing a model or concept of information used in activities that require explicit details of complex systems. Among these activities are library systems, content management systems [link:], web development, user interactions, database development, programming, technical writing, enterprise architecture, and critical system software design.

Good information architecture considers both how humans and search spiders access a website. Information architecture suggestions:

  • Focus each page on a specific topic.
  • Use descriptive page titles [link:] and meta descriptions [link:] which describe the content of the page
  • Use clean (few or no variables) descriptive file names and folder names
  • Use headings [link:] to help break up text and semantically structure a document
  • Use breadcrumb navigation [link:] to show page relationships
  • Use descriptive link anchor text [link:]
  • Link to related information from within the content area of your web pages
  • Improve conversion [link:] rates by making it easy for people to take desired actions
  • Avoid feeding search engines duplicate or near-duplicate content [link:]

Internal Link – Link from one page on a site to another page on the same site.

It is preferential to use descriptive internal linking to make it easy for search engines to understand what your website is about. Use consistent navigational anchor text for each section of your site, emphasizing other pages within that section. Place links to relevant related pages within the content area of your site to help further show the relationship between pages and improve the usability of your website.

Internet – The Internet is a global system of interconnected computer networks that use the standard Internet Protocol Suite (TCP/IP) to serve billions of users worldwide. It is a network of networks that consists of millions of private and public, academic, business, and government networks of local to global scope that are linked by a broad array of electronic and optical networking technologies.

The Internet carries a vast array of information resources and services, most notably the inter-linked hypertext documents of the World Wide Web (WWW) and the infrastructure to support electronic mail.

See also:

Internet Explorer – Microsoft’s web browser. After they beat out Netscape’s browser on the marketshare front, they failed to innovate on any level for about 5 years – until Firefox [link:] forced them to.

See also:

Invisible Web – Portions of the web which are not easily accessible to crawlers due to search technology limitations, copyright issues, or information architecture [link:] issues.

IP Address – Internet Protocol Address. Every computer connected to the internet has an IP address. Some websites and servers have unique IP addresses, but most web hosts host multiple websites on a single host.

Many SEOs refer to unique C-class IP addresses. Every site is hosted on a numerical address like aa.bb.cc.dd. In some cases many sites are hosted on the same IP address. It is believed by many SEOs that if links come from different IP ranges with a different number somewhere in the aa.bb.cc part, then the link may count more than links from the same local range and host.

ISP – Internet Service Providers sell end users access to the web. Some of these companies also sell usage data to web analytics companies.

J

JavaScript – A client-side scripting language that can be embedded into HTML [link:] documents to add dynamic features.

Search engines do not index most content written in JavaScript. In AJAX [link:], JavaScript has been combined with other technologies to make web pages even more interactive.

Jump Page – An entry into a site other than the homepage. It may be a legitimate landing page, used to measure the results of a specific promotion or campaign. However, these pages are often created exclusively to spam search engine results, or misrepresent a site’s content. Search engines frown heavily on the use of these doorway pages [link:], and may penalize sites if the practice is reported and/or discovered.

jQuery – jQuery is a lightweight cross-browser JavaScript library that emphasizes interaction between JavaScript [link:] and HTML [link:]. jQuery’s syntax is designed to make it easier to navigate a document, select DOM elements, create animations, handle events, and develop AJAX [link:] applications.

jQuery also provides capabilities for developers to create plugins on top of the JavaScript library. With this option, developers can create abstractions for low-level interaction and animation, advanced effects, and high-level, themeable widgets, which contributes to the creation of powerful and dynamic web pages.
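A minimal sketch, assuming jQuery is loaded from its public CDN (the version, URL, and the .promo class used here are illustrative assumptions, not part of the original entry):

<script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>
<script>
// Fade out any element with the (hypothetical) class "promo" when it is clicked
$(document).ready(function () {
  $('.promo').on('click', function () {
    $(this).fadeOut();
  });
});
</script>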

Download jQuery

K

Keyword – The words or phrases a person types into a search box. Also refers to the word or phrase a site owner wants to be found under.

Search Engine Results Pages (SERPs) [link:] rank indexed sites according to how relevant they are in relation to the search query (most relevant to least relevant).

Long tail [link:] and brand [link:] related keywords are typically worth more than shorter and vague keywords because they typically occur later in the buying cycle and are associated with a greater level of implied intent.

Keyword Density – An old measure of search engine relevancy based on how prominently keywords appear within the content of a page. Keyword density is no longer a valid measure of relevancy over a broad open search index, though.

When people use keyword stuffed copy it tends to read mechanically (and thus does not convert well and is not link worthy); plus some pages that are crafted with just the core keyword in mind often lack semantically related words and modifiers from the related vocabulary (and that causes the pages to rank poorly as well).

The keyword density formula is the total number of keyword mentions divided by the total number of words on a page. Traditionally it has been said that keywords should fall between 2 and 8% density (although this is not a heavily-used factor anymore).
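For example, a keyword that appears 5 times on a 250-word page would have a keyword density of 5 ÷ 250 = 2%.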

See also:

Keyword Effectiveness Index (KEI) – A mathematical representation of the popularity of a keyword compared to its competitiveness, measured as the number of pages in a search engine’s index.

KEI is a statistical formulation that reveals the most effective keyword phrases and terms to optimize your web pages for. Efficiency can be many things. According to KEI, it is efficient to optimize for keywords that have many searchers, but only a few competing pages.

The lower the KEI, the more popular your keywords are, and the less competition they have. That means that you might have a better chance of getting to the top in the search engines and receive a good number of searchers for your effort.

Example: Suppose the number of searches for a keyword is 821 per day and Google displays 224,234 results (pages) for that keyword. Then the ratio between the popularity and competitiveness for that keyword is:

Keyword Effectiveness Index Example #1

Example of the opposite: Suppose the number of searches for a keyword is 2 per day and Google displays 11,224,234 results (pages) for that keyword. Then the ratio between the popularity and competitiveness for that keyword is:

Keyword Effectiveness Index Example #2

Explanation: According to the KEI definition, the best keywords are those that have many searches and that don’t have much competition in the search results. A low KEI is therefore preferable.
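As a rough worked illustration, assuming KEI is computed here as competing pages divided by daily searches (consistent with the “lower is better” reading above): Example #1 works out to 224,234 ÷ 821 ≈ 273, while Example #2 works out to 11,224,234 ÷ 2 = 5,612,117, making the first keyword a far more attractive target.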

Keyword Proximity – How close keywords are to each other on web pages.

Keyword Research – The process of discovering relevant keywords and keyword phrases to focus your SEO and PPC marketing campaigns on.

Examples of keyword research & discovery methods:

  • Using keyword research tools [link:].
  • Reviewing analytics data [link:] or server logs [link:].
  • Analyzing ad copy and tags on competing websites.
  • Reviewing consumer feedback.
  • Interacting with customers to understand how/why they found and/or chose your business.

Keyword Research Tools – Tools which help you discover potential keywords based on past search volumes, search trends, bid prices, and page content from related websites.

Some popular keyword research tools:

Please note that most keyword research tools used alone are going to be highly inaccurate at giving exact quantitative search volumes. The tools are better for qualitative measurements. To test the exact volume for a keyword it may make sense to set up a test Google AdWords [link:] campaign.

Keyword Stuffing – Writing copy that uses excessive amounts of the core keyword(s) and is usually written in a very unnatural, robotic fashion.

When people use keyword stuffed copy it tends to read mechanically (and thus does not convert well and is not link worthy); plus some pages that are crafted with just the core keyword in mind often lack semantically related words and modifiers from the related vocabulary (and that causes the pages to rank poorly as well).

L

Landing Page – The landing page is the page on which a visitor “lands” after clicking a:

  • search engine listing
  • hyperlink
  • email link
  • banner ad
  • ppc ad
  • other ad/link

The landing page can be a site’s homepage, but is usually a page designed to appeal to users who click-through from a specific ad or link. Similar to a doorway page, but a legitimate marketing function — it is used for counting and tracking arrivals and determining the effectiveness of a marketing campaign.

Well-designed landing pages that are relevant to a user’s keyword query will improve conversion rates and play a critical role in Search Engine Marketing.

Landing Page Quality Scores – A measure used by Google to help filter noisy ads out of their AdWords [link:] program.

When Google AdWords [link:] launched, affiliates and arbitrage players made up a large portion of their ad market — as more mainstream companies have spent on search marketing, Google [link:] has taken many measures to try to keep its ads relevant.

See also:

Latent Semantic Indexing (LSI) – An algebraic model of document retrieval based on a singular value decomposition of the vectorial space of index terms.

Link – See hyperlink [link:]

Link Baiting – The art of targeting, creating, and formatting information that provokes the target audience to point high quality links at your site. Many link baiting techniques are targeted at social media [link:] and bloggers [link:].

The types of link bait vary tremendously, but they include highly informative articles or news stories, useful resources and sometimes controversial or sensationalistic content.

Link baiting is a technique used to help a site improve its Link Popularity [link:] and Page Rank [link:]. Some sites use link baiting as the centerpiece of a website marketing campaign.

Link Building – The process of building high-quality inbound links to your website. Over time, search engines evaluate the links pointing to your website when deciding whether to trust that your website is authoritative, relevant, and trustworthy.

See also:

Quick link building tips:

  • Build conceptually unique linkworthy high-quality content
  • Create viral marketing ideas that people want to spread and that make people talk about you
  • Mix your anchor text
  • Get deep links
  • Try to build at least a few quality links before actively obtaining any low quality links
  • Register your site in relevant high quality directories such as DMOZ, the Yahoo! Directory, and Business.com
  • When possible try to focus your efforts mainly on getting high quality editorial links
  • Create link bait
  • Try to get bloggers to mention you on their blogs
  • It takes a while to catch up with the competition, but if you work at it long enough and hard enough eventually you can enjoy a self-reinforcing market position
  • Filthy Linking Rich [PDF] – Mike Grehan article about how top rankings are self reinforcing

Link Bursts – A rapid increase in the quantity of links pointing at a website.

When links occur naturally they generally develop over time. In some cases it may make sense that popular viral articles receive many links quickly, but in those cases there are typically other signs of quality as well, such as:

  • Increased usage data
  • Increase in brand related search queries
  • Traffic from the link sources to the site being linked at
  • Many of the new links coming from new pages on trusted domains

Link Churn – The rate at which a site loses links.

Link Equity – A measure of how strong a site is based on its inbound link popularity and the authority [link:] of the sites providing those links (i.e. the total number and overall quality of the links pointing to your website).

Link Exchange – A link exchange is a quid pro quo arrangement or reciprocal link [link:] exchange between two sites. Reciprocal links usually lead to the home page of the associate site.

Link Farm – Website or group of websites which exercises little to no editorial control when linking to other sites. FFA [link:] pages, for example, are link farms.

Link Hoarding – A method of trying to keep all your link popularity by not linking out to other sites, or linking out using JavaScript or through cheesy redirects.

Generally, link hoarding is a bad idea for the following reasons:

  • Many authority sites were at one point hub sites that freely linked out to other relevant resources
  • If you are unwilling to link out to other sites, people are going to be less likely to link to your site
  • Outbound links to relevant resources may improve your credibility and/or boost your overall relevancy scores
“Of course, folks never know when we’re going to adjust our scoring. It’s pretty easy to spot domains that are hoarding PageRank; that can be just another factor in scoring. If you work really hard to boost your authority-like score while trying to minimize your hub-like score, that sets your site apart from most domains. Just something to bear in mind.”
~ Matt Cutts, Google

Link Popularity – The number and quality of links pointing at a website. Page Rank is achieved when backlinks [link:] are located on reputable, relevant sites rather than so-called Link Farms. Most search engines use link popularity as a factor in their algorithmic results.

For competitive search queries link quality counts much more than link quantity. Google typically shows a smaller sample of known linkage data than the other engines do, even though Google still counts many of the links they do not show when you do a link: search.

Link Reputation – The combination of your link equity [link:] and anchor text [link:].

Link Rot – A measure of how many and what percent of a website’s links are broken.

Links may break for a number of reasons, but four of the most common reasons are:

  • A website going offline
  • Linking to content which is temporary in nature (due to licensing structures or other reasons)
  • Moving a page’s location
  • Changing a domain’s content management system

Most large websites have some broken links, but if too many of a site’s links are broken it may be an indication of outdated content, and it may provide website users with a poor user experience, both of which may cause search engines to rank a page as being less relevant.

See also:

Listings – see SERP [link: ].

Local Search – Local search refers both to the addition of geographical keywords (cities, streets, etc.) to search terms and to YellowPages-type search engines, such as Google Maps [link: ] and Yahoo! Local [link: ], used to find business services in a particular zip code.

Search engine placement services use local SEO to help traditional “brick and mortar businesses” connect with customers in their community.

Local Search Optimization – The process of increasing the amount of visitors to a website by ranking highly for specific locality-based keyword phrases in search engines, plus the addition of geographical keywords to the search phrase (cities, streets, zip codes). Local SEO is a powerful tool for attracting local customers, especially for small local businesses.

See also:

Log Files – Server files which show you what your leading sources of traffic are and what people are searching for to find your website.

Log files DO NOT typically show as much data as analytics programs would — and if they do, it is generally not in a format that is useful beyond seeing the top few stats.

In my experience, when trying to compare log files to actual website analytics the waters are generally very muddy. Discrepancies between actual traffic and all server traffic should be expected, making this source of traffic a little unreliable.

Long-Tail Keywords – The phrase “The Long Tail” (as a proper noun, with capitalized letters) was first coined by Chris Anderson in a 2004 Wired Magazine article to describe certain business and economic models such as Amazon.com or Netflix. The term long tail is also generally used in statistics, often applied in relation to wealth distributions or vocabulary use.

More recently the term long-tail has been used to describe a (niche) longer keyword focus when writing content for search engine optimization (i.e., a keyword phrase where several words are used to specify the search query).

See also:

M

Meta Description Tag – The meta description tag is typically a sentence or two of content which describes the content of the page. Search engines may consider or display this tag at their discretion. Relevant meta description tags may appear in search results [link:] as part of the page description below the page title [link:].

Meta Description code is found between the <head></head> tags and looks like this:

<meta name="description" content="Your meta description here." />

A good Meta Description tag should:

  • Be highly relevant to the content on the page.
  • Serve to reinforce the page title.
  • Focus on including offers and secondary keywords & phrases to help add context to the page title.
  • Respect character limit best practices – 145-155 characters for the homepage, and no more than 300 characters for all other pages.
  • Not be used as a place to spam keywords.

See also:

Meta Keywords Tag – The meta keywords tag is a tag which can be used to highlight keywords and keyword phrases which the page is targeting.

Meta Keywords code is found between the <head></head> tags and looks like this:

<meta name="keywords" content="your, keywords, here" />

Many people spammed meta keyword tags and searchers typically never see the tag, so most search engines do not place any weight on it. Many SEO professionals no longer bother to consider meta keywords tags as a serious SEO tactic.

See also:

Meta Refresh – A meta tag used to make a browser refresh to another URL location.

Meta Refresh code is found between the <head></head> tags and looks like this:

<meta http-equiv="refresh" content="10;url=http://www.site.com/folder/page.htm" />

In most cases it is preferable to use a 301 [link:] or 302 [link:] redirect rather than a meta refresh.

Meta Robots Tag – A meta robots tag (named for a search engine Crawler or Robot) lets page authors prevent their webpages from being added to a search engine’s Index. Alternatives to a meta robots tag are Robots.txt files and password protection.

Meta Robots code is found between the <head></head> tags and looks like this:

<meta name="robots" content="index, follow" /> or
<meta name="robots" content="noindex, nofollow" /> or
<meta name="robots" content="noindex, follow" />

Meta Search Engine – A meta search engine derives its listings by running user queries through multiple other search engines and then summarizing the results. A meta search engine does not maintain its own Index.

Listings are displayed by meta search engines either in aggregate or categorized by search engine source.

See also:

Meta Tags – People generally refer to meta descriptions and meta keywords as meta tags. Some people also group the page title in with these.

  • The page title [link:] is highly important.
  • The meta description [link:] tag is somewhat important.
  • The meta keywords tag [link:] is not important.

Meta Verification Tag – Meta tags generated by the likes of Google Webmaster Tools [link:], Bing Webmaster, and Yahoo Site Explorer [link:] to enable webmasters to track important information about their website as it pertains to search engines.

Meta Verification code is found between the <head></head> tags.

Google’s tag will look something like this:

<meta name="google-site-verification" content="klqM0e8rMzc-lw8nmLK3oSh2r6nRcK9R3hzKRQ8WO3M" />

Bing’s will look something like this:

<meta name="msvalidate.01" content="821232E9D27E41CBDABD92C619D941AA" />

Yahoo’s will look something like this:

<meta name="y_key" content="3b9231726420ceb6" />

Microformats – A microformat is a web-based approach to semantic markup that seeks to re-use existing XHTML and HTML tags to convey metadata and other attributes. This approach allows information intended for end-users (such as contact information, geographic coordinates, calendar events, and the like) to also be automatically processed by software.
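As a small sketch, an hCard microformat for contact details might look like the following (the name, organization, and URL are hypothetical):

<div class="vcard">
  <span class="fn">Jane Doe</span>,
  <span class="org">Example Agency</span>,
  <a class="url" href="http://www.example.com">example.com</a>
</div>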

See also:

Microsoft (MSN) – Microsoft Corporation is a multinational computer technology corporation that develops, manufactures, licenses, and supports a wide range of software products for computing devices. Headquartered in Redmond, Washington, USA, some of its most well-known products include the Microsoft Windows operating system, the internet browser Internet Explorer, and the Microsoft Office suite of productivity software among others.

Microsoft AdCenter – Microsoft’s cost per click ad network. It includes a few interesting features (dayparting, demographic-based bidding, etc.), but it is still a little behind in quality compared to Google AdWords [link].

Mirror Site – Site which mirrors (or duplicates) the contents of another website. Generally search engines prefer not to index duplicate content [link:]. The one exception to this is that if you are a hosting company it might make sense to offer free hosting or a free mirror site to a popular open source [link:] software site to build significant link equity.

Mod-rewrite – A module or plugin for Apache web servers that can be used to rewrite requested URLs on the fly.

It supports an unlimited number of rules and an unlimited number of attached rule conditions for each rule to provide a flexible and powerful URL manipulation mechanism. Mod-rewrite rules can be used to alter non-search-friendly URLs to make them more search-friendly, thus increasing indexing chances (typically used for dynamic, database-driven websites).
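A minimal sketch of such a rule (the URL pattern and script name are hypothetical), mapping a clean URL like /products/42 onto a dynamic script:

RewriteEngine On
RewriteRule ^products/([0-9]+)/?$ /product.php?id=$1 [L]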

Movable Type – For-sale blogging software which allows you to host a blog on your website. Movable Type is typically much harder to install than WordPress [link:] is.

Visit:

Moved Permanently – Called a 301-Redirect or “moved permanently,” this status means that a file has been moved permanently to a new location. This is the preferred method of redirecting for most pages or websites. Depending on your site authority and crawl frequency it may take anywhere from a few days to a month or so for the 301-Redirect [link:] to be picked up.

Myspace – One of the most popular social networking sites, largely revolving around connecting musicians to fans and having an easy to use blogging platform.

Visit:

N

Natural Listings – Web page listings that appear in a search engine’s results, based on the engine’s own proprietary algorithm. The website/web page owner has not paid for these positions in the results, and the listings are ranked in order of relevance to each search query with the most relevant listed first.

As long as a listing has achieved its position naturally (not paid for), it is a “natural” or “organic [link:]” listing.

Natural Search – See Organic Search Results [link:].

Netscape – Originally a company that created a popular web browser by the same name, Netscape is now a social news site similar to Digg.com [link:].

Visit:

Niche – A topic or subject which a website is focused on.

Search is a broad field, but as you drill down each niche consists of many smaller niches. An example of drilling down to a niche market:

  • Search
  • Search marketing, privacy considerations, legal issues, history of, future of, different types of vertical search, etc.
  • Search Engine Optimization, Search Engine Marketing, Pay-Per-Click
  • Link building, keyword research, reputation monitoring and management, viral marketing, SEO copywriting, Google AdWords, information architecture, etc.

Generally it is easier to compete in small, new, or underdeveloped niches than trying to dominate large verticals. As your brand [link:] and authority [link:] grow you can go after bigger markets.

Niche Directory – Directories which focus on niche topics, specialist sectors, restricted regions, or single languages. One type of niche directory with a large number of sites in existence is the shopping directory, for example.

Nofollow – Attribute used to prevent a link from passing link authority. Commonly used on sites with user generated content, like in blog comments.

The code to use nofollow on a link appears like:

<a href="http://www.example.com" rel="nofollow">anchor text</a>

Nofollow can also be used in a robots meta tag to prevent a search engine from counting any outbound links on a page. This code would look like this:

<meta name="robots" content="noindex, nofollow" />

Noscript Tags – Noscript tags are used to supplement Flash [link:], and can be a great way to introduce content to supplement 100% Flash websites, where there otherwise would be none. You have to be careful with how you use them, but if used in a white-hat, non-malicious manner they can be an effective way to optimize Flash websites.

Here is an example of what Noscript Tags look like:

<noscript>
<p>Your Supplemental Content</p>
</noscript>

O

Off-Site SEO – This refers to Search Engine Optimization tactics that do not involve any direct manipulation of your website’s files. It refers to things that happen away from your website that have a direct influence on how your site behaves in search engines.

Some examples of Off-Site SEO include:

  • Link Building [link: ]
  • Directory Submission [link: ]
  • Local Search [link: ]
  • Social Media [link: ]

On-Site SEO – This refers to Search Engine Optimization tactics that involve any direct edits that you might make to your website’s files. These edits and the way your website is structured have a direct influence on how your site behaves in search engines.

Some examples of On-Site SEO include:

  • Editing Title Tags [link: ]
  • Editing Meta Tags [link: ]
  • Copywriting [link: ]
  • Image Optimization [link: ]

One-Way Link – A One-way link is a term used among search engine optimizers referring to a specific type of link building method. It is a Hyperlink that points to a website without any reciprocal link; thus the link goes “one way” in direction.

It is suspected by many industry consultants that this type of link would be considered more natural in the eyes of search engines (and thus more valuable).

There are many theories on link building and one-way links versus reciprocal links. Google [link: ] is the company that has made this concept very popular with their PageRank [link: ] algorithm.

Open-Source – Software which is distributed with its source code such that developers can modify it as they see fit. On the web open source is a great strategy for quickly building immense exposure and mindshare.

A great example of open-source technology would be WordPress.

Opera – A fast, standards-based web browser.

Visit Opera

Organic Results – Most major search engines have results that consist of paid ads and unpaid listings. The unpaid / algorithmic listings are called the organic search results (or “natural search results”). Organic search results are organized by relevancy, which is largely determined based on linkage data, page content, usage data, and historical domain and trust related data.

Most clicks on search results [link: ] are on the organic search results. Some studies have shown that 60-80% of clicks are on the organic search results.

Also read: Click Distribution & Percentages by SERP Rank

Outbound Link – A link from one website to another external site.

Some webmasters believe in link hoarding [link: ], however linking out to useful relevant related documents is an easy way to help search engines understand what your website is about.

If you reference other resources it also helps you build credibility and leverage the work of others without having to do everything yourself. Some webmasters track where their traffic comes from, so if you link to related websites they may be more likely to link back to your site.

Overture – The company which pioneered search marketing by selling targeted searches on a pay per click basis. Originally named GoTo, they were eventually bought out by Yahoo! and branded as Yahoo! Search Marketing.

Visit: Yahoo! Search Marketing

Overture Keyword Selector Tool – Popular keyword research tool, based largely on Yahoo! search statistics. Heavily skewed toward commercially oriented searches, also combines singular and plural versions of a keyword into a single version.

Visit: Overture Keyword Selector Tool

P

PageRank – A logarithmic scale based on link equity which estimates the importance of web documents.

As defined by Google “PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page’s value.

In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at more than the sheer volume of votes, or links a page receives; it also analyzes the page that casts the vote. Votes cast by pages that are themselves “important” weigh more heavily and help to make other pages “important.”

Since PageRank is widely bartered, Google’s relevancy algorithms had to move away from relying on PageRank and place more emphasis on trusted links via algorithms such as TrustRank [link: ].

The PageRank formula is:

PR(A) = (1-d) + d (PR(T1)/C(T1) + … + PR(Tn)/C(Tn))

PR = PageRank
d = dampening factor (~0.85)
C(T1) = number of outbound links on page T1
PR(T1)/C(T1) = PageRank of page T1 divided by the total number of links on page T1 (the transferred PageRank)

In text: For any given page A, the PageRank PR(A) is equal to the sum of the partial PageRank passed along by each page pointing at it (each linking page’s PageRank divided by its number of outbound links), multiplied by the dampening factor, plus one minus the dampening factor.
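As a quick illustrative calculation with hypothetical numbers: if page A is linked to only from page T1, which has a PageRank of 4 and 8 outbound links, then PR(A) = (1 - 0.85) + 0.85 × (4 / 8) = 0.15 + 0.425 = 0.575.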

See also:

Page Title – See Title Tags [link: ]

Paid Inclusion – A method of allowing websites which pass editorial quality guidelines to buy relevant exposure.

Sometimes the payment is arranged through an affiliate or partner of the database company.

  • Directories such as the Yahoo! Directory [link: ] and Business.com allow websites to be listed for a flat yearly cost.
  • Yahoo! Search allows webmasters to pay for inclusion for a flat review fee and a category based cost per click.

Paid Listings – Directory listings where the results display only advertisers who have paid for inclusion, and possibly for position.

Paid Placement – Paid placement is a program in which advertisers’ listings are guaranteed to appear on a Results Page when particular Keywords are searched. The ranking of paid placement listings is determined by competitive bidding.

Unlike Paid Inclusion listings, paid placement listings are usually displayed separately from Natural Listings and are labeled as advertisements or sponsored links. Google and Yahoo! Search Marketing (formerly Overture) are two of the largest paid placement search networks.

See: Pay-Per-Click Advertising [link: ]

Pay-for-Performance – Payment structure where affiliated sales workers are paid commission for getting consumers to perform certain actions.

Publishers publishing contextual ads [link: ] are typically paid per ad click. Affiliate marketing programs [link: ] pay affiliates for conversions – leads, downloads, or sales.

Penalty – Search engines prevent some websites suspected of spamming [link: ] from ranking highly in the results by banning or penalizing them. These penalties may be automated algorithmically or manually applied.

If a site is penalized algorithmically, it may start ranking again some time after the reason for the penalty is fixed. If a site is penalized manually, the penalty may last an exceptionally long time or require contacting the search engine with a reinclusion [link: ] request to remedy.

Some sites are also filtered [link: ] for various reasons.

See also:

Permalink – A permalink, or permanent link, is a URL that points to a specific blog [link: ] or forum entry after it has passed from the front page to the archives. Because a permalink remains unchanged indefinitely, it is less susceptible to link rot.

Most modern weblogging and content-syndication software systems support such links (see: WordPress [link: ]). Other types of websites use the term permanent links, but the term permalink is most common within the blogosphere.

Personalization – Altering of the search results based on a person’s location, search history, content they recently viewed, or other factors relevant to them on a personal level. Personalized search results are most prevalent on Google [link: ].

PHP – PHP Hypertext Preprocessor is an open source server side scripting language used to render web pages or add interactivity to them.

See also:

Poison Word – Words which were traditionally associated with low quality content that may cause search engines to want to demote the rankings of a given page.

Poison words are words that are known to decrease your page’s rankings if a search engine finds them in the title, description, or URL. They don’t kill a page; they just bury it in the rankings.

Generally, people think of adult words first. Adult words (obscene) often put your page in an adult category where it is filtered out by various filters at search engines.

Newer non-adult poison words are being uncovered. These words don’t throw you into a different category; they just decrease your rankings. Poison words signal to a search engine that the page is of low value.

See Aaron Wall’s article on Poison Words.

PDF – Portable Document Format is a universal file format developed by Adobe Systems that allows files to be stored and viewed in the original printer friendly context. Learn more about Adobe PDF History

Portal – Web site offering common consumer services such as news, email, other content, and search. A generic term for any site which provides an entry point to the internet for a significant number of users. Examples are search engines, directories, built-in default browser or service provider homepages, etc.

Pay-Per-Click Advertising (PPC) – Pay Per Click is a pricing model which most search ads and many contextual ad programs are sold through. PPC ads only charge advertisers if a potential customer clicks on an ad.

Also known as Cost-Per-Click (CPC) [link:] or Pay-For-Performance [link:], cost-per-click is an advertising revenue system used by search engines and ad networks in which advertising companies pay an agreed amount for each click of their ads.

This Click-Through Rate [link:]-based payment structure is considered by some advertisers to be more cost-effective than the Cost-Per-Thousand payment structure, but it can at times lead to Click Fraud [link:].

See also:

  • AdWords – Google’s PPC ad platform [link:]
  • AdCenter – Microsoft’s PPC ad platform [link:]
  • Yahoo! Search Marketing – Yahoo!’s PPC ad platform [link:]

Position – A URL’s location within the natural or paid search listings. This may also be referred to as Rank [link:].

Q

Quality Content – Content which is unique, well-written, useful, relevant, and linkworthy in nature.

See also:

Quality Link – Search engines count links as votes of trust. Furthermore, a small number of high-quality links is more helpful than a large quantity of low-quality links.

There are a variety of ways to define what a quality link is, but the following are characteristics of a high quality link:

  • Trusted Source: If a link is from a page or website which seems like it is trustworthy (or from a website with a solid amount of authority and equity), then it is more likely to count more than a link from an obscure, rarely used, and rarely cited website. Trusted sources are usually sites who boast a good link portfolio of their own. See TrustRank [link:] for one example of a way to find highly trusted websites.
  • Hard to Get: The harder a link is to acquire, the more likely a search engine will place trust in it. Also, this means a competitor will need to work even harder to gain that link, or to acquire a link of equal or better value.
  • Aged: Some search engines may trust links from older resources or links that have existed for a length of time more than they trust brand new links or links from newer resources.
  • Co-citation: Pages that link at competing sites which also link to your site make it easy for search engines to understand what community your website belongs to. See Hilltop [link:] for an example of an algorithm which looks for co-citation from expert sources.
  • Related: Links from related pages or related websites may count more than links from unrelated sites.
  • In Content: Links which are in the content area of a page are typically going to be more likely to be editorial links than links that are not included within the editorial portion of a page.
  • Anchor Text: Links with anchor text using descriptive keywords or phrases when pointing to a website are typically more valuable than links using vague phrases such as “click here.”

Note: While appropriate anchor text may also help you rank even better than a link which lacks appropriate anchor text, it is worth noting that for hyper-competitive queries Google is more likely to place weight on a high quality link where the anchor text does not match than trusting low quality links where the anchor text matches.

Query – The actual “search string” a searcher enters into a search engine.

Query Refinement – Some searchers may refine their search query if they deem the results irrelevant. Some search engines may aim to promote certain verticals or suggest other search queries if they deem those queries or vertical databases relevant to the goals of the searcher.

Query refinement is both a manual and an automated process. If searchers do not find their search results as being relevant they may search again. Search engines may also automatically refine queries using the following techniques:

  • Google OneBox: promotes a vertical search database near the top of the search result. For example, if image search is relevant to your search query images may be placed near the top of the search results.
  • Spell Correction: offers a did you mean link with the correct spelling near the top of the results.
  • Inline Suggest: offers related search results in the search results. Some engines also suggest a variety of related search queries.

Some search toolbars also aim to help searchers auto complete their search queries by offering a list of most popular queries which match the starting letters that a searcher enters into the search box.

R

Rank – Also known as position, rank is the placement a website occupies in the Search Engine Results Pages (SERPs) [link:] relative to the first listing on an algorithmic results page in response to a keyword query.

The first page of search engine results displays listings in the one through ten positions, the second page eleven through twenty, etc. Consumer studies have shown that most search engine users click only on sites that occupy a top-ten rank.

Real-Time Search – The indexing of published real-time content (with no delay) into search engine results.

See also:

Re-Inclusion Request – If a site has been penalized for spamming, the owner may fix the infraction and ask for reinclusion. Depending on the severity of the infraction and the brand strength of the site, it may or may not be re-added to the search index.

See also:

Reciprocal Link – A link relationships between two websites which usually comes as a result of a Link Exchange [link:] request. For example, if an owner of website A gets in touch with the owner of website B and they both agree to link to each other’s sites, then this would be an instance of a reciprocal link.

According to Aaron Wall at SEOBook (referenced in the sources section), reciprocal links are “Nepotistic link exchanges where websites try to build false authority by trading links, using three way link trades, or other low quality link schemes.”

Some reciprocal link relationships for websites are bound to occur – and are even fairly natural – between website communities. However, if most or all of your links are reciprocal in nature it may be a sign of ranking manipulation. In addition, sites that trade off-topic links or links on lower-quality, deep website pages are not likely to pass much link authority, and may even result in a penalty from the search engines for being associated with such a page.

Reciprocal link exchanges in and of themselves are not a bad thing if they pass along quality in both directions. However, most reciprocal link offers are of low quality.

Again, if too many of the backlinks within your link portfolio are of the low quality variety, it may make it difficult for your site to achieve significant search rankings for relevant queries. Another note: Some search engines may look at your site’s inbound to outbound link ratios as well as link quality when determining how natural a site’s link profile is.

See also:

Redirect – A method of alerting browsers and search engines that a webpage has changed locations. 301-Redirects [link:] are for permanent change of location, while 302-Redirects [link:] are used for a temporary change of location.

A 301-Redirect is the preferable method for redirecting a given webpage to its new destination, as this method passes the most authority from the old page to the new one.

Referrer – The source from which a website visitor came. This can be a keyword or URL.

Registrar – A company which allows you to register domain names. An example of a registrar would be GoDaddy.

Relative Link – A link which shows the relation of the current URL to the URL of the page being linked at. Some links only show relative link paths instead of having the entire reference URL within the a href tag.

Due to canonicalization [link:] and hijacking [link:] related issues it is typically preferred to use absolute links over relative links.

Example relative link:

<a href="../folder/filename.html">Cool Stuff</a>

Example absolute link:

<a href="http://seobook.com/folder/filename.html">Cool Stuff</a>

Relevancy – A measure of how useful searchers find search results. Many search engines may also bias organic search results [link:] to informational resources since commercial ads also show in the search results.

Search engines typically rank organic search results from most relevant to least relevant – that is, they show the most relevant results to a given search query first. Working to make a website more relevant for a given search phrase is the high-level crux of what most SEOs do on a day-to-day basis.

See also:

  • Google vs Yahoo! vs MSN – compares the relevancy algorithms of the three major search engines
  • Search Engine Relevancy Challenge – survey of search relevancy based on user voting

Reputation Management – Ensuring your brand related keywords display results which reinforce your brand. Many hate sites tend to rank highly for brand related queries.

Many large brands are beginning to utilize social media [link:] and online review websites, etc as a way to manage their online reputation. In fact, many companies utilize social media [link:] as a customer support channel and a way to learn what people are saying about their brand. Social Listening [link:] is an activity that many Reputation Management firms engage in.

Results Page – See Search Engine Results Pages (SERPs) [link:]

Return-On-Investment (ROI) – Return on Investment or ROI is a measure of how much return you receive from each marketing dollar.

While ROI is a somewhat sophisticated measurement, some search marketers prefer to account for their marketing using more sophisticated profit elasticity [link:] calculations.

As an SEO, it is becoming increasingly important to tie SEO strategy to business impact and ROI. This is often not an easy process and is heavily reliant on having quality processes, client interaction, correct installation of analytics, and quality attribution techniques in place.

Rich-Snippets – Tags used in the programming of a webpage to define certain kinds of content. Properly formatted, they are recognized by Google and can enhance how a listing appears in Google search results (also called microformats [link:], microdata, or RDFa [link:]).
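A minimal illustrative sketch using schema.org microdata for a local business (the business details are hypothetical):

<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Agency</span>
  <span itemprop="telephone">(555) 555-0100</span>
</div>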

See also:

  • Schema.org – This site provides a collection of schemas, i.e., html tags, that webmasters can use to markup their pages in ways recognized by major search providers.

Robots.txt File – A plain-text file placed in the root directory of a website which tells search engine crawlers which parts of the site they may or may not crawl.
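A minimal sketch of such a file (the disallowed folder is hypothetical):

User-agent: *
Disallow: /private/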

RSS – Rich Site Summary or Really Simple Syndication is a method of syndicating information to a feed reader or other software which allows people to subscribe to a channel they are interested in.

Most blogs utilize RSS as a method of building up the blog’s subscribership and community interaction. RSS is an alternative to email subscriptions and newsletters.

See also:

S

Safari – A popular Apple browser and the default browser that comes installed on Mac computers.

Visit the Safari Website

Sandbox – The Sandbox (a.k.a. “Sandboxing” or the “sandbox effect” or the “Google penalty”) is a name given to an observation about the way Google ranks web pages in its index. It is the subject of much debate—its existence has been written about since 2004 but not confirmed, with several statements to the contrary.

It referred to sites that were “restricted” in some ways in performance, limiting their ability to do well in the search results for certain/all search terms etc. It is now an out-of-date and commonly mis-used term, as the “sandbox effect” that is being referred to may have disappeared some time ago.

See also:

SEO Specialist – A person who specializes in performing Search Engine Optimization for websites.

Search Engine – Tool for finding information, especially on the Internet or World Wide Web. Search engines are essentially massive databases that cover wide swaths of the Internet.

Most search engines consist of three parts:

  1. At least one program, called a spider [link:], crawler [link:], or bot [link:], which “crawls” through the Internet gathering information.
  2. A database, or index [link:], which stores the gathered information.
  3. A search tool, with which users are able to search through the database of web documents by typing in keywords describing the information desired (usually at a Web site dedicated to the search engine). The order in which results are displayed is determined by each search engine’s relevancy algorithm [link:].

The most popular and influential search engines over the last 5-10 years have been Google [link:], Yahoo [link:], and Bing [link:].

Search Engine Friendly – The process of creating a web site with meta and text content that can be read and understood by the various search engines.

A search-engine-friendly page is one that has been designed and optimized for high search engine rankings; it also makes it easy for search engines to follow the links on the page.

Search Engine Marketing (SEM) – Search engine marketing. Also known as Search Marketing [link:]

Search Engine Marketing Professional Organization (SEMPO) – The Search Engine Marketing Professional Organization (SEMPO) is a non-profit professional association founded in 2003 to increase awareness of the benefits of search engine marketing and provide educational resources to members and consumers.

Visit SEMPO.org

SEO – Search engine optimization (SEO) is the process of improving a website’s visibility in the natural [link:] (or “organic” [link:]) search results for relevant search queries. The art and science behind SEO involves publishing information and marketing it in a manner that helps search engines understand that your information is relevant to particular search queries.

From a tactical standpoint, most SEOs utilize a variety of on-page and off-page factors such as keyword research [link:], SEO copywriting [link:], title [link:] and meta [link:] optimization, information architecture [link:], link building [link:], brand building [link:], building mindshare [link:], reputation management [link:], social media [link:], link-baiting [link:], and viral marketing [link:].

SEO Copywriting – Writing and formatting copy in a way that will help make the documents appear relevant to either a wide array of relevant search queries or a very specific target search query.

There are two main ways to write titles and be SEO friendly [link:]:

  • Write literal titles and content that are well aligned with things people search for. This works well if you need backfill content for your site or already have an amazingly authoritative site.
  • Write page titles and content that would be considered exceptionally compelling to link to or share via social media [link:]. If enough people link at them then your pages and site will rank for many relevant queries even if the keywords are not in the page titles.

Worth reading:

Search Engine Results Page (SERP) – The Search Engine Results Page, or SERP, is the page on which the search engines show the results for a search query. SERPs are typically divided into two sections: the natural [link:] (or organic [link:]) section and the paid [link:] section.

Search History – Many search engines store user search history information. This data can be used for better ad targeting or to make old information more findable.

Search engines may also determine what a document is about and how much to trust a domain based on aggregate usage data. A large number of brand-related search queries is a strong signal of quality.

This means that sites with a longer history of solid search traffic and usage may tend to appear more often in prominent positions within the search results (due to increased trust and relevancy bias from search engines that is placed on older and more highly trafficked websites).

Search Terms – Also known as keywords or query terms, search terms are the word(s) or phrase(s) a user enters into a search engine’s Query box.

Once a search term is entered, a search engine [link:] will display results in a Search Engine Results Page (SERP) [link:] which ranks indexed sites according to how relevant the Search Engine deems them to the search terms that were queried.

One of the most important SEO Strategies companies can employ is to optimize their site pages with content that contains targeted search terms relevant to their products or industry.

Search terms typically fall into several distinct categories:

  • Head Terms [link:] – Keywords that generate a lot of search volume but which are also highly competitive. Head terms are typically shorter in nature and higher in the business funnel, meaning that they may generate a lot of visits and conversions. Due to their inherent nature, head terms typically don’t experience conversion rates that are as high as long-tailed keywords. People tend to use head terms high in the buying/search cycle. An example of a head term would be the keyword ‘insurance’.
  • Mid-Tail Keywords [link:] – Phrases which generate a moderate amount of searches and are moderately competitive. Mid-Tail phrases are typically 2-4 words in length and are further down in the business funnel. People using these types of phrases may be further along in the buying/research cycle, but may not yet be ready to buy. These types of phrases typically deliver a higher conversion rate than head terms although they may not generate the sheer numbers. A good example of a mid-tail keyword would be ‘auto insurance quotes’.
  • Long-Tail Keywords [link:] – Keywords that are longer and more specific in nature which generate less search volume and are typically less competitive, but that typically drive the highest conversion rates. Long-tailed keywords fall lowest in the business funnel, and searchers who are utilizing long-tailed keyword searches are often very far along in the buying/research cycle (and are usually ready to buy). An example of a long-tailed keyword might be ‘affordable auto insurance quotes’.
  • Branded/Product Keywords – Keyword search phrases that use a brand name or specific product. An example of a brand-search would be ‘Nationwide Insurance’.

Semantic Web – Semantic Web is a group of methods and technologies to allow machines to understand the meaning – or “semantics” – of information on the World Wide Web.

Web semantics involves the transformation of the web from an inherently human-interpretable medium to an inherently computer-interpretable medium. In the semantic web, machines can read and understand the content published in the network.

The objective of the Semantic Web Architecture is to provide a knowledge representation of linked data in order to allow machine processing on a global scale.


Server – Computer used to host files and serve them to the WWW. Dedicated servers [link:] usually run from $100 to $500 a month. Virtual servers [link:] typically run from $5 to $50 per month.

Server Logs – Files kept on the server which record website traffic trends and sources. Server logs typically do not show as much data and are not as user-friendly as analytics [link:] software. Not all hosts provide server logs.

Siphoning – Techniques used to steal another web site’s traffic, including the use of spyware [link:] or cybersquatting [link:].

Sitelinks – For some searches where Google thinks one result is far more relevant than the others (such as navigational or brand-related searches), it may list numerous deep links to that site at the top of the search results.

Sitemap – Sitemaps can be used to help give search engines a secondary route to navigate through your site. HTML sitemaps typically consist of a list of links to the most important pages on a website, while XML sitemaps [link:] are typically more bare-bones URL lists and usually include all website URLs.

Tips:

  • On large websites the on-page navigation should help search engines find all applicable web pages.
  • On large websites it does not make sense to list every page on the site map, just the most important pages.
  • Site maps can be used to help redistribute internal link authority toward important pages or toward sections of your site that are seasonally important.
  • Site maps can use slightly different or more descriptive anchor text than other portions of your site to help search engines understand what your pages are about.
  • Site maps should be created such that they are useful to humans, not just search engines.
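
To make the “bare-bones URL list” idea concrete, here is a minimal sketch (in Python, standard library only) of generating an XML sitemap; the URLs are placeholders rather than pages on any real site:

from xml.sax.saxutils import escape

def build_sitemap(urls):
    # Return a minimal sitemap.xml string for the given list of URLs (illustrative only).
    entries = "\n".join("  <url><loc>%s</loc></url>" % escape(u) for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + entries + '\n</urlset>')

print(build_sitemap(["http://www.example.com/", "http://www.example.com/seo-dictionary/"]))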

Slug – A slug is the part of a URL which identifies a page using human-readable keywords. Slugs are used to construct clean URLs (often for permalinks) that are easy to type, descriptive, and easy to remember.

For example, in the URL on this site:

http://agent-seo.com/seo-dictionary/

The slug is ‘seo-dictionary’

Typically, slugs are generated from a human-readable phrase such as the title of a news article, blog post, or encyclopedia entry. It is also common practice to make the slug all lowercase and to remove non-essential words, such as conjunctions and articles, to shorten the final URL. Long page titles may also be truncated to keep the URL a reasonable length.
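
As a rough illustration, a slug can be generated from a title with a few lines of code. This sketch (Python, standard library only) lowercases the title, strips punctuation, removes a small example list of stop words, and joins what remains with hyphens:

import re

def slugify(title, stop_words=("a", "an", "and", "the", "of", "to")):
    # Turn a human-readable title into a URL slug (illustrative only).
    words = re.findall(r"[a-z0-9]+", title.lower())      # lowercase and drop punctuation
    words = [w for w in words if w not in stop_words]    # remove non-essential words
    return "-".join(words)

print(slugify("The Complete Guide to an SEO Dictionary"))  # complete-guide-seo-dictionary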

Social Bookmarking – Social bookmarking is a method for Internet users to organize, store, manage and search for bookmarks of resources online. Unlike file sharing, the resources themselves aren’t shared, merely bookmarks that reference them.


Social Media – Websites, tools, and platforms which allow users to create and share content across their social networks via a variety of different online mediums.

Since the mid 2000s, social media has grown by leaps and bounds and has been widely adopted by individuals and companies alike as a preferred medium from which to market, communicate, collaborate, connect, share, and perform a variety of other tasks.

Some of the most popular and influential social media platforms include:

  • Twitter [link:] – Micro-Blogging
  • Facebook [link:] – Social Community
  • Google+ [link:] – Social Community
  • YouTube [link:] – Video sharing
  • LinkedIn [link:] – Professional Community
  • Flickr [link:] – Photo sharing

Social Search – Social search or a social search engine is a type of web search that takes into account the Social Graph of the person initiating the search query.

In 2009, Google added social search to their algorithm, and more recently launched Search Plus Your World, an expansion of their social search designed to feature personalized results and content from a user’s social circle directly within the SERPs [link:]. With this update, there is a huge emphasis on activity from Google+ [link:].

Spam – (a) Unsolicited email messages, or (b) low-quality websites with sparse or scraped [link:] content (or high quantities of low-quality or otherwise questionable backlinks) appearing within search results [link:].

Search engines also like to mask/outsource their relevancy issues by calling low-quality search results spam. They have vague, ever-changing guidelines which determine what marketing techniques are acceptable at any given time.

Typically search engines try hard to avoid flagging false positives as spam, so most algorithms are quite lenient, as long as you do not build lots of low-quality links, host large quantities of duplicate content, or perform other actions that fall widely outside of relevancy guidelines.

If your site is banned from a search engine you may request reinclusion after fixing the problem.


Spamming – The act of creating and distributing spam.

Spider – Search engine crawlers which search or “spider” the web for pages to include in the search engine’s index. Also called “crawlers” [link:] or “bots” [link:] and a variety of other names.

Many non-traditional search companies have different spiders which perform other applications. For example, TurnItInBot searches for plagiarism. Spiders should obey the robots.txt protocol.

Google’s main spider is called ‘Googlebot’, while Bing’s crawlers are named ‘Bingbot’ and ‘MSNBot’.
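
To illustrate the robots.txt protocol in practice, Python’s standard library ships a parser that a well-behaved crawler can use to check whether it is allowed to fetch a URL; the site and user agent below are placeholders, not real values:

from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()  # fetch and parse the robots.txt file
if rp.can_fetch("ExampleBot", "http://www.example.com/private/page.html"):
    print("Allowed to crawl this URL")
else:
    print("Disallowed by robots.txt")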


Splash Page – An introductory page on a website that generally uses Flash or other heavy graphics. Splash pages usually have little SEO value: they may offer esthetic value to human visitors but provide little or nothing for search engines to index.

This page often precedes a homepage and is seen by a visitor before they can get to the homepage. Splash pages are less common now as many see them as an annoyance because they delay a visitor from getting to the actual website. Similar to a doorway page [link:].

Splog – Spam blog [link:], typically consisting of stolen or automated low quality content.

Static Content – Content which does not change frequently, and that tends to remain constant over time until edited by the author or webmaster. It is often hard coded onto the page and does not come from a database [link:] (i.e. is not dynamic [link:]). Search engines prefer static content over dynamic content as it is typically easier to index.

Many static sites do well, but the reasons fresh content works great for SEO are:

  • If you keep building content every day, you eventually build a huge archive of content. This is why blogs are so popular with users and so well received by search engines.
  • By frequently updating your content you keep building mindshare, brand equity, and give people fresh content worth linking to and sharing via popular social media [link:] outlets.

Status Code – A status code (sometimes loosely called an error code) is a three-digit number assigned to every request (hit) received by the server. Most valid hits return a status code of 200 (“OK”), while “page not found” errors return a 404.

Here are the most common status codes:

  • 200 OK
  • 300 Multiple Choices
  • 301 Moved Permanently
  • 302 Found / Temporary Redirect
  • 400 Bad Request
  • 401 Unauthorized
  • 403 Forbidden
  • 404 Not Found
  • 500 Internal Server Error
  • 501 Not Implemented
  • 503 Service Unavailable
  • 550 Permission denied
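
As a quick diagnostic, you can check the status code a server returns for any URL. This is a rough sketch using Python’s standard library; note that urlopen follows redirects automatically, so a 301 or 302 will usually surface as the final destination’s code:

from urllib import request, error

def get_status(url):
    # Return the HTTP status code for a URL (illustrative sketch only).
    try:
        with request.urlopen(url) as response:
            return response.status  # e.g. 200; redirects are followed automatically
    except error.HTTPError as e:
        return e.code               # 4xx/5xx responses are raised as HTTPError

print(get_status("http://agent-seo.com/seo-dictionary/"))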


Stemming – Using the stem of a word to help satisfy search relevancy requirements. For example, searching for ‘swimming’ can return results which contain ‘swim’. This usually enhances the quality of search results, given the extreme diversity of word forms used in the English language.
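
As a rough illustration (assuming the third-party NLTK library is installed), a Porter stemmer reduces different word forms to a common stem:

from nltk.stem import PorterStemmer  # assumes: pip install nltk

stemmer = PorterStemmer()
print(stemmer.stem("swimming"))  # swim
print(stemmer.stem("swims"))     # swim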

Stop Words – Common words (e.g. a, to, and, is) which add little relevancy to a search query, and are thus removed from the query and ignored by search engines prior to finding the most relevant search results.

It is both fine and natural to use stop words in your page content. The reason stop words are ignored when people search is that the words are so common that they offer little to no discrimination value.
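
A minimal sketch of how a query might be stripped of stop words before matching (the stop-word list here is only a tiny example, not any search engine’s actual list):

STOP_WORDS = {"a", "an", "and", "is", "the", "to", "of"}

def strip_stop_words(query):
    # Remove common stop words from a search query (illustrative only).
    return [word for word in query.lower().split() if word not in STOP_WORDS]

print(strip_stop_words("how to find the best auto insurance"))
# ['how', 'find', 'best', 'auto', 'insurance']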


Submission – The act of making information systems and related websites aware of your website. In most cases you no longer need to submit your website to large-scale search engines; they follow links and index content on their own.

The best way to submit your site is to get others to link to it. Another way is to submit an XML Sitemap [link:], containing all current pages or any new pages you’d like included in the index, to Google Webmaster Tools [link:].

Some topical or vertical search [link:] systems will require submission, but you should not need to submit your site to the large-scale search engines.

Companies who charge money for submission of your website to a large mass of sites are probably not worth your time. The main search engines will find you naturally (and prefer to do it that way), and many of the directories they submit to may be of a low quality – which won’t do you much good.

Supplemental Results – Documents which generally are trusted less and rank lower than documents in the main search index.

Some search engines, such as Google, have multiple indices. Documents may be placed in a supplemental index when they are not well trusted due to any of the following conditions:

  • Limited link authority relative to the number of pages on the site
  • Duplicate content or near duplication
  • Exceptionally complex URLs

Documents in the supplemental results are crawled less frequently than documents in the main index. Since documents in the supplemental results are typically considered to be trusted less than documents in the regular results, those pages probably carry less weight when they vote for other pages by linking at them.

You can find documents on this site that are in Google’s supplemental results by searching for site:agent-seo.com *** -view:randomstring

T

Tag Cloud – A visual depiction of the word contents of a website, or of user-generated tags attached to online content, typically using color and font size to represent the prominence or frequency of the words or tags being depicted.

Taxonomy – Classification system of controlled vocabulary used to organize topical subjects, usually hierarchical in nature.

Technorati – Blog search engine which tracks popular stories and link relationships.


Term Frequency – A measure of how frequently a keyword appears amongst a collection of documents. Many SEOs also like to break this down on a page-by-page basis when looking at Keyword Density [link:].
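
As a back-of-the-envelope sketch, term frequency for a single page can be computed by dividing a keyword’s occurrences by the page’s total word count; this is an illustration, not how any particular engine weights terms:

import re
from collections import Counter

def term_frequency(text, term):
    # Occurrences of a term divided by total words in the text (illustrative only).
    words = re.findall(r"[a-z0-9]+", text.lower())
    return Counter(words)[term.lower()] / len(words) if words else 0.0

sample = "Auto insurance quotes: compare affordable auto insurance quotes online."
print(term_frequency(sample, "insurance"))  # 2 of 9 words, roughly 0.22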

Term Vector Database – A weighted index of documents which aims to understand the topic of documents based on how similar they are to other documents, and then match the most relevant documents to a search query based on vector length and angle.
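
A very small sketch of the underlying idea: represent each document as a vector of term counts and compare documents by the angle between those vectors (cosine similarity). Real systems use weighting schemes and far larger vocabularies:

import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    # Cosine similarity between two documents' term-count vectors (illustrative only).
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

print(cosine_similarity("cheap auto insurance quotes", "affordable auto insurance"))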


Text-Link Ads – Advertisements which are formatted as text links.

Since the web was originally based on text and links, people are typically more inclined to pay attention to text links than to other ad formats, which tend to be less relevant and more annoying.

However, search engines primarily want to count editorial links as votes, so links that are grouped together with other paid links (especially if those links are to off topic commercial sites) may be less likely to carry weight in search engines.

See also:

  • Google Adwords [link:]

Title Tags – A title tag is an HTML tag which contains a sentence of text describing the contents of its associated webpage.

The page title is one of the most important aspects to doing SEO on a web page. Each page title should be:

  • Unique to that page: Not the same for every page of a site!
  • Descriptive: What important ideas does that page cover? What keywords would you use to describe it?
  • Not excessively long: Typically page titles should be kept to 65 character spaces (8 to 10 words) or less, with some of the most important words occurring near the beginning of the page title.

Page titles appear in search results as the links searchers click on. In addition many people link to documents using the official document title as the link anchor text [link:]. Thus, by using a descriptive page title you are likely to gain descriptive anchor text and are more likely to have your listing clicked on.

On some occasions it also makes sense to use a title which is not literally descriptive, but is easily associated with human emotions or a controversy such that your idea will spread further and many more people will point quality editorial links [link:] at your document.

There are two main ways to write titles and be SEO friendly:

  1. Write literal titles that are well aligned with things people search for. This works well if you need backfill content for your site or already have an amazingly authoritative site.
  2. Write page titles that are exceptionally compelling to link at. If enough people link at them then your pages and site will rank for many relevant queries even if the keywords are not in the page titles.

Title Tags in your code:

<title>Your Keywords Go Here – Your Company Name</title>

Title tags in your browser:

[image]

Title tags in your search results:

[image]


Toolbar – In a graphical user interface, a toolbar is a GUI widget on which on-screen buttons, icons, menus, or other input or output elements are placed. Toolbars are seen in office suites, graphics editors, and web browsers.

Many major search companies aim to gain marketshare by distributing search toolbars. Some of these toolbars have useful features such as pop-up blockers, spell checkers, and form autofill. These toolbars also help search engines track usage data [link:].

Topic-Sensitive PageRank – Method of computing PageRank which, instead of producing a single global score, creates topic-related PageRank scores.


Trackback – Automated notification that another website has mentioned your site; this feature is baked into most popular blogging software programs.

Due to the automated nature of trackbacks they are typically quite easy to spam. Many publishers turn trackbacks off due to a low signal to noise ratio.

Traffic – See website traffic [link:]

TrustRank – Search relevancy algorithm which places additional weighting on links from trusted seed websites that are controlled by major corporations, educational institutions, or governmental institutions.


Twitter – Twitter is an online social networking service and microblogging service that enables its users to send and read text-based posts of up to 140 characters, known as “tweets”.

It was created in March 2006 by Jack Dorsey and launched that July. The service rapidly gained worldwide popularity, with over 300 million users as of 2011, generating over 300 million tweets and handling over 1.6 billion search queries per day.

Currently, Twitter is a standard-bearer for social media [link:] and a platform that most users/companies who are serious about developing social engagement and interaction need to be on.

Twitter Inc. is based in San Francisco, with additional servers and offices in New York City.


Typepad – Hosted blogging platform [link:] provided by SixApart, who also makes Movable Type [link:].

It allows you to publish sites on a subdomain off of Typepad.com, or to publish content which appears as though it is on its own domain.

Note: If you are serious about building a brand or making money online you should publish your content to your own domain because it can be hard to reclaim a website’s link equity and age related trust if you have built years of link equity into a subdomain on someone else’s website.


U

Unethical SEO – Some search engine marketers lacking in creativity try to market their services as being ethical, whereas services rendered by other providers are somehow unethical. SEO services are generally neither ethical nor unethical. They are either effective or ineffective.

SEO is an inherently risky business, but any quality SEO service provider should make clients aware of potential risks and rewards of different recommended techniques. Unethical SEO service providers are often said to be practicing Black-Hat SEO [link:].

Update – Search engines frequently update their algorithms and data sets to help keep their search results fresh and to make their relevancy algorithms difficult to reverse engineer. Most major search engines are continuously updating both their relevancy algorithms and search index.

See also:

  • Google Terminology – Good video featuring Google’s Matt Cutts [link:] discussing various commonly used terms inside Google.

URL – Uniform Resource Locator, the unique address of any web document. You can see the URL in your browser’s toolbar [link:], in the web address section.

URL Rewrite – A technique used to help make URLs more unique and descriptive to help facilitate better sitewide indexing by major search engines. Sometimes called a “mod_rewrite.” This is typically done when a website is using dynamic [link:] URLs.

See also:

  • Apache Module mod_rewrite – Apache module used to rewrite URLs on sites hosted on an Apache server.
  • ISAPI Rewrite – software to rewrite URLs on sites hosted on a Microsoft Internet Information Server.

Usability – The degree to which it is easy or difficult for customers to perform the desired actions on your website.

The structure and formatting of a website’s design layout, its calls to action, and its text- and hyperlink-based navigation all play a significant part in the site’s overall usability, and thus its conversion rates.


Usage Data – Things like a large stream of traffic, repeat visitors, multiple page views per visitor, a high clickthrough rate, or a high level of brand related search queries may be seen by some search engines as a sign of quality. Some search engines may factor in a website’s usage data as part of their ranking algorithm [link:].

V

Vertical Search – Pending

Viral Marketing – Pending

Virtual Domain – Pending

Virtual Server – Pending

Vlog – Pending

W

W3C – Pending

Web – Pending

Web Analytics – Pending

Web CEO – Pending

Web Crawler – Also known as Spider or Robot, a crawler is a search engine program that “crawls” the web, collecting data, following links, making copies of new and updated sites, and storing URLs in the search engine’s Index. This allows search engines to provide faster and more up-to-date listings.

Examples of ‘good’ spiders are Googlebot, MSNbot and Yahoo Slurp, which search the web for pages to place in their respective search indexes. ‘Bad’ spiders are those that search for vulnerabilities in server configurations, web content to steal, etc.

Many non-traditional search companies have different spiders which perform other functions. For example, TurnItInBot searches for plagiarism. Spiders should obey the robots.txt protocol.
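
A toy illustration of what a crawler does at its core: fetch a page, pull out the links, and queue them for later visits. The sketch below uses only Python’s standard library and fetches a single placeholder URL; a real crawler would also respect robots.txt, de-duplicate URLs, and throttle its requests:

from html.parser import HTMLParser
from urllib import request

class LinkExtractor(HTMLParser):
    # Collect the href value of every anchor tag on a page.
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(url):
    # Fetch one page and return the links found on it (toy example only).
    with request.urlopen(url) as response:
        html = response.read().decode("utf-8", errors="replace")
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

print(crawl("http://www.example.com/"))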

Website Traffic – A general term used to describe the volume of visitors to a website.

Web traffic is the amount of data sent and received by visitors to a web site. It is a large portion of Internet traffic. This is determined by the number of visitors and the number of pages they visit.

Sites monitor the incoming and outgoing traffic to see which parts or pages of their site are popular and if there are any apparent trends, such as one specific page being viewed mostly by people in a particular country.

There are many ways to monitor this traffic, and the gathered data is used to help structure sites, highlight security problems, or indicate a potential lack of bandwidth; not all web traffic is welcome.

A popular and free tool for measuring website traffic is Google Analytics [link:]


White-Hat Tactics – Pending

Wiki – Pending

Wikipedia – Pending

WordPress – Pending

WYSIWYG – Pending

X

XHTML – Pending

XML – Pending

XML Feed – Pending

Y

Yahoo! – Pending

Yahoo! Answers – Pending

Yahoo! Directory – Pending

Yahoo! Local – Pending

Yahoo! Search Marketing – Pending

Yahoo! Site Explorer – Pending

Youtube – Pending

Z

None

#

10-Pack – Pending

200-Status – Pending

301-Redirect – Pending

302-Redirect – Pending

404-Error – Pending

Sources

As I stated above, I did not come up with all of the definitions above – simply compiled the best available information and added my thoughts, as well as other relevant links and material in most cases. Below are my sources:

  • SEOBook’s Glossary – From Aaron Wall, one of the most highly-recognized people in the SEO industry today.
  • SEO-Dictionary.com – This site is no longer up, but I had originally used it as a source for a few key terms.
  • Submit Express’ SEO Dictionary
  • SEODictionary.net
  • Wikipedia – I used this for a few definitions. I know that Wikipedia is human-edited, but that doesn’t mean all the information there is crap.

Am I missing a term that you would like defined? Simply contact me and I’ll do my best to get it on this list!
