
SEO dictionary


We present a complete SEO dictionary covering all the key terms of the subject, to answer your questions about the digital language.

Within SEO there are many terms and concepts we need to understand clearly in order to optimize and position our site or project as well as possible.

Interpreting each term correctly is fundamental. Therefore, in this section we will keep adding new terms and expressions, all related to SEO, so that you are always up to date.


Anchor text

What is the Anchor Text?

The anchor text is the visible text of a link or hyperlink, and it gives both users and search engines information about the content we are directing them to.

Search engines have improved over time and use more and more factors to build their rankings. One of these factors is the relevance of a link, which depends both on the authority of the page the link comes from and on its visible anchor text. Of course, the link should always be as natural as possible, or Google will treat it as a bad practice.

We can classify the anchor text in the following types:

  • Generic. Words such as “this blog”, “click here”, or “this page”.
  • Keyword. When we want to position a particular keyword, we use it as the anchor text, choosing the terms we want to highlight, for example “On-page SEO”.
  • Brand name. When the anchor is a text different from the previous ones and the goal is to link to a brand, a website, etc. The link would read: “SEOexpertum”.
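As a rough illustration, the three anchor text types can be written out as plain HTML links. This is a minimal Python sketch; the URLs are hypothetical, and only the anchor texts come from the examples above.

```python
def make_link(url: str, anchor_text: str) -> str:
    """Build an HTML link whose visible text is the anchor text."""
    return f'<a href="{url}">{anchor_text}</a>'

# Hypothetical URLs; the anchor texts match the three types above.
anchor_examples = {
    "generic": make_link("https://example.com/post", "click here"),
    "keyword": make_link("https://example.com/on-page-seo", "On-page SEO"),
    "brand":   make_link("https://example.com/", "SEOexpertum"),
}

for kind, html in anchor_examples.items():
    print(f"{kind}: {html}")
```
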



Backlinks

What are backlinks?

Backlinks are the inbound links that point from other pages to your own. The number of backlinks to your page matters because the more relevant the pages linking to you, the more authority your website gains in the eyes of Google. Make sure they are natural, appropriate links: always quality over quantity.

Black Hat SEO

What is Black Hat SEO?

Black Hat SEO is the practice of improving a website’s search engine positioning through unethical techniques, or ones that contradict Google’s guidelines; in other words, cheating. These practices are increasingly penalized by Google. Some examples of Black Hat SEO are:

  • Cloaking
  • Spinning
  • Spam in forums and blog comments
  • Keyword stuffing


Cannibalization of keywords

What is the cannibalization of keywords?

Keyword cannibalization occurs when several pages on a website compete for the same keywords; the search engine cannot tell which page is the most relevant for that keyword, which causes a loss in positioning.

How is this solved? The easiest way is to focus each page on one or two keywords at most. If that cannot be avoided, create a main product page from which the pages for the different formats can be reached, and add a canonical tag on those pages pointing to the main product page.


Cloaking

What is cloaking?

Cloaking is a widely used Black Hat SEO technique that shows different content depending on whether it is a user or a search engine robot reading the page.

Google is very strict with this practice: although years ago it could have produced results, forget it now. It goes against what search engines pursue with their updates, namely a more natural, ethical SEO that is more focused on users.

Here is a video where Matt Cutts explains in more detail what cloaking is and how Google fights against it.


CTR

What is CTR?

The CTR (Click Through Rate) is the number of clicks that a link obtains with respect to the number of impressions. It is always calculated as a percentage and is a metric that is normally used to measure the impact of a digital campaign.

How to calculate the CTR?

As we said before, the CTR is expressed as a percentage. It is obtained by dividing the number of clicks a link has received by the number of times users have seen it (impressions), then multiplying by 100.

Let’s see an example. Imagine we have a result in Google that has been seen 2000 times and has received 30 clicks; our CTR would be calculated like this:

CTR = (Clicks / Impressions) x 100 = (30/2000) x 100 = 1.5%
CTR = 1.5%
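The formula above can be sketched as a small Python function; the numbers reproduce the example from the text.

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate as a percentage of impressions."""
    if impressions == 0:
        return 0.0  # avoid division by zero when a link had no impressions
    return clicks / impressions * 100

# Example from the text: 30 clicks out of 2000 impressions.
print(f"CTR = {ctr(30, 2000)}%")  # CTR = 1.5%
```
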

Where can you see your website’s CTR on Google?

Thanks to the webmaster platform Google Search Console, formerly known as Webmaster Tools, you can check the CTR of your pages in Google’s search results.

Canonical tag

What is the canonical tag?

The canonical tag was presented by Google, Yahoo! and Bing in 2009 to solve the problem of duplicate or similar content in SEO.

If a set of pages with duplicate or similar content lacks a canonical tag, search engines have to decide for themselves which URL best fits what the user is looking for. By adding this tag, we are the ones telling Google and the other search engines which page we prefer. This improves the indexing and positioning of our website in the SERPs.

Example of a canonical tag: <link rel="canonical" href="URL-of-the-preferred-page" />

Let’s see an example: if our website is a platform selling flats in the Chueca neighborhood of Madrid and we have several pages with very similar content, we should choose as canonical the URL we want to rank for. That may be the one that has brought us the most traffic or the one with the greatest profit.

To use the canonical tag effectively in SEO, just follow these steps:

  • Choose which is the canonical or main page.
  • Identify which secondary pages could compete with the main one in the rankings.
  • Add the canonical tag to the secondary pages, pointing to the main page, between <head> and </head>.
  • Add the canonical tag to the main page, pointing to itself, between <head> and </head>.
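The steps above can be sketched in Python: one helper builds the tag, and every page in the duplicate group (main and secondary alike) points its canonical at the main URL. The URLs below are hypothetical examples.

```python
def canonical_tag(canonical_url: str) -> str:
    """The <link rel="canonical"> element to place between <head> and </head>."""
    return f'<link rel="canonical" href="{canonical_url}" />'

main_page = "https://example.com/flats-chueca/"    # hypothetical main URL
secondary_pages = [                                # hypothetical variants
    "https://example.com/flats-chueca/?order=price",
    "https://example.com/flats-chueca/?page=2",
]

# The main page points to itself; the secondary pages point to the main page.
for url in [main_page] + secondary_pages:
    print(url, "->", canonical_tag(main_page))
```
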


Duplicate Content

What is duplicate content?

Duplicate content consists of one or several URLs carrying exactly the same content. In principle it does not lead to a penalty, unless there is an intention to deceive the search engine by showing more results for the same search.

There are two types of duplicate content: internal duplicate content, which is repeated within the same site; and external duplicate content, which is plagiarized between different sites.

Although it does not imply a penalty, duplicate content can cost you good ranking opportunities, because search engines cannot be fully certain which pages are the most relevant for certain searches.

Keyword density

What is keyword density?

Keyword density is the percentage of times that a word (or series of words) appears in the whole text versus the number of total words.

A few years ago, the density of keywords was one of the most important factors in SEO positioning, since it was the method used by search engines (Google, Yahoo, Bing) to identify the main topic of a page.

However, SEO has changed: Google’s guidelines now recommend writing as naturally as possible; that is, writing for the user instead of for the search engine.

Although some people still recommend not exceeding a keyword density of 3%, there is no ideal percentage.
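For a single-word keyword, the density can be computed with a short Python sketch (the sample sentence is invented for illustration):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in the text equal to a single-word keyword."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words) * 100

sample = "SEO tips: good SEO content beats keyword stuffing"
print(f"density: {keyword_density(sample, 'SEO'):.1f}%")  # density: 25.0%
```
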


Google Algorithm

The Google algorithm is the mechanism the search engine uses to rank pages for a given search; in other words, it is what decides whether you appear first, second, or on the second page.

This algorithm changes around 500 times a year, and it is difficult to keep track of it. That is why it is better to focus on the major updates, such as Panda and Penguin: how they affect SEO and how to recover from them.

Google Panda

What is Google Panda for?

Google Panda exists so that users can easily reach sites with consistently high-quality content. It was born with the sole purpose of rewarding quality over quantity, based on the criteria mentioned in the previous section.

For users it is a benefit, since it lets them find good information and, above all, avoids feeding plagiarism and duplicate content across the web. For publishers it meant devising new strategies to climb positions, but it also rewarded those committed to producing good and, above all, completely original content.

Google Penguin

What is Google Penguin?

Penguin is the official name of the Google algorithm update designed to fight webspam. It was launched in April 2012.

It focuses on a website’s off-site factors, rewarding sites whose link profiles consist of high-quality links from non-manipulated domains, and punishing pages that have violated Google’s guidelines with unnatural link profiles, too many links from low-quality sites, and so on.

It was Google that decided, from the beginning, that links pointing to a website were a sign its content was relevant. That is why everyone started generating links in bulk. Google Penguin, however, is Google going back on its own word.

The improvements the update brings to the algorithm include better detection of low-value links: purchased links, links from article networks and directories, and basically any scheme that tries to manipulate your website’s link profile. The best way to make sure Penguin does not penalize you is to respect Google’s guidelines and earn links passively through your content.

How does SEO change with Google Penguin?

  • Natural links, that is, links earned passively or through real value. Article syndication, spinning, hidden links, directories (free or paid), promotions in exchange for links, etc. are prohibited.
  • Variety of anchor text: it no longer makes sense to build links with exactly the anchor text you want to rank for. If Google detects a pattern it does not consider natural, it can penalize you.
  • Stay in your niche: the most valuable links come from domains and pages in your niche, or ones covering related topics.
  • Quality, not quantity: it is preferable to earn a few quality links than many low-value ones.



Keyword

What is a keyword?

Keywords are the terms through which we want to attract traffic to our website from search engines. You must take into account several factors associated with keywords (abbreviated KW), such as competition, search volume, conversion, and even their potential as a branding tool.

The choice of one or another keyword will determine the strategy, the content of a page, the appearance of that keyword in texts and labels, and other factors of SEO positioning.

Keyword stuffing

What is keyword stuffing?

Keyword stuffing is a Black Hat SEO technique that consists of the excessive use of a keyword within a text, with the misguided aim of giving that word more relevance. Google frequently penalizes this kind of over-optimization.

To avoid any negative action by Google, texts should always be written to provide value to the user, and in the way that best suits your audience. If the text delivers useful, original, well-synthesized information, that will be a better signal for Google than any variation in the number of keywords in the text.

There is no percentage that defines a perfect keyword density; above all, Google recommends naturalness.


Link baiting

What is link baiting?

Link baiting is the technique of attracting links organically through the creation of high-value content. The number of links pointing to a page is one of the essential factors in search engine positioning.

The goal of link baiting is to get a large number of users to link to content on our site. To achieve this, we must create original, relevant, novel content, such as articles, videos, or infographics that catch users’ attention.

Link building

What is link building?

Link building is one of the foundations of web positioning, or SEO; it seeks to increase a page’s authority as much as possible by generating links to it.

The algorithms of most search engines, such as Google or Bing, rely on both on-site and off-site SEO factors. The latter are based on a website’s relevance, whose main indicator is the links pointing to it, or backlinks. There are other factors too, such as the anchor text of each link, whether the link is follow or nofollow, brand mentions, and links generated on social networks.

Keep in mind that good content is often linked to naturally, so links earned this way arrive organically and with less effort than through other means.

Link juice

What is link juice?

Link juice is the authority a page transmits through a link. Google positions web pages according to their authority and relevance; this authority is transferred from one page to another through links, and the amount transmitted is what we call link juice.

To understand it, picture a web page as a large glass of juice with several holes (links) in its base. A glass with a single hole transmits all of its link juice through that one hole; if it has 10 holes, each passes 10% of the total link juice, and so on.
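The glass-of-juice analogy amounts to dividing a page’s authority equally among its outgoing links. A one-line Python sketch of that simplified model:

```python
def link_juice_per_link(page_authority: float, num_links: int) -> float:
    """Simplified model: authority splits equally across all outgoing links."""
    return page_authority / num_links

# One hole passes everything; ten holes pass 10% of the juice each.
print(link_juice_per_link(1.0, 1))   # 1.0
print(link_juice_per_link(1.0, 10))  # 0.1
```

Real-world models (such as the original PageRank formula) add damping and iteration on top of this equal split, but the intuition is the same.
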



Long tail

The long tail is a statistical term that refers to the distribution of a population.

Suppose your website attracts traffic through 100,000 keywords, and you focus on the 100 that bring the most visitors. Imagine those 100 terms account for around 20% of total traffic (the exact share depends on the nature of your website); the remaining 80% then corresponds to terms with very few searches each. So the vast majority of the traffic your website attracts comes through terms you are not analyzing and may not even know about.

That is what we call the long tail: searches for more specific terms that individually generate very little traffic, but that together are the largest source of visits to the site. The term applies well beyond online marketing; it was popularized by Chris Anderson in a Wired article, citing Amazon, Netflix, and Apple as examples of companies that have succeeded thanks to the business generated by their long tails.
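The 20/80 split described above can be illustrated with made-up numbers: a few head keywords with many visits each, and thousands of tail keywords with a handful each.

```python
# Made-up illustrative data, not real traffic figures.
head = {"seo": 900, "seo dictionary": 600, "what is seo": 500}   # top terms
tail = {f"long specific query {i}": 4 for i in range(2000)}      # 2000 tiny terms

head_traffic = sum(head.values())   # 2000 visits
tail_traffic = sum(tail.values())   # 8000 visits
total = head_traffic + tail_traffic

print(f"head share: {head_traffic / total:.0%}")  # head share: 20%
print(f"tail share: {tail_traffic / total:.0%}")  # tail share: 80%
```
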


Meta Tags

What are meta tags?

Meta tags are pieces of information included in web pages that are not directly visible to the user. They provide information to browsers and search engines, helping them interpret the page better, and they are written in HTML within the web document itself.

Meta tags have been important for SEO because of their ability to affect search engine behavior: they can signal why a website’s pages should rank, provide a description of the site, or block search engine robots from accessing or indexing it.


Microformats

What are microformats?

Microformats are a simple form of code that gives semantic meaning to content so that machines can read it and understand our products or services.

If we add microformats to our website, Google can read them and display the information in the search results. This information may include user ratings, the author’s photo and name, video, audio, etc.


Not provided

What is “not provided”?

The term “not provided” is used in Google Analytics to identify all “secure” traffic within Google; in other words, all traffic coming from users who are signed in to their Google account.



Off-site SEO

What is off-site SEO?

Off-site SEO is the part of SEO work focused on factors external to the web page we are working on that still affect our site, including external links, social signals, mentions, and other metrics that reinforce the page’s authority.

One of the most important off-site SEO tasks is link building: generating links on external websites that point to your page, which Google rewards with greater relevance.

On-site SEO

What is on-site SEO?

On-site SEO, or on-page SEO, is the set of internal factors that influence a web page’s positioning. These are the aspects of our page that we can change ourselves, such as:

  • The meta information, such as the title or the meta-description
  • The URL
  • The content
  • The alt attribute of images
  • The web structure
  • The internal linking
  • The HTML code

Optimizing on-site SEO is an essential process that every web page should take care of if it wants to appear in the search results.



PageRank

What is PageRank?

PageRank is the way Google measures the importance of a website; the search engine rates the value of websites on a scale from 0 to 10.

When a page links to another website it transmits value, and that value depends on the PageRank of the page containing the link.

Google has now stopped updating PageRank publicly, so nobody can see anymore what score a website has in the search engine’s eyes.

However, although Google keeps using it internally to build its search results, it carries less and less weight within the overall algorithm.

PageRank is determined by factors such as the number of links and domains pointing to the site, their quality, the age of the domain, etc.



Query

What is a query?

The English term “query” means question or inquiry. In the context of databases, a query (or query string) is a request for data stored in that database, although generically it can refer to any interaction. In the context of search engines, a query is the term we type into Google, a request that then results in a SERP.


Robots tag

What is the meta robots tag, and what is robots.txt?

The meta robots tag is an HTML tag used to tell search engines to treat a URL in a certain way.

This tag is necessary if we do not want our website to be indexed or positioned in search engines.

You can also perform this function through the site’s robots.txt file.

The difference between the meta robots tag and the robots.txt file is the following:

  • With the tag, we tell Google not to index certain pages, while still letting the bots crawl them.
  • With the robots.txt file, on the other hand, we tell the bots not to enter or crawl certain pages at all.
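A sketch of that difference, with both directives written out as plain strings (the /private/ path is a hypothetical example):

```python
# Meta robots: placed in each page's <head>; bots may still crawl the page
# but are asked not to index it (while following its links).
meta_robots = '<meta name="robots" content="noindex, follow" />'

# robots.txt: served at the site root; bots are asked not to crawl the
# matching paths at all.
robots_txt = "User-agent: *\nDisallow: /private/\n"

print(meta_robots)
print(robots_txt)
```
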

Ranking in search engines

What is the search engine ranking?

Search engine ranking is one of those terms you have heard more than once if you are interested in online marketing, but whose meaning you may not know in depth.


Search engine ranking is the position your website occupies on a search results page; that is, the position in which you appear in Google, Yahoo, Bing, etc. when a user performs a search.

To improve our positioning we must use strategies and tools that help us optimize our website, increasing accessibility, usability, and content.


Negative SEO

What is negative SEO?

Negative SEO is a set of unethical and potentially illegal techniques based on Black Hat or other harmful actions. They are directed against competitors’ websites so that those sites get penalized, de-indexed, or pushed down in the search results. Search engines often penalize these acts.

Negative SEO can easily be countered if we regularly monitor our link profile and take measures against any strange or harmful links we find.

Some examples of Negative SEO are:

  • Sending low-quality backlinks
  • Duplicating the site’s content on other sites
  • Negative comments about the brand


SERP

What are SERPs?

SERP (Search Engine Results Page) refers to the results page of a search engine such as Google or Bing.

It is the page that appears after performing a search; it is where the results are displayed.

The better a website is optimized according to the search engines’ quality criteria, the more likely it is to rank well in the SERPs.


Sitemap

What is a sitemap?

A sitemap, or website map, is an XML document submitted to search engines. It gives them a complete list of the pages that make up a website, so they can index pages their robots could not otherwise reach because there are no direct links to them, they sit behind a form, etc.

If you want to generate a sitemap, there are online tools for it; afterwards, Google lets you upload it through Google Search Console.
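A minimal sitemap generator can be sketched in a few lines of Python; the entry structure follows the sitemaps.org protocol, and the URLs are hypothetical.

```python
def build_sitemap(urls):
    """Build a minimal XML sitemap listing the given URLs."""
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

print(build_sitemap(["https://example.com/", "https://example.com/seo-dictionary/"]))
```

A real sitemap would usually also carry optional fields such as lastmod per URL, but the skeleton above is already valid.
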


Spinning

What is spinning?

Spinning is a Black Hat SEO technique that consists of creating an article by reusing other original texts.

This accelerates content generation in a simple way. It can be done with software that automates the rewriting, or manually, making the texts appear different through synonyms and changes in word order.

Although this technique has been widely used, doing it automatically falls squarely among Google’s penalty triggers. Since its now-famous Penguin update, Google detects these practices more and more often.


White Hat SEO

What is White Hat SEO?

In essence, it is the SEO that follows the rules and guidelines of the search engines. It is actually easier to explain by starting with its opposite: Black Hat SEO comprises all the techniques used to cheat the engines and position a web page above the rest in a “disloyal” way. We cannot call it illegal, since there is no legislation on the matter, but these are bad practices, heavily penalized by search engines; and frowned upon by White Hat SEOs, of course.

In conclusion, a “white hat SEO” is anyone who follows the rules or “recommendations” issued by search engines such as Google, Bing, or Yahoo. In general these guidelines largely coincide across engines, and they are usually grounded in the engines’ own ethics.
