Where does the term SEO (Search Engine Optimization) come from?
The first search engines emerged in the early 1990s. By the time Google arrived in the late 1990s, many more had been created, including Yahoo. Then the web boom began: people realised they could actually make money with their websites, and concluded that they needed to attract traffic. What was the best way to attract traffic? Search engines. At that point, website owners started thinking about how to reach the top positions… SEO was born!
SEO focuses on organic search results, i.e. those that are not paid for.

But anyway, let’s get to what matters and the reason why (I think) you are reading this chapter:
1. What is SEO?
“Search engine optimization is the process of improving the visibility of a website in the organic results of different search engines. It is also often referred to by its English name, SEO (Search Engine Optimization).” (Wikipedia)
SEO is one of the “disciplines” that has changed the most in recent years. We only have to look at the large number of Penguin and Panda updates that have taken place, and how they have turned the understanding of SEO on its head. Now, SEO is about what Matt Cutts himself calls “Search Experience Optimization” or, in other words, “everything for the user.”
Although there are thousands of factors a search engine uses to rank one page above another, it could be said that two are fundamental: authority and relevance.
- Authority is basically the popularity of a website: the more popular it is, the more valuable the information it contains is assumed to be. This is the factor a search engine weighs most heavily, since it is based on users' own experience; the more a piece of content is shared, the more users have found it useful.
- Relevance is the relationship a page has to a given search. It is not simply a matter of a page containing the search term many times (in the early days it worked that way); search engines rely on hundreds of on-site factors to determine it.
SEO can be divided into two large groups:
- On-site: On-site SEO is concerned with relevance. It ensures that the website is optimized so that the search engine understands the main thing: its content. On-site SEO includes keyword optimization, loading time, user experience, code optimization and URL formatting.
- Off-site: Off-site SEO is the part of SEO work that focuses on factors outside the website we are working on. The most important factors in off-site SEO are the number and quality of inbound links, presence on social networks, mentions in local media, brand authority and performance in search results, that is, the CTR our results achieve in a search engine.
You are probably thinking that all this is very good and very interesting, but that you are here to find out why you need SEO on your website and what benefits you will get if you integrate it into your online strategy.
Once we know what SEO is, we need to distinguish between the techniques that follow the search engine's “recommendations” and those that do not: White Hat SEO and Black Hat SEO.
- Black Hat SEO: Black hat is the attempt to improve the search engine positioning of a website using unethical techniques or techniques that contradict the search engine's guidelines. Some examples of Black Hat SEO are cloaking, spinning, spam in forums and blog comments, and keyword stuffing. Black hat can deliver benefits in the short term, but it is generally a risky strategy with no long-term continuity and one that adds no value.
- White Hat SEO: This consists of all those actions that are ethically correct and that comply with the search engines' guidelines to position a web page in search results. Since search engines give greater importance to the pages that best answer a user's search, White Hat covers the techniques that seek to make a page more relevant to search engines by providing value to its users.
2. Why is SEO important?
The most important reason why SEO is necessary is that it makes your website more useful to both users and search engines. Although search engines still can't view a web page the way a human can, SEO is necessary to help them understand what each page is about and whether or not it is useful to users.
Now let’s give an example to make things clearer:
Suppose we have an e-commerce site dedicated to selling children's books. For the term “coloring pages” there are about 673,000 searches per month. Assuming that the first result shown after a Google search gets 22% of the clicks (CTR = 22%), it would receive about 148,000 visits per month.
Now, how much are those 148,000 visits worth? Well, if the average cost per click for that term is €0.20, we are talking about more than €29,000/month. And that is just in Spain; if your business targets several countries, consider that billions of searches are made around the world every day, around 70% of the clicks go to organic results, and 75% of users never reach the second page. Taking all this into account, it is clear that a lot of clicks per month are at stake for the first result.
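If you want to redo this back-of-the-envelope calculation with your own numbers, a tiny script makes the arithmetic explicit. The figures below (search volume, CTR, CPC) are just the illustrative values from the example above, not real data for any particular keyword:

```python
# Rough estimate of the traffic and value of a #1 organic ranking.
# The numbers are the illustrative figures from the example above,
# not real data for any particular keyword.

monthly_searches = 673_000   # monthly searches for "coloring pages" (example figure)
ctr_position_1 = 0.22        # assumed CTR of the first organic result
avg_cpc_eur = 0.20           # average cost per click in EUR (example figure)

estimated_visits = monthly_searches * ctr_position_1
estimated_value_eur = estimated_visits * avg_cpc_eur

print(f"Estimated visits/month: {estimated_visits:,.0f}")        # ~148,060
print(f"Equivalent ad spend/month: EUR {estimated_value_eur:,.0f}")  # ~29,612
```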
SEO is the best way for your users to find you through searches where your website is relevant. These users are looking for what you offer them. The best way to reach them is through a search engine.
3. How do search engines work?
The operation of a search engine can be summarized in two steps: crawling and indexing.
Crawling
A search engine crawls the web using what are called bots. These bots move through all the pages by following links, hence the importance of a good link structure. Just as any user browsing the web would, they go from one link to another and collect data about those pages, which they send back to the search engine's servers.
The crawling process starts with a list of web addresses from previous crawls and sitemaps provided by other websites. Once these websites are accessed, the bots look for links to other websites to visit. Bots are especially attracted to new sites and changes to existing websites.
Bots themselves decide which pages to visit, how often and how long they will crawl that website, which is why it is important to have optimal loading times and updated content.
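To make the crawl-and-follow-links loop described above concrete, here is a minimal sketch of a crawler using only the Python standard library. The start URL is a placeholder, and a real crawler would also respect robots.txt, rate limits and many other details:

```python
# Minimal sketch of how a crawler follows links from page to page.
# Standard library only; "https://example.com/" is a placeholder start URL.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=10):
    queue = deque([start_url])   # pages waiting to be visited
    visited = set()              # pages already crawled
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", errors="ignore")
        except (OSError, ValueError):
            continue             # skip pages that cannot be fetched
        visited.add(url)
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            queue.append(urljoin(url, href))  # resolve relative links
    return visited

if __name__ == "__main__":
    for page in crawl("https://example.com/"):
        print(page)
```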
It is very common for a website to need to restrict crawling of certain pages or content to prevent them from appearing in search results. To do this, search engine bots can be told not to crawl certain pages through the “robots.txt” file.
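As a sketch, a robots.txt file placed at the root of the site might look like the following. The blocked paths are purely hypothetical examples, and support for some directives (such as Allow and Sitemap) varies between search engines:

```
# Hypothetical robots.txt at the root of the website
User-agent: *         # rules apply to all crawlers
Disallow: /admin/     # do not crawl the admin area
Disallow: /cart/      # do not crawl shopping-cart pages
Allow: /              # everything else may be crawled

Sitemap: https://www.example.com/sitemap.xml
```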
Indexing
Once a bot has crawled a website and gathered the necessary information, these pages are indexed. There they are sorted by their content, authority and relevance. This way, when we make a query to the search engine, it will be much easier for it to show us the results that are most closely related to our query.
At first, search engines were based on the number of times a word was repeated. When a search was made, they would look through their index for those terms to find the pages that contained them in their text, ranking higher the page that repeated the word the most times. Nowadays, they are more sophisticated and base their indexes on hundreds of different aspects: publication date, whether the page contains images, videos or animations, microformats, and so on. Now they give more priority to the quality of the content.
Once the pages are crawled and indexed, it’s time for the algorithm to act: algorithms are the computer processes that decide which pages appear first or last in search results. Once the search is done, the algorithms check the indexes. This way they know which pages are the most relevant taking into account the hundreds of ranking factors. And all this happens in a matter of milliseconds.
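As a toy illustration of what an index and a ranking step look like, the sketch below builds a tiny inverted index and then scores pages for a query by how often they contain the query terms, i.e. the primitive word-count approach described above. The page contents are invented, and real search engines combine hundreds of signals rather than raw term frequency:

```python
# Toy inverted index: maps each word to the pages that contain it,
# then ranks pages for a query by simple term frequency.
# The "pages" below are invented examples.
from collections import Counter, defaultdict

pages = {
    "example.com/coloring-pages": "free coloring pages for kids, printable coloring pages",
    "example.com/childrens-books": "children's books and stories for kids",
    "example.com/blog/crayons": "choosing crayons for coloring",
}

# Indexing: for every word, remember how many times each page uses it.
index = defaultdict(Counter)
for url, text in pages.items():
    for word in text.lower().replace(",", " ").split():
        index[word][url] += 1

def search(query):
    """Score each page by the number of query-term occurrences (early search-engine style)."""
    scores = Counter()
    for word in query.lower().split():
        scores.update(index.get(word, Counter()))
    return scores.most_common()

for url, score in search("coloring pages"):
    print(score, url)
```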