Search Engine Optimization – a historical perspective

It seems that every mention of internet marketing on a website or in an ezine article contains a reference to “search engine optimization”, or SEO. It is the current buzzword, and everyone seems to be using it as if they were all experts in the field. Advertisements are all over the business forums: internet marketing “professionals” offering their “search engine optimization services” at fees ranging from a few dollars to hundreds or even thousands of dollars, which internet “masters” are apparently willing to pay. Others seem ready to give free advice on how to work SEO into your website effectively.

However, hardly anyone ever says what search engine optimization actually is! So, as we explore the history of search engine optimization, let us try to figure out what it is and what it does.

Simply put, search engine optimization is the art and/or science (usually more art than science) of making web pages more attractive to internet search engines. There is no doubt that an internet business would be negligent not to make search engine optimization an integral part of any search engine marketing plan.

So, how exactly do you “optimize” a website to attract the attention of the search engines?

The somewhat mysterious art of search engine optimization (SEO) first began to glimmer in the dark ages of the Internet, the mid-1990s. Perhaps it was really the Renaissance, but “Dark Ages” is easier to spell. SEO was fairly basic in those days, however. In fact, many of the “search engines” available at the time were really just directories compiled by web crawlers (sorry, spiders), which eventually extracted somewhat more data from a website than the site owner had originally submitted.

Even in those dark days, a quality search engine could make some discriminating evaluations, assigning weightings or search engine rankings based on a site’s informational content and other data (such as keywords, descriptions, and text and graphic content) and its relevance to a given topic or query. Unfortunately, although search engine spiders could extract a certain amount of data, a website’s ability to achieve high search engine rankings depended largely on the material submitted by the webmaster.

Webmasters, you know, are not stupid, and they quickly realized that by using various techniques they could improve their websites’ search engine rankings. One such technique was to manipulate content by loading it with keywords, often repeated many times over and hidden in the background of the page. In this way they could raise a website’s search engine ranking, and a higher ranking meant more visitors, which usually meant more money. It did not take webmasters long to grasp that fact.

Enter the search engine algorithm

“Algorithm” may be one of the most difficult words on the Internet to understand. It simply refers to a system or set of instructions; in this case, the rules a search engine follows when ranking websites. Simplistically, a search engine owner could decide that his or her algorithm will include an instruction to assign the lowest ranking to any website containing the word “blue”. The point is that this magical, mysterious algorithm is nothing more than a set of instructions provided to the software that search engines use to assign search engine rankings.
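To make the “set of instructions” idea concrete, here is a deliberately silly sketch in Python. It is purely hypothetical, not any real engine’s algorithm; it simply encodes the “blue” rule above as code:

```python
def toy_rank(page_text: str) -> int:
    """A toy 'algorithm': nothing but a fixed set of instructions
    that turns a page into a ranking score."""
    text = page_text.lower()
    # The silly rule from above: any page containing "blue"
    # gets the lowest possible ranking.
    if "blue" in text:
        return 0
    score = 100
    # A second, equally arbitrary instruction: reward a keyword.
    if "diet" in text:
        score += 10
    return score

print(toy_rank("The best blue widgets"))  # 0
print(toy_rank("A sensible diet guide"))  # 110
```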

Of course, search engine algorithms existed before then as well, but much like cops and robbers, as webmasters became more adept at second-guessing the existing algorithms, the search engines improved those algorithms to stop them from doing so.

As search engines began to rely less and less on the information provided by webmasters, a major change took place. They developed software that could investigate websites independently and draw its own conclusions about what it found. No longer would a webmaster fill out a form with a title, a description, and a pile of keywords, to be checked by some “Mortimer Snerd” of an indexer reporting, “Yup, Mr. Bergen, them keywords you asked about are there all right. You bet, there’s a whole lot of keywords in there!” The search engine software began to look more deeply for itself, making logical, or at least quasi-logical, decisions about what it found.

A time-out for readers under 50: since roughly 200% of Internet users are nowhere near my age, here is the skinny on Mortimer Snerd. From about the 1930s into the 1960s, I believe, there was a famous ventriloquist named Edgar Bergen, the father of Candice Bergen. He worked mainly with two dummies, Charlie McCarthy and Mortimer Snerd. Charlie McCarthy was the smart one, usually dressed in a tuxedo and apparently very interested in the comings and goings of high society. Bergen’s other main puppet was Mortimer Snerd, a hillbilly fresh off the turnip truck who believed anything anyone said to him… and took it as the truth.

Back to the topic of search engine optimization.

Well, rather than simply taking the webmaster’s word that keywords such as “weight loss”, “diet”, and “exercise” applied to a site’s theme and then checking that those words were present, the software began to examine a long list of factors. It checked the domain name and the words used in the title. It checked keyword frequency, the proximity of keywords to one another, and the order in which they appeared. It checked what the “ALT” attributes attached to images contained and what the META tags had to say. Most importantly, it examined the site’s textual content to get a primary feel for how all these things fit together and how well they matched both the webmaster’s claims and the expectations of the search engine’s customers.
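As a rough illustration of the on-page factors just described, here is a short Python sketch using only the standard library. The sample page, the keyword, and the idea of simply counting occurrences are all invented for the example; real engines weigh far more signals:

```python
import re
from html.parser import HTMLParser

class OnPageFactors(HTMLParser):
    """Collects a few of the on-page signals described above:
    title text, image ALT attributes, the META description,
    and the body text. A simplified sketch, not a real parser."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.alts = []
        self.meta_description = ""
        self.body = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "img":
            self.alts.append(attrs.get("alt", ""))
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        else:
            self.body.append(data)

def keyword_frequency(text: str, keyword: str) -> int:
    """Count whole-word occurrences of a keyword."""
    return len(re.findall(r"\b" + re.escape(keyword) + r"\b", text.lower()))

page = """<html><head><title>Easy Diet Tips</title>
<meta name="description" content="Diet and exercise advice"></head>
<body><img src="scale.png" alt="diet scale">
<p>A sensible diet beats any fad diet.</p></body></html>"""

parser = OnPageFactors()
parser.feed(page)
print(parser.title)                                      # Easy Diet Tips
print(parser.alts)                                       # ['diet scale']
print(keyword_frequency(" ".join(parser.body), "diet"))  # 2
```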

Now you understand why so many people say “content is king!”

However, for a large search engine like Google, website content alone is not enough to ensure that its customers see the most valuable search results and that websites receive the most accurate page rankings. The search engine giant therefore introduced its “PageRank” system, which also examines the number of inbound links to a website. In other words, it asks how many other websites on the Internet consider that site relevant to the interests of their own visitors, and therefore valuable to the interests of the search engine’s customers.
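The idea behind PageRank can be sketched in a few lines. The following Python is a simplified illustration of the published power-iteration method, not Google’s actual implementation, and the link graph and damping factor here are arbitrary examples:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Minimal power-iteration PageRank over a dict that maps
    each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[page] / n
            else:             # each link passes a share of the page's rank
                for q in outlinks:
                    new[q] += damping * rank[page] / len(outlinks)
        rank = new
    return rank

# A and B both link to C, so C earns the highest rank.
print(pagerank({"A": ["C"], "B": ["C"], "C": ["A"]}))
```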

As search engines have become larger and more powerful, webmasters have become better at working around their algorithms. Major search engines like Google therefore keep their specific algorithms closely guarded secrets. This makes it difficult for both amateur webmasters and search engine optimization services to predict accurately which technique or strategy will be the most successful at obtaining a higher page ranking on a given search engine.

However, some deductions can be made from the webpages and websites that do seem to achieve higher page rankings on Google and the other search engines.

Techniques such as choosing a relevant domain name, including important keywords and phrases in the title, displaying keywords in image ALT tags, emphasizing keywords in heading text, and placing them at the beginning and end of the page all appear to be important. Having a large number of inbound links from related sites is important, and so are internal links (the development and value of sitemaps is another important topic).

Most importantly, however, as search engine algorithms have expanded their capabilities (always, of course, based on the instructions provided to them), they have begun to approximate the viewpoint of a human visitor. Just as a person might ask, “Does this website make sense, and does it provide relevant information in an understandable way?”, search engines have become more and more interested in a website’s structure and content.

Search engine web crawlers have also become more and more proficient at finding their way to your website when others see fit to include a link to it from theirs. This is another reason why links from other web pages can be important, both for getting your website indexed in the first place and for helping it achieve a good page ranking.
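To see why inbound links matter for getting indexed at all, consider a minimal crawler sketch: pages are discovered only by following links from pages the crawler already knows about. The `fetch` function is injected here as a stand-in for real HTTP requests so the example stays self-contained:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Pulls the href of every <a> tag out of a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def crawl(start_url, fetch, limit=100):
    """Breadth-first discovery: a page with no inbound links
    from already-known pages is never found."""
    seen, queue = set(), [start_url]
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        extractor = LinkExtractor()
        extractor.feed(fetch(url))
        queue.extend(urljoin(url, h) for h in extractor.links)
    return seen

# Two pages link to each other; a third, unlinked page is never seen.
site = {
    "http://a.example/": '<a href="http://b.example/">b</a>',
    "http://b.example/": '<a href="http://a.example/">a</a>',
    "http://c.example/": "no links point here",
}
print(crawl("http://a.example/", lambda url: site.get(url, "")))
```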

Just as in the Internet boom of the last century (I just had to say that), the most common way to submit your website to the search engines for consideration is to fill out a form. You will notice, however, that these days the search engines ask you for less and less information about the site; they would rather gather it themselves. On the other hand, filling out the form does not guarantee that your website will be indexed immediately, or even soon… if at all. And whether from the perspective of the search engines or of human visitors, while the various SEO techniques are important, the quality of the content you provide to your visitors may be the best SEO method of all.
