SEO history
In the mid-1990s, as the first search engines began cataloging the early Web, webmasters and content providers started optimizing sites for them. Initially, a webmaster only needed to submit the address, or URL, of a page, and the various engines would send a “spider” to “crawl” that page and extract links to other pages. The process involves a search engine spider downloading a page and storing it on the search engine’s own server. A second program, known as an indexer, then extracts information about the page, such as the words it contains, where they appear, any weights assigned to specific words, and all the links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
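The crawl-then-index pipeline described above can be sketched in a few lines of Python. This is only an illustrative toy under simplifying assumptions (a single-threaded fetcher, a word-position index, naive link extraction), not any particular engine’s implementation.

```python
# Toy sketch of the crawl -> index -> schedule loop described above.
# Purely illustrative; real engines use distributed fetchers and far richer signals.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkAndTextParser(HTMLParser):
    """Collects anchor links and visible text from one page."""
    def __init__(self):
        super().__init__()
        self.links, self.words = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_data(self, data):
        self.words.extend(data.lower().split())


def crawl(seed_url, max_pages=10):
    index = {}                     # word -> list of (url, position)
    scheduler = deque([seed_url])  # URLs waiting to be crawled
    seen = set()
    while scheduler and len(seen) < max_pages:
        url = scheduler.popleft()
        if url in seen:
            continue
        seen.add(url)
        # "Spider" step: download the page and keep a copy.
        html = urlopen(url).read().decode("utf-8", errors="ignore")
        parser = LinkAndTextParser()
        parser.feed(html)
        # "Indexer" step: record each word and where it appears.
        for pos, word in enumerate(parser.words):
            index.setdefault(word, []).append((url, pos))
        # Extracted links go back into the scheduler for later crawling.
        for link in parser.links:
            scheduler.append(urljoin(url, link))
    return index
```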
Website owners recognized the value of high ranking and visibility in search engine results, creating an opportunity for both white hat and black hat search engine optimization practitioners. According to industry analyst Danny Sullivan, the phrase “search engine optimization” probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term. On May 2, 2007, Jason Gambert attempted to persuade the Trademark Office in Arizona that SEO is a “process” involving manipulation of keywords rather than a “marketing service.”
Early versions of search algorithms relied on information provided by the webmaster, such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page’s content. Indexing pages on the basis of metadata proved unreliable, however, because the keywords a webmaster chose for the meta tag might not accurately represent the site’s actual content. Inaccurate, incomplete, and inconsistent data in meta tags could, and did, cause pages to rank for irrelevant searches.
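For context, the keyword meta tag referred to above is a plain HTML element, and an early, metadata-trusting indexer could simply read it at face value. The sketch below is an assumption-laden illustration (the sample markup and the parser class are invented for this example), not any engine’s actual code.

```python
# Minimal sketch: trusting the keywords meta tag, as early engines did.
from html.parser import HTMLParser


class MetaKeywordParser(HTMLParser):
    """Pulls the content of <meta name="keywords" ...> out of a page."""
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "keywords":
            self.keywords = [k.strip() for k in a.get("content", "").split(",")]


sample = '<html><head><meta name="keywords" content="cheap flights, hotels"></head></html>'
p = MetaKeywordParser()
p.feed(sample)
print(p.keywords)  # ['cheap flights', 'hotels'] -- accepted as-is, whatever the page actually contains
```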
Web content providers also manipulated a number of attributes within a page’s HTML source in an attempt to rank well in search engines.
By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engines, and that some were even stuffing pages with excessive or irrelevant keywords to manipulate their rankings in search results. Early search engines, such as Altavista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.
By relying so heavily on factors such as keyword density, which were entirely within a webmaster’s control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from a heavy reliance on term density toward a more holistic process for scoring semantic signals.
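Keyword density, the signal described above as over-weighted, is simply the share of a page’s words made up by a given term. A minimal sketch (the whitespace tokenization and the sample strings are assumptions for illustration) shows why it is trivial to inflate:

```python
# Keyword density = occurrences of a term / total words on the page.
# Entirely under the page author's control, so trivially manipulable.
def keyword_density(text: str, term: str) -> float:
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(term.lower()) / len(words)


honest = "We sell handmade oak tables and chairs for your dining room"
stuffed = "cheap tables cheap tables cheap tables buy cheap tables now"
print(keyword_density(honest, "tables"))   # ~0.09
print(keyword_density(stuffed, "tables"))  # 0.4 -- repetition alone inflates the score
```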
Since the success and popularity of a search engine depend on its ability to produce the most relevant results for any given search, poor-quality or irrelevant results can drive users to other search sources. Search engines responded by developing more complex ranking algorithms that take into account additional factors that are harder for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was established to bring together practitioners and researchers concerned with search engine optimization and related topics.
Companies that employ overly aggressive techniques can get their clients’ websites banned from search results. In 2005, The Wall Street Journal reported on a company called Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients. Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban. Google’s Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.
Some search engines have also reached out to the search engine optimization industry, and are frequent sponsors and guests at SEO conferences, web chats, and seminars. The major search engines provide information and guidelines to help with website optimization. Google has a Sitemaps program to help webmasters learn whether Google is having any problems indexing their site, and it also provides data on Google traffic to the site. Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the “crawl rate,” and lets them track the index status of their web pages. In 2015, it was reported that Google was developing and promoting mobile search as a key feature of future products. In response, many brands began to take a different approach to their online marketing strategies.