
SEO 101 for beginners

Did you know that you can dramatically increase the number of daily visitors your site gets from Google?

Web content (volume + keywords) + links = SEO

Image text or JavaScript?

Question: Your navigation is built from images. What if those images are images of text?

A: Links carry a lot of weight with search engines, but search engines only read text. They like to match the subject of the target page against the text of the link pointing to it. Avoid using JavaScript for navigation.

Action: Use plain-text navigation. The text and its containing elements can be styled with CSS to look like images, given a background image, or at least arranged to look more attractive (see the sketch below). A fallback option is to make sure there is some text-based navigation elsewhere on the page.
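As an illustration, here is a minimal sketch of plain-text navigation styled with CSS; the class names, paths, and image file are hypothetical:

Code: HTML/CSS

<ul class="nav">
  <li><a href="/products/">Products</a></li>
  <li><a href="/about/">About</a></li>
</ul>

<style>
/* Text links styled to look like image buttons; spiders still read the text. */
.nav li a {
  display: block;
  width: 120px;
  height: 40px;
  line-height: 40px;
  text-align: center;
  background: url(/images/button.png) no-repeat; /* hypothetical image */
  color: #fff;
  text-decoration: none;
}
</style>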

Advantages: The search engine will be able to match the subject of the target page with its reputation (the link text); the higher the correlation, the better. Having real text on the page also helps visually impaired visitors who use text-only browsers, Braille output devices, or text-to-speech software.

Links and link relevance

Q: Do you want lots of links to your site?

A: While links can be good for both visitors and search engine spiders, not every link is valued equally. Visitors are unlikely to click on irrelevant links, and search engines dislike them. Reciprocal links (you link to me, I link to you) can be useful between related sites, but search engines give them less weight than one-way inbound links.

How to do it: Avoid link farms (pages of links created purely for the sake of linking), judge reciprocal link requests on the relevance of the other site, and keep your links on topic.

Takeaway: Spending time and thought on your link profile will ensure that your links grow naturally. Search engines dislike anything that looks artificial or that could be interpreted as a “search engine trick.”

Duplicate content

Question: A webmaster copies content from another website, or builds many identical or very similar websites. Is this effective?

Answer: Search engines apply duplicate content filters to their results. Although the wish to increase market share is understandable, thousands of similar websites are not the way to do it. The effort put into the secondary websites will be wasted.

What to do: Make sure that every page on each of your websites is unique. Searching for an exact match of a long string in a search engine will show whether the content appears elsewhere. If a webmaster has duplicate pages and wants to avoid penalties or filters, he or she should pick one page and redirect the other identical pages to it. Redirects are covered later. You can also pull in RSS feeds and mix them at random before each page is rendered. The benefit of RSS is that there is always fresh content, but only when you mix three or more feeds do you get a result that is unique rather than identical to every other website using the same feeds.
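As a rough sketch of such feed mixing (the feed URLs are placeholders, and this assumes PHP with the SimpleXML extension available):

Code: PHP

<?php
// Pull items from several RSS feeds, shuffle them, and print a short,
// randomized list so each page renders a unique mix.
$feeds = array(
    'https://example.com/feed-a.xml',
    'https://example.com/feed-b.xml',
    'https://example.com/feed-c.xml',
);

$items = array();
foreach ($feeds as $url) {
    $rss = @simplexml_load_file($url);      // false if the feed cannot be read
    if ($rss === false) {
        continue;
    }
    foreach ($rss->channel->item as $item) {
        $items[] = array(
            'title' => (string) $item->title,
            'link'  => (string) $item->link,
        );
    }
}

shuffle($items);                              // randomize the order
foreach (array_slice($items, 0, 10) as $i) {  // keep a handful per page
    echo '<li><a href="' . htmlspecialchars($i['link']) . '">'
       . htmlspecialchars($i['title']) . "</a></li>\n";
}
?>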

Gain: Your effort is not wasted, and time and money are saved. The RSS feeds keep the content fresh.

Multiple sites

Q: Continuing from the duplicate content point of view: a webmaster has a number of closely themed websites, all interlinked with each other and all hosted on the same IP address. Is this a good idea?

Answer: To search engines, this kind of link structure across numerous websites looks unnatural, and the sites may be penalized.

How to do it: Such a website structure can work if it is set up properly. This means that each site must reside on its own IP address, differing at least in the Class C range. IP addresses are discussed later.

Benefits: Increased market share and effective interlinking between the sites, without penalties.

Unfriendly URLs

Question: The URLs (web addresses) of some of your site’s pages look like domain dot com/product.php?id=2346&category=665&sid=f29a3483270cc10b3783706916216e3a

Answer: Search engine spiders are getting better at following and indexing URLs in this form, but not all of them manage it. Spiders are wary of being trapped in an endless loop in which the same page appears under 1,000 different URLs. Such addresses also put off your visitors, because something like mydomain.com/products/cars/boostvalves is far easier to remember.

How to do it: Use search engine friendly URLs. If you run the Apache web server (typically on Linux hosting) with mod_rewrite support, such URLs can be rewritten into a friendlier format, as sketched below.
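A minimal .htaccess sketch of such a rewrite might look like the following; the URL pattern and parameter names are hypothetical, and the script would need to look products up by these names:

Code: .htaccess

RewriteEngine On
# Map /products/cars/boost-valves to the underlying dynamic script
RewriteRule ^products/([a-z0-9-]+)/([a-z0-9-]+)/?$ product.php?category=$1&item=$2 [L,QSA]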

Harvest: A higher chance of being indexed by more search engines; with keywords in your URL, the probability of a click in the SERPs is also higher.

Redirects

There are two major kinds of redirect, based on the HTTP specification:

301 Moved Permanently

302 Moved Temporarily

You should use a 301 redirect whenever possible. This can be done in a server-side script, for example:

Code: PHP

header("HTTP/1.1 301 Moved Permanently");
header("Location: http://www.newdomain.com/");

If you run the Apache web server, the redirect can also be set up in an .htaccess file, for example as shown below.
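A single line in .htaccess (the paths and domain are placeholders) can issue a 301 for a moved page or for a whole site:

Code: .htaccess

# Redirect one page permanently
Redirect 301 /old-page.html http://www.newdomain.com/new-page.html

# Or redirect the whole site permanently
Redirect 301 / http://www.newdomain.com/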

IP addresses (Class C)

An IP address is the numeric address of a computer connected to the internet. For ease of reading it is divided into four parts, each called an octet because each contains 8 bits of data when written in binary form. An example is 62.85.68.114. In this example the Class C part is 62.85.68 (the first three octets), so if several sites are hosted on this IP they all share the same address, and if several IP addresses all start with 62.85.68 they all share the same Class C range. Since IP addresses are assigned per server, a server is usually set up so that all of its sites sit in one Class C range; this means the search engines can tell that sites sharing a Class C range are likely to be on the same server and owned by the same person. Your competitors may not be standing still; neither should you.
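To make the idea concrete, here is a small sketch (the function name is made up) that compares the Class C portion of two IP addresses:

Code: PHP

<?php
// Return the first three octets (the "Class C" part) of an IPv4 address.
function class_c($ip) {
    return implode('.', array_slice(explode('.', $ip), 0, 3));
}

var_dump(class_c('62.85.68.114') === class_c('62.85.68.200'));  // true: same Class C
var_dump(class_c('62.85.68.114') === class_c('62.85.69.114'));  // false: different Class C
?>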
