Not all search engine optimization is keyword research and link building.

The technical side of a site, from the structure of its content down to the cold, hard code, is just as important as link building and keyword optimization, and arguably more so. If your website is not friendly to search engines, or does not follow Google's Webmaster Guidelines, it will be penalized continually and will never reach its full potential.
According to Moz's 2015 search engine ranking factors study, these are the top 10 technical SEO ranking factors.
Simple URLs

URLs should be kept simple: short and without excessive hyphens. This is exactly what Google wants, because long URLs and heavy hyphen use have been shown to perform worse than short, simple ones. This makes sense, because Google continues to emphasize user-friendliness and user experience, and a short, easy-to-remember URL meets that standard.
Link-to-content ratio
We know that Google loves content. We also know that Google hates spammy links. This is why, if you have a lot of links on your website but not much content, Google may suspect you are running a link scheme and downgrade you. That is why it is good to keep your link-to-content ratio low, so that you do not raise any red flags.
Code-to-content ratio
Like the link-to-content ratio, the code-to-content ratio is best kept low. A lot of code with very little content raises a spam flag with Google, because it suggests the site is not really meant for users. Excess code will also significantly slow your page speed, which likewise hurts your ranking.
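There is no official Google formula for this ratio, but a rough sketch of how you might estimate your own text-to-code ratio, by stripping tags and scripts out of the raw HTML, could look like this (the sample page and any thresholds you pick are your own assumptions, not Google's):

```javascript
// Rough sketch: estimate a page's text-to-code ratio.
// Strips <script>/<style> blocks and tags, then compares the
// remaining visible text length to the full HTML length.
function textToCodeRatio(html) {
  var text = html
    .replace(/<script[\s\S]*?<\/script>/gi, '')
    .replace(/<style[\s\S]*?<\/style>/gi, '')
    .replace(/<[^>]+>/g, ' ')   // drop remaining tags
    .replace(/\s+/g, ' ')       // collapse whitespace
    .trim();
  return text.length / html.length;
}

// Hypothetical sample page, just for illustration.
var sample = '<html><body><h1>Hello</h1><p>Some real content here.</p></body></html>';
console.log(textToCodeRatio(sample)); // prints a ratio between 0 and 1
```

A higher number means more visible content relative to markup; a page that is nearly all markup will score close to zero.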
Google Analytics tracking code
According to the Moz study, websites with the Google Analytics tracking code installed performed better than sites without it. Perhaps this sends a signal to Google that the site is managed by a webmaster who is actively monitoring it, and is therefore more trustworthy.
For those who don't know, this is roughly what the classic (ga.js) Google Analytics tracking code looks like in HTML; the 'UA-XXXXX-X' account ID is a placeholder for your own property ID.

var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-X']); // placeholder property ID
_gaq.push(['_trackPageview']);
(function() {
  var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
  ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
  var s = document.getElementsByTagName('script')[0];
  s.parentNode.insertBefore(ga, s);
})();
robots.txt

A robots.txt file is important because it tells search engine spiders like Googlebot how to interact with the pages and files of your website. If there are pages, files, or images that you don't want Google to index, you can use robots.txt to block them. Without robots.txt, Google will index all the content on your site indiscriminately.
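A minimal robots.txt might look like this; the blocked paths and the domain are hypothetical examples, not a recommendation for any particular site:

```text
User-agent: *
Disallow: /admin/
Disallow: /private-files/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of the domain (e.g. example.com/robots.txt), and each Disallow line tells compliant crawlers to skip that path.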
HTTPS URLs

A secure website, meaning one with an SSL certificate, performs slightly better in the SERPs. This may be a signal to Google that your website is safe, secure, and authentic. Again, the better the user experience, the higher the ranking.
Sitemap

A sitemap is a page that maps out the nature of a site. It contains metadata and information about the site's organization and content. Googlebot and other search engine crawlers use sitemaps as a guide to crawl a site more intelligently. Having a sitemap helps get your pages indexed and lets you highlight the content you want search engines to crawl.
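A bare-bones XML sitemap, following the sitemaps.org protocol, looks something like this; the URL and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2015-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

You add one &lt;url&gt; entry per page and reference the file from robots.txt or submit it in Google Search Console.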
Schema markup

Schema markup is a way to change how a site's meta-information appears on a search engine results page. By using schema.org markup, you can modify the meta description under a search listing to display reviews, employee profiles, and other information. Appropriate schema markup for certain information can even place your content in the Google answer box, which is a reliable way to drive traffic to your site.

So, if you want your website to rank highly but aren't seeing much success, these factors may be what is holding you back, because they are the key to winning Google's trust. Make your website easy for Google to crawl, while ensuring your content stays search-engine friendly.
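As an illustration of the schema.org markup discussed above, structured data is often embedded in a page as JSON-LD. This sketch marks up a product with an aggregate review rating; all names and values here are made-up examples:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "27"
  }
}
```

Placed in a script tag of type application/ld+json, markup like this is what lets search engines show rating stars and similar rich details under your listing.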