Top 10 SEO practices webmasters should stay away from

Web promotion through SEO is a broad and complex field. Because search engine algorithms are complex and constantly changing, SEO advice is complex and constantly changing too.

 

For example, Google's ranking algorithm weighs hundreds of factors. Search engines keep these algorithms secret for two main reasons:

- They do not want competitors to know what they are doing.

- They do not want spammers or unscrupulous webmasters to abuse the details to gain high rankings.

Another reason professional SEO is complicated is that SEO theory and practice have changed rapidly in recent years. Tips that webmasters, web designers, and SEO experts applied in previous years no longer work today. Many questions and issues remain open, which keeps professional SEO mysterious.

Relying on the Meta keywords tag

This is the first taboo, for a simple reason: search engines no longer rely on Meta tags to determine the content of a web page. Instead, they analyze the content displayed to the user to determine what the page is about and how to classify and rank it. Text invisible to the user, such as Meta keywords, has carried almost no weight for several years because spammers overused it. However, some search engines still use the Meta keywords tag with very low weight. So put your key terms in the Meta keywords tag once, then forget about it.

Title tag - provides information to the user and is one of the best and most important SEO practices. It helps you significantly improve page rank.

Meta Description tag - a summary of the page's content. It does not directly improve page rank, but Google builds the snippet shown on the search results page from it, and Yahoo uses the description tag in its results in some cases. A good description increases click-through rate (CTR), so the Meta Description tag also contributes indirectly to the quality and ranking of your website.
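For reference, a minimal sketch of the three tags discussed above (the title, description, and keyword values are hypothetical placeholders):

    <head>
      <!-- Title tag: shown as the clickable headline in search results; important for ranking -->
      <title>Handmade Leather Wallets | Example Shop</title>

      <!-- Meta description: not a direct ranking factor, but often used as the result snippet -->
      <meta name="description" content="Hand-stitched leather wallets in classic and slim designs.">

      <!-- Meta keywords: very low weight today; fill it in once, then forget it -->
      <meta name="keywords" content="leather wallets, handmade wallets">
    </head>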

Cramming keywords into hidden text

This occupies second place because it can get your site penalized, banned, or removed from the index. Inserting keywords in tiny fonts, in text the same color as the background, outside the browser window, or hidden with HTML and CSS techniques is equally taboo. Google's algorithms are quite good at detecting these tricks, and punishment is unavoidable, especially now that fighting spam is a leading concern of the major search engines (Google, Yahoo).
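To make the taboo concrete, these are the classic hidden-text patterns detectors look for (an illustrative sketch of what not to do):

    <!-- Hidden-text tricks that search engines penalize; never use these -->
    <p style="color: #ffffff; background-color: #ffffff;">wallets cheap wallets buy wallets</p>
    <p style="font-size: 1px;">wallets cheap wallets buy wallets</p>
    <div style="position: absolute; left: -9999px;">wallets cheap wallets buy wallets</div>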

Buying links

This is a very popular approach among webmasters and SEO practitioners. The problem is that paid links distort the "natural" nature of linking and make search results less accurate for the user's query (note that a page's ranking depends heavily on the external URLs pointing to it). Search engines, especially Google, in their effort to keep results useful, work hard to combat the sale of links and treat it as a priority.

Matt Cutts, a Google engineer, has confirmed that Google's algorithms have become good at detecting traded links. Typically, Google uses three methods to identify purchased links:

- Searching for suspicious patterns, such as the words "advertising" or "sponsored" located near a link.

- Google has thousands of quality editors, many based in Asia, and some of them are certainly trained to detect and flag websites that buy or sell links.

- Google also provides a tool that lets users report paid links; the complaints are passed to its search quality team.

What does Google do when it discovers purchased links? The links are flagged and stop passing ranking value to the pages they point to. And if the sale was clearly aimed at inflating rankings, Google applies sanctions, such as lowering PageRank or even banning the website.

So spend your time and money more sensibly. Instead of shopping for links, look for worthwhile links that are relevant to your page's topic and provide useful information for users. Build a website rich in useful information or tools and you will earn "natural" links. That is what retains old users and brings in new traffic. This is the safe and lasting way.
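If you do carry paid or advertising links, Google's long-standing guidance has been to mark them with rel="nofollow" (and, more recently, rel="sponsored") so they pass no ranking credit; a minimal sketch with a hypothetical advertiser URL:

    <!-- A paid link marked so it passes no ranking value -->
    <a href="https://advertiser.example.com/" rel="nofollow">Sponsored: Advertiser Name</a>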

Fearing PageRank loss

A common misunderstanding among SEO practitioners is that when a website links to external sites, its PageRank is "split" and "leaks" away to those sites. But the world has changed: PageRank is now only one conventional indicator among many used to rank web pages.

So go ahead and link out to pages with related content; doing so enhances the credibility of the information on your own page.

Joining link exchange schemes

This trick is quite old and no longer works at all. Search engines want links to keep their "natural" essence: citations made because users need the information or tools behind them. Reciprocal link exchanges betray their artificial nature and are very easy to detect.

Do not waste time participating in link exchange schemes built on this simple trick. Link building is still very important work when the pages in the link graph are genuinely useful to the user. Build links to sites that share your theme and serve your visitors. It is even better, of course, when topically related pages link to your website without requiring a link back.

Duplicate Content

Duplicate content arises in two main ways:

- Some webmasters deliberately create doorway pages: pages with similar, or even identical, content to the original page, presented in a variety of ways to promote the company's products or services.

- Often, on the same website, the same content appears at many different locations (different URLs). For example, a blog post can be reached through the article's own link, a category page, the archive, the RSS feed, and the home page.

The problem is that Google always tries to offer searchers a wide choice of content, so it picks out just one page from a set of duplicates. Duplicate content therefore wastes the search engine's time and your web server's bandwidth. And sometimes the version displayed on the results page is not the version you want users to reach.

What should you do about duplicate content? Review the cases above and find ways to reduce them. In addition, there are tools that let you expose only the main version for indexing while excluding the minor versions, as shown below.
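One widely supported tool for this is the rel="canonical" link element, which names the preferred URL for a set of duplicates; a minimal sketch with hypothetical URLs:

    <!-- On each duplicate URL (category page, archive, print view), name the preferred version -->
    <link rel="canonical" href="https://www.example.com/blog/original-post/">

    <!-- Or keep a minor version out of the index entirely while still letting its links count -->
    <meta name="robots" content="noindex, follow">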



Using Session IDs in URLs

Google indexes web pages continuously; how often Googlebot visits depends on a page's ranking and how frequently the site is updated. For a website, ranking high is a long, persistent job. Also, Google and other search engines prefer static web pages, and the parameters at the end of a URL are treated by search engines as part of the URL.

If your URLs carry a session ID parameter, spiders are likely to fall into an infinite loop while indexing: on each visit they are assigned a new session ID, and Googlebot sees the same page as a new one. Session IDs thus create duplicate content, as described above. Google wastes time indexing in vain while you pay for the extra bandwidth, and the session IDs drag down your page rankings.

Although Google's algorithms have improved significantly at handling session IDs, you should still store the session in a cookie instead of a URL parameter. Remember that only about 2% of users browse with cookies disabled.

Also try to create friendly URLs (with keywords in the URL), for example using mod_rewrite rules in an .htaccess file, as sketched below.
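A minimal .htaccess sketch for Apache's mod_rewrite (the path and parameter names are hypothetical):

    # Map a friendly, keyword-rich URL to the real dynamic script:
    # /products/leather-wallets -> /index.php?category=leather-wallets
    RewriteEngine On
    RewriteRule ^products/([a-z0-9-]+)/?$ index.php?category=$1 [L,QSA]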

Websites built entirely in Flash

Visually, a website built entirely in Flash can be very eye-catching, but it will certainly struggle to rank high in search engines, for the simple reason that Google likes text. So present your page's content as text and limit Flash to providing visual effects.
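One common pattern is to keep the real content in indexable HTML and embed Flash only for decoration, with fallback text inside the object element (file names and text are hypothetical):

    <!-- The main content stays as crawlable HTML text -->
    <h1>Handmade Leather Wallets</h1>
    <p>Our wallets are cut and stitched by hand from full-grain leather.</p>

    <!-- Flash used only for a decorative banner; the fallback text remains indexable -->
    <object data="banner.swf" type="application/x-shockwave-flash" width="600" height="120">
      <p>Spring sale: 20% off all wallets.</p>
    </object>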

Excessive use of JavaScript

JavaScript can be very effective in website design. The problem is that Google has difficulty understanding JavaScript source code. Google has put, and will keep putting, more effort into this, but heavy JavaScript will remain a poor way of communicating with search engines.

For optimization, SEO practitioners keep JavaScript separate: where you do use it, move it into an included external file, or use CSS instead, rather than filling the header or body of the page with script. Help the machines understand and index the main content of the page easily, and everyone benefits.
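A minimal sketch of that separation (the file names are hypothetical):

    <head>
      <!-- Scripts and styles referenced externally instead of inlined in the page -->
      <link rel="stylesheet" href="styles.css">
      <script src="menu.js" defer></script>
    </head>
    <body>
      <!-- The main content stays as plain, crawlable HTML -->
      <h1>Handmade Leather Wallets</h1>
      <p>Our wallets are cut and stitched by hand from full-grain leather.</p>
    </body>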

Cloaking Techniques

Cloaking is a "black hat" SEO technique that shows search spiders different content from what ordinary visitors see. It is an old trick that spammers used heavily in previous years.

Search engines today detect the scam easily, for example by regularly sending out unannounced spiders with new signatures specifically to catch cloaking. There are too many cloaking tricks to list within the limits of this article, but they are all detected sooner or later. This is a "black hat" technique to be avoided.

When cloaking is detected, the offending pages are banned. So do not use this technique; solve your problem with other methods instead.

Conclusion

Webmasters and SEO practitioners should keep two things in mind when applying SEO tips:

- Learn how search engines operate so you can help them understand the content of your web pages. What all the mistakes above have in common is that they make it difficult for search engines to index and identify page content. So build a website that cooperates with the search engines and gives them unique content.

- Do not waste valuable time trying to fool the search engines. Their algorithms are more than smart enough to detect tricks, not to mention the human reviewers handling spam reports. Even if a trick slips past their eyes, the gain is only temporary, and the cost when you are caught will be far higher.

Fooling the search engines never lasts long. Put your time, effort, and money into content, useful tools, and the kinds of promotion you would do even if search engines did not exist.


