The Basics of SEO: Search Engine Optimization

Introduction

SEO (Search Engine Optimization) is the set of techniques used to improve a website's positioning in search engines, especially Google and Bing. The goal of these techniques is essentially to increase a website's traffic by positioning it on the first pages of organic results (that is, the search results that are not sponsored links).

That is the topic of this article. First, we will look at some basic techniques involving content optimization. Next, we will cover basic techniques involving the planning and publication of a website. Then we will examine the external factors that influence the site's positioning. Finally, for awareness of the risks, we will discuss the dangerous techniques known as black hat SEO, which can get your site penalized or even banned from the search engines.

To make the best use of this text, the reader is expected to have basic knowledge of HTML and of publishing websites.

How indexing occurs

New pages are added to a search engine essentially when it “crawls” the sites already indexed, looking for changes on their pages and for new links. New sites can also be indexed through link analysis, although in these cases it is also possible to “register” recent sites through specific pages of the search engine (Google's is here). In the case of Google, one factor that helps is PageRank (PR): in general, the higher this value, the more frequently Google crawls the site.

PageRank is the value Google uses to determine the importance, or relevance, of a page. The higher it is, the better. The best way to increase a site's PR is to get other sites with equal or higher PR to link to it. That is why practically all the techniques in this series of articles revolve around publishing content that is relevant and well organized. It is by generating useful content for users that your website will receive more links and improve its positioning naturally.

We use Google as the example in this series of articles because it is the most used search engine today, but the fundamental rules of SEO are virtually the same for all search engines. Just as Google uses PageRank as one of its criteria for displaying relevant results to users, other search engines have equivalent systems that do the same.


Taking care of the content

SEO is currently an important marketing strategy. In addition to considering how search engines work, SEO takes into account what people are searching for. Therefore, the main focus should be the content of the site.

When you put a site online, you naturally expect people to find it. Thus, your greatest concern should be to provide useful, relevant, and well-organized content. Search engines update their ranking criteria with some frequency, increasingly prioritizing the results that bring users what they actually search for.

The content of the <title> tag is used by search engines as the title of the result.

So each page within the website should have a unique title that describes its content properly and differentiates it from the others. The title should be clear and contain the page's main keywords. Titles with more than 64 characters are cut off by Google, so it should also be short.
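For example, a hypothetical product page might use:

  <title>Handmade leather wallets - Example Store</title>

The main keywords come first, and the whole title stays comfortably under the 64-character limit.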

Pay special attention to the title of the home page, avoiding expressions such as “welcome to the site” (you can give visitors a warm welcome as an introduction in the text of the home page itself). If you use a visual HTML or Flash editor to create your website, don't forget to change the default title (putting a page online with the title “untitled page” or “page-1” is unforgivable, not only from the point of view of SEO but also of usability).

Heading tags – <h1>, <h2>, <h3>, <h4>, <h5>, and <h6> – should be used to organize your text into topics, making it easier for the visitor to find what he is looking for without having to read the whole page, especially in very long texts. So your headings should be clear and objective.

Make the <h1> appear at the beginning of your HTML code, as close as possible to the <body> tag, and preferably only once. Whenever possible, use the page's main keywords in it. The <h2> can then be used as a brief description of the content, with the secondary keywords.

Keep the hierarchy between the headings – an <h2> must come after an <h1>, never the other way around. If you think the <h1> headings look too large, instead of using an <h2> just because of the size, give the <h1> the ideal appearance for your layout using CSS.
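A minimal sketch of this structure (the content here is hypothetical), keeping the size adjustment in the style sheet:

  <body>
    <h1>Handmade leather wallets</h1>
    <h2>Wallets sewn to order from full-grain leather</h2>
    <h3>How our wallets are made</h3>
    <p>...</p>
    <h3>Caring for your wallet</h3>
    <p>...</p>
  </body>

  /* In the CSS: shrink the <h1> instead of demoting it to an <h2> */
  h1 { font-size: 1.5em; }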

Keywords within the text

Just as your site's main keywords should appear in the headings, it is important that they also appear in the text, of course. However, do not overdo the repetition; try to use synonyms, which makes your text more pleasant and gives the search engines some alternative keywords.

Within the text, you can give greater weight to those key expressions by making them bold (<strong>) or italic (<em>). Note that for some search engines, the <b> tag for bold and the <i> tag for italic are considered purely aesthetic and do not increase a keyword's relevance.
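For example (the sentence itself is hypothetical):

  <!-- <strong> and <em> carry semantic weight; <b> and <i> are purely visual -->
  <p>Our <strong>handmade leather wallets</strong> are sewn with <em>waxed linen thread</em>.</p>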

Using underlining in texts is not recommended, because it can make the user confuse the underlined words with links.

Taking care of the home page

When SEO starts to take effect, many visitors may enter your site directly through internal pages, but the home page remains the most important page of your site. Not only is it your business card for anyone who accesses your site directly, it is also the first page to be indexed by the search engines.

The text of the home page should be a summary of everything available on the site, using the main keywords – but without exaggeration: the text should remain short, pleasant to read and, above all, make sense. Do not forget that the search engines will visit your site, but it is the human visitors who will determine its relevance.

Another precaution regarding the home page: all the internal pages should be reachable from it so that indexing is done correctly. The links should be descriptive and direct (without the use of JavaScript). If you use a Flash menu, consider replacing it with a CSS-based menu that achieves the same visual effects; if there is no other alternative, place text links in the footer of the page, both for the search engine robots and to ensure usability for visitors who, for whatever reason, cannot use an animated menu.
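A sketch of such a text-link footer (the page names are hypothetical):

  <div id="footer">
    <a href="/products/">Products</a> |
    <a href="/about/">About us</a> |
    <a href="/contact/">Contact</a> |
    <a href="/sitemap/">Site map</a>
  </div>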

If your site has many internal pages, you can split your content into sub-pages (such as categories or sections) and link your internal pages within them. An alternative is to create a site map: a page with a properly organized list of all the internal pages of your website. This helps not only the search engines but also visitors who want to find a particular page on your site quickly. Ideally, the link to the site map should appear on every page of the site, not just on the home page, so that visitors who entered your site directly through an internal page can navigate from there and find what they are looking for.

Optimization of images

Like the texts of a site, the images are part of its content and can also be optimized for the search engines – after all, virtually all of them offer the option to search only among the images of the indexed sites.

Just as the text content must be relevant and natural, pictures follow the same rule. When positioning an image, place it close to the text to which it refers – both in the layout and in the HTML code. Here it also helps to optimize the file name: por-do-sol.jpg (“sunset”) is a better name than img0001.jpg.

Use the alt attribute to add an alternative text to the image with some keywords related to it. Keep in mind that the function of this attribute is to provide an alternative description, both for the search engines and for the screen readers used by the visually impaired. Do not use a meaningless list of keywords: that is considered black hat SEO, a technique to “cheat” the search results that can get your site penalized or even banned. This subject will be discussed in more detail at the end of this article.
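Continuing the por-do-sol.jpg example above (the descriptions are hypothetical):

  <!-- Good: a short, natural description of what the image shows -->
  <img src="por-do-sol.jpg" alt="Sunset over the beach, seen from the boardwalk">

  <!-- Bad: a meaningless keyword list – this is black hat SEO -->
  <img src="por-do-sol.jpg" alt="sunset beach sunset photo free sunset wallpaper sunset">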

If your site has an image gallery, such as a product catalog, for example, avoid publishing those photos in Flash. Besides being easier to update, a gallery in pure, semantic HTML is one more way to make the search engines find your content. Likewise, if you use some JavaScript feature to add “effects” to your images, check that the HTML code it generates is semantic.

Likewise, if you are still planning a new site and your domain has not been registered yet, try to include the main keyword in the address.

If you are using a content manager to publish your site, such as WordPress, Drupal or Joomla, change the permalinks of the internal pages. An address like www.site.com.br/?page_id=2 is not very good from the point of view of SEO.
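In WordPress, for example, this can be changed under Settings → Permalinks; with a structure such as /%postname%/, a hypothetical internal page goes from the query-string form to a descriptive address:

  Before:  www.site.com.br/?page_id=2
  After:   www.site.com.br/about-us/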

Meta-tags

Meta-tags have already been the subject of a series of articles here on AbbreviationFinder (Meta Tags – what they are and how to use them). Since then, they have lost some of their weight with the search engines but, even so, they have not ceased to have some importance.

The description meta-tag, for example, is used to describe a site. Like the title of the page, it appears in the search result, so it should be clear and brief (a maximum of 160 characters). If possible, write a different description for each page.

The keywords meta-tag, in turn, is used to reinforce the keywords of the page. It is important to keep it consistent with the text and to use only the most relevant terms. The excessive use of keywords is considered black hat, that is, the inappropriate use of SEO (this issue will be addressed in more detail at the end of this article).
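Inside the HTML, the two meta-tags sit in the <head> and might look like this (the wording is hypothetical):

  <head>
    <title>Handmade leather wallets - Example Store</title>
    <meta name="description" content="Handmade leather wallets, sewn to order and shipped nationwide. See models, prices and care tips.">
    <meta name="keywords" content="leather wallets, handmade wallets, men's wallets">
  </head>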

The organization of the source code

Clean, semantic (and preferably validated) source code is also very important for SEO. Use the HTML tags correctly, for the purposes for which they are intended. If you have to set the font for a text, give preference to basic fonts available on virtually all computers, instead of resorting to alternatives such as putting the words in an image or in a Flash file, for example.

Use an external CSS style sheet to define the layout of the site, leaving in the HTML only the markup for the content. Likewise, use external JavaScript files whenever possible: script code is not interpreted by the search engines and gives the impression of “dirt” in the HTML.

Sitemap

Unlike the site map mentioned earlier, which is intended to facilitate the user's navigation, a sitemap is an XML file with all the links of the site, meant solely for the search engines. Through it, the indexing of the internal pages becomes faster and easier for the search engines. In addition, it is a resource used by tools such as Google Webmaster Tools to scan the pages of your site in search of errors.

This XML file (which usually receives the name sitemap.xml) should preferably reside in the root of the site. If you have no knowledge of XML, there are online tools that create this file automatically, such as XML Sitemaps (free for sites with up to 500 pages). For sites that use content managers, there are plugins that assemble and publish the sitemap automatically; WordPress, for example, counts on the Google (XML) Sitemaps Generator for WordPress plugin.
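The format itself is simple; a minimal sitemap.xml with a single entry (the URL and date are hypothetical) looks like this:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://www.site.com.br/</loc>
      <lastmod>2010-05-01</lastmod>
      <changefreq>weekly</changefreq>
      <priority>1.0</priority>
    </url>
  </urlset>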

robots.txt

Like the sitemap, the robots.txt file is meant only for the search engines and is designed to guide the indexing of the site, with no direct effect on visitors. Through it, you can determine which pages and directories may (or may not) be indexed.
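For example, a robots.txt that admits every search engine but blocks one folder and one file:

  User-agent: *
  Disallow: /js/
  Disallow: /temp.html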

The first line, User-agent: *, informs that all search engines are allowed on the site. The Disallow command, in turn, blocks indexing of certain folders or files: in the example, the /js/ folder and the /temp.html file. Use one line for each folder or file to be blocked, with its full path from the root, beginning with a slash.

The robots.txt file can be created in a simple text editor and must be published in the root of the site. An alternative to the robots.txt file is the use of the robots meta-tag.

Getting the search engines to crawl your site

After applying all the basic SEO techniques to your site, it is time to help the search engines find you.

As already mentioned, the search engines scour the existing sites looking for changes in the pages already indexed and for new links to index. This process is known as crawling. As a rule, the higher the PageRank (PR) of a website, the more frequently this scan is performed on it. Therefore, the ideal is to get a site that is already indexed and has a good PR (at least 3 or 4) to point a link to yours.

There are several ways of getting links to your site: if you have just created a blog, for example, you can register on sites like Technorati or Digg, where you publish your posts and receive evaluations of them. Here again it is worth remembering that the most important thing is to produce useful, relevant and well-organized content. If your site has a polished layout (both visually and in the code), you can submit it to CSS galleries such as CSS Beauty. There are also several galleries focused solely on websites developed on WordPress, Drupal or Joomla.

As a last resort, you can register your URL directly with the search engines. Google, Yahoo! Search and Bing provide a form to submit your URL, but some SEO experts recommend using this feature only if no other website is pointing to yours.

External factors

SEO does not depend only on the site itself. The way it relates to other sites also influences – and a lot – its position. Here are some important external factors:

– How many sites link to yours: the more links from different addresses your site receives, the more popularity it gains with the search engines. And if those pages deal with a subject related to your website and share its keywords, this popularity increases even more;

– What the PageRank of those sites is: PageRank gives credibility to a website and, consequently, to the sites it links to;

– How long ago the link was posted: older links tend to be more reliable. The same goes for the age of the sites pointing to yours: older sites generally carry more relevance;

– The text used in the link and around it: when a link points to a page on your site, it is important that the link text contains some keyword related to that page. A classic example is searching for “click here” in Google, which returns the Adobe Reader site. This happens because a huge number of links point to the tool's website with the text “click here to download”, placed on pages that offer a document in PDF format. The ideal in these cases would be to use the text “Download Adobe Reader” in the link, as in the sketch below.
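In HTML terms, the difference between the two links (the URL is illustrative):

  <!-- Weak: the anchor text says nothing about the destination -->
  <a href="http://get.adobe.com/reader/">click here to download</a>

  <!-- Better: the anchor text describes the page being linked -->
  <a href="http://get.adobe.com/reader/">Download Adobe Reader</a>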

This is just the basics; the full list of external factors that may influence the positioning of a website is huge. You have very little control over who will link to your site and how that link will be made but, as already noted here, the best way to get quality links is by publishing useful, relevant content. That way your pages end up naturally linked by others, including on social networks such as Twitter and Facebook.

Link exchange and partnership requests may not be beneficial

Many sites propose partnership systems or link exchanges to try to improve their positioning in the search engines, but this may not bring results of any kind.

Take, for example, those huge lists of banners with links that are so common on blogs: for Google, links in images can be considered advertising, and that type of material is ignored for SEO purposes. In addition, as we have seen, the more links on a page, the lower the relevance of each one of them. Let's not forget that links between sites dealing with the same subject carry more weight for SEO than, say, a cooking blog exchanging links with a comics blog and a design blog. So the best approach is for the exchange of links between websites and blogs to happen naturally, driven by the quality of the texts and the affinity of the content, and not just to “make up the numbers”.

Another common error is to imagine that spreading comments like “cool, visit my blog” on other blogs – content-related or not – benefits the SEO of our own blog because we leave a link along with the comment. This practice is useless: most blog systems add the rel="nofollow" attribute to those links, which tells the search engines: “this link is here, but it has no relevance to me, so do not follow it”. In addition, this type of comment adds nothing and tends to be deleted on many blogs.
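This is how such a comment link typically ends up in the page's HTML (the blog address is hypothetical); the rel="nofollow" attribute strips it of any SEO value:

  <a href="http://www.meublog.com.br/" rel="nofollow">cool, visit my blog</a>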

Commenting on other blogs is a good practice to make your site known, but for that the ideal is to leave a relevant, friendly comment that adds something to the post and that arouses, in the author or in the readers of that page, an interest in visiting your blog. If they like what they see, they may eventually link to you in a post (which has much more value than a banner). It is best to always follow common sense: if you have nothing to add with your comment, do not comment, so as not to harm the image of your blog or site.

Black hat SEO

The set of techniques known as black hat SEO comprises ways to “cheat” the search engine results so that a website appears among the first results. But since most of these results offer useless content to the visitor, and since the search engines' algorithms keep improving, these techniques end up being detected. So, instead of producing good results, their use causes the site to be penalized or even banned from the search engines. We discuss the main black hat SEO techniques here precisely so that you can avoid them:

– Hidden text: through CSS or JavaScript, some sites repeat the keywords several times throughout the text, setting that snippet to the same color as the page background so that the visitor does not see it. There are also cases in which the text is hidden by placing another element over it via CSS;

– Keyword stuffing in images: since the keywords inside the alt attribute of images are quite relevant to SEO, some websites repeat the keyword several times within it. Besides getting your site penalized in the search engines, this badly hurts usability, since the attribute exists to provide an alternative description of the image, both for the search engines and for the visually impaired who use screen readers, or even for visitors browsing the site with image loading disabled;

– Keyword stuffing in the <title> tag: the repetition of the main keywords in the title of the page. The title should be used to describe the content. Again, besides getting the site penalized, this technique goes against the principles of usability;

– Keyword stuffing in meta-tags: in the description or keywords meta-tags, avoid the unnecessary repetition of keywords. An exaggerated number of meta-tags also brings no good results;

– Doorway page: this technique creates a front page built exclusively for the search engines, without any useful content for the user. Sometimes the visitor is redirected by a script to another page, or several advertising pages are loaded at the same time;

– Cybersquatting: an example of this technique can be observed when you try to access the Microsoft website by typing the address wrong, without the letter “f”: microsot.com. Knowing that many users might make this typing mistake, Microsoft registered that domain and redirected it to Bing, its search tool. Some people register misspelled versions of famous sites or of their competitors to capture more visitors. Since it does not offer the visitor what he is looking for, it is not a good practice (except when practiced as in the Microsoft case, which takes the visitor exactly to the site he seeks);

– Link farm: groups of multiple sites that link to each other with the sole purpose of increasing the number of references among them. Most of the time they also use some variation of keyword stuffing, and they end up taking the visitor nowhere;

– Noscript: the <noscript> tag is used to provide alternative content when a script cannot be executed, either because the browser does not support it or because the visitor is accessing the site through a screen reader. However, many people use this tag to display “alternative” content stuffed with repeated keywords, because the search engines also read the contents of <noscript>. As with most of these techniques, besides harming your site it goes against the principles of usability (a legitimate use is sketched below).
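For contrast, a legitimate use of <noscript> offers the visitor a real alternative instead of a pile of keywords (the snippet is hypothetical):

  <script src="photo-gallery.js"></script>
  <noscript>
    <!-- Shown only when the script cannot run: the same content, in accessible form -->
    <p>This gallery requires JavaScript. You can also <a href="/photos/">browse the photos as a simple list</a>.</p>
  </noscript>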

Besides penalizing the site that uses black hat techniques, the search engines also penalize the sites that link to it. For this reason, always check the addresses linked on your site with some frequency to make sure none of them uses this kind of trick. Otherwise, your site may end up penalized without you ever knowing or finding out the reason.

If you notice that your site has disappeared from the search results, you can check whether it was banned from Google, or simply penalized, by searching for the expression site:www.seusite.com.br (obviously using the address of your own site). If the search engine returns no results, the site was banned. It is possible to file a reconsideration request with Google to try to get your site back into the search engine.

Wrapping up

As you can see, the basic techniques of SEO can be implemented on your website easily, especially if this aspect is considered from the planning stage onward. By making content the focus of your pages and keeping the user experience pleasant, it will be much easier to stay well positioned in the search engines.

To finish, a tip: you can also learn more about how to improve the indexing of a site by consulting Google's Search Engine Optimization Starter Guide (PDF).
