Saturday, July 22, 2006

Basic tips for web optimization.

1. Insert keywords in the meta keywords tag.

2. Insert keywords in the meta description tag.

3. Insert keywords in the title tag.

4. Insert keywords in the title attribute of your anchor tags.

5. Insert keywords in the alt attribute of your image tags.

6. Work keywords into the phrases of your body content.

7. Insert keywords in the H1, H2...H6 heading tags.

8. Emphasize the keywords in your body content with bold, italics, or a highlight color such as red (see the example markup below).
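
As a quick illustration of the tips above, here is a minimal sketch of an HTML page; the keyword "blue widgets" and the file names are invented for the example:

<html>
<head>
<title>Blue Widgets - Buy Blue Widgets Online</title>
<meta name="keywords" content="blue widgets, buy blue widgets, widget shop">
<meta name="description" content="Buy quality blue widgets online at low prices.">
</head>
<body>
<h1>Blue Widgets</h1>
<p>Our <b>blue widgets</b> are built to last.
<a href="blue-widgets.html" title="blue widgets">See our blue widgets</a>.</p>
<img src="widget.jpg" alt="blue widget">
</body>
</html>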

Visit us:
Top SEO Resources

Top Keyword Analysis Tools

1. Overture Keyword Suggestion Tool

Overture provides a free tool that lets you see how many times a search term was used on the Overture network during the previous month.
http://www.inventory.overture.com

2. http://www.nichebot.com

Niche-Bot is an excellent niche research tool that allows you to quickly and easily find out how often a specific keyword has been searched across the Net.

This website displays keyword data using Wordtracker, Overture, and Google search results.

3. http://www.mcdar.net/KeywordTool/keyWait.asp

This is another excellent keyword analysis tool. When you enter a URL and a keyword, it displays the PageRank and backlink counts for the top 10 websites.

4. Wordtracker keyword analysis tool

Wordtracker gathers information about what people are searching for from meta search engines such as mamma.com, metacrawler.com, and dogpile.com. This tool is not free.

http://www.wordtracker.com/

5. digitalpoint.com/tools/keywords

This is a free keyword tracking tool used to analyze top keywords and track their positions in Google.

6. FindWhat.com

FindWhat.com has a Keyword Center that operates much like the Overture tool.

7. Google Trends keyword analysis tool

Google Trends is a tool from Google Labs for keyword analysis.
http://www.google.com/trends

For more about SEO resources:
Top SEO Resources

Tuesday, July 18, 2006

What is PPC?

PPC:

1. Pay per click is also known as pay per ranking, pay per placement, or pay per position.

2. A pay per click (PPC) listing is a paid listing where the site's owner pays every time the listing is clicked.

3. When you pay a search engine for each visitor they send to your site, this is called pay per click (PPC).

4. Pay per click listings are usually marked as advertisements or "Sponsored Links" and appear above or alongside the search engine's organic listings.

5. In pay per click services, you pay only when a searcher clicks on your listing and connects to your site.

6. Pay per click search engines allow you to get text ad listings and keyword specific coverage in major search engines rapidly, and pay only for the traffic delivered.

7. Pay per click services are an effective way of bringing targeted traffic to your site, and they also offer a strong return on investment.


There are two major players in this field:

1. Overture, which has network partnerships spanning MSN, InfoSpace, AltaVista, AllTheWeb and many other partners.

2. Google AdWords, which has a larger distribution network across Google, AOL, About, Earthlink, etc.

Why use PPC services?

1. Today PPC services have the best ROI as compared to other marketing channels.


2. Your competition may be using PPC services. So in order to outdo your competition the use of PPC services is essential.

3. Pay per Click increases your sales.

4. Whether your needs are short term or long term, using Pay per Click can bring you more targeted traffic.

HOW DOES PPC ADVERTISING WORK?

1. You bid for certain keywords that describe your site and the products or services you offer.

2. When a prospective buyer searches with these keywords, your listing is shown. You pay only when someone clicks on the listing. The money is well spent if you have written a specific description of your product or service, because the people who click are already interested in buying it.

3. Good targeting and timing are as important as finding the right keywords. Timing is especially important if your budget is limited.

4. Pay per click services differ from optimization services because you can bid for hundreds of keywords if you want (a simple cost example follows this list).
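
As a simple, purely hypothetical cost illustration (the bid, click, conversion and profit figures are invented for the example):

Bid per click:     $0.10
Clicks received:   500
Total cost:        500 x $0.10 = $50
Conversion rate:   2%, i.e. 10 sales
Profit per sale:   $20
Gross profit:      10 x $20 = $200
Net result:        $200 - $50 = $150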

The advantage of using the large pay per click search engines is that you are guaranteed good traffic and that your business model is scalable. Smaller engines provide slower feedback loops, and some may not even provide quality traffic. Pay per click is ideal as a short-term strategy because of the immediate results and complete control it gives you.

For more about SEO:
SEO Resources

What is CSS?

What is CSS?

1. CSS stands for Cascading Style Sheets.

2. CSS is the technology used to control the layout of web pages.

3. CSS allows you to change all of the web pages that link to a style sheet at once by editing a single style rule, instead of manually changing every style in every HTML file.

4. This saves you a lot of time, to say the least, especially if you have a large site or multiple sites.

Syntax

The CSS syntax is made up of three parts: a selector, a property and a value:

Selector {property: value}

The selector is normally the HTML element/tag you wish to define, the property is the attribute you wish to change, and each property can take a value. The property and value are separated by a colon, and surrounded by curly braces:

body {color: black}
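
For example, here is a minimal sketch of a shared style sheet and the line that links it from each page; the file name styles.css and the rules are invented for the example. Changing a rule in the one .css file changes every page that links to it:

/* styles.css */
body {color: black; font-family: Arial, sans-serif}
h1 {color: navy}

<!-- in the head of every HTML page -->
<link rel="stylesheet" type="text/css" href="styles.css">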

Cascading Style Sheets advantages:

1. The content is separated from the design.

2. Your site uses less bandwidth and loads faster.

3. Your website can gain better search engine results, because the markup is cleaner and lighter.

4. CSS is compatible with newer browsers.

5. CSS can be used to display the same content on different media.

So CSS gives you great control over your website and makes your visitors happier when they are browsing it.

Visit us:
SEO Resources

Monday, July 17, 2006

What is Atom?

1. Atom is the name of a specific web feed format.

2. Atom defines a web feed format for representing, and a protocol for editing, Web resources such as weblogs, online journals, wikis, and similar content.

3. Atom is a way to read and write information on the web and to share your content and ideas by publishing them to the web.

4. Atom is an XML-based document format that describes lists of related information known as "web feeds".

5. Web feeds in general provide web content or summaries of web content together with links to the full versions of the content, and other meta-data in a developer-friendly standardized format.

6. Web feeds are composed of a number of items, known as "entries", each with an extensible set of attached metadata. For example, each entry has a title.

7. The primary use of Atom is the syndication of Web content such as weblogs and news headlines to Web sites as well as directly to user agents.

Atom, from a technical perspective, is an open standard that includes both:

* an XML-based web syndication format used by weblogs, news websites and webmail services, and

* a simple HTTP-based protocol, known as the Atom Publishing Protocol (APP for short), for creating and updating Web resources.
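
To make the format concrete, here is a minimal sketch of an Atom 1.0 feed with a single entry; the titles, URLs and IDs are invented for the example:

<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example Weblog</title>
  <link href="http://example.org/"/>
  <updated>2006-07-17T12:00:00Z</updated>
  <author><name>Example Author</name></author>
  <id>urn:uuid:60a76c80-d399-11d9-b93c-0003939e0af6</id>
  <entry>
    <title>First post</title>
    <link href="http://example.org/2006/07/first-post"/>
    <id>urn:uuid:1225c695-cfb8-4ebb-aaaa-80da344efa6a</id>
    <updated>2006-07-17T12:00:00Z</updated>
    <summary>A short summary of the entry.</summary>
  </entry>
</feed>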

More about SEO:

http://www.halfvalue.com/top-articles/seo-resources.html

Why? - Cloaking

What is Cloaking:

1. Cloaking is a technique that is used to display different pages to the search engine spiders than the ones normal visitors see.

2. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of whatever is requesting the page.

With cloaking, one can create two sets of pages:

1. The first for search engine spiders,

2. The second for regular human visitors.

This enables you to retain the good look and feel of the site for humans, while still being able to show highly optimized pages to the spiders and thus generate a good amount of traffic from the search engines. Cloaking also prevents humans from seeing what kind of optimization techniques you are using and stealing your optimized pages.

When a person requests the page of a particular URL from the website, the site's normal page is returned, but when a search engine spider makes the same request, a special page that has been created for the engine is returned, and the normal page for the URL is hidden from the engine - it is cloaked.

There are three ways of cloaking:

1. One is "IP delivery", where the IP addresses of spiders are recognized at the server and handled accordingly.

2. Another is "User-Agent delivery", where the spiders' User-Agents are recognized at the server and handled accordingly (see the sketch after this list).

3. The third is a combination of the two.
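
Purely to illustrate how User-Agent delivery can work, here is a minimal sketch assuming an Apache server with mod_rewrite enabled; the file names and the Googlebot pattern are assumptions made up for the example:

# .htaccess sketch: serve a different file when the User-Agent looks like a spider
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteRule ^page\.html$ spider-page.html [L]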


For more about SEO:

http://www.halfvalue.com/top-articles/seo-resources.html

Wednesday, July 12, 2006

Meta Search Engines

What is a meta search engine:

A Meta search engine is a search tool that doesn't create its own database of information, but instead searches those of other search engines. "Metacrawler", for instance, searches the databases of each of the following engines: Lycos, WebCrawler, Excite, AltaVista, and Yahoo. Using multiple databases will mean that the search results are more comprehensive, but slower to obtain.

(A regular crawler, by contrast, is a system that crawls the web by following links.) Some meta search engines are http://www.mamma.com/ (The Mother of all Search Engines), www.metacrawler.com, and others.

So, in short, a meta search engine is a search engine that gets its listings from two or more other search engines rather than through its own efforts.

Visit us for more SEO Resources

What is Syndication?

Syndication is the process of sharing content among sites. A link that says Syndicate this site, RSS, or XML means that the headlines, a link, and an entry description for each new weblog entry are made available for others to use on their websites or to access through a newsfeed reader program.

For the content viewer, the ability to subscribe to content using RSS means that you can easily get the content you want without ever having to worry about spam. The content doesn't go to your email inbox; it goes to a feed reader. You can subscribe or unsubscribe to whatever content you want.

For the content provider, you can help popularize your site by making it really easy for people to keep up-to-date with your latest entries.

RSS is used by (among other things) news websites, weblogs and podcasting. The abbreviation is variously used to refer to the following standards:

Really Simple Syndication (RSS 2.0)
Rich Site Summary (RSS 0.91, RSS 1.0)
RDF Site Summary (RSS 0.9 and 1.0)
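
For comparison, a minimal sketch of an RSS 2.0 feed with one item might look like this; the titles and URLs are invented for the example:

<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Weblog</title>
    <link>http://example.org/</link>
    <description>Latest entries from the example weblog.</description>
    <item>
      <title>First post</title>
      <link>http://example.org/2006/07/first-post</link>
      <description>A short summary of the entry.</description>
    </item>
  </channel>
</rss>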

Visit us for more SEO Resources

Why? - Google AdSense

Google AdSense is a fast and easy way for website publishers to make money online. Google AdSense delivers relevant Google ads of various sizes to your website's content pages, and the ads are related to what your visitors are looking for on your site. You finally have a way to both monetize and enhance the content pages of your websites.

So, in short, Google AdSense is a program that can give you advertising revenue from each page on your website, with a minimal investment in time and no additional resources.

Google AdSense delivers relevant text and image ads that are precisely targeted to your web site and its content. And when you add a Google search box to your site, Google AdSense delivers relevant text ads that are targeted to the Google search results pages generated by your visitors' search requests. In fact, your visitors may actually find the ads useful.

The Google AdSense service is free, and you earn money every time someone clicks on the ads.

Visit us for more SEO Resources

What is Google Sitemaps?

Google Sitemaps

One of the most common problems ecommerce web sites have with rankings is that portions of their sites are not included in the search engines. The cause can be any of a variety of reasons, from complicated URL strings to slow server response or the lack of a proper internal linking structure. If Google can't find your pages, they won't get into the search results.

So Google Sitemaps is a tool that allows web site owners to submit an XML or plain text file listing all or some of the URLs of a web site for inclusion in the Google index. Up to 50,000 URLs are allowed per document; if you have more than that, you can create another sitemap file.
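
As a rough sketch, a sitemap entry in XML form looks something like the following; the URL and dates are invented, and the exact namespace Google expects may differ from this sitemaps.org-style example:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-07-12</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>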

In cases where a site might have some difficulty getting indexed, providing a sitemap to Google may assist in getting missing pages into Google's search results.

Google does not charge for this service and it's pretty easy to do in three easy steps. More information is available on Google Sitemaps at: https://www.google.com/webmasters/sitemaps/

Visit us: To more SEO Resources

Why? - Link exchange

What is Link exchange

Link exchange is one of the earliest and most popular SEO techniques websites use to get inbound links (URLs pointing to your website). Inbound links increase the traffic rank and PageRank of your site. There are four common types:

1-way link exchange

2-way link exchange or cross linking

3-way link exchange

4-way link exchange or parallel linking



1-way link exchange:

This is the best link building strategy for any website, but it is also difficult to implement, because you need quality, relevant websites that will put your website's link on their pages without asking for a reciprocal link in return.

2-way link exchange or cross linking:

Exchanging links between two websites is called a reciprocal link; it is also called a link swap.

Cross linking is a popular technique designed to build traffic and increase PageRank. It relies on two websites agreeing to point an outbound link at each other.

3-way link exchange: (A links to B, B links to C, and C links to A.)

Suppose you have two sites, "A" and "B", and your link partner has one site, "C". If "C" links to "A" and, in return, "B" links to "C", this is called a 3-way link exchange.

4-way link exchange or parallel linking:

Suppose you have two websites, "A" and "B", and your link partner also has two sites, "C" and "D". If "C" links to "A" and, in return, "B" links to "D", this is called a 4-way link exchange or parallel link exchange. (The patterns are summarized below.)
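
Schematically, using the site names from the examples above (A and B are your sites, C and D are your partner's):

1-way:  C -> A            (no link back)
2-way:  A <-> B           (reciprocal link)
3-way:  C -> A, B -> C
4-way:  C -> A, B -> D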

Visit us for: SEO Resources

Why? - PFI (Pay for Inclusion)

What is PFI?

PFI stands for Pay For Inclusion. Many search engines offer a PFI program to ensure frequent spidering/indexing of a site (or page). PFI does not guarantee that a site will be ranked highly (or at all) for a given search term. It just offers webmasters the opportunity to quickly incorporate changes to a site into a search engine's index.

You pay a certain amount per URL for quick inclusion into the search engine database. However, this does not guarantee a good ranking. You will have to optimize your site very well to get a good ranking. This is different from the pay per click model where you can attain a top position instantly by bidding for the top position.

INKTOMI - PAY FOR INCLUSION.

ALTAVISTA - PAY FOR INCLUSION

YAHOO - PAY FOR INCLUSION

ALLTHEWEB - PAY FOR INCLUSION



For companies that want to provide a large number of URLs to Yahoo, the Yahoo Search Submit paid inclusion program offers the opportunity to have guaranteed inclusion in Yahoo's organic index. ( http://searchmarketing.yahoo.com/srchsb/ )

You can also pay Yahoo $299 per year for your domain name to be included in the Yahoo Directory. With this option there is no guarantee, even if you pay, that you will be accepted or that your suggested title, description and category will be used.

Visit us for more SEO Resources

Why? - Google AdWords

What is Google AdWords

With Google AdWords you create your own ads, choose keywords to help Google match your ads to your audience, and pay only when someone clicks on them.

Google AdWords is an ideal marketing tool for small to medium businesses: a PPC program where webmasters can create their own ads and choose their own keywords.

With Google AdWords you can reach people when they are actively looking for your products and services. That means you receive targeted visitors and customers.

For more details: https://www.adwords.google.com/

AdWords Advantages

Google AdWords aims to provide the most effective advertising available for businesses of any size. Google pledges to help you meet your customer acquisition needs by enabling you to:

* Reach people looking for your product or service

* Fully control your ad budget

* Easily create and edit your ads

* See your ads on Google within minutes of creating them


Visit us for more SEO Resources

What is Yahoo Sitemaps?

What is Yahoo Sitemaps?

One of the most common problems ecommerce web sites have with rankings is that portions of their sites are not included in the search engines. The cause can be any of a variety of reasons, from complicated URL strings to slow server response or the lack of a proper internal linking structure. If Yahoo can't find your pages, they won't get into the search results.

Yahoo therefore offers the option of submitting many URLs at once using Yahoo's free submission service at:
http://submit.search.yahoo.com/free/request.

Registration for a Yahoo account is required, but free.


Here's how it works:

Create a text file listing all the URLs you want indexed and name it something like urllist.txt (Yahoo's suggestion). You can also provide a compressed file and call it urllist.gz.
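
A urllist.txt file is simply one URL per line; the URLs below are invented for the example:

http://www.example.com/
http://www.example.com/products.html
http://www.example.com/contact.html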

If you are providing a plain text list of URLs to Google Sitemaps, then you can use that same file for Yahoo. Place the file in the main directory of your web site and submit the URL to the Yahoo free submission form at: http://submit.search.yahoo.com/free/request.

Visit us for more SEO Resources

Tuesday, June 27, 2006

Why? - Doorway Pages?

What Are Doorway Pages?

Doorway pages are designed primarily for search engines, to improve a site's traffic, not for human beings. This post explains how these pages are delivered technically, and some of the problems they pose.

A Web page designed specifically for the purpose of gaining high placement in a search engine's rankings is known as a doorway page.

As an SEO technique, the doorway page is meant to capture the attention of a search engine's spider by containing keywords and phrases that the spider will pick up on.

They are also known as portal pages, jump pages, gateway pages, entry pages, bridge pages, and by other names.

Does your site need a number-one position in Google? Don't want to change your site but still want top positions in "natural" search results? No problem. Search engine marketers (SEM) will create doorway pages or mirror sites to redirect traffic to your site.

Doorway Page Characteristics

Doorway pages come in all shapes and sizes. Some are very easy to spot. They're often computer-generated, text-only pages of gibberish. If human visitors viewed the page, they wouldn't purchase from the company. The page is ugly and nonsensical.

* Doorway pages often reside on an SEM firm's server under a different domain, not as a part of your own site.

* Doorway pages are often redirected from the SEM firm's server to your Web site. Once search engine software engineers detect the doorway pages, SEM spammers abandon that domain and put the doorway pages on another one. It's a cat-and-mouse game.

* Visitors don't see the same page search engine spiders do. In other words, one doorway page is presented to search engines; a different one is presented to visitors.

* Listen for any phrase that resembles "instant link popularity." Even if you don't change your site, unethical SEM firms may build hundreds, even thousands, of doorway pages that point to your site to artificially boost link popularity.

* Doorway pages are often created for individual search engines. They can be created for each search engine in different languages.

Visit us:

http://www.halfvalue.com/top-articles/seo-resources.html


The author is a regular contributor to halfvalue.com, where more information about SEO resources and other accessories is available.

What are search engine listings?

What are search engine listings?

Search engine listings are produced by search engines: when someone enters a specific word or phrase, known as keywords, into a search engine, all pages containing those keywords that can be found in the search engine's index are listed on the search engine result pages.

There are two kinds of search engine listings today. They are:

1 - Natural search engine listings - such as those returned in the main, numbered results on major search engines like Yahoo, Google and MSN.

2 - Paid search engine listings - such as those returned in sponsored areas of search results on these same engines and others that belong to their network.

Natural search engine listings:

Natural search engine listings, also known as "organic search engine listings", are the free listings of a site, usually found by a search engine's spider and then ranked by relevancy according to the search engine's methodology.

They are determined by the search engines' algorithms for finding, sorting and ranking pages based on relevancy. Because natural listings are considered "more accurate" by searchers, getting in natural listings is typically preferable to getting in paid listings.

Paid search engine listings:

Paid listings are usually labeled and placed in a separate section such as "sponsored listings". Paid search engine listings are those returned based on auction-style keyword bidding. These listings are displayed in sponsored areas of search results and are paid for on a per-click basis.

Visit us:

http://halfvalue.com/top-articles/seo-resources.html

The author is a regular contributor to halfvalue.com, where more information about SEO resources and other accessories is available.

Sunday, June 25, 2006

Some SEO Tips !!!

1. Insert keywords within the title tag so that search engine robots will know what your page is about. The title tag is located right at the top of your document within the head tags. Inserting a keyword or key phrase will greatly improve your chances of bringing targeted traffic to your site.

2. Use the same keywords as anchor text to link to the page from different pages on your site. This is especially useful if your site contains many pages. The more keywords that link to a specific page the better.

3. Make sure that the text within the title tag also appears within the body of the page. Using the exact same text in your h1 tag will tell a reader who clicks through from a search engine result that they have clicked on the correct link and arrived at the page they intended to visit.

4. Do not use the exact same title tag on every page on your website. Search engine robots might determine that all your pages are the same if all your title tags are the same. If this happens, your pages might not get indexed.

So always use the headline of each page as its title tag, to help the robots know exactly what the page is about. A good place to insert the headline is within the h1 tag, so that the headline matches the title tag text.

5. Do not spam the description or keyword meta tag by stuffing in meaningless keywords, and do not spend too much time on this tag.

6. Do not link to link farms or other search engine unfriendly neighborhoods. A good rule of thumb is: if your pages do not contain any words that reflect the content of the site you are linking to, do not link to it.

7. Do not use doorway pages. Doorway pages are designed for robots only, not humans. Search engines like to index human friendly pages which contain content which is relevant to the search.

8. Title attributes for text links. Insert a title attribute within the HTML of your text links to add weight to the link and to the page where the link resides. This is like the alt attribute for images (see the short example after this list).

9. Describe your images with the use of the alt tag. This will help search engines that index images to find your pages and will also help readers who use text only web browsers.

10. Submit to the search engines yourself. Do not use a submission service or submission software. Doing so could get your site penalized or even banned.


11. Do not make the title of your webpage too long. Google, Yahoo, MSN and all the other search engines have a limit on how much of the title they read.

12. Do not duplicate content across your site; this will work against you. Use unique content on each page.

13. Do not hesitate to submit questions on SEO forums. Knowing more is more important than knowing less.

14. Download several tools that help you make a good keyword combination. There are some good programs for free.

15. Do not put irrelevant and bad content on your website.
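
As a tiny illustration of tips 8 and 9, with an invented file name and keyword:

<a href="blue-widgets.html" title="blue widgets">Read about our blue widgets</a>
<img src="blue-widget.jpg" alt="Photo of a blue widget">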

Visit us:
http://www.halfvalue.com/top-articles/seo-resources.html

Wednesday, June 21, 2006

Why? - Search Engines

What is a search engine?

Here I am going to define the search engine from two points of view, as given below:

1. From the visitor's point of view (someone who wants to get the best results)
2. From the website's point of view (someone who wants to appear in the best results)

From the visitor's point of view, search engines enable their visitors to enter a specific word or term, known as a keyword. Once submitted, all pages containing those keywords that can be found in the search engine's index are listed on the search engine result pages.

From the point of view of a website that wants to be listed in a search engine, a search engine is a website, such as Google, on which people can search for other websites on the Internet. Search engines do not include sites manually; instead, automated "search engine spiders" "crawl" the web through links. If your website has just one link pointing to it, it will eventually be found and indexed by a search engine. The term "search engine" is often used generically to describe both crawler-based search engines and human-powered directories.

There are two types of search engine:

1. Crawler-based search engines
2. Human-powered directories

These two types of search engines gather their listings in radically different ways.


Crawler-Based Search Engines

Crawler-based search engines, such as Google, create their listings automatically. They "crawl" or "spider" the web, then people search through what they have found.

If you change your web pages, crawler-based search engines eventually find these changes, and that can affect how you are listed. Page titles, body copy and other elements all play a role.

Human-Powered Directories

A human-powered directory, such as the Open Directory, depends on humans for its listings. You submit a short description to the directory for your entire site, or editors write one for sites they review. A search looks for matches only in the descriptions submitted.

Changing your web pages has no effect on your listing. Things that are useful for improving a listing with a search engine have nothing to do with improving a listing in a directory. The only exception is that a good site, with good content, might be more likely to get reviewed for free than a poor site.

Visit us:
halfvalue.com
halfvalue.co.uk
lookbookstores.com

Why? - Meta Tags used

This article covers the most important aspects of Meta tags in relation to website optimization or SEO.

What are Meta tags?

Meta tags are information inserted in the head area of the HTML code of your web pages. Apart from the title tag, the information inserted there is not visible to the person surfing your web page but is intended for the search engine crawlers. Meta tags are included so that the search engines are able to list your site in their indexes more accurately.

What can I include in a Meta tag?

There are basically four major Meta tags that you can use (see the example after this list):

* The resource-type Meta tag

The only resource type that is currently in use is "document". This is the only tag that you need to put in for indexing purposes, but use of the others is a good idea.

* The description Meta tag

Depending on the search engine, the description will be displayed along with the title of your page in an index. Its content could be a word, a sentence or even a paragraph describing your page. Keep this reasonably short, concise and to the point. However, don't be so mean with your description that it's not an appropriate reflection of the contents!

* The keywords Meta tag

Choose whatever keywords you think are appropriate, separated by commas. Remember to include synonyms, Americanisms and so on. So, if you had a page on cars, you might want to include keywords such as car, cars, vehicles, automobiles and so on.

* The distribution Meta tag

The content should be either global, local or iu (for Internal Use). To be perfectly honest, I can't quite get my head around this one; it's supposed to list available resources designed to allow the user to find things easily, but I still don't quite get it. My advice is to stick to "global".
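
A sketch of a head section using all four tags might look something like this; the page title, description and keywords are invented for the example:

<head>
<title>Cars for Sale</title>
<meta name="resource-type" content="document">
<meta name="description" content="Used cars and automobiles for sale at low prices.">
<meta name="keywords" content="car, cars, vehicles, automobiles, used cars">
<meta name="distribution" content="global">
</head>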

Using Meta tags in HTML is not necessary when making your web pages; there are many websites that don't feel the need to use Meta tags at all. In short, Meta information is used to communicate to the search engine crawlers information that a human visitor may not be concerned with. Infoseek and AltaVista were the first major crawler-based search engines to support the Meta keywords tag, in 1996. Inktomi and Lycos followed thereafter.

Why are Meta Tags used? - Want to get a top ranking in search engines

Meta tags were originally designed to provide webmasters with a way to help search engines know what their site was about. This in turn helped the search engines decide how to rank the sites in their search results. Making Meta tags is a simple process. As competition increased, webmasters started manipulating this tool by spamming keywords. In turn, most search engines, including Lycos and AltaVista, withdrew their support for the Meta keywords tag. Once considered one of the most reliable and important tools, Meta tags are now often abused. In the present-day scenario, a vital feature that Meta tags provide to websites is the ability to control, to a certain extent, how some search engines describe their web pages. Apart from this, Meta tags also offer the ability to specify that a certain page should not be indexed.

Using Meta Tags, however, provides no guarantee that your website page would rank highly in the search engine rankings. Due to the rampant abuse and manipulation of the Meta keywords Tag by webmasters, most search engines don't support it anymore.

Visit us:

halfvalue.com
halfvalue.co.uk
lookbookstores.com

Why? - Meta Robots Tag used

Why? - Meta Robots Tag

The Meta robots tag gives you the ability to specify that a particular page should NOT be indexed by a search engine. To keep spiders out, simply add this tag between the head tags of each page you don't want indexed.

However, there is no need for the Meta robots tag if you are already using a detailed robots.txt file to block specific indexing.

The various commands used under Meta Robots Tag are:

Index: allows the spider to index that page.
Noindex: instructs the spider not to index the page.
Follow: instructs the spider to follow the links from that page and index them.
Nofollow: instructs the spider not to follow links from that page for indexing.

The format is shown below:

<meta name="robots" content="index,follow">
<meta name="robots" content="noindex,follow">
<meta name="robots" content="index,nofollow">
<meta name="robots" content="noindex,nofollow">

Note: Use only one of the above commands per page.

Most major search engines support the Meta robots tag. However, the robots.txt convention of blocking indexing is more efficient, as you don't need to add tags to each and every page. If you use a robots.txt file to block indexing, there is no need to also use Meta robots tags.
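
For comparison, a minimal robots.txt that keeps all spiders out of one directory looks like this; the directory name is an example assumption:

User-agent: *
Disallow: /private/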

If you have not specified any Meta Robots Tag on a page, by default, the spiders understand that the page and all the links appearing on that page are open for indexing. Therefore, it makes more sense to use this Meta Tag in case you don't want certain parts of your web page indexed.

Visit us:
halfvalue.com
halfvalue.co.uk
lookbookstores.com