Tuesday, January 29, 2008

Understanding Google Domain Age Filter

When you launch a website, you are entering a world that is as populated as the one we live in, perhaps even more so. The catch is that every webmaster who launches a website wants the number one spot in Google's search results.

This makes Google's job extremely difficult: it has to go through every one of these websites and rank them on the basis of their content and SEO techniques. Many webmasters want to hit it big early, so they resort to spam techniques to get to the top fast.

So Google introduced a set of filters to counter spamming and keep the search results page limited to quality websites. The Google domain age filter works on the premise that if your domain has been around for a few years, the chances that you are a spam website are negligible. This is why Google gives older domains a much better ranking than newer ones.

Can you get around it?

Does it mean that it will take a few years before your domain name gets some credibility? Not really, say experts.

  • Buying old domain names is one way around this. Even a domain that has simply been parked at a domain parking service for years will have built up credibility with Google.


  • Another workaround is to buy the domain name early. If you are planning to start a few websites next year, register the domain names now. Then let Google know about them so that the evaluation process starts right away.


  • Also, if and when you buy an older domain name, do not change the WHOIS information, or you risk losing all the benefits of buying an old domain name.


And keep in mind that this is only half the battle won. There is a set of other filters that you have to combat in order to get high rankings.
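The age Google sees comes from the domain's WHOIS record. As a rough illustration, here is a minimal Python sketch that extracts the creation date from a WHOIS response and computes the domain's age; the field name and date format are assumptions based on a typical registrar response, and real records vary:

```python
import re
from datetime import datetime, timezone

def domain_age_years(whois_text, now=None):
    """Parse a WHOIS response and return the domain's age in years.

    Assumes the record contains a line like
    'Creation Date: 2003-01-29T05:00:00Z' (field names vary by registrar).
    Returns None if no creation date is found.
    """
    match = re.search(r"Creation Date:\s*(\d{4}-\d{2}-\d{2})", whois_text)
    if not match:
        return None
    created = datetime.strptime(match.group(1), "%Y-%m-%d").replace(tzinfo=timezone.utc)
    now = now or datetime.now(timezone.utc)
    return (now - created).days / 365.25

# Hypothetical WHOIS snippet for illustration only.
sample = "Domain Name: EXAMPLE.COM\nCreation Date: 2003-01-29T05:00:00Z\n"
age = domain_age_years(sample, now=datetime(2008, 1, 29, tzinfo=timezone.utc))
```

A domain created in January 2003 would show roughly five years of age here, which by the theory above should be enough to clear the filter.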

Thursday, January 24, 2008

Understanding the Google Trust Rank Filter

Google Filters - Google Trust Rank Filter

If you thought that a well-detailed SEO program is enough to get you a top rank in the search results pages, you are only partially informed.

The complete picture is that you need to follow this program consistently for almost a year to get out of the Google sandbox and earn the trust of the Google trust rank filter. Both are filters used by Google to prevent new websites from gaining an undue advantage by using spamming techniques.

Some websites create thousands of back links at once, using spamming techniques and dummy web pages to build a large back link structure. This makes it possible to gain a high page rank the moment the website is launched, defeating the very purpose of a search engine.

Hence the arrival of the Google trust rank filter may have benefited web users, but it has made life difficult for webmasters.

The Google trust rank

In order to get a good trust rank with Google, you have to get quality inbound links to your website. This is not easy to achieve and may even take years.

If you lose patience and resort to spamming, then you are out of the trust rank and into the sandbox. Consider them two sides of a coin that work in tandem. As the name suggests, the Google trust rank filter is all about creating trust. Keep working on your web promotion using ethical practices, and over time, as Google gets to know your website better, your ranking on the search results page will improve.

Can you beat it?

This has been one of the most widely discussed questions on the internet. Can you really beat the Google filters? My advice would be to stick to legitimate SEO practices rather than try to beat the filters and land in the sandbox again.

Wednesday, January 23, 2008

Google SandBox Filter

Can you beat the Google sandbox filter?

Search engines have become far more equipped today to beat spammers at their own game. There was a time when websites used mass spamming techniques to reach the top of search engine results soon after their launch.

As a result, most of the top websites were beaten by the spammers and the quality of search engine results suffered.

Then Google allegedly came out with the sandbox filter, which is considered a conscious effort on Google's part to prevent spamming websites from achieving a high ranking.

According to experts, the Google sandbox filter is like a probationary period which may last for three months or even more. During this time, the website may perform extremely poorly in organic search results while Google gauges its trust metrics. These metrics play an important part in deciding the future of the website in search results.

Different theories

There have been different theories surrounding the Google sandbox filter. Some say that the filter is aimed at delaying, or ageing, the back-linking process.

Back links are crucial in giving a website a high page ranking. When the effectiveness of these back links is reduced, the website performs weakly until Google ascertains the trustworthiness of the links.

However, of late, many SEO experts have started to question the theories that support the existence of the Google sandbox filter.

Another theory says that the Google sandbox filter affects only English-language websites, and that certain domains, such as those ending in .gov, are not affected.

The final word

The bottom line is that if you wish to achieve a top ranking in Google search results, you will have to work for it by building quality, trustworthy back links. You simply cannot rely on spamming techniques to reach the top.

Tuesday, January 22, 2008

Google Filters

The existence of Google filters

Google came into existence and spun the virtual world around; it can now be called the central axis around which half of the virtual world revolves. Hence it is extremely important to understand how it functions, not only for webmasters but also for the average netizen looking for information.

Google employs a set of algorithms to scan web pages and bring up its search results. These algorithms are constantly updated to make them more effective and efficient at returning results.

Also, the success of a webpage on the World Wide Web depends very much on its position on the search results page. Search engine optimization, or SEO, techniques were introduced to help a webpage rank better on search engine results pages.

But then Google incorporated these filters into their algorithms and once again, the rules changed.

What are Google filters?

There are hundreds of factors and variables programmed into a Google algorithm, and this makes it very difficult to identify whether filters are used.

Filters are rules that the search engine follows in order to counter unhealthy SEO practices.

For example, if you have built thousands of back links using an automatic link creation program, Google considers it spamming and the search engine will ignore the webpage altogether.

Similarly, the sandbox effect and the over-optimization filter, which is said to demote pages crammed with keywords in their H1 and H2 tags, are other examples.

The links filter, which is associated with the links.htm page, is almost certain to exist, say experts.

How to Avoid Google filters?

If you take all of these filters into account when optimizing your website, SEO becomes all the more difficult. The good news is that SEO experts have started to introduce more and more methods to avoid these so-called Google filters.

Thursday, December 20, 2007

Google Updates its Toolbar with New Rating System

Google Tool Bar Update

Google recently made a change to its toolbar software which includes two new buttons: a happy face and a frowning face. These buttons allow users to rate webpages as they visit them.

Google Toolbar version 4.2 will be used to roll out "VR" (visitor rating). Google VR™ will use a Digg-like system, with a voting button on the toolbar to collect visitor ratings of the page, and the VR will be displayed via a blue bar similar to the PR bar.

Monday, December 17, 2007

SEO and Ajax

It's Web 2.0, and the world simply cannot stop raving about how Ajax has revolutionized the internet. For those who do not know, Ajax is a method that brings tremendous user-friendliness to the internet.

It combines existing technologies to produce some extremely powerful applications. Yes, users will be thrilled and more than happy with your Ajax website. But where does that leave the webmaster?

Ajax is also one of the least search-engine-friendly technologies. That means that if you have Ajax applications on your website, it is time to rethink your SEO techniques.

The biggest difficulty with Ajax is that it gives you very few options to manipulate the TITLE tags and page headers.

Accessibility has always been a problem with Ajax and what good is technology when users cannot find it?

A workaround

A lot of SEO experts have suggested that creating a second set of static views to be fed to the search engines might be a good option.

In simpler terms, you create a second version of the website without the technology, so that it is accessible to search engines. The URL must deliver the same content to both users and search engines. But this is neither very creative nor very practical: think about the amount of time and effort that goes into it. You also need specific inbound links that point to relevant sections within the virtual directory.

Different solutions

The approach mentioned above is not the only workaround that has been proposed. Some people suggest that using the 'noscript' tag is a good way to deliver static information to the search engines. This means you create duplicate sections of the same page, one for Ajax and the other for the search engine. But once again, is this feasible? Only time will tell whether Ajax and SEO can ever go hand in hand.

Wednesday, December 12, 2007

SEO for Web 2.0

Search Engine Optimization SEO for Web 2.0

For a lot of people, web 2.0 has been the birth of a new concept. For some, it has been a new set of rules that govern the internet now. But what is it in reality? How do you define web 2.0?

Web 2.0 is an idea that has developed considerably since the dot-com bubble of 2001. It is a huge community of websites that links more and more sites together and is growing every day.

User generated content is one of the prime features of web 2.0. Although user generated content can be relied upon to shoot a website to spotlight, websites have to rely on search engines to get better rankings. This is where SEO for web 2.0 comes in.

Most experts suggest that SEO is Web 2.0 and that there isn't much of a difference between the two.

SEO

The fact that user-generated content is what makes websites grow horizontally and vertically makes it more difficult to use standard SEO techniques. There is very little editorial control, and other problems arise with structural arrangement.

In fact, many seemingly insurmountable issues may arise with SEO for Web 2.0. However, most of these can be overcome with tactful thinking and a set of tools that seem to be mandatory for success in Web 2.0.

Blogs and wikis are the new set of devices that any website needs to have in order to engage newer users.

Interactivity

You may have heard about interactivity elements like Ajax widgets and podcasts. These are extremely difficult for search engines to pick up and very often leave the spiders dazed. If you use them on your website, you need to employ XML, RSS and mirror sites to ensure that search engine traffic keeps flowing in.
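One concrete way to expose that hidden content is to publish a feed of it alongside the interactive page. Here is a minimal sketch, using Python's standard library, of generating an RSS 2.0 document that crawlers can read even when they cannot execute the scripts; the channel and item details are hypothetical placeholders:

```python
import xml.etree.ElementTree as ET

def build_rss(channel_title, channel_link, items):
    """Build a minimal RSS 2.0 document so crawlers can discover
    content that scripted widgets would otherwise hide from them.

    items is a list of (title, link) pairs.
    """
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = channel_title
    ET.SubElement(channel, "link").text = channel_link
    ET.SubElement(channel, "description").text = channel_title
    for title, link in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = link
    return ET.tostring(rss, encoding="unicode")

# Hypothetical site content for illustration.
feed = build_rss("My Widgets", "http://example.com/",
                 [("Podcast #1", "http://example.com/podcast1")])
```

The resulting XML can be served as a static file and linked from the page, giving spiders a plain-text route to the same content the Ajax widgets display.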

Overall, you can say that SEO is continually evolving to meet the challenges of web 2.0.

Wednesday, December 5, 2007

Google Labs - New Experimental Search

Google is experimenting with a new feature on search results that allows users to vote up or remove the search results they see.

This is implemented as part of its Google Labs program, which allows the user to customize the search experience by adding, deleting and moving search results.

The feature is limited to user-specific search results: the sites disliked or deleted by a specific user will not appear in that user's own search results.

This is an experimental feature and may be available only for a limited period.

For more information visit: http://www.google.com/experimental/

SEO Tips to Get Higher Google Page Ranking

As the internet started to develop, immense marketing hype went into it, and the result was a network of websites with clustered information scattered here and there. These websites were simply not willing to give out an external link.

Most of the links were internal and kept returning the user to square one, like a never-ending loop. As a result, Google got into the act and altered its algorithms in the quest to make the web a better place.

Its page rank system is one of the key aspects of this change that has redefined the way the web functions. According to this system, the importance of a website is determined by the number of inbound links coming to it.

For example, if your webpage has a hundred inbound links coming to it, chances are that it will have a high page rank in Google.

Development

As a result, webmasters started to create dummy sites to give out links to their own web pages, and soon had pages with a million inbound links.

Once again, Google altered its algorithms, and this time the sites which give out the links are also judged on the basis of their own page ranks.
So unless you have a link coming in from a web page with a high page rank, that link is not valued. The result is a network of websites that not only have quality information to share but also work towards bringing in more useful websites to improve the network.

TIPS

• Focus on keyword-rich but relevant content on your website. The balance between the two should be perfect.
• The only way to make your website popular is by improving the quality of its content.
• Link building is a significant way of improving traffic and page rank.

Monday, December 3, 2007

Importance of Google Page Rank

Searches and search engines have come a long way indeed from the old days. Today, complex algorithms and techniques are at work behind the scenes to make the result much more satisfying and relevant for the end user.

Take Google's extremely popular Page Rank system, for example. The system came into place before Google itself was actually created, but it is now the system in place on the World Wide Web that decides the importance of a page within a website and gives it a numerical ranking from 1 to 10.

So if a webpage has a Page Rank of 10, it means that it has received the most votes, or inbound links. Most of these votes have to be relevant or important ones.

How it works

The number of inbound links that a page attracts is the primary factor that determines its page rank. For example, if page A has an outbound link to page B, that link can be called a vote from A to B. But how popular is page A?

This is also a determining factor when the value of the vote is counted. This brought to the fore the concepts of link building and link popularity.

Today, a webpage that is crammed with keywords cannot deceive a search engine. It looks into the Page Rank as well to determine how important the webpage really is.

So even several inbound links to your webpage do not guarantee a high page rank, because the relevance of the pages giving you those links is also taken into account. You can say that the page rank of a document is based on the Page Rank of the documents which link to it.

This truly remarkable concept which has redefined the way the internet functions was achieved with the help of a simple algorithm.
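The idea described above can be sketched as an iterative computation: every page splits its current score evenly among its outbound links, and a damping factor models a surfer who occasionally jumps to a random page. This is a minimal illustration of the concept, not Google's actual implementation:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank-style scores.

    links maps each page to the list of pages it links to.
    Each page shares its score equally among its outbound links;
    the damping factor models a surfer who sometimes jumps at random.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start from a uniform score
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share       # each link is a weighted vote
        rank = new_rank
    return rank

# Hypothetical three-page web: A links to B and C; both link back to A.
ranks = pagerank({"A": ["B", "C"], "B": ["A"], "C": ["A"]})
```

In this toy web, page A ends up with the highest score because it receives a full vote from both B and C, while B and C each receive only half of A's vote, which is exactly the "a vote from a popular page is worth more" behaviour described above.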