Thursday, December 20, 2007

Google Updates its Toolbar with New Rating System

Google Toolbar Update

Google recently updated its toolbar software with two new buttons: a happy face and a frowning face. These buttons allow users to rate web pages as they visit them.

Google Toolbar version 4.2 will reportedly be used to roll out "VR" (visitor rating). Google VR™ will use a Digg-like system: a voting button on the toolbar collects visitor ratings of the page, and the VR will be displayed via a blue bar similar to the PageRank (PR) bar.

Monday, December 17, 2007

SEO and Ajax

It's the web 2.0 era, and the world simply cannot stop raving about how Ajax has revolutionized the internet. For those who do not know, Ajax (Asynchronous JavaScript and XML) is a technique for building far more responsive, user-friendly web applications.

It combines existing technologies to produce some extremely powerful applications. Yes, users will be thrilled and more than happy with your Ajax website. But where does that leave the webmaster?

Ajax is also one of the least search engine friendly technologies around. If you have Ajax applications on your website, it is time to rethink your SEO techniques.

The biggest difficulty with Ajax is that it gives you very few options to manipulate the TITLE tag and page headings: content fetched asynchronously never changes the URL or the markup that a spider sees.

Accessibility has always been a problem with Ajax, and what good is technology when users cannot find it?

A workaround

A lot of SEO experts have suggested that creating a second set of static views to be fed to the search engines might be a good option.

In simpler terms, you create a second, static version of the site so that search engines can crawl it. Each URL must deliver the same content to users and to search engines alike. But this is neither elegant nor very practical: think about the amount of time and effort that goes into it. You also need specific inbound links pointing to the relevant sections within the mirrored directory.

Different solutions

Static views are not the only workaround that has been proposed. Some people suggest that the 'noscript' tag is a good way to deliver static information to the search engines. This means you create duplicate sections of the same page: one rendered via Ajax, and one for the search engine. But once again, is this feasible? Only time will tell whether Ajax and SEO can ever go hand in hand.
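As a rough sketch of that idea, the markup below pairs an Ajax-driven block with a noscript fallback. The loadProducts() function and the content are hypothetical placeholders, not from any particular library:

    <div id="products"></div>
    <script type="text/javascript">
      // Regular visitors get this block filled in by an Ajax call;
      // loadProducts() is a hypothetical page function.
      loadProducts("products");
    </script>
    <noscript>
      <!-- Static copy of the same content, readable by spiders
           and by visitors without JavaScript -->
      <p>Our complete product list: ...</p>
      <a href="/products/">Browse all products</a>
    </noscript>

Whatever goes inside the noscript block must mirror the Ajax content exactly; serving spiders something different from what users see risks being treated as cloaking.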

Wednesday, December 12, 2007

SEO for Web 2.0

Search Engine Optimization (SEO) for Web 2.0

For a lot of people, web 2.0 has been the birth of a new concept. For some, it has been a new set of rules that govern the internet now. But what is it in reality? How do you define web 2.0?

Web 2.0 is an idea that has developed considerably since the dot-com bubble of 2001. It describes a huge, ever-growing community of websites that links more and more sites together every day.

User generated content is one of the prime features of web 2.0. Although user generated content can be relied upon to shoot a website into the spotlight, websites still have to rely on search engines for better rankings. This is where SEO for web 2.0 comes in.

Most experts suggest that SEO for web 2.0 is not much different from SEO in general; the two go hand in hand.

SEO

User generated content makes websites grow both horizontally and vertically, which makes standard SEO techniques harder to apply: there is very little editorial control, and further problems arise with structural arrangement.

In fact, many seemingly insurmountable issues may arise with SEO for web 2.0. However, most of these can be overcome with tactful thinking and a set of tools that seem to be mandatory for success in web 2.0.

Blogs and wikis are the new devices that any website needs in order to engage new users.

Interactivity

Interactivity today means elements like Ajax widgets and podcasts. These, however, are extremely difficult for search engines to pick up and very often leave the spiders dazed. If you use them on your website, you need to employ XML feeds, RSS and mirror pages to ensure that search engine traffic keeps flowing in.

Overall, you can say that SEO is continually evolving to meet the challenges of web 2.0.

Wednesday, December 5, 2007

Google Labs - New Experimental Search

Google is experimenting with a new feature on search results that allows users to promote or remove the results they see.

This is implemented as part of its Google Labs program, which allows the user to customize the search experience by adding, deleting and moving search results.

This feature is limited to user-specific search results: the sites a particular user dislikes or deletes will not appear in his or her search results.

This is an experimental feature and may be available only for a limited period.

For more information, visit http://www.google.com/experimental/

SEO Tips to Get Higher Google Page Ranking

As the internet started to develop, immense marketing hype went into it, and the result was a network of websites with clusters of information scattered here and there. But these websites were simply not willing to give out an external link.

Most of the links were internal and kept returning the user to square one, like a never-ending loop. As a result, Google got into the act and altered its algorithms in the quest to make the web a better place.

Its PageRank system is one of the key aspects of this change, and it has redefined the way the web functions. According to this system, the importance of a website is determined by the number of inbound links coming to it.

For example, if your web page has a hundred inbound links coming to it, chances are that it will have a high PageRank in Google.

Development

As a result, webmasters started to create dummy sites to give out links to their own web pages, and soon had pages with a million inbound links.

Once again, Google altered its algorithms; this time, the sites giving out the links are also judged on the basis of their own PageRank. So unless a link comes in from a web page with a high PageRank, that link is given little value. The result is a network of websites that not only have quality information to share but also work towards bringing more useful websites into the network.

TIPS

• Focus on keyword-rich but relevant content on your website; the balance between the two should be perfect.
• The only way to make your website popular is to improve the quality of the content within.
• Link building is a significant way of improving both traffic and PageRank.

Monday, December 3, 2007

Importance of Google Page Rank

Searches and search engines have come a long way indeed from the old days. Today, complex algorithms and techniques are at work behind the scenes to make the results much more satisfying and relevant for the end user.

Take Google's extremely popular PageRank system, for example. The system was devised even before the company Google was actually created, but it is now the definitive system on the World Wide Web for deciding the importance of a page within a website, giving it a numerical ranking from 0 to 10.

So if a page has a PageRank of 10, it means that it has received the most votes, or inbound links, and most of those votes come from relevant, important pages.

How it works

The number of inbound links pointing to a page is the primary factor that determines its PageRank. For example, if page A has an outbound link to page B, that link can be thought of as a vote from A for B. But how popular is page A itself?

That popularity is also a determining factor when the value of the vote is counted, and it brought to the fore the concepts of link building and link popularity.

Today, a web page crammed with keywords cannot deceive a search engine; the engine looks at PageRank as well to determine how important the page really is.

So even several inbound links to your web page do not guarantee a high PageRank, because the rank of the pages giving you those links is also taken into account: the PageRank of a document is based on the PageRank of the documents linking to it.

This truly remarkable concept, which has redefined the way the internet functions, was achieved with the help of a surprisingly simple algorithm.
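As a rough illustration, here is a minimal sketch of that algorithm in Python. The formula is the published PageRank one, PR(A) = (1 - d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn)), where T1...Tn are the pages linking to A, C(T) is the number of outbound links on page T, and d is a damping factor (0.85 in the original paper); the three-page link graph is invented for the example:

    # Invented link graph: page -> pages it links out to.
    links = {
        "A": ["B", "C"],
        "B": ["C"],
        "C": ["A"],
    }

    d = 0.85                            # damping factor
    pr = {page: 1.0 for page in links}  # start every page equal

    for _ in range(50):                 # iterate until the values settle
        pr = {
            page: (1 - d) + d * sum(
                pr[src] / len(outs)
                for src, outs in links.items()
                if page in outs
            )
            for page in links
        }

    # "C" ends up ranked highest: it receives the most inbound votes.
    print(pr)

Each pass spreads a page's current rank across its outbound links, which is exactly the "vote" described above.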

Monday, November 26, 2007

Benefits of LSI for SEO

SEO has been the mantra of success for many a website until now. The formula has been tried and tested, and it has worked. But results were based largely on keyword density rather than on the context in which a keyword is used. It was therefore necessary to identify every keyword variation with the same meaning that a user might search for. For example, a user looking for 'red kitchen cabinets' might also type 'kitchen cabinets red', 'red colored cabinets' and many more variations, all of which mean the same thing to him. But to the search engine spider they are all different keywords.

LSI

Now we have a new method of SEO called Latent Semantic Indexing. LSI is based on identifying groups of keywords and the context in which they are used in the text. It gives search engines and their results a much more natural feel than the mechanical approach that was common until now. Look closely at a search results page and you will find at least three or four sites that are loaded with keywords but contain no relevant content. LSI can avoid this: apart from the specific keyword, users will also find the sites that carry genuinely relevant information.

The benefits

Now, when you do an LSI-based search on music, you will not only get results containing the word music; you can also find radio, mp3 and the like. LSI incorporates methods such as Singular Value Decomposition and the creation of a database of related keywords, and common stop words are eliminated. These new results, based on recall, precision and ranking, have surely revolutionized the way search engines function. It also means that a web developer has to make sure that each and every word on a page is important, not just the keywords.
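As a rough sketch of the indexing step just described, this Python fragment builds a tiny term-document matrix with common stop words removed; the documents and the stop-word list are invented for illustration:

    # Invented stop-word list and documents, for illustration only.
    STOP_WORDS = set(["the", "a", "to", "for", "and"])

    docs = [
        "the best mp3 players for music",
        "radio stations and music charts",
        "a guide to mp3 and radio music",
    ]

    # Vocabulary: every distinct word that is not a stop word.
    vocab = sorted(set(word for doc in docs for word in doc.split()
                       if word not in STOP_WORDS))

    # Rows are terms, columns are documents; each cell counts how
    # often the term occurs in that document.
    matrix = [[doc.split().count(term) for doc in docs] for term in vocab]

    for term, row in zip(vocab, matrix):
        print(term, row)

A matrix like this is exactly the input that Singular Value Decomposition works on.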

Tuesday, November 20, 2007

LSI (Latent Semantic Indexing) : The new face of search

Latent Semantic Indexing

How do you look up information on the internet? You type words into a search box and wait for the information to come up. But if you go through the results in detail, you will find that after the first few pages, the rest are largely irrelevant to the term you searched for.

This is the retrieval method that is popular the world over today, but it will soon be replaced by a highly sophisticated model called LSI (Latent Semantic Indexing).

LSI is a new concept-based retrieval method that uses a term-document matrix to capture the occurrence of terms across documents. Its results have been reported to be about 30% more effective than conventional forms of search.

Why it works

The reason LSI works can be attributed to shared words. If you search for the term 'mobile phones', you might miss result pages that instead contain the words 'cellular phone', 'camera phone' or 'lightweight'.

While the user may see these words as alike, that is not how a spider thinks. LSI eliminates this problem by matching on the concept behind a search term rather than its literal presence in the result pages.

It eliminates a lot of hassle for the searcher as well as for the content provider, who no longer has to carefully craft a database of keywords.

How it works

Most LSI software uses a completely automated technique called Singular Value Decomposition (SVD). Using it, the software creates a semantic or concept space that markedly improves the retrieval of data.

In simpler terms, an LSI-based model can identify that 'cellular phones', 'mobile phones' and 'lightweight phones' occur in the same contexts, and hence its results will be much more detailed and relevant.
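A minimal sketch of that in Python with NumPy, using a hand-made term-document count matrix (the five terms and five documents are invented):

    import numpy as np

    terms = ["mobile", "cellular", "phone", "camera", "recipe"]

    # Rows are terms, columns are five invented documents.
    A = np.array([
        [2, 0, 1, 0, 0],   # mobile
        [0, 2, 1, 0, 0],   # cellular
        [1, 1, 2, 1, 0],   # phone
        [0, 1, 1, 1, 0],   # camera
        [0, 0, 0, 0, 3],   # recipe
    ], dtype=float)

    # Singular Value Decomposition; keep the two strongest "concepts".
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    k = 2
    term_vectors = U[:, :k] * s[:k]   # each row: one term in concept space

    def similarity(i, j):
        a, b = term_vectors[i], term_vectors[j]
        return float(a.dot(b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # "mobile" and "cellular" occur in similar contexts, so they end
    # up close together in concept space; "recipe" does not.
    print(similarity(0, 1))   # high
    print(similarity(0, 4))   # near zero

The point is that the two phone-related terms score as near neighbours even though no single shared keyword ties them together.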

Tuesday, November 13, 2007

The rise of social bookmarking and tagging

Storing, classifying and sharing information has reached a new level altogether with social bookmarking.

Social bookmarking is a method in which internet users save links to web pages and share them publicly, or privately with access limited to a certain group or number of people.

People can then view these bookmarks in chronological order or via the tags that mark them. Ever since iList.com started in 1996, several social bookmarking networks have risen and fallen on the internet.

These companies include Backflip, Blink, Clip2 and others. But it was more recently, in 2004, that the term 'social bookmarking' was coined and del.icio.us brought the concept of tagging to the World Wide Web. Since then, social bookmarking has been on the rise.

Why it works

It works because it has several advantages when compared with the more traditional automated resource location software.

Search engine spiders, for example, use algorithms to determine the meaning of a resource or web page and then bring it up on the results page.

Tagging, on the other hand, is done entirely by hand and hence can give much better, more streamlined results. As a result, you will find semantically classified tags in a social bookmarking system.

Also, as a resource is bookmarked by more and more people, its popularity increases, and the system then ranks the resource on the basis of that demonstrated utility.

So the chances that the end user will benefit from it are much higher than with a conventional search engine ranking system, which ranks pages according to the external links pointing towards them.
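As a loose sketch of those mechanics, a minimal tag-based bookmark store in Python might look like this (the URLs and tags are invented):

    from collections import defaultdict

    bookmarks_by_tag = defaultdict(set)   # tag -> URLs saved under it
    save_count = defaultdict(int)         # URL -> how many people saved it

    def bookmark(url, tags):
        """Record that one more user saved this URL under these tags."""
        save_count[url] += 1
        for tag in tags:
            bookmarks_by_tag[tag].add(url)

    def top_for_tag(tag):
        """URLs for a tag, the most-saved (most 'useful') first."""
        return sorted(bookmarks_by_tag[tag],
                      key=lambda url: save_count[url], reverse=True)

    bookmark("http://example.com/seo-guide", ["seo", "google"])
    bookmark("http://example.com/seo-guide", ["seo"])
    bookmark("http://example.com/lsi-intro", ["seo", "lsi"])

    print(top_for_tag("seo"))   # the twice-saved guide comes first

Ranking here is nothing more than a count of how many people found the resource worth saving.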

The flipside

On the other hand, social bookmarking and tagging have their own set of drawbacks. Since there is no fixed vocabulary and no spelling checks, tags often contain errors, and unclear or ambiguous tags are common.

Friday, November 2, 2007

Matt Cutts' White Hat SEO Tips for Bloggers

Matt Cutts, popularly known as the Google SEO Guy, recently gave a presentation on white hat SEO tips for bloggers.

The video is currently available on Viddler, and also in this blog's Google SEO Guy Videos section.

A summary of the points highlighted in the video "White Hat SEO Tips for Bloggers":

  • Use variations of the same keyword in the URL, title and content
  • Use dashes and underscores in URLs to separate words
  • Name categories after the keywords you are chasing
  • Avoid .biz and .info types of domains
  • Make a buzz on the net and Google will notice you

Lastly, he concluded: "Selling links and buying links are against our guidelines, and we do take action on them."