Thursday, June 4, 2009

Google Squared Launched - Search Results in Spreadsheet Format



Google has launched this product with high hopes and pride. Structured search results are no longer a dream - that's the claim. This comes as a big boon for research professionals who spend so much time structuring results into spreadsheets. That is no simple task, and technically this is a major step forward that will encourage people to follow a structured style of reporting. The product was announced at Searchology 2009 and has now breathed its first live air. Even though it is still in beta, it is a workable model. The engine needs a few inputs in order to square the results it returns.





Get your hands on it yourself and see how the results render in spreadsheet format. Your search results can now take a completely different form: you can remove content, add suggested columns or columns of your own, and even save your Squares for future access. The cool thing about Google Squared is that the columns are dynamic, so they mesh with the content displayed in the squared results. Cell content is also customizable: clicking on a cell lets you search for other possible values and displays a confidence level (e.g. low confidence). We're also big fans of the fact that you can save your Squares, a small but important feature that could turn this into a quick and powerful utility for research.

Feel the pressure - Google has to think about this now. With the release of Bing, and with Google Squared probably not matching the standard set by Wolfram Alpha, Google is under pressure to innovate and get to the next level to keep its place. The world is also closely watching the Google Wave launch.

Check the URL http://mashable.com/2009/05/19/wolfram-alpha-better-than-google/, which explains how Wolfram Alpha does better than Google Squared.

And try your hand in the new squared arena from Google - www.google.com/squared

Sathish Sampath
www.sathishsampath.com

Wednesday, June 3, 2009

Bing - New Search Engine From Microsoft



Microsoft is launching the next release in its follow-the-leader strategy, going head to head with Google in its bread and butter: the search engine. There are lots of hopes prevailing in the market, specifically in pro-Microsoft circles. Microsoft has amended and polished its earlier release, Kumo, and hence the birth of "Bing". Microsoft is positioning it as a "decision engine".

Relevancy, adaptability and clear structure are considered the top weighing factors for this search engine. Even though things are not yet clear regarding the in-depth workings of its algorithms, it is clear that Microsoft is serious this time about fighting back to win this space.

There is news that IE6 will now force this search engine even if your default search engine is set otherwise. As we know, these are all speculations at this point in time, but if it happens we could see a HUGE number of Bing hits in many corporates.

There are also reports that Steve Ballmer has sanctioned A REAL HUGE sum of money ($80 million to $100 million in an ad campaign to promote Bing). I am sure this is going to be a real good business war. Whoever wins, as users we enjoy getting the best of the products.

Winning this competition is not easy at all. Google recently bagged approximately 75% of total search hits (per a survey report for the US), while comScore still ranks Microsoft at a very low score. It is going to be a very good time for all of us to watch the giants battle it out for their places, and as a web analyst there is a rising concern too :)

All you SEOs, watch out for yet another big search engine to think about and optimize for. Get your analysis and strategies ready, because that is going to be key: will it be relevance, page rank, link submission or back links? It is a whole new story to crack and spread our versatility across, just as we have learnt over the years for the Google search engine.

The good thing for us is that if such a huge campaign is planned, awareness of search engines will increase a lot, and we can bag more projects and add to our portfolios.

Sathish Sampath
www.sathishsampath.com

Wednesday, December 24, 2008

Google PageRank Calculation - Unfolded

PageRank is a numeric value that represents how important a page is on the web. Google figures that when one page links to another page, it is effectively casting a vote for the other page. The more votes that are cast for a page, the more important the page must be. Also, the importance of the page that is casting the vote determines how important the vote itself is. Google calculates a page's importance from the votes cast for it, and the importance of each vote is taken into account when a page's PageRank is calculated.
PageRank is Google's way of deciding a page's importance. It matters because it is one of the factors that determines a page's ranking in the search results. It isn't the only factor that Google uses to rank pages, but it is an important one.
From here on in, we'll occasionally refer to PageRank as "PR".


Not all links are counted by Google. For instance, they filter out links from known link farms. Some links can cause a site to be penalized by Google. They rightly figure that webmasters cannot control which sites link to their sites, but they can control which sites they link out to. For this reason, links into a site cannot harm the site, but links from a site can be harmful if they link to penalized sites. So be careful which sites you link to. If a site has PR0, it is usually a penalty, and it would be unwise to link to it.

How is PageRank calculated?

To calculate the PageRank for a page, all of its inbound links are taken into account. These are links from within the site and links from outside the site.
PR(A) = (1-d) + d(PR(t1)/C(t1) + ... + PR(tn)/C(tn))

That's the equation that calculates a page's PageRank. It's the original one that was published when PageRank was being developed, and it is probable that Google uses a variation of it but they aren't telling us what it is. It doesn't matter though, as this equation is good enough.

In the equation 't1 - tn' are pages linking to page A, 'C' is the number of outbound links that a page has and 'd' is a damping factor, usually set to 0.85.
We can think of it in a simpler way:-
a page's PageRank = 0.15 + 0.85 * (a "share" of the PageRank of every page that links to it)
"share" = the linking page's PageRank divided by the number of outbound links on the page.

A page "votes" an amount of PageRank onto each page that it links to. The amount of PageRank that it has to vote with is a little less than its own PageRank value (its own value * 0.85). This value is shared equally between all the pages that it links to.

From this, we could conclude that a link from a page with PR4 and 5 outbound links is worth more than a link from a page with PR8 and 100 outbound links. The PageRank of a page that links to yours is important but the number of links on that page is also important. The more links there are on a page, the less PageRank value your page will receive from it.

If the PageRank value differences between PR1, PR2,.....PR10 were equal then that conclusion would hold up, but many people believe that the values between PR1 and PR10 (the maximum) are set on a logarithmic scale, and there is very good reason for believing it. Nobody outside Google knows for sure one way or the other, but the chances are high that the scale is logarithmic, or similar. If so, it means that it takes a lot more additional PageRank for a page to move up to the next PageRank level than it did to move up from the previous PageRank level. The result is that it reverses the previous conclusion, so that a link from a PR8 page that has lots of outbound links is worth more than a link from a PR4 page that has only a few outbound links.
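Here is a toy illustration of that reversal as a Python sketch. The base of 5 is purely an assumed number - the real scale, if it is logarithmic at all, is known only to Google:

def real_pr(toolbar_pr, base=5):
    # assume toolbar PR n corresponds to base**n "real" PageRank (assumption!)
    return base ** toolbar_pr

# value passed by one link = linking page's real PR / its outbound link count
print(real_pr(4) / 5)      # PR4 page with 5 outbound links   -> 125.0
print(real_pr(8) / 100)    # PR8 page with 100 outbound links -> 3906.25

Under any scale that grows this quickly, the PR8 link passes far more value despite its 100 outbound links.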

Whichever scale Google uses, we can be sure of one thing. A link from another site increases our site's PageRank. Just remember to avoid links from link farms.
Note that when a page votes its PageRank value to other pages, its own PageRank is not reduced by the value that it is voting. The page doing the voting doesn't give away its PageRank and end up with nothing. It isn't a transfer of PageRank. It is simply a vote according to the page's PageRank value. It's like a shareholders meeting where each shareholder votes according to the number of shares held, but the shares themselves aren't given away. Even so, pages do lose some PageRank indirectly, as we'll see later.

Now we'll look at how the calculations are actually done.

For a page's calculation, its existing PageRank (if it has any) is abandoned completely and a fresh calculation is done where the page relies solely on the PageRank "voted" for it by its current inbound links, which may have changed since the last time the page's PageRank was calculated.

The equation shows clearly how a page's PageRank is arrived at. But what isn't immediately obvious is that it can't work if the calculation is done just once. Suppose we have 2 pages, A and B, which link to each other, and neither have any other links of any kind. This is what happens:-

Step 1: Calculate page A's PageRank from the value of its inbound links
Page A now has a new PageRank value. The calculation used the value of the inbound link from page B. But page B has an inbound link (from page A) and its new PageRank value hasn't been worked out yet, so page A's new PageRank value is based on inaccurate data and can't be accurate.

Step 2:
Calculate page B's PageRank from the value of its inbound links
Page B now has a new PageRank value, but it can't be accurate because the calculation used the new PageRank value of the inbound link from page A, which is inaccurate.

It's a Catch 22 situation. We can't work out A's PageRank until we know B's PageRank, and we can't work out B's PageRank until we know A's PageRank.
Now that both pages have newly calculated PageRank values, can't we just run the calculations again to arrive at accurate values? No. We can run the calculations again using the new values and the results will be more accurate, but we will always be using inaccurate values for the calculations, so the results will always be inaccurate.

The problem is overcome by repeating the calculations many times. Each time produces slightly more accurate values. In fact, total accuracy can never be achieved because the calculations are always based on inaccurate values. 40 to 50 iterations are sufficient to reach a point where any further iterations wouldn't produce enough of a change to the values to matter. This is precisely what Google does at each update, and it's the reason why the updates take so long.
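As a rough sketch, the iterative calculation can be coded like this in Python. It implements the published equation in "Simple mode" (every page is assumed to be in the index); it is an illustration, not Google's actual implementation:

def pagerank(links, pages, d=0.85, iterations=100):
    # links: dict mapping each page to the list of pages it links to
    pr = {page: 1.0 for page in pages}   # starting value; any number converges
    for _ in range(iterations):
        new_pr = {}
        for page in pages:
            # each inbound linker votes a "share": its PR / its outbound count
            share = sum(pr[src] / len(targets)
                        for src, targets in links.items()
                        if page in targets)
            new_pr[page] = (1 - d) + d * share
        pr = new_pr
    return pr

Each pass uses the values from the previous pass, and after enough passes the numbers stop changing by any amount that matters. All the examples below can be reproduced with this function.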

One thing to bear in mind is that the results we get from the calculations are proportions. The figures must then be set against a scale (known only to Google) to arrive at each page's actual PageRank. Even so, we can use the calculations to channel the PageRank within a site around its pages so that certain pages receive a higher proportion of it than others.


Internal linking

Fact:
A website has a maximum amount of PageRank that is distributed between its pages by internal links.

The maximum PageRank in a site equals the number of pages in the site * 1. The maximum is increased by inbound links from other sites and decreased by outbound links to other sites. We are talking about the overall PageRank in the site and not the PageRank of any individual page. You don't have to take my word for it. You can reach the same conclusion by using a pencil and paper and the equation.
Fact: The maximum amount of PageRank in a site increases as the number of pages in the site increases.

The more pages that a site has, the more PageRank it has. Again, by using a pencil and paper and the equation, you can come to the same conclusion. Bear in mind that the only pages that count are the ones that Google knows about.

Fact: By linking poorly, it is possible to fail to reach the site's maximum PageRank, but it is not possible to exceed it.

Poor internal linkages can cause a site to fall short of its maximum but no kind of internal link structure can cause a site to exceed it. The only way to increase the maximum is to add more inbound links and/or increase the number of pages in the site.

What can we do with this 'overall' PageRank?


Pages that no other page links to aren't really in Google's index, but we are going to ignore that fact, mainly because other 'PageRank Explained' type documents ignore it in the calculations, and it might be confusing when comparing documents. The calculator (available in the source article at webworkshop.net) operates in two modes: Simple and Real. In Simple mode, the calculations assume that all pages are in the Google index, whether or not any other pages link to them. In Real mode the calculations disregard unlinked-to pages. These examples show the results as calculated in Simple mode.

Let's consider a 3 page site (pages A, B and C) with no links coming in from the outside. We will allocate each page an initial PageRank of 1, although it makes no difference whether we start each page with 1, 0 or 99.


Apart from a few millionths of a PageRank point, after many iterations the end result is always the same. Starting with 1 requires fewer iterations for the PageRanks to converge to a suitable result than when starting with 0 or any other number. You may want to use a pencil and paper to follow this or you can follow it with the calculator.
The site's maximum PageRank equals the amount of PageRank the site could contain. In this case, we have 3 pages so the site's maximum is 3.

At the moment, none of the pages link to any other pages and none link to them. If you make the calculation once for each page, you'll find that each of them ends up with a PageRank of 0.15. No matter how many iterations you run, each page's PageRank remains at 0.15. The total PageRank in the site = 0.45, whereas it could be 3. The site is seriously wasting most of its potential PageRank.
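Running the sketch from earlier on this no-link site reproduces those numbers:

pages = ["A", "B", "C"]
print(pagerank({}, pages))   # {'A': 0.15, 'B': 0.15, 'C': 0.15}
# total = 0.45, far below the site's maximum of 3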

Example 1




Now begin again with each page being allocated PR1. Link page A to page B and run the calculations for each page. We end up with:-
Page A = 0.15
Page B = 1
Page C = 0.15
Page A has "voted" for page B and, as a result, page B's PageRank has increased. This is looking good for page B, but it's only 1 iteration - we haven't taken account of the Catch 22 situation. Look at what happens to the figures after more iterations:-

After 100 iterations the figures are:-
Page A = 0.15
Page B = 0.2775
Page C = 0.15
It still looks good for page B but nowhere near as good as it did. These figures are more realistic. The total PageRank in the site is now 0.5775 - slightly better but still only a fraction of what it could be.
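The sketch gives the same figures:

print(pagerank({"A": ["B"]}, ["A", "B", "C"]))
# {'A': 0.15, 'B': 0.2775, 'C': 0.15} - total 0.5775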

Example 2



Try this linkage. Link all pages to all pages. Each page starts with PR1 again. This produces:-
Page A = 1
Page B = 1
Page C = 1
Now we've achieved the maximum. No matter how many iterations are run, each page always ends up with PR1. The same results occur by linking in a loop, e.g. A to B, B to C and C to A. View this in the calculator.
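Both linkages check out in the sketch:

mesh = {"A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B"]}
loop = {"A": ["B"], "B": ["C"], "C": ["A"]}
print(pagerank(mesh, ["A", "B", "C"]))   # every page converges to 1.0
print(pagerank(loop, ["A", "B", "C"]))   # likewise 1.0 each - maximum reached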

This has demonstrated that, by poor linking, it is quite easy to waste PageRank and by good linking, we can achieve a site's full potential. But we don't particularly want all the site's pages to have an equal share. We want one or more pages to have a larger share at the expense of others. The kinds of pages that we might want to have the larger shares are the index page, hub pages and pages that are optimized for certain search terms. We have only 3 pages, so we'll channel the PageRank to the index page - page A. It will serve to show the idea of channeling.

Example 3


Now try this. Link page A to both B and C. Also link pages B and C to A. Starting with PR1 all round, after 1 iteration the results are:-
Page A = 1.85
Page B = 0.575
Page C = 0.575
and after 100 iterations, the results are:-
Page A = 1.459459
Page B = 0.7702703
Page C = 0.7702703
In both cases the total PageRank in the site is 3 (the maximum) so none is being wasted. Also in both cases you can see that page A has a much larger proportion of the PageRank than the other 2 pages. This is because pages B and C are passing PageRank to A and not to any other pages. We have channeled a large proportion of the site's PageRank to where we wanted it.

Example 4




Finally, keep the previous links and add a link from page C to page B. Start again with PR1 all round. After 1 iteration:-
Page A = 1.425
Page B = 1
Page C = 0.575
By comparison to the 1 iteration figures in the previous example, page A has lost some PageRank, page B has gained some and page C stayed the same. Page C now shares its "vote" between A and B, whereas previously A received all of it. That's why page A has lost out and why page B has gained. And after 100 iterations:-
Page A = 1.298245
Page B = 0.9999999
Page C = 0.7017543

When the dust has settled, page C has lost a little PageRank because, having shared its vote between A and B instead of giving it all to A, it leaves A with less to give back to C via the A-->C link. So adding an extra link from a page causes the page to lose PageRank indirectly if any of the pages that it links to return the link. If the pages that it links to don't return the link, then no PageRank loss would have occurred. To make it more complicated, if the link is returned even indirectly (via a page that links to a page that links to a page etc.), the page will lose a little PageRank. This isn't really important with internal links, but it does matter when linking to pages outside the site.
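The sketch shows the comparison directly:

example3 = {"A": ["B", "C"], "B": ["A"], "C": ["A"]}
example4 = {"A": ["B", "C"], "B": ["A"], "C": ["A", "B"]}
print(pagerank(example3, ["A", "B", "C"]))  # A~1.4595, B~0.7703, C~0.7703
print(pagerank(example4, ["A", "B", "C"]))  # A~1.2982, B~1.0000, C~0.7018
# C's extra link to B is returned indirectly (B -> A -> C), so C itself
# ends up with less PageRank than it had in example 3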

Example 5: new pages

Adding new pages to a site is an important way of increasing a site's total PageRank because each new page will add an average of 1 to the total. Once the new pages have been added, their new PageRank can be channeled to the important pages. We'll use the calculator to demonstrate this.

Let's add 3 new pages to Example 3. Three new pages, but they don't do anything for us yet. The small increase in the total, and the new pages' 0.15, are unrealistic, as we shall see. So let's link them into the site.
Link each of the new pages to the important page, page A. Notice that the total PageRank has doubled, from 3 (without the new pages) to 6. Notice also that page A's PageRank has almost doubled.

There is one thing wrong with this model. The new pages are orphans. They wouldn't get into Google's index, so they wouldn't add any PageRank to the site and they wouldn't pass any PageRank to page A. They each need to be linked to from at least one other page. If page A is the important page, the best page to put the links on is, surprisingly, page A itself. You can play around with the links but, from page A's point of view, there isn't a better place for them.


It is not a good idea for one page to link to a large number of pages so, if you are adding many new pages, spread the links around. The chances are that there is more than one important page in a site, so it is usually suitable to spread the links to and from the new pages. You can use the calculator to experiment with mini-models of a site to find the best links that produce the best results for its important pages.

Dangling links


"Dangling links are simply links that point to any page with no outgoing links. They affect the model because it is not clear where their weight should be distributed, and there are a large number of them. Often these dangling links are simply pages that we have not downloaded yet..........Because dangling links do not affect the ranking of any other page directly, we simply remove them from the system until all the PageRanks are calculated. After all the PageRanks are calculated they can be added back in without affecting things significantly." - extract from the original PageRank paper by Google’s founders, Sergey Brin and Lawrence Page.
A dangling link is a link to a page that has no links going from it, or a link to a page that Google hasn't indexed. In both cases Google removes the links shortly after the start of the calculations and reinstates them shortly before the calculations are finished. In this way, their effect on the PageRank of other pages is minimal.


Inbound links
Inbound links (links into the site from the outside) are one way to increase a site's total PageRank. The other is to add more pages. Where the links come from doesn't matter. Google recognizes that a webmaster has no control over other sites linking into a site, and so sites are not penalized because of where the links come from. There is an exception to this rule but it is rare and doesn't concern this article. It isn't something that a webmaster can accidentally do.
The linking page's PageRank is important, but so is the number of links going from that page. For instance, if you are the only link from a page that has a lowly PR2, you will receive an injection of 0.15 + 0.85(2/1) = 1.85 into your site, whereas a link from a PR8 page that has another 99 links from it will increase your site's PageRank by 0.15 + 0.85(8/100) = 0.218. Clearly, the PR2 link is much better - or is it? See the earlier discussion of the logarithmic scale for a probable reason why this is not the case.
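As a quick check of that arithmetic with a sketch function (the figures match the formula above):

def injection(linking_pr, outbound_count, d=0.85):
    # PageRank received from a single link: base (1 - d) plus the damped share
    return (1 - d) + d * linking_pr / outbound_count

print(injection(2, 1))     # 1.85
print(injection(8, 100))   # 0.218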
Once the PageRank is injected into your site, the calculations are done again and each page's PageRank is changed. Depending on the internal link structure, some pages' PageRank is increased, some are unchanged, but no pages lose any PageRank.
It is beneficial to have the inbound links coming to the pages to which you are channeling your PageRank. A PageRank injection to any other page will be spread around the site through the internal links. The important pages will receive an increase, but not as much of an increase as when they are linked to directly. The page that receives the inbound link makes the biggest gain.
It is easy to think of our site as being a small, self-contained network of pages. When we do the PageRank calculations we are dealing with our small network. If we make a link to another site, we lose some of our network's PageRank, and if we receive a link, our network's PageRank is added to. But it isn't like that. For the PageRank calculations, there is only one network - every page that Google has in its index. Each iteration of the calculation is done on the entire network and not on individual websites.
Because the entire network is interlinked, and every link and every page plays its part in each iteration of the calculations, it is impossible for us to calculate the effect of inbound links to our site with any realistic accuracy.


Outbound links
Outbound links are a drain on a site's total PageRank. They leak PageRank. To counter the drain, try to ensure that the links are reciprocated. Depending on the PageRank of the pages at each end of an external link, and the number of links out from those pages, a reciprocal link can gain or lose PageRank. You need to take care when choosing where to exchange links.
When PageRank leaks from a site via a link to another site, all the pages in the internal link structure are affected. (This doesn't always show after just 1 iteration). The page that you link out from makes a difference to which pages suffer the most loss. Without a program to perform the calculations on specific link structures, it is difficult to decide on the right page to link out from, but the generalization is to link from the one with the lowest PageRank.
Many websites need to contain some outbound links that are nothing to do with PageRank. Unfortunately, all 'normal' outbound links leak PageRank. But there are 'abnormal' ways of linking to other sites that don't result in leaks. PageRank is leaked when Google recognizes a link to another site. The answer is to use links that Google doesn't recognize or count. These include form actions and links contained in javascript code.
Form actions

For example, a form's 'action' attribute can point at an ordinary page rather than a form-processing script, so the form's submit button behaves like a link. Because Google doesn't treat form submissions as links, no PageRank leaks through them.

Adding new pages
There is a possible negative effect of adding new pages. Take a perfectly normal site. It has some inbound links from other sites and its pages have some PageRank. Then a new page is added to the site and is linked to from one or more of the existing pages. The new page will, of course, acquire PageRank from the site's existing pages. The effect is that, whilst the total PageRank in the site is increased, one or more of the existing pages will suffer a PageRank loss due to the new page making gains. Up to a point, the more new pages that are added, the greater is the loss to the existing pages. With large sites, this effect is unlikely to be noticed but, with smaller ones, it probably would be.
So, although adding new pages does increase the total PageRank within the site, some of the site's pages will lose PageRank as a result. The answer is to link new pages in such a way within the site that the important pages don't suffer, or add sufficient new pages to make up for the effect (that can sometimes mean adding a large number of new pages), or better still, get some more inbound links.
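The sketch demonstrates the dilution. Here a new page D is added to the Example 3 site, linked to and from page A:

with_d = {"A": ["B", "C", "D"], "B": ["A"], "C": ["A"], "D": ["A"]}
print(pagerank(with_d, ["A", "B", "C", "D"]))
# A~1.9189, B~0.6937, C~0.6937, D~0.6937 - the total rises from 3 to 4,
# and page A gains, but B and C each drop from ~0.7703 to ~0.6937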

The full version of this is clearly explained by Phil Craven at http://www.webworkshop.net/pagerank.html, which is the source of this article.

Sathish Sampath
WebSearch Lab

Sunday, December 7, 2008

Google Algorithm - Play of numbers

Google's Relevancy Algorithms Change by Keyword: Longtail vs Core Category Words

Changes in Search


Search is entering an era of personalization and localization: keywords and relevancy are matched against content, and results are tailored to the searcher's region so that region-specific pages get the hit. The newer algorithms are pointing away from a single universal, global search. But beyond these changes, Google's core relevancy algorithms are still definitely query-dependent, and they have taken a new shape in how they weigh the extensiveness and focus of a page's content.


Competitive Keywords


There is an interesting twist here. Rankings for these queries are now determined largely through the quality of internal and external links: link quality, link diversity, link anchor text, and perhaps other signals of quality like usage data and a LocalRank boost. For competitive keywords, Google doesn't give much relevance credit when it identifies that a page has heavy on-page optimization.

Low Competition Keywords

The logic behind this

For search relevancy algorithms, where there are fewer matches and fewer external signals of quality available, Google must put more weight on the content of individual pages. Also, since these keywords don't face the same ranking competition, they are not so highly rated or pursued.

Where there is no community to rely upon, Google must trust publishers. And while each longtail ranking might have little value, the nickels and quarters add up. Their limited search volume and value lead many competitors to skip over them, as they do not appear in most keyword research tools.

Google Said - "The same post highlighted that "broad match currently accounts for over 1/3 of all clicks and conversions for advertisers, worldwide" and that Google "recently improved the search query report to provide more granular detail on which queries are triggering ads for your broad match keywords."

also ""Did you know that 20% of the queries Google receives each day are ones we haven’t seen in at least 90 days, if at all?" - Google Adwords team


Estimated signal weights by keyword type:

Signal                  Competitive keywords    Long tail keywords
link anchor text        20%                     15%
in-community links      20%                     0%
link diversity          15%                     5%
site age                15%                     15%
onpage optimization     5%                      (not listed)
domain authority        25%                     15%

Put these numbers together and the conclusion is clear: a new site that doesn't yet have much indexing, many pages or much authority has to depend on the less competitive, long-tail keywords for its first few months, until it is equipped for the big fight over competitive terms.
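As a toy illustration of that reasoning, here is a Python sketch that mixes the weights above into a single score. The per-signal strengths for the imaginary new site are made-up inputs, and the 0.50 on-page weight for long tail is an assumed filler (the list above doesn't give one) chosen so the weights sum to 1, in line with the point that content weighs more for long-tail terms:

COMPETITIVE = {"link anchor text": 0.20, "in-community links": 0.20,
               "link diversity": 0.15, "site age": 0.15,
               "onpage optimization": 0.05, "domain authority": 0.25}
LONG_TAIL = {"link anchor text": 0.15, "in-community links": 0.00,
             "link diversity": 0.05, "site age": 0.15,
             "onpage optimization": 0.50,   # assumption - not in the list above
             "domain authority": 0.15}

def score(signals, weights):
    # weighted sum of per-signal strengths, each in the range 0.0 - 1.0
    return sum(weights.get(name, 0) * value for name, value in signals.items())

# an imaginary new site: strong on-page work, weak links, no age or authority
new_site = {"link anchor text": 0.1, "in-community links": 0.0,
            "link diversity": 0.1, "site age": 0.0,
            "onpage optimization": 0.9, "domain authority": 0.1}
print(score(new_site, COMPETITIVE))  # ~0.105 - outgunned on competitive terms
print(score(new_site, LONG_TAIL))    # ~0.485 - a real chance on long-tail terms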


Sathish Sampath
WebSearch Lab