Big changes in Search Engine Optimisation. To all: here is Google's new algorithm for 2011.
Google’s New Algorithm 2010 / 2011 Revealed
Many SEO gurus have attempted to give a rough outline of what the Google algorithm might look like. Based on research and community suggestions, this is roughly what the formula could look like:
Google’s Score = (Kw Usage Score * 0.3) + (Domain * 0.25) + (PR Score * 0.25) + (Inbound Link Score * 0.25) + (User Data * 0.1) + (Content Quality Score * 0.1) + (Manual Boosts) – (Automated & Manual Penalties)
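The speculative weighted sum above can be sketched in a few lines of code. Note that everything here is illustrative: the weights come from the rumoured formula, but the component scores, their 0–10 scale, and the example inputs are assumptions, not confirmed Google values.

```python
# Hypothetical sketch of the speculative weighted-sum ranking formula above.
# Weights are taken from the rumoured formula; the 0-10 component scale and
# the example inputs are illustrative assumptions only.

def google_score(kw_usage, domain, pr, inbound_links, user_data,
                 content_quality, manual_boosts=0.0, penalties=0.0):
    """Combine component scores using the rumoured weights, then apply
    manual boosts and penalties."""
    weighted = (kw_usage * 0.3
                + domain * 0.25
                + pr * 0.25
                + inbound_links * 0.25
                + user_data * 0.1
                + content_quality * 0.1)
    return weighted + manual_boosts - penalties

# Example: a page strong on keywords and links, but carrying a paid-link penalty.
score = google_score(kw_usage=8, domain=6, pr=7, inbound_links=9,
                     user_data=5, content_quality=7, penalties=1.5)
print(round(score, 2))  # the penalty is subtracted after the weighted sum
```

One thing the sketch makes obvious is that the rumoured weights sum to more than 1, so this is at best a relative scoring heuristic, not a normalised probability.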
If you are asking yourself how the latest 2011 algorithm works, or what SEO means, you will find all the secrets revealed on this website, absolutely free. Learn how the algorithm works and the most common factors that affect your website's ranking.
Search Engine Optimisation Keyword Usage Factors:
• Keyword in title tags
• Rich Header tags
• Documents rich with relevant text information
• Keyword in alt tag
• Keywords in internal links pointing to the page
• Keywords in the domain and/or URL
• The order in which keywords appear
Domain Strength / Speed:
• Domain Strength
• Domain Speed
• Local/specific domain name
• Quality Hosting
Registration History:
• When the Domain was registered
• Strength of the links pointing to the domain
• The topical neighborhood of the domain, based on inbound and outbound links
• Historical use of, and link patterns to, the domain
• Inbound links and referrals from social media
Inbound Link Score (Age & Quality of Links):
• Quality of the domains sending links (non-paid)
• Links from Authority Websites
• Quality of pages sending links and relevance
• Anchor text of links
• Link quantity/weight metric (Page Rank or a variation)
• Subject matter of linking pages/sites
User Data:
• Historical CTR to the page in SERPs
• Time users spend on a page
• Search requests for URL/domain
• Historical visits/use of URL/domain by users
Content Quality Score:
• Duplicate content filter
• Order of keywords in content
• Content update frequency
• Highlighting and visibility of keywords
• Usability and W3C compliance
Negative Penalties (for SEO purposes):
• Forums
• Facebook
• Twitter
• Article Submissions
• Blogs
• Link Exchange
• Paid Links
• FFA’s (Free for all backlinks)
Over the years, Google has persistently pursued innovation, constantly updating its algorithm and refusing to accept the limitations of existing models. The result was its breakthrough PageRank™ technology, which changed the way searches are conducted today. And if there is one SEO company that knows how the algorithm works and how it can boost your rankings in Google organic search, it is White Hat Works.
What’s new about Google’s latest algorithm in 2010 / 2011?
There is much talk in the SEO world about what Google is going to focus on in 2010. Matt Cutts, head of Google's webspam team, has mentioned and hinted in various forums, on YouTube and on his blog what SEO professionals' focus should be in 2010: as part of the Caffeine project, the loading speed of your website and its pages will apparently now play a major factor in the algorithm.
To achieve faster speeds, your website needs to be hosted with a fast host, and you need to reduce the overall size of your web pages. In practice this may mean moving to a better Internet Service Provider and serving leaner pages so they download faster.
This will mean less content per page, utilising CSS (Cascading Style Sheets), and making images load faster. A page download time of 3 seconds or less is pretty good, but 1 second or lower may have to be achieved in some cases. You can now measure this speed with a tool in Google's Webmaster Tools.
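If you want a rough, do-it-yourself measurement of page download time before reaching for a tool, timing a full fetch is a simple approximation. This is a minimal sketch, not how Google measures speed; the URL in the usage comment is a placeholder.

```python
# Rough sketch of timing a full page download, in the spirit of the speed
# targets discussed above. This measures one fetch from one location, so it
# is only a crude approximation of real-world load time.
import time
import urllib.request


def page_load_seconds(url, timeout=10):
    """Return the seconds taken to fetch the complete response body."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # download the whole page body, not just the headers
    return time.perf_counter() - start


# Example (placeholder URL):
# elapsed = page_load_seconds("https://example.com/")
# Under 3 seconds is acceptable; under 1 second is the target mentioned above.
```

A single fetch ignores DNS caching, browser rendering and geographic distance, so averaging several runs from different locations gives a fairer picture.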
Does this mean Google is pushing developers to create better websites that load faster, and content writers to write quality content instead of quantity? Will digital marketers have to stay creative while keeping their artwork short, sweet and fast-loading? Does using Flash animations go out the window? Is hosting video clips on your own website still wise, or is it better to host them on YouTube instead? The answer to all of these questions is probably yes.
What could happen is Google favouring certain websites and, on the other hand, penalising others. Website owners who keep stuffing their pages with content, customer reviews, endless blogs and articles, videos, links and endless images may be penalised.
The good news is websites that are clean, focused, compatible and fast will benefit.
Many SEO professionals are saying that this is not fair on SMEs (small to medium-sized businesses) that cannot afford super-fast hosting and do not have large teams to restructure their websites for speed. But how fast can large corporate companies react to the ever-growing demands of the Internet? Probably not very fast: it will take them months before they even decide what to do. SMEs, on the other hand, will probably be in a better position to react much faster thanks to quick decision-making, compensating for the other factors.
SEO e-marketers also ask about regions with slow Internet access, such as rural parts of Scotland where you cannot get broadband. Are those sites going to be penalised? Probably not, as it is the Google bots measuring the speed, not the speed at which the website loads on an end user's computer in those areas. In fact, by "forcing" developers to increase their websites' overall speed, Google should mean these areas without broadband benefit in the long run. Mobile phone users with web-enabled devices such as the iPhone and Google's Nexus will be able to load more and more websites, especially the leaner, faster-loading ones.
White Hat Works Conclusion for 2010 and 2011:
• Host your website with a fast Internet Service Provider
• Use CSS as much as possible and comply with website usability guidelines
• Write quality content, not quantity content
• Convert your Flash website to an HTML website
• Export your Flash animations where possible – more SEO Tips
• Keep images and their file sizes to a minimum without losing quality
• Test your website in various browsers and on mobile devices
• Make your website Google-friendly
• Get free guidance from a reputable SEO e-marketing company
Faster websites mean Google can keep up with the growth of the Internet without needing to keep buying and installing new servers. Many have asked how many servers Google has. This is a well-kept secret, but if you do the maths the number could be around 700,000 and growing. These servers are spread over some 100 data centres around the globe, which reportedly makes Google the largest IT producer of CO2 emissions on Earth.
Google is in the business of providing a top-notch service and encouraging growth, while remaining profitable by keeping its costs and carbon emissions down.
Hence the need for this new factor in the updated 2010 Google algorithm. Maybe the main reason Google is focusing on speed is that it is really trying to reduce its carbon footprint, and we (SEO people and website owners) all need to help it achieve this.
One affected company said:
We are an ecommerce site that has been running for almost 5 months and has seen steadily growing traffic. We have made no major changes to our website recently, but we continually add content and products to the site. Our site has been severely affected (traffic down almost 50%) since last Tuesday (07/12/10), which coincides with the latest 2011 algorithm change made in response to the New York Times article on black hat SEO techniques.
I have struggled to find much information about it, and it doesn't seem to be widely talked about at the moment. As far as I can tell, it specifically targets site reviews and penalises sites with negative feedback. Because we are a relatively young site, we don't have any feedback, negative or otherwise. Is it possible that this could have such a dramatic effect on a site that has essentially done nothing wrong, and has anyone had the same problem?
Any advice would be greatly appreciated, because we are being hit particularly hard in the lead-up to Christmas, which should be our busiest period.
Thanks