Showing posts with label Latest Google Update.

Wednesday, January 30, 2013

Google Overhauls Image Search; Webmasters Rejoice

As most people are aware, the current Google Image Search requires multiple steps to view images. When a user searches for an image, they are brought to a results page of thumbnails. To see a larger version of a picture, they have to click it and are taken to another page; to view another picture, they have to go back to the results page and click again. It quickly becomes a hassle for anyone who wants to browse multiple pictures.
With the new change to Google Image Search, clicking an image on the results page will bring up a larger version in place. Users will be able to look through multiple images without repeated clicking: using just the keyboard, they can skip between images instead of returning to the results page and selecting a new one each time.
The update also adds four clickable targets that make it easier for users to visit the website actually hosting the image, and it should prevent the phantom visits many publishers have noticed under the current system. More information will also be displayed with each result, including metadata, the page hosting the image, its size, and the domain name. All of this is designed to increase click-through rates to the sites hosting the images.
During the testing of this new update, it was reported that Google noticed an increase in click-through traffic, which should make webmasters very happy.
You can find out more about this update to Google Image Search here.

Thursday, August 16, 2012

Google+ Launches Vanity URL Feature for Select Users

Google is almost ready to host vanity URLs (a custom URL feature) for Google+ users. Saurabh Sharma, a Product Manager at Google, made the announcement Monday afternoon in an official Google blog post.

Initially, the vanity URLs will be offered to a select few – mostly celebrities, brands, and companies. Currently, a standard Google+ URL for a user looks something like this: http://plus.google.com/123456789060894565657/ (example).

The first few to get custom URLs include, but are not limited to, Britney Spears, David Beckham, Hugo Boss, and Toyota. A Dell custom URL, for instance, could read google.com/+dell; Google’s own URL is google.com/+google.

Google says it will later extend the feature to verified ‘commoners’ too. Facebook and MySpace also provide users with custom URLs.

Earlier, in a lighter vein, Google converted cartoonist Matthew Inman’s “The Oatmeal” profile into a custom URL reading something like http://plus.google.com/blergasdf1234thimbleturdorgasm99meatpoopypoopxv9donkeypie.

The major benefit of a custom URL is that people can easily find an individual, a company, or a brand. A vanity address is much easier to remember than a string of 10-12 digits. The word ‘plus’ is also no longer required to find someone on Google: just type google.com/+name and the person will appear at the top of the search results.
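To make the difference concrete, here is a small sketch comparing the two address forms. The helper functions and example names are purely illustrative, not any real Google API:

```python
# Illustrative sketch only: compares a numeric Google+ profile URL
# with the new vanity-style address. Not a real Google API.

def numeric_url(profile_id: str) -> str:
    """Old-style profile address: a long string of digits nobody remembers."""
    return f"http://plus.google.com/{profile_id}/"

def vanity_url(name: str) -> str:
    """New-style custom address: google.com/+name."""
    return f"google.com/+{name.lower().replace(' ', '')}"

print(numeric_url("123456789060894565657"))
# http://plus.google.com/123456789060894565657/
print(vanity_url("Britney Spears"))
# google.com/+britneyspears
```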

Sharma posted the following to the official Google+ page:

“Today we’re introducing custom URLs to make it even easier for people to find your profile on Google+. A custom URL is a short, easy to remember web address that links directly to your profile or page on Google+. For instance, +TOYOTA can now use google.com/+toyota to unveil their latest models, +Britney Spears can share her upcoming appearances at google.com/+britneyspears, +Ubisoft can share game trailers and videos at google.com/+assassinscreed, and +Delta can help travelers find great deals at google.com/+delta.”

Source: http://www.thenextseo.co.in/news/google-launches-vanity-url-feature-select-users/1398.html

Friday, May 18, 2012

Another step to reward high-quality sites

Google has said before that search engine optimization, or SEO, can be positive and constructive—and we're not the only ones. Effective search engine optimization can make a site more crawlable and make individual pages more accessible and easier to find. Search engine optimization includes things as simple as keyword research to ensure that the right words are on the page, not just industry jargon that normal people will never type.

“White hat” search engine optimizers often improve the usability of a site, help create great content, or make sites faster, which is good for both users and search engines. Good search engine optimization can also mean good marketing: thinking about creative ways to make a site more compelling, which can help with search engines as well as social media. The net result of making a great site is often greater awareness of that site on the web, which can translate into more people linking to or visiting a site.

The opposite of “white hat” SEO is something called “black hat webspam” (we say “webspam” to distinguish it from email spam). In the pursuit of higher rankings or traffic, a few sites use techniques that don’t benefit users, where the intent is to look for shortcuts or loopholes that would rank pages higher than they deserve to be ranked. We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings.

The goal of many of our ranking changes is to help searchers find sites that provide a great user experience and fulfill their information needs. We also want the “good guys” making great sites for users, not just algorithms, to see their effort rewarded. To that end we’ve launched Panda changes that successfully returned higher-quality sites in search results. And earlier this year we launched a page layout algorithm that reduces rankings for sites that don’t make much content available “above the fold.”

In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines. We’ve always targeted webspam in our rankings, and this algorithm represents another improvement in our efforts to reduce webspam and promote high quality content. While we can't divulge specific signals because we don't want to give people a way to game our search results and worsen the experience for users, our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics.

Here’s an example of a webspam tactic like keyword stuffing taken from a site that will be affected by this change:


Of course, most sites affected by this change aren’t so blatant. Here’s an example of a site with unusual linking patterns that is also affected by this change. Notice that if you try to read the text aloud you’ll discover that the outgoing links are completely unrelated to the actual content, and in fact the page text has been “spun” beyond recognition:


Sites affected by this change might not be easily recognizable as spamming without deep analysis or expertise, but the common thread is that these sites are doing much more than white hat SEO; we believe they are engaging in webspam tactics to manipulate search engine rankings.

The change will go live for all languages at the same time. For context, the initial Panda change affected about 12% of queries to a significant degree; this algorithm affects about 3.1% of queries in English to a degree that a regular user might notice. The change affects roughly 3% of queries in languages such as German, Chinese, and Arabic, but the impact is higher in more heavily-spammed languages. For example, 5% of Polish queries change to a degree that a regular user might notice.

We want people doing white hat search engine optimization (or even no search engine optimization at all) to be free to focus on creating amazing, compelling web sites. As always, we’ll keep our ears open for feedback on ways to iterate and improve our ranking algorithms toward that goal.

See more: http://insidesearch.blogspot.in/2012/04/another-step-to-reward-high-quality.html

Tuesday, May 15, 2012

Bigger & Tiered Index, Document Ranking, Sitelink Changes and More

It must be time for Google to share its latest list of search quality updates. And, like clockwork, Google didn’t disappoint today — the company has posted a list of 53 changes that affect search results.

This list is particularly interesting because of all the upheaval happening this month, what with Panda updates 3.5 and 3.6 occurring in an eight-day span, along with the Penguin update and a Google screwup related to parked domains. Yeah, April was a crazy month for SEO folks.

As always, there’s a lot to digest and the most important items don’t always reveal themselves right away. But here’s a look at the items that caught my eye after a first read-through of Google’s blog post.

Bigger, Tiered Index

Perhaps the biggest news is that Google has increased the size of its base index — the collection of web pages and documents it can show as search results — by 15 percent.

Similarly, Google also says it’s launched a new “index tier.”

Increase base index size by 15%. [project codename "Indexing"] The base search index is our main index for serving search results and every query that comes into Google is matched against this index. This change increases the number of documents served by that index by 15%. *Note: We’re constantly tuning the size of our different indexes and changes may not always appear in these blog posts.

New index tier. [launch codename "cantina", project codename "Indexing"] We keep our index in “tiers” where different documents are indexed at different rates depending on how relevant they are likely to be to users. This month we introduced an additional indexing tier to support continued comprehensiveness in search results.

That sounds almost like Google’s old “supplemental index” system that launched in 2003, and it may be tempting to say the supplemental index has returned, or something along those lines. But, as far as I recall, Google never said the supplemental index was going away; it said they’d stop using the “Supplemental Results” label on search results that came from it.

SEO & Ranking Updates

Google has also announced numerous updates that relate to how documents are ranked, updates that sound like they’re at least related to — if not part of — the larger Panda and Penguin updates that are already well known.

Improvements to how search terms are scored in ranking. [launch codename "Bi02sw41"] One of the most fundamental signals used in search is whether and how your search terms appear on the pages you’re searching. This change improves the way those terms are scored.

Keyword stuffing classifier improvement. [project codename "Spam"] We have classifiers designed to detect when a website is keyword stuffing. This change made the keyword stuffing classifier better.

More authoritative results. We’ve tweaked a signal we use to surface more authoritative content.

More domain diversity. [launch codename "Horde", project codename "Domain Crowding"] Sometimes search returns too many results from the same domain. This change helps surface content from a more diverse set of domains.
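The domain-diversity item can be pictured as a simple post-processing pass over a result list. This is a toy sketch under my own assumptions, not Google’s actual implementation: cap how many slots any single domain gets, pushing the overflow further down.

```python
# Toy sketch of a domain-diversity re-rank: no domain may occupy more
# than max_per_domain of the top slots; overflow results are demoted,
# not dropped. Purely illustrative, not Google's algorithm.
from urllib.parse import urlparse

def diversify(results, max_per_domain=2):
    counts, kept, overflow = {}, [], []
    for url in results:
        domain = urlparse(url).netloc
        if counts.get(domain, 0) < max_per_domain:
            counts[domain] = counts.get(domain, 0) + 1
            kept.append(url)
        else:
            overflow.append(url)
    return kept + overflow

results = [
    "http://a.com/1", "http://a.com/2", "http://a.com/3",
    "http://b.com/1", "http://c.com/1",
]
print(diversify(results))
# a.com keeps two slots; its third result drops below b.com and c.com
```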

If I were to guess, I’d think that the first two items above could be related to “spun” content — one of the practices that Google likely considers to be a hallmark of low-quality content. But that’s just a guess on my part.

The last two items — authoritative results and domain diversity — almost sound contradictory. At least to me, on a Friday afternoon during a long week with international travel.
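For intuition on the keyword-stuffing classifier mentioned above, the most naive starting signal is simple term density: what fraction of a page’s words are the target keyword. This is a toy heuristic of my own, assuming nothing about Google’s real classifier:

```python
# Naive keyword-density check: the kind of raw signal a keyword-stuffing
# classifier might build on. Illustrative only, not Google's classifier.
import re
from collections import Counter

def keyword_density(text: str, term: str) -> float:
    """Fraction of words in `text` equal to `term` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return Counter(words)[term.lower()] / len(words)

stuffed = "cheap shoes cheap shoes buy cheap shoes best cheap shoes online"
normal = "our store sells a wide range of footwear for every season"

print(round(keyword_density(stuffed, "cheap"), 2))  # suspiciously high
print(keyword_density(normal, "cheap"))             # 0.0
```

In practice a real classifier would combine many such signals; a single density threshold would misfire on legitimately repetitive pages like glossaries.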

Sitelinks Updates

There are several changes related to Google’s sitelinks and “megasitelinks” — the additional links that show up below a top-ranking result for some queries.

If you spend time trying to optimize for sitelinks (and if you have an authoritative site, it’s probably a good idea to be doing that), these changes are worth reading closely. Here are those changes, word-for-word from Google’s post:

“Sub-sitelinks” in expanded sitelinks. [launch codename "thanksgiving"] This improvement digs deeper into megasitelinks by showing sub-sitelinks instead of the normal snippet.

Better ranking of expanded sitelinks. [project codename "Megasitelinks"] This change improves the ranking of megasitelinks by providing a minimum score for the sitelink based on a score for the same URL used in general ranking.

Sitelinks data refresh. [launch codename "Saralee-76"] Sitelinks (the links that appear beneath some search results and link deeper into the site) are generated in part by an offline process that analyzes site structure and other data to determine the most relevant links to show users. We’ve recently updated the data through our offline process. These updates happen frequently (on the order of weeks).

Less snippet duplication in expanded sitelinks. [project codename "Megasitelinks"] We’ve adopted a new technique to reduce duplication in the snippets of expanded sitelinks.

The first item seems to be saying that sub-sitelinks may show up instead of a text snippet, which I think means that some search results could have two layers of sitelinks — megasitelinks below the main result, and then sub-sitelinks below one of the megasitelinks. I’ve not seen anything like that yet.
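The “minimum score” item above can be read as a simple floor function. This is a speculative reading of the one-line description, not Google’s published formula: an expanded sitelink’s score is never allowed to fall below some fraction of the same URL’s general ranking score.

```python
# Speculative sketch of the "minimum score" idea for expanded sitelinks:
# floor the sitelink score at a fraction of the URL's general ranking
# score, so pages that rank well overall keep reasonable sitelinks.
# The floor_fraction value is an invented example parameter.

def sitelink_score(raw_sitelink_score: float, general_score: float,
                   floor_fraction: float = 0.5) -> float:
    return max(raw_sitelink_score, floor_fraction * general_score)

print(sitelink_score(0.2, 0.9))  # floored up by the general score
print(sitelink_score(0.7, 0.9))  # raw score already above the floor
```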

Local-related Changes

There are a couple changes related to local/geo searches and search results.

Improvements to local navigational searches. [launch codename "onebar-l"] For searches that include location terms, e.g. [dunston mint seattle] or [Vaso Azzurro Restaurant 94043], we are more likely to rank the local navigational homepages in the top position, even in cases where the navigational page does not mention the location.

Country identification for webpages. [launch codename "sudoku"] Location is an important signal we use to surface content more relevant to a particular country. For a while we’ve had systems designed to detect when a website, subdomain, or directory is relevant to a set of countries. This change extends the granularity of those systems to the page level for sites that host user generated content, meaning that some pages on a particular site can be considered relevant to France, while others might be considered relevant to Spain.

In the first item, Google seems to be saying that it’s able to identify the correct local result for specific navigational searches — such as a search for a specific local restaurant — even if the site/page is poorly optimized for local search.

It’s reminiscent of the “Venice” update earlier this year which involved Google launching ways to better correlate web pages/documents to their locations.

Miscellaneous Updates

In addition to the items I’ve highlighted above, read through Google’s post for these other items that caught my eye:

  • A change that should reduce the number of paginated results showing on a search results page.
  • Two changes related to snippets, including one which promises to show more text from the beginning of pages.
  • Three changes related to freshness — fresh results and freshness signals, and one that ignores fresh content if it’s deemed low quality. (There’s also an Autocomplete change designed to reduce the visibility of low-quality results.)
  • A change that Google says will help it show more informative/concise titles in its search results. (For what it’s worth, just about every SEO that I know wishes Google wouldn’t change titles at all.)
  • Improvements in how Google uses previous search activity to determine your intent as you continue to search.

That’s a lot to digest and something else may have caught your attention as being important. The comments are open, so let us know what stands out for you as you look through Google’s April search changes.