Since early 2009, Google's Matt Cutts has recorded a superhuman number of videos to help struggling site owners understand their sites in search. While the videos are great, sometimes the guy just needs to get to the point. With that in mind, we've done the hard work and watched every Matt Cutts video to pull out simple, concise versions of his answers: The Short Cutts!
526 Short Cutts (Running time 19:48:31)
404s are rechecked more frequently than 410s.
Google will try to execute JS as the page loads.
Authoritative sites tend to have a better PageRank.
Search Quality Raters manually assess side-by-side results and judge which is best, alongside various other tests.
For manual actions check out Webmaster Tools. Check dates of algorithmic updates. Tweak content.
Yes, but Google will only pick up the first one.
404 the page. Consider displaying related products and using the "unavailable_after" META tag.
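For reference, that META tag looks like this (the date shown is illustrative):
    <!-- Googlebot drops the page from results after this date -->
    <meta name="googlebot" content="unavailable_after: 25-Aug-2014 15:00:00 PST">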
Either use a responsive site or, if you have a separate mobile site, use rel=alternate pointing to the mobile URLs with a rel=canonical on the mobile site pointing back.
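A minimal sketch of that bidirectional markup, assuming a desktop page at www.example.com/page and its mobile twin at m.example.com/page:
    <!-- On the desktop page -->
    <link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/page">
    <!-- On the mobile page -->
    <link rel="canonical" href="http://www.example.com/page">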
Percolator = Incremental indexing. Dremel = Like MySQL but for huge databases. Pregel = Solution to graph problems
Anything that is deceptive or manipulative or where high value material compensation is traded.
Clarity, but also include jargon.
Internally yes, externally no.
Keep the site up-to-date, move with the times.
Use the "rel=canonical" tag
Ideally, but different TLDs are also a top indicator.
Google decides which is the most important and ranks that one highly whilst lowering the others.
Do not use the same sites over and over, and always write unique content, not spun.
The more serious the action taken by the webmaster, the more seriously Google treats it.
No limit, but ensure the links are not spammy.
No - Keep them unique and apply them to the most important pages.
Not directly, although sites with more pages have a better chance of ranking due to wider variety of keywords.
Keep it natural and spam free.
No, provided it's for a legitimate purpose.
Public PageRank is becoming deprecated, so it is updated less often.
Ordinarily no, as internal links should not normally use rel="nofollow".
Keep it small and have plenty of unique content on the same page.
Make sure you have high quality content on your website that people would refer back to.
Typically no, unless it is abused and a manual action is applied.
Users should send spam reports and Google will take action.
Google is trying to do more, but there is a problem of scale.
No more than non-mobile, but users want quick mobile experiences.
A rel nofollow tag on widgets and infographic embeds is recommended.
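In practice, the embed snippet you hand out would look something like this (URLs are placeholders):
    <!-- The link back carries rel="nofollow" so it passes no PageRank -->
    <a href="http://www.example.com/infographic" rel="nofollow">
      <img src="http://www.example.com/infographic.png" alt="Example infographic">
    </a>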
No; contact them and ask them not to link to you, then disavow them using the disavow tool.
They can't show them all, but in time they plan to show around 3 links in messages to help diagnose the problem
You can as long as Google treats them as generic, otherwise they're tied to a specific country. List here: https://support.google.com/webmasters/answer/1347922?hl=en
If you're not trying to stuff hidden text in there then don't worry about it, it's a normal thing on today's web
As long as it's not spammy or keyword-stuffed, you shouldn't stress about it. Google understands lots of sites need boilerplate content.
Not usually, it can look spammy. If they are country TLDs, perhaps have a single country locator page and link from there.
They're working on providing examples as the messages go out, or a place in Webmaster Tools to see examples. Meanwhile, ask a Googler in the Webmaster Forums for examples
No, if it's just for a short period then you shouldn't worry about that. If it's for, say, 2 weeks, then yes that could be a problem
People use more natural language. Google want to master conversational search
Google prefers a source link to be near the top, but anywhere in the article is ok
Yes, released relatively soon. For now there are PHP and Python libraries to allow you access to this data
Google don't hate Albania. Google.al launched earlier in 2013 and there was a Google doodle for Albania independence day recently, but proper support takes time
No, but maybe it's a signal for Google to look at in the future
Google will include some examples of URLs they have flagged as spam, so that webmasters know where to look to clean up links
Uploading files that aren't in plain text (Excel, Word etc.), trying to disavow individual links rather than whole domains, using incorrect domain: syntax, having the reconsideration plea in the disavow text and leaving lines uncommented that should be commented
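For contrast, a well-formed disavow file is plain text along these lines (the domain and URL are made up):
    # Asked the site owner to remove links on 1-Jun-2013, no response
    domain:spammydirectory.example.com
    # A single bad URL rather than a whole domain
    http://www.example.com/bad-page.html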
Yes, they're always looking to use that signal in useful ways. E.g. It may mean pages on non-important sites are surfaced if content is written by an important author.
Data refreshes are the same as algorithm updates (they're not). Updates are intended to sell more ads (they're not). Link building can replace good UX and design (it can't).
Paid content must be disclosed to users and the links no-followed. Google will take action if not.
Not usually, but sometimes as an in-joke.
All search engines are subjective to some degree, but Google take outside influence where it helps to improve search results
If the site has multiple pages that are a really good match for the query.
Penguin 2.0, advertorials that pass PageRank, attacks on specific spammy areas like 'payday loans' results, more sophisticated link analysis, better detection and reporting of hacked sites, detection and better ranking for people who are industry authorities, refining the Panda algorithm, and better communication to webmasters generally.
Yes, it's still used by Google though maybe not other services
J.Crew or Eddie Bauer. In Medium
That patents Google has been granted are in use. They're not necessarily
Not having a domain at all, not making it crawlable, not using words people search for on your page, not having compelling content, not having good titles and descriptions, not making use of webmaster resources.
No, not for a small number of related sites.
Do a spam report with examples and they'll prune them out of results
Should be fine in one go, but probably add in stages
It's possible but can be difficult. Best to get a fresh domain
Typically, no. Don't worry about it
More info becomes available to determine rank
Focus your content and improve your site
You will lose the benefit of that link
The algorithm isn't good enough, or you can't see all the links
Matt sees Google as a non-invasive assistant
It's not recommended
Put your site back the way it was
The same as a link
Yes, turn on 'Priority Inbox'
Look in Google Webmaster Tools in Traffic > Links to Your Site
People use it
Impossible to say, but it's probably referrer spam
Try to keep the same story on a single URL as it evolves
He pretends to be a dinosaur
Don't worry about it, but if you are then use Disavow tool
People falling for snakeoil SEO promises
Explain what you've done to change your ways, giving assurance it won't happen again
301 is a direct instruction, rel canonical is just a strong hint
They already have in Dublin
They're estimates. They're only accurate to 3 significant digits
Use both if it feels natural
Ads that don't pass PageRank are fine
Maybe you're cloaking. Use Fetch as Googlebot in Webmaster Tools to confirm
Matt sees the positive side of the web instead of just the spam
Yes, maybe to Search Experience Optimisation
About 2 weeks
Google usually detects they're irrelevant, but 301, canonical tag or Webmaster Tools can help
It's ok if they're natural
When the search is similar to a brand search
Either no-one uses them or Google can't support them anymore
People would use it for evil
If you can't add value, don't add pages
Not for normal search but potentially for vertical specific e.g. recipes
Not really, unless you're selective about what you repost, and have lots of unique content
You're ok if there's just a few
Make sure you write the article yourself so it's not the exact same article as others. Do another angle
Use webfonts instead
We crawl the web, crawling the best pages more often. Then we sift documents based upon queries and rank those results
Will return the best results to the user, regardless of the TLD it's on
Meta description matters, meta keywords don't
Let Googlebot find it in its own time
Bart was the internal code name; it was known as Fritz externally
There are no guidelines on having tracking pixels, so this shouldn't affect rankings
When a freehost has got infested with so much spam
Report it on the Webmaster Forum
The ASYNC code shouldn't slow the site
Google doesn't share the data and it isn't used in search ranking
It's not just optimisation. It's content, links, user experience etc.
Use 301s in general
Hacker News, Digg, Reddit, Techmeme, Tech Crunch
They're a US company. Internationalisation is hard
There is a team working on it. In general Google can handle some AJAX
Point a rel="author" link to your profile, then name your posts on your profile
Add rel="author" on the end of link URLs and make the anchor text "+your name"
One or two is fine; any more than five and Googlebot will probably not follow. Avoid mixing 301s and 302s as well
Look at backlinks, internal linking, canonicalization; the most common cause of a drop is violating quality guidelines, e.g. selling links
Looks for certain words and gives them certain weight. If there are enough words with enough weight then it says "this looks like it needs filtering"
We support all people on the web using forums, webmaster videos, blogs, chats etc.
Unlikely. It's not worth Google's time
Maybe, maybe not. Your time would be better spent elsewhere
Yes, they're not treating it as specific to Colombia
No, but using Google Translate to auto-generate this can be spammy. Get a human to do a real translation
We already try to compensate for these sites
Use Google Alerts
Use subfolders and set up in Google Webmaster Tools
We look at canonicals, rel="author" tags, where we found it first and if it's scraped content
Not for Google. Not a ranking factor at time of video
Matt wouldn't recommend going to those lengths
They may come back soon
There are about 40 domains you could use that aren't country specific
No, but it doesn't hurt to put the effort in
Cloaking is bad. Very bad. Don't cloak content
It's an umbrella term for PageRank, site history, respectability, content quality etc.
We sometimes take the DMOZ description and use it in your SERP listing
It will be by the time you read this
Use good text, good titles, and fresh content
We average data from Google Toolbar users
Start with site:mysite.com, check for malware, check Webmaster Tools. Try the Webmaster Forum. Is it just your site, or lots of sites?
Ranking reports aren't important. Concentrate on how your site converts
They are. Don't use paid directories
Ok if it's a normal implementation
Not a fan
You don't need keywords in the domain name
Optimise site speed, control of CMS, education program, internal linking, social media
The percentage of nofollow is very small, not a big problem
Test first, but it should be ok
Not so bad on a couple of domains if they offer something different
Do what is right for the site, but a little more content is better for Googlebot
This should not be classified as spam
Yes, but they usually have it anyway
It's how Google reduces latency between finding a document and users being able to search it
Report it via the Google Webmaster Forum
To stop people obsessing over it
It doesn't always, but it's usually the best or only place for the product
It can do. Do some tests on certain pages
It doesn't matter
A teaser or title is ok, not the whole post
Use the Safe Browsing diagnostic service. Use Webmaster Tools. Try unmaskparasite.com
No, it's very useful for helping Google provide great results
Underscores link words together. Dashes show separate words
Yes it is. Do something good for users, show both to all
Build authority on your main site or small set of sites
People don't tend to get good links to porn sites
When Google crawls, it's a snapshot in time. It might miss important links in a rotation
No, the algorithm handles that
If they're really bad, it can have a negative impact on crawling or indexation
No (At the time of video, you can now with the disavow tool.)
Matt loves new search engines and competition
Test first with a subpage and 301 it. Then start with the smallest parts of your site and move it gradually
Lower your DNS TTL to about 5 minutes; duplicate the content to the new site. Once traffic picks up you can remove the old site
Have as many as you like, but don't chain more than five
Tweet it, use pubsubhubbub to get it crawled. Do a DMCA request if you're ripped off
Read the papers and literature. Check out Jeff Dean, Urs Holzle, Luiz Barroso
A few months, maybe. Only if all signals confirm the change
Google don't want to play with the robots.txt spec, they want it in Webmaster Tools
They try not to make changes during the holidays, but it can happen
Google already does
Translate the content, put them on separate domains
Keyword stuffing, duplication, link quality, hacked sites, communication with users
No it's not, as long as it's the same for Googlebot
Sites should be more nuanced in their approach
Not a guideline anymore. Do what is right for the page and users
Not using Twitter in ranking (they can) and returning non-crawled pages as family safe
After recrawling. Perform a reconsideration request first
A combination of PageRank, URL length, usefulness
Use related questions links, highlight good ones, add voting buttons etc.
Just don't use an underscore
Yes, FTP, WML, WAP etc. But use HTTP/HTTPS
Treat customers badly to get complaint links; the spam team are slackers; only links matter
Google tends to treat them as regular 404s
Tripit, 3rd party batteries, podcasts, MySixSense, Wifi Analyser, Mapviewer etc.
Hacked Sites, Communication, Social, Online Reputation
About 1 month
No official position on this
There's no API for this, but more data will be provided in time as people do request that
Buy one domain name, as much hosting as possible and go from there
First crawl date. Don't worry about it
Tell them Google agrees. For results that will stand the test of time, build a great site
Yes, FeedBurner already has one
Let Google figure out the duplicates on their own before taking this step
They're irrelevant to search engines, but worth it to get visits
Virtually no difference
It won't be the deciding factor
It's an estimate
Anchor text in inbound links
Ask the Intern and The Garbageman
Googlebot can interact with forms, or maybe there are just links you don't know about
Personalised Search, Country/City Dependent, Different Data Centre
Update or remove If-Modified-Since Header
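Concretely, when Googlebot revisits an unchanged page with a conditional request, the server should answer 304 (a schematic exchange, headers abbreviated):
    GET /page.html HTTP/1.1
    Host: www.example.com
    If-Modified-Since: Sat, 01 Jun 2013 10:00:00 GMT

    HTTP/1.1 304 Not Modified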
New high quality content
URL Path but mainly for UX
goo.gl URL Shortener, Favourite Doodle
Yes, depending on the length of downtime
Yes, if you want your Twitter profile to rank. No for the outbound links
Matt doesn't know, but it's good suggestion
Google has an open mind
Do what's best for the user
Still as an advocate for its users
Incorporate data around the image
It notifies Google quicker than just 301s, but do both
Search for the phrase on Google
Google will try to interpret the page the best it can
Use techniques which will stand the test of time
No, this is cloaking
It's a work in progress
Quality > Frequency
Keep it up to date and restrict the /wp-admin/ folder to trusted IP addresses
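One common way to do the IP restriction is an .htaccess file inside /wp-admin/ (Apache 2.2 syntax; the IP address is a placeholder):
    # Deny everyone except trusted addresses
    Order deny,allow
    Deny from all
    Allow from 203.0.113.5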
301 to main site if you have them. Don't make multiple sites
mail.google.com, google.com/calendar, www.techmeme.com, Google News, Tech Crunch, Google Reader, Twitter, Friend Feed
Bad markup. It just needs more time or Google don't have enough trust in your domain
It doesn't really matter what your URL structure is
There are 200 in total
There was an algorithm change called Mayday
To avoid people selecting them all
One that lists all bookmarks instead of the bookmark bar
Typically a few days
Some countries use DMOZ because it's easier to click than type, but we rely on it less.
You can't target at state level
One or two pages is fine; if it's lots, then block them
Work in progress
More mobile and more refined
It refines results when possible
Google understands them well
Trying to understand the user's intent, and improving document understanding
Don't worry about it
Controversy, participating in the community, original research, newsletters, social media, lists, blogs, how-tos and tutorials, run a useful service, make a few videos
Yes, but it helps if you use the canonical tag
No guarantees to crawl anything in the sitemap. To get more pages crawled get more authority/reputation
As long as we haven't crawled the site when it was hacked, no
The web, gadgets, how to improve search quality
It can, yes
No, that's cloaking
Can be good or bad, like any form of linking
More than 25 and less than 1,000
Hopefully over time Google will get better at executing JS and AJAX and representing that appropriately in our search results
As long as it's natural, it's fine
Create a sitemap index referencing multiple sitemaps
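A minimal sitemap index, with example.com standing in for your own domain:
    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://www.example.com/sitemap-products.xml</loc>
      </sitemap>
      <sitemap>
        <loc>http://www.example.com/sitemap-articles.xml</loc>
      </sitemap>
    </sitemapindex>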
I don't know, maybe they just like it?
Look at meta data, captioning, and tags to images
In general, Google figure out what your post is about, so don't worry too much about it
Make something excellent
Not at the moment
The top 9 screen resolutions in Google Analytics were all bigger than 800x600
Don't worry about it. There's no special benefit to it being the first link
If the websites are relevant then a small number isn't an issue
Return information real-time, UI changes, hacking websites will still be a trend
Have a set of delimited links on your site that accurately reflects the site's hierarchy
Google treats all links the same
Interesting content gets followed links
It will eventually fade out, just wait
If you can make your site work on mobile, then do that instead
Filter and sorting by recent results and type of results (video, forum, reviews)
That people gain weight when they first start working at Google due to free food (the Google 15); people want to make a Space Elevator
To only show the q= parameter in the URL and not all other parameters
404 for a single page; block via robots.txt for the whole website
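The whole-site block is just two lines of robots.txt:
    User-agent: *
    Disallow: /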
Phrase searches, the minus sign, using the plus sign, site:
Because they are getting more links
Google Tech Talks, Google IO, Reporter Centre
Use HTTPS; use POP3; labels; turn off web clips; change your theme
Make sure site isn't hacked. Go to Google Webmaster Help Forums, file reconsideration request
Probably not. Could be seen as reciprocal link
I wouldn't be surprised
If you manually change the geo-location from what Google thinks, you'll notice you may not appear in the old location as often
He'd create a useful service and operate completely transparently
They choose the most relevant/useful snippets and titles to the user's query
Go download Google Mobile app - it's got voice recognition
301 redirect each webpage to the new webpage. Contact any good links you may have, asking them to update their link
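On Apache, the per-page 301s can be as simple as this in the old site's .htaccess (paths and domain are placeholders):
    # Permanent, page-by-page redirects to the new domain
    Redirect 301 /old-page.html http://www.newsite.example.com/new-page.html
    Redirect 301 /about.html http://www.newsite.example.com/about/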
UP, UP, DOWN, DOWN, LEFT, RIGHT, LEFT, RIGHT, B, A - Secret ninja mode
Google will show the best version
Redirect by IP, but don't do anything special for Googlebot
We make a lot of small updates instead of large updates. When we have big changes, we are happy to confirm that they have taken place
Most of the time you don't need to worry about it. You might get a link to you from it, so worst case it won't hurt
Too many have spammed it
We want as few bytes as possible to reduce download time
Disclose the relationship with surrounding text
Just treat Googlebot the same as a user
Query match, authority and links
Yes, often they're less valuable
One day, but they're taking it slowly at first
A .com may be the best result for UK
You can search by your location on Google on a smart phone
They don't really
Only if you link to spam
They are always improving it
Go for small niche and be creative
Yes, it can affect what countries you rank in
If it's software, it's fine
Stagger your launch, keep same URLs
Use Webmaster Tools or Yahoo Site Explorer
Use the alt attribute
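For example, a short description of what the image shows (filename and wording are illustrative):
    <img src="golden-gate-bridge.jpg" alt="Golden Gate Bridge at sunset">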
It's more about the value to users
Don't worry about it
Don't worry about it
Matt doesn't agree with the premise
Make it visible to Google and Users
No, it's just something people want to link to
Nothing, Google should get better over time
There's no real limit
No, it shouldn't
It's a factor in which countries SERPs you show in
Link to them from the root page
It's fine, but work on links and content first
Yes, try to add unique value
To not index email addresses
Google won't at the moment, so you can exclude it
We found a link to the page but didn't crawl the page itself
Don't worry about it
The string now contains the result position
They do for most penalties
Average time to get the page
You can if you want, Matt wouldn't bother
No, probably not
You should let Googlebot crawl it
No, Google will decide that
Yes, with event tracking etc.
Yes, but not specifically for social media
Use a sitemap, link from root, use dropdowns instead of text boxes
Create rich content. Do the hard work
They don't really care
It doesn't affect ranking so leave them in
Yes, you should
File a spam report with Google
Get unhacked, then send reconsideration request
Have key pages near the root
It's a trade off
It may affect the referrer data in analytics programs
Don't worry about it, but keyword in the URL helps a little bit
It's a subset of total backlinks. Use Webmaster Tools for more. And yes, they do
Lots of valid reasons to do this, but Google might scrutinise it
Don't overdo it, but it's acceptable
It's about great content and authority
Matt has no idea
Probably not. It's open to abuse
Indirectly. It's all about trust, authority, PageRank, reputation and quality
No and no
They can, but fix the structure first. Be careful with it
You can see this data in Webmaster Tools
New types of data to search; semantic search; mobile; people will store more data in the cloud
There are no plans