Recapping Sarah MacKay’s Interview With Google’s Matt Cutts

I’ve just covered a few of the points from the recent interview between Sarah MacKay (SM) of WebmasterRadio and Matt Cutts (MC). There’s hardly anything groundbreaking, but it’s always useful to hear it straight from the horse’s mouth. I’ve tried to stay as true to the interview as possible, with as little editing as possible. Any emphasis is mine. You can get the full interview in MP3 from WebmasterRadio here.

SM You must be the most popular person in the search world after Las Vegas?
MC There are a ton of people like me at Google. I end up doing a lot of this stuff with webmasters, but there are so many people here who feel the same way and who want to make sure that we get the best search results for people.

On Duplicate Content
SM Should people still be concerned with duplicate content?
MC On a page level, sometimes people ask about a .com and a .de for example, or ‘I have some content but it’s in a different language and it’s saying kind of the same thing’. That’s definitely not something you need to worry about. Typically, when someone has a set of articles and also has a printable version, that’s also not something people need to worry about in terms of duplicate content, because we’re usually pretty good about finding the best copy of that information and just showing that, without any penalties or anything like that. Again, if people are just covering a topic in a general way and saying things in different ways, that is also not a reason to worry.

The thing that I would start to worry about is where you have multiple repeated sentences, or where you are copying and pasting a paragraph or even entire pages and putting them in multiple places: multiple subdomains, multiple domains or multiple subdirectories. If it’s the sort of thing where you can do a search and find 5 copies of the exact same phrase, that starts to look bad. But if you’ve just restated the same thing, say once for experts and again on another page for novices, that’s definitely not something to worry about as far as duplicate content.
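The ‘do a search and find 5 copies of the exact same phrase’ rule of thumb can be sketched as a simple self-check over your own pages. This is only an illustration of the idea, not Google’s actual duplicate detection; the page names and texts below are made up.

```python
import re
from collections import Counter

def duplicate_phrases(pages, min_words=8):
    """Count exact sentence repeats across a set of page texts.

    A rough sketch of the 'same phrase in many places' test --
    `pages` maps a (hypothetical) page name to its plain text.
    """
    seen = Counter()
    for text in pages.values():
        # Split on sentence-ending punctuation; crude but enough here.
        for sentence in re.split(r"[.!?]+", text):
            words = sentence.strip().lower().split()
            if len(words) >= min_words:
                seen[" ".join(words)] += 1
    # Keep only sentences that occur more than once anywhere.
    return {s: n for s, n in seen.items() if n > 1}
```

Short restatements in different words would not trip a check like this, which matches the distinction drawn above: exact copy-and-paste repeats are the concern, not covering the same topic twice.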

On The Sandbox

SM Is there any amount of time that a brand new website should expect to wait before they see their site spidered and ranked?
MC Not automatically. We try to be pretty good about finding sites relatively quickly. The thing that I would caution people against is thinking ‘I’m going to make a site on Monday and it’s going to be ranking for my banner phrase by Friday’ or something like that.

The advice that I normally give is to start out with a relatively small niche, something where you can get to be an expert, where you can get to be well known, like this is something that you really do well. Then once you get known in that niche you can build out from there. Your reputation can grow slowly as more people find out about your site.

On Mom And Pop Sites vs. Corporations

SM When you are competing against such huge corporations that have been out there and are established, with excellent sites, you’ve got to work your way up slowly?
MC It’s absolutely the case that we want small mom and pops to be able to compete well with large corporations as long as they have really good information, good resources or good services.


On Choosing An SEO

SM What is the best way to sort the good from the bad SEOs?
MC That’s a great question. A couple of points I would start out with.

  • I would absolutely ask for references.
  • Look out for high pressure tactics.
  • Try to get a feel for the reputation of the site.
  • They should explain exactly what they are doing.

Ultimately you are responsible for your pages and the pages on your site. We recently have been tracking down an SEO in Spain who ended up doing a lot of SEO for clients, but would hide all sorts of other content on their clients’ pages, including links back to the SEO. Unless you really know what your SEO is doing and they can explain it clearly, that can be a problem, because you don’t really know whether they are doing something unsavoury or unethical.

SM How can companies who have made mistakes correct themselves?
MC Yeah, it depends. The best way to handle that is to do a reinclusion request. There is information on our webmaster pages and on my blog. In essence, what we need to know is:

  1. The spammy content is gone.
  2. That it’s not going to happen again.

We look at whether people are willing to provide specifics about what happened, the timeline, the product or which SEO they were using. Those sorts of things give a real ring of truth, to say this person is trying to make sure that their site has straightened up and is flying right. Those are the sorts of things that help.

One thing to bear in mind is that this will help with any penalty that has been imposed manually by our spam fighters, but sometimes our algorithms change, because we have to keep improving our algorithms to return the most relevant search results. If it is our algorithms that are saying your site is not as good, or we don’t think this site should be returned over another site, then a reinclusion request can’t help with that.

A reinclusion request can only lift penalties that have been put in place by our spam fighting team. It certainly never hurts to do a reinclusion request, but sometimes it is our algorithms that decide a site shouldn’t be returned over another site.

SM For those people affected by a change in the algorithms, is it best for them to take steps to improve their existing site or if it’s at a point where it is so bad should they just start over?
MC Well, a lot of the time you can get a feel for whether the site is completely gone, in which case that might point more towards the site being down, the server being down, or a manual penalty. It could also be the case that the server is returning some really weird content errors.

So one thing I would recommend is that people enroll in Google Sitemaps, which gives them access to a webmaster console. They don’t have to provide a sitemap (a list of URLs for Google to crawl on their site); just by signing up you can get a list of all the error pages we have seen: 404s, problems with robots.txt and stuff like that.

For example, in the PubCon talk I mentioned that Nissan Motors has a robots.txt on its site, so it’s not anything that Google doesn’t like about the site; it’s literally that they have told us ‘you are not allowed to crawl our site’, which is pretty strange for an automobile company in 2005. I would definitely try the webmaster console.
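A robots.txt that tells every crawler to stay out of the entire site, as described here, is only two lines. Python’s standard urllib.robotparser can confirm the effect; this is just a sketch, and the URL is hypothetical.

```python
from urllib.robotparser import RobotFileParser

# A disallow-all robots.txt: every user agent is barred from every path.
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot (like any other crawler) is refused the whole site,
# so its pages can never be crawled and indexed normally.
allowed = parser.can_fetch("Googlebot", "http://www.example.com/vehicles.html")
print(allowed)  # False
```

This is exactly the kind of self-inflicted block that the webmaster console’s robots.txt reporting is meant to surface.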

SM Is there a set penalty period for people who have done something wrong?
MC Yeah, there is. We do have policies, but it varies by the severity. Something like hidden text could vary from 30 days up to 60 or 90 days. Then if we have seen repeated incidents it can go much longer. If somebody is using JavaScript to dynamically construct a new frame that will hide text, or something that is really clearly deliberate, that would probably result in a much longer penalty.

A good rule of thumb is if you think you were doing something shady or your SEO was doing something shady and you’re pretty sure you have been caught, go ahead and fix whatever it is that you think might be the problem and then drop us a note via a reinclusion request.

On His Blog

SM You’ve seen a remarkable response to your blog, haven’t you?
MC One thing that has always been frustrating to me is that there was no informal place where you could point to some tips about a reinclusion request, or ‘here’s how you can set up something simple to use Google Sitemaps’. It’s been a lot of fun. At the WebmasterWorld conference a ton of people were saying, ‘I read your blog’. That made me feel really good. But the main thing is, it’s just a nice informal way to talk about some fun stuff, but also to give some informal information out to help webmasters.

It’s a good way for people to send feedback or ask questions. The only thing that has been frustrating is that I do it at nights and on the weekend; I don’t take too much work time out to do it. Maybe I should, but there’s only a finite amount of time I have to work on it. It’s been a lot of fun, and it’s been really rewarding so far.

On Google Site Search

SM You can put Google site search on your site to search both internally and externally. Have you seen a good response from that? How is it benefiting users?
MC It’s in everybody’s best interest to have good search. I have heard about studies of people leaving a site where a large percentage said they were frustrated by a bad searching experience. We do provide a free site search where you can just add a box and you’ll actually be doing a Google search. It’s a really flexible tool that lets a lot of people get added value very easily. Having something that makes your site search easy is really, really nice.
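The ‘just add a box’ approach boils down to sending the visitor’s query to Google with a site: restriction. A minimal sketch of the URL such a search box would produce (the domain is hypothetical, and this is an illustration of the idea rather than the exact service):

```python
from urllib.parse import urlencode

def google_site_search_url(query, site):
    """Build a Google query restricted to a single site.

    This mimics what a simple site-search box does: pass the
    visitor's query through to Google with the site: operator.
    """
    return "https://www.google.com/search?" + urlencode(
        {"q": f"{query} site:{site}"}
    )

url = google_site_search_url("widgets", "example.com")
```

A plain HTML form with one text input submitting to a URL like this is all a basic version needs, which is why it counts as an easy win for small sites.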

On Froogle

SM How is Froogle different from other shopping services? Why did Google put resources into it?
MC It’s kind of interesting. Froogle has a couple of things that are different.

  1. We don’t accept any kind of payment for inclusion, it’s totally free.
  2. We thought it was pretty important that the left hand side of the shopping comparison be completely free. You don’t have to worry about somebody paying for top listings or kickbacks.
  3. We do have ads on the right hand side, but they are clearly marked like all our ads.
  4. We crawl the web. People can submit their feeds but we can also increase our coverage by finding products out on the web that have prices.
  5. Sometimes without having to do anything whatsoever we can drive some traffic to your site.

It has been through some changes recently where you can do a lot of stuff like sorting by categories and grouping by price. There is also something where you can do local searching so it can find inventory in real time in your local neighbourhood. They’ve added reviews, so you can see different ratings for products and vendors.
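Grouping by price, as described, just means bucketing listings into price bands. A toy illustration with made-up product data (not Froogle’s actual grouping logic):

```python
from collections import defaultdict

# Hypothetical product listings: (name, price in dollars).
products = [
    ("Blue Widget", 12.99),
    ("Red Widget", 8.50),
    ("Widget Deluxe", 45.00),
    ("Widget Mini", 9.99),
]

# Bucket listings into $25-wide price bands, e.g. $0-24.99, $25-49.99.
buckets = defaultdict(list)
for name, price in products:
    low = int(price // 25) * 25
    buckets[f"${low}-{low + 24.99:.2f}"].append(name)
```

Sorting within each band is then a one-liner over the band’s list, which is roughly the shape of the category/price browsing the interview mentions.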

On Search And Webmaster Conferences

SM Will we see you at any of the upcoming conferences?
MC After 5 or 6 years at Google I’m trying to cut it down to 4 or 5 conferences a year. It is important to be able to speak at conferences, and we try to send people to as many conferences as we can. But we also have our user support pages, and we answer questions in forums, on the blog and in places like that, where people can get feedback and good information.
