Yes, it’s yet one more post about the Matt Cutts videos. You’re probably sick of hearing me talk about them all week, so I promise this will be the last. I’ll even do my best to refrain from mentioning his name for a while, though of course now that I have, he’ll probably release the Google algorithm over the weekend. I suppose if he does you won’t mind one more mention.
I finally had time to watch all the videos and wanted to mention a few things for anyone who hasn’t. First, some links to where you can find the videos. The posts on Matt Cutts’ blog linking to the videos are:
You can also find them at Pro SEO
I found them a bit easier to watch on the Pro SEO site, but take your pick.
The running theme through all of the videos was Matt urging site owners to focus on content. He repeated several times that if you build a site with good content, that site will probably be found in Google. He talked about building sites for your visitors: if you build a site that’s good for your visitors, it will probably be good for search engines too. Something I’ve been preaching for a while now.
One of the questions was about whether accessibility considerations would make it into the search engine. Matt said if you had asked him a few weeks ago he would have said no, but now that the technology exists through accessible search, he could see some of that technology getting incorporated into the general search engine. Something I think should happen, and something we should all strive for regardless of its impact on a search engine.
Matt let it be known that Google treats dynamic and static pages essentially the same. The only issue with dynamic pages comes when there are too many parameters in the URL. Matt suggested limiting them to 2 or 3, or better still using mod_rewrite to create search-friendly URLs. Parameters that use ‘id’ in some way are also best avoided. He also confirmed that Google treats <strong> and <b> the same, as it does <em> and <i>, so feel free to use either. What’s interesting in this statement is that it seems to confirm that Google does place a little extra emphasis on words enclosed in those tags. Yeah, we all knew it anyway, but confirmation is nice. Of course, a little extra emphasis doesn’t mean you should be wrapping those tags around every keyword on your page.
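As a rough illustration of the mod_rewrite approach Matt describes, a .htaccess rule along these lines maps a clean URL onto a parameter-driven script (the script name and its parameters here are invented for the example, not anything from the videos):

```apache
# Hypothetical sketch: serve a clean URL from a dynamic script.
# product.php and its cat/item parameters are made up for illustration.
RewriteEngine On

# /shoes/red-sneakers is handled by /product.php?cat=shoes&item=red-sneakers
RewriteRule ^([a-z-]+)/([a-z0-9-]+)/?$ /product.php?cat=$1&item=$2 [L,QSA]
```

Visitors and search engines only ever see the clean two-segment URL, which keeps the visible parameter count at zero.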
One of the questions asked about purchasing an older domain name and redirecting it to the current business site. While the question wasn’t phrased this way, I get the feeling the real question was whether you could take advantage of links pointing to an older domain that has just become available. Some have theorized about using older domains as a way to avoid the mythical sandbox or to take advantage of the links an older domain already has. Matt’s answer was interesting: as long as the links to the old domain are related to the new business, everything would be fine, but buying a poker domain, for example, can cause problems if you’re now using it for your shoe site. I think the key here is to do a little research before purchasing an expiring domain. While you may inherit some of the positives associated with that domain, you can also inherit some of the negatives. Buying a domain that had been banned in the recent past would probably leave your site banned as well.
There was a little discussion about duplicate content in a couple of the videos. The main points were that Google has duplicate detection in place at every point where it may encounter a page: on the initial crawl, during indexing, right down to the moment before Google decides to display the page in the results. Google also detects near duplicates as well as complete duplicates. The only advice was to really try to make your pages as different as possible. However, there is no penalty associated with a duplicate page. Google will simply decide which page it feels is better for a search and show that one. So no penalty, but if you have duplicate pages, some of them may not appear in the results. One thing Matt mentions toward the end of the discussion is making your template different if you think the content might be duplicate. I think that supports my own theory that Google looks beyond content alone when considering the duplicate issue.
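Matt doesn’t say how the near-duplicate detection actually works, but here’s a toy sketch of one common technique for comparing pages: measuring the overlap of word shingles. This is purely an assumption for illustration, not anything Google has disclosed:

```python
# Toy near-duplicate check via word shingles (illustrative only;
# Google's real duplicate detection is not public).

def shingles(text, size=3):
    """Return the set of overlapping word n-grams in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(a, b):
    """Jaccard similarity of the two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page1 = "our store sells fine leather shoes for every occasion"
page2 = "our store sells fine leather boots for every occasion"
print(round(similarity(page1, page2), 2))  # → 0.4
```

Two pages sharing most of their shingles score near 1.0; above some threshold a system like this would treat them as near duplicates and pick one to show.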
One of the videos is devoted solely to data centers, and I thought the discussion interesting throughout. Most of the time, when you access any of the Google IP addresses on the same class C block, you should be at the same data center, but there may be times when that isn’t true. Matt also let out that different data centers won’t always be operating under the exact same set of rules. So if Google has something new to add to the algorithm, it might first go on one data center for testing and only later be rolled out to the other data centers. Something many of us have assumed for a while, but it does help explain why you might see your site rank well one moment for a given query and not so well a few moments later. Not only can the results come from different data centers, they may also be produced under a slightly different set of rules.
There’s plenty more good stuff in the videos if you have the time to check them out. Some topics, like the concept of themes in your website, I’ve held back from talking about since I have a short series of posts planned on them in the near future. Give the videos a look if you haven’t already. Though they run approximately 5 minutes apiece, they may take a little longer to get through depending on your connection speed. It took me about an hour and a half to get through all 10 over my DSL connection. Not too long a time for the wealth of information they contain. OK, no more Matt Cutts for a while. He’s a good guy with good information, but I’ve gotten a little tired of typing his name this week. Sorry Matt.