Lessons from Google about URL Lengths

A few weeks ago, I was reading through some posts on the official Google blogs, such as InsideSearch, WebmasterCentral, and the GoogleBlog. I was taking notes and sharing a few links via the Tweeters, and while doing so I noticed a link like this:

Search queries data is....what?


I did some more poking around and came across more URLs like this, such as:

More transparency around....?

and

Look of authorship in...?

What was going on?

Of course, I kept digging. I found more and more and more examples of these URLs. “Why are they doing this, and what’s the average length?” I thought to myself.

I did a bit of number crunching and here is what I found:

The longest URL I found: Create and manage Custom Search Engines from within Webmaster Tools, at 95 characters

The shortest URL I found: Gmail’s new look, at 59 characters

The average length for Webmaster Central: 90 characters

The average length for GoogleBlog: 76 characters

The average length for InsideSearch: 81 characters
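If you'd rather script this kind of number crunching than use a spreadsheet, something like the following works. Note that the URLs below are made-up placeholders for illustration, not the actual posts I measured:

```python
# Compute shortest, longest, and average URL length for a set of posts.
# The URLs below are illustrative placeholders, not the actual blog posts.

urls = [
    "http://example.blogspot.com/2011/09/search-queries-data-is.html",
    "http://example.blogspot.com/2011/09/more-transparency-around.html",
    "http://example.blogspot.com/2011/09/look-of-authorship-in.html",
]

lengths = [len(u) for u in urls]
shortest = min(lengths)
longest = max(lengths)
average = sum(lengths) / len(lengths)

print("shortest:", shortest)
print("longest:", longest)
print("average:", round(average, 1))
```

Swap in the real URLs from whatever blog you want to audit and you get the same three numbers I report above.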

Why would Google do this?

I have a couple of thoughts as to why Google is cutting off their URLs before the whole title is finished. Here they are.

Too long a URL could be a negative ranking factor

SEO best practice has long been to keep title tags shorter than 77 characters, which is roughly what the search engines have historically displayed. What I am finding nowadays is more along the lines of 62-67 characters. That said, Cyrus has an interesting Whiteboard Friday about title tag tests in which he found that a really long title correlated with higher rankings in some cases, but not others.

Here is an example I found of a title tag that is uber long, yet I cannot find it ranking. Here’s the URL:

http://thelongestlistofthelongeststuffatthelongestdomainnameatlonglast.com/wearejustdoingthistobestupidnowsincethiscangoonforeverandeverandeverbutitstilllookskindaneatinthebrowsereventhoughitsabigwasteoftimeandenergyandhasnorealpointbutwehadtodoitanyways.html

I searched for “we are just doing this to be stupid now” and “it still looks kind of neat in the browser even though”, which are both in the URL. The post isn’t found on the first page. And obviously these are not competitive terms!

Here’s another. This post from The Blaze has this URL:

http://www.theblaze.com/blog/2011/02/01/kansas-city-star-complains-about-the-lack-of-response-during-his-response-to-the-response-to-his-response-to-a-point-he-didnt-hear-and-doesnt-understand/

Yet it does not rank for the query of “kansas city star complains about the lack of response to”.

I suspect that URL length could be a negative ranking factor if you are stuffing your URLs with a lot of keywords. At least, I’d hope this is so, since otherwise we’d have the new version of meta keywords stuffing. Yikes.

At first I wondered if there was a technical limitation, but according to this old Webmaster World forum discussion, the search engines didn't start having trouble until URLs exceeded roughly 2,000 characters. Of course, that was 2007, so take it with a grain of salt.

URL length affects usability

The next argument I could think of for shorter URLs, just like shorter title tags, is usability/linkability. If your URL is 500 or more characters long, people will probably be less inclined to share it. With today’s widespread use of URL shorteners such as bit.ly, this may be less of a consideration, but it is still a valid one.


I’m interested if anyone has insights into this. Personally, I think it is best to keep URLs shorter rather than longer, both for shareability and because of the possible negative correlation with rankings. Since the Google blogs average between 76 and 90 characters, I’d shoot for that range.

If you use WordPress, maybe you should use the SEO Slugs plugin.
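As I understand it, that plugin simply strips common stop words out of the post slug. Here is a rough sketch of the idea; the stop-word list is my own illustration, not the plugin's actual list:

```python
# Rough sketch of a stop-word-stripping slug filter, in the spirit of the
# SEO Slugs plugin. The stop-word list is illustrative, not the plugin's.

STOP_WORDS = {"a", "an", "and", "about", "at", "for", "from",
              "in", "is", "of", "on", "the", "to"}

def shorten_slug(title):
    """Build a hyphenated slug from a title, dropping stop words."""
    words = [w for w in title.lower().split() if w not in STOP_WORDS]
    return "-".join(words)

print(shorten_slug("Lessons from Google about URL Lengths"))
# "lessons-google-url-lengths"
```

A title like this post's drops from six words to four, which is exactly the kind of trimming that keeps slugs in that 76-90 character range.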

I’d be interested to see, though, if anyone else has tested or has any more ideas.

37 thoughts on “Lessons from Google about URL Lengths”

  1. Interesting stuff John! I’d love to see some testing done here. I have a site with long urls like this and I am soon going to do a test where I shorten all of them and see if my rankings improve (or decline). I’ll let you know once I’m done my test.

    1. Thanks Marie. I’m planning to do some testing as well, just want to make sure to implement everything correctly. I’d be interested to hear the results of your test.

      Maybe think about testing half of your URLs by shortening them, and leaving the other half and seeing what difference it makes?

    2. Marie,

      I wouldn’t change any of your URLs; you will lose any links pointing to those old URLs. I would create new pages on your site, keep the URLs short, and see what happens.

      You NEVER want to change the URLs of your already published content.

  2. Testing shmesting. A few quick searches in Google will show that nice short and clean URLs dominate. I suppose we can’t be sure if this is a matter of correlation or causation… maybe a little of both. Usability and click-thru are big enough reasons to push for short and clean URLs as a best practice without even having to mention SEO. How did you determine the URL length #s for Google blogs? Did you use a crawling tool like Screaming Frog?

    1. Lindsay –

      Thanks for commenting on my site! You’re right that a few quick searches will show that short clean URLs are better. I want to try to quantify the impact a bit though, so that’s why I want to test it.

      I (sssh this is a secret) scraped the 30 most recent posts from each of the sites I mentioned, dumped them into an Excel spreadsheet, and averaged the lengths that way, per blog. Could have done more I guess, but it gave a pretty accurate number I think.

      1. I love to use Screaming Frog to collect length data for a domain. When I’m working on a site audit it is sometimes fun to be able to mention with accuracy that, “Your average Title Tag length is 12 characters. You’re missing an opportunity here.” A full crawl and CSV export into Excel makes it really easy.

        1. That’s really smart. For some clients that could be a win. Especially if you want to get an overall length, and you’re running Screaming Frog on the site anyway, I can see this being a great idea! Thanks for the tip.

  3. Hey John, it’s always nice to have hard evidence to back up your recommendations. To this day I still encounter marketers who simply think the rules of SEO don’t apply to them or their site. I have an entire folder in my bookmarks with links to articles that draw evidence-based conclusions about SEO. Yours has been added. No one can argue against Google for SEO : )

  4. Google’s Matt Cutts has said Google likes to see URLs containing no more than 5 words. For longer URLs than that, he has said that Google may tend to ignore the terms. From the cut-off URLs you show, that seems to be the case.

  5. I apologise in advance because I am about to pour negativity all over this.

    I would suggest that there is a character limit on their article slug for the URL set in the CMS, and I would guess it to be 40 characters.

    It is a simple way of controlling the URL length when there are excessively verbose article titles.

    This is not likely to be a ranking factor, more of a usability factor; long URLs just suck.

    Both examples you provide fail to produce the goods for reasons other than those you suggest.

    Your quoted search for the “Kansas City Star…” ends in “to”, which is not how it appears in the URL string; it should be “during”. Swap these around and up it pops, at number 1.

    The example using thelongestlist… fails because the URL does not contain space delimiters and Google has historically been very poor at inferring spaces between concatenated words. It is okay at inferring spaces between two words that get mashed together, but they have to be clear-cut and not have other word-matches in the string; Google stands no chance with this example. Unable to separate the individual words, we would need to search for the whole string (from the slash after the TLD, right to the end) but this is too long a search query for Google to service.

    Yahoo used to give less credence to URLs with deep folder structures, but this is something that Google avoided doing as it was kinda baked into the way that PageRank works; the homepage normally getting the most juice and it flowing into the hierarchy from there. Yahoo’s solution was a poor-man’s version of this on-site hill-topping.

    1. “Hill-topping” was the wrong choice of words there. Hill top is a subjective authority algorithm (Teoma got there long before Google) and PageRank is purely objective popularity.

    2. Dug –

      Thank you so much for your comment. It is very valuable to have feedback like this, and readers who test the posts so thoroughly! I have not been in the search world nearly as long as you, so your information here is very valuable to me, and I am sure to many others.

      To your overall point though, I am still not sure that we can dismiss this as a ranking factor. Even if we say that Google “prefers” short URLs, are we still not also saying that they possibly rank them higher? Google likes to display shorter URLs, so it makes sense to me that they a) would rank them higher, and b) would try to find ways to “shorten” long URLs in the SERPs, which they have. If you do a search for “holiday cottages” here in the US, you will find many instances where they are doing the “http://www.domain.com > Folder > Folder > Location” format of the URL instead of the whole URL.

      It’s just a hunch, and it also makes sense to me that Google has a cut-off of sorts in their CMS, but the question there is why? Why not just use a “slug” plugin for shorter, cleaner URLs? Why cut it off?

      Thanks so much for your feedback, once again. It is truly valuable to me and other readers!

  6. Interesting! I’m going to watch this more closely and see what the situation is, focusing on google.it.

    thanks for sharing!

  7. I think suggesting that it is a “negative ranking factor” is a little alarmist. It hints of penalties if your URLs are too long which just isn’t the case.

    Google have only recently given a stuff about the usability or accessibility of a site, and while it is beginning to play a part in ranking results, it is very early days. There are so many CMSs out there that are tied into awful URL structures; I think it would be a mistake to make URL length a ranking factor of any significance, because there are plenty of sites that are of otherwise good quality but are stuck with crappy URLs.

    Google’s move toward shortening the URL in the results (namely displaying breadcrumb trails) is an effort to increase the usability of their own site. The benefit for webmasters is higher click-through rates, but I doubt Google give a stuff about my click-through rate as long as their customer (the searcher) is happy.

    However, the display of the breadcrumb is both a replacement for, and independent of, the URL structure.

    To me, “Google prefers shorter URLs” means don’t fill your URLs with extraneous gumph (query strings, session IDs, and the like) and keep them in context with keywords. Keywords in URLs are a ranking factor, and a short, concise, keyword-rich URL can help users make sense of what can otherwise be impenetrable garbage.

    Why Google cut their URLs short at this point is a fairly simple question. Look at those URLs in the search results. http://is.gd/tRh4Hs
    They almost always truncate exactly at the .html, so the selected part of the article title from the URL is always displayed in the search results. They are fine-tuning the appearance of their search results.

    What this does highlight, though, is how Google snips out the date information and replaces it with an ellipsis to keep the more relevant information intact (domain and article title), as well as the character limits that will be displayed there (very similar numbers to titles).

  8. I agree that long URLs must be a negative. I always try to keep mine to a minimum. Even if I name the post something longer (or funny), I change the URL to be more search-friendly terms — but stay away from keyword stuffing. Blegh!

    1. I agree with you, Visiture, and thanks for the comment! As short as possible is a good rule, while still being useful. Part of me thinks that this is one way that Google could deal with keyword stuffing of URLs. Keep the crawl length to a certain number of characters, which is why it is always best to have your important keywords at the beginning of your URL. Interesting…

    1. Amazon has a ton of links (Amazon.com is a TBPR 9 and has over a million linking root domains). They could rank for just about anything they want!

      You are right that they verge on keyword stuffing in their URLs, but they’re a big brand. They can get away with it. Big brands can do that, with Google’s bias towards them.

  9. My view regarding long URLs is that it is a usability factor: repeating keywords in the URL looks spammy, and as a user I would not click through. Best is to stick to around 40 characters. When it comes to ranking, even if your URL is above 100 characters, if you have content that interests users it should still rank well. I use the KISS method (Keep It Short & Simple), like http://www.farhanonline.com/seo

  11. Pingback: On Site Optimization Checklist » Robot Creative

  12. Pingback: On-site SEO optimization audit checklist | Robot Creative

  13. Hi

    This is a very interesting discussion. I did a search on long URLs not getting indexed and found this. I have a site that has 300+ pages and only one third are indexed. Some of the pages do have long URLs, the reason being a good user experience in navigation.

    So something like /maintopic/subtopic/subtopic/article title (it’s a training site). It works well in the navigation, which I thought is something we are supposed to provide, i.e. make the site easily navigable. The flip side is that on a big site with a lot of topics, that sort of page structure causes LONG URLs.

    I wonder if Matt Cutts has ever done one of his Q&A videos on this.
    I wish I knew a definitive answer as changing the page structure is a big job 🙂

    Thanks
    Lynne

  14. Nice blog right here! Also, your website loads up very fast!
    What web host are you using? Can I get your affiliate link to your host?
    I wish my site loaded up as quickly as yours lol

  15. Thanks for the sensible critique. My neighbor and I were just preparing to do a little research about this. We grabbed a book from our local library, but I think I learned more from this post. I am very glad to see such great information being shared freely out there.

  16. The reason I think you see Google doing that is that they don’t have to do SEO. The sub-parts of the Google company don’t care if they rank, because they don’t make money by ranking. Google is the portal; they make money by selling ads, not by ranking, and they decide what ranks, including the ads that get shown…

  17. I don’t think there is any such character limit. If there were, would URLs longer than 50 characters not be indexed? Yet they are indexed, so I believe there is no such bound.

  18. Pingback: URL structure?

  19. Thanks for the info, but I don’t see any change on my site. My site ranks well in Google, but the URLs aren’t like you said.

Comments are closed.