Too many URLs is a Good Problem for SEO

If you ever receive an email from Google that says, “Googlebot found an extremely high number of URLs on your site,” you’re probably doing something right. Having an “extremely” high number of URLs in Google’s index is very good. With a large website, you sometimes can’t get Google to crawl all of your URLs, so if Google liked your site enough to crawl a high number of them, you’ve done something right. Of course, the alert may also indicate a faulty URL structure that produces bad URLs, but it might simply mean your site is huge.

The more URLs you have in Google’s index, the more chances you have to rank for something. It’s a bit like buying more lottery tickets: each one increases your odds of winning.

The alert emailed to me above was for a site that has around 12 million pages indexed so far. The trick is that all 12 million pages are generated from a single page. That page runs a few MySQL queries to produce the content and uses an .htaccess rewrite rule to make the URLs SEO friendly.
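As a sketch of the URL-rewriting piece, a mod_rewrite rule along these lines can map a clean URL onto the single generator page. The path segments and script name here are hypothetical, not the site's actual ones:

```apache
# Hypothetical: /distance/london/paris → distance.php?from=london&to=paris
RewriteEngine On
RewriteRule ^distance/([a-z-]+)/([a-z-]+)/?$ distance.php?from=$1&to=$2 [L,QSA]
```

The one script behind the rule then looks up both city slugs in the database and renders the page, so every crawlable URL is just a different pair of query parameters.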

It queries a table of 100,000 cities and gives the distance from each city to every other city in the same 100,000-city list. Since 100,000 × 100,000 = 10 billion, that single page generates 10 billion URLs. Even the 12 million pages Google has indexed so far are only a fraction of that total.
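The mechanics above can be sketched in a few lines. This is a minimal stand-in, not the site's actual code (which is presumably PHP): the table schema, slugs, and function names are hypothetical, and a three-row table stands in for the 100,000-city one.

```python
import sqlite3

# Tiny in-memory stand-in for the cities table (the real one has 100,000 rows).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cities (slug TEXT PRIMARY KEY, lat REAL, lon REAL)")
conn.executemany("INSERT INTO cities VALUES (?, ?, ?)", [
    ("london", 51.5074, -0.1278),
    ("paris", 48.8566, 2.3522),
    ("berlin", 52.5200, 13.4050),
])

def page_count(n_cities):
    # Every ordered (from, to) pair gets its own URL: n * n pages.
    return n_cities * n_cities

def render(from_slug, to_slug):
    # One template, parameterized by the two slugs from the rewritten URL.
    row = conn.execute(
        "SELECT a.lat, a.lon, b.lat, b.lon FROM cities a, cities b "
        "WHERE a.slug = ? AND b.slug = ?", (from_slug, to_slug)).fetchone()
    return f"Distance page: {from_slug} to {to_slug}, coordinates {row}"

print(page_count(100_000))        # prints 10000000000, ten billion URLs
print(render("london", "paris"))  # one of those ten billion pages
```

The point of the sketch is the arithmetic: the page count grows as the square of the table size, so one template plus one modest table yields a URL space far larger than Google will ever fully crawl.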

Generating a high page count from database content is a highly recommended technique for getting search traffic.