I've been submitting quite a few sitemaps to Google's Webmaster Tools. Each sitemap has a list of 50,000 URLs. It sounds like a lot, but believe it or not, I plan to upload millions of pages to Google.
Here's how: every search result on my site is an entirely dynamic page, with a custom title, H2 tags, meta tags, a clean URL, and content -- even if no results are found.
For example: if you search for "Perfect", you get 1,800 results, an appropriate title tag, H2, meta description -- the works.
http://www.ovidz.com/Videos/Perfect
If you add "Blowjob" to the search, you get 113 results.
http://www.ovidz.com/Videos/Perfect-Blowjob
You can go a third level deep and add, say, "Asian" to the search and you'll still get results:
http://www.ovidz.com/Videos/Perfect-Blowjob-Asian
And so on. You can add any combination of keywords to the search and you'll get some kind of result.
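To give you an idea, here's a stripped-down sketch of what the page handler does. This is not my actual code -- the database file, the `videos` table, and the column names are all placeholders -- but the idea is the same: split the URL slug into keywords, query against them, and build the title/H2/meta from the keywords no matter how many rows come back.

```python
from flask import Flask
import sqlite3

app = Flask(__name__)

@app.route("/Videos/<slug>")
def search_page(slug):
    # "Perfect-Blowjob-Asian" -> ["Perfect", "Blowjob", "Asian"]
    keywords = slug.split("-")

    db = sqlite3.connect("site.db")
    where = " AND ".join("title LIKE ?" for _ in keywords)
    rows = db.execute(
        f"SELECT id, title FROM videos WHERE {where}",
        [f"%{k}%" for k in keywords],
    ).fetchall()

    # Title, H2, and meta description are built from the keywords,
    # so the page is "complete" even when rows is empty.
    page_title = " ".join(keywords) + " Videos"
    return (
        f"<html><head><title>{page_title}</title>"
        f'<meta name="description" content="{len(rows)} results for {page_title}">'
        f"</head><body><h2>{page_title}</h2>"
        f"<p>{len(rows)} results found.</p></body></html>"
    )
```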
I have a total of about 10,000 tags in my database. I'm generating these sitemaps by cycling through the `tags` table with random tag combinations; a simplified sketch of the generator follows the math below.
So, since each combination is an ordered list of distinct tags...

10,000 x 9,999 = 99,990,000 URLs (two distinct tags)
10,000 x 9,999 x 9,998 = 999,700,020,000 URLs (three distinct tags)
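Here's roughly how the generator works. It's simplified -- my real script pulls from the `tags` table rather than a hard-coded list, and the file naming is just illustrative -- but it shows the loop and the 50,000-URL-per-file split:

```python
import itertools

BASE = "http://www.ovidz.com/Videos/"
PER_FILE = 50000  # Google's per-sitemap URL limit

def flush(urls, n):
    """Write one sitemap file containing the buffered URLs."""
    with open(f"sitemap-{n}.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        f.write("\n".join(urls))
        f.write("\n</urlset>\n")

def write_sitemaps(tags):
    buf, n = [], 0
    # permutations() yields ordered pairs of *distinct* tags,
    # which is where the 10,000 x 9,999 count comes from.
    for a, b in itertools.permutations(tags, 2):
        buf.append(f"  <url><loc>{BASE}{a}-{b}</loc></url>")
        if len(buf) == PER_FILE:
            flush(buf, n)
            buf, n = [], n + 1
    if buf:
        flush(buf, n)
```

At 50,000 URLs per file, the two-tag pairs alone work out to about 2,000 sitemap files.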
So my question is, will any flags be raised at Google when they see hundreds of thousands of URLs being submitted from one site every day? If so, what will happen?