I would suggest detecting that it's Googlebot/Bingbot (look for 'googlebot' or 'bingbot' in the User-Agent string) and then pre-filling the year/month/day fields with valid values. The bots should then be able to get to the next page. (They do submit forms if they know which values to enter.)
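As a rough sketch of that idea (the field names and default values here are hypothetical, adjust to your form):

```python
import re

# Case-insensitive check for 'googlebot' or 'bingbot' in the User-Agent string.
BOT_PATTERN = re.compile(r"googlebot|bingbot", re.IGNORECASE)

def is_search_bot(user_agent):
    """Return True if the User-Agent looks like Googlebot or Bingbot."""
    return bool(BOT_PATTERN.search(user_agent or ""))

def default_age_fields(form, user_agent):
    """If the visitor is a known crawler, pre-fill the age-check fields
    with valid values so the bot can submit the form and reach the
    next page. Field names and defaults are placeholders."""
    form = dict(form)
    if is_search_bot(user_agent):
        form.setdefault("year", "1980")
        form.setdefault("month", "1")
        form.setdefault("day", "1")
    return form
```

Note this only changes what the form shows; a bot that never submits the form still won't get past it, which is where the sitemap below comes in.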
BUT if you've got that aggressive an age check, do you really want Google/Bing to index the pages beyond it? Any traffic arriving from search results will bypass the age check, no?
If you're not familiar with sitemaps, google "Google Sitemaps" and you'll find information to get you started. Every page you want indexed should be in the sitemap.
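For reference, a sitemap is just an XML file listing the URLs you want crawled, something like this (the example.com URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/page-behind-age-check.html</loc>
    <lastmod>2012-01-15</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/another-page.html</loc>
  </url>
</urlset>
```

You then submit it through Google/Bing's webmaster tools or reference it from robots.txt.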