Old 03-25-2025, 08:02 AM  
TurboB
Confirmed User
 
Join Date: Dec 2016
Posts: 1,037
Quote:
Originally Posted by TheLegacy
Wow - you're heading in the wrong direction

There's no strict limit on the number of pages you can submit for indexing in Google Search Console, but submitting too many pages rapidly can cause issues with crawl budget and indexing speed rather than a direct penalty.

For the record, for those who don't know what that means:

Google has a limited crawl budget for each website, which determines how often and how many pages it can crawl. As for indexing speed: rapidly submitting a large number of pages can overwhelm Google's indexing pipeline, potentially slowing the indexing process for all pages.

There are many other things to consider that I'm leaving out, but based on the images you've shown, you may have more important problems there.

Take a look at this page; I hope it helps:

https://developers.google.com/search...-traffic-drops
I have been doing SEO for about seven years, so I'm experienced enough to notice other problems or reasons for the Google hit I may have.

I am not forcing indexation; I'm leaving that for Google to decide.

So, the problem: 10 sites running on the same cam aggregator. Three of them, being 3-4 years old, survived the updates. The remaining (fresher) sites got hit in the same way I showed in my first post.

Has anyone recently launched a fresh site on a cam aggregator, without a "noindex" tag on the model pages, and run it without problems while carrying tens of thousands of model pages?
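For anyone following along, the "noindex" tag in question is a robots meta directive in the page's `<head>`, e.g. `<meta name="robots" content="noindex">`. Here's a minimal sketch of checking whether a page's HTML carries it; the function name, regexes, and sample markup are my own for illustration, not anything from this thread:

```python
import re

def has_noindex(html: str) -> bool:
    """Return True if the HTML carries a robots noindex meta directive."""
    # Scan every <meta ...> tag, tolerating any attribute order.
    for tag in re.findall(r"<meta\b[^>]*>", html, flags=re.IGNORECASE):
        if (re.search(r'name\s*=\s*["\']robots["\']', tag, re.IGNORECASE)
                and re.search(r'content\s*=\s*["\'][^"\']*noindex', tag,
                              re.IGNORECASE)):
            return True
    return False

blocked = '<head><meta name="robots" content="noindex, follow"></head>'
open_page = '<head><meta name="robots" content="index, follow"></head>'
print(has_noindex(blocked))    # → True
print(has_noindex(open_page))  # → False
```

Note that Google can also receive the same directive via an `X-Robots-Tag` HTTP header, so a check like this on the HTML alone doesn't cover every case.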