Quote:
Originally Posted by TheLegacy
Wow - you're heading in the wrong direction
There's no strict limit on the number of pages you can submit for indexing in Google Search Console, but submitting too many pages rapidly can lead to issues with crawl budget and indexing speed, rather than a direct penalty.
For the record, for those who don't know what that means:
Google has a limited crawl budget for each website, which determines how often and how many of its pages it can crawl. As for indexing speed, rapidly submitting a large number of pages can overwhelm Google's indexing capacity, potentially slowing down indexing for all of your pages.
There are many other things to consider that I'm leaving out, but based on the images you've shown, there are more important problems you may have there.
Take a look at this page and I hope it helps
https://developers.google.com/search...-traffic-drops
I second that; that is by far the best explanation so far.
Just to add to it: if you are 100% sure there are no crawling issues and no roadblocks, crawl budget is an elastic category and largely depends on your link portfolio.
To put it simply: the more good links you have, the more important your website is considered, so Google will crawl your pages faster. Big, important websites get higher limits.
From a logical perspective: if your website publishes a lot of new pages daily and has (almost) no links, it is reasonable to treat that as potential spam. Again, I can't confirm or deny that Google's algorithm uses that signal for ranking; I'm just not excluding it as an option.
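If you still want to push new pages to Google yourself, a throttled sitemap submission is gentler than firing off thousands of individual URL requests. Here is a rough Python sketch, not an official recipe: it assumes google-api-python-client and google-auth are installed and that a service account has access to your verified property; the site URL, sitemap file names, key file name, and the 60-second delay are all placeholders you would swap for your own.

Code:
import time

from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholders: swap in your own verified property, sitemap files, and key file.
SITE_URL = "https://example.com/"
SITEMAP_URLS = [
    "https://example.com/sitemap-articles.xml",
    "https://example.com/sitemap-products.xml",
]
SCOPES = ["https://www.googleapis.com/auth/webmasters"]


def submit_sitemaps(key_file: str, delay_seconds: int = 60) -> None:
    """Submit sitemaps one at a time with a pause in between,
    rather than pinging thousands of individual URLs at once."""
    creds = service_account.Credentials.from_service_account_file(
        key_file, scopes=SCOPES
    )
    service = build("searchconsole", "v1", credentials=creds)

    for sitemap in SITEMAP_URLS:
        # sitemaps().submit() registers (or re-pings) the sitemap;
        # Google still decides when and how much of it to crawl.
        service.sitemaps().submit(siteUrl=SITE_URL, feedpath=sitemap).execute()
        print(f"Submitted {sitemap}")
        time.sleep(delay_seconds)  # be gentle with the crawl budget


if __name__ == "__main__":
    submit_sitemaps("service-account.json")

Even with something like this, Google decides the actual crawl pace; the point is only to hand it a clean list of URLs instead of hammering the submission tools.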