Google's John Mueller said that a site with 100,000 URLs is usually not large enough to be affected by crawl budget. Your site's page count would need to be considerably bigger before you might run into an issue with Google's crawl budget.
John said this on Twitter: "100k URLs is usually not enough to affect crawl budget (it's <1/minute over 3 months)." Just to be me and do what I do, I'll point out that Gary Illyes said in 2016 that a site with 100,000 URLs may see a benefit from using nofollow on internal links for crawl budget purposes - keep in mind, that was five years ago.
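To unpack the math in John's parenthetical: three months is roughly 90 days, or about 129,600 minutes, so crawling 100,000 URLs over that window averages out to roughly 0.77 URLs per minute - well under one per minute.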
Here are John Mueller's tweets in context:
What are you trying to achieve?
— 🫕 John 🫕 (@JohnMu) December 21, 2021
Why do you want them out of the index? What issues are you seeing with them now in search?
— 🫕 John 🫕 (@JohnMu) December 21, 2021
100k URLs is usually not enough to affect crawl budget (it's <1/minute over 3 months), and if it's noindex/404, we won't crawl them that often. With robots.txt it's rare we'd show them in search, site:-queries don't matter.
— 🫕 John 🫕 (@JohnMu) December 21, 2021
Thanks very much John, really appreciate your input! 😊🙏
— Matt Tutt (@MattTutt1) December 21, 2021
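For those unfamiliar with the distinction John draws: a robots.txt rule tells Googlebot not to crawl matching URLs at all, while noindex or a 404 still lets Googlebot fetch the page and then drop it from the index. As a rough sketch, a disallow rule of the kind being discussed looks like this (the /archive/ path is purely hypothetical):

User-agent: Googlebot
Disallow: /archive/

Blocked URLs can still appear for site: queries if they are linked elsewhere, which is why John notes those queries don't matter here.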
Hope this thread helps some SEOs.
Forum discussion at Twitter.