Google's John Mueller said on Twitter that the URL Parameters tool is not a replacement for a robots.txt file when it comes to blocking content. John was asked how reliable the tool is when a certain URL pattern is set to "No URLs" for crawling. John said "it's not a replacement for the robots.txt -- if you need to be sure that something's not crawled, then block it properly."
Here are those tweets:
Yes
— 🍌 John 🍌 (@JohnMu) May 29, 2020
It's not a replacement for the robots.txt -- if you need to be sure that something's not crawled, then block it properly.
— 🍌 John 🍌 (@JohnMu) May 29, 2020
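If you do need to guarantee that a URL pattern is never crawled, the robots.txt route John points to looks something like this. This is a minimal sketch, not a recommendation for any specific site; the sessionid parameter is just an example, and Googlebot does support the * wildcard in Disallow rules:

User-agent: Googlebot
Disallow: /*?sessionid=
Disallow: /*&sessionid=

Any URL whose query string matches one of those patterns will not be requested by Googlebot at all.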
I should note that last year John said not to use robots.txt to block indexing of URLs with parameters. Robots.txt controls crawling, not indexing, and a blocked URL can still end up in the index without its content. So keep that distinction in mind as well.
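In other words, if your goal is to keep parameterized URLs out of the index rather than out of the crawl, the pages have to stay crawlable so Google can see an indexing directive. A sketch of the two common options, on the page itself:

<meta name="robots" content="noindex">

or, assuming you can set response headers on the server, as an HTTP header:

X-Robots-Tag: noindex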
Forum discussion at Twitter.