Recently, Google's John Mueller explained on Twitter that the URL Parameters tool in Search Console is not a replacement for the robots.txt file.
The exchange began when a webmaster asked whether Google still uses the URL Parameters tool, which is available in the old version of Search Console. Mueller confirmed that it does.
The webmaster then asked how reliable the tool's setting is that blocks crawling of URLs containing a particular parameter.
Mueller replied that it is not a replacement for the robots.txt file: if you need to guarantee that certain URLs are not crawled, they must be blocked properly, i.e., via robots.txt.
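For reference, blocking parameterized URLs in robots.txt typically relies on the wildcard syntax that Googlebot supports. A minimal sketch, assuming a hypothetical sessionid parameter you want to keep out of the crawl:

    User-agent: *
    # Block any URL containing the (hypothetical) sessionid parameter
    Disallow: /*sessionid=

The asterisk wildcard is not part of the original robots.txt standard, but Google's crawler honors it, so a rule like this matches sessionid= anywhere in the URL, including in the query string.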