Google's John Mueller said in a Stack Exchange thread that while it is good practice to generate your XML sitemap file dynamically, it is not good practice to generate your robots.txt file dynamically.
He said the robots.txt file should probably be kept static and updated by hand.
John wrote:
Making the robots.txt file dynamic (for the same host! Doing this for separate hosts is essentially just a normal robots.txt file for each of them.) would likely cause problems: it's not crawled every time a URL is crawled from the site, so it can happen that the "wrong" version is cached. For example, if you make your robots.txt file block crawling during business hours, it's possible that it's cached then, and followed for a day -- meaning nothing gets crawled (or alternately, cached when crawling is allowed). Google crawls the robots.txt file about once a day for most sites, for example.
As you can see, because Google caches your robots.txt for up to a day, changing the file throughout the day can confuse Google's crawlers: they may wander into sections you meant to block, or be kept out of sections you wanted them to crawl. A hedged sketch of this failure mode is below.
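To make that concrete, here is a minimal hypothetical sketch (in Python, not from the thread) of the kind of time-based robots.txt serving Mueller warns against; the handler, port, and business-hours window are illustrative assumptions only:

from datetime import datetime
from http.server import BaseHTTPRequestHandler, HTTPServer

# Two versions of robots.txt: one blocks all crawling, one allows it.
BLOCK_ALL = "User-agent: *\nDisallow: /\n"   # served during business hours
ALLOW_ALL = "User-agent: *\nDisallow:\n"     # served the rest of the day

class RobotsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/robots.txt":
            self.send_error(404)
            return
        # The served file depends on the time of day -- the anti-pattern.
        hour = datetime.now().hour
        body = BLOCK_ALL if 9 <= hour < 17 else ALLOW_ALL
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body.encode("utf-8"))

if __name__ == "__main__":
    # If Googlebot happens to fetch robots.txt at 10am, the "Disallow: /"
    # version may be cached and applied for roughly a day -- so nothing
    # gets crawled even after business hours end.
    HTTPServer(("", 8000), RobotsHandler).serve_forever()

Since Google typically refetches robots.txt only about once a day, whichever version is cached at fetch time governs crawling until the next fetch, which is why a single static file is the safer choice.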
Forum discussion at Stack Exchange.