Google's John Mueller said on Twitter that having a shared robots.txt file across multiple domains is fine and should work for search. John wrote, "It sounds like you have a shared robots.txt file across domains? That shouldn't be a problem, we might show those cross-domain URLs as errors in Search Console, but if they're on all domains, that should work regardless."
So if you are set up this way, the errors Google Search Console reports might look a bit unusual, but as long as you understand the output and your setup, they should make sense to you.
Here are those tweets:
It sounds like you have a shared robots.txt file across domains? That shouldn't be a problem, we might show those cross-domain URLs as errors in Search Console, but if they're on all domains, that should work regardless.
— John ☆.o(≧▽≦)o.☆ (@JohnMu) May 23, 2018
I have personally never seen an example of a shared robots.txt file set up like this. Have you ever done it?
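If you do run a setup like this, Mueller's caveat is that the same rules need to be served on all of the domains. Here is a minimal Python sketch (my own illustration, not from Mueller's tweets; the domain names are placeholders) that fetches robots.txt from each domain and checks that every one returns the identical file:

```python
# A quick sanity check, not from the article: fetch robots.txt from each
# domain in a shared setup and confirm every domain serves the same file.
# The domain names below are hypothetical placeholders.
import urllib.request

DOMAINS = ["example.com", "example.org", "example.net"]

def fetch_robots(domain: str) -> bytes:
    """Download https://<domain>/robots.txt and return its raw body."""
    url = f"https://{domain}/robots.txt"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read()

bodies = {domain: fetch_robots(domain) for domain in DOMAINS}

if len(set(bodies.values())) == 1:
    print("All domains serve an identical robots.txt file.")
else:
    print("robots.txt differs between domains:")
    for domain, body in bodies.items():
        print(f"  {domain}: {len(body)} bytes")
```

If the file really is identical on every domain, the cross-domain URLs flagged in Search Console should be the harmless errors Mueller describes rather than an actual crawling problem.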
Forum discussion at Twitter.