Frédéric Dubut from Bing's search team said on Twitter that if you create a robots.txt section specifically for Bingbot, Bing's crawler, then Bing will only read that section. So if you do create one, make sure to copy every directive you want Bing to comply with from the default section into the Bingbot section.
He said:
Useful robots.txt reminder - if you create a section for #Bingbot specifically, all the default directives will be ignored (except Crawl-Delay). You MUST copy-paste the directives you want Bingbot to follow under its own section. #SEO #TechnicalSEO
— Frédéric Dubut (@CoperniX) January 2, 2019
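To make that concrete, here is a minimal hypothetical robots.txt; the paths are illustrative, not from Bing's documentation:

User-agent: *
Disallow: /private/
Crawl-delay: 5

User-agent: Bingbot
Disallow: /search/

Because Bingbot only reads its own section, it would be free to crawl /private/ here (though, per Dubut, it would still honor the Crawl-delay from the default section). To keep Bing out of /private/ as well, the directive has to be repeated under the Bingbot section:

User-agent: Bingbot
Disallow: /private/
Disallow: /search/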
Glenn Gabe responded that Google works a bit differently and goes with the most restrictive directive it can find, when not told otherwise:
Google's docs say: "For situations where multiple crawlers are specified along with different directives, the search engine will use the sum of the negative directives." So Google will combine googlebot with directives targeting all user-agents. https://t.co/22NvjMRq18
— Glenn Gabe (@glenngabe) January 3, 2019
Gabe later corrected himself in a reply to @rustybrick:
Ah, you're right. So robots.txt is handled the same way between Google and Bing, but the meta robots tag can have directives combined. Great catch. :) @rustybrick
— Glenn Gabe (@glenngabe) January 3, 2019
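That distinction is worth illustrating. With the robots meta tag, directives aimed at different crawler names can be combined; here is a made-up example, not taken from Google's documentation:

<meta name="robots" content="noindex">
<meta name="googlebot" content="nofollow">

Under the "sum of the negative directives" rule Gabe quotes, Googlebot would treat this page as both noindex and nofollow, whereas in robots.txt a crawler-specific section simply replaces the generic one.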
Forum discussion at Twitter.
Update: John Mueller said it works the same way for Google; he wrote on Reddit, "This is standard for any user agent section in the robots.txt :)"
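In other words, the Bingbot example above applies to Googlebot as well. As a minimal sketch, with illustrative paths again:

User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow: /search/

Googlebot would ignore Disallow: /private/ here because its own section takes precedence, so any default directives you want Google to follow also need to be duplicated under the Googlebot section.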