In case you didn't know, you can upgrade your FeedBurner feed URL to feedproxy.google.com. However, some people noticed that feedproxy.google.com's robots.txt was actually disallowing all crawlers -- odd, huh? In fact, the problem has been breaking some feeds.
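The post doesn't quote the offending file, but a robots.txt that shuts out every crawler from every path, which is presumably what feed readers were running into, takes only two lines:

User-agent: *
Disallow: /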
Fortunately, Google has been paying attention to the relevant discussion and will be adding the following code snippet to the robots.txt file for feedproxy.google.com:
User-agent: *
Disallow: /~a/
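If you want to sanity-check what that snippet actually permits, Python's built-in urllib.robotparser can parse the rules directly; this is just a quick sketch, and the TechCrunch feed name is a made-up example:

from urllib.robotparser import RobotFileParser

# Parse the updated robots.txt rules locally (no network request needed).
rules = """User-agent: *
Disallow: /~a/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Feed content paths are crawlable again; only the /~a/ path stays blocked.
print(rp.can_fetch("*", "http://feedproxy.google.com/TechCrunch"))    # True
print(rp.can_fetch("*", "http://feedproxy.google.com/~a/something"))  # False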
Google has since updated the issues page to confirm the change has been completed and said it should fix the issue: "This should permit all readers/crawlers that previously retrieved feed content, but now get a blocked response, to start working properly again. Our apologies for any inconvenience you may have encountered!"
Forum discussion continues at Google Groups.