Google's John Mueller reminds webmasters on his Google+ page that Google will only process the first 500KB of your robots.txt file.
This is an important point: if you have an unusually large robots.txt file that exceeds 500KB, Googlebot may not read it correctly. If Googlebot misreads your robots.txt, it can cause serious issues with your site's health in the Google search results.
Google's John Mueller said:
#102 of the things to keep in mind when working on a big website: If you have a giant robots.txt file, remember that Googlebot will only read the first 500kb. If your robots.txt is longer, it can result in a line being truncated in an unwanted way. The simple solution is to limit your robots.txt files to a reasonable size :-).
John links to this Google document on robots.txt controls for more information.
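If you want a quick sanity check, a small script like the sketch below can flag a robots.txt file that is approaching the limit. This is just an illustration, not an official Google tool; the check_robots_txt_size helper and the example.com URL are hypothetical.

```python
import urllib.request

# 500 KB: the portion of robots.txt Googlebot will read, per John Mueller's note.
LIMIT_BYTES = 500 * 1024

def check_robots_txt_size(site_url):
    # Fetch the site's robots.txt and report its size against the 500 KB limit.
    robots_url = site_url.rstrip("/") + "/robots.txt"
    with urllib.request.urlopen(robots_url) as response:
        body = response.read()
    size = len(body)
    if size > LIMIT_BYTES:
        print(f"{robots_url} is {size} bytes; rules past 500 KB may be truncated.")
    else:
        print(f"{robots_url} is {size} bytes; within the 500 KB limit.")

check_robots_txt_size("https://www.example.com")
```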
If you have any questions about Google's robots.txt handling, John is answering questions on his Google+ page.
Forum discussion at Google+.