Google added the Fetch as Googlebot feature the other day, and people are now really beginning to explore it. One question I have seen come up is why the Fetch as Googlebot feature shows only up to 100KB of the page it is fetching. Does that mean Googlebot only crawls up to 100KB of a specific page?
The quick answer is no: Googlebot does index more than 100KB, but the fetch feature only shows up to 100KB.
Historically, Googlebot at one point indexed only up to 100KB of a page. But some time in 2006, the Google cache began showing more than 100KB of a page, which put the 100KB maximum page size limitation to rest.
Google does indeed index pages larger than 100KB, especially in these days of higher bandwidth. The Fetch as Googlebot feature, however, grabs only the first 100KB, for speed purposes.
JohnMu of Google said in a Google Webmaster Help thread:
As far as I know, this is a limitation of the Fetch as Googlebot feature, and I believe it is more or less the only difference between a real Googlebot fetch and this feature. The main problem is that arbitrary file sizes would bog down the Webmaster Tools user interface, so we had to draw a line somewhere.
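To picture what the tool is doing, here is a minimal Python sketch of a fetcher that keeps only the first 100KB of a response for display. This is not Google's actual code; the URL, the Googlebot user-agent string, and the fetch_preview helper are illustrative assumptions.

```python
# Minimal sketch (not Google's implementation) of capping a fetched
# page at 100 KB for display, the way Fetch as Googlebot does.
import urllib.request

DISPLAY_CAP = 100 * 1024  # assumed 100 KB cap on what the tool shows


def fetch_preview(url: str, cap: int = DISPLAY_CAP) -> tuple[bytes, bool]:
    """Fetch a page but keep only the first `cap` bytes for display.

    Returns the (possibly truncated) body and a flag indicating
    whether the page was larger than the cap.
    """
    # The Googlebot user-agent here is purely illustrative.
    req = urllib.request.Request(url, headers={"User-Agent": "Googlebot"})
    with urllib.request.urlopen(req) as resp:
        body = resp.read(cap)           # read at most `cap` bytes
        truncated = bool(resp.read(1))  # any byte left means it was cut off
    return body, truncated


if __name__ == "__main__":
    body, truncated = fetch_preview("https://www.example.com/")
    print(f"Showing {len(body)} bytes" + (" (truncated)" if truncated else ""))
```

The point of the cap, per the quote above, is simply keeping the interface responsive, not a crawling limit.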
Forum discussion at Google Webmaster Help.