The other day, Google launched its mobile-friendly testing tool, and since then Google has seen two common issues behind webmasters' claims that the tool does not work correctly.
Google's John Mueller wrote on Google+ that when the tool reports a page is not mobile friendly, and you know it is, the cause is typically that you are either (a) blocking Googlebot from specific resources or (b) cloaking to Googlebot. In both cases, even if your site is mobile friendly, Googlebot cannot tell, so it will mark the page as not mobile friendly.
John explained in more detail on Google+:
1. Too much blocked by robots.txt. Googlebot needs to be able to recognize the mobile-friendliness through crawling. If a JavaScript file that does a redirect is blocked, if a CSS file that's necessary for the mobile version of the page is blocked, or if you use separate URLs and block those, then Googlebot won't be able to see your mobile site. The Mobile-Friendly Test will hint at this, the Fetch and Render tool in Webmaster Tools will give you details. The PageSpeed Insights tool doesn't use the robots.txt, so it may look normal there. Don't disallow crawling of your site's CSS or JS files!

2. Cloaking to Googlebot. Cloaking has been in our Webmaster Guidelines for a long time, it causes all kinds of problems. Some sites try to recognize Googlebot by looking for "Googlebot" in the user-agent, and will then serve it a "Googlebot-version" of the page (which is often the desktop page). If Googlebot-smartphone crawls and sees the desktop page, the page won't be seen as being mobile-friendly. Fetch and Render in Webmaster Tools can help you recognize this. The PageSpeed Insights tool doesn't use the Googlebot user-agent. Don't cloak!
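To make the second issue concrete, here is a minimal Python sketch of the cloaking anti-pattern John describes. The function names and the Googlebot-smartphone user-agent string are illustrative assumptions, not Google code; the point is that Googlebot's smartphone crawler identifies itself as both "Mobile" and "Googlebot", so special-casing "Googlebot" hands it the desktop page.

```python
# Illustrative sketch of the cloaking anti-pattern (hypothetical function
# names and an example user-agent string, not Google's actual code).

def cloaking_template(user_agent: str) -> str:
    # BAD: special-casing Googlebot. Googlebot-smartphone's user-agent
    # contains both "Mobile" and "Googlebot", so this branch serves it
    # the desktop page and the page is judged not mobile-friendly.
    if "Googlebot" in user_agent:
        return "desktop.html"
    return "mobile.html" if "Mobile" in user_agent else "desktop.html"

def consistent_template(user_agent: str) -> str:
    # Better: decide on device signals only, treating Googlebot-smartphone
    # like any other mobile browser.
    return "mobile.html" if "Mobile" in user_agent else "desktop.html"

# Example Googlebot-smartphone user-agent (an assumption for illustration).
GOOGLEBOT_SMARTPHONE = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 "
    "Mobile Safari/537.36 (compatible; Googlebot/2.1; "
    "+http://www.google.com/bot.html)"
)

print(cloaking_template(GOOGLEBOT_SMARTPHONE))    # desktop page: wrong
print(consistent_template(GOOGLEBOT_SMARTPHONE))  # mobile page: correct
```

The safest fix, of course, is not to vary markup by user-agent at all and use responsive design, so every visitor (and every crawler) sees the same page.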
This is Google once again asking you, nicely, to stop blocking Googlebot from crawling your CSS and JavaScript files.
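If your robots.txt currently disallows those directories, a small change is usually enough. The paths below are hypothetical examples; the pattern is simply to allow crawling of the CSS and JavaScript your pages need to render, while keeping genuinely private areas blocked:

```
# Hypothetical robots.txt sketch: let crawlers fetch the CSS and JS
# needed to render the page, while still blocking private areas.
User-agent: *
Allow: /css/
Allow: /js/
Disallow: /admin/
```

Google's Fetch and Render tool in Webmaster Tools will show you which blocked resources are actually preventing it from seeing the mobile version.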
Forum discussion at Google+.