Yesterday, Google came out with a really nice upgrade to the PageSpeed Insights tool that in many cases shows real-user page speed metrics based on Chrome user data. But in plenty of other cases the report shows "unavailable," mostly for smaller sites that get less traffic than larger sites.
Why is Google showing "unavailable"? Simply because they do not yet have enough data to measure the page speed of that page. Google explains:
PSI uses data from the Chrome User Experience Report, which provides speed data for popular URLs that are known by Google's web crawlers. If the speed data for the queried URL is not available in the CrUX dataset, we recommend using Lighthouse to run a synthetic performance audit to estimate page speed, and investigate page optimization recommendations provided by PSI and Lighthouse.
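If you want to check programmatically whether Google has that real-user data for a given URL, you can hit the PageSpeed Insights API yourself. Here is a minimal sketch in Python, assuming the public v5 runPagespeed endpoint and the requests library; the exact shape of the response when field data is missing is an assumption, so treat it as illustrative rather than definitive:

```python
# Minimal sketch: check whether PageSpeed Insights returns real-user (CrUX)
# field data for a URL. Assumes the public v5 runPagespeed endpoint; an API
# key is optional for light usage.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def check_field_data(url, api_key=None):
    params = {"url": url, "category": "performance"}
    if api_key:
        params["key"] = api_key
    response = requests.get(PSI_ENDPOINT, params=params, timeout=60)
    response.raise_for_status()
    data = response.json()

    # "loadingExperience" holds the Chrome UX Report field data; when Google
    # has too little traffic for the URL it may be missing or empty (the
    # "unavailable" case in the PSI UI) -- that fallback behavior is an
    # assumption here.
    field = data.get("loadingExperience", {})
    metrics = field.get("metrics")
    if metrics:
        print("Field data available:", field.get("overall_category"))
        for name, values in metrics.items():
            print(f"  {name}: percentile={values.get('percentile')}")
    else:
        print("No CrUX field data for this URL; fall back to the Lighthouse "
              "lab numbers in 'lighthouseResult'.")

if __name__ == "__main__":
    check_field_data("https://www.example.com/")
```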
My corporate site, as I said yesterday, gets the unavailable message because it gets a lot less traffic than a news site like this one:
That message is a turn-off for some, as Alan put it:
Until @googlewmc fixes the very high rate of "unavailable" in the new GPSI, it's pretty much useless for serious audit work. This was not a problem before the latest changes to GPSI.
— Alan Bleiweiss (@AlanBleiweiss) January 10, 2018
Google's John Mueller did share a bit more information about the source of this data:
Ah! I believe you mean the data from the Chrome user experience report. There's more about that at https://t.co/LUOw6gu0a3 + infos on how to check Big Query for the specifics.
— John ☆.o(≧▽≦)o.☆ (@JohnMu) January 11, 2018
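John's BigQuery pointer refers to the public Chrome User Experience Report dataset. Here is a minimal sketch in Python, assuming the google-cloud-bigquery client, a Google Cloud project with BigQuery access, and one of the monthly all-origins tables (201712 is used as an example); if the query comes back empty, the origin simply is not in CrUX, which matches the "unavailable" message in the tool:

```python
# Minimal sketch: check whether an origin appears in the public CrUX dataset.
# Assumes the google-cloud-bigquery client and a GCP project with BigQuery
# access; the table name (monthly "all" release for December 2017) is an example.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT bin.start, SUM(bin.density) AS density
    FROM `chrome-ux-report.all.201712`,
      UNNEST(first_contentful_paint.histogram.bin) AS bin
    WHERE origin = 'https://www.example.com'
    GROUP BY bin.start
    ORDER BY bin.start
"""

rows = list(client.query(query).result())

if not rows:
    # No rows means the origin is not popular enough to be in CrUX --
    # the same situation that makes PSI show "unavailable".
    print("Origin not found in the CrUX dataset.")
else:
    # Each row is one bin of the first contentful paint histogram.
    for row in rows:
        print(f"FCP bin starting at {row.start} ms: {row.density:.4f} of page loads")
```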
The question people are asking is: what is the traffic threshold a less popular site needs to hit before data shows up in that box?
Forum discussion at Twitter.