Yahoo recently announced that it is supporting four new types of exclusion directives in the robots.txt file: NOINDEX, NOARCHIVE, NOSNIPPET, and NOFOLLOW. Being able to declare these directives in robots.txt benefits folks who store PDFs, Word documents, and other non-HTML files on the web and cannot easily place the directives in a page's header.
Google actually expanded its robots exclusion protocol support in July with the unavailable_after tag, and Sebastian discovered the Noindex: / directive that blocks Googlebot from crawling your entire site.
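For illustration only, here is what a robots.txt using these directives might look like. The directive names come from the announcements above, but the per-path syntax and the example paths are assumptions, so check Yahoo's and Google's own documentation before relying on any of this:

```
# Hypothetical robots.txt -- directive names per the announcements,
# paths and exact syntax are illustrative assumptions.
User-agent: Slurp
Disallow: /private/
Noindex: /drafts/        # keep these pages out of Yahoo's index
Noarchive: /reports/     # don't show a cached copy
Nosnippet: /pdfs/        # don't show a text snippet in results
Nofollow: /directory/    # don't follow links found on these pages

User-agent: Googlebot
Noindex: /               # the undocumented directive Sebastian tested
```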
The downside to these changes is that on-page tags no longer tell the whole story: you'll also have to check the robots.txt file to see whether link juice is passed.
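As a sketch of what that check might involve, here is a small Python script that downloads a site's robots.txt and reports any of these directives it declares. The site name is just a placeholder, and the simple line-by-line parsing is an assumption for illustration, not how any search engine actually interprets the file:

```
import urllib.request

# Directives discussed above; this only reports what a site declares,
# not whether a given engine actually honors each one in robots.txt.
DIRECTIVES = ("noindex", "nofollow", "noarchive", "nosnippet")

def robots_txt_directives(site):
    """Fetch https://<site>/robots.txt and list any of the directives found."""
    url = f"https://{site}/robots.txt"
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode("utf-8", errors="replace")

    found = []
    for line in body.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments and whitespace
        if ":" not in line:
            continue
        field, _, value = line.partition(":")
        if field.strip().lower() in DIRECTIVES:
            found.append((field.strip(), value.strip()))
    return found

if __name__ == "__main__":
    # "example.com" is a placeholder; substitute the site you want to audit.
    for directive, path in robots_txt_directives("example.com"):
        print(f"{directive}: {path}")
```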
Yahoo also announced that this is related to its most recent search update:
Along with this change, we'll be rolling out additional changes to our crawling, indexing and ranking algorithms over the next few days. We expect the update will be completed early next week, but you may see some changes in ranking as well as some shuffling of the pages in the index during this process.
Forum discussion continues at WebmasterWorld.