In Google's Webmaster Guidelines they write:
If the site map is larger than 100 or so links, you may want to break the site map into separate pages.
The 100-links-per-page limit has been a concern in the SEO world for years. What is behind this concern?
(1) Search engine spiders want to be efficient. If they find a page with thousands of links on it, it may take a really long time for the spider to get to all of them.
(2) A page with hundreds of links on it can overwhelm the user.
But to think that Google or other search spiders are not built to crawl hundreds of links on a page seems a bit ridiculous. They may be programmed to stop at a certain point, but I doubt that number is 100. In addition, to think that a good usability designer cannot design an easy-to-use page with over a hundred links is also a bit wrong.
As Ben wrote in What Is Google's Indexing Limit?
The first mythbuster is that you can have more than 100 links in a navigation menu and get by just fine. The prevailing thought for a long time was that Google would only spider the first 100 links, and any more risked a penalty. Not true anymore; times have changed. However, there are still inherent problems with more than 100 links, such as page size, which can cap the number of spiderable links, and so on.
So if you have 120 links on a page and that page is well designed and easy to use, I would not worry. But if you feel you can make that page even easier to use by breaking it out into additional pages, then that is probably the way to go.
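If you do decide to split a large site map, the splitting itself is simple. Here is a minimal sketch, assuming a plain Python script and a hypothetical list of URLs, that breaks a long list of links into site map pages of at most 100 links each:

```python
# Minimal sketch: split a long list of links into HTML site map pages
# of at most 100 links each. The URLs below are placeholders, not real pages.

LINKS_PER_PAGE = 100

def paginate(links, per_page=LINKS_PER_PAGE):
    """Yield successive chunks of at most `per_page` links."""
    for start in range(0, len(links), per_page):
        yield links[start:start + per_page]

def render_page(chunk, page_number):
    """Render one site map page as a simple HTML list of links."""
    items = "\n".join(f'  <li><a href="{url}">{url}</a></li>' for url in chunk)
    return f"<h1>Site Map (page {page_number})</h1>\n<ul>\n{items}\n</ul>\n"

if __name__ == "__main__":
    # Hypothetical example: 250 links become 3 site map pages (100 + 100 + 50).
    links = [f"https://www.example.com/article-{i}" for i in range(1, 251)]
    for number, chunk in enumerate(paginate(links), start=1):
        with open(f"sitemap-{number}.html", "w", encoding="utf-8") as f:
            f.write(render_page(chunk, number))
```

This is only an illustration of the chunking idea; on a real site you would also link the site map pages to each other so spiders and users can move between them.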
Forum discussion at Search Engine Roundtable Forums and Search Engine Watch Forums.