Last week we reported that Bing expanded its submit URL tool from 10 URLs per day to 10,000 URLs per day, removing essentially all of the quotas. The surprising part of that announcement was that Bing seemed to want to do away with crawling as a strategy for getting web pages into its index.
Yesterday, I interviewed Christi Olson, Bing's head of search evangelism, for Search Engine Land, and she confirmed that this is the direction Bing hopes to take. Crawling publishers' sites is inefficient and resource-intensive for both Bing and the publishers. Instead, Bing hopes publishers will begin submitting their content directly to the search engine.
One point Christi Olson made crystal clear to me is that publishers should be using this now. When you publish content, submitting the URL sends Bing a "strong signal" to crawl the page immediately. If you want to all but guarantee faster crawling, and thus faster indexing, of your content in Bing, use it. You can submit URLs manually in the Bing Webmaster Tools interface or wire Bing's API into your CMS. I'll be adding it here using the API soon.
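For those who want to hook this into a CMS, here is a rough idea of what an API call might look like. This is a minimal Python sketch, assuming the JSON SubmitUrlbatch endpoint and an API key generated in Bing Webmaster Tools; verify the exact endpoint, payload shape, and quota handling against Bing's own API documentation before using it.

```python
# Minimal sketch of submitting URLs to Bing's URL Submission API.
# The endpoint and JSON payload below follow Bing's Webmaster Tools
# API documentation, but should be verified against the current docs.
import requests

API_KEY = "YOUR_BING_WEBMASTER_API_KEY"  # generated in Bing Webmaster Tools
ENDPOINT = "https://ssl.bing.com/webmaster/api.svc/json/SubmitUrlbatch"

def submit_urls(site_url, urls):
    """Submit a batch of new or updated URLs to Bing for crawling."""
    response = requests.post(
        ENDPOINT,
        params={"apikey": API_KEY},
        json={"siteUrl": site_url, "urlList": urls},
        timeout=10,
    )
    response.raise_for_status()  # a non-2xx status means the submission failed
    return response.json()

if __name__ == "__main__":
    submit_urls(
        "https://www.example.com",  # hypothetical site for illustration
        ["https://www.example.com/new-article"],
    )
```

In a CMS, you would typically call something like this from the publish or update hook, so the submission fires as soon as the content goes live.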
I dug into many of the concerns about this strategy change and how it might impact SEOs over at Search Engine Land, so read that. And yes, this is what Yoast is doing with Bing. I am still trying to find out officially what Yoast is doing with Google here.
Christi Olson from Bing answered additional questions from the SEO community on Twitter after I published this story yesterday. Here they are:
Thank you for taking the time to talk with us today @rustybrick. :) https://t.co/X4KEOLpvkk
— Christi Olson (@ChristiJOlson) February 7, 2019
Yes point and feedback taken. Crawling isn't going to stop - this is a signal for new and fresh/updated data to be included in the index.
— Christi Olson (@ChristiJOlson) February 7, 2019
More quickly. Think about it this way - today not every website is crawled daily, weekly, or monthly. When new pages are added or content of a page is updated submitting the URL is the signal to crawl more quickly so that it's included in the index.
— Christi Olson (@ChristiJOlson) February 7, 2019
The systematic crawling of the web to check for new content and updates to existing content is inefficient and takes/wastes a lot of resources. Receiving a signal helps to improve crawling efficiency. :)
— Christi Olson (@ChristiJOlson) February 7, 2019
Forum discussion at Twitter.