This week in search I posted the big monthly Google Webmaster report, so catch up there quickly. Bing announced a boatload of new search spam penalties that are rolling out. Google may have done a local search algorithm update this week. Google shared some details on what it is working on for health and medical queries. Google has rolled out the speed reports in Search Console, but there is no ETA on an API. GoogleBot is now running Chrome 78. Google said some SEOs use "no" as a way to confirm their own theories. Google said it is a bad sign if your robots.txt or sitemap file ranks for normal searches. Google said do not use robots.txt to block indexing of URLs with parameters (there is a short sketch of why after this paragraph). Oh yeah, two wrongs don't make a right in search or SEO. Google's production parser is different from the Java port that Search Console uses. Google is testing an image carousel in the local pack. Google My Business limits businesses to 20 service areas. Google Ads Editor was upgraded to version 1.2. Google Merchant Center now supports multi-country feeds. Google won't use Fitbit data for search or search ads. Check out the vlog with Aja Frost from HubSpot and check out my takeaways from the Google Webmaster Conference. That was this past week in search at the Search Engine Roundtable.
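On the robots.txt point, here is a minimal sketch of why blocking parameterized URLs there is the wrong tool (using a hypothetical example.com site and a made-up color parameter): a robots.txt disallow only stops crawling, so Google can never see a noindex or canonical signal on those pages, and the blocked URLs can still end up indexed from links alone. Letting them be crawled and pointing them at the canonical version is the sort of alternative Google suggests instead.

```
# Anti-pattern: trying to keep parameterized URLs out of the index via robots.txt.
# This only blocks crawling; the URLs can still be indexed without content.
User-agent: *
Disallow: /*?color=

# Preferred: leave the URLs crawlable and signal the canonical version instead,
# e.g. on https://example.com/products?color=red include:
# <link rel="canonical" href="https://example.com/products">
```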
Make sure to subscribe to our video feed or subscribe directly on iTunes to be notified of these updates and download the video in the background. Here is the YouTube version of the feed:
For the original iTunes version, click here.
Search Topics of Discussion:
- November 2019 Google Webmaster Report
- Bing Inorganic Site Structure Spam Penalties
- Google My Business Local Listing Update This Week
- Google To Go Big With Health Search For Medical Queries?
- New Google Search Console Speed Reports Rolling Out
- No ETA On Google Search Console Speed Reports API
- Evergreen GoogleBot Now On Chrome 78
- Google: SEOs Use "No" As A Way To Confirm Their Theories
- Google: It Is A Bad Sign If Your Robots.txt Or Sitemap File Is Ranking For Normal Queries
- Google: Do Not Use Robots.txt To Block Indexing Of URLs With Parameters
- Two Wrongs Don't Make A Right With Google And Breaking Guidelines
- Google: Our Production Parser Is Different; The Search Console Parser Uses A Java Port
- Google Tests Image Carousel In Local Pack
- Google My Business Limits Businesses To 20 Service Areas
- Google Ads Editor v1.2 Adds New Campaign Types & Cross-Account Features
- Google Merchant Center Gets Multi-Country Feed Support & Redesign
- Google: We Won't Use Fitbit Data For Search Ads
- Vlog #23: Aja Frost from HubSpot On Topic Clusters, Featured Snippets, Image SEO & Suggested Clips
- Takeaways: Google Webmaster Conference Product Summit
Please do subscribe on YouTube or subscribe via iTunes or on your favorite RSS reader. Don't forget to comment below with the right answer and good luck!