Have you heard of Effingo by Google? Google actually spoke about it on its Google Cloud blog in May 2023. Now, Gary Illyes from Google said it is used by Google Search and other areas of Google to do insanely fast data transfers across Google data centers.
By insanely fast, Gary said that "Effingo is transferring a little over 1 exabyte of data per day at Google, data like some of our ranking signals, our web and media indexes, maybe the raw bytes of people's photos." That comes out to be "about 14 terabytes per second," he wrote on LinkedIn.
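If you want to check that math yourself, here is a minimal back-of-the-envelope sketch in Python. The 1.2 exabytes/day figure and the use of decimal (SI) units are my assumptions for "a little over 1 exabyte," not numbers Gary gave:

```python
# Back-of-the-envelope check: convert exabytes per day to terabytes per second.
# Assumptions (mine): decimal (SI) units, and 1.2 EB/day as "a little over 1 exabyte."

SECONDS_PER_DAY = 24 * 60 * 60   # 86,400 seconds in a day
TB_PER_EB = 1_000_000            # 1 exabyte = 1,000,000 terabytes (SI)

exabytes_per_day = 1.2
tb_per_second = exabytes_per_day * TB_PER_EB / SECONDS_PER_DAY

print(f"~{tb_per_second:.1f} TB/s")  # ~13.9 TB/s, i.e. "about 14 terabytes per second"
```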
I assume the data transfers specific to Google Search are the indexes and other search data that Google moves across its Google Search data centers.
Gary explained one case in which he used it at Google. He wrote:
One of my projects about a decade ago required replicating a large amount of data, probably order of a few dozen terabytes, to 3 regions/datacenters. I knew how painful it is to use traditional methods for copying data and that it will take an eternity for the replication to finish, so, naturally, I was procrastinating a lot. My tech lead asked me what's up and I explained my dread. He asked whether I've heard of Effingo. I have not, yet 5 minutes later I was already typing up the command that will take my data from datacenter A and will copy it to B and C. I was expecting the replication to take a few hours; it took about 3 minutes. I was in love.
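To get a feel for how dramatic that speedup is, here is a rough, hypothetical comparison. The 30 TB figure and the single 10 Gbps link for the "traditional" copy are my assumptions, not numbers Gary gave:

```python
# Rough comparison for Gary's anecdote: "a few dozen terabytes" to 3 datacenters.
# Assumptions (mine, not Gary's): ~30 TB of data, a "traditional" copy over a
# single 10 Gbps link per destination, decimal (SI) units throughout.

data_tb = 30                                  # "a few dozen terabytes"
data_bits = data_tb * 1e12 * 8                # terabytes -> bits

link_gbps = 10                                # assumed single-link copy speed
traditional_seconds = data_bits / (link_gbps * 1e9)
print(f"Traditional copy: ~{traditional_seconds / 3600:.1f} hours per destination")  # ~6.7 hours

effingo_seconds = 3 * 60                      # "it took about 3 minutes"
effective_gb_per_s = data_tb * 1000 / effingo_seconds
print(f"Effective Effingo rate: ~{effective_gb_per_s:.0f} GB/s")  # ~167 GB/s
```

Under those assumptions, a conventional copy lands squarely in the "few hours" range Gary expected, while the three-minute result implies an effective rate in the hundreds of gigabytes per second.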
Here is the Google paper on this, released just last week.
That is just so insanely fast...
And that name - Effingo - amazing. On the name, Gary added in the comments, "fun background detail: we used to have a training that had a module about how to name things such that we're not going to be embarrassed if it ends up on the front page of NYT/WSJ."
Forum discussion at LinkedIn.