Google Cloud today announced Transfer Service, a new service for enterprises that want to move their data from on-premises systems to the cloud. This new managed service is meant for large-scale transfers, on the order of billions of files and petabytes of data. It complements similar services from Google that let you ship data to its data centers via a hardware appliance and FedEx, or automate data transfers from SaaS applications to Google’s BigQuery service.
Transfer Service handles all of the hard work of validating your data’s integrity as it moves to the cloud. Its on-premises agent automatically handles failures and uses as much available bandwidth as it can to reduce transfer times.
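To make that integrity check concrete, here is a minimal sketch of the kind of validation such a service performs: hash the local file and compare it against the checksum Cloud Storage records for the uploaded object. The bucket, object and file names are placeholders, and this illustrates the general technique rather than Transfer Service’s actual implementation.

```python
import base64
import hashlib

from google.cloud import storage


def matches_uploaded_checksum(local_path: str, bucket_name: str, object_name: str) -> bool:
    # Hash the local file in chunks so large files don't exhaust memory.
    digest = hashlib.md5()
    with open(local_path, "rb") as f:
        for chunk in iter(lambda: f.read(8 * 1024 * 1024), b""):
            digest.update(chunk)
    local_md5 = base64.b64encode(digest.digest()).decode("utf-8")

    # Cloud Storage exposes each object's MD5 as a base64-encoded string,
    # so the transferred copy can be compared byte-for-byte against the source.
    blob = storage.Client().bucket(bucket_name).get_blob(object_name)
    return blob is not None and blob.md5_hash == local_md5
```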
Getting started is straightforward: you install an agent on your on-premises servers, select the directories you want to copy and let the service do its job. You can then monitor and manage your transfer jobs from the Google Cloud console.
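For teams that would rather script this than click through the console, transfer jobs can also be created programmatically. The sketch below uses the google-cloud-storage-transfer Python client; the project ID, source directory, bucket and agent pool names are placeholders, and the field names follow the public Storage Transfer API, so treat this as an assumption-laden outline rather than an official recipe.

```python
from google.cloud import storage_transfer


def create_onprem_transfer_job(project_id: str, source_dir: str, bucket: str, agent_pool: str):
    client = storage_transfer.StorageTransferServiceClient()
    job = client.create_transfer_job(
        {
            "transfer_job": {
                "project_id": project_id,
                "description": "On-premises filesystem to Cloud Storage",
                "status": storage_transfer.TransferJob.Status.ENABLED,
                "transfer_spec": {
                    # The agents in this pool (installed on the on-premises
                    # servers) read from the POSIX directory below...
                    "source_agent_pool_name": f"projects/{project_id}/agentPools/{agent_pool}",
                    "posix_data_source": {"root_directory": source_dir},
                    # ...and write the files into this Cloud Storage bucket.
                    "gcs_data_sink": {"bucket_name": bucket},
                },
            }
        }
    )
    print(f"Created transfer job: {job.name}")
    return job
```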
The obvious use cases for this are archiving and disaster recovery. But Google is also targeting companies that are looking to lift and shift workloads (and their attached data), as well as analytics and machine learning use cases.
As with most of Google Cloud’s recent product launches, the focus here is squarely on enterprise customers. Google wants to make it easier for them to move their workloads to its cloud, and for most workloads, that involves moving lots of data as well.