Data movement


If you must move your data, do it 20x faster than ever before.

The ability to move your data from one place in your organization to another (from edge to cloud, or datacenter to datacenter) is a core function of edge, cloud, and hybrid/multi-cloud computing. Sending large amounts of data, or any amount of data over distance, can be slow and unreliable, increasing your security exposure and decreasing productivity.

Just want to move your data? We enable you to do so ridiculously fast.

Use cases

When might you want to move data?

There are a variety of circumstances where moving or transferring data exceptionally fast recovers lost time, increases productivity, and drives higher business value, such as:

Backup and recovery

A critical, often costly, workflow that must meet aggressive recovery time objectives (RTO).

Vendor or hardware refresh

Updating and integrating new data stores is often slow and costly.

Cloud migration or repatriation

Digitization often means getting data to or from the cloud—and later optimizing data locations.

Mergers and acquisitions

Moving and combining large volumes of data is expensive and complex.


Never wait on your data again.

Faster time to insight

Take meaningful action up to 20x sooner on a single data set with local-like performance.

Enhance agility

Easily move data across hybrid, multi-cloud environments, from edge to core to cloud.

Increase performance

Improve app performance by reducing the effects of latency and the time spent waiting on data.

Improve security posture

Improve your risk profile by reducing the time data is in flight (and better protecting it in transit).

Reduce operational costs

Decrease project timelines and costs, such as migrations, by moving data faster.

Consistent workforce experience

Make data quickly and highly available to a decentralized workforce whenever it's needed.


Let's get moving! (or not?)

Tasks like moving data from on-prem to the cloud, replicating data between cloud regions, relocating a data center, or upgrading a hardware IT platform all face a significant challenge: how to move data in a timely manner without impacting business operations. Typically, and particularly for data at scale, this process is slow and costly. With existing data transmission approaches, data arrives at the destination at an unpredictable time, and performance varies with data size and type, so much so that physically transporting data can be considered an acceptable solution. This negatively impacts productivity, impedes employee collaboration and workflow efficiency, increases operational costs, and can put valuable IP at risk.

Vcinity Solution

Move and use data 20x faster.

Vcinity enables you to move large datasets (yes, even petabytes) across hybrid and multi-cloud environments, even across globally dispersed sites or locations.

Vcinity’s technology can be deployed as bookends across your choice of Hub (hardware), Edge (hardware or software), and Cloud (software) locations.

An example of how it works.

Let’s take a critical workflow for many organizations: backup and restore. Companies typically work to identify the most cost-effective option that allows them to meet aggressive Recovery Time Objectives (RTO). Typically, the shorter the RTO, the closer your data needs to be, often on premises. Not only is the amount of data you have to store growing, but so is the number of local copies you may need to meet your Recovery Point Objective (RPO). With Vcinity’s remote data access, recovery can begin instantaneously, no matter where the backup copy is, resulting in near-zero RTO. Vcinity’s data movement reduces the time to back up and store remotely by up to 94%, enabling more backup cycles and significantly improving RPO. Eliminating the time and cost of extra copies also increases your agility and security posture.
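To make the backup numbers concrete, here is a quick back-of-the-envelope calculation. The 94% reduction comes from the text above; the 10-hour baseline backup window is a hypothetical figure chosen for illustration.

```python
# Illustrative arithmetic for the backup example. The 10-hour baseline
# window is hypothetical; the 94% reduction figure comes from the text.

baseline_hours = 10.0
reduced_hours = baseline_hours * (1 - 0.94)   # 94% faster remote backup
print(reduced_hours)                          # 0.6 hours, i.e. 36 minutes

# More backup cycles fit into the same window, which tightens RPO:
cycles_per_window = baseline_hours / reduced_hours
print(round(cycles_per_window, 1))            # ~16.7x more cycles
```

In other words, a remote backup that once consumed an overnight window can complete in roughly half an hour, and the freed-up window can be spent on more frequent backup cycles.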

Our technology features

How we move data so darn fast.

A goal of modern data strategies should be to make your data accessible when and where you need it without moving it—but if you must move your data, you want to do so insanely fast—even over great distances.

Historically, data movement has been optimized through a variety of methods, such as shrinking payloads via compression and deduplication, switching from TCP to UDP to reduce the effects of flow control, or presorting data for a particular workflow. While helpful, the impact of these strategies tends to be inversely related to the distance data travels, peaking at around 30% real bandwidth utilization beyond a metro network.

We take a layered, compounding approach to moving your data fast, with sustained 90 percent bandwidth utilization. Our technology scales linearly with your bandwidth, allowing you to transfer ~100TB of data over a 10 Gbps connection from the US west coast to the east coast just as quickly and easily as you would 1PB over a 100 Gbps connection (either takes about 23 hours).
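The linear-scaling claim follows directly from the arithmetic: transfer time is data size divided by effective throughput, so multiplying both by ten leaves the time unchanged. A small sketch, using decimal units (1 TB = 8,000 gigabits) and the 90% sustained-utilization figure from the text:

```python
# Back-of-the-envelope transfer-time math. The 90% sustained-utilization
# figure comes from the text above; decimal units are assumed
# (1 TB = 8,000 gigabits).

def transfer_hours(data_terabytes: float, link_gbps: float,
                   utilization: float = 0.9) -> float:
    """Hours to move `data_terabytes` over a `link_gbps` link at the
    given sustained utilization."""
    gigabits = data_terabytes * 8_000          # TB -> gigabits
    effective_gbps = link_gbps * utilization   # usable throughput
    return gigabits / effective_gbps / 3600    # seconds -> hours

# 100 TB over 10 Gbps and 1 PB (1,000 TB) over 100 Gbps take the same
# time, because both numerator and denominator scale by 10x:
print(round(transfer_hours(100, 10), 1))     # 24.7 hours (~ the text's 23)
print(round(transfer_hours(1_000, 100), 1))  # 24.7 hours

# The same transfer at ~30% utilization (the long-haul ceiling of tuned
# legacy tools, per the text) takes roughly three times longer:
print(round(transfer_hours(100, 10, utilization=0.3), 1))  # 74.1 hours
```

The roughly 3x gap between the last two results is the practical difference between 90% and 30% bandwidth utilization over the same link.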

We deliver these results—minimizing the disruptive effect of latency on data transfers—by re-engineering multiple aspects of data movement, such as:

At the access level, we extend high performance computing (HPC) protocols typically used for quicker data flow, turning your WAN into a Global LAN.

We pre-negotiate based on size, resequence in memory, first attempt to rebuild lost packets at the destination (instead of resending), resend only the missing data in a drop scenario, and use parallelization.
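The rebuild-at-destination idea can be illustrated with a simple parity scheme: if each group of packets ships with one XOR parity block, the receiver can reconstruct a single lost packet locally rather than waiting a full round trip for a resend. This is a conceptual sketch only, not Vcinity's actual wire protocol, which is not described in detail here.

```python
# Conceptual sketch of "rebuild lost packets at the destination instead
# of resending": one XOR parity packet per group lets the receiver
# recover any single loss locally. Illustrative only; not Vcinity's
# actual protocol.

from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def add_parity(packets: list) -> list:
    """Append one parity packet covering a group of equal-size packets."""
    return packets + [reduce(xor_bytes, packets)]

def rebuild(received: list) -> list:
    """Recover at most one missing (None) packet from the group parity."""
    missing = [i for i, p in enumerate(received) if p is None]
    if len(missing) > 1:
        raise ValueError("more than one loss: fall back to resending")
    if missing:
        present = [p for p in received if p is not None]
        received[missing[0]] = reduce(xor_bytes, present)
    return received[:-1]  # drop the parity, keep the data packets

group = add_parity([b"pkt0", b"pkt1", b"pkt2"])
group[1] = None                      # simulate one dropped packet
print(rebuild(group))                # [b'pkt0', b'pkt1', b'pkt2']
```

The round-trip saved per recovered packet is exactly where latency hurts most on long-haul links, which is why recovering at the destination beats resending from the source.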

Because sequencing is done in memory, we can send packets simultaneously across multiple paths of differing latency, further obscuring your data on top of an already secure transfer. For instance, you can double-encrypt data before sending with two different encapsulated algorithms, split the stream into shards, send the shards down entirely different paths or topologies (for instance, over WAN and satellite), and then rejoin them at the destination, where your data is reconstituted and decrypted.
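The shard-and-rejoin step can be sketched in a few lines. The round-robin shard scheme below is a hypothetical illustration (the actual sharding is not specified in the text), and the encryption step is assumed to have already happened upstream:

```python
# Illustrative sketch of splitting an (already encrypted) stream into
# shards, sending each down an independent path, and reassembling at
# the destination. The round-robin scheme and path labels are
# hypothetical, chosen only to show the idea.

def shard(data: bytes, n_paths: int) -> list:
    """Round-robin bytes across n shards, one shard per path."""
    return [data[i::n_paths] for i in range(n_paths)]

def reassemble(shards: list) -> bytes:
    """Interleave the shards back into the original byte stream."""
    out = bytearray()
    for i in range(max(len(s) for s in shards)):
        for s in shards:
            if i < len(s):
                out.append(s[i])
    return bytes(out)

payload = b"already-encrypted stream"
paths = shard(payload, 2)            # e.g. one shard via WAN, one via satellite
assert reassemble(paths) == payload  # destination reconstitutes the stream
```

An observer intercepting a single path sees only every other byte of ciphertext, which is the "further obscure your data" property: compromising one path is not enough to recover even the encrypted stream, let alone the plaintext.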

With Vcinity, you can quickly, easily, and securely move your data across your global, hybrid, multi-cloud environment.

Start moving your data—fast—with Vcinity.


Do you wish to connect?

Contact us if you need further assistance.