
A modern approach for enabling migrations to AWS

August 8, 2022

Steve Wallo, CTO

The law of data gravity:

The call of cloud computing and cloud environments, especially ones as mature and feature-rich as Amazon Web Services (AWS), will always be at the forefront of organizations’ digital journeys. The opportunity to take advantage of robust analytics, machine learning, emerging microservices, near-infinite scalability and elasticity, enhanced security, and more can be game-changing. Yet the ability to use these functions ultimately requires access to data, and that data is created en masse from edge to cloud. Making these large datasets accessible to the application is often complex, time-consuming, and expensive, a challenge often referred to as “data gravity.” This is a growing concern for companies, particularly as data creation at the edge, in non-cloud locations, is exploding. Microservices and other constrained compute capabilities are being designed specifically for edge applications, yet many organizations still need to leverage cloud-based operations. The challenge is balancing the complementary nature of edge and cloud resources while also accounting for the physical distance between where data is created and where it is needed.

There are two approaches to addressing this gravity issue. First, you can move your workload to the data: make your distant data immediately available to cloud services by removing the impacts of distance and latency. Second, if you do need to move your data, do so in a way that minimizes transfer times to maximize the value of that data, both in terms of quicker monetization and timelier insights.

 

Moving your workload to AWS:

The ideal hybrid or distributed cloud scenario is one that is flexible and allows you to securely use cloud-based services against the freshest data possible. Vcinity’s revolutionary software enables your AWS applications to execute on remote data where and when it is created (such as on-prem or at the edge)—delivering local-like performance within your scalable AWS compute environment. This capability fundamentally changes the age-old paradigm that compute and data must be co-located for the application to optimally perform. Vcinity enables this separation of applications and data via a high-performance tunnel that connects data to your locations (memory to memory) at near-line speeds, while eliminating the performance loss due to latency beyond the first bit.

Examples of use cases well-suited to leverage your data in place while still using AWS compute include, but are not limited to:

  • Enabling workstations in AWS for remote workers and artists/creatives.
  • Taking advantage of batch or burst compute for life sciences and for media and entertainment rendering.
  • Executing analytics applications, like Vertica, Teradata, or Snowflake, in AWS on data located at the edge or in another AWS region.
  • Using several silos of data in real time to facilitate the training and curation of Amazon SageMaker models.
  • Enabling analysts to work remotely on data that has data residency requirements.

 

Moving your data to AWS:

While moving your workload to the cloud may provide optimal flexibility, sometimes data must be moved, such as when migrating data into the AWS cloud for the first time or between AWS regions. In those instances, current methods often fall short. Vcinity expedites these data transfers by using incumbent WAN connections and moving data at near line-speeds in a faster, more deterministic manner.

Examples of when you may want to hasten data movement into or around AWS environments include, but are not limited to:

  • Centralizing data storage in a database, such as Amazon Timestream, Amazon Relational Database Service (Amazon RDS), or Amazon Redshift.
  • Consolidating into Amazon S3 storage.
  • Archiving to Amazon S3 Glacier storage classes.
  • Backing up and recovering offsite workloads.
  • Managing log aggregation and storage.

Additionally, being able to move data more quickly and efficiently opens up further workflows for operational optimization, such as future mergers and acquisitions, meeting more aggressive recovery targets, and broader adoption of AWS cloud services across distributed or hybrid environments. This ultimately allows for faster data monetization and more effective access to time-perishable datasets.

 

Data anti-gravity considerations:

Whether you decide to move your workload or your data to AWS, you need to consider several key performance indicators to ensure you are designing the most impactful data and cloud strategy. While not exhaustive, metrics to consider include:

  • Cost of the project in money: Several factors play into overall costs, from subscriptions to purchases to incidentals like egress fees and surcharges.
  • Cost of the project in time: The full amount of time required to sanitize data, curate it, and perform project management, which can result in even the smallest workloads taking weeks or months to complete.
  • Cost of the project in complexity: Processes such as reformatting, refactoring, retooling, re-permissioning, and repositioning data can increase the risk profile of the data and incur undue exposure; they demand transparency and a well-managed approach.
  • Limitations of environment: Does the data have sufficient reach where it is located? If moving to AWS, will storage within a certain region or availability zone be sufficient, or do you need an instantaneous, global view?

Using Vcinity to enable applications in AWS to work against data anywhere, whether at the edge, on-prem, or already in the AWS cloud, lowers cost in time, money, complexity, and oversight, while eliminating the typical bottlenecks around data locality or proximity.

 

Best practices for moving your workload to AWS:

The concept of increasing cloud utilization by moving your workload to AWS, as opposed to your data, is more novel than a traditional data migration. As mentioned earlier in this blog, moving your workload, not your data, gives you the flexibility to take advantage of AWS microservices while keeping data on-prem, in another AWS region, or elsewhere. Below are a few best practices as you migrate your workloads to AWS:

Think big, think different:

Now that you can use Vcinity to counteract data gravity, establish which services would be the most useful to your organization without the constraints of compute and data locality. Example questions to ask include:

  • Could backing up to and restoring from AWS at any of my locations lower my data resilience cost and improve my risk posture?
  • Could I benefit from bursting to AWS for sporadic processing of unwieldy datasets, without waiting to pre-cache or copy data to AWS?
  • Would expanding development into cloud resources against my on-prem data result in a quicker time to revenue?

Moving data? Optimize your pipeline:

Once you’ve defined how AWS resources can propel your company’s vision forward, determine the infrastructure needed to support your intended volume of data. For example, a key consideration for data transfer use cases is sizing the WAN bandwidth you identify and procure against your data volume. Vcinity can sustain 90+ percent of a provisioned connection regardless of transport or latency. For easy math, that’s roughly 10 TB per day over a 1 Gbps connection. With a dedicated 100 Gbps connection, you could estimate about 1 PB of data transfer across any distance each day.
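The back-of-the-envelope math above can be sketched in a few lines. This is a rough planning aid only, assuming the 90 percent sustained link utilization figure cited here and decimal units (1 TB = 10^12 bytes, 1 Gbps = 10^9 bits/s); real-world throughput depends on your transport, protocol, and workload.

```python
def daily_transfer_tb(link_gbps: float, efficiency: float = 0.9) -> float:
    """Estimate TB moved per day over a sustained WAN link.

    Assumes the link runs at `efficiency` of line rate around the clock.
    Uses decimal units: 1 Gbps = 1e9 bits/s, 1 TB = 1e12 bytes.
    """
    bytes_per_day = link_gbps * 1e9 / 8 * efficiency * 86_400  # seconds/day
    return bytes_per_day / 1e12


print(round(daily_transfer_tb(1)))    # ≈ 10 TB/day on a 1 Gbps link
print(round(daily_transfer_tb(100)))  # ≈ 972 TB/day, roughly 1 PB, on 100 Gbps
```

Running the numbers this way also makes it easy to invert the question: given a dataset size and a deadline, you can solve for the bandwidth you need to provision.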

Stay secure, simple, and flexible:

In today’s data-driven world, data security is paramount. Minimizing movement and copies of data with Vcinity inherently results in better control of data. The fewer copies of data, the smaller the attack vector or footprint of the exposed information. How can control of your data’s location bolster your security profile? Consider whether moving a workload to AWS enables you to leverage cloud compute benefits while maintaining data residency or compliance requirements. Alternatively, consider using Vcinity to enable a single, globally-accessible dataset—resulting in simplified, global security and management of your data.

Always be inventing:

Position yourself competitively by taking advantage of both your data and AWS. Unlocking your data when and where it is created is the quickest path to the continually expanding set of capabilities AWS offers. Continually evaluate how ubiquitous data access allows you to deploy new workloads, experiment with new services, optimize operations, and delight customers.

 

Let’s get moving:

Ready to realize more value from your data? Reach out to your AWS rep or contact us at Vcinity to discuss how moving your workload and/or data to AWS can help accelerate your company’s digital evolution.

Learn more about Vcinity for AWS.
