As companies of all sizes and in all industries learn to cope with a new “business-as-usual” approach that includes all-remote workforces and intense pressure to cut costs, faster innovation and data agility may appear out of immediate reach. Industry leaders who look to data analytics, artificial intelligence (AI), and machine learning are demonstrating that you can have it all, with the right planning and infrastructure.
Data-intensive applications—such as image processing, data analytics, and AI—depend on rapidly growing enterprise data. With that growth come architectural considerations. As datasets grow, network latency forces applications to sit closer and closer to the data. And as more applications are added to the environment, they generate data faster than it can be moved elsewhere without great cost and interruption, making migration nearly impossible. This is the data gravity paradox that creates lock-in and introduces future business risk: the more you gather your data together, the harder it becomes to change how you handle it.
So how are businesses unlocking the power of innovation without being tied to a single cloud? Successes (and challenges) exist across a variety of industries. Here, we’ll look at a few dynamic examples of how multi-cloud accelerates innovation, enhances data agility, and reduces costs.
Media and entertainment
Today’s media and entertainment landscape is increasingly composed of relatively small and specialised studios that meet the swelling content-production needs of the largest players, like Netflix and Hulu. To deliver blockbuster movies and award-winning TV shows, these geographically dispersed studios require efficient collaboration on animation, color correction, special effects, and editing. Multi-cloud solutions enable these teams to work together on the same projects, access their preferred production tools from various public clouds, and streamline approvals without the delays associated with moving large media files from one site to another. A high-throughput, low-latency data lake eliminates concerns that lag will inhibit productivity. Additionally, a central storage solution that attaches to multiple clouds reduces the large egress fees often associated with taking enormous video files out of public clouds.
Beyond the need for collaboration, other factors drive the growth of data and of multi-cloud within media and entertainment. Cameras and viewing devices have ever-greater resolution, meaning that file sizes are larger than ever and demand more bandwidth across dispersed data centers than can be achieved on-premises. Streaming services rely on data analytics to programmatically gauge content popularity, determine what new content should be created, and decide which content should be shelved. Many of the processes related to these workflows increasingly utilise the public cloud due to the availability of complementary datasets and use-case-specific tools for handling different types of analytics.
Transportation and autonomous driving
Connected car and autonomous driving projects generate immense amounts of data from a variety of sensors. For example, Tesla’s Autopilot utilises eight cameras, twelve ultrasonic sensors, and one radar to interpret the car’s surroundings and make decisions about its path and how to avoid potential obstacles. Researchers in this field are trying to accommodate the hundreds of petabytes of video and still-image data used to retrain algorithms. These are still the early days for autonomous vehicles (AVs). When 20–50x more are on the road, handling more variants in driving situations (manoeuvring around any city street, any parking garage, etc.), an even greater amount of deep learning will be required. By 2030, autonomous vehicles on the road are predicted to create one zettabyte of data.
Car manufacturers, public transportation agencies, and rideshare companies are among those motivated to take advantage of multi-cloud innovation, combining access to data across multiple clouds, without the risks of significant egress charges and slow transfers, with the freedom to leverage the optimal public cloud services for each project.
Energy
Within the energy sector, multi-cloud adoption can help lower the significant costs associated with finding and drilling for resources. In one example, an oil and gas services company had accumulated more than 4 petabytes of data, including sonar scans of undersea floors, geospatial photos, and land surveys, for petrotechnical analytics and seismic processing. Engineers and data scientists at this company used machine learning (ML) analytics to identify places that merited more resources to prospect for oil, to gauge the environmental risks of new projects, and to improve safety.
By taking advantage of the services and processing power of multiple clouds, this company created efficiencies that can help save millions of dollars. This is possible by leveraging spot instances across multiple clouds at the same time, yielding much faster results at a lower cost than when only a limited number of GPUs are available in any single cloud. By replicating its on-prem data lake and making the data available to multiple cloud services simultaneously, the organisation supported a wide set of applications and workloads. This demonstrates how oil and gas organisations can scale to petabytes of data without sacrificing time, while also delivering a new level of resilience by enabling cloud-based recovery.
Healthcare and life sciences
Healthcare is one of the industries that’s lagging behind in multi-cloud adoption. This isn’t due to lack of desire, but because of the many challenges around the protection of data. The need to know where data lives, who has access to it, who has accessed it—along with Health Insurance Portability and Accountability Act (HIPAA) regulations and Digital Advertising Alliance (DAA) guidelines—all bring unique challenges in this field.
Even with those caveats, multi-cloud helps healthcare and life sciences unlock the power of innovation. This is especially clear in the realm of genomic analysis, where analysis of huge datasets can help improve—or save—lives. FASTQ files contain raw genome sequencing data: millions of snippets of DNA that need to be assembled like a jigsaw puzzle. These files then allow researchers to perform variant analysis, identifying differences between individuals’ genomes. Analysing genomes is an intensive and storage-hungry process. For example, studying the genomes of 150 cancer patients who receive a particular treatment, then comparing the DNA of those who were treated successfully, those who didn’t respond well, and the general population, may require variant analysis on thousands of people. The ability to scale up across clouds and take advantage of spot instances, while sharing access to datasets with researchers around the world, is critical to making these workflows practical and accessible.
Multiply the innovation from your cloud strategy
A multi-cloud solution, in which the same copy of data is available to multiple clouds, allows users to take advantage of each cloud’s services—more than 500 available today. Multi-cloud storage can improve data agility, provide data proximity without vendor lock-in, and scale compute and storage on demand, independently of each other. Multi-cloud offers financial savings and eliminates many operational complexities, whether you have tens of terabytes or hundreds of petabytes of data.
Adopting a multi-cloud strategy today can future-proof your organisation for when you’re ready to tackle a new workload—without forcing you to copy or move your data closer to the latest cloud capabilities. You may not have that use case today, but an effective multi-cloud strategy lets you pursue it when it arrives tomorrow.