UPDATED 15:54 EDT / JULY 27, 2023


Redefining data management: Supercloud’s role in the modern enterprise landscape

The modern enterprise is no stranger to shifting organizational habits around sourcing, filtering, storing and operationalizing data. But with data volumes swiftly growing beyond the data center, performing those functions has become an increasingly difficult task for companies.

Supercloud expands on the same idea behind multicloud and edge, as modern application development environments demand more data than resides in close proximity to the compute that uses it.

“We move data across wide area networks faster than any other technology,” said Russ Davis (pictured), chief operating officer and chief product officer of Vcinity Inc. “Because of how we do that, we’re able to actually enable compute to work with data that is remote from it. I know that seems like a misnomer of some kind, but for many applications that actually works. And if for whatever reason the network isn’t big enough or there’s just too much data, we go back to the first side of our coin — which is we just then move it to where the compute is.”

Davis spoke with theCUBE industry analyst John Furrier at the Supercloud 3: Security, AI and the Supercloud event, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They discussed solutions to the data movement conundrum as companies increasingly invest in multicloud.

How it all works

Vcinity helps companies speed up time to insight by reducing data delays and keeping applications running seamlessly, which means faster outcomes from the moment data is created. It does this by allowing customers to fuse their wide area network with their storage fabric, according to Davis.

“We’re basically enabling our customers to utilize their wide area networks almost as part of their storage fabric,” he said. “So, we’re extending that storage in the data that’s sitting there to the compute wherever it sits.”
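
What Davis describes has two sides: let compute operate on remote data in place when the wide area network can sustain it, and stage the data next to the compute when it cannot. The sketch below is a minimal, hypothetical illustration of that decision in Python; the mount paths, size threshold and function names are assumptions for the example, not Vcinity's actual interface.

```python
import shutil
from pathlib import Path

# Hypothetical locations: a WAN-extended storage mount and local scratch space.
REMOTE_MOUNT = Path("/mnt/remote-site")
LOCAL_STAGING = Path("/mnt/local-scratch")

# Illustrative cutoff: below this size we assume the link can serve reads in
# place; above it we stage a local copy first. The number is arbitrary.
IN_PLACE_LIMIT_BYTES = 50 * 1024**3  # 50 GiB


def open_dataset(relative_path: str) -> Path:
    """Return a readable path, either remote-in-place or locally staged."""
    remote_file = REMOTE_MOUNT / relative_path
    if remote_file.stat().st_size <= IN_PLACE_LIMIT_BYTES:
        # One side of the coin: compute works directly against remote data.
        return remote_file
    # The other side: the dataset is too large, so move it to where the compute is.
    local_file = LOCAL_STAGING / relative_path
    local_file.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(remote_file, local_file)
    return local_file
```

Either way, the application opens the returned path the same as any local file, which is the point: the placement decision stays hidden from the workload.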

Scaled-out, on-demand compute is the very need that drove the rise of multicloud and hybrid cloud. In the supercloud, that idea carries over in a more platform-agnostic manner, as companies such as Vcinity operate across several cloud hyperscalers, including AWS and Azure, Davis added.

“How do I make all of that work together? What’s the value of the work? It’s insight from data,” he said. “For one part of a business process or another, we are able to connect those. We don’t care which cloud it’s in, whether that’s public, private, a colo, on-premise, doesn’t matter to us. Our whole notion as a technology is to connect the computer [and] the users to data wherever it sits.”

There are also cost benefits to this approach, especially as more companies pair burst compute with their cloud strategy rather than bear the often unsustainable fees of going all-in on the cloud, according to Davis.

“If you put it up there, you better be able to keep it there and only have to take out a little bit at a time as needed,” he noted. “But what we see is a trend where more and more are actually deploying storage on-prem but using burst compute to the cloud, and that could be any of the number of clouds.”
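
A minimal sketch of that burst-compute pattern, assuming the on-prem data is already reachable from the cloud instance over a WAN-extended mount: compute is spun up only for the duration of the job and torn down immediately afterward. The region, AMI ID and instance type below are placeholders for the example, not details from the interview.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder region

# Spin up compute only for the duration of the burst.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI with the workload baked in
    InstanceType="c5.4xlarge",        # placeholder compute-optimized size
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]

try:
    # ... run the job here against the on-prem data, e.g. over the remote mount ...
    pass
finally:
    # Tear the instance down as soon as the burst ends so no idle compute bills.
    ec2.terminate_instances(InstanceIds=[instance_id])
```

The storage itself never moves into the cloud; only the compute comes and goes, which is what keeps the approach out of the "keep it there" cost trap Davis cautions about.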

Software paired with hardware to assist scale-up efforts

Vcinity delivers its technology primarily as software through the AWS and Azure cloud marketplaces. It also offers hardware-assisted capabilities for scaling up, with specialized acceleration for data going into or coming out of those clouds, according to Davis.

“Fundamentally, it is software that’s available in, say, the AWS or Azure Marketplace,” he explained. “That allows for us to do software-only scale-out types of solutions. And then we’ve got actual hardware assist technologies to scale up. One of the benefits to that is that we’re also now seeing a movement in the cloud providers, especially AWS and Azure, with exposing FPGA-based instances so that you can actually have specialized compute.”
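
As a rough, hypothetical illustration of that split, the snippet below maps the two deployment modes to example instance families on AWS and Azure. FPGA-backed families such as AWS's F1 and Azure's NP-series are the kind of "specialized compute" instances Davis refers to; the specific sizes are examples only, and the mapping itself is an assumption made for illustration.

```python
# Example only: software-only scale-out lands on general-purpose instances,
# while hardware-assisted scale-up targets FPGA-backed instance families.
INSTANCE_CHOICES = {
    ("aws", "software_scale_out"): "c5.4xlarge",       # general-purpose compute
    ("aws", "hardware_scale_up"): "f1.2xlarge",        # FPGA-backed F1 family
    ("azure", "software_scale_out"): "Standard_D16s_v5",
    ("azure", "hardware_scale_up"): "Standard_NP10s",  # FPGA-backed NP-series
}


def pick_instance(cloud: str, mode: str) -> str:
    """Return an example instance size for the given cloud and deployment mode."""
    return INSTANCE_CHOICES[(cloud, mode)]


print(pick_instance("aws", "hardware_scale_up"))  # -> f1.2xlarge
```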

Another beneficial use case for Vcinity’s tech is artificial intelligence. For AI computations, large data streams often need to be channeled simultaneously — and that’s exactly where the platform excels, Davis added.

“What we see in the trend today is that a lot of those applications require that data be moved to a large compute farm,” he said. “To do that it’s a lot of data, and in a lot of cases, it’s in different locations. Imagine if you could, say, have data generated in five different spots that you want to run some AI workload against, but you could do data fusion in real time because we can connect that compute to each of those different remote locations.”
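
A minimal sketch of that data-fusion idea, with purely hypothetical mount points standing in for the remote sites: the compute reads each location's shard in parallel over its remote connection and combines them before the AI workload begins, rather than waiting for every dataset to be copied in first.

```python
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

# Hypothetical WAN-extended mounts, one per remote data location.
REMOTE_SITES = [Path(f"/mnt/site-{name}") for name in ("a", "b", "c", "d", "e")]


def load_shard(site: Path) -> bytes:
    """Read one site's shard of the input data over its remote connection."""
    return (site / "features.bin").read_bytes()


def fuse_datasets() -> bytes:
    # Pull from all locations in parallel so the slowest link, not the sum of
    # all transfers, bounds the wait before the AI workload can start.
    with ThreadPoolExecutor(max_workers=len(REMOTE_SITES)) as pool:
        shards = pool.map(load_shard, REMOTE_SITES)
    return b"".join(shards)  # fused input handed off to the model
```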

Here’s the complete video interview, part of SiliconANGLE’s and theCUBE’s coverage of the Supercloud 3: Security, AI and the Supercloud event:

Photo: SiliconANGLE
