“IT teams now have to adapt to the new pressures of all this data. They are being asked to manage billions of files, make that data available to applications running in their data center and in the public cloud, and to enable a daunting breadth of workflows and end-user requirements. They need a smarter approach to data management. Digital transformation isn’t possible without it,” says Ben Gitenstein, VP of Product, Qumulo, in an exclusive Hotseat interview with EnterpriseTalk.
ET Bureau: How can enterprises seamlessly store and manage data to scale their applications?
Ben Gitenstein: Enterprises need to radically simplify the complexity of their software tools and infrastructure for managing unstructured data. IT teams can no longer afford to build siloed infrastructure for each of their data needs.
It’s too complex to manage and too difficult to gain visibility into all of the data the organization has, much less leverage it for innovation, when that data is dispersed across user machines and multiple systems and managed by different tools.
There is a simpler approach. Forward-thinking IT leaders can overhaul their legacy data systems with a unified file data platform that simplifies the way they store and manage data. By consolidating the sprawling data environment into one seamless system, IT teams can eliminate data silos and reclaim administrative time wasted on never-ending troubleshooting and redundant manual configuration tasks.
Enterprises also need to be able to scale. When selecting a data management vendor, enterprises shouldn’t have to choose between present and future needs. With a scalable data solution that can be leveraged at the edge, in the data center, and in the cloud, enterprises can buy the performance and capacity they need today, with a simple way to scale as their workloads change and their datasets grow.
ET Bureau: How does smart data management play a critical role in a successful digital transformation journey?
Ben Gitenstein: Enterprises of all kinds — from research universities to animation studios to hospitals — are all on a digital transformation journey. The COVID-19 pandemic has only accelerated this shift to a digital-first future, in the transition to remote work, telehealth and virtual education.
As the world becomes increasingly digitized, enterprises are hurtling past the age of “Big Data” and entering what I call “Massive Data.” It has become the norm for enterprises to generate hundreds of terabytes, or even petabytes, of data every day from sources like cameras, medical research devices, particle accelerators, and high-performance computers.
IT teams now have to adapt to the new pressures of all this data. They are being asked to manage billions of files, make that data available to applications running in their data center and in the public cloud, and to enable a daunting breadth of workflows and end-user requirements. They need a smarter approach to data management. Digital transformation isn’t possible without it.
One example is in banking. Banks are undergoing a digital transformation as they shift to remote, paperless transactions, which will dramatically increase the amount of file data that banks have to manage. As more people process their financial transactions remotely in 2021, there will be a tsunami of files (images and documents) that go from physical to digital. This rapid digitization of traditional paper files will require a data infrastructure that can scale and keep up with the demand, and banks need to be ready for it.
ET Bureau: How can enterprises integrate platforms into their existing infrastructure without having to juggle storage pools or perform data migrations?
Ben Gitenstein: There are many industry-standard requirements that a data platform must adhere to. Leveraging industry-standard protocols, such as NFS and SMB, makes moving applications simple. Integrating automated encryption in-flight and at-rest makes it easy to meet compliance and regulatory standards. And having tools that migrate data into the platform with full data integrity keeps the move straightforward.
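As a hypothetical illustration of the "full data integrity" point (this is a generic pattern, not a description of any vendor's tooling): one common way to verify a migration is to build a checksum manifest of the source tree before the move and compare it against the destination afterward. All names here are illustrative.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream-hash a file in 1 MiB chunks so large files never load fully into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def manifest(root: Path) -> dict[str, str]:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    return {
        str(p.relative_to(root)): sha256_of(p)
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }

def verify_migration(source: Path, destination: Path) -> list[str]:
    """Return relative paths of source files that are missing or differ at the destination."""
    src, dst = manifest(source), manifest(destination)
    return sorted(rel for rel, digest in src.items() if dst.get(rel) != digest)
```

An empty list from `verify_migration` means every source file arrived intact; any entries name files to re-copy and re-check before cutting over.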
ET Bureau: Security has always been a concern among enterprises when utilizing massive amounts of data. What data management best practices will enable enterprises to keep their data secure without incurring additional cost?
Ben Gitenstein: Effective security goes back to simplicity. Most organizations are grappling with the complexity of meeting Chief Security Officer requirements, many of which are new, while staying within budget. One of the most urgent requirements from CSOs is to encrypt all of their data.
Traditionally, large-scale file data is encrypted using bespoke hardware, which usually costs extra and adds a new management tax. That will change as security teams look for simpler solutions built into the software. I believe that organizations will increasingly insist on solutions with built-in encryption and key management that are simple to deploy, easy to manage, and provide encryption both in-flight and at rest.
Ben Gitenstein runs Product at Qumulo. He and his team of product managers and data scientists have conducted nearly 1,000 interviews with storage users and analyzed millions of data points to understand customer needs and the direction of the storage market. Prior to working at Qumulo, Ben spent five years at Microsoft, where he split his time between Corporate Strategy and Product Planning.