Why virtualization is making enterprise data warehouses obsolete [Q&A]


For many years the database has been at the heart of enterprise IT. But the shift to the cloud has created major challenges, with migrations proving both slow and expensive.

A different approach is to use virtualization, allowing existing applications to run on any modern cloud platform without being rewritten or replaced. We spoke to Mike Waas, founder and CEO of Datometry, a SaaS database virtualization platform, to find out more.

BN: How does vendor lock-in of databases hurt the industry?


MW: Databases are a critical component of the modern IT stack. Because they have highly proprietary query languages, drivers, and tools, databases hold an incredibly strong lock on their customers: in order to switch databases, all applications need to be rewritten and reconfigured.

However, over time enterprises are being held back by legacy database systems. What once was a modern database has become a liability. Legacy systems cannot satisfy the business' ever-increasing needs and as a result, the competitive posture of the enterprise suffers. The older the legacy system, the more pronounced the impact on the business.

Besides affecting customers directly, vendor lock-in also lowers or even eliminates competition in the market. Over time, database vendors have become complacent. Once they are entrenched in a market segment, the need to innovate is reduced, much to the customers' detriment.

But it's not just customers who are hurting. Startups and innovators in this space are affected negatively as well. They need to overcome the incumbents' vendor lock-in to reach new customers and are up against an extremely high barrier to entry.

BN: What are the challenges IT leaders face when migrating databases?

MW: Database migration is the poster child of an 80/20 problem. In the beginning the project comes along nicely and success seems certain. However, as 80/20 problems would have it, the last 20 percent of the project is where 80 percent of the effort goes. For larger, mission-critical systems, this last 20 percent often translates to multiple years beyond the original schedule.

Besides running long, database migrations are also notorious for costs spiraling out of control. An often-overlooked component is 'hidden' costs. One source of such hidden costs, for example, is the required upgrades to dependent systems like ETL and BI platforms. These are merely preparation for the actual migration but may tax the organization significantly.

Ask any IT leader about their experience in this area, and chances are they have their own horror story of a database migration gone wrong. They are not alone. Gartner claims the majority of migrations ultimately fail. Some drag on for years, even a full decade, before finally collapsing under their own weight.

BN: What is database virtualization and how does it solve the migration challenge?

MW: Database System Virtualization, or DSV for short, is the concept of inserting an abstraction between the applications and the database so the database can be replaced without having to change the applications. SQL, APIs, and tools remain the same while IT replaces the underlying database.

DSV effectively intercepts the communication of the application with the database and translates and optimizes it for the new destination database. The translation overcomes discrepancies between the source and the destination system through emulation. This way DSV can reconcile differences like missing data types, or lack of certain functionality on the destination system.
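As a toy illustration of the kind of translation involved (not Datometry's actual implementation, which would require a full SQL parser rather than pattern matching), the sketch below rewrites two Teradata idioms into their Azure Synapse (T-SQL) equivalents: the `SEL` abbreviation for `SELECT`, and the `ADD_MONTHS` function, which T-SQL expresses as `DATEADD`.

```python
import re

# Toy rewrite rules mapping a couple of Teradata idioms to Azure
# Synapse (T-SQL) equivalents. A real virtualization layer would
# parse the full query; regexes here are only for illustration.
RULES = [
    # Teradata allows SEL as an abbreviation for SELECT.
    (re.compile(r"\bSEL\b", re.IGNORECASE), "SELECT"),
    # Teradata ADD_MONTHS(date, n) becomes T-SQL DATEADD(month, n, date).
    (re.compile(r"\bADD_MONTHS\(\s*([^,]+?)\s*,\s*([^)]+?)\s*\)",
                re.IGNORECASE),
     r"DATEADD(month, \2, \1)"),
]

def translate(query: str) -> str:
    """Rewrite a source-dialect query for the destination database."""
    for pattern, replacement in RULES:
        query = pattern.sub(replacement, query)
    return query
```

For example, `translate("SEL ADD_MONTHS(order_date, 3) FROM orders")` yields `"SELECT DATEADD(month, 3, order_date) FROM orders"`. In a real deployment this translation happens transparently on the wire, which is what lets the applications stay unchanged.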

A natural application scenario for DSV is the current global trend of enterprises moving from on-premises systems to cloud databases. With DSV, enterprises complete this move at a 10th of the cost and time of conventional migrations -- and without the risk.

One of our customers, a leading logistics and transportation firm, recently moved a highly complex enterprise data warehouse from an outmoded appliance to Azure Synapse. Probably the most powerful testament to the effectiveness of DSV is how seamless their migration was: their IT leadership cut over from the legacy system to the cloud data warehouse at the end of the implementation without letting their business users know.

BN: What audiences benefit the most from database virtualization?

MW: The concept of DSV is universal. Any organization that wants to protect its investment in application development needs to transition those applications at some point to a more modern database stack. With DSV they can do so without time-consuming, costly, and risky migration projects.

In the long run, we expect DSV to become about as ubiquitous as, say, server or network virtualization. Counter to common belief, there's no real benefit to writing applications directly for a given database. And vendor lock-in is a good reason not to overfit applications to any one system. Instead, by going all virtual, the enterprise becomes nimble and more efficient.

That said, innovators prioritize high-value segments as they break into the market. Datometry is no exception. We started with data warehouse appliances as source systems. Customers worldwide are currently looking to move the workloads from these systems to cloud databases. The success in this market segment validates both our technical and our business approach.

At the same time, we constantly solicit feedback from prospects and customers about where they see the greatest need for the next wave of source systems, so we can add support to our platform.

BN: Why is now the time for database system virtualization?

MW: Several factors make for what is best described as a 'perfect storm'. First, adoption of public cloud has created an urgency around migrating databases. Within the next decade, a large portion of the existing database market will have to undergo migration and modernization. Hence, enterprises are eager to migrate their databases.

Second, databases are at the heart of the cloud wars. The cloud provider who can capture an enterprise’s data estate stands to gain the entire account over time as applications gravitate toward the data. Hence, the Hyperscalers are eager to migrate their customers’ databases.

Third, cloud databases have reached a level of maturity that makes them viable alternatives to legacy systems. They have come a long way over the past decade and provide comparable functionality, performance, and scalability. Systems like Azure Synapse and Google BigQuery are now on a par with Teradata or Exadata appliances.

This means extremely strong market demand is aligning with the Hyperscalers' offerings, which can technically stand up to legacy technology. Add to that the enormous uncertainty in the markets, and it becomes clear why enterprises are looking for a low-risk solution like DSV to enter the next phase of their cloud deployments.

Image credit: gustavofrazao/depositphotos.com

© 1998-2022 BetaNews, Inc. All Rights Reserved.