Three Reasons Fast Databases Will Never Be Fast Enough

Mathias Golombek
January 10, 2019

If you look at the data management market (e.g. the Data Platforms Map from the 451 Group), you are likely to be overwhelmed by the diversity of today’s technologies.

Many decision makers find this broad range of choices too complex to evaluate. As a consequence, they stick with the ‘same old, same old’ of the big brands such as Oracle, Microsoft or IBM and reassure themselves that these vendors will adopt disruptive technologies eventually, even if it takes a little longer: in the long run they will probably integrate them, or simply acquire suitable smaller market players. Are you a decision maker who thinks that way? Read on to find out the pitfalls of this strategy.

I probably do not need to emphasize how important data management has become in recent years, nor that its strategic role will only increase in the future. Data-driven automated decisions are helping organizations across the globe, and artificial intelligence algorithms are revolutionizing the way we think about complicated problems. Magazines are full of exciting use cases that improve our lives and change the market rules for nearly every industry. So let’s take it as a given that collecting, processing and analyzing data is one of the crucial factors for a company’s competitiveness.

Coming back to the large vendors: they tell their customer base that their broad software stacks ensure compatibility, that acquired technologies have been or will be deeply integrated into their huge platforms, and that buying everything from one vendor has lots of advantages. The majority of experienced decision makers seem tired of and disappointed by these promises. The new generation, the millennials, do not even consider the big players as feasible software suppliers. They prefer best-of-breed solutions that solve their challenges in an optimal way. They won’t believe the marketing until they have tested the software. And they want dedicated service providers who are approachable and deliver first-class support and enablement.

Don’t fall for the fallacy that increasing hardware power will automatically fix your current performance limitations, or that ordinary solutions are ‘fast enough’ just because they can cope with your current needs. Here are the three most important reasons why this is a dangerous calculation, and why fast databases will never be fast enough:

1) First Never Follows

If you want to be number one in your market, you should follow that ad slogan from our client Adidas. Settling for second-best puts you at a competitive disadvantage against those who choose the best. Leveraging data has become one of the most important competitive advantages, which means you cannot be satisfied with a fast database: you will need the fastest. Remember the complex landscape from above; there are different ‘fastest’ solutions for different use cases. But don’t believe the vendors (not even my own company): test the different options in a thorough proof of concept.
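A proof of concept does not have to start out elaborate. The sketch below is a minimal timing harness, assuming a hypothetical orders table and a single representative analytical query; sqlite3 is used here purely as a stand-in so the example runs on its own, and in a real evaluation you would point the same harness at each candidate engine’s driver.

```python
import sqlite3
import statistics
import time

# Hypothetical PoC harness: time one representative analytical query several
# times against a candidate engine. sqlite3 is only a stand-in for whichever
# databases you are actually evaluating -- swap in the real drivers.

QUERY = "SELECT category, COUNT(*), AVG(amount) FROM orders GROUP BY category"

def benchmark(connection, query: str, runs: int = 5) -> dict:
    """Run the query several times and report median and worst latency in seconds."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        connection.execute(query).fetchall()
        timings.append(time.perf_counter() - start)
    return {"median_s": statistics.median(timings), "worst_s": max(timings)}

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (category TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?)",
        [("cat%d" % (i % 10), float(i)) for i in range(100_000)],
    )
    print(benchmark(conn, QUERY))
```

Reporting the worst case alongside the median matters: a single slow outlier is often exactly what an interactive dashboard or a weekly reporting job will hit.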

2) Data will grow over your head

Perhaps you can predict the data growth of your current data sources. That may hold true today, but there will probably be changes in your company that lead to a data explosion. IoT products, sensor logs, social media data or fine-grained supply chain monitoring from your suppliers to your customers: these are all reasons why data growth will never be linear again.

In the past few years, it has become feasible to process more and more data (e.g. via NoSQL solutions, data grids and streaming systems) and to store it cost-effectively (especially thanks to Hadoop). But in many cases, that data can no longer be analyzed appropriately.

More data means more to analyze, so even if you are happy with your database performance today, you might not be tomorrow.
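To make the point concrete, here is a back-of-the-envelope sketch. The 40% compound growth rate and the linear hardware gain are purely illustrative assumptions of mine, not figures from this article; the shape of the gap is what matters.

```python
# Illustrative assumption: data volume compounds yearly while the capacity
# gained from routine hardware upgrades only grows linearly.

initial_tb = 10.0          # assumed starting data volume in terabytes
growth_rate = 0.40         # assumed 40% compound annual data growth
hw_gain_tb_per_year = 4.0  # assumed linear gain in analyzable TB per year

for year in range(6):
    data = initial_tb * (1 + growth_rate) ** year
    capacity = initial_tb + hw_gain_tb_per_year * year
    print(f"year {year}: data {data:6.1f} TB vs. analyzable capacity {capacity:6.1f} TB")
```

With these assumed numbers the data volume grows more than fivefold in five years while the linearly upgraded capacity merely triples, which is why ‘fast enough today’ is a shaky baseline.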

3) Limitations and constraints stand in the way of your success

Have you ever been in a situation where your existing database held back your imagination? For instance, where you would not even think of analyzing your entire customer data from the past 10 years because you are used to being limited to the past 3 months?

One of my close friends works for a company with a special committee that meets once a week to decide whether new analyses may be run on their existing Oracle database. And a DBA at one of our clients told us that if he had thrown a certain query against their production Teradata database, he would probably have been fired immediately. Can you imagine how this constrains innovation?

Many questions go unanswered in analytics teams, many applications are never implemented because of such limitations, and I assume that plenty of business models have never been created as a consequence.

Summary

Ultra-fast data analysis at the speed of thought leads to new questions, new analyses, new applications and eventually more success. Your requirements will change tremendously tomorrow. Accessible data volumes will explode. And your competitors will not hesitate to seize that opportunity. So don’t take the easy decision of sticking with your existing vendors. Don’t hesitate to invest in market research and testing projects. Make sure you are using state-of-the-art technology, and prepare yourself for the future.

Originally appeared on Exasol
