• Big Data
  • Experfy Editor
  • MAR 26, 2014

Avoiding Failure in Big Data Adoption

According to a Gartner report on big data adoption in 2013, 64 percent of organizations have already invested in or plan to invest in Big Data technology, with 34 percent planning to invest within the next year or two. In the Gartner study, information and business leaders most often associate the term “opportunity” with “big data.” This positive perception undoubtedly translates into increased investment in and adoption of big data technology. While there is general consensus that engagement with big data is necessary for enterprises to remain competitive, few people have considered the pitfalls of rushing into the big data madness without a long-term perspective. Jim Kaskade, CEO of Infochimps, offers five maladaptive practices that businesses must avoid in their organization’s big data adoption.

Do it all at once. It’s an evolution, not a revolution. Any solution designed to solve all your problems will inevitably yield a series of disappointments. Instead, identify a very specific business challenge to address with Big Data, solve it, then expand and iterate your program step by step.

Do it all yourself. Implementing Big Data solutions will take outside expertise as the infrastructure is too big and complicated to build in-house. Combining streaming, batch, near real-time and real-time data sources is a major obstacle for most IT departments. Sidestep the heavy lifting and quickly gain the right insights by partnering with a vendor that understands Big Data.

Bring your data to the apps. Enterprise data is too big and too sensitive to haul to the public cloud where apps are being built. Instead, bring the apps to the data – build apps in virtual private clouds that reside in Tier 4 data centers, eliminating expensive, risky migrations.

House your own data. Storing 10 terabytes on legacy infrastructure costs $1 million or more, and the data warehouse of any major company will exceed 20 terabytes. That’s expensive. Instead of housing your own data, partner with an expert who can guide you through a hybrid deployment strategy that leverages a public, virtual private or private cloud.

Rest all hopes on Hadoop. Hadoop, which performs historical batch processing, gets 80% of the Big Data attention, but it’s only 20% of the solution. For a truly customer-centric view, tie together historical, real-time and near real-time data.

