Data is clearly not what it used to be! Organizations of all types are finding new uses for data as part of their digital transformations. New data is transactional and unstructured, publicly available and privately collected, and its value is derived from the ability to aggregate and analyze it. We can divide this new data into two categories: big data and fast data. The big data–fast data paradigm is driving a completely new architecture for data centers. I will cover each of the top five data challenges presented by these new data center architectures.
As smaller organizations move to the public cloud, the remaining private data centers are getting much larger. A big driver for this scale is data, which is leading to a completely new set of storage architectures that can operate at large scale while requiring very little management of the data. A new class of storage vendor has emerged whose solutions accomplish this goal through a combination of 1) software-defined storage, 2) commodity building-block hardware componentry, 3) distributed, scalable storage architectures, and 4) application awareness. Let’s look at each of these solution characteristics and how they make large-scale data center operations cost effective.
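To make the idea of a distributed, scalable storage architecture concrete, here is a minimal sketch of consistent hashing, one common technique such systems use to spread data across commodity nodes so that adding or removing a node remaps only a small fraction of the objects. This is an illustrative, hypothetical example (the class name `ConsistentHashRing` and its parameters are my own), not any particular vendor's implementation.

```python
import hashlib
from bisect import bisect_right


class ConsistentHashRing:
    """Illustrative consistent-hash ring mapping object keys to storage nodes.

    Each physical node is placed on the ring many times ("virtual nodes")
    so that data spreads evenly across commodity hardware.
    """

    def __init__(self, nodes, vnodes=100):
        # Sorted list of (hash, node) pairs forming the ring.
        self._ring = sorted(
            (self._hash(f"{node}#{i}"), node)
            for node in nodes
            for i in range(vnodes)
        )
        self._hashes = [h for h, _ in self._ring]

    @staticmethod
    def _hash(key):
        # Stable hash so placement is deterministic across restarts.
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def node_for(self, key):
        """Return the node responsible for storing the given object key."""
        idx = bisect_right(self._hashes, self._hash(key)) % len(self._ring)
        return self._ring[idx][1]


ring = ConsistentHashRing(["node-a", "node-b", "node-c"])
placement = {key: ring.node_for(key) for key in ("obj1", "obj2", "obj3")}
```

Because placement is computed from the key rather than looked up in a central table, the cluster can scale out simply by adding nodes, with no per-object metadata to manage by hand.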
Data is one of the most important assets that any company has. Today, there are new and changing uses of data in the digital economy. The big questions, however, are: who is winning with data, where is this data being kept, what makes new data different, when should data be kept, moved, deleted, or transformed, how should data be valued, and why is data so much more important than it used to be? Once we accept the premise that data value should be measured, what would we do with that measure?