Feature Article: January 2016

Here's a New Year's Resolution: Master Your Database

About the Author:
Roger Levy brings extensive international engineering and business leadership experience to his role as VP, Products, at MariaDB. He has a proven track record of growing businesses, transforming organizations, and fostering innovation in data networking, security, enterprise software, cloud computing, and mobile communications, delivering on-time, high-quality, and cost-effective products and services. Previous roles include VP and GM of HP Public Cloud at Hewlett-Packard and SVP of Products at Engine Yard; he also founded R.P. Levy Consulting LLC.
While for most people "weight loss" or "learning a new language" make solid New Year's resolutions, CIOs have very different goals in mind. How best to store and manage data is on the minds of many CIOs as they kick off the New Year, perhaps leading to a resolution to corral their data in more secure and efficient ways. It's time to bring databases, which underlie nearly every application and enterprise on the planet, back into the spotlight. Here's what organizations can anticipate this year.
Multi-model databases get popular
Every minute we send over 200 million emails and over 300,000 tweets. By 2013, 90% of the world's data had been created in just the previous two years. There's no doubt that the variety, velocity, and volume of data are exploding. Compounding this explosion is the growing range of formats that organizations are collecting, storing, and processing. CIOs will need new tools to manage this flood of information, and multi-model databases, which can store and query structured rows and schemaless documents within a single engine, are poised to be one of them.
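To make the multi-model idea concrete, here is a toy sketch, not any particular multi-model product, showing one store holding both relational rows and schemaless JSON documents and querying them side by side. It uses Python's built-in sqlite3 purely for illustration; the table and field names are made up.

```python
import json
import sqlite3

# One store holding both relational rows and schemaless JSON documents.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE events (customer_id INTEGER, doc TEXT)")

conn.execute("INSERT INTO customers VALUES (1, 'Acme Corp')")
# Each document can carry whatever fields that event happens to have.
conn.execute("INSERT INTO events VALUES (1, ?)",
             (json.dumps({"type": "login", "device": "mobile"}),))
conn.execute("INSERT INTO events VALUES (1, ?)",
             (json.dumps({"type": "purchase", "amount": 42.50}),))

# Join the structured and semi-structured sides in a single query.
rows = conn.execute(
    "SELECT c.name, e.doc FROM customers c "
    "JOIN events e ON c.id = e.customer_id"
).fetchall()
for name, doc in rows:
    event = json.loads(doc)  # documents are parsed on read
    print(name, event["type"])
```

Real multi-model engines push the document handling into the database itself, so the join and the JSON field access happen in one query language rather than in application code.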
Multi-layered database security
2015 saw every type of organization, from global retailers to the Catholic Church, experience financial losses and reputation damage from data breaches. Security has long been a concern of CIOs, but the growing frequency of high-profile attacks and new regulations make data protection a critical 2016 priority for businesses, governments, and non-profit organizations.
Organizations can no longer rely on only firewalls and antivirus software to protect their data. Amid a myriad of threats, a robust security regimen will require the following: network access controls, firewalls, disk-level encryption, identity management, anti-phishing education, and more. The most sophisticated and dangerous hackers are after the contents of an enterprise's database, so securing the database itself must be a core component of every organization's IT strategy.
In 2016 we will see a redoubling of focus on data security. Enterprises will use database technology with native encryption to protect data at rest in the database, and SSL/TLS encryption to protect data in transit between applications and the database. They will also control access to the database with stronger password validation and a variety of access authorization levels based on a user's role. And organizations will continue to hold themselves accountable via regular audits and testing, making sure that they have best practices in place and that their personnel take those practices seriously.
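The role-based authorization idea above can be sketched in a few lines. This is a conceptual illustration only; the role names and permission sets here are invented for the example and do not come from MariaDB or any other product.

```python
# Minimal sketch of role-based access authorization.
# Role names and permission sets are illustrative, not from any product.
ROLE_PERMISSIONS = {
    "analyst": {"SELECT"},
    "developer": {"SELECT", "INSERT", "UPDATE"},
    "dba": {"SELECT", "INSERT", "UPDATE", "DELETE", "GRANT"},
}

def is_allowed(role: str, action: str) -> bool:
    """Return True only if the user's role grants the requested action."""
    return action in ROLE_PERMISSIONS.get(role, set())

# An analyst can read data but cannot modify or delete it.
print(is_allowed("analyst", "SELECT"))  # True
print(is_allowed("analyst", "DELETE"))  # False
```

In a real database the same policy is expressed declaratively, for example with GRANT statements and roles, so that the engine, not the application, enforces who may do what.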
More enterprises turn to hybrid cloud
With the recent revenue announcements by public cloud providers such as Amazon AWS and Microsoft Azure, it is clear that adoption of public cloud services is becoming mainstream. However, public cloud offerings may never fully replace on-premise data storage. Many organizations will not move all their tech workloads to the public cloud for economic and security reasons.
Managing latency and data security continue to be challenges facing IT organizations. Many organizations have real-time applications that cannot afford even the slightest delay when it comes to pulling data from the cloud vs. having it stored on-premise. Others have concerns about their ability to control the infrastructure where their data is stored. In addition, as more countries put data privacy laws into effect, databases will need to be in different cloud and on-premise deployments depending on the region. Enterprises continue to explore when and how to leverage the public cloud and when it makes more sense to maintain data in their own data centers or to use a hybrid cloud approach.
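The region-dependent deployment point can be illustrated with a small routing sketch. The region names and deployment labels below are hypothetical; the idea is simply that data subject to local privacy laws is directed to a compliant deployment, while other data may go to the public cloud.

```python
# Sketch: route a record to a database deployment based on the user's
# region, so data covered by local privacy laws stays in a compliant
# location. Region names and deployment labels are hypothetical.
REGION_DEPLOYMENTS = {
    "eu": "on-premise-frankfurt",    # privacy laws keep data in-region
    "us": "public-cloud-us-east",
    "apac": "public-cloud-singapore",
}
DEFAULT_DEPLOYMENT = "on-premise-hq"  # conservative fallback

def deployment_for(region: str) -> str:
    """Map a user's region to the deployment allowed to hold their data."""
    return REGION_DEPLOYMENTS.get(region, DEFAULT_DEPLOYMENT)

print(deployment_for("eu"))       # on-premise-frankfurt
print(deployment_for("unknown"))  # on-premise-hq
```

A hybrid cloud architecture makes this kind of policy practical: the same database technology runs in both the on-premise and public cloud deployments, and only the routing rule differs per region.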
In 2016, expect to see a greater focus placed on creating solutions that improve data migration, security and efficiencies utilizing hybrid cloud architectures.
Easier, more cost-effective machine learning
With the rapid growth in the type and volume of data being created and collected comes the opportunity to mine that data for valuable information and insights into organizations and their customers. Just as Facebook and Google have built their empires on this knowledge, other companies are following suit.
More and more companies are hiring specialist "data scientists" to introduce and implement machine learning technologies. But the number of experts in this field simply isn't growing fast enough, and this scarcity makes hiring a data scientist cost-prohibitive for any company that isn't in the Fortune 500. In fact, according to McKinsey & Company, the US alone faces a shortage of 140,000 to 190,000 people with analytical expertise, plus 1.5 million managers and analysts with the skills to understand and make decisions based on the analysis of big data. In response, organizations are turning to machine learning tools that enable all of their employees to derive insights without relying on specialists. Just as crucial as collecting data is understanding what lies in a company's database and how it can be turned into valuable insight.
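The kind of task these tools automate can be shown with a deliberately tiny example: "learning" a spending threshold that separates churned from retained customers, then using it to score new ones. The data and the one-feature decision rule below are made up for illustration; real services fit far richer models, but the workflow, historical data in, predictive rule out, is the same.

```python
# Toy illustration of the task ML services automate: learn a rule from
# labeled historical data, then apply it to new cases. Data is made up.
def learn_threshold(samples):
    """samples: list of (monthly_spend, churned) pairs.
    Returns the midpoint between the highest-spending churner and the
    lowest-spending retained customer (a one-feature decision stump)."""
    churn_spend = [spend for spend, churned in samples if churned]
    stay_spend = [spend for spend, churned in samples if not churned]
    return (max(churn_spend) + min(stay_spend)) / 2

history = [(10, True), (15, True), (20, True), (60, False), (80, False)]
threshold = learn_threshold(history)

def likely_to_churn(spend):
    """Flag a customer whose spend falls below the learned threshold."""
    return spend < threshold

print(threshold)            # 40.0
print(likely_to_churn(25))  # True
```

Hosted services wrap this loop, train on history, predict on new data, behind a point-and-click or API interface, which is exactly what lets non-specialists use them.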
Recently the major public cloud vendors have introduced a variety of machine learning services. These include offerings such as Azure ML Studio from Microsoft, the Google Prediction API, Amazon Machine Learning, and IBM's Watson Analytics.
We can expect that 2016 will be a year when additional solutions appear, mature, and are recognized as a critical, possibly required, piece of enterprise IT operations. The growth of machine learning will place new demands on the databases that store and manage the data "fuel" for such applications. In 2016, look for a focus on database capabilities that facilitate real-time analytical processing of large data sets.
How can a CIO make this all happen?
With the recent rise of the Chief Data Officer, the widespread adoption of new database technologies, and the acute need for better IT security, CIOs can master their data. The database is back in the spotlight: one of the most foundational technologies is once again one of the hottest. A CIO's best bet for staying on top of these trends in 2016 is the same strategy as in years past: lay down clear policies for who can access data and how it is used, while keeping up with new technology innovations and new threats targeting the integrity of the company's data.
By Roger Levy
VP of Product at MariaDB