Accelerate Time to Value of Your Data with DataOps


Over the last several years, innovative companies have adopted a DevOps paradigm that reduces the barriers between developers and operations teams, enabling them to deliver high-quality software faster. This has allowed leaders like Uber, Amazon, and Lyft to experiment with better customer experiences and business ideas faster than established enterprises. A similar paradigm, DataOps, promises to bring the same kinds of benefits to operationalizing data.

The fundamental challenge is that businesses are capturing larger quantities of data, but face various management, quality, and security challenges in leveraging it to create new value. This new business value includes building faster apps, reducing costs, updating apps more quickly, making sense of IoT data, leveraging AI, and complying with new data regulations like GDPR. Doing this efficiently requires improving the data management lifecycle in a manner similar to the software development lifecycle (SDLC) of DevOps.

A DataOps strategy requires different tools and skill sets that complement the work done by the developers, testers, and operations experts involved in DevOps. These help automate data processes outside the traditional SDLC, such as database configuration, batch jobs, machine learning models, cloud data integration, and legacy data integration.

Data scientists help identify new algorithms for making sense of existing data and ways to enhance it with third-party data. Data quality engineers find ways to automate the identification and cleansing of inaccurate data. Database managers identify the best data stores and configurations to speed data collection and access. The EU's new GDPR regulations call for the creation of a Data Protection Officer to ensure appropriate data management and usage.

Databases tend to pose major bottlenecks for DevOps teams and processes due to their complexity and common data movement challenges. The DBA will play an integral role in alleviating these challenges and in how businesses enable DevOps-driven digital transformation. DataOps also involves streamlining the preparation of data so developers can leverage it during application development. Another component of DataOps involves improving how data is deployed across the most efficient or cost-effective infrastructure.

Examples of DataOps processes include:

  • Automate the configuration and scaling of database apps required to grow and shrink with seasonal app usage such as holiday shopping rushes.
  • Identify whether relational databases, NoSQL databases, or stream processing systems best fit a particular app, and configure the appropriate data management infrastructure without breaking existing apps.
  • Inventory an enterprise’s data stores in order to consolidate them, remove unused data, and cost-efficiently tier data for fast access and long-term archiving.
  • Automate the collection and formatting of data from servers, IoT devices, and infrastructure to improve machine learning models for applications like predictive sales, infrastructure automation, and predictive maintenance.
  • Develop a data privacy management architecture for meeting the EU’s GDPR requirements that threaten stiff fines for non-compliance. This includes being able to automatically locate all instances of a given user’s data and quickly delete some or all of this upon request.
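To make the last point concrete, a GDPR-style "right to erasure" workflow can be sketched as a routine that scans an inventory of data stores for a given user's records and deletes them on request. This is a minimal illustration, not a production implementation: the in-memory stores and function names are hypothetical stand-ins for real databases and their query layers.

```python
# Hypothetical sketch of GDPR locate-and-erase across multiple data stores.
# Each "store" is modeled as a list of record dicts keyed by "user_id";
# a real system would query actual databases instead.

def locate_user_records(stores, user_id):
    """Return {store_name: [record, ...]} for every record tied to user_id."""
    found = {}
    for name, records in stores.items():
        matches = [r for r in records if r.get("user_id") == user_id]
        if matches:
            found[name] = matches
    return found

def erase_user_records(stores, user_id):
    """Delete all records for user_id; return the count removed per store."""
    removed = {}
    for name, records in stores.items():
        before = len(records)
        records[:] = [r for r in records if r.get("user_id") != user_id]
        if before > len(records):
            removed[name] = before - len(records)
    return removed

# Example usage with illustrative data
stores = {
    "orders": [{"user_id": "u1", "item": "book"},
               {"user_id": "u2", "item": "pen"}],
    "profiles": [{"user_id": "u1", "email": "u1@example.com"}],
}
located = locate_user_records(stores, "u1")   # records found in both stores
deleted = erase_user_records(stores, "u1")    # per-store deletion counts
```

The key design point is the first step: an erasure request can only be honored if the organization maintains a complete inventory of where user data lives, which is why the article's earlier point about inventorying data stores is a prerequisite for GDPR compliance.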
George Lawton
glawton@gmail.com

George Lawton is a journalist based in San Francisco, Calif. Over the last 15 years he has written over 2,000 stories for publications about computers, communications, knowledge management, business, health, and other areas that interest him.
