DevOps is one of the most widely adopted work methodologies in large enterprises. It is still in its infancy, though, and many more developments can be expected in the near future.
AI for DevOps
Artificial intelligence is permeating all elements of IT – why would DevOps be left behind? With AI technologies, DevOps will become smarter: it will be able to predict the effects and risks of deployments, identify process bottlenecks, and find shortcuts through automation. Developments in robotic process automation (RPA) enable the optimization of handoff and automation points in the pipeline. AI-based predictive analytics will let organizations plan operational capacity more efficiently and predict faults before deployments.
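As a minimal sketch of the fault-prediction idea, the snippet below estimates per-service deployment failure rates from historical outcomes. The data shape and function name are illustrative assumptions; a real AI-driven pipeline would consume far richer telemetry and a trained model rather than simple frequencies.

```python
from collections import defaultdict

def risk_by_service(history):
    """Estimate per-service deployment failure rates from past outcomes.

    `history` is a list of (service, succeeded) tuples -- a stand-in for
    the richer telemetry an AI-driven pipeline would actually consume.
    """
    totals = defaultdict(int)
    failures = defaultdict(int)
    for service, succeeded in history:
        totals[service] += 1
        if not succeeded:
            failures[service] += 1
    return {s: failures[s] / totals[s] for s in totals}

history = [
    ("billing", True), ("billing", False), ("billing", True), ("billing", True),
    ("search", True), ("search", True),
]
print(risk_by_service(history))  # billing fails 1 deployment in 4
```

A pipeline could gate risky releases (for example, require a manual approval when the estimated risk crosses a threshold) based on such a score.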
DevOps for AI
Recently, a new class of tools, collectively referred to as MLOps, has begun allowing developers to deploy new AI applications quickly and repeatably. For now they are mainly a development aid and have not been closely integrated with the Ops side of the enterprise, but they are expected to cover Ops aspects soon. DevOps pipelines in the near future will enable AI models that have 'learned' from historical data to be moved from development into production.
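The promotion step can be sketched as a toy model registry: a trained artifact is stored under a content hash and tagged with its stage. The registry layout and function name are assumptions for illustration; real MLOps tools add versioning, approvals, and rollout hooks on top of this basic idea.

```python
import hashlib
import json
import pathlib
import tempfile

def promote_model(model_bytes, registry_dir, stage):
    """Store a serialized model under its content hash and tag its stage.

    A toy stand-in for an MLOps model registry: the hash makes the
    artifact immutable, and the metadata records where it is deployed.
    """
    digest = hashlib.sha256(model_bytes).hexdigest()
    registry = pathlib.Path(registry_dir)
    registry.mkdir(parents=True, exist_ok=True)
    (registry / f"{digest}.bin").write_bytes(model_bytes)
    meta = {"sha256": digest, "stage": stage}
    (registry / f"{digest}.json").write_text(json.dumps(meta))
    return meta

with tempfile.TemporaryDirectory() as d:
    meta = promote_model(b"trained-weights", d, stage="production")
    print(meta["stage"])  # production
```

Promoting the same bytes from 'staging' to 'production' would only update the metadata, never the artifact itself, which is what makes the move auditable.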
DataOps
As of now, new databases, indexes, triggers, stored procedures and similar artifacts are deployed manually by DBAs outside the DevOps pipeline, and their operational impact is estimated through capacity planning that is largely guesswork. This gap is expected to be bridged soon by products that let databases and their related artifacts be deployed through the pipeline alongside the applications they are coupled with, covering conventional relational databases, unstructured databases, and object stores. Apps and their supporting data stores will then be created, tested, and deployed through the same automated pipeline as apps that need no special database handholding.
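Running schema changes through the pipeline typically means versioned, idempotent migrations. The sketch below shows the pattern with SQLite and hypothetical migration names; tools such as Flyway or Liquibase automate the same bookkeeping for production databases.

```python
import sqlite3

# Hypothetical migrations, applied in order; each runs exactly once.
MIGRATIONS = [
    ("001_create_orders", "CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)"),
    ("002_add_index", "CREATE INDEX idx_orders_total ON orders (total)"),
]

def apply_migrations(conn):
    """Apply pending schema migrations and record each one, so the step
    can run on every deployment without re-applying old changes."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_history (name TEXT PRIMARY KEY)")
    applied = {row[0] for row in conn.execute("SELECT name FROM schema_history")}
    for name, ddl in MIGRATIONS:
        if name not in applied:
            conn.execute(ddl)
            conn.execute("INSERT INTO schema_history (name) VALUES (?)", (name,))
    conn.commit()

conn = sqlite3.connect(":memory:")
apply_migrations(conn)
apply_migrations(conn)  # re-running is a no-op
print([r[0] for r in conn.execute("SELECT name FROM schema_history ORDER BY name")])
```

Because the step is idempotent, the database deploys with the app on every pipeline run, removing the manual DBA handoff described above.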
Analytics and visualizations for newly deployed apps are usually brought online manually by data scientists after the fact. This is expected to change: data analytics will be introduced into DevOps pipelines as soon as the release requirements are identified, again doing away with special manual promotion. DataOps will also bring data governance, data-loss prevention, data lifecycle management, and copy-data-management features that need to be integrated into every new deployment.
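One concrete form such a DataOps pipeline step could take is a data-quality gate that fails the stage when records violate the release requirements. The field names and gate logic below are illustrative assumptions, not a specific product's API.

```python
def quality_gate(rows, required_fields):
    """Collect (row index, field) pairs for every missing or empty
    required value -- a minimal stand-in for a DataOps validation step.
    An empty result means the data is fit for promotion."""
    problems = []
    for i, row in enumerate(rows):
        for field in required_fields:
            if not row.get(field):
                problems.append((i, field))
    return problems

rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},
]
print(quality_gate(rows, ["id", "email"]))  # [(1, 'email')]
```

In a pipeline, a non-empty result would block promotion, just as a failing unit test blocks an app build.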
Blockchain for DevOps
DevOps evolutions in the near future will drastically increase the volume, variety and velocity of the pipeline's traffic, making auditing, compliance verification, and management difficult. A distributed-ledger technology like blockchain may come in handy here: it gives developers a secure, shared, permissioned ledger that enables the recording, tracking, authentication and auditing of pipeline transactions. It also makes digital contracts possible at every stage; these can be verified at runtime to trigger artifact promotion, and audited after the fact to ensure compliance with rules.
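The tamper-evidence that makes such a ledger auditable can be sketched with a simple hash chain: each pipeline event commits to the hash of its predecessor, so altering any historical record invalidates every later hash. This single-node sketch illustrates only that property; a real permissioned ledger adds distribution, consensus, and access control.

```python
import hashlib
import json

def append_block(chain, payload):
    """Append a pipeline event; each block's hash covers the previous
    block's hash, linking the whole history together."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"payload": payload, "prev": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    chain.append(block)
    return block

def verify(chain):
    """Recompute every hash and link; any tampering breaks the chain."""
    prev = "0" * 64
    for block in chain:
        body = {"payload": block["payload"], "prev": block["prev"]}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != expected:
            return False
        prev = block["hash"]
    return True

chain = []
append_block(chain, {"stage": "build", "artifact": "app-1.0"})
append_block(chain, {"stage": "deploy", "artifact": "app-1.0"})
print(verify(chain))  # True
chain[0]["payload"]["stage"] = "tampered"
print(verify(chain))  # False
```

An after-the-fact compliance audit is then just a re-verification of the chain plus inspection of the recorded payloads.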