AI: The Logical Next Frontier for Accunomics…
It was early 2000 when we first pioneered the use of invoice-level data to build profitability models for our Integrated Oil clients. The original idea was pretty simple: segment customers and products into categories A, B, and C so that clients could focus their attention on the elements of their supply chain and product offering that drove the most profit. What ensued was a little unexpected. By following the chain of profitability, we enabled our clients to restructure their commercial models and exit non-value-adding assets, regions, products, contracts, and organization structures, thereby driving financial benefits measured in hundreds of millions of dollars of profit improvement.

Soon the power of our modeling shifted to the growth side of the equation. Enhanced by technical improvements using business intelligence tools like PowerPivot, coupled with our hallmark Management Operating System, we were able to provide customer insight like never before. This led to improved customer service, a laser focus on the customers with the best potential for profitable growth, and strengthened pricing performance, all while reducing the complexity of managing the business.

This brings me to what's next in the logical approach of managing data to further reduce business complexity. Our latest work includes a very creative application of Artificial Intelligence (AI), focused on its predictive uses: leveraging the information contained within large historical datasets to determine expected outputs for unseen examples. This specific use of AI is called Machine Learning (ML). ML can be defined as an approach to achieving artificial intelligence through systems that learn from experience to find patterns in a set of data. ML is all about prediction. It is intelligent because the ML system itself produces a predictive mathematical model through a training process that relies on experimentation and feedback with data (real-world examples), much like how a human learns to ride a bike, take the SAT, or manage a business. It learns patterns from the data provided and uses them to formulate predicted outcomes.
Elegantly, as new data is produced through the passage of time, the ML model refines its patterns based on what the new data is telling it, resulting in tighter predictions. So instead of being static, we can build and employ predictive models that actually improve as they age!
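To make that idea concrete, here is a minimal toy sketch of an "online" predictor: a one-feature linear model that nudges its parameters each time a new observation arrives, so its predictions tighten as data accumulates. This is purely illustrative and is not the modeling approach Accunomics uses; the feature/outcome pairs are made up.

```python
class OnlineLinearModel:
    """Toy one-feature linear model updated incrementally by
    stochastic gradient descent: it learns from each new example."""

    def __init__(self, lr=0.05):
        self.w = 0.0   # slope, starts with no knowledge
        self.b = 0.0   # intercept
        self.lr = lr   # learning rate (step size for each update)

    def predict(self, x):
        return self.w * x + self.b

    def update(self, x, y):
        """Nudge the parameters toward a newly observed (x, y) pair."""
        error = self.predict(x) - y
        self.w -= self.lr * error * x
        self.b -= self.lr * error

# Feed the model a stream of observations over "time"; the underlying
# (hypothetical) relationship here is outcome = 2 * feature + 1.
model = OnlineLinearModel()
for step in range(2000):
    x = (step % 10) / 10.0   # feature value, scaled to [0, 1)
    y = 2.0 * x + 1.0        # observed outcome
    model.update(x, y)

# After enough data, predictions should approach 2 * 0.5 + 1 = 2.0
print(round(model.predict(0.5), 2))
```

The point of the sketch is the `update` method: the model is never "finished" being trained, so each new month of data refines it rather than leaving it static.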
Okay, now on to Deep Learning (DL). DL simply refers to a technique for implementing ML. The technique our solution employs is known as Deep Neural Networks (DNNs). In a DNN, the computational structures are arranged in layers that work much like the human brain, learning patterns of patterns. What is different from the human brain, of course, is the speed and capacity of the data processing. Whereas a human brain benefits from reducing the number of variables and examples to consider (we call it focusing), a DNN actually benefits from a larger set of variables (called "features" in AI-speak) and examples. So now you have a model that not only learns from past patterns but formulates new patterns as well, with quantities of qualified data never thought possible.

DL is extremely powerful and, until recently, was too computationally expensive to run. Technology has changed, and that is why today's applications of this process were literally only dreamed of a short time ago. In business terms, we are talking about using massive data quantities in our AI models to determine business outcomes with extremely high degrees of accuracy.

If you personally have an exceptionally quick thought process, you can immediately see how a business would find value in such work. Or, if you are more like me, it may take a little more effort, so let me help. Think of it this way: how much of an Integrated Oil Company's organization and asset configuration exists for the primary purpose of reacting to unplanned conditions? How much of the organization could be resized if product mix were highly predictable? How much safety stock, and how many assets to hold it, would be required if you had a highly predictive front end to product sales?
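The safety-stock question above can be put in numbers. The classic textbook formula sets safety stock at z × σ × √L, where z is a service-level factor, σ is the standard deviation of daily demand error, and L is replenishment lead time in days. The figures below are illustrative, not from any client engagement; the sketch simply shows that cutting forecast error cuts safety stock proportionally.

```python
import math

def safety_stock(z, demand_std, lead_time_days):
    """Classic safety-stock formula: z * sigma * sqrt(L).
    z is the service-level factor (~1.65 for a 95% service level)."""
    return z * demand_std * math.sqrt(lead_time_days)

z = 1.65          # ~95% service level
lead_time = 9     # days of replenishment lead time (illustrative)

# Hypothetical daily demand error before and after a predictive front end
before = safety_stock(z, demand_std=1000, lead_time_days=lead_time)
after = safety_stock(z, demand_std=400, lead_time_days=lead_time)

print(round(before))                           # 4950 units
print(round(after))                            # 1980 units
print(f"reduction: {1 - after / before:.0%}")  # reduction: 60%
```

Under these assumed numbers, tightening demand forecasts by 60% frees 60% of the safety stock, along with the tanks, terminals, and working capital that hold it. That is the business case hiding inside the predictive modeling.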
Because our AI models thrive on massive amounts of data to make the predictive process richer, there are many other outputs from this modeling process that can produce extremely valuable business intelligence. Here is a brief listing of outputs from a recent AI modeling of a large Downstream business:
- “Ship to” product forecast with extreme focus on 90 days out
- Profitability and customer margin analysis on an ongoing basis for “Ship to” customers
- Terminal margin optimization analysis
- Site, terminal, geographic market attractiveness data
- Customer sensitivity (predictive reaction) to price changes
Our consulting teams have had a field day with the quantity and quality of analysis findings derived from these AI modeling activities. And clearly, we are just starting to realize the significant benefits of these new technologies. Over time I will certainly post more examples of how AI drives additional complexity-reduction opportunities, but one thing is clear to me right now: businesses that are slow to adopt these technologies will find it extremely difficult to compete with the early adopters.