September 10th, 2020
By Laila Partridge, Managing Director of the STANLEY + Techstars Accelerator
I’ve recently moderated two sessions with leaders at Stanley Black & Decker and Rockwell Automation who have deep expertise in manufacturing, AI/ML, and Industry 4.0 deployments.
The July 27 session kicked off the monthly speaker series with:
Mark Maybury, CTO of Stanley Black & Decker
Sudhi Bangalore, Global Vice President of Industry 4.0 at Stanley Black & Decker
The second session, on August 26, dove deeper into AI issues specifically, with:
Mukesh Dalali, Chief AI Officer of Stanley Black & Decker
Paul Turner, Vice President of Industry 4.0 Applications and Analytics at Stanley Black & Decker
Bijan Sayyar-Rodsari, Director of Advanced Analytics at Rockwell Automation
To begin with the least obvious and most controversial statement:
Much of the conversation on August 26 centered on the high value potential of AI in manufacturing, which will be driven by data utilization. Paul Turner acknowledged that “data is important, but it is not the only thing.” He went on to point out that “a lot of data initiatives where AI is applied to data are about minimizing errors and getting the most accurate model. That is actually secondary (in manufacturing). Value comes first.” He explained his perspective through two anecdotes. The first concerns quality applications, a high-value opportunity in manufacturing. Defects are rare events (think: 1% of parts), which means a model can reach 99% accuracy while catching almost no defects, so headline accuracy alone isn’t helpful. The second example is less obvious.
Paul described a situation where 5% accuracy was preferred over 95% accuracy. In this instance, a clogged flow to an ink nozzle caused 95% of quality issues and was already detected by an alarmed, integrated sensor. To avoid duplicate alarms, a more nuanced approach (multivariate, nonlinear, and stochastic) was required to identify the other 5% of defects. In the absence of this context, a typical AI application would likely seek a 95% accuracy result, duplicating what the sensor already caught.
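Paul’s first anecdote is a textbook instance of the accuracy paradox on imbalanced data. A minimal sketch (the 1% defect rate and data are simulated for illustration, not Stanley Black & Decker data):

```python
# Illustration of the accuracy paradox: with ~1% defect rates, a "model"
# that never flags a defect still scores ~99% accuracy while delivering
# zero value on the quality problem it was built for.
import random

random.seed(0)
# Simulated inspection outcomes: roughly 1% of parts are defective (label 1)
labels = [1 if random.random() < 0.01 else 0 for _ in range(10_000)]

# Naive model: always predict "no defect"
predictions = [0] * len(labels)

correct = sum(p == y for p, y in zip(predictions, labels))
accuracy = correct / len(labels)
defects_caught = sum(p == 1 and y == 1 for p, y in zip(predictions, labels))

print(f"accuracy: {accuracy:.1%}")         # ~99%, yet...
print(f"defects caught: {defects_caught}")  # ...zero value delivered
```

This is why Paul puts value before model accuracy: the useful metric here is how many of the rare defects are actually caught, not the overall hit rate.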
The examples above illustrate a number of other conversation topics:
Factory sensorization (the number of sensors and collection points) is still a problem. Paul stated that “we are not there yet” and described Stanley Black & Decker’s change to its plant-level ROI guidelines. Mukesh noted that manufacturing process digitization will help.
Data access and context are slowing machine learning algorithm innovation. Bijan spoke about the industry’s sensitivity about publicly sharing data and application needs, while Mukesh noted the high number of disparate data sources and an overall lack of context.
Alarm overload at the diagnostic and descriptive layer means that data must be translated into actionable insights. Failure to integrate with existing systems and operator processes often leads to pilot purgatory.
Domain expertise is critical when designing AI applications for manufacturing, due to the “imbalance in data collected.” Faulty conditions represent a minority of manufacturing operations, so developing AI solutions will require combining scarce data sets with domain expertise.
The need for employee-centric AI solutions and Stanley Black & Decker’s focus on the Future of Work challenge were highlighted by both Sudhi and Mark in the kick-off session and touched upon throughout the subsequent session. Bijan added a less common observation about the shortage of data scientists in manufacturing.
For specific innovation areas to pursue, Bijan described Rockwell’s interest in automated learning with minimal reliance on data science expertise in the application workflow (to address the shortage of data scientists) and identified manufacturing quality as one of the biggest opportunities. Mukesh called out the need for edge management systems to avoid the time delays of cloud processing. He put this in the context of processing data for actionable insights, which is transitioning from recommendations to automated decisions. Mukesh cited an example:
An insight could be that the vibrations are abnormal.
A recommendation for an operator could be to decrease the speed of a machine.
And a decision, for a fully automated machine, would be to make that speed decrease autonomously, without any human in the loop.
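Mukesh’s progression can be sketched in code. The threshold, function names, and the 20% speed cut below are all hypothetical, chosen only to illustrate the escalation from insight to recommendation to autonomous decision:

```python
# Three maturity levels applied to the same vibration signal.
# VIBRATION_LIMIT_MM_S is an assumed alarm threshold, not a real spec.
VIBRATION_LIMIT_MM_S = 4.5

def insight(vibration_mm_s: float) -> str:
    """Descriptive layer: report that something is abnormal."""
    if vibration_mm_s > VIBRATION_LIMIT_MM_S:
        return "vibration is abnormal"
    return "vibration is normal"

def recommendation(vibration_mm_s: float, speed_rpm: int) -> str:
    """Prescriptive layer: suggest an action for the operator to take."""
    if vibration_mm_s > VIBRATION_LIMIT_MM_S:
        return f"recommend reducing speed below {speed_rpm} rpm"
    return "no action needed"

def automated_decision(vibration_mm_s: float, speed_rpm: int) -> int:
    """Autonomous layer: act without a human in the loop."""
    if vibration_mm_s > VIBRATION_LIMIT_MM_S:
        return int(speed_rpm * 0.8)  # cut speed 20% (illustrative policy)
    return speed_rpm

print(insight(6.2))                   # vibration is abnormal
print(recommendation(6.2, 1800))
print(automated_decision(6.2, 1800))  # 1440
```

The point of the example is that the underlying analytics are the same at each level; what changes is how much of the response is delegated to the machine.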
Paul spoke about aggregating value and how those value drivers require a layered approach:
Foundational layer (sensor data) enables basic performance management (MES or OEE)
Analytics layer on top of basic performance might look only at diagnostic and descriptive analytics
Performance applications, by contrast, aggregate and combine individual solutions into value drivers
Continuous improvement sits on top and is the mechanism by which to drive behaviors in the plant.
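One way to picture that stack is as a simple ordered structure (the layer names paraphrase Paul’s description and are not official Stanley Black & Decker terminology):

```python
# The layered value stack, bottom to top, as described in the session.
value_stack = [
    ("foundational", "sensor data enabling basic performance management (MES/OEE)"),
    ("analytics", "diagnostic and descriptive analytics on top of performance data"),
    ("performance applications", "individual solutions aggregated into value drivers"),
    ("continuous improvement", "the mechanism that drives behaviors in the plant"),
]

for level, (name, role) in enumerate(value_stack, start=1):
    print(f"layer {level}: {name} -> {role}")
```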
Perhaps the clearest direction came during the kick-off session, when Sudhi described and quantified the value ($200 million) of thirty use cases Stanley Black & Decker had identified as near-term solution gaps.
The best long-term vision also came from the kick-off session, when Mark walked through details of the AI Roadmap for Manufacturing. The roadmap is the joint work of 70 leading academics and global manufacturing companies and gives a solid 10-year view of technology adoption by the manufacturing sector.
Most noteworthy is the emerging new trend in the industry: collaboration.
Explicitly stated by several speakers: cross-industry collaboration among partners, both large and small companies, is the only way to fully achieve the benefits of AI in the manufacturing industry. No one company will have a complete solution, and startups provide a critical innovation component.