How Analytics is Transforming Supply Chain Management

 

 

Supply chain management is a field where Big Data and analytics have obvious applications. Until recently, however, businesses have been slower to implement big data analytics in supply chain management than in other areas of operation, such as marketing or manufacturing.

Of course, supply chains have long been driven by statistics and quantifiable performance indicators. But the sort of analytics that is really revolutionising industry today – real-time analysis of huge, rapidly growing and very messy unstructured datasets – has been largely absent.

This was clearly a situation that could not last. Many factors impact supply chain management, from the weather to the condition of vehicles and machinery, and so executives in the field have recently been thinking long and hard about how this data could be harnessed to drive efficiencies.


 

Why is it so Important?

Relying on traditional supply chain execution systems is becoming increasingly difficult, given a mix of global operating systems, pricing pressures and ever-increasing customer expectations. There are also recent economic impacts such as rising fuel costs, the global recession, supplier bases that have shrunk or moved offshore, and increased competition from low-cost outsourcers. All of these challenges potentially create waste in your supply chain. That's where data analytics comes in.

Data analytics is the science of examining raw data in order to draw conclusions from that information. It is used in many industries to help companies and organizations make better business decisions, and in the sciences to verify (or disprove) existing models or theories.

All businesses with a supply chain devote a fair amount of time to making sure it adds value, but these new advanced analytic tools and disciplines make it possible to dig deeper into supply chain data in search of savings and efficiencies.

The supply chain is a great place to use analytic tools to look for a competitive advantage, both because of its complexity and because of the prominent role the supply chain plays in a company's cost structure and profitability. Supply chains can appear simple compared to other parts of a business, even though they are not. If we keep an open mind, we can always do better by digging deeper into the data and by taking a predictive rather than a reactive view of it.

 

https://www.industryweek.com/blog/supply-chain-analytics-what-it-and-why-it-so-important

https://www.forbes.com/sites/bernardmarr/2016/04/22/how-big-data-and-analytics-are-transforming-supply-chain-management/#3a01760339ad

Questions

  1. What are the applications of analytics in the supply chain?
  2. What are some of the pain points in the supply chain that analytics addresses?


How to Survive the Overwhelming Tide of Data

With the increase in accessibility of production and quality data through automation, the Internet of Things, and handheld devices, manufacturers are finally able to gather and analyze data to improve their processes at a level never seen before. However, with this seemingly limitless access to data comes a new problem: having too much of it. More and more companies are falling into the trap of collecting data for the sake of collecting data, simply because they can, and this can actually harm a business. As Douglas Fair notes in his article "Drowning in Quality Data: How to Rise Above," it is the insight gleaned from data that actually benefits the business. This means that, along with optimizing their processes and machines on the manufacturing floor, manufacturers now also have to think about optimizing how they collect their data so that they get the most benefit from it.

When optimizing the data collection process, it is important to ask five simple questions to assess whether a given piece of data really needs to be collected (a rough sketch of how such an audit might be scripted follows the list).

  1. Why do we need to gather this data? What is the improvement we are trying to make with this data we are collecting?
  2. How will we use the data after collection? What are we going to do with it after we have collated it?
  3. Who will evaluate the data? Will it be automated or will we be dedicating personnel to it? Do we have the labor available right now to handle it?
  4. What is a reasonable amount of data to collect? Can we defend why we need as much as we do or could we do the same thing with less?
  5. How frequently do we need to collect the data? How often are we analyzing and using the data to make decisions? Do these coincide with each other well?
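
To make the checklist concrete, below is a minimal Python sketch of how a team might script such a data-collection audit. Everything in it is hypothetical: the DataStream fields, the audit rules, and the example streams are illustrations of the five questions above, not anything prescribed by Fair's article.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class DataStream:
    """One candidate data stream and its answers to the five questions above."""
    name: str
    purpose: str            # Q1: the improvement this data is meant to drive
    planned_use: str        # Q2: what happens to the data after collection
    evaluator: str          # Q3: the person or system that will evaluate it
    samples_per_day: int    # Q4: how much of it we collect
    reviews_per_day: float  # Q5: how often it actually feeds a decision

def audit(streams: List[DataStream]) -> List[Tuple[str, List[str]]]:
    """Flag streams with no clear purpose, use, or owner, or that are
    collected without ever being reviewed."""
    flagged = []
    for s in streams:
        reasons = []
        if not s.purpose.strip():
            reasons.append("no stated improvement goal (question 1)")
        if not s.planned_use.strip():
            reasons.append("no planned use after collection (question 2)")
        if not s.evaluator.strip():
            reasons.append("nobody assigned to evaluate it (question 3)")
        if s.samples_per_day > 0 and s.reviews_per_day == 0:
            reasons.append("collected but never used in a decision (questions 4 and 5)")
        if reasons:
            flagged.append((s.name, reasons))
    return flagged

if __name__ == "__main__":
    # Two invented streams: one with a clear purpose, one collected "just because".
    streams = [
        DataStream("press_42_cycle_time", "reduce scrap on line 3", "weekly SPC review",
                   "quality engineer", samples_per_day=1440, reviews_per_day=1),
        DataStream("warehouse_door_open_events", "", "", "", samples_per_day=86400, reviews_per_day=0),
    ]
    for name, reasons in audit(streams):
        print(f"{name}: " + "; ".join(reasons))
```

Run against a real inventory of data streams, a report like this would surface the candidates for trimming before "data gluttony" sets in.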

At the end of the day, the only surefire way to avoid falling into "data gluttony" is to check yourself and ensure that you are collecting data for specific purposes, using all the data you collect, and acting on the insights gained from that data to improve your bottom line.

 

Source: https://www.manufacturing.net/article/2019/01/drowning-quality-data-how-rise-above

 

Questions:

  1. With data becoming so central to operations nowadays, are we going to start seeing roles dedicated to data analysis on site at plants? How will this affect the way plants are run?
  2. What are the costs associated with “data gluttony”? Is it really as big a problem as Fair makes it out to be?
  3. How long does the process of optimizing data collection take? How often should companies review their data collection process to ensure they aren’t collecting useless data?

How will manufacturing progress in 2019?

As manufacturers continue to run their operations as lean and efficiently as possible, technology continues to drive change in the industry. Decision Analyst, on behalf of IQMS, conducted a survey of 151 North American manufacturers about the technologies they are using to transform their operations. Louis Columbus wrote about the results in his article "Ten Manufacturing Technology Predictions for 2019," where he summarizes the key technological advancements that will transform manufacturing as we enter the new year.

  1. More attainable lights-out production courtesy of affordable Smart Machines that are able to run unattended for two or more shifts.
  2. Real-time monitoring with Wi-Fi enabled shop floors and IoT enabled smart machines to improve scheduling accuracy, inventory control, plan performance, and greater flexibility in managing production lines.
  3. Greater adoption of analytics and BI to capitalize on data streams, improve capacity through better resource planning, and scale their businesses.
  4. Mobile ERP and quality management applications becoming mainstream thanks to advances in integration, usability and high-speed cellular networks, helping companies improve data accuracy and operational efficiency and reduce operational delays.
  5. Digitally-driven transformation with a customer focus by utilizing the above to offer short-notice production runs and achieve greater supplier collaboration.
  6. Replacement of old legacy machines with cheaper smart machines, helping small and mid-tier manufacturers pursue new digital business models.
  7. A major shift to fast-tracking smart, connected products to avoid price wars and premature commoditization, so that within two years at least two-thirds of product portfolios will be connected thanks to IoT and other technological innovations.
  8. A spreading security perimeter, thanks to a proliferation of IoT endpoints and an increasing number of threats to operations from new sources.
  9. Use of the IIoT to increase productivity by improving inconsistent, inflexible legacy data structures from the shop floor to the top floor.
  10. Greater revenue streams for early adopters of IoT, widening the gap between manufacturers who adopted IoT early and those who did not.

 

Questions:

  1. What will happen to manufacturers who don’t embrace these changes? Will they be able to catch up or will they soon become irrelevant?
  2. What will be the major challenges faced by manufacturers who try to adopt these changes in their operations? How quickly will they see the results from these changes?
  3. Looking beyond 2019, how will the manufacturing space continue to grow as newer technologies come out?

Source: https://www.manufacturing.net/blog/2018/11/ten-manufacturing-technology-predictions-2019

Understanding the Analytics Supply Chain

In the article The Analytics Supply Chain, the author examines an interesting bottleneck for big data analytics projects at firms: the analytics supply chain itself. Deploying big data projects has become more and more popular, but the results are not always satisfactory. Projects often take too long to source the data, build the models, and deliver the analytics-based solutions to decision makers in an organization. In a twist of fate, the analytics supply chain becomes the obstacle to supplying the analytics that were meant to help the supply chain. The author suggests looking at the analytics supply chain with decision makers as the customers and analytical models as the products being consumed. Just as in a normal supply chain, bad inputs usually lead to bad outputs.

When thinking of data as raw material and models as output, bad or incomplete data usually results in poor or incomplete models. Furthermore, sourcing enough data for complete models sometimes takes too long, and so substitutes such as spreadsheets are used instead. The article states that between 20% and 80% of spreadsheets have been found to contain errors, and, as one might imagine, errors lead to the proliferation of different versions of the truth. So a complete model built on complete, accurate data is necessary. Such models take time, and unfinished models can be thought of as inventory. Inventory does not contribute to the bottom line, so delivering complete models within a reasonable amount of time becomes important. Perhaps this requires a more precise approach or hiring more people, but it is important to recognize that data-driven models that are too complex or incomplete will normally not deliver the analytics-based decisions anticipated.

Finally, the author makes some suggestions for identifying whether an analytics supply chain is in need of repair. If analytics projects are hindered by a lack of IT, data, and/or other scarce technical resources, there might be an issue in the analytics supply chain. If a firm's ability to build new analytical models is hindered by constant maintenance of older systems, there may be an analytics supply chain issue. If big data systems have been deployed but the results do not seem to justify the investment, perhaps it is time to evaluate the analytics supply chain rather than scrapping big data projects as a waste of money. Evaluations like this could save firms money in multiple ways, not to mention preserve the time already invested in existing projects.

 

Has anyone ever heard of evaluating the analytics supply chain?

Have you ever run across models that were too complex to implement?

Have you seen any instances where big data projects were scrapped because results were not produced fast enough?

http://data-informed.com/the-analytics-supply-chain/

Process Mining – Big Data Working for Manufacturers

In the article Why Manufacturers Need Process Mining – A New Type of Data Analytics, the author extols the benefits of what he calls a new type of data analytics: process mining. Process mining can be used to reduce inventory costs, identify production bottlenecks, improve on-time delivery, and optimize logistics between production sites, distribution centers, and end clients. It is hard to argue that process mining is truly a new type of data analytics for manufacturers, but it does follow a few of the key rules for successful big data usage. First, as the name implies, it concentrates on one process. Successful big data projects normally require concentration on one area for improvement, and that specificity is usually what allows them to succeed. The article looks at why process mining, and big data analytics as a whole, can save manufacturers money.

Process mining specifically looks at a very important factor in a manufacturer's processes: KPIs, or key performance indicators. KPIs are exactly what they sound like: the main factors that measure the performance and overall success of a process or project. Process mining's value lies in the fact that it makes one look at the KPIs of a process first. It also challenges the manufacturer to validate whether they are measuring the right KPIs and to understand whether the data they are gathering can be related back to those KPIs. Process mining, like all big data analytics, uses software to do the difficult work of visualizing the processes and highlighting specific variances that impact KPIs. One example used in the article is the examination of throughput times and the ability to identify specific vendors that are not meeting their lead-time commitments. These kinds of analyses and results are exactly what big data projects are meant to achieve.
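
The lead-time check described above is simple to express in code. The sketch below is a hypothetical illustration, not the process-mining software the article discusses: the vendor names, dates, and committed lead times are all invented, and a real event log would be far larger and messier.

```python
# A minimal sketch: given hypothetical order events and each vendor's committed
# lead time, flag vendors whose average actual lead time exceeds the commitment.
from collections import defaultdict
from datetime import date

# (vendor, order placed, order received) -- illustrative records only
orders = [
    ("VendorA", date(2019, 1, 2), date(2019, 1, 9)),
    ("VendorA", date(2019, 1, 15), date(2019, 1, 30)),
    ("VendorB", date(2019, 1, 3), date(2019, 1, 8)),
]

# committed lead time in days, per vendor (assumed figures)
commitments = {"VendorA": 7, "VendorB": 10}

lead_times = defaultdict(list)
for vendor, placed, received in orders:
    lead_times[vendor].append((received - placed).days)

for vendor, days in lead_times.items():
    avg = sum(days) / len(days)
    if avg > commitments[vendor]:
        print(f"{vendor}: average lead time {avg:.1f} days exceeds "
              f"the committed {commitments[vendor]} days")
```

In a real deployment the events would come from an ERP or MES system and cover many more process steps, but the KPI-first framing is the same.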

The article also points out that process mining encourages manufacturers to identify inefficiencies and problems within the process; it encourages companies to embrace their issues. That mindset is absolutely necessary for continuous improvement, and it is key to any big data project. Embracing issues can be difficult, especially for older, ingrained processes, but it is the only way to eliminate them. It is certainly not the easiest thing to do, and it requires a humble mindset going into an improvement project. Process mining has all the key factors of successful big data software, and it could be very useful for manufacturers that want to embrace the big data revolution.

 

Do you believe addressing KPIs first is the best way to approach a big data project?

Do you think process mining is actually a new or different type of big data analytics or just a rebranding of basic big data?

Do you think that some manufacturers are reluctant to implement big data projects because they do not want to know their inefficiencies?

 

http://www.mbtmag.com/article/2017/02/why-manufacturers-need-process-mining-new-type-big-data-analytics

Caterpillar is Saving Big Money using Big Data and the IoT

In the article IoT And Big Data At Caterpillar: How Predictive Maintenance Saves Millions Of Dollars, the author examines an interesting case of Caterpillar saving significant amounts of money using big data and the IoT. The best part of this case study is that Caterpillar is seeing a very quick ROI on its big data investment, which is not something that can be said for most companies. As a Caterpillar manager put it, you don't have to look for a "grand slam" with big data; sometimes multiple smaller applications of big data add up to significant savings. In this instance, gathering as much data as possible seems to be the best approach, and having experts in both the processes and the data analyze and interpret the insights gleaned helps realize real value.

Caterpillar applied big data in its Marine Division, mainly to analyze fuel consumption for its customers, since fuel most affects the bottom line. Sensors on each ship monitored everything from generators, engines, and GPS to refrigeration and fuel meters, and Caterpillar used Pentaho's data and analytics platform. Insights gained include a correlation between fuel-meter readings and the amount of power used by the refrigerated containers, and the finding that running more generators at lower power, rather than maxing out a few, was more efficient. The cost savings here added up to more than $650,000 per year. Another insight concerned the optimization of a ship's hull-cleaning schedule: data collected on cleaned and uncleaned ships showed that cleanings should be performed once every six months instead of once every two years, with savings of $400,000 per ship.
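
As a rough illustration of how one such insight might surface, the sketch below computes a simple correlation between two invented sensor series. It is an assumption-laden toy, not the Pentaho-based analysis described in the article: the readings, units, and sample size are all made up.

```python
# A minimal sketch of the kind of sensor-correlation analysis described above.
# Hourly readings invented for illustration: refrigerated-container power
# draw (kW) versus the ship's fuel rate (litres per hour).
from statistics import correlation  # requires Python 3.10+

reefer_power_kw = [310, 295, 340, 360, 305, 330, 350, 372, 298, 315]
fuel_rate_lph   = [212, 204, 228, 240, 207, 222, 234, 247, 205, 214]

r = correlation(reefer_power_kw, fuel_rate_lph)
print(f"Pearson correlation between reefer power and fuel rate: r = {r:.2f}")
```

A strong positive correlation like this is the kind of signal that would prompt a closer look at how refrigeration load drives fuel consumption.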

In the grand scheme of big data, predictive maintenance analytics seems to be the most powerful tool in consistent use. With data being generated just about anywhere you could imagine via the IIoT, understanding trends becomes easier and easier. Interestingly, and contrary to previous articles, Caterpillar believes that you can't collect too much data; they point out that data storage is very cheap. In the words of a Caterpillar manager, you can't see "relationships about relationships" in the data if you don't collect it. Although a more-is-better approach is definitely not what some companies have subscribed to, it seems to be working well for Caterpillar's Marine Division as they continue to pull value out of big data and analytics.

Quick returns on big data investments seem to be rare, so do you think that companies just aren't utilizing big data correctly?

Do you believe in the more-is-better approach with regard to collecting data?

Do you believe Caterpillar is more likely to invest in big data projects in other parts of their company due to the success in the marine division?

http://www.forbes.com/sites/bernardmarr/2017/02/07/iot-and-big-data-at-caterpillar-how-predictive-maintenance-saves-millions-of-dollars/#3e01059a5a63

Looking for the Signal in the Noise

In the article A Signal in the Noise: How to Best Manage Big Data, the author advocates ways for manufacturers to improve their use of data, along with his view of the best way to approach big data analytics and data management. The author likens working with big data to finding "a needle in the haystack" in the manufacturing environment. With so much data produced by even small firms, managing, organizing, and analyzing the data can become overwhelming, if not impossible. Industrial Internet of Things-based systems are estimated to create $4 trillion to $11 trillion in new economic value for manufacturers by 2025, according to the McKinsey Global Institute. With such large value at stake, ignoring or underutilizing data could be a catastrophic mistake for manufacturers.

In a study of manufacturers conducted by the MPI Group, 76% of respondents reported plans to increase their use of smart devices in the next two years, while 66% plan to increase their investment in IIoT-enabled products over the same period. With many firms taking the necessary steps to collect and use data, a strategic approach is necessary. The author suggests a number of steps to jump-start the transformation to using data analytics. Recognizing human limits and the burden of isolation is the first suggested step: firms should understand that human teams alone are simply not capable of extracting the insights that technology can. The next step is forgetting the traditional supply chain cycle and embracing the complexity of modern supply chains. With the IIoT, data can be captured all along the supply chain, which can offer useful insights not previously possible.

Finally, the author advocates understanding four different measures before taking on new data management and analytics initiatives. Finding the actual problem, the "signal in the noise," is the first and foremost issue at hand; without direction, a project will almost certainly fail. Next is understanding the business case: one has to understand the strategic advantage behind any data management implementation. Lastly, finding where optimization is most desperately needed and identifying the experts best suited to handle the data being generated are key. Overall, the biggest pitfall in bringing on new technology is the belief that good things will just happen. An understanding of the problem, the right people, and a tailored process will allow the technology to do the work it is supposed to.

Do you believe that understanding big data is really about finding the proverbial needle in a haystack?

Do you believe the increase in smart device usage automatically translates to more useful big data, or just more data in general?

As with most strategic IT plays, projects start off to gain a strategic advantage but quickly become part of the IT infrastructure.  Do you think we are in the middle stage between big data being a strategic advantage and a necessary IT infrastructure requirement for manufacturers?

 

http://www.industryweek.com/technology/signal-noise-how-best-manage-big-data