In a description of the National Science Foundation-sponsored center for Smart Process Manufacturing (http://www.rockwellautomation.com/resources/downloads/rockwellautomation/pdf/about-us/company-overview/TIMEMagazineSPMcoverstory.pdf), the authors suggest that market disruptions such as a “$3000 automobile or a $300 personal computer” might be outcomes. Plant integration, plant optimization, and manufacturing knowledge are listed as the phases needed to reach this reality. What are the barriers to such an evolution in manufacturing? How much integration of people, process, and technology needs to happen to transform existing manufacturing? Will leadership for this transformation come from small, agile companies that, when successful, will be integrated into larger ones, or can the large companies lead such a transformation? Finally, how global will this phenomenon need to be to transform supply chains?
Current Supply Misconceptions
As a profession, those who work in supply chain are constantly on the lookout for initiatives that can make companies more efficient, cut costs, or incorporate new technology (usually to become more efficient or cut costs). But not all of the newest hype is worth an investment, and some of the newest trends or prevailing knowledge don’t always save time and dollars. In the article The Biggest Supply Chain Fallacies, we will look at some of the misconceptions and over-hyped technologies currently circulating in the supply chain industry.
Regarding newer technologies, Blockchain is popular just about everywhere. Through its rise in cryptocurrency, many companies have taken notice and are exploring where Blockchain might add value. I’ve even written in recent blogs about Samsung’s plans to implement Blockchain in its logistics. But Blockchain isn’t without its issues, and the author of the article brings up a very good point: Blockchain cannot overcome the problem of garbage in, garbage out. Suppose an upstream supplier lies about what it is doing and enters the untrue information in the Blockchain. That information will be regarded as true and cannot be changed later. The need to certify and monitor suppliers is not solved by Blockchain, and since Blockchain’s records are unchangeable, the need to certify may actually be more important.
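To make the garbage-in, garbage-out point concrete, here is a minimal, hypothetical Python sketch of an append-only hash chain — a toy stand-in for a supply chain ledger, not any vendor’s actual blockchain. The supplier name and claim are invented for illustration: the false claim verifies as intact, and any later attempt to correct it breaks verification, which is exactly why certification has to happen before data enters the chain.

```python
import hashlib
import json

class Ledger:
    """Toy append-only hash chain; each block commits to the previous block's hash."""

    def __init__(self):
        self.blocks = []  # each block: {"data": ..., "prev": ..., "hash": ...}

    def append(self, data):
        prev = self.blocks[-1]["hash"] if self.blocks else "0" * 64
        payload = json.dumps({"data": data, "prev": prev}, sort_keys=True)
        self.blocks.append({
            "data": data,
            "prev": prev,
            "hash": hashlib.sha256(payload.encode()).hexdigest(),
        })

    def verify(self):
        """Recompute every hash; any edit to recorded data breaks the chain."""
        prev = "0" * 64
        for b in self.blocks:
            payload = json.dumps({"data": b["data"], "prev": prev}, sort_keys=True)
            if b["prev"] != prev or hashlib.sha256(payload.encode()).hexdigest() != b["hash"]:
                return False
            prev = b["hash"]
        return True

ledger = Ledger()
# A hypothetical upstream supplier records an untrue claim.
ledger.append({"supplier": "UpstreamCo", "claim": "organic certified"})
print(ledger.verify())  # True: the chain is intact, but the claim is still false

# Trying to fix the record afterwards breaks verification, so the bad entry is permanent.
ledger.blocks[0]["data"]["claim"] = "not certified"
print(ledger.verify())  # False
```

The immutability that makes the ledger tamper-evident is the same property that locks in whatever lie was entered in the first place.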
The next fallacy is that Corporate Social Responsibility (CSR) initiatives will assuredly drive better financial performance. Unfortunately, this is just not true. CSR programs can reduce costs, such as initiatives that cut fuel consumption through better routing and fuller truckloads, but CSR programs overall tend to look better from the outside than their financial results justify. A realistic view of CSR programs is that they may reduce costs, they will most likely attract better talent, and they will attract positive attention to the firm. Overall, CSR matters more in wealthy nations and to younger employees, and it’s important to be realistic about what can be achieved through these programs.
Finally, the last supply chain fallacy discussed is the apparent truck driver shortage. This shortage, as commonly claimed, is due to the fact that younger people do not want these jobs. But the root of this issue is most likely just economics. The average wage for a truck driver, according to Salary.com, is a bit over $42,000/year, and at that wage, most young people do not want the job. If wages went up, it would make economic sense that more individuals would want to drive trucks. And in fact, that is apparently happening: from 2013 to 2017, truck driver salaries increased between 15% and 18%. As wages increase, there will eventually be more drivers, and this shortage will be solved. Interestingly, the author of the article brings up automation taking the place of drivers as wages rise. Autonomous vehicles are a hot topic right now, but it’s highly unlikely that autonomous trucking fleets will make their way onto our roads anytime soon.
As supply chain professionals, it’s important that we discern fact from fiction and over-hyped technology from value-adding technology. Getting differing opinions, staying well-read, and keeping an open mind appear to be the best ways to move forward, even in an ever-changing environment.
Do you believe these points to be true?
Are there other supply chain misconceptions not mentioned?
Where will blockchain’s utility be found or is it most likely not useful in a supply chain context?
Using the Cloud to Improve Warehouse Performance
Starting in May of 2018, Ametek Prestolite Power will begin offering Ametek Insight through its Wireless Battery Identification Devices (WBID). Ametek Insight, a cloud-based ZigBee software platform, is the newest intelligence solution provided by Ametek that, when paired with the WBID, aims to solve one of today’s warehouse challenges: fleet optimization. The WBID allows users to continuously and remotely monitor an entire fleet of forklifts using real-time data, transmitted and collected through Insight, by managing battery performance, changing settings, updating software/firmware, and more. The ability to use and apply this technology will allow companies to optimize fleet management to extend battery life, increase productivity, reduce costs, and ultimately better serve customers. Ametek expects that Insight will eventually become a standard feature on a majority of its battery chargers, not just those offered through WBID. Can this technology be adapted to optimize other areas of the supply chain besides forklift operation? As the Internet of Things and real-time data analytics take a more prominent role in supply chains, how will the job of a supply chain manager change?
Staying Ahead of Customer Needs with the Help of CNC Technology
Bridge Tool & Die has been a major player in the carbide tooling market since 2005 and has been transitioning from manual grinders, the tooling industry norm, to automated CNC grinders to increase productivity, capacity, and quality, in the hopes of better serving customers who are continually looking for better and cheaper solutions. This transition began back in December of 2015, when Bridge Tool & Die implemented a three-pronged strategy to enhance its manufacturing processes by reducing machining time and increasing consistency. The three prongs were: retrofitting existing manual grinders with CNC, setting up multi-grinder workstations, and purchasing high-end CNC machines. Using this approach, an operator would theoretically be able to run three machines at once: one manual, one CNC, and one semi-automatic grinder. Most recently, in 2017, Bridge Tool & Die invested in a Studer CT960 CNC multi-axis grinder to further improve quality and capabilities, boasting of grinding parts in a third of the time, halving the polishing time required, and achieving tolerances of 0.0001” in an industry where the market norm is 0.0004”. The new CNC machine is also lowering costs for Bridge Tool & Die, as it is expected to require two fewer operators to run, reducing labor costs for the company. Glenn Bridgeman, the owner of Bridge Tool & Die, states that “The need for increased technology was not driven by reducing operators in our shop . . . Rather, it offers us the ability to keep all of our experienced operators, and address capacity versus technology-allowing us to grow over 15% per year.” Will Bridge Tool & Die continue its trend toward a more automated grinding process? How will industry competitors react to its ability to achieve above-normal tolerances? How soon before automation with CNC becomes the new norm and all manual grinding, both at Bridge Tool & Die and elsewhere, becomes obsolete?
How will Bridge Tool & Die continue to improve its processes beyond the use of CNC grinders?
In the article The Analytics Supply Chain, the author explains an interesting concept that appears to be a bottleneck for big data analytics projects at firms: the analytics supply chain itself. Deploying big data projects has become more and more popular, but the results are not always satisfactory. Projects often take too long to source the data, build the models, and deliver analytics-based solutions to decision makers in an organization. In a twist of fate, the analytics supply chain becomes the problem in supplying the analytics that were meant to help the supply chain. The author suggests viewing the analytics supply chain with decision makers as the customers and analytical models as the products being consumed. Just as in a normal supply chain, bad inputs usually equate to bad outputs.
When thinking of data as raw material and models as output, bad or incomplete data usually results in poor or incomplete models. Furthermore, sourcing enough data for complete models sometimes takes too long, and substitutes such as spreadsheets are used instead. The article states that between 20% and 80% of spreadsheets have been found to contain errors, and as one might imagine, errors lead to the proliferation of different versions of the truth. So a complete model built on complete, accurate data is necessary. Such models can take time, and one can think of unfinished models as inventory. Inventory does not contribute to the bottom line, so completing models in a reasonable amount of time becomes important. Perhaps this requires a more precise approach or hiring more people, but it’s important to recognize that data-driven models that are too complex or incomplete will normally not deliver the analytics-based decisions as anticipated.
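The raw-material analogy can be sketched as a simple inbound quality gate: inspect records before they feed a model, just as incoming materials are inspected before production. The records, fields, and plausibility rules below are hypothetical, purely to illustrate screening out incomplete or implausible inputs before they become bad outputs.

```python
# Hypothetical raw records feeding an analytics "supply chain".
records = [
    {"order_id": 1, "lead_time_days": 12, "cost": 430.0},
    {"order_id": 2, "lead_time_days": None, "cost": 515.0},  # incomplete
    {"order_id": 3, "lead_time_days": -4, "cost": 120.0},    # implausible
    {"order_id": 4, "lead_time_days": 9, "cost": 280.0},
]

def validate(rec):
    """Treat data as raw material: list the defects in one incoming record."""
    issues = []
    if rec["lead_time_days"] is None:
        issues.append("missing lead time")
    elif rec["lead_time_days"] < 0:
        issues.append("negative lead time")
    if rec["cost"] <= 0:
        issues.append("non-positive cost")
    return issues

# Only defect-free records reach the modeling step; the rest are logged with reasons.
clean = [r for r in records if not validate(r)]
rejected = {r["order_id"]: validate(r) for r in records if validate(r)}
print(len(clean), rejected)  # 2 clean orders; orders 2 and 3 rejected with reasons
```

A gate like this is cheap compared to the cost of a model trained on bad inputs, and the rejection log doubles as a to-do list for fixing the upstream data source.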
Finally, the author makes some suggestions for identifying whether an analytics supply chain is in need of repair. If analytics projects are hindered by a lack of IT, data, and/or other scarce technical resources, there might be an issue in the analytics supply chain. If a firm’s ability to proceed with new analytical models is hindered by constant maintenance on older systems, there may be an analytics supply chain issue. If big data systems have been employed but the results don’t seem to justify the investment, perhaps it’s time to evaluate the analytics supply chain rather than scrapping big data projects because they appear to be a waste of money. Evaluations like this could save firms money in multiple ways, not to mention preserving the time already invested in preexisting projects.
Has anyone ever heard of evaluating the analytics supply chain?
Have you ever run across models that were too complex to implement?
Have you seen any instances where big data projects were scrapped because results were not produced fast enough?
Small business owners and entrepreneurs are starting to employ autonomous machines in collaborative and even artistic ways, and that may point to a new chapter of artisanal and small-batch manufacturing in the U.S. This report looks at the example of TLAC, a 2D and 3D design, print, and publishing shop in Toronto, Canada, that creates books for self-published authors.
The automation industry is seeing a shift in its labor force. As many current workers get ready to retire, a younger workforce has yet to arrive to take its place. In response to this shortfall, the advanced robotics market has grown significantly. Advanced robotic systems and collaborative robots are taking center stage at a time when manufacturing industries need them the most. This is a report based on findings by the Boston Consulting Group and others.
In the article Why Manufacturers Need Process Mining – A New Type of Data Analytics, the author extols the benefits of what he calls a new type of data analytics: process mining. Process mining can be used to reduce inventory costs, identify production bottlenecks, improve on-time delivery, and optimize logistics between production sites, distribution centers, and end clients. It’s hard to argue that process mining is actually a new type of data analytics for manufacturers, but it does follow a few of the key rules for successful big data usage. First, as the name implies, it concentrates on one process. Implementing successful big data projects normally requires concentrating on one area for improvement. The specificity of big data projects usually allows them to succeed, and the article looks at why process mining, and big data analytics as a whole, can succeed in saving manufacturers money.
Process mining specifically looks at a very important factor in manufacturers’ processes: KPIs, or key performance indicators. KPIs are exactly what they sound like: the main factors that measure the performance and overall success of a process or project. Process mining’s value lies in the fact that it makes one look at a process’s KPIs from the start. It also challenges the manufacturer to validate whether it is measuring the right KPIs and to understand whether the data it is gathering can be related back to those KPIs. Process mining, like all big data analytics, uses software to do the difficult work of visualizing the processes and highlighting specific variances impacting KPIs. One example used in the article is the examination of throughput times and the ability to identify that specific vendors are not meeting their lead-time commitments. These types of analyses and results are exactly what big data projects are meant to achieve.
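The article’s throughput-time example can be sketched in miniature. This is not any process mining product’s API; it is a toy Python version of the core idea, computing throughput time per case from an event log and flagging vendors that exceed a committed lead time. The case IDs, vendor names, dates, and the seven-day commitment are all invented for illustration.

```python
from datetime import datetime

# Toy event log in the style process mining tools consume:
# (case_id, activity, timestamp, vendor). All values are hypothetical.
log = [
    ("PO-1", "order placed",   "2018-03-01", "VendorA"),
    ("PO-1", "goods received", "2018-03-06", "VendorA"),
    ("PO-2", "order placed",   "2018-03-02", "VendorB"),
    ("PO-2", "goods received", "2018-03-15", "VendorB"),
    ("PO-3", "order placed",   "2018-03-04", "VendorB"),
    ("PO-3", "goods received", "2018-03-18", "VendorB"),
]
COMMITTED_LEAD_TIME = 7  # days, an assumed vendor commitment (the KPI threshold)

def throughput_days(events):
    """Throughput time of one case: days between its first and last event."""
    stamps = sorted(datetime.strptime(ts, "%Y-%m-%d") for _, _, ts, _ in events)
    return (stamps[-1] - stamps[0]).days

# Group events by case, then flag vendors whose cases blow the commitment.
cases = {}
for case_id, activity, ts, vendor in log:
    cases.setdefault(case_id, []).append((case_id, activity, ts, vendor))

late_vendors = set()
for events in cases.values():
    if throughput_days(events) > COMMITTED_LEAD_TIME:
        late_vendors.add(events[0][3])  # vendor of this case

print(sorted(late_vendors))  # ['VendorB']
```

Even this toy version shows the pattern the article describes: the KPI (lead time against commitment) is defined first, and the event data is only useful because it can be related back to that KPI.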
Another point the article makes is that process mining encourages manufacturers to identify inefficiencies and problems within the process; it encourages companies to embrace their issues. This mindset is absolutely necessary for continuous improvement, and it is key to any big data project. Embracing issues can be difficult, especially for older, ingrained processes, but it is the only way to eliminate them. It certainly is not the easiest thing to do, and it requires a humble mindset entering an improvement project. Process mining has all the key factors of successful big data software, and it could be very useful for manufacturers that want to embrace the big data revolution.
Do you believe addressing KPIs first is the best way to approach a big data project?
Do you think process mining is actually a new or different type of big data analytics or just a rebranding of basic big data?
Do you think that some manufacturers are reluctant to implement big data projects because they do not want to know their inefficiencies?