Industry 4.0 gets a curriculum developed by ASME and Autodesk https://www.engineering.com/industry-4-0-gets-a-curriculum-developed-by-asme-and-autodesk/ Fri, 07 Mar 2025 20:05:53 +0000 https://www.engineering.com/?p=137440 Six free courses benefit educators, engineering students and engineers.

Educators and engineering firms looking for training on Industry 4.0 have a new resource: a six-course curriculum for smart manufacturing created by the American Society of Mechanical Engineers (ASME) and Autodesk, Inc. The two organizations collaborated through 2023 and 2024 to compile interviews, analyze data and develop real-world examples for the set of six free online courses. The lessons cover evolving engineering skills, including artificial intelligence (AI) and robotics, design for sustainability, Industry 4.0, data skills, and business and digital literacy.

“We began this effort based on feedback from educators. Industry 4.0 is here and companies are struggling because they’re not workforce ready. Much of the knowledge being taught in the classroom is not geared to digital strategies,” says Pooja Thakkar Singh, program manager for the American Society of Mechanical Engineers. The courses are meant for mechanical engineers, manufacturing engineers and Computer Numerical Control (CNC) machinists. The sixth course contains examples of R&D work that show how to apply acquired knowledge to projects using Autodesk’s Fusion software. The overall goal of the curriculum is to empower educators and equip students and professionals with in-demand skills that will advance their careers and support modern manufacturing.

“The courses run between 30 and 45 minutes, with the underlay of the Fusion exercises being a little bit deeper. The software is very easy to use. This is advanced manufacturing,” says Debra Pothier, senior manager at Autodesk for architecture, engineering, construction and operations (AECO) strategy and a partnership owner for ASME.

For example, the design for sustainability course covers how value chains and supply chains impact the environment and how product lifecycles influence design solutions. It also explores the importance of the “Triple Bottom Line,” a framework that measures social, environmental and financial benefits.

A case study for the course, Evolving engineering skills, including AI, robotics and more with ASME. (Image: ASME and Autodesk)

“There were requests to make the courses as customizable as possible due to the changing landscape of Industry 4.0. We accomplished this by integrating PowerPoint slide decks and videos that faculty and instructors can switch out to keep up with the latest information,” says Singh.

Another way to customize the courses is to use Doodly, an animation program that can create dynamic virtual board drawings for videos.

“Then you can remake the courses by re-recording and re-downloading content as many times as you need,” says Singh.

Critical ingredients for the courses

One of the key components of the courses is the explanation of digital manufacturing skills that apply to mechanical engineering, manufacturing engineering and CNC machining, such as CAM, 2.5- and 3-axis milling and simulation.

“Doing this impactfully involved getting all the stakeholders together in one virtual classroom. We had to ask ourselves what students were looking for and what they were passionate about. We also had to cover the challenges that industry experts were facing. We didn’t know those until we heard that from the sources,” says Singh.

Another key component is a simultaneous focus on hard skills, such as data analysis, and soft skills, like collaboration.

“We demonstrate how Fusion software facilitates the communication of generative design AI outputs, bridging the gap between technical skill and practical application,” says Curt Chan, strategic partnerships manager for Autodesk.

A third key component is an explanation of the enormous impact of AI and how this tool affects the design process. In some situations, generative AI software can handle 90 percent of the programming required for machine part development. A mechanical engineer can then use their expertise to fine-tune the last 10 percent.

AI is like “a whole new toolbelt in a number of ways,” says Jason Love, technology communications manager for Autodesk.

Before AI was widely used, a mechanical engineer designing an assembly might have been required to learn how to draw a diagram of its parts.

“Now there are tools in place that, with the click of a mouse, create those 2D diagrams from your 3D models. It falls to the human engineer to double-check the accuracy of those drawings,” says Love.

Some educators may be unfamiliar with such changes or the new workflow itself. The courses address these problems by ensuring viewers grasp how many options AI creates.

“Faculty are going to have to teach the entire process, not their little silo,” says Pothier.

Introduction for the course: Evolving engineering skills, including AI, robotics and more with ASME. (Image: ASME and Autodesk)

How and why the curriculum works

The six courses are not tightly tied to one another, which gives educators the flexibility to “plug and play.” A manager could assign a course when an employee has downtime, or an educator could offer one for extra credit.

“Through conversations with educators, I’ve observed many different ways they are planning and implementing the curriculum,” says Chan.

The modules follow a loose sequential order. For example, the introductory course defines the term “Industry 4.0,” explains the driving forces behind production processes and outlines the challenges carried over from Industry 3.0. A later course on digital literacy and data skills gives participants an understanding of Industry 4.0 technologies and data measurements, along with the role of big data and how numerical insights drive manufacturing processes.

Each course has a self-assessment that learners can complete to earn a certificate. Participants can earn credit or mark their skills as upgraded after completing certain courses or the entire set.

One factor contributing to the popularity of the courses is the shift over the past five years toward online education, both synchronous and asynchronous, driven in part by the COVID-19 pandemic. Students and engineers have also become more well versed in online content and more motivated to apply the knowledge they draw from it.

ASME and Autodesk’s history of partnership

The Industry 4.0 curriculum is the latest result of ASME and Autodesk’s history of teamwork. The two organizations have been working together since 2021, when the Autodesk Foundation, Autodesk’s philanthropic arm, began donating funds to ASME’s Engineering for Change (E4C) research fellowship program.

As of late February 2025, the Autodesk Foundation has funded over 100 E4C fellowships to support nonprofits and startups in a range of fields. These include energy and materials development, health and resilience systems and work and prosperity opportunities. The donations to E4C have also expanded the reach and impact of Autodesk Foundation’s Impact internship program. That program connects individuals in the Autodesk Foundation portfolio with new engineers.

In 2022, ASME and Autodesk released the results of a collaborative multiphase research project on the future of manufacturing. The effort involved a research study conducted between August 2021 and May 2022, and the resulting report identified the future workflows and skills required for mechanical engineering, manufacturing engineering and CNC machinist roles.

“This was the project that was the basis for the six-course curriculum. The second phase of the project was the curriculum design and creation. We piloted the first four courses by launching a competition relating to sustainability and ocean clean-up,” says Singh.

The Autodesk-hosted event featured university teams designing an autonomous robot to clean up trash from the ocean. Students relied on skills they had learned from the courses.

One of the teams in the competition, Wissen Marinos, was formed of students from India’s National Institute of Technology Silchar. Wissen Marinos team captain Pratisruti Buragohain says participating in the competition enabled team members to develop problem-solving abilities, technical skills and soft skills.

“Despite facing various hurdles along the way, we tackled each one of them strategically and with meticulous determination. In essence, our experience throughout the competition bestowed upon us invaluable lessons, equipping us with enhanced design proficiency, research skills and efficient problem-solving strategies,” says Buragohain.

Additional steps for the curriculum have included the translation and localization of the courses into Japanese and German. ASME and Autodesk are tracking how widely the curriculum is used and asking what information students and engineers are learning from it.

“Any curriculum takes time. It’s going to take time to drive the use of this curriculum. That’s about keeping a pulse on the industry, hearing what they have to say and what Autodesk’s customers have to say,” says Chan.

Pothier says Autodesk is striving to close the skills gap and be a trusted partner to engineering firms.

“We give the underpinning of, ‘This is how you do it with Fusion, and we’re giving you modular pieces.’ We’re giving it to universities and firms in a way that students really want to consume it. Our team is very passionate because we feel if you’re going to be sending your kids to school, they need those skills today,” says Pothier.

View the courses at: https://www.autodesk.com/learn/ondemand/collection/asme-manufacturing-education-courses.

AI and Industry 5.0 are definitely not hype https://www.engineering.com/ai-and-industry-5-0-are-definitely-not-hype/ Mon, 24 Feb 2025 20:52:58 +0000 https://www.engineering.com/?p=137042 The biggest players in manufacturing convened at the ARC Industry Leadership Forum, and they were all-in on AI.

There is a lingering sentiment among the manufacturing community that the trends towards AI, digitalization and digital transformation (collectively referred to as Industry 5.0) are nothing more than marketing hype designed to sell new products and software.

Nothing could be further from the truth.

Granted, any new trend will always attract an element of bandwagon business from marginal players and hype-riders looking to benefit from it.

But in terms of how digital transformation and AI are being researched and implemented in the manufacturing industry, there is plenty of steak to go along with all that sizzle.

One of the best ways to distinguish between an over-hyped trend and something with substance is to watch who is watching it. A great place to see that in action was at the recent ARC Industry Leadership Forum, which took place in Orlando, Fla. February 10-13.

Nico Duursema, CEO of Cerilon, delivers his keynote address at the ARC Industry Leadership Forum in Orlando, Fla. (Image: ARC Advisory Group, taken from X, formerly Twitter)

This year’s event was almost entirely focused on AI, digital transformation and Industry 5.0 in manufacturing. It attracted more than 600 attendees representing some of the biggest companies in the manufacturing sector.

Indeed, the top 30 attending companies with publicly available financial numbers had a combined 2023 market cap of $4.22 trillion. If that figure were a country’s GDP, it would rank as the fourth-largest economy in the world, just behind Germany ($4.5 trillion) and ahead of Japan ($4.20 trillion). Most of these companies were users undergoing significant digital transformation initiatives.

The fact that these industrial heavyweights are already fully invested in implementing AI and digital strategies shows the scale of the opportunity, and the huge strategic risk of ignoring it—we’re talking Blockbuster Video-level strategic risk.

But the question remains: where do you begin, especially if you don’t have the capital and assets of these massive multinational businesses?

Everywhere, all at once

In the current state of things, engineering leaders can be easily overwhelmed with all the trends and challenges thrown at them. Mathias Oppelt, vice-president of customer-driven innovation at Siemens Digital Industries (Siemens is certainly a technology vendor, but also manufactures its products using the latest smart manufacturing principles), hears about this from his customers daily and summed it up nicely during his session at the ARC Forum:

“You need to act more sustainably; you need to have higher transparency across your value chain. Have you thought about your workforce transformation yet? There’s a lot of people retiring in the next couple of years and there’s not many people coming back into the workforce. You still must deal with cost efficiency and all the productivity measures, while also driving energy efficiency. And don’t forget about your competition—they will still be there. And then there’s all that new technology coming up, artificial intelligence, large language models, ChatGPT—and on it goes, all of that all at once.”

Sound familiar?

Even with all these challenges, everything must now be done at speed. “Speed and adaptability will be the key drivers to continuing success. You need to adapt to all these challenges, which are continuously coming at you faster. If you’re standing still, you’re almost moving backwards,” Oppelt said.

The answer is simple, offered Oppelt with a wry smile: just digitally transform. The crowd, sensing his sarcasm, responded with nervous laughter. It was funny, but everyone understood it was also scary, because no one really knows where to start.

Bite the bullet, but take small bites

“The continuous improvement engineers out there know how risky it can be to bite off more than the organization can chew or to try to drive more change than it can manage,” says Doug Warren, senior vice president of the Monitoring and Control business for Aveva, a major industrial software developer based in Cambridge, U.K.

“It helps to take bite-sized pieces, and maybe even use the first bite to drive some incremental benefit or revenue to fund the next bite and then the next bite. You can sort of see this self-funding approach emerge, assuming the business objectives and the metrics tied to those business objectives show results.”

Warren is puzzled by how slow a number of industrial segments have been to fully embrace digitalization and digital transformation, saying that “…it seems like everyone has at least dipped a toe or a foot into the water,” but the number of organizations that are doing it at scale across the whole enterprise is lower than most people would guess.

“The level of technological advancement doesn’t come as a big surprise, and where we go from here won’t be a big surprise. The trick will be how fast you get past the proof-of-concept and into full scale deployment,” he says.

From Warren’s perspective, if you’re not taking advantage of the digitalization process to fundamentally change the way you’re doing work, then you’re probably not getting as much value as you could.

“To just digitize isn’t enough. How do we change those work processes? How do we inject more efficiency into work processes to take advantage of the technological advancements you are already investing in? That’s the special sauce,” he says, conceding that it’s difficult because people typically prefer routine and structure. “That’s probably got a lot to do with the lack of real speed of adoption, because you still have to overcome the way you’ve always done it.”

Warren says a good way to look at it is like a more nuanced version of the standard continuous improvement initiatives companies have been undertaking for decades.

“Continuous improvement is incremental changes over time, whereas digital transformation provides at least an impetus for more of a step change in the way we perform work, whatever that work might be.”

What’s old is new again

One of the main points of hesitation towards full-scale implementation of digital transformation or AI initiatives is the perceived newness of it all and the uncertainty or risk associated with so-called “bleeding edge” technology.

The thing is, none of this is all that new. The concept of the neural network was developed in the 1940s and Alan Turing introduced his influential Turing Test in 1950. The first AI programs were developed in the 1950s. If you are a chess enthusiast, you’ve certainly played against AI opponents for the last 20 years. Most popular video games have had story lines fuelled by AI-powered non-player characters (NPCs) for almost as long.

What has changed over the last few decades is the amount of computing power available, the democratized access to that compute power through the cloud, and the speed provided by the latest advances in chips.

This growth of available computational power and technology can now be applied to all the improvements organizations have been trying to achieve with continuous improvement. And they are proving to be most effective when combined with the extensive knowledge found within companies.

“Industry definitely provides complexities because it’s not just AI and machine learning (ML). There’s also domain knowledge, so it’s really a hybrid approach,” says Claudia Chandra, chief product officer for Honeywell Connected Industrials based in San Francisco.

Chandra earned a Ph.D. in artificial intelligence and software engineering from UC Berkeley 25 years ago and has spent her career working with data, AI, edge platforms and analytics.

“I’m not for just AI/ML on its own. It’s really the domain knowledge that needs to be incorporated along with (AI’s) first principles. The accuracy would not be there without that combination, because data alone won’t get you there,” Chandra said.

“That tribal knowledge needs to be codified, because that gets you there faster and might complement what’s in the data. So, digitization is the precursor to AI/ML—you need to collect the data first in order to get to AI/ML,” she says, reiterating that it must be a step-by-step process to reduce risk.

Chandra says companies that have taken these incremental steps towards digitalization and embraced the cloud or even more advanced tech such as AI/ML will find that their digital transformation is no longer a behemoth with all the pain and risk that go with it. Plus, any vendor with a good understanding of the technology will provide at least a starting point—including pre-trained models—so companies don’t have to start from scratch. “But ultimately, as you train it more, as you use it more, it will get better with the data that’s specific to your company,” she says.

Certainly, the success of any AI-enabled digital transformation initiative is all about the underlying data and training the AI appropriately to get the required accuracy. But it takes several steps to set the conditions for value generation: Commit to a project; start small with the right use case; and be persistent and diligent with the data. Once you get a small victory, put the value and the experience towards the next project. With such an approach, you will soon learn why AI and Industry 5.0 are here to stay—and so will your competition.

What should you do about legacy systems? https://www.engineering.com/what-should-you-do-about-legacy-systems/ Thu, 13 Feb 2025 14:09:43 +0000 https://www.engineering.com/?p=136664 Obsolete hardware and software is the biggest barrier to digital transformation.

According to Deloitte, 62% of businesses saw legacy systems (out-of-date software and hardware) as a top barrier to digital transformation. The following case introduces our discussion of why, and what you can do about it:

Mark Bonds is the Financial Director of a medium-sized manufacturer of pump equipment for the mining industry. He has been following recent advances in artificial intelligence and is excited about the possibilities it offers for analysis of production data, which would enable costs to be better managed and more opportunities for process improvement to be identified.

Mark has shared his ideas with the company CEO and has just presented them to the management team. The response from the Chief Information Officer, Ahmed Shakria, was swift: the artificial intelligence-based systems that Mark proposed would need to be connected to the company’s existing technology in a range of areas, including its ERP system, with its financial, sales and operational data. This would not be simple – the ERP system had been significantly customised, which complicated integration with other technology.

The Operations and Supply Chain Director, Linda Lui, liked Mark’s idea but felt that its value would be limited if the only data it had came from the ERP system. She argued that much of the production equipment gathered data that was currently seldom used but would greatly increase the value of the artificial intelligence, as long as the technical challenges of integrating that data could be resolved.

Linda thought further about the operational impact of the changes being proposed. Some of the data the existing technology gathered from the operations equipment was very important in the work the machine operators and others did – providing them with feedback that was important in managing quality and productivity. If changes were made to the technology, would she lose this and would the finance department now be intervening in operational decisions with their new access to information? What would this mean for the workers – would their jobs change for the better by giving them more ability to control the output of their work or would they become more stressful as more data-based direction was given to them?

This case study provides a brief introduction to the issues raised in dealing with legacy systems. Often when legacy system changes are considered, most of the focus is on the technical issues – and these are often considerable. We examined the reasons for legacy system change failure in research we conducted for the University of Waterloo Watspeed Digital Transformation online certificate program. That research can be used to review your own legacy systems and develop your strategy for dealing with them as part of your digital transformation.

Your starting point in developing your legacy systems strategy is your business strategy, which should include your digital transformation objectives. These will determine the critical challenges you will face in your legacy systems. For example, your strategic priorities may include a desire for greater flexibility in manufacturing in response to the market, which may currently be limited by the inability of the legacy systems to accommodate rapid product changes. Your analysis of your legacy systems should be firmly based on your future needs.

We identified the common technical issues that are found in legacy systems when technological change is taking place. These will enable you to review your own legacy systems and consider the impact they will have on your digital transformation activity.

Maintenance Cost: The cost of the legacy system maintenance is high and unsustainable

Skills Shortages: Skills required to operate and maintain the legacy systems are in short supply

Technical Incompatibility: The legacy systems cannot be technically integrated with the new digital transformation technology

Security Issues: The legacy systems are vulnerable to attack or data loss, or are functionally unreliable

Regulatory Non-Compliance: The legacy systems are not capable of meeting mandatory regulatory requirements

Unscalable: The legacy systems cannot grow to meet the forecasted business need

Unsupported: The legacy systems are no longer supported by their vendors

Innovation Hindrance: The legacy systems limit innovations that the business would like to make now or in the future

Technical Debt: Weaknesses in the technology itself that make it harder to use, maintain and update

The technical challenges can be addressed by modifying the existing systems to a small or large extent, in a variety of ways, or by replacing them altogether. A description of the options available is summarised by Gartner.

The challenges associated with legacy systems are not just technical, though – they also stem from the impact that changes in information systems have on process and human elements, and these must be carefully considered in legacy system changes. Not doing so is a common cause of failure in the legacy-focused aspects of digital transformation projects.

Changes in legacy systems can have significant impacts on organisational processes that will influence support for and success of implementation. The process aspects that legacy system changes can impact include:

Process Complexity: Are the legacy systems changes making operational processes more or less complex?

Productivity: What is the impact on productivity – will it increase or decrease in the future?

Process Quality: Will changes to legacy systems improve product and / or service quality?

Process Agility: Will changes improve the ability of processes to respond to market driven changes or not?

Legacy system changes will also impact the human aspects of organisations. The areas this impact should be considered in include:

Skills: Are the skills necessary for operating and exploiting updated or new systems available within the organisation or can they be obtained easily by training and / or recruitment? Will skills changes increase or decrease the value of the skills of existing workers?

Job Security: Will there be an impact on employment numbers? If jobs will be lost, will this be sensitively managed?

Culture: Do the systems changes require changes in company culture? For example, do they require changes in levels of adherence to work instructions, role flexibility, agility, etc.?

Quality of working life: What will the impact be on the quality of the job experience of employees? Will they have more or less autonomy in their jobs? Will their jobs be more or less stressful? Reductions in quality of working life are likely to lead to resistance to technological change.

Management Capability: Is the existing management structure and capability appropriate for the changes being made to legacy systems and, if not, how will you address this?

Successfully overcoming the challenges that legacy systems pose requires a holistic approach that combines understanding of technical, process and human factors. A mainly technically focused approach will usually fail.

Sustainable dairy company picks Rockwell’s Plex system https://www.engineering.com/sustainable-dairy-company-picks-rockwells-plex-system/ Mon, 03 Feb 2025 16:11:15 +0000 https://www.engineering.com/?p=136308 New Zealand-based Miraka will use Plex to integrate its enterprise resource planning (ERP) systems

Miraka’s geothermal powered processing facility in Taupō, New Zealand. (Image: Miraka Ltd)

New Zealand-based Miraka, the world’s first dairy processor to get its power from renewable geothermal energy, has chosen smart manufacturing software Plex from Rockwell Automation to become even more efficient and sustainable.

Miraka will use Plex to integrate its enterprise resource planning (ERP) systems. ERP is a software system that helps organizations streamline and automate their core business processes—including financial management, human resources, supply chain, sales, and customer relations—across the entire enterprise.

Miraka says its use of geothermal energy helps it “emit 92% less manufacturing carbon emissions than traditional coal-fired factories, giving Miraka one of industry’s lowest global carbon footprints.”

The dairy company will use Plex to connect, automate, track and analyze its operations—from the pasture to the factory floor— to take its core values of excellence and innovation to the next level.

Robert Bell, Miraka CFO, calls Plex a “single source of truth,” with intuitive tools that will help Miraka optimize their business and operational performance by increasing efficiencies.

Plex supports Miraka’s goals to become even more resilient, agile and sustainable by offering a holistic view across the enterprise so Miraka can quickly respond to market demands and customer changes without interrupting production. The Plex software was built around the pillars of smart manufacturing, helping companies not only streamline their operations but also follow industry standards and grow their business more easily.

“Plex is a modular system, so it can grow and adapt as needs change in the future, allowing companies like us to remain agile and stay ahead of the competition,” added Bell.

How to build a business case for implementing IIoT in manufacturing https://www.engineering.com/how-to-build-a-business-case-for-implementing-iiot-in-manufacturing/ Tue, 19 Nov 2024 21:29:48 +0000 https://www.engineering.com/?p=134163 Here is a generic business case example for a medium-sized manufacturer’s IIoT strategy to optimize operations, increase efficiency, reduce downtime and enhance competitiveness.

As manufacturing becomes increasingly complex, the need for greater efficiency, flexibility and visibility in operations is more critical than ever. The Industrial Internet of Things (IIoT) offers a transformative solution to achieve these goals.

By implementing an IIoT strategy, a manufacturing facility can leverage real-time data from sensors, machines and other devices to drive smarter decision-making, improve equipment reliability and reduce costs.

The integration of IIoT will enhance operational efficiency, improve predictive maintenance and enable better resource allocation, resulting in a significant return on investment (ROI) and a competitive edge.

Editor’s Note: This is solely an example and is not meant to represent a specific IIoT implementation. Your situation is unique and will require you to identify your own technology options, investment level and strategic choices. Any dollar amounts mentioned are for demonstration only so readers can understand how to start planning for IIoT in their facility.

Problem Statement

Inefficiencies in production: Current manufacturing processes often suffer from inefficiencies such as unplanned downtime, suboptimal resource usage and bottlenecks in production lines.

Lack of real-time data: Operations are often based on historical data or periodic checks, leading to delays in decision-making. This prevents a proactive approach to changes or issues that could affect output or quality.

Unpredictable maintenance: Equipment failures and breakdowns are often unexpected, resulting in costly downtime and repairs. Maintenance is typically reactive, occurring after equipment failure.

Lack of evidence-based decision-making: The absence of real-time, actionable data makes it difficult for management to make fully informed decisions that can optimize production efficiency, resource allocation and cost management.

Proposed solution: IIoT implementation

The implementation of IIoT involves integrating smart sensors, connected devices and cloud-based analytics to gather and analyze data from various machines, production lines and equipment. This will allow the manufacturing facility to:

  • Monitor equipment performance in real-time.
  • Use predictive analytics to foresee and mitigate equipment failures.
  • Optimize the production process through continuous data collection and analysis.
  • Improve overall operational efficiency and decision-making.

Key benefits of implementing IIoT

Increased Operational Efficiency

Real-Time Monitoring: IIoT allows operators to monitor equipment status, production rates and environmental conditions in real time. This helps identify inefficiencies and prevent production delays.

Data-Driven Decisions: Continuous data collection enables better forecasting and resource planning. Managers can make more informed decisions about staffing, machine usage and inventory management.

Example: By using IIoT sensors to track the performance of machines in real-time, we can eliminate the need for manual inspections, reducing waste and speeding up production.

Predictive Maintenance

Reducing Downtime: IIoT enables predictive maintenance, where data from connected sensors can be analyzed to predict when a piece of equipment is likely to fail. Maintenance can then be scheduled before the failure occurs, reducing unplanned downtime.

Lower Maintenance Costs: By performing maintenance only when needed (instead of adhering to rigid schedules or waiting for equipment to fail), maintenance costs are reduced and the lifespan of equipment is extended.

Example: A vibration sensor on a motor can detect abnormal vibrations that indicate wear and tear. Maintenance can be performed before the motor fails, thus avoiding costly repairs and production downtime.
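
To make that idea concrete, here is a minimal Python sketch of a threshold-based vibration check: it computes a rolling RMS value from raw acceleration samples and flags a maintenance request when the level drifts above an alert limit. The window size, the 0.8 g threshold and the data source are invented for illustration and are not tied to any particular sensor vendor or platform.

```python
import math
from collections import deque

WINDOW_SIZE = 256          # samples per RMS window (assumed)
ALERT_RMS_G = 0.8          # hypothetical alert threshold, in g

def rms(samples):
    """Root-mean-square of a window of acceleration samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def monitor_vibration(sample_stream):
    """Flag a maintenance request when the rolling RMS exceeds the threshold.

    `sample_stream` is any iterable of acceleration readings (in g), for
    example values polled from an edge gateway -- a stand-in for a real
    data source.
    """
    window = deque(maxlen=WINDOW_SIZE)
    for sample in sample_stream:
        window.append(sample)
        if len(window) == WINDOW_SIZE and rms(window) > ALERT_RMS_G:
            # A real system would raise a CMMS work order or notify the
            # maintenance planner here instead of printing.
            print(f"Vibration RMS {rms(window):.2f} g above {ALERT_RMS_G} g "
                  "-- schedule motor inspection before failure.")
            window.clear()   # avoid repeated alerts for the same event

# Example with synthetic data: healthy readings followed by rising vibration.
if __name__ == "__main__":
    monitor_vibration([0.1] * 300 + [1.0] * 300)
```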

Improved Product Quality

Process Optimization: IIoT can help fine-tune production processes by continuously collecting data on product quality, temperature, pressure and other key factors. By monitoring these variables, the manufacturing process can be adjusted to maintain product quality within specifications.

Automated Quality Control: With IIoT-enabled sensors, defects can be identified early and corrective actions can be taken immediately, ensuring consistent product quality.

Example: Using IIoT sensors to track temperature and pressure during the manufacturing of a product can help prevent defects caused by improper conditions, ensuring higher product consistency.

Cost Savings

Energy Efficiency: By tracking energy consumption across the facility in real time, IIoT can help identify areas of excessive energy use and provide insights for optimization. This can lead to significant savings on utility costs.

Reduction in Waste: Real-time tracking of raw materials and finished goods can help identify and reduce waste throughout the manufacturing process, further cutting costs.

Example: With IIoT tracking energy usage, a manufacturing plant could optimize its equipment cycles to minimize energy consumption, leading to lower overall energy bills.

Enhanced Safety and Compliance

Safety Monitoring: IIoT sensors can monitor workplace conditions such as temperature, humidity, gas leaks and noise levels, ensuring compliance with safety regulations and preventing workplace accidents.

Regulatory Compliance: Real-time data collection and analysis help ensure that manufacturing processes comply with industry standards and regulations, minimizing the risk of fines and improving overall operational transparency.

Example: Sensors that monitor hazardous gas levels can trigger alarms if unsafe concentrations are detected, allowing for quick intervention and preventing potential accidents.

ROI and financial justification

Cost Savings from Reduced Downtime: Predictive maintenance can cut unscheduled downtime in half, leading to direct cost savings in repair and lost production time.

Energy Savings: Energy optimization could lead to 10-20% savings in energy consumption, depending on the scale of the implementation.

Increased Production Output: Real-time monitoring and optimization could increase production output by 5-15%, depending on the specific constraints within the manufacturing process.

Example of financial impact:

Scenario: Assume the facility’s current annual downtime costs are $1 million due to unplanned maintenance and equipment failure. With IIoT-enabled predictive maintenance, we can reduce downtime by 30%, saving roughly $300,000 per year.

Energy Costs: If the facility spends $500,000 annually on energy, a 10% reduction in energy costs due to IIoT optimization could save $50,000 per year.

Given an initial investment of $500,000 for IIoT infrastructure (sensors, analytics platforms, training, etc.), the ROI would be realized within 1-2 years depending on the scale of the savings and the facility’s existing operations.
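
To show how these illustrative figures roll up into a payback estimate, the short Python sketch below simply reproduces the arithmetic of the scenario. Every number in it comes from the example above and is a placeholder, not a benchmark.

```python
# Worked payback estimate using the illustrative figures from the scenario above.
annual_downtime_cost = 1_000_000   # current cost of unplanned downtime ($/yr)
downtime_reduction = 0.30          # assumed reduction from predictive maintenance
annual_energy_cost = 500_000       # current energy spend ($/yr)
energy_reduction = 0.10            # assumed savings from energy optimization
initial_investment = 500_000       # IIoT sensors, software, training ($)

downtime_savings = annual_downtime_cost * downtime_reduction   # $300,000/yr
energy_savings = annual_energy_cost * energy_reduction         # $50,000/yr
total_annual_savings = downtime_savings + energy_savings       # $350,000/yr

payback_years = initial_investment / total_annual_savings
print(f"Estimated annual savings: ${total_annual_savings:,.0f}")
print(f"Simple payback period: {payback_years:.1f} years")     # about 1.4 years
```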

IIoT implementation plan

Pilot Phase (3-6 Months): Start with a small-scale implementation in a single production line or equipment type.

  • Install sensors, connect devices to a central system and begin gathering real-time data.
  • Analyze data for initial insights and improvements in maintenance, efficiency and production.

Full-Scale Rollout (6-12 Months): Expand IIoT deployment to all critical equipment and production lines.

  • Integrate with existing ERP or MES systems for seamless data flow and analysis.
  • Provide training for staff and operators to maximize the utility of IIoT data.

Continuous Monitoring and Optimization (Ongoing): Continuously monitor system performance and adjust processes based on IIoT insights.

  • Review performance metrics regularly and adjust strategies for cost reduction and efficiency gains.

Risk mitigation

Initial Investment: The upfront costs of IIoT infrastructure may be a concern. However, the anticipated savings and ROI (estimated at 1-2 years) mitigate this risk.

Integration Challenges: IIoT integration with legacy systems may require initial technical adjustments. Partnering with experienced vendors and involving IT and operations teams early can help ensure smooth integration.

Change Management: Employees may be resistant to new technologies. To address this, comprehensive training and clear communication about the benefits of IIoT can help foster buy-in from staff.

Implementing an IIoT strategy will transform the manufacturing facility by improving operational efficiency, reducing downtime, lowering costs and enhancing product quality. The potential return on investment is significant, with savings in energy, maintenance and production output. By adopting IIoT, we will not only optimize our current operations but also position ourselves for future growth, industry challenges and potential disruptions.

Planning your spend

An initial investment of $500,000 for an IIoT implementation might be typical for a medium-sized manufacturing company. The exact spend will depend on many factors, such as the complexity of operations, the number of machines or production lines, the scope of the IIoT deployment and the desired level of sophistication in terms of sensors, analytics and integration with existing systems.

Here’s a breakdown of the types of manufacturing companies that might require such an investment:

1. Mid-Sized Manufacturing Companies (200–500 Employees)

Production Scale: Mid-sized companies typically have multiple production lines or facilities, often producing at a higher volume than smaller companies but on a smaller scale compared to large enterprises.

Infrastructure Needs: These companies may need to implement IIoT across several machines, production lines, or facilities. The $500,000 investment would cover sensors, data aggregation systems, analytics platforms and integration with existing manufacturing execution systems (MES) or enterprise resource planning (ERP) systems.

Complexity of Operations: The company may have a mix of automated and manual processes, requiring a balance of low-cost sensors (for simple machines) and more advanced sensors for critical equipment or complex processes.

ROI Consideration: A mid-sized company with multiple production lines could see significant cost savings from predictive maintenance, reduced downtime and energy optimization, making a $500,000 investment feasible.

2. Large Manufacturing Companies (500+ Employees)

Production Scale: Large manufacturing companies are likely to have extensive operations, with multiple plants, diverse production lines, or high-volume manufacturing.

Scope of Deployment: IIoT deployment in large companies would likely be comprehensive, covering various types of machinery, production lines and facilities. The company would also need sophisticated analytics platforms to manage the large volume of data generated by hundreds or thousands of sensors.

Advanced Integration: A large company would need to integrate IIoT solutions with their existing enterprise systems (ERP, MES, SCADA) for real-time data flow and centralized control. They would also likely have teams dedicated to managing and interpreting IIoT data.

Investment Justification: In a large-scale manufacturing operation, the $500,000 investment could be distributed across various areas, such as equipment, sensors, software platforms and training. The potential savings from predictive maintenance, efficiency gains and energy optimization would make the ROI quite attractive.

What Does a $500,000 IIoT Investment Cover?

For a mid-sized manufacturer, here’s a rough breakdown of what this initial investment might cover:

Sensors and Hardware (approx. $150,000–$250,000): The cost of smart sensors (temperature, pressure, vibration, humidity, flow meters, etc.) and edge devices to collect data from machines.

The number of sensors would vary based on facility size, production lines and equipment complexity.

Software and Analytics Platforms (approx. $100,000–$150,000): This includes platforms to analyze the data, manage operations, provide predictive insights and integrate with existing systems like ERP or MES. The cost would vary depending on the platform’s sophistication, the level of customization and the number of users.

Cloud Infrastructure / Data Storage (approx. $50,000–$75,000): Costs for cloud storage, data processing and potentially integrating AI/machine learning capabilities to analyze sensor data and generate actionable insights.

Integration with Existing Systems (approx. $50,000–$75,000): The integration of IIoT data with current manufacturing systems (like MES or SCADA) to create a seamless flow of information across the company. This may require custom software development and IT resources.

Training and Change Management (approx. $25,000–$50,000): To ensure that employees and managers understand how to use the new IIoT system effectively, training and ongoing support are critical. This cost includes both technical training for operators and broader organizational change management to ensure smooth adoption.

How Can a Smaller Company Justify This Investment?

If the company is smaller (around 100–200 employees) but still facing challenges like unpredictable equipment failure, downtime, or inefficient energy use, it might consider starting with a smaller pilot project or a scaled-down implementation to reduce the initial investment.

For example, they could start with a single production line or critical machines, focusing on predictive maintenance or energy optimization. As the company sees the return on investment (ROI), they could then scale up to other parts of the operation.

Key factors affecting IIoT investment

Size and Complexity of the Facility: Smaller production lines or simpler operations will require fewer sensors and devices, reducing the cost. Simpler machinery or fewer critical assets may only need basic sensors (e.g., temperature, vibration, or pressure sensors), which are less expensive than more advanced sensors or smart machines.

Scope of Deployment: A smaller company might choose to deploy IIoT to one production line or just a few key pieces of equipment (e.g., motors, pumps, or conveyors) to start, rather than a facility-wide implementation. Many small manufacturers start with a pilot project to test the benefits of IIoT on a smaller scale before expanding.

Type of IIoT Technology: The type of sensors, connectivity and software used will significantly impact the cost. Basic sensors (for monitoring temperature, pressure and simple on/off status) are relatively affordable. Advanced sensors (such as vibration, motion, or humidity sensors) and equipment with built-in IoT capabilities will increase the investment.

Cloud vs. On-Premise Solutions: Cloud-based platforms are often more affordable for small businesses because they typically require less upfront infrastructure and are billed as a subscription service. On-premise solutions that require significant IT infrastructure or hardware (e.g., servers, local data storage) can increase costs.

Level of Automation and Data Analytics: Basic data collection and monitoring systems without sophisticated analytics will be less expensive. Advanced analytics and predictive maintenance systems that use machine learning or AI will require more investment in software and data processing.

How does IIoT enable automation and control in manufacturing? https://www.engineering.com/how-does-iiot-enable-automation-and-control-in-manufacturing/ Thu, 14 Nov 2024 17:49:36 +0000 https://www.engineering.com/?p=134006 Here’s the basics on how IIoT helps manufacturers create an automated ecosystem, driving optimization and continuous improvement.

IIoT initiatives can go a long way toward improving the results of a manufacturer’s continuous improvement programs by enhancing real-time control and automation of manufacturing processes. This is achieved by providing an integrated, data-driven approach to operations that connects sensors, machines and devices to systems such as MES, PLM and ERP, among others.

As we’ve learned in this series of articles, IIoT involves the deployment of sensors and smart devices on machines and equipment throughout a manufacturing facility. These sensors continuously collect data such as temperature, pressure, speed, vibration, humidity and other critical performance metrics. This constant stream of real-time data allows operators to monitor machine health, detect performance deviations and measure production parameters instantly.

For example, a temperature sensor on an industrial oven can continuously monitor the baking process and relay that information to a central control system, allowing operators to adjust on the fly if the temperature deviates from the ideal range.

One of the key advantages of such a system in manufacturing is predictive maintenance. The real-time data from sensors not only helps manufacturers predict when a machine or component is likely to fail, it also helps develop a timeline of machine health to aid in long-term production scheduling. This is possible through techniques like vibration analysis, acoustic monitoring and thermal imaging, which detect signs of wear or malfunction.

By identifying potential issues early, manufacturers can perform maintenance only when needed—reducing unnecessary downtime and costly emergency repairs. This predictive capability enhances the automation of maintenance schedules, allowing for smoother, more continuous production runs.

The value of an IIoT regime isn’t limited to maintenance. When done properly, it enhances the automation of decision-making by integrating advanced data analytics with machine control systems. As data is collected in real time, algorithms analyze this information and automatically adjust equipment settings, production schedules or supply chain logistics.

For example, if a sensor detects a drop in the pressure of a hydraulic system, an IIoT-powered control system can automatically adjust the pressure to maintain optimal performance without human intervention, while signalling operators to warn them of a potential problem. This reduces human error, ensures consistent quality and increases the efficiency of manufacturing processes.
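
A minimal sketch of that kind of closed-loop correction, assuming a simple proportional adjustment and an escalation rule, might look like the Python snippet below. The setpoint, gain, alert rule and the read/adjust/notify callables are hypothetical stand-ins for whatever interface a real control system exposes.

```python
class PressureController:
    """Sketch of a proportional control loop that also escalates to an operator."""

    SETPOINT_BAR = 150.0   # desired hydraulic pressure (assumed)
    DEADBAND_BAR = 2.0     # no action inside this band
    GAIN = 0.5             # proportional gain for the valve correction
    ALERT_AFTER = 5        # consecutive out-of-band cycles before alerting

    def __init__(self, read_pressure, adjust_valve, notify_operator):
        # All three are callables supplied by the surrounding system --
        # placeholders standing in for a real control interface.
        self.read_pressure = read_pressure
        self.adjust_valve = adjust_valve
        self.notify_operator = notify_operator
        self.out_of_band_cycles = 0

    def step(self):
        """One control cycle: measure, correct automatically, escalate if needed."""
        pressure = self.read_pressure()
        error = self.SETPOINT_BAR - pressure
        if abs(error) > self.DEADBAND_BAR:
            self.adjust_valve(self.GAIN * error)   # automatic correction
            self.out_of_band_cycles += 1
            if self.out_of_band_cycles >= self.ALERT_AFTER:
                self.notify_operator(
                    f"Pressure holding at {pressure:.1f} bar despite correction "
                    "-- possible leak or pump wear."
                )
        else:
            self.out_of_band_cycles = 0            # back in range
```

The point of the pattern is that routine corrections happen automatically while persistent deviations are escalated to a person, which is exactly the split described above.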

Integrating machines and systems

IIoT enables interconnectivity between different machines and production lines, and can even link multiple factories or plants. This integration creates seamless communication between devices and systems, enabling data to flow between them in real time. As a result, manufacturers can coordinate complicated operations across the entire production process, from raw material handling to final product assembly.

If a machine on one production line detects a fault and stops, IIoT can signal other parts of the system to adjust, switch tasks or even reroute resources to prevent bottlenecks or downtime elsewhere in the system.

Enhanced supply chain visibility and automation

The follow-on to this is that manufacturers gain real-time insights into their supply chain operations. By integrating data from inventory systems, suppliers and logistics providers, manufacturers can track raw materials, monitor stock levels and predict demand fluctuations. This improves dynamic scheduling and just-in-time production, ensuring that materials are available when needed and avoiding overproduction or stockouts.

IIoT can also help automate inventory management through systems that track and reorder supplies autonomously, reducing manual inputs and the risk of human error.

Energy and resource optimization

Using these systems, manufacturers can continuously monitor energy consumption and resource utilization in real time. Data on electricity use, water consumption, air pressure and other utilities help identify inefficiencies and optimize resource allocation.

For example, by tracking the energy usage of different machines and automatically adjusting power consumption, engineers can ensure equipment is operating at peak efficiency, reducing waste and operational costs. Smart systems can also suggest or adjust energy consumption during idle times or change temperature settings in response to production demands.
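
One simple way such a suggestion could be generated is sketched below: per-machine energy readings tagged with a running or idle status are aggregated, and machines whose idle consumption exceeds a chosen share of their total are flagged as candidates for automatic shutdown or setback. The data layout and the 20% threshold are assumptions made purely for illustration.

```python
from collections import defaultdict

IDLE_SHARE_THRESHOLD = 0.20   # flag machines burning >20% of their energy while idle

def idle_energy_report(readings):
    """Summarize idle versus running energy per machine.

    `readings` is an iterable of (machine_id, kwh, status) tuples, where
    status is "running" or "idle" -- an assumed data layout. Returns a list
    of machine ids worth reviewing.
    """
    totals = defaultdict(lambda: {"running": 0.0, "idle": 0.0})
    for machine_id, kwh, status in readings:
        totals[machine_id][status] += kwh

    flagged = []
    for machine_id, kwh in totals.items():
        total = kwh["running"] + kwh["idle"]
        if total > 0 and kwh["idle"] / total > IDLE_SHARE_THRESHOLD:
            flagged.append(machine_id)
            print(f"{machine_id}: {kwh['idle']:.0f} kWh of {total:.0f} kWh "
                  "consumed while idle -- consider auto-shutdown or setback.")
    return flagged

# Example with made-up readings.
if __name__ == "__main__":
    idle_energy_report([
        ("press_01", 120.0, "running"), ("press_01", 45.0, "idle"),
        ("oven_02", 300.0, "running"), ("oven_02", 10.0, "idle"),
    ])
```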

Real-time quality control

As with many other aspects of manufacturing, IIoT can help turn quality control from a reactive process into a proactive one that operates in real time. Monitoring production parameters like temperature, speed and material composition during manufacturing allows manufacturers to detect deviations that could eventually lead to quality defects.

Sensors would measure the consistency of materials or operations like material removal in machining, detect faults in real time and automatically adjust the production line to correct these issues before defects occur, ensuring products meet the required specifications.
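
One common way to operationalize that kind of deviation detection is a statistical process control check: each new measurement is compared against control limits derived from a stable reference run, and an out-of-limits reading triggers a correction or a hold. The Python sketch below illustrates the idea in generic terms; the bore-diameter values, the three-sigma limits and the reaction callback are assumptions, not a description of any particular quality system.

```python
import statistics

def control_limits(reference_values, sigmas=3.0):
    """Compute mean +/- N-sigma control limits from an in-control reference run."""
    mean = statistics.fmean(reference_values)
    sd = statistics.stdev(reference_values)
    return mean - sigmas * sd, mean + sigmas * sd

def check_measurement(value, limits, react):
    """Return True if `value` is inside the control limits; otherwise call `react`.

    `react` is a callable supplied by the line's control system, e.g. adjust a
    setpoint or hold the line for inspection -- a placeholder in this sketch.
    """
    low, high = limits
    in_control = low <= value <= high
    if not in_control:
        react(f"Measurement {value:.3f} mm outside control limits "
              f"[{low:.3f}, {high:.3f}] -- adjust process or hold for inspection.")
    return in_control

# Example: bore diameters (mm) from a stable reference run, then a live reading.
if __name__ == "__main__":
    reference = [10.01, 9.99, 10.00, 10.02, 9.98, 10.01, 10.00, 9.99]
    check_measurement(10.12, control_limits(reference), react=print)
```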

Improved worker safety

IIoT can also enhance worker safety by automating dangerous tasks and monitoring environmental conditions in real time. These systems detect hazardous conditions such as toxic gases or abnormal machine vibrations, alerting workers to potential dangers. In more automated setups, IIoT allows for robots and machines to take over high-risk tasks, minimizing human exposure to dangerous environments.

In addition, IIoT-enabled wearables, such as safety vests or helmets with sensors, track workers’ vital signs and environmental conditions, ensuring their safety by triggering alarms or alerts if any dangerous situations arise.

Together, these capabilities significantly enhance real-time control and automation in manufacturing by enabling continuous monitoring, automated decision-making, predictive maintenance and real-time optimization. Integrating data across machines, systems and supply chains makes manufacturing processes more intelligent, efficient and adaptable. By driving automation, improving quality control, reducing costs and increasing productivity, IIoT is helping manufacturers stay competitive in a rapidly evolving industrial landscape.

Feedback loops

Data feedback loops are a critical concept in optimizing and adjusting processes dynamically, especially in the context of manufacturing. These loops involve continuously collecting data from various systems, processing it to generate insights and then using that information to manage the process. The objective is to maintain efficiency, improve quality, reduce waste and adapt to changing conditions.

The first step in a data feedback loop is the collection of real-time data. We have already learned this data can come from a variety of sensors on machinery, production equipment, supply chain systems, environmental sensors or even wearable devices used by workers.

Once data is collected, it’s sent to a central system where it is processed and analyzed. This may involve simple statistical analysis, machine learning algorithms or deep AI-driven analytics. For processes that require immediate adjustments, data is processed instantly at the edge, allowing for quick decision-making.

Some feedback loops use historical data and predictive algorithms to anticipate problems before they occur. A predictive maintenance system can use data on vibration levels and temperature changes to predict when a machine is likely to fail. Advanced analytics can identify patterns or trends in the data, such as recurring defects or inefficiencies. These insights allow companies to target specific areas for improvement.
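
As a toy example of that predictive step, the sketch below fits a straight line to recent daily vibration readings and extrapolates when the trend would cross a failure threshold, yielding a rough days-remaining figure a planner could act on. The readings and threshold are invented, and real systems use far richer models, but the principle of projecting a health indicator forward is the same.

```python
def days_until_threshold(daily_rms, threshold):
    """Fit a least-squares line to daily vibration RMS readings and estimate
    how many days remain until the trend crosses `threshold`.

    Returns None if no upward (degradation) trend is detected.
    """
    n = len(daily_rms)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(daily_rms) / n
    slope_num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, daily_rms))
    slope_den = sum((x - x_mean) ** 2 for x in xs)
    slope = slope_num / slope_den
    if slope <= 0:
        return None
    intercept = y_mean - slope * x_mean
    crossing_day = (threshold - intercept) / slope
    return max(0.0, crossing_day - (n - 1))   # days beyond the latest reading

# Example with invented readings (RMS in g) creeping upward over two weeks.
if __name__ == "__main__":
    readings = [0.30, 0.31, 0.33, 0.32, 0.35, 0.36, 0.38,
                0.40, 0.41, 0.43, 0.45, 0.46, 0.48, 0.50]
    remaining = days_until_threshold(readings, threshold=0.80)
    if remaining is not None:
        print(f"Estimated days until the alarm threshold is reached: {remaining:.0f}")
```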

Decision-making and adjustment

Based on the analysis, decisions are made on how to adjust or optimize the process in real-time. These adjustments can be made by human operators or automated control systems that directly change operational parameters without human intervention.

In an automated system, once a problem is detected or an optimization is identified, the system can adjust itself. For example, if a temperature sensor on a furnace shows that the temperature is too high, the control system can automatically reduce the heat. Similarly, in a factory, a machine might speed up or slow down based on real-time demand or product quality measurements.

In some cases, a feedback loop will alert a human operator about an issue, but the operator will make the decision on how to proceed, such as when a sensor detects a quality issue with a product.

The key advantage of data feedback loops is their ability to drive process optimization continuously. Over time—as the system collects more data—it improves its ability to make more accurate predictions, identify inefficiencies and adjust processes more effectively.

At a glance: a feedback loop in manufacturing

To explore the basic concept of data feedback loops in manufacturing, consider a smart factory scenario with a robotic assembly line, as described below:

Data Collection: Robots on the assembly line are equipped with sensors to monitor the position, speed, and performance of each part as it moves through the production process.

Data Processing: As the sensors collect data on each part, this information is fed into an analytics platform. The system compares the real-time data with pre-set performance targets, such as the desired part speed, quality standards, and cycle times.

Decision-Making and Adjustment: If the system detects that a part is not being assembled correctly (e.g., an incorrect part placement or missing component), the feedback loop triggers an automatic adjustment, such as slowing down the robot or stopping the line for a quality check. Alternatively, if the system notices that parts are moving too slowly, it can increase the robot speed to meet production goals.

Optimization: As the system continues to gather data and make adjustments, it identifies trends and optimizes the process. For example, it might recognize that certain parts are consistently experiencing defects and automatically adjust the settings to improve alignment or material flow.
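
A compressed version of the data-processing and decision-making steps of that loop might look like the Python sketch below, which compares one cycle's measurements against preset targets and maps them to a line action. The field names, targets and tolerances are invented for illustration and stand in for whatever the analytics platform actually exposes.

```python
# Hypothetical targets for one station; real values would come from the MES.
TARGETS = {"cycle_time_s": 12.0, "placement_error_mm": 0.20}
CYCLE_TIME_TOLERANCE = 0.10      # allow +/-10% of the target before acting

def decide_action(measurement):
    """Map one cycle's measurements to a line action.

    `measurement` is a dict such as {"cycle_time_s": 13.1, "placement_error_mm":
    0.05, "component_present": True} -- an assumed schema for this sketch.
    """
    if not measurement.get("component_present", True):
        return "stop_for_quality_check"          # missing component detected
    if measurement["placement_error_mm"] > TARGETS["placement_error_mm"]:
        return "slow_down"                       # give the robot more settling time
    if measurement["cycle_time_s"] > TARGETS["cycle_time_s"] * (1 + CYCLE_TIME_TOLERANCE):
        return "speed_up"                        # falling behind the production rate
    return "continue"

# One pass of the loop with synthetic data.
if __name__ == "__main__":
    reading = {"cycle_time_s": 13.6, "placement_error_mm": 0.05,
               "component_present": True}
    print(decide_action(reading))                # -> "speed_up"
```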

Data feedback loops are particularly valuable because they provide real-time adaptability. Manufacturing environments are dynamic, and changes in demand, raw materials, or equipment conditions can occur quickly. A well-designed feedback loop ensures that the system can respond immediately to these changes, keeping production smooth and efficient.

Closing the loop

Once adjustments are made, the feedback loop continues by monitoring the results of those changes and further refining the process. If the adjustment improves performance, the loop continues to operate as usual. If it causes a problem or doesn’t improve performance, the loop learns from that data, adjusting its predictions and recommendations.

By enabling real-time data collection, analysis and automated adjustments, IIoT systems give manufacturers the information required to use feedback loops in a more nuanced, targeted manner to continuously improve performance, maintain high levels of quality, and adapt to changing conditions.

Using AI at the edge to connect the dots in IIoT https://www.engineering.com/using-ai-at-the-edge-to-connect-the-dots-in-iiot/ Fri, 08 Nov 2024 18:40:40 +0000 https://www.engineering.com/?p=133768 Edge AI expert Jack Ferrari shares his insights on why this tech is a good fit for manufacturing operations.

For manufacturing facilities, an IoT strategy for equipment data collection and analysis is an essential step toward digital transformation, providing the data required to generate data-driven insights, predictive maintenance and other benefits. However, connecting machines and equipment to the internet raises challenges for IT teams, including security concerns, data storage, bandwidth and computing power.

To tackle these challenges, many teams weigh whether to process data at the edge or in the cloud. The edge offers benefits such as processing speed, low bandwidth requirements and security, while cloud solutions have historically offered unmatched computing power. For complex workloads such as AI models, the cloud may seem like the only option. However, vendors like MathWorks are proving that AI at the edge can provide the best of both worlds.

Engineering.com recently spoke with Jack Ferrari, Edge AI Product Manager at MathWorks, to learn more about Edge AI and how manufacturers use it.

Engineering.com (Eng.com): What are the benefits of edge devices compared to ‘dumb’ sensors that just send data straight on to a PC or to the cloud?

Jack Ferrari (JF): Running AI models locally on edge devices instead of the cloud brings several benefits. First, the inference time (or response time) of the model can be greatly reduced, as data no longer has to be shuttled back and forth over the Internet. Second, and for the same reason, edge AI enhances data privacy (all data fed to and from the model stays on the device) and makes applications more reliable (less prone to network outages). Third, edge AI can lower costs by reducing or eliminating cloud hosting and storage fees.

Eng.com: What trends and technologies drive edge AI’s adoption across industries?

JF: The large and rapidly growing number of IoT devices across industries (expected to reach 40 billion by 2030) is generating massive amounts of data at the edge, driving the need for local processing to handle data efficiently, reduce latency and lower cloud costs. Advancements in hardware (like AI accelerators) and software (like new model compression techniques) are working in tandem to enable the adoption of edge AI.

Eng.com: Do you think industry-wide adoption of edge technology is driven by devices becoming more cost effective, energy efficient and/or powerful, or are edge trends driven by other, more strategic factors?

JF: A combination of the two is influencing edge AI’s adoption. On the technology side, new hardware platforms are being designed with AI workloads in mind. Recent advancements in microcontrollers (MCUs), digital signal processors (DSPs) and AI accelerators such as neural processing units (NPUs) are enabling the deployment of models that were previously impossible to consider running at the edge. Besides simply having greater horsepower, these new chips are being optimized to execute AI workloads with greater energy efficiency. At the same time, the ecosystem of software tools used to compress AI models and program them on edge devices is becoming more robust and user-friendly, making the technology more accessible. Strategically, edge AI is enabling companies to differentiate their products in new ways: for example, by adding real-time processing and decision-making capabilities, enhancing device security by handling all data processing locally and enabling the personalization of AI models through techniques like on-device learning.

Eng.com: In many industries, security and IP concerns hold back adoption of AI tools. Is this seen in manufacturing?

JF: Security and IP concerns can indeed impact AI adoption in manufacturing. However, processing sensitive data at the edge (close to where it originates), rather than transmitting it to the cloud, can reduce exposure to potential breaches, offering a way to address these concerns.

Eng.com: What benefits can engineers expect when using edge AI?

JF: There are four primary benefits to using edge AI:

  • Lower latency: AI models can deliver predictions and classifications more quickly, which is crucial for engineers working on time-sensitive applications. This rapid response can enhance user experience and enable real-time decision-making, particularly in scenarios where milliseconds matter, such as autonomous vehicles or live data monitoring.
  • Lower costs: Reducing data transmission and storage fees, along with improved energy efficiency, leads to significant cost savings. For engineers, this means more budget can be allocated to other critical projects or resources and they can ensure their systems have higher uptime and availability, even during network outages, thus maintaining service continuity.
  • Enhanced privacy: By processing incoming data on-device instead of transmitting it to the cloud, engineers can ensure higher levels of data privacy and security. This is particularly beneficial in industries where sensitive information is handled, as it reduces the risk of data breaches and ensures compliance with privacy regulations, making it easier to protect user data.
  • Improved reliability: As edge AI does not rely on continuous cloud connectivity, it can continue to function during network outages. This ensures that critical operations, like monitoring and control systems in manufacturing, remain active even if the cloud connection is lost.

Eng.com: What are common challenges associated with edge AI?

JF: While it’s becoming easier to implement AI models on the edge, organizations should be mindful of several challenges that accompany the technology:

  • Resource constraints: Edge devices typically have limited processing power, memory and storage. Complex AI models may run slowly or not at all. To mitigate this, take care to select model architectures that are well-suited to the edge device they will eventually be deployed to. Additionally, models can be further optimized for edge deployment with compression techniques like projection, pruning and quantization (see the sketch after this list).
  • Model deployment: Translating AI models from the high-level languages where they are defined and trained (like Python or MATLAB) to low-level languages that can be compiled to run on edge devices (like C or C++) can be challenging. MathWorks tools facilitate this process by automating the conversion, ensuring efficient deployment on a diverse range of hardware. For example, Airbus used GPU Coder to deploy deep learning models, trained in MATLAB for defect detection, onto embedded GPUs. GPU Coder automatically translated their MATLAB code into the corresponding CUDA code, which could be compiled and run on their embedded system.
  • Model maintenance: After deploying AI models to the edge, organizations should have a plan for keeping them updated over time. This can take several forms:
  • Over-the-air (OTA) updates, where new model files and weights are sent to edge devices over a network connection.
  • On-device training (or, incremental learning), where models are updated and refined directly on the device using local data, allowing for personalization without the need to communicate with the cloud.
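To make the compression point concrete, here is a minimal post-training dynamic quantization sketch in PyTorch. It is one common compression route (MathWorks offers analogous tooling in MATLAB, which is not shown here), and the three-layer model is a stand-in rather than a production network.

```python
# Minimal sketch of post-training dynamic quantization in PyTorch.
# The model below is an illustrative stand-in, not a real edge workload.
import torch
import torch.nn as nn

model = nn.Sequential(          # illustrative float32 model
    nn.Linear(16, 64),
    nn.ReLU(),
    nn.Linear(64, 4),
)

# Replace Linear layers with int8 dynamically quantized versions,
# shrinking weights roughly 4x and typically speeding up CPU inference.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 16)
print(quantized(x).shape)  # same interface, smaller deployed footprint
```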

Eng.com: Are there edge AI use cases that are applicable across multiple industries?

JF: Beyond classic examples like image classification, object detection and semantic segmentation, one interesting application of edge AI that MathWorks is seeing across industries is virtual sensors (or software sensors). AI-based virtual sensors infer quantities that are difficult or expensive to measure directly by analyzing data from other sensors in real time. One great example is estimating the state of charge of a battery. While difficult to measure directly, it can be inferred from other, more easily attainable values, like current, voltage and operating temperature. By using AI models trained on historical battery performance data, the virtual sensor can predict the state of charge more accurately and adapt to changes in battery health and usage patterns, providing real-time insights without the need for additional hardware. Virtual sensors are applicable to multiple industries, including automotive, aerospace, manufacturing and healthcare. As another example, Poclain Hydraulics used MATLAB to design and deploy a neural network-based virtual sensor for monitoring the temperature of motors used in power machinery.
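A minimal sketch of the virtual-sensor idea in Python, using scikit-learn and synthetic training data; the toy cell model, readings and model choice are assumptions for illustration, not the MATLAB workflow Ferrari describes.

```python
# Minimal virtual-sensor sketch: estimate battery state of charge (SoC)
# from current, voltage and temperature. Training data is synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 2000

# Synthetic history: [current_A, voltage_V, temperature_C] -> SoC fraction.
current = rng.uniform(-30, 30, n)
temperature = rng.uniform(5, 45, n)
soc = rng.uniform(0.05, 1.0, n)
voltage = 3.0 + 1.2 * soc - 0.005 * current + rng.normal(0, 0.01, n)  # toy cell model

X = np.column_stack([current, voltage, temperature])
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
model.fit(X, soc)

# Inference on the edge device: estimate SoC from live readings.
live_reading = np.array([[4.0, 3.9, 25.0]])   # current, voltage, temperature
print(round(float(model.predict(live_reading)[0]), 3))
```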

Eng.com: Do you think the trend toward AI and cloud-based IoT systems makes custom-built systems into dinosaurs? In other words, would a manufacturer be ‘crazy’ to consider building a solution in-house?

JF: While AI and cloud-based IoT systems offer scalable and cost-effective solutions, the decision to build a system in-house depends on a manufacturer’s specific needs and capabilities. Companies with specialized requirements or strong internal expertise may benefit from custom solutions, while others might prefer the speed and lower upfront costs of cloud-based platforms. Ultimately, the choice hinges on factors like customization, security and time to market.

Eng.com: As the complexity of the devices we use to monitor and maintain our equipment increases, is there a growing need to monitor and maintain the edge and IT devices as well? How do we do that?

JF: Yes, as the complexity and number of AI-enabled edge devices increases, so does the need for monitoring and maintenance. Over time, the input data to AI models can drift or differ significantly from the data they were originally trained on, negatively impacting model accuracy and performance. Organizations should anticipate this and consider approaches to continuously update their models, whether through OTA updates or incremental learning.

For more on edge AI, check out https://www.engineering.com/edge-ai-solutions-every-engineer-should-know-about.

How should I design my IIoT architecture? https://www.engineering.com/how-should-i-design-my-iiot-architecture/ Tue, 05 Nov 2024 14:39:30 +0000 https://www.engineering.com/?p=133620 The goal for companies starting out should be flexibility, interoperability and incremental investment.

At the heart of an effective IIoT system is a modular architecture. This approach allows manufacturers to implement plug-and-play devices, which can be added or removed as necessary. For example, if a facility wishes to introduce a new type of sensor to monitor machine performance, it can do so without overhauling the entire system. This flexibility enables incremental upgrades that enhance capabilities progressively.

In addition, adopting a microservices architecture means that each component of the IIoT system operates independently. If a particular service—such as data collection or processing—needs improvement, it can be scaled or replaced without affecting the entire infrastructure. This targeted enhancement ensures that the system evolves alongside operational needs, fostering innovation and responsiveness.

Flexible data management

As data volumes increase, flexible data management becomes essential. Leveraging cloud solutions allows manufacturers to tap into virtually limitless data storage and processing capabilities, accommodating the influx of information from a growing number of IIoT devices. This scalability ensures that data can be collected and analyzed efficiently, supporting informed decision-making.

Moreover, integrating edge computing allows for local data processing. By analyzing data closer to where it is generated, manufacturers can reduce latency and bandwidth demands, resulting in quicker response times and more efficient analytics. This setup is particularly beneficial for real-time applications, where immediate insights can drive operational improvements.

Interoperability

To maximize the benefits of IIoT, interoperability is crucial. When manufacturers adopt standard communication protocols such as MQTT or OPC UA, new devices can integrate seamlessly with existing systems. This standardization reduces compatibility issues and simplifies the addition of new technologies.

Open APIs further facilitate integration by connecting diverse applications and devices. This approach not only enhances the system’s scalability but also promotes innovation by enabling third-party developers to contribute new functionalities.
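As a concrete example of the protocol side, the sketch below publishes a sensor reading over MQTT using the paho-mqtt library. The broker address and topic hierarchy are hypothetical, and the 1.x constructor form is shown (paho-mqtt 2.x additionally expects a CallbackAPIVersion argument to Client()).

```python
# Minimal sketch of publishing a sensor reading over MQTT with paho-mqtt.
# Broker address and topic naming scheme are hypothetical.
import json
import time
import paho.mqtt.client as mqtt

BROKER = "broker.plant.example.com"          # hypothetical on-site broker
TOPIC = "plant1/line3/press07/temperature"   # hypothetical topic hierarchy

client = mqtt.Client()  # paho-mqtt 2.x: mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.connect(BROKER, 1883, keepalive=60)

reading = {"ts": time.time(), "value_c": 74.2, "unit": "C"}
client.publish(TOPIC, json.dumps(reading), qos=1)
client.disconnect()
```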

Adaptive network infrastructure

A robust networking infrastructure is essential for supporting the growth of IIoT systems. Investing in scalable solutions, such as 5G or private LTE, ensures the network can handle a growing number of connected devices without sacrificing performance. These high-capacity networks facilitate rapid data transfer, which is critical for real-time operations.

Mesh networking can also enhance connectivity. As the number of IIoT devices increases, a mesh network can improve reliability and coverage. The devices can communicate more effectively with each other and with central systems.

User-centric design

A focus on user-centric design is essential for an IIoT system to be accessible and useful. Developing intuitive interfaces enables users to interact with complex data and analytics. As new functionalities and devices are integrated, these interfaces should remain adaptable.

Customization options allow users to tailor their dashboards and data presentations. This flexibility ensures employees can concentrate on the metrics that matter most to their specific roles, enhancing productivity and engagement.

Incremental investment

By planning for phased implementation, organizations can gradually adopt IIoT technologies, assessing results and adjusting the strategy based on initial deployments. This method reduces the risk associated with large-scale changes and enables organizations to learn and adapt as they progress.

Starting with pilot programs provides an opportunity to test scalability in real-world conditions. These initial tests inform future investments and expansions, ensuring that the overall strategy aligns with operational goals.

Collaboration and ecosystem engagement

To keep pace with technological advancements, manufacturers must engage in collaboration and ecosystem engagement. Partnering with technology providers and stakeholders ensures that the IIoT ecosystem can evolve together, sharing insights and best practices.

Active community engagement in industry forums and engineering-focused websites such as Engineering.com helps manufacturers stay updated on emerging technologies and methodologies that facilitate scaling. By participating in these discussions, organizations can learn from the experiences of others and implement strategies that drive success.

Training and support

As systems evolve, training and support become critical. Providing continuous training for staff ensures that employees can effectively navigate new technologies and systems. This investment in human capital is essential for maximizing the benefits of IIoT.

Additionally, ensuring access to technical support helps organizations address challenges that arise during scaling. Support teams can assist with integration and troubleshooting, allowing manufacturers to focus on their core operations.

Feedback and iteration

Establishing feedback mechanisms is crucial for ongoing improvement. By collecting input from users and stakeholders, manufacturers can implement iterative enhancements to their IIoT systems as they scale. This feedback loop fosters a culture of continuous improvement and adaptation.

Encouraging adaptability within teams is also vital. By promoting a culture that embraces innovation and is open to new ideas, organizations can optimize their IIoT implementations over time, ensuring they remain responsive to evolving operational needs.

The breakdown

Here’s a simplified (but not simple) breakdown of the typical layers in this type of modular architecture (a toy code sketch follows the breakdown):

1. Device Layer (Edge Layer)

Components: Sensors, actuators and other smart devices.

Function: Collects data from machinery and equipment and may perform local processing to filter or aggregate data before transmission.

2. Connectivity Layer

Components: Communication protocols and network infrastructure.

Function: Facilitates communication between devices and central systems using wired (e.g., Ethernet) or wireless technologies (e.g., Wi-Fi, Bluetooth, LoRaWAN, cellular).

3. Data Ingestion Layer

Components: Gateways and edge computing devices.

Function: Manages the transmission of data from edge devices to cloud or on-premises servers, handling data aggregation and initial processing.

4. Data Processing and Analytics Layer

Components: Cloud or on-premises servers equipped with data analytics and machine learning tools.

Function: Analyzes the ingested data for insights, predictive maintenance, and operational optimization, utilizing advanced algorithms and models.

5. Storage Layer

Components: Databases and data lakes.

Function: Stores historical data for analysis, reporting and compliance, supporting both structured and unstructured data types.

6. Application Layer

Components: User interfaces, dashboards and applications.

Function: Provides tools for visualization, reporting, and user interaction, enabling stakeholders to make informed decisions based on data insights.

7. Security Layer

Components: Security protocols, encryption and access controls.

Function: Ensures data integrity and confidentiality, protecting the system from cyber threats and unauthorized access at all layers.

8. Integration Layer

Components: APIs and middleware.

Function: Enables integration with existing enterprise systems (like ERP, MES, and SCADA) for seamless data flow and operational coherence.
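To make the flow between layers tangible, the toy sketch below maps a few of them to Python functions with simulated readings. In a real deployment, the layer boundaries would be separate devices, services and protocols rather than function calls.

```python
# Toy sketch mapping a few of the layers above to code, with simulated
# readings; thresholds and values are illustrative.
import random
import statistics

def device_layer():
    """Device layer: a sensor produces raw readings."""
    return [round(random.uniform(60.0, 90.0), 1) for _ in range(10)]

def ingestion_layer(readings):
    """Data ingestion layer: aggregate at the gateway before transmission."""
    return {"min": min(readings), "max": max(readings),
            "mean": round(statistics.mean(readings), 2)}

def analytics_layer(summary, high_limit=85.0):
    """Processing/analytics layer: flag values outside operating limits."""
    return {"alert": summary["max"] > high_limit, **summary}

def application_layer(result):
    """Application layer: surface the insight to a user or dashboard."""
    status = "ALERT" if result["alert"] else "ok"
    print(f"[{status}] temperature mean={result['mean']} max={result['max']}")

application_layer(analytics_layer(ingestion_layer(device_layer())))
```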

Wrap up

This layered modular architecture provides flexibility and scalability, allowing manufacturers to implement IIoT solutions tailored to their specific needs. By clearly defining each layer’s role, organizations can enhance interoperability, maintain security, and ensure that data flows effectively from devices to actionable insights. This structure facilitates incremental upgrades and the integration of new technologies as they become available.

How to plan data collection, storage and visualization in an IIoT deployment https://www.engineering.com/how-to-plan-data-collection-storage-and-visualization-in-an-iiot-deployment/ Mon, 21 Oct 2024 19:32:23 +0000 https://www.engineering.com/?p=133070 Be sure to consider scalability and future-proofing to accommodate evolving manufacturing processes and technologies.

When it comes to an IIoT (Industrial Internet of Things) implementation in manufacturing, data collection, storage, analytics and visualization are the core backplane that drives actionable insights and enables smarter operations.

How do these components typically align in an IIoT system, and what considerations should a manufacturing engineer keep in mind when planning an implementation? It can certainly get complicated, but breaking things down into smaller parts makes it more manageable.

Data Collection

The effectiveness of data collection largely depends on sensor architecture. Depending on the equipment or process, various types of sensors (temperature, pressure, vibration, etc.) need to be deployed across critical points in the manufacturing process. Ensure sensors are selected with appropriate accuracy, environmental tolerance and response time for the specific application.

A data acquisition system (DAS) acts as an interface between these sensors and the IIoT platform. It gathers real-time data from sensors and transmits it to the edge or cloud infrastructure. The big decision here is whether to use edge processing (local data pre-processing) or rely on centralized data gathering at the cloud level. Edge processing offers lower latency, making it ideal for real-time tasks. It also reduces bandwidth needs by processing data locally. However, it requires more upfront investment in hardware and can be harder to scale. In contrast, cloud processing handles large data volumes more easily and scales better, though it comes with higher latency and ongoing costs for bandwidth and storage. Cloud systems also need robust security measures for data transmission. A hybrid approach combining both edge and cloud processing can balance real-time processing with scalable, centralized data management, but the right mix depends on each application and the desired outcomes.

The next big decision is to determine the optimal sampling rate. Too high of a sampling frequency can overwhelm your storage and bandwidth, while too low may miss critical insights, particularly in dynamic manufacturing processes. Work with process engineers to determine the data sampling frequency based on process variability. Test this often to ensure what you think is the optimal sampling rate isn’t leaking potential value.
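As a quick sanity check, the sketch below applies the Nyquist floor plus a common oversampling rule of thumb; the 2 Hz process dynamic and the 10x factor are illustrative assumptions, not universal requirements.

```python
# Minimal sampling-rate sanity check. Assumes you can estimate the fastest
# process dynamic of interest (here, a hypothetical 2 Hz oscillation).
def recommended_sample_rate_hz(fastest_dynamic_hz, oversample_factor=10):
    nyquist_minimum = 2 * fastest_dynamic_hz      # absolute floor (Nyquist)
    return max(nyquist_minimum, fastest_dynamic_hz * oversample_factor)

fastest_dynamic_hz = 2.0
rate = recommended_sample_rate_hz(fastest_dynamic_hz)
samples_per_day_per_sensor = rate * 60 * 60 * 24
print(rate, "Hz ->", int(samples_per_day_per_sensor), "samples/day per sensor")
```

Running the numbers this way also makes the storage and bandwidth impact of each sensor explicit before deployment.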

If you are going to base major decisions on the insights gained through this IIoT system, you must ensure the integrity of the collected data. This means that error checking (e.g., using checksums or hashing) and redundancy mechanisms (e.g., backup data paths or local buffering) are in place to handle network failures or sensor malfunctions.

A checksum is a small-sized piece of data derived from a larger set of data, typically used to verify the integrity of that data. It acts as a digital fingerprint, created by applying a mathematical algorithm to the original data. When the data is transmitted or stored, the checksum is recalculated at the destination and compared with the original checksum to ensure that the data has not been altered, corrupted or tampered with during transmission or storage.

Hashing is the process of converting input data into a fixed-size string of characters, typically a unique value (hash), using a mathematical algorithm. This hash is used for verifying data integrity, securing communication, and enabling fast data retrieval, with each unique input producing a unique hash.
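A minimal sketch of both techniques using Python's standard library (zlib for a CRC32 checksum, hashlib for SHA-256); the payload is a made-up sensor message.

```python
# Minimal checksum and hash verification for a transmitted payload.
import hashlib
import zlib

payload = b'{"sensor":"press07","value_c":74.2}'

# Sender side: compute a lightweight checksum and a cryptographic hash.
crc = zlib.crc32(payload)
digest = hashlib.sha256(payload).hexdigest()

# Receiver side: recompute and compare to detect corruption or tampering.
received = payload  # in practice this arrives over the network
assert zlib.crc32(received) == crc, "CRC mismatch: data corrupted in transit"
assert hashlib.sha256(received).hexdigest() == digest, "Hash mismatch"
print("Payload verified:", digest[:16], "...")
```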

When planning sensor deployment, focus on critical assets and key process variables that directly impact production efficiency, quality or safety. Implementing a hierarchical sensor strategy (high-priority sensors collecting frequent data, lower-priority ones providing long-term insights) can help balance costs and data richness.

Data Storage

Here again you are faced with a decision between local (edge) storage and a centralized cloud environment. The same pros and cons apply as in data acquisition, but your needs may be different.

Edge storage is useful for real-time, low-latency processing, especially in critical operations where immediate decision-making is necessary. It also reduces the amount of data that needs to be transmitted to the cloud.

Cloud storage is scalable and ideal for long-term storage, cross-site access and aggregation of data from multiple locations. However, the bandwidth required for real-time data streaming to the cloud can be costly, especially in large-scale manufacturing operations.

Manufacturing environments typically generate large volumes of data due to high-frequency sensors. Plan for data compression and aggregation techniques at the edge to minimize storage overhead.

Lossless compression reduces data size without any loss of information, ideal for critical data. Popular algorithms include GZIP, effective for text data, LZ4, which is fast and low latency for real-time systems, and Zstandard (Zstd), offering high compression and quick decompression for IIoT.

Lossy compression, on the other hand, is suitable for sensor data where some precision loss is acceptable in exchange for better compression. Wavelet compression is efficient for time-series data, and JPEG/MJPEG is often used for images or video streams, reducing size while maintaining most visual information.

Data aggregation techniques help reduce data volume by combining or filtering information before transmission. Summarization involves averaging or finding min/max values over a time period. Sliding window aggregation and time bucketing group data into time intervals, reducing granularity. Event-driven aggregation sends data only when conditions are met, while threshold-based sampling and change-detection algorithms send data only when significant changes occur. Edge-based filtering and preprocessing ensure only relevant data is transmitted, and spatial and temporal aggregation combines data from multiple sources to reduce payload size.
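A minimal sketch combining the two ideas using only the standard library: time-bucket summarization followed by gzip (lossless) compression on simulated one-second temperature samples.

```python
# Minimal sketch of edge-side aggregation plus lossless compression.
# Readings are simulated one-per-second temperature samples over 10 minutes.
import gzip
import json
import statistics

readings = [{"t": i, "temp_c": 70 + (i % 60) * 0.05} for i in range(600)]

# Time bucketing: summarize each 60-second window before transmission.
buckets = []
for start in range(0, len(readings), 60):
    window = [r["temp_c"] for r in readings[start:start + 60]]
    buckets.append({"t_start": start,
                    "mean": round(statistics.mean(window), 2),
                    "min": min(window), "max": max(window)})

raw = json.dumps(readings).encode()
summarized = gzip.compress(json.dumps(buckets).encode())
print(len(raw), "bytes raw ->", len(summarized), "bytes aggregated + gzipped")
```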

Because edge devices often operate in resource-constrained environments, deal with real-time data and must efficiently manage the communication between local systems and central servers, there are several edge-specific considerations for optimizing data management in IIoT systems. For real-time applications, techniques like streaming compression (e.g., LZ4) and windowed aggregation help minimize latency by processing data locally. Delta encoding reduces data size by only transmitting changes from previous values, minimizing redundancy. Additionally, hierarchical aggregation allows data to be aggregated at intermediate nodes, such as gateways, before being sent to the central system, further reducing the transmission load and improving overall efficiency in multi-layered edge networks. These considerations are uniquely suited to edge computing because edge devices need to be efficient, autonomous, and responsive without relying heavily on central systems or expensive bandwidth.

You’ll also need a storage architecture that can scale to accommodate both current and future data growth. Also, implement a robust redundancy and backup strategy. With critical manufacturing data, losing information due to hardware failure or network issues can be costly. Redundant storage, preferably in different geographic locations (for disaster recovery), is crucial for resilience.

TIP: For time-sensitive data (e.g., real-time process control), store at the edge and use data batching for non-urgent data that can be transmitted to the cloud periodically, reducing latency and network costs.

Analytics

Real-time analytics is essential for immediate decision-making (shutting down a faulty machine or adjusting a process parameter), while historical analytics provides long-term insights into trends and performance (predictive maintenance, yield optimization).

To enable real-time analytics, data should undergo initial pre-processing and filtering at the edge, so that only relevant insights or alerts are passed to the cloud or central system. This reduces data transfer overhead and minimizes latency in decision-making. For long-term analysis (identifying trends, root cause analysis), use batch processing techniques to handle large datasets over time. Machine learning (ML) and AI models are increasingly integrated into IIoT systems to identify anomalies, predict failures or optimize operations based on historical data.
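As a simple stand-in for the edge-side filtering described above (statistical rather than ML-based), the sketch below forwards only readings that deviate sharply from a rolling baseline; the window size, threshold and data stream are illustrative.

```python
# Minimal edge pre-processing sketch: a rolling z-score filter that
# surfaces only anomalous readings to the central system.
from collections import deque
from statistics import mean, stdev

WINDOW = 50
Z_THRESHOLD = 3.0
history = deque(maxlen=WINDOW)

def process_reading(value):
    """Return an alert dict if the reading is anomalous, else None."""
    alert = None
    if len(history) >= 10:  # wait for a small baseline first
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(value - mu) / sigma > Z_THRESHOLD:
            alert = {"value": value, "baseline_mean": round(mu, 2)}
    history.append(value)
    return alert

stream = [20.1, 20.3, 19.9, 20.2] * 10 + [26.5]   # spike at the end
alerts = [a for v in stream if (a := process_reading(v))]
print(alerts)
```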

IIoT analytics is more than just looking at individual sensor data; it’s about correlating data across multiple devices, sensors and even different factory lines to uncover patterns. Implement data fusion techniques where data from different sensors or sources can be combined to improve the accuracy and richness of insights.

Visualization

Visualization tools are essential for both operators and decision-makers to quickly assess the performance of processes and machines. These should include customizable dashboards that display real-time key performance indicators (KPIs) like throughput, efficiency, downtime and machine health. KPIs should be linked to the specific objectives of the manufacturing process.

For process optimization and long-term planning, historical trends and patterns should be visualized clearly. This allows for root-cause analysis, identifying inefficiencies and making data-driven decisions about process improvements.

These visualizations should be tailored to different user roles. Operators need real-time alerts and immediate insights into machine performance, while managers or engineers might need access to historical data and trend analysis. Design the user interface (UI) and access controls with these distinctions in mind.

For advanced implementations, digital twins and augmented reality can be used to simulate and visualize complex data in 3D. Digital twins create a virtual replica of the manufacturing environment, allowing engineers to monitor and optimize operations without needing to be physically present.

Planning IIoT implementations

When planning IIoT in manufacturing, focus on building a scalable, resilient and secure architecture for data collection, storage, analytics and visualization. Ensure that data collection is optimized to balance cost and data richness, using both edge and cloud storage appropriately. Analytics capabilities should provide real-time decision support while enabling deep insights through predictive maintenance and long-term performance analysis. Visualization tools should cater to different user needs, ensuring clear, actionable insights through both real-time dashboards and historical data views. Keep in mind the challenges of data volume, latency, network bandwidth and data integrity as you design the IIoT system, with attention to scalability and future-proofing the infrastructure to accommodate evolving manufacturing processes and technologies.

6 Best Practices When Developing XR for Industrial Applications https://www.engineering.com/resources/6-best-practices-when-developing-xr-for-industrial-applications/ Mon, 21 Oct 2024 13:44:35 +0000 https://www.engineering.com/?post_type=resources&p=133025 Through Industry 4.0 and the industrial internet of things (IIoT), developers have brought industry into the digital realm. Industry experts can learn, control and share anything about a process with a few clicks. But these experts are still limited by their physical connections.

Developers, however, can start to blend the physical and digital realms via technologies like virtual reality (VR), augmented reality (AR) and mixed reality (MR) — collectively referred to as extended reality (XR). But this dream is still in its infancy. As a result, developers need guidelines to ensure they are going down the correct path when creating XR experiences.

In this 7-page ebook, developers will learn:

  • How XR is bound to change industry.
  • Which challenges exist when making XR experiences for industry.
  • Six best practices to keep the development of industrial XR experiences on track.
  • How Unity can help make industrial XR experiences a reality.

To download your free ebook, fill out the form on this page. Your download is sponsored by Unity Technologies.
