Lionel Grealou, Author at Engineering.com

The prompt frontier—how engineers are learning to speak AI
https://www.engineering.com/the-prompt-frontier-how-engineers-are-learning-to-speak-ai/ (Wed, 14 May 2025)

Will engineers shape AI, or will AI shape them?

Microsoft defines prompt engineering as the process of creating and refining the prompt used by an artificial intelligence (AI) model. “A prompt is a natural language instruction that tells a large language model (LLM) to perform a task. The process is also known as instruction tuning. The model follows the prompt to determine the structure and content of the text it needs to generate.”

For engineers, this means understanding how to structure prompts to solve technical problems, automate tasks, and enhance decision-making. This particularly applies when working with Generative AI—referring to AI models that can create new content, such as text, images, or code, based on the input they receive.

An article from McKinsey suggests that “Prompt engineering is likely to become a larger hiring category in the next few years.” Furthermore, it highlights that “Getting good outputs is not rocket science, but it can take patience and iteration. Just like when you are asking a human for something, providing specific, clear instructions with examples is more likely to result in good outputs than vague ones.”

Why engineers should care about prompt engineering

AI is quickly becoming an integral part of engineering workflows. Whether it is for generating reports, optimizing designs, analyzing large datasets, or even automating repetitive tasks, engineers are interacting with AI tools more frequently. However, the effectiveness of these tools depends heavily on how well they are instructed.

Unlike traditional programming, where logic is explicitly defined, AI models require well-structured prompts to perform optimally. A poorly phrased question or vague instructions can lead to suboptimal or misleading outputs. Engineers must develop prompt engineering skills to maximize AI’s potential, just as they would with any other technical tool.

Interestingly, some experts argue that prompt engineering might become less critical as AI systems evolve. A recent Lifewire article suggests that AI tools are becoming more intuitive, reducing the need for users to craft highly specific prompts. Instead, AI interactions could become as seamless as using a search engine, making advanced prompt techniques less of a necessity over time.

Key prompt skills engineers need

Engineers do not need to be AI researchers, but a foundational understanding of machine learning models, natural language processing, and AI biases can help them craft better prompts. Recognizing how models interpret data and respond to inputs is crucial.

AI tools perform best when given clear, well-defined instructions. Techniques such as specifying the format of the response, using constraints, and breaking down requests into smaller components can improve output quality. For example, instead of asking, “Explain this system,” an engineer could say, “Summarize this system in three bullet points and provide an example of its application.”
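
To make the contrast concrete, here is a minimal sketch of the "clear instructions" pattern in Python, assuming the OpenAI Python SDK (v1.x) and an API key in the environment; the model name and prompt wording are illustrative stand-ins, and any chat-capable LLM endpoint would serve equally well.

```python
# A minimal sketch of a format-constrained prompt, assuming the OpenAI
# Python SDK (v1.x) and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

# Vague: "Explain this system." Structured: format, length, and an example
# are all specified up front.
structured_prompt = (
    "Summarize the cooling system described below in exactly three bullet "
    "points, then give one example of its application.\n\n"
    "System description: a closed-loop liquid cooling circuit for a CNC spindle."
)

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative model name; substitute whatever is available
    messages=[{"role": "user", "content": structured_prompt}],
)
print(response.choices[0].message.content)
```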

Engineers must develop an experimental mindset, continuously refining prompts to get more precise and useful outputs. Testing different wordings, constraints, and levels of detail can significantly improve AI responses. Applying Chain-of-Thought Prompting encourages AI to think step-by-step, improving reasoning and accuracy. Rather than asking, “What is the best material for this component?” an engineer could use: “Consider mechanical strength, cost, and sustainability. Compare three material options and justify the best choice.”
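
The chain-of-thought pattern lends itself to reusable templates. The sketch below is vendor-agnostic and purely illustrative; the helper name and criteria list are assumptions for the example, not a standard API.

```python
# A sketch of a chain-of-thought prompt template: the prompt asks the model
# to reason through named criteria before committing to an answer.
def material_selection_prompt(component: str, options: list[str]) -> str:
    criteria = ["mechanical strength", "cost", "sustainability"]
    return (
        f"You are selecting a material for: {component}.\n"
        f"Candidate materials: {', '.join(options)}.\n"
        f"Think step by step. First assess each candidate against "
        f"{', '.join(criteria)}. Then compare the candidates and justify "
        "the best choice."
    )

print(material_selection_prompt("a drone arm", ["aluminum", "CFRP", "nylon"]))
```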

Examples of prompt engineering in action

To illustrate how effective prompt engineering works, consider these examples using your favorite Gen-AI engine:

  • Manufacturing Improvement: Instead of asking an AI tool, “How can I improve my factory efficiency?” an engineer could prompt: “Analyze this production data and suggest three changes to reduce waste by at least 10% while maintaining throughput.”
  • Material Selection: Instead of a generic prompt like “Recommend a good material,” an engineer could use: “Compare aluminum and stainless steel for a structural component, considering weight, durability, and cost.”
  • Software Debugging: Instead of “Fix this code,” a structured prompt could be: “Analyze this Python script for performance issues and suggest optimizations for reducing execution time by 20%.”
  • Compliance Checks: Engineers working with sustainability standards could ask: “Review this product lifecycle report and identify areas where it fails to meet ISO 14001 environmental standards.”
  • System Design Optimization: Instead of asking, “How can I improve this mechanical system?” a structured prompt could be: “Given the following design constraints (weight limit: 50kg, max dimensions: 1m x 1m x 1m, operational temperature range: -20°C to 80°C), suggest three alternative system configurations that maximize efficiency while minimizing cost. Provide a trade-off analysis and justify the best choice.”

Such structured prompts help AI generate more useful, targeted responses, demonstrating the value of thoughtful prompt engineering.

Applications of prompt engineering in actual engineering

Prompt engineering is not just for software developers—it has real-world applications across multiple engineering disciplines:

  • Manufacturing & Design: AI can assist in generating CAD models, optimizing designs for manufacturability, and analyzing production data for efficiency improvements.
  • Electrical & Software Engineering: Engineers can use AI for debugging code, generating test cases, and even predicting circuit failures.
  • Product Development: AI-driven tools can help in ideation, simulating product performance, and accelerating R&D workflows.
  • Sustainability & Compliance: Engineers working in sustainability can leverage AI to assess material lifecycle impacts, optimize energy usage, and ensure compliance with environmental regulations.

The future of prompt engineering in manufacturing

As AI models continue to evolve, the demand for engineers who can effectively interact with them will only grow. Mastering prompt engineering today will give professionals an edge in leveraging AI to drive innovation and efficiency.

However, the trajectory of prompt engineering is uncertain. Some predict that as AI becomes more advanced, it will require less intervention from users, shifting the focus from crafting prompts to verifying AI-generated results. This means engineers may not need to spend as much time iterating on prompts, but instead will focus on critically assessing AI outputs, filtering misinformation, and ensuring AI-driven decisions align with engineering standards and ethics.

Despite this, for the foreseeable future, engineers who master the art of prompt engineering will have a competitive advantage. Just as early adopters of CAD and simulation tools gained an edge, those who learn to effectively communicate with AI will be better positioned to innovate, optimize, and automate their workflows.

A new skill for a new era

Prompt engineering is more than just a buzzword—it is a fundamental skill for the AI-driven future of engineering. As AI tools become more embedded in daily workflows, knowing how to communicate with them effectively will set apart those who use AI passively from those who actively shape its outputs. One thing is for sure: “AI will not replace engineers, but engineers who know AI will”—a quote often attributed to Mark Zuckerberg.

The engineering industry is entering a transformative era, where AI-driven tools are no longer just supplementary but central to problem-solving and innovation. This shift is not merely about learning how to phrase a question effectively—it is about rethinking how engineers interact with intelligent systems. The ability to refine, adapt, and critically assess AI-generated insights will be just as important as the ability to craft precise prompts.

This raises a key question: As AI continues to advance, will prompt engineering remain a specialized skill, or will it become an intuitive part of every engineer’s workflow? Either way, those who proactively develop their AI literacy today will be best prepared to lead in the next evolution of engineering practice.

Digital transformation: a modern-day conclave?
https://www.engineering.com/digital-transformation-a-modern-day-conclave/ (Thu, 08 May 2025)

In a connected business environment, binary signals are no longer sufficient.

The recent Vatican conclave, steeped in centuries of tradition, offered more than just a moment of spiritual significance—it served as a striking metaphor. Behind closed doors, a small group of leaders debated, deliberated, and ultimately declared a decision to the world with a puff of white smoke. After a brief deliberation, Chicago-born cardinal Robert Francis Prevost was elected on the second day of the conclave. He will be known as Pope Leo XIV.

This ceremonial approach works well for Catholicism, but less so in business. Indeed, many organizations still treat digital transformation in a similar manner. Initiated within closed executive circles, strategic direction is shaped, technology investments approved, and roadmaps defined—all behind the scenes. The broader organization often only sees the outcome, not the process.

As organizations operate within increasingly agile, transparent, and connected environments, the question arises: can lasting transformation truly be driven in isolation? Or must organizations evolve toward a more open, participatory model that empowers insight and ownership from the ground up?

Communicating decisions without context or clarity

Transformation decisions are often communicated with great fanfare — announcements, all-hands meetings, slick slide decks. But for many across the organization, these moments resemble smoke signals from a distant tower: symbolic but vague. The decision is clear, but the rationale, trade-offs, and implications are not.

Whether it is the selection of a cloud platform, a shift to agile delivery, or a complete redesign of the operating model, announcements without contextual transparency create uncertainty. Teams scramble to interpret direction, while middle managers attempt to reverse-engineer the thinking behind strategic pivots.

This disconnect results in lost time, misaligned execution, and diminished trust. In a connected business environment, binary signals are no longer sufficient. What is needed is clarity on how decisions are made, why certain paths were chosen, and what success looks like.

The politics of closed rooms

Digital transformation is often framed as a technical or operational initiative. But at its core is political alignment. Like a conclave, the process often concentrates decision-making among a select group of influential stakeholders, typically executives who are both the architects and potential beneficiaries of change.

These leaders must navigate internal power dynamics. Functional heads may lobby for systems that protect their operational autonomy. Transformation officers may push for standardization that enables control and reporting. Budget holders often weigh innovation against short-term performance.

In this context, decisions are shaped not just by strategy, but by alignment of interests and trade-offs between competing priorities. Some of this is necessary. But when politics override participation, transformation becomes less…well…transformative. Excluding frontline insights, product team perspectives, or customer feedback in the early stages can result in solutions that are misaligned with operational realities. The architecture may be sound in theory but fragile in execution.

Transformation cannot succeed as a black box exercise. Governance must account for diverse inputs while avoiding paralysis. Political alignment is necessary—but not sufficient. Executive sponsorship provides critical momentum and legitimacy, but it must be matched by genuine engagement from all levels of the organization. Strategic direction set at the top should create the conditions for broad-based participation, where insight flows upward and action cascades downward in sync.

Unlocking bottom-up momentum

While leadership sets the tone and vision, execution at the edge ultimately determines success. Bottom-up momentum is not simply a cultural aspiration—it is a practical necessity.

Organizations that outperform in transformation tend to decentralize experimentation. They provide local teams with the frameworks, tools, and autonomy to adapt global strategies to real-world conditions. This includes structured experimentation with minimal viable pilots, cross-functional squads that test solutions early, and platforms that allow for scalable iteration.

Modern technologies enable this shift. Cloud-native architectures support modular deployment. Low-code platforms reduce development bottlenecks. Digital twins simulate impact before committing real-world resources. AI and analytics offer continuous feedback loops from operations and customers.

When employees have the means and mandate to contribute to transformation, they become co-creators, not just recipients. Engagement rises. Resistance drops. Execution accelerates. And critically, early warnings surface faster, enabling quicker course correction. A bottom-up approach also democratizes ownership. It cultivates a culture where individuals at all levels recognize their stake in the outcome. Transformation becomes embedded in daily work, not isolated in a PMO.

From ritual to renewal

True transformation is not a symbolic gesture. It is a system-wide renewal process that demands transparency, adaptability, and inclusion. Success depends on shifting from episodic initiatives to a continuous capability for change.

In this model, transparency is not just about sharing decisions—it is about sharing context. That includes access to roadmaps, visibility into interdependencies, and real-time updates on progress. It means creating systems where feedback is expected, not requested. Collaborative dashboards, open architecture reviews, and real-time KPI monitoring move the organization beyond annual reviews and stage gates. Governance becomes lighter and smarter.

Most importantly, the organization learns to manage tension between vision and execution, global alignment and local flexibility, leadership direction and grassroots innovation. This is the essence of digital maturity. Conclaves serve their purpose. But they are designed to select, not to transform. In business, waiting for white smoke is no longer viable. Decisions must be made in daylight, informed by insight from across the enterprise.

Turning unstructured data into action with strategic AI deployment
https://www.engineering.com/turning-unstructured-data-into-action-with-strategic-ai-deployment/ (Fri, 02 May 2025)

Transform industrial data from disconnected and fragmented to a more unified, actionable strategic resource.

Artificial Intelligence (AI) is driving profound change across the industrial sector, but its true value lies in overcoming the challenge of transforming fragmented, siloed data into actionable insights. As AI technologies reshape industries, they offer powerful capabilities to predict outcomes, optimize processes, and enhance decision-making. However, the real potential of AI is unlocked when it is applied to the complex task of integrating unstructured, “freshly harvested” data from both IT and OT systems into a cohesive, strategic resource.

This article explores the strategic application of AI within industrial environments, where the convergence of IT and OT systems plays a critical role. From predictive maintenance to real-time process optimization, AI brings new opportunities to unify disparate data sources through an intelligent digital thread—driving smarter decisions that lead to both immediate operational improvements and long-term innovation. Insights are drawn from industry frameworks to illustrate how businesses can effectively leverage AI to transform data into a competitive advantage.

From raw data to ready insights

In an ideal world, industrial data flows seamlessly through systems and is immediately ready for AI algorithms to digest and act upon. Yet the reality today is far different. Much of the data that businesses generate is fragmented, siloed, unstructured, and often not available when needed, making it difficult to extract real-time, actionable insights. To realize the full potential of AI, organizations must confront this data challenge head-on.

The first hurdle is understanding the true nature of “freshly harvested” data—the new, often unrefined information generated through sensors, machines, and human input. This raw data is often incomplete, noisy, or inconsistent, making it unreliable for decision-making. The key question is: How can organizations transform this raw data into structured, meaningful insights that AI systems can leverage to drive innovation?

The role of industrial-grade data solutions

According to industrial thought leaders, the solution lies in the deployment of “industrial-grade” AI solutions that can manage the complexities of industrial data. These solutions must be tailored to meet the specific requirements of industrial environments, where data quality and consistency are non-negotiable. Seamless enterprise-wide data integration is key—whether for predictive maintenance that connects sensor data with enterprise asset management, real-time process optimization that synchronizes factory operations with ERP and MRP platforms, or supply chain resilience that links production planning with logistics and inventory.

The first step in this process is data integration—the practice of bringing together disparate data sources into a unified ecosystem. This is where many organizations fail, as they continue to operate in data silos, making it nearly impossible to get a holistic view of operations. By leveraging industrial-grade data fabrics, companies can create a single, cohesive data environment where data from multiple sources, whether from edge devices or cloud systems, can be processed together in real time.
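
As a simplified illustration of that unification step, the sketch below joins hypothetical OT sensor readings with IT asset records using pandas; every column name and value is invented for the example.

```python
# A toy IT/OT integration: merge edge-sensor readings (OT) with ERP asset
# master data (IT) into one queryable view. Data is synthetic.
import pandas as pd

sensors = pd.DataFrame({
    "asset_id": ["P-101", "P-102", "P-101"],
    "timestamp": pd.to_datetime(
        ["2025-04-01 08:00", "2025-04-01 08:00", "2025-04-01 09:00"]),
    "vibration_mm_s": [2.1, 7.8, 2.4],
})
erp = pd.DataFrame({
    "asset_id": ["P-101", "P-102"],
    "production_line": ["Line A", "Line B"],
    "annual_maintenance_cost_eur": [1200, 3400],
})

# One unified frame: each reading now carries its business context.
unified = sensors.merge(erp, on="asset_id", how="left")
print(unified)
```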

Data structuring—the secret to actionable insights

Once raw data is integrated, it must be structured in a way that makes it interpretable and useful for AI models. Raw data points need to be cleaned, categorized, and tagged with relevant metadata to create a foundation for analysis. This is a critical step in the data preparation lifecycle and requires both human expertise and sophisticated algorithms.

The structuring of data enables the development of reliable AI models. These models are trained on historical data, but the real power lies in their ability to make predictions and provide insights from new, incoming data—what we might call “freshly harvested” data. For example, predictive maintenance models can alert manufacturers to potential equipment failures before they occur, while quality control models can detect deviations in production in real time, allowing for immediate intervention.
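
A toy version of such a predictive-maintenance model, assuming scikit-learn and synthetic stand-ins for historical sensor logs, might look like the following; the feature choices and labels are illustrative only.

```python
# Train on historical (structured, labeled) data; score "freshly harvested"
# readings as they arrive. All numbers are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Features per machine-hour: [vibration (mm/s), bearing temperature (degC)]
X_hist = np.array([[2.0, 55], [2.3, 58], [7.5, 81],
                   [8.1, 85], [2.1, 54], [7.9, 88]])
y_hist = np.array([0, 0, 1, 1, 0, 1])  # 1 = failure within the next 48 h

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(X_hist, y_hist)

X_new = np.array([[7.7, 83]])            # incoming reading
print(model.predict_proba(X_new)[0][1])  # estimated failure risk
```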

The importance of explainability cannot be overstated. For industrial AI applications to be truly valuable, stakeholders must be able to trust the insights generated. Clear, transparent AI models that are explainable ensure that human operators can understand and act upon AI recommendations with confidence.

Operationalizing AI for real results

Having structured data and trained models is only part of the equation. The real test is turning AI-generated insights into actionable outcomes. This is where real-time decision-making comes into play.

Organizations need to operationalize AI by embedding it within their decision-making frameworks. Real-time AI systems need to communicate directly with production systems, supply chains, and maintenance teams to drive immediate action. For example, an AI system might detect an anomaly in production quality and automatically adjust parameters, triggering alerts to the relevant personnel. The ability to act on AI insights immediately is what separates a theoretical AI application from one that delivers real-world value.
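
A minimal sketch of this pattern follows: a streaming check flags out-of-spec measurements with a 3-sigma rule and fires an alert hook. The statistical rule stands in for a deployed model, and the alert function is a placeholder for a real notification or control-system integration.

```python
# Toy real-time loop: learn the normal range from recent measurements,
# flag 3-sigma deviations, and trigger an alert instead of learning from them.
from collections import deque
import statistics

window = deque(maxlen=50)  # recent in-spec measurements

def alert(message: str) -> None:
    print("ALERT:", message)  # placeholder: page an operator, open a ticket

def on_measurement(value: float) -> None:
    if len(window) > 10:
        mean = statistics.mean(window)
        std = statistics.stdev(window) or 1e-9
        if abs(value - mean) / std > 3:
            alert(f"Quality anomaly detected: {value:.2f}")
            return  # keep outliers out of the baseline
    window.append(value)

for v in [10.0, 10.2, 9.9, 10.1] * 5 + [14.5]:
    on_measurement(v)
```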

Moreover, feedback loops are essential. The AI models should not be static but should continuously learn and adapt based on new data and operational changes. This iterative approach ensures that AI doesn’t just solve problems for today but continues to improve and optimize processes over time.

Generative AI: A catalyst for innovation and workforce augmentation

While AI’s predictive capabilities are often the focal point, generative AI holds particular promise for transforming industrial workflows. By augmenting human creativity and problem-solving, generative AI helps address the skill gap in the workforce. For example, AI-assisted design can produce innovative solutions that human engineers may not have considered.

However, the integration of generative AI into industrial settings requires careful consideration. As powerful as it is, generative AI can be more costly than traditional AI models. Its inclusion in industrial applications must be strategic, ensuring that the value it brings—such as faster prototyping or more efficient design—justifies the investment.

How to build a sustainable AI strategy for data insights

Turning fragmented data into actionable insights requires a strategic approach. Based on industry frameworks from ABB and ARC Advisory Group, here’s a blueprint for effective AI adoption in industrial settings:

  1. Begin by understanding what is to be achieved through AI—whether it is optimizing efficiency, reducing downtime, or improving quality control. Align AI initiatives with these objectives to ensure focused efforts.
  2. Assess the existing data infrastructure and invest in solutions that integrate and standardize data across your systems. A unified data environment is crucial for enabling AI-driven insights.
  3. Avoid generic AI solutions. Instead, select AI tools that address specific use cases—whether it is predictive maintenance or process optimization. Tailored solutions are far more likely to provide valuable, actionable insights.
  4. In highly regulated industries, transparent and explainable AI models are essential for building trust and compliance. Make sure AI systems provide insights that are understandable and auditable.
  5. AI adoption is not a one-time implementation. Begin with pilot projects, learn from the results, and scale up gradually. This approach allows businesses to optimize AI systems while minimizing risk.

Scaling AI for broader impact

Collaboration is key to successful AI adoption. Partnering with experienced software providers, AI developers, and industry experts can help organizations navigate the challenges of scaling AI across their operations. Moreover, integrating generative AI alongside traditional AI approaches allows companies to strike a balance between innovation and cost-effectiveness.

The promise of AI in transforming industries is undeniable, but to truly realize its value, organizations must overcome the data fragmentation challenges that hinder effective AI deployment. By integrating, structuring, and operationalizing data, companies can convert raw information into actionable insights that drive measurable results. The future of industrial AI is not just about predictions and optimization—it’s about continuous learning, innovation, and the strategic use of AI to create sustainable, long-term growth.

Managing the world’s most complex machine
https://www.engineering.com/managing-the-worlds-most-complex-machine/ (Mon, 28 Apr 2025)

With 100,000 parts and a 50-year expected operational lifespan, PLM is the only option for managing CERN’s Large Hadron Collider.

David Widegren, Head of Engineering Information Management at CERN, at ACE 2025 in Boston, discussed the role of Product Lifecycle Management (PLM) strategies in managing the world’s most complex scientific instrument. (Image: Lionel Grealou)

CERN stands for the European Organization for Nuclear Research (from the French ‘Conseil Européen pour la Recherche Nucléaire’). It operates the world’s largest and most powerful particle accelerator—the Large Hadron Collider (LHC), which spans a 27-kilometre loop buried about 600 feet beneath the France-Switzerland border near Geneva. The LHC accelerates protons and ions to near-light speeds, recreating the conditions just after the Big Bang. This enables physicists to test fundamental theories about the forces and particles that govern our universe—and provides invaluable data on the building blocks of reality.

Operating at an astonishing temperature of -271.3°C—colder than outer space—the LHC’s superconducting magnets require cryogenic cooling systems, creating one of the coldest environments on Earth. Although some sensationalized media reports have raised concerns about the LHC creating black holes on Earth, CERN’s scientific community has rigorously demonstrated that these fears are unfounded. The LHC’s energy levels, while impressive, are a fraction of those generated by natural cosmic events that occur regularly without incident.

CERN operates with a collaborative network of 10,000 staff across 80 countries, supported by an annual budget of $1.4 billion. This immense collaboration drives groundbreaking research that demands the highest levels of reliability and precision. Managing the LHC’s enormous infrastructure—including millions of components—requires a comprehensive approach that integrates engineering and scientific disciplines. This is where digital transformation, powered by PLM and digital twins, becomes essential.

New digital backbone for an evolving scientific platform

Historically, CERN used legacy CAD systems such as CATIA V5, AutoCAD, SolidWorks, Inventor, Revit, and SmarTeam to manage critical design and operational data, alongside multiple asset and document repositories. However, as the LHC grew in complexity, these tools created inefficiencies, data silos, and challenges around synchronization, verification, and scalability.

To modernize its approach, CERN adopted Aras Innovator—a CAD-agnostic, part-centric PLM platform—redefining its approach to integrated data management. This shift enables CERN to track components across their full lifecycle, providing real-time insights into performance, wear, and maintenance needs. With over 100 million components—many exposed to extreme radiation and high-energy fields—this capability is critical for ensuring resilience and longevity. The integration of digital twins into the ecosystem allows CERN to predict component failures, optimize performance, and plan preventive maintenance.

Towards an integrated digital engineering platform. (Image: CERN presentation at ACE 2025)

Given the LHC’s extraordinary expected lifespan—over 50 years—the management of its components and systems from design and construction through decommissioning is a monumental task. Some systems, such as superconducting magnets and cryogenic infrastructures, must remain functional for decades. PLM helps CERN manage these long-term needs by providing a unified, scalable solution that integrates data across all lifecycle phases. This is essential not only for maintaining operational efficiency but also for ensuring the LHC’s systems continue to meet high standards of scientific precision and safety.

Sustainability is integral to CERN’s long-term strategy. Managing the LHC’s lifecycle includes minimizing environmental impact and optimizing energy consumption. PLM and digital twins enable CERN to optimize resource usage, reduce waste, and extend the life of crucial systems, ultimately supporting the organization’s long-term sustainability goals.

CERN’s shift to Aras Innovator has also facilitated the integration of various data streams—ranging from engineering documents to enterprise asset management. By connecting this information through a robust digital thread, CERN ensures that all stakeholders, from engineers to researchers, operate with a unified, reliable view of the system. This shared information base enhances collaboration, reduces errors, and accelerates decision-making.

While PLM manages engineering and operational data, experimental research outputs are handled separately by specialized Laboratory Information Management Systems (LIMS). However, synergies between PLM and LIMS are increasingly being explored, with the goal of creating faster feedback loops between research and engineering to enable more data-driven innovation.

Managing complexity without digital overload

As CERN continues to push the boundaries of scientific discovery, the need for real-time monitoring and predictive analytics becomes more critical. Digital twins enable real-time health checks on LHC components, tracking their condition and ensuring compliance with safety standards.

Yet the real challenge is not simply managing technical complexity but doing so without introducing unnecessary digital complexity. The scale of the LHC, with its intricate interconnected systems, requires CERN to balance advanced technologies with operational simplicity. Digital tools must enhance operations without becoming another layer of complication.

The key question: How can CERN manage scientific complexity while minimizing the complexity of digital tools?

New technologies must deliver actionable insights that enable faster, better decisions, instead of overwhelming stakeholders with excess data or redundant processes.

Some key questions that arise:

  • What measurable reductions in maintenance costs or unplanned downtime can CERN achieve through predictive digital models?
  • How will real-time monitoring improve energy efficiency, system lifespan, and reliability?
  • How much faster can experimental setups and calibrations be completed using simulation and virtual commissioning?

Ultimately, the success of CERN’s digital transformation will not be judged by the sophistication of its tools, but by clear, quantifiable outcomes: lower downtime, improved reliability, energy-efficient operations, and faster scientific throughput.

Lessons from the LHC to the FCC

CERN’s digital transformation is not just about adding tools—it is about making complex systems easier to manage and enabling faster, more informed decisions. This mindset is critical as CERN embarks on its next major project: the Future Circular Collider (FCC).

The FCC will dwarf the LHC, with a circumference of 91 kilometers and particle collisions at energy levels up to 100 TeV—far beyond the LHC’s 13 TeV. Construction costs are estimated between €20 billion and €25 billion, with initial operations targeted around 2040. The scale of the FCC presents massive engineering challenges, from magnet design to cryogenic systems.

Here, lessons learned from the LHC’s digital journey will pay dividends.

The LHC’s digital twins—validated over years of operation—will serve as the foundation for FCC simulations. Virtual modeling allows CERN to identify risks earlier, test complex designs in silico, and optimize operations before construction even begins. By compressing design timelines and minimizing construction risks, CERN can potentially save both operational and capital costs while improving reliability.

CERN’s approach shows that digital transformation is not about complexity for its own sake. It is about ensuring that scientific and operational challenges are met with clarity, efficiency, and sustainability—building a stronger foundation for the next generation of discovery.

Making sense of tariff impact: why digital transformation was never optional
https://www.engineering.com/making-sense-of-tariff-impact-why-digital-transformation-was-never-optional/ (Mon, 14 Apr 2025)

Are you simply reacting to disruption or leading your company through it? PLM is the secret weapon at the center of a resilient response to volatility.

Tariffs have stormed back into the global spotlight, shaking global trade, and putting pressure on manufacturers with international supply chains. For companies already facing inflation, material shortages, and geopolitical instability, new tariff and political wargames add another layer of complexity to navigate.

This is not just another opinion on trade policy—it’s a wake-up call. Companies that postponed digital transformation are now struggling to manage disruption with enterprise systems unfit for today’s pace of change. Many still rely on spreadsheets, siloed software, and disconnected teams. In contrast, organizations that invested in a connected digital backbone—especially one centered around Product Lifecycle Management (PLM)—are better equipped to assess impacts, respond rapidly, and protect margins.

Understanding tariff impact across the value chain

Tariffs create ripple effects across operations, affecting cost, compliance, and sourcing decisions:

  • Importers and contract manufacturers experience the first wave of cost increases.
  • Brand leaders and OEMs must decide whether to absorb, offset, or pass along those costs.
  • Suppliers across borders face pressure to renegotiate contracts, timelines, or terms.

At the center of this complexity is the costed Bill of Materials (BOM), listing parts, sub-assemblies, components, raw materials, formulations, and quantities needed to manufacture a product, along with the associated cost information for each item. It should be the single source of truth for real-time cost impacts; yet too often, BOM updates lag behind key decisions—causing margin erosion and compliance issues.

The critical question extends beyond “What became more expensive?” to “Who is going to pay for it?” The answer depends on industry dynamics, supply agreements, market conditions, and strategic intent. Highly commoditized sectors may need to absorb added costs to remain competitive, while premium markets might have room to pass on increases—if brand value and pricing power allow.

Complexity deepens when tariffs affect multiple tiers of the supply chain. Without timely insight into these dynamics, companies default to reactive choices—accepting margin erosion, postponing decisions, or making trade-offs that compromise long-term strategy and customer trust. A digitally connected value chain creates clarity. With end-to-end PLM integrating procurement and finance data, manufacturers can model how shocks flow through their products and portfolios—and respond before it is too late.
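
To show how such modeling can work in practice, here is a hedged sketch that rolls up a costed BOM with and without tariffs; the part numbers, costs, and duty rates are invented, and real figures would come from the ERP and PLM systems described in the next section.

```python
# Toy tariff-impact rollup over a costed BOM. All parts, costs, and duty
# rates are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class BomItem:
    part: str
    unit_cost: float              # base purchase cost per unit
    qty: int = 1
    tariff_rate: float = 0.0      # ad valorem duty applied to this item
    children: list["BomItem"] = field(default_factory=list)

def rolled_up_cost(item: BomItem, with_tariffs: bool) -> float:
    rate = item.tariff_rate if with_tariffs else 0.0
    own = item.unit_cost * (1 + rate)
    kids = sum(rolled_up_cost(c, with_tariffs) for c in item.children)
    return item.qty * (own + kids)

pump = BomItem("pump-assy", 12.0, children=[
    BomItem("motor", 40.0, tariff_rate=0.25),        # imported, newly tariffed
    BomItem("housing", 15.0),                        # sourced domestically
    BomItem("fastener-kit", 0.5, qty=8, tariff_rate=0.10),
])

base = rolled_up_cost(pump, with_tariffs=False)      # 71.00
landed = rolled_up_cost(pump, with_tariffs=True)     # 81.40
print(f"{base:.2f} -> {landed:.2f} ({landed - base:+.2f} margin exposure per unit)")
```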

Integrated digital approach to tariff management

Addressing tariff impact requires more than PLM. It demands a fully integrated digital thread that connects product, sourcing, financial, and customer data.

  • PLM captures product structures, supplier dependencies, and alternate design paths—anchoring the impact assessment.
  • Enterprise Resource Planning (ERP) maintains costed BOMs and tracks profitability changes as duties or sourcing costs shift.
  • Supply Chain Management (SCM) evaluates alternate suppliers and logistics to manage cost and lead time risks.
  • Trade compliance systems monitor changes to tariff codes and cross-border regulations.

Supporting systems play complementary roles:

  • Product Data Management (PDM) ensures the correct version of product data is used when making design or sourcing changes—critical to avoid rework or compliance issues.
  • Material Requirements Planning (MRP) provides forward visibility into procurement and inventory to avoid overbuying high-tariff parts or facing shortages.
  • Customer Relationship Management (CRM) contributes commercial insight, highlighting customer sensitivities, regional exposures, and contract constraints that influence pricing strategies.

The value of this integrated technology stack lies in connecting innovation with sourcing, costing, and compliance. With this comprehensive view, manufacturers can confidently simulate trade-offs, evaluate impacts, and execute necessary changes. When systems work together, companies can coordinate across engineering, procurement, finance, and sales using shared data and aligned risk thresholds. This enables scenario modeling, supplier exposure analysis, and implementation of controlled design or sourcing shifts with full traceability and governance.

Proactive risk management in an era of uncertainty

In today’s context, tariffs have become instruments of urgency—tools used to pressure negotiations rather than serve as long-term policy levers. For manufacturers, this translates into operational volatility: abrupt announcements, unclear duration, and vague scope disrupt planning cycles.

Leaders must now make critical decisions with limited clarity and compressed timelines. Digital maturity enables companies to manage these situations with structure and foresight by:

  • Simulating tariff scenarios before they take effect
  • Identifying high-risk suppliers or parts and activating contingency plans
  • Evaluating design alternatives and reconfiguring BOMs with version control
  • Maintaining compliance as sourcing or target markets shift

Without these capabilities, companies resort to instinct, delay decisions, or absorb unnecessary costs. With them, responses become structured, traceable, and repeatable.

Digital transformation as a competitive imperative

Beyond tariffs, digital transformation has always been a strategic foundation for resilience against various forms of disruption. Companies that invested in connected systems now operate with speed and alignment, ready for AI and analytics-driven decision making. Those that delayed now face both the disruption and the steep learning curve of modernization.

True resilience emerges from more than technology alone—it comes from embedding governance, data discipline, and cross-functional collaboration into the organization’s DNA. Digital transformation enables faster, better decisions under pressure by connecting development, sourcing, operations, compliance, and customer functions into a coherent ecosystem.

As tariff volatility evolves—potentially settling into a new reality shaped by regional sourcing and revised trade agreements—the strategic question remains: Are we still reacting to disruption, or are we built to lead through it? Trade policy may eventually stabilize, but volatility will not. Digital transformation is not tied to election cycles or regulatory changes. It represents a long-term investment in adaptability, insight, and resilience. The companies that will thrive are those that stopped waiting for stability—and started building for it.

Decoding Dassault’s 3D Universes jargon: combining virtual and real intelligence
https://www.engineering.com/decoding-dassaults-3d-universes-jargon-combining-virtual-and-real-intelligence/ (Mon, 24 Mar 2025)

Can Dassault Systèmes convince the market that this is more than just another buzzword-laden evolution?

Product Lifecycle Management (PLM) reimagined: from static digital twins to an AI-powered, generative intelligence ecosystem. (Image: Dassault Systèmes.)

Dassault Systèmes has unveiled 3D Universes (styled as 3D UNIV+RSES for branding), a bold step toward reimagining how industries engage with digital and physical realities. This is not just another 3D modeling update. It represents a fundamental shift from static digital twins to an AI-powered, generative intelligence ecosystem. The branding itself—3D UNIV+RSES instead of “3D Universes”—signals a new paradigm where virtual and real (V+R) are seamlessly integrated, enabling continuous learning, automation, and adaptability across product lifecycles.

But with this shift comes a set of key challenges: What does this mean for legacy users? How will intellectual property be managed in an AI-driven world? And can Dassault Systèmes convince the market that this is more than just another buzzword-laden evolution?

Virtual + real: more than just digital twins

The concept of V+R (Virtual + Real) is not new to Dassault Systèmes. It has been a central theme in the company’s Virtual Twin Experience, where digital twins are no longer mere representations but are continuously evolving with real-world inputs.

In 3D Universes, this vision is taken further:

  • AI-powered models learn from real-world behaviors and adjust accordingly
  • Virtual companions provide intelligent assistance in decision-making
  • Generative AI and sense computing optimize designs and simulations in real-time

This moves beyond the traditional “digital twin” approach. Rather than acting as a static mirror of the physical world, 3D Universes enables a dynamic, self-improving system that continuously integrates, analyzes, and adapts. The idea is not new. For instance, Siemens and other ‘PLM software’ providers are actively exploring opportunities for AI to add an intelligent layer to the PLM data backbone.

From static to generative intelligence

Dassault Systèmes has long been a leader in 3D modeling, PDM/PLM, and simulation, though 3D Universes marks a significant departure from traditional software functionality. It introduces an AI-driven, generative framework that transforms how products are designed, validated, and maintained.

Key differentiators from this new positioning include:

  • AI-assisted workflows that automatically refine and evolve designs.
  • Predictive simulations that adapt based on real-world sensor data.
  • A “living” knowledge platform that evolves with industry trends and user inputs.

You get the idea. Rather than designing a product in isolation, cross-functional teams spanning Product Development, Engineering, Quality, Procurement, and the supply chain can now co-create with AI, allowing for an iterative, automated process that reduces risk, enhances efficiency, and accelerates innovation cycles.

Beyond software—a living digital ecosystem

The shift to 3D Universes also seems to represent a move away from traditional licensing-based software models toward a consumption-based, Experience-as-a-Service (XaaS) framework—a commercial model similar to the “AI-as-a-Service” approach recently described by Microsoft CEO Satya Nadella. This aligns with broader industry trends where companies are transitioning from one-time software purchases to continuous value-driven digital services.

What does this mean in practical terms?

  • Customers will consume intelligence rather than static software.
  • Real-time virtual twins will become decision-making hubs, constantly updating based on real-world inputs.
  • AI-generated designs will automate engineering iterations, dramatically reducing manual effort.

This is a major shift for legacy customers who are accustomed to on-premises, private cloud hosting, and transactional software ownership. Dassault Systèmes will need to provide a clear roadmap to help these organizations transition without disrupting their existing workflows and wider integration landscape.

IP, trust and the generative economy

One of the most critical challenges in this transformation is intellectual property (IP) ownership and data security. In an AI-driven, generative economy, where does human ingenuity end and machine-driven design begin? If AI generates a product variation based on learning from past designs, who owns the output?

Some key concerns include:

  • Ensuring IP integrity when AI continuously iterates on existing designs.
  • Managing security risks as real-world data feeds into digital models.
  • Addressing industry adoption barriers for companies that have built their entire business around traditional IP protection frameworks.

Dassault Systèmes, and other enterprise solution providers in this space, will need to provide strong governance mechanisms to help customers navigate these complexities and build trust in the generative AI-powered design process.

Dassault Systèmes issued a YouTube video presentation as a teaser to outline the core ambitions of 3D Universes, reinforcing its role in shaping a new generative economy—elaborating on key messages:

  • Virtual-Plus-Real Integration: A seamless blend of digital and physical data enhances accuracy and applicability in simulations.
  • Generative AI Integration: AI-driven processes enable more adaptable and intelligent design iterations.
  • Secure Industry Environment: A trusted space for integrating and cross-simulating virtual twins while ensuring IP protection.
  • Training Multi-AI Engines: Supports the development of AI models within a unified framework, promoting more sophisticated AI applications.

While the video presents a compelling vision and sets timeline expectations for an aspirational 15-year journey to 2040, it introduces complex terminology that might not be easily digestible for a broad audience. The use of “Universes” as branding adds an extra layer of abstraction that could benefit from clearer explanations and, in due time, a gradual transition roadmap for legacy users.

Additionally, the practical implementation and real-world applications remain vague, leaving some unanswered questions about industry adoption and integration. How will companies transition to this model? What are the concrete steps beyond the conceptual framework? The challenge will be ensuring that this does not become another overcooked marketing push that confuses rather than inspires potential adopters. Users demand clarity and pragmatism in linking solutions to problem statements and practical value realization.

A bold leap into the future

The potential of 3D Universes is enormous, but its success hinges on several key factors:

  • Market Education: Dassault Systèmes must articulate the value proposition beyond buzzwords, demonstrating tangible ROI for both new and legacy users.
  • Seamless Transition Strategy: Organizations need a clear pathway to adopt 3D Universes without disrupting their current operations.
  • AI Governance & IP Assurance: Addressing industry concerns around AI-generated designs, IP ownership, ethical AI, and data security will be crucial for widespread adoption.

If 3D Universes delivers on its promise, it has the potential to redefine how industries design, simulate, and optimize products across their entire lifecycle. By truly integrating Virtual + Real intelligence, Dassault Systèmes is making a bold statement about the next frontier of digital transformation.

The question now is: Are industries ready to embrace this generative future, or will skepticism slow its adoption? Furthermore, where should organizations start on this journey? Can solution providers be bold enough to share a pragmatic roadmap towards this goal, and keep us posted on their learnings in this space? Will 3D Universes bring us one step closer to the “Industry Renaissance” previously advocated by Dassault Systèmes Chairman Bernard Charles? Time will tell, but one thing is certain—Dassault Systèmes is positioning itself at the forefront of the next industrial/digital revolution.

RIP SaaS, long live AI-as-a-service
https://www.engineering.com/rip-saas-long-live-ai-as-a-service/ (Thu, 16 Jan 2025)

Microsoft CEO Satya Nadella recently predicted the end of the SaaS era as we know it, which could level the playing field for smaller manufacturers.

Artificial Intelligence (AI) is no longer just a buzzword—it is a game-changer driving new insights, automation, and cross-functional integration. AI is transforming industries by powering digital transformation and business optimization, and a lot more innovation is expected. While some sectors are advanced in leveraging AI, others—particularly traditional manufacturing and legacy enterprise software providers—are scrambling to integrate AI into their existing digital ecosystems.

Many executives foresee AI revolutionizing Software-as-a-Service (SaaS) by transitioning from static tools to dynamic, personalized, and intelligent capabilities. AI-as-a-Service (AIaaS) offers businesses unprecedented opportunities to innovate and scale. The promise is a future powered by AI agents and Copilot-like systems that streamline infrastructure, connect enterprise data, and reduce reliance on traditional configuration and system integration.

In a recent BG2 podcast, Satya Nadella shared his vision for AI’s role in reshaping technology and business. He stated, “The opportunities far outweigh the risks, but success requires deliberate action.” These opportunities extend beyond industry giants to startups and mid-sized enterprises, enabling them to adopt AI and leapfrog traditional barriers. Smaller enterprises, in particular, stand to gain by avoiding the pitfalls of complex digital transformations, taking advantage of AI to innovate faster and scale effectively.

Revolutionizing Experiences and Integration

AI is (or will be) fundamentally changing how users interact with SaaS platforms. Traditional SaaS tools are often said to be rigid, offering one-size-fits-all interfaces that require users to adapt. In contrast, AI brings opportunities to disrupt this model by analyzing user behavior in real-time to offer personalized workflows, predictive suggestions, and proactive solutions. Nadella emphasized this transformation, saying, “The next 10x function of ChatGPT is having persistent memory combined with the ability to take action on our behalf.”

This aligns with the emergence of Copilot systems, where AI acts as a collaborative partner rather than a mere self-contained tool. Imagine a SaaS platform that not only remembers user preferences but actively anticipates needs, offering intelligent guidance and dynamic adjustments to workflows. Such personalization fosters deeper engagement and loyalty while transforming the management of business rules and system infrastructure.

Empowering Smaller Enterprises

The promise of AI extends not only to large enterprises but also to smaller businesses, particularly those in manufacturing and traditionally underserved sectors. For example, a small manufacturer could adopt AI-driven tools to optimize supply chain management, automate repetitive tasks, and deliver personalized customer experiences—all without the complexity of traditional ERP systems.

To ensure successful adoption, businesses must:

  • Identify high-impact areas: Focus on processes that benefit most from automation and predictive analytics, such as customer service, supply chain management, or marketing optimization.
  • Leverage scalable solutions: Choose AI platforms that align with current needs but can scale as the business grows.
  • Build internal expertise: Invest in upskilling employees to work alongside AI tools, ensuring alignment between human and machine capabilities.
  • Partner strategically: Collaborate with AI vendors that prioritize interoperability and ethical standards to avoid vendor lock-in and compliance risks.

Redefining Value: Pricing Models and Proactive Solutions

AI is not only transforming technical capabilities but also redefining pricing models for SaaS platforms. Traditional subscription fees are being replaced by real-time, usage-based pricing, powered by AI algorithms that align revenue with the value delivered. Nadella warned, “Do not bet against scaling laws,” underscoring AI’s potential to adapt and optimize at scale. For instance, AI can analyze customer usage patterns to calculate fair, dynamic pricing, ensuring customers pay for the outcomes that matter most.
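
As a purely illustrative example of the mechanics, the sketch below applies an invented tiered rate card to metered consumption; an AI-driven pricing engine would derive the tiers and rates from usage analytics rather than hard-coding them.

```python
# Toy usage-based pricing: charge metered units against a tiered rate card.
# Tier sizes and rates are invented for illustration.
def usage_charge(units: float) -> float:
    tiers = [(1_000, 0.10), (9_000, 0.07), (float("inf"), 0.04)]  # (size, rate)
    charge, remaining = 0.0, units
    for size, rate in tiers:
        used = min(remaining, size)
        charge += used * rate
        remaining -= used
        if remaining <= 0:
            break
    return charge

print(usage_charge(12_500))  # 1000*0.10 + 9000*0.07 + 2500*0.04 = 830.0
```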

This shift to value-based pricing can help SaaS companies differentiate themselves in competitive markets, reinforcing their commitment to customer success. Additionally, as AI drives data integration, traditional software vendors (ERP, CRM, PLM, MES, etc.) will need to adapt their business models. With AI, vendor lock-in could become obsolete, or at least redefined, as businesses migrate data seamlessly across platforms, fueled by open standards and interconnected data assets.

Overcoming Adoption Challenges

While the promise of AIaaS is immense, transitioning from traditional SaaS is not without its hurdles. Businesses must address:

  • Cost barriers: AI solutions can require significant upfront investment, especially for smaller firms. Clear ROI metrics and phased implementation plans can mitigate this challenge.
  • Technical expertise gaps: The lack of in-house AI expertise can slow adoption. Partnering with AI-savvy consultants or platforms can bridge this gap.
  • Resistance to change: Shifting from static tools to dynamic AI-driven systems requires cultural change. Leadership must communicate the benefits clearly and provide training to ease transitions.

Responsible AI: Trust, Compliance, and the Road Ahead

The rise of AI-powered SaaS platforms presents both immense opportunity and significant responsibility. As these platforms analyze vast datasets, safeguarding user privacy and ensuring compliance with regulatory standards will be non-negotiable. Nadella’s remark that “Innovation must go hand in hand with ethical considerations” underscores the need to balance technological advancement with accountability.

To build trust and ensure accountability, businesses must prioritize:

  • Transparent data policies: Clearly communicate how user data is collected, stored, and used.
  • Robust security measures: Safeguards against data breaches are critical for maintaining trust.
  • User-centric governance: Empower users with control over their data while ensuring compliance with global regulations.

Final Thoughts…

Looking ahead, adaptive AI systems and large language models will continue to redefine how SaaS platforms deliver value, addressing evolving customer needs with precision and speed. Nadella’s vision for AIaaS is inspiring, but businesses must remain grounded. To lead in this new era, organizations must tackle critical questions:

  • How can they balance AI’s immense potential with the risks of misuse or ethical lapses?
  • What steps are necessary to ensure AI enhances—not replaces—human decision-making?
  • How can smaller enterprises leapfrog traditional barriers to scale with AI?
  • Can persistent memory systems foster meaningful personalization without sacrificing user trust?
  • What role will regulatory frameworks play in ensuring accountable innovation?

By addressing these questions and embracing the opportunities AI presents, SaaS providers can chart a path toward sustained success. The question is not whether AI will transform SaaS, but how organizations will adapt to lead in this new digital era.

Core transformation unlocked: digital opportunities for small and medium manufacturers https://www.engineering.com/core-transformation-unlocked-digital-opportunities-for-small-and-medium-manufacturers/ Thu, 09 Jan 2025 21:22:32 +0000 https://www.engineering.com/?p=135190 Harnessing AI to redefine operational agility and drive growth could be a key differentiator in the near term.

Technology is no longer optional—it is a fundamental driver of business success. This does not mean it always takes center stage, but without it, businesses risk falling behind. Small and medium manufacturers now have a unique opportunity to learn from the transformation journeys of larger enterprises, including by considering alternate paths. By adapting digital strategies to their scale and needs, they can accelerate innovation, improve efficiency, and compete on a broader stage. The convergence of artificial intelligence (AI), cloud computing, IoT, and enterprise platforms provides a roadmap to rethink traditional operations while fostering resilience and agility.

Deloitte’s recent report, The Intelligent Core: AI Changes Everything for Core Modernization, highlights a critical shift in the role of core systems due to the rise of AI: “For years, core and enterprise resource planning systems have been the single source of truth for enterprises’ systems of records. AI is fundamentally challenging that model.” AI is moving core systems away from static, rigid structures, offering systems that are adaptive and predictive, transforming how businesses operate.

For smaller manufacturers, this shift underscores the importance of moving beyond static systems. By adopting modular, cloud-based ERP solutions, they can introduce intelligence incrementally without overhauling their entire infrastructure. Scalable platforms allow small and medium manufacturers to integrate AI gradually, starting with targeted applications like inventory management or demand forecasting.

Converging technologies for strategic growth

Deloitte emphasizes the convergence of AI with technologies like IoT and robotics as key drivers of transformation: “In an increasingly convergent world, enterprises would do well to explore intentional industry and technology intersections that propel innovation across boundaries.” While core technologies and enterprise systems may seem exclusive to large enterprises, smaller manufacturers can strategically adopt them to address their unique challenges.

Referring to “core transformation” implies more than digital transformation; AI is poised to disrupt what is, or should be, in the core because it unlocks new, accessible capabilities. This is arguably the beginning of a form of “data democratization” across functions, leveraging both structured and unstructured data sets. The notion of a digital core is more than merely a data repository or functional vault: it is about intellectual property and pan-enterprise dynamic insights, maintained with appropriate levels of consistency, traceability, and security across the relevant data assets.

Collaborations with technology providers or local academic institutions can help small and medium manufacturers access cutting-edge solutions tailored to their needs without heavy upfront investments. Intentional adoption of converging technologies helps secure both immediate and sustained value. Among other things, AI can elevate IT from a support function to a strategic enabler, allowing smaller manufacturers to use AI selectively to drive measurable outcomes.

AI-powered tools, such as shop-floor predictive maintenance, can analyze machine data to predict failures, reducing downtime and costs. Similarly, AI-driven production scheduling can optimize workflows, helping manufacturers meet tight deadlines. These high-impact, low-barrier applications of AI can deliver substantial value for small and medium businesses.
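
As a minimal sketch of how such a tool might flag trouble, the following Python example trains an anomaly detector on readings from a healthy machine and scores new sensor samples. The sensor channels, simulated data, and contamination rate are illustrative assumptions rather than a production design.

    # Minimal predictive-maintenance sketch: flag machine readings that deviate
    # from a healthy baseline. Channels, data, and contamination rate are assumed.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(42)

    # Simulated history for a healthy machine: [vibration (mm/s), temperature (deg C)].
    healthy = np.column_stack([rng.normal(2.0, 0.3, 1000), rng.normal(60.0, 4.0, 1000)])

    model = IsolationForest(contamination=0.02, random_state=0).fit(healthy)

    # New readings from the shop floor; the last one is drifting out of range.
    new_readings = np.array([[2.1, 61.0], [2.3, 63.5], [4.8, 82.0]])
    for reading, label in zip(new_readings, model.predict(new_readings)):
        status = "ANOMALY - schedule inspection" if label == -1 else "normal"
        print(f"vibration={reading[0]:.1f} mm/s, temp={reading[1]:.1f} C -> {status}")

The appeal for smaller firms is that this kind of model needs only the machine's own history, not an enterprise-wide data overhaul.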

Sustainability and scalability as core principles

Deloitte also highlights the importance of balancing sustainability with technological modernization: “The AI revolution will demand heavy energy and hardware resources—making enterprise infrastructure a strategic differentiator once again.” For smaller manufacturers, this presents an opportunity to make strategic decisions that combine scalability with environmental responsibility.

Furthermore, a cloud-first strategy can help small and medium manufacturers reduce costs while enhancing scalability. Cloud services allow businesses to pay only for what they use, easing the financial burden of infrastructure investment. By investing in energy-efficient hardware and renewable energy sources, businesses can align their modernization efforts with sustainability goals.

This intersection of scalability and sustainability also extends to supply chain practices. For instance, AI-powered just-in-time inventory management can help minimize waste and reduce the environmental impact of overproduction. IoT-enabled sensors can track goods in real time, improving logistics efficiency and reducing emissions. These innovations provide operational savings and enhance a manufacturer’s environmental credentials, strengthening their position in the marketplace.
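
A minimal sketch of the inventory side, assuming a forecasting model already supplies demand estimates: the textbook reorder-point formula below turns those estimates into a just-in-time reorder decision. The demand figures, lead time, and service level are illustrative.

    # Reorder-point sketch: classic formula, with the demand forecast hard-coded
    # where an AI model would normally supply it. All numbers are illustrative.
    import math

    def reorder_point(avg_daily_demand: float, demand_std: float,
                      lead_time_days: float, z_service: float = 1.65) -> float:
        """Reorder point = expected lead-time demand + safety stock (z=1.65 ~ 95% service)."""
        safety_stock = z_service * demand_std * math.sqrt(lead_time_days)
        return avg_daily_demand * lead_time_days + safety_stock

    forecast_demand, forecast_std = 40.0, 8.0  # units/day, from a hypothetical forecaster
    rop = reorder_point(forecast_demand, forecast_std, lead_time_days=5)

    on_hand = 180
    print(f"Reorder point: {rop:.0f} units -> {'REORDER NOW' if on_hand <= rop else 'hold'}")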

AI-enabled ways of working

The convergence of AI and enterprise digital technologies offers smaller manufacturers the ability to rethink their entire system of operations. By adopting AI-enabled ways of working, businesses can unlock new levels of scalability and agility. AI maximizes resource utilization, reduces inefficiencies, and enables faster, more accurate decision-making. In addition, AI-powered analytics can uncover hidden patterns, driving innovation in product design and service delivery and giving manufacturers a competitive edge.

AI also shifts operations from reactive to proactive. For example, integrating AI into CRM systems allows manufacturers to anticipate customer needs and adjust production schedules dynamically. AI-powered chatbots and virtual assistants enhance customer interactions, providing instant support and fostering stronger relationships. This can drive significant value to end-users, such as:

  • Improving knowledge management, and in turn, reducing errors and duplication.
  • Minimizing essential non-value-added activities, without requiring complex data and digital transformation investments.
  • Learning from new insights (and enabling new technologies), embedding lessons into continuous improvement opportunities.
  • Driving continuous efficiencies and time-to-market optimization.

The vision described by Deloitte is about an AI-enabled core aligning with what the business is doing, rather than the reverse: “In the truly agentic future, we expect to see more of these kinds of bots that work autonomously and across various systems. Then, maintaining core systems becomes about overseeing a fleet of AI agents.” McKinsey reinforces this perspective in its latest quarterly insights publication, stating: “Companies are rethinking their digital strategies, moving away from massive transformations to more modular approaches that focus on areas of greatest impact.” This modularity ensures that smaller manufacturers can scale AI capabilities incrementally, avoid the risks of large-scale overhauls, and achieve meaningful progress.

Strategic growth through AI

Smaller manufacturers can achieve long-term scalability by focusing on creating ecosystems that support seamless data exchange and collaboration. AI-driven simulations, such as digital twins, can refine processes before implementation, reducing risks and maximizing efficiency. These ecosystems improve productivity while preparing businesses for future technological advancements. Starting with high-impact, low-barrier AI initiatives like predictive maintenance and optimized production scheduling allows manufacturers to achieve immediate benefits. These small-scale efforts can pave the way for broader digital transformation, leading to sustained growth.
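
To make the simulation idea concrete, here is a deliberately toy “digital twin” sketch in Python that compares a production line's output under its current configuration and a proposed one before anything changes on the floor. The cycle times, variability, and shift length are invented for illustration.

    # Toy line-level 'digital twin': estimate units per shift for two configurations.
    # Cycle times, jitter, and shift length are invented for illustration.
    import random

    def simulate_shift(cycle_a: float, cycle_b: float, shift_minutes: float = 480.0,
                       jitter: float = 0.15, seed: int = 1) -> int:
        """Count units completed in one shift; station times vary by +/- jitter."""
        random.seed(seed)
        clock, units = 0.0, 0
        while True:
            t = (cycle_a * random.uniform(1 - jitter, 1 + jitter)
                 + cycle_b * random.uniform(1 - jitter, 1 + jitter))
            if clock + t > shift_minutes:
                return units
            clock += t
            units += 1

    baseline = simulate_shift(cycle_a=6.0, cycle_b=4.5)  # current line
    proposed = simulate_shift(cycle_a=4.8, cycle_b=4.5)  # hypothetical optimized schedule
    print(f"baseline: {baseline} units/shift, proposed: {proposed} units/shift")

A production-grade twin would model many more stations, buffers, and failure modes, but the principle is the same: test the change virtually, then commit.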

As ERP systems and other core technologies transform into intelligent platforms, leveraging AI to provide dynamic, real-time insights instead of relying on static records, PDM and wider PLM systems are poised to embrace similar advancements. The adoption of AI-driven PLM systems is already underway in some forward-thinking organizations, and the wider industry is quickly following suit. While transitioning from legacy systems can be complex, the promise of intelligent, predictive PLM systems is worth the effort. As AI technology matures and platforms become increasingly interconnected, enterprise platforms will evolve into dynamic, proactive solutions that enable manufacturers to make smarter, data-driven decisions and unlock new opportunities for growth and innovation.

Digital transformation and AI certainly offer smaller manufacturers a clear path toward scalability and competitiveness, provided they are not afraid of experimenting. By strategically adopting converging technologies, prioritizing sustainability, and gradually integrating AI into operations, small and medium manufacturers can modernize their processes without overstretching resources. This incremental approach can foster resilience and agility, ensuring that businesses evolve alongside the technological advancements that will define the future of manufacturing.

Lessons from 2024 and what innovators and engineers can expect from 2025 https://www.engineering.com/lessons-from-2024-and-what-innovators-and-engineers-can-expect-from-2025/ Fri, 20 Dec 2024 13:15:00 +0000 https://www.engineering.com/?p=135037 The task ahead is clear: cross-functional collaboration, systems interoperability, and business-digital strategy alignment.

The manufacturing industry in 2024 was a proving ground for transformative technologies. From AI-driven efficiency to the adoption of edge computing, product developers and manufacturing engineers learned critical lessons about what works—and what does not—in the relentless pursuit of speed and innovation.

Looking ahead to 2025, the focus will shift from technology adoption to mastery. This is about more than tools—it is about driving measurable outcomes through integrated and strategic approaches. Let’s break down the key takeaways from 2024 and explore the priorities for 2025.

2024: A year of ambitious gains and sobering realities

1. AI for Automation and Predictive Analytics

AI proved itself as a game-changer in 2024, enabling automation, predictive maintenance, and workflow optimization. However, achieving measurable ROI remained elusive for many. Successful organizations treated AI as a strategic investment aligned with business goals, not just a shiny tool.

Key lessons learned include:

  • Companies that excelled used AI for defect detection, energy optimization, and production adjustments.
  • Filling gaps in data quality and platform integration was a critical step toward unlocking AI’s value.
  • Manufacturers need implementation roadmaps with measurable KPIs to ensure AI adoption delivers real results.

2. Robust Digital Thread Foundations

Seamless integration across PLM, ERP, and MES platforms emerged as a significant competitive advantage in 2024—but not without challenges. Establishing digital threads required technical rigor, governance, and cross-functional alignment of processes and data.

Key lessons learned include:

  • End-to-end integration enabled faster design cycles, fewer production errors, and smoother operations.
  • Engineers played a pivotal role as data stewards, ensuring accuracy and driving adoption of best practices.
  • Software providers must prioritize not just APIs but also functional use cases that deliver measurable outcomes.

3. Decentralized Processing and Edge Computing

Edge computing revolutionized data processing by bringing it closer to the production floor. This enabled real-time analytics, adaptive robotics, and dynamic production controls—allowing machines to make decisions autonomously.
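
A toy sketch of this kind of on-device decision-making: the control loop below acts locally on a sensor reading without waiting on a cloud round-trip. The sensor stub and the temperature limit are placeholder assumptions.

    # Edge-control sketch: decide on-device, keep the network out of the fast path.
    # The sensor stub and the temperature limit are placeholder assumptions.
    SPINDLE_TEMP_LIMIT = 75.0  # deg C, hypothetical safe operating limit

    def read_spindle_temp() -> float:
        """Stand-in for a real sensor driver; returns a fixed sample here."""
        return 78.2

    def control_step() -> str:
        temp = read_spindle_temp()
        if temp > SPINDLE_TEMP_LIMIT:
            # Act immediately on-device; escalate to the cloud asynchronously later.
            return "SLOW_FEED_RATE"
        return "CONTINUE"

    print(control_step())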

Key lessons learned include:

  • Decentralized systems introduced complexities, requiring a balance of performance, security, and infrastructure resilience.
  • Engineers faced challenges ensuring secure, uninterrupted data flows across networks.
  • Scaling edge computing depends on effectively managing infrastructure robustness and agility.

4. Security and Network Infrastructure

As connected devices proliferated, manufacturers became prime targets for sophisticated cyberattacks. In response, zero-trust frameworks and AI-powered threat detection became essential.

Key lessons learned include:

  • Security must be embedded from the design phase, not added as an afterthought.
  • Operational technology (OT) and IT systems convergence created new vulnerabilities requiring proactive management.
  • Cybersecurity resilience is no longer optional—it is fundamental to digital transformation.

2025: from adoption to mastery

In 2025, manufacturers will move beyond technology adoption and focus on integrating tools to drive strategic business outcomes. Success will depend on scalability, governance, and workforce enablement as organizations shift from experimentation to enterprise-wide transformation.

Here is what lies ahead and how manufacturing engineers can lead the charge:

1. AI as the Orchestrator

AI will evolve from incremental improvements to a central orchestrator of production processes, supply chains, and energy efficiency. To enable this transformation, engineers must focus on building robust, scalable infrastructures.

Key priorities include:

  • Data governance: Is the data fueling AI clean, accurate, and accessible?
  • Interoperability: Does AI seamlessly integrate with PLM, MES, ERP, and other systems?
  • Scalability: Can AI solutions adapt to growing production complexities?
  • Workforce enablement: Are AI insights designed to empower human decision-making?

Mastering AI in 2025 means ensuring it drives autonomous, intelligent, and value-driven operations.

2. Scalable, On-Demand Solutions

Cloud-based solutions and Everything-as-a-Service (XaaS) will dominate manufacturing in 2025, offering flexibility, reduced costs, and faster time-to-value. Manufacturing engineers must bridge legacy systems with cloud platforms to enable this transition.

Key priorities include:

  • Integration roadmaps: How will legacy systems connect to cloud-native platforms?
  • Cost-value balance: What are the trade-offs of XaaS versus on-premises systems?
  • Infrastructure optimization: Are cloud resources configured for performance and minimal latency?
  • Complexity management: How will hybrid environments with legacy and cloud systems be managed?

Success will depend on striking the right balance between scalability and operational stability.

3. Non-Negotiable Cybersecurity Foundation

With AI, IIoT, and edge computing driving increased connectivity, cybersecurity must be embedded into every phase of the manufacturing lifecycle. Engineers will play a leading role in ensuring systems are resilient and secure.

Key priorities include:

  • Security-first design: Are systems designed with embedded security protocols from the outset?
  • OT-IT convergence: How are production systems safeguarded while integrating with IT systems?
  • Real-time risk monitoring: Are AI-driven tools used to proactively detect and mitigate threats?
  • Team education: How can engineers enable teams to recognize and respond to cyber risks?

In 2025, neglecting cybersecurity will carry far greater risks than the cost of proactive investments.

4. IIoT: The Standard for Operations

In 2025, IIoT will transition from an innovation to a baseline standard for all manufacturers. The real challenge will be mastering IIoT to drive clear, actionable results.

Key priorities include:

  • Actionable insights: How will raw sensor data be translated into strategic actions?
  • Device interoperability: Are diverse IIoT platforms seamlessly connected?
  • Scalability: Can IIoT systems grow alongside production demands?
  • Security: How are connected devices protected from vulnerabilities?

Manufacturing engineers must focus on turning IIoT data into decisions that enhance efficiency, reduce costs, and drive competitive advantage.
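
As one hedged example of turning raw signals into decisions, the sketch below folds a stream of machine status events into a rolling utilization figure and raises an alert when it drops below a target. The event format, window size, and 85% target are assumptions for illustration.

    # IIoT sketch: fold raw machine-status events into a rolling utilization figure
    # and alert below a target. Event format, window, and target are assumptions.
    from collections import deque

    WINDOW = 12                    # twelve 5-minute intervals = one rolling hour
    TARGET_UTILIZATION = 0.85

    events = deque(maxlen=WINDOW)  # 1 = machine producing, 0 = idle or faulted

    def ingest(status_flag: int) -> None:
        events.append(status_flag)
        utilization = sum(events) / len(events)
        if len(events) == WINDOW and utilization < TARGET_UTILIZATION:
            print(f"ALERT: utilization {utilization:.0%} below target -> investigate line")

    for flag in [1, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0]:  # simulated feed
        ingest(flag)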

Manufacturing in 2025: the road ahead

The digital transformation of manufacturing cannot succeed in isolation. It requires alignment with enterprise-wide strategies, seamless platform interoperability, and clear business outcomes. For manufacturing engineers, the task is clear:

  • Champion cross-functional collaboration.
  • Ensure systems interoperability.
  • Drive strategic, value-driven adoption of transformative technologies.

The lessons of 2024 are unmistakable: transformation requires more than tools; it requires deliberate integration, governance, and execution. By mastering AI, edge computing, IIoT, and cybersecurity, engineers will enable smarter factories, faster production, and resilient operations.

For those prepared to lead, 2025 holds immense potential. Are you ready to navigate the future of manufacturing?

Digital twin of an organization: scalable agility, resiliency https://www.engineering.com/digital-twin-of-an-organization-scalable-agility-resiliency/ Mon, 02 Dec 2024 21:06:03 +0000 https://www.engineering.com/?p=134491 Are DTOs yet another enterprise-level strategy or a scalable solution for SMEs, start-ups and innovators?

The concept of Digital Twin of an Organization (DTO) is broadly defined by tech consultancy Gartner as “a dynamic software model of any organization that relies on operational and contextual data to understand how an organization operationalizes its business model, connects with its current state, responds to changes, deploys resources and delivers customer value.”

This idea emerged from the rise of digital twins in industries like manufacturing, healthcare and aerospace. Originally, digital twins focused on replicating physical systems for simulation and optimization, but the concept has now evolved to include entire organizations.

DTOs aim to monitor and simulate internal operations. This makes it easier to anticipate changes, streamline processes and accurately implement strategic initiatives. Companies like SAP and Fujitsu are pioneering DTO frameworks, integrating business process modeling with AI and performance monitoring to guide organizations in adopting enabling technology effectively, optimizing business processes, and addressing complex digitally-enabled challenges.

So, is DTO just another enterprise-level strategy? Or does it offer potential as a scalable solution for start-ups, particularly those eager for rapid growth?

Evolving beyond traditional digital twins

Product digital twins are commonly described as valuable tools in engineering and manufacturing, where they simulate the performance of physical assets like machinery, vehicles, or complex devices. While traditional digital twins target individual assets—such as products or machinery—to optimize maintenance and predict failures, DTOs take a broader approach. They extend beyond the industrial and physical domains to encompass an organization’s entire structure, including processes, people, systems and data. This approach enables companies to create a digital representation of how they operate, make decisions and deliver value, integrating data from various sources (processes, technology, personnel and even customer interactions) to provide a unified view of the entire business.

Therefore, a DTO aims to be a comprehensive digital replica of an organization that integrates operational and contextual data to provide a holistic view of its operations—beyond enterprise platforms and associated disciplines like PLM, ERP, CRM and others. A shift from asset-specific focus to organization-wide modeling allows companies to simulate their operations as a whole—offering valuable insights into strategy, operations and customer engagement. The goal is not merely to optimize parts but to enhance the entire organizational ecosystem—certainly a mandate of the Chief Data Officer (CDO) to orchestrate.

The rise in digital and business transformation initiatives has fueled interest in DTOs. By providing a virtual model of an organization, DTOs enable scenario planning, risk forecasting and effective prioritization of projects. The advancements in AI, machine learning, IoT and process mining have made sophisticated DTO solutions more feasible and practical.

Why DTOs can matter for businesses of all sizes

DTOs empower companies of any size to operate with greater agility and precision, providing them both immediate and long-term benefits:

  • Data-driven decision-making: By leveraging real-time data, DTOs enable companies to model different scenarios, predict outcomes and choose the best course of action. This reduces uncertainty and supports informed decision-making.
  • Enhanced operational efficiency: Organizations can pinpoint inefficiencies, remove process bottlenecks and improve workflows with DTOs. This leads to smarter resource management and potentially streamlined operations.
  • Cost savings and risk mitigation: DTOs provide the ability to test and validate changes before implementation, reducing the risk of costly mistakes. Organizations can prioritize investments that promise the greatest impact and cost-effectiveness.
  • Strategic agility: DTOs enable companies to respond to market changes quickly, anticipate customer needs and maintain alignment with business goals. This agility is vital for both start-ups scaling up and established enterprises seeking to stay competitive.

Scaling with DTOs

DTOs can offer specific advantages for start-ups and companies looking to scale their operations. They allow smaller organizations to simulate their growth trajectory, anticipate challenges and align technology with operational structures early on. Such a proactive approach can help start-ups maintain agility while managing growth, avoiding pitfalls and ensuring that resources are used effectively. By aligning the organizational structure with the technology roadmap, DTOs provide a framework for sustainable expansion, ensuring that growth is both efficient and controlled.

For larger, more mature organizations, DTOs are likely to facilitate deeper insights into complex operations, enabling continuous refinement and innovation. Companies can use DTOs to build resilience, manage disruptions and sustain performance even in volatile market conditions.

By using DTOs, organizations can model different ‘what-if’ scenarios, anticipating supply chain disruptions, market changes, or internal challenges. This proactive capability supports an adaptable and resilient business structure, critical for any company in today’s fast-paced digital environment.
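
A minimal sketch of such a ‘what-if’ comparison, assuming a highly simplified model of the organization: the Python example below contrasts baseline quarterly output with a scenario in which a key supplier outage degrades capacity for two weeks. All figures are illustrative.

    # 'What-if' sketch for a DTO: baseline vs. a two-week supplier outage.
    # Capacity, quarter length, and the degraded factor are illustrative.
    def quarterly_output(weekly_capacity: float, weeks: int = 13,
                         disrupted_weeks: int = 0, degraded_factor: float = 0.4) -> float:
        """Units produced in a quarter when some weeks run at reduced capacity."""
        normal = (weeks - disrupted_weeks) * weekly_capacity
        degraded = disrupted_weeks * weekly_capacity * degraded_factor
        return normal + degraded

    baseline = quarterly_output(weekly_capacity=1000)
    disruption = quarterly_output(weekly_capacity=1000, disrupted_weeks=2)
    print(f"baseline: {baseline:.0f} units, outage scenario: {disruption:.0f} units "
          f"({1 - disruption / baseline:.1%} shortfall)")

A real DTO would, of course, draw these parameters from live operational data rather than constants; the value of even a crude model is that the scenario can be argued over numerically before the disruption happens.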

DTOs as a growth blueprint

In essence, DTOs provide a blueprint for growth and continuous improvement. They allow companies to explore new ideas, test them safely and scale with confidence. Whether refining existing operations or planning future expansions, DTOs enable organizations to:

  • Simulate and forecast: DTOs allow businesses to test scenarios before they occur, reducing risk and improving decision-making.
  • Optimize performance: By identifying inefficiencies and realigning resources, DTOs help organizations operate at peak efficiency.
  • Drive innovation: DTOs foster a culture of experimentation, where new strategies can be safely modeled and tested without the fear of costly failures.
  • Align with strategy: DTOs ensure that every aspect of the organization is aligned with strategic goals, from day-to-day operations to long-term objectives.

By combining the power of real-time data, AI-driven insights and predictive modeling, DTOs are not just tools for efficiency—they are strategic enablers of growth, innovation and resilience. For companies ready to navigate the complexities of the modern business landscape, embracing a DTO can be the key to achieving sustainable success. Considering DTOs, the first question might well be where to start to avoid ‘boiling the ocean’ and getting overwhelmed by complexity. Interestingly, per a BMC blog post from 2020, Stephen Watts highlighted that “The obvious first step in creating a DTO is developing a virtual representation of the organization that is accurate and comprehensive. Once a model is created, teams can analyze and interpret data to learn more about systems and processes while anticipating issues and areas of concern […] For organizations that are considering implementing a DTO, it is a good idea to start with a small project and then gradually scale up from there.”

DTOs are not only about operational efficiency; they can become a fundamental enabler of innovation, allowing organizations to test new ideas, reduce costs and maximize impact. As digital transformation continues to reshape industries, DTOs are likely to play a critical role in helping companies not only survive but thrive amid change.
