PLM/ERP - Engineering.com https://www.engineering.com/category/technology/plm-erp/ Tue, 17 Jun 2025 17:08:50 +0000

Duro reboots its PLM platform for AI https://www.engineering.com/duro-reboots-its-plm-platform-for-ai/ Tue, 17 Jun 2025 16:24:00 +0000 Duro Design is a ground-up revamp of Duro’s cloud PLM platform, and in other news, Onshape gets MBD.

The post Duro reboots its PLM platform for AI appeared first on Engineering.com.

You’re reading Engineering Paper, and here are the latest headlines from the world of design and simulation software.

Duro, the cloud-based PLM provider, has relaunched its platform as Duro Design.

Michael Corr, co-founder and CEO of Duro, told me that the change is more than just a new product name. “It’s really a new product… a revolution of what we’re doing, not just an incremental evolution,” he said.

Duro first launched its eponymous PLM platform in 2018, targeting startups and small-to-medium businesses looking for a quick and modern alternative to that jack-of-all-trades, Excel.

“We were very limited in functionality and very automated and opinionated, because we just helped customers implement industry best practices out of the box,” Corr said.

Since then, Corr said, the market for modern PLM tools has evolved. “The level of innovation that’s happening today is unprecedented,” he said, referring both to new hardware companies and the SaaS software startups catering to them. Duro’s customers wanted more capability, configurability and compatibility, and Corr saw that the platform could either adapt or harden into the same kind of stale PLM tool it had been built to disrupt.

“We recognized there was a unique small window to just completely revamp our platform and really meet what this market had evolved to be,” Corr said.

Duro Design is that revamp. Duro’s legacy PLM platform will be phased out and the company will help existing customers migrate to Duro Design.

So what’s the difference? A big part of it, as you might imagine, is AI. Corr describes Duro Design as “AI-native,” a phrase which I asked him to define (lest it come across as marketing fluff).

“Deep refactoring of our platform allowed us to leverage what was becoming the best practices for building AI-based tools,” Corr told me. “We changed our database structure, we changed our API structure, so that AI technologies, LLMs and even generative AI capabilities, were being built natively in the core of our platform, versus being a bolt-on after the fact.”

Screenshot of Duro Design. (Image: Duro.)

For example, Duro Design uses AI for natural language search, helping users more easily sort through heaps of design data. Users can also manage their PLM environment with AI by prompting changes to the underlying YAML configuration language (YAML ain’t markup language, if you’re a fan of backronyms). Duro Design also uses AI to analyze change orders and provide predictions and recommendations, according to Corr.
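Duro hasn’t published the details of that YAML configuration here, but the general pattern — an assistant proposes a change to a declarative config, which the user reviews before it is applied — can be sketched with a plain dictionary standing in for the parsed YAML. Every field name below is hypothetical, not Duro’s actual schema:

```python
# Hypothetical PLM workspace configuration, as it might look once a
# YAML file has been parsed into a Python dict (field names invented).
config = {
    "categories": [
        {"name": "Electrical", "prefix": "ELE", "digits": 4},
        {"name": "Mechanical", "prefix": "MEC", "digits": 4},
    ],
    "change_orders": {"approvers_required": 1},
}

def apply_patch(config, path, value):
    """Apply one suggested change, addressed by a dotted key path."""
    keys = path.split(".")
    node = config
    for key in keys[:-1]:
        node = node[key]
    node[keys[-1]] = value
    return config

# e.g. an assistant suggests tightening the approval policy:
apply_patch(config, "change_orders.approvers_required", 2)
print(config["change_orders"])  # {'approvers_required': 2}
```

The point of routing AI suggestions through a declarative config like this is that every proposed change is diffable and reviewable before it touches the PLM environment.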

AI isn’t the only difference. Sandwich in a P and you get another tentpole of Duro Design: API.

“Following an API first approach, every single feature that we offer is exposed through the API,” Corr said, in contrast to the more limited API of the legacy platform. “[Users] can reconfigure their account as they wish. They could build their own integrations. They can even build their own front end web client if they wanted to.”
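What “API first” means in practice is that the web client is just one consumer among many. A bare-bones sketch of such a client — the base URL, endpoint paths, and auth scheme below are hypothetical, not Duro’s documented API — might look like:

```python
import json
import urllib.request

class PLMClient:
    """Minimal sketch of a client for an API-first PLM system.
    Endpoints and auth scheme are invented for illustration."""

    def __init__(self, base_url, token):
        self.base_url = base_url.rstrip("/")
        self.token = token

    def url_for(self, path):
        # Normalize so "https://host/" + "/path" doesn't double the slash.
        return self.base_url + path

    def request(self, method, path, body=None):
        """Build and send one JSON request (not executed in this sketch)."""
        req = urllib.request.Request(
            self.url_for(path),
            data=json.dumps(body).encode() if body is not None else None,
            method=method,
            headers={
                "Authorization": f"Bearer {self.token}",
                "Content-Type": "application/json",
            },
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

client = PLMClient("https://plm.example.test/", "demo-token")
print(client.url_for("/v1/components?q=resistor"))
# https://plm.example.test/v1/components?q=resistor
```

With every feature behind endpoints like these, a custom integration or even a replacement front end is just another program holding a token.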

As far as integrations go, Duro offers plenty of its own with add-ins for Solidworks, NX (or should I say Designcenter), Altium 365, Onshape and more.

Speaking of Onshape…

Onshape gets MBD

PTC announced that its cloud CAD platform Onshape will soon be capable of model-based definition (MBD). The feature is in “an early visibility program with select customers,” according to the press release, “and is expected to be generally available in late 2025.”

What is MBD? There are many bickering definitions for this engineering acronym, but when it stands for model-based definition it refers to annotating a 3D model with manufacturing data such as materials, dimensions, tolerances and the like. It’s an alternative to the standard 2D drawings that everyone loves to hate (but that don’t seem to be going away anytime soon).

MBD in Onshape. (Image: PTC.)

“Our new MBD capabilities remove the need to interpret 2D drawings by embedding PMI [product manufacturing information] directly into the 3D model,” David Katzman, PTC’s general manager of Onshape and Arena, said in the release. “And because Onshape is cloud-native, this information is instantly accessible to everyone who needs it, from any device and any location. It’s a major step forward in making MBD practical and scalable for real-world use.”
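What “embedding PMI directly into the 3D model” looks like varies by vendor, but the core idea — machine-readable annotations attached to model features rather than drawing views — can be sketched as a simple data structure. The schema below is illustrative, not PTC’s or anyone else’s actual data model:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Annotation:
    """One piece of PMI attached to a model feature (illustrative schema)."""
    feature_id: str   # which geometry the note applies to
    kind: str         # e.g. "dimension", "tolerance", "material"
    value: str
    unit: Optional[str] = None

@dataclass
class ModelPMI:
    part_number: str
    annotations: List[Annotation] = field(default_factory=list)

    def for_feature(self, feature_id: str) -> List[Annotation]:
        """All PMI on one feature — what a CAM or inspection tool
        would query instead of interpreting a 2D drawing."""
        return [a for a in self.annotations if a.feature_id == feature_id]

pmi = ModelPMI("BRACKET-001")
pmi.annotations.append(Annotation("hole_1", "dimension", "6.35", "mm"))
pmi.annotations.append(Annotation("hole_1", "tolerance", "+0.05/-0.00", "mm"))
pmi.annotations.append(Annotation("body", "material", "6061-T6 aluminum"))

print([a.kind for a in pmi.for_feature("hole_1")])  # ['dimension', 'tolerance']
```

Because each annotation is tied to a feature ID rather than a drawing view, downstream tools can query exactly the manufacturing data they need.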

PTC is showing off Onshape’s MBD this week at the Paris Air Show with its customer Aura Aero (June 16–19, 2025, Zone B4). Check it out if you’re in town (but you might want to stay away from the Louvre).

Design and Simulation Week 2025

If you’re not already counting down the days to Engineering.com’s annual Design and Simulation Week, here’s your 27-day warning.

Running from July 14 – 18, 2025, this series of expert webinars will explore the top trends in engineering software from some of the leading voices in the industry (and me). You’ll learn about AI, automation, multiphysics and how to make the most of modern tools.

Register for Design and Simulation Week now and start counting.

One last link

DiffusionRenderer is a pretty cool new AI-based rendering tool from Nvidia. It takes a 2D video and strips out the geometry and material info in order to plug in different lighting conditions, like changing a scene from day to night. In addition to its creative applications, Nvidia says it’ll help generate synthetic data for robotics training and autonomous vehicle development.

Got news, tips, comments, or complaints? Send them my way: malba@wtwhmedia.com.

In the rush to digital transformation, it might be time for a rethink https://www.engineering.com/in-the-rush-to-digital-transformation-it-might-be-time-for-a-rethink/ Tue, 03 Jun 2025 15:03:32 +0000 https://www.engineering.com/?p=140223 One of the main themes from the PLM Road Map and PDT North America event was just how much we still have to learn about going digital.

The post In the rush to digital transformation, it might be time for a rethink appeared first on Engineering.com.

In the breakneck pace of digital transformation, is comprehension being left behind? Do we need a rethink? No one at PLM Road Map and PDT North America—a leading gathering of product lifecycle management (PLM) professionals held in collaboration with BAE Systems’ Eurostep organization—said that, at least not in so many words, but presentations by one user after another raised the issue.

In my opening presentation, I confronted these issues by positioning PLM as a strategic business approach, thereby joining it to digital transformation, which has been CIMdata’s focus for more than four decades. And in the conference’s thought leadership vignettes, multiple PLM solution providers stressed connectivity and new tools to aid understanding and comprehension; in these vignettes, many supported my positioning of PLM.

The issues of comprehension were presented to conference attendees from several points of view. Many presenters delved into data and information quality—accuracy, completeness, structure, ownership, possible corruption, its exploding volume, and the steady growth of regulation.

Some numbers that made many attendees uncomfortable:

• There are hundreds of engineering software tools and new ones appear every week. Every engineering organization uses dozens of tools, systems, solutions, “apps,” and platforms; their constant updates are often disruptive to users

• About 800 standards apply to engineering information and its connections to the rest of the enterprise, said Kenneth Swope, The Boeing Co.’s Senior Manager for Enterprise Interoperability Standards and Supply Chain Collaboration

• 30 terabytes of data are generated in CAD and manufacturing for each of the hundreds of engines produced by Rolls-Royce PLC every year, reported Christopher Hinds, Head of Enterprise Architecture. Some output files from CFD analyses exceed 650 GB per part, he added.
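To put those figures in perspective, a rough back-of-the-envelope tally — treating “hundreds of engines” as roughly 300 per year, which is an assumption, not a Rolls-Royce figure — runs:

```python
# Back-of-envelope scale check for the figures quoted above.
engines_per_year = 300   # assumption: "hundreds" read as ~300
tb_per_engine = 30       # CAD + manufacturing data per engine
cfd_file_gb = 650        # a single large CFD output file, per part

total_tb = engines_per_year * tb_per_engine
print(f"~{total_tb:,} TB/year, i.e. about {total_tb / 1000:.0f} PB/year")
# ~9,000 TB/year, i.e. about 9 PB/year
print(f"one CFD file alone is ~{cfd_file_gb / 1000:.2f} TB")
# one CFD file alone is ~0.65 TB
```

Numbers on that order of magnitude explain why data quality, ownership, and governance dominated so many of the presentations.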

Speakers also discussed how digital transformation is revealing the shortfalls in comprehension of data and information. “If we can’t agree on what data is, we can’t use it,” observed Swope. These shortfalls are caused by accelerated product development, shorter product lifecycles, and an explosion of product modifications and differentiations thanks to the software now embedded in every product.

A graphic construction of the comprehension challenges in digital transformation. (Image: CIMdata Inc.)

In my conference-opening presentation, “PLM’s Integral Role in Digital Transformation,” I stressed that companies need to think beyond digitizing data, that merely converting analog data to digital isn’t enough. Yes, digitalization is at the core of an organization’s digital transformation … but moving to a digital business requires rethinking many organizational structures and business processes as well as understanding the growing value of data.

So how does PLM fit into this? Only by seeing PLM as a strategic business approach can its depth and breadth in the reach of digital transformation be comprehended. PLM concentrates the organization’s focus on the collaborative creation, use, management, and dissemination of product-related intellectual assets—a company’s core asset. This makes PLM the platform for integrating external entities into lifecycle processes—thereby enabling end-to-end (E2E) connectivity … and the optimization of associated business functions and entities throughout the lifecycle.

Don’t forget, I cautioned, that the data generated from your products and services often becomes more valuable than the products themselves. Why? Because product data touches all phases of a product’s life, these digital assets play a central role in an enterprise’s digital transformation. Hence I warned that digital transformation will collapse without the implementation of the right set of data governance policies, procedures, structure, roles, and responsibilities.

Many presenters also noted how PLM and digital transformation are helping them deal with the challenges of stiffer competition, rising costs, downward pressure on pricing, customer demands for more functionality and longer service lives, data-hungry Artificial Intelligence (AI), and Product as a Service (PaaS) business models.

And while all these factors aggravate the issues I addressed, speakers expressed confidence that they will eventually reap the benefits of PLM and digital transformation—starting with getting better products to market sooner and at lower cost.

Another challenge with digital transformation and comprehension is the multitude of ways that presenting companies organize and identify their engineering systems and functions. All these manufacturers use basically the same processes to develop and produce a new product or system, but these tasks are divided up in countless ways; no two companies’ product-development nomenclatures are the same.

Sorting this out is crucial to the understanding and comprehension of the enterprise’s data and information. Gaining access to departmental “silos” of data is increasingly seen as just the beginning of digging information out of obsolete “legacy” systems and outdated formats.

Dr. Martin Eigner’s concept of the extended digital thread integrated across the product lifecycle. (Image: Eigner Engineering Consult.)

In the conference’s Day One keynote presentation, Martin Eigner of Eigner Engineering Consult, Baden-Baden, Germany, spoke on “Reflecting on 40 Years of PDM/PLM: Are We Where We Wanted to Be?” The answer, of course, is both yes and no.

Dr. Eigner expressed his frustration in PLM’s fragmented landscape. We are still tied to legacy systems (ERP, MES, SCM, CRM) that depend on flawed interfaces reminiscent of outdated monolithic software, he pointed out. As digitalization demands and technologies like IoT, AI, knowledge graphs, and cloud solutions continue to grow, the key question is: Can the next generation of PLM solutions meet the challenges of digital transformation with the advanced, modern software technologies available?

“The vision of PLM still exists,” Dr. Eigner continued, “but the term was hijacked in the late 1990s while the PLM vision was still being discussed. Vendors of product data management (PDM) solutions applied the term for their PDM offerings” which “mutated from PDM to PLM virtually overnight.”

“Ultimately,” he noted, “business opportunities and ROI will be significantly boosted by the overarching Digital Thread on Premise or as a Service,” leveraged with “knowledge graphs connected with the Digital Twin.” Applying “generative AI can optionally create an Omniverse with enhanced data connectivity and traceability.”

This stage of digital transformation, he summarized, “will improve decision making and support AI application development.” In turn, these “will revolutionize product development, optimize processes, reduce costs, and position the companies implementing this at the forefront of their industries. And we are coming back to our original PLM vision as the Single Source of Truth.”

Uncomfortably ambitious productivity improvements with AI and digital transformation. (Image: GE Aerospace.)

The challenges of getting this done were addressed by Michael Carlton, Director, Digital Technology PLM Growth at GE Aerospace, Evendale, Ohio, using what he termed “developing a best-in-class Enterprise PLM platform to increase productivity and capacity amid rising demands for digital thread capabilities, technology transformation, automation, and AI.” His remedies included “leveraging AI, cloud acceleration, observability, analytics, and automation techniques.”

“Uncomfortably ambitious productivity improvements,” Carlton continued, include “reduction in PLM environment build cycle time, parallel development programs on different timelines, shifting testing left (i.e., sooner), improved quality throughout, automated data security tests, and growing development capacity.”

IDC slide showing how PLM maintains the digital threads that define the product ecosystem by weaving together product development, manufacturing, supply chain, service to balance cost, time, and quality. (Image: IDC.)

The issue of PLM and the boardroom was raised in a presentation by John Snow, Research Director, Product Innovation Strategies, at International Data Corp. (IDC), Needham, Mass. In his data-packed Day 2 keynote, Snow detailed how complex this issue is and the “disconnect between corporate concerns and engineering priorities.”

PLM, observed Snow, “maintains the digital threads that define the product ecosystem: weaving together product development, manufacturing, the supply chain, and service to balance cost, time, and quality.”

The opportunity for engineering in the boardroom is that “80% of product costs is locked in during design”; however, the Cost of Goods Sold (COGS) is 10X to 15X higher than the Cost of R&D (CR&D), Snow explained.
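Snow’s ratios make the leverage argument concrete. A purely illustrative calculation — the dollar figures are invented; only the 10X–15X ratio and the 80% design lock-in come from the talk:

```python
# Illustrative only: dollar figures invented; ratios from Snow's talk.
crd = 10_000_000              # annual cost of R&D (invented)
cogs = 12 * crd               # COGS at 12X CR&D, inside the 10X-15X range

cogs_reduction = 0.01         # suppose better design trims COGS by just 1%
savings = cogs * cogs_reduction

print(f"COGS: ${cogs:,}")                   # COGS: $120,000,000
print(f"1% COGS savings: ${savings:,.0f}")  # 1% COGS savings: $1,200,000
print(f"...which is {savings / crd:.0%} of the entire R&D budget")
# ...which is 12% of the entire R&D budget
```

On these assumptions, a design improvement that shaves even 1% off COGS pays back a double-digit share of the entire engineering budget — the boardroom argument in miniature.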

“Poor product design,” Snow continued, “has an outsized impact on COGS, but good design does,” too. Thus, “increasing the engineering budget can have a big impact on profits (if properly allocated).” “Current efforts to leverage design for manufacturing & assembly (DFM/A) are falling short,” he added.

Hollister’s roller-coaster journey toward PLM showing key decision points; the loop indicates a stop and restart. (Image: Hollister Inc.)

Near the other end of the corporate size scale from GE Aerospace is Hollister Inc., Libertyville, Ill., an employee-owned medical supplies manufacturer of ostomy and continence products. Stacey Burgardt, Hollister’s Senior Program Manager for PLM, addressed PLM implementation challenges in her presentation on “The Role of Executive Sponsorship in PLM Success at Hollister.”

Burgardt, formerly R&D and Quality Leader, outlined Hollister’s PLM vision as three transformations:

• To product data centric from document centric

• To digital models from drawings, and

• To live collaboration and traceability from systems of record.

In her appeal to sponsors, Burgardt estimated total expected benefits through 2030 at $29 million. This sum included significant gains from improved efficiency of associates, lower software costs, and reduced waste, scrap, and rework.

Unlike every other presenter, Hollister has yet to implement PLM, though not for lack of effort dating back to 2018. Hollister is currently finalizing PLM solution selection and planning. Burgardt focused on the need for executive sponsorship and strategies to secure it. “Identify the right executive sponsors in the governance model, including the CEO and CFO,” she said, “and the leaders of the main functions that PLM will impact, and someone who has seen a successful PLM who can advocate.

“Be persistent,” she concluded, “and be adaptable.” Address sponsors’ concerns, and “if it’s not the right time, keep the embers burning and try again.”

And this led to my conference summation topic: sponsorship. The fact that PLM and digital transformation are now recognizably tougher and will take longer than once hoped led to my Executive Spotlight panel discussion at the end of Day 2: “The Role of the Executive Sponsor in Driving a PLM Transformation.” My four panelists agreed high-level sponsorships are indispensable … and we discussed how to identify, enlist, and maintain those sponsorships.

To conclude, looking back over the two days’ presentations, I think the answer is “yes” to my questions in the first paragraph. And the sooner this rethink gets going the better.

[Survey Report] Complexity Overload and Bottleneck Struggles: The Hidden Costs of Manual PLM Testing https://www.engineering.com/resources/survey-report-complexity-overload-and-bottleneck-struggles-the-hidden-costs-of-manual-plm-testing/ Wed, 28 May 2025 19:24:04 +0000 https://www.engineering.com/?post_type=resources&p=140096 Manual testing slows engineering teams down — and the data proves it. In this exclusive survey of 115 engineering professionals, discover why manual PLM software testing is no longer sustainable for today’s complex environments. Key findings include: Explore the real-world impact of manual testing bottlenecks, and see why so many organizations are urgently seeking automation […]

The post [Survey Report] Complexity Overload and Bottleneck Struggles: The Hidden Costs of Manual PLM Testing appeared first on Engineering.com.

Manual testing slows engineering teams down — and the data proves it. In this exclusive survey of 115 engineering professionals, discover why manual PLM software testing is no longer sustainable for today’s complex environments.

Key findings include:

  • 92% of teams have postponed or canceled releases due to incomplete manual testing
  • 66% require at least one week — and 34% need two or more weeks — for a full regression pass
  • 88% can only handle 1–2 integrated systems manually, limiting test coverage
  • And more

Explore the real-world impact of manual testing bottlenecks, and see why so many organizations are urgently seeking automation to accelerate releases, improve quality, and empower their teams.

Your download is sponsored by Keysight Technologies.

Aras Software at 25: PLM transformation through connected intelligence https://www.engineering.com/aras-software-at-25-plm-transformation-through-connected-intelligence/ Sat, 17 May 2025 13:01:53 +0000 https://www.engineering.com/?p=139728 Its trajectory mirrors the wider PLM market shift—from rigid systems to flexible, integrated platforms.

The post Aras Software at 25: PLM transformation through connected intelligence appeared first on Engineering.com.

Roque Martin, CEO at Aras Software, opened ACE 2025 by reflecting on Aras’ 25-year evolution—from early PLM strategy roots to hands-on innovation and enterprise-wide digital thread leadership. (Image: Lionel Grealou)

Nestled in Boston’s Back Bay during the first three days of April, ACE 2025 marked a key milestone: Aras’ 25th anniversary. It was a celebration of a quarter-century of innovation in the PLM space, built on the vision of founder Peter Schroer. What began as a small gathering has grown into a global forum for transformation. Aras Innovator continues to position itself as a challenger to legacy PLM systems, offering an open and adaptable platform.

Building on the company’s “red box” concept, presented several years ago by John Sperling, SVP of Product Management, the Aras strategy is rooted in an overlay approach and containerization—designed to simplify integration and support relationship-driven data management. CEO Roque Martin described Aras’ evolution from its early roots in PDM and document control to today’s enterprise-scale PLM platform—enabling connected intelligence across functions and domains.

This trajectory mirrors the wider PLM market shift—from rigid systems to flexible, integrated platforms that support customization, adaptability, and data fluidity across engineering and operational boundaries.

AI, cloud, and the connected enterprise

Nowadays, it is close to impossible to discuss tech/IT/OT or digital transformation without exploring new opportunities from artificial intelligence (AI). Cloud and SaaS are established deployment standards across enterprise software solutions. Nevertheless, PLM tech solutions often lag when it comes to adopting modern architecture and licensing models.

The intersection of PLM and AI is rapidly redefining transformation strategies. Aras’ ACE 2025 conference embraced this momentum through the theme: “Connected Intelligence: AI, PLM, and a Future-Ready Digital Thread.” This theme reflects how AI has become more than an emerging trend—it is now central to enabling smarter decision-making, increased agility, and value creation from data.

While cloud and SaaS have become standard deployment models, PLM platforms have historically struggled to keep pace. Aras is challenging that with an architecture that emphasizes openness, extensibility, and modern integration practices—foundational enablers for enterprise-grade AI. In this landscape, the importance of aligning AI readiness with digital thread maturity is growing. PLM no longer sits at the periphery of IT/OT strategy—it is becoming the backbone for scalable, connected transformation.

Bridging old and new

Martin opened ACE 2025 by recalling that the term “digital thread” originated in aerospace back in 2013—not a new concept, but one whose visual metaphor still resonates. With the announcement of InnovatorEdge, Aras showcased the next leap in PLM evolution—designed to connect people, data, and processes using AI, low-code extensibility, and secure integrations.

With InnovatorEdge, Aras introduces a modular, API-first extension designed to modernize PLM without discarding legacy value. It balances innovation with compatibility and addresses four key areas:

  1. Seamless connections across enterprise systems and tools.
  2. AI-powered analytics to enhance decision-making capabilities.
  3. Secure data portals enabling supply chain data collaboration.
  4. Open APIs to support flexible, industry-specific configurations.

By maintaining its commitment to adaptability while embracing modern cloud-native patterns, Aras reinforces its position as a strategic PLM partner—not just for managing product data, but for navigating complexity, risk, and continuous innovation at scale.

Data foundations

As we stand at the intersection of AI and PLM, ACE 2025 made one thing clear: solid data foundations are essential to unlock the full potential of connected intelligence. Rob McAveney, CTO at Aras, stressed that AI is not just about automation—it is about building smarter organizations through better use of data. “AI is indeed not just about topping up data foundation,” he said, “but helping organizations transform by leveraging new data threads.”

McAveney illustrated Aras’ vision with a simple yet powerful equation:

Digital Thread + AI = Connected Intelligence

This means:

  • Discover insights across disconnected data silos.
  • Enrich fragmented data by repairing links and improving context.
  • Amplify business value using simulation, prediction, and modeling.
  • Connect people and systems into responsive feedback loops.

Every mainstream PLM solution provider is racing to publish AI-enabled tools, recognizing that intelligence and adaptability are no longer optional in today’s dynamic product environments. Siemens continues to evolve its intelligent enterprise twins, embedding AI into its Xcelerator portfolio to drive predictive insights and closed-loop optimization. Dassault Systèmes recently unveiled its 3D UNIV+RSE vision for 2040, underscoring a future where AI, sustainability, and virtual twin experiences converge to reshape product innovation and societal impact. Meanwhile, PTC strengthens its suite through AI-powered generative design and analytics across Creo, Windchill, and ThingWorx. Across the board, AI is becoming the common thread—fueling a transformation from static PLM to connected, cognitive, and continuously learning platforms.

With so much movement among the established players, is Aras’ open, modular approach finally becoming the PLM disruptor the industry did not see coming? Gartner VP Analyst Sudip Pattanayak echoed this momentum in his analysis, emphasizing the need for traceability and data context as cornerstones of digital thread value. He identified four critical areas of transformation:

  1. Collaboration via MBSE and digital engineering integration.
  2. Simulation acceleration through democratized digital twins.
  3. Customer centricity driven by IoT and usage-based insights.
  4. Strategic integration of PLM with ERP, MES, and other platforms.

Sudip Pattanayak, VP Analyst at Gartner, highlighted that “PLM supports the enterprise digital thread” by building a connected ecosystem of product information. (Image: Lionel Grealou)

From a business standpoint, this translates to strategic benefits in risk management, compliance, product quality, and brand protection. For instance, digital thread traceability supports:

  • Warranty tracking and root cause analysis for recalls.
  • Maintenance, usage, and service optimization.
  • Real-time feedback loops from market to R&D.
  • Commercial impact modeling from product failures.

Pattanayak concluded that enterprises should not aim for total digital thread coverage from day one. Instead, the priority is identifying high-value “partial threads” and scaling from there—with AI capabilities built on solid, governed, and well-connected data structures.

Managing the world’s most complex machine https://www.engineering.com/managing-the-worlds-most-complex-machine/ Mon, 28 Apr 2025 18:17:24 +0000 https://www.engineering.com/?p=139223 With 100,000 parts and a 50-year expected operational lifespan, PLM is the only option for managing CERN’s Large Hadron Collider.

The post Managing the world’s most complex machine appeared first on Engineering.com.

David Widegren, Head of Engineering Information Management at CERN, at ACE 2025 in Boston, discussed the role of Product Lifecycle Management (PLM) strategies in managing the world’s most complex scientific instrument. (Image: Lionel Grealou)

CERN stands for the European Organization for Nuclear Research (from the French ‘Conseil Européen pour la Recherche Nucléaire’). It operates the world’s largest and most powerful particle accelerator—the Large Hadron Collider (LHC), which spans a 27-kilometre loop buried about 600 feet beneath the France-Switzerland border near Geneva. The LHC accelerates protons and ions to near-light speeds, recreating the conditions just after the Big Bang. This enables physicists to test fundamental theories about the forces and particles that govern our universe—and provides invaluable data on the building blocks of reality.

Operating at an astonishing temperature of -271.3°C—colder than outer space—the LHC’s superconducting magnets require cryogenic cooling systems, creating one of the coldest environments on Earth. Although some sensationalized media reports have raised concerns about the LHC creating black holes on Earth, CERN’s scientific community has rigorously demonstrated that these fears are unfounded. The LHC’s energy levels, while impressive, are a fraction of those generated by natural cosmic events that occur regularly without incident.

CERN operates with a collaborative network of 10,000 staff across 80 countries, supported by an annual budget of $1.4 billion. This immense collaboration drives groundbreaking research that demands the highest levels of reliability and precision. Managing the LHC’s enormous infrastructure—including millions of components—requires a comprehensive approach that integrates engineering and scientific disciplines. This is where digital transformation, powered by PLM and digital twins, becomes essential.

New digital backbone for an evolving scientific platform

Historically, CERN used legacy CAD systems such as CATIA V5, AutoCAD, SolidWorks, Inventor, Revit, and SmarTeam to manage critical design and operational data, alongside multiple asset and document repositories. However, as the LHC grew in complexity, these tools created inefficiencies, data silos, and challenges around synchronization, verification, and scalability.

To modernize, CERN adopted Aras Innovator—a CAD-agnostic, part-centric PLM platform—redefining its approach to integrated data management. This shift enables CERN to track components across their full lifecycle, providing real-time insights into performance, wear, and maintenance needs. With over 100 million components—many exposed to extreme radiation and high-energy fields—this capability is critical for ensuring resilience and longevity. The integration of digital twins into the ecosystem allows CERN to predict component failures, optimize performance, and plan preventive maintenance.

Towards an integrated digital engineering platform. (Image: CERN presentation at ACE 2025)

Given the LHC’s extraordinary expected lifespan—over 50 years—the management of its components and systems from design and construction through decommissioning is a monumental task. Some systems, such as superconducting magnets and cryogenic infrastructures, must remain functional for decades. PLM helps CERN manage these long-term needs by providing a unified, scalable solution that integrates data across all lifecycle phases. This is essential not only for maintaining operational efficiency but also for ensuring the LHC’s systems continue to meet high standards of scientific precision and safety.

Sustainability is integral to CERN’s long-term strategy. Managing the LHC’s lifecycle includes minimizing environmental impact and optimizing energy consumption. PLM and digital twins enable CERN to optimize resource usage, reduce waste, and extend the life of crucial systems, ultimately supporting the organization’s long-term sustainability goals.

CERN’s shift to Aras Innovator has also facilitated the integration of various data streams—ranging from engineering documents to enterprise asset management. By connecting this information through a robust digital thread, CERN ensures that all stakeholders, from engineers to researchers, operate with a unified, reliable view of the system. This shared information base enhances collaboration, reduces errors, and accelerates decision-making.

While PLM manages engineering and operational data, experimental research outputs are handled separately by specialized Laboratory Information Management Systems (LIMS). However, synergies between PLM and LIMS are increasingly being explored, with the goal of creating faster feedback loops between research and engineering to enable more data-driven innovation.

Managing complexity without digital overload

As CERN continues to push the boundaries of scientific discovery, the need for real-time monitoring and predictive analytics becomes more critical. Digital twins enable real-time health checks on LHC components, tracking their condition and ensuring compliance with safety standards.

Yet the real challenge is not simply managing technical complexity but doing so without introducing unnecessary digital complexity. The scale of the LHC, with its intricate interconnected systems, requires CERN to balance advanced technologies with operational simplicity. Digital tools must enhance operations without becoming another layer of complication.

The key question: How can CERN manage scientific complexity while minimizing the complexity of digital tools?

New technologies must deliver actionable insights that enable faster, better decisions, instead of overwhelming stakeholders with excess data or redundant processes.

Some key questions that arise:

  • What measurable reductions in maintenance costs or unplanned downtime can CERN achieve through predictive digital models?
  • How will real-time monitoring improve energy efficiency, system lifespan, and reliability?
  • How much faster can experimental setups and calibrations be completed using simulation and virtual commissioning?

Ultimately, the success of CERN’s digital transformation will not be judged by the sophistication of its tools, but by clear, quantifiable outcomes: lower downtime, improved reliability, energy-efficient operations, and faster scientific throughput.

Lessons from the LHC to the FCC

CERN’s digital transformation is not just about adding tools—it is about making complex systems easier to manage and enabling faster, more informed decisions. This mindset is critical as CERN embarks on its next major project: the Future Circular Collider (FCC).

The FCC will dwarf the LHC, with a circumference of 91 kilometers and particle collisions at energy levels up to 100 TeV—far beyond the LHC’s 13 TeV. Construction costs are estimated between €20 billion and €25 billion, with initial operations targeted around 2040. The scale of the FCC presents massive engineering challenges, from magnet design to cryogenic systems.

Here, lessons learned from the LHC’s digital journey will pay dividends.

The LHC’s digital twins—validated over years of operation—will serve as the foundation for FCC simulations. Virtual modeling allows CERN to identify risks earlier, test complex designs in silico, and optimize operations before construction even begins. By compressing design timelines and minimizing construction risks, CERN can potentially save both operational and capital costs while improving reliability.

CERN’s approach shows that digital transformation is not about complexity for its own sake. It is about ensuring that scientific and operational challenges are met with clarity, efficiency, and sustainability—building a stronger foundation for the next generation of discovery.

The post Managing the world’s most complex machine appeared first on Engineering.com.

Unlocking innovation in product development https://www.engineering.com/unlocking-innovation-in-product-development/ Tue, 22 Apr 2025 14:48:04 +0000 https://www.engineering.com/?p=138827 The challenges in modern product development.

The post Unlocking innovation in product development appeared first on Engineering.com.

Dassault Systèmes has sponsored this post. Written by Nancy O’Flaherty, Senior Offer Marketing Manager, Dassault Systèmes.

(Image: Dassault Systèmes.)

The demands on product development organizations have never been more intense. Faced with shrinking go-to-market timelines, competitive pressures, stringent regulations and evolving customer expectations, companies must rethink how they design, develop and deliver products. Traditional Product Lifecycle Management (PLM) systems, once seen as essential tools, are increasingly viewed as a barrier to innovation rather than a catalyst. While effective at managing data repositories, these legacy systems fail to keep pace with modern, interconnected, agile development methods.

Consequently, many organizations grapple with siloed decision-making. When departments operate in isolation, communication and collaboration break down, creating obstacles that complicate the already demanding landscape of product development. This disconnection can significantly extend project timelines, introduce errors and stifle creativity. Moreover, the burden of cumbersome administrative tasks often leaves little room for teams to focus on what truly matters: driving innovation and delivering high-quality products. Additionally, a lack of visibility into project statuses and accountability makes it difficult for stakeholders to track progress and make informed decisions.

Companies need a fundamental shift toward model-based, data-driven product development, one that provides the tools and capabilities required to solve today’s challenges and unlock new opportunities for the future.

Several well-documented issues hamper product development. These challenges limit efficiency, stifle innovation and increase costs.

Siloed decision-making

Traditional systems promote fragmentation. Teams work in isolation, with data scattered across siloed applications. This slows collaboration and decision-making, creating bottlenecks in project timelines.

Slow innovation and burdensome administrative tasks

Traditional PLM systems are often file-based and BOM-centric, forcing teams to manage duplicate data manually and synchronize updates. This tedium reduces focus on creativity and problem-solving. 

Lack of transparency and accountability

Meeting regulations and achieving traceability remain uphill battles. Data remains locked within engineering systems, inaccessible to non-engineering disciplines that could greatly benefit from its insights.  

Barriers to adopting model-based engineering

Most legacy systems are not equipped for agile product development or model-based systems engineering (MBSE), which are critical methods for creating complex, software-driven products.

(Image: Dassault Systèmes.)

Rethinking product development: Transforming challenges into innovation

A unified innovation platform transforms how organizations approach product development. It goes beyond addressing IT or data management issues and transforms how teams collaborate, innovate and succeed.

In today’s challenging environment, the 3DEXPERIENCE platform significantly changes how companies handle product development, providing solutions designed to address key issues directly. By fostering a unified, holistic environment that enhances communication across disciplines, the platform effectively breaks down silos, enabling teams to collaborate seamlessly.

ENOVIA on the 3DEXPERIENCE platform connects all disciplines throughout the product lifecycle. This foundational capability eliminates the barriers created by siloed systems and fosters cross-disciplinary collaboration. Critically, it provides compatibility with legacy systems, ensuring a smooth transition to this innovative platform.

Real-time access to 3D-configured engineering data allows users to retrieve information instantly, from anywhere and on any device. This capability enables stakeholders from engineering, design, manufacturing and marketing to interact with and make decisions based on the same 3D data in a fully interactive manner. The benefits are evident — leaders can make informed decisions using actionable insights rather than relying on outdated information.

Built-in collaboration and data science tools

Virtual Twin Experiences transform collaboration and decision-making by creating dynamic, digital replicas of physical products, processes or entire enterprises. These virtual models enable real-time simulation, analysis and optimization, bridging the gap between the physical and digital worlds. By leveraging technologies such as IoT, AI and advanced analytics, virtual twins provide actionable insights, enhance decision-making and improve efficiency across the product lifecycle or organizational operations. They empower organizations to predict outcomes, test scenarios and innovate faster, ultimately driving smarter strategies and sustainable growth.

Support for MBSE

Unlike legacy PLM solutions, the 3DEXPERIENCE platform offers native support for MBSE, a crucial differentiator in today’s complex, software-driven environments. From defining use cases to designing and validating systems, MBSE workflows are integral to the platform’s capabilities, paving the way for faster, more effective development cycles.

Capitalizing on enterprise knowledge with AI

The 3DEXPERIENCE platform helps organizations unlock and leverage their intellectual property. Using AI-powered generative experiences, the platform reveals hidden opportunities in enterprise knowledge, enabling teams to learn from the past and innovate for the future. 

(Image: Dassault Systèmes.)

Transforming product development with proven results

“This is really a big advantage for us, that we have a best-in-class system that we can use to track, manage and ultimately act on, manufacture and bring to market what started out as data and turns into the real world. To create world-class technology, you have to use world-class technology.” —Eric Allison, Chief Product Officer, Joby Aviation

“A holistic digital model comprising all the attributes of each discipline invariably shortens time to market because we can work faster and more efficiently.” —Bernd Hirt, Group Manager Core Function Mechanics, Bosch Car Multimedia

Why invest in the 3DEXPERIENCE platform now 

The stakes for product development teams have never been higher. Competitive pressures, workforce dynamics and sustainability goals are reshaping industries. Resilience and adaptability are no longer merely desirable; they are essential for survival and market leadership.

The 3DEXPERIENCE platform provides a critical framework for navigating these challenges and unlocking new opportunities. By adopting a unified platform, organizations position themselves for the future of innovation, efficiency and collaborative success.

The time to act is now. Unlock a new realm of possibilities for your product development processes and shift from file-based, siloed systems to an agile, innovation-first future. Download the e-book to learn more.


About the Author

Nancy O’Flaherty is a Senior Offer Marketing Manager at Dassault Systèmes, where she has spent the past 17 years developing marketing strategies to drive product awareness, customer engagement and adoption. With over 25 years of experience in the high-tech industry, Nancy has strong expertise in market trends, customer needs and positioning technology solutions for success.

Making sense of tariff impact: why digital transformation was never optional https://www.engineering.com/making-sense-of-tariff-impact-why-digital-transformation-was-never-optional/ Mon, 14 Apr 2025 17:36:13 +0000 https://www.engineering.com/?p=138693 Are you simply reacting to disruption or leading your company through it? PLM is the secret weapon at the center of a resilient response to volatility.

The post Making sense of tariff impact: why digital transformation was never optional appeared first on Engineering.com.

Tariffs have stormed back into the global spotlight, shaking global trade, and putting pressure on manufacturers with international supply chains. For companies already facing inflation, material shortages, and geopolitical instability, new tariff and political wargames add another layer of complexity to navigate.

This is not just another opinion on trade policy—it’s a wake-up call. Companies that postponed digital transformation are now struggling to manage disruption with enterprise systems unfit for today’s pace of change. Many still rely on spreadsheets, siloed software, and disconnected teams. In contrast, organizations that invested in a connected digital backbone—especially one centered around Product Lifecycle Management (PLM)—are better equipped to assess impacts, respond rapidly, and protect margins.

Understanding tariff impact across the value chain

Tariffs create ripple effects across operations, affecting cost, compliance, and sourcing decisions:

  • Importers and contract manufacturers experience the first wave of cost increases.
  • Brand leaders and OEMs must decide whether to absorb, offset, or pass along those costs.
  • Suppliers across borders face pressure to renegotiate contracts, timelines, or terms.

At the center of this complexity is the costed Bill of Materials (BOM), listing parts, sub-assemblies, components, raw materials, formulations, and quantities needed to manufacture a product, along with the associated cost information for each item. It should be the single source of truth for real-time cost impacts; yet too often, BOM updates lag behind key decisions—causing margin erosion and compliance issues.

The critical question extends beyond “What became more expensive?” to “Who is going to pay for it?” The answer depends on industry dynamics, supply agreements, market conditions, and strategic intent. Highly commoditized sectors may need to absorb added costs to remain competitive, while premium markets might have room to pass on increases—if brand value and pricing power allow.

Complexity deepens when tariffs affect multiple tiers of the supply chain. Without timely insight into these dynamics, companies default to reactive choices—accepting margin erosion, postponing decisions, or making trade-offs that compromise long-term strategy and customer trust. A digitally connected value chain creates clarity. With end-to-end PLM integrating procurement and finance data, manufacturers can model how shocks flow through their products and portfolios—and respond before it is too late.
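To make this concrete, here is a minimal, purely illustrative sketch of rolling a tariff shock up through a multi-level costed BOM. The part structure, origins, and duty rates below are hypothetical; a real implementation would pull this data from connected PLM and ERP records rather than hard-coded dictionaries.

```python
# Hypothetical sketch: landed cost of a multi-level BOM under tariff scenarios.
# Part names, costs, origins, and duty rates are illustrative only.

def landed_cost(part, tariff_rates):
    """Unit cost of a part including duties, rolled up recursively through the BOM."""
    base = part["unit_cost"]  # the part's own cost (e.g., assembly labor/overhead)
    duty = base * tariff_rates.get(part.get("origin"), 0.0)
    children = sum(
        landed_cost(child, tariff_rates) * child["qty"]
        for child in part.get("components", [])
    )
    return base + duty + children

bom = {
    "part": "controller-assembly", "origin": "US", "unit_cost": 12.00,
    "components": [
        {"part": "pcb", "origin": "CN", "unit_cost": 8.50, "qty": 1, "components": []},
        {"part": "housing", "origin": "MX", "unit_cost": 3.25, "qty": 2, "components": []},
    ],
}

before = landed_cost(bom, {})                       # baseline, no tariffs
after = landed_cost(bom, {"CN": 0.34, "MX": 0.25})  # hypothetical duty rates by origin
print(f"cost impact per unit: ${after - before:.2f}")
```

The same recursive rollup is what an integrated PLM/ERP stack automates at portfolio scale: because each line carries its country of origin, a rate change can be re-simulated across every affected product before any sourcing or pricing decision is made.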

Integrated digital approach to tariff management

Addressing tariff impact requires more than PLM. It demands a fully integrated digital thread that connects product, sourcing, financial, and customer data.

  • PLM captures product structures, supplier dependencies, and alternate design paths—anchoring the impact assessment.
  • Enterprise Resource Planning (ERP) maintains costed BOMs and tracks profitability changes as duties or sourcing costs shift.
  • Supply Chain Management (SCM) evaluates alternate suppliers and logistics to manage cost and lead time risks.
  • Trade compliance systems monitor changes to tariff codes and cross-border regulations.

Supporting systems play complementary roles:

  • Product Data Management (PDM) ensures the correct version of product data is used when making design or sourcing changes—critical to avoid rework or compliance issues.
  • Material Requirements Planning (MRP) provides forward visibility into procurement and inventory to avoid overbuying high-tariff parts or facing shortages.
  • Customer Relationship Management (CRM) contributes commercial insight, highlighting customer sensitivities, regional exposures, and contract constraints that influence pricing strategies.

The value of this integrated technology stack lies in connecting innovation with sourcing, costing, and compliance. With this comprehensive view, manufacturers can confidently simulate trade-offs, evaluate impacts, and execute necessary changes. When systems work together, companies can coordinate across engineering, procurement, finance, and sales using shared data and aligned risk thresholds. This enables scenario modeling, supplier exposure analysis, and implementation of controlled design or sourcing shifts with full traceability and governance.

Proactive risk management in an era of uncertainty

In today’s context, tariffs have become instruments of urgency—tools used to pressure negotiation rather than long-term policy levers. For manufacturers, this translates to operational volatility with abrupt announcements, unclear duration, and vague scope that disrupt planning cycles.

Leaders must now make critical decisions with limited clarity and compressed timelines. Digital maturity enables companies to manage these situations with structure and foresight by:

  • Simulating tariff scenarios before they take effect
  • Identifying high-risk suppliers or parts and activating contingency plans
  • Evaluating design alternatives and reconfiguring BOMs with version control
  • Maintaining compliance as sourcing or target markets shift

Without these capabilities, companies resort to instinct, delay decisions, or absorb unnecessary costs. With them, responses become structured, traceable, and repeatable.

Digital transformation as a competitive imperative

Beyond tariffs, digital transformation has always been a strategic foundation for resilience against various forms of disruption. Companies that invested in connected systems now operate with speed and alignment, ready for AI and analytics-driven decision making. Those that delayed now face both the disruption and the steep learning curve of modernization.

True resilience emerges from more than technology alone—it comes from embedding governance, data discipline, and cross-functional collaboration into the organization’s DNA. Digital transformation enables faster, better decisions under pressure by connecting development, sourcing, operations, compliance, and customer functions into a coherent ecosystem.

As tariff volatility evolves—potentially settling into a new reality shaped by regional sourcing and revised trade agreements—the strategic question remains: Are we still reacting to disruption, or are we built to lead through it? Trade policy may eventually stabilize, but volatility will not. Digital transformation is not tied to election cycles or regulatory changes. It represents a long-term investment in adaptability, insight, and resilience. The companies that will thrive are those that stopped waiting for stability—and started building for it.

Understanding PLM: who uses it, why they use it and its challenges https://www.engineering.com/understanding-plm-who-uses-it-why-they-use-it-and-its-challenges/ Mon, 07 Apr 2025 14:11:49 +0000 https://www.engineering.com/?p=138437 Gathering and managing data, insights and inspiration can never be reliable without PLM and the digital transformation it enables.

The post Understanding PLM: who uses it, why they use it and its challenges appeared first on Engineering.com.

It is an unfortunate fact that in almost every technology discussion the basics are often overlooked, resulting in more than a little confusion.

In this article I’m going back to the basics of product lifecycle management (PLM) by exploring two fundamental questions: Who uses PLM?  And what are some of its challenges?  

This is my third article on PLM basics for Engineering.com. Previously, I wrote Answering 3 Top PLM Questions and Why Every Enterprise Needs Its Own Digital Twins.  

A common thread across all these articles is the importance of collaboration and innovation. PLM supports these vital enterprise processes as no other technology can. Without a well-defined PLM strategy and associated enabling technologies, the gathering and management of data, insights, and inspiration is never reliable. Only PLM-enabled collaboration and innovation can ensure the long-term sustainability of an enterprise.

Since the first two articles were written, the importance of PLM to enterprise-scale digital transformation (and vice versa) has become more widely recognized. But PLM’s message is often lost in noisy, cluttered marketplaces that change at ever-increasing rates. To keep pace, PLM solution providers roll out elaborate new tools and add-on capabilities: digital twins grow more powerful, digital threads more all-encompassing, and end-to-end connectivity reaches further and deeper, even as both the beginnings and the ends of product lifecycles grow hazier amid the expanded capabilities. At the same time, PLM is being extended into new areas of the enterprise, some of which have only tenuous links to the everyday understanding of “product.” In uncertain times such as these, going back to basics has historically proven to be a wise policy.

All three of these back-to-basics articles drew inspiration from Answerthepublic.com, a marketing-focused platform that unearths user questions commonly entered into search engines.

Who Should be Using PLM?

This question gets at why PLM is so widely used and appreciated, even by those without a deep understanding of it. PLM is probably the most widely implemented of the myriad tools engineers rely on—standalone CAD included. New-product developers in all industries turn to PLM at every stage of their work:

•  Creating, redeveloping, and enhancing the enterprise’s products, services, assets, or systems, both physical and digital.

•  Creating, developing, and enhancing processes.

•  Enhancing and extending enterprise connectivity.

•  Delving into supply chain management to cope with the variants in all of the above.

PLM also serves everyone, technically trained or not, who supports these engineers.

If you are using PLM, the aim should be to extend its use from product development throughout the extended enterprise, including the enablement of:

Digital twins, which are virtual representations—digital surrogates—of physical assets (or services, or even manufacturing systems and the organization itself) that exploit data flows into and out of that asset.  A digital twin of a product typically holds geometry and representations of materials, components, and behavior through the asset’s multiple iterations—as-designed, as-produced, and as-maintained.

Digital threads, which are webs of decisions and myriad links that reach all the data, decisions, and processes that create, maintain, and leverage digital twins from design engineering through production, sales, service, support, and warranties.

End-to-end (E-2-E) lifecycle connectivity, which reaches and joins everything relevant to each digital twin and its digital threads from initial ideation through end-of-life and disposal or remanufacturing and repurposing.

Digital transformation, which converts all the enterprise’s data to eliminate bothersome formats and silos that prevent data and information from being freely accessed, used, shared, and reused; the digital transformation of an organization’s product lifecycle is thus a powerful enabler of collaboration and innovation.

What are some challenges of PLM enablement?

As with any transformational technology, implementers and project managers must overcome multiple challenges.  Because PLM’s broad capabilities and toolsets reach deep into the enterprise’s data, these challenges sometimes overwhelm the resources allocated to the implementation; start-up dates are missed, and deadlines are blown.

Like any large-impact technology implementation, some of the challenging aspects of PLM include:

•  Significant investment in money, time, and resources that requires careful monitoring and control.

•  Intensive planning, system-to-system accommodations, and staff retraining across the enterprise.

•  Tightly focused efforts for digital transformation to deal with unformatted data and information while facilitating access to departmental data silos.

•  Similarly, a tight focus on dealing with unexpected job complexity that can frustrate new users.

•  Persistence during a sometimes tedious implementation with a nonstop focus on priorities.

•  Continuous updates for top management so they aren’t tempted to reassign resources and reallocate funds.

•  Continuous evaluation and improvement throughout the continued usage of the new digital PLM-enabling technologies, commonly called “staying the course.”

Every marketplace is constantly reconfiguring itself, driven by countless innovations. This can disrupt enterprise-scale collaboration and thwart innovation. Gathering and managing data and insights can become so complicated that their use becomes unreliable—or worse.

Among the consequences of these reconfigurations is the fragmentation of PLM.  PLM implementation challenges have motivated a handful of software startups to offer toolkits for digital twins, digital threads, and even enhanced connectivity as stand-alone capabilities.  Marketed as sufficient in themselves for everyday user tasks, these toolkits are being tacked on to information technology, operational technology, engineering technology, and other top-of-the-enterprise platforms and systems. 

I must caution that none of these narrow offerings have PLM’s powerful and widely used capabilities to foster true innovation and collaboration, or at least not to the extent necessary.  And only collaboration and innovation can ensure the long-term sustainability of the enterprise.

The underlying theme running through all three of these articles is enabling and enhancing collaboration with PLM, and, as noted, no other technology can do this. Gathering and managing data, insights, and inspiration can never be reliable without PLM and the digital transformation it enables. In turn, only with the digital enablement of PLM can the long-term sustainability of the enterprise be assured.

And this is why understanding the basics of PLM is so critical.

Technology rundown for analyzing tariff impact in manufacturing https://www.engineering.com/technology-rundown-for-analyzing-tariff-impact-in-manufacturing/ Thu, 03 Apr 2025 18:20:25 +0000 https://www.engineering.com/?p=138377 If you are stuck scrolling through spreadsheets to figure out your exposure, have fun and good luck.

The post Technology rundown for analyzing tariff impact in manufacturing appeared first on Engineering.com.

If you are an engineer for a US-based manufacturer, there’s a very good chance you woke up today to an entirely new task: figure out how much of the stuff we buy now has tariffs applied to it.

On April 2, 2025, the U.S. government announced a sweeping new tariff regime that impacts products from virtually every country in the world. In an official release, the White House said the move is meant to reprioritize manufacturing in the US. While many economists believe the new tariffs will not produce this result, it doesn’t change the fact that most US manufacturers will have some significant math in their immediate future.

If you buy finished goods, this is likely not a complicated feat. But if you buy raw materials and components from suppliers in other countries and then use those materials and components to make your products, you have a much heavier lift.

Hopefully your company has maintained a reasonable digital transformation investment strategy and you have one or even several digital assets that will make this process much easier and far more accurate than any manual process—especially once you have to deduce if you can absorb some of the costs or have to pass them on to customers.

If you are stuck scrolling through spreadsheets to figure out your exposure, have fun and good luck.

Here is a list of digital solutions commonly found in manufacturing, and how you can use them to find your tariff exposure, calculate your additional spend and decide if it’s worth eating any increase:

Enterprise Resource Planning (ERP)
Purpose: Centralized financial, inventory, and operational data management

Role in Tariffs:

  • Provides financial visibility into product and material costs
  • Tracks landed costs for imported components
  • Supports decision-making on pricing adjustments
  • Tracks inventory, though (near) real-time supply chain risk assessments may require additional risk management tools

Supply Chain Management (SCM)
Purpose: Optimizes procurement, logistics, and supplier management

Role in Tariffs:

  • Models tariff impact on supply chain flows (e.g., supplier costs, lead times)
  • Optimizes sourcing strategies to minimize cost increases
  • Supports trade route and logistics adjustments to avoid tariff-heavy regions
  • Tariff-specific impact simulations may require integration with ERP or specialized tariff analysis tools

Trade Compliance and Tariff Management Tools
Purpose: Ensures compliance with international trade laws and updates tariff classifications

Role in Tariffs:

  • Tracks and updates tariff changes in response to regulatory updates
  • Automates classification of goods under the correct Harmonized System (HS) codes
  • Supports compliance audits and documentation for trade regulations
  • Forecasting financial impact of tariff changes requires integration with ERP or BI systems

Business Intelligence (BI) and Predictive Analytics
Purpose: Data visualization and financial impact analysis

Role in Tariffs:

  • Analyzes historical and real-time cost impacts of tariffs
  • Models financial scenarios to predict margin impacts
  • Integrates with ERP and trade compliance data to provide actionable insights
  • Supports strategic decision-making on pricing adjustments and supplier shifts

Pricing Optimization Software
Purpose: Adjusts product pricing based on market conditions and cost fluctuations

Role in Tariffs:

  • Determines whether tariff costs should be absorbed or passed on to customers
  • Optimizes pricing strategies based on competitive market data
  • Prevents margin erosion by aligning pricing with demand sensitivity

Product Lifecycle Management (PLM)
Purpose: Manages product design, BOMs, and supplier data

Role in Tariffs:

  • Identifies which materials and components are subject to tariffs
  • Supports product redesign efforts to reduce reliance on high-tariff materials
  • Stores country-of-origin and trade compliance documentation
  • Can be involved in redesigning products to mitigate tariff impacts

Digital Twins and Scenario Planning
Purpose: Virtual simulation of manufacturing operations for efficiency and resilience

Role in Tariffs:

  • Simulates supply chain resilience strategies in response to tariff disruptions
  • Models operational efficiencies to offset increased costs
  • Tests alternative sourcing and logistics adjustments before implementation
  • Less directly involved in product redesign than PLM tools
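As a back-of-the-envelope illustration of the kind of analysis these systems automate, the sketch below walks a flat list of purchased line items, applies assumed duty rates by country of origin, and flags whether each cost increase can be absorbed without breaching a margin floor. Every SKU, figure, rate, and the margin-floor rule itself is hypothetical.

```python
# Illustrative only: flat tariff-exposure pass over purchased line items.
# Duty rates, SKUs, spend figures, and the margin-floor rule are all hypothetical.

TARIFF_RATES = {"CN": 0.34, "DE": 0.20, "MX": 0.25}  # assumed duty rates by origin
MARGIN_FLOOR = 0.30  # don't absorb increases that push gross margin below 30%

purchases = [
    {"sku": "MTR-100", "origin": "CN", "annual_spend": 240_000, "revenue": 410_000},
    {"sku": "BRK-220", "origin": "DE", "annual_spend": 90_000,  "revenue": 160_000},
]

results = []
for item in purchases:
    duty = item["annual_spend"] * TARIFF_RATES.get(item["origin"], 0.0)
    new_cost = item["annual_spend"] + duty
    margin = (item["revenue"] - new_cost) / item["revenue"]  # gross margin after duty
    action = "absorb" if margin >= MARGIN_FLOOR else "reprice or re-source"
    results.append((item["sku"], round(duty), action))
    print(f'{item["sku"]}: +${duty:,.0f} duty, margin {margin:.1%} -> {action}')
```

In a spreadsheet this is a manual, error-prone exercise; the point of the integrated stack above is that the origin, cost, and pricing inputs stay current without re-entry, so the same pass can be re-run the morning a rate changes.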

Decoding Dassault’s 3D Universes jargon: combining virtual and real intelligence https://www.engineering.com/decoding-dassaults-3d-universes-jargon-combining-virtual-and-real-intelligence/ Mon, 24 Mar 2025 18:00:02 +0000 https://www.engineering.com/?p=137969 Can Dassault Systèmes convince the market that this is more than just another buzzword-laden evolution?

The post Decoding Dassault’s 3D Universes jargon: combining virtual and real intelligence appeared first on Engineering.com.

]]>
Product Lifecycle Management (PLM) reimagined: from static digital twins to an AI-powered, generative intelligence ecosystem. (Image: Dassault Systèmes.)

Dassault Systèmes has unveiled 3D Universes (styled as 3D UNIV+RSES for branding), a bold step toward reimagining how industries engage with digital and physical realities. This is not just another 3D modeling update. It represents a fundamental shift from static digital twins to an AI-powered, generative intelligence ecosystem. The branding itself—3D UNIV+RSES instead of "3D Universes"—signals a new paradigm where virtual and real (V+R) are seamlessly integrated, enabling continuous learning, automation, and adaptability across product lifecycles.

But with this shift comes a set of key challenges: What does this mean for legacy users? How will intellectual property be managed in an AI-driven world? And can Dassault Systèmes convince the market that this is more than just another buzzword-laden evolution?

Virtual + real: more than just digital twins

The concept of V+R (Virtual + Real) is not new to Dassault Systèmes. It has been a central theme in the company’s Virtual Twin Experience, where digital twins are no longer mere representations but are continuously evolving with real-world inputs.

In 3D Universes, this vision is taken further:

  • AI-powered models learn from real-world behaviors and adjust accordingly
  • Virtual companions provide intelligent assistance in decision-making
  • Generative AI and sense computing optimize designs and simulations in real-time

This moves beyond the traditional "digital twin" approach. Rather than acting as a static mirror of the physical world, 3D Universes enables a dynamic, self-improving system that continuously integrates, analyzes, and adapts. The ambition itself is not unique, however: Siemens and other PLM software providers are actively exploring how AI can add an intelligent layer to the PLM data backbone.

From static to generative intelligence

Dassault Systèmes has long been a leader in 3D modeling, PDM/PLM, and simulation, though 3D Universes marks a significant departure from traditional software functionality. It introduces an AI-driven, generative framework that transforms how products are designed, validated, and maintained.

Key differentiators of this new positioning include:

  • AI-assisted workflows that automatically refine and evolve designs.
  • Predictive simulations that adapt based on real-world sensor data.
  • A “living” knowledge platform that evolves with industry trends and user inputs.

You get the idea. Rather than designing a product in isolation, cross-functional teams across product development, engineering, quality, procurement, and the supply chain can now co-create with AI, allowing for an iterative, automated process that reduces risk, enhances efficiency, and accelerates innovation cycles.

Beyond software—a living digital ecosystem

The shift to 3D Universes also seems to represent a move away from traditional licensing-based software models toward a consumption-based, Experience-as-a-Service (XaaS) framework—a commercial model similar to the "AI-as-a-Service" approach recently described by Microsoft CEO Satya Nadella. This aligns with broader industry trends where companies are transitioning from one-time software purchases to continuous value-driven digital services.

What does this mean in practical terms?

  • Customers will consume intelligence rather than static software.
  • Real-time virtual twins will become decision-making hubs, constantly updating based on real-world inputs.
  • AI-generated designs will automate engineering iterations, dramatically reducing manual effort.

This is a major shift for legacy customers who are accustomed to on-premises, private cloud hosting, and transactional software ownership. Dassault Systèmes will need to provide a clear roadmap to help these organizations transition without disrupting their existing workflows and wider integration landscape.

IP, trust and the generative economy

One of the most critical challenges in this transformation is intellectual property (IP) ownership and data security. In an AI-driven, generative economy, where does human ingenuity end and machine-driven design begin? If AI generates a product variation based on learning from past designs, who owns the output?

Some key concerns include:

  • Ensuring IP integrity when AI continuously iterates on existing designs.
  • Managing security risks as real-world data feeds into digital models.
  • Addressing industry adoption barriers for companies that have built their entire business around traditional IP protection frameworks.

Dassault Systèmes, and other enterprise solution providers in this space, will need to provide strong governance mechanisms to help customers navigate these complexities and build trust in the generative AI-powered design process.

Dassault Systèmes released a teaser video on YouTube outlining the core ambitions of 3D Universes and reinforcing its role in shaping a new generative economy. The video elaborates on four key messages:

  • Virtual-Plus-Real Integration: A seamless blend of digital and physical data enhances accuracy and applicability in simulations.
  • Generative AI Integration: AI-driven processes enable more adaptable and intelligent design iterations.
  • Secure Industry Environment: A trusted space for integrating and cross-simulating virtual twins while ensuring IP protection.
  • Training Multi-AI Engines: Supports the development of AI models within a unified framework, promoting more sophisticated AI applications.

While the video presents a compelling vision and sets expectations for an aspirational 15-year journey to 2040, it introduces complex terminology that might not be easily digestible for a broad audience. The use of "Universes" as branding adds an extra layer of abstraction that would benefit from clearer explanation and, in due time, a gradual transition roadmap for legacy users.

Additionally, the practical implementation and real-world applications remain vague, leaving some unanswered questions about industry adoption and integration. How will companies transition to this model? What are the concrete steps beyond the conceptual framework? The challenge will be ensuring that this does not become another overcooked marketing push that confuses rather than inspires potential adopters. Users demand clarity and pragmatism in linking solutions to problem statements and practical value realization.

A bold leap into the future

The potential of 3D Universes is enormous, but its success hinges on several key factors:

  • Market Education: Dassault Systèmes must articulate the value proposition beyond buzzwords, demonstrating tangible ROI for both new and legacy users.
  • Seamless Transition Strategy: Organizations need a clear pathway to adopt 3D Universes without disrupting their current operations.
  • AI Governance & IP Assurance: Addressing industry concerns around AI-generated designs, IP ownership, ethical AI, and data security will be crucial for widespread adoption.

If 3D Universes delivers on its promise, it has the potential to redefine how industries design, simulate, and optimize products across their entire lifecycle. By truly integrating Virtual + Real intelligence, Dassault Systèmes is making a bold statement about the next frontier of digital transformation.

The question now is: Are industries ready to embrace this generative future, or will skepticism slow its adoption? Furthermore, where should organizations start on this journey? Can solution providers be bold enough to share a pragmatic roadmap towards this goal, and keep us posted on their learnings in this space? Will 3D Universes bring us one step closer to the "Industry Renaissance" previously advocated by Dassault Systèmes Chairman Bernard Charlès? Time will tell, but one thing is certain—Dassault Systèmes is positioning itself at the forefront of the next industrial/digital revolution.


]]>