Explainability and Innovation: How XAI Is Transforming New Product Development


XAI and Industrial Innovation: A Strategic Combination for Industry 4.0

In the landscape of industrial innovation, Explainable Artificial Intelligence (XAI) is revolutionising the process of new product development. Its ability to make algorithmic decisions transparent and understandable positions it as an enabling technology to improve operational efficiency, reduce time-to-market, and strengthen trust in artificial intelligence systems.

In particular, XAI proves to be a strategic lever for Industry 4.0, as it:
– Reduces product development costs
– Accelerates time-to-market
– Increases transparency and trust in AI
– Fosters knowledge formalisation and organisational learning

In our blog, we have already covered concrete applications of XAI developed by Intellico, focusing on the use of the MATILDE platform in sectors such as food, cosmetics, and chemicals, as well as in dedicated projects on rubber compound development.

The attention these use cases have received in academia also raises a key question: why does XAI work so well in business processes? In this article, we summarise the key points from the most interesting academic publications on MATILDE and Intellico's XAI.

Knowledge Management and Creativity: The Role of XAI in the Creative Phases of Innovation

An academic study by Politecnico di Bari and Politecnico di Milano explores how XAI, and MATILDE in particular, supports the various stages of the creative process:

Preparation – Gathering and Organising Knowledge
XAI collects heterogeneous data, including past failures, to generate explainable predictions. This helps managers identify the key variables influencing decisions.

Incubation – Idea Generation
Through cycles of explanation and adaptation, XAI supports the generation of solutions, suggesting recipes or combinations to test, and reduces the number of physical prototypes.

Illumination – Practical Application
The generated ideas are validated in the laboratory, and the feedback data feeds back into the algorithm, strengthening human–machine integration.

Verification – Evaluation of Results
By monitoring KPIs and metrics, companies consolidate leadership strategies and improve team organisation to integrate XAI into the R&D process.

Overcoming Organisational Tensions with XAI

A second academic study conducted by Fabrizio Amarilli (Dublin City University), Francesca Saraceni (Intellico), Lorenzo Tencati (Intellico), and Sara Uboldi (Intellico) analyses how XAI helps manage the main “organisational tensions”—persistent and often contradictory conflicts that arise when companies adopt advanced technologies such as artificial intelligence in product innovation processes.

1. Automation vs. Human Judgement
Thanks to explainability, XAI prevents automation from marginalising human intuition and experience. The model provides recommendations that are validated, adapted, and enriched by personal experience, generating learning for both people and intelligent systems.

2. Transparency vs. Complexity
Even the most sophisticated models become interpretable and trustworthy thanks to XAI. The design of explainable and user-friendly interfaces allows users to explore the reasons behind algorithmic decisions at different levels of detail, thus strengthening the dialogue between technology and people.

3. Speed vs. Accuracy
XAI enables a first phase of rapid hypothesis selection, reducing the number of tests, but does not replace physical validation.

4. Standardisation vs. Personalisation
MATILDE allows decisions to be adapted to regulatory constraints or local preferences, while maintaining consistency and scalability.

These tensions are addressed with a “both/and” approach, which values continuous feedback between humans and intelligent systems.

Evolutionary and collaborative management transforms XAI into a true partner in the innovation process: it does not replace human judgement, but amplifies it; it does not impose a single path, but allows organisations to navigate the complexity of industrial change while maintaining a balance between opposing needs. In this way, XAI not only solves the problem of algorithmic opacity (“black box”), but becomes an enabler for faster, more conscious, and adaptive organisational growth.

Intellico and the Future of Explainability in Innovation

These studies confirm that explainable AI is more effective and generates greater value when integrated throughout the innovation process, from concept to go-to-market.

Our vision? To overcome the “black box” of AI to create more conscious, rapid, and adaptive organisational growth.

Would you like to explore the potential of XAI in your processes?

Contact us to discover how Intellico solutions can accelerate innovation in your company.


Contributors

Sara Uboldi (Head of Solutions, Intellico)

References

Ilaria Mancuso, Antonio Messeni Petruzzelli, Umberto Panniello, Federico Frattini, "The role of explainable artificial intelligence (XAI) in innovation processes: a knowledge management perspective", Technology in Society, Volume 82, 2025, 102909, ISSN 0160-791X. https://doi.org/10.1016/j.techsoc.2025.102909

Fabrizio Amarilli, Francesca Saraceni, Lorenzo Tencati, Sara Uboldi, "Managing Paradoxical Tensions in the Implementation of Explainable AI for Product Innovation", 2025 33rd International Conference on Enabling Technologies: Infrastructure for Collaborative Enterprises (WETICE), IEEE. DOI: 10.1109/WETICE67341.2025.1109207
