From Baltic Region to European Cooperation

3–6 Jun 2025 | Brussels, Belgium

Project cooperation | Updated on 8 May 2025

Explainable AI: from requirements to evaluation of results

Marco Ciaramella

R&D manager, adjunct professor of AI at Intellisemantic

Rivoli, Italy

About

Explainable AI is a key topic for validating credible AI applications. The appropriate explainability approach depends on the kind of application, its users, and the AI technology employed, for example in disease diagnosis. Our team is willing to address explainable AI topics in the context of AI technologies applied to healthcare, supporting clinical domain experts with third-party data evaluation, research support, data architecture and infrastructure support, and software development. This work includes compliance with European regulations and frameworks, including the GDPR, the FAIR principles, the AI Act, and IP ownership.

Topic

  • Cluster Health (CL1): HORIZON-HLTH-2025-01-DISEASE-04: Leveraging artificial intelligence for pandemic preparedness and response
  • Cluster Health (CL1): HORIZON-HLTH-2025-01-TOOL-03: Leveraging multimodal data to advance Generative Artificial Intelligence applicability in biomedical research (GenAI4EU)

Type

  • Partner seeks Consortium/Coordinator
