Valerie Morignat
Dr. Valerie Morignat, Ph.D. (Arts & Art Sciences, Sorbonne), MIT certified in AI and Business Strategy and in Machine Learning.

AI'M SPEAKING

An Empathy-Building Exhibit
The merging of human imagination with machine intelligence will lead to empathy-building experiences that change the face of healthcare.
  • About The Concept
  • According to the World Health Organization, 7.5 million Americans live with psoriasis, a life-impacting chronic skin condition whose consequences include social exclusion and depression. Psoriasis also comes with myths and misconceptions that deeply affect patients. For Psoriasis Awareness Month, I conceptualized AI'M SPEAKING, an AI-powered art exhibit that gives a uniquely powerful voice to patients.
PSORIASIS AWARENESS MONTH
THE AI'M SPEAKING EXPERIENCE

An AI-powered interactive experience, the AI’M SPEAKING concept is inspired by the painted allegories of the Baroque period. The Baroque interest in human psychology went hand in hand with an intense scrutiny of the physical world, spurred by the optical discoveries of the 16th and 17th centuries. Baroque allegories deployed rhetorical qualities that conveyed subtle meaning through spectacular, sometimes theatrical, experiences.

A gallery of interactive videos, the AI’M SPEAKING exhibit draws on Baroque dramatization with a series of iridescent interactive portraits evoking the inner life of psoriasis patients. Emotional intelligence and self-perception are central themes of the concept. The portrait series embeds facial analysis and speech recognition technologies, enabling the artworks to react to visitors’ facial expressions and to interact with them through speech.

The characters’ appearance, at once intimate and spectacular, nocturnal and fluorescent, expresses the many challenges of living with a skin that attracts unwanted attention, and the desperate attempt to hide it under layers of clothes and makeup. The interactive experience builds on this dramatic dynamic and the emotional roller coaster it entails. At first glance, each subject appears to be the still portrait of a person asleep. As visitors get closer to the artwork, the character awakens, makes eye contact, and directly engages the visitor by asking a question. Each segment of the conversation between characters and visitors is powered by Machine Learning and Natural Language Processing and scripted from real patient testimonials.

While interacting with these ‘living’ portraits, visitors become part of a patient-focused conversation that deepens their understanding of the unique challenges of living with psoriasis. The exhibit ultimately challenges the myths surrounding psoriasis while raising awareness and building empathy towards patients.

The deepest thing in man is the skin.
Paul Valéry.
The Technology behind AI'M SPEAKING

This concept integrates Computer Vision, Deep Learning, and Natural Language Processing to enable naturalistic interactions between artificial agents and users. Interactions are powered by cognitive sensors and rendered with real-time motion retargeting and lip-sync video technologies. Characters are created from interwoven video and CGI, which produces realistic animation and natural transitions between 'sleep' and interaction states. The installation is enhanced with motion sensors, cameras, 3D microphones, and a sound system; characters’ behavior and speech result from real-time data processing combined with underlying scripted branching scenarios. Motion is detected every time a visitor steps into a defined perimeter. Facial and silhouette analysis, eye-gaze tracking, and speech recognition enable the artificial characters to detect and respond naturalistically to visitors.
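For readers curious about the mechanics, the sketch below illustrates one way such a proximity-triggered interaction could be organized as a simple state machine, written in Python. It is a minimal illustration only: the state names, sensor readings, and speech events are hypothetical placeholders, not the exhibit's actual implementation.

    from enum import Enum, auto

    class State(Enum):
        ASLEEP = auto()      # still portrait, idle video loop plays
        AWAKE = auto()       # visitor detected: wake, make eye contact, ask the opening question
        CONVERSING = auto()  # branching dialogue driven by speech recognition

    def next_state(state, visitor_in_perimeter, heard_speech):
        """One step of a portrait's interaction state machine.

        visitor_in_perimeter would come from the motion sensors and
        heard_speech from the speech-recognition front end; both are
        stand-ins here.
        """
        if not visitor_in_perimeter:
            return State.ASLEEP          # no one nearby: return to the still portrait
        if state is State.ASLEEP:
            return State.AWAKE           # wake up and engage the visitor
        if state is State.AWAKE and heard_speech:
            return State.CONVERSING      # the visitor answered: enter the scripted dialogue
        return state

    # Example: a visitor approaches, replies, then walks away.
    state = State.ASLEEP
    for in_range, spoke in [(True, False), (True, True), (True, True), (False, False)]:
        state = next_state(state, in_range, spoke)
        print(state.name)

In the actual installation, each state change would trigger the corresponding video segment and lip-synced audio rather than a printed label.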

  • Interactive Storytelling
  • Machine Learning and Natural Language Processing enhance the emotional intelligence and personalization features of the immersive and interactive experiences we design. We build partnerships with third parties to create benchmarks based on datasets and to train artificial agents in virtual environments. This enables deep reinforcement learning algorithms to improve the storytelling and respond more realistically to users.
  • Emotion Sensing
  • By putting the user at the center of the experience, NLP allows for highly personalized interactions. NLP-enabled agents understand, process, and respond to human speech, which supports user-friendly, seamless, and controller-free interactions. When useful, agents can remember previous interactions with specific users and adapt their responses, or tailor the plot, in line with the sentiments they detect, as sketched after this list.
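As a rough illustration of the sentiment-driven branching described above, the following Python sketch routes a conversation through a tiny script according to the sentiment detected in a visitor's reply. The keyword-based classifier and the example lines are hypothetical placeholders; a production agent would rely on trained NLP models and on scripts built from real patient testimonials.

    from dataclasses import dataclass, field

    @dataclass
    class DialogueNode:
        prompt: str                                   # line the character speaks
        branches: dict = field(default_factory=dict)  # detected sentiment -> next node id

    # A tiny illustrative script; real nodes would be written from patient testimonials.
    SCRIPT = {
        "opening": DialogueNode(
            "Have you ever hidden a part of yourself to avoid being stared at?",
            {"empathetic": "shared_story", "dismissive": "myth_busting", "neutral": "shared_story"},
        ),
        "shared_story": DialogueNode("Let me tell you what a flare-up feels like from the inside."),
        "myth_busting": DialogueNode("Psoriasis is not contagious. Let me tell you what it really is."),
    }

    def detect_sentiment(utterance):
        """Placeholder sentiment detector; the exhibit would call a trained NLP model."""
        lowered = utterance.lower()
        if any(word in lowered for word in ("sorry", "hard", "understand")):
            return "empathetic"
        if any(word in lowered for word in ("just a rash", "contagious", "gross")):
            return "dismissive"
        return "neutral"

    def next_node(current_id, visitor_reply):
        """Pick the next scripted node based on the sentiment of the visitor's reply."""
        sentiment = detect_sentiment(visitor_reply)
        return SCRIPT[current_id].branches.get(sentiment, current_id)

    print(next_node("opening", "I'm so sorry, that must be hard."))  # shared_story
    print(next_node("opening", "Isn't it just a rash?"))             # myth_busting

Remembering a visitor's previous replies, as described above, would amount to keeping a per-visitor history alongside the current node and letting it weight the branch selection.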
Project skills & expertise

Every project we lead is conceptualized from a holistic perspective and integrates the most relevant, well-adapted technologies. We create environments, storyworlds, and experiences that emphasize seamless and delightful interactions between digital environments, artificial agents, and human users.

CREATIVE DIRECTION, VIDEO DIRECTION

ARTIFICIAL INTELLIGENCE DEPLOYMENT

UX, INTERACTION DESIGN, GRAPHIC DESIGN

STORYTELLING

The AI'M SPEAKING Experience: Expect Emotional Intelligence
  • AI: ML and NLP deployment & integration, IT
  • Experience Design: AR UX, interaction design, CGI, interface engineering
  • Video: Direction, multibranching storyboarding, editing, VFX
  • Storytelling: Interactive storytelling, medical science compliance