ILES

Sign Language Modelling and Processing

The people involved in the projects under this topic belong to several teams. Permanent members: A. Braffort (ILES), M. Filhol (ILES), M. Gouiffès (AMI), E. Prigent (CPU) and C. Verrecchia (P2I). Post-docs: C. Danet (ILES), E. Martinod (ILES). PhD students: C. Challant (ILES), P. Sharma (ILES), H. Bull (AMI), Y. Ouakrim (AMI & Gipsa-Lab).

Sign Languages

Sign Languages (SLs) are the natural languages used in Deaf communities; French Sign Language (LSF) is the one used in France. They are visuo-gestural languages: a person expresses themselves in SL using many body components (hands and arms, but also facial expressions, gaze, torso, etc.), while the interlocutor perceives the message through the visual channel. The linguistic system of SLs makes use of these specific channels: many pieces of information are expressed simultaneously and organized in space, and iconicity plays a central role.

To date, SLs have no writing system or standard transcription notation. They are still scarcely described and under-resourced (very few reference books, limited lexicons, partial knowledge of the grammar, few resources in general). Computer modelling of SLs requires designing representations in a domain where little data is available and where the pre-existing, essentially linear models developed for written or spoken languages do not cover all aspects of SLs.

Through numerous collaborations, we produce linguistic resources and address the analysis, representation and processing of Sign Languages in an interdisciplinary way. We draw on several fields of computer science (NLP, signal processing, computer graphics), as well as on language, movement and perception sciences.

Current PhD theses and projects

Current PhD theses

  • Modelling
    • 2021-2024: Camille Challant, “Formal representation and grammatical constraints for French sign language”. Supervisor M. Filhol. Paris-Saclay University
  • Animation generation
    • 2021-2024: Paritosh Sharma, “Sign language synthesis with a decreasing-granularity animation system based on AZee”. Supervisor M. Filhol. Paris-Saclay University
  • Analysis, recognition, machine learning

Recently defended theses

  • 2022: Marion Kaczmarek, “Specification of computer-aided translation software for signed languages” (online by the end of 2022). Supervisor A. Braffort, co-supervisor M. Filhol. Defended on 26/09/2022. Paris-Saclay University

Current projects and collaborations

  • 2021-2024: EASIER – European project, Horizon 2020
    Coordination at LISN: M. Filhol.
    EASIER aims to design, develop, and validate a complete multilingual machine translation system that will act as a framework for barrier-free communication among deaf and hearing individuals, as well as provide a platform to support sign language content creation.
    Persons involved at LISN: 2 permanent members (M. Filhol, A. Braffort), 1 invited researcher (J. McDonald), 1 engineer (T. Von Ascheberg)
  • 2020-2024: Serveur Gestuel – PSPC project of BPI France
    Coordination at LISN: A. Braffort.
    The aim is to provide deaf sign language users with the equivalent of a voice server for hearing people. The team is involved in both the automatic recognition and the automatic generation parts. Partners: IVèS and 4DViews companies, LISN and Gipsa-Lab laboratories.
    Persons involved at LISN: 4 permanent members (A. Braffort, M. Filhol, M. Gouiffès, C. Verrecchia), 3 PhD students (Y. Ouakrim, P. Sharma, H. Bull) and 2 post-docs (C. Danet, E. Martinod).
  • Ongoing collaboration with J. McDonald and R. Wolfe (DePaul University, Chicago) on movement analysis and sign language synthesis: generation of 3D animations from the AZee linguistic representations designed in the team, and movement analysis using mocap corpora for modelling and synthesis.

Recently completed projects

  • 2018-2021: Rosetta – PIA project “Grands défis du numérique” of BPI France
    Coordination at LISN: A. Braffort.
    Development of an automatic generator of multilingual subtitles for television programmes and internet video content for deaf and hard-of-hearing people, based on artificial intelligence and the animation of a virtual signer.
    Partners: Systran, MocapLab and MFP (Multimédia France Productions, a subsidiary of France Télévisions) companies, LISN and CHArt-LUTIN laboratories.
    Persons involved at LISN: 5 permanent members (A. Braffort, M. Filhol, M. Gouiffès, E. Prigent, F. Yvon), 3 PhD students (F. Bigand, F. Buet, M. Kaczmarek), 3 post-docs (V. Belissen, C. Danet, E. Martinod).