Internship

Neural Architecture Growth for Frugal Learning

Position type: AI, Data Science

We aim to train tiny neural networks with a flexible architecture that adapts and grows on the fly during training, in order to reduce the environmental footprint of AI.


Keywords

Neural networks, frugal AI, differential calculus, optimization, expressive power

Internship context

Research lab: INRIA TAU team (joint team between INRIA, CNRS and LISN of Université Paris-Saclay)
Location: LISN (building 660 “Digiteo”, at Université Paris-Saclay)
Supervision team: Guillaume Charpiat, Sylvain Chevallier, Alex Davey, Stella Douka, François Landes, Stéphane Rivaud, Théo Rudkiewicz and Alena Shilova

Contact: tau-frugal@inria.fr

Funding: European project MANOLO

Expected skills


The required skills are the central ones for any ML researcher, on both the theoretical and technical sides:

  • Python, PyTorch, Git (could be learned on the fly)
  • maths: linear algebra (SVD…), differential calculus, statistics…
  • Optimisation techniques
  • Properties of small and large networks
  • Expressivity of neural networks and general notions of expressivity (VC-dimension, Rademacher complexity)
  • Estimation of carbon footprint of a computation
  • Deep understanding of the computational cost of neural network training and of linear algebra operations on GPUs
  • Software development skills (continuous integration, documentation, use of a compute cluster (Slurm), …) to improve our open-source implementation.
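As a flavor of the linear-algebra reasoning involved, here is a minimal, purely illustrative sketch (not taken from the team's codebase): using the SVD to estimate the numerical rank of a weight matrix. Detecting that a layer's weights are effectively low-rank is one way such tools can inform how small a network can be kept, or where it needs to grow. The function name `numerical_rank` and the tolerance are our own choices for illustration.

```python
import numpy as np

# Illustrative sketch: numerical rank of a weight matrix via the SVD.
# A low effective rank suggests the layer is over-parameterized.
def numerical_rank(W, rel_tol=1e-8):
    """Number of singular values above rel_tol times the largest one."""
    s = np.linalg.svd(W, compute_uv=False)  # singular values, descending
    return int(np.sum(s > rel_tol * s[0]))

rng = np.random.default_rng(0)
# A 64x64 matrix built as a product of rank-8 factors: its true rank is 8.
W = rng.standard_normal((64, 8)) @ rng.standard_normal((8, 64))
print(numerical_rank(W))  # -> 8
```

The relative tolerance filters out singular values that are zero up to floating-point noise, which is why the rank-8 structure is recovered exactly here.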
