Mathematics and computer science master’s track 2024/25

Data Sciences, Geometry and Combinatorics

The “mathematics and computer science track” is a master’s (M2) track offering dual training in mathematics and computer science, with courses at the interface of the two disciplines (almost every course is taught jointly by at least one mathematician and one computer scientist). A significant fraction of the attendees are non-French speakers, and the lectures are delivered in English.

This track is simultaneously a track of the M2 “Computer Science” master’s program, which leads to a diploma in Computer Science, and of the M2 “Mathematics” master’s program, which leads to a diploma in Mathematics. It is also possible to obtain both diplomas by registering in each program. The track is jointly supported by the Bézout Labex, the UFR of Mathematics and the Institut Gaspard Monge. The persons in charge are Laurent Hauswirth (LAMA) and Cyril Nicaud (LIGM). The track is open to all students, whether or not they are supported by the Bézout scholarship program.

The information below concerns the academic year 2024-2025. Archives from previous years are available here: 2018-2019, 2019-2020, 2020-2021, 2021-2022, 2022-2023. Before 2018, the Bézout Labex supported individual courses at the interface of mathematics and computer science.

REGISTRATION 2024-2025: Registration is possible on the following pages of the mathematics master’s program and the computer science master’s program (depending on the degree aimed for). These pages are not completely up to date concerning the pedagogical aspects; see below for the latest information.


(Note: UE = “Unité d’enseignement”, an indivisible block of lectures; HETD = “Heure équivalent TD”: 1 HETD corresponds to 40 minutes of lecture.)

  • 4 weeks on basics: complements in mathematics and complements in computer science. Each UE has 6 ECTS and 48 HETD. (In 2024-2025: from September 16 to October 11.)
  • 10 weeks for a broad general background: data sciences, probabilistic methods, discrete maths and geometric calculus. Each UE has 6 ECTS and 60 HETD, split into 2 courses of 3 ECTS and 30 HETD each. (In 2024-2025: from October 14 to December 20. Exams the week of January 6.)
  • 8 weeks for two specialization UE, chosen by students among four, each with 6 ECTS and 40 HETD. (In 2024-2025: from January 20 to March 21, with a one-week break. Exams the week of March 24.)
  • A research memoir/internship of 18 ECTS. (In 2024-2025: from March 31.)

Preliminary list of courses

Semester 1:
  • UE Basics Mathematics (analysis, algebra, probability, geometry) – 6 ECTS, 48 HETD
  • UE Basics Computer Science (complexity, algorithms, programming, graphs) – 6 ECTS, 48 HETD
  • UE Discrete and continuous optimisation – 6 ECTS, 60 HETD
      • Discrete optimisation – 3 ECTS, 30 HETD
      • Continuous optimisation – 3 ECTS, 30 HETD
  • UE Geometry and Combinatorics – 6 ECTS, 60 HETD
  • UE Data Sciences – 6 ECTS, 60 HETD
      • Introduction to data sciences – 3 ECTS, 30 HETD
      • Computational aspects of data sciences – 3 ECTS, 30 HETD

Semester 2:
  • UE Advanced Data Sciences – 6 ECTS, 40 HETD
      • Deep learning methods – 3 ECTS, 20 HETD
      • Computational aspects of deep learning methods – 3 ECTS, 20 HETD
  • UE Maths specialization: Advanced Geometry – 6 ECTS, 40 HETD
  • UE CS specialization: Algebraic Combinatorics and formal calculus – 6 ECTS, 40 HETD
      • Combinatorial Hopf algebras – 3 ECTS, 20 HETD
  • UE Large random matrices and applications – 6 ECTS, 40 HETD

Detailed description of the courses of the first semester

Basics in Mathematics.
  • Algebra and linear algebra (Fradelizi): Groups: order, quotient group, cyclic groups, finite groups, finite abelian groups, group actions; Rings, polynomials and fields: ideals, principal ideal domains, finite fields; Linear algebra: endomorphisms, eigenvectors, spectral theorem.
  • Analysis (Sester): Normed vector spaces: equivalent norms, topology, continuous functions, compactness, the finite-dimensional case; examples of metric spaces; differential calculus; extremum problems; convex functions, convexity inequalities; asymptotic analysis.
  • Probability (Martinez): Random experiments and probability spaces; law, mean value, moments, … of a random variable; applications to combinatorics on graphs; deviation and concentration inequalities (Markov, Chebyshev, Hoeffding, …); martingales and martingale inequalities; Markov chains. Reference: The Probabilistic Method (N. Alon, J. H. Spencer).
  • Geometry (Fanoni and Sabourau): We will give an introduction to graph theory: connectedness, degree, trees/forests, adjacency matrix, etc.
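As a small illustration of the probability syllabus above, the following sketch checks Markov’s inequality, P(X ≥ a) ≤ E[X]/a, empirically; the exponential distribution, the sample size and the threshold are arbitrary choices for the example.

```python
import random

def empirical_markov(samples, a):
    """Compare the empirical tail P(X >= a) with Markov's bound E[X]/a."""
    n = len(samples)
    tail = sum(1 for x in samples if x >= a) / n   # empirical P(X >= a)
    bound = sum(samples) / n / a                    # empirical E[X] / a
    return tail, bound

random.seed(0)
# X ~ Exponential(1), a nonnegative random variable with E[X] = 1
xs = [random.expovariate(1.0) for _ in range(100_000)]
tail, bound = empirical_markov(xs, a=3.0)
assert tail <= bound  # Markov: P(X >= a) <= E[X] / a
```

The bound is loose here (the true tail is exp(-3) ≈ 0.05 against a bound of about 1/3), which is precisely the gap that the sharper Chebyshev and Hoeffding inequalities in the syllabus address.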
Basics in Computer Science.
  • Algorithmics: data structures (Nicaud): The data structures studied include arrays, lists, stacks, …; dynamic arrays; trees and balanced trees; heaps and priority queues; hash tables; range minimum query; suffix arrays; suffix trees.
  • Complexity (Thapper): The course is an introduction to computational complexity theory. We will cover the following notions: Turing machines, the Church-Turing thesis, (un)decidability, the halting problem; P, NP, polynomial-time reductions, NP-completeness, the Cook-Levin theorem, co-NP; PSPACE; the time and space hierarchy theorems, Ladner’s theorem; the polynomial hierarchy and collapses; approximation of NP-hard problems. References: Introduction to the Theory of Computation (Michael Sipser).
  • Programming (Borie): Python and Sage.
    Quick review of the basics of programming. Solving simple mathematical-algorithmic problems with Python (gcd, f(x) = 0, numerical integration, knapsack, backtracking, …). Getting started with Sage, a computer algebra system. A programming project at the interface of mathematics and computer science.
  • Graph theory (Bulteau and Weller): Fundamentals; connectivity; planar graphs; flows/cuts; examples of graph classes; examples of problems; matchings; P/NP and reductions; parameterized algorithms; examples of parameters; kernels; minors.
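To make one of the data-structure topics above concrete, here is a deliberately naive suffix array construction in Python: it is a direct transcription of the definition (indices of suffixes in lexicographic order), not an efficient algorithm; practical constructions run in O(n log n) or linear time.

```python
def suffix_array(s):
    """Indices of the suffixes of s in lexicographic order.
    Naive O(n^2 log n) construction, straight from the definition."""
    return sorted(range(len(s)), key=lambda i: s[i:])

# Suffixes of "banana" in lexicographic order:
# "a" (5), "ana" (3), "anana" (1), "banana" (0), "na" (4), "nana" (2)
assert suffix_array("banana") == [5, 3, 1, 0, 4, 2]
```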
Discrete and continuous optimization.
  • Discrete optimization (Thapper): Min-max results in combinatorial optimization provide elegant mathematical statements, are often related to the existence of efficient algorithms, and illustrate well the power of duality in optimization. The course aims to be a gentle introduction to the richness of this type of results, and especially those that belong to the theory of perfect graphs. It will make connections with the course on continuous optimization, in particular concerning linear programming and polyhedra, and will rely on concrete examples taken from industry that illustrate the relevance of tools from combinatorial optimization for real-world applications.
    The preliminary plan of the course is as follows: discrete optimization in bipartite graphs: Hall’s marriage theorem, König’s theorems, algorithms; chains and antichains in posets: theorems of Dilworth and Mirsky; chordal graphs: interval graphs, coloring, duality, decomposition; perfect graphs: definition, weak and strong perfect graph theorems; perfect graphs: polyhedra, algorithms; Lovász’s theta function: definition, computation, the sandwich theorem, Shannon capacity.
  • Continuous optimization (Sandier): The course will cover the theory and main examples of convex optimization. The tentative list of topics is as follows: convex sets and functions, convex optimization problems; duality and optimality conditions. Among the examples, we will see linear programming, quadratic programming and second-order cone programming. Additional topics will include sparse solutions via L1 penalization, and notions on algorithms, including the simplex algorithm and interior point methods.
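As a sketch of the algorithmic side of the bipartite results mentioned above (Hall, König), the following pure-Python function computes a maximum matching by repeatedly searching for augmenting paths (Kuhn’s algorithm); by König’s theorem, its size equals the minimum vertex cover in a bipartite graph. The adjacency-list encoding is an arbitrary choice for the example.

```python
def max_bipartite_matching(adj, n_right):
    """Size of a maximum matching in a bipartite graph (Kuhn's algorithm).
    adj[u] lists the right-side vertices adjacent to left vertex u."""
    match_right = [-1] * n_right  # match_right[v] = left vertex matched to v, or -1

    def try_augment(u, seen):
        # Look for an augmenting path starting at the free left vertex u.
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                if match_right[v] == -1 or try_augment(match_right[v], seen):
                    match_right[v] = u
                    return True
        return False

    return sum(try_augment(u, set()) for u in range(len(adj)))

# Left vertices 0..2, right vertices 0..2: a perfect matching exists.
adj = [[0, 1], [0], [1, 2]]
assert max_bipartite_matching(adj, 3) == 3
```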
Geometry and combinatorics.
  • Geometry (De Mesmay): Algorithms and combinatorics of embedded graphs. This course will provide an introduction to the study of graphs arising in geometric settings, with a focus on planar graphs and graphs embedded on surfaces. The main objective of the course is to explore the interactions between the geometry and topology of low-dimensional spaces on the one hand, and the combinatorics of their discrete structures on the other hand, as well as to showcase algorithmic techniques tailored for these objects. Topics that will be investigated include: basics of planar graphs (the Jordan curve theorem, combinatorial representations, duality, Euler’s formula, the Kuratowski-Wagner theorem); planarity testing; Tutte embeddings; efficient algorithms for planar graphs; the classification of surfaces; basics of topological graph theory; topological algorithms (homotopy testing and shortest loops).
  • Combinatorics (Novelli): The lectures on enumerative combinatorics will consist of the study of classical objects: permutations, trees, partitions, parking functions; classical sequences: factorial, Catalan, Schröder; and classical methods: bijections, group actions, induction, generating series. The lectures will be heavily based on the study of various examples, some very easy and others trickier.
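As a small example in the spirit of the enumerative lectures, the Catalan sequence mentioned above satisfies the recurrence C_{n+1} = sum_{i=0}^{n} C_i C_{n-i}, which mirrors the decomposition of a binary tree into its left and right subtrees; a direct Python transcription:

```python
def catalan(n):
    """First n Catalan numbers via the convolution recurrence
    C_{k+1} = sum_{i=0}^{k} C_i * C_{k-i}."""
    c = [1]  # C_0 = 1
    for k in range(n - 1):
        c.append(sum(c[i] * c[k - i] for i in range(k + 1)))
    return c

assert catalan(8) == [1, 1, 2, 5, 14, 42, 132, 429]
```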
Data Sciences.
  • Theoretical aspects of Data Sciences (Bonis):
    This course will provide the necessary tools to understand data sciences from a theoretical perspective. The goal is to introduce notions of statistical estimation, with a focus on parametric statistics, and to cast machine learning problems as statistical estimation problems. In particular, students will become familiar with the following concepts:
    – The basics of statistical estimation (estimator, loss function, …)
    – Classical estimators (moment method, maximum likelihood, …)
    – Properties of statistical estimators (mean squared error, bias-variance tradeoff, …)
    – Linear regression
    – General machine learning problems (regression and classification)
  • Computational aspects of Data Sciences (Lacombe):
    This course will present the basics of data sciences from the practitioner’s perspective. The goal is to understand the typical machinery (both theoretical and numerical) used by data scientists when designing machine learning models from a set of data. At the end of the course, students will be familiar with most notions any data scientist should know, including:
    – Standard terminology (supervised / unsupervised learning, regression / classification, etc.);
    – Optimization through gradient descent
    – Basics of supervised learning (linear regression, classification…)
    – Basics of unsupervised learning (k-means, PCA…).
    – Software: Python with pandas, numpy, scikit-learn, and jax. 
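As a minimal sketch tying together two of the bullet points above (gradient descent and linear regression), the following pure-Python function fits a line by descending the mean squared error; in the course and in practice one would use numpy or scikit-learn, and the learning rate and step count here are arbitrary choices for the example.

```python
def fit_line(xs, ys, lr=0.01, steps=2000):
    """Fit y ≈ a*x + b by gradient descent on the mean squared error."""
    a, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # gradients of (1/n) * sum((a*x + b - y)^2) w.r.t. a and b
        grad_a = sum(2 * (a * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (a * x + b - y) for x, y in zip(xs, ys)) / n
        a -= lr * grad_a
        b -= lr * grad_b
    return a, b

xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]  # exactly y = 2x + 1
a, b = fit_line(xs, ys)
assert abs(a - 2) < 1e-2 and abs(b - 1) < 1e-2
```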

Detailed description of the courses of the second semester (choose two courses)

Advanced Geometry and graph theory:

Advanced Graph theory (Fanoni and Sabourau): This course will focus on families of expander graphs. These are sequences of graphs, with a growing number of vertices, which are simultaneously sparse and highly connected. Thanks to these properties, they have many applications in mathematics and computer science. We will discuss constructions of examples of expanders, the different viewpoints that can be used to define them, and some of their properties. We will also present two applications, one in computer science (error-correcting codes) and one in mathematics (embeddings in Euclidean spaces).

Advanced Data Sciences:

Introduction to modern machine learning problems (Lacombe)

This course continues the first-semester course and will be dedicated to modern machine learning models, mainly (deep) neural networks and their different flavors. At the end of the course, the students will have notions on: feedforward fully-connected networks and their training through back-propagation, convolutional and residual neural networks, diffusion models, the transformer architecture… In terms of software, we will rely on Python with TensorFlow and/or PyTorch. Importantly, students will be able to quickly understand modern machine learning problems and adapt to new models when they encounter them in academia or industry.
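As a toy illustration of training by back-propagation, the sketch below trains a single sigmoid neuron on the AND function; this is the one-layer special case in which the chain rule collapses to the error signal delta = p - y. A feedforward network as covered in the course stacks such units and propagates the deltas backwards through the layers; the hyperparameters here are arbitrary choices for the example.

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def train_neuron(data, lr=0.5, epochs=5000):
    """Train one sigmoid neuron by gradient descent on the logistic loss.
    Back-propagation reduces here to a single chain-rule step:
    d(loss)/d(pre-activation) = p - y."""
    w1, w2, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (x1, x2), y in data:
            p = sigmoid(w1 * x1 + w2 * x2 + b)
            delta = p - y          # error signal "propagated back"
            w1 -= lr * delta * x1
            w2 -= lr * delta * x2
            b -= lr * delta
    return w1, w2, b

# The AND function is linearly separable, so a single neuron suffices.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1, w2, b = train_neuron(data)
preds = [round(sigmoid(w1 * x1 + w2 * x2 + b)) for (x1, x2), _ in data]
assert preds == [0, 0, 0, 1]
```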

Theoretical aspects of deep learning methods (Hebiri)

In this lecture, we will provide statistical guarantees (bounds on the generalization error) for general supervised learning algorithms. First, we will derive a bound for the Empirical Risk Minimizer (ERM) using tools from Vapnik–Chervonenkis theory. Then, we will consider several algorithms based on convexification of the risk in the context of binary classification. The last part of the course will explore modern multi-class classification problems and present some techniques for addressing them, based for instance on set-valued approaches.

By the end of the course, students will be familiar with classic ML algorithms such as trees, random forests, SVMs, boosting, and neural networks. In addition, the methods introduced in the course will be compared on real data.
Algebraic combinatorics and formal calculus (Borie and Novelli)

Operads in combinatorics (Borie): Informally, an operad is a space of operations, each with one output and several inputs, that can be composed. Each operad leads to the definition of a category of algebras. This theory offers a tool to study situations in which several operations interact with each other. The lecture begins by presenting some elementary objects of algebraic combinatorics: combinatorial classes and combinatorial algebras. We then introduce (non-symmetric) operads and study some tools allowing one to establish presentations of operads by generators and relations. Koszul duality for non-symmetric operads is an important part of this theory and will be presented. We end the lecture by reviewing some generalizations: colored operads, symmetric operads, and pros. We shall also explain how the theory of operads offers a tool for obtaining enumerative results.

Algebraic combinatorics (Novelli): The lectures on algebraic combinatorics will consist of the study of: classical symmetric functions and a short discussion of representation theory; noncommutative symmetric functions (NCSF); the definition of Hopf algebras; the dual algebra of NCSF, quasi-symmetric functions; the modern generalizations of these algebras; and the use of all these algebraic properties (transition matrices, expressions in various bases, morphisms of Hopf algebras) to solve (classical) combinatorial questions. As in the combinatorics lectures of the first semester, the lectures will be heavily based on the study of examples.