Laboratoire d'informatique de l'École polytechnique

News

2 post-doc positions

Post-doc positions — Deep Learning for Text/NLP

The Data Science and Mining group (DASCIM team), led by Professor Michail Vazirgiannis at École Polytechnique, is seeking two post-doctoral researchers in the area of “AI/Machine Learning for Graphs and Text”, with a focus on deep learning for text/NLP. The appointment is for one year, with possible renewal for additional years.

More specifically, the topics are:

  • Abstractive summarization with end-to-end architectures
  • Unsupervised NLP (e.g. sense induction, disambiguation, unsupervised summarization/paraphrasing)
  • Deep learning for text streams, e.g. hashtag prediction on Twitter
  • Text generation algorithms for document understanding

The candidates must have at least two of the following skill sets:

  • Sound mathematical background (probability/statistics, linear algebra)
  • Experience in Text Retrieval & Mining (recommendation algorithms, text categorization, opinion/sentiment mining, text generation)
  • Deep Learning skills (with recent architectures, e.g. Transformers, GNNs)
  • Very good programming skills (including Python)

Requirements:

  • a recent Ph.D. degree in Computer Science, Applied Mathematics, or Engineering
  • analytical skills and creative thinking, with a hard-working attitude
  • a sound publication record with visible impact

Funding: Competitive funding (salary, travel budget, budget for interns) for up to 36 months.

Applications

Interested candidates should send the following by email to Prof. M. Vazirgiannis (mvazirg@lix.polytechnique.fr):

  • a cover letter presenting the candidate's academic record and motivation
  • a full CV with detailed grade information for the acquired degrees

The location

The post-doctoral researchers will be based in the Informatics Laboratory (LIX) of École Polytechnique, in the greater Paris area. École Polytechnique is the premier engineering university of France, highly ranked internationally in recent university rankings, and the leading founding member of the Institut Polytechnique de Paris, which federates some of the best engineering schools in France. Famous scientists (including Nobel Prize and Fields Medal recipients) and industrial leaders are alumni of the school, which offers an exceptional research environment at the centre of the fast-growing Saclay excellence cluster, a few kilometres south of Paris. The DaSciM group already has a significant impact on local and international research and industrial activities, and offers ample computing resources and facilities on the university campus.

See further details at:

DASCIM group: http://www.lix.polytechnique.fr/dascim/
LIX @Ecole Polytechnique: http://www.lix.polytechnique.fr/en
Ecole Polytechnique: https://www.polytechnique.edu/en
IPP: https://www.ip-paris.fr/en/home-en/
M. Vazirgiannis Google Scholar page: https://scholar.google.fr/citations?user=aWGJYcMAAAAJ&hl=en

Talk by Antonio Jiménez-Pastor: « DD-finite functions: a computable extension for holonomic functions »

Speaker: Antonio Jiménez-Pastor
Location: Room Emmy Noether, Alan Turing Building
Date: Tue, 28 Sep 2021, 11:00-12:00

The next meeting of the Max seminar will be on Tuesday, September 28. Antonio Jiménez-Pastor will talk about DD-finite functions: a computable extension for holonomic functions.

Abstract: D-finite, or holonomic, functions are solutions to linear differential equations with polynomial coefficients. It is this property that allows us to represent these functions exactly on a computer. In this talk we present a natural extension of this class of functions: the DD-finite functions, which are solutions of linear differential equations with D-finite coefficients. We will see the properties these functions have and how we can compute with them algorithmically.
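
As a minimal illustration of this idea (ours, not from the talk): a D-finite function is fully described on a computer by its defining equation plus finitely many initial conditions. For the concrete example f'' + f = 0 with f(0) = 0, f'(0) = 1 (whose unique solution is sin), equating coefficients in the power series turns the ODE into a recurrence for the Taylor coefficients, from which the function can be evaluated to any desired accuracy.

```python
# Illustration only: representing a D-finite function by its ODE + initial
# conditions. For f'' + f = 0, matching the coefficient of x^k in the series
# f = sum a_k x^k gives the recurrence (k+1)(k+2) a_{k+2} + a_k = 0.
from fractions import Fraction
import math

def taylor_coeffs(n_terms, a0=Fraction(0), a1=Fraction(1)):
    """Exact Taylor coefficients of the solution of f'' + f = 0 with
    f(0) = a0, f'(0) = a1 (the defaults give sin)."""
    a = [a0, a1]
    for k in range(n_terms - 2):
        a.append(-a[k] / ((k + 1) * (k + 2)))
    return a[:n_terms]

def evaluate(coeffs, x):
    """Evaluate the truncated Taylor series at x by Horner's rule."""
    acc = Fraction(0)
    for c in reversed(coeffs):
        acc = acc * Fraction(x) + c
    return float(acc)

print(evaluate(taylor_coeffs(25), 0.5))  # ~0.479425538604203
print(math.sin(0.5))                     #  0.479425538604203...
```

The same principle underlies the closure properties of the class: sums and products of D-finite functions are again D-finite, and their defining equations can be computed symbolically; the DD-finite extension lifts this arithmetic to equations whose coefficients are themselves D-finite.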

Multi-Output Prediction: On the Question of Modeling Outputs Together (When; Why; And Implications for Transfer Learning)

Speaker: Jesse Read (DASCIM)
Location: Amphi Sophie Germain (+remote)
Date: Thu, 23 Sep 2021, 13:00-14:00

In machine learning we often need to build models that predict multiple outputs for a single instance (consider the large areas of multi-label classification and multi-target regression, with applications in diverse domains: text categorization, image labeling, signal classification, time-series forecasting, recommender systems, …). A common assumption through much of this literature is that one should model the outputs together due to the presence of dependence among them. Intuitively this makes sense, but others argue, often convincingly, that modeling the outputs independently is sufficient. Much of this discrepancy can be resolved once we know which loss metric(s) are under consideration; however, there is a more interesting story to tell, since empirical and theoretical results sometimes contradict each other, and years of activity in these areas still do not give us a full picture of the mechanisms behind the relative success (or lack thereof) of modeling outputs/labels/tasks together. In exploring what it means for one label to be 'dependent' on another, we take a path through some old and some new areas of the literature. We come across interesting results, which we then take into the area of transfer learning to challenge the long-held assumption that the source task must be similar to the target task.
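
As a concrete (and deliberately simple) illustration of the loss-metric point above, the sketch below compares independent per-label models ("binary relevance") with a classifier chain, which feeds earlier predicted labels into later ones, under two metrics. The synthetic dataset and the logistic-regression base model are our assumptions for illustration, not the speaker's experiments.

```python
# Sketch: independent per-label models vs. a classifier chain, scored under
# Hamming loss and subset (exact-match) accuracy on synthetic multi-label data.
from sklearn.datasets import make_multilabel_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, hamming_loss
from sklearn.model_selection import train_test_split
from sklearn.multioutput import ClassifierChain, MultiOutputClassifier

X, Y = make_multilabel_classification(n_samples=2000, n_classes=6,
                                      n_labels=3, random_state=0)
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)

# Binary relevance: one independent classifier per label.
independent = MultiOutputClassifier(LogisticRegression(max_iter=1000)).fit(X_tr, Y_tr)
# Classifier chain: each label's model also sees the previous labels.
chain = ClassifierChain(LogisticRegression(max_iter=1000), random_state=0).fit(X_tr, Y_tr)

for name, model in [("independent", independent), ("chain", chain)]:
    P = model.predict(X_te)
    # Hamming loss scores each label separately; subset accuracy rewards
    # getting the entire label vector right, where dependence can pay off.
    print(f"{name:11s}  Hamming loss = {hamming_loss(Y_te, P):.3f}  "
          f"subset accuracy = {accuracy_score(Y_te, P):.3f}")
```

A known theoretical result in this literature is that Hamming loss is minimized by predicting each label's marginal mode, so independent models suffice, whereas subset 0/1 loss is minimized by the joint mode of the label distribution, which is precisely where modeling the outputs together can help.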