Research areas: data science, geometric science of information, big data, machine learning, high-dimensional statistics.
Challenge: Current data science is often biased by inappropriate representations
and by ad hoc data/model goodness-of-fit measures or distances
(poster).
Goal: Learn appropriate data/model geometry for Intrinsic Data Science with principled distances.
How to: By building a theory of Computational Information Geometry...
... and showcasing it in applications arising in data science, learning, intelligence, vision, and imaging.
The fabric of modern information theory: computation is the science of transformations,
geometry the science of invariance, and information the science of communication (between data and models).
Driven by serendipity and curiosity, I also sometimes work on computer graphics, user interfaces, computational photography, computer arithmetic, systems biology,
geometric combinatorial optimization, economics, (GP)GPU programming, etc.
I am also interested in epistemology, in particular abduction and other principles yielding creativity and new knowledge.
[publications]
(books | arxiv | DBLP)
[lectures]
blog
[video]
[slides]
[software]
[services/events/editorship]
Recent highlights
- 3rd Geometric Science of Information (GSI) LNCS proceedings
- Information Geometry journal, Springer
(board)
- Geometric Science of Information (GSI), 2017
- Geometrical and topological structures of information, CIRM, August 28-September 1, 2017
- On Hölder projective divergences
- A series of maximum entropy upper bounds of the differential entropy (slides)
- Combinatorial bounds on the $\alpha$-divergence of univariate mixture models, IEEE ICASSP 2017
- Information Geometry Metric for Random Signal Detection in Large Random Sensing Systems, IEEE ICASSP 2017
- Interview with Prof. C.R. Rao, December 2016
(formatted in LaTeX/PDF)
- Tsallis Regularized Optimal Transport and Ecological Inference, AAAI-17 (code)
- Exploring and measuring non-linear correlations: Copulas,
Lightspeed Transportation and Clustering, NIPS Time Series Workshop, 2016.
- Large Margin Nearest Neighbor Classification using Curved Mahalanobis Distances (arxiv, slides, ICIP'16, video of Cayley-Klein Voronoi diagrams)
- Guaranteed bounds on the Kullback-Leibler divergence of univariate mixtures using piecewise log-sum-exp inequalities (arxiv, IEEE SPL)
- Learning on High-dimensional Neuromanifolds with Relative Natural Gradients (arxiv)
- Loss factorization, weakly supervised learning and label noise robustness (arxiv, ICML'16)
- Patch Matching with Polynomial Exponential Families and Projective Divergences (SISAP'16, paper)