Research in astronomy
Astronomers have been collecting photons emitted from various regions of the Universe for millennia. For that purpose, each civilization has imagined, built and often passed on to its successors various instruments, resulting in an arsenal of observation techniques and devices of which current instruments are often, at least in their principles, the direct descendants. Because astronomy is fundamentally an observational science, astronomers have continuously scrutinized the Earth's atmosphere, their instruments and their data. They have also constantly relied on data modeling and analysis techniques to understand the astrophysical information collected by their instruments, and to extract it from the data. Astronomy thus has a natural, deep and long-standing acquaintance with atmospheric optics, the quest for new optical concepts, applied mathematics, and signal processing.
Astronomy in the twenty-first century is increasingly driven by very large international projects aimed at building extremely ambitious instruments. Such instruments are designed to challenge our current understanding of the Universe, and their realization requires major breakthroughs with respect to the technology available at the time they are conceived. These breakthroughs may concern the physical location of the instrument (testing and equipping new high-altitude, low-turbulence sites on Earth, or synchronizing and connecting the observing facilities of multiple terrestrial sites separated by thousands of kilometers) and hardware technologies (such as ultra-low-noise detectors or high-precision optical devices). But this is only halfway to success, because modern astronomical data are often extremely complex and large. Without dedicated processing methods, the scientific exploitation of such instruments remains very poor relative to their financial cost and to the information that could potentially be extracted from them.
A first common characteristic of modern astronomical instruments is their high complexity, which poses unprecedented modeling and calibration problems that directly imperil their exploitation. Another characteristic is that these instruments are designed to cover large observation domains (for instance, very wide fields of view or very broad wavelength ranges) with very fine sampling grids in each dimension. This implies that, regardless of the fidelity of the hardware technology, the targeted astrophysical information remains at the instrument's detection/estimation limit. The instrumental complexity and the faintness of the information to be extracted simultaneously challenge modeling, detection, and estimation techniques. Finally, the volume of data generated by these high-resolution instruments is often huge, which poses storage and computational problems.
Astronomical data processing
Signal processing techniques devoted to modern astronomical instruments must be extremely accurate, robust (able to cope with misspecified direct models and with outliers) and implementable at very low computational and memory costs. In this context, our group is active in several domains of signal processing, at both the theoretical and application levels. Theoretical aspects concern information and data modeling (including statistical learning techniques and sparse models), estimation methods and inverse problems (in particular through dedicated regularization terms and/or statistical priors), hypothesis testing (in particular for weak alternatives in large-scale data) and optimization. Applications to recent instruments or projects include SPHERE (exoplanet detection), MUSE (in particular the detection and estimation of distant and faint galactic sources), VLTI (image reconstruction for polychromatic interferometers such as AMBER) and, recently, the SKA and some of its pathfinders (dedicated image priors, statistical characterization of the direct model, and approaches to large-scale inverse problems, including optimization strategies).
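To illustrate the kind of regularized inverse problem mentioned above, the following sketch recovers a few point sources from blurred, noisy 1D data using a Tikhonov (quadratic) regularization term. All sizes, the blur kernel and the regularization weight are illustrative choices, not parameters from any actual instrument pipeline.

```python
import numpy as np

# Toy inverse problem: recover x from y = H x + noise by minimizing
#   ||y - H x||^2 + mu * ||x||^2   (Tikhonov regularization).
rng = np.random.default_rng(0)

n = 64
x_true = np.zeros(n)
x_true[[10, 30, 45]] = [1.0, 0.4, 0.6]           # sparse "sources" (illustrative)

# Direct model H: circular convolution with a Gaussian blur kernel.
t = np.arange(n)
kernel = np.exp(-0.5 * ((t - n // 2) / 2.0) ** 2)
kernel /= kernel.sum()
H = np.array([np.roll(kernel, i - n // 2) for i in range(n)])

y = H @ x_true + 0.005 * rng.standard_normal(n)  # noisy blurred data

# Closed-form Tikhonov solution: x = (H^T H + mu I)^{-1} H^T y.
mu = 1e-3
x_hat = np.linalg.solve(H.T @ H + mu * np.eye(n), H.T @ y)

residual = np.linalg.norm(y - H @ x_hat)         # data misfit of the estimate
```

In practice our problems involve far larger data, non-quadratic (e.g. sparsity-promoting) priors and iterative solvers, but the structure (direct model, data-fidelity term, regularizer) is the same.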
Atmospheric optics and site-testing
Ground-based astronomy is severely limited by atmospheric turbulence. Advances in atmospheric optics, particularly in characterizing atmospheric turbulence and modeling its effects on astronomical image formation, have contributed significantly to improving the resolution of ground-based optical instruments. Our group brings together extensive knowledge of light propagation in turbulent media, atmospheric physics and instrumental development. Together with a unique set of instruments for probing atmospheric turbulence, this expertise has led our team to take part in the site-testing of major projects, including existing telescopes of the 8-10 m class and future Extremely Large Telescopes (ELTs) such as the E-ELT (the European 40 m telescope) and the TMT (the American 30 m telescope). Our group is also in charge of qualifying very promising sites such as Dome C in Antarctica and Ali on the high-altitude plateau of western Tibet. Our group is also very active in optical turbulence forecasting using 3D non-hydrostatic meteorological models, such as the Weather Research and Forecasting (WRF) model coupled with the Trinquet-Vernin parametrization.
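As a concrete link between turbulence characterization and image quality, the sketch below turns a Cn² turbulence profile into the Fried parameter r0 and a seeing estimate, using the standard formula r0 = [0.423 k² sec(z) ∫ Cn²(h) dh]^(-3/5) with k = 2π/λ. The layered profile values are purely illustrative, not measurements from any of the sites mentioned above.

```python
import numpy as np

wavelength = 500e-9                    # m, visible band
zenith_angle = 0.0                     # rad, observation at zenith
k = 2.0 * np.pi / wavelength

# Illustrative layered profile: (layer thickness in m, Cn^2 in m^-2/3).
layers = [(100.0, 1e-15), (900.0, 3e-16), (5000.0, 5e-17), (10000.0, 1e-17)]
cn2_integral = sum(dh * cn2 for dh, cn2 in layers)   # approx. of the integral

# Fried parameter: coherence diameter of the wavefront.
r0 = (0.423 * k**2 * cn2_integral / np.cos(zenith_angle)) ** (-3.0 / 5.0)

# Seeing-disc FWHM ~ 0.98 lambda / r0, converted to arcseconds.
seeing_arcsec = 0.98 * wavelength / r0 * 180.0 / np.pi * 3600.0
```

With this profile the result is a coherence diameter of roughly 10 cm and a seeing of about one arcsecond, typical orders of magnitude for a good astronomical site in the visible.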
Contacts: Jean Vernin
High-angular resolution & high-contrast imaging techniques
High-angular resolution (HAR) techniques aim at revealing the smallest possible details of astronomical objects by reaching the theoretical angular resolution of large telescopes (namely λ/D, where λ is the observing wavelength and D the telescope diameter) in spite of atmospheric turbulence. In addition, high-contrast imaging (HCI) techniques, such as stellar coronagraphy, aim at revealing the faintest possible details around much brighter objects, e.g., permitting the detection of an exoplanet orbiting its parent star.
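A quick numerical illustration of the λ/D limit, with values chosen for illustration (an 8 m VLT-class aperture observed at 550 nm):

```python
import math

wavelength = 550e-9        # m, visible band (illustrative choice)
diameter = 8.0             # m, VLT-class aperture (illustrative choice)

theta_rad = wavelength / diameter                 # theoretical resolution lambda/D
theta_arcsec = theta_rad * 180.0 / math.pi * 3600.0
# ~0.014 arcsec: roughly two orders of magnitude finer than the ~1 arcsec
# seeing imposed by turbulence, which is what HAR techniques must overcome.
```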
From a theoretical point of view, our group has developed instrumental concepts such as the Apodized Lyot Coronagraph (ALC) for HCI, a concept that is producing impressive results within the SPHERE instrument (Spectro-Polarimetric High-contrast Exoplanet REsearch), mounted on the Very Large Telescope (VLT) of the European Southern Observatory (ESO) in Chile.
Contacts: Claude Aime
Research in other areas
Surveillance methods and real-time sensing for system monitoring
System surveillance deals with the real-time monitoring of persistent and transient phenomena within a given environment. The primary objective is to provide an automatic interpretation of scenes and to predict their evolution from the information acquired by sensors. For instance, hyperspectral imaging provides 2D spatial images over many contiguous spectral bands, making it possible to identify and quantify distinct materials from remotely observed data. Applications include land-use analysis, mineral detection, environmental monitoring, field surveillance, etc. System surveillance plays a central role within the area of signal and image processing, and involves research topics such as system modeling, classification, detection and estimation with centralized or distributed techniques.
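The material identification mentioned above is often posed as linear spectral unmixing: each observed pixel spectrum y is modeled as y = M a + noise, where the columns of M are endmember spectra and a holds nonnegative abundances. The sketch below uses synthetic Gaussian endmembers (not real material signatures) and a plain projected-gradient solver, a minimal stand-in for production unmixing methods.

```python
import numpy as np

rng = np.random.default_rng(1)

bands = np.linspace(0.4, 2.5, 50)        # wavelengths in micrometres (illustrative)

def endmember(center, width):
    """Synthetic bell-shaped endmember spectrum (hypothetical material)."""
    return np.exp(-0.5 * ((bands - center) / width) ** 2)

M = np.column_stack([endmember(0.8, 0.3), endmember(1.5, 0.4),
                     endmember(2.1, 0.2)])   # 50 bands x 3 materials

a_true = np.array([0.6, 0.3, 0.1])           # ground-truth abundances
y = M @ a_true + 0.005 * rng.standard_normal(len(bands))

# Nonnegative least squares by projected gradient descent on 0.5||y - Ma||^2.
a = np.zeros(3)
step = 1.0 / np.linalg.norm(M.T @ M, 2)      # safe step from the Lipschitz bound
for _ in range(2000):
    a = a - step * (M.T @ (M @ a - y))       # gradient step
    a = np.maximum(a, 0.0)                   # project onto the constraint a >= 0
```

The nonnegativity projection encodes the physical constraint that material abundances cannot be negative; real pipelines add a sum-to-one constraint and handle millions of pixels.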