From 2023, the largest digital camera ever made will record and live stream (in cosmic time) a massive time lapse of the universe from an observatory on a mountaintop in Chile.
Dark matter time lapse
The idea for "repeated imaging of large areas of the sky" to probe for "transient objects" was first conceived in 1996 under the term "Dark Matter Telescope". Many years of development culminated in the construction of the Large Synoptic Survey Telescope (LSST) – a wide-field, large-aperture reflecting telescope that will help us understand some of the most fundamental questions about the universe. Over the next ten years it will digitally scan the entire available sky. Over and over again. Think of it as a really fancy astro panorama time lapse.
A panorama camera
The "Legacy Survey of Space and Time", as the project is officially called, will record (almost) every night for ten years. Every twenty seconds it will take a picture that encompasses an area of the sky equivalent to roughly 40 full moons. Above all, it's a steady rhythm: 15-second exposure and 5 seconds to move the camera on its giant head. The wide field of view and the very short (in cosmic time) interval makes it ideal to observe quick and feint objects or to measure the spectrum of mass as it evolves. As a result, it will catalogue 90% of the near-Earth objects larger than 300 m. Furthermore, it will assess the threat they pose to life on Earth. How does that work?
Sensor size XXL
Firstly, the camera weighs 3 tons and features the largest optical lens ever built, with a diameter of 1.55 m. And that is just one of three huge optical elements being used. Its focal plane, the equivalent of the imaging sensor in a digital camera, is more than 60 cm wide and made up of 189 individual CCD sensors, each contributing roughly 16 megapixels. As a result, a single picture consists of 3.2 billion pixels.
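The headline pixel count follows directly from that sensor layout. A quick check, assuming each CCD is the commonly quoted 4096 × 4096 pixels (about 16.8 megapixels):

```python
# Quick check of the 3.2-billion-pixel figure.
# Assumption: each of the 189 CCDs is about 4096 x 4096 pixels.

N_CCDS = 189
PIXELS_PER_CCD = 4096 * 4096            # ~16.8 million pixels per sensor

total_pixels = N_CCDS * PIXELS_PER_CCD
print(f"{total_pixels / 1e9:.2f} billion pixels")   # -> 3.17 billion, i.e. ~3.2 gigapixels
```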
Turn notifications on
The camera will take over 200,000 pictures every year, amounting to 1.28 petabytes (that's 1,280 terabytes) of data. While that data will help to further our understanding of how dark matter affects the behaviour of galaxies, it raises the question: how on earth are they going to analyse so much of it? An AI-driven processing system capable of around 250 teraflops will detect minuscule changes in brightness or position and notify whoever is interested – all in under one minute. It will generate approximately 10 million alerts per night.
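The petabyte figure is consistent with the pixel count above if each raw pixel is stored as a 16-bit value, which is assumed here purely for illustration:

```python
# Why ~200,000 pictures per year come out at roughly 1.28 petabytes.
# Assumption (for illustration): raw pixels are stored as 16-bit values.

PIXELS_PER_IMAGE = 3.2e9     # 3.2 billion pixels per picture
BYTES_PER_PIXEL = 2          # assumed 16-bit raw pixel depth
IMAGES_PER_YEAR = 200_000

bytes_per_image = PIXELS_PER_IMAGE * BYTES_PER_PIXEL     # 6.4e9 bytes
bytes_per_year = bytes_per_image * IMAGES_PER_YEAR

print(f"{bytes_per_image / 1e9:.1f} GB per raw picture")   # -> 6.4 GB
print(f"{bytes_per_year / 1e15:.2f} PB per year")          # -> 1.28 PB
```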
Sharing images as a core feature
Within 24 hours of observation, the images will be available in two forms: raw, straight from the camera, and single-visit images that have been processed and include additional information. Annually, a catalogue of around 20 billion galaxies and 17 billion stars, each with more than 200 attributes, will be released. The finished time lapse "video" will also be published. On top of that, LSST is reserving 10% of its computing power (and disk space) for user-generated data products.
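For a rough sense of what a catalogue on that scale weighs on disk, here is a small sketch; the 8 bytes per attribute is purely an assumption, not the project's actual data model:

```python
# Back-of-the-envelope size of one annual catalogue release.
# Assumption (illustrative only): each attribute is an 8-byte value.

GALAXIES = 20e9
STARS = 17e9
ATTRIBUTES = 200
BYTES_PER_ATTRIBUTE = 8

catalogue_bytes = (GALAXIES + STARS) * ATTRIBUTES * BYTES_PER_ATTRIBUTE
print(f"~{catalogue_bytes / 1e12:.0f} TB per catalogue release")   # tens of terabytes
```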
There are tons of other insane details about this project. For example, the sensor array has to be cooled to around -100°C to work precisely. Moreover, the incoming light passes through one of six interchangeable filters, each covering a different range of wavelengths. For more information, check out Seeker's video: