In-Situ Analysis of Cosmological Simulations

Award Information
Agency: Department of Energy
Branch: N/A
Contract: DE-FG02-12ER90330
Agency Tracking Number: 99227
Amount: $149,931.00
Phase: Phase I
Program: SBIR
Solicitation Topic Code: 26 a
Solicitation Number: DE-FOA-0000577
Timeline
Solicitation Year: 2012
Award Year: 2012
Award Start Date (Proposal Award Date): 2012-02-20
Award End Date (Contract End Date): 2012-11-19
Small Business Information
28 Corporate Drive
Clifton Park, NY
United States
DUNS: 010926207
HUBZone Owned: No
Woman Owned: No
Socially and Economically Disadvantaged: No
Principal Investigator
 Berk Geveci
Title: Dr.
Phone: (518) 371-3971
Email: berk.geveci@kitware.com
Business Contact
 Katherine Osterdahl
Title: Dr.
Phone: (518) 371-3971
Email: comm@kitware.com
Research Institution
 N/A
Abstract

Cosmological simulations play a very important role in the DOE High Energy Physics Cosmic Frontier program. They are used to connect fundamental physics with observation and are especially critical in the effort to understand Dark Energy. Current and upcoming cosmological observations that survey large areas of the sky produce very large datasets; for example, the Large Synoptic Survey Telescope (LSST) will produce up to 30 terabytes of data per night. Simulations carried out to interpret results from surveys of this type already produce terabytes of data per simulation, and this will rise to many petabytes within a decade. To have any hope of realizing, encapsulating, and interpreting the enormous wealth of information contained in such datasets, it is necessary to find very efficient ways to explore and analyze them, with the goal of eventually automating many such tasks.

Critical challenges facing such simulations include workflow I/O and the lack of domain-specific data analysis algorithms. Overcoming these challenges requires a revolutionary shift in the way cosmological predictions are obtained. Instead of the traditional workflow, in which simulation codes run for days and analysis is conducted as a post-processing step, on-line methods are required that enable scientists to analyze the data in tandem with the evolving simulation. To address this, the proposed software infrastructure will provide the following functionality:

1. Data reduction to minimize I/O
2. Robust and efficient halo-extraction methods
3. On-line/forward tracking of halos to capture halo-formation dynamics
4. In-situ and co-visualization capabilities

The development of such an infrastructure will pave the way toward the analysis of the exascale datasets expected within the coming decade. The basic strategy consists of (i) extending and optimizing existing halo-extraction techniques to improve robustness; (ii) developing an on-line halo-tracking method to be used in tandem with the simulation code at time-step resolution, enabling insights unavailable with present approaches; (iii) minimizing the I/O bottleneck by reducing output to only the halos, the halo-formation history, and a sub-sample of the remaining particles; and (iv) leveraging ParaView's in-situ and co-visualization capabilities.

In addition to cosmological applications, the framework to be developed has broader applicability to industries that use particle-based simulation techniques, including astrophysics, ballistics and defense, volcanology, oceanology, solid mechanics modeling, and various maritime applications. By addressing the wider issues associated with large-data simulations, this work will drive innovation in the computational sciences, will be adaptable to many industries, and will facilitate the transition from terascale to peta- and, eventually, exascale computing.
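To make the halo-extraction and data-reduction steps above concrete, the following Python fragment is a minimal sketch, not the awardee's implementation: a basic friends-of-friends (FOF) halo finder, the standard family of halo-extraction algorithms assumed here for illustration, combined with the particle sub-sampling idea from item (iii). It assumes particle positions are available as a NumPy array; all function names and parameters are illustrative.

import numpy as np
from scipy.spatial import cKDTree


def find_halos_fof(positions, linking_length, min_members=10):
    """Group particles into halos with friends-of-friends: any two
    particles closer than the linking length belong to the same group.

    positions      : (N, 3) array of particle coordinates
    linking_length : FOF linking length, in the same units as positions
    min_members    : discard groups smaller than this
    Returns a dict mapping halo id -> array of member particle indices.
    """
    tree = cKDTree(positions)
    pairs = tree.query_pairs(linking_length)  # set of (i, j) index pairs

    # Union-find: merge particles connected by a link into one group.
    parent = np.arange(len(positions))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for a, b in pairs:
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra

    roots = np.array([find(i) for i in range(len(positions))])
    halos = {}
    for halo_id, root in enumerate(np.unique(roots)):
        members = np.flatnonzero(roots == root)
        if len(members) >= min_members:
            halos[halo_id] = members
    return halos


def sub_sample_output(positions, halos, sample_fraction=0.01, seed=0):
    """Keep every halo particle but only a random sub-sample of the
    remaining 'field' particles, mimicking the data-reduction step."""
    rng = np.random.default_rng(seed)
    in_halo = np.zeros(len(positions), dtype=bool)
    for members in halos.values():
        in_halo[members] = True
    field = np.flatnonzero(~in_halo)
    sampled = rng.choice(field, size=int(sample_fraction * len(field)),
                         replace=False)
    keep = np.concatenate([np.flatnonzero(in_halo), sampled])
    return positions[keep]


# Example with random points standing in for one time step's particles:
pts = np.random.default_rng(1).random((5000, 3))
halos = find_halos_fof(pts, linking_length=0.02)
reduced = sub_sample_output(pts, halos, sample_fraction=0.01)

In the proposed in-situ workflow, steps of this kind would run alongside the simulation at each time step, with visualization handled through ParaView's co-processing interface rather than as a post-processing pass over files written to disk. A production implementation would use a distributed-memory halo finder rather than the serial sketch shown here.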

* Information listed above is at the time of submission. *
