| Issue | EPJ Nuclear Sci. Technol., Volume 11, 2025. Special Issue on ‘Overview of recent advances in HPC simulation methods for nuclear applications’, edited by Andrea Zoia, Elie Saikali, Cheikh Diop and Cyrille de Saint Jean |
|---|---|
| Article Number | 61 |
| Number of page(s) | 24 |
| DOI | https://doi.org/10.1051/epjn/2025055 |
| Published online | 01 October 2025 |
Regular Article
Multi-output Gaussian processes for the reconstruction of homogenized cross-sections
1. EDF R&D PERICLES, 7 Boulevard Gaspard Monge, 91120 Palaiseau, France
2. Université Paris-Saclay, CEA, Service d’Études des Réacteurs et de Mathématiques Appliquées, 91190 Gif-sur-Yvette, France
3. EDF DQI, 2 Rue Ampère, 93206 Saint-Denis Cedex, France
Received: 1 June 2025 / Received in final form: 6 August 2025 / Accepted: 7 August 2025 / Published online: 1 October 2025
Deterministic nuclear reactor neutronics codes employing the prevalent two-step scheme often generate a substantial amount of intermediate data at the interface of their two subcodes, which can impede the overall performance of the software. The bulk of this data comprises “few-group homogenized cross-sections”, or HXS, which are stored as tabulated multivariate functions and interpolated inside the core code. A number of mathematical tools have been studied for this interpolation purpose over the years, but few meet all the challenging requirements of neutronics computation chains: extreme accuracy, low memory footprint, and fast predictions. We present here a new framework to tackle this task, based on multi-output Gaussian processes (MOGP). These smooth and tunable Bayesian regressors are able to model several quantities at once and to capture the correlations between them – a key asset in the modeling of HXS’s, which we show to be highly similar to one another. Several models of this family are discussed, compared, and adapted to the case of very numerous HXS’s, and their possible modeling choices are evaluated experimentally. These machine learning models enable us to interpolate HXS’s with improved accuracy compared to the current multilinear standard, using only a fraction of its training data – meaning that the amount of required precomputation is reduced by a factor of several dozen. They also necessitate an even smaller fraction of its storage requirements, preserve its reconstruction speed, and unlock new functionalities such as adaptive sampling and facilitated uncertainty quantification. We demonstrate the efficiency of this approach on a rich test case reproducing the VERA benchmark, proving in particular its scalability to datasets of millions of HXS’s.
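To illustrate the kind of multi-output Gaussian process the abstract refers to, the sketch below implements a generic intrinsic coregionalization model (ICM) in NumPy: two correlated toy outputs share a single RBF input kernel, and a small coregionalization matrix encodes the between-output correlations. This is a minimal didactic example under assumed kernel, noise, and correlation values, not the authors' actual MOGP formulation or their HXS data.

```python
import numpy as np

def rbf(X1, X2, ell=0.3):
    """Squared-exponential (RBF) kernel on 1-D inputs."""
    d = X1[:, None] - X2[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 1.0, 12))       # training inputs (e.g. a burnup-like parameter)
f = np.sin(2 * np.pi * X)                    # shared latent trend
Y = np.stack([f, 0.8 * f + 0.1], axis=1)     # two strongly correlated "HXS-like" outputs
Y += 1e-3 * rng.standard_normal(Y.shape)     # small observation noise

# Assumed positive-semidefinite coregionalization matrix B: off-diagonal
# entries model the correlation between the two outputs.
B = np.array([[1.0, 0.8],
              [0.8, 0.7]])
jitter = 1e-5

# ICM covariance over all observations: Kronecker product of B with the
# input kernel, matching outputs stacked as [y_output1; y_output2].
n = len(X)
K = np.kron(B, rbf(X, X)) + jitter * np.eye(2 * n)
y = Y.T.ravel()

# Joint posterior mean for both outputs at new input points.
Xs = np.linspace(0.0, 1.0, 50)
Ks = np.kron(B, rbf(Xs, X))                  # cross-covariance, shape (100, 24)
mean = Ks @ np.linalg.solve(K, y)
y1_pred, y2_pred = mean[:50], mean[50:]      # reconstructed curves for each output
```

Because the two outputs share the same latent structure through B, observations of one output inform predictions of the other, which is the mechanism the abstract exploits for highly similar cross-sections.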
© O. Truffinet et al., Published by EDP Sciences, 2025
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
