Integrating Simulations and Observations: A Foundation Model for Estimating the Aerosol Mixing State Index
- Fei Jiang, Department of Earth and Environmental Sciences, The University of Manchester, Manchester M13 9PL, U.K.
- Zhonghua Zheng*, Department of Earth and Environmental Sciences, The University of Manchester, Manchester M13 9PL, U.K. E-mail: [email protected]
- Hugh Coe, Department of Earth and Environmental Sciences, The University of Manchester, Manchester M13 9PL, U.K.; National Centre for Atmospheric Science, The University of Manchester, Manchester M13 9PL, U.K.
- Robert M. Healy, Environmental Monitoring and Reporting Branch, Ontario Ministry of the Environment, Conservation and Parks, Toronto M9P 3V6, Canada
- Laurent Poulain, Atmospheric Chemistry Department (ACD), Leibniz Institute for Tropospheric Research (TROPOS), Leipzig 04318, Germany
- Valérie Gros, Laboratoire des Sciences du Climat et de l’Environnement, CNRS-CEA-UVSQ, IPSL, Université Paris-Saclay, Gif-sur-Yvette 91191, France
- Hao Zhang, Department of Earth and Environmental Sciences, The University of Manchester, Manchester M13 9PL, U.K.
- Weijun Li, Department of Atmospheric Sciences, School of Earth Sciences, Zhejiang University, Hangzhou 310027, China
- Dantong Liu, Department of Atmospheric Sciences, School of Earth Sciences, Zhejiang University, Hangzhou 310027, China
- Matthew West, Department of Mechanical Science and Engineering, University of Illinois Urbana-Champaign, Urbana, Illinois 61801, United States
- David Topping*, Department of Earth and Environmental Sciences, The University of Manchester, Manchester M13 9PL, U.K. E-mail: [email protected]
- Nicole Riemer*, Department of Climate, Meteorology and Atmospheric Sciences, University of Illinois Urbana-Champaign, Urbana, Illinois 61801, United States. E-mail: [email protected]
Abstract
Accurately predicting aerosol mixing states in real-world environments is crucial for understanding their impacts on climate change and human health. However, observational data inherently exhibit spatiotemporal gaps, and high costs and equipment requirements further exacerbate these limitations, particularly for in situ measurements. While particle-resolved models can simulate individual particle composition and size changes and serve as benchmarks, they face challenges in real-world applications, chief among them the limited availability of detailed input data (e.g., emission inventories) that accurately reflect actual environmental conditions. In this study, we frame the emulation of aerosol simulations as a general task and treat the estimation of real-world mixing states as a downstream task. We developed a foundation model pretrained on particle-resolved simulations and fine-tuned it using observational data from the MEGAPOLI field campaign. The fine-tuned model consistently outperformed baseline models, showing greater stability and robustness across various data sets. Permutation feature importance and sensitivity analyses revealed that aerosol species concentrations were the most critical inputs for the foundation model. This approach, pretraining on particle-resolved simulations and fine-tuning on limited observational data, offers a viable way to work around the scarcity of observations.
This publication is licensed under a Creative Commons Attribution (CC BY) license: the article may be shared (copied and redistributed) in any medium or format and adapted (remixed, transformed, and built upon) for any purpose, even commercially, provided credit is given to the creator.
Special Issue
Published as part of ACS ES&T Air special issue “Elevating Atmospheric Chemistry Measurements and Modeling with Artificial Intelligence”.
Synopsis
Both observational data and simulations face limitations for real-world aerosol mixing state estimation. This study develops a foundation model that combines simulations and observations, improving estimates of temporal variations in the mixing state, with implications for climate and human health impacts.
Introduction
Materials and Methods
Figure 1
Figure 1. Overview of the workflow for the foundation and fine-tuned model development. (a) Data Preparation for Pretraining and Fine-tuning: (1) Generation of the pretraining data set for the foundation model using PartMC-MOSAIC simulations. (2) Fine-tuning data set derived from the MEGAPOLI campaign observational data, sourced from Healy et al. (6) (b) The Pretrained Foundation Model: (1) Architecture of the foundation model, illustrating the overall structure (left) and the residual block structure (right). The “linear transformation” is a simple mathematical operation in which each input is multiplied by a weight and added to a bias (a constant value), linearly altering the input values. (2) Model development through hyperparameter optimization, where the purple marker highlights the best foundation model selected during validation and the yellow marker represents its performance on the pretraining testing set. (c) Fine-tuning the Pretrained Foundation Model: (1) Fine-tuning process, where observational data are chronologically split into a fine-tuning training set (first 50%) and a fine-tuning testing set (remaining data). The foundation model is pretrained using three different data volumes (20%, 50%, and 90% of the pretraining data) to investigate how the amount of pretraining data affects fine-tuning performance. (2) Effect of data sparsity or abundance, with different data subsets (cases 1–6) used for training and evaluated on the fine-tuning testing set. Fine-tuning targets include the output layer and adjacent layers. In the “Swapped Temporal Order” experiment, the roles of the data sets are reversed: the fine-tuning testing set is redivided (using the same proportional split) to serve as the training set, while the original fine-tuning training set is used for testing. (3) Feature sensitivity analysis, examining the model’s response to different feature subsets, with fine-tuning targeting both input and output layers alongside adjacent layers.
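The residual-block structure described in the caption can be sketched as a stack of linear transformations with skip connections. The following is a minimal PyTorch-style illustration; the hidden size, number of blocks, input dimension, and activation are placeholders drawn from the hyperparameter and feature tables below, not the exact published architecture.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """One residual block: two linear transformations (input times weight plus
    bias) with a nonlinearity and a skip connection. The activation choice is
    an assumption."""
    def __init__(self, hidden_size: int):
        super().__init__()
        self.fc1 = nn.Linear(hidden_size, hidden_size)
        self.fc2 = nn.Linear(hidden_size, hidden_size)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.fc2(self.act(self.fc1(x)))  # skip connection

class FoundationModel(nn.Module):
    """Input projection, a stack of residual blocks, and an output layer that
    predicts the mixing state index. Sizes are placeholders taken from the
    hyperparameter search ranges, not the published configuration."""
    def __init__(self, n_features: int = 18, hidden_size: int = 256, num_blocks: int = 12):
        super().__init__()
        self.input_layer = nn.Linear(n_features, hidden_size)
        self.blocks = nn.Sequential(*[ResidualBlock(hidden_size) for _ in range(num_blocks)])
        self.output_layer = nn.Linear(hidden_size, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.output_layer(self.blocks(self.input_layer(x)))
```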
Aerosol Mixing State Metric Calculations
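The mixing state index χ quantifies how evenly chemical species are distributed across the particle population, following the entropy-based diversity framework of Riemer and West (26): χ = (Dα − 1)/(Dγ − 1), where Dα is the average per-particle species diversity and Dγ is the bulk population diversity. A minimal NumPy sketch, assuming a per-particle species-mass matrix as input:

```python
import numpy as np

def mixing_state_index(masses):
    """Mixing state index chi following Riemer and West (26).

    masses: array of shape (n_particles, n_species) with per-particle species
    masses (assumed strictly positive per-particle totals). Returns
    chi = (D_alpha - 1) / (D_gamma - 1), which is 0 for a fully external
    mixture and 1 for a fully internal mixture.
    """
    masses = np.asarray(masses, dtype=float)
    particle_mass = masses.sum(axis=1)
    total_mass = masses.sum()

    p_i = particle_mass / total_mass                       # particle mass fractions
    f = masses / particle_mass[:, None]                    # per-particle species fractions
    log_f = np.log(f, out=np.zeros_like(f), where=f > 0)
    H_i = -(f * log_f).sum(axis=1)                         # per-particle mixing entropy
    D_alpha = np.exp((p_i * H_i).sum())                    # average particle species diversity

    f_bulk = masses.sum(axis=0) / total_mass               # bulk species fractions
    log_fb = np.log(f_bulk, out=np.zeros_like(f_bulk), where=f_bulk > 0)
    D_gamma = np.exp(-(f_bulk * log_fb).sum())             # bulk population species diversity

    return (D_alpha - 1.0) / (D_gamma - 1.0)

# Example: two particles, five species (BC, SO4, NO3, NH4, OA), arbitrary masses.
chi = mixing_state_index([[1.0, 0.2, 0.1, 0.1, 0.5],
                          [0.1, 0.8, 0.4, 0.2, 1.0]])
```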
Data Preparation for Pretraining and Fine-Tuning
Particle-Resolved Aerosol Model
Pretraining Data Set: PartMC-MOSAIC Simulations
Parameters | Range
---|---
Environmental Variable |
Relative humidity (RH) | [0.4, 1)
Day of year | [1, 365]
Latitude | [90°S, 90°N]
Temperature | Varies with the day of year and the location assumed for each scenario; constant within each scenario
Gas Phase Emissions Scaling Factor |
SO2, NO2, NO, NH3, CO, CH3OH, ALD2 (acetaldehyde), ANOL (ethanol), AONE (acetone), DMS (dimethyl sulfide), ETH (ethene), HCHO (formaldehyde), ISOP (isoprene), OLEI (internal olefin carbons), OLET (terminal olefin carbons), PAR (paraffin carbon), TOL (toluene), XYL (xylene) | [0, 200%] of the reference scenario
Carbonaceous Aerosol Emissions (one mode) |
Dg | [25 nm, 250 nm]
σg | [1.4, 2.5]
BC/OC mass ratio | [0, 100%]
Ea | [0, 1.6 × 10⁷ m⁻² s⁻¹]
Sea Salt Emissions (two modes) |
Dg,1 | [180 nm, 720 nm]
σg,1 | [1.4, 2.5]
Ea,1 | [0, 1.69 × 10⁵ m⁻² s⁻¹]
Dg,2 | [1 μm, 6 μm]
σg,2 | [1.4, 2.5]
Ea,2 | [0, 2380 m⁻² s⁻¹]
OC fraction | [0, 20%]
Dust Emissions (two modes) |
Dg,1 | [80 nm, 320 nm]
σg,1 | [1.4, 2.5]
Ea,1 | [0, 5.86 × 10⁵ m⁻² s⁻¹]
Dg,2 | [1 μm, 6 μm]
σg,2 | [1.4, 2.5]
Ea,2 | [0, 2380 m⁻² s⁻¹]
Restart Timestamp |
Timestamp | [0, 24 h]
The variables Dg, σg, and Ea refer to geometric mean diameter, geometric standard deviation, and number emission flux, respectively.
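As an illustration of how pretraining scenarios could be drawn from the ranges in the table above, the sketch below samples a subset of the parameters uniformly. Uniform sampling, the scenario count, and the dictionary layout are assumptions rather than the paper's exact scheme, and the sea salt and dust modes are omitted for brevity but follow the same pattern.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_scenario(rng):
    """Draw one hypothetical pretraining scenario from the ranges listed above."""
    gas_species = ["SO2", "NO2", "NO", "NH3", "CO", "CH3OH", "ALD2", "ANOL",
                   "AONE", "DMS", "ETH", "HCHO", "ISOP", "OLEI", "OLET",
                   "PAR", "TOL", "XYL"]
    return {
        "relative_humidity": rng.uniform(0.4, 1.0),
        "latitude_deg": rng.uniform(-90.0, 90.0),
        "day_of_year": int(rng.integers(1, 366)),
        "gas_emission_scale": {s: rng.uniform(0.0, 2.0) for s in gas_species},
        "carbonaceous": {"Dg_m": rng.uniform(25e-9, 250e-9),
                         "sigma_g": rng.uniform(1.4, 2.5),
                         "bc_oc_mass_ratio": rng.uniform(0.0, 1.0),
                         "Ea_m2s": rng.uniform(0.0, 1.6e7)},
        "restart_time_h": rng.uniform(0.0, 24.0),
    }

scenarios = [sample_scenario(rng) for _ in range(1000)]  # scenario count is illustrative
```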
Fine-Tuning Data Set: MEGAPOLI Campaign Observations
Data Source | Species 1 | Species 2 | Species 3 | Species 4 | Species 5
---|---|---|---|---|---
PartMC simulation | BC | SO4 | NO3 | NH4 | ARO1, ARO2, ALK1, OLE1, API1, API2, LIM1, LIM2, OC
MEGAPOLI observation | BC | SO4 | NO3 | NH4 | OA
In PartMC, organic aerosols (OA) include aromatic hydrocarbons (ARO), alkanes (ALK), olefins (OLE), and other organic compounds (OC), corresponding to the OA category in the MEGAPOLI data set.
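A small sketch of the species mapping in the table above, summing the PartMC-MOSAIC organic surrogates into the observation-comparable OA category. The helper name and the input format are hypothetical.

```python
# Mapping from PartMC-MOSAIC aerosol species to the five categories used for
# comparison with the MEGAPOLI observations.
PARTMC_TO_OBS = {
    "BC":  ["BC"],
    "SO4": ["SO4"],
    "NO3": ["NO3"],
    "NH4": ["NH4"],
    "OA":  ["ARO1", "ARO2", "ALK1", "OLE1", "API1", "API2", "LIM1", "LIM2", "OC"],
}

def aggregate_species(partmc_masses):
    """Sum PartMC species masses (dict of species name -> mass) into the
    observation-comparable categories."""
    return {category: sum(partmc_masses.get(sp, 0.0) for sp in species)
            for category, species in PARTMC_TO_OBS.items()}
```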
The Pretrained Foundation Model
The Pretrained Foundation Model Architecture
The Pretrained Foundation Model Development
Hyperparameter | Range
---|---
batch_size | [16, 32, 64, 128]
hidden_size | [128, 256, 512, 1024]
learning_rate (a) | 10⁻⁶ to 10⁻⁴
num_blocks (b) | 10 to 20 [step = 1]

(a) The learning rate is sampled in logarithmic space.
(b) The number of blocks is sampled between 10 and 20, inclusive; step = 1 ensures that Optuna considers every integer within this range.
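Since hyperparameter optimization was performed with Optuna (63), the search space in the table above can be expressed roughly as follows. The objective body, the number of trials, and the train_and_validate placeholder are illustrative, not the study's actual training code.

```python
import optuna

def train_and_validate(params):
    """Placeholder for the actual training loop: build the model with
    `params`, train on the pretraining set, and return a validation loss."""
    raise NotImplementedError

def objective(trial):
    params = {
        "batch_size": trial.suggest_categorical("batch_size", [16, 32, 64, 128]),
        "hidden_size": trial.suggest_categorical("hidden_size", [128, 256, 512, 1024]),
        "learning_rate": trial.suggest_float("learning_rate", 1e-6, 1e-4, log=True),  # range from the table above
        "num_blocks": trial.suggest_int("num_blocks", 10, 20, step=1),
    }
    return train_and_validate(params)

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=100)  # number of trials is illustrative
```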
Fine-Tuning the Pretrained Foundation Model
Hyperparameter | Range | Sampling Method
---|---|---
batch_size | 1 | NA
L2 regularization coefficient (λ) (a) | 10⁻⁶ to 10⁻³ | Log-uniform sampling
learning_rate | 10⁻⁷ to 10⁻⁴ | Log-uniform sampling
Number of fine-tuned layers (final layers) (b) | 1 to 5 | Integer sampling
Number of fine-tuned layers (initial layers) (c) | 1 | NA

(a) A small λ allows the model more flexibility but increases the risk of overfitting, while a larger λ improves regularization at the cost of potential underfitting.
(b) The final few layers of the model, usually including the output layer and the preceding layers.
(c) The initial layers of the model, responsible for extracting fundamental features; the input layer is fine-tuned only in the feature sensitivity analysis, to account for changes in the number of input features.
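A minimal sketch of the fine-tuning setup implied by the table above: freeze the pretrained network, unfreeze the last few layers (and optionally the input layer, as in the feature sensitivity analysis), and apply L2 regularization through the optimizer's weight decay. Locating layers by module type here is an assumption about how the network is organized.

```python
import torch
import torch.nn as nn

def prepare_for_finetuning(model, n_final_layers=2, tune_input_layer=False,
                           lr=1e-5, weight_decay=1e-4):
    """Freeze all parameters, unfreeze the last `n_final_layers` linear layers
    (plus the input layer if requested), and return an optimizer whose
    weight_decay acts as the L2 regularization coefficient."""
    for p in model.parameters():
        p.requires_grad = False

    linear_layers = [m for m in model.modules() if isinstance(m, nn.Linear)]
    to_tune = linear_layers[-n_final_layers:]
    if tune_input_layer:
        to_tune.append(linear_layers[0])
    for layer in to_tune:
        for p in layer.parameters():
            p.requires_grad = True

    trainable = (p for p in model.parameters() if p.requires_grad)
    return torch.optim.Adam(trainable, lr=lr, weight_decay=weight_decay)
```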
Impact of Pretraining Data Volume on Fine-Tuning Performance
Effect of Data Sparsity or Abundance
Feature Sensitivity Analysis
Model Evaluation and Interpretation
Results and Discussion
Impact of Pretraining Data Volume on Fine-Tuning Performance
Figure 2
Figure 2. (a) Feature importance of the pretrained foundation model, evaluated using PFI on the fine-tuning testing set. (b) Feature importance of the fine-tuned model (trained with 100% of the fine-tuning training data), evaluated using PFI on the fine-tuning testing set. (c) Performance of the optimal fine-tuned model trained with 100% of the fine-tuning training data, including R2, RMSE, and MAE on the fine-tuning testing set.
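Permutation feature importance (PFI), as used in panels (a) and (b), measures the drop in test-set skill when a single input feature is shuffled (78, 79). A model-agnostic sketch, assuming the fine-tuned network is wrapped in a `predict` callable:

```python
import numpy as np
from sklearn.metrics import r2_score

def permutation_feature_importance(predict, X, y, n_repeats=10, seed=0):
    """Importance of feature j = mean drop in R2 on (X, y) when column j is
    randomly shuffled. `predict` is any callable mapping an
    (n_samples, n_features) array to predictions."""
    rng = np.random.default_rng(seed)
    baseline = r2_score(y, predict(X))
    importances = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        drops = []
        for _ in range(n_repeats):
            X_perm = X.copy()
            rng.shuffle(X_perm[:, j])               # permute one feature in place
            drops.append(baseline - r2_score(y, predict(X_perm)))
        importances[j] = np.mean(drops)
    return importances
```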
Training data | Model Type | R2 / RMSE
---|---|---
Fine-tuning training set | LR (a) (baseline model) | 0.0689 / 0.0635
Fine-tuning training set | AutoML (b) (baseline model) | 0.1915 / 0.0592
20% of pretraining data | Pretrained model (before fine-tuning) | −1.6689 / 0.1076
20% of pretraining data | Pretrained model (after fine-tuning) | 0.2651 / 0.0565
50% of pretraining data | Pretrained model (before fine-tuning) | −0.4882 / 0.0803
50% of pretraining data | Pretrained model (after fine-tuning) | 0.4199 / 0.0502
90% of pretraining data | Pretrained model (before fine-tuning) | −0.2278 / 0.0780
90% of pretraining data | Pretrained model (after fine-tuning) | 0.6373 / 0.0397

(a) LR: linear regression model.
(b) AutoML: the best-performing tree-based model selected by the automated machine learning process, which considers XGBoost, LightGBM, Random Forest (RF), and Extra-Trees.
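The AutoML baseline was selected with FLAML (71) from tree-based learners. A rough usage sketch with placeholder arrays standing in for the fine-tuning training and testing sets; the array shapes and time budget are illustrative only.

```python
import numpy as np
from flaml import AutoML

# Placeholder arrays: 195 training samples and 18 input features, matching the
# 100% training case in the tables; the testing-set size here is arbitrary.
X_train, y_train = np.random.rand(195, 18), np.random.rand(195)
X_test = np.random.rand(195, 18)

automl = AutoML()
automl.fit(X_train=X_train, y_train=y_train,
           task="regression", metric="r2",
           estimator_list=["xgboost", "lgbm", "rf", "extra_tree"],
           time_budget=60)  # seconds; the budget is illustrative
y_pred = automl.predict(X_test)
```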
Effects of Data Sparsity or Abundance on Fine-Tuned Model Performance
Training data group (data size) | Model Type | R2 | RMSE | MAE | MAPE (%)
---|---|---|---|---|---
10% of training data (20) | LR (a) | −118.27 | 0.7192 | 0.6025 | 111.16
10% of training data (20) | AutoML (b) | −1.2036 | 0.0978 | 0.0810 | 15.75
10% of training data (20) | Fine-tuning | 0.1758 | 0.0598 | 0.0412 | 7.63
20% of training data (39) | LR | −1.9922 | 0.1139 | 0.0864 | 15.80
20% of training data (39) | AutoML | 0.2016 | 0.0588 | 0.0476 | 9.04
20% of training data (39) | Fine-tuning | 0.3634 | 0.0525 | 0.0390 | 7.15
40% of training data (78) | LR | 0.2207 | 0.0581 | 0.0465 | 8.43
40% of training data (78) | AutoML | 0.2012 | 0.0589 | 0.0475 | 8.99
40% of training data (78) | Fine-tuning | 0.5133 | 0.0459 | 0.0353 | 6.46
50% of training data (97) | LR | 0.3516 | 0.0530 | 0.0409 | 7.42
50% of training data (97) | AutoML | 0.3318 | 0.0538 | 0.0435 | 8.05
50% of training data (97) | Fine-tuning | 0.5577 | 0.0438 | 0.0334 | 6.15
80% of training data (156) | LR | 0.1022 | 0.0624 | 0.0513 | 9.80
80% of training data (156) | AutoML | 0.1408 | 0.0610 | 0.0501 | 9.44
80% of training data (156) | Fine-tuning | 0.6084 | 0.0412 | 0.0316 | 5.87
100% of training data (195) | LR | 0.0689 | 0.0635 | 0.0527 | 10.14
100% of training data (195) | AutoML | 0.1915 | 0.0592 | 0.0491 | 9.43
100% of training data (195) | Fine-tuning | 0.6373 | 0.0397 | 0.0295 | 5.41

(a) LR: linear regression model.
(b) AutoML: the best-performing tree-based model selected by the automated machine learning process, which considers XGBoost, LightGBM, Random Forest (RF), and Extra-Trees.
Model performance was evaluated on the fine-tuning testing set.
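The data-sparsity cases above can be reproduced conceptually by subsetting the chronologically ordered training half. The sketch below assumes the fractional subsets are taken from the start of the training half; that subsetting rule is an assumption, not a detail stated in the tables.

```python
def chronological_subsets(X, y, fractions=(0.1, 0.2, 0.4, 0.5, 0.8, 1.0)):
    """Chronological 50/50 split of the observational record, then nested
    training subsets of the training half. Assumes X and y are already
    ordered in time."""
    half = len(X) // 2
    X_train_full, y_train_full = X[:half], y[:half]
    X_test, y_test = X[half:], y[half:]
    subsets = {f: (X_train_full[: int(round(f * half))],
                   y_train_full[: int(round(f * half))])
               for f in fractions}
    return subsets, (X_test, y_test)
```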
Figure 3
Figure 3. Comparison of model results obtained using different fractions of the fine-tuning training data. Each case is individually compared against both the reference data and the best-performing model (fine-tuned with 100% of the training set).
Figure 4
Figure 4. Comparison of model results obtained using different feature subset combinations. Each case is individually compared against both the reference data and the best-performing model (fine-tuned with all input features).
Integrated Feature Importance and Sensitivity Analysis for Aerosol Mixing State Prediction Models
Feature set | Input size | R2 | RMSE | MAE | MAPE (%)
---|---|---|---|---|---
All features included | 18 | 0.6373 | 0.0397 | 0.0295 | 5.41
Drop VOC gases | 11 | 0.5902 | 0.0422 | 0.0350 | 6.53
Drop non-VOC gases | 14 | 0.5719 | 0.0431 | 0.0362 | 6.74
Drop all gases | 7 | 0.4686 | 0.0480 | 0.0398 | 7.45
Drop environmental data | 16 | 0.4827 | 0.0474 | 0.0386 | 7.20
Drop aerosol | 13 | −0.3972 | 0.0778 | 0.0614 | 11.92
Only aerosol | 5 | 0.1931 | 0.0592 | 0.0492 | 9.28
Implications
Data Availability
Code and simulation data to reproduce the foundation model and fine-tuned model are available at https://github.com/envdes/code_MEGAPOLI_Foundation_Model. MEGAPOLI observational data will be made available on request.
Supporting Information
The Supporting Information is available free of charge at https://pubs.acs.org/doi/10.1021/acsestair.4c00329.
Detailed descriptions of the PartMC-MOSAIC and MEGAPOLI data sets, the hyperparameter results of both the pretrained model and the fine-tuned model, and the results of the “Swapped Temporal Order” experiment (PDF)
Acknowledgments
This work made use of the facilities of the N8 Centre of Excellence in Computationally Intensive Research (N8 CIR) provided and funded by the N8 research partnership and EPSRC (Grant No. EP/T022167/1). The Centre is co-ordinated by the Universities of Durham, Manchester, and York. The authors acknowledge the assistance given by Research IT and Computational Shared Facility 3 (CSF3) at The University of Manchester. Z.Z. appreciates the support provided by the academic start-up funds from the Department of Earth and Environmental Sciences at The University of Manchester and the Aerosol Science Career Development Grant from The Aerosol Society. We thank Reetahan Mukhopadhyay for his contributions to a related project.
References
This article references 84 other publications.
- 1. Raes, F.; Dingenen, R. V.; Vignati, E.; Wilson, J.; Putaud, J.-P.; Seinfeld, J. H.; Adams, P. Formation and cycling of aerosols in the global troposphere. Atmos. Environ. 2000, 34, 4215–4240. DOI: 10.1016/S1352-2310(00)00239-9
- 2. Pöschl, U. Atmospheric Aerosols: Composition, Transformation, Climate and Health Effects. Angew. Chem., Int. Ed. 2005, 44, 7520–7540. DOI: 10.1002/anie.200501122
- 3. Després, V. R.; Huffman, J. A.; Burrows, S. M.; Hoose, C.; Safatov, A. S.; Buryak, G.; Fröhlich-Nowoisky, J.; Elbert, W.; Andreae, M. O.; Pöschl, U.; Jaenicke, R. Primary biological aerosol particles in the atmosphere: a review. Tellus B: Chemical and Physical Meteorology 2012, 64, 15598. DOI: 10.3402/tellusb.v64i0.15598
- 4. Winkler, P. The growth of atmospheric aerosol particles as a function of the relative humidity─II. An improved concept of mixed nuclei. J. Aerosol Sci. 1973, 4, 373–387. DOI: 10.1016/0021-8502(73)90027-X
- 5. Riemer, N.; Ault, A. P.; West, M.; Craig, R. L.; Curtis, J. H. Aerosol Mixing State: Measurements, Modeling, and Impacts. Reviews of Geophysics 2019, 57, 187–249. DOI: 10.1029/2018RG000615
- 6. Healy, R. M.; Riemer, N.; Wenger, J. C.; Murphy, M.; West, M.; Poulain, L.; Wiedensohler, A.; O’Connor, I. P.; McGillicuddy, E.; Sodeau, J. R.; Evans, G. J. Single particle diversity and mixing state measurements. Atmospheric Chemistry and Physics 2014, 14, 6289–6299. DOI: 10.5194/acp-14-6289-2014
- 7. Bondy, A. L.; Bonanno, D.; Moffet, R. C.; Wang, B.; Laskin, A.; Ault, A. P. The diverse chemical mixing state of aerosol particles in the southeastern United States. Atmospheric Chemistry and Physics 2018, 18, 12595–12612. DOI: 10.5194/acp-18-12595-2018
- 8. Lee, A. K.; Rivellini, L.-H.; Chen, C.-L.; Liu, J.; Price, D. J.; Betha, R.; Russell, L. M.; Zhang, X.; Cappa, C. D. Influences of Primary Emission and Secondary Coating Formation on the Particle Diversity and Mixing State of Black Carbon Particles. Environ. Sci. Technol. 2019, 53, 9429–9438. DOI: 10.1021/acs.est.9b03064
- 9. Yu, C.; Liu, D.; Broda, K.; Joshi, R.; Olfert, J.; Sun, Y.; Fu, P.; Coe, H.; Allan, J. D. Characterising mass-resolved mixing state of black carbon in Beijing using a morphology-independent measurement method. Atmospheric Chemistry and Physics 2020, 20, 3645–3661. DOI: 10.5194/acp-20-3645-2020
- 10. Ye, Q.; Gu, P.; Li, H. Z.; Robinson, E. S.; Lipsky, E.; Kaltsonoudis, C.; Lee, A. K.; Apte, J. S.; Robinson, A. L.; Sullivan, R. C.; Presto, A. A.; Donahue, N. M. Spatial Variability of Sources and Mixing State of Atmospheric Particles in a Metropolitan Area. Environ. Sci. Technol. 2018, 52, 6807–6815. DOI: 10.1021/acs.est.8b01011
- 11. Rissler, J.; Nordin, E. Z.; Eriksson, A. C.; Nilsson, P. T.; Frosch, M.; Sporre, M. K.; Wierzbicka, A.; Svenningsson, B.; Löndahl, J.; Messing, M. E.; Sjogren, S.; Hemmingsen, J. G.; Loft, S.; Pagels, J. H.; Swietlicki, E. Effective Density and Mixing State of Aerosol Particles in a Near-Traffic Urban Environment. Environ. Sci. Technol. 2014, 48, 6300–6308. DOI: 10.1021/es5000353
- 12. Schutgens, N. A. J.; Stier, P. A pathway analysis of global aerosol processes. Atmospheric Chemistry and Physics 2014, 14, 11657–11686. DOI: 10.5194/acp-14-11657-2014
- 13. Jacobson, M. Z. Strong radiative heating due to the mixing state of black carbon in atmospheric aerosols. Nature 2001, 409, 695–697. DOI: 10.1038/35055518
- 14. Chung, S. H.; Seinfeld, J. H. Global distribution and climate forcing of carbonaceous aerosols. Journal of Geophysical Research: Atmospheres 2002, 107, AAC 14-1. DOI: 10.1029/2001JD001397
- 15. Fierce, L.; Bond, T. C.; Bauer, S. E.; Mena, F.; Riemer, N. Black carbon absorption at the global scale is affected by particle-scale diversity in composition. Nature Communications 2016, 7, 12361. DOI: 10.1038/ncomms12361
- 16. Fierce, L.; Riemer, N.; Bond, T. C. Toward Reduced Representation of Mixing State for Simulating Aerosol Effects on Climate. Bulletin of the American Meteorological Society 2017, 98, 971–980. DOI: 10.1175/BAMS-D-16-0028.1
- 17. Liu, D.; et al. Black-carbon absorption enhancement in the atmosphere determined by particle mixing state. Nature Geoscience 2017, 10, 184–188. DOI: 10.1038/ngeo2901
- 18. Yao, Y.; Curtis, J. H.; Ching, J.; Zheng, Z.; Riemer, N. Quantifying the effects of mixing state on aerosol optical properties. Atmospheric Chemistry and Physics 2022, 22, 9265–9282. DOI: 10.5194/acp-22-9265-2022
- 19. Wang, J.; Cubison, M. J.; Aiken, A. C.; Jimenez, J. L.; Collins, D. R. The importance of aerosol mixing state and size-resolved composition on CCN concentration and the variation of the importance with atmospheric aging of aerosols. Atmospheric Chemistry and Physics 2010, 10, 7267–7283. DOI: 10.5194/acp-10-7267-2010
- 20. Ching, J.; Riemer, N.; West, M. Black carbon mixing state impacts on cloud microphysical properties: Effects of aerosol plume and environmental conditions. Journal of Geophysical Research: Atmospheres 2016, 121, 5990–6013. DOI: 10.1002/2016JD024851
- 21. Ching, J.; Fast, J.; West, M.; Riemer, N. Metrics to quantify the importance of mixing state for CCN activity. Atmospheric Chemistry and Physics 2017, 17, 7445–7458. DOI: 10.5194/acp-17-7445-2017
- 22. Shen, W.; Wang, M.; Riemer, N.; Zheng, Z.; Liu, Y.; Dong, X. Improving BC Mixing State and CCN Activity Representation With Machine Learning in the Community Atmosphere Model Version 6 (CAM6). Journal of Advances in Modeling Earth Systems 2024, 16, e2023MS003889. DOI: 10.1029/2023MS003889
- 23. Ching, J.; Riemer, N.; West, M. Impacts of black carbon mixing state on black carbon nucleation scavenging: Insights from a particle-resolved model. Journal of Geophysical Research: Atmospheres 2012, 117, D23209. DOI: 10.1029/2012JD018269
- 24. Ching, J.; Kajino, M. Aerosol mixing state matters for particles deposition in human respiratory system. Sci. Rep. 2018, 8, 8864. DOI: 10.1038/s41598-018-27156-z
- 25. Ching, J.; Kajino, M.; Matsui, H. Resolving aerosol mixing state increases accuracy of black carbon respiratory deposition estimates. One Earth 2020, 3, 763–776. DOI: 10.1016/j.oneear.2020.11.004
- 26. Riemer, N.; West, M. Quantifying aerosol mixing state with entropy and diversity measures. Atmospheric Chemistry and Physics 2013, 13, 11423–11439. DOI: 10.5194/acp-13-11423-2013
- 27. O’Brien, R. E.; Wang, B.; Laskin, A.; Riemer, N.; West, M.; Zhang, Q.; Sun, Y.; Yu, X.; Alpert, P.; Knopf, D. A.; Gilles, M. K.; Moffet, R. C. Chemical imaging of ambient aerosol particles: Observational constraints on mixing state parameterization. Journal of Geophysical Research: Atmospheres 2015, 120, 9591–9605. DOI: 10.1002/2015JD023480
- 28. Riemer, N.; West, M.; Zaveri, R. A.; Easter, R. C. Simulating the evolution of soot mixing state with a particle-resolved aerosol model. Journal of Geophysical Research: Atmospheres 2009, 114, D09202. DOI: 10.1029/2008JD011073
- 29. Zhu, S.; Sartelet, K. N.; Healy, R. M.; Wenger, J. C. Simulation of particle diversity and mixing state over Greater Paris: a model–measurement inter-comparison. Faraday Discuss. 2016, 189, 547–566. DOI: 10.1039/C5FD00175G
- 30. Zheng, Z.; Curtis, J. H.; Yao, Y.; Gasparik, J. T.; Anantharaj, V. G.; Zhao, L.; West, M.; Riemer, N. Estimating Submicron Aerosol Mixing State at the Global Scale With Machine Learning and Earth System Modeling. Earth and Space Science 2021, 8, e2020EA001500. DOI: 10.1029/2020EA001500
- 31. Bommasani, R.; Hudson, D. A.; Adeli, E. On the Opportunities and Risks of Foundation Models. arXiv preprint, 2022-07-12. DOI: 10.48550/arXiv.2108.07258 (accessed 2025-02-09)
- 32. Olson, M. H.; Hergenhahn, B. R. An Introduction to Theories of Learning, 9th ed.; Psychology Press: New York, 2016; OCLC 914472558.
- 33. Chen, M.; Tworek, J.; Jun, H. Evaluating Large Language Models Trained on Code. arXiv preprint, 2021-07-14. DOI: 10.48550/arXiv.2107.03374 (accessed 2025-02-09)
- 34. Neelakantan, A.; Xu, T.; Puri, R. Text and Code Embeddings by Contrastive Pre-Training. arXiv preprint, 2022-01-24. DOI: 10.48550/arXiv.2201.10005 (accessed 2025-02-09)
- 35. Zhou, C.; Li, Q.; Li, C. A Comprehensive Survey on Pretrained Foundation Models: A History from BERT to ChatGPT. arXiv preprint, 2023-05-01. DOI: 10.48550/arXiv.2302.09419 (accessed 2025-02-09)
- 36. Chen, S.; Long, G.; Jiang, J. Foundation Models for Weather and Climate Data Understanding: A Comprehensive Survey. arXiv preprint, 2023-12-05. DOI: 10.48550/arXiv.2312.03014 (accessed 2025-02-09)
- 37. Nguyen, T.; Brandstetter, J.; Kapoor, A. ClimaX: A foundation model for weather and climate. arXiv preprint, 2023-12-18. DOI: 10.48550/arXiv.2301.10343 (accessed 2025-02-09)
- 38. Chen, K.; Han, T.; Gong, J. FengWu: Pushing the Skillful Global Medium-range Weather Forecast beyond 10 Days Lead. arXiv preprint, 2023-04-06. DOI: 10.48550/arXiv.2304.02948 (accessed 2025-02-09)
- 39. Bi, K.; Xie, L.; Zhang, H.; Chen, X.; Gu, X.; Tian, Q. Accurate medium-range global weather forecasting with 3D neural networks. Nature 2023, 619, 533–538. DOI: 10.1038/s41586-023-06185-3
- 40. Chen, L.; Zhong, X.; Zhang, F.; Cheng, Y.; Xu, Y.; Qi, Y.; Li, H. FuXi: A cascade machine learning forecasting system for 15-day global weather forecast. npj Climate and Atmospheric Science 2023, 6, 190. DOI: 10.1038/s41612-023-00512-1
- 41. Lam, R.; Sanchez-Gonzalez, A.; Willson, M. GraphCast: Learning skillful medium-range global weather forecasting. arXiv preprint, 2023-08-04. DOI: 10.48550/arXiv.2212.12794 (accessed 2025-02-09)
- 42. Pathak, J.; Subramanian, S.; Harrington, P. FourCastNet: A Global Data-driven High-resolution Weather Model using Adaptive Fourier Neural Operators. arXiv preprint, 2022-02-22. DOI: 10.48550/arXiv.2202.11214 (accessed 2025-02-09)
- 43. Man, X.; Zhang, C.; Feng, J. W-MAE: Pre-trained weather model with masked autoencoder for multi-variable weather forecasting. arXiv preprint, 2023-12-15. DOI: 10.48550/arXiv.2304.08754 (accessed 2025-02-09)
- 44. Webersinke, N.; Kraus, M.; Bingler, J. A. ClimateBert: A Pretrained Language Model for Climate-Related Text. arXiv preprint, 2022-12-17. DOI: 10.48550/arXiv.2110.12010 (accessed 2025-02-09)
- 45. Yuan, Y.; Lin, L. Self-Supervised Pretraining of Transformers for Satellite Image Time Series Classification. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 2021, 14, 474–487. DOI: 10.1109/JSTARS.2020.3036602
- 46. Cong, Y.; Khanna, S.; Meng, C. SatMAE: Pre-training Transformers for Temporal and Multi-Spectral Satellite Imagery. arXiv preprint, 2023-01-15. DOI: 10.48550/arXiv.2207.08051 (accessed 2025-02-09)
- 47. Reed, C. J.; Gupta, R.; Li, S. Scale-MAE: A Scale-Aware Masked Autoencoder for Multiscale Geospatial Representation Learning. arXiv preprint, 2023-09-22. DOI: 10.48550/arXiv.2212.14532 (accessed 2025-02-09)
- 48. Zaveri, R. A.; Easter, R. C.; Fast, J. D.; Peters, L. K. Model for Simulating Aerosol Interactions and Chemistry (MOSAIC). Journal of Geophysical Research: Atmospheres 2008, 113, D13204. DOI: 10.1029/2007JD008782
- 49. DeVille, R. E. L.; Riemer, N.; West, M. Weighted Flow Algorithms (WFA) for stochastic particle coagulation. Journal of Computational Physics 2011, 230, 8427–8451. DOI: 10.1016/j.jcp.2011.07.027
- 50. DeVille, L.; Riemer, N.; West, M. Convergence of a generalized Weighted Flow Algorithm for stochastic particle coagulation. Journal of Computational Dynamics 2019, 6, 69. DOI: 10.3934/jcd.2019003
- 51. Zaveri, R. A.; Peters, L. K. A new lumped structure photochemical mechanism for large-scale applications. Journal of Geophysical Research: Atmospheres 1999, 104, 30387–30415. DOI: 10.1029/1999JD900876
- 52. Zaveri, R. A.; Easter, R. C.; Peters, L. K. A computationally efficient Multicomponent Equilibrium Solver for Aerosols (MESA). Journal of Geophysical Research: Atmospheres 2005, 110, D24203. DOI: 10.1029/2004JD005618
- 53. Schell, B.; Ackermann, I. J.; Hass, H.; Binkowski, F. S.; Ebel, A. Modeling the formation of secondary organic aerosol within a comprehensive air quality model system. Journal of Geophysical Research: Atmospheres 2001, 106, 28275–28293. DOI: 10.1029/2001JD000384
- 54. Zheng, Z.; West, M.; Zhao, L.; Ma, P.-L.; Liu, X.; Riemer, N. Quantifying the structural uncertainty of the aerosol mixing state representation in a modal model. Atmospheric Chemistry and Physics 2021, 21, 17727–17741. DOI: 10.5194/acp-21-17727-2021
- 55. Levakov, G.; Rosenthal, G.; Shelef, I.; Raviv, T. R.; Avidan, G. From a deep learning model back to the brain─Identifying regional predictors and their relation to aging. Human Brain Mapping 2020, 41, 3235–3252. DOI: 10.1002/hbm.25011
- 56. Zhang, F.; Li, Z.; Zhang, B.; Du, H.; Wang, B.; Zhang, X. Multi-modal deep learning model for auxiliary diagnosis of Alzheimer’s disease. Neurocomputing 2019, 361, 185–195. DOI: 10.1016/j.neucom.2019.04.093
- 57. Xu, H.; Lin, J.; Zhang, D.; Mo, F. Retention time prediction for chromatographic enantioseparation by quantile geometry-enhanced graph neural network. Nat. Commun. 2023, 14, 3095. DOI: 10.1038/s41467-023-38853-3
- 58. Wan, J.; Jiang, J.-W.; Park, H. S. Machine learning-based design of porous graphene with low thermal conductivity. Carbon 2020, 157, 262–269. DOI: 10.1016/j.carbon.2019.10.037
- 59. Healy, R. M.; Sciare, J.; Poulain, L.; Kamili, K.; Merkel, M.; Müller, T.; Wiedensohler, A.; Eckhardt, S.; Stohl, A.; Sarda-Estève, R. Sources and mixing state of size-resolved elemental carbon particles in a European megacity: Paris. Atmospheric Chemistry and Physics 2012, 12, 1681–1700. DOI: 10.5194/acp-12-1681-2012
- 60. Healy, R. M.; Sciare, J.; Poulain, L.; Crippa, M.; Wiedensohler, A.; Prévôt, A. S. H.; Baltensperger, U.; Sarda-Estève, R.; McGuire, M. L.; Jeong, C.-H.; McGillicuddy, E.; O’Connor, I. P.; Sodeau, J. R.; Evans, G. J.; Wenger, J. C. Quantitative determination of carbonaceous particle mixing state in Paris using single-particle mass spectrometer and aerosol mass spectrometer measurements. Atmospheric Chemistry and Physics 2013, 13, 9479–9496. DOI: 10.5194/acp-13-9479-2013
- 61. He, K.; Zhang, X.; Ren, S. Deep Residual Learning for Image Recognition. arXiv preprint, 2015-12-10. DOI: 10.48550/arXiv.1512.03385 (accessed 2025-02-09)
- 62. He, K.; Zhang, X.; Ren, S. Identity Mappings in Deep Residual Networks. arXiv preprint, 2016-07-25. DOI: 10.48550/arXiv.1603.05027 (accessed 2025-02-09)
- 63. Akiba, T.; Sano, S.; Yanase, T. Optuna: A Next-generation Hyperparameter Optimization Framework. arXiv preprint, 2019-07-25. DOI: 10.48550/arXiv.1907.10902 (accessed 2025-02-09)
- 64. Weiss, K.; Khoshgoftaar, T. M.; Wang, D. A survey of transfer learning. Journal of Big Data 2016, 3, 9. DOI: 10.1186/s40537-016-0043-6
- 65. Guo, Y.; Shi, H.; Kumar, A. SpotTune: Transfer Learning Through Adaptive Fine-Tuning. arXiv preprint, 2018-11-21. DOI: 10.48550/arXiv.1811.08737 (accessed 2025-02-09)
- 66. Girshick, R. B.; Donahue, J.; Darrell, T. Rich Feature Hierarchies for Accurate Object Detection and Semantic Segmentation. arXiv preprint, 2014-10-22. DOI: 10.48550/arXiv.1311.2524 (accessed 2025-02-09)
- 67. Long, M.; Cao, Y.; Wang, J. Learning Transferable Features with Deep Adaptation Networks. arXiv preprint, 2015-05-27. DOI: 10.48550/arXiv.1502.02791 (accessed 2025-02-09)
- 68. Razavian, A. S.; Azizpour, H.; Sullivan, J. CNN Features off-the-shelf: an Astounding Baseline for Recognition. arXiv preprint, 2014-05-12. DOI: 10.48550/arXiv.1403.6382 (accessed 2025-02-09)
- 69. Cortes, C.; Mohri, M.; Rostamizadeh, A. L2 Regularization for Learning Kernels. arXiv preprint, 2012-05-09. DOI: 10.48550/arXiv.1205.2653 (accessed 2025-02-09)
- 70. van Laarhoven, T. L2 Regularization versus Batch and Weight Normalization. arXiv preprint, 2017-06-16. DOI: 10.48550/arXiv.1706.05350 (accessed 2025-02-09)
- 71. Wang, C.; Wu, Q.; Weimer, M.; Zhu, E. FLAML: A Fast and Lightweight AutoML Library. Proceedings of Machine Learning and Systems (MLSys), 2021.
- 72. Zhang, C.; Hu, Q.; Su, W.; Xing, C.; Liu, C. Satellite spectroscopy reveals the atmospheric consequences of the 2022 Russia–Ukraine war. Science of The Total Environment 2023, 869, 161759. DOI: 10.1016/j.scitotenv.2023.161759
- 73. Zheng, Z.; Fiore, A. M.; Westervelt, D. M.; Milly, G. P.; Goldsmith, J.; Karambelas, A.; Curci, G.; Randles, C. A.; Paiva, A. R.; Wang, C.; Wu, Q.; Dey, S. Automated Machine Learning to Evaluate the Information Content of Tropospheric Trace Gas Columns for Fine Particle Estimates Over India: A Modeling Testbed. Journal of Advances in Modeling Earth Systems 2023, 15, e2022MS003099. DOI: 10.1029/2022MS003099
- 74. Ke, G.; Meng, Q.; Finley, T.; Wang, T.; Chen, W.; Ma, W.; Ye, Q.; Liu, T. LightGBM: A Highly Efficient Gradient Boosting Decision Tree. In Advances in Neural Information Processing Systems; Curran Associates, Inc., 2017.
- 75. Chen, T.; Guestrin, C. XGBoost: A Scalable Tree Boosting System. arXiv preprint, 2016-06-10. DOI: 10.48550/arXiv.1603.02754 (accessed 2025-02-09)
- 76. Breiman, L. Random Forests. Machine Learning 2001, 45, 5–32. DOI: 10.1023/A:1010933404324
- 77. Geurts, P.; Ernst, D.; Wehenkel, L. Extremely randomized trees. Machine Learning 2006, 63, 3–42. DOI: 10.1007/s10994-006-6226-1
- 78. Altmann, A.; Toloşi, L.; Sander, O.; Lengauer, T. Permutation importance: a corrected feature importance measure. Bioinformatics 2010, 26, 1340–1347. DOI: 10.1093/bioinformatics/btq134
- 79. Fisher, A.; Rudin, C.; Dominici, F. All models are wrong, but many are useful: Learning a variable’s importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 2019, 20, 1–81.
- 80. Hallquist, M.; et al. The formation, properties and impact of secondary organic aerosol: current and emerging issues. Atmospheric Chemistry and Physics 2009, 9, 5155–5236. DOI: 10.5194/acp-9-5155-2009
- 81. Jimenez, J. L.; et al. Evolution of Organic Aerosols in the Atmosphere. Science 2009, 326, 1525–1529. DOI: 10.1126/science.1180353
- 82. Xu, W.; Han, T.; Du, W.; Wang, Q.; Chen, C.; Zhao, J.; Zhang, Y.; Li, J.; Fu, P.; Wang, Z.; Worsnop, D. R.; Sun, Y. Effects of Aqueous-Phase and Photochemical Processing on Secondary Organic Aerosol Formation and Evolution in Beijing, China. Environ. Sci. Technol. 2017, 51, 762–770. DOI: 10.1021/acs.est.6b04498
- 83. Vaswani, A.; Shazeer, N.; Parmar, N. Attention is All you Need. arXiv preprint, 2017-12-06. DOI: 10.48550/arXiv.1706.03762 (accessed 2025-02-09)
- 84. Lam, R.; et al. Learning skillful medium-range global weather forecasting. Science 2023, 382, 1416–1421. DOI: 10.1126/science.adi2336
Recommended Articles
Abstract
Figure 1
Figure 1. Overview of the workflow for the foundation and fine-tuned model development. (a) Data Preparation for Pretraining and Fine-tuning: (1) Generation of the pretraining data set for the foundation model using the PartMC-MOSAIC simulation. (2) Fine-tuning data set derived from the MEGAPOLI campaign observational data, source from Healy et al. (6) (b) The Pretrained Foundation Model: (1) Architecture of the foundation model, illustrating the overall structure (left) and the residual block structure (right). The “linear transformation” refers to a simple mathematical operation where each input is multiplied by a weight and added to a bias (a constant value), linearly altering the input values. (2) Model development through hyperparameter optimization, where the purple marker highlights the best foundation model selected during validation, and the yellow marker represents its performance on the pretraining testing set. (c) Fine-tuning the Pretrained Foundation Model: (1) Fine-tuning process, where observational data is chronologically split into a fine-tuning training set (first 50%) and a fine-tuning testing set (remaining data). The foundation model is pretrained using three different data volumes (20%, 50%, and 90% of the pretraining data) to investigate how the amount of pretraining data affects fine-tuning performance. (2) Effect of data sparsity or abundance, with different data subsets (case 1- 6) used for training and evaluated on the fine-tuning testing set. Fine-tuning targets include the output layer and adjacent layers. In the “Swapped Temporal Order” experiment, the roles of the data sets are reversed: the fine-tuning testing set is redivided (using the same proportional split) to serve as the training set, while the original fine-tuning training set is used for testing. (3) Feature sensitivity analysis, examining the model’s response to different feature subsets, with fine-tuning targeting both input and output layers alongside adjacent layers.
Figure 2
Figure 2. (a) Feature importance of the pretrained foundation model, evaluated using PFI on the fine-tuning test set. (b) The feature importance of the fine-tuned model (trained by 100% of fine-tuning training data) was evaluated using PFI on the fine-tuning test set. (c) Performance of the optimal fine-tuned model trained by 100% of fine-tuning training data, including R2, RMSE and MAE on the fine-tuning testing set.
Figure 3
Figure 3. Comparison of model results obtained using different fractions of the fine-tuning training data. Each case is individually compared against both the reference data and the best-performing model (fine-tuned with 100% of the training set).
Figure 4
Figure 4. Comparison of model results obtained using different feature subset combinations. Each case is individually compared against both the reference data and the best-performing model (fine-tuned with all input features).
References
This article references 84 other publications.
- 1Raes, F.; Dingenen, R. V.; Vignati, E.; Wilson, J.; Putaud, J.-P.; Seinfeld, J. H.; Adams, P. Formation and cycling of aerosols in the global troposphere. Atmos. Environ. 2000, 34, 4215– 4240, DOI: 10.1016/S1352-2310(00)00239-9There is no corresponding record for this reference.
- 2Pöschl, U. Atmospheric Aerosols: Composition, Transformation, Climate and Health Effects. Angew. Chem., Int. Ed. 2005, 44, 7520– 7540, DOI: 10.1002/anie.200501122There is no corresponding record for this reference.
- 3Després, V. R.; Huffman, J. A.; Burrows, S. M.; Hoose, C.; Safatov, A. S.; Buryak, G.; Fröhlich-Nowoisky, J.; Elbert, W.; Andreae, M. O.; Pöschl, U.; Jaenicke, R. Primary biological aerosol particles in the atmosphere: a review. Tellus B: Chemical and Physical Meteorology 2022, 64, 15598, DOI: 10.3402/tellusb.v64i0.15598There is no corresponding record for this reference.
- 4Winkler, P. The growth of atmospheric aerosol particles as a function of the relative humidity─II. An improved concept of mixed nuclei. J. Aerosol Sci. 1973, 4, 373– 387, DOI: 10.1016/0021-8502(73)90027-XThere is no corresponding record for this reference.
- 5Riemer, N.; Ault, A. P.; West, M.; Craig, R. L.; Curtis, J. H. Aerosol Mixing State: Measurements, Modeling, and Impacts. Reviews of Geophysics 2019, 57, 187– 249, DOI: 10.1029/2018RG000615There is no corresponding record for this reference.
- 6Healy, R. M.; Riemer, N.; Wenger, J. C.; Murphy, M.; West, M.; Poulain, L.; Wiedensohler, A.; O’Connor, I. P.; McGillicuddy, E.; Sodeau, J. R.; Evans, G. J. Single particle diversity and mixing state measurements. Atmospheric Chemistry and Physics 2014, 14, 6289– 6299, DOI: 10.5194/acp-14-6289-2014There is no corresponding record for this reference.
- 7Bondy, A. L.; Bonanno, D.; Moffet, R. C.; Wang, B.; Laskin, A.; Ault, A. P. The diverse chemical mixing state of aerosol particles in the southeastern United States. Atmospheric Chemistry and Physics 2018, 18, 12595– 12612, DOI: 10.5194/acp-18-12595-2018There is no corresponding record for this reference.
- 8Lee, A. K.; Rivellini, L.-H.; Chen, C.-L.; Liu, J.; Price, D. J.; Betha, R.; Russell, L. M.; Zhang, X.; Cappa, C. D. Influences of Primary Emission and Secondary Coating Formation on the Particle Diversity and Mixing State of Black Carbon Particles. Environ. Sci. Technol. 2019, 53, 9429– 9438, DOI: 10.1021/acs.est.9b03064There is no corresponding record for this reference.
- 9Yu, C.; Liu, D.; Broda, K.; Joshi, R.; Olfert, J.; Sun, Y.; Fu, P.; Coe, H.; Allan, J. D. Characterising mass-resolved mixing state of black carbon in Beijing using a morphology-independent measurement method. Atmospheric Chemistry and Physics 2020, 20, 3645– 3661, DOI: 10.5194/acp-20-3645-2020There is no corresponding record for this reference.
- 10Ye, Q.; Gu, P.; Li, H. Z.; Robinson, E. S.; Lipsky, E.; Kaltsonoudis, C.; Lee, A. K.; Apte, J. S.; Robinson, A. L.; Sullivan, R. C.; Presto, A. A.; Donahue, N. M. Spatial Variability of Sources and Mixing State of Atmospheric Particles in a Metropolitan Area. Environ. Sci. Technol. 2018, 52, 6807– 6815, DOI: 10.1021/acs.est.8b01011There is no corresponding record for this reference.
- 11Rissler, J.; Nordin, E. Z.; Eriksson, A. C.; Nilsson, P. T.; Frosch, M.; Sporre, M. K.; Wierzbicka, A.; Svenningsson, B.; Löndahl, J.; Messing, M. E.; Sjogren, S.; Hemmingsen, J. G.; Loft, S.; Pagels, J. H.; Swietlicki, E. Effective Density and Mixing State of Aerosol Particles in a Near-Traffic Urban Environment. Environ. Sci. Technol. 2014, 48, 6300– 6308, DOI: 10.1021/es5000353There is no corresponding record for this reference.
- 12Schutgens, N. A. J.; Stier, P. A pathway analysis of global aerosol processes. Atmospheric Chemistry and Physics 2014, 14, 11657– 11686, DOI: 10.5194/acp-14-11657-2014There is no corresponding record for this reference.
- 13Jacobson, M. Z. Strong radiative heating due to the mixing state of black carbon in atmospheric aerosols. Nature 2001, 409, 695– 697, DOI: 10.1038/35055518There is no corresponding record for this reference.
- 14Chung, S. H.; Seinfeld, J. H. Global distribution and climate forcing of carbonaceous aerosols. Journal of Geophysical Research: Atmospheres 2002, 107, AAC 14-1, DOI: 10.1029/2001JD001397There is no corresponding record for this reference.
- 15Fierce, L.; Bond, T. C.; Bauer, S. E.; Mena, F.; Riemer, N. Black carbon absorption at the global scale is affected by particle-scale diversity in composition. Nature Communication 2016, 7, 12361, DOI: 10.1038/ncomms12361There is no corresponding record for this reference.
- 16Fierce, L.; Riemer, N.; Bond, T. C. Toward Reduced Representation of Mixing State for Simulating Aerosol Effects on Climate. Bulletin of the American Meteorological Society 2017, 98, 971– 980, DOI: 10.1175/BAMS-D-16-0028.1There is no corresponding record for this reference.
- 17Liu, D. Black-carbon absorption enhancement in the atmosphere determined by particle mixing state. Nature Geoscience 2017, 10, 184– 188, DOI: 10.1038/ngeo2901There is no corresponding record for this reference.
- 18Yao, Y.; Curtis, J. H.; Ching, J.; Zheng, Z.; Riemer, N. Quantifying the effects of mixing state on aerosol optical properties. Atmospheric Chemistry and Physics 2022, 22, 9265– 9282, DOI: 10.5194/acp-22-9265-2022There is no corresponding record for this reference.
- 19Wang, J.; Cubison, M. J.; Aiken, A. C.; Jimenez, J. L.; Collins, D. R. The importance of aerosol mixing state and size-resolved composition on CCN concentration and the variation of the importance with atmospheric aging of aerosols. Atmospheric Chemistry and Physics 2010, 10, 7267– 7283, DOI: 10.5194/acp-10-7267-2010There is no corresponding record for this reference.
- 20Ching, J.; Riemer, N.; West, M. Black carbon mixing state impacts on cloud microphysical properties: Effects of aerosol plume and environmental conditions. Journal of Geophysical Research: Atmospheres 2016, 121, 5990– 6013, DOI: 10.1002/2016JD024851There is no corresponding record for this reference.
- 21Ching, J.; Fast, J.; West, M.; Riemer, N. Metrics to quantify the importance of mixing state for CCN activity. Atmospheric Chemistry and Physics 2017, 17, 7445– 7458, DOI: 10.5194/acp-17-7445-2017There is no corresponding record for this reference.
- 22Shen, W.; Wang, M.; Riemer, N.; Zheng, Z.; Liu, Y.; Dong, X. Improving BC Mixing State and CCN Activity Representation With Machine Learning in the Community Atmosphere Model Version 6 (CAM6). Journal of Advances in Modeling Earth Systems 2024, 16, e2023MS003889 DOI: 10.1029/2023MS003889There is no corresponding record for this reference.
- 23Ching, J.; Riemer, N.; West, M. Impacts of black carbon mixing state on black carbon nucleation scavenging: Insights from a particle-resolved model. Journal of Geophysical Research: Atmospheres 2012, 117, D23209, DOI: 10.1029/2012JD018269There is no corresponding record for this reference.
- 24Ching, J.; Kajino, M. Aerosol mixing state matters for particles deposition in human respiratory system. Sci. Rep. 2018, 8, 8864, DOI: 10.1038/s41598-018-27156-zThere is no corresponding record for this reference.
- 25Ching, J.; Kajino, M.; Matsui, H. Resolving aerosol mixing state increases accuracy of black carbon respiratory deposition estimates. One Earth 2020, 3, 763– 776, DOI: 10.1016/j.oneear.2020.11.004There is no corresponding record for this reference.
- 26Riemer, N.; West, M. Quantifying aerosol mixing state with entropy and diversity measures. Atmospheric Chemistry and Physics 2013, 13, 11423– 11439, DOI: 10.5194/acp-13-11423-2013There is no corresponding record for this reference.
- 27O’Brien, R. E.; Wang, B.; Laskin, A.; Riemer, N.; West, M.; Zhang, Q.; Sun, Y.; Yu, X.; Alpert, P.; Knopf, D. A.; Gilles, M. K.; Moffet, R. C. Chemical imaging of ambient aerosol particles: Observational constraints on mixing state parameterization. Journal of Geophysical Research: Atmospheres 2015, 120, 9591– 9605, DOI: 10.1002/2015JD023480There is no corresponding record for this reference.
- 28Riemer, N.; West, M.; Zaveri, R. A.; Easter, R. C. Simulating the evolution of soot mixing state with a particle-resolved aerosol model. Journal of Geophysical Research: Atmospheres 2009, 114, D09202, DOI: 10.1029/2008JD011073There is no corresponding record for this reference.
- 29Zhu, S.; Sartelet, K. N.; Healy, R. M.; Wenger, J. C. Simulation of particle diversity and mixing state over Greater Paris: a model–measurement inter-comparison. Faraday Discuss. 2016, 189, 547– 566, DOI: 10.1039/C5FD00175GThere is no corresponding record for this reference.
- 30Zheng, Z.; Curtis, J. H.; Yao, Y.; Gasparik, J. T.; Anantharaj, V. G.; Zhao, L.; West, M.; Riemer, N. Estimating Submicron Aerosol Mixing State at the Global Scale With Machine Learning and Earth System Modeling. Earth and Space Science 2021, 8, 9479– 9496, DOI: 10.1029/2020EA001500There is no corresponding record for this reference.
- 31Bommasani, R.; Hudson, D. A.; Adeli, E. On the Opportunities and Risks of Foundation Models. ArXiv preprint 2022–07–12, DOI: 10.48550/arXiv.2108.07258There is no corresponding record for this reference.
(Accessed: 2025–02–09)
- 32Olson, M. H.; Hergenhahn, B. R. An introduction to theories of learning, ninth ed. ed.; Psychology Press: New York, 2016; OCLC: 914472558.There is no corresponding record for this reference.
- 33Chen, M.; Tworek, J.; Jun, H. Evaluating Large Language Models Trained on Code. ArXiv preprint 2021–07–14, DOI: 10.48550/arXiv.2107.03374There is no corresponding record for this reference.
(Accessed: 2025–02–09)
- 34Neelakantan, A.; Xu, T.; Puri, R. Text and Code Embeddings by Contrastive Pre-Training. ArXiv preprint 2022–01–24, DOI: 10.48550/arXiv.2201.10005There is no corresponding record for this reference.
(Accessed: 2025–02–09)
- 35Zhou, C.; Li, Q.; Li, C. A Comprehensive Survey on Pretrained Foundation Models: A History from BERT to ChatGPT. ArXiv preprint 2023–05–01, DOI: 10.48550/arXiv.2302.09419There is no corresponding record for this reference.
(Accessed: 2025–02–09)
- 36Chen, S.; Long, G.; Jiang, J. Foundation Models for Weather and Climate Data Understanding: A Comprehensive Survey. ArXiv preprint 2023–12–05, DOI: 10.48550/arXiv.2312.03014There is no corresponding record for this reference.
(Accessed: 2025–02–09)
- 37Nguyen, T.; Brandstetter, J.; Kapoor, A. ClimaX: A foundation model for weather and climate. ArXiv preprint 2023–12–18, DOI: 10.48550/arXiv.2301.10343There is no corresponding record for this reference.
(Accessed: 2025–02–09)
- 38Chen, K.; Han, T.; Gong, J. FengWu: Pushing the Skillful Global Medium-range Weather Forecast beyond 10 Days Lead. ArXiv preprint 2023–04–06, DOI: 10.48550/arXiv.2304.02948There is no corresponding record for this reference.
(Accessed: 2025–02–09)
- 39 Bi, K.; Xie, L.; Zhang, H.; Chen, X.; Gu, X.; Tian, Q. Accurate medium-range global weather forecasting with 3D neural networks. Nature 2023, 619, 533–538, DOI: 10.1038/s41586-023-06185-3
- 40 Chen, L.; Zhong, X.; Zhang, F.; Cheng, Y.; Xu, Y.; Qi, Y.; Li, H. FuXi: A cascade machine learning forecasting system for 15-day global weather forecast. npj Climate and Atmospheric Science 2023, 6, 190, DOI: 10.1038/s41612-023-00512-1
- 41 Lam, R.; Sanchez-Gonzalez, A.; Willson, M. GraphCast: Learning skillful medium-range global weather forecasting. ArXiv preprint 2023–08–04, DOI: 10.48550/arXiv.2212.12794 (accessed 2025–02–09)
- 42 Pathak, J.; Subramanian, S.; Harrington, P. FourCastNet: A Global Data-driven High-resolution Weather Model using Adaptive Fourier Neural Operators. ArXiv preprint 2022–02–22, DOI: 10.48550/arXiv.2202.11214 (accessed 2025–02–09)
- 43 Man, X.; Zhang, C.; Feng, J. W-MAE: Pre-trained weather model with masked autoencoder for multi-variable weather forecasting. ArXiv preprint 2023–12–15, DOI: 10.48550/arXiv.2304.08754 (accessed 2025–02–09)
- 44 Webersinke, N.; Kraus, M.; Bingler, J. A. ClimateBert: A Pretrained Language Model for Climate-Related Text. ArXiv preprint 2022–12–17, DOI: 10.48550/arXiv.2110.12010 (accessed 2025–02–09)
- 45 Yuan, Y.; Lin, L. Self-Supervised Pretraining of Transformers for Satellite Image Time Series Classification. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 2021, 14, 474–487, DOI: 10.1109/JSTARS.2020.3036602
- 46 Cong, Y.; Khanna, S.; Meng, C. SatMAE: Pre-training Transformers for Temporal and Multi-Spectral Satellite Imagery. ArXiv preprint 2023–01–15, DOI: 10.48550/arXiv.2207.08051 (accessed 2025–02–09)
- 47 Reed, C. J.; Gupta, R.; Li, S. Scale-MAE: A Scale-Aware Masked Autoencoder for Multiscale Geospatial Representation Learning. ArXiv preprint 2023–09–22, DOI: 10.48550/arXiv.2212.14532 (accessed 2025–02–09)
- 48 Zaveri, R. A.; Easter, R. C.; Fast, J. D.; Peters, L. K. Model for Simulating Aerosol Interactions and Chemistry (MOSAIC). Journal of Geophysical Research: Atmospheres 2008, 113, D13204, DOI: 10.1029/2007JD008782
- 49 DeVille, R. E. L.; Riemer, N.; West, M. Weighted Flow Algorithms (WFA) for stochastic particle coagulation. Journal of Computational Physics 2011, 230, 8427–8451, DOI: 10.1016/j.jcp.2011.07.027
- 50 DeVille, L.; Riemer, N.; West, M. Convergence of a generalized Weighted Flow Algorithm for stochastic particle coagulation. Journal of Computational Dynamics 2019, 6, 69, DOI: 10.3934/jcd.2019003
- 51 Zaveri, R. A.; Peters, L. K. A new lumped structure photochemical mechanism for large-scale applications. Journal of Geophysical Research: Atmospheres 1999, 104, 30387–30415, DOI: 10.1029/1999JD900876
- 52 Zaveri, R. A.; Easter, R. C.; Peters, L. K. A computationally efficient Multicomponent Equilibrium Solver for Aerosols (MESA). Journal of Geophysical Research: Atmospheres 2005, 110, D24203, DOI: 10.1029/2004JD005618
- 53 Schell, B.; Ackermann, I. J.; Hass, H.; Binkowski, F. S.; Ebel, A. Modeling the formation of secondary organic aerosol within a comprehensive air quality model system. Journal of Geophysical Research: Atmospheres 2001, 106, 28275–28293, DOI: 10.1029/2001JD000384
- 54 Zheng, Z.; West, M.; Zhao, L.; Ma, P.-L.; Liu, X.; Riemer, N. Quantifying the structural uncertainty of the aerosol mixing state representation in a modal model. Atmospheric Chemistry and Physics 2021, 21, 17727–17741, DOI: 10.5194/acp-21-17727-2021
- 55 Levakov, G.; Rosenthal, G.; Shelef, I.; Raviv, T. R.; Avidan, G. From a deep learning model back to the brain–Identifying regional predictors and their relation to aging. Human Brain Mapping 2020, 41, 3235–3252, DOI: 10.1002/hbm.25011
- 56 Zhang, F.; Li, Z.; Zhang, B.; Du, H.; Wang, B.; Zhang, X. Multi-modal deep learning model for auxiliary diagnosis of Alzheimer’s disease. Neurocomputing 2019, 361, 185–195, DOI: 10.1016/j.neucom.2019.04.093
- 57 Xu, H.; Lin, J.; Zhang, D.; Mo, F. Retention time prediction for chromatographic enantioseparation by quantile geometry-enhanced graph neural network. Nat. Commun. 2023, 14, 3095, DOI: 10.1038/s41467-023-38853-3
- 58 Wan, J.; Jiang, J.-W.; Park, H. S. Machine learning-based design of porous graphene with low thermal conductivity. Carbon 2020, 157, 262–269, DOI: 10.1016/j.carbon.2019.10.037
- 59 Healy, R. M.; Sciare, J.; Poulain, L.; Kamili, K.; Merkel, M.; Müller, T.; Wiedensohler, A.; Eckhardt, S.; Stohl, A.; Sarda-Estève, R. Sources and mixing state of size-resolved elemental carbon particles in a European megacity: Paris. Atmospheric Chemistry and Physics 2012, 12, 1681–1700, DOI: 10.5194/acp-12-1681-2012
- 60 Healy, R. M.; Sciare, J.; Poulain, L.; Crippa, M.; Wiedensohler, A.; Prévôt, A. S. H.; Baltensperger, U.; Sarda-Estève, R.; McGuire, M. L.; Jeong, C.-H.; McGillicuddy, E.; O’Connor, I. P.; Sodeau, J. R.; Evans, G. J.; Wenger, J. C. Quantitative determination of carbonaceous particle mixing state in Paris using single-particle mass spectrometer and aerosol mass spectrometer measurements. Atmospheric Chemistry and Physics 2013, 13, 9479–9496, DOI: 10.5194/acp-13-9479-2013
- 61 He, K.; Zhang, X.; Ren, S. Deep Residual Learning for Image Recognition. ArXiv preprint 2015–12–10, DOI: 10.48550/arXiv.1512.03385 (accessed 2025–02–09)
- 62 He, K.; Zhang, X.; Ren, S. Identity Mappings in Deep Residual Networks. ArXiv preprint 2016–07–25, DOI: 10.48550/arXiv.1603.05027 (accessed 2025–02–09)
- 63 Akiba, T.; Sano, S.; Yanase, T. Optuna: A Next-generation Hyperparameter Optimization Framework. ArXiv preprint 2019–07–25, DOI: 10.48550/arXiv.1907.10902 (accessed 2025–02–09)
- 64 Weiss, K.; Khoshgoftaar, T. M.; Wang, D. A survey of transfer learning. Journal of Big Data 2016, 3, 9, DOI: 10.1186/s40537-016-0043-6
- 65 Guo, Y.; Shi, H.; Kumar, A. SpotTune: Transfer Learning Through Adaptive Fine-Tuning. ArXiv preprint 2018–11–21, DOI: 10.48550/arXiv.1811.08737 (accessed 2025–02–09)
- 66 Girshick, R. B.; Donahue, J.; Darrell, T. Rich Feature Hierarchies for Accurate Object Detection and Semantic Segmentation. ArXiv preprint 2014–10–22, DOI: 10.48550/arXiv.1311.2524 (accessed 2025–02–09)
- 67 Long, M.; Cao, Y.; Wang, J. Learning Transferable Features with Deep Adaptation Networks. ArXiv preprint 2015–05–27, DOI: 10.48550/arXiv.1502.02791 (accessed 2025–02–09)
- 68 Razavian, A. S.; Azizpour, H.; Sullivan, J. CNN Features off-the-shelf: an Astounding Baseline for Recognition. ArXiv preprint 2014–05–12, DOI: 10.48550/arXiv.1403.6382 (accessed 2025–02–09)
- 69 Cortes, C.; Mohri, M.; Rostamizadeh, A. L2 Regularization for Learning Kernels. ArXiv preprint 2012–05–09, DOI: 10.48550/arXiv.1205.2653 (accessed 2025–02–09)
- 70 van Laarhoven, T. L2 Regularization versus Batch and Weight Normalization. ArXiv preprint 2017–06–16, DOI: 10.48550/arXiv.1706.05350 (accessed 2025–02–09)
- 71 Wang, C.; Wu, Q.; Weimer, M.; Zhu, E. FLAML: A Fast and Lightweight AutoML Library. MLSys 2021.
- 72 Zhang, C.; Hu, Q.; Su, W.; Xing, C.; Liu, C. Satellite spectroscopy reveals the atmospheric consequences of the 2022 Russia-Ukraine war. Science of The Total Environment 2023, 869, 161759, DOI: 10.1016/j.scitotenv.2023.161759
- 73 Zheng, Z.; Fiore, A. M.; Westervelt, D. M.; Milly, G. P.; Goldsmith, J.; Karambelas, A.; Curci, G.; Randles, C. A.; Paiva, A. R.; Wang, C.; Wu, Q.; Dey, S. Automated Machine Learning to Evaluate the Information Content of Tropospheric Trace Gas Columns for Fine Particle Estimates Over India: A Modeling Testbed. Journal of Advances in Modeling Earth Systems 2023, 15, e2022MS003099, DOI: 10.1029/2022MS003099
- 74 Ke, G.; Meng, Q.; Finley, T.; Wang, T.; Chen, W.; Ma, W.; Ye, Q.; Liu, T. LightGBM: A Highly Efficient Gradient Boosting Decision Tree. Advances in Neural Information Processing Systems; Curran Associates, Inc., 2017.
- 75 Chen, T.; Guestrin, C. XGBoost: A Scalable Tree Boosting System. ArXiv preprint 2016–06–10, DOI: 10.48550/arXiv.1603.02754 (accessed 2025–02–09)
- 76 Breiman, L. Random Forests. Machine Learning 2001, 45, 5–32, DOI: 10.1023/A:1010933404324
- 77 Geurts, P.; Ernst, D.; Wehenkel, L. Extremely randomized trees. Machine Learning 2006, 63, 3–42, DOI: 10.1007/s10994-006-6226-1
- 78 Altmann, A.; Toloşi, L.; Sander, O.; Lengauer, T. Permutation importance: a corrected feature importance measure. Bioinformatics 2010, 26, 1340–1347, DOI: 10.1093/bioinformatics/btq134
- 79 Fisher, A.; Rudin, C.; Dominici, F. All models are wrong, but many are useful: Learning a variable’s importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research 2019, 20, 1–81
- 80 Hallquist, M. The formation, properties and impact of secondary organic aerosol: current and emerging issues. Atmospheric Chemistry and Physics 2009, 9, 5155–5236, DOI: 10.5194/acp-9-5155-2009
- 81 Jimenez, J. L. Evolution of Organic Aerosols in the Atmosphere. Science 2009, 326, 1525–1529, DOI: 10.1126/science.1180353
- 82 Xu, W.; Han, T.; Du, W.; Wang, Q.; Chen, C.; Zhao, J.; Zhang, Y.; Li, J.; Fu, P.; Wang, Z.; Worsnop, D. R.; Sun, Y. Effects of Aqueous-Phase and Photochemical Processing on Secondary Organic Aerosol Formation and Evolution in Beijing, China. Environ. Sci. Technol. 2017, 51, 762–770, DOI: 10.1021/acs.est.6b04498
- 83 Vaswani, A.; Shazeer, N.; Parmar, N. Attention is All you Need. ArXiv preprint 2017–12–06, DOI: 10.48550/arXiv.1706.03762 (accessed 2025–02–09)
- 84 Lam, R. Learning skillful medium-range global weather forecasting. Science 2023, 382, 1416–1421, DOI: 10.1126/science.adi2336
Supporting Information
The Supporting Information is available free of charge at https://pubs.acs.org/doi/10.1021/acsestair.4c00329.
Detailed descriptions of the PartMC-MOSAIC and MEGAPOLI data sets, the hyperparameter results of both the pretrained model and the fine-tuned model, and the results of the “Swapped Temporal Order” experiment (PDF)