Intercomparison of airborne and surface-based measurements during the CLARIFY, ORACLES and LASIC field experiments
- 1Met Office, Exeter, EX1 3PB, UK
- 2Department of Earth and Environmental Sciences, University of Manchester, M13 9PL, UK
- 3Rosential School of Marine and Atmospheric Science, University of Miami, Miami, FL 33149, USA
- 4University of Exeter, Exeter, EX4 4PY, UK
- 5Department of Oceanography, University of Hawai’i at Mānoa, Honolulu, HI, USA
- 6School of Meteorology, University of Oklahoma, Norman, OK, USA
- 7Cooperative Institute for Severe and High-Impact Weather Research and Operations, University of Oklahoma, Norman, OK, USA
- 8FAAM Airborne Laboratory, Cranfield, MK43 0AL, UK
- 9Universities Space Research Association, Columbia, MD, USA
- 10Department of Atmospheric Sciences, University of Washington, Seattle, WA, USA
- 11Bay Area Environmental Research Institute, NASA Ames Research Centre, Moffett Field, Mountain View, CA, USA
- 12Department of Atmospheric and Oceanic Sciences, University of Colorado, Boulder, CO, USA
- 13School of Chemistry, University of Bristol, Bristol, BS8 1TS, UK
- 14Haseltine Lake Kempner, Bristol, BS1 6HU, UK
- 15Department of Atmospheric Sciences, University of North Dakota, Grand Forks, ND, USA
- 16School of Meteorology, University of Oklahoma, Norman, OK, USA
- 17Aerodyne Research Inc., Billerica, MA, USA
- 18College of Earth, Ocean, and Atmospheric Sciences, Oregon State University, Corvallis OR, USA
- 19Department of Physics, University of Auckland, Auckland, New Zealand
- 20NASA Ames Research Centre, Moffett Field, Mountain View, CA, USA
- 21Laboratory for Atmospheric and Space Physics, University of Colorado, Boulder, CO, USA
- 22Brookhaven National Laboratory, Upton, NY, USA
- 23NOAA Chemical Sciences Laboratory (CSL), Boulder, CO, USA
- 24Cooperative Institute for Research in Environmental Sciences (CIRES) at the University of Colorado Boulder, Boulder, CO, USA
Abstract. Data are presented from intercomparisons between two research aircraft, the FAAM BAe-146 and the NASA Lockheed P-3, and between the BAe-146 and the surface-based DOE (Department of Energy) ARM (Atmospheric Radiation Measurement) Mobile Facility at Ascension Island (8° S, 14.5° W; a remote island in the mid-Atlantic). These intercomparisons took place between 17 August and 5 September 2017, during the African biomass burning season. The primary motivation was to give confidence in the use of data from multiple platforms for evaluating numerical climate models. The three platforms were involved in the CLouds-Aerosol-Radiation Interaction and Forcing for Year 2017 (CLARIFY-2017), ObseRvations of Aerosols above CLouds and their intEractionS (ORACLES), and Layered Atlantic Smoke Interactions with Clouds (LASIC) field experiments. Comparisons are presented from flight segments on six days when the BAe-146 flew alongside the ARM facility on Ascension Island, along with comparisons from wingtip-to-wingtip flight of the P-3 and BAe-146 on 18 August 2017. The intercomparison flight sampled a relatively clean atmosphere overlying a moderately polluted boundary layer, while the six fly-bys of the ARM site, 2–4 km upwind, sampled both clean and polluted conditions. We compare and validate characterisations of aerosol physical, chemical, and optical properties, atmospheric radiation, and cloud microphysics between platforms. We assess the performance of the instrumentation in the field, where sampling conditions are not as tightly controlled as in the laboratory environments in which calibrations are performed. Solar radiation measurements compared well between the airborne platforms. Optical absorption coefficient measurements compared well across all three platforms, even though absolute magnitudes were often low (< 10 Mm−1) and close to the sensitivity limits of the instruments, which confounded assessments of the comparability of absorption Ångström exponent characterisations. Aerosol absorption measurements from the airborne platforms were more comparable than aircraft-to-ground observations. Scattering coefficient observations compared well between the airborne platforms, but agreement with ground-based measurements was poorer, potentially because of small differences in sampling conditions or genuine differences in the aerosol populations. Chemical composition measurements followed a similar pattern, with better agreement between the airborne platforms. Thermodynamic, aerosol, and cloud microphysical properties generally compared well.
Paul Alan Barrett et al.
Status: final response (author comments only)
RC1: 'Comment on amt-2022-59', Anonymous Referee #1, 11 Apr 2022
Comments on "Intercomparison of airborne and surface-based measurements…" by Barrett et al., 2022.
This manuscript provides a nice summary of gas-phase and aerosol measurements made using two airborne platforms and one ground site during a coordinated group of intensive field campaigns in 2017. The projects dedicated a portion of a flight to in-flight comparisons between instruments on the UK FAAM BAe-146 aircraft and the US NASA P-3 aircraft. In addition, the FAAM aircraft made multiple transects upwind of the US DOE ARM Mobile Facility at Ascension Island.
Most, but not all, of the measurements agreed within combined stated uncertainties. Notable, and somewhat surprising, exceptions were the scattering measurements between the ground and aircraft measurements, as well as chemical composition measurements between these platforms. Documenting both the agreement and the discrepancies is important, as such datasets are used to evaluate models and satellite measurements and to gain process understanding. The material is eminently suitable for publication in AMT.
In general, the manuscript is well written and clear. However, there are some methodological issues that should be addressed, necessitating a major revision. These issues concern the averaging of measurements across level flight legs followed by linear regression without accounting for uncertainties or variability, the apparent use of one-sided linear regression when both x- and y-values have uncertainties, and the failure to clearly state combined uncertainties during comparisons. In addition, the primary table of data is extremely daunting and should be broken up, with some of it placed in an appendix or the supplemental materials. Detailed comments follow.
Major comments:
1) The comparisons between the various instruments are based primarily on linear regression against mean values from long periods of flight. There are several problems with this approach:
a) The uncertainties quoted are for each instrument's inherent response time as installed in the aircraft. Yet averaging together many minutes of data will result in reduced uncertainties (if the same population is being randomly sampled). One would expect better agreement than the stated raw instrument uncertainties for such averaged data.
b) Regression should be applied using the highest time resolution data possible, rather than to just a few average values from these different "runs". A quick example: if there were only two "runs", using this manuscript's approach there would be only two values, and regression would be a perfect fit to the two data points. The agreement between instruments should be based on the highest resolution data reported, to which the stated uncertainties apply. If one were to fit to averaged values, uncertainties must be adjusted and accounted for in the regression. It would be very interesting to see the regression from the large dynamic range covered in the profile of the two aircraft; this would be a nice way to rigorously compare instruments in a challenging environment.
c) The linear regressions appear to use one-sided least-squares fits. Because there are uncertainties in both x and y parameters, a 2-sided regression, such as orthogonal distance regression, should be used to determine slopes and intercepts. Further, the regressions should account for the uncertainties in each parameter, whether averaged or not.
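As an illustration of such a two-sided fit, here is a minimal sketch using scipy.odr; the data arrays and the 10 % uncertainties are illustrative placeholders, not values from the manuscript, and for averaged data the per-point uncertainties would first be reduced accordingly (roughly by 1/√N for N independent samples):

```python
# Minimal sketch of an orthogonal distance regression (ODR) that weights both
# x and y by their stated uncertainties. All numbers below are placeholders.
import numpy as np
from scipy import odr

x = np.array([5.0, 12.0, 20.0, 35.0, 50.0])   # e.g. platform A scattering, Mm-1
y = np.array([5.5, 11.0, 21.5, 33.0, 52.0])   # e.g. platform B scattering, Mm-1
sx = 0.10 * x                                 # assumed 10 % uncertainty in x
sy = 0.10 * y                                 # assumed 10 % uncertainty in y

linear = odr.Model(lambda beta, xv: beta[0] * xv + beta[1])   # y = m*x + c
data = odr.RealData(x, y, sx=sx, sy=sy)       # uncertainties in both variables
out = odr.ODR(data, linear, beta0=[1.0, 0.0]).run()

print(f"slope = {out.beta[0]:.3f} +/- {out.sd_beta[0]:.3f}, "
      f"intercept = {out.beta[1]:.3f} +/- {out.sd_beta[1]:.3f}")
```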
2) Most of the data are presented in Table 3, which is so large as to be completely unwieldy and is extraordinarily difficult to read because it spans multiple pages. Generally it is much preferable to show data graphically. Instead of Table 3, I recommend a series of graphs of the key data that are discussed and analyzed (at their native resolution). For example, a plot of extinction coefficient for the two airborne platforms could be shown with all of the data covering the full dynamic range, with points perhaps colored by the run type (BL, FT, etc.). It may be most effective to use log-log plots to show the range of values clearly. The numerical values in Table 3 could go into an appendix or the supplemental materials, hopefully in a more compact format.
3) There is extensive discussion of aerosol number concentration and effective radius. However, aerosol mass is extremely important as it is the parameter most often carried in models. Thus it would be very useful to compare the integrated volume from the different size distribution instruments. I suggest that Fig. 6 be converted to six panels, with panels a, b, and c showing the number concentration comparisons on a linear y-scale, and panels d, e, and f showing the volume concentrations, also on a linear scale. A log-log scale spanning almost 9 orders of magnitude on the y-axis can hide a great deal of detail. For example, at ~2 nm in the current Fig. 6a, there is almost an order of magnitude difference between the green line (FAAM PCASP1) and the others. Is this significant? When plotted on a linear scale, we can see whether this difference is a significant contributor to parameters we care about, such as integrated number or volume (mass).
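For reference, the integrated number and volume concentrations follow directly from the binned size distributions; a minimal sketch, using an illustrative lognormal distribution rather than any of the campaign data:

```python
# Minimal sketch: integrating a binned number size distribution dN/dlogD to
# total number (cm-3) and volume (um3 cm-3). The distribution is a toy
# lognormal, not a measured PSD from any of the platforms.
import numpy as np

d = np.logspace(-2, 1, 60)                                          # diameters, um
dndlogd = 1.0e3 * np.exp(-0.5 * ((np.log10(d) + 1.0) / 0.3) ** 2)   # cm-3
dlogd = np.mean(np.diff(np.log10(d)))                               # constant bin width

n_total = np.sum(dndlogd * dlogd)                           # number, cm-3
v_total = np.sum((np.pi / 6.0) * d**3 * dndlogd * dlogd)    # volume, um3 cm-3

print(f"N = {n_total:.1f} cm-3, V = {v_total:.3f} um3 cm-3")
```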
4) Figure 8. I had trouble understanding Fig. 8b. The y-label says it is the Ångström exponent of absorption, but the caption says it is that for extinction. Is it derived using Eq. 2 applied to the absorption coefficient values shown in Fig. 8a? If so, why are the markers in Fig. 8b plotted at ~460 nm when the closest wavelength pair is at 470 and 405 nm? Please explain carefully how these values were derived. Also, it would make more sense graphically for these two plots to be placed side by side, to enhance the vertical scaling and make differences more evident.
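For clarity, the standard two-wavelength form of the absorption Ångström exponent, which Eq. 2 presumably implements (an assumption to be confirmed by the authors), is

$$\mathrm{AAE}(\lambda_1,\lambda_2) = -\,\frac{\ln\!\left[\sigma_{\mathrm{ap}}(\lambda_1)/\sigma_{\mathrm{ap}}(\lambda_2)\right]}{\ln\!\left(\lambda_1/\lambda_2\right)},$$

and such pairwise values are often plotted at the geometric mean wavelength $\sqrt{\lambda_1\lambda_2}$ of the pair.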
5) Lines 950-956. The agreement between the AMS on the FAAM aircraft and the ACSM at the ARM site was quite poor, with differences of factors of 3-4.5. These data should be shown in Table 3, but are not. Poorly agreeing data can be just as important as data that agree well, so please show the values if they are part of a project data archive and were not rejected for quality-control reasons independent of this comparison.
Minor comments:
1) Abstract. The data are described multiple times as agreeing "well". This should be changed to a more quantitative statement, such as "the data agreed within combined experimental uncertainty", if this is the case when the comparison is made at time resolutions for which the stated uncertainties are valid (see comment 1b above).
2) Line 186. Need period after "Beer's Law"
3) Line 217. Two periods.
4) Line 249. Change "dependant" to "dependent", here and elsewhere.
5) Line 255. I don't understand this sentence. Please clarify.
6) Line 268. Do "rear" and "front" instruments refer to PSAPs or nephelometers?
7) Line 283. Please state the flow rates to each optical instrument.
8) Line 379. What are representative uncertainties for the absorption coefficient determined from the CAPS PMSSA instrument?
9) Line 397. Moore et al. (2021) provide a thorough analysis of refractive index sensitivities for the UHSAS.
Moore, R. H., Wiggins, E. B., Ahern, A. T., Zimmerman, S., Montgomery, L., Campuzano Jost, P., Robinson, C. E., Ziemba, L. D., Winstead, E. L., Anderson, B. E., Brock, C. A., Brown, M. D., Chen, G., Crosbie, E. C., Guo, H., Jimenez, J. L., Jordan, C. E., Lyu, M., Nault, B. A., Rothfuss, N. E., Sanchez, K. J., Schueneman, M., Shingler, T. J., Shook, M. A., Thornhill, K. L., Wagner, N. L., and Wang, J.: Sizing response of the Ultra-High Sensitivity Aerosol Spectrometer (UHSAS) and Laser Aerosol Spectrometer (LAS) to changes in submicron aerosol composition and refractive index, Atmos. Meas. Tech., 14, 4517–4542, https://doi.org/10.5194/amt-14-4517-2021, 2021.
10) Line 393. Although this is described in more detail in Wu et al. (2020), please provide a succinct explanation for why an empirical correction factor is needed for the SMPS, when it's quite a fundamental instrument.
11) Line 403. Perhaps just state "with updated electronics" rather than "with SPP200 electronics". Or explain what SPP200 means.
12) Line 417. Change "bin dimensions" to "bin boundary diameters".
13) Line 418. The underwing PCASP is not only not adjusted for the "absorbing characteristics" of the BBA, but it's in general not adjusted for any varying refractive index, including water. This could make a significant sizing difference with in-cabin spectrometers.
14) Line 641. What are linear regression "sensitivities"?
15) Line 664. Data taken at or below detection limit are also of use, and should be plotted as suggested in comment 1b above.
16) Line 688. "Re" (effective radius) is not defined.
17) Line 677 (and 955). Show the LASIC ACSM data in Table 3. Are they at least correlated?
18) Line 1080. Replace hyphen with a comma.
References:
Please ensure that all references comply with Copernicus' style guide. For example, the title of Baumgardner et al. is capitalized, as is that of Cotterell et al. (2021). Such issues typically result from reference-manager software, which often mangles formatting; the reference list must be thoroughly checked manually.
RC2: 'Comment on amt-2022-59', Anonymous Referee #2, 04 Jul 2022
Review of "Intercomparison of airborne and surface-based measurements during CLARIFY, ORACLES and LASIC field experiments"
The manuscript provides a comprehensive overview of collocated measurements from two aircraft platforms and a ground facility. The detailed description of the experiments is of great quality, although its presentation can be improved. Moreover, although mentioned, the differing approaches to drying (or not drying) the sampled aerosol are a critical flaw in a study that compares aerosol parameters measured under different states. A deeper discussion, including the expected hygroscopic growth of the aerosol particles and probable losses due to evaporation, should be included in the results section, addressing the prevailing RH for the PSDs shown.
The manuscript is well written and structured. However, the extensive use of abbreviations makes it partly hard to read and comprehend. Furthermore, abbreviations and units are used inconsistently throughout the text; there are too many instances to address individually. The authors must carefully recheck and harmonize all abbreviations and units. A list of acronyms is recommended.
After the necessary major revision, the manuscript is recommended for publication. Major and minor comments are listed below.
Major comments:
In general, comparing measurements made with different setups, with the aerosol actively dried or not, is not recommended. To ensure comparable conditions, the RH should be kept below 40 %. RH is especially critical for filter-based absorption photometers. The observed gradient in RH (Fig. 4c) propagates into the aircraft's sampling lines and will bias the absorption measurements, because of the differential measurement of light attenuation through the filter spots, even if the cabin is heated to 30 °C (which also has implications for the volatile components of the aerosol particles). For example, a sample at ~80 % RH and ~12 °C outside corresponds to ~26 % RH at 30 °C inside. As shown in the profile, the outside conditions changed to ~1 % RH at ~20 °C, which corresponds to ~0.6 % RH at 30 °C inside. This relatively fast change of more than 25 percentage points can significantly affect the filter-based absorption measurements from the PSAP on the NASA P-3 or the TAP used on the FAAM aircraft. However, the Nafion™ dryer on the FAAM aircraft should dampen this effect significantly. The discussion must address this feature of the experimental setup.
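To illustrate the magnitude of this effect, here is a minimal sketch of the RH conversion quoted above, assuming isobaric warming of the sample to cabin temperature and using the Magnus approximation for the saturation vapour pressure over water (the coefficients are a commonly used set, not taken from the manuscript):

```python
# Minimal sketch: RH after isobaric warming of sample air to a 30 degC cabin,
# assuming the water vapour partial pressure is conserved and using Magnus
# coefficients (6.112 hPa, 17.62, 243.12 degC) for saturation over water.
import numpy as np

def e_sat(t_c):
    """Saturation vapour pressure over water (hPa), Magnus approximation."""
    return 6.112 * np.exp(17.62 * t_c / (243.12 + t_c))

def rh_in_cabin(rh_out, t_out_c, t_cabin_c=30.0):
    """RH (%) of sample air once warmed to the cabin temperature."""
    e = (rh_out / 100.0) * e_sat(t_out_c)     # ambient vapour pressure, hPa
    return 100.0 * e / e_sat(t_cabin_c)

print(rh_in_cabin(80.0, 12.0))   # ~26 %, as quoted above
print(rh_in_cabin(1.0, 20.0))    # ~0.6 %, as quoted above
```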
Table 3 is far too large. Consider presenting the content more comprehensibly, for example with figures. Alternatively, the table content could be separated into the coefficients of the linear fits and the average values.
Figure 5 displays correlations between two variables, each of which carries its own uncertainty. Hence a one-sided linear fit is not applicable, and an orthogonal fit accounting for both uncertainties should be applied. Moreover, it is unsuitable to fit a linear relationship to only two observations; I suspect the statistical significance of those fits is small. Increase the number of data points by decreasing the averaging window, or address this in a deeper discussion.
Since biomass burning aerosol is a major part of the motivation, the discussion and presentation of the aerosol particle light absorption coefficient are, in my opinion, insufficient. Please also provide vertical profiles of the aerosol particle light scattering and absorption coefficients and a discussion of those.
Minor comments:
Abstract:
Line 40: Please add ° to the coordinates.
Line 52 (first appearance): Avoid using "well" when comparing devices. Please rephrase.
Introduction:
-
Instruments:
Line 115: Although referenced, no details on the SMPS in the AMS rack are presented in Sect. 2.4.2.
Line 121 (as an example for other cross-references in the manuscript): a period should separate the subsection levels; instead of 2.52, it should read 2.5.2.
Line 118: Provide details of the CPC, e.g. its volume flow rate, by referring to Sect. 2.6.
Lines 120 and 124: What does "good" mean? Within which range?
Line 126: Please provide the particle losses in the tubing, as a function of particle diameter, that were used to correct for those losses, e.g. in the supplementary material.
Line 132: Please provide the period of the periodic change.
Line 160: Add a space in "ms-1" (i.e., m s-1); otherwise, it reads as inverse milliseconds.
Line 186: Period after Beer's Law.
Line 209 and repeatedly appearing along with the text: Please avoid judgmental adjectives such as "good."
Line 217: Remove one period.
Line 259 (first appearance): Ensure the optical coefficients are properly subscripted.
Lines 300 and 359: Use a uniform notation: Nafion™ or Nafion(TM).
Lines 354 and 359: Explain where the dilution of the aerosol arises and the underlying reasons, and comment on how this was accounted for. Leakage through the Nafion™ membrane would bias the ambient measurement with aircraft cabin aerosol.
Line 378 (first appearance): The AAE (absorption Ångström exponent) is not σap. Please change.
Line 393: Comment on or discuss where the factor of 1.8 originates. Line 719: Comment on the underlying reasons for the empirical scaling factor used for the PSD.
Line 400: According to the reference list, "Howell et al. (2020)" was published in 2021.
Line 428: Comment on the expected uncertainty from omitting the refractive index correction for particles larger than 800 nm.
Results
Line 641: Rephrase "sensitivity" as "the slope". Consistency: BAe-146 or BAe146; choose one.
Line 661: Discuss the differences in the CN measured by the two aircraft in light of the cut-off diameters of the CPCs.
Line 792: Figure 9 could be updated to include the separation between NIR and VIS and the corresponding integrated values.
Line 900: Please comment on the volatility of ammonium nitrate, which evaporates already at 20 °C, and its impact on the chemical composition measurements. See Schaap et al. (2004).
Schaap, M., Spindler, G., Schulz, M., Acker, K., Maenhaut, W., Berner, A., Wieprecht, W., Streit, N., Müller, K., Brüggemann, E., Chi, X., Putaud, J. P., Hitzenberger, R., Puxbaum, H., Baltensperger, U., and ten Brink, H.: Artefacts in the sampling of nitrate studied in the "INTERCOMP" campaigns of EUROTRAC-AEROSOL, Atmos. Environ., 38, 6487–6496, https://doi.org/10.1016/j.atmosenv.2004.08.026, 2004.
Line 1071: Provide a suitable reference for the BBA density.
References:
Add a DOI, if available, to each reference.
General comments:
Regarding tables: table captions should be placed above the tables.
The manuscript is very long; I recommend shortening it where possible. For instance, the instrument description contains repetitive passages (e.g., on gaseous components) and could be condensed, e.g., into tables. A tabular overview of the instruments and corresponding parameters would be easier to follow; differences between the aircraft and the ARM site regarding drying and instrument location could then be explained where necessary.
Figure 5 could be improved by updating the colours of the fitted lines and adding the wavelength where optical coefficients are shown.
Figures 5 and 6: Please provide the aerosol volume and surface size distributions and their integrated and cumulative (along the diameter) values, e.g., in the supplementary material. These would help in understanding the contribution of the different aerosol populations to the optical properties, since the latter depend on the particle cross-sections.
Comment on the different observed size ranges of the different AMS systems, i.e., the difference between ACSM and AMS when comparing the chemical composition. I am not an expert in that field, but could it be that this explains the observed difference?
Line 1595: The specific instrument should be mentioned in the legend for each variable in all figures. Correct the typo: it is the AAE (absorption Ångström exponent), not the extinction Ångström exponent.
Figure 10a: Comment on and discuss the discrepancy of one order of magnitude between the PSDs observed by the 2DS and the CDP.