Intercomparison of airborne and surface-based measurements during the CLARIFY, ORACLES and LASIC field experiments
- 1Met Office, Exeter, EX1 3PB, UK
- 2Department of Earth and Environmental Sciences, University of Manchester, M13 9PL, UK
- 3Rosenstiel School of Marine and Atmospheric Science, University of Miami, Miami, FL 33149, USA
- 4University of Exeter, Exeter, EX4 4PY, UK
- 5Department of Oceanography, University of Hawai’i at Mānoa, Honolulu, HI, USA
- 6School of Meteorology, University of Oklahoma, Norman, OK, USA
- 7Cooperative Institute for Severe and High-Impact Weather Research and Operations, University of Oklahoma, Norman, OK, USA
- 8FAAM Airborne Laboratory, Cranfield, MK43 0AL, UK
- 9Universities Space Research Association, Columbia, MD, USA
- 10Department of Atmospheric Sciences, University of Washington, Seattle, WA, USA
- 11Bay Area Environmental Research Institute, NASA Ames Research Center, Moffett Field, Mountain View, CA, USA
- 12Department of Atmospheric and Oceanic Sciences, University of Colorado, Boulder, CO, USA
- 13School of Chemistry, University of Bristol, Bristol, BS8 1TS, UK
- 14Haseltine Lake Kempner, Bristol, BS1 6HU, UK
- 15Department of Atmospheric Sciences, University of North Dakota, Grand Forks, ND, USA
- 16School of Meteorology, University of Oklahoma, Norman, OK, USA
- 17Aerodyne Research Inc., Billerica, MA, USA
- 18College of Earth, Ocean, and Atmospheric Sciences, Oregon State University, Corvallis OR, USA
- 19Department of Physics, University of Auckland, Auckland, New Zealand
- 20NASA Ames Research Center, Moffett Field, Mountain View, CA, USA
- 21Laboratory for Atmospheric and Space Physics, University of Colorado, Boulder, CO, USA
- 22Brookhaven National Laboratory, Upton, NY, USA
- 23NOAA Chemical Sciences Laboratory (CSL), Boulder, CO, USA
- 24Cooperative Institute for Research in Environmental Sciences (CIRES) at the University of Colorado Boulder, Boulder, CO, USA
Abstract. Data are presented from intercomparisons between two research aircraft, the FAAM BAe-146 and the NASA Lockheed P-3, and between the BAe-146 and the surface-based DOE (Department of Energy) ARM (Atmospheric Radiation Measurement) Mobile Facility at Ascension Island (8° S, 14.5° W), a remote island in the mid-Atlantic. These took place from 17 August to 5 September 2017, during the African biomass burning season. The primary motivation was to give confidence in the use of data from multiple platforms with which to evaluate numerical climate models. The three platforms were involved in the CLouds-Aerosol-Radiation Interaction and Forcing for Year 2017 (CLARIFY-2017), ObseRvations of Aerosols above CLouds and their intEractionS (ORACLES), and Layered Atlantic Smoke and Interactions with Clouds (LASIC) field experiments. Comparisons are presented from flight segments on six days when the BAe-146 flew alongside the ARM facility on Ascension Island, along with comparisons from wing-tip to wing-tip flight of the P-3 and BAe-146 on 18 August 2017. The intercomparison flight sampled a relatively clean atmosphere overlying a moderately polluted boundary layer, while the six fly-bys of the ARM site sampled both clean and polluted conditions 2–4 km upwind. We compare and validate characterisations of aerosol physical, chemical, and optical properties, atmospheric radiation, and cloud microphysics between platforms. We assess the performance of measurement instrumentation in the field, where sampling conditions are less tightly controlled than in the laboratory environments where calibrations are performed. Solar radiation measurements compared well between the airborne platforms. Optical absorption coefficient measurements compared well across all three platforms, even though absolute magnitudes were often low (< 10 Mm−1) and close to the sensitivity limits of the instrumentation, which confounded assessments of the comparability of absorption Ångström exponent characterisations. Aerosol absorption measurements from the airborne platforms were more comparable than aircraft-to-ground observations. Scattering coefficient observations compared well between the airborne platforms, but agreement with ground-based measurements was worse, potentially caused by small differences in sampling conditions or genuine differences in the aerosol populations. Chemical composition measurements followed a similar pattern, with better agreement between the airborne platforms. Thermodynamic, aerosol, and cloud microphysical properties generally compared well.
Status: open (until 28 Jun 2022)
RC1: 'Comment on amt-2022-59', Anonymous Referee #1, 11 Apr 2022
Comments on "Intercomparison of airborne and surface-based measurements. . ." by Barrett et al., 2022.
This manuscript provides a nice summary of gas-phase and aerosol measurements made using two airborne platforms and one ground site during a coordinated group of intensive field campaigns in 2017. The projects dedicated a portion of a flight to in-flight comparisons between instruments on the UK FAAM BAe-146 aircraft and the US NASA P-3 aircraft. In addition, the FAAM aircraft made multiple transects upwind of the US DOE ARM Mobile Facility at Ascension Island.
Most, but not all, of the measurements agreed within combined stated uncertainties. Notable, and somewhat surprising, exceptions were the scattering measurements between the ground site and the aircraft, as well as the chemical composition measurements between these platforms. Documenting both the agreement and the discrepancies is important, as such datasets are used to evaluate models and satellite measurements and to gain process understanding. The material is eminently suitable for publication in AMT.
In general, the manuscript is well written and clear. However, there are some methodological issues that should be addressed, necessitating a major revision. These issues have to do with averaging of measurements across level flight legs followed by linear regression without accounting for uncertainties or variability, the apparent use of one-sided linear regression when both x- and y-values have uncertainties, and a failure to clearly state combined uncertainties during comparisons. In addition, the primary table of data is extremely daunting and should be broken up, with some of it placed in an Appendix or the Supplemental Materials. Detailed comments follow.
Major comments:
1) The comparisons between the various instruments are based primarily on linear regression against mean values from long periods of flight. There are several problems with this approach:
a) The uncertainties quoted are for each instrument's inherent response time as installed in the aircraft. Yet averaging together many minutes of data will result in reduced uncertainties (if the same population is being randomly sampled). One would expect better agreement than the stated raw instrument uncertainties for such averaged data.
b) Regression should be applied using the highest time resolution data possible, rather than to just a few average values from these different "runs". A quick example: if there were only two "runs", using this manuscript's approach there would be only two values, and regression would be a perfect fit to the two data points. The agreement between instruments should be based on the highest resolution data reported, to which the stated uncertainties apply. If one were to fit to averaged values, uncertainties must be adjusted and accounted for in the regression. It would be very interesting to see the regression from the large dynamic range covered in the profile of the two aircraft; this would be a nice way to rigorously compare instruments in a challenging environment.
c) The linear regressions appear to use one-sided least-squares fits. Because there are uncertainties in both the x and y parameters, a two-sided regression, such as orthogonal distance regression, should be used to determine slopes and intercepts. Further, the regressions should account for the uncertainties in each parameter, whether averaged or not (a sketch of such a fit follows).
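The two-sided fit suggested above can be implemented with SciPy's ODR module. The sketch below is illustrative only: the data are synthetic and the stated uncertainties are placeholders. If run-mean values were fitted instead, the per-point uncertainties entering the fit should be standard errors of the mean (roughly σ/√N for N independent samples), per comment 1a.

```python
import numpy as np
from scipy import odr

def linear(beta, x):
    # beta[0] = slope, beta[1] = intercept
    return beta[0] * x + beta[1]

# Synthetic 1 Hz scattering coefficients (Mm^-1) from two instruments,
# each with a placeholder 1-sigma uncertainty of 2 Mm^-1.
rng = np.random.default_rng(1)
truth = rng.uniform(5.0, 80.0, 500)
x = truth + rng.normal(0.0, 2.0, 500)
y = 1.05 * truth + rng.normal(0.0, 2.0, 500)
sx = np.full_like(x, 2.0)
sy = np.full_like(y, 2.0)

# RealData propagates the uncertainties in BOTH coordinates into the fit,
# unlike an ordinary one-sided least-squares regression.
data = odr.RealData(x, y, sx=sx, sy=sy)
out = odr.ODR(data, odr.Model(linear), beta0=[1.0, 0.0]).run()
slope, intercept = out.beta
slope_err, intercept_err = out.sd_beta
print(f"slope = {slope:.3f} +/- {slope_err:.3f}, "
      f"intercept = {intercept:.2f} +/- {intercept_err:.2f}")
```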
2) Most of the data are presented in Table 3, which is so large as to be completely unwieldy and extraordinarily difficult to read because it spans multiple pages. Generally it is much preferable to show data graphically. Instead of Table 3, I recommend a series of graphs of the key data that are discussed and analyzed (at their native resolution). For example, a plot of extinction coefficient for the two airborne platforms could be shown with all of the data covering the full dynamic range, with points perhaps colored by the run type (BL, FT, etc.). It may be most effective to use log-log plots to show the range of values clearly (a sketch of such a plot follows). The numerical values in Table 3 could go into an appendix or the supplemental materials, hopefully in a more compact format.
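As an illustration of the graphical presentation suggested here, a minimal matplotlib sketch; the data, column names, and run-type labels are all synthetic and hypothetical.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Illustrative paired extinction coefficients (Mm^-1) from the two
# aircraft, tagged by run type; values and names are made up.
rng = np.random.default_rng(0)
true = 10 ** rng.uniform(0, 2.5, 200)
df = pd.DataFrame({
    "ext_faam": true * rng.normal(1.0, 0.05, 200),
    "ext_p3":   true * rng.normal(1.0, 0.05, 200),
    "run_type": rng.choice(["BL", "FT"], 200),
})

fig, ax = plt.subplots()
for run_type, marker in [("BL", "o"), ("FT", "s")]:
    sel = df["run_type"] == run_type
    ax.loglog(df.loc[sel, "ext_faam"], df.loc[sel, "ext_p3"],
              marker, ms=4, ls="none", alpha=0.6, label=run_type)
ax.plot([1, 300], [1, 300], "k--", label="1:1")  # one-to-one reference
ax.set_xlabel("FAAM extinction coefficient (Mm$^{-1}$)")
ax.set_ylabel("P-3 extinction coefficient (Mm$^{-1}$)")
ax.legend()
plt.show()
```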
3) There is extensive discussion of aerosol number concentration and effective radius. However, aerosol mass is extremely important as it is the parameter most often carried in models. Thus it would be very useful to compare integrated volume from the different size distribution instruments (a sketch of the volume integration follows). I would suggest that Fig. 6 be converted to six panels, with a, b, and c showing the number concentration comparisons on a linear y-scale, and panels d, e, and f showing the volume concentrations, also on a linear scale. A log-log scale with almost 9 orders of magnitude on the y-axis can hide a great deal of detail. For example, at ~2 µm in the current Fig. 6a, there is almost an order of magnitude difference between the green line (FAAM PCASP1) and the others. Is this significant? When plotted on a linear scale we can see whether this difference is a significant contributor to the parameters we care about, such as integrated number or volume (mass).
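The integrated volume suggested here follows directly from the binned number distribution, V = (π/6) Σ D_i³ N_i. A minimal sketch, assuming spherical particles, bin boundary diameters in nm, and per-bin number concentrations in cm⁻³ (function and argument names are illustrative).

```python
import numpy as np

def integrated_moments(bin_edges_nm, dN):
    """Number and volume concentration from a binned size distribution.

    bin_edges_nm : bin boundary diameters in nm (length = nbins + 1)
    dN           : number concentration per bin in cm^-3 (length = nbins)
    Returns (N_total in cm^-3, V_total in um^3 cm^-3).
    """
    edges = np.asarray(bin_edges_nm, dtype=float)
    d_mid = np.sqrt(edges[:-1] * edges[1:]) / 1e3  # geometric mid-points, um
    n_total = np.sum(dN)
    v_total = np.sum((np.pi / 6.0) * d_mid**3 * dN)  # sphere volume per bin
    return n_total, v_total
```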
4) Figure 8. I had trouble understanding Fig. 8b. The y-label says it is the Ångström exponent of absorption, but the caption says it is that for extinction. Is it derived using Eq. 2 applied to the absorption coefficient values shown in Fig. 8a? If so, why are the markers in 8b plotted at ~460 nm when the closest wavelength pairs are at 470 and 405 nm? Please explain carefully how these values were derived (a sketch of the standard calculation follows). Also, it would make more sense graphically for these two plots to be side-by-side, to enhance the vertical scaling and make differences more evident.
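For reference, the standard two-wavelength Ångström exponent (presumably the form of Eq. 2) is sketched below with illustrative values. If the plotting convention were the geometric-mean wavelength of the pair, the 405/470 nm pair would plot near 436 nm rather than ~460 nm, which bears on the question above.

```python
import numpy as np

def angstrom_exponent(coef1, coef2, wl1, wl2):
    """Two-wavelength Angstrom exponent (absorption or extinction).

    coef1, coef2 : coefficients at wavelengths wl1, wl2 (same units, e.g. Mm^-1)
    wl1, wl2     : wavelengths in nm
    """
    return -np.log(coef1 / coef2) / np.log(wl1 / wl2)

# Illustrative values only for the 405/470 nm pair discussed above:
aae = angstrom_exponent(8.0, 6.5, 405.0, 470.0)   # ~1.4
wl_plot = np.sqrt(405.0 * 470.0)                  # geometric mean, ~436 nm
```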
5) Lines 950-956. The agreement between the AMS on the FAAM aircraft and the ACSM at the ARM site was quite poor, with factors of 3-4.5 difference. These data should be shown in Table 3, but are not. Poorly agreeing data can be just as important as data that agree well, so please show the values if they are part of a project data archive and were not rejected for quality-control reasons independent of this comparison.
Minor comments:
1) Abstract. The data are described multiple times as agreeing "well". This should be changed to a more quantitative statement, such as "the data agreed within combined experimental uncertainty", if this is the case when the comparison is made at time resolutions for which the stated uncertainties are valid (see comment 1b above; a sketch of such a check follows these minor comments).
2) Line 186. Need period after "Beer's Law"
3) Line 217. Two periods.
4) line 249. Change "dependant" to "dependent", here and elsewhere.
5) Line 255. I don't understand this sentence. Please clarify.
6) Line 268. Do "rear" and "front" instruments refer to PSAPs or nephelometers?
7) Line 283. Please state the flow rates to each optical instrument.
8) Line 379. What are representative uncertainties for the absorption coefficient determined from the CAPS PMSSA instrument?
9) Line 397. Moore et al. (2021) provide a thorough analysis of refractive index sensitivities for the UHSAS.
Moore, R. H., Wiggins, E. B., Ahern, A. T., Zimmerman, S., Montgomery, L., Campuzano Jost, P., Robinson, C. E., Ziemba, L. D., Winstead, E. L., Anderson, B. E., Brock, C. A., Brown, M. D., Chen, G., Crosbie, E. C., Guo, H., Jimenez, J. L., Jordan, C. E., Lyu, M., Nault, B. A., Rothfuss, N. E., Sanchez, K. J., Schueneman, M., Shingler, T. J., Shook, M. A., Thornhill, K. L., Wagner, N. L., and Wang, J.: Sizing response of the Ultra-High Sensitivity Aerosol Spectrometer (UHSAS) and Laser Aerosol Spectrometer (LAS) to changes in submicron aerosol composition and refractive index, Atmos. Meas. Tech., 14, 4517–4542, https://doi.org/10.5194/amt-14-4517-2021, 2021.
10) Line 393. Although this is described in more detail in Wu et al. (2020), please provide a succinct explanation for why an empirical correction factor is needed for the SMPS, when it's quite a fundamental instrument.
11) Line 403. Perhaps just state "with updated electronics" rather than "with SPP200 electronics". Or explain what SPP200 means.
12) Line 417. Change "bin dimensions" to "bin boundary diameters".
13) Line 418. The underwing PCASP is not only unadjusted for the "absorbing characteristics" of the BBA; in general it is not adjusted for any varying refractive index, including that of water. This could make a significant sizing difference relative to in-cabin spectrometers.
14) Line 641. What are linear regression "sensitivities"?
15) Line 664. Data taken at or below detection limit are also of use, and should be plotted as suggested in comment 1b above.
16) Line 688. "Re" (effective radius) is not defined.
17) Line 677 (and 955). Show the LASIC ACSM data in Table 3. Are they at least correlated?
18) Line 1080. Replace hyphen with a comma.
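Regarding minor comment 1, a quantitative agreement statement could be backed by a combined-uncertainty check of the form |x − y| ≤ k·√(u_x² + u_y²). A minimal sketch; the coverage factor and any inputs are illustrative.

```python
import numpy as np

def agrees_within_combined_uncertainty(x, y, ux, uy, k=2.0):
    """True where |x - y| <= k * sqrt(ux^2 + uy^2).

    x, y   : paired measurements from the two instruments
    ux, uy : their 1-sigma uncertainties (scalars or arrays)
    k      : coverage factor (k = 2 approximates a 95 % interval)
    """
    return np.abs(np.asarray(x) - np.asarray(y)) <= k * np.hypot(ux, uy)
```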
References:
Please ensure that all references comply with Copernicus' style guide. For example, the title of Baumgardner et al. is capitalized, as is that of Cotterell et al. (2021). Such errors typically result from reference manager software, which often mangles formatting; references must be thoroughly checked manually.