An SNR-Optimized Scanning Strategy for the Geostationary Carbon Cycle Observatory (GeoCarb) Instrument

The Geostationary Carbon Observatory (GeoCarb) will make measurements of greenhouse gases over the land masses of the western hemisphere. The extreme flexibility of observing from geostationary orbit induces an optimization problem, as operators must choose what to observe and when. We express this problem as an optimal subcovering problem, use an Incremental Optimization (IO) algorithm to create a scanning strategy that minimizes the expected error as a function of the signal-to-noise ratio (SNR), and show that this method outperforms the "human-selected" strategy in terms of global error distributions.


Introduction
Understanding the effects of anthropogenic carbon dioxide (CO2) on the carbon cycle requires us to understand the spatial distribution of atmospheric CO2 concentrations in order to identify natural and anthropogenic sources and sinks. In addition to a sparse in situ sampling network, ground-based remote sensing measurements are currently obtained from the Total Carbon Column Observing Network (TCCON), and space-based measurements come from the Orbiting Carbon Observatory (OCO-2) (Eldering et al. (2017a); Eldering et al. (2017b); Crisp et al. (2017); Crisp et al. (2008); Crisp et al. (2004)) and the Greenhouse Gases Observing Satellite (GOSAT) (Kuze et al. (2009); Yokota et al. (2009); Hammerling et al. (2012)). These instruments have provided a wealth of data for understanding the global carbon cycle in recent years. However, they have spatial and temporal limitations. The repeat cycles of the space-based instruments force spatial and temporal interpolation of atmospheric CO2 concentrations within their respective cycles: 3 days for GOSAT (Kuze et al. (2009)) and 16 days for OCO-2 (Miller et al. (2007)). The sparsity of the TCCON measurement sites restricts the latitudinal range of observations. The new Geostationary Carbon Observatory (GeoCarb) (Polonsky et al. (2014)) will allow us to augment the current remote sensors on the ground and in space.
Recently selected as NASA's Earth Venture Mission-2 (EVM-2), GeoCarb is set to launch into geostationary orbit in 2022, positioned at 85°W, with the mission of improving the understanding of the carbon cycle. Building on the work of OCO-2, GeoCarb will observe reflected sunlight daily over the Americas and retrieve the column-average dry air mole fractions of carbon dioxide (XCO2), carbon monoxide (XCO), and methane (XCH4), as well as solar-induced fluorescence (SIF).
GeoCarb views reflected sunlight from Earth through a narrow slit that projects onto the Earth's surface over an area measuring about 1,740 miles (2,800 kilometers) from north to south and about 3.7 miles (6 kilometers) from east to west. The instrument makes measurements along the slit with a 4.5-second integration. Instrument pointing is accomplished by way of two scanning mirrors that shift the field of view (FoV) north-south and east-west. The pointing system is extremely flexible, and observations can be made at any location at any time with sufficient solar illumination. This flexibility induces an optimization problem: where should the instrument take measurements at a given time throughout the day?
Determining when and where to make daily scans with GeoCarb's observing capabilities is mathematically similar to a CO2 observation network optimization problem for establishing new observation sites. Selecting the optimal locations of new observing stations has been shown to be feasible using various optimization algorithms. Previous studies on optimizing CO2 observation networks have utilized computationally expensive evolutionary algorithms [e.g. Simulated Annealing (Rayner et al. (1996); Gloor et al. (2000)) and a Genetic Algorithm (Nickless et al. (2018))], and one utilized a deterministic, incremental algorithm (Patra and Maksyutov (2002)). All of the previous studies utilized their optimization routines to minimize CO2 measurement uncertainty as a function of the signal-to-noise ratio (SNR).
In this paper, a deterministic Incremental Optimization (IO) routine is utilized to find a scanning strategy for GeoCarb that minimizes the expected measurement uncertainty as a function of SNR. Section 2 gives background information on the GeoCarb mission and the objectives of this paper. Section 3 explains the process used to create the SNR-optimized algorithm and how the expected error is calculated from the simulated retrievals. In Section 4, the global error distribution of the algorithm-selected strategy is compared to that of a baseline strategy to determine the performance of the algorithm, and the sensitivity of the algorithm to its inputs is investigated, with results discussed in Sect. 5. We offer concluding statements and future research goals in Sect. 6.

Background
GeoCarb will be hosted on an SES Government Solutions (http://www.ses-gs.com) communications satellite in geostationary orbit at 85°W. It will measure reflected near-infrared sunlight in the O2 band at 0.76 µm to measure total column O2, the weak and strong CO2 bands at 1.61 µm and 2.06 µm to measure XCO2, and the CH4 and CO bands at 2.32 µm to measure XCH4 and XCO. The O2 spectral band is identical to that of the OCO-2 mission and allows for the determination of mixing ratios and the measurement of SIF, as well as additional information on aerosol and cloud contamination of retrievals. The baseline mission for GeoCarb aims to produce column-averaged mixing ratios of CO2, CH4, and CO with accuracies per sample of 0.7% (≈ 2.7 ppm), 1% (≈ 18 ppb), and 10% (≈ 10 ppb), respectively. Geostationary orbit offers two main advantages over low Earth orbit (LEO). First, the signal-to-noise ratio (SNR) is proportional to the square root of the dwell time for detectors limited by photon shot noise; geostationary orbit enables longer observation times, thereby increasing SNR. Second, areas with high and uncertain anthropogenic emissions of CO2, CH4, and CO may be targeted with contiguous sampling, relatively small spatial footprints, and fine temporal resolution, allowing for several observations per day on continental scales.
We are presented with the problem of finding an optimized scanning strategy for the GeoCarb satellite instrument. The underlying abstract mathematical problem related to optimizing the scanning pattern is the Geometric Set Cover problem (Hetland (2014)).
Given a finite set of points in space and a set of subsets, the objective is to find a minimal set of scan blocks whose union covers all points in the space. This idea is identical to a network optimization problem comparing the coverage area of a potential network observation site to a geometric subset of the space to be covered. The task of determining the locations of new observation sites so that the total number of sites required to cover an area is minimal is solved similarly. Therefore, we looked to methods used for optimizing network observation sites for our application. However, our motivation extends beyond finding a minimal covering set: our goal is a minimal covering set that is operationally efficient and minimizes global measurement error for the GeoCarb instrument. Specific to the instrument's application, the geometric subsets are 5-minute east-to-west scan blocks, shown in Fig. 1, and the points in space that we are trying to cover are the North American and South American land masses between 50°N and 50°S. Because measurement errors are influenced by parameters that vary in space and time, the solutions take the form of ordered sets in which the scan blocks are ordered by time of execution. With simplifying assumptions made to keep the problem computationally tractable and to minimize scan coverage over the ocean, we propose a candidate set of 135 scan blocks (Fig. 1). This is a much larger candidate set than those of the network optimization studies that utilized evolutionary algorithms (Rayner et al. (1996); Gloor et al. (2000)). Therefore, the computationally efficient Incremental Optimization (IO) procedure was implemented to select scan blocks that minimize our objective function at each increment in time.

Scan Blocks
We assume that GeoCarb will process commands in terms of 5-minute scan blocks, during which the instrument steps the slit from east to west. The set of considered scan blocks, shown in Fig. 1, purposely excludes potential scans that are primarily over the ocean, as GeoCarb will not be able to make retrievals over water surfaces due to the lack of signal in the CO2 spectral bands. The potential scans are also largely restricted to land between 50°N and 50°S due to the larger solar zenith angles at higher latitudes. Each slit observation (i.e. 1016 individual soundings) is assumed to take 5 seconds, after which the slit moves to the west by half a slit width.

Science Operations Timeline
A goal of this study is to create a scanning strategy that views all land masses of interest within the window of usable daylight. To determine what time of day to begin the scanning process, Macapá, Brazil and Mexico City, Mexico were chosen as geographic reference points (Fig. 1) to determine the beginning and ending times, respectively, of the usable daylight time frame. Macapá is located at (0°, 50°W) at the mouth of the Amazon River; being on the equator, it provides a consistent starting time relative to the airmass factor (AF), a function of the solar zenith angle (SZA) and the sensor zenith angle (ZA), where AF = 1/cos(SZA) + 1/cos(ZA). Located at (19.5°N, 99.25°W), Mexico City is an ideal reference point for determining when the window of usable daylight ends, because it is longitudinally centered in the North American land mass while being close enough to the equator for the calculated airmass factors to remain consistent through the winter months. The scanning strategy takes as its starting time the moment Macapá first crosses a starting threshold for AF, and as its ending time the moment Mexico City drops below an ending threshold for AF. As a result of the parameter exploration experiments described in Section 3.7, the suggested starting threshold for minimum variance in predicted errors is AF = 2.6 for the Summer Solstice and AF = 2.7 for the Autumn Equinox.
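The airmass factor defined above is straightforward to compute; the sketch below (the function name is ours) evaluates AF = 1/cos(SZA) + 1/cos(ZA) for angles given in degrees.

```python
import math

def airmass_factor(sza_deg, za_deg):
    """Airmass factor AF = 1/cos(SZA) + 1/cos(ZA), angles in degrees."""
    return 1.0 / math.cos(math.radians(sza_deg)) + 1.0 / math.cos(math.radians(za_deg))

# For an overhead sun and a nadir view, AF reaches its minimum of 2;
# AF grows rapidly as either zenith angle approaches 90 degrees.
```

The start and stop decisions described in the text then reduce to comparing this quantity against the 2.6 and 2.7 thresholds at the two reference points.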

Uncertainty in Retrieved Gas Concentrations
GeoCarb retrieves gas concentrations using reflected sunlight. The radiance, I, observed by GeoCarb is an aggregate of insolation and atmospheric and land surface processes that absorb, reflect, and scatter photons. The impact of these processes is parameterized using a simple model from Polonsky et al. (2014) that incorporates the effects of surface albedo and attenuation by aerosols over the sun-Earth-satellite path described by the solar zenith angle (SZA) and the sensor zenith angle (ZA):

I = (F_sun α cos(SZA) / π) e^(−τ · AF),    (1)

where F_sun is the band-specific solar irradiance, α is the band-specific surface albedo, and τ is the optical depth (OD) of atmospheric scatterers (e.g. aerosols). For our simple model, we assumed a cloud-free atmosphere, whereas in the operational environment clouds play a major role in retrieval quality due to poorly understood 3-D scattering effects. As can be readily verified, larger zenith angles lead to reduced signal for constant scatterer OD, as does smaller surface albedo. Note that τ is a quantity with significant spatial and temporal variability, as aerosol concentrations are modified by atmospheric dynamics, emissions, and chemistry. Typical values of τ in successful retrievals for OCO-2 are less than 0.6 for nadir soundings near the equator and decrease as AF increases. Similarly, surface albedo varies with land cover type on small spatial scales, and throughout the year with vegetation density. The OD term was set to τ = 0.3, as this was previously found to be a reasonable estimate for a "clear" sky retrieval (Crisp et al. (2004); O'Dell et al. (2012)).
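As a concrete illustration, the simple model can be coded directly. We assume here that the radiance takes the form I = (F_sun · α · cos(SZA) / π) · exp(−τ · AF), which matches the dependencies described in the text (signal decreasing with zenith angles and optical depth, increasing with albedo); the exact form printed in Eq. (1) may differ.

```python
import math

def radiance(f_sun, albedo, sza_deg, za_deg, tau=0.3):
    """Clear-sky radiance sketch (assumed form):
    I = (F_sun * albedo * cos(SZA) / pi) * exp(-tau * AF),
    with AF = 1/cos(SZA) + 1/cos(ZA); tau = 0.3 is the paper's 'clear sky' OD."""
    af = 1.0 / math.cos(math.radians(sza_deg)) + 1.0 / math.cos(math.radians(za_deg))
    return f_sun * albedo * math.cos(math.radians(sza_deg)) / math.pi * math.exp(-tau * af)
```

Under this form, doubling the albedo doubles the signal, while increasing either zenith angle reduces it, consistent with the discussion above.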
An important indicator of observation quality is the signal-to-noise ratio (SNR). In the case of GeoCarb, the signal is the radiance I, and the instrument noise equivalent spectral radiance, N, is modeled with two parameters, N0 and N1, that empirically capture the effects of the instrument design on overall instrument noise (O'Dell et al. (2012)).

Objective Function
From the definition of the SNR, it is easy to see that the SNR increases monotonically with the signal I. Since the goal is ultimately to maximize I, we define an objective function that minimizes its multiplicative inverse. In addition to the SNR term, two constraints were included in the objective function to prevent erratic behavior in the scanning strategy.
An overlap term was introduced to minimize repeated coverage of regions. A distance term, δ, was also included, which is the squared linear distance from the boundary of the last selected scan block to a candidate scan block. The objective function, c, to be minimized is given by

c(s, t) = median[ e^(AF(s,t,x,y)) α(x, y)^(−1) ] + overlap(s) + δ(s),    (5)

where the median is taken over the points (x, y) in candidate scan block s at time t, AF(s, t, x, y) is the airmass factor of a point in the scan block with respect to time, and α(x, y) is the surface albedo of the point.
The median of e^(AF(s,t,x,y)) α(x, y)^(−1) is used because we assume that the distributions of the airmass factor and surface albedo are non-Gaussian within the scan blocks. The high variability of both parameters is described in Section 3.4.2.
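A minimal sketch of evaluating the cost of one candidate block follows, assuming the median term is combined additively with the overlap and distance penalties; the helper names (`af`, `albedo`, `overlap`, `delta`) and the additive combination are our assumptions.

```python
import math
from statistics import median

def objective(scan_block, t, af, albedo, overlap, delta):
    """Cost of candidate scan block at time t: the median over its grid
    points of exp(AF)/albedo, plus overlap and distance penalty terms."""
    core = median(math.exp(af(x, y, t)) / albedo(x, y) for (x, y) in scan_block)
    return core + overlap(scan_block) + delta(scan_block)
```

Because the median is used rather than the mean, a few extreme albedo or airmass values inside a block do not dominate its cost.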

Surface Albedo
The MCD43C3 Version 6 White Sky Albedo MODIS band 6 data set (Schaaf and Wang (2015)) was utilized for obtaining the surface albedo, α. The MODIS BRDF/Albedo product combines multiband, atmospherically corrected surface reflectance data from the MODIS and MISR instruments to fit a Bidirectional Reflectance Distribution Function (BRDF) in seven spectral bands at 1 km spatial resolution on a 16-day cycle (Lucht et al. (2000)). The White Sky Albedo measure is a bihemispherical reflectance obtained by integrating the BRDF over all viewing and irradiance directions. These albedo measures are purely properties of the surface; therefore, they are compatible with any atmospheric specification and provide true surface albedo as an input to regional and global climate models. The native data was aggregated to 0.5° spatial resolution and interpolated in time to daily resolution.
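The temporal step of that processing can be sketched as a simple linear interpolation between two 16-day composites; the function and argument names are ours, and the spatial aggregation to 0.5° is omitted.

```python
def daily_albedo(day, day0, alb0, day1, alb1):
    """Linearly interpolate between two 16-day albedo composites
    (valid at day0 and day1) to estimate the albedo on an intermediate day."""
    w = (day - day0) / (day1 - day0)
    return (1.0 - w) * alb0 + w * alb1
```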

Seasonal Variation of Parameters
Since AF is affected by the sun's position and albedo is affected by the density of vegetation, both variables show large seasonal variations, as shown in Figs. 2 and 3. However, there is little to no variation between consecutive days. It therefore suffices, with the added advantage of computational efficiency, to calculate separate scanning strategies for each month rather than for each day.

Optimization Algorithms
The time dependency of the scanning strategy requires the solutions to be represented as ordered scan blocks drawn from the candidate set. Summing the permutation counts over all covering-set sizes gives approximately 7 × 10^230 possible solutions. Since it is computationally intractable to evaluate all possible solutions, a Greedy heuristic algorithm was employed to find a minimal covering set as a lower-bound estimate for set cardinality, and it was then modified into an Incremental Optimization (IO) algorithm to find a scanning strategy optimized for SNR.
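The count of ordered solutions can be checked directly: summing the number of k-permutations of the 135 candidate blocks over all k gives roughly e · 135! ≈ 7 × 10^230.

```python
import math

# Sum of k-permutations P(135, k) = 135! / (135 - k)! over all k = 0..135.
total = sum(math.perm(135, k) for k in range(136))

print(len(str(total)))  # the count has 231 digits, i.e. it is of order 10^230
```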

Greedy Algorithm
Viewing the North American and South American land masses as a uniform space to be covered, without considering any additional constraints, the problem is a Geometric Set Cover problem in which the goal is to find a minimal-cardinality covering set that we will call optimal. No efficient exact algorithm for the Set Cover problem is known, as it is one of Karp's 21 NP-complete problems, and the optimization version is NP-hard (Karp (1972)). However, there exists a heuristic method, the Greedy algorithm, that recursively selects the cover with the largest intersection with the uncovered space until the space is covered (Hetland (2014)). The pseudo-code is shown in Algorithm 1. The Greedy algorithm is computationally efficient, though it is difficult to verify that the solution it finds is optimal. The Greedy algorithm is suitable for our application because it reduces the set of candidate blocks at each iteration by removing the selected scan blocks, which ensures that no scan block is repeated in a scanning strategy. Running the Greedy heuristic with no objective function shows that the area of interest can be covered using 83 scan blocks. We therefore took this as the lower bound on covering-set size.
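For illustration, the Greedy cover can be sketched in a few lines on a toy instance (in the real problem, the space is the land grid and the candidates are the 135 scan blocks):

```python
def greedy_cover(space, blocks):
    """Greedy set cover: repeatedly select the block that covers the most
    still-uncovered points, removing it from the candidate set."""
    uncovered = set(space)
    candidates = dict(blocks)          # name -> set of covered points
    selected = []
    while uncovered and candidates:
        best = max(candidates, key=lambda name: len(candidates[name] & uncovered))
        if not candidates[best] & uncovered:
            break                      # no remaining block adds coverage
        uncovered -= candidates.pop(best)
        selected.append(best)
    return selected

# Toy instance: four candidate blocks covering six points.
blocks = {"a": {1, 2, 3}, "b": {4, 5}, "c": {5, 6}, "d": {1}}
cover = greedy_cover({1, 2, 3, 4, 5, 6}, blocks)
```

The block "a" is chosen first because it covers the most uncovered points, and "d" is never chosen because its point is already covered.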

Incremental Optimization
The Greedy algorithm was modified to select the scan block that minimizes the objective function at each iteration, subject to operational constraints. Presented in Patra and Maksyutov (2002), this modification makes it an Incremental Optimization (IO) algorithm, because its goal is to minimize the objective function at each increment of time in order to find the global optimum. Like the Greedy algorithm, IO has the advantage of being computationally inexpensive. However, it may find only local optima and produce sub-optimal solutions, depending on the nature of the problem. To avoid this issue, it is common to introduce small perturbations at each increment, as in evolutionary algorithms (e.g. simulated annealing and genetic algorithms). It has been shown that IO yields results nearly as good as evolutionary algorithms while using a fraction of the computational power (Nickless et al. (2018)). For GeoCarb's application, we examine the global distribution of errors, σ, and are therefore not concerned about local optima. An additional constraint was added requiring the algorithm to cover South America before switching to North America, to further prevent erratic scanning behavior.
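The IO variant replaces the maximal-intersection rule with a minimal-cost rule; a sketch under the same toy setup, where the `cost` callable stands in for the objective function c:

```python
def incremental_optimization(space, blocks, cost):
    """Incremental Optimization: at each step select the candidate block with
    the minimum objective value, instead of the raw-coverage rule of the
    Greedy cover; selected blocks are removed so no block repeats."""
    uncovered = set(space)
    candidates = dict(blocks)  # name -> set of covered points
    selected = []
    while uncovered and candidates:
        best = min(candidates, key=lambda name: cost(name, candidates[name], uncovered))
        uncovered -= candidates.pop(best)
        selected.append(best)
    return selected
```

With cost(name, pts, uncovered) = -len(pts & uncovered) this reduces exactly to the Greedy cover; the operational objective instead combines the SNR, overlap, and distance terms.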
The pseudo-code procedure of the algorithm is shown in Algorithm 2.

Evaluating the Optimized Scanning Strategy
To determine a "best" algorithm-selected scanning strategy, the global distribution of error, σ (Eq. (3)), of the algorithm-selected strategy is compared to a baseline scanning strategy that we considered the "obvious" choice if chosen by a human, shown in Fig. 4. We say the baseline strategy is the "obvious" choice because it tracks the sun's path and covers the entire area of interest in five coherent regions, in the order Tropical South America East, Tropical South America West, Temperate South America, Tropical North America, and Temperate North America. The baseline strategy uses the same scanning start and stop times as the IO algorithm (1230 UTC for the Autumn Equinox and 1315 UTC for the Summer Solstice).

Parameter Exploration
In Equation (5), the overlap and distance terms have equal weighting in the objective function. The effects of weighting these terms on the global distribution of errors were investigated by adding constant weight terms (w_o, w_d) to Eq. (5) as new input parameters, resulting in Eq. (6). Utilizing Equation (6), the algorithm now has three inputs: w_o, w_d, and the starting airmass factor threshold. A Monte Carlo experiment was performed to explore the parameter space of weights and determine the distribution of sample error statistics across the range of possible starting thresholds. For both w_o and w_d, 1000 weights each were randomly sampled from a uniform distribution between 0 and 10. This process was repeated for starting AF thresholds from 2.5 to 3.5 in increments of 0.1; the minimum variance of predicted errors occurs when the starting AF threshold is 2.6 for the Summer Solstice and 2.7 for the Autumn Equinox, shown in Fig. 5. Figure 6 shows that the error distribution medians and variances are minimized when w_o and w_d are both equal to one.
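The sampling step of the experiment can be sketched as follows (the error simulation itself is omitted, and the seed is arbitrary):

```python
import random

random.seed(0)

# 1000 weight pairs (w_o, w_d), each drawn uniformly from [0, 10].
weights = [(random.uniform(0.0, 10.0), random.uniform(0.0, 10.0))
           for _ in range(1000)]

# Starting AF thresholds from 2.5 to 3.5 in steps of 0.1.
af_thresholds = [round(2.5 + 0.1 * i, 1) for i in range(11)]
```

For each threshold, the scanning strategy is then re-run over all 1000 weight pairs, and the median and variance of the resulting error distributions are recorded.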

Global error
Based on the parameter exploration results, the global distributions of error were investigated for simulations during the Summer Solstice with a starting AF threshold of 2.6 and during the Autumn Equinox with a starting AF threshold of 2.7. The algorithm-selected scanning strategies consistently matched or exceeded the overall performance of the baseline scanning pattern, as shown in Figs. 7 and 8. The region with the most significant improvement is the Amazon during the Autumn Equinox (Fig. 9). Additionally, the potential of both scanning strategies to yield observations with an SNR greater than 100 (a predicted error of σ = 2.17) was analyzed. A comparison of the distributions of medians of usable soundings obtained from the parameter exploration experiment is shown in Fig. 10. During the Summer Solstice with a starting AF threshold of 2.6, the algorithm-selected strategy yielded approximately 3.79 million usable soundings versus approximately 3.02 million for the baseline strategy. Similarly, during the Autumn Equinox with a starting AF threshold of 2.7, the algorithm-selected strategy yielded approximately 4.31 million usable soundings versus 3.04 million for the baseline. It is important to note that these figures assume the cloud-free environment of our model; we expect the yield of usable soundings to be significantly lower during operations, but that will likely affect the baseline and optimal strategies similarly.

Sensitivity Analysis
To quantify the algorithm's sensitivity to input parameters, the method of standardized regression coefficients (SRC) was utilized (Helton et al. (2006)). SRCs are the regression coefficients of a linear model fitted to the standardized dependent variable. The dependent variable in this case is the predicted error, and the independent variables are w_o, w_d, and the starting AF threshold. The standardization of variables allows the effect of the input parameters to be measured without dependency on their units (e.g. km^2, ppm). The coefficient of determination, R^2, of the SRC model tells us how much of the variability in the sample statistics is explained by the model. R^2 is defined as the Modeled Sum of Squares (MSS) divided by the Total Sum of Squares (TSS):

R^2 = MSS / TSS = Σ_{i=1}^{n} (Ŷ_i − Ȳ)^2 / Σ_{i=1}^{n} (Y_i − Ȳ)^2,

where Ŷ = model-predicted values, Ȳ = mean error, Y = observed values, and n = number of observations. The method of SRC was chosen for the sensitivity analysis because simulation data from the parameter exploration experiment were readily available.
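The MSS/TSS definition of R^2 translates directly into code; a small sketch with a helper for standardizing variables (the function names are ours):

```python
def standardize(values):
    """Center to zero mean and scale to unit sample standard deviation."""
    m = sum(values) / len(values)
    s = (sum((v - m) ** 2 for v in values) / (len(values) - 1)) ** 0.5
    return [(v - m) / s for v in values]

def r_squared(y_obs, y_hat):
    """R^2 = MSS / TSS = sum((y_hat_i - y_bar)^2) / sum((y_i - y_bar)^2)."""
    y_bar = sum(y_obs) / len(y_obs)
    mss = sum((yh - y_bar) ** 2 for yh in y_hat)
    tss = sum((y - y_bar) ** 2 for y in y_obs)
    return mss / tss
```

Fitting a linear model to the standardized inputs and outputs then yields the SRCs as its coefficients, with this R^2 measuring how much variability the model explains.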
In the results, both the median and variance of the global error are found to be sensitive to the starting AF threshold, as seen in Figs. 5 and 10 and Tables 1 and 2. The starting AF threshold affects the scanning strategy as a whole by shifting the scanning time frame. This sensitivity was expected, considering that airmass factors depend on time and play a large role in the calculation of radiance (Eq. (1)). Because SRCs determine the effect of each input parameter in the presence of the others, the SRCs fitted to a linear model of predicted error with respect to w_o and w_d were also analyzed within the Monte Carlo samples with starting AF thresholds of 2.7 for the Autumn Equinox and 2.6 for the Summer Solstice.
Within the specified starting AF threshold of 2.7 for the Autumn Equinox, moderate effects of the weights on the sample global error distribution were found. The values in Table 3 show that the SRC model explains approximately half of the variability in the median of the global error distributions (R^2 = 0.552), and the parameter with the largest effect is the distance term, δ. With respect to the variance of the global error distributions, the SRC model explains less than half of the variability (R^2 = 0.384); again, the parameter with the largest effect is the distance term.
Within the specified starting AF threshold of 2.6 for the Summer Solstice, the effects of the weights on the sample global error distribution are small. The SRC model explains approximately a quarter of the variability in the median of the global error distributions (R^2 = 0.242) and approximately 15% of the variability in the variance (R^2 = 0.148), as shown in Table 4. For both the median and variance of the error distributions, the parameter with the largest effect is the overlap term.
Small R^2 values signify either little sensitivity to the independent variables or a nonlinear relationship between the independent and dependent variables. Visual analysis of the scatter plots of the distributions of sample statistics versus weights (Fig. 6) does not suggest a nonlinear relationship between the weights and sample statistics.

Conclusions

We illustrate an efficient technique that selects a covering set while also minimizing global measurement error. The Incremental Optimization routine gives an optimized scanning strategy that performs better than the baseline scanning strategy with respect to the global distribution of error and the number of usable soundings. We also found that optimizing for the global distribution of error yields an improvement in regional errors as well, as seen in Fig. 8. There may be better objective functions to optimize, and the structure of the algorithm allows human intervention beyond the scope of this work. For example, in the event of natural phenomena such as wildfires, droughts, and volcanic eruptions, the algorithm can be modified to capture prioritized scanning regions during the minimum predicted-error time for those regions.
At the moment, our model does not take into account the effect of clouds on retrieval quality. Clouds are known to play a significant role in scattering effects and influence the calculation of radiance (Eq. (1)). In a case study by Polonsky et al. (2014) that included clouds and aerosols in the atmosphere, the number of usable soundings passing their post-processing filter (PPF) of AOD < 0.1 was between 8.1% and 20% of total simulated soundings. We believe that an AOD threshold of 0.1 is too conservative for a clear-sky threshold, so it was relaxed to 0.3; a 2012 study found that 22% of scenes were classified correctly as "clear" using an AOD threshold of 0.3. Because τ = 0.3 in our calculation of radiance (Eq. (1)), we estimate that the true number of usable soundings will be around 20% of our original estimates of daily usable soundings in Sect. 4.1. Going forward, cloud products from CALIPSO will be incorporated to better simulate operational conditions. This will yield more robust scanning strategies and estimates of usable soundings.
The SNR-optimized scanning strategy outperforms the human-selected strategy first proposed for the GeoCarb scientific observation plan. The algorithm selects strategies that consistently match or exceed the performance of the baseline scanning pattern with respect to global error. The algorithm-selected strategies yield an 18% increase in soundings with an SNR > 100 during the Summer and a 41% increase during the Autumn over the baseline strategy.
Atmos. Meas. Tech. Discuss., https://doi.org/10.5194/amt-2018-359. Manuscript under review for journal Atmos. Meas. Tech. Discussion started: 9 November 2018. © Author(s) 2018. CC BY 4.0 License.

The noise parameters capture instrument design effects such as telescope length and detector noise (O'Dell et al. (2012)). The O2 A-band (0.763 µm) specific constants, N0 = 0.1819 and N1 = 0.003295, are used in the noise model; these figures were derived from airborne trials with the Tropospheric Infrared Mapping Spectrometers (TIMS) by Lockheed Martin (Kumer et al. (2013)) and later revised in Polonsky et al. (2014). The SNR is then defined as I/N. In O'Brien et al. (2016), the authors fitted an empirical model to predict the observational uncertainty as a function of the measurement SNR. They found that the function

σ = 1 / (1/14 + 0.0039 · SNR)    (3)

best captures the observation error distribution as SNR increases. The same model is used to connect SNR and uncertainty for the later scanning strategy distributions. The distribution of σ is treated as the metric against which a particular scanning sequence is evaluated.
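The empirical error model can be coded directly; we assume here that Eq. (3), which is garbled in the printed text, has the form σ = 1/(1/14 + 0.0039 · SNR), as this reproduces the usable-sounding cutoff σ = 2.17 at SNR = 100 quoted in Sect. 4.1.

```python
def predicted_error(snr):
    """Empirical observation error vs. SNR (assumed form of Eq. 3):
    sigma = 1 / (1/14 + 0.0039 * SNR)."""
    return 1.0 / (1.0 / 14.0 + 0.0039 * snr)

# sigma decreases monotonically with SNR; at SNR = 100 it is about 2.17,
# the threshold used in the text to define a "usable" sounding.
```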
where
s = a candidate scan block,
t = time,
E = uncovered land mass,
I = the set of selected scan blocks,
α(x, y) = surface albedo of a point in a scan block,
δ = distance from the last selected scan block.

Algorithm 1 (Greedy)
E ← space to be covered
S ← set of scan blocks, where E ⊆ ⋃ si ∈ S
I ← ∅
while E ⊈ ⋃ I do
    find s* ∈ S such that s* ∩ E is maximal over all si ∈ S
    append s* to I
    remove s* from S
end while

Algorithm 2 (Incremental Optimization)
E ← North American and South American land masses between 50°N and 50°S
S ← set of 5-minute east-to-west scan blocks, where E ⊆ ⋃ si ∈ S
I ← ∅
C ← objective function
while E ⊈ ⋃ I do
    find s* ∈ S such that C(s*) = min{C(si) : si ∈ S}
    append s* to I
    remove s* from S
end while

Figure 5. Violin plots showing the effect of the starting threshold on the variance of errors: Summer Solstice (top) and Autumn Equinox (bottom).

Figure 6. Scatter plots of the distributions of sample error statistics versus the weights w_o and w_d.

Figure 7. Global error distributions of observations with a predicted SNR > 100 for the baseline strategy (left) and the algorithm-selected strategy (right) at the minimum-variance starting thresholds. The number of observations is significantly greater for the algorithm-selected strategies.

Figure 10. Violin plots showing the effect of the starting threshold on error distribution medians: Summer Solstice (top) and Autumn Equinox (bottom).

Figure 1. Candidate scan blocks are shown in yellow. In red are the geographic reference points, Mexico City, Mexico and Macapá, Brazil.

Table 1. The SRCs show that the starting AF threshold has a major effect on the variance and median of the global error distributions.

Table 2. The SRCs show that the starting AF threshold has a major effect on the median and variance of the global error distributions.

Table 3. The SRCs show that the distance term has a moderate effect on the median of the global error distributions and some effect on the variance during the Autumn Equinox.

Table 4. The coefficients of determination show that the weighting has little effect on the global error distributions during the Summer Solstice.