Accurate pre-flight and in-flight calibrations of the SOLSTICE are both crucial to the solar irradiance measurements. The pre-flight calibrations provide the absolute values of the irradiances that are traceable to the National Institute of Standards and Technology (NIST) radiometric standards. The in-flight calibrations provide the relative knowledge of how irradiances measured on any day compare precisely with those measured on any other day of the mission.
Unit Level Pre-flight Calibrations:
- Mirror Reflectivity – Measure reflectivity efficiency and uniformity in order to pick the best mirrors for flight
- Grating Efficiency – Measure gratings’ combined reflectivity and diffraction efficiency and their uniformity in order to pick the best gratings for flight
- Grating Scattered Light – Measure the gratings’ scattered light properties, the main source of scattered light in the spectrometer
- Grating Drive Rotation – Calibrate the grating drive rotation step sizes in order to provide the most precise wavelength scale for the spectrometer
- Detector Efficiency – Measure the PMTs’ quantum efficiency (QE) and uniformity in order to pick the best PMTs for flight
- Detector Linearity – Calibrate the PMTs’ linearity over the full range of operational count rates
- Slit Sizes – Calibrate the entrance and exit slit sizes for precise comparison of UARS measurements to future measurements using the same technique, such as by EOS SOLSTICE
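The detector linearity calibration above can be illustrated with the standard nonparalyzable dead-time model for a pulse-counting PMT. This is a generic sketch, not the specific SOLSTICE correction; the 25 ns dead time in the example is a hypothetical value.

```python
def true_rate(measured_rate: float, dead_time: float) -> float:
    """Correct a measured PMT count rate for nonparalyzable dead time.

    measured_rate: observed counts per second
    dead_time: detector dead time in seconds (illustrative value below)
    """
    return measured_rate / (1.0 - measured_rate * dead_time)

# Example: with a hypothetical 25 ns dead time, a measured rate of
# 1e6 counts/s underestimates the true rate by about 2.5%.
corrected = true_rate(1.0e6, 25e-9)
```

At low count rates the correction is negligible, which is why linearity must be calibrated over the full operational range rather than at a single rate.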
System Level Pre-Flight Calibrations:
- Sensitivity – Calibrate each channel’s sensitivity over its full field of view. Sensitivity is the combination of mirror reflectivity, grating efficiency, PMT QE, entrance slit area, and exit slit spectral bandpass. The NIST radiometric standards used for these calibrations include deuterium and FEL lamps and the Synchrotron Ultraviolet Radiation Facility (SURF) in Gaithersburg, Maryland.
- Wavelength Scale – Validate the unit level calibration of the grating drive rotation
- Scattered Light – Validate the unit level calibration of the grating scattered light
Due to the nature of the SOLSTICE design, continued exposure to ultraviolet solar radiation causes a gradual decay in the instrument sensitivity. We believe two distinct processes share responsibility for this decay: a reduction in the photocathode efficiency of the photomultiplier tubes with prolonged exposure, and polymerization of the optical elements from exposure to hard x-rays.
In order to model the resulting decay in instrument sensitivity, SOLSTICE observes an ensemble of stars at specific pre-selected wavelengths over the duration of the mission.
Prior to launch, SOLSTICE personnel selected approximately 30 different stars of spectral classes O, B, and A to observe during portions of the spacecraft orbit when the instrument does not observe the sun. Stars were selected according to visual magnitude, spectral class, and any known variability in apparent brightness. Only the brightest stars with no known variability were selected. After launch, the instrument began observing each star at every selected wavelength whenever possible. Stars later found to behave anomalously relative to the rest of the ensemble were removed from the list.
The data from all these observations exist as a time series for each star at each calibration wavelength. The stellar data analysis algorithms filter the data according to the intensity of each stellar measurement and screen for geophysical effects, such as passage through the South Atlantic Anomaly. Changes in instrument sensitivity are modeled by fitting the entire ensemble of stars at each wavelength with a multivariate nonlinear least-squares algorithm. The resulting calibration curves are then folded into the SOLSTICE production processing algorithms to correct the solar observations.
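The ensemble fit can be sketched as follows. This is a simplified stand-in for the multivariate fit described above: it assumes a single exponential degradation rate shared by all stars at one wavelength, with one free amplitude per star, and linearizes with a logarithm so the fit has a closed form. The actual SOLSTICE degradation model is more detailed.

```python
import math

def fit_ensemble_decay(star_series):
    """Fit one shared degradation rate to an ensemble of stellar time series.

    star_series: dict mapping star name -> list of (time, intensity) pairs.
    Assumed model: I_s(t) = A_s * exp(-k * t), one amplitude A_s per star,
    one shared decay rate k.  Taking log(I) gives a within-star linear
    regression with common slope -k, solvable in closed form.
    """
    num = 0.0
    den = 0.0
    for obs in star_series.values():
        times = [t for t, _ in obs]
        logs = [math.log(i) for _, i in obs]
        t_bar = sum(times) / len(times)
        y_bar = sum(logs) / len(logs)
        # Accumulate the pooled within-star covariance and variance.
        num += sum((y - y_bar) * (t - t_bar) for t, y in zip(times, logs))
        den += sum((t - t_bar) ** 2 for t in times)
    k = -num / den  # shared decay rate
    # Per-star amplitudes: log(I) + k*t averages to log(A_s) under the model.
    amps = {s: math.exp(sum(math.log(i) + k * t for t, i in obs) / len(obs))
            for s, obs in star_series.items()}
    return k, amps
```

Pooling many stars in one fit is what allows the decay rate to be constrained even though no single star is observed densely in time.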
- Degradation Rate – Stellar data directly provide the instrument degradation rate, assuming the stars are stable sources; observing many stars allows the stability of any individual star to be validated against the ensemble
- Alignment Scans – Calibrate the SOLSTICE pointing direction relative to the UARS platform and validate that there are no alignment drifts during the mission
- Field of View Maps – Validate the pre-flight sensitivity calibrations over the instrument’s field of view (FOV) and monitor how the instrument degradation varies across the FOV
- Wavelength Scale – Use the many solar absorption and emission lines as reference wavelengths for the wavelength scale, which shifts slightly with temperature and with target pointing angle
- Scattered Light – Validate the pre-flight calibration of the scattered light
- Detector Gain – Correct for the detector gain, which changes slightly with temperature
Application of the Calibration Parameters:
In calculating the irradiances, the solar data are corrected for scattered light, detector linearity, detector dark counts, detector gain changes, instrument sensitivity, and degradation. The stellar irradiances undergo similar processing, but the degradation factors are treated as free parameters and are adjusted to make the mean stellar irradiance invariant in time. The resulting degradation factors are then the same ones applied to the solar data.
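The correction chain above can be sketched as a single function. All parameter names here are illustrative, and the corrections are reduced to simple scalar factors; the production algorithms apply each step with far more detailed models (the linearity correction, for example, is omitted here).

```python
def corrected_irradiance(raw_counts: float,
                         dark_counts: float,
                         scattered_counts: float,
                         gain_factor: float,
                         sensitivity: float,
                         degradation: float,
                         integration_time: float = 1.0) -> float:
    """Convert raw detector counts to irradiance (simplified sketch).

    sensitivity: absolute responsivity from the pre-flight calibration
    degradation: in-flight degradation factor from the stellar fits
    """
    # Remove backgrounds, then undo the temperature-dependent gain change.
    signal = (raw_counts - dark_counts - scattered_counts) / gain_factor
    rate = signal / integration_time
    # Divide by the calibrated (and degraded) instrument response.
    return rate / (sensitivity * degradation)
```

For the stellar data the same chain is run with `degradation` left free, and its value is adjusted until the mean stellar irradiance is constant in time.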
The wavelength scale is referenced in vacuum wavelength units to high resolution solar spectra above 200 nm and to atomic or ionic transition levels below 200 nm. Each spectrum’s wavelength scale is also adjusted to the SOLSTICE reference wavelength scale to account for small wavelength shifts related to temperature changes and to pointing offsets.
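One simple way to estimate such a wavelength-scale offset is to locate a known line in the measured spectrum and compare its centroid with the reference vacuum wavelength. This is a minimal sketch under that assumption; line selection, windowing, and the actual SOLSTICE adjustment procedure are omitted, and the numbers in the test are made up.

```python
def wavelength_shift(wavelengths, counts, reference_line):
    """Estimate the wavelength-scale offset from one spectral line.

    wavelengths, counts: samples covering a single isolated line
    reference_line: the line's known vacuum wavelength (same units)
    Returns the shift to subtract from the spectrum's wavelength scale.
    """
    total = sum(counts)
    # Intensity-weighted mean wavelength approximates the line center.
    centroid = sum(w * c for w, c in zip(wavelengths, counts)) / total
    return centroid - reference_line
```

Repeating this for many lines across the spectrum would let the temperature- and pointing-related shift be characterized as a function of wavelength rather than a single offset.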
A detailed description of the SOLSTICE instrument calibration is given by Woods et al. (“Solar Stellar Irradiance Comparison Experiment 1: 2. Instrument calibration”, J. Geophys. Res. 98, 10679–10694, 1993) and by Woods et al. (“Validation of the UARS Solar Ultraviolet Irradiances: Comparison with the ATLAS-1, -2 Measurements”, J. Geophys. Res., in press, 1996).