Keeping new datasets in step with modern organizations using R2O

Underlying transitions from research to operations are typically critical sources of raw data or observations that can further an operational objective or priority when provided to practitioners and applied to the mission. In the early stages, there is often at least some research involvement, either in setting the specifications of the instruments that will take new observations or in establishing concepts and performance specifications for potential derived research byproducts from those observations. Most successful iterations of the R2O cycle take a few years at most. What happens, then, when an observing system or new dataset procurement takes a decade or longer?

Consider this case study. The Geostationary Operational Environmental Satellite R-Series (GOES-R) launched in November 2016 after years of planning and preparation. As a weather satellite, GOES-R will provide operational meteorologists with imagery of atmospheric patterns and cloud features at temporal, spatial, and spectral resolutions substantially greater than those of the “legacy” GOES and its instruments. The ground segment will also create products, formulated from past research, that are composited or derived from the imagery. These, too, will be provided to meteorologists in the field to complement the raw imagery.

However, the National Weather Service (NWS) established the requirements for GOES-R in 1999. In the years since, the derived products have been formulated, but, most notably, the NWS has evolved as an organization. The forecaster (practitioner) workflow has changed with the growth of other observational data and numerical weather prediction model output. There have also been substantial improvements in information technology, so the methods of storing and visualizing weather observations and of communicating weather forecasts to the public have evolved as well.

So, when it is time to validate the new imagery and products from GOES-R against the NWS requirements, should the validators pretend it is 1999? With a strong R2O cycle in place, there is an opportunity to refresh the requirements in the context of today’s research and operational environments. Put another way, the users of GOES-R imagery and derived products should not have to settle for a product whose accuracy and precision specifications reflect what was attainable in 1999. Change is inevitable. In today’s research and operational contexts, certain products may hold more promise than others to meet tighter specifications and find more users. Other observations may have since filled related needs, or model analyses may now be more reliable at producing a representation of the atmosphere that a satellite simply cannot provide. Instead, the absolute priority should be what is realistic and attainable with the new instruments and the state of the research in 2016 and the years ahead.

The evolution of requirements and priorities can be summarized in three stages over the lifespan of an observation:

  1. Establish requirements to set the instrument specifications based on the best concept of future users’ needs.
  2. Set updated “lifetime” requirements when the data is first available, with consideration of the organization’s present and future mission.
  3. Update priorities to refocus attention on where the observing system can add value today, shifting away from validating all previous requirements evenly.

It is easy to get caught pairing the instrument specifications with the requirements for the entire lifespan of a new observing system, because the instruments and other components were initially built to meet those requirements. And with current procurement timelines spanning at least several years, if not a decade for major acquisitions, there is little that can be done to change that. But once the new observations are available, the paradigm should change, and hopefully there is room for growth, whether through the instrumentation exceeding its specification and providing higher-quality data, or through the scientific community developing new methods for using the collected data. Enabling this requires some flexibility to update the software components of the system on timelines consistent with R2O transitions (weeks or months, not years). In general, and with the right architecture, updating the software should not cost more, or at least not substantially more.

Fixed hardware and a static observing system are rare in an era where organizations, enterprises, missions, and requirements are continually evolving, especially at today’s rapid pace. But where that is a reality (such as with weather satellites), past priorities should be dismissed as old requirements are updated through demonstrations in proving grounds and testbeds. These forums, which involve practitioners directly in the R2O process, can be used to establish a new benchmark for operational functions of the raw data and derived research byproducts from an observing system or dataset.

Establishing metrics that track progress toward the updated requirements from the new benchmark can show where there is potential for added value over what current capabilities, research, and configuration provide. After all, R2O is about capturing the added value. A combination of metrics and new requirements should turn the cogs of the proverbial R2O wheel for the lifespan of the observing system or dataset, and potentially help to recognize limitations that can set the hardware or other requirements for the next enhancement to the dataset.
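To picture how metrics interact with evolving requirements, consider a minimal sketch of validating a product against both its original and refreshed thresholds. Everything here is illustrative: the product name, error values, units, and threshold numbers are invented for the example and are not actual GOES-R or NWS specifications.

```python
# Hypothetical sketch: one product's measured performance checked against
# an original requirement threshold and a refreshed one. All names and
# numbers are invented for illustration.
from dataclasses import dataclass

@dataclass
class Requirement:
    name: str
    max_error: float  # maximum allowed error, in the product's units
    year_set: int     # year the threshold was established

def meets(measured_error: float, req: Requirement) -> bool:
    """A product satisfies a requirement when its error is within threshold."""
    return measured_error <= req.max_error

# A legacy-era threshold versus a refreshed one reflecting what newer
# instruments and research make attainable.
legacy = Requirement("cloud_top_height", max_error=2.0, year_set=1999)
updated = Requirement("cloud_top_height", max_error=0.5, year_set=2016)

measured_error = 0.8  # e.g., kilometers, from a hypothetical validation run
print(meets(measured_error, legacy))   # True: comfortably within the old bar
print(meets(measured_error, updated))  # False: the refreshed bar exposes a gap
```

The point of the sketch is the asymmetry in the last two lines: validating only against the original threshold would declare success, while the refreshed threshold shows exactly where further research effort could add value.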

In all cases, though, ideas (from research, operations, or elsewhere) should not be taken at face value. Whether they are research ideas, software ideas, or evolutionary hardware ideas, an idea should serve only as an impetus for subsequent analysis. There should be methods for testing the validity of a proposed concept and for establishing metrics and requirements accordingly. Ideas are opinions formed by applying what “we” know now to a current or future challenge. Failing to verify the merit of an idea can be costly in an R2O cycle that involves a major or novel set of observations or other data.

Managing a data collection or observing system means owning the requirements and keeping the R2O cycle as modern as the organization or enterprise it serves. Keeping R2O processes nimble and efficient means knowing when the requirements are out of date. Otherwise, there can be a waste of time, or worse, a loss of faith in the processes that can bear tomorrow’s fruit.

Jordan Gerth is a research meteorologist with a decade of R2O experience, interacting with academia, the federal government, and the private sector on weather satellite and software projects.