Metrics for monitoring R2O skill
While the R2O process is a “soft” cycle that depends on a number of players and factors, there are some “hard” metrics that can provide a clear picture of whether the implemented R2O structure is efficient and effective. To be clear, R2O cannot be “forced”, and subjective feedback is an important part of the cycle that can govern the total time a transition takes. Therefore, the best way to determine whether there is R2O skill within a community or organization is to consider a set of recent transitions together, instead of focusing on one transition and classifying it as a success or failure.
Metrics to consider are:
- Percent of ready research byproducts in the science portfolio successfully transitioned, or the success rate to date
It is important to make sure the investment in research is converted into value for operations. If a number of ready research byproducts are stalled waiting for transition and this percentage is low, it is worthwhile to examine the steps in the R2O process, or to reassess the requirements that necessitated the research in the first place.
- Percent of transitioned research byproducts actively used in operations
If transitioned research byproducts are not used in operations, then either they are patently not useful and there was a failure in the requirements, or there has not been sufficient training on, or awareness of, how to apply the research byproducts to operational challenges.
- Amount of time that research byproducts are demonstrated in a pre-operational capacity, requiring the resources of a researcher to serve as a technical interface
If researchers are supporting byproducts in a pre-operational status for over a year without implementing improvements, it is worth investigating the cause. It is possible that the R2O process is not agile enough, that technical systems lack capacity, or that not enough funding has been allocated for R2O transitions relative to the science portfolio.
- Number of transitioned research byproducts that serve the same operational user and solve the same challenge
When observation systems and their science portfolios exist in silos, disjoint from operations and from each other, transitioned research byproducts may be too similar. If this situation arises, it is likely because of limited oversight across the many R2O transitions. In most cases, users desire information regardless of the observation system, so blended research byproducts that incorporate many observations are ideal. Requirements should be written this way to spur streamlined implementations.
- Alignment between specific routine observations and their use in transitioned research byproducts
R2O is a bridge between massive investments (observations, science, and operations). To determine whether the investment in observation systems and their capabilities is worthwhile, monitor whether the research byproducts use the unique observations, and whether operations uses those research byproducts. Each observation in time, space, and character should map to a research byproduct that operations is using.
- Degree of effort to convert research code to operational code for running on technical systems that support R2O
Straightforward standards should make it easy to convert research code to operational code in all but the most complex situations. Researchers should be aware of these standards at the beginning of their projects in order to decrease the need to re-architect code during the transition to operations.
- Amount of time to incorporate improvements to research byproducts on technical systems that support R2O
It should be possible to implement improvements to research byproducts on smaller time scales than completely new implementations. This is because, in most cases, the observations and supplementary data should already be available on the technical system for the updated algorithm to use. That is not to suggest that improvements should avoid those portions of the process that require due diligence to assure that operational complications (e.g., time of availability and quality) do not arise.
- Ratio of the cost of operational implementation to the cost of research
Ideally, research costs should be higher than those for operational implementation, so the best ratios are well below unity (one). High operational implementation costs generally suggest that the wrong processes or technical systems are in place to facilitate R2O; alternatively, low research costs suggest that there is not enough innovation in the pipeline.
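Several of the metrics above reduce to simple arithmetic once transitions are tracked as records. The sketch below is one hypothetical way to compute the success rate, the active-use percentage, and the cost ratio; the `Transition` record and its field names are illustrative assumptions, not drawn from any real tracking system.

```python
from dataclasses import dataclass

# Hypothetical record of one R2O transition; field names are
# illustrative assumptions, not from any real tracking system.
@dataclass
class Transition:
    ready: bool               # research byproduct is ready for transition
    transitioned: bool        # transition to operations is complete
    used_in_operations: bool  # byproduct is actively used by operators
    research_cost: float      # cost of the underlying research
    ops_cost: float           # cost of the operational implementation

def success_rate(portfolio):
    """Percent of ready research byproducts successfully transitioned."""
    ready = [t for t in portfolio if t.ready]
    return 100.0 * sum(t.transitioned for t in ready) / len(ready)

def active_use_rate(portfolio):
    """Percent of transitioned byproducts actively used in operations."""
    done = [t for t in portfolio if t.transitioned]
    return 100.0 * sum(t.used_in_operations for t in done) / len(done)

def cost_ratio(portfolio):
    """Ratio of total operational implementation cost to total research
    cost; well below one is the healthy regime."""
    return (sum(t.ops_cost for t in portfolio)
            / sum(t.research_cost for t in portfolio))

# Illustrative portfolio: three ready byproducts, two transitioned,
# one actively used in operations.
portfolio = [
    Transition(True, True, True, 500_000, 100_000),
    Transition(True, True, False, 300_000, 80_000),
    Transition(True, False, False, 200_000, 0),
]
print(f"success rate: {success_rate(portfolio):.1f}%")    # 66.7%
print(f"active use:   {active_use_rate(portfolio):.1f}%") # 50.0%
print(f"cost ratio:   {cost_ratio(portfolio):.2f}")       # 0.18
```

Viewing the portfolio as a whole, rather than one transition at a time, is exactly the point made earlier: the numbers only become meaningful when aggregated across a set of recent transitions.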