A Draft Framework for Research Impact Data Standards

Introduction

Organizations that support important research activity, both funders and hosts, must report on the impact of that activity, particularly if they are to sustain support in an environment where available resources are contested. Two factors make impact reporting increasingly difficult. First, research excellence now depends on collaboration among a multiplicity of disciplines, institutions and funders. As a result, research teams face multiple organizations seeking impact reports on the same research activity, with each organization asking for different information. Second, no single generic impact report can encapsulate the needs of a diverse, multi-disciplinary research community.

The goal of the CASRAI Research Impact Standards is to reach agreement on the indicators that measure the impact of research activity (and the investment in that activity) and to encode that agreement in an open data specification for software providers to implement in their solutions. As a result of achieving this goal, researchers will be able to maintain a single source of data (specific to their unique contributions and discipline) from which they can automatically generate impact reports for all of their stakeholders.

Inputs to a CASRAI Standard Framework

Despite superficial differences, the various frameworks employed for measuring impact share a great deal of structure and content. The attached table compares the five frameworks used as inputs to the currently proposed CASRAI impacts framework.

To approach the impact standards in an organized and comprehensive manner, the CASRAI framework should include the subcomponents outlined in the frameworks referenced above, and should be structured for clarity so that diverse users can easily see how their research impact categories map onto the standard.

Standard Catalogue of Indicators

Impact indicators are the fundamental input to all impact measurement. Some indicators are common to most impact reporting (e.g. change in the number of publications in refereed journals as a result of the research activity), while some indicators are used only in very subject-specific reporting (e.g. change in the adequacy of operational support of research infrastructure as a result of the research activity).

A review of the five impact measurement methodologies outlined above produced what could be termed a “standard catalogue” of impact indicators. Each framework uses slightly different labels and groupings and not all frameworks include all indicators.

Impact indicators in a standard catalogue can be seen as falling into three main categories and thirteen sub-categories:

  1. Capacity Impacts
    Human, Leadership, Physical, Integration.
  2. Productivity Impacts
    Direct/quantitative, Direct/qualitative, Derived/quantitative.
  3. Societal Impacts
    Direct economic, Indirect economic, Derived social, Derived health, Derived environmental, Derived cultural.

The tables attached list the specific indicators included in each sub-category. Changes to any of the following indicators attributable to a research activity represent a measure of the overall impact of that research activity.

Considerations for adopting a standard catalogue

In developing an effective standard catalogue, the CASRAI stakeholders must consider and discuss the following:
1. Capturing impacts
2. Linking impact to people and activity
3. Attributing impact to investment.

Capturing Impacts

Impact indicators can be simple counts of one type of data or compound comparisons of multiple types of data in combination. But all impact indicators have one thing in common: they are calculations performed upon source data. The CASRAI Research Impact Data Standards will govern the structure, semantics and format of that source data.

Measuring the impact of research is measuring the change caused by that research. To measure this, each piece of source data MUST include at least two elements: the change and its cause. To fulfil this requirement in a standardized manner, the committee proposes a new concept: Indicator Events.

A CASRAI indicator event is a single recordable occurrence that changes the count of one or multiple impact indicators and that can be related to one or more research personnel or activity.
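The definition above can be sketched as a minimal data record. This is an illustrative sketch only, not part of the proposed standard; the field names (`event_type`, `activity_id`, `researcher_ids`) are assumptions chosen to show that each event must carry both the change (the event itself) and its cause (links to an activity or personnel).

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class IndicatorEvent:
    """A single recordable occurrence that changes one or more
    impact indicator counts (field names are illustrative)."""
    event_type: str                  # kind of occurrence, e.g. "journal_publication"
    date: str                        # ISO 8601 date of the occurrence
    activity_id: Optional[str] = None              # direct link to a research activity
    researcher_ids: list = field(default_factory=list)  # IDs of related personnel
    details: dict = field(default_factory=dict)         # event-specific descriptive fields

# A new journal article, recorded once:
event = IndicatorEvent(
    event_type="journal_publication",
    date="2013-05-01",
    activity_id="ACT-0042",                 # hypothetical activity identifier
    researcher_ids=["0000-0002-1825-0097"], # hypothetical researcher identifier
)
```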

Most indicator events have a one-to-one relationship with a corresponding single indicator. For example, publishing a new article in a journal is an indicator event. Recording it once in a CASRAI CV or in a Personal Impact Dashboard tool (on a laptop, smartphone or tablet) would cause the "number of publications in refereed journals" indicator to increase by one when the impact data of that researcher is included in an aggregate impact report.

Some individual indicator events trigger an incremental change to multiple indicators. For example, winning a research contract on a highly collaborative and multi-disciplinary activity is a single indicator event. However, recording it once in a CASRAI-compliant database would result in an incremental “up-tick” to the counts of: funding received, HQP-in-training, number of international collaborators, level of multi-disciplinarity, smaller network linkages, and level of access of private sector to research groups.
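The one-event-to-many-indicators behaviour described above amounts to a mapping from event types to indicator counts. The sketch below is a hypothetical illustration (the mapping table and indicator names are assumptions drawn from the example in the text, not a published CASRAI vocabulary):

```python
# Hypothetical mapping from an event type to the indicators it increments.
EVENT_TO_INDICATORS = {
    "journal_publication": ["publications_refereed"],
    "research_contract_won": [          # one event, six indicator up-ticks
        "funding_received",
        "hqp_in_training",
        "international_collaborators",
        "multidisciplinarity_level",
        "network_linkages",
        "private_sector_access",
    ],
}

def record_event(counts: dict, event_type: str) -> dict:
    """Record one indicator event: up-tick every indicator it maps to."""
    for indicator in EVENT_TO_INDICATORS.get(event_type, []):
        counts[indicator] = counts.get(indicator, 0) + 1
    return counts

counts = record_event({}, "research_contract_won")
# counts now holds a 1 for each of the six indicators listed above
```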

The opportunity lies in capturing the most indicator data with the fewest administrative steps. The challenge is to standardize the minimum number of indicator events that, if recorded once, will produce all the required impact indicators.

Fortunately, many of the indicator events are already captured in some form as they occur. Researchers already capture some in their various, non-standardized CVs and institutions already capture others in their various information management systems (including paper). All that is required is for the systems now used to capture this data to have key data elements added that make the data-capture “CASRAI-conformant.” This will enable easier “roll-up” reporting on data collected by multiple researchers, teams and departments and aggregated for periodic impact reports.
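The "roll-up" reporting mentioned above is, mechanically, a sum of per-researcher indicator counts into one aggregate. A minimal sketch, assuming each contributor's CASRAI-conformant system can export a simple dictionary of indicator counts:

```python
from collections import Counter

def roll_up(per_source_counts: list) -> dict:
    """Aggregate indicator counts from many researchers, teams or
    departments into a single periodic impact report."""
    total = Counter()
    for counts in per_source_counts:
        total.update(counts)  # sums counts indicator-by-indicator
    return dict(total)

report = roll_up([
    {"publications_refereed": 3, "funding_received": 1},  # researcher A
    {"publications_refereed": 2, "degrees_attained": 1},  # researcher B
])
# report == {"publications_refereed": 5, "funding_received": 1, "degrees_attained": 1}
```

Because every source exports the same standardized indicator names, the aggregation step requires no manual reconciliation.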

What follows is a summary breakdown of:
a) those indicator events currently captured by researchers and students;
b) those indicator events currently captured by institutions; and
c) those indicator events needed but not currently captured.

Indicator events that researchers and students already record

These events are already captured when updating a CV or a reference manager tool. The current CASRAI CV data standards already support the capture of most of these events.

Examples of such indicator events include:
• contributions (outputs) published or presented
• citations (academic)
• mentoring services provided
• degrees attained
• distinctions awarded
• funding awarded (contract or grant)
• new Intellectual Property registered
• employment attained
• program/course creation
• participation in new research activity.

Indicator events that institutions already record

These events are already captured in some form (including paper) at institutions. With the adoption of the proposed CASRAI open data standards for capturing these events, institutions could capture this data using the systems tool of their choice. This source data could be periodically aggregated and become an automated input to a standard Institutional Research Impact Report.

Examples of such indicator events include:
• admissions applications
• employment applications
• new hires
• infrastructure budget approval
• new library holdings
• infrastructure becoming operational

Indicator events not currently recorded

These are events that may not currently be captured systematically by either the researchers or the institutions. If the community agrees that this data is required for full impact measurement, it needs to be captured in a standard-conformant manner, and the community should decide who is responsible for capturing it.
Examples of such indicator events include:
• citations (non-academic)
• recognition of a derived output
• post-research identification of a direct or derived societal impact.

Linking Impacts

As stated earlier, source data for impact indicators must include two components: the change to an indicator and the cause of that change.

There are two ways to link an indicator event to a research activity:
1. Direct link: In recording the indicator event (e.g. a new publication), the event was a clear outcome of an identifiable research activity.
2. Indirect link: In recording the indicator event (e.g. getting a new job in your field), it is not feasible to make a direct link between the event and a single research activity. But a link can be inferred through the person and the activities in which they participated.

It is therefore crucial that the standard data structures for capturing indicator events not only include fields that describe the event (Journal Title, Article Title, Job Title, Date, etc.) but also a field that links the event to a specific research activity (Activity ID).

With this single field (Activity ID), an indicator event is linked to a research activity. Any reports about the overall impact of that research activity will be able to include all events linked to that activity.

In cases where a direct link cannot be made from an indicator event to a specific research activity, there CAN be a link to the unique ID of the researcher (ORCID Researcher ID).
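The two linking paths above can be sketched as a single resolution step. This is an illustrative sketch, not the standard itself; the event shape and the `researcher_activities` lookup table are assumptions, and the identifiers are hypothetical:

```python
def activities_for_event(event: dict, researcher_activities: dict) -> set:
    """Return the activity IDs an indicator event counts toward.

    Direct link: the event carries an Activity ID.
    Indirect link: infer activities via each linked researcher's participation.
    """
    if event.get("activity_id"):
        return {event["activity_id"]}
    inferred = set()
    for rid in event.get("researcher_ids", []):
        inferred.update(researcher_activities.get(rid, []))
    return inferred

# A new-job event with no direct activity link, resolved via the researcher:
event = {"event_type": "employment_attained",
         "researcher_ids": ["0000-0002-1825-0097"]}   # hypothetical ORCID iD
links = activities_for_event(event, {"0000-0002-1825-0097": ["ACT-0042"]})
# links == {"ACT-0042"}
```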

Attributing Impacts

A major problem in measuring the impact of research activity is attributing the impacts to the investments made. This is not a simple problem to solve. However, open data standards that enable recorded impacts to be linked to people and activities will allow every funder of an activity to draw a contribution link between its investments and the resulting impacts.
