On 23 April 2024, the first ‘National Research Software Day’ took place in Hilversum, the Netherlands. During the unconference part of the program, Deekshitha ran a session titled ‘Decoding Research Software Impact: A Collaborative Journey’, in which a model for research software impact was at the center of the discussion.
Authors: Deekshitha¹, Carlos Martinez Ortiz¹, Rena Bakhshi¹, Jason Maassen¹, Rob van Nieuwpoort², Slinger Jansen³, Ton Smeele³, Andrew Treloar⁴, Louise Bezuidenhout², Maarten Schermer³, Laurents Sesink⁵, Pui-Kei Fung³, and Colette Bos¹
In this session, participants collaborated on the topic of research software impact. Taking the perspectives of policy makers, funders, researchers, and Research Software Engineers (RSEs), they explored research software impact in four 15-minute rounds, each time attempting to answer a set of questions from the perspective under consideration. After each round, the participants changed groups to cover a different perspective. The workshop aimed to define the impact of research software from the viewpoints of these four stakeholder groups. Here, the term impact refers to the difference that research software makes to the research process or community.
Measuring the impact of research software is the main focus of Deekshitha’s PhD study. Currently, there are no adequate methods for accurately measuring research software impact. Her study seeks to develop an impact model that will enable all stakeholders of research software to measure its impact effectively and ensure recognition for the individuals behind its development. During the session, four questions were discussed from each perspective:
- What are the goals of measuring research software impact?
- How can research software impact be measured?
- Which measures correspond to which goal?
- How should these goals be prioritized?
Researchers’ and Research Software Engineers’ perspective
From the researchers’ and RSEs’ perspective, we identified six goals, as shown in Figure 1. Among these, recognition and building a track record for funding are essential goals for this group. Despite differing priorities, researchers and RSEs share similar perspectives; they are therefore represented together in a single diagram illustrating both viewpoints.
Figure 1 illustrates the goals, the factors that help measure them, and their interdependencies. The first branch represents the goals (answering question 1), the second branch illustrates how to measure those goals (the impact factors and their connection with the goals, answering questions 2 and 3), and the third branch shows their prioritization values (answering question 4) along with the corresponding stakeholder group (Researchers and RSEs; National and Institutional; no other group was identified for Funders). This prioritization resulted from discussions with the workshop participants, with values starting from 1 for the highest priority. For example, in Figure 1, three impact factors are grouped under Researchers. Based on the discussion, the impact factor ‘citations’ has a higher priority than the other factors, so it is ranked 1 in the Researchers category. Grey-colored boxes in front of the impact factors indicate the groups they belong to (e.g., Researchers and RSEs). Impact factors unrelated to the goals (or for which we could not find a relation with the identified goals) are presented in separate black-colored boxes (see Figure 2 and Figure 3).
Funders’ perspective
Funders that support research software projects, such as NWO, SURF, and the eScience Center, prioritize demonstrating value for money and aligning with international strategic goals such as Open Science. By emphasizing research software impact, funders can act as change agents, influencing research and ensuring it aligns with broader institutional objectives. Selecting projects with significant impact enhances the funders’ image and showcases their commitment to supporting valuable and innovative research. Furthermore, by funding unique and impactful software, funders contribute to research and educational institutions and enhance their overall value.
Funders can measure the impact of research software using various metrics and factors, such as reusability, reuse (with caution), support letters from users, in-kind support, and many others listed in Figure 2. Additionally, projects should comply with requirements such as FAIRness, data management plans (DMPs), and software management plans (SMPs).
Policy makers’ perspective
Policy makers and funders play distinct yet interconnected roles within governance and funding, particularly in research, education, and public policy sectors. Funders focus primarily on providing financial support and resources to achieve specific objectives and outcomes. On the other hand, policy makers are responsible for formulating and implementing regulations, guidelines, and directives that shape the strategic direction and operational frameworks of institutions and organizations.
In terms of policy making, we realized that a distinction should be made between national and international policies in order to ensure alignment with the institutional mission and purpose, maintain the independence and sovereignty of research institutes, and adequately consider risk assessment, marketability, and valorization. Figure 3 shows the goals, impact factors, and their groups with priority values from the policy makers’ perspective.
Take home message
The collaborative and dynamic nature of the session made it a memorable experience, combining serious discussion with a touch of fun. This approach not only sparked insightful conversations but also strengthened the community’s commitment to advancing the understanding and measurement of research software impact. Additionally, all participants agreed that RSEs are not appropriately recognized for their work, and that recognition is one of the main goals of measuring impact. Funders have Open Science goals to achieve, and policy makers want to ensure value for money.
We look forward to continuing this journey and invite all who are passionate about the transformative power of research software to join us. Together, we can build a robust and internationally recognized model for assessing research software impact. If you’re eager to join this impactful journey, reach out and we can connect over a cup of coffee to further the discussion.
Credit: The diagrams in this blog post were created with the free version of EdrawMind.
Footnotes