Insights from the “Open Science: Monitoring Progress, Assessing Impact” Conference
We're bringing you the key takeaways from the recent international conference "Open Science: Monitoring Progress, Assessing Impact," which took place on July 7–8, 2025, at UNESCO Headquarters in Paris and online. This two-day event brought together global experts, policymakers, and Open Science practitioners to explore state‑of‑the‑art approaches and future directions in monitoring and assessing the impact of Open Science.
Organized by UNESCO, OpenAIRE, PathOS, EOSC Track, and the Open Science Monitoring Initiative (OSMI), the conference offered a unique platform for exchanging knowledge, refining tools, and aligning efforts across the globe on Open Science. It marked a key milestone for several major initiatives – for OpenAIRE in particular, it was both the culmination of the PathOS project, which presented actionable insights into the impact of Open Science practices, and the official launch of the EOSC Open Science Observatory, a central deliverable of the EOSC Track project. An important part of the event was also the launch of the OSMI Principles of Open Science Monitoring and a global perspective presented by UNESCO.
Key Takeaways: Building a Shared Vision for Open Science Monitoring
The conference opened with welcoming remarks by Lidia Brito, Assistant Director‑General for Natural Sciences at UNESCO, who described the event as a "global defining moment for the Open Science movement". Across all sessions, a few key themes emerged that transcend individual projects or policies. First, monitoring must be more than measurement: it must be meaningful, rooted in context, and designed to inform improvement, not just compliance. Whether through national dashboards, institutional pilots, or global frameworks, speakers emphasized the importance of causal thinking, shared definitions, and participatory approaches to understanding Open Science's real-world effects. Second, there was a strong call to center equity, diversity, and inclusivity in both what is being monitored and how. From the role of the Global South in shaping global evidence to the need for capacity-building and local adaptation, it was clear that Open Science can only thrive through collaboration across levels and regions. Finally, the idea that monitoring is a learning process (iterative, evolving, and co-created) was echoed throughout. The tools, frameworks, and stories shared at the conference signal a shift toward a more reflective and coordinated approach to tracking the progress and impact of Open Science.
In the following sections, we take you through the highlights of each session, showcasing how these themes played out in practice: from PathOS's causal approach to impact and the launch of the OSMI Principles, to the launch of the EOSC Open Science Observatory and UNESCO's global perspective.
Watch the recording from the conference here
Open Science in Practice: Tools, Evidence, and Insights from the PathOS Project
The conference provided a dedicated space to explore the results of the PathOS project - a three-year initiative funded by the European Union to better understand and measure the impacts of Open Science. What made PathOS distinct was its commitment to causality. Rather than just observing correlations, the project worked to trace how Open Science practices actually lead to outcomes: what changes, for whom, and under what conditions. PathOS combined data-driven approaches, AI-assisted analyses, qualitative insights, and policy-oriented frameworks to explore these causal links. The three sessions dedicated to PathOS provided both a window into the project's methodology and a wealth of insights for the Open Science community.
Over three years, PathOS has built an Evidence Base (comprising scoping reviews, detailed maps of Key Impact Pathways, and deep‑dive case studies) and a Toolkit that includes an Indicator Handbook, reusable code and datasets, a Cost‑Benefit Analysis (CBA) framework, and a modular training programme. Learn more about these in the PathOS Open Science Resource Hub.
But beyond the outputs, Ioanna Grypari (PathOS Coordinator, Technical Manager, ATHENA RC & OpenAIRE) emphasized the how. Measuring impact, as the team argued, demands forethought, quality data, and causal reasoning. Quantitative data is not enough. Understanding whether and how Open Science leads to impact requires triangulation with expert insights and qualitative methods that can trace what happens after a user engages with an open resource.
Case studies from diverse contexts (e.g. from ELIXIR's open datasets to the RCAAP repository in Portugal and climate publications in repositories) illustrate how Open Access resources support innovation, increase visibility, and reduce costs.
The key lessons? Start with the right question, not with a predefined metric; culture change often matters more than technical tools; building trust requires transparency; and strong insights emerge when big data and qualitative data are combined with expert knowledge. Moreover, effective monitoring needs forethought, quality data, and causal logic: not just counting, but understanding.
Strategic Lessons from PathOS: What Matters for Open Science Policy
The second session zoomed out to reflect on what PathOS reveals to policymakers. The project defined impact not as a narrow result but as a lasting change in how knowledge is created, shared, and used. Yet even with this clarity, it became evident that the field still grapples with basic definitions. What does "openness" mean? What counts as "equitable access"? These questions remain unresolved and slow down both research and policy alignment.
The evidence landscape, as PathOS found, is fragmented and often focuses more on tracking whether Open Science practices are being adopted (uptake) rather than on assessing their real-world effects and benefits (impact). Meanwhile, the impact of Open Science on research quality is underexplored. And crucially, the role of the Global South is frequently neglected, both in terms of the evidence base and in global conversations about Open Science's future.
The session called for more attention to overlooked dimensions: to equity, diversity, and participation, and to building capacity for meaningful engagement with open resources. For impact to be fully understood, it must be studied in context: geographically, institutionally, and socially. A full synthesis of these findings will be presented in the upcoming PathOS synthesis report, which will include three policy briefs, so stay tuned for more detailed insights!
From Monitoring to Meaning: Applying Open Science Indicators and Assessing Value
The third PathOS session introduced two key tools that link monitoring to strategic decision-making: the Open Science Impact Indicator Handbook and the Cost-Benefit Analysis (CBA) framework.
The Indicator Handbook is a living resource designed to support monitoring efforts with meaningful, context-sensitive indicators. It emphasizes a critical point: indicators are only as good as the thinking behind them. As Vincent Traag (CWTS) mentioned, without causal reasoning, indicators can lead to misleading conclusions - especially when multiple changes are happening at once, making attribution difficult. For instance, more citations may appear to signal greater impact, but unless we understand whether openness (of data, code, or other outputs) actually caused the change, such indicators can be misleading. In terms of the indicators themselves, definitions and a shared taxonomy are essential to clarify what is being measured and why. Without this shared understanding, comparisons across institutions, disciplines, or countries become unreliable.
The CBA framework offers a structured way to weigh the costs and benefits of Open Science interventions. It was applied to two PathOS case studies, those examining ELIXIR and RCAAP, to communicate value to funders and stakeholders. The findings were compelling - for example, RCAAP's estimated benefits, primarily from labour cost savings, exceeded its operational costs by 32 percent over a 20-year horizon.
Still, conducting a CBA is not trivial. Gathering the right data, identifying stakeholders, and defining counterfactuals all require careful design. But when the process is planned from the start with a CBA mindset, it becomes far more manageable. The session concluded that while burdensome at times, CBA can be a powerful tool to support sustainability and funding for Open Science resources - especially in a policy landscape that increasingly demands economic justification. Learn more about the CBA framework for Open Science here.
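To make the basic mechanics concrete, here is a minimal sketch of how a benefit-cost comparison over a multi-year horizon can be computed. All figures below are hypothetical placeholders, not the actual RCAAP or ELIXIR numbers, and the real PathOS framework involves much more (stakeholder mapping, counterfactuals, non-monetary benefits):

```python
# Illustrative, simplified cost-benefit calculation for an Open Science
# service, in the spirit of a CBA over a multi-year horizon.
# All figures are hypothetical and chosen only for demonstration.

def npv(cashflows, rate):
    """Net present value of yearly cashflows (year 0 first)."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cashflows))

years = 20
discount_rate = 0.03          # assumed social discount rate
annual_cost = 100_000.0       # hypothetical yearly operating cost
annual_benefit = 140_000.0    # hypothetical yearly benefit (e.g. labour cost savings)

pv_costs = npv([annual_cost] * years, discount_rate)
pv_benefits = npv([annual_benefit] * years, discount_rate)
bcr = pv_benefits / pv_costs  # benefit-cost ratio; > 1 means net positive

print(f"Present value of costs:    {pv_costs:,.0f}")
print(f"Present value of benefits: {pv_benefits:,.0f}")
print(f"Benefit-cost ratio:        {bcr:.2f}")
```

With constant yearly flows, discounting scales costs and benefits alike, so the ratio reduces to the annual one; in practice, benefits and costs vary over time, which is where discounting and careful data gathering matter.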
OSMI: Establishing Global Principles of Open Science Monitoring
The conference also marked an important step for the Open Science Monitoring Initiative (OSMI), which officially presented its Principles of Open Science Monitoring. Developed through a global consultation of over 170 experts from 42 countries, these principles aim to guide countries and institutions in building monitoring systems that are relevant, transparent, and responsibly used. The principles rest on three pillars: Relevance and significance; Transparency and reproducibility; and Self-assessment and responsible use.
Four working groups were set up to advance Open Science monitoring, underpinned by principles that ensure inclusivity and global relevance, with particular attention to perspectives from the Global South. Next steps include launching two additional groups on capacity building and technical specifications, developing a self‑assessment grid, translating the principles, and creating localized use cases. Designed as a flexible guide rather than rigid standards, these principles help institutions assess their Open Science practices, monitor progress, and align strategies, while supporting accountability, equitable access, and informed policy planning through a unified yet adaptable framework.
Scaling Open Science Monitoring
During the session Scaling Open Science Monitoring, the spotlight was on how Europe is advancing its capacity to track and support the implementation of Open Science policies. As presented by Stefan Liebler (European Commission), the ERA Policy Agenda 2025–2027 puts Open Science at its core, identifying "Enabling Open Science via sharing and re-use of data, including through the EOSC" as a structural priority and "Equity in Open Science" as a key action. Monitoring efforts within EOSC are evolving to meet these ambitions, including through the co-programmed partnership (EOSC Association), the EOSC Steering Board's annual Survey on National Contributions to EOSC and Open Science, and future plans to monitor the EOSC Federation (consisting of multiple EOSC Nodes that are interconnected and can collaborate to share and manage scientific data, knowledge, and resources within and across thematic and geographical research communities).
The second part of the session focused on the launch of the next phase of the EOSC Open Science Observatory, a data-driven policy intelligence platform that monitors Open Science developments across Europe. As described by Tereza Szybisty (OpenAIRE), the Observatory offers a transparent and comprehensive overview of national policies, practices, and trends. Built in co-creation with the community, the platform is fully open by design: its code, data, and visualisations are openly licensed, documented, and available for reuse. Several data sources feed into the Observatory, starting with the Survey on National Contributions. The second major source is the OpenAIRE Graph, one of the world's largest open scholarly knowledge graphs, which links policy developments to actual research outputs such as publications, datasets, and software. Additional inputs include qualitative country narratives gathered by OpenAIRE's National Open Access Desks, the upcoming European Open Science Resources Registry (which will centralise policy documents), and data from Eurostat and CoARA to complement the broader research and assessment landscape.
Reflecting on the process of building and evolving the Observatory, several lessons emerged. First, monitoring Open Science is a learning journey - some questions still cannot be fully answered, not due to missing data but because the systems are still developing. The Observatory serves not just as a dashboard, but as a tool for reflection, aspiration, and shared progress. The need for shared definitions across countries was a recurring theme, as was the value of targeted support (including tools like the EOSC Survey Café and expert support) to enhance participation and data quality. Stakeholder engagement in shaping both the Observatory and its underlying framework has been crucial for ensuring relevance. Lastly, visualisation tools have proven essential in transforming complex data into actionable insights. A key takeaway echoed by all speakers: monitoring must remain adaptive, community-driven, and forward-looking - a resource that supports learning and improvement, not just a compliance checklist.
UNESCO: A Global Vision for Open Science Monitoring
Rania Sabo (UNESCO) presented progress on the implementation of the 2021 Recommendation on Open Science, adopted by 194 Member States as the first international framework for Open Science. The first global consultation involving 77 countries highlighted significant advances in open access policies, national infrastructures, and capacity building. However, challenges remain, including financial constraints, infrastructure gaps, fragmented data, and the dominance of commercial publishing. Panelists from Saudi Arabia, China, and Côte d'Ivoire shared experiences in building national monitoring systems, emphasizing that monitoring should focus on connections, narratives, and processes rather than only on outputs. The results of these consultations will feed into the Open Science Outlook 2026, which aims to provide a global overview of progress and remaining challenges.
The Role of RPOs and RFOs in Open Science Monitoring: Connecting Institutional Practices to Policy Aggregation
The final joint session, co-organized with the OPUS project, focused on how research-performing and research-funding organizations can align their monitoring with national and global policy frameworks. Experiences from the OPUS pilot universities showed both the potential and the challenges – such as low institutional readiness – of implementing monitoring systems that inform research assessment and national policies.
Natalia Manola (OpenAIRE) described how tools like the OpenAIRE Graph and OpenAIRE MONITOR dashboards can link and curate data to bridge institutional practices and policy-making.
Discussion underscored the need for core adaptable indicators, compatible infrastructures, and shared definitions, as well as the need to avoid misusing monitoring for punitive decision‑making.
Keeping the Momentum: Advancing Open Science Together
As the conference drew to a close, one message stood out clearly: meaningful progress in Open Science depends not only on strong policies and innovative tools, but also on collective action and shared responsibility. The insights, frameworks, and collaborations showcased over these two days demonstrated the growing global momentum to monitor and assess Open Science in a way that is inclusive, data-driven, and impact-oriented. With initiatives like PathOS, EOSC Track, OpenAIRE, and OSMI, and the ongoing commitment of UNESCO and the European Commission, we are laying the foundations for a more transparent, equitable, and effective Open Science ecosystem - one that serves researchers, policymakers, and society at large.
We extend our sincere thanks to all participants for their engagement and invaluable contributions, to the PathOS partners (OpenAIRE, CWTS), the EOSC Track project, the French Ministry of Higher Education, Research and Innovation, and Inria for their support, and to UNESCO for generously hosting this important gathering.