Weekly digest: OASPA equity recommendations, Open Research Integrity and Peer Review Week

Sophie Nobes

This week, we signpost the launch of a public consultation on OASPA’s recommendations for equity in open access and the upcoming Barcelona Declaration Conference on Open Research Information. We read about the announcement of the theme of Peer Review Week 2024 and about the updated COPE guidelines for editors investigating image manipulation, and we explore article retraction trends since 2000. Finally, we read about new guidelines for improving the sustainability of digital scientific resources.

Have your say: OASPA draft recommendations via OASPA

The Open Access Scholarly Publishing Association (OASPA) has released draft recommendations for increasing equity in open access (OA) publishing. Drawing on learnings from the 2023 Equity in OA workshop series, the recommendations focus on the financial barriers that impede OA publishing. Feedback from anyone interested in OA is invited until 1 July.

Conference on Open Research Integrity via Barcelona Declaration

The Barcelona Declaration on Open Research Information has announced the inaugural Conference on Open Research Information. At the conference, which will be hosted by Sorbonne University, Paris, France, from 23 to 24 September, signatories and supporters of the Barcelona Declaration will develop a joint roadmap for the future of open research information. Registration is now open for signatories and supporters, with limited spaces available for organizations that are considering endorsing the Declaration. Proposals for talks that will support signatories in fulfilling their commitment to open research information are invited until 10 July.

Peer Review Week 2024 via The Scholarly Kitchen | 4-minute read

Running from 23 to 27 September, Peer Review Week 2024 will focus on the theme of ‘Innovation and Technology in Peer Review’. Chosen by a global poll and a dedicated steering group, the theme aims to encourage dialogue on evolving practices and the future of peer review in the age of artificial intelligence and automation. Those planning to participate in this community-led event are encouraged to register their activities and resources for promotion.a

COPE guidelines for investigating image manipulation via COPE | 2-minute read

How should suspicions of inappropriate image manipulation in research articles be investigated? This workflow, developed by the Committee on Publication Ethics (COPE) and Springer Nature, provides journal editors with a step-by-step process to follow when investigating cases of suspected image manipulation.

Research misconduct more prevalent than ever? via Nature | 4-minute read

“Research misconduct has become more prevalent in Europe over the last two decades”, concludes this article published in Scientometrics. The study found that retraction rates of biomedical science papers by European authors increased fourfold between 2000 and 2021. But do these results demonstrate a true increase in misconduct, or do they reflect better detection methods and increased vigilance from the scientific community?a

O3 Guidelines for a sustainable scientific record via Nature | 22-minute read

Scientific resources often become outdated or unavailable owing to funding issues, staff turnover or shifting priorities. In this article, Charles Tapley Hoyt (Senior Scientist at Northeastern University) and Benjamin Gyori (Associate Professor at Northeastern University) propose the Open Data, Open Code, Open Infrastructure (O3) Guidelines for the creation and maintenance of a sustainable scientific record. These guidelines provide a roadmap that encourages the use of open data and code, community involvement in curation, and decentralized governance models to ensure the continued use and maintenance of resources.a

Enjoy our content? Read last week’s digest and check out our latest guest blog!

Don’t forget to follow us on Twitter/X and LinkedIn for regular updates!

aPaige – a generative AI tool created by Oxford PharmaGenesis – was used to create an early draft of this summary. Paige uses OpenAI’s GPT Large Language Models, securely and privately accessed from within Microsoft’s Azure platform. The AI-generated output was reviewed, modified or rewritten, and checked for accuracy by at least one member of the Open Pharma team. The news pieces included in the weekly digest are curated by the Open Pharma team without the use of AI.