FAQ General Questions
Do you provide a tool to aggregate, combine or compare the data collected in the Journal Monitor (OAM-CH) and the Repository Monitor?
This project does not provide a tool for in-depth aggregation, combination or comparison of the underlying data. However, some institutions provide underlying datasets so that data specialists can use the Journal Monitor data (provided by FZ Jülich) and triangulate it against the data collected via the repository survey.
How do you plan to improve on the data aggregation in the future?
Depending on contributions from swissuniversities members, a follow-up project, carried out in cooperation with an organisation that operates the data aggregation and hosts the monitoring data, could improve data aggregation in the future.
How do you address the bias that arises from using an international publication database?
The use of data from international publication databases leads to disciplinary and linguistic bias due to the heavy concentration on science, technology and medicine, as well as the overrepresentation of scientific articles in English. To provide an alternative source of data, we conduct an annual national survey among swissuniversities members. The repository data include publications in any language and from any discipline.
Do you plan to offer a cost monitor in the future and how can the institutions support this endeavour?
A cost monitor is out of scope of this project. Nevertheless, we encourage all swissuniversities members to submit data on publication fees (i.e., APCs) to the OpenAPC initiative.
FAQ Journal Monitor (OAM-CH)
How is the Journal Monitor (OAM-CH) different from the German OAM?
We adapted the Monitor to the project’s requirements as a monitor of OA shares; a cost monitor is not within the scope of this project. You can filter publications by canton and by swissuniversities member. We use a simplified representation of the OA categories, e.g., we categorize Bronze as Closed. We also use OpenAlex as the only source database (without the Web of Science and Scopus databases).
How often are the Journal Monitor (OAM-CH) data updated?
The data are updated every weekend, so up-to-date data are available for your analyses every Monday. Please note that we have no influence on how current the source databases are.
Is it possible to estimate what percentage of journals published by international scientific publishers is covered by the Journal Monitor (OAM-CH)?
The Journal Monitor (OAM-CH) analyses all publications from scientific journals that have a DOI, independently of the OA status of the articles. The OA status is assigned to publications via their DOI using data from Unpaywall. Coverage depends on the scientific field: in the humanities, for example, the DOI has not yet become established across the board, whereas in the geosciences over 90 percent of (journal) publications from the last few years are covered by the Journal Monitor (OAM-CH).
What does the category Closed mean?
On the publications level, Closed means that an article was neither published Open Access nor could a freely accessible version of the article be found (this also includes Bronze articles). Articles can be categorized as Closed, Hybrid, Green, Gold or Diamond. On the publishers and journals levels, only a distinction between Closed/Hybrid and Open Access is made: Closed/Hybrid means that a publisher does not publish exclusively OA, or that the articles in a journal are not exclusively OA. Subscription publishers or subscription journals (both labelled Closed/Hybrid in the OA monitor) may still publish Hybrid, Bronze (subsumed under the Closed category) or Green Open Access articles; these articles are labelled accordingly on the publications level.
Why do you count Bronze OA as Closed?
Bronze articles are free to read on the publisher’s website, without a license that grants any other rights. There may be a delay between publication and availability to read, and often articles can be removed unilaterally by the publisher. This is why we categorize Bronze as Closed.
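As an illustration, this simplified categorization can be sketched as a mapping from Unpaywall's oa_status field to the monitor's categories. This is a minimal sketch, not the monitor's actual pipeline: the oa_status values are Unpaywall's, but the mapping table is our illustration, and Diamond cannot be derived from oa_status alone (it would require additional data, e.g., APC information from DOAJ).

```python
# Minimal sketch: map Unpaywall's oa_status values to the monitor's
# simplified categories. Bronze is folded into Closed because the
# publisher grants no reuse licence and may withdraw access.
SIMPLIFIED = {
    "gold": "Gold",
    "hybrid": "Hybrid",
    "green": "Green",
    "bronze": "Closed",  # free to read, but no licence -> counted as Closed
    "closed": "Closed",
}

def simplify(oa_status: str) -> str:
    # Unknown or missing statuses default to Closed.
    return SIMPLIFIED.get((oa_status or "").lower(), "Closed")
```

For example, simplify("bronze") yields "Closed", reflecting the rule explained above.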
Do I have to register my institution to be able to carry out analyses of my own publications via OAM-CH?
You do not need to register your institution to use the OA monitor.
Do I have to create a user account in Journal Monitor (OAM-CH)?
It is not necessary to create a user account for the OAM-CH.
When searching for publications by my institution, do I obtain results for all publications we are involved in, or only for those with a corresponding author from our institution?
The search result in the OA monitor always refers to all institutions involved in a publication, not only to the institution of the corresponding author.
In the Journal Monitor (OAM-CH), a particular article is labelled as Green. Why am I brought to a Closed version when I click on the DOI that is displayed?
If your institution has subscribed to the journal in which the selected article appeared, you will automatically be directed to the paywalled or Closed version on the publisher’s website. This is independent of the OA model assigned to the article in the monitor.
In the Journal Monitor (OAM-CH), some publications are incorrectly assigned to our institution. Who do I contact about this?
Please send an email indicating the incorrectly assigned publications to: oamonitor@consortium.ch.
From which source database does the publication date originate? What date is this exactly?
The publication date is taken from Unpaywall. In Unpaywall, the field is described as follows: “As reported by the publishers, who unfortunately have inconsistent definitions of what counts as officially ‘published’. Returned as an ISO8601-formatted timestamp, generally with only year-month-day.” (see Unpaywall, published_date).
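For those working with the underlying data, reading this field from Unpaywall's REST API can be sketched as follows. The v2 endpoint and the published_date field are documented by Unpaywall; the helper function names here are our own illustration.

```python
# Sketch: retrieve published_date for a DOI from the Unpaywall REST API.
# The free v2 endpoint only requires an email address as a query parameter.
import datetime
import json
import urllib.request

def parse_published_date(raw):
    # Unpaywall returns an ISO8601 string, generally year-month-day,
    # or null when the publisher reported no date.
    return datetime.date.fromisoformat(raw) if raw else None

def fetch_published_date(doi: str, email: str):
    url = f"https://api.unpaywall.org/v2/{doi}?email={email}"
    with urllib.request.urlopen(url) as resp:  # network call
        record = json.load(resp)
    return parse_published_date(record.get("published_date"))
```

Keep in mind the caveat quoted above: publishers define "published" inconsistently, so the parsed date should be treated as approximate.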
FAQ Repository Monitor
How often is the Repository Monitor data updated?
The data is updated annually.
Which criteria did you use to select the resource types for the repository survey?
This survey builds on a survey on Open Access Monitoring conducted among the members of the Arbeitskreis Open Access (AKOA) in April 2020. Based on the results of that survey, and confirmed by the project task force, the resource types were selected as the most representative and relevant ones for a survey at the national level.
Which standard vocabularies do you use in the repository survey?
The Open Access typology we use is a simplified version of the Unpaywall definition, see Wiki entry on Open Access typology. For the definition of resource types we mainly followed the COAR 3.0 vocabulary, see Wiki entry on resource types. The institutions are invited to deliver the underlying metadata on item level based on metadata described in the OpenAire Guidelines for institutional and thematic Repository Managers, see Wiki entry on data collection.
Why did you expand the COAR 3.0 definition of journal article and include “professional article” in the survey?
Institutional repository services put much work into negotiating self-archiving options for professional articles with specialist publishing houses, which, unlike scientific publishing houses, often do not have a designated OA policy. These negotiations allow versions of the published content to be self-archived in the repositories and made OA in the process. We expect the Green OA share of professional articles to increase over the following years, and monitoring this progress will indicate how successful the negotiations with specialist publishers are. Institutions that decide to deliver professional article data are requested to deliver underlying datasets that include metadata to differentiate between scientific and professional articles for further analysis, if available.
Why did you include co-authored resources and not limit the figures to corresponding or first/last author?
Firstly, not all institutional repositories are currently equipped to deliver the data in that way. Secondly, in some disciplines the differentiation between corresponding, first and last author is less relevant. Note that this practice results in the same article being claimed by several institutions, depending on the authors’ affiliations.
I do not see any kind of data mapping for this survey. Can you explain?
A refined metadata mapping process to define national standards of metadata quality for the national OA Monitoring is not part of this low-threshold survey and exceeds its scope and short time horizon; it would also have meant a disproportionate encroachment on the autonomy of the institutions. The final report Monitoring the open access policy of Horizon 2020 noted a “lack of consistent and rigorous practices” (consistent with the official guidelines of OpenAIRE) in how many repositories handle metadata on publications. This needs to be kept in mind when looking at the delivered data, but this project is not mandated to set national standards on metadata quality. The heterogeneity of Swiss HEIs also needs to be considered, as there are different degrees of OA implementation and support at the institutional level.
As the specialist in charge of delivering the data, how can I create the requested Open Access categories using my repository data?
Institutions can construct the Open Access categories step by step from metadata elements (repository, journal, access rights, etc.), or use providers like Unpaywall to retrieve this information. Content without a publisher’s DOI that is available Open Access on the repository (and carries a Creative Commons licence) can usually be categorized as Green.
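The step-by-step construction can be sketched as a simple decision chain. This is a hypothetical sketch: all field names are assumptions for illustration, not the survey's actual schema.

```python
# Hypothetical decision chain for deriving the advanced OA categories
# from repository metadata. Field names are illustrative assumptions.
def categorize(record: dict) -> str:
    if record.get("in_oa_journal"):          # journal is fully OA (e.g., listed in DOAJ)
        return "Gold"
    if record.get("publisher_version_oa"):   # OA at the publisher, subscription journal
        return "Hybrid"
    if record.get("repository_copy_oa"):     # freely accessible copy on the repository
        return "Green"
    return "Closed"

# A record without a publisher's DOI but with an OA repository copy
# under a Creative Commons licence would land in Green:
example = {"repository_copy_oa": True, "has_cc_licence": True}
```

The order of the checks matters: the category of the best (most open) publication channel wins before falling back to the repository copy.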
As the specialist in charge of delivering the data, how do I include preprints in the survey?
The institutions are requested to count the best available Open Access version across all instances of a publication. For this project, publications are only counted if they are available in a reviewed version (post-print or published version). Submitted or preprint documents that have not yet been published shall not be counted. In addition, if the best available OA version of a publication is the preprint, the publication shall be counted as Closed. We follow the European Commission’s report Monitoring the open access policy of Horizon 2020, which defines publications in the context of OA policies and monitoring as “Peer-reviewed scientific publications, encompassing articles, books, book chapters, monographs, etc.”, specifically excluding preprint publications (journal articles, book chapters, books) that have not undergone a peer review or editorial review.
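The "best available reviewed version" rule can be sketched as a ranking over the instances of one publication. This is a minimal sketch under our own assumptions: the version labels and the ranking values are illustrative, not prescribed by the survey.

```python
# Sketch: keep the best reviewed OA instance of a publication;
# a publication whose only OA instance is a preprint counts as Closed.
RANK = {"Gold": 4, "Hybrid": 3, "Green": 2, "Open": 1}

def best_category(instances):
    """instances: list of (category, version) tuples for one publication,
    where version is 'published', 'postprint' or 'preprint'
    (labels are illustrative assumptions)."""
    reviewed = [cat for cat, ver in instances
                if ver in ("published", "postprint")]
    if not reviewed:
        return "Closed"  # only preprints (or nothing) available OA
    return max(reviewed, key=lambda c: RANK.get(c, 0))
```

For example, a publication whose only OA instance is a Green preprint yields "Closed", while one with a Green post-print and a Gold published version yields "Gold".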
My institution publication reference system (e.g., repository) is not fit to provide the data you are asking for. What should I do?
This project acknowledges that information on the review type might not be a common element of the publication metadata provided by the repositories; it therefore allows for some fuzziness and relies on the data given by the repositories themselves. Review practices also differ between research areas, and there is no authoritative system defining, for each resource type, which journal, book, etc. to categorize as peer-reviewed or editorially reviewed. Please contact oamonitor@consortium.ch for further questions.
Why haven’t all participating institutions provided advanced Open types and/or all four years for the Repository Monitor?
Our goal of providing data for objective, valid and reliable indicators from institutional repositories must at the same time acknowledge the different stages of development of the repositories and meet the needs of the many. For this national survey, we aimed for the largest possible number of participating institutions. This is why we introduced the low-threshold “basic” typology “Open” as an alternative to the “advanced” Green, Gold and Hybrid, and also why we included data from the last 1-2 years (instead of the last four).
Which tools do you use to improve on the data delivered by the institutions?
It is not within the scope of this project to validate the delivered data against secondary sources, be they open data sources (e.g., OpenAIRE, Unpaywall, CrossRef, OpenAPC, DataCite, ORCID, DOAJ) or proprietary databases (e.g., Dimensions, Scopus or Web of Science). Using multiple data sources for cross-validation (triangulation) would have improved coverage and accuracy.