Making the Case for Open Work in Evaluation, Tenure, and Promotion

The evaluation of academic work is a hot topic, and not just at RIT. Heated discussions about “Impact Factor,” the raw bean-counting of systems like Google Scholar, the value of one journal or conference over another, their acceptance rates, and the value of “alt-metrics” are happening across academia. In their recently released “Guide to Supporting Open Scholarship for University Presidents and Provosts,” the National Academies of Sciences, Engineering, and Medicine call for a review of policies, including “…academic hiring, review, tenure, and promotion (valuing diverse types of research products; metrics that incentivize the open dissemination of articles, data, and other research outputs; and valuing collaborative research),” to better align them with open scholarship.

Metrics can be a quick way of indicating the “impact” of a given piece of work, or at least a base level of awareness of it. The harder challenge is evaluating the work itself, especially when the evaluators may not be familiar with the domain of the person being assessed.

In 2013, the RIT President, Provost, Dean of the Golisano College of Computing and Information Sciences, and Director of the School of Interactive Games and Media (SIGM) signed a set of guidelines for tenure and promotion to supplement traditional policy.

The guidelines delineate the types of work SIGM faculty produce and potential methods for evaluating them, and they communicate that work, which differs significantly from traditional computer science, to faculty, committees, and administrators in computer science and related disciplines. The document was created to address the challenge described above, particularly because the work of SIGM faculty often does not fit the “universally understood” academic metrics. It is linked under “Longer Reads” in the list below, and Open Source (which can be read in this context as “Open Work”) is part of the overall document. You may find it helpful in structuring your own case. It’s important to note that while the document is signed by administrators, it was created in conjunction with SIGM’s full professors and several full-professor candidates.

The resource list below draws from a wide variety of disciplines, but we suggest reading beyond the entries from your own domain, as you’ll likely find helpful thoughts and framing in all of them.

Comments on how to improve this document are encouraged.

Quick Reads

A draft metric from the CHAOSS community, which is seeking comments. CHAOSS creates metrics, analytics software, and dashboards for the open source software ecosystem to paint a picture of community health based on data from repositories and other inputs. The metric here is an early draft of its first attempt at creating metrics for academia. The team at Open@RIT has developed a plug-in that allows CHAOSS’s software tools to pull data from the Center for Open Science’s OSF platform; a minimal sketch of this style of repository-data metric appears after this list.

The Association for Psychological Science’s Open Practice Badges. These are displayed alongside the digital versions of articles published in its journals to indicate which Open Science standards those peer-reviewed articles meet.

Bibliometrics: The Leiden Manifesto for research metrics (Nature.com), ten principles to guide research evaluation.

Guidelines for Evaluating Work in Digital Humanities and Digital Media (MLA)

The San Francisco Declaration on Research Assessment (DORA); its Project TARA is in development, and you can get involved.
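
To make the repository-metrics idea above concrete, here is a minimal sketch, in Python, of the kind of measurement CHAOSS-style tooling performs. It queries the public OSF v2 REST API for a project’s contributors and recent activity logs. The project id and the “events in the last 90 days” measure are illustrative assumptions for this sketch only; this is not an official CHAOSS metric and not the Open@RIT plug-in itself.

```python
# Minimal sketch: pulling contributor and activity data for a public OSF
# project via the OSF v2 REST API (https://api.osf.io/v2/), in the spirit
# of the CHAOSS-style repository metrics described above. The project id
# and the 90-day activity window are illustrative assumptions.
from datetime import datetime, timedelta, timezone

import requests

API_ROOT = "https://api.osf.io/v2"
PROJECT_ID = "abc12"  # hypothetical public OSF project id


def fetch_all(url: str) -> list[dict]:
    """Collect every record from a paginated JSON:API endpoint by
    following the "links.next" URL until it is exhausted."""
    records: list[dict] = []
    while url:
        payload = requests.get(url, timeout=30).json()
        records.extend(payload["data"])
        url = payload.get("links", {}).get("next")
    return records


def parse_osf_date(raw: str) -> datetime:
    """OSF timestamps are ISO 8601, sometimes with a trailing 'Z';
    normalize them to timezone-aware UTC datetimes."""
    return datetime.fromisoformat(raw.rstrip("Z")).replace(tzinfo=timezone.utc)


# Who has contributed to the project?
contributors = fetch_all(f"{API_ROOT}/nodes/{PROJECT_ID}/contributors/")

# What has happened recently? OSF records project events as "logs".
logs = fetch_all(f"{API_ROOT}/nodes/{PROJECT_ID}/logs/")
cutoff = datetime.now(timezone.utc) - timedelta(days=90)
recent = [log for log in logs
          if parse_osf_date(log["attributes"]["date"]) >= cutoff]

print(f"Contributors: {len(contributors)}")
print(f"Project events in the last 90 days: {len(recent)}")
```

Real community-health dashboards layer many such signals (contributor growth, responsiveness, diversity of participation) on top of raw counts like these.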

Longer Reads

RIT School of Interactive Games and Media Guidelines for Tenure and Promotion. This was signed by then-Provost Jeremy Haefner in 2013 and was shared with current Provost Ellen Granberg, who supported its continued use.

Backgrounder on “Citation Analysis” (University of Illinois Chicago Library)

GitHub Statistics as a Measure of the Impact of Open-Source Bioinformatics Software (Frontiers in Bioengineering and Biotechnology)

Software is Scholarship (MIT Computational Law Journal)

An approach to measuring and encouraging research translation and research impact (Health Research and Policy Systems Journal)

The Journal of Open Source Software and a piece on its start (blog of the Society for Scholarly Publishing)

France’s National Plan for Open Science 2021-2024

Data West Conference keynote slide deck “The Value of Open Science,” with links (Prof. Dr. J.C. Burgelman, Free University of Brussels)

The website of The Maintainers, a group that studies maintenance issues across disciplines. (For Open Work as scholarship, the expansion, revision, and maintenance of the work is vital but often ignored.)

The DORA Resource Library of articles and case studies