The Qualities of Quality – Validating and justifying digital scholarship beyond traditional values frameworks

This is the second in a series of posts about each of the teams that will be attending SCI 2015, and their projects. This one is adapted from the text of the proposal submitted by Samuel Moore.

What does validation mean outside of a values/normative framework?

Justice Potter Stewart famously quipped, in a US Supreme Court case about pornography, that he could not define obscenity but that he “knew it when he saw it”. In a very different sphere, virtually every research funder and institution in the world includes “quality” (sometimes replaced with “excellence”) as a key target in its mission statements, goals, or criteria for assessment. Some even seek to define the term. But like many normative claims about the distinctive characteristics of prestigious activities, these definitions are slippery. In many cases they are circular, entirely retrospective, or reducible to “what those who matter know when they see it”.


At the same time, research makes particular claims to being necessarily unplanned in its overall direction, while also being an expensive and therefore exclusive activity. Decisions therefore need to be made about which research, and which researchers, will be supported with the limited funds available. It is an article of faith that “curiosity-driven” research is ultimately more productive than research directed at specific societal goals. An objective measure of some form – “quality” – is therefore required to justify and prioritise investment in research without direct application.

Is “quality” merely a rhetorical fiction required to square this circle, or does it capture and express values that underlie the research that is worthy of public funding? Is the confusion about what is meant by quality in national and institutional research assessment a serious issue that is really a stand-in for questions about authority (and credibility) or is it simply a tool of realpolitik, one of the messy compromises required in running real-world institutions?

Traditionally, quality is determined through a process of peer review. This social process has become the cornerstone of scholarly practice while at the same time having its validity and utility highly contested. Arguably, quality and peer review are mutually supportive concepts, in which attacking one is seen as attacking the other and as threatening the stability of the institutions of scholarship. Yet analysis of peer review records rarely shows any coupling between the two. Nonetheless, peer review, of both grants and literature, confers credibility and acts as a socially validated proxy for quality within the research community. In turn, the outlets (funders, journals) which certify a peer review process become proxies of these proxies. The irony is that the focus on these secondary proxies has led to a further layer of proxy measures (the Impact Factor) that decouples the conferring of prestige and credibility from the peer review process it is supposed to rest on. Even if “quality” is not a mistaken concept, it is a largely debased one.

We plan to bring together different perspectives and skill-sets on this issue, inspired by a Twitter conversation between two of us. This discussion, focussing on the question of whether “quality” is a single or heterogeneous concept, draws on analytical work seeking correlations within the scores of real-world assessment rankings, and also brings experience of the large and novel datasets now available on the use and discussion of research outputs. In addition, we bring experience of the intersection of new forms of digital scholarship and what it means for the future of the university in political terms, together with linguistic and philological analysis, to round out our team.

In the context of the institute we propose to use this initial experience in order to combine a narrative approach to statements of quality from research funders, institutions, assessors and researchers with an analytical approach that can test whether the claims made can be supported by the information used. We will begin with a dissection and analysis of the rhetorics of “quality” from various fields and an enumeration of the current proxy measures (journal brands, publisher names, citation indices, impact factor, altmetrics, peer-review procedures, invitation-only journals) that are claimed to accurately pre-filter for, or retrospectively label, “excellence”.

Our hypothesis is that “quality” as an objective scale is not a useful concept: it is socially determined and driven by existing power structures. Moreover, it is now such a confused concept, dependent on layers of re-interpretation and measurement through weakly relevant proxies, that it is not even well socially determined.

However, quality can be re-imagined as a multi-variate construct that can be deployed to address different priorities. This shift from “quality” to “qualities” has potentially valuable practical outcomes in focussing our attention on different aspects of communicated research outputs. It should also, importantly, give us pause when the term is used across disciplinary boundaries: quality and its evaluation must be tied to the purpose of the research, which itself must be situated within specific disciplinary practices. Most importantly, it raises profound political questions about the consensus justifications for publicly funded research. If we are to address “qualities” rather than “quality”, we are required to examine the societal values and expectations that underpin the public funding of research.

Working Group members

  • Samuel Moore is a PhD student in the Department of Digital Humanities at King’s College London. His research focusses on the extent to which open-access publishing in the humanities is a disruptive process or merely transformational in the UK higher education context. He is also Managing Editor of the Ubiquity Press Metajournals, which publish structured summaries of openly available research objects, such as open-source software, data and bioresources. Consequently, Samuel is deeply interested in academic credit, novel research outputs, and the future of the university.
  • Cameron Neylon is a failed scientist and amateur humanist currently working as Advocacy Director at PLOS. He has worked for the past decade on the challenges of bringing scholarly communications onto the web including issues of Open Access, Open Data, incentives and assessment structures. He was a contributor to the Altmetrics Manifesto and the Panton Principles, and has written widely on research assessment, peer review and the challenges of research governance.
  • Dr. Martin Paul Eve is a Senior Lecturer in Literature, Technology and Publishing at Birkbeck, University of London. He is a founder of the Andrew W. Mellon Foundation-funded Open Library of Humanities, the author of Open Access and the Humanities: Contexts, Controversies and the Future (open access from Cambridge University Press, 2014), and the lead developer of the open-source XML typesetting platform, meTypeset.
  • Damian Pattinson obtained his PhD in neuroscience from University College London, where he studied the development of sensory pathways in the laboratory of Prof Maria Fitzgerald. After a brief postdoc at King’s College London, Damian joined the BMJ as a Scientific Editor on Clinical Evidence. He moved over to the online clinical resource BMJ Best Practice shortly after its conception, first as Commissioning Editor and later as Senior Editor. He joined PLOS ONE in February 2010 as Executive Editor, and became Editorial Director in October 2012.
  • Jennifer Lin, PhD is Senior Product Manager at PLOS. She is the primary lead of the Article-Level Metrics initiative and the publisher’s data program. She earned her PhD at Johns Hopkins University. She has 15 years of experience in community outreach, change management, product development, and project management in scholarly communications, education, and the public sector.
  • Daniel Paul O’Donnell is Professor of English at the University of Lethbridge (Alberta, Canada). He trained as an Anglo-Saxon philologist and has been active in what is now known as the Digital Humanities since his undergraduate days at the Dictionary of Old English in the late 1980s. His current work focuses on the place and practice of the Humanities in the digital age, particularly in terms of social organisation, communication practices, and globalisation. He is the founding director of, among other initiatives, the Lethbridge Journal Incubator, and a former chair of the Text Encoding Initiative.


The group is purposefully cross-disciplinary and comprises members from academia and the open-access publishing community. Our position is radical inasmuch as it potentially undermines current assumptions and hierarchies in the research enterprise. There is a real opportunity to influence how publishers, funders and universities approach the idea of research quality in their systems and organisations; we will achieve this primarily by influencing the narrative around “quality” and how the word is used. The group has a track record in targeted interventions that over time change discourse. In the context of the institute we will start this process by preparing a report for online publication, and consider the opportunities for a more formal (and possibly more in-depth) publication.

We will follow this up with op-eds targeted at both traditional institutional audiences (Times Higher Education, Chronicle of Higher Education, mainstream media) and online communities (LSE Impact Blog). We also have good contacts with key players including the European Commission, HEFCE (UK), the Social Sciences and Humanities Research Council (Canada), and other funders. We will seek opportunities to present our outputs in relevant forums across a range of stakeholder groups. There are also many community initiatives focussing on incentives for researchers and the link to assessment. We are already engaged with many of these and will use them as a further means of dissemination.



[ edited on 18 May to update the bio of Martin Eve ]