Responsible Use of Metrics: Introduction

Welcome

This guide introduces research metrics and provides practical guidance on how to use them responsibly at South East Technological University. Metrics can help researchers and institutions understand research visibility, output patterns and disciplinary positioning. They are one part of a wider evidence base and should always be interpreted in context.

What This Guide Covers

  • What research metrics are

  • Principles for responsible and ethical use

  • How to choose appropriate indicators

  • Tools and services available through SETU

  • Practical examples

  • Key policies, frameworks and further reading

Why Responsible Use Matters

Research metrics influence decisions on funding and strategic planning. When used without context, they can misrepresent research activity or create unintended pressures. SETU encourages a balanced, transparent and responsible approach to evaluation that values the full range of research contributions, including qualitative impact, collaboration, community engagement and disciplinary diversity.

Understanding Research Metrics

What Are Research Metrics

Research metrics are quantitative indicators used to analyse patterns in research outputs. They do not measure quality on their own. They can support decision making when paired with expert judgement and contextual explanation.

Categories of Metrics

Article Level Metrics

Indicators describing individual research outputs.
Examples:

  • Citation counts

  • Downloads

  • Readers or saves

  • Policy mentions

  • Altmetric attention scores
    These can vary by field, publication date and access route.

Journal Level Metrics

Indicators describing the publishing venue rather than the article itself.
Examples:

  • Journal Impact Factor

  • SJR (Scimago Journal Rank)

  • SNIP (Source Normalised Impact per Paper)
    Journal indicators should not be used to judge the quality of an individual researcher or article.
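As an illustration of how one journal level indicator works, the Journal Impact Factor for a given year is the number of citations received that year to items the journal published in the previous two years, divided by the number of citable items it published in those two years. A minimal sketch, using hypothetical figures rather than data for any real journal:

```python
def journal_impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Journal Impact Factor for year Y: citations received in Y to items
    published in Y-1 and Y-2, divided by citable items from those years."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Hypothetical journal: 600 citations in 2024 to 200 citable items
# published in 2022-2023.
print(journal_impact_factor(600, 200))  # → 3.0
```

Note that this is an average across the whole journal: a handful of highly cited articles can dominate the figure, which is one reason it should not be used to judge individual articles or researchers.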

Researcher or Group Level Metrics

Examples:

  • h index

  • Field Weighted Citation Impact

  • Collaboration indicators

  • Output volume and patterns
    These metrics should be interpreted with discipline norms, career stage and research role in mind.
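To make two of these indicators concrete: the h index is the largest number h such that a researcher has at least h outputs with h or more citations each, and Field Weighted Citation Impact is the ratio of citations actually received to the citations expected for similar publications (so 1.0 means the world average). A short sketch with made-up citation counts, for illustration only:

```python
def h_index(citations):
    """Largest h such that at least h outputs have h or more citations each."""
    h = 0
    for rank, count in enumerate(sorted(citations, reverse=True), start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

def fwci(actual_citations, expected_citations):
    """Field Weighted Citation Impact: actual citations divided by the
    expected citations for similar publications (same field, year, type)."""
    return actual_citations / expected_citations

# Hypothetical researcher with six outputs.
print(h_index([25, 8, 5, 3, 3, 0]))  # → 3
# An output with 30 citations where 15 would be expected for its field.
print(fwci(30, 15))  # → 2.0
```

The example also shows why such indicators need context: the same citation counts would yield a very different FWCI in a field with a higher or lower expected baseline.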

Altmetrics

Indicators that track online attention and engagement across news, policy, social media and reference managers. Altmetrics do not measure quality or significance but provide early signs of reach and discussion.

Principles for Responsible Use of Metrics

Responsible metrics are guided by international frameworks such as DORA, the Leiden Manifesto, CoARA and The Metric Tide. The following principles reflect best practice and SETU values.

1. Use Metrics With Context

Disciplinary differences, publication cultures and citation time lags all affect metric behaviour. No metric should be interpreted in isolation.

2. Combine Quantitative and Qualitative Evidence

Peer review, narrative statements and case studies remain essential. Metrics act as supporting evidence rather than definitive indicators.

3. Avoid Using Journal Based Metrics to Assess Individuals

Journal Impact Factor and similar indicators should not be used to evaluate a researcher or individual article.

4. Use Multiple Indicators Where Possible

Research activity is too varied for any single number to offer a complete picture. Multiple indicators reduce the risk of bias.

5. Promote Transparency

Explain which metrics were used, why they were chosen and how they were interpreted. Transparency supports fairness in evaluation.

6. Anticipate Unintended Consequences

Over reliance on metrics can encourage gaming, risk aversion or shifts in research behaviour. Responsible use prioritises integrity, openness and academic freedom.

7. Support Research Diversity

Responsible metrics recognise the value of outputs that are not highly cited, including practice based research, discipline specific formats and community engaged scholarship.

Tools and Services at SETU

ORCID

A persistent identifier that supports profile accuracy and links outputs across platforms. Researchers are encouraged to maintain an up to date ORCID record.

Institutional Repository

Supports open access visibility and long term preservation of SETU research outputs. Deposited works often receive more downloads and citations. (SETU will be launching a University repository in the near future)

Scopus and Web of Science

Databases offering structured citation data and analytical tools. Useful for tracking citation patterns, collaboration networks and field weighted indicators.

Google Scholar

Broad coverage and easy to use. Works best when combined with other sources due to mixed indexing quality.

Altmetric and PlumX

Tools to track attention across policy, news and online platforms. Useful for early signs of engagement beyond academia.

Library Support

SETU Library staff can help with:

  • Building publication profiles

  • Understanding citation indicators

  • Identifying suitable metrics for specific needs

  • Using tools such as Scopus, Web of Science and ORCID

Data Steward & Digital Innovation Officer

Using Metrics at SETU

Good Practice for SETU Researchers

Selecting the Right Metric

Start by defining the question.
Examples:

  • How widely has this article been cited?

  • How is a research group collaborating internationally?

  • What is the reach of open access outputs?
    Different questions require different indicators.

Contextualising Results

When presenting metrics:

  • Specify the data source

  • Note the date retrieved

  • Explain known limitations

  • Compare with field averages only where appropriate

Reflecting Disciplinary Norms

Citations accumulate at different rates across fields. Humanities and social sciences may emphasise books or practice outputs. Engineering and health sciences may have shorter publication cycles. Interpret metrics accordingly.

Using Metrics in Narratives

Narrative CVs, funding applications and promotion cases benefit from concise explanation:

  • Describe the significance of the work

  • Explain how metrics support the narrative

  • Include qualitative impact where relevant

Avoiding Common Mistakes

  • Do not compare outputs across unrelated fields

  • Do not rely on Google Scholar alone for citation counts

  • Do not equate high output volume with quality

  • Do not rely on a single metric in decision making