This guide covers:
What research metrics are
Principles for responsible and ethical use
How to choose appropriate indicators
Tools and services available through SETU
Practical examples
Key policies, frameworks and further reading
Research metrics influence decisions on funding and strategic planning. When used without context, they can misrepresent research activity or create unintended pressures. SETU encourages a balanced, transparent and responsible approach to evaluation that values the full range of research contributions, including qualitative impact, collaboration, community engagement and disciplinary diversity.
Research metrics are quantitative indicators used to analyse patterns in research outputs. They do not measure quality on their own. They can support decision making when paired with expert judgement and contextual explanation.
Output-level metrics: indicators describing individual research outputs.
Examples:
Citation counts
Downloads
Readers or saves
Policy mentions
Altmetric attention scores
These can vary by field, publication date and access route.
Journal-level metrics: indicators describing the publishing venue rather than the article itself.
Examples:
Journal Impact Factor
SJR (Scimago Journal Rank)
SNIP (Source Normalised Impact per Paper)
Journal indicators should not be used to judge the quality of an individual researcher or article.
Researcher-level metrics: indicators describing a researcher's overall body of work.
Examples:
h index
Field-Weighted Citation Impact
Collaboration indicators
Output volume and patterns
These metrics should be interpreted with discipline norms, career stage and research role in mind.
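Two of the indicators above have precise definitions, which a short sketch can make concrete. The citation counts and the expected-citation figure below are invented for illustration, not real data:

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that the researcher has h outputs
    cited at least h times each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h


def fwci(actual_citations: int, expected_citations: float) -> float:
    """Field-Weighted Citation Impact: actual citations divided by the
    average for outputs of the same field, year and publication type.
    A value of 1.0 means the output is cited exactly as expected."""
    return actual_citations / expected_citations


# Invented citation counts for seven outputs:
print(h_index([25, 8, 5, 3, 3, 1, 0]))  # 3: three outputs cited at least 3 times
print(fwci(10, 5.0))                    # 2.0: cited at twice the field average
```

Even written out this plainly, the numbers carry the caveats above: the h-index favours longer careers and citation-heavy fields, and FWCI depends entirely on how the comparison set is defined.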
Altmetrics: indicators that track online attention and engagement across news, policy, social media and reference managers. Altmetrics do not measure quality or significance but provide early signs of reach and discussion.
Responsible metrics are guided by international frameworks such as DORA, the Leiden Manifesto, CoARA and The Metric Tide. The following principles reflect best practice and SETU values.
Disciplinary differences, publication cultures and citation time lags all affect metric behaviour. No metric should be interpreted in isolation.
Peer review, narrative statements and case studies remain essential. Metrics act as supporting evidence rather than definitive indicators.
Journal Impact Factor and similar indicators should not be used to evaluate a researcher or individual article.
Research activity is too varied for any single number to offer a complete picture. Multiple indicators reduce the risk of bias.
Explain which metrics were used, why they were chosen and how they were interpreted. Transparency supports fairness in evaluation.
Over-reliance on metrics can encourage gaming, risk aversion or shifts in research behaviour. Responsible use prioritises integrity, openness and academic freedom.
Responsible metrics recognise the value of outputs that are not highly cited, including practice-based research, discipline-specific formats and community-engaged scholarship.
ORCID: a persistent identifier that supports profile accuracy and links outputs across platforms. Researchers are encouraged to maintain an up-to-date ORCID record.
Institutional repository: supports open access visibility and long-term preservation of SETU research outputs. Deposited works often receive more downloads and citations. (SETU will launch a university repository in the near future.)
Scopus and Web of Science: databases offering structured citation data and analytical tools. Useful for tracking citation patterns, collaboration networks and field-weighted indicators.
Google Scholar: broad coverage and easy to use. Works best when combined with other sources due to mixed indexing quality.
Altmetric tools: track attention across policy, news and online platforms. Useful for early signs of engagement beyond academia.
SETU Library staff can help with:
Building publication profiles
Understanding citation indicators
Identifying suitable metrics for specific needs
Using tools such as Scopus, Web of Science and ORCID
Start by defining the question.
Examples:
How widely has this article been cited?
How is a research group collaborating internationally?
What is the reach of open access outputs?
Different questions require different indicators.
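One way to make the question-to-indicator link explicit is to write the mapping down before gathering any data. A minimal sketch with illustrative pairings (not a SETU-prescribed mapping):

```python
# Illustrative mapping from evaluation questions to candidate indicators.
question_to_indicators = {
    "How widely has this article been cited?":
        ["citation count", "Field-Weighted Citation Impact"],
    "How is a research group collaborating internationally?":
        ["co-authored outputs by country", "collaboration indicators"],
    "What is the reach of open access outputs?":
        ["downloads", "Altmetric attention score"],
}

for question, indicators in question_to_indicators.items():
    print(f"{question} -> {', '.join(indicators)}")
```

Writing the mapping first keeps the question in charge of the indicator, rather than the other way around.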
When presenting metrics:
Specify the data source
Note the date retrieved
Explain known limitations
Compare with field averages only where appropriate
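The checklist above amounts to recording provenance alongside every figure. A minimal sketch of such a record follows; the field names and values are illustrative, not a SETU template:

```python
from datetime import date

# Each metric travels with the context needed to interpret it responsibly.
metric_report = {
    "indicator": "citation count",
    "value": 42,                                # illustrative figure
    "source": "Scopus",                         # specify the data source
    "retrieved": date(2024, 5, 1).isoformat(),  # note the date retrieved
    "limitations": "Excludes books and non-indexed journals.",
    "field_comparison": None,                   # only where appropriate
}

for key, value in metric_report.items():
    print(f"{key}: {value}")
```

A record like this makes the same metric reproducible and comparable when it is cited again months later.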
Citations accumulate at different rates across fields. Humanities and social sciences may emphasise books or practice outputs. Engineering and health sciences may have shorter publication cycles. Interpret metrics accordingly.
Narrative CVs, funding applications and promotion cases benefit from concise explanation:
Describe the significance of the work
Explain how metrics support the narrative
Include qualitative impact where relevant
Do not compare outputs across unrelated fields
Do not rely on Google Scholar alone for citation counts
Do not equate high output volume with quality
Do not rely on a single metric in decision making