Frequently asked questions
- Applying the SCImago Ranking for MENA in practice
- Role of Key Partners
What is the SCImago Research Centers Ranking - MENA Edition?
The SCImago Research Centers Ranking is a ranking focusing specifically on research centers. Its inaugural edition covers research centers in the Middle East and North Africa (MENA) region.
What will the SCImago Research Centers Ranking measure?
The Research Centers Ranking (RCR) aims to measure not only the results of research activities, but also the wide range of impacts associated with those activities.
What does the SCImago Research Centers Ranking - MENA consist of?
The ranking model has three weighted components, with sub-components that are also weighted:
- 40% comprises RESEARCH: performance, productivity, openness and collaboration.
- 40% comprises INNOVATION: technical impact.
- 20% comprises SOCIETY: web visibility, social networks, SDG contributions.
| Component | Sub-component | Indicator | Weight |
| --- | --- | --- | --- |
| RESEARCH 40% | | High Quality Publications (Q1) | 5% |
| | | Scientific Talent Pool | 4% |
| INNOVATION 40% | TECHNICAL IMPACT 10% | Innovative Knowledge | 10% |
| SOCIETY 20% | WEB VISIBILITY 5% | Inbound Links | 5% |
| | WEB PRESENCE 4% | Web Size | 4% |
| | SDG CONTRIBUTION 5% | Share of papers in SDGs | 5% |
For definitions of each sub-component, see Indicators section.
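To illustrate how the weighted model combines, the composite score is simply a weighted sum of normalized sub-component scores. The Python sketch below uses the sub-component weights listed above; the indicator keys and example values are illustrative assumptions, not the published methodology or real data:

```python
# Sketch: combining normalized indicator scores (0-1) into a composite
# using the sub-component weights listed above. The remaining weights
# (bringing the total to 100%) are omitted; example values are invented.
WEIGHTS = {
    "high_quality_publications_q1": 0.05,
    "scientific_talent_pool": 0.04,
    "innovative_knowledge": 0.10,
    "inbound_links": 0.05,
    "web_size": 0.04,
    "sdg_share": 0.05,
}

def composite_score(indicators: dict) -> float:
    """Weighted sum of normalized (0-1) indicator scores; missing ones count as 0."""
    return sum(w * indicators.get(name, 0.0) for name, w in WEIGHTS.items())

example = {"high_quality_publications_q1": 0.8, "web_size": 0.5}
print(round(composite_score(example), 3))  # 0.05*0.8 + 0.04*0.5 = 0.06
```

A real implementation would first normalize each raw indicator (for instance, min-max scaling against all ranked centers) before applying the weights.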
Why rank research centers?
Almost all existing rankings focus on creating league tables for universities. Research centers are equally critical for driving research, innovation and thereby the knowledge economy, but are often under-highlighted. This ranking intends to address this issue by:
- Giving centers more visibility among stakeholders and funders
- Facilitating more collaboration between universities and research centers
- Enabling benchmarking and accountability for decision makers
- Enabling comparability, driving competitiveness
Why do we need a new ranking?
The ranking consists of a set of indicators and weightings tailored specifically to Middle East and North African research centers.
How does SCImago Research Centers Ranking - MENA Edition differ from existing rankings?
The components of the SCImago ranking model for MENA comprise a different set of items, in different proportions, than those of the global SCImago Institutions Rankings (SIR). These proportions have been determined as best reflecting the objectives and ambitions of the MENA region.
2. Indicators
Normalized Impact (NI)
Computed over the institution's leadership output using the methodology established by the Karolinska Institutet in Sweden, where it is named the "item-oriented field-normalized citation score average". The normalization of the citation values is done at the individual article level. The raw impact for an institution is the sum of the values of its documents (Rehn et al., 2014).
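The item-oriented field normalization described above can be sketched as follows: each paper's citation count is divided by the average for its field, and the institution's score is the mean of those per-paper ratios. The field baselines below are invented for illustration; in practice they are derived from Scopus field and publication-year averages:

```python
# Sketch of an item-oriented field-normalized citation score average.
# The baseline (average citations per paper in each field) is hypothetical.
FIELD_BASELINE = {"chemistry": 10.0, "mathematics": 4.0}

papers = [
    {"field": "chemistry", "citations": 20},   # ratio: 20 / 10.0 = 2.0
    {"field": "mathematics", "citations": 2},  # ratio:  2 /  4.0 = 0.5
]

def normalized_impact(papers: list) -> float:
    """Mean of per-paper field-normalized citation ratios."""
    ratios = [p["citations"] / FIELD_BASELINE[p["field"]] for p in papers]
    return sum(ratios) / len(ratios)

print(normalized_impact(papers))  # (2.0 + 0.5) / 2 = 1.25
```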
Excellence (Exc)
Excellence indicates the share of an institution's scientific output that is included in the top 10% of the most cited papers in its respective scientific fields. It is a measure of the high-quality output of research institutions (SCImago Lab, 2011; Bornmann et al., 2012; Bornmann & Moya Anegón, 2014a; Bornmann et al., 2014b).
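As a sketch under assumed thresholds, Excellence reduces to counting the papers whose citation counts reach the top-10% cutoff for their field. The cutoffs below are hypothetical, not real field percentiles:

```python
# Sketch: Excellence = share of papers at or above the field's
# 90th-percentile citation threshold. Thresholds are invented.
TOP10_THRESHOLD = {"physics": 30, "biology": 25}

papers = [
    {"field": "physics", "citations": 45},  # counts as excellent
    {"field": "physics", "citations": 12},
    {"field": "biology", "citations": 25},  # at the threshold: excellent
    {"field": "biology", "citations": 3},
]

def excellence_share(papers: list) -> float:
    """Fraction of papers in the (assumed) top 10% of their field."""
    excellent = sum(1 for p in papers
                    if p["citations"] >= TOP10_THRESHOLD[p["field"]])
    return excellent / len(papers)

print(excellence_share(papers))  # 2 of 4 papers -> 0.5
```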
High Quality Publications (Q1)
The number of publications that an institution publishes in the most influential scholarly journals of the world, i.e. those ranked in the first quartile (25%) of their categories by the SCImago Journal Rank (SJR) indicator (Miguel et al., 2011; Chinchilla-Rodríguez et al., 2015).
Output (O)
Total number of documents published in scholarly journals indexed in Scopus (Romo-Fernández et al., 2011; OECD & SCImago Research Group, 2016).
Scientific Talent Pool (STP)
Total number of different authors from an institution in the total publication output of that institution during a particular period of time.
Open Access (OA)
Percentage of documents published in Open Access journals or indexed in the Unpaywall database.
International Collaboration (IC)
Number of research outputs from an institution with at least one institution from a country outside the defined MENA region listed in the authorship byline.
Regional collaboration (RC)
Number of research outputs from an institution with at least two different countries from the defined MENA region listed in the authorship byline.
Industry collaboration (IDC)
Number of research outputs from an institution with at least one industry (corporate) organization listed in the authorship byline.
Innovative Knowledge (IK)
Scientific publication output from an institution cited in patents. Based on PATSTAT (Moya Anegón & Chinchilla-Rodríguez, 2015).
Patents (PT)
Number of patent applications (simple families). Based on PATSTAT.
Patent Citations (PC)
Number of citations from patents (patent families citing a document) received by the institution's documents.
Inbound Links (IL)
Number of networks (subnets) from which inbound links to the institution's website originate. Data extracted from the Ahrefs database.
Altmetrics
Number of documents with at least one mention in PlumX Metrics (https://plumanalytics.com). Mentions on Twitter, Facebook, blogs, news sites and comment platforms (Reddit, Slideshare, Vimeo or YouTube) are considered.
Web Size (WS)
Number of pages associated with the institution's URL according to Google (Aguillo et al., 2010).
Share of papers in SDGs (SDG)
Percentage of the scientific publication output that covers at least one of the 17 UN Sustainable Development Goals.
- Aguillo, I., Bar-Ilan, J., Levene, M., & Ortega, J. (2010). Comparing university rankings. Scientometrics, (85), 243-256. https://doi.org/10.1007/s11192-010-0190-z
- Bornmann, L., Moya Anegón, F., & Leydesdorff, L. (2012). The new Excellence Indicator in the World Report of the SCImago Institutions Rankings 2011. Journal of Informetrics, 6(2), 333-335. http://dx.doi.org/10.1016/j.joi.2011.11.006
- Bornmann, L., & Moya Anegón, F. (2014a). What proportion of excellent papers makes an institution one of the best worldwide? Specifying thresholds for the interpretation of the results of the SCImago Institutions Ranking and the Leiden Ranking. Journal of the Association for Information Science and Technology, 65(4), 732-736. https://doi.org/10.1002/asi.23047
- Bornmann, L., Stefaner, M., Moya Anegón, F., & Mutz, R. (2014b). Ranking and mapping of universities and research-focused institutions worldwide based on highly-cited papers. A visualisation of results from multi-level models. Online Information Review, 38(1), 43-58. https://doi.org/10.1108/OIR-12-2012-0214
- Chinchilla-Rodríguez, Z., Miguel, S., & Moya Anegón, F. (2015). What factors are affecting the visibility of Argentinean publications in human and social sciences in Scopus? Some evidences beyond the geographic realm of the research. Scientometrics, 102(1), 789-810. https://doi.org/10.1007/s11192-014-1414-4
- Miguel, S., Chinchilla-Rodríguez, Z., & Moya Anegón, F. (2011). Open Access and Scopus: A New Approach to Scientific Visibility From the Standpoint of Access. Journal of the American Society for Information Science and Technology, 62(6), 1130-1145. http://dx.doi.org/10.1002/asi.21532
- Moya Anegón, F., & Chinchilla-Rodríguez, Z. (2015). Impacto tecnológico de la producción universitaria iberoamericana. En S. Barro (Coord.), La transferencia de la I+D, la innovación y el emprendimiento en las universidades. Educación Superior en Iberoamérica. Informe 2015 (pp. 83-94). Centro Interuniversitario de Desarrollo.
- OECD and SCImago Research Group (CSIC) (2016). Compendium of Bibliometric Science Indicators. OECD. http://oe.cd/scientometrics.
- Rehn, C., Gornitzki, C., Larsson, A., & Wadskog, D. (2014). Bibliometric Handbook for Karolinska Institutet. https://kib.ki.se/sites/default/files/bibliometric_handbook_2014.pdf
- Romo-Fernández, L.M., Lopez-Pujalte, C., Guerrero Bote, V.P., & Moya Anegón, F. (2011). Analysis of Europe’s scientific production on renewable energies. Renewable Energy, 36(9), 2529-2537. https://doi.org/10.1016/j.renene.2011.02.001
- SCImago Lab (2011). Scientific Excellence Georeferenced. The neighborhood matters.
3. Applying the SCImago ranking for MENA in practice
How often will the ranking exercise take place?
The SCImago Research Centers Ranking will be published annually.
Is inclusion in the ranking voluntary?
No. The ranking intends to cover exhaustively all the centers in the MENA Region.
What are the criteria for research centers being included in the ranking?
The inclusion criterion is that the center has published at least one paper in a journal indexed in the Scopus database during the period 2016-2020.
Which research centers are included in the ranking exercise?
All centers in the MENA region that perform non-health-related research and are not part of a higher education institution.
Is it possible to be added to or removed from the list of partaking research centers?
The ranking is open to adding new centers and to removing those that are dissolved or merged.
Will research centers be expected to provide any data themselves?
Feedback from the centers is welcomed, but data should be extracted from external independent sources.
How will the rankings be used?
The aim of the ranking is to promote increases in the quantity and quality of the research performed, and in the ways it is disseminated and applied to local and international communities.
Are there guidelines as to how a research center can publicise its ranking position?
The ranks should be publicised by the centers together with a critical assessment of their meaning and the appropriate institutional, national and international context.
Is there an appeals process if a research center wants to contest its ranking position?
Appeals are welcome if supported by actual evidence and made after careful consideration of how the indicators are built and what they mean.
Will the components of the ranking model and their weightings change?
Minor changes can be expected in order to adapt to new data sources. It is also possible that new variables will be added in the future to better reflect impact.
When will SCImago Research Centers Ranking - MENA Edition have achieved its aims?
Several institutions currently score zero on several of the indicators. SCImago will consider the ranking a success when all centers perform significantly on every indicator.
4. Role of Key Partners
Who are the Key Partners in MENA?
The Egyptian Ministry of Higher Education and Scientific Research (MOHESR) provides support and guidance by agreeing to share insights related to research center strategies. MOHESR is responsible for implementing the scientific strategy for Egypt set out by the Higher Council for Science & Technology (HCST) of Egypt. It oversees the Science, Technology & Innovation Funding Authority (STDF) and the Innovators Supporting Fund (ISF).
ELSEVIER is a world-leading provider of scientific and technical information, and Scopus is the world's largest abstract and citation database of peer-reviewed academic literature. Scopus data encompass citation metrics, funding acknowledgements and UN SDG categorization, and offer high accuracy and coverage of peer-reviewed journals and institutional affiliations.
SCImago is a research group from the Consejo Superior de Investigaciones Científicas (CSIC) and the universities of Granada, Extremadura, Carlos III (Madrid) and Alcalá de Henares, dedicated to information analysis, representation and retrieval by means of visualisation techniques. SCImago is one of the leading research organizations providing a wide range of rankings, most notably the SCImago Institutions Rankings (SIR).
What are the roles of each Key Partner?
MOHESR acts as a sounding board to SCImago by providing advice and guidance related to the role of research centers in the Egyptian research ecosystem.
ELSEVIER is the main data provider for the SCImago ranking for MENA.
SCIMAGO developed and manages the SCImago ranking for MENA.
Why have these Key partners been chosen?
In 2006, the Egyptian Ministry of Higher Education and Scientific Research (MOHESR) embarked on an ambitious exercise to overhaul Science and Technology (S&T) activities in Egypt, resulting in a complete restructuring of the S&T governance and management model in Egypt, and the creation of the Higher Council for Science and Technology (HCST) and the Science, Technology & Innovation Funding Authority (STDF).
MOHESR is mandated to:
- ensure funding of Science & Technology, with the aim of addressing the priorities set by the HCST
- support the complete cycle of innovation
- support and develop Egyptian research and innovation capabilities
- ensure the integration of Science, Technology and Innovation (STI) elements in national strategies
- bridge the gap between Industry & Academia
ELSEVIER is the data provider of choice for prestigious global ranking providers, including but not limited to:
- Times Higher Education
- Shanghai Ranking Consultancy (SRC) for the Best Chinese University Ranking
…as well as local or subject-specific rankings providers such as:
- the Financial Times
- Frankfurter Allgemeine Zeitung
- the Perspektywy University Ranking in Poland
- the National Institutional Ranking Framework (NIRF) in India
SCIMAGO carries out world-class research into indicator development and is an innovator in ranking building. SCImago has developed:
- SCImago Journal Rank & Country Rank (SJR) is a portal of journal and country scientific indicators developed from the information contained in Scopus. The portal takes its name from the SCImago Journal Rank indicator developed by SCImago.
- Shape of Science is an information visualization project whose aim is to reveal the structure of science.
- SCImago Institutions Rankings (SIR) is a classification of academic and research-related institutions ranked by a composite indicator that combines three different sets of indicators based on research performance, innovation outputs and societal impact measured by their web visibility.
- Atlas of Science is a project proposing the creation of an information system whose major aim is a graphic representation of Ibero-American science research. Such representation is conceived as a collection of interactive maps, allowing navigation throughout the semantic spaces formed by the maps.
- Elsevier’s Scopus provides bibliometric (citation) data on peer-reviewed academic literature.
- Unpaywall provides Open Access article data.
- The European Patent Office’s PATSTAT database provides bibliographical and legal event patent data from leading industrialised and developing countries.
- Google provides Web Size.
- Ahrefs database provides Inbound Links.
- Elsevier’s PlumX provides altmetrics data.
The analysis can be performed at several levels, for example by considering the distribution of ranks by country of the 391 centers:
| Country | Top 20 | Top 100 | Top 200 | Top 300 | Total |
| --- | --- | --- | --- | --- | --- |
| United Arab Emirates (ARE) | 4 | 9 | 12 | 14 | |
The interpretation of the results should take into account the following contexts:
- Global Impact
- Specific indicators
- Evolution (when available)
Queries about the SCImago Research Centers Ranking – MENA Edition can be directed through https://www.scimagorc.com/mena or by contacting firstname.lastname@example.org.
Support for research centers in understanding the SCImago ranking model for MENA is provided via:
Elsevier’s Analytical Services offers the option of a “SCImago ranking for MENA research performance report”, through which a research center can obtain insights into its performance in the rankings and develop a better strategic understanding of its position. The report offers a deep dive into the center’s performance on agreed bibliometric indicators, showing how it is doing with respect to selected comparators (e.g. other research centers and universities within the same subject area and/or region), which collaborations are most beneficial to it, and how it can possibly improve its position in the rankings.