Speakers

Michael Bales, Weill Cornell Medicine

VIVO Dashboard: A Drupal-based application for analyzing publication metadata

VIVO Dashboard is a Drupal-based application for analyzing publication metadata. It allows end users (primarily administrators) to generate reports as bar graphs, HTML lists, and spreadsheets. Users can filter by year, publication type, organizational affiliation, and first/last author affiliation, among other elements. VIVO Dashboard includes a Citation Impact Map for displaying article-level data on field-normalized citation impact. In the Citation Impact Map, each article is assigned a percentile rank of times cited, as measured against a baseline of articles of the same type, published in the same field and year. VIVO Dashboard is freely available on GitHub at https://github.com/wcmc-its/vivodashboard. A standalone version of the Citation Impact Map is freely available at https://github.com/wcmc-its/citation-impact/
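The percentile-rank calculation described above is simple to sketch. The following minimal Python example is an illustration only, not code from the VIVO Dashboard or citation-impact repositories; the function name and data layout are assumptions. It ranks one article's citation count against a baseline cohort of articles of the same type, field, and publication year.

    from bisect import bisect_left

    def citation_percentile(times_cited, baseline_counts):
        """Percentile rank of an article's citation count against a baseline
        of articles of the same type, field, and publication year.
        (Assumed convention: percent of baseline articles cited strictly
        less often; real implementations may treat ties differently.)"""
        counts = sorted(baseline_counts)
        below = bisect_left(counts, times_cited)  # articles cited less often
        return 100.0 * below / len(counts)

    # Toy baseline cohort of ten same-type, same-field, same-year articles.
    baseline = [0, 1, 3, 5, 8, 12, 20, 33, 47, 90]
    print(citation_percentile(42, baseline))  # 80.0 -> 80th percentile

Percentile-rank conventions vary (for instance, in how ties are counted), so the actual Citation Impact Map may differ in detail.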

Fatima Barnes

Formalizing an Institution-Wide Journal Club and Research Seminar Elective

The Office of Graduate Medical Education (GME) and the Health Sciences Library are collaborating to develop a formal institution-wide Journal Club and a Research Seminar Elective. Responses to a needs assessment survey identified gaps in knowledge of the research methods currently used across the GME enterprise. The survey results also showed great interest in research training and in the establishment of formal Journal Clubs within residency programs where a dearth of scholarly activity has been identified. To gain the skills necessary to conduct clinical research effectively, all of our residents are encouraged to take the Introduction to the Principles and Practice of Clinical Research (IPPCR) course offered by the NIH Clinical Center. The IPPCR comprises weekly 60- to 90-minute virtual seminars and is offered only between September and April. We believe that making the IPPCR certificate a requirement for incoming PGY1s would enhance the research skills and knowledge of all first-year residents. A formal request has been made to have the program redesigned so that it is offered fully online and open throughout the year. In the meantime, a task force will be identified to design a Research Seminar Elective, and all residency programs will be encouraged to offer Journal Clubs. To support these research initiatives, the Health Sciences Library will plan a series of monthly workshops providing the knowledge and skills necessary to incorporate evidence-based practice into the care of individual patients.

Professor Chaomei Chen, Drexel University

The Types and Roles of Uncertainty in Science

Chaomei Chen is a Professor in the College of Computing and Informatics at Drexel University, Philadelphia, USA. Chen has extensive expertise in the structure and dynamics of scientific fields and in detecting and visualizing critical patterns and emerging trends in scientific literature. He is currently leading the development of a Visual Analytic Observatory of Scientific Knowledge, funded under the NSF SciSIP Program. He is the Editor-in-Chief of Information Visualization and the Specialty Chief Editor of Frontiers in Research Metrics and Analytics. Chen is the author of a series of books on information visualization, scientific creativity, mapping scientific frontiers, quantitative assessments of critical evidence, and, most recently, the role of uncertainty in scientific knowledge. His software, CiteSpace, is widely used in numerous scientometric reviews of scientific domains. His 2006 article on CiteSpace is a Google Scholar Citation Classic.

http://cluster.cis.drexel.edu/~cchen/bio.html

Heather Coates

Empowering Researchers Through Values-Driven Evidence

Since 2012, librarians in the Center for Digital Scholarship have offered workshops and consultations to faculty in support of promotion and/or tenure. Many times, our conversations with faculty begin by answering questions about metrics. As our conversations with administrators and faculty have evolved, we have used these questions to initiate broader conversations about the research process, scholarly products, dissemination of scholarship, and evidence. We also integrate institutional values such as Open Access, Civic Engagement, Economic Development of Indiana, and Public Scholarship into our workshops and consultations. Through these educational opportunities, we empower faculty by emphasizing the choices available to them for disseminating their scholarship in formats and venues that will reach their chosen audience(s). We help them to craft digital identities that reflect the full range of their scholarship and enable them to gather compelling evidence demonstrating how others have engaged with and used their work. Finally, we support them in using both metrics and qualitative evidence responsibly to make their case for promotion and/or tenure. In this lightning talk, Heather describes how we've used research metrics as a starting point for values-driven research dissemination and evaluation.

Robert Frodeman, University of North Texas

Strangelovean Metrics

The danger with metrics of all kinds is that the map comes to supplant our awareness of the territory. This implies that metrics should be designed to fail gracefully, and to call attention to their own inadequacies. Building in such failsafe mechanisms will help make apparent the fact that quantities are always derivative in nature, dependent on prior qualitative assessments.

Professor Steve Fuller, University of Warwick

Measuring Research Value in Time

What is the difference between the short-term and long-term value of research, and how should each be measured? Should we expect research of high value in the short term to remain of high value in the long term, or do the two temporal horizons trade off against each other?

Professor J. Britt Holbrook, New Jersey Institute of Technology

Transforming Research Metrics

Designers typically design products with potential users in mind. Those who design research metrics have a responsibility to consider researchers, rather than research managers, as their primary users. Otherwise, we run the risk of transforming research into something that produces bits of information rather than useful knowledge.

Professor Kiarri N. Kershaw, Feinberg School of Medicine

Dick Klavans, Map of Science

Recent Progress in Research Portfolio Analysis

George M. Santangelo, National Institutes of Health

Patty Smith, Northwestern University

Aaron Sorenson, Digital Science Consultancy

Altmetric-Citation Disequilibria in Alzheimer's Disease Papers: Can Such Discrepancies Be Used to Differentiate between Investigators with Similar Citation-Based Statistics?

Imagine a scenario where two junior neuroscientists, both studying Alzheimer's Disease, are asking their departmental chair for additional lab space, a scarce commodity. Both are funded by modest NIH training grants, and both have similar publication histories in terms of number of papers as well as citation-based metrics such as total citations, h-index, and average citations per paper. It turns out, however, that the two scientists publish in very different journals, and as a result their papers tend to generate very different levels of online buzz. Can (and should) altmetric discrepancies be used by those who manage science to cast a tie-breaking vote in researcher-performance evaluations when citation-based approaches fail to identify a clear winner for a scarce resource (e.g., early-career funding or additional lab space)?
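For reference, the citation-based statistics named in this scenario are easy to compute from per-paper citation counts. The Python sketch below is purely illustrative (the toy data are invented, and the code is not drawn from any speaker's tooling); it constructs two publication records that tie exactly on total citations, h-index, and average citations per paper, the kind of tie the talk asks altmetrics to break.

    def h_index(citations):
        """Largest h such that the author has h papers cited at least h times."""
        h = 0
        for i, c in enumerate(sorted(citations, reverse=True), start=1):
            if c >= i:
                h = i
            else:
                break
        return h

    # Two invented investigators: different per-paper profiles, identical metrics.
    profiles = {"A": [25, 14, 9, 6, 6, 3, 1], "B": [18, 15, 12, 8, 5, 4, 2]}
    for name, papers in profiles.items():
        total = sum(papers)
        print(name, total, h_index(papers), round(total / len(papers), 1))
    # Both print: 64 total citations, h-index 5, 9.1 citations per paper.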

Robyn Reed, Penn State College of Medicine

Capturing, Showcasing, and Measuring Research Impact among Biomedical Researchers

Tracking research scholarship across a large, multidisciplinary organization is desirable for strategic planning, assessing the contributions of individuals and units, and demonstrating broader impact on specific fields and on communities ranging from local to international. The Pennsylvania State University (Penn State) comprises hundreds of departments, schools, colleges, institutes, and centers across 24 campuses. To assist in tracking Penn State's research productivity, a university-wide research networking platform will be available to the public, with analytic capabilities that will empower investigators and research administrators to document, query, and benchmark a variety of scholarly outcomes. The platform will also facilitate the identification of collaborators, mentors, reviewers, and institutional resources to drive research productivity. The authors' roles in the networking system have been to build the College of Medicine presence and to determine the metrics of interest to that college. Of importance to the College of Medicine are the abilities to track publication quantity and quality, measure citation quantity and quality, attribute intellectual property, assess social impact of scholarship, and view collaborations by researcher and department/institute. Early instances of data usage at the College of Medicine will be discussed. One example is the dashboard project, which seeks to capture and share various metrics to facilitate routine discussions between department chairs and the Dean. Secondary analyses of research productivity data may be necessary to achieve more detailed information about scholarly outcomes.

Kari Wojtanik, Susan G. Komen Foundation