Publication


Featured research published by Matthew S. Mayernik.


Social Studies of Science | 2011

Science friction: Data, metadata, and collaboration

Paul N. Edwards; Matthew S. Mayernik; Archer L. Batcheller; Geoffrey C. Bowker; Christine L. Borgman

When scientists from two or more disciplines work together on related problems, they often face what we call ‘science friction’. As science becomes more data-driven, collaborative, and interdisciplinary, demand increases for interoperability among data, tools, and services. Metadata – usually viewed simply as ‘data about data’, describing objects such as books, journal articles, or datasets – serve key roles in interoperability. Yet we find that metadata may be a source of friction between scientific collaborators, impeding data sharing. We propose an alternative view of metadata, focusing on its role in an ephemeral process of scientific communication, rather than as an enduring outcome or product. We report examples of highly useful, yet ad hoc, incomplete, loosely structured, and mutable, descriptions of data found in our ethnographic studies of several large projects in the environmental sciences. Based on this evidence, we argue that while metadata products can be powerful resources, usually they must be supplemented with metadata processes. Metadata-as-process suggests the very large role of the ad hoc, the incomplete, and the unfinished in everyday scientific work.


European Conference on Research and Advanced Technology for Digital Libraries | 2007

Know thy sensor: trust, data quality, and data integrity in scientific digital libraries

Jillian C. Wallis; Christine L. Borgman; Matthew S. Mayernik; Alberto Pepe; Nithya Ramanathan; Mark Hansen

For users to trust and interpret the data in scientific digital libraries, they must be able to assess the integrity of those data. Criteria for data integrity vary by context, by scientific problem, by individual, and a variety of other factors. This paper compares technical approaches to data integrity with scientific practices, as a case study in the Center for Embedded Networked Sensing (CENS) in the use of wireless, in-situ sensing for the collection of large scientific data sets. The goal of this research is to identify functional requirements for digital libraries of scientific data that will serve to bridge the gap between current technical approaches to data integrity and existing scientific practices.


Conference on Computer Supported Cooperative Work | 2012

Who’s Got the Data? Interdependencies in Science and Technology Collaborations

Christine L. Borgman; Jillian C. Wallis; Matthew S. Mayernik

Science and technology always have been interdependent, but never more so than with today’s highly instrumented data collection practices. We report on a long-term study of collaboration between environmental scientists (biology, ecology, marine sciences), computer scientists, and engineering research teams as part of a five-university distributed science and technology research center devoted to embedded networked sensing. The science and technology teams go into the field with mutual interests in gathering scientific data. “Data” are constituted very differently between the research teams. What are data to the science teams may be context to the technology teams, and vice versa. Interdependencies between the teams determine the ability to collect, use, and manage data in both the short and long terms. Four types of data were identified, which are managed separately, limiting both reusability of data and replication of research. Decisions on what data to curate, for whom, for what purposes, and for how long, should consider the interdependencies between scientific and technical processes, the complexities of data collection, and the disposition of the resulting data.


ACM/IEEE Joint Conference on Digital Libraries | 2010

Digital libraries for scientific data discovery and reuse: from vision to practical reality

Jillian C. Wallis; Matthew S. Mayernik; Christine L. Borgman; Alberto Pepe

Science and technology research is becoming not only more distributed and collaborative, but more highly instrumented. Digital libraries provide a means to capture, manage, and access the data deluge that results from these research enterprises. We have conducted research on data practices and participated in developing data management services for the Center for Embedded Networked Sensing since its founding in 2002 as a National Science Foundation Science and Technology Center. Over the course of eight years, our digital library strategy has shifted dramatically in response to changing technologies, practices, and policies. We report on the development of several DL systems and on the lessons learned, which include the difficulty of anticipating data requirements from nascent technologies, building systems for highly diverse work practices and data types, the need to bind together multiple single-purpose systems, the lack of incentives to manage and share data, the complementary nature of research and development in understanding practices, and sustainability.


Bulletin of the American Meteorological Society | 2015

Peer Review of Datasets: When, Why, and How

Matthew S. Mayernik; Sarah Callaghan; Roland Leigh; Jonathan A. Tedds; Steven J. Worley

Peer review holds a central place within the scientific communication system. Traditionally, research quality has been assessed by peer review of journal articles, conference proceedings, and books. There is strong support for the peer review process within the academic community, with scholars contributing peer reviews with little formal reward. Reviewing is seen as a contribution to the community as well as an opportunity to polish and refine understanding of the cutting edge of research. This paper discusses the applicability of the peer review process for assessing and ensuring the quality of datasets. Establishing the quality of datasets is a multifaceted task that encompasses many automated and manual processes. Adding research data into the publication and peer review queues will increase the stress on the scientific publishing system, but if done with forethought will also increase the trustworthiness and value of individual datasets, strengthen the findings based on cited datasets, and in...


Conference on Computer Supported Cooperative Work | 2013

Unearthing the Infrastructure: Humans and Sensors in Field-Based Scientific Research

Matthew S. Mayernik; Jillian C. Wallis; Christine L. Borgman

Distributed sensing systems for studying scientific phenomena are critical applications of information technologies. By embedding computational intelligence in the environment of study, sensing systems allow researchers to study phenomena at spatial and temporal scales that were previously impossible to achieve. We present an ethnographic study of field research practices among researchers in the Center for Embedded Networked Sensing (CENS), a National Science Foundation Science & Technology Center devoted to developing wireless sensing systems for scientific and social applications. Using the concepts of boundary objects and trading zones, we trace the processes of collaborative research around sensor technology development and adoption within CENS. Over the 10-year lifespan of CENS, sensor technologies, sensor data, field research methods, and statistical expertise each emerged as boundary objects that were understood differently by the science and technology partners. We illustrate how sensing technologies were incompatible with field-based environmental research until researchers “unearthed” their infrastructures, explicitly reintroducing human skill and expertise into the data collection process and developing new collaborative languages that emphasized building dynamic sensing systems that addressed human needs. In collaborating around a dynamic sensing model, the sensing systems became embedded not in the environment of study, but in the practices of the scientists.


BioScience | 2012

Advanced Technologies and Data Management Practices in Environmental Science: Lessons from Academia

Rebecca R. Hernandez; Matthew S. Mayernik; Michelle L. Murphy-Mariscal; Michael F. Allen

Environmental scientists are increasing their capitalization on advancements in technology, computation, and data management. However, the extent of that capitalization is unknown. We analyzed the survey responses of 434 graduate students to evaluate the understanding and use of such advances in the environmental sciences. Two-thirds of the students had not taken courses related to information science and the analysis of complex data. Seventy-four percent of the students reported no skill in programming languages or computational applications. Of the students who had completed research projects, 26% had created metadata for research data sets, and 29% had archived their data so that it was available online. One-third of these students used an environmental sensor. The results differed according to the students’ research status, degree type, and university type. Changes may be necessary in the curricula of university programs that seek to prepare environmental scientists for this technologically advanced and data-intensive age.


Association for Information Science and Technology | 2016

Research data and metadata curation as institutional issues

Matthew S. Mayernik

Research data curation initiatives must support heterogeneous kinds of projects, data, and metadata. This article examines variability in data and metadata practices using “institutions” as the key theoretical concept. Institutions, in the sense used here, are stable patterns of human behavior that structure, legitimize, or delegitimize actions, relationships, and understandings within particular situations. Based on prior conceptualizations of institutions, a theoretical framework is presented that outlines 5 categories of “institutional carriers” for data practices: (a) norms and symbols, (b) intermediaries, (c) routines, (d) standards, and (e) material objects. These institutional carriers are central to understanding how scientific data and metadata practices originate, stabilize, evolve, and transfer. This institutional framework is applied to 3 case studies: the Center for Embedded Networked Sensing (CENS), the Long Term Ecological Research (LTER) network, and the University Corporation for Atmospheric Research (UCAR). These cases are used to illustrate how institutional support for data and metadata management are not uniform within a single organization or academic discipline. Instead, broad spectra of institutional configurations for managing data and metadata exist within and across disciplines and organizations.


Archive | 2013

Bridging data lifecycles: Tracking data use via data citations workshop report

Matthew S. Mayernik

Digital technologies for identifying and linking to resources on the internet promise to make connections between scholarly publications and their underlying data more transparent and traceable. “Data citations” are formal citations included in reference lists of published articles to data resources that led to a given research result. The workshop “Bridging Data Lifecycles: Tracking Data Use via Data Citations,” held by the University Corporation for Atmospheric Research (UCAR) in April 2012, brought together 80 people from 30+ organizations to discuss many aspects of data citations. This report outlines the important activities, tools, challenges, and impediments to data citation initiatives that were identified during the workshop. The report also outlines a set of recommendations on how to get started on the processes of assigning citations and actionable identifiers to data sets without having solved every issue. By making it easy for users, providing openness and transparency in how data citation tools are being applied, and leveraging the interest and expertise of the multiple communities of stakeholders, organizations can promote, enable, and embed data citations as regular components of the scholarly communication infrastructure.


Journal of the Association for Information Science and Technology | 2017

Assessing and tracing the outcomes and impact of research infrastructures

Matthew S. Mayernik; David L. Hart; Keith E. Maull; Nicholas M. Weber

Recent policy shifts on the part of funding agencies and journal publishers are causing changes in the acknowledgment and citation behaviors of scholars. A growing emphasis on open science and reproducibility is changing how authors cite and acknowledge “research infrastructures”—entities that are used as inputs to or as underlying foundations for scholarly research, including data sets, software packages, computational models, observational platforms, and computing facilities. At the same time, stakeholder interest in quantitative understanding of impact is spurring increased collection and analysis of metrics related to use of research infrastructures. This article reviews work spanning several decades on tracing and assessing the outcomes and impacts from these kinds of research infrastructures. We discuss how research infrastructures are identified and referenced by scholars in the research literature and how those references are being collected and analyzed for the purposes of evaluating impact. Synthesizing common features of a wide range of studies, we identify notable challenges that impede the analysis of impact metrics for research infrastructures and outline key open research questions that can guide future research and applications related to such metrics.

Collaboration


Dive into Matthew S. Mayernik's collaborations.

Top Co-Authors

Sarah Callaghan, Rutherford Appleton Laboratory
John Kunze, University of California
Steven J. Worley, National Center for Atmospheric Research
Suzie Allard, University Corporation for Atmospheric Research