Meeting agenda
Time and Date | Description |
---|---|
7 AM 21 Feb | Registration, Breakfast |
8 AM 21 Feb | Welcome address (PI meeting organizers) |
8:05 AM 21 Feb | Housekeeping (KnowInnovation) |
8:15 AM 21 Feb | Irene Qualters (NSF Division Director, ACI) |
8:30 AM 21 Feb | Rajiv Ramnath (NSF Program Director, SI2) |
8:40 AM 21 Feb | Prompt and activity (KnowInnovation) |
9:25 AM 21 Feb | Invited talk: Lorena Barba |
10:00 AM 21 Feb | Invited talk: Jed Brown |
10:35 AM 21 Feb | Invited talk: Eric Klavins |
11:45 AM 21 Feb | Breakout discussions |
12:35 PM 21 Feb | Breakout group present backs |
1:00 PM 21 Feb | Lunch |
1:30 PM 21 Feb | Invited talk: Tim Menzies |
2:15 PM 21 Feb | Audience-proposed breakout discussions |
3:15 PM 21 Feb | Present backs |
3:30 PM 21 Feb | Institute report: T. Daniel Crawford |
3:50 PM 21 Feb | Institute report: Nancy Wilkins-Diehr |
4:15 PM 21 Feb | Poster session and reception |
7:00 AM 22 Feb | Breakfast |
8:00 AM 22 Feb | Steve Konsek (NSF Program Director, I-Corps) |
8:10 AM 22 Feb | Invited Talk: Tracy Teal |
9:20 AM 22 Feb | Breakout discussions |
10:45 AM 22 Feb | Present backs |
11:45 AM 22 Feb | Complete meeting survey |
12:00 PM 22 Feb | Jim Kurose (NSF Assistant Director for CISE) |
12:15 PM 22 Feb | Lunch & close |
Talk abstracts
Lorena Barba
Associate Professor, Mechanical and Aerospace Engineering, George Washington University, Washington DC
Title: “How to run a lab for reproducible research”
Abstract: “… if everyone on a research team knows that everything they do is going to someday be published for reproducibility, they’ll behave differently from day one.”—David Donoho, pioneer of the reproducible-research movement. As a principal investigator, how do you run your lab for reproducibility? I submit the following action areas: commitment, transparency and open science, onboarding, collaboration, community, and leadership. Make a public commitment to reproducible research—what this means for you could differ from others, but an essential core is common to all. Transparency is an essential value, and embracing open science is the best route to realize it. Onboarding every lab member with a deliberate group “syllabus” for reproducibility sets the expectations high. What is your list of must-read literature on reproducible research? I can share mine with you: my lab members helped to make it. For collaborating efficiently and building community, we take inspiration from the open-source world. We adopt its technology platforms to work on software and to communicate, openly and collaboratively. Key to the open-source culture is to give credit—give lots of credit for every contribution: code, documentation, tests, issue reports! The tools and methods require training, but running a lab for reproducibility is your decision. Start here → commitment.
Slides: https://doi.org/10.6084/m9.figshare.4676170
Jed Brown
Assistant Professor, Department of Computer Science, University of Colorado, Boulder
Title: “Community building through software design”
Abstract: Scientific software has many different usage modes with widely varying consequences on the potential to build a cohesive and agile developer and user community. We propose that one can design software to foster desirable community attributes with modest effort and without compromising technical capability. Pivotal in this strategy is the principle of extensibility, by which we anticipate innovation and provide a mechanism for loosely coupled development, packaging autonomy, credit, and long-term sustainability. We will consider design examples from applications and libraries and reflect on their consequences. We will also propose best practices for new and legacy software.
Slides: https://doi.org/10.6084/m9.figshare.4676218
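The extensibility principle in the abstract above — anticipating innovation by giving outside contributors a loosely coupled way to plug in, without touching the core — can be illustrated with a minimal registry pattern. This sketch is not from the talk; the solver names and functions are invented for illustration only.

```python
# Illustrative sketch of designing for extensibility: core code exposes a
# registration point, so third-party packages can add solvers without the
# core ever importing them. All names here are hypothetical.

SOLVERS = {}  # name -> callable; the extension point

def register_solver(name):
    """Decorator: contributors register solvers from their own packages,
    which keeps development loosely coupled and credit attributable."""
    def wrap(fn):
        SOLVERS[name] = fn
        return fn
    return wrap

@register_solver("relaxation")
def relaxation_step(x, target, relax=0.5):
    # Toy "solver": move x a fraction of the way toward target.
    return x + relax * (target - x)

def solve(name, x, target, iters=10):
    step = SOLVERS[name]  # core code looks up the plugin by name only
    for _ in range(iters):
        x = step(x, target)
    return x

print(round(solve("relaxation", 0.0, 1.0), 4))  # approaches 1.0
```

The design choice worth noting is that `solve` depends only on the registry, not on any concrete solver, which is what allows packaging autonomy and long-term maintenance by separate groups.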
Eric Klavins
Professor, Electrical Engineering, University of Washington, Seattle
Title: “The Aquarium Laboratory Operating System”
Abstract: Laboratory research at the bench, involving molecular biology reagents, equipment, and living cells, is difficult or impossible to reproduce. Generations of graduate students and postdocs have implemented existing experimental methods or invented new ones, only to graduate and take their knowledge with them. Lab notebooks, the mainstay of experimental labs, are notoriously useless, even in electronic form, for those wishing to reproduce another scientist’s work. By some accounts, up to 90% of scientific research in the life sciences is irreproducible, leading to wasted funding and very slow progress in basic research. To address this problem, we invented a software system called Aquarium, which allows us to pre-specify laboratory protocols and experimental workflows in software at a level of detail that makes them easy to reproduce. We have run Aquarium for three years in our lab, running hundreds of jobs per week and managing tens of thousands of samples across more than thirty experimental protocols. In April 2016, we scaled our lab up into a service center and now run jobs for dozens of researchers in eight different labs at UW and elsewhere. Based on the Aquarium framework, we have made all the workflows in our lab easy for anyone to reproduce via a web interface. Furthermore, the software can train an undergraduate technician to run all the protocols in three weeks. Using Aquarium, a new graduate student can build transgenic strains using a keyboard and mouse, getting results in days instead of months or years.
Slides: https://doi.org/10.6084/m9.figshare.4684156
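The core idea in the abstract above — pre-specifying a protocol as ordered, parameterized steps in software so it can be versioned, replayed, and audited — can be sketched generically. This is a hypothetical illustration, not Aquarium’s actual API, and the protocol steps shown are invented placeholders.

```python
# Hypothetical sketch (NOT Aquarium's real API): a lab protocol expressed as
# data plus code, so every run follows the same explicit, reproducible steps.

from dataclasses import dataclass, field

@dataclass
class Step:
    action: str
    params: dict

@dataclass
class Protocol:
    name: str
    steps: list = field(default_factory=list)

    def add(self, action, **params):
        self.steps.append(Step(action, params))
        return self  # allow chaining

    def run(self, log):
        # A real system would drive equipment or instruct technicians;
        # here we just record each step so the run is auditable.
        for i, s in enumerate(self.steps, 1):
            log.append(f"{i}. {s.action} {s.params}")

# Invented example protocol for illustration.
miniprep = (Protocol("plasmid_miniprep")
            .add("pellet_cells", speed_rpm=8000, minutes=2)
            .add("resuspend", buffer="P1", volume_ul=250)
            .add("elute", volume_ul=50))

log = []
miniprep.run(log)
print("\n".join(log))
```

Because the protocol is an ordinary data structure, it can live in version control and be re-run identically by anyone — the property the abstract credits for making the lab’s workflows reproducible.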
Tim Menzies
Professor, Computer Science, North Carolina State University
Title: “Understanding software: recent lessons from empirical software engineering”
Slides: https://doi.org/10.6084/m9.figshare.4680961
Tracy Teal
Project Lead for Data Carpentry; Assistant Professor, BEACON, Michigan State University
Title: “Democratizing and supporting software development: a pathway to sustainable software”
Abstract: With the expansion in the variety, velocity, and volume of data being produced, computing and software development have become crucial elements of research. However, while we value the research, we place less importance on the development of the software itself, viewing software as a service to research. By viewing software as a service, we underrate the effort, expertise, and training it takes to produce software for effective research computing. We also fail to support the people doing the development, often expecting individual developers to provide systems administration, user support, and training, and to produce documentation and user interfaces. With our increased reliance on research computing, accurate and reproducible research requires that software not be separate from the act of conducting research but an integral component: a part of, rather than a service to, research. Shifts in how we provide data skills and software development training, integrate development into research programs and academic departments, and value software as a product can have an impact on the quality, creativity, and types of research we can conduct.