Do you help researchers using IT tools to further their research goals?
Are you involved in research computing and/or data science training?
Do you consult with researchers on how to do their research more effectively with advanced computing resources?
Compute with data?
The Researcher-Facing Track of the People Network brings together people from research computing groups, libraries, research institutes, and other organizations who support researchers in every phase of the research lifecycle. Many of us are also Data- & Systems-Facing, but this track is a community-led opportunity to discuss the practices, perspectives, and experiences of facilitation in all its forms.
Topics include:
- Research computing facilitation
- Outreach to all disciplines, especially those under-represented, to connect them with research computing resources
- Education and training
- The art and practice of facilitation
- Increasing communications, collaborations, and team-building
- Research computing technologies
- And more, as determined by our members!
We connect via monthly calls and an email list. We also invite you to review and contribute to the Leading Practices of Facilitation!
Join Us!
Fill out our membership form to let us know who you are and what you’re interested in. We’ll add you to the email list.
Monthly Calls
Monthly calls are on the second Thursday of the month, 1p ET/ 12p CT/ 11a MT/ 10a PT/ 8a HT. Connection information and links to any materials are distributed via email.
Upcoming Call(s)
Researcher-Facing Track, Thursday Apr 8, 2021 1pm ET/ 12pm CT/ 11am MT/ 10am PT/ 8am HT
Title: On Measuring the Impact of Training
Presenter(s):
Kari Jordan (Carpentries)
Julie Wilson Rojewski and Astri Briliyanti, CyberAmbassadors
Description:
On previous Researcher-Facing calls, we’ve had the opportunity to discuss topics relating to measuring impact and improving training. And as discussed in the Leading Practices of Facilitation, “training & education” is one of the major pillars of our efforts. Many of us provide training opportunities and struggle to define and measure “impact” or “success.” Is it short-term gains (quality scores for the class and instructors, reduced support burden, and acclimating users)? Long-term considerations (effectiveness of training programs, building relationships, promoting awareness and participation)? Does it depend on the kind of training (professional skills, technical topics)? Or are we confounding these, complicating both the objectives and outcomes?
April’s call will showcase two “case studies” of measuring training impact, where each presenter will talk about their programs, define “impact”, and explain their approach to measuring this. Please also join us by contributing to our pre-talk survey: What challenges do you currently face in measuring training impact? And what successful strategies have you tried?
Previous Call Topics
Track Coordinators
Justin Booth, Dir. Research IT, Michigan State University
Bob Freeman, Dir. Research Technology Operations, Harvard Business School
To contact both Justin and Bob, email rf-coordinators@carcc.org
Steering Committee Members (alphabetically)
Gladys Karina Andino Bautista
Katia Bulekova
Dirk Colbry
Martin Cuma
Christina Koch
Amy Neeser, UC Berkeley
Janna Ore Nugent
Wirawan Purwanto
Annelie Rugg
We also wish to thank those who have previously led and contributed to the Researcher-Facing track:
- Claire Mizumoto (U. California, San Diego), Track Coordinator
- Erin Share (USC), Steering Committee
- Dana Brunson, Track Coordinator
- Lauren Michael, Track Coordinator