Researcher-Facing Track

Do you help researchers use IT tools to further their research goals?
Are you involved in research computing and/or data science training?
Do you consult with researchers on how to do their research more effectively with advanced computing resources? On how to compute with data?

The Researcher-Facing Track of the People Network brings together people from research computing groups, libraries, research institutes, and other organizations who support researchers in every phase of the research lifecycle. Many of us are also Data- & Systems-Facing, but this track is a community-led opportunity to discuss the practices, perspectives, and experiences of facilitation, whatever role we hold.

Topics include:

  • Research computing facilitation
  • Outreach to all disciplines, especially those under-represented, to help them take advantage of research computing resources
  • Education and training
  • The art and practice of facilitation
  • Increasing communications, collaborations, and team-building
  • Research computing technologies
  • And more, as determined by our members!

We connect via monthly calls and an email list. We also invite you to review and contribute to the Leading Practices of Facilitation!

Join Us!

Fill out our membership form to let us know who you are and what you’re interested in. We’ll add you to the email list.

Monthly Calls

Monthly calls are on the second Thursday of the month, 1pm ET / 12pm CT / 11am MT / 10am PT / 8am HT. Connection information and links to any materials are distributed via email.

Upcoming Call(s)

Researcher-Facing Track, Thursday, April 8, 2021, 1pm ET / 12pm CT / 11am MT / 10am PT / 8am HT

Title: On Measuring the Impact of Training
Presenter(s):
Kari Jordan (The Carpentries)
Julie Wilson Rojewski and Astri Briliyanti (CyberAmbassadors)

Description:
On previous Researcher-Facing calls, we’ve had the opportunity to discuss topics relating to measuring impact and improving training. And as discussed in the Leading Practices of Facilitation, “training & education” is one of the major pillars of our efforts. Many of us provide training opportunities and struggle to define and measure “impact” or “success”: is it short-term gains (quality scores for the class & instructors, reduced support burden, and acclimating users), long-term considerations (effectiveness of training programs, building relationships, promoting awareness and participation), or does it depend on the kind of training (professional skills, technical topics)? Or are we confounding these, complicating both the objectives and outcomes?

April’s call will showcase two “case studies” of measuring training impact: each presenter will describe their program, define “impact”, and explain their approach to measuring it. Please also contribute to our pre-talk survey: What challenges do you currently face in measuring training impact? And what successful strategies have you tried?

Previous Call Topics

  • 3/11/21 All About Orienting Researchers to Research Computing + Data Resources (video, doc)
  • 2/11/21 Supporting Researchers with Containers (video, doc)
  • 1/14/21 All About CaRCC (… beyond the R-F Track) (video, doc, slides)
  • 11/12/20 CaRCC End-Of-Year Party (doc)
  • 11/12/20 Profiling and Optimizing R Code in Your Workloads (video, doc, slides)
  • 10/08/20 Writing More Equitable Job Postings (video, doc, slides)
  • 09/10/20 Supporting research projects on the cloud (video unavailable; doc, slides)
  • 07/09/20 Big Data, Big Compute Solutions, a Community Discussion (video, doc, slides)
  • 05/14/20 Remote Support (cont’d): Working with Your Team Remotely (video, doc)
  • 04/09/20 Sharing Good Practices in Remote User Support & Training (video, doc)
  • 03/12/20 User Support via Ask.CI and Locales (video, doc)
  • 02/13/20 Doing it in public: User support via Slack, open forums, GitHub repos and blogs (video, doc)
  • 01/09/20 Handling Support Requests – A Community Discussion (video, doc)
  • 12/12/19 University of Missouri reports and Researcher-Facing/CI-Engineer Metrics (video, doc)
  • 11/14/19 Open OnDemand User Experiences & Challenges (video, doc)
  • 10/10/19 An Expansion of User Support Services for the Research Computing Group at the University of Colorado Boulder (video, slides, doc)
  • 09/12/19 A Startup Framework for Building Digital Research Capacity and Community at UCLA (slides, doc)
  • 08/08/19 PEARC19 recap!
  • 07/11/19 Creating effective training materials
  • 06/13/19 Lightning talks / Open mic sessions on track topics

Track Coordinators

Justin Booth, Dir. Research IT, Michigan State University
Bob Freeman, Dir. Research Technology Operations, Harvard Business School
Contact both Justin and Bob at rf-coordinators@carcc.org.

Steering Committee Members (alphabetically)

Gladys Karina Andino Bautista
Katia Bulekova
Dirk Colbry
Martin Cuma
Christina Koch
Amy Neeser, UC Berkeley
Janna Ore Nugent
Wirawan Purwanto
Annelie Rugg

We also wish to thank those who have previously led and contributed to the Researcher-Facing track:

  • Claire Mizumoto (U. California, San Diego), Track Coordinator
  • Erin Share (USC), Steering Committee
  • Dana Brunson, Track Coordinator
  • Lauren Michael, Track Coordinator