Ten Tactics Internal Evaluators Can Use to Build Organizational Learning within Nonprofits

[I wrote this piece for the American Evaluation Association’s AEA365 (a daily blog with tips about evaluation from evaluators) during a special Organizational Learning-Evaluation Capacity Building week in April 2024; it was published on April 15, 2024.]

Most of my work focuses on evaluation capacity-building with nonprofits. I’ve been deeply immersed in building culture and practices for organizational learning for 16 years, both as an internal evaluator and as a coach to many internal evaluators. Based on my experiences in a range of settings, here are ten (among many more) tactics to build organizational learning:

  1. Form a data committee that includes representatives from different departments and roles. The purpose of this committee might vary depending on where your organization is in its data journey, but some functions might include: inspiring champions in non-evaluation roles to bring learning and information to others in your organization; data governance; and identifying areas of overlap and opportunity between evaluation and other projects.

  2. Fold data/results and reflection into existing meetings and structures. To help reduce the silo-ing of evaluation, look for opportunities to bring a data point naturally into staff and other regular meetings.

  3. Use consistent reflection questions and/or protocols. Consider using some predictable questions and structures that support non-data folks to have a “way in” to interpret data during longer data reviews.

  4. Set, share, and keep to a data schedule. To help avoid that “black box” feeling, create and share a data schedule with your organization that includes when tools will be ready (e.g., when a survey will open), when data will be collected, and when results will be available in which format(s).

  5. Be proactive. Share updates and findings, and check in about data needs across your organization. People will appreciate hearing about small wins or challenges you are working on and knowing that you have their interests in mind.

  6. Express appreciation. Honor those who contribute to evaluation work (e.g., who help collect data) or who use your data. Not only will this contribute to a positive data culture and make people feel good, but it will also foster learning through examples.

  7. Share examples from other organizations/nonprofits. Our colleagues, who are not evaluators, put a lot of faith in us. Finding and sharing examples of how similar organizations have done what you propose can build confidence.

  8. Ask leaders in a range of roles to feature/share how they use evaluation in their work. This helps people see a broader vision of how they could leverage data and inspires data use.

  9. Learn by doing together. Work on tools like surveys, and get input on types of analyses, with folks who are not evaluators. Describe the why behind your thinking and ask about their rationales too. You’ll learn from each other, and the work will be more meaningful and fun.

  10. Add capacity with an evaluation advisory board. Bringing some outside evaluators and community members in can help advance learning, adding diverse perspectives, experiences, and wisdom.

Hot Tips

Curious about how to set up an evaluation advisory board? Here’s an article about it in the Journal of Youth Development that I co-wrote with fellow advisory board colleagues. Although we focus on the context of youth development programs, the tips and examples are applicable to other settings.

Rad Resources

Here’s a helpful protocol that includes questions for data reviews from the Network for College Success.

I’ve found The Facilitator’s Guide to Participatory Decision-Making to be a powerful and practical book to help me frame meetings, data reviews, and other gatherings effectively. You can find more information about it, as well as related publications, here.

Flowers blooming from a recent visit to the Oak Park Conservatory with a bestie who has been a teacher for 30 years—she is always learning and promoting learning.

Let’s Get Physical (with Data and in Nature)

For quite some time, I’ve been fascinated by data physicalization—collecting, visualizing, and engaging with data using tactile and physical forms. This is a story of bringing hands-on data collection to a hands-on event, which may inspire you to try something old-school too.

What am I even talking about? Here are some beautiful examples of data physicalization from Jon Schwabish and The Urban Institute.

We weren’t sure how people were learning about our events….

After volunteering for over a year, I recently joined the Board of The Somerville Growing Center—a green space in my neighborhood where community and plants grow. The Center is a space for local preschool children and people of all ages to learn about the natural world and urban agriculture and to simply play outdoors. I’ve loved participating in and helping to host yoga classes, concerts, and the most adorable annual camp-out for stuffed animals, among other events. Last spring, I grew sunflower plants from seeds at home, which then grew in the garden—it was so fun to watch them unfold, grow, and survive bunny snack attacks. The Growing Center is a true oasis in Somerville, the most densely populated municipality in New England, with over 80,000 people living in just 4 square miles.

In March, The Somerville Growing Center hosted its 25th annual Maple Boil Down. A team of volunteers tapped maple trees on the Tufts University (my Ph.D. alma mater) campus and collected sap from January to March. Then, at the Boil Down, volunteers boiled it all down into real, true 100% hyper-local maple syrup. Kids and adults got to learn about the process, taste sap and syrup, and engage in hands-on activities.

As we planned outreach and advertising for the event, a fellow board member who was going to put up flyers wondered how effective they were, and a group of us discussed a survey that the Growing Center had used in the past to ask folks how they found out about the event. The group wondered if we might use such a survey again and which outreach categories we should ask about. I learned that at last year’s Boil Down, where they posted a survey using a QR code, they got only two responses… Always eager to learn and improve, the team was ready to try something new.

I thought about the event and all the hands-on activities that were planned (e.g., getting to touch and use some of the kinds of tools involved in tapping sap), and I wondered if we could use a physical survey. We quickly identified outreach categories/responses, and the team made a jar for each category. As parties entered the event, volunteers gave each person a small tile, which attendees placed in the relevant jar before engaging in other activities. As a result, we got 410 responses out of around 620 attendees (though, for instance, my friend’s 6-month-old baby didn’t answer). We learned that friends and family/word of mouth was the most powerful source, followed by online event listings and social media. It was easy, hands-on, and an approach that matched the flow and vibe of the day. We’re excited to use this approach again.
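Our jar tallies were simple enough to count by hand, but for anyone curious about the arithmetic, here is a minimal sketch in Python. The category labels and per-jar counts are illustrative assumptions; only the overall totals (410 responses, roughly 620 attendees) come from the event itself.

```python
# Hypothetical tally of a tile-and-jar survey. Per-jar counts are made up
# for illustration; only the totals (410 responses, ~620 attendees) match
# the event described above.
from collections import Counter

jar_counts = Counter({
    "Friends/family & word of mouth": 185,
    "Online event listings": 110,
    "Social media": 75,
    "Flyers & other": 40,
})

total_responses = sum(jar_counts.values())  # 410 tiles placed in jars
attendees = 620                             # approximate headcount
response_rate = total_responses / attendees

# Print each category's count and share, largest first
for category, n in jar_counts.most_common():
    print(f"{category}: {n} ({n / total_responses:.0%})")
print(f"Response rate: {response_rate:.0%} ({total_responses}/{attendees})")
```

A 66% response rate would be a dramatic improvement over the two QR-code responses the year before, which is the real lesson: match the data collection method to the vibe of the event.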

By the way, the Growing Center is celebrating its 30th anniversary right now. If you like their mission, and feel so inspired, please donate $30 (or any amount) here.

Also, I would love to hear about any ways you use data physicalization. Reach out if you want to get crafty and make something together.

Happy Spring!

Our physicalized survey with some responses early in the day

Shhh...Listening and Using Scientific Tools to Solve Problems

Today, I heard a fascinating interview with Saul Perlmutter, a Nobel Prize-winning physicist, on WBUR’s Here & Now radio show. Perlmutter discussed his new book, which shares ideas about how we can all apply the scientific method to decision-making. He noted how Americans tend to have a bias toward valuing people with strong opinions.

Of course, this raised feelings, thoughts, and fears about political divisiveness, but it also reminded me of my work. I’ve noticed a lot of evaluation consultants positioning themselves as “experts” and that some who express strong opinions (e.g., “always do this” or “never ever do that”) get a lot of attention. I understand this—as a human with my own biases and attention span, etc.—and also because of programs’ and policymakers’ frustration with experts, particularly those in academia. As scientists and academics, we are trained to consider nuances, limitations, and caveats. We get a question, and we say, “well, on the one hand…but on the other….” There’s a famous joke, which I’ve seen attributed to different sources, about some type of expert/academic having too many hands because they always answered questions with “on the one hand, on the other, on the other.” Sometimes you just want a simple answer!

So, on the one hand, I appreciate how some experts in my field have cut through the noise with clear, compelling communications, and on the other…. (hahahahaha, see what I did there?!). But it’s true—I value expertise, and I am proud of all I’ve learned and can share from a 30+ year career in research and evaluation. And yet experts don’t know everything, and I tire of that positioning at times, especially when we need to listen more to voices that haven’t been heard. I aspire, as I wrote in a post yesterday, to communicate about research, evaluation, and theory clearly, in ways that can be understood and used by a range of folks. And yes, I have strong opinions about some things (e.g., pie charts with too many wedges, right?!), and there are times when I want my expertise and experiences to be heard and considered. But on the other hand, I want to do more listening, and you might find that, when we work together, I’ll say, “There’s rarely one right way to do this kind of thing” and “You are the expert about your own program [or experience].” I hope I listen well, and that together we generate multiple solutions we can consider and test together.

To make a sound decision, take a meaningful action, or solve a problem — whether as individuals, in groups, or as a society — we need first to understand reality. But when reality is not easy to discern, and we’re not sure which experts to trust to clarify the matter, we adopt other strategies for navigating the clutter. We “go with our gut”; decide what we “believe” and look for evidence to reaffirm whatever that is; adopt positions based on our affiliations with people we know; even find reassurance in belittling the people who disagree with us. We choose to consult experts who tell us what we like to hear; or bond in shared mistrust of people providing or communicating the information that confuses us, whether they are scientists, scholars, journalists, community leaders, policymakers, or other experts. These coping strategies may help us get by in our personal or professional lives; they may provide a consoling sense of identity or belonging. But they do not actually help us see clearly or make good decisions.
Book excerpt: 'Third Millennium Thinking: Creating Sense in a World of Nonsense'

By Saul Perlmutter, John Campbell and Robert MacCoun

A post-it someone stuck on the window of my bus stop!

Translations and Little Passions

We all need more small delights, and I love getting to hear about people’s little passions, as you may have deduced from my recent post here.

One of my passions is yet another quirky podcast so full of joy and silliness that I wish someone (anyone?!) I know would listen to it and talk about it with me… It’s Dr. Gameshow, hosted by the comedian Jo Firestone. Listeners write in with games they design, and on each show, callers (adults and sometimes their kids) play three of these games. A recent game was “Do You Know What My Roommate Is Talking About?” Two listeners designed it based on conversations with their roommate, who forgets words and uses phrases like “centipede car (it takes the children)” to mean other things (in this case, school bus!). Callers get a word or phrase from the roommate and guess what they were talking about.

I relate to this game as a lifelong “translator.” Growing up, my grandmother lived on the first floor of our house, and my mother and I lived on the top two floors. They did not get along well, and I spent a lot of time going up and down the stairs, carrying and translating messages between them. Was this healthy for a kid? Big NOPE. But sometimes difficult experiences, and the ways we learn to cope, lead to skills we can use in positive ways. (I also seemed to love interpreting the communications of my 2-year-old neighbor when I was 5 years old, so maybe this was just something innate within me too.) Anyway, across my career, I’ve been fascinated by and drawn to ways to improve science communication (check out the Alda Center for some cool stuff about this) and in particular how to translate research and evaluation for use among parents, programs, and policymakers (among others). This led me to my incredible Applied Child Development doctoral program at Tufts. Through an internship and part-time job at The Children’s Trust across 5 years of my program, I got to translate research and evaluation needs and findings between state legislators, academics, parent educators, and nonprofit leaders. I learned so much from everyone there.

For the past 16 years, I’ve been immersed in the world of AmeriCorps, first as an internal evaluator and then as a consultant to programs and state commissions across the country. I’ve been called a unicorn by one program officer because I know evaluation and I know AmeriCorps, and I can speak both languages. I’m still always learning, and I don’t want to be the rare unicorn!

If you know an evaluator who wants to become more familiar with AmeriCorps, I’m offering this orientation webinar on April 5 and would love for you to share this.

What are your small passions these days?

An image from Dr. Gameshow’s Instagram

Evaluation Inspiration is Everywhere: Sleeping with Celebrities

If you want to be a good evaluator, or to be really good at so many professions actually, be sure to keep your eyes open in the world outside of evaluation for ideas. Sometimes you’ll find inspiration and approaches to emulate in surprising places.

One of my favorite podcasts to listen to is called Sleeping with Celebrities. Most of the guests are not well-known celebrities, in my opinion, but that’s no matter: “Each week on the slyly humorous and reassuring Sleeping with Celebrities, host John Moe talks with a different guest from the world of entertainment about something they know a lot about.”

It's a show you can listen to as a relaxing escape or to help you feel sleepy (“we don’t want guests to bring their A game, we want them to bring their Zzz game”). The host comes from the world of comedy and throws in some deadpan humor I just love, the kind that makes me suddenly giggle while sleepy.

What are these somethings the guests know a lot about? Well, among my favorite episodes, guests have gone deep, deep into such passions and topics as: porridge; describing Melrose Place; ten (surprising) favorite candies; how to build a home music recording studio; and a guy’s setlist of covers from his college open-mic days. (By the way, in one episode, I learned the shocking fact that carrots are bad for rabbits, and that we only think of carrots and bunnies as going together because of Bugs Bunny and the movie It Happened One Night!)

How does this relate to evaluation? We can apply a lot from this show to our skills in designing interviews and focus groups and how we facilitate them.

  • Conveying curiosity, warmth, and empathy: John Moe conveys genuine interest in his guests’ sometimes most ordinary of topics, through his tone of voice, his questions and paraphrases, and another way he does this is by….

  • Listening well and using meaningful probes: Although he has planned some questions, he listens well and asks follow-up questions that are truly interesting and detailed in response to what he’s heard from his guest.

  • Slowing things way down and keeping them simple: Too many times, we try to cram too many questions into an hour-long (or even shorter!) interview or focus group. We rush through. Let’s cut our questions way back and leave space to breathe, slow down, allow for surprises, and to really dig in.

  • Appreciating differences, quirks, and discovery: The topics covered are rarely something the host knows about (he had never seen Melrose Place, but he asked follow-up questions about characters and plot points his guest had mentioned with great delight). Isn’t this why we choose to interview the people we do?—we want to learn from them. Let’s keep our minds open to listen for new discoveries and insights, big and small.

Of course, these are good reminders for our everyday conversations and listening too. At one evaluators’ meetup, after I discovered this podcast, I found myself chatting with a colleague about his new hobby of fly fishing. Instead of rushing off from a quick chat to talk to someone else, I found myself asking more questions about fly fishing, and the more I asked, the more fun we had. (And, did you know there are free little flybraries (!!!) to borrow fly ties, just like Little Free Libraries?! Mind blown.)

Sweet evaluation-y dreams,

Gretchen

Data Inspiration from Geena Davis

This is about how Geena Davis’s use of data inspired me, and some of my takeaways that maybe you will find useful too.

I belong to a celebrity memoir book club through my local library (CelebriTea! We drink and spill the tea!). I’ve loved it—it gets me to read some fun books I might not otherwise read with a group of women from different backgrounds, who exchange thoughts, questions, and ideas in a remarkably enjoyable and inclusive way. I don’t know how it happens, but I’ve never seen a group that takes turns so well, builds on each other’s ideas, and shares dissenting opinions so warmly and naturally. The stakes are low, of course, but we often have strong opinions, and the dynamic has stayed this way as members new and old have joined and attendance has varied. I’d love to bottle the nature of these exchanges and bring them to more spaces.

One month, we read Geena Davis’s memoir, which was fun and compelling. I was aware of her work to bring gender parity to Hollywood, but listening to her audiobook, I learned more details. I was struck by how she, who is not a data scientist or program evaluator, valued research and recognized that collecting data would help her learn and make her case to Disney and others. She invested in research, and her work has led to improvements, though gradual, in both roles and representation of women and girls as well as in measurement tools themselves.

You can read more about her work here, among other places:

https://www.bu.edu/articles/2020/geena-davis/

And this is the book we read: https://www.harpercollins.com/products/dying-of-politeness-geena-davis?variant=40404274806818

Inspiring takeaways/reminders

  • Great research ideas come from the people involved in the issue (I hope you already know this!). Geena Davis was a woman acting in Hollywood, seeing roles diminish for women as they aged, and a mother who noticed the lack of female representation in what her kids watched. These experiences inspired her to take action and fund groundbreaking research.

  • Results are powerful when shared within relationships/Don’t weaponize data. In her memoir, she describes bringing the results of their initial research to Disney animators, who were surprised and dismayed to see the unconscious biases they brought to their work. One animator noted that he had just drawn a crowd scene and filled it with all male characters, without even thinking about it. She comments on how these animators got into their work in order to bring joy to children and families. As quoted in the BU article, “I never bust anyone publicly.” Instead, Ms. Davis assumes best intentions and leverages a range of meeting types to share results.

  • Iterate! She started her research center in the early 2000s, and now, 20 years later, they’ve continued to innovate in science and measurement. Here’s an example of software they developed: https://seejane.org/research-informs-empowers/data/. Before developing this software, most studies of representation had been conducted manually.

From this rainy winter day in Boston, I hope you are staying warm tonight, maybe with a cup of tea and a book, and finding some inspiration. The Geena Davis Institute on Gender in Media says, “If she can see it, she can be it.”® If we can share examples of effective data use, we can learn and make the world better. Also, please support your local library and librarians!

a cup of tea with a strainer steeps with images of teapots behind it

Tiny Habits

While big moves have their place, I am often fascinated by working small and how tiny habits, interventions, and movements can lead to large impacts.
I loved the book Atomic Habits--it inspired me and gave me practical ideas to use: https://lnkd.in/eNrD8pdr
When stuck on some big, hard tasks, I return again and again to the Pomodoro Method 🍅 to help me focus and break them down into smaller bites: https://lnkd.in/eFVdK3UN

What are some ways you like to work small? #atomichabits #pomodorotechnique #eval

A bee hotel at The Somerville Growing Center with a QR code to learn more and make your own.

Gratitude for 8 Years of Bee's Knees!

A drawing of many 8s using a range of colors and pens

Yesterday, my business, Bee’s Knees Consulting LLC turned 8 years old. Thank you to my clients, friends, and colleagues for sharing your work, engagement, encouragement, opportunities, wisdom, inspiration, and joy. Every year, I get to learn and try new things and build new connections (in my heart, brain, and between people and organizations) in service.

8 has always been my favorite number for all sorts of reasons. For fun, I looked up some musings about 8 and found that 8 often represents balance and stability, abundance, constant flow, and renewal. There are 8 major planets in our solar system, and oxygen is the eighth element of the periodic table. What does this mean for Bee’s Knees? No idea, but particularly after the past few years, it feels good to reflect on balance, renewal, and breathing for sure, as well as what I want to do more of, less of, when, and how. I'm excited for the year ahead.

In gratitude,

Gretchen

P.S. It was fun to do a meditative drawing of 8s last night at a craft night hosted by two of my favorite local businesses, Tiny Turns Paperie and Remnant Brewing.

#eval #evaluation #nationalservice #consulting #programdesign

Places, Spaces, Science, and Bravery

The moments when seemingly disparate threads of our interests and professional directions intertwine and reveal a pattern, connecting past and present, are beautiful to me. This post is about how a recent learning experience reminded me of how “place” is one unifying theme in my work and life.

On June 13, I joined one of the best Greater Boston Evaluation Network presentations I’ve ever attended—Using Space and Place to Enhance Program Evaluation by Katie Butler. Katie is the founder and principal evaluator of The Geoliteracy Project. In her talk, she featured ways we could use place (physical location) and space (your context for a place, e.g., your classroom) to enhance evaluations, whether process or outcome studies. I’ve thought about her presentation many times, and about the ways that place and space have been a part of my work (and life).

I’ve often reflected on how fortunate I was to grow up in a safe neighborhood, with kind neighbors, plenty of space and a garden, in a relatively affordable city (Pittsburgh), within walking distance of cultural institutions, my schools, and parks. The economic and structural situation of my tiny family changed dramatically for the worse between my infancy and school age, and later, there would have been no way we could have landed in that home if we hadn’t already been there. We had no car after I turned 4, nor could we afford one, so walkability and public transit were critical and thankfully excellent. Place contributed to my resilience.

When I was at City Year, the organization talked about how a zip code should not determine the quality of a child’s education and future. In my work with Fishing Partnership, “place” is a critical part of nearly every conversation (e.g., the community health workers who are from the fishing communities they serve, bringing preventive healthcare to the docks where fishermen are; the contributions of the fishing industry to their port towns and the economy of Massachusetts; the risks of the sea and the ways fishermen serve as first responders). Anyway, during her presentation, Katie’s passion and skills were evident, and I kept thinking about a formative experience in my career and one of my most important mentors.

In my first job out of college, I got to practice so many skills that I still use today. For 3 years, I served as a Research Associate for a large NIH study at the University of Pittsburgh Medical Center with Herb Needleman, M.D. We studied the neurocognitive and behavioral effects of low levels of lead exposure among boys living in the city of Pittsburgh who had first been a part of another longitudinal study. Herb was a hero and my mentor. He trusted me and treated me like a colleague from the first time he interviewed me, and I treasured our lunches years later on my visits home from graduate school. On my first day of work, Herb gave me a manual for the new software he had purchased to build our study’s database and trusted me to build it. Herb and my other exceptionally wonderful boss, mentor, and friend, Julie Riess, Ph.D., nurtured my learning and interests. Thanks to them, my first publication was in JAMA and meaningful to policy (Needleman HL, Riess JA, Tobin MJ, Biesecker GE, Greenhouse JB. Bone Lead Levels and Delinquent Behavior. JAMA. 1996;275(5):363–369); you can read more about it here.

Maybe another time, I’ll describe all the things I got to do and learn while working with Herb and how it was a lucky combination of events that I even ended up there. What I most want to do today is to honor Herb. There are so many books and articles about him and by him, but for now, I hope you will read the links below. In recent years, we’ve witnessed the lead water crisis in Flint, Michigan and the devastating effects of environmental toxins in too many other places. We see how we are connected in this world as the air quality in the U.S. is affected by the fires in Canada. Science and bravery still matter.

https://www.pbs.org/wgbh/nova/article/herbert-needleman/

https://ehp.niehs.nih.gov/doi/10.1289/EHP2636


Putting Capacity Back into Capacity-Building

A sunflower plant grows with other plants and supports in a community garden

(I wrote this for the American Evaluation Association’s AEA365 site during a special Organizational Learning-Evaluation Capacity Building week in April 2023).

Hi, I am Gretchen Biesecker, Principal Consultant with Bee’s Knees Consulting LLC in Somerville, MA. A large part of my practice focuses on evaluation capacity-building with nonprofits small and large, including AmeriCorps programs across the U.S. AmeriCorps is a federal agency that “brings people together to tackle the country’s most pressing challenges through national service and volunteering.” Through a national network, AmeriCorps enrolls 200,000 Americans each year to meet critical needs in education, the environment, disaster services, and public health, among other areas.

Sometimes my capacity-building work can get pretty meta! For instance, my colleague Marc Bolan and I conducted a randomized controlled trial of an AmeriCorps program’s efforts to build evaluation capacity among organizations hosting their AmeriCorps members. The goal of the study was to measure the program’s impact on participants’ knowledge, attitudes, and confidence relating to evaluation, and their capacity to carry out evaluation and performance measurement activities.

More commonly, over the past 7 years, I’ve worked with AmeriCorps programs and their state commissions to build their knowledge, understanding, confidence, and use of evaluation and data. I conduct trainings in person and via Zoom and offer office hours or individual coaching sessions. I also lead cohorts of 4-6 programs who want to improve how they collect or use data, articulate research questions and ideas for evaluation studies, or better put results into action.

Lessons Learned

• Evaluators need to develop better tools and approaches to measure the outcomes of our capacity-building work across a range of programs and settings. Few tools exist, and those that do can be too long and full of jargon to work well among nonprofits.

• Too often, evaluation capacity-building focuses on deficits, rather than capacities and assets that already exist. I see this when I look at survey tools and assessments designed to measure capacity-building, when I review evaluation training materials, and when program staff share past experiences with me. I’ve seen assessments that yield scores based on the absence of capacities. Not only can this approach feel demoralizing to programs, but it flies in the face of the very name—capacity-building!

Instead, I find that most programs and their staff want to learn and improve in their work, and they believe in evaluation. Their lack of capacity is not rooted in a lack of interest—it’s stalled by a lack of time, positive experiences, resources, and funding. When we start by focusing on capacities that programs identify and want to work on, ask about their assets and build on them, and create dedicated time and space, we see success.

Hot Tips

• Appreciative inquiry, an approach that focuses on strengths, works well in evaluation capacity-building. Asking questions and focusing on tiny and bold steps that could lead to improvements creates excitement, confidence, and positive momentum.

• Building evaluation capacity takes time and money, or in other words… CAPACITIES! We need more foundations and funders to pay for this work. When evaluators take the time to deeply understand the programs they serve, and programs can get work done within the learning experience (e.g., writing an evaluation plan as part of a cohort experience), capacity grows.

• A combination of small group and 1:1 work with programs can be powerful. Individual, 1:1 time allows programs to ask specific questions and reflect on their unique challenges, strengths, and plans. Small group time helps peers learn and share ideas and realize that many evaluation-related challenges and ideas affect us all.