Data is everywhere—but without context, it can mislead more than inform. Watch this webinar, The Data Mirage: Why Purpose and Context Matter, where we explore how to use data intentionally to advance equity and support smarter decision-making.
We’ll cover:
- Why data without context is just noise
- How to focus on what truly matters in your data strategy
- Real-world examples of organizations using data to drive impact, not just activity
This session is ideal for nonprofit, philanthropic, and public sector leaders who want their data practices to reflect their values and goals.
Your Host

Michelle Haynes-Baratz, Ph.D.
Managing Associate
Community Science
Michelle is an organizational psychologist who brings two decades of experience researching, developing, and implementing evidence-based interventions to create more equitable and inclusive workplaces. She specializes in leveraging data, both quantitative and qualitative, to understand organizational ecosystems with a specific focus on transforming organizational culture and climate with equity at its core.
Webinar Video and Deck
1
00:00:00.000 –> 00:00:01.550
Michelle HaynesBaratz: Evening.
2
00:00:02.190 –> 00:00:17.920
Michelle HaynesBaratz: wherever you may be joining us from, and be it live or asynchronous, welcome. My name is Michelle Haynes-Baratz. I am a managing associate at Community Science, and I am so glad that you are here with me to talk about data, and why purpose and context matter.
3
00:00:18.050 –> 00:00:29.810
Michelle HaynesBaratz: So I know that the world is a busy and very swirling place. I don’t take lightly that you’ve chosen to spend some of your precious time with me, and my sincere hope is that it will feel like time well spent.
4
00:00:30.040 –> 00:00:56.349
Michelle HaynesBaratz: So this is the final session of our 3 part webinar series on strengthening strategy and impact during uncertain times. Hopefully, you’ve had a chance to tune in previously and hear from my amazing colleagues. We had Amber Trout, who kicked us off with a session on sustaining momentum when everything seems urgent: how do we keep going and working towards that long game? And then we heard from the wise Jasmine Williams Washington, who invited us to reimagine capacity building
5
00:00:56.350 –> 00:01:02.670
Michelle HaynesBaratz: as a holistic approach and means to achieve strategic alignment as we work towards lasting change.
6
00:01:02.670 –> 00:01:18.290
Michelle HaynesBaratz: So today, I’m going to round out the conversation by talking through the measurement process. That is, as we’re building, as we’re moving through, how can we measure what matters, so that we know what’s working, what’s not, and where we might need to pivot? So
7
00:01:18.390 –> 00:01:20.580
Michelle HaynesBaratz: some takeaways for today.
8
00:01:22.270 –> 00:01:48.920
Michelle HaynesBaratz: data is a tool. It is a means to an end. It is not an end in and of itself, which is important to remember. Takeaway number 2: there is no one way. There is no right answer in terms of what one should measure or how to measure it. Rather, context matters deeply. And last, but not least, your data strategy needs to match the moment and your horizon of change. And this really has to do with thinking about change over time.
9
00:01:49.080 –> 00:02:15.739
Michelle HaynesBaratz: So before we dive into our conversation about data, let me introduce you to Community Science. I am but one of many voices of our collective, and I’m very grateful to have my colleague, Kerlin Morales, who is joining us here today and who is also helping to make sure that this webinar runs smoothly, so as questions come in and so forth, she’ll be in the background making sure that I’m paying attention to those. So
10
00:02:16.140 –> 00:02:39.869
Michelle HaynesBaratz: at Community Science, we work with governments, we work with foundations, we work with nonprofit organizations on solutions to social problems. And we really do so with a systems change lens, always centering equity at the core of what we do. So this is actually a 3 part webinar series, as I mentioned previously, and it is being brought to you by the organizational effectiveness practice area at Community Science,
11
00:02:39.870 –> 00:02:47.349
Michelle HaynesBaratz: where we really partner with mission driven organizations to strengthen their capacity, their responsiveness, power, and impact.
12
00:02:47.350 –> 00:03:01.120
Michelle HaynesBaratz: And we really see organizations as living organisms themselves, and also as part of a larger ecosystem, and we believe deeply that organizations are always stronger when they center equity in what they do.
13
00:03:02.640 –> 00:03:27.420
Michelle HaynesBaratz: All right. So let’s go ahead and dive in and talk about why it is that we are here. So the landscape is shifting. We’re in a moment right now across philanthropy and social change movements where so many of us are trying to work differently. We’re trying to be more trust-based, more collaborative, more grounded in relationships and lived experience. And this shift has been sort of in the works for some time
14
00:03:27.420 –> 00:03:32.980
Michelle HaynesBaratz: now, but in some ways the push has really been accelerated recently
15
00:03:33.250 –> 00:04:00.320
Michelle HaynesBaratz: and frankly, when everything is on fire, as seems to be the case these days, it’s easy to get caught up in the overwhelm and the frenzy. But as a sector, as we mobilize away from transactional grantmaking, and from sort of the rigid logic of return on investment, especially as philanthropy is eager to meet the moment, if we’re leaning into nuance, if we’re leaning into flexibility and complexity,
16
00:04:00.430 –> 00:04:19.729
Michelle HaynesBaratz: how do we still stay grounded, right? How do we stay accountable to our purpose, to our communities, and to ourselves? And how do we know that we’re on the right path? That’s really what today’s conversation is about: how can we measure what really matters without getting lost in the weeds or being tethered to the wrong indicators?
17
00:04:21.790 –> 00:04:28.419
Michelle HaynesBaratz: So one key, I would say, is to remember that data
18
00:04:28.620 –> 00:04:35.070
Michelle HaynesBaratz: is a flashlight. The data is not the endpoint, it is a tool to guide you through the unknown.
19
00:04:35.480 –> 00:05:04.619
Michelle HaynesBaratz: So once upon a time in my former life I was an academic, and I taught research methods religiously for a hundred years, and I loved it, and no one ever wanted to teach the class. It was a required course. Students sort of loathed it, and frankly, it was because they were scared of it, you know, they’d say, Oh, but why do I need to learn this? And I’m not good at math. And I want to be a psychologist because I want to help people. You know, why do I need this?
20
00:05:05.890 –> 00:05:28.500
Michelle HaynesBaratz: But for me, I really love methods, and it’s not because I love math. It’s because it’s a powerful tool to help people better, right? It’s about coming up with clever ways to understand, to inform. So how can you help if you don’t know what’s wrong, or what’s needed, or what matters, or what’s working and what’s not, right? So
21
00:05:28.740 –> 00:05:46.400
Michelle HaynesBaratz: my love of teaching methods was really about sharing that insight. You know, I lived for those light bulb moments where you’d feel that resistance sort of melting into curiosity. Like, really, there’s nothing to be scared of. It’s just a tool. It’s a flashlight that helps guide you through the dark.
22
00:05:46.800 –> 00:06:14.190
Michelle HaynesBaratz: So that said, we need to learn to use the tool wisely. Okay? So you may have heard this expression: what gets measured gets managed. It’s a quote that’s been attributed to various folks, Peter Drucker, typically. And then we’re left wondering if that’s true. And often, if you go back digging, it looks like it was likely someone else who made the point. But the issue remains, which is,
23
00:06:14.730 –> 00:06:26.600
Michelle HaynesBaratz: once you really start measuring something, that sort of focuses attention, right? And so people pay a lot of attention to that metric and, de facto, not others,
24
00:06:26.680 –> 00:06:48.990
Michelle HaynesBaratz: you can miss really important stuff, right? And the lesson there is that you better make sure that if you are measuring, you’re measuring something that actually matters. And the problem is that we often can fall into some common traps. So let’s talk a little bit about what those traps are that can distract, that can freeze, or that can mislead us.
25
00:06:49.230 –> 00:07:02.179
Michelle HaynesBaratz: So the 1st one we measure what’s easy, right? We measure how many people went to a training or liked a training, but not whether that training actually mattered, or whether it changed behavior or made a difference.
26
00:07:02.677 –> 00:07:21.260
Michelle HaynesBaratz: Sometimes we measure too much, right? So this kitchen-sink approach: you know, one more question, one more, one more. And then eventually we’re drowning in data, and we don’t know what to prioritize or, frankly, what it tells us. And people can only hold on to so much information. And if everything is important,
27
00:07:21.870 –> 00:07:38.859
Michelle HaynesBaratz: then nothing’s important, right? And we see this one a lot, actually, this sort of data overwhelm, and it freezes folks. There’s so much data that by the time they make sense of what to do with it, it’s no longer even relevant, because the time has passed or the moment has moved on.
28
00:07:39.140 –> 00:07:42.280
Michelle HaynesBaratz: So that’s definitely one to pay attention to.
29
00:07:42.390 –> 00:08:03.709
Michelle HaynesBaratz: We lock ourselves into rigid measures, and so we kind of miss real change that’s happening in front of us. In fact, we stay so focused on this metric that we’re not open to paying attention to other indicators that may be signaling, hey, change is afoot, right? Something’s evolving in a slightly different way than maybe you had anticipated.
30
00:08:06.696 –> 00:08:31.310
Michelle HaynesBaratz: Another one is that we spend so much time reporting or asking for metrics that it drains the work. I’m sure folks can relate to this one, right? So you’re prepping for this meeting, you’re prepping for that meeting, you’re collecting data, you’re crunching data, and you’re doing it in such short periods of time that you don’t actually have time to do the work, right? So
31
00:08:31.820 –> 00:08:37.810
Michelle HaynesBaratz: the question becomes, what do we do, right? How do we avoid these sorts of pitfalls?
32
00:08:38.690 –> 00:08:54.749
Michelle HaynesBaratz: So here are some considerations and some offerings. I’m afraid that there’s, excuse me, no right answer. There’s no sort of magic wand, but it’s important to keep a few things in mind. The 1st is that context matters
33
00:08:55.150 –> 00:09:01.490
Michelle HaynesBaratz: deeply. In the tundra, you know, a good winter coat is essential.
34
00:09:02.000 –> 00:09:14.679
Michelle HaynesBaratz: If you’re in the tropics, that is just not at the top of the shopping list, right? So even though it would be great to always say X plus Y equals Z, that’s the way you do it, that’s not the way it works. Context really, really matters, right?
35
00:09:14.720 –> 00:09:37.110
Michelle HaynesBaratz: And then the second consideration is that sometimes we need to slow down to speed up. And this is, I think, really hard. I catch myself falling into this trap sometimes, where everything feels urgent and swirling, you know, and we can lose our focus in that frenzy. And when I personally feel that swirling, I have come to train myself:
36
00:09:37.443 –> 00:09:49.100
Michelle HaynesBaratz: it’s a signal. It’s a data point, if you will, to breathe, to slow down, and to move with intention and purpose, right? Because ultimately that’s what prevents those missteps in the long run.
37
00:09:49.560 –> 00:09:52.549
Michelle HaynesBaratz: So with those 3 considerations in mind,
38
00:09:52.660 –> 00:10:00.410
Michelle HaynesBaratz: here are 3 questions that hopefully get you a little bit closer to using data effectively.
39
00:10:00.470 –> 00:10:15.059
Michelle HaynesBaratz: So the 1st one is: are we clear on what we’re really trying to understand? Right? Really thinking through what you need to know. So we worked with this one client that was clearly committed to using data.
40
00:10:15.120 –> 00:10:35.020
Michelle HaynesBaratz: But the data collection process had sort of taken on a life of its own over time. The process had just multiplied, it had cauliflowered, and there was this very long survey, and while it was interesting, there was no clear why. So it wasn’t clear,
41
00:10:35.220 –> 00:10:47.599
Michelle HaynesBaratz: that is to say, it wasn’t clear why knowing X about a particular set of people would change anything in terms of how the client was approaching the work. And so ultimately we ended up partnering with them
42
00:10:47.840 –> 00:11:03.030
Michelle HaynesBaratz: to refine that data collection strategy and to make sure that the data they gathered was truly meaningful and could generate insights that would actually drive decision making, insights that would change, if needed, the way this client was approaching the issue.
43
00:11:03.830 –> 00:11:20.010
Michelle HaynesBaratz: Question 2: what would progress actually look like? What would it feel like? So in another example, we partnered with a foundation that was working to transform the early childhood education landscape at a systems level,
44
00:11:20.010 –> 00:11:35.530
Michelle HaynesBaratz: and from the beginning we worked with them to really co-create and operationalize their vision, and we did so with both quantitative and qualitative indicators of progress, as well as cross-referencing and cross-checking that with the grantees and so forth in the space.
45
00:11:36.430 –> 00:12:00.149
Michelle HaynesBaratz: But another key to their success was their openness and their curiosity to really understand what progress looked like, and also using that data that we were collecting in real time to make tweaks, to make decisions, for example, to provide technical assistance or capacity building to grantees as needed, so that the progress could continue to be realized.
46
00:12:01.540 –> 00:12:04.060
Michelle HaynesBaratz: Last, but certainly not least,
47
00:12:04.110 –> 00:12:19.150
Michelle HaynesBaratz: who defines what matters, right? And are we actually listening? And so in our work with another foundation, we’re seeking to understand best practices for inclusive decision making. There’s definitely a literature that speaks to this,
48
00:12:19.150 –> 00:12:43.710
Michelle HaynesBaratz: and we’re working and learning from that literature. But we are also working and learning from those who are most proximate to the issues in their particular context. And context is always important, but in this particular project it’s really nuanced and critical to make sure that the voices of those who are typically divested or marginalized from this decision-making process
49
00:12:43.750 –> 00:12:51.420
Michelle HaynesBaratz: get to define their version, right, of what meaningful inclusion looks like when we’re doing this measurement. So
50
00:12:51.780 –> 00:13:01.300
Michelle HaynesBaratz: those 3 questions are hopefully useful tools to think through how we can use data effectively. Now, something.
51
00:13:01.300 –> 00:13:09.549
Kerlin Morales | Community Science: Before you keep going, Michelle, we actually have a hand raised. Jeff, I’m gonna allow you to talk to ask your question.
52
00:13:09.550 –> 00:13:10.190
Michelle HaynesBaratz: Okay.
53
00:13:13.690 –> 00:13:16.539
Jeff Gilbert: I misclicked. I am so sorry. I did not mean to have
54
00:13:16.540 –> 00:13:16.920
Kerlin Morales | Community Science: Oh!
55
00:13:16.920 –> 00:13:19.270
Jeff Gilbert: my hand raised. Sorry.
56
00:13:19.660 –> 00:13:20.200
Kerlin Morales | Community Science: I’m sorry.
57
00:13:20.200 –> 00:13:27.719
Michelle HaynesBaratz: Yes, I overstand. I misclick things all the time, but thank you so much for letting us know. All right.
58
00:13:27.980 –> 00:13:28.940
Michelle HaynesBaratz: So
59
00:13:32.050 –> 00:13:44.629
Michelle HaynesBaratz: All right, the 3 horizons. So something to keep in mind with the answers to those questions that we just reviewed is that it really kind of depends where you are in the process of change.
60
00:13:44.630 –> 00:14:08.090
Michelle HaynesBaratz: And so if you have been following this series, you’ve seen this slide before: the 3 horizons of change model. So the idea is that you have the 1st horizon, which is immediate. It’s right now, it’s what’s in front of you, and it’s sort of what’s urgent, right? And then you have the 3rd horizon, which is that long, long-term vision, right? It’s basically
61
00:14:08.480 –> 00:14:16.130
Michelle HaynesBaratz: it’s why you’re doing the work. It’s the impact you want to have, and it’s the why. And then you have the second horizon, which
62
00:14:16.390 –> 00:14:26.969
Michelle HaynesBaratz: is often the hardest to sort of think through, right? It’s the bridge. It’s how you get there: how do you get from where you are now to that long-term goal, to where you want to be? It’s the how,
63
00:14:27.140 –> 00:14:41.760
Michelle HaynesBaratz: and from a measurement perspective, you want to make sure that you’re aligning the measurement with where you are in the horizon, right? And oftentimes the best way to do that is to backwards map, right? So you start with that 3rd horizon:
64
00:14:41.800 –> 00:14:57.220
Michelle HaynesBaratz: what do you want to build? What are you working towards? And then think backwards: what are the signals that would suggest that you’re on your way? What would help you course correct if you needed to? So there’s the now, what’s emerging, really early signs.
65
00:14:57.330 –> 00:15:04.759
Michelle HaynesBaratz: Horizon 2 is sort of soon, it’s all relative, but in the middle: what are the patterns? What are the relationships? What are the culture shifts?
66
00:15:05.080 –> 00:15:16.909
Michelle HaynesBaratz: And then, last, you’re working towards that 3rd horizon, which is deep change: typically policy change, institutional change, systemic change as well.
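[Editor's note: the backwards-mapping approach described above can be sketched as a small data structure, starting from the long-term vision and walking back to today's signals. This is a hypothetical illustration only; the horizon labels and indicator examples below are invented, not drawn from the webinar.]

```python
# Hypothetical sketch: backwards-map indicators from the long-term
# vision (Horizon 3) back to today's early signals (Horizon 1).
# All indicator names are invented for illustration.
horizons = [
    ("Horizon 3 (deep change)", ["policy adoption", "institutional change"]),
    ("Horizon 2 (the bridge)", ["new patterns", "relationships", "culture shifts"]),
    ("Horizon 1 (emerging now)", ["early signals", "participation", "buy-in"]),
]

def backwards_map(horizons):
    """Return a measurement plan ordered from the end state back to today."""
    return [f"{name} -> watch for: {', '.join(signals)}"
            for name, signals in horizons]

for step in backwards_map(horizons):
    print(step)
```

Listing the horizons end-first mirrors the "start with the 3rd horizon, then think backwards" move: the plan is read top to bottom, from destination to first signals.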
67
00:15:17.060 –> 00:15:18.010
Michelle HaynesBaratz: so
68
00:15:19.790 –> 00:15:32.039
Michelle HaynesBaratz: when we’re thinking about those 3 horizons, we’re thinking, okay, these are the questions, this is the framework that I’m applying, and how do I apply that in this context of thinking through time?
69
00:15:32.320 –> 00:15:51.880
Michelle HaynesBaratz: Some other things really to keep in mind: change is not linear, and it doesn’t happen at a constant rate of acceleration, right? So folks might typically think of this kind of line graph when they think of increase over time. But rarely is that how change happens, right? So think of the notion of things like tipping points.
70
00:15:51.880 –> 00:16:18.729
Michelle HaynesBaratz: A lot has to happen, oftentimes, to tip the scale, which frankly looks a whole lot more like an S curve, right? So slow, slow, slow, and then eventually you get a tip, and then you get a plateau. Or sometimes we see exponential growth as well, right? So the idea is that we have to be really nimble and thoughtful in the process of figuring out what to measure, because we don’t know before it happens what the shape is going to look like.
71
00:16:18.730 –> 00:16:25.400
Michelle HaynesBaratz: And it really depends on place and time what you’re going to see. And if you go looking for linear, though,
72
00:16:25.630 –> 00:16:46.889
Michelle HaynesBaratz: chances are you’re not going to find it. I mean, it’s not de facto, but chances are you won’t. And so tomorrow is different from 6 months from now, which is different from 10 years from now. And I’d actually like to share what I think is a really nice, real-world example. This was a data visualization put together by Tribou and Collins. I may have said that name incorrectly, in which
73
00:16:46.890 –> 00:16:55.720
Michelle HaynesBaratz: case, I apologize if I did, but it’s from Bloomberg Business. And they created these data visualizations that look at the rate of change
74
00:16:55.720 –> 00:16:58.930
Michelle HaynesBaratz: for social policy issues in the U.S.
75
00:16:58.960 –> 00:17:13.369
Michelle HaynesBaratz: And what I love about this graph: so for the various issues, you have interracial marriage, prohibition, women’s suffrage, etc., etc. What I love about the graph is that you see a few things really clearly.
76
00:17:13.650 –> 00:17:20.239
Michelle HaynesBaratz: So first, you see that, depending on the issue,
77
00:17:20.319 –> 00:17:48.679
Michelle HaynesBaratz: the rate of change looks very different, right? So clearly context matters; change doesn’t always look exactly the same. You also see some nice examples of S curves, and some nice examples of exponential stuff. You don’t see too much linear going on. But I think it really is a nice underscoring of this idea that with change, it depends, right? And so some things even go up,
78
00:17:48.680 –> 00:18:03.450
Michelle HaynesBaratz: or rather go down before they go up, right? Things get worse before they improve, in terms of thinking through your metrics. So again, all in the context of really being open to what the pattern of change is and how it’s going to unfold.
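[Editor's note: the shapes discussed here, steady linear change, the S curve with its tipping point and plateau, and compounding exponential growth, can be sketched numerically. This is a minimal illustration; all parameter values (rates, ceiling, midpoint) are arbitrary choices for the sketch, not figures from the webinar.]

```python
import math

def linear(t, rate=5.0):
    """Steady change: the same increment every period."""
    return rate * t

def logistic(t, ceiling=100.0, rate=0.5, midpoint=10.0):
    """S-curve: slow start, a tipping point near the midpoint, then a plateau."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

def exponential(t, base=1.0, rate=0.25):
    """Compounding change: each period builds on the last."""
    return base * math.exp(rate * t)

# Sample a 20-period horizon: the same end point can read very
# differently depending on the shape of change along the way.
for t in (0, 5, 10, 15, 20):
    print(f"t={t:>2}  linear={linear(t):6.1f}  "
          f"s_curve={logistic(t):6.1f}  exponential={exponential(t):7.1f}")
```

The point of the sketch matches the talk: if you measure an S-curve process early (small t), progress looks near zero even when the groundwork is on track, which is exactly why "go looking for linear" misleads.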
79
00:18:04.270 –> 00:18:07.189
Michelle HaynesBaratz: So, something else to keep in mind:
80
00:18:09.260 –> 00:18:23.469
Michelle HaynesBaratz: there are subsystems that are also changing at different rates, right? So organizations are not monolithic. There’s sort of the rate of change of the whole organization, but then, within an organization,
81
00:18:23.610 –> 00:18:53.059
Michelle HaynesBaratz: there are these subunits, right, these subsystems. So, for example, you have individual departments, you have different organizational functions, and they also have their own rate of change, right? These nested systems. And so you may have one department that’s going 20 miles an hour, another that’s going 45 miles an hour, and one department that cannot turn on the engine, right? So keep in mind that you have the whole unit, but even within the system that you’re studying or working with, if it’s an organization,
82
00:18:53.060 –> 00:18:59.090
Michelle HaynesBaratz: there are other things going on below the surface that are potentially moving at different rates.
83
00:19:00.100 –> 00:19:08.649
Michelle HaynesBaratz: And last, but definitely not least, is this: we like to use the expression that the runway is long.
84
00:19:08.940 –> 00:19:23.209
Michelle HaynesBaratz: It’s really a truism that we see when we’re working with a lot of clients. So we get in, and we realize that that on-ramp, those building blocks, they take a long time to set up, to lay the groundwork.
85
00:19:23.430 –> 00:19:35.889
Michelle HaynesBaratz: And for anyone on the front lines who’s actually doing this work, you all probably know this pretty well, right? And the key is that your measurement strategy has to be attuned to this.
86
00:19:35.980 –> 00:20:02.740
Michelle HaynesBaratz: So, another once-upon-a-time story: I was part of this multi-year organizational change initiative, and for the 1st year we were laying groundwork. It took a year, a year, to meet with all the different stakeholders, with all the different groups, to discuss issues from their perspective, to get buy-in, to do all the things, right? And some of it kind of felt invisible, right? If you were measuring our end,
87
00:20:02.740 –> 00:20:10.050
Michelle HaynesBaratz: our end goal, our end KPI, if you will, which was culture change, which is what we were working towards, there’s no way it would have shown up.
88
00:20:10.110 –> 00:20:13.959
Michelle HaynesBaratz: But that relationship and awareness building
89
00:20:14.030 –> 00:20:25.810
Michelle HaynesBaratz: was critical. We would have had a huge organizational initiative that would never have taken off, because it would have been tone-deaf, or it just would have been launched in a vacuum. So that whole year
90
00:20:25.940 –> 00:20:52.229
Michelle HaynesBaratz: was really crucial to building the bridges that we needed to launch phase 2 and phase 3, which eventually did lead us to those KPIs, which were what we were working towards. But again, the point of the story is to be realistic, that change takes time, and that the rate of change really is important in thinking through our various measurement strategies.
91
00:20:52.410 –> 00:20:53.270
Michelle HaynesBaratz: So
92
00:20:54.550 –> 00:21:24.239
Michelle HaynesBaratz: as we round out the conversation today, so this is sort of the last content slide, if you will, I want us to consider one more quandary, and I imagine this is a situation that will resonate with many of you. It’s what we in the OE team have been calling the head-heart decisions, right? You want to do it all, you have limited resources. I think many of us can probably relate to that. And so then the question
93
00:21:24.550 –> 00:21:31.209
Michelle HaynesBaratz: from a measurement perspective becomes, how do you use data to make those really hard decisions?
94
00:21:31.300 –> 00:22:00.490
Michelle HaynesBaratz: And so we recently worked with a client where that was precisely the question, right? This organization was trying to figure out how to maximize the impact of a pretty limited resource for the most needy in their community. And historically, in this particular context, this had translated over time into spreading the funding more broadly, so that everybody got something,
95
00:22:00.690 –> 00:22:07.929
Michelle HaynesBaratz: but as the number of requests grew, the amount that any one person, or any one org or initiative, might get
96
00:22:08.280 –> 00:22:09.370
Michelle HaynesBaratz: decreased.
97
00:22:09.600 –> 00:22:30.519
Michelle HaynesBaratz: So when we started to engage with multiple stakeholders in the ecosystem, we really started to look at the data holistically. And what we found was that this very well-intentioned approach was really diluting impact rather than driving change. And so in the end, what we did was really focus on
98
00:22:30.790 –> 00:22:48.880
Michelle HaynesBaratz: defining what was essential in their particular context, right? And once we were able to figure out what was important for this group of folks, we really helped to reimagine what meaningful, equity-driven investment might look like. So instead of spreading those resources thinly,
99
00:22:49.240 –> 00:22:58.990
Michelle HaynesBaratz: they’re now considering some recommendations for making more strategic, higher-impact funding decisions that serve the most underserved communities.
100
00:22:59.100 –> 00:23:10.869
Michelle HaynesBaratz: And so as we wrap up, thinking that through is really the goal, right? When I started, I said one of the key takeaways is that data is a tool.
101
00:23:10.870 –> 00:23:30.160
Michelle HaynesBaratz: That’s exactly right. So the question becomes, how can we use data in a thoughtful way, not just collecting data for data’s sake, but rather using it as a tool, so that when we are in the dark and we don’t know how to proceed, we can use our flashlight, right? And thinking through, again, that
102
00:23:30.200 –> 00:23:41.130
Michelle HaynesBaratz: how I use the tool is going to depend on where I am in the process, you know: what do I need to know now? What do I think is happening next? And what do we hope to see?
103
00:23:41.570 –> 00:23:50.430
Michelle HaynesBaratz: And so with that I am happy to take any questions that folks may have, and I thank you for your time.
104
00:24:03.590 –> 00:24:14.229
Kerlin Morales | Community Science: You know, as people are thinking of their questions, I know that there were some submitted earlier. So if we want to get started with one of those,
105
00:24:14.680 –> 00:24:22.279
Kerlin Morales | Community Science: as people are thinking, the 1st one would be: what are some useful practices to account
106
00:24:22.560 –> 00:24:29.150
Kerlin Morales | Community Science: for personal bias when contextualizing data in the analysis and reporting stage?
107
00:24:29.620 –> 00:24:52.279
Michelle HaynesBaratz: Yeah, it’s a really good question, because we all have them, right? Personal bias is a very real thing. I think that being aware of your biases is really critical, clearly, and then also, to the extent possible, when you’re engaging in these measurement strategies, you want to be working very much in community with
108
00:24:52.370 –> 00:25:07.180
Michelle HaynesBaratz: other stakeholders, to make sure that you are collecting data in a way that is useful not only in terms of how you are seeing it, but also in the way that they too are seeing it. So really using a participatory approach,
109
00:25:07.210 –> 00:25:32.470
Michelle HaynesBaratz: and doing that from the beginning as much as possible. So that means involving folks from the beginning in data collection: when you’re thinking about what to measure, when you’re thinking through how to collect that data, what are the best strategies for collecting it, and then also for making sense of the data as well. So I’ll quote
110
00:25:33.177 –> 00:25:35.090
Michelle HaynesBaratz: my colleague, Jasmine.
111
00:25:35.430 –> 00:25:37.209
Michelle HaynesBaratz: She likes to say:
112
00:25:37.710 –> 00:25:53.090
Michelle HaynesBaratz: we don’t want to be right, we want to get it right. And one of the ways that we can make sure that we are getting it right is to check our assumptions. And so oftentimes we go back to folks that we have worked with and say, hey, here’s what we heard,
113
00:25:53.480 –> 00:26:00.240
Michelle HaynesBaratz: is this resonating with you? To make sure that, in fact, we have gotten it right. So hopefully
114
00:26:00.430 –> 00:26:03.250
Michelle HaynesBaratz: that provides a few good tips.
115
00:26:06.620 –> 00:26:15.580
Kerlin Morales | Community Science: Yes, and we actually have a hand up from one of our attendees. Sharon, if you would like to share your question.
116
00:26:18.033 –> 00:26:34.590
Sharon Duncan Jones-Eversley: Great. 1st of all, thank you so much. I really appreciate it. It was just very refreshing and definitely needed. One of the things that I really love is working with communities, particularly as it relates to people’s lived experiences.
117
00:26:34.990 –> 00:26:43.879
Sharon Duncan Jones-Eversley: And I guess maybe about 20, if not 35, years ago, Robert Wood Johnson did, or had, a series really helping community
118
00:26:44.360 –> 00:27:05.681
Sharon Duncan Jones-Eversley: investors, as I call them, people who live in the community, to really kind of understand research. And then, you know, because of funding interest, it kind of went away. But is there any suggested user-friendly software that you have found that
119
00:27:06.300 –> 00:27:12.272
Sharon Duncan Jones-Eversley: works well with communities with diverse strengths as well as challenges?
120
00:27:12.910 –> 00:27:34.010
Sharon Duncan Jones-Eversley: Because I really think, along the way, I feel like sometimes our research is like a foreign language, and once we leave, how are they continuing to interpret or build upon some of that information? So thank you so much. I guess you all will mute me so I can get the response.
121
00:27:34.010 –> 00:27:39.628
Michelle HaynesBaratz: Okay. Thank you for that question, Sharon. I really appreciate it. It’s a great question.
122
00:27:40.550 –> 00:28:02.749
Michelle HaynesBaratz: So I agree with you wholeheartedly, right? When we’re thinking through centering the experience of our communities, we want to make sure that it’s not just in an extractive way, right? That we are providing resources so that folks can take ownership of their own
123
00:28:03.124 –> 00:28:20.355
Michelle HaynesBaratz: data and knowledge and continue to build upon that. You asked about software. I don’t know that I have a software platform recommendation per se, but I do think that there are some tools, and actually we have one
124
00:28:20.950 –> 00:28:35.937
Michelle HaynesBaratz: that talks through what a logic model is, covers a little bit of the basics of what research can look like, and breaks it down. It doesn’t have to be technical, right? I think that
125
00:28:37.620 –> 00:29:01.440
Michelle HaynesBaratz: when folks get beyond the jargon, it’s something that everyone can resonate with, right? It’s just collecting information to make better decisions. So we could certainly include that. At the tail end of this three-part series we’ll send an FAQ, and we can send that link along. No problem. Of course.
126
00:29:02.210 –> 00:29:06.040
Sharon Duncan Jones-Eversley: That was very helpful, because I think that is it.
127
00:29:06.330 –> 00:29:20.810
Sharon Duncan Jones-Eversley: It’s just like when we have to become familiar with their language and how they operationalize terms; that exchange should be bidirectional and not from a hierarchical perspective. So thank you so much. I really appreciate it.
128
00:29:20.810 –> 00:29:24.389
Michelle HaynesBaratz: Of course, thanks for taking the time to join us today. I appreciate you.
129
00:29:30.880 –> 00:29:42.539
Kerlin Morales | Community Science: Next question, drilling into the how: what approaches have you used that help teams develop indicators that really matter?
130
00:29:43.440 –> 00:29:54.129
Michelle HaynesBaratz: Oh, okay, I have now just managed to find the question panel so I can read it again. Thank you. All right, let me take this one: what are the approaches you
131
00:29:54.170 –> 00:30:15.059
Michelle HaynesBaratz: use to help teams develop indicators that matter? Yeah, it’s a great question. So, drilling into the how: the how is, oftentimes, we don’t do it alone. We really do take the time to work with our clients, and we sometimes use community advisory boards, and so forth, to
132
00:30:15.250 –> 00:30:20.660
Michelle HaynesBaratz: yes, use what we know, but also to make sure that we are working with
133
00:30:20.720 –> 00:30:49.609
Michelle HaynesBaratz: the folks we are trying to serve. And so, having a conversation in terms of thinking about not what do I think change would look like in your community, but what do you think change would look like in your community? What do you think it would feel like? So it’s really about that participatory co-creation process. And you can collect that data in multiple ways. It can be staged, right, in stages. It can be
134
00:30:49.900 –> 00:31:13.159
Michelle HaynesBaratz: through a survey. It can be through focus groups. One of the things I think is important is that data collection is not often a one-and-done, right? We take the time to collect some data, have it feed into our process in some way, and then go back and see if we need a little bit more, maybe a round two, refine, and so forth. And so
135
00:31:13.910 –> 00:31:23.949
Michelle HaynesBaratz: I think that is one possibility. And then, also, you don’t always have to reinvent the wheel, right? So I do think that there is merit and value in looking at
136
00:31:24.130 –> 00:31:35.690
Michelle HaynesBaratz: previous approaches, what indicators have potentially worked well, and then building upon that, right? Using that sort of expansive mindset rather than always being sort of
137
00:31:35.810 –> 00:31:38.160
Michelle HaynesBaratz: insular and closed.
138
00:31:44.090 –> 00:31:50.100
Kerlin Morales | Community Science: And I don’t know if you would like to go to that third question, which drills more into:
139
00:31:50.230 –> 00:31:59.780
Kerlin Morales | Community Science: Do you have in mind any specific metrics for those who are supporting lending and technical assistance to both small businesses and real estate developers?
140
00:32:00.360 –> 00:32:07.679
Michelle HaynesBaratz: Listen. Oh, I’m not seeing the real estate developer question. Oh, recommended data metrics for this.
141
00:32:12.760 –> 00:32:19.730
Michelle HaynesBaratz: I don’t know that I have a commonly recommended metric for that specific context.
142
00:32:23.680 –> 00:32:34.789
Michelle HaynesBaratz: I would have to think about that, and then perhaps we can add it to the FAQ after we have a moment to percolate on it. But nothing comes to mind off the top of my head.
143
00:32:36.862 –> 00:33:03.279
Michelle HaynesBaratz: I see another one here, now that I’ve actually found the box: can you talk a little bit about navigating between quantitative and qualitative data? We all know that quant doesn’t capture the nuances of what is actually happening, but qualitative data is hard to quantify and to use to show overall impact, and when we do that it flattens out the nuances. What to do?
144
00:33:03.300 –> 00:33:11.671
Michelle HaynesBaratz: Great question. I think it’s a both-and, to be quite honest. I truly believe that there is
145
00:33:12.460 –> 00:33:18.930
Michelle HaynesBaratz: merit and value in both. And so if there is space and capacity to have those
146
00:33:19.230 –> 00:33:31.410
Michelle HaynesBaratz: quant indicators with the contextualization of the qualitative, I think that that’s the best of both worlds. I also think that depending on who you’re speaking to.
147
00:33:31.660 –> 00:33:32.670
Michelle HaynesBaratz: and
148
00:33:33.760 –> 00:33:44.749
Michelle HaynesBaratz: the data, certain data is more compelling to certain kinds of folks, right? And so, to the extent that that’s also true, I think that there’s value in
149
00:33:44.910 –> 00:33:51.380
Michelle HaynesBaratz: trying to get both and seeing
150
00:33:52.500 –> 00:34:11.730
Michelle HaynesBaratz: how they can play together, right? It’s the both-and instead of the either-or. So that would be my recommendation, and that is typically our approach. I have not done something solely quantitative in quite a long time, and I was trained
151
00:34:11.840 –> 00:34:19.869
Michelle HaynesBaratz: very much as a quantitative person. But I see tremendous value in making sure that we have the context, right? So,
152
00:34:20.705 –> 00:34:21.429
Michelle HaynesBaratz: both.
153
00:34:32.760 –> 00:34:42.800
Michelle HaynesBaratz: Okay, as a founder of an intermediary, what are the best approaches to balancing...
154
00:34:45.830 –> 00:35:08.280
Michelle HaynesBaratz: So I don’t know if other folks are able to see the question. No, they’re not. Okay, so another question: as a funder of, forgive me, a funder of intermediary funders and technical assistance providers, what are the best approaches to balancing how to ask for the right data from the intermediary about outputs and outcomes from both
155
00:35:08.400 –> 00:35:12.749
Michelle HaynesBaratz: the intermediary and the end beneficiary?
156
00:35:13.510 –> 00:35:21.010
Michelle HaynesBaratz: Well, best approaches for balancing. I mean, I do think that there’s something to be said for really asking:
157
00:35:22.730 –> 00:35:30.499
Michelle HaynesBaratz: What do you need? What do you need to know, right? So I do think that there is something to be said for
158
00:35:31.200 –> 00:35:37.443
Michelle HaynesBaratz: have-to-know versus nice-to-know. And again, I do think that that’s a function of
159
00:35:40.160 –> 00:36:09.909
Michelle HaynesBaratz: I think that’s a function of what you’re trying to accomplish. And so if you are coming from a capacity-building perspective, oftentimes what we want to know is: is it working, right? So what are the one or two or three things that you can ask to figure out whether or not your approach is working, and/or whether there need to be tweaks? I do think that there really is something beneficial to reimagining data
160
00:36:09.980 –> 00:36:39.130
Michelle HaynesBaratz: as something that’s feeding into a continuous decisionmaking process rather than just this static thing that we use. And so if you reframe it in that light, hopefully that can help narrow the problem space. Because what you also don’t want to do is fall into the trap of asking for more and more and more until it creates an issue where we can’t even make space to get the work done in the first place. So
161
00:36:49.070 –> 00:36:58.220
Michelle HaynesBaratz: And then I think that there was another one that came in beforehand. I want to be able to spread
162
00:36:58.805 –> 00:37:05.670
Michelle HaynesBaratz: a little bit of attention. Here we had someone who had asked about
163
00:37:08.810 –> 00:37:26.419
Michelle HaynesBaratz: how to make competing entities play nicely in the sandbox when it comes to mapping data elements and sharing definitions. And I think that it’s a really excellent question. I think we are now in a moment of collective.
164
00:37:27.330 –> 00:37:46.267
Michelle HaynesBaratz: you know, collective movement, all hopefully trying to work towards the same thing. And so then the question becomes: how can we reframe our work? Rather than this sort of zero-sum, competitive game, how can we reframe it as working towards the collective
165
00:37:46.850 –> 00:38:11.480
Michelle HaynesBaratz: good. And there is certainly plenty of work to be done to get us there. So I think really recasting and reframing that collective goal, what is it that we are all trying to work towards, is a useful first step. And I think that hoarding
166
00:38:11.560 –> 00:38:21.819
Michelle HaynesBaratz: data or hoarding information really doesn’t do anyone any good in the long term. And so thinking through ways of
167
00:38:22.520 –> 00:38:35.829
Michelle HaynesBaratz: recasting, and also potentially carving out space so that folks have their pieces of the collective pie, if you will, is also another thing to think through.
168
00:38:36.030 –> 00:38:36.740
Michelle HaynesBaratz: So
169
00:38:50.050 –> 00:39:10.660
Michelle HaynesBaratz: there is another question asking how to track racial wealth gaps. So, knowing it’s difficult to track how racial wealth gaps are measured, what are ways to track proxy data, and what publicly available resources are out there to try to track wealth gaps
170
00:39:11.093 –> 00:39:25.006
Michelle HaynesBaratz: in communities? Off the top of my head, I have some databases that I’ve worked with, but I don’t want to give incorrect information. That’s something we can also make sure that we
171
00:39:25.460 –> 00:39:33.220
Michelle HaynesBaratz: attend to in the context of the FAQ when we do our follow-up post-webinar.
172
00:39:49.310 –> 00:39:56.627
Michelle HaynesBaratz: All right, folks, I will take this, then, as
173
00:39:59.600 –> 00:40:04.233
Michelle HaynesBaratz: as our opportunity to say thank you. Thank you so much for
174
00:40:04.760 –> 00:40:15.562
Michelle HaynesBaratz: joining today. If you had an opportunity to join my colleagues, that’s wonderful. And if you haven’t, I know that there are recordings, so you can listen to those wise folks
175
00:40:15.900 –> 00:40:39.269
Michelle HaynesBaratz: talk about their perspectives as well. And by all means, if you’re anything like me, you hang up or get off a call and think, oh, I had that one extra question. Please feel free to reach out; happy to hear from you: M. Haynes-Baratz, at communityscience.com. We will follow up with a recording and some FAQs as well, post this webinar.
176
00:40:39.320 –> 00:40:42.320
Michelle HaynesBaratz: and with that I say, thank you very much.
177
00:40:44.490 –> 00:40:48.250
Michelle HaynesBaratz: Take good care, everyone. Thank you. Bye, bye.
By: Organizational Effectiveness Practice Area at Community Science
Uncertainty is the new normal. Whether due to shifting political landscapes, funding cycles, or burnout, many mission-driven organizations are struggling to maintain clarity and momentum. This summer, we hosted a three-part webinar series titled Maintaining Strategy Momentum in Uncertain Times. Amber Trout kicked us off with a session on sustaining momentum — when everything seems urgent, how do we keep going and work toward the long game? We then heard from Jasmine Williams-Washington, who invited us to reimagine capacity building as a holistic approach and means to achieve strategic alignment as we work toward lasting change. Michelle Haynes-Baratz rounded out the conversation by talking through the measurement process — how can we measure what matters so that we know what’s working, what’s not, and where we might need to pivot?
Below, we respond to your most pressing questions and share relevant takeaways and practical strategies that funders, nonprofits, and ecosystem partners can implement right now.
- How do we integrate goal tracking into our daily work?
Many organizations set ambitious goals, but those goals often sit untouched until the year-end report. The key is making goals a living part of your organization’s culture and routines. That means building them into decisionmaking. Specifically:
- Define your unique value. Ask: What can only we do in our ecosystem, and who are we accountable to? Centering both your role and responsibility makes goal setting relevant and grounded in purpose.
- Use three horizon thinking. Break down goals into short-term actions (H1), strategic tests (H2), and long-term vision (H3).
- Assign clear roles for each goal — someone to track progress, troubleshoot, and keep it visible.
- Create check-in rhythms (weekly, monthly) where teams reflect on what’s working, what’s not, and what’s changed.
- Keep it visible. Use a whiteboard, dashboard, or team tracker that everyone can access and update.
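As one way to picture these routines, here is a small, hypothetical sketch (not a Community Science tool) of a goal tracker that tags each goal with a horizon and an owner and records check-in reflections. The `Goal` class, the goal names, and the owners are all invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Goal:
    """One organizational goal, tagged by horizon and owner."""
    name: str
    horizon: str          # "H1" short-term action, "H2" strategic test, "H3" long-term vision
    owner: str            # who tracks progress, troubleshoots, and keeps it visible
    notes: list = field(default_factory=list)

    def check_in(self, working: str, not_working: str, changed: str) -> None:
        # Record one weekly/monthly reflection: what's working, what's not, what's changed.
        self.notes.append({"working": working, "not_working": not_working, "changed": changed})

goals = [
    Goal("Publish monthly community dashboard", "H1", "Dana"),
    Goal("Pilot participatory survey design", "H2", "Luis"),
]

goals[0].check_in(
    working="Dashboard shipped on time",
    not_working="Partner data arrives late",
    changed="New funder reporting template",
)

# A simple "keep it visible" view: goals grouped by horizon with check-in counts.
for g in sorted(goals, key=lambda g: g.horizon):
    print(f"[{g.horizon}] {g.name} (owner: {g.owner}, check-ins: {len(g.notes)})")
```

The point is less the code than the shape: every goal has a horizon, a named owner, and a check-in history, so nothing sits untouched until the year-end report.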
- Where do you start with the data you are collecting outside of reporting?
Start by asking what decisions do we wish we could make more confidently? That helps reframe data from a compliance obligation to a strategic asset. Many organizations collect valuable information but don’t always connect it back to decisionmaking, learning, or impact. Here are some practical starting points:
- Revisit the original intent. Look at the data you’re collecting and ask why it was gathered in the first place. Was it to inform program adjustments? Track client outcomes? Understand equity impacts? Also ask: Whose voices are missing from this story? What is not being measured that matters? Clarifying the purpose helps you focus on the right data. And if it’s not the right data, stop collecting it.
- Look for patterns over time. Even simple trend analysis (like changes in participation, engagement, or outcomes) can spark useful insights. Instead of chasing statistical perfection, ask: What’s increasing, decreasing, or stalling — and why might that be happening?
- Prioritize meaning over volume. You don’t need all the data — you need the right data. Focus on what’s most meaningful to the decisions you’re trying to make. Often, a few strong indicators are more useful than dozens of disconnected metrics.
- Bring data into conversations. Facilitate regular discussions where staff or partners review data and reflect together. It can be framed as sensemaking, rather than evaluation. Ask: What resonates? What is most surprising? What else do we wish we knew?
- Pair numbers with stories. If you’re looking to use data beyond reporting, integrate qualitative insights from community members, staff, or partners. Their perspectives add depth and help turn numbers into action.
The shift begins when you see data as a tool for reflection, learning, and alignment — not just funder reporting. Start with the questions that matter most to your mission, then pull the data that helps illuminate them.
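As a sketch of what "looking for patterns over time" can mean in practice, here is a minimal example with made-up quarterly participation counts. The numbers and quarter labels are illustrative, not data from any real program; the idea is simply to surface what is increasing, decreasing, or stalling.

```python
# Illustrative quarterly participation counts (invented numbers).
participation = {"Q1": 120, "Q2": 135, "Q3": 131, "Q4": 160}

# Percent change from one quarter to the next, rounded to one decimal place.
quarters = list(participation)
pct_change = {
    curr: round(100 * (participation[curr] - participation[prev]) / participation[prev], 1)
    for prev, curr in zip(quarters, quarters[1:])
}

for q, pct in pct_change.items():
    trend = "increasing" if pct > 0 else "decreasing" if pct < 0 else "flat"
    # The trend label is a prompt for discussion, not a conclusion.
    print(f"{q}: {pct:+.1f}% ({trend}) -- ask: why might this be happening?")
```

Even a table this simple can anchor a sensemaking conversation: the dip from Q2 to Q3 is the kind of signal worth asking "why?" about before chasing any statistical perfection.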
- How can we protect margin when everything feels urgent?
Margin is where power builds. It’s protected time to step back and ask: Are we responding to real community need or reacting to urgency shaped by funders, headlines, or internal pressure? Taking that pause helps you lead with strategy, not stress.
- Acknowledge urgency addiction. Talk openly about the pressure to always say “yes.”
- Revisit your mission. Use it to decide what not to take on — margin starts with focus.
- Block margin time on calendars for strategic thinking, staff development, or pause weeks.
- Build habits of reflection. Try a “margin moment” at the start of each meeting to ask: What are we rushing into? What needs space?
- Normalize saying “no,” even to well-funded or high visibility opportunities if they pull you away from your core work. Protecting your margin isn’t avoidance; it’s how you stay focused, grounded, and able to respond when it matters the most.
- How do I shift a crisis-minded leadership team toward strategy?
When everything feels like a fire drill, strategy takes a backseat. Leaders who “save the day” might feel effective in the moment — but this cycle creates instability and confusion. Strategy isn’t a luxury, it’s a form of care for your staff, partners, and community. It brings coherence in moments of chaos.
- Name the pattern. Use non-blaming language to observe and ask: We’ve been in response mode a lot — is it helping our long-term goals?
- Ask traction questions. What’s ours to lead? Where are we most trusted? These re-center the conversation on purpose.
- Link wins to systems. After a quick win, ask: How do we make this easier next time? Strategy is about building repeatable success.
- Model slow thinking. Encourage timeouts for reflection and intention before launching into solutions.
When you step out of reactive mode, the next step is to reconnect with your organization’s “sweet spot.”
- How do we strengthen our organization’s “sweet spot”?
Your “sweet spot” is where passion, skill, and community need overlap. It’s the work your team is proud to lead, trusted to deliver, and energized to sustain. Strengthening it makes your impact more focused and your team more resilient.
- Ask your team: What work would we do even without a grant?
- Look for repeated signals. Where does the community consistently turn to you? What do partners rely on you for?
- Audit your current strategy. Does it build on what you do best, or pull you away?
- Align internal roles and systems to support that focus — not just funder demands.
- Revisit annually as your ecosystem and community evolve.
- Once you name your sweet spot, stay disciplined. Prioritize what strengthens your core contribution and turn down work that might grow your visibility but dilute your impact.
- As a funder, how do I support ecosystems, not just grantees?
Supporting systemic impact means looking beyond individual grantee performance to the health of the whole ecosystem. That includes trust building, coordination, and funding infrastructure.
- Map beyond your portfolio. Who are the connectors, under-resourced leaders, and key influencers in the field?
- Support shared infrastructure. Fund tools, systems, and roles that benefit multiple organizations.
- Make space for peer learning. Cohorts or cross-org strategy groups strengthen the whole field.
- Fund reflection and relationships — not just activities. Ask: What would it look like to fund not just organizations, but relationships and coordination across the ecosystem?
- Offer patient capital that gives organizations time to build trust, clarify roles, and move at the pace of real change.
- How do we measure capacity building — not just outputs?
Organizations are moving away from checking boxes to asking: Is this work making us stronger, clearer, and more adaptive? At Community Science, we measure capacity by how well an organization turns learning into action. That might look like shifting a strategy midstream, co-designing solutions with partners, or advocating for change in the broader systems they’re working to influence (e.g., education or housing).
- Define success based on alignment, clarity, and decisionmaking — not just completed trainings.
- Use three horizon-specific metrics:
- H1: smoother processes, clearer roles.
- H2: better cross-team coordination, quicker pivots.
- H3: long-term positioning and leadership in your field.
- Use qualitative tools like reflection prompts, team feedback, and case stories.
- Track behavioral shifts. Are people collaborating more? Are people making decisions with less confusion?
- Don’t hide the mess — show learning, iteration, and growth.
- How do we reduce bias in our data analysis?
All data work includes interpretation, and interpretation includes bias. The goal is not to be perfectly objective, but to be transparent, inclusive, and reflective in how meaning is made.
- Ask reflective questions before you start analysis: What assumptions are we making? Who’s not at the table?
- Include diverse voices in sensemaking — staff, partners, and community members.
- Disaggregate carefully. Don’t just show differences — explore why they exist.
- Pair numbers with context. Use quotes, stories, or historical background.
- Document your interpretation process and share limitations clearly.
- Close the loop with communities. Share findings and ask: Does this match your experience?
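To illustrate the "disaggregate carefully" point, here is a small, hypothetical sketch: it computes an outcome rate per group so that differences become visible, while the "why" still has to come from context, stories, and community sensemaking. The groups, records, and completion flags are invented for illustration.

```python
from collections import defaultdict

# Invented records; "group" could stand in for any demographic or geographic split.
records = [
    {"group": "A", "completed": True},
    {"group": "A", "completed": True},
    {"group": "A", "completed": False},
    {"group": "B", "completed": True},
    {"group": "B", "completed": False},
    {"group": "B", "completed": False},
    {"group": "B", "completed": False},
]

# Tally totals and completions per group.
totals, completions = defaultdict(int), defaultdict(int)
for r in records:
    totals[r["group"]] += 1
    completions[r["group"]] += r["completed"]

# Completion rate per group, rounded to two decimals.
rates = {g: round(completions[g] / totals[g], 2) for g in totals}
for g, rate in sorted(rates.items()):
    # Showing the difference is the start of analysis, not the end:
    # pair each gap with context before drawing conclusions.
    print(f"Group {g}: completion rate {rate:.0%} (n={totals[g]})")
```

Note that the output deliberately includes the group size (`n=`): a large-looking gap over a handful of records is exactly the kind of finding to bring back to the community and ask, "Does this match your experience?"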
Also check out our Doing Evaluation in Service of Racial Equity, a three-part series for evaluation professionals describing how to incorporate racial equity as a core value embedded in every aspect of the evaluation process. Commissioned by the W.K. Kellogg Foundation (WKKF) and developed and written by Community Science.
- What are your recommendations for tools and resources to get started or enhance one’s journey into data collection, analysis, and storytelling and visualization? What are best practices for survey development and deployment?
Great question! The key is to match your tools and strategies to your actual goals and capacity. Here is one resource many have found helpful in framing data collection in service of decisionmaking, from the W.K. Kellogg Foundation: this guide is designed for people with little or no experience with formal evaluation, to help them become more familiar with evaluation concepts and practices, partner with independent evaluators, and use evaluation more effectively to continually learn from and improve their work.
- How can we support partner organizations without overstepping?
Supporting others’ growth requires humility and trust. It’s about creating the space, tools, and relationships that allow others to lead. Some offerings for where to begin:
- Start by listening. Ask: What does capacity mean to you?
- Avoid one-size-fits-all technical assistance. Offer flexible coaching, tools, or funds orgs can shape.
- Support strategic retreats or learning cohorts where reflection is the goal, not production.
- Build relational trust through consistent, thoughtful check-ins — not just transactions.
- Respect ecosystem roles. Help partners name and protect their niche (their sweet spot) and collaborate more intentionally. Then invest in the relationships, time, and trust building it takes to make cross-organizational collaboration actually work and last.
Related Webinars

Maintaining Strategy Momentum in Uncertain Times with Amber Trout, Ph.D.
When uncertainty is the norm, staying focused on your strategic goals can be challenging. This webinar helps nonprofit and mission-driven organizations revisit and reinforce their strategy by reviewing their full pathway of action—from today’s urgent needs to long-term impact. Participants will explore how to align immediate priorities, build internal capacity, and remain accountable to their mission with a place-based lens. We’ll also focus on the often-overlooked “middle phase” of implementation—the space between urgent action and distant outcomes—where organizations must intentionally invest in the systems, staff, and practices needed to sustain change.
Key Takeaways:
- Assess your strategic alignment.
- Avoid burnout from overloaded plans.
- Create clarity about what to do now and what to grow over time
Resources for this webinar: Maintaining Strategy Momentum in Uncertain Times

Mapping the Work: A New Way to Build Capacity with Jasmine Williams-Washington, Ph.D.
Capacity building goes beyond improving organizational functions—it’s about aligning strategy, community accountability, and systems awareness.
Watch this webinar with Jasmine Williams-Washington, “Mapping the Work: A New Way to Build Capacity” where we explore a systems approach to strengthening nonprofits and their ecosystems.
What You’ll Learn:
- How to identify and operate from your organization’s “sweet spot”
- Strategies for mapping your ecosystem and bridging silos
- Applying a systems lens to deepen your impact
This webinar is for nonprofit leaders, funders, evaluators, and anyone interested in sustainable, community-driven change. You’ll walk away with practical strategies to help your organization become more resilient, reflective, and connected.
Resources for this webinar: Mapping The Work: A New Way to Build Capacity
