Inside Funder-Intermediary-Evaluator Partnerships: Three Cases

Posted Friday, April 26, 2019 at 10:26 am

By Meg Long and Clare Nolan

Funders who have ambitious goals to change large systems often create partnerships with intermediaries and evaluators to help realize their visions. But what does it take to effectively weave these partners together and position them to achieve their goals? While these three-way partnerships are common in the social sector, our initial scan of the literature mostly revealed substantive resources on two-way relationships—how funders can partner with intermediaries, and how they can partner with evaluators. Very few resources spoke to three-way partnerships among funders, intermediaries, and evaluators.

Exploring cases of funder-intermediary-evaluator partnerships with varying configurations can illuminate tensions that may arise and suggest strategies for managing them. Though the three case examples covered here differ in content, geography, investment, time span, and partner roles, all three partnerships encountered—and successfully navigated—relationship tensions.


Linked Learning Regional Hubs of Excellence

In 2015, The James Irvine Foundation engaged Jobs for the Future (JFF) as an intermediary to help design and manage a cross-sector systems change initiative that aimed to increase the quality and scale of Linked Learning, an evidence-based approach to college and career readiness in California. Because the Foundation was testing a new strategy—translating an educational model into a broader regional systems approach—it saw value in commissioning a collaborative evaluation. Based on pre-existing relationships and its appreciation of their expertise in developmental evaluation and systems change initiatives, Irvine commissioned Equal Measure, Engage R+D, and Harder+Company Community Research to conduct the evaluation.

Having identified and funded the intermediary and evaluation teams directly, Irvine played a major role in managing partner relationships and setting the tone for sharing learnings and insights across the Linked Learning initiative. As one JFF staff member noted, “The funder set just the right tone to create more space for this kind of honest trust-building… It accelerated things.” And according to Irvine, “Having an evaluation partner in charge of external observations and then curating and facilitating that reflection process has been so powerful,” because it built the capacity of the intermediary and the funder to engage in authentic assessments of the partnership structure, both in person and digitally.

Some tensions between the partners took time to overcome. For example, JFF staff were initially somewhat unclear about the role of the evaluation, which prompted deeper conversations to clarify partner roles and norms. Reflection sessions became critical for building knowledge, trust, and the capacity of the partners; these sessions enabled the three organizations to leverage one another’s knowledge and expertise and strengthen learning overall.


Consumer Voices for Coverage

In 2007, the Robert Wood Johnson Foundation launched Consumer Voices for Coverage (CVC)—a state-level advocacy initiative for health reform—with Community Catalyst as the intermediary. Following the passage of the Affordable Care Act (ACA), the partnership’s reform efforts moved from state to federal action. RWJF gave Community Catalyst leeway in oversight and subcontracting, enabling it to serve as the initial touch point for grantees and to determine when guidance from the funder was needed. In the second phase of the initiative, RWJF sought to build the evaluation capacity of grantees and the intermediary by engaging Spark Policy Institute to provide evaluation coaching services. These services included hosting “Evaluation 101” webinars, coaching CVC grantees on evaluation, and gathering feedback about the technical assistance that Community Catalyst provided. The purpose of this work was to help Community Catalyst and its grantees harness evaluation as a driver of effective advocacy.

The greatest challenge the partnership faced was generating grantee buy-in for evaluation capacity building. A number of grantees opted not to participate in the process. While some grantees cited capacity issues, others noted negative past experiences that dissuaded them from participating. However, grantees that engaged with Spark came to see the benefit of capacity building. As Community Catalyst put it, “To do that work and to have grantees define their own evaluation questions was really important and different.”


Opportunity Youth Incentive Fund

The Opportunity Youth Incentive Fund (now known as the Opportunity Youth Forum) is a complex social change effort involving 34 regional and national funders, designed to address unemployment and educational attainment among youth ages 16 to 24 who are neither in school nor in the workforce. The partnership structure reflects the multiple roles played by the Aspen Institute’s Forum for Community Solutions (AIFCS), which manages the funder collaborative; these roles include implementation and providing a national-scale voice for the initiative’s work. (The Aspen Forum for Community Solutions co-convenes the Collective Impact Forum with FSG.) In addition to managing the funders and partners, AIFCS serves as fiscal intermediary, developing a learning framework, monitoring collective impact, and building national momentum around the Opportunity Youth agenda. Jobs for the Future serves as implementation intermediary, while Equal Measure provides evaluation and designs thought leadership approaches for the partnership.

Partners bring varied perspectives and diverse experiences, and AIFCS coordinates their needs, roles, and communication, but the initiative’s complex structure proves challenging at times. One challenge is that funders expect to see demonstrations of impact that may not always align with the pace of the initiative’s work and the scope of the evaluation. At the same time, a shared culture of learning unites the partners. Setting learning expectations and norms (such as being honest and learning from experience) upfront was critical. Particularly with grantees, promoting a culture of learning has encouraged risk-taking and learning from risks. Having many partners has also accelerated AIFCS’s responsiveness to grantees’ learning and capacity-building needs.

These case examples are but a small part of our recent report, Weaving Successful Partnerships: When Funders, Evaluators, and Intermediaries Work Together. In the report, we share five tensions that typically arise in funder-intermediary-evaluator partnerships, along with learning notes based on analysis of the various challenges that arise in these partnership triads.


Meg Long is president of Equal Measure; Clare Nolan is co-founder of Engage R+D.

New Research Study: When Collective Impact Has an Impact

Posted Thursday, March 1, 2018 at 4:14 pm

We are excited to share the recently published report “When Collective Impact Has an Impact: A Cross-Site Study of 25 Collective Impact Initiatives,” conducted by a research team from ORS Impact and the Spark Policy Institute.

This study, commissioned by the Collective Impact Forum in early 2017, was designed to answer the question: To what extent and under what conditions does the collective impact approach contribute to systems and population changes?

To explore this question, the research team studied 25 sites, including eight deep-dive site visits, and generated a rich set of findings that we hope will be useful for collective impact practitioners, community members, funders, and researchers/evaluators.

The research team looked at the implementation of the five collective impact conditions and the Principles of Practice, and how these contributed to the following:

  • Early Changes: Changes to the environment that lay the foundation for systems and policy changes, such as increased partnership quality, collaboration, and awareness of the issue.
     
  • Systems Changes: Changes to core institutions within the initiative’s geographic area. These changes (1) may be formalized and likely to be sustained, or may be more informal experiments that lay the groundwork for future formalized changes; and (2) may happen in a single organization, in multiple organizations with a common purpose (in terms of both issue area and sector), or in multiple organizations with multiple purposes.
     
  • Population Changes: Changes for the initiative’s target population, which may be people within specific systems, in specific geographic areas, or with specific needs.
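
To make this three-tier framework concrete, here is a minimal, hypothetical sketch of how a site’s reported outcomes might be grouped into the tiers and tallied. The sites and outcomes below are invented for illustration; they are not data from the study.

```python
# Hypothetical illustration of the study's three outcome tiers.
# The sites and outcomes are invented; they are not study data.
from collections import Counter

TIERS = ("early", "systems", "population")

site_outcomes = {
    "Site A": {"early": ["stronger partnerships"],
               "systems": ["new shared data agreement"],
               "population": ["higher graduation rate"]},
    "Site B": {"early": ["greater visibility of the issue"],
               "systems": [],
               "population": []},
}

# Count how many sites report at least one change in each tier.
tally = Counter(tier
                for outcomes in site_outcomes.values()
                for tier in TIERS
                if outcomes[tier])

for tier in TIERS:
    print(f"{tier} changes: {tally[tier]} of {len(site_outcomes)} sites")
```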

A key finding of the report is that “the role of the collective impact initiatives in contributing to population change alongside other efforts or enablers is a critical and valuable aspect of social change.”

The content below highlights some of the key findings from the rich and nuanced report. For more depth, we encourage you to read the Executive Summary and the Full Report.

In addition, we would like to thank the funders supporting this study: the Annie E. Casey Foundation, Bill & Melinda Gates Foundation, the Houston Endowment, the Robert R. McCormick Foundation, the Robert Wood Johnson Foundation, and the W.K. Kellogg Foundation.

Our Collective Impact Forum team will be working over the coming months to share additional insights into how these findings may inform your work. Stay tuned for a webinar diving into the study and its findings, future blogs going deeper on specific pieces of the study, and virtual coffee chats with leaders from some of the sites that participated in the study. We also welcome other ideas for how we can make the findings of this report most useful for you … do let us know!
 

Map and List of 25 Study Sites


SUMMARY OF STUDY FINDINGS

Population Change

Overall, the study found that:

1) Twenty of the 25 collective impact sites studied had achieved population change on at least one outcome.

2) For all eight deep-dive sites, the research team found that collective impact undoubtedly contributed to the desired population change.

  • For three of these sites, there was evidence that the CI approach made a strong contribution to population change, with low plausibility of any alternative explanation for how that change could otherwise have occurred.

  • For the other five sites, the researchers found evidence that CI had been a necessary element of the population change story, but was not on its own sufficient to explain the change achieved.


Systems Changes

The study also looked at what types of systems changes the deep-dive sites were effecting, and how those related to population change:

  • Changes in services and practices were the most common systems changes achieved across sites; formalized systems changes were also frequently seen.

  • Among the eight deep-dive sites, the three with no strong plausible alternative explanation for population change were more likely to focus on data and resources, whereas the five where collective impact was necessary but insufficient for achieving population change were more likely to focus on political will and policy change.

  • Population changes generally stemmed from changes in services and from improved practices and policies.


Early Changes

In addition to population and systems changes, the study looked at the early changes created by the collective impact initiatives that laid the foundation for these subsequent changes.

  • The early changes most frequently identified by the eight deep-dive sites as contributing to longer-term change were the collective impact initiative’s role in strengthening partnerships, building and enhancing collaboration, increasing or reframing visibility of the issue, and building political will.
     
  • Across the broader set of 25 sites, additional early changes included increasing data availability and use, and increasing the capacity of local partners.


Outcomes from Implementing Collective Impact


In addition, the study explored the implementation of the collective impact approach, and how implementation related to the outcomes achieved. Key findings include:

  • Study sites generally demonstrated stronger implementation of the Backbone Support and Common Agenda conditions, and emerging or no implementation of Shared Measurement and Continuous Communication.

  • For sites with more mature implementation of the conditions, backbone support, common agenda, mutually reinforcing activities, and shared measurement all contributed to early changes and systems change outcomes. No strong relationship was identified between continuous communication and outcomes.


Advancing Equity within Collective Impact

The study also looked at how initiatives approach equity in their work: specifically, their capacity to engage in equity action, their implementation of equity-focused actions, and their representation and meaningful inclusion. The researchers found that initiatives with a strong or emerging equity focus showed promise in their equity outcomes; with a few exceptions, those with no equity focus typically did not see results that advanced equity.


Implications

Finally, the study highlights four implications that are relevant to all collective impact stakeholders:

  • Collective impact is a long-term proposition; take the time to lay a strong foundation
     
  • Systems changes take many forms; be iterative and intentional
     
  • Equity is achieved through different routes; be aware, intentional, and adaptable
     
  • Collective impact initiatives take on different roles in driving change; be open to different routes to making a difference

Additional specific implications for funders, practitioners, community members, and researchers/evaluators are also included in the body of the full report and will be the topic of future blogs.

Download the full report and executive summary


What findings from the study resonated with you? What seemed similar to your own work? What seemed different? Let us know in the comments!

Webinar

Using Data in Collective Impact

In this virtual coffee chat, JaNay Queen Nazaire and Jeff Raderstrong from Living Cities will share what they have learned about how to use data within collective impact efforts to help change behavior and achieve better outcomes.

The chat also covers Living Cities’ Data and Collective Impact resource series.

This virtual coffee chat was held on June 6, 2017.

Resources referenced in this episode:

Data and Collective Impact

About the Collective Impact Virtual Coffee Chats: The CI Virtual Coffee Chats are free online webinars where we talk with collective impact practitioners from across the field about their work and what they’re learning. Each chat includes Q&A time where attendees can ask questions and get answers.

Listen to past Collective Impact Virtual Coffee Chats

Virtual Coffee archive

Presentation

Evaluating Collective Impact - COP In-Person Meeting (Sept. 2016)

This Community of Practice in-person meeting was held on Sept. 28-29, 2016.


Download an updated version of the Community Engagement Toolkit in the Forum Library.

Sept. 2016 Meeting Materials

  • Post-Meeting Summary
  • Evaluating Collective Impact Presentation (9/29/16)
  • COP Leadership Pre-Meeting: Engaging Your Board of Directors in Support of Collective Impact (9/28/16)
  • Rider-Pool Foundation Collective Impact Fellowship 2017
  • Community Engagement Toolkit

Tool

Facilitating Intentional Group Learning: A Practical Guide to 21 Learning Activities

For organizations to be successful, individuals need opportunities to share data, as well as their knowledge and experiences, with others. Facilitated, intentional group activities create the ideal environment for reflection and dialogue that lead to new insights and understandings.

From quick 20-minute activities to multi-hour gatherings, this guide provides detailed instructions on how to conduct high-energy, inclusive, and productive experiences.


Top Takeaways

  • Anyone can design and facilitate a learning activity. The goal is to be thoughtful and deliberate about what is to be learned, by whom, when, and where.
     
  • Learning opportunities can be incorporated into already established meetings and gatherings; taking time for intentional learning doesn’t have to be time-consuming or costly.
     
  • Being clear about the goals for learning will ensure that the right activity is designed for the right reasons.


Related Resources

Webinar: How to Integrate Continuous Learning into Collective Impact

Podcast: Building Continuous Learning into Collective Impact

How can Evaluators Be Effective Allies to LGBTQ+ Communities?

Posted Monday, January 25, 2016 at 5:43 pm

By Efrain Gutierrez and Grisel M. Robles-Schrader

This post was originally published on AEA365 on December 31, 2015.

This is Efrain Gutierrez with FSG and Grisel M. Robles-Schrader with Robles-Schrader Consulting. We want to share some suggestions to help cisgender heterosexual allies support lesbian, gay, bisexual, transgender, queer or questioning, intersex, and asexual (LGBTQ+) communities in culturally responsive evaluations. Allies who seek to address LGBTQ+ needs should consider the following:

Hot Tips:

1. Acknowledge your privilege. Some things you have taken for granted are not always accessible to the LGBTQ+ community: broad community support, bathrooms appropriate to your gender identity, acceptance in places of worship, and protection against discrimination are just a few examples. Reflecting on how the unavailability of these supports might affect LGBTQ+ communities will help you identify blind spots and ask better questions during interviews and surveys.

2. Get educated. Expand your knowledge of LGBTQ+ terminology and the history of the LGBTQ+ movement in the US. Don’t expect members of the community to have to educate you. Be proactive!

3. Ensure that LGBTQ+ folks are partners in your evaluation activities. Evaluators can’t reflect the needs and experiences of the LGBTQ+ community if its members are not included in meaningful ways in evaluation activities. You can’t represent people who have not been invited to collaborate as equal partners in the process.

4. Be a good listener. When members of the LGBTQ+ community share personal experiences during interviews or focus groups, listen carefully. Don’t focus on what you want to say next, and don’t offer alternative explanations (e.g., “They probably didn’t mean that”). Understand that some LGBTQ+ experiences are unique and that you might not be able to relate to them. Sometimes the role of an ally is simply to give those who go unheard a chance to be heard.

5. Speak up but not over. After listening to LGBTQ+ communities, spread awareness by using your privileges and resources to help them reach others. The most effective way to share power is to empower LGBTQ+ communities to share their own stories instead of filtering them through your understanding as a cis/heterosexual person. Use direct quotes in your evaluations, and always give credit to those you are learning from.

6. Create safe spaces. Unfortunately there are still spaces unreceptive to LGBTQ+ stories. Speak up when people make homophobic comments and educate others. As an ally, you can help create opportunities for safety where resistance occurs. It can be as simple as coming out as an ally to your family, workplace, and/or house of worship, or it may require a more lengthy, involved process of opening hearts and minds. Remember that sharing cannot occur where fear exists.

Being an ally goes beyond being merely accepting of LGBTQ+ communities; it requires intentionality, learning, and action!


What do you think? Have recommendations to share from your own collective impact work? Let us know in the comments.

Measure Partnerships for Impact

Posted 4 years ago at 10:08 am

We are two practitioners in the field of collaborative work who, like many of you, want to contribute to the success of collaboratives trying to improve conditions of well-being in their communities. This post shares our learnings from the field, and we hope it inspires and motivates you to begin this journey with your groups. It introduces a white paper in three distinct parts that examines the value of building, sustaining, and measuring the strength and effectiveness of partnerships to accelerate social impact.

Part 1: Why measuring your collaborative consistently is an important process in attaining your impact goals

Part 2: Practical steps on how to measure your collaborative

Part 3: The value of using data about your partnership for your collaborative’s growth, stakeholder engagement, sustainability, and social impact results
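
To make Part 2 concrete, below is a minimal, hypothetical sketch of one simple way a collaborative might measure itself: surveying partners on a few partnership dimensions and averaging the results. The dimensions, rating scale, and threshold are illustrative assumptions, not a method prescribed in the white paper.

```python
# A hypothetical partner-survey tally: the dimensions, 1-5 scale, and
# threshold below are illustrative assumptions, not the white paper's method.
from statistics import mean

# Each response rates the partnership on a 1-5 scale (1 = weak, 5 = strong).
responses = [
    {"trust": 4, "communication": 3, "shared_goals": 5},
    {"trust": 5, "communication": 2, "shared_goals": 4},
    {"trust": 3, "communication": 3, "shared_goals": 4},
]

ATTENTION_THRESHOLD = 3.5  # dimensions averaging below this merit discussion

def summarize(responses):
    """Average each dimension across partners and flag weak spots."""
    dimensions = responses[0].keys()
    averages = {dim: mean(r[dim] for r in responses) for dim in dimensions}
    flags = [dim for dim, avg in averages.items() if avg < ATTENTION_THRESHOLD]
    return averages, flags

averages, flags = summarize(responses)
for dim, avg in averages.items():
    print(f"{dim}: {avg:.1f}")
print("Needs attention:", ", ".join(flags) or "none")
```

Repeating a survey like this at regular intervals gives the collaborative a trend line for each dimension, which is what makes the measurement useful for the growth, stakeholder engagement, and sustainability conversations described in Part 3.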

Ask Me About Evaluation for Collective Impact

Posted 4 years ago at 10:08 am

Hello, everyone,

As part of the Collective Impact Forum’s “Ask Me About” initiative, I am here to answer your questions and provide any guidance that could help give you more confidence in your work.

At FSG, I lead our Strategic Learning and Evaluation approach area. Our team focuses on providing evaluation expertise over a wide range of topic areas, including U.S. and global health, economic development, youth and education, arts and culture, community development, and the environment. We help plan and conduct evaluations, develop strategic learning and evaluation systems, conduct strategic reviews, build evaluation capacity, develop shared measurement systems, and facilitate organizational learning. Our work also includes assisting organizations in evaluating their collective impact efforts; you can find some of our recommendations in our recent publication, Guide to Evaluating Collective Impact.

What are you wondering about as you think about how and when to conduct evaluations of your collective impact effort?  What would you love to understand better? What challenges are you facing?

As part of “Ask Me About,” I’m here to listen and share ideas and recommendations where I can. This thread will be open for the next week, through June 2, for you to share your questions and reflections.

How can I help?

Tool

Follow the Money: A Tool for Mapping Funds for Out-of-School Time Initiatives

This tool provides a step-by-step approach for statewide afterschool networks and others to track out-of-school time investments in their states. Specific state examples and customizable worksheets are also included to assist users with data collection and analysis.

Created by The Finance Project.

Article

Collective Insights on Collective Impact

Sponsored and curated by the Collective Impact Forum, "Collective Insights on Collective Impact," which appears in Stanford Social Innovation Review's fall issue, shares cutting-edge thinking from 22 practitioners, funders, community organizers, and thought-leaders.

Through the diverse voices of the authors, you can dive deeper into important collective impact topics such as public policy, evaluation, sources of power, and community engagement.

You can download the full set of articles here, or follow the links below to access individual articles from the collection.

Join the discussion for the Forum!


Article Topics and Authors

