
Friday, September 13, 2013

Are Foundations too Focused on Themselves?




The Center for Effective Philanthropy (CEP) just came out with a new report, “Nonprofit Challenges: What Foundations Can Do.” There were some dismaying findings. CEP’s survey of nonprofit leaders found that 52 percent felt that foundations are unaware of the challenges that nonprofits face. Less than 33 percent felt that foundations use their resources in ways that help nonprofits actually meet those challenges. And perhaps the most alarming finding is that nonprofit leaders felt more challenged to attract foundation support than any other kind of support: Respondents were nearly unanimous about the difficulty of attracting (99 percent) and maintaining (89 percent) foundation funding.

Today, there are well over 75,000 private foundations operating in the United States. Their tax-exempt assets total nearly $644 billion, of which they charitably spend approximately $3 billion each year (Foundation Center, 2012). Given such largesse and sheer numbers, one wouldn’t imagine that foundation funding would be so hard to come by. Yet CEP finds that nonprofit leaders consider foundation grants the hardest to get, so there is obviously a sharp disconnect between supply (foundation support) and demand (nonprofit financial need). What accounts for this disconnect? I would argue that the nonprofit economy is inherently dysfunctional, and one main reason is that private donors (individual philanthropists and foundations) behave as if the nonprofit economy were a capital marketplace. Just as they do in the for-profit economy, donors make emotion-based decisions, and foundations, as charitably focused as they are, are better suited to serving internal priorities than the needs of nonprofits. As a result, nonprofits set up to address issues and problems that could never be supported in the for-profit marketplace still find themselves underserved in the nonprofit sector. In other words, funders of nonprofits don't prioritize the very things the nonprofit system was intended to address.

Though incorporated as nonprofit entities, foundations behave like any other consumer in the marketplace (i.e., their giving behavior is like that of irrational private donors or bottom-line-focused corporate investors) because of foundations' history, ideological roots, and, ironically, their own pursuit of excellence. Consider the following:

First, foundations are inherently self-serving. When the private foundation entity was established by the Tax Reform Act of 1969, foundations were asked to do only one thing: behave charitably. Regulation of private foundations was necessary at that time because the wealthy were incorporating nonprofits as a way to avoid estate taxes and as an instrument for maintaining control of wealth. To counteract this, the IRS specified that private foundations must use their funding for charitable purposes and not for personal inurement. Beyond this requirement, however, government never went so far as to intrude on foundation ‘owners’’ freedom of expression, with the result that today’s foundations prioritize the wishes (and whims) of donors over social needs. This historical backdrop helps explain the oddity of foundation culture: So many foundations do wonderful things for the public good, but their individuality, rooted in the privilege of free expression, makes the sector seem incoherent, uncoordinated, and idiosyncratic in how and what it supports.

Second, in order to maintain the sanctity of the wishes and intentions of foundation ‘owners’ (donors, founders, trustees, executive leadership), foundations maintain a highly ritualized process for identifying, vetting, reviewing, and deciding on which nonprofits to support. This is called the due diligence process. Anyone who has been a grantmaker will appreciate that this time- and resource-consuming process has its merits. After all, with so many nonprofits and projects to choose from, funders need to be sure that they are supporting the best projects over any others. Foundations’ careful vetting and review processes are useful in protecting foundations’ objectives; the downside, however, is that nonprofits are at a disadvantage and waste too many financial and human resources going through this process. At what point is the sanctity of upholding a foundation’s vision worth the harm it causes in the nonprofit field?

Third, the specialized practice of foundation strategic planning has done much, I would say too much, to entrench the primacy of a foundation’s objectives. Consider that a foundation’s strategic planning process is considered a finished success when trustees feel that their plan is a unique reflection of their combined interests and visions. (This is why having a diverse and community-based board is so important.) To the detriment of society, however, too little time is spent on the needs of the field. Oftentimes, when a survey of grantees is conducted during a foundation’s strategic planning process, the results of such a scan are used to inform the tactical, not the strategic, level. In other words, any input from the field is secondary to the needs and wants of the foundation board. Consequently, foundation support better reflects the lifestyles, entrepreneurial practices, and ideologies of the wealthy than the needs of society’s most underserved. (To this point, research studies have confirmed that foundations are not redistributive to the poor and that they reinforce class divisions [Center on Philanthropy at Indiana University & Google, 2007; Odendahl, 1990; Ostrander, 1984; Ostrower, 1995; Silver, 2007].)

Fourth, foundations spend an inordinate amount of time and money on assessments that are more about their own internal operating and managerial concerns and less about their impact on grantees’ capacity to realize social purpose. I looked up the number and type of assessment tools that exist today for foundations on Foundation Center’s TRASI database (Tools and Resources for Assessing Social Impact). (A wonderful, yet underutilized, resource, by the way.) There are more than 60 tools specifically designed for foundation assessment, and nearly all are meant to help foundations answer questions about their own internal operating or managerial performance. For example, CEP’s Grantee Perception Report helps to gauge a foundation’s customer service as an indicator of performance. The Wallace Assessment Tool assesses whether grantees fulfilled their project objectives: In other words, did grantees make good on their promises to funders? That so many foundation assessment tools now exist speaks to the fact that foundations are seriously interested in their performance, which is a good thing. The downside, though, is that these tools are internally focused and do not attempt to answer how foundation funding helps grantees. With all this energy spent on foundation performance, it's a wonder that the more fruitful and challenging pursuit of answering how foundations actually help nonprofits remains underdeveloped.

In case you’re curious, here’s a selection of foundation assessment tools available, which can be found using TRASI on the Foundation Center website.

·      A Guide to Actionable Measurement (Gates Foundation)
·      Application Perception Report (CEP)
·      Ashoka Measuring Effectiveness Questionnaire
·      Balanced Scorecard (New Profit, Inc.)
·      Benefit-Cost Analysis (Abt Associates)
·      Benefit-Cost Ratio (Robin Hood Foundation)
·      Building a Performance Management System (RootCause)
·      Building Future Leaders Diagnostic Survey (Bridgespan)
·      Capabilities Profiler (Keystone Accountability)
·      Charting Impact (Independent Sector, BBB Wise Giving Alliance, & Guidestar)
·      Checklist for Reviewing a Randomized Controlled Trial (Coalition for Evidence-Based Policy)
·      Community of Learners (TCC Group)
·      Comparative Constituency Feedback (Keystone Accountability)
·      Compass Index Sustainability Assessment (ATKisson Inc)
·      Core Capacity Assessment Tool (TCC Group)
·      Criteria for Philanthropy at Its Best (NCRP)
·      Developing a Theory of Change (Keystone Accountability)
·      DevResults (CaudillWeb)
·      Evaluating the Impact of Development Projects on Poverty (World Bank)
·      Evaluation Plan Builder (Innovation Network)
·      Evaluation Principles and Practices (Hewlett Foundation)
·      External Review of Program Strategy (Duke Foundation)
·      Foundation Performance Assessment Framework (Irvine Foundation)
·      Foundation Scorecard (RWJ Foundation)
·      Foundations of Success Guideline for Effective Evaluation (Foundations of Success)
·      Framework for Program Evaluation (CDC)
·      Grantee Perception Report (CEP)
·      Impact Reporting and Investment Standards (GIIN)
·      Learning for Results (GEO)
·      Learning with Constituents (Keystone Accountability)
·      Measures of Success (Foundations of Success)
·      Multidimensional Assessment Process (CEP)
·      Operational Benchmarking Report (CEP)
·      Organizational Assessment Tool (Innovation Network)
·      Outcome-Based Evaluation (Organizational Research Services)
·      Program and Policymaking Evaluation (Kellogg Foundation)
·      Project Streamline Grantmaker Assessment Tool (CEP & GMN)
·      Prove It! (New Economics Foundation)
·      Pulse (Acumen Fund)
·      Social Audit (Social Audit Network)
·      Social Impact Assessment (Rockefeller Foundation & Goldman Sachs Foundation)
·      SROI (The SROI Network, and many others’ proprietary SROI tools)
·      Staff Perception Report (CEP)
·      Stakeholder Assessment Report (CEP)
·      Success Measures Data System (NeighborWorks America)
·      Readiness for Organizational Learning and Evaluation Instrument (FSG)
·      Theory of Change Community (ActKnowledge)
·      Trustee Evaluation Toolkit (FSG)
·      Wallace Assessment Tool (Wallace Foundation)

I took the time to type this list (and nearly wore down my fingers) from the TRASI site to point out that the foundation sector really has an embarrassment of riches when it comes to assessment instruments, guides, and tools. In the context of CEP’s newest findings, what becomes clear is that foundations focus too much on their internal objectives and not enough on the needs and objectives of the nonprofits they are supposed to serve.

Don’t get me wrong. I am excited that so many foundation tools exist because I really value foundation evaluation: There is nothing more exciting than combining program strategy and evaluation so that clues about funding outcomes, intended or accidental, provide a way of improving grantmaking interventions. I've mined for ideas in many of the tools listed in TRASI because I am personally interested in advancing grantmaking practice. But, given what nonprofit leaders are communicating via the CEP report, I am also cognizant that foundations’ pursuit of their own excellence has hurt nonprofits. All this innovation in improving foundation performance has had a significant downside. While foundations have evolved increasingly sophisticated, professional practices, the field’s focus on its own objectives has meant skirting the question of how foundations are actually helping and serving public charities and, by extension, society’s needs and demands. Until foundations spend just as much time innovating in that direction, too many nonprofit leaders will feel more like beggars than deliverers of society’s salvation. There is tremendous opportunity now for foundations to forge a new approach that incorporates social need and nonprofit concerns earlier and with more urgency in foundations’ planning, evaluation, management, and operational practices.


Works Cited

Buteau, E., Brock, A., & Chaffin, M. (2013). Nonprofit challenges: What foundations can do. San Francisco, CA: Center for Effective Philanthropy.

The Center on Philanthropy at Indiana University, & Google. (2007). Patterns of household charitable giving by income group, 2005. Indianapolis, IN: The Center on Philanthropy at Indiana University.

Foundation Center. (2012). FC Stats: Number of grantmaking foundations, assets, total giving, and gifts received, 1975 to 2010. New York, NY: Foundation Center.

Odendahl, T. (1990). Charity begins at home: Generosity and self-interest among the philanthropic elite. New York, NY: Basic Books.

Ostrander, S. (1984). Women of the upper class. Philadelphia, PA: Temple University Press.

Ostrower, F. (1995). Why the wealthy give: The culture of elite philanthropy. Princeton, NJ: Princeton University Press.

Silver, I. (2007). Disentangling class from philanthropy: The double-edged sword of alternative giving. Critical Sociology, 33(3), 537–549. doi:10.1163/156916307X189013

Saturday, July 20, 2013

Finally! Transforming Grant Reports into Useful Data


How many of you have heard this request: "Hi, there! How's it going? I know you're busy with that convening tomorrow, but can you send me any relevant data on the impact of our funding? I need it by the end of this week for a presentation...board meeting...foundation newsletter. Thanks!" You know the drill. The job descriptions of today's foundation program staff are expanding to include the ability to make grants that have a measurable impact. Inherent in that responsibility is your ability to collect and analyze outcome-related evidence: How do we know that our grants are having an impact? How is our support helping (or not)? What have been the effects of our grants—planned or unplanned? If these are your evaluation questions, this thought has likely also crossed your mind: How do I make sense of all the information from site visits, phone calls, conversations with the field, and grant report documentation (videos, narratives, studies, articles)? I know that we're having an impact, because I hear, read, and see it, but when it comes to collecting it for a report, it's hard to know where to start!

Oftentimes, this material sits in a cabinet until an evaluation opportunity comes up when, more likely than not, the consultant asks you for all of it. Or maybe you’re one of the lucky ones who actually has an evaluation expert on staff to manage all this information. Either way, an under-recognized tool that foundations should consider is data analysis software, which deserves to be treated as just as essential as your grants management software. In this post, I’m going to describe how to get started with one data analysis package called Dedoose. I’m definitely not getting paid to promote them (they don't even know I exist); this is really just an opportunity to share my positive experience with this product.

To start, data analysis software is a tool for analyzing the information you have locked up in volumes of grant reports. These programs (and there are many competing products) are often used by researchers for statistical analysis in quantitative research and/or for qualitative research to organize and identify patterns in text-, visual-, or oral-based materials. Given that the primary users of data analysis software are scientists, most are pretty technical, not very user friendly, and not cheap. I’ve been using NVivo (for qualitative work) and SPSS (for quantitative), and they take a while to learn, which is one reason such products don’t get picked up outside of academia or research work. When I first encountered Dedoose, I was really impressed. I had a Eureka! moment, because it was just so darn easy to use and holds so much promise for making all that grant report content (there's probably miles of it stacked in cabinets all across America) actually become relevant to outcome-focused work. What grantees don't realize is that their reports sit in cabinets not because of a lack of interest, but because it's 1) overwhelming to go back to them after an initial reading because there are just so many, and 2) difficult to transform the information into usable data without a lot of work. This is why program staff might have favorite writers to return to time and again when it's docket report-writing time, or may keep a running document of good quotes that they update--both pretty spotty and clunky efforts.

I promised a colleague that I would help set up the technological infrastructure of their grantmaking program to operationalize their evaluation process. In other words, I am helping them organize their grant reports and other material evidence of grant impact so that they can pull up stories and documentation of their grant effects easily. So, instead of writing a procedures document just for them, I’m using this opportunity to share this with all of you.

Some notes about Dedoose and what I will and won’t cover. Dedoose is web-based, which I love because it allows program staff to access the materials from home or office or wherever, and materials can be shared via the cloud with co-workers. But for those of you whose foundations haven’t yet tackled what it means to place grant materials on the cloud, be sure to discuss that policy-level question first before subscribing. (I will point out, however, that much scientific data is sensitive in nature, such as for studies of adolescent behavior, incarcerated study subjects, and the like. Dedoose was built to protect your data to the utmost, but check it out for yourself.) As for what I will and won’t cover, Dedoose does a great job providing instructions on how to use its software, so I will assume that you will refer to their video presentations for instruction. What I will supplement is how to apply Dedoose to grant-related materials. I have made up a narrative grant report as my example for how to use “descriptors” and “codes,” and the same approach can be applied to video uploads (e.g., if your grantee CEO was interviewed on CNN and you want to store and code it for evaluation-relevant content). So here we go…

To start, let’s create the scenario. You are an environmental program officer working on two portfolios—healthy rivers and alternative fuels. In your healthy rivers portfolio, you have a cohort of grantees who are all part of a 2010-2013 multi-year funding cycle of advocacy-based organizations. In this portfolio, you just received an interim grant report from the fictitious Happy Earth Network, which received a three-year grant of $300,000. In it, they described what they accomplished in 2012 because of your funding. They are super excited about the many objectives they met.

Now, dust off your foundation or program’s Theory of Change. Huh, you say? Hopefully, if done well and relatively recently, it will provide you with exactly the kind of information that you need to look for in order to assess whether your grantmaking is on track. You don’t have to have one, but it does help, especially because a Theory of Change should reflect the expectations of key foundation stakeholders (i.e., your co-workers, boss, and board). When you start analyzing your data, you want to make sure that what you analyze is of relevance to others, not just you. Here’s what you need to pluck out from your Theory of Change (or here’s what you can ponder in the absence of one):

  1. What are the objectives of my program’s funding? In this case, let’s say your program’s objective is to restore native salmon runs by dismantling dams in rivers where the environmental and social costs exceed the economic benefits.
  2. What should I be looking for to know if objectives are being met? You should have several indicators, so let’s use three as examples of outputs, outcomes, and impact. Output: Using your grant, grantee hires a communications director to sway public opinion in favor of dam removal. Outcome: Dam is removed. Impact: Salmon populations achieve sustainable levels in the dam-removed watershed.
  3. What indicators should I be looking for? Achieving more than output-level results is challenging, so you need to track the indicators of trends moving toward (or away from) hoped-for outcomes and impacts. Given the three levels of objectives listed, here are some examples of indicators (summarized in the sketch after this list): Output-level change indicator: Number of Happy Earth Network’s Facebook followers climbs to indicate public recognition; Outcome-level change indicator: Grantees report swaying politicians to their side; Impact-level change indicator: Scientifically commissioned report shows salmon runs are re-appearing.
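
Here's that sketch, in Python, showing the three indicator levels side by side. To be clear: Dedoose is point-and-click and requires no programming at all; this is purely my own illustration of how the indicators above line up, with names I made up.

    # Hypothetical sketch: the three indicator levels from the Theory of
    # Change above, organized so the hierarchy is easy to see.
    indicators = {
        "output": "Happy Earth Network's Facebook followers climb, "
                  "indicating public recognition",
        "outcome": "Grantees report swaying politicians to their side",
        "impact": "Scientifically commissioned report shows salmon runs "
                  "are re-appearing",
    }

    for level, signal in indicators.items():
        print(f"{level}: {signal}")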
So, back to Dedoose. The first thing you want to do is think about how your stakeholders will want to slice and dice the data based on organizational or grantmaking categories. Dedoose calls these “descriptors.” Will you need to demonstrate how a particular grantmaking portfolio is doing? Is your foundation starting to expand its grantmaking to include, for instance, minority-led organizations? Think about what information you might want for foundation communications: Maybe this year’s annual report will showcase grantmaking in rural communities. In other words, anticipate how you want to categorize your grantees. These categorical buckets will enable you to organize and call up the data based on grantees’ demographic and organizational information. For this example, let’s say that these categories are: ID # (this should be the same identifying number you use in your grants management system and files), Grant Program (Environment), Grant Portfolio (Healthy Rivers), Cohort (Healthy Rivers-Advocacy Building), Grant Amount ($300,000), Budget Size ($2 million), Org Founded (1995), Organization Name (Happy Earth Network), Minority-Led (yes—Happy Earth Network is led by a Latina), Location of Grantee’s Office (Montana).
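
To make the descriptor idea concrete, here's a minimal sketch of that same descriptor set as a plain Python record. Again, in Dedoose you enter these through the web interface; the field names below are my own shorthand, not Dedoose's, and the ID number is made up.

    # Hypothetical sketch: Happy Earth Network's descriptors as a record.
    happy_earth = {
        "id": 1042,  # placeholder; use your grants management system's ID
        "grant_program": "Environment",
        "grant_portfolio": "Healthy Rivers",
        "cohort": "Healthy Rivers-Advocacy Building",
        "grant_amount": 300_000,
        "budget_size": 2_000_000,
        "org_founded": 1995,
        "organization_name": "Happy Earth Network",
        "minority_led": True,
        "office_location": "Montana",
    }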

Next, return to your Theory of Change and look at your indicators. These are, in Dedoose’s terms, going to be your “codes.” Now, coding is a big deal. It’s the most important reason why you would turn to data analysis software in the first place. Making up precise, targeted, and relevant codes is what will enable you to call up useful grant report content and transform a 15-page narrative report from Happy Earth Network into a powerful data source. Being able to quickly pull up relevant data, separated out from a lot of material that you don’t need, will help you generate communications content, write to-the-point docket reports enlivened with relevant grantee quotes, and be ready to analyze the coded content to spot trends, gather evidence of them, and analyze them. In turn, all this will help you distribute more targeted, responsive grants and discern whether your grantmaking strategy needs to change in order to better reach your objectives. When I’ve used codes to look for problems in a grantmaking approach, that docket report section pretty much writes itself once I see the relevant data excerpted from grant reports on that topic.

I suggest starting with the following broad categorical codes: (a) grant impact on the organization and/or its staff, (b) the organization’s impact on the field, audiences/public, and/or influentials, (c) challenges the grantee is facing, (d) grantee suggestions for improving your foundation's grantmaking, (e) grantee's praise of your foundation/board/staff, (f) board/boss-specific information, (g) quotes that can be used for communications and docket reports, and (h) items to monitor. Code names should be concise, so just use "monitor" instead of "items to monitor."
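
Here's what that starter codebook might look like written out: short, consistent code names mapped to what each one covers. This is just a reference sketch (in Dedoose you build the code list in the web interface), and the short names are my own suggestions.

    # Hypothetical starter codebook: concise code name -> what it captures.
    codebook = {
        "impact on grantee": "grant impact on the organization and/or its staff",
        "grantee's impact": "the organization's impact on the field, public, influentials",
        "challenge": "challenges the grantee is facing",
        "suggestion": "grantee suggestions for improving our grantmaking",
        "praise": "grantee's praise of our foundation/board/staff",
        "board/boss": "board- or boss-specific information",
        "quote": "quotes usable in communications and docket reports",
        "monitor": "items to follow up on",
    }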

The codes you select are also quite personal and should reflect the character, interests, and objectives of your foundation. Let's pretend that an interest in youth is consistently shared across all your foundation's grantmaking programs, so I've used the code "youth impact" for examples of impact on youth.

Keep in mind, codes should be meaningful enough that everyone in your foundation applies the same codes consistently. (Note, Dedoose allows individual users to review their coded work and compare it with others', which will help avoid inter-rater reliability issues.) Ultimately, you want just the right number of codes to find the information you need--not so broad that they bring up meaningless data, not so fine that the information you want doesn't come up, and not so many that you're overwhelmed by codes.

Here’s an example of coding Happy Earth Network’s interim grant report. You can see the codes I made up in the bottom right box labeled “Codes.” In the large field is the grant report. I thought that these two sentences, which I highlighted and then made into an "excerpt," indicate how Happy Earth Network used its grant, how it is affecting public perception, and how it is exciting youth interest in its cause. Hence, I assigned the following codes: "impact on grantee," "youth impact," and "grantee's impact." You can assign as many codes as you like to your selection. The idea is that every time you want to generate a report of all content that was assigned a code, such as "impact on grantee," only this content will be gathered and displayed together, with all other content filtered out.
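
In data terms, that excerpt is just a chunk of text carrying its codes with it. Here's a minimal sketch of the idea (again, my own illustration, not Dedoose's internals); the actual sentences from my made-up report aren't reproduced here.

    # Hypothetical sketch: one coded excerpt from the interim grant report.
    excerpt = {
        "grantee": "Happy Earth Network",
        "source": "2012 interim grant report",
        "text": "...",  # the two highlighted sentences would go here
        "codes": ["impact on grantee", "youth impact", "grantee's impact"],
    }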



Here's another example: I selected these sentences (highlighted in green), which are about Happy Earth Network's current organizational challenge (they have a hard time retaining scientists in their rural, low-paying community). At some point, when I want to review all the different kinds of challenges my grantees are facing, I can generate a report of just those selections coded "staff challenge" to analyze for any trends. I also coded this same selection "monitor" to remind me that I want to follow up on how Happy Earth Network is doing with hiring and retaining skilled staff.


And, here's what the document looks like when I'm done coding. Notice that some sentences don't get any coding at all, while others (as in the examples above) got one or more codes.


It looks quite messy, but you'll never need to look at this document in this state again. (If you want to read original grant reports, remember that your grants management system is the best place for managing and reviewing this kind of material. Data analysis instruments are just for analyzing the content.) Once coding is done, you can forget about reading this grant report in its long narrative format. You, or your program associate, read it in order to code it; from then on you just want to be able to call up the relevant bits. As an example, let's say you want to review only information that is relevant to how your grantmaking affected your grantees. Remember, you have a code for this, so you can export all data coded "impact on grantee." Here's an example of how Dedoose exports this information (you can export it as an Excel or Word document--either one makes it easy for you to cut and paste for your report writing).
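
And if you'd rather script that last step than cut and paste, a few lines of Python can pull the quotes out of the Excel export. A minimal sketch, assuming a column that holds the excerpt text--the file and column names below are placeholders, since the exact headers depend on your Dedoose project:

    import pandas as pd  # requires pandas and openpyxl

    # Hypothetical file and column names; check your own export's headers.
    df = pd.read_excel("impact_on_grantee_export.xlsx")

    # Print each excerpt as a quote-ready bullet for a docket report draft.
    for quote in df["Excerpt"]:
        print(f'- "{quote}"')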

I've only coded one organization in this example, Happy Earth Network, but if we had other grantees' reports, their data would also come up under this code assignment. Keep in mind that you can upload and code not only grant reports, but also your site visit notes, transcripts of recordings, reviewers' notes, media coverage--anything that can be selected as text for you to assign codes. Also, remember your descriptors? You can apply a filter for only those types of organizational characteristics you want to examine. For instance, you can review the codes of only organizations in your grant portfolio (1st descriptor) that are in the current cohort (2nd descriptor).
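
Descriptor filtering is easy to picture in the same sketch form. Below, each coded excerpt carries its grantee's descriptors, and we keep only excerpts from organizations in the Healthy Rivers portfolio's current cohort. Dedoose does this with point-and-click filters; the structure and field names are, as before, my own hypothetical shorthand.

    # Hypothetical sketch: filter coded excerpts by two descriptors.
    all_excerpts = [
        {
            "grantee": "Happy Earth Network",
            "codes": ["impact on grantee"],
            "descriptors": {
                "grant_portfolio": "Healthy Rivers",
                "cohort": "Healthy Rivers-Advocacy Building",
            },
        },
        # ...excerpts from other grantees' reports would join this list...
    ]

    def in_current_cohort(excerpt):
        """True if the excerpt's grantee is in the portfolio's current cohort."""
        d = excerpt["descriptors"]
        return (d["grant_portfolio"] == "Healthy Rivers"
                and d["cohort"] == "Healthy Rivers-Advocacy Building")

    matches = [e for e in all_excerpts if in_current_cohort(e)]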


You can use Dedoose as an individual program officer, or the tool can be used throughout the foundation. There's healthy competition in the world of data analysis software, with vendors trying to make these products easier and more powerful to use. A couple of years ago, when I didn't know about Dedoose, I would never have suggested data analysis tools for nonprofits unless they were actually doing social science research and high-end evaluations. But I stumbled on Dedoose for a project and found out firsthand how they made this sophisticated kind of program easy to use for any social change-oriented organization. What I love is that you can upload and code just about anything text- or video-based, it works on a Mac or PC, and it's remotely accessible because it is web- and not desktop-based. Dedoose will help you track, organize, and discern evidence of impact. So the next time you're asked for data about how your foundation's grantmaking is affecting your grantees or the field, you can say, "I can have it to you today!"




 