Clark Quinn says “Yes, you (we!) do have to change”

Learnlets » Yes, you do have to change. Clark Quinn nails a good one today. In writing about effective learning, he notes:

And that’s assuming courses are all the learning unit should be doing, but increasingly we recognize that that’s only a small proportion of what makes important business outcomes, and increasingly we’re recognizing that the role needs to move from instructional designer to performance consultant. More emphasis can and should be on providing performance resources and facilitating useful interactions rather than creating courses. Think performance support first, and communities of practice, only resorting to courses as a last result.

This rang my bell as I’m starting to do some research for a client on effective approaches to more distributed workplace learning. (More on this later, as I want to pick your collective brains.) I’m deeply interested in these intersections of roles.

What do we know about the costs of moving towards more performance support in globally distributed, not-all-in-one-organization contexts? (I’m talking international development here!) Ideas?

Just a Reminder – Chocolate Guinness Cake Day

Experimentation: chocolate cakes and communicators | Full Circle Associates. Keeping up my reblogging of this St. Paddy’s Day treat.

This year I used a different recipe, and then altered it for a slightly healthier version.

Chocolate Guinness Cake
Ingredients
1 3/4 cups all-purpose flour
3/4 cup natural (not Dutch-processed) cocoa powder – I upped the antioxidant power to 1 cup. I love chocolate…
1 3/4 teaspoons baking powder
1/2 teaspoon baking soda
1/2 teaspoon ground cinnamon
2 sticks plus 5 tablespoons unsalted butter, softened – I used one stick of butter plus 3 tablespoons, and 3/4 cup pumpkin puree
2 1/4 cups firmly packed light brown sugar – I used just over a cup of coconut sugar
3 large eggs
1 1/2 teaspoons vanilla extract
1 1/2 cups Guinness stout (do not include foam when measuring)
1 cup coarsely chopped pecans – I added a few more nuts, plus 3/4 cup unsweetened shredded coconut
Confectioners’ sugar for dusting

Method
1. Position a rack in the center of the oven and preheat the oven to 325°F (160°C). Grease the bottom and sides of a 9-by-3-inch round cake pan or springform pan. Dust the pan with flour.

2. Sift together the flour, cocoa powder, baking powder, baking soda, and cinnamon into a medium bowl. Whisk to combine, and set aside.

3. In the bowl of an electric mixer, using the paddle attachment, beat the butter at medium-high speed until creamy, about 1 minute. Gradually add the brown sugar and beat at high speed until very light and creamy, about 3 minutes. Reduce the speed to medium-low and add the eggs one at a time, beating well after each addition and scraping down the sides of the bowl with a rubber spatula as necessary. Beat in the vanilla extract. Reduce the speed to low and add the dry ingredients in three additions, alternating with the stout in two additions and mixing just until blended. Add the pecans and mix just until combined. Remove bowl from the mixer stand and stir a few times with the rubber spatula to make sure the batter is evenly blended. Scrape the batter into the prepared pan and smooth the top.

4. Bake the cake for 70 to 75 minutes, until a cake tester inserted into the center comes out clean. Cool the cake in the pan on a rack for 20 minutes. I cooked mine for 70 minutes in a convection oven. 

5. Invert the cake onto the rack and cool completely. With the springform pan, I just slipped the bottom out. This cake is delightful served warm.

6. Just before serving, dust the top of the cake lightly with confectioners’ sugar. Store in an airtight container at room temperature for up to a week. I did consider making the cream cheese frosting from the NYTimes version, but I resisted. Have a bit of Guinness with your cake as a beverage choice, or a nice cup of coffee!

 

From: http://www.leitesculinaria.com/recipes/cookbook/choc_guinness_cake.html

Data, Transparency & Impact Panel → a portfolio mindset?

Yesterday I was grateful to attend a panel presentation by Beth Kanter (Packard Foundation Fellow), Paul Shoemaker (Social Venture Partners), Jane Meseck (Microsoft Giving) and Eric Stowe (Splash.org), moderated by Erica Mills (Claxon). First of all, from a confessed short attention spanner, the hour went FAST. Eric tossed great questions for the first hour, then the audience added theirs in the second half. As usual, Beth got a Storify of the Tweets and a blog post up before we could blink. (Uncurated Tweets here.)

There was much good basic insight on monitoring for nonprofits and NGOs. Some of my favorite soundbites include:

  • What is your impact model? (Paul Shoemaker, I think. I need to learn more about impact models.)
  • Are you measuring to prove, or to improve? (Beth Kanter)
  • Evaluation as a comparative practice (I think that was Beth)
  • Benchmark across your organization (I think Eric)
  • Transparency = Failing Out Loud (Eric)
  • “Joyful Funeral” to learn from and stop doing things that didn’t work out (from Mom’s Rising via Beth)
  • Mission statement does not equal IMPACT NOW. What outcomes are really happening RIGHT NOW? (Eric)
  • Ditch the “just in case” data (Beth)
  • We need to redefine capacity (audience)
  • How do we create access to and use all the data (big data) being produced out of all the M&E happening in the sector? (Nathaniel James at Philanthrogeek)

But I want to pick out a few themes that were emerging for me as I listened. These were not the themes of the terrific panelists, but I’d sure love to hear what they have to say about them.

A Portfolio Mindset on Monitoring and Evaluation

There were a number of threads about the impact of funders and their monitoring and evaluation (M&E) expectations. Beyond the challenge of what a funder does or doesn’t understand about M&E, funders clearly need to think beyond evaluation at the individual grant or project level. This suggests making sense across data from multiple grantees → something I have not seen a lot of from funders. I am reminded of the significant difference between managing a project and managing a portfolio of projects (learned from my clients at the Project Management Institute. Yeah, you Doc!). IF I understand correctly, project portfolio management is about the business case → the impacts (in NGO language), not the operational management issues. Here is the Wikipedia definition:

Project Portfolio Management (PPM) is the centralized management of processes, methods, and technologies used by project managers and project management offices (PMOs) to analyze and collectively manage a group of current or proposed projects based on numerous key characteristics. The objectives of PPM are to determine the optimal resource mix for delivery and to schedule activities to best achieve an organization’s operational and financial goals ― while honouring constraints imposed by customers, strategic objectives, or external real-world factors.

There is a little bell ringing in my head that there is an important distinction between how we do project M&E (which is often process-heavy and too short-term to look at impact in a complex environment) and being able to look strategically at our M&E across our projects. This is where we use the “fail forward” opportunities, iterating towards improvements AND investing in a longer view of how we measure the change we hope to see in the world. I can’t quite articulate it. Maybe one of you has your finger on this pulse and can pull out more clarity. But the bell is ringing and I didn’t want to ignore it.

This idea also rubs up against something Eric said which I both internally applauded and recoiled from. It was something along the lines of “if you can’t prove you are creating impact, no one should fund you.” I love the accountability. I worry about how to actually do this meaningfully in a) very complex nonprofit and international development contexts, and b) for the next reason…

Who Owns Measurement and Data?

Chart from Effective Philanthropy 2/2013

There is a very challenging paradigm in nonprofits and NGOs: the “helping syndrome,” the idea that we who “have” know what the “have nots” need or want. This model has failed over and over again, and yet we still do it. I worry that this applies to M&E as well. So first of all, any effort towards transparency (including owning and learning from failures) is stellar. I love what I see, for example, on Splash.org, particularly their Proving.it technology. (In the run-up to the event, Paul Shoemaker pointed to this article on the disconnect in information needs between funders and grantees.) Mostly I hear about the disconnect between funders’ information needs and those of the NPOs. But what about the stakeholders’ information needs and interests?

Some of the projects I’m learning from in agriculture (mostly in Africa and SE/S Asia) are looking towards finding the right mix of grant funding, public (government and international) investment, and local ownership (vs. an extractive model). Some of the more common examples are marketing networks for farmers to get the best prices for their crops, lending clubs, and local entrepreneurs filling new business niches associated with basics such as water, food and housing. The key is ownership at the level of the stakeholders/people being served/impacted/etc. (I’m trying to avoid the word users as it has so many unintended other meanings for me!)

So if we are including these folks as drivers of the work, are they also the drivers of M&E and, in the end, the “owners” of the data produced? This is important not only because for years we have measured stakeholders and rarely been accountable to share that data, or to actually USE it productively, but also because change is often motivated by being able to measure change and see improvement. Ten more kids got clean water in our neighborhood this week. 52 wells are now being regularly serviced, and local business people are increasing their livelihoods by fulfilling those service contracts. The data is part of the on-the-ground workings of a project, not a retrospective to be shoveled into YARTNR (yet another report that no one reads).

In working with communities of practice, M&E is a form of community learning. In working with scouts, badges are incentives, learning measures and just plain fun. The ownership is not just at the sponsor level. It is embedded with those most intimately involved in the work.

So stepping back to Eric’s staunch support of accountability, I say yes, AND that full ownership of that accountability sits with all involved, not just the NGO/NPO/funder.

The Unintended Consequences of How We Measure

The question of who owns M&E and the resulting data brings me back to the complexity lens. I’m a fan of the Cynefin Framework to help me suss out where I am working – simple, complicated, complex or chaotic domains. Using the framework may be a good diagnostic for M&E efforts, because when we are working in a complex domain, predicting cause and effect may not be possible (now or into the future). If we expect M&E to determine whether we are having impact, this implies we can predict cause and effect and focus our efforts there. But things such as local context may mean that everything won’t play out the same way everywhere. What we are measuring may end up having unintended negative consequences (this HAS happened!). Learning from failures is one useful intervention, but I sense we have a lot more to learn here.

Some of the threads about big data yesterday related to this: again, a portfolio mentality looking across projects and data sets (calling Nathaniel James!). We need to do more iterative monitoring until we know what we SHOULD be measuring. I’m getting out of my depth again here (Help! Patricia Rogers! Dave Snowden!). The point is, there is a risk of being simplistic in our M&E and a risk of missing unintended consequences. I think that is one reason I enjoyed the panel so much yesterday, as you could see the wheels turning in people’s heads as they listened to each other! 🙂

Arghhh, so much to think about and consider. Delicious possibilities…

 Wednesday Edit: See this interesting article on causal chains… so much to learn about M&E! I think it reflects something Eric said (which is not captured above) about measuring what really happens NOW, not just this presumption of “we touched one person therefore it transformed their life!!”

Second edit: Here is a link with some questions about who owns the data… may be related http://www.downes.ca/cgi-bin/page.cgi?post=59975

Third edit: An interesting article on participation with some comments on data and evaluation http://philanthropy.blogspot.com/2013/02/the-people-affected-by-problem-have-to.html

Fourth Edit (I keep finding cool stuff)

The public health project is part of a larger pilgrimage by Harvard scholars to study the Kumbh Mela. You can follow their progress on Twitter, using the hashtag #HarvardKumbh.

 

Jessica Lipnack’s Story of a Virtual Presentation Prep

There is a pile of good lessons on presentation preparation in Jessica’s blog post, Endless Knots: In the future, now: presenting virtually, but what I appreciated most were all the layers around virtual collaboration: walking the talk at every level when we talk about sustainability, virtual teams and collaboration. Sweet! Thanks, Jessica! Here is a snippet, but click in to read the whole thing.

OK, so what was so special about this? I always vet my presentations with clients and usually have a back-and-forth to fine-tune. This was the most global preparation I’ve ever done and I say this having done quite a number of these virtual presentations. And by the time we were done with all the preparation, Karl, Jacobina, and I felt like a team.