Article - Open Access.

Measuring impact: Program evaluation and design for social change

EMANS, Denielle; HEMPEL, Adina

Abstract:

The purpose of this paper is to introduce program evaluation and to stimulate discussion of the growing need for impact assessment in social design practice. By advancing social science assessment methodologies for the discipline of design, the paper argues for critical engagement with evaluation research in order to understand the overall effectiveness and quality of interventions. Framing the study as both a theoretical and practical space to blend interdisciplinarity with social responsibility, the paper discusses the implications for design's evolution as it shifts toward a measurable field of practice. The findings suggest integrating practical evaluation concepts into design practice, including the notions of rigor, causality, and theory of change. The resulting framework offers designers a way to delineate how community resources, environmental components, and contextual variables interact to produce the intended outcomes of a social design intervention.

Keywords: Design for social change, program evaluation, impact assessment, design thinking

DOI: 10.5151/despro-icdhs2016-03_021

References
  • [1] Amott, N. & Mackinnon, A. (2006). ‘Mapping Change: Using a Theory of Change to Guide Planning and Evaluation’. GrantCraft. Available: http://www.grantcraft.org/guides/mapping-change
  • [2] Brown, T. (June 2008). ‘Design Thinking’. Harvard Business Review, 88.
  • [3] Brown, T. (2009). Change By Design: How design thinking transforms organizations and inspires innovation, Harper Collins.
  • [4] Brown, T. & Wyatt, J. (Winter 2010). ‘Design Thinking for Social Innovation’. Stanford Social Innovation Review, 32.
  • [5] Emans, D. & Hempel, A. (2014). ‘Hybrid-learning for social design’. Tasmeem: 3. Available: http://dx.doi.org/10.5339/tasmeem.2014.3
  • [6] Henry, G. & Mark, M. (2003). ‘Beyond Use: Understanding Evaluation’s Influence on Attitudes and Actions’. American Journal of Evaluation, 24(3), pp. 293–314.
  • [7] Hermans, L., Naber, A., & Enserink, B. (2012). ‘An approach to design long-term monitoring and evaluation frameworks in multi-actor systems—A case in water management’. Evaluation and Program Planning, 35, pp. 427–438.
  • [8] Krippendorff, K. (2004). Content analysis: An introduction to its methodology, 2nd edition, Thousand Oaks, CA: Sage.
  • [9] Lee, C. & Nowell, B. (2015). ‘A Framework for Assessing the Performance of Nonprofit Organizations’. American Journal of Evaluation, 36(3), pp. 299–319.
  • [10] Lee, Y.F., Altschuld, J.W., & Lee, L.S. (2012). ‘Essential competencies for program evaluators in a diverse cultural context’. Evaluation and Program Planning, 35, pp. 439–444.
  • [11] Linnell, D., Radosevich, Z., & Spack, J. (2002). ‘The Guide for Successful Nonprofit Management’. Third Sector New England. Available: http://tsne.org/evaluation-best-practices-resources-nonprofit-experts
  • [12] Morino, M. (2011). ‘Leap of Reason: Managing to Outcomes in an Era of Scarcity’. Venture Philanthropy Partners. Available: http://www.vppartners.org/sites/default/files/documents/LOR_Full_Version_Facing_Pages.pdf
  • [13] Organizational Research Services (2004). ‘Theory of Change: A Practical Tool For Action, Results and Learning’. Annie E. Casey Foundation. Available: http://www.aecf.org/m/resourcedoc/aecf-theoryofchange-2004.pdf
  • [14] Parkhurst, M. & Preskill, H. (2015). ‘Collective Insights on Collective Impact. Learning in Action: Evaluating Collective Impact’. Stanford Social Innovation Review. Available: http://ssir.org/supplement/collective_insights_on_collective_impact
  • [15] Patton, M.Q. (1997). Utilization-focused evaluation: The new century text. 3rd edition, Thousand Oaks, CA: Sage.
  • [16] Plattner, H. (2011). Foreword. In Design Thinking: Understand – Improve – Apply, Berlin: Springer, XIV.
  • [17] Preskill, H., Parkhurst, M., & Juster, J.S. (2014). ‘Guide to Evaluating Collective Impact’. FSG. Available: http://collectiveimpactforum.org/resources/guide-evaluating-collective-impact
  • [18] Rachel, L. (2006). ‘Guidelines for Impact or Outcome Evaluation’. Gender and Development Group, World Bank. Available: http://siteresources.worldbank.org/INTGENDER/Resources/UNIFEMEvaluationGuidelinesFinal.pdf
  • [19] Rogers, P.J. & BetterEvaluation (2012). ‘Introduction to Impact Evaluation’. InterAction and Rockefeller Foundation. Available: https://www.interaction.org/sites/default/files/1%20-%20Introduction%20to%20Impact%20Evaluation.pdf
  • [20] Stern, E. (2015). ‘Impact Evaluation: A Guide for Commissioners and Managers’. Bond for International Development. Available: https://www.bond.org.uk/data/files/Impact_Evaluation_Guide_0515.pdf
  • [21] Thomas, D. (2006). ‘A General Inductive Approach for Analyzing Qualitative Evaluation Data’. American Journal of Evaluation, 27(2), pp. 237–246.
  • [22] USAID (2013). ‘Performance Monitoring and Evaluation: Tips for rigorous impact evaluation’. Available: http://pdf.usaid.gov/pdf_docs/Pnadw119.pdf
  • [23] Visocky O’Grady, K., & Visocky O’Grady, J. (2006). A Designer’s Research Manual. Success in Design, Rockport Publishers.
  • [24] Wassenich, P., & Whiteside, K. (2004). ‘CDD Impact Assessments Study: Optimizing Evaluation Design Under Constraints’, Social Development, The World Bank. Available: http://catalog.ihsn.org/index.php/citations/21109
How to cite:

EMANS, Denielle; HEMPEL, Adina; "Measuring impact: Program evaluation and design for social change", pp. 302-307. In: Wong, Wendy Siuyi; Kikuchi, Yuko & Lin, Tingyi (Eds.). Making Trans/National Contemporary Design History [=ICDHS 2016 – 10th Conference of the International Committee for Design History & Design Studies]. São Paulo: Blucher, 2016.
ISSN 2318-6968, DOI 10.5151/despro-icdhs2016-03_021
