EVALUATION OF PUBLIC INTERVENTIONS IN A COMPLEX ENVIRONMENT: DEVELOPING GENERALIZABLE KNOWLEDGE FROM CASE STUDIES
DOI: https://doi.org/10.12775/JPM.2019.003

Keywords
governance, theory-oriented evaluation, case study, causal mechanisms, realist synthesis

Abstract
Purpose: There is a growing need to view public interventions not only through their effects but also to consider how those effects are produced. The problem is that we live in a complex world characterised by feedback loops and by adaptation on the part of both those delivering and those receiving an intervention. The case study approach, with its explanatory power, need not be confined to one-off, discrete evaluations. The aim of this article is to confront the dilemma of developing generalizable knowledge from case study research and, on the basis of the extant literature, to suggest approaches that enhance its external validity and enable the formulation of middle-range theories.
Methodology: To this end, a literature review was carried out to identify techniques that can enhance the external validity of case studies, complemented by a review of evaluation reports from the Science and Innovation Policy Evaluations Repository (SIPER) to examine evaluation practice from the point of view of case study utilisation.
Findings: The case study approach is well placed to play an instrumental role in learning-oriented evaluation. Context and human agency matter, but they are difficult to capture with other approaches such as (quasi-)experimental designs. Although evaluators often resort to case studies, their full potential is not being exploited.
Originality/value: The paper contributes to the debate on how to increase the effectiveness of public policy instruments through greater use of case studies. It departs from traditional thinking about evaluation as a way of arriving at universal laws that apply anywhere, anytime. Instead, the significance of middle-range theories for practice is acknowledged.