On the MERI Road: Continuing the Journey for Outcome Reporting

Dr Ilisapeci Lyons and Philip E. Maher

Regional NRM Programs, Natural Resources and Environment

Queensland Department of Environment and Resource Management,

GPO Box 2454, Brisbane, Qld 4001, Australia.


There is an increasing expectation by all stakeholders that the outcomes of Natural Resource Management (NRM) activities can be demonstrated and that the investment of public monies is providing value for money. To achieve this, we need a much stronger focus on outcomes than we have seen in recent programs, including in the focus of monitoring and evaluation (M&E) (Pannell 2008).
Monitoring to provide timely, reliable and accurate data and information on the condition of our natural resources, and their responses to human use, is fundamental to natural resource management (EPA WA 2003). Evaluation of those data to provide evidence of the efficiency, effectiveness and appropriateness of NRM activities is then used to identify the difference between actual and desired NRM outcomes and performance.
The benefits of an outcome reporting approach to NRM include (EPA WA 2003):

  • improved capacity to report on the immediate outputs and the intermediate and longer-term outcomes of the activities;

  • better informed environmental policy and decision making;

  • increased ability to measure the effectiveness of actions taken on the ground; and

  • development of a platform to move towards continuous improvement in NRM.
Lane, Robinson and Taylor (2009) state that performance audits (ANAO 2001, 2004, 2008) have emphasised the need for a clearer focus on the specific outcomes of regional bodies. Institutions, government agencies and other concerned stakeholders agree that there is an ongoing need to collect evidence to confirm that the NRM programs currently underway in Australia are adequately achieving the anticipated national or state outcomes (ANAO 2008).
In response to this, the Queensland Department of Environment and Resource Management (DERM) has undertaken a scoping study of Queensland Regional NRM Bodies to gauge their capacity and interest in being involved in furthering outcome reporting for NRM as well as the types of data available and the usefulness of that data to support such an initiative.
The scoping study was undertaken with the support and involvement of the Queensland Regional NRM Groups Collective, Regional NRM Bodies (RBs) and the Regional NRM Programs business unit within DERM, with a view to better demonstrating the outcomes of NRM activities undertaken in Queensland. The scoping exercise had several aims. These were to:


  • Review and gauge the use of MERI plans in the regions and determine their usefulness in developing state-level outcome reporting;

  • Gain an overview of the monitoring programs in the regions and how the regions measure success;

  • Define, with the regions, the scale for outcome reporting;

  • Establish, with the regions, the level of resources available and the contribution they can make to outcome reporting; and

  • Learn from the regions the role and contribution they would like DERM to have in outcome reporting.

A particular interest of Regional NRM Programs is to gather information on the effectiveness of the regional delivery model. This paper presents some of the challenges of establishing outcome reporting and working with multiple stakeholders to report on program achievements.


ENGAGEMENT
One of the main challenges of this work is that it involves working with multiple stakeholders who have different capabilities, access, resources, community make-up and NRM priorities. The success of any form of engagement requires the investment of funding and the commitment of time (Guthrie et al. 2005). This project involved extensive discussions with RB representatives, with dialogue occurring at various levels from management to field operations. Other factors considered in the engagement process included the benefits of involvement and the level of interest from regional representatives in the work. Discussions in this study involved workshops, individual and group meetings, and phone conversations.
Department of Communities (2005) and Sarkissian et al. (2008) argue that the onus is on government to establish a process with communities that they have confidence in. In undertaking this project, Regional NRM Programs adopted an open, adaptive approach to defining not only what outcome reporting might look like but also the process and forums for discussion.
The IAP2 Public Participation Spectrum offered a useful guide for this work in defining the types of engagement that have occurred and are taking place.


  • Inform. Public participation goal: to provide the public with balanced and objective information to assist them in understanding the problem, alternatives, opportunities and/or solutions. Promise to the public: we will keep you informed.

  • Consult. Public participation goal: to obtain public feedback on analysis, alternatives and/or decisions. Promise to the public: we will keep you informed, listen to and acknowledge concerns and provide feedback on how public input influenced the decision.

  • Involve. Public participation goal: to work directly with the public throughout the process to ensure that public concerns and aspirations are consistently understood and considered. Promise to the public: we will work with you to ensure that your concerns and aspirations are directly reflected in the alternatives developed and provide feedback on how public input influenced the decision.

  • Collaborate. Public participation goal: to partner with the public in each aspect of the decision, including the development of alternatives and the identification of the preferred solution. Promise to the public: we will look to you for direct advice and innovation in formulating solutions and incorporate your advice and recommendations into the decisions to the maximum extent possible.

  • Empower. Public participation goal: to place final decision-making in the hands of the public. Promise to the public: we will implement what you decide.

The IAP2 Public Participation Spectrum (Source: www.iap2.org.au, accessed 20 June 2011).
The form of engagement supported in this project is a combination of ‘Inform’, ‘Consult’, ‘Involve’ and ‘Collaborate’.


  • Inform: Performance reporting was established to report against Australian Government business plan targets. Performance reporting includes outputs and standard output codes, as well as narrative reports on project progress against milestones.

  • Consult: The performance reporting process has been adapted over time through feedback from both the Australian Government and RBs.

  • Involve and Collaborate: Previous pilot work on developing State of Region reporting and the current project on outcome reporting have involved active engagement with regional bodies to define the reporting process and content.

Sarkissian et al. (2008) inform us that trust is the key factor that determines whether engagement succeeds or fails and whether participants are involved for the long or short term. Trust between participants is especially important in this project, where a greater level of participation is being pursued from regional bodies and performance reporting is viewed as an opportunity for validation or dialogue about change.


Adopting a Participative Approach
Donaldson and Lipsey (2006) state that engaging stakeholders in the design and process of evaluation improves the chance that the evaluation will be useful, feasible and accurate, and will meet the requirements of stakeholders. A participatory approach is being adopted because of the complexity of this project, where there are different stakeholder interests, capacities and NRM priorities, and the challenge of capturing outcome information. Additional support for the adoption of a participatory approach as the basis for this work is provided by Wadsworth (2011), Patton (1997) and Posavac (2011), who state that participatory evaluation raises challenges for participants, who are encouraged to question their assumptions about defining progress.
Participatory Evaluation
The first step in any evaluation exercise is to engage stakeholders to scope the activity (Patton 1997; Donaldson and Lipsey 2006). Guba and Lincoln (1989) caution that careful consideration must be exercised in determining how stakeholders are involved in evaluative work, as the exercise can ‘enfranchise or disenfranchise’ stakeholder groups. In this exercise the different RBs were invited and engaged in our discussions.
The perspective taken in this work is that success will be determined with the RBs. This is consistent with Donaldson and Lipsey (2006), Patton (1997), Wadsworth (2011), Owen (2006), Alkin et al. (2006), Datta (2006) and Guba and Lincoln (1989), who state that participants must determine their evaluation questions, what they want to evaluate and measure, and the methods used, and have an active role in the analysis and interpretation of the data. If an inclusive evaluation process is used, it will require a flexible design to cope with the complexities and varying capacities of the RBs (Wadsworth 2011, Owen 2006). A guiding state framework will also be critical.

In developing outcome reporting, Regional NRM Programs will work with:



  • local resources and capacities (Patton 1997, Donaldson and Lipsey 2006);

  • the knowledge and experience of the stakeholders (Patton 1997, Owen 2006, Donaldson and Lipsey 2006);

  • the data sets, analyses and interpretations submitted by the RBs (Patton 1997, Wadsworth 2011);

  • the RBs in the decision-making process (Patton 1997, Donaldson and Lipsey 2006, Wadsworth 2011); and

  • the RBs to communicate the findings (Alkin et al. 2006, Patton 1997).

This process is supported by Posavac (2011) and Alkin et al. (2006), who state that reporting is a negotiated outcome, which includes the structure and the message delivered in a report.


The questions we want answered from outcome reporting are:

  • What are the overall impacts and legacy of NRM programs in Queensland?

  • What is the state of assets, and what change has occurred over time, against planned immediate, intermediate and long-term outcomes?

  • What are the social and economic impacts of the programs at the regional scale?

  • What are the types of change occurring as a result of the RB activities?

Outcome reporting will also be used by Regional NRM Programs to:



  • inform future investment;

  • improve program delivery and target support to the regional bodies; and

  • confirm the value of the regional delivery model for NRM in Queensland.


Other Monitoring, Evaluation, Accounting and Reporting Processes
An important requirement in this work was to explore and get to know the different players working in NRM across the monitoring, evaluation and reporting (MER) field. The reason for this was that we wanted to explore the possibility of building on existing processes and data collection methods and, where possible, using available data as a basis for the outcome reporting.
An important consideration for Regional NRM Programs is that outcome reporting should not place greater requirements on already stretched resources in the regions.
Working with Existing Reporting Processes: What Do They Offer?
The main M&E programs occurring in the regions include:

  • State of Environment and State of the Region reporting undertaken with the state government (in some regions); and

  • Monitoring, Evaluation, Reporting and Improvement (MERI) reporting run under the Australian Government’s Caring for our Country (CfoC) program (including formalised Program Logics and MERI Plans).

One of the first questions was how the MERI process is being used by RBs and what it can offer for NRM outcome reporting in Queensland. There was a mixed response to this question. While the regions value its discipline and logical process, the challenge is that MERI reporting is based on the Australian Government’s Caring for our Country business plan targets. For many RBs this leads to a situation where only the minimum reporting requirements are met, which falls well short of capturing the data required to demonstrate the broader outcomes of their work.


What the Regions Want
Performance reporting for NRM Programs is currently output-based. While the regions expressed satisfaction with current reporting processes, most are keen to be involved in a process that provides an opportunity to better report their overall achievements. A review of RB MERI and other M&E plans indicated that regional bodies are in fact collecting useful outcome-type data. However, RBs face constraints in developing outcome reporting, such as limited knowledge of program monitoring, limited capacity and resources to do outcome work, the lack of a support network to develop an M&E plan and system, and the requirement to report against CfoC national business plan targets and regional priorities.
Capacity and capability are critical factors that determine the type and coverage of outcome reporting each RB can undertake. RBs confirmed the need for outcome reporting to be, as far as possible, based on and aligned to existing processes, and that any “new” process needed to be matched to the capacity of the region to deliver. With limited resources available for additional data collection and reporting, RB representatives requested that a templated format be developed for outcome reporting, one that would set clear boundaries that regions could more easily respond to using existing resources.
It is difficult for RBs to report on changes to our natural resources as a result of their activities. While there is real interest and some limited progress (e.g. water) in the establishment of baseline data for long-term monitoring of environmental condition and land management practice, we as a nation remain bereft of a common set of baseline data covering the extent and condition of our natural resources (Wentworth Group et al. 2008).
The current move toward establishing a set of national environmental accounts offers an opportunity for Regional NRM Programs to work with a wider set of information and procedures that will contribute to developing a more complete picture of resource condition and change.
Scale of Reporting and Defining Success
Most of the monitoring work undertaken by the RBs is regionally based and remains project and program focused. Funding priorities and program outcomes have a strong influence on the M&E work undertaken by the regions. Each region collects data for a specific program purpose and at different scales, related to its technical capability, capacity, NRM situation and plan, and the requirements of the funding program. These circumstances make the aggregation, collation and organisation of information at the state level challenging.
CONCLUSION
Our scoping confirms that RBs want to incorporate outcome reporting into existing processes and work with existing reporting tools where possible. Suggestions offered by the RBs will be explored further, and the limitations of working within existing reporting systems will continue to be investigated. Consideration will be given to the use of all existing MER processes and systems, including those not incorporated in funded projects, where those systems and processes make significant contributions to environmental outcomes.
At this early stage of outcome reporting, the RBs have requested a templated format for brief and concise reporting on outcomes. The timing of outcome reporting is yet to be negotiated; the challenge for the regions is finding the capacity to address both the frequency and volume of current reporting requirements and the capture of data needed to demonstrate practice change and, ultimately, real changes to the environment.
A positive outcome of the study is confirmation that outcome reporting can emerge from current reporting processes. For Regional NRM Programs, this means working with the RBs to further develop a framework that exploits the opportunities presented in the current MER processes and uses available data, an approach that will be developed and supported over time.
Another important factor highlighted by the study is that, to establish a sustainable, ongoing outcome reporting process, additional support will need to be provided to the RBs.

Viewing outcome reporting as an evolving process will allow it to adapt to changes and developments such as the development of a national set of environmental accounts and the on-going planning work occurring in the regions. Our next step in this process is to develop a more encompassing guiding framework for NRM outcome reporting in Queensland.


Bibliography
Alkin, Marvin C., Christie, Christina A. and Rose, Mike (2006) Communicating evaluation, pp. 384–403. Sage Publications, London.
ANAO (2001) ‘Performance Information for Commonwealth Financial Assistance under the Natural Heritage Trust’. Audit Report No. 43 of 2000–01, p. 56. ANAO, Canberra.
ANAO (2004) ‘The Administration of the National Action Plan for Salinity and Water Quality’. Audit Report No. 17 of 2004–05. ANAO, Canberra.
ANAO (2008) ‘Regional Delivery Model for the Natural Heritage Trust and the National Action Plan for Salinity and Water Quality’. Audit Report No. 21 of 2007–08. ANAO, Canberra.
Datta, Lois-ellin (2006) The practice of evaluation: challenges and new directions, pp. 419–438. Sage Publications, London.
Department of Communities (2005) Engaging Queenslanders: community engagement in the business of government. State of Queensland.
Donaldson, Stewart I. and Lipsey, Mark W. (2006) Roles for theory in contemporary evaluation practice: developing practical knowledge, pp. 56–75. Sage Publications, London.
EPA WA (2003) State of the Environment Reporting Series Paper No. 1, ‘State Monitoring and Evaluation Framework Discussion Paper’. Environmental Protection Authority, Perth, Western Australia.
Guba, Egon G. and Lincoln, Yvonna S. (1981) Effective evaluation. Jossey-Bass Publishers, San Francisco.
Guba, Egon G. and Lincoln, Yvonna S. (1989) Fourth generation evaluation. Sage Publications, Newbury Park.
Guthrie, D., Bishop, P., Lawrence, G., Rolfe, J. and Cheshire, L. (2005) Engagement Government: a study of government community engagement for regional outcomes. Central Queensland University, Rockhampton.
International Association for Public Participation (IAP2) (2010) Public Participation Spectrum. http://www.iap2.org.au (accessed 20 June 2011).
Lane, M. B., Robinson, C. J. and Taylor, B. (2009) Contested Country: local and regional natural resources management in Australia. CSIRO Publishing, Collingwood, Victoria.
Owen, John M. (2006) Program evaluation: forms and approaches. 3rd edition. Allen & Unwin, Crows Nest.
Pannell, D. J. (2008) Pannell Discussions No. 122, 24 March 2008. School of Agriculture and Resource Economics, University of Western Australia. http://cyllene.uwa.edu.au/~dpannell/
Patton, Michael Q. (1997) Utilization-focused evaluation: the new century text. 3rd edition. Sage Publications, Thousand Oaks.
Posavac, Emil J. (2011) Program evaluation: methods and case studies. 8th edition. Prentice Hall, Boston.
Sarkissian, W., Hofer, N., Shore, Y., Vajda, S. and Wilkinson, C. (2008) Kitchen table sustainability: practical recipes for community engagement with sustainability. Earthscan, London.
Wadsworth, Yoland (2011) Everyday evaluation on the run: the user-friendly guide to effective evaluation. 3rd edition. Allen & Unwin, Crows Nest.
Wentworth Group of Concerned Scientists, Abal, E., Boully, L., Byron, N., Green, P., Lowe, I., Tarte, D. and Trewin, D. (2008) Accounting for nature: a model for building the national environmental accounts of Australia. Wentworth Group.
Note: this paper was presented at the Australasian Evaluation Society International Conference, Sydney, Australia, 29 August – 2 September 2011

