Evaluation of Investments in the Strengthening Management and Governance Programme

Appendix One: Methodology


This section describes in detail the methodology used in the evaluation. A summarised version is contained in the main body of the report (refer to section 1).

Project planning and design

A needs assessment was undertaken to establish how the evaluation would be most useful to key stakeholders and to inform the overall evaluation design.

Using the RUFDATA14 tool, the evaluation team met with the Policy Wāhanga and the Relationships and Information Wāhanga to discuss the direction and focus of the evaluation based on their respective needs and current work programmes.


14 Developed by Saunders (2000), RUFDATA is an abbreviation for the key questions asked of key stakeholders at the initial stages of the evaluation when drafting an evaluation plan. R stands for Reason for the evaluation, U for Use, F for Foci/Focus of the evaluation, D for Data or evidence of the evaluation, A for Audience of the evaluation, T for Timing, and A for Agency conducting the evaluation.

Evaluation approach

Through the needs assessment, the following evaluation objectives were identified:

  • To understand what constitutes good practice in governance and management for Māori organisations.
  • To understand the extent to which investments in governance and management have been effective, and the range of outcomes achieved.
  • To guide future investments in governance and management initiatives and to inform future programme design.

In addition, the evaluation was informed by a set of questions derived, in part, from the key outcomes in the draft governance and management outcomes model and programme outcomes relating to the SMG programme. The following questions were explored:

  • Did the SMG programme promote improvements in governance and management practices/outcomes (improved systems, infrastructure, communications, strategy)?
  • What impact (if any) did the SMG programme have on the performance of Māori organisations?
  • What areas of the SMG programme could be enhanced?

A mixed-method (quantitative and qualitative) approach, comprising a document review, an online survey and key informant interviews, was used to address the objectives of the evaluation.

Document review

SMG programme summary reports (and in some cases full reports), briefing papers, the assessment framework template and the 2005 SMG programme process evaluation were analysed to better understand the nature of the SMG programme and give focus to the evaluation. These documents were also used to inform the interview guide for the key informant interviews and the online survey.

Online survey

The table below shows the steps and activities that were undertaken for the online survey.

Table 4: Step-by-step procedure for the online survey

1. Planning process Determining resources – a programme plan was established identifying all the resources needed to implement the survey. These activities included:
  • generating survey questions (peer reviewed by the Research, Information and Monitoring Team and programme staff);
  • developing the online tool and hosting it on the Te Puni Kōkiri server;
  • testing the survey with a sample of Te Puni Kōkiri staff and one external expert, focusing on sufficiency and clarity of content, accessibility, interpretability and any issues with the survey tool;
  • generating an email list of all SMG programme participants from the Te Puni Kōkiri administrative database, with the list validated by programme and regional staff; and
  • informing Te Puni Kōkiri staff of the objectives of the evaluation.
2. Data collection
  • Implementing the survey – sending the web link to all respondents, with instructions on how to complete the survey.
  • Sending reminders to participants to complete the survey.
3. Data analysis Downloading and cleaning data from the Te Puni Kōkiri server and exporting into MS Excel for analysis.
4. Reporting Interpreting data and integrating with the qualitative findings.

The online survey was sent to all 110 organisations that had participated, or were currently participating, in the SMG programme. Best efforts were made to source up-to-date email addresses from regional staff; however, of the 110 organisations:

  • 13 organisations had invalid email addresses (and current email addresses were not able to be sourced);
  • six organisations had duplicate email addresses in the database; and
  • two organisations notified the evaluation team that the key person with knowledge of and information about the SMG programme had left the organisation.

As a result, the total population for the online survey was reduced to 89 organisations.

The online survey was initially run for two weeks (from 3 to 17 March 2008), which generated a 46% response rate. The evaluation team decided to extend the survey for another three weeks (until 7 April) to further increase the response rate. This resulted in only three additional completed surveys.

A total of 45 valid surveys (51%) were completed, which is above the average response rate for online surveys; the literature15 indicates that the average response rate for online surveys is between 30% and 40%.


15 Ritter & Sue (2007).
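The population and response-rate figures reported above can be reproduced with a short calculation (a sketch; all counts are taken directly from the text):

```python
# Survey population figures as reported in the text.
total_invited = 110
invalid_email = 13
duplicate_email = 6
key_person_left = 2

# Effective survey population after exclusions.
population = total_invited - invalid_email - duplicate_email - key_person_left
print(population)  # 89

# Valid completed surveys and the resulting response rate.
completed = 45
response_rate = completed / population
print(round(response_rate * 100))  # 51
```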

Key informant interviews

Interviews were held with six organisations, three assessors, and Te Puni Kōkiri regional and national office staff.

The six organisations were:

  • Te Rūnanga o Whaingaroa, Kaeo;
  • Solomon Group, Auckland;
  • Turanga Health, Gisborne;
  • Wainuiomata Christian Fellowship, Lower Hutt;
  • Te Rōpū Tautoko ki te Tonga, Dunedin; and
  • Awarua Social Services, Bluff.

Interviews were held with a range of stakeholders including organisation managers, directors, operational staff, board members and/or trustees.

The three assessors interviewed were:

  • KPMG, Wellington;
  • KCSM, Opotiki; and
  • Manukau Business Solutions, Auckland.

Te Puni Kōkiri regional directors and regional staff (responsible for the SMG programme in their region) in the regions of the six SMG organisations were interviewed, as were national office staff responsible for administering the SMG programme.

The interviews gathered data from organisations, assessors and Te Puni Kōkiri staff. Interviews were generally held at interviewees' workplaces or, at times, in other venues most convenient for the interviewees.

A semi-structured interview guide was used during the interviews and interviews were tape-recorded with the consent of the interviewees.

Table 5 provides a description of the actions undertaken by the evaluators for the key informant interviews.

Table 5: Procedure for the conduct of the key informant interviews

1. Planning process
  • Developing the interview guide and sending it to the former and current SMG project managers for feedback on the sufficiency of coverage and areas that the evaluation may need to focus on.
  • Selecting key informants – a purposive sample of the 110 Māori organisations was taken, resulting in six organisations being selected based on the following criteria:
    • Investment reach – a mix of investments at a regional and national level.
    • Assessors – a mix of assessors assigned to respective organisations.
    • Investment status – organisations that have participated at different stages of the programme (i.e. different financial years).
    • Nature of assessment – assessment and/or remedial work.
  • Selecting a random sample of three out of six assessors. The three assessors had all undertaken 10 or more assessments and/or remedial work.
2. Data collection Face-to-face interviews were conducted by two members of the evaluation team. Interviewees were informed of their rights prior to the interview and were asked to sign a consent form before the interview commenced.
3. Data analysis Content analysis using a thematic approach was utilised to analyse the fieldwork notes.
4. Report writing A draft report was submitted for peer review internally and externally before release.

A mixed-method approach was used to analyse the quantitative and qualitative data gathered.

Initially, the raw data from the online survey was analysed using Crystal Reports analytical software and then exported to Microsoft Excel for further analysis using frequency distributions and percentages, presented in tables, pie charts and bar graphs.
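As an illustration of the frequency-distribution step, the tabulation can be sketched as follows (a minimal sketch with hypothetical response categories; the actual analysis was done in Crystal Reports and Excel):

```python
from collections import Counter

# Hypothetical single-choice responses to one survey question.
responses = ["Agree", "Strongly agree", "Agree", "Disagree", "Agree"]

counts = Counter(responses)
total = len(responses)

# Frequency distribution with percentages, as would be tabulated in Excel.
for category, n in counts.most_common():
    print(f"{category}: {n} ({n / total:.0%})")
```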

The qualitative information (field notes and audio-tape material) was analysed through a thematic approach noting the ‘significant’ findings from the interviews.

The McKinsey 7S16 framework was also used to help organise and present the findings of the evaluation against the seven key elements of the 7S framework, namely: Strategy; Structure; Systems; Style; Skills; Staff; and Shared values. Further refinement of the findings and key themes was carried out to synthesise the findings and to simplify the presentation of the results.
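The grouping of findings under the 7S elements can be sketched as a simple mapping (the themes shown are hypothetical placeholders, not the evaluation's actual findings):

```python
# The seven elements of the McKinsey 7S framework used to organise findings.
ELEMENTS = ["Strategy", "Structure", "Systems", "Style",
            "Skills", "Staff", "Shared values"]

# Hypothetical (theme, element) pairs tagged during analysis.
tagged_themes = [
    ("Clearer strategic planning", "Strategy"),
    ("Improved financial systems", "Systems"),
    ("Board roles better defined", "Structure"),
]

# Group themes under each 7S element for presentation.
findings = {element: [] for element in ELEMENTS}
for theme, element in tagged_themes:
    findings[element].append(theme)

print(findings["Systems"])  # ['Improved financial systems']
```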

The findings from the online survey were integrated, where possible, with the qualitative findings to support the results or to present opposing arguments.

Information collected is presented as aggregated results and quotes are attributed anonymously to maintain the confidentiality of participants.


16 Peters & Waterman (1982).