RESULTS-BASED MANAGEMENT HANDBOOK
- Preliminary Sections
- Part 1: Overview of RBM
- Part 2: Results-Based Planning
- Part 3: Results-Based Monitoring and Reporting
- Part 4: Results-Based Evaluation
- Part 5: Capacity Building, Knowledge Management and Innovations in RBM
3.2 Monitoring and reporting for results at strategic and programme levels
The strategic results that UN-Habitat seeks to achieve are articulated in three aligned planning documents: the six-year strategic plan, the biennial strategic framework, and the biennial work programme and budget. As explained in the previous chapter, the six-year strategic plan is implemented through three consecutive biennial work programmes and budgets. For example, the 2014-2019 strategic plan is implemented through the 2014-2015, 2016-2017 and 2018-2019 biennial work programmes and budgets. Preparation for results-based monitoring and reporting on the programme results takes place during the planning process, when the results and the corresponding performance indicators that measure them are formulated, as explained in Part 2 of this Handbook.
Results-based monitoring and reporting requires a structured system or framework for the collection and analysis of performance information. A performance measurement framework is a plan to systematically collect relevant data over the time frame of the planned programme, to track and demonstrate progress made towards achieving the expected results. It documents the major elements of the monitoring system (see also page 78) and ensures that performance information is collected on a regular basis. It also contains information on baselines, targets and the responsibility for data collection.
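To make these elements concrete, one row of a performance measurement framework can be sketched as a small record. The field names and figures below are purely illustrative, not UN-Habitat's actual plan format:

```python
from dataclasses import dataclass

@dataclass
class IndicatorPlan:
    """One hypothetical row of a performance measurement framework."""
    indicator: str     # what is measured
    baseline: float    # value at the start of the planning period
    target: float      # expected value at the end of the period
    frequency: str     # how often data are collected
    responsible: str   # who is responsible for data collection

plan = IndicatorPlan(
    indicator="% of urban population with access to adequate housing",
    baseline=62.0,
    target=70.0,
    frequency="once per biennium",
    responsible="branch coordinator",
)

# Progress towards the target, relative to the baseline
# (assuming an illustrative measured value of 66%)
progress = (66.0 - plan.baseline) / (plan.target - plan.baseline)
print(f"{progress:.0%}")  # prints "50%"
```

Recording the baseline, target, frequency and responsibility together in this way is what allows progress to be computed and tracked consistently over the period.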
In UN-Habitat, the framework for measuring performance in the strategic plan and biennial work programme and budget comprises the results framework and the performance measurement plan, as well as the strategic framework. These frameworks therefore provide the basis for results-based monitoring and reporting and overall programme performance management for the organization. These main frameworks are accompanied by various tools that support the collection and analysis of the programme performance data as explained in subsequent sections under monitoring and reporting on the strategic plan and biennial work programme and budget. Table 15 shows the major tools that are used in monitoring and reporting on the strategic plan and biennial work programme and budget, and some aspects also apply to project level.
Monitoring the implementation of the strategic plan
In order to assess progress made in the implementation of its six-year strategic plan, UN-Habitat undertakes results-based monitoring of its programme performance. The results framework for the strategic plan defines “what to monitor”, and the performance measurement plan describes “how to monitor” the implementation and performance of the strategic plan. Together with other accompanying tools, the two documents constitute the main monitoring framework and system for tracking progress on the implementation of the strategic plan. Monitoring the implementation of the strategic plan entails tracking the following levels:
(a) Overall strategic results: Monitoring at this level entails assessment of higher level indicators (such as the percentage of urban population with access to adequate housing), through surveys and strategic impact evaluations. Performance information should be collected at least once during the six-year period of the strategic plan. As the performance information required for this level of results is largely dependent on surveys, censuses and studies carried out at national, regional and global levels by different institutions, the responsibility for ensuring identification and access to the required sources should be at corporate level. This should include the independent evaluation unit, directors and branch coordinators.
(b) Focus area strategic results: these are monitored on the basis of performance data on indicators collected at least once per biennium. Monitoring at this level also includes the documentation of results statements, tracked annually during the collection of programme performance information. This information is then used as the basis for external reviews and evaluations to assess UN-Habitat’s performance at the strategic results level. Branch coordinators and regional directors are responsible for monitoring the focus area strategic results, noting the contribution of UN-Habitat and that of partners.
(c) Expected accomplishments: Monitoring entails annual collection of data on indicators of achievement at the expected accomplishment level. Most indicators are quantitative, with specified variables to be measured (e.g. the number of partner cities that prepared local economic development plans, or the number of partner cities that set priorities based on local economic assessments) and data collected at the frequency defined in the Programme Performance Plan. Subprogramme coordinators have overall responsibility for monitoring progress towards expected accomplishments, with the support of Unit Heads and RBM Champions, collecting indicator data using data sheets once a year. This information is also recorded in IMDIS, which provides for six-monthly updates of interim or estimated values. The data collected are analysed annually for the preparation of the annual report, and again at the end of every biennium. Qualitative information relevant to the expected accomplishment is also captured as results statements in the annual progress report and as accomplishment account statements in IMDIS. Findings from evaluations of the respective focus areas carried out during the reporting period also provide valuable information to corroborate monitoring information, which is largely self-assessed.
(d) Outputs: Monitoring implementation of outputs that contribute to the achievement of the strategic plan results is also part of the monitoring of the implementation of the work programme and budget, because they are aligned. This is a continuous process undertaken by project managers at all levels in the organization as explained under project level monitoring and monitoring of the work programme and budget using PAAS and IMDIS.
Reporting on the implementation of the strategic plan
Reporting on the implementation of the six-year strategic plan is an internal UN-Habitat requirement. The annual progress report is mandated by the Governing Council (GC) of UN-Habitat through its resolutions. For example, the resolution on the strategic plan for 2014-2019 and the work programme and budget of the United Nations Human Settlements Programme for the biennium 2016-2017 “calls upon the Executive Director to report annually to Member States and, in consultation with the Committee of Permanent Representatives, to the Governing Council at its twenty-sixth session on progress made in resource mobilization, outcome-level performance, the implementation of the strategic plan and the work programme and budget, including evaluation in line with the results-based management framework” (resolution GC/25/3 of April 2015).
As the strategic plan is implemented through three successive biennial work programmes and budgets, the annual report includes progress made on the implementation of the work programme, while also capturing the cumulative progress in the implementation of the strategic plan through the programme performance indicators.
Since each of the subprogrammes/focus areas is implemented jointly by regional offices and thematic branches in a matrix fashion, the performance information for the report is provided by the branches, regional and country offices following guidelines provided by the Quality Assurance Unit.
Reporting on the implementation of the strategic plan follows results-based principles and should cover the major achievements in relation to the strategic results: progress on indicators of achievement measured against targets; results achieved at the expected accomplishment level; and resource utilization rates compared with budgets/allocations, with explanations for any variances. Reporting on UN-Habitat’s results should be guided by the principles of good results-based reporting.
Process of preparing the annual report: roles and responsibilities
Preparation of the annual progress report on the implementation of the six-year strategic plan starts in October and ends when the report is presented to the regular session of the CPR in the first quarter of the following year, usually in March. A summary of the document is presented to the Governing Council in alternate years. The process, which takes about four months, involves several steps and responsibilities, as explained and illustrated below.
The reports are consolidated by the Quality Assurance Unit using performance data and information from country, regional and focus area global activities, which are tracked by programme managers and other field staff using performance management data sheets and reporting templates. Each focus area/branch reports progress using reporting templates. Achievements at global, regional and country level are reported in separate paragraphs. Regional and country offices use reporting templates to report on focus areas or expected accomplishments as appropriate.
Global reporting includes the results achieved at the expected accomplishment and sub-expected accomplishment levels. Important outputs and processes that led to significant achievements may be reported. A results statement, which synthesizes the information on trends and conditions for indicators of achievement and targets from the data sheets (quantitative data), is also reported to strengthen the analysis of the progress made. At the country level, reporting should cover outstanding achievements in any of the focus areas as appropriate.
All the information provided should be validated using the evaluation reports and the six-monthly self-assessments by each branch, which are discussed by senior management. The strategic plan focal points should ensure that all the information provided by regional and country offices is cleared by the Regional Director before submission to the Quality Assurance Unit.
The Management and Operations Division provides information on budget utilization for each focus area. To demonstrate efficiency in the use of resources towards achievement of the planned results, the financial information should include statements on variances between estimate, allotment and expenditure. Branch Coordinators are responsible for providing the interpretation on resource utilization, and explanations for any variances as appropriate.
The data and information from the focal points is consolidated by the Quality Assurance Unit into a draft report, which is circulated to the senior management team for validation and comments. Once the comments have been incorporated, the revised report is discussed in a senior management performance review meeting to assess the performance of the agency, address emerging issues and provide management response and next steps.
The final draft with the management response is then presented by the Office of the Executive Director to the CPR sub-committee for programmes, for review and feedback. The final report is then prepared by the Quality Assurance Unit incorporating feedback from the CPR sub-committee and submitted for discussion by the regular session of the CPR.
The annual report is also used to meet the reporting requirements of UN-Habitat development partners (multi-year funding donors) as per the cooperation agreements. Providing reports on the results achieved to Member States and other stakeholders is a way of accounting for the resources entrusted to the organization in terms of results attained.
Monitoring and reporting on the implementation of the work programme and budget is a mandatory UN Secretariat self-assessment. As a member of the UN Secretariat, UN-Habitat uses the online Integrated Monitoring and Documentation Information System (IMDIS) (http://imdis.un.org/) for monitoring and reporting on its biennial work programme and the strategic plan. IMDIS is the Secretariat-wide system for programme performance monitoring and reporting, including the preparation of the Secretary-General’s Programme Performance Report.
The biennial work programme and budget, approved by the General Assembly, is uploaded in IMDIS and set up for monitoring of its implementation. IMDIS is designed to facilitate continuous and comprehensive programme implementation monitoring by staff at different levels within the same organizational unit, in accordance with their assigned roles and responsibilities.
Programme managers use IMDIS to track and record programme performance monitoring and reporting information on outputs, indicators and accomplishments within their particular area of responsibility. The Department of Management is able to verify progress and generate all necessary IMDIS information for organization-wide monitoring and reporting.
The system promotes accountability, transparency and information sharing. Step-by-step instructions on how UN-Habitat managers and reporting focal points use IMDIS for monitoring and reporting on programme performance are presented in section 3.2.4.
Evidence for programme performance delivery is collected at all levels: outputs and indicators of achievement as well as results, change and impact. In addition to the minimum evidence entered in IMDIS, UN-Habitat has responded to external audit recommendations to strengthen its evidence-based programme performance monitoring and reporting.
A separate database (Programme Performance Evidence Database) has been established where all documents that support evidence for delivery of the biennial work programmes are uploaded and stored either as documents or web links to sites where the relevant documents are located. These include hyperlinks to the intranet, PAAS, extranet and shared drives such as the K drive in Lotus Notes.
Monitoring and data collection in IMDIS
Monitoring of the work programme in IMDIS takes place along the results chain as shown in figure 33 below. The process involves tracking progress on implementation and recording achievement of results by collecting performance information on:
- Delivery of outputs, by category
- Expected accomplishments (outcomes), through:
  - Indicators of achievement
  - Accomplishment accounts (highlights of results achieved)
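A minimal sketch of the information tracked along this results chain, using a hypothetical nested record (IMDIS itself is an online system; this only illustrates the structure, with invented values):

```python
# Hypothetical results-chain record for one expected accomplishment.
chain = {
    "expected_accomplishment": "Improved planning capacity of partner cities",
    "indicators_of_achievement": [
        {"indicator": "Number of partner cities with local economic development plans",
         "baseline": 10, "target": 25, "interim_value": 18},
    ],
    "outputs": [
        {"title": "Training workshop on local economic assessment",
         "category": "training", "status": "implemented"},
        {"title": "Technical advisory mission",
         "category": "advisory services", "status": "in progress"},
    ],
    "accomplishment_account": "18 partner cities prepared plans, up from a baseline of 10.",
}

# Count the outputs already delivered for this accomplishment.
implemented = sum(1 for o in chain["outputs"] if o["status"] == "implemented")
print(implemented)  # prints 1
```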
The Department of Management issues advisory notes and guidelines to all agencies under the UN Secretariat that guide the process of monitoring and reporting on the implementation of the work programmes and budgets throughout the biennium.
The required actions to be undertaken in IMDIS within the six-monthly updates and reporting timeframes are communicated. UN-Habitat follows this monitoring and reporting cycle, which is summarized in table 18.
Authorised programme staff monitor and enter updates in IMDIS on the outputs, indicators of achievement, expected accomplishments and results attained through the implementation of their respective programme of work during a given biennium.
Each user can only update their own areas of responsibility, depending on their approved access rights, but can view the whole programme of work of the organization as well as other programmes. Information on progress towards the achievement of expected accomplishments is captured at the aggregate level.
Step-by-step instructions on how UN-Habitat managers and reporting focal points use IMDIS for monitoring and reporting on programme performance are discussed in detail in Section 3.2.4.
Monitoring progress towards achievement of expected accomplishments
(a) Verifying and updating indicator methodology
Data on indicators of achievement are used to monitor progress towards the achievement of expected accomplishments. The process of tracking progress in IMDIS starts with setting up, verifying and updating the indicator methodology, which defines how the indicator data will be collected, recorded and analysed. Branch coordinators and unit heads are responsible for defining, verifying and updating the indicator methodology for the expected accomplishments they are accountable for.
In analyzing the use of indicators as tools for reporting on accomplishments during a biennium, every effort should be made to rely on sound data collection methods. For that, programme managers need to define the variables that make up the indicator, identify data sources, determine data collection and verification methods, determine how often the measurements will be done, create a presentation format and identify external factors that could distort measurements.
This should be done early in the biennium so that the collection and reporting of results becomes less cumbersome. For UN-Habitat, the programme performance plan for the six-year strategic plan (2014-2019) contains all the basic information on how the strategic plan and the work programmes and budgets for the period will be monitored. Minor revisions will need to be made at the start of each biennium as appropriate to reflect changes made during the planning process for that biennium.
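The elements of an indicator methodology listed above (variables, data sources, collection and verification methods, frequency, presentation format and external factors) could be captured in a record such as the following; all names and values are illustrative, not an official UN-Habitat format:

```python
# Hypothetical indicator methodology record.
methodology = {
    "indicator": "Number of partner cities that prepared local economic development plans",
    "variables": ["partner city", "local economic development plan"],
    "data_sources": ["country office reports", "city council records"],
    "collection_method": "annual data sheet submitted by country offices",
    "verification_method": "cross-check against uploaded plan documents",
    "frequency": "annual",
    "presentation_format": "cumulative count per biennium",
    "external_factors": ["change of local government", "delayed national statistics"],
}

# Setting up the methodology early in the biennium makes it possible
# to check its completeness before data collection starts.
required = {"variables", "data_sources", "collection_method",
            "verification_method", "frequency"}
missing = sorted(required - methodology.keys())
print(missing)  # prints []
```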
(b) Tracking progress on expected accomplishments in IMDIS
- Updating performance measures (indicators)
- Indicators of achievement are primary sources of data for analysis of programme performance, and as such need to be relevant and reliable. The baseline and target values for each indicator in IMDIS should be aligned with the approved budget fascicle.
- A baseline measure is the actual value of an indicator on the first day of the biennium, or on a date as close to 1 January as is practical. When the indicator is new and its actual value cannot easily be determined, a reasonable estimate may be substituted.
- The target is an estimated value of the indicator on the last day of the biennium, given the original programme of work and budget approved by the General Assembly.
- Baselines and targets are needed to gauge actual programme outcomes/impact, as well as variations from anticipated results. Measurements should be taken as regularly as is feasible during the biennium in order to evaluate progress over time and to connect changes with specific accomplishments and/or shortcomings in programme design and delivery. All documentation, as supporting evidence on the reported progress on indicators and results achieved (table 20 below), must be collected and uploaded to the Programme Performance Evidence Database as soft copies or through web links/hyperlinks to relevant sites.
“Interim values” and “descriptions of results” serve as an indication of whether the expected targets approved by Member States for the subprogramme indicators have been or will be achieved by the end of the biennium. These data should be recorded by programme managers before writing statements of results at the subprogramme level, for interim reporting as of the end of the first year of the biennium and for final reporting as of the end of the biennium.
- Recording results statements
Results statements are also assessed. A results statement includes information on trends and conditions of indicators of achievement, showing whether or not the expected accomplishments have been or are being achieved. Other relevant information captured in the results statement includes challenges or issues being addressed; activities undertaken; results or accomplishments; verifiable data; and other information, including a comparison of the actual value of indicators with the original targets, variations from the target and the reasons why, and lessons learned, including recommendations on how to solve the problems/issues identified.
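The comparison of interim values against baselines and targets described above reduces to simple arithmetic. A sketch with illustrative figures (not actual UN-Habitat data):

```python
def indicator_progress(baseline, target, actual):
    """Share of the baseline-to-target distance covered by the actual value."""
    return (actual - baseline) / (target - baseline)

def on_track(baseline, target, interim, months_elapsed, months_total=24):
    """Rough heuristic: interim progress should at least match the share
    of the biennium that has already elapsed."""
    return indicator_progress(baseline, target, interim) >= months_elapsed / months_total

# Interim value recorded at month 12 of the biennium (illustrative figures).
print(on_track(baseline=10, target=25, interim=18, months_elapsed=12))  # prints True
```

In practice such a check only flags indicators for attention; the results statement still explains the variation and its causes.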
Monitoring and recording delivery of outputs in IMDIS
To assess whether or not delivery of outputs is on track and in line with the programme budget, the number of outputs delivered and the percentage of outputs completed in relation to the total number of outputs planned is monitored in IMDIS. The purpose is to assess whether output delivery is in line with the programme budget in terms of quantity and timeliness. Within the results chain, outputs are the products and services such as reports, publications, servicing of major meetings, training workshops, advisory services and field projects, which result from the completion of several activities that a programme is expected to produce in order to achieve its expected accomplishments.
Detailed information required for tracking output delivery must be entered in IMDIS and in the programme performance evidence database, including supporting documents (table 21), as evidence of accountability and to demonstrate the outputs’ contribution towards the achievement of the expected accomplishments.
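The delivery-rate calculation described above can be sketched as follows; the output titles and statuses are invented for illustration:

```python
# Hypothetical output records for one subprogramme.
outputs = [
    {"title": "Policy report on housing finance", "status": "implemented"},
    {"title": "Training workshop for planners", "status": "implemented"},
    {"title": "Technical advisory mission", "status": "in progress"},
    {"title": "Expert group meeting", "status": "not started"},
]

# Number of outputs delivered and percentage completed
# relative to the total number of planned outputs.
delivered = sum(1 for o in outputs if o["status"] == "implemented")
completion_rate = delivered / len(outputs)
print(f"{delivered} of {len(outputs)} outputs delivered ({completion_rate:.0%})")
# prints "2 of 4 outputs delivered (50%)"
```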
Recording output delivery in IMDIS
The instructions for updating the output details for each category in IMDIS can be found in this Handbook under section 3.2.4. Information on the following elements should be provided for each implemented output:
(a) Replace the output definition (aggregate output title which was formulated at the work programme and budget planning stage) with an actual specific description of the deliverable (see the hypothetical example provided below).
(b) Start/End date: enter the actual month/year when the output implementation started and ended.
(c) Output status: review and update the output delivery status (the applicable statuses for the first quarter of the biennium are “not started”, “in progress”, “implemented” and “reformulated”).
- For an output in progress, provide a short description of the work undertaken. This can be entered in the “Remarks” field for the non-recurrent publications category; in other output categories it is best entered in the “Description” field;
- For an implemented output, enter the mandatory output identifiers specified in table 20;
- For a reformulated output, enter the reason for deviation by clicking on the drop-down arrow and highlighting the relevant reason. In addition, enter remarks explaining why the output was reformulated. Please note that an output can still be considered reformulated even if it continues to address the same subject matter as the originally programmed output and caters to the same intended users.
(d) Issue date and publication identifier (applicable to the non-recurrent publications category): Enter issue date and ISBN/ISSN number or URL as mandatory identifiers for any publication reported as “implemented”.
(e) Abstract (applicable to the non-recurrent publications category): Enter a short abstract of the publication.
(f) Responsible officers: Enter the name of the staff member(s) and division responsible for the implementation of the output.
(g) Organizational unit: Please note that the implementing branch in the field of organizational unit is pre-selected by QAU based on the approved programme frameworks.
(h) Intermediate results (applicable to implemented output): Record how the output has been used or applied by the intended beneficiaries and/or assisting constituencies; record usage statistics and reference and beneficiary feedback. This information serves as evidence of how the outputs have contributed towards the achievement of the expected accomplishment.
(i) Remarks: indicate the relevance of the output to the expected results. Please note that the Quality Assurance Unit will use the “Remarks” field to provide comments for revising the output content, if needed.
In addition to the output delivery information entered in IMDIS by programme managers, substantive documents should also be uploaded onto the PAAS and UN-Habitat websites.
Reporting on the Programme of Work
Reporting on the work programme and budget is a UN-Secretariat requirement. The Department of Management prepares three reports based on the information and actions entered in IMDIS as per the advisory notes and guidelines. These include the programme performance documentation status report for the biennium, interim programme performance report and the programme performance report for the biennium.
Programme performance data analysis and reporting on the work programme and budget
Mandatory self-assessments are requested by the UN Secretariat and are conducted by managers when reporting the results of the subprogrammes in the results-based format; they are reflected in the biennial programme performance report. In effect, this is mandatory monitoring and assessment of the biennial work programme.
The self-assessment reports on the programme of work comprise six-monthly data and information (at months 12, 18 and 24). The reporting consists of an analysis of the logical framework, the trends and conditions of indicators of achievement (together with baselines and targets), and the methods used to collect the data in IMDIS. The information collected in IMDIS throughout the biennium is used by UN-Habitat to contribute to the programme performance reports prepared by the Department of Management for accountability to Member States. The data analysis and results reporting take place at two main levels: the strategic objective level and the expected accomplishment level.
Highlights of programme results: should showcase the key achievements of the Agency selected from each subprogramme. In addition, programme managers are required to prepare a brief summary describing the main challenges, obstacles and unmet goals the programme encountered. Ideally the lessons learned and areas in need of improvement identified when assessing subprogramme performance for each expected accomplishment should be reflected in this summary.
Expected accomplishment results or statements of results: these are required at 12, 18 and 24 months of the programme cycle. The responsibility for analysing and preparing results for each expected accomplishment rests with the branch coordinators and unit heads for subprogrammes, and the heads of offices responsible for the respective expected accomplishments. The purpose of a statement of results is to summarize, for each subprogramme, the data collected for the indicators of achievement and other relevant information that serves as the source for reporting on the extent to which the relevant expected accomplishment was achieved. The analysis of progress in the statements of results should be made principally with reference to the indicators of achievement, including a comparison of actual achievements against the targets and the corresponding performance measures (baselines and targets) established by departments and approved by the General Assembly at the beginning of the biennium. Programme managers may wish to highlight specific outputs or groups of outputs that were particularly effective, best practices identified in the programme’s substantive or operational areas, or supplementary indicators or other compelling information that further supports the results achieved. A typical statement of results would address these questions:
- What was accomplished (statement of facts)?
- How was it verified (reference to indicator methodology used)?
- How did this compare with your target (comparison with the target)?
- What explains the variation (reason for variation with the target)?
- What did you learn (reference to best practices and lessons learned)?
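A statement of results could be drafted by answering the five questions above in order. The helper below is a hypothetical template with invented figures, not an official reporting format:

```python
def draft_statement(accomplished, verification, target, actual, variation, lesson):
    """Assemble a results statement from answers to the five standard questions."""
    return "\n".join([
        f"Accomplished: {accomplished}",
        f"Verified by: {verification}",
        f"Target vs actual: {target} vs {actual}",
        f"Variation: {variation}",
        f"Lesson learned: {lesson}",
    ])

statement = draft_statement(
    accomplished="18 partner cities prepared local economic development plans",
    verification="annual data sheets cross-checked against uploaded plans",
    target=25,
    actual=18,
    variation="local elections delayed plan approval in four cities",
    lesson="start the approval process in the first year of the biennium",
)
print(statement.splitlines()[0])
# prints "Accomplished: 18 partner cities prepared local economic development plans"
```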
Information on challenges, obstacles and unmet goals should also be included in progress reporting (months 12 and 18) and final reporting (months 21-23) on accomplishment accounts, so that it can be extracted and summarized at the end of the biennium.
Work months reporting: the time spent by each professional staff member or consultant on the delivery of planned outputs is reported, irrespective of whether funding is received through the regular budget or from extra-budgetary resources. The purpose of the reporting is to account for allocation of professional staff and consultants’ time within the subprogrammes. Work months are reported using a standard template in IMDIS.
- Programme performance documentation status report for the biennium
- The report is published in early October of the first year of the biennium and is based on data recorded in IMDIS by the end of September. The report uses the following specific set of data for measuring the status of programme performance documentation in IMDIS:
- Percentage of indicator of achievement methodology completed
- Percentage of performance measures that have baselines and targets in line with approved budgets
- Percentage of outputs for which the status reported is “implemented” or “reformulated”; this does not include outputs “in progress” or “not started”
An average of the three percentages mentioned above provides the documentation status of programme performance data. All agencies under the Secretariat are rated based on the percentage achieved. The report helps to indicate the level of preparedness of the agencies to effectively monitor and report on the implementation of the work programme and budget during the biennium.
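The documentation status is a plain average of the three percentages above; a sketch with illustrative figures (not actual ratings):

```python
def documentation_status(pct_methodology, pct_measures, pct_outputs):
    """Average of the three documentation percentages described above."""
    return (pct_methodology + pct_measures + pct_outputs) / 3

# Illustrative figures for one agency.
status = documentation_status(
    pct_methodology=90.0,  # indicator methodologies completed
    pct_measures=80.0,     # performance measures with baselines and targets
    pct_outputs=70.0,      # outputs reported as implemented or reformulated
)
print(status)  # prints 80.0
```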
- Interim programme performance report
- This report covers the first year of implementation of the biennial work programme and budget. It is used to assess and record progress achieved in programme implementation halfway through the programme. The data collected from IMDIS at the end of the first 12-month period of the biennium is used to develop an interim report on programme performance that is presented to the Management Performance Board meeting held at the start of the second half of the biennium, and assessed in conjunction with the senior managers’ compacts.
- The interim report covers all key elements of the programme performance report, which includes implementation rate of programmed outputs, interim progress on indicators of achievement, statements of results achieved and highlights of programme results. The report also captures challenges and lessons learned at expected accomplishment level.
- Programme performance report for the biennium
This is the end-of-biennium programme performance report, which is submitted as a report of the Secretary-General by the Department of Management. It presents the overall programme performance in terms of implementation and the results achieved over the biennium. All three reports are prepared as consolidated reports for the Secretariat, but also contain sections for each agency.
- What is IMDIS?
- The Integrated Monitoring and Documentation Information System (IMDIS) is an online reporting system available at http://imdis.un.org. The system is supported by the Department of Economic and Social Affairs (DESA) and administered by the Office of Programme Planning, Budget and Accounts (OPPBA), Department of Management (DM), on the planning/budgeting side, and by the Policy and Oversight Coordination Service (POCS), DM, on the monitoring and reporting side.
- How IMDIS works
- IMDIS is the Secretariat-wide system for programme performance monitoring and reporting, including the preparation of the Secretary-General’s Programme Performance Report (PPR).
- The system is designed to facilitate continuous and comprehensive programme implementation monitoring by staff at different levels within the same organizational unit, in accordance with their assigned roles and responsibilities. Programme managers use IMDIS to select and update information on outputs, indicators and accomplishments within their particular area of responsibility. The DM is able to verify progress and generate all necessary IMDIS information for organization-wide monitoring, including what is required for the PPR.
- Each registered IMDIS user can view the entire programme of work of the organization, as well as the programmes of all other departments and offices. Each registered user, however, has limited rights to make changes in IMDIS in their respective sections of the work programme, depending on their area of responsibility. These access rights are defined during registration and are tied to the user's account and password.
- IMDIS also promotes accountability and transparency, and can be used to foster collaboration and exchange of best practices throughout the Secretariat.
- The User Guide contains step-by-step instructions for IMDIS and programme performance reporting by managers and reporting focal points.
- Getting started with IMDIS
- Setting up a new IMDIS account
- The login authority and password are organized by the Department of Management of the UN Secretariat in New York. In UN-Habitat, the Quality Assurance Unit is the coordinating office and clearinghouse for IMDIS password requests. The requests are made by UN-Habitat monitoring and reporting focal points.
- IMDIS focal points for each office or unit are selected by their respective managers, who then request passwords from the Quality Assurance Unit. The following information must be submitted to the Quality Assurance Unit:
(i) Name of the selected staff member
(ii) Index number
(iii) Email address
(iv) Subprogramme for which the IMDIS focal point will be responsible for reporting
- The Quality Assurance Unit will send a formal request to the Department of Management and an IMDIS account will be set up. Username and password will be sent directly to the new user.
4. Accessing IMDIS and the programme (UN-Habitat)
The IMDIS webpage address is http://imdis.un.org/. IMDIS is also accessible through the UNON Intranet: http://www.unon.org/restrict/intranet/. To log in to the system, a user ID and password are required.
Step 1: Enter user ID and password as registered and click OK.
Step 2: Select the 2014-2015 biennium
Once you have logged on, you will reach the home page of the application, which requires you to select the desired biennium and the type of view. The biennia correspond to the periods of the biennial programme budgets. For the current PPR exercise, for example, select 2014-2015.
Select programme of work and view
Step 3: Scroll down and select the View “by programme element”.
Indicate whether to view the work plan by programme element or by organizational unit, and then click on [Go]. For reporting purposes, select the "by programme element" view, as all reporting requirements are accessible through this view only.
Users scroll down through a number of screens selecting the programme element, budget section, component and subprogramme of the selected office as per the 2014-2015 proposed programme budget (http://www.un.org/en/ga/fifth/68/ppb1415sg.shtml).
Step 4: Select International cooperation for development
- Prog’d [Programmed] – the mandated work programme outputs as per the work programme approved by the General Assembly for the biennium (for example, A/64/6 (sect. 14) for 2010-2011).
- Add’t [Additional] – additional outputs added at the discretion of programme managers or by legislative mandate.
- c.f. [Carried Forward] – outputs that are carried forward from the previous biennium.
Schedule of final outputs
Step 5: Click on UN-Habitat budget section (15. Human Settlements).
This will give you the four programme elements, as follows:
- Policy making organs: This lists all the outputs related to the Governing Council of UN-Habitat
- Executive Direction and Management: This lists all outputs under Policy and Strategic Planning Unit, Evaluation Unit, the Secretariat of the Governing Council and the Division of External Relations.
- Programme of work: This lists all outputs under the seven subprogrammes
- Programme Support: This lists outputs under the Office of Management
Step 6: Click on the ‘Programme of work’ component to access the UN-Habitat subprogrammes
Step 7: Select the subprogramme you would like to access. The number on the right shows the number of outputs for each sub-programme in summary form.
To save this page as your home page and avoid having to scroll down through multiple screens at the start of each session, click on [Bookmark] in the navigation area at the top of any page. The next time you log in, you will automatically be taken to the selected screen. The bookmark can be changed to any page at any time. Return to the main page by clicking on [Top]. To go up one level, click on [Back].
Step 8: Click on any link under the ‘final output by category’ to view the work programme outputs by category under the specific subprogramme.
- Viewing the logical framework
Reviewing programme content
- A work programme consists of a logical framework and a schedule of outputs. Having scrolled down to the subprogramme of interest, the various components of the logical framework for that subprogramme can be displayed and hidden by using the “expand” and “contract” icons respectively. Click on the “expand” (+) icon next to any of the indicators to display the indicator methodology and associated performance measurements in a pop-up window. Note that no reporting can be performed from this view.
- To see the distribution of outputs by category, source and status of implementation, make sure that the totals are turned on by clicking on the [Totals] button in the navigation bar at the top of the page. With the totals turned off, the application will respond somewhat more quickly. Totals can be turned on and off at any point during the user’s session.
- Verifying and Updating Indicator Methodology
- When analyzing the use of indicators as tools for reporting on accomplishments, every effort should be made to rely on sound data collection methods. For this, programme managers need to define the variables that make up the indicator, identify data sources, determine data collection and verification methods, fix the periodicity of measurements, create a presentation format and identify external factors that could distort measurements. This should be done early in the biennium so that the collection and reporting of results becomes less cumbersome. For UN-Habitat, this has been done as the programme performance plan for the six-year strategic plan (2014-2019). Minor revisions will need to be made at the start of each biennium, as appropriate, to reflect changes made during the planning process for that biennium.
- In order to record the indicator methodology and associated performance measurements in IMDIS, scroll down to the subprogramme concerned and click on the [Update indicator methodology] icon. Select one of the indicators of achievement from the logical framework that appears. Follow the instructions to update the methodology and click on [Save].
- Updating Performance Measures (indicators)
- Indicators of achievement are primary sources of data for analysis of programme performance, and as such need to be relevant and reliable. The baseline and target values for each indicator in IMDIS should be aligned with the approved budget fascicle. A baseline measure is the actual value of an indicator on the first day of the biennium, or on a date as close to 1 January as is practical. When the actual value is new and cannot easily be determined, a reasonable estimate may be substituted. The target is an estimated value of the indicator on the last day of the biennium, given the original programme of work and budget approved by the General Assembly.
- Baselines and targets are needed to gauge actual programme outcomes/impact, as well as variations from anticipated results. Measurements should be taken as regularly as is feasible during the biennium in order to evaluate progress over time and connect changes with specific accomplishments and/or shortcomings in programme design and delivery.
- In order to review associated performance measurements in IMDIS, scroll down to the subprogramme concerned and click on the [Update indicator methodology] icon. Select one of the indicators of achievement from the logical framework that appears.
- The fields “interim value” and “final value” should be filled out with the actual measurement of indicator performance at the time of data collection, accompanied by a “description of results”, using the “update indicator methodology” function.
- “Interim values” and “descriptions of results” indicate whether the expected targets approved by Member States for the subprogramme indicators have been achieved. This data should be recorded by programme managers before writing statements of results at the subprogramme level, for interim reporting at the end of the first year of the biennium and for final reporting at the end of the biennium.
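The relationship between the baseline, target and measured values described above can be sketched as a simple calculation. This is an illustrative sketch only; the function name and figures are hypothetical and do not correspond to actual IMDIS fields:

```python
# Illustrative sketch: gauging indicator progress against a target.
# Figures and names are hypothetical, not drawn from IMDIS.

def progress_towards_target(baseline: float, target: float, measured: float) -> float:
    """Return progress as a fraction of the planned change from baseline to target."""
    planned_change = target - baseline
    if planned_change == 0:
        return 1.0 if measured == target else 0.0
    return (measured - baseline) / planned_change

# Example: baseline of 40, target of 60, interim measured value of 52.
interim = progress_towards_target(baseline=40, target=60, measured=52)
print(f"Interim progress: {interim:.0%}")  # prints "Interim progress: 60%"
```

A value near 100% at the end of the first year would suggest the target may be reached early; a low or negative value would flag a shortfall worth explaining in the "description of results".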
- Reporting on Delivery of Programmed outputs
Navigation between major categories of outputs
Step 1: Select the subprogramme as shown in section 4 step 7.
This will show you the categories of outputs under each subprogramme.
If, for example, you click on “Ad hoc expert groups”, the outputs in that category are listed.
- To navigate between major categories of outputs return to the activity listing screen. At the top of the screen select the required category from the [Switch to] menu.
- Or return to the subprogramme screen and select the final outputs by category required.
Recording status of output implementation
- Users may elect at this point either to display a brief listing, or full details. In the case of some output types, such as parliamentary documentation, output sorting options are also available with the “view by” feature. For ease of location, use the “view by activity title” button to view an (alphabetical) list of outputs. Sorting capabilities vary from one output to another depending on the characteristics of the particular type.
- A checkmark next to an output title indicates that it has already been submitted to DM for verification and has been verified. DM will see a similar checkmark once the record has been reviewed and archived by the responsible programme monitoring officer. Outputs with checkmarks may be considered complete and do not require any further attention from programme managers.
- An “action pending” icon indicates that further attention is required before reporting on that output can be considered complete.
- To update the status of implementation, scroll down by clicking on the output title until the [Update workplan] button appears. This is an output detail record. Clicking on [Update workplan] will bring up the update form.
For all publications, which include recurrent publications, non-recurrent publications and technical materials, the corresponding update form is displayed.
For advisory services, training and projects, a similar update form is displayed.
Follow the instructions and enter all the required reporting details. The section under “optional” should also be completed because it provides vital evidence on programme performance.
- For verification purposes, DM requires the completion of certain fields, which generally capture the status of implementation of outputs, including some form of identifier that can be used to locate the output (see table 20), as well as any reason for deviation from programmed commitments, where relevant. Where deviations occur, a legislative decision justifying the change must be cited. The “remarks” field may be used for this purpose. The “remarks” field can also be used to record and trace information on the implementation history of the output itself, i.e., staff assigned to it, progress to date, etc.
- Once all the required information is provided, save the form.
- To update the status of implementation of outputs in the categories of “advisory services”, “training courses, seminars and workshops”, “field projects” and “fellowships and grants”, scroll down to the output category and click on the [Update] button under the output that you intend to modify.
- This is an output detail record. Clicking on [Insert] will bring up the update form. Follow the form instructions and click on Save.
Submitting outputs to Department of Management for verification
- DO NOT CLICK on the yellow submission icon at any time.
- Submission of completed outputs to the Department of Management is the sole responsibility of the Quality Assurance Unit using the submission icon on the left.
Adding or deleting additional outputs
- Outputs that were implemented in addition to those originally programmed are referred to as “additional”. They may be added as a result of a legislative decision taken after the biennial budget was approved by the General Assembly, or they may be added at the initiative of the Secretariat, for example to enhance the possibility of attainment of programme objectives, and/or as a result of the unanticipated availability of extra-budgetary resources. Additional outputs should be entered in IMDIS ONLY when they have been implemented.
- They fall into two categories:
- (i) Added by legislation: outputs added by a legislative decision (GC, GA etc.) taken after the biennial budget was approved by the General Assembly. The legislative authority and intergovernmental body that took the decision should be specified.
- (ii) Added by initiative of the secretariat: outputs introduced to the work load by programme managers to enhance the attainment of the objectives of the subprogramme and as a result of the unanticipated availability of extra-budgetary resources.
- To enter additional outputs, scroll down to the bottom of any output list and click on [Add additional outputs]. Please note that the details will differ depending on the type of output you are adding.
Follow the instructions provided in the form and enter all necessary information to accompany the new record. The title should be entered in the field [Title/Nature of service/Title of service, depending on the type of output].
- If the additional activity is mandated by legislative decision, the legislative decision including document symbol number and date of decision must be entered in the field [Legislative mandate].
- When adding a new output, remember to select the expected accomplishment to which this output contributes.
- The reason for adding the output should be explained in the field [Remarks].
- Once the record is complete, click on [Save].
- Unlike programmed commitments, additional outputs may be deleted by programme managers, for example when a discretionary item has been postponed or terminated due to lack of funding. Because they are considered supplementary to the original work programme as approved by the General Assembly, they need not appear in the final programme performance accounting.
- To delete an additional output, view the corresponding output list by activity title and click on [Go].
- Then from the list that appears, select the activity to be deleted. Finally, click on [Delete activity], or click on the “delete” icon to remove individual outputs from an output group. Output groups or “activities” are those records having a system-generated record identifier beginning with “PB” for programme budget.
- Resource utilization
- Once the output is reported as implemented, the work months utilized should be recorded.
- To update work months in IMDIS, select any output category and scroll to the bottom of the screen. Then click on [Review work months]. A work month summary worksheet will appear for your review. To edit the worksheet, click on the [Update work months] button at the bottom of the screen.
- Enter work months under the following categories: P-RB = Professional, regular budget; P-XB = Professional, extra-budgetary; C-RB = Consultant, regular budget; C-XB = Consultant, extra-budgetary. This work months format should be used for all major categories. You may use up to five digits, including two decimal places, e.g., 122.35. Work months for multiple activities can be entered. Once done, please click on [Save].
NB: Programme managers are requested to enter work months for each output implemented even though work months are recorded at the activity level.
(iii) An item-by-item report on work months by subprogramme is available for review, printing and exporting using the report entitled “Detailed work months”, which can be accessed through the “Reports” icon at the top of each page.
(iv) 1 work month = 4 weeks = 20 working days; 1 working day = 0.05 work month. A work month is calculated from the actual total professional staff (P) plus consultant (C) time used to implement an output. Work months are reported separately by source of funding: RB refers to the UN regular budget and XB to UN-Habitat’s external funding sources.
(v) In the category of
– parliamentary documentation
– expert groups, rapporteurs
– recurrent/non-recurrent publications
– other substantive activities
– conference services, administration, oversight
Work months utilized for all the outputs are recorded at the aggregated output title level. To view work months select “activity title” in the top right-hand corner, and the “full details” format. To update work months click “Review Work Months” at the bottom of the page.
(vi) In the category of:
– substantive servicing of meetings
– advisory services
– training courses, seminars and workshops
– fellowships and grants
– field projects
Viewing of the work months can be done by selecting the “full details” format in the top right corner. Work months should be recorded at the aggregated output title level through the “Review Work Months” button at the bottom of the page.
NB: You may use up to five digits, including two decimal places, e.g., 122.35. Work months for multiple outputs can be entered before clicking the [Submit] button.
- Status of Outputs Implementation
The status of implementation of outputs is expressed as a percentage of the total number of outputs planned for the biennium. For example, if 75 outputs were planned for a subprogramme, the percentages are calculated against that figure, without counting additional outputs.
That is why it is important to reformulate outputs that are already in the work programme rather than adding new outputs, which do not count towards the percentage value.
- Not started – Outputs whose implementation has not started. Click on the “not started” link to see all outputs not started for the current biennium for the selected subprogramme, as at the date of viewing.
- In progress – Outputs whose implementation is ongoing. Click on the “in progress” link to see all outputs in progress for the current biennium for the selected subprogramme.
- Implemented – Outputs that have been completed and for which evidence of implementation is available. Click on the “implemented” link to see all outputs implemented in the current biennium for the selected subprogramme.
- Reformulated – Outputs that have been changed to align with the outputs planned in the work programme. Reformulating can be very useful, as additional outputs do not count in the overall implementation rate in IMDIS.
- Postponed – Outputs that will not be implemented in the current biennium due to factors such as a change of mandate or financial constraints. These outputs will automatically be included in the next biennium as recurrent outputs.
- Terminated – Outputs that will not be implemented in the current biennium and are not planned to be implemented in the future.
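The implementation-rate calculation described above can be sketched as follows. The status counts are illustrative, not taken from a real subprogramme; the key point is that additional outputs are excluded from the planned total used as the denominator:

```python
# Sketch of the implementation-rate calculation described above.
# Status counts are illustrative.

def implementation_rate(status_counts: dict, planned_total: int) -> float:
    """Percentage of planned outputs implemented. Additional outputs
    do not count towards the planned total."""
    return 100 * status_counts.get("implemented", 0) / planned_total

counts = {"implemented": 60, "in progress": 9, "not started": 3,
          "postponed": 2, "terminated": 1}  # 75 planned outputs in total
print(f"{implementation_rate(counts, planned_total=75):.1f}%")  # prints "80.0%"
```

Because an additional output increases neither the numerator cap nor the denominator, a reformulated planned output improves the reported rate where an additional output would not.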
17 Information provided in this document is extracted from the IMDIS User’s Guide (December 2014) and adjusted for UN-Habitat use.