The origins of SDGToolkit go back to the late 1960s. Reviews of agricultural research found that most agricultural research and experimentation did not take into account the impact of the natural environment on yields1. By the 1970s, however, more attention was being given to agroecological zoning as a way of accounting for environmental effects on yields2. The growth in international funding of economic development projects was accompanied by project cycle management guidelines and techniques such as the Log Frame Approach, which by the 1990s had become a de facto standard adopted by most international development agencies.
However, by the 1990s projects had attained a 35% failure rate, amounting to an annual loss of some $75 billion wasted on over- or under-ambitious projects3. Objective evaluations of economic rates of return had fallen to around 20% of funded projects, stakeholder involvement was lacking, and environmental impacts were insufficiently addressed. This reality was apparent in a 1992 World Bank portfolio performance evaluation4 and in a reassessment made in 2010 by the World Bank Evaluation Group5.
It became apparent that many projects used Log Frames as fixed specifications to establish fixed budgets, losing the essential flexibility needed to adapt to the inevitable changes in conditions associated with climate change.
In 1983, SEEL-Systems Engineering Economics Lab was established to monitor the capabilities of decision analysis and global networks. By 2010 it had become evident that existing project cycle procedures were inadequate: the increasingly multi-disciplinary nature of project team design work meant that more comprehensive guidance and due diligence procedures were required.
To complete each procedure, a considerable number of accurate analytical tools covering a broad range of vertical applications is needed. This need arises from:

- The complex multi-factor analysis required to manage domain-specific issues and identify feasible economic development solutions, especially in low-income countries
- The impetus created by the demands of the global Sustainable Development Goals1 initiative and Agenda 2030

The SDGToolkit helps fill this gap by providing a sequential due diligence design procedure2 supported by advanced analytical tools that help process and project designers improve the quality of the data used in each procedural step of a design. Data quality refers specifically to reliability in terms of precision, representability and relevance to the analysis in question.
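A sequential due diligence procedure of this kind can be sketched in miniature. In the sketch below, the step names loosely follow the 3DP outline (gaps, needs and constraints analysis), and each step carries a data-quality check before later steps are allowed to proceed. All class names, function names and stubbed checks are illustrative assumptions, not part of the actual SDGToolkit.

```python
from dataclasses import dataclass

@dataclass
class StepResult:
    name: str
    data_quality_ok: bool  # did precision/representability/relevance checks pass?
    notes: str = ""

def run_procedure(steps):
    """Run design steps in order; stop if a step's data quality is inadequate,
    since later steps depend on reliable data from earlier ones."""
    completed = []
    for name, check in steps:
        ok, notes = check()
        completed.append(StepResult(name, ok, notes))
        if not ok:
            break
    return completed

# Illustrative steps with stubbed data-quality checks (all results invented)
steps = [
    ("gaps analysis", lambda: (True, "survey data representative")),
    ("needs analysis", lambda: (True, "stakeholder coverage adequate")),
    ("constraints analysis", lambda: (False, "rainfall series too short")),
]

results = run_procedure(steps)
print([(r.name, r.data_quality_ok) for r in results])
```

The point of the sketch is the sequencing: a step whose data fails the quality test halts the procedure rather than letting unreliable data propagate into the design.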
1 The UN Sustainable Development Goals initiative was launched in 2015; it consists of 17 development goals and over 232 target indicators, to be achieved by over 190 countries by 2030.
2 Due Diligence Design Procedures, or 3DP, are a set of essential steps in project design that ensure all critical factors are taken into account. They were established by the Open Quality Standards Initiative (OQSI) in 2018 as recommendation OQSI:3DP/1 (2018). 3DP is a step-by-step procedure covering gaps, needs and constraints analysis and the identification of a feasibility envelope that bounds a basic feasible project solution. The optimization of this solution minimizes likely costs and maximizes output volume and quality while reducing the risks of implementation.
3 Decision Analysis evolved as an applied discipline in the 1960s, based on logical, determinate mathematical models developed by the Decision Analysis Group at Stanford Research Institute and Stanford University. The lead developer was Ronald Howard of Stanford University.