Big irrigation projects in Africa have failed to deliver. What’s needed next
In 1938, French colonial authorities in what is today Mali embarked on an ambitious infrastructure plan to transform the desert into an area of agricultural production. Water was diverted from the Niger River through a canal system to enable irrigation on over one million hectares of fertile land. Although it eventually covered just over 100,000 hectares of that planned area, the project is still one of the largest irrigation schemes in Africa.
The Malian project, known as “Office Du Niger”, has had a profound influence on agricultural water management and planning across Africa since the mid-20th century. By the 1960s African governments saw it as a model for rural development.
With World Bank funding, hundreds of dams and large irrigation schemes were set up across Africa. The intended goals were increasing food security, reducing poverty, and stimulating economic growth. Unfortunately, the reality of many of these irrigation projects has been quite different.
Since 2008, in response to rising food prices, governments across Africa have announced plans for a new era of irrigation scheme development. Yet, it remains unclear why earlier schemes fell so short of expectations. To answer this question, we evaluated the performance of 79 schemes constructed across sub-Saharan Africa between the 1940s and 2010.
Our research reviewed original targets for agricultural production areas, as reported in project planning documents. These were compared with estimates of how much irrigated land projects currently support. The estimates were derived from high-resolution satellite imagery.
Our findings show that these irrigation schemes deliver, on average, only 18 per cent of the irrigated production area originally proposed. Many schemes are now completely inactive – some only a few years after construction. There is little evidence of scheme performance improving over more than 60 years.
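The planned-versus-actual comparison behind this finding can be sketched in a few lines. The scheme names and hectare figures below are hypothetical illustrations, not data from the study:

```python
# Sketch of comparing planned irrigation areas (from project documents)
# with currently supported areas (estimated from satellite imagery).
# All names and figures here are made up for illustration.
schemes = {
    "Scheme A": {"planned_ha": 50_000, "observed_ha": 9_000},
    "Scheme B": {"planned_ha": 12_000, "observed_ha": 3_000},
    "Scheme C": {"planned_ha": 127_000, "observed_ha": 0},  # fully inactive
}

def delivery_rate(planned_ha: float, observed_ha: float) -> float:
    """Share of the originally planned irrigated area currently supported."""
    return observed_ha / planned_ha

rates = {name: delivery_rate(s["planned_ha"], s["observed_ha"])
         for name, s in schemes.items()}
mean_rate = sum(rates.values()) / len(rates)

for name, r in sorted(rates.items()):
    print(f"{name}: {r:.0%} of planned area delivered")
print(f"Mean delivery rate: {mean_rate:.0%}")
```

The study's actual analysis covers 79 schemes and derives the observed areas from high-resolution imagery, but the performance metric reduces to this simple ratio.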
Cycle of failure
Research on individual schemes has blamed a number of factors for irrigation project failures, including scheme size and climate, with larger schemes in more variable climates said to fail more often. This was largely not the case in our analysis of 79 projects.
Instead, we found the main causes of failure to be political and management frameworks underpinning irrigation project development.
First the political. For governments, a key motivation for scheme development was to produce more food. This would also reduce dependence on imports while generating exports. But the resulting focus on production of low-value staple crops – such as rice and maize – often led to poor financial performance.
Low-value crops undermine the long-term financial sustainability of capital-intensive irrigation projects. These crops don’t always generate reliable and substantial profits from the land allocated within schemes, which makes it harder for farmers to contribute to the maintenance and upkeep of infrastructure.
The result is a cycle of dependence on external investment and subsidies. Once this initial investment runs out, many schemes deteriorate rapidly.
Second, donors have tended to prefer large, centrally-managed infrastructure projects. They seem to be less complex technically and logistically than a multiplicity of smaller scale initiatives. Unfortunately, many centralised government agencies in sub-Saharan Africa are underfunded and poorly resourced. Many lack the technical and institutional capacity needed to manage such large-scale projects.
At the same time, donor preferences for scale stimulate government appetite for optimistic plans to tap financial support. As a result, proposed irrigated areas and scheme returns are often unrealistic. For example, the Office Du Niger only recently achieved 10 per cent of the 1 million hectares planned in 1938. On the other hand, schemes designed to irrigate 127,000 hectares around Lake Chad are now completely inactive.
Planners, too, understate costs and overstate benefits. Our research argues that without changes to the way projects are envisaged, implemented and managed, African governments risk repeating the mistakes of 20th-century development. This could have damaging consequences for poverty, food security and economic development.
Ways forward
Failures of large-scale irrigation in sub-Saharan Africa have been acknowledged for several decades. But our research suggests that this has had little impact on the way planners or governments approach such projects.
Given actual outcomes achieved, it is arguable that many large-scale irrigation projects have not delivered a return on investment. Even those that were initially viable have since gobbled up funds for maintenance and rehabilitation. Greater and more systematic monitoring and accounting of performance is needed to address these issues.
To do this, governments, donors and researchers can use new data sources such as satellite imagery. Just as important are reforms to planning processes to ensure investments are made contingent on successful and sustainable outcomes for farmers and communities.
In parallel, we also suggest a rethink of the historical preference for large projects. Are they the best or only means of increasing either food security or farmer incomes?
There is a growing recognition that farmers across Africa are highly entrepreneurial. This is evidenced, for example, by the recent increase in focus within the World Bank and other agencies on farmer-led irrigation. Small-scale farmers have for many decades and even centuries been developing a wide range of irrigation systems independent of development agencies or governments.
Evidence suggests that these investments may be several orders of magnitude cheaper than large schemes, and may offer better returns in terms of farmer incomes and rural livelihoods.
Continuing investment
Investments in large-scale water infrastructure will continue to be an important means of supporting agricultural production. This is all the more so as water availability becomes increasingly erratic in many regions due to climate change and pressures from population growth.
This calls for investments in storage infrastructure – both built and natural – to ensure reliable access to water. This in turn provides a basis for encouraging farmers to invest in irrigated agriculture, thus reducing risks associated with adoption of new technologies or practices.
This also calls for new approaches to how irrigation development is financed and implemented in Africa. There’s a need to combine both large and small scale approaches to irrigation development to meet the twin goals of improving food and water security.
Tom Higginbottom, Research Associate in Earth Observation and Food Security, University of Manchester; Roshan Adhikari, Research Associate, Global Development Institute, University of Manchester, and Timothy Foster, Senior Lecturer in Water-Food Security, Department of Mechanical, Aerospace & Civil Engineering, University of Manchester
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Water-smart ecosystems: a drop of hope in the digital water path
Characterised by a multitude of elements that coexist and work together, water ecosystems are playing a key role in digital water management processes.
The correlation between global climate change and water is crystal clear. Climate change impacts the water cycle by prompting extreme weather events, reducing water availability and water quality and posing challenges to sustainable development, biodiversity and humans’ basic right to safe drinking water and sanitation.
Regional and nationwide climate policies and planning are flourishing, with a focus on an integrated approach to climate change and water management. But it is now essential that we pay proper attention to the role played by complex ecosystems (private organisations, public administrations, cities, consumers) and how they are fostering innovation and digital capacity – including cutting-edge technologies – to deliver cost-effective services.
Over time, the concept of digitalisation has moved to the forefront when referring to innovation and smartness. Digitalisation has already transformed citizens’ daily lives and created new business models. However, the current approach tends to be driven by the collection of large amounts of data that is linked to our daily activities.
If we want to address ‘smartness’, we need to focus on how these data flows are created and how they can be intelligently transported, shared and consumed on a global scale. Digital transformation is the ‘planned change’ towards better use of data through new technologies. In contrast to digitalisation, it implies a higher rate of change, a broader extent of change and greater participation of citizens and public administrations working in close collaboration with industry.
Among its many facets, digital transformation means moving from isolated systems to globally connected ones that form an interdependent, far more efficient ecosystem of multiple entities working together effectively. ‘Smartness’ is not just about installing digital interfaces or smart sensors in traditional infrastructure, or streamlining systems’ operations. It is also about using technology and data purposefully to make better decisions and deliver better services.
The need to digitalise water
Global challenges such as climate change demand flexible and adaptive governance approaches to deal with risk and uncertainty and to implement and guarantee the long-term sustainability of water management. That means moving from a reactive water management approach, triggered by climate change-related extreme events, for instance, to a preventive and predictive one, based on a real-time informed decision support system.
Despite a promising technological surge triggered by the Covid-19 pandemic, the water domain is still fragmented and characterised by a low level of maturity concerning the integration and standardisation of ICT solutions. Water and wastewater supply chains must provide good services at all times. Equally, water managers need to be better equipped with resources and have access to valuable information and hands-on practical case studies if they are to help organisations improve their efficiency rates and reduce decision-making times and costs.
Evidence through data is critical to address the many challenges the water sector is facing. Context data (describing what is going on, where, when and why) must be prevalent and accessible in near real-time, leading to the creation of a digital continuum, one in which the boundaries between applications domains are blurred. Consequently, it will be necessary for context information to flow freely between the different application domains, breaking the current silos of information.
This vision involves the integration of multiple systems of different nature – not just IoT devices but all kinds of context information sources. This ‘system of systems’ approach relies on the creation of a common knowledge repository that keeps and shares the data – one in which each system updates its own context information and can access the context information provided by other connected systems.
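This ‘system of systems’ idea can be sketched minimally, loosely modelled on FIWARE’s NGSI-style entities (each carrying an id, a type and a set of attributes). The entity ids, types and attribute names below are illustrative assumptions, and a real deployment would use a context broker service rather than an in-memory store:

```python
# Minimal sketch of a shared context repository: each connected system
# updates its own entities and can read the context information published
# by every other system. Entity shapes loosely follow FIWARE's NGSI style
# ({"id", "type", attributes}); all names here are illustrative.

class ContextRepository:
    """In-memory stand-in for a context broker shared across systems."""

    def __init__(self):
        self._entities = {}

    def upsert(self, entity: dict) -> None:
        # An entity needs at least an id and a type to be addressable.
        if "id" not in entity or "type" not in entity:
            raise ValueError("entity must carry 'id' and 'type'")
        self._entities[entity["id"]] = entity

    def get(self, entity_id: str) -> dict:
        return self._entities[entity_id]

    def query(self, entity_type: str) -> list:
        return [e for e in self._entities.values() if e["type"] == entity_type]

repo = ContextRepository()

# A sensor network publishes water-quality context...
repo.upsert({"id": "urn:sensor:river-001", "type": "WaterQualityObserved",
             "turbidity": {"value": 4.2, "type": "Number"}})
# ...and a treatment-plant system publishes its own context.
repo.upsert({"id": "urn:plant:north", "type": "TreatmentPlant",
             "intakeOpen": {"value": True, "type": "Boolean"}})

# Any connected system can now read across application domains,
# breaking the current silos of information.
for entity in repo.query("WaterQualityObserved"):
    print(entity["id"], entity["turbidity"]["value"])
```

The point of the sketch is the access pattern, not the storage: every source writes only its own entities, but queries span whatever the other systems have shared.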
Data and the concept of trustworthiness
The exponential growth of data crossing borders and the proliferation of cloud computing have brought the topic of data sovereignty to the fore. With organisations of all types and sizes collecting and storing large amounts of data, it has become essential to establish how, when and at what price others may use such data across the value chain.
Clearly, this approach requires a broad adoption of open platforms and interoperable standards to ensure that all connected systems can talk to one another. Interoperable platforms such as FIWARE allow water sector companies and other relevant stakeholders and industries to select the most appropriate combination of tools and data provided by different content information providers.
FIWARE is at the forefront of providing the standards to make this possible. Such standards for context information management are fundamental to the creation of interoperable platforms and infrastructures that can be deployed anywhere by just about anyone. This leads to a data-driven economy for the benefit of all, allowing the integration of multiple context data sources.
Publishing the context information management platform under an open-source licence encourages its adoption as a standard. It also attracts contributions from external developers, who in turn expand the ecosystem of third-party solutions. These interactions drive the continuous evolution of the platform, producing new and better solutions and faster growth. This accelerates innovation across sectors and facilitates the adoption of new services and products.
FIWARE plays a key role in driving the definition of ‘de facto’ standards and data models across a wide range of domains, including water management as part of the Fiware4Water project. FIWARE’s commitment to the topic has recently deepened with the launch of the FIWARE Smart Water Domain Committee, devoted to bringing FIWARE technology to the water sector.
Interested in joining the FIWARE Smart Water Domain Committee? Reach out to us at fiware.org
Robert Brears is the founder of Our Future Water, a FIWARE Foundation media partner, and Angeles Tejado is a Senior Project and Marketing Manager at FIWARE Foundation.
Header image by Nathan Dumlao, Unsplash
Reforesting Europe would increase rainfall – new research
“Plant more trees” is often the first idea that comes to mind when we think about how to prevent further climate change or at least adapt to its impacts. There are good reasons for this. Multiple studies have shown that as well as trees being a fantastic way to store carbon dioxide, they offer other benefits, such as a cooling effect in cities, the ability to reduce flood risk and boost biodiversity, among other things.
Our new study in Nature Geoscience shows that trees could also affect rainfall patterns.
We used measurements of rainfall across Europe to investigate what effect forests have on rainfall totals. We know that forests mostly increase local and downwind rainfall in the summer and winter, but the magnitude of this effect varies across regions and seasons.
To identify a realistic reforestation strategy we used the global reforestation potential map. In the area we looked at in our research (most of Europe), 14.4% of the land surface was considered suitable for reforestation, an area larger than France.
We then compared the effect of turning all that land into forest to the precipitation changes in a future scenario in which the world faces intermediate levels of climate change, based on current predictions. While the climate scenario projects wetter winters and drier summers, the inclusion of reforestation could enhance European summertime rainfall by an average of 7.6%, potentially offsetting some of the drying that climate change is projected to cause. However, we also found reforestation may exacerbate the increase in winter rainfall.
In the UK and Ireland for example, where around 37% of the land area has the potential for reforestation, we estimate that reforestation on this scale would increase precipitation by an average of 0.74 mm/day (24%) in winter and 0.48 mm/day (19%) during summer.
Several factors potentially contribute to this. Forests typically have a higher surface “roughness” than agricultural land. This creates more turbulence over the trees and slows the movement of heavy clouds causing them to rain over and downwind of the forests. The same is true of urban areas too – increased surface roughness from buildings can amplify the precipitation over cities and downwind of cities. And forests typically evaporate more water than agricultural land, particularly during the summer season, which likely means more rain.
These findings demonstrate the relevance of land management in the assessment of climate change pathways. Many countries are considering how changes to land cover could contribute to their climate mitigation and adaptation efforts.
For instance the recently published climate change risk assessment from the UK government’s Climate Change Committee advisory body highlights that the gap has widened between the level of risk we face and the level of adaptation underway. Intervention measures are therefore urgently needed but require careful consideration. The new report points out that we must avoid poor planning being “locked-in”.
Reforestation in particular needs careful planning, as trees need decades to grow, and as they interact in such a complex way with multiple aspects of the environment. For example, while we may see increased rainfall from forestation, we may also see decreased runoff and water availability, since trees typically evaporate more water than crops or grass.
The species of tree we plant also needs to be carefully considered – will it be able to cope with higher temperatures? Will the type of tree be resilient to the invasive species and pathogens projected to increase with climate change? If not, then we have wasted our time and money.
Policy makers therefore need to thoroughly and carefully assess any kind of nature-based solution before embarking on a scheme that may provide no long term benefit. It is all about making sure that we are putting the right intervention in the right place, at the right time.
Elizabeth Lewis, Lecturer in Computational Hydrology, Newcastle University; Edouard Davin, Senior Scientist, Institute for Atmospheric and Climate Science, Swiss Federal Institute of Technology Zurich, and Ronny Meier, PostDoc, Institute for Atmospheric and Climate Science, Swiss Federal Institute of Technology Zurich
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Where East meets West, Torishima bridges the hemispheres
Gerry Ashe, Deputy CEO, Torishima Pump Mfg Co Ltd
Torishima is a leading Japanese company specialising in the design and manufacture of engineered pumping equipment. The products are used in a variety of applications in desalination plants, on water transmission and wastewater schemes, and extensively in large power plants. Although established more than 100 years ago, during the past 20 years Torishima’s objective has been to expand its markets, transitioning from a predominantly domestic company to a global player in the engineered pump market.
As a Japanese company, Torishima has taken the unique approach of embracing these new markets, with a strong focus on a multicultural approach to the management team. The business set up its Torishima Global Team (TGT) in 2002. At the end of 2020, Torishima’s export business accounted for almost 60 per cent of its total revenue. TGT has set up and manages a network of service businesses, sales offices and manufacturing plants, from Michigan in the USA, all the way through Europe, the Middle East, the Indian subcontinent, Asia and as far east as Melbourne in Australia.
Torishima’s strength has been its products and its people. In a highly competitive industry Torishima’s design and manufacturing capabilities have ensured its products are not only reliable but are manufactured on time and within budget. Torishima’s customer-focused approach ensures it works in partnership with large engineering contractors to ensure major infrastructure projects are completed as planned. Torishima is fully committed to its environmental responsibilities. Extensive research and development goes into the design of its products, resulting in high hydraulic and mechanical efficiencies, which, in turn, reduce the energy needed to drive the pumping equipment.
Torishima’s plan is to continue to grow and expand within its core business. As the world’s population continues to grow, the demand for water and power will also increase. Torishima is in the business of water supply and there are few things more important than making sure people have access to clean water. That’s why what Torishima does is more than just manufacturing pumps – it plays a critical part in the supply of water across the globe.
Torishima is fully committed to engineering excellence in the Water Market. For more information please click here.
Why are water companies dumping raw sewage in Britain’s rivers and coastal seas?
In 2020, water companies discharged raw sewage into rivers in England and Wales more than 400,000 times, for a combined duration of more than three million hours. One company, Southern Water, was recently fined a record £90 million for dumping up to 21 billion litres of untreated sewage over six years in protected seas off England’s southern coast.
To understand why this is happening, we need to understand the history of our sewer systems.
For most of human civilisation, sanitation was managed in a dry form. When people visited a latrine, their waste ended up in a drainage pit below or a cesspit nearby. Liquids were allowed to seep into the ground where nature would (hopefully) deal with any contaminants. The solids left over were often recycled directly to agriculture, providing nutrients for farmlands. This all changed, about 150 years ago, in the Victorian era.
People migrating from the countryside to Britain’s crowded industrial cities meant more waste and longer distances to transport it to farms. International trade brought higher quality fertilisers to the UK too, destroying the market for London’s cesspool waste, as farmers preferred South American guano (bird droppings).
Then, the water closet arrived. Toilet waste no longer filled a pit that someone had to empty; it magically disappeared with the pull of a chain. London’s water use nearly doubled in the six years from 1850, as waste was carried through rudimentary sewers and open drains into the Thames. Two years later, in 1858, the effect of these raw sewage discharges was fully felt during “the Great Stink”, when the Thames was so odorous it forced Parliament to stop meeting.
The solution came from the engineer Sir Joseph Bazalgette, who designed an integrated sewer system to carry untreated waste and rain water from across London further down the Thames where it was dumped via two outfalls. This was one of the largest engineering works of the time, with over 1,100 miles of street sewers, 82 miles of mains sewers, and four pumping stations installed.
After creating his initial design, Bazalgette doubled the diameters of the pipes, stating:
"We’re only going to do this once and there’s always the unforeseen."
Direct river outfalls were later replaced by sewage treatment plants, but the sewer capacity, even after Bazalgette doubled the size of the pipes, was exceeded within his own lifetime. He may have been right, that such an enormous undertaking, at such huge public expense, could only be done once. But ever since we have been trying to patch and alter a system that continues to age and be overwhelmed.
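Bazalgette’s doubling bought far more than double the capacity. For a circular pipe flowing full, Manning’s equation gives discharge Q ∝ A·R^(2/3), with cross-sectional area A ∝ D² and hydraulic radius R = D/4, so capacity scales as D^(8/3). The slope and roughness terms cancel when comparing two otherwise identical pipes, so no assumed values are needed for a quick check:

```python
# Capacity ratio of two geometrically similar full circular pipes under
# Manning's equation: Q ∝ A * R**(2/3), with A ∝ D**2 and R = D/4,
# so Q ∝ D**(8/3). Slope and roughness cancel in the ratio.
def capacity_ratio(d_new: float, d_old: float) -> float:
    return (d_new / d_old) ** (8 / 3)

# Doubling the diameter, as Bazalgette did:
print(f"Capacity multiplier: {capacity_ratio(2.0, 1.0):.2f}x")  # about 6.35x
```

In other words, the doubled pipes could carry roughly six times the flow of the original design, which is why the system served London for as long as it did.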
Sewer systems in the 21st century
After the construction of London’s sewers, local authorities began installing their own across the country. In 1945 there were over 1,400 sewerage companies throughout England and Wales. These were merged in the Water Act of 1973, simplifying the structure to just ten regional water authorities.
Investment fell from £3.5 billion in 1974 to just £1.8 billion in 1985. The sector was privatised under the Water Act of 1989, and 32 privately owned water and sewerage companies operate in the UK today.
Privatisation has led to a balancing act: water companies seek sufficient profit to attract investment, while also keeping water bills low enough to provide a public service. Both the bills water companies can charge their customers and the performance measures they must meet are agreed with government regulators. As the UK’s population grows, water usage increases and climate change sends more rainfall, in more intense bursts, into sewers, this balancing act is becoming harder to maintain.
Water companies are allowed to release untreated waste water in rare circumstances when the system becomes overwhelmed, preventing damage to equipment and properties. This is often due to very heavy rainfall, blockages and unexpected equipment failures. Increasing sewage and rainwater flows mean these events are likely to become more frequent.
The amount of sewage companies are permitted to release is set by the regulators, but when companies fail to manage increased flows they may exceed the permits and be penalised with fines. If they try to hide or under-report these releases, the penalties are significantly larger. But the damage to the environment is often already done.
Water companies are making efforts to reduce untreated releases and the environmental damage they cause. Thames Water recently spent £3.8 billion on a new “supersewer” for London, while not paying investors for the last three financial years. A bold move, but not one that will see future investors rush to provide capital for upgrades. Sewer systems are expensive and technically difficult to expand or change, so this will be a slow and costly process.
One way to ease pressure on the system – and save some of the 1.1 billion litres of water homes flush down the toilet each day – might be to resurrect elements of waste treatment from before the Victorian era.
Prototype flushless toilets can treat waste without water and sewer connections, by filtering waste through special membranes and sterilising it with heat. This could keep a lot of sewage out of the sewer system and prevent waste entering rivers, without needing expensive technologies. These systems can even recover energy – in the form of biogas fuel – and nutrients from waste, to provide farms with fertiliser and homes with power.
Peter Cruddas, Senior Lecturer in Environmental Engineering, University of Portsmouth and Keiron Philip Roberts, Lecturer in Sustainability and the Built Environment, University of Portsmouth
This article is republished from The Conversation under a Creative Commons license. Read the original article.