Smart Cities World's article on why satellite data is key to smarter, sustainable cities explains how satellite-derived information can enhance sustainability across the built environment and offer a simpler, less costly way to understand smart city development.
Why satellite data is key to smarter, sustainable cities
By Gaetano Volpe: CEO, Latitudo 40
Satellite imagery can act as “a time machine” for cities battling climate change, says Gaetano Volpe of Latitudo 40, helping us understand the past and future.
The image above shows Latitudo 40's solution, which brings together satellite imagery and artificial intelligence.
An IoT sensor is great for collecting air quality data from the moment it is deployed, but that data tells us nothing about what happened previously to create the current environmental conditions.
If data is to truly help us build more sustainable, safer, healthier and greener cities, we need technologies that enable us to understand what has happened in the past and predict how a situation might evolve in the future. It isn’t a lack of datasets standing in the way of doing this but rather knowing how to use the ones that already exist.
In Europe and around the world, initiatives such as the EU's Climate-Neutral & Smart Cities mission and its Mission on Adaptation to Climate Change are helping to galvanise climate action, but nobody is underestimating the scale of the challenge. What these initiatives have in common is the need for constant monitoring of a city's territory and environment to assess the current situation and check progress. Moreover, this monitoring needs to be put in context with information from decades past to gain the necessary deeper understanding.
This was the aim when developing the climate change adaptation and mitigation platform Latitudo 40, which allows cities to be constantly monitored. It uses raw data generated by earth observation satellites, combined with artificial intelligence (AI), to understand how the earth's systems have changed and predict how they will evolve in the future. It is designed to provide a more sustainable and resilient approach to urban climate action.
If data is to truly help us build more sustainable, safer, healthier and greener cities, we need technologies that enable us to understand what has happened in the past and predict how a situation might evolve in the future
In our specialist field of satellite remote sensing, we see a lot of valuable data available, but cities are using only a small portion of it to support key decision-making. To change this, we combine data from satellites with data produced within the city and, through a fusion of the two, create information models that help inform urban planners where to invest money and resources when it comes to protecting and futureproofing their cities.
A typical example is our dataset for estimating urban thermal comfort, which brings together information on urban heat island areas, tree canopy (or lack of it) and the age distribution of the resident population. A digital representation of a city can be created in a matter of hours that quickly highlights and offers insight into key climate and sustainability issues.
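To make the idea concrete, a thermal comfort layer of this kind can be thought of as a weighted composite score per grid cell. The function below is a minimal sketch under stated assumptions: the input names, value ranges, and weights are illustrative, not Latitudo 40's actual model.

```python
# Sketch: a composite "thermal risk" score per city grid cell, combining
# three normalised inputs (all names and weights are illustrative):
#   heat    - urban heat island intensity, 0..1
#   canopy  - tree canopy cover fraction, 0..1 (more canopy = cooler)
#   elderly - share of residents aged 65+, 0..1 (more = more vulnerable)

def thermal_risk(heat, canopy, elderly, weights=(0.5, 0.3, 0.2)):
    """Weighted combination: heat and vulnerability raise risk, canopy lowers it."""
    w_heat, w_canopy, w_elderly = weights
    score = w_heat * heat + w_canopy * (1.0 - canopy) + w_elderly * elderly
    return round(score, 3)

# A densely built cell with little canopy and an older population scores high:
hot_cell = thermal_risk(heat=0.9, canopy=0.1, elderly=0.7)
# A leafy area with a younger population scores low:
cool_cell = thermal_risk(heat=0.3, canopy=0.8, elderly=0.2)
```

Ranking cells by such a score is what lets a digital representation of the city highlight, in hours, where heat mitigation investment would matter most.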
Satellite technologies are now several decades old but, due to their complexity, have never reached a mainstream level of usage in the market. Image search and image processing require specific skills and complex processing systems that aren't typically available within cities. To make the best possible use of the information potential of these images, we have developed what we call "complexity simplification," a cloud-based processing workflow that automates image search, analysis, and interpretation.
Computer vision and AI algorithms complete the process by extracting the parameters of greatest interest to cities and presenting a simple representation of the evolution of the urban scenarios over time.
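One of the simplest parameters such a pipeline can extract is a vegetation index computed per pixel from satellite bands. The snippet below is an illustrative sketch of NDVI (Normalised Difference Vegetation Index), a standard remote-sensing measure, written for single reflectance values rather than full image arrays; it is not Latitudo 40's pipeline.

```python
# Sketch: NDVI = (NIR - Red) / (NIR + Red), computed from near-infrared
# and red band reflectances. Values near +1 indicate healthy vegetation;
# values near zero indicate built-up land such as asphalt or concrete.

def ndvi(nir, red):
    """Normalised difference of the NIR and red bands, guarding against /0."""
    denom = nir + red
    return 0.0 if denom == 0 else (nir - red) / denom

vegetation = ndvi(nir=0.6, red=0.1)    # dense canopy -> high NDVI
pavement = ndvi(nir=0.3, red=0.28)     # sealed surface -> near zero
```

Tracking an index like this across years of imagery is what turns raw pixels into an evolution of the urban scenario over time.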
Continuous and frequent monitoring
Crucially, unlike the aforementioned IoT sensor, satellite imagery allows for a historical representation of the city, almost a time machine that facilitates an understanding of the start point, the end point and what happened in between, as well as continuous and frequent monitoring into the future.
Thanks to satellite imagery, we can easily understand whether there has been land consumption and how much the relationship between green areas and urbanisation has changed; the state of urban green spaces and how they contribute to mitigating environmental phenomena; and which phenomena triggered a specific past event, such as a flood or the failure of urban infrastructure, so that the best monitoring systems can be activated to prevent them occurring in the future.
It’s one thing having the data and tools, though, and quite another ensuring they are accessible to those who need them. If they are to be truly effective, they need to be embedded in the daily operations of urban planners and decision-makers just like spreadsheets and email.
Thanks to satellite imagery, we can easily understand whether there has been land consumption and how much the relationship between green areas and urbanisation has changed
This thinking underpinned the development of Latitudo 40, which we describe as “a digital information factory in the cloud”. It can be accessed by a standard web browser and the processing made available through APIs that allow easy integration with existing spatial information systems. No special knowledge of data processing and geospatial analysis technologies is required and analyses provide a representation of the city with an easy-to-understand map, graphs and automated reports.
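As a sketch of what such API-based integration can look like on the consuming side: a city GIS might fetch an analysis result as JSON and map a few headline indicators into its own layers. The endpoint shape, field names and values below are entirely hypothetical, for illustration only; the platform's actual API documentation would define the real contract.

```python
# Sketch: parsing a (hypothetical) JSON analysis result from a geospatial
# analytics API into the fields a city GIS dashboard might display.
import json

def parse_analysis(response_text):
    """Extract headline indicators from a hypothetical JSON analysis payload."""
    payload = json.loads(response_text)
    return {
        "city": payload["city"],
        "green_m2_per_inhabitant": payload["indicators"]["green_area_m2_per_capita"],
        "heat_island_cells": payload["indicators"]["uhi_cell_count"],
    }

# Example response body a spatial information system might receive:
sample = ('{"city": "Napoli", "indicators": '
          '{"green_area_m2_per_capita": 11.4, "uhi_cell_count": 37}}')
summary = parse_analysis(sample)
```

The point of the design is that this is all the integration requires: standard web requests and JSON, with no geospatial processing expertise on the city's side.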
From this information, cities can set specific sustainability goals such as increasing green space per inhabitant, reducing the incidence of urban heat islands per inhabitant, and improving climate comfort in metropolitan suburbs. Every city can verify these goals and achievements via monitoring.
Our experience has made us realise that when it comes to data collection and reporting, city managers often allocate high-end budgets for consulting services that can stop with the creation of a static product. What's needed going forward is a more agile approach facilitated by business models such as software-as-a-service (SaaS) and backed by real-time, accessible data and services. Only then will we be able to turn data into actionable information and use it to build more sustainable, resilient, safer, healthier and greener cities.
Reducing costs, enabling more performant (new) energy businesses and the complex coordination of multiple energy players are crucial in this transformation. However, we are still in the early stages of AI/ML. How can we achieve rapid adoption of AI/ML at scale?
Why there is no energy transition without Intelligence Intensity
For the green deal to succeed, we need to start moving towards a whole-system approach, interconnecting sectors from diverse energy carriers to industries, transport, and buildings, driving Power-to-X, industrial clusters, industrial smart steering, 24/7 green energy matching, hybrid energy parks, and new low-carbon energy value chains leading to billions of networked "things". Flexible yet complex coordination is required that is close to real-time and optimised for multiple, varying stakeholder interests – impossible for humans to do alone.
AI/ML plays a key role in reducing the gigantic investments required for the energy transition: it can lower the levelised cost of energy, accelerate the issuing of permits and grid connections, and optimise yield, thus speeding up the deployment of the massive renewable generation required. Grid capacity can be expanded digitally, avoiding traditional grid reinforcements that are expensive and time-consuming to build. AI/ML also enables the coordination of flexibility services for maximum value from distributed energy resources (DERs) and infrastructure usage.
Microsoft is fully committed to rapid AI/ML adoption at scale, which is already evolving into a technical reality with higher use than anticipated. Partnerships and co-innovation with clients, partners and the wider ecosystem accelerate the creation of missing digital solutions and the development of digital accelerators for wider, faster, and simpler adoption of digital technologies.
Accelerating AI/ML innovation through open data platforms, open ecosystems, open-source
AI/ML needs a lot of data! Strengthened open energy data platforms give innovators in the ecosystem access, in a safe, scalable and performant way, to the vast volumes of quality data essential to train AI models. Microsoft joined OSDU (Open Subsurface Data Universe) to create an open-source, cloud-agnostic platform to collect subsurface data from O&G operations, valuable not only to O&G but also to renewable offshore players.
Energy data hubs in Europe also play a vital role in driving innovation. This is why Microsoft and Energinet partnered to co-create the open-source Green Energy Hub blueprints on GitHub, for experts to contribute to and for others to develop their own data hubs, creating an accelerator for future smart green solutions.
With AI still in its early stages, it is key to show energy players its successful, tangible impact and to facilitate access to solutions. Microsoft launched the Open AI Energy Initiative (OAI), an open ecosystem for operators, independent software vendors, and equipment providers to offer additional solutions, and its global AI Centre of Excellence for Energy, Microsoft Energy Core, features over 40 partner solutions.
The driving co-innovation force of strategic partnerships with energy leaders
Strategic partnerships with market makers not only accelerate transformation but also enable deeper and wider co-investment in the creation of leading-edge digital solutions, both for current operations and for the complex chain orchestration a successful energy transition requires. As a result, foundational research for AI in energy and energy-specific platform-based capabilities are developed faster.
These intelligence-intense, leading-edge lighthouse use cases inform the industry for fast followers and create digital optimism for speed. Together we become a driving force for the formation of new value chains, ecosystems, and business models that accelerate meeting the goals of the green agenda.
Utilities-specific digital accelerators for wider, faster, and simpler adoption
Energy players want more pre-built, utilities-specific capabilities for faster time to market with AI/ML models. One example is the utilities-specific industry data models acquired from ADRM, refined over 15 years and now being enriched with automated data ingestion from multiple sources, addressing a major hurdle around data.
Another example is the common domain-specific ontologies that are fundamental to accelerating the development of digital twin solutions. Microsoft, together with Agder Energi, launched the open-source Energy Grid Ontology, which others can extend for smart cities and smart buildings.
More broadly, the road ahead is for industry clouds. Energy players can focus much higher in the technology stack at the business applications layer, thus shortening innovation cycles, getting faster into the predictive era, and simplifying adoption.
Through co-investment, Microsoft is accelerating the development of energy-specific platform-based capabilities allowing energy players to focus their AI efforts at the business applications level such as for portfolio optimisation, risk management, and also trading.
As of October 2021, 44 countries were reported to have their own national AI strategic plans, showing their willingness to forge ahead in the global AI race. These include emerging economies like China and India, which are leading the way in building national AI plans within the developing world.
Oxford Insights, a consultancy firm that advises organisations and governments on matters relating to digital transformation, has ranked the preparedness of 160 countries across the world when it comes to using AI in public services. The US ranks first in their 2021 Government AI Readiness Index, followed by Singapore and the UK.
Notably, the lowest-scoring regions in this index include much of the developing world, such as sub-Saharan Africa, the Caribbean and Latin America, as well as some central and south Asian countries.
The developed world has an inevitable edge in making rapid progress in the AI revolution. With greater economic capacity, these wealthier countries are naturally best positioned to make large investments in the research and development needed for creating modern AI models.
In contrast, developing countries often have more urgent priorities, such as education, sanitation, healthcare and feeding the population, which override any significant investment in digital transformation. In this climate, AI could widen the digital divide that already exists between developed and developing countries.
The hidden costs of modern AI
AI is traditionally defined as “the science and engineering of making intelligent machines”. To solve problems and perform tasks, AI models generally look at past information and learn rules for making predictions based on unique patterns in the data.
AI is a broad term, comprising two main areas – machine learning and deep learning. While machine learning tends to be suitable when learning from smaller, well-organised datasets, deep learning algorithms are more suited to complex, real-world problems – for example, predicting respiratory diseases using chest X-ray images.
Crucially, neural networks are data hungry, often requiring millions of examples to learn how to perform a new task well. This means they require a complex infrastructure of data storage and modern computing hardware, compared to simpler machine learning models. Such large-scale computing infrastructure is generally unaffordable for developing nations.
Beyond the hefty price tag, another issue that disproportionately affects developing countries is the growing toll this kind of AI takes on the environment. For example, a contemporary neural network costs upwards of US$150,000 to train, and will create around 650kg of carbon emissions during training (comparable to a trans-American flight). Training a more advanced model can lead to roughly five times the total carbon emissions generated by an average car during its entire lifetime.
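Estimates like these typically follow a simple back-of-envelope formula: energy consumed (kWh) multiplied by the carbon intensity of the grid (kgCO2e per kWh). The sketch below illustrates that arithmetic; every number in it is an illustrative assumption, not a measurement of any particular model.

```python
# Sketch: back-of-envelope training-emissions estimate.
# kgCO2e = GPUs x kW per GPU x hours x datacentre PUE x grid intensity.
# PUE (power usage effectiveness) accounts for cooling/overhead energy.

def training_emissions_kg(gpu_count, gpu_kw, hours, pue=1.5, intensity=0.4):
    """Estimate CO2e in kg for a training run (all inputs are assumptions)."""
    energy_kwh = gpu_count * gpu_kw * hours * pue
    return energy_kwh * intensity

# e.g. a modest run: 8 GPUs at 0.3 kW each for two weeks (336 hours):
small_run = training_emissions_kg(gpu_count=8, gpu_kw=0.3, hours=336)
```

Even this modest hypothetical run lands in the hundreds of kilograms of CO2e; scaling the GPU count and duration up to frontier-model levels is what produces the car-lifetime figures cited above.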
Developed countries have historically been the leading contributors to rising carbon emissions, but the burden of such emissions unfortunately lands most heavily on developing nations. The global south generally suffers disproportionate environmental crises, such as extreme weather, droughts, floods and pollution, in part because of its limited capacity to invest in climate action.
Developing countries also benefit the least from the advances in AI and all the good it can bring – including building resilience against natural disasters.
Using AI for good
While the developed world is making rapid technological progress, the developing world seems to be underrepresented in the AI revolution. And beyond inequitable growth, the developing world is likely bearing the brunt of the environmental consequences that modern AI models, mostly deployed in the developed world, create.
But it’s not all bad news. According to a 2020 study, AI can help achieve 79% of the targets within the sustainable development goals. For example, AI could be used to measure and predict the presence of contamination in water supplies, thereby improving water quality monitoring processes. This in turn could increase access to clean water in developing countries.
The benefits of AI in the global south could be vast – from improving sanitation to helping with education, to providing better medical care. These incremental changes could have significant flow-on effects. For example, improved sanitation and health services in developing countries could help avert outbreaks of disease.
But if we want to achieve the true value of “good AI”, equitable participation in the development and use of the technology is essential. This means the developed world needs to provide greater financial and technological support to the developing world in the AI revolution. This support will need to be more than short term, but it will create significant and lasting benefits for all.
(The image above is by Jamesteohart / Shutterstock.)
Today, analytics, artificial intelligence (AI), and machine learning (ML) have become big business. Throughout the 2020s, Harvard Business Review estimates that these technologies will add $13 trillion to the global economy, impacting virtually every sector in the process.
One of the biggest drivers of the value-add provided by AI/ML will come from smart cities: cities that leverage enhancements in such technologies to deliver improved services for citizens. Smart cities promise to provide data-driven decisions for essential public services like sanitation, transportation, and communications. In this way, they can help improve the quality of life for both the general public and public sector employees, while also reducing environmental footprints and providing more efficient and more cost-effective public services.
Whether it be improved traffic flow, better waste collection practices, video surveillance, or maintenance schedules for infrastructure – the smart city represents a cleaner, safer, and more affordable future for our urban centers. But realizing these benefits will require us to redefine our approach towards networking, data storage, and the systems underpinning and connecting both. To capitalize on the smart city paradigm, we’ll need to adopt a new and dynamic approach to computing and storage.
Providing bottomless storage for the urban environment
In practice, the smart city will require vast arrays of interconnected devices, whether sensors, networked vehicles, or machinery for service delivery. These will all generate an ever-growing quantity and variety of data that must be processed, stored, and made accessible to the rest of the smart city's network for both ongoing tasks and city-wide analytics. While a smart city may not need access to all the relevant data at once, there's always the possibility of historic data needing to be accessed on recall to help train and calibrate ML models or perform detailed analytics.
All of this means that a more traditional system architecture that processes data through a central enterprise data center – whether it be on-premise or cloud – can’t meet the scaling or performance requirements of the smart city.
This is because, given its geographic removal from the places where data is generated and used, a centralized store can’t be counted on to provide the rapid and reliable service that’s needed for smart city analytics or delivery. Ultimately, the smart city will demand a decentralized approach to data storage. Such a decentralized approach will enable data from devices, sensors, and applications that serve the smart city to be analyzed and processed locally before being transferred to an enterprise data center or the cloud, reducing latency and response times.
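The "process locally, forward only what's needed" pattern can be sketched very simply: an edge node keeps raw sensor readings on-site and ships a compact summary upstream, cutting bandwidth and latency. The structure below is illustrative, not any product's API.

```python
# Sketch: an edge node reduces a window of raw sensor readings to the
# small aggregate the central data store actually needs.

def summarise_window(readings):
    """Collapse raw readings into count / mean / max / min for upload."""
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "max": max(readings),
        "min": min(readings),
    }

# 1,000 raw traffic-sensor readings collapse to a four-field summary:
raw = [50 + (i % 20) for i in range(1000)]
summary = summarise_window(raw)
```

The raw window can still be retained at the edge (or in cheap cold storage) for later recall when retraining models or running detailed analytics.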
To achieve the cost-effectiveness needed when operating at the scale of data variety and volume expected of a smart city, cities will need access to "bottomless clouds": storage arrangements where prices per terabyte are so low that development and IT teams won't need to worry about the costs of provisioning for smart city infrastructure. This gives teams the ability to store all the data they need without the stress of draining their budget, or having to arbitrarily reduce the data pool they'll be able to draw from for smart city applications or analytics.
Freeing up resources for the smart city with IaaS
Infrastructure-as-a-service (IaaS) is based around a simple principle: users should only pay for the resources they actually use. When it comes to computing and storage resources, this is going to be essential to economically deliver on the vision of the smart city, given the ever-expanding need for provisioning while also keeping down costs within the public sector.
For the smart city in particular, IaaS offers managed, on-demand, and secure edge computing and storage services. IaaS will furnish cities with the components needed to deliver on their vision – whether it be storage, virtualization environments, or network structures. Through being able to scale up provisioning based on current demand while also removing the procurement and administrative burden of handling the actual hardware to a specialist third party, smart cities can benefit from economies of scale that have underpinned much of the cloud computing revolution over the past decade.
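The economics of pay-per-use are easy to see with a toy comparison: flat provisioning pays for peak capacity all year, while metered billing tracks actual usage as it grows. The prices and growth curve below are illustrative assumptions, not any provider's actual rates.

```python
# Sketch: provisioned (pay-for-peak) vs metered (pay-per-use) storage cost.

def provisioned_cost(capacity_tb, price_per_tb_month, months=12):
    """Pay for the full provisioned capacity every month, used or not."""
    return capacity_tb * price_per_tb_month * months

def metered_cost(monthly_usage_tb, price_per_tb_month):
    """Pay only for what is actually stored each month."""
    return sum(tb * price_per_tb_month for tb in monthly_usage_tb)

# Storage grows from 10 TB to 120 TB over the year; flat provisioning
# must cover the December peak from January onwards:
usage = [10 * m for m in range(1, 13)]
flat = provisioned_cost(capacity_tb=120, price_per_tb_month=6.0)
pay_per_use = metered_cost(usage, price_per_tb_month=6.0)
```

Under steady growth, metered billing costs roughly half of peak provisioning in this toy model, which is the intuition behind IaaS for an ever-expanding smart city estate.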
In fact, IaaS may be the only way to go, when it comes to ensuring that the data of the smart city is stored and delivered in a reliable way. While handling infrastructure in-house may be tempting from a security perspective, market competition between IaaS providers incentivizes better service provision from all angles, whether customer experience, reliability and redundancy, or the latest standards in security.
Delivering the smart city is a 21st century necessity
The world’s top cities are already transforming to keep up with ever-expanding populations and in turn their ever-expanding needs. Before we know it, various sectors of urban life will have to be connected through intelligent technology to optimize the use of shared resources – not because we want to, but because we need to.
Whether it be a question of social justice, fiscal prudence, or environmental conscience, intelligently allocating and using the resources of the city is the big question facing our urban centers in this century. But the smart city can only be delivered through a smart approach to data handling and storage. Optimizing a city's cloud infrastructure and guaranteeing cost-effective and quality provisioning through IaaS will be essential to delivering on the promise of the smart city, and thus meet some of our time's most pressing challenges.
David Friend is the co-founder and CEO of Wasabi Technologies, a revolutionary cloud storage company. David's first company, ARP Instruments, developed synthesizers used by Stevie Wonder, David Bowie and Led Zeppelin, and even helped Steven Spielberg communicate with aliens by providing the legendary five-note motif in Close Encounters of the Third Kind. Friend founded or co-founded five other companies: Computer Pictures Corporation, an early player in computer graphics; Pilot Software, a company that pioneered multidimensional databases for crunching large amounts of customer data; Faxnet, which became the world's largest provider of fax-to-email services; Sonexis, a VoIP conferencing company; and, immediately prior to Wasabi, what is now one of the world's leading cloud backup companies, Carbonite. David is a respected philanthropist who sits on the board of Berklee College of Music, where a concert hall is named in his honor, and serves as president of the board of Boston Baroque, an orchestra and chorus that has received seven Grammy nominations. An avid mineral and gem collector, he donated the Friend Gem and Mineral Hall at the Yale Peabody Museum of Natural History. David graduated from Yale and attended the Princeton University Graduate School of Engineering, where he was a David Sarnoff Fellow.
Published on 20 September 2021 in E&T, AJ Abdallat asks: how will artificial intelligence power the cities of tomorrow?
How will artificial intelligence power the cities of tomorrow?
By AJ Abdallat, Beyond Limits
Achieving a decarbonised future will require efficiency-boosting measures that AI can help to identify and implement.
Artificial intelligence is taking the stage as smart cities become not just an idea for the future, but a present reality. Advanced technologies are at the forefront of this change, driving valuable strategies and optimising the industry across all operations. These technologies are quickly becoming the solution for fulfilling smart city and clean city initiatives, as well as net-zero commitments.
AI is becoming well integrated with the development of smart cities. A 2018 Gartner report forecast that AI would become a critical feature of 30 per cent of smart city applications by 2020, up from just 5 per cent a few years previously. Implementation of AI is rapidly being recognised as the not-so-secret ingredient helping major energy providers accomplish their lowest-carbon footprints yet, along with unparalleled sustainability and attractive profit margins.
What makes a city ‘smart’ is the collection and analysis of vast amounts of data across numerous sectors, from metropolitan development and utility allocation all the way down to manual functions like city services. Smart cities require the construction and maintenance of arrangements of sensors, equipment and other systems designed to create sustainability and efficiency.
Altering the strategy behind a city’s utilities operations is one of the major keys to making it smarter and more sustainable. AI solutions are already making significant strides where this is concerned. As the CEO of an AI company creating software for the utilities sector, the impact that advanced solutions are already having on the industry is something I’m very excited about.
One real-world example of AI powering smart city utilities is the Nvidia Metropolis platform, which uses intelligent video analytics to improve public services, logistics, and more. Nvidia describes it as being designed to "create more sustainable cities, maintain infrastructure, and improve public services for residents and communities." The company collects data from sensors and other IoT devices city-wide to provide insights that can lead to improvements in areas like disaster response, asset protection, supply forecasting and traffic management.
Another solution at the forefront of building smarter cities is a project led by Xcell Security House and Finance SA that aims to build the world’s first power plant guided by cognitive AI, driving utility development in West Africa. As the earliest implementation of an AI-powered plant from the ground up, it will employ advanced sensor-placement technology and techniques that embed knowledge and expertise into every part of the facility’s processes. Stakeholders will have streamlined access to facility-scale insights, creating a plant environment with greater risk mitigation as well as maximised efficiency and productivity.
These are just two of many emerging applications of AI in smart city development. When applying AI, the sector also stands to achieve greater cost and operational efficiencies in several key areas such as predictive maintenance, load forecasting/optimisation, grid reliability, energy theft prevention and renewable resource optimisation.
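Of those areas, load forecasting is the easiest to illustrate. The sketch below implements a "seasonal naive" baseline: predict the next hour's demand from the same hour on previous days. Real utility forecasters are far richer (weather, calendars, neural models); this only shows the core idea, and all the demand numbers are made up.

```python
# Sketch: seasonal-naive load forecast - predict the next hour's demand
# as the mean of the same hour of day across all previous days.

def seasonal_naive_forecast(hourly_load, period=24):
    """Forecast the value at index len(hourly_load) from same-hour history."""
    target_hour = len(hourly_load) % period
    same_hour = [v for i, v in enumerate(hourly_load) if i % period == target_hour]
    return sum(same_hour) / len(same_hour)

# Three identical days of a simple daily demand curve (arbitrary MW values);
# the forecast for hour 0 of day four should match hour 0 of earlier days:
day = [30, 28, 27, 26, 25, 27, 35, 50, 60, 62, 61, 60,
       58, 57, 56, 57, 60, 68, 72, 70, 62, 50, 40, 33]
history = day * 3
forecast = seasonal_naive_forecast(history)
```

Baselines like this are also how the value of the fancier AI models is measured: a model only earns its keep by beating the naive forecast.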
When discussing energy efficiency, many factors enter the picture, including the impact of environmental factors as commonplace as temperature and humidity levels. Historically, experienced human operators were best equipped to identify efficiency-boosting adjustments. Today, cognitive AI is making moves to encode that human knowledge and expertise across providers’ entire operations, delivering recommendations at a moment’s notice. Explainable AI creates the trust necessary for operators, engineers and stakeholders to solve acute issues quickly. The system’s shrewd situational awareness helps detect, foresee and solve problems, even when circumstances are in constant flux – scenarios as critical as an entire city’s water and power supply.
AI is already playing a principal role in supporting the move towards smarter cities by helping entire sectors get closer to efficiency and net-zero objectives. Achieving a decarbonised future will require more resourceful processes that boost efficiency and reduce waste. AI for utilities can elevate productivity, draw more attention to resource consumption, and hasten the adoption of renewable, carbon-friendly strategies on a global scale.
According to a report from IDC, smart city technology spending across the globe reached $80 billion in 2016 and is expected to grow to $135 billion by 2021. It is imperative that companies, industries, and other entities looking to participate in this important stage of digital transformation seek out industrial-grade AI companies with software that provides holistic, organisation/sector/city-wide insights through sensor placement technology and data collection techniques.
Governments at every level, as well as public and private organisations, are facilitating technological implementation and digital transformation. Private and public partnerships have become a major mechanism by which cities can adopt technology that makes them smarter. The best course of action is to embrace AI that blends knowledge-based reasoning with advanced digitalisation techniques, helping stakeholders distinguish unanticipated scenarios and make tough choices.
Choosing the most dynamic form of AI to transform the utilities sector will contribute remarkably to the development of smart cities. Enhanced communication, strengthened collaboration, increased fuel savings and decreased waste will help companies – particularly in high-value industries – to increase their profits. Indelible process improvements, like streamlined operational capacities where all facilities function more efficiently in harmony, are the future of smart city technology.