Reducing costs, enabling more performant (new) energy businesses, and coordinating multiple energy players are crucial to this transformation. However, we are still in the early stages of AI/ML. How can we achieve rapid AI/ML adoption at scale?
Why there is no energy transition without Intelligence Intensity
For the green deal to succeed, we need to move towards a whole-system approach that interconnects sectors, from diverse energy carriers to industries, transport, and buildings. This drives Power-to-X, industrial clusters, industrial smart steering, 24/7 green energy matching, hybrid energy parks, and new low-carbon energy value chains, leading to billions of networked “things”. It requires flexible yet complex coordination that is close to real-time and optimised for multiple, varying stakeholder interests, something humans cannot do alone.
AI/ML plays a key role in reducing the gigantic investments the energy transition requires: it can lower the levelised cost of energy, accelerate the issuing of permits and grid connections, and optimise yield, thus speeding up the deployment of the massive renewable generation required. Grid capacity can be expanded digitally, avoiding traditional grid reinforcements that are expensive and time-consuming to build. AI/ML also enables the coordination of flexibility services for maximum DER value and infrastructure usage.
Microsoft is fully committed to rapid AI/ML adoption at scale, which is already becoming a technical reality with higher usage than anticipated. Partnerships and co-innovation with clients, partners, and the wider ecosystem accelerate the creation of missing digital solutions and the development of digital accelerators for wider, faster, and simpler adoption.
Accelerating AI/ML innovation through open data platforms, open ecosystems, and open source
AI/ML needs a lot of data! Strengthened open energy data platforms give innovators across the ecosystem safe, scalable, and performant access to the vast volumes of quality data essential for training AI models. Microsoft joined OSDU (Open Subsurface Data Universe) to create an open-source, cloud-agnostic platform that collects subsurface data from O&G operations, valuable not only to O&G operators but also to renewable offshore players.
Energy data hubs in Europe also play a vital role in driving innovation. This is why Microsoft and Energinet partnered to co-create the open-source Green Energy Hub blueprints on GitHub, enabling experts to contribute and others to develop their own data hubs, creating an accelerator for future smart green solutions.
With AI still in its early stages, it is key to show energy players its successful, tangible impact and to facilitate access to solutions. Microsoft launched the Open AI Energy Initiative (OAI), an open ecosystem in which operators, independent software vendors, and equipment providers offer additional solutions, and the global AI Centre of Excellence for Energy, Microsoft Energy Core, features over 40 partner solutions.
The driving co-innovation force of strategic partnerships with energy leaders
Strategic partnerships with market makers not only accelerate transformation but also enable deeper and wider co-investment in leading-edge digital solutions, both for current operations and for the complex chain orchestration a successful energy transition needs. As a result, foundational research for AI in energy and energy-specific platform-based capabilities are developed faster.
These intelligence-intense, leading-edge lighthouse use cases inform the industry for fast followers and create digital optimism for speed. Together we become a driving force for the formation of new value chains, ecosystems, and business models that accelerate meeting the goals of the green agenda.
Utilities-specific digital accelerators for wider, faster, and simpler adoption
Energy players want more pre-built, utilities-specific capabilities for faster time to market with AI/ML models. The utilities-specific industry data models acquired from ADRM, refined over 15 years, exemplify this: they are now being enriched with automated data ingestion from multiple sources, addressing a major data hurdle.
Another example is common domain-specific ontologies, which are fundamental to accelerating the development of digital twin solutions. Microsoft, together with Agder Energi, launched the open-source Energy Grid Ontology, which others can extend for smart cities and smart buildings.
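Digital-twin ontologies of this kind are typically expressed in a machine-readable modelling language; the Energy Grid Ontology, for instance, is published as DTDL (Digital Twins Definition Language) models. As a rough sketch only, here is what a DTDL-style interface for a grid asset could look like, represented as a Python dict. Every identifier, field, and value below is invented for illustration and is not taken from the actual ontology.

```python
# Hypothetical, simplified sketch of a DTDL-style digital twin interface
# for a substation. Model IDs and content names are invented, not taken
# from the real Energy Grid Ontology.
substation_interface = {
    "@id": "dtmi:example:grid:Substation;1",  # hypothetical model ID
    "@type": "Interface",
    "displayName": "Substation",
    "contents": [
        {"@type": "Property", "name": "ratedCapacityMVA", "schema": "double"},
        {"@type": "Telemetry", "name": "loadMW", "schema": "double"},
        {"@type": "Relationship", "name": "feeds",
         "target": "dtmi:example:grid:Feeder;1"},
    ],
}

def content_names(interface):
    """Return the property/telemetry/relationship names an interface declares."""
    return [c["name"] for c in interface["contents"]]
```

The value of a shared ontology is exactly this shape: once every party agrees on the interface, tooling can discover an asset's properties, telemetry, and relationships without bespoke integration work.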
More broadly, the road ahead is for industry clouds. Energy players can focus much higher in the technology stack at the business applications layer, thus shortening innovation cycles, getting faster into the predictive era, and simplifying adoption.
Through co-investment, Microsoft is accelerating the development of energy-specific platform-based capabilities, allowing energy players to focus their AI efforts at the business applications level, such as portfolio optimisation, risk management, and trading.
As of October 2021, 44 countries were reported to have their own national AI strategic plans, showing their willingness to forge ahead in the global AI race. These include emerging economies like China and India, which are leading the way in building national AI plans within the developing world.
Oxford Insights, a consultancy firm that advises organisations and governments on matters relating to digital transformation, has ranked the preparedness of 160 countries across the world when it comes to using AI in public services. The US ranks first in their 2021 Government AI Readiness Index, followed by Singapore and the UK.
Notably, the lowest-scoring regions in this index include much of the developing world, such as sub-Saharan Africa, the Caribbean and Latin America, as well as some central and south Asian countries.
The developed world has an inevitable edge in making rapid progress in the AI revolution. With greater economic capacity, these wealthier countries are naturally best positioned to make large investments in the research and development needed for creating modern AI models.
In contrast, developing countries often have more urgent priorities, such as education, sanitation, healthcare and feeding the population, which override any significant investment in digital transformation. In this climate, AI could widen the digital divide that already exists between developed and developing countries.
The hidden costs of modern AI
AI is traditionally defined as “the science and engineering of making intelligent machines”. To solve problems and perform tasks, AI models generally look at past information and learn rules for making predictions based on unique patterns in the data.
AI is a broad term, and its dominant approach today is machine learning, with deep learning as a subfield. While simpler machine learning models tend to suit smaller, well-organised datasets, deep learning algorithms are better suited to complex, real-world problems, for example predicting respiratory diseases from chest X-ray images.
Crucially, neural networks are data hungry, often requiring millions of examples to learn how to perform a new task well. This means they require a complex infrastructure of data storage and modern computing hardware, compared to simpler machine learning models. Such large-scale computing infrastructure is generally unaffordable for developing nations.
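To see why neural networks are so data hungry, consider how quickly their parameter counts grow. The sketch below counts the weights and biases of a small fully connected network and applies a common rule of thumb, roughly ten labelled examples per parameter, which is a heuristic assumption for illustration, not a law.

```python
def mlp_param_count(layer_sizes):
    """Total weights plus biases of a fully connected network
    with the given layer widths."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

# A modest image classifier: flattened 224x224 grayscale input,
# two hidden layers, 10 output classes.
params = mlp_param_count([224 * 224, 512, 128, 10])

# Rule of thumb (an assumption, not a law): ~10 labelled examples
# per parameter to train without severe overfitting.
examples_needed = 10 * params
```

Even this toy network has over 25 million parameters, which is why serious deep learning quickly demands the millions of examples, and the storage and compute to match, described above.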
Beyond the hefty price tag, another issue that disproportionately affects developing countries is the growing toll this kind of AI takes on the environment. For example, a contemporary neural network costs upwards of US$150,000 to train, and will create around 650kg of carbon emissions during training (comparable to a trans-American flight). Training a more advanced model can lead to roughly five times the total carbon emissions generated by an average car during its entire lifetime.
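The carbon figure cited above follows from simple arithmetic: energy consumed during training multiplied by the carbon intensity of the grid supplying it. Both numbers in the sketch below are assumed for illustration; real training runs and real grids vary widely.

```python
# Back-of-envelope CO2 estimate for one training run.
# Both inputs are assumptions chosen for illustration, not measurements.
training_energy_kwh = 1_500        # assumed electricity use of the run
grid_intensity_kg_per_kwh = 0.43   # assumed average grid carbon intensity

emissions_kg = training_energy_kwh * grid_intensity_kg_per_kwh  # ~645 kg
```

With these assumed inputs the estimate lands near the ~650 kg figure quoted above; training on a coal-heavy grid, or training a much larger model, scales the result up proportionally.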
Developed countries have historically been the leading contributors to rising carbon emissions, but the burden of such emissions unfortunately lands most heavily on developing nations. The global south generally suffers disproportionate environmental crises, such as extreme weather, droughts, floods and pollution, in part because of its limited capacity to invest in climate action.
Developing countries also benefit the least from the advances in AI and all the good it can bring – including building resilience against natural disasters.
Using AI for good
While the developed world is making rapid technological progress, the developing world seems to be underrepresented in the AI revolution. And beyond inequitable growth, the developing world is likely bearing the brunt of the environmental consequences that modern AI models, mostly deployed in the developed world, create.
But it’s not all bad news. According to a 2020 study, AI can help achieve 79% of the targets within the sustainable development goals. For example, AI could be used to measure and predict the presence of contamination in water supplies, thereby improving water quality monitoring processes. This in turn could increase access to clean water in developing countries.
The benefits of AI in the global south could be vast – from improving sanitation to helping with education, to providing better medical care. These incremental changes could have significant flow-on effects. For example, improved sanitation and health services in developing countries could help avert outbreaks of disease.
But if we want to achieve the true value of “good AI”, equitable participation in the development and use of the technology is essential. This means the developed world needs to provide greater financial and technological support to the developing world in the AI revolution. This support will need to be more than short term, but it will create significant and lasting benefits for all.
This AMEinfo article covers the 2022 technology predictions of two senior executives from Veeam Software, a privately held US-based information technology company.
Danny Allan, Chief Technology Officer and Senior Vice President of Product Strategy at Veeam, and Claude Schuck, Regional Director, Middle East at Veeam, offer their outlook for the technology sector in 2022.
AI and automation will replace entry-level jobs in the finance, healthcare, legal and software industries
Privacy-focused legislation will shift attention to data sovereignty clouds
Digital transformation powers ahead thanks to containers
We start with Danny Allan and his six technology predictions:
1- Acquisitions will stagnate as company valuations outstrip available assets
In 2021, global M&A activity reached new highs aided by low-interest rates and high stock prices. In 2022, we will see that momentum shift. Larger acquisitions will be few and far between as company valuations continue to rise. Only well-established, cash-rich companies will have the money required to make new purchases. The higher purchase threshold will make it harder for medium- and small-sized companies to grow and evolve, giving the advantage to larger, established firms.
2- AI and automation will replace entry-level jobs in the finance, healthcare, legal and software industries
The talent shortage will leave many jobs unfilled, making way for the advancement of artificial intelligence and automation to fill new roles. We have seen technology begin its takeover in the service industry with the introduction of robotic waiters during the pandemic. In 2022, we will see AI and automation capable of filling positions in other hard-hit sectors like the finance, healthcare, legal and software industries. These developments will mostly affect entry-level positions, like interns, making it harder for recent graduates entering the workforce to gain job experience in the future.
3- CI/CD will stabilize and standardize to become an IT team requirement
The Bill Gates memo in 2001 became the industry standard in how to design, develop and deliver complex software systems – and today it feels like there has been no standard since then. IT teams and developers fell into habits of adopting “known” technology systems, and not standardizing in new spaces, like continuous integration and continuous delivery (CI/CD). In 2022, we’re going to see a shift towards more stability and standardization for CI/CD. IT leaders have an opportunity to capitalize on this high-growth and high-valuation market to increase deployment activity and solve the “day two operations problem.”
4- Tech’s labor market will be met with big money and big challenges
The COVID-19 economy, and the subsequent great resignation, certainly made its mark on the tech industry over the last two years. As turnover continues and employee retention falls, tech salaries will begin to grow in 2022 to incentivize talent to stay. I see this creating an interesting dynamic and bigger challenges, especially for the startup and VC world. The big tech giants are the ones who can meet the high dollar demand and deliver benefits for a competitive workforce. It will be interesting to see in the years ahead what this does for innovation, which tends to come from hungry startups where people work for very little for a long time. We could well see a resurgence of tech talent returning to the “old guard” companies for stable (and large) salaries, forgoing the competitive hard knocks of startups, which could create a skill and talent gap that lasts for years to come.
5- New privacy-focused legislation will shift attention to data sovereignty clouds
With increased focus on General Data Protection Regulation (GDPR) regulating data protection and privacy in the EU and the California Consumer Privacy Act (CCPA) enhancing privacy rights and consumer protection for Californians, other states and countries are facing pressure to enact comprehensive data privacy legislation. As this continues in 2022, I expect we’ll see much more focus on data sovereignty clouds to keep data within nations or within a certain physical location. This is a far more specified cloud model that we’re starting to see in EMEA with Gaia-X. Some will see this as an obstacle, but once implemented, this will be a good thing as it puts consumer privacy at the core of the business strategy.
6- Containers will become mainstream to support the cloud explosion of 2021
Businesses wrongly predicted that employees would return to the office, as normal, in 2021. Instead, remote working continued, and companies were forced to develop long-term remote working strategies to ensure efficiency, sustainability and to retain employees seeking flexibility. This remote work strategy demanded cloud-based solutions, resulting in an explosion of cloud service adoption. To meet this moment, containers will become mainstream in 2022, making the generational shift to cloud much easier and more streamlined for organizations.
Next, Claude Schuck offers his four security predictions.
Every enterprise in the Middle East looking to build a strategy around Modern Data Protection should keep the three important pillars in mind – Cloud, Security, and Containers. Businesses need to have a good understanding of what the cloud brings to an organization and why it is important. Secondly, before the pandemic, we had a centralized office where employees were all in one place. With decentralization now, the boundaries of the organization have become invisible. Data is all over the place, necessitating a need for a comprehensive security strategy to safeguard all entry points. And finally, we see an increased interest in Kubernetes as a critical piece of an enterprise’s cloud infrastructure. This has created a new area around container-native data protection that needs addressing.
1- Accelerated adoption of Cloud technologies
Although the cloud is not yet mainstream in the region, adoption is expected to witness significant growth in the Middle East as enterprises begin to “trust” in-country offerings with the big public cloud players like Microsoft Azure and Amazon Web Services having opened data centers in the Middle East. Gartner forecasts end-user spending on public cloud services in the MENA region to grow 19% in 2022. Another big trend we see is that many governments across the Middle East are creating their own ‘Government Cloud’ in order to have control over their data and not letting it reside in the public realm. With this acceleration, Veeam is investing in more headcount in the region to be able to assist organizations as they transition to the cloud.
2- Security: cybercrime
In the Middle East, security will always be a top priority. Not only can cyberattacks affect day-to-day business, impact revenue, and create other problems, but above all, it affects the brand reputation and workforce. Enterprises will continue to invest and safeguard themselves against the ever-growing increase in cyberattacks, especially ransomware. Although organizations in the Middle East, in general, spend a lot on security technologies, there is a huge gap when it comes to planning and executing a security strategy. This mainly boils down to the complexity of the IT environment. There are still a lot of legacy systems. Protecting these complicated environments is a big challenge and becomes even more so in the transition phase of moving to the cloud. Regional CISOs need to have a stringent security program in place which includes important elements like stress testing of IT Systems, backups, a disaster recovery strategy, and educating employees to become the first line of defense for improving organizational resilience.
3- Security: data privacy and protection
In early September 2021, the UAE announced the introduction of a new federal data protection law. With this, data privacy and security are set to take center stage as consumers demand transparency and their “right” to be forgotten. By having the option of opting out, consumers can ensure that their data is being handled in a correct way and they are not targeted by organizations. But more importantly, international corporations that are based in the UAE and the Middle East can be assured that policies are being applied when it comes to data in-country – whether it be in terms of the way data is stored, IP is managed, or how customer and consumer data is protected.
4- Digital transformation powers ahead thanks to containers
The rapid adoption of containers in enterprises, the need for on-demand resources, and the flexibility of workloads will drive digital transformation. The lack of skilled resources and understanding of the technology is a big challenge for enterprises in the Middle East. Veeam, through its acquisition of Kasten, is simplifying container strategy and delivering the industry’s leading Cloud Data Management platform, which supports data protection for container-based applications built in Kubernetes environments.
Today, analytics, artificial intelligence (AI), and machine learning (ML) have become big business. Throughout the 2020s, Harvard Business Review estimates that these technologies will add $13 trillion to the global economy, impacting virtually every sector in the process.
One of the biggest drivers of the value-add provided by AI/ML will come from smart cities: cities that leverage enhancements in such technologies to deliver improved services for citizens. Smart cities promise to provide data-driven decisions for essential public services like sanitation, transportation, and communications. In this way, they can help improve the quality of life for both the general public and public sector employees, while also reducing environmental footprints and providing more efficient and more cost-effective public services.
Whether it be improved traffic flow, better waste collection practices, video surveillance, or maintenance schedules for infrastructure – the smart city represents a cleaner, safer, and more affordable future for our urban centers. But realizing these benefits will require us to redefine our approach towards networking, data storage, and the systems underpinning and connecting both. To capitalize on the smart city paradigm, we’ll need to adopt a new and dynamic approach to computing and storage.
Providing bottomless storage for the urban environment
In practice, the smart city will require vast arrays of interconnected devices, whether sensors, networked vehicles, or machinery for service delivery. These will all generate an ever-growing quantity and variety of data that must be processed, stored, and made accessible to the rest of the smart city’s network for both ongoing tasks and city-wide analytics. While a smart city may not need access to all the relevant data at once, there is always the possibility that historic data will need to be recalled to help train and calibrate ML models or perform detailed analytics.
All of this means that a more traditional system architecture that processes data through a central enterprise data center, whether on-premises or in the cloud, can’t meet the scaling or performance requirements of the smart city.
This is because, given its geographic removal from the places where data is generated and used, a centralized store can’t be counted on to provide the rapid and reliable service that’s needed for smart city analytics or delivery. Ultimately, the smart city will demand a decentralized approach to data storage. Such a decentralized approach will enable data from devices, sensors, and applications that serve the smart city to be analyzed and processed locally before being transferred to an enterprise data center or the cloud, reducing latency and response times.
To achieve the cost-effectiveness needed when operating at the scale of data variety and volume expected of a smart city, cities will need access to “bottomless clouds”: storage arrangements where prices per terabyte are so low that development and IT teams won’t need to worry about the costs of provisioning for smart city infrastructure. This gives teams the ability to store all the data they need without the stress of draining their budget, or having to arbitrarily shrink the data pool available for smart city applications or analytics.
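The economics here are easy to sanity-check with back-of-envelope math. The sketch below uses entirely hypothetical figures, fleet size, data rates, retention period, and per-terabyte prices are all assumptions, not quotes from any provider, but it shows how quickly retained sensor data accumulates and why the per-terabyte price dominates the budget.

```python
def monthly_storage_cost(total_tb, price_per_tb):
    """Flat per-terabyte monthly pricing; ignores egress and API fees."""
    return total_tb * price_per_tb

# Hypothetical city fleet: 50,000 sensors, each producing 2 GB per month,
# with 24 months of data retained at any time.
fleet_tb = 50_000 * 2 / 1024 * 24          # ~2,344 TB retained

hot_tier = monthly_storage_cost(fleet_tb, 23.0)  # assumed ~$23/TB/month
low_cost = monthly_storage_cost(fleet_tb, 6.0)   # assumed ~$6/TB/month
```

Under these assumptions the same retained dataset costs tens of thousands of dollars per month either way, but the low-cost tier saves roughly three-quarters of the bill, which is the difference between keeping all the data and arbitrarily pruning it.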
Freeing up resources for the smart city with IaaS
Infrastructure-as-a-service (IaaS) is based around a simple principle: users should only pay for the resources they actually use. When it comes to computing and storage resources, this is going to be essential to economically deliver on the vision of the smart city, given the ever-expanding need for provisioning while also keeping down costs within the public sector.
For the smart city in particular, IaaS offers managed, on-demand, and secure edge computing and storage services. IaaS will furnish cities with the components needed to deliver on their vision – whether it be storage, virtualization environments, or network structures. Through being able to scale up provisioning based on current demand while also removing the procurement and administrative burden of handling the actual hardware to a specialist third party, smart cities can benefit from economies of scale that have underpinned much of the cloud computing revolution over the past decade.
In fact, IaaS may be the only way to go, when it comes to ensuring that the data of the smart city is stored and delivered in a reliable way. While handling infrastructure in-house may be tempting from a security perspective, market competition between IaaS providers incentivizes better service provision from all angles, whether customer experience, reliability and redundancy, or the latest standards in security.
Delivering the smart city is a 21st century necessity
The world’s top cities are already transforming to keep up with ever-expanding populations and in turn their ever-expanding needs. Before we know it, various sectors of urban life will have to be connected through intelligent technology to optimize the use of shared resources – not because we want to, but because we need to.
Whether it be a question of social justice, fiscal prudence, or environmental conscience, intelligently allocating and using the resources of the city is the big question facing our urban centers in this century. But the smart city can only be delivered through a smart approach to data handling and storage. Optimizing a city’s cloud infrastructure and guaranteeing cost-effective and quality provisioning through IaaS will be essential to delivering on the promise of the smart city, and thus meet some of our time’s most pressing challenges.
David Friend is the co-founder and CEO of Wasabi Technologies, a cloud storage company. David’s first company, ARP Instruments, developed synthesizers used by Stevie Wonder, David Bowie, and Led Zeppelin, and even helped Steven Spielberg communicate with aliens by providing the legendary five-note motif in Close Encounters of the Third Kind. Friend founded or co-founded five other companies: Computer Pictures Corporation, an early player in computer graphics; Pilot Software, which pioneered multidimensional databases for crunching large amounts of customer data; Faxnet, which became the world’s largest provider of fax-to-email services; Sonexis, a VoIP conferencing company; and, immediately prior to Wasabi, Carbonite, now one of the world’s leading cloud backup companies. David is a respected philanthropist: he sits on the board of Berklee College of Music, where a concert hall is named in his honor, and serves as president of the board of Boston Baroque, an orchestra and chorus that has received seven Grammy nominations. An avid mineral and gem collector, he donated the Friend Gem and Mineral Hall at the Yale Peabody Museum of Natural History. David graduated from Yale and attended the Princeton University Graduate School of Engineering, where he was a David Sarnoff Fellow.
Published on 20 September 2021 in E&T, AJ Abdallat asks: how will artificial intelligence power the cities of tomorrow?
How will artificial intelligence power the cities of tomorrow?
By AJ Abdallat, Beyond Limits
Achieving a decarbonised future will require efficiency-boosting measures that AI can help to identify and implement.
Artificial intelligence is taking the stage as smart cities become not just an idea for the future, but a present reality. Advanced technologies are at the forefront of this change, driving valuable strategies and optimising the industry across all operations. These technologies are quickly becoming the solution for fulfilling smart city and clean city initiatives, as well as net-zero commitments.
AI is becoming well integrated with the development of smart cities. A 2018 Gartner report forecast that AI would become a critical feature of 30 per cent of smart city applications by 2020, up from just 5 per cent a few years previously. Implementation of AI is rapidly being recognised as the not-so-secret ingredient helping major energy providers accomplish their lowest-carbon footprints yet, along with unparalleled sustainability and attractive profit margins.
What makes a city ‘smart’ is the collection and analysis of vast amounts of data across numerous sectors, from metropolitan development and utility allocation all the way down to manual functions like city services. Smart cities require the construction and maintenance of arrangements of sensors, equipment and other systems designed to create sustainability and efficiency.
Altering the strategy behind a city’s utilities operations is one of the major keys to making it smarter and more sustainable. AI solutions are already making significant strides where this is concerned. As the CEO of an AI company creating software for the utilities sector, the impact that advanced solutions are already having on the industry is something I’m very excited about.
One real-world example of AI powering smart city utilities is the Nvidia Metropolis platform, which uses intelligent video analytics to improve public services, logistics, and more. Nvidia describes it as being designed to: “create more sustainable cities, maintain infrastructure, and improve public services for residents and communities.” The company collects data from sensors and other IoT devices, city-wide, to provide insights that can lead to improvements in areas like disaster response, asset protection, supply forecasting and traffic management.
Another solution at the forefront of building smarter cities is a project led by Xcell Security House and Finance SA that aims to build the world’s first power plant guided by cognitive AI, driving utility development in West Africa. As the earliest implementation of an AI-powered plant from the ground up, it will employ advanced sensor-placement technology and techniques that embed knowledge and expertise into every part of the facility’s processes. Stakeholders will have streamlined access to facility-scale insights, creating a plant environment with greater risk mitigation as well as maximised efficiency and productivity.
These are just two of many emerging applications of AI in smart city development. When applying AI, the sector also stands to achieve greater cost and operational efficiencies in several key areas such as predictive maintenance, load forecasting/optimisation, grid reliability, energy theft prevention and renewable resource optimisation.
When discussing energy efficiency, many factors enter the picture, including the impact of environmental factors as commonplace as temperature and humidity levels. Historically, experienced human operators were best equipped to identify efficiency-boosting adjustments. Today, cognitive AI is making moves to encode that human knowledge and expertise across providers’ entire operations, delivering recommendations at a moment’s notice. Explainable AI creates the trust necessary for operators, engineers and stakeholders to solve acute issues quickly. The system’s shrewd situational awareness helps detect, foresee and solve problems, even when circumstances are in constant flux – scenarios as critical as an entire city’s water and power supply.
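One common way to encode operator knowledge as explainable recommendations is a simple rule system, where each rule pairs a condition with a human-readable justification. The sketch below is a minimal illustration of that style; the thresholds and rule text are invented, and real cognitive AI systems are far richer, but the principle of every recommendation carrying its own explanation is the same.

```python
# Minimal sketch of operator heuristics encoded as explainable rules.
# Thresholds and messages are invented for illustration.
RULES = [
    (lambda r: r["humidity_pct"] > 80 and r["temp_c"] > 35,
     "Derate turbine output: high heat and humidity reduce cooling efficiency."),
    (lambda r: r["load_mw"] > 0.9 * r["capacity_mw"],
     "Dispatch flexible reserves: load is above 90% of capacity."),
]

def recommend(reading):
    """Return the justification of every rule whose condition fires.

    Because each recommendation carries its own human-readable reason,
    operators can see *why* the system suggested an action."""
    return [msg for cond, msg in RULES if cond(reading)]
```

For example, a reading with 85% humidity, 40°C, and load at 95% of capacity would trigger both rules, and an operator sees not just two alerts but the reasoning behind each one, which is what builds the trust the paragraph above describes.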
AI is already playing a principal role in supporting the move towards smarter cities by helping entire sectors get closer to efficiency and net-zero objectives. Achieving a decarbonised future will require more resourceful processes that boost efficiency and reduce waste. AI for utilities can elevate productivity, yielding more attention around resource consumption, and hastening the adoption of renewable, carbon-friendly strategies on a global scale.
According to a report from IDC, smart city technology spending across the globe reached $80 billion in 2016 and is expected to grow to $135 billion by 2021. It is imperative that companies, industries, and other entities looking to participate in this important stage of digital transformation seek out industrial-grade AI companies with software that provides holistic, organisation/sector/city-wide insights through sensor placement technology and data collection techniques.
Governments at every level, as well as public and private organisations, are facilitating technological implementation and digital transformation. Private and public partnerships have become a major mechanism by which cities can adopt technology that makes them smarter. The best course of action is to embrace AI that blends knowledge-based reasoning with advanced digitalisation techniques, helping stakeholders distinguish unanticipated scenarios and make tough choices.
Choosing the most dynamic form of AI to transform the utilities sector will contribute remarkably to the development of smart cities. Enhanced communication, strengthened collaboration, increased fuel savings and decreased waste will help companies – particularly in high-value industries – to increase their profits. Indelible process improvements, like streamlined operational capacities where all facilities function more efficiently in harmony, are the future of smart city technology.