Egypt is working on formulating a strategy for artificial intelligence (AI) that will include the establishment of the country’s first faculty of artificial intelligence (FAI) and an artificial intelligence academy in the coming academic year, in a bid to produce the scientific workforce needed to develop a sustainable knowledge-based economy.
The FAI will start enrolling students in the next academic year, 2019-20, as a centre of excellence for artificial intelligence research, teaching and training.
Besides establishing an artificial intelligence academy specialising in innovation and new thinking in artificial intelligence, several AI departments will also be set up at higher education institutions to develop capacity and boost innovations.
AI is the science of developing computer systems capable of carrying out tasks that normally require human intelligence.
According to a 2017 PricewaterhouseCoopers (PwC) report entitled The Potential Impact of AI in the Middle East, it is estimated that 7.7% of Egypt’s gross domestic product could come from the AI sector by 2030.
“We estimate that the Middle East is expected to accrue 2% of the total global benefits of AI in 2030. This is equivalent to US$320 billion,” the report stated.
“In the wake of the fourth industrial revolution, governments and businesses across the Middle East are beginning to realise the shift globally towards AI and advanced technologies.
“They are faced with a choice between being a part of the technological disruption, or being left behind. When we look at the economic impact for the region, being left behind is not an option.”
The biggest opportunity for AI in the Middle East and Africa region is in the financial sector, where an estimated 25% of all AI investment predicted for the region in 2021, or US$28.3 million, will be spent on developing AI solutions. This is followed by public services, including education, among other sectors, according to the PwC report.
Samir Khalaf Abd-El-Aal, a science expert at the National Research Centre in Cairo, welcomed news of the FAI as a “pioneering initiative” that will have an impact on Egypt as well as North Africa.
“It is a good step forward for raising awareness of the potential of AI for sustainable development as well as contributing in facing regional challenges to fully harness the deployment of AI, including infrastructure, skills, knowledge gaps, research capacities and availability of local data,” Abd-El-Aal told University World News.
“The FAI is an important initiative in training students in AI, which will become one of the tools of future jobs, as well as building AI applications in Arabic, which can easily go to all Arabic-speaking countries including North African states.”
“The FAI could also act as a regional focal point for carrying out mapping for local artificial intelligence start-ups, research centres and civil society organisations as well as serving as an incubator for skills development and promoting AI entrepreneurship oriented towards solving North African problems,” Abd-El-Aal said.
Virtual science hub
The Egyptian government also announced the launch of a virtual science hub at the Forum. The hub, affiliated with the Academy of Scientific Research and Technology at the Ministry of Higher Education and Scientific Research, aims to enable the integration, management and planning of Egyptian technological resources on the international information network, and includes an integrated database of all Egyptian technological resources.
It also includes all scientific and technical resources as well as material assets and academic research contributions, which will make it possible to measure the degree of technological readiness of all Egyptian academic and research institutions. The general objective of the system is to provide the necessary information to support decision-makers in research projects and to facilitate the follow-up of research activities.
I can still recall my surprise when a book by evolutionary biologist Peter Lawrence entitled “The making of a fly” came to be priced on Amazon at $23,698,655.93 (plus $3.99 shipping). While my colleagues around the world must have become rather depressed that an academic book could achieve such a feat, the steep price was actually the result of algorithms feeding off each other and spiralling out of control. It turns out, it wasn’t just sales staff being creative: algorithms were calling the shots.
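The mechanics of that runaway price are easy to reproduce. Below is a minimal sketch using the repricing multipliers reported at the time: one seller slightly undercut its rival each day, while the other priced at roughly 1.27 times the rival’s price. The starting price and day count are invented for illustration.

```python
# Illustrative reconstruction of two repricing bots feeding off each other.
# The multipliers are those reported in the "Making of a Fly" incident; the
# starting price and duration are invented.

def simulate(price_a, price_b, days):
    for _ in range(days):
        price_a = 0.9983 * price_b      # seller A: undercut seller B slightly
        price_b = 1.270589 * price_a    # seller B: price above seller A
    return price_a, price_b

a, b = simulate(17.99, 17.99, 30)
print(f"After 30 days: A = ${a:,.2f}, B = ${b:,.2f}")
```

Because the combined daily factor is about 1.268, the price grows exponentially: after a month both listings are in the tens of thousands of dollars, with neither “seller” ever making a decision.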
This eye-catching example was spotted and corrected. But what if such algorithmic interference happens all the time, including in ways we don’t even notice? If our reality is becoming increasingly constructed by algorithms, where does this leave us humans?
Inspired by such examples, my colleague Prof Allen Lee and I recently set out to explore the deeper effects of algorithmic technology in a paper in the Journal of the Association for Information Systems. Our exploration led us to the conclusion that, over time, the roles of information technology and humans have been reversed. In the past, we humans used technology as a tool. Now, technology has advanced to the point where it is using and even controlling us.
We humans are not merely cut off from the decisions that machines are making for us but deeply affected by them in unpredictable ways. Instead of being central to the system of decisions that affects us, we are cast out into its environment. We have progressively restricted our own decision-making capacity and allowed algorithms to take over. We have become artificial humans, or human artefacts, that are created, shaped and used by the technology.
Examples abound. In law, legal analysts are gradually being replaced by artificial intelligence, meaning the successful defence or prosecution of a case can rely partly on algorithms. Software has even been allowed to predict future criminals, ultimately controlling human freedom by shaping how parole is denied or granted to prisoners. In this way, the minds of judges are being shaped by decision-making mechanisms they cannot understand because of how complex the process is and how much data it involves.
In the job market, excessive reliance on technology has led some of the world’s biggest companies to filter CVs through software, meaning human recruiters will never even glance at some potential candidates’ details. Not only does this put people’s livelihoods at the mercy of machines, it can also build in hiring biases that the company had no desire to implement, as happened with Amazon.
In news, what’s known as automated sentiment analysis analyses positive and negative opinions about companies based on different web sources. In turn, these are being used by trading algorithms that make automated financial decisions, without humans having to actually read the news.
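As a toy illustration of that pipeline (not any real trading system’s method), a crude word-list sentiment scorer can be turned directly into a trading signal, with no human ever reading the news. The word lists and headlines below are invented.

```python
# Toy illustration only: score headlines against a tiny word list and turn
# the aggregate sentiment into a trading signal.

POSITIVE = {"beats", "growth", "record", "upgrade", "profit"}
NEGATIVE = {"misses", "lawsuit", "recall", "downgrade", "loss"}

def sentiment(headline):
    words = headline.lower().replace(",", "").split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def signal(headlines):
    score = sum(sentiment(h) for h in headlines)
    return "BUY" if score > 0 else "SELL" if score < 0 else "HOLD"

news = ["Acme beats forecasts, posts record profit",
        "Acme faces recall of flagship product"]
print(signal(news))   # net-positive coverage -> BUY
```

Real systems use far more sophisticated language models, but the structure is the same: text in, automated financial decision out.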
In fact, algorithms operating without human intervention now play a significant role in financial markets. For example, 85% of all trading in the foreign exchange markets is conducted by algorithms alone. The growing algorithmic arms race to develop ever more complex systems to compete in these markets means huge sums of money are being allocated according to the decisions of machines.
On a small scale, the people and companies that create these algorithms are able to affect what they do and how they do it. But because much of artificial intelligence involves programming software to figure out how to complete a task by itself, we often don’t know exactly what is behind the decision-making. As with all technology, this can lead to unintended consequences that may go far beyond anything the designers ever envisaged.
But in episodes such as stock market “flash crashes”, the algorithms that amplify the initial problems don’t make a mistake. There isn’t a bug in the programming. The behaviour emerges from the interaction of millions of algorithmic decisions playing off each other in unpredictable ways, following their own logic in a way that creates a downward spiral for the market.
The conditions that made this possible occurred because, over the years, the people running the trading system had come to see human decisions as an obstacle to market efficiency. Back in 1987 when the US stock market fell by 22.61%, some Wall Street brokers simply stopped picking up their phones to avoid receiving their customers’ orders to sell stocks. This started a process that, as author Michael Lewis put it in his book Flash Boys, “has ended with computers entirely replacing the people”.
The financial world has invested millions in superfast cables and microwave communications to shave just milliseconds off the rate at which algorithms can transmit their instructions. When speed is so important, a human being who requires a massive 215 milliseconds to click a button is almost completely redundant. Our only remaining purpose is to reconfigure the algorithms each time the system of technological decisions fails.
As new boundaries are carved between humans and technology, we need to think carefully about where our extreme reliance on software is taking us. As human decisions are substituted by algorithmic ones, and we become tools whose lives are shaped by machines and their unintended consequences, we are setting ourselves up for technological domination. We need to decide, while we still can, what this means for us both as individuals and as a society.
Global research and advisory firm Gartner has highlighted the top strategic technology trends that organizations need to explore in 2019 in its special report titled “Top 10 Strategic Technology Trends for 2019”.
Gartner defines a strategic technology trend as one with substantial disruptive potential that is beginning to break out of an emerging state into broader impact and use, or a rapidly growing trend with a high degree of volatility that is expected to reach a tipping point over the next five years.
“The Intelligent Digital Mesh has been a consistent theme for the past two years and continues as a major driver through 2019. Trends under each of these three themes are a key ingredient in driving a continuous innovation process as part of a Continuous NEXT strategy,” said David Cearley, vice president and Gartner Fellow.
“For example, artificial intelligence (AI) in the form of automated things and augmented intelligence is being used together with IoT, edge computing and digital twins to deliver highly integrated smart spaces. This combinatorial effect of multiple trends coalescing to produce new opportunities and drive new disruption is a hallmark of the Gartner top 10 strategic technology trends for 2019.”
The top 10 strategic technology trends for 2019 are:
Autonomous Things
Autonomous things, such as robots, drones and autonomous vehicles, use AI to automate functions previously performed by humans. Their automation goes beyond the automation provided by rigid programming models and they exploit AI to deliver advanced behaviours that interact more naturally with their surroundings and with people.
“As autonomous things proliferate, we expect a shift from stand-alone intelligent things to a swarm of collaborative intelligent things, with multiple devices working together, either independently of people or with human input,” said Cearley.
“For example, if a drone examined a large field and found that it was ready for harvesting, it could dispatch an ‘autonomous harvester’. Or in the delivery market, the most effective solution may be to use an autonomous vehicle to move packages to the target area. Robots and drones on board the vehicle could then ensure final delivery of the package.”
Augmented Analytics
Augmented analytics focuses on a specific area of augmented intelligence, using machine learning (ML) to transform how analytics content is developed, consumed and shared. Augmented analytics capabilities will advance rapidly to mainstream adoption, as a key feature of data preparation, data management, modern analytics, business process management, process mining and data science platforms.
Automated insights from augmented analytics will also be embedded in enterprise applications — for example, those of the HR, finance, sales, marketing, customer service, procurement and asset management departments — to optimize the decisions and actions of all employees within their context, not just those of analysts and data scientists. Augmented analytics automates the process of data preparation, insight generation and insight visualization, eliminating the need for professional data scientists in many situations.
“This will lead to citizen data science, an emerging set of capabilities and practices that enables users whose main job is outside the field of statistics and analytics to extract predictive and prescriptive insights from data,” said Cearley. “Through 2020, the number of citizen data scientists will grow five times faster than the number of expert data scientists. Organizations can use citizen data scientists to fill the data science and machine learning talent gap caused by the shortage and high cost of data scientists.”
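A toy version of such automated insight generation, with invented column names and data rather than any vendor’s feature, simply scans every numeric column and surfaces the strongest statistical driver of an outcome, with no analyst in the loop:

```python
import statistics

# Toy sketch of automated insight generation: rank every column by its
# correlation with an outcome and surface the strongest driver.

def correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical business data
data = {
    "ad_spend":   [10, 20, 30, 40, 50],
    "page_views": [5, 3, 6, 2, 4],
}
revenue = [12, 24, 31, 44, 52]

best = max(data, key=lambda col: abs(correlation(data[col], revenue)))
print(f"Strongest driver of revenue: {best}")
```

A real augmented analytics platform would also prepare the data, test significance and generate a natural-language explanation, but the “insight without an analyst” pattern is the same.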
AI-Driven Development
The market is rapidly shifting from an approach in which professional data scientists must partner with application developers to create most AI-enhanced solutions to a model in which the professional developer can operate alone using predefined models delivered as a service.
This provides the developer with an ecosystem of AI algorithms and models, as well as development tools tailored to integrating AI capabilities and models into a solution. Another level of opportunity for professional application development arises as AI is applied to the development process itself to automate various data science, application development and testing functions. By 2022, at least 40 percent of new application development projects will have AI co-developers on their team.
“Ultimately, highly advanced AI-powered development environments automating both functional and nonfunctional aspects of applications will give rise to a new age of the ‘citizen application developer’ where nonprofessionals will be able to use AI-driven tools to automatically generate new solutions. Tools that enable nonprofessionals to generate applications without coding are not new, but we expect that AI-powered systems will drive a new level of flexibility,” said Cearley.
Digital Twins
A digital twin refers to the digital representation of a real-world entity or system. By 2020, Gartner estimates there will be more than 20 billion connected sensors and endpoints and digital twins will exist for potentially billions of things. Organizations will implement digital twins simply at first. They will evolve them over time, improving their ability to collect and visualize the right data, apply the right analytics and rules, and respond effectively to business objectives.
“One aspect of the digital twin evolution that moves beyond IoT will be enterprises implementing digital twins of their organizations (DTOs). A DTO is a dynamic software model that relies on operational or other data to understand how an organization operationalizes its business model, connects with its current state, deploys resources and responds to changes to deliver expected customer value,” said Cearley. “DTOs help drive efficiencies in business processes, as well as create more flexible, dynamic and responsive processes that can potentially react to changing conditions automatically.”
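A digital twin can start as something very simple: a software object kept in sync with an asset’s telemetry. The sketch below is a minimal, hypothetical example (the asset, fields and thresholds are invented) rather than a reference to any twin or DTO product:

```python
# Minimal digital-twin sketch: a software object mirrors a physical asset
# from telemetry and flags when it drifts outside expected bounds.
# Asset name, fields and thresholds are hypothetical.

class PumpTwin:
    def __init__(self, asset_id, max_temp_c=80.0):
        self.asset_id = asset_id
        self.max_temp_c = max_temp_c
        self.state = {}

    def update(self, telemetry):
        """Sync the twin with the latest sensor readings."""
        self.state.update(telemetry)

    def alerts(self):
        """Compare mirrored state against operating limits."""
        if self.state.get("temp_c", 0.0) > self.max_temp_c:
            return [f"{self.asset_id}: overheating at {self.state['temp_c']} C"]
        return []

twin = PumpTwin("pump-17")
twin.update({"temp_c": 72.5, "rpm": 1450})
print(twin.alerts())   # within limits: no alerts
twin.update({"temp_c": 85.0})
print(twin.alerts())   # limit exceeded: overheating alert
```

The evolution Gartner describes is from objects like this toward twins that embed physics models, analytics and links to business processes.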
Empowered Edge
The edge refers to endpoint devices used by people or embedded in the world around us. Edge computing describes a computing topology in which information processing, and content collection and delivery, are placed closer to these endpoints. It tries to keep the traffic and processing local, with the goal being to reduce traffic and latency.
In the near term, edge computing is being driven by IoT and the need to keep processing close to the endpoint rather than on a centralized cloud server. However, rather than creating a new architecture, cloud computing and edge computing will evolve as complementary models, with cloud services managed as a centralized service that executes not only on centralized servers, but also in distributed servers on-premises and on the edge devices themselves.
Over the next five years, specialized AI chips, along with greater processing power, storage and other advanced capabilities, will be added to a wider array of edge devices. The extreme heterogeneity of this embedded IoT world and the long life cycles of assets such as industrial systems will create significant management challenges. Longer term, as 5G matures, the expanding edge computing environment will have more robust communication back to centralized services. 5G provides lower latency, higher bandwidth, and (very importantly for edge) a dramatic increase in the number of nodes (edge endpoints) per square km.
Immersive Experience
Conversational platforms are changing the way in which people interact with the digital world. Virtual reality (VR), augmented reality (AR) and mixed reality (MR) are changing the way in which people perceive the digital world. This combined shift in perception and interaction models leads to the future immersive user experience.
“Over time, we will shift from thinking about individual devices and fragmented user interface (UI) technologies to a multichannel and multimodal experience. The multimodal experience will connect people with the digital world across hundreds of edge devices that surround them, including traditional computing devices, wearables, automobiles, environmental sensors and consumer appliances,” said Cearley.
“The multichannel experience will use all human senses as well as advanced computer senses (such as heat, humidity and radar) across these multimodal devices. This multiexperience environment will create an ambient experience in which the spaces that surround us define ‘the computer’ rather than the individual devices. In effect, the environment is the computer.”
Blockchain
Blockchain, a type of distributed ledger, promises to reshape industries by enabling trust, providing transparency and reducing friction across business ecosystems, potentially lowering costs, reducing transaction settlement times and improving cash flow.
Today, trust is placed in banks, clearinghouses, governments and many other institutions as central authorities with the “single version of the truth” maintained securely in their databases. The centralized trust model adds delays and friction costs (commissions, fees and the time value of money) to transactions. Blockchain provides an alternative trust model and removes the need for central authorities in arbitrating transactions.
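The core mechanism behind that claim can be sketched in a few lines: each block commits to the hash of its predecessor, so tampering with any block breaks every later link. This is a minimal illustration, not a production design; it omits consensus, mining and digital signatures entirely.

```python
import hashlib
import json

# Minimal hash-chained ledger: each block stores the hash of its predecessor,
# so altering any block invalidates every subsequent link.
# Illustration only: no consensus, mining or signatures.

def make_block(data, prev_hash):
    payload = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    digest = hashlib.sha256(payload.encode()).hexdigest()
    return {"data": data, "prev": prev_hash, "hash": digest}

def valid(chain):
    """Check that every block references its predecessor's actual hash."""
    return all(b["prev"] == a["hash"] for a, b in zip(chain, chain[1:]))

chain = [make_block("genesis", "0")]
chain.append(make_block("Alice pays Bob 5", chain[-1]["hash"]))
chain.append(make_block("Bob pays Carol 2", chain[-1]["hash"]))
print(valid(chain))   # True: every link matches

chain[1] = make_block("Alice pays Bob 500", chain[0]["hash"])   # tamper
print(valid(chain))   # False: block 3 no longer points at block 2's hash
```

Tamper-evidence alone is what gives the ledger its “no central arbiter” property; distributing copies of the chain and agreeing on which version is authoritative is the hard part that real blockchains add on top.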
“Current blockchain technologies and concepts are immature, poorly understood and unproven in mission-critical, at-scale business operations. This is particularly so with the complex elements that support more sophisticated scenarios,” said Cearley. “Despite the challenges, the significant potential for disruption means CIOs and IT leaders should begin evaluating blockchain, even if they don’t aggressively adopt the technologies in the next few years.”
Many blockchain initiatives today do not implement all of the attributes of blockchain — for example, a highly distributed database. These blockchain-inspired solutions are positioned as a means to achieve operational efficiency by automating business processes, or by digitizing records. They have the potential to enhance sharing of information among known entities, as well as improving opportunities for tracking and tracing physical and digital assets. However, these approaches miss the value of true blockchain disruption and may increase vendor lock-in. Organizations choosing this option should understand the limitations, be prepared to move to complete blockchain solutions over time, and recognise that the same outcomes may be achieved with more efficient and tuned use of existing nonblockchain technologies.
Smart Spaces
A smart space is a physical or digital environment in which humans and technology-enabled systems interact in increasingly open, connected, coordinated and intelligent ecosystems. Multiple elements — including people, processes, services and things — come together in a smart space to create a more immersive, interactive and automated experience for a target set of people and industry scenarios.
“This trend has been coalescing for some time around elements such as smart cities, digital workplaces, smart homes and connected factories. We believe the market is entering a period of accelerated delivery of robust smart spaces with technology becoming an integral part of our daily lives, whether as employees, customers, consumers, community members or citizens,” said Cearley.
Digital Ethics and Privacy
Digital ethics and privacy is a growing concern for individuals, organizations and governments. People are increasingly concerned about how their personal information is being used by organizations in both the public and private sector, and the backlash will only increase for organizations that are not proactively addressing these concerns.
“Any discussion on privacy must be grounded in the broader topic of digital ethics and the trust of your customers, constituents and employees. While privacy and security are foundational components in building trust, trust is actually about more than just these components,” said Cearley. “Trust is the acceptance of the truth of a statement without evidence or investigation. Ultimately an organization’s position on privacy must be driven by its broader position on ethics and trust. Shifting from privacy to ethics moves the conversation beyond ‘are we compliant’ toward ‘are we doing the right thing.’”
Quantum Computing
Quantum computing (QC) is a type of nonclassical computing that operates on the quantum state of subatomic particles (for example, electrons and ions) that represent information as elements denoted as quantum bits (qubits). The parallel execution and exponential scalability of quantum computers means they excel with problems too complex for a traditional approach or where traditional algorithms would take too long to find a solution.
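A small simulation makes the qubit idea concrete. The sketch below uses standard textbook linear algebra, not any vendor’s API: a Hadamard gate puts the |0⟩ state into an equal superposition, and an n-qubit register needs 2^n amplitudes, which is the “exponential scalability” in question.

```python
import numpy as np

# A qubit as a 2-component state vector (standard textbook formulation).
# The Hadamard gate puts |0> into an equal superposition, so a measurement
# yields 0 or 1 with probability 1/2 each.

ket0 = np.array([1.0, 0.0])
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

state = H @ ket0
probs = np.abs(state) ** 2      # Born rule: |amplitude|^2
print(probs)                    # [0.5 0.5]

# Exponential scalability: an n-qubit register needs 2**n amplitudes, which
# is why classical simulation of large quantum systems becomes intractable.
n = 10
register = np.zeros(2 ** n)
register[0] = 1.0               # all qubits in |0>
print(register.size)            # 1024
```

At around 50 qubits the state vector already exceeds what classical memory can hold, which is exactly where quantum hardware is expected to pull ahead.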
Industries such as automotive, financial services, insurance and pharmaceuticals, as well as the military and research organizations, have the most to gain from advancements in QC. In the pharmaceutical industry, for example, QC could be used to model molecular interactions at atomic levels to accelerate time to market for new cancer-treating drugs, or to accelerate and more accurately predict the interaction of proteins, leading to new pharmaceutical methodologies.
“CIOs and IT leaders should start planning for QC by increasing their understanding of how it can apply to real-world business problems. Learn while the technology is still in the emerging state. Identify real-world problems where QC has potential and consider the possible impact on security,” said Cearley. “But don’t believe the hype that it will revolutionize things in the next few years. Most organizations should learn about and monitor QC through 2022 and perhaps exploit it from 2023 or 2025.”
Analysts will explore top industry trends at Gartner Symposium/ITxpo 2019, running from March 4 to 6, 2019 in Dubai, UAE.
Gartner Symposium/ITxpo is the world’s most important gathering of CIOs and senior IT leaders, uniting a global community of CIOs with the tools and strategies to help them lead the next generation of IT and achieve business outcomes. More than 25,000 CIOs, senior business and IT leaders worldwide will gather for the insights they need to ensure that their IT initiatives are key contributors to, and drivers of, their enterprise’s success.
“It is safe to say that there is no sector or industry that has not been impacted by the ongoing digital transformation and the innovative technologies,” said Ali Al Jassim, CEO of Etihad ESCO, in an article in Technical Review Middle East on Monday, 24 September 2018.
The advent of smartphones, social media, intelligent manufacturing and automation is set to transform the future of industries, including electric power systems. The electricity sector is poised to take advantage of the rapid digital transformation, with US$1.3 trillion of value estimated to be captured globally between 2016 and 2025.
Whether generation comes from renewable, non-renewable or any other energy source, the power system will continue to play a pivotal role. As an integral part of transmission and distribution (T&D), and with continuous growth in energy demand, the power system has to be smart and robust to support sustainability.
With the introduction of smart automation, artificial intelligence and continuous monitoring, a lot can be achieved. Future loads and seasonal requirements on the grid can be estimated, and generation can be planned accordingly using a mix of renewable and non-renewable sources. Smart power systems can also help minimise T&D losses and improve maintenance, power quality and sustainability.
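As a simple illustration of such load estimation, the sketch below forecasts a peak from recent same-weekday history; the demand figures and the 15% reserve margin are invented for the example, and real grid forecasting uses far richer models.

```python
from statistics import fmean

# Naive seasonal load estimate, illustrative only: forecast next Monday's
# peak demand as the mean of recent Mondays, then plan generation capacity
# with an assumed reserve margin. All figures are hypothetical.

past_mondays_mw = [1480, 1510, 1495, 1520]   # recent Monday peak demands (MW)

forecast = fmean(past_mondays_mw)
planned_capacity = 1.15 * forecast           # assumed 15% reserve margin
print(f"Forecast peak: {forecast:.0f} MW, "
      f"planned capacity: {planned_capacity:.0f} MW")
```

Even this crude same-weekday averaging captures the idea: estimated demand, plus a safety margin, drives how much renewable and non-renewable generation is scheduled.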
The disruptive convergence of digital technology advancements is marked by its customer-centric nature; on-demand, tailored consumption; and a decentralised infrastructure. As the Fourth Industrial Revolution, according to the World Economic Forum, ‘builds on the digital revolution and combines multiple technologies that are leading to unprecedented paradigm shifts in the economy, business, society, and individually,’ we need to ensure that the growth of electric power systems is in the right direction.
As energy efficiency and sustainability continue to be the biggest challenges we face today, the electric power industry is witnessing a period of sustained growth to cater to the rising energy needs. Modernisation initiatives such as the deployment of smart grid solutions will support the continuing growth of the sector’s infrastructure, as new facilities get added to the existing network and incorporated with the installed base. Grid intelligence will aid planners and operators in successfully navigating the increasing complexity of safe and reliable power supply and delivery.
Even as renewable power generation technologies expand and contribute to reducing the total amount of energy consumed, they cannot completely displace the need for new baseload generation. Cleaner, lower-emission fossil generation sources will continue to be the mainstay of power generation additions. As the industry envisions its future, nuclear power is expected to grow in significance, but this will also give rise to challenges for industrialised countries.
Demand reduction and the need for new generation additions can, to an extent, be attained through conservation and efficiency improvements. For the installation of new substations in densely populated urban areas with high load densities, compact designs with reduced footprints will be imperative.
An ideal mix of power generation resources will encompass central station power, supported by renewable energy sources including wind and solar technologies, and eco-friendly distributed generation complemented by consumer demand side response programs. The right mix of these resources will lead to the creation of an efficient and feasible energy market with balance. A central and distributed generation capability will also reduce greenhouse gas (GHG) emissions, since renewable energy can be effectively used to serve load based on resource availability, in response to consumer demand.
The problems plaguing the current electric power system include aging infrastructure, unreliability, weather-related outages and security concerns. Energy loss during transmission, another major drawback of the existing power system, happens during the transmission of energy from large power plants to the consumers through extensive networks over long distances. Electricity transmission and distribution losses average about five per cent of the electricity being transmitted and distributed annually in the United States, according to the US Energy Information Administration (EIA).
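A back-of-the-envelope calculation shows where such losses come from and why transmission runs at high voltage: power lost in a line is I²R, so delivering the same power at a higher voltage (and thus lower current) cuts losses quadratically. The line resistance and power figures below are invented for illustration.

```python
# Illustrative only: line resistance and power figures are hypothetical.
# Line loss is I^2 * R; transmitting the same power at a higher voltage
# (lower current) reduces losses quadratically.

def line_loss(power_w, voltage_v, resistance_ohm):
    current = power_w / voltage_v          # I = P / V (unity power factor assumed)
    return current ** 2 * resistance_ohm   # P_loss = I^2 * R

P, R = 100e6, 10.0   # 100 MW over a line with 10 ohm total resistance
for kv in (110, 220, 400):
    loss = line_loss(P, kv * 1e3, R)
    print(f"{kv} kV: loss = {loss / 1e6:.2f} MW ({100 * loss / P:.2f}%)")
```

With these assumed figures, moving from 110 kV to 400 kV cuts the loss on the same line from several percent to well under one percent, which is why long-distance transmission uses the highest practical voltages.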
Despite the current challenges faced by the electric power industry, the opportunities for improvement are numerous. If integrated under appropriate interconnection standards, in microgrids, or in automated distribution systems, distributed resources can help improve grid reliability and resilience for customers who seek uninterrupted service.
There is no doubt that the need of the hour is a modern power system capable of supporting the development and deployment of increasingly clean energy and energy-efficiency technologies. Such systems will have certain essential features that can be identified now, and may need others that will become apparent over time. The most essential is the further development and implementation of regulatory frameworks and business models that provide incentives for power generators, system operators and utilities to reduce or eliminate pollution and other environmental damage. System reliability, protection of physical and virtual assets from malicious or accidental damage, enhancement of grid infrastructure, and safeguarding consumers from unfair pricing and other pitfalls must also be focus areas. An ideal electric power system is one that creates sustainable business models for firms in the power sector while also achieving these goals.
The automation of distribution has immense potential in optimising the reliability and performance of distribution systems. As the industry aims to capitalise on the deployment of advanced metering infrastructure (AMI) systems, major investments in new distribution automation and distribution management systems can be expected in the near future. The potential of digital technologies to accelerate the sector’s modernisation and growth is vast, and can add exceptional shareholder, customer and environmental value. This is exactly what makes these times both exciting and challenging for our industry.
As we turn to leveraging the fundamentals of digitisation to extend the life cycle of energy infrastructure and to optimise electricity network flows, energy service companies are set to play an increasingly important role in regulating the electric power system. By providing customers with valuable data on energy losses in the system, and by helping them resolve these issues, they can help ensure the efficient use of energy.
Autonomous vehicles are coming and they have the potential to radically improve our lives. But to reap the rewards of this new technology, we first have to adapt the world to its requirements. This means preparing the way for massive engineering projects that will introduce the latest generation of mobile networks into our cities. To become safer and more efficient, future autonomous vehicles will rely on high-bandwidth mobile networks to wirelessly share and receive data from each other.
Here is an account by Saber Fallah, Director of the Connected Autonomous Vehicles Lab at the University of Surrey, of what looks like the ‘not-so-far-future’ in the developed world; the MENA region is likely to be only marginally affected, with some realisations in the Gulf countries.
Self-driving vehicles currently work by collecting data from an array of sensors, which is then interpreted by various algorithms. These algorithms tell the vehicle where to drive, at what speed and when to stop.
But the data that these sensors collect is inherently limited. The vehicle cannot see any vehicles outside of its field of vision, nor can it be aware of traffic occurring ten miles further down the road. To overcome this, future autonomous vehicles will be constantly accessing and interpreting data collected by thousands of surrounding vehicles, and roadside units (computing devices that provide connectivity support to passing vehicles). Huge swathes of additional information will be provided to the vehicle about road surface, weather, traffic conditions, other vehicle information and intended control actions.
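How shared data might change a vehicle’s decisions can be sketched very simply. The example below is purely illustrative (the sensor range, distance thresholds and speeds are invented): local sensing alone cannot see a jam kilometres ahead, but a hazard relayed by roadside units can shape the speed plan long before it is visible.

```python
# Purely illustrative: ranges, thresholds and speeds are invented.
# Local sensors cover only ~150 m, but hazards relayed over the network
# (e.g. by roadside units) extend the vehicle's effective horizon.

SENSOR_RANGE_M = 150

def plan_speed(own_detections_m, network_reports_m, position_m):
    """Choose a target speed (km/h) from local sensing plus shared hazards."""
    hazards = [d for d in own_detections_m
               if 0 <= d - position_m <= SENSOR_RANGE_M]         # locally visible
    hazards += [r for r in network_reports_m if r > position_m]  # network-shared
    distance_m = min(hazards, default=float("inf")) - position_m
    if distance_m < 200:
        return 30    # imminent hazard: slow right down
    if distance_m < 5000:
        return 70    # reported jam ahead: start easing off early
    return 110       # clear road

# A jam reported 16 km ahead needs no reaction yet; at 3 km the network lets
# the vehicle ease off long before its own sensors could see anything.
print(plan_speed([], [16000], 0))
print(plan_speed([], [3000], 0))
```

Real CAV planners fuse far richer data (weather, road surface, other vehicles’ intended manoeuvres), but the principle is the same: decisions are made from the network’s view, not just the vehicle’s.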
We expect that driverless cars will be commercially available by 2025 and that the whole UK transportation system will be fully automated by 2070. When this happens, these vehicles will sometimes be controlled by a traffic management system, which could activate useful manoeuvres such as platooning, in which automated vehicles travel very closely together at very high speeds, and intersection management. These connected autonomous vehicles (CAVs) will create a completely different transportation network for future generations, one that is safer, faster, more efficient, more environmentally friendly and more productive. As we rapidly approach the point at which CAVs are ready for the streets, we have to make sure that our streets are ready for them.
With so much data needing to be shared, having a high bandwidth and fast wireless communication technology is essential. The next generation of wireless communication systems, based on the faster 5G technology, can potentially provide the required bandwidth. But to achieve this, we need to begin drastically increasing the number of radio antennas and roadside units in cities.
Even the most recent networks (4G LTE) that exist today simply aren’t up to the task and will have to be upgraded. 5G networks will demand faster and more flexible infrastructure that can adapt to unexpected problems. Countries across the world will also have to invest heavily in new roadside units that can help reduce data delays and minimise reliance on network data centres by acting as alternative data sources. At the same time, the security of these networks has to be considered, ensuring the safety and privacy of all communication over them.
The slowly turning gears of policymakers are currently lagging behind the astonishing progress of connected autonomous vehicles. The Netherlands is currently the country furthest ahead in preparation, thanks mainly to its excellent road infrastructure. Singapore’s decision to allow self-driving cars to be tested on public roads means that it, too, is quickly becoming a world leader in this field. Both the United States and Sweden are also beginning to prepare for this future.
Across the world, many governments are coming to realise the necessity of infrastructure change. For example, the UK government recently announced its goal of becoming a global leader in autonomous vehicles, with new development and testing areas to be championed. Indeed, several UK-based projects are attempting to lead the country onwards. UK CITE is equipping 40 miles of urban roads, dual carriageways and motorways within Coventry and Warwickshire with the extremely fast data networks required by CAVs. Another project, E-CAVE, is adapting Ordnance Survey digital data to help the development of CAVs. The data, which is used to create a local map of the environment, enhances the perception of CAVs and allows them to drive more safely.
Even with the vast technological challenges and regulatory hurdles currently surrounding the deployment of autonomous vehicles, it’s not a question of “if”, but rather “when”, they will be prevalent on the roads. Now is the time to have a conversation about developing the right urban infrastructure for this new age.
Last week, McKinsey & Company produced a report revealing that some 20 million Middle East jobs could be automated; that is, 45% of all existing jobs in the Middle East could be affected by increasing automation of the region’s work environment. So, is it really 20 million Middle East jobs that could be automated, or is it something else?
In any case, the report examined six Middle Eastern countries in which 20.8 million full-time employees, with $366.6 billion in wage income, are as of now engaged in activities that are technically automatable; these are mainly expatriate workers, who would very likely bear the brunt of the process of automation.
JAMES DARTNELL reports that a leading international robotics professor has claimed that sophisticated artificial intelligence systems are currently a distant reality, and that statistics around the potential for job losses as a result of AI and robotics are significantly inflated.
McKinsey & Company has claimed that 20 million jobs in the Middle East could be automated.
An estimated 45 percent of existing jobs in the Middle East could be automated, according to McKinsey & Company’s latest report, The Future of Jobs in the Middle East, launched today at the World Government Summit.
In the UAE, the research estimates that based on the segmentation of work activities by sector, occupation and education, more than 93 percent of the labour-saving technical automation potential applies to jobs currently held by expat workers.
More than 60 percent of the automation potential is concentrated in six of the 19 sectors examined: other services, administrative and support, government, manufacturing, construction, and retail and wholesale trade.
In all six Middle Eastern countries that the report examined, $366.6 billion in wage income and 20.8 million full-time equivalent employees (FTEs) are associated with activities that are already technically automatable today.
“Our research encourages Middle East policymakers and business leaders to embrace the transition into the new age of automation and invest in skills that workers of the future will require,” said Jan Peter Moore, associate partner at McKinsey & Company. “For countries such as the UAE, Bahrain and Kuwait, the projected proportion of work, and by extension workers, displaced is higher than the projected global average. This means workers in these countries will need to evolve to adapt to global forces of workforce automation and technological progress more rapidly than other countries in the region.”
Automation’s potential varies substantially across industries. Sectors like manufacturing, transportation and warehousing, where routine tasks are common, have a high potential for automation, whereas sectors requiring more human interaction, including the arts, entertainment, recreation, healthcare and education, have a lower potential to be automated, ranging from 29 to 37 percent.
For policymakers and governments in the region, there is a critical link between displacement by automation and low-to-medium levels of education and experience.
In a region where 57 percent of the currently employed workforce has not completed a high school education, automation poses a real risk, McKinsey & Company says. The automation potential more than halves, falling to nearly 22 percent, for employees holding bachelor’s or graduate degrees.