Egypt is formulating a strategy for artificial intelligence (AI) which will include the establishment of the country’s first faculty of artificial intelligence (FAI) and an artificial intelligence academy in the coming academic year, in a bid to produce the scientific workforce needed to develop a sustainable knowledge-based economy.
The FAI will start enrolling students in the next academic year, 2019-20, as a centre of excellence for artificial intelligence research, education and training.
Besides establishing an artificial intelligence academy specialising in innovation and new thinking in artificial intelligence, several AI departments will also be set up at higher education institutions to develop capacity and boost innovations.
AI is the science of developing computer systems capable of carrying out human tasks.
According to a 2017 PricewaterhouseCoopers (PwC) report entitled The Potential Impact of AI in the Middle East, it is estimated that 7.7% of Egypt’s gross domestic product could come from the AI sector by 2030.
“We estimate that the Middle East is expected to accrue 2% of the total global benefits of AI in 2030. This is equivalent to US$320 billion,” the report stated.
“In the wake of the fourth industrial revolution, governments and businesses across the Middle East are beginning to realise the shift globally towards AI and advanced technologies.
“They are faced with a choice between being a part of the technological disruption, or being left behind. When we look at the economic impact for the region, being left behind is not an option.”
The biggest opportunity for AI in the Middle East and Africa region is in the financial sector, where an estimated 25% of the region’s predicted AI investment for 2021, or US$28.3 million, will be spent on developing AI solutions. This is followed by public services, including education, according to the PwC report.
Samir Khalaf Abd-El-Aal, a science expert at the National Research Centre in Cairo, welcomed news of the FAI as a “pioneering initiative” that will have an impact on Egypt as well as North Africa.
“It is a good step forward for raising awareness of the potential of AI for sustainable development as well as contributing in facing regional challenges to fully harness the deployment of AI, including infrastructure, skills, knowledge gaps, research capacities and availability of local data,” Abd-El-Aal told University World News.
“The FAI is an important initiative in training students in AI, which will become one of the tools of future jobs, as well as building AI applications in Arabic, which can easily go to all Arabic-speaking countries including North African states.”
“The FAI could also act as a regional focal point for carrying out mapping for local artificial intelligence start-ups, research centres and civil society organisations as well as serving as an incubator for skills development and promoting AI entrepreneurship oriented towards solving North African problems,” Abd-El-Aal said.
Virtual science hub
The Egypt government also announced the launch of a virtual science hub at the Forum. The hub, affiliated to the Academy of Scientific Research and Technology at the Ministry of Higher Education and Scientific Research, aims to enable integration, management and planning of Egyptian technological resources, work on the international information network, and includes an integrated database for all Egyptian technological resources.
It also includes all scientific and technical resources as well as material assets and academic research contributions, which will make it possible to measure the degree of technological readiness of all Egyptian academic and research institutions. The general objective of the system is to provide the necessary information to support decision-makers in research projects and to facilitate the follow-up of research activities.
For a number of years now, the provision of languages in British schools and universities has been in decline. Yet, as Brexit looms large on the horizon, there has been much talk in the media and from politicians about the need for a “global Britain”.
Arguably, a country can only really be global and outward looking if language skills are considered essential for its citizens. The government seems to share this view – at least to some extent. This is reflected in the fact that the Department for Education has provided funding to open a National Centre for Excellence for Language Pedagogy and to roll out a cross-sector mentoring project, which was piloted very successfully in Wales.
A number of surveys, such as the annual British Council survey of English primary and secondary schools, report falling numbers of pupils participating in language learning. This is a decline that started in 2004, when languages were taken out of the compulsory curriculum in secondary schools.
There was a rise in the number of pupils taking languages in 2011 as a result of the introduction of the English Baccalaureate (EBacc) – which has a language as a core subject. However, this increase proved to be short-lived, despite the government’s ambition for 90% of pupils to gain the EBacc by 2025.
In 2014, the Guardian commissioned a survey which questioned young people about learning languages. The survey identified some of the main benefits people perceive to be linked to learning languages. These include: better job prospects abroad, talking to other people, learning about another culture, learning another skill, and an incentive to travel.
On the other hand, perceived downsides were seen as languages being difficult, the predominance of English, and that the way languages are taught in schools is “not useful in real life”.
To find out more about why young people choose (not) to study a language, we surveyed 107 students who were studying a language at Lancaster University or the University of Nottingham. This includes students who studied a language as an optional module to complement their main degree course, as well as those who studied a language as part of their degree.
Our survey showed that for the vast majority of these degree students (over 90%) and students taking optional modules (over 75%) their main motivation was enjoyment as well as a genuine interest in the language and the countries where it is spoken. This aspect ranks much higher than “employability skills” – despite this often being the main angle under which languages are promoted.
Students do, however, realise and appreciate the broad range of transferable skills gained from studying languages. This includes analytical and problem solving skills, the ability to communicate well (also in your first language), and committing yourself to a long-term project.
When asked what might put young people off studying a language and why they think there are not more language learners in the UK, many referred to the lack of engagement with cultural aspects – such as history, politics, society or literature – in language classes. They also spoke of the myth of English being the only language you need, poor handling of languages in the British education system, and the lack of governmental initiatives to promote the study of languages.
To get more people excited about languages then, there needs to be a rethink of the way in which they are promoted and embedded into the curriculum. And there must be more focus on enjoyment and intercultural competence and more cultural engagement and “real-life” tasks.
This is important, because studying a language is not just about enhancing your CV and adding something useful to your skills set. It is also about embracing other cultures, developing intercultural competence, enjoying languages as an exciting object of study, and reflecting on your own national and cultural identity.
The government should also recognise the importance of languages and rethink the value placed on foreign language competency in the British education system. A national policy on languages could help to address attitudes towards languages and further promote joined up thinking across the different education sectors.
Here’s a question that keeps me awake some nights – what will we do with advances in business, economy and technology if we do not pay attention to harnessing the capabilities of young people who will at some point be responsible for the successful functioning of their communities and the world? Are we doing enough to safeguard their basic rights to education, food, shelter, and other basic amenities? Are we making our best efforts to give them a real voice?
These questions present us with an opportunity to think about the issues facing young people around the globe, and especially in the MENA region, where the youth crisis is perhaps at its most intense. In our minds, youth stands for dreams, innovation, and new opportunities – or, simply put, the future. Yet too many of these dreams are today being thwarted. Globally, youth unemployment is three times higher than adult unemployment.
Children and youth face greater risks when displaced; they are far more vulnerable than adults to violence and exploitation, physical and psychological abuse and trafficking, and to being pulled away from school and given arms by extremists.
In 2017, the United Nations High Commissioner for Refugees (UNHCR) released a report according to which 57% of the refugee population comprised children, including 173,800 unaccompanied and separated child refugees.
These are some of the realities that Sharjah’s leadership, which has entrusted the emirate’s future to its youth, has committed itself to helping change. Our ambitions led us to create an international platform, the ‘Investing in the Future: Middle East and North Africa’ (IIFMENA) Conference, held at the Al Jawaher Reception and Convention Centre, to bring the world together once every two years to tackle a specific humanitarian and development challenge in the MENA region.
The first edition of the conference hosted regional governments and international agencies to discuss ways and means to safeguard the rights and lives of refugee children and adolescents who are victims of conflicts and wars. The second edition focused on the crucial issue of the pressing need for gender equity by offering girls and women equal opportunities in society and economy.
The theme of the upcoming edition on October 24–25 is ‘Youth – Crisis Challenges and Development Opportunities’, and it will be hosted by TBHF (The Big Heart Foundation) in partnership with UNDP, UNICEF, UNHCR, NAMA Women Advancement Establishment and UN Women. This edition will shine the spotlight on youth-related issues, with a focus on the consequences of wars, conflicts and disasters on young people. The potential of a whole generation risks being wasted as social tensions in the region are stoked.
Through the conference, we would like to highlight that youth should have the opportunity to participate in the social and economic development of their communities. We need to establish a clear mechanism to involve them in decision-making, harness their potential, and ignite their leadership skills.
IIFMENA will host targeted discussions on how governments and private organisations can offer stronger support to countries that host victims of crises, whether refugees or immigrants, especially considering that 85% of displaced individuals have sought asylum in developing countries that are still struggling to develop their economies, infrastructure, health and education services.
Youth are agents of change.
Creating large numbers of decent jobs for young people is critical for achieving overall development objectives, from poverty reduction to better health and education. Globally, 600 million jobs will be needed over the next 10 to 15 years. Developing the youth’s employability skills will also be a core focus of the conference agenda.
The expert insights in this edition will seek to offer strategic direction to the agenda of youth empowerment with a special focus on how they can be prepared and equipped to be safely returned to their homelands once conditions are normalised. When given the space and opportunity to rebuild their own communities, young people can turn their energy and creativity towards solving today’s challenges and tomorrow’s problems.
International communities will need to rally their efforts to execute this strategy. It is our collective responsibility to ensure our youth do not feel abandoned, lost or cheated – it is at these times that they are most vulnerable and may feel they have no choice but to seek an alternative environment that is conducive neither to their own development nor to that of their community.
Displacement, marginalisation and lack of opportunities are all problems that humans created for themselves. It’s time we turn these problems into long-term solutions for us, and more importantly, for our children.
The 2030 Agenda for Sustainable Development recognises the importance of tackling youth oppression and unemployment, and calls for promoting young people’s rights in education, employment and civic engagement. Through the IIFMENA Conference this year, we seek to take this agenda forward by demonstrating that a common global agenda can galvanise support from many different actors – something critical to guiding youth towards a brighter, more just future.
We couldn’t find a better read to start the New Year than THE EDVOCATE’s article, dated December 30, 2017, on Digital literacy only through Digital citizenship. THE EDVOCATE is a website devoted to advocating for education equity, reform, and innovation.
The MENA countries’ education establishments would do well to meditate on what Matthew Lynch, the article’s author, puts forward as something of a precept: “Embracing technology and digital literacy is a key factor to encourage learning from infancy through adulthood.”
With the increased importance of technology in society, digital literacy is gaining recognition as the most valuable tool for lifelong learning. What does this mean? Essentially, as citizens of a global society, the influence of social media, technology, and online resources on us is massive. For children, access to a home computer with internet markedly increases their likelihood of attending college. For adults, the ever-evolving tech world can either help them succeed or hold them back.
Society has changed over the last 15 years. It has become increasingly important to continue education after entering the workforce. The influence of technology on business is the main reason for this new mandate. From early learning through adulthood, digital literacy is showing the most promise for success. The edtech industry has long focused on the value of digital competency for children. It’s time digital literacy was incorporated into adult education in the same way, but with a few adjustments.
The foundation of digital literacy rests on four factors: technological skills and access, authorship rules, representation rules, and online social responsibility. For students and employees to interact responsibly in a digital society, it’s imperative to understand all four parts of the puzzle.
The core competencies of using computers, navigating the internet, and having access to broadband internet are essential to success. In today’s schools, students who utilize online research and display computer skills are more likely to graduate. Additionally, organizations like DigitalLiteracy.gov emphasize the importance of harnessing technology to find work and advance in your career.
Authorship understanding is becoming increasingly essential every day. Individuals can create and share content seamlessly in the digital age. This ability allows global citizens to interact and bond together for common goals. It also means that discerning authentic content is becoming harder to do. Those with good digital literacy skills will have the advantage of sharing ideas efficiently and knowledgeably filtering content.
Related to authorship is the issue of digital representation. Knowing how to decide what content is authentic and what isn’t is essential for every citizen. Understanding how to use resources like Politifact and Snopes will help individuals navigate representation issues more soundly.
To use technology and the internet in your life, it’s imperative to understand all the tenets of digital literacy. Lastly, and possibly most important, is digital ethics, or online social responsibility. Digital ethics is the discernment of what is appropriate to say, do and share. It also includes observance of copyright laws and privacy.
To fully embrace digital literacy, individuals must also learn digital citizenship. The tenets of this idea are much more sophisticated than those of literacy. However, they guide behavior online, safety practices and research rules. Comprehension of the nine elements of digital citizenship will make technology safer and more helpful for children and adults alike.
Understanding the Stats
In a 2013 report by the New York City Comptroller’s Office, the educational attainment of households without broadband access was disproportionately poor: 42% of disconnected households had less than a high school education, and only 5% had earned a bachelor’s degree. Similar educational deficiencies were noted in a 2011 Microsoft infographic, which suggested that 77% of jobs would require digital competency by 2020. Additionally, it recorded a 6% higher high school graduation rate for students with home access to technology.
Does the research suggest that mere access to the internet and technology will improve educational and career performance? Not exactly. There are other important factors in success. Students need to be digitally literate, which includes an understanding of digital citizenship rules.
The ability to use technology isn’t enough to advance individuals. Technology use comes with many possible hurdles which can present themselves to halt progress. Things like improper research practices can hurt student performance. Additionally, unsafe internet practices and inappropriate online activity can harm employees. To avoid these common missteps, people need proper education on digital citizenship and literacy.
From pre-K through adult life, technology is ingrained in daily living. According to the International Guidelines on Information Literacy, technological education should start early. However, the report also states that teaching and improvement should continue throughout life to support personal and career growth. The European Commission Joint Research Centre agrees. The commission suggests that digital literacy is essential to school success and later lifelong improvement.
Embracing technology and digital literacy is a key factor to encourage learning from infancy through adulthood. The impact of technology on learning has roots in the science of how we learn. As such, it has long been important to encourage academic advancement. However, the development of a global society has made involvement mandatory for successful individuals from all walks of life.
How does digital literacy inform your life? What continued learning tools have you embraced in your career? We want to hear your experiences.
Paul Baker, Professor of Linguistics and English Language at Lancaster University, wrote this contribution to The Conversation on perhaps the most ancient issue of all: a mum’s language being overtaken by her daughter’s streetwise lingo. Realistically, however, in today’s world it is the billions of Asians and Africans, along with the less numerous Arabs of the MENA, who, by using English as the vehicle for communication between nations and communities in all matters of trade, business and, of course, politics, have turned it into the world’s foremost language for the present and the foreseeable future. It is worth noting that the Americanisation of English in the MENA is also occurring under our very eyes. As a consequence, this is happening not only in the old continent but is also developing into a wide variety of, as it were, local “creoles”. Education systems are, of course, tailored to provide the human resources required by each of these specific markets.
In the meantime, here is Paul’s write up with our thanks and any comments are welcome either directly to the publisher / author or to MENA-Forum.
Brits can get rather sniffy about the English language – after all, they originated it. But a Google search of the word “Americanisms” turns up claims that they are swamping, killing and absorbing British English. If the British are not careful, so the argument goes, the homeland will soon be the 51st state as workers tell customers to “have a nice day”, “colour” is spelt without a “u” and “pavements” become “sidewalks”. The two versions of English are mutually intelligible but have long had enough differences to inspire Oscar Wilde to claim:
We have really everything in common with America nowadays, except, of course, the language.
My research examined how both varieties of the language changed between the 1930s and the 2000s and the extent to which they are growing closer together or further apart. So do Brits have cause for concern?
Well, yes and no. On the one hand, most of the easily noticeable features of British language are holding up. Take spelling, for example – towards the 1960s it looked like the UK was going in the direction of abandoning the “u” in “colour” and writing “centre” as “center”. But since then, the British have become more confident in some of their own spellings. In the 2000s, the UK used an American spelling choice about 11% of the time while Americans use a British one about 10% of the time, so it kind of evens out. Automatic spell-checkers which can be set to different national varieties are likely to play a part in keeping the two varieties fairly distinct.
There is also no need to worry too much about American words, such as “vacation”, “liquor” and “law-maker” creeping into British English. There are a few cases of this kind of vocabulary change but they mostly tend to be relatively rare words and they are not likely to alter British English too much.
The British are still using “mum” rather than “mom”, “folk” rather than “folks”, “transport” rather than “transportation”, “petrol” rather than “gas”, “railway” rather than “railroad” and “motorway” rather than “highway”. Words to keep an eye on, however, are lawyer, jail, cop, guy and movie – all of which are creeping into the lexicon more and more.
But when we start thinking of language more in terms of style than vocabulary or spelling, a different picture emerges. Some of the bigger trends in American English are moving towards a more compact and informal use of language. American sentences are on average one word shorter in 2006 than they were in 1931.
Americans also use a lot more apostrophes in their writing than they used to, which has the effect of turning the two words “do not” into the single “don’t”. They’re getting rid of certain possessive structures, too – so “the hand of the king” becomes the shorter “the king’s hand”. Another trend is to avoid passive structures such as “a paper was written”, instead using the more active form, “I wrote a paper”.
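Trends like these come out of corpus comparison. As a rough illustration only – this is not Baker’s actual method or data, and the text samples below are hypothetical – a few lines of Python can compute two of the metrics mentioned above, average sentence length and contraction frequency:

```python
import re

def style_metrics(text):
    """Return (average sentence length in words,
    contractions per 1,000 words) for a text sample."""
    # Split on sentence-ending punctuation, dropping empty fragments.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    # Treat runs of letters (with internal apostrophes) as words.
    words = re.findall(r"[A-Za-z']+", text)
    # A contraction here is any word with an apostrophe between letters.
    contractions = [w for w in words if re.search(r"[A-Za-z]'[A-Za-z]", w)]
    avg_len = len(words) / len(sentences) if sentences else 0.0
    rate = 1000 * len(contractions) / len(words) if words else 0.0
    return avg_len, rate

# Hypothetical snippets written in the two styles:
formal = "We do not know. It is not the case that we have decided."
informal = "We don't know. It isn't the case that we've decided."

print(style_metrics(formal))    # longer sentences, no contractions
print(style_metrics(informal))  # shorter sentences, more contractions
```

Run over large, date-stamped corpora rather than toy sentences, metrics of this kind are what let one say that sentences are a word shorter, or contractions more frequent, in one decade or variety than another.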
I’m rather fond of gradable adverbs
And some words are starting to be drastically eroded from English – especially a grammatical class called gradable adverbs which consists of boosters like “frightfully” and “awfully” and downtoners (words or phrases which reduce the force of another word or phrase) like “quite” and “rather”.
If anything marks out the British linguistically, it’s their baroque way of using adverbs, especially as a form of polite sangfroid or poise – so “the worst day ever” is “things perhaps aren’t quite as wonderful as they could be”. As the American critic Alexander Woollcott once said: “The English have an extraordinary ability for flying into a great calm.”
Classic films such as Brief Encounter are absolutely packed with gradable adverbs. Americans, on the other hand, tend to communicate in a more straightforward manner, telling it “as it is”. However, and here’s the thing, in all these aspects Brits are changing too – and in exactly the same way as Americans. They’re just about 30 years behind the trend that Americans seem to be leading.
So this raises a question, is British English actively following American English – copying its more economical, direct use of language – or is this something that is simply a global trend in language use? Perhaps we’re all just on the same path and the British would have gone in that direction, even if America had never been discovered? I’d like to think the latter but due to the large amount of American language that British people encounter through different forms of media, I suspect the former is more accurate.
These stylistic changes generally make for a more user-friendly version of the language which is accessible and easy to follow so they’re hard to resist. Except for the loss of those gradable adverbs, though – I’m slightly annoyed about that and would like to advocate that we keep hold of them. They’re a linguistic passport and also a marker of national character, so it would be rather lovely if we could hold on to them.