Human ancestors may have spread to north Africa

Human ancestors may have spread to north Africa earlier than thought, stone tool discovery suggests

An Oldowan core freshly excavated at Ain Boucherit from which sharp-edged cutting flakes were removed.
M. Sahnouni

John McNabb, University of Southampton

East Africa is famously the birthplace of humankind and the location where our ancient hominin ancestors first invented sophisticated stone tools. This technology, dating back 2.6m years, is thought to have spread later around Africa and on to the rest of the Old World.

But new research, published in Science, has uncovered an archaeological site in Algeria containing similar tools that may be as old as 2.44m years. The team, led by the archaeologist Mohamed Sahnouni, excavated stone tools at the site Ain Boucherit that they estimate are between 1.92m and 2.44m years old. This suggests that human ancestors spread to the region much earlier than previously thought or that the stone tool technology was simultaneously invented by earlier hominin species living outside east Africa.

The artefacts belong to the “Oldowan” – the oldest known stone tool industry. Rounded river cobbles served as hammer stones to flake other cobbles, turning them into simple cores. The flakes were then transformed into scrapers and various knives by resharpening their edges. Essentially this was a tool kit for processing animal products – marrow, bone and brain tissue – but also plant material. However, it is not known for sure which hominin species first created Oldowan tools – potentially Australopithecus or Homo habilis.

The stone tools are very similar to those of early Oldowan sites in East Africa. Bones at the site even have cut marks, where a stone tool has gouged into the bone during butchery. The cut marks may mean these hominins were actively hunting.

But we have only ever found early Oldowan tools in the east African rift valley before, more than 4,000km away. We have always assumed that it started there some 2.6m years ago, so we shouldn’t find it so far from its original home at that age unless we have missed something.

Many archaeologists do indeed suspect there is an unseen ghost somewhere in the machine. There have been discoveries of early hominin sites to the south, in Chad, that suggest that some of our earliest ancestors lived well beyond East Africa. Oldowan-like sites have also been found outside of Africa, in Georgia, beginning at 1.8m years ago – which seems surprisingly early.

Game changer

The new discovery is telling us that our focus on East Africa as the birthplace of early humans is too narrow – we should be doing what Sahnouni and others have done all along and looking elsewhere. The same team recently published findings about another Oldowan site in Algeria that is about 1.75m years old, but to find early Oldowan tools well over half a million years earlier is a bit of a game changer.

Sahnouni excavating at the site.
M. Sahnouni

It all hinges on how reliable that 2.44m-year-old date really is. Dating specialists will be scrutinising the details very carefully. According to the paper, four different techniques were used. Palaeomagnetic dating measures the direction and intensity of the Earth’s magnetic field in sediments – this is locked into rocks when they form, helping to tell us how old they are.

The team found that the upper level mapped onto a short period of normal polarity taking place between 1.77m and 1.94m years ago. The lower level’s sediments fitted into a long period of reversed direction at between 1.94m and 2.58m years ago.

To get more precise dates, the team turned to a technique called electron spin resonance (ESR) dating, which estimates age from the radiation dose accumulated in quartz sand grains. However, they used a less common version of the technique, one operating close to its upper limit of reliability at this age range. The measurement delivered an age of 1.92m years, younger than suggested by palaeomagnetism.

There are some concerns about how suitable this last method is but the team has been honest about that. They also compared the dates with extinction times of animals present at the site, which suggested the date wasn’t impossible.

To get a better idea of the maximum age of the tools, they used a technique for estimating rates of sedimentation – basically, how long the different layers at the site took to build up. This requires some fairly involved statistical work, mapped onto the palaeomagnetic results. Extrapolating backwards in time, the team calculated that the actual age of the lower level is 2.44m years. I suspect dating specialists will be looking at this carefully.
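To make the logic of that extrapolation concrete, here is a minimal sketch – not the team's actual model – assuming a constant sedimentation rate fitted to two hypothetical, palaeomagnetically dated tie points, then projected down to a deeper, artefact-bearing layer. All depths and ages below are illustrative, not the published values.

```python
import numpy as np

# Hypothetical tie points: depth below a reference datum (m) and age (million years),
# e.g. the top and base of a dated polarity interval. Values are illustrative only.
tie_depths = np.array([2.0, 6.0])
tie_ages = np.array([1.78, 1.94])

# Constant-sedimentation-rate assumption: age is linear in depth.
rate_myr_per_m, intercept = np.polyfit(tie_depths, tie_ages, deg=1)

def extrapolated_age(depth_m: float) -> float:
    """Extrapolate the age (Myr) of a layer at the given depth, assuming the same rate."""
    return rate_myr_per_m * depth_m + intercept

# Age estimate for a deeper (hence older) artefact-bearing layer at a hypothetical 18 m depth.
print(f"Estimated age: {extrapolated_age(18.0):.2f} Myr")
```

The published estimate rests on more careful statistics than a straight line, but the principle is the same: anchor a sedimentation rate to dated horizons and project it downwards in the sequence.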

Mystery hominin?

Now to our ghost. The oldest tools ever found outside of Africa are the ones from Georgia dated to 1.8m years ago. There is a small Oldowan-like site in Pakistan from around the same time and more core-and-flake sites in east China at 1.66m years ago. If the Georgian site represents the first move out of Africa, then these early African migrants got to Pakistan and China extremely quickly.

Stone tool cut marks on animal skeleton.
I. Caceres

In Georgia, the tools may have been made by early Homo erectus, which dates back to about 1.8m years ago. As there is a Homo erectus specimen from China dated to 1.6m years old, it is easy to assume that Homo erectus must have been the species that spread the tool technology around the world – and much quicker than we had thought.

But we cannot be sure of that. What if our ghost was an earlier hominin species from Africa predating Homo erectus – such as Homo habilis? Perhaps the Oldowan actually began earlier than 2.6m years ago, and was already widespread throughout Africa by 2.4m years ago.

Maybe our mysterious hominin began to migrate out from Africa before 1.8m years ago, and carried its core-and-flake industry eastwards. That would certainly give it more time to cover those huge distances. Perhaps Homo erectus only migrated eastwards out of Africa later, following in the footsteps of an earlier traveller that we know nothing about.

So that’s a lot of maybes, but then nobody expected there to be Oldowan tools in Georgia when they were first found. It caused a lot of controversy, but now most archaeologists are comfortable with the finding. The Georgian archaeologists went back, did more work and proved their case. I don’t doubt Sahnouni and his team will be doing the same.

John McNabb, Senior Lecturer in Palaeolithic Archaeology, University of Southampton

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Evidence of a Cosmic Calamity in ancient Dead Sea

 

Archaeologists at a site in what’s now Jordan have found evidence of a cosmic calamity, as reported by Bruce Bower on November 20, 2018. Here it is.

An exploding meteor may have wiped out ancient Dead Sea communities

 

ANCIENT WIPEOUT Preliminary evidence indicates that a low-altitude meteor explosion around 3,700 years ago destroyed cities, villages and farmland north of the Dead Sea (shown in the background above), rendering the region uninhabitable for 600 to 700 years. Fightbegin/istock.com

 

DENVER — A superheated blast from the skies obliterated cities and farming settlements north of the Dead Sea around 3,700 years ago, preliminary findings suggest.

Radiocarbon dating and unearthed minerals that instantly crystallized at high temperatures indicate that a massive airburst caused by a meteor that exploded in the atmosphere instantaneously destroyed civilization in a 25-kilometer-wide circular plain called Middle Ghor, said archaeologist Phillip Silvia. The event also pushed a bubbling brine of Dead Sea salts over once-fertile farm land, Silvia and his colleagues suspect.

People did not return to the region for 600 to 700 years, said Silvia, of Trinity Southwest University in Albuquerque. He reported these findings at the annual meeting of the American Schools of Oriental Research on November 17.

Excavations at five large Middle Ghor sites, in what’s now Jordan, indicate that all were continuously occupied for at least 2,500 years until a sudden, collective collapse toward the end of the Bronze Age. Ground surveys have located 120 additional, smaller settlements in the region that the researchers suspect were also exposed to extreme, collapse-inducing heat and wind. An estimated 40,000 to 65,000 people inhabited Middle Ghor when the cosmic calamity hit, Silvia said.

The most comprehensive evidence of destruction caused by a low-altitude meteor explosion comes from the Bronze Age city of Tall el-Hammam, where a team that includes Silvia has been excavating for the last 13 years. Radiocarbon dating indicates that the mud-brick walls of nearly all structures suddenly disappeared around 3,700 years ago, leaving only stone foundations.

What’s more, the outer layers of many pieces of pottery from the same time period show signs of having melted into glass. Zircon crystals in those glassy coats formed within one second at extremely high temperatures, perhaps as hot as the surface of the sun, Silvia said.

High-force winds created tiny, spherical mineral grains that apparently rained down on Tall el-Hammam, he said. The research team has identified these minuscule bits of rock on pottery fragments at the site.

Examples exist of exploding space rocks that have wreaked havoc on Earth (SN: 5/13/17, p. 12). An apparent meteor blast over a sparsely populated Siberian region in 1908, known as the Tunguska event, killed no one but flattened 2,000 square kilometers of forest. And a meteor explosion over Chelyabinsk, Russia, in 2013 injured more than 1,600 people, mainly due to broken glass from windows that were blown out.

Citations: P.J. Silvia et al. The 3.7kaBP Middle Ghor event: catastrophic termination of a Bronze Age civilization. American Schools of Oriental Research annual meeting, Denver, November 17, 2018.

Further Reading: T. Sumner. Here’s how an asteroid impact would kill you. Science News. Vol. 191, May 13, 2017, p. 12. 

Artist Rebuilding war destroyed Baghdad Library

History has repeated itself over and over: Iraq and its capital city Baghdad know that the first thing to pay is always what the mob instinctively understands as the seat of power – knowledge. Destroying the libraries was a way of erasing the symbols of the civilisation itself. However, unlike their predecessors, the latest invaders, as elaborated in the article below, are helping in the endeavour to repair the most significant collateral damage of all, through the initiative of an artist rebuilding the war-destroyed Baghdad library.

A Smithsonian SmartNews “Keeping You Current” article by Brigit Katz, dated August 15, 2018, describes this pertinent gesture of rebuilding the war-destroyed Baghdad library.


How an Artist Is Rebuilding a Baghdad Library Destroyed During the Iraq War

 

“168:01,” an installation now on view at the Aga Khan Museum in Toronto, encourages visitors to donate books to the University of Baghdad

(Aly Manji)

 

In 2003, at the start of the U.S.-led war in Iraq, looters set fire to the College of Fine Arts at the University of Baghdad. The college’s vast collection of 70,000 books was destroyed, and 15 years later, students still have few titles at their disposal. So, as Hadani Ditmars reports for the Art Newspaper, an installation at the Aga Khan Museum in Toronto is asking the public to help replenish the school’s lost library.

“168:01,” as the project by Iraqi-American artist Wafaa Bilal is titled, is a stark white display featuring bookshelves filled with 1,000 blank books. Visitors are encouraged to replenish the volumes with titles from an Amazon wish list compiled by the college’s students and faculty; donations can be made by sending the books on the wish list to the museum, or by gifting funds to the project through Bilal’s website.

In exchange for their donations, visitors are able to take home one of the exhibition’s white volumes that represent a rich cultural heritage stripped bare by years of conflict. In turn, the colorful books they contributed to the project will ultimately be sent to the College of Fine Arts.

“I wanted a simple visual representation of what’s been lost,” Bilal told Murray Whyte of the Toronto Star last month. “But what’s important is that, over time, this place comes back to life.”

Though Bilal’s project is focused on recouping the losses of one tragic event, “168:01” calls attention to a long history of cultural destruction in Iraq. The installation’s title refers to the destruction of the House of Wisdom, or Bayt al-Hikma, a grand library possibly founded by the Abbasid caliph Al-Mansur in the 8th century. Legend has it that when the Mongols laid siege to Baghdad in 1258, the library’s entire collection of manuscripts and books was thrown into the Tigris. The river is said to have run black for seven days—or 168 hours—due to all the ink seeping into its waters. But the “01” in the installation’s title is meant to signify a new era of restoration in Iraq—one that looks beyond centuries of loss.

Bilal, who came to America as a refugee in the wake of the First Gulf War, often reflects upon the traumas that have taken place in the country of his birth. In one of his best known works, the 2007 project “Domestic Tension,” the artist sequestered himself in a gallery space and broadcast live on the internet. Viewers could chat with him at all hours—and opt to shoot him with a robotically controlled paintball gun.

“168:01,” by contrast, seeks to move forward from violence. “To be completely frank, when we talk about war and destruction, when you try to bring that image here, I don’t think it resonates,” Bilal told Whyte of the Star. “There’s an obsession, I think, with images of conflict — when war is taking place, you want to engage people with that. But what happens post-conflict? Either you move on, or you look and say, what needs to be done now? I want to reflect the time now, and now is about rebuilding.”

“168:01” was first conceptualized with the Art Gallery of Windsor and curator Srimoyee Mitra for Bilal’s major solo exhibition at the museum in 2016. The project has since appeared in various iterations at other museums and galleries around the globe—from a tall tower of books at the Foundation for Art and Creative Technology in Liverpool to an entire room at the National Taiwan Museum of Fine Arts.

Though the installation at the Aga Khan Museum winds down Sunday, it will be rebuilt for the National Veterans Art Museum Triennial in Chicago next summer.

To date, thanks to visitors who have donated to the project, Bilal has been able to ship 1,700 texts back to Baghdad, contributing to the effort to rebuild the College of Fine Arts’ once prolific collection.

Read more: https://www.smithsonianmag.com/smart-news/artist-hopes-rebuild-baghdad-library-destroyed-during-iraq-war-180969979/#dbyxWHZyxugV3IVE.99

 

Genetic Parameters for yield and its components in Bread Wheat

The civilization of ancient Egypt has always been, and still is, indebted to the Nile River and its dependable water supply, which allowed staple food crops, above all wheat and barley, to be farmed. These are grown throughout the Delta region and all along the banks of the Nile, and more recently in the newly reclaimed areas of the western desert. Egypt, the most populous country in the MENA region, has for centuries had wheat as a central component of the typical diet of its inhabitants.


The country has lately been not only the largest importer of wheat but also the largest wheat consumer and per-capita bread eater in the world. Wheat represents almost 10% of the total value of agricultural production and about 20% of all agricultural imports. However, by 2015 domestic wheat production was declining, as producers found it less profitable, due mainly to the workings of Egypt’s government-subsidized bread program. There seems to be an increasing need for reform, but at the same time for research and development in all segments of wheat farming. Research on genetic parameters for yield and its components in bread wheat would obviously be near the top of local academic institutions’ agendas.

This article from the International Network for Natural Sciences dwells on a piece of research titled “Estimation of genetic parameters for yield and its components in bread wheat (Triticum aestivum L.) genotypes under pedigree selection”, a study by Abdel Aziz Nasr Sharaan, Kamal Hassan Ghallab and Mohamed Abdel Salam M. Eid of the Department of Agronomy, Fayoum University, Egypt, published by IJAAR on July 31, 2018.


Abstract

Grain yield is a complex trait and is greatly influenced by various environmental conditions. A 3-year field investigation was carried out to estimate genetic parameters for yield and its related traits of wheat under selection in reclaimed soil conditions. Three field experiments were executed at the Experimental Farm of the Faculty of Agriculture, Fayoum University at Demo (newly reclaimed sandy loam soil), Fayoum Governorate, during the 2012/2013, 2013/2014 and 2014/2015 growing seasons in a randomized complete block design (RCBD) with three replications. Results revealed that mean square values were highly significant for all studied traits in all seasons of the experiments, indicating the presence of sufficient variability among the investigated genotypes and offering several opportunities for wheat improvement.

Close correspondence was observed between the genotypic and phenotypic coefficients of variation for every one of the traits. The coefficients of variation were high for number of fertile tillers per plant (NFT), grains per spike (GS), grain weight per spike (GWS), grain yield per plant (GYP), spikes per m² (NSM), grain yield (GY) and harvest index (HI). Moderate values were recorded for heading date (HD) and spike length (SL) in all seasons, and low values were obtained for days to physiological maturity (DPM) in all seasons. Heritability was greater than 80% for all studied traits, whereas genetic advance as a percentage of mean (GAM%) ranged from 12.22 (SS) to 77.00 (GY) in the 1st season and from 15.42 and 12.69 (DPM) to 112.07 and 68.35 (GYP) in the 2nd and 3rd seasons.
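For readers unfamiliar with these quantities, the sketch below shows how such parameters are conventionally derived from an RCBD analysis of variance. It is not the authors’ code, and the mean squares, replication number and trait mean are illustrative values only; the formulas (genotypic and phenotypic coefficients of variation, broad-sense heritability, and genetic advance as a percentage of the mean at 5% selection intensity) are the standard ones.

```python
import math

def genetic_parameters(ms_genotype, ms_error, replications, trait_mean, k=2.06):
    """Standard variance-component estimates from an RCBD ANOVA.

    k = 2.06 is the selection differential at 5% selection intensity.
    """
    var_g = (ms_genotype - ms_error) / replications   # genotypic variance
    var_p = var_g + ms_error                          # phenotypic variance
    gcv = 100 * math.sqrt(var_g) / trait_mean         # genotypic coefficient of variation (%)
    pcv = 100 * math.sqrt(var_p) / trait_mean         # phenotypic coefficient of variation (%)
    h2 = var_g / var_p                                # broad-sense heritability
    ga = k * h2 * math.sqrt(var_p)                    # expected genetic advance under selection
    gam = 100 * ga / trait_mean                       # genetic advance as % of mean (GAM%)
    return {"GCV%": gcv, "PCV%": pcv, "h2": h2, "GAM%": gam}

# Illustrative numbers only: 3 replications and a hypothetical grain-yield ANOVA.
print(genetic_parameters(ms_genotype=120.0, ms_error=10.0, replications=3, trait_mean=25.0))
```

High heritability combined with high GAM%, as reported here for grain yield, is generally taken to indicate that selection on the trait should be effective.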

Get the original article at the source: Estimation of Genetic Parameters for yield and its components in bread wheat (Triticum aestivum L.) genotypes under pedigree selection

ijaar-v10no2-p22-30

Journal Name: International Journal of Agronomy and Agricultural Research (IJAAR)

Published By: International Network for Natural Sciences

Download PDF

via Estimation of Genetic Parameters for yield and its components in bread wheat (Triticum aestivum L.) genotypes under pedigree selection – IJAAR

“Scientific future shaped by ICT”: Dubai Science Park Director

“Scientific future shaped by ICT”: Dubai Science Park Director is a TahawulTech article by James Dartnell about how and why Marwan Abdulaziz Janahi, the Dubai Science Park’s executive director, made such a statement at a time of non-negligible uncertainty, not only for that country but for the whole region’s main contribution to the world economy, i.e. hydrocarbons. The UAE’s highly mediatised ambitions in space, notably the plan to build a new city on Mars, may come in handy in dispelling some of that uncertainty, but what about its ambitions with respect to developing an industry? And what about holding on to its present, and possibly future, success story as a local and regional retail and trade centre? Here is TahawulTech’s article.

“Scientific future shaped by ICT”: Dubai Science Park Director

The executive director of Dubai Science Park has said that the future of the Middle East’s scientific industry will be significantly affected by swift technological advancements.
Marwan Abdulaziz Janahi, who has been confirmed as a judge for tahawultech.com’s inaugural Future Enterprise Awards on 14th October at Jumeirah Emirates Towers Hotel in Dubai, said that IT was now not only saving lives, but also advancing the pace of scientific research.
Dubai Science Park’s work focuses around four main areas of science: human science, plant science, energy and environmental science, and Janahi believes that all are now being inextricably linked with and transformed by technology. “Across all of these areas, technology is an important component,” he says. “There is more and more of an overlap between ICT and these sectors. The essence of managing green buildings is a building management system, which is founded on ICT. Using data to predict human conditions is another prime example of where technology is needed.”
[Marwan Abdulaziz Janahi is part of the judging panel of TahawulTech.com Future Enterprise Awards on 14th October 2018 at Emirates Towers, Dubai.| Learn more about TahawulTech.com Future Enterprise Awards.]
While Janahi believes that “all” industries are being disrupted by technology, he says that the healthcare industry in particular is set to benefit citizens through its transformation. “The changes we’re seeing in digital health are particularly impressive,” he says. “Data that sits within servers can now be mined and used for forecasting, while telemedicine allows people who don’t have easy access to medical facilities a chance to be looked after. Even regular GP checkups can be done remotely.”
He also believes healthcare transformation will have a significant knock-on effect on other verticals. “There will be a huge disruption in the insurance industry, and managing the journey of patient, which today is all done offline,” he says. “Wearables will be huge, while technologies for things like blood sugar monitoring that connect to smartphones will have a huge impact. The human and environmental sciences will see the biggest disruption in the scientific field.”
Janahi is a chairing member of the Pharmaceuticals and Medical Equipment Task force of the Dubai Industrial Strategy 2030, which was announced in 2016. The strategy focuses on five other key areas – aerospace, maritime, aluminum and fabricated metals, food and beverages and machinery and equipment – and aims to transform Dubai from being a service-based economy, to one that creates “25%” of its GDP from industrial activity.
“The bulk of Dubai’s GDP comes from logistics, finance and tourism,” Janahi says. “Manufacturing currently creates around 9-10% of it, and we want to increase that number substantially. We want these kinds of enablers to make Dubai more successful, with opportunities for the short, medium and long-term.”
Janahi is keen to broaden his technological knowledge by participating as a judge in tahawultech.com’s Future Enterprise Awards. “I’m really excited to be a judge,” he says. “I’ve seen more and more technology adopted by the healthcare and pharmaceutical industries, but for me it’s interesting to see how technology can be deployed, and how we can learn from other industries.”

Read more on “Scientific future shaped by ICT”: Dubai Science Park executive director

Who owns Science and Technology?

The French say “La technologie est neutre ! Tout dépend de l’usage que l’on en fait” – technology is neutral, it all depends on the use made of it – but what about science? Here is the view of Rohan Deb Roy of the University of Reading on the issue. Not all international collaborations are equal; we selected the picture above (US Army Africa/Flickr, CC BY) to best illustrate this article. Does it really matter in this day and age who owns science and technology? Please read on and comment. Dr Rohan Deb Roy, Lecturer in South Asian History at the University of Reading since 2015, says the following.

I am particularly interested in the histories of science and medicine, histories of empire and colonialism, environmental history, and animal history. I am the author of Malarial Subjects: Empire, Medicine and Nonhumans in British India, 1820-1909 (Cambridge: Cambridge University Press, 2017) and co-editor (with Guy Attewell) of Locating the Medical: Explorations in South Asian History (New Delhi: Oxford University Press, 2018). I have put together a co-edited special issue on “Nonhuman Empires” (with Sujit Sivasundaram) for the journal “Comparative Studies of South Asia, Africa and the Middle East”, (35.1, May 2015). I received my PhD from University College London (2009), and have held postdoctoral fellowships at the Centre for Studies in Social Sciences Calcutta (2009-2010), at the University of Cambridge (2011-2013), and at the Max Planck Institute for the History of Science in Berlin (2013-2015). I have been a Barnard-Columbia Weiss International Visiting Scholar in the History of Science . . .  

Decolonise science – time to end another imperial era

Anti-cholera inoculation in Calcutta in 1894.
Wellcome Collection, CC BY-SA

 

Sir Ronald Ross had just returned from an expedition to Sierra Leone. The British doctor had been leading efforts to tackle the malaria that so often killed English colonists in the country, and in December 1899 he gave a lecture to the Liverpool Chamber of Commerce about his experience. In the words of a contemporary report, he argued that “in the coming century, the success of imperialism will depend largely upon success with the microscope”.

Ross, who won the Nobel Prize for Medicine for his malaria research, would later deny he was talking specifically about his own work. But his point neatly summarised how the efforts of British scientists were intertwined with their country’s attempt to conquer a quarter of the world.

Ross was very much a child of empire, born in India and later working there as a surgeon in the imperial army. So when he used a microscope to identify how a dreaded tropical disease was transmitted, he would have realised that his discovery promised to safeguard the health of British troops and officials in the tropics. In turn, this would enable Britain to expand and consolidate its colonial rule.

Ronald Ross at his lab in Calcutta, 1898.
Wellcome Collection, CC BY

Ross’s words also suggest how science was used to argue imperialism was morally justified because it reflected British goodwill towards colonised people. It implied that scientific insights could be redeployed to promote superior health, hygiene and sanitation among colonial subjects. Empire was seen as a benevolent, selfless project. As Ross’s fellow Nobel laureate Rudyard Kipling described it, it was the “white man’s burden” to introduce modernity and civilised governance in the colonies.

But science at this time was more than just a practical or ideological tool when it came to empire. Since its birth around the same time as Europeans began conquering other parts of the world, modern Western science was inextricably entangled with colonialism, especially British imperialism. And the legacy of that colonialism still pervades science today.

As a result, recent years have seen an increasing number of calls to “decolonise science”, even going so far as to advocate scrapping the practice and findings of modern science altogether. Tackling the lingering influence of colonialism in science is much needed. But there are also dangers that the more extreme attempts to do so could play into the hands of religious fundamentalists and ultra-nationalists. We must find a way to remove the inequalities promoted by modern science while making sure its huge potential benefits work for everyone, instead of letting it become a tool for oppression.

The gracious gift of science

When a slave in an early 18th-century Jamaican plantation was found with a supposedly poisonous plant, his European overlords showed him no mercy. Suspected of conspiring to cause disorder on the plantation, he was treated with typical harshness and hanged to death. The historical records don’t even mention his name. His execution might also have been forgotten forever if it weren’t for the scientific enquiry that followed. Europeans on the plantation became curious about the plant and, building on the slave’s “accidental finding”, they eventually concluded it wasn’t poisonous at all.

Instead it became known as a cure for worms, warts, ringworm, freckles and cold swellings, with the name Apocynum erectum. As the historian Pratik Chakrabarti argues in a recent book, this incident serves as a neat example of how, under European political and commercial domination, gathering knowledge about nature could take place simultaneously with exploitation.

For imperialists and their modern apologists, science and medicine were among the gracious gifts from the European empires to the colonial world. What’s more, the 19th-century imperial ideologues saw the scientific successes of the West as a way to allege that non-Europeans were intellectually inferior and so deserved and needed to be colonised.

A racist caricature of European scientists visiting Africa. The severed head on the right is that of Ronald Ross.
Wellcome Collection, CC BY

In the incredibly influential 1835 memo “Minute on Indian Education”, British politician Thomas Macaulay denounced Indian languages partially because they lacked scientific words. He suggested that languages such as Sanskrit and Arabic were “barren of useful knowledge”, “fruitful of monstrous superstitions” and contained “false history, false astronomy, false medicine”.

Such opinions weren’t confined to colonial officials and imperial ideologues and were often shared by various representatives of the scientific profession. The prominent Victorian scientist Sir Francis Galton argued that “the average intellectual standard of the negro race is some two grades below our own (the Anglo Saxon)”. Even Charles Darwin implied that “savage races” such as “the negro or the Australian” were closer to gorillas than were white Caucasians.

Yet 19th-century British science was itself built upon a global repertoire of wisdom, information, and living and material specimens collected from various corners of the colonial world. Extracting raw materials from colonial mines and plantations went hand in hand with extracting scientific information and specimens from colonised people.

Imperial collections

Leading public scientific institutions in imperial Britain, such as the Royal Botanic Gardens at Kew and the British Museum, as well as ethnographic displays of “exotic” humans, relied on a global network of colonial collectors and go-betweens. By 1857, the East India Company’s London zoological museum boasted insect specimens from across the colonial world, including from Ceylon, India, Java and Nepal.

The British and Natural History museums were founded using the personal collection of doctor and naturalist Sir Hans Sloane. To gather these thousands of specimens, Sloane had worked intimately with the East India, South Sea and Royal African companies, which did a great deal to help establish the British Empire.

The scientists who used this evidence were rarely sedentary geniuses working in laboratories insulated from imperial politics and economics. The likes of Charles Darwin on the Beagle and botanist Sir Joseph Banks on the Endeavour literally rode on the voyages of British exploration and conquest that enabled imperialism.

Sir Hans Sloane’s imperial collection started the British Museum.
Paul Hudson/Wikipedia, CC BY

Other scientific careers were directly driven by imperial achievements and needs. Early anthropological work in British India, such as Sir Herbert Hope Risley’s Tribes and Castes of Bengal, published in 1891, drew upon massive administrative classifications of the colonised population.

Map-making operations including the work of the Great Trigonometrical Survey in South Asia came from the need to cross colonial landscapes for trade and military campaigns. The geological surveys commissioned around the world by Sir Roderick Murchison were linked with intelligence gathering on minerals and local politics.

Efforts to curb epidemic diseases such as plague, smallpox and cholera led to attempts to discipline the routines, diets and movements of colonial subjects. This opened up a political process that the historian David Arnold has termed the “colonisation of the body”. By controlling people as well as countries, the authorities turned medicine into a weapon with which to secure imperial rule.

Imperialist Cecil Rhodes planned a railway and telegraph line to connect Africa.
Wikipedia

New technologies were also put to use expanding and consolidating the empire. Photographs were used for creating physical and racial stereotypes of different groups of colonised people. Steamboats were crucial in the colonial exploration of Africa in the mid-19th century. Aircraft enabled the British to surveil and then bomb rebellions in 20th-century Iraq. The innovation of wireless radio in the 1890s was shaped by Britain’s need for discreet, long-distance communication during the South African war.

In these ways and more, Europe’s leaps in science and technology during this period both drove and were driven by its political and economic domination of the rest of the world. Modern science was effectively built on a system that exploited millions of people. At the same time it helped justify and sustain that exploitation, in ways that hugely influenced how Europeans saw other races and countries. What’s more, colonial legacies continue to shape trends in science today.

Modern colonial science

Since the formal end of colonialism, we have become better at recognising how scientific expertise has come from many different countries and ethnicities. Yet former imperial nations still appear almost self-evidently superior to most of the once-colonised countries when it comes to scientific study. The empires may have virtually disappeared, but the cultural biases and disadvantages they imposed have not.

You just have to look at the statistics on the way research is carried out globally to see how the scientific hierarchy created by colonialism continues. The annual rankings of universities are published mostly by the Western world and tend to favour its own institutions. Academic journals across the different branches of science are mostly dominated by the US and western Europe.

It is unlikely that anyone who wishes to be taken seriously today would explain this data in terms of innate intellectual superiority determined by race. The blatant scientific racism of the 19th century has now given way to the notion that excellence in science and technology is a euphemism for significant funding, infrastructure and economic development.

Because of this, most of Asia, Africa and the Caribbean is seen either as playing catch-up with the developed world or as dependent on its scientific expertise and financial aid. Some academics have identified these trends as evidence of the persisting “intellectual domination of the West” and labelled them a form of “neo-colonialism”.

Various well-meaning efforts to bridge this gap have struggled to go beyond the legacies of colonialism. For example, scientific collaboration between countries can be a fruitful way of sharing skills and knowledge, and learning from the intellectual insights of one another. But when an economically weaker part of the world collaborates almost exclusively with very strong scientific partners, it can take the form of dependence, if not subordination.

A 2009 study showed that about 80% of Central Africa’s research papers were produced with collaborators based outside the region. With the exception of Rwanda, each of the African countries principally collaborated with its former coloniser. As a result, these dominant collaborators shaped scientific work in the region. They prioritised research on immediate local health-related issues, particularly infectious and tropical diseases, rather than encouraging local scientists to also pursue the fuller range of topics pursued in the West.

In the case of Cameroon, local scientists’ most common role was in collecting data and fieldwork while foreign collaborators shouldered a significant amount of the analytical science. This echoed a 2003 study of international collaborations in at least 48 developing countries that suggested local scientists too often carried out “fieldwork in their own country for the foreign researchers”.

In the same study, 60% to 70% of the scientists based in developed countries did not acknowledge their collaborators in poorer countries as co-authors in their papers. This is despite the fact they later claimed in the survey that the papers were the result of close collaborations.

Mistrust and resistance

International health charities, which are dominated by Western countries, have faced similar issues. After the formal end of colonial rule, global health workers long appeared to represent a superior scientific culture in an alien environment. Unsurprisingly, interactions between these skilled and dedicated foreign personnel and the local population have often been characterised by mistrust.

For example, during the smallpox eradication campaigns of the 1970s and the polio campaign of the past two decades, the World Health Organization’s representatives found it quite challenging to mobilise willing participants and volunteers in the interiors of South Asia. On occasions they even saw resistance on religious grounds from local people. But their stringent responses, which included the close surveillance of villages, cash incentives for identifying concealed cases and house-to-house searches, added to this climate of mutual suspicion. These experiences of mistrust are reminiscent of those created by strict colonial policies of plague control.

Polio eradication needs willing volunteers.
Department for International Development, CC BY

Western pharmaceutical firms also play a role by carrying out questionable clinical trials in the developing world where, as journalist Sonia Shah puts it, “ethical oversight is minimal and desperate patients abound”. This raises moral questions about whether multinational corporations misuse the economic weaknesses of once-colonised countries in the interests of scientific and medical research.

The colonial image of science as a domain of the white man even continues to shape contemporary scientific practice in developed countries. People from ethnic minorities are underrepresented in science and engineering jobs and more likely to face discrimination and other barriers to career progress.

To finally leave behind the baggage of colonialism, scientific collaborations need to become more symmetrical and founded on greater degrees of mutual respect. We need to decolonise science by recognising the true achievements and potential of scientists from outside the Western world. Yet while this structural change is necessary, the path to decolonisation has dangers of its own.

Science must fall?

In October 2016, a YouTube video of students discussing the decolonisation of science went surprisingly viral. The clip, which has been watched more than 1m times, shows a student from the University of Cape Town arguing that science as a whole should be scrapped and started again in a way that accommodates non-Western perspectives and experiences. The student’s point that science cannot explain so-called black magic earned the argument much derision and mockery. But you only have to look at the racist and ignorant comments left beneath the video to see why the topic is so in need of discussion.

Inspired by the recent “Rhodes Must Fall” campaign against the university legacy of the imperialist Cecil Rhodes, the Cape Town students became associated with the phrase “science must fall”. While it may be interestingly provocative, this slogan isn’t helpful at a time when government policies in a range of countries including the US, UK and India are already threatening to impose major limits on science research funding.

More alarmingly, the phrase also runs the risk of being used by religious fundamentalists and cynical politicians in their arguments against established scientific theories such as climate change. This is a time when the integrity of experts is under fire and science is the target of political manoeuvring. So polemically rejecting the subject altogether only plays into the hands of those who have no interest in decolonisation.

Alongside its imperial history, science has also inspired many people in the former colonial world to demonstrate remarkable courage, critical thinking and dissent in the face of established beliefs and conservative traditions. These include the iconic Indian anti-caste activist Rohith Vemula and the murdered atheist authors Narendra Dabholkar and Avijit Roy. Demanding that “science must fall” fails to do justice to this legacy.

The call to decolonise science, as in the case of other disciplines such as literature, can encourage us to rethink the dominant image that scientific knowledge is the work of white men. But this much-needed critique of the scientific canon carries the other danger of inspiring alternative national narratives in post-colonial countries.

A March for Science protester in Melbourne.
www.wikimedia.com, CC BY-SA

For example, some Indian nationalists, including the country’s current prime minister, Narendra Modi, have emphasised the scientific glories of an ancient Hindu civilisation. They argue that plastic surgery, genetic science, aeroplanes and stem cell technology were in vogue in India thousands of years ago. These claims are not just a problem because they are factually inaccurate. Misusing science to stoke a sense of nationalist pride can easily feed into jingoism.

Meanwhile, various forms of modern science and their potential benefits have been rejected as unpatriotic. In 2016, a senior Indian government official even went so far as to claim that “doctors prescribing non-Ayurvedic medicines are anti-national”.

The path to decolonisation

Attempts to decolonise science need to contest jingoistic claims of cultural superiority, whether they come from European imperial ideologues or the current representatives of post-colonial governments. This is where new trends in the history of science can be helpful.

For example, instead of the parochial understanding of science as the work of lone geniuses, we could insist on a more cosmopolitan model. This would recognise how different networks of people have often worked together in scientific projects and the cultural exchanges that helped them – even if those exchanges were unequal and exploitative.

But if scientists and historians are serious about “decolonising science” in this way, they need to do much more to present the culturally diverse and global origins of science to a wider, non-specialist audience. For example, we need to make sure this decolonised story of the development of science makes its way into schools.

Students should also be taught how empires affected the development of science and how scientific knowledge was reinforced, used and sometimes resisted by colonised people. We should encourage budding scientists to question whether science has done enough to dispel modern prejudices based on concepts of race, gender, class and nationality.

Schools need to teach the non-Western history of science.
Wikimedia Commons

Decolonising science will also involve encouraging Western institutions that hold imperial scientific collections to reflect more on the violent political contexts of war and colonisation in which these items were acquired. An obvious step forward would be to discuss repatriating scientific specimens to former colonies, as botanists working on plants originally from Angola but held primarily in Europe have done. If repatriation isn’t possible, then co-ownership or priority access for academics from post-colonial countries should at least be considered.

This is also an opportunity for the broader scientific community to critically reflect on its own profession. Doing so will inspire scientists to think more about the political contexts that have kept their work going and about how changing them could benefit the scientific profession around the world. It should spark conversations between the sciences and other disciplines about their shared colonial past and how to address the issues it creates.

Unravelling the legacies of colonial science will take time. But the field needs strengthening at a time when some of the most influential countries in the world have adopted a lukewarm attitude towards scientific values and findings. Decolonisation promises to make science more appealing by integrating its findings more firmly with questions of justice, ethics and democracy. Perhaps, in the coming century, success with the microscope will depend on success in tackling the lingering effects of imperialism.

Rohan Deb Roy, Lecturer in South Asian History, University of Reading

This article was originally published on The Conversation. Read the original article.

Learning from the Great Pyramids of Giza and Stonehenge

Below is an article by Daniel Brown of Nottingham Trent University; its introduction says it all about yet another attempt at learning from the Great Pyramids of Giza and Stonehenge, put to the test as technological advances allow us to go deeper into the ever thinner layers of history...

The graphic above is from a book on the Pyramid of Giza written by Eckhart R. Schmitz.

Ever since humans could look up to see the sky, we have been amazed by its beauty and untold mysteries. Naturally then, astronomy is often described as the oldest of the sciences, inspiring people for thousands of years. Celestial phenomena are featured in prehistoric cave paintings. And monuments such as the Great Pyramids of Giza and Stonehenge seem to be aligned with precision to cardinal points or the positions where the moon, sun or stars rise and set on the horizon.

From the pyramids to Stonehenge – were prehistoric people astronomers?


Ricardo Liberato/wikimedia, CC BY-ND

 

Today, we seem to struggle to imagine how ancient people could build and orient such structures. This has led to many assumptions. Some suggest prehistoric people must have had some knowledge of mathematics and sciences to do this, whereas others go so far as to speculate that alien visitors showed them how to do it.

But what do we actually know about how people of the past understood the sky and developed a cosmology? A scientific discipline called “archaeoastronomy” or “cultural astronomy”, developed in the 1970s, is starting to provide insights. This subject combines various specialist areas, such as astronomy, archaeology, anthropology and ethno-astronomy.

Simplistic methods

The pyramids of Egypt are some of the most impressive ancient monuments, and several are oriented with high precision. Egyptologist Flinders Petrie carried out the first high-precision survey of the Giza pyramids in the 19th century. He found that each of the four edges of the pyramids’ bases points towards a cardinal direction to within a quarter of a degree.

But how did the Egyptians know that? Just recently, Glen Dash, an engineer who studies the Giza pyramids, proposed a theory. He draws upon the ancient method of the “Indian circle”, which only requires a shadow casting stick and string to construct an east-west direction. He outlined how this method could have been used for the pyramids based on its simplicity alone.

So could this have been the case? It’s not impossible, but at this point we are in danger of falling into a popular trap of reflecting our current world views, methods and ideas into the past. Insight into mythology and relevant methods known and used at the time are likely to provide a more reliable answer.

Stonehenge sun. simonwakefield/Flickr, CC BY-SA

This is not the first time scientists have jumped to conclusions about a scientific approach applied to the past. A similar thing happened with Stonehenge. In 1964, the late astronomer Gerald Hawkins developed an intricate method to use pit holes and markers to predict eclipses at the mysterious monument. However, this does not mean that this is how Stonehenge was intended to be used.

Way forward

To start understanding the past we need to include various approaches from other disciplines to support an idea. We also have to understand that there will never be only one explanation or answer to how a monument might have been aligned or used.

So how can cultural astronomy explain the pyramids’ alignment? A study from 2001 proposed that two stars, Megrez and Phad, in the stellar constellation known as Ursa Major may have been the key. These stars are visible through the entire night. Their lowest position in the sky during a night can mark north using the merkhet – an ancient timekeeping instrument comprising a bar with a plumb line attached to a wooden handle, used to track stars’ alignment.

The benefit of this interpretation is that it links to star mythology drawn from inscriptions in the temple of Horus in Edfu. These elaborate on using the merkhet as a surveying tool – a technique that can also explain the orientation of other Egyptian sites. The inscription includes the hieroglyph “the Bull’s Foreleg” which represents the Big Dipper star constellation and its possible position in the sky.

The use of the two stars Megrez and Phad of Ursa Major to line up with the cardinal north direction (meridian indicated in orange) as simulated for 2562 BC. Daniel Brown

Similarly, better ideas for Stonehenge have been offered. One study identified strange circles of wood near the monument, and suggested these may have represented the living while the rocks at Stonehenge represented the dead. Similar practices are seen in monuments found in Madagascar, suggesting it may have been a common way for prehistoric people to think about the living and the dead. It also offers an exciting new way of understanding Stonehenge in its wider landscape. Others have interpreted Stonehenge and especially its avenue as marking the ritual passage through the underworld with views of the moon on the horizon.

Cultural astronomy has also helped shed light on 6,000-year-old passage graves – a type of tomb consisting of a chamber of connected stones and a long narrow entrance – in Portugal. Archaeologist Fabio Silva has shown how views from inside the tombs frame the horizon where the star Aldebaran rises above a mountain range. This might mean it was built to give a view of the star from the inside either for the dead or the living, possibly as an initiation ritual.

Fieldwork at one of the passage graves in Portugal, Dolmen da Orca. Next to the stone structure is a replica tent to simulate the view from inside of the passage grave. Daniel Brown

But Silva also drew upon wider supporting evidence. The framed mountain range is where the builders of the graves would have migrated with their livestock over summer. The star Aldebaran rises for the first time in the year here – known as a heliacal rising – at the beginning of this migration. Interestingly, ancient folklore also talks about a shepherd in this area who spotted a star so bright that it lit up the mountain range. Arriving there he decided to name both the mountain range and his dog after the star – both names still exist today.

Current work carried out by myself in collaboration with Silva has also shown how a view from within the long, narrow entrance passages to the tombs could enhance the star’s visibility by restricting the view through an aperture.

But while it is easy to assume that prehistoric people were analytic astronomers with great knowledge of science, it’s important to remember that this only reflects our modern views of astronomy. Findings from cultural astronomy show that people of the past were indeed sky watchers and incorporated what they saw in many aspects of their lives. While there are still many mysteries surrounding the meaning and origins of ancient structures, an approach drawing on as many areas as possible, including experience and engagement with meaning, is likely our best bet to work out just what they were once used for.

Daniel Brown, Lecturer in Astronomy, Nottingham Trent University

This article was originally published on The Conversation. Read the original article.

Space as a step towards the future

Could the Arabs, of all the peoples of the MENA region, one day regain their historical inspiration and explore space, wondered Nature Asia, almost at the same time as many were witnessing the United Arab Emirates surprisingly engage in building a space programme, justifying its approach as being not just about getting into orbit but about treating space as a step towards the future. Here is an account by Emily Thomas of Durham University regarding the different ways of thinking about space.

What is space? The 300-year-old philosophical battle that is still raging today

Pexels, CC BY

 

Mountains. Whales. The distant stars. All these things exist in space, and so do we. Our bodies take up a certain amount of space. When we walk to work, we are moving through space. But what is space? Is it even an actual, physical entity? In 1717, a battle was waged over this question. Exactly 300 years later, it continues.

You might think physicists have “solved” the problem of space. The likes of mathematician Hermann Minkowski and physicist Albert Einstein taught us to conceive space and time as a unified continuum, helping us to understand how very large and very little things such as individual atoms move. Nonetheless, we haven’t solved the question of what space is. If you sucked all the matter out of the universe, would space be left behind?

Twenty-first century physics is arguably compatible with two very different accounts of space: “relationism” and “absolutism”. Both these views owe their popularity to Caroline of Ansbach (1683-1737), a German-born Queen of Great Britain, who stuck her oar into the philosophical currents swirling around her.

Caroline was a keen philosopher, and in the early 18th century she schemed to pit the leading philosophies of her period against each other. On the continent, philosophers were stuck in “rationalism”, spinning world theories from armchairs. Meanwhile, British philosophers were developing science-inspired “empiricism” – theories built on observations. They were worshipping scientists such as Robert Boyle and Isaac Newton.

Gottfried Wilhelm von Leibniz.

Caroline asked two philosophers to exchange letters. One was the German philosopher Gottfried Leibniz, rationalist par excellence. The other was the English philosopher Samuel Clarke, a close friend of Newton. The two men agreed, and their exchange was published in 1717 as A Collection of Papers. The dull title doesn’t sound like much, but these papers were revolutionary. And one of their central issues was the nature of space.

Everything or nothing?

Is there space between the stars? The relationist Leibniz argued that space is the spatial relations between things. Australia is “south of” Singapore. The tree is “three meters left of” the bush. Sean Spicer is “behind” the bush. That means space would not exist independently of the things it connects. For Leibniz, if nothing existed, there couldn’t be any spatial relations. If our universe were destroyed, space would not exist.

In contrast, the absolutist Clarke argued that space is a sort of substance that is everywhere. Space is a giant container, containing all the things in the universe: stars, planets, us. Space allows us to make sense of how things move from one place to another, of how our entire material universe could move through space. What’s more, Clarke argued that space is divine: space is God’s presence in the world. In a way, space is God. For Clarke, if our universe were destroyed, space would be left behind. Just as you can’t delete God, you can’t delete space.

Samuel Clarke.
Portrait attributed to Charles Jervas.

The Leibniz-Clarke letters exploded early 18th century thought. Thinkers like Newton, who were already involved in the debate, were dragged deeper in. Newton argued that space was more than the relations between material objects. He argued it was an absolute entity, that everything moves in relation to it. This led to the distinction between “relative” and “absolute” motion. The Earth moves relative to other material things, such as the sun, but it also moves absolutely – with regard to space.

Others joined the party later, like Immanuel Kant. He believed space is just a concept humans use to make sense of the world, rather than a real entity. It wasn’t just philosophers and physicists who had views on space either. All sorts of people had their say, from stocking makers to tenant farmers. One especially unlikely discussion of space turns up in Thomas Amory’s 1755 Memoirs: Containing the Lives of Several Ladies of Great Britain.

The problem with God

People were especially edgy about Clarke’s view that space is God. Does that mean we’re moving through God all the time? God doesn’t just see everything, he is everywhere? They also became worried about Big Things. As a whale takes up more space than a holy man, is a whale holier? As mountains are so large, are they like God?

Holy?
Thomas Fuhrmann/wikipedia, CC BY-SA

The 20th century philosopher Bertrand Russell once argued we shouldn’t worship mere size. “Sir Isaac Newton was very much smaller than a hippopotamus, but we do not on that account value him less than the larger beast,” he wrote. Some 18th century thinkers would have disagreed – they were worried they should be worshipping a hippopotamus over Newton.

Today, the concept of God is disappearing from the debate. Yet some contemporary philosophers, such as Tim Maudlin and Graham Nerlich, think that current theories in physics do support Clarke’s view (minus the religious parts). Spacetime is one big container, and all of us move around in it.

Other philosophers, such as Kenneth Manders and Julian Barbour, think our best physics is compatible with both views, and there are other reasons to believe Leibniz’s theory was right. If the physics really is compatible with absolutism or relationism, then perhaps we should prefer relationism as the simpler theory? After all, why posit a giant entity that acts like a container if we don’t have to?

As a historian of space and time, I’m fascinated by how the debate has evolved, how something that started 300 years ago has unfurled and grown. Clearly, though the Leibniz-Clarke papers are not well known outside of philosophy, the debate they started continues. Caroline of Ansbach has a lot to answer for.

Emily Thomas, Assistant Professor of Philosophy, Durham University

This article was originally published on The Conversation. Read the original article.

 

 


Hike in Intellectual Property Fees in GCC


Does the following story on the hike in intellectual property fees in the GCC have anything to do with the fall in the price of oil and gas, or is it, as put forward here, due to the GCC’s new internal as well as inter-state coordination? The article below, written by a team at BQ in discussion with Katie Montazeri, a partner at DLA Piper Middle East LLP with expertise in this field, was published on October 23, 2016. It suggests that there is indeed some sort of inverse relationship between these two segments of the countries’ economies.

Hike in IP fees in GCC – who’s gaining?

As the fees for protecting one’s brand rises, small businesses opting for non-registration of trademarks could face risk from unscrupulous competitors

Intellectual property fees spiked recently all across the GCC, in some countries by as much as 6,200 percent, making the registration of patents, trademarks or designs among the most expensive in the world.

In the last couple of months Bahrain and Kuwait raised their intellectual property fees more than just significantly, while Saudi Arabia and the UAE did the same last year. According to local media reports, the official cost of registering a trademark in Kuwait is due to increase this year from USD 25 to USD 1,586, while in Bahrain registration fees will be raised by 728 percent, from USD 160 to USD 1,325.

In Saudi Arabia, fees for renewing trademarks jumped last year from USD 80 to USD 800. Last March, in the UAE, the same fee went up by 99.9 percent and now stands at USD 2,725.
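For readers who want to check the arithmetic behind the percentages quoted above, here is a minimal sketch using only the old and new USD figures given in this article.

```python
# Quick arithmetic check on the fee increases quoted above.
# All USD figures are taken from this article; the script simply
# computes the percentage increase implied by each old/new pair.

fees_usd = {
    "Kuwait trademark registration": (25, 1_586),
    "Bahrain trademark registration": (160, 1_325),
    "Saudi Arabia trademark renewal": (80, 800),
}

for label, (old, new) in fees_usd.items():
    increase_pct = (new - old) / old * 100
    print(f"{label}: USD {old} -> USD {new} (+{increase_pct:.0f}%)")

# Expected output (rounded):
#   Kuwait trademark registration: USD 25 -> USD 1586 (+6244%)
#   Bahrain trademark registration: USD 160 -> USD 1325 (+728%)
#   Saudi Arabia trademark renewal: USD 80 -> USD 800 (+900%)
```

The Kuwaiti jump of roughly 6,244 percent is the source of the “as much as 6,200 percent” figure cited earlier.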

Explaining the consequences of those decisions, Katie Montazeri, partner at DLA Piper Middle East LLP, says that last year’s unexpected 100 percent rise in many of the official fees charged by the UAE trademark office meant that the UAE was now among the most expensive countries in the world in terms of filing a national trademark application. DLA Piper is a global law firm with lawyers located in more than 30 countries throughout the Americas, Europe, the Middle East, Africa and Asia Pacific.


“The impact is particularly significant because the UAE also has a monoclass system, meaning that a separate application must be filed (and paid for) in each relevant class. Not surprisingly, it appears that the fee increase has led to a decrease in the number of trademark filings received per month by the office. In my experience, businesses are continuing to file trademark applications in the UAE, but they are taking a more conservative approach to the number of classes to be covered. It remains to be seen whether this will have an impact on enforcement in the longer term,” stated Montazeri.
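To make the effect of the monoclass system concrete, here is a minimal sketch of the cost arithmetic. The per-class fee used below is a purely hypothetical placeholder, not an official UAE figure.

```python
# Sketch of how a monoclass system multiplies official fees:
# one application, and one fee, is required for every class covered.
# PER_CLASS_FEE_USD is a hypothetical placeholder, not an official figure.

PER_CLASS_FEE_USD = 2_000

def filing_cost(num_classes: int) -> int:
    """Total official cost of covering num_classes classes
    when each class needs its own application and fee."""
    return PER_CLASS_FEE_USD * num_classes

for n in (1, 3, 5):
    print(f"{n} class(es): USD {filing_cost(n):,}")

# A brand owner covering 5 classes pays five times the single-class fee,
# which is why filers are becoming more selective about the classes covered.
```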

The Qatari office of Abu Ghazaleh Intellectual Property (AGIP) had something similar to say. “The number of applications that will be filed at the trademark offices in each country of the GCC will be reduced,” their Doha-based agents said, adding that small businesses, law firms and ministries of trade will be the most affected by such high fees for intellectual property registration and renewal.

According to Montazeri, many international brand owners would simply absorb the additional filing and renewal fees as the cost of doing business in the region. “For them, the need for brand protection is greater than the deterrent effect of high costs. SMEs are most likely to find the increases prohibitive and may defer registration or even decide that trademark registration is not affordable, potentially placing their brands at risk,” she said.

“Interestingly, the IPO (intellectual property office) in the European Union recently reduced their fees as a means of making the EUTM (European Union trade mark) more attractive to SMEs. It would benefit SMEs in the GCC if a discounted rate would apply for applicants whose turnover or number of employees (for example) falls below a certain threshold. This would also help to support national initiatives to encourage inventors and innovators in line with the UAE’s emphasis on innovation and the knowledge economy,” Montazeri added.

Others will follow

In Qatar, according to the website of the Ministry of Economy and Commerce, a total fee of QR 1,000 (USD 274) is payable upon submission of the application, and another fee of QR 325 (USD 89) is collected after the application is accepted and the registration of the trademark is announced in the official gazette.

If a four-month period starting from the date of publication elapses without an objection being raised, a trademark registration certificate is granted upon payment of a total of QR 2,025 (USD 556). So the total cost of trademark registration in Qatar comes to QR 3,350 (USD 920), which is, at least for now, less than in Kuwait or Bahrain. The trademark certificate is valid within Qatar for 10 years from the issuance date and can be renewed for another 10 years.
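As a rough check, the Qatari figures quoted above add up as follows; the USD conversion assumes the riyal’s peg of roughly QR 3.64 to the US dollar.

```python
# Totals for the Qatari trademark fee schedule quoted above.
# QR amounts are taken from the article; the USD conversion assumes
# the riyal's peg of roughly 3.64 QR per US dollar.

QR_PER_USD = 3.64

fees_qr = {
    "on submission of the application": 1_000,
    "on acceptance and gazette publication": 325,
    "on grant of the registration certificate": 2_025,
}

total_qr = sum(fees_qr.values())
print(f"Total: QR {total_qr:,} (about USD {total_qr / QR_PER_USD:.0f})")
# -> Total: QR 3,350 (about USD 920)
```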


But Montazeri feels that the remaining GCC member states, Oman and Qatar, might also follow the example of their neighbors, in preparation for the introduction of the long-awaited unified GCC trademark law. “There have been corresponding increases in the fees charged by the trademark offices in both Kuwait and KSA, and it seems likely that other GCC countries will follow suit. Some commentators have suggested that the increase in fees is linked to the introduction of the unified GCC trademark law and the need to fund improved technology such as publicly accessible IP (intellectual property) databases,” she told BQ.

“If so, then the longer-term effect on the GCC economies is likely to be positive because digitalization and ease of search facilities is an area where the GCC is less sophisticated than other regions and businesses would benefit from more advanced filing and search technology. However, the short-term impact of these increases (particularly on SMEs) combined with the monoclass system, which requires multiple filings, is significant. It will be interesting to assess what (if any) impact the increases have on the UAE trademark office’s revenue and on the number of filings from businesses within and outside the UAE,” Montazeri added.

Read more in the original document at the above-mentioned address.

COP 21 Earth’s Climate Change Challenge . . .


On 22 April 2016, the signing of the Paris Agreement by 175 countries at the headquarters of the United Nations in New York revived hopes that all nations would work together to meet the COP 21 Earth’s Climate Change Challenge . . .

Following the success of COP 21, last week’s signing is a real source of satisfaction for the scientific community, which wants to reach carbon neutrality by the end of the century.

The international community has pledged to limit the increase in global temperature to well below 2°C above pre-industrial levels, with the adopted text calling for continued efforts to cap the increase at 1.5°C.

Conscious of having to make an effort, the European Union decided to cut its emissions by 40% by 2030, compared with 1990 levels.

The United Kingdom has already managed to reduce its GHG emissions by 20%.

Since 2000, the United States has reduced its emissions by 6%, whilst its GDP has increased by 28%.

Canada agreed with the United States to reduce polluting gas emissions from oil and gas drilling by up to 45% below 2012 levels by 2025.

As a matter of fact, more than 21 countries have firmly committed themselves along these lines since the beginning of the 21st century.

Climate agreement: Is it too little, too late?

A historic agreement on climate change signed by 175 countries last week was a watershed moment in history for the global battle to preserve our environment

Gulf News published an article on April 25, 2016, compiled by Chiranjib Sengupta, Hub Editor:

“Representatives from more than 160 nations gathered at the United Nations to sign the accord they hammered out in Paris last December to reduce greenhouse gas emissions and slow the effects of climate change,” said the Los Angeles Times in an editorial. “But is it too little, too late? The accord was an extraordinary achievement, but in the end, it was only a nonbinding agreement, and everyone understood that the real, daunting challenge would be in working together to meet the accord’s stated goals,” the paper questioned.

Climate Change Paris