Pangolins look like small, armoured anteaters. With their long prehensile tails, they are surprisingly good climbers, and when threatened they curl into tight balls to wait for their assailant to leave. It's a cute, disarming and horribly inept defence against the threat they now face: once wrapped in a ball, a pangolin can very easily be picked up.
Between 2006 and 2015, 1,112,756 pangolins were trafficked. It happens on such a scale that their bodies are initially bought and sold in bales and sacks – entire truckloads of them – before being broken down into scales and pieces of meat for pseudo-curatives within traditional Chinese medicine. They are the most trafficked animal in the world.
All eight species of pangolin are now threatened with extinction, which means that to trade in them is completely illegal. Ever since 1975 an international convention – CITES – has banned trade in species that are endangered. Almost every country in the world is a signatory, and all have a responsibility to act.
One of the problems is that, like so many crimes, the buying and selling of endangered wildlife like pangolins has gone digital. Online order forms for pangolin scales, auctions for pangolin products, and quack medical sites describing in tedious detail the bogus science behind pangolin medicine all support this criminal industry. But while individual cases have been building up, it’s been difficult to know exactly how much of it is happening, and where.
That is why, for the past year, a group of technologists and I have been building a way to detect this – and other kinds of – online environmental crime.
It wasn't an easy task: we had to create a technical process that could identify the language of the endangered-animals market; scour the internet looking for examples of it; and then train an entire architecture of algorithms to filter out the vast number of sites that probably weren't involved. And because we also knew the practice would change over time, we needed to make the whole system dynamic, so it would learn from what it found in order to find more. (If online research methodology is your thing, you can read about the nuts and bolts of the project.)
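To make the "learn from what it found in order to find more" loop concrete, here is a purely illustrative sketch. All names, seed terms and scoring rules below are my own invention, not the project's actual method; a real pipeline would use trained classifiers rather than raw term counts.

```python
# Hypothetical sketch: a seed-term page filter that expands its own
# vocabulary from the pages it flags, so each pass can find more.
import re
from collections import Counter

# Invented seed vocabulary for the endangered-animals market.
SEED_TERMS = {"pangolin", "scales", "wholesale", "traditional medicine"}

def score(page_text: str, terms: set) -> int:
    """Count how many known market terms appear in the page."""
    text = page_text.lower()
    return sum(1 for t in terms if t in text)

def filter_and_learn(pages: list, terms: set, threshold: int = 2):
    """Keep likely-relevant pages, then harvest recurring new words
    from them into the term set (the 'dynamic' part)."""
    flagged = [p for p in pages if score(p, terms) >= threshold]
    counts = Counter(
        w for p in flagged for w in re.findall(r"[a-z]{4,}", p.lower())
    )
    # Promote words that recur across the flagged pages.
    terms |= {w for w, c in counts.items() if c >= len(flagged)}
    return flagged, terms
```

Run over a batch of pages, this both filters out the probably-uninvolved sites and grows the vocabulary used for the next crawl.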
We ended up finding almost 5,000 web pages that were (likely: it's not an exact science) part of the online pangolin-trafficking world. Included in this were sites that enabled buying and selling, as well as sites that featured commentaries, articles and reviews about products containing pangolin body parts. Casting the net wider, we found even more: our tech detected 7,800 web pages related to the online sale of ivory, and over 1,000 pages on eBay selling a restricted species of orchid. This, we're sure, is only the tip of the iceberg.
We did this work in order to shed light on a neglected issue. Media organisations need to report and investigate it. Tech companies need to be embarrassed. Politicians need to get angry and law enforcement agencies need to feel under pressure to act. But this work ties into a more fundamental concern.
Ever since the 1990s, when it crept out of labs and militaries and into our homes, the internet has posed a basic challenge to society. From starting a business to giving to charity, it has made countless kinds of positive behaviour easier to perform. But it’s brought with it a scary catch: an array of bad, destructive behaviour has also been made easier, cheaper, and more convenient to perform.
Serious reform by tech companies and governments to tackle these problems began around a decade ago. First came action against online child sexual exploitation, followed by widespread efforts to kick terrorists off the internet, or at least its most popular parts. Then, a little later, platforms began to wrestle with the problem of how they were being used to form and spread far-Right politics, hate speech, misogyny and xenophobia. Now Parliament is confronting a new problem: the production of 'fake news' and disinformation.
Throughout this time, it has largely been the very public exposure of harmful influences that has spurred both governments and technology companies to act. There has been a solid link between the visibility of a problem and how much is done to fix it, creating an overall process that we could call enforcement-through-media embarrassment.
This approach is an incredibly crude way for sophisticated political systems to prioritise dealing with online problems. Pressure is something that normally happens in a small space – against a small number of platforms.
Take the recent DCMS report on disinformation and fake news. Facebook is mentioned 500 times in that report. YouTube, an absolute hotbed of conspiracy theories and disinformation, just twice. Online gaming is another venue where harm is disseminated in great quantity, but it is barely dealt with at all.
Pressure also exists only for a remarkably narrow range of issues. In article after article, event after event, the same examples are used endlessly: Cambridge Analytica; fake news; Russian hacking; echo chambers. Issues, let's be honest, that pop up on journalists' Twitter feeds, and are on the radar of politicians, writers and television producers.
The result, of course, is that lower profile problems haven’t been fixed. From the general political and chattering class, there is an absolutely astounding lack of hunger to uncover and shout about the much broader set of problems that exist. And no other problem has been more neglected than the use of the internet to trade endangered wildlife.
The most visible problems are not always going to be the most harmful. And the biggest platforms aren’t going to be the only places where that harm happens. Reform through pressure and media exposure is always going to leave huge gaps, and it is in those gaps that problems like environmental crime exist – and you can be certain that there are many others. The worst examples of online life will rarely be those you’ve seen on the front page.