Monday 20 April 2020

EU toolbox to guide Member States on the ethical use of mobile apps to fight COVID-19



On April 15th, the eHealth Network, supported by the European Commission, published the EU toolbox: “Mobile applications to support contact tracing in the EU’s fight against COVID-19”. This communication aims to guide Member States on how best to use mobile apps to tackle COVID-19 while safeguarding data privacy.

The common approach aims to exploit the latest privacy-enhancing technological solutions that enable at-risk individuals to be contacted and, if necessary, to be tested as quickly as possible, regardless of where they are and which app they are using. It explains the essential requirements for national apps, namely that they be:
  • voluntary;
  • approved by the national health authority;
  • privacy-preserving - personal data is securely encrypted; and
  • dismantled as soon as no longer needed.

Mobile apps have the potential to bolster contact tracing strategies to contain and reverse the spread of COVID-19. EU Member States are converging towards effective app solutions that minimise the processing of personal data, and recognise that interoperability between these apps can support public health authorities and help reopen the EU’s internal borders.

Member States agreed on April 16th that COVID-19 mobile applications should not process the location data of individuals, because "it is not necessary nor recommended for the purpose of contact tracing".

"Collecting an individual's movements in the context of contact tracing apps would create major security and privacy issues," states the EU toolbox adopted by EU countries and supported by the European Commission.

The EU toolbox was delivered following the European Commission’s recommendation on contact tracing apps, released on April 8th. That recommendation envisages a common Union toolbox for the use of technology and data to combat, and exit from, the COVID-19 crisis, in particular concerning mobile applications and the use of anonymised mobility data.

The recommendation sets out a process towards the adoption, together with the Member States, of a toolbox, focusing on two dimensions:

  • A pan-European coordinated approach for the use of mobile applications for empowering citizens to take effective and more targeted social distancing measures and for warning, preventing and contact tracing; and
  • A common approach for modelling and predicting the evolution of the virus through anonymised and aggregated mobile location data.

Additionally, on April 16th, the Commission published the EU approach for efficient contact tracing apps to support the gradual lifting of confinement measures, along with guidance on data protection in the development of new apps supporting the fight against the coronavirus.

Since the outbreak of the coronavirus pandemic, Member States, backed by the Commission, have been assessing the effectiveness, security, privacy, and data protection aspects of digital solutions to address the crisis. Contact tracing apps, if fully compliant with EU rules and well coordinated, can play a key role in all phases of crisis management, especially when the time is ripe to gradually lift social distancing measures.

The Commission guidance sets out features and requirements which apps should meet to ensure compliance with EU privacy and personal data protection legislation, in particular the General Data Protection Regulation (GDPR) and the ePrivacy Directive. However, the guidance is not legally binding. It is without prejudice to the role of the Court of Justice of the EU, which is the only institution that can give an authoritative interpretation of EU law.

The present guidance addresses only voluntary apps supporting the fight against the COVID-19 pandemic (apps downloaded, installed and used on a voluntary basis by individuals) with one or several of the following functionalities:

  • Provide accurate information to individuals about the COVID-19 pandemic;
  • Provide questionnaires for self-assessment and for guidance to individuals (symptom checker functionality);
  • Alert persons who have been in proximity to an infected person for a certain duration, in order to provide information such as whether to self-quarantine and where to get tested (contact tracing and warning functionality);
  • Provide a communication forum between patients and doctors in situations of self-isolation or where further diagnosis and treatment advice is provided (increased use of telemedicine).

This guidance does not cover apps aimed at enforcing quarantine requirements (including those which are mandatory).

By the end of April 2020, Member States, together with the Commission, will seek clarifications on the solution proposed by Google and Apple with regard to contact tracing functionality on Android and iOS, in order to ensure that their initiative is compatible with the EU common approach.
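The toolbox and guidance describe this contact tracing functionality only at the level of principles: proximity rather than location, encrypted personal data, and strictly voluntary use. As a rough illustration of how a decentralised design can satisfy those constraints, here is a minimal sketch in which devices exchange short-lived pseudonymous identifiers and do the matching locally. It is a toy under stated assumptions, not the Google/Apple specification or any national app; every name and parameter in it is invented.

```python
# Illustrative toy model of decentralised, proximity-based contact tracing
# without location data. This is NOT the Google/Apple API or any national
# app; every name and parameter is an assumption chosen for readability.
import hashlib
import os


def daily_key() -> bytes:
    """A fresh random key for the day; it never leaves the device."""
    return os.urandom(16)


def rolling_id(key: bytes, interval: int) -> bytes:
    """A short-lived pseudonymous identifier broadcast over Bluetooth.
    Observers cannot link it to a person, a place or other identifiers."""
    return hashlib.sha256(key + interval.to_bytes(4, "big")).digest()[:16]


class Device:
    def __init__(self) -> None:
        self.own_keys = [daily_key()]   # kept locally
        self.observed = set()           # identifiers heard from nearby devices

    def broadcast(self, interval: int) -> bytes:
        return rolling_id(self.own_keys[-1], interval)

    def hear(self, identifier: bytes) -> None:
        self.observed.add(identifier)   # stored locally, never uploaded

    def check_exposure(self, published_keys, intervals) -> bool:
        """If a user tests positive, only their daily keys are published.
        Each device re-derives the identifiers and matches them locally."""
        return any(
            rolling_id(key, i) in self.observed
            for key in published_keys
            for i in intervals
        )


# Toy run: Alice's phone hears Bob's phone; Bob later reports a positive test.
alice, bob = Device(), Device()
alice.hear(bob.broadcast(interval=42))
print(alice.check_exposure(published_keys=bob.own_keys, intervals=range(96)))  # True
```

The property the EU documents insist on is visible in the sketch: no locations or movements are collected, and the only data that leaves a device are the random daily keys of a user who voluntarily reports a positive test.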

Sunday 19 April 2020

Shoshana Zuboff on surveillance capitalism | VPRO Documentary




'“Surveillance capitalism,” Zuboff writes, “unilaterally claims human experience as free raw material for translation into behavioural data. Although some of these data are applied to service improvement, the rest are declared as a proprietary behavioural surplus, fed into advanced manufacturing processes known as ‘machine intelligence’, and fabricated into prediction products that anticipate what you will do now, soon, and later. Finally, these prediction products are traded in a new kind of marketplace that I call behavioural futures markets. Surveillance capitalists have grown immensely wealthy from these trading operations, for many companies are willing to lay bets on our future behaviour.”

While the general modus operandi of Google, Facebook et al has been known and understood (at least by some people) for a while, what has been missing – and what Zuboff provides – is the insight and scholarship to situate them in a wider context. She points out that while most of us think that we are dealing merely with algorithmic inscrutability, in fact what confronts us is the latest phase in capitalism’s long evolution – from the making of products, to mass production, to managerial capitalism, to services, to financial capitalism, and now to the exploitation of behavioural predictions covertly derived from the surveillance of users. In that sense, her vast (660-page) book is a continuation of a tradition that includes Adam Smith, Max Weber, Karl Polanyi and – dare I say it – Karl Marx.

Viewed from this perspective, the behaviour of the digital giants looks rather different from the roseate hallucinations of Wired magazine. What one sees instead is a colonising ruthlessness of which John D Rockefeller would have been proud. First of all there was the arrogant appropriation of users’ behavioural data – viewed as a free resource, there for the taking. Then the use of patented methods to extract or infer data even when users had explicitly denied permission, followed by the use of technologies that were opaque by design and fostered user ignorance.


And, of course, there is also the fact that the entire project was conducted in what was effectively lawless – or at any rate law-free – territory. Thus Google decided that it would digitise and store every book ever printed, regardless of copyright issues. Or that it would photograph every street and house on the planet without asking anyone’s permission. Facebook launched its infamous “beacons”, which reported a user’s online activities and published them to others’ news feeds without the knowledge of the user. And so on, in accordance with the disrupter’s mantra that “it is easier to ask for forgiveness than for permission”.'


Friday 10 April 2020

Documentary: The Crisis of Science (The Corbett Report)



In 2015, a study from the Institute of Diet and Health with some surprising results launched a slew of clickbait articles with explosive headlines: “Chocolate accelerates weight loss” insisted one such headline.

“Scientists say eating chocolate can help you lose weight” declared another.

“Lose 10% More Weight By Eating A Chocolate Bar Every Day…No Joke!” promised yet another.

There was just one problem: This was a joke.

The head researcher of the study, “Johannes Bohannon,” took to io9 in May of that year to reveal that his name was actually John Bohannon, the “Institute of Diet and Health” was in fact nothing more than a website, and the study showing the magical weight loss effects of chocolate consumption was bogus. The hoax was the brainchild of a German television reporter who wanted to “demonstrate just how easy it is to turn bad science into the big headlines behind diet fads.”

Given how widely the study’s surprising conclusion was publicized—from the pages of Bild, Europe’s largest daily newspaper, to the TV sets of viewers in Texas and Australia—that demonstration was remarkably successful. But although it’s tempting to write this story off as a demonstration about gullible journalists and the scientific illiteracy of the press, the hoax serves as a window into a much larger, much more troubling story.

That story is The Crisis of Science. This is The Corbett Report.

What makes the chocolate weight loss study so revealing isn’t that it was completely fake; it’s that in an important sense it wasn’t fake. Bohannon really did conduct a weight loss study, and the data really does support the conclusion that subjects who ate chocolate on a low-carb diet lost weight faster than those on a non-chocolate diet. In fact, the chocolate dieters even had better cholesterol readings. The trick was all in how the data was interpreted and reported.

As Bohannon explained in his post-hoax confession:

“Here’s a dirty little science secret: If you measure a large number of things about a small number of people, you are almost guaranteed to get a ‘statistically significant’ result. Our study included 18 different measurements—weight, cholesterol, sodium, blood protein levels, sleep quality, well-being, etc.—from 15 people. (One subject was dropped.) That study design is a recipe for false positives.”

You see, finding a “statistically significant result” sounds impressive and helps scientists to get their paper published in high-impact journals, but “statistical significance” is in fact easy to fake. If, like Bohannon, you use a small sample size and measure 18 different variables, it’s almost impossible not to find some “statistically significant” result. Scientists know this, and the process of sifting through data to find “statistically significant” (but ultimately meaningless) results is so common that it has its own name: “p-hacking” or “data dredging.”
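The arithmetic behind this can be checked with a short simulation. The sketch below generates pure noise for 18 measurements on two small groups and counts how often at least one comparison crosses p < 0.05; the group size of 8 per arm is an assumption for illustration, since the hoax study split roughly 15 subjects across its arms.

```python
# Simulation of "data dredging": 18 meaningless measurements on two small
# groups, repeated over many imaginary studies. The group size of 8 per arm
# is an assumption for illustration; all data is pure noise.
import random
import statistics


def t_statistic(a, b):
    """Plain two-sample t statistic (equal-variance form)."""
    na, nb = len(a), len(b)
    pooled = ((na - 1) * statistics.variance(a) + (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    return (statistics.mean(a) - statistics.mean(b)) / (pooled * (1 / na + 1 / nb)) ** 0.5


random.seed(1)
n_per_group, n_measurements, n_studies = 8, 18, 1000
studies_with_a_finding = 0

for _ in range(n_studies):
    hits = 0
    for _ in range(n_measurements):
        chocolate = [random.gauss(0, 1) for _ in range(n_per_group)]
        control = [random.gauss(0, 1) for _ in range(n_per_group)]
        # |t| > 2.14 corresponds roughly to p < 0.05 at 14 degrees of freedom
        if abs(t_statistic(chocolate, control)) > 2.14:
            hits += 1
    if hits > 0:
        studies_with_a_finding += 1

print(f"Noise-only studies reporting a 'significant' result: {studies_with_a_finding / n_studies:.0%}")
```

With 18 independent shots at a 5% false-positive rate, roughly 1 - 0.95^18, or about 60%, of such noise-only studies will have something “significant” to report.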

But p-hacking only scrapes the surface of the problem. From confounding factors to normalcy bias to publication pressures to outright fraud, the once-pristine image of science and scientists as an impartial font of knowledge about the world has been seriously undermined over the past decade.

Although these types of problems are by no means new, they came to wider attention when John Ioannidis, a physician, researcher and writer at the Stanford Prevention Research Center, rocked the scientific community with his landmark paper “Why Most Published Research Findings Are False.” The 2005 paper addresses head-on the concern that “most current published research findings are false,” asserting that “for many current scientific fields, claimed research findings may often be simply accurate measures of the prevailing bias.” The paper has achieved iconic status, becoming the most downloaded paper in the Public Library of Science and launching a conversation about false results, fake data, bias, manipulation and fraud in science that continues to this day.
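Ioannidis’s claim is, at bottom, a Bayes’-rule argument: whether a “significant” finding is likely to be true depends on the prior plausibility of the hypotheses a field tests, the power of its studies and its significance threshold. The sketch below works through that calculation with illustrative numbers that are assumptions, not figures from the paper.

```python
# Positive predictive value of a "significant" finding via Bayes' rule,
# illustrating the argument of Ioannidis (2005). The prior, power and alpha
# values below are assumptions chosen for illustration, not the paper's figures.
def ppv(prior: float, power: float, alpha: float) -> float:
    """P(hypothesis is true | the result is statistically significant)."""
    true_positives = power * prior
    false_positives = alpha * (1 - prior)
    return true_positives / (true_positives + false_positives)


# A well-powered study in a field where 1 in 10 tested hypotheses is true:
print(f"{ppv(prior=0.10, power=0.80, alpha=0.05):.0%}")  # ~64%
# An underpowered, exploratory study where 1 in 50 hypotheses is true:
print(f"{ppv(prior=0.02, power=0.20, alpha=0.05):.0%}")  # ~8%
```

In fields where most tested hypotheses are implausible to begin with and studies are small, most “significant” findings can indeed be false.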

Source: The Corbett Report