\section{Learning ethics for real}\label{learning-ethics-for-real}
Ideally, we would be able to take our learnings straight from the lab to
reality. As already touched upon, our project was set up to emulate a
market-oriented data economy; however, when it comes to ethics in
particular, we see differences.
One of the core pillars of ethics is understanding and accepting
responsibility. In a cascade-funded setting this is not easy. As money
is passed from the European Commission to the coordinator and on to
subcontracted SMEs, a legal and contractual regime is established that
sets the playing field for much of what follows. Ethical action requires
the freedom to choose to do things in the right way. This goes both for
the SMEs performing the data experiments (which had to work with the
services and data provided by the framework project that funded them)
and for the data innovation hubs (which in turn had to work for the
projects that were selected by external reviewers). In such a setting,
there is not much room for agreeing on shared values (unless you
consider public money a shared value in itself).
We think it is important to acknowledge that precompetitive and funded
environments have a dynamic of their own. They are important for
progress towards a true European data economy; thus, we will focus our
analysis mostly on this setting, in the hope that other projects with
similar mechanics can learn from us. Constant progress in developing
ethical frameworks can also have a positive impact on the market.
\subsection{Competition of values}\label{competition-of-values}
\textbf{Funded projects must be governed by deliberate choices and by
finding partners that share values. In addition, in a funded setting, we
need positive competition around reliable and responsible data
innovations.}
To be clear: we have seen external reviewers choose, let us say,
challenging projects. Generating a major positive impact often carries
the risk that, if done wrongly, it may also trigger a negative impact.
In our public report on the findings of the first open call, we listed
the many different ethical challenges we encountered. We did not feel
prepared for all of them (from conducting medical trials to dealing with
financial transaction data) and had to rely on the competence of the
SMEs conducting the experiments. In many cases that worked out well; in
many others, however, eliminating the ethical risk was not an option,
because the SME was funded on the promise of an output, and the data
innovation hubs were funded to support the SMEs. Failure on either side
was not really an option if the overall project was to be brought to an
end after the fair and independent selection of the open call was
finished. We learnt by putting more and more `terms and conditions'
online. However, a balancing of risk and impact during the selection
process, based also on the non-formalized values of the infrastructure,
would have helped (e.g.\ by sometimes selecting the second-most
impactful experiment if it has better compatibility with the
self-determined competences and values of the infrastructure providers).
\subsection{Compliance as a baseline}\label{compliance-as-a-baseline}
\textbf{We cannot assume that legal compliance is a given and that
ethics only starts beyond it.}
Most of the risks detected by the ethics monitoring group of our project
concerned compliance with the GDPR. With the GDPR in application since
2018, one would expect it to be at the core of all processing that
involves the data of human data subjects. Reality is different. Even a
common understanding of the terminology is complex. If you look at
anonymity, there are, on the one hand, certainly difficult edge cases
that have required more recent court rulings; an example is the
definition of personal data as clarified in the well-known Patrick
Breyer case (ECLI:EU:C:2016:779). On the other hand, the reality is that
the difference between anonymity and pseudonymity is often misunderstood
even in simple cases. Many big companies have invested in compliance,
also due to clear requirements and considerable possible fines. The SME
space, but also the research system, is in our experience only very
slowly catching up, given the exemptions and the lack of enforcement in
this domain. This makes it very difficult to establish ethics
monitoring, saying repeatedly to people: ``I am not a lawyer, but I
think what you are doing might not be legal this way.'' The argument
that data coming from the EU is more `ethical' due to a clear legal
regime does not hold: in our experiments we have seen multiple public
data sets, published by EU projects, for which we could not clearly
determine a legal basis.
\subsection{Setting values}\label{setting-values}
As noted above, the terms and conditions of the open calls evolved over
the course of our project. In addition, some service providers set
conditions on what they would and would not support. Often those terms
were there mainly to provide a better lever for enforcing compliance,
particularly with the contractual requirements of the grant (namely, the
ethical impact assessment of the project and the evidence collection
required by the funding).