Optimising cohort data in Europe
Big data practices are often framed in terms of an opposition between the individual autonomy of participants and the public benefit derived from data-intensive research (Minari et al., 2018). This framing obscures big data's own in-built biases (Aiello et al., 2020), which suggest that what is really at stake is the tension between the public good and the autonomy of the system. Facebook, for instance, algorithmically tailors the content of users' newsfeeds in relation to their interactions (Shadbolt et al., 2019). While this reduces the personal autonomy of the user, it increases the autonomy and operational capability of the system (in this case, Facebook).

2.2.3. Privacy and confidentiality

Privacy relies on the notion that there is an intrinsic self, protecting a sphere of autonomy in which the individual is free to express and inhabit this self (Cech, 2019). In cohort research, privacy is evaluated in terms of its context: the flow of information is determined by contextual rules pertaining to information access (including its frequency, purpose and timing) and the participants involved. Violating these contextual rules gives rise to consequential concerns, which directly affect the person whose privacy has been breached, and deontological concerns, where participants are wronged by a privacy breach but remain unaware of it. In big data settings, both types of concern are amplified because the number of persons and contexts potentially involved is far greater (Sun et al., 2018).

Three developments affect the contextuality of privacy: big data, the emergence of digital technologies, and IT platforms for data collection and sharing. Advances in mobile and wearable technology allow for greater profiling and an extension of the collection of personal data (gathered either with or without consent) (Winter et al., 2018; Aiello et al., 2020). As a result, modern digital technologies problematise the issue of control: their structures and modes of operation shift control away from users and dilute personal agency and awareness. Hence, participants simply become part of the system with which they interact. This is partly explained by participants' limited awareness of their own autonomy: they tend to agree readily to the terms and conditions of the platforms they use without understanding the implications of doing so (Dobrick et al., 2018). Autonomy therefore cannot be taken for granted, because it is easily exchanged for perceived benefits (such as the services of an app). In practice, few users retain full control over their privacy rights on digital platforms (Price and Cohen, 2019).

The concepts of privacy and confidentiality are fluid and are thus constantly redefined by the technologies that have the potential to breach them (Price and Cohen, 2019). Because users are willing to leave a vast trail of personal data on online platforms, there are also serious concerns about the status and significance of consent in online settings (Umbach et al., 2020; McRae et al., 2020). While there is a general ethical principle that users should know what information about them is used, people appear ready to cede part of their autonomy, and thus their privacy, to external stakeholders and actors.
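To make the idea of contextual rules discussed above concrete, the sketch below models a context's access norms as a small data structure and checks whether a proposed information flow respects them. This is a minimal illustrative sketch only: the class names, fields, and example values are hypothetical and are not drawn from the cited literature or from any actual cohort governance framework.

```python
# Illustrative toy model of contextual privacy rules for information flows.
# All names here are hypothetical, chosen only to mirror the prose above.
from dataclasses import dataclass, field


@dataclass(frozen=True)
class Flow:
    """A proposed transfer of participant information."""
    data_type: str   # e.g. "genomic", "lifestyle"
    recipient: str   # who would receive the data
    purpose: str     # why the data would be accessed


@dataclass
class ContextualNorms:
    """Rules governing who may access what, and for which purpose."""
    allowed_recipients: set = field(default_factory=set)
    allowed_purposes: set = field(default_factory=set)

    def permits(self, flow: Flow) -> bool:
        # A flow is appropriate only if it respects every contextual rule;
        # breaching any single rule constitutes a privacy violation.
        return (flow.recipient in self.allowed_recipients
                and flow.purpose in self.allowed_purposes)


# Hypothetical norms agreed at consent time for a cohort study.
norms = ContextualNorms(
    allowed_recipients={"research_consortium"},
    allowed_purposes={"epidemiological_research"},
)

ok = Flow("genomic", "research_consortium", "epidemiological_research")
breach = Flow("genomic", "insurance_company", "risk_scoring")

print(norms.permits(ok))      # True: flow matches the agreed context
print(norms.permits(breach))  # False: recipient and purpose fall outside it
```

Note that such a model captures only the consequential side of a breach check; the deontological concern described above arises precisely when a flow like `breach` occurs without the participant ever learning of it.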