Digital Future Society’s new data ethics for the digital age
In the era of ever-increasing data volumes, how can we exercise our human rights and what are our responsibilities to protect our privacy?
Building on studies of the collective dimension of data protection, this article sets out to embed this perspective in an assessment model centred on human rights: the Human Rights, Ethical and Social Impact Assessment (HRESIA). This self-assessment model aims to overcome the limitations of existing assessment models, which are either too narrowly focused on data processing or so broad and granular that evaluating the consequences of a given use of data becomes impractical.
Keywords: Data protection, Impact assessment, Human rights impact assessment, Ethical impact assessment, Social impact assessment, General Data Protection Regulation
The UN Special Rapporteurs on freedom of opinion and expression, on the situation of human rights defenders, and on freedom of peaceful assembly and association have reinforced the notion that States must ensure that human rights are respected and protected in the digital arena.
The Center for Digital Trust (C4DT) “brings together 12 founding partners, 35 laboratories, civil society, and policy actors to collaborate, share insight, and to gain early access to trust-building technologies, building on state-of-the-art research at EPFL and beyond.”