Comments by the Data Protection Unit – CoE
Council of Europe
Directorate General
Human Rights and Rule of Law
Information Society – Action against Crime Directorate
Information Society Department
Data Protection Unit
The Data Protection Unit of the Council of Europe welcomes the report “The age of digital interdependence” of the High Level Panel (the “Report”) and its aim to “underscore the fact that universal human rights apply equally online as offline” in the digital age. We strongly support its emphasis on building on existing human rights frameworks and conventions and on the need to reassess their implementation in the digital environment.
In a digitally interdependent world, the right to private life, as enshrined in Article 12 of the Universal Declaration of Human Rights, Article 17 of the International Covenant on Civil and Political Rights and Article 8 of the European Convention on Human Rights, is and will increasingly be an “enabling right”: one that, in simple terms, enables the exercise and full enjoyment of other human rights and fundamental freedoms. It will also remain the core factor in preserving human dignity and the individual’s right to informational self-determination.
In this vein, we would certainly support further work to protect individuals in the digital age and to explore how the Council of Europe’s Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (“Convention 108”), which is the only global, legally binding multilateral instrument on privacy and data protection, could better contribute to the implementation of the recommendations of the Report.
Convention 108 was opened for signature in 1981 and has since influenced various international (the OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data), regional (EU, African Union) and national privacy legislation. Convention 108 currently counts 55 parties and some 25 observers. Besides the work related to the implementation of the Convention itself, the Council of Europe has already produced reference documents in important areas such as artificial intelligence, big data, health-related data, media and privacy, internet governance and data processing by law enforcement. It is worth mentioning that Convention 108+ (as amended by Protocol CETS No. 223 [1]) is also seen by the UN Special Rapporteur on the right to privacy as set to become the international standard on privacy in the digital age; he has already recommended “to all UN Member States to accede to Convention 108+” in two of his reports [2]. By joining Convention 108+, any country would participate, at the highest possible level of common international law, in shaping the future of the right to private life, while contributing to maintaining the free flow of data globally.
We would therefore strongly support efforts to further promote the application of the core provisions of the modernised Convention 108 throughout the IGF and its constituencies, notably: the application of high-level data protection principles to online data processing; the correct choice of legal basis for the processing of personal data; the requirement of transparency on the part of data controllers; the possibility for data subjects to exercise their rights, with special attention to the new generation of rights; a high level of data security; a transborder data flow regime based on an appropriate level of protection of individuals; the lawful use of exceptions, for instance for national security and law enforcement purposes; and the establishment of independent regulatory authorities with a mandate and effective powers to oversee the application of these principles and provisions.
In the digital age, our focus should remain on two objectives: the free flow of data and respect for human dignity, as stated in the Preamble of the Protocol amending Convention 108: “(…) it is necessary to secure the human dignity and protection of the human rights and fundamental freedoms of every individual and, given the diversification, intensification and globalisation of data processing and personal data flows, personal autonomy based on a person’s right to control of his or her personal data and the processing of such data”. We would therefore suggest that further consideration be given to the recommendations of the Report in the following areas:
- With reference to the statement in Recommendation 3a, the right to privacy has to be guaranteed by existing open, multilateral conventions. To achieve this, further efforts should be deployed at the IGF level to encourage UN member states to accede to the Council of Europe’s modernised Convention 108 and to ensure the application of its principles and provisions by other stakeholders.
- With reference to peer-to-peer information sharing as described on page 14, we believe that, in addition, a genuine mechanism for joint action open to regulatory and law enforcement authorities should be put in place at the IGF to effectively protect individuals in the digital space (for example, a 24/7 alert mechanism, the designation of focal points in national administrations and in stakeholders’ structures, etc.).
- Furthermore, with reference to page 18, sub-chapter 3.1 “Human Rights and Human Agency”, under the title “The right to privacy”, we would urge that the main definitions and principles relating to the right to privacy be agreed upon and applied within the IGF (e.g. the definitions of privacy, personal data and special categories of data, the correct choice of legal basis for processing, the accountability of data controllers, the enforcement of rights, etc.).
- We would recommend tackling the issues and considerations pertaining to the right to privacy jointly with those pertaining to cybersecurity (cyber-resilience) and the fight against cybercrime.
- We support any action, as described on page 19 in sub-chapter 3.2, to build “Trust and social cohesion” and would recommend further emphasis on the question “How can trust be promoted in the digital age?”. We believe that specific forums have to be created within the IGF, based on multi-stakeholder participation, to effectively promote the free flow of data while guaranteeing an appropriate level of protection for individuals as enshrined in the modernised Convention 108, and to deliver viable solutions for users in a timely manner and without any discrimination.
____
[1] https://search.coe.int/cm/Pages/result_details.aspx?ObjectId=09000016807c65bf
[2] 2018 Annual Report on the Right to Privacy (Report A/73/45712) and Annual Report of 1 March 2019 to the UN Human Rights Council (Report A/HRC/40/63)
For reference, please see also the Comments by the Council of Europe’s Information Society Department on the Report of the UN Secretary-General’s High-level Panel on Digital Cooperation.