VI. Principles and Functions of Digital Cooperation
In the course of our outreach, many stakeholders suggested principles to which digital cooperation mechanisms should adhere and functions they should seek to serve. Drawing also on the work of previous initiatives in these areas, this annex summarises the principles and functions we suggest are most important to guide the future evolution of digital cooperation.
KEY PRINCIPLES OF DIGITAL COOPERATION
- Consensus-oriented: Decisions should be made in ways that seek consensus among public, private and civic stakeholders.
- Polycentric: Decision-making should be highly distributed and loosely yet efficiently coordinated across specialised centres.
- Customised: There is generally no “one size fits all” solution; different communities can implement norms in their own way, according to circumstances.
- Subsidiarity: Decisions should be made as locally as possible, closest to where the issues and problems are.
- Accessible: It should be as easy as possible to engage in digital cooperation mechanisms and policy discussions.
- Inclusive: Decisions should be inclusive and democratic, representing diverse interests and accountable to all stakeholders.
- Agile: Digital cooperation should be dynamic, iterative and responsive to fast-emerging policy issues.
- Clarity in roles and responsibility: Clear roles and shared language should reduce confusion and support common understanding about the responsibilities of actors involved in digital cooperation (governments, private sector, civil society, international organisations and academia).
- Accountable: There should be measurable outcomes, accountability and means of redress.
- Resilient: Power distribution should be balanced across sectors, without centralised top-down control.
- Open: Processes should be transparent, with minimum barriers to entry.
- Innovative: It should always be possible to develop new ways of cooperating from the bottom up, which is also the best way to include diverse perspectives.
- Tech-neutral: Decisions should not lock in specific technologies but allow for innovation of better and context-appropriate alternatives.
- Equitable outcomes: Digital cooperation should maximise the global public interest (internationally) and be anchored in broad public benefit (nationally).
KEY FUNCTIONS OF DIGITAL COOPERATION
- Leadership – generating political will among leaders from government, business, and society, and providing an authoritative response to digital policy challenges.
- Deliberation – providing a platform for regular, comprehensive and impactful deliberations on digital issues with the active and effective participation of all affected stakeholders.
- Ensuring inclusivity – ensuring active and meaningful participation of all stakeholders, for example by linking with existing and future bottom-up networks and initiatives.
- Evidence and data – monitoring developments and identifying trends to inform decisions, including by analysing existing data sources.
- Norms and policy making – building consensus among diverse stakeholders, respecting the roles of states and international organisations in enacting and enforcing laws.
- Implementation – following up on policy discussions and agreements.
- Coordination – creating shared understanding and purpose across bodies in different policy areas and at different levels (local, national, regional, global), ensuring synchronisation of efforts, interoperability and policy coherence, and the possibility of voluntary coordination between interested stakeholder groups.
- Partnerships – catalysing partnerships around specific issues by providing opportunities to network and collaborate.
- Support and capacity development – strengthening capacity development, monitoring digital developments, identifying trends, informing policy actors and the public of emerging risks and opportunities, and providing data for evidence-based decision making, thereby enabling traditionally marginalised persons and other less-resourced stakeholders to participate actively in the system.
- Conflict resolution and crisis management – developing the skills, knowledge and tools to prevent and resolve disputes and connect stakeholders with assistance in a crisis.