4.1 CHALLENGES AND GAPS
The international community is not starting from scratch. It can build on established mechanisms for digital cooperation involving governments, technical bodies, civil society and other organisations. Some are based in national and international law,185 others in “soft law” – norms, guidelines, codes of conduct and other self-regulatory measures adopted by business and tech communities.186 Some are loosely organised, others highly institutionalised.187 Some focus on setting agendas and standards, others on monitoring and coordination.188 Many could evolve to become better fit for purpose.
The need for better digital cooperation lies not so much in managing the technical nuts and bolts of how technologies function, where mechanisms are generally well established, as in addressing the unprecedented economic, societal and ethical challenges those technologies create. How to tell, in context, when conversations on social media cross the line into inciting violence? How to limit the use of cyber weapons possessed not only by states but also by non-state actors and individuals?189 How to adapt trade systems designed for a different era to the newly emerging forms of online commerce?
The 2003 and 2005 World Summit on the Information Society (WSIS) established the Internet Governance Forum (IGF) as a platform for multi-stakeholder dialogue.190 Global, national and regional IGF meetings have contributed to many important digital debates. But the IGF, in its current form, has limitations in addressing challenges that are now emerging from new digital technologies.
The need for strengthened cooperation mechanisms has been raised many times in recent years by broad initiatives – such as the NetMundial Conference,191 the Global Commission on Internet Governance192 and the Web Foundation’s Contract for the Web193 – and by more narrowly focused efforts such as the Broadband Commission, the Alliance for Affordable Internet, the Internet & Jurisdiction Policy Network, the Global Commission on the Stability of Cyberspace, the Charter of Trust, Smart Africa, and the International Panel on AI recently announced by Canada and France.194
In our consultations, we heard a great deal of dissatisfaction with existing digital cooperation arrangements: a desire for more tangible outcomes, more active participation by governments and the private sector, more inclusive processes and better follow-up. Overall, systems need to become more holistic, multi-disciplinary, multi-stakeholder, agile and able to convert rhetoric into practice. We have identified six main gaps:
First, despite their growing impact on society, digital technology and digital cooperation issues remain relatively low on many national, regional and global political agendas. Only recently have forums such as the G20 begun to address the digital economy regularly.195 In 2018, the UN Secretary-General for the first time delivered an opening statement in person at the IGF in Paris.196
Second, digital cooperation arrangements such as technical bodies and standard-setting organisations are often not inclusive enough of small and developing countries, indigenous communities, women, young and elderly people and those with disabilities. Even if they are invited to the table, such groups may lack the capacity to participate effectively and meaningfully.197
Third, there is considerable overlap among the large number of mechanisms covering digital policy issues. As a result, the digital cooperation architecture has become highly complex but not necessarily effective. There is no simple entry point. This makes it especially hard for small enterprises, marginalised groups, developing countries and other stakeholders with limited budgets and expertise to make their voices heard.198
Fourth, digital technologies increasingly cut across areas in which policies are shaped by separate institutions. For example, one body may look at data issues from the perspective of standardisation, while another considers trade, and still another regulates to protect human rights.199 Many international organisations are trying to adjust their traditional policy work to reflect the realities of the digital transformation, but do not yet have the expertise and experience needed to play well-defined roles in addressing new digital issues. At a minimum, there needs to be better communication across different bodies to raise awareness. Ideally, effective cooperation should create synergies.
Fifth, there is a lack of reliable data, metrics and evidence on which to base practical policy interventions. For example, the annual cost of cybercrime to the global economy is variously estimated at anything from $600 billion200 to $6 trillion.201 Estimates of the value of the AI market in 2025 range from $60 billion202 to $17 trillion.203 The problem is most acute in developing countries, where resources to collect evidence are scarce and data collection is generally uneven. Establishing a knowledge repository on digital policy, with definitions of terms and concepts, would also increase clarity in policy discussions and support consistency of measurement of digital inclusion, as we have noted in our Recommendation 1D.
Sixth, lack of trust among governments, civil society and the private sector – and sometimes a lack of humility and understanding of different perspectives – can make it more difficult to establish the collaborative multi-stakeholder approach needed to develop effective cooperation mechanisms.
Inter-governmental work must be balanced with work involving broader stakeholders. Multi-stakeholder and multilateral approaches can and do co-exist. The challenge is to evolve ways of using each to reinforce the effectiveness of the other.
4.2 VALUES AND PRINCIPLES
As noted in the discussion of values in Chapter 1, we believe global digital cooperation should be: inclusive; respectful; human-centred; conducive to human flourishing; transparent; collaborative; accessible; sustainable and harmonious. Shared values become even more important during periods of rapid change, limited information and unpredictability, as with current discussions of cooperation relating to artificial intelligence.
It would be useful for the private sector, communities and governments to conduct digital cooperation initiatives by explicitly defining the values and principles that guide them. The aim is to align stakeholders around a common vision, maximise the beneficial impacts and minimise the risk of misuse and unintended consequences.
Alongside these shared values, we believe it is useful to highlight operational principles as a reference point for the future evolution of digital cooperation mechanisms. We propose that global digital cooperation mechanisms should: be easy to engage in, open and transparent; be inclusive and accountable to all stakeholders; consult and debate as locally as possible; encourage innovation both in technologies and in better mechanisms for cooperating; and seek to maximise the global public interest. These principles are set forth in more detail in Annex VI, based on the experience of internet governance and technical coordination bodies – such as the WSIS process, UNESCO and the NetMundial conference.204
Defining values and principles is only the first step: we must operationalise them in practice in the design and development of digital technology and digital cooperation mechanisms. Where the reach of hard governance is limited or ambiguous – for example, at the stage of innovation or when the long-term impact of technologies is hard to predict – values-based cooperation approaches can play a vital role.
We should look for opportunities to operationalise values and principles at each step in the design and development of new technologies, as well as new policy practices. For example, educational institutions could encourage software developers, business executives and engineers to integrate values and principles in their work and use professional codes of conduct akin to the medical profession’s Hippocratic Oath. Businesses can integrate values into workflows, use values-based measures to assess risk and institute a suitable incentive structure for staff to follow shared values. Self-assessments and third-party audits can also help institutionalise a business culture based on shared values.