3.2 Trust and social cohesion
The world is suffering from a “trust deficit disorder”, in the words of the UN Secretary-General addressing the UN General Assembly in 2018.140 Trust among nations and in multilateral processes has weakened as states focus more on strategic competition than common interests and behave more aggressively. Building trust, and underpinning it with clear and agreed standards, is central to the success of digital cooperation.
Digital technologies have enabled some new interactions that promote trust, notably by verifying people’s identities and allowing others to rate them.141 Although not reliable in all instances, such systems have enabled many entrepreneurs on e-commerce platforms to win the trust of consumers, and given many people on sharing platforms the confidence to invite strangers into their cars or homes.
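To illustrate how such rating systems turn individual feedback into a trust signal, consider the minimal sketch below. It is a toy with hypothetical names, not any platform's actual formula: a smoothed average pulls sellers with few reviews towards a neutral prior, so a single review cannot make or break a reputation.

```python
# Sketch of a reputation score: a smoothed average that pulls sellers
# with few reviews towards a neutral prior. Hypothetical illustration,
# not any platform's real formula.
def reputation_score(ratings, prior=3.0, prior_weight=5):
    """ratings are 1-5 stars; returns a smoothed average on the same scale."""
    total = sum(ratings) + prior * prior_weight
    count = len(ratings) + prior_weight
    return total / count

new_seller = reputation_score([5])        # ~3.33: one review is not a perfect 5
seasoned = reputation_score([5] * 200)    # ~4.95: many reviews outweigh the prior
print(f"new seller: {new_seller:.2f}, seasoned seller: {seasoned:.2f}")
```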
In other ways, digital technologies are eroding trust. Lies can now spread more easily, including through algorithms which generate and promote misinformation, sowing discord and undermining confidence in political processes.142 The use of artificial intelligence to produce “deep fakes” – audio and visual content that convincingly mimics real humans – further complicates the task of telling truth from misinformation.143
Violations of privacy and security are undermining people's trust in governments and companies. Trust between states is challenged by new ways to conduct espionage, manipulate public opinion and infiltrate critical infrastructure. While academia has traditionally nurtured international cooperation in artificial intelligence, governments are pushed towards secrecy by the awareness that future breakthroughs could dramatically shift the balance of power.144
The trust deficit might in part be tackled by new technologies, such as training algorithms to identify and take down misinformation. But such solutions pose issues of their own: could we trust the accuracy and impartiality of the algorithms? Ultimately, trust needs to be built through clear standards and agreements grounded in mutual self-interest and shared values, developed with wide participation by all stakeholders, and backed by mechanisms to impose costs for violations.
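As a concrete illustration of both the approach and the concern, the sketch below trains a toy text classifier to flag misinformation. It assumes scikit-learn is available and uses invented, hand-labelled examples; its verdicts mirror whatever biases sit in its training data, which is precisely the impartiality question raised above.

```python
# Toy misinformation classifier: TF-IDF features + logistic regression.
# A sketch only - real moderation systems combine many signals and human
# review, and the labels here are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "Vaccine shown effective in peer-reviewed trial",
    "Miracle cure doctors don't want you to know about",
    "Election results certified by independent observers",
    "Secret proof the vote was stolen, share before it is deleted",
]
labels = [0, 1, 0, 1]  # 0 = credible, 1 = misinformation (hand-assigned)

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# The model's judgement simply reflects the biases of its training data
# and labellers - the impartiality problem noted above.
print(model.predict_proba(["Share this secret cure before it is deleted"]))
```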
All citizens can play a role in building societal resilience against the misuse of digital technology. We all need to deepen our understanding of the political, social, cultural and economic impacts of digital technologies and of what it means to use them responsibly. We encourage nations to consider how educational systems can train students to weigh thoughtfully the sources and credibility of information.
There are many encouraging instances of digital cooperation being used to build individual capacities that will collectively make it harder for irresponsible uses of digital technologies to erode societal trust.145 Examples drawn to the Panel's attention in written submissions and interviews include:
- The 5Rights Foundation and British Telecom developed an initiative to help children understand how the apps and games they use make money, including the techniques used to hold their attention for longer.146
- The Cisco Networking Academy and United Nations Volunteers are training youth in Asia and Latin America to explore how digital technologies can enable them to become agents of social change in their communities.147
- The Digital Empowerment Foundation is working in India with WhatsApp and community leaders to stop the spread of misinformation on social media.148
How can trust be promoted in the digital age?
The problem of trust came up repeatedly in written contributions to the Panel. Microsoft's contribution stressed that an atmosphere of trust incentivises the invention of inclusive new technologies. As the Latin American human rights group Derechos Digitales put it, "all participants in processes of digital cooperation must be able to share and work together freely, confident in the reliability and honesty of their counterparts". But how can trust be promoted? We received a large number of ideas:
- Articulating values and principles to govern the development and use of technology.
- Being transparent about decision-making that affects other stakeholders, about known vulnerabilities in software, and about data breaches.
- Governments inviting participation from companies and civil society in discussions on regulation.
- Making real and visible efforts to obtain consent and protect data, including "security-by-design" and "privacy-by-design" initiatives;149 a sketch of what privacy-by-design can look like in practice follows this list.
- Accepting oversight from a trusted third party: for the media, this could be an organisation that fact-checks sources; for technology companies, external audits of design, deployment and internal audit processes; for governments, reviews by human rights forums.
- Understanding the incentive structures that erode trust, and finding ways to change them: for example, requiring or pressuring social media firms to refuse to run adverts that contain disinformation, to de-monetise content that contains disinformation, and to label clearly the sponsors of political adverts.150
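As one illustration of what "privacy-by-design" can mean in code, the sketch below pseudonymises an identifier at the point of collection, so that a later breach exposes no raw e-mail addresses. The field names are hypothetical, and key management and data minimisation remain the hard parts in practice; this is a sketch, not a complete compliance measure.

```python
# Privacy-by-design sketch: pseudonymise an identifier with a keyed hash
# before storage, so the raw e-mail address is never written to disk.
# Hypothetical field names; not a complete privacy programme.
import hashlib
import hmac
import os

SECRET_KEY = os.environ.get("PSEUDONYM_KEY", "change-me").encode()

def pseudonymise(email):
    """Return a stable pseudonym; without SECRET_KEY it cannot be linked
    back to the address, even by guessing likely e-mails."""
    return hmac.new(SECRET_KEY, email.lower().encode(), hashlib.sha256).hexdigest()

record = {
    "user": pseudonymise("alice@example.org"),  # stored
    "consented_at": "2019-06-01T12:00:00Z",     # stored
    # the raw address itself is never persisted
}
print(record)
```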
Finally, digital cooperation itself can be a source of trust. During the Cold War, small pools of shared interest – non-proliferation, regional stability – allowed competitors to work together and paved the way for transparency and confidence-building measures that helped build a modicum of trust.151 Analogously, getting multiple stakeholders into a habit of cooperating on issues such as standard-setting and interoperability, addressing risks and social harms, and applying digital technologies collaboratively to achieve the SDGs could allow trust to be built up gradually.