Workshop 4: Building cross-stakeholder awareness and understanding of the direct and indirect environmental impacts of digital/Internet technologies and how to mitigate them
Rapporteur: Francesco Vecchi
The nexus between the digital transition and environmental impact has been recognised by the Council of Europe as one of the core challenges, not only for citizens but also for international and regional organisations. Moreover, environment and human rights are deeply connected: human rights cannot be exercised in an unhealthy biosphere, and they are therefore doubly linked to data gathering and online communication.
First, we need to distinguish direct environmental effects (e.g. energy consumption, the mining of rare minerals and raw materials) from indirect ones (e.g. the consequences of implementing digital innovation across industries). To analyse these outcomes, which broadly drive the deterioration of the biosphere and worsen living conditions, a standard measure must be defined, such as the life cycle assessment introduced in 2001.
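The life-cycle framing can be made concrete with a small sketch. The figures below are illustrative assumptions, not real measurements: the point is only that a device's footprint is the sum of per-phase emissions, spanning both direct effects (extraction, use-phase energy) and the wider chain.

```python
# Minimal life-cycle assessment (LCA) sketch. All per-phase figures are
# hypothetical placeholders chosen for illustration, in kg of CO2-equivalent.
PHASES_KG_CO2E = {
    "raw_material_extraction": 25.0,  # mining rare minerals (direct effect)
    "manufacturing": 40.0,
    "transport": 5.0,
    "use_phase_energy": 30.0,         # electricity over the device lifetime
    "end_of_life": 2.0,               # recycling / disposal
}

def total_footprint(phases: dict) -> float:
    """Sum per-phase emissions to get the whole life-cycle footprint."""
    return sum(phases.values())

print(f"Total: {total_footprint(PHASES_KG_CO2E)} kg CO2e")
```

A standardised assessment would define these phases and their boundaries uniformly, which is exactly why a common measure matters for comparing products.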
When it comes to AI, the direct/indirect effects framework is extremely useful. People generally think of AI as software, something ephemeral, but it is rooted in concrete infrastructure: the computing stacks that train these large models by processing data at scale. Consequently, LLMs have environmental impacts at every stage, including inference. The direct impact stems from the infrastructure behind them, while indirect impacts concern their application to either environmentally positive or negative ends.
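A common back-of-the-envelope way to quantify this direct impact is energy = power × time × PUE, then carbon = energy × grid intensity. The workload sizes, PUE, and grid intensity below are assumptions for illustration, not figures from the workshop.

```python
def energy_kwh(power_kw: float, hours: float, pue: float = 1.5) -> float:
    # PUE (power usage effectiveness) scales IT load up to total facility load
    return power_kw * hours * pue

def co2_kg(kwh: float, grid_intensity_kg_per_kwh: float = 0.4) -> float:
    # Grid carbon intensity converts electricity use into emissions
    return kwh * grid_intensity_kg_per_kwh

# Hypothetical training run: 500 kW of accelerators for 30 days
train_kwh = energy_kwh(500, 30 * 24)
# Hypothetical inference fleet: 50 kW serving traffic all year round
infer_kwh = energy_kwh(50, 365 * 24)

print(f"Training:  {co2_kg(train_kwh):,.0f} kg CO2e")
print(f"Inference: {co2_kg(infer_kwh):,.0f} kg CO2e")
```

Even with these toy numbers, continuous inference can rival or exceed a one-off training run, which is why the report singles out inference as an impact.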
Cloud services, in turn, rely on huge factories: data centres filled with computers and storage devices. They consume a great deal of electricity, and also water when located in warmer climate zones. The environmental reports of Meta, Microsoft and Google show that the largest share of their footprint (around 90%) comes from the supply chain. However, this part of the process is not transparent and needs to be made more visible to users.
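The ~90% supply-chain figure corresponds to what corporate reporting calls scope 3 emissions. The breakdown below uses invented numbers purely to show how such a share is computed from a report's scope 1/2/3 totals.

```python
# Hypothetical corporate footprint, in tonnes of CO2-equivalent.
scope1_operations = 50_000       # direct emissions (fuel, cooling leaks)
scope2_electricity = 150_000     # purchased electricity
scope3_supply_chain = 1_800_000  # upstream/downstream value chain

total = scope1_operations + scope2_electricity + scope3_supply_chain
supply_chain_share = scope3_supply_chain / total

print(f"Supply chain share: {supply_chain_share:.0%}")
```

Because scope 3 depends on suppliers' own disclosures, it is also the least transparent part of the footprint, which matches the concern raised in the session.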
Moreover, the Quantum Internet is far from sustainable, and the protocols underlying quantum networking have not yet been standardised. To decrease the environmental impact of the Internet, it is first crucial to define the green metrics for measuring it. Second, we must acknowledge that any ranking implies inequalities, since not all Internet resources are available everywhere.
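One candidate green metric is carbon emitted per gigabyte transferred. The sketch below uses assumed values for network energy intensity and grid carbon intensity; it also illustrates the inequality point, since identical traffic scores very differently depending on the local grid mix.

```python
def gco2e_per_gb(kwh_per_gb: float, grid_gco2e_per_kwh: float) -> float:
    """Grams of CO2e per GB = energy intensity x grid carbon intensity."""
    return kwh_per_gb * grid_gco2e_per_kwh

# Same network energy intensity (0.05 kWh/GB, an assumption), two grids:
low_carbon_grid = gco2e_per_gb(0.05, 50)    # e.g. hydro-heavy region
high_carbon_grid = gco2e_per_gb(0.05, 700)  # e.g. coal-heavy region

print(f"Low-carbon grid:  {low_carbon_grid} gCO2e/GB")
print(f"High-carbon grid: {high_carbon_grid} gCO2e/GB")
```

Any ranking built on such a metric would therefore partly reflect where infrastructure and clean energy happen to be available, not just operator behaviour.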
All in all, the fact that we do not fully understand the impact of the latest digital technologies highlights the relevance of keeping this topic on the agenda. It is also crucial to reflect on the decision-making process and its lack of knowledge about the environmental cost of each decision. Regulation is needed, specifically on water and energy consumption, but it is also important that such rules be as data-driven as possible.
It is not easy to ensure that sustainable technology by design translates into rules and plans that are actually followed and implemented. That would require changes in culture and in economic and political systems, which would in turn affect demand and supply.
However, when it comes to regulation, a debate arose about its appropriateness. First, AI systems are black boxes: we do not properly understand how they work, and their application in society is still new. These grey areas are the reason why regulators should take a consultative and iterative approach.
Furthermore, regulation of AI is not universally accepted: it is not obvious why its energy consumption should be regulated when that energy could, for example, be used for something else, especially improving social standards. Prescriptive regulation can also run counter to the open-ended nature of AI technology, and regulators are highly context-specific. Some countries may be much more cautious than others, and the Council of Europe's task is to accommodate these potentially very different approaches to AI.
Finally, there are several ways to contain the environmental impact of digital development. For instance, we could improve measurement, standards, and collaboration on data collection, in parallel with the regulatory process. We should also look at the impact of the whole life cycle: little of what digital technologies consume is recovered or recycled, and much can be done from both the direct and the indirect impact perspective. Furthermore, it is crucial to focus on where the energy comes from, as well as on the physical hardware. All in all, we need to form a holistic view of the environmental impact of ICT to stress its urgency, even considering degrowth as a solution.