|

Comments by Commenter

  • A.Karanasiou

    • I would suggest the following amendment: “There are also many examples of digital technologies being used or developed in a manner that restricts fundamental human rights and liberties, such as the right to privacy, the right of access to information, and the right to free speech, thereby posing significant threats to societal cohesion, democracy and self-determination.”

    • Comment on 3.1 Human rights and human agency on 30th September 2019

      I would suggest the following addition: “The advent of AI in our daily lives creates further limitations for due process, posed by opaque and inscrutable algorithmic processes that have wide applications in the private sector and public affairs alike (*1). Automated decision-making challenges fundamental human rights in an unprecedented manner and diffuses accountability due to the various levels of interaction between human operators and artificial agents (*2). Human agency and autonomy, both underpinning rationales for seminal human rights such as privacy and free speech, are being massively redefined in the era of automation. Enhancing algorithmic accountability should therefore be a key priority for policy-making aiming to create a regulatory framework that guarantees fairness and transparency.”

       

      Footnotes

      (*1) Pasquale, F. (2015). The black box society. Harvard University Press.

      (*2) Karanasiou, A. P., & Pinotsis, D. A. (2017). A study into the layers of automated decision-making: emergent normative and legal aspects of deep learning. International Review of Law, Computers & Technology, 31(2), 170–187.

    • Comment on 3.1 Human rights and human agency on 30th September 2019

      I would add here a footnote with Balkin’s seminal work on information fiduciaries (Balkin, J. M. (2015). Information fiduciaries and the first amendment. UC Davis Law Review, 49, 1183.)

  • almut.nagel@ec.europa.eu

    • Comment on Focus Area 2 on 27th June 2022

      In addition to “greening ICT” and the respective methods for measuring the footprints, we also discussed the need for methodologies on the enabling side, so please add:

      “Furthermore, common methodologies are also necessary to measure and compare the enabling effects of digital solutions in order to describe their net environmental benefit.”

  • Amali De Silva - Mitchell

  • Andre Melancia

  • Andrew Campling

  • CatherineGarcia_ISOC

    • I agree with the emphasis added to paragraph 4 with regard to the rights of children on the Internet. During my intervention in the Q&A I addressed the importance of encryption as being instrumental to Safety by Design; for this reason I request that this be added to paragraph 4 (see addition in bold):
      We need to recognize that digital services and the Internet were created without considering children as users, but they make up a third of Internet users. This is why mandatory risk assessment and mitigation is so important. In fact, the solution is not to exclude children from services: The right of access to the media was declared in Article 17 of the Convention on the Rights of the Child and clarified by General Comment No. 25. In this regard, the European Commission’s Better Internet for Kids (BIK+) strategy has strongly developed safety by design principles. In order to be able to offer children safe online services, the implementation of encryption, including end-to-end encryption, has been instrumental to Safety by Design. While it is useful to deploy age verification, it is of key importance to understand the unintended consequences this may have, for instance due to the use of biometric data.

  • Claudia Leopardi

    • Regarding @Vittorio’s second comment, if I am understanding its point correctly, I do agree that not discussing the definition was an aim of the session itself (to be able to stay within the 45 minutes), but defining Internet fragmentation in works such as the PNIF Output Document is indeed useful. For this reason, I’d agree that changing the second sentence of the first message to “Thus, it is crucial to address the risks that come with it.” can be a good idea to avoid confusion.

  • Constance

    • Comment on Focus Area 2 on 5th July 2022

      Multistakeholder involvement in the standards development process is crucial, as is the value of a collaborative process to address identified problems and/or issues, including the engagement of policymakers in the process so they gain a better understanding of what standards exist and how they are intended to be applied. Governments are critical in encouraging the development, adoption and implementation of standards rather than mandating or regulating solutions.

    • Comment on Focus Area 2 on 5th July 2022

      The following should be included:
      Multistakeholder involvement in the standards development process is needed, as is the value of a collaborative process to address identified problems and/or issues, including the engagement of policymakers in the process so they gain a better understanding of what standards exist and how they are intended to be applied. Governments are critical in encouraging the development, adoption and implementation of standards rather than mandating or regulating solutions.

    • Comment on Focus Area 2 on 5th July 2022

      Here the messaging does not capture the focus of the discussion on the need to implement standards once they have been finalized. An implementation framework is important for addressing national cybersecurity issues, and at the international level cooperation is important for effective implementation.

    • Comment on Focus Area 2 on 5th July 2022

      The following should be included: “Multistakeholder involvement in the standards development process is needed, as is the value of a collaborative process to address identified problems and/or issues, including the engagement of policymakers in the process so they gain a better understanding of what standards exist and how they are intended to be applied. Governments are critical in encouraging the development, adoption and implementation of standards rather than mandating or regulating solutions.”

    • Comment on Focus Area 2 on 5th July 2022

      Here, ‘relentless testing’ is not necessarily connected to consumers but to consumer organisations’ testing programmes and to societal organisation of responsible disclosure.

      Also, it is unclear what procurement has got to do with lower-level standard bodies (who are they?).

    • Comment on Focus Area 2 on 5th July 2022

      Paragraph 1: It is unclear what the actual message is.

    • Comment on Focus Area 2 on 5th July 2022

      Paragraph 2: It is unclear what the actual message is.

    • Comment on Focus Area 2 on 5th July 2022

      Paragraph 2: Mentioning standardisation bodies in one line with industry deployment seems like a mix-up/mistake.

  • Davidfrautschy_ISOC

    • Comment on Focus Area 1 on 2nd July 2022

      […] Any regulatory initiatives aimed at creating sovereignty in a particular field (NIS Directive, DNS4EU) must be well examined to ensure they do not harm human rights online, do not harm the open and global nature of the Internet, and are in line with democratic, multistakeholder principles.

    • Comment on Focus Area 1 on 2nd July 2022

      Comment on paragraph: I don’t see how this paragraph relates to the digital sovereignty discussions. I suggest deleting it.

    • Comment on Focus Area 1 on 2nd July 2022

      Alternative wording:

      The European vision of digital sovereignty could (should?) be used to increase competition and foster economic growth for the EU and its member states.

    • Par 1:
      I agree with @Vbertola and @ClaudiaLeopardi on shortening the second sentence. Regarding the third sentence, we should not limit the scope to government actions, so I’d rephrase it to (changes in bold) “Policy proposals that fragment the Internet, whether intentionally or not, prevent it from being a global space” (the use of “global” is preferred to “open”). I don’t understand the objective and meaning of the last sentence.
      Par 2:
      On the last sentence, it is important to include the following: “… ensuring that companies, civil society and the technical community are included in such discussions.”

       

    • It is important to include the following change (in bold): ” (…) ensuring that companies, civil society sector, and the technical community are included in such discussions.”

    • It’s not about “respecting” stakeholders, but about “taking into account the views of” stakeholders.

    • I moderated this panel and I found it frustrating that none of the speakers brought into the discussion the “What’s next” that was requested in the title. Instead, I explained the efforts that the Internet Society is making on this front. I don’t want to “sell” our work here, but I would like to suggest the following addition to the paragraph: “There is a need to raise awareness of the risks of Internet fragmentation and also an opportunity to build on the capacities of the technical community and other stakeholders who are interested in addressing these challenges. Proven solutions, like the Internet Impact Assessment toolkit, can be a way forward.”

    • I agree with Mark Carvell’s comment

    • During my intervention in the Q&A I highlighted the importance of taking into account the public statements of relevant EU official institutions in this debate. I request to include the following sentence at the end: “To this extent, it is important to acknowledge the opinion of the European Data Protection Board, the complementary assessment of the European Parliament Research Center and the opinion of the European Council Legal Services, who concur on the disproportionality of the measures proposed by the European Commission from a privacy point of view. According to the EU Council Legal Service, the proposal in its current version would fail to pass a challenge at the European Court of Justice.”

    • The wishful thinking that a “strong regulation will prevent violation or misuse” is naïve at best. Nothing will prevent malicious actors (autocratic governments, corporations from foreign jurisdictions, or hackers) from using on their own behalf a tool that would spy on citizens’ mobile phones or computers. We cannot allow the creation of a law that would foster this kind of surveillance.

    • Regardless of whether an intervention was contested or not during the workshop, EuroDIG official messages cannot reflect statements that are inaccurate. Client-side scanning completely defeats the purpose of encryption. If breaking encryption is equal to opening a sealed envelope to read the content, client-side scanning is equal to having someone read over your shoulder while you write the letter. Common sense shows that client-side scanning is the end of privacy, so the first sentence should be redrafted completely. I suggest:
      “Client-side scanning for detecting CSA online involves methods that are questionable from a privacy perspective, and can be as intrusive as requiring the analysis of all the media stored on all citizens’ devices, making everyone a suspect by definition.”

      Also, hoping that AI technologies will bring a magic-wand solution to this challenge is a naïve approach. The fact that we would need as many safeguards as the ones listed is an indication of the immaturity of this proposal.

    • In this debate about protecting vulnerable groups, it is precisely the vulnerable groups who are disregarded. Minors are continuously forgotten when formulating solutions. They should be included in this paragraph.

      “(…) Significant weight should be given to civil society, including individuals from vulnerable groups, like minors, and from non-technical backgrounds (…)”

      I wonder why cybersecurity and crypto-technologists are considered non-technical.

    • This whole paragraph makes no sense at all. The first sentence needs a reference note to clarify which study supports the statement.

      On the second sentence, there is a misguided concept. The so-called “privacy paradox” is a term coined by EDPS Buttarelli, which describes the contradiction between privacy-related attitudes and behaviour: we know our privacy is being exploited, and we feel like we’re losing control of our data, yet statistics show we’re not changing our online behaviour to claim it back. Instead, we keep feeding data-collection machines that undermine our privacy. The use of this term in the second sentence is inappropriate and out of context.
      Finally, in line with previous EuroDIG positions, we must acknowledge that strong end-to-end encryption protects citizens, enterprises and governments. There should be no excuse to undermine it. Therefore, the second sentence also needs a reformulation.

    • Finally, there was a question from the audience about security risks associated with client-side scanning. Participants agreed that, indeed, there can be risks of attacks on the database, and also that reverse engineering could occur and that this would be a way for criminals to circumvent detection.

    • The new proposal for paragraph 2, WS 1b, by Torsten Krause is an oxymoron: client-side scanning and privacy simply cannot coexist.

      Proposal:
      Client-side scanning should not be used as a means to detect CSAM online. Instead, only techniques that can protect privacy should be used. Concerns are raised about anti-grooming techniques that analyse visual and textual data. The use of AI raises questions about proxies, bias and accuracy. Effective task-based models that respect privacy require comprehensive and accurate data, especially the use of metadata. Authorities play a critical role in double-checking the effectiveness of these measures and privacy compliance. Looking ahead, data-saving and anonymity-preserving age verification mechanisms could be a future-proof solution for robust verification and privacy protection.

  • Desara

  • EpE

  • flindeberg

    • I’d suggest adding an example that also encompasses routing and not just (ccTLD) DNS delegations.

      “such as revoking the delegation of a TLD, IP address prefix or ASN,”?

      ccTLDs are quite special, even in the DNS realm, and I think what EuroDIG *actually* wants is the separation of Internet infrastructure and geopolitical considerations (cf. Internet fragmentation).

    • “Thus, it is crucial to address the risks that come with it rather than trying to define it”

      Isn’t a fragmented Internet one where the idea of one global Internet does not hold? That is, fragmented unique identifiers, regardless of whether it is v4 vs. v6, ccTLDs, or anything else. Or in other words, the infrastructure layers of the Internet need to be kept intact, and the infrastructure and transport layers must be fundamentally separated from the information that flows over the network.

  • Galia Kondova

    • I suggest that the following addition to the paragraph is made, namely:

      Countries should also make sure that the broad public is informed and educated about the use of a digital ID system. Countries should also make sure that digital ID systems are user-friendly while meeting high security, privacy and technological standards.

  • Giacomo Mazzone

  • Ieva

    • The ideas discussed in this session were much broader. I propose to include the following:

      Citizens’ expectations from governments are increasing, and effective use of digital technologies can help meet these demands. Beyond technology development, it’s essential to cultivate digital skills and a forward-thinking mindset in the public sector. The main challenge is changing work habits and focusing on problem-solving before technology implementation. Digital services must be citizen-centric, secure, and user-friendly.

      Open policy-making and innovative thinking are crucial, along with safe experimentation spaces like GovTech Labs. These labs test new policies and technologies, fostering innovation through skill development and co-creation. Design thinking and user experience should prioritize simplicity and functionality.

      Success in digital services depends on organizational maturity and a clear vision supported by citizens and legislation. Challenges include digital skill gaps, data analysis capabilities, and regulatory barriers, requiring a shift towards enabling innovation.

      Future challenges include digital identification, AI regulations, and ensuring technology accessibility for all, including senior citizens. Practical strategies and public co-creation are necessary for meaningful change.

  • JuttaCroll

  • kmulberry

  • Maarten Botterman

  • Manuel

    • Comment on Focus Area 1 on 1st July 2022

      In order to capture the discussions and conclusions of WS3 on international connectivity, I would suggest adding the following text to Focus Area 1, possibly after or within paragraph 2:
      “Connectivity should be a fundamental building block in EU efforts, highlighting the importance of connecting Europe to the rest of the world. There is a need for increased investment in international connectivity through submarine cables and other technologies. The EU should take decisive steps in establishing a comprehensive digital connectivity strategy not just between the EU Members, but also to other regions across the globe, in particular those regions with high traffic growth, such as Africa or South America. This strategy is crucial to turn the EU into a world-class data hub and make its digital products competitive worldwide. In this regard, the Global Gateway is a key strategic framework: the new European strategy aims at boosting smart, clean and secure links in digital, energy and transport.”


    • Comment on Focus Area 1 on 1st July 2022

      Slight amendment to the text previously sent:

      “Connectivity should be a fundamental building block in EU efforts, highlighting the importance of connecting Europe to the rest of the world. There is a need for increased investment in international connectivity through submarine cables and other technologies. The EU should take decisive steps in establishing a comprehensive digital connectivity strategy not just between the EU Members, but also to other regions across the globe, in particular those regions with high traffic growth, such as Africa, South America and Asia including Japan. This strategy is crucial to turn the EU into a world-class data hub and make its digital products competitive worldwide. In this regard, the Global Gateway is a key strategic framework: the new European strategy aims at boosting smart, clean and secure links in digital, energy and transport.”


  • Mark Carvell

    • I’m providing comments on the draft messages as one of the focal points for this session.

      The Subtopic 3 messages need a more concise focus on the issue posed in the session title: How can the Global Digital Compact prevent Internet fragmentation?

      The broader issues of the conduct of the GDC process and its expected impact and implementation were discussed more fully in the Pre5 Workshop on 19 June.

    • The broad context of global challenges to be addressed by the GDC was not the principal focus of this subtopic session. I propose therefore replacing the draft text of the first message with the following specific message on the opportunity provided by the proposed Compact to marshal a global multistakeholder course of action to prevent Internet fragmentation:

      TEXT:

      1. The Global Digital Compact should include detailed and transparent commitments by stakeholders – including governments, regulators and the technical community – to  prevent fragmentation of the single, global and interoperable Internet and its core infrastructure.

    • I propose withdrawing the draft text and substituting the following two paragraphs which take into account the discussion in the session of EuroDIG’s continued involvement in the GDC process. The second paragraph expresses EuroDIG’s support for the current IGF community proposal that the GDC Co-Facilitators establish a “GDC Multistakeholder Sounding Board” which was considered towards the end of the session.

      PROPOSED NEW TEXT:

      2. The multistakeholder EuroDIG community looks forward to continuing to be engaged in the UN process of finalisation of the provisions of the GDC relating to the risks of Internet fragmentation and to its other thematic areas.

      3. EuroDIG supports the proposal submitted to the GDC Co-Facilitators by the Internet Governance Forum’s Multistakeholder Advisory Group and Leadership Panel that a Global Digital Compact Multistakeholder Sounding Board be established to assist the Co-Facilitators at all stages of the GDC’s development.

       

    • I propose that draft text paragraph 3 is not adopted. While the risks of governance fragmentation were touched on by me and other speakers, the possible creation of new bodies such as the Digital Cooperation Forum proposed in the S-G’s Policy Brief and the HLAB proposal for a Global Commission on Just and Sustainable Digitalization was beyond the scope of this Subtopic 3 session. IGF strengthening was not discussed.

  • Melle Tiel Groenestege

  • Michael J. Oghia

  • Michael Tunks

  • Nicola Frank

    • Easy access to and findability of trusted content needs to be ensured.

    • You could add after the last sentence: the DSA package offers the perfect opportunity.

       

    • At the end a sentence should be added: The DSA package offers the perfect opportunity.

    • Sorry, the comment which slipped into para 3 should have been for par 2:
      At the end a sentence should be added: The DSA package offers the perfect opportunity.

    • Add at the end: Support for R&D which brings together technology innovation and creativity is key.

    • Comment on Focus Area 4 on 30th June 2022

      I propose the following for para 3:

       
      Disinformation during the pandemic and the war against Ukraine has confirmed how it can polarize the public debate and be a driver of a crisis. We need a variety of measures to counter disinformation: regulatory measures, trustworthy content through the support of sustainable and independent journalism and independent public service media, fact-checking initiatives and investment in digital and media literacy. Any response to disinformation must comply with human rights and European values, such as democracy and the rule of law. Increased cooperation of the different stakeholders is key.


    • First paragraph: There is growing awareness that visions beyond regulation are needed for an Internet for Trust which allows for democratic discourse. Approaches and concrete initiatives exist which contribute to a digital public sphere based on human rights and working according to democratic rules.

  • Oksana

  • Olivier Crépin-Leblond

  • Olivier Vergeynst

    • One thing I presented is that the environmental impact of digital technologies is multiple: it’s about GHG emissions but also about energy, water and abiotic resource depletion. If you focus your actions on one indicator only (let’s say GHG emissions), you may for example end up having a very negative effect on abiotic resources, some of which are becoming dangerously scarce. This is why analysis and recommendations should be based on the ISO 14040/14044 LCA methodology which, in my understanding, is multicriteria-based.
      I also mentioned briefly the social impact that IT can have (e.g. I talked about jobs that can be created through the refurbishment sector, but it is of course a much larger topic, covering the working conditions for extracting natural resources in mines in DR Congo and other areas, the manufacturing of devices in factories like Foxconn, or the positive aspects that technology can bring to society).
      So I’d like to propose a small modification along the lines of the following sentence:
      “A set of indicators that measure the environmental and social impacts of digital technologies is necessary to enable making the right decisions at the regulatory and political levels.”
      Kind regards,

      Olivier

      Feedback from other panelists:

      ======================================
      I agree with Olivier – I think we will need a set of indicators because a single indicator is not practical. 
       
      For instance, data centres have a range of performance metrics – see this briefing note I did a few years back. It’s a bit out of date now as there are new metrics and many of those listed have now been formally standardised. Single metrics also tend to get misused, like PUE, which should be used for trend analysis but is often used to compare facilities in a misleading way.
       
      https://www.techuk.org/images/Data_centre_performance_metrics_for_Tiny_Tots.pdf
       
      We also did a map of environmental standards relevant to data centres.  Again it is a bit out of date but does demonstrate the range of standards applicable and in use within the sector.  Some of these standards are now becoming the basis for regulation or procurement requirements.   
       
      https://www.techuk.org/insights/news/item/15702-mapping-data-centre-standards  (scroll down to the pink bar at the bottom for the pdf)
       
      Best
       
      Emma
      ======================================
      Dear Olivier, dear all,
       
      I was going to make the same comment, suggesting to also include metrics that take into account rebound effects.
       
      Kind regards,
       
      Beat
       

       

    • One thing I presented is that the environmental impact of digital technologies is multiple: it’s about GHG emissions but also about energy, water and abiotic resource depletion. If you focus your actions on one indicator only (let’s say GHG emissions), you may for example end up having a very negative effect on abiotic resources, some of which are becoming dangerously scarce. This is why analysis and recommendations should be based on the ISO 14040/14044 LCA methodology which, in my understanding, is multicriteria-based.
       
      I also mentioned briefly the social impact that IT can have (e.g. I talked about jobs that can be created through the refurbishment sector, but it is of course a much larger topic, covering the working conditions for extracting natural resources in mines in DR Congo and other areas, the manufacturing of devices in factories like Foxconn, or the positive aspects that technology can bring to society).
       
      So I’d like to propose a small modification along the lines of the following sentence:
      “A set of indicators that measure the environmental and social impacts of digital technologies is necessary to enable making the right decisions at the regulatory and political levels.”

      Many thanks,
      Olivier
      Feedback from other panelists:

      ===============================
      I agree with Olivier – I think we will need a set of indicators because a single indicator is not practical. 
       
      For instance, data centres have a range of performance metrics – see this briefing note I did a few years back. It’s a bit out of date now as there are new metrics and many of those listed have now been formally standardised. Single metrics also tend to get misused, like PUE, which should be used for trend analysis but is often used to compare facilities in a misleading way.
       
      https://www.techuk.org/images/Data_centre_performance_metrics_for_Tiny_Tots.pdf
       
      We also did a map of environmental standards relevant to data centres.  Again it is a bit out of date but does demonstrate the range of standards applicable and in use within the sector.  Some of these standards are now becoming the basis for regulation or procurement requirements.   
       
      https://www.techuk.org/insights/news/item/15702-mapping-data-centre-standards  (scroll down to the pink bar at the bottom for the pdf)
       
      Best
      Emma
      ===================================
      Dear Olivier, dear all,
       
      I was going to make the same comment, suggesting to also include metrics that take into account rebound effects.
       
      Kind regards,
      Beat

    • Oops, I commented on the wrong paragraph. Please disregard my previous comment on this one; Andrijana did a great job and I have nothing to comment here 🙂

    • Additional proposal from Ilias IAKOVIDIS (Ilias.Iakovidis@ec.europa.eu), which I agree with:

      Rephrase as follows:

      A standardized methodology and indicators are necessary to assess and monitor the environmental and social impact of digital technologies, to enable evidence-based decision-making at the regulatory and political levels.

    • Additional proposal from Ilias IAKOVIDIS (Ilias.Iakovidis@ec.europa.eu), which I agree with:

      Rephrase as follows:

      “To reduce the environmental impact of the digital world, it is necessary to adopt measures to optimize the energy and material efficiency (circularity) of the digital sector: for example, increasing the use of renewables, innovating for low energy consumption, keeping devices longer in use, facilitating re-use, improving repairability and recyclability, and adopting sustainable business models.”

  • Pekka Kanerva

    • I kindly suggest the following changes:

      – Correcting the spelling of the name Faktabaari

      – Russian misinformation => disinformation (since we are talking about intentional false information)

    • I kindly suggest the following changes:
      Replace this: ”Thus, one of the key priorities is to enhance citizens digital literacy and education going beyond only digital competencies and including cultural aspects.”
      with this: ”Thus, one of the key priorities is to enhance citizens’ digital literacy and education by going beyond just digital competencies and including also ethical, social and cultural dimensions.”
      Add this important point that was made by the speaker: Responsibility for digital information literacy education lies not only with the formal education system; cultural institutions, NGOs and youth work also play a key role.

    • I kindly suggest the following changes:
      – governs => governments
      – Replace this: ”Therefore, the contemporary political landscape requires three-level trust: political power; knowledge organisations; and individual.”
        with this: ”Therefore, the contemporary political landscape requires three levels of trust: trust in the basic functions and structures of society, trust in knowledge organisations, and trust between one another as individuals.”

    • I kindly suggest the following changes:

      Please add these two important points that were said by the speakers/audience:
      – There is an initiative at the Nordic level to protect children from the harms of the Internet, and this initiative has already been promulgated into legislation in Denmark.
      – As the role of parents is crucial in educating children to use the Internet in a savvy way, parents also need education. That’s why we also need adult education beyond the formal education system, just as the adult education system in Finland already provides training in basic digital skills.

  • Petra Arts

    • As I mentioned during the session, I believe we should be careful with using the term ‘content moderation’ in the context of the Internet infrastructure level, as these services are typically very far removed from the actual content. I would like to suggest amending this paragraph to read: “Recent cases show that certain infrastructure providers unwillingly take action that could be argued to be content moderation by suspending services for the platforms in an ad-hoc manner without any transparent policy. But infrastructure services have limited possible options, which tend to be temporary solutions (clearing cache), overbroad reactions (limiting access) or options that open up websites to cyberattack (terminating services of particular users).”

  • Roberto

    • The overall objective is the production and use of local content – i.e. in the local language and using the local writing system. Universal Acceptance is a tool to get there, but we should not confuse the means with the purpose.

    • Although I was the one who mentioned this during the session, I am not sure that we should push for frequency regulation – besides, it is very likely to be outside our scope.

    • Some comments about the messages were made on the WS-16 mailing list rather than being logged as part of the messages procedure. The final result was a list of messages agreed by consensus.

  • Robin Wilton

    • The term “exponentially” is potentially misleading, and should not be used as a synonym for “rapidly”, unless substantiated with independently-verified data. “Exponential” means that the growth tends to infinity: a CSA prevention policy based on that assumption would almost certainly be disproportionate in practice.

      It is unsafe to make the claim of exponential growth without clarifying whether it refers to instances of real-world child abuse, reports of alleged illegal content, *actual* illegal content, or prosecutions for any of the above.

      As the late Prof. Ross Anderson’s paper (https://arxiv.org/pdf/2210.08958) shows, the gap between the figures cited in soundbites, and the figures for actual instances of illegal material, is extreme. As Levy and Robinson clearly  acknowledged in their 2022 paper on the same topic, “The statistic most often used to illustrate this is the number of reports received by NCMEC, which amounted to 29.4 million in 2021. However, without context this number provides little useful information and can be easily misinterpreted.”

      To put it into perspective – again, based on Levy and Robinson’s own figures – the number of reports that were referred to the police represented just 0.3% of the “headline-grabbing” figure.

      It is unsafe to base policy on phrases such as “exponential growth”, when the evidence demonstrates that the figures cited in support of such policies have already been used in misleading ways.

    • Concerning the word “vicious”: the original drafter may have been thinking of the phrase “vicious circle” (a situation in which the apparent solution of one problem in a chain of circumstances creates a new problem and increases the difficulty of solving the original problem). If so, then the word “vicious” is appropriate in context, since function creep and lack of independent oversight would indeed risk creating such a situation.

      Perhaps the sentence should be re-worded as follows: “Moreover, the possibility of function creep, combined with a lack of independent oversight, gives rise to the risk of creating a vicious circle, in which enforcement measures create worsening outcomes for citizens. The precedent set by the CSA regulation would allow law enforcement to obtain the technical capability to monitor all European citizens in real time – a capability which, in light of Paragraph 1 above, is indistinguishable from a general monitoring obligation.”

    • First, it is worrying to see messages being proposed that, according to a participant, were not articulated in the room.

      Second, the “find a balance between privacy and safety” argument is outdated. The more considered view is that regulation must strive to optimize both, and must do this in the context that children and other at-risk groups are fully-fledged holders of the full set of rights, rather than reduced to being ‘objects of protection’.

      Finally, the closing sentence of the current draft expresses a good principle (“These measures should be centred on children’s rights and their best interests to achieve this balance”), but should end there, for the reason set out above. Centring policy and technical measures on the rights and best interests of the individual is a necessary but insufficient condition of acceptability. Those looking for a fuller set of evaluation criteria may find the REPHRAIN Framework useful: https://bpb-eu-w2.wpmucdn.com/blogs.bristol.ac.uk/dist/1/670/files/2023/02/Safety-Tech-Challenge-Fund-evaluation-framework-report.pdf

    • Thanks Torsten – I think the changes made result in a more balanced statement without sacrificing relevant detail. I remain concerned at the use of the word “exponential” without reference to substantiating evidence, for the reasons I set out in my previous comment.

    • With respect, the proposed revision does not address the objections in the comment from 26th June.

      The phrase “self-generated abuse material” is unclear and problematic. Does it refer to content generated by the subject of the abuse themselves? Or was the intent to refer to what is more usually called “user generated content” (UGC) – that is, content generated by the users of a platform/service, but not necessarily depicting or referring to themselves?

      Further, “pathological” is being used here in an unconventional way, without explaining how its normal use (‘relating to or caused by disease”) is relevant to this issue.

      Replacing the word “widespread” with “trending” is not a material improvement, and does not substantiate the claim.

      The main question is: is this unclear message an accurate reflection of unclear statements in the workshop, or did the workshop produce a clear message which is expressed unclearly here?

    • Andrew, your opening sentence, regardless of which version, is a mixture of analysis, misdirection, and wishful thinking. The two principal problems with it are these:

      1 – your use of the word “involves” is misdirection, because it implies that these methods are a fait accompli, viable at scale and without adverse consequences, which is very far from the case.

      2 – Your assertion that it is possible to implement CSS that is privacy-respecting and effective is entirely open to challenge. In particular, you are choosing to ignore – and not for the first time – widely publicised and valid analysis of CSS as a capability that introduces systemic vulnerability and creates the foundations for mass surveillance of law-abiding citizens. Here are three examples:

      https://academic.oup.com/cybersecurity/article/10/1/tyad020/7590463?login=false

      Chatcontrol or Child Protection?

      https://signal.org/blog/pdfs/ndss-keynote.pdf

    • @Velofisch Actually I am not at all convinced that “these methods” minimise the impact on privacy. One of the fundamental problems with Andrew’s claims is that client-side scanning places, on every device or handset, the means to inspect the individual’s communications. From that point on, you are wholly dependent on the good-faith action of the authorities to use that tool only as specified. You can’t stop it from being used to inspect everything. In my view, that is a risk wholly disproportionate to the stated goals of this approach.

      It gets worse, in my view. Client-side scanning (as envisaged by Andrew) takes place regardless of any basis for suspicion. This reverses the “presumption of innocence”. It starts with the assumption that everyone is committing a CSA offence, and goes looking for the evidence. That mindset ought not to be acceptable to any citizen in a democratic society.

    • I think Torsten’s suggestion for the last sentence of para.3 is a good one. Ross Anderson’s “chat control” paper made a convincing case that domestic violence and sexual abuse are closely linked, and that preventive measures which ignore one in favour of the other are less likely to be effective.

    • Like David, I don’t think cybersecurity and ‘crypto-technologists’ should be considered non-technical.

  • Ross Creelman

    • I would propose the following wording:

      5G reinforces the foundation for the digital transformation by offering new ways to innovate and create new business models based on the real-time availability of data.

    • I would propose:

      5G has huge potential to increase sustainability, especially in urban environments, by allowing cities to reduce energy consumption and by enabling a new generation of digital services and solutions.

    • I would propose:

      Collaborating on the elaboration of 5G standards is key to ensure interoperable data/ IoT solutions and to promote the security of the 5G ecosystem.

    • I would propose:

      Data protection remains a key consideration in the context of 5G, as for all digital communications. The GDPR protects data to varying degrees depending on the area of application; however, additional safeguards may be necessary for medical data.

  • sofia.rasgado

    • ·       Type of content: Child Sexual Abuse (CSAM) was mentioned; self-generated abuse material that can be voluntarily produced, shared and misused or due to coercion by online organizer groups (sextortion), both with severe consequences as suicidal actions; Pathological content (live broadcasting content appealing to violence, alcohol consumption, sexual abuse, etc).

    • Taking into account the previous comments, could it be rephrased as:
      Privacy-Respectful, Inclusive, and Accessible Technical Approaches
      Client-side scanning for detecting known CSAM online involves methods that can be privacy-preserving, learning nothing about the content of a message except whether an image matches known illegal content; concerns are raised about anti-grooming techniques analysing visual and textual data.
      AI deployment involves issues of proxies, biases, and accuracy, necessitating inclusive and accurate data for effective task-oriented models that respect privacy, especially leveraging metadata. Authorities play a crucial role in double-checking these measures’ effectiveness and privacy compliance. Looking ahead, data-saving and anonymity-preserving age verification mechanisms could be a future-proof solution for robust verification and privacy protection.


    • Taking into account the previous comments, could it be rephrased as:
      3. Diversity and Multi-Stakeholder Philosophy: A diversified multi-stakeholder approach is required to ensure that solutions are comprehensive in addressing harmful online content. Significant weight should be given to civil society; individuals from vulnerable groups, like minors, and people from non-technical backgrounds should be involved in this process and their perspectives taken into account.

  • Stephen Wyber

  • Tjabbe Bos

    • Based on my participation in the session, it appears this message does not appropriately cover the scope of the discussion. In particular, several of the speakers and participants agreed on the need for instruments that go beyond traditional government-to-government cooperation to also include direct cooperation measures, provided that appropriate safeguards to ensure the protection of fundamental rights are provided for. Therefore, I propose to reformulate this message as follows:

       

      Criminal justice instruments should provide for safeguards to ensure that fundamental principles are respected, including principles of proportionality, necessity and legality.

       

      In its current form I would strongly oppose the message.

  • Torsten Krause

    • New proposal:


      The panel discussed the EU Commission’s proposal for a regulation to prevent and combat child sexual abuse. On this basis, the EU plans to search for known and unknown child sexual abuse material as well as grooming in digital services. Stakeholders agreed that something needs to be done to better protect children online. However, there were differing views on the measures and options needed to achieve this.

    • New proposal:


      Although something needs to be done, it is also important to discuss the implementation of these measures. The aim of the present proposal is to protect children from sexual abuse and exploitation on the Internet. However, there are several challenges in doing so. First, detection currently relies on a temporary derogation from the ePrivacy Directive, after which children would be unprotected. This justifies the need for the regulation to come into force in the summer of 2024. Second, there has been concern among some that the currently existing mechanisms produce an overload of false positives for law enforcement. Third, a desire was articulated to learn more about how the EU Centre will develop identifiers and work with existing databases and actors. Finally, it was suggested that the EU Centre work with existing hotlines, which have played an important role in combating child sexual abuse to date and have actively removed a significant amount of content from the Internet. All in all, the intentions are good, but there were doubts among some participants and discussants about their implementation and whether the authors of the proposal had consulted experts in technology, fundamental rights and children’s rights.

    • New proposal:


      In fact, 59% of child sexual abuse content removed last year was hosted in EU Member States, making Europe the world leader in this area, hosting ⅔ of all cases. However, in addition to images of child sexual abuse, sexualized depictions of young people that they have created consensually (sexting) are increasingly found online. Currently, digital services report known child sexual abuse material voluntarily. If the EU proposal comes into force, services will be required to do so.

    • New proposal:
      We need to recognize that digital services and the Internet were created without considering children as users, but they make up a third of Internet users. This is why mandatory risk assessment and mitigation is so important. In fact, the solution is not to exclude children from services: The right of access to the media was declared in Article 17 of the Convention on the Rights of the Child and clarified by General Comment No. 25.
      In order to be able to offer children safe online services, it is useful to know how old or in which age cohort the users are.

    • New proposal:
      Finally, doubts and difficulties are not a reason not to act. Indeed, false positives will be assessed by humans and their quality will be checked by other humans. Nevertheless, when it comes to surveillance, it is a problematic topic. Therefore a strong regulation is needed to prevent violation and misuse of the regulation for other purposes by companies or member states.

    • Please add a “too” at the end of the last sentence.

    • Please add a “too” at the end of the last sentence of the draft.

    • From my perspective, the comments on technology take up too much space in this message. This topic was explored in more depth in another workshop. It also leaves too little room for other aspects that played a role in the exchange. Therefore, here is a suggestion to change the message:
      CSA is currently increasing exponentially and has serious consequences for the rights and development of children. For this reason, recognising such depictions and preventing sexual violence should go hand in hand. Participants are concerned about the safety of users, including with regard to the potential use of technology. Breaches of confidential communication or anonymity are seen critically. At the same time, advantages are recognised in the regulations, e.g. with regard to problem awareness or safety by design approaches. Age verification procedures are perceived as both a risk and an advantage. They can improve the protection of children on the internet, limit the spread of CSA material and empower children. However, this should not be at the expense of anonymity and participation.

    • Proposal for new paragraph 1, WS 1b:
      Self-generated abusive material and pathological content are emerging as the most widespread harms to vulnerable groups online. All stakeholders are aware that measures and regulations must be taken to protect vulnerable groups. They are also aware that the rights and needs for protection against violence and abuse as well as privacy and participation must be guaranteed.

    • Proposal for new paragraph 1, WS 1b:

      Self-generated abusive material and pathological content are emerging as the most widespread harms to vulnerable groups online. All stakeholders are aware that measures and regulations must be taken to protect vulnerable groups. They are also aware that the rights and needs for protection against violence and abuse as well as privacy and participation must be guaranteed.

    • Proposal for a new last sentence of paragraph 2, WS 1b:
      Looking ahead, data-saving and anonymity-preserving age verification mechanisms could be a future-proof solution for robust verification and privacy protection.

    • Proposal for a new last sentence of paragraph 3, WS 1b:
      All stakeholder groups, including those affected by violence, should be involved in this process and their perspectives taken into account.

    • It was an attempt not only to react, but also to reflect on other thoughts and suggest a possible solution. In view of your response, I suggest formulating it as follows. By the way, sources with current data are listed in the workshop wiki. In my view, it is not necessary to repeat them here.

      New proposal for paragraph 1, WS 1b:
      Child sexual abuse material and other unlawful content massively violate the rights of those affected and of those who happen to come into contact with it by accident. All stakeholders recognise that measures and regulations must be put in place to protect vulnerable groups, e.g. children. They also acknowledge that the rights and needs for protection from violence and abuse as well as privacy and participation of all must be equally guaranteed.

    • Here is an attempt to summarise the ongoing discourse in a common message. Regardless, we should be careful to reflect the discussion of the workshop and not continue the discourse.

      New proposal for paragraph 2, WS 1b:

      If client-side scanning is used to detect known CSAM online, only techniques that can protect privacy by not learning anything about the content of a message other than whether an image matches known illegal content should be used. Concerns are raised about anti-grooming techniques that analyse visual and textual data. The use of AI raises questions about proxies, bias and accuracy. Effective task-based models that respect privacy require comprehensive and accurate data, especially the use of metadata. Authorities play a critical role in double-checking the effectiveness of these measures and privacy compliance. Looking ahead, data-saving and anonymity-preserving age verification mechanisms could be a future-proof solution for robust verification and privacy protection.

    • New technology-open proposal for the first sentence of the paragraph, as there was no explicit request in the workshop to exclude CSS:

      To detect CSAM online, only techniques that can protect privacy by not learning anything about the content of a message other than whether an image matches known illegal content should be used.
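
      For illustration only, the following minimal sketch shows the underlying idea of matching an image against a list of known content without inspecting anything else about a message. It is simplified to an exact cryptographic hash lookup; real proposals rely on perceptual hashing (e.g. PhotoDNA or PDQ) and private matching protocols, which this toy example does not implement, and the hash list shown is a placeholder.

      import hashlib

      # Toy list of hashes of known illegal images (placeholder values, not real data).
      KNOWN_HASHES = {
          "0" * 64,  # stand-in entry; a real list would come from a vetted hash database
      }

      def matches_known_content(image_bytes: bytes) -> bool:
          """Return True only if the image's hash appears in the known-content list.

          Nothing else about the message is inspected, stored or reported.
          """
          digest = hashlib.sha256(image_bytes).hexdigest()
          return digest in KNOWN_HASHES

      # A previously unseen image does not match, and nothing further is learned from it.
      print(matches_known_content(b"example holiday photo bytes"))  # False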

  • vassilis

  • vbertola

    • This was the subject of some last-second comments in the session; I think that the substance is OK but the logical flow now does not work well, as you start with a negative (DoH could affect user choice), then you have a positive (provide more privacy), but you connect them with “as well as”. I would just break the first sentence in two and change the connection, e.g.: “DoH protocol… resolved for them. It could provide more privacy by encrypting DNS queries; however…”.
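
      As background to the wording above, a minimal sketch of what an encrypted DNS lookup over DoH looks like in practice (Google's public JSON resolver endpoint is used purely as an example; any DoH resolver chosen by the application would behave the same way):

      import json
      import urllib.request

      # A DNS lookup sent over HTTPS: the question and the answer travel inside the
      # encrypted connection to the resolver instead of as plaintext UDP on port 53.
      url = "https://dns.google/resolve?name=example.com&type=A"
      with urllib.request.urlopen(url) as response:
          reply = json.load(response)

      for record in reply.get("Answer", []):
          print(record["name"], record["data"])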

    • Perhaps I would also mention the word “jurisdiction” in an additional open question – i.e. “Which jurisdiction should apply to DNS resolution?”.

    • I remember a couple of interventions mentioning the importance of developing agreed ways to keep users in charge, through proper information and easy, open configuration options – this is an issue that often comes up. Not a vital omission though 🙂

    • I agree, this is a complex issue with lots of technicalities, and yet the summary is really good. I have a few minor observations and am making them throughout the text, but overall it’s a good summary.

    • I could not attend this session, but reading the messages I am a bit puzzled by the lack of any reference to the big issue regarding blockchain and privacy, i.e. how you can give each user full control over their personal information, including the possibility to update and delete it, on a technical infrastructure where everything is public by design, replicated in a huge number of copies, and impossible to change or delete once written.

    • Perhaps the message could be made more general at this point: I think that there was strong agreement that policy input into standardization processes “must be based on the multistakeholder principle, ensuring equal participation among stakeholder groups”, be it through a new body or through the existing standardization organizations.

    • Perhaps, at the end, you could also add something about making sure that the “offline” values of Europe continue to apply online, e.g. “in order to… and to uphold European values in the norms and customs of online activities”.

    • Comment on Focus Area 1 on 29th June 2022

      Suggested changes:

      “avoid creating barriers to weaker economic players” – competition regulation is always meant to impose constraints on dominant players in order to facilitate the others.

      Also, I am still not too sure about the final paragraph that singles out two specific things, one of which, DNS4EU, is not even a regulatory initiative (I would actually say that it is not even an exertion of sovereignty, any more than any public procurement initiative is). Possibly we could just strike the parenthetical with the examples.

    • Comment on Focus Area 1 on 29th June 2022

      “States” should have a capital S.

    • Comment on Focus Area 1 on 29th June 2022

      This is in total contradiction with the last sentence of paragraph 2. Do we want to prioritise values, or do we want to prioritise business? You can’t always maximise both. Actually, most of the current wave of EU regulation is about protecting values in the face of business pressures. It looks like this paragraph is actually advocating against EU regulation.

    • Comment on Focus Area 2 on 29th June 2022

      Er… what is “internet operability”?

      Also, the “operational level of the Internet” (whatever that is) is not managed by the IETF. The IETF makes standards, it does not manage anything.

      Overall, it is unclear to me what the message is.

    • Comment on Focus Area 2 on 29th June 2022

      Third sentence: again, industries implement standards, but standardisation bodies do not. Perhaps the sentence should be broken into two parts.

      Fourth sentence: not sure why the encouragement to participate is addressed only to NGOs.

    • Comment on Focus Area 2 on 29th June 2022

      Sorry, the last comment was meant for §2 instead – I reposted it there.

    • Comment on Focus Area 3 on 29th June 2022

      I am all in favour of including youth, but if you write the last sentence this way, it looks like the #1 remedy to the challenges posed by the geopolitical tensions is including youth, and I do not think this is what we want to say. Perhaps it could be something like:

      – there is a need to take a fresh look

      – one of the issues to consider is the inclusion of…

    • Comment on Focus Area 3 on 29th June 2022

      I’m sure we said that digital identity solutions should also be open and interoperable and should allow end-users to pick their trusted identity provider among many, avoiding the centralized control of online identification by either the government or the dominant Internet platforms. At least, I’m sure that Stefano Quintarelli said so 🙂

    • As discussed in the final session, we should acknowledge that some governmental intervention is either positive or necessary to protect other things (e.g. privacy or competition), even if it creates some fragmentation. We also need to add a mention of private sector led fragmentation. My proposal for the third sentence would be:

      “Government regulations that fragment the Internet, whether intentionally or not, prevent it from being an open space, though they may sometimes be necessary to protect other rights and the public interest. The private sector may also fragment the Internet by locking services into walled gardens and breaking the principle of interoperability through open standards.”

      If this is too long, I would rather drop the current last sentence (“The failure…”) – I am not actually sure what it means. Or we could break point 1 into two points.

    • @Flindeberg – many people argue that breaking the flow of information at the application level is also fragmentation. This is why the discussion about the definition remains unresolved. By the way, I am not sure why we want to say that definitions are useless; it seems like gratuitous criticism of what the IGF PNIF has been doing. Perhaps we could just say “Thus, it is crucial to address the risks that come with it.”

    • I don’t necessarily disagree, but it’s hard to ask for a commitment to prevent “fragmentation” without being clear on whether the concept covers only the core technical resources and their governance (ICANN etc.) or also the application and content level (which is far more complex and controversial). Perhaps we could just say “prevent fragmentation of the Internet’s core technical resources and of their governance system”.

    • I don’t remember such a deep discussion and expression of consensus in the room as to allow EuroDIG to express support for anything. In fact, I was at the session, but I don’t even remember the discussion around this proposal.

    • For clarity, I would add at the end something like “Legal requirements should not disrupt the global and collaborative open source software development model.”

  • Velofisch

  • Yrjo Lansipuro

    • The Internet has changed how war is fought, and how it is covered by the media. At the same time, the war has put “One world, one Internet” to a stress test. The foundations of the global and interoperable Internet should not be affected by the deepening geopolitical divide, even though that divide has fragmented the content layer.

      No one has the right to disrupt the global network that exists as a result of voluntary cooperation by thousands of networks. The mission of Internet actors is to promote and uphold the network, and to help restore it if it is destroyed by armed aggression.

      The war has been accompanied by heightened weaponization of the content layer of the Internet. New EU legislation is expected to curb at least the role of very large platforms in spreading disinformation and hate speech.

  • [during drafting session 2024]

Source: https://comment.eurodig.org/comments-by-commenter/