The preceding chapters of this report have shown that our rapidly changing and interdependent digital world urgently needs improved digital cooperation founded on common human values. Based on our analysis and consultations with diverse stakeholders, and noting that not all Panel members were supportive of all recommendations, we make the following recommendations:
AN INCLUSIVE DIGITAL ECONOMY AND SOCIETY
1A: We recommend that by 2030 every adult should have affordable access to digital networks, as well as digitally-enabled financial and health services, as a means of making a substantial contribution to achieving the Sustainable Development Goals (SDGs). Provision of these services should guard against abuse by building on emerging principles and best practices – for example, by providing the ability to opt in and opt out – and by encouraging informed public discourse.
1B: We recommend that a broad, multi-stakeholder alliance, involving the UN, create a platform for sharing digital public goods, engaging talent and pooling data sets, in a manner that respects privacy, in areas related to attaining the SDGs.
1C: We call on the private sector, civil society, national governments, multilateral banks and the UN to adopt specific policies to support full digital inclusion and digital equality for women and traditionally marginalised groups. International organisations such as the World Bank and the UN should strengthen research and promote action on the barriers to digital inclusion and digital equality faced by women and marginalised groups.
1D: We believe that a set of metrics for digital inclusiveness should be urgently agreed, measured worldwide and detailed with sex-disaggregated data in the annual reports of institutions such as the UN, the International Monetary Fund, the World Bank, other multilateral development banks and the OECD. From this, strategies and plans of action could be developed.
In this report we have emphasised that the role of digital technologies in achieving the Sustainable Development Goals goes far beyond simply promoting greater access to the internet. With the right blend of policy, investment in infrastructure and human capacity, and cooperation among stakeholders, they can revolutionise fields as diverse as health and education, governance, economic empowerment and enterprise, agriculture and environmental sustainability.
The specific decisions needed to promote inclusivity and minimise risks will depend on local and national conditions. They should consider four main factors.
First, the broader national policy and regulatory frameworks should make it easy to create, run and grow small businesses. These frameworks should ensure that digital service providers – including e-commerce and inclusive finance platforms – support the growth of local enterprises. This requires enabling policies on investment and innovation, and structural policies to ensure fair competition, privacy rights, consumer protection and a sustainable tax base. Efforts to agree regional or global standards in these areas are welcome.
Second, investments should be made in both human capacity (see Recommendation 2 below) and physical infrastructure. Creating the foundation of universal, affordable access to electricity and the internet will often require innovative approaches, such as community groups operating rural networks, or incentives such as public sector support.
Third, targeted measures should address the barriers faced by women, indigenous people, rural populations and others who are marginalised by factors such as a lack of legal identity, low literacy rates, social norms that prevent them from fully participating in civic and economic life, and discriminatory land ownership, tenure and inheritance practices.
Fourth, respect for human rights – including privacy – is fundamental. Panel members had divergent views on digital ID systems in particular: they have immense potential to improve delivery of social services, especially for people who currently lack legal identity, but they are also vulnerable to abuse. As digital ID becomes more prevalent, we must emphasise principles for its fair and effective use.
Achieving this ambition will require multi-stakeholder alliances involving governments, private sector, international organisations, citizen groups and philanthropy to build new models of collaboration around “digital public goods” and data sets that can be pooled for the common good. SDG-related areas include health, energy, agriculture, clean water, oceans and climate change. These alliances could establish minimum criteria for classifying technologies and content as “digital public goods” and connect with relevant communities of practice that can provide guidance and support for investment, implementation and capacity development.
We are concerned that women face particular challenges in meaningfully accessing the internet, inclusive mobile financial services and online commerce, and controlling their own digital IDs and health records. Policies should include targeted capacity development for female entrepreneurs and policy makers. We call on the technology sector to make more sustained and serious efforts to address the gap in female technology employees and management, include women’s voices when determining online terms and conditions, and act to prevent online harassment and promotion of domestic abuse, building upon the work of existing initiatives such as the High-level Panel on Women’s Economic Empowerment.
While some preliminary work is underway, there is currently no agreed set of clear metrics or standards for the inclusiveness of digital technologies and cooperation. While any metrics will evolve over time, we call for research and multi-stakeholder consultation to establish a basis of shared global understanding as promptly as possible. We encourage the UN, international development agencies and multilateral banks such as the Asian Development Bank, the New Development Bank and the World Bank to drive this process by incorporating digital inclusion as a key metric in approving and evaluating projects. Facets of digital inclusion which may be considered include gender, financial services, health, government services, national digital economy policies, use of online e-commerce platforms and mobile device penetration.
HUMAN AND INSTITUTIONAL CAPACITY
2: We recommend the establishment of regional and global digital help desks to enable governments, civil society and the private sector to understand digital issues and develop the capacity to steer cooperation on the social and economic impacts of digital technologies.
Many countries urgently need to make critical choices about the complex issues discussed in this report. In what types of infrastructure should they invest? What types of training do their populations require to compete in the global digital economy? How can those whose livelihoods are disrupted by technological change be protected? How can technology be used to deliver social services and improve governance? How can regulation be appropriately balanced to encourage innovation while protecting human rights?
Policy decisions will have profound impacts, but many decision-makers lack sufficient understanding of digital technologies and their implications. Capacity development for government officials and regulators could help to harness technology for inclusive economic development in pursuit of the SDGs. Priorities could include diagnostics on digital capacities and how they interact with society and the economy, and identifying the skills workers will need. Joint initiatives with the private sector would also build the capacity of officials and regulators to engage with industry, so that they can understand the operations of the digital economy and respond in an agile way to emerging issues (see Recommendation 5B).
For decisions to be well informed and inclusive, all stakeholders and the public need also to better understand the benefits and risks of digital technologies. Decisions around technology should be underpinned by a broad social dialogue on its costs, benefits and norms. We encourage capacity development programs for governments, civil society organisations, the private sector – including small- and medium-sized enterprises and start-ups – consumers, educators, women and youth. Existing capacity development initiatives by civil society, academia and technical and international organisations could benefit from the promotion of best practices.
A regional approach is recommended to develop capacity, to enable differing local contexts to be addressed. Regional help desks could be led by organisations such as the African Union or the Association of Southeast Asian Nations, in collaboration with UN Regional Commissions. The regional help desks would: conduct research and promote best practice in digital cooperation; provide capacity development training and recommend open-source or licensed products and platforms; and support requests for advice from governments, local private sector (particularly small and medium enterprises) and civil society in their regions. Staff would have regional expertise, and coordinate closely with the private sector and civil society.
A global help desk to coordinate the work of regional help desks could form part of the new digital cooperation architecture we recommend exploring in Recommendation 5A.
HUMAN RIGHTS AND HUMAN AGENCY
3A: Given that human rights apply fully in the digital world, we urge the UN Secretary-General to institute an agencies-wide review of how existing international human rights accords and standards apply to new and emerging digital technologies. Civil society, governments, the private sector and the public should be invited to submit their views on how to apply existing human rights instruments in the digital age in a proactive and transparent process.
3B: In the face of growing threats to human rights and safety, including those of children, we call on social media enterprises to work with governments, international and local civil society organisations and human rights experts around the world to fully understand and respond to concerns about existing or potential human rights violations.
3C: We believe that autonomous intelligent systems should be designed in ways that enable their decisions to be explained and humans to be accountable for their use. Audits and certification schemes should monitor compliance of AI systems with engineering and ethical standards, which should be developed using multi-stakeholder and multilateral approaches. Life and death decisions should not be delegated to machines. We call for enhanced digital cooperation with multiple stakeholders to think through the design and application of these standards and principles such as transparency and non-bias in autonomous intelligent systems in different social settings.
As discussed in Chapter 3, while human rights apply online as well as offline, technology presents challenges that were not foreseen when many foundational human rights accords were created. National laws and regulations must prevent advances in technology being used to erode human rights or avoid accountability. We need to cooperate to ensure that digital technologies advance the inherent dignity and equal and inalienable rights of every human.
Applying human rights in the digital age requires better coordination and communication between governments, technology companies, civil society and other stakeholders. Companies have often reacted slowly and inadequately to learning that their technologies are being deployed in ways that undermine human rights. We need more forward-looking efforts to identify and mitigate risks in advance: companies should consult with governments, civil society and academia to assess the potential human rights impact of the digital technologies they are developing. From risk assessment to ongoing due diligence and responsiveness to sudden events, it should be clarified what society can reasonably expect from each stakeholder, including technology firms.
In some areas there is consensus that much more needs to be done – notably, companies providing social media services need to do more to prevent the dissemination of hatred and incitement of violence, and companies providing online services and apps used by children need to do more to ensure appropriate design and meaningful data consent.
Consensus is also emerging that more needs to be done to safeguard the human right to privacy: individuals often have little or no meaningful understanding of the implications of providing their personal data in return for digital services. We believe companies, governments and civil society should agree to clear and transparent standards that will enable greater interoperability of data in ways that protect privacy while enabling data to flow for commercial, research and government purposes, and supporting innovation to achieve the SDGs. Such standards should prevent data collection going beyond intended use, limit re-identification of individuals via datasets, and give individuals meaningful control over how their personal data is shared.
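The goal of limiting re-identification via datasets can be made concrete with a simple check. The sketch below (an illustrative example, not part of the report's text; the field names and data are hypothetical) computes the k-anonymity level of a small record set: the size of the smallest group of records sharing the same quasi-identifiers. A low k signals that individuals in that group are easier to re-identify by linking the dataset with outside information.

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the k-anonymity level of a dataset: the smallest number of
    records that share an identical combination of quasi-identifier values."""
    groups = Counter(
        tuple(record[q] for q in quasi_identifiers) for record in records
    )
    return min(groups.values())

# Hypothetical, already-generalised health records (age bands, masked postcodes).
records = [
    {"age_band": "30-39", "postcode": "100*", "condition": "A"},
    {"age_band": "30-39", "postcode": "100*", "condition": "B"},
    {"age_band": "40-49", "postcode": "200*", "condition": "A"},
    {"age_band": "40-49", "postcode": "200*", "condition": "C"},
]
print(k_anonymity(records, ["age_band", "postcode"]))  # every group holds 2 records
```

Standards of the kind the report calls for could, for instance, require a minimum k before a dataset is shared, alongside limits on collection and meaningful user control.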
We also emphasise our belief that autonomous intelligent systems should be designed in ways that enable their decisions to be explained and humans to be held to account for their use. Audits and certification schemes should monitor compliance of AI systems with engineering and ethical standards. Humans should never delegate life and death decisions to machines.
TRUST, SECURITY AND STABILITY
4: We recommend the development of a Global Commitment on Digital Trust and Security to shape a shared vision, identify attributes of digital stability, elucidate and strengthen the implementation of norms for responsible uses of technology, and propose priorities for action.
As the digital economy increasingly merges with the physical world and deploys autonomous intelligent systems, it depends ever more on trust and the stability of the digital environment. Trust is built through agreed standards, shared values and best practices. Stability implies a digital environment that is peaceful, secure, open and cooperative. More effective action is needed to prevent trust and stability being eroded by the proliferation of irresponsible use of cyber capabilities.
The Global Commitment on Digital Trust and Security could build on and create momentum behind the voluntary norms agreed in the 2015 report of the UN Group of Governmental Experts (GGE), and complement relevant global processes. It could address areas such as ways to strengthen implementation of agreed norms; developing societal capacity for cybersecurity and resilience against misinformation; encouraging companies to strengthen authentication practices, adhere to stricter software development norms and be more transparent in the use of software and components; and improving the digital hygiene of new users coming online.
GLOBAL DIGITAL COOPERATION
5A: We recommend that, as a matter of urgency, the UN Secretary-General facilitate an agile and open consultation process to develop updated mechanisms for global digital cooperation, with the options discussed in Chapter 4 as a starting point. We suggest an initial goal of marking the UN’s 75th anniversary in 2020 with a “Global Commitment for Digital Cooperation” to enshrine shared values, principles, understandings and objectives for an improved global digital cooperation architecture. As part of this process, we understand that the UN Secretary-General may appoint a Technology Envoy.
5B: We support a multi-stakeholder “systems” approach for cooperation and regulation that is adaptive, agile, inclusive and fit for purpose for the fast-changing digital age.
Enhancing digital cooperation will require both reinvigorating existing multilateral partnerships and potentially creating new mechanisms that involve stakeholders from business, academia, civil society and technical organisations. We should approach each question of governance on its specific circumstances, choosing among all available tools.
Where possible we can make existing inter-governmental forums and mechanisms fit for the digital age rather than rush to create new mechanisms, though this may involve difficult judgement calls: for example, while the WTO remains a major forum to address issues raised by the rapid growth in cross-border e-commerce, it is now over two decades since it was last able to broker an agreement on the subject.
Given the speed of change, soft governance mechanisms – values and principles, standards and certification processes – should not wait for agreement on binding solutions. Soft governance mechanisms are also best suited to the multi-stakeholder approach demanded by the digital age: a fact-based, participative process of deliberation and design, including governments, private sector, civil society, diverse users and policy-makers.
The aim of the holistic “systems” approach we recommend is to bring together government bodies such as competition authorities and consumer protection agencies with the private sector, citizens and civil society, enabling them to respond more agilely to issues and evaluate trade-offs as they emerge. Any new governance approaches in digital cooperation should also, wherever possible, look for ways – such as pilot zones, regulatory sandboxes or trial periods – to test efficacy and develop necessary procedures and technology before being more widely applied.
We envisage that the process of developing a “Global Commitment for Digital Cooperation” would be inspired by the “World We Want” process, which helped formulate the SDGs. Participants would include governments, the private sector from technology and other industries, SMEs and entrepreneurs, civil society, international organisations including standards and professional organisations, academic scholars and other experts, and government representatives from varied departments at regional, national, municipal and community levels. Multi-stakeholder consultation in each member state and region would allow ideas to bubble up from the bottom.
The consultations on an updated global digital cooperation architecture could define upfront the criteria to be met by the governance mechanisms to be proposed, such as funding models, modes of operation and means for serving the functions explored in this report.
More broadly, if appointed, a UN Tech Envoy could identify over-the-horizon concerns that need improved cooperation or governance; provide light-touch coordination of multi-stakeholder actors to address shared concerns; reinforce principles and norms developed in forums with relevant mandates; and work with UN member states, civil society and businesses to support compliance with agreed norms.
The Envoy’s mandate could also include coordinating the digital technology-related efforts of UN entities; improving communication and collaboration among technology experts within the UN; and advising the UN Secretary-General on new technology issues. Finally, the Envoy could promote partnerships to build and maintain international digital common resources that could be used to help achieve the SDGs.
We believe in a future which is inclusive and empowering; a future in which digital technologies are used to reduce inequalities, bring people together, enhance international peace and security and promote economic opportunity and environmental sustainability.
Our recommendations toward that future will require sustained commitment to fundamental human values. They will require leadership and political will, clarity about roles and responsibilities, shared meanings to ease communication, inclusive partnerships with capacity development, aligned incentives, greater coherence of currently fragmented efforts, and building a climate of trust.
We hope this report has shown why individuals, civil society, the private sector and governments urgently need to strengthen cooperation to build that better future.