Algorithm Governance Roundup #3
Community Spotlight: European Centre for Algorithmic Transparency | New Dutch regulator for algorithms
Welcome to January’s Roundup, and wishing you a happy new year. We hope you enjoy this month’s community spotlight on the newly established European Centre for Algorithmic Transparency, which will provide the European Commission with in-house scientific and technical support to enforce compliance with the Digital Services Act.
As a reminder, we take submissions: we are a small team who select content from public sources. If you would like to share content please reply or send a new email to algorithm-newsletter@awo.agency. Our only criterion for submission is that the update relates to algorithm governance, with emphasis on the second word: governance. We would love to hear from you.
Many thanks and happy reading!
AWO team
In Europe, the European Centre for Algorithmic Transparency (ECAT) has been set up in the Commission’s Joint Research Centre following the introduction of the Digital Services Act. The Centre will have three areas of work: 1) to provide in-house scientific and technical support to the regulator; 2) to conduct research to develop the algorithmic governance field; 3) to ‘dynamise’ the networks of vetted researchers and external auditors foreseen by the DSA. In this month’s Community Spotlight, we interview Carlos Torrecilla Salinas, who is leading the setup of ECAT, about the Centre’s background, work and role in the wider regulatory environment.

In Germany, AlgorithmWatch has announced its first fellows in algorithmic accountability reporting. The fellows come from journalism, academia and civil society and will look at a range of issues including automated systems in the public sector, chatbots, fact-checking, the hard-of-hearing community, education, surveillance, and gender-based violence.

In the Netherlands, a new regulator for algorithms has been introduced (full announcement in Dutch). The regulator is housed in the Dutch Data Protection Authority but has its own tasks and responsibilities. It will work on cross-sectoral risks and effects of algorithms, sharing knowledge with other regulators, citizens, industry and academia. It will also publish guidance on algorithms and AI in collaboration with other regulators, for example the Authority for the Financial Markets, the Media Authority and the Data Protection Authority, and will formulate a ‘joint interpretation of standards’ in relation to algorithms.

In Serbia, the Ministry of Interior has announced it will consult with civil society over a new set of draft policing laws. SHARE Foundation has explained the elements which would enable automated biometric data searches during the surveillance of public spaces. Read the full paper here.
In Spain, Barcelona City Council has approved rules for the implementation of AI in municipal services (full document in Catalan). The protocol defines measures to protect citizens’ rights during AI procurement and implementation processes, and it implements and builds upon the EU’s incoming AI Regulation. This is part of the Municipal Government’s Framework for the Ethical Implementation of AI, which also requires the creation of a public register of algorithms and an external advisory body this year.
In the UK, the CDDO and CDEI have created the Algorithmic Transparency Recording Standard Hub. The Hub includes an updated Standard following pilots, new guidance, and a collection of published transparency reports. This follows the piloted standard featured in the Community Spotlight of our first newsletter.
In Canada, the Algorithmic Impact Assessments of five automated tools being used to assist with immigration and refugee determinations have been published to the Open Government Portal. The Algorithmic Impact Assessment (AIA) is a mandatory tool for government bodies to understand and manage the risks associated with automated decision systems. In total, 11 detailed AIAs have been published since its introduction in 2019.

In the US, the National Institute of Standards and Technology (NIST) will publish its AI Risk Management Framework 1.0 (RMF) on 26 January (via an online event). The voluntary framework is intended to improve the ability to incorporate trustworthiness considerations into the design, development, use, and evaluation of AI products, services, and systems. It has been developed through a collaborative process, building on feedback from the second draft RMF and the workshop held in October.
The UK's Digital Regulation Cooperation Forum (the CMA, Ofcom, ICO and FCA) has announced a call for input on its workplan for 2023/24. One workstream is concerned with “improvements in algorithmic transparency”. The response deadline is 27 January 2023.

AlgorithmWatch is hiring for three new roles.

Amnesty Tech is hiring a Technical Research Consultant to investigate algorithmic harms on Instagram and TikTok. The consultancy is intended to deliver research, including replicable research methodologies and data collection frameworks, on the availability and algorithmic amplification of content depicting and/or promoting depression, self-harm and suicide to children and young users. The application deadline is 2 February.

Eticas is hiring an Ethics and Technology Researcher. The role is EU/UK-based and remote.
AI UK 2023
Hybrid in-person and online at the Queen Elizabeth II Centre in Westminster, 21-22 March. AI UK is hosted by The Alan Turing Institute and the programme is structured around the priorities of the UK Government’s Office for Science and Technology Strategy. Topics will include algorithmic bias and AI ethics.

European Workshop on Algorithmic Fairness
Hybrid in-person and online in Zurich, 7-9 June. Hosted by the Zurich University of Applied Sciences, the University of Zurich and Politecnico di Milano, the workshop aims to create a space for European researchers to connect and discuss the status of algorithmic fairness. Participants are invited to submit proposals of full papers for discussion or abstracts for presentation. The proposal deadline is 31 January.

Data Justice Lab Conference
Hybrid in-person and online in Cardiff, 19-20 June. The conference theme is Collective Experiences in the Datafied Society, and it intends to explore impacts, lived experiences and forms of resistance in relation to datafication. Participants are invited to submit proposals for papers or practical workshops. The proposal deadline is 30 January.

6th International Conference on Public Policy, panel on Transformation of the policy and regulatory agendas of emerging technologies
Hybrid in-person and online in Toronto, 27-29 June. This panel aims to discuss empirical and theoretical work addressing the transformation of policy and regulatory agendas of emerging technologies in diverse political and economic contexts. Participants are invited to submit proposals for papers. The proposal deadline is 31 January.
Community Spotlight: Carlos Torrecilla Salinas, European Centre for Algorithmic Transparency
Carlos Torrecilla Salinas is the Head of the Digital Economy unit of the European Commission’s Joint Research Centre. He is also leading, with colleagues in DG CONNECT, the setup of the European Centre for Algorithmic Transparency, as they prepare to provide in-house technical and scientific support for the Commission’s enforcement of the Digital Services Act.

Q: What is the European Centre for Algorithmic Transparency?

Carlos: The European Centre for Algorithmic Transparency (ECAT) was set up following the passing of the Digital Services Act (DSA). The DSA gives the European Commission exclusive responsibility to regulate Very Large Online Platforms (VLOPs). As a consequence, the Commission needed to increase its knowledge and capacity in the field of algorithmic systems, and algorithmic transparency in particular. This is where the idea for the Centre arose.

The DSA’s policy aim is to tackle harmful content online. Its main focus is on digital services, and it places obligations on digital service providers according to their size and nature. Digital services, in most cases, are powered by algorithmic systems. For example, the type of content served to users is decided autonomously based on complex algorithmic systems. Therefore, to regulate digital services it is necessary to understand the technical nature of these algorithmic systems.

ECAT is based in the European Commission’s Joint Research Centre (JRC). The JRC was chosen as ECAT’s home for two reasons. Firstly, the JRC is the scientific service of the European Commission, making it appropriate given that algorithmic systems and their social impact are emerging research fields in academia and industry. Secondly, the JRC was already working on algorithmic systems from different angles. We have conducted technical research to understand how these systems work, but we are also interested in their long-term social impact.
The JRC is also at arms-length from DG CONNECT, which means we can collaborate with them to support regulation through the provision of scientific and technical advice. ECAT will be based out of three locations in the JRC. Our main home is in Seville, but we will also have a presence in Ispra and Brussels to collaborate with our partners in DG CONNECT and other stakeholders.
Q: Could you tell us about the work conducted at ECAT?

Carlos: We will conduct work around three main pillars.

Firstly, to provide scientific and technical support to the regulator. The DSA gives the Commission exclusive enforcement capacity in relation to obligations on VLOPs, for whom algorithmic transparency will have the most impact. For example, VLOPs will have to complete self-assessments and transparency reports. In practice, ECAT will provide the scientific and technical support the Commission needs to enforce compliance. It is possible we would interact with VLOPs to understand how a particular system or service is functioning, whether it is biased, and whether it is fulfilling its purpose. The Commission has the capacity to inspect the premises and infrastructure of VLOPs, and we could provide technical support to enable them to understand how systems and their architecture function. We are not a regulatory body, but we will be at the disposal of the Commission to support its regulatory role regarding VLOPs. We are currently providing methodological support to DG CONNECT to calculate the threshold that determines whether a digital service provider is a VLOP. The actual calculation, generation of data and discussion with potential VLOPs is in the hands of DG CONNECT, but if they need any further scientific or technical help in this process, we have the mandate to intervene.

Secondly, to conduct research to develop the algorithmic governance field. Algorithmic transparency, algorithmic auditing and algorithmic monitoring are emerging areas. Unlike other industries, such as engineering, this is not an area with well-established methodologies that the industry follows. The JRC is primarily a research institution, so the idea is to push forward the knowledge of the field by helping to define methodologies that could be adopted by industry. We want to do this internally, but also through engagement with standards developing organisations, academia and industry.
At a high level, we have started our work on methodologies for audit. We are starting to run internal reviews and draw on our in-house knowledge. This is being used to support the development of the DSA’s delegated act on auditing (per Article 37.7). This delegated act, amongst others, will help with the implementation of the DSA. We would also like to understand the concept of ‘systemic risk’ in Article 34 of the DSA: what do we mean by this, and what are the long-term effects of these technologies on society? We are interested in taking an interdisciplinary approach to this question, since it concerns not only technology, the remit of software engineers and computer scientists, but also economists, social scientists and ethicists.

Finally, to act as a ‘dynamisor’ of networks. The DSA foresees the creation of multiple different networks. Article 40 foresees ‘vetted researchers’ who will have access to platform data and will be vetted by the digital services coordinators at Member State level. Here, ECAT will play a central role in dynamising the community, encouraging scientific debate, centralising results, and publishing them. We want to ensure the knowledge produced by the DSA comes back to society. The DSA also foresees the creation of an auditing community: Article 37.1 requires platforms to have independent audits of their compliance. We would like to understand the needs of this industry, and to engage with them and with standards developing organisations to eventually make some practices standard. Our goal is to make algorithmic transparency a design choice because, if this becomes embedded within the design of platforms, it creates a level playing field and much more certainty for consumers and citizens.
Q: How does ECAT fit into the wider regulatory environment?

Carlos: As discussed, ECAT will provide in-house scientific and technical support to the Commission as it supervises the DSA. This means that decision-making capacity remains with DG CONNECT. However, we still have an active role engaging with industry, academia and national regulatory authorities.

Under the DSA, Member States will create digital services coordinators, who have a key role in the full implementation of the legislation. Therefore, in each Member State there will be, or might already be, a technical service like ECAT. The idea is to have them as counterparts, to enable an informal exchange of experiences, knowledge and capacity. For example, we have already started to engage with our French counterpart PEReN. Articles 64 and 72 of the DSA also establish the capacity of the Commission to request support from Member States, and the corresponding duty to support the Commission. Under these legal provisions we therefore have the formal ability to collaborate on inspections or investigations as needed.

To be successful, the DSA will need to be implemented with industry, particularly if we want to succeed in our goal of algorithmic transparency by design. We will engage with VLOPs, industry and standards developing organisations to ensure that our methodological approaches become standards, and there are some provisions in the DSA to support this.
We also plan to engage with academia. As we discussed, we will have a central role with vetted researchers, but we will also engage with the wider academic community. We are actively starting conversations with organisations similar to the JRC, in EU and non-EU countries, to see if we could sign collaboration agreements for the exchange of good practices and knowledge. For example, we are talking to our counterparts at AIST in Japan to see if we could collaborate in this field. Civil society also has a key role to play, from raising the alert on potential breaches of the regulation to algorithmic monitoring and auditing. A lot of this engagement is happening in the coming weeks and months as we initiate the activities of ECAT and the Commission deploys the legal architecture of the DSA, so we foresee a very busy future!

Q: Is there anything else you’d like to share about ECAT?

Carlos: We are in the process of staffing up. We have just concluded a very successful recruitment campaign, with over 500 applications for 14 posts. We received applications from 26 of the 27 Member States, which was a great confirmation that people all over Europe are interested in joining this fantastic endeavour.
Thank you for reading. If you found this useful, forward it on to a colleague or friend. If this was forwarded to you, please subscribe!
If you have an event, interesting article, or even a call for collaboration that you want included in next month’s issue, please reply or email us at algorithm-newsletter@awo.agency. We would love to hear from you!
You are receiving Algorithm Governance Roundup because you have signed up for AWO’s newsletter mailing list. Your email and personal information are processed based on your consent and in accordance with AWO’s Privacy Policy. You can withdraw your consent at any time by clicking here to unsubscribe. If you wish to unsubscribe from all AWO newsletters, please email privacy@awo.agency.

AWO
Wessex House Teign Road Newton Abbot TQ12 4AA United Kingdom