-
Is there a single regulatory regime that governs software?
There is no single regulatory regime that governs software in the UK.
A number of laws and regulations across various sectors are, however, relevant to software. These include general regulations covering consumer protection, advertising, employment, intellectual property and data protection, amongst others.
In certain limited circumstances, there are software-specific regulations, such as those governing the use of software in medical devices, and those addressing harm caused by software and computer systems.
-
How are proprietary rights in software and associated materials protected?
In the UK, the main form of protection for software is through copyright, which applies to the software code (whether source code or any compiled version), any algorithms, graphics, video and audio recordings, and any documents, specifications, user manuals and other materials associated with it. Copyright arises automatically in the UK on creation of an original work, and there is no need to register this. Copyright generally protects against copying – there is no protection against independent creation of similar software or materials, where actual copying is not reasonably evident.
The functional aspects of any software application are not generally protectable in the UK unless it can be shown that software with similar functionality came about as a result of copying of any code, specification or design materials.
Software code “as such” is not patentable in the UK, but a software-related invention might be patentable depending on the context, i.e. where the contribution made by the software-related invention has a technical effect (such as where software is used to drive equipment, or improve performance of a computing system).
The “look and feel” of any software application may be protected by registered design in the UK, which can be a quick and cost-effective way to protect unique aspects of the user interface.
Software is often exploited in such a way that the source code and design materials are not disclosed to any licensee or user of the software. To the extent source code and design materials are kept confidential and reasonable steps have been taken to prevent disclosure to third parties, the laws of confidence and/or trade secrets will allow the software owner to protect and enforce their rights in that confidential information.
-
In the event that software is developed by a software developer, consultant or other party for a customer, who will own the resulting proprietary rights in the newly created software in the absence of any agreed contractual position?
In the UK, the first owner of copyright is the author or creator of the copyright works. Where a software developer, consultant or contractor creates software for a customer, then the software developer, consultant or contractor will own the copyright in the software code and any documents, specifications, graphics or other materials in the software.
The same is true for any invention or concept which may give software its technical effect: rights to any invention will remain with the software developer, consultant or contractor, who will be entitled to apply for patent protection.
If it is the intention that the customer owns all rights in software, it is, therefore, important to ensure that such rights, including in any inventions, concepts or ideas, and any software code, documentation or materials, are assigned to the customer under a written agreement which is signed by the software developer, consultant or contractor.
The position is made more complex where, for example, a software developer will use its own proprietary code, templates or specifications to build software for its customer for efficiency and to control costs. The software developer is unlikely to agree to assign its own proprietary materials to the customer, in which case, the customer should seek an assignment of any software and materials created specifically for it, together with a non-exclusive, perpetual licence to use the software developer’s own proprietary software and materials insofar as is necessary for the use and exploitation of the newly created software.
-
Are there any specific laws that govern the harm / liability caused by software / computer systems?
Liability for harm caused by software and computer systems is governed predominantly by either contract law or the law of negligence. Where a business provides software to a consumer, chapter 3 of the Consumer Rights Act 2015 sets out various implied contractual terms that govern such supply of software, including that it is of satisfactory quality, fit for purpose and as described. Chapter 3 of the Consumer Rights Act 2015 also provides various remedies if those statutory rights are not adhered to.
Outside of contractual liability, the tort of negligence may cover harm caused by software or computer systems, depending on the facts of any given case.
-
To the extent not covered by (4) above, are there any specific laws that govern the use (or misuse) of software / computer systems?
The Computer Misuse Act 1990 is the main legislation in this area: it criminalises access to computer systems and data which has not been authorised by the owner of the computer system.
-
Other than as identified elsewhere in this overview, are there any technology-specific laws that govern the provision of software between a software vendor and customer, including any laws that govern the use of cloud technology?
Generally speaking, there are no technology-specific laws that govern the provision of software between a software vendor and customer in the UK, and no specific laws that govern the use of cloud technology.
However, where the customer is a regulated financial services firm (hereafter referred to as a “regulated firm”), certain rules and guidance may apply in these circumstances. Which rules and guidance apply in particular instances, and the extent to which they apply, will depend on factors such as: the vendor’s role in practice, what the regulated firm’s activities are, and the impact that the service may have on those regulated activities.
In general, the rules and guidance issued by the Financial Conduct Authority (“FCA”) and the Prudential Regulation Authority (“PRA”) are intended to be technology-neutral. As such, the rules and guidance are not “technology-specific”, but apply in situations where the services provided are supported by technology, including cloud services.
Specific rules and guidance apply in two circumstances: (1) outsourcing, and (2) activities which may affect a regulated firm’s operational resilience (i.e. the ability of a regulated firm, and the financial sector as a whole, to prevent, adapt to, respond to, recover from and learn from operational disruptions, including disruptions caused by the failure of technology on which the regulated firm depends). In such circumstances, one or more of the following may be relevant:
- The FCA’s rules and guidance in chapter 8 of the Senior Management Arrangements, Systems and Controls handbook (“SYSC”) within the FCA Handbook. SYSC 8.1 applies to outsourcing, meaning it can apply to SaaS. Depending on what activities the regulated firm carries out, SYSC 8 applies either as guidance or as rules.
- The FCA’s Finalised Guidance FG16/5 for firms outsourcing to the cloud and other third-party IT services.
- The European Banking Authority (“EBA”) Guidelines on Outsourcing Arrangements dated 25 February 2019 (EBA/GL/2019/02).
- The PRA’s Supervisory Statement of March 2021 (SS2/21) on Outsourcing and Third Party Risk Management. Although directed at PRA-regulated firms (such as banks, building societies, PRA-designated investment firms, and insurance and reinsurance firms), the PRA’s drafting was reviewed by the FCA, and the Supervisory Statement is a useful tool in understanding how the EBA Guidelines are likely to be interpreted and applied by the UK regulators.
- The FCA’s rules and guidance relating to operational resilience, in SYSC. The main rules appear in SYSC 15A.
- The PRA’s rules and guidance relating to operational resilience, in the Operational Resilience sections of the PRA Rulebook. This is supplemented by PRA guidance in its Supervisory Statement of March 2021 (SS1/21) on Impact Tolerances for Important Business Services.
- A shared policy summary on operational resilience, from the Bank of England, PRA and FCA, dated March 2021.
In some instances, a regulated firm may have to consider the impact of technology on its activities even where the provision of services does not amount to outsourcing, or is not considered relevant to the regulated firm’s operational resilience. For example, the PRA’s SS2/21 also discusses third party arrangements which do not involve outsourcing. Moreover, where a regulated firm is subject to the FCA’s Consumer Duty, it will have to consider the impact of the services it receives from third parties on its ability to comply with the Consumer Principle and deliver good outcomes for retail customers.
In addition, sector-specific conduct rules may affect the services the regulated firm receives. For example, if a regulated mortgage lender uses a technology platform to support the provision of documentation to applicants and potential borrowers, it will have to ensure that the documentation produced complies with requirements set out in the Mortgages and Home Finance: Conduct of Business sourcebook.
In short, where the service recipient is a regulated firm then, depending on the services and the impact of those services on the firm’s activities, a complex and varied tapestry of rules and guidance could apply. Regulated firms should seek specialist guidance in this area, as there is no one-size-fits-all roadmap or solution for determining how to comply with the requirements.
-
Is it typical for a software vendor to cap its maximum financial liability to a customer in a software transaction? If ‘yes’, what would be considered a market standard level of cap?
Yes, it is typical for a software vendor to cap its maximum financial liability to a customer in a software transaction in the UK, although there may be certain areas of liability that are excluded from this cap (please see the response to Question 8 for further information on these excluded areas of liability).
There is no market standard level of cap in the UK, as a liability cap will depend on a range of factors unique to each transaction, including the respective negotiating positions of the customer and software vendor. That said, it is not unusual for the cap to range from 100% to 150% of the annualised or total value of the contract.
-
Please comment on whether any of the following areas of liability would typically be excluded from any financial cap on the software vendor’s liability to the customer (i.e. unlimited liability) or subject to a separate enhanced cap in a negotiated software transaction: (a) confidentiality breaches; (b) data protection breaches; (c) data security breaches (including loss of data); (d) IPR infringement claims; (e) breaches of applicable law; (f) regulatory fines; (g) wilful or deliberate breaches.
- Confidentiality breaches – No typical position – deal specific. A customer will generally push for this area of liability to be excluded from any financial cap, whereas a software vendor will typically resist this position and will require it to either be subject to the general cap on liability or to a separate enhanced cap.
- Data protection breaches – Same as confidentiality breaches (covered at (a)).
- Data security breaches (including loss of data) – Same as confidentiality breaches (covered at (a)).
- IPR infringement claims – In the absence of unique deal-specific reasons that require a contrary position, this area of liability is typically excluded from any financial cap (usually linked to the IPR infringement indemnity).
- Breaches of applicable law – Same as confidentiality breaches (covered at (a)), although it is not uncommon for breaches specifically of the Bribery Act 2010 and/or Modern Slavery Act 2015 by the software vendor to be excluded from any financial caps.
- Regulatory fines – Same as confidentiality breaches (covered at (a)).
- Wilful or deliberate breaches – In the absence of unique deal-specific reasons that require a contrary position, this area of liability is typically excluded from any financial cap. Please note, however, although there is English case law to aid interpretation, there is no single, settled legal definition of what constitutes a “wilful or deliberate breach”, so the customer and software vendor may wish to consider including an agreed definition of these terms within the contract.
-
Is it normal practice for software source codes to be held in escrow for the benefit of the software licensee? If so, who are the typical escrow providers used?
It is not uncommon in the UK for source codes to be held in escrow for the benefit of the software licensee, particularly where the software is either bespoke (and the software licensor has retained ownership of IP in the software) or performs critical operations for the software licensee.
Historically, source code escrow was used for traditional “on-premise” licences of software, but more recently, escrow providers offer an equivalent service for cloud-based software. Where there is a single-tenanted cloud environment, source code and root-level access credentials may be held in escrow, to allow the licensee to take control of the cloud environment in the event the vendor is no longer able to support it. Where the cloud environment is a “one to many” unrestricted cloud environment, the escrow provider may hold a separately hosted, mirrored instance of the cloud production environment, as well as the source code and certain data, to enable the licensee to migrate to the replicated environment if the vendor no longer supports the original service environment.
Commonly used escrow providers in the UK include Escrow London, Iron Mountain, LE&AS, NCC Group, and SES.
-
Are there any export controls that apply to software transactions?
The UK controls the export of software designed for military use, as well as “dual-use” software that can be adapted for military use. The UK government website lists items that are restricted by category. Anyone looking to export restricted items will require an export licence. It is a criminal offence to breach the export regulations.
Separately, there are complex issues that arise when dealing with the export of software to certain countries, especially China and Russia, in respect of sanctions rather than export controls. Anyone looking to export to a country where there are sanctions in place should consult the relevant regulations for that jurisdiction.
-
Other than as identified elsewhere in this questionnaire, are there any specific technology laws that govern IT outsourcing transactions?
There are no specific technology laws governing IT outsourcing in the UK.
-
Please summarise the principal laws (present or impending), if any, that protect individual staff in the event that the service they perform is transferred to a third party IT outsource provider, including a brief explanation of the general purpose of those laws.
The Transfer of Undertakings (Protection of Employment) Regulations 2006 (“TUPE”) provide the following significant protection for employees in an IT outsourcing situation:
- The primary purpose of TUPE is to automatically transfer the employment of individual staff from their current employer to a third party IT outsource provider on the same date that the service they perform is transferred to the third party IT outsourcing provider.
- The starting point under TUPE is that the individual employees transfer to the third party IT outsource provider on the same terms and conditions of employment (the name of their employer will change and they will also join or have the option of joining the pension scheme offered by the third party IT provider).
- The transfer of employment takes place automatically by operation of law under TUPE and is not something that parties can choose to ignore.
- TUPE provides enhanced protection to employees in outsourcing situations as the dismissal of an employee with at least 2 years’ continuous service where the sole or principal reason for the dismissal is the transfer itself is automatically unfair. The third party IT outsource provider must be able to show the dismissal was for an economic, technical or organisational reason that entailed a change in the workforce to avoid an automatic unfair dismissal finding, and even then, the dismissed employee can still challenge the fairness of their dismissal under general unfair dismissal law.
- The third party IT outsource provider is prevented from changing the terms and conditions of employees that transfer to it under TUPE if the sole or principal reason for the change is the transfer itself. Again, the third party IT outsource provider must be able to show that any changes to terms and conditions of employment are made for an economic, technical or organisational reason entailing changes in the workforce or the employment contract permits the change in question.
- TUPE requires the current employer to inform the employees about the proposed transfer and to consult with appropriate representatives of the employees if the third party IT outsource provider is proposing to take any measures/make changes to their employment terms after the transfer. The penalty for failing to comply with this obligation is a protective award of up to 13 weeks’ uncapped pay to each affected employee.
-
Which body(ies), if any, is/are responsible for the regulation of telecommunications networks and/or services?
Telecommunications networks and services are primarily regulated in the UK by the Office of Communications (“Ofcom”), a government-approved regulatory and competition authority.
Ofcom sets and enforces codes, regulations and guidance within the telecommunications sector.
While Ofcom is the main body responsible for telecommunications regulation, other bodies involved in regulating the sector include:
- The Competition and Markets Authority (“CMA”), which ensures adequate competition in the telecommunications sector. The CMA and Ofcom often collaborate on actions, such as providing joint statements on safety and competition, and they may cross refer investigations;
- The Information Commissioner’s Office (“ICO”), which monitors the telecommunications sector’s compliance with information rights and data privacy laws; and
- The National Cyber Security Centre (“NCSC”), which provides guidance to the telecommunications sector on cyber security issues.
-
Please summarise the principal laws (present or impending), if any, that govern telecommunications networks and/or services, including a brief explanation of the general purpose of those laws.
The primary legislation governing the UK telecommunications sector is the Communications Act 2003, as supplemented by the Wireless Telegraphy Act 2006.
The Communications Act 2003 established Ofcom as the independent regulatory body responsible for overseeing the telecommunications industry in the UK, and set out Ofcom’s duties. Together with the Wireless Telegraphy Act, the Communications Act provides Ofcom with powers within the UK, such as the right to grant licences to providers who wish to provide telecommunications services, and the enforcement of compliance with legislation and guidelines. Both acts provide a regulatory framework which seeks to protect consumers, ensure there is fair competition, and uphold standards of content.
Further laws that supplement the governance of the telecommunications sector in the UK include:
- The European Electronic Communications Code (“EECC”), which was transposed into UK law in late 2020. The EECC looks to improve service quality by making investment in infrastructures more attractive to companies, and to protect consumers by placing price limits on international calls, providing affordable services, and promoting better security;
- The Telecommunications (Security) Act 2021, which seeks to enhance the security of telecommunications networks across the UK by requiring providers to have measures in place to identify and then reduce the risk of security breaches;
- The Online Safety Bill (currently with the House of Lords at the Committee Stage), which will regulate social media sites and applications. It is expected that the Bill will have a direct impact on the telecommunications sector, as it is expected to affect how content is accessed through networks (see also Question 26 and Question 28);
- The UK General Data Protection Regulation, which sets out how organisations must collect, store, and use individuals’ data (see also Question 15); and
- The Privacy and Electronic Communications Regulations, which protect individuals’ privacy (see also Question 15).
-
Which body(ies), if any, is/are responsible for data protection regulation?
Data protection is primarily regulated in the UK by the Information Commissioner’s Office (“ICO”), an executive non-departmental public body.
-
Please summarise the principal laws (present or impending), if any, that govern data protection, including a brief explanation of the general purpose of those laws.
PRESENT
The General Data Protection Regulation (2016/679) (“EU GDPR”)
The EU GDPR enhances individuals’ data protection and privacy rights and harmonises the data protection laws within the EU, aiming to ensure that personal data is handled responsibly by organisations and in accordance with fundamental privacy principles. The EU GDPR has extraterritorial effect and will apply to UK-based controllers and processors who:
- are processing personal data in the context of activities of the controller or processor’s establishment in the EU; or
- offer goods or services to data subjects in the EU, or who monitor the behaviour of data subjects in the EU.
There are also implications for UK controllers who have an establishment in the EEA, have customers in the EEA, or monitor individuals in the EEA. The EU GDPR still applies to this processing.
UK GDPR
The EU GDPR is retained in modified form in the United Kingdom (“UK”) under the UK General Data Protection Regulation (“UK GDPR”). The key principles, rights and obligations of the UK GDPR remain the same as the EU GDPR. The UK GDPR also applies to controllers and processors based outside the UK if their processing activities relate to:
- offering goods or services to individuals in the UK; or
- monitoring the behaviour of individuals taking place in the UK.
The UK has the independence to keep this framework under review. The UK GDPR sits alongside the Data Protection Act 2018 (“DPA 2018”).
DPA 2018
The DPA 2018 initially set out permitted derogations and supplementary provisions to the EU GDPR, repealing and replacing the Data Protection Act 1998. The DPA 2018 sits alongside and supplements the UK GDPR (for example, it provides the exemptions from the UK GDPR).
Law Enforcement Directive EU 2016/680 (“LED”)
Part 3 of the DPA 2018 brought the LED into UK law. This complements the UK GDPR and sets out requirements for processing personal data for criminal law enforcement purposes. Part 3 of the DPA 2018 concerns the police and criminal justice and includes separate data protection rules for law enforcement authorities.
The Data Protection (Charges and Information) Regulations 2018
The Data Protection (Charges and Information) Regulations 2018 require every UK controller that processes personal information to pay a data protection fee to the ICO. The information provided to the ICO is published on a register. There are three different tiers of fee, and controllers are expected to pay between £40 and £2,900.
Freedom of Information Act 2000 (“FOIA”)
FOIA provides public access to information held by public authorities. It does this in two ways:
- public authorities are obliged to publish certain information about their activities; and
- members of the public are entitled to request information from public authorities.
FOIA covers any recorded information that is held by a public authority in England, Wales and Northern Ireland, and by UK-wide public authorities based in Scotland. Information held by Scottish public authorities is covered by Scotland’s own Freedom of Information (Scotland) Act 2002.
Privacy and Electronic Communications Regulations 2003 (“PECR”)
PECR are derived from European law. They implement European Directive 2002/58/EC, also known as ‘the e-privacy Directive’, which complements the general data protection regime and sets out more specific privacy rights on electronic communications. PECR cover:
- marketing by electronic means, including marketing calls, texts, emails and faxes;
- the use of cookies or similar technologies that track information about people accessing a website or other electronic service;
- security of public electronic communications services;
- privacy of customers using communications networks or services as regards traffic and location data, itemised billing, line identification services (eg caller ID and call return), and directory listings.
The EU is in the process of replacing the current e-privacy law with a new e-privacy Regulation (“ePR”), to apply alongside the EU version of the GDPR. However, the ePR will not automatically form part of UK law as the UK has left the EU.
Environmental Information Regulations 2004 (“EIR”)
The EIR provide public access to environmental information held by public authorities. They do this in two ways:
- public authorities must make environmental information available proactively; and
- members of the public are entitled to request environmental information from public authorities.
The EIR cover any recorded information held by public authorities in England, Wales and Northern Ireland. Environmental information held by Scottish public authorities is covered by the Environmental Information (Scotland) Regulations 2004.
Network and Information Systems Regulations 2018 (“NIS Regulations”)
The NIS Regulations intend to address the threats posed to network and information systems and therefore aim to improve the functioning of the digital economy. The NIS Regulations concern ‘network and information systems’ and their security. These are any systems that process ‘digital data’ for operation, use, protection and maintenance purposes. The NIS Regulations require these systems to have sufficient security to prevent any action that compromises either the data they store or any related services they provide.
Investigatory Powers Act 2016 (“IPA”)
The IPA provides a framework to govern the use and oversight of investigatory powers by law enforcement and the security and intelligence agencies. The IPA sets out the lawful acquisition of communications data, which is the “who, where, when, how and with whom” of a communication but not the content (i.e. what was said). The IPA builds on, and supersedes parts of, the Regulation of Investigatory Powers Act 2000 (“RIPA”). There are limited exceptions to the prohibitions in the Investigatory Powers (Interception by Businesses etc. for Monitoring and Record-keeping Purposes) Regulations 2018 (SI 2018/356).
Re-use of Public Sector Information Regulations 2015 (“RPSI”)
RPSI relates to public sector information produced as part of a public task. Public sector bodies have to publish a list of the main information they hold for the purpose of a public task. RPSI does not apply to information that would be exempt from disclosure under information access legislation (such as the DPA 2018 and FOIA).
Regulation (EU) 910/2014 on electronic identification and trust services for electronic transactions in the internal market (“eIDAS”)
Following the UK’s withdrawal from the EU, the eIDAS Regulation was adopted into UK law and amended by The Electronic Identification and Trust Services for Electronic Transactions (Amendment etc.) (EU Exit) Regulations 2019. In addition, the existing UK trust services legislation, The Electronic Identification and Trust Services for Electronic Transactions Regulation 2016 (SI 2016/696), was also amended. Electronic trust services can be used in a number of ways to provide security for electronic documents, communications and transactions, e.g. to help ensure that documents sent electronically have not been altered in any way and that the sender can be easily recognised. Electronic trust services allow such security properties to be applied and then validated, and thus help ensure confidence in the electronic transfer of information.
The UK eIDAS Regulations provide the legal framework for the use of electronic trust services offered within the UK and recognise equivalent services offered in the EU.
IMPENDING (as of July 2023)
UK Data Protection and Digital Information (No. 2) Bill (“DPDI No. 2 Bill”)
The DPDI No. 2 Bill aims to alleviate the burden of compliance with the UK GDPR and its implementation in the Data Protection Act 2018 for organisations in the UK. The bill includes changes to the UK GDPR, plus digital verification services, smart data schemes, DPO changes, new PECR powers and ICO reform (among others). Parliament introduced the bill, and simultaneously withdrew the No. 1 Bill, on 8 March 2023. The bill is currently at the Report stage in the House of Commons.
The bill’s main proposed changes are as follows:
- “Personal data” definition:
- information may be “personal data” if it is not sufficiently protected via appropriate measures that mitigate the risk of unauthorised persons obtaining such information.
- pseudonymised data is only personal data if it can be re-identified using reasonable means, i.e. that a person is “reasonably likely to use” (in view of the time, cost and effort involved, technology and resources available to the person).
- Legitimate interests – Direct marketing, the intra-group transmission of personal data, and security of network and information systems are listed as examples of recognised legitimate interests. A balancing test is still required.
- Transfers of personal data (i.e. data exports) – a risk-based approach to adequacy decisions made by the UK, to be termed “approved” transfers, under a “data protection test” whereby the standard of protection in the receiving country must be “not materially lower” than that under the UK regime.
- ICO – additional duties (e.g. having regard to the desirability of promoting innovation and competition); structural changes (Information Commission body); Secretary of State powers (e.g. to designate strategic priorities to which the ICO must have regard generally); and new ICO powers to require documents, reports (e.g. technical reports) and interviews, and to reject complaints under certain conditions (including no complaint to the controller first – new provisions specifically address complaints to controllers and their responses, and the Secretary of State could require reporting of complaints to the ICO).
- Data Subject Access Requests (DSARs) – controllers may refuse a request or charge reasonable fees if the request is “vexatious or excessive”.
- Further processing for scientific research purposes – Expands the scope to cover any research that can be described as scientific, privately or publicly funded, for commercial or non-commercial activity, including statistical research. Exempts controllers from notifying individuals about such processing if it takes disproportionate effort and data has been provided by them directly.
- Data Protection Officers (DPOs) – no DPO requirement, but a “senior responsible individual” (SRI), part of senior management, must be appointed for likely “high-risk” processing.
- Data Protection Impact Assessments (DPIAs) – only required for likely high-risk processing.
- Records of Processing Activities (ROPAs) – only required for likely high-risk processing, regardless of organisational size.
- Cookies consent exemptions – Includes collecting statistical information to make improvements, enabling the appearance or function of a website to reflect user preferences, installing necessary security updates to software on a device and identifying the individual’s geolocation in an emergency.
- Privacy and Electronic Communications Regulations (PECR) – Fines for failure to comply with the rules on direct marketing and cookie consent under the PECR are to be raised to the UK GDPR levels. Public electronic communication services and networks are to report suspicious direct marketing activity to the ICO.
Retained EU Law (Revocation and Reform) Bill (“REUL”) The REUL seeks to make major changes to the body of retained EU law in UK domestic law. A targeted list of around 600 specific pieces of secondary legislation and EU legislation is expected to be revoked at the end of 2023; this list does not currently include the UK GDPR or the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR).
The REUL does revoke one piece of data-related legislation, which concerns the IPA (see above). As the REUL was recently amended by the House of Lords, it is now awaiting further consideration in the House of Commons on 12 June 2023.
Online Safety Bill Introduced to Parliament on 18 January 2023, the Bill is aimed at social media, messaging, search and online advertising services. The Bill is currently at the report stage in the House of Lords. The Bill aims to protect children by requiring social media platforms to:
- Remove illegal content quickly or prevent it from appearing in the first place. This includes removing content promoting self-harm;
- Prevent children from accessing harmful and age-inappropriate content;
- Enforce age limits and age-checking measures;
- Ensure the risks and dangers posed to children are more transparent, including by publishing risk assessments;
- Provide parents and children with clear and accessible ways to report problems online when they do arise.
Platforms will need to:
- Remove all illegal content;
- Remove content that is banned by their own terms and conditions;
- Empower adult internet users with tools so that they can tailor the type of content they see and can avoid potentially harmful content if they do not want to see it on their feeds. Children will be automatically prevented from seeing this content without having to change any settings.
Ofcom will have powers to take action against companies which do not comply with their new duties. Companies can be fined up to £18 million or 10 per cent of their annual global turnover, whichever is greater. Criminal action can be taken against senior managers who fail to comply with information requests from Ofcom.
Digital Markets, Competition and Consumer Bill This introduces a competition regime for the largest and most powerful digital platforms, including a mandatory code of conduct and merger control. The draft bill was published in April 2023 and is now at Committee stage. This is likely to be in force from 2024 at the earliest.
Scope / application: Companies with ‘strategic market status’ (SMS) in respect of digital activity, and a link to the UK, with turnover of £1bn in the UK or £25bn globally.
Product Security and Telecommunications Infrastructure Act This introduces a product compliance regime to protect consumer connectable products against cyberattacks. Scope: manufacturers, importers and distributors of “connectable products”: an internet-connectable product, or a network-connectable product.
Received Royal Assent on 6 December 2022. Partially in force. Will require new regulations before it has any practical effect.
-
What is the maximum sanction that can be imposed by a regulator in the event of a breach of any applicable data protection laws?
UK GDPR and DPA 2018 The ICO can take enforcement action by issuing enforcement notices (imposing fines or the suspension or cessation of processing), assessment notices (for a compulsory audit) or information notices (requiring the provision of information for investigation). There are two tiers of fines that can be imposed by the ICO:
- A maximum fine of £17.5 million or 4 per cent of annual global turnover – whichever is greater – including for infringement of any of the data protection principles, rights of individuals or rules concerning restricted data transfers.
- A maximum fine of £8.7 million or 2 per cent of annual global turnover – whichever is greater – for infringement of other provisions, such as administrative requirements of the legislation.
EU GDPR The enforcement action that data protection regulators in EU Member States can take is generally similar to actions the ICO can take in the UK. There are two tiers of fines that can be imposed by data protection regulators in EU Member States:
- A maximum fine of €20 million or 4 per cent of annual global turnover – whichever is greater – including for infringement of any of the data protection principles, rights of individuals or rules concerning restricted data transfers.
- A maximum fine of €10 million or 2 per cent of annual global turnover – whichever is greater – for infringement of other provisions, such as administrative requirements of the legislation.
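The two-tier structure above follows a single mechanic: the statutory maximum is the greater of a fixed cap and a percentage of annual global turnover. As a minimal illustrative sketch (the function name and parameters are ours, not statutory terms), this can be expressed as:

```python
def statutory_max_fine(annual_global_turnover, fixed_cap, turnover_pct):
    """Illustrative sketch only: the maximum fine is the GREATER of the
    fixed cap and the stated percentage of annual global turnover.
    Integer arithmetic in whole pounds/euros avoids rounding issues."""
    return max(fixed_cap, annual_global_turnover * turnover_pct // 100)

# Higher tier under the UK GDPR: greater of £17.5m or 4% of turnover.
print(statutory_max_fine(1_000_000_000, 17_500_000, 4))  # → 40000000 (turnover-based cap applies)
print(statutory_max_fine(100_000_000, 17_500_000, 4))    # → 17500000 (fixed cap applies)
```

For organisations with large global turnover the percentage limb will almost always govern; the fixed cap bites only below roughly £437.5m of turnover at the higher tier.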
LED In the UK, ICO fines for law enforcement authorities are subject to the same financial limits as under the UK GDPR. In EU Member States, maximum fines are determined by Member State law.
The Data Protection (Charges and Information) Regulations 2018 A fine of up to £4,350 can be imposed by the ICO for failure to pay the data protection fee.
FOIA The ICO does not have the power to fine a public authority under FOIA. However, failure to comply with an ICO enforcement notice may lead to prosecution and a fine of up to £5,000 in the magistrates’ court and an unlimited fine in the Crown Court.
PECR The ICO can impose a fine of up to £500,000 for breach of the PECR. Note: the DPDI No.2 Bill (see above) looks to increase fines under the PECR to up to £17.5 million or 4 per cent of an organisation’s annual global turnover, whichever is greater.
EIR As under FOIA, the ICO has no direct power to fine. However, a controller who breaches the EIR and has been served with an enforcement notice can be prosecuted for failing to comply with that notice. This offence carries a maximum penalty of a £5,000 fine in the magistrates’ court and an unlimited fine in the Crown Court.
NIS Regulations The NIS Regulations set out a sliding scale of maximum financial penalties which can be imposed by the ICO:
- £1 million – for any contravention that the ICO determines was not ‘a material contravention’;
- £8.5 million – for a ‘material contravention which the ICO determines does not and could not have created a significant risk to, or significant impact on, or in relation to, the service provision by the OES* or RDSP*’;
- £17 million – for a ‘material contravention which the ICO determines has or could have created a significant risk to, or significant impact on, or in relation to, the service provision by the OES or RDSP’.
*An OES is an ‘operator of essential services’, and an RDSP is a ‘relevant digital service provider’.
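The sliding scale turns on two determinations by the ICO: whether the contravention was material, and whether it created (or could have created) a significant risk to or impact on service provision. A minimal sketch (the function name and boolean flags are ours, used purely for illustration) of the resulting penalty caps:

```python
def nis_max_penalty(material, significant_risk):
    """Illustrative sketch of the NIS Regulations' sliding scale of
    maximum ICO financial penalties (amounts in GBP, per the tiers
    described above). Flags reflect the ICO's determinations."""
    if not material:
        return 1_000_000       # not a material contravention
    if not significant_risk:
        return 8_500_000       # material, but no significant risk/impact
    return 17_000_000          # material, with significant risk/impact

print(nis_max_penalty(material=True, significant_risk=True))  # → 17000000
```

These figures are maxima on the ICO's discretion, not fixed tariffs; the actual penalty in any case may be lower.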
IPA Imprisonment for a term not exceeding 2 years, a fine, or both.
eIDAS The ICO can take action for breaches of eIDAS, including by imposing fines of up to £1,000.
In addition, the DPA 2018 includes provisions for criminal offences related to data protection, including:
- unlawful obtaining, disclosing, or selling of personal data. It is a criminal offence to intentionally or recklessly obtain, disclose, or sell personal data without lawful authority. This offence can be punishable by a fine or imprisonment.
- re-identification of de-identified personal data. Re-identifying previously de-identified personal data without lawful authority is a criminal offence, subject to fines or imprisonment.
- alteration of personal data to prevent disclosure. Knowingly altering, defacing, blocking, erasing, or destroying personal data with the intention of preventing its disclosure is an offence under the DPA 2018.
Offences committed by a person in an organisation
The DPA 2018 introduces the concept of “offences by bodies corporate.” This means that if an offence under the DPA 2018 is committed by an organisation, such as a company, partnership, or government body, the organisation can be held criminally liable. This includes offences related to data protection principles, appointment of a data protection officer, etc. Criminal penalties in the DPA 2018 apply to processing under the LED by competent law enforcement authorities.
-
Do technology contracts in your country typically refer to external data protection regimes, e.g. EU GDPR or CCPA, even where the contract has no clear international element?
In relation to the EU GDPR, yes, especially as a result of the extra-territorial effect of the EU GDPR.
References to other third country data protection laws (such as the CCPA) are not typically included in contracts, unless they are directly applicable to the processing carried out as part of the services provided under the contract.
-
Which body(ies), if any, is/are responsible for the regulation of artificial intelligence?
Currently (as of July 2023), there is no single body responsible for the regulation of artificial intelligence (AI) in the UK.
The UK government has established an “Office for Artificial Intelligence” (“OAI”), which sits within the Department for Science, Innovation & Technology, and which issues papers and guidance on the UK approach to AI regulation, but it is not directly responsible for regulation itself. Instead, a number of different bodies will have roles in ensuring AI is used safely and ethically in the UK. According to the OAI’s 2023 White Paper, these bodies are likely to include:
- The Information Commissioner’s Office (“ICO”): The ICO is responsible for protecting personal data, and has published guidance on AI and data protection. It is engaging with the government on further proposals for regulatory reform that will support the government’s pro-innovation approach to AI regulation;
- The Equality and Human Rights Commission (“EHRC”): The EHRC’s role may be important in setting standards for bias mitigation;
- The Employment Agency Standards Inspectorate (“EASI”): The EASI is likely to issue guidance for the employment sector;
- The Financial Conduct Authority (“FCA”): The FCA is carrying out consultations and is collaborating with the Digital Regulation Cooperation Forum to work through an appropriate framework for AI in financial services;
- The Intellectual Property Office (“IPO”): The Office for Artificial Intelligence has deferred to the IPO on intellectual property issues surrounding AI, although the IPO has not yet issued guidance.
Other bodies that have, or may have, a role in regulating AI in the UK may include:
- The Competition and Markets Authority (“CMA”): The CMA is responsible for promoting competition and preventing anti-competitive behaviour;
- The Medicines and Healthcare products Regulatory Agency (“MHRA”): The MHRA has already published guidance on how AI systems can be used in healthcare and medical devices.
Certain other organisations in the UK are also working to develop standards and best practices for the use of AI, including the Alan Turing Institute. Such organisations may play an advisory role in future regulation of AI in the UK.
-
Please summarise the principal laws (present or impending), if any, that govern the deployment and use of artificial intelligence, including a brief explanation of the general purpose of those laws.
Currently (as of July 2023), there are no laws dealing directly with artificial intelligence in the UK comparable with, for example, the EU’s AI Act. Instead, the principal laws governing the deployment and use of AI are existing laws relating to data protection and unfair competition, as well as certain specific guidelines such as those issued by the MHRA.
It seems likely that new laws and regulations, or modifications to existing laws and regulations, will be promulgated as the regulation of AI develops and evolves. In particular, we expect the UK government (via the IPO) to publish regulatory guidance around AI and intellectual property in the near future.
-
Are there any specific legal provisions (present or impending) in respect of the deployment and use of Large Language Models and/or generative AI?
No. However, the Office for Artificial Intelligence’s recent White Paper makes extensive reference to large language models (LLMs) and makes certain proposals around how LLMs could be regulated in the future, for example, through applying greater regulatory scrutiny and requirements for statutory reporting where an LLM is trained on quantities of data that exceed a certain limit (not yet specified).
LLMs are the focus of intense media scrutiny at the moment, which makes it more likely we will see the UK government or regulators proposing specific regulatory guidance in the near future. Having said that, the government in the White Paper repeatedly emphasises that regulation should remain proportionate so as not to stifle innovation in the space.
-
Which body(ies), if any, is/are responsible for the regulation of blockchain and / or digital assets generally?
There is no single body responsible for the regulation of blockchain or related digital assets in the UK. However, the UK government and regulators are taking steps to increase regulation in this sector. For example, HM Treasury has published a number of papers on the regulation of digital assets, and is working on a new regulatory regime.
Some bodies that are generally responsible for the regulation of digital assets in the UK include:
- The Financial Conduct Authority (“FCA”): The FCA’s current remit over digital assets is to ensure that crypto firms operating in the UK comply with anti-money laundering and counter-terrorism legislation. In respect of advertising of cryptoassets, the FCA only exercises powers where the cryptoassets are traded on existing regulated platforms, such as via a “Contract for Difference”.
- The Advertising Standards Authority (“ASA”): The ASA has oversight of issues across all forms of cryptoasset advertising, including in relation to NFTs.
-
What are the principal laws (present or impending), if any, that govern (i) blockchain specifically (if any) and (ii) digital assets, including a brief explanation of the general purpose of those laws?
In principle, any person can launch a protocol, smart contract, ledger or blockchain in the UK. This is a technological endeavour which is currently completely unregulated in the UK.
Taking a broader interpretation of the question, the issue at stake is the use and application of blockchain and associated digital asset technology.
In the UK, the regulatory regime specifically covering digital assets (including tokens, cryptocurrencies, NFTs and new forms of organisational structures (e.g. Decentralised Autonomous Organisations (“DAOs”))) is nascent. The UK does not have an equivalent of the EU’s Markets in Crypto Assets (“MiCA”) Regulation and one is not currently expected. MiCA treats “crypto assets” as an entirely new asset class. In contrast, the UK’s approach has been to treat some digital assets as within scope of the existing rules and others as outside the regulatory perimeter. In order to offer products and services in those assets inside the existing regulatory perimeter, a firm would need to be authorised and regulated in the usual way. The perimeter of the current rules is blurred: it is possible that some firms which offer products and services outside the scope of traditional regulation still operate their business in a way which requires them to become authorised and regulated under, for example, the Payment Services Regulations or the Electronic Money rules. The FCA has focused its efforts on publishing information for consumers about the risks of dealing with digital assets and with unauthorised firms (e.g. those based outside the UK). Outside those areas, some firms must register with the FCA under the money laundering rules.
The UK’s regulatory bodies (HM Treasury, the Bank of England, the PRA, and the FCA) are consulting widely on various topics. These include treatment of DAOs, digital securities, a central bank-backed digital currency (also known as the Digital Pound), amendments to the Financial Services and Markets Bill to capture some types of digital assets as if they were “traditional” financial instruments, and how the financial promotions rules will apply to digital assets. The International Organization of Securities Commissions (“IOSCO”) has published an 18-point plan covering areas including conflicts of interest, disclosure rules, and governance in digital assets. IOSCO’s approach is similar to the FCA’s in that it says:
“Regulators should use existing frameworks or New Frameworks to regulate and oversee crypto-asset trading, other crypto-asset services, and the issuing, marketing and selling of crypto-assets (including as investments), in a manner consistent with IOSCO Objectives and Principles for Securities Regulation and relevant supporting IOSCO standards, Recommendations, and good practices … The regulatory approach should seek to achieve regulatory outcomes for investor protection and market integrity that are the same as, or consistent with, those that are required in traditional financial markets.”
Shortly after the release of the IOSCO guidelines, the FCA held a press conference at which they presented the findings and a timetable for finalisation of relevant policy measures of “Q4 2023”. Much depends on the passing of the Financial Services & Markets Bill currently with the House of Lords. The Bank of England and the FCA (with the support and likely insistence of HM Treasury) would like to press ahead with “digital FMI” – large-scale financial infrastructure – projects, but cannot do so until the Bill becomes law.
A notable absentee from any current proposal is an update to the market abuse rules to cover digital assets specifically. To that extent, most secondary trading of digital assets is subject to the same (lack of) rules covering the spot foreign exchange markets. The lack of formal market abuse rules in this market has essentially created a two-tier ecosystem: the “wholesale” tier, where participants trade large blocks of currency pairs and adhere to voluntary ethics and conduct standards; and the “retail” tier, populated by social media influencers and unregulated online brokers which retain “last look” and (allegedly) front-run their own clients.
-
Are blockchain based assets such as cryptocurrency or NFTs considered “property” capable of recovery (and other remedies) if misappropriated?
Cryptoassets (which, although predominantly blockchain-based, may include assets created and managed through other technologies) are considered “property” at common law following the seminal case of AA v Persons Unknown re Bitcoin 2019. Here the Applicant was a London-based insurer who had paid Bitcoin on behalf of its Canadian client following a ransomware attack and had relied on blockchain analytics to trace those Bitcoin to a cryptocurrency exchange in the BVI. The Applicant sought a proprietary injunction, amongst other things, to restrict use of the proceeds of those Bitcoin, and the Judge considered whether a proprietary injunction could apply to cryptoassets broadly. The Judge in making his decision relied in part on the UK Jurisdiction Taskforce Paper “Crypto Assets and Smart Contracts” dated 11 November 2019. The Judge stated, at paragraph 61, “I am satisfied for the purpose of granting an interim injunction in the form of an interim proprietary injunction that crypto currencies are a form of property capable of being the subject of a proprietary injunction”. The Judge granted a proprietary injunction over the traced proceeds of the applicant’s Bitcoin, with cryptoassets thereby becoming property at common law.
This decision was recently affirmed by the Court of Appeal in Tulip Trading v Bitcoin Association et al, 2023, at paragraph 24. Although the decision in AA was at first instance and at an interim stage (i.e. not dealt with in substantive proceedings), Tulip Trading cemented AA’s position as confirming cryptoassets broadly as property.
Non-fungible tokens (“NFTs”) have been considered separately by the English courts, and explicitly referred to as property. The case of Osbourne v Persons Unknown was similar to AA in that a proprietary injunction was granted over misappropriated NFTs. The Judge in Osbourne noted, “I am satisfied… that there is at least a realistically arguable case that such tokens are to be treated as property as a matter of English law”.
In any event, the Law Commission has published a Digital Assets Consultation Paper, which contains provisional law reform proposals to ensure that the law recognises and protects digital assets (including crypto-tokens and cryptoassets). The Paper notes, “We provisionally propose the explicit recognition of a “third” category of personal property distinct from things in possession and things in action, which would allow for a more nuanced consideration of new, emergent, and idiosyncratic objects of property rights. We label this category “data objects””. Following responses from third parties, a final report with law reform recommendations is to be published in the second half of 2023. On that basis, cryptoassets may fall into the legislative framework as a “data object”, where they will be treated as property so long as they: 1. are “composed of data represented in an electronic medium, including in the form of computer code, electronic, digital or analogue signals”; 2. “exist independently of persons and the legal system”; and 3. are “rivalrous”.
Cryptoassets are, therefore, capable of being treated as property at common law. This will potentially be clarified in the near future at the legislative level.
There are recent instances where the English courts have ordered recovery of cryptoassets from other jurisdictions via a civil route, including the recent case of Joseph Keen Shing Law v Persons Unknown and Huobi Global Ltd (unreported), where a court designed a mechanism to compel the return of cryptoassets.
-
Which body(ies), if any, is/are responsible for the regulation of search engines and marketplaces?
There is no single body responsible for the regulation of search engines or marketplaces in the UK. However, a combination of the Competition and Markets Authority, Trading Standards, the Advertising Standards Authority, Ofcom and the Financial Conduct Authority may each regulate certain aspects of a search engine or marketplace (to the extent there are any regulated activities).
-
Please summarise the principal laws (present or impending), if any, that govern search engines and marketplaces, including a brief explanation of the general purpose of those laws.
Currently, search engines and marketplaces in the United Kingdom are primarily governed by general laws that apply to online services and information providers. Whilst there are no specific laws dedicated to search engines and marketplaces, the following legal frameworks are relevant:
- Electronic Commerce (EC Directive) Regulations 2002 (amended through the Electronic Commerce (Amendment etc.) (EU Exit) Regulations 2019 following Brexit) (“E-Commerce Regulations”): The E-Commerce Regulations apply to virtually every commercial website, including marketplaces which are considered “information society services”. The E-Commerce Regulations impose certain obligations on marketplace operators, including the requirement to provide mandated information to users and have prescribed features and functions of the site relating to contract formation.
- Platform to Business Regulations (Retained Regulation (EU) 2019/1150 on promoting fairness and transparency for business users of online intermediation services and corporate website users of online search engines (also known as the UK Platform-to-business Regulation or UK P2B Regulation) as amended by the Online Intermediation Services for Business Users (Amendment) (EU Exit) Regulations 2020, SI 2020/796) (“P2B Regulations”): The focus of the P2B Regulations is to regulate the relationship between business users and search engines and online intermediation services (such as marketplaces) that use them to sell products or services. The P2B Regulations seek to ensure that the platforms operated by these types of intermediaries deal with their business users fairly and in a transparent manner. The rules ban certain unfair practices, such as changing online terms and conditions without cause, and mandate transparency over the ranking of search results.
- Online Safety Bill (currently at the report stage in the House of Lords): The Bill will make search engines legally responsible for protecting the online safety of their users, requiring that they remove harmful or illegal content quickly or prevent it from appearing in the first place.
- Advertising Standards: Those search engines or marketplaces that display advertisements must comply with advertising standards and ensure that adverts on the platform meet the regulations set by the Advertising Standards Authority, such as the UK Code of Non-broadcast Advertising and Direct & Promotional Marketing, which is the rule book for non-broadcast advertisements, sales promotions and direct marketing communications.
- Data Protection Law: Those search engines or marketplaces that process personal data of users must comply with all applicable laws and regulations in the UK relating to privacy and the processing of personal data relating to data subjects located in the UK, including the UK General Data Protection Regulation (as defined in The Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019), the Data Protection Act 2018, and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (SI 2426/2003).
- Consumer Protection Laws (including the Consumer Rights Act 2015 (“CRA”) and the Consumer Protection from Unfair Trading Regulations 2008 (“CPRs”)): Search engines and marketplaces must comply with the requirements set out in the CRA, including ensuring any consumer terms and notices comply with the requirements of fairness and transparency (this would include any “buyer” terms and conditions or rules that feature on a marketplace). In addition, platforms must comply with the requirements set out under the CPRs, which prohibit certain commercial practices (including misleading actions, misleading omissions, and aggressive practices) during the whole lifetime of a consumer-to-trader transaction (i.e. advertising, marketing, entry into the contract, performance and enforcement).
-
Which body(ies), if any, is/are responsible for the regulation of social media?
There is no single body responsible for the regulation of social media in the UK. However, a combination of the Competition and Markets Authority, Trading Standards, the Advertising Standards Authority (“ASA”), Ofcom and the Financial Conduct Authority may each regulate certain aspects of social media (to the extent there are any regulated activities).
-
Please summarise the principal laws (present or impending), if any, that govern social media, including a brief explanation of the general purpose of those laws?
Social media platforms in the UK are currently governed primarily by general laws that apply to online services and information providers. Whilst there are no specific laws dedicated to social media platforms, the following legal frameworks are relevant:
- The Statutory Code of Practice (“Code”): The Code for providers of online social media platforms was published in accordance with Section 103 of the Digital Economy Act 2017. The Code provides guidance for social media platforms. It sets out actions that the Government believes social media platforms should take to prevent bullying, insulting, intimidating and humiliating behaviours on their sites. The Code is directed at social media platforms, but is also relevant to any sites hosting user-generated content and comments, including review websites, gaming platforms, online marketplaces and the like. The Code does not affect how illegal or unlawful content or conduct is dealt with.
- Online Safety Bill (“Bill”): The Bill, currently at the report stage in the House of Lords, will make social media platforms legally responsible for protecting the online safety of their users, in particular minors. It will protect users by requiring social media platforms to: remove illegal content quickly or prevent it from appearing in the first place, enforce age limits and age-checking measures, and provide parents and children with clear and accessible ways to report problems online when they do arise. The Bill will also empower adult internet users with tools so that they can tailor the type of content they see and avoid potentially harmful content. Ofcom, as regulator, will have powers to take action against companies who do not follow their new duties.
- Electronic Commerce (EC Directive) Regulations 2002 (amended through the Electronic Commerce (Amendment etc.) (EU Exit) Regulations 2019 following Brexit) (“E-Commerce Regulations”): The E-Commerce Regulations apply to virtually every commercial website including social media platforms (which are considered “hosting services”, since they typically host and display user generated content). The E-Commerce Regulations impose certain obligations on social media platforms, including the requirement to provide mandated information. The E-Commerce Regulations seek to protect social media platforms from liability caused by content posted by users.
- Platform to Business Regulations (Retained Regulation (EU) 2019/1150 on promoting fairness and transparency for business users of online intermediation services and corporate website users of online search engines (also known as the UK Platform-to-business Regulation or UK P2B Regulation) as amended by the Online Intermediation Services for Business Users (Amendment) (EU Exit) Regulations 2020, SI 2020/796) (“P2B Regulations”): The focus of the P2B Regulations is to regulate the relationship between business users and online intermediation services (e.g. marketplaces, social media platforms etc) and search engines. The P2B Regulations seek to ensure that the platforms operated by these types of intermediaries deal with their business users fairly and in a transparent manner. The rules ban certain unfair practices, such as changing online terms and conditions without cause, and mandate transparency over the ranking of search results.
- Advertising Standards: Social media platforms that display advertisements must comply with advertising standards and ensure adverts meet the regulations set by the Advertising Standards Authority, such as the UK Code of Non-broadcast Advertising and Direct & Promotional Marketing (“CAP Code”), which is the rule book for non-broadcast advertisements, sales promotions and direct marketing communications. The CAP Code covers many different types of advertising in social media, from the more traditional ‘paid-for’ ads to advertorials and affiliate marketing. The CAP Code requires that all marketing, including that on social media, is legal, decent, honest and truthful, and contains general rules and sector-specific rules that marketers must comply with. The CAP Code also requires that marketing communications are obviously identifiable as such and sets out further rules around influencer marketing.
- Data Protection Law: Social media platforms that process personal data of users must comply with all applicable laws and regulations in the UK relating to privacy and the processing of personal data relating to data subjects located in the UK, including the UK General Data Protection Regulation (as defined in The Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019), the Data Protection Act 2018, and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (SI 2003/2426).
- Consumer Protection Laws (including the Consumer Rights Act 2015 (“CRA”)): Social media platforms must comply with the requirements set out in the CRA, including ensuring any consumer terms and notices comply with the requirements of fairness and transparency (this would include any platform terms of use, for example).
-
What are your top 3 predictions for significant developments in technology law in the next 3 years?
- Further developments in laws and regulations governing the use of artificial intelligence. Laws and regulations around AI are already developing at a frenetic pace in the UK. It seems likely this trend will continue as the UK Government and regulators attempt to keep up with the rapid development of AI technology. It seems particularly likely that regulation will focus on the responsible use of generative AI and other LLMs. In that respect, there is likely to be debate and conflict around the open-source development of LLMs (which may run the risk of complicating or evading conventional regulatory oversight).
Another likely (and probably necessary) development will be international standardisation of AI regulation, or the establishment of a body with international oversight. While this seems a somewhat distant prospect at present, and could be complicated by political competition around AI development, there are already widespread calls for the development of a global, neutral, non-profit agency for governing AI, with some likening the necessary global co-operation to that behind the International Atomic Energy Agency and the International Civil Aviation Organisation.
- Further developments in laws and regulations governing the “Web3” space, especially blockchain and cryptoassets. As with AI, more regulation seems likely in the Web3 space to keep up with the rapid development of products in this area, particularly cryptoassets. Although some of the fervour around certain cryptoassets, for example NFTs, seems to have abated somewhat recently, Web3 and, in particular, the decentralisation of the Internet is still an area attracting a huge amount of investment in the UK, and one with significant complexity and commensurate risk, making it ripe for further regulation. The UK Government (HM Treasury) has already published a consultation on a future financial services regulatory regime for cryptoassets, which itself followed a 2022 consultation on the regulation of fiat-backed stablecoins. Significant development in the regulation of cryptoassets seems likely following the consultation, and wider regulation of the Web3 space also seems likely as the technology develops.
- Developments in privacy law to keep pace with the greater use of technology by government bodies and law enforcement.
As government agencies and law enforcement make greater use of technology (in particular, AI, facial recognition, etc), it seems probable that there will be increased concerns around how these technologies affect and potentially impinge on individuals’ privacy rights and civil liberties. This may lead to the UK Courts ruling on the legality of such use of technology by the Government and other public services as well as by corporations such as tech giants and/or social media companies. Given how long a prominent case may take to work its way through the UK Court system, it may be unlikely that we will see a UK Court judgment that significantly alters the privacy landscape during the next three years. Having said that, the level of interest and concern around privacy in relation to developments in AI and other technologies does seem likely to lead to some attendant developments in the law during that period.
-
Do technology contracts in your country commonly include provisions to address sustainability / net-zero obligations or similar environmental commitments?
It is increasingly common for customers to request provisions relating to sustainability in their contracts.
Often, technology vendors’ public-facing websites have sections that deal with their commitments to sustainability (sometimes as part of their ESG reporting), often containing extensive reporting data. For technology vendors with a global presence (e.g. cloud services ‘hyperscalers’), this data will usually be presented as global figures, so it may be difficult to glean UK-specific information from such websites without requests for further information from the vendors.
Technology vendors typically resist inserting sustainability commitments at a contractually binding level with individual customers. As an alternative, they may agree to provide fuller information than that contained on their public websites for review, including country-specific data and/or scorecards/reviews from external sustainability ratings agencies such as EcoVadis.
In addition, UK Government entities, certain large UK corporates and financial institutions (e.g. prominent banks) may be subject to extra regulatory scrutiny around their sustainability/net-zero commitments. Where such an entity is the customer in a technology transaction, it may be possible to negotiate some contractual-level commitments around sustainability from technology vendors. However, the commitments are likely to be light, given practicalities and technology vendors’ reluctance to be subject to external oversight and/or obligations from customers in respect of their own sustainability commitments (which are often coordinated globally at large technology vendors).
United Kingdom: TMT
This country-specific Q&A provides an overview of TMT laws and regulations applicable in the United Kingdom.