EBA reports on adoption of FinTech: What legal issues should you consider when introducing new technologies?

United Kingdom | Scotland

The European Banking Authority (“EBA”) recently released a report analysing the key risks and opportunities for seven applications of innovative technologies in the financial services sector currently in use around Europe (available here, “Report”). The Report aims to “raise awareness, within the supervisory community and the industry, of current and potential FinTech applications, [and] to provide a balanced analysis of associated potential prudential risks and opportunities that may arise”.

This article sets out some of the key legal themes that should be front of mind if you are considering introducing, increasing or changing your use of any of these FinTech applications.

Applications considered by the Report

In its Report, the EBA considers the operational pros and cons of each of the FinTech applications listed below:

  • Biometric authentication for mobile apps using fingerprint recognition
  • Use of robo-advisors to provide automated investment advice
  • Use of big data and machine learning to support credit scoring
  • Use of distributed ledger technology (DLT) and smart contracts in trade finance
  • Use of DLT to streamline client due diligence and KYC processes
  • Mobile payments and wallets using near-field communication (NFC) technology
  • Outsourcing core banking and payment systems to public cloud.

The EBA states that these applications were selected as they are all currently in use across the financial services sector (but with varying degrees of integration across Europe). It considers them to represent a strong mix of present and nearly present use cases of FinTech in both the retail and non-retail banking market.

What legal issues should you consider?

If you are looking to introduce these applications, or to increase or change your use of them, we recommend that you consider the following key legal issues.

Data protection

Given the recent focus on data protection, with the introduction of the GDPR, it will be no surprise that data protection is a key concern when introducing innovative technologies into financial services. You should consider in particular:

  • Which lawful basis for processing are you relying on? If you are processing sensitive personal data, you will need an additional lawful ground to process this information. This may apply, for example, where you use biometric data to perform customer authentication, depending on whether the data is processed through your technology solution or locally on the customer’s own device.
  • Has sufficient fair processing information been provided to all the individuals whose personal data is being processed? Where you do not have a direct relationship with an individual, it can be harder to ensure, when acting as a data controller, that you provide them with sufficient fair processing information (usually a privacy notice). For example, if you collect data from a customer’s social media feed relating to their friends or contacts for credit scoring purposes, how do you provide those third parties with information about your processing activities?
  • Have you performed a Data Protection Impact Assessment (“DPIA”)? A DPIA is required where data processing is likely to result in a high risk to the rights and freedoms of individuals, in particular when introducing new technologies. It must describe the nature and scope of the proposed processing, assess whether the processing is proportionate to your aims, and consider whether additional safeguards could achieve the same aims while better protecting individuals. If the DPIA indicates a high risk that you cannot mitigate, you must consult the data protection regulator (in the UK, the Information Commissioner’s Office) before starting the processing.
  • Have you embedded privacy into the technology solution? The concepts of privacy by design and data minimisation (i.e. processing only the minimum amount of personal data required to achieve your aim) are particularly important in the financial services industry, where there is a historic tendency to store more data than required, and for longer than is necessary, both to protect against legal claims and because of legacy IT systems.
  • If you receive a request from a data subject to exercise his or her rights, would you be able to fulfil it, and how easily? For example, you should ensure that you can isolate the personal data relating to one individual in your systems in order to process it independently of other data.
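As a minimal sketch of the last point, the snippet below shows what isolating one data subject's records across several datasets might look like. All dataset names, record structures and the "subject_id" key are hypothetical; a production system would also need to cover backups, logs and data held by processors.

```python
# Hypothetical sketch: locating and erasing one data subject's records
# across several internal datasets, so a rights request (access,
# erasure, portability) can be serviced independently of other data.

def extract_subject_data(datasets, subject_id):
    """Return every record held about a single data subject,
    grouped by the dataset it came from."""
    found = {}
    for name, records in datasets.items():
        matches = [r for r in records if r.get("subject_id") == subject_id]
        if matches:
            found[name] = matches
    return found

def erase_subject_data(datasets, subject_id):
    """Remove the subject's records in place (a simplified erasure)."""
    for name, records in datasets.items():
        datasets[name] = [r for r in records if r.get("subject_id") != subject_id]

# Example with two hypothetical datasets
datasets = {
    "accounts": [
        {"subject_id": "c-001", "name": "A"},
        {"subject_id": "c-002", "name": "B"},
    ],
    "credit_scores": [
        {"subject_id": "c-001", "score": 640},
    ],
}
print(extract_subject_data(datasets, "c-001"))
```

The point of the exercise is architectural: if personal data cannot be reliably keyed back to an individual across systems, requests of this kind become very difficult to honour.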

IT outsourcing and security

Although important for any technology outsourcing, it is crucial for financial services entities to ensure that the technology being procured is robust and fit for purpose – whether focused on retail or trade finance. In particular, you should ensure that you:

  • Have the protection of appropriate contractual business continuity arrangements, service levels and, as relevant, a related service credit regime.
  • Comply with SYSC 8 if you are a relevant authorised firm engaging in a material outsourcing, i.e. one whose failure could cause service disruption or otherwise cast serious doubt on the firm's continuing satisfaction of the threshold conditions, or its compliance with relevant FCA or PRA principles or rules.
  • Have adequate information security standards within your contracts, including tying security arrangements to a recognised industry standard (such as PCI DSS) if possible, having benchmarking and continuous improvement provisions in the contract, and having appropriate audit rights.

You should also consider the EBA’s draft guidelines on outsourcing (here), which are open for consultation until 24 September 2018, its recommendations on outsourcing to the public cloud (here), and the FCA’s own guidance (our summary here).

Customer fairness and transparency

As technology solutions become more complex, it becomes increasingly important that you understand the underlying impact of any technology on customers – especially if you are providing financial services to consumers. For example, you must understand the criteria that your algorithm may use to give automated investment advice or make any decisions relating to credit scoring.

You should take care to ensure that your technology solution does not discriminate against consumers on the basis of characteristics protected under the Equality Act 2010 or, if you are regulated, fall foul of the FCA’s Treating Customers Fairly (TCF) principles.
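As a crude illustration of the kind of outcome check that can help surface potential discrimination, the sketch below (all data and the grouping are hypothetical) compares automated credit decision approval rates across groups. This is a simple screen, not a legal assessment: a real review would also involve statistical testing and examining the model's inputs for proxy variables.

```python
# Hypothetical sketch: compare approval rates of an automated credit
# decision across groups defined by a protected characteristic, and
# flag large disparities for human review.

def approval_rate_by_group(decisions):
    """decisions: list of (group, approved) pairs.
    Returns the approval rate per group."""
    totals, approved = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        if ok:
            approved[group] = approved.get(group, 0) + 1
    return {g: approved.get(g, 0) / totals[g] for g in totals}

# Hypothetical decision log: (group, was the application approved?)
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False),
]
rates = approval_rate_by_group(decisions)

# Flag for review if rates diverge beyond a chosen (illustrative) threshold
disparity = max(rates.values()) - min(rates.values())
needs_review = disparity > 0.2
print(rates, disparity, needs_review)
```

The appropriate threshold, grouping and remediation are matters of judgement and legal advice; the value of a check like this is simply that it makes disparities visible before a regulator or claimant does.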

For further information on the Report, the FinTech applications it considers or legal issues when considering these, please contact the authors.