What GDPR Means for Fraud Prevention


The General Data Protection Regulation (GDPR) is a piece of EU legislation that directly affects every organization or person processing the personal information of individuals. Organizations that do not fully understand their fraud operations will end up with flawed or incomplete GDPR compliance. Some will be found lacking, and the consequences could be expensive for a company's bottom line and damaging to its reputation. It is therefore vital that fraud teams liaise with any GDPR projects or functions, both in preparation for the 25 May implementation date and beyond. In-house fraud prevention teams and fraud solution vendors should step forward now and take the lead in an area so critical to them.

In essence, GDPR gives people whose data is stored by organizations notice of what is done with that information, rights over what happens to it and where it may be transferred, and a degree of confidence that the data is secure.

GDPR becomes applicable across all EU member states on 25 May 2018; however, some countries are putting additional legislation in place to clarify national options, such as the age at which children can consent to data processing. In the UK, for example, the Data Protection Bill 2018 plans to lay out personal liabilities for directors, managers and principals of businesses for breaches of data protection law.

Back to GDPR basics

Many misconceptions exist regarding GDPR. In general, the easiest way to fail to comply with GDPR is to assume it doesn’t apply in an individual case. It is therefore worth reviewing the essentials of the legislation: when it applies, to whom it applies, what constitutes personal data, and how compliance will be implemented and assessed.

  • Personal data is defined as data about an identifiable, living person – the data subject. Identifiable goes beyond data containing obvious identifiers such as name, date of birth, passport number, mobile phone number and IP address to include any data from which the identity of the data subject could reasonably be deduced. Only if a data subject could not reasonably be identified from the data held can it avoid being viewed as personal data. It is not yet completely clear whether all or some behavioral biometrics data falls under the definition of identifiable.
  • The regulation applies if either the organization or person handling data or the data subject is in the EU/EEA. This means that GDPR applies to businesses outside the EU processing data of subjects inside the EU and to EU businesses handling data of persons outside the EU.
  • GDPR defines two categories of organizations (or, in some cases, persons) which handle data: controllers, who determine the purpose and means of data processing, and processors, who act on the instructions and with the agreement of a controller.
  • Data processing is defined broadly to include all activities from capture to storage, manipulation, organization and augmentation, as well as archiving and deletion.
  • Finally, GDPR will be implemented and assessed by supervisory authorities in the EU member state where the organization in control of processing has its main establishment. However, data subjects are able to complain to their local supervisory authority. Data subjects gain specific rights over how their data is processed, including the right to object and to require rectification if there are errors.

The new law isn’t dramatically different from previous data protection law, but it does add some new obligations on controllers and processors.  It also codifies the legal basis under which processing may take place.

6 key principles of GDPR

GDPR has six principles, which parallel earlier EU member states’ laws. What is new is that for the first time data controllers are obliged to demonstrate their compliance with the regulation upon request of the supervisory authority. Previously, compliance inquiries would only come about if a complaint had been received.

There are six types of legal basis for processing personal data. The least suitable for most business operations is the consent of the data subject. This consent must be specific, informed, unambiguous and freely given, and must be as easy to revoke as it was to give. Because of this last aspect, many organizations would not be able to cope with loss of consent for their normal operations and may be best advised to look for an alternative legal basis.

The three most common reasons for processing personal data for most commercial operations will be performance of a contract to which the data subject is party, meeting an obligation of EU or member state law or in the controller’s legitimate interest (as long as that does not conflict with the data subject’s own rights and freedoms). The other legitimate reasons might be public interest or the vital interest of the data subject. In many cases there are multiple reasons for data processing within a single relationship with a customer: fulfilment of a contract, legal obligations such as payroll and tax reporting and the legitimate interest of the organization to keep a customer informed about the product or operational matters. This is what can make it complicated to map personal data to a set of purposes, each with a lawful basis.
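Mapping data categories to purposes, each with its lawful basis, is essentially a bookkeeping exercise, and can be sketched as a simple register. The field names, purposes and bases below are illustrative assumptions, not taken from the regulation or any official guidance:

```python
# Hypothetical "record of processing" register: each category of personal
# data maps to one or more purposes, each relying on a lawful basis.
# All entries are illustrative assumptions for this sketch.

LAWFUL_BASES = {
    "contract", "legal_obligation", "legitimate_interest",
    "consent", "public_interest", "vital_interest",
}

# One data category can serve several purposes within a single
# customer relationship, each with its own basis.
PROCESSING_REGISTER = {
    "name_and_address": [
        ("order_fulfilment", "contract"),
        ("tax_reporting", "legal_obligation"),
    ],
    "payment_history": [
        ("order_fulfilment", "contract"),
        ("fraud_prevention", "legitimate_interest"),  # cf. Recital 47
    ],
    "marketing_preferences": [
        ("newsletter", "consent"),
    ],
}

def unlawful_entries(register):
    """Return (data_category, purpose) pairs whose basis is not recognized."""
    return [
        (category, purpose)
        for category, purposes in register.items()
        for purpose, basis in purposes
        if basis not in LAWFUL_BASES
    ]

print(unlawful_entries(PROCESSING_REGISTER))  # → [] (every purpose has a basis)
```

A register like this makes it easy to spot processing that has no documented lawful basis before a supervisory authority asks the same question.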

For some special categories of data explicit consent is required from the data subject due to the risks of it being lost or used incorrectly. This includes types of information previously covered by regulation such as health records and political views, but GDPR also extends this to genetic data and biometric information when used to identify a person. In these cases no other legal basis is appropriate, and the exemptions are few and unlikely to apply to most organizations.

The legislation also requires that controllers process the minimum amount of data required for the task. This is, again, not new and is referred to as data minimization. Organizations that continue to store information “just in case it becomes useful” are likely to be in breach of this principle. New systems are also expected to implement “data protection by design”, which in practice means ensuring that GDPR principles and rules are enshrined in software development.
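A periodic audit of stored records against documented purposes and retention periods is one way to operationalize data minimization. The purposes and retention limits below are invented for the example, not prescribed by GDPR:

```python
from datetime import date, timedelta

# Illustrative sketch: flag records that no longer map to a documented
# purpose, or that have been kept beyond their retention period.
# Purposes and limits are assumptions for this example.
RETENTION = {
    "order_record": timedelta(days=365 * 6),  # e.g. tax obligations
    "fraud_alert": timedelta(days=365 * 2),
}

def records_to_review(records, today):
    """Return ids of records with no documented purpose or past retention."""
    flagged = []
    for rec in records:
        limit = RETENTION.get(rec["purpose"])
        if limit is None:                      # "just in case" data
            flagged.append(rec["id"])
        elif today - rec["created"] > limit:   # kept longer than justified
            flagged.append(rec["id"])
    return flagged

records = [
    {"id": 1, "purpose": "order_record", "created": date(2017, 1, 1)},
    {"id": 2, "purpose": "might_be_useful", "created": date(2018, 1, 1)},
    {"id": 3, "purpose": "fraud_alert", "created": date(2014, 1, 1)},
]
print(records_to_review(records, date(2018, 5, 25)))  # → [2, 3]
```

Record 2 has no documented purpose at all, and record 3 has outlived its retention period; both are exactly the kinds of data the minimization principle expects to be reviewed or deleted.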

GDPR permits automated decision-making, including profiling (which the regulation defines), but data subjects must be informed that it is happening, gain a right to object and must have an opportunity to present their case to a person. Given the widespread adoption of automated fraud tools over the last twenty years, and the further adoption of AI and machine-learning techniques, this is an important requirement to bear in mind.

GDPR requires contracts between controllers and processors to have specific clauses including timely notification of breaches of personal data security. These may be breaches of cyber security, but may just as easily occur by inadvertent disclosure on paper, USB memory sticks or in communication. GDPR puts the onus on controllers and processors to inform supervisory authorities if there is a likelihood of harm to data subjects and in some cases to inform the data subjects themselves.

Does GDPR help or hinder fraud management?

So what might this mean to a fraud prevention team? Many fraud operations involve handling personal data and in some cases the information captured to prevent fraud may be personal data in itself.

GDPR itself recognizes the importance of fraud prevention within two of its recitals:

  • Recital 47: “The processing of personal data strictly necessary for the purposes of preventing fraud also constitutes a legitimate interest of the data controller concerned…”
  • Recital 71: “decision-making based on … profiling should be allowed where expressly authorised by … law … including for fraud or tax evasion monitoring and prevention purposes”

There is also an important exemption to the data subjects’ right to erasure of their personal data when it is needed “for the establishment, exercise or defence of legal claims.”

Compliance for fraud teams can be grouped into three categories:

  • Purpose and grounds for processing
  • Use of automation
  • Operational considerations

These three categories will typically impact relationships between the fraud team and different business areas. For example, the purpose of processing will affect customer onboarding, public interfaces such as the website and legal areas such as contracts and privacy notices. Operational considerations, by way of contrast, may be more related to IT and customer support.

No two organizations’ fraud prevention mechanisms are exactly alike, and neither are their customer relationships, so any GDPR compliance policy must be tailored to the systems and processes in use. As new technology is adopted to solve problems, it must be assessed to ensure it complies with legislation and data protection policies. These data protection impact assessments (DPIAs) are now mandatory for higher-risk processing.

Grounds for processing data

To start with, any processing of data for fraud prevention purposes must meet the principles of GDPR, including fairness, transparency and legality, and there are also requirements to provide information on the purpose of processing and its legal basis. Fraud processing must therefore be covered by the processing descriptions provided to customers. The only likely exemption would be if the data used for this purpose somehow did not identify the data subject; this might be the case for statistical or anonymous reporting, but will probably not support prevention or detection of financial crimes such as fraud. Fraud prevention may already be part of a contract with the customer, but it is useful to include it in the privacy notice sent to customers.

These notifications must also include information where special categories of data are used; this may affect businesses which are using behavioral biometrics to provide positive or negative signals about the user’s activity. It is not yet completely clear which behavioral methods, for example capturing how a user navigates a website or uses a computer mouse, will be deemed as identifying an individual.

Privacy notifications, and in some cases requests for consent, should also list any transfers outside the EU/EEA, for example when using a service provider located outside the EU. It is therefore vital, if using cloud-based storage or external IT functions, to know where they are located, including any sub-processors which these third parties may use.

Automated decision making

Fraud prevention and detection often relies heavily on automation, especially during customer acquisition and transaction processing. Data subjects must therefore be appropriately notified about automated decision-making, whether they are clients, members of the public or employees. Their right to know about automated decision-making includes an accessible, high-level description of the logic used, the purpose of the automation and its likely consequences, for example having a payment stopped.

Data subjects gain a right to object to decisions made entirely by automated systems. One basic form of redress is to allow data subjects to make their case to a human. This “safety valve” may mean that members of staff should be able to overrule the computers in some cases, and that in addition to explaining the logic at a high level, the human may need to explain more about the decision in the specific case. In practice this might mean a cardholder calling customer services to ask why their card was declined and being able to have that decision reversed. In some organizations that will be a change to existing practice. It also means companies will need to retain some manual-review staff to handle these appeals.
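The “safety valve” described above can be sketched as an automated decision that records high-level, explainable reason codes, plus a manual-review step that can overrule it. The thresholds, reason codes and scoring rule here are illustrative assumptions, not any real fraud engine’s logic:

```python
from dataclasses import dataclass

@dataclass
class Decision:
    approved: bool
    reasons: list            # high-level, explainable reason codes
    reviewed_by: str = ""    # set when a human overrules the machine

def automated_decision(tx):
    """Toy fraud rule: decline when more than one risk signal fires."""
    reasons = []
    if tx["amount"] > 1000:
        reasons.append("amount_over_limit")
    if tx["country"] != tx["card_country"]:
        reasons.append("cross_border")
    return Decision(approved=len(reasons) <= 1, reasons=reasons)

def human_review(decision, reviewer, approve):
    """Record a manual override after the cardholder makes their case."""
    decision.approved = approve
    decision.reviewed_by = reviewer
    return decision

tx = {"amount": 1500, "country": "FR", "card_country": "GB"}
d = automated_decision(tx)
print(d.approved, d.reasons)  # → False ['amount_over_limit', 'cross_border']

# The cardholder calls customer services; an agent reviews and overrules.
d = human_review(d, reviewer="agent-42", approve=True)
print(d.approved)             # → True
```

Storing the reason codes alongside the decision is what lets a human agent explain the specific case to the data subject, rather than only describing the logic in the abstract.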

Operational matters

Processes will need attention as well, for example the handling of data subjects’ rights. One key consideration is ensuring that you are in fact dealing with the data subject: whatever identity verification techniques you use, before handing over a subject’s entire data history you should be sure it really is them.

For many of these rights there are exemptions which prevent criminals from being able to delete data or cover their tracks. Fraud teams should therefore consider which of these processes they want to be informed about, and when, since a rights request may itself indicate compromise of an account, in a similar way to a change of address or other basic account details.

IT and data security is another area requiring consideration, especially breaches of data security at suppliers which could compromise client accounts. Under GDPR such processors are required to inform the controller without delay, as are any sub-processors. Early notification of these breaches may allow fraud teams to react more quickly to emerging situations and to apply additional security to potentially compromised client accounts.

Get educated on GDPR

Finally, GDPR requires staff members who handle personal data to be educated on GDPR, what it means and how to process personal data securely. This should go hand-in-hand with a means of reporting weaknesses or breaches and a culture of transparency. Whistleblowing hotlines and services may be set up to collect vital intelligence not just from members of staff, but from suppliers, clients and members of the public. This should be a separate service from any channel for data protection requests from data subjects.

These are some of the highlights of GDPR. Any organization which thinks it may be covered by GDPR is advised to read the regulation itself and get specialist advice from data privacy professionals and lawyers.

Author: Jonathan Williams

Jonathan Williams is head of consulting at Mk2 Consulting, an independent advisor on payments, identity and fraud prevention. Before that Williams was head of strategy and product for payments products at Experian. With a background in cybersecurity, telecommunications and embedded systems he addresses both business and technological issues.