Research involves the collection, processing and analysis of data. As such, a great proportion of researchers spend their days handling personal and/or sensitive data. If you are in the UK, dealing with such data requires compliance with the Data Protection Act (DPA) 2018, which is the UK’s implementation of the General Data Protection Regulation (GDPR). Data protection compliance is regulated by the Information Commissioner’s Office (ICO). Not only do data breaches risk sizable fines, but they can also damage the reputation of the researcher and the organisation that employs them. To protect your organisation from the devastating effects of data breaches, it’s important that researchers undergo regular data protection training and understand what steps to take to protect data under the law.

Types of Data

Researchers handle a range of data types. The DPA 2018 regulates the use of personal data and special category data. Personal data is any information relating to a living, identified or identifiable person. Special category data is data that is regarded as more sensitive and is consequently subject to tighter restrictions. Examples of this type of data include: racial/ethnic origin, health or sex-life information, political opinions, and religious beliefs. This type of data may, for instance, be used in health research, and must be gathered and processed according to the conditions laid out in Article 9 of the GDPR. Amongst other things, this article states that explicit consent must be given by the data subject in order for their data to be processed.

The Issue of Consent

Data protection legislation guides researchers to implement best practices, such as that of informed consent. Consent must be informed (given with knowledge of the purposes for which the data will be used and of anyone it will be transferred to), voluntary (consent must not be coerced), and fair (the individual should be given any supplementary information needed to ensure full transparency).

Whilst data protection regulation can seem overwhelming, especially regarding special category data, following best-practice procedures and undertaking regular data protection and refresher training can help keep compliance fresh in the minds of researchers.

Safeguards and Best Practices for Researchers

There are a number of safeguards in place designed to protect the personal data of research participants. Research must receive approval from a research ethics committee. Data processing should be limited to what is strictly necessary (this is what is known as data minimisation). All who handle personal data should be literate in the principles of confidentiality and data protection. Data should be anonymised/pseudonymised wherever possible. If data is anonymised in line with the ICO’s ‘Anonymisation Code of Practice’ then it is no longer regarded as personal data. However, it is important to recognise that the act of anonymisation is still classed as processing personal data.
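
To make pseudonymisation more concrete, the short sketch below (Python, purely illustrative – the field names and secret key are assumptions, not ICO guidance) replaces a direct identifier with a keyed hash so records can still be linked for analysis without naming the participant. As noted above, the output may still be personal data if it can be re-identified, and the act of pseudonymising is itself processing.

    import hmac
    import hashlib

    # Hypothetical secret key; in practice it must be stored separately from the data.
    SECRET_KEY = b"keep-this-key-somewhere-else"

    def pseudonymise(identifier: str) -> str:
        """Return a stable pseudonym for a direct identifier (e.g. a participant name)."""
        return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

    participants = [
        {"name": "Jane Doe", "condition": "asthma"},      # illustrative records
        {"name": "John Roe", "condition": "diabetes"},
    ]

    # Swap the identifying field for its pseudonym before analysis or sharing.
    pseudonymised = [
        {"participant_id": pseudonymise(p["name"]), "condition": p["condition"]}
        for p in participants
    ]
    print(pseudonymised)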

Why is Data Protection Important in Research?

Improper data protection practices can result in a data breach. Data breaches can have far-reaching effects on both the data subjects (those whose data is stored and processed) and the organisations or individuals charged with protecting the data. The ICO can issue monetary penalties of up to €20,000,000 or 4% of your annual global turnover, whichever is greater, for data breaches where fault is determined.
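
As a rough illustration of how that cap works (the turnover figure below is invented for the example), the maximum penalty is simply the greater of the two amounts:

    # Illustrative only: maximum GDPR penalty is the greater of EUR 20m or 4% of turnover.
    annual_global_turnover_eur = 750_000_000          # hypothetical organisation
    maximum_fine_eur = max(20_000_000, 0.04 * annual_global_turnover_eur)
    print(f"Maximum possible fine: EUR {maximum_fine_eur:,.0f}")   # EUR 30,000,000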

Breaches can also quickly become tabloid scandals and culminate in massive damage to your reputation or that of your employer. Given the potentially crippling effects that a data breach could have on your organisation and career, the importance of thorough data protection training is clear.

Most importantly, a data breach could result in damage to the rights, freedoms and privacy of your data subjects.

Data protection is not an issue confined to large businesses. Data protection requirements apply to all organisations that process personal data, and failure to comply with legislation could have devastating effects on your organisation no matter its size or sector. Following the General Data Protection Regulation (GDPR) legislation which governs the EU, the UK implemented the Data Protection Act 2018 in order to comply. The new legislation has brought tighter oversight and more serious consequences should data breaches occur, including potentially devastating fines. Therefore, data protection training is more crucial than ever in protecting your organisation.

Who Does Data Protection Apply To?

Data protection restrictions apply to all data controllers and data processors.

Data controllers are organisations that own personal data and are responsible for deciding how that data is used. If your club or society holds information about its members, volunteers, suppliers, or employees, then data protection applies to you.

Requirements for Your Club/Society

You are required to ensure all officials, staff, volunteers, and members have satisfactory data protection awareness under GDPR, and that anyone who handles club matters adheres to strict data policies. Remember, data can only be used for the purpose for which it was obtained and this purpose must be explicitly stated.

Under GDPR legislation, data subjects (individuals who have personal data held about them) are entitled to know how their data will be used and for what. Your organisation must be clear and transparent regarding your data processing procedures and issue a privacy notice explaining this to data subjects.

Another individual right under GDPR is the right of access to the personal data that is held about you. This can be achieved through the submission of a Subject Access Request (SAR). You are required to respond within one month if you receive a SAR from a data subject.

Valid consent must be gained in order to process personal data. This includes consent from club officials to make their names and contact details available to the public. In order for consent to be valid it must be informed, specific, freely given and easily revocable. You are required to ensure any consent obtained in the past meets the new criteria and re-obtain consent if necessary.

As data controllers, clubs and societies are responsible for the personal data they own – even when it is in the hands of a third party. Therefore, it is important to draw up a written contract between you and any third parties you work with, documenting their agreement to comply with your data policies.

Through data protection education and training you can minimise the chances of your organisation suffering a data breach. However, if a breach does occur you are responsible for reporting it to the Information Commissioner’s Office (ICO) within 72 hours of it being detected. Data handlers must be trained in rapidly identifying breaches in order to implement measures intended to minimise any adverse consequences.
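
As a simple aid (a sketch only, assuming a hypothetical detection time – not a compliance tool), the 72-hour reporting window can be tracked like this:

    from datetime import datetime, timedelta, timezone

    # Hypothetical moment the breach was detected.
    detected_at = datetime(2018, 6, 1, 14, 30, tzinfo=timezone.utc)

    # The ICO must be notified within 72 hours of detection.
    reporting_deadline = detected_at + timedelta(hours=72)
    print(f"Report to the ICO by: {reporting_deadline.isoformat()}")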

High Profile Breaches

In November 2017 an exclusive Oxford and Cambridge club had its reputation tarnished when it suffered a serious data breach. Names, home addresses, phone numbers and some bank details were extracted from a computer system. The information leaked was enough to facilitate identity theft. Due to the number of high-profile members in the club, including Stephen Fry, the breach was the focus of many news articles, and the growing publicity left the club’s reputation in ruins.

A Queensland sports club found itself the victim of a cyber-attack in March 2018. Hackers extracted personal information on 70,000 individuals including the club’s employees, members, events centre customers and corporate partners. The information extracted included names, gender, dates of birth, addresses, telephone numbers, email addresses, next of kin, employment status, membership numbers, photographs, company details, invoices and bank accounts. The large-scale nature of this breach highlights the widespread repercussions that a data breach could have across your business.

The Benefits of Compliance

Lack of data protection regulation compliance can result in data breaches, which can have a multitude of ramifications for your organisation and the individuals whose data is implicated.

Following a data breach your organisation could incur fines of up to €20,000,000. Large-scale breaches often hit the news and result in serious damage to an organisation’s reputation. Your organisation has a duty to protect its data subjects as breaches can result in emotional, physical and financial consequences for the subjects involved. In order to ensure compliance within your organisation all members and employees should undergo regular data protection training along with the implementation of rigorous data policies.

Churches have charitable status but are not regulated by the Charity Commission as other charities are. Despite this, church trustees have the same responsibilities as other charity trustees. These responsibilities include compliance with laws such as the General Data Protection Regulation (GDPR) and the Data Protection Act 2018. Good data protection practice is a legal requirement, and because churches hold a large amount of personal data, such as information about clergy members and attendees, regular refresher training is an essential means of ensuring compliance amongst volunteers and church employees.

GDPR requires that organisations perform regular data audits, thus parish resources must be monitored for compliant data use and retention. With an increased focus on accountability, members of the church are required to demonstrate compliance rather than simply state it. This demonstration can take the form of documenting processing activities, data protection training, policy reviews and audits of parish resources and processes.

Explaining the Jargon

Data protection applies to any individual/organisation that processes personal data. Personal data is information about an identifiable, living person, whilst processing is anything that happens to this data (including its storage and transfer). Special category data is a type of personal data that is viewed as more sensitive, and is consequently more at risk should a data breach occur. Religious belief falls into this special category of data, so churches must be especially cautious with data that affiliates individuals with religion. Data controllers are any individuals/organisations that collect and are responsible for the use of personal data. Within a church the incumbent and the Parochial Church Council (PCC) are seen as separate data controllers. Individuals/organisations that process personal data on behalf of the data controller are termed data processors. The data controller retains ultimate responsibility for personal data, even when in the hands of the data processor. For this reason, if churches outsource any data processing functions to third parties, they must ensure a written contract is signed whereby the processor agrees to comply with certain data policies.

Data subjects are individuals who have personal data held about them which is out of their control; essentially all of us fall into this category.

Who Requires a Data Protection Officer (DPO)?

There are certain criteria under which data controllers are required to appoint a Data Protection Officer (DPO). A DPO is an individual removed from the daily processes of your organisation who is responsible for ensuring data protection compliance. Large-scale processing of special category data necessitates a DPO; however, the scale of processing in most churches does not meet this criterion. That said, you are at liberty to appoint a DPO even when not required to do so.

Data Subject Rights

Under new legislation, data subjects now have extended rights over their personal data. Your church must issue a privacy notice informing data subjects how their data will be processed, and ensure that any consent relied upon is informed. Subjects have the right to access any personal data the church stores about them through submission of a Subject Access Request (SAR). You must respond within one month of receiving a SAR.

Consent

Consent is one of the grounds on which you can legally process personal data. In order to be valid, consent must be freely given, unambiguous and indicated by a clear affirmative action. It must also be specific and informed. Past consent must be checked for validity and re-obtained if it is not up to current standards.

Specific consent is required to use an individual’s personal data for marketing purposes. It is important to recognise that sending people information about church services could be viewed as marketing so you must apply marketing restrictions accordingly. You are not permitted to use the data on your electoral roll for marketing purposes unless you have acquired specific consent to do so.

Why is Data Protection Important for Your Church?

Failure to comply with data protection can result in data breaches. It is your legal and moral duty to protect those you hold personal data about. Data breaches can result in emotional, physical and financial consequences for the affected data subjects. Additionally, the consequences of a data breach on your church could be substantial. Repercussions include damage to your reputation as well as penalties issued by the Information Commissioner’s Office. Data protection training can help to demonstrate compliance, protect your data subjects and avoid the devastating effects that a data breach could have on your church.

We tend to think of data protection in relation to corporations, however all data controllers and data processors must uphold the standards of the Data Protection Act 2018 (DPA 2018) – this is the UK’s implementation of the General Data Protection Regulation (GDPR).

A data controller is any individual or organisation who owns, controls, and is responsible for personal information about data subjects; data processors are any persons or organisations that process data on the data controller’s behalf. If you are a childminder, you act as the data controller, as you will collect information about the children in your care, e.g., their parents’ contact information, home addresses and so on; you will then determine the purpose for which this information is used and the means by which it will be processed.

As data controllers under the DPA 2018 and GDPR, you are held responsible for, and must be able to demonstrate compliance with, the principles of data protection. These are: lawfulness, fairness and transparency; purpose limitation; data minimisation; accuracy; storage limitation; and integrity and confidentiality of personal data.

Protecting Children’s Data

A separate set of restrictions surrounds children’s data in order to safeguard them. Childminders must ensure they understand and comply with these requirements to protect the children in their care and defend themselves against data breaches. Only parents/carers with parental responsibility can provide personal data on behalf of a child and give consent for this data to be collected. Childminders are required to make reasonable efforts to ensure that the person providing this data does in fact hold parental responsibility for the child. Once they are thirteen years old, children can give their consent directly for the processing of their personal data. It is important to recognise that children have the same rights as adults over their personal data. These rights include: access, correction, erasure, processing restriction, portability, objection to processing, information on processing and rights relating to automated decision making.
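
The consent rule described above can be summarised as a simple check. The sketch below is illustrative only (the function name and inputs are assumptions) and is no substitute for properly verifying parental responsibility:

    # Mirror of the rule above: children aged 13 or over may consent directly;
    # otherwise consent must come from someone with parental responsibility.
    def who_may_consent(child_age: int, adult_has_parental_responsibility: bool) -> str:
        if child_age >= 13:
            return "the child may give consent directly"
        if adult_has_parental_responsibility:
            return "a person with parental responsibility may consent on the child's behalf"
        return "consent cannot be accepted until parental responsibility is verified"

    print(who_may_consent(12, True))    # parental consent required
    print(who_may_consent(14, False))   # child may consent directly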

Special Category Data

Childminders belong to a group of professionals who are likely to access personal information that falls into the special category of data, e.g. health information such as allergies, medications, and so on. This data is regarded as highly sensitive, so those who control it must comply with the GDPR’s ten conditions for processing special category data set out in Article 9(2).

Privacy Notices

With an increased focus on transparency under the DPA 2018, childminders are now required to issue privacy notices. These notices will explain how and why personal data will be processed and should be made readily accessible to parents and children. Any correspondence addressed to children should be simple and easy to understand.

Sharing Personal Data

Sometimes childminders will be required to share the personal data that they hold with others, for example with other care providers, emergency back-up childminders, or other professionals working with the child. A GDPR Data Sharing Agreement is required for information sharing in these situations.

Why is Data Protection Important for Childminders?

Data protection is important for all data controllers, but especially for childminders given the sensitive nature of the data they will process/store about children. The rigorous data protection requirements surrounding both children’s data and special category data mean that good data protection training is a necessity for all childminders. The absence of data protection awareness and policy implementation can result in data breaches. Breaches can have far-reaching effects on individuals, often resulting in emotional, physical, and financial damage.

Charities and voluntary organisations are third sector, not-for-profit organisations and whilst they benefit from numerous exemptions, they are not overlooked when it comes to data protection regulation. The Data Protection Act 2018 (the UK’s implementation of GDPR) applies to any individual/company that handles personal data. Personal data can be defined as any information on a living identified or identifiable person. Volunteers have the same responsibilities as any other employees when it comes to upholding high data protection standards and, as such, they require thorough data protection training in order to mitigate the risk of a breach.

When to Appoint a Data Protection Officer (DPO)

A Data Protection Officer (DPO) is an independent expert in data protection who reports to the highest level of management. The DPO can be an existing employee or an externally appointed individual. All companies are allowed to appoint a DPO, but only companies that fall into one of the following categories are required to do so:

  • Public authorities
  • Companies whose core activities include large scale systematic monitoring of individuals
  • Companies whose core activities include large scale processing of special categories of data or data regarding criminal convictions/offences

The Issue of Consent

The new data protection legislation requires that organisations, including charities and voluntary organisations, obtain valid consent when collecting and processing personal data. Consequently, any consent you have obtained in the past must be checked for validity under the new legislation and re-obtained if deemed unsatisfactory. In order for consent to be valid it must fulfil the following criteria:

  • It must be freely given
  • It must be indicated by an affirmative action
  • It must be specific, covering the controller’s name, the purpose of the processing and the types of processing activity that will be undertaken
  • It must be explicitly expressed

Privacy Notices and SARs

Data subjects (individuals who have personal data held about them) have the right to be informed about the way in which their data will be processed. Companies are required to issue a privacy notice detailing how they intend to use a subject’s data. The privacy notice must be concise, transparent, easily accessible, easy to understand, and free of charge.

Data subjects also have the right to see precisely what data is held about them by an organisation. To access this data they must submit a Subject Access Request (SAR), sometimes called a Data Subject Access Request (DSAR), to the data controller. The data controller is then obliged to respond to the request within one month. Data subjects are entitled to the following information:

  • Confirmation that you are processing their data
  • A copy of their personal data
  • Other supplementary information, most of which will feature in your privacy notice

High Profile Charity Data Protection Breaches

  • Late in 2017, Age UK leaked thousands of staff files in two data breaches. Information included names, addresses, dates of birth and national insurance numbers of past and present employees. The data disclosed because of the breaches was everything criminals needed in order to commit identity fraud, and left Age UK with no choice but to pay £100,000 to fraud prevention services to protect those affected. This is £100,000 that the organisation would not be able to use to support the elderly.
  • In November 2017 the charity Change, Grow, Live changed premises. Over 100 files were left behind in the move, some of which contained highly sensitive information. Included in the forgotten files were details of beneficiaries’ experiences of abuse and addiction. Public trust in the charity dropped significantly as the press caught wind of the breach, and its reputation was irreversibly damaged.

Data breaches could mean big trouble for charities and other voluntary organisations, as they rely so heavily on public trust and branding for support and donations. Whilst complying with the DPA 2018 and GDPR can seem overwhelming, particularly for smaller organisations, the data protection principles are not merely a box-ticking exercise. Good data protection signals to your donors that you respect their privacy and take the work you do seriously.

Social media is a vast and expanding network, and it creates space for the personal data of users to be compromised and exploited for cyber-criminal activity and manipulation. Consequently, the need to protect the personal data of social media users has intensified. Social networking now demands vigilance against hacking attempts in the form of phishing scams, spyware, viruses and account cloning. The increased use of social networking sites, and the vast amount of personal data they store, has also created a need for stronger data protection.

Social Networking Sites and Data Protection

As social media offerings develop, users supply their personal data to a wide online network, at the risk of that data being manipulated. A wide range of data protection requirements therefore apply.

The Data Protection Act (DPA) 2018 provides the legal safeguards necessary to prevent social media networks from exploiting personal data. The DPA includes an exemption for personal data that is used for purely domestic purposes. The Information Commissioner’s Office (ICO) states that the domestic purposes exemption applies to individuals using social networking sites for personal reasons. Therefore, an individual using Facebook or Twitter for their own personal reasons does not need to comply with the DPA.

The domestic purposes exemption only relates to individuals; therefore, if a business is using social networking sites to promote itself, it is required to comply with the DPA.

Catfishing: A form of Identity Theft

The personal data uploaded onto social media networks by individuals has enabled the practice of ‘catfishing’. Catfishing is the process in which personal data, such as an individual’s name, age and photographs, is stolen in order to create another identity. In July 2017 Labour MP Ann Coffey called for a law to criminalise catfishing, as it is the act of stealing personal identity and thus a form of identity theft. The demand to make catfishing illegal has gained more momentum in recent years, due to the increase in catfishing scandals and in the amount of personal data available on social media networks. Although no law has yet been made against catfishing, with the increased focus upon data protection it seems likely that more calls will be made to make it illegal.

Data Breaches and Social Networking Sites

The rise in cyber-criminal activity in recent years prompted the new DPA legislation in 2018 to strengthen controls over personal data online. In 2013, Twitter experienced a data breach which allowed cyber-hackers access to 250,000 accounts, exposing the names, email addresses and passwords of each of these social media users. This data breach followed a series of security breaches at US technology and media companies, including the hacking of the Wall Street Journal and the New York Times. Furthermore, Apple was encouraged in 2013 to stop using Java to mitigate the risks of cyber-hacking.

Moreover, in 2012 LinkedIn lost the account credentials for 167 million accounts following a data breach. This breach involved a hacker stealing the encrypted passwords of these accounts from the networking site, resulting in a reset of all account passwords. The rise in cyber-criminal activity and the hacking of social media networking sites means that it is imperative to understand what personal data you have uploaded onto these sites.

Social Media and Businesses

A vast number of businesses now utilise social media networking sites to promote their business and to communicate with customers; these networks tend to be Facebook, LinkedIn and Twitter.

By abiding by the DPA and the PECR requirements, organisations can obtain consent from social media users through social plugins in the form of a “like” or “follow”. Thus, organisations enjoy an easier capture of consent through social media whilst complying with the data protection regulations, but it tends to happen by default. Social media users tend to be unaware that a “like” effectively offers their online data to that business.

Consequently, a lot of confusion arises between the business and the online customer, and data protection regulation now sets out how to use these social plugins legally. Social plugins offer businesses the opportunity to expand their outreach across social media easily; however, data protection regulations have been put in place to ensure that these social plugins are not exploited.

The EU-US Privacy Shield has in effect committed social media networks from the US to comply with the new framework agreement within the GDPR to protect the personal data of EU citizens. Thus, businesses and their social media audiences agree to the terms and conditions set out by the GDPR.

The business world, as well as society in general, is increasingly dominated by social media networks and social media culture. Therefore, the personal data that exists in abundance on these networking sites is at risk. Even with the updated DPA 2018, businesses and individuals need to be aware of the risks of cyber-hacks and data breaches.

The handling of personal data within the marketing sector varies, and so requires its own understanding of data protection principles. The use of personal data for marketing purposes is subject to the Data Protection Act (DPA) and the Privacy and Electronic Communications Regulations (PECR). PECR is derived from European legislation and implements the European e-privacy Directive, which addresses the risks to privacy that can arise through the use of electronic communications. The marketing sector uses the internet and digital networks to contact customers, and therefore needs to be aware of the DPA and PECR regulations to mitigate the risk of causing a data breach.

Direct Marketing

Direct marketing is the promotional procedure used to contact target customers directly, through channels such as email, telephone, SMS or fax. The e-privacy regulation and PECR establish the protection principles that an organisation must follow if it wishes to use direct marketing. As a result, direct marketing requires consent. The individual giving consent needs to be made aware of the methods of communication that will be used, and of what their personal data will be used for in relation to direct marketing.

Digitonomy Ltd, a UK-based credit broker, was fined £120,000 by the Information Commissioner’s Office (ICO) in 2016 for contravention of the PECR. Digitonomy Ltd was found to have sent over 5 million marketing texts to the public without valid consent. These texts, which were labelled as ‘spam’, elicited 1,464 complaints, prompting the ICO to take enforcement action.

Digitonomy Ltd was working with affiliate marketing companies to distribute these text messages, assuming that this qualified as obtaining consent. However, the ICO highlighted that this did not constitute specific consent. Digitonomy Ltd’s inability to prove that it had received specific consent meant that it was subject to the ICO’s penalties.

Therefore, it is essential for marketers to be thoroughly aware of the rules set out in the DPA and the PECR, to ensure that they do not face monetary penalties issued by the ICO.

Legitimate Interest

Legitimate interest can be used by marketers as a flexible lawful basis for processing personal data in a way that an individual would expect their information to be used. The PECR explicitly states that if an organisation wishes to send a direct marketing message electronically, then it must have obtained consent previously. The organisation is allowed to contact an individual again, through further electronic messages, but only if it is offering similar products or services. Furthermore, the individual should be offered the opportunity to opt out of receiving such communications in every channel of communication that the individual has with the organisation.

To ensure that your business is utilising legitimate interest in the right manner, the ICO has issued three tests which can be used to justify the use of legitimate interest: the purpose test, the necessity test and the balancing test. If legitimate interest is used correctly by a business, then there should be no reason for the ICO to investigate that business’ marketing procedures.

Due to the changes in standards under the GDPR, businesses have been forced to consider whether existing customers still want to receive marketing emails. Re-permissioning campaigns have therefore been used by businesses to obtain permission from customers to continue sending marketing emails. Asos.com, the British online fashion company, sent out a series of bold and concise emails with the title: “The law is changing. Are you set to get your ASOS emails?” In doing so, the organisation obtained clear confirmation of whether its marketing emails could be sent to particular individuals, helping to ensure Asos.com remains compliant with the DPA and the PECR.

It is essential for businesses to be certain about how they can market themselves whilst remaining compliant with the DPA and the PECR, to avoid any crippling fines from the ICO. This certainty can be achieved by ensuring that direct marketing and legitimate interest are used in the right manner.

Whilst finance and technology have been linked ever since the first ATM in the late 1960s, the advent of mobile internet has truly changed the game for financial services technology, or “Fintech”, as it’s commonly known.

Often deemed a decentralising force, many Fintech companies are seen as directly oppositional to large, traditional banks and offer radically different, user-centric, experiences of things like mortgages, insurance, and currency exchange.

In many ways, Fintech has thrived because of its willingness to capitalise on user experience. Much in the same way that activities like shopping, online dating, and scheduling taxi-cabs have been revolutionised through mobile applications, Fintech’s accessibility, its transparency, and its attempts to keep charges to a minimum (all things banks have a bad reputation for), have led to similar disruption for the financial sector.

It’s true that Fintech offers clear opportunity for a new, more efficient, effective, and – dare I say – human approach to finance, but it can also represent hidden risk. It could be argued that the speed, intangibility, and global nature of digital finance is in danger of creating unforeseen regulatory gaps as agencies like the Financial Conduct Authority (FCA) in the UK rush to keep pace.

We can see these concerns play out in the media at the moment. For example, the self-described ‘beyond banking’ app, Revolut, hit the news recently amid non-compliance accusations over money laundering.

Whilst Revolut vehemently denies any wrongdoing and CEO, Nik Storonsky, has clarified that the app simply reverted to its original anti-money laundering screening system rather than remove the function altogether (due to the technology recording too many false positives), the predicament nevertheless shed light on the internal struggle that exists for digital finance between customer satisfaction, speed, and matters of compliance.

The truth is, although many new finance apps appear more human (in that they offer an open, honest, and jargon-free experience without the lengthy approval processes that banks enforce), they are, in fact, artificially intelligent.

That means that it’s often complex algorithms – not humans – that process customer details and translate them into requirements and decisions for things such as mortgages, lending/borrowing, and what constitutes a safe or unsafe money transfer.

In many ways, the mechanisation of everyday financial services is revolutionary (it has been likened to the industry’s transformation in the 1980s at the advent of computerised banking). Machine learning and AI mean that computers can write and test rules themselves. They can learn, for example, to make mathematically perfect lending decisions in seconds – possibly putting an end to unscrupulous and predatory lending practices (like the sort that contributed to the 2008 financial crash).
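
To make the idea of automated decision software concrete, here is a deliberately toy sketch of a score-based lending decision. Everything in it – the features, weights and threshold – is invented for illustration; it does not reflect how any real Fintech firm scores applicants.

    # Toy credit-scoring rule: higher debt-to-income ratios and missed payments lower the score.
    def credit_score(monthly_income: float, monthly_debt: float, missed_payments: int) -> float:
        debt_ratio = monthly_debt / max(monthly_income, 1.0)
        return 700 - 300 * debt_ratio - 40 * missed_payments

    def lending_decision(monthly_income: float, monthly_debt: float, missed_payments: int) -> str:
        score = credit_score(monthly_income, monthly_debt, missed_payments)
        return "approve" if score >= 550 else "refer for human review"

    print(lending_decision(monthly_income=2800, monthly_debt=600, missed_payments=0))   # approve
    print(lending_decision(monthly_income=1500, monthly_debt=900, missed_payments=2))   # refer for human review

Even this trivial example shows how a threshold, rather than human judgement, ends up deciding outcomes – which is exactly why regulators scrutinise such systems.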

On the other hand, there is a cost of sorts to generating this innovation and, for Fintech, it usually comes in the form of data. Big Data – such as the data gathered continuously by your smartphone (and which can be used to predict human behavioural patterns) – fuels Fintech in numerous ways. For example, financial technologies use our personal data to customise user experience, offering banking recommendations based on our spending patterns.

Fintechs also use data and predictive analytics to make credit and lending decisions, manage risk, detect fraud, fuel marketing, and devise customer retention/loyalty programmes. We shouldn’t underestimate just how much Fintech relies on access to data, and what that data can be used for. After all, Big Data begs Big questions:

  • What happens if data security is compromised?
  • Who (or what) is held accountable by regulatory watchdogs for decisions made by robots?
  • Just how do Fintech firms protect our consumer rights?

Champions of Fintech argue that consumers, indeed society, will benefit from increased access to more personalised, more cost-effective finance products that encourage fair competition and inspire change. However, those who are more sceptical argue that more data naturally equals more risk, pointing to cyber-security attacks like the one suffered by Tesco Bank in late 2016, for which Tesco was fined £16.4M by the FCA in 2018; the breach saw 34 unauthorised online transactions take place.

Others question Fintech’s use of automated decision software, arguing that it could actually increase the risk of financial exclusion, as customers with little or no digital footprint could become ‘invisible’ to applications that rely on data to profile people and assess risk. Similarly, customers might be unfairly profiled because their spending or shopping habits are similar to those of someone who has been refused credit in the past. Lumping people together like this suddenly doesn’t sound all that human …

With so much digital information available for Fintech firms to use and analyse, it is imperative that regulatory bodies like the FCA continue to question how Big Data is being used, and for firms to implement safeguards that ensure data is processed ethically and lawfully. This is particularly true under GDPR (or the UK’s implementation of it, the Data Protection Act 2018). Under this legislation, data controllers must:

  • Be transparent about how they intend to use data (including putting measures in place to track and audit data use and for customers to access records about how their data is being used).
  • Obtain informed consent from data subjects to use their data in the manner they want to. Organisations risk breaching data privacy and data security laws if they carry out group or individual profiling on data they only have implied consent for.
  • Ensure that automated decision software is fair and unbiased.
  • Protect data integrity by using only accurate data and updating this data as and when required.
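
As a rough illustration of the record-keeping these obligations imply – a sketch under assumed field names, not a description of any real firm's system – consent and processing purposes might be logged so they can be audited later:

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class ConsentRecord:
        subject_id: str
        purpose: str              # e.g. "personalised product recommendations"
        consent_given: bool
        informed_via: str         # e.g. "privacy notice v3"
        recorded_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    audit_log = []  # list of ConsentRecord entries

    def record_consent(subject_id, purpose, consent_given, informed_via):
        audit_log.append(ConsentRecord(subject_id, purpose, consent_given, informed_via))

    def may_process(subject_id, purpose):
        """Process only for a purpose the subject has actually consented to."""
        return any(r.subject_id == subject_id and r.purpose == purpose and r.consent_given
                   for r in audit_log)

    record_consent("cust-001", "spending-pattern profiling", True, "privacy notice v3")
    print(may_process("cust-001", "spending-pattern profiling"))   # True
    print(may_process("cust-001", "third-party marketing"))        # False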

As we might suspect, underpinning Fintech’s regulatory obligations is yet another innovation, aptly named Regulatory Technology, or “Regtech” for short. Whilst not a new concept, the continued crossover between regulation and technology may well become crucial as Fintech encounters ever more regulatory and reporting requirements in the future. Extending disruptive digital technologies to regulation, indeed, seems like the next logical step.

In May 2018 the UK government implemented the Data Protection Act (DPA) 2018 in accordance with the European General Data Protection Regulation (GDPR). However, with Brexit negotiations materialising and declarations that the UK will be leaving the EU Digital Single Market, there is uncertainty surrounding whether the UK’s DPA will change, and subsequently how data will be handled between the UK and Europe. The UK government has requested a co-operative relationship between the UK and the EU to ensure a free flow of data. However, Brexit does have the potential to revise and reshape data protection regulation in the UK. Therefore, organisations need to consider and prepare for how the DPA will apply to the UK post-Brexit.

Which aspects of the DPA will potentially cause problems following Britain’s exit from the EU?

The UK needs a free flow of data between the EU and the UK for business, economic and security interests. However, the European GDPR may prevent this from happening, especially because of Article 45. The UK’s exit from the EU will give it the status of a third country under EU law, and Article 45 specifies that the UK would have to secure an adequacy arrangement to allow data to continue to move between the UK and the EU. This status as a third country will demand action from the UK government to ensure that there is still a relationship of mutual co-operation between the UK and the EU. The transfer and protection of personal data between data controllers is essential, as data privacy and data protection are vital in terms of personal rights, as well as the digital economy.

The EU Commission has ten adequacy arrangements with third countries outside of the EU already, in line with the 1995 Directive. To achieve this adequacy arrangement, the UK would have to meet the EU Commission’s expectations regarding the UK’s own commitment to data protection and the effectiveness of its legal framework. The UK government has mostly aligned UK data protection law with the GDPR to try and ensure a smooth transition and to mitigate the risks to businesses.

However, if the UK is denied an adequacy arrangement then businesses would experience the repercussions, in the form of EU safeguards that would impose added costs on businesses. Businesses that are reliant upon personal data capture, such as marketing, telecommunications and finance organisations, would suffer the most if this post-Brexit situation were to occur. These organisations rely on free-flowing channels of data between the UK and the EU, and if economic obstacles arise they will experience detrimental effects.

Furthermore, the EU-US Privacy Shield has allowed data to flow freely between the EU and the US; however, if the UK does not achieve an adequacy arrangement with the EU, the UK will not benefit from this access to US data. If this happens, the UK will have to confront economic and security challenges.

What has the UK data protection bill put into place to create a smooth transition out of the EU in 2019?

When implementing the DPA, the UK government did so with the consideration that Britain is leaving the EU. The UK bill therefore factors in the differences between the European data regulations and the UK data regulations, and the UK government applies the new standards to all UK data, not just areas which are under EU competence.

The UK government has expressed a desire for the continuation of the UK Information Commissioner’s Office (ICO) role, which would be used to ensure UK businesses are still represented in the EU, and to ensure that the UK is fairly represented in disputes. However, this attempt to streamline the process of communication between the UK and the EU has not yet been put into effect, and there is no guarantee that by 2019 the UK ICO will be given a role in the EU data regulation process.

Effects on Immigration in a post-Brexit UK:

Some commentators on the UK’s DPA have suggested that Brexit could allow the UK government to establish discriminatory immigration laws. The UK government has implemented aspects of the EU’s GDPR in a way that allows some organisations exemptions from the DPA. For example, the Home Office can rely on an immigration exemption in the DPA and has the legal right to reject data subjects’ requests for access to their immigration documents. This has been particularly controversial because Brexit means that over 3 million EU citizens will have to register their residence, which will be harder if data subjects cannot retrieve their personal data from the Home Office.

The uncertainty which surrounds Brexit has created an atmosphere of apprehension within the business sector, and it is therefore vital to gain a well-informed stance on the implications of a post-Brexit UK. Organisations and staff members need to be aware of, and prepared for, the repercussions they might have to confront if the UK government does not achieve an adequacy agreement post-Brexit.

Trends in data protection for direct marketing

Have data protection authorities begun the great fightback against business? Perhaps they have been tasked with bringing in some much-needed cash to national coffers, because fines have become the next big trend in data protection and should seriously concern marketers in businesses of all sizes.

Some recent marketing-related fines have included:

  • Amazon – €746m for compiling data on customers
  • WhatsApp – €225m for failing to provide information in clear and plain language
  • Austrian Post – €9.5m for failing to allow subject access requests by email
  • Grindr – €6.3m for sharing users’ data, including location, with advertisers without consent, as it amounted to special category data on sexual orientation
  • Sky Italia – €3.3m for unwanted phone calls

Overall, there’s been a 113% increase in GDPR fines between July 2020 and July 2021, with 709 in total compared with 332 in the year before. Penalties for violations have more than doubled as well, from €130.69 million up to July 2020 to €293.96 million up to July 2021.
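
The percentage figures can be sanity-checked from the numbers quoted above:

    # Year-on-year change in GDPR enforcement, using the figures quoted above.
    fines_2020, fines_2021 = 332, 709                  # number of fines, year to July
    penalties_2020, penalties_2021 = 130.69, 293.96    # total penalties, EUR millions

    fine_count_increase = (fines_2021 - fines_2020) / fines_2020 * 100
    penalty_increase = (penalties_2021 - penalties_2020) / penalties_2020 * 100

    print(f"Increase in number of fines: {fine_count_increase:.1f}%")   # ~113.6%
    print(f"Increase in total penalties: {penalty_increase:.1f}%")      # ~124.9%, i.e. more than doubled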
