The marketing sector handles personal data in distinctive ways, and so requires its own understanding of data protection principles. The use of personal data for marketing purposes is governed by the Data Protection Act (DPA) and the Privacy and Electronic Communications Regulations (PECR). PECR derives from European legislation, implementing the EU e-privacy Directive, which addresses the privacy risks that arise from electronic communications. Because marketers rely on the internet and digital networks to contact customers, they need to understand both the DPA and PECR in order to mitigate the risk of a data breach.

Direct Marketing

Direct marketing is the practice of contacting target customers directly through channels such as email, telephone, SMS or fax. The e-privacy Directive and PECR set out the rules an organisation must follow if it wishes to carry out direct marketing, and central among them is consent. The individual giving consent must be told which channels of communication will be used and what their personal data will be used for in relation to direct marketing.

Digitonomy Ltd, a UK-based credit broker, was fined £120,000 by the Information Commissioner’s Office (ICO) in 2016 for contravening PECR. The company was found to have sent over 5 million marketing texts to the public without consent. These texts, reported as spam, prompted 1,464 complaints and led the ICO to take enforcement action.

Digitonomy Ltd had worked with affiliate marketing companies to distribute the text messages, assuming this amounted to obtaining consent. However, the ICO made clear that it did not constitute specific consent. Because Digitonomy Ltd could not prove that specific consent had been obtained, it was subject to the ICO’s enforcement powers and penalties.

It is therefore essential for marketing organisations to be thoroughly familiar with the rules set out in the DPA and PECR, so that they do not face monetary penalties issued by the ICO.

Legitimate Interest

Legitimate interest offers marketing organisations a flexible lawful basis for processing personal data in ways an individual would reasonably expect their information to be used. PECR explicitly states that an organisation must have obtained prior consent before sending direct marketing messages electronically. An organisation may contact an individual again with further electronic messages, but only if it is offering similar products or services. Furthermore, the individual must be given the opportunity to opt out of such communications in every message the organisation sends.

To help ensure that your business relies on legitimate interest correctly, the ICO has set out three tests that can be used to justify it: the purpose test, the necessity test and the balancing test. If legitimate interest is applied correctly, there should be no reason for the ICO to investigate a business’s marketing practices.

The higher consent standards introduced by the GDPR have forced businesses to consider whether existing customers still want to receive marketing emails. Many have run re-permissioning campaigns to obtain fresh permission to keep sending them. Asos.com, the British online fashion retailer, sent a series of bold, concise emails titled: “The law is changing. Are you set to get your ASOS emails?” In doing so, it obtained clear confirmation of which individuals its marketing emails could be sent to, helping ensure compliance with the DPA and PECR.

It is essential for businesses to be clear about how they can market their products while remaining compliant with the DPA and PECR, and so avoid crippling fines from the ICO. That clarity comes from applying direct marketing rules and legitimate interest correctly.

Whilst finance and technology have been linked ever since the first ATM in the late 1960s, the advent of mobile internet has truly changed the game for financial services technology, or “Fintech”, as it’s commonly known.

Often deemed a decentralising force, many Fintech companies are seen as directly oppositional to large, traditional banks and offer radically different, user-centric experiences of things like mortgages, insurance, and currency exchange.

In many ways, Fintech has thrived because of its willingness to capitalise on user experience. Much in the same way that shopping, online dating, and hailing taxis have been revolutionised by mobile applications, Fintech’s accessibility, its transparency, and its efforts to keep charges to a minimum (all things banks have a bad reputation for) have led to similar disruption in the financial sector.

It’s true that Fintech offers clear opportunity for a new, more efficient, effective, and – dare I say – human approach to finance, but it can also represent hidden risk. It could be argued that the speed, intangibility, and global nature of digital finance is in danger of creating unforeseen regulatory gaps as agencies like the Financial Conduct Authority (FCA) in the UK rush to keep pace.

We can see these concerns play out in the media at the moment. For example, the self-described ‘beyond banking’ app Revolut hit the news recently amid accusations of non-compliance with anti-money laundering rules.

Revolut vehemently denies any wrongdoing, and CEO Nik Storonsky has clarified that the app simply reverted to its original anti-money laundering screening system rather than removing the function altogether (due to the technology recording too many false positives). Nevertheless, the episode shed light on the internal struggle digital finance faces between customer satisfaction, speed, and compliance.

The truth is, although many new finance apps appear more human (in that they offer an open, honest, and jargon-free experience without the lengthy approval processes that banks enforce), they are, in fact, artificially intelligent.

That means that it’s often complex algorithms – not humans – that process customer details and translate them into requirements and decisions for things such as mortgages, lending/borrowing, and what constitutes a safe or unsafe money transfer.

In many ways, the mechanisation of everyday financial services is revolutionary (it has been likened to the industry’s transformation in the 1980s at the advent of computerised banking). Machine learning and AI mean that computers can write and test rules themselves. They can learn, for example, to make mathematically perfect lending decisions in seconds – possibly putting an end to unscrupulous and predatory lending practices (like the sort that contributed to the 2008 financial crash).
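To make the idea concrete, here is a deliberately simplified, hypothetical sketch of how an automated rule might turn an applicant’s details into a lending decision in milliseconds. It is not any particular firm’s system: the fields and thresholds are invented for illustration, and in a real Fintech product the rule would typically be learned from data rather than written by hand.

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    # Illustrative fields only; real systems draw on far richer data.
    monthly_income: float
    monthly_debt_payments: float
    missed_payments_last_year: int

def lending_decision(applicant: Applicant, amount: float) -> str:
    """Toy rule-based credit decision; the thresholds are invented for illustration."""
    # Spread the new loan over 36 months and compare total debt payments to income.
    debt_ratio = (applicant.monthly_debt_payments + amount / 36) / applicant.monthly_income
    if applicant.missed_payments_last_year > 2:
        return "decline"
    if debt_ratio < 0.35:
        return "approve"
    return "refer for manual review"

print(lending_decision(Applicant(2800, 450, 0), amount=6000))  # -> "approve"
```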

On the other hand, this innovation comes at a cost of sorts and, for Fintech, that cost usually takes the form of data. Big Data – such as the data gathered continuously by your smartphone (which can be used to predict human behavioural patterns) – fuels Fintech in numerous ways. For example, financial technologies use our personal data to customise the user experience, offering banking recommendations based on our spending patterns.

Fintechs also use data and predictive analytics to make credit and lending decisions, manage risk, detect fraud, fuel marketing, and devise customer retention and loyalty programmes. We shouldn’t underestimate just how much Fintech relies on access to data, and what that data can be used for. After all, Big Data begs Big questions:

  • What happens if data security is compromised?
  • Who (or what) is held accountable by regulatory watchdogs for decisions made by robots?
  • Just how do Fintech firms protect our consumer rights?

Champions of Fintech argue that consumers, and indeed society, will benefit from increased access to more personalised, more cost-effective finance products that encourage fair competition and inspire change. Sceptics counter that more data naturally equals more risk, pointing to cyber-security attacks like the one suffered by Tesco Bank, for which the FCA fined the bank £16.4m in late 2018 over a breach that saw 34 unauthorised online transactions take place.

Others question Fintech’s use of automated decision software, arguing that it could actually increase the risk of financial exclusion: customers with little or no digital footprint could become ‘invisible’ to applications that rely on data to profile people and assess risk. Similarly, customers might be unfairly profiled because their spending or shopping habits resemble those of someone who has been refused credit in the past. Lumping people together like this suddenly doesn’t sound all that human …

With so much digital information available for Fintech firms to use and analyse, it is imperative that regulatory bodies like the FCA continue to question how Big Data is being used, and that firms implement safeguards to ensure data is processed ethically and lawfully. This is particularly true under the GDPR (or the UK’s implementation of it, the Data Protection Act 2018). Under this legislation, data controllers must:

  • Be transparent about how they intend to use data (including putting measures in place to track and audit data use and for customers to access records about how their data is being used).
  • Obtain informed consent from data subjects to use their data in the manner they want to. Organisations risk breaching data privacy and data security laws if they carry out group or individual profiling on data for which they only have implied consent (a simple record-keeping sketch follows this list).
  • Ensure that automated decision software is fair and unbiased.
  • Protect data integrity by using only accurate data and updating this data as and when required.
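As a rough illustration of the first two points, here is a minimal, hypothetical sketch of how a firm might record consent and keep an auditable log of data use. The structures and field names are invented for illustration; any real implementation would be shaped by legal advice and the firm’s own systems.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    # What the data subject agreed to, and when; field names are illustrative.
    subject_id: str
    purpose: str          # e.g. "email marketing", "credit scoring"
    granted: bool
    recorded_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class DataUseEvent:
    # One auditable record of personal data actually being used.
    subject_id: str
    purpose: str
    detail: str
    occurred_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

audit_log: list[DataUseEvent] = []

def use_data(consents: list[ConsentRecord], subject_id: str, purpose: str, detail: str) -> bool:
    """Process data only where explicit consent exists for this purpose, and log the use."""
    allowed = any(
        c.subject_id == subject_id and c.purpose == purpose and c.granted
        for c in consents
    )
    if allowed:
        audit_log.append(DataUseEvent(subject_id, purpose, detail))
    return allowed

consents = [ConsentRecord("cust-42", "email marketing", granted=True)]
use_data(consents, "cust-42", "profiling", "group spending analysis")   # False: no consent, nothing logged
use_data(consents, "cust-42", "email marketing", "sent monthly offer")  # True: allowed and logged
```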

As we might suspect, underpinning Fintech’s regulatory obligations is yet another innovation, aptly named Regulatory Technology, or “Regtech” for short. Whilst not a new concept, the continued crossover between regulation and technology may well become crucial as Fintech encounters ever more regulatory and reporting requirements in the future. Extending disruptive digital technologies to regulation, indeed, seems like the next logical step.

In May 2018, the UK government implemented the Data Protection Act (DPA) in line with the EU General Data Protection Regulation (GDPR). However, with Brexit negotiations under way and declarations that the UK will leave the EU Digital Single Market, there is uncertainty over whether the UK’s DPA will change, and consequently how data will be handled between the UK and Europe. The UK government has called for a cooperative relationship with the EU to ensure the free flow of data, but Brexit still has the potential to revise and reshape data protection rules in the UK. Organisations therefore need to consider and prepare for how the DPA will apply post-Brexit.

Which aspects of the DPA will potentially cause problems following Britain’s exit from the EU?

The UK needs a free flow of data with the EU for business, economic and security reasons. However, the GDPR may prevent this, particularly because of Article 45. On leaving the EU, the UK will become a ‘third country’ in EU law, and Article 45 requires a third country to secure an adequacy arrangement before personal data can flow freely between it and the EU. This third-country status will demand action from the UK government to preserve a relationship of mutual cooperation with the EU. The transfer and protection of personal data between data controllers is essential, as data privacy and data protection matter both for personal rights and for the digital economy.

The European Commission already has ten adequacy arrangements with third countries, made under the 1995 Data Protection Directive. To secure such an arrangement, the UK would have to satisfy the Commission as to its own commitment to data protection and the effectiveness of its legal framework. The UK government has largely aligned UK data protection law with the GDPR to smooth the transition and mitigate the risks to businesses.

If the UK is denied an adequacy arrangement, businesses will feel the repercussions: transfers will require EU safeguards, such as standard contractual clauses, which add cost and complexity. Organisations that rely heavily on capturing personal data, such as marketing, telecommunications and finance firms, would suffer most, because they depend on free-flowing channels of data between the UK and the EU.

Furthermore, the EU-US Privacy Shield currently allows personal data to flow freely between the EU and the US. Once the UK leaves the EU it will no longer be covered by that framework, and without a comparable arrangement of its own it will lose this free flow of data from the US, leaving it to confront economic and security challenges.

What has the UK Data Protection Bill put in place to create a smooth transition out of the EU in 2019?

The UK government implemented the DPA with Britain’s departure from the EU in mind. The bill therefore accounts for the differences between European and UK data regulations and applies the new standards to all UK data, not just to areas that fall under EU competence.

The UK government has expressed a desire for the Information Commissioner’s Office (ICO) to retain a role in European data regulation, ensuring that UK businesses are still represented in the EU and that the UK is fairly represented in disputes. However, this attempt to streamline communication between the UK and the EU has not yet been put into effect, and there is no guarantee that the ICO will be given such a role by 2019.

Effects on Immigration in a post-Brexit UK:

Some commentators on the UK’s DPA have suggested that Brexit could allow the UK government to establish discriminatory immigration practices. Although the DPA implements the GDPR, it also grants exemptions to some organisations; the Home Office, for example, can rely on the immigration exemption to refuse data subjects access to their immigration records. This is particularly controversial because Brexit means over 3 million EU citizens will have to register their residence, which becomes much harder if they cannot retrieve their personal data from the Home Office.

The uncertainty surrounding Brexit has created apprehension across the business sector, so it is vital to take a well-informed view of what a post-Brexit UK will mean. Organisations and their staff need to be aware of, and prepared for, the repercussions they may face if the UK does not secure an adequacy agreement post-Brexit.

Trends in data protection for direct marketing

Have data protection authorities begun the great fightback against business? Perhaps they have been tasked with bringing some much-needed cash into national coffers, because fines have become the next big trend in data protection and should seriously concern marketers in businesses of all sizes.

Some recent marketing-related fines have included:

  • Amazon – €746m for compiling data on customers
  • WhatsApp – €225m for failing to provide information in clear and plain language
  • Austrian Post – €9.5m for failing to allow subject access requests by email
  • Grindr – €6.3m for sharing users’ data, including location, without consent; the data revealed sexual orientation and so counted as special category data
  • Sky Italia – €3.3m for unwanted marketing phone calls

Overall, there was a 113% increase in the number of GDPR fines issued between July 2020 and July 2021: 709 in total, compared with 332 in the previous year. The total value of penalties more than doubled as well, from €130.69 million in the year to July 2020 to €293.96 million in the year to July 2021.
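For readers who want to sanity-check the year-on-year maths, the calculation behind those headline figures is straightforward (the inputs below are simply the numbers quoted above):

```python
# Year-on-year comparison of GDPR enforcement, using the figures quoted above.
fines_2020, fines_2021 = 332, 709          # number of fines in the year to July 2020 / 2021
value_2020, value_2021 = 130.69, 293.96    # total penalties, in millions of euros

count_growth = (fines_2021 - fines_2020) / fines_2020 * 100   # ≈113.6%, the "113% increase"
value_multiple = value_2021 / value_2020                      # ≈2.25x, i.e. "more than doubled"

print(f"Fines issued rose by {count_growth:.1f}%")
print(f"Total penalty value grew {value_multiple:.2f}x")
```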

Continue reading

The UK government is planning significant changes to the UK’s data protection regime. From re-orientating the Information Commissioner’s Office (ICO) to new ways for businesses to process data, these far-reaching GDPR reforms are set to have a significant impact on business. We covered these changes in depth in a previous article and webinar.

High on the government’s agenda, as outlined in its consultation, is reform of the ICO. This has been on the cards for some time, with the government keen to align the ICO towards delivering the National Data Strategy. The Department for Digital, Culture, Media and Sport (DCMS) has outlined its proposed changes to the regulator.

Continue reading

The UK government’s consultation on reforming data protection, launched on 9 September, sets out a radically different framework for data protection than GDPR. From re-orientating the Information Commissioner’s Office to new ways for businesses to process data, these far-reaching reforms are set to have a significant impact on business.

Although the plans have been announced in consultation and not every proposal may make it into law, the direction of travel has been clear for some time. The UK plans to make it much easier for most businesses to use data, and get the most from data, while still ensuring strong levels of protection.

In this short video, our Director of Learning and Content takes us through what the potential changes are and how they might affect the way we process data.

Watch now

Here’s what you need to know about the UK’s plans to radically alter GDPR


“The government wants to remove unnecessary barriers to responsible data use. A small hairdressing business should not have the same data protection processes as a multimillion-pound tech firm. Our reforms would move away from the “one-size-fits-all” approach and allow organisations to demonstrate compliance in ways more appropriate to their circumstances, while still protecting citizens’ personal data to a high standard.” Department for Digital, Culture, Media and Sport.

Continue reading

As COVID-19 restrictions are lifted and businesses begin to return to the office, companies are taking a variety of approaches to managing the transition. While some staff are staying at home for now and others have gone back full time, most companies are opting for a hybrid working policy. Although this may be a sensible and fair solution for the time being, having staff work both at home and in the office raises several data security and GDPR compliance concerns.

In this webinar, we were joined by Dechert LLP’s Director of Risk and Compliance Mohbub Rahman to explore the key things you need to remember to keep data safe during the latest transition.

The webinar covered:

  • How companies are transitioning back to the office
  • How hybrid working works
  • Data protection risks in a hybrid working environment
  • How hackers and scammers took advantage during the pandemic
  • Best practice for data security with hybrid working

Watch now

Continue reading

But European Commission warns adequacy could be revoked ‘immediately’

What is the EU adequacy decision?

The EU adequacy decision is a legal instrument issued by the European Commission that determines whether a non-EU country or territory provides an adequate level of data protection, enabling the free flow of personal data from EU member states to that country. When the Commission issues an adequacy decision, it considers the country’s data protection framework to offer an adequate level of protection, so personal data can be transferred there without additional safeguards or contractual arrangements.

Continue reading

On the third anniversary of the GDPR coming into force, VinciWorks hosted a webinar looking back at the last three years of the regulation. We explored the effect it has had on the way we collect and process data and discussed what we can expect in the next 12 months.

During the webinar we shared a conversation between our Director of Learning and Content, Nick Henderson, and Richard Hogg, global Information Governance Director at White & Case LLP. Hogg has 20 years of global experience in the field and is responsible for information governance across the firm. He previously worked at IBM, where he played a critical role in its preparations for GDPR, and he speaks regularly on privacy and information governance. Richard shared his expert perspective on GDPR and his views on the future of data protection.

GDPR – Three years on: Watch the full webinar here

Continue reading