By Andrew Wychrij, Cyber Security Manager – Consulting, at Reliance Cyber, with contributions from Katherine Arthur, Cyber Security Consultant at Reliance Cyber.
Much like its EU equivalents, the Information Commissioner’s Office (ICO), the UK’s independent regulator for data protection and privacy, has high expectations when it comes to organisations taking individual responsibility for maintaining standards of privacy and data protection.
Critically, the ICO expects firms to be extremely proactive and vigilant around risks in this regard. Ensuring that there is adequate security around data, that a lawful basis for processing exists, and that data subjects are treated fairly and their rights observed are requirements that the ICO and other Data Protection Authorities (DPAs) consistently look for from organisations.
DPAs, notably the ICO, have indicated that they want to see the right behaviours and attitudes from organisations regarding data protection. The ICO has repeatedly warned that the biggest cyber risk to an organisation is not a ‘hacker’ but complacency; companies that leave themselves open to cyber-attacks through a lack of due diligence should expect fines from the ICO.
Of course, many organisations fail to demonstrate the vigilance the ICO and others demand – and are sanctioned accordingly. No sector, however, appears to receive quite the attention the tech industry does in this regard – and recent record fines, the latest being a staggering €1.2 billion issued to Meta in May 2023, show no sign of this trend abating.
Singled out by DPAs?
It takes little more than a casual understanding of data protection and privacy developments to know that tech giants are very regularly in the firing line. Nine of the ten largest GDPR fines (as of 22 May 2023) have been issued to firms in this sector – notably Amazon, Google and Meta.
These fines, headed by the record €1.2 billion penalty given to Meta in May 2023 (beating the former record of €746 million issued to Amazon in 2021), have been levied against tech firms with increasing regularity – and with values that dwarf those given to other companies.
To illustrate this point, the largest fine given to a non-tech firm was €35.3 million, issued to retailer H&M in 2020 by the Data Protection Authority in Hamburg, Germany, for illegal surveillance of its employees. This, while undoubtedly serious, would not even make the top ten of GDPR’s largest fines.
The question, then, is why do tech firms butt up against DPAs so often? One could look at the millions of euros in fines and conclude that tech giants are simply valuable cash cows for regulators, but there is also an underlying perception in the public sphere that their attitude towards data handling is lax and their respect for data rights severely lacking.
Too much data, not enough rigour?
Tech firms process huge volumes of personal data, particularly given that many rely so heavily on using that data for revenue-generating purposes – generally targeted and non-targeted advertising. The sheer amount of data collected – including everything from location information to browsing history and biometric data – presents a large risk to data subjects, especially given it is often collected without the individuals’ realisation or understanding.
Case study on data collection: Snapchat AI chatbot
A trending series of screenshots from Snapchat’s AI chat feature has been making its way around social media. In them, a woman shows growing concern that the chatbot knows she has a child and knows the child’s name; later in the conversation, the AI bot confirms that it was able to work this out from previously shared information, such as location data. The woman pleads with Snapchat users that she is worried about how this ‘bot’ could have this information, and wonders what other data the app is collecting.
With this high-volume processing comes a large amount of responsibility: keeping that data secure, respecting the rights of individuals and ensuring that individuals are informed about how their data is being used. Whilst data can be used for a variety of purposes, under the UK/EU GDPR it should be collected for an explicit, legitimate purpose and not processed beyond that – and individuals should understand that purpose and the implications the processing has for them.
Tech giants have regularly been criticised for a lack of transparency around data collection and use. Many tech companies have murky policies when it comes to data privacy, and it can be difficult for individuals to understand exactly what information is being collected and how it is being used.
This is, admittedly, not always easy to do correctly (be it through appropriate privacy notices or other means), but it is a vital obligation under data protection regulations – and ultimately is underlined by the desire of an organisation to act transparently and in good faith towards their users and customers.
Tech firms have habitually fallen foul of this requirement. There have already been numerous instances of tech companies being accused of using their data and algorithms to manipulate users – from the Cambridge Analytica scandal to accusations of bias in search results and social media algorithms – and, bearing this in mind, it is little wonder that they have attracted so much scrutiny from regulators and distrust from the public at large.
A welfare issue – processing children’s data
One of the largest, and most contentious, pitfalls for customer-facing firms is the processing of children’s data. This is particularly important for social media providers, where there is a high likelihood of children wanting to use their services and therefore having their data processed.
The UK/EU GDPR make special provisions for children’s data, intended to enhance the protection of children’s personal data and to ensure that children are addressed in plain, clear language that they can understand – especially where parental consent is not required or received.
Regulators take a dim view of firms not adequately protecting children’s data rights – and tech firms have been found wanting in this regard on several occasions. In the UK, we have recently seen tech giant TikTok receive a £12.7 million fine for several data protection breaches, including failing to use children’s personal data lawfully and failing to implement the policies it had laid out to protect children.
As with most aspects of data protection compliance, there are no easy solutions to the problems firms face here, but appropriate rigour – risk assessments, child-friendly design, transparency, limiting features on children’s profiles (for example, turning geolocation off by default) and preventing children’s data from being shared – will get you a long way, as the sketch below illustrates. This rigour, however, is all too often absent.
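To make this concrete, here is a minimal sketch – in Python, with hypothetical feature and function names, not any platform’s real API – of what ‘privacy by default’ on children’s profiles might look like in code: high-risk features start switched off for everyone, and a child’s account cannot opt back in to them at all.

```python
# Hypothetical privacy-by-default settings for a social platform profile.
HIGH_RISK_FEATURES = {"geolocation", "data_sharing", "targeted_ads"}

def default_settings() -> dict[str, bool]:
    """Every high-risk feature starts disabled, for all users."""
    return {feature: False for feature in HIGH_RISK_FEATURES}

def enable_feature(settings: dict[str, bool], feature: str, age: int) -> dict[str, bool]:
    """Adults may opt in through an explicit action; children's
    profiles cannot enable high-risk features at all."""
    if feature in HIGH_RISK_FEATURES and age < 18:
        raise PermissionError(f"'{feature}' cannot be enabled on a child's profile")
    return {**settings, feature: True}

# Example: an adult explicitly opts in; the same call for a child raises.
adult_settings = enable_feature(default_settings(), "geolocation", age=34)
```

The point of the design is that the safe state is the default state: no child’s data is shared simply because someone forgot to untick a box.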
Respect for data users – a question of transparency
The modern world is complicated where data is concerned. With ever-increasing numbers of users of online platforms, apps, browsers and so on, we cannot expect every person to fully understand their rights as an individual. There is a plethora of examples concerning tech companies that illustrate this – firms using data in a manner that their customers might not expect or understand.
Take behavioural advertising – serving adverts targeted using user-profile information – as an example. Meta was fined €390 million in January 2023 by the Irish Data Protection Commission (DPC) for relying on contract terms as a lawful basis for personalised advertising on Instagram and Facebook. The DPC ruled that using the terms people accept when they sign up for Meta’s platforms as a basis for behavioural advertising was invalid. This seems a reasonable conclusion – how can users be sure what their data is being used for if they have not given informed, explicit opt-in consent?
Another potential issue is international data transfers. Transfers of personal data between the EU and US (where the Metas, Googles, Amazons and Apples of this world are typically based) are viewed sceptically by Europeans, particularly given the possibility of the US government using that data for surveillance. This creates a danger that the rights of UK and EU nationals will be undermined – particularly if their data is sent to the US without their knowledge and without appropriate additional safeguards in place. This is exactly what prompted the DPC to fine Meta a colossal €1.2 billion in May 2023, after a decade-long complaint.
Tech firms need to be responsible partners in protecting individuals’ data. Ensuring that there is data privacy by design – where the rights and freedoms of users are considered at the set-up phase of a project – is vitally important. Equally so is ensuring that users have a clear choice and meaningful control over how their data is used, while organisations show that there is genuine transparency and accountability on their side; the sketch below shows one way this can be built in from the start.
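As an illustration only – hypothetical function names, not a real compliance framework – a consent-aware design might gate every processing purpose on a recorded, explicit opt-in, so that processing without a basis is impossible rather than merely discouraged:

```python
from datetime import datetime, timezone

# Hypothetical minimal consent register: (user, purpose) -> time of opt-in.
# Entries are created only by an explicit user action, never by default.
consent_register: dict[tuple[str, str], datetime] = {}

def record_opt_in(user_id: str, purpose: str) -> None:
    consent_register[(user_id, purpose)] = datetime.now(timezone.utc)

def withdraw_consent(user_id: str, purpose: str) -> None:
    # Withdrawing consent must be as easy as giving it.
    consent_register.pop((user_id, purpose), None)

def process(user_id: str, purpose: str, data: dict) -> None:
    # Purpose limitation: no recorded opt-in for this exact purpose, no processing.
    if (user_id, purpose) not in consent_register:
        raise PermissionError(f"no consent recorded for purpose '{purpose}'")
    # ... processing strictly limited to the stated purpose ...

# Example: processing for behavioural advertising only works after an opt-in.
record_opt_in("user-42", "behavioural_advertising")
process("user-42", "behavioural_advertising", {"page_views": 17})
```

A design like this also makes accountability easier to demonstrate: the register itself is a record of who consented to what, and when.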
A long road ahead
With the vast amounts of personal information that tech companies collect and store, many data-savvy individuals are worried about how this information is being used and who has access to it. There is also concern about the potential for abuse: with so much personal information at their disposal, these companies have the power to shape public opinion and even influence political outcomes.
Data privacy and data protection are complex and multifaceted; it will require a concerted effort from individuals, governments and the tech industry itself to find solutions that balance the benefits of technological innovation with the need to protect privacy and individual rights. This is all the more important with the rise of sophisticated AI solutions and new technologies that could benefit companies at the expense of the rights of individuals.
GDPR is certainly beginning to show it has teeth – though many of the rulings made against tech giants (notably Meta’s €1.2 billion and Amazon’s €746 million fines) are being fiercely fought in the courts, and it remains unclear what changes these penalties will drive to the benefit of data subjects at large.
For now, individuals need to remain vigilant when sharing their data. Of course, tech giants bring plenty of real-life benefits, making the world more connected and many services more innovative and easier to access than ever. However, how that data is being used is often hard to disentangle, and the tech giants have shown that they have some way to go to demonstrate the accountability and good practice DPAs demand.
How can we help?
If you would like to understand more about data privacy and protection, get in touch to arrange a free consultation with one of our experts today.