
UK now expects compliance with children’s privacy design code

By Natasha Lomas

In the UK, a 12-month grace period for compliance with a design code aimed at protecting children online expires today — meaning app makers offering digital services in the market that are “likely” to be accessed by children (defined in this context as users under 18 years old) are expected to comply with a set of standards intended to safeguard kids from being tracked and profiled.

The age appropriate design code came into force on September 2 last year, but the UK’s data protection watchdog, the ICO, allowed the maximum grace period for hitting compliance to give organizations time to adapt their services.

But from today it expects the standards of the code to be met.

Services where the code applies can include connected toys and games and edtech, but also online retail and for-profit online services, such as social media and video-sharing platforms, which have a strong pull for minors.

Among the code’s stipulations are that a level of ‘high privacy’ should be applied to settings by default if the user is (or is suspected to be) a child — including specific provisions that geolocation and profiling should be off by default (unless there’s a compelling justification for such privacy-hostile defaults).
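
As a rough sketch of what those ‘high privacy’ defaults could look like in application code, consider the following. The design code prescribes outcomes, not APIs, so every type and field name below is hypothetical:

```typescript
// Illustrative only: names and shapes here are invented, not from the ICO code.
interface PrivacySettings {
  geolocationEnabled: boolean;
  profilingEnabled: boolean;
  accountVisibility: "private" | "public";
}

// Apply 'high privacy' defaults whenever the user is, or is suspected to be, under 18.
function defaultSettingsFor(knownOrSuspectedChild: boolean): PrivacySettings {
  if (knownOrSuspectedChild) {
    return {
      geolocationEnabled: false, // off by default, per the code
      profilingEnabled: false,   // off by default, per the code
      accountVisibility: "private",
    };
  }
  // Adult defaults are a product decision; shown here only for contrast.
  return {
    geolocationEnabled: false,
    profilingEnabled: true,
    accountVisibility: "public",
  };
}
```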

The code also instructs app makers to provide parental controls while also providing the child with age-appropriate information about such tools — warning against parental tracking tools that could be used to silently/invisibly monitor a child without them being made aware of the active tracking.

Another standard takes aim at dark pattern design — with a warning to app makers against using “nudge techniques” to push children to provide “unnecessary personal data or weaken or turn off their privacy protections”.

The full code contains 15 standards but is not itself baked into legislation — rather it’s a set of design recommendations the ICO wants app makers to follow.

The regulatory stick to make them do so is that the watchdog is explicitly linking compliance with its children’s privacy standards to passing muster with wider data protection requirements that are baked into UK law.

The risk for apps that ignore the standards is thus that they draw the attention of the watchdog — either through a complaint or proactive investigation — with the potential of a wider ICO audit delving into their whole approach to privacy and data protection.

“We will monitor conformance to this code through a series of proactive audits, will consider complaints, and take appropriate action to enforce the underlying data protection standards, subject to applicable law and in line with our Regulatory Action Policy,” the ICO writes in guidance on its website. “To ensure proportionate and effective regulation we will target our most significant powers, focusing on organisations and individuals suspected of repeated or wilful misconduct or serious failure to comply with the law.”

It goes on to warn it would view a lack of compliance with the kids’ privacy code as a potential black mark against (enforceable) UK data protection laws, adding: “If you do not follow this code, you may find it difficult to demonstrate that your processing is fair and complies with the GDPR [General Data Protection Regulation] or PECR [Privacy and Electronic Communications Regulations].”

In a blog post last week, Stephen Bonner, the ICO’s executive director of regulatory futures and innovation, also warned app makers: “We will be proactive in requiring social media platforms, video and music streaming sites and the gaming industry to tell us how their services are designed in line with the code. We will identify areas where we may need to provide support or, should the circumstances require, we have powers to investigate or audit organisations.”

“We have identified that currently, some of the biggest risks come from social media platforms, video and music streaming sites and video gaming platforms,” he went on. “In these sectors, children’s personal data is being used and shared, to bombard them with content and personalised service features. This may include inappropriate adverts; unsolicited messages and friend requests; and privacy-eroding nudges urging children to stay online. We’re concerned with a number of harms that could be created as a consequence of this data use, which are physical, emotional and psychological and financial.”

“Children’s rights must be respected and we expect organisations to prove that children’s best interests are a primary concern. The code gives clarity on how organisations can use children’s data in line with the law, and we want to see organisations committed to protecting children through the development of designs and services in accordance with the code,” Bonner added.

The ICO’s enforcement powers — at least on paper — are fairly extensive, with GDPR, for example, giving it the ability to fine infringers up to £17.5M or 4% of their annual worldwide turnover, whichever is higher.
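
To make the “whichever is higher” mechanics concrete, here is a tiny illustrative helper. It is a sketch, not anything from the ICO; the function name and currency handling are invented:

```typescript
// The GDPR-style maximum fine is the greater of a fixed cap and a
// percentage of annual worldwide turnover.
function maxFineGBP(annualWorldwideTurnoverGBP: number): number {
  const fixedCap = 17_500_000;                            // £17.5M
  const turnoverCap = 0.04 * annualWorldwideTurnoverGBP;  // 4% of turnover
  return Math.max(fixedCap, turnoverCap);
}

// e.g. a firm with £1B turnover: maxFineGBP(1_000_000_000) === 40_000_000 (£40M),
// so the 4% figure governs; for small firms the £17.5M fixed cap governs instead.
```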

The watchdog can also issue orders banning data processing or otherwise requiring changes to services it deems non-compliant. So apps that choose to flout the children’s design code risk setting themselves up for regulatory bumps or worse.

In recent months there have been signs some major platforms have been paying mind to the ICO’s compliance deadline — with Instagram, YouTube and TikTok all announcing changes to how they handle minors’ data and account settings ahead of the September 2 date.

In July, Instagram said it would default teens to private accounts — doing so for under 18s in certain countries which the platform confirmed to us includes the UK — among a number of other child-safety focused tweaks. Then in August, Google announced similar changes for accounts on its video sharing platform, YouTube.

A few days later TikTok also said it would add more privacy protections for teens. Though it had also made earlier changes to tighten privacy defaults for under 18s.

Apple also recently got itself into hot water with the digital rights community following the announcement of child safety-focused features — including a child sexual abuse material (CSAM) detection tool which scans photo uploads to iCloud; and an opt-in parental safety feature that lets iCloud Family account users turn on alerts related to the viewing of explicit images by minors using its Messages app.

The unifying theme underpinning all these mainstream platform product tweaks is clearly ‘child protection’.

And while there’s been growing attention in the US to online child safety and the nefarious ways in which some apps exploit kids’ data — as well as a number of open probes in Europe (such as this Commission investigation of TikTok, acting on complaints) — the UK may be having an outsized impact here given its concerted push to pioneer age-focused design standards.

The code also combines with incoming UK legislation which is set to apply a ‘duty of care’ on platforms to take a broad-brush, safety-first stance toward users, also with a big focus on kids (and there it’s being broadly targeted to cover all children, rather than just applying to kids under 13, as with the US’ COPPA, for example).

In the blog post ahead of the compliance deadline expiring, the ICO’s Bonner sought to take credit for what he described as “significant changes” made in recent months by platforms like Facebook, Google, Instagram and TikTok, writing: “As the first-of-its-kind, it’s also having an influence globally. Members of the US Senate and Congress have called on major US tech and gaming companies to voluntarily adopt the standards in the ICO’s code for children in America.”

“The Data Protection Commission in Ireland is preparing to introduce the Children’s Fundamentals to protect children online, which links closely to the code and follows similar core principles,” he also noted.

And there are other examples in the EU: France’s data watchdog, the CNIL, looks to have been inspired by the ICO’s approach — issuing its own set of eight child-protection focused recommendations this June (which also, for example, encourage app makers to add parental controls with the clear caveat that such tools must “respect the child’s privacy and best interests”).

The UK’s focus on online child safety is not just making waves overseas but sparking growth in a domestic compliance services industry.

Last month, for example, the ICO announced the first clutch of GDPR certification scheme criteria — including two schemes which focus on the age appropriate design code. Expect plenty more.

Bonner’s blog post also notes that the watchdog will formally set out its position on age assurance this autumn — so it will be providing further steerage to organizations in scope of the code on how to tackle that tricky piece, although it’s still not clear how hard a requirement the ICO will back, with Bonner suggesting it could mean “verifying ages or age estimation”. Watch that space. Whatever the recommendations are, age assurance services are set to spring up with compliance-focused sales pitches.

Children’s safety online has been a huge focus for UK policymakers in recent years, although the wider (and long in train) Online Safety (née Harms) Bill remains at the draft law stage.

An earlier attempt by UK lawmakers to bring in mandatory age checks to prevent kids from accessing adult content websites — dating back to 2017’s Digital Economy Act — was dropped in 2019 after widespread criticism that it would be both unworkable and a massive privacy risk for adult users of porn.

But the government did not drop its determination to find a way to regulate online services in the name of child safety. And online age verification checks look set to be — if not a blanket, hardened requirement for all digital services — increasingly brought in by the backdoor, through a sort of ‘recommended feature’ creep (as the Open Rights Group has warned).

The current recommendation in the age appropriate design code is that app makers “take a risk-based approach to recognising the age of individual users and ensure you effectively apply the standards in this code to child users”, suggesting they: “Either establish age with a level of certainty that is appropriate to the risks to the rights and freedoms of children that arise from your data processing, or apply the standards in this code to all your users instead.” 
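
As a loose illustration of that either/or logic, a service might structure the decision like the sketch below. All names and confidence thresholds are invented for illustration; the code itself sets no numeric bar:

```typescript
// Illustrative only: every name and threshold here is hypothetical.
interface AgeSignal {
  ageKnown: boolean;   // has the service established this user's age?
  confidence: number;  // certainty of that age signal, from 0 to 1
  isUnder18: boolean;  // only meaningful when ageKnown is true
}

// Returns true when the code's standards should be applied to this user.
function shouldApplyChildCode(signal: AgeSignal, processingRisk: "low" | "high"): boolean {
  // The certainty required scales with the risk the data processing poses to children.
  const requiredConfidence = processingRisk === "high" ? 0.95 : 0.7;
  const ageEstablished = signal.ageKnown && signal.confidence >= requiredConfidence;
  // The recommendation's fallback branch: if age can't be established with
  // appropriate certainty, apply the standards to all users instead.
  if (!ageEstablished) return true;
  return signal.isUnder18;
}
```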

At the same time, the government’s broader push on online safety risks conflicting with some of the laudable aims of the ICO’s non-legally binding children’s privacy design code.

For instance, while the code includes the (welcome) suggestion that digital services gather as little information about children as possible, in an announcement earlier this summer UK lawmakers put out guidance for social media platforms and messaging services — ahead of the planned Online Safety legislation — that recommends they prevent children from being able to use end-to-end encryption.

That’s right; the government’s advice to data-mining platforms — which it suggests will help prepare them for requirements in the incoming legislation — is not to use ‘gold standard’ security and privacy (e2e encryption) for kids.

So the official UK government messaging to app makers appears to be that, in short order, the law will require commercial services to access more of kids’ information, not less — in the name of keeping them ‘safe’. Which is quite a contradiction versus the data minimization push of the design code.

The risk is that a tightening spotlight on kids’ privacy ends up being fuzzed and complicated by ill-thought-through policies that push platforms to monitor kids to demonstrate ‘protection’ from a smorgasbord of online harms — be it adult content, pro-suicide postings, cyberbullying or CSAM.

The law looks set to encourage platforms to ‘show their workings’ to prove compliance — which risks resulting in ever closer tracking of children’s activity, retention of their data, and maybe risk profiling and age verification checks (that could even end up being applied to all users; think sledgehammer to crack a nut). In short, a privacy dystopia.

Such mixed messages and disjointed policymaking seem set to pile increasingly confusing — and even conflicting — requirements on digital services operating in the UK, making tech businesses legally responsible for divining clarity amid the policy mess — with the simultaneous risk of huge fines if they get the balance wrong.

Complying with the ICO’s design standards may therefore actually be the easy bit.

 

Regology snares $8M Series A to help navigate maze of global regulations

By Ron Miller

Every country has its own bundle of laws, rules and regulations, and they change on a regular basis, making it an enormous challenge to keep up with it all. That usually requires large staffs filling in spreadsheets and unwieldy processes, but Regology, an early-stage startup, wants to change that by putting some automation to work on the problem.

Today the company announced an $8 million Series A led by Acme Capital with participation from existing investors Gagarin Capital and Pine Wave Investments.

Company co-founder and CEO Mukund Goenka spent more than 15 years working in the banking industry where he saw first-hand the difficulties in keeping up with regulations and the financial consequences of failing to do so. He formed Regology to provide large global companies with a way to stay on top of these myriad regulations.

Goenka says that his company started by compiling a database of laws. “We have a very large database of laws that is constantly updated, covering geographies from five continents, and a number of countries and jurisdictions. We also cover the lawmaking process of going from bills all the way to laws to regulations and a number of agencies and their regular updates on a daily basis. And it covers a number of industries and topic areas as well,” Goenka explained.

They don’t stop there, though. They also give customers a framework for automating compliance wherever they are doing business, and they constantly review the laws and updates to help ensure their customers are staying in compliance over time. Their target market is large Fortune 500 companies, and while Goenka couldn’t name specific ones, he did say that it included some of the largest tech companies and biggest banks.

 

The company launched in 2017 and today has 20 full-time employees, with plans to at least double that by the end of the year. Goenka says that being diverse is essential in a business that is already looking at the regulatory environment in 25 countries. Understanding how each of these countries works is essential to the business, and that requires a diverse workforce.

Goenka says that the company has been remote from day one, long before COVID. While there is still a small office in Palo Alto, he intends to remain mostly remote, even when it’s considered safe to reopen offices.

Regulations can define the best places to build and invest

By Annie Siebert
Noorjit Sidhu, Contributor
Noorjit Sidhu is an early-stage investor at Plug & Play Ventures, focused on investments across data infrastructure and cloud, artificial intelligence, financial services, and the future of learning and work.

Market timing — how relevant an idea is to the current state and direction of a market — is the most important factor in determining the durability of that idea.

Several inputs inform market timing: The skew of consumer preferences in response to a pandemic. The price of goods for a resource that is finite and becoming scarce. The creation of a novel algorithmic or genetic technique that enlarges the potential of what can be streamlined, repaired and built.

But market timing is also defined by a less discussed area that is born not in capital markets but in the public sector — the regulatory landscape — namely, the decisions of government, the broader legal system and its combined level of scrutiny toward a particular subject.

We can understand the successes and challenges of several valuable companies today based on their combustion with the regulatory landscape, and perhaps also use it as an optic to see what areas represent unique opportunities for new companies to start and scale.

Looking back: The value in regulatory gray areas

“The tech comes in and moves faster than regulatory regimes do, or can control it,” Uber co-founder and former CEO Travis Kalanick said at The Aspen Institute in 2013.

The brash statement downplayed that the regulatory landscape had, in fact, driven a number of pivotal outcomes for the company up to that event. It changed its name from UberCab to Uber after receiving a cease-and-desist order in its first market, California. Several early employees left because of the startup’s regulatory challenges and iconoclastic ethos. It shut down its taxi service in New York after just a month of operations, and then in early 2013 received its lifeline in the city after being approved through a pilot program.

Fast forward to the present, and Uber has a market cap of about $82 billion, with the ousted Kalanick having a personal net worth in the neighborhood of $2.8 billion.

Still, even at its scale, many of its most important growth questions centered on how favorably the regulatory landscape would treat its category. Most recently, this came with the U.K. Supreme Court ruling that Uber drivers could not be classified as independent contractors.

The regulatory fabric has had similar leverage over other sharing-economy companies. In October 2014, for example, Airbnb’s business model became viable in San Francisco when Mayor Ed Lee legalized short-term rentals. In November 2015, Proposition F in the city aimed to restrict short-term rentals like Airbnb, and the startup spent millions in advertisements to mobilize voters in opposition.

Airbnb’s current market cap stands at $92 billion, and its CEO, Brian Chesky, has an estimated net worth of over $11 billion. Like Uber, its regulatory tribulations continue; most recently it was fined and ordered to pay $9.6 million to the city of Paris.

The stories of these two companies and others in the sharing economy space demonstrate the value that the regulatory fabric can add to or subtract from a company’s wealth, but also underscore the value — for founding teams, early employees, investors and customers — of navigating the gray areas.

Looking around: The data economy

The present regulatory fabric has precipitated market timing for ideas in a number of categories. Solutions that enable data privacy, like BigID, and ones that embed data privacy into larger customer value propositions, like Blotout, are riding growth tailwinds from the GDPR in Europe and the analogous laws it has inspired in the U.S.
