Review: Wireless headsets from Logitech, Audio-Technica, SteelSeries, HyperX and more

By Devin Coldewey

With the amount of time you’re spending at home these days, you deserve a better headset. A wireless one that works with your computer and maybe your console as well, with a mic for calls and great sound for games and movies. Fortunately there are a lot to choose from, and I’ve tested out your best options.

I asked the leading audio and peripheral companies to send over their flagship wireless headset, with prices ranging from about $100 to $250. Beyond this price range returns diminish swiftly, but right now that’s the sweet spot for comfort, sound and usability.

For years I’ve avoided wireless headsets because there were too many compromises, but I’m pleased to say that the latency has been eliminated and battery life in the ones I reviewed is uniformly excellent. (NB: If the wireless version feels too expensive, you can often get wired ones for $50-100 less.)

To test the headphones, I used them all for a variety of everyday tasks, from video calls to movies and music (with only minimal EQing to get a sense of their natural sound) to AAA games and indies. None require an app to work, though some have companion software for LEDs or game profiles. I have a fairly large head and medium-sized ears, for what it’s worth. All the headphones are rather bulky, though the angle I shot them at individually makes them look huge — you can see in the image up top that they’re all roughly the same size.

None of these headphones have active noise cancelling, but many offer decent physical isolation, to the point where they offer a “monitor” feature that pipes in sound from the outside world — useful if you’re playing a game but waiting for the oven to preheat or something. Only the first set has a built-in mic; the rest have detachable ones of generally solid quality, certainly good enough for streaming and chatting, though for broadcast a separate one would be better. All these headphones use a USB-A style dongle, though the 7P/7X also has a USB-C connector.

SteelSeries 7P/7X – $149

Image Credits: Devin Coldewey / TechCrunch

The 7P and 7X headsets, designed with the PS5 and Xbox Series X in mind (as well as PC) respectively, are my first and most unreserved recommendation.

The standout feature on these is, to me, a truly surprising sound with an almost disturbingly broad stage and clarity. I almost couldn’t believe what I was hearing when I put on some familiar tracks I use for reference. This isn’t a 7.1 simulation or anything like that — but no doubt the gaming focus led to creating a large soundstage. It worked!

I also found the headphones to be very comfortable, with a “ski goggle” strap instead of a notched band adjustment, which lets them sit very lightly as well as “remember” your setting. The spacious earcups rotate for travel or comfort.

The built-in mic is unobtrusive and stows away nicely, but if you’re picky about placement, it’s a bit floppy to adjust. Many of the other headsets have nicer mics that completely detach — maybe that’s a plus for you, but I tend to lose them.

My main issues with these are that the controls feel cheap and not particularly well laid out. The bottom of the headset is a jumble of ports and buttons and the volume dials don’t have much travel — it’s 0 to 100 in one full swipe. (Volume control is independent from system volume.)

The dongle is different from the others in that it is itself USB-C, but with a USB-A cable attached. That’s good for compatibility, but the cable is three feet long, making it kind of silly to attach to some laptops and whatnot. You could easily get your own short cord, though.

At $150 I think these are an easy recommendation for just about anyone looking at that price range.

Audio-Technica AT-GWL – $250

Image Credits: Devin Coldewey / TechCrunch

The high price on these is partly because they are the wireless version of a headset that also comes wired, so if you want the solid audio performance and comfy fit, you can save some money by going wired.

The sound of the AT-GWLs is rich and naturally has a focus on the upper-mid vocal range, which makes voices in media really pop. I did find the sound a bit confined, something the “surround” setting actually helped with. I know this sort of virtualization has generally been frowned on, but it’s been a while since these settings were over the top and distortive. I found surround better for games than for music, but it’s very easy to switch on and off.

The headphones are light and adjusted with traditional, no-nonsense metal bands, with a single pad on the top. I would say they are the lightest-feeling pair I tested, with the SteelSeries and Razer coming in just behind owing to some extra weight and bulk. Despite being compact, the AT-GWLs felt airy but not big. The leather-microfiber combo cups are nice, and I think they’ll break in well to provide better isolation over time.

Where they fall short is in the interface. First, a note to Audio-Technica: Turn down the notification noises! Turning the headset on, the mic on or off or hitting the system-independent volume max produces loud, surprising beeps. Too loud!

Second, the buttons and dials are stiff, small and same-feeling. Lifting a hand quickly to turn down the volume (maybe after a huge beep), you may very easily mistake the power switch for the volume dial. The dial also doubles as a button for surround mode, and next to it is a microscopic button to turn on and off the sound of surroundings. It’s a bit of a jumble — nothing you can’t get used to, but considering how nice other headsets on this list made their controls, it has to be said.

HyperX Cloud II Wireless – $100

Image Credits: Devin Coldewey / TechCrunch

HyperX (owned by Kingston) wasn’t exactly known for audio until fairly recently, but its previous Cloud headset got the crucial Wirecutter endorsement, and it’s easy to see why. For less money than any of the other headsets in this roundup, the follow-up to that headset (which I’m wearing right now) has excellent sound and isolation.

I was surprised to find a soundstage nearly as wide as the 7P/7X, but with more of a focus on the punchy lower register instead of on detail and placement. My music felt big and close, and the atmosphere of games likewise, more immediately present.

The Cloud II’s controls are simple and effective. The volume dial, tied directly to the system volume, is superb: grippy, with smooth motion and just the right amount of friction, and just-barely-there clicks. There are two good-size buttons, the power one concave and the mic mute (which gives different sounds for muted and active) convex.

It’s unfortunate that they’re not as comfortable, for me anyway, as the others on this list. The cups (though a bit on the warm side) and band are perfectly fine. It’s that there’s little rotation to those cups, meaning there’s no play to accommodate the shape of your head. I don’t know, maybe it’s just my big dome, but they were noticeably tighter at the front of my ear than the back, so I was constantly adjusting or trying to twist them.

I’ll say this: If they add a bit more adjustment to the cups, these would be my default recommendation over the 7P/7X. As exciting as the SteelSeries sound is to me, the Cloud IIs seem more like what people expect, and are $50 cheaper.

Logitech G733 – $130

The matte texture of the G733s had a weird interaction with my camera — they don’t look speckly IRL. Image Credits: Devin Coldewey / TechCrunch

These are Logitech’s streamer-friendly, color-coordinated, LED-sporting set, but they’re better than the loud design would suggest.

The sound is definitely gaming-forward, with an emphasis on the low end and a very central, present sound that’s a lot like the Cloud II.

To be honest, I was not expecting the G733s to be very comfortable — their stiff plastic look suggested they’d creak, weigh down my ears and crush my noggin. But in fact they’re really light and quite comfy! There’s a lot of play in the positions of the earcups. The fit is a little odd in that there’s a plainly inferior version of the 7P/7X’s “ski goggle” strap that really only has four settings, while the cups slide up and down about two thirds of an inch. It was just enough to accommodate my (again, apparently very large) head.

The mic boom is rather short, and sadly there is no indicator for when the mic is on or off, which is sometimes a minor inconvenience and sometimes a major pain. You can tell from the sound the mute button makes, though.

The volume dial is nice and smooth, though the “clicks” are really far apart. I like the texture of it and the mic mute button, the power button not so much. But it works.

The colors may not be to everyone’s liking, but I have to hand it to Logitech for going all the way. The headset, mic and even the USB dongle are all the same shade, making it much easier to keep track of them in my growing pile of headphones and widgets.

Logitech Pro-X – $200

Image Credits: Devin Coldewey / TechCrunch

Currently Logitech’s most premium set of gaming headphones, the Pro-X abandons the bright, plasticky look of its other sets and goes for understated black.

The sound of the Logitech is big and very clear, with almost a reference feel in how balanced the bands are. I felt more presence in the mid-lows of smart bass-playing than the other sets. There is a “surround” feel that makes it feel more like you’re in a room of well-configured speakers than headphones, something that I think emerges from a de-emphasis of the center channel. The media is “out there,” not “in here.” It’s not a bad or a good thing, just distinct from the others.

The controls are about on par with the Cloud II’s: a nice frictiony volume wheel controlling system volume, a nice mic toggle button and a fairly meaty on-off switch you’re unlikely to trip on purpose.

Also like the Cloud IIs, there is no rotation to the earcups, making them less comfortable to me than the ATs and SteelSeries, and Logitech’s cheaper G733s. A larger head than my own, if that’s possible, would definitely feel clamped. I do think these would wear in well, but all the same a bit of play would help a lot.

The external material, a satinized matte plastic, looks truly lovely but is an absolute fingerprint magnet. Considering you’ll be handling these a lot (and let’s be honest, not necessarily with freshly washed hands), you’re going to need to wipe them down rather more than any of the others I tested.

Razer Blackshark V2 Pro – $180

Image Credits: Devin Coldewey / TechCrunch

The understated Razer Blackshark V2 Pro soon became my go-to for PC gaming when the SteelSeries set was attached to the PS5.

Their sound is definitely gaming-focused, with extra oomph in the lows and mid-lows, but music didn’t sound overly shifted in that direction. The soundstage is full but not startlingly so, and everything sounded detailed without being harsh.

The Razers look heavy but aren’t — it varies day to day but I think they’re definitely competing for “most comfortable” with the A-Ts and SteelSeries. The cups feel spacious and have a nice seal, making for a very isolated listening experience. Adjustment is done with the wires attached to the cups, which is nothing special — I kind of wish this setup would let you adjust the cant as well as the height. The material is like the Logitechs — prone to fingerprints, though a little less so, in my experience.

Their controls are very well designed and laid out, all on one side. The protruding (system-independent) volume knob may seem odd at first but you’ll love it soon. The one big notch or click indicates exactly 50%, which is super useful for quick “calibration,” and turning the knob is smooth yet resistant enough that I never once accidentally changed it. Meanwhile there are conveniently placed and distinguishable buttons for mute and power, and ports for the detachable mic, charge cord and 3.5mm input.

I’m hard pressed to think of any downsides to the Blackshark except that it doesn’t work with consoles.

3 ways the pandemic is transforming tech spending

By Walter Thompson
Eric Tan Contributor
Eric Tan is Senior Vice President of IT and Business Services at Coupa, a leader in business spend management and a former Battery portfolio company.
Scott Goering Contributor
Scott Goering is VP of Business Development at Battery Ventures.

Ever since the pandemic hit the U.S. in full force last March, the B2B tech community keeps asking the same questions: Are businesses spending more on technology? What’s the money getting spent on? Is the sales cycle faster? What trends will likely carry into 2021?

Recently we decided to join forces to answer these questions. We analyzed data from the just-released Q4 2020 Outlook of the Coupa Business Spend Index (BSI), a leading indicator of economic growth, in light of hundreds of conversations we have had with business-tech buyers this year.

A former Battery Ventures portfolio company, Coupa is a business spend-management company that has cumulatively processed more than $2 trillion in business spending. This perspective gives Coupa unique, real-time insights into tech spending trends across multiple industries.

Broadly speaking, tech spending is continuing despite the economic recession — which helps explain why many tech startups are raising large financing rounds and even tapping the public markets for capital. Here are our three specific takeaways on current tech spending:

Spending is shifting away from remote collaboration to SaaS and cloud computing

Tech spending ranks among the hottest boardroom topics today. Decisions that used to be confined to the CIO’s organization are now operationally and strategically critical to the CEO. Multiple reasons drive this shift, but the pandemic has forced businesses to operate and engage with customers differently, almost overnight. Boards recognize that companies must change their business models and operations if they don’t want to become obsolete. The question on everyone’s mind is no longer “what are our technology investments?” but rather, “how fast can they happen?”

Spending on WFH/remote collaboration tools has largely run its course in the first wave of adaptation forced by the pandemic. Now we’re seeing a second wave of tech spending, in which enterprises adopt technology to make operations easier and simply keep their doors open.

SaaS solutions are replacing unsustainable manual processes. Consider Rhode Island’s decision to shift from in-person citizen surveying to using SurveyMonkey. Many companies are shifting their vendor payments to digital payments, ditching paper checks entirely. Utility provider PG&E is accelerating its digital transformation roadmap from five years to two years.

The second wave of adaptation has also pushed many companies to embrace the cloud.

Similarly, the difficulty of maintaining a traditional data center during a pandemic has pushed many companies to finally shift to cloud infrastructure. As they migrate that workload to the cloud, the pie is still expanding: Goldman Sachs and Battery Ventures data suggest $600 billion worth of disruption potential will bleed into 2021 and beyond.

In addition to SaaS and cloud adoption, companies across sectors are spending on technologies to reduce their reliance on humans. For instance, Tyson Foods is investing in and accelerating the adoption of automated technology to process poultry, pork and beef.

All companies are digital product companies now

Mention “digital product company” in the past, and we’d all think of Netflix. But now every company has to reimagine itself as offering digital products in a meaningful way.

Grab-Singtel and Ant Group win digital bank licenses in Singapore

By Manish Singh

Singapore on Friday granted four firms, including Ant Group and Grab, licenses to run digital banks in the Southeast Asian country, in a move that would allow the tech giants to expand their financial services offerings.

The nation’s central bank, Monetary Authority of Singapore (MAS), said it applied a “rigorous, merit-based process” to select a strong slate of digital banks. As these digital banks start their pilot operations, MAS said it will review whether more companies could be granted this license.

A total of 21 firms, including TikTok-parent firm ByteDance, had applied for a digital bank license, of which 14 met the eligibility criteria, MAS said. Tech giants see a major opportunity in expanding to financial services as a way to supercharge their revenue in the rapidly growing region.

The other two licenses went to an entity wholly owned by internet giant Sea, and a consortium of Greenland Financial Holdings, Linklogis Hong Kong and Beijing Cooperative Equity Investment Fund Management.

Like traditional banks, Grab-Singtel and Sea will be able to offer customers banking accounts, debit and credit cards and other services. Digital wholesale banks — the licenses for which went to the Ant-owned entity and the Greenland Financial consortium — will serve small and medium-sized businesses. None of them will be required to have a physical presence.

“We expect them to thrive alongside the incumbent banks and raise the industry’s bar in delivering quality financial services, particularly for currently underserved businesses and individuals,” said MAS MD Ravi Menon in a statement. A handful of countries, including the U.K., India and Hong Kong, have streamlined their regulations in recent years to grant tech companies the ability to operate as digital banks.

Ride-hailing firm Grab and telecom operator Singtel formed a consortium last year to apply for the digital full bank license. Their combined experience and expertise “will further our goal to empower more people to gain better control of their money and achieve better economic outcomes for themselves, their businesses and families,” said Anthony Tan, Group CEO & co-founder of Grab, in a statement Friday.

“Over the years, Ant Group has accumulated substantial experience and proven success, especially in China where we work with partner financial institutions to serve the needs of SMEs,” Ant Group said in a statement. “We look forward to building stronger and deeper collaborations with all participants in the financial services industry in Singapore.”

Health tech venture firm OTV closes new $170 million fund and expands into Asia

By Catherine Shu

OTV (formerly known as Olive Tree Ventures), an Israeli venture capital firm that focuses on digital health tech, announced it has closed a new fund totaling $170 million. The firm also launched a new office in Shanghai, China to spearhead its growth in the Asia Pacific region.

OTV currently has a total of 11 companies in its portfolio. This year, it led rounds in telehealth platforms TytoCare and Lemonaid Health, and its other investments include genomic machine learning platform Emedgene; microscopy imaging startup Scopio; and at-home cardiac and pulmonary monitor Donisi Health. OTV has begun investing in more B and C rounds, with the goal of helping companies that have already validated products deal with regulations and other issues as they expand.

OTV focuses on digital health products that have the potential to work in different countries, make healthcare more affordable and fill gaps in overwhelmed healthcare systems.

Jose Antonio Urrutia Rivas will serve as OTV’s head of Asia Pacific, managing its Shanghai office and helping the firm’s portfolio companies expand in China and other Asian countries. This brings OTV’s offices to a total of four, with other locations in New York, Tel Aviv and Montreal. Before joining OTV, Rivas worked at financial firm LarrainVial as its Asian market director.

OTV was founded in 2015 by general partners Mayer Gniwisch, Amir Lahat and Alejandro Weinstein. OTV partner Manor Zemer, who has worked in Asian markets for over 15 years and spent the last five living in Beijing, told TechCrunch that the firm decided it was the right time to expand into Asia because “digital health is already highly well-developed in many Asia-Pacific countries, where digital health products complement in-person healthcare providers, making that region a natural fit for a venture capital firm specializing in the field.”

He added that OTV “wanted to capitalize on how the COVID-19 pandemic has thrust the internationalized and interconnected nature of the world’s healthcare infrastructures into the limelight, even though digital health was a growth area long before the pandemic.”

Boost ROI with intent data and personalized multichannel marketing campaigns

By Walter Thompson

Coronavirus is causing large and small businesses to drastically cut marketing budgets. In Forrester’s self-described “most optimistic scenario,” the analysts project a 28% drop in U.S. marketing spend by the end of 2021. Even Google is cutting its marketing budget in half. As marketers move forward, Forrester predicts marketing automation platforms will grow despite an overall decline in marketing technology investment.

Automation platforms help marketers scale their communications. However, scaling communications is not a substitute for intimacy, which all humans crave. Because of the pandemic, it is harder than ever to get attention, let alone make a connection. More mass email blasts from your marketing automation platform are not going to get you the connections with prospects you crave. So how should marketers proceed? Direct mail captures 100% of your audience’s attention. It provides a sensory experience for your prospects and customers, and that helps establish an emotional connection.

Winning marketers are strategically merging automation and digital data with the more intimate channel of direct mail. We call this tactile marketing automation (TMA).

TMA is the integration of direct mail or personalized swag with a marketing automation platform. With TMA, a marketer doesn’t have to think about creating direct mail campaigns outside of digital campaigns. Rather, direct mail experiences are already fully integrated into the pre-built customer journey.

TMA uses intent data to inform content, messaging and the timing of direct mail touchpoints that maximize relevancy and scalability. Multichannel campaigns including direct mail report an ROI 18 percentage points higher than those without direct mail. Plus, 84% of marketers state direct mail improves multichannel campaign performance.

Read on to see how you can merge digital communications and direct mail to deliver remarkable experiences that spark a connection.

Incorporate intent data

Personalization is a key ingredient of a remarkable experience. Many marketers automate processes by introducing marketing software and then call it personalization. But oftentimes it’s just quicker batching and blasting. Brands can’t just change the first name on a piece of content and call it “personalized.” Real personalization is vital for real results. Our consumers expect more. The best way to introduce real personalization within a marketing mix is to use intent data and trigger-driven campaigns.
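To make the trigger-driven idea concrete, here is a minimal, purely hypothetical Python sketch — the scoring scale, threshold and template names are all invented for illustration and don’t come from any particular TMA vendor:

```python
from dataclasses import dataclass
from typing import Optional

# Everything here is illustrative: the score scale, the threshold and
# the template naming are invented, not taken from any real product.
@dataclass
class Prospect:
    name: str
    company: str
    topic: str            # e.g. "spend management", from intent-data signals
    intent_score: float   # 0-100 composite intent score

INTENT_THRESHOLD = 75.0   # arbitrary "in-market" cutoff

def next_touch(prospect: Prospect) -> Optional[dict]:
    """Trigger a direct-mail touch only when intent data says the
    prospect is actively researching the topic; otherwise stay digital."""
    if prospect.intent_score < INTENT_THRESHOLD:
        return None  # remain in the lighter-weight email/digital track
    return {
        "channel": "direct_mail",
        "template": f"{prospect.topic}-welcome-kit",  # content matched to intent topic
        "headline": f"{prospect.name}, here's what {prospect.company} could save",
    }

if __name__ == "__main__":
    print(next_touch(Prospect("Dana", "Acme Corp", "spend management", 82.0)))
```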

Six-Word Sci-Fi: Stories Written By You

By WIRED Readers
Here's this month's prompt, how to submit, and an illustrated archive of past favorites.

Neuroglee gets $2.3 million to develop digital therapeutics for neurodegenerative diseases

By Catherine Shu

There are now about 50 million people with dementia globally, a number the World Health Organization expects to triple by 2050. Alzheimer’s is the leading cause of dementia and caregivers are often overwhelmed, without enough support.

Neuroglee, a Singapore-based health tech startup, wants to help with a digital therapeutic platform created to treat patients in the early stages of the disease. Founded this year to focus on neurodegenerative diseases, Neuroglee announced today it has raised $2.3 million in pre-seed funding.

The round was led by Eisai Co., one of Japan’s largest pharmaceutical companies, and Kuldeep Singh Rajput, the founder and chief executive officer of predictive healthcare startup Biofourmis.

Neuroglee’s prescription digital therapy software for Alzheimer’s, called NG-001, is its main product. The company plans to start clinical trials next year. NG-001 is meant to complement medication and other treatments, and once it is prescribed by a clinician, patients can access its cognitive exercises and tasks through a tablet.

The software tracks patients’ progress, such as the speed of their fingers and the time it takes to complete an exercise, and delivers personalized treatment programs. It also has features to address the mental health of patients, including one that shows images that can bring up positive memories, which in turn can help alleviate depression and anxiety when used in tandem with other cognitive behavioral therapy techniques.

For caregivers and clinicians, NG-001 helps them track patient progress and their compliance with other treatments, like medications. This means that healthcare providers can work closely with patients even remotely, which is especially important during the COVID-19 pandemic.

Neuroglee founder and CEO Aniket Singh Rajput told TechCrunch that its first target markets for NG-001 are the United States and Singapore, followed by Japan. NG-001 needs to gain regulatory approval in each country, and it will start by seeking U.S. Food and Drug Administration clearance.

Once it launches, clinicians will have two ways to prescribe NG-001, through their healthcare provider platform or an electronic prescription tool. A platform called Neuroglee Connect will give clinicians, caregivers and patients access to support and features for reimbursement and coverage.

Google shutting down Poly 3D content platform

By Lucas Matney

Google is almost running out of AR/VR projects to kill off.

The company announced today in an email to Poly users that it will be shutting down the 3D-object creation and library platform “forever” next year. The service will shut down on June 30, 2021, and users won’t be able to upload 3D models to the site starting April 30, 2021.

Poly was introduced as a 3D creation tool optimized for virtual reality. Users could easily create low-poly objects with in-VR tools. The software was designed to serve as a lightweight way to create and view 3D assets that could in turn end up in games and experiences, compared to more art and sculpting-focused VR tools like Google’s Tilt Brush and Facebook’s (now Adobe’s) Medium software.

Google has already discontinued most of its AR/VR plays, most notably the Daydream mobile VR platform.

The AR/VR industry’s initial rise prompted plenty of 3D-centric startups to bet big on creating or hosting a library of digital objects. As investor enthusiasm has largely faded and tech platforms hosting AR/VR content have shuttered those products, it’s less clear where the market is for this 3D content for the time being.

Users who have uploaded objects to Poly will be able to download their data and models ahead of the shutdown.

Okay nabs funding from Sequoia to build performance dashboards for engineering managers

By Lucas Matney

Amid the pandemic, workplace cultures have been turned on their heads; meanwhile, investment and growth haven’t slowed for many tech companies, which must still onboard new engineering managers even while best practices for remote management are far from codified.

Because of remote work habit shifts, plenty of new tools have popped up to help engineers be more productive, or quickly help managers interface with direct reports more often. Okay is taking a more observational route, aiming to give managers dashboards that quantify the performance of their teams so that they can get a picture of where they have room to improve.

The startup, which launched out of Y Combinator earlier this year, tells TechCrunch they’ve raised $2.2 million in funding led by Sequoia and are launching the open beta of their service.

Co-founders Antoine Boulanger and Tomas Barreto met while working at Box — Boulanger as a senior director of engineering and Barreto as a VP of engineering. They told TechCrunch that in the process of building out a suite of in-house tools designed to help managers at Box understand their teams better, they realized the opportunity for a subscription toolset that could help managers across companies. For the most part, Boulanger says that today Okay is largely replacing tools built in-house as well.

Getting a picture of an engineering team’s productivity means plugging into these toolsets and gathering data into a digestible feed. Okay can be integrated with a number of toolsets, including software like GitHub, PagerDuty, CircleCI and Google Calendar.
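As a rough illustration of the kind of signal such a tool might derive from those sources, here is a short Python sketch that computes average pull-request time-to-merge from GitHub’s public REST API — the repository name and token are placeholders, and this is a sketch of the general technique, not Okay’s actual pipeline:

```python
import os
from datetime import datetime

import requests

GITHUB_API = "https://api.github.com"
REPO = "your-org/your-repo"          # placeholder repository
TOKEN = os.environ["GITHUB_TOKEN"]   # placeholder auth token

def parse_ts(ts: str) -> datetime:
    # GitHub timestamps look like "2020-12-01T09:30:00Z"
    return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%SZ")

def average_merge_hours(repo: str) -> float:
    """Average hours from PR creation to merge over recent closed PRs."""
    resp = requests.get(
        f"{GITHUB_API}/repos/{repo}/pulls",
        params={"state": "closed", "per_page": 100},
        headers={"Authorization": f"token {TOKEN}"},
    )
    resp.raise_for_status()
    merged = [pr for pr in resp.json() if pr["merged_at"]]
    if not merged:
        return 0.0
    hours = [
        (parse_ts(pr["merged_at"]) - parse_ts(pr["created_at"])).total_seconds() / 3600
        for pr in merged
    ]
    return sum(hours) / len(hours)

if __name__ == "__main__":
    print(f"Avg PR time-to-merge: {average_merge_hours(REPO):.1f}h")
```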

“Part of the problem for managers is that there are so many tools, so how do you get signal from the noise?” Barreto tells TechCrunch.

A large part of Okay’s sell seems to be ensuring that managers can keep an active eye on the common pitfalls of rapid scaling and keep them in check so that they can keep direct reports satisfied. On an individual basis, managers can quickly see stats related to how much of an individual manager’s time is being spent in meetings compared to uninterrupted “maker time” where they actually have the ability to get work done.

People don’t like to be micro-managed and the idea that everything you do is feeding into a pie chart that judges whether you’re a good employee or not isn’t the most savory sell for engineers. Okay’s founders hope they can strike a balance and give managers data that they’re not tempted to over-rely on, instead defaulting to team-level insights when they can so that managers are dialed into general trends like how long projects are taking on average or how long it takes for pull requests to be reviewed.

Investors have been bankrolling remote work tools at a heightened pace for the last several months and things have been especially fortunate for young companies that were ahead of the trend. Barreto, for his part, has served as a scout at Sequoia since 2018 according to his LinkedIn.

The team says their product, as it stands today, is best fit for companies with 50-200 engineers that are high-growth and perhaps going through some of those growing pains. The company’s early customers include teams at Brex, Plaid and Split.

Europe will push to work with the US on tech governance, post-Trump

By Natasha Lomas

The European Union said today that it wants to work with US counterparts on a common approach to tech governance — including pushing to standardize rules for applications of technologies like AI and pushing big tech to be more responsible for what their platforms amplify.

EU lawmakers are anticipating rebooted transatlantic relations under the incoming administration of president-elect Joe Biden.

The Commission has published a new EU-US agenda with the aim of encouraging what it bills as “global cooperation — based on our common values, interests and global influence” in a number of areas, from tackling the coronavirus pandemic to addressing climate change and furthering a Western geopolitical agenda.

Trade and tech policy is another major priority for the hoped-for reboot of transatlantic relations, starting with an EU-US Summit in the first half of 2021.

Relations have of course been strained during the Trump era, as the sitting US president has threatened the bloc with trade tariffs, berated European nations for not spending enough on defence to fulfil their Nato commitments and heavily implied he’d be a lot happier if the EU didn’t exist at all (including loudly supporting Brexit).

The Commission agenda conveys a clear message that the bloc’s lawmakers are hopeful of a lot more joint working — toward common goals and interests — once the Biden administration takes office early next year.

Global AI standards?

On the tech front the Commission’s push is for alignment on governance.

“The EU and the US need to join forces as tech-allies to shape technologies, their use and their regulatory environment,” the Commission writes in the agenda. “Using our combined influence, a transatlantic technology space should form the backbone of a wider coalition of like-minded democracies with a shared vision on tech governance and a shared commitment to defend it.”

Among the proposals it’s floating is a “Transatlantic AI Agreement” — which it envisages as setting “a blueprint for regional and global standards aligned with our values”.

While the EU is working on a pan-EU framework to set rules for the use of “high risk” AIs, some US cities and states have already moved to ban the use of specific applications of artificial intelligence — such as facial recognition. So there’s potential to align on some high level principles or standards.

(Or, as the EU puts it: “We need to start acting together on AI — based on our shared belief in a human-centric approach and dealing with issues such as facial recognition.”)


“Our shared values of human dignity, individual rights and democratic principles make us natural partners to harness rapid technological change and face the challenges of rival systems of digital governance. This gives us an unprecedented window of opportunity to set a joint EU-US tech agenda,” the Commission also writes, suggesting there’s a growing convergence of views on tech governance.

Talks on tackling big tech

Here it also sees opportunity for the EU and the US to align on tackling big tech — saying it wants to open discussions on setting rules to tackle the societal and market impacts of platform giants.

“There is a growing consensus on both sides of the Atlantic that online platforms and Big Tech raise issues which threaten our societies and democracies, notably through harmful market behaviours, illegal content or algorithm-fuelled propagation of hate speech and disinformation,” it writes.

“The need for global cooperation on technology goes beyond the hardware or software. It is also about our values, our societies and our democracies,” the Commission adds. “In this spirit, the EU will propose a new transatlantic dialogue on the responsibility of online platforms, which would set the blueprint for other democracies facing the same challenges. We should also work closer together to further strengthen cooperation between competent authorities for antitrust enforcement in digital markets.”

The Commission is on the cusp of unveiling its own blueprint for regulating big tech — with a Digital Services Act and Digital Markets Act due to be presented later this month.

Commissioners have said the legislative packages will set clear conditions on digital players, such as for the handling and reporting of illegal content, as well as setting binding transparency and fairness requirements.

They will also introduce a new regime of ex ante rules for so-called gatekeeper platforms that wield significant market power (aka big tech) — with such players set to be subject to a list of dos and don’ts, which could include bans on certain types of self-preferencing and limits on their use of third party data, with the aim of ensuring a level playing field in the future.

The bloc has also been considering beefing up antitrust powers for intervening in digital markets.

Given how advanced EU lawmakers are on proposals to regulate big tech vs. US counterparts, there’s arguably only a small window of opportunity for the latter to influence the shape of EU rules on (mostly US) big tech.

But the Commission evidently takes the view that rebooted relations, post-Trump, present an opportunity for it to influence US policy — by encouraging European-style platform rules to cross the pond.

It’s fond of claiming the EU’s data protection framework (GDPR) has set a global example that’s influenced lawmakers around the world. So its intent now looks to be to double down — and push to export a European approach to regulating big tech back where most of these giants are based (even as the bloc’s other institutions are still debating and amending the EU proposals).

Next-gen mobile security

Another common challenge the document points to is next-gen mobile connectivity. This has been a particular soapbox of Trump’s in recent years, with the ALL-CAPS-loving president frequently taking to Twitter to threaten and bully allies into taking a tough line on allowing Chinese vendors as suppliers for next-gen mobile infrastructure, arguing they pose too great a national security risk.

“We are facing common challenges in managing the digital transition of our economies and societies. These include critical infrastructure, such as 5G, 6G or cybersecurity assets, which are essential for our security, sovereignty and prosperity — but also data, technologies and the role of online platforms,” the Commission writes, easing into the issue.

EU lawmakers go on to say they will put forward proposals “for secure 5G infrastructure across the globe and open a dialogue on 6G” — as part of what they hope will be “wider cooperation on digital supply chain security done through objective risk-based assessments”.

Instead of a blanket ban on Huawei as a 5G supplier the Commission opted to endorse a package of “mitigating measures” — via a 5G toolbox — at the start of this year, which includes requirements for carriers to beef up network security and risk profile assessments of suppliers.

So it looks to be hoping the US can be convinced of the value of a joint approach to standardizing these sorts of security assessments — aka ‘no more nasty surprises’ — as a strategy to reduce the shocks and uncertainty that have hit digital supply chains during Trump’s presidency.

Increased cooperation around cybersecurity is another area where the EU says it will be pressing US counterparts — floating the idea of joint EU-US restrictions against attributed attackers from third countries in the future. (A proposal which, should it be taken up, could see coordinated sanctions against Russia, which US and European intelligence agencies have previously identified as running malware attacks targeted at COVID-19 vaccine R&D, for example.)

Easing EU-US data flows

A trickier area for the tech side of the Commission’s plan to reboot transatlantic relations is EU-US data flows.

That’s because Europe’s top court torpedoed the Commission’s US adequacy finding this summer — stripping the country of a privileged status of ‘essential equivalence’ in data protection standards.

Without that there’s huge legal uncertainty and risk for US businesses that want to take EU citizens’ data out of the region for processing. And recent guidance from EU regulators on how to lawfully secure data transfers makes it clear that in some instances there simply won’t be any extra measures or contractual caveats which will fix the risk entirely.

The solution may in fact be data localization in the EU. (Something the Commission’s Data Governance Act proposal, unveiled last week, appeared to confirm by allowing for Member States to set conditions for reuse of the most sensitive types of data — such as prohibiting transfers to third countries.)

“We must also openly discuss diverging views on data governance and see how these can be overcome constructively,” the Commission writes on this thorny issue, adding: “The EU and the US should intensify their cooperation at bilateral and multilateral level to promote regulatory convergence and facilitate free data flow with trust on the basis of high standards and safeguards.”

Commissioners have warned before that there’s no quick fix for the EU-US data transfer issue — but a longer term solution would be a convergence of standards in the areas of privacy and data protection.

And, again, that’s an area where US states have been taking action. But the Commission’s agenda pushing for “regulatory convergence” to ease data flows amounts to trying to convince US counterparts of the economic case for reforming Section 702 of FISA…

Digital tax and tech-trade cooperation

Digital tax reform is also inexorably on the EU agenda, since no agreement has been possible under Trump on this stickiest of tech policy issues.

It writes that both the EU and the US should “strongly commit to the timely conclusion of discussions on a global solution within the context of OECD and G20” — saying this is vital to create “a fair and modern economy, which provides market-based rewards for the best innovative ideas”.

“Fair taxation in the digital economy requires innovative solutions on both sides of the Atlantic,” it adds. 

Another proposal the EU is floating is to establish an EU-US Trade and Technology Council — to “jointly maximise opportunities for market-driven transatlantic collaboration, strengthen our technological and industrial leadership and expand bilateral trade and investment”.

It envisages the body focusing on reducing trade barriers; developing compatible standards and regulatory approaches for new technologies; ensuring critical supply chain security; deepening research collaboration and promoting innovation and fair competition — saying there should also be “a new common focus on protecting critical technologies”.

“We need closer cooperation on issues such as investment screening, Intellectual Property rights, forced transfers of technology, and export controls,” it adds.

The Commission announced its own Intellectual Property Action Plan last week, alongside the Data Governance Act proposal — which included support for SMEs to file patents. It also said it will consider whether to reform the framework for filing standards-essential patents, encouraging industry to engage in forums aimed at reducing litigation in the meantime.

YouTube upgrades Premieres with trailers, themes and a live pre-show option

By Sarah Perez

YouTube today is launching three new features designed to improve its “Premieres” experience, including trailers, themes, and live stream “pre-shows” that later redirect to the main event. Premieres, which first arrived in 2018, are designed to give creators the ability to leverage the revenue generation possibilities that come with live videos without having to actually “go live.”

Instead, Premieres allow creators to promote a scheduled video release by pointing fans to a landing page with a live chat in the sidebar, just like other live videos. This lets creators take advantage of money-making features like SuperChat, Stickers, ads, and Channel Memberships.

However, some creators want to engage with fans live ahead of their video premiere. The new “Live Redirect” feature will now make it a more seamless experience when they do so, as it allows creators to host a live stream that redirects to the upcoming Premiere just before it starts. This gives creators time to build up their audience ahead of the video’s release, as they can now not only join the chat to engage fans, but also live stream to their fans directly.

Image Credits: YouTube

YouTube says it tested this feature over the past several months with We Are One Film Festival, New York Comic-Con, BTS, Cardi B, and Justin Bieber, in advance of today’s launch.

Another new feature will allow creators to upload a pre-recorded video that will be featured on the Premiere landing page before the main event. This trailer can range from 15 seconds to 3 minutes in length, and works to create hype for the premiere ahead of its release. Creators can also encourage their fans to set a reminder so they won’t miss the video’s launch.

Image Credits: YouTube

The video countdown experience that plays just before their Premieres go live can also now be customized. A new set of Countdown Themes will include options designed for different vibes or moods, like calm, playful, dramatic or sporty.

Image Credits: YouTube

Since their launch, Premieres have been used by over 8 million YouTube channels, including big names like BLACKPINK, Tiny Desk, James Charles, Supercell, and Cirque du Soleil, among others. Their adoption significantly grew during the pandemic, the company also notes. Since March 1, 2020, YouTube has seen over 85% growth in daily Premieres, with over 80% of the channels having never before used a Premiere until this year.

The first two features will roll out to creators with at least 1,000 subscribers starting today, but Countdown Themes won’t be available for a couple of months, YouTube says.

Orbit raises $4M for its community experience platform

By Frederic Lardinois

Orbit, a startup that is building tools to help organizations build communities around their proprietary and open-source products, today announced that it has raised a $4 million seed funding round led by Andreessen Horowitz’s Martin Casado. A number of angel investors, including Chris Aniszczyk, Jason Warner and Magnus Hillestad, as well as a16z’s Cultural Leadership Fund, also participated, in addition to previous backers Heavybit and Harrison Metal.

The company describes its service as a “community experience platform.” Currently, Orbit’s focus is on Developer Relations and Community teams, as well as open-source maintainers. There’s no reason the company couldn’t branch out into other verticals as well, though, given that its overall framework is really applicable across all communities.

Orbit team: Patrick Woods, Nicolas Goutay and Josh Dzielak

As Orbit co-founder Patrick Woods told me, community managers have generally had a hard time figuring out who was really contributing to their communities because those contributions can come in lots of forms and often happen across a wide variety of platforms. In addition, the sales and marketing teams also often don’t understand how a community impacts a company’s bottom line. Orbit aggregates all of these contributions across platforms.

“There is a lack of understanding around the ways in which community impacts go-to-market and business value,” Woods told me when I asked him about the genesis of the idea. “There’s a big gap in terms of the tooling associated with that. Many companies agree that community is important, but if you put $1 in the community machine today, it’s hard to know where that’s going to come out — and is it going to come out in terms of $0.50 or $100? This was a set of challenges that we noticed across companies of all sizes.”

Image Credits: Orbit

Especially in open-source communities, there will always be community members who create a lot of value but who don’t have a commercial relationship with a company at all. That makes it even harder for companies to quantify the impact of their communities, even if they agree that community is an important way to grow their business and that, in Orbit’s words, “community is the new pre-sales.”

At the core of Orbit (the company) is Orbit, the open-source community framework. The founding team of Woods (CEO) and Josh Dzielak (CTO) developed this framework to help organizations understand how to best build what the team calls a “high gravity community” to attract new members and retain existing ones — and how to evaluate them. You can read more about the concept here.

Image Credits: Orbit

“We’re trying to reframe the discussion away from an extractive worldview that says how much value can we generate from this lead? It’s actually more about how much love can we generate from these community members,” Woods said. “Because, if you think about the culture associated with what we’re trying to do, it’s fundamentally creative and generative. And our goal is really to help people think less about value extraction and more about value creation.”

At the end of the day, though, no matter the philosophy behind your community-building efforts, there has to be a way to measure ROI and turn some of those community members into paying customers. To do that, Orbit currently pulls in data from sources like GitHub, Twitter and Discourse, with support for Slack and other tools coming soon. With that, the service makes it far easier for community managers to keep tabs on what is happening inside their community and who is participating.

Image Credits: Orbit

In addition to the built-in dashboards, Orbit also provides an API to help integrate all of this data into third-party services as well.
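As a sketch of what such an integration could look like, the Python snippet below pulls member data over a REST endpoint in the shape Orbit’s public API documented at the time — treat the workspace slug, token, pagination parameter and field names as illustrative assumptions rather than a definitive client:

```python
import os

import requests

WORKSPACE = "my-workspace"                # placeholder workspace slug
TOKEN = os.environ["ORBIT_API_TOKEN"]     # placeholder API token

def list_members(workspace, n=10):
    """Fetch community members, e.g. to sync into a CRM or dashboard.
    Endpoint shape follows Orbit's documented REST API; verify against
    current docs before relying on it."""
    resp = requests.get(
        f"https://app.orbit.love/api/v1/{workspace}/members",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"items": n},  # assumed page-size parameter
    )
    resp.raise_for_status()
    return resp.json()["data"]

for member in list_members(WORKSPACE):
    attrs = member["attributes"]
    # "love" is Orbit's engagement score (assumed field name)
    print(attrs.get("name"), attrs.get("love"))
```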

“One of the key understandings that drives the Orbit vision is that a community is not a funnel and building a community is not about conversions, but making connections; cultivating dialog and engagement; being open and giving back; and creating value versus trying to capture it,” a16z’s Casado writes. “The model has proven to be very effective, and now Orbit has built a product around it. We strongly believe Orbit is a must-have product for those building developer-focused companies.”

The company is already working with just under 150 companies and its users include the likes of Postman, CircleCI, Kubernetes and Apollo GraphQL.

The company will use the new round, which closed a few weeks ago, to, among other things, build out its go-to-market efforts and develop more integrations.

AWS goes after Microsoft’s SQL Server with Babelfish for Aurora PostgreSQL

By Frederic Lardinois

AWS today announced a new database product that is clearly meant to go after Microsoft’s SQL Server and make it easier — and cheaper — for SQL Server users to migrate to the AWS cloud. The new service is Babelfish for Aurora PostgreSQL. The tagline AWS CEO Andy Jassy used for this service in his re:Invent keynote today is probably telling: “Stop paying for SQL Server licenses you don’t need.” And to show how serious it is about this, the company is even open-sourcing the tool.

What Babelfish does is provide a translation layer for SQL Server’s proprietary SQL dialect (T-SQL) and communications protocol so that businesses can switch to AWS’ Aurora relational database at will (though they’ll still have to migrate their existing data). It provides translations not just for the dialect but also for SQL commands, cursors, catalog views, data types, triggers, stored procedures and functions.

The promise here is that companies won’t have to replace their database drivers or rewrite and verify their database requests to make this transition.
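A minimal sketch of what that promise implies, assuming a Python application that talks to SQL Server via the pymssql driver: only the connection endpoint changes (the Aurora hostname below is a placeholder), while the driver, wire protocol and T-SQL statements stay exactly as they are.

```python
import pymssql  # the same SQL Server driver the app already uses (TDS protocol)

# Hypothetical connection details: the Aurora cluster endpoint is a
# placeholder, and 1433 is the standard TDS port Babelfish listens on.
conn = pymssql.connect(
    server="my-aurora-cluster.cluster-abc123.us-east-1.rds.amazonaws.com",
    port=1433,
    user="app_user",
    password="example-password",
    database="orders_db",
)

cursor = conn.cursor()
# T-SQL constructs (TOP, GETDATE()) that vanilla PostgreSQL would reject;
# Babelfish translates them on the fly.
cursor.execute(
    "SELECT TOP 5 order_id, total FROM orders WHERE placed_at < GETDATE()"
)
for order_id, total in cursor:
    print(order_id, total)
conn.close()
```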

“We believe Babelfish stands out because it’s not another migration service, as useful as those can be. Babelfish enables PostgreSQL to understand database requests—both the command and the protocol—from applications written for Microsoft SQL Server without changing libraries, database schema, or SQL statements,” AWS’s Matt Asay writes in today’s announcement. “This means much faster ‘migrations’ with minimal developer effort. It’s also centered on ‘correctness,’ meaning applications designed to use SQL Server functionality will behave the same on PostgreSQL as they would on SQL Server.”

PostgreSQL, AWS rightly points out, is one of the most popular open-source databases in the market today. A lot of companies want to migrate their relational databases to it — or at least use it in conjunction with their existing databases. This new service is going to make that significantly easier.

The open-source Babelfish project will launch in 2021 and will be available on GitHub under the Apache 2.0 license.

“It’s still true that the overwhelming majority of relational databases are on-premise,” AWS CEO Andy Jassy said. “Customers are fed up with and sick of incumbents.” As is tradition at re:Invent, Jassy also got a few swipes at Oracle into his keynote, but the real target of the products the company is launching in the database area today is clearly Microsoft.

UK to set up ‘pro-competition’ regulator to put limits on big tech

By Natasha Lomas

The U.K. is moving ahead with a plan to regulate big tech, responding to competition concerns over a “winner-takes-all” dynamic in digital markets.

It will set up a new Digital Markets Unit (DMU) to oversee a “pro-competition” regime for internet platforms — including those funded by online advertising, such as Facebook and Google — the Department for Digital, Culture, Media and Sport (DCMS) announced today.

It’s moving at a clip — with the new unit slated to begin work in April. Although the necessary law to empower the new regulator to make interventions will take longer. The government said it will consult on the unit’s form and function in early 2021 — and legislate “as soon as parliamentary time allows.”

A core part of the plan is a new statutory code of conduct aimed at giving platform users more choice and third-party businesses more power over the intermediaries that host and monetize them.

The government suggests the code could require tech giants to allow users to opt out of behavioral advertising entirely — something Facebook’s platform, for example, does not currently allow.

It also wants the code to support the sustainability of the news industry by “rebalancing” the relationship between publishers and platform giants, as it puts it.

Concern over how to support quality public interest journalism in an era of ad-funded user-generated-content giants has been stepping up in recent years as online disinformation has been actively weaponized to attack democracies and try to influence votes.

“The new code will set clear expectations for platforms that have considerable market power — known as strategic market status — over what represents acceptable behaviour when interacting with competitors and users,” DCMS writes in a press release.

It suggests the DMU will have powers to “suspend, block and reverse decisions of tech giants, order them to take certain actions to achieve compliance with the code, and impose financial penalties for noncompliance,” although full details are set to be worked out next year.

The Digital Markets Taskforce, which the government set up earlier this year to advise on the design of the competition measures, will inform the unit’s work, including how the regime will work in practice, per DCMS.

The taskforce will also come up with the methodology that’s used to determine which platforms/companies should be designated as having strategic market status.

On that front it’s all but certain Facebook and Google will gain the designation and be subject to the code and oversight by the DMU, although confirmation can only come from the unit itself once it’s up and running. But U.K. policymakers don’t appear to have been fooled by bogus big tech talking points of competition being “only a click away.”

The move to set up a U.K. regulator for big tech’s market power follows a competition market review chaired by former U.S. President Barack Obama’s chief economic advisor, professor Jason Furman, which reported last year. The expert panel concluded that existing competition policy was fit for purpose but that new tools were needed for it to tackle market challenges flowing from platform power and online network effects.

Crucially, the Furman report advocated for a “broad church” interpretation of consumer welfare as the driver of competition interventions — encompassing factors such as choice, quality and innovation, not just price.

That’s key given big tech’s strategic application of free-at-the-point-of-use services as a tool for dominating markets by gaining massive market share, which in turn gives it the power to set self-serving usage conditions for consumers and anti-competitive rules for third-party businesses — enabling it to entrench its hold on the digital attention sphere.

The U.K.’s Competition and Markets Authority (CMA) also undertook a market study of the digital advertising sector — going on to report substantial concerns over the power of the adtech duopoly. Although in its final report it deferred competitive intervention in favor of waiting for the government to legislate.

Commenting on the announcement of the DMU in a statement, digital secretary Oliver Dowden said: “I’m unashamedly pro-tech and the services of digital platforms are positively transforming the economy — bringing huge benefits to businesses, consumers and society. But there is growing consensus in the U.K. and abroad that the concentration of power among a small number of tech companies is curtailing growth of the sector, reducing innovation and having negative impacts on the people and businesses that rely on them. It’s time to address that and unleash a new age of tech growth.”

Business secretary Alok Sharma added: “The dominance of just a few big tech companies is leading to less innovation, higher advertising prices and less choice and control for consumers. Our new, pro-competition regime for digital markets will ensure consumers have choice and mean smaller firms aren’t pushed out.”

The U.K.’s move to regulate big tech means there’s now broad consensus among European lawmakers that platform power must be curtailed — and that competition rules need proper resourcing to get the job done.

A similar digital market regime is due to be presented by EU lawmakers next month.

The European Commission has said the forthcoming ex ante pan-EU regulation — which it’s calling the Digital Markets Act — will identify platforms that hold significant market power, so-called internet gatekeepers, and apply a specific set of fairness and transparency rules and obligations to them with the aim of rebalancing competition. Plans to open algorithmic black boxes to regulatory oversight are also in the cards at the EU level.

A second piece of proposed EU legislation, the Digital Services Act, is set to update rules for online businesses by setting clear rules and responsibilities on all players in specific areas such as hate speech and illegal content.

The U.K. is also working on a similar online safety-focused regime — it proposed regulating a range of harms in its Online Harms white paper last year, though it has yet to come forward with draft legislation.

This summer the BBC reported that the government had not committed to introducing a draft bill next year either — suggesting its planned wider internet regulation regime may not be in place until 2023 or 2024.

It looks savvy for U.K. lawmakers to prioritize going after platform power since many of the problems that flow from harmful internet content are attached to the reach and amplification of a handful of tech giants.

A more competitive landscape for social media could encourage platforms to compete on the quality of the community experience they offer users — meaning that, for example, smaller platforms that properly enforce hate speech rules and don’t torch user privacy could gain an edge.

Rules to enable data portability and/or interoperability are likely to be crucial, though, to kindling truly vibrant and innovative competition in markets that have already been captured by a handful of data-mining adtech giants.

Given the U.K.’s rush to address the market power of big tech, it’s interesting to recall how many times Facebook CEO Mark Zuckerberg snubbed the DCMS committee’s calls for him to give evidence on online disinformation and digital campaigning (including in relation to the Cambridge Analytica data misuse scandal) — so many times, in fact, that we lost count.

It seems U.K. lawmakers kept a careful note of that.


Facebook’s latest ad tool fail puts another dent in its reputation

By Natasha Lomas

Reset yer counters: Facebook has had to ‘fess up to yet another major ad reporting fail.

This one looks like it could be costly for the tech giant to put right — not least because it’s another dent in its reputation for self-reporting. (For past Facebook ad metric errors check out our reports from 2016 here, here, here and here.)

AdExchanger reported on the code error last week with Facebook’s free “conversion lift” tool, which it said affected several thousand advertisers.

The discovery of the flaw has since led the tech giant to offer some advertisers millions of dollars in credits, per reports this week, to compensate for miscalculating the number of sales derived from ad impressions (which is, in turn, likely to have influenced how much advertisers spent on its digital snake oil).
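For context, a conversion lift test broadly works by comparing conversion rates between users who were shown ads and a randomized holdout group who were not. The Python sketch below illustrates that standard industry calculation; it is an illustrative assumption, not Facebook’s actual (proprietary) implementation, and the function name and figures are hypothetical.

# Hypothetical sketch of a standard conversion lift calculation, for
# illustration only; not Facebook's actual (proprietary) implementation.
def conversion_lift(test_conversions, test_users,
                    control_conversions, control_users):
    """Relative lift in conversion rate attributable to ad exposure."""
    test_rate = test_conversions / test_users
    control_rate = control_conversions / control_users
    return (test_rate - control_rate) / control_rate

# Example: 1.2% conversion among exposed users vs. 1.0% in the holdout
# implies roughly a 20% lift. A bug that overstated this figure would make
# ad spend look more effective than it actually was.
print(conversion_lift(1_200, 100_000, 1_000, 100_000))  # ~0.2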

According to an AdAge report yesterday, which quotes industry sources, the level of compensation Facebook is offering varies depending on the advertiser’s spend — but in some instances the mistake means advertisers are being given coupons worth tens of millions of dollars.

The issue went unfixed for some 12 months, persisting from August 2019 to August 2020, according to reports.

The Wall Street Journal says Facebook quietly told advertisers this month about the technical problem with its calculation of the efficacy of their ad campaigns, skewing data advertisers use to determine how much to spend on its platform.

One digital agency source told the WSJ the issue particularly affects certain categories such as retail where marketers have this year increased spending on Facebook and similar channels by up to 5% or 10% to try to recover business lost during the early stages of the pandemic.

Another of its industry sources pointed out the issue affects not just media advertisers but the tech giant’s competitors — since the tool could influence where marketers chose to spend budget (whether they spend on Facebook’s platform or elsewhere).

Last week the tech giant told AdExchanger that the bug was fixed on September 1, saying then that it was “working with impacted advertisers.”

In a subsequent statement a company spokesperson told us: “While making improvements to our measurement products, we found a technical issue that impacted some conversion lift tests. We’ve fixed this and are working with advertisers that have impacted studies.”

Facebook did not respond to a request to confirm whether some impacted advertisers are being offered millions of dollars worth of ad vouchers to rectify its code error.

It did confirm it’s offering one-time credits to advertisers who were “meaningfully” impacted by the issue with the (nonbillable) metric, adding that it is assessing the impact on a case-by-case basis, depending on how the tool was used.

Nor did it confirm how many advertisers had impacted studies as a result of the year-long technical glitch — claiming it’s a small number.

For now the tech giant can continue to run its own reporting systems for B2B customers free from external oversight. But regulating the fairness and transparency of powerful internet platforms that other businesses depend on for market access and reach is a key aim of a major forthcoming digital services legislative overhaul in the European Union.

Under the Digital Services Act and Digital Markets Act plan, the European Commission has said tech giants will be required to open up their algorithms to public oversight bodies — and will also be subject to binding transparency rules. So the clock may be ticking for Facebook’s self-serving self-reporting.

The Ethics of Rebooting the Dead

By Janet Manley
The notion of resurrecting people as digital entities is becoming less hypothetical. But just because something can be done, doesn’t always mean it should.

Why 'Head Empty' Memes Are Dominating 2020

By Cecilia D'Anastasio
In a year of chaos, a series of internet tropes has sought to offer escape from the turmoil.

GDPR enforcement must level up to catch big tech, report warns

By Natasha Lomas

A new report by European consumer protection umbrella group Beuc, reflecting on the barriers to effective cross-border enforcement of the EU’s flagship data protection framework, makes awkward reading for regional lawmakers and regulators as they seek to shape the next decades of digital oversight across the bloc.

Beuc’s members filed a series of complaints against Google’s use of location data in November 2018 — but some two years on from raising privacy concerns there’s been no resolution of the complaints.

Since 2018, legal cases in 🇪🇺, 🇺🇸 &🇦🇺 have been launched against Google in relation to their collection and use of location data. Since then, nothing happened while Google generated $251billion from advertising revenue. pic.twitter.com/tNkUvXrAan

— The Consumer Voice (@beuc) November 26, 2020

The tech giant continues to make billions in ad revenue, including by processing and monetizing internet users’ location data. Google’s lead data protection supervisor under GDPR’s one-stop-shop mechanism for handling cross-border complaints, Ireland’s Data Protection Commission (DPC), did finally open an investigation in February this year.
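To see how that routing plays out in practice, here is a minimal sketch of the one-stop-shop logic as described above: wherever in the EU a cross-border complaint is filed, it is handled by the authority in the member state hosting the company’s main EU establishment. The mapping, function name and country codes below are illustrative assumptions, not a real registry.

# Hypothetical sketch of GDPR one-stop-shop routing. The mapping below is
# illustrative, not a real registry of main EU establishments.
MAIN_EU_ESTABLISHMENT = {
    "Google": "IE",    # Google Ireland means the Irish DPC leads
    "Facebook": "IE",  # likewise for Facebook Ireland
}

def lead_supervisory_authority(company, filed_with):
    """Country code of the DPA that leads a cross-border GDPR case."""
    # With no main EU establishment, the receiving DPA keeps the case.
    return MAIN_EU_ESTABLISHMENT.get(company, filed_with)

# A complaint filed with the Dutch DPA against Google still lands with
# Ireland's DPC, which is why the Irish backlog matters so much.
print(lead_supervisory_authority("Google", "NL"))  # prints IE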

But it could still be years before Google faces any regulatory action in Europe related to its location tracking.

This is because Ireland’s DPC has yet to issue any cross-border GDPR decisions, some 2.5 years after the regulation started being applied. (Although, as we reported recently, a case related to a Twitter data breach is inching toward a result in the coming days.)

By contrast, France’s data watchdog, the CNIL, was able to complete a GDPR investigation into the transparency of Google’s data processing far more quickly last year.

This summer French courts also confirmed the $57 million fine the watchdog issued, slapping down Google’s appeal.

But the case predated Google coming under the jurisdiction of the DPC. And Ireland’s data regulator has to deal with a disproportionate number of multinational tech companies, given how many have established their EU base in the country.

The DPC has a major backlog of cross-border cases, with more than 20 GDPR probes involving a number of tech companies including Apple, Facebook/WhatsApp and LinkedIn. (Google has also been under investigation in Ireland over its adtech since 2019.)

This week the EU’s internet market commissioner, Thierry Breton, said regional lawmakers are well-aware of enforcement “bottlenecks” in the General Data Protection Regulation (GDPR).

He suggested the commission has learned lessons from this friction — claiming it will ensure similar concerns don’t affect the future working of a new regulatory proposal related to data reuse, which he was speaking publicly to introduce.

The commission wants to create standard conditions for rights-respecting reuse of industrial data across the EU, via a new Data Governance Act (DGA), which proposes oversight mechanisms similar to those involved in the EU’s oversight of personal data — including national agencies monitoring compliance and a centralized EU steering body (which they’re planning to call the European Data Innovation Board, mirroring the European Data Protection Board).

The commission’s ambitious agenda for updating and expanding the EU’s digital rules framework means criticism of GDPR risks taking the shine off the DGA before the ink has dried on the proposal document — putting pressure on lawmakers to find creative ways to unblock GDPR’s enforcement “bottleneck.” (Creative because national agencies are responsible for day-to-day oversight, and member states are responsible for resourcing DPAs.)

In an initial GDPR review this summer, the commission praised the regulation as a “modern and horizontal piece of legislation” and a “global reference point” — claiming it’s served as a point of inspiration for California’s CCPA and other emerging digital privacy frameworks around the world.

But it also conceded GDPR enforcement is lacking.

The best answer to this concern “will be a decision from the Irish data protection authority about important cases,” the EU’s justice commissioner, Didier Reynders, said in June.

Five months later European citizens are still waiting.

Beuc’s report — which it’s called “The long and winding road: Two years of the GDPR: A cross-border data protection case from a consumer perspective” — details the procedural obstacles its member organizations have faced in seeking to obtain a decision related to the original complaints, which were filed with a variety of DPAs around the EU.

These include concerns over the Irish DPC making unnecessary “information and admissibility checks,” as well as rejecting complaints brought by an interested organization on the grounds that it lacks a mandate under Irish law, which does not allow for third-party redress (even though the Dutch consumer organization had filed its complaint under Dutch law, which does …).

The report also queries why the DPC chose to open an own-volition inquiry into Google’s location data activities (rather than a complaint-led inquiry) — which Beuc says risks further delaying a decision on the complaints themselves.

It further points out that the DPC’s probe of Google only looks at activity since February 2020, not November 2018 when the complaints were made — meaning there’s a chunk of Google’s location data processing that isn’t being investigated at all.

It notes that three of its member organizations involved in the Google complaints had considered applying for a judicial review of the DPC’s decision (NB: others have resorted to that route) — but they decided not to proceed in part because of the significant legal costs it would have entailed.

The report also points out the inherent imbalance of GDPR’s one-stop-shop mechanism shifting the administration of complaints to the location of the companies under investigation — arguing those companies therefore benefit from “easier access to justice” (versus the ordinary consumer faced with undertaking legal proceedings in a different country and (likely) a different language).

“If the lead authority is in a country with tradition in ‘common law,’ like Ireland, things can become even more complex and costly,” Beuc’s report further notes.

Another issue it raises is the overarching one of rights complaints having to fight what it dubs “a moving target” — given well-resourced tech companies can leverage regulatory delays to (superficially) tweak practices, greasing continued abuse with misleading PR campaigns. (Something Beuc accuses Google of doing.)

DPAs must “adapt their enforcement approach to intervene more rapidly and directly,” it concludes.

“Over two years have passed since the GDPR became applicable, we have now reached a turning point. The GDPR must finally show its strength and become a catalyst for urgently needed changes in business practices,” Beuc goes on in a summary of its recommendations. “Our members’ experience, and that of other civil society organisations, reveals a series of obstacles that significantly hamper the effective application of the GDPR and the correct functioning of its enforcement system.

BEUC recommends to the relevant EU and national authorities to make a comprehensive and joint effort to ensure the swift enforcement of the rules and improve the position of data subjects and their representing organisations, particularly in the framework of cross-border enforcement cases.”

We reached out to the Commission and the Irish DPC with questions about the report. But at the time of writing neither had responded. We’ve also asked Google for comment.

Update: The DPC’s deputy commissioner, Graham Doyle, told us the reason it chose to open a “forward-looking” inquiry into Google’s location practices in early 2020 was it wanted to be able to investigate “in real time” rather than try to go back and replicate how things were.

Doyle also said the location-related Google complaints had been lodged with different DPAs at different times — meaning some took considerably longer to reach Ireland than the November 2018 start date suggests — raising questions about the efficiency of the current procedures for European DPAs to send complaints to a lead supervisor.

“The complaints in question were lodged with different Supervisory Authorities on different dates from November 2018,” he said. “The DPC received these complaints in July 2019, following which we engaged with Beuc. We then opened an own-volition inquiry in February 2020 in a manner that will enable us to undertake real-time testing in order to evidence our findings.”

Beuc earlier sent a list of eight recommendations for “efficient” GDPR enforcement to the commission in May.

Update II: A commission spokesperson pointed back to its earlier evaluation of the GDPR this summer, flagging follow-up actions it committed to at that point — such as continuing bilateral exchanges with member states on proper implementation of the regulation.

It also said that it would “continue to use all the tools at its disposal to foster compliance by member states with their obligations” — including, potentially, instigating infringement procedures if necessary.

Additional follow-up actions related to “implementing and complementing” the legal framework that it detailed in the report included supporting “further exchanges of views and national practices between member states on topics that are subject to further specification at national level so as to reduce the level of fragmentation of the single market, such as processing of personal data relating to health and research, or which are subject to balancing with other rights such as the freedom of expression;” and to push for “a consistent application of the data protection framework in relation to new technologies to support innovation and technological developments.” 

The commission also said it would use the GDPR Member States Expert Group to “facilitate discussions and sharing of experience between member states and with the commission,” with a view to improving the regulation’s operation.

In the area of GDPR’s governance system, EU lawmakers committed to continue to monitor the effectiveness and independence of national DPAs, and said they would work to encourage cooperation between regulators (“in particular in fields such as competition, electronic communications, security of network and information systems and consumer policy”), while also supporting the EDPB to assess how procedures related to cross-border cases could be improved.  

France starts collecting tax on tech giants

By Romain Dillet

France is going forward with its plan to tax big tech companies. The government has sent out notices to tech giants, as reported by the Financial Times, Reuters and AFP. The U.S. could respond with retaliatory tariffs on French goods.

For the past couple of years, France’s Economy Minister Bruno Le Maire has been pushing hard for a tax reform. Many economy ministers in Europe think tech companies aren’t taxed properly. They generate revenue in one country, but report to tax authorities in another country. They take advantage of countries with low corporate tax to optimize the bottom line.

Le Maire first pitched the idea of a European tax on big tech companies based on local revenue. But he failed to get support from other European countries — European tax policies require a unanimous decision from members of the European Union.

The French government chose not to wait for other European countries and started working on its own local tax. There are two requirements:

  • You generate more than €750 million in revenue globally and €25 million in France.
  • And you’re operating a marketplace (Amazon’s marketplace, Uber, Airbnb…) or an advertising business (Facebook, Google, Criteo…).

If you meet those two requirements, you have to pay 3 percent of your French revenue in taxes.
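As a quick sketch of that rule (the function name and example figures below are hypothetical; the thresholds and 3 percent rate are the ones described above):

# Hypothetical sketch of the French digital services tax rule described
# above; the thresholds and rate come from the article, the rest is
# illustrative.
def french_dst_due(global_revenue_eur, french_revenue_eur,
                   is_marketplace_or_ads):
    """French digital services tax owed, in euros, under the rules above."""
    meets_thresholds = (global_revenue_eur > 750_000_000
                        and french_revenue_eur > 25_000_000)
    if meets_thresholds and is_marketplace_or_ads:
        return 0.03 * french_revenue_eur  # 3 percent of French revenue
    return 0.0

# Example: a marketplace with EUR 1B global and EUR 100M French revenue
# would owe EUR 3M.
print(french_dst_due(1_000_000_000, 100_000_000, True))  # 3000000.0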

At the same time, the OECD has been working on a way to properly tax tech companies with a standardized set of rules that would work across the globe. But OECD members have yet to reach a compromise.

France and the U.S. have been arguing on and off for the past couple of years about the tech tax. In August 2019, then U.S. President Donald Trump and French President Emmanuel Macron reached a deal by promising that the French government would scrap the French tax as soon as the OECD finds a way to properly tax tech companies in countries where they operate.

In December 2019, the U.S. threatened 100% tariffs on French wine, cheese and handbags because the previous deal wasn’t good enough. In January 2020, the two sides agreed to wait a bit longer to see if the OECD framework would come through.

And here we are. According to the French government, OECD negotiations have failed, so it’s time to start collecting the French digital tax. Let’s see how the U.S. reacts during the Trump-Biden transition.

Europe’s data strategy aims to tip the scales away from big tech

By Natasha Lomas

Google wants to organize the world’s information but European lawmakers are in a rush to organize the local digital sphere and make Europe “the most data-empowered continent in the world”, internal market commissioner Thierry Breton said today, setting out the thinking behind the bloc’s data strategy during a livestreamed discussion organized by the Brussels-based economic think tank, Bruegel.

Rebalancing big data power dynamics to tip the scales away from big tech is another stated aim.

Breton likened the EU’s ambitious push to encourage industrial data sharing and rebalance platform power to work done in the past to organize the region’s air space and other physical infrastructure — albeit, with a lot less time to get the job done given the blistering pace of digital innovation.

“This will require of course political vision — that we have — and willingness, that I believe we have too, and smart regulation, hopefully you will judge, to set the right rules and investment in key infrastructure,” said Breton.

During the talk, he gave a detailed overview of how the flotilla of legislative proposals which are being worked on by EU lawmakers will set rules intended to support European businesses and governments to safely unlock the value of industrial and public data and drive the next decades of economic growth.

“We have been brave enough to set our rules in the personal data sphere and this is what we need to do now for government and public and industrial data. Set the rules. The European rules. Everyone will be welcome in Europe, that’s extremely important — provided they respect our rules,” said Breton.

“We don’t have one minute to lose,” he added. “The battle for industrial data is starting now and the battlefield may be Europe so we need to get ready — and this is my objective.”

EU lawmakers are drafting rules for how (non-personal) data can be used and shared; who will get access to them; and how rights can be guaranteed under the framework, per Breton. And he argued that concerns raised by European privacy challenges to international data transfers — reflected in the recent Schrems II ruling — are not limited to privacy and personal data. 

“These worries are in fact at the heart of the Single Market for data that I am building,” he said. “These worries are clear in the world we are entering when individuals or companies want to keep control over its data. The key question is, therefore, how to organize this control while allowing data flow — which is extremely important in the data economy.”

An open single European market for data must recognize that not all data are the same — “in terms of their sensitivity” — Breton emphasized, pointing to the EU’s General Data Protection Regulation (GDPR) data protection framework as “the proof of that”.

“Going forward, there are also sensitive industrial data that should benefit from specific conditions when they are accessed, used or shared,” he went on. “This is a case for instance for some sensitive public data [such as] from public hospitals, but also anonymized data that remains sensitive, mixed data which are difficult to handle.”

At one point during the talk he gave the example of European hospitals during the pandemic not being able to share data across borders to help in the fight against the virus because of the lack of a purpose-built framework to securely enable such data flows.

“I want our SMEs and startups, our public hospitals, our cities and many other actors to use more data — to make them available, to value them, to share them — but for this we need to generate the trust,” he added.

The first legislative plank of the transformation to a single European data economy is a Data Governance Act (DGA) — which Breton said EU lawmakers will present tomorrow, after a vote on the proposal this afternoon.

“With this act we are defining a European approach to data sharing,” he noted on the DGA. “This new regulation will facilitate data sharing across sectors and Member States. And it will put those who generate the data in the driving seat — moving away from the current practices of the big tech platforms.

“Concretely, with this legislation, we create the conditions to allow access to a reuse of sensitive public data, creating a body of harmonized rules for the single market.”

A key component of building the necessary trust for the data economy will mean creating rules that state “European highly sensitive data should be able to be stored and processed in the EU”, Breton also said, signalling that data localization will be a core component of the strategy — in line with a number of recent public remarks in which he’s argued it’s not protectionist for European data to be stored in Europe. 

“Without such a possibility Member States will never agree to open their data hold,” Breton went on, saying that while Europe will be “open” with data, it will not be offering a “naive” data free-for-all.

The Commission also wants the data framework to support an ecosystem of data brokers whose role, Breton said, will be to connect data owners and data users “in a neutral manner” — suggesting this will give companies stronger control over the data they generate (the implication being a shift away from the current situation, in which data-mining platform giants can use their market power to asset-strip weaker third parties).

“We are shifting here the product,” he said. “And we promote also data altruism — the role of sharing data, industrial or personal, for common good.”

Breton also noted that the forthcoming data governance proposal will include a shielding provision — meaning data actors will be required to take steps to avoid having to comply with what he called “abusive and unlawful” data access requests for data held in Europe from third countries.

“This is a major point. It is not a question of calling into question our international judicial or policy cooperation. We cannot tolerate abuses,” he said, specifying three off-limits examples (“unauthorized access; access that does not offer sufficient legal guarantees; or fishing expeditions”), adding: “By doing so we are ensuring that European law and the guarantees it carries is respected. This is about enforcing our own rules.”

Breton also touched on other interlocking elements of the policy strategy which regional lawmakers see as crucial to delivering a functional data framework: Namely the Digital Services Act (DSA) and Digital Markets Act (DMA) — which are both due to be set out in detail early next month.

The DSA will put “a clear responsibility and obligation on platforms and the content that is spread”, said Breton.

While the companion ex ante regulation, the DMA, will “frame the behaviours of gatekeepers — of systemic actors in the Single Market — and target their behaviors against their competitors or customers”; aka further helping to clip the wings of big tech.

“With this set of regulation I just want to set up the rules and that the rules are clear — based on our values,” he added.

He also confirmed that interoperability and portability will be a key feature of the EU’s hoped for data transformation.

“We are working on this on several strands,” he said on this. “The first is standards for interoperability. That’s absolutely key for sectoral data spaces that we will create and very important for the data flows. You will see that we will create a European innovation data board — set in the DGA today — which will help the Commission in setting and working the right standards.”

While combating “blocking efforts and abusive behaviors” by platform gatekeepers — which could otherwise put an artificial limit on the value of the data economy — will be “the job of the DMA”, he noted.

A fourth pillar of the data strategy — which Breton referred to as a “data act” — will be introduced in 2021, with the aim of “increasing fairness in the data economy by clarifying data usage rights in business to business and business to government settings”.

“We will also consider enhanced data portability rights to give individuals more control — which is extremely important — over the data they produce,” he added. “And we will have a look at the intellectual property rights framework.”

He also noted that key infrastructure investments will be vital — pointing to the Commission’s plan to build a European industrial cloud and related strategic tech investment priorities, such as compute capacity, next-gen connectivity and support for cutting-edge technologies like quantum encryption.

Privacy campaigner Max Schrems, who had been invited as the other guest speaker, raised the issue of enforceability — pointing out that Ireland’s data protection authority, which is responsible for overseeing a large number of major tech companies in the region, still hasn’t issued any decisions on cross-border complaints filed under the 2.5 year old GDPR framework.

Breton agreed that enforcement will be a vital piece of the puzzle — claiming EU lawmakers are alive to the problem of enforcement “bottlenecks” in the GDPR.

“We need definitely clear, predictable, implementable rules — and this is what is driving me when I am regulating against the data market. But also what you will find behind the DSA and the DMA with an ex ante regulation to be able to apply it immediately and everywhere in Europe, not only in one country, everywhere at the same time,” he said. “Just to be able to make sure that things are happening quick. In this digital space we have to be fast.”

“So we will again make sure in DSA that Member State authorities can ask platforms to remove immediately content cross-border — like, for example, if you want an immediate comparison, the European Arrest Warrant.”

The Commission will also have the power to step in via cooperation at the European level, Breton further noted.

“So you see we are putting in rules, we are not naive, we understand pretty well where we have the bottleneck — and again we try to regulate. And also, in parallel, that’s very important because like everywhere where you have regulation you need to have sanctions — you will have appropriate sanctions,” he said, adding: “We learn the lessons from the GDPR.”
