
Data was the new oil, until the oil caught fire

By Danny Crichton

We’ve been hearing how “data is the new oil” for more than a decade now, and in certain sectors, it’s a maxim that has more than panned out. From marketing and logistics to finance and product, decision-making is now dominated by data at all levels of most big private orgs (and if it isn’t, I’d be getting a résumé put together, stat).

So it might be something of a surprise to learn that data, which could transform how we respond to the increasingly deadly disasters that regularly plague us, has been all but absent from much of emergency response this past decade. Far from being a geyser of digital oil, disaster response agencies and private organizations alike have for years tried to swell the scope and scale of the data flowing into disaster response, with relatively meager results.

That’s starting to change, though, mostly thanks to the internet of things (IoT), and frontline crisis managers today increasingly have the data they need to make better decisions across the resilience, response, and recovery cycle. And the best is yet to come: with drones taking flight, richer simulated visualizations, and artificial intelligence entering the picture, what we’re seeing today on the frontlines is only the beginning of what could be a revolution in disaster response in the 2020s.

The long-awaited disaster data deluge has finally arrived

Emergency response is a fight against the fog of war and the dreadful ticking of the clock. In the midst of a wildfire or hurricane, everything can change in a matter of seconds — even milliseconds if you aren’t paying attention. Safe roads ferrying evacuees can suddenly become impassable infernos, evacuation teams can reposition and find themselves spread far too thin, and unforeseen conditions can rapidly metastasize to cover the entire operating environment. An operations center that once had perfect information can quickly find it has no ground truth at all.

Unfortunately, even getting raw data on what’s happening before and during a disaster can be extraordinarily difficult. When we look at the data revolution in business, part of the early success stems from the fact that companies were always heavily reliant on data to handle their activities. Digitalization was and is the key word: moving from paper to computers in order to transform latent raw data into a form that was machine-readable and therefore analyzable. In business, the last ten years were basically an upgrade from version one to version two.

In emergency management, however, many agencies are stuck without a version at all. Take a flood — where is the water and where is it going? Up until recently, there was no comprehensive data on where waters rose from and where they sloshed to. When it came to wildfires, there were no administrative datasets on where every tree in the world was located and how prone each was to fire. Even human infrastructure like power lines and cell towers often had little interface with the digital world. They stood there, and if you couldn’t see them, they couldn’t see you.

Flood modeling is on the cutting edge of disaster planning and response. Image Credits: CHANDAN KHANNA/AFP via Getty Images

Models, simulations, predictions, analysis: all of these are useless without raw data, and in the disaster response realm, there was no detailed data to be found.

After years of promising an Internet of Things (IoT) revolution, things are finally internet-izing, with IoT sensors increasingly larding up the American and world landscape. Temperature, atmospheric pressure, water levels, humidity, pollution, power, and other sensors have been widely deployed, emitting constant streams of data back into data warehouses ready for analysis.

Take wildfires in the American West. It wasn’t all that long ago that the U.S. federal government and state firefighting agencies had no knowledge of where a blaze was taking place. Firefighting has been “100 years of tradition unimpeded by progress,” as Tom Harbour, head of fire response for a decade at the U.S. Forest Service and now chief fire officer at Cornea, put it.

And he’s right. After all, firefighting is a visceral activity — responders can see the fires, even feel the burning heat echoing off of their flesh. Data wasn’t useful, particularly in the West where there are millions of acres of land and large swaths are sparsely populated. Massive conflagrations could be detected by satellites, but smoldering fires in the brush would be entirely invisible to the geospatial authorities. There’s smoke over California — exactly what is a firefighter on the ground supposed to do with such valuable information?

Today, after a decade of speculative promise, IoT sensors are starting to clear a huge part of this fog. Aaron Clark-Ginsberg, a social scientist at RAND Corporation who researches community resilience, said that air quality sensors have become ubiquitous since they are “very cheap [and] pretty easy to use” and can offer very fine-grained understandings of pollution — a key signal, for instance, of wildfires. He pointed to the company Purple Air, which in addition to making sensors also produces a popular consumer map of air quality, as indicative of the technology’s potential these days.

Maps are the critical intersection for data in disasters. Geospatial information systems (GIS) form the basis for most planning and response teams, and no company has a larger footprint in the sector than privately held Esri. Ryan Lanclos, who leads public safety solutions at the company, pointed to the huge expansion of water sensors as radically changing responses to certain disasters. “Flood sensors are always pulsing,” he said, and with a “national water model coming out of the federal government,” researchers can now predict through GIS analysis how a flood will affect different communities with a precision unheard of previously.

Digital maps and GIS systems are increasingly vital for disaster planning and response, but paper still remains quite ubiquitous. Image Credits: Paul Kitagaki Jr.-Pool/Getty Images

Cory Davis, the director of public safety strategy and crisis response at Verizon (which, through our parent company Verizon Media, is TechCrunch’s ultimate owner), said that all of these sensors have transformed how crews work to maintain infrastructure as well. “Think like a utility that is able to put a sensor on a power line — now they have sensors and get out there quicker, resolve it, and get the power back up.”

He noted one major development that has transformed sensors in this space the last few years: battery life. Thanks to continuous improvements in ultra-low-power wireless chips as well as better batteries and energy management systems, sensors can last a really long time in the wilderness without the need for maintenance. “Now we have devices that have ten-year battery lives,” he said. That’s critical, because it can be impossible to connect these sensors to the power grid in frontier areas.

The same line of thinking holds true at T-Mobile as well. When it comes to preventative planning, Jay Naillon, senior director of national technology service operations strategy at the telco, said that “the type of data that is becoming more and more valuable for us is the storm surge data — it can make it easier to know we have the right assets in place.” That data comes from flood sensors that can offer real-time warning signals to planners across the country.

Telecom interest — and commercial interest in general — has been critical to accelerating the adoption of sensors and other data streams around disasters. While governments may be the logical end user of flood or wildfire data, they aren’t the only ones interested in this visibility. “A lot of consumers of that information are in the private sector,” said Jonathan Sury, project director at the National Center for Disaster Preparedness at the Earth Institute at Columbia University. “These new types of risks, like climate change, are going to affect their bottom lines,” and he pointed to bond ratings, insurance underwriting and other areas where commercial interest in sensor data has been profound.

Sensors may not literally be ubiquitous, but they have opened a window into conditions that emergency managers never had visibility into before.

Finally, there are the extensive datasets around mobile usage that have become ubiquitous throughout much of the world. Facebook’s Data for Good project, for instance, provides data layers around connectivity — are users connecting from one place and then later connecting from a different location, indicating displacement? That sort of data from the company and telcos themselves can help emergency planners scout out how populations are shifting in real time.
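
The mechanics behind such a displacement signal are straightforward to sketch. The snippet below is a minimal, hypothetical illustration of the idea (it is not Facebook’s or any carrier’s actual pipeline, and the area names are invented): compare where devices connected before and after an event, and flag origin areas whose devices now mostly appear elsewhere.

```python
from collections import Counter

# Hypothetical, simplified records: (device_id, area_before_event, area_after_event).
# Real connectivity layers are aggregated and anonymized; this only sketches the idea.
connections = [
    ("a1", "coastal_zone", "inland_shelter"),
    ("a2", "coastal_zone", "coastal_zone"),
    ("a3", "coastal_zone", "inland_shelter"),
    ("a4", "riverside", "riverside"),
]

def displacement_by_origin(records):
    """For each origin area, return the share of devices later seen somewhere else."""
    moved, total = Counter(), Counter()
    for _, before, after in records:
        total[before] += 1
        if after != before:
            moved[before] += 1
    return {area: moved[area] / total[area] for area in total}

print(displacement_by_origin(connections))
# Two-thirds of devices last seen in the coastal zone now connect elsewhere,
# which a planner might read as a possible displacement signal.
```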

Data, data, on the wall — how many AIs can they call?

Rivulets of data have now turned into floods of information, but just like floodwaters rising in cities across the world, the data deluge now needs a response all its own. In business, the surfeit of big data has been wrangled with an IT stack from data warehouses all the way to business intelligence tools.

If only data for disasters could be processed so easily. Data relevant for disasters is held by dozens of different organizations spanning the private, public, and non-profit sectors, leading to huge interoperability problems. Even when the data can be harmonized, there are large challenges in summarizing the findings down to an actual decision a frontline responder can use in their work — making AI a tough sale still today, particularly outside of planning. As Davis of Verizon put it, “now that they have this plethora of data, a lot of cities and federal agencies are struggling with how to use it.”

Unfortunately, standardization is a challenge at all scales. Globally, countries mostly lack interoperability, although standards are improving over time. Amir Elichai, the founder and CEO of 911 call-handling platform Carbyne, said that “from a technology standpoint and a standards standpoint, there is a big difference between countries,” noting that protocols from one country often have to be completely rewritten to serve a different market.

Tom Cotter, director of emergency response and preparedness at health care disaster response organization Project HOPE, said that even setting up communications between responders can be challenging in an international environment. “Some countries allow certain platforms but not others, and it is constantly changing,” he said. “I basically have every single technology communication platform you can possibly have in one place.”

One senior federal emergency management official acknowledged that data portability has become increasingly key in procurement contracts for technology, with the government recognizing the need to buy commercially-available software rather than custom-designed software. That message has been picked up by companies like Esri, with Lanclos stating that “part of our core mission is to be open and … create data and to share that openly to the public or securely through open standards.”

For all its downsides though, the lack of interoperability can be ironically helpful for innovation. Elichai said that the “lack of standards is an advantage — you are not buying into a legacy standard,” and in some contexts where standards are lacking, quality protocols can be built with the assumption of a modern data workflow.

Even with interoperability though, the next challenge becomes data sanitation — and disaster data is dirty as … well, something. While sensor streams can be verified and cross-checked with other datasets, in recent years there has been a heavy increase in the quantity of citizen-submitted information that has to be carefully vetted before it is disseminated to first responders or the public.

With citizens having more access to smartphones than ever, emergency planners have to sanitize uploaded data in order to verify it and make it useful. Image Credits: TONY KARUMBA/AFP via Getty Images

Bailey Farren, CEO and co-founder of disaster communications platform Perimeter, said that “sometimes citizens have the most accurate and real-time information, before first responders show up — we want citizens to share that with … government officials.” The challenge is how to filter the quality goods from the unhelpful or malicious. Raj Kamachee, the CIO of Team Rubicon, a non-profit which assembles teams of volunteer military veterans to respond to natural disasters, said that verification is critical, and it’s a key element of the infrastructure he has built at the organization since joining in 2017. “We’ve gotten more people using it so more feedback [and] more data [is] coming through the pipes,” he said. “So creating a self-service, a very collaborative approach.”

With quality and quantity, the AI models should come, right? Well, yes and no.

Sury of Columbia wants to cool down at least some of the hype around AI. “The big caveat with all of these machine learning and big data applications is that they are not a panacea — they are able to process a lot of disparate information, [but] they’re certainly not going to tell us exactly what to do,” he said. “First responders are already processing a lot of information,” and they don’t necessarily need more guidance.

Instead, AI in disasters is increasingly focused on planning and resilience. Sury pointed to OneConcern, a resiliency planning platform, as one example of how data and AI can be combined in the disaster planning process. He also pointed to the CDC’s Social Vulnerability Index and risk tools from FEMA that integrate different data signals into scalar values that emergency planners can use to optimize their contingency plans.
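
The general pattern behind such indices is simple to sketch: normalize a handful of heterogeneous signals, then combine them into one scalar per community. The example below is a hedged illustration of that pattern only; the signal names, values and weights are invented, and it is not the CDC’s or FEMA’s actual methodology.

```python
# Hypothetical composite risk index: min-max normalize each signal, then take a
# weighted sum per community. Signals and weights are illustrative only.
communities = {
    "riverton":  {"poverty_rate": 0.31, "flood_exposure": 0.80, "pct_over_65": 0.22},
    "hillcrest": {"poverty_rate": 0.12, "flood_exposure": 0.15, "pct_over_65": 0.18},
    "bayside":   {"poverty_rate": 0.24, "flood_exposure": 0.95, "pct_over_65": 0.30},
}
weights = {"poverty_rate": 0.4, "flood_exposure": 0.4, "pct_over_65": 0.2}

def composite_index(data, weights):
    """Return a 0-1 scalar per community from normalized, weighted signals."""
    scores = {name: 0.0 for name in data}
    for signal, weight in weights.items():
        values = [c[signal] for c in data.values()]
        lo, hi = min(values), max(values)
        for name, c in data.items():
            norm = (c[signal] - lo) / (hi - lo) if hi > lo else 0.0
            scores[name] += weight * norm
    return scores

for name, score in sorted(composite_index(communities, weights).items(),
                          key=lambda kv: -kv[1]):
    print(f"{name}: {score:.2f}")  # a higher scalar means higher planning priority
```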

Yet, almost everyone I talked to was much more hesitant about the power of AI. As I discussed a bit in part one of this series regarding the disaster sales cycle, data tools have to be real-time and perfect every time given the lives that are on the line. Kamachee of Team Rubicon noted that when choosing tools, he avoids whiz-bang and instead looks at the pure utility of individual vendors. “We go high tech, but we prepare for low tech,” he said, emphasizing that in disaster response, everything must be agile and adaptable to changing circumstances.

Elichai of Carbyne saw this pattern in his sales. There’s a “sensitivity in our market and the reluctance from time to time to adopt” new technologies, he said, but he acknowledged that “there is no doubt that AI at a certain point will provide benefits.”

Naillon of T-Mobile had similar views from the operator perspective, saying that “I can’t say that we really leverage AI very much” in the company’s disaster planning. Rather than treating AI as the brain, the telecom company simply uses data and forecast modeling to optimally position equipment — no fancy GANs required.

Outside of planning, AI has helped in post-disaster recovery, and specifically around damage assessments. After a crisis transpires, assessments of infrastructure and private property have to be made in order for insurance claims to be filed and for a community to move forward. Art delaCruz, COO and president of Team Rubicon, noted that technology and a flourish of AI has helped significantly around damage assessments. Since his organization often helps rebuild communities in the course of its work, triaging damage is a critical element of its effective response strategy.

There’s a brighter future, other than that brightness from the sun that is going to burn us to a crisp, right?

So AI today is helping a bit with resilience planning and disaster recovery and not so much during emergency response itself, but there is certainly more to come across the entire cycle. Indeed, there is a lot of excitement about the future of drones, which are increasingly being used in the field, but there are concerns long term about whether AI and data will ultimately cause more problems than they solve.

Drones would seem to have an obvious value for disaster response, and indeed, they have been used by teams to get additional aerial footage and context where direct access by responders is limited. Kamachee of Team Rubicon noted that in the Bahamas on a mission, response teams used drones to detect survivors, since major roads were blocked. The drones snapped images that were processed using AI, and helped the team to identify those survivors for evacuation. He described drones and their potential as “sexy; very, very cool.”

Aerial views from drones can give disaster response teams much better real-time information, particularly in areas where on-the-ground access is limited. Image Credits: Mario Tama/Getty Images

Cotter of Project HOPE similarly noted that faster data processing translates to better responses. “Ultimately speed is what saves lives in these disasters,” he said. We’re “also able to manage more responses remotely [and] don’t have to send as many people downrange,” giving response teams more leverage in resource-constrained environments.

“I see more emergency management agencies using drone technology — search and rescue, aerial photography,” Davis of Verizon said, arguing that operators often have a mentality of “send a machine into a situation first.” He continued, arguing, “artificial intelligence is going to continue to get better and better and better [and] enable our first responders to respond more effectively, but also more efficiently and safer.”

With data flooding in from sensors and drones, and processed and verified better than ever, disaster response can improve, perhaps even faster than Mother Nature can intensify her increasingly deadly whims. Yet there is one caveat: will the AI algorithms themselves cause new problems in the future?

Clark-Ginsberg of RAND, perhaps supplying that typical RANDian alternatives analysis, said that these solutions can also create problems themselves: “technological risks leading to disaster and the world of technology facilitating disaster.” These systems can break, they can make mistakes, and more ominously — they can be sabotaged to increase chaos and damage.

Bob Kerrey, a co-chair of the 9/11 Commission, former senator and governor of Nebraska, and currently the board chairman of Risk & Return, a disaster response VC fund and philanthropy I profiled recently, pointed to cybersecurity as increasingly a wild card in many responses. “There wasn’t a concept called zero days — let alone a market for zero days — in 2004 [when the 9/11 Commission was doing its work], and now there is.” With the 9/11 terrorist attacks, “they had to come here, they had to hijack planes … now you don’t need to hijack planes to damage the United States,” noting that hackers “can be sitting with a bunch of other guys in Moscow, in Tehran, in China, or even your mother’s basement.”

Data is a revolution in the making for disaster response, but it may well cause a whole second-order set of problems that didn’t exist before. What the data giveth, it may also taketh away. The oil gushes, but then the well suddenly runs dry – or simply catches fire.


Analytics as a service: Why more enterprises should consider outsourcing

By Ram Iyer
Joey Lei Contributor
Joey Lei is director of service management at Synoptek. With more than 14 years of experience in engineering and product management, Lei is responsible for the development and growth of the Synoptek service portfolio and solution development with strategic technology alliance partners.
Debbie Zelten Contributor
Debbie Zelten (SAFe® 4 Agilist, SAFe Scrum Master, CSM, LSSGB, PMI-ACP) is the director of application development and business intelligence at Synoptek. She has over 20 years of experience in implementing software and data analytics solutions for companies of all sizes.

With an increasing number of enterprise systems, growing teams, a rising proliferation of the web and multiple digital initiatives, companies of all sizes are creating loads of data every day. This data contains excellent business insights and immense opportunities, but it has become impossible for companies to derive actionable insights from this data consistently due to its sheer volume.

According to Verified Market Research, the analytics-as-a-service (AaaS) market is expected to grow to $101.29 billion by 2026. Organizations that have not started on their analytics journey or are spending scarce data engineering resources to resolve issues with analytics implementations are not identifying actionable data insights. Through AaaS, managed services providers (MSPs) can help organizations get started on their analytics journey immediately without extravagant capital investment.

MSPs can take ownership of the company’s immediate data analytics needs, resolve ongoing challenges and integrate new data sources to manage dashboard visualizations, reporting and predictive modeling — enabling companies to make data-driven decisions every day.

AaaS could come bundled with multiple business-intelligence-related services. Primarily, the service includes (1) services for data warehouses; (2) services for visualizations and reports; and (3) services for predictive analytics, artificial intelligence (AI) and machine learning (ML). When a company partners with an MSP for analytics as a service, it is able to tap into business intelligence easily, instantly and at a lower cost of ownership than doing it in-house. This empowers the enterprise to focus on delivering better customer experiences, make decisions unencumbered and build data-driven strategies.

In today’s world, where customers value experiences over transactions, AaaS helps businesses dig deeper into their psyche and tap insights to build long-term winning strategies. It also enables enterprises to forecast and predict business trends by looking at their data and allows employees at every level to make informed decisions.

Basecamp sees mass employee exodus after CEO bans political discussions

By Taylor Hatmaker

Following a controversial ban on political discussions earlier this week, Basecamp employees are heading for the exits. The company employs around 60 people, and roughly a third of the company appears to have accepted buyouts to leave, many citing new company policies.

On Monday, Basecamp CEO Jason Fried announced in a blog post that employees would no longer be allowed to openly share their “societal and political discussions” at work.

“Every discussion remotely related to politics, advocacy or society at large quickly spins away from pleasant,” Fried wrote. “You shouldn’t have to wonder if staying out of it means you’re complicit, or wading into it means you’re a target.”

Basecamp’s departures are significant. According to Twitter posts, Basecamp’s head of design, head of marketing and head of customer support will all depart. The company’s iOS team also appears to have quit en masse.

The no-politics rule at Basecamp follows a similar stance that Coinbase CEO Brian Armstrong staked out late last year. Armstrong also denounced debates around “causes or political candidates” arguing that such discussions distracted from the company’s core work. About 60 members of Coinbase’s 1,200 person staff took buyouts in light of the internal policy change — a ratio that makes the exodus at Basecamp look even more dramatic.

Like Coinbase, Basecamp was immediately criticized for muzzling its employees over important issues, many of which disproportionately impact marginalized employees.

Drawing the line on “political” topics becomes murky very quickly for non-white or LGBTQ employees, for whom many issues that might be seen as political in nature in some circles — the Black Lives Matter movement, for instance — are inextricably and deeply personal. It’s not a coincidence that these grand stands against divisive “politics” at work come down from white male tech executives.

“If you’re in doubt as to whether your choice of forum or topic for a discussion is appropriate, please ask before posting,” Basecamp CTO David Heinemeier Hansson wrote in his own blog post, echoing Fried.

According to Platformer, Fried’s missive didn’t tell the whole story. Basecamp employees instead said the tension arose from internal conversations about the company itself and its commitment to DEI work, not free-floating arguments about political candidates. Fried’s blog post does mention one particular source of tension in a roundabout way, referencing an employee-led DEI initiative that would be disbanded.

“We make project management, team communication, and email software,” Fried wrote. “We are not a social impact company.”

Heirlume raises $1.38M to remove the barriers of trademark registration for small businesses

By Darrell Etherington

Platforms like Shopify, Stripe and WordPress have done a lot to make essential business-building tools, like running storefronts, accepting payments and building websites, accessible to businesses with even the most modest budgets. But some very key aspects of setting up a company remain expensive, time-consuming affairs that can be cost-prohibitive for small businesses — and that, if ignored, can result in the failure of a business before it even really gets started.

Trademark registration is one such concern, and Toronto-based startup Heirlume just raised $1.7 million CAD (~$1.38 million) to address the problem with a machine-powered trademark registration platform that turns the process into a self-serve affair that won’t break the budget. Its AI-based trademark search will flag terms that might run afoul of existing trademarks in the U.S. and Canada, even when official government trademark search tools and top-tier legal firms might miss the conflict.

Heirlume’s core focus is on levelling the playing field for small business owners, who have typically been significantly out-matched when it comes to any trademark conflicts.

“I’m a senior level IP lawyer focused in trademarks, and had practiced in a traditional model, boutique firm of my own for over a decade serving big clients, and small clients,” explained Heirlume co-founder Julie MacDonnell in an interview. “So providing big multinationals with a lot of brand strategy, and in-house legal, and then mainly serving small business clients when they were dealing with a cease-and-desist, or an infringement issue. It’s really those clients that have my heart: It’s incredibly difficult to have a small business owner literally crying tears on the phone with you, because they just lost their brand or their business overnight. And there was nothing I could do to help because the law just simply wasn’t on their side, because they had neglected to register their trademarks to own them.”

In part, there’s a lack of awareness around what it takes to actually register and own a trademark, MacDonnell says. Many entrepreneurs just starting out seek out a domain name as a first step, for instance, and some will fork over significant sums to register these domains. What they don’t realize, however, is that this is essentially a rental, and if you don’t have the trademark to protect that domain, the actual trademark owner can potentially take it away down the road. But even if business owners do realize that a trademark should be their first stop, the barriers to actually securing one are steep.

“There was an enormous, insurmountable barrier when it came to brand protection for those business owners,” she said. “And it just isn’t fair. Every other business service, generally a small business owner can access. Incorporating a company or even insurance, for example, owning and buying insurance for your business is somewhat affordable and accessible. But brand ownership is not.”

Heirlume brings the cost of trademark registration down from many thousands of dollars to just under $600 for the first trademark, and only $200 for each additional one after that. The startup is also offering a very small business-friendly ‘buy now, pay later’ option supported by Clearbanc, which means that even businesses starting on a shoestring can take the step of protecting their brand at the outset.

In its early days, Heirlume is also offering its core trademark search feature for free. That provides a trademark search engine that works across both U.S. and Canadian government databases, which can not only tell you if your desired trademark is available or already held, but also reveal whether it’s likely to be able to be successfully obtained, given other conflicts that might arise that are totally ignored by native trademark database search portals.

Heirlume search tool comparison

Image Credits: Heirlume

Heirlume uses machine learning to identify these potential conflicts, which not only helps users searching for their trademarks, but also greatly decreases the workload behind the scenes, helping the startup lower costs and pass the benefits of those improved margins on to its clients. That’s how it can achieve better results than even hand-tailored applications from traditional firms, while doing so at scale and at reduced costs.
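
To see why near-matches matter, consider the kind of conflict a plain exact-match lookup misses. The toy sketch below uses simple string similarity to flag lookalike marks; it is a hypothetical stand-in for illustration only, not Heirlume’s actual model, and the registered marks are invented.

```python
from difflib import SequenceMatcher

# Invented marks; a real check would run against the USPTO and CIPO databases
# with a trained model rather than plain string similarity.
registered_marks = ["HEIRLOOM", "HAIRLUME", "LUMEHAIR", "AIRPLUME"]

def likely_conflicts(candidate, marks, threshold=0.7):
    """Flag registered marks whose spelling is close enough to risk confusion."""
    candidate = candidate.upper()
    hits = []
    for mark in marks:
        score = SequenceMatcher(None, candidate, mark.upper()).ratio()
        if score >= threshold:
            hits.append((mark, round(score, 2)))
    return sorted(hits, key=lambda kv: -kv[1])

print(likely_conflicts("Heirlume", registered_marks))
# Near-matches such as "HAIRLUME" and "HEIRLOOM" score high even though an
# exact-match search portal would never surface them.
```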

Another advantage of using machine-powered data processing and filing is that on the government trademark office side, the systems are looking for highly organized, curated data sets that are difficult for even trained people to get consistently right. Human error in just data entry can cause massive backlogs, MacDonnell notes, even resulting in entire applications having to be tossed and started over from scratch.

“There are all sorts of datasets for those [trademark requirement] parameters,” she said. “Essentially, we synthesize all of that, and the goal through machine learning is to make sure that applications are utterly compliant with government rules. We actually have a senior level trademark examiner that came to work for us, very excited that we were solving the problems causing backlogs within the government. She said that if Heirlume can get to a point where the applications submitted are perfect, there will be no backlog with the government.”

Improving efficiency within the trademark registration bodies means one less point of friction for small business owners when they set out to establish their company, which means more economic activity and upside overall. MacDonnell ultimately hopes that Heirlume can help reduce friction to the point where trademark ownership is at the forefront of the business process, even before domain registration. Heirlume has a partnership with Google Domains to that end, which will eventually see indication of whether a domain name is likely to be trademarkable included in Google Domain search results.

This initial seed funding includes participation from Backbone Angels, as well as the Future Capital collective, Angels of Many and MaRS IAF, along with angel investors including Daniel Debow, Sid Lee’s Bertrand Cesvet and more. MacDonnell notes that just as their goal was to bring more access and equity to small business owners when it comes to trademark protection, the startup was also very intentional in building its team and its cap table. MacDonnell, along with co-founders CTO Sarah Guest and Dave McDonnell, aims to build the largest tech company with a majority female-identifying technology team. Its investor make-up includes 65% female-identifying or underrepresented investors, and MacDonnell says that was a very intentional choice that extended the time of the raise, and even led to turning down interest from some leading Silicon Valley firms.

“We want underrepresented founders to be funded, and the best way to ensure that change is to empower underrepresented investors,” she said. “I think that we all have a responsibility to actually do something. We’re all using hashtags right now, and hashtags are not enough […] Our CTO is female, and she’s often been the only female person in the room. We’ve committed to ensuring that women in tech are no longer the only person in the room.”

The era of the European insurtech IPO will soon be upon us

By Ram Iyer
Phil Edmondson-Jones Contributor
Phil Edmondson-Jones is a principal at Oxx, the specialist SaaS VC backing Europe and Israel's most promising B2B SaaS businesses at the scale-up stage.

Once the uncool sibling of a flourishing fintech sector, insurtech is now one of the hottest areas of a buoyant venture market. Zego’s $150 million round at unicorn valuation in March, a rumored giant incoming round for WeFox, and a slew of IPOs and SPACs in the U.S. are all testament to this.

It’s not difficult to see why. The insurance market is enormous, but the sector has suffered from notoriously poor customer experience and major incumbents have been slow to adapt. Fintech has set a precedent for the explosive growth that can be achieved with superior customer experience underpinned by modern technology. And the pandemic has cast the spotlight on high-potential categories, including health, mobility and cybersecurity.

This has begun to brew a perfect storm of conditions for big European insurtech exits. Here are four trends to look out for as the industry powers toward several European IPOs and a red-hot M&A market in the next few years.

Full-stack insurtech continues to conquer

Several early insurtech success stories started life as managing general agents (MGAs). Unlike brokers, MGAs manage claims and underwriting, but unlike a traditional insurer, pass risk off their balance sheet to third-party insurers or reinsurers. MGAs have provided a great way for new brands to acquire customers and underwrite policies without actually needing a fully fledged balance sheet. But it’s a business model with thin margins, so MGAs increasingly are trying to internalize risk exposure by verticalizing into a “full-stack” insurer in the hope of improving their unit economics.

This structure has been prevalent in the U.S., with some of the bigger recent U.S. insurtech IPO successes (Lemonade and Root), SPACs (Clover and MetroMile), and more upcoming listings (Hippo and Next) pointing to the prizes available to those who can successfully execute this expensive growth strategy.

Hacking my way into analytics: A creative’s journey to design with data

By Annie Siebert
Sydney Anh Mai Contributor
Sydney Anh Mai is an award-winning product designer at Kickstarter. Her work has appeared on The Verge, Design Weekly and Core 77.

Growing up, did you ever wonder how many chairs you’d have to stack to reach the sky?

No? I guess that’s just me then.

As a child, I always asked a lot of “how many/much” questions. Some were legitimate (“How much is 1 USD in VND?”); some were absurd (“How tall is the sky and can it be measured in chairs?”). So far, I’ve managed to maintain my obnoxious statistical probing habit without making any mortal enemies in my 20s. As it turns out, that habit comes with its perks when working in product.

My first job as a product designer was at a small but energetic fintech startup whose engineers also dabbled in pulling data. I constantly bothered them with questions like, “How many exports did we have from that last feature launched?” and “How many admins created at least one rule on this page?” I was curious about quantitative analysis but did not know where to start.

I knew I wasn’t the only one. Even then, there was a growing need for basic data literacy in the tech industry, and it’s only getting more taxing by the year. Words like “data-driven,” “data-informed” and “data-powered” increasingly litter every tech organization’s product briefs. But where does this data come from? Who has access to it? How might I start digging into it myself? How might I leverage this data in my day-to-day design once I get my hands on it?

Data discovery for all: What’s in the way?

“Curiosity is our compass” is one of Kickstarter’s guiding principles. Powered by a desire for knowledge and information, curiosity is the enemy of many larger, older and more structured organizations — whether they admit it or not — because it hinders the production flow. Curiosity makes you pause and take time to explore and validate the “ask.” Asking as many what’s, how’s, why’s, who’s and how many’s as possible is important to help you learn if the work is worth your time.

Lobus raises $6 million for an art management platform on the blockchain

By Lucas Matney

Reshaping ownership proofs in the fine art markets has been one of the blockchain’s clearest real-world use cases. But in recent months as top auction houses have embraced NFTs and popular artists experiment with the crypto medium, that future has seemed more tangible than ever before.

The ex-Christie’s and Sotheby’s team at Lobus is aiming to commoditize blockchain tech with an asset management platform that they hope can bring creator-friendly mechanisms from NFT marketplaces like SuperRare to the physical art world as well, allowing art owners to maintain partial ownership of the works they sell so that they can benefit from secondary transactions down the line. While physical art sellers have grown accustomed to selling 100% of their work while seeing that value accrue over time as it trades hands, Lobus’s goal is for artists to maintain fractional ownership throughout those sales, ensuring that they earn a commission on sales down the road. It’s a radical idea and a logistical nightmare made feasible by the blockchain’s approach to ownership.
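
The economics of that fractional model are easy to illustrate. The snippet below is a simplified, hypothetical payout calculation rather than Lobus’s actual contract logic; the retained share and sale prices are invented.

```python
# Hypothetical secondary-sale payout under fractional ownership: if the artist
# retained, say, a 5% stake when the work first sold, each later resale routes
# that share of the proceeds back to them. All figures are illustrative only.
def secondary_sale_payouts(sale_price, artist_share=0.05):
    """Split resale proceeds between the current owner and the original artist."""
    artist_cut = sale_price * artist_share
    return {"artist": artist_cut, "seller": sale_price - artist_cut}

for price in (10_000, 150_000):  # two hypothetical resales, in dollars
    print(price, secondary_sale_payouts(price))
# Unlike the traditional model, where the artist receives nothing after the first
# sale, the retained fraction keeps paying out as the work changes hands.
```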

“We’re really on a mission of making artists into owners,” Lobus co-CEO Sarah Wendell Sherrill tells TechCrunch. “We are really leveraging the best of what NFTs are putting out there about ownership and asking the questions of how to help create different ownership structures and interrupt this asset class.”

The startup is encapsulating these new mechanics in a wide-reaching art asset management platform that they hope can entice users of the aging legacy software suites being used today. Teaming robust ownership proofs with a CRM, analytics platform and tools like dynamic pricing, Lobus wants to give the art market its own Carta-like software platform that is approachable to the wider market.

Lobus tells TechCrunch they have raised $6 million from Upside Capital, 8VC, Franklin Templeton, Dream Machine, Weekend Fund and BoostVC, among others. Angels participating in the round include Rob Hayes, Troy Carter, Suzy Ryoo, Rebecca and Cal Henderson, Henry Ward and Lex Sokolin.

A big goal for the team has been removing the complexities of understanding what the blockchain is and instead focusing on what their tech can deliver to their network of art owners. While the NFT boom of the past few months has already produced billions in sales, efforts like Lobus are attempting to cross-pollinate the mechanics of crypto art with the global art market in an effort to put stakeholders across the board on the same footing. In addition to having partnerships with around 300 active artists, Lobus has also sold their platform to collectors, artist estates and asset managers.

At the moment, Lobus has around 45,000 art objects in its database, encompassing about $5.4 billion in asset value across physical and digital objects.

5 emerging use cases for productivity infrastructure in 2021

By Ram Iyer
Gleb Polyakov Contributor
Gleb Polyakov is co-founder and CEO of Nylas, which provides productivity infrastructure solutions for modern software. Gleb studied Physics at Georgia Tech and enjoys chess, motorcycles and space. Previously, he worked in finance and founded an IoT coffee company.

When the world flipped upside down last year, nearly every company in every industry was forced to implement a remote workforce in just a matter of days — they had to scramble to ensure employees had the right tools in place and customers felt little to no impact. While companies initially adopted solutions for employee safety, rapid response and short-term air cover, they are now shifting their focus to long-term, strategic investments that empower growth and streamline operations.

As a result, categories that make up productivity infrastructure — cloud communications services, API platforms, low-code development tools, business process automation and AI software development kits — grew exponentially in 2020. This growth was boosted by an increasing number of companies prioritizing tools that support communication, collaboration, transparency and a seamless end-to-end workflow.

Productivity infrastructure is on the rise and will continue to be front and center as companies evaluate what their future of work entails and how to maintain productivity, rapid software development and innovation with distributed teams.

According to McKinsey & Company, the pandemic accelerated the share of digitally enabled products by seven years, and “the digitization of customer and supply-chain interactions and of internal operations by three to four years.” As demand continues to grow, companies are taking advantage of the benefits productivity infrastructure brings to their organization both internally and externally, especially as many determine the future of their work.

Automate workflows and mitigate risk

Developers rely on platforms throughout the software development process to connect data, process it, increase their go-to-market velocity and stay ahead of the competition with new and existing products. They have enormous amounts of end-user data on hand, and productivity infrastructure can remove barriers to access, integrate and leverage this data to automate the workflow.

Access to rich interaction data combined with pre-trained ML models, automated workflows and configurable front-end components enables developers to drastically shorten development cycles. Through enhanced data protection and compliance, productivity infrastructure safeguards critical data and mitigates risk while reducing time to ROI.

As the post-pandemic workplace begins to take shape, how can productivity infrastructure support enterprises where they are now and where they need to go next?

Kandji nabs $60M Series B as Apple device management platform continues to thrive

By Ron Miller

During the pandemic, having an automated solution for onboarding and updating Apple devices remotely has been essential, and today Kandji, a startup that helps IT do just that, announced a hefty $60 million Series B investment.

Felicis Ventures led the round with participation from SVB Capital, Greycroft, Okta Ventures and The Spruce House Partnership. Today’s round comes just 7 months after a $21 million Series A, bringing the total raised across three rounds to $88.5 million, according to the company.

CEO Adam Pettit says that the company has been growing by leaps and bounds since the funding round last October.

“We’ve seen a lot more traction than even originally anticipated. I think every time we’ve put targets up onto the board of how quickly we would grow, we’ve accelerated past them,” he said. He said that one of the primary reasons for this growth has been the rapid move to work from home during the pandemic.

“We’re working with customers across 40+ industries now, and we’re even seeing international customers come in and purchase so everyone now is just looking to support remote workforces and we provide a really elegant way for them to do that,” he said.

While Pettit didn’t want to discuss exact revenue numbers, he did say that it has tripled since the Series A announcement. That is being fueled in part he says by attracting larger companies, and he says they have been seeing more and more of them become customers this year.

As they’ve grown revenue and added customers, they’ve also brought on new employees, growing from 40 to 100 since October. Pettit says that the startup is committed to building a diverse and inclusive culture at the company and a big part of that is making sure you have a diverse pool of candidates to choose from.

“It comes down to, at the onset, just making the decision that it’s important to you and it’s important to the company, which we’ve done. Then you take it step by step all the way through, and we start at the back of the funnel, where our candidates are coming from.”

That means clearly telling their recruiting partners that they want a diverse candidate pool. One way to do that is being remote and having a broader talent pool to work with. “We realized that in order to hold true to [our commitment], it was going to be really hard to do that just sticking to the core market of San Diego or San Francisco, and so now we’ve expanded nationally and this has opened up a lot of [new] pools of top tech talent,” he said.

Pettit is thinking hard right now about how the startup will run its offices whenever employees are allowed back, especially with some living outside major tech hubs. Clearly it will have some remote component, but he says that the tricky part will be making sure that the folks who aren’t coming into the office still feel fully engaged and part of the team.

Tiger Global backs Indian crypto startup at over $500M valuation

By Manish Singh

Coinswitch Kuber, a startup that allows young users in India to invest in cryptocurrencies, said on Thursday it has raised $25 million in a new financing round as it looks to expand its reach in India, the world’s second largest internet market and also the place where the future of private cryptocurrencies remains uncertain for now.

Tiger Global financed the entire Series B funding round of Coinswitch Kuber and valued the three-year-old Indian startup at over $500 million. The announcement of Series B comes just three months after Coinswitch closed its $15 million Series A round from Ribbit Capital, Sequoia Capital India, and Kunal Shah. The Bangalore-based startup has raised $41.5 million to date.

TechCrunch reported earlier this month that the New York-headquartered technology hedge fund had led or was in advanced stages of talks to lead investments in many Indian startups including Coinswitch.

Coinswitch Kuber is one of a handful of startups operating in India’s cryptocurrency space today. The crypto exchange allows users to buy slivers of several popular cryptocurrencies. A user on Coinswitch, for instance, can buy small sachets of bitcoin and other currencies for as little as 100 Indian rupees (~$1.30).

The startup said it has amassed over 4.5 million users, more than half of whom are aged 25 or younger. In the past 11 months, Coinswitch Kuber said it processed transactions over $5 billion.

But how the startup, which aims to add 5.5 million users by the end of this year, performs in the future is not entirely in its hands.

While trading of private cryptocurrency such as bitcoin is currently legal in India, New Delhi is widely expected to introduce a law that bans all private cryptocurrency.

Ashish Singhal, co-founder and chief executive of Coinswitch Kuber, said he is optimistic that India will not ban private cryptocurrencies, but noted that the startup closed the financing round with Tiger Global before New Delhi signaled its intention to formulate a law.

“This investment round brings us at par with some of the most sought after cryptocurrency companies in the world and sets us up for the long run,” said Singhal.

In recent months, some crypto startups in India have started to explore a contingency plan in the event the nation does end up banning cryptocurrency trading in the country. Many startups are today building in India, but focusing on serving customers overseas.

“As they build India’s leading cryptocurrency platform, CoinSwitch is well positioned to capture the tremendous growing interest in crypto among retail investors. We are excited to partner with CoinSwitch as they innovate in this emerging asset class,” said Scott Shleifer, Partner at Tiger Global, in a statement.

Business continuity planning is a necessity for your fund and portfolio

By Ram Iyer
Will Poole Contributor
Will Poole is co-founder and managing partner of Capria Ventures, a global financial services firm leading, partnering with and funding the largest network of fund managers collaborating to deliver superior returns and scaled impact in emerging markets.

Just shy of a year ago, I sent an email to our global fund manager partners and to our direct portfolio CEOs titled “Only the decisive survive.” At that time, not many outside of China were concerned about COVID-19. However, I was obsessed.

Hearing stories from fund manager friends with operations in China, I knew things were worse than what the Chinese press were telling the world. And I live only five miles south of the location of the first COVID death in the U.S. The pandemic was accelerating exponentially, and I wanted to get all of our partners to open their eyes to the risks and prepare as well as they could.

I’m not writing with that level of intensity or urgency this time, but I am concerned. We all need to be taking precautionary measures, not just in light of COVID, but to ensure our firms can continue to thrive when faced with unexpected tragedy.

My partner Susana invested in 90 funds over 20 years — she’s seen everything from motorcycle accidents to depression take out fund managers and CEOs. Life works that way sometimes, and it’s not always someone else. It’s the “What happens if I get hit by a bus?” scenario. In this case, the bus happens to be a global pandemic.

One of our funds in Asia recently reported COVID cases in three CEOs among their 23 companies. While developed market infections and deaths are trending down, many countries are seeing serious new outbreaks, and some, like Brazil, are doing badly.

Pandemic forecasting site IHME predicts a growing caseload across sub-Saharan Africa and the East Asia and Pacific regions. The Latin America and Caribbean (LAC) region is trending down overall, but some countries, including Colombia, are expected to experience a second (or third) wave of infections.

As the Economist said in mid-February, “Coronavirus is not done with humanity yet.”

Planning for your fund

A month or so ago, we were trying to move forward with an investment in a fund in Africa with whom we had been speaking and doing due diligence for a few months. They went radio silent for over two weeks. We didn’t know whether to be miffed, concerned for their health, or what.

Data scientists: Bring the narrative to the forefront

By Ram Iyer
Peter Wang Contributor
Peter Wang is CEO and co-founder of data science platform Anaconda. He’s also a co-creator of the PyData community and conferences, and a member of the board at the Center for Humane Technology.

By 2025, 463 exabytes of data will be created each day, according to some estimates. (For perspective, one exabyte of storage could hold 50,000 years of DVD-quality video.) It’s now easier than ever to translate physical and digital actions into data, and businesses of all types have raced to amass as much data as possible in order to gain a competitive edge.
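
That comparison survives a quick back-of-the-envelope check. The arithmetic below assumes DVD-quality video runs on the order of 2 GB per hour; the bitrate is an assumption for illustration, not a figure from the article.

```python
# Sanity check of "one exabyte ~ 50,000 years of DVD-quality video".
EXABYTE_GB = 1e9              # 1 EB = 10^18 bytes = 10^9 GB (decimal units)
GB_PER_HOUR = 2.0             # assumed DVD-quality video bitrate
HOURS_PER_YEAR = 24 * 365.25

hours = EXABYTE_GB / GB_PER_HOUR
years = hours / HOURS_PER_YEAR
print(f"{years:,.0f} years")  # ~57,000 years, the same ballpark as the claim
```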

However, in our collective infatuation with data (and obtaining more of it), what’s often overlooked is the role that storytelling plays in extracting real value from data.

The reality is that data by itself is insufficient to really influence human behavior. Whether the goal is to improve a business’ bottom line or convince people to stay home amid a pandemic, it’s the narrative that compels action, rather than the numbers alone. As more data is collected and analyzed, communication and storytelling will become even more integral in the data science discipline because of their role in separating the signal from the noise.

Yet this can be an area where data scientists struggle. In Anaconda’s 2020 State of Data Science survey of more than 2,300 data scientists, nearly a quarter of respondents said that their data science or machine learning (ML) teams lacked communication skills. This may be one reason why roughly 40% of respondents said they were able to effectively demonstrate business impact “only sometimes” or “almost never.”

The best data practitioners must be as skilled in storytelling as they are in coding and deploying models — and yes, this extends beyond creating visualizations to accompany reports. Here are some recommendations for how data scientists can situate their results within larger contextual narratives.

Make the abstract more tangible

Ever-growing datasets help machine learning models better understand the scope of a problem space, but more data does not necessarily help with human comprehension. Even for the most left-brain of thinkers, it’s not in our nature to understand large abstract numbers or things like marginal improvements in accuracy. This is why it’s important to include points of reference in your storytelling that make data tangible.

For example, throughout the pandemic, we’ve been bombarded with countless statistics around case counts, death rates, positivity rates, and more. While all of this data is important, tools like interactive maps and conversations around reproduction numbers are more effective than massive data dumps in terms of providing context, conveying risk, and, consequently, helping change behaviors as needed. In working with numbers, data practitioners have a responsibility to provide the necessary structure so that the data can be understood by the intended audience.

Saltbox raises $10.6M to help booming e-commerce stores store their goods

By Mary Ann Azevedo

E-commerce is booming, but among the biggest challenges for entrepreneurs of online businesses are finding a place to store the items they are selling and dealing with the logistics of operating.

Tyler Scriven, Maxwell Bonnie and Paul D’Arrigo co-founded Saltbox in an effort to solve that problem.

The trio came up with a unique “co-warehousing” model that provides space for small businesses and e-commerce merchants to operate as well as store and ship goods, all under one roof. Beyond the physical offering, Saltbox offers integrated logistics services as well as amenities such as the rental of equipment and packing stations and access to items such as forklifts. There are no leases and tenants have the flexibility to scale up or down based on their needs.

“We’re in that sweet spot between co-working and raw warehouse space,” said CEO Scriven, a former Palantir executive and Techstars managing director.

Saltbox opened its first facility — a 27,000-square-foot location — in its home base of Atlanta in late 2019, filling it within two months. It recently opened its second facility, a 66,000-square-foot location, in the Dallas-Fort Worth area that is currently about 40% occupied. The company plans to end 2021 with eight locations, in particular eyeing the Denver, Seattle and Los Angeles markets. Saltbox has locations slated to come online as large as 110,000 square feet, according to Scriven.

The startup was founded on the premise that the need for “co-warehousing and SMB-centric logistics enablement solutions” has become a major problem for many new businesses that rely on online retail platforms to sell their goods, noted Scriven. Many of those companies are limited to self-storage and mini-warehouse facilities for storing their inventory, which can be expensive and inconvenient. 

Scriven personally met with challenges when starting his own e-commerce business, True Glory Brands, a retailer of multicultural hair and beauty products.

“We became aware of the lack of physical workspace for SMBs engaged in commerce,” Scriven told TechCrunch. “If you are in the market looking for 10,000 square feet of industrial warehouse space, you are effectively pushed to the fringes of the real estate ecosystem and then the entrepreneurial ecosystem at large. This is costing companies in significant but untold ways.”

Now, Saltbox has completed a $10.6 million Series A round of financing led by Palo Alto-based Playground Global that included participation from XYZ Venture Capital and proptech-focused Wilshire Lane Partners in addition to existing backers Village Capital and MetaProp. The company plans to use its new capital primarily to expand into new markets.

The company’s customers are typically SMB e-commerce merchants “generating anywhere from $50,000 to $10 million a year in revenue,” according to Scriven.

He emphasizes that the company’s value prop is “quite different” from a traditional flex office/co-working space.

“Our members are reliant upon us to support critical workflows,” Scriven said. 

Besides e-commerce occupants, many service-based businesses are users of Saltbox’s offering, he said, such as those providing janitorial services or that need space for physical equipment. The company offers all-inclusive pricing models that include access to loading docks and a photography studio, for example, in addition to utilities and Wi-Fi.

Image Credits: Saltbox


The company secures its properties with a mix of buying and leasing by partnering with institutional real estate investors.

“These partners are acquiring assets and in most cases, are funding the entirety of capital improvements by entering into management or revenue share agreements to operate those properties,” Scriven said. He said the model is intentionally different from that of “notable flex space operators.”

“We have obviously followed those stories very closely and done our best to learn from their experiences,” he added. 

Investor Adam Demuyakor, co-founder and managing partner of Wilshire Lane Partners, said his firm was impressed with the company’s ability to “structure excellent real estate deals” to help it continue to expand nationally.

He also believes Saltbox is “extremely well-positioned to help power and enable the next generation of great direct to consumer brands.”

Playground Global General Partner Laurie Yoler said the startup provides a “purpose-built alternative” for small businesses that have been fulfilling orders out of garages and self-storage units.

Saltbox recently hired Zubin Canteenwalla to serve as its chief operating officer. He joined Saltbox from Industrious, an operator of co-working spaces, where he was SVP of Real Estate. Prior to Industrious, he was EVP of Operations at Common, a flexible residential living brand, where he led the property management and community engagement teams.

For startups choosing a platform, a decision looms: Build or buy?

By Annie Siebert
TX Zhuo Contributor
TX Zhuo is the managing partner of Fika Ventures, focusing on fintech, enterprise software and marketplace opportunities.
Colton Pace Contributor
Colton Pace is an investor at Fika Ventures. He previously held roles investing at Vulcan Capital and Madrona Venture Labs.

Everyone warns you not to build on top of someone else’s platform.

When I first started in VC more than 10 years ago, I was told never to invest in a company building on top of another company’s platform. Dependence on a platform makes you susceptible to failure and caps the return on your investment because you have no control over API access, pricing changes and end-customer data, among other legitimate concerns.

I am sure many of you recall Facebook shutting down its API access back in 2015, or the uproar Apple caused when it decided to change the commission it was charging app developers in 2020.

Put simply, founders can no longer avoid the decision around platform dependency.

Salesforce in many ways paved the way for large enterprise platform companies, becoming the first dedicated SaaS company to surpass $10 billion in annual revenue, supported by its open application development marketplace. Salesforce’s success has given rise to dominant platforms in other verticals, and for founders starting companies, there is no avoiding that platform decision these days.

Some points to consider:

  • Over 4,000 fintech companies, including several unicorns, have built their platforms on top of Plaid.
  • Recruiters may complain about the cost, but 95% still utilize LinkedIn.
  • More than 20,000 companies trust Segment to be their system of record for customer data.
  • Shopify powers over 1 million businesses across the globe.
  • Epic has the medical records of nearly 50% of the U.S. population.

What does this mean for founders who decide to build on top of another platform?

Increase speed to market

PostScript, an SMS/MMS marketing platform for commerce brands, built its platform on Shopify, giving it immediate access to over 1 million brands and a direct customer acquisition funnel. That has allowed PostScript to capture 3,500 of its own customers and successfully close a $35 million Series B in March 2021.

Ability to focus on core functionality

Varo, one of the fastest-growing neobanks, started in 2015 with the principle that a bank could put customers’ interests first and be profitable. But in order to deliver on its mission, it needed to understand where its customers were spending their money. By partnering with Plaid, Varo enabled more than 176,000 of its users to connect their Varo account to outside apps and services, allowing the company to focus on its core mission of providing more relevant financial products and services.
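For context on what “building on Plaid” looks like in practice, the sketch below walks through the standard Plaid Link token-exchange flow against Plaid’s sandbox environment. The credentials, user IDs and product list are placeholders, and this is an illustration of the public API pattern, not a description of Varo’s integration.

```python
# Minimal sketch of the Plaid Link token-exchange flow (sandbox environment).
# Credentials and IDs below are placeholders, not real values.
import requests

PLAID_BASE = "https://sandbox.plaid.com"
CLIENT_ID = "your-plaid-client-id"      # placeholder
SECRET = "your-plaid-sandbox-secret"    # placeholder

# 1. Server-side: create a link_token that the client uses to open Plaid Link.
link_token = requests.post(f"{PLAID_BASE}/link/token/create", json={
    "client_id": CLIENT_ID,
    "secret": SECRET,
    "client_name": "Example Fintech App",
    "user": {"client_user_id": "user-123"},
    "products": ["auth", "transactions"],
    "country_codes": ["US"],
    "language": "en",
}).json()["link_token"]

# 2. The user completes Plaid Link in the app; the client hands back a
#    short-lived public_token, which is exchanged for a long-lived access_token.
public_token = "public-sandbox-..."  # returned by the Link client flow
access_token = requests.post(f"{PLAID_BASE}/item/public_token/exchange", json={
    "client_id": CLIENT_ID,
    "secret": SECRET,
    "public_token": public_token,
}).json()["access_token"]

# 3. Use the access_token to read linked-account data, e.g. balances.
balances = requests.post(f"{PLAID_BASE}/accounts/balance/get", json={
    "client_id": CLIENT_ID,
    "secret": SECRET,
    "access_token": access_token,
}).json()
print([account["name"] for account in balances["accounts"]])
```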

Gain credibility by association

Autodesk acquires Upchain

By Frederic Lardinois

Autodesk, the publicly traded software company best known for its CAD and 3D modeling tools, today announced that it has acquired Upchain, a Toronto-based startup that offers a cloud-based product lifecycle management (PLM) service. The two companies, which didn’t disclose the acquisition price, expect the transaction to close by July 31, 2021.

Since its launch in 2015, Upchain raised about $7.4 million in funding, according to Crunchbase. The central idea behind the service was that existing lifecycle management solutions, which are meant to help businesses take new products from inception to production and collaborate with their supply chain in the process, were cumbersome and geared toward large multi-national enterprises. Upchain’s focus is on small and mid-sized companies, and it promises to be more affordable and usable than other solutions. Its customer base spans a wide range of industries, from textiles and apparel to automotive, aerospace, industrial machines, transportation and entertainment.

“We’ve had a singular focus at Upchain to up-level cloud collaboration across the entire product lifecycle, changing the way that people work together so that everyone has access to the data they need, when they need it,” Upchain CEO and founder John Laslavic said in today’s announcement. “Autodesk shares our vision for radically simplifying how engineers and manufacturers across the entire value chain collaborate and bring a top-quality product to market faster. I look forward to seeing how Upchain and Autodesk, together, take that vision to the next level in the months and years to come.”

For Autodesk, this is the company’s 15th acquisition since 2017. Earlier this year, the company made its first $1 billion acquisition when it bought Portland, OR-based Innovyze, a 35-year-old company that focuses on modeling and lifecycle management for the water management industry. 

“Resilience and collaboration have never been more critical for manufacturers as they confront the increasing complexity of developing new products. We’re committed to addressing those needs by offering the most robust end-to-end design and manufacturing platform in the cloud,” said Andrew Anagnost, President and CEO of Autodesk. “The convergence of data and processes is transforming the industry. By integrating Upchain with our existing offerings, Autodesk customers will be able to easily move data without barriers and will be empowered to unlock and harness valuable insights that can translate to fresh ideas and business success.”

Building customer-first relationships in a privacy-first world is critical

By Ram Iyer
Travis Clinger Contributor
Travis Clinger is SVP, head of addressability and ecosystem at LiveRamp, a data connectivity platform safely moving data through the pipes connecting most every brand, tech platform, publisher and advertiser on the open internet.
Jeff Nienaber Contributor
Jeff Nienaber is senior director, global audience ads at Microsoft Advertising, which provides intelligent solutions that empower advertisers to deliver engaging, personalized experiences to over half a billion people worldwide.

In business today, many believe that consumer privacy and business results are mutually exclusive — to excel in one area is to lack in the other. Consumer privacy is seen by many in the technology industry as an area to be managed.

But the truth is, the companies that champion privacy will be better positioned to win in all areas. This is especially true as the digital industry continues to undergo tectonic shifts in privacy — both in government regulation and browser updates.

By the end of 2022, all major browsers will have phased out third-party cookies — the tracking codes placed on a visitor’s computer by a website other than the one being visited. Additionally, mobile device makers are limiting the identifiers allowed on their devices and applications. Across industry verticals, the global enterprise ecosystem now faces a critical moment in which digital advertising will be forever changed.

Up until now, consumers have enjoyed a mostly free internet experience, but as publishers adjust to a cookie-less world, they could see more paywalls and less free content.

They may also see a decrease in the creation of new free apps, mobile gaming, and other ad-supported content unless businesses find new ways to authenticate users and maintain a value exchange of free content for personalized advertising.

When consumers authenticate themselves to brands and sites, they create revenue streams for publishers as well as the opportunity to receive discounts, first-looks, and other specially tailored experiences from brands.

To protect consumer data, companies need to architect internal systems around data custodianship versus acting from a sense of data entitlement. While this is a challenging and massive ongoing evolution, the benefits of starting now are enormous.

Putting privacy front and center creates a sustainable digital ecosystem that enables better advertising and drives business results. There are four steps to consider when building for tomorrow’s privacy-centric world:

Transparency is key

As we collectively look to redesign how companies interact with and think about consumers, we should first recognize that putting people first means putting transparency first. When people trust a brand’s or publisher’s intentions, they are more willing to share their data and identity.

This process, where consumers authenticate themselves — or actively share their phone number, email or other form of identity — in exchange for free content or another form of value, allows brands and publishers to get closer to them.

1Password acquires SecretHub and launches new enterprise secrets management tool

By Frederic Lardinois

1Password, the password management service that competes with the likes of LastPass and BitWarden, today announced a major push beyond the basics of password management and into the infrastructure secrets management space. To do so, the company has acquired secrets management service SecretHub and is now launching its new 1Password Secrets Automation service.

1Password did not disclose the price of the acquisition. According to Crunchbase, Netherlands-based SecretHub never raised any institutional funding ahead of today’s announcement.

For companies like 1Password, moving into the enterprise space, where businesses have to manage corporate credentials, API tokens, keys and certificates for both individual users and increasingly complex infrastructure services, seems like a natural move. And with the combination of 1Password and its new Secrets Automation service, businesses can use a single tool that covers everything from managing their employees’ passwords to handling infrastructure secrets. 1Password is currently in use by more than 80,000 businesses worldwide, and a lot of these are surely potential users of its Secrets Automation service, too.

“Companies need to protect their infrastructure secrets as much if not more than their employees’ passwords,” said Jeff Shiner, CEO of 1Password. “With 1Password and Secrets Automation, there is a single source of truth to secure, manage and orchestrate all of your business secrets. We are the first company to bring both human and machine secrets together in a significant and easy-to-use way.”
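To make the distinction between human and machine secrets concrete, the sketch below shows the general pattern secrets-automation tools encourage: an application fetches infrastructure credentials from a secrets store at runtime instead of hardcoding them. The endpoint, vault and item names here are hypothetical placeholders for illustration only, not 1Password’s actual API surface.

```python
# Hypothetical sketch of runtime secrets injection. The secrets-store endpoint,
# vault and item names are placeholders, not 1Password's actual API.
import os
import requests

SECRETS_URL = os.environ["SECRETS_STORE_URL"]       # e.g. an internal secrets server (hypothetical)
SECRETS_TOKEN = os.environ["SECRETS_ACCESS_TOKEN"]  # short-lived machine credential

def get_secret(vault: str, item: str) -> str:
    """Fetch a single secret value from the (hypothetical) secrets store."""
    resp = requests.get(
        f"{SECRETS_URL}/v1/vaults/{vault}/items/{item}",
        headers={"Authorization": f"Bearer {SECRETS_TOKEN}"},
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()["value"]

# The database password never appears in code, config files or the repo.
db_password = get_secret(vault="infrastructure", item="prod-db-password")
```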

In addition to the acquisition and new service, 1Password also today announced a new partnership with GitHub. “We’re partnering with 1Password because their cross-platform solution will make life easier for developers and security teams alike,” said Dana Lawson, VP of partner engineering and development at GitHub, the largest and most advanced development platform in the world. “With the upcoming GitHub and 1Password Secrets Automation integration, teams will be able to fully automate all of their infrastructure secrets, with full peace of mind that they are safe and secure.”

JXL turns Jira into spreadsheets

By Frederic Lardinois

Atlassian’s Jira is an extremely powerful issue tracking and project management tool, but it’s not the world’s most intuitive piece of software. Spreadsheets, on the other hand, are pretty much the de facto standard for managing virtually anything in a business. It’s maybe no surprise then that there are already a couple of tools on the market that bring a spreadsheet-like view of your projects to Jira or connect it to services like Google Sheets.

The latest entrant in this field is JXL Spreadsheets for Jira (and specifically Jira Cloud), which was founded by two ex-Atlassian employees, Daniel Franz and Hannes Obweger. And in what has become a bit of a trend, Atlassian Ventures invested in JXL earlier this year.

Franz built the Good News news reader before joining Atlassian, while his co-founder previously founded Radiant Minds Software, the makers of Jira Roadmaps (now Portfolio for Jira), which was acquired by Atlassian.

Image Credits: JXL

“Jira is so successful because it is awesome,” Franz told me. “It is so versatile. It’s highly customizable. I’ve seen people in my time who are doing anything and everything with it. Working with customers [at Atlassian] — at some point, you didn’t get surprised anymore, but what the people can do and track with JIRA is amazing. But no one would rock up and say, ‘hey, JIRA is very pleasant and easy to use.’ ”

As Franz noted, by default, Jira takes a very opinionated view of how people should use it. That also means users often end up exporting their issues to create reports and visualizations, for example, but if they make any changes to that data, it never flows back into Jira. No matter how you feel about spreadsheets, they do work for many people and are highly flexible. Even Atlassian would likely agree: the new Jira Work Management, which is currently in beta, comes with a spreadsheet-like view, and Trello, too, recently went this way when it launched a major update earlier this year.

Image Credits: JXL

Over the course of its three-month beta, the JXL team saw how its users ended up building everything from cross-project portfolio management to sprint planning, backlog maintenance, timesheets and inventory management on top of its service. Indeed, Franz tells me that the team already has some large customers, with one of them having a 7,000-seat license.

Pricing for JXL seems quite reasonable, starting at $1/user/month for teams with up to 10 users. Larger teams get increasingly larger discounts, down to $0.45/user/month for licenses with over 5,000 seats. There is also a free trial.

One of the reasons the company can offer this kind of pricing is that it only needs a very simple backend. None of a customer’s data sits on JXL’s servers. Instead, the service sits right on top of Jira’s APIs, which also means that changes are synced back and forth in real time.
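As a rough illustration of that architecture, a spreadsheet-style client that stores nothing itself can read issues through Jira Cloud’s public REST API and push edits straight back, so Jira remains the single source of truth. The sketch below uses Jira’s documented search and issue-update endpoints; the site URL, credentials and JQL are placeholders, and this is not JXL’s actual implementation.

```python
# Sketch of a stateless read/edit round trip against the Jira Cloud REST API.
# Site, email, API token and JQL below are placeholders.
import requests
from requests.auth import HTTPBasicAuth

SITE = "https://your-domain.atlassian.net"
AUTH = HTTPBasicAuth("you@example.com", "your-api-token")

# Read: pull issues for a spreadsheet-like view (nothing is stored server-side).
resp = requests.get(
    f"{SITE}/rest/api/3/search",
    params={"jql": "project = DEMO ORDER BY created DESC", "fields": "summary,assignee"},
    auth=AUTH,
)
issues = resp.json()["issues"]
for issue in issues:
    print(issue["key"], issue["fields"]["summary"])

# Write: an edit made in the "spreadsheet" becomes an issue update pushed back to Jira.
requests.put(
    f"{SITE}/rest/api/3/issue/{issues[0]['key']}",
    json={"fields": {"summary": "Updated from the spreadsheet view"}},
    auth=AUTH,
    headers={"Content-Type": "application/json"},
)
```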

JXL is now available in the Atlassian Marketplace and the team is actively hiring as it looks to build out its product (and put its new funding to work).

Meroxa raises $15M Series A for its real-time data platform

By Frederic Lardinois

Meroxa, a startup that makes it easier for businesses to build the data pipelines to power both their analytics and operational workflows, today announced that it has raised a $15 million Series A funding round led by Drive Capital. Existing investors Root, Amplify and Hustle Fund also participated in this round, which together with the company’s previously undisclosed $4.2 million seed round now brings total funding in the company to $19.2 million.

The promise of Meroxa is that businesses can use a single platform for their various data needs and won’t need a team of experts to build their infrastructure and then manage it. At its core, Meroxa provides a single Software-as-a-Service solution that connects relational databases to data warehouses and then helps businesses operationalize that data.

Image Credits: Meroxa

“The interesting thing is that we are focusing squarely on relational and NoSQL databases into data warehouse,” Meroxa co-founder and CEO DeVaris Brown told me. “Honestly, people come to us as a real-time FiveTran or real-time data warehouse sink. Because, you know, the industry has moved to this [extract, load, transform] format. But the beautiful part about us is, because we do change data capture, we get that granular data as it happens.” And businesses want this very granular data to be reflected inside of their data warehouses, Brown noted, but he also stressed that Meroxa can expose this stream of data as an API endpoint or point it to a Webhook.

The company is able to do this because its core architecture is somewhat different from other data pipeline and integration services that, at first glance, seem to offer a similar solution. Because of this, users can use the service to connect different tools to their data warehouse but also build real-time tools on top of these data streams.

Image Credits: Meroxa

“We aren’t a point-to-point solution,” Meroxa co-founder and CTO Ali Hamidi explained. “When you set up the connection, you aren’t taking data from Postgres and only putting it into Snowflake. What’s really happening is that it’s going into our intermediate stream. Once it’s in that stream, you can then start hanging off connectors and say, ‘Okay, well, I also want to peek into the stream, I want to transfer my data, I want to filter out some things, I want to put it into S3.’”

With this flexibility, Hamidi noted, a lot of the company’s customers start with a pretty standard use case and then quickly expand into other areas as well.
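To picture the intermediate-stream pattern Hamidi describes, the sketch below consumes change-data-capture events from a Kafka topic and fans them out to more than one destination. It is a conceptual illustration only: the topic name, event format and sink helpers are hypothetical, and this is not Meroxa’s API.

```python
# Conceptual sketch of an intermediate CDC stream with multiple sinks hanging
# off it. Topic name, event shape and sink functions are hypothetical.
import json
from kafka import KafkaConsumer  # kafka-python

def write_to_warehouse(row: dict) -> None:
    """Placeholder sink: load the changed row into a data warehouse."""
    print("warehouse <-", row)

def archive_to_s3(event: dict) -> None:
    """Placeholder sink: append the raw change event to object storage."""
    print("s3 <-", event)

consumer = KafkaConsumer(
    "postgres.public.orders",            # hypothetical CDC topic, one per table
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    change = message.value  # e.g. {"op": "u", "before": {...}, "after": {...}}
    # Every downstream tool hangs off the same stream: filter, transform, route.
    if change.get("op") in ("c", "u"):   # creates and updates land in the warehouse
        write_to_warehouse(change["after"])
    archive_to_s3(change)                # everything is archived as-is
```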

Brown and Hamidi met during their time at Heroku, where Brown was a director of product management and Hamidi a lead software engineer. But while Heroku made it very easy for developers to publish their web apps, there wasn’t anything comparable in the highly fragmented database space. The team acknowledges that there are a lot of tools that aim to solve these data problems, but few of them focus on the user experience.

Image Credits: Meroxa

“When we talk to customers now, it’s still very much an unsolved problem,” Hamidi said. “It seems kind of insane to me that this is such a common thing and there is no ‘oh, of course you use this tool because it addresses all my problems.’ And so the angle that we’re taking is that we see user experience not as a nice-to-have, it’s really an enabler, it is something that enables a software engineer or someone who isn’t a data engineer with 10 years of experience in wrangling Kafka and Postgres and all these things. […] That’s a transformative kind of change.”

It’s worth noting that Meroxa uses a lot of open-source tools but the company has also committed to open-sourcing everything in its data plane as well. “This has multiple wins for us, but one of the biggest incentives is in terms of the customer, we’re really committed to having our agenda aligned. Because if we don’t do well, we don’t serve the customer. If we do a crappy job, they can just keep all of those components and run it themselves,” Hamidi explained.

Today, Meroxa, which the team founded in early 2020, has over 24 employees (and is 100% remote). “I really think we’re building one of the most talented and most inclusive teams possible,” Brown told me. “Inclusion and diversity are very, very high on our radar. Our team is 50% black and brown. Over 40% are women. Our management team is 90% underrepresented. So not only are we building a great product, we’re building a great company, we’re building a great business.”  

Jack Ma’s Ant called to end anti-competition in payments

By Rita Liao

The details for Ant’s overhaul have arrived. Ant Group, the fintech affiliate of Alibaba controlled by Jack Ma, will become a financial holding company, a change that will bring more regulatory scrutiny over how it lends and generates profits, China’s central bank said on Monday.

Ant started as an online payments processor for Alibaba marketplaces and has over time blossomed into an empire of payments, lending, wealth management and insurance. Its encroachment onto the existing financial industry had not been particularly welcome in China, and a few years ago, the giant began positioning itself as a “technology provider” rather than one competing with big banks and conventional wealth managers.

Despite these efforts, the government wanted to further rein in the fintech giant.

As part of what the government dubs a “rectification plan” for Ant, whose initial public offering was called off in November as regulators sought to curb the power of the country’s internet giants, Ant will “correct its anti-competitive practices.” That entails giving consumers more options in payment methods and removing unscrupulous tricks that lure users into taking out loans.

Ant, which has over 1 billion annual users around the world, most of whom are in China, has also been asked to end its monopoly on user data and ensure the information safety of individuals and the nation.

As a financial holding company, Ant will also need to control the liquidity risk of its financial products and shrink the size of its money-market fund, one of the world’s biggest.
