A Solid Power manufacturing engineer holds two 20 ampere hour (Ah) all solid-state battery cells for the BMW Group and Ford Motor Company. The cells were produced on Solid Power’s Colorado-based pilot production line. Source: Solid Power.
Solid-state battery systems have long been considered the next breakthrough in battery technology, with multiple startups vying to be the first to reach commercialization. Automakers have been some of the top investors in the technology, each seeking an edge that will make their electric vehicles safer, faster and longer-range.
Ford Motor Company and BMW Group have put their money on battery technology company Solid Power.
The Louisville, Colorado-based solid-state battery (SSB) developer said Monday its latest $130 million Series B funding round was led by Ford and BMW, the latest signal that the two OEMs see SSBs powering the future of transportation. Under the investment, Ford and BMW are equal equity owners, and company representatives will join Solid Power’s board.
Solid Power received additional investment in the round from Volta Energy Technologies, the venture capital firm spun out of the U.S. Department of Energy’s Argonne National Laboratory.
Solid-state batteries are so named because they lack a liquid electrolyte, as Mark Harris explained in an Extra Crunch article earlier this year. Liquid electrolyte solutions are usually flammable and at risk of overheating, so SSBs are considered generally safer. The real value of SSBs versus their lithium-ion counterparts, though, is energy density. Solid Power says its batteries can provide a 50% to 100% increase in energy density compared to today’s rechargeable batteries. Theoretically, electric vehicles with more energy-dense batteries can travel longer distances on a single charge.
This latest round of investment will help Solid Power boost its manufacturing to produce battery cells with the company’s highest ampere hour (Ah) output yet. Under separate joint development agreements with Ford and BMW, it will deliver 100 Ah cells to the OEMs for testing and vehicle integration beginning in 2022.
Until this point, the company has been manufacturing cells with 2 Ah and 10 Ah output. “Hundreds” of 2 Ah battery cells were validated by Ford and BMW late last year, Solid Power said in a statement. Meanwhile, it is currently producing 20 Ah solid-state batteries on a pilot basis with standard lithium-ion equipment.
As opposed to the 20 Ah pilot-scale cells – which are composed of 22 layers at 9×20 cm – these 100 Ah cells will have a larger footprint and even more layers, Solid Power spokesman Will McKenna told TechCrunch. (‘Layers’ refers to the number of double-sided cathodes, McKenna explained – so the 20 Ah cell has 22 cathodes and 22 anodes, with an all-solid electrolyte separator in between each, all in a single cell.)
Unlike Solid Power’s process, traditional lithium-ion battery production requires additional electrolyte filling and cycling steps. Solid Power says these steps account for between 5% and 30% of capital expenditure in a typical GWh-scale lithium-ion facility.
This isn’t the first time Solid Power has landed investments from the automakers. The company’s $20 million Series A in 2018 attracted capital from BMW and Ford, as well as Samsung, Hyundai, Volta and others. It’s part of a new wave of companies that have attracted the attention of OEMs. Other notable examples include Volkswagen-backed QuantumScape and General Motors, which has put its money on SES.
Ford is also independently researching advanced battery technologies and is planning on opening a $185 million R&D battery lab, the company said last week.
The fact that COVID-19 accelerated the need for digital transformation across virtually all sectors is old news. What companies are doing to propel success under the circumstances has been under the spotlight. However, how they do it has managed to find a place in the shadows.
Simply put, the explosive increase in innovation and adoption of digital solutions shouldn’t be allowed to take place at the expense of ethical considerations.
This is about morals — but it’s also about the bottom line. Stakeholders, both internal and external, are increasingly intolerant of companies that blur (or ignore) ethical lines. These realities add up to a need for leaders to embrace an all-new learning curve: How to engage in digital transformation that includes ethics by design.
It’s easy to rail against the evils of the executive lifestyle or golden parachuting, but more often than not, a pattern of ethics violations arises from companywide culture, not leadership alone. Ideally, employees act ethically because it aligns with their personal values. However, at a minimum, they should understand the risk that an ethical breach represents to the organization.
In my experience, those conversations are not being held. Call it poor communication or lack of vision, but most companies rarely model potential ethical risks — at least not openly. If those discussions take place, they’re typically between members of upper management, behind closed doors.
Why don’t ethical concerns get more of a “town hall” treatment? The answer may come down to an unwillingness to let go of traditional thinking about business hierarchies. It could also be related to the strong (and ironically, toxic) cultural message that positivity rules. Case in point: I’ve listened to leaders say they want to create a culture of disruptive thinking — only to promptly tell an employee who speaks up that they “lack a growth mindset.”
What’s the answer, then? There are three solutions I’ve found to be effective:
These simple solutions are a great starting point to solve ethics issues regarding digital transformation and beyond. They cause leaders to look into the heart of the company and make decisions that will impact the organization for years to come.
Making digital shifts is, by nature, a technical operation. It requires personnel with advanced and varied expertise in areas such as AI and data operations. Leaders in the digital transformation space are expected to possess enough cross-domain competency to tackle tough problems.
That’s a big ask — bringing a host of technically minded people together can easily lead to a culture of expertise arrogance that leaves people who don’t know the lingo intimidated and reluctant to ask questions.
Digital transformation isn’t simply about infrastructure or tools. It is, at its heart, about change management, and a multifunctional approach is needed to ensure a healthy transition. The biggest mistake companies can make is assuming that only technical experts should be at the table. The silos that are built as a result inevitably turn into echo chambers — the last place you want to hold a conversation about ethics.
In the rush to go digital, regardless of how technical the problem, the solution will still be a fundamentally human-centric one.
Not all ethical imperatives related to digital transformation are as debatable as the suggestion that it should be people-first; some are much more black and white, like the fact that you have to start somewhere to get anywhere.
Luckily, “somewhere” doesn’t have to be from scratch. Governance, risk and compliance (GRC) standards can be used to create a highly structured framework that’s mostly closed to interpretation and provides a solid foundation for building out and adopting digital solutions.
The utility of GRC models applies equally to startups and multinationals, and offers more than just a playbook; thoughtful application of GRC standards can also help with leadership evaluation, progress reports and risk analysis. Think of it like using bowling bumpers — they won’t guarantee you roll a strike, but they’ll definitely keep the ball out of the gutter.
Of course, a given company might not know how to create a GRC-based framework (just like most of us would be at a loss if tasked with building a set of bowling bumpers). This is why many turn to prefab foundations such as IBM OpenPages, or frameworks like COBIT and ITIL. These “starter kits” all share a single goal: Identify policies and controls that are relevant to your industry or organization and draw lines from those to pivotal compliance points.
Although the GRC process is typically cloud-based and at least partially automated, getting started requires organizationwide input and transparency. It can’t be run effectively by specific departments, or in a strictly top-down fashion. In fact, the single most important thing to understand about implementing GRC standards is that the effort will almost certainly fail unless both an organization’s leadership and its broader culture fully support the direction in which it points.
Today’s leaders — executives, entrepreneurs, influencers and more — can’t be solely concerned with “winning” the digital race. Arguably, transformation is more of a marathon than a sprint, but either way, technique matters. In pursuing the end goal of competitive advantage, the how and why matter just as much as the what.
This is true for all arms of an organization. Internal stakeholders such as owners and employees risk their careers and reputations by tolerating a peripheral approach to ethics. External stakeholders like customers, investors and suppliers have just as much to lose. Their mutual understanding of this fact is what’s behind the collective, cross-industry push for transparency.
We’ve all seen the massive blowback against individuals and brands in the public eye who allow ethical lapses on their watch. It’s impossible to fully eliminate the risk of experiencing something similar, but it is a risk that can be managed. The danger is in letting the “tech blinders” of digital transformation interfere with your view of the big picture.
Companies that want to mitigate that risk and rise to the challenges of the digital era in a truly ethical way need to start by simply having conversations about what ethics, transparency and inclusivity mean — both in and around the organization. They need to follow up those conversations with action where necessary, and with open-mindedness across the board.
It’s smart to be worried about innovation lag in a time when enterprise is moving and shifting faster than ever, but there is time to make all the proper ethical considerations. Failing to do so will only derail you down the line.
Over the past several years I’ve covered my fair share of upstart avatar companies that were all chasing the same dream — building out a customizable platform for a digital persona that gained wide adoption across games and digital spaces. Few of those startups I’ve covered in the past are still around. But by netting a string of successful partnerships with celebrity musicians, LA-based Genies has come closer than any startup before it to realizing the full vision of a wide-reaching avatar platform.
The company announced today that it has closed a $65 million Series B led by Mary Meeker’s firm Bond. NEA, Breyer Capital, Tull Investment Group, NetEase, Dapper Labs and Coinbase Ventures also participated in the deal. Mary Meeker will be joining the Genies board. The company didn’t disclose its most recent valuation.
This funding comes at an inflection point for the eight-year-old company, evidenced by the investments from NBA Top Shot-maker Dapper Labs and crypto giant Coinbase. As announced last week, the company is rolling out an NFT platform on Dapper Labs’ Flow blockchain, partnering closely with the startup, which will be building out the backend for a Genies avatar accessories storefront. Just as Dapper Labs has leveraged its exclusive deals with sports leagues to ship NFTs with official backing, Genies is planning to capitalize on the celebrities in its roster, including Justin Bieber, Shawn Mendes and Cardi B, to create a platform for buying and trading avatar accessories en masse.
In October, the company announced a brand partnership with Gucci, opening the startup to another big market opportunity.
Genies’ business has largely focused on leveraging high-profile partnerships to give its entertainer clients a digital presence that can spice up what they’re sharing on social media and beyond. As it has rolled out avatar creation to all users through beta mobile apps, Genies has been focusing on one of the more explicit dreams of the avatar companies before it: building out a broad network of avatar users and a broad network of compatible platforms through its SDK.
“An avatar is a vehicle to be able to showcase more of your authentic self,” Genies CEO Akash Nigam tells TechCrunch. “It’s not limited by real-world constraints, it’s an alter-ego personality.”
Trends in the NFT world have provided new realms of exploration for Genies, but so have broader pandemic-era trends that have pushed more users to wholly digital spaces where they socialize and connect. “The pandemic accelerated everything,” Nigam says.
Nigam emphasizes that despite the major opportunity its upcoming NFT platform presents, Genies is still an avatar company first and foremost, not an NFT startup, though he does say he believes crypto-backed digital goods are going to be around for a long time. He has few doubts that the current environment around digital goods helped juice Genies’ funding round, which he says was “6-8X oversubscribed” and was an opportunistic play for the startup, which “could have gone years without having to raise.”
The company says its crypto marketplace will launch in the coming months, as early as this summer.
Twitter Spaces, the company’s new live audio rooms feature, is opening up more broadly. The company announced today it’s making Twitter Spaces available to any account with 600 followers or more, including both iOS and Android users. It also officially unveiled some of the features it’s preparing to launch, like Ticketed Spaces, scheduling features, reminders, support for co-hosting, accessibility improvements, and more.
Along with the expansion, Twitter is making Spaces more visible on its platform, too. The company notes it has begun testing the ability to find and join a Space from a purple bubble around someone’s profile picture right from the Home timeline.
Image Credits: Twitter
Twitter says it decided on 600 followers as the minimum to gain access to Twitter Spaces based on its earlier testing. Accounts with 600 or more followers tend to have “a good experience” hosting live conversations because they have a larger existing audience who can tune in. However, Twitter says it’s still planning to bring Spaces to all users in the future.
In the meantime, it’s speeding ahead with new features and developments. Twitter has been building Spaces in public, taking into consideration user feedback as it prioritizes features and updates. Already, it has built out an expanded set of audience management controls, as users requested, introduced a way for hosts to mute all speakers at once, and added the laughing emoji to its set of reactions, after users requested it.
Now, its focus is turning towards creators. Twitter Spaces will soon support multiple co-hosts, and creators will be able to better market and even charge for access to their live events on Twitter Spaces. One feature, arriving in the next few weeks, will allow users to schedule and set reminders about Spaces they don’t want to miss. This can also help creators who are marketing their event in advance, as part of the RSVP process could involve pushing users to “set a reminder” about the upcoming show.
Twitter Spaces’ rival, Clubhouse, also just announced a reminders feature during its Townhall event on Sunday, alongside the start of its external Android testing. The two platforms, it seems, could soon be neck and neck in terms of feature set.
Image Credits: Twitter
But while Clubhouse recently launched an in-app donations feature as a means of supporting favorite creators, Twitter will soon introduce a more traditional means of generating revenue from live events: selling tickets. The company says it’s working on a feature that will allow hosts to set ticket prices and the number available for a given event, giving them a way of earning revenue from their Twitter Spaces.
A limited group of testers will gain access to Ticketed Spaces in the coming months, Twitter says. Unlike Clubhouse, which has yet to tap into creator revenue streams, Twitter will take a small cut from these ticket sales. However, it notes that the “majority” of the revenue will go to the creators themselves.
Image Credits: Twitter
Twitter also noted that it’s improving its live captions accessibility feature so captions can be paused and customized, and is working to make them more accurate.
The company will be hosting a Twitter Space of its own today around 1 PM PT to further discuss these announcements in more detail.
Education may well be the most important activity we conduct as a society — and it may also be the hardest space to build a startup in. Selling to school districts and universities is notoriously difficult, but enticing consumers is even harder. Learning takes focus, patience, tenacity and resources, and most consumers would prefer to watch some lip-sync videos on TikTok than stare at math equations (not to mention that such entertainment is free). Engagement and education feel aggressively at odds, which limits the way that startups can scale and succeed.
Yet, the revulsion VCs have traditionally had for the space has slowly dissipated over the past 10 years. Consumer and enterprise startups in edtech are increasingly attracting funding, and there is a growing crop of edtech-focused investors who are betting big on the future here. What’s changed isn’t the market or its potential, but rather the perception that ambitious and sustainable companies can truly be built in education.
One of the companies that has led the charge in transforming those perceptions is Pittsburgh-based Duolingo. It’s a language-learning app that has caught fire. From humble origins a decade ago as a translation platform for news agencies, it’s now used by 500 million people across the world to learn Spanish, English, French and more, all while generating bookings of $190 million in 2020. It’s a smashing success, but a success that was hard earned after a years-long effort of product and revenue experimentation to find its current niche.
TechCrunch’s writer and analyst for this EC-1 is Natasha Mascarenhas. Mascarenhas has been covering edtech from the very first day she joined TechCrunch as a venture capital and startups writer, and she has built up a reputation as a fearless chronicler of this increasingly vital ecosystem. The lead editor of this package was Danny Crichton, the copy editor was Richard Dal Porto, and illustrations were created by Nigel Sussman.
Duolingo had no say in the content of this analysis and did not get advance access to it. Mascarenhas has no financial ties to Duolingo or other conflicts of interest to disclose.
The Duolingo EC-1 comprises four main articles totaling 12,200 words, with a reading time of 48 minutes. Here’s what’s in store:
And finally, note that Duolingo CEO and co-founder Luis von Ahn is coming to Disrupt, so make sure to grab your tickets because the conversation will continue there.
We’re always iterating on the EC-1 format. If you have questions, comments or ideas, please send an email to TechCrunch Managing Editor Danny Crichton at firstname.lastname@example.org.
Luis von Ahn, an entrepreneur who has dedicated his career to scaling free education, has probably annoyed you more than once. In fact, you’ve likely been annoyed by his work dozens and maybe hundreds of times over the years.
A decade before he co-founded Duolingo, the whimsical language-learning app that is among the most popular education apps in the world with over 500 million downloads and 40 million active users, he was building the technology that would become CAPTCHA, those human-annoying but bot-preventing little tests that pop up when registering or logging in to popular internet services like email.
It may seem like a radical pivot, but in fact, the lessons of how to create useful security tests at scale for consumers would one day offer the core DNA for building one of the most successful edtech companies in the world. The immigrant entrepreneur would soon learn that crowdsourcing, language and a willingness to adapt and ignore critics could change the face of an industry forever.
Von Ahn grew up in Guatemala City, where he saw firsthand the wretched state of public schools in impoverished countries. His mother spent most of her income sending him to “fancy private school” as he puts it, and he estimates she spent over $1 million on his education over his lifetime. The price tag weighed on him, and he knew he wanted to broaden access to education in the future.
After attending Duke as an undergrad, von Ahn was an enterprising first-year computer science Ph.D. student at top-ranked Carnegie Mellon University when he attended a talk by Yahoo’s chief scientist about 10 of Yahoo’s biggest headaches. One issue stood out: hackers were creating bots that register thousands of email addresses to send spam.
Inspired and full of immigrant grit, von Ahn and a team led by his then-adviser Manuel Blum created a nifty little test that could distinguish between bots and humans. The test, called a CAPTCHA, presented squiggly, ink-blotted words whenever a user tried to log in. Computer vision at the time couldn’t read the obscured text, but humans easily could — creating a useful signal. The deceptively simple test worked, so von Ahn, then a 20-something student, gave it to Yahoo for free, not understanding the value it would one day have.
Luis von Ahn, the inventor of CAPTCHA and reCAPTCHA, and co-founder of Duolingo. Image Credits: Duolingo
A fire was lit. With Yahoo as a distribution channel, CAPTCHA tests exploded in popularity, becoming an almost universally recognizable security checkpoint feature. At their peak, people spent 500,000 hours a day typing up to 200 million CAPTCHAs around the world. About 10% of the world’s population had recognized at least one word, von Ahn estimates.
For all the technology’s success though, there was a downside. “During those 10 seconds while you’re typing in a CAPTCHA, your brain is doing something that computers can’t do, which is amazing,” von Ahn said. But the tests were annoying and pointless, so he wondered, “Could we get those 500,000 hours a day to do something useful for humanity?”
So in 2005, he launched reCAPTCHA. These new tests would have the same goal of CAPTCHA, but with a twist: the prompts would all be scans of books. Users would complete the security test while also helping to digitize books for the Internet Archive.
The early design of reCAPTCHA. Image Credits: Duolingo
This time, von Ahn knew his nifty idea was worth something. In 2009, he sold reCAPTCHA to Google, a transaction conducted just a year after the internet giant had purchased a license to one of his other research projects, a game focused on image labeling.
Luis von Ahn presenting about reCAPTCHA and CAPTCHA, two of his iconic inventions. Image Credits: Duolingo
The acquisition offered not just a monetary reward (exact terms of the deal were not disclosed); it also suddenly garnered von Ahn serious clout in the industry just a few years after he earned his Ph.D. Yet, instead of taking a role at the tech company, he stayed local in Pittsburgh and became a computer science professor at his alma mater.
Entering the world of education as a professor felt like an answer to his original dream of expanding access to education. What von Ahn didn’t know, though, was that his iconic work was simply foreshadowing. Carnegie Mellon, crowdsourced translation and even Google would all play a role in his next project as well, albeit in wildly different ways: incubation, failure and investment. For him, the success of two tools that used language as a barrier was the beginning of a long journey into discovering if, and how, language could instead be a bridge. It was an insight that would grow into a startup valued at $2.4 billion with the goal of making language learning fun: Duolingo.
In 2011, edtech startups such as Coursera and Codecademy were popping up — companies that today are valued as multibillion dollar businesses. The rise of iPads and tablets in classrooms gave permission to founders who believed the future of education was on the internet. Enthusiasm was boiling, and virtual instruction felt like a nascent, but ambitious place to bet on.
Duolingo CEO and co-founder Luis von Ahn was tired of the gray and dreary design aesthetic edtech companies used to emulate universities. Instead, he and the company’s early team sought inspiration from games like Angry Birds and Clash Royale, looking to build a class that screamed more cartoon anarchy than lecture hall. From that frenetic creativity came the company’s distinctive mascot: a childish and rebellious evergreen-colored owl named Duo.
Duolingo didn’t just throw out the old colors though — it wanted to completely rethink language learning from the bottom up for mobile. So it replaced top-down curriculums with analytics-driven growth strategies, becoming consumed by an ethos that has more recently been dubbed product-led growth.
Used by companies such as Calendly, Slack and Dropbox, product-led growth is a strategy in which a company iterates its product to create loyal fans-turned-customers who popularize the product with others, creating a viral growth loop. It’s an attractive route because it vastly lowers the cost of acquiring users while also increasing engagement and thus retention. Duolingo, for example, has taken this model and found ways to embed engagement hooks, pockets of joy and addictive education features within its core app.
With early venture capital in its pocket, Duolingo could afford to focus on product over profits.
In part one of this EC-1, we explored how von Ahn’s previous products around CAPTCHA led to Duolingo’s launch, the rise and fall of crowdsourced translation as a way to disrupt language learning, and the accidental iteration of a top education app by a pair of interns. The startup’s early signs of success gave it energy to focus on growth to accomplish two things: prove that what it was doing worked, and gather a lot of user data so it could continue iterating the product into something ever more addictive to use.
Now, we’ll analyze how Duolingo used product-led growth as a lever to expand its consumer base, and how a company built on gamification tries to balance its whimsy with education outcomes.
Duo, Duolingo’s mascot, flying around. Image Credits: Duolingo
Tyler Murphy, who had graduated from his intern position at Duolingo after launching the company’s iOS app, noticed that the gaming world was rapidly innovating around him in the mid-2010s. Angry Birds was no longer the only popular game on mobile, and video games generally were getting more engaging, with in-app currencies, progress bars and an experience that felt creatively addictive. He suddenly saw connections between the entertainment that games provided and the patient learning required for languages.
“Wouldn’t it be cool if the skill got harder and harder, kind of like how a character in a game gets more powerful and powerful?” he remembers asking. Duolingo had taken early inspiration from Angry Birds as well as Clash Royale later, following that game’s launch in 2016. “Half the people at Duolingo were playing Clash Royale, at some point,” he said. “And I think that shaped our product roadmap a lot and our design language a lot.”
Games solved a problem that was acutely personal for Murphy. The employee, who would go on to become chief designer at Duolingo, had gone to college to teach Spanish to students, but ultimately left the field after struggling to inspire kids in a classroom setting. The realization that Duolingo could borrow from gaming instead of monotonous edtech companies gave an adrenaline rush — and permission — to the team to experiment with new approaches to learning.
Every game needs some form of experience points and leveling up, and for Duolingo learners, that progress comes in the form of skill trees.
These trees, which were conceived by a design agency during the company’s early development, are Duolingo’s core experience, a visual representation of language skills that are interconnected and get progressively more difficult and refined over time. Each skill is a prerequisite for another. Sometimes it’s just logic: in order to be able to speak about restaurants, you probably should be able to introduce yourself first. Sometimes, however, it’s a necessary building block: in order to speak about your routine, you should be able to speak about basic everyday activities.
In Duolingo, each unit has its own suite of skills, each of which is broken down into five lessons. Once you complete all five lessons, you can move to the next skill. Complete all skills and you can move to the next unit. Depending on the language, a user might encounter an average of 60 skills across nine different units within a course.
Duolingo Skill Tree UX in 2012. Image Credits: Duolingo
Duolingo Skill Tree UX in 2021. Image Credits: Duolingo
Duolingo had its “leveling up” model figured out, but now it had to integrate gamification into every nook and cranny of its app. One of its first challenges was rebuilding the sort of teacher-student emotional bond that can help students stay motivated to learn. No one likes to fail, and Duolingo stumbled upon a scalable approach through its cartoon owl mascot Duo — also thought of by the design agency behind the skill trees.
Whenever users succeed or fail at their lessons today, they are likely to be encouraged or admonished by Duo’s presence. Designers sprinkled Duo throughout the product, looking at Super Mario Brothers as an example of how to use iconic art to create a friendly gaming experience. In early iterations of the app, Duo was present but static, more of an icon than a personality. That changed as the company pushed harder on engagement.
As its meandering route to monetization will demonstrate, Duolingo isn’t mission-oriented, it’s mission-obsessed.
Co-founders Luis von Ahn and Severin Hacker never wanted to charge consumers for access to Duolingo content, a purpose imbued throughout the company’s culture. For years, in order to work at Duolingo, you had to be comfortable with joining a company in Pittsburgh that was in no rush to make money. The startup, filled with education enthusiasts and mission-driven employees, had “very college pizza vibes,” as Gina Gotthilf, former VP of marketing at Duolingo, described it. Everyone was against making money and having structure — some employees even threatened to quit if Duolingo ever charged a cent to users.
“One thing that recruited me was this brilliance that we can kill two birds with one stone,” she said, referring to Duolingo’s original translation-service business model we talked about in part one of this EC-1. “It was obviously tied to Luis’ thinking and reCAPTCHA and it was magical and brilliant.”
Free may not have paid the bills, but it did come with a valuable upside: growth. By 2017, Duolingo would boast having 200 million users, which was double von Ahn’s goal when he first launched to the public on the TechCrunch Disrupt stage.
Duolingo launched saying it would never do advertisements, subscriptions or in-app purchases — approaches that now all exist on the platform. Today, Duolingo has a simple freemium business model that is remarkably conventional. It has a free version with all of its learning content, and it charges a subscription of $6.99 per month for paywalled features such as unlimited hearts, no advertisements and progress tracking. It also has a number of other revenue streams in development, such as language proficiency tests.
As we’ll explore, Duolingo’s route from anti-business rebel to conventional consumer subscription is complex, full of twists and turns. While Duolingo never wanted to look like other edtech companies, as we saw with its product strategy in part two, it turns out that evolving from college pizza vibes meant that it would have to take a page from its peers.
Duocon, Duolingo’s new conference to celebrate education and language. Image Credits: Duolingo
“They had users and in Silicon Valley, there was this notion that if you have users, you can turn anything into money,” said Bing Gordon, the Kleiner Perkins Caufield & Byers (KPCB) partner who led Duolingo’s $20 million Series C in 2014.
“This was not very controversial back then, at least with investors,” von Ahn said. “This became controversial for us once we raised a ton of money, and we still weren’t making more money.”
While the company’s investors were relatively lenient in the early years, patience was starting to run thin. In June 2015, Duolingo raised a $45 million Series D round led by Laela Sturdy of Google Capital (later rebranded CapitalG), valuing the company at $470 million. She invested because of Duolingo’s growth and engagement numbers, but confronted von Ahn with some direct advice.
“She said to me, ‘Look, it worked for you to continue getting bigger and bigger checks from venture capital,’” von Ahn said. “‘But this is the last time it works for you … if you’re trying to con people, you cannot con anybody bigger than us [at Google].’” Duolingo’s valuation wouldn’t just be at stake next time it went fundraising on Sand Hill Road — its very survival would be as well.
Looking back, Sturdy said that she always “had confidence that they would come up with a revenue model” because of Duolingo’s passionate and organic users.
When a startup chooses to raise venture capital, it sets itself on a heavily prescribed course. Suddenly, success isn’t defined merely as cash-flow breakeven with a long-term sustainable business. It has to be an exit of some sort, and a big one at that. While Duolingo used venture as a lifeline to fund its product development, venture also came with pressure to become a billion-dollar company, or more. And that meant making revenue, not just growing engagement.
Von Ahn says his conversation with Sturdy is what really changed his mindset about money. After the Google check hit Duolingo’s bank account, he and Hacker began thinking about ways to make Duolingo as much a monetary success as it had been an educational one.
Duolingo’s Pittsburgh HQ. Image Credits: Duolingo
“It was clear that Luis didn’t have commercial instincts, he had cultural instincts and a deep focus on learning,” said Gordon. “[When we invested] Duolingo predicted it was on the verge of revenue growth, and it turned out it was not on the verge of revenue growth.”
What Gordon is alluding to was a litany of monetization attempts in Duolingo’s past. Translation, which helped von Ahn’s previous two startups, didn’t work when applied to language-learning services, and the company only secured two customers before ending the service. Business partnerships, such as a relationship with Uber to certify and train drivers in Brazil to speak English, didn’t catch fire.
Duolingo has been wildly successful. It has pulled in 500 million total registered learners, 40 million active users, 1.5 million premium subscribers and $190 million in booked revenues in 2020. It has a popular and meme-ified mascot in the form of the owl Duo, a creative and engaging product, and ambitious plans for expansion. There’s just one key question in the midst of all those milestones: Does anyone actually learn a language using Duolingo?
“Language is first and foremost a social, relational phenomenon,” said Sébastien Dubreil, a teaching professor at Carnegie Mellon University. “It is something that allows people to make meaning and talk to each other and conduct the business of living — and when you do this, you use a ton of different kinds of resources that are not packaged in the vocabulary and grammar.”
Duolingo CEO and co-founder Luis von Ahn estimates that Duolingo’s upcoming product developments will get users from zero to a knowledge job in a different language within the next two to three years. But for now, he is honest about the limits of the platform today.
“I won’t say that with Duolingo, you can start from zero and make your English as good as mine,” he said. “That’s not true. But that’s also not true with learning a language in a university, that’s not true with buying books, that’s not true with any other app.”
Luis von Ahn, the co-founder of Duolingo, visiting President Obama in 2015. Image Credits: Duolingo
While Dubreil doesn’t think Duolingo can teach someone to speak a language, he does think it has taught consistency — a hard nut to crack in edtech. “What Duolingo does is to potentially entice students to do things you cannot pay them enough time to actually do, which is to spend time in that textbook and reinforce vocabulary and the grammar,” he said.
That’s been the key focus for the company since the beginning. “I said this when we started Duolingo and I still really strongly believe it: The hardest thing about learning a language is staying motivated,” von Ahn said, comparing it to how people approach exercise: it’s hard to stay motivated, but a little motion a day goes a long way.
With an enviable lead in its category, Duolingo wants to bring the quality and effectiveness of its curriculum on par with the quality of its product and branding. With growth and monetization secured, Duolingo is no longer in survival mode. Instead, it’s in study mode.
In this final part, we will explore how Duolingo is using a variety of strategies, from rewriting its courses to what it dubs Operation Birdbrain, to become a more effective learning tool, all while balancing the need to keep the growth and monetization engines stoked while en route to an IPO.
Duolingo’s office decor. Image Credits: Duolingo
Duolingo’s competitors see the app’s massive gamification and solitary experience as fundamentally at odds with high-quality language education. Busuu and Babbel, two subscription-based competitors in the market, both focus on users talking in real time to native speakers.
Bernhard Niesner, the co-founder and CEO of Busuu, which was founded in 2008, sees Duolingo as an entry-level tool that can help users migrate to its human-interactive service. “If you want to be fluent, Duolingo needs innovation,” Niesner said. “And that’s where we come in: We all believe that you should not be learning a language just by yourself, but [ … ] together, which is our vision.” Busuu has more than 90 million users worldwide.
Duolingo has been the subject of a number of efficacy studies over the years. One of its most positive reports, from September 2020, showed that its Spanish and French courses teach the equivalent of four U.S. university semesters in half the time.
Babbel, which has sold over 10 million subscriptions to its language-learning service, cast doubt on the power of these findings. Christian Hillemeyer, who heads PR for the startup, pointed out that Duolingo only tested for reading and writing efficacy — not for speaking proficiency, even though that is a key part of language learning. He described Duolingo as “just a funny game that is maybe not as bad as Candy Crush.”
One of the ironic legacies of Duolingo’s evolution is that for years it outsourced much of the creation of its education curriculum to volunteers. It’s a legacy the company is still trying to rectify.
The year after its public launch, Duolingo introduced the Language Incubator in 2013. Similar to its original translation service, the company wanted to leverage crowdsourcing to invent and refine new language courses. Volunteers — at least at first — were seen as a warm-but-scrappy way to bring new material to the growing Duolingo community, and more than 1,000 volunteers have helped bring new language courses to the app.
Clubhouse, the voice-based networking app that’s now being knocked off by every major tech platform, is bringing its service to Android. The company announced during its weekly Townhall event that its Android version has entered beta testing with a handful of non-employees who will provide the company with early feedback ahead of a public launch.
In its release notes, Clubhouse referred to this test as involving a “rough beta version” that’s in the process of being rolled out to a group of “friendly testers.” That means there’s not a way for the broader public to sign up for the Android app just yet.
The lack of an Android client combined with its invite system initially gave Clubhouse an aura of exclusivity. You had to know someone to get in, and then you would need an iOS device to participate. But the delay in providing access to Android users also gave larger competitors time to catch up with Clubhouse and court the users being left behind. One of the largest of those rivals, Facebook, recently challenged Clubhouse across all of its platforms and services.
Facebook announced a full audio strategy that included a range of new products, from short-form audio snippets to a direct Clubhouse clone that works across Facebook and Messenger. It also announced a way for Instagram Live users to turn off their video and mute their mics, similar to Clubhouse. Even Facebook’s R&D division tested a Clubhouse alternative, Hotline, which offers a sort of mashup between the popular audio app and Instagram Live, with more of a Q&A focus.
Meanwhile, Twitter is continuing to expand its audio rooms feature, Twitter Spaces, and Clubhouse alternatives from Reddit, LinkedIn, Spotify, Discord, Telegram and others are in the works, too.
For Clubhouse, that means the time has come to push for growth — especially as there are already some signs its initial hype is wearing off. According to app store intelligence firm Apptopia, Clubhouse has seen an estimated 13.5 million downloads on iOS to date, but the number of daily downloads has been falling, mirroring a decline in the number of daily active users.
Image Credits: Apptopia
Apptopia’s data shows that Clubhouse’s daily active users are down 68% from a high in February 2021, though that doesn’t necessarily mean that Clubhouse is over — it’s just becoming less of a daily habit. However, if the company is able to build out its creator community and establish a number of popular shows, which it’s aiming to do via its accelerator, it could still see users tuning in on a weekly and monthly basis. And those sessions would be longer in comparison with some other social apps, as Clubhouse users often tune into shows that run over an hour — even leaving the app open as they do other tasks.
Plus, Clubhouse is taking aim at the challenges around re-engaging people whose usage may have dwindled in recent days. Also during its Townhall, the company announced it would introduce a bell icon for events that will allow users to be notified about the events they’ve RSVP’d to. This will be important for creators who are advertising their events, as well.
Clubhouse didn’t give a specific timeframe as to when its Android app would reach more testers or the wider public, only noting that it’s looking forward to welcoming more Android users in the “coming weeks.” In March, Clubhouse had said the Android launch would take a couple of months.
In the real world — the world on which the global economy runs — we don’t expose every aspect of our financial activity in public. We want to be able to select which parties can access our financial data and under what circumstances — for example, our credit history or bank transactions. The problem with the blockchain world is that this financial privacy doesn’t really exist. This has led to pretty bad abuses, such as the practice of “front-running,” where a nefarious person can take advantage of you immediately after seeing your transaction on a public blockchain. What’s required is a real infrastructure improvement to this problem, for, without it, the crypto “Shangri-La” of “DeFi” (decentralized finance) will never have a hope of taking off.
It’s therefore significant that some well-known organizations in the realm of blockchain financing are investing the equivalent of $11.5 million into SCRT, the native coin of the Secret Network blockchain. The investment was led by Arrington Capital and Blocktower Capital and also includes Spartan Group and Skynet Trading.
Tor Bair, founder of the Secret Foundation, said: “The addition of these valuable and experienced partners to the Secret ecosystem marks a significant inflection point for Secret Network as we concentrate on expanding and supporting our fast-growing application layer.”
Secret, which used to be called Enigma before a pivot, claims to have been the first privacy-first smart contract platform. (The first version of this blockchain was called the “Enigma Mainnet,” but this branding was changed to Secret Network via a governance vote in summer 2020).
So far in 2021, the Secret Network ecosystem has launched several native applications, including SecretSwap, a “front-running resistant,” cross-chain automated market maker (AMM) with privacy protections. It is also developing Secret NFTs.
So why is this at all significant? Why should we care? It’s simply because, without privacy, DeFi is highly unlikely to go mainstream.
Without privacy in transactions, the traditional economic system won’t bother taking any notice of crypto and blockchain, outside of noting whether the price of bitcoin goes up or down.
Admittedly, Secret is not the only player tackling this area. It is playing in the same arena as blockchain projects such as Phala (not yet on mainnet, and built on Polkadot), Oasis and Aleo, which recently raised funding from Andreessen Horowitz.
What these projects all have in common is this race toward what’s known as the Web3 “application privacy” space. Once again, they are trying to reproduce the kind of financial privacy that we have all come to expect from the traditional financial system, but which remains elusive in the blockchain world.
However, this approach should not be confused with privacy coins like Monero and Zcash. These are coins, and therefore not the same at all as the above-named projects, which are aiming at what’s known as “programmable privacy.”
Bair told me: “Transactional privacy [via privacy coins] just means hiding simple aspects of transactions from other parties — a narrow form of privacy. Smart contract privacy — what we call ‘programmable privacy’ — is a much more powerful idea, allowing developers to build complex decentralized and permissionless applications that also protect data privacy, with big consequences for Web3 security and usability. As an analogy — imagine trying to build a decentralized Facebook. Normal blockchains expose all data by default, a much worse outcome for user privacy and security. Only smart contract privacy allows you to build these types of complex applications without compromising the user experience and threatening their safety.”
Front-running is often described as getting one’s own transaction executed ahead of a known pending transaction. Bair claims Secret protects against this at the protocol level because all interactions with smart contracts are encrypted and cannot be viewed even by the network validators, “so all DeFi applications built on Secret Network are front-running resistant by default,” he told me.
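To see why transaction visibility matters, here is a minimal sketch of the attack being described — a front-run on a public constant-product AMM (the x*y=k model SecretSwap's category of exchange uses). This is illustrative code only, not Secret Network or SecretSwap code; the pool sizes, trade amounts and function names are invented:

```python
def swap(pool, amount_in, token_in=0):
    """Constant-product swap (x*y = k): pay amount_in of one reserve,
    receive the other token. Mutates pool in place, returns amount out."""
    i, j = token_in, 1 - token_in
    k = pool[0] * pool[1]
    pool[i] += amount_in
    amount_out = pool[j] - k / pool[i]   # keep the product constant
    pool[j] -= amount_out
    return amount_out

pool = [1000.0, 1000.0]                      # reserves of tokens X and Y

# On a public mempool, the attacker sees the victim's pending 100 X buy
# and places their own buy first, then sells back after the victim.
got = swap(pool, 100.0, token_in=0)          # attacker front-runs the buy
victim_got = swap(pool, 100.0, token_in=0)   # victim trades at a worse price
back = swap(pool, got, token_in=1)           # attacker unwinds after the victim

print(round(back - 100.0, 2))                # attacker profit in X → 18.03
```

The attacker’s profit comes entirely out of the victim’s slippage. On a chain where pending smart contract interactions are encrypted, the attacker never sees the victim’s trade coming, so there is nothing to front-run — which is the protocol-level protection Bair is describing.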
That said, Secret will still have to compete with the myriad privacy projects already on, for instance, Ethereum, such as Automata. The Secret Network is a standalone blockchain and would still require a developer community to succeed, whereas Ethereum and Polkadot technically have a head start of sorts. But those blockchains are public by default, so Secret’s hyperfocus on privacy may yet make it a major player in this realm.
Bair commented: “Only programmable privacy can give users and developers this level of control in the DeFi world. Without programmable privacy, DeFi will never achieve mass adoption outside of purely speculative activity. Secret Network intends to become the foundation for new types of DeFi applications that better protect users while also allowing traditional institutions to participate securely, with protections for sensitive data. Also, blockchains don’t need thousands of developers to succeed in the short term — they need the right developers who build the early critical applications.”
Furthermore, Secret has in its favor the fact that due to the whole nature of decentralization of the blockchain, the space isn’t nearly as much a “winner-take-all” environment as the general internet has become due to the growth of the large Big Tech platforms — that would be counter to the point of decentralization. As Bair told me: “Secret’s vision is to become a data privacy hub for all public blockchains, collaborating more than competing with networks like Ethereum (to which we already have a live bridge with over $100 million in assets locked).”
Secret Network claims it was one of the first blockchains to feature privacy-preserving smart contracts, which it launched on mainnet in September 2020. It says this means all decentralized applications built on Secret Network have data privacy by default. The Secret Network blockchain itself is based on Cosmos SDK/Tendermint, giving it its own independent consensus, on-chain governance, and features like slashing and delegation. It is secured by the native coin Secret (SCRT), which must be staked by network validators and is used for transaction and computation fees as well as governance, said the foundation.
Commenting on the investment, Michael Arrington, founder of Arrington Capital, said: “Secret is the first blockchain ecosystem to prioritize privacy. Financial privacy is critical to individual freedom, and Arrington Capital has long been committed to financial privacy and censorship resistance. The rapid expansion of decentralized finance makes solutions like Secret Network a timely addition to the DeFi ecosystem.” (Arrington Capital was established by Arrington, who previously founded TechCrunch but no longer has any involvement in the publication.)
Jamie Burke, founder of Outlier Ventures in the U.K. and a Secret backer, told me: “Privacy will be essential to the security and adoption of Web3, from DeFi to NFTs and beyond. Secret Network brings new and unique privacy functionality to the space, and as a result we believe it will be foundational to the next generation of successful Web3 applications.”
Secret is also getting support from DeFi players such as the Sienna Network. Monty Munford, chief evangelist and core contributor to the privacy DeFi company, told me: “Of all the networks in all the world, we chose Secret because it was a yes-yes-yes brainer. They understand privacy and we understand DeFi; it’s a match made in heaven.”
Zoomo, the Australian startup with a mission to electrify delivery fleets through e-bike subscriptions, announced a $12 million interim capital raise on Monday.
The company made a name for itself through partnerships with UberEats and DoorDash to help delivery workers access e-bikes through weekly subscriptions at discounted rates. Zoomo then grew to offer monthly subscriptions to corporate partners in Australia, the U.S. and London for last mile delivery, with a fleet that has expanded beyond 10,000 units globally.
Now, the startup hopes to expand into continental Europe and additional U.S. states; it currently operates in New York City, San Francisco, Los Angeles and Philadelphia. Zoomo also wants to build up its consumer model, which mainly serves couriers but is extending to commuters, and will invest in developing its next generation of vehicles.
“We initially built our products to service the demands of gig workers in the food delivery industry,” Mina Nada, Zoomo CEO and co-founder said in a statement. “Their expectations for quality commercial vehicles, on demand service, flexible financing and tech enabled security features spurred us to innovate. We’re now seeing enterprises and fleet managers benefiting from the platform we have built. Enterprise fleet managers looking for clean and efficient vehicles are choosing us.”
Zoomo’s focus on e-bikes for food delivery makes it unique in the electric bike rental space. Its business model offers a full-stack e-bike, from the hardware and software to same-day servicing and financing options, which especially helps big business partners deploy and manage large fleets of vehicles at scale. It’s a tall order, and Zoomo’s strategy could be setting a new trend in micromobility: the one-stop shop that promises quick scalability.
German mobility software provider Wunder Mobility recently announced its efforts to offer a souped-up e-moped, co-designed with Chinese consumer manufacturer Yadea, for the dockless sharing market. It also launched a new subsidiary to finance the vehicles, along with its software, for shared micromobility providers. Wunder Mobility plans to offer e-scooters and e-bikes for financing in the future, but it doesn’t design its own vehicles or sell them outright. While the business models and target customers don’t perfectly align, the blueprint is the same: Corner a market, provide top-quality hardware and software and make it as accessible as possible.
Coronavirus spurred demand for delivery across all industries, and companies like FluidTruck and Rivian are stepping up to the plate to meet the needs of eco-conscious e-commerce giants with their electric delivery vans. The online food delivery industry is no different, with a market expected to reach $192.16 billion in 2025 at a compound annual growth rate of 11%. But for delivery within cities, e-bikes offer a smarter solution for meeting climate change goals while dodging traffic congestion.
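As a quick sanity check on that forecast (my own back-of-the-envelope arithmetic, not a figure from the cited report), an 11% compound annual growth rate ending at $192.16 billion in 2025 implies a market of roughly $114 billion five years earlier, since future value = present value × (1 + rate)^years:

```python
# Implied 2020 market size from the 2025 forecast and the stated CAGR.
# Illustrative arithmetic only; only the 2025 figure and rate are sourced.
future, rate, years = 192.16, 0.11, 5
present = future / (1 + rate) ** years
print(round(present, 1))  # ≈ 114.0 (billions of dollars)
```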
Zoomo’s custom designed bikes can bear more than 200 kilograms of load via various cargo options, according to a Zoomo spokesperson. For enterprise customers, like health food company Cornucopia, e-cargo delivery vehicles like a Trailer Trike or a Covered Trike are used to deliver goods sustainably. Gorillas, an on-demand grocery delivery company, and Just Eat Takeaway, acquirer of Grubhub and Seamless, are also clients of Zoomo’s.
“At Just Eat Takeaway.com, we want to build a sustainable future for food delivery, and are committed to doing our bit to help keep carbon emissions to a minimum, as well as providing an efficient customer experience from order to delivery,” said a Just Eat Takeaway spokesperson in a statement. “E-Vehicles are an integral part of the Scoober model and we are pleased to work in cooperation with Zoomo.”
Zoomo’s newest funding round, led by Australian VC AirTree, follows an $11 million Series A raised in August 2020, with support in both rounds from the Clean Energy Finance Corporation, Maniv Mobility and Contrarian Ventures. Winthrop Square and Wisdom VC, mobility and clean tech-focused investors, also joined this round.
The only thing happening faster than tech mobility right now is the speed at which our early bird price for TC Sessions: Mobility 2021 will disappear. You have just four days left to save $100 on our June 9 deep-dive exploration into the current and future state of mobility and transportation.
Buy your pass to TC Sessions Mobility 2021 before the price increase goes into effect on Thursday, May 6 at 11:59 pm (PT).
The event is virtual (thanks a bunch, you nasty pandemic), but the information you’ll glean and the worldwide connections you’ll make are very real. The global nature of the opportunities on offer is a big democratizing benefit of going virtual.
The day comes jam-packed with presentations, one-on-one interviews and breakout sessions with the top founders, investors and makers in mobility. Don’t stress about missing out over a schedule conflict — your pass includes access to video-on-demand once the event ends.
Here’s what Rachael Wilcox, a creative producer at Volvo Cars, told us about her experience at TC Sessions: Mobility 2020.
“I was impressed with the virtual platform. It was easy to navigate and a great format for asking questions. Combining live-stream and VOD options provided schedule flexibility, which let me be more focused and attend more presentations than I could at a live event.”
Let’s take a quick look at just some of the topics and the people who will cover them. Check out the agenda here and plan your strategy.
If you’re into mobility, you can’t afford to miss the information, trends, networking and connection opportunities you’ll find at TC Sessions: Mobility 2021. You also can’t afford to miss out on the early bird price. Buy your $95 pass before Thursday, May 6 at 11:59 pm (PT), and you’ll save $100.
Is your company interested in sponsoring or exhibiting at TC Sessions: Mobility 2021? Contact our sponsorship sales team by filling out this form.
Remote work is no longer a new topic, as much of the world has now been doing it for a year or more because of the COVID-19 pandemic.
Companies — big and small — have had to react in myriad ways. Many of the initial challenges have focused on workflow, productivity and the like. But one aspect of the whole remote work shift that is not getting as much attention is the culture angle.
One 100% remote startup was tackling the issue well before COVID-19 arrived, and it is now seeing a big surge in demand for its offering, which aims to help companies address the “people” challenge of remote work. It started life under the name Icebreaker, a nod to its aim of “breaking the ice” with the people you work with.
“We designed the initial version of our product as a way to connect people who’d never met, kind of virtual speed dating,” says co-founder and CEO Perry Rosenstein. “But we realized that people were using it for far more than that.”
So over time, its offering has evolved to include a bigger goal of helping people get together beyond an initial encounter — hence its new name: Gatheround.
“For remote companies, a big challenge or problem that is now bordering on a crisis is how to build connection, trust and empathy between people that aren’t sharing a physical space,” says co-founder and COO Lisa Conn. “There’s no five-minute conversations after meetings, no shared meals, no cafeterias — this is where connection organically builds.”
Organizations should be concerned, Gatheround maintains, that as work goes more remote, it will become more transactional and people will become more isolated. They can’t ignore that humans are largely social creatures, Conn said.
The startup aims to bring people together online through real-time events such as a range of chats, videos and one-on-one and group conversations. The startup also provides templates to facilitate cultural rituals and learning & development (L&D) activities, such as all-hands meetings and workshops on diversity, equity and inclusion.
Gatheround’s video conversations aim to be a refreshing complement to Slack conversations, which despite serving the function of communication, still don’t bring users face-to-face.
Image Credits: Gatheround
Since its inception, Gatheround has quietly built up an impressive customer base, including 28 Fortune 500s, 11 of the 15 biggest U.S. tech companies, 26 of the top 30 universities and more than 700 educational institutions. Specifically, those users include Asana, Coinbase, Fiverr, Westfield and DigitalOcean. Universities, academic centers and nonprofits, including Georgetown’s Institute of Politics and Public Service and the Chan Zuckerberg Initiative, are also customers. To date, Gatheround has had about 260,000 users hold 570,000 conversations on its SaaS-based video platform.
All its growth so far has been organic, mostly referrals and word of mouth. Now, armed with $3.5 million in seed funding that builds upon a previous $500,000 raised, Gatheround is ready to aggressively go to market and build upon the momentum it’s seeing.
Venture firms Homebrew and Bloomberg Beta co-led the company’s latest raise, which included participation from angel investors such as Stripe COO Claire Hughes Johnson, Meetup co-founder Scott Heiferman, Li Jin and Lenny Rachitsky.
Co-founders Rosenstein, Conn and Alexander McCormmach describe themselves as “experienced community builders,” having previously worked on President Obama’s campaigns as well as at companies like Facebook, Change.org and Hustle.
The trio emphasize that Gatheround is also very different from Zoom and video conferencing apps in that its platform gives people prompts and organized ways to get to know and learn about each other as well as the flexibility to customize events.
“We’re fundamentally a connection platform, here to help organizations connect their people via real-time events that are not just really fun, but meaningful,” Conn said.
Homebrew Partner Hunter Walk says his firm was attracted to the company’s founder-market fit.
“They’re a really interesting combination of founders with all this experience community building on the political activism side, combined with really great product, design and operational skills,” he told TechCrunch. “It was kind of unique that they didn’t come out of an enterprise product background or pure social background.”
He was also drawn to the personalized nature of Gatheround’s platform, considering that it has become clear over the past year that the software powering the future of work “needs emotional intelligence.”
“Many companies in 2020 have focused on making remote work more productive. But what people desire more than ever is a way to deeply and meaningfully connect with their colleagues,” Walk said. “Gatheround does that better than any platform out there. I’ve never seen people come together virtually like they do on Gatheround, asking questions, sharing stories and learning as a group.”
James Cham, partner at Bloomberg Beta, agrees with Walk that the founding team’s knowledge of behavioral psychology, group dynamics and community building gives them an edge.
“More than anything, though, they care about helping the world unite and feel connected, and have spent their entire careers building organizations to make that happen,” he said in a written statement. “So it was a no-brainer to back Gatheround, and I can’t wait to see the impact they have on society.”
The 14-person team will likely expand with the new capital, which will also go toward adding more functionality and detail to the Gatheround product.
“Even before the pandemic, remote work was accelerating faster than other forms of work,” Conn said. “Now that’s intensified even more.”
Gatheround is not the only company attempting to tackle this space. Ireland-based Workvivo last year raised $16 million and earlier this year, Microsoft launched Viva, its new “employee experience platform.”
Hangry, an Indonesian cloud kitchen startup that wants to become a global food and beverage company, has raised a $13 million Series A. The round was led by returning investor Alpha JWC Ventures, and included participation from Atlas Pacific Capital, Salt Ventures and Heyokha Brothers. It will be used to increase the number of Hangry’s outlets in Indonesia, including launching its first dine-in restaurants, over the next two years before it enters other countries.
Along with a previous round of $3 million from Alpha JWC and Sequoia Capital’s Surge program, Hangry’s Series A brings its total funding to $16 million. It currently operates about 40 cloud kitchens in Greater Jakarta and Bandung, 34 of which launched in 2020. Hangry plans to expand its total outlets to more than 120 this year, including dine-in restaurants.
Founded in 2019 by Abraham Viktor, Robin Tan and Andreas Resha, Hangry is part of Indonesia’s burgeoning cloud kitchen industry. Tech giants Grab and Gojek both operate networks of cloud kitchens that are integrated with their food delivery services, while other startups in the space include Everplate and Yummy.
One of the main ways Hangry sets itself apart is by focusing on its own brands, instead of providing kitchen facilities and services to restaurants and other third-party clients. Hangry currently has four brands, including Indonesian chicken dishes (Ayam Koplo) and Japanese food (San Gyu), that cost about 15,000 to 70,000 IDR per portion (or about $1 to $6 USD). Its food can be ordered through Hangry’s own app, plus GrabFood, GoFood and ShopeeFood.
“Given that Hangry has developed an extensive cloud kitchen network across Indonesia, we naturally would have interest from other brands to leverage our networks,” chief executive officer Viktor told TechCrunch. “However, our focus is to grow our brands since our brands are rapidly growing in popularity in Indonesia and require all kitchen resources that they need to realize their full potential.”
Providing food deliveries helped Hangry grow during COVID-19 lockdowns and social distancing, but in order to become a global brand within a decade, it needs to operate in multiple channels, he added.
“We knew that we will one day have to serve customers in all channels, including dine in,” said Viktor. “We started the hard way, doing delivery-first business, where we faced the challenges surrounding making sure our food still tastes good when it reaches customers’ homes. Now we feel ready to serve our customers in our restaurant premises. Our dine-in concept is an expansion of everything we’ve done in delivery channels.”
In a press statement, Alpha JWC Ventures partner Eko Kurniadi said, “In the span of 1.5 years, [Hangry] launched multiple brands across myriad tastes and categories, and almost all of them are amongst the best sellers list with superior ratings in multiple platforms, tangible examples of product-market fit. This is only the beginning and we can already foresee their growth to be a top local F&B brand in the country.”
We’ve been hearing how “data is the new oil” for more than a decade now, and in certain sectors, it’s a maxim that has more than panned out. From marketing and logistics to finance and product, decision-making is now dominated by data at all levels of most big private orgs (and if it isn’t, I’d be getting a résumé put together, stat).
So it might be something of a surprise to learn that data, which could transform how we respond to the increasingly deadly disasters that regularly plague us, has been all but absent from much of emergency response this past decade. Far from being a geyser of digital oil, disaster response agencies and private organizations alike have for years tried to swell the scope and scale of the data flowing into disaster response, with relatively meager results.
That’s starting to change though, mostly thanks to the internet of things (IoT), and frontline crisis managers today increasingly have the data they need to make better decisions across the resilience, response and recovery cycle. The best is yet to come: with drones taking flight, simulated visualizations and artificial intelligence coming online, what we’re seeing today on the frontlines is only the beginning of what could be a revolution in disaster response in the 2020s.
Emergency response is a fight against the fog of war and the dreadful ticking of the clock. In the midst of a wildfire or hurricane, everything can change in a matter of seconds — even milliseconds if you aren’t paying attention. Safe roads ferrying evacuees can suddenly become impassable infernos, evacuation teams can reposition and find themselves spread far too thin, and unforeseen conditions can rapidly metastasize to cover the entire operating environment. An operations center that once had perfect information can quickly find it has no ground truth at all.
Unfortunately, even getting raw data on what’s happening before and during a disaster can be extraordinarily difficult. When we look at the data revolution in business, part of the early success stems from the fact that companies were always heavily reliant on data to handle their activities. Digitalization was and is the key word: moving from paper to computers in order to transform latent raw data into a form that was machine-readable and therefore analyzable. In business, the last ten years were basically an upgrade from version one to version two.
In emergency management however, many agencies are stuck without a version at all. Take a flood — where is the water and where is it going? Up until recently, there was no comprehensive data on where waters rose from and where they sloshed to. When it came to wildfires, there were no administrative datasets on where every tree in the world was located and how prone each is to fire. Even human infrastructure like power lines and cell towers often had little interface with the digital world. They stood there, and if you couldn’t see them, they couldn’t see you.
Flood modeling is on the cutting edge of disaster planning and response. Image Credits: CHANDAN KHANNA/AFP via Getty Images
Models, simulations, predictions, analysis: all of these are useless without raw data, and in the disaster response realm, there was no detailed data to be found.
After years of promising an Internet of Things (IoT) revolution, things are finally internet-izing, with IoT sensors increasingly larding up the American and world landscape. Temperature, atmospheric pressure, water levels, humidity, pollution, power, and other sensors have been widely deployed, emitting constant streams of data back into data warehouses ready for analysis.
Take wildfires in the American West. It wasn’t all that long ago that the U.S. federal government and state firefighting agencies had no knowledge of where a blaze was taking place. Firefighting has been “100 years of tradition unimpeded by progress,” as Tom Harbour, head of fire response for a decade at the U.S. Forest Service and now chief fire officer at Cornea, put it.
And he’s right. After all, firefighting is a visceral activity — responders can see the fires, even feel the burning heat radiating off their flesh. Data wasn’t useful, particularly in the West, where there are millions of acres of land and large swaths are sparsely populated. Massive conflagrations could be detected by satellites, but smoldering fires in the brush would be entirely invisible to the geospatial authorities. There’s smoke over California — exactly what is a firefighter on the ground supposed to do with such valuable information?
Today, after a decade of speculative promise, IoT sensors are starting to clear a huge part of this fog. Aaron Clark-Ginsberg, a social scientist at RAND Corporation who researches community resilience, said that air quality sensors have become ubiquitous since they are “very cheap [and] pretty easy to use” and can offer very fine-grained understandings of pollution — a key signal, for instance, of wildfires. He pointed to the company Purple Air, which in addition to making sensors also produces a popular consumer map of air quality, as indicative of the technology’s potential today.
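As a rough illustration of how cheap air quality sensors become a wildfire signal, here is a minimal sketch that flags a sudden particulate spike over a rolling baseline. The function name, window, ratio and floor thresholds are invented for illustration; they are not Purple Air’s actual methodology.

```python
from collections import deque

def pm25_spike_alerts(readings, window=12, ratio=3.0, floor=35.0):
    """readings: iterable of PM2.5 values (micrograms/m^3), e.g. 5-minute samples.
    Yields the index of any reading that jumps to at least `ratio` times the
    rolling mean of the last `window` samples AND exceeds an absolute `floor`
    -- a crude proxy for nearby smoke."""
    recent = deque(maxlen=window)
    for i, value in enumerate(readings):
        if recent:
            baseline = sum(recent) / len(recent)
            if value >= floor and value >= ratio * baseline:
                yield i
        recent.append(value)

# A dozen clean-air samples followed by a sharp spike triggers one alert:
alerts = list(pm25_spike_alerts([8] * 12 + [80]))  # -> [12]
```

The rolling baseline matters because absolute thresholds alone would misfire in areas with chronically poor air; comparing against recent local history is what makes a reading look anomalous.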
Maps are the critical intersection for data in disasters. Geospatial information systems (GIS) form the basis for most planning and response teams, and no company has a larger footprint in the sector than privately held Esri. Ryan Lanclos, who leads public safety solutions at the company, pointed to the huge expansion of water sensors as radically changing responses to certain disasters. “Flood sensors are always pulsing,” he said, and with a “national water model coming out of the federal government,” researchers can now predict through GIS analysis how a flood will affect different communities with a precision unheard of previously.
Digital maps and GIS systems are increasingly vital for disaster planning and response, but paper still remains quite ubiquitous. Image Credits: Paul Kitagaki Jr.-Pool/Getty Images
Cory Davis, the director of public safety strategy and crisis response at Verizon (which, through our parent company Verizon Media, is TechCrunch’s ultimate owner), said that all of these sensors have transformed how crews work to maintain infrastructure as well. “Think like a utility that is able to put a sensor on a power line — now they have sensors and get out there quicker, resolve it, and get the power back up.”
He noted one major development that has transformed sensors in this space the last few years: battery life. Thanks to continuous improvements in ultra-low-power wireless chips as well as better batteries and energy management systems, sensors can last a really long time in the wilderness without the need for maintenance. “Now we have devices that have ten-year battery lives,” he said. That’s critical, because it can be impossible to connect these sensors to the power grid in frontier areas.
The same line of thinking holds true at T-Mobile as well. When it comes to preventative planning, Jay Naillon, senior director of national technology service operations strategy at the telco, said that “the type of data that is becoming more and more valuable for us is the storm surge data — it can make it easier to know we have the right assets in place.” That data comes from flood sensors that can offer real-time warning signals to planners across the country.
Telecom interest — and commercial interest in general — has been critical to accelerating the adoption of sensors and other data streams around disasters. While governments may be the logical end user of flood or wildfire data, they aren’t the only ones interested in this visibility. “A lot of consumers of that information are in the private sector,” said Jonathan Sury, project director at the National Center for Disaster Preparedness at the Earth Institute at Columbia University. “These new types of risks, like climate change, are going to affect their bottom lines,” and he pointed to bond ratings, insurance underwriting and other areas where commercial interest in sensor data has been profound.
Sensors may not literally be ubiquitous, but they have opened a window into conditions that emergency managers never had visibility into before.
Finally, there are the extensive datasets around mobile usage that have become available throughout much of the world. Facebook’s Data for Good project, for instance, provides data layers around connectivity — are users connecting from one place and then later connecting from a different location, indicating displacement? That sort of data from the company and from telcos themselves can help emergency planners scout out how populations are shifting in real time.
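A hedged sketch of the displacement signal described above, assuming a simplified feed of (user, day, location) connection records. The function names, schema and 50 km threshold are invented for illustration and do not reflect Data for Good’s actual pipeline.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def displaced_users(pings, event_day, min_km=50):
    """pings: iterable of (user_id, day, (lat, lon)) connection records.
    A user counts as displaced if their earliest post-event connection is
    at least min_km from their latest pre-event connection."""
    before, after = {}, {}
    for user, day, loc in sorted(pings, key=lambda p: p[1]):
        if day < event_day:
            before[user] = loc          # keep the latest pre-event location
        elif user not in after:
            after[user] = loc           # keep the earliest post-event location
    return {u for u in before.keys() & after.keys()
            if haversine_km(before[u], after[u]) >= min_km}
```

In practice such signals are aggregated and anonymized before release; the point here is only the shape of the comparison: a per-user before/after location pair against a distance threshold.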
Rivulets of data have now turned into floods of information, but just like floodwaters rising in cities across the world, the data deluge now needs a response all its own. In business, the surfeit of big data has been wrangled with an IT stack from data warehouses all the way to business intelligence tools.
If only data for disasters could be processed so easily. Data relevant for disasters is held by dozens of different organizations spanning the private, public, and non-profit sectors, leading to huge interoperability problems. Even when the data can be harmonized, there are large challenges in summarizing the findings down to an actual decision a frontline responder can use in their work — making AI a tough sale still today, particularly outside of planning. As Davis of Verizon put it, “now that they have this plethora of data, a lot of cities and federal agencies are struggling with how to use it.”
Unfortunately, standardization is a challenge at all scales. Globally, countries mostly lack interoperability, although standards are improving over time. Amir Elichai, the founder and CEO of 911 call-handling platform Carbyne, said that “from a technology standpoint and a standards standpoint, there is a big difference between countries,” noting that protocols from one country often have to be completely rewritten to serve a different market.
Tom Cotter, director of emergency response and preparedness at health care disaster response organization Project HOPE, said that even setting up communications between responders can be challenging in an international environment. “Some countries allow certain platforms but not others, and it is constantly changing,” he said. “I basically have every single technology communication platform you can possibly have in one place.”
One senior federal emergency management official acknowledged that data portability has become increasingly key in procurement contracts for technology, with the government recognizing the need to buy commercially-available software rather than custom-designed software. That message has been picked up by companies like Esri, with Lanclos stating that “part of our core mission is to be open and … create data and to share that openly to the public or securely through open standards.”
For all its downsides though, the lack of interoperability can be ironically helpful for innovation. Elichai said that the “lack of standards is an advantage — you are not buying into a legacy standard,” and in some contexts where standards are lacking, quality protocols can be built with the assumption of a modern data workflow.
Even with interoperability though, the next challenge becomes data sanitation — and disaster data is dirty as … well, something. While sensor streams can be verified and cross-checked with other datasets, in recent years there has been a heavy increase in the quantity of citizen-submitted information that has to be carefully vetted before it is disseminated to first responders or the public.
With citizens having more access to smartphones than ever, emergency planners have to sanitize uploaded data in order to verify it and make it useful. Image Credits: TONY KARUMBA/AFP via Getty Images
Bailey Farren, CEO and co-founder of disaster communications platform Perimeter, said that “sometimes citizens have the most accurate and real-time information, before first responders show up — we want citizens to share that with …government officials.” The challenge is how to filter the quality goods from the unhelpful or malicious. Raj Kamachee, the CIO of Team Rubicon, a non-profit which assembles teams of volunteer military veterans to respond to natural disasters, said that verification is critical, and it’s a key element of the infrastructure he has built at the organization since joining in 2017. “We’ve gotten more people using it so more feedback [and] more data [is] coming through the pipes,” he said. “So creating a self-service, a very collaborative approach.”
With quality and quantity, the AI models should come, right? Well, yes and no.
Sury of Columbia wants to cool down at least some of the hype around AI. “The big caveat with all of these machine learning and big data applications is that they are not a panacea — they are able to process a lot of disparate information, [but] they’re certainly not going to tell us exactly what to do,” he said. “First responders are already processing a lot of information,” and they don’t necessarily need more guidance.
Instead, AI in disasters is increasingly focused on planning and resilience. Sury pointed to OneConcern, a resiliency planning platform, as one example of how data and AI can be combined in the disaster planning process. He also pointed to the CDC’s Social Vulnerability Index and risk tools from FEMA that integrate different data signals into scalar values used by emergency planners to optimize their contingency plans.
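To make concrete what integrating signals into a single scalar value can look like, here is a minimal sketch of a weighted composite score. The indicator names and weights are invented for illustration and are not the CDC’s or FEMA’s actual methodology.

```python
def risk_score(signals, weights):
    """signals: dict of indicator -> value normalized to [0, 1].
    weights: dict of indicator -> relative importance.
    Returns a weighted average in [0, 1]; higher means more vulnerable."""
    total = sum(weights.values())
    return sum(signals[k] * w for k, w in weights.items()) / total

# Hypothetical census tract with three normalized indicators:
tract = {"flood_exposure": 0.8, "median_income_inverse": 0.6, "pct_no_vehicle": 0.3}
weights = {"flood_exposure": 2.0, "median_income_inverse": 1.0, "pct_no_vehicle": 1.0}

score = risk_score(tract, weights)  # -> 0.625
```

Real indices are far more involved (the SVI, for instance, ranks tracts against one another rather than averaging raw values), but the end product is the same: many disparate signals collapsed into one number a planner can sort and map.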
Yet, almost everyone I talked to was much more hesitant about the power of AI. As I discussed a bit in part one of this series regarding the disaster sales cycle, data tools have to be real-time and perfect every time given the lives that are on the line. Kamachee of Team Rubicon noted that when choosing tools, he avoids the whiz-bang and instead looks at the pure utility of individual vendors. “We go high tech, but we prepare for low tech,” he said, emphasizing that in disaster response, everything must be agile and adaptable to changing circumstances.
Elichai of Carbyne saw this pattern in his sales. There’s a “sensitivity in our market and the reluctance from time to time to adopt” new technologies, he said, but he acknowledged that “there is no doubt that AI at a certain point will provide benefits.”
Naillon of T-Mobile had similar views from the operator perspective, saying that “I can’t say that we really leverage AI very much” in the company’s disaster planning. Instead of AI as brain, the telecom company simply uses data and forecast modeling to optimally position equipment — no fancy GANs required.
Outside of planning, AI has helped in post-disaster recovery, specifically around damage assessments. After a crisis transpires, assessments of infrastructure and private property have to be made in order for insurance claims to be filed and for a community to move forward. Art delaCruz, COO and president of Team Rubicon, noted that technology and a flourishing of AI tools have helped significantly with damage assessments. Since his organization often helps rebuild communities in the course of its work, triaging damage is a critical element of its effective response strategy.
So AI today is helping a bit with resilience planning and disaster recovery and not so much during emergency response itself, but there is certainly more to come across the entire cycle. Indeed, there is a lot of excitement about the future of drones, which are increasingly being used in the field, but there are concerns long term about whether AI and data will ultimately cause more problems than they solve.
Drones would seem to have an obvious value for disaster response, and indeed, they have been used by teams to get additional aerial footage and context where direct access by responders is limited. Kamachee of Team Rubicon noted that in the Bahamas on a mission, response teams used drones to detect survivors, since major roads were blocked. The drones snapped images that were processed using AI, and helped the team to identify those survivors for evacuation. He described drones and their potential as “sexy; very, very cool.”
Aerial views from drones can give disaster response teams much better real-time information, particularly in areas where on-the-ground access is limited. Image Credits: Mario Tama/Getty Images
Cotter of Project HOPE similarly noted that faster data processing translates to better responses. “Ultimately speed is what saves lives in these disasters,” he said. We’re “also able to manage more responses remotely [and] don’t have to send as many people downrange,” giving response teams more leverage in resource-constrained environments.
“I see more emergency management agencies using drone technology — search and rescue, aerial photography,” Davis of Verizon said, arguing that operators often have a mentality of “send a machine into a situation first.” He continued, arguing, “artificial intelligence is going to continue to get better and better and better [and] enable our first responders to respond more effectively, but also more efficiently and safer.”
With data flooding in from sensors and drones, and processed and verified better than ever, disaster response can improve, perhaps even keeping pace with Mother Nature’s increasingly deadly whims. Yet there is one caveat: will the AI algorithms themselves cause new problems in the future?
Clark-Ginsberg of RAND, perhaps supplying that typical RANDian alternatives analysis, said that these solutions can also create problems themselves: “technological risks leading to disaster and the world of technology facilitating disaster.” These systems can break, they can make mistakes, and more ominously — they can be sabotaged to increase chaos and damage.
Bob Kerrey, a co-chair of the 9/11 Commission, former senator and governor of Nebraska, and currently the board chairman of Risk & Return, a disaster response VC fund and philanthropy I profiled recently, pointed to cybersecurity as increasingly a wild card in many responses. “There wasn’t a concept called zero days — let alone a market for zero days — in 2004 [when the 9/11 Commission was doing its work], and now there is.” With the 9/11 terrorist attacks, “they had to come here, they had to hijack planes … now you don’t need to hijack planes to damage the United States,” noting that hackers “can be sitting with a bunch of other guys in Moscow, in Tehran, in China, or even your mother’s basement.”
Data is a revolution in the making for disaster response, but it may well cause a whole second-order set of problems that didn’t exist before. What technology giveth, it taketh away. The oil gushes, but then the well suddenly runs dry – or simply catches fire.
When Secretary of State Antony Blinken and National Security Advisor Jake Sullivan sat down with Chinese officials in Anchorage, Alaska, for the first high-level bilateral summit of the new administration, it was not a typical diplomatic meeting. Instead of a polite but restrained diplomatic exchange, the two sides traded pointed barbs for almost two hours. “There is growing consensus that the era of engagement with China has come to an unceremonious close,” Sullivan and Kurt Campbell, the Administration’s Asia czar who was also in attendance, wrote back in 2019. How apt that they were present for that moment’s arrival.
A little more than one hundred days into the Biden Administration, there is no shortage of views on how it should handle this new era of Sino-American relations. From a blue-ribbon panel assembled by former Google Chairman Eric Schmidt to a Politico essay from an anonymous former Trump Administration official that consciously echoes (in both its name and its author’s anonymity) George Kennan’s famous “Long Telegram” laying out the theory of Cold War containment, to countless think tank reports, it seems everyone is having their say.
What is largely uncontroversial though is that technology is at the center of U.S.-China relations, and any competition with China will be won or lost in the digital and cyber spheres. “Part of the goal of the Alaska meeting was to convince the Chinese that the Biden administration is determined to compete with Beijing across the board to offer competitive technology,” wrote David Sanger in the New York Times shortly afterward.
But what, exactly, does a tech-centered China strategy look like? And what would it take for one to succeed?
One encouraging sign is that China has emerged as one of the few issues on which even Democrats agree that President Trump had some valid points. “Trump really was the spark that reframed the entire debate around U.S.-China relations in DC,” says Jordan Schneider, a China analyst at the Rhodium Group and the host of the ChinaTalk podcast and newsletter.
While many in the foreign policy community favored some degree of cooperation with China before the Trump presidency, now competition – if not outright rivalry – is widely assumed. “Democrats, even those who served in the Obama Administration, have become much more hawkish,” says Erik Brattberg of the Carnegie Endowment for International Peace. Trump has caused “the Overton Window on China [to become] a lot narrower than it was before,” adds Schneider.
The US delegation, led by Secretary of State Antony Blinken, faces its Chinese counterparts at the opening session of US-China talks at the Captain Cook Hotel in Anchorage, Alaska, on March 18, 2021. Image Credits: FREDERIC J. BROWN/POOL/AFP via Getty Images
As the U.S.-China rivalry has evolved, it has become more and more centered around competing philosophies on the use of technology. “At their core, democracies are open systems that believe in the free flow of information, whereas for autocrats, information is something to be weaponized and stifled in the service of the regime,” says Lindsay Gorman, Fellow for Emerging Technologies at the German Marshall Fund. “So it’s not too surprising that technology, so much of which is about how we store and process and leverage information, has become such a focus of the U.S.-China relationship and of the [broader] democratic-autocratic competition around the world.”
Tech touches everything now – and the stakes could not be higher. “Tech and the business models around tech are really ‘embedded ideology,’” says Tyson Barker of the German Council on Foreign Relations. “So what tech is and how it is used is a form of governance.”
What does that mean in practice? When Chinese firms expand around the world, Barker tells me, they bring their norms with them. So when Huawei builds a 5G network in Latin America, or Alipay is adopted for digital payments in Central Europe, or Xiaomi takes more market share in Southeast Asia, they are helping digitize those economies on Chinese terms using Chinese norms (as opposed to American ones). The implication is clear: whoever defines the future of technology will determine the rest of the twenty-first century.
That shifting balance has focused minds in Washington. “I think there is a strong bipartisan consensus that technology is at the core of U.S.-China competition,” says Brattberg. But, adds Gorman, “there’s less agreement on what the prescription should be.” While the Democratic experts now ascendant in Washington agree with Trump’s diagnosis of the China challenge, they believe in a vastly different approach from their Trump Administration predecessors.
Out, for instance, are restrictions on Chinese firms just for being Chinese. “That was one of the problems with Trump,” says Walter Kerr, a former U.S. diplomat who publishes the China Journal Review. “Trump cast broad strokes, targeting firms whether it was merited or not. Sticking it to the Chinese is not a good policy.”
Instead the focus is on inward investment – and outward cooperation.
Democrats are first addressing America’s economic challenges at home – in short, be strong at home to be strong abroad. “There’s no longer a bright line between foreign and domestic policy,” President Biden said in his first major foreign policy speech. “Every action we take in our conduct abroad, we must take with American working families in mind. Advancing a foreign policy for the middle class demands urgent focus on our domestic economic renewal.”
This is a particular passion of Jake Sullivan, Biden’s national security advisor, who immersed himself in domestic policy while he was Hillary Clinton’s chief policy aide during her 2016 presidential campaign. “We’ve reached a point where foreign policy is domestic policy, and domestic policy is foreign policy,” he told NPR during the transition.
Jake Sullivan, White House national security adviser, speaks during a news conference Image Credits: Jim Lo Scalzo/EPA/Bloomberg via Getty Images
This is increasingly important for technology, as concern grows that America is lagging behind on research and development. “We’re realizing that we’ve underinvested in the government grants and research and development projects that American companies [need] to become highly innovative in fields like quantum computing, AI, biotechnology, etc,” says Kerr.
“Rebuilding” or “sustaining” America’s “technological leadership” is a major theme of the Longer Telegram and is the very operating premise of the report of the China Strategy Group assembled by Eric Schmidt, former executive chairman of Alphabet, Google’s parent company, and the first chair of the Department of Defense’s Innovation Advisory Board. Those priorities have only become more important during the pandemic. It’s a question of “how do we orient the research system to fill in the industrial gaps that have been made very clear by the COVID crisis?” says Schneider of Rhodium.
Startups constantly talk about being mission-oriented, but it’s hard to take most of those messages seriously when the mission is optimizing cash flow for tax efficiency. However, a new generation of startups is emerging that are taking on some of the largest global challenges and bringing the same entrepreneurial grit, operational excellence, and technical brilliance to bear on actual missions — ones that may well save thousands of lives.
ClimateTech has been a huge beneficiary of this trend in general, but one small specialty has caught my eye: disaster response. It’s a category for software services that’s percolated for years with startups here and there, but now a new crop of founders is taking on the challenges of this space with renewed urgency and vigor.
As the elevator pitch would have it, disaster response is hitting hockey-stick growth. 2020 was a brutal year, and in more ways than just the global COVID-19 pandemic. The year also saw a record number of hurricanes, among the worst wildfire seasons in the Western United States, and several megastorms all across the world. Climate change, urbanization, population growth and poor response practices have combined to create some of the most dangerous conditions humanity has ever collectively faced.
I wanted to get a sense of what the disaster response market has in store this decade, so over the past few weeks, I have interviewed more than 30 startup founders, investors, government officials, utility execs and more to understand this new landscape and what’s changed. In this four-part series on the future of technology and disaster response, to be published this weekend and next, we’ll look at the sales cycle in this market, how data is finally starting to flow into disaster response, how utilities and particularly telcos are dealing with internet access issues, and how communities are redefining disaster management going forward.
Before we get into all the tech developments in disaster response and resilience though, it’s important to ask a basic question: if you build it, will they come? The resounding answer from founders, investors, and government procurement officials was simple: no.
In fact, in all my conversations for this series, the hell of the emergency management sales cycle came up repeatedly, with more than one individual describing it as possibly the toughest sale that any company could make in the entire world. That view might be surprising in a market that easily runs into the tens of billions of dollars if the budgets for procurement are aggregated across local, state, federal, and international governments. Yet, as we will see, the unique dynamics of this market make almost any traditional sales approach useless.
Despite that pessimism though, that doesn’t mean sales are impossible, and a new crop of startups are piercing the barriers of entry in this market. We’ll look at the sales and product strategies that startups are increasingly relying on today to break through.
Few will be surprised that government sales are hard. Generations of govtech startup founders have learned that slow sales cycles, byzantine procurement processes, cumbersome verification and security requirements, and a general lassitude among contract officers makes for a tough battlefield to close on revenue. Many government agencies now have programs to specifically onboard startups, having discovered just how hard it is for new innovations to run through their gauntlet.
Emergency management sales share all the same problems as other govtech startups, but then they deal with about a half dozen more problems that take the sales cycle from exhausting to infernal.
The first and most painful is the dramatic seasonality of the sales in the emergency space. Many agencies that operate on seasonal disasters — think hurricanes, wildfires, winter storms, and more — often go through an “action” period where they respond to these disasters, and then transition into a “planning” period where they assess their performance, determine what changes are needed for next season, and consider what tools might be added or removed to increase the effectiveness of their responders.
Take Cornea and Perimeter, two startups in the wildfire response space that I profiled recently. Both teams described how they needed to think in terms of fire seasons when it came to product iteration and sales. “We took two fire seasons to beta test our technology … to solve the right problem the right way,” Bailey Farren, CEO and co-founder of Perimeter, said. “We actually changed our focus on beta testing during the [2019 California] Kincade Fire.”
In this way, disaster tech could be compared to edtech, where school technology purchases are often synchronized with the academic calendar. Miss the June through August window in the U.S. education system, and a startup is looking at another year before it will get another chance at the classroom.
Edtech may seem like the tougher sale given that three-month needle to thread, but disaster response is getting more difficult every year. Climate change is exacerbating the length, severity, and damage caused by all types of disasters, which means that responding agencies that might once have had six months or more out-of-season to plan are sometimes working all year long just to respond to emergencies. That leaves little time to think about what new solutions an agency needs to purchase.
Worse, unlike the standardized academic calendar, disasters are much less predictable these days as well. Flood and wildfire seasons, for instance, used to be relatively concentrated in certain periods of the year. Now, such emergencies can emerge practically year-round. That means that procurement processes can both start and freeze on a moment’s notice as an agency has to respond to its mission.
Seasonality doesn't just apply to the sales cycle though — it also applies to the budgets of these agencies. While they are unfolding, disasters dominate the minds of citizens and politicians, but then we forget all about them until the next catastrophe. Unlike the annual consistency of other government tech spending, disaster tech funding often comes in waves.
One senior federal emergency management official, who asked not to be named since he wasn't authorized to speak publicly, explained that consistent budgets and the ability to spend them quickly are quite limited during "blue sky days" (i.e., periods without a disaster), and agencies like his have to rely on piecing together supplementary disaster funds when Congress or state legislatures authorize additional financing. The best agencies have technological roadmaps on hand so that when extra funding comes in, they can use it immediately to realize their plans, but not all agencies have the technical planning resources to be that prepared.
Amir Elichai, the CEO and co-founder of Carbyne, a cloud-native platform for call handling in 911 centers, said that this wave of interest crested yet again with the COVID-19 pandemic last year, triggering huge increases in attention and funding around emergency response capabilities. “COVID put a mirror in front of government faces and showed them that ‘we’re not ready’,” he said.
Perhaps unsurprisingly, next-generation 911 services (typically dubbed NG911), which have been advocated for years by the industry and first responders, are looking at a major financing boost. President Biden's proposed infrastructure bill would add $15 billion to upgrade 911 capabilities in the United States — funding that has been requested for much of the last decade. Just last year, a $12 billion variant of that bill failed in the Senate after passing the U.S. House of Representatives.
Sales are all about providing proverbial painkillers versus vitamins to customers, and one would expect that disaster response agencies looking to upgrade their systems would be very much on the painkiller side. After all, the fear and crisis surrounding these agencies and their work would seem to bring visceral attention to their needs.
Yet, that fear actually has the opposite effect in many cases, driving attention away from systematic technology upgrades in favor of immediate acute solutions. One govtech VC, who asked not to be named to speak candidly about the procurement process his companies go through, said that “we don’t want to paint the picture that the world is a scary and dangerous place.” Instead, “the trick is to be … focused on the safety side rather than the danger.” Safety is a much more prevalent and consistent need than sporadically responding to emergencies.
When a wave of funding finally gets approved though, agencies often have to scramble to figure out what to prioritize now that the appropriated manna has finally dropped from legislative heaven. Even when startups provide the right solutions, scrying which problems are going to get funded in a particular cycle requires acute attention to every customer.
Josh Mendelsohn, the managing partner at startup studio and venture fund Hangar, said that “the customers have no shortage of needs that they are happy to talk about … the hardest part is how you narrow the funnel — what are the problems that are most meritorious?” That merit can, unfortunately, evolve very rapidly as mission requirements change.
Let’s say all the stars line up though — the agencies have time to buy, they have a need, and a startup has the solution that they want. The final challenge that’s probably the toughest to overcome is simply the lack of trust that new startups have with agencies.
In talking to emergency response officials over the past few weeks, reliability unsurprisingly came up again and again. Responding to disasters is mission-critical work, and nothing can break in the field or in the operations center. Frontline responders still use paper and pens in lieu of tablets or mobile phones since they know that paper is going to work every single time and never run out of battery. The "move fast and break things" ethos of Silicon Valley is fundamentally incompatible with this market.
Seasonality, on-and-off funding, lack of attention, procurement scrambling, and acute reliability requirements combine to make emergency management sales among the hardest possible for a startup. That doesn’t even get into all the typical govtech challenges like integrating with legacy systems, the massive fragmentation of thousands of emergency response agencies littered across the United States and globally, and the fact that in many agencies, people aren’t that interested in change in the first place. As one individual in the space described how governments approach emergency technology, “a lot of departments are looking at it as maybe I can hit retirement before I have to deal with it.”
So the sales cycle is hell. Why, then, are VCs dropping money into the sector? After all, emergency response data platform RapidSOS raised $85 million just a few months ago, about the same time Carbyne raised $25 million. Quite a few more startups at the earliest phases have raised pre-seed and seed investment as well.
The key argument that nearly everyone in this sector agreed on is that founders (and their investors) have to throw away their private-sector sales playbooks and rebuild their approach from the bottom up to sell specifically to these agencies. That means devising entirely different strategies and tactics to secure revenue.
The first and most important approach is, in some respects, to not even start with a company at all, but rather to start learning what people in this field actually do. As the sales cycle perhaps indicates, disaster response is unlike any other work. The chaos, the rapidly changing environment, and the multi-disciplinary, cross-agency work that has to take place for a response to be effective have few parallels in professional office work. Empathy is key here: the responder who uses paper might have nearly lost their life in the field when their device failed. A 911 center operator may have listened to someone perish in real time as they scrambled to find the right information in a software database.
In short, it’s all about customer discovery and development. That’s not so different from the enterprise world, but patience radiated out of many of my conversations with industry participants. It just takes more time — sometimes multiple seasons — to figure out precisely what to build and how to sell it effectively. If an enterprise SaaS product can iterate to market-fit in six months, it might take two to three years in the government sector to reach an equivalent point.
Michael Martin of RapidSOS said, "There is no shortcut to doing customer discovery work in public service." He noted that "I do think there is a real challenge between the arrogance of the Silicon Valley tech community and the reality of these challenges" in public safety, a gap that has to be closed if a startup wants to find success. Meanwhile, Bryce Stirton, president and co-founder of public-safety company Responder Corp, said that "The end user is the best way to look at all the challenges … what are all the boxes the end user has to check to use a new technology?"
Mendelsohn of Hangar said that founders need to answer some tough questions in that process. “Ultimately, what are your entry points,” he asked. “Cornea has had to go through that customer discovery process … it all feels necessary, but what are the right things that require the least amount of behavior change to have impact immediately?”
Indeed, that process is appreciated on the other side as well. The federal emergency management official said, “everyone has a solution, but no one asked me about my problem.” Getting the product right and having it match the unique work that takes place in this market is key.
Let’s say you have a great product though — how do you get it through the perilous challenges of the procurement process? Here, answers differed widely, and they offer multiple strategies on how to approach the problem.
Martin of RapidSOS said that “government does not have a good model for procuring new services to solve problems.” So, the company chose to make its services free for government. “In three years, we went from no agencies using our stuff to all agencies using our stuff, and that was based on not making it a procurement problem,” he said. The company’s business model is based on having paid corporate partners who want to integrate their data into 911 centers for safety purposes.
That’s a similar model used by MD Ally, which received a $3.5 million seed check from General Catalyst this past week. The company adds telehealth referral services into 911 dispatch systems, and CEO and founder Shanel Fields emphasized that she saw an opportunity to create a revenue engine from the physician and mental health provider side of her market while avoiding government procurement.
Outside of what might be dubbed “Robinhood for government” (aka, just offering a service for free), another approach is to link up with more well-known and trusted brand names to offer a product that has the innovation of a startup but the reliability of an established player. Stirton of Responder said “we learned in [this market] that it takes more than just capital to get companies started in this space.” What he found worked was building private-sector partnerships to bring a joint offering to governments. For instance, he noted cloud providers Amazon Web Services and Verizon have good reputations with governments and can get startups over procurement hurdles (TechCrunch is owned by Verizon Media, which is owned by Verizon).
Elichai of Carbyne noted that much of the company's sales run through integration partners, citing CentralSquare as one example. For 911 services, "The U.S. market is obviously the most fragmented," and so partners allow the company to avoid selling to thousands of different agencies. "We are usually not selling direct to governments," he said.
Partners can also help deal with the problem of localism in emergency procurement: many government agencies don’t know precisely what to buy, so they simply buy software that is offered by companies in their own backyard. Partners can offer a local presence while also allowing a startup to have a nimble national footprint.
Another angle on partners is building out a roster of experienced but retired government executives who can lend credibility to a startup through their presence and networks. Even more than in enterprise, government officials, particularly in emergency management, have to work with and trust one another given the closely coupled work that they perform. Hearing a positive recommendation from a close contact down the street can readily change the tenor of a sales conversation.
Finally, as much as emergency management software is geared for governments, private sector companies increasingly have to consider much of the same tooling to protect their operations. Many companies have distributed workforces, field teams, and physical assets they need to protect, and often have to respond to disasters in much the same way that governments do. For some startups, it’s possible to bootstrap in the private sector early on while continuing to assiduously develop public sector relationships.
In short, a long-term customer development program coupled with quality partnerships and joint offerings while not forgetting the private sector offers the best path for startups to break through into these agencies.
The good news is that the hard work can be rewarded. Not only are there serious dollars flowing through these agencies, but the agencies themselves know that they need better technology. Tom Harbour, who is chief fire officer at Cornea and formerly national director of fire management at the U.S. Forest Service, notes that "These are billions of dollars we spend … and we know we can be more efficient." Government doesn't always make it easy to create efficiency, but founders willing to go the distance can build impactful, profitable, and mission-driven companies.
With an increasing number of enterprise systems, growing teams, an expanding web presence and multiple digital initiatives, companies of all sizes are creating loads of data every day. This data holds excellent business insights and immense opportunities, but its sheer volume has made it impossible for companies to derive actionable insights from it consistently.
According to Verified Market Research, the analytics-as-a-service (AaaS) market is expected to grow to $101.29 billion by 2026. Organizations that have not started on their analytics journey or are spending scarce data engineer resources to resolve issues with analytics implementations are not identifying actionable data insights. Through AaaS, managed services providers (MSPs) can help organizations get started on their analytics journey immediately without extravagant capital investment.
MSPs can take ownership of the company’s immediate data analytics needs, resolve ongoing challenges and integrate new data sources to manage dashboard visualizations, reporting and predictive modeling — enabling companies to make data-driven decisions every day.
AaaS can come bundled with multiple business-intelligence-related services. Primarily, the service includes (1) services for data warehouses; (2) services for visualizations and reports; and (3) services for predictive analytics, artificial intelligence (AI) and machine learning (ML). When a company partners with an MSP for analytics as a service, it is able to tap into business intelligence easily, instantly and at a lower cost of ownership than doing it in-house. This empowers the enterprise to focus on delivering better customer experiences, make unencumbered decisions and build data-driven strategies.
In today's world, where customers value experiences over transactions, AaaS helps businesses dig deeper into their customers' psyches and tap insights to build long-term winning strategies. It also enables enterprises to forecast business trends by looking at their data, and allows employees at every level to make informed decisions.
Apple faces an antitrust complaint in Europe, TikTok has a new CEO and YouTube TV disappears from Roku. This is your Daily Crunch for April 30, 2021.
Also, this is my last day at TechCrunch, and therefore my last day writing The Daily Crunch. It’s been a blast rounding up the news for all of you, and thank you to everyone who took the time to tell me they enjoyed the newsletter.
On Monday, TechCrunch will be debuting a more collaborative approach to The Daily Crunch — stay tuned!
The big story: Europe charges Apple with antitrust breach
The European Commission has filed preliminary charges against Apple, focusing on complaints by Spotify that Apple’s App Store policies — particularly its requirements around in-app purchase — are anti-competitive.
“The Commission takes issue with the mandatory use of Apple’s own in-app purchase mechanism imposed on music streaming app developers to distribute their apps via Apple’s App Store,” it wrote. “The Commission is also concerned that Apple applies certain restrictions on app developers preventing them from informing iPhone and iPad users of alternative, cheaper purchasing possibilities.”
Apple has 12 weeks to respond to the charges.
The tech giants
ByteDance CFO assumes role as new TikTok CEO — Eight months after former TikTok CEO Kevin Mayer quit in the midst of a full-court press from the Trump administration, TikTok finally has a new permanent leader.
Roku removes YouTube TV from its channel store following failed negotiations — Earlier this week, Roku warned customers that the YouTube TV app may be removed from its streaming media players and TVs, and it alleged that Google was leveraging its monopoly power during contract negotiations to ask for unfair terms.
Computer vision inches toward ‘common sense’ with Facebook’s latest research — One development Facebook has pursued in particular is what’s called “semi-supervised learning.”
Startups, funding and venture capital
Developer-focused video platform Mux achieves unicorn status with $105M funding — “I think video’s eating software, the same way software was eating the world 10 years ago.”
As concerns rise over forest carbon offsets, Pachama’s verified offset marketplace gets $15M — The startup is building a marketplace for forest carbon credits that it says is more transparent and verifiable thanks to its use of satellite imagery and machine learning technologies.
Heirlume raises $1.38M to remove the barriers of trademark registration for small businesses — Heirlume’s machine-powered trademark registration platform turns the process into a self-serve affair that won’t break the budget.
Advice and analysis from Extra Crunch
Optimism reigns at consumer trading services as fintech VC spikes and Robinhood IPO looms — But services that help consumers trade might need to retool their models over time to ensure long-term income.
Amid the IPO gold rush, how should we value fintech startups? — If there has ever been a golden age for fintech, it surely must be now.
The health data transparency movement is birthing a new generation of startups — Twin struggles seem to be taking place: a push for more transparency on provider and payer data, and another for strict privacy protection for personal patient data.
(Extra Crunch is our membership program, which helps founders and startup teams get ahead. You can sign up here.)
Everything else

Cloud infrastructure market keeps rolling in Q1 with almost $40B in revenue — That’s up $2 billion from last quarter and up 37% over the same period last year.
The second shot is kicking in — A new episode of the Webby-nominated Equity podcast.
Pitch your startup to seasoned tech leaders, and a live audience, on Extra Crunch Live — We’re bringing the pitch-off format to Extra Crunch Live.
The Daily Crunch is TechCrunch’s roundup of our biggest and most important stories. If you’d like to get this delivered to your inbox every day at around 3pm Pacific, you can subscribe here.