By 2025, 463 exabytes of data will be created each day, according to some estimates. (For perspective, one exabyte of storage could hold 50,000 years of DVD-quality video.) It’s now easier than ever to translate physical and digital actions into data, and businesses of all types have raced to amass as much data as possible in order to gain a competitive edge.
However, in our collective infatuation with data (and obtaining more of it), what’s often overlooked is the role that storytelling plays in extracting real value from data.
The reality is that data by itself is insufficient to really influence human behavior. Whether the goal is to improve a business’ bottom line or convince people to stay home amid a pandemic, it’s the narrative that compels action, rather than the numbers alone. As more data is collected and analyzed, communication and storytelling will become even more integral in the data science discipline because of their role in separating the signal from the noise.
Yet this can be an area where data scientists struggle. In Anaconda’s 2020 State of Data Science survey of more than 2,300 data scientists, nearly a quarter of respondents said that their data science or machine learning (ML) teams lacked communication skills. This may be one reason why roughly 40% of respondents said they were able to effectively demonstrate business impact “only sometimes” or “almost never.”
The best data practitioners must be as skilled in storytelling as they are in coding and deploying models — and yes, this extends beyond creating visualizations to accompany reports. Here are some recommendations for how data scientists can situate their results within larger contextual narratives.
Ever-growing datasets help machine learning models better understand the scope of a problem space, but more data does not necessarily help with human comprehension. Even for the most left-brain of thinkers, it’s not in our nature to understand large abstract numbers or things like marginal improvements in accuracy. This is why it’s important to include points of reference in your storytelling that make data tangible.
For example, throughout the pandemic, we’ve been bombarded with countless statistics around case counts, death rates, positivity rates, and more. While all of this data is important, tools like interactive maps and conversations around reproduction numbers are more effective than massive data dumps in terms of providing context, conveying risk, and, consequently, helping change behaviors as needed. In working with numbers, data practitioners have a responsibility to provide the necessary structure so that the data can be understood by the intended audience.
Tecton, the company that pioneered the notion of the machine learning feature store, has teamed up with the founder of the open source feature store project called Feast. Today the company announced the release of version 0.10 of the open source tool.
The feature store is a concept that the Tecton founders came up with when they were engineers at Uber. Shortly thereafter, an engineer named Willem Pienaar read the founders’ Uber blog posts on building a feature store and went to work building Feast as an open source version of the concept.
“The idea of Tecton [involved bringing] feature stores to the industry, so we build basically the best in class, enterprise feature store. […] Feast is something that Willem created, which I think was inspired by some of the early designs that we published at Uber. And he built Feast and it evolved as kind of like the standard for open source feature stores, and it’s now part of the Linux Foundation,” Tecton co-founder and CEO Mike Del Balso explained.
Tecton later hired Pienaar, who now leads the company’s open source team. While the company did not originally start off with a plan to build an open source product, the two products are closely aligned, and it made sense to bring Pienaar on board.
“The products are very similar in a lot of ways. So I think there’s a similarity there that makes this somewhat symbiotic, and there is no explicit convergence necessary. The Tecton product is a superset of what Feast has. So it’s an enterprise version with a lot more advanced functionality, but at Feast we have a battle-tested feature store that’s open source,” Pienaar said.
As we wrote in a December 2020 story on the company’s $35 million Series B, Tecton describes a feature store as “an end-to-end machine learning management system that includes the pipelines to transform the data into what are called feature values, then it stores and manages all of that feature data and finally it serves a consistent set of data.”
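That pipeline-store-serve loop can be sketched in a few lines of plain Python. This is an illustrative toy, not the Tecton or Feast API; the event fields and feature names are invented for the example.

```python
from collections import defaultdict

# Raw events from a source system (fields invented for illustration).
raw_events = [
    {"driver_id": "d1", "trip_minutes": 12},
    {"driver_id": "d1", "trip_minutes": 30},
    {"driver_id": "d2", "trip_minutes": 7},
]

def compute_features(events):
    # Transform step: roll raw events up into per-entity feature values.
    totals = defaultdict(lambda: {"trip_count": 0, "total_minutes": 0})
    for e in events:
        f = totals[e["driver_id"]]
        f["trip_count"] += 1
        f["total_minutes"] += e["trip_minutes"]
    return totals

# Store step: keep the computed feature values keyed by entity id.
online_store = compute_features(raw_events)

def get_online_features(entity_id):
    # Serve step: training and inference read the same consistent values.
    return dict(online_store[entity_id])

print(get_online_features("d1"))  # {'trip_count': 2, 'total_minutes': 42}
```

The point of the real products is that the transform, store and serve steps stay consistent between training and production, rather than being reimplemented twice.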
Del Balso says that from a business perspective, contributing to the open source feature store exposes his company to a different group of users, and the commercial and open source products can feed off one another as they build the two products.
“What we really like, and what we feel is very powerful here, is that we’re deeply in the Feast community and get to learn from all of the interesting use cases […] to improve the Tecton product. And similarly, we can use the feedback that we’re hearing from our enterprise customers to improve the open source project. That’s the kind of cross learning, and ideally that feedback loop involved there,” he said.
The plan is for Tecton to continue being a primary contributor with a team inside Tecton dedicated to working on Feast. Today, the company is releasing version 0.10 of the project.
Everyone warns you not to build on top of someone else’s platform.
When I first started in VC more than 10 years ago, I was told never to invest in a company building on top of another company’s platform. Dependence on a platform makes you susceptible to failure and caps the return on your investment because you have no control over API access, pricing changes and end-customer data, among other legitimate concerns.
I am sure many of you recall Facebook shutting down its API access back in 2015, or the uproar Apple caused when it decided to change the commission it was charging app developers in 2020.
Put simply, founders can no longer avoid the decision around platform dependency.
Salesforce in many ways paved the way for large enterprise platform companies, being the first dedicated SaaS company to surpass $10 billion in annual revenue supported by its open application development marketplace. Salesforce’s success has given rise to dominant platforms in other verticals, and for founders starting companies, there is no avoiding that platform decision these days.
What does this mean for founders who decide to build on top of another platform? Some points to consider:
Postscript, an SMS/MMS marketing platform for commerce brands, built its platform on Shopify, giving it immediate access to over 1 million brands and a direct customer acquisition funnel. That has allowed Postscript to capture 3,500 of its own customers and successfully close a $35 million Series B in March 2021.
Varo, one of the fastest-growing neobanks, started in 2015 with the principle that a bank could put customers’ interests first and be profitable. But in order to deliver on its mission, it needed to understand where its customers were spending their money. By partnering with Plaid, Varo enabled more than 176,000 of its users to connect their Varo account to outside apps and services, allowing Varo to focus on its core mission to provide more relevant financial products and services.
Graphic designer Paul Rand once famously said that the public was more familiar with bad design than good design. While he was referring to most of the design in the world being “bad”, these days that phrase might take on a second meaning: people typically only notice and talk about (and usually complain about) design when it is ugly, or works badly. Conversely, if it’s good, and it works, you don’t hear much.
Today a startup called UserZoom, which has built a platform that companies like Google, Microsoft, PayPal, Salesforce and many others use to stay off the bad-design radar — with tools to evaluate their design, identify where and when it doesn’t work, and link it up better with bigger customer experience strategies — is announcing some significant funding to expand its business.
The company has raised $100 million — money that CEO and co-founder Alfonso de la Nuez said will be used to continue building its tools and to pursue its mission of making design as critical to a company as sales are to an e-commerce business. Alongside this, it has acquired another experience insights company, EnjoyHQ, to expand its research operations.
“We feel companies are only scratching the surface of what they could be doing,” he said. “We think experience management could become the third system of record, similar to ERP or CRM.”
This funding is being led by Owl Rock, with other unnamed investors participating. Prior to this, UserZoom raised some $34 million. It is not disclosing valuation, but de la Nuez notes that this latest investment represents a minority stake in UserZoom, that the startup is profitable and grew revenues by 40% last year, and that it’s currently on an annual run rate of $80 million.
De la Nuez and UserZoom are currently based out of Los Gatos in the South Bay Area, but the company actually got its start in Barcelona, Spain, where de la Nuez and his co-founder Xavier Mestres originally ran a more old-school user experience design consulting company.
“We had physical labs, testing sites, where we ran focus groups,” he recalled. “It was tedious and manual.”
After years of working like that, he, Mestres and a third co-founder, Javier Darriba (who has since left the company), decided to see how, and if, they could retool the concept as a piece of software.
Their timing was perfect: It was 2007, the year of the iPhone debut, and the smaller screen of that device, and Apple’s prowess in nailing design and user experience, suddenly got the tech world (and the rest of the world) thinking about how they, too, could rethink their own digital experiences. You might think of it as an earlier iteration of the kind of digital transformation that people talk about today.
The company was growing in Spain at a time when it was much harder for startups to raise substantial rounds, so UserZoom made the decision to move to California. But Mestres, who is the CTO, still runs the startup’s engineering, design and customer support teams (100 out of 300 staff in all) out of Barcelona. The cost base of employing tech people in Spain is completely different from the Bay Area, “and it’s helped us become profitable,” de la Nuez said.
The core of the company’s product is a platform that runs what it refers to as “XIM” (Experience Insights Management), which lets customers test out any digital experience — be it something on the web, or a phone, or a smartwatch or an interactive voice service, and soon, other interfaces such as automotive. (And it’s a list that is likely to grow as more hardware and services are built.) It can recruit testers to evaluate design, product interaction, marketing decisions that the company is trying out, and so on.
Testing essentially starts as product development begins, the idea being that customers can apply the principle of “agile development” as they continue to work on the product, rather than leaving all of it to be tested after a product is technically complete.
As a company uses UserZoom, the results of tests can be shared among different stakeholders, who can make notes on how product development would work (or wouldn’t) with how they are envisioning, say, a new sales strategy or engagement goal. It also helps customers develop KPIs and determine how, and whether, a design is meeting them.
These can cover not just basic goals like “more conversions” or “less shopping cart abandonment” or “opting in to cookies” but also whether a design is meeting accessibility goals. (As seen with the recent controversy around Ravelry, this is indeed a growing issue and one that de la Nuez said will be getting more attention at UserZoom.)
The space of UX testing is a pretty crowded and well-funded one; others in it include LogRocket, UserTesting and ContentSquare, along with companies focused on specific verticals, like AB Tasty, and many more. What seems to give UserZoom an edge is not just its extensive and impressive customer base, but its focus on providing an end-to-end view of design and experience and how it fits into a bigger business strategy.
“In today’s digital economy, the quality of the customer and user experience is the driving factor that helps businesses retain customers and generate increased revenue,” said Pravin Vazirani, managing director at Owl Rock, in a statement. “Despite this, many organizations are still unable to properly extract and manage the potential insights that lie within a customer journey. UserZoom enables companies to harness these insights and drive improved digital experiences.” Andy Lefkarites, an investor at Owl Rock said in a statement, “We see a tremendous market opportunity for UserZoom, which enables companies of all sizes and industries to continually enhance and prioritize their digital experience strategy. We are pleased to be able to support UserZoom with growth capital to enable them to seize that opportunity.”
Before a startup can achieve product-market fit, founders must first listen to their customers, build what they require and fashion a business plan that makes the whole enterprise worthwhile. The numbers will tell the true story, but when it happens, you’ll feel it in your bones, because sales will be good, customers will be happy and revenue will be growing.
Reaching that tipping point can be a slog, especially for first-time founders. To uncover some basic truths about building products, we spoke to three entrepreneurs who have each built more than one company:
First-time founders often try to build the product they think the market wants. That’s what Scratchpad co-founder Salehi did when he founded his previous startup PersistIQ. Before launching his latest venture, he took a different approach: Instead of plowing ahead with a product and adjusting after he got in front of customers, he decided to step back and figure out what his customers needed first.
“Tactically what we did differently at Scratchpad is we tried to be much more deliberate up front. And what that looked like was [ … ] to not start with building, even though the product is such an important part, but really step back and understand what we are doing here in the first place,” he said.
One of the issues with deploying a machine learning application is that it tends to be expensive and highly compute intensive. Deeplite, a startup based in Montreal, wants to change that by providing a way to reduce the overall size of the model, allowing it to run on hardware with far fewer resources.
Today, the company announced a $6 million seed investment. Boston-based venture capital firm PJC led the round with help from Innospark Ventures, Differential Ventures and Smart Global Holdings. Somel Investments, BDC Capital and Desjardins Capital also participated.
Nick Romano, CEO and co-founder at Deeplite, says that the company aims to take complex deep neural networks, which require a lot of compute power to run, tend to use up a lot of memory and can consume batteries at a rapid pace, and help them run more efficiently with fewer resources.
“Our platform can be used to transform those models into a new form factor to be able to deploy it into constrained hardware at the edge,” Romano explained. Those devices could be as small as a cell phone, a drone or even a Raspberry Pi, meaning that developers could deploy AI in ways that just wouldn’t be possible in most cases right now.
The company has created a product called Neutrino that lets you specify how you want to deploy your model and how much you can compress it to reduce the overall size and the resources required to run it in production. The idea is to run a machine learning application on an extremely small footprint.
Davis Sawyer, chief product officer and co-founder, says that the company’s solution comes into play after the model has been built, trained and is ready for production. Users supply the model and the data set and then they can decide how to build a smaller model. That could involve reducing the accuracy a bit if there is a tolerance for that, but chiefly it involves selecting a level of compression — how much smaller you can make the model.
“Compression reduces the size of the model so that you can deploy it on a much cheaper processor. We’re talking in some cases going from 200 megabytes down to 11 megabytes, or from 50 megabytes to 100 kilobytes,” Sawyer explained.
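To see why compression shrinks a model so dramatically, here is a toy sketch of one common technique, symmetric 8-bit quantization: storing each weight as a 1-byte integer plus a shared scale instead of a 4-byte float. This is a generic illustration with made-up weights, not Deeplite’s Neutrino product or its API.

```python
import struct

# A handful of model weights, stored as 32-bit floats (made-up values).
weights = [0.92, -0.41, 0.13, 0.77, -0.88, 0.05]

# Symmetric 8-bit quantization: one shared scale, integers in [-127, 127].
scale = max(abs(w) for w in weights) / 127
quantized = [round(w / scale) for w in weights]

float_bytes = len(struct.pack(f"{len(weights)}f", *weights))     # 4 bytes/weight
int8_bytes = len(struct.pack(f"{len(quantized)}b", *quantized))  # 1 byte/weight

# Dequantize to measure how much accuracy the compression cost.
restored = [q * scale for q in quantized]
max_error = max(abs(w - r) for w, r in zip(weights, restored))

print(float_bytes, int8_bytes)  # 24 6 — a 4x reduction before any pruning
```

Real compression pipelines combine quantization with pruning and architecture search, which is how reductions far beyond 4x become possible, at some cost in accuracy if the use case tolerates it.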
Rob May, who is leading the investment for PJC, says that he was impressed with the team and the technology the startup is trying to build.
“Deploying AI, particularly deep learning, on resource-constrained devices, is a broad challenge in the industry with scarce AI talent and know-how available. Deeplite’s automated software solution will create significant economic benefit as Edge AI continues to grow as a major computing paradigm,” May said in a statement.
The idea for the company has roots in the TandemLaunch incubator in Montreal. It launched officially as a company in mid-2019 and today has 15 employees with plans to double that by the end of this year. As it builds the company, Romano says the founders are focused on building a diverse and inclusive organization.
“We’ve got a strategy that’s going to find us the right people, but do it in a way that is absolutely diverse and inclusive. That’s all part of the DNA of the organization,” he said.
When it’s possible to return to work, the plan is to have offices in Montreal and Toronto that act as hubs for employees, but there won’t be any requirement to come into the office.
“We’ve already discussed that the general approach is going to be that people can come and go as they please, and we don’t think we will need as large an office footprint as we may have had in the past. People will have the option to work remotely and virtually as they see fit,” Romano said.
Meroxa, a startup that makes it easier for businesses to build the data pipelines to power both their analytics and operational workflows, today announced that it has raised a $15 million Series A funding round led by Drive Capital. Existing investors Root, Amplify and Hustle Fund also participated in this round, which together with the company’s previously undisclosed $4.2 million seed round now brings total funding in the company to $19.2 million.
The promise of Meroxa is that businesses can use a single platform for their various data needs and won’t need a team of experts to build and then manage their infrastructure. At its core, Meroxa provides a single Software-as-a-Service solution that connects relational databases to data warehouses and then helps businesses operationalize that data.
“The interesting thing is that we are focusing squarely on relational and NoSQL databases into data warehouse,” Meroxa co-founder and CEO DeVaris Brown told me. “Honestly, people come to us as a real-time FiveTran or real-time data warehouse sink. Because, you know, the industry has moved to this [extract, load, transform] format. But the beautiful part about us is, because we do change data capture, we get that granular data as it happens.” And businesses want this very granular data to be reflected inside of their data warehouses, Brown noted, but he also stressed that Meroxa can expose this stream of data as an API endpoint or point it to a Webhook.
The company is able to do this because its core architecture is somewhat different from other data pipeline and integration services that, at first glance, seem to offer a similar solution. Because of this, users can use the service to connect different tools to their data warehouse but also build real-time tools on top of these data streams.
“We aren’t a point-to-point solution,” Meroxa co-founder and CTO Ali Hamidi explained. “When you set up the connection, you aren’t taking data from Postgres and only putting it into Snowflake. What’s really happening is that it’s going into our intermediate stream. Once it’s in that stream, you can then start hanging off connectors and say, ‘Okay, well, I also want to peek into the stream, I want to transfer my data, I want to filter out some things, I want to put it into S3.”
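The intermediate-stream idea Hamidi describes can be sketched as a toy in Python: change events flow into one stream, and any number of connectors hang off it. This is illustrative only, not Meroxa’s architecture or API; the tables and sinks are invented.

```python
# Change events captured from a source database (invented example rows).
change_events = [
    {"table": "orders", "op": "insert", "row": {"id": 1, "total": 40}},
    {"table": "orders", "op": "update", "row": {"id": 1, "total": 55}},
    {"table": "users", "op": "insert", "row": {"id": 9, "name": "Ada"}},
]

warehouse_rows = []  # stands in for a data warehouse sink
audit_log = []       # stands in for an S3 writer or webhook

def orders_sink(event):
    # A connector that peeks into the stream and keeps only one table.
    if event["table"] == "orders":
        warehouse_rows.append(event["row"])

def audit_connector(event):
    # A second connector hanging off the same stream.
    audit_log.append((event["table"], event["op"]))

connectors = [orders_sink, audit_connector]

for event in change_events:       # the intermediate stream
    for connector in connectors:  # every connector sees every event
        connector(event)

print(len(warehouse_rows), len(audit_log))  # 2 3
```

Because the stream sits in the middle rather than wiring source to destination point-to-point, adding a new destination means attaching one more connector, not rebuilding the pipeline.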
With this flexibility, Hamidi noted, a lot of the company’s customers start with a pretty standard use case and then quickly expand into other areas as well.
Brown and Hamidi met during their time at Heroku, where Brown was a director of product management and Hamidi a lead software engineer. But while Heroku made it very easy for developers to publish their web apps, there wasn’t anything comparable in the highly fragmented database space. The team acknowledges that there are a lot of tools that aim to solve these data problems, but few of them focus on the user experience.
“When we talk to customers now, it’s still very much an unsolved problem,” Hamidi said. “It seems kind of insane to me that this is such a common thing and there is no ‘oh, of course you use this tool because it addresses all my problems.’ And so the angle that we’re taking is that we see user experience not as a nice-to-have, it’s really an enabler, it is something that enables a software engineer or someone who isn’t a data engineer with 10 years of experience in wrangling Kafka and Postgres and all these things. […] That’s a transformative kind of change.”
It’s worth noting that Meroxa uses a lot of open-source tools but the company has also committed to open-sourcing everything in its data plane as well. “This has multiple wins for us, but one of the biggest incentives is in terms of the customer, we’re really committed to having our agenda aligned. Because if we don’t do well, we don’t serve the customer. If we do a crappy job, they can just keep all of those components and run it themselves,” Hamidi explained.
Today, Meroxa, which the team founded in early 2020, has over 24 employees (and is 100% remote). “I really think we’re building one of the most talented and most inclusive teams possible,” Brown told me. “Inclusion and diversity are very, very high on our radar. Our team is 50% black and brown. Over 40% are women. Our management team is 90% underrepresented. So not only are we building a great product, we’re building a great company, we’re building a great business.”
While visual ‘no code‘ tools are helping businesses get more out of computing without the need for armies of in-house techies to configure software on behalf of other staff, access to the most powerful tech tools — at the ‘deep tech’ AI coal face — still requires some expert help (and/or costly in-house expertise).
This is where bootstrapping French startup, NLPCloud.io, is plying a trade in MLOps/AIOps — or ‘compute platform as a service’ (being as it runs the queries on its own servers) — with a focus on natural language processing (NLP), as its name suggests.
Developments in artificial intelligence have, in recent years, led to impressive advances in the field of NLP — a technology that can help businesses scale their capacity to intelligently grapple with all sorts of communications by automating tasks like named entity recognition, sentiment analysis, text classification, summarization, question answering and part-of-speech tagging, freeing up (human) staff to focus on more complex/nuanced work. (Although it’s worth emphasizing that the bulk of NLP research has focused on the English language — meaning that’s where this tech is most mature; so associated AI advances are not universally distributed.)
Production ready (pre-trained) NLP models for English are readily available ‘out of the box’. There are also dedicated open source frameworks offering help with training models. But businesses wanting to tap into NLP still need to have the DevOps resource and chops to implement NLP models.
NLPCloud.io is catering to businesses that don’t feel up to the implementation challenge themselves — offering “production-ready NLP API” with the promise of “no DevOps required”.
Its API is based on Hugging Face and spaCy open-source models. Customers can either choose to use ready-to-use pre-trained models (it selects the “best” open source models; it does not build its own); or they can upload custom models developed internally by their own data scientists — which it says is a point of differentiation vs SaaS services such as Google Natural Language (which uses Google’s ML models) or Amazon Comprehend and Monkey Learn.
NLPCloud.io says it wants to democratize NLP by helping developers and data scientists deliver these projects “in no time and at a fair price”. (It has a tiered pricing model based on requests per minute, which starts at $39pm and ranges up to $1,199pm, at the enterprise end, for one custom model running on a GPU. It does also offer a free tier so users can test models at low request velocity without incurring a charge.)
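From the client side, a “production-ready NLP API” boils down to one authenticated HTTP call in place of hosting a model. The sketch below builds such a request with Python’s standard library; the endpoint URL, token header and response shape are hypothetical placeholders, not NLPCloud.io’s documented API.

```python
import json
import urllib.request

API_TOKEN = "YOUR_TOKEN"  # placeholder credential

payload = json.dumps(
    {"text": "Apple is opening a new office in Paris."}
).encode("utf-8")

req = urllib.request.Request(
    "https://api.example-nlp.io/v1/en/entities",  # hypothetical endpoint
    data=payload,
    headers={
        "Authorization": f"Token {API_TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# urllib.request.urlopen(req) would return the extracted entities as JSON,
# e.g. a list of {"text": ..., "type": ...} objects.
```

The appeal for teams without DevOps resources is that everything behind that call — model serving, scaling, GPU provisioning — is the provider’s problem.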
“The idea came from the fact that, as a software engineer, I saw many AI projects fail because of the deployment to production phase,” says sole founder and CTO Julien Salinas. “Companies often focus on building accurate and fast AI models but today more and more excellent open-source models are available and are doing an excellent job… so the toughest challenge now is being able to efficiently use these models in production. It takes AI skills, DevOps skills, programming skill… which is why it’s a challenge for so many companies, and which is why I decided to launch NLPCloud.io.”
The platform launched in January 2021 and now has around 500 users, including 30 who are paying for the service. The startup, which is based in Grenoble, in the French Alps, is a team of three for now, plus a couple of independent contractors. (Salinas says he plans to hire five people by the end of the year.)
“Most of our users are tech startups but we also start having a couple of bigger companies,” he tells TechCrunch. “The biggest demand I’m seeing is both from software engineers and data scientists. Sometimes it’s from teams who have data science skills but don’t have DevOps skills (or don’t want to spend time on this). Sometimes it’s from tech teams who want to leverage NLP out-of-the-box without hiring a whole data science team.”
“We have very diverse customers, from solo startup founders to bigger companies like BBVA, Mintel, Senuto… in all sorts of sectors (banking, public relations, market research),” he adds.
Use cases of its customers include lead generation from unstructured text (such as web pages), via named entities extraction; and sorting support tickets based on urgency by conducting sentiment analysis.
Content marketers are also using its platform for headline generation (via summarization). While text classification capabilities are being used for economic intelligence and financial data extraction, per Salinas.
He says his own experience as a CTO and software engineer working on NLP projects at a number of tech companies led him to spot an opportunity in the challenge of AI implementation.
“I realized that it was quite easy to build acceptable NLP models thanks to great open-source frameworks like spaCy and Hugging Face Transformers but then I found it quite hard to use these models in production,” he explains. “It takes programming skills in order to develop an API, strong DevOps skills in order to build a robust and fast infrastructure to serve NLP models (AI models in general consume a lot of resources), and also data science skills of course.
“I tried to look for ready-to-use cloud solutions in order to save weeks of work but I couldn’t find anything satisfactory. My intuition was that such a platform would help tech teams save a lot of time, sometimes months of work for the teams who don’t have strong DevOps profiles.”
“NLP has been around for decades but until recently it took whole teams of data scientists to build acceptable NLP models. For a couple of years, we’ve made amazing progress in terms of accuracy and speed of the NLP models. More and more experts who have been working in the NLP field for decades agree that NLP is becoming a ‘commodity’,” he goes on. “Frameworks like spaCy make it extremely simple for developers to leverage NLP models without having advanced data science knowledge. And Hugging Face’s open-source repository for NLP models is also a great step in this direction.
“But having these models run in production is still hard, and maybe even harder than before as these brand new models are very demanding in terms of resources.”
The models NLPCloud.io offers are picked for performance — where “best” means “the best compromise between accuracy and speed”. Salinas also says they pay mind to context, given NLP can be used for diverse use cases — hence offering a number of models so as to be able to adapt to a given use.
“Initially we started with models dedicated to entities extraction only but most of our first customers also asked for other use cases too, so we started adding other models,” he notes, adding that they will continue to add more models from the two chosen frameworks — “in order to cover more use cases, and more languages”.
SpaCy and Hugging Face, meanwhile, were chosen as the source for the models offered via its API based on their track record as companies, the NLP libraries they offer and their focus on production-ready frameworks — with the combination allowing NLPCloud.io to offer a selection of models that are fast and accurate, working within the bounds of respective trade-offs, according to Salinas.
“SpaCy is developed by a solid company in Germany called Explosion.ai. This library has become one of the most used NLP libraries among companies who want to leverage NLP in production ‘for real’ (as opposed to academic research only). The reason is that it is very fast, has great accuracy in most scenarios, and is an ‘opinionated’ framework which makes it very simple to use by non-data scientists (the tradeoff is that it gives less customization possibilities),” he says.
“Hugging Face is an even more solid company that recently raised $40M for a good reason: They created a disruptive NLP library called ‘transformers’ that improves a lot the accuracy of NLP models (the tradeoff is that it is very resource intensive though). It gives the opportunity to cover more use cases like sentiment analysis, classification, summarization… In addition to that, they created an open-source repository where it is easy to select the best model you need for your use case.”
While AI is advancing at a clip within certain tracks — such as NLP for English — there are still caveats and potential pitfalls attached to automating language processing and analysis, with the risk of getting stuff wrong, or worse. AI models trained on human-generated data have, for example, been shown to reflect the embedded biases and prejudices of the people who produced the underlying data.
Salinas agrees NLP can sometimes face “concerning bias issues”, such as racism and misogyny. But he expresses confidence in the models they’ve selected.
“Most of the time it seems [bias in NLP] is due to the underlying data used to train the models. It shows we should be more careful about the origin of this data,” he says. “In my opinion the best solution in order to mitigate this is that the community of NLP users should actively report something inappropriate when using a specific model so that this model can be paused and fixed.”
“Even if we doubt that such a bias exists in the models we’re proposing, we do encourage our users to report such problems to us so we can take measures,” he adds.
As a company founded by data scientists, Streamlit may be in a unique position to develop tooling to help companies build machine learning applications. It started with an open source project, and today the startup announced an expanded beta of a new commercial offering along with $35 million in Series B funding.
Sequoia led the investment with help from previous investors Gradient Ventures and GGV Capital. Today’s round brings the total raised to $62 million, according to the company.
Data scientists can download the open source project and build a machine learning application, but it requires a certain level of technical aptitude to make all the parts work. Company co-founder and CEO Adrien Treuille says that so far the company has 20,000 monthly active developers using the open source tooling to develop Streamlit apps, which have been viewed millions of times.
As they have gained that traction, they have customers who would prefer to use a commercial service. “It’s great to have something free and that you can use instantly, but not every company is capable of bridging that into a commercial offering,” Treuille explained.
Company COO and co-founder Amanda Kelly says that the commercial offering called Streamlit for Teams is designed to remove some of the complexity around using the open source application. “The whole [process of] how do I actually deploy an app, put it in a container, make sure it scales, has the resources and is securely connected to data sources […] — that’s a whole different skill set. That’s a DevOps and IT skill set,” she said.
What Streamlit for Teams does is take care of all that in the background for end users, so they can concentrate on the app building part of the equation without help from the technical side of the company to deploy it.
Sonya Huang, a partner at Sequoia, who is leading the firm’s investment in Streamlit, says that she was impressed with the company’s developer focus and sees the new commercial offering as a way to expand usage of the applications that data scientists have been building in the open source project.
“Streamlit has a chance to define a better interface between data teams and business users by ushering in a new paradigm for interactive, data-rich applications,” Huang said.
They have data scientists at big-name companies like Uber, Delta Dental and John Deere using the open source product already. They have kept the company fairly lean with 27 employees up until now, but the plan is to double that number in the coming year with the new funding, Kelly says.
She says that the founding team recognizes that it’s important to build a diverse company. She admits that it’s not always easy to do in practice when, as a young startup, you are just fighting to stay alive, but she says that the funding gives them the luxury to step back and begin to hire more deliberately.
“Literally right before this call, I was on with a consultant who is going to come in and work with the executive team, so that we’re all super clear about what we mean [when it comes to] diversity for us and how is this actually a really core part of our company, so that we can flow that into recruiting and people and engineering practices and make that a lived value within our company,” she said.
Streamlit for Teams is available in beta starting today. The company plans to make it generally available some time later this year.
The Kubernetes project was a major undertaking for the company, Esri Product Managers Trevor Seaton and Philip Heede told me. Traditionally, like so many similar products, ArcGIS was architected to be installed on physical boxes, virtual machines or cloud-hosted VMs. And while it doesn’t really matter to end-users where the software runs, containerizing the application means that it is far easier for businesses to scale their systems up or down as needed.
“We have a lot of customers — especially some of the larger customers — that run very complex questions,” Seaton explained. “And sometimes it’s unpredictable. They might be responding to seasonal events or business events or economic events, and they need to understand not only what’s going on in the world, but also respond to their many users from outside the organization coming in and asking questions of the systems that they put in place using ArcGIS. And that unpredictable demand is one of the key benefits of Kubernetes.”
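That elasticity is the standard Kubernetes pattern: run the service as a Deployment and let a HorizontalPodAutoscaler add or remove replicas as demand shifts. The sketch below is a generic, hypothetical manifest — the name, replica counts and CPU target are placeholders, not Esri’s actual configuration:

```yaml
# Hypothetical autoscaling config; "arcgis-service" is a placeholder name.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: arcgis-service
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: arcgis-service
  minReplicas: 2          # baseline capacity for normal traffic
  maxReplicas: 20         # headroom for seasonal or unpredictable spikes
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70   # scale out when average CPU passes 70%
```

With something like this in place, unpredictable demand translates into replicas being added and removed automatically rather than capacity being provisioned by hand.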
The team could have chosen to go the easy route and put a wrapper around its existing tools to containerize them and call it a day, but as Seaton noted, Esri used this opportunity to re-architect its tools and break them down into microservices.
“It’s taken us a while because we took three or four big applications that together make up [ArcGIS] Enterprise,” he said. “And we broke those apart into a much larger set of microservices. That allows us to containerize specific services and add a lot of high availability and resilience to the system without adding a lot of complexity for the administrators — in fact, we’re reducing the complexity as we do that and all of that gets installed in one single deployment script.”
While Kubernetes simplifies a lot of the management experience, a lot of companies that use ArcGIS aren’t yet familiar with it. And as Seaton and Heede noted, the company isn’t forcing anyone onto this platform. It will continue to support Windows and Linux just like before. Heede also stressed that it’s still unusual — especially in this industry — to see a complex, fully integrated system like ArcGIS being delivered in the form of microservices and multiple containers that its customers then run on their own infrastructure.
A bit later this month, Esri also plans to launch its new design system to make it easier and faster for developers to create clean and consistent user interfaces. This design system will launch April 22, but the company already provided a bit of a teaser today. As Powell noted, the challenge for Esri is that its design system has to help the company’s partners put their own style and branding on top of the maps and data they get from the ArcGIS ecosystem.
At its Octane21 conference, Okta, the popular authentication and identity platform, today announced a new — and free — developer edition that features fewer limitations and support for significantly more monthly active users than its current free plan.
“Our overall philosophy isn’t, ‘we want to just provide […] a set of authentication and authorization services.’ The way we’re looking at this is, ‘hey, app developer, how do we provide you the foundation you need to get up and running quickly with authorization and authentication as one part of it,’ ” Diya Jolly, Okta’s chief product officer, told me. And she believes that Okta is in a unique position to do so, because it doesn’t only offer tools to manage authorization and access, but also systems for securing microservices and providing applications with access to privileged resources.
It’s also worth noting that, while the deal hasn’t closed yet, Okta’s intent to acquire Auth0 significantly extends its developer strategy, given Auth0’s developer-first approach.
As for the expanded free account, Jolly noted that the company found that developers wanted to be able to access more of the service’s features during their prototyping phases. That means the new free Developer Edition comes with support for multi-factor authentication, machine-to-machine tokens and B2B integrations, for example, in addition to expanded support for integrations into toolchains. As is so often the case with enterprise tools, the free edition doesn’t come with the usual enterprise support options and has lower rate limits than the paid plans.
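For flavor, “machine-to-machine tokens” typically means the OAuth 2.0 client-credentials grant, in which a backend service trades its client ID and secret for an access token. The sketch below only builds such a token request with Python’s standard library — the org URL, client ID, secret and scope are hypothetical placeholders, and nothing is sent over the network:

```python
import base64
import urllib.parse
import urllib.request

def build_token_request(issuer, client_id, client_secret, scope):
    """Build (but don't send) an OAuth 2.0 client-credentials token request."""
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "scope": scope,
    }).encode()
    # Client authenticates with HTTP Basic auth (id:secret, base64-encoded).
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    return urllib.request.Request(
        f"{issuer}/v1/token",
        data=body,
        headers={
            "Authorization": f"Basic {creds}",
            "Content-Type": "application/x-www-form-urlencoded",
        },
        method="POST",
    )

# Hypothetical Okta org and credentials, for illustration only:
req = build_token_request(
    "https://dev-000000.okta.com/oauth2/default",
    "my_client_id", "my_client_secret", "api.read",
)
```

A service would send this request on startup (or when its token expires) and present the returned access token to downstream APIs.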
Still, and Jolly acknowledged this, a small to medium-sized business may be able to build applications and take them into production based on this new free plan.
“15K [monthly active users] is a lot, but if you look at our customer base, it’s about the right amount for the smaller business applications, the real SMBs, and that was the goal. In a developer motion, you want people to try out things and then upgrade. I think that’s the key. No developer is going to come and build with you if you don’t have a free offering that they can tinker around and play with.”
She noted that the company has spent a lot of time thinking about how to support developers through the application development lifecycle overall. That includes better CLI tools for developers who would rather bypass Okta’s web-based console, for example, and additional integrations with tools like Terraform, Kong and Heroku. “Today, [developers] have to stitch together identity and Okta into those experiences — or they use some other identity — we’ve pre-stitched all of this for them,” Jolly said.
The new Okta Starter Developer Edition, as well as the new documentation, sample applications and integrations, are now available at developer.okta.com.
Messaging is the medium these days, and today a startup that has built an API to help others build text and video interactivity into their services is announcing a big round to continue scaling its business. Sendbird, a popular provider of chat, video and other interactive services to the likes of Reddit, Hinge, Paytm, Delivery Hero and hundreds of others by way of a few lines of code, has closed a round of $100 million, money that it plans to use to continue expanding the functionalities of its platform to meet our changing interactive times. Sendbird has confirmed that the funding values the company at $1.05 billion.
Today, customers collectively channel some 150 million users through Sendbird’s APIs to chat with each other and with large groups over text and video — a figure that has grown significantly in the last year, when people were spending so much more time in front of screens as their primary interface for communicating with the world.
Sendbird already provides some services around that core functionality, such as moderation and text search. John Kim, Sendbird’s CEO and founder, said that additions like moderation have seen huge take-up, that services it plans to add include payments and logistics features, and that it is looking at adding group audio conversations so customers can build their own Clubhouse clones.
“We are getting enquiries,” said Kim. “We will be setting it up in a personalized way. Voice chat has certainly picked up due to Clubhouse.”
The funding — oversubscribed, the company says — is being led by Steadfast Financial, with Softbank’s Vision Fund 2 also participating, along with previous backers ICONIQ Capital, Tiger Global Management, and Meritech Capital. It comes about two years after Sendbird closed its Series B at $102 million, and the startup appears to have nearly doubled its valuation since then: PitchBook estimates it was around $550 million in 2019.
That growth, in a sense, is not a surprise, given not just the climate right now for virtual interaction, but the fact that Sendbird itself has tripled the number of customers using its tools since 2019. The company, co-headquartered in Seoul, Korea and San Mateo, has now raised around $221 million.
The market that Sendbird has been pecking away at since being founded in 2013 is a hefty one.
Messaging apps have become a major digital force, with a small handful of companies overtaking (and taking on) the primary features found on the most basic of phones and finding traction with people by making them easier to use and adding more interesting features alongside the basic functionality. That in turn has led a wave of other companies to build in their own communications features, both to provide more community for their users and to keep people on their own platforms in the process.
“It’s an arms race going on between messaging and payment apps,” Sid Suri, Sendbird’s marketing head, said to me in describing the competitive landscape. “There is a high degree of urgency among all businesses to say we don’t have to lose users to any of them. White label services like ours are powering the ability to keep up.”
Sendbird is indeed one of a wave of companies that have identified both that trend and the opportunity of building that functionality out as a commodity of sorts that can be embedded anywhere a developer chooses to place it by way of an API. It’s not the only one: others in the same space include publicly-listed Twilio, the similarly-named competitor MessageBird (which is also highly capitalised and has positioned itself as a consolidator in the space), PubNub, Sinch, Stream, Firebase and many more.
That competition is one reason why Sendbird has raised money. It gives it more capital to bring on more users, and critically to invest in building out more functionality alongside its core features, to address the needs of its existing users, and to discover new opportunities to provide them with features they perhaps didn’t know they needed in their messaging channels to keep users’ attention.
“We are doing a lot around transactions and payments, as well as logistics,” Kim said in an interview. “We are really building out the end to end experience [since that] really ties into engagement. A couple of new features will be heavily around transactions, and others will be around more engagement.”
Karan Mehandru, a partner at Steadfast, is joining the board with this round, and he believes that there remains a huge opportunity especially when you consider the many verticals that have yet to adopt solid and useful communications channels within their services, such as healthcare.
“The channel that Sendbird is leveraging is the next channel we have come to expect from all brands,” he said in an interview. “Sendbird may look the same as others but if you peel the onion, providing a scalable chat experience that is highly customized is a real problem to solve. Large customers think this is critical but not a core competence and then zoom back to Sendbird because they can’t do it. Sendbird is a clear leader. Sendbird is permeating many verticals and types of companies now. This is one of those rare companies that has been at the right place at the right time.”
Meet Soda, a data monitoring platform that helps you discover issues with your data processing setup. This way, you can react as quickly as possible and make sure you maintain a complete picture of your data.
If you’re building a digital-first company, you and your customers are likely generating a ton of data. And you may even be leveraging that data to adjust your product itself — think about hotel pricing, finding the right restaurant on a food delivery website, applying for a loan with a fintech company, etc. Those are data-heavy products.
“Companies build a data platform — as they call it — in one of the big three clouds [Amazon Web Services, Google Cloud, Microsoft Azure]. They land their data in there and they make it available for analytics and more,” Soda co-founder and CEO Maarten Masschelein told me.
You can then tap into those data lakes or data warehouses to display analytics, visualize your data, monitor your services, etc. But what happens if there’s an issue in your data workflows?
It might take you a while to realize that there’s some missing data, or that you’re miscounting some stuff. For instance, Facebook miscalculated average video view times for several years. When you spot that issue, an important part of your business might be affected.
Soda wants to catch data issues as quickly as possible by monitoring your data automatically and at scale. “We sit further upstream, closer to the source of data,” Masschelein said.
When you set up Soda with your data platform, you instantly get some alerts. Soda tells you if there’s something off. For example, if your application generated only 6,000 records today while you usually generate 24,000 records in 24 hours, chances are there’s something wrong. Or if you usually get a new entry every minute and there hasn’t been an entry in 15 minutes, your data might not be fresh.
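Those two alerts — an unusual drop in volume and stale data — can be sketched with nothing but the standard library. This is an illustration of the idea, not Soda’s actual API; the thresholds are the hypothetical numbers from the example above:

```python
from datetime import datetime, timedelta

def check_volume(record_count, expected_daily, tolerance=0.5):
    """Alert when today's record count falls too far below the usual volume."""
    lower = expected_daily * (1 - tolerance)
    return record_count >= lower  # True means "looks healthy"

def check_freshness(last_entry_at, now, max_gap=timedelta(minutes=15)):
    """Alert when no new entry has arrived within the expected window."""
    return (now - last_entry_at) <= max_gap

# 6,000 records against a usual 24,000/day trips the volume alert:
volume_ok = check_volume(6_000, expected_daily=24_000)

# No entry for 20 minutes trips the freshness alert:
now = datetime(2021, 2, 16, 12, 0)
fresh_ok = check_freshness(now - timedelta(minutes=20), now)
```

In a real monitoring setup, checks like these run on a schedule against the warehouse, and any check returning `False` raises an alert.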
“But that only covers a small part of what is considered data issues. There’s more logic that you want to test and validate,” Masschelein said.
Soda lets you create rules to test and validate your data. Basically, think of test suites in software development. When you build a new version of your app, your code needs to pass several tests to make sure that nothing critical is going to break with the new version.
With Soda, you can check data immediately and get the result. If the test doesn’t pass, you can programmatically react — for instance, you can stop a process and quarantine data.
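As a rough analogy — not Soda’s actual rule syntax — a rule set can be modeled as predicates over rows, with rows that fail any rule quarantined instead of flowing downstream:

```python
# Hypothetical rules, for illustration: each is a predicate over a row dict.
rules = {
    "id_present": lambda row: row.get("id") is not None,
    "amount_non_negative": lambda row: row.get("amount", 0) >= 0,
}

def run_rules(rows):
    """Split rows into passing and quarantined, recording which rules failed."""
    passed, quarantined = [], []
    for row in rows:
        failures = [name for name, rule in rules.items() if not rule(row)]
        (quarantined if failures else passed).append((row, failures))
    return passed, quarantined

rows = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},   # fails id_present
    {"id": 3, "amount": -2.0},     # fails amount_non_negative
]
passed, quarantined = run_rules(rows)
```

The quarantined rows (with their recorded failures) can then be inspected and repaired while the passing rows continue through the pipeline.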
Today, the startup is also launching Soda Cloud. It’s a collaboration web application that gives you visibility into your data flows across the organization. This way, non-technical people can easily browse metadata to see whether everything seems to be flowing correctly.
Basically, Soda customers use Soda SQL, a command-line tool that helps someone scan data, along with Soda Cloud, a web application to view Soda SQL results.
Beyond those products, Soda’s vision is that data is becoming an entire category in software products. Development teams now have a ton of dev tools available to automate testing, integration, deployment, versioning, etc. But there’s a lot of potential for tools specifically designed for data teams.
Soda has recently raised a $13.5 million Series A round (€11.5 million) led by Singular, a new Paris-based VC fund that I covered earlier this week. Soda’s seed investors Point Nine Capital, Hummingbird Ventures, DCF and various business angels also participated.
If you develop software for a large enterprise company, chances are you’ve heard of Tricentis. If you don’t develop software for a large enterprise company, chances are you haven’t. The software testing company with a focus on modern cloud and enterprise applications was founded in Austria in 2007 and grew from a small consulting firm to a major player in this field, with customers like Allianz, BMW, Starbucks, Deutsche Bank, Toyota and UBS. In 2017, the company raised a $165 million Series B round led by Insight Venture Partners.
Today, Tricentis announced that it has acquired Neotys, a popular performance testing service with a focus on modern enterprise applications and a tests-as-code philosophy. The two companies did not disclose the price of the acquisition. France-based Neotys launched in 2005 and raised about €3 million before the acquisition. Today, it has about 600 customers for its NeoLoad platform. These include BNP Paribas, Dell, Lufthansa, McKesson and TechCrunch’s own corporate parent, Verizon.
As Tricentis CEO Sandeep Johri noted, testing tools were traditionally script-based, which also meant they were very fragile whenever an application changed. Early on, Tricentis introduced a low-code tool that made the automation process both easier and resilient. Now, as even traditional enterprises move to DevOps and release code at a faster speed than ever before, testing is becoming both more important and harder for these companies to implement.
“You have to have automation and you cannot have it be fragile, where it breaks, because then you spend as much time fixing the automation as you do testing the software,” Johri said. “Our core differentiator was the fact that we were a low-code, model-based automation engine. That’s what allowed us to go from $6 million in recurring revenue eight years ago to $200 million this year.”
Tricentis, he added, wants to be the testing platform of choice for large enterprises. “We want to make sure we do everything that a customer would need, from a testing perspective, end to end. Automation, test management, test data, test case design,” he said.
The acquisition of Neotys allows the company to expand this portfolio by adding load and performance testing as well. It’s one thing to do the standard kind of functional testing that Tricentis already did before launching an update, but once an application goes into production, load and performance testing becomes critical as well.
“Before you put it into production — or before you deploy it — you need to make sure that your application not only works as you expect it, you need to make sure that it can handle the workload and that it has acceptable performance,” Johri noted. “That’s where load and performance testing comes in and that’s why we acquired Neotys. We have some capability there, but that was primarily focused on the developers. But we needed something that would allow us to do end-to-end performance testing and load testing.”
The two companies already had an existing partnership and had integrated their tools before the acquisition — and many of its customers were already using both tools, too.
“We are looking forward to joining Tricentis, the industry leader in continuous testing,” said Thibaud Bussière, president and co-founder at Neotys. “Today’s Agile and DevOps teams are looking for ways to be more strategic and eliminate manual tasks and implement automated solutions to work more efficiently and effectively. As part of Tricentis, we’ll be able to eliminate laborious testing tasks to allow teams to focus on high-value analysis and performance engineering.”
NeoLoad will continue to exist as a stand-alone product, but users will likely see deeper integrations with Tricentis’ existing tools over time, including Tricentis Analytics, for example.
Johri tells me that he considers Tricentis one of the “best kept secrets in Silicon Valley” because the company not only started out in Europe (even though its headquarters is now in Silicon Valley) but also because it hasn’t raised a lot of venture rounds over the years. But that’s very much in line with Johri’s philosophy of building a company.
“A lot of Silicon Valley tends to pay attention only when you raise money,” he told me. “I actually think every time you raise money, you’re diluting yourself and everybody else. So if you can succeed without raising too much money, that’s the best thing. We feel pretty good that we have been very capital efficient and now we’re recognized as a leader in the category — which is a huge category with $30 billion spend in the category. So we’re feeling pretty good about it.”
While quantum computing may still be in its infancy, most pundits in the industry will tell you that now is the time to learn the basic concepts. And while there is little that’s immediately intuitive on the hardware side of quantum computing, the actual software tools that most players in the industry are developing today should feel somewhat familiar to virtually any developer.
Unsurprisingly, the “IBM Quantum Developer Certification,” as it’s officially called, focuses on IBM’s own software tools and especially Qiskit, its SDK for working with quantum computers. Qiskit has already proven quite popular, with more than 600,000 installs, and when IBM Quantum and the Qiskit team hosted a quantum summer school last year, almost 5,000 developers participated.
But on top of knowing their way around the basics of Qiskit (think defining and executing quantum circuits) developers also need to learn some of the basics of quantum computing itself. Once you know your way around Bloch spheres, Pauli matrices and Bell states, you’ll probably be in good shape for taking the certification exam, which will be administered on the Pearson VUE platform.
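As a taste of what’s behind those concepts, the circuit that produces a Bell state (a Hadamard gate followed by a CNOT) can be simulated in a few lines of plain Python. This is a from-scratch sketch for intuition, not Qiskit code:

```python
import math

def apply(gate, state):
    """Multiply a 4x4 gate matrix by a 4-amplitude two-qubit statevector."""
    return [sum(gate[i][j] * state[j] for j in range(4)) for i in range(4)]

h = 1 / math.sqrt(2)
# Hadamard on qubit 0 (little-endian: qubit 0 is the low bit), as a 4x4 matrix:
H0 = [[h, h, 0, 0],
      [h, -h, 0, 0],
      [0, 0, h, h],
      [0, 0, h, -h]]
# CNOT with control = qubit 0, target = qubit 1: swaps |01> and |11>.
CNOT = [[1, 0, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0],
        [0, 1, 0, 0]]

state = [1, 0, 0, 0]                     # start in |00>
bell = apply(CNOT, apply(H0, state))     # (|00> + |11>) / sqrt(2)
```

Measuring this state yields 00 or 11 with equal probability and never 01 or 10 — the correlated outcomes that make Bell states the canonical first circuit.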
Abe Asfaw, the global lead for Quantum Education and Open Science at IBM, told me that this is just the first of a series of planned quantum certifications.
“What we’ve built is a multi-tiered developer certification,” he told me. “The first tier is what we’re releasing in this announcement and that tier gets developers introduced to how to work with quantum circuits. How do you use Qiskit […] to build out a quantum circuit and how do you run it on a quantum computer? And once you run it on a quantum computer, how do you look at the results and how do you interpret the results? This sets the stage for the next series of certifications that we’re developing, which are then going to be attached to use cases that are being explored in optimization, chemistry and finance. All of these can now be sort of integrated into the developer workflow once we have enabled someone to show that they can work with quantum circuits.”
Asfaw stressed that IBM has focused on educating developers about quantum computing for quite a while now, in part because it takes some time to develop the skills and intuition to build quantum circuits. He also noted that the open-source Qiskit SDK has integrated a lot of the tools that developers need to work at both the circuit level — which is a bit closer to writing in C or maybe even assembly in the classical computing world — and at the application level, where a lot of that is abstracted away.
“The idea is to make it easy for someone who is currently developing, whether it’s in the cloud, whether it’s using Python, to be able to run these tools and integrate quantum computing into their workflow,” Asfaw said. “I think the hardest part, to be very honest, is just giving someone the comfort to know that quantum computing is real today and that you can work with quantum computers. It’s as easy as opening up a Jupyter notebook and writing some code in Python.”
Digital House, a Buenos Aires-based edtech focused on developing tech talent through immersive remote courses, announced today it has raised more than $50 million in new funding.
Notably, two of the main investors are not venture capital firms but two large tech companies: Latin American e-commerce giant Mercado Libre and San Francisco-based software developer Globant. Riverwood Capital, a Menlo Park-based private equity firm, and existing backer Kaszek, an early-stage Argentina-based venture firm, also participated in the financing.
The raise brings Digital House’s total funding raised to more than $80 million since its 2016 inception. The Rise Fund led a $20 million Series B for Digital House in December 2017, marking the San Francisco-based firm’s investment in Latin America.
Nelson Duboscq, CEO and co-founder of Digital House, said that accelerating demand for tech talent in Latin America has fueled demand for the startup’s online courses. Since it first launched its classes in March 2016, the company has seen a 118% CAGR in revenues and a 145% CAGR in students. The 350-person company expects “and is on track” to be profitable this year, according to Duboscq.
Digital House CEO and co-founder Nelson Duboscq. Image Credits: Digital House
In 2020, 28,000 students across Latin America used its platform. The company projects that more than 43,000 will take courses via its platform in 2021. Fifty percent of its business comes out of Brazil, 30% from Argentina and the remaining 20% from the rest of Latin America.
Specifically, Digital House offers courses aimed at teaching “the most in-demand digital skills” to people who either want to work in the digital industry or for companies that need to train their employees on digital skills. Emphasizing practice, Digital House offers courses — that range from six months to two years — teaching skills such as web and mobile development, data analytics, user experience design, digital marketing and product development.
The courses are fully accessible online and combine live online classes led by in-house professors, with content delivered through Digital House’s platform via videos, quizzes and exercises “that can be consumed at any time.”
Digital House also links its graduates to company jobs, claiming an employability rate of over 95%.
Looking ahead, Digital House says it will use its new capital toward continuing to evolve its digital training platforms, as well as launching a two-year tech training program — dubbed the “Certified Tech Developer” initiative — jointly designed with Mercado Libre and Globant. The program aims to train thousands of students through full-time two-year courses and connect them with tech companies globally.
Specifically, the company says it will also continue to expand its portfolio of careers beyond software development and include specialization in e-commerce, digital marketing, data science and cybersecurity. Digital House also plans to expand its partnerships with technology employers and companies in Brazil and the rest of Latin America. It also is planning some “strategic M&A,” according to Duboscq.
Francisco Alvarez-Demalde, co-founder & co-managing partner of Riverwood Capital, noted that his firm has observed an accelerating digitization of the economy across all sectors in Latin America, which naturally creates demand for tech-savvy talent. (Riverwood has an office in São Paulo).
For example, in addition to web developers, there’s been increased demand for data scientists, digital marketing and cybersecurity specialists.
“In Brazil alone, over 70,000 new IT professionals are needed each year and only about 45,000 are trained annually,” Alvarez-Demalde said. “As a result of such a talent crunch, salaries for IT professionals in the region increased 20% to 30% last year. In this context, Digital House has a large opportunity ahead of them and is positioned strategically as the gatekeeper of new digital talent in Latin America, preparing workers for the jobs of the future.”
André Chaves, senior VP of Strategy at Mercado Libre, said the company saw in Digital House a track record of “understanding closely” what Mercado Libre and other tech companies need.
“They move as fast as we do and adapt quickly to what the job market needs,” he said. “A very important asset for us is their presence and understanding of Latin America, its risks and entrepreneurial environment. Global players have succeeded for many years in our region. But things are shifting gradually, and local knowledge of risks and opportunities can make a great difference.”
Clubhouse, the social audio app that first took Silicon Valley by storm and is now gaining much wider appeal, is an interesting user experience case study.
Hockey-stick growth — 8 million global downloads as of last month, despite still being in a prelaunch, invite-only mode, according to App Annie — is something most startups would kill for. However, it also means that UX problems can only be addressed while in “full flight” — and that changes to the user experience will be felt at scale rather than under the cover of a small, loyal and (usually) forgiving user base.
In our latest UX teardown, Built for Mars founder and UX expert Peter Ramsey and TechCrunch reporter Steve O’Hear discuss some of Clubhouse’s UX challenges as it continues to onboard new users at pace while striving to create enough stickiness to keep them active.
Peter Ramsey: Content feeds are notoriously difficult to get right. Which posts should you see? How should you order them? How do you filter out the noise?
On Clubhouse, once you’ve scrolled past all the available rooms in your feed, you’re prompted to follow more people to see more rooms. In other words, Clubhouse is inadvertently describing how it decides what content you see, i.e., your homepage is a curated list of rooms based on people you follow.
Except there’s a problem: I don’t follow half the people who already appear in my feed.
Image Credits: Clubhouse
Steve O’Hear: I get it. This could be confusing, but why does it actually matter? Won’t people just continue to use the homepage regardless?
Peter: In the short term, yes. People will use the homepage in the same way they’d use Instagram’s search page (which is to just browse occasionally). But in the long term, this content needs to be consistently relevant or people will lose interest.
Steve: But Twitter has a search page that shows random content that I don’t control…
Peter: Yeah, but they also have a home feed that you do control. It’s fine to also have the more random “slot machine style” content feed — but you need the base layer.
Peter: In the early days of Twitter, the team noticed something in their data: When people follow at least 30 others, they’re far more likely to stick around. This is often described as an “aha moment” — the moment that the utility of a product really clicks for the user.
This story has become startup folklore, and I’ve worked with many companies that take this message too literally, forgetting the nuance of what Twitter really found: It’s not enough to just follow 30 random people — you need to follow 30 people who you genuinely care about.
Clubhouse has clearly adopted a similar methodology, by pre-selecting 50 people for you to follow while signing up.
Have you noticed that some people have accumulated millions of followers really quickly? It’s because the same people are almost always recommended — I tried creating accounts with polar opposite interests, and the same people were pre-selected almost every time.
And at no point does it explain that following those 50 people will directly impact the content that is available to you, or that if your homepage gets uninteresting, you’ll need to unfollow these people individually.
But they should, and it could look more like this:
Steve: Why do you think Clubhouse does this? Laziness?
Peter: I think in the early days of Clubhouse they just wanted to maximize connections, and by always recommending the same people (Clubhouse’s founders and investors), they could somewhat control the content that is shown to new users.
Twitter today is announcing what the company calls a “strategic acquihire” of the API integration platform Reshuffle. The startup’s commercial technology, which allows developers to build workflows and connect systems using any API, will be wound down as a result of the deal. However, Reshuffle’s entire team of seven, including co-founders Amir Shevat and Avner Braverman, will join Twitter, where they’ll help accelerate the effort to modernize Twitter’s new, unified API.
The new Twitter API 2.0 was first introduced last year, having been rebuilt from the ground up for the first time since 2012. It now includes features missing from the older version, like conversation threading, poll results, pinned tweets, spam filtering and more powerful stream filtering and search query language. It’s also been designed in a way that will allow Twitter to release new functionality faster as the company itself rolls out more features. For example, the API has added support for newer features like “hide replies” and tweet annotations. And as Twitter’s pace of development has recently picked up, the company now has even more significant product launches on the horizon, including the public release of Twitter Spaces (audio rooms) and soon, Super Follow (a subscription service for creators and their fans).
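For a sense of what the rebuilt API looks like in practice, here is a minimal sketch of how a developer might use the v2 recent-search endpoint with its conversation_id operator, the mechanism behind conversation threading. The conversation ID and bearer token below are placeholders, and the request is only constructed, not sent:

```python
# Minimal sketch: building a Twitter API v2 request that fetches the
# recent tweets belonging to a single conversation thread.
from urllib.parse import urlencode
from urllib.request import Request

API_BASE = "https://api.twitter.com/2"

def thread_request(conversation_id: str, bearer_token: str) -> Request:
    """Build (but do not send) a recent-search request for one thread."""
    params = urlencode({
        # conversation_id: is the v2 search operator behind threading
        "query": f"conversation_id:{conversation_id}",
        "tweet.fields": "author_id,created_at,in_reply_to_user_id",
    })
    return Request(
        f"{API_BASE}/tweets/search/recent?{params}",
        headers={"Authorization": f"Bearer {bearer_token}"},
    )

# urllib.request.urlopen(thread_request(...)) would perform the call.
req = thread_request("1380000000000000000", "YOUR_BEARER_TOKEN")
```

Under the old v1.1 API, reconstructing a thread meant crawling reply fields tweet by tweet; the conversation_id operator collapses that into a single query.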
In addition, Twitter’s API team is working to develop products that meet the various needs of different types of developers, including consumer-facing app developers, business and enterprise developers and academic researchers. In January, Twitter’s API opened up to researchers, and Twitter promised more functionality would soon be on the way.
Reshuffle’s team will be immediately tasked with helping Twitter accelerate its API efforts and building tools for developers.
Reshuffle’s CEO Avner Braverman, who has nearly two decades of experience in both engineering and technical consumer-facing roles across startups and larger companies like IBM, will join Twitter’s Developer Platform team. Meanwhile, Reshuffle’s CPO Amir Shevat, whose previous roles included VP of Platform for Twitch, Head of Developer Relations at Slack, and Senior Developer Relations Manager at Google, will join Twitter as a senior member of the Developer Platform team.
“We’re doubling down on our investment and ambitions by bringing the Reshuffle team on board,” noted a Twitter blog post announcing the deal, co-authored by Twitter Revenue Product Lead Bruce Falck and Twitter Developer Platform Lead Sonya Penn. “Their experience building developer platforms will accelerate and enhance our work by building the tools that will make it easier and quicker for developers to find value on our platform,” it read.
Reshuffle’s existing product will wind down operations over the coming weeks, following the acqui-hire. However, the team will continue to maintain its open-source project for the developer community, Twitter notes.
Image Credits: A photo of Reshuffle’s product
Twitter has been on an acquisition and acqui-hire spree in recent months, having bought newsletter platform Revue, which is already integrated into Twitter’s website; as well as teams from social podcasting app Breaker, screen sharing social app Squad and creative design agency Ueno; and last year, stories template maker Chroma Labs.
Twitter says it will continue to look for more acqui-hire opportunities in the future as a means of scaling its own teams and accelerating their work.
The company declined to share deal terms for the Reshuffle acqui-hire.
According to PitchBook, Reshuffle was backed by $6.35 million in funding from investors including Cardumen Capital, Cerca Partners, Maverick Ventures, Meron Capital, Dell Technologies Capital, Engineering Capital and Lightspeed Venture Partners. PitchBook says the business was valued at $11.85 million.
One clear sign of a maturing platform is when a company exposes the services it uses for its own tools to other developers. Zoom has been doing that for some time, introducing Zoom Apps last year and the Marketplace to distribute and sell these apps. Today, the company introduced a new SDK (software development kit) to help developers embed Zoom video services inside another application.
“Our Video SDK enables developers to leverage Zoom’s industry-leading HD video, audio, and interactive features to build video-based applications and desktop experiences with native user interfaces,” Zoom’s Natalie Mullin wrote in a blog post announcing the new SDK.
If you want to include video in your app, you could try to code it yourself, or you could simply take advantage of Zoom’s expertise in this area and use the SDK to add video to your application, saving a lot of time and effort.
The company envisions developers embedding video in social, gaming or retail applications where it could enhance the user experience. For example, a shop owner could show different outfits to an online shopper over a live video feed and discuss their tastes in real time.
Zoom CTO Brendan Ittelson said the SDK is actually part of a broader set of services designed to help developers take advantage of all the developer tooling that the company has been developing in recent years. As part of that push, the company is also announcing a central developer portal.
“We want to be able to have a single point where developers can go to learn about all of the tools and resources that are available for them in the Zoom platform for their work in development, so we’re launching developer.zoom.us as that central hub for all developer resources,” Ittelson told me.
In addition, the company said that it wanted to give developers more data about how people are using the Zoom features in their applications, so they will be providing a new analytics dashboard with usage statistics.
“We are adding additional tools and actually providing developers with analytic dashboards. So folks that have developed apps for the Zoom ecosystem are able to see information about the usage of those apps across the platform,” Ittelson said.
He believes these tools combined with the new video SDK and existing set of tools will provide developers with a variety of options for building Zoom functionality into their applications, or embedding their application into Zoom as they see fit.
Most American retail banks are designed the same way: Customers must pass several desks set aside for loan and mortgage officers before they can talk to a customer representative.
I only step inside a bank a few times each year, but I can’t remember the last time, even pre-pandemic, that I saw someone sitting at one of those desks. Everyone I know who’s obtained a home or business loan in the recent past started with an online application process.
For this morning’s column, Alex Wilhelm interviewed Dave Girouard, CEO of Upstart, an AI-powered fintech lender that expects to see growth increase 114% this year.
A forecast like that suggests that retail banks have gotten comfortable with using automated tools to calculate risk, which may help explain all the empty desks at my local branch.
“If Upstart hits its 2021 numbers, we will be able to read into them broader adoption of AI among old-guard firms,” says Alex.
According to PitchBook, investors are also more bullish on AI: Q4 2020 saw record funding for AI and ML startups, and exit totals are increasing as well.
I wouldn’t mind adding a gently used desk to my home office; perhaps I should call my bank and see if they have one to spare.
Thanks very much for reading Extra Crunch. Have a great weekend!
Senior Editor, TechCrunch
Full Extra Crunch articles are only available to members.
Use discount code ECFriday to save 20% off a one- or two-year subscription.
Image Credits: dowell / Getty Images
Data is a gold mine for a company. If managed well, it provides the clarity and insights that lead to better decision-making at scale, in addition to an important tool to hold everyone accountable.
However, most companies are stuck in Data 1.0.
Image Credits: Bryce Durbin/TechCrunch
A friend and I founded a tech startup last year. Like a lot of other startups, we’re looking for funding.
Should we come to Silicon Valley to meet with venture capitalists?
How should we begin that process? What type of visa should we get and how easy is it to get?
—Logical in Lagos
Image Credits: Yuichiro Chino / Getty Images
Why are developers still solving everyday pain points with manual, archaic processes, as opposed to employing “Little AI”?
There are millions of everyday use cases for AI, where technology is empowered to learn and decide on a course of action that offers the best outcome for consumers and companies alike.
Image Credits: Thomas Barwick / Getty Images
The increasing demand for AI and data science experts, driven in part by the pandemic’s economic impact, is showing no sign of abating.
Many employers are failing to identify viable job candidates, much less interviewing or hiring them. What’s holding them back?
Often, it’s a poorly drafted job posting.
Image Credits: Korrawin / Getty Images
No-code is changing how organizations build and maintain applications.
It democratizes application development by creating “citizen developers” who can quickly build apps that meet their business needs in real time, bringing IT and business objectives closer together.
How can your company get ahead of the trend?
Image Credits: jokerpro / Getty Images
The idiosyncrasies of sales taxes are a burden on small- and medium-sized businesses, but a new legion of startups is emerging to help companies manage the intricacies of cross-jurisdictional taxes.
Image Credits: VectorInspiration / Getty Images
Some founders and investors argue that these preferred shares protect them from the whims of the market, but the perspective isn’t universally accepted.
Dual-class shares are a controversial governance structure, and some wonder if they are setting up an unfair playing field by allowing a cabal to wield outsized power.
So why would Snowflake give up such a powerful tool?
Image Credits: Bryce Durbin
As transit agencies seek to win back riders, a flurry of platforms — some backed by giants like Uber, Intel and BMW — are offering new technology partnerships.
Whether it’s bundling bookings, payments or just trip planning, startups are selling these mobility-as-a-service (MaaS) offerings as a lifeline to make transit agencies the backbone of urban mobility.
Israeli consumer stock-trading service eToro is going public in the United States via a SPAC. One thing that points to?
Trading platforms are being valued like high-margin video games.
I knew African founders lacked the same access to capital as entrepreneurs based in Europe or the United States, but the numbers are far less favorable than I thought.
According to Dauda Barry, CEO of Adaplay Esports, African startups have raised $500 million so far in 2021. If that trend continues, he estimates that the region’s tech companies will exceed the $1.4 billion they raised in 2020.
For perspective: “Stripe raised more yesterday than Barry had reported for the entire African continent this year,” Alex Wilhelm noted in today’s column.
Digging deeper, he pulled numbers from Crunchbase and PitchBook to track VC activity in Africa over the last three months. Once he filtered private equity funding from nonequity investments, the numbers were “staggering.”
“I am surprised that more VCs aren’t investing in Africa,” says Alex. “It smells like investing arbitrage.”
Image Credits: Pgiam / Getty Images
Companies that help farmers raise money for agricultural development projects are revolutionizing the way farm and forestland are acquired, developed and commercialized across the United States.
While private equity has gotten a lot of press for expanding its farmland investments, those investments are still dwarfed by the potential size of the U.S. farm industry, meaning there’s still plenty of opportunity for investors to provide additional capital.
The crypto art craze might seem silly and expensive, but it could empower artists from emerging economies and underrepresented groups to access the global art market in ways that they couldn’t before.
Can it outlive the hype?
Image Credits: jayk7 / Getty Images
That Olo raised its IPO price is not a huge surprise, given the software company’s rapid growth and profits. In the case of DigitalOcean, we have more work to do as its approach to growth is a bit different.
Stripe’s $600 million round values the payments and banking software company at $95 billion, near the top end of the valuation range at which the company was said to be raising funds back in November 2020.
Sadly, Stripe is still being coy with growth metrics. The Exchange digs in, no matter how vague.
Julia Collins, the first Black woman to co-found a venture-backed unicorn, and investor Sarah Kunst offer fundraising pointers on Extra Crunch Live.
Kunst says good design is critical, but:
If you’re not a graphic designer, then any incremental minute that you’re spending on trying to make your deck pretty is a waste of time. You need to be focusing on content. Hire somebody, pay them a tiny bit of money to be able to do a nice graphics pass on your deck, and it’s going to make it a lot easier for people to get the information that you need them to know.
Image Credits: Getty Images
Startup hiring processes can be opaque, and breaking into the deep tech world as a nontechnical person seems daunting. This column offers tactical advice for finding, reaching out to, cultivating relationships with and working at deep tech companies as a nontechnical candidate.