
‘No code’ will define the next generation of software

By Walter Thompson
Alex Nichols Contributor
Alex Nichols is a vice president at CapitalG, Alphabet's independent growth fund, where he focuses on growth stage investments in software.
Jesse Wedler Contributor
Jesse Wedler is a partner at CapitalG, Alphabet's independent growth fund, where he focuses on growth stage investments in software.

It seems like every software funding and product announcement these days includes some sort of reference to “no code” platforms or functionality. The frequent callbacks to this buzzy term reflect a realization that we’re entering a new software era.

Similar to cloud, no code is not a category itself, but rather a shift in how users interface with software tools. In the same way that PCs democratized software usage, APIs democratized software connectivity and the cloud democratized the purchase and deployment of software, no code will usher in the next wave of enterprise innovation by democratizing technical skill sets. No code is empowering business users to take over functionality previously owned by technical users by abstracting complexity and centering around a visual workflow. This profound generational shift has the power to touch every software market and every user across the enterprise.

The average enterprise tech stack has never been more complex

In a perfect world, all enterprise applications would be properly integrated, every front end would be shiny and polished, and internal processes would be efficient and automated. Alas, in the real world, engineering and IT teams spend a disproportionate share of their time fighting fires in security, fixing internal product bugs and running vendor audits. These teams are bursting at the seams, spending an estimated 30% of their resources building and maintaining internal tools, torpedoing productivity and compounding technical debt.

Seventy-two percent of IT leaders now say project backlogs prevent them from working on strategic projects. Hiring alone can’t solve the problem. The demand for technical talent far outpaces supply, as demonstrated by the fact that six out of 10 CIOs expect skills shortages to prevent their organizations from keeping up with the pace of change.

At the same time that IT and engineering teams are struggling to maintain internal applications, business teams keep adding fragmented third-party tools to increase their own agility. In fact, the average enterprise is supporting 1,200 cloud-based applications at any given time. Lacking internal support, business users bring in external IT consultants. Cloud promised easy as-needed software adoption with seamless integration, but the realities of quickly changing business needs have led to a roaring comeback of expensive custom software.

French court slaps down Google’s appeal against $57M GDPR fine

By Natasha Lomas

France’s top court for administrative law has dismissed Google’s appeal against a $57M fine issued by the data watchdog last year for not making it clear enough to Android users how it processes their personal information.

The State Council issued the decision today, affirming the data watchdog CNIL’s earlier finding that Google did not provide “sufficiently clear” information to Android users — which in turn meant it had not legally obtained their consent to use their data for targeted ads.

“Google’s request has been rejected,” a spokesperson for the Conseil d’État confirmed to TechCrunch via email.

“The Council of State confirms the CNIL’s assessment that information relating to targeting advertising is not presented in a sufficiently clear and distinct manner for the consent of the user to be validly collected,” the court also writes in a press release [translated with Google Translate] on its website.

It found the size of the fine to be proportionate — given the severity and ongoing nature of the violations.

Importantly, the court also affirmed the jurisdiction of France’s national watchdog to regulate Google — at least on the date when this penalty was issued (January 2019).

The CNIL’s multimillion dollar fine against Google remains the largest to date against a tech giant under Europe’s flagship General Data Protection Regulation (GDPR) — lending the case a certain symbolic value for those concerned with whether the regulation is functioning as intended in the face of platform power.

While the size of the fine is still peanuts relative to the global revenue of Google’s parent entity, Alphabet, any changes the tech giant may have to make to how it harvests user data could be far more impactful to its ad-targeting bottom line.

Under European law, for consent to be a valid legal basis for processing personal data it must be informed, specific and freely given. Or, to put it another way, consent cannot be strained.

In this case French judges concluded Google had not provided clear enough information for consent to be lawfully obtained.

It also objected to a pre-ticked checkbox — which it said does not meet the requirements of the GDPR.

So, tl;dr, the CNIL’s decision has been entirely vindicated.

Reached for comment on the court’s dismissal of its appeal, a Google spokeswoman sent us this statement:

People expect to understand and control how their data is used, and we’ve invested in industry-leading tools that help them do both. This case was not about whether consent is needed for personalised advertising, but about how exactly it should be obtained. In light of this decision, we will now review what changes we need to make.

GDPR came into force in 2018, updating long-standing European data protection rules and opening up the possibility of supersized fines of up to 4% of global annual turnover.

However, actions against big tech have largely stalled, with scores of complaints funnelled through Ireland’s Data Protection Commission — on account of a one-stop-shop mechanism in the regulation — causing a major backlog of cases. The Irish DPC has yet to issue a decision on any cross-border complaint, though it has said its first ones are imminent — on complaints involving Twitter and Facebook.

Ireland’s data watchdog is also continuing to investigate a number of complaints against Google, following a change Google announced to the legal jurisdiction of where it processes European users’ data — moving them to Google Ireland Limited, based in Dublin, which it said applied from January 22, 2019. Ongoing Irish investigations include a long-running complaint related to how Google handles location data and another major probe of its adtech, to name two.

On the GDPR one-stop-shop mechanism — and, indirectly, the wider problematic issue of ‘forum shopping’ and European data protection regulation — the French State Council writes: “Google believed that the Irish data protection authority was solely competent to control its activities in the European Union, the control of data processing being the responsibility of the authority of the country where the main establishment of the data controller is located, according to a ‘one-stop-shop’ principle instituted by the GDPR. The Council of State notes however that at the date of the sanction, the Irish subsidiary of Google had no power of control over the other European subsidiaries nor any decision-making power over the data processing, with the company Google LLC, located in the United States, alone holding this power.”

In its own statement responding to the court’s decision, the CNIL also notes its view that GDPR’s one-stop-shop mechanism was not applicable in this case — writing that: “It did so by applying the new European framework as interpreted by all the European authorities in the guidelines of the European Data Protection Committee.”

Privacy NGO noyb — one of the privacy campaign groups which lodged the original ‘forced consent’ complaint against Google, all the way back in May 2018 — welcomed the court’s decision on all fronts, including the jurisdiction point.

Commenting in a statement, noyb’s honorary chairman, Max Schrems, said: “It is very important that companies like Google cannot simply declare themselves to be ‘Irish’ to escape the oversight by the privacy regulators.”

A key question is whether CNIL — or another (non-Irish) EU DPA — will be found to be competent to sanction Google in future, following its shift to naming Google Ireland as the data processor.

On the wider ruling, Schrems added: “This decision requires substantial improvements by Google. Their privacy policy now really needs to make it crystal clear what they do with users’ data. Users must also get an option to agree to only some parts of what Google does with their data and refuse other things.”

French digital rights group, La Quadrature du Net — which had filed a related complaint against Google, feeding the CNIL’s investigation — also declared victory today, noting it’s the first sanction in a number of GDPR complaints it has lodged against tech giants on behalf of 12,000 citizens.

Another victory!

The @Conseil_Etat fully upholds, adopting its reasoning as its own, the €50 million sanction against Google issued in January 2019 by the CNIL. https://t.co/6gJRL5ZM3r

— La Quadrature du Net (@laquadrature) June 19, 2020

“The rest of the complaints against Google, Facebook, Apple and Microsoft are still under investigation in Ireland. In any case, this is what that authority promises us,” it adds in another tweet.

Implement DevSecOps to transform your business to IT-as-code

By Walter Thompson
Michael Fraser Contributor
Michael Fraser is an Air Force Veteran and co-founder of Refactr, a DevSecOps automation platform that helps tech teams modernize towards IT-as-code.

Conduct an online search and you’ll find close to one million websites offering their own definition of DevSecOps.

Why is it that domain experts and practitioners alike continue to iterate on analogous definitions? Likely, it’s because they’re all correct. DevSecOps is a union between culture, practice and tools providing continuous delivery to the end user. It’s an attitude; a commitment to baking security into the engineering process. It’s a practice; one that prioritizes processes that deliver functionality and speed without sacrificing security or test rigor. Finally, it’s a combination of automation tools; correctly pieced together, they increase business agility.

The goal of DevSecOps is to reach a future state where software defines everything. To get to this state, businesses must realize the DevSecOps mindset across every tech team, implement work processes that encourage cross-organizational collaboration, and leverage automation tools, such as for infrastructure, configuration management and security. To make the process repeatable and scalable, businesses must plug their solution into CI/CD pipelines, which remove manual errors, standardize deployments and accelerate product iterations. Completing this process, everything becomes code. I refer to this destination as “IT-as-code.”
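To make that destination concrete, here is a minimal sketch, in Python, of the kind of automated gate a CI/CD pipeline might run on every commit, so that security and test rigor become repeatable code rather than manual reviews. The stage list and the scanners it invokes (pytest, bandit, pip-audit) are illustrative assumptions for a Python codebase, not tools prescribed here:

```python
# ci_security_gate.py: illustrative CI/CD gate for a Python repo.
# Assumes pytest, bandit and pip-audit are installed in the build image.
import subprocess
import sys

# Each stage is a command the pipeline runs; any non-zero exit code
# fails the build, so insecure or broken code never reaches deployment.
STAGES = [
    ("unit tests", ["pytest", "-q"]),
    ("static security scan", ["bandit", "-r", "src/", "-ll"]),
    ("dependency vulnerability audit", ["pip-audit"]),
]

def main() -> int:
    for name, cmd in STAGES:
        print(f"--- running {name}: {' '.join(cmd)}")
        result = subprocess.run(cmd)
        if result.returncode != 0:
            print(f"FAILED: {name}; blocking the pipeline")
            return result.returncode
    print("all gates passed; build is safe to promote")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```

Wired into a pipeline trigger, a gate like this is what turns “secure by design” from a slogan into an enforced, repeatable step.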

Why is DevSecOps important?

Whichever way you cut it, DevSecOps, as a culture, practice or combination of tools, is of increasing importance. Particularly these days, with more consumers and businesses leaning on digital, enterprises find themselves in the irrefutable position of delivering with speed and scale. Digital transformation that would’ve taken years, or at the very least would’ve undergone a period of premeditation, is now urgent and compressed into a matter of months.

The keys to a successful DevSecOps program

Security and operations are a part of this new shift to IT, not just software delivery: A DevSecOps program succeeds when everyone, from security to operations to development, is not only part of the technical team but also able to share information for repeatable use. Security, often seen as a blocker, will uphold the “secure by design” principle by automating security code testing and reviews, and by educating engineers on secure design best practices. Operations, typically reactive to development, can proactively troubleshoot incongruent merges between engineering and production. Currently, however, businesses are only familiar with utilizing automation for software delivery; they don’t know what automation means for security or operations. Figuring out how to apply the same methodology throughout the whole program, and therefore the whole business, is critical for success.

Why you should worry about data transparency

By Walter Thompson
Kevin Walkup Contributor
Kevin Walkup is President and COO of Harmonate, a data-services firm serving private funds.

In the world of business and finance, the question on everyone’s mind is whether COVID-19 is going to lead to permanent changes in the economy. Will lockdowns become part of everyday life? Will new disruptive consumer behaviors emerge? How will investors react?

First, we’re headed for a period of radical required transparency. For me, this started hitting home two years ago. George Walker, CEO of massive fund of funds Neuberger Berman, told Bloomberg that his pricing power over clients was “zero” because customers had access to competing asset managers, as well as data, research and analysis and other investment options online.

“Clients are tough. Consumers are tough now. Technology is changing,” he said. “The pricing challenges are significant and real.”

One would think Walker would be averse to greater transparency. His self-interest theoretically might dictate that he behave like a wizard behind a curtain. But Walker embraced transparency, saying his clients should use the leverage they gain from shopping around to save money when hiring asset managers like him. Teachers’ unions and others need the cash, he said. He’s too smart to fight the tide.

Transparency, secondly, is going to put a lot of pressure on companies. Walker knows that transparency might put downward pressure on prices but, if deployed properly, it can also cultivate loyalty. It can also attract competitors’ clients who are navigating the same comparison-shopping environment. The bottom line is that companies will need to work harder as their customers become savvier.

What’s more, that new environment is coming as the tailwinds that fueled the growth of tech companies in the last 15 years are losing steam as the internet, social media and smart devices transition into mature technologies. Technology is still going to let companies scale up in extraordinary and cost-effective ways, but innovation will face a higher bar to wow clients.

This dilemma brings us to the third and most important point to consider moving forward as companies struggle to survive in the post-coronavirus economy. Leaders of firms that thrive in the coming years must be prepared for success in fits and starts, sometimes after failures that in the short term seem like unforced errors but in hindsight might become experiences crucial to progress on the crooked path to success.

Consider Intel’s Operation Crush in the 1970s. As business guru Hamilton Helmer wrote in his must-read “7 Powers: The Foundations of Business Strategy,” microprocessors at the time were not a big portion of Intel’s business. Many in the company didn’t think chips were worth much effort, in part because their competitors were producing perfectly good microprocessors, too. The fact that microprocessors were constituent parts of other companies’ machines and not standalone products bolstered skeptical views among Intel’s sales force.

Intel honchos nonetheless wanted to secure 2,000 contracts for their microprocessors under Operation Crush in part because they saw how Japanese competitors were beating them in the semiconductor space. They needed to pivot. One contract happened to be with IBM, which around that time had a market cap of almost $40 billion compared to Intel’s cap of $1.7 billion. Nobody thought the deal would yield a sea change for either company, but Intel and IBM’s collaboration resulted in a personal computer revolution that made Intel a success.

The 1970s were a heady time that, it turns out, was the transition decade to an entirely new phase of the economy. The Operation Crush lesson is that the avenues to success after the coronavirus subsides are not going to be straightforward, obvious or immediate. We simply don’t know enough about the second- and third-order implications of much of what has happened in the last few months to be able to chart a direct path forward.

Instead, we have some truths that can serve as our guiding lights. Companies will need to embrace transparency. They will need to show grace under pressure. And they will need to have the courage to embark on aggressive campaigns that seize the moment even if they don’t fully understand what that moment might be.

How startups can leverage elastic services for cost optimization

By Walter Thompson
Joey Lei Contributor
Joey Lei is director of service management at Synoptek, a global systems integrator and managed services provider. Prior to joining Synoptek, he was a lead product manager for Dell EMC’s Data Protection Division and was a founding product manager for Dell EMC PowerProtect Data Manager, Dell EMC’s newest generation data protection and data management solution.

Due to COVID-19, business continuity has been put to the test for many companies in the manufacturing, agriculture, transport, hospitality, energy and retail sectors. Cost reduction is the primary focus of companies in these sectors due to massive losses in revenue caused by this pandemic. The other side of the crisis is, however, significantly different.

Companies in industries such as medical, government and financial services, as well as cloud-native tech startups that are providing essential services, have experienced a considerable increase in their operational demands — leading to rising operational costs. Irrespective of the industry your company belongs to, and whether your company is experiencing reduced or increased operations, cost optimization is a reality for all companies to ensure a sustained existence.

One of the most reliable measures for cost optimization at this stage is to leverage elastic services designed to grow or shrink according to demand, such as cloud and managed services. A modern product with a cloud-native architecture can auto-scale its cloud consumption to match reduced operational demand. What may not have been obvious to startup leaders is a strategy often employed by incumbent, mature enterprises — achieving cost optimization by leveraging managed services providers (MSPs). MSPs enable organizations to repurpose full-time staff members from impacted operations to more strategic product lines or initiatives.
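As one concrete illustration of the elastic half of that strategy, here is a hedged Python sketch using boto3’s Application Auto Scaling API to let a containerized service grow and shrink with load. The cluster and service names are hypothetical placeholders, and the capacity bounds and 50% CPU target are assumptions chosen for the example, not recommendations:

```python
# Sketch: register an ECS service as a scalable target, then attach a
# target-tracking policy so capacity follows demand automatically.
# "my-cluster"/"my-service" and all numeric values are placeholders.
import boto3

autoscaling = boto3.client("application-autoscaling", region_name="us-east-1")
resource_id = "service/my-cluster/my-service"

autoscaling.register_scalable_target(
    ServiceNamespace="ecs",
    ResourceId=resource_id,
    ScalableDimension="ecs:service:DesiredCount",
    MinCapacity=1,    # shrink this far when demand falls away
    MaxCapacity=10,   # cap spend even under peak load
)

autoscaling.put_scaling_policy(
    PolicyName="cpu-target-tracking",
    ServiceNamespace="ecs",
    ResourceId=resource_id,
    ScalableDimension="ecs:service:DesiredCount",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        # Hold average CPU near 50%: scale out above it, in below it.
        "TargetValue": 50.0,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ECSServiceAverageCPUUtilization"
        },
        "ScaleOutCooldown": 60,
        "ScaleInCooldown": 120,
    },
)
```

The point is that consumption, and therefore cost, tracks demand in both directions without anyone filing a ticket.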

Why companies need cost optimization in the long run

Microsoft acquires robotic process automation platform Softomotive

By Frederic Lardinois

During his Build keynote, Microsoft CEO Satya Nadella today confirmed that the company has acquired Softomotive, a robotic process automation (RPA) platform. Bloomberg first reported that this acquisition was in the works earlier this month, but the two companies didn’t comment on the report at the time.

Today, Nadella noted that Softomotive would become part of Microsoft’s Power Automate platform. “We’re bringing RPA – or robotic process automation – to legacy apps and services with our acquisition of Softomotive,” Nadella said.

Softomotive currently has about 9,000 customers around the world. Its WinAutomation platform will be freely available to Power Automate users with what Microsoft calls an attended RPA license.

In Power Automate, Microsoft will use Softomotive’s tools to enable a number of new capabilities, including Softomotive’s low-code desktop automation solution WinAutomation. Until now, Power Automate did not feature any desktop automation tools.

It’ll also build Softomotive’s connectors for applications from SAP, as well as legacy terminal screens and Java, into its desktop automation experience and enable parallel execution and multitasking for UI automation.

Softomotive’s other flagship application, ProcessRobot for server-based enterprise RPA development, will also find a new home in Power Automate. My guess, though, is that Microsoft mostly bought the company for its desktop automation skills.

“One of our most distinguishing characteristics, and an indelible part of our DNA, is an unswerving commitment to usability,” writes Softomotive CEO and co-founder Marios Stavropoulos. “We have always believed in the notion of citizen developers and, since less than two percent of the world population can write code, we believe the greatest potential for both process improvement and overall innovation comes from business end users. This is why we have invested so diligently in abstracting complexity away from end users and created one of the industry’s most intuitive user interfaces – so that non-technical business end users can not just do more, but also make deeper contributions by becoming professional problem solvers and innovators. We are extremely excited to pursue this vision as part of Microsoft.”

The two companies did not disclose the financial details of the transaction.

Enterprise companies find MLOps critical for reliability and performance

By Walter Thompson
Rish Joshi Contributor
Rish is an entrepreneur and investor. Previously, he was a VC at Gradient Ventures (Google’s AI fund), co-founded a fintech startup building an analytics platform for SEC filings and worked on deep-learning research as a graduate student in computer science at MIT.

Enterprise startups UiPath and Scale have drawn huge attention in recent years from companies looking to automate workflows, from RPA (robotic process automation) to data labeling.

What’s been overlooked in the wake of such workflow-specific tools has been the base class of products that enterprises are using to build the core of their machine learning (ML) workflows, and the shift in focus toward automating the deployment and governance aspects of the ML workflow.

That’s where MLOps comes in, and its popularity has been fueled by the rise of core ML workflow platforms such as Boston-based DataRobot. The company has raised more than $430 million and reached a $1 billion valuation this past fall serving this very need for enterprise customers. DataRobot’s vision has been simple: enabling a range of users within enterprises, from business and IT users to data scientists, to gather data and build, test and deploy ML models quickly.

Founded in 2012, the company has quietly amassed a customer base that boasts more than a third of the Fortune 50, with triple-digit yearly growth since 2015. DataRobot’s top four industries include finance, retail, healthcare and insurance; its customers have deployed over 1.7 billion models through DataRobot’s platform. The company is not alone, with competitors like H2O.ai, which raised a $72.5 million Series D led by Goldman Sachs last August, offering a similar platform.

Why the excitement? As artificial intelligence pushed into the enterprise, the first step was to go from data to a working ML model, which started with data scientists doing this manually, but today is increasingly automated and has become known as “auto ML.” An auto-ML platform like DataRobot’s can let an enterprise user quickly auto-select features based on their data and auto-generate a number of models to see which ones work best.
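Stripped to its core, that model-selection loop is short enough to sketch. The following Python example uses scikit-learn and a bundled toy dataset purely for illustration; platforms like DataRobot’s layer automated feature engineering, scale and governance on top of this basic pattern:

```python
# Minimal auto-ML sketch: cross-validate several candidate models and
# keep the best one. Dataset and candidate list are illustrative only.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=5000),
    "random_forest": RandomForestClassifier(n_estimators=200),
    "gradient_boosting": GradientBoostingClassifier(),
}

# Score every candidate with 5-fold cross-validation, then pick the winner.
scores = {
    name: cross_val_score(model, X, y, cv=5).mean()
    for name, model in candidates.items()
}
best = max(scores, key=scores.get)
print(f"best model: {best} (mean CV accuracy {scores[best]:.3f})")
```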

As auto ML became more popular, improving the deployment phase of the ML workflow became critical for reliability and performance — and so enters MLOps. It’s quite similar to the way that DevOps has improved the deployment of source code for applications. Companies such as DataRobot and H2O.ai, along with other startups and the major cloud providers, are intensifying their efforts on providing MLOps solutions for customers.
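What the MLOps half adds is a gate in front of deployment. Here is a minimal, hypothetical sketch of such a promotion check: before a newly trained model replaces the production one, compare the two on held-out metrics and refuse the rollout on any regression. The metric names and thresholds are assumptions for illustration:

```python
# Hypothetical MLOps promotion gate: a candidate model must beat the
# production model on accuracy without blowing the latency budget.
from dataclasses import dataclass

@dataclass
class EvalReport:
    name: str
    accuracy: float      # measured on a shared holdout set
    latency_ms: float    # p95 inference latency

def should_promote(candidate: EvalReport, production: EvalReport,
                   min_gain: float = 0.005,
                   max_latency_ms: float = 50.0) -> bool:
    """Gate a model rollout the way DevOps gates a bad build."""
    if candidate.latency_ms > max_latency_ms:
        return False  # fast enough matters as much as accurate enough
    return candidate.accuracy >= production.accuracy + min_gain

prod = EvalReport("prod-v7", accuracy=0.912, latency_ms=31.0)
cand = EvalReport("candidate-v8", accuracy=0.921, latency_ms=28.5)
print("promote" if should_promote(cand, prod) else "hold back")
```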

We sat down with DataRobot’s team to understand how their platform has been helping enterprises build auto-ML workflows, what MLOps is all about and what’s been driving customers to adopt MLOps practices now.

The rise of MLOps

If we let the US Postal Service die, we’ll be killing small businesses with it

By Walter Thompson
Laura Behrens Wu Contributor
Laura Behrens Wu is the co-founder and CEO of Shippo, which is building a shipping platform for 21st century e-commerce.

Since moving to the United States, I’ve come to appreciate and admire the United States Postal Service as a symbol of American ingenuity and resilience.

Like electricity, telephones and the freeway system, it’s part of our greater story and what binds the United States together. But it’s also something that’s easy to take for granted. USPS delivers 181.9 million pieces of First Class mail each day without charging an arm and a leg to do so. If you have an address, you are being served by the USPS — and no one’s asking you for cash up front.

As CEO of Shippo, an e-commerce technology platform that helps businesses optimize their shipping, I have a unique vantage point into the USPS and its impact on e-commerce. The USPS has been a key partner since the early days of Shippo in making shipping more accessible for growing businesses. As a result of our work with the USPS, along with several other emerging technologies (like site builders, e-commerce platforms and payment processing), e-commerce is more accessible than ever for small businesses.

And while my opinion on the importance of the USPS is not based on my company’s business relationship with the Postal Service, I want to be upfront about the fact that Shippo generates part of its revenue from the purchase of shipping labels through our platform from the USPS along with several other carriers. If the USPS were to stop operations, it would have an impact on Shippo’s revenue. That said, the negative impact would be far greater for many thousands of small businesses.

I know this because at Shippo, we see firsthand how over 35,000 online businesses operate and how they reach their customers. We see and support everything from what options merchants show their customers at checkout to how they handle returns — and everything in between. And while each and every business is unique, with different products, customers, operations and strategies, they all need to ship.

In the United States, the majority of this shipping is facilitated by the USPS, especially for small and medium businesses. For context, the USPS handles almost half of the world’s total mail and delivers more than the top private carriers do in aggregate, annually, in just 16 days. And, it does all of this without tax dollars, while offering healthcare and pension benefits to its employees.

As has been the case for many organizations, COVID-19 has significantly impacted the USPS. While e-commerce package shipments continue to rise (+30% since early March based on Shippo data), it has not been enough to overcome the drastic drop in letter mail. With this, I’ve heard opinions of supposed “inefficiency,” calls for privatization, pushes for significant pricing and structural changes, and even indifference to the possibility of the USPS shutting down.

Amid this crisis, we all need the USPS and its vital services now more than ever. In a world with a diminished or dismantled USPS, it won’t be Amazon, other major enterprises, or even Shippo that suffer. If we let the USPS die, we’ll be killing small businesses along with it.

Quite often, opinions on the efficiency (or lack thereof) of the USPS are very narrow in scope. Yes, the USPS could pivot to improve its balance sheet and turn operating losses into profits by axing cumbersome routes, increasing prices and being more selective in who they serve.

However, this omits the bigger picture and the true value of the USPS. What some have dubbed inefficient operations are actually key catalysts to small business growth in the United States. The USPS gives businesses across the country, regardless of size, location or financial resources, the ability to reach their customers.

We shouldn’t evaluate the USPS strictly on balance sheet efficiency, or even as a “public good” in the abstract. We should look at how many thousands of small businesses have been able to get started thanks to the USPS, how hundreds of billions of dollars of commerce is made possible by the USPS annually and how many millions of customers, who otherwise may not have access to goods, have been served by the USPS.

In the U.S., e-commerce accounts for over half a trillion dollars in sales annually, and is growing at double-digit rates each year. When I hear people talk about the growth of e-commerce, Amazon is often the first thing that comes up. What doesn’t shine through as often is the massive growth of small business — which is essential to the health of commerce in general (no one needs a monopoly!). In fact, the SMB segment has been growing steadily alongside Amazon. And with the challenges that traditional businesses face with COVID-19, more small businesses than ever are moving online.

USPS Priority Mail gets packages almost anywhere in the U.S. in two to three days (average transit time is 2.5 days based on Shippo data) and starts at around $7 per shipment, with full service: tracking, insurance, free pickups and even free packaging that they will bring to you.

In a time when we as consumers have become accustomed to free and fast shipping on all of our online purchases, the USPS is essential for small businesses to keep up. As consumers we rarely see behind the curtain, so to speak, when we interact with e-commerce businesses. We don’t see the small business owner fulfilling orders out of their home or out of a small storefront, we just see an e-commerce website. Without the USPS’ support, it would be even harder, in some cases near impossible, for small business owners to live up to these sky-high expectations. For context, 89% of U.S.-based SMBs (under $10,000 in monthly volume) on the Shippo platform rely on the USPS.

I’ve seen a lot of talk about the USPS’s partnership with Amazon, how it is to blame for the current situation, and how under a private model, things would improve. While we have our own strong opinions on Amazon and its impact on the e-commerce market, Amazon is not the driver of USPS’s challenges. In fact, Amazon is a major contributor in the continued growth of the USPS’s most profitable revenue stream: package delivery.

While I don’t know the exact economics of the deal between the USPS and Amazon, significant discounting for volume and efficiency is common in e-commerce shipping. Part of Amazon’s pricing is a result of it actually being cheaper and easier for the USPS to fulfill Amazon orders, compared to the average shipper. For this process, Amazon delivers shipments to USPS distribution centers in bulk, which significantly cuts costs and logistical challenges for the USPS.

Without the USPS, Amazon would be able to negotiate similar processes and efficiencies with private carriers — small businesses would not. Given the drastic differences in daily operations and infrastructure between the USPS and private carriers, small businesses would see shipping costs increase significantly, in some cases by more than double. On top of this, small businesses would see a new operational burden when it comes to getting their packages into the carriers’ systems in the absence of daily routes by the USPS.

Overall, I would expect to see the level of entrepreneurship in e-commerce slow in the United States without the USPS or with a private version of the USPS that operates with a profit-first mindset. The barriers to entry would be higher, with greater costs and larger infrastructure investments required up-front for new businesses. For Shippo, I’d expect to see a much greater diversity of carriers used by our customers. Our technology that allows businesses to optimize across several carriers would become even more critical for businesses. Though, even with optimization, small businesses would still be the group that suffers the most.

Today, most SMB e-commerce brands, based on Shippo data, spend between 10% and 15% of their revenue on shipping, which is already a large expense. This could rise well north of 20%, especially when you take into account surcharges and pick-up fees, creating an additional burden for businesses in an already challenging space.

I urge our lawmakers and leaders to see the full picture: that the USPS is a critical service that enables small businesses to survive and thrive in tough times, and gives citizens access to essential services, no matter where they reside.

This also means providing government support — both financially and in spirit — as we all navigate the COVID-19 crisis. This will allow the USPS to continue to serve both small businesses and citizens while protecting and keeping their employees safe — which includes ensuring that they are equipped to handle their front-line duties with proper safety and protective gear.

In the end, if we continue to view the USPS as simply a balance sheet and optimize for profitability in a vacuum, we ultimately stand to lose far more than we gain.

AWS launches Amazon AppFlow, its new SaaS integration service

By Frederic Lardinois

AWS today launched Amazon AppFlow, a new integration service that makes it easier for developers to transfer data between AWS and SaaS applications like Google Analytics, Marketo, Salesforce, ServiceNow, Slack, Snowflake and Zendesk. As with similar services, including Microsoft’s Power Automate, developers can trigger these flows based on specific events, at pre-set times or on demand.

Unlike some of its competitors, though, AWS is positioning this service more as a data transfer service than a way to automate workflows, and, while the data flow can be bi-directional, AWS’s announcement focuses mostly on moving data from SaaS applications to other AWS services for further analysis. For this, AppFlow also includes a number of tools for transforming the data as it moves through the service.
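For developers working from the API rather than the console, triggering one of those on-demand runs looks roughly like the hedged boto3 sketch below. It assumes a flow has already been configured, and the flow name "salesforce-to-s3" is a hypothetical placeholder:

```python
# Sketch: start a pre-configured AppFlow flow on demand and check its
# status. The flow name is a placeholder for one created beforehand.
import boto3

appflow = boto3.client("appflow", region_name="us-east-1")

run = appflow.start_flow(flowName="salesforce-to-s3")
print("execution id:", run["executionId"])

flow = appflow.describe_flow(flowName="salesforce-to-s3")
print("flow status:", flow["flowStatus"])
```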

“Developers spend huge amounts of time writing custom integrations so they can pass data between SaaS applications and AWS services so that it can be analysed; these can be expensive and can often take months to complete,” said AWS principal advocate Martin Beeby in today’s announcement. “If data requirements change, then costly and complicated modifications have to be made to the integrations. Companies that don’t have the luxury of engineering resources might find themselves manually importing and exporting data from applications, which is time-consuming, risks data leakage, and has the potential to introduce human error.”

Every flow (which AWS defines as a call to a source application to transfer data to a destination) costs $0.001 per run, though, in typical AWS fashion, there is also a cost associated with data processing (starting at $0.02 per GB).
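Those two line items make back-of-envelope forecasting straightforward. A quick Python sketch using the prices quoted above (the hourly-sync workload is a made-up example):

```python
# AppFlow cost estimate from the quoted pricing: $0.001 per flow run
# plus data processing starting at $0.02 per GB.
RUN_PRICE_USD = 0.001
DATA_PRICE_USD_PER_GB = 0.02

def monthly_cost(runs_per_day: int, gb_per_run: float, days: int = 30) -> float:
    runs = runs_per_day * days
    return runs * RUN_PRICE_USD + runs * gb_per_run * DATA_PRICE_USD_PER_GB

# An hourly sync moving ~0.5 GB per run: 720 runs, 360 GB, under $8/month.
print(f"${monthly_cost(runs_per_day=24, gb_per_run=0.5):.2f} per month")
```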

“Our customers tell us that they love having the ability to store, process, and analyze their data in AWS. They also use a variety of third-party SaaS applications, and they tell us that it can be difficult to manage the flow of data between AWS and these applications,” said Kurt Kufeld, vice president, AWS. “Amazon AppFlow provides an intuitive and easy way for customers to combine data from AWS and SaaS applications without moving it across the public internet. With Amazon AppFlow, our customers bring together and manage petabytes, even exabytes, of data spread across all of their applications — all without having to develop custom connectors or manage underlying API and network connectivity.”

At this point, the number of supported services remains comparatively low, with only 14 possible sources and four destinations (Amazon Redshift and S3, as well as Salesforce and Snowflake). Sometimes, depending on the source you select, the only possible destination is Amazon’s S3 storage service.

Over time, the number of integrations will surely increase, but for now, it feels like there’s still quite a bit more work to do for the AppFlow team to expand the list of supported services.

AWS has long left this market to competitors, even though it has tools like AWS Step Functions for building serverless workflows across AWS services and EventBridge for connecting applications. Interestingly, EventBridge currently supports a far wider range of third-party sources, but as the name implies, its focus is more on triggering events in AWS than moving data between applications.
