
AWS adds natural language search service for business intelligence from its data sets

By Jonathan Shieber

When Amazon Web Services launched QuickSight, its business intelligence service, back in 2016, the company wanted to make product and customer information accessible to business users, not just developers.

At the time, the natural language processing technologies available weren’t robust enough to give customers the tools to search databases effectively using queries in plain speech.

Now, as those technologies have matured, Amazon is coming back with a significant upgrade called QuickSight Q, which allows users to just ask a simple question and get the answers they need, according to Andy Jassy’s keynote at AWS re:Invent.

“We will provide natural language to provide what we think the key learning is,” said Jassy. “I don’t like that our users have to know which databases to access or where data is stored. I want them to be able to type into a search bar and get the answer to a natural language question.”

That’s what QuickSight Q aims to do. It’s a direct challenge to a number of business intelligence startups and another instance of the way machine learning and natural language processing are changing business processes across multiple industries.

“The way Q works: type in a question in natural language [like]… ‘Give me the trailing twelve month sales of product X?’… You get an answer in seconds. You don’t have to know tables or have to know data stores.”

It’s a compelling use case and gets at the way AWS is integrating machine learning to provide more no-code services to customers. “Customers didn’t hire us to do machine learning,” Jassy said. “They hired us to answer the questions.”
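To make the contrast concrete, here is a purely illustrative sketch, not the QuickSight Q API: the table and column names are made up, and it simply sets the kind of query an analyst would otherwise need against the plain-language question Q accepts.

```python
# Hypothetical illustration only: 'sales', 'sales_amount' and 'product_name'
# are invented names, and this is not the QuickSight Q API.

question = "What were the trailing twelve month sales of product X?"

# Roughly the query a user would otherwise have to know how (and where) to write:
equivalent_sql = """
SELECT SUM(sales_amount)
FROM sales                          -- user must know the table...
WHERE product_name = 'Product X'    -- ...and the column names
  AND sale_date >= CURRENT_DATE - INTERVAL '12' MONTH;
"""

# With QuickSight Q, per the keynote, the user types `question` into a search
# bar and the service maps it onto the underlying data sets itself.
print(question)
```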

AWS announces DevOps Guru to find operational issues automatically

By Ron Miller

At AWS re:Invent today, Andy Jassy announced DevOps Guru, a new tool for DevOps teams to help the operations side find issues that could be having an impact on application performance. Consider it the sibling of CodeGuru, the service the company announced last year to find issues in your code before you deploy.

It works in a similar fashion using machine learning to find issues on the operations side of the equation. “I’m excited to launch a new service today called Amazon DevOps Guru, which is a new service that uses machine learning to identify operational issues long before they impact customers,” Jassy said today.

The way it works is that it collects and analyzes data from application metrics, logs, and events “to identify behavior that deviates from normal operational patterns,” the company explained in the blog post announcing the new service.

This service essentially gives AWS a product that competes with companies like Sumo Logic, Datadog or Splunk by providing deep operational insight into problems that could be affecting your application, such as misconfigurations or resources that are over capacity.

When it finds a problem, the service can send an SMS, Slack message or other communication to the team and provides recommendations on how to fix the problem as quickly as possible.
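A minimal boto3 sketch of that flow, assuming the DevOps Guru API as exposed by boto3 and an existing SNS topic (the ARN below is a placeholder): it wires up a notification channel for alerts and pulls the account-level health summary the service builds from your metrics, logs and events.

```python
import boto3

guru = boto3.client("devops-guru", region_name="us-east-1")

# Route insight notifications to an SNS topic (Slack or SMS can be fanned out from SNS).
guru.add_notification_channel(
    Config={"Sns": {"TopicArn": "arn:aws:sns:us-east-1:123456789012:ops-alerts"}}
)

# Summary of open reactive/proactive insights detected across the account.
health = guru.describe_account_health()
print(health)
```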

What’s more, you pay for the data analyzed by the service, rather than a monthly fee. The company says this means that there is no upfront cost or commitment involved.

AWS launches SageMaker Data Wrangler, a new data preparation service for machine learning

By Frederic Lardinois

AWS launched a new service today, Amazon SageMaker Data Wrangler, that makes it easier for data scientists to prepare their data for machine learning training. In addition, the company is also launching SageMaker Feature Store, available in SageMaker Studio, a new service that makes it easier to name, organize, find and share machine learning features.

AWS is also launching SageMaker Pipelines, a new service that’s integrated with the rest of the platform and that provides a CI/CD service for machine learning to create and automate workflows, as well as an audit trail for model components like training data and configurations.

As AWS CEO Andy Jassy pointed out in his keynote at the company’s re:Invent conference, data preparation remains a major challenge in the machine learning space. Users first have to write the queries and code to get the data out of their data stores, then write more queries to transform that data and combine features as necessary. All of that is work that doesn’t actually go into building the models but into the infrastructure around building them.

Data Wrangler comes with over 300 pre-configured data transformations built in that help users convert column types or impute missing data with mean or median values. There are also built-in visualization tools to help identify potential errors, as well as tools for checking for inconsistencies in the data and diagnosing them before the models are deployed.
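This is not the Data Wrangler API, just a hand-rolled pandas version of two of the transformations mentioned above (column type conversion and median imputation), to show the kind of boilerplate the built-in transforms are meant to save.

```python
import pandas as pd

df = pd.DataFrame({
    "price": ["19.99", "4.50", None, "7.25"],   # stored as strings, with a gap
    "units": [3, None, 10, 4],
})

# Convert column types.
df["price"] = pd.to_numeric(df["price"])

# Impute missing values with the column median.
for col in ["price", "units"]:
    df[col] = df[col].fillna(df[col].median())

print(df)
```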

All of these workflows can then be saved in a notebook or as a script so that teams can replicate them — and used in SageMaker Pipelines to automate the rest of the workflow, too.


It’s worth noting that quite a few startups are working on the same problem. Wrangling machine learning data, after all, is one of the most common problems in the space. For the most part, though, companies still build their own tools, and as usual that makes this area ripe for a managed service.

AWS launches Glue Elastic Views to make it easier to move data from one purpose-built data store to another

By Jonathan Shieber

AWS has launched a new tool called Glue Elastic Views that lets developers move data from one data store to another.

At the AWS re:Invent keynote, CEO Andy Jassy announced Glue Elastic Views, a service that lets programmers move data across multiple data stores more seamlessly.

The new service can take data from disparate silos and bring it together. The AWS ETL service allows programmers to write a small amount of SQL code to create a materialized view that can move data from one source data store to another.

For instance, Jassy said, a programmer can move data from DynamoDB to Elasticsearch by setting up a materialized view to copy that data, with the service managing the dependencies along the way. That means if data changes in the source data store, it will automatically be updated in the other data stores where the data has been replicated.

“When you have the ability to move data… and move that data easily from data store to data store… that’s incredibly powerful,” said Jassy.
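Glue Elastic Views was announced in preview, so the snippet below is an assumption rather than the real syntax or API; it only sketches the idea Jassy described, a short SQL statement defining a materialized view that copies data from DynamoDB into Elasticsearch and keeps it in sync as the source changes.

```python
# Purely illustrative: the view name, source table and SQL dialect here are
# invented, not Elastic Views' actual interface.
hypothetical_view_sql = """
CREATE MATERIALIZED VIEW orders_search AS
SELECT order_id, customer_id, status, updated_at
FROM dynamodb.orders;            -- source table (DynamoDB)
-- target: an Elasticsearch index, kept up to date as 'orders' changes
"""
print(hypothetical_view_sql)
```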

AWS goes after Microsoft’s SQL Server with Babelfish for Aurora PostgreSQL

By Frederic Lardinois

AWS today announced a new database product that is clearly meant to go after Microsoft’s SQL Server and make it easier — and cheaper — for SQL Server users to migrate to the AWS cloud. The new service is Babelfish for Aurora PostgreSQL. The tagline AWS CEO Andy Jassy used for this service in his re:Invent keynote today is probably telling: “Stop paying for SQL Server licenses you don’t need.” And to show how serious it is about this, the company is even open-sourcing the tool.

What Babelfish does is provide a translation layer for SQL Server’s proprietary SQL dialect (T-SQL) and communications protocol so that businesses can switch to AWS’ Aurora relational database at will (though they’ll still have to migrate their existing data). It provides translations for the dialect, but also SQL commands, cursors, catalog views, data types, triggers, stored procedures and functions.

The promise here is that companies won’t have to replace their database drivers or rewrite and verify their database requests to make this transition.

“We believe Babelfish stands out because it’s not another migration service, as useful as those can be. Babelfish enables PostgreSQL to understand database requests—both the command and the protocol—from applications written for Microsoft SQL Server without changing libraries, database schema, or SQL statements,” AWS’s Matt Asay writes in today’s announcement. “This means much faster ‘migrations’ with minimal developer effort. It’s also centered on ‘correctness,’ meaning applications designed to use SQL Server functionality will behave the same on PostgreSQL as they would on SQL Server.”
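Here is a sketch of the migration story the article describes: the application keeps its existing SQL Server driver and T-SQL, and only the endpoint changes. It assumes a Babelfish-enabled Aurora PostgreSQL cluster listening on the SQL Server (TDS) port; the host, credentials and table names are placeholders.

```python
import pymssql  # the application's existing SQL Server driver, unchanged

conn = pymssql.connect(
    server="my-aurora-cluster.cluster-xxxx.us-east-1.rds.amazonaws.com",
    port=1433,                # SQL Server wire-protocol port handled by Babelfish
    user="app_user",
    password="...",
    database="sales",
)

cur = conn.cursor()
# T-SQL as written for SQL Server (TOP, GETDATE, DATEADD) -- Babelfish translates
# it for PostgreSQL instead of requiring a rewrite to LIMIT / NOW().
cur.execute(
    "SELECT TOP 5 order_id, total FROM orders "
    "WHERE created_at > DATEADD(day, -7, GETDATE())"
)
for row in cur.fetchall():
    print(row)
conn.close()
```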

PostgreSQL, AWS rightly points out, is one of the most popular open-source databases in the market today. A lot of companies want to migrate their relational databases to it — or at least use it in conjunction with their existing databases. This new service is going to make that significantly easier.

The open-source Babelfish project will launch in 2021 and will be available on GitHub under the Apache 2.0 license.

“It’s still true that the overwhelming majority of relational databases are on-premise,” AWS CEO Andy Jassy said. “Customers are fed up with and sick of incumbents.” As is tradition at re:Invent, Jassy also worked a few swipes at Oracle into his keynote, but the real target of the products the company is launching in the database area today is clearly Microsoft.

AWS brings ECS, EKS services to the data center, open sources EKS

By Ron Miller

Today at AWS re:Invent, Andy Jassy talked a lot about how companies are making a big push to the cloud, but today’s container-focused announcements gave a big nod to the data center as the company announced ECS Anywhere and EKS Anywhere, both designed to let you run these services on premises as well as in the cloud.

These two services, ECS for generalized container orchestration and EKS for Kubernetes, will let customers use these popular AWS services on premises. Jassy said that some customers still want the same tools they use in the cloud on prem, and these offerings are designed to give that to them.

Speaking of ECS, he said, “I still have a lot of my containers that I need to run on premises as I’m making this transition to the cloud, and [these] people really want it to have the same management and deployment mechanisms that they have in AWS also on premises, and customers have asked us to work on this. And so I’m excited to announce two new things to you. The first is the launch, or the announcement, of Amazon ECS Anywhere, which lets you run ECS in your own data center,” he told the re:Invent audience.


He said it gives you the same AWS APIs and cluster configuration management pieces. The same will be true for EKS, allowing a single management methodology regardless of where you are using the service.
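A minimal boto3 sketch of that "same APIs everywhere" point: the same ECS calls describe a cluster whether its container instances run on EC2 or are registered from your own data center. The cluster name is a placeholder.

```python
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")

cluster = "hybrid-cluster"
instance_arns = ecs.list_container_instances(cluster=cluster)["containerInstanceArns"]

if instance_arns:
    details = ecs.describe_container_instances(
        cluster=cluster, containerInstances=instance_arns
    )
    for ci in details["containerInstances"]:
        # The same response shape covers cloud and on-premises instances.
        print(ci["containerInstanceArn"], ci["status"])
```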

While it was at it, the company also announced it was open sourcing EKS, its own managed Kubernetes service. The idea behind these moves is to give customers as much flexibility as possible and to recognize what Microsoft, IBM and Google have been saying: we live in a multi-cloud and hybrid world, and people aren’t moving everything to the cloud right away.

In fact, in his opening Jassy stated that right now in 2020, just 4% of worldwide IT spend is on the cloud. That means there’s money to be made selling services on premises, and that’s what these services will do.
