
Amazon AWS unveils RedShift ML to ‘bring machine learning to more builders’ – ZDNet

Amazon’s vice president of machine learning, Swami Sivasubramanian, on Tuesday delivered a keynote on machine learning for week two of re:Invent, the conference for Amazon Web Services.

During the keynote, Sivasubramanian announced that the company’s middleware platform for machine learning, SageMaker, will going forward be able to automatically break up a large neural network and distribute its parts across multiple computers. This form of parallel computing, known as model parallelism, usually takes substantial engineering effort.
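In the SageMaker Python SDK, model parallelism of this kind is typically switched on through an estimator’s distribution settings. The following is a minimal sketch, not the configuration used in the keynote demos; the training script name, IAM role, S3 path, instance sizes, and partition counts are all illustrative assumptions.

```python
# Minimal sketch: enabling SageMaker's model-parallel training library.
# Script name, role ARN, S3 paths, and parameter values are hypothetical.
from sagemaker.pytorch import PyTorch

estimator = PyTorch(
    entry_point="train.py",                                # hypothetical training script
    role="arn:aws:iam::123456789012:role/SageMakerRole",   # placeholder IAM role
    framework_version="1.6.0",
    py_version="py36",
    instance_count=2,
    instance_type="ml.p3.16xlarge",
    # The distribution block asks SageMaker to partition the model across
    # GPUs/hosts (model parallelism) instead of replicating the whole model.
    distribution={
        "smdistributed": {
            "modelparallel": {
                "enabled": True,
                "parameters": {"partitions": 4, "microbatches": 8},
            }
        },
        "mpi": {"enabled": True, "processes_per_host": 8},
    },
)

estimator.fit({"training": "s3://my-bucket/train-data"})  # hypothetical dataset location
```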
The new capability, he said, was part of a theme of bringing machine learning, even large deep learning models, to more individuals than the small group of scientists with the skills to develop it. Sivasubramanian recalled his early work on AWS when services such as S3 storage were released. Machine learning, he said, is “at a similar point in its development,” implying AI is having an AWS moment, becoming more broadly available. 
“How can we bring machine learning to this large and growing group of database developers and data analysts?” was one of the key questions Sivasubramanian posed. 
“Training deep learning models can take weeks by teams of scientists,” said Sivasubramanian. Amazon, he said, was able to train a very large neural network known as “T5,” a version of Google’s Transformer for natural language processing, in just three hours. 
Sivasubramanian said Amazon was able to use the distributed training to reduce average training times for very large deep learning networks by forty percent. 
Using the distributed training, said Sivasubramanian, Amazon was able to train both T5 and another large neural network, Mask R-CNN, a convolutional neural network used for object detection. 

In addition to distributed parallelism, Sivasubramanian discussed capabilities for more easily preparing data for ML models, called SageMaker Data Wrangler, and a means for detecting bias in machine learning, called SageMaker Clarify. 
Data Wrangler will replace the tedious process of preparing the features that are going to be used in a machine learning model, explained Sivasubramanian. He described how tax software maker Intuit has numerous features it has to track when walking an individual through tax preparation, such as where in the document an individual is currently working. Data Wrangler is a way to keep track of those features.
Clarify aims to help improve models by running an algorithm to automatically detect potential bias across the machine learning workflow. The software was explained by Amazon AWS ML scientist Dr. Nashlie Sephus, who specializes in issues of bias in ML.
Clarify is integrated into SageMaker Model Monitor to send alerts to data scientists if a model begins to develop a bias.
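In the SageMaker Python SDK, Clarify bias checks run through a dedicated processor. A minimal sketch of a pre-training bias report follows; the bucket, dataset, column names, facet, and role are illustrative assumptions rather than details from the keynote.

```python
# Minimal sketch: a pre-training bias report with SageMaker Clarify.
# S3 paths, column names, facet values, and the role ARN are hypothetical.
from sagemaker import clarify

processor = clarify.SageMakerClarifyProcessor(
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder IAM role
    instance_count=1,
    instance_type="ml.c5.xlarge",
)

data_config = clarify.DataConfig(
    s3_data_input_path="s3://my-bucket/train.csv",   # hypothetical training dataset
    s3_output_path="s3://my-bucket/clarify-output",
    label="approved",                                 # hypothetical target column
    headers=["age", "income", "gender", "approved"],
    dataset_type="text/csv",
)

bias_config = clarify.BiasConfig(
    label_values_or_threshold=[1],   # which label value counts as the favorable outcome
    facet_name="gender",             # hypothetical sensitive attribute to check
)

# Computes pre-training bias metrics (for example, class imbalance across the
# facet) and writes a report to the S3 output path above.
processor.run_pre_training_bias(data_config, bias_config)
```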
Sivasubramanian also announced an extension to the SageMaker Debugger program, first announced last year, that is meant to improve the use of resources in deep learning training. The new capability, called Deep Profiling for SageMaker Debugger, will visualize resources such as GPUs and memory being used in training, offer recommendations to adjust resource usage, and schedule training operations to run at different times in the total deep learning workflow. 
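In the SageMaker Python SDK, profiling is attached to a training job as a configuration object on the estimator. The sketch below assumes a hypothetical training script, instance type, and profiling step range; it is illustrative rather than the setup shown in the keynote.

```python
# Minimal sketch: attaching Debugger profiling to a SageMaker training job.
# Script name, role ARN, instance type, and step range are hypothetical.
from sagemaker.debugger import ProfilerConfig, FrameworkProfile
from sagemaker.pytorch import PyTorch

profiler_config = ProfilerConfig(
    system_monitor_interval_millis=500,  # sample CPU/GPU/memory utilization every 500 ms
    framework_profile_params=FrameworkProfile(start_step=5, num_steps=10),
)

estimator = PyTorch(
    entry_point="train.py",                                # hypothetical training script
    role="arn:aws:iam::123456789012:role/SageMakerRole",   # placeholder IAM role
    framework_version="1.6.0",
    py_version="py36",
    instance_count=1,
    instance_type="ml.p3.2xlarge",
    profiler_config=profiler_config,
)

# Profiling reports are written alongside the job's other Debugger output in S3.
estimator.fit("s3://my-bucket/train-data")  # hypothetical dataset location
```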
The presentation was a showcase for AWS’s three-year-old SageMaker middleware. The different new product offerings all came back to SageMaker, which Sivasubramanian said was becoming a standard inside of corporations as the means to build AI. 
Another new offering, SageMaker Edge Manager, lets developers prepare, run, and monitor ML models across fleets of devices, with models running up to 25 times faster, and provides a simple dashboard for understanding the performance of models on each device across a fleet.
Sivasubramanian touted SageMaker as the “most complete end-to-end” machine learning development environment, noting it was available “through a single pane of glass” in what’s called SageMaker Studio. 

Homing in further on the theme of extending ML to more “builders,” Sivasubramanian said the company was extending the SageMaker Autopilot technology introduced last year. That technology was meant to help people without ML expertise to automatically select machine learning models to use. To date, that has involved automatically bringing up an ML model in AWS’s Aurora database program, known as Aurora ML, or other AWS apps.
“But what if you didn’t have to choose a model at all?” asked Sivasubramanian. The answer is to integrate Autopilot into Redshift, Amazon’s data warehouse application, in an offering known as Amazon Redshift ML. The integration means users of Redshift can simply formulate SQL queries against the database to make machine learning inferences, without building an ML model. 
“Interactions between data and ML are abstracted away,” is the gist. 
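In practice, the pattern Sivasubramanian described boils down to plain SQL: a CREATE MODEL statement hands the training data to Autopilot, and the resulting model is exposed as a SQL function. Below is a minimal sketch submitted through the Redshift Data API in boto3; the cluster, database, table, column, role, and bucket names are all illustrative assumptions.

```python
# Minimal sketch: the Redshift ML pattern of creating and querying a model in SQL,
# submitted here via the Redshift Data API. All identifiers are hypothetical.
import boto3

client = boto3.client("redshift-data")

create_model_sql = """
CREATE MODEL customer_churn
FROM (SELECT age, plan, monthly_spend, churned FROM customer_activity)
TARGET churned
FUNCTION predict_churn
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftMLRole'
SETTINGS (S3_BUCKET 'my-redshift-ml-bucket');
"""

# Redshift ML hands the SELECT's result set to SageMaker Autopilot, which trains
# a model and installs it back in the warehouse as the SQL function predict_churn.
client.execute_statement(
    ClusterIdentifier="my-cluster",   # hypothetical cluster name
    Database="dev",
    DbUser="awsuser",
    Sql=create_model_sql,
)

# Once training finishes, inference is just another SQL query.
client.execute_statement(
    ClusterIdentifier="my-cluster",
    Database="dev",
    DbUser="awsuser",
    Sql="SELECT customer_id, predict_churn(age, plan, monthly_spend) FROM new_customers;",
)
```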
Sivasubramanian said AWS will extend the same integration it had done with Aurora ML to graph database applications, unveiling Amazon Neptune ML, for “easy, fast and accurate predictions for graph applications.”
A service packaged on top of Autopilot, QuickSight Q, provides automatic responses to queries, such as questions about patterns in sales data. 

Another element of extending ML, said Sivasubramanian, is finding ways to apply it to immediate business problems. To that end, he unveiled Amazon Lookout for Metrics, which Amazon calls an “anomaly detection service for metrics with root cause analysis.” 
The software will perform automatic data mining from a variety of applications. 
“The service inspects data and trains ML models to find anomalies,” said Sivasubramanian. “It uses your feedback to continuously optimize its algorithm.”
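As a rough illustration of how such a detector is set up programmatically, the sketch below uses the Lookout for Metrics API in boto3. The detector name, description, and hourly frequency are assumptions, and in practice a metric set pointing at the source data (S3, RDS, Redshift, and so on) must also be defined before activation.

```python
# Minimal sketch: creating and activating a Lookout for Metrics anomaly detector.
# Names and the detection frequency are hypothetical; a metric set describing the
# data source must be attached (via create_metric_set) before activation is useful.
import boto3

lookout = boto3.client("lookoutmetrics")

detector = lookout.create_anomaly_detector(
    AnomalyDetectorName="sales-anomaly-detector",                       # hypothetical name
    AnomalyDetectorDescription="Watches revenue metrics for unexpected changes",
    AnomalyDetectorConfig={"AnomalyDetectorFrequency": "PT1H"},         # check hourly
)

# Activation starts the continuous train-and-detect loop described above.
lookout.activate_anomaly_detector(AnomalyDetectorArn=detector["AnomalyDetectorArn"])
```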
Amazon is spinning Lookout into vertical-market versions, including Lookout for Equipment, which uses sensors on equipment in the field to send alerts if there’s an anomaly in how a machine is running.
A similar service for manufacturing inspection, Lookout for Vision, will examine products on an assembly line to detect anomalies such as undesirable variation in color.  
Such services can tie into other Amazon AWS services and devices, including Monitron, for monitoring equipment, and the Panorama appliance, a small device that takes in camera streams for computer vision in edge applications. AWS’s Vice President for AI, Matt Wood, came onstage to discuss several new aspects of these industrial applications, including a new software development kit to extend Panorama, such as to add more camera feeds.

Wood discussed another vertical-market application, healthcare and life sciences. Work in that field is hampered by the way data is spread throughout numerous repositories in different formats, requiring weeks or months to prepare data, he said.
To ameliorate all that, Wood unveiled Amazon HealthLake, a way to “store, transform, and analyze health and life science data in the cloud at petabyte scale,” as the company puts it.
Wood gave an example of HealthLake used in the detection of diabetes. For early diagnosis of diabetes, doctors have to go through hundreds of thousands of data points from doctor’s notes, prescriptions and the rest. “It is a Herculean effort,” he said, to analyze all that data. “We can bring it all together in minutes,” said Wood. HealthLake can automatically load structured data but also extract information such as patient names from physician notes and add that to a data lake. “These are all just pieces of the puzzle scattered around different silos, but when combined, we can get a much clearer picture,” said Wood. 
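Getting started with HealthLake comes down to creating a FHIR data store that records can then be imported into and queried. A minimal sketch with boto3 follows; the store name is a hypothetical example rather than anything shown in the keynote.

```python
# Minimal sketch: creating an Amazon HealthLake data store with boto3.
# The data store name is hypothetical; FHIR-formatted records (and, per the
# keynote, information extracted from clinical notes) are imported against it.
import boto3

healthlake = boto3.client("healthlake")

datastore = healthlake.create_fhir_datastore(
    DatastoreName="diabetes-research-store",   # hypothetical name
    DatastoreTypeVersion="R4",                 # FHIR R4 is the supported data format
)

# The returned ID is used later when starting import jobs or querying records.
print(datastore["DatastoreId"])
```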

Wood brought on stage Elad Benjamin, the general manager of the radiology informatics business at medical device giant Philips. Benjamin noted the difficulty posed by the abundance of data in healthcare. “Physicians need to work through those silos, and it’s getting harder for them to diagnose and treat,” said Benjamin. He said Philips is using AWS services to free up scientists to pursue “higher-level activities.” 
Sivasubramanian rounded out his presentation by discussing what Amazon is doing to “educate the next generation of machine learning developers.” 


Source: https://www.zdnet.com/article/got-amazon-aws-unveils-sagemaker-clarify-edge-manager-redshift-ml/