Weka AI™ demonstrates its ability to accelerate DataOps with Valohai integration by deriving actionable intelligence from data and providing operational agility and governance for customers’ edge-to-core-to-cloud data pipelines.
WekaIO™ (Weka), an Advanced Technology Partner in the Amazon Web Services (AWS) Partner Network (APN) and an innovation leader in high-performance, scalable file storage, is pleased to announce its integration with the deep learning pipeline management solution from Valohai, a Weka Innovation Network™ (WIN) partner. The announcement underscores Weka’s commitment to empowering Data Scientists and Chief Data Officers to manage and prioritize data science pipelines. The tools from Valohai are supported in an Amazon Virtual Private Cloud (Amazon VPC) and available in AWS Marketplace.
“New workloads are driving the need for modern foundational architectures, and the recently launched Weka AI offers a transformative solution framework for Accelerated DataOps,” said Shailesh Manjrekar, head of AI and strategic alliances at Weka. “Our partnership with Valohai and our integration with its Deep Learning Pipeline Management tools expand Weka AI’s capabilities to offer Explainable AI (XAI). This is a critical factor for use cases with a social impact, including autonomous driving, healthcare, and genomics.”
Integration with solutions from technology alliance partners such as Valohai enhances the power of Weka AI. Underpinned by the Weka File System (WekaFS™), Weka AI now provides a production-ready solution where the entire AI data pipeline workflow, from data ingestion to batch feature extraction, training, hyperparameter optimization, and finally to inference and versioning, can run on the same storage platform, whether on-premises or on AWS. This is only possible because of the excellent mixed workload performance and the data management and governance abilities provided by WekaFS.
“Machine Learning (ML) gives businesses a competitive advantage, but while ML is hard, real-world ML is much harder,” said Eero Laaksonen, chief executive officer at Valohai. “A real-world ML system is 95% enabling code, with only 5% actual ML code that creates business value, so the question becomes how to ensure that efforts are focused on that 5%. With Valohai, businesses can focus on data science, as it handles everything else. It is as simple as pointing to your code and data and hitting ‘run’. The seamless integration between our DLMS solution and Weka’s powerful snap-to-object capabilities offers a quick, zero-setup infrastructure for DataOps. This further helps businesses to build models 10x faster, regain 35% of lost cloud costs, and free their DataOps teams with automated ML orchestration, data management, and data mobility. It is a win-win.”
Weka AI, with Valohai DLMS, enables hybrid workflows in which data scientists can use Jupyter Notebook or the Valohai GUI, either on-premises or on AWS, to perform data transformation, model training, hyperparameter optimization, and inference in a Kubernetes-orchestrated environment. Valohai DLMS integrates seamlessly with WekaFS, which leverages i3en Amazon Elastic Compute Cloud (Amazon EC2) instances with Non-Volatile Memory Express (NVMe) flash for the performance tier and extends the file namespace over Amazon Simple Storage Service (Amazon S3) buckets for the capacity tier. Valohai DLMS also takes file namespace snapshots when running experiments and stores them for data versioning. These versions can then be rehydrated at any time to reproduce an experiment, providing the required explainability and transparency. Weka AI provides security and governance with end-to-end encryption of the pipeline and integrates with leading key management solutions such as HashiCorp Vault.
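The versioning workflow described above can be illustrated with a minimal sketch: each experiment run records the ID of an immutable filesystem snapshot, so the exact input data can later be rehydrated to reproduce that run. All names here are illustrative assumptions, not the actual Weka or Valohai API.

```python
# Hypothetical sketch of snapshot-based experiment versioning: each run
# records the snapshot of the file namespace it was trained against, so
# the run can be reproduced later by rehydrating that snapshot.
# Every name below is illustrative, not a real Weka or Valohai call.
from dataclasses import dataclass, field
import uuid


@dataclass
class ExperimentRegistry:
    """Maps experiment run IDs to the data snapshot they trained on."""
    runs: dict = field(default_factory=dict)

    def start_run(self, experiment: str, snapshot_fn) -> str:
        """Take a namespace snapshot, then record it under a new run ID."""
        run_id = f"{experiment}-{uuid.uuid4().hex[:8]}"
        snapshot_id = snapshot_fn()  # e.g. trigger a snap-to-object upload
        self.runs[run_id] = snapshot_id
        return run_id

    def snapshot_for(self, run_id: str) -> str:
        """Look up which snapshot to rehydrate to reproduce a past run."""
        return self.runs[run_id]


# Usage: a stand-in snapshot function replaces the storage-layer call.
registry = ExperimentRegistry()
run = registry.start_run("resnet-train", lambda: "snap-0001")
print(registry.snapshot_for(run))  # → snap-0001
```

The key design point, whatever the underlying API, is that the snapshot ID is captured at run start and stored with the run metadata, which is what makes a past experiment reproducible and auditable.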
Manjrekar adds, “Weka AI, when deployed with Valohai on AWS, builds upon our success in accelerating Genomics, Fintech, and AI/ML/DL data pipelines. Customers use Weka on AWS for outstanding performance: we can showcase 100 GB/sec of throughput with 5 million 4KB IOPS on a 16-instance i3en cluster, all with less than 250-microsecond latency.”
For more information on Weka AI and Weka on AWS, go to: https://www.weka.io/solutions/ai-analytics/ and https://www.weka.io/products/public-cloud/aws
For more information on accelerated DataOps with Weka AI, read the blog at: https://bit.ly/2AamwSw
Valohai is an end-to-end machine learning operations (MLOps) tool that helps companies move from proof-of-concept projects to real-world machine learning solutions. Our customers build custom machine learning workflows to manage their experiments and ensure transparency at each step of model building. Automatic versioning and orchestration of training instances ensure that the ML team doesn’t waste time on mundane tasks but focuses on building the models at the core of each real-world ML solution. For more information, visit valohai.com.
Weka offers WekaFS, the modern file system that uniquely empowers organizations to solve the newest, biggest problems holding back innovation. Optimized for NVMe and the hybrid cloud, Weka handles the most demanding storage challenges in the most data-intensive technical computing environments, delivering truly epic performance at any scale. Its modern architecture unlocks the full capabilities of today’s data center, allowing businesses to maximize the value of their high-powered IT investments. Weka helps industry leaders reach breakthrough innovations and solve previously unsolvable problems.
Follow WekaIO: Twitter, LinkedIn, and Facebook
WekaIO, WekaIO Matrix, WekaFS, Weka AI, Weka Innovation Network (WIN), the Weka brand mark, and the Weka, Weka AI, and WIN logos are trademarks of WekaIO, Inc.
View source version on businesswire.com: https://www.businesswire.com/news/home/20200520005528/en/