QNAP Introduces Mustang Range of Computing Accelerator Cards

QNAP today unveiled two computing accelerator cards designed for AI deep learning inference: the Mustang-V100 (VPU-based) and the Mustang-F100 (FPGA-based). Users can install these PCIe-based accelerator cards in an Intel-based server/PC or in a QNAP NAS.

Both the Mustang-V100 and Mustang-F100 accelerator cards are optimized for Intel's OpenVINO toolkit and can extend inference workloads across Intel hardware with maximized performance. They can also be used with QNAP's OpenVINO Workflow Consolidation Tool.
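In practice, extending a workload across Intel hardware largely comes down to the device name passed to OpenVINO's Inference Engine. A minimal sketch, assuming the classic `openvino.inference_engine` Python API (2020-era OpenVINO releases) and placeholder model file names:

```python
from openvino.inference_engine import IECore

ie = IECore()
# "model.xml"/"model.bin" are placeholder names for a network already
# converted to OpenVINO's IR format by the Model Optimizer.
net = ie.read_network(model="model.xml", weights="model.bin")

# The same network can be loaded onto different Intel hardware simply by
# swapping the device name: "CPU", "GPU" (integrated graphics), "MYRIAD"
# (VPUs such as those on the Mustang-V100), or "HETERO:FPGA,CPU"
# (FPGA cards such as the Mustang-F100).
exec_net = ie.load_network(network=net, device_name="MYRIAD")
```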

New accelerator cards

Both the Mustang-V100 and Mustang-F100 offer inexpensive acceleration solutions for AI inference. They also work with the OpenVINO toolkit to optimize inference workloads for image classification and computer vision tasks. The OpenVINO toolkit helps accelerate the development of high-performance computer vision and deep learning applications, and it includes a Model Optimizer and an Inference Engine.
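To make those two components concrete, the sketch below notes the Model Optimizer conversion step and then runs a single image through the Inference Engine. This is an illustrative sketch rather than QNAP code: the model and image file names are hypothetical, a classification network with NCHW input is assumed, and the classic `openvino.inference_engine` Python API is used.

```python
# Step 1 (shell): the Model Optimizer converts a trained model into
# OpenVINO's intermediate representation (IR), e.g.:
#   mo.py --input_model model.onnx --output_dir ir/
# producing ir/model.xml (topology) and ir/model.bin (weights).
import cv2
import numpy as np
from openvino.inference_engine import IECore

# Step 2: the Inference Engine loads the IR onto the target device.
ie = IECore()
net = ie.read_network(model="ir/model.xml", weights="ir/model.bin")
exec_net = ie.load_network(network=net, device_name="MYRIAD")

input_blob = next(iter(net.input_info))
output_blob = next(iter(net.outputs))

# Resize the image to the network's expected NCHW input shape.
n, c, h, w = net.input_info[input_blob].input_data.shape
image = cv2.resize(cv2.imread("input.jpg"), (w, h))
image = np.expand_dims(image.transpose((2, 0, 1)), axis=0)  # HWC -> NCHW

result = exec_net.infer(inputs={input_blob: image})
print("Predicted class:", int(np.argmax(result[output_blob])))
```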

As the QNAP NAS evolves to support a wider variety of applications, its combination of large storage capacity and PCIe expandability makes it well suited to AI workloads. QNAP has developed the OpenVINO Workflow Consolidation Tool (OWCT), which builds on Intel's OpenVINO toolkit. When used with the OWCT, an Intel-based QNAP NAS offers an ideal inference server solution to help organizations rapidly build inference systems. AI developers can deploy trained models on a NAS for inference and install either the Mustang-V100 or the Mustang-F100 card to accelerate it.

QNAP NAS now supports the Mustang-V100 and Mustang-F100 cards with the latest version of the QTS 4.4.0 operating system. The NAS models that support QTS 4.4.0 are listed at www.qnap.com, and the OWCT application can be downloaded and installed from the App Center on the NAS.
