Hardware

Asus ESC4000 G4 and ESC8000 G4, new servers based on Nvidia Tesla


Asus has announced the launch of the new Asus ESC4000 G4 and ESC8000 G4 servers, compatible with Nvidia's Tesla V100 and Tesla P4 artificial intelligence solutions, to offer users the latest technology for improving the performance of their AI systems.

New Asus ESC4000 G4 and ESC8000 G4 servers

The new Asus ESC8000 G4 is optimized for high-performance computing (HPC) applications and AI training. It supports up to eight Tesla V100 32GB GPUs and belongs to Nvidia's HGX-T1 class of GPU-accelerated servers.

The Asus ESC4000 G4 is designed for HPC and inference workloads and supports up to four Tesla V100 32GB GPUs or eight Tesla P4 cards, depending on the application. It belongs to Nvidia's HGX-I2 class of GPU-accelerated servers and cuts deep learning inference latency by up to 10 times, thanks to around 20 tera-operations per second of INT8 performance and a hardware transcoding engine.

The Nvidia Tesla V100 GPU offers 32GB of HBM2 memory, double that of the previous generation. This larger memory capacity improves deep learning performance by 50% over the previous generation, boosts productivity, and lets researchers make major scientific advances in less time. The memory upgrade also allows HPC applications to run larger simulations more efficiently.
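As a rough illustration of what such a configuration looks like from software, the sketch below (assuming PyTorch and the Nvidia drivers are installed on the server, which the article does not specify) simply enumerates the installed GPUs and prints their memory capacity; on a Tesla V100 32GB card this should report roughly 32 GB per device.

```python
# Minimal sketch: list the CUDA GPUs visible to PyTorch and their memory capacity.
# Assumes a CUDA-capable host with PyTorch installed (not stated in the article).
import torch

if torch.cuda.is_available():
    for idx in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(idx)
        total_gb = props.total_memory / (1024 ** 3)
        print(f"GPU {idx}: {props.name}, {total_gb:.1f} GB of memory")
else:
    print("No CUDA-capable GPU detected")
```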

As for the Tesla P4, it is the world's fastest GPU for scale-out servers, built to deliver fast, responsive AI applications. It reduces inference latency by up to 10 times on any hyperscale infrastructure and provides up to 40 times better energy efficiency than a CPU.
