AWS brings EC2 G4 instances into general availability

The accelerated computing instances feature Nvidia T4 Tensor Core GPUs
Written by Stephanie Condon, Senior Writer

Amazon Web Services on Friday announced the general availability of Elastic Compute Cloud (EC2) G4 instances. Featuring Nvidia T4 Tensor Core GPUs, the cloud instances are optimized for accelerating machine learning inference and graphics-intensive workloads. 

Amazon previewed the G4 instances alongside Nvidia in March. Nvidia also announced at the time that T4 GPUs would be offered by Cisco, Dell EMC, Fujitsu, HPE and Lenovo. Back in January, Google began offering T4 GPUs on the Google Cloud Platform in beta.

T4 GPUs can be useful for running machine learning-powered applications, such as image metadata tagging, object detection, recommender systems, automated speech recognition and language translation. They're also a cost-effective platform for building and running graphics-intensive applications, including remote graphics workstations, video transcoding and photo-realistic design. 
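To give a sense of the kind of inference workload the T4 is aimed at, here is a minimal sketch (not part of AWS's announcement) of batched image classification in half precision, which is what exercises the GPU's Tensor Cores. It assumes PyTorch and torchvision are installed on the instance; the model choice and batch size are illustrative only.

```python
import torch
import torchvision.models as models

# Load a pretrained ResNet-50 classifier and move it to the GPU
model = models.resnet50(pretrained=True).eval().to("cuda")

# FP16 inference takes advantage of the T4's Tensor Cores
model = model.half()

# Dummy batch standing in for preprocessed images (batch, channels, height, width)
batch = torch.randn(8, 3, 224, 224, dtype=torch.float16, device="cuda")

with torch.no_grad():
    logits = model(batch)
    top_classes = logits.argmax(dim=1)  # predicted class index per image

print(top_classes)
```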

Along with the latest generation of T4 GPUs, Amazon's G4 instances feature custom 2nd Generation Intel Xeon Scalable processors, up to 100 Gbps of networking throughput, and up to 1.8 TB of local NVMe storage. 
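For readers who want to try the new instances, they can be launched programmatically with the AWS SDK like any other EC2 instance type. The snippet below is a rough sketch rather than anything from the announcement: the AMI ID and key pair name are placeholders, and the g4dn.xlarge size and us-east-1 region are assumptions chosen for illustration.

```python
import boto3

# Launch a single G4 instance in one of the regions where G4 is available
ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder: substitute a GPU-ready AMI
    InstanceType="g4dn.xlarge",        # assumed G4 instance size for this example
    KeyName="my-key-pair",             # placeholder key pair name
    MinCount=1,
    MaxCount=1,
)

print(response["Instances"][0]["InstanceId"])
```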

In the coming weeks, the G4 instances will support Amazon Elastic Inference, enabling developers to reduce costs by provisioning the right amount of GPU performance for their workloads. 

G4 instances are now available in the US East (N. Virginia, Ohio), US West (Oregon, N. California), Europe (Frankfurt, Ireland, London), and Asia Pacific (Seoul and Tokyo) regions, with availability in additional regions planned in the coming months. 
