Description
Selecting Google Cloud VMs with stronger Bidirectional Encoder Representations from Transformers (BERT) performance allows organizations to analyze text data and derive AI insights from it faster. BERT is a natural language processing (NLP) framework for analyzing textual data; BERT workloads can make predictions, answer questions, and even respond in conversation. In tests at multiple VM sizes and batch sizes, Google Cloud N2 VMs enabled by 3rd Gen Intel Xeon Scalable processors running BERT at INT8 precision delivered better performance than N2D VMs with 3rd Gen AMD EPYC processors running at FP32 precision.
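To illustrate the kind of BERT inference workload the study describes, the sketch below runs a question-answering query with the Hugging Face Transformers pipeline. The model name, question, and context are illustrative assumptions and not drawn from the benchmark itself; the benchmarked configurations additionally ran at INT8 (Intel) and FP32 (AMD) precision, while this minimal example uses the framework's default precision.

```python
# Minimal sketch of a BERT question-answering workload (illustrative only).
# The model and inputs are assumptions; they are not the study's test setup.
from transformers import pipeline

# Load a BERT model fine-tuned for extractive question answering.
qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

# Ask a question against a short passage of text.
result = qa(
    question="Which processors power the N2 VMs?",
    context=(
        "Google Cloud N2 VMs are enabled by 3rd Gen Intel Xeon Scalable "
        "processors, while N2D VMs use 3rd Gen AMD EPYC processors."
    ),
)

print(result["answer"], result["score"])
```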