Description
With tools such as Bidirectional Encoder Representations from Transformers (BERT), organizations can use deep learning to make sense of large amounts of text. As a natural language processing (NLP) model, BERT interprets words in context, allowing applications to predict sentences, suggest answers, and generate responses. In third-party testing at INT8 precision across multiple instance sizes, AWS M6i instances featuring 3rd Gen Intel Xeon Scalable processors delivered better BERT workload performance than instances with older processors.
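The INT8 precision used in the testing refers to quantization: representing model weights and activations as 8-bit integers rather than 32-bit floats, which reduces memory traffic and lets the processor use faster integer instructions. As a rough illustration only (not the methodology of the cited tests), a minimal sketch of symmetric INT8 quantization in plain Python:

```python
def quantize_int8(values):
    """Symmetric INT8 quantization: map floats onto the integer range [-127, 127]."""
    # Scale is chosen so the largest-magnitude value maps to +/-127.
    scale = max(abs(v) for v in values) / 127.0 or 1.0
    quantized = [max(-127, min(127, round(v / scale))) for v in values]
    return quantized, scale

def dequantize(quantized, scale):
    """Recover approximate float values from INT8 codes and the stored scale."""
    return [q * scale for q in quantized]

# Hypothetical weight values for illustration.
weights = [0.52, -1.30, 0.07, 0.99]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
```

Inference engines apply this idea tensor by tensor, trading a small, bounded rounding error for substantially higher throughput on integer-capable hardware.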