
Intel continues to innovate in machine learning and artificial intelligence

Intel has released its 4th Generation Xeon Scalable CPUs, developed to close the gap between workloads that run on CPUs and those that require GPUs, for users running machine learning (ML) and artificial intelligence (AI) projects.

Artificial intelligence, which continues to grow in popularity, is expected to be woven into entire workflows in the near future. The impact and capabilities of AI applications such as ChatGPT are growing rapidly.

With its next-generation Xeon CPUs working alongside GPUs, Intel aims to lay the foundation for AI strategies in the coming period and to accelerate the spread of artificial intelligence into every application by focusing on end-to-end application performance. From analyzing smaller datasets at the edge to processing large data lakes, Xeon processors can eliminate the need to purchase costly external accelerators to train these systems.

Intel Xeon CPU Max models offer high performance thanks to HBM technology

The Intel Xeon CPU Max models in the 4th Generation Xeon Scalable family stand out as the first and only x86-based processors with high bandwidth memory (HBM). They offer significant compute performance in high-performance computing and AI workloads where memory bandwidth is the constraint. The 4th Gen Xeon CPU Max series with HBM delivers up to 3.7x higher processing performance than the 3rd Gen and up to 2.3x higher than 4th Gen parts without HBM.
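The idea that HBM specifically helps "memory-bandwidth-constrained" workloads can be made concrete with a roofline-style check: a kernel benefits from more bandwidth when its arithmetic intensity (FLOPs per byte moved) falls below the machine's balance point. The sketch below is illustrative only; the peak numbers are placeholders, not Xeon Max specifications.

```python
# Roofline-style sketch: is a kernel limited by memory bandwidth or by compute?
# All peak figures here are made-up placeholders, not real Xeon Max numbers.

def is_bandwidth_bound(flops, bytes_moved, peak_flops, peak_bandwidth):
    """Return True when the kernel's arithmetic intensity (FLOP/byte) is
    below the machine balance (peak FLOP/s divided by peak bytes/s),
    i.e. when extra memory bandwidth (such as HBM) would speed it up."""
    intensity = flops / bytes_moved          # FLOPs per byte of traffic
    balance = peak_flops / peak_bandwidth    # FLOPs the machine can do per byte
    return intensity < balance

# Example: a streaming dot product of two 1M-element fp32 vectors does
# 2 FLOPs per element while reading 8 bytes per element -> intensity 0.25,
# far below a typical balance point, so it is bandwidth-bound.
dot_product_bound = is_bandwidth_bound(
    flops=2e6, bytes_moved=8e6,
    peak_flops=1e12, peak_bandwidth=1e11)    # balance = 10 FLOP/byte
```

On a hypothetical machine with a balance of 10 FLOP/byte, the dot product (0.25 FLOP/byte) is bandwidth-bound, which is exactly the class of workload the HBM-equipped Max series targets.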

As every sector digitizes, ever larger datasets are used and processing demand keeps rising. According to Statista's research, worldwide data production is estimated to exceed 180 zettabytes by 2025. In such an environment, cloud environments are increasingly used to deploy new business models, and the cloud, which has revolutionized how businesses are founded and scaled, stands out as an important opportunity. In data centers, across the network, and at the edge, cloud and artificial intelligence technologies are changing how businesses and consumers alike access and use data, creating unprecedented processing demand. Adopting AI requires a single portable software development environment and a platform that can apply all kinds of analytical techniques. The Intel Xeon CPU Max series meets this need thanks to its high single-thread performance.

Nvidia CEO Jensen Huang said the following about Sapphire Rapids (the generation that includes the Intel Xeon CPU Max series):

“I chose Sapphire Rapids as the CPU for Nvidia Hopper because Sapphire Rapids has excellent single-threaded performance.”

One of the remarkable innovations in the 4th Gen Xeon Scalable processors is the integration of Advanced Matrix Extensions (Intel AMX). With Intel AMX, 4th Gen Intel Xeon Scalable processors deliver up to 10x higher inference and training performance: in both training and inference, theoretically about 8x compared to AVX-512 and 10x compared to the 3rd generation. Many organizations today push their inference workloads to discrete GPUs to reach desired performance levels and meet service-level agreements. Intel AMX can deliver a 10x boost in AI inference rates compared to 3rd Gen Intel Xeon processors, and the new processors also provide up to a 10x speedup for data preparation and training. Intel has also cited 4th Gen performance results on Hugging Face models, including those from Stability AI.
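AMX accelerates exactly one pattern: multiplying small tiles of low-precision values (int8 or bfloat16) while accumulating into wider 32-bit results. The pure-Python model below sketches that tiled multiply-accumulate scheme; the 16x16 tile size is illustrative only, not the exact AMX tile geometry, and real AMX work is issued through compilers or libraries such as oneDNN rather than written by hand.

```python
# Illustrative model of AMX-style tiled matrix multiplication:
# int8 inputs, int32 accumulation, processed one tile at a time.
# Tile size is a placeholder, not the actual AMX register geometry.

TILE = 16

def tiled_matmul_int8(a, b, n):
    """Multiply two n x n matrices of small integers (lists of lists),
    tile by tile, accumulating each partial product into a 32-bit-style
    accumulator, the way AMX TMUL units operate on tile registers."""
    c = [[0] * n for _ in range(n)]
    for i0 in range(0, n, TILE):
        for j0 in range(0, n, TILE):
            for k0 in range(0, n, TILE):
                # One "tile op": C-tile += A-tile @ B-tile
                for i in range(i0, min(i0 + TILE, n)):
                    for j in range(j0, min(j0 + TILE, n)):
                        acc = c[i][j]
                        for k in range(k0, min(k0 + TILE, n)):
                            acc += a[i][k] * b[k][j]
                        c[i][j] = acc
    return c
```

The payoff in hardware comes from doing each tile op as a single instruction on data held in dedicated tile registers, rather than element by element as here; frameworks pick this path up automatically when models are quantized to int8 or run in bfloat16.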
