The famous computer scientist Alan Kay once said that people who are really serious about software should make their own hardware. But according to Intel CEO Pat Gelsinger, the reverse also holds: if you want hardware to succeed, you must put software first.
Broad software compatibility has traditionally been a fundamental advantage of Intel's processors over rival CPUs, not only because of the x86 architecture but also because Intel has always worked closely with software developers. As the world changes, though, Pat Gelsinger must look at software differently than his predecessors did. On the one hand, Intel must work with a wider ecosystem of independent software vendors (ISVs) than before, and more closely than before. On the other hand, Intel's own software can become a new source of income for the company.
"One thing I learned in my 11 years of 'vacation' [at VMware and EMC] is that it is a mistake to deliver a chip that is not supported by software," Pat Gelsinger said in an interview with CRN. "We must deliver software capabilities, and then we must empower them, accelerate them, and make them more secure with the hardware underneath. For me, this is a major shift that I need to drive at Intel."
Expanding the Intel software ecosystem
Intel has long tried to ensure that software can take advantage of its latest hardware by properly supporting all the latest instruction-set extensions and other technologies aimed at accelerating certain workloads. To a large extent, Intel has assisted its partners in creating a software ecosystem optimized for its processors.
For years, this approach helped strengthen Intel's software ecosystem, until accelerated computing emerged in the mid-2000s. Nvidia began actively promoting its CUDA platform, while other companies relied on various open or proprietary standards, such as OpenCL, Vulkan, and Metal, to accelerate performance-critical workloads on specialized hardware. Companies like Apple and Nvidia have created their own software ecosystems; although not as extensive as Intel's, they are competitive enough to attract software developers.
Today, a large number of artificial intelligence (AI) and high-performance computing (HPC) applications are developed for Nvidia's CUDA platform and therefore require the company's hardware and software stack. This naturally poses a challenge for Intel and its data center CPUs and compute GPUs designed for AI and supercomputers, because they are now on the other side of the equation: they must compete with an established ecosystem.
When Raja Koduri joined Intel at the end of 2017, one of his first moves at the chip giant was to build an open-standard, cross-platform application programming interface (API) that lets developers program CPUs, GPUs, FPGAs, and other accelerators without maintaining a separate code base and toolchain for each architecture. Intel calls this oneAPI.
If ISVs are to adopt oneAPI and optimize their programs for Intel's instruction-set extensions, such as AMX (Advanced Matrix Extensions), XMX (Xe Matrix Extensions), or the deep-learning boosts (AVX-512 VNNI, 4VNNIW, AVX-512 BF16, and so on), Gelsinger said, Intel will have to reach more developers than ever before and work with them more closely.
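Before optimizing for extensions like AVX-512 VNNI, software first has to detect whether the host CPU actually supports them. As a minimal sketch (not Intel-provided code), the snippet below parses a Linux `/proc/cpuinfo`-style dump for feature flags; the flag names (`avx512_vnni`, `avx512_bf16`) are the ones the Linux kernel reports, and the sample string is invented for illustration.

```python
# Illustrative sketch: check which instruction-set extensions a CPU
# advertises by parsing the "flags" lines of a /proc/cpuinfo-style dump.
# Flag names follow Linux kernel conventions (e.g. "avx512_vnni").

def parse_cpu_flags(cpuinfo_text: str) -> set:
    """Collect all flag tokens from the 'flags' lines of a cpuinfo dump."""
    flags = set()
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            _, _, values = line.partition(":")
            flags.update(values.split())
    return flags

def supports(flags: set, extension: str) -> bool:
    """True if the CPU reported the given feature flag."""
    return extension in flags

# Hypothetical sample; on Linux you would read open("/proc/cpuinfo").read().
SAMPLE_CPUINFO = """\
processor : 0
flags : fpu sse sse2 avx avx2 avx512f avx512_vnni
"""

flags = parse_cpu_flags(SAMPLE_CPUINFO)
print(supports(flags, "avx512_vnni"))  # True for this sample
print(supports(flags, "avx512_bf16"))  # False for this sample
```

In practice, libraries dispatch to different code paths at runtime based on exactly this kind of check, which is why ISV cooperation matters: without an optimized path, the new silicon sits idle.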
Artificial intelligence and high-performance computing are, of course, the big trends making headlines for technology companies such as Intel, and the blue company is clearly playing catch-up with Nvidia. But AI and HPC programs are not the only types of software Intel needs optimized for its hardware. Emerging applications in edge computing, data centers, and even client PCs will have to rely on new Intel hardware that did not exist a few years ago, and those applications must become part of the Intel software ecosystem.
For example, Intel's upcoming Alder Lake CPUs for client PCs will combine high-performance and energy-efficient cores with a special Intel Thread Director hardware unit that helps the operating system balance loads and assign workloads to the right cores. To get the most out of Thread Director, Intel needs to work closely with developers of operating systems and third-party programs.
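The idea behind hybrid scheduling can be illustrated with a toy model. This is not Intel's actual algorithm (Thread Director feeds hardware telemetry to the OS scheduler); the task names, the `demand` hint, and the threshold below are all invented for the sketch.

```python
# Toy illustration of hybrid-core placement (NOT Intel's real mechanism):
# route demanding tasks to performance (P) cores and light background
# tasks to efficiency (E) cores, based on an invented "demand" hint.

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    demand: float  # 0.0 (background chore) .. 1.0 (latency-critical)

def assign_core_class(task: Task, threshold: float = 0.5) -> str:
    """Pick a core class for a task; threshold is an arbitrary cutoff."""
    return "P-core" if task.demand >= threshold else "E-core"

tasks = [Task("video_encode", 0.9), Task("mail_sync", 0.1), Task("game", 1.0)]
placement = {t.name: assign_core_class(t) for t in tasks}
print(placement)
```

The real system is far richer (per-thread telemetry, OS feedback, power states), but the sketch shows why OS and application cooperation matters: a scheduler with no hint about a thread's nature cannot place it on the right core.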
Another example is Intel's Atom systems-on-chip based on energy-efficient cores for 5G and edge computing applications. The programs that run on these SoCs must be optimized for them (and eventually for the Xe-HP GPUs aimed at edge machines), not for Intel's Xeon or AMD's EPYC processors with their full-fat cores. This means Intel will have to work with a great many software developers, because the number of potential edge computing applications is hard to estimate. Nvidia is already there with its EGX platform, which includes easy-to-deploy machines backed by software that taps Nvidia's CUDA hardware acceleration.
Intel considers paid software services
According to CRN, some of Intel's partners believe the chip giant could take a page from Nvidia's approach to data centers and edge computing, which includes DGX systems for AI and/or HPC and EGX machines for edge applications. Value-added resellers can take machines with a common software stack and equip them with additional programs tailored to specific customers.
Since Intel is the world's largest supplier of PC and server CPUs, the company is unlikely to be interested in competing with its customers by supplying its own machines. Doing so might also spoil Pat Gelsinger's hopes for a semi-custom/custom x86 business. But Intel can still monetize not only its hardware but also its software. For example, Intel's new chief technology officer, Greg Lavender, said that Intel already offers its Intel Unite and Intel Data Center Manager software for an additional fee and could expand its software products.
"I do hope you will see more in this area: how do we leverage our software assets? How do we have uniquely monetized software assets and services that we provide to the industry? They can stand on their own, right? This is part of the business model I hope to do more of in the future," Lavender told CRN.
Intel is not yet ready to discuss the exact types of paid software it may offer customers alongside its CPUs, but it said that, rather than developing its own programs, it could sell things such as advanced platform telemetry to software makers and share the revenue. Such data could make it easier for security companies to detect malware or viruses on client and server systems.
Traditionally, Intel has used its software as a value-add for its hardware. For example, Intel's Quick Sync Video support for desktop PCs ships free of charge with Intel's drivers and taps the video encoding/decoding capabilities built into Intel CPUs. In a world where many companies are now designing their own custom SoCs for specific applications, it remains to be seen how Intel manages to make money selling both hardware and software, but that is an option management is now weighing.