Intel won't benchmark GPUs with more than 768 shaders and 3GB memory
Flashback: Intel was bravo six, going dark. Their covert GPU development program had just been confirmed. In one burst, they poached nearly a dozen industry specialists from other companies. They hunkered down. They went radio silent for a year. Then, they hired a marketing team.
That marketing team flipped the paradigm and introduced an unprecedented level of transparency into the development process. (I’m personally grateful to Intel’s marketing team for giving me so much to write about.) The ploy has often worked. Early promises of ray tracing and a 10nm production node quelled concerns rooted in Intel’s CPU strife. An early leak promising a GPU with 4096 cores quickly impressed; just last week, photos of the largest GPU in development sparked a wave of fresh curiosity. But promises made years ago are quickly forgotten, and intangible specifications raise concerns over accuracy.
It’s been five months since their last press event. In that period, we’ve seen only one Intel GPU make the rounds. Designed with software developers in mind, the Xe DG1 SDV won’t impress consumers. As the only visible product, however, and given that it has plenty of RGB, that role has fallen upon it anyway. Yesterday’s SiSoftware leak dispelled rumors that it would have 128 EUs: it has 96 (the earlier figure mistakenly counted the CPU’s integrated graphics as part of the discrete solution). That’s a mediocre 768 shaders/cores. The database entry also showed it operating at 1.5 GHz, paired with 3GB of memory.
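For reference, the shader figure follows directly from the EU count: Intel’s Gen architecture packs 8 ALUs (the "shaders" or "cores" in marketing terms) into each execution unit. A minimal sketch of that arithmetic:

```python
# Each Intel Gen execution unit (EU) contains 8 ALUs,
# which marketing counts as "shaders" or "cores".
ALUS_PER_EU = 8

def shader_count(eus: int) -> int:
    """Return the shader/core count for a given number of EUs."""
    return eus * ALUS_PER_EU

print(shader_count(96))   # the DG1 SDV's confirmed 96 EUs -> 768
print(shader_count(128))  # the rumored (incorrect) 128 EUs -> 1024
```

This is also why the 128-EU rumor sounded plausible: adding a CPU’s 32 integrated-graphics EUs to the discrete part’s 96 lands exactly on that number.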
That’s a perfectly sufficient performance bracket for a device intended for developers. Packaged as the right consumer product (in laptops, say, as Intel demonstrated back in January), it might even be competitive. But I will pose this question to Intel’s marketing team: why should anyone care?
A year ago, when this device was leaked alongside three Xe HP (high performance) GPUs with stunning core counts, the leak as a whole was a good sign. That only the least impressive of those products has materialized is not. I’d almost say last week’s photo of what turned out to be the largest of the previously leaked GPUs was an attempt to distract from that fact. What looked very promising five months ago doesn’t look so rosy anymore.
Intel, pick one: stop putting RGB on your developer cards and pretending gamers have something to look forward to soon, or give us something concrete to be hopeful for.