Back in January this year, Intel launched PCIe add-in card versions of its Xe DG1 graphics solution with specifications largely similar to those of its mobile equivalent, the Iris Xe Max. The Xe DG1, however, tops out at 80 execution units (EUs), while the Iris Xe Max offers up to 96. We haven't seen many benchmarks of the DG1 yet, but the card seems to have made its way to the Basemark GPU database earlier this month.
Spotted on Twitter by known leaker @TUM_APISAK, the listing shows an Asus Xe DG1 card with 80 EUs, 4 GB of 128-bit LPDDR4X-4266 VRAM, a 1,500 MHz clock, and a 30 W TDP scoring 17,289 points in the Basemark GPU Vulkan test. In comparison, the four-year-old Asus Radeon RX 550 is nearly 2% faster with a score of 17,619 points. The Radeon RX 550 is a Polaris-based card featuring 10 compute units (CUs), a 1,183 MHz clock, 4 GB of 128-bit GDDR5 RAM, and a 50 W TBP. Both benchmarks appear to have been run on identically configured systems powered by an Intel Core i3-10100F processor.
The choice of CPU makes sense here, as the Xe DG1 works only with select 9th gen and 10th gen Intel Core processors on motherboards featuring B460, H410, B365, and H310C chipsets with a special BIOS. A theory doing the rounds (via @0xCats) is that the consumer DG1 cards lack an EEPROM chip to store the firmware and thus need to rely on firmware stored on the motherboard to function.
@TUM_APISAK had previously leaked a 3DMark Fire Strike result purportedly of the DG1, wherein Intel's solution was found to be in GTX 750 Ti/RX 480 territory.
The 2% performance difference in the present Basemark GPU Vulkan test is not really significant, so it can be assumed for now that the Xe DG1 is roughly on par with an RX 550. Considering this is Intel's first foray into discrete GPUs (not counting Larrabee), the performance seems fairly decent and holds no surprises.
Intel never marketed the DG1 to gamers; in fact, it is available only in select OEM systems (which explains the paucity of benchmark data), so the card is more about testing the waters for now.
The upcoming Xe-HPG DG2 should hopefully up the ante considerably and become a viable NVIDIA/AMD competitor.