Dell has now taken the wraps off a new variant of its Pro Max Plus laptop lineup, and it's aimed squarely at AI workloads. While the name doesn't exactly roll off the tongue, the actual hardware inside these machines is more interesting than most product refreshes. These new Dell Pro Max Plus models will be among the first mobile workstations to ship with a Qualcomm AI-100 Inference Card.
While Dell hasn’t shared full specs for the Qualcomm model yet, the base frame and general hardware approach should follow the larger 16-inch and 18-inch Dell Pro Max Plus variants. These models were announced earlier this year with solid workstation credentials: up to an Intel Core Ultra 9 275HX processor (Arrow Lake-HX), Nvidia RTX Pro 5000 Blackwell GPUs with up to 24GB VRAM, up to 256GB of CAMM memory at 7200 MT/s, and up to 16TB of storage.
For reference, the Pro Max 16 sports a 4K OLED 120Hz touchscreen, while the Pro Max 18 opts for a 1440p 18-inch panel. Both feature a 92Wh battery with 280W ExpressCharge support and a webcam with Windows Hello login.
The Qualcomm-based Pro Max Plus will ditch discrete graphics in favor of a dual AI-100 NPU configuration, featuring 32 AI cores and 64GB LPDDR4x memory for running large language models locally. Dell is clearly targeting this toward engineers and researchers working on high-security AI models and chatbot development, where data sensitivity rules out cloud processing.
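That 64GB memory ceiling is the key constraint for local LLM work. A quick back-of-the-envelope sketch (illustrative figures only, not Dell or Qualcomm specs) shows why a model near the top of that range would need aggressive quantization to fit:

```python
# Rough memory estimate for hosting an LLM's weights locally.
# Ignores KV cache and activation memory, which add further overhead.

def weight_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate memory (GB) needed for model weights alone."""
    bytes_per_weight = bits_per_weight / 8
    return params_billions * 1e9 * bytes_per_weight / 1e9

for params in (30, 109):
    for bits in (16, 8, 4):
        gb = weight_memory_gb(params, bits)
        fits = "fits" if gb <= 64 else "does not fit"
        print(f"{params}B @ {bits}-bit: {gb:.1f} GB ({fits} in 64GB)")
```

At 16-bit precision even a 30B-parameter model is tight (~60GB of weights alone), while a 109B-parameter model only squeezes under 64GB at roughly 4-bit quantization (~54.5GB), which is consistent with the parameter range Dell is quoting.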
There’s no official availability timeline yet, but this configuration will likely appeal to enterprise buyers or niche developers, not the average end user. Still, for anyone looking to prototype, test, or deploy AI workloads in the field or on-the-go, the Pro Max Plus with Qualcomm’s inference card might be something to look out for.
Source(s)
Dell (via press release)
Dell Pro Max Plus
Dell’s most powerful laptops are becoming even more powerful!
Select Dell Pro Max Plus laptops will be available with a Qualcomm AI-100 Inference Card (discrete NPU), making them the world’s first mobile workstations with an enterprise-grade discrete NPU*, which enables unparalleled performance for AI workloads.
The Qualcomm AI-100 Inference Card (dNPU) combines dual AI-100 chips to provide 32 AI cores and 64GB of LPDDR4x memory, enough to handle Large Language Models (LLMs) from 30B up to 109B parameters. The result is a local, mobile device that delivers better intellectual property security at a reduced cost compared to server environments.
This is a great solution for AI engineers and data scientists developing chatbots, co-pilots and AI agents, delivering excellent inferencing speed and precision.