xAI’s recent 24-hour hackathon has produced Halftime, an experimental AI-powered advertising tool designed to integrate branded elements directly into video content. The project was created by University of Waterloo computer science students Pravin Lohani, Yuvraj Dwivedi and Krish Garg, and was highlighted by xAI as one of the event’s standout submissions.
According to the project description, Halftime uses Grok, xAI’s large language model, to examine the narrative, lighting, setting and objects inside a video. Based on that analysis, it generates ad placements intended to match the scene’s visual and contextual style. Instead of pausing playback with pre-rolls or pop-ups, Halftime reportedly inserts items such as product packaging, background billboards or branded objects that appear to belong in the environment.
In a demo shared on X, the system places items such as soft-drink cans into character interactions and overlays brand logos onto buildings in city scenes. The tool aims to deliver on-the-fly product placement that feels less intrusive than traditional ad breaks.
xAI featured the project in its hackathon highlight thread, describing it as a system that “dynamically weaves AI-generated ads into the scenes you’re watching, so breaks feel like part of the story instead of interruptions.” The hackathon hosted more than 150 projects across categories such as agents, safety tools, AI-assisted creation and monetisation technologies.
Public reaction to the tool on X appears mixed. Some users praised the technical execution and potential revenue applications for video platforms. Others raised concerns about the spread of AI-generated advertising, with several comments criticising the idea of embedding ads more deeply into entertainment content.