
Master Chief voice actor Steve Downes says AI voice cloning crosses a line and wants it to stop

A screengrab of Halo's primary protagonist Master Chief (Image source: r/halo)
Halo’s Master Chief voice actor, Steve Downes, has publicly condemned AI-generated clones of his performance, arguing that unauthorized imitations deceive fans and threaten actors’ livelihoods. While acknowledging AI’s broader benefits, Downes warns that hyper-realistic voice reproductions cross an ethical line and is calling for such uses of his voice to stop.

Steve Downes, the voice behind the Halo series’ legendary protagonist Master Chief, recently said that the numerous AI reproductions of his voice make him uneasy, stating that they “cross a line that gets into an area I’m uncomfortable with.” Naturally, as with most voice actors whose work is essentially duplicated without remuneration, he wants it to stop.

He voiced his concerns in a recent AMA on his YouTube channel. While Downes acknowledged AI’s benefits and its inevitability, he criticized the unauthorized use of his voice, which misleads fans into thinking they’re hearing lines he himself recorded.

In the AMA, he explained:

One of the things that can be overwhelming when it comes to attention from fans is when AI gets involved. A lot of it is harmless, I suppose, but some of it cannot be harmless. I’ve been very vocal about my feelings on the use of artificial intelligence, which, on the one hand, is inevitable and has many positive effects on not only show business but humanity in general, but can also be a detriment. It can also be something that deprives the actor of his work.

Downes continued:

I’ve heard some things online in terms of AI and the reproduction of my voice that sounds like my voice that… like I said, most of the stuff I’ve seen is pretty harmless, but it cannot be that way real quick. So, I’m not a proponent. I don’t like it. I would prefer that it not be done. There’s a lot of fan-made projects out there that are really cool, that are done just from the heart. But when you get to the AI part and deceiving somebody into thinking, in my case, that these are the lines that I actually spoke when they’re not, that’s when we cross a line that gets into an area that I’m not comfortable with. I’ll go on the record with that.

Deepfakes and voice mimicry have been on the rise, and AI voice cloning has become nearly indistinguishable from the real thing in recent years, crossing the “indistinguishable threshold.” Numerous analysts and experts predict that 2026 will see a boom in fake AI-generated voice-overs, increasing the risk of fraud. That’s already a reality, with unsuspecting people subjected to around 1,000 AI-generated scam calls daily.



Rahim Amir Noorali, 2026-01-25 (Update: 2026-01-25)