Steve Downes, the voice behind the Halo series’ legendary protagonist Master Chief, recently spoke out about the numerous AI reproductions of his voice, saying they “cross a line that gets into an area I’m uncomfortable with.” Naturally, as with most voice actors whose work is essentially duplicated without remuneration, he wants it to stop.
He voiced his concerns in a recent AMA on his YouTube channel. While Downes acknowledged that AI is inevitable and brings real benefits, he criticized the unauthorized use of his voice, which misleads fans into thinking they’re hearing lines he himself recorded.
In the AMA, he explained:
One of the things that can be overwhelming when it comes to attention from fans is when AI gets involved. A lot of it is harmless, I suppose, but some of it cannot be harmless. I’ve been very vocal about my feelings on the use of artificial intelligence, which, on the one hand, is inevitable and has many positive effects on not only show business but humanity in general, but can also be a detriment. It can also be something that deprives the actor of his work.
Downes further continued:
I’ve heard some things online in terms of AI and the reproduction of my voice that sounds like my voice that… like I said, most of the stuff I’ve seen is pretty harmless, but it cannot be that way real quick. So, I’m not a proponent. I don’t like it. I would prefer that it not be done. There’s a lot of fan-made projects out there that are really cool, that are done just from the heart. But when you get to the AI part and deceiving somebody into thinking, in my case, that these are the lines that I actually spoke when they’re not, that’s when we cross a line that gets into an area that I’m not comfortable with. I’ll go on the record with that.
Deepfakes and voice mimicry have been on the rise, and AI voice cloning has become nearly indistinguishable from the real thing in recent years. Numerous analysts and experts predict that 2026 will see a boom in fake AI-generated voice-overs, increasing the risk of fraud. That risk is already a reality, with unsuspecting people subjected to 1,000 AI-generated scam calls daily.