Let's finally talk about AI and the games industry.
We've been needing to have this talk for some time now.
My day job, the thing that pays the bills and keeps Reconnect going, is data.
I’ve been a data professional for over a decade now. From lowly analyst all the way up to manager, if it’s in the field of data, I’ve done it. My specialities include data migration, data engineering, data governance, consumer data protection, and information management.
Suffice to say I’ve been working in and around AI since before it was cool.
I mention this to give what I’m about to say some weight, but I will be honest and say that I wouldn’t easily be able to make my own AI / ML model tomorrow if I needed to.
I’m a bit of the same when it comes to video games. I’ve been closely following the industry for a long time now, and I have even taken the time to make small games in Unity and Godot. But likewise, if I were to make a game tomorrow, I wouldn’t find it easy.
Because I have my toes dipped in both worlds, I have a lot of opinions about AI and video games. Given I’ve spent a lot of my professional life arguing the ethical points of AI and navigating legal grey areas, I’m not overly a fan of the whole kit and caboodle.
However, on the whole, AI has been in the background for a lot longer than most of us realise, and it’s not going anywhere. With studios across the world adopting AI at speed, I think it’s time we understood the basics so that we can criticise the industry fairly and efficiently.
What is AI?
In the industry we refer to it as ‘Unsupervised Decision Making’. A decade ago this was simply called an algorithm. Replace all marketing that mentions AI with the word algorithm and nothing has changed, just the buzzword.
We develop models to carry out tasks for us, and where there is ambiguity that would normally require a human to weigh in, we train the computer to make that decision unsupervised. This is also the measure of how effective a model is: how many unsupervised decisions does it actually get wrong?
We want AI for scenarios where it would be too much work for a human to carry out the decisions manually, and/or where we would have to write a lot of prescriptive code to account for all the variations. It’s simply the next evolution in automation. As a society we are constantly creating tools to make tasks more efficient to carry out.
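To make the ‘unsupervised decisions’ framing concrete, here is a toy sketch (every name and number in it is invented for illustration, not any real product): a model learns a rule from decisions humans already made, then makes new calls on its own, and its quality is simply the fraction of those calls it gets wrong.

```python
# Toy sketch of "unsupervised decision making": learn a rule from
# examples a human already judged, then make new calls unsupervised.
# All data below is made up purely for illustration.

def train_threshold(examples):
    """Find the spend threshold that best separates approve/decline."""
    best_t, best_errors = 0.0, len(examples)
    for spend, _ in examples:
        t = spend
        # An applicant is "approved" when spend <= threshold.
        errors = sum(1 for s, label in examples if (s <= t) != label)
        if errors < best_errors:
            best_t, best_errors = t, errors
    return best_t

# Human-labelled history: (weekly fast-food spend, was the loan approved?)
history = [(20, True), (35, True), (60, False), (80, False), (45, True)]
threshold = train_threshold(history)

# Unsupervised decisions on new applicants, plus the error rate we'd
# measure if a human later reviewed them.
new_cases = [(30, True), (70, False), (50, True)]
wrong = sum(1 for spend, truth in new_cases if (spend <= threshold) != truth)
error_rate = wrong / len(new_cases)
```

Real models are vastly more complicated than a single threshold, but the shape is the same: the fewer unsupervised decisions it gets wrong, the better the model.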
It comes in different forms
As previously mentioned AI has just become a catch all term for most things we would have called an algorithm years ago.
It spans a lot of different forms. Most of us are familiar with AI as voice assistants, tools like ChatGPT, image generation like Midjourney, and text rewriters like Grammarly.
But AI goes further than that, and it has done for some time. The reason smartphone cameras have gotten heaps better in the last ten years is not better hardware but the AI image processing inside. If you use something like Xero to upload receipts and have the information extracted, that’s often AI. Many of our devices and products use AI to manage power and battery levels.
Scientists across the world are using AI to refine data: genome sequencing, rocket trajectories, sifting through astronomical data.
AI is everywhere, and my guiding line tends to be: if you can see the AI, it’s a bad product. AI should happen in the background; it’s not at its best when the consumer is aware of it.
This is my helpful way of trying to get us to the point where we are criticising the correct AI. Because sadly not all AI is bad.
What’s the problem?
The uproar about AI does tend to be specifically about what’s known as Generative AI or GenAI for short.
Most of it was trained on stolen data; it’s extremely resource intensive, which not only damages the planet but drives up hardware prices everywhere; and it appears to be threatening the livelihoods of artists everywhere.
My biggest fear, coming from my professional career, is the threat it poses to consumer data protections. Even prior to the AI hype, there were not enough laws to protect users’ data from being sold and used in ways consumers would not consent to. AI and this mad dash to obtain data has increased that threat twentyfold.
My go to example is home loans. You already give a lot of your data to a bank to prove you can make a home loan. In my own country (NZ), some banks were already denying home loans after determining people were spending too much money on fast food.
If a stance on using stolen data is not taken soon enough, then these companies will continue to get away with harvesting your data from other sources. Imagine a world where an AI at a bank or health insurance company starts penalising you because it has access to data you did not consent to it having. Increased health premiums because they have obtained your location data and determine you don’t exercise enough, or denial of a mortgage because they can see that you are job hunting.
You can hardly trust a human, nefarious or not, to look at your data and make a judgement call without your input. So why trust an AI model?
(Personal rant over)
What does it mean for the games industry?
The games industry is a unique beast when it comes to AI.
For the most part, AI thrives when it has access to a lot of easy to access and parse data. The games industry does not have that. Ubisoft does not have the ability to access data from Activision or Microsoft.
This means AI has to be trained in house, and even then it’s not going to be easy to parse.
Microsoft’s recent showcase of an AI trained on watching footage of a game is all smoke and mirrors. To recreate a game, you need to recreate the scenes and coding logic inside a game engine, and you cannot parse those from watching footage.
So when we talk about AI in games, it’s going to be the easily accessible stuff like text and images or it’s going to be super niche.
Take Ubisoft’s demo of an NPC you could have a full conversation with: it’s nothing more than rudimentary natural language processing, and ChatGPT can hold a better conversation. The key thing to remember here is authorship. Games are authored to fit a narrative. Chatting to a guard in Skyrim about the latest Liverpool match is useless.
If a studio wants to offer unscripted conversations with NPCs, it will need to spend a lot of time tailoring that model to the lore of the game it lives in. That means creating more lore and doing a lot of training. Then, when the studio makes another game, most of that work gets thrown out because the new game might not be set in the same world. It’s not impossible, it’s just a lot of work for questionable return. And we know for a fact that most of these studios claim to be pursuing AI as a way to reduce costs, not increase them.
With some of these demos being shown off, we really need to ask who they’re for. For the most part, the AI is being shown off for investors, to prove the studio is ‘cutting edge’ and not falling behind. It’s the same reason so many studios publicly claimed they were doing NFT games. Investors are stupid and need to be deceived to continue to give funding.
So what AI makes sense?
Well, if we think of the tasks a games development team would like to automate to speed up their process, then we end up landing on art assets.
Art takes time, and the thing most of us don’t consider is that when a game is in its early phases of development a lot of art is created that we never see.
Assets for locations, mechanics, and characters need to be created just to get a taste of whether the vision is viable, and at the end much of it may be thrown out. The quicker assets can be made to prove or disprove a concept, the better for a studio where money is tight and funding is uncertain.
EA talked about how much time they spend 3D modelling stadiums, and given the number of sports games they pump out, it’s actually a use case for AI that makes sense. Kinda hurts to admit that EA said something logical.
This is where the rubber hits the road. The obvious concern here is job losses. As a writer and an artist, it’s all too easy to view this kind of AI as threatening our livelihoods, especially when leaders whose sole motivation is to make more money are saying these things so publicly. We’ve also seen examples of places that used to hire artists using generative AI instead.
But it’s not that simple. For one, all tech-based industries are always creating tools to automate away roles, and yet for the past twenty-something years, more and more people have been needed to get games out the door.
Take a look at SpeedTree, a tool that populates your game engine with foliage. It’s old enough to be an industry staple, but by all definitions it is an AI tool that vastly improves a process that might once have taken several people a lot of time. Has it cost jobs? We don’t know. But there are a myriad of tools in the industry that came about from the need to automate certain tasks.
Where AI crosses a line is where it’s not being used to augment the process but to cut it out entirely. Voice actors losing out on future work because they had their voice cloned is a shady practice. It puts the entire profession in jeopardy, and it’s short-term thinking; it’s not sustainable.
Whilst the threat of AI is real, and the concerns valid, I think we overestimate its actual impact on what it means to create a game. We may hear a studio claim that AI will make the next game faster to make, but what we don’t see is how, in the end, they needed to hire double the number of contractors to meet their deadline.
We often forget just how much of a shit show it is to get a game to the finish line. Contractors are hired in the hundreds as a game picks up pace, but none of this is visible to us consumers.
I can’t recall the interview, but on a podcast I heard a developer be asked what they wanted AI for and the answer was testing. Games always release with bugs. This is often because even if they have a testing team of 50 or 500, nothing finds bugs faster than 10,000 players in a single hour.
The bigger and more complex a game, the more the variables that need to be tested multiply combinatorially. Hello Games had to deploy automated testing bots inside their own game because, with billions of planets, they were never going to have enough manpower to test it all.
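The idea behind a testing bot is simple enough to sketch in a few lines. The following is a toy illustration, not Hello Games’ actual system (the ‘game’, its actions, and its invariant are all invented): a bot plays random moves and flags any state that breaks a rule a human tester would otherwise have to catch by hand.

```python
import random

# Toy sketch of an automated test bot: it plays random actions in a
# tiny invented "game" and records any state that breaks an invariant.

def step(state, action):
    """Apply one action to the (position, health) player state."""
    x, health = state
    if action == "left":
        x -= 1
    elif action == "right":
        x += 1
    elif action == "trap":
        health -= 10
    return (x, health)

def run_bot(steps=1000, seed=0):
    """Random-walk the game, recording states that violate the invariant."""
    rng = random.Random(seed)
    state, bugs = (0, 100), []
    for _ in range(steps):
        state = step(state, rng.choice(["left", "right", "trap"]))
        if state[1] < 0:  # invariant: health should never go negative
            bugs.append(state)
            state = (state[0], 100)  # respawn and keep exploring
    return bugs

bugs = run_bot()  # every recorded bug is a failing state a human never saw
```

One bot running a thousand steps is already covering more of the state space than a tester clicking through by hand; a fleet of them running overnight is how a studio approximates those 10,000 players.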
The video game development pipeline is a complicated thing. We know from countless anecdotes (several of which can be found in my series on game engines) that things go wrong all the time, and the very technology studios rely on is often the cause.
This year’s GDC survey on the state of the industry specifically asked respondents about AI. AI and Games has a good write-up on that survey, but the key takeaways are:
Generative AI is being used in studios around the world
However, that doesn’t necessarily mean it’s being used in the games themselves. People working in support functions such as finance are also using it.
Collective interest in using Generative AI has gone down from last year, probably indicating that studios tried it and it didn’t stick.
Most common uses were helping with code, art assets, and automation of repetitive tasks.
Whilst this survey provides some evidence that the problem maybe isn’t as big as we fear, it’s not wholly representative. Those working in AAA games weren’t as well represented in the survey as they should be, given the footprint those studios have. It also only focussed on what individuals are doing in their own roles. Not everyone in a studio would necessarily be privy to decisions made by their leaders to cut the voice acting budget in half and rely on generated audio for the bulk of the process.
We should be mad at AI for the harm it’s doing to the planet and to data privacy. We should definitely be angry at its part in the enshittification of consumer products. But we have to accept that some AI is fine, and some is necessary.
We can’t stop game studios from trying to streamline their process. We don’t complain when a studio announces the use of the newest Unreal Engine, when that decision itself could have cost jobs.
Psyonix, the developers behind Rocket League, started off as a studio that specialised in helping other studios use Unreal Engine. As Unreal has gotten more advanced and studios have switched to other engines, that technically could have put Psyonix or similar outfits out of a job.
Whilst it’s healthy to be aware of job cuts, we should take the time to educate ourselves on how the industry operates and focus on the more systemic failures.
Studios everywhere are cutting jobs for any reason they can think of, so it’s hard to see whether AI is actually having an impact or not. There is a melting pot of reasons the industry is in the trouble it is currently in, and AI is far too new and nuanced to take the blame. It’s a symptom, not a cause.
In the coming years and decades, studios are going to try to use AI to streamline their pipelines. Some things won’t work out, and the studio will end up paying more money in the long run, which will probably result in the next game being staffed correctly. Some things will work out: tools like SpeedTree will succeed, become part of the process, and we probably won’t hear much about them.
Given the state of the industry at present, I don’t believe a future where AAA games create millions in profit but require a quarter of the manpower is near.
Whether AI exists or not, studios are still going to make shit games we don’t want. They are still going to trend-chase and try to optimise for stakeholder trends. That’s just the rut we are in.
It is my belief that most of the companies publicly claiming AI is going to save the day are just saying that to make stakeholders happy. Fundamentally, games that rely on AI as it currently works are either going to be shit games no one wants to play, or will get deep into production and find themselves needing to hire a lot of manpower.
We can be angry about these things. But being angry at an EA press release doesn’t change anything. We should be pushing for game developers to be in recognised unions to protect their livelihood and we should be looking out for where AI is crossing a line, like with voice actors.
But for the other areas, we don’t know enough to make the call. In the long run, studios may still need the same number of artists but, given the optimisations AI has provided, be better able to hit their deadlines without crunch culture.
We also just need to run a common-sense sanity check on what we are being told. The press releases from Ubisoft and Microsoft make no sense, and exist more to appease shareholders and bolster popularity than to actually have an effect on the industry.
We need to allow studios to optimise their own processes. We want better games, we want less crunch culture, and we want the industry to be strong. That is only achieved when studios are able to optimise the pipeline that makes games.
To reiterate: some AI is bad, and there is plenty to be mad about. But the games industry is unique; there is more nuance, and there are hidden factors we need to take into account. We should be focussing on supporting where we can make a difference and ignoring the stuff that’s not for us.