Hooded Horse CEO Tim Bender has taken a firm view on genAI art assets. "I fucking hate gen AI art and it has made my life more difficult in many ways."

Manor Lords and Terra Invicta publishers Hooded Horse are imposing a strict ban on generative AI assets in their games, with company co-founder Tim Bender describing it as an “ethics issue” and “a very frustrating thing to have to worry about”.

“I fucking hate gen AI art and it has made my life more difficult in many ways… suddenly it infests shit in a way it shouldn’t,” Bender told Kotaku in a recent interview. “It is now written into our contracts if we’re publishing the game, ‘no fucking AI assets.'” I assume that’s not a verbatim quote, but I’d love to be proven wrong.

The publishers also take a dim view of using generative AI for “placeholder” work, or indeed any ‘non-final’ aspect of game development. “We’ve gotten to the point where we also talk to developers and we recommend they don’t use any gen AI anywhere in the process because some of them might otherwise think, ‘Okay, well, maybe what I’ll do is for this place, I’ll put it as a placeholder,’ right?” Bender went on.

@[email protected]

There’s a problem in movies that I keep thinking about in relation to this.

Movies often use music from other movies in early cuts to get something rough together. They time the scenes around the music, they work with it for ages, and finally it’s time to make an original track to replace the rough copy.

But they have to use something that’s the same tempo, because of how the scenes were timed around the old music. And it has to fit in the same vibe, because that’s what the old music felt like.

So you end up with a piece of music that’s usually pretty close to the temporary music, and a lot of Hollywood OSTs sound almost identical as a result. When I see people talk about using gen AI for placeholders and concept art, I see that same problem turning up.

@[email protected]

Same thing happens with games to some degree.

There are many stories from gaming about placeholder music becoming an integral part of the game.

In the original Doom games, Carmack and Romero loved Black Sabbath and listened to it while testing and working on the game. That led to the now-legendary Doom OST.

During the development of Max Payne 2, Remedy used a Poets of the Fall song as a placeholder, and in the end they decided they wanted it in the game. But because they could not come to an agreement with the publisher, and because the PoF members are just cool guys, they eventually made a song just for the game to get around the licensing debacle. That song was later released as a single.

I remember hearing a story about Brutal Legend having some licensed music as a placeholder in a meeting with investors, and that led to the music ending up in the game.

I’m writing this while I’m a little busy, so everything is coming from memory without fact-checking, so whoever is reading this, take it with a pinch of salt.

Infrapink

Judas Priest, not Black Sabbath.

@[email protected]

I had never heard about temp tracks, but this makes so much sense. That’s a powerful homogenizing force.

@[email protected]

Famously, Stanley Kubrick used classical music as a temporary track for 2001: A Space Odyssey, and intended to have Pink Floyd do the soundtrack. However, he grew to like how the classical music felt so much that he decided to keep it.

@[email protected]

I wonder if that’s why so many sequences use “4 on the floor” arranged roughly around a 12-bar pattern, or a specific piece of classical music that the studio could have gotten from the public domain.

@[email protected]
link
fedilink
English
221d

@[email protected]

As with much discussion of generative AI, the difficulty of Hooded Horse’s position is pinning down what they’re trying to ban. Does an artwork count as generated if somebody used the tech to make a base image of some kind, then fleshed it out and finished it off at length by hand?

A very salient question. If someone generates a rough outline and then redraws it, fixing errors and making modifications with their human artist eye, is the thing they draw a problem? It will involve a human artist, and human artistic skill.

Tracing is one way to teach children how to draw. If someone generates an image to trace for practice, is all their art problematic because they were trained with AI?

This seems kind of like asking a vegan if they’d eat lab-grown meat… I think the answer depends heavily on why the person believes what they do in the first place.

Overspark

One way of looking at it is serving a vegan a vegan meal after you slaughtered a cow for the first couple of tries. Some of the damage has already been done. Also, we’ve had several kerfuffles already where GenAI “placeholders” were present in a released game and caused plenty of outrage. It’s far safer to never have those placeholders to begin with. Just draw up something ugly in Paint; at least it’ll be plenty obvious you need to fix it before launching the game.

@[email protected]

Maybe a better analogy would be the Ship of Theseus - how much of an AI-generated picture has to be replaced by human work for it to not be considered slop anymore?

@[email protected]

Or to stick with the vegan/meat analogy - making the perfect vegan sausage patty by making several meat patties, each one with iteratively less meat until a vegan patty is left, as well as several dead pigs.

Leon

Homeopathic burgers.

halfdane

Slop of Theseus

NoiseColor

Omg. The damage has been done? Cows have been killed because someone used an AI-generated texture for mud.

Overspark

In order to generate that texture, AI bots have already been attacking every website hosting content on the internet for the past year, to the point that they were basically DDoSed and forced to take extreme measures to stay online. Plenty of copyrighted works have been slurped up without consent from their authors, a massive amount of energy has been used to train the models, and even more energy (far more than all cryptocurrencies combined, for example) is used generating things from those models. So yes, a lot of damage has already been done. Far more than killing a couple of cows.

NoiseColor

That’s bullshit exaggeration and you know it.

Plus there are legally made models.

Massive energy is used to give you porn; it’s the way it is. Humanity needs more and more energy all the time. Singling out the one thing you don’t like as the problem is not sensible.

@[email protected]

I heard an interesting statistic the other day. Golf courses use vastly more water than AI. Upwards of 30x more in some areas.

AI usage only accounts for like 20% of water usage by data centers in general.

Leon

The problem here is that you lose nuance.

Yes, a lot of datacentres use evaporative cooling, meaning that the heat is taken away as the water evaporates. It’s a cheap and effective way of doing things and the water returns to the water cycle and doesn’t really get locked up anywhere. So it’s not really a problem, right?

Well yes, in a vacuum that’s fantastic. However, there are two caveats to this: evaporative cooling works best in arid areas, because the dry air can take up more water. Thus they build these AI datacentres in naturally arid areas. Smart, they’re using physics to their advantage!

What’s the second problem then? They’re now using up the ground water in those arid areas to cool their datacentres and thus ruining it for the people that live there, leaving them without safe water to drink.

Also, I don’t know of many anti-AI people who will be all “bUt gOlF CoUrSeS ArE OkAy, We lOvE ThOsE!!” These things exist purely for rich people who don’t contribute anything, so we could get rid of both and the world would be a better place.

NoiseColor

People here will hate you for saying that 😁.

@[email protected]

That’s bullshit exaggeration and you know it.

It is not a bullshit exaggeration.

NoiseColor

Yes it is. It is, and it’s also a generalization.

And in the end, it doesn’t matter. There are tens of thousands of people dying each year to support the living standard you enjoy, but you have focused on AI. Your outrage is a fallacy.

@[email protected]

You just made a fallacy of relative privation, while they made no fallacious argument. They used hyperbole, which is not a fallacy.

So shut the fuck up. If you want to call people out or make an argument, actually make a point and don’t just drop to attacking people’s character with accusations of fallacy. It’s fucked up and does nothing but make you look stupid at the best of times.

justdaveisfine

I’ve seen the argument that if you’re generating an image and making some edits, you’re robbing yourself of original concepts. Even if human hands do the editing, you’ve already outsourced one of the most important parts.

@[email protected]

I’ve seen the argument that if you’re generating an image and making some edits, you’re robbing yourself of original concepts

This argument can also be deployed against Fair Use artworks, though, or tracing.

@[email protected]

I need to admit that in the past day, I asked an AI to write unit tests for a feature I’d just added. I didn’t trust it to write the feature, and I had to fix the tests afterwards, but it did save time.
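
To give a sense of what that looked like, here is a minimal sketch of the kind of thing I mean. The feature function and the tests below are invented for illustration (not my actual code); the tests are the sort the AI drafted and I then had to review and correct:

```python
import pytest

# Hypothetical feature under test: the function name and behaviour are
# invented for illustration, not taken from my actual project.
def apply_discount(price: float, percent: float) -> float:
    """Return the price after applying a percentage discount, rounded to cents."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)


# The AI drafted tests along these lines; I still had to correct expected
# values and drop cases that didn't match the real behaviour.
def test_apply_discount_basic():
    assert apply_discount(100.0, 25) == 75.0


def test_apply_discount_zero_percent():
    assert apply_discount(49.99, 0) == 49.99


def test_apply_discount_rejects_invalid_percent():
    with pytest.raises(ValueError):
        apply_discount(10.0, 150)
```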

I really don’t see any usefulness or good intent in the art world though. Sooo much of those models has been put together through copyright theft of people’s work. Disney made a pretty good case against them, before deciding to team up for a shitty service feature.

It’s sad Clair Obscur lost that indie award, but hopefully the game dev world can take that as a bit of a lesson.

@[email protected]

If you acknowledge the problem with theft from artists, do you not acknowledge there’s a problem with theft from coders? Code intended to be fully open source with licenses requiring derivatives to be open source is now being served up for closed source uses at the press of a button with no acknowledgement.

For what it’s worth, I think AI would be much better in a post scarcity moneyless society, but so long as people need to be paid for their work I find it hard to use ethically. The time it might take individuals to do the things offloaded to AI might mean a company would need to hire an additional person if they were not using AI. If AI were not trained unethically then I’d view it as a productivity tool and so be it, but because it has stolen for its training data it’s hard for me to view it as a neutral tool.

@[email protected]

If the models are in fact reading code that’s GPL-licensed, I think that’s a fair concern. Lots of code on sites like Stack Overflow is shared with the default assumption that the author’s rights are not protected (that varies for some coding sites). That’s helpful if the whole point is for people to copy-paste those solutions into large enterprise apps, especially if there’s no feasible way to write it a different way.

The main reason I don’t pursue that issue is that with so much public documentation, it becomes very hard to prove what was generated from code theft. I’ve worked with AI models that were able to make fully functioning apps just off a project’s documentation, without even seeing examples.

@[email protected]

I don’t think training on all public information is super ethical regardless, but to the extent that others may support it, I understand that SO may be seen as fair game. To my knowledge though, all the big AIs I’m aware of have been trained on GitHub regardless of any individual project’s license.

It’s not about proving individual code theft, it’s about recognizing the model itself is built from theft. Just because an AI image output might not resemble any preexisting piece of art doesn’t mean it isn’t based on theft. Can I ask what you used that was trained on just a project’s documentation? Considering the amount of data usually needed for coherent output, I would be surprised if it did not need some additional data.

@[email protected]

The example I gave was more around “context” than “model” - data related to the question, not their learning history. I would ask the AI to design a system that interacts with XYZ, and it would be thoroughly confused and have no idea what to do. Then I would ask again, linking it to the project’s documentation page, as well as granting it explicit access to fetch relevant webpages, and it would give a detailed response. That suggests to me it’s only working off of the documentation.

That said, AIs are not strictly honest, so I think you have a point that the original model training may have grabbed data like that at some point regardless. If most AI models don’t track/cite the details of each source used for generation, be it artwork on DeviantArt or licensed GitHub repos, I think it’s fair to say any of those models should become legally liable; more so if there are ways of demonstrating “copying-like” actions from the original.

blaue_Fledermaus

I recently used one “agentic ‘AI’” to help write unit tests. I was surprisingly productive with it, but also felt very dirty afterwards.

@[email protected]

The entire problem with AI is literally a legal one. The entire moral outrage everyone has for it has only ever been traced back to legal arguments. Hell, even every philosophical argument being made all over the place still comes down to the legalities of it.

If you can find a single moral or philosophical argument that does not have a bias rooted in the law, then you might have a reason to feel dirty. But realistically, you only feel dirty because you’re being told to feel dirty by idiots all around you.

If you hold copyright in such high esteem that you feel disgraced and sullied for violating it even indirectly, then yeah, feel dirty. But I really doubt you hold the draconian laws of copyright to such a high moral standing as to let your self-worth be hurt by it.

But even still, beyond AI, every tool you use in your workflow is almost guaranteed to be built off the back of abuse, slave labor, theft, and exploitation at some level. If we threw away tools and progress just because they were built by assholes, we would have no tools at all.

Fight for better regulation and more care in the next step of advancement. But throwing away tools is just not realistic; we live in reality, unfortunately.

If the tool is genuinely useless to you then don’t use it. If it is genuinely useful then use it. If you can find a better tool then use that instead.

blaue_Fledermaus

The copyright thing doesn’t bother me much, but the absurdly inflated hype and pushiness from the companies does, and using it at this moment only feeds into it. Probably after the bubble bursts I won’t feel bad about using it.

Scrubbles

Don’t. I think it honestly has a place. Now that place is vastly different from what business bros think it is, but it does have a place. I think writing tests is a great reason, and it’s a good double check. Writing documentation is good, and even writing some boilerplate code and models. The kicker is that you need to already be an engineer to use it, and to understand what it’s doing. I would not trust it blindly, and I feel confident enough to catch its mistakes.

It’s another tool in our belt, it’s fine to use it that way. Management is insane though if they think you’ll 10x. Maybe 2x.

@[email protected]

I often use it in programming to either lay out the unit tests or do something repetitive like creating entities or DTOs from schemas. These are tasks I can do myself easily, but they’re boring and I will also make mistakes. I always have to check every single line and correct things, plus write one or two detailed prompts to make sure the correct pattern and style is followed. It saves me a lot of time, but it always tries to do more than it should: if it writes tests it will try to run them, then try to fix them, then try to change my code, which is annoying, and I always cancel all of that.
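
To be concrete about the repetitive part, here is a minimal sketch of the kind of DTO boilerplate I mean; the schema fields and class name are made up for illustration, and every generated line still gets checked against our style guide:

```python
from dataclasses import dataclass
from typing import Any

# Illustrative only: a DTO mapped from a hypothetical "customer" schema.
# The class and field names are assumptions, not from a real codebase.
@dataclass(frozen=True)
class CustomerDto:
    customer_id: int
    email: str
    display_name: str
    is_active: bool = True

    @classmethod
    def from_dict(cls, raw: dict[str, Any]) -> "CustomerDto":
        # Mechanical field-by-field mapping: exactly the kind of boring,
        # mistake-prone code I have the AI draft and then review line by line.
        return cls(
            customer_id=int(raw["customerId"]),
            email=str(raw["email"]),
            display_name=str(raw.get("displayName", "")),
            is_active=bool(raw.get("isActive", True)),
        )


if __name__ == "__main__":
    print(CustomerDto.from_dict({"customerId": 42, "email": "ada@example.com"}))
```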

I find AI art and creative writing boring; I only really see these things as tools for being more efficient where applicable, and you also have to know what you’re doing, just like with any other tool.

@[email protected]

create entities or DTOs from schemas

Surely there are deterministic tools to do this?

@[email protected]

There are, and I used to use them, but they aren’t error-free either, nor do they follow the style guides I need to adhere to, so it’s essentially the same outcome.

NoiseColor

I don’t know what you mean, but as a designer I can’t imagine my work without AI anymore. I get the same response from everybody I know in my line of work.

I don’t get banning it. At most, the ethical prudes limit themselves to the models that were legally trained. But I have no problem admitting I am not one of those.

@[email protected]

I still haven’t seen anything neat from any models that were certified as trained only on legally permitted content. That said, to my knowledge there are very few of that variety.

Training off the work of current artists serves to starve them by negating the chance that companies will hire them, and results in circumstances where AI trains off other AIs, producing terrible work and a complete lack of innovation.

People suggest a brilliant future where no one has to work and AI does everything, but the current generation of executives is so cut-throat and greedy to maximize revenue at the top that it will never happen without extreme, rapid political and commercial reform.

NoiseColor

Artists have always been starving. The future is such that if you can’t compete with AI, choose another profession where you can. That’s not something I want, but the world is changing and people have to change with it. That’s either with another profession or by voting in politicians who can redistribute the wealth back to them. There is no option where the progress stops, where the clock stops ticking.

@[email protected]

Many artists do starve, and many others succeed. Not sure what your point is, or why you want to shift the needle more in the former direction.

AI can’t compete with artists if artists aren’t generating content to feed the model. Even if the models could achieve consistent art, it would mean we get no new themes or ideas. People who would normally invent those new styles will start by repeating what already exists, and will be paid for that.

Many nations provide grants for art, because they recognize it’s a world that doesn’t always generate immediate, quantifiable monetary return, but in the long run proves valuable. The base expectation is that companies recognize that value and uniqueness in fostered talent as well, rather than the immediacy of AI prompts giving them “good enough” visuals.

NoiseColor

Artists are always starving because that’s how it’s always been. I don’t think it can be an argument for or against anything.

I’ve worked with AI image generation professionally, and I can say the models are not missing new ideas if the people using them aren’t. They are great for brainstorming new ideas. They can’t make a design, but they are a great tool for speeding up the process.

I love art. I go to galleries often. I don’t think AI can do that and never will be able to. Not true art, like capturing a moment in time with the original style of the artist and their life experience. I don’t think AI is a threat to that.

LOGIC💣

I saw an article about an artist who used AI just for overall composition, and who said that he couldn’t compete if he didn’t do this, because everyone in his field was doing it and it was significantly faster than what he used to do.

I suspect that when people say things like “AI cannot possibly help field X be more efficient like it does in field Y,” what they often really mean is, “I work in field Y and not field X.”

NoiseColor

He’s right. You have to use the tools at your disposal. It’s not only a matter of survival but also about streamlining your work process. Focusing on the main design decisions and letting the machine do at least some of the leg work when possible. It’s more pleasant like that.

I don’t mind people hating on AI. Everybody can choose not to use it as much as they want.

I fucking hate gen AI art and it has made my life more difficult in many ways… suddenly it infests shit in a way it shouldn’t

Seeing as how using genAI even during development is still rare enough that it makes the news, I can’t imagine it’s been as big of a problem for them as they make it seem. This sounds more like a smaller publisher taking a popular public stance for the PR.

@[email protected]

I’ve seen games in store listings that were obviously AI slop copying their entire game, Manor Lords (which is awesome, btw).
