main account for [email protected] because it’s down so often
My point is that this is a larger problem with capitalism, and just supporting companies that don't do this doesn't always work. The gaming industry is getting worse, and there will be fewer and fewer devs that actually don't do this, and fewer and fewer devs that are fiscally able not to. I have seen it happen time and time again.
You're not understanding what I'm saying. This is them testing the waters. All companies that are publicly traded will continue to do whatever they can to make a profit, consumer be damned. I don't buy AAA games, but that doesn't change the fact that changes like this will propagate across the gaming industry and make it worse as a whole. The issue here is them trying to turn a profit for shareholders no matter what. So many of these companies prioritize making a profit while they happen to release games, not vice versa. That is a result of the late-stage, unregulated capitalism we have in America, and those same tendencies have propagated to the global economy.
Don't be so naive as to say "just don't buy it bro." I won't. It doesn't really affect me that much RIGHT NOW. But it might: the more they're legally allowed to do and the more they get away with, the more everyone will do it. It happened with DLCs, then releasing unfinished games, then in-game purchases in games you already have to buy, etc. God forbid I don't want the entirety of AAA gaming to be a pile of shit. It's time for lawmakers to step in, in this industry and most others, and put limits on what companies can do to make a profit. And this goddamn law that companies must do everything they can to turn a profit for their shareholders must be done away with.
Ok, but a lot of the example images shown here that people are saying are AI images have clearly been edited. People are basically saying something looks off with the faces.
The other people just happen to have a feature that's in a slightly unusual place.
To me it seems like people are conflating something feeling off with the image being AI generated.
Yeah, I'm talking about Nvidia and Intel here, though tbh Ryzen 4000 CPUs run pretty hot too. But AMD also optimized Ryzen quite a bit before they changed to this new chipset, which makes sense to me. Seems like Nvidia and Intel sometimes worry more about what looks good power-wise on paper than about optimization.
As far as the 2080 goes, like I said, it was big FOR THE TIME and power hungry FOR THE TIME. It's still reasonable, even by today's standards.
As far as the last two gens go, the 3000 and 4000 series are known to draw more than their rated power requirements. For min recommended PSU wattage, the 3080 was 50 W more than the 2080 (750 W), and the 4080 was 100 W more than that (850 W).
To add to that, both of these gens of cards, when doing graphics-intensive things like gaming, can overdraw power and have been known to cause hard shutdowns even in PCs with PSUs rated slightly above their min rec. Before these last two gens you could get away with a slightly lower-rated PSU and sacrifice a little performance, but that's definitely no longer the case.
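The overdraw point above can be sketched as back-of-the-envelope math. This is just my illustration, not any vendor's sizing formula: `transient_factor` models those short spikes past the rated draw, and all the wattage numbers are made up for the example.

```python
# Rough PSU-headroom sketch. The transient_factor and headroom values are
# illustrative assumptions, not official specs from Nvidia or PSU vendors.
def min_psu(gpu_rated_w, rest_of_system_w, transient_factor=1.3, headroom=0.2):
    """Estimate a safe PSU rating when the GPU can spike above its rated draw."""
    peak_draw = gpu_rated_w * transient_factor + rest_of_system_w
    return peak_draw * (1 + headroom)

# A hypothetical 320 W card plus a 200 W rest-of-system:
print(round(min_psu(320, 200)))  # → 739
```

The takeaway is that sizing to the sticker TGP alone leaves no room for transients, which is exactly why a PSU "slightly above min rec" can still trip its over-current protection and hard-shutdown the machine.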
And sure, performance per watt is better on the 3080, but it also runs 10+ degrees hotter, and the 4000 series even more so.
I just hope the 5000 series goes the way of power-consumption refinement rather than smashing more chips onto a board or VRAM fuckery like with the 4060. I'd be happy with similar performance from the 5000 series if it were less power hungry.
I used Hamachi because no one in my group of friends aside from me knew how to port forward, but it didn't work on my network, and it took me 4 years to figure out it was because AT&T has its own network on its dial-up modems by default.
They still do that to this day with their fiber modem/routers! I hate it! And even if you do passthrough to get your own IP on just your router, your ping is still never below 23 ms because there are two stop points in the chain. That, and AT&T's DNS resolution is ass.
Damn internet oligopolies.
We are reaching the limits of render technology with our current architectures. You'll find that most established practices in computer hardware/software/firmware started as a "cheat" or a weird innovation that began with using something in an ass-backwards way. Reducing the amount of data a GPU needs to render is a good way to get more out of both old and new hardware. It's not perfected yet, but the future of these features is very promising.
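To put a number on "reducing the amount of data a GPU needs to render": rendering internally at a lower resolution and then upscaling cuts the shaded pixel count roughly with the square of the scale factor. This is a back-of-the-envelope sketch with resolutions and a scale ratio I picked for illustration, not any specific upscaler's math.

```python
# Illustrative only: how many pixels get shaded at a given internal
# render scale, before upscaling back to the output resolution.
def shaded_pixels(width, height, render_scale):
    return int(width * render_scale) * int(height * render_scale)

native = shaded_pixels(3840, 2160, 1.0)    # 4K, rendered natively
scaled = shaded_pixels(3840, 2160, 0.667)  # assumed ~2/3 internal scale
print(f"{scaled / native:.0%} of native shading work")  # → 44%
```

That's less than half the shading work for the same output resolution, which is why these techniques can breathe new life into older cards too.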
Hell yeah