• 0 Posts
  • 15 Comments
Joined 1Y ago
Cake day: Aug 02, 2023

  • I can’t simultaneously play a third MMO (already got FFXI and FFXIV)
  • X4’s custom start lets me jump straight to the parts I want to play, whether that’s starting wars, flooding the market, dogfighting, etc.
  • My X4 save is a gzip file: no need to worry about latency after moving to another country, etc. (my EVE account is locked to a region halfway across the world)
  • I don’t have to wait for irl people to do something fun in X4
  • The gzipped save file is in XML format. If something breaks I can just pop it open and fix it by hand (see the sketch after this list)
  • X4 has a huge modding scene for whatever features you want
  • X4’s modding tools are super easy to learn: it’s all XML and Lua. Took me only 2 hours to figure out how to modify the UI from scratch.
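
To make the gzipped-XML point concrete, here’s a minimal Python sketch of cracking a save open, poking at it, and writing it back. The file name and the tag/attribute names are placeholders rather than the actual X4 schema, so treat it as the general idea, not a recipe:

```python
# Minimal sketch: X4 saves are just gzipped XML, so you can open one,
# look around, and write it back out. The path and the element I touch
# are placeholders -- adjust for your own save and whatever you broke.
import gzip
import xml.etree.ElementTree as ET

SAVE = "quicksave.xml.gz"  # hypothetical file under your X4 save folder

with gzip.open(SAVE, "rb") as f:
    tree = ET.parse(f)

root = tree.getroot()
print("root element:", root.tag)  # just to confirm it parsed

# Example fix: walk elements by tag and inspect/patch an attribute.
# (Tag and attribute names here are illustrative, not the real schema.)
for node in root.iter("component"):
    if node.get("class") == "player":
        print("found player component:", node.attrib)

# Re-gzip the repaired save.
with gzip.open("quicksave_fixed.xml.gz", "wb") as f:
    tree.write(f, encoding="utf-8", xml_declaration=True)
```

The decompressed save is just one big XML document, so plain text tools (grep, sed, a text editor) work on it too.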

Because it’s in a genre that has no good alternatives?

EVE is a spreadsheet simulator, Elite Dangerous is a space-truck simulator, NMS is all planets and no space, and Starfield is Starfield.

The only viable alternative I found was X4, and even that is slightly different from what Star Citizen promises (it’s more empire management than solo flying in the endgame, and the vanilla balance is questionable: you can “Luke Skywalker” a destroyer in a scout with pure dogfighting skill)


My personal complaints (despite enjoying the gameplay):

  1. Input lag. It’s negligible compared to other games, but next to DDDA (Dragon’s Dogma: Dark Arisen) it feels much higher (“meh” vs “oh wow, this is smooth!”)

  2. FSR. There is definitely something wrong with the FSR implementation here, because there are minor traces of ghosting that are not present in other games. Rotate your character in the character selection screen, or look at a pillar with water as the backdrop with light rays nearby. That being said, it becomes less obvious during actual gameplay. I do hope that this will be fixed though.


Been playing it since release and I have to say I quite like it. The MTX is less intrusive than Dragon Age: Origins’ DLC (no mention in-game at all, versus “There’s a person bleeding out on the road; if you want to help him, please go to the store page”).

So far, the game is a buttery-smooth 60 fps at 4K max graphics + FSR3 w/o ray tracing, except inside the capital city (running a 7800X3D with a 7900 XTX). The only graphics complaint I have is that the FSR implementation is pretty bad, with small amounts of ghosting under certain lighting conditions. There’s also a noticeable amount of input lag compared to the first game: not game-breaking, but if you do a side-by-side comparison it’s pretty obvious.

Sure, the game has its issues, but right now it looks like something I enjoy. Games don’t need to be masterworks to be fun (my favorite games are some old niche JRPGs that were absolutely demolished by reviewers at the time), and right now I think it’s money well spent.


It doesn’t have to be turn-based. FFXI and FFXII are also great. I feel the bigger issue is that making a story-heavy game while everyone else is also making story-heavy games makes it no longer unique.

I wouldn’t mind going back to ATB, but I don’t think that would win back an audience except for nostalgia points.

Maybe more FF:T though? Kinda miss that.


You Is Into

Baba IS Money

Take The Breach

+

:)

Also, is anyone reminded of Final Fantasy Tactics by the small isometric maps?


Back in the 90s we had the Flash as well.

Somehow I still have that theme song stuck in my head…

And that scene where a brainwashed Flash destroys an entire row of parking meters…


When I said small I was referring to portable (kinda forgot the word), since hunts can be completed in 15 minutes or less. I think I would still prefer World though, probably because I did 300 Narwa hunts in one week before they fixed the “loot drop tables” bug.


The MH series always does one big (console) title and one small (mobile) title, in that order. Last gen, World was the big one and Rise was the small one.

This is probably gonna be the big one :)


The argument is that processing data physically “near” where it is stored (known as NDP, near-data processing, as opposed to traditional architectures where data sits off-chip) is more power-efficient and lower-latency for a variety of reasons (interconnect complexity, pin density, lane charge rate, etc.). Someone came up with a design that can do complex computations much faster than before using NDP.

Personally, I’d say traditional computer architecture is not going anywhere, for two reasons. First, these esoteric new architecture ideas such as NDP, SIMD (probably not esoteric anymore: GPUs and vector instructions both do this), and in-network processing (where your network interface does compute) are notoriously hard to work with. It takes an MS-in-CS level of understanding of the architecture to write a program in the P4 language (which doesn’t allow loops, recursion, etc.). No matter how fast your fancy new architecture is, it’s worthless if most programmers on the job market can’t work with it. Second, there are too many foundational tools and applications that rely on traditional computer architecture. Nobody is going to port their 30-year-old stable MPI program to a new architecture every 3 years; it’s just way too costly. People want to buy new hardware, install it, compile existing code, and see big numbers go up (or down, depending on which numbers).

I would say the future is a mostly von Neumann machine with some of these fancy new toys (GPUs, memory DIMMs with integrated co-processors, SmartNICs) as dedicated accelerators. Existing application code probably will not be modified. However, the underlying libraries will be able to detect these accelerators (e.g. GPUs, DMA engines, etc.) and offload supported computations to them automatically to save CPU cycles and power. Think your standard memcpy() running on a dedicated data mover on the memory DIMM if your computer supports it. This way, your standard 9-to-5 programmer can still work like they used to and leave the fancy performance optimization stuff to a few experts.
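
To illustrate that “libraries detect and offload” idea, here’s a rough Python sketch of the dispatch pattern. The dma_engine module and its copy() call are made up for illustration; the point is only that application code calls a plain function and the library quietly falls back to the CPU when no accelerator is present:

```python
# Rough sketch of transparent offload: the application calls an ordinary
# library function, and the library decides whether a dedicated engine is
# available. `dma_engine` is hypothetical, not a real driver binding.
try:
    import dma_engine  # hypothetical binding to an on-DIMM data mover
    _HAVE_DMA = True
except ImportError:
    _HAVE_DMA = False

def copy_buffer(dst: bytearray, src: bytes) -> None:
    """Library-level copy: offload if an accelerator exists, else use the CPU."""
    if _HAVE_DMA:
        # Hand the copy to the accelerator and free up CPU cycles.
        dma_engine.copy(dst, src)  # hypothetical call
    else:
        # Plain CPU fallback -- what memcpy() does for you today.
        dst[:len(src)] = src

# Application code looks the same either way:
buf = bytearray(16)
copy_buffer(buf, b"hello, offload!")
print(buf)
```

The same pattern scales down to C: a library wrapper probes for the engine at init time and falls back to plain memcpy() when it isn’t there.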


And that’s fine. Plenty of authors are great at writing the journey and terrible at writing endings. And from what we’ve gotten so far at least he now knows what not to do when writing an ending.


Oh great, a human failed the Turing Test…


Am I the only idiot that read X as X11 then realized it was referring to Twitter?


The problem is that hardware has come a long way and is now much harder to understand.

Back in the old days you had consoles with custom MIPS processors, usually augmented with special vector ops, and that was it. No out-of-order memory access, no DMA management, no GPU offloading, etc.

These days, you have all of that on x86, plus branch predictors, complex cache hierarchies with various on-chip interconnects, etc. It’s gotten so bad that most CS undergrad degrees only teach a simplified subset of actual computer architecture. How many people actually write optimized inline assembly these days? You need to be a crazy hacker to pull off what game devs in the ’80s and ’90s used to do. And crazy hackers aren’t in the game industry anymore; they get paid way better working on high-performance simulation software, networking, or embedded programming.

Are there still old-fashioned hackers who make games? Yes, but you’ll want to look at the modding scene. People have been modifying Java bytecode / MS CIL to patch compiled functions for ages, and a lot of it is extremely technically impressive (e.g. splicing a function in real time). It’s just that none of the devs who can do this want to do it for a living on AAA titles; they do it as a hobby through modding instead.


If it helps you avoid users, it’s a plus.

I’d take deciphering the Rosetta code over that any day.