Any benefits from x64 version?

So I found the subdirectory with the x64 Windows tiles builds. Does it have any benefits? Does CDDA ever run out of memory, or is there some other reason to use it?

I use it. I honestly can't say if it's helping or not, so I'll be curious to see what turns up here. I could see getting an out-of-memory error on lower-end systems after exploring a lot of the map without saving.

I also use x64.

Can't say if it helps, though :(.
Would also love to hear the benefits for CDDA :slight_smile:

64-bit also provides better performance, at least in theory, provided the hardware (CPU) supports 64-bit software. My understanding is that 64-bit CPUs have higher data throughput than 32-bit CPUs. The game does choke sometimes, especially during explosions or building collapses, and if you like to play with a very high number of zombies, you might see a difference in performance between the 32-bit and 64-bit versions. Your turns might take half a second instead of 3 seconds! :stuck_out_tongue: I can't really testify to the performance differences, though, as I've only played 64-bit versions.

The 64-bit design decision could also be a safety measure against future unknowns - just to be sure the game isn't held back by 32-bit limitations later on. Call it foresight. Who knows what sort of memory footprint CDDA will have 10 years from now? 20 years? You wouldn't want to start writing 64-bit versions of advanced 32-bit software far down the road, although I have no idea how challenging or laborious that 32-bit-to-64-bit transition is.

I doubt the game’s gonna run out of memory. On my PC the game’s memory load has stayed at around 350MB (the few times I have bothered to glance at it).

There's really no reason NOT to go with a 64-bit software design these days, right? Providing 32-bit versions is mostly a courtesy at this point, since 32-bit OSs can't run 64-bit software while 64-bit OSs CAN run 32-bit software, and mainstream 32-bit OSs are barely produced anymore. 32-bit is legacy now… right? On the other hand, 32-bit CPUs are going to be around for a long time, so there will always be some demand for 32-bit OSs and software, because the CPU isn't usually the first component to go bad in a computer.

Setting that aside, look at Dwarf Fortress as a cautionary example. It's a single-threaded application, meaning it doesn't benefit from multiple CPU cores - though right now I'm not sure multicore CPUs even existed when Dwarf Fortress launched. My understanding is that you don't just convert a single-threaded application into a multi-threaded one overnight. And now people have complained for a few years about how DF stutters and stalls on their machines, especially with large, old fortresses. They call it "the FPS death". And the developer, one man, is understandably reluctant to rewrite the whole game… probably waiting for an easy way out, some future tech to save the day, while being fully aware that the further he develops the game, the more laborious a rewrite is going to be.

It more or less means you always gotta start a software project with the tech that provides the biggest room for expansion.

Edit:
Hasty look-up:
The first consumer dual-cores came out in 2005 or later, depending on how you count: Pentium D, Athlon 64 X2.
Dwarf Fortress's development started in 2002. The alpha was released in 2006.

Important note: multi-threading != 64-bit; they're completely different things. Going to 64-bit provides some small performance enhancements, as well as allowing for increased memory usage (which can make a difference in something like C:DDA, where we can have a huge number of items and monsters in memory at the same time, especially with things like Z-levels coming onto the scene). On the other hand, a game like C:DDA, where everything pretty much runs in lockstep with the player's turns, has very little that can take advantage of multithreading. Some notable things that might be multithreaded are rot calculations and updating items/fields/map tiles, as well as maybe vision processing (see the sketch below). Beyond that there is very little that can actually be multithreaded for any sort of real gain, because all of the logic is locked to the current state of the game and can only be updated for it.
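To illustrate why rot is the kind of work that parallelizes well, here's a minimal C++ sketch. It is not actual C:DDA code - the `item` struct, `rot_rate` field, and `process_rot_parallel` function are made up for the example - but it shows the pattern: each item's update depends only on that item, so the list can be split across threads and joined before the rest of the turn runs.

```cpp
// Hypothetical sketch (not real C:DDA code): advance rot for many items in
// parallel. Each item's rot depends only on that item and the elapsed turns,
// so chunks of the list can be processed independently without conflicts.
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

struct item {
    double rot      = 0.0;  // accumulated rot, arbitrary units
    double rot_rate = 1.0;  // per-turn rot speed (temperature, container, ...)
};

// Advance rot for items in [begin, end) by the given number of turns.
static void process_rot_range(std::vector<item> &items, std::size_t begin,
                              std::size_t end, int turns) {
    for (std::size_t i = begin; i < end; ++i) {
        items[i].rot += items[i].rot_rate * turns;
    }
}

// Split the item list across hardware threads, then join before the game
// state is read again, so the rest of the turn still runs in lockstep.
void process_rot_parallel(std::vector<item> &items, int turns) {
    const std::size_t n_threads =
        std::max<std::size_t>(1, std::thread::hardware_concurrency());
    const std::size_t chunk = (items.size() + n_threads - 1) / n_threads;

    std::vector<std::thread> workers;
    for (std::size_t t = 0; t < n_threads; ++t) {
        const std::size_t begin = t * chunk;
        const std::size_t end = std::min(items.size(), begin + chunk);
        if (begin >= end) {
            break;
        }
        workers.emplace_back(process_rot_range, std::ref(items), begin, end, turns);
    }
    for (std::thread &w : workers) {
        w.join();
    }
}
```

The join at the end is the important part: the workers only run inside this one step, which is why the overall turn structure stays lockstep and the gains stay limited to these isolated chunks of work.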

Games like DF can get a bit more out of multithreading, mainly with things like heat calculations, liquid flows, and pathfinding, but even then the gains are fairly limited. Where multithreading really shines is in more advanced games that need to run several AIs in real time alongside the player, handle all of their inputs and effects, and manage syncing with a multiplayer server, because it ensures that if any one part of the game is running a bit slow, it doesn't bog down the other parts.

So to sum things up: multithreading requires the right kinds of problems to see any advantage, but for those particular problems it can lead to very large speed-ups, as well as letting different parts of the problem keep working at full speed even if another part is slow (such as still taking player input at full speed even if you have a slow internet connection), which makes it much better for real-time and multiplayer games. What 64-bit does is allow larger chunks of memory to be addressed and handled at once, which can provide big speed-ups in cases where you need to pass around large blocks of memory, along with raising the overall cap on memory use. That is very useful for something like C:DDA, where we pass around very large arrays and chunks of memory as different parts of the map are loaded, processed, and saved.
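If you want to see the address-space difference concretely, here's a minimal sketch, assuming a GCC/Clang toolchain where you can build the same file with and without -m32; the numbers in the comments are typical, not measured from CDDA itself.

```cpp
// Hypothetical sketch: the practical difference between a 32-bit and a
// 64-bit build is the width of pointers and sizes, and therefore how much
// memory one process can address.
#include <cstddef>
#include <cstdio>
#include <limits>

int main() {
    std::printf("pointer size    : %zu bytes\n", sizeof(void *));
    std::printf("size_t size     : %zu bytes\n", sizeof(std::size_t));
    std::printf("max size_t      : %zu\n",
                std::numeric_limits<std::size_t>::max());
    // Typical 32-bit build: 4 / 4 / ~4 billion, i.e. a ~4 GB address space
    // (and a 32-bit Windows process usually only gets 2-3 GB of it).
    // Typical 64-bit build: 8 / 8 / astronomically large, so loaded map data
    // is limited by physical RAM rather than the address space.
    return 0;
}
```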

In general if you have a 64-bit computer there is no reason why you shouldn’t be using the 64-bit version; it will provide some small benefits in certain areas without any sort of penalty to you.

There is one detriment to using a 64-bit build, which is that pointers double in size, increasing memory pressure.
However, DDA doesn't use enough memory for that to be a real issue in most cases, and AFAIK it's not particularly pointer-heavy (most of the memory is full of huge data objects, not lots of pointers to small objects).
The 64-bit build also enables some more recent CPU instructions as a side effect of the fact that all 64-bit CPUs have them (SSE2, for example, is guaranteed on x86-64); I suspect this is where any performance benefits are coming from.
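A small sketch of the pointer-size point, with made-up structs (these are not CDDA's actual types): a node built out of pointers roughly doubles in size between a 32-bit and a 64-bit build, while a tile-like struct made of plain data stays about the same, which is why the doubled pointers don't hurt much here.

```cpp
// Hypothetical sketch: pointer-heavy vs data-heavy memory layout.
#include <cstdio>

struct pointer_heavy_node {
    pointer_heavy_node *next;
    pointer_heavy_node *prev;
    void *payload;            // three pointers: 12 bytes on 32-bit, 24 on 64-bit
};

struct data_heavy_tile {
    int terrain_id;
    int furniture_id;
    unsigned char fields[16];
    short temperature;        // mostly plain data: same size on either build
};

int main() {
    std::printf("pointer-heavy node: %zu bytes\n", sizeof(pointer_heavy_node));
    std::printf("data-heavy tile   : %zu bytes\n", sizeof(data_heavy_tile));
    return 0;
}
```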