New Development: ModHelper and Utilities

jlf65
Posts: 1285
Joined: Wed Aug 10, 2016 9:10 pm

Very handy to help modders

Post by jlf65 » Fri Aug 18, 2017 3:28 pm

Very handy to help modders optimize their mods... or gamers to see which mods are overloading their system. I could use something like that myself. There are still places and times where my frame rate drops below 20, and I'd like an idea of what the bottleneck is.



Mystical Panda
Posts: 750
Joined: Wed Mar 30, 2016 2:02 pm

Here's a cleaned up version,

Post by Mystical Panda » Fri Aug 18, 2017 5:48 pm

Here's a cleaned up version, showing peaks and averages:


[img]https://taleoftwowastelands.com/sites/default/files/Mystical%20Panda/cleanedupstatistics.png[/img]
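For reference, peak and average values like these can be computed from the raw samples in a fairly straightforward way. A minimal sketch (using psutil and a fixed 1-second interval, which aren't necessarily what the tool itself uses):

[code]
# Sketch: sample total CPU % once per second and report peak/average,
# similar in spirit to the graph above. Uses psutil purely as an
# illustration; the tool's own sampling may differ.
import psutil

samples = []
for _ in range(60):                          # one minute at 1-second intervals
    samples.append(psutil.cpu_percent(interval=1.0))

peak = max(samples)
average = sum(samples) / len(samples)
print(f"peak: {peak:.1f}%  average: {average:.1f}%")
[/code]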



Mystical Panda
Posts: 750
Joined: Wed Mar 30, 2016 2:02 pm

Checking out the samples it

Post by Mystical Panda » Fri Aug 18, 2017 8:08 pm

Checking out the samples, it looks like CPU time stays at about half for most of the playthrough, with memory steadily climbing. I'm noticing a drop in FPS from 60; it's either:


1) The regular 7200 RPM HDD slowing things down while streaming in textures.


2) The GPU maxing out (not quite sure how to programmatically get metrics on that just yet).


3) Since the CPU time is totaled for all cores, a single core might be 'choking up'. Though I'm kind of leaning towards the first two. (One way to check per-core load is sketched after this list.)
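Here's a rough way to check number 3 from outside the game: per-core usage can be sampled separately, so a single saturated core stands out even when the total across all cores looks moderate. Just a sketch using psutil, not necessarily how the utility itself samples:

[code]
# Sketch: sample per-core CPU usage so one pegged core is visible even
# when the all-core total looks fine. Assumes psutil is installed;
# an illustration only, not the tool's actual code.
import psutil

for _ in range(10):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    hot = max(per_core)
    print(" ".join(f"{p:5.1f}" for p in per_core), f"| hottest core: {hot:.1f}%")
[/code]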



jlf65
Posts: 1285
Joined: Wed Aug 10, 2016 9:10 pm

It wouldn't surprise me if it

Post by jlf65 » Fri Aug 18, 2017 9:29 pm

It wouldn't surprise me if it were 3. You have to set various functions to use threads via the ini file, after all. I'm quite certain you'll find one thread is probably close to 100% all the time, while most of the others fluctuate considerably between 0 and some upper value that's probably not all that high.



Mystical Panda
Posts: 750
Joined: Wed Mar 30, 2016 2:02 pm

jlf65 wrote:

Post by Mystical Panda » Sat Aug 19, 2017 6:26 am

[quote=jlf65]
It wouldn't surprise me if it were 3. You have to set various functions to use threads via the ini file, after all. I'm quite certain you'll find one thread is probably close to 100% all the time, while most of the others fluctuate considerably between 0 and some upper value that's probably not all that high.
[/quote]


That might explain some of it, considering there are over 40 individual threads running (which could be anything from system DLLs, I'd imagine, to actual game code). Let me tinker around a bit more and see what I can come up with.


Speaking of the 4GB memory we were talking about earlier: without ENBBoost, I get an "Out of Memory" error when using really high-res texture packs, but not when running the game with just mods. I'm guessing that, like XP (using the same DX model), the game either 'shadows' textures in system memory and/or keeps track of where textures are located (it needs pointers to the data). If system memory were the issue (game data, hashes, tables, etc.), ENBBoost would have no effect, since it only handles graphics memory, not the game's internal 'private' data; I'd still get an "Out of Memory" error from the game's own data, which I'm not. I'm still leaning towards the idea that the 4GB patch not only expands system memory, but is mostly used to handle textures larger than what the game itself was coded for.


I'll try an unpatched game with ENBBoost and see what happens; I'd imagine I might have problems if I can't work around the 2GB limit, especially since the heap is now almost 512MB by itself (a quarter of all available resources).
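For what it's worth, the 4GB patch works by setting the IMAGE_FILE_LARGE_ADDRESS_AWARE flag in the exe's PE header, which is what lets a 32-bit process address up to 4GB on a 64-bit OS instead of 2GB. A quick sketch for checking whether a given exe already has the flag set (the path below is just a placeholder):

[code]
# Sketch: check whether a 32-bit exe is Large Address Aware (the flag the
# 4GB patch sets). Reads the PE/COFF Characteristics field directly.
# The exe path below is a placeholder, not a guaranteed install location.
import struct

IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020

def is_large_address_aware(exe_path):
    with open(exe_path, "rb") as f:
        f.seek(0x3C)                                  # e_lfanew: offset of the PE header
        pe_offset = struct.unpack("<I", f.read(4))[0]
        f.seek(pe_offset)
        if f.read(4) != b"PE\x00\x00":
            raise ValueError("not a PE file")
        f.seek(pe_offset + 22)                        # COFF Characteristics field
        characteristics = struct.unpack("<H", f.read(2))[0]
    return bool(characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)

print(is_large_address_aware(r"C:\Games\FalloutNV\FalloutNV.exe"))
[/code]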



Mystical Panda
Posts: 750
Joined: Wed Mar 30, 2016 2:02 pm

Here's another interesting

Post by Mystical Panda » Sat Aug 19, 2017 9:16 am

Here's another interesting sample, made at 1-second intervals. You can see in the processor and memory data where the game first started; the 'lull' is where I was at the main menu (coffee-brewing time). Then, as the game loaded and I went into combat, the CPU was OK (at least from a 'total' standpoint), but the memory hit its ceiling (I raised the virtual headroom in the graph from 3.5GB to 4GB so I could see where it peaked). This is right at the 3.5-3.8GB boundary of the virtual address space available to a 32-bit app, though it didn't go over.


I'm wondering if the engine, sensing that the memory limit is about to be breached, unloads whatever game data it can (causing bodies to disappear prematurely) in an attempt to avoid hitting the ceiling; when the burst is bigger than what can be unloaded, you get "Out of Memory". This might be one contributor to an FPS hit, since it would come as an 'emergency' measure rather than something balanced into the game's core design.


Since ENBBoost was running at the time and presumably handling textures, it's safe to assume that without the 4GB patch the game would've bombed on excessive game data alone. It's still an interesting number to look at, since it implies that any mod that adds more data could push it over the limit, or that the game will just add it to its ever-increasing load. At the moment it's hard to say.
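One way to watch for that ceiling from outside the game is to poll the process's virtual memory size once a second and flag when it gets close to the ~3.8GB mark. A rough sketch only; the process name and threshold here are assumptions:

[code]
# Sketch: poll the game process's virtual memory size once per second and
# warn as it approaches the 32-bit LAA ceiling. The process name and the
# 3.8 GB threshold are assumptions for illustration.
import time
import psutil

THRESHOLD = 3.8 * 1024**3   # ~3.8 GB

def find_process(name="FalloutNV.exe"):
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and proc.info["name"].lower() == name.lower():
            return proc
    return None

proc = find_process()
while proc and proc.is_running():
    vms = proc.memory_info().vms
    flag = "  <-- near 32-bit ceiling" if vms > THRESHOLD else ""
    print(f"{time.strftime('%H:%M:%S')}  VM size: {vms / 1024**2:8.1f} MB{flag}")
    time.sleep(1)
[/code]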



Mystical Panda
Posts: 750
Joined: Wed Mar 30, 2016 2:02 pm

This one looks more like I'd

Post by Mystical Panda » Sat Aug 19, 2017 2:17 pm

This one looks more like I'd expect: CPU peaks corresponding very closely to disk reads, and memory peaking around half the maximum while still averaging less than half. The 'lull' once again is waiting at the main menu. Sampling intervals are 1s. There's a big contrast between this and the previous one memory-wise. I'm omitting the 150+% drive read, since I'm not sure why it was that high, unless it's because I checked the LogicalDisk counter rather than PhysicalDisk for the value; it seems to be the only sample in the lot that goes that high.
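As a side note, readings over 100% can fall straight out of how the 'busy time' math works: the counters accumulate time spent servicing reads and writes, and with overlapping queued I/O the accumulated time over an interval can exceed the interval itself. A rough sketch of that calculation using psutil's per-disk counters (not the exact Windows counter the tool reads):

[code]
# Sketch: compute a disk "busy %" the way it can exceed 100% -- by summing
# accumulated read/write time over a sampling interval. With overlapping
# queued I/O the summed time can be larger than the wall-clock interval.
import time
import psutil

INTERVAL = 1.0  # seconds

before = psutil.disk_io_counters(perdisk=True)
time.sleep(INTERVAL)
after = psutil.disk_io_counters(perdisk=True)

for disk in after:
    busy_ms = (after[disk].read_time - before[disk].read_time) + \
              (after[disk].write_time - before[disk].write_time)
    print(f"{disk}: {100.0 * busy_ms / (INTERVAL * 1000):.1f}% busy")
[/code]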


I'll need to start a new game and hit some of the 'warzones' fresh and see what the numbers show. Most have been cleared by now and are awaiting a cell reset.



jlf65
Posts: 1285
Joined: Wed Aug 10, 2016 9:10 pm

You're probably right about

Post by jlf65 » Sat Aug 19, 2017 4:38 pm

You're probably right about the textures and system memory. It would make sense that if there's not enough VRAM, DX will buffer what it can in main memory, which for 32-bit apps would be in their 4GB space. Unless your textures never exceed the size of the VRAM, you should always use ENB to keep the textures out of the app's RAM.


It might make for an interesting test to at least once try saving all the thread cpu time samples individually... look for any threads that are maxed out. It would make sense that there's a lot of activity around loads - gotta process the level data to find what to load, then process the BSAs and loose files to find the data. Possibly decompress it if it's compressed.
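Something along these lines could grab those per-thread samples: cumulative user/system time per thread can be snapshotted twice, and a pegged thread shows up as one whose delta approaches the sampling window. A sketch only, with the process name assumed:

[code]
# Sketch: snapshot per-thread CPU times twice and report which threads
# accumulated the most CPU time in between -- a pegged thread shows up as
# one whose delta approaches the sampling interval. Process name assumed.
import time
import psutil

def thread_times(proc):
    return {t.id: t.user_time + t.system_time for t in proc.threads()}

proc = next(p for p in psutil.process_iter(["name"])
            if (p.info["name"] or "").lower() == "falloutnv.exe")

first = thread_times(proc)
time.sleep(5)
second = thread_times(proc)

deltas = {tid: second[tid] - first.get(tid, 0.0) for tid in second}
for tid, used in sorted(deltas.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"thread {tid}: {used:.2f}s CPU over 5s")
[/code]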



Mystical Panda
Posts: 750
Joined: Wed Mar 30, 2016 2:02 pm

Here's a quick video of what

Post by Mystical Panda » Mon Aug 21, 2017 2:00 pm

Here's a quick video of what I have so far.


I'm not totally sure yet how to get individual core utilizations, so I added an overlay on top of the total CPU and total drive usage representing the respective 'queue lengths' for each. Red bars are where the sample is beyond its 'normal' bounds (for example, drive usage > 100%, which I read can happen; no idea why, but it must be the way MS calculates/samples it).


In the CPU usage, the overlay is scaled to total CPUs * 2. If it's correct, you can see the spikes where threads were being queued (waiting to be processed), and where the graph spiked red (the number of threads waiting is higher than cpuCount * 2). According to what I've read, the rule of thumb is queue > CPUs * 2 = CPU bottleneck. I'm guessing the threads that are queued are the ones using the first or second core (the game and potentially 'other' processes like ENB, ReShade, or even the OS).
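That rule of thumb can be checked directly against the Windows "\System\Processor Queue Length" counter. A hedged sketch using pywin32's PDH wrapper (assuming pywin32 is installed and the English counter names):

[code]
# Sketch: read the Windows "\System\Processor Queue Length" counter and
# apply the queue > (logical CPUs * 2) rule of thumb for a CPU bottleneck.
# Assumes pywin32 is installed; counter path is the standard English one.
import os
import time
import win32pdh

query = win32pdh.OpenQuery()
counter = win32pdh.AddCounter(query, r"\System\Processor Queue Length")

cpu_count = os.cpu_count()
for _ in range(10):
    win32pdh.CollectQueryData(query)
    _, queue_len = win32pdh.GetFormattedCounterValue(counter, win32pdh.PDH_FMT_LONG)
    verdict = "possible CPU bottleneck" if queue_len > cpu_count * 2 else "ok"
    print(f"queue length: {queue_len}  ({verdict})")
    time.sleep(1)

win32pdh.CloseQuery(query)
[/code]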


The drive usage doesn't appear to have any overlay data (fuchsia in color), so it seems the drive, even though it's a regular old 7200 RPM, doesn't have any queued read/write requests. It could still affect FPS when streaming in necessary data from the HDD, though.


Values are calculated based on the samples shown, and I added a 'time frame' to each tab's message area (at the bottom of the tab) showing the relevant times the sampling was taken.



Mystical Panda
Posts: 750
Joined: Wed Mar 30, 2016 2:02 pm

A quick note: Interestingly,

Post by Mystical Panda » Mon Aug 21, 2017 2:03 pm

A quick note: interestingly, the 'lowest' processor use is where I was waiting at the main menu, but notice that the HDD use didn't rise after loading my saved game. I'm guessing this is because I've enabled loading data from the hard drive cache, so an HDD read wasn't necessary.


