A big project with potentially big performance improvements.

General discussion about Train Simulator, your thoughts, questions, news and views!

Moderator: Moderators

deltic009
Very Active Forum Member
Posts: 4017
Joined: Fri Nov 27, 2009 1:06 am

Re: A big project with potentially big performance improvements.

Post by deltic009 »

gptech wrote:Assets not used in a route aren't pre-LOADED as such; pre-REFERENCED so they show as available, yes. Have you determined the RAM usage of the route without any scenarios in the World Editor, and compared that with the Scenery tiles removed? Has anybody investigated the possibility of a single *less than optimal* asset being at the heart of the issue? Probably more likely than a fault in the TS core, which manages to run other routes with little bother.
I haven't checked the RAM usage in the World Editor or compared it with the Scenery tiles removed (though I assume renaming the folder achieves the same). I don't know where I would even start with identifying a single asset causing the issue, as there's been no consistency to where on the route, or when, it has crashed out, although most often in the Scenario Editor it has been when attempting to load stock that is going to be placed there.

If you have any tips or advice on how to achieve any of those suggestions, I'd be glad to hear them.
Matthew Wilson, development team at Vulcan Productions

http://www.vulcanproductions.co.uk
https://www.facebook.com/VulcanFoundry/
deltic009
Very Active Forum Member
Posts: 4017
Joined: Fri Nov 27, 2009 1:06 am

Re: A big project with potentially big performance improvements.

Post by deltic009 »

I feel that my experiment with the SVR has reached some form of conclusion for now; a backup has been taken in case I fancy further inroads. All items in Tracks.bin and the Scenery tiles have had their blueprints duplicated and moved to a custom Provider and Product folder. My motivation for this was the stability of the route and its loading times. There was more than one asset provider where, due either to the SVR or to others, the relevant Product folder gave access to over a thousand assets when in reality the SVR needed fewer than 20 of them. RW Tools has been invaluable throughout, as it is in many tasks, but what of the results?
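For anyone wanting to do a similar audit on their own route, the gist of the approach can be sketched in a few lines. This is only an illustration: it assumes the serialized .bin tiles have already been exported to XML (e.g. via RW Tools or serz.exe), and the simplified Provider/Product/BlueprintID element layout below is an assumption, not the exact RailWorks schema.

```python
import re
from collections import defaultdict

def referenced_assets(tile_xml):
    """Collect (provider, product, blueprint) triples referenced by one tile.

    The element names here are a simplified stand-in for the real
    exported-XML structure, which is more deeply nested.
    """
    pattern = re.compile(
        r"<Provider>(.*?)</Provider>\s*"
        r"<Product>(.*?)</Product>\s*"
        r"<BlueprintID>(.*?)</BlueprintID>",
        re.S,
    )
    return set(pattern.findall(tile_xml))

def audit(tiles, provider_inventory):
    """For each (provider, product), report (assets needed, assets exposed).

    A large gap between the two numbers is the 'over a thousand exposed,
    under 20 needed' situation described above.
    """
    used = defaultdict(set)
    for tile_xml in tiles:
        for prov, prod, blueprint in referenced_assets(tile_xml):
            used[(prov, prod)].add(blueprint)
    return {
        key: (len(used.get(key, set())), len(available))
        for key, available in provider_inventory.items()
    }
```

Fed a list of exported tile files and a directory listing per Provider/Product pair, this would show which pairs are worth duplicating into a slimmed-down custom provider.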

I had a detailed scenario with realistic consists for the SVR and a decent amount of static rolling stock in the correct places. Before I did anything to the route I had to turn Scenery Quality down to one notch off the bottom, and I would still frequently experience crashes in the Scenario Editor while working. Now my Scenery Quality is set to one notch off the top and the route is a lot more stable. I have also replaced the 3D track with 2D; this didn't allow me to raise the setting, but it made the frame rate higher and increased stability further.

I haven't altered the assets on the loft or road tiles, because they have nested references to cross-section files and textures referenced by relative paths - a lot more work.

I deem the experiment a success, and you may be glad to know that I'm in communication with Steam Sounds Supreme with a view to this approach being adopted for the release candidate of the route, in the hope it will allow a lot more complex scenarios to be achieved.
Matthew Wilson, development team at Vulcan Productions

http://www.vulcanproductions.co.uk
https://www.facebook.com/VulcanFoundry/
smarty2
Very Active Forum Member
Posts: 9976
Joined: Sun Aug 21, 2005 8:16 am
Location: 1963, at Snow Hill!

Re: A big project with potentially big performance improvements.

Post by smarty2 »

I know that I am not as technical as you fellas and may be talking rubbish (as usual), and this may not even be relevant, but not having the bloat-inducing Freeware Packs (no offence) installed reduces the size of the Kuju folders by a considerable amount, so I assume that is going to lessen any strain on the system as it loads? Just a thought....
Best Regards
Martin (smarty2)
Non technically minded individual!

Is There A God?
Dudley Bible web page
deltic009
Very Active Forum Member
Posts: 4017
Joined: Fri Nov 27, 2009 1:06 am

Re: A big project with potentially big performance improvements.

Post by deltic009 »

The Severn Valley doesn't use any assets from the Kuju\RailSimulator folder, but it does use stuff from the UKTS freeware packs. I think it needs about 170 of their 4,000 assets, so I have still followed the same methodology for those.
Matthew Wilson, development team at Vulcan Productions

http://www.vulcanproductions.co.uk
https://www.facebook.com/VulcanFoundry/
peterfhayes
Very Active Forum Member
Posts: 2155
Joined: Mon Sep 26, 2011 5:07 am

Re: A big project with potentially big performance improvements.

Post by peterfhayes »

There is a lot of misinformation wrt Physical RAM usage. I doubt very much that any crash described here was due to Physical RAM usage or overload.
If it were a Physical RAM issue, Windows would warn you and ask you to turn down your settings, and if you didn't, Windows would crash with a BSOD (black or blue). TS can address up to 4GB of Physical RAM without any dramas, and any excess can be cached in the paging file.
The reason that crashes occur when using the editor is the Virtual Address Space, VAS (or Process Address Space). This has nothing to do with Physical RAM!
VAS: every program that runs under Windows has to be loaded into its own virtual address space, so that when the program crashes it does not bring down Windows as well (as it used to in earlier Windows versions).
This VAS has nothing to do with how much Physical RAM is on board: for a 32-bit app like TS2017 it is a MAXIMUM of 4GB. That would still be true whether 1GB or 192GB of RAM was installed.

Now it gets complicated:
A 32-bit app (TS) running under 32-bit Windows gets 4GB of VAS, BUT 2GB of that is reserved for the system, leaving 2GB for the application; on top of that, the video card's VRAM also needs some of the VAS, leaving a small amount for TS. That is why a 32-bit OS is not good for TS.

If you use a 64-bit OS, TS still gets 4GB, but this time it gets all 4GB, as 64-bit Windows has up to 8 TERABYTES of VAS to utilise.
Now, as TS loads and plays (like any 32-bit app), the VAS can fragment and start to lose contiguous space for TS to load into. Further, as you edit in TS, the VAS becomes more and more fragmented and there is less contiguous space for TS to load into for the editing process; it only needs as little as 1MB of VAS to be unavailable at a specific address and TS will crash. Saving TS continually during editing will clear the VAS and help prevent crashes.
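The fragmentation point is easier to see with a toy model. The sketch below is not how Windows actually manages the VAS; it is just an illustration of the arithmetic: a fixed-size address space can have gigabytes free in total while no single contiguous run is large enough for the next allocation.

```python
def free_space_profile(size, allocations):
    """Return (total_free, largest_contiguous_free) for a toy address space.

    `size` is the span of the address space in arbitrary units;
    `allocations` is a list of non-overlapping (start, length) blocks.
    Illustrative only: real VAS layout involves reservations, guard
    pages, mapped files, and so on.
    """
    total_free = size
    largest = 0
    cursor = 0
    for start, length in sorted(allocations):
        gap = start - cursor          # free run before this block
        largest = max(largest, gap)
        total_free -= length
        cursor = start + length
    largest = max(largest, size - cursor)  # free run after the last block
    return total_free, largest
```

With 64 tiny one-unit allocations sprinkled every 64 units across a 4096-unit space, 4032 units are free overall, yet the largest contiguous run is only 63 units: an allocation of 64 fails despite 98% of the space being free. That, scaled up, is the claim being made about the editor.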

Also, as the code becomes more intense you can experience so-called 'hard page faults' (not really faults!). What it means is that the Windows OS has "loaded" the TS code but has yet to "pass" it on to the CPU and Physical RAM. This will result in a 0xC0000005 error in Event Viewer. If the fault is serious enough, TS will crash.

We need to be clear that this is not a MEMORY (Physical RAM) issue; it is a Virtual Address Space (VAS) issue, and one that will have disappeared with TSW.

The fix? There isn't one, due to the 32-bit structure. However, I offer a couple of partial fixes that may or may NOT work.
Neither fix is 100% foolproof, and they may need registry hacks for optimal results. These fixes are primarily for Windows Server, but they do work in Windows 7 to 10.
Change the settings in the DeskTop Heap (Google) https://www.ibm.com/support/knowledgece ... inreg.html and https://social.technet.microsoft.com/Fo ... server8gen

Change the settings in the HeapDeCommitFreeBlockThreshold in the Registry https://technet.microsoft.com/en-us/lib ... g.65).aspx
What does this do?
This value specifies the number of freed bytes above which the heap manager de-commits the memory (instead of retaining and re-using it). If you set this registry key to a high value (for example, 262144), the heap manager keeps hold of freed blocks instead of de-committing them.
Therefore, virtual address fragmentation is lessened or even avoided.
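As a concrete illustration, here is a small sketch that builds the .reg fragment for that value. Treat the key path as an assumption to be verified against Microsoft's documentation (it is the Session Manager location usually cited for global heap parameters), back up the registry first, and apply any such file at your own risk.

```python
def heap_threshold_reg(threshold_bytes):
    """Build a .reg fragment setting HeapDeCommitFreeBlockThreshold.

    The Session Manager key path below is an assumption based on where
    global heap parameters are commonly documented to live; verify it
    before importing the file. The value is a DWORD in bytes, written
    in hex as .reg files require.
    """
    return (
        "Windows Registry Editor Version 5.00\n\n"
        "[HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Control\\Session Manager]\n"
        f'"HeapDeCommitFreeBlockThreshold"=dword:{threshold_bytes:08x}\n'
    )

# Sanity-check the figures quoted in this post:
# 262144 bytes = 256 KB = 0x00040000; 524288 bytes = 512 KB = 0x00080000.
```

Printing `heap_threshold_reg(524288)` gives a file containing `dword:00080000`, matching the 0x80000 figure mentioned below.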

Note: You may still get "OOM errors"/TS shutdowns particularly after TS2017 has been running/edited for some time and this would probably be due to the total depletion of the 4GB VAS.

Cautionary Note:
The reason I have hesitated about publishing this is that I can't be sure that the decimal value of 262144 (256 KB) is the optimal value for a 64-bit OS, and there may be a better figure for this version of Windows 7-10. I have now tried a figure of 524,288 (512 KB = 0x80000) and this works without incident, but it still may not be the optimum value.

This tweak is quite complex and should only be used by experienced simmers with extensive computer experience. I make no guarantees and it should be used at your own risk. I have not seen any issues on my system, but that is not to say that it won't impact adversely on your setup. It is not a universal panacea and it may not work in every situation.

However, the latter has more chance of "success", though it may need extensive experimenting to get the best value.
My advice: forget about RAM; it very rarely causes any issues unless it is faulty. RAM does NOT care whether your app is 32 or 64-bit; it is allocated automatically by the system.
pH
fireman44
Getting the hang of things now
Posts: 90
Joined: Tue Aug 27, 2013 8:00 am

Re: A big project with potentially big performance improvements.

Post by fireman44 »

peterfhayes wrote:Saving TS continually during editing will clear the VAS and help prevent crashes.
I would not argue with any of your technical points because they are above my head, but I would say from my own experience that just saving during editing solves nothing. TS will still crash when its usage exceeds a specific amount of RAM.

Saving and restarting is the solution to that issue, and you can monitor RAM usage alongside TS if you run it in a window and display the Windows Task Manager next to it. For me (and, certainly, many other people) the crisis point is always close to 3.5 GB, and if you save and restart before reaching that, no "unable to save to dump file" crashes arise. The same applies whilst playing the game.

Whether we are saying the same thing in a different language I do not know, but if we are not then I am saying with some certainty that the amount of RAM used by TS does have a significant effect on the game.
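The rule of thumb described here is simple enough to pin down in code. The figures below (a 3.5 GB crash point and a 0.25 GB safety margin, giving the ~3.25 GB save point) are the ones quoted in this thread, not values I have measured; in practice the live reading would come from Task Manager, or programmatically from something like the third-party psutil library.

```python
GIB = 1024 ** 3  # one gibibyte in bytes

def should_save_and_restart(working_set_bytes,
                            crash_point=3.5 * GIB,
                            safety_margin=0.25 * GIB):
    """Apply the 'save and restart at ~3.25 GB' rule of thumb.

    Returns True once the process's memory reading crosses
    crash_point - safety_margin. Both defaults are the thread's
    quoted figures, not measured constants.
    """
    return working_set_bytes >= crash_point - safety_margin
```

A wrapper could poll this every few seconds and pop a reminder to save; the decision logic itself is just the one comparison.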
gptech
Very Active Forum Member
Posts: 19585
Joined: Fri Oct 10, 2008 5:48 pm
Location: Wakefield, West Yorkshire

Re: A big project with potentially big performance improvements.

Post by gptech »

deltic009 wrote:Don't know where I would even start with identifying a single asset causing the issue as there's been no consistency to where on the route, or when, it has crashed out, although most often in the Scenario Editor it has been when attempting to load stock that is going to be placed there. If you have any tips or advice on how to.....
There's no dead easy way to determine whether a *rogue* asset is used, but a decent first step would be to see if the list of required assets includes any that have been cited as 'irritable' in various forums.
deltic009 wrote:The main thing that struck me about the route was the daunting quantity of downloads required from here in order to function
Are you in a position to divulge just what those downloads are?
deltic009
Very Active Forum Member
Posts: 4017
Joined: Fri Nov 27, 2009 1:06 am

Re: A big project with potentially big performance improvements.

Post by deltic009 »

gptech wrote:
deltic009 wrote:The main thing that struck me about the route was the daunting quantity of downloads required from here in order to function
Are you in a position to divulge just what those downloads are?
I am not currently, but when I'm home this evening at some time I should be able to.
Matthew Wilson, development team at Vulcan Productions

http://www.vulcanproductions.co.uk
https://www.facebook.com/VulcanFoundry/
peterfhayes
Very Active Forum Member
Posts: 2155
Joined: Mon Sep 26, 2011 5:07 am

Re: A big project with potentially big performance improvements.

Post by peterfhayes »

Fireman 44
Saving and restarting is the solution to that issue, and you can monitor RAM usage alongside TS if you run it in a window and display the Windows Task Manager next to it. For me (and, certainly, many other people) the crisis point is always close to 3.5 GB, and if you save and restart before reaching that, no "unable to save to dump file" crashes arise. The same applies whilst playing the game.
Unfortunately, what you are seeing there is cause and effect. As you are not monitoring the VAS or the paging file at the same time, it would be reasonable to assume that 3.5 GB of Physical RAM used is the tipping point. But unfortunately, this is not correct. (Are you, by chance, mixing this up with the amount of Physical RAM that can be used under a 32-bit OS?)
At 3.5 GB RAM usage the OS will be addressing/using between 2 and 2.5GB of Physical RAM, so you are actually implying that when Physical RAM usage gets to around 1.5 to 2 GB, TS crashes? You would have to use something like ProcMon to determine the actual split in Physical RAM usage between TS and the OS. I have seen (on complex routes/scenarios) a total maximum Physical RAM usage of 6.2GB; say the OS is addressing 2.5GB, that means whatever else is running - primarily TS2017 - is addressing the rest, and I do not see any crashes. The editor really messes up the VAS, and that can be seen (retrospectively) by running VMMap. https://docs.microsoft.com/en-us/sysint ... oads/vmmap

Two messages:
If the Physical RAM is affected by anything - hardware, software, anything at all - Windows will display a warning and then eventually crash with a BSOD.
If there is a VAS issue when TS2017 is running (editing), then ONLY TS2017 will crash.

That is the way windows has worked since Windows 95.

The concept of VAS is very difficult to comprehend, but it is the main reason that we get issues with TS2017.
Windows allocates and uses RAM, NOT TS2017, and RAM is only an extremely fast storage device: no processing occurs there, just volatile storage and retrieval for and by the CPU.
pH
eyore
Very Active Forum Member
Posts: 1226
Joined: Tue Jan 27, 2004 6:22 pm
Location: Cumbrian hills

Re: A big project with potentially big performance improvements.

Post by eyore »

As a non-technical user I can't comment on the validity of pH's VAS suggestions, but I was wondering if this is going away from the point of the thread?
To put it in simple terms, knowing the bucket is too small doesn't tell us what we can do to stop it overfilling. Whether the issue is RAM or VAS, the editor seems to get overwhelmed by something, and I applaud deltic's attempt to ascertain the source of the issue.
Phil

deltic009
Very Active Forum Member
Posts: 4017
Joined: Fri Nov 27, 2009 1:06 am

Re: A big project with potentially big performance improvements.

Post by deltic009 »

Yes, and please do not see me as any technical expert here. I am not saying that this is the definitive solution, but I have picked an avenue of investigation and found some positive results. Perhaps coupled with what Peter says it would be much better, but unfortunately I am not entirely comfortable with the (to me) unknown things that he has suggested, so I have not implemented those changes.
Matthew Wilson, development team at Vulcan Productions

http://www.vulcanproductions.co.uk
https://www.facebook.com/VulcanFoundry/
fireman44
Getting the hang of things now
Posts: 90
Joined: Tue Aug 27, 2013 8:00 am

Re: A big project with potentially big performance improvements.

Post by fireman44 »

peterfhayes wrote:Unfortunately what you are seeing there is cause and effect. As you are not monitoring the VAS or the paging file at the same time, it would be reasonable to assume that 3.5 GB of Physical RAM used is the tipping point. But unfortunately, this is not correct. (Are you, by chance, mixing up the amount of Physical RAM under a 32-bit OS that can be used.)
You are absolutely correct, I am not monitoring the VAS or the paging file - partly through lack of knowledge of how to, but mostly because I don't need that information to deal with the problem. You of course have a far more explorative mind than me; all I am trying to do is detect when an issue is about to arise and deal with it before it does.

From discussion with others, 3.5GB (or a minuscule amount above) seems to be a consistent breaking point, which is why, for those seeking to deal with the issue, we agree it is sensible to save and restart at around 3.25 GB. It works well, apart from the odd advanced scenario that insists on the full start-up procedure when restarting.

Just for info, I am using a 64-bit OS and run the Windows Task Manager alongside the game (windowed) to take readings from. Of course I have significantly more physical RAM, and am fully aware TS cannot use it.
gptech
Very Active Forum Member
Posts: 19585
Joined: Fri Oct 10, 2008 5:48 pm
Location: Wakefield, West Yorkshire

Re: A big project with potentially big performance improvements.

Post by gptech »

eyore wrote:but I was wondering if this is going away from the point of the thread?
Not at all---I applaud Matthew's investigations, but there has been a certain 'bias' from the outset...
we all know there are a multitude of flaws within the TS engine
and accordingly the investigation has focused on finding a cure/fix/bodge/workaround for, or proof of, flaws in the game engine. If you want to prove light is composed of particles you can set up an experiment to do so, and the next guy can set up an experiment to prove that light is composed of waves; as it *has* to be one or the other, is this a case of "seek and ye shall find"?

ANY data processing takes up CPU time and RAM, so whilst referencing a load of assets has an impact on loading times, they're not loaded fully into RAM (possibly that data is even paged out to disc until it is needed), so unused assets don't/shouldn't have a huge impact on the game's ability to display and run a route. Of course the specs and configuration of the machine come into play here, but this issue isn't really one peculiar to TS; it is shared by all applications using a 32-bit architecture.
If referencing assets were a huge problem then we wouldn't have routes such as WCML-N, JT's Western Main Lines etc., as they have to live with the 32-bit limitations too. Too much has been made of the so-called 'Kuju bloat', and whilst some/many have found that reducing the total load on the PC helps, nobody has come up with any measured figures to *prove* its existence (or not that I've seen), but it does make a handy cause/flaw to blame when we run into issues.

There's no denying that Matthew has improved his situation, but it's a rather drastic solution and very much an incomplete one: some of the assets haven't been touched, those that have are still loaded, and it's only the referencing of assets that has changed. It's not a nice simple procedure, and it is completely negated as soon as a scenario writer enables the Provider/Product pairing of Kuju/RailSimulator (for example) to use any stock from that folder set. And can any of you see route builders implementing the same?....many route builders don't know one end of a .bin file from t'other, as it's not a skill particularly needed in their field.

So, the whole point of the thread is about formulating a theory/strategy/method that stands on its own and is easily implemented. That will of course mean differences of opinion and loads of testing before something concrete and workable is produced. At the furthest extreme, having a dedicated TS installation for each route you own, each containing just the assets used by that particular route and its own set of graphical settings to suit the scope/complexity of the route, will work---but that ain't a solution!

Unfortunately it would appear that there are very few of us with a copy of the route to do our own testing, and the most we know is that some of Brendon's Irish Rail assets are used---just as they are in many other routes without serious issues---and the possibility of a *rogue* .bin file cannot be ruled out.
gptech
Very Active Forum Member
Posts: 19585
Joined: Fri Oct 10, 2008 5:48 pm
Location: Wakefield, West Yorkshire

Re: A big project with potentially big performance improvements.

Post by gptech »

fireman44 wrote: no "unable to save to dump file" crashes arise.
"Unable to save...." isn't the cause of the crash though; many are missing the subtle distinction between that message that tells you that the game has crashed but hasn't been able to leave a clue, and the *real* cause of the crash.
deltic009
Very Active Forum Member
Posts: 4017
Joined: Fri Nov 27, 2009 1:06 am

Re: A big project with potentially big performance improvements.

Post by deltic009 »

It seems like a whole fuss over nothing, then. Rather than investigate further and risk distributing something that might not give other folks any improvement, and mislead them about the benefits, I'll stop what I'm doing. It's taking up a lot of my spare time, and I thought, perhaps naively, that something was better than nothing; but as I'm not looking to remove 1034 assets one at a time in order to identify a rogue asset within the route, it shall abruptly finish here.

Thanks to those who have shown some interest in this endeavour. I shall move on to spending more of my scant spare time on reskinning, and on attempting to make the WR Hydraulic Class 52 compatible with the AP and Mega Pack enhancements.
Matthew Wilson, development team at Vulcan Productions

http://www.vulcanproductions.co.uk
https://www.facebook.com/VulcanFoundry/
Locked

Return to “[TS] General Discussion”