I want to test a new application and check out its key performance stats. I want statistics showing me when the program needed how much disk space (temporary files), RAM, and CPU. I want to run the application several times with different settings to find the bottlenecks and a good configuration for my system.
Does anyone have an idea how best to do this? Is there a program that helps document this data?
Nagios, Xymon and the like are usually used to track and monitor performance metrics.
They work at the system level and typically have a resolution of around 5 minutes.
This certainly helps and is needed anyway to keep track of the system.
But it's not the whole story.
For a new application, I would first try to get some basics:
- Is it a single- or multi-threaded app?
- Does it use many network connections?
- Where, and how much (in parallel), does it read and write?
- Does it depend on another server or daemon (a DB, maybe)?
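On Linux, a couple of those questions (thread count, what the app has open: files, sockets, pipes) can be answered by peeking at /proc while the app runs. A minimal sketch, Linux-only; the function name is my own:

```python
import os

def proc_overview(pid):
    """Return (thread_count, open_fd_targets) for a process via Linux /proc."""
    task_dir = f"/proc/{pid}/task"   # one subdirectory per thread
    fd_dir = f"/proc/{pid}/fd"       # one symlink per open descriptor
    threads = len(os.listdir(task_dir))
    fds = []
    for fd in os.listdir(fd_dir):
        try:
            # the link target tells you: regular file, socket:[...], pipe:[...]
            fds.append(os.readlink(os.path.join(fd_dir, fd)))
        except OSError:
            pass  # the fd may have been closed while we were looking
    return threads, fds

if __name__ == "__main__":
    threads, fds = proc_overview(os.getpid())
    print(f"threads: {threads}")
    for target in fds:
        print(f"  fd -> {target}")
```

Run it against the app's pid a few times during a run: many entries under task/ means multi-threaded, many socket:[...] targets means lots of network connections.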
A look inside the application can sometimes be had by turning on debug logging.
Or by modifying the app so that it gathers info and logs it for certain events, such as tmp file creation.
Thank you for your response!
Nagios is just for server applications, am I right? Or can I use it on a stand-alone system as well?
I can configure how many processes the application uses, and there is no internet connection, because it just runs on my computer.
There is a lot of temporary data, and it is very important for me to monitor that as well, but I haven't found any way to log it automatically.
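One way to log the temporary data automatically is a small sampler that polls the app's pid and tmp directory and appends rows to a CSV you can compare across runs. A sketch using only the standard library; the /proc parts are Linux-specific, and the 5-second interval and column names are my own choices:

```python
import csv
import os
import time

def dir_size(path):
    """Total bytes of all files under path (e.g. the app's tmp directory)."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(root, name))
            except OSError:
                pass  # file vanished between listing and stat
    return total

def rss_kb(pid):
    """Resident set size in kB, read from /proc/<pid>/status (Linux only)."""
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1])
    return 0

def sample(pid, tmp_dir, outfile, interval=5.0):
    """Append one CSV row per interval until the process exits."""
    with open(outfile, "a", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time", "rss_kb", "tmp_bytes"])
        while os.path.isdir(f"/proc/{pid}"):
            writer.writerow([time.time(), rss_kb(pid), dir_size(tmp_dir)])
            f.flush()
            time.sleep(interval)
```

Start it alongside each run with a different output file per configuration, and you get exactly the "when did it need how much" history you asked for.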
You can also try out Xymon (used to be Hobbit, which used to be Big Brother). It works just fine running as both server and client on a stand-alone system. It has built-in "tests" to check CPU/memory utilization, hard disk usage, running/hung processes, etc. It is very easy to write custom tests for it (they are just shell scripts that send their output to the server), and you can even modify the existing tests to suit your needs.
There are also lots of third-party tests that the community has contributed.
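To illustrate the shape of such a custom test: Xymon extension scripts conventionally read the `XYMON`, `XYMSRV` and `MACHINE` environment variables and send a one-line `status <host>.<testname> <color> ...` message. A rough sketch in Python rather than shell; the test name, the 1 GB threshold, and using whole-filesystem usage as a stand-in for the app's tmp usage are all my own assumptions:

```python
import os
import shutil
import subprocess

def report_tmp_usage(test_name="apptmp", tmp_path="/tmp", limit_bytes=1 << 30):
    """Build (and, if possible, send) a Xymon status line for tmp usage."""
    # crude stand-in: bytes used on the filesystem holding tmp_path
    used = shutil.disk_usage(tmp_path).used
    color = "green" if used < limit_bytes else "yellow"
    machine = os.environ.get("MACHINE", "localhost")
    status = f"status {machine}.{test_name} {color} tmp usage: {used} bytes"
    xymon = os.environ.get("XYMON")
    srv = os.environ.get("XYMSRV")
    if xymon and srv:
        # hand the status message off to the Xymon server
        subprocess.run([xymon, srv, status], check=False)
    return status

if __name__ == "__main__":
    print(report_tmp_usage())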