Produce a small batch program which does nothing much but get the time in a loop.
Produce another one which is identical but doesn't get the time.
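A minimal sketch of what that pair could look like. The original answer names no platform or language, so C, time() as the call under test, clock() for CPU measurement, and the single-source-with-a-macro layout are all assumptions of mine, not the author's:

```c
/* Build with -DGET_TIME for the variant that gets the time,
 * and without it for the otherwise-identical control. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(int argc, char **argv)
{
    /* Iteration count from the command line; default 100,000. */
    long iterations = (argc > 1) ? atol(argv[1]) : 100000L;
    volatile time_t t = 0;  /* volatile so the loop body isn't optimised away */

    clock_t start = clock();
    for (long i = 0; i < iterations; i++) {
#ifdef GET_TIME
        t = time(NULL);     /* the call we are trying to cost */
#else
        t = t + 1;          /* trivial stand-in, keeps the loop shape identical */
#endif
    }
    clock_t end = clock();

    printf("%ld iterations, %.6f CPU seconds\n",
           iterations, (double)(end - start) / CLOCKS_PER_SEC);
    return 0;
}
```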
Run both programs with the same number of iterations. Say, 100,000 to start with. Keep going up (or down) until there is a significant difference in CPU used (or IOs, but I doubt it), or until it is clear there is no significant difference. If there is a consistent and noticeable increase in elapsed time that might be something, but elapsed time is a fickle beast to measure by, because it depends on so many things.
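With the sketch above, that would look something like building the two variants (`cc -O2 -DGET_TIME bench.c -o with_time` and `cc -O2 bench.c -o without_time`, with hypothetical file and binary names), then comparing `./with_time 100000` against `./without_time 100000` and stepping the iteration count up by a factor of ten each time until the reported CPU seconds clearly diverge or clearly don't.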
Assess the results against the transaction volumes your system is going to process: per day, per hour, per minute, whatever.
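For example (illustrative numbers only): if the timed variant uses 0.02 CPU seconds more than the control over 100,000 iterations, each call costs about 0.2 microseconds; at 10 million transactions a day that is roughly 2 CPU seconds a day, which you can then weigh against what your installation considers significant.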