spark

Add the ability to see the methods the CPU is spending the most time on

jtsus opened this issue · 3 comments

jtsus commented

The current state of spark's profiling allows for a quite comprehensive look into the call stack, but it is lacking when it comes to actually interpreting the data presented. Adding the ability to see the methods the CPU is spending the most time on would allow developers to optimize hot spots in their code that they simply didn't know were lagging, because the calls are spread out throughout the stack.
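To show roughly what I mean, here is a hypothetical sketch (not spark's actual code or data format) of the kind of aggregation being asked for, assuming the profiler already collects periodic stack samples; the StackSample type and the method names in the demo are made up:

```java
import java.util.*;
import java.util.stream.*;

// Hypothetical sketch: build a flat "hottest methods" list from sampled
// stack traces by counting how often each method appears at the top of a
// sample (i.e. its self time).
public class FlatProfile {

    /** One sampled stack; index 0 is the top-most (currently executing) frame. */
    public record StackSample(List<String> frames) {}

    /** Methods ordered by self-sample count, descending. */
    public static List<Map.Entry<String, Long>> hottestMethods(List<StackSample> samples) {
        Map<String, Long> selfSamples = new HashMap<>();
        for (StackSample sample : samples) {
            if (sample.frames().isEmpty()) continue;
            // Only the top frame is charged "self" time for this sample.
            selfSamples.merge(sample.frames().get(0), 1L, Long::sum);
        }
        return selfSamples.entrySet().stream()
                .sorted(Map.Entry.<String, Long>comparingByValue().reversed())
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<StackSample> samples = List.of(
                new StackSample(List.of("PluginA.pathfind", "PluginA.onTick")),
                new StackSample(List.of("PluginA.pathfind", "PluginA.onTick")),
                new StackSample(List.of("WorldGen.populate", "ChunkLoader.load")));
        hottestMethods(samples).forEach(entry ->
                System.out.println(entry.getValue() + " samples  " + entry.getKey()));
    }
}
```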

i0xHeX commented

@JustinSamaKun can you show us how you imagine it should look?
As an example, take any report and show the report plus the simplified info you want.

I'm asking because I can't imagine how the call stack could be displayed in a more simplified way.
For example, plugin A calls the method plugin.compute() and that method spends all of its time on some collection change, like collection.put(). In other words, with your suggestion you would see collection.put() but not the source of that call, so it would be useless information. That's why the reports are so comprehensive.

JustinSamaKun commented

@i0xHeX In my mind it would be just like the heap dump page but for methods; then you could just use the filter search to narrow it down to methods from your plugin.
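If such a page existed, the filter step described here would presumably be a thin layer on top of the aggregated table. A minimal sketch under that assumption (the package name and the map layout are hypothetical, not spark's API):

```java
import java.util.*;
import java.util.stream.*;

// Hypothetical sketch of the proposed filter: narrow an already-aggregated
// "method -> sample count" table down to a single plugin's package.
public class MethodFilter {

    public static Map<String, Long> filterByPackage(Map<String, Long> selfSamples, String packagePrefix) {
        return selfSamples.entrySet().stream()
                .filter(entry -> entry.getKey().startsWith(packagePrefix))
                .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
    }

    public static void main(String[] args) {
        Map<String, Long> selfSamples = Map.of(
                "java.util.HashMap.put", 120L,
                "com.example.myplugin.Tasks.run", 45L,
                "net.minecraft.server.MinecraftServer.tick", 30L);
        // Keep only methods from the (hypothetical) com.example.myplugin package.
        System.out.println(filterByPackage(selfSamples, "com.example.myplugin"));
    }
}
```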

commented

I agree with @i0xHeX's analysis - I don't think this would be particularly useful in comparison with the current tree view.

Thanks for the suggestion but this isn't something I plan to implement. :)