spockz a day ago:
For me the main takeaway of this is that you want automated performance tests in place, combined with flamegraph insights by default, especially for this kind of major language upgrade.
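Roughly what I mean, as a minimal JMH sketch of an automated check you could run in CI after an upgrade (the benchmarked method and its workload are hypothetical placeholders):

    import java.util.concurrent.TimeUnit;
    import org.openjdk.jmh.annotations.*;

    @BenchmarkMode(Mode.AverageTime)
    @OutputTimeUnit(TimeUnit.MICROSECONDS)
    @State(Scope.Benchmark)
    @Warmup(iterations = 5, time = 1)
    @Measurement(iterations = 10, time = 1)
    @Fork(2)
    public class UpgradeRegressionBench {

        private String payload;

        @Setup
        public void setup() {
            // Fixed, deterministic workload so runs are comparable across versions.
            payload = "order-id:42;qty:7;".repeat(1_000);
        }

        @Benchmark
        public int parseOrders() {
            // Stand-in for the real hot path you want to guard against regressions.
            int hits = 0;
            for (int i = payload.indexOf("qty"); i >= 0; i = payload.indexOf("qty", i + 1)) {
                hits++;
            }
            return hits;
        }
    }

Recent JMH versions can also emit flamegraphs directly via the async-profiler integration, e.g. running the benchmark jar with -prof async:output=flamegraph (assuming async-profiler is installed on the box), so the "insight by default" part doesn't need extra tooling.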
malkia a day ago:
Benchmarking requires a somewhat different setup than the rest of your testing, especially if you want millisecond-level timings. We have continuous benchmarking of one of our tools (it's written in C++), and to get the "same" results every time, we launch it on the same machine. This is far from ideal, but otherwise there are noisy neighbours, a pesky host (if it's a VM), etc. One idea we considered: run the same test on the same machine several times, comparing older and newer code (ideally through switches). This could work for some code paths, but not for truly continuous check-ins. Just wondering what folks do. I can guess, but there's always something hidden and not well known.
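For illustration, here's the "switches" idea sketched with JMH (our tool is C++, but Google Benchmark offers similar facilities; both implementations below are hypothetical stand-ins). Because the old and new paths run interleaved in one session on one machine, most host-level noise cancels out of the comparison:

    import java.util.concurrent.TimeUnit;
    import java.util.function.Function;
    import org.openjdk.jmh.annotations.*;

    @BenchmarkMode(Mode.AverageTime)
    @OutputTimeUnit(TimeUnit.NANOSECONDS)
    @State(Scope.Benchmark)
    public class OldVsNewBench {

        // The "switch": JMH runs the benchmark once per parameter value,
        // so both variants are measured back-to-back on the same host.
        @Param({"old", "new"})
        public String variant;

        private Function<byte[], Integer> decode;
        private byte[] input;

        @Setup
        public void setup() {
            input = new byte[64 * 1024];
            decode = variant.equals("old") ? OldVsNewBench::legacyDecode
                                           : OldVsNewBench::newDecode;
        }

        @Benchmark
        public int decodeBench() {
            return decode.apply(input);
        }

        // Hypothetical stand-ins for the pre- and post-upgrade versions.
        static int legacyDecode(byte[] in) { int s = 0; for (byte b : in) s += b; return s; }
        static int newDecode(byte[] in)    { int s = 0; for (int i = 0; i < in.length; i += 2) s += in[i]; return s; }
    }

The catch is exactly the one above: this works when both versions can live in one build, but not for truly continuous check-ins where "old" is a commit you'd have to rebuild.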
| ||||||||||||||||||||||||||||||||
esafak a day ago:
What are folks using for perf testing on the JVM these days?
| ||||||||||||||||||||||||||||||||