Let’s review our performance measures for 2014.
– regressions in 2014 for tcheck2, trobopan, and tspaint.
– improvements in tprovider, tsvgx, and tp4m.
– overall regressions in time to throbber start and stop.
– recent checkerboard regressions apparent on Eideticker.
This section tracks Perfomatic graphs for mozilla-central builds of Firefox for Android, for Talos tests run on Android 4.0 Opt. The test names shown are those used on tbpl. See https://wiki.mozilla.org/Buildbot/Talos for background on Talos.
tcheck2
Measure of “checkerboarding” during simulation of real user interaction with a page. Lower values are better.
Jan 2014: 4.7
Dec 2014: 18.2
This test seems to be one of our most frequently regressing tests. We had some good improvements this year, but overall we end the year significantly regressed from where we started. Silver lining: Test results are much less noisy now than they have been all year!
(For details on the December regression, see bug 1111565 / bug 1097318).
trobopan
Panning performance test. The reported value is the sum of the squares of frame delays (ms greater than 25 ms) encountered while panning. Lower values are better.
Jan 2014: 28000
Dec 2014: 62000
Again, there are some wins and losses over the year but we end the year significantly regressed. There is a lot of noise in the results.
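To make the trobopan metric concrete, here is a minimal sketch of how a sum-of-squared-delays score behaves. This is an illustration only, not the actual Talos implementation; the function name `pan_metric` and the sample frame timings are hypothetical, and I'm assuming the per-frame delays beyond the 25 ms threshold are squared and summed.

```python
# Hypothetical sketch of a trobopan-style score: sum of squared per-frame
# delays beyond a 25 ms threshold. Not the actual Talos implementation.

THRESHOLD_MS = 25

def pan_metric(frame_intervals_ms):
    """Sum of squared delays (ms over threshold) across all frames."""
    return sum((t - THRESHOLD_MS) ** 2
               for t in frame_intervals_ms
               if t > THRESHOLD_MS)

# A smooth pan (every frame under 25 ms) scores 0, while a single 60 ms
# janky frame alone scores (60 - 25)**2 = 1225 -- squaring means a few
# long stalls dominate the score, which is why this test is so sensitive.
print(pan_metric([16, 16, 60, 16]))  # 1225
print(pan_metric([10, 20, 24]))      # 0
```

Squaring the excess delay is what makes the reported values (tens of thousands) swing so widely: a handful of long pauses moves the number far more than many small ones.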
tprovider
Performance test of the history and bookmarks provider. Reports the time (ms) to perform a group of database operations. Lower values are better.
Jan 2014: 560
Dec 2014: 520
Very steady performance here with a slight improvement in April carrying through to the end of the year.
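The shape of a tprovider-style measurement can be sketched as timing a batch of database operations end to end. This is a toy illustration against an in-memory SQLite database, not the real test, which exercises Firefox for Android's history/bookmarks provider; the function name and batch size are made up for the example.

```python
# Hypothetical sketch of a tprovider-style measurement: report the time
# (ms) to perform a group of database operations. Uses sqlite3 directly;
# the real test goes through Firefox's history/bookmarks provider.
import sqlite3
import time

def timed_db_batch(n=1000):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE history (url TEXT, visits INTEGER)")
    start = time.monotonic()
    with conn:  # one transaction for the whole batch of inserts
        for i in range(n):
            conn.execute("INSERT INTO history VALUES (?, ?)",
                         (f"http://example.com/{i}", 1))
    rows = conn.execute("SELECT COUNT(*) FROM history").fetchone()[0]
    elapsed_ms = (time.monotonic() - start) * 1000
    return rows, elapsed_ms

rows, ms = timed_db_batch()
print(rows)  # 1000
```

Because the reported value aggregates a whole group of operations, a small regression in any single query shows up only as a modest shift in the total, which fits the steady ~520-560 ms results above.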
tsvgx
An svg-only number that measures SVG rendering performance. About half of the tests are animations or iterations of rendering. This ASAP test (tsvgx) iterates in unlimited frame-rate mode, thus reflecting the maximum rendering throughput of each test. The reported value is the page load time or, for animations/iterations, the overall duration the sequence/animation took to complete. Lower values are better.
Jan 2014: 6150
Dec 2014: 5900
This is great — we’re seeing the best performance of the year.
tp4m
Generic page load test. Lower values are better.
Jan 2014: 970
Dec 2014: 855
Wow, even better. This tells me someone out there really cares about our page load performance.
tspaint
Startup performance test. Lower values are better.
Jan 2014: 3700
Dec 2014: 4100
You can’t win them all? It feels like we’re slowly losing ground here.
Note that this test is currently hidden on treeherder; it fails very often – bug 1112697.
Throbber Start / Throbber Stop
These graphs are taken from http://phonedash.mozilla.org. Browser startup performance is measured on real phones (a variety of popular devices).
I could not get cohesive phonedash graphs for the entire year, since we made so many changes to autophone over the year, but here are views for the last 6 months. It looks like we have some work to do on time to throbber start. Time to throbber stop is better, but we have lost ground there too.
Eideticker
These graphs are taken from http://eideticker.mozilla.org. Eideticker is a performance harness that measures user-perceived performance of web browsers by video capturing them in action and subsequently running image analysis on the raw result.
More info at: https://wiki.mozilla.org/Project_Eideticker
Again, I couldn’t generate good graphs for the whole year, but here are some for the last 3 months.
Eideticker startup tests seem to be performing well.
But we’ve had some recent regressions in checkerboarding.
Happy New Year!