[BuildStream] Weekly performance update for the Debian-like project



Hello,

As before, a notebook containing *all* of the results should be available via the following link:

https://mybinder.org/v2/gh/james-ennis/bst-benchmarks-notebooks/master?filepath=all_results.ipynb

If the link has expired, go to https://mybinder.org/, paste https://github.com/james-ennis/bst-benchmarks-notebooks into the repository URL field, set the branch to "master" and the file to "all_results.ipynb", then click "launch".

When the notebook has loaded, please click "Kernel" -> "Restart and Run all".

This will render the four interactive graphs, along with some instructions on how the toolbar can be used to interact with each graph.

-----

All results were obtained on the same hardware (a Codethink dev machine configured as a GitLab runner, which is not used for anything else), and the results files are appended to by a CI job. Specs as follows, with a sketch of one measurement after the list:

* Linux (Debian stable)
* x86_64
* 16 GB RAM
* 500 GB SSD
* Intel i7-3770
* 8 cores @ 3.40 GHz
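
For context, one datapoint could be collected along these lines (a minimal
sketch, assuming a harness which shells out to `bst` -- the real CI job may
well do this differently):

    # Hypothetical measurement of a single `bst show` run: wall-clock
    # time, plus the peak RSS of the child process tree.
    import resource
    import subprocess
    import time

    start = time.monotonic()
    subprocess.run(["bst", "show", "base-files/base-files.bst"], check=True)
    elapsed = time.monotonic() - start

    # On Linux, ru_maxrss is reported in kilobytes; divide by 1024 to get
    # megabyte figures like those in the tables below.
    max_rss_mb = resource.getrusage(resource.RUSAGE_CHILDREN).ru_maxrss / 1024

    print(f"time: {elapsed:.2f}s  max-rss: {max_rss_mb:.0f}MB")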

-----

Here are the times and max-rss (peak memory usage) results for a `bst show` of base-files/base-files.bst in the Debian-like project [0] (jennis/use_remote_file branch [1]) for the last 5 weeks:

┌───────────┬─────────────────┬─────────────────┐
│           │      Show       │ Show once built │
├───────────┼────────┬────────┼────────┬────────┤
│ Dates     │ Time   │ Memory │ Time   │ Memory │
│           │ (s)    │ (MB)   │ (s)    │ (MB)   │
├───────────┼────────┼────────┼────────┼────────┤
│13/05-19/05│ 11.65  │ 186    │ 12.88  │ 200    │
│20/05-26/05│ 11.20  │ 187    │ 10.40  │ 211    │
│27/05-02/06│ 8.86   │ 180    │ 8.29   │ 203    │
│03/06-09/06│ 7.26   │ 176    │ 6.71   │ 200    │
│10/06-16/06│ 7.44   │ 186    │ 6.97   │ 210    │
└───────────┴────────┴────────┴────────┴────────┘

So, it looks like we've become slower and we're using more memory...

## Why are we slower?
Remember that these figures are an average of all of last week's results.

Since bschubert/cython landed, our show times have dropped substantially,
from ~11s to ~7s. However, for the base-files stack there is a ~1 second
variance in the time it takes to show the stack: last week, for instance,
many individual results were under 7 seconds. You should be able to see
this variance on the graphs. I would argue that the slightly higher average
time we're seeing this week is simply noise.
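
As a toy illustration of why a shift of this size sits within that noise
(the numbers below are invented, not taken from the real results):

    # Hypothetical show times (seconds) for two consecutive weeks.
    from statistics import mean, stdev

    last_week = [6.7, 6.9, 7.0, 7.2, 7.6, 7.7]
    this_week = [6.9, 7.1, 7.4, 7.5, 7.7, 8.0]

    for label, times in (("last week", last_week), ("this week", this_week)):
        print(f"{label}: mean={mean(times):.2f}s  stdev={stdev(times):.2f}s")

Both weeks draw from the same ~1s spread, yet the means differ by ~0.25s --
about the size of the jump in the table above.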

## Why are we using more memory?
This one was in fact very traceable: bschubert/stricter-node-api [2] introduced
this jump in memory.

-----

Here are the times and max-rss results for a `bst build` (with 4, 8 and 12 builders) of base-files/base-files.bst in the same Debian-like project [0] (jennis/use_remote_file branch [1]).

┌───────────┬─────────────────┬─────────────────┬─────────────────┐
│           │   4 Builders    │   8 Builders    │  12 Builders    │
├───────────┼────────┬────────┼────────┬────────┼────────┬────────┤
│ Dates     │ Time   │ Memory │ Time   │ Memory │ Time   │ Memory │
│           │ (s)    │ (MB)   │ (s)    │ (MB)   │ (s)    │ (MB)   │
├───────────┼────────┼────────┼────────┼────────┼────────┼────────┤
│13/05-19/05│ 168.47 │ 197    │ 189.70 │ 198    │ 213.01 │ 197    │
│20/05-26/05│ 133.44 │ 210    │ 153.84 │ 210    │ 175.90 │ 210    │
│27/05-02/06│ 129.73 │ 202    │ 152.97 │ 202    │ 174.24 │ 202    │
│03/06-09/06│ 127.61 │ 199    │ 144.91 │ 199    │ 160.10 │ 199    │
│10/06-16/06│ 134.19 │ 209    │ 135.50 │ 209    │ 132.45 │ 209    │
└───────────┴────────┴────────┴────────┴────────┴────────┴────────┘

## General points about the results
The build results are quite interesting this week. Now that it has been
over a week since jennis/push_based_pipeline [3] landed, we can see that
more builders != slower runtime: the times for the different numbers of
builders have almost evened out.

We strongly suspect, but are not certain, that we're now at a point where
we're bound by the time it takes to actually complete the ~6000 import
jobs. We have discussed creating a project which forces a sleep() into the
build commands, which should let us check whether we then see what we
expect (the time for 12 builders < 8 builders < 4 builders); a sketch of
such an element follows.
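
Such an element might look something like the following (purely
hypothetical -- the file name and sleep duration are made up, and no such
project exists yet):

    # sleep.bst: a "build" which does nothing but wait, so the total
    # runtime should be dominated by how many of these run in parallel.
    kind: manual

    config:
      build-commands:
      - sleep 10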

## Why are 4 builders slower than before?
Again, there is a jump in the build times due to
bschubert/stricter-node-api [2].

Weirdly, from the results, it looks as if there is another jump caused by
bschubert/remove-useless-condition [4], but looking at the diff, I would
argue that it should have had no effect on the runtime. So I'm a bit
perplexed here, but the 2 second jump looks pretty clear.

## Why has the max-rss increased?
The memory has increased due to bschubert/stricter-node-api [2].

Please see [5] if you're unable to open the weekly notebook.

Thanks,
James

[0] https://gitlab.com/jennis/debian-stretch-bst
[1] https://gitlab.com/jennis/debian-stretch-bst/tree/jennis/use_remote_file
[2] https://gitlab.com/BuildStream/buildstream/merge_requests/1384
[3] https://gitlab.com/BuildStream/buildstream/merge_requests/1344
[4] https://gitlab.com/BuildStream/buildstream/merge_requests/1387
[5] https://mail.gnome.org/archives/buildstream-list/2019-February/msg00046.html

Attachment: weekly_results.ipynb


