[Notes] [Git][BuildStream/buildstream][tpollard/494] 25 commits: element.py: Prepare local sandbox for bst checkout and bst shell




Tom Pollard pushed to branch tpollard/494 at BuildStream / buildstream

Commits:

15 changed files:

Changes:

  • CONTRIBUTING.rst
    ... ... @@ -3,72 +3,137 @@ Contributing
    3 3
     Some tips and guidelines for developers hacking on BuildStream
    
    4 4
     
    
    5 5
     
    
    6
    -Feature additions
    
    7
    ------------------
    
    8
    -Major feature additions should be proposed on the
    
    9
    -`mailing list <https://mail.gnome.org/mailman/listinfo/buildstream-list>`_
    
    10
    -before being considered for inclusion, we strongly recommend proposing
    
    11
    -in advance of commencing work.
    
    6
    +.. _contributing_filing_issues:
    
    7
    +
    
    8
    +Filing issues
    
    9
    +-------------
    
    10
    +If you are experiencing an issue with BuildStream, or would like to submit a patch
    
    11
    +to fix an issue, then you should first search the list of `open issues <https://gitlab.com/BuildStream/buildstream/issues>`_
    
    12
    +to see if the issue is already filed, and `open an issue <https://gitlab.com/BuildStream/buildstream/issues/new>`_
    
    13
    +if no issue already exists.
    
    14
    +
    
    15
    +For policies on how to submit an issue and how to use our project labels,
    
    16
    +we recommend that you read the `policies guide
    
    17
    +<https://gitlab.com/BuildStream/nosoftware/alignment/blob/master/BuildStream_policies.md>`_.
    
    18
    +
    
    19
    +
    
    20
    +.. _contributing_fixing_bugs:
    
    21
    +
    
    22
    +Fixing bugs
    
    23
    +-----------
    
    24
    +Before fixing a bug, it is preferred that an :ref:`issue be filed <contributing_filing_issues>`
    
    25
    +first in order to better document the defect; however, this need not be followed to the
    
    26
    +letter for minor fixes.
    
    27
    +
    
    28
    +Patches which fix bugs should always come with a regression test.
    
    29
    +
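    +As a purely illustrative sketch (the ``cli`` and ``datafiles`` fixtures and
    +the project layout below are hypothetical), a regression test in our pytest
    +based test suite might look something like this::
    +
    +  import os
    +  import pytest
    +
    +  DATA_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'project')
    +
    +  @pytest.mark.datafiles(DATA_DIR)
    +  def test_fix_for_issue_123(cli, datafiles):
    +      project = os.path.join(datafiles.dirname, datafiles.basename)
    +
    +      # Reproduce the conditions reported in the issue and assert
    +      # that the fixed behavior is observed.
    +      result = cli.run(project=project, args=['build', 'target.bst'])
    +      result.assert_success()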
    
    12 30
     
    
    13
    -If you are experiencing an issue with BuildStream or would like to submit a small patch/feature, then
    
    14
    -you can `oepn an issue <https://gitlab.com/BuildStream/buildstream/issues/new?issue%5Bassignee_id%5D=&issue%5Bmilestone_id%5D=>`_
    
    31
    +.. _contributing_adding_features:
    
    15 32
     
    
    16
    -For policies on how to submit and issue and how to use our project labels, we recommend that you read the `policies guide <https://gitlab.com/BuildStream/nosoftware/alignment/blob/master/BuildStream_policies.md>`_
    
    33
    +Adding new features
    
    34
    +-------------------
    
    35
    +Feature additions should be proposed on the `mailing list
    
    36
    +<https://mail.gnome.org/mailman/listinfo/buildstream-list>`_
    
    37
    +before being considered for inclusion. To save time and avoid any frustration,
    
    38
    +we strongly recommend proposing your new feature in advance of commencing work.
    
    39
    +
    
    40
    +Once consensus has been reached on the mailing list, then the proposing
    
    41
    +party should :ref:`file an issue <contributing_filing_issues>` to track the
    
    42
    +work. Please use the *bst_task* template for issues which represent
    
    43
    +feature additions.
    
    17 44
     
    
    18
    -New features must be well documented and tested either in our main
    
    19
    -test suite if possible, or otherwise in the integration tests.
    
    45
    +New features must be well documented and tested in our test suite.
    
    20 46
     
    
    21 47
     It is expected that the individual submitting the work take ownership
    
    22 48
     of their feature within BuildStream for a reasonable timeframe of at least
    
    23 49
     one release cycle after their work has landed on the master branch. This is
    
    24
    -to say that the submitter is expected to address and fix any side effects and
    
    25
    -bugs which may have fell through the cracks in the review process, giving us
    
    26
    -a reasonable timeframe for identifying these.
    
    50
    +to say that the submitter is expected to address and fix any side effects,
    
    51
    +bugs or regressions which may have fell through the cracks in the review
    
    52
    +process, giving us a reasonable timeframe for identifying these.
    
    27 53
     
    
    28 54
     
    
    29
    -Patch submissions
    
    30
    ------------------
    
    31
    -If you want to submit a patch, do ask for developer permissions on our
    
    32
    -IRC channel first (GitLab's button also works, but you may need to
    
    33
    -shout about it - we often overlook this) - for CI reasons, it's much
    
    34
    -easier if patches are in branches of the main repository.
    
    55
    +.. _contributing_submitting_patches:
    
    56
    +
    
    57
    +Submitting patches
    
    58
    +------------------
    
    35 59
     
    
    36
    -Branches must be submitted as merge requests in gitlab. If the branch
    
    37
    -fixes an issue or is related to any issues, these issues must be mentioned
    
    38
    -in the merge request or preferably the commit messages themselves.
    
    39 60
     
    
    61
    +Ask for developer access
    
    62
    +~~~~~~~~~~~~~~~~~~~~~~~~
    
    63
    +If you want to submit a patch, do ask for developer permissions, either
    
    64
    +by asking us directly on our public IRC channel (irc://irc.gnome.org/#buildstream)
    
    65
    +or by visiting our `project page on GitLab <https://gitlab.com/BuildStream/buildstream>`_
    
    66
    +and using the GitLab UI to ask for permission.
    
    67
    +
    
    68
    +This will make your contribution experience smoother, as you will not
    
    69
    +need to set up any complicated CI settings, and rebasing your branch
    
    70
    +against the upstream master branch will be less painful.
    
    71
    +
    
    72
    +
    
    73
    +Branch names
    
    74
    +~~~~~~~~~~~~
    
    40 75
     Branch names for merge requests should be prefixed with the submitter's
    
    41
    -name or nickname, e.g. ``username/implement-flying-ponies``.
    
    76
    +name or nickname, followed by a forward slash, and then a descriptive
    
    77
    +name. e.g.::
    
    42 78
     
    
    43
    -You may open merge requests for the branches you create before you
    
    44
    -are ready to have them reviewed upstream, as long as your merge request
    
    45
    -is not yet ready for review then it must be prefixed with the ``WIP:``
    
    46
    -identifier.
    
    79
    +  username/fix-that-bug
    
    47 80
     
    
    81
    +This allows us to more easily identify which branch does what and
    
    82
    +belongs to whom, especially so that we can effectively clean up stale
    
    83
    +branches in the upstream repository over time.
    
    84
    +
    
    85
    +
    
    86
    +Merge requests
    
    87
    +~~~~~~~~~~~~~~
    
    88
    +Once you have created a local branch, you can push it to the upstream
    
    89
    +BuildStream repository using the command line::
    
    90
    +
    
    91
    +  git push origin username/fix-that-bug:username/fix-that-bug
    
    92
    +
    
    93
    +GitLab will respond to this with a message and a link to allow you to create
    
    94
    +a new merge request. You can also `create a merge request for an existing branch
    
    95
    +<https://gitlab.com/BuildStream/buildstream/merge_requests/new>`_.
    
    96
    +
    
    97
    +You may open merge requests for the branches you create before you are ready
    
    98
    +to have them reviewed and considered for inclusion if you like. Until your merge
    
    99
    +request is ready for review, the merge request title must be prefixed with the
    
    100
    +``WIP:`` identifier.
    
    101
    +
    
    102
    +
    
    103
    +Organized commits
    
    104
    +~~~~~~~~~~~~~~~~~
    
    48 105
     Submitted branches must not contain a history of the work done in the
    
    49
    -feature branch. Please use git's interactive rebase feature in order to
    
    50
    -compose a clean patch series suitable for submission.
    
    106
    +feature branch. For example, if you had to change your approach, or
    
    107
    +have a later commit which fixes something in a previous commit on your
    
    108
    +branch, we do not want to include the history of how you came up with
    
    109
    +your patch in the upstream master branch.
    
    110
    +
    
    111
    +Please use git's interactive rebase feature in order to compose a clean
    
    112
    +patch series suitable for submission upstream.
    
    113
    +
    
    114
    +Every commit in the series should pass the test suite; this is very important
    
    115
    +for tracking down regressions and performing git bisections in the future.
    
    51 116
     
    
    52 117
     We prefer that documentation changes be submitted in separate commits from
    
    53
    -the code changes which they document, and new test cases are also preferred
    
    54
    -in separate commits.
    
    118
    +the code changes which they document, and newly added test cases are also
    
    119
    +preferred in separate commits.
    
    55 120
     
    
    56 121
     If a commit in your branch modifies behavior such that a test must also
    
    57 122
     be changed to match the new behavior, then the tests should be updated
    
    58
    -with the same commit. Ideally every commit in the history of master passes
    
    59
    -its test cases, this makes bisections more easy to perform, but is not
    
    60
    -always practical with more complex branches.
    
    123
    +with the same commit, so that every commit passes its own tests.
    
    61 124
     
    
    62 125
     
    
    63 126
     Commit messages
    
    64 127
     ~~~~~~~~~~~~~~~
    
    65
    -Commit messages must be formatted with a brief summary line, optionally
    
    66
    -followed by an empty line and then a free form detailed description of
    
    67
    -the change.
    
    128
    +Commit messages must be formatted with a brief summary line, followed by
    
    129
    +an empty line and then a free form detailed description of the change.
    
    68 130
     
    
    69 131
     The summary line must start with what changed, followed by a colon and
    
    70 132
     a very brief description of the change.
    
    71 133
     
    
    134
    +If the commit fixes an issue, or is related to an issue, then the issue
    
    135
    +number must be referenced in the commit message.
    
    136
    +
    
    72 137
     **Example**::
    
    73 138
     
    
    74 139
       element.py: Added the frobnicator so that foos are properly frobbed.
    
    ... ... @@ -77,33 +142,464 @@ a very brief description of the change.
    77 142
       the element. Elements that are not properly frobnicated raise
    
    78 143
       an error to inform the user of invalid frobnication rules.
    
    79 144
     
    
    145
    +  Fixes #123
    
    146
    +
    
    147
    +In the case that you have a commit which necessarily modifies multiple
    
    148
    +components, then the summary line should still mention generally what
    
    149
    +changed (if possible), followed by a colon and a brief summary.
    
    150
    +
    
    151
    +In this case the free form detailed description of the change should
    
    152
    +contain a bullet list describing what was changed in each component
    
    153
    +separately.
    
    154
    +
    
    155
    +**Example**::
    
    156
    +
    
    157
    +  artifact cache: Fixed automatic expiry in the local cache
    
    80 158
     
    
    81
    -Coding style
    
    82
    -------------
    
    83
    -Coding style details for BuildStream
    
    159
    +    o _artifactcache/artifactcache.py: Updated the API contract
    
    160
    +      of ArtifactCache.remove() so that something detailed is
    
    161
    +      explained here.
    
    84 162
     
    
    163
    +    o _artifactcache/cascache.py: Adhere to the new API contract
    
    164
    +      dictated by the abstract ArtifactCache class.
    
    85 165
     
    
    86
    -Style guide
    
    87
    -~~~~~~~~~~~
    
    88
    -Python coding style for BuildStream is pep8, which is documented here: https://www.python.org/dev/peps/pep-0008/
    
    166
    +    o tests/artifactcache/expiry.py: Modified test expectations to
    
    167
    +      match the new behavior.
    
    168
    +
    
    169
    +  This is a part of #123
    
    170
    +
    
    171
    +
    
    172
    +Coding guidelines
    
    173
    +-----------------
    
    174
    +This section discusses coding style and other guidelines for hacking
    
    175
    +on BuildStream. This is important to read through for writing any non-trivial
    
    176
    +patches and especially outlines what people should watch out for when
    
    177
    +reviewing patches.
    
    178
    +
    
    179
    +Much of the rationale behind what is laid out in this section is about
    
    180
    +good traceability of lines of code with *git blame*, overall sensible
    
    181
    +modular structure, consistency in how we write code, and long term maintenance
    
    182
    +of the codebase.
    
    183
    +
    
    184
    +
    
    185
    +Approximate PEP-8 Style
    
    186
    +~~~~~~~~~~~~~~~~~~~~~~~
    
    187
    +Python coding style for BuildStream is approximately `pep8 <https://www.python.org/dev/peps/pep-0008/>`_.
    
    89 188
     
    
    90 189
     We have a couple of minor exceptions to this standard, we dont want to compromise
    
    91 190
     code readability by being overly restrictive on line length for instance.
    
    92 191
     
    
    93
    -The pep8 linter will run automatically when running the test suite.
    
    192
    +The pep8 linter will run automatically when :ref:`running the test suite <contributing_testing>`.
    
    193
    +
    
    194
    +
    
    195
    +Line lengths
    
    196
    +''''''''''''
    
    197
    +Regarding laxness on the line length in our linter settings, it should be clarified
    
    198
    +that the line length limit is a hard limit which causes the linter to bail out
    
    199
    +and reject commits which exceed that high limit - not an invitation to write exceedingly
    
    200
    +long lines of code, comments, or API documenting docstrings.
    
    201
    +
    
    202
    +Code, comments and docstrings should strive to stay within approximately 80
    
    203
    +or 90 character lines, where exceptions can be made when code would be less readable
    
    204
    +when exceeding 80 or 90 characters (often this happens in conditional statements
    
    205
    +when raising an exception, for example). Or, when comments contain a long link that
    
    206
    +causes the given line to exceed 80 or 90 characters, we don't want this to cause
    
    207
    +the linter to refuse the commit.
    
    208
    +
    
    209
    +
    
    210
    +.. _contributing_documenting_symbols:
    
    211
    +
    
    212
    +Documenting symbols
    
    213
    +~~~~~~~~~~~~~~~~~~~
    
    214
    +In BuildStream, we maintain what we call a *"Public API Surface"* that
    
    215
    +is guaranteed to be stable and unchanging across stable releases. The
    
    216
    +symbols which fall into this special class are documented using Python's
    
    217
    +standard *docstrings*, while all other internals of BuildStream are documented
    
    218
    +with comments above the related symbol.
    
    219
    +
    
    220
    +When documenting the public API surface which is rendered in the reference
    
    221
    +manual, we always mention the major version in which the API was introduced,
    
    222
    +as shown in the examples below. If a public API exists without the *Since*
    
    223
    +annotation, this is taken to mean that it was available since the first stable
    
    224
    +release 1.0.
    
    225
    +
    
    226
    +Here are some examples to get the hang of the format of API documenting
    
    227
    +comments and docstrings.
    
    228
    +
    
    229
    +**Public API Surface method**::
    
    230
    +
    
    231
    +  def frobnicate(self, source, *, frobilicious=False):
    
    232
    +      """Frobnicates this element with the specified source
    
    233
    +
    
    234
    +      Args:
    
    235
    +         source (Source): The Source to frobnicate with
    
    236
    +         frobilicious (bool): Optionally specify that frobnication should be
    
    237
    +                              performed fribiliciously
    
    238
    +
    
    239
    +      Returns:
    
    240
    +         (Element): The frobnicated version of this Element.
    
    241
    +
    
    242
    +      *Since: 1.2*
    
    243
    +      """
    
    244
    +      ...
    
    245
    +
    
    246
    +**Internal method**::
    
    247
    +
    
    248
    +  # frobnicate():
    
    249
    +  #
    
    250
    +  # Frobnicates this element with the specified source
    
    251
    +  #
    
    252
    +  # Args:
    
    253
    +  #    source (Source): The Source to frobnicate with
    
    254
    +  #    frobilicious (bool): Optionally specify that frobnication should be
    
    255
    +  #                         performed fribiliciously
    
    256
    +  #
    
    257
    +  # Returns:
    
    258
    +  #    (Element): The frobnicated version of this Element.
    
    259
    +  #
    
    260
    +  def frobnicate(self, source, *, frobilicious=False):
    
    261
    +      ...
    
    262
    +
    
    263
    +**Public API Surface instance variable**::
    
    264
    +
    
    265
    +  def __init__(self, context, element):
    
    266
    +
    
    267
    +    self.name = self._compute_name(context, element)
    
    268
    +    """The name of this foo
    
    269
    +
    
    270
    +    *Since: 1.2*
    
    271
    +    """
    
    272
    +
    
    273
    +.. note::
    
    274
    +
    
    275
    +   Python does not support docstrings on instance variables, but sphinx does
    
    276
    +   pick them up and include them in the generated documentation.
    
    277
    +
    
    278
    +**Internal instance variable**::
    
    279
    +
    
    280
    +  def __init__(self, context, element):
    
    281
    +
    
    282
    +    self.name = self._compute_name(context, element) # The name of this foo
    
    283
    +
    
    284
    +**Internal instance variable (long)**::
    
    285
    +
    
    286
    +  def __init__(self, context, element):
    
    287
    +
    
    288
    +    # This instance variable required a longer explanation, so
    
    289
    +    # it is on a line above the instance variable declaration.
    
    290
    +    self.name = self._compute_name(context, element)
    
    291
    +
    
    292
    +
    
    293
    +**Public API Surface class**::
    
    294
    +
    
    295
    +  class Foo(Bar):
    
    296
    +      """The main Foo object in the data model
    
    297
    +
    
    298
    +      Explanation about Foo. Note that we always document
    
    299
    +      the constructor arguments here, and not beside the __init__
    
    300
    +      method.
    
    301
    +
    
    302
    +      Args:
    
    303
    +         context (Context): The invocation Context
    
    304
    +         count (int): The number to count
    
    305
    +
    
    306
    +      *Since: 1.2*
    
    307
    +      """
    
    308
    +      ...
    
    309
    +
    
    310
    +**Internal class**::
    
    311
    +
    
    312
    +  # Foo()
    
    313
    +  #
    
    314
    +  # The main Foo object in the data model
    
    315
    +  #
    
    316
    +  # Args:
    
    317
    +  #    context (Context): The invocation Context
    
    318
    +  #    count (int): The number to count
    
    319
    +  #
    
    320
    +  class Foo(Bar):
    
    321
    +      ...
    
    322
    +
    
    323
    +
    
    324
    +.. _contributing_class_order:
    
    325
    +
    
    326
    +Class structure and ordering
    
    327
    +~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    
    328
    +When creating or modifying an object class in BuildStream, it is
    
    329
    +important to keep in mind the order in which symbols should appear
    
    330
    +and keep this consistent.
    
    331
    +
    
    332
    +Here is an example to illustrate the expected ordering of symbols
    
    333
    +on a Python class in BuildStream::
    
    334
    +
    
    335
    +  class Foo(Bar):
    
    336
    +
    
    337
    +      # Public class-wide variables come first, if any.
    
    338
    +
    
    339
    +      # Private class-wide variables, if any
    
    340
    +
    
    341
    +      # Now we have the dunder/magic methods, always starting
    
    342
    +      # with the __init__() method.
    
    343
    +
    
    344
    +      def __init__(self, name):
    
    345
    +
    
    346
    +          super().__init__()
    
    347
    +
    
    348
    +          # NOTE: In the instance initializer we declare any instance variables,
    
    349
    +          #       always declare the public instance variables (if any) before
    
    350
    +          #       the private ones.
    
    351
    +          #
    
    352
    +          #       It is preferred to avoid any public instance variables, and
    
    353
    +          #       instead always expose accessor methods for them.
    
    354
    +
    
    355
    +          #
    
    356
    +          # Public instance variables
    
    357
    +          #
    
    358
    +          self.name = name  # The name of this foo
    
    359
    +
    
    360
    +          #
    
    361
    +          # Private instance variables
    
    362
    +          #
    
    363
    +          self._count = 0   # The count of this foo
    
    364
    +
    
    365
    +      ################################################
    
    366
    +      #               Abstract Methods               #
    
    367
    +      ################################################
    
    368
    +
    
    369
    +      # NOTE: Abstract methods in BuildStream are allowed to have
    
    370
    +      #       default implementations.
    
    371
    +      #
    
    372
    +      #       Subclasses must NEVER override any method which was
    
    373
    +      #       not advertised as an abstract method by the parent class.
    
    374
    +
    
    375
    +      # frob()
    
    376
    +      #
    
    377
    +      # Implementors should implement this to frob this foo
    
    378
    +      # count times if possible.
    
    379
    +      #
    
    380
    +      # Args:
    
    381
    +      #    count (int): The number of times to frob this foo
    
    382
    +      #
    
    383
    +      # Returns:
    
    384
    +      #    (int): The number of times this foo was frobbed.
    
    385
    +      #
    
    386
    +      # Raises:
    
    387
    +      #    (FooError): Implementors are expected to raise this error
    
    388
    +      #
    
    389
    +      def frob(self, count):
    
    390
    +
    
    391
    +          #
    
    392
    +          # An abstract method in BuildStream is allowed to have
    
    393
    +          # a default implementation.
    
    394
    +          #
    
    395
    +          self._count = self._do_frobbing(count)
    
    396
    +
    
    397
    +          return self._count
    
    398
    +
    
    399
    +      ################################################
    
    400
    +      #     Implementation of abstract methods       #
    
    401
    +      ################################################
    
    402
    +
    
    403
    +      # NOTE: Implementations of abstract methods defined by
    
    404
    +      #       the parent class should NEVER document the API
    
    405
    +      #       here redundantly.
    
    406
    +
    
    407
    +      def frobbish(self):
    
    408
    +         #
    
    409
    +         # Implementation of the "frobbish" abstract method
    
    410
    +         # defined by the parent Bar class.
    
    411
    +         #
    
    412
    +         return True
    
    413
    +
    
    414
    +      ################################################
    
    415
    +      #                 Public Methods               #
    
    416
    +      ################################################
    
    417
    +
    
    418
    +      # NOTE: Public methods here are the ones which are expected
    
    419
    +      #       to be called from outside of this class.
    
    420
    +      #
    
    421
    +      #       These, along with any abstract methods, usually
    
    422
    +      #       constitute the API surface of this class.
    
    423
    +
    
    424
    +      # frobnicate()
    
    425
    +      #
    
    426
    +      # Perform the frobnication process on this Foo
    
    427
    +      #
    
    428
    +      # Raises:
    
    429
    +      #    (FrobError): In the case that a frobnication error was
    
    430
    +      #                 encountered
    
    431
    +      #
    
    432
    +      def frobnicate(self):
    
    433
    +          frobnicator.frobnicate(self)
    
    434
    +
    
    435
    +      # set_count()
    
    436
    +      #
    
    437
    +      # Sets the count of this foo
    
    438
    +      #
    
    439
    +      # Args:
    
    440
    +      #    count (int): The new count to set
    
    441
    +      #
    
    442
    +      def set_count(self, count):
    
    443
    +
    
    444
    +          self._count = count
    
    445
    +
    
    446
    +      # get_count()
    
    447
    +      #
    
    448
    +      # Accessor for the count value of this foo.
    
    449
    +      #
    
    450
    +      # Returns:
    
    451
    +      #    (int): The count of this foo
    
    452
    +      #
    
    453
    +      def get_count(self):
    
    454
    +
    
    455
    +          return self._count
    
    456
    +
    
    457
    +      ################################################
    
    458
    +      #                 Private Methods              #
    
    459
    +      ################################################
    
    460
    +
    
    461
    +      # NOTE: Private methods are the ones which are internal
    
    462
    +      #       implementation details of this class.
    
    463
    +      #
    
    464
    +      #       Even though these are private implementation
    
    465
    +      #       details, they still MUST have API documenting
    
    466
    +      #       comments on them.
    
    467
    +
    
    468
    +      # _do_frobbing()
    
    469
    +      #
    
    470
    +      # Does the actual frobbing
    
    471
    +      #
    
    472
    +      # Args:
    
    473
    +      #    count (int): The number of times to frob this foo
    
    474
    +      #
    
    475
    +      # Returns:
    
    476
    +      #    (int): The number of times this foo was frobbed.
    
    477
    +      #
    
    478
    +      def _do_frobbing(self, count):
    
    479
    +          return count
    
    480
    +
    
    481
    +
    
    482
    +.. _contributing_public_and_private:
    
    483
    +
    
    484
    +Public and private symbols
    
    485
    +~~~~~~~~~~~~~~~~~~~~~~~~~~
    
    486
    +BuildStream mostly follows PEP-8 for defining *public* and *private* symbols
    
    487
    +for any given class, with some deviations. Please read the `section on inheritance
    
    488
    +<https://www.python.org/dev/peps/pep-0008/#designing-for-inheritance>`_ for
    
    489
    +reference on how PEP-8 defines public and non-public.
    
    490
    +
    
    491
    +* A *public* symbol is any symbol which you expect to be used by clients
    
    492
    +  of your class or module within BuildStream.
    
    493
    +
    
    494
    +  Public symbols are written without any leading underscores.
    
    495
    +
    
    496
    +* A *private* symbol is any symbol which is entirely internal to your class
    
    497
    +  or module within BuildStream. These symbols cannot ever be accessed by
    
    498
    +  external clients or modules.
    
    499
    +
    
    500
    +  A private symbol must be denoted by a leading underscore.
    
    501
    +
    
    502
    +* When a class can have subclasses, then private symbols should be denoted
    
    503
    +  by two leading underscores. For example, the ``Sandbox`` or ``Platform``
    
    504
    +  classes which have various implementations, or the ``Element`` and ``Source``
    
    505
    +  classes which plugins derive from.
    
    506
    +
    
    507
    +  The double leading underscore naming convention invokes Python's name
    
    508
    +  mangling algorithm which helps prevent namespace collisions in the case
    
    509
    +  that subclasses might have a private symbol with the same name.
    
    510
    +
    
    511
    +In BuildStream, we have what we call a *"Public API Surface"*, as previously
    
    512
    +mentioned in :ref:`contributing_documenting_symbols`. In the :ref:`next section
    
    513
    +<contributing_public_api_surface>` we will discuss the *"Public API Surface"* and
    
    514
    +outline the exceptions to the rules discussed here.
    
    515
    +
    
    516
    +
    
    517
    +.. _contributing_public_api_surface:
    
    518
    +
    
    519
    +Public API surface
    
    520
    +~~~~~~~~~~~~~~~~~~
    
    521
    +BuildStream exposes what we call a *"Public API Surface"* which is stable
    
    522
    +and unchanging. This is for the sake of stability of the interfaces which
    
    523
    +plugins use, so it can also be referred to as the *"Plugin facing API"*.
    
    524
    +
    
    525
    +Any symbols which are a part of the *"Public API Surface*" are never allowed
    
    526
    +to change once they have landed in a stable release version of BuildStream. As
    
    527
    +such, we aim to keep the *"Public API Surface"* as small as possible at all
    
    528
    +times, and never expose any internal details to plugins inadvertently.
    
    529
    +
    
    530
    +One problem which arises from this is that we end up having symbols
    
    531
    +which are *public* according to the :ref:`rules discussed in the previous section
    
    532
    +<contributing_public_and_private>`, but must be hidden away from the
    
    533
    +*"Public API Surface"*. For example, BuildStream internal classes need
    
    534
    +to invoke methods on the ``Element`` and ``Source`` classes, wheras these
    
    535
    +methods need to be hidden from the *"Public API Surface"*.
    
    536
    +
    
    537
    +This is where BuildStream deviates from the PEP-8 standard for public
    
    538
    +and private symbol naming.
    
    539
    +
    
    540
    +In order to disambiguate between:
    
    541
    +
    
    542
    +* Symbols which are publicly accessible details of the ``Element`` class, can
    
    543
    +  be accessed by BuildStream internals, but must remain hidden from the
    
    544
    +  *"Public API Surface"*.
    
    545
    +
    
    546
    +* Symbols which are private to the ``Element`` class, and cannot be accessed
    
    547
    +  from outside of the ``Element`` class at all.
    
    548
    +
    
    549
    +We denote the former category of symbols with only a single underscore, and the latter
    
    550
    +category of symbols with a double underscore. We often refer to this distinction
    
    551
    +as *"API Private"* (the former category) and *"Local Private"* (the latter category).
    
    552
    +
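    +For example, on a class which is a part of the *"Public API Surface"*, the
    +naming might look like this (the class and method names here are purely
    +illustrative)::
    +
    +  class Foo(Bar):
    +
    +      def frobnicate(self):
    +          # Public API Surface: documented with a docstring and
    +          # usable by plugins.
    +          ...
    +
    +      def _schedule_frobnication(self):
    +          # "API Private": hidden from the "Public API Surface", but
    +          # called by other modules in the BuildStream core.
    +          ...
    +
    +      def __update_frobnication_state(self):
    +          # "Local Private": only ever called from within Foo itself.
    +          ...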
    
    553
    +Classes which are a part of the *"Public API Surface"* and require this disambiguation
    
    554
    +were not discussed in :ref:`the class ordering section <contributing_class_order>`. For
    
    555
    +these classes, the *"API Private"* symbols always come **before** the *"Local Private"*
    
    556
    +symbols in the class declaration.
    
    557
    +
    
    558
    +Modules which are not a part of the *"Public API Surface"* have their Python files
    
    559
    +prefixed with a single underscore, and are not imported in BuildStream's master
    
    560
    +``__init__.py`` which is used by plugins.
    
    561
    +
    
    562
    +.. note::
    
    563
    +
    
    564
    +   The ``utils.py`` module is public and exposes a handful of utility functions,
    
    565
    +   however many of the functions it provides are *"API Private"*.
    
    566
    +
    
    567
    +   In this case, the *"API Private"* functions are prefixed with a single underscore.
    
    568
    +
    
    569
    +Any objects which are a part of the *"Public API Surface"* should be exposed via the
    
    570
    +toplevel ``__init__.py`` of the ``buildstream`` package.
    
    571
    +
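    +As an illustration, the toplevel ``__init__.py`` contains import lines
    +along these lines (the exact contents may differ)::
    +
    +  from .element import Element, ElementError
    +  from .source import Source, SourceError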
    
    572
    +
    
    573
    +File naming convention
    
    574
    +~~~~~~~~~~~~~~~~~~~~~~
    
    575
    +With the exception of a few helper objects and data structures, we structure
    
    576
    +the code in BuildStream such that every filename is named after the object it
    
    577
    +implements. E.g. The ``Project`` object is implemented in ``_project.py``, the
    
    578
    +``Context`` object in ``_context.py``, the base ``Element`` class in ``element.py``,
    
    579
    +etc.
    
    580
    +
    
    581
    +As mentioned in the previous section, objects which are not a part of the
    
    582
    +:ref:`public, plugin facing API surface <contributing_public_api_surface>` have their
    
    583
    +filenames prefixed with a leading underscore (like ``_context.py`` and ``_project.py``
    
    584
    +in the examples above).
    
    585
    +
    
    586
    +When an object name has multiple words in it, e.g. ``ArtifactCache``, then the
    
    587
    +resulting file is named all in lower case without any underscore to separate
    
    588
    +words. In the case of ``ArtifactCache``, the filename implementing this object
    
    589
    +is found at ``_artifactcache/artifactcache.py``.
    
    94 590
     
    
    95 591
     
    
    96 592
     Imports
    
    97 593
     ~~~~~~~
    
    98
    -Module imports inside BuildStream are done with relative ``.`` notation
    
    594
    +Module imports inside BuildStream are done with relative ``.`` notation:
    
    99 595
     
    
    100
    -Good::
    
    596
    +**Good**::
    
    101 597
     
    
    102
    -  from .context import Context
    
    598
    +  from ._context import Context
    
    103 599
     
    
    104
    -Bad::
    
    600
    +**Bad**::
    
    105 601
     
    
    106
    -  from buildstream.context import Context
    
    602
    +  from buildstream._context import Context
    
    107 603
     
    
    108 604
     The exception to the above rule is when authoring plugins,
    
    109 605
     plugins do not reside in the same namespace so they must
    
    ... ... @@ -122,128 +618,586 @@ This makes things clear when reading code that said functions
    122 618
     are not defined in the same file but come from utils.py for example.
    
    123 619
     
    
    124 620
     
    
    125
    -Policy for private symbols
    
    126
    -~~~~~~~~~~~~~~~~~~~~~~~~~~
    
    127
    -Private symbols are expressed via a leading ``_`` single underscore, or
    
    128
    -in some special circumstances with a leading ``__`` double underscore.
    
    621
    +.. _contributing_instance_variables:
    
    622
    +
    
    623
    +Instance variables
    
    624
    +~~~~~~~~~~~~~~~~~~
    
    625
    +It is preferred that all instance state variables be declared as :ref:`private symbols
    
    626
    +<contributing_public_and_private>`. However, in some cases, especially when the state
    
    627
    +is immutable for the object's lifetime (like an ``Element`` name for example), it
    
    628
    +is acceptable to save some typing by using a publicly accessible instance variable.
    
    629
    +
    
    630
    +It is never acceptable to modify the value of an instance variable from outside
    
    631
    +of the declaring class, even if the variable is *public*. In other words, the class
    
    632
    +which exposes an instance variable is the only one in control of the value of this
    
    633
    +variable.
    
    634
    +
    
    635
    +* If an instance variable is public and must be modified; then it must be
    
    636
    +  modified using a :ref:`mutator <contributing_accessor_mutator>`.
    
    637
    +
    
    638
    +* Ideally for better encapsulation, all object state is declared as
    
    639
    +  :ref:`private instance variables <contributing_public_and_private>` and can
    
    640
    +  only be accessed by external classes via public :ref:`accessors and mutators
    
    641
    +  <contributing_accessor_mutator>`.
    
    642
    +
    
    643
    +.. note::
    
    644
    +
    
    645
    +   In some cases, we may use small data structures declared as objects for the sake
    
    646
    +   of better readability, where the object class itself has no real supporting code.
    
    647
    +
    
    648
    +   In these exceptions, it can be acceptable to modify the instance variables
    
    649
    +   of these objects directly, unless they are otherwise documented to be immutable.
    
    650
    +
    
    651
    +
    
    652
    +.. _contributing_accessor_mutator:
    
    653
    +
    
    654
    +Accessors and mutators
    
    655
    +~~~~~~~~~~~~~~~~~~~~~~
    
    656
    +An accessor and a mutator are methods defined on the object class to access (get)
    
    657
    +or mutate (set) a value owned by the declaring class, respectively.
    
    658
    +
    
    659
    +An accessor might derive the returned value from one or more of its components,
    
    660
    +and a mutator might have side effects, or delegate the mutation to a component.
    
    661
    +
    
    662
    +Accessors and mutators are always :ref:`public <contributing_public_and_private>`
    
    663
    +(even if they might have a single leading underscore and are considered
    
    664
    +:ref:`API Private <contributing_public_api_surface>`), as their purpose is to
    
    665
    +enforce encapsulation with regards to any accesses to the state which is owned
    
    666
    +by the declaring class.
    
    667
    +
    
    668
    +Accessors and mutators are functions prefixed with ``get_`` and ``set_``
    
    669
    +respectively, e.g.::
    
    670
    +
    
    671
    +  class Foo():
    
    672
    +
    
    673
    +      def __init__(self):
    
    674
    +
    
    675
    +          # Declare some internal state
    
    676
    +          self._count = 0
    
    677
    +
    
    678
    +      # get_count()
    
    679
    +      #
    
    680
    +      # Gets the count of this Foo.
    
    681
    +      #
    
    682
    +      # Returns:
    
    683
    +      #    (int): The current count of this Foo
    
    684
    +      #
    
    685
    +      def get_count(self):
    
    686
    +          return self._count
    
    687
    +
    
    688
    +      # set_count()
    
    689
    +      #
    
    690
    +      # Sets the count of this Foo.
    
    691
    +      #
    
    692
    +      # Args:
    
    693
    +      #    count (int): The new count for this Foo
    
    694
    +      #
    
    695
    +      def set_count(self, count):
    
    696
    +          self._count = count
    
    697
    +
    
    698
    +.. attention::
    
    129 699
     
    
    130
    -Before understanding the naming policy, it is first important to understand
    
    131
    -that in BuildStream, there are two levels of privateness which need to be
    
    132
    -considered.
    
    700
    +   We are aware that Python offers a facility for accessors and
    
    701
    +   mutators using the ``@property`` decorator instead. Do not use
    
    702
    +   the ``@property`` decorator.
    
    133 703
     
    
    134
    -These are treated subtly differently and thus need to be understood:
    
    704
    +   The decision to use explicitly defined functions instead of the
    
    705
    +   ``@property`` decorator is rather arbitrary; there is not much
    
    706
    +   technical merit to preferring one technique over the other.
    
    707
    +   However as :ref:`discussed below <contributing_always_consistent>`,
    
    708
    +   it is of the utmost importance that we do not mix both techniques
    
    709
    +   in the same codebase.
    
    135 710
     
    
    136
    -* API Private
    
    137 711
     
    
    138
    -  A symbol is considered to be *API private* if it is not exposed in the *public API*.
    
    712
    +.. _contributing_abstract_methods:
    
    139 713
     
    
    140
    -  Even if a symbol does not have any leading underscore, it may still be *API private*
    
    141
    -  if the containing *class* or *module* is named with a leading underscore.
    
    714
    +Abstract methods
    
    715
    +~~~~~~~~~~~~~~~~
    
    716
    +In BuildStream, an *"Abstract Method"* is a bit of a misnomer and does
    
    717
    +not match up to how Python defines abstract methods, so we need to seek out
    
    718
    +a new nomenclature to refer to these methods.
    
    142 719
     
    
    143
    -* Local private
    
    720
    +In Python, an *"Abstract Method"* is a method which **must** be
    
    721
    +implemented by a subclass, whereas all methods in Python can be
    
    722
    +overridden.
    
    144 723
     
    
    145
    -  A symbol is considered to be *local private* if it is not intended for access
    
    146
    -  outside of the defining *scope*.
    
    724
    +In BuildStream, we use the term *"Abstract Method"*, to refer to
    
    725
    +a method which **can** be overridden by a subclass, whereas it
    
    726
    +is **illegal** to override any other method.
    
    147 727
     
    
    148
    -  If a symbol has a leading underscore, it might not be *local private* if it is
    
    149
    -  declared on a publicly visible class, but needs to be accessed internally by
    
    150
    -  other modules in the BuildStream core.
    
    728
    +* Abstract methods are allowed to have default implementations.
    
    151 729
     
    
    730
    +* Subclasses are not allowed to redefine the calling signature
    
    731
    +  of an abstract method, or redefine the API contract in any way.
    
    152 732
     
    
    153
    -Ordering
    
    154
    -''''''''
    
    155
    -For better readability and consistency, we try to keep private symbols below
    
    156
    -public symbols. In the case of public modules where we may have a mix of
    
    157
    -*API private* and *local private* symbols, *API private* symbols should come
    
    158
    -before *local private* symbols.
    
    733
    +* Subclasses are not allowed to override any other methods.
    
    159 734
     
    
    735
    +The key here is that in BuildStream, we consider it unacceptable
    
    736
    +that a subclass overrides a method of its parent class unless
    
    737
    +the said parent class has explicitly given permission to subclasses
    
    738
    +to do so, and outlined the API contract for this purpose. No surprises
    
    739
    +are allowed.
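    +
    +To illustrate, a subclass of the ``Foo`` class from the earlier class
    +ordering example may override the advertised ``frob()`` abstract method,
    +but nothing else::
    +
    +  class BetterFoo(Foo):
    +
    +      # Fine: frob() is advertised as an abstract method by Foo.
    +      def frob(self, count):
    +          return count * 2
    +
    +      # Wrong: _do_frobbing() is an internal detail of Foo and was never
    +      # advertised as an abstract method, so it is illegal for a subclass
    +      # to override it.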
    
    160 740
     
    
    161
    -Symbol naming
    
    162
    -'''''''''''''
    
    163
    -Any private symbol must start with a single leading underscore for two reasons:
    
    164 741
     
    
    165
    -* So that it does not bleed into documentation and *public API*.
    
    742
    +Error handling
    
    743
    +~~~~~~~~~~~~~~
    
    744
    +In BuildStream, all non-recoverable errors are expressed via
    
    745
    +subclasses of the ``BstError`` exception.
    
    166 746
     
    
    167
    -* So that it is clear to developers which symbols are not used outside of the declaring *scope*
    
    747
    +This exception is handled deep in the core in a few places, and
    
    748
    +it is rarely necessary to handle a ``BstError``.
    
    168 749
     
    
    169
    -Remember that with python, the modules (python files) are also symbols
    
    170
    -within their containing *package*, as such; modules which are entirely
    
    171
    -private to BuildStream are named as such, e.g. ``_thismodule.py``.
    
    172 750
     
    
    751
    +Raising exceptions
    
    752
    +''''''''''''''''''
    
    753
    +When writing code in the BuildStream core, ensure that all system
    
    754
    +calls and third party library calls are wrapped in a ``try:`` block,
    
    755
    +and raise a descriptive ``BstError`` of the appropriate class explaining
    
    756
    +what exactly failed.
    
    173 757
     
    
    174
    -Cases for double underscores
    
    175
    -''''''''''''''''''''''''''''
    
    176
    -The double underscore in python has a special function. When declaring
    
    177
    -a symbol in class scope which has a leading underscore, it can only be
    
    178
    -accessed within the class scope using the same name. Outside of class
    
    179
    -scope, it can only be accessed with a *cheat*.
    
    758
    +Ensure that the original system call error is formatted into your new
    
    759
    +exception, and that you use the Python ``from`` semantic to retain the
    
    760
    +original call trace, for example::
    
    180 761
     
    
    181
    -We use the double underscore in cases where the type of privateness can be
    
    182
    -ambiguous.
    
    762
    +  try:
    
    763
    +      os.utime(self._refpath(ref))
    
    764
    +  except FileNotFoundError as e:
    
    765
    +      raise ArtifactError("Attempt to access unavailable artifact: {}".format(e)) from e
    
    183 766
     
    
    184
    -* For private modules and classes
    
    185 767
     
    
    186
    -  We never need to disambiguate with a double underscore
    
    768
    +Enhancing exceptions
    
    769
    +''''''''''''''''''''
    
    770
    +Sometimes the ``BstError`` originates from a lower level component,
    
    771
    +and the code segment which raised the exception did not have enough context
    
    772
    +to create a complete, informative summary of the error for the user.
    
    187 773
     
    
    188
    -* For private symbols declared in a public *scope*
    
    774
    +In these cases it is necessary to handle the error and raise a new
    
    775
    +one, e.g.::
    
    189 776
     
    
    190
    -  In the case that we declare a private method on a public object, it
    
    191
    -  becomes ambiguous whether:
    
    777
    +  try:
    
    778
    +      extracted_artifact = self._artifacts.extract(self, cache_key)
    
    779
    +  except ArtifactError as e:
    
    780
    +      raise ElementError("Failed to extract {} while checking out {}: {}"
    
    781
    +                         .format(cache_key, self.name, e)) from e
    
    192 782
     
    
    193
    -  * The symbol is *local private*, and only used within the given scope
    
    194 783
     
    
    195
    -  * The symbol is *API private*, and will be used internally by BuildStream
    
    196
    -    from other parts of the codebase.
    
    784
    +Programming errors
    
    785
    +''''''''''''''''''
    
    786
    +Sometimes you are writing code and have detected an unexpected condition,
    
    787
    +or a broken invariant for which the code cannot be prepared to handle
    
    788
    +gracefully.
    
    197 789
     
    
    198
    -  In this case, we use a single underscore for *API private* methods which
    
    199
    -  are not *local private*, and we use a double underscore for *local private*
    
    200
    -  methods declared in public scope.
    
    790
    +In these cases, do **not** raise any of the ``BstError`` class exceptions.
    
    201 791
     
    
    792
    +Instead, use the ``assert`` statement, e.g.::
    
    202 793
     
    
    203
    -Documenting private symbols
    
    204
    -'''''''''''''''''''''''''''
    
    205
    -Any symbol which is *API Private* (regardless of whether it is also
    
    206
    -*local private*), should have some documentation for developers to
    
    207
    -better understand the codebase.
    
    794
    +  assert utils._is_main_process(), \
    
    795
    +      "Attempted to save workspace configuration from child process"
    
    208 796
     
    
    209
    -Contrary to many other python projects, we do not use docstrings to
    
    210
    -document private symbols, but prefer to keep *API Private* symbols
    
    211
    -documented in code comments placed *above* the symbol (or *beside* the
    
    212
    -symbol in some cases, such as variable declarations in a class where
    
    213
    -a shorter comment is more desirable), rather than docstrings placed *below*
    
    214
    -the symbols being documented.
    
    797
    +This will result in a ``BUG`` message, with the stack trace included, being
    
    798
    +logged and reported in the frontend.
    
    215 799
     
    
    216
    -Other than this detail, follow the same guidelines for documenting
    
    217
    -symbols as described below.
    
    218 800
     
    
    801
    +BstError parameters
    
    802
    +'''''''''''''''''''
    
    803
    +When raising ``BstError`` class exceptions, there are some common properties
    
    804
    +which can be useful to know about:
    
    219 805
     
    
    220
    -Documenting BuildStream
    
    221
    ------------------------
    
    806
    +* **message:** The brief human readable error, will be formatted on one line in the frontend.
    
    807
    +
    
    808
    +* **detail:** An optional detailed human readable message to accompany the **message** summary
    
    809
    +  of the error. This is often used to recommend the user some course of action, or to provide
    
    810
    +  additional context about the error.
    
    811
    +
    
    812
    +* **temporary:** Some errors are allowed to be *temporary*; this attribute is only
    
    813
    +  observed from child processes which fail in a temporary way. This distinction
    
    814
    +  is used to determine whether the task should be *retried* or not. An error is usually
    
    815
    +  only a *temporary* error if the cause of the error was a network timeout.
    
    816
    +
    
    817
    +* **reason:** A machine readable identifier for the error. This is used for the purpose
    
    818
    +  of regression testing, such that we check that BuildStream has errored out for the
    
    819
    +  expected reason in a given failure mode.
    
    820
    +
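    +As an illustration only, a fetch failure in a source plugin might be raised
    +along these lines (the error class, reason string and message below are
    +just an example)::
    +
    +  raise SourceError("{}: Failed to fetch from {}".format(self, url),
    +                    detail="The server did not respond, try fetching again later",
    +                    reason="fetch-timeout",
    +                    temporary=True)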
    
    821
    +
    
    822
    +Documenting Exceptions
    
    823
    +''''''''''''''''''''''
    
    824
    +We have already seen :ref:`some examples <contributing_class_order>` of how
    
    825
    +exceptions are documented in API documenting comments, but this is worth some
    
    826
    +additional disambiguation.
    
    827
    +
    
    828
    +* Only document the exceptions which are raised directly by the function in question.
    
    829
    +  It is otherwise nearly impossible to keep track of what exceptions *might* be raised
    
    830
    +  indirectly by calling the given function.
    
    831
    +
    
    832
    +* For a regular public or private method, your audience is a caller of the function;
    
    833
    +  document the exception in terms of what exception might be raised as a result of
    
    834
    +  calling this method.
    
    835
    +
    
    836
    +* For an :ref:`abstract method <contributing_abstract_methods>`, your audience is the
    
    837
    +  implementor of the method in a subclass; document the exception in terms of what
    
    838
    +  exception is prescribed for the implementing class to raise.
    
    839
    +
    
    840
    +
    
    841
    +.. _contributing_always_consistent:
    
    842
    +
    
    843
    +Always be consistent
    
    844
    +~~~~~~~~~~~~~~~~~~~~
    
    845
    +There are various ways to define functions and classes in Python,
    
    846
    +as the language has evolved and gained new features over time.
    
    847
    +
    
    848
    +In BuildStream, we may not have leveraged all of the nice features
    
    849
    +we could have; that is okay, and where it does not break API, we
    
    850
    +can consider changing it.
    
    851
    +
    
    852
    +Even if you know there is a *better* way to do a given thing in
    
    853
    +Python when compared to the way we do it in BuildStream, *do not do it*.
    
    854
    +
    
    855
    +Consistency of how we do things in the codebase is more important
    
    856
    +than the actual way in which things are done, always.
    
    857
    +
    
    858
    +Instead, if you like a certain Python feature and think the BuildStream
    
    859
    +codebase should use it, then propose your change on the `mailing list
    
    860
    +<https://mail.gnome.org/mailman/listinfo/buildstream-list>`_. Chances
    
    861
    +are that we will reach agreement to use your preferred approach, and
    
    862
    +in that case, it will be important to apply the change uniformly
    
    863
    +across the entire codebase, such that we continue to have a consistent
    
    864
    +codebase.
    
    865
    +
    
    866
    +
    
    867
    +Avoid tail calling
    
    868
    +~~~~~~~~~~~~~~~~~~
    
    869
    +With the exception of tail calling with simple functions from
    
    870
    +the standard Python library, such as splitting and joining lines
    
    871
    +of text and encoding/decoding text, always avoid tail calling.
    
    872
    +
    
    873
    +**Good**::
    
    874
    +
    
    875
    +  # Variables that we will need declared up top
    
    876
    +  context = self._get_context()
    
    877
    +  workspaces = context.get_workspaces()
    
    878
    +
    
    879
    +  ...
    
    880
    +
    
    881
    +  # Saving the workspace configuration
    
    882
    +  workspaces.save_config()
    
    883
    +
    
    884
    +**Bad**::
    
    885
    +
    
    886
    +  # Saving the workspace configuration
    
    887
    +  self._get_context().get_workspaces().save_config()
    
    888
    +
    
    889
    +**Acceptable**::
    
    890
    +
    
    891
    +  # Decode the raw text loaded from a log file for display,
    
    892
    +  # join them into a single utf-8 string and strip away any
    
    893
    +  # trailing whitespace.
    
    894
    +  return '\n'.join([line.decode('utf-8') for line in lines]).rstrip()
    
    895
    +
    
    896
    +When you need to obtain a delegate object via an accessor function,
    
    897
    +either do it at the beginning of the function, or at the beginning
    
    898
    +of a code block within the function that will use that object.
    
    899
    +
    
    900
    +There are several reasons for this convention:
    
    901
    +
    
    902
    +* When observing a stack trace, it is always faster and easier to
    
    903
    +  determine what went wrong when all statements are on separate lines.
    
    904
    +
    
    905
    +* We always want individual lines to trace back to their origin as
    
    906
    +  much as possible for the purpose of tracing the history of code
    
    907
    +  with *git blame*.
    
    908
    +
    
    909
    +  One day, you might need the ``Context`` or ``Workspaces`` object
    
    910
    +  in the same function for another reason, at which point it will
    
    911
    +  be unacceptable to leave the existing line as written, because it
    
    912
    +  will introduce a redundant accessor to the same object, so the
    
    913
    +  line written as::
    
    914
    +
    
    915
    +    self._get_context().get_workspaces().save_config()
    
    916
    +
    
    917
    +  will have to change at that point, meaning we lose the valuable
    
    918
    +  information of which commit originally introduced this call
    
    919
    +  when running *git blame*.
    
    920
    +
    
    921
    +* For similar reasons, we prefer delegate objects be accessed near
    
    922
    +  the beginning of a function or code block so that there is less
    
    923
    +  chance that this statement will have to move in the future, if
    
    924
    +  the same function or code block needs the delegate object for any
    
    925
    +  other reason.
    
    926
    +
    
    927
    +  Aside from this, code is generally more legible and uniform when
    
    928
    +  variables are declared at the beginning of function blocks.
    
    929
    +
    
    930
    +
    
    931
    +Vertical stacking of modules
    
    932
    +~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    
    933
    +For the sake of overall comprehensibility of the BuildStream
    
    934
    +architecture, it is important that we retain vertical stacking
    
    935
    +order of the dependencies and knowledge of modules as much as
    
    936
    +possible, and avoid any cyclic relationships in modules.
    
    937
    +
    
    938
    +For instance, the ``Source`` objects are owned by ``Element``
    
    939
    +objects in the BuildStream data model, and as such the ``Element``
    
    940
    +will delegate some activities to the ``Source`` objects in its
    
    941
    +possession. The ``Source`` objects should however never call functions
    
    942
    +on the ``Element`` object, nor should the ``Source`` object itself
    
    943
    +have any understanding of what an ``Element`` is.
    
    944
    +
    
    945
    +If you are implementing a low level utility layer, for example
    
    946
    +as a part of the ``YAML`` loading code layers, it can be tempting
    
    947
    +to derive context from the higher levels of the codebase which use
    
    948
    +these low level utilities, instead of defining properly stand alone
    
    949
    +APIs for these utilities to work: Never do this.
    
    950
    +
    
    951
    +Unfortunately, unlike other languages where include files play
    
    952
    +a big part in making it difficult to make a mess, Python
    
    953
    +allows you to just call methods on arbitrary objects passed through
    
    954
    +a function call without having to import the module which defines
    
    955
    +those methods. This quickly leads to cyclic module dependencies
    
    956
    +if the developer does not take special care to ensure this does not
    
    957
    +happen.
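
A purely illustrative sketch of this principle follows; the function and attribute
names here are invented for the example and are not real BuildStream API.

**Good**::

  # The low level helper exposes a stand alone API taking only the
  # plain values it needs; the caller, which already knows about the
  # higher level object, passes those values down.
  def load_config(path):
      with open(path, encoding='utf-8') as f:
          return f.read()

**Bad**::

  # The low level helper reaches into a higher level object, quietly
  # creating an upward dependency on whatever type ``project``
  # happens to be.
  def load_project_config(project):
      with open(project.config_path, encoding='utf-8') as f:
          return f.read()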
    
    958
    +
    
    959
    +
    
    960
    +Minimize arguments in methods
    
    961
    +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    
    962
    +When creating an object, or adding a new API method to an existing
    
    963
    +object, always strive to keep as much context as possible on the
    
    964
    +object itself rather than expecting callers of the methods to provide
    
    965
    +everything the method needs every time.
    
    966
    +
    
    967
    +If the value or object that is needed in a function call is a constant
    
    968
    +for the lifetime of the object which exposes the given method, then
    
    969
    +that value or object should be passed in the constructor instead of
    
    970
    +via a method call.
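
As a small illustrative sketch (the ``ArtifactIndex`` class and its ``context``
collaborator are invented for this example), a collaborator which is constant
for the lifetime of the object belongs in the constructor.

**Good**::

  # The constant collaborator is handed over once, at construction
  # time, and the method signature stays small.
  class ArtifactIndex():
      def __init__(self, context):
          self._context = context

      def lookup(self, key):
          return self._context.resolve(key)

**Bad**::

  # Every caller must pass the context on every call, even though
  # it never changes for the lifetime of the index.
  class ArtifactIndex():
      def lookup(self, context, key):
          return context.resolve(key)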
    
    971
    +
    
    972
    +
    
    973
    +Minimize API surfaces
    
    974
    +~~~~~~~~~~~~~~~~~~~~~
    
    975
    +When creating an object, or adding new functionality in any way,
    
    976
    +try to keep the number of :ref:`public, outward facing <contributing_public_and_private>`
    
    977
    +symbols to a minimum; this is important for both
    
    978
    +:ref:`internal and public, plugin facing API surfaces <contributing_public_api_surface>`.
    
    979
    +
    
    980
    +When anyone visits a file, there are two levels of comprehension:
    
    981
    +
    
    982
    +* What do I need to know in order to *use* this object.
    
    983
    +
    
    984
    +* What do I need to know in order to *modify* this object.
    
    985
    +
    
    986
    +For the former, we want the reader to understand with as little effort
    
    987
    +as possible, what the public API contract is for a given object and consequently,
    
    988
    +how it is expected to be used. This is also why we
    
    989
    +:ref:`order the symbols of a class <contributing_class_order>` in such a way
    
    990
    +as to keep all outward facing public API surfaces at the top of the file, so that the
    
    991
    +reader never needs to dig deep into the bottom of the file to find something they
    
    992
    +might need to use.
    
    993
    +
    
    994
    +For the latter, when it comes to having to modify the file or add functionality,
    
    995
    +you want to retain as much freedom as possible to modify internals, while
    
    996
    +being sure that nothing external will be affected by internal modifications.
    
    997
    +Less client facing API means that you have less surrounding code to modify
    
    998
    +when your API changes. Further, ensuring that there is minimal outward facing
    
    999
    +API for any module minimizes the complexity for the developer working on
    
    1000
    +that module, by limiting the considerations needed regarding external side
    
    1001
    +effects of their modifications to the module.
    
    1002
    +
    
    1003
    +When modifying a file, one should not have to understand or think too
    
    1004
    +much about external side effects, when the API surface of the file is
    
    1005
    +well documented and minimal.
    
    1006
    +
    
    1007
    +When adding new API to a given object for a new purpose, consider whether
    
    1008
    +the new API is in any way redundant with other API (should this value now
    
    1009
    +go into the constructor, since we use it more than once? could this
    
    1010
    +value be passed along with another function, and the other function renamed,
    
    1011
    +to better suit the new purposes of this module/object?) and repurpose
    
    1012
    +the outward facing API of an object as a whole every time.
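
To illustrate with a hypothetical example (none of the names below are real
BuildStream API), a single public entry point backed by private helpers keeps
the outward facing surface small, leaving the internals free to change::

  class LogFormatter():

      # format()
      #
      # The only public, outward facing symbol; this is the API contract
      # that callers rely on.
      #
      def format(self, message):
          return self._decorate(self._normalize(message))

      # The private helpers below can be renamed, merged or removed
      # without affecting any callers.
      def _normalize(self, message):
          return message.strip()

      def _decorate(self, message):
          return "[bst] {}".format(message)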
    
    1013
    +
    
    1014
    +
    
    1015
    +Avoid transient state on instances
    
    1016
    +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    
    1017
    +At times, it can be tempting to store transient state that is
    
    1018
    +the result of one operation on an instance, only to be retrieved
    
    1019
    +later via an accessor function elsewhere.
    
    1020
    +
    
    1021
    +As a basic rule of thumb, if the value is transient and just the
    
    1022
    +result of one operation, which needs to be observed directly after
    
    1023
    +by another code segment, then never store it on the instance.
    
    1024
    +
    
    1025
    +BuildStream is complicated in the sense that it is multi-processed
    
    1026
    +and it is not always obvious how to pass the transient state around
    
    1027
    +as a return value or a function parameter. Do not fall prey to this
    
    1028
    +obstacle and pollute object instances with transient state.
    
    1029
    +
    
    1030
    +Instead, always refactor the surrounding code so that the value
    
    1031
    +is propagated to the desired end point via a well defined API, either
    
    1032
    +by adding new code paths or changing the design such that the
    
    1033
    +architecture continues to make sense.
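
The following sketch illustrates the rule of thumb; the ``Fetcher`` class and
its methods are invented for the example.

**Good**::

  # The transient value travels as a return value and a parameter.
  class Fetcher():
      def fetch(self):
          result = self._do_fetch()
          return result

      def report(self, result):
          print("Fetched: {}".format(result))

**Bad**::

  # The result of one operation is stashed on the instance, only for
  # another code segment to fish it out again later.
  class Fetcher():
      def fetch(self):
          self._last_result = self._do_fetch()

      def report(self):
          print("Fetched: {}".format(self._last_result))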
    
    1034
    +
    
    1035
    +
    
    1036
    +Refactor the codebase as needed
    
    1037
    +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    
    1038
    +Especially when implementing features, always move the BuildStream
    
    1039
    +codebase forward as a whole.
    
    1040
    +
    
    1041
    +Taking a short cut is alright when prototyping, but circumventing
    
    1042
    +existing architecture and design to get a feature implemented without
    
    1043
    +re-designing the surrounding architecture to accommodate the new
    
    1044
    +feature, is never acceptable upstream.
    
    1045
    +
    
    1046
    +For example, let's say that you have to implement a feature and you've
    
    1047
    +successfully prototyped it, but it launches a ``Job`` directly from a
    
    1048
    +``Queue`` implementation to get the feature to work, while the ``Scheduler``
    
    1049
    +is normally responsible for dispatching ``Jobs`` for the elements on
    
    1050
    +a ``Queue``. This means that you've proven that your feature can work,
    
    1051
    +and now it is time to start working on a patch for upstream.
    
    1052
    +
    
    1053
    +Consider what the scenario is and why you are circumventing the design,
    
    1054
    +and then redesign the ``Scheduler`` and ``Queue`` objects to accommodate for
    
    1055
    +the new feature and condition under which you need to dispatch a ``Job``,
    
    1056
    +or how you can give the ``Queue`` implementation the additional context it
    
    1057
    +needs.
    
    1058
    +
    
    1059
    +
    
    1060
    +Adding core plugins
    
    1061
    +-------------------
    
    1062
    +This is a checklist of things which need to be done when adding a new
    
    1063
    +core plugin to BuildStream proper.
    
    1064
    +
    
    1065
    +
    
    1066
    +Update documentation index
    
    1067
    +~~~~~~~~~~~~~~~~~~~~~~~~~~
    
    1068
    +The documentation generating scripts will automatically pick up your
    
    1069
    +newly added plugin and generate HTML, but will not add a link to the
    
    1070
    +documentation of your plugin.
    
    1071
    +
    
    1072
    +Whenever adding a new plugin, you must add an entry for it in ``doc/source/core_plugins.rst``.
    
    1073
    +
    
    1074
    +
    
    1075
    +Bump format version
    
    1076
    +~~~~~~~~~~~~~~~~~~~
    
    1077
    +In order for projects to assert that they have a new enough version
    
    1078
    +of BuildStream to use the new plugin, the ``BST_FORMAT_VERSION`` must
    
    1079
    +be incremented in the ``_versions.py`` file.
    
    1080
    +
    
    1081
    +Remember to include in your plugin's main documentation, the format
    
    1082
    +version in which the plugin was introduced, using the standard annotation
    
    1083
    +which we use throughout the documentation, e.g.::
    
    1084
    +
    
    1085
    +  .. note::
    
    1086
    +
    
    1087
    +     The ``foo`` plugin is available since :ref:`format version 16 <project_format_version>`
    
    1088
    +
    
    1089
    +
    
    1090
    +Add tests
    
    1091
    +~~~~~~~~~
    
    1092
    +Needless to say, all new feature additions need to be tested. For ``Element``
    
    1093
    +plugins, these usually need to be added to the integration tests. For ``Source``
    
    1094
    +plugins, the tests are added in two ways:
    
    1095
    +
    
    1096
    +* For most normal ``Source`` plugins, it is important to add a new ``Repo``
    
    1097
    +  implementation for your plugin in the ``tests/testutils/repo/`` directory
    
    1098
    +  and update ``ALL_REPO_KINDS`` in ``tests/testutils/repo/__init__.py``. This
    
    1099
    +  will include your new ``Source`` implementation in a series of already existing
    
    1100
    +  tests, ensuring it works well under normal operating conditions (see the sketch after this list).
    
    1101
    +
    
    1102
    +* For other source plugins, or in order to test edge cases, such as failure modes,
    
    1103
    +  which are not tested under the normal test battery, add new tests in ``tests/sources``.
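
For the first of these cases, the shape of a new ``Repo`` implementation is
roughly as follows. This is only an illustrative sketch with a hypothetical
``foo`` source kind; the exact base class and the methods to override should
be taken from the existing implementations in ``tests/testutils/repo/``::

  # tests/testutils/repo/foo.py (hypothetical 'foo' source kind)
  from .repo import Repo


  class Foo(Repo):

      # Create a repository of the new kind in ``directory`` from the
      # previously staged files, returning a ref for the created content.
      def create(self, directory):
          ...

      # Return the source configuration to insert into generated test
      # elements, optionally pinned to a specific ref. ``self.repo`` is
      # assumed here to be the repository directory set up by the base class.
      def source_config(self, ref=None):
          config = {
              'kind': 'foo',
              'url': 'file://' + self.repo
          }
          if ref is not None:
              config['ref'] = ref
          return config

The new kind is then registered by adding it to ``ALL_REPO_KINDS`` in
``tests/testutils/repo/__init__.py``, as described above.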
    
    1104
    +
    
    1105
    +
    
    1106
    +Extend the cachekey test
    
    1107
    +~~~~~~~~~~~~~~~~~~~~~~~~
    
    1108
    +For any newly added plugins, it is important to add some new simple elements
    
    1109
    +in ``tests/cachekey/project/elements`` or ``tests/cachekey/project/sources``,
    
    1110
    +and ensure that the newly added elements are depended on by ``tests/cachekey/project/target.bst``.
    
    1111
    +
    
    1112
    +One new element should be added to the cache key test for every configuration
    
    1113
    +value which your plugin understands which can possibly affect the result of
    
    1114
    +your plugin's ``Plugin.get_unique_key()`` implementation.
    
    1115
    +
    
    1116
    +This test ensures that cache keys do not unexpectedly change or become incompatible
    
    1117
    +due to code changes. As such, the cache key test should have full coverage of every
    
    1118
    +YAML configuration which can possibly affect cache key outcome at all times.
    
    1119
    +
    
    1120
    +See the ``tests/cachekey/update.py`` file for instructions on running the updater,
    
    1121
    +you need to run the updater to generate the ``.expected`` files and add the new
    
    1122
    +``.expected`` files in the same commit which extends the cache key test.
    
    1123
    +
    
    1124
    +
    
    1125
    +Protocol buffers
    
    1126
    +----------------
    
    1127
    +BuildStream uses protobuf and gRPC for serialization and communication with
    
    1128
    +artifact cache servers.  This requires ``.proto`` files and Python code
    
    1129
    +generated from the ``.proto`` files using protoc.  All these files live in the
    
    1130
    +``buildstream/_protos`` directory.  The generated files are included in the
    
    1131
    +git repository to avoid depending on grpcio-tools for user installations.
    
    1132
    +
    
    1133
    +
    
    1134
    +Regenerating code
    
    1135
    +~~~~~~~~~~~~~~~~~
    
    1136
    +When ``.proto`` files are modified, the corresponding Python code needs to
    
    1137
    +be regenerated.  As a prerequisite for code generation you need to install
    
    1138
    +``grpcio-tools`` using pip or some other mechanism::
    
    1139
    +
    
    1140
    +  pip3 install --user grpcio-tools
    
    1141
    +
    
    1142
    +To actually regenerate the code::
    
    1143
    +
    
    1144
    +  ./setup.py build_grpc
    
    1145
    +
    
    1146
    +
    
    1147
    +Documenting
    
    1148
    +-----------
    
    222 1149
     BuildStream starts out as a documented project from day one and uses
    
    223 1150
     sphinx to document itself.
    
    224 1151
     
    
    1152
    +This section discusses formatting policies for editing files in the
    
    1153
    +``doc/source`` directory, and describes the details of how the docs are
    
    1154
    +generated so that you can easily generate and view the docs yourself before
    
    1155
    +submitting patches to the documentation.
    
    1156
    +
    
    1157
    +For details on how API documenting comments and docstrings are formatted,
    
    1158
    +refer to the :ref:`documenting section of the coding guidelines
    
    1159
    +<contributing_documenting_symbols>`.
    
    1160
    +
    
    225 1161
     
    
    226 1162
     Documentation formatting policy
    
    227 1163
     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    
    228 1164
     The BuildStream documentation style is as follows:
    
    229 1165
     
    
    230
    -* Titles and headings require two leading empty lines above them. Only the first word should be capitalized.
    
    1166
    +* Titles and headings require two leading empty lines above them.
    
    1167
    +  Only the first word in a title should be capitalized.
    
    231 1168
     
    
    232
    -  * If there is an ``.. _internal_link`` anchor, there should be two empty lines above the anchor, followed by one leading empty line.
    
    1169
    +  * If there is an ``.. _internal_link:`` anchor, there should be two empty lines
    
    1170
    +    above the anchor, followed by one leading empty line.
    
    233 1171
     
    
    234 1172
     * Within a section, paragraphs should be separated by one empty line.
    
    235 1173
     
    
    236
    -* Notes are defined using: ``.. note::`` blocks, followed by an empty line and then indented (3 spaces) text.
    
    1174
    +* Notes are defined using: ``.. note::`` blocks, followed by an empty line
    
    1175
    +  and then indented (3 spaces) text.
    
    237 1176
     
    
    238
    -* Code blocks are defined using: ``.. code:: LANGUAGE`` blocks, followed by an empty line and then indented (3 spaces) text. Note that the default language is `python`.
    
    1177
    +  * Other kinds of notes can be used throughout the documentation and will
    
    1178
    +    be decorated in different ways, these work in the same way as ``.. note::`` does.
    
    1179
    +
    
    1180
    +    Feel free to also use ``.. attention::`` or ``.. important::`` to call special
    
    1181
    +    attention to a paragraph, ``.. tip::`` to give the reader a special tip on how
    
    1182
    +    to use an advanced feature or ``.. warning::`` to warn the user about a potential
    
    1183
    +    misuse of the API and explain its consequences.
    
    1184
    +
    
    1185
    +* Code blocks are defined using: ``.. code:: LANGUAGE`` blocks, followed by an empty
    
    1186
    +  line and then indented (3 spaces) text. Note that the default language is ``python``.
    
    239 1187
     
    
    240 1188
     * Cross references should be of the form ``:role:`target```.
    
    241 1189
     
    
    242
    -  * To cross reference arbitrary locations with, for example, the anchor ``_anchor_name``, you must give the link an explicit title: ``:ref:`Link text <anchor_name>```. Note that the "_" prefix is not required.
    
    1190
    +  * Explicit anchors can be declared as ``.. _anchor_name:`` on a line by itself.
    
    1191
    +
    
    1192
    +  * To cross reference arbitrary locations with, for example, the anchor ``anchor_name``,
    
    1193
    +    always provide some explicit text in the link instead of deriving the text from
    
    1194
    +    the target, e.g.: ``:ref:`Link text <anchor_name>```.
    
    1195
    +    Note that the "_" prefix is not used when referring to the target.
    
    243 1196
     
    
    244 1197
     Useful links:
    
    245 1198
     
    
    246
    -For further information, please see the `Sphinx Documentation <http://www.sphinx-doc.org/en/master/usage/restructuredtext/basics.html>`_.
    
    1199
    +For further information, please see the `Sphinx Documentation
    
    1200
    +<http://www.sphinx-doc.org/en/master/usage/restructuredtext/basics.html>`_.
    
    247 1201
     
    
    248 1202
     
    
    249 1203
     Building Docs
    
    ... ... @@ -312,54 +1266,28 @@ the ``man/`` subdirectory, which will be automatically included
    312 1266
     in the buildstream distribution.
    
    313 1267
     
    
    314 1268
     
    
    315
    -Documenting conventions
    
    316
    -~~~~~~~~~~~~~~~~~~~~~~~
    
    317
    -We use the sphinx.ext.napoleon extension for the purpose of having
    
    318
    -a bit nicer docstrings than the default sphinx docstrings.
    
    319
    -
    
    320
    -A docstring for a method, class or function should have the following
    
    321
    -format::
    
    322
    -
    
    323
    -  """Brief description of entity
    
    324
    -
    
    325
    -  Args:
    
    326
    -     argument1 (type): Description of arg
    
    327
    -     argument2 (type): Description of arg
    
    328
    -
    
    329
    -  Returns:
    
    330
    -     (type): Description of returned thing of the specified type
    
    331
    -
    
    332
    -  Raises:
    
    333
    -     (SomeError): When some error occurs
    
    334
    -     (SomeOtherError): When some other error occurs
    
    335
    -
    
    336
    -  A detailed description can go here if one is needed, only
    
    337
    -  after the above part documents the calling conventions.
    
    338
    -  """
    
    339
    -
    
    340
    -
    
    341 1269
     Documentation Examples
    
    342 1270
     ~~~~~~~~~~~~~~~~~~~~~~
    
    343 1271
     The examples section of the documentation contains a series of standalone
    
    344 1272
     examples, here are the criteria for an example addition.
    
    345 1273
     
    
    346
    -* The example has a ``${name}``
    
    1274
    +* The example has a ``${name}``.
    
    347 1275
     
    
    348
    -* The example has a project users can copy and use
    
    1276
    +* The example has a project users can copy and use.
    
    349 1277
     
    
    350
    -  * This project is added in the directory ``doc/examples/${name}``
    
    1278
    +  * This project is added in the directory ``doc/examples/${name}``.
    
    351 1279
     
    
    352
    -* The example has a documentation component
    
    1280
    +* The example has a documentation component.
    
    353 1281
     
    
    354 1282
       * This is added at ``doc/source/examples/${name}.rst``
    
    355 1283
       * A reference to ``examples/${name}`` is added to the toctree in ``doc/source/examples.rst``
    
    356 1284
       * This documentation discusses the project elements declared in the project and may
    
    357
    -    provide some BuildStream command examples
    
    358
    -  * This documentation links out to the reference manual at every opportunity
    
    1285
    +    provide some BuildStream command examples.
    
    1286
    +  * This documentation links out to the reference manual at every opportunity.
    
    359 1287
     
    
    360
    -* The example has a CI test component
    
    1288
    +* The example has a CI test component.
    
    361 1289
     
    
    362
    -  * This is an integration test added at ``tests/examples/${name}``
    
    1290
    +  * This is an integration test added at ``tests/examples/${name}``.
    
    363 1291
       * This test runs BuildStream in the ways described in the example
    
    364 1292
        and asserts that we get the results which we advertise to users in
    
    365 1293
         the said examples.
    
    ... ... @@ -386,17 +1314,17 @@ The ``.run`` file format is just another YAML dictionary which consists of a
    386 1314
     
    
    387 1315
     Each *command* is a dictionary, the members of which are listed here:
    
    388 1316
     
    
    389
    -* ``directory``: The input file relative project directory
    
    1317
    +* ``directory``: The input file relative project directory.
    
    390 1318
     
    
    391
    -* ``output``: The input file relative output html file to generate (optional)
    
    1319
    +* ``output``: The input file relative output html file to generate (optional).
    
    392 1320
     
    
    393 1321
     * ``fake-output``: Don't really run the command, just pretend to and pretend
    
    394 1322
      this was the output; an empty string will enable this too.
    
    395 1323
     
    
    396
    -* ``command``: The command to run, without the leading ``bst``
    
    1324
    +* ``command``: The command to run, without the leading ``bst``.
    
    397 1325
     
    
    398 1326
     * ``shell``: Specifying ``True`` indicates that ``command`` should be run as
    
    399
    -  a shell command from the project directory, instead of a bst command (optional)
    
    1327
    +  a shell command from the project directory, instead of a bst command (optional).
    
    400 1328
     
    
    401 1329
     When adding a new ``.run`` file, one should normally also commit the new
    
    402 1330
     resulting generated ``.html`` file(s) into the ``doc/source/sessions-stored/``
    
    ... ... @@ -419,30 +1347,10 @@ regenerate them locally in order to build the docs.
    419 1347
          command: build hello.bst
    
    420 1348
     
    
    421 1349
     
    
    422
    -Protocol Buffers
    
    423
    -----------------
    
    424
    -BuildStream uses protobuf and gRPC for serialization and communication with
    
    425
    -artifact cache servers.  This requires ``.proto`` files and Python code
    
    426
    -generated from the ``.proto`` files using protoc.  All these files live in the
    
    427
    -``buildstream/_protos`` directory.  The generated files are included in the
    
    428
    -git repository to avoid depending on grpcio-tools for user installations.
    
    1350
    +.. _contributing_testing:
    
    429 1351
     
    
    430
    -
    
    431
    -Regenerating code
    
    432
    -~~~~~~~~~~~~~~~~~
    
    433
    -When ``.proto`` files are modified, the corresponding Python code needs to
    
    434
    -be regenerated.  As a prerequisite for code generation you need to install
    
    435
    -``grpcio-tools`` using pip or some other mechanism::
    
    436
    -
    
    437
    -  pip3 install --user grpcio-tools
    
    438
    -
    
    439
    -To actually regenerate the code::
    
    440
    -
    
    441
    -  ./setup.py build_grpc
    
    442
    -
    
    443
    -
    
    444
    -Testing BuildStream
    
    445
    --------------------
    
    1352
    +Testing
    
    1353
    +-------
    
    446 1354
     BuildStream uses pytest for regression tests and testing out
    
    447 1355
     the behavior of newly added components.
    
    448 1356
     
    
    ... ... @@ -495,6 +1403,7 @@ with::
    495 1403
     Alternatively, any IDE plugin that uses pytest should automatically
    
    496 1404
     detect the ``.pylintrc`` in the project's root directory.
    
    497 1405
     
    
    1406
    +
    
    498 1407
     Adding tests
    
    499 1408
     ~~~~~~~~~~~~
    
    500 1409
     Tests are found in the tests subdirectory, inside of which
    
    ... ... @@ -513,7 +1422,7 @@ a subdirectory beside your test in which to store data.
    513 1422
     When creating a test that needs data, use the datafiles extension
    
    514 1423
     to decorate your test case (again, examples exist in the existing
    
    515 1424
     tests for this), documentation on the datafiles extension can
    
    516
    -be found here: https://pypi.python.org/pypi/pytest-datafiles
    
    1425
    +be found here: https://pypi.python.org/pypi/pytest-datafiles.
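
A minimal data driven test typically looks something like the following sketch;
the ``project`` data directory and the assertion are invented for the example,
the important parts are the decorator and the ``datafiles`` fixture injected by
the extension::

  import os
  import pytest

  # Static test data living in a directory beside this test module
  DATA_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'project')


  # The decorator copies DATA_DIR into a temporary directory for each
  # test run, and the ``datafiles`` argument points at that copy.
  @pytest.mark.datafiles(DATA_DIR)
  def test_project_data_is_available(datafiles):
      project = str(datafiles)
      assert os.path.exists(os.path.join(project, 'project.conf'))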
    
    517 1426
     
    
    518 1427
     Tests that run a sandbox should be decorated with::
    
    519 1428
     
    
    ... ... @@ -521,8 +1430,9 @@ Tests that run a sandbox should be decorated with::
    521 1430
     
    
    522 1431
     and use the integration cli helper.
    
    523 1432
     
    
    524
    -Measuring BuildStream performance
    
    525
    ----------------------------------
    
    1433
    +
    
    1434
    +Measuring performance
    
    1435
    +---------------------
    
    526 1436
     
    
    527 1437
     
    
    528 1438
     Benchmarking framework
    

  • NEWS
    ... ... @@ -24,6 +24,9 @@ buildstream 1.3.1
    24 24
       o Add new `pip` source plugin for downloading python packages using pip,
    
    25 25
         based on requirements files from previous sources.
    
    26 26
     
    
    27
    +  o Generate Docker images from built artifacts using
    
    28
    +    `contrib/bst-docker-import` script.
    
    29
    +
    
    27 30
     
    
    28 31
     =================
    
    29 32
     buildstream 1.1.5
    

  • buildstream/_artifactcache/artifactcache.py
    ... ... @@ -426,6 +426,22 @@ class ArtifactCache():
    426 426
             raise ImplError("Cache '{kind}' does not implement contains()"
    
    427 427
                             .format(kind=type(self).__name__))
    
    428 428
     
    
    429
    +    # contains_subdir_artifact():
    
    430
    +    #
    
    431
    +    # Check whether an artifact element contains a digest for a subdir
    
    432
    +    # which is populated in the cache, i.e. non-dangling.
    
    433
    +    #
    
    434
    +    # Args:
    
    435
    +    #     element (Element): The Element to check
    
    436
    +    #     key (str): The cache key to use
    
    437
    +    #     subdir (str): The subdir to check
    
    438
    +    #
    
    439
    +    # Returns: True if the subdir exists & is populated in the cache, False otherwise
    
    440
    +    #
    
    441
    +    def contains_subdir_artifact(self, element, key, subdir):
    
    442
    +        raise ImplError("Cache '{kind}' does not implement contains_subdir_artifact()"
    
    443
    +                        .format(kind=type(self).__name__))
    
    444
    +
    
    429 445
         # list_artifacts():
    
    430 446
         #
    
    431 447
         # List artifacts in this cache in LRU order.
    
    ... ... @@ -551,11 +567,12 @@ class ArtifactCache():
    551 567
         #     element (Element): The Element whose artifact is to be fetched
    
    552 568
         #     key (str): The cache key to use
    
    553 569
         #     progress (callable): The progress callback, if any
    
    570
    +    #     subdir (str): The optional specific subdir to pull
    
    554 571
         #
    
    555 572
         # Returns:
    
    556 573
         #   (bool): True if pull was successful, False if artifact was not available
    
    557 574
         #
    
    558
    -    def pull(self, element, key, *, progress=None):
    
    575
    +    def pull(self, element, key, *, progress=None, subdir=None, excluded_subdirs=None):
    
    559 576
             raise ImplError("Cache '{kind}' does not implement pull()"
    
    560 577
                             .format(kind=type(self).__name__))
    
    561 578
     
    

  • buildstream/_artifactcache/cascache.py
    ... ... @@ -92,6 +92,16 @@ class CASCache(ArtifactCache):
    92 92
             # This assumes that the repository doesn't have any dangling pointers
    
    93 93
             return os.path.exists(refpath)
    
    94 94
     
    
    95
    +    def contains_subdir_artifact(self, element, key, subdir):
    
    96
    +        tree = self.resolve_ref(self.get_artifact_fullname(element, key))
    
    97
    +
    
    98
    +        # This assumes that the subdir digest is present in the element tree
    
    99
    +        subdirdigest = self._get_subdir(tree, subdir)
    
    100
    +        objpath = self.objpath(subdirdigest)
    
    101
    +
    
    102
    +        # True if subdir content is cached or if empty as expected
    
    103
    +        return os.path.exists(objpath)
    
    104
    +
    
    95 105
         def extract(self, element, key):
    
    96 106
             ref = self.get_artifact_fullname(element, key)
    
    97 107
     
    
    ... ... @@ -228,7 +238,7 @@ class CASCache(ArtifactCache):
    228 238
                 remotes_for_project = self._remotes[element._get_project()]
    
    229 239
                 return any(remote.spec.push for remote in remotes_for_project)
    
    230 240
     
    
    231
    -    def pull(self, element, key, *, progress=None):
    
    241
    +    def pull(self, element, key, *, progress=None, subdir=None, excluded_subdirs=None):
    
    232 242
             ref = self.get_artifact_fullname(element, key)
    
    233 243
     
    
    234 244
             project = element._get_project()
    
    ... ... @@ -247,8 +257,14 @@ class CASCache(ArtifactCache):
    247 257
                     tree.hash = response.digest.hash
    
    248 258
                     tree.size_bytes = response.digest.size_bytes
    
    249 259
     
    
    250
    -                self._fetch_directory(remote, tree)
    
    260
    +                # Check if the element artifact is present, if so just fetch subdir
    
    261
    +                if subdir and os.path.exists(self.objpath(tree)):
    
    262
    +                    self._fetch_subdir(remote, tree, subdir)
    
    263
    +                else:
    
    264
    +                    # Fetch artifact, excluded_subdirs determined in pullqueue
    
    265
    +                    self._fetch_directory(remote, tree, excluded_subdirs=excluded_subdirs)
    
    251 266
     
    
    267
    +                # tree is the remote value, so is the same with or without a dangling ref locally
    
    252 268
                     self.set_ref(ref, tree)
    
    253 269
     
    
    254 270
                     element.info("Pulled artifact {} <- {}".format(display_key, remote.spec.url))
    
    ... ... @@ -668,8 +684,10 @@ class CASCache(ArtifactCache):
    668 684
                              stat.S_IRGRP | stat.S_IXGRP | stat.S_IROTH | stat.S_IXOTH)
    
    669 685
     
    
    670 686
             for dirnode in directory.directories:
    
    671
    -            fullpath = os.path.join(dest, dirnode.name)
    
    672
    -            self._checkout(fullpath, dirnode.digest)
    
    687
    +            # Don't try to checkout a dangling ref
    
    688
    +            if os.path.exists(self.objpath(dirnode.digest)):
    
    689
    +                fullpath = os.path.join(dest, dirnode.name)
    
    690
    +                self._checkout(fullpath, dirnode.digest)
    
    673 691
     
    
    674 692
             for symlinknode in directory.symlinks:
    
    675 693
                 # symlink
    
    ... ... @@ -948,10 +966,12 @@ class CASCache(ArtifactCache):
    948 966
         #     remote (Remote): The remote to use.
    
    949 967
         #     dir_digest (Digest): Digest object for the directory to fetch.
    
    950 968
         #
    
    951
    -    def _fetch_directory(self, remote, dir_digest):
    
    969
    +    def _fetch_directory(self, remote, dir_digest, *, excluded_subdirs=None):
    
    952 970
             fetch_queue = [dir_digest]
    
    953 971
             fetch_next_queue = []
    
    954 972
             batch = _CASBatchRead(remote)
    
    973
    +        if not excluded_subdirs:
    
    974
    +            excluded_subdirs = []
    
    955 975
     
    
    956 976
             while len(fetch_queue) + len(fetch_next_queue) > 0:
    
    957 977
                 if len(fetch_queue) == 0:
    
    ... ... @@ -966,8 +986,9 @@ class CASCache(ArtifactCache):
    966 986
                     directory.ParseFromString(f.read())
    
    967 987
     
    
    968 988
                 for dirnode in directory.directories:
    
    969
    -                batch = self._fetch_directory_node(remote, dirnode.digest, batch,
    
    970
    -                                                   fetch_queue, fetch_next_queue, recursive=True)
    
    989
    +                if dirnode.name not in excluded_subdirs:
    
    990
    +                    batch = self._fetch_directory_node(remote, dirnode.digest, batch,
    
    991
    +                                                       fetch_queue, fetch_next_queue, recursive=True)
    
    971 992
     
    
    972 993
                 for filenode in directory.files:
    
    973 994
                     batch = self._fetch_directory_node(remote, filenode.digest, batch,
    
    ... ... @@ -976,6 +997,10 @@ class CASCache(ArtifactCache):
    976 997
             # Fetch final batch
    
    977 998
             self._fetch_directory_batch(remote, batch, fetch_queue, fetch_next_queue)
    
    978 999
     
    
    1000
    +    def _fetch_subdir(self, remote, tree, subdir):
    
    1001
    +        subdirdigest = self._get_subdir(tree, subdir)
    
    1002
    +        self._fetch_directory(remote, subdirdigest)
    
    1003
    +
    
    979 1004
         def _fetch_tree(self, remote, digest):
    
    980 1005
             # download but do not store the Tree object
    
    981 1006
             with tempfile.NamedTemporaryFile(dir=self.tmpdir) as out:
    

  • buildstream/_context.py
    ... ... @@ -110,6 +110,9 @@ class Context():
    110 110
             # Make sure the XDG vars are set in the environment before loading anything
    
    111 111
             self._init_xdg()
    
    112 112
     
    
    113
    +        # Whether or not to attempt to pull buildtrees globally
    
    114
    +        self.pullbuildtrees = False
    
    115
    +
    
    113 116
             # Private variables
    
    114 117
             self._cache_key = None
    
    115 118
             self._message_handler = None
    
    ... ... @@ -160,7 +163,7 @@ class Context():
    160 163
             _yaml.node_validate(defaults, [
    
    161 164
                 'sourcedir', 'builddir', 'artifactdir', 'logdir',
    
    162 165
                 'scheduler', 'artifacts', 'logging', 'projects',
    
    163
    -            'cache'
    
    166
    +            'cache', 'pullbuildtrees'
    
    164 167
             ])
    
    165 168
     
    
    166 169
             for directory in ['sourcedir', 'builddir', 'artifactdir', 'logdir']:
    
    ... ... @@ -185,6 +188,9 @@ class Context():
    185 188
             # Load artifact share configuration
    
    186 189
             self.artifact_cache_specs = ArtifactCache.specs_from_config_node(defaults)
    
    187 190
     
    
    191
    +        # Load pull buildtrees configuration
    
    192
    +        self.pullbuildtrees = _yaml.node_get(defaults, bool, 'pullbuildtrees', default_value=False)
    
    193
    +
    
    188 194
             # Load logging config
    
    189 195
             logging = _yaml.node_get(defaults, Mapping, 'logging')
    
    190 196
             _yaml.node_validate(logging, [
    

  • buildstream/_frontend/cli.py
    ... ... @@ -305,10 +305,12 @@ def init(app, project_name, format_version, element_path, force):
    305 305
                   help="Allow tracking to cross junction boundaries")
    
    306 306
     @click.option('--track-save', default=False, is_flag=True,
    
    307 307
                   help="Deprecated: This is ignored")
    
    308
    +@click.option('--pull-buildtrees', default=False, is_flag=True,
    
    309
    +              help="Pull buildtrees from a remote cache server")
    
    308 310
     @click.argument('elements', nargs=-1,
    
    309 311
                     type=click.Path(readable=False))
    
    310 312
     @click.pass_obj
    
    311
    -def build(app, elements, all_, track_, track_save, track_all, track_except, track_cross_junctions):
    
    313
    +def build(app, elements, all_, track_, track_save, track_all, track_except, track_cross_junctions, pull_buildtrees):
    
    312 314
         """Build elements in a pipeline"""
    
    313 315
     
    
    314 316
         if (track_except or track_cross_junctions) and not (track_ or track_all):
    
    ... ... @@ -327,7 +329,8 @@ def build(app, elements, all_, track_, track_save, track_all, track_except, trac
    327 329
                              track_targets=track_,
    
    328 330
                              track_except=track_except,
    
    329 331
                              track_cross_junctions=track_cross_junctions,
    
    330
    -                         build_all=all_)
    
    332
    +                         build_all=all_,
    
    333
    +                         pull_buildtrees=pull_buildtrees)
    
    331 334
     
    
    332 335
     
    
    333 336
     ##################################################################
    
    ... ... @@ -429,10 +432,12 @@ def track(app, elements, deps, except_, cross_junctions):
    429 432
                   help='The dependency artifacts to pull (default: none)')
    
    430 433
     @click.option('--remote', '-r',
    
    431 434
                   help="The URL of the remote cache (defaults to the first configured cache)")
    
    435
    +@click.option('--pull-buildtrees', default=False, is_flag=True,
    
    436
    +              help="Pull buildtrees from a remote cache server")
    
    432 437
     @click.argument('elements', nargs=-1,
    
    433 438
                     type=click.Path(readable=False))
    
    434 439
     @click.pass_obj
    
    435
    -def pull(app, elements, deps, remote):
    
    440
    +def pull(app, elements, deps, remote, pull_buildtrees):
    
    436 441
         """Pull a built artifact from the configured remote artifact cache.
    
    437 442
     
    
    438 443
        By default the artifact will be pulled from one of the configured caches
    
    ... ... @@ -446,7 +451,7 @@ def pull(app, elements, deps, remote):
    446 451
             all:   All dependencies
    
    447 452
         """
    
    448 453
         with app.initialized(session_name="Pull"):
    
    449
    -        app.stream.pull(elements, selection=deps, remote=remote)
    
    454
    +        app.stream.pull(elements, selection=deps, remote=remote, pull_buildtrees=pull_buildtrees)
    
    450 455
     
    
    451 456
     
    
    452 457
     ##################################################################
    

  • buildstream/_scheduler/queues/pullqueue.py
    ... ... @@ -32,9 +32,20 @@ class PullQueue(Queue):
    32 32
         complete_name = "Pulled"
    
    33 33
         resources = [ResourceType.DOWNLOAD, ResourceType.CACHE]
    
    34 34
     
    
    35
    +    def __init__(self, scheduler, buildtrees=False):
    
    36
    +        super().__init__(scheduler)
    
    37
    +
    
    38
    +        # Current default exclusions on pull
    
    39
    +        self._excluded_subdirs = ["buildtree"]
    
    40
    +        self._subdir = None
    
    41
    +        # If buildtrees are to be pulled, remove the value from exclusion list
    
    42
    +        if buildtrees:
    
    43
    +            self._subdir = "buildtree"
    
    44
    +            self._excluded_subdirs.remove(self._subdir)
    
    45
    +
    
    35 46
         def process(self, element):
    
    36 47
             # returns whether an artifact was downloaded or not
    
    37
    -        if not element._pull():
    
    48
    +        if not element._pull(subdir=self._subdir, excluded_subdirs=self._excluded_subdirs):
    
    38 49
                 raise SkipJob(self.action_name)
    
    39 50
     
    
    40 51
         def status(self, element):
    
    ... ... @@ -49,7 +60,7 @@ class PullQueue(Queue):
    49 60
             if not element._can_query_cache():
    
    50 61
                 return QueueStatus.WAIT
    
    51 62
     
    
    52
    -        if element._pull_pending():
    
    63
    +        if element._pull_pending(subdir=self._subdir):
    
    53 64
                 return QueueStatus.READY
    
    54 65
             else:
    
    55 66
                 return QueueStatus.SKIP
    

  • buildstream/_stream.py
    ... ... @@ -160,12 +160,14 @@ class Stream():
    160 160
         #    track_cross_junctions (bool): Whether tracking should cross junction boundaries
    
    161 161
         #    build_all (bool): Whether to build all elements, or only those
    
    162 162
         #                      which are required to build the target.
    
    163
    +    #    pull_buildtrees (bool): Whether to pull buildtrees from a remote cache server
    
    163 164
         #
    
    164 165
         def build(self, targets, *,
    
    165 166
                   track_targets=None,
    
    166 167
                   track_except=None,
    
    167 168
                   track_cross_junctions=False,
    
    168
    -              build_all=False):
    
    169
    +              build_all=False,
    
    170
    +              pull_buildtrees=False):
    
    169 171
     
    
    170 172
             if build_all:
    
    171 173
                 selection = PipelineSelection.ALL
    
    ... ... @@ -195,7 +197,10 @@ class Stream():
    195 197
                 self._add_queue(track_queue, track=True)
    
    196 198
     
    
    197 199
             if self._artifacts.has_fetch_remotes():
    
    198
    -            self._add_queue(PullQueue(self._scheduler))
    
    200
    +            # Query if pullbuildtrees has been set globally in user config
    
    201
    +            if self._context.pullbuildtrees:
    
    202
    +                pull_buildtrees = True
    
    203
    +            self._add_queue(PullQueue(self._scheduler, buildtrees=pull_buildtrees))
    
    199 204
     
    
    200 205
             self._add_queue(FetchQueue(self._scheduler, skip_cached=True))
    
    201 206
             self._add_queue(BuildQueue(self._scheduler))
    
    ... ... @@ -295,7 +300,8 @@ class Stream():
    295 300
         #
    
    296 301
         def pull(self, targets, *,
    
    297 302
                  selection=PipelineSelection.NONE,
    
    298
    -             remote=None):
    
    303
    +             remote=None,
    
    304
    +             pull_buildtrees=False):
    
    299 305
     
    
    300 306
             use_config = True
    
    301 307
             if remote:
    
    ... ... @@ -310,8 +316,12 @@ class Stream():
    310 316
             if not self._artifacts.has_fetch_remotes():
    
    311 317
                 raise StreamError("No artifact caches available for pulling artifacts")
    
    312 318
     
    
    319
    +        # Query if pullbuildtrees has been set globally in user config
    
    320
    +        if self._context.pullbuildtrees:
    
    321
    +            pull_buildtrees = True
    
    322
    +
    
    313 323
             self._pipeline.assert_consistent(elements)
    
    314
    -        self._add_queue(PullQueue(self._scheduler))
    
    324
    +        self._add_queue(PullQueue(self._scheduler, buildtrees=pull_buildtrees))
    
    315 325
             self._enqueue_plan(elements)
    
    316 326
             self._run()
    
    317 327
     
    

  • buildstream/element.py
    ... ... @@ -1316,7 +1316,8 @@ class Element(Plugin):
    1316 1316
         #
    
    1317 1317
         @contextmanager
    
    1318 1318
         def _prepare_sandbox(self, scope, directory, deps='run', integrate=True):
    
    1319
    -        with self.__sandbox(directory, config=self.__sandbox_config) as sandbox:
    
    1319
    +        # bst shell and bst checkout require a local sandbox.
    
    1320
    +        with self.__sandbox(directory, config=self.__sandbox_config, allow_remote=False) as sandbox:
    
    1320 1321
     
    
    1321 1322
                 # Configure always comes first, and we need it.
    
    1322 1323
                 self.configure_sandbox(sandbox)
    
    ... ... @@ -1691,18 +1692,26 @@ class Element(Plugin):
    1691 1692
     
    
    1692 1693
         # _pull_pending()
    
    1693 1694
         #
    
    1694
    -    # Check whether the artifact will be pulled.
    
    1695
    +    # Check whether the artifact will be pulled. If the pull operation is to
    
    1696
    +    # include a specific subdir of the element artifact (from cli or user conf)
    
    1697
    +    # then the local cache is queried for the subdirs existence.
    
    1698
    +    #
    
    1699
    +    # Args:
    
    1700
    +    #    subdir (str): Whether the pull has been invoked with a specific subdir set
    
    1695 1701
         #
    
    1696 1702
         # Returns:
    
    1697 1703
         #   (bool): Whether a pull operation is pending
    
    1698 1704
         #
    
    1699
    -    def _pull_pending(self):
    
    1705
    +    def _pull_pending(self, subdir=None):
    
    1700 1706
             if self._get_workspace():
    
    1701 1707
                 # Workspace builds are never pushed to artifact servers
    
    1702 1708
                 return False
    
    1703 1709
     
    
    1704
    -        if self.__strong_cached:
    
    1705
    -            # Artifact already in local cache
    
    1710
    +        if self.__strong_cached and subdir:
    
    1711
    +            # If we've specified a subdir, check if the subdir is cached locally
    
    1712
    +            if self.__artifacts.contains_subdir_artifact(self, self.__strict_cache_key, subdir):
    
    1713
    +                return False
    
    1714
    +        elif self.__strong_cached:
    
    1706 1715
                 return False
    
    1707 1716
     
    
    1708 1717
             # Pull is pending if artifact remote server available
    
    ... ... @@ -1724,11 +1733,10 @@ class Element(Plugin):
    1724 1733
     
    
    1725 1734
             self._update_state()
    
    1726 1735
     
    
    1727
    -    def _pull_strong(self, *, progress=None):
    
    1736
    +    def _pull_strong(self, *, progress=None, subdir=None, excluded_subdirs=None):
    
    1728 1737
             weak_key = self._get_cache_key(strength=_KeyStrength.WEAK)
    
    1729
    -
    
    1730 1738
             key = self.__strict_cache_key
    
    1731
    -        if not self.__artifacts.pull(self, key, progress=progress):
    
    1739
    +        if not self.__artifacts.pull(self, key, progress=progress, subdir=subdir, excluded_subdirs=excluded_subdirs):
    
    1732 1740
                 return False
    
    1733 1741
     
    
    1734 1742
             # update weak ref by pointing it to this newly fetched artifact
    
    ... ... @@ -1736,10 +1744,10 @@ class Element(Plugin):
    1736 1744
     
    
    1737 1745
             return True
    
    1738 1746
     
    
    1739
    -    def _pull_weak(self, *, progress=None):
    
    1747
    +    def _pull_weak(self, *, progress=None, subdir=None, excluded_subdirs=None):
    
    1740 1748
             weak_key = self._get_cache_key(strength=_KeyStrength.WEAK)
    
    1741
    -
    
    1742
    -        if not self.__artifacts.pull(self, weak_key, progress=progress):
    
    1749
    +        if not self.__artifacts.pull(self, weak_key, progress=progress, subdir=subdir,
    
    1750
    +                                     excluded_subdirs=excluded_subdirs):
    
    1743 1751
                 return False
    
    1744 1752
     
    
    1745 1753
             # extract strong cache key from this newly fetched artifact
    
    ... ... @@ -1757,17 +1765,17 @@ class Element(Plugin):
    1757 1765
         #
    
    1758 1766
         # Returns: True if the artifact has been downloaded, False otherwise
    
    1759 1767
         #
    
    1760
    -    def _pull(self):
    
    1768
    +    def _pull(self, subdir=None, excluded_subdirs=None):
    
    1761 1769
             context = self._get_context()
    
    1762 1770
     
    
    1763 1771
             def progress(percent, message):
    
    1764 1772
                 self.status(message)
    
    1765 1773
     
    
    1766 1774
             # Attempt to pull artifact without knowing whether it's available
    
    1767
    -        pulled = self._pull_strong(progress=progress)
    
    1775
    +        pulled = self._pull_strong(progress=progress, subdir=subdir, excluded_subdirs=excluded_subdirs)
    
    1768 1776
     
    
    1769 1777
             if not pulled and not self._cached() and not context.get_strict():
    
    1770
    -            pulled = self._pull_weak(progress=progress)
    
    1778
    +            pulled = self._pull_weak(progress=progress, subdir=subdir, excluded_subdirs=excluded_subdirs)
    
    1771 1779
     
    
    1772 1780
             if not pulled:
    
    1773 1781
                 return False
    
    ... ... @@ -1790,10 +1798,21 @@ class Element(Plugin):
    1790 1798
             if not self._cached():
    
    1791 1799
                 return True
    
    1792 1800
     
    
    1793
    -        # Do not push tained artifact
    
    1801
    +        # Do not push tainted artifact
    
    1794 1802
             if self.__get_tainted():
    
    1795 1803
                 return True
    
    1796 1804
     
    
    1805
    +        # Do not push elements that have a dangling buildtree artifact unless element type is
    
    1806
    +        # expected to have an empty buildtree directory
    
    1807
    +        if not self.__artifacts.contains_subdir_artifact(self, self.__strict_cache_key, 'buildtree'):
    
    1808
    +            return True
    
    1809
    +
    
    1810
    +        # strict_cache_key can't be relied on to be available when running in non strict mode
    
    1811
    +        context = self._get_context()
    
    1812
    +        if not context.get_strict():
    
    1813
    +            if not self.__artifacts.contains_subdir_artifact(self, self.__weak_cache_key, 'buildtree'):
    
    1814
    +                return True
    
    1815
    +
    
    1797 1816
             return False
    
    1798 1817
     
    
    1799 1818
         # _push():
    
    ... ... @@ -2149,17 +2168,18 @@ class Element(Plugin):
    2149 2168
         #    stdout (fileobject): The stream for stdout for the sandbox
    
    2150 2169
         #    stderr (fileobject): The stream for stderr for the sandbox
    
    2151 2170
         #    config (SandboxConfig): The SandboxConfig object
    
    2171
    +    #    allow_remote (bool): Whether the sandbox is allowed to be remote
    
    2152 2172
         #
    
    2153 2173
         # Yields:
    
    2154 2174
         #    (Sandbox): A usable sandbox
    
    2155 2175
         #
    
    2156 2176
         @contextmanager
    
    2157
    -    def __sandbox(self, directory, stdout=None, stderr=None, config=None):
    
    2177
    +    def __sandbox(self, directory, stdout=None, stderr=None, config=None, allow_remote=True):
    
    2158 2178
             context = self._get_context()
    
    2159 2179
             project = self._get_project()
    
    2160 2180
             platform = Platform.get_platform()
    
    2161 2181
     
    
    2162
    -        if directory is not None and self.__use_remote_execution():
    
    2182
    +        if directory is not None and allow_remote and self.__use_remote_execution():
    
    2163 2183
     
    
    2164 2184
                 self.info("Using a remote sandbox for artifact {} with directory '{}'".format(self.name, directory))
    
    2165 2185
     
    
    ... ... @@ -2173,7 +2193,7 @@ class Element(Plugin):
    2173 2193
                 yield sandbox
    
    2174 2194
     
    
    2175 2195
             elif directory is not None and os.path.exists(directory):
    
    2176
    -            if self.__remote_execution_url:
    
    2196
    +            if allow_remote and self.__remote_execution_url:
    
    2177 2197
                     self.warn("Artifact {} is configured to use remote execution but element plugin does not support it."
    
    2178 2198
                               .format(self.name), detail="Element plugin '{kind}' does not support virtual directories."
    
    2179 2199
                               .format(kind=self.get_kind()), warning_token="remote-failure")
    
    ... ... @@ -2193,7 +2213,8 @@ class Element(Plugin):
    2193 2213
                 rootdir = tempfile.mkdtemp(prefix="{}-".format(self.normal_name), dir=context.builddir)
    
    2194 2214
     
    
    2195 2215
                 # Recursive contextmanager...
    
    2196
    -            with self.__sandbox(rootdir, stdout=stdout, stderr=stderr, config=config) as sandbox:
    
    2216
    +            with self.__sandbox(rootdir, stdout=stdout, stderr=stderr, config=config,
    
    2217
    +                                allow_remote=allow_remote) as sandbox:
    
    2197 2218
                     yield sandbox
    
    2198 2219
     
    
    2199 2220
                 # Cleanup the build dir
    

  • contrib/bst-docker-import
    ... ... @@ -93,10 +93,10 @@ echo "INFO: Checking out $element ..." >&2
     $bst_cmd checkout --tar "$element" "$checkout_tar" || die "Failed to checkout $element"
     echo "INFO: Successfully checked out $element" >&2

    -echo "INFO: Importing Docker image ..."
    +echo "INFO: Importing Docker image ..." >&2
     "${docker_import_cmd[@]}" "$checkout_tar" "$docker_image_tag" || die "Failed to import Docker image from tarball"
    -echo "INFO: Successfully import Docker image $docker_image_tag"
    +echo "INFO: Successfully import Docker image $docker_image_tag" >&2

    -echo "INFO: Cleaning up ..."
    +echo "INFO: Cleaning up ..." >&2
     rm "$checkout_tar" || die "Failed to remove $checkout_tar"
    -echo "INFO: Clean up finished"
    +echo "INFO: Clean up finished" >&2

  • doc/source/additional_docker.rst
    +
    +.. _bst_and_docker:
    +
    +
    +BuildStream and Docker
    +======================
    +
    +BuildStream integrates with Docker in multiple ways. Here are some ways in
    +which these integrations work.
    +
    +Run BuildStream inside Docker
    +-----------------------------
    +
    +Refer to the :ref:`BuildStream inside Docker <docker>` documentation for
    +instructions on how to run BuildStream as a Docker container.
    +
    +Generate Docker images
    +----------------------
    +
    +The
    +`bst-docker-import script <https://gitlab.com/BuildStream/buildstream/blob/master/contrib/bst-docker-import>`_
    +can be used to generate a Docker image from built artifacts.
    +
    +You can download it and make it executable like this:
    +
    +.. code:: bash
    +
    +  mkdir -p ~/.local/bin
    +  curl --get https://gitlab.com/BuildStream/buildstream/raw/master/contrib/bst-docker-import > ~/.local/bin/bst-docker-import
    +  chmod +x ~/.local/bin/bst-docker-import
    +
    +Check if ``~/.local/bin`` appears in your PATH environment variable -- if it
    +doesn't, you should
    +`edit your ~/.profile so that it does <https://stackoverflow.com/questions/14637979/>`_.
    +
    +Once the script is available in your PATH and assuming you have Docker
    +installed, you can start using the ``bst-docker-import`` script. Here is a
    +minimal example to generate an image called ``bst-hello`` from an element
    +called ``hello.bst`` assuming it is already built:
    +
    +.. code:: bash
    +
    +  bst-docker-import -t bst-hello hello.bst
    +
    +This script can also be used if you are running BuildStream inside Docker. In
    +this case, you will need to supply the command that you are using to run
    +BuildStream using the ``-c`` option.  If you are using the
    +`bst-here wrapper script <https://gitlab.com/BuildStream/buildstream/blob/master/contrib/bst-here>`_,
    +you can achieve the same results as the above example like this:
    +
    +.. code:: bash
    +
    +  bst-docker-import -c bst-here -t bst-hello hello.bst

  • doc/source/core_additional.rst
    ... ... @@ -8,3 +8,4 @@ Additional writings

       additional_cachekeys
       additional_sandboxing
    +   additional_docker

  • tests/completions/completions.py
    ... ... @@ -103,7 +103,7 @@ def test_commands(cli, cmd, word_idx, expected):
        ('bst --no-colors build -', 3, ['--all ', '--track ', '--track-all ',
                                        '--track-except ',
                                        '--track-cross-junctions ', '-J ',
    -                                    '--track-save ']),
    +                                    '--track-save ', '--pull-buildtrees ']),

        # Test the behavior of completing after an option that has a
        # parameter that cannot be completed, vs an option that has

  • tests/integration/pullbuildtrees.py
    +import os
    +import shutil
    +import pytest
    +
    +from tests.testutils import cli_integration as cli, create_artifact_share
    +from tests.testutils.integration import assert_contains
    +
    +
    +DATA_DIR = os.path.join(
    +    os.path.dirname(os.path.realpath(__file__)),
    +    "project"
    +)
    +
    +
    +# Remove artifact cache & set cli.config value of pullbuildtrees
    +# to false, which is the default user context
    +def default_state(cli, integration_cache, share):
    +    # cache_dir = os.path.join(integration_cache, 'artifacts')
    +    # Is this leaving behind the buildtree dir object?
    +    # Might need to nuke the whole /artifacts dir instead
    +    # cli.remove_artifact_from_cache(project, element_name, cache_dir=cache_dir)
    +    shutil.rmtree(os.path.join(integration_cache, 'artifacts'))
    +    cli.configure({'pullbuildtrees': False, 'artifacts': {'url': share.repo, 'push': False}})
    +
    +
    +# A test to capture the integration of the pullbuildtrees
    +# behaviour, which by default is to not include the buildtree
    +# directory of an element
    +@pytest.mark.integration
    +@pytest.mark.datafiles(DATA_DIR)
    +def test_pullbuildtrees(cli, tmpdir, datafiles, integration_cache):
    +    project = os.path.join(datafiles.dirname, datafiles.basename)
    +    element_name = 'autotools/amhello.bst'
    +
    +    # Create artifact shares for pull & push testing
    +    with create_artifact_share(os.path.join(str(tmpdir), 'remote1')) as share1,\
    +        create_artifact_share(os.path.join(str(tmpdir), 'remote2')) as share2:
    +        cli.configure({
    +            'artifacts': {'url': share1.repo, 'push': True},
    +        })
    +
    +        # Build autotools element, check it's pushed, delete local
    +        result = cli.run(project=project, args=['build', element_name])
    +        assert result.exit_code == 0
    +        assert cli.get_element_state(project, element_name) == 'cached'
    +        assert share1.has_artifact('test', element_name, cli.get_element_key(project, element_name))
    +        default_state(cli, integration_cache, share1)
    +
    +        # Pull artifact with default config, assert that pulling again
    +        # doesn't create a pull job, then assert that pulling with the
    +        # buildtrees user config set creates a pull job.
    +        result = cli.run(project=project, args=['pull', element_name])
    +        assert element_name in result.get_pulled_elements()
    +        result = cli.run(project=project, args=['pull', element_name])
    +        assert element_name not in result.get_pulled_elements()
    +        cli.configure({'pullbuildtrees': True})
    +        result = cli.run(project=project, args=['pull', element_name])
    +        assert element_name in result.get_pulled_elements()
    +        default_state(cli, integration_cache, share1)
    +
    +        # Pull artifact with default config, then assert that pulling
    +        # with buildtrees cli flag set creates a pull job.
    +        result = cli.run(project=project, args=['pull', element_name])
    +        assert element_name in result.get_pulled_elements()
    +        result = cli.run(project=project, args=['pull', '--pull-buildtrees', element_name])
    +        assert element_name in result.get_pulled_elements()
    +        default_state(cli, integration_cache, share1)
    +
    +        # Pull artifact with pullbuildtrees set in user config, then assert
    +        # that pulling with the same user config doesn't create a pull job,
    +        # nor when the buildtrees cli flag is set.
    +        cli.configure({'pullbuildtrees': True})
    +        result = cli.run(project=project, args=['pull', element_name])
    +        assert element_name in result.get_pulled_elements()
    +        result = cli.run(project=project, args=['pull', element_name])
    +        assert element_name not in result.get_pulled_elements()
    +        result = cli.run(project=project, args=['pull', '--pull-buildtrees', element_name])
    +        assert element_name not in result.get_pulled_elements()
    +        default_state(cli, integration_cache, share1)
    +
    +        # Pull artifact with default config and buildtrees cli flag set, then assert
    +        # that pulling with pullbuildtrees set in user config doesn't create a pull
    +        # job.
    +        result = cli.run(project=project, args=['pull', '--pull-buildtrees', element_name])
    +        assert element_name in result.get_pulled_elements()
    +        cli.configure({'pullbuildtrees': True})
    +        result = cli.run(project=project, args=['pull', element_name])
    +        assert element_name not in result.get_pulled_elements()
    +        default_state(cli, integration_cache, share1)
    +
    +        # Assert that a partial build element (not containing a populated buildtree dir)
    +        # can't be pushed to an artifact share, then assert that a complete build element
    +        # can be. This will attempt a partial pull from share1 and then a partial push
    +        # to share2
    +        result = cli.run(project=project, args=['pull', element_name])
    +        assert element_name in result.get_pulled_elements()
    +        cli.configure({'artifacts': {'url': share2.repo, 'push': True}})
    +        result = cli.run(project=project, args=['push', element_name])
    +        assert element_name not in result.get_pushed_elements()
    +        assert not share2.has_artifact('test', element_name, cli.get_element_key(project, element_name))
    +
    +        # Assert that after pulling the missing buildtree the element artifact can be
    +        # successfully pushed to the remote. This will attempt to pull the buildtree
    +        # from share1 and then a 'complete' push to share2
    +        cli.configure({'artifacts': {'url': share1.repo, 'push': False}})
    +        result = cli.run(project=project, args=['pull', '--pull-buildtrees', element_name])
    +        assert element_name in result.get_pulled_elements()
    +        cli.configure({'artifacts': {'url': share2.repo, 'push': True}})
    +        result = cli.run(project=project, args=['push', element_name])
    +        assert element_name in result.get_pushed_elements()
    +        assert share2.has_artifact('test', element_name, cli.get_element_key(project, element_name))
    +        default_state(cli, integration_cache, share1)

  • tests/testutils/artifactshare.py
    ... ... @@ -128,7 +128,7 @@ class ArtifactShare():

            valid_chars = string.digits + string.ascii_letters + '-._'
            element_name = ''.join([
    -            x if x in valid_chars else '_'
    +            x if x in valid_chars else '-'
                for x in element_name
            ])
            artifact_key = '{0}/{1}/{2}'.format(project_name, element_name, cache_key)
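
    In isolation, the changed expression now maps any character outside
    valid_chars to a dash instead of an underscore. A standalone sketch of the
    behaviour (not part of the diff; the example element name is illustrative):

        import string

        valid_chars = string.digits + string.ascii_letters + '-._'
        element_name = 'autotools/amhello.bst'

        # Characters outside valid_chars (here the '/') are replaced with '-'
        sanitized = ''.join([x if x in valid_chars else '-' for x in element_name])
        assert sanitized == 'autotools-amhello.bst'
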
    


