[Notes] [Git][BuildStream/buildstream][richardmaw/distinguish-sandboxing-build-fail] 28 commits: contributing: WIP again to be kind to reviewers



richardmaw-codethink pushed to branch richardmaw/distinguish-sandboxing-build-fail at BuildStream / buildstream

Commits:

30 changed files:

Changes:

  • .gitlab-ci.yml
    @@ -79,32 +79,46 @@ source_dist:
       - cd ../..
       - mkdir -p coverage-linux/
       - cp dist/buildstream/.coverage coverage-linux/coverage."${CI_JOB_NAME}"
    -  except:
    -  - schedules
       artifacts:
         paths:
         - coverage-linux/

     tests-debian-9:
    -  image: buildstream/testsuite-debian:9-master-119-552f5fc6
    +  image: buildstream/testsuite-debian:9-master-123-7ce6581b
       <<: *linux-tests
    +  except:
    +  - schedules

     tests-fedora-27:
    -  image: buildstream/testsuite-fedora:27-master-119-552f5fc6
    +  image: buildstream/testsuite-fedora:27-master-123-7ce6581b
       <<: *linux-tests
    +  except:
    +  - schedules

     tests-fedora-28:
    -  image: buildstream/testsuite-fedora:28-master-119-552f5fc6
    +  image: buildstream/testsuite-fedora:28-master-123-7ce6581b
       <<: *linux-tests
    +  except:
    +  - schedules

     tests-ubuntu-18.04:
    -  image: buildstream/testsuite-ubuntu:18.04-master-119-552f5fc6
    +  image: buildstream/testsuite-ubuntu:18.04-master-123-7ce6581b
       <<: *linux-tests
    +  except:
    +  - schedules
    +
    +overnight-fedora-28-aarch64:
    +  image: buildstream/testsuite-fedora:aarch64-28-master-123-7ce6581b
    +  tags:
    +    - aarch64
    +  <<: *linux-tests
    +  only:
    +  - schedules

     tests-unix:
       # Use fedora here, to a) run a test on fedora and b) ensure that we
       # can get rid of ostree - this is not possible with debian-8
    -  image: buildstream/testsuite-fedora:27-master-119-552f5fc6
    +  image: buildstream/testsuite-fedora:27-master-123-7ce6581b
       stage: test
       variables:
         BST_FORCE_BACKEND: "unix"

  • CONTRIBUTING.rst
    @@ -97,7 +97,13 @@ a new merge request. You can also `create a merge request for an existing branch
     You may open merge requests for the branches you create before you are ready
     to have them reviewed and considered for inclusion if you like. Until your merge
     request is ready for review, the merge request title must be prefixed with the
    -``WIP:`` identifier.
    +``WIP:`` identifier. GitLab `treats this specially
    +<https://docs.gitlab.com/ee/user/project/merge_requests/work_in_progress_merge_requests.html>`_,
    +which helps reviewers.
    +
    +Consider marking a merge request as WIP again if you are taking a while to
    +address a review point. This signals that the next action is on you, and it
    +won't appear in a reviewer's search for non-WIP merge requests to review.


     Organized commits
    @@ -122,6 +128,12 @@ If a commit in your branch modifies behavior such that a test must also
     be changed to match the new behavior, then the tests should be updated
     with the same commit, so that every commit passes its own tests.

    +These principles apply whenever a branch is non-WIP. So for example, don't push
    +'fixup!' commits when addressing review comments; instead, amend the commits
    +directly before pushing. GitLab has `good support
    +<https://docs.gitlab.com/ee/user/project/merge_requests/versions.html>`_ for
    +diffing between pushes, so 'fixup!' commits are not necessary for reviewers.
    +

     Commit messages
     ~~~~~~~~~~~~~~~
    @@ -144,6 +156,16 @@ number must be referenced in the commit message.

       Fixes #123

    +Note that the 'why' of a change is as important as the 'what'.
    +
    +When reviewing this, folks can suggest better alternatives when they know the
    +'why'. Perhaps there are other ways to avoid an error when things are not
    +frobnicated.
    +
    +When folks modify this code, there may be uncertainty around whether the foos
    +should always be frobnicated. The comments, the commit message, and issue #123
    +should shed some light on that.
    +
     In the case that you have a commit which necessarily modifies multiple
     components, then the summary line should still mention generally what
     changed (if possible), followed by a colon and a brief summary.
    

  • buildstream/_artifactcache/artifactcache.py
    @@ -937,15 +937,22 @@ class ArtifactCache():
                             "Invalid cache quota ({}): ".format(utils._pretty_size(cache_quota)) +
                             "BuildStream requires a minimum cache quota of 2G.")
             elif cache_quota > cache_size + available_space:  # Check maximum
    +            if '%' in self.context.config_cache_quota:
    +                available = (available_space / (stat.f_blocks * stat.f_bsize)) * 100
    +                available = '{}% of total disk space'.format(round(available, 1))
    +            else:
    +                available = utils._pretty_size(available_space)
    +
                 raise LoadError(LoadErrorReason.INVALID_DATA,
                                 ("Your system does not have enough available " +
                                  "space to support the cache quota specified.\n" +
    -                             "You currently have:\n" +
    -                             "- {used} of cache in use at {local_cache_path}\n" +
    -                             "- {available} of available system storage").format(
    -                                 used=utils._pretty_size(cache_size),
    -                                 local_cache_path=self.context.artifactdir,
    -                                 available=utils._pretty_size(available_space)))
    +                             "\nYou have specified a quota of {quota} total disk space.\n" +
    +                             "- The filesystem containing {local_cache_path} only " +
    +                             "has: {available_size} available.")
    +                            .format(
    +                                quota=self.context.config_cache_quota,
    +                                local_cache_path=self.context.artifactdir,
    +                                available_size=available))

             # Place a slight headroom (2e9 (2GB) on the cache_quota) into
             # cache_quota to try and avoid exceptions.
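The new quota error path reports availability either as a percentage of the filesystem or as a byte count, depending on how the user expressed the quota. A minimal standalone sketch of that branch, with `utils._pretty_size` swapped for a plain byte formatter and the statvfs fields passed in as plain integers:

```python
def describe_available(config_cache_quota, available_space, f_blocks, f_bsize):
    # Mirrors the branch above: if the quota was configured as a
    # percentage (e.g. "50%"), report availability as a percentage of
    # total disk space (f_blocks * f_bsize); otherwise report the raw
    # figure. (utils._pretty_size is replaced with a plain formatter.)
    if '%' in config_cache_quota:
        available = (available_space / (f_blocks * f_bsize)) * 100
        return '{}% of total disk space'.format(round(available, 1))
    return '{} bytes'.format(available_space)
```

For example, on a 1000-byte filesystem with 250 bytes free, a "50%" quota would be reported against "25.0% of total disk space".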
    

  • buildstream/_platform/linux.py
    @@ -18,9 +18,9 @@
     #        Tristan Maat <tristan maat codethink co uk>

     import os
    -import shutil
     import subprocess

    +from .. import _site
     from .. import utils
     from ..sandbox import SandboxDummy

    @@ -38,16 +38,18 @@ class Linux(Platform):

             self._have_fuse = os.path.exists("/dev/fuse")

    -        bwrap_version = self._get_bwrap_version()
    +        bwrap_version = _site.get_bwrap_version()

             if bwrap_version is None:
                 self._bwrap_exists = False
                 self._have_good_bwrap = False
                 self._die_with_parent_available = False
    +            self._json_status_available = False
             else:
                 self._bwrap_exists = True
                 self._have_good_bwrap = (0, 1, 2) <= bwrap_version
                 self._die_with_parent_available = (0, 1, 8) <= bwrap_version
    +            self._json_status_available = (0, 3, 2) <= bwrap_version

             self._local_sandbox_available = self._have_fuse and self._have_good_bwrap

    @@ -97,6 +99,7 @@ class Linux(Platform):
             # Inform the bubblewrap sandbox as to whether it can use user namespaces or not
             kwargs['user_ns_available'] = self._user_ns_available
             kwargs['die_with_parent_available'] = self._die_with_parent_available
    +        kwargs['json_status_available'] = self._json_status_available
             return SandboxBwrap(*args, **kwargs)

         def _check_user_ns_available(self):
    @@ -119,21 +122,3 @@ class Linux(Platform):
                 output = ''

             return output == 'root'
    -
    -    def _get_bwrap_version(self):
    -        # Get the current bwrap version
    -        #
    -        # returns None if no bwrap was found
    -        # otherwise returns a tuple of 3 int: major, minor, patch
    -        bwrap_path = shutil.which('bwrap')
    -
    -        if not bwrap_path:
    -            return None
    -
    -        cmd = [bwrap_path, "--version"]
    -        try:
    -            version = str(subprocess.check_output(cmd).split()[1], "utf-8")
    -        except subprocess.CalledProcessError:
    -            return None
    -
    -        return tuple(int(x) for x in version.split("."))

  • buildstream/_project.py
    @@ -219,6 +219,19 @@ class Project():

             return self._cache_key

    +    def _validate_node(self, node):
    +        _yaml.node_validate(node, [
    +            'format-version',
    +            'element-path', 'variables',
    +            'environment', 'environment-nocache',
    +            'split-rules', 'elements', 'plugins',
    +            'aliases', 'name',
    +            'artifacts', 'options',
    +            'fail-on-overlap', 'shell', 'fatal-warnings',
    +            'ref-storage', 'sandbox', 'mirrors', 'remote-execution',
    +            'sources', '(@)'
    +        ])
    +
         # create_element()
         #
         # Instantiate and return an element
    @@ -402,6 +415,8 @@ class Project():
                     "Project requested format version {}, but BuildStream {}.{} only supports up until format version {}"
                     .format(format_version, major, minor, BST_FORMAT_VERSION))

    +        self._validate_node(pre_config_node)
    +
             # FIXME:
             #
             #   Performing this check manually in the absense
    @@ -467,16 +482,7 @@ class Project():

             self._load_pass(config, self.config)

    -        _yaml.node_validate(config, [
    -            'format-version',
    -            'element-path', 'variables',
    -            'environment', 'environment-nocache',
    -            'split-rules', 'elements', 'plugins',
    -            'aliases', 'name',
    -            'artifacts', 'options',
    -            'fail-on-overlap', 'shell', 'fatal-warnings',
    -            'ref-storage', 'sandbox', 'mirrors', 'remote-execution'
    -        ])
    +        self._validate_node(config)

             #
             # Now all YAML composition is done, from here on we just load

  • buildstream/_site.py
    @@ -18,6 +18,8 @@
     #        Tristan Van Berkom <tristan vanberkom codethink co uk>

     import os
    +import shutil
    +import subprocess

     #
     # Private module declaring some info about where the buildstream
    @@ -44,3 +46,22 @@ build_all_template = os.path.join(root, 'data', 'build-all.sh.in')

     # Module building script template
     build_module_template = os.path.join(root, 'data', 'build-module.sh.in')
    +
    +
    +def get_bwrap_version():
    +    # Get the current bwrap version
    +    #
    +    # returns None if no bwrap was found
    +    # otherwise returns a tuple of 3 int: major, minor, patch
    +    bwrap_path = shutil.which('bwrap')
    +
    +    if not bwrap_path:
    +        return None
    +
    +    cmd = [bwrap_path, "--version"]
    +    try:
    +        version = str(subprocess.check_output(cmd).split()[1], "utf-8")
    +    except subprocess.CalledProcessError:
    +        return None
    +
    +    return tuple(int(x) for x in version.split("."))
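`get_bwrap_version()` turns `bwrap --version` output into an int tuple, which lets the checks in linux.py be plain tuple comparisons. The parsing step in isolation (assuming output of the form "bubblewrap 0.3.2"):

```python
def parse_bwrap_version(output):
    # `bwrap --version` prints e.g. "bubblewrap 0.3.2"; take the second
    # whitespace-separated token and split it on dots into an int tuple.
    version = output.split()[1]
    return tuple(int(x) for x in version.split("."))
```

Tuple comparison then expresses each capability threshold directly, e.g. `(0, 3, 2) <= parse_bwrap_version("bubblewrap 0.3.2")` holds, mirroring the `_json_status_available` check.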

  • buildstream/plugin.py
    @@ -111,6 +111,7 @@ Class Reference

     import os
     import subprocess
    +import sys
     from contextlib import contextmanager
     from weakref import WeakValueDictionary

    @@ -190,7 +191,7 @@ class Plugin():
             # Dont send anything through the Message() pipeline at destruction time,
             # any subsequent lookup of plugin by unique id would raise KeyError.
             if self.__context.log_debug:
    -            print("DEBUG: Destroyed: {}".format(self))
    +            sys.stderr.write("DEBUG: Destroyed: {}\n".format(self))

         def __str__(self):
             return "{kind} {typetag} at {provenance}".format(
    

  • buildstream/sandbox/_sandboxbwrap.py
    @@ -17,6 +17,8 @@
     #  Authors:
     #        Andrew Leeming <andrew leeming codethink co uk>
     #        Tristan Van Berkom <tristan vanberkom codethink co uk>
    +import collections
    +import json
     import os
     import sys
     import time
    @@ -24,7 +26,8 @@ import errno
     import signal
     import subprocess
     import shutil
    -from contextlib import ExitStack
    +from contextlib import ExitStack, suppress
    +from tempfile import TemporaryFile

     import psutil

    @@ -53,6 +56,7 @@ class SandboxBwrap(Sandbox):
             super().__init__(*args, **kwargs)
             self.user_ns_available = kwargs['user_ns_available']
             self.die_with_parent_available = kwargs['die_with_parent_available']
    +        self.json_status_available = kwargs['json_status_available']

         def run(self, command, flags, *, cwd=None, env=None):
             stdout, stderr = self._get_output()
    @@ -160,24 +164,31 @@ class SandboxBwrap(Sandbox):
                     gid = self._get_config().build_gid
                     bwrap_command += ['--uid', str(uid), '--gid', str(gid)]

    -        # Add the command
    -        bwrap_command += command
    -
    -        # bwrap might create some directories while being suid
    -        # and may give them to root gid, if it does, we'll want
    -        # to clean them up after, so record what we already had
    -        # there just in case so that we can safely cleanup the debris.
    -        #
    -        existing_basedirs = {
    -            directory: os.path.exists(os.path.join(root_directory, directory))
    -            for directory in ['tmp', 'dev', 'proc']
    -        }
    -
    -        # Use the MountMap context manager to ensure that any redirected
    -        # mounts through fuse layers are in context and ready for bwrap
    -        # to mount them from.
    -        #
             with ExitStack() as stack:
    +            pass_fds = ()
    +            # Improve error reporting with json-status if available
    +            if self.json_status_available:
    +                json_status_file = stack.enter_context(TemporaryFile())
    +                pass_fds = (json_status_file.fileno(),)
    +                bwrap_command += ['--json-status-fd', str(json_status_file.fileno())]
    +
    +            # Add the command
    +            bwrap_command += command
    +
    +            # bwrap might create some directories while being suid
    +            # and may give them to root gid, if it does, we'll want
    +            # to clean them up after, so record what we already had
    +            # there just in case so that we can safely cleanup the debris.
    +            #
    +            existing_basedirs = {
    +                directory: os.path.exists(os.path.join(root_directory, directory))
    +                for directory in ['tmp', 'dev', 'proc']
    +            }
    +
    +            # Use the MountMap context manager to ensure that any redirected
    +            # mounts through fuse layers are in context and ready for bwrap
    +            # to mount them from.
    +            #
                 stack.enter_context(mount_map.mounted(self))

                 # If we're interactive, we want to inherit our stdin,
    @@ -190,7 +201,7 @@ class SandboxBwrap(Sandbox):

                 # Run bubblewrap !
                 exit_code = self.run_bwrap(bwrap_command, stdin, stdout, stderr,
    -                                       (flags & SandboxFlags.INTERACTIVE))
    +                                       (flags & SandboxFlags.INTERACTIVE), pass_fds)

                 # Cleanup things which bwrap might have left behind, while
                 # everything is still mounted because bwrap can be creating
    @@ -238,10 +249,27 @@ class SandboxBwrap(Sandbox):
                             # a bug, bwrap mounted a tempfs here and when it exits, that better be empty.
                             pass

    +            if self.json_status_available:
    +                json_status_file.seek(0, 0)
    +                child_exit_code = None
    +                # The JSON status file's output is a JSON object per line
    +                # with the keys present identifying the type of message.
    +                # The only message relevant to us now is the exit-code of the subprocess.
    +                for line in json_status_file:
    +                    with suppress(json.decoder.JSONDecodeError):
    +                        o = json.loads(line)
    +                        if isinstance(o, collections.abc.Mapping) and 'exit-code' in o:
    +                            child_exit_code = o['exit-code']
    +                            break
    +                if child_exit_code is None:
    +                    raise SandboxError("`bwrap' terminated during sandbox setup with exitcode {}".format(exit_code),
    +                                       reason="bwrap-sandbox-fail")
    +                exit_code = child_exit_code
    +
             self._vdir._mark_changed()
             return exit_code

    -    def run_bwrap(self, argv, stdin, stdout, stderr, interactive):
    +    def run_bwrap(self, argv, stdin, stdout, stderr, interactive, pass_fds):
             # Wrapper around subprocess.Popen() with common settings.
             #
             # This function blocks until the subprocess has terminated.
    @@ -317,6 +345,7 @@ class SandboxBwrap(Sandbox):
                     # The default is to share file descriptors from the parent process
                     # to the subprocess, which is rarely good for sandboxing.
                     close_fds=True,
    +                pass_fds=pass_fds,
                     stdin=stdin,
                     stdout=stdout,
                     stderr=stderr,

  • buildstream/source.py
    @@ -973,32 +973,34 @@ class Source(Plugin):
                 # the items of source_fetchers, if it happens to be a generator.
                 #
                 source_fetchers = iter(source_fetchers)
    -            try:

    -                while True:
    +            while True:

    -                    with context.silence():
    +                with context.silence():
    +                    try:
                             fetcher = next(source_fetchers)
    -
    -                    alias = fetcher._get_alias()
    -                    for uri in project.get_alias_uris(alias, first_pass=self.__first_pass):
    -                        try:
    -                            fetcher.fetch(uri)
    -                        # FIXME: Need to consider temporary vs. permanent failures,
    -                        #        and how this works with retries.
    -                        except BstError as e:
    -                            last_error = e
    -                            continue
    -
    -                        # No error, we're done with this fetcher
    +                    except StopIteration:
    +                        # as per PEP479, we are not allowed to let StopIteration
    +                        # be thrown from a context manager.
    +                        # Catching it here and breaking instead.
                             break

    -                    else:
    -                        # No break occurred, raise the last detected error
    -                        raise last_error
    +                alias = fetcher._get_alias()
    +                for uri in project.get_alias_uris(alias, first_pass=self.__first_pass):
    +                    try:
    +                        fetcher.fetch(uri)
    +                    # FIXME: Need to consider temporary vs. permanent failures,
    +                    #        and how this works with retries.
    +                    except BstError as e:
    +                        last_error = e
    +                        continue

    -            except StopIteration:
    -                pass
    +                    # No error, we're done with this fetcher
    +                    break
    +
    +                else:
    +                    # No break occurred, raise the last detected error
    +                    raise last_error

             # Default codepath is to reinstantiate the Source
             #

  • buildstream/storage/_casbaseddirectory.py
    @@ -30,7 +30,6 @@ See also: :ref:`sandboxing`.
     from collections import OrderedDict

     import os
    -import tempfile
     import stat

     from .._protos.build.bazel.remote.execution.v2 import remote_execution_pb2
    @@ -51,6 +50,183 @@ class IndexEntry():
             self.modified = modified


    +class ResolutionException(VirtualDirectoryError):
    +    """ Superclass of all exceptions that can be raised by
    +    CasBasedDirectory._resolve. Should not be used outside this module. """
    +    pass
    +
    +
    +class InfiniteSymlinkException(ResolutionException):
    +    """ Raised when an infinite symlink loop is found. """
    +    pass
    +
    +
    +class AbsoluteSymlinkException(ResolutionException):
    +    """Raised if we try to follow an absolute symlink (i.e. one whose
    +    target starts with the path separator) and we have disallowed
    +    following such symlinks.
    +    """
    +    pass
    +
    +
    +class UnexpectedFileException(ResolutionException):
    +    """Raised if we found a file where a directory or symlink was
    +    expected, for example we try to resolve a symlink pointing to
    +    /a/b/c but /a/b is a file.
    +    """
    +    def __init__(self, message=""):
    +        """Allow constructor with no arguments, since this can be raised in
    +        places where there isn't sufficient information to write the
    +        message.
    +        """
    +        super().__init__(message)
    +
    +
    +class _Resolver():
    +    """A class for resolving symlinks inside CAS-based directories. As
    +    well as providing a namespace for some functions, this also
    +    contains two flags which are constant throughout one resolution
    +    operation and the 'seen_objects' list used to detect infinite
    +    symlink loops.
    +    """
    +
    +    def __init__(self, absolute_symlinks_resolve=True, force_create=False):
    +        self.absolute_symlinks_resolve = absolute_symlinks_resolve
    +        self.force_create = force_create
    +        self.seen_objects = []
    +
    +    def resolve(self, name, directory):
    +        """Resolves any name to an object. If the name points to a symlink in
    +        the directory, it returns the thing it points to,
    +        recursively.
    +
    +        Returns a CasBasedDirectory, FileNode or None. None indicates
    +        either that 'target' does not exist in this directory, or is a
    +        symlink chain which points to a nonexistent name (broken
    +        symlink).
    +
    +        Raises:
    +
    +        - InfiniteSymlinkException if 'name' points to an infinite
    +          symlink loop.
    +        - AbsoluteSymlinkException if 'name' points to an absolute
    +          symlink and absolute_symlinks_resolve is False.
    +        - UnexpectedFileException if at any point during resolution we
    +          find a file which we expected to be a directory or symlink.
    +
    +        If force_create is set, this will attempt to create
    +        directories to make symlinks and directories resolve.  Files
    +        present in symlink target paths will also be removed and
    +        replaced with directories.  If force_create is off, this will
    +        never alter 'directory'.
    +        """
    +
    +        # First check for nonexistent things or 'normal' objects and return them
    +        if name not in directory.index:
    +            return None
    +        index_entry = directory.index[name]
    +        if isinstance(index_entry.buildstream_object, Directory):
    +            return index_entry.buildstream_object
    +        elif isinstance(index_entry.pb_object, remote_execution_pb2.FileNode):
    +            return index_entry.pb_object
    +
    +        # Now we must be dealing with a symlink.
    +        assert isinstance(index_entry.pb_object, remote_execution_pb2.SymlinkNode)
    +
    +        symlink_object = index_entry.pb_object
    +        if symlink_object in self.seen_objects:
    +            # Infinite symlink loop detected
    +            message = ("Infinite symlink loop found during resolution. " +
    +                       "First repeated element is {}".format(name))
    +            raise InfiniteSymlinkException(message)
    +
    +        self.seen_objects.append(symlink_object)
    +
    +        components = symlink_object.target.split(CasBasedDirectory._pb2_path_sep)
    +        absolute = symlink_object.target.startswith(CasBasedDirectory._pb2_absolute_path_prefix)
    +
    +        if absolute:
    +            if self.absolute_symlinks_resolve:
    +                directory = directory.find_root()
    +                # Discard the first empty element
    +                components.pop(0)
    +            else:
    +                # Unresolvable absolute symlink
    +                message = "{} is an absolute symlink, which was disallowed during resolution".format(name)
    +                raise AbsoluteSymlinkException(message)
    +
    +        resolution = directory
    +        while components and isinstance(resolution, CasBasedDirectory):
    
    162
    +            c = components.pop(0)
    
    163
    +            directory = resolution
    
    164
    +
    
    165
    +            try:
    
    166
    +                resolution = self._resolve_path_component(c, directory, components)
    
    167
    +            except UnexpectedFileException as original:
    
    168
    +                errormsg = ("Reached a file called {} while trying to resolve a symlink; " +
    
    169
    +                            "cannot proceed. The remaining path components are {}.")
    
    170
    +                raise UnexpectedFileException(errormsg.format(c, components)) from original
    
    171
    +
    
    172
    +        return resolution
    
    173
    +
    
    174
    +    def _resolve_path_component(self, c, directory, components_remaining):
    
    175
    +        if c == ".":
    
    176
    +            resolution = directory
    
    177
    +        elif c == "..":
    
    178
    +            if directory.parent is not None:
    
    179
    +                resolution = directory.parent
    
    180
    +            else:
    
    181
    +                # If directory.parent *is* None, this is an attempt to
    
    182
    +                # access '..' from the root, which is valid under
    
    183
    +                # POSIX; it just returns the root.
    
    184
    +                resolution = directory
    
    185
    +        elif c in directory.index:
    
    186
    +            try:
    
    187
    +                resolution = self._resolve_through_files(c, directory, components_remaining)
    
    188
    +            except UnexpectedFileException as original:
    
    189
    +                errormsg = ("Reached a file called {} while trying to resolve a symlink; " +
    
    190
    +                            "cannot proceed. The remaining path components are {}.")
    
    191
    +                raise UnexpectedFileException(errormsg.format(c, components_remaining)) from original
    
    192
    +        else:
    
    193
    +            # c is not in our index
    
    194
    +            if self.force_create:
    
    195
    +                resolution = directory.descend(c, create=True)
    
    196
    +            else:
    
    197
    +                resolution = None
    
    198
    +        return resolution
    
    199
    +
    
    200
    +    def _resolve_through_files(self, c, directory, require_traversable):
    
    201
    +        """A wrapper to resolve() which deals with files being found
    
    202
    +        in the middle of paths, for example trying to resolve a symlink
    
    203
    +        which points to /usr/lib64/libfoo when 'lib64' is a file.
    
    204
    +
    
    205
    +        require_traversable: If this is True, never return a file
    
    206
    +        node.  Instead, if force_create is set, destroy the file node,
    
    207
    +        then create and return a normal directory in its place. If
    
    208
    +        force_create is off, throws ResolutionException.
    
    209
    +
    
    210
    +        """
    
    211
    +        resolved_thing = self.resolve(c, directory)
    
    212
    +
    
    213
    +        if isinstance(resolved_thing, remote_execution_pb2.FileNode):
    
    214
    +            if require_traversable:
    
    215
    +                # We have components still to resolve, but one of the path components
    
    216
    +                # is a file.
    
    217
    +                if self.force_create:
    
    218
    +                    directory.delete_entry(c)
    
    219
    +                    resolved_thing = directory.descend(c, create=True)
    
    220
    +                else:
    
    221
    +                    # This is a signal that we hit a file, but don't
    
    222
    +                    # have the data to give a proper message, so the
    
    223
    +                    # caller should reraise this with a proper
    
    224
    +                    # description.
    
    225
    +                    raise UnexpectedFileException()
    
    226
    +
    
    227
    +        return resolved_thing
    
    228
    +
    
    229
    +
    
    54 230
     # CasBasedDirectory intentionally doesn't call its superclass constuctor,
    
    55 231
     # which is meant to be unimplemented.
    
    56 232
     # pylint: disable=super-init-not-called
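The new `_Resolver` walks symlink targets iteratively, remembering every symlink node it has already traversed (a repeated node means an infinite loop), and `_resolve_path_component` clamps `..` at the root the way POSIX does. A minimal standalone sketch of those two rules over a plain dict tree — all names here are hypothetical stand-ins, not the BuildStream API:

```python
class InfiniteSymlinkError(Exception):
    pass

def resolve(tree, name, seen=None):
    """Resolve 'name' in 'tree', following symlinks.

    'tree' maps names to "file", a nested dict (a directory), or a
    ("link", target) tuple. A symlink seen twice means we are in a
    loop, mirroring _Resolver.seen_objects.
    """
    seen = seen if seen is not None else []
    entry = tree.get(name)
    if not (isinstance(entry, tuple) and entry[0] == "link"):
        return entry  # missing, file or directory: return as-is
    if entry in seen:
        raise InfiniteSymlinkError("first repeated element is {}".format(name))
    seen.append(entry)
    return resolve(tree, entry[1], seen)

def normalise(path):
    """Apply the '..' rule from _resolve_path_component: '..' ascends
    but never escapes the root (POSIX: '..' of '/' is '/')."""
    stack = []
    for c in path.split("/"):
        if c in ("", "."):
            continue
        if c == "..":
            if stack:  # at the root, '..' is a no-op
                stack.pop()
        else:
            stack.append(c)
    return "/".join(stack)
```

The same shape appears in the real code: `seen_objects` is per-`_Resolver`-instance state, which is why `_resolve` below constructs a fresh resolver for every call.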
    
@@ -168,29 +344,34 @@ class CasBasedDirectory(Directory):
         self.index[name] = IndexEntry(dirnode, buildstream_object=newdir)
         return newdir
 
-    def _add_new_file(self, basename, filename):
+    def _add_file(self, basename, filename, modified=False):
         filenode = self.pb2_directory.files.add()
         filenode.name = filename
         self.cas_cache.add_object(digest=filenode.digest, path=os.path.join(basename, filename))
         is_executable = os.access(os.path.join(basename, filename), os.X_OK)
         filenode.is_executable = is_executable
-        self.index[filename] = IndexEntry(filenode, modified=(filename in self.index))
+        self.index[filename] = IndexEntry(filenode, modified=modified or filename in self.index)
 
-    def _add_new_link(self, basename, filename):
-        existing_link = self._find_pb2_entry(filename)
+    def _copy_link_from_filesystem(self, basename, filename):
+        self._add_new_link_direct(filename, os.readlink(os.path.join(basename, filename)))
+
+    def _add_new_link_direct(self, name, target):
+        existing_link = self._find_pb2_entry(name)
         if existing_link:
             symlinknode = existing_link
         else:
             symlinknode = self.pb2_directory.symlinks.add()
-        symlinknode.name = filename
+        assert isinstance(symlinknode, remote_execution_pb2.SymlinkNode)
+        symlinknode.name = name
         # A symlink node has no digest.
-        symlinknode.target = os.readlink(os.path.join(basename, filename))
-        self.index[filename] = IndexEntry(symlinknode, modified=(existing_link is not None))
+        symlinknode.target = target
+        self.index[name] = IndexEntry(symlinknode, modified=(existing_link is not None))
 
     def delete_entry(self, name):
         for collection in [self.pb2_directory.files, self.pb2_directory.symlinks, self.pb2_directory.directories]:
-            if name in collection:
-                collection.remove(name)
+            for thing in collection:
+                if thing.name == name:
+                    collection.remove(thing)
         if name in self.index:
             del self.index[name]
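The rewritten `delete_entry` has to match protobuf repeated-field entries by their `name` field rather than by value. A list-based sketch of the same operation (the `Node` type is a stand-in for the protobuf nodes, not the real generated classes); snapshotting the matches before calling `remove` avoids skipping elements while a sequence is mutated mid-iteration:

```python
from collections import namedtuple

# Stand-in for remote_execution_pb2 File/Symlink/Directory nodes.
Node = namedtuple("Node", ["name"])

def delete_by_name(collections, name):
    """Remove every entry whose .name matches, from each collection."""
    for collection in collections:
        # Collect the matches first, then remove them one by one, so
        # we never remove from the sequence we are iterating over.
        for thing in [t for t in collection if t.name == name]:
            collection.remove(thing)
```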
    
     
    
@@ -231,9 +412,13 @@ class CasBasedDirectory(Directory):
             if isinstance(entry, CasBasedDirectory):
                 return entry.descend(subdirectory_spec[1:], create)
             else:
+                # May be a symlink
+                target = self._resolve(subdirectory_spec[0], force_create=create)
+                if isinstance(target, CasBasedDirectory):
+                    return target
                 error = "Cannot descend into {}, which is a '{}' in the directory {}"
                 raise VirtualDirectoryError(error.format(subdirectory_spec[0],
-                                                         type(entry).__name__,
+                                                         type(self.index[subdirectory_spec[0]].pb_object).__name__,
                                                          self))
         else:
             if create:
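With the hunk above, `descend` no longer fails outright when a path component is not itself a directory: it first tries resolving it, since it may be a symlink pointing at one. A minimal sketch of that fallback over a dict tree (hypothetical names, not the BuildStream API; the real code resolves full symlink chains, this follows just one level):

```python
class DescendError(Exception):
    pass

def descend(tree, name):
    """Return the subdirectory 'name' refers to, following one symlink."""
    entry = tree.get(name)
    if isinstance(entry, dict):
        return entry  # a real directory
    # May be a symlink: ("link", target) naming a sibling entry
    if isinstance(entry, tuple) and entry[0] == "link":
        target = tree.get(entry[1])
        if isinstance(target, dict):
            return target
    raise DescendError("cannot descend into {!r}".format(name))
```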
    
@@ -254,36 +439,9 @@ class CasBasedDirectory(Directory):
         else:
             return self
 
-    def _resolve_symlink_or_directory(self, name):
-        """Used only by _import_files_from_directory. Tries to resolve a
-        directory name or symlink name. 'name' must be an entry in this
-        directory. It must be a single symlink or directory name, not a path
-        separated by path separators. If it's an existing directory name, it
-        just returns the Directory object for that. If it's a symlink, it will
-        attempt to find the target of the symlink and return that as a
-        Directory object.
-
-        If a symlink target doesn't exist, it will attempt to create it
-        as a directory as long as it's within this directory tree.
-        """
-
-        if isinstance(self.index[name].buildstream_object, Directory):
-            return self.index[name].buildstream_object
-        # OK then, it's a symlink
-        symlink = self._find_pb2_entry(name)
-        absolute = symlink.target.startswith(CasBasedDirectory._pb2_absolute_path_prefix)
-        if absolute:
-            root = self.find_root()
-        else:
-            root = self
-        directory = root
-        components = symlink.target.split(CasBasedDirectory._pb2_path_sep)
-        for c in components:
-            if c == "..":
-                directory = directory.parent
-            else:
-                directory = directory.descend(c, create=True)
-        return directory
+    def _resolve(self, name, absolute_symlinks_resolve=True, force_create=False):
+        resolver = _Resolver(absolute_symlinks_resolve, force_create)
+        return resolver.resolve(name, self)
 
     def _check_replacement(self, name, path_prefix, fileListResult):
         """ Checks whether 'name' exists, and if so, whether we can overwrite it.
    
@@ -297,6 +455,7 @@ class CasBasedDirectory(Directory):
             return True
         if (isinstance(existing_entry,
                        (remote_execution_pb2.FileNode, remote_execution_pb2.SymlinkNode))):
+            self.delete_entry(name)
             fileListResult.overwritten.append(relative_pathname)
             return True
         elif isinstance(existing_entry, remote_execution_pb2.DirectoryNode):
    
@@ -314,23 +473,44 @@ class CasBasedDirectory(Directory):
                .format(name, type(existing_entry)))
         return False  # In case asserts are disabled
 
-    def _import_directory_recursively(self, directory_name, source_directory, remaining_path, path_prefix):
-        """ _import_directory_recursively and _import_files_from_directory will be called alternately
-        as a directory tree is descended. """
-        if directory_name in self.index:
-            subdir = self._resolve_symlink_or_directory(directory_name)
-        else:
-            subdir = self._add_directory(directory_name)
-        new_path_prefix = os.path.join(path_prefix, directory_name)
-        subdir_result = subdir._import_files_from_directory(os.path.join(source_directory, directory_name),
-                                                            [os.path.sep.join(remaining_path)],
-                                                            path_prefix=new_path_prefix)
-        return subdir_result
+    def _replace_anything_with_dir(self, name, path_prefix, overwritten_files_list):
+        self.delete_entry(name)
+        subdir = self._add_directory(name)
+        overwritten_files_list.append(os.path.join(path_prefix, name))
+        return subdir
 
     def _import_files_from_directory(self, source_directory, files, path_prefix=""):
-        """ Imports files from a traditional directory """
+        """ Imports files from a traditional directory. """
+
+        def _ensure_followable(name, path_prefix):
+            """ Makes sure 'name' is a directory or symlink to a directory which can be descended into. """
+            if isinstance(self.index[name].buildstream_object, Directory):
+                return self.descend(name)
+            try:
+                target = self._resolve(name, force_create=True)
+            except InfiniteSymlinkException:
+                return self._replace_anything_with_dir(name, path_prefix, result.overwritten)
+            if isinstance(target, CasBasedDirectory):
+                return target
+            elif isinstance(target, remote_execution_pb2.FileNode):
+                return self._replace_anything_with_dir(name, path_prefix, result.overwritten)
+            return target
+
+        def _import_directory_recursively(directory_name, source_directory, remaining_path, path_prefix):
+            """ _import_directory_recursively and _import_files_from_directory will be called alternately
+            as a directory tree is descended. """
+            if directory_name in self.index:
+                subdir = _ensure_followable(directory_name, path_prefix)
+            else:
+                subdir = self._add_directory(directory_name)
+            new_path_prefix = os.path.join(path_prefix, directory_name)
+            subdir_result = subdir._import_files_from_directory(os.path.join(source_directory, directory_name),
+                                                                [os.path.sep.join(remaining_path)],
+                                                                path_prefix=new_path_prefix)
+            return subdir_result
+
         result = FileListResult()
-        for entry in sorted(files):
+        for entry in files:
             split_path = entry.split(os.path.sep)
             # The actual file on the FS we're importing
             import_file = os.path.join(source_directory, entry)
    
@@ -338,14 +518,18 @@ class CasBasedDirectory(Directory):
             relative_pathname = os.path.join(path_prefix, entry)
             if len(split_path) > 1:
                 directory_name = split_path[0]
-                # Hand this off to the importer for that subdir. This will only do one file -
-                # a better way would be to hand off all the files in this subdir at once.
-                subdir_result = self._import_directory_recursively(directory_name, source_directory,
-                                                                   split_path[1:], path_prefix)
+                # Hand this off to the importer for that subdir.
+
+                # It would be advantageous to batch these together by
+                # directory_name. However, we can't do it out of
+                # order, since importing symlinks affects the results
+                # of other imports.
+                subdir_result = _import_directory_recursively(directory_name, source_directory,
+                                                              split_path[1:], path_prefix)
                 result.combine(subdir_result)
             elif os.path.islink(import_file):
                 if self._check_replacement(entry, path_prefix, result):
-                    self._add_new_link(source_directory, entry)
+                    self._copy_link_from_filesystem(source_directory, entry)
                     result.files_written.append(relative_pathname)
             elif os.path.isdir(import_file):
                 # A plain directory which already exists isn't a problem; just ignore it.
    
@@ -353,10 +537,78 @@ class CasBasedDirectory(Directory):
                     self._add_directory(entry)
             elif os.path.isfile(import_file):
                 if self._check_replacement(entry, path_prefix, result):
-                    self._add_new_file(source_directory, entry)
+                    self._add_file(source_directory, entry, modified=relative_pathname in result.overwritten)
                     result.files_written.append(relative_pathname)
         return result
 
+    @staticmethod
+    def _files_in_subdir(sorted_files, dirname):
+        """Filters sorted_files and returns only the ones which have
+           'dirname' as a prefix, with that prefix removed.
+
+        """
+        if not dirname.endswith(os.path.sep):
+            dirname += os.path.sep
+        return [f[len(dirname):] for f in sorted_files if f.startswith(dirname)]
+
+    def _partial_import_cas_into_cas(self, source_directory, files, path_prefix="", file_list_required=True):
+        """ Import only the files and symlinks listed in 'files' from source_directory to this one.
+        Args:
+           source_directory (:class:`.CasBasedDirectory`): The directory to import from
+           files ([str]): List of pathnames to import. Must be a list, not a generator.
+           path_prefix (str): Prefix used to add entries to the file list result.
+           file_list_required: Whether to update the file list while processing.
+        """
+        result = FileListResult()
+        processed_directories = set()
+        for f in files:
+            fullname = os.path.join(path_prefix, f)
+            components = f.split(os.path.sep)
+            if len(components) > 1:
+                # We are importing a thing which is in a subdirectory. We may have already seen this dirname
+                # for a previous file.
+                dirname = components[0]
+                if dirname not in processed_directories:
+                    # Now strip off the first directory name and import files recursively.
+                    subcomponents = CasBasedDirectory._files_in_subdir(files, dirname)
+                    # We will fail at this point if there is a file or symlink to file called 'dirname'.
+                    if dirname in self.index:
+                        resolved_component = self._resolve(dirname, force_create=True)
+                        if isinstance(resolved_component, remote_execution_pb2.FileNode):
+                            dest_subdir = self._replace_anything_with_dir(dirname, path_prefix, result.overwritten)
+                        else:
+                            dest_subdir = resolved_component
+                    else:
+                        dest_subdir = self.descend(dirname, create=True)
+                    src_subdir = source_directory.descend(dirname)
+                    import_result = dest_subdir._partial_import_cas_into_cas(src_subdir, subcomponents,
+                                                                             path_prefix=fullname,
+                                                                             file_list_required=file_list_required)
+                    result.combine(import_result)
+                processed_directories.add(dirname)
+            elif isinstance(source_directory.index[f].buildstream_object, CasBasedDirectory):
+                # The thing in the input file list is a directory on
+                # its own. We don't need to do anything other than create it if it doesn't exist.
+                # If we already have an entry with the same name that isn't a directory, that
+                # will be dealt with when importing files in this directory.
+                if f not in self.index:
+                    self.descend(f, create=True)
+            else:
+                # We're importing a file or symlink - replace anything with the same name.
+                importable = self._check_replacement(f, path_prefix, result)
+                if importable:
+                    item = source_directory.index[f].pb_object
+                    if isinstance(item, remote_execution_pb2.FileNode):
+                        filenode = self.pb2_directory.files.add(digest=item.digest, name=f,
+                                                                is_executable=item.is_executable)
+                        self.index[f] = IndexEntry(filenode, modified=True)
+                    else:
+                        assert isinstance(item, remote_execution_pb2.SymlinkNode)
+                        self._add_new_link_direct(name=f, target=item.target)
+                else:
+                    result.ignored.append(os.path.join(path_prefix, f))
+        return result
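The new `_partial_import_cas_into_cas` groups its input paths by leading directory component using `_files_in_subdir`, then recurses once per subdirectory. The prefix filter is simple enough to lift out and exercise on its own; this standalone copy is for illustration only (in the commit it is a `@staticmethod` on `CasBasedDirectory`):

```python
import os

def files_in_subdir(files, dirname):
    """Return the entries of 'files' under 'dirname', prefix stripped.

    'dirname' is normalised to end with the platform path separator so
    that a file literally named 'dirname' is not treated as being
    inside the subdirectory.
    """
    if not dirname.endswith(os.path.sep):
        dirname += os.path.sep
    return [f[len(dirname):] for f in files if f.startswith(dirname)]
```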
    
+
     def import_files(self, external_pathspec, *, files=None,
                      report_written=True, update_utimes=False,
                      can_link=False):
    
@@ -378,28 +630,27 @@ class CasBasedDirectory(Directory):
 
         can_link (bool): Ignored, since hard links do not have any meaning within CAS.
         """
-        if isinstance(external_pathspec, FileBasedDirectory):
-            source_directory = external_pathspec._get_underlying_directory()
-        elif isinstance(external_pathspec, CasBasedDirectory):
-            # TODO: This transfers from one CAS to another via the
-            # filesystem, which is very inefficient. Alter this so it
-            # transfers refs across directly.
-            with tempfile.TemporaryDirectory(prefix="roundtrip") as tmpdir:
-                external_pathspec.export_files(tmpdir)
-                if files is None:
-                    files = list_relative_paths(tmpdir)
-                result = self._import_files_from_directory(tmpdir, files=files)
-            return result
-        else:
-            source_directory = external_pathspec
 
         if files is None:
-            files = list_relative_paths(source_directory)
+            if isinstance(external_pathspec, str):
+                files = list_relative_paths(external_pathspec)
+            else:
+                assert isinstance(external_pathspec, Directory)
+                files = external_pathspec.list_relative_paths()
+
+        if isinstance(external_pathspec, FileBasedDirectory):
+            source_directory = external_pathspec.get_underlying_directory()
+            result = self._import_files_from_directory(source_directory, files=files)
+        elif isinstance(external_pathspec, str):
+            source_directory = external_pathspec
+            result = self._import_files_from_directory(source_directory, files=files)
+        else:
+            assert isinstance(external_pathspec, CasBasedDirectory)
+            result = self._partial_import_cas_into_cas(external_pathspec, files=list(files))
 
         # TODO: No notice is taken of report_written, update_utimes or can_link.
         # Current behaviour is to fully populate the report, which is inefficient,
         # but still correct.
-        result = self._import_files_from_directory(source_directory, files=files)
 
         # We need to recalculate and store the hashes of all directories both
         # up and down the tree; we have changed our directory by importing files
    
@@ -511,6 +762,28 @@ class CasBasedDirectory(Directory):
         else:
             self._mark_directory_unmodified()
 
+    def _lightweight_resolve_to_index(self, path):
+        """A lightweight function for transforming paths into IndexEntry
+        objects. This does not follow symlinks.
+
+        path: The string to resolve. This should be a series of path
+        components separated by the protocol buffer path separator
+        _pb2_path_sep.
+
+        Returns: the IndexEntry found, or None if any of the path components were not present.
+
+        """
+        directory = self
+        path_components = path.split(CasBasedDirectory._pb2_path_sep)
+        for component in path_components[:-1]:
+            if component not in directory.index:
+                return None
+            if isinstance(directory.index[component].buildstream_object, CasBasedDirectory):
+                directory = directory.index[component].buildstream_object
+            else:
+                return None
+        return directory.index.get(path_components[-1], None)
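`_lightweight_resolve_to_index` is a plain component walk: every component except the last must name a real directory, and symlinks are deliberately not followed. The same shape over nested dicts — a sketch with hypothetical names, not the real `IndexEntry` structure:

```python
def resolve_to_entry(root, path, sep="/"):
    """Walk 'path' through nested dicts without following symlinks.

    Returns the entry the final component names, or None if any
    intermediate component is missing or is not a directory (a dict).
    """
    directory = root
    components = path.split(sep)
    for component in components[:-1]:
        entry = directory.get(component)
        if not isinstance(entry, dict):
            return None  # missing, or a file/symlink mid-path
        directory = entry
    return directory.get(components[-1])
```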
    
+
     def list_modified_paths(self):
         """Provide a list of relative paths which have been modified since the
         last call to mark_unmodified.
    
@@ -518,29 +791,43 @@ class CasBasedDirectory(Directory):
         Return value: List(str) - list of modified paths
         """
 
-        filelist = []
-        for (k, v) in self.index.items():
-            if isinstance(v.buildstream_object, CasBasedDirectory):
-                filelist.extend([k + os.path.sep + x for x in v.buildstream_object.list_modified_paths()])
-            elif isinstance(v.pb_object, remote_execution_pb2.FileNode) and v.modified:
-                filelist.append(k)
-        return filelist
+        for p in self.list_relative_paths():
+            i = self._lightweight_resolve_to_index(p)
+            if i and i.modified:
+                yield p
 
-    def list_relative_paths(self):
+    def list_relative_paths(self, relpath=""):
         """Provide a list of all relative paths.
 
-        NOTE: This list is not in the same order as utils.list_relative_paths.
-
         Return value: List(str) - list of all paths
         """
 
-        filelist = []
-        for (k, v) in self.index.items():
-            if isinstance(v.buildstream_object, CasBasedDirectory):
-                filelist.extend([k + os.path.sep + x for x in v.buildstream_object.list_relative_paths()])
-            elif isinstance(v.pb_object, remote_execution_pb2.FileNode):
-                filelist.append(k)
-        return filelist
+        symlink_list = filter(lambda i: isinstance(i[1].pb_object, remote_execution_pb2.SymlinkNode),
+                              self.index.items())
+        file_list = list(filter(lambda i: isinstance(i[1].pb_object, remote_execution_pb2.FileNode),
+                                self.index.items()))
+        directory_list = filter(lambda i: isinstance(i[1].buildstream_object, CasBasedDirectory),
+                                self.index.items())
+
    +        # We need to mimic the behaviour of os.walk, in which symlinks
    
    813
    +        # to directories count as directories and symlinks to file or
    
    814
    +        # broken symlinks count as files. os.walk doesn't follow
    
    815
    +        # symlinks, so we don't recurse.
    
    816
    +        for (k, v) in sorted(symlink_list):
    
    817
    +            target = self._resolve(k, absolute_symlinks_resolve=True)
    
    818
    +            if isinstance(target, CasBasedDirectory):
    
    819
    +                yield os.path.join(relpath, k)
    
    820
    +            else:
    
    821
    +                file_list.append((k, v))
    
    822
    +
    
    823
    +        if file_list == [] and relpath != "":
    
    824
    +            yield relpath
    
    825
    +        else:
    
    826
    +            for (k, v) in sorted(file_list):
    
    827
    +                yield os.path.join(relpath, k)
    
    828
    +
    
    829
    +        for (k, v) in sorted(directory_list):
    
    830
    +            yield from v.buildstream_object.list_relative_paths(relpath=os.path.join(relpath, k))
    
    544 831
     
    
    545 832
         def recalculate_hash(self):
    
    546 833
             """ Recalcuates the hash for this directory and store the results in
    
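The `_lightweight_resolve_to_index()` helper in this diff walks the index one path component at a time and bails out as soon as a component is missing or is not a subdirectory, without following symlinks. A minimal standalone sketch of that walk, using plain nested dicts as stand-ins for BuildStream's `CasBasedDirectory`/`IndexEntry` objects (`SEP` and `lightweight_resolve` are illustrative names, not part of the real API):

```python
# Simplified model of the lightweight path resolution above: directories
# are dicts, anything else (a file entry) is a plain value. Symlinks are
# deliberately not followed -- a non-dict mid-path terminates the walk.
SEP = "/"  # stand-in for CasBasedDirectory._pb2_path_sep

def lightweight_resolve(index, path):
    """Return the entry at 'path', or None if any component is absent."""
    components = path.split(SEP)
    directory = index
    for component in components[:-1]:
        entry = directory.get(component)
        # Only descend into real subdirectories; files and symlinks
        # encountered mid-path leave the walk unresolved.
        if not isinstance(entry, dict):
            return None
        directory = entry
    return directory.get(components[-1])

index = {"a": {"b": {"c.txt": "file-entry"}}}
print(lightweight_resolve(index, "a/b/c.txt"))  # file-entry
print(lightweight_resolve(index, "a/x/c.txt"))  # None
```

This mirrors why `list_modified_paths()` above can use the helper cheaply: each lookup is a straight dictionary walk with no symlink resolution.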

  • doc/source/using_config.rst
    ... ... @@ -147,6 +147,44 @@ The default mirror is defined by its name, e.g.
       ``--default-mirror`` command-line option.
     
     
    +Local cache expiry
    +~~~~~~~~~~~~~~~~~~
    +BuildStream locally caches artifacts, build trees, log files and sources within a
    +cache located at ``~/.cache/buildstream`` (or below ``$XDG_CACHE_HOME``, if that
    +environment variable is set). When building large projects, this cache can get
    +very large, so BuildStream will attempt to clean up the cache automatically by
    +expiring the least recently *used* artifacts.
    +
    +By default, cache expiry will begin once the file system which contains the cache
    +approaches maximum usage. However, it is also possible to impose a quota on the local
    +cache in the user configuration. This can be done in two ways:
    +
    +1. By restricting the maximum size of the cache directory itself.
    +
    +For example, to ensure that BuildStream's cache does not grow beyond 100 GB,
    +simply declare the following in your user configuration (``~/.config/buildstream.conf``):
    +
    +.. code:: yaml
    +
    +  cache:
    +    quota: 100G
    +
    +This quota defines the maximum size of the artifact cache in bytes.
    +Other accepted suffixes are K, M, G or T (or you can simply declare the value in bytes, without a suffix).
    +This uses the same format as systemd's
    +`resource-control <https://www.freedesktop.org/software/systemd/man/systemd.resource-control.html>`_.
    +
    +2. By expiring artifacts once the file system which contains the cache exceeds a specified usage.
    +
    +To ensure that we start cleaning the cache once we've used 80% of local disk space (on the file system
    +which mounts the cache):
    +
    +.. code:: yaml
    +
    +  cache:
    +    quota: 80%
    +
    +
     Default configuration
     ---------------------
     The default BuildStream configuration is specified here for reference:
    
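The quota values documented above follow systemd's size format: an integer byte count, a K/M/G/T suffix, or a percentage of the containing file system. An illustrative parser for that format (a sketch only, not BuildStream's actual implementation; `parse_quota` and its `disk_size` parameter are hypothetical names):

```python
# Hypothetical parser for quota strings like "100G", "80%", or "4096".
# Suffix multipliers follow systemd's resource-control convention of
# powers of 1024, as the docs above reference.
SUFFIXES = {'K': 1024, 'M': 1024 ** 2, 'G': 1024 ** 3, 'T': 1024 ** 4}

def parse_quota(value, disk_size):
    """Return the quota in bytes; percentages are taken of disk_size."""
    value = value.strip()
    if value.endswith('%'):
        # "80%" -> 80% of the file system holding the cache
        return disk_size * int(value[:-1]) // 100
    if value[-1] in SUFFIXES:
        # "100G" -> 100 * 1024**3 bytes
        return int(value[:-1]) * SUFFIXES[value[-1]]
    return int(value)  # plain byte count

print(parse_quota("100G", disk_size=0))    # 107374182400
print(parse_quota("80%", disk_size=1000))  # 800
```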

  • tests/cachekey/cachekey.py
    ... ... @@ -36,7 +36,7 @@
     # the result.
     #
     from tests.testutils.runcli import cli
    -from tests.testutils.site import HAVE_BZR, HAVE_GIT, HAVE_OSTREE, IS_LINUX
    +from tests.testutils.site import HAVE_BZR, HAVE_GIT, HAVE_OSTREE, IS_LINUX, MACHINE_ARCH
     from buildstream.plugin import CoreWarnings
     from buildstream import _yaml
     import os
    ... ... @@ -144,6 +144,8 @@ DATA_DIR = os.path.join(
     # The cache key test uses a project which exercises all plugins,
     # so we can't run it at all if we don't have them installed.
     #
    +@pytest.mark.skipif(MACHINE_ARCH != 'x86_64',
    +                    reason='Cache keys depend on architecture')
     @pytest.mark.skipif(not IS_LINUX, reason='Only available on linux')
     @pytest.mark.skipif(HAVE_BZR is False, reason="bzr is not available")
     @pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
    

  • tests/examples/autotools.py
    ... ... @@ -3,7 +3,7 @@ import pytest
     
     from tests.testutils import cli_integration as cli
     from tests.testutils.integration import assert_contains
    -from tests.testutils.site import IS_LINUX
    +from tests.testutils.site import IS_LINUX, MACHINE_ARCH
     
     pytestmark = pytest.mark.integration
     
    ... ... @@ -13,6 +13,8 @@ DATA_DIR = os.path.join(
     
     
     # Tests a build of the autotools amhello project on an alpine-linux base runtime
    +@pytest.mark.skipif(MACHINE_ARCH != 'x86_64',
    +                    reason='Examples are written for x86_64')
     @pytest.mark.skipif(not IS_LINUX, reason='Only available on linux')
     @pytest.mark.datafiles(DATA_DIR)
     def test_autotools_build(cli, tmpdir, datafiles):
    ... ... @@ -36,6 +38,8 @@ def test_autotools_build(cli, tmpdir, datafiles):
     
     
     # Test running an executable built with autotools.
    +@pytest.mark.skipif(MACHINE_ARCH != 'x86_64',
    +                    reason='Examples are written for x86_64')
     @pytest.mark.skipif(not IS_LINUX, reason='Only available on linux')
     @pytest.mark.datafiles(DATA_DIR)
     def test_autotools_run(cli, tmpdir, datafiles):
    

  • tests/examples/developing.py
    ... ... @@ -4,7 +4,7 @@ import pytest
     import tests.testutils.patch as patch
     from tests.testutils import cli_integration as cli
     from tests.testutils.integration import assert_contains
    -from tests.testutils.site import IS_LINUX
    +from tests.testutils.site import IS_LINUX, MACHINE_ARCH
     
     pytestmark = pytest.mark.integration
     
    ... ... @@ -14,6 +14,8 @@ DATA_DIR = os.path.join(
     
     
     # Test that the project builds successfully
    +@pytest.mark.skipif(MACHINE_ARCH != 'x86_64',
    +                    reason='Examples are written for x86_64')
     @pytest.mark.skipif(not IS_LINUX, reason='Only available on linux')
     @pytest.mark.datafiles(DATA_DIR)
     def test_autotools_build(cli, tmpdir, datafiles):
    ... ... @@ -35,6 +37,8 @@ def test_autotools_build(cli, tmpdir, datafiles):
     
     
     # Test the unmodified hello command works as expected.
    +@pytest.mark.skipif(MACHINE_ARCH != 'x86_64',
    +                    reason='Examples are written for x86_64')
     @pytest.mark.skipif(not IS_LINUX, reason='Only available on linux')
     @pytest.mark.datafiles(DATA_DIR)
     def test_run_unmodified_hello(cli, tmpdir, datafiles):
    ... ... @@ -66,6 +70,8 @@ def test_open_workspace(cli, tmpdir, datafiles):
     
     
     # Test making a change using the workspace
    +@pytest.mark.skipif(MACHINE_ARCH != 'x86_64',
    +                    reason='Examples are written for x86_64')
     @pytest.mark.skipif(not IS_LINUX, reason='Only available on linux')
     @pytest.mark.datafiles(DATA_DIR)
     def test_make_change_in_workspace(cli, tmpdir, datafiles):
    

  • tests/examples/flatpak-autotools.py
    ... ... @@ -3,7 +3,7 @@ import pytest
     
     from tests.testutils import cli_integration as cli
     from tests.testutils.integration import assert_contains
    -from tests.testutils.site import IS_LINUX
    +from tests.testutils.site import IS_LINUX, MACHINE_ARCH
     
     
     pytestmark = pytest.mark.integration
    ... ... @@ -32,6 +32,8 @@ def workaround_setuptools_bug(project):
     
     # Test that a build upon flatpak runtime 'works' - we use the autotools sample
     # amhello project for this.
    +@pytest.mark.skipif(MACHINE_ARCH != 'x86_64',
    +                    reason='Examples are written for x86_64')
     @pytest.mark.skipif(not IS_LINUX, reason='Only available on linux')
     @pytest.mark.datafiles(DATA_DIR)
     def test_autotools_build(cli, tmpdir, datafiles):
    ... ... @@ -55,6 +57,8 @@ def test_autotools_build(cli, tmpdir, datafiles):
     
     
     # Test running an executable built with autotools
    +@pytest.mark.skipif(MACHINE_ARCH != 'x86_64',
    +                    reason='Examples are written for x86_64')
     @pytest.mark.skipif(not IS_LINUX, reason='Only available on linux')
     @pytest.mark.datafiles(DATA_DIR)
     def test_autotools_run(cli, tmpdir, datafiles):
    

  • tests/examples/integration-commands.py
    ... ... @@ -3,7 +3,7 @@ import pytest
     
     from tests.testutils import cli_integration as cli
     from tests.testutils.integration import assert_contains
    -from tests.testutils.site import IS_LINUX
    +from tests.testutils.site import IS_LINUX, MACHINE_ARCH
     
     
     pytestmark = pytest.mark.integration
    ... ... @@ -12,6 +12,8 @@ DATA_DIR = os.path.join(
     )
     
     
    +@pytest.mark.skipif(MACHINE_ARCH != 'x86_64',
    +                    reason='Examples are written for x86_64')
     @pytest.mark.skipif(not IS_LINUX, reason='Only available on linux')
     @pytest.mark.datafiles(DATA_DIR)
     def test_integration_commands_build(cli, tmpdir, datafiles):
    ... ... @@ -23,6 +25,8 @@ def test_integration_commands_build(cli, tmpdir, datafiles):
     
     
     # Test running the executable
    +@pytest.mark.skipif(MACHINE_ARCH != 'x86_64',
    +                    reason='Examples are written for x86_64')
     @pytest.mark.skipif(not IS_LINUX, reason='Only available on linux')
     @pytest.mark.datafiles(DATA_DIR)
     def test_integration_commands_run(cli, tmpdir, datafiles):
    

  • tests/examples/junctions.py
    ... ... @@ -3,7 +3,7 @@ import pytest
     
     from tests.testutils import cli_integration as cli
     from tests.testutils.integration import assert_contains
    -from tests.testutils.site import IS_LINUX
    +from tests.testutils.site import IS_LINUX, MACHINE_ARCH
     
     pytestmark = pytest.mark.integration
     
    ... ... @@ -13,6 +13,8 @@ DATA_DIR = os.path.join(
     
     
     # Test that the project builds successfully
    +@pytest.mark.skipif(MACHINE_ARCH != 'x86_64',
    +                    reason='Examples are written for x86_64')
     @pytest.mark.skipif(not IS_LINUX, reason='Only available on linux')
     @pytest.mark.datafiles(DATA_DIR)
     def test_build(cli, tmpdir, datafiles):
    ... ... @@ -23,6 +25,8 @@ def test_build(cli, tmpdir, datafiles):
     
     
     # Test the callHello script works as expected.
    +@pytest.mark.skipif(MACHINE_ARCH != 'x86_64',
    +                    reason='Examples are written for x86_64')
     @pytest.mark.skipif(not IS_LINUX, reason='Only available on linux')
     @pytest.mark.datafiles(DATA_DIR)
     def test_shell_call_hello(cli, tmpdir, datafiles):
    

  • tests/examples/running-commands.py
    ... ... @@ -3,7 +3,7 @@ import pytest
     
     from tests.testutils import cli_integration as cli
     from tests.testutils.integration import assert_contains
    -from tests.testutils.site import IS_LINUX
    +from tests.testutils.site import IS_LINUX, MACHINE_ARCH
     
     
     pytestmark = pytest.mark.integration
    ... ... @@ -12,6 +12,8 @@ DATA_DIR = os.path.join(
     )
     
     
    +@pytest.mark.skipif(MACHINE_ARCH != 'x86_64',
    +                    reason='Examples are written for x86_64')
     @pytest.mark.skipif(not IS_LINUX, reason='Only available on linux')
     @pytest.mark.datafiles(DATA_DIR)
     def test_running_commands_build(cli, tmpdir, datafiles):
    ... ... @@ -23,6 +25,8 @@ def test_running_commands_build(cli, tmpdir, datafiles):
     
     
     # Test running the executable
    +@pytest.mark.skipif(MACHINE_ARCH != 'x86_64',
    +                    reason='Examples are written for x86_64')
     @pytest.mark.skipif(not IS_LINUX, reason='Only available on linux')
     @pytest.mark.datafiles(DATA_DIR)
     def test_running_commands_run(cli, tmpdir, datafiles):
    
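The example tests above all gate on `MACHINE_ARCH` from `tests/testutils/site.py`, whose definition is not shown in this diff. A plausible stand-in (an assumption about how `site.py` derives it, not its actual code) is the stdlib's `platform.machine()`:

```python
# Hypothetical stand-in for tests/testutils/site.py's MACHINE_ARCH --
# assumed here to come from platform.machine(), which reports the
# hardware identifier, e.g. 'x86_64' or 'aarch64' on Linux.
import platform

MACHINE_ARCH = platform.machine()

# The skip pattern used throughout the diff would then read:
#   @pytest.mark.skipif(MACHINE_ARCH != 'x86_64',
#                       reason='Examples are written for x86_64')
print(MACHINE_ARCH)
```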

  • tests/format/list-directive-type-error/project.conf
    ... ... @@ -4,4 +4,4 @@ options:
       arch:
         type: arch
         description: Example architecture option
    -    values: [ x86_32, x86_64 ]
    +    values: [ x86_32, x86_64, aarch64 ]

  • tests/frontend/invalid_element_path/project.conf
    +# Project config for frontend build test
    +name: test
    +
    +elephant-path: elements

  • tests/frontend/show.py
    ... ... @@ -36,6 +36,19 @@ def test_show(cli, datafiles, target, format, expected):
                                  .format(expected, result.output))
     
     
    +@pytest.mark.datafiles(os.path.join(
    +    os.path.dirname(os.path.realpath(__file__)),
    +    "invalid_element_path",
    +))
    +def test_show_invalid_element_path(cli, datafiles):
    +    project = os.path.join(datafiles.dirname, datafiles.basename)
    +    result = cli.run(project=project, silent=True, args=[
    +        'show',
    +        "foo.bst"])
    +
    +    result.assert_main_error(ErrorDomain.LOAD, LoadErrorReason.INVALID_DATA)
    +
    +
     @pytest.mark.datafiles(DATA_DIR)
     @pytest.mark.parametrize("target,except_,expected", [
         ('target.bst', 'import-bin.bst', ['import-dev.bst', 'compose-all.bst', 'target.bst']),
    

  • tests/integration/project/elements/base/base-alpine.bst
    ... ... @@ -7,6 +7,11 @@ description: |
     
     sources:
       - kind: tar
    -    url: alpine:integration-tests-base.v1.x86_64.tar.xz
         base-dir: ''
    -    ref: 3eb559250ba82b64a68d86d0636a6b127aa5f6d25d3601a79f79214dc9703639
    +    (?):
    +    - arch == "x86_64":
    +        ref: 3eb559250ba82b64a68d86d0636a6b127aa5f6d25d3601a79f79214dc9703639
    +        url: "alpine:integration-tests-base.v1.x86_64.tar.xz"
    +    - arch == "aarch64":
    +        ref: 431fb5362032ede6f172e70a3258354a8fd71fcbdeb1edebc0e20968c792329a
    +        url: "alpine:integration-tests-base.v1.aarch64.tar.xz"

  • tests/integration/project/elements/sandbox-bwrap/break-shell.bst
    +kind: manual
    +depends:
    +  - base/base-alpine.bst
    +
    +public:
    +  bst:
    +    integration-commands:
    +    - |
    +      chmod a-x /bin/sh

  • tests/integration/project/elements/sandbox-bwrap/command-exit-42.bst
    +kind: manual
    +depends:
    +  - base/base-alpine.bst
    +
    +config:
    +  build-commands:
    +  - |
    +    exit 42

  • tests/integration/project/elements/sandbox-bwrap/non-executable-shell.bst
    +kind: manual
    +
    +depends:
    +  - sandbox-bwrap/break-shell.bst
    +
    +config:
    +  build-commands:
    +  - |
    +    exit 42

  • tests/integration/project/project.conf
    ... ... @@ -9,6 +9,12 @@ options:
         type: bool
         description: Whether to expect a linux platform
         default: True
    +  arch:
    +    type: arch
    +    description: Current architecture
    +    values:
    +      - x86_64
    +      - aarch64
     split-rules:
       test:
         - |
    

  • tests/integration/sandbox-bwrap.py
     import os
     import pytest
     
    +from buildstream._exceptions import ErrorDomain
    +
     from tests.testutils import cli_integration as cli
     from tests.testutils.integration import assert_contains
    -from tests.testutils.site import HAVE_BWRAP
    +from tests.testutils.site import HAVE_BWRAP, HAVE_BWRAP_JSON_STATUS
     
     
     pytestmark = pytest.mark.integration
    ... ... @@ -29,3 +31,32 @@ def test_sandbox_bwrap_cleanup_build(cli, tmpdir, datafiles):
         # Here, BuildStream should not attempt any rmdir etc.
         result = cli.run(project=project, args=['build', element_name])
         assert result.exit_code == 0
    +
    +
    +@pytest.mark.skipif(not HAVE_BWRAP, reason='Only available with bubblewrap')
    +@pytest.mark.skipif(not HAVE_BWRAP_JSON_STATUS, reason='Only available with bubblewrap supporting --json-status-fd')
    +@pytest.mark.datafiles(DATA_DIR)
    +def test_sandbox_bwrap_distinguish_setup_error(cli, tmpdir, datafiles):
    +    project = os.path.join(datafiles.dirname, datafiles.basename)
    +    element_name = 'sandbox-bwrap/non-executable-shell.bst'
    +
    +    result = cli.run(project=project, args=['build', element_name])
    +    result.assert_task_error(error_domain=ErrorDomain.SANDBOX, error_reason="bwrap-sandbox-fail")
    +
    +
    +@pytest.mark.integration
    +@pytest.mark.skipif(not HAVE_BWRAP, reason='Only available with bubblewrap')
    +@pytest.mark.datafiles(DATA_DIR)
    +def test_sandbox_bwrap_return_subprocess(cli, tmpdir, datafiles):
    +    project = os.path.join(datafiles.dirname, datafiles.basename)
    +    element_name = 'sandbox-bwrap/command-exit-42.bst'
    +
    +    cli.configure({
    +        "logging": {
    +            "message-format": "%{element}|%{message}",
    +        },
    +    })
    +
    +    result = cli.run(project=project, args=['build', element_name])
    +    result.assert_task_error(error_domain=ErrorDomain.ELEMENT, error_reason=None)
    +    assert "sandbox-bwrap/command-exit-42.bst|Command 'exit 42' failed with exitcode 42" in result.stderr

  • tests/storage/virtual_directory_import.py
    +from hashlib import sha256
    +import os
    +import pytest
    +import random
    +import tempfile
    +from tests.testutils import cli
    +
    +from buildstream.storage._casbaseddirectory import CasBasedDirectory
    +from buildstream.storage._filebaseddirectory import FileBasedDirectory
    +from buildstream._artifactcache import ArtifactCache
    +from buildstream._artifactcache.cascache import CASCache
    +from buildstream import utils
    +
    +
    +# These are comparative tests that check that FileBasedDirectory and
    +# CasBasedDirectory act identically.
    +
    +
    +class FakeArtifactCache():
    +    def __init__(self):
    +        self.cas = None
    +
    +
    +class FakeContext():
    +    def __init__(self):
    +        self.artifactdir = ''
    +        self.artifactcache = FakeArtifactCache()
    +
    +
    +# This is a set of example file system contents. It's a set of trees
    +# which are either expected to be problematic or were found to be
    +# problematic during random testing.
    +
    +# The test attempts to import each on top of each other to test that
    +# importing works consistently.  Each tuple is defined as (<filename>,
    +# <type>, <content>). Type can be 'F' (file), 'S' (symlink) or 'D'
    +# (directory) with content being the contents for a file or the
    +# destination for a symlink.
    +root_filesets = [
    +    [('a/b/c/textfile1', 'F', 'This is textfile 1\n')],
    +    [('a/b/c/textfile1', 'F', 'This is the replacement textfile 1\n')],
    +    [('a/b/d', 'D', '')],
    +    [('a/b/c', 'S', '/a/b/d')],
    +    [('a/b/d', 'S', '/a/b/c')],
    +    [('a/b/d', 'D', ''), ('a/b/c', 'S', '/a/b/d')],
    +    [('a/b/c', 'D', ''), ('a/b/d', 'S', '/a/b/c')],
    +    [('a/b', 'F', 'This is textfile 1\n')],
    +    [('a/b/c', 'F', 'This is textfile 1\n')],
    +    [('a/b/c', 'D', '')]
    +]
    +
    +empty_hash_ref = sha256().hexdigest()
    +RANDOM_SEED = 69105
    +NUM_RANDOM_TESTS = 10
    +
    +
    +def generate_import_roots(rootno, directory):
    +    rootname = "root{}".format(rootno)
    +    rootdir = os.path.join(directory, "content", rootname)
    +    if os.path.exists(rootdir):
    +        return
    +    for (path, typesymbol, content) in root_filesets[rootno - 1]:
    +        if typesymbol == 'F':
    +            (dirnames, filename) = os.path.split(path)
    +            os.makedirs(os.path.join(rootdir, dirnames), exist_ok=True)
    +            with open(os.path.join(rootdir, dirnames, filename), "wt") as f:
    +                f.write(content)
    +        elif typesymbol == 'D':
    +            os.makedirs(os.path.join(rootdir, path), exist_ok=True)
    +        elif typesymbol == 'S':
    +            (dirnames, filename) = os.path.split(path)
    +            os.makedirs(os.path.join(rootdir, dirnames), exist_ok=True)
    +            os.symlink(content, os.path.join(rootdir, path))
    +
    +
    +def generate_random_root(rootno, directory):
    +    # By seeding the random number generator, we ensure these tests
    +    # will be repeatable, at least until Python changes the random
    +    # number algorithm.
    +    random.seed(RANDOM_SEED + rootno)
    +    rootname = "root{}".format(rootno)
    +    rootdir = os.path.join(directory, "content", rootname)
    +    if os.path.exists(rootdir):
    +        return
    +    things = []
    +    locations = ['.']
    +    os.makedirs(rootdir)
    +    for i in range(0, 100):
    +        location = random.choice(locations)
    +        thingname = "node{}".format(i)
    +        thing = random.choice(['dir', 'link', 'file'])
    +        target = os.path.join(rootdir, location, thingname)
    +        if thing == 'dir':
    +            os.makedirs(target)
    +            locations.append(os.path.join(location, thingname))
    +        elif thing == 'file':
    +            with open(target, "wt") as f:
    +                f.write("This is node {}\n".format(i))
    +        elif thing == 'link':
    +            symlink_type = random.choice(['absolute', 'relative', 'broken'])
    +            if symlink_type == 'broken' or not things:
    +                os.symlink("/broken", target)
    +            elif symlink_type == 'absolute':
    +                symlink_destination = random.choice(things)
    +                os.symlink(symlink_destination, target)
    +            else:
    +                symlink_destination = random.choice(things)
    +                relative_link = os.path.relpath(symlink_destination, start=location)
    +                os.symlink(relative_link, target)
    +        things.append(os.path.join(location, thingname))
    +
    +
    +def file_contents(path):
    +    with open(path, "r") as f:
    +        result = f.read()
    +    return result
    +
    +
    +def file_contents_are(path, contents):
    +    return file_contents(path) == contents
    +
    +
    +def create_new_casdir(root_number, fake_context, tmpdir):
    +    d = CasBasedDirectory(fake_context)
    +    d.import_files(os.path.join(tmpdir, "content", "root{}".format(root_number)))
    +    assert d.ref.hash != empty_hash_ref
    +    return d
    +
    +
    +def create_new_filedir(root_number, tmpdir):
    +    root = os.path.join(tmpdir, "vdir")
    +    os.makedirs(root)
    +    d = FileBasedDirectory(root)
    +    d.import_files(os.path.join(tmpdir, "content", "root{}".format(root_number)))
    +    return d
    +
    +
    +def combinations(integer_range):
    +    for x in integer_range:
    +        for y in integer_range:
    +            yield (x, y)
    +
    +
    +def resolve_symlinks(path, root):
    +    """ A function to resolve symlinks inside 'path' components apart from the last one.
    +        For example, resolve_symlinks('/a/b/c/d', '/a/b')
    +        will return '/a/b/f/d' if /a/b/c is a symlink to /a/b/f. The final component of
    +        'path' is not resolved, because we typically want to inspect the symlink found
    +        at that path, not its target.
    +    """
    +    components = path.split(os.path.sep)
    +    location = root
    +    for i in range(0, len(components) - 1):
    +        location = os.path.join(location, components[i])
    +        if os.path.islink(location):
    +            # Resolve the link, add on all the remaining components
    +            target = os.path.join(os.readlink(location))
    +            tail = os.path.sep.join(components[i + 1:])
    +
    +            if target.startswith(os.path.sep):
    +                # Absolute link - relative to root
    +                location = os.path.join(root, target, tail)
    +            else:
    +                # Relative link - relative to symlink location
    +                location = os.path.join(location, target)
    +            return resolve_symlinks(location, root)
    +    # If we got here, no symlinks were found. Add on the final component and return.
    +    location = os.path.join(location, components[-1])
    +    return location
    
    171
    +
    
    172
    +
    
    173
    +def directory_not_empty(path):
    
    174
    +    return os.listdir(path)
    
    175
    +
    
    176
    +
    
    177
    +def _import_test(tmpdir, original, overlay, generator_function, verify_contents=False):
    
    178
    +    fake_context = FakeContext()
    
    179
    +    fake_context.artifactcache.cas = CASCache(tmpdir)
    
    180
    +    # Create some fake content
    
    181
    +    generator_function(original, tmpdir)
    
    182
    +    if original != overlay:
    
    183
    +        generator_function(overlay, tmpdir)
    
    184
    +
    
    185
    +    d = create_new_casdir(original, fake_context, tmpdir)
    
    186
    +
    
    187
    +    duplicate_cas = create_new_casdir(original, fake_context, tmpdir)
    
    188
    +
    
    189
    +    assert duplicate_cas.ref.hash == d.ref.hash
    
    190
    +
    
    191
    +    d2 = create_new_casdir(overlay, fake_context, tmpdir)
    
    192
    +    d.import_files(d2)
    
    193
    +    export_dir = os.path.join(tmpdir, "output-{}-{}".format(original, overlay))
    
    194
    +    roundtrip_dir = os.path.join(tmpdir, "roundtrip-{}-{}".format(original, overlay))
    
    195
    +    d2.export_files(roundtrip_dir)
    
    196
    +    d.export_files(export_dir)
    
    197
    +
    
    198
    +    if verify_contents:
    
    199
    +        for item in root_filesets[overlay - 1]:
    
    200
    +            (path, typename, content) = item
    
    201
    +            realpath = resolve_symlinks(path, export_dir)
    
    202
    +            if typename == 'F':
    
    203
    +                if os.path.isdir(realpath) and directory_not_empty(realpath):
    
    204
    +                    # The file should not have overwritten the directory in this case.
    
    205
    +                    pass
    
    206
    +                else:
    
    207
    +                    assert os.path.isfile(realpath), "{} did not exist in the combined virtual directory".format(path)
    
    208
    +                    assert file_contents_are(realpath, content)
    
    209
    +            elif typename == 'S':
    
    210
    +                if os.path.isdir(realpath) and directory_not_empty(realpath):
    
    211
    +                    # The symlink should not have overwritten the directory in this case.
    
    212
    +                    pass
    
    213
    +                else:
    
    214
    +                    assert os.path.islink(realpath)
    
    215
    +                    assert os.readlink(realpath) == content
    
    216
    +            elif typename == 'D':
    
    217
    +                # We can't do any more tests than this because it
    
    218
    +                # depends on things present in the original. Blank
    
    219
    +                # directories here will be ignored and the original
    
    220
    +                # left in place.
    
    221
    +                assert os.path.lexists(realpath)
    
    222
    +
    
    223
    +    # Now do the same thing with filebaseddirectories and check the contents match
    
    224
    +
    
    225
    +    files = list(utils.list_relative_paths(roundtrip_dir))
    
    226
    +    duplicate_cas._import_files_from_directory(roundtrip_dir, files=files)
    
    227
    +    duplicate_cas._recalculate_recursing_down()
    
    228
    +    if duplicate_cas.parent:
    
    229
    +        duplicate_cas.parent._recalculate_recursing_up(duplicate_cas)
    
    230
    +
    
    231
    +    assert duplicate_cas.ref.hash == d.ref.hash
    
    232
    +
    
    233
    +
    
    234
    +# It's possible to parameterize on both original and overlay values,
    
    235
    +# but this leads to more tests being listed in the output than are
    
    236
    +# comfortable.
    
    237
    +@pytest.mark.parametrize("original", range(1, len(root_filesets) + 1))
    
    238
    +def test_fixed_cas_import(cli, tmpdir, original):
    
    239
    +    for overlay in range(1, len(root_filesets) + 1):
    
    240
    +        _import_test(str(tmpdir), original, overlay, generate_import_roots, verify_contents=True)
    
    241
    +
    
    242
    +
    
    243
    +@pytest.mark.parametrize("original", range(1, NUM_RANDOM_TESTS + 1))
    
    244
    +def test_random_cas_import(cli, tmpdir, original):
    
    245
    +    for overlay in range(1, NUM_RANDOM_TESTS + 1):
    
    246
    +        _import_test(str(tmpdir), original, overlay, generate_random_root, verify_contents=False)
    
    247
    +
    
    248
    +
    
    249
    +def _listing_test(tmpdir, root, generator_function):
    
    250
    +    fake_context = FakeContext()
    
    251
    +    fake_context.artifactcache.cas = CASCache(tmpdir)
    
    252
    +    # Create some fake content
    
    253
    +    generator_function(root, tmpdir)
    
    254
    +
    
    255
    +    d = create_new_filedir(root, tmpdir)
    
    256
    +    filelist = list(d.list_relative_paths())
    
    257
    +
    
    258
    +    d2 = create_new_casdir(root, fake_context, tmpdir)
    
    259
    +    filelist2 = list(d2.list_relative_paths())
    
    260
    +
    
    261
    +    assert filelist == filelist2
    
    262
    +
    
    263
    +
    
    264
    +@pytest.mark.parametrize("root", range(1, 11))
    
    265
    +def test_random_directory_listing(cli, tmpdir, root):
    
    266
    +    _listing_test(str(tmpdir), root, generate_random_root)
    
    267
    +
    
    268
    +
    
    269
    +@pytest.mark.parametrize("root", [1, 2, 3, 4, 5])
    
    270
    +def test_fixed_directory_listing(cli, tmpdir, root):
    
    271
    +    _listing_test(str(tmpdir), root, generate_import_roots)
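    The resolve_symlinks() helper in the diff above resolves symlinks in every path component except the last, so a test can inspect a symlink itself rather than its target. A standalone sketch of that idea follows; note this is a simplified reimplementation, not the committed code: here 'path' is taken relative to 'root', and absolute link targets are reinterpreted relative to 'root', as when inspecting an exported sandbox tree.

    ```python
    import os
    import tempfile


    def resolve_symlinks(path, root):
        # Resolve symlinks in every component of 'path' except the last one,
        # so the caller can inspect a symlink itself rather than its target.
        components = [c for c in path.split(os.path.sep) if c]
        location = root
        for i, component in enumerate(components[:-1]):
            location = os.path.join(location, component)
            if os.path.islink(location):
                target = os.readlink(location)
                tail = os.path.sep.join(components[i + 1:])
                if target.startswith(os.path.sep):
                    # Absolute link: reinterpret the target relative to 'root'
                    return resolve_symlinks(os.path.join(target, tail), root)
                # Relative link: relative to the symlink's parent directory
                return resolve_symlinks(
                    os.path.sep.join(components[:i] + [target, tail]), root)
        return os.path.join(location, components[-1]) if components else root


    with tempfile.TemporaryDirectory() as root:
        os.makedirs(os.path.join(root, "b", "f"))
        os.symlink("f", os.path.join(root, "b", "c"))       # relative link b/c -> b/f
        os.symlink("/b/f", os.path.join(root, "b", "abs"))  # "absolute" link, relative to root
        assert resolve_symlinks("b/c/d", root) == os.path.join(root, "b", "f", "d")
        assert resolve_symlinks("b/abs/x", root) == os.path.join(root, "b", "f", "x")
    ```

    Like the committed helper, this sketch does not guard against symlink loops, which would recurse forever; the test filesets it is meant for do not contain cycles.
    
    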

  • tests/testutils/site.py
    ... ... @@ -4,7 +4,7 @@
     import os
     import sys
     
    -from buildstream import utils, ProgramNotFoundError
    +from buildstream import _site, utils, ProgramNotFoundError
     
     try:
         utils.get_host_tool('bzr')

    ... ... @@ -33,8 +33,10 @@ except (ImportError, ValueError):
     try:
         utils.get_host_tool('bwrap')
         HAVE_BWRAP = True
    +    HAVE_BWRAP_JSON_STATUS = _site.get_bwrap_version() >= (0, 3, 2)
     except ProgramNotFoundError:
         HAVE_BWRAP = False
    +    HAVE_BWRAP_JSON_STATUS = False
     
     try:
         utils.get_host_tool('lzip')

    ... ... @@ -49,3 +51,5 @@ except ImportError:
         HAVE_ARPY = False
     
     IS_LINUX = os.getenv('BST_FORCE_BACKEND', sys.platform).startswith('linux')
    +
    +_, _, _, _, MACHINE_ARCH = os.uname()
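    The site.py hunk above gates HAVE_BWRAP_JSON_STATUS on `_site.get_bwrap_version() >= (0, 3, 2)`. Python compares tuples element by element, which makes this a proper numeric version check rather than a string comparison. A small illustration follows; the parse_version() helper is hypothetical, shown only to demonstrate the comparison, and is not how BuildStream's _site.get_bwrap_version() is implemented.

    ```python
    def parse_version(version_string):
        # Hypothetical helper: "0.3.1" -> (0, 3, 1)
        return tuple(int(part) for part in version_string.split("."))


    assert parse_version("0.3.1") < (0, 3, 2)   # too old for the JSON status feature
    assert parse_version("0.3.2") >= (0, 3, 2)
    assert parse_version("0.10.0") > (0, 3, 2)  # numeric ordering; "0.10" > "0.3" as ints
    ```

    Comparing tuples avoids the classic bug where string comparison orders "0.10.0" before "0.3.2".
    
    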

  • tests/utils/misc.py
    ... ... @@ -27,4 +27,5 @@ def test_parse_size_over_1024T(cli, tmpdir):
         patched_statvfs = mock_os.mock_statvfs(f_bavail=bavail, f_bsize=BLOCK_SIZE)
         with mock_os.monkey_patch("statvfs", patched_statvfs):
             result = cli.run(project, args=["build", "file.bst"])
    -        assert "1025T of available system storage" in result.stderr
    +        failure_msg = 'Your system does not have enough available space to support the cache quota specified.'
    +        assert failure_msg in result.stderr


