[Notes] [Git][BuildStream/buildstream][jennis/migrate_pull_push_commands] 28 commits: _pipeline.py: Fix the planner to retain the original order when depth sorting



James Ennis pushed to branch jennis/migrate_pull_push_commands at BuildStream / buildstream

Commits:

21 changed files:

Changes:

  • .gitlab-ci.yml
    @@ -60,8 +60,18 @@ tests-ubuntu-18.04:
       image: buildstream/testsuite-ubuntu:18.04-5da27168-32c47d1c
       <<: *tests
     
    +tests-python-3.7-stretch:
    +  image: buildstream/testsuite-python:3.7-stretch-a60f0c39
    +  <<: *tests
    +
    +  variables:
    +    # Note that we explicitly specify TOXENV in this case because this
    +    # image has both 3.6 and 3.7 versions. python3.6 cannot be removed because
    +    # some of our base dependencies declare it as their runtime dependency.
    +    TOXENV: py37
    +
     overnight-fedora-28-aarch64:
    -  image: buildstream/testsuite-fedora:aarch64-28-06bab030-32a101f6
    +  image: buildstream/testsuite-fedora:aarch64-28-5da27168-32c47d1c
       tags:
         - aarch64
       <<: *tests
    
    @@ -70,6 +80,12 @@ overnight-fedora-28-aarch64:
       except: []
       only:
       - schedules
    +  before_script:
    +    # grpcio needs to be compiled from source on aarch64 so we additionally
    +    # need a C++ compiler here.
    +    # FIXME: Ideally this would be provided by the base image. This will be
    +    # unblocked by https://gitlab.com/BuildStream/buildstream-docker-images/issues/34
    +    - dnf install -y gcc-c++
     
     tests-unix:
       # Use fedora here, to a) run a test on fedora and b) ensure that we
    
    @@ -90,7 +106,6 @@ tests-unix:
         # Since the unix platform is required to run as root, no user change required
         - ${TEST_COMMAND}
     
    -
     tests-fedora-missing-deps:
       # Ensure that tests behave nicely while missing bwrap and ostree
       image: buildstream/testsuite-fedora:28-5da27168-32c47d1c
    
    @@ -108,6 +123,22 @@ tests-fedora-missing-deps:
     
         - ${TEST_COMMAND}
     
    +tests-fedora-update-deps:
    +  # Check if the tests pass after updating requirements to their latest
    +  # allowed version.
    +  allow_failure: true
    +  image: buildstream/testsuite-fedora:28-5da27168-32c47d1c
    +  <<: *tests
    +
    +  script:
    +    - useradd -Um buildstream
    +    - chown -R buildstream:buildstream .
    +
    +    - make --always-make --directory requirements
    +    - cat requirements/*.txt
    +
    +    - su buildstream -c "${TEST_COMMAND}"
    +
     # Lint separately from testing
     lint:
       stage: test
    
    @@ -140,8 +171,8 @@ docs:
       stage: test
       variables:
         BST_EXT_URL: git+https://gitlab.com/BuildStream/bst-external.git
    -    BST_EXT_REF: 573843768f4d297f85dc3067465b3c7519a8dcc3 # 0.7.0
    -    FD_SDK_REF: 612f66e218445eee2b1a9d7dd27c9caba571612e # freedesktop-sdk-18.08.19-54-g612f66e2
    +    BST_EXT_REF: 0.9.0-0-g63a19e8068bd777bd9cd59b1a9442f9749ea5a85
    +    FD_SDK_REF: freedesktop-sdk-18.08.25-0-g250939d465d6dd7768a215f1fa59c4a3412fc337
       before_script:
       - |
         mkdir -p "${HOME}/.config"
    

  • CONTRIBUTING.rst
    @@ -1534,6 +1534,10 @@ You can always abort on the first failure by running::
     
       tox -- -x
     
    +Similarly, you may also be interested in the ``--last-failed`` and
    +``--failed-first`` options as per the
    +`pytest cache <https://docs.pytest.org/en/latest/cache.html>`_ documentation.
    +
     If you want to run a specific test or a group of tests, you
     can specify a prefix to match. E.g. if you want to run all of
     the frontend tests you can do::
    
    @@ -1545,6 +1549,12 @@ If you wanted to run the test_build_track test within frontend/buildtrack.py you
     
       tox -- tests/frontend/buildtrack.py::test_build_track
     
    +When running only a few tests, you may find the coverage and timing output
    +excessive; there are options to trim them. Note that the coverage step will fail.
    +Here is an example::
    +
    +  tox -- --no-cov --durations=1 tests/frontend/buildtrack.py::test_build_track
    +
     We also have a set of slow integration tests that are disabled by
     default - you will notice most of them marked with SKIP in the pytest
     output. To run them, you can use::
    

  • buildstream/_frontend/cli.py
    @@ -2,6 +2,7 @@ import os
     import sys
     from contextlib import ExitStack
     from fnmatch import fnmatch
    +from functools import partial
     from tempfile import TemporaryDirectory
     
     import click
    
    @@ -111,14 +112,25 @@ def complete_target(args, incomplete):
         return complete_list
     
     
    -def complete_artifact(args, incomplete):
    +def complete_artifact(orig_args, args, incomplete):
         from .._context import Context
         ctx = Context()
     
         config = None
    -    for i, arg in enumerate(args):
    -        if arg in ('-c', '--config'):
    -            config = args[i + 1]
    +    if orig_args:
    +        for i, arg in enumerate(orig_args):
    +            if arg in ('-c', '--config'):
    +                try:
    +                    config = orig_args[i + 1]
    +                except IndexError:
    +                    pass
    +    if args:
    +        for i, arg in enumerate(args):
    +            if arg in ('-c', '--config'):
    +                try:
    +                    config = args[i + 1]
    +                except IndexError:
    +                    pass
         ctx.load(config)
     
         # element targets are valid artifact names
    
    @@ -128,8 +140,9 @@ def complete_artifact(args, incomplete):
         return complete_list
     
     
    -def override_completions(cmd, cmd_param, args, incomplete):
    +def override_completions(orig_args, cmd, cmd_param, args, incomplete):
         """
    +    :param orig_args: original, non-completion args
         :param cmd_param: command definition
         :param args: full list of args typed before the incomplete arg
         :param incomplete: the incomplete text to autocomplete
    
    @@ -150,7 +163,7 @@ def override_completions(cmd, cmd_param, args, incomplete):
                     cmd_param.opts == ['--track-except']):
                 return complete_target(args, incomplete)
             if cmd_param.name == 'artifacts':
    -            return complete_artifact(args, incomplete)
    +            return complete_artifact(orig_args, args, incomplete)
     
         raise CompleteUnhandled()
     
    
    @@ -161,7 +174,7 @@ def override_main(self, args=None, prog_name=None, complete_var=None,
         # Hook for the Bash completion.  This only activates if the Bash
         # completion is actually enabled, otherwise this is quite a fast
         # noop.
    -    if main_bashcomplete(self, prog_name, override_completions):
    +    if main_bashcomplete(self, prog_name, partial(override_completions, args)):
     
             # If we're running tests we cant just go calling exit()
             # from the main process.
    
    @@ -355,78 +368,6 @@ def build(app, elements, all_, track_, track_save, track_all, track_except, trac
                              build_all=all_)
     
     
    -##################################################################
    -#                           Pull Command                         #
    -##################################################################
    -@cli.command(short_help="Pull a built artifact")
    -@click.option('--deps', '-d', default='none',
    -              type=click.Choice(['none', 'all']),
    -              help='The dependency artifacts to pull (default: none)')
    -@click.option('--remote', '-r',
    -              help="The URL of the remote cache (defaults to the first configured cache)")
    -@click.argument('elements', nargs=-1,
    -                type=click.Path(readable=False))
    -@click.pass_obj
    -def pull(app, elements, deps, remote):
    -    """Pull a built artifact from the configured remote artifact cache.
    -
    -    By default the artifact will be pulled one of the configured caches
    -    if possible, following the usual priority order. If the `--remote` flag
    -    is given, only the specified cache will be queried.
    -
    -    Specify `--deps` to control which artifacts to pull:
    -
    -    \b
    -        none:  No dependencies, just the element itself
    -        all:   All dependencies
    -    """
    -
    -    with app.initialized(session_name="Pull"):
    -        if not elements:
    -            guessed_target = app.context.guess_element()
    -            if guessed_target:
    -                elements = (guessed_target,)
    -
    -        app.stream.pull(elements, selection=deps, remote=remote)
    -
    -
    -##################################################################
    -#                           Push Command                         #
    -##################################################################
    -@cli.command(short_help="Push a built artifact")
    -@click.option('--deps', '-d', default='none',
    -              type=click.Choice(['none', 'all']),
    -              help='The dependencies to push (default: none)')
    -@click.option('--remote', '-r', default=None,
    -              help="The URL of the remote cache (defaults to the first configured cache)")
    -@click.argument('elements', nargs=-1,
    -                type=click.Path(readable=False))
    -@click.pass_obj
    -def push(app, elements, deps, remote):
    -    """Push a built artifact to a remote artifact cache.
    -
    -    The default destination is the highest priority configured cache. You can
    -    override this by passing a different cache URL with the `--remote` flag.
    -
    -    If bst has been configured to include build trees on artifact pulls,
    -    an attempt will be made to pull any required build trees to avoid the
    -    skipping of partial artifacts being pushed.
    -
    -    Specify `--deps` to control which artifacts to push:
    -
    -    \b
    -        none:  No dependencies, just the element itself
    -        all:   All dependencies
    -    """
    -    with app.initialized(session_name="Push"):
    -        if not elements:
    -            guessed_target = app.context.guess_element()
    -            if guessed_target:
    -                elements = (guessed_target,)
    -
    -        app.stream.push(elements, selection=deps, remote=remote)
    -
    -
     ##################################################################
     #                           Show Command                         #
     ##################################################################
    
    @@ -1010,6 +951,78 @@ def artifact():
         """Manipulate cached artifacts"""
     
     
    +################################################################
    +#                     Artifact Pull Command                    #
    +################################################################
    +@artifact.command(name="pull", short_help="Pull a built artifact")
    +@click.option('--deps', '-d', default='none',
    +              type=click.Choice(['none', 'all']),
    +              help='The dependency artifacts to pull (default: none)')
    +@click.option('--remote', '-r',
    +              help="The URL of the remote cache (defaults to the first configured cache)")
    +@click.argument('elements', nargs=-1,
    +                type=click.Path(readable=False))
    +@click.pass_obj
    +def artifact_pull(app, elements, deps, remote):
    +    """Pull a built artifact from the configured remote artifact cache.
    +
    +    By default the artifact will be pulled one of the configured caches
    +    if possible, following the usual priority order. If the `--remote` flag
    +    is given, only the specified cache will be queried.
    +
    +    Specify `--deps` to control which artifacts to pull:
    +
    +    \b
    +        none:  No dependencies, just the element itself
    +        all:   All dependencies
    +    """
    +
    +    with app.initialized(session_name="Pull"):
    +        if not elements:
    +            guessed_target = app.context.guess_element()
    +            if guessed_target:
    +                elements = (guessed_target,)
    +
    +        app.stream.pull(elements, selection=deps, remote=remote)
    +
    +
    +##################################################################
    +#                     Artifact Push Command                      #
    +##################################################################
    +@artifact.command(name="push", short_help="Push a built artifact")
    +@click.option('--deps', '-d', default='none',
    +              type=click.Choice(['none', 'all']),
    +              help='The dependencies to push (default: none)')
    +@click.option('--remote', '-r', default=None,
    +              help="The URL of the remote cache (defaults to the first configured cache)")
    +@click.argument('elements', nargs=-1,
    +                type=click.Path(readable=False))
    +@click.pass_obj
    +def artifact_push(app, elements, deps, remote):
    +    """Push a built artifact to a remote artifact cache.
    +
    +    The default destination is the highest priority configured cache. You can
    +    override this by passing a different cache URL with the `--remote` flag.
    +
    +    If bst has been configured to include build trees on artifact pulls,
    +    an attempt will be made to pull any required build trees to avoid the
    +    skipping of partial artifacts being pushed.
    +
    +    Specify `--deps` to control which artifacts to push:
    +
    +    \b
    +        none:  No dependencies, just the element itself
    +        all:   All dependencies
    +    """
    +    with app.initialized(session_name="Push"):
    +        if not elements:
    +            guessed_target = app.context.guess_element()
    +            if guessed_target:
    +                elements = (guessed_target,)
    +
    +        app.stream.push(elements, selection=deps, remote=remote)
    +
    +
     ################################################################
     #                     Artifact Log Command                     #
     ################################################################
    
    @@ -1116,3 +1129,37 @@ def fetch(app, elements, deps, track_, except_, track_cross_junctions):
     def track(app, elements, deps, except_, cross_junctions):
         click.echo("This command is now obsolete. Use `bst source track` instead.", err=True)
         sys.exit(1)
    +
    +
    +################################################################
    +#                          Pull Command                        #
    +################################################################
    +@cli.command(short_help="Pull a built artifact", hidden=True)
    +@click.option('--deps', '-d', default='none',
    +              type=click.Choice(['none', 'all']),
    +              help='The dependency artifacts to pull (default: none)')
    +@click.option('--remote', '-r',
    +              help="The URL of the remote cache (defaults to the first configured cache)")
    +@click.argument('elements', nargs=-1,
    +                type=click.Path(readable=False))
    +@click.pass_obj
    +def pull(app, elements, deps, remote):
    +    click.echo("This command is now obsolete. Use `bst artifact pull` instead.", err=True)
    +    sys.exit(1)
    +
    +
    +##################################################################
    +#                           Push Command                         #
    +##################################################################
    +@cli.command(short_help="Push a built artifact", hidden=True)
    +@click.option('--deps', '-d', default='none',
    +              type=click.Choice(['none', 'all']),
    +              help='The dependencies to push (default: none)')
    +@click.option('--remote', '-r', default=None,
    +              help="The URL of the remote cache (defaults to the first configured cache)")
    +@click.argument('elements', nargs=-1,
    +                type=click.Path(readable=False))
    +@click.pass_obj
    +def push(app, elements, deps, remote):
    +    click.echo("This command is now obsolete. Use `bst artifact push` instead.", err=True)
    +    sys.exit(1)
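
    The `partial(override_completions, args)` change pre-binds the real invocation's argv so the completion callback, whose signature is fixed by the caller, can still inspect it. A self-contained sketch of the pattern (simplified signatures and names, not click's actual completion API):

    ```python
    from functools import partial

    # The completion driver only knows about (args, incomplete); partial()
    # smuggles the original command-line args in as an extra leading argument.
    def override_completions(orig_args, args, incomplete):
        # orig_args: args from the real invocation; args: completion words so far
        candidates = orig_args + args
        return [word for word in candidates if word.startswith(incomplete)]

    def run_completion(callback, args, incomplete):
        # Stand-in for the completion machinery: it never sees orig_args
        return callback(args, incomplete)

    callback = partial(override_completions, ['--config', 'user.conf'])
    print(run_completion(callback, ['artifact', 'pull'], 'art'))  # ['artifact']
    ```
    
    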

  • buildstream/_gitsourcebase.py
    @@ -296,18 +296,24 @@ class GitMirror(SourceFetcher):
                 shallow = set()
                 for _, commit_ref, _ in self.tags:
     
    -                _, out = self.source.check_output([self.source.host_git, 'rev-list',
    -                                                   '--boundary', '{}..{}'.format(commit_ref, self.ref)],
    -                                                  fail="Failed to get git history {}..{} in directory: {}"
    -                                                  .format(commit_ref, self.ref, fullpath),
    -                                                  fail_temporarily=True,
    -                                                  cwd=self.mirror)
    -                for line in out.splitlines():
    -                    rev = line.lstrip('-')
    -                    if line[0] == '-':
    -                        shallow.add(rev)
    -                    else:
    -                        included.add(rev)
    +                if commit_ref == self.ref:
    +                    # rev-list does not work in case of same rev
    +                    shallow.add(self.ref)
    +                else:
    +                    _, out = self.source.check_output([self.source.host_git, 'rev-list',
    +                                                       '--ancestry-path', '--boundary',
    +                                                       '{}..{}'.format(commit_ref, self.ref)],
    +                                                      fail="Failed to get git history {}..{} in directory: {}"
    +                                                      .format(commit_ref, self.ref, fullpath),
    +                                                      fail_temporarily=True,
    +                                                      cwd=self.mirror)
    +                    self.source.warn("refs {}..{}: {}".format(commit_ref, self.ref, out.splitlines()))
    +                    for line in out.splitlines():
    +                        rev = line.lstrip('-')
    +                        if line[0] == '-':
    +                            shallow.add(rev)
    +                        else:
    +                            included.add(rev)
     
                 shallow -= included
                 included |= shallow
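
    The loop above parses `git rev-list --boundary`, which marks excluded boundary commits with a leading `-`. The same parsing on canned output (fake revision ids, no git subprocess):

    ```python
    # Sample of what `git rev-list --boundary <tag>..<ref>` prints: boundary
    # commits (history we exclude, but need as shallow parents) carry a '-'.
    out = "\n".join([
        "f00dfeed1111",
        "cafebabe2222",
        "-deadbeef3333",
    ])

    shallow, included = set(), set()
    for line in out.splitlines():
        rev = line.lstrip('-')
        if line[0] == '-':
            shallow.add(rev)
        else:
            included.add(rev)

    # A rev that is both boundary and included only counts as included
    shallow -= included
    included |= shallow
    print(sorted(shallow), sorted(included))
    ```
    
    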
    

  • buildstream/_pipeline.py
    @@ -22,6 +22,7 @@
     import os
     import itertools
     from operator import itemgetter
    +from collections import OrderedDict
     
     from ._exceptions import PipelineError
     from ._message import Message, MessageType
    
    @@ -479,7 +480,7 @@ class Pipeline():
     #
     class _Planner():
         def __init__(self):
    -        self.depth_map = {}
    +        self.depth_map = OrderedDict()
             self.visiting_elements = set()
     
         # Here we want to traverse the same element more than once when
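
    The commit named in the subject line ("retain the original order when depth sorting") works because Python's stable sort keeps equal keys in insertion order, provided the mapping itself preserves insertion order. A minimal sketch (hypothetical element names, not BuildStream's real planner):

    ```python
    from collections import OrderedDict
    from operator import itemgetter

    # Hypothetical depth map: element name -> dependency depth, recorded in
    # the order the planner visited the elements.
    depth_map = OrderedDict()
    for name, depth in [("base.bst", 2), ("lib-a.bst", 1),
                        ("lib-b.bst", 1), ("app.bst", 0)]:
        depth_map[name] = depth

    # sorted() is stable, so lib-a.bst and lib-b.bst (equal depth) keep their
    # visiting order - but only because OrderedDict preserved it; a plain dict
    # on older Pythons gave an arbitrary tie order.
    plan = [name for name, _ in sorted(depth_map.items(),
                                       key=itemgetter(1), reverse=True)]
    print(plan)  # ['base.bst', 'lib-a.bst', 'lib-b.bst', 'app.bst']
    ```
    
    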
    

  • buildstream/_scheduler/queues/queue.py
    @@ -170,9 +170,9 @@ class Queue():
             skip = [job for job in jobs if self.status(job.element) == QueueStatus.SKIP]
             wait = [job for job in jobs if job not in skip]
     
    +        self.skipped_elements.extend([job.element for job in skip])
             self._wait_queue.extend(wait)
             self._done_queue.extend(skip)
    -        self.skipped_elements.extend(skip)
     
         # dequeue()
         #
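
    Besides moving the line, the diff above changes what is stored: `skipped_elements` now collects `job.element` rather than `Job` objects. A toy illustration (stand-in classes, not the real scheduler):

    ```python
    # Stand-ins for the scheduler's Job/Element relationship
    class Element:
        def __init__(self, name):
            self.name = name

    class Job:
        def __init__(self, element):
            self.element = element

    jobs = [Job(Element('a.bst')), Job(Element('b.bst'))]
    skip = jobs  # pretend every job was skipped

    # Consumers of skipped_elements expect Element objects, so extract
    # job.element instead of appending the Job wrappers themselves.
    skipped_elements = []
    skipped_elements.extend(job.element for job in skip)

    print([e.name for e in skipped_elements])  # ['a.bst', 'b.bst']
    ```
    
    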
    

  • conftest.py
    @@ -32,7 +32,7 @@ def pytest_addoption(parser):
     
     
     def pytest_runtest_setup(item):
    -    if item.get_marker('integration') and not item.config.getvalue('integration'):
    +    if item.get_closest_marker('integration') and not item.config.getvalue('integration'):
             pytest.skip('skipping integration test')
     
     
     
    

  • requirements/dev-requirements.txt
    @@ -2,7 +2,7 @@ coverage==4.4
     pylint==2.2.2
     pycodestyle==2.4.0
     pytest==4.0.2
    -pytest-cov==2.6.0
    +pytest-cov==2.6.1
     pytest-datafiles==2.0
     pytest-env==0.6.2
     pytest-xdist==1.25.0
    

  • tests/artifactcache/config.py
    @@ -138,5 +138,5 @@ def test_missing_certs(cli, datafiles, config_key, config_value):
         # Use `pull` here to ensure we try to initialize the remotes, triggering the error
         #
         # This does not happen for a simple `bst show`.
    -    result = cli.run(project=project, args=['pull', 'element.bst'])
    +    result = cli.run(project=project, args=['artifact', 'pull', 'element.bst'])
         result.assert_main_error(ErrorDomain.LOAD, LoadErrorReason.INVALID_DATA)

  • tests/artifactcache/junctions.py
    @@ -58,7 +58,7 @@ def test_push_pull(cli, tmpdir, datafiles):
             project_set_artifacts(base_project, base_share.repo)
     
             # Now try bst push
    -        result = cli.run(project=project, args=['push', '--deps', 'all', 'target.bst'])
    +        result = cli.run(project=project, args=['artifact', 'push', '--deps', 'all', 'target.bst'])
             assert result.exit_code == 0
     
             # And finally assert that the artifacts are in the right shares
    
    @@ -78,7 +78,7 @@ def test_push_pull(cli, tmpdir, datafiles):
             assert state != 'cached'
     
             # Now try bst pull
    -        result = cli.run(project=project, args=['pull', '--deps', 'all', 'target.bst'])
    +        result = cli.run(project=project, args=['artifact', 'pull', '--deps', 'all', 'target.bst'])
             assert result.exit_code == 0
     
             # And assert that they are again in the local cache, without having built

  • tests/completions/completions.py
    ... ... @@ -11,8 +11,6 @@ MAIN_COMMANDS = [
    11 11
         'checkout ',
    
    12 12
         'help ',
    
    13 13
         'init ',
    
    14
    -    'pull ',
    
    15
    -    'push ',
    
    16 14
         'shell ',
    
    17 15
         'show ',
    
    18 16
         'source ',
    
    ... ... @@ -54,6 +52,12 @@ SOURCE_COMMANDS = [
    54 52
         'track ',
    
    55 53
     ]
    
    56 54
     
    
    55
    +ARTIFACT_COMMANDS = [
    
    56
    +    'push ',
    
    57
    +    'pull ',
    
    58
    +    'log ',
    
    59
    +]
    
    60
    +
    
    57 61
     WORKSPACE_COMMANDS = [
    
    58 62
         'close ',
    
    59 63
         'list ',
    
    ... ... @@ -117,8 +121,7 @@ def assert_completion_failed(cli, cmd, word_idx, expected, cwd=None):
    117 121
     @pytest.mark.parametrize("cmd,word_idx,expected", [
    
    118 122
         ('bst', 0, []),
    
    119 123
         ('bst ', 1, MAIN_COMMANDS),
    
    120
    -    ('bst pu', 1, ['pull ', 'push ']),
    
    121
    -    ('bst pul', 1, ['pull ']),
    
    124
    +    ('bst artifact ', 2, ARTIFACT_COMMANDS),
    
    122 125
         ('bst source ', 2, SOURCE_COMMANDS),
    
    123 126
         ('bst w ', 1, ['workspace ']),
    
    124 127
         ('bst workspace ', 2, WORKSPACE_COMMANDS),
    
    ... ... @@ -272,12 +275,52 @@ def test_argument_element_invalid(datafiles, cli, project, cmd, word_idx, expect
    272 275
     @pytest.mark.parametrize("cmd,word_idx,expected", [
    
    273 276
         ('bst he', 1, ['help ']),
    
    274 277
         ('bst help ', 2, MAIN_COMMANDS),
    
    278
    +    ('bst help artifact ', 3, ARTIFACT_COMMANDS),
    
    275 279
         ('bst help in', 2, ['init ']),
    
    276
    -    ('bst help p', 2, ['pull ', 'push ']),
    
    277
    -    ('bst help p', 2, ['pull ', 'push ']),
    
    278 280
         ('bst help source ', 3, SOURCE_COMMANDS),
    
    279 281
         ('bst help w', 2, ['workspace ']),
    
    280 282
         ('bst help workspace ', 3, WORKSPACE_COMMANDS),
    
    281 283
     ])
    
    282 284
     def test_help_commands(cli, cmd, word_idx, expected):
    
    283 285
         assert_completion(cli, cmd, word_idx, expected)
    
    286
    +
    
    287
    +
    
    288
    +@pytest.mark.datafiles(os.path.join(DATA_DIR, 'project'))
    
    289
    +def test_argument_artifact(cli, tmpdir, datafiles):
    
    290
    +    project = os.path.join(datafiles.dirname, datafiles.basename)
    
    291
    +
    
    292
    +    # Build an import element with no dependencies (as there will only be ONE cache key)
    
    293
    +    result = cli.run(project=project, args=['build', 'import-bin.bst'])  # Has no dependencies
    
    294
    +    result.assert_success()
    
    295
    +
    
    296
    +    # Get the key and the artifact ref ($project/$element_name/$key)
    
    297
    +    key = cli.get_element_key(project, 'import-bin.bst')
    
    298
    +    artifact = os.path.join('test', 'import-bin', key)
    
    299
    +
    
    300
    +    # Test autocompletion of the artifact
    
    301
    +    cmds = [
    
    302
    +        'bst artifact log ',
    
    303
    +        'bst artifact log t',
    
    304
    +        'bst artifact log test/'
    
    305
    +    ]
    
    306
    +
    
    307
    +    for i, cmd in enumerate(cmds):
    
    308
    +        word_idx = 3
    
    309
    +        result = cli.run(project=project, cwd=project, env={
    
    310
    +            '_BST_COMPLETION': 'complete',
    
    311
    +            'COMP_WORDS': cmd,
    
    312
    +            'COMP_CWORD': str(word_idx)
    
    313
    +        })
    
    314
    +        words = []
    
    315
    +        if result.output:
    
    316
    +            words = result.output.splitlines()  # This leaves an extra space on each e.g. ['foo.bst ']
    
    317
    +            words = [word.strip() for word in words]
    
    318
    +
    
    319
    +            if i == 0:
    
    320
    +                expected = PROJECT_ELEMENTS + [artifact]  # We should now be able to see the artifact
    
    321
    +            elif i == 1:
    
    322
    +                expected = ['target.bst', artifact]
    
    323
    +            elif i == 2:
    
    324
    +                expected = [artifact]
    
    325
    +
    
    326
    +            assert expected == words
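The `_BST_COMPLETION`/`COMP_WORDS`/`COMP_CWORD` environment variables used by this test follow bash's programmable-completion protocol: the partial command line plus the index of the word under the cursor. A minimal sketch of how candidates get filtered (a hypothetical helper for illustration, not BuildStream's implementation):

```python
# Hypothetical helper mirroring the protocol the test drives: comp_words is
# the partial command line, comp_cword the index of the word being completed.
def complete(comp_words, comp_cword, candidates):
    words = comp_words.split()
    # If the cursor sits past the last typed word, complete from an empty prefix
    prefix = words[comp_cword] if comp_cword < len(words) else ''
    return [c for c in candidates if c.startswith(prefix)]

print(complete('bst help p', 2, ['pull ', 'push ', 'init ']))
# ['pull ', 'push ']
```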

  • tests/frontend/help.py
    ... ... @@ -18,10 +18,9 @@ def test_help_main(cli):
    18 18
     
    
    19 19
     
    
    20 20
     @pytest.mark.parametrize("command", [
    
    21
    +    ('artifact'),
    
    21 22
         ('build'),
    
    22 23
         ('checkout'),
    
    23
    -    ('pull'),
    
    24
    -    ('push'),
    
    25 24
         ('shell'),
    
    26 25
         ('show'),
    
    27 26
         ('source'),
    

  • tests/frontend/order.py
    1
    +import os
    
    2
    +
    
    3
    +import pytest
    
    4
    +from tests.testutils import cli, create_repo
    
    5
    +
    
    6
    +from buildstream import _yaml
    
    7
    +
    
    8
    +# Project directory
    
    9
    +DATA_DIR = os.path.join(
    
    10
    +    os.path.dirname(os.path.realpath(__file__)),
    
    11
    +    "project",
    
    12
    +)
    
    13
    +
    
    14
    +
    
    15
    +def create_element(repo, name, path, dependencies, ref=None):
    
    16
    +    element = {
    
    17
    +        'kind': 'import',
    
    18
    +        'sources': [
    
    19
    +            repo.source_config(ref=ref)
    
    20
    +        ],
    
    21
    +        'depends': dependencies
    
    22
    +    }
    
    23
    +    _yaml.dump(element, os.path.join(path, name))
    
    24
    +
    
    25
    +
    
    26
    +# This tests a variety of scenarios and checks that the order in
    
    27
    +# which things are processed remains stable.
    
    28
    +#
    
    29
    +# This is especially important in order to ensure that our
    
    30
    +# depth sorting and optimization of which elements should be
    
    31
    +# processed first is doing its job right, and that we are
    
    32
    +# promoting elements to the build queue as soon as possible
    
    33
    +#
    
    34
    +# Parameters:
    
    35
    +#    targets (target elements): The targets to invoke bst with
    
    36
    +#    template (dict): The project template dictionary, for create_element()
    
    37
    +#    expected (list): A list of element names in the expected order
    
    38
    +#
    
    39
    +@pytest.mark.datafiles(os.path.join(DATA_DIR))
    
    40
    +@pytest.mark.parametrize("target,template,expected", [
    
    41
    +    # First simple test
    
    42
    +    ('3.bst', {
    
    43
    +        '0.bst': ['1.bst'],
    
    44
    +        '1.bst': [],
    
    45
    +        '2.bst': ['0.bst'],
    
    46
    +        '3.bst': ['0.bst', '1.bst', '2.bst']
    
    47
    +    }, ['1.bst', '0.bst', '2.bst', '3.bst']),
    
    48
    +
    
    49
    +    # A more complicated test with build of build dependencies
    
    50
    +    ('target.bst', {
    
    51
    +        'a.bst': [],
    
    52
    +        'base.bst': [],
    
    53
    +        'timezones.bst': [],
    
    54
    +        'middleware.bst': [{'filename': 'base.bst', 'type': 'build'}],
    
    55
    +        'app.bst': [{'filename': 'middleware.bst', 'type': 'build'}],
    
    56
    +        'target.bst': ['a.bst', 'base.bst', 'middleware.bst', 'app.bst', 'timezones.bst']
    
    57
    +    }, ['base.bst', 'middleware.bst', 'a.bst', 'app.bst', 'timezones.bst', 'target.bst']),
    
    58
    +])
    
    59
    +@pytest.mark.parametrize("operation", [('show'), ('fetch'), ('build')])
    
    60
    +def test_order(cli, datafiles, tmpdir, operation, target, template, expected):
    
    61
    +    project = os.path.join(datafiles.dirname, datafiles.basename)
    
    62
    +    dev_files_path = os.path.join(project, 'files', 'dev-files')
    
    63
    +    element_path = os.path.join(project, 'elements')
    
    64
    +
    
    65
    +    # FIXME: Remove this when the test passes reliably.
    
    66
    +    #
    
    67
    +    #        There is no reason why the order should not
    
    68
    +    #        be preserved when builders is set to 1,
    
    69
    +    #        yet the scheduler queue processing still seems to
    
    70
    +    #        be losing the order.
    
    71
    +    #
    
    72
    +    if operation == 'build':
    
    73
    +        pytest.skip("FIXME: This still only sometimes passes")
    
    74
    +
    
    75
    +    # Configure to only allow one fetcher at a time, making it easy to
    
    76
    +    # determine what is being planned in what order.
    
    77
    +    cli.configure({
    
    78
    +        'scheduler': {
    
    79
    +            'fetchers': 1,
    
    80
    +            'builders': 1
    
    81
    +        }
    
    82
    +    })
    
    83
    +
    
    84
    +    # Build the project from the template, make import elements
    
    85
    +    # all with the same repo
    
    86
    +    #
    
    87
    +    repo = create_repo('git', str(tmpdir))
    
    88
    +    ref = repo.create(dev_files_path)
    
    89
    +    for element, dependencies in template.items():
    
    90
    +        create_element(repo, element, element_path, dependencies, ref=ref)
    
    91
    +        repo.add_commit()
    
    92
    +
    
    93
    +    # Run test and collect results
    
    94
    +    if operation == 'show':
    
    95
    +        result = cli.run(args=['show', '--deps', 'plan', '--format', '%{name}', target], project=project, silent=True)
    
    96
    +        result.assert_success()
    
    97
    +        results = result.output.splitlines()
    
    98
    +    else:
    
    99
    +        if operation == 'fetch':
    
    100
    +            result = cli.run(args=['source', 'fetch', target], project=project, silent=True)
    
    101
    +        else:
    
    102
    +            result = cli.run(args=[operation, target], project=project, silent=True)
    
    103
    +        result.assert_success()
    
    104
    +        results = result.get_start_order(operation)
    
    105
    +
    
    106
    +    # Assert the order
    
    107
    +    print("Expected order: {}".format(expected))
    
    108
    +    print("Observed result order: {}".format(results))
    
    109
    +    assert results == expected
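The branch's commit "Fix the planner to retain the original order when depth sorting" points at the invariant this test pins down: elements at equal dependency depth must keep their declaration order. A rough sketch of that idea (not the actual `_pipeline.py` code) using Python's stable sort:

```python
# Sketch of stable depth sorting: compute each element's dependency depth,
# then rely on sorted()'s stability so equal-depth elements keep the order
# in which they were declared. Illustrative only, not BuildStream's planner.
def plan(template):
    depths = {}

    def depth(name):
        if name not in depths:
            # Dependencies may be plain names or {'filename': ..., 'type': ...}
            deps = [d if isinstance(d, str) else d['filename'] for d in template[name]]
            depths[name] = 1 + max((depth(d) for d in deps), default=-1)
        return depths[name]

    for name in template:
        depth(name)
    return sorted(template, key=depths.get)  # stable: ties keep insertion order

template = {
    '0.bst': ['1.bst'],
    '1.bst': [],
    '2.bst': ['0.bst'],
    '3.bst': ['0.bst', '1.bst', '2.bst'],
}
print(plan(template))
# ['1.bst', '0.bst', '2.bst', '3.bst']
```

This reproduces the first parametrized case above; the second case additionally depends on how build-only dependencies are planned, which this toy sketch does not model.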

  • tests/frontend/pull.py
    ... ... @@ -70,7 +70,7 @@ def test_push_pull_all(cli, tmpdir, datafiles):
    70 70
                 assert cli.get_element_state(project, element_name) != 'cached'
    
    71 71
     
    
    72 72
             # Now try bst pull
    
    73
    -        result = cli.run(project=project, args=['pull', '--deps', 'all', 'target.bst'])
    
    73
    +        result = cli.run(project=project, args=['artifact', 'pull', '--deps', 'all', 'target.bst'])
    
    74 74
             result.assert_success()
    
    75 75
     
    
    76 76
             # And assert that it's again in the local cache, without having built
    
    ... ... @@ -111,7 +111,7 @@ def test_pull_secondary_cache(cli, tmpdir, datafiles):
    111 111
             assert cli.get_element_state(project, 'target.bst') != 'cached'
    
    112 112
     
    
    113 113
             # Now try bst pull
    
    114
    -        result = cli.run(project=project, args=['pull', 'target.bst'])
    
    114
    +        result = cli.run(project=project, args=['artifact', 'pull', 'target.bst'])
    
    115 115
             result.assert_success()
    
    116 116
     
    
    117 117
             # And assert that it's again in the local cache, without having built,
    
    ... ... @@ -146,7 +146,7 @@ def test_push_pull_specific_remote(cli, tmpdir, datafiles):
    146 146
     
    
    147 147
             # Now try `bst push` to the good_share.
    
    148 148
             result = cli.run(project=project, args=[
    
    149
    -            'push', 'target.bst', '--remote', good_share.repo
    
    149
    +            'artifact', 'push', 'target.bst', '--remote', good_share.repo
    
    150 150
             ])
    
    151 151
             result.assert_success()
    
    152 152
     
    
    ... ... @@ -161,7 +161,7 @@ def test_push_pull_specific_remote(cli, tmpdir, datafiles):
    161 161
             artifacts = os.path.join(cli.directory, 'artifacts')
    
    162 162
             shutil.rmtree(artifacts)
    
    163 163
     
    
    164
    -        result = cli.run(project=project, args=['pull', 'target.bst', '--remote',
    
    164
    +        result = cli.run(project=project, args=['artifact', 'pull', 'target.bst', '--remote',
    
    165 165
                                                     good_share.repo])
    
    166 166
             result.assert_success()
    
    167 167
     
    
    ... ... @@ -216,7 +216,7 @@ def test_push_pull_non_strict(cli, tmpdir, datafiles):
    216 216
             assert cli.get_element_state(project, 'target.bst') == 'waiting'
    
    217 217
     
    
    218 218
             # Now try bst pull
    
    219
    -        result = cli.run(project=project, args=['pull', '--deps', 'all', 'target.bst'])
    
    219
    +        result = cli.run(project=project, args=['artifact', 'pull', '--deps', 'all', 'target.bst'])
    
    220 220
             result.assert_success()
    
    221 221
     
    
    222 222
             # And assert that the target is again in the local cache, without having built
    
    ... ... @@ -291,7 +291,7 @@ def test_push_pull_cross_junction(cli, tmpdir, datafiles):
    291 291
             assert cli.get_element_state(project, 'junction.bst:import-etc.bst') == 'buildable'
    
    292 292
     
    
    293 293
             # Now try bst pull
    
    294
    -        result = cli.run(project=project, args=['pull', 'junction.bst:import-etc.bst'])
    
    294
    +        result = cli.run(project=project, args=['artifact', 'pull', 'junction.bst:import-etc.bst'])
    
    295 295
             result.assert_success()
    
    296 296
     
    
    297 297
             # And assert that it's again in the local cache, without having built
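The migration in this file moves `pull` and `push` under an `artifact` command group. The command shape can be sketched with stdlib argparse (BuildStream's real CLI is built with click; this is only an illustration of the nested layout):

```python
import argparse

# Hypothetical mini-CLI mirroring the new `bst artifact pull/push` shape.
parser = argparse.ArgumentParser(prog='bst')
top = parser.add_subparsers(dest='command')
artifact = top.add_parser('artifact').add_subparsers(dest='subcommand')
for name in ('pull', 'push'):
    sub = artifact.add_parser(name)
    sub.add_argument('element')
    sub.add_argument('--deps', choices=('none', 'all'), default='none')

args = parser.parse_args(['artifact', 'pull', '--deps', 'all', 'target.bst'])
print(args.command, args.subcommand, args.element, args.deps)
# artifact pull target.bst all
```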
    

  • tests/frontend/push.py
    ... ... @@ -82,7 +82,7 @@ def test_push(cli, tmpdir, datafiles):
    82 82
             with create_artifact_share(os.path.join(str(tmpdir), 'artifactshare2')) as share2:
    
    83 83
     
    
    84 84
                 # Try pushing with no remotes configured. This should fail.
    
    85
    -            result = cli.run(project=project, args=['push', 'target.bst'])
    
    85
    +            result = cli.run(project=project, args=['artifact', 'push', 'target.bst'])
    
    86 86
                 result.assert_main_error(ErrorDomain.STREAM, None)
    
    87 87
     
    
    88 88
                 # Configure bst to pull but not push from a cache and run `bst push`.
    
    ... ... @@ -90,7 +90,7 @@ def test_push(cli, tmpdir, datafiles):
    90 90
                 cli.configure({
    
    91 91
                     'artifacts': {'url': share1.repo, 'push': False},
    
    92 92
                 })
    
    93
    -            result = cli.run(project=project, args=['push', 'target.bst'])
    
    93
    +            result = cli.run(project=project, args=['artifact', 'push', 'target.bst'])
    
    94 94
                 result.assert_main_error(ErrorDomain.STREAM, None)
    
    95 95
     
    
    96 96
                 # Configure bst to push to one of the caches and run `bst push`. This works.
    
    ... ... @@ -100,7 +100,7 @@ def test_push(cli, tmpdir, datafiles):
    100 100
                         {'url': share2.repo, 'push': True},
    
    101 101
                     ]
    
    102 102
                 })
    
    103
    -            result = cli.run(project=project, args=['push', 'target.bst'])
    
    103
    +            result = cli.run(project=project, args=['artifact', 'push', 'target.bst'])
    
    104 104
     
    
    105 105
                 assert_not_shared(cli, share1, project, 'target.bst')
    
    106 106
                 assert_shared(cli, share2, project, 'target.bst')
    
    ... ... @@ -114,7 +114,7 @@ def test_push(cli, tmpdir, datafiles):
    114 114
                         {'url': share2.repo, 'push': True},
    
    115 115
                     ]
    
    116 116
                 })
    
    117
    -            result = cli.run(project=project, args=['push', 'target.bst'])
    
    117
    +            result = cli.run(project=project, args=['artifact', 'push', 'target.bst'])
    
    118 118
     
    
    119 119
                 assert_shared(cli, share1, project, 'target.bst')
    
    120 120
                 assert_shared(cli, share2, project, 'target.bst')
    
    ... ... @@ -156,7 +156,7 @@ def test_push_all(cli, tmpdir, datafiles):
    156 156
     
    
    157 157
             # Now try bst push all the deps
    
    158 158
             result = cli.run(project=project, args=[
    
    159
    -            'push', 'target.bst',
    
    159
    +            'artifact', 'push', 'target.bst',
    
    160 160
                 '--deps', 'all'
    
    161 161
             ])
    
    162 162
             result.assert_success()
    
    ... ... @@ -346,7 +346,7 @@ def test_recently_pulled_artifact_does_not_expire(cli, datafiles, tmpdir):
    346 346
             assert cli.get_element_state(project, 'element1.bst') != 'cached'
    
    347 347
     
    
    348 348
             # Pull the element1 from the remote cache (this should update its mtime)
    
    349
    -        result = cli.run(project=project, args=['pull', 'element1.bst', '--remote',
    
    349
    +        result = cli.run(project=project, args=['artifact', 'pull', 'element1.bst', '--remote',
    
    350 350
                                                     share.repo])
    
    351 351
             result.assert_success()
    
    352 352
     
    
    ... ... @@ -386,7 +386,7 @@ def test_push_cross_junction(cli, tmpdir, datafiles):
    386 386
             cli.configure({
    
    387 387
                 'artifacts': {'url': share.repo, 'push': True},
    
    388 388
             })
    
    389
    -        result = cli.run(project=project, args=['push', 'junction.bst:import-etc.bst'])
    
    389
    +        result = cli.run(project=project, args=['artifact', 'push', 'junction.bst:import-etc.bst'])
    
    390 390
     
    
    391 391
             cache_key = cli.get_element_key(project, 'junction.bst:import-etc.bst')
    
    392 392
             assert share.has_artifact('subtest', 'import-etc.bst', cache_key)
    
    ... ... @@ -407,7 +407,7 @@ def test_push_already_cached(caplog, cli, tmpdir, datafiles):
    407 407
             result.assert_success()
    
    408 408
             assert "SKIPPED Push" not in result.stderr
    
    409 409
     
    
    410
    -        result = cli.run(project=project, args=['push', 'target.bst'])
    
    410
    +        result = cli.run(project=project, args=['artifact', 'push', 'target.bst'])
    
    411 411
     
    
    412 412
             result.assert_success()
    
    413 413
             assert not result.get_pushed_elements(), "No elements should have been pushed since the cache was populated"
    

  • tests/frontend/workspace.py
    ... ... @@ -1105,10 +1105,10 @@ def test_external_push_pull(cli, datafiles, tmpdir_factory, guess_element):
    1105 1105
                 'artifacts': {'url': share.repo, 'push': True}
    
    1106 1106
             })
    
    1107 1107
     
    
    1108
    -        result = cli.run(project=project, args=['-C', workspace, 'push'] + arg_elm)
    
    1108
    +        result = cli.run(project=project, args=['-C', workspace, 'artifact', 'push'] + arg_elm)
    
    1109 1109
             result.assert_success()
    
    1110 1110
     
    
    1111
    -        result = cli.run(project=project, args=['-C', workspace, 'pull', '--deps', 'all'] + arg_elm)
    
    1111
    +        result = cli.run(project=project, args=['-C', workspace, 'artifact', 'pull', '--deps', 'all'] + arg_elm)
    
    1112 1112
             result.assert_success()
    
    1113 1113
     
    
    1114 1114
     
    

  • tests/integration/build-tree.py
    ... ... @@ -130,7 +130,8 @@ def test_buildtree_pulled(cli, tmpdir, datafiles):
    130 130
             assert cli.get_element_state(project, element_name) != 'cached'
    
    131 131
     
    
    132 132
            # Pull from cache, ensuring the cli option is set to pull the buildtree
    
    133
    -        result = cli.run(project=project, args=['--pull-buildtrees', 'pull', '--deps', 'all', element_name])
    
    133
    +        result = cli.run(project=project,
    
    134
    +                         args=['--pull-buildtrees', 'artifact', 'pull', '--deps', 'all', element_name])
    
    134 135
             result.assert_success()
    
    135 136
     
    
    136 137
             # Check it's using the cached build tree
    
    ... ... @@ -164,7 +165,7 @@ def test_buildtree_options(cli, tmpdir, datafiles):
    164 165
             assert cli.get_element_state(project, element_name) != 'cached'
    
    165 166
     
    
    166 167
             # Pull from cache, but do not include buildtrees.
    
    167
    -        result = cli.run(project=project, args=['pull', '--deps', 'all', element_name])
    
    168
    +        result = cli.run(project=project, args=['artifact', 'pull', '--deps', 'all', element_name])
    
    168 169
             result.assert_success()
    
    169 170
     
    
    170 171
             # The above is the simplest way I know to create a local cache without any buildtrees.
    

  • tests/integration/pullbuildtrees.py
    ... ... @@ -55,12 +55,12 @@ def test_pullbuildtrees(cli, tmpdir, datafiles, integration_cache):
    55 55
             # Pull artifact with default config, assert that pulling again
    
    56 56
            # doesn't create a pull job, then assert that pulling with

    57 57
            # buildtrees set in user config creates a pull job.
    
    58
    -        result = cli.run(project=project, args=['pull', element_name])
    
    58
    +        result = cli.run(project=project, args=['artifact', 'pull', element_name])
    
    59 59
             assert element_name in result.get_pulled_elements()
    
    60
    -        result = cli.run(project=project, args=['pull', element_name])
    
    60
    +        result = cli.run(project=project, args=['artifact', 'pull', element_name])
    
    61 61
             assert element_name not in result.get_pulled_elements()
    
    62 62
             cli.configure({'cache': {'pull-buildtrees': True}})
    
    63
    -        result = cli.run(project=project, args=['pull', element_name])
    
    63
    +        result = cli.run(project=project, args=['artifact', 'pull', element_name])
    
    64 64
             assert element_name in result.get_pulled_elements()
    
    65 65
             default_state(cli, tmpdir, share1)
    
    66 66
     
    
    ... ... @@ -68,13 +68,13 @@ def test_pullbuildtrees(cli, tmpdir, datafiles, integration_cache):
    68 68
             # with buildtrees cli flag set creates a pull job.
    
    69 69
             # Also assert that the buildtree is added to the artifact's
    
    70 70
             # extract dir
    
    71
    -        result = cli.run(project=project, args=['pull', element_name])
    
    71
    +        result = cli.run(project=project, args=['artifact', 'pull', element_name])
    
    72 72
             assert element_name in result.get_pulled_elements()
    
    73 73
             elementdigest = share1.has_artifact('test', element_name, cli.get_element_key(project, element_name))
    
    74 74
             buildtreedir = os.path.join(str(tmpdir), 'artifacts', 'extract', 'test', 'autotools-amhello',
    
    75 75
                                         elementdigest.hash, 'buildtree')
    
    76 76
             assert not os.path.isdir(buildtreedir)
    
    77
    -        result = cli.run(project=project, args=['--pull-buildtrees', 'pull', element_name])
    
    77
    +        result = cli.run(project=project, args=['--pull-buildtrees', 'artifact', 'pull', element_name])
    
    78 78
             assert element_name in result.get_pulled_elements()
    
    79 79
             assert os.path.isdir(buildtreedir)
    
    80 80
             default_state(cli, tmpdir, share1)
    
    ... ... @@ -83,21 +83,21 @@ def test_pullbuildtrees(cli, tmpdir, datafiles, integration_cache):
    83 83
            # that pulling with the same user config doesn't create a pull job,
    
    84 84
             # or when buildtrees cli flag is set.
    
    85 85
             cli.configure({'cache': {'pull-buildtrees': True}})
    
    86
    -        result = cli.run(project=project, args=['pull', element_name])
    
    86
    +        result = cli.run(project=project, args=['artifact', 'pull', element_name])
    
    87 87
             assert element_name in result.get_pulled_elements()
    
    88
    -        result = cli.run(project=project, args=['pull', element_name])
    
    88
    +        result = cli.run(project=project, args=['artifact', 'pull', element_name])
    
    89 89
             assert element_name not in result.get_pulled_elements()
    
    90
    -        result = cli.run(project=project, args=['--pull-buildtrees', 'pull', element_name])
    
    90
    +        result = cli.run(project=project, args=['--pull-buildtrees', 'artifact', 'pull', element_name])
    
    91 91
             assert element_name not in result.get_pulled_elements()
    
    92 92
             default_state(cli, tmpdir, share1)
    
    93 93
     
    
    94 94
             # Pull artifact with default config and buildtrees cli flag set, then assert
    
    95 95
             # that pulling with pullbuildtrees set in user config doesn't create a pull
    
    96 96
             # job.
    
    97
    -        result = cli.run(project=project, args=['--pull-buildtrees', 'pull', element_name])
    
    97
    +        result = cli.run(project=project, args=['--pull-buildtrees', 'artifact', 'pull', element_name])
    
    98 98
             assert element_name in result.get_pulled_elements()
    
    99 99
             cli.configure({'cache': {'pull-buildtrees': True}})
    
    100
    -        result = cli.run(project=project, args=['pull', element_name])
    
    100
    +        result = cli.run(project=project, args=['artifact', 'pull', element_name])
    
    101 101
             assert element_name not in result.get_pulled_elements()
    
    102 102
             default_state(cli, tmpdir, share1)
    
    103 103
     
    
    ... ... @@ -105,10 +105,10 @@ def test_pullbuildtrees(cli, tmpdir, datafiles, integration_cache):
    105 105
             # can't be pushed to an artifact share, then assert that a complete build element
    
    106 106
             # can be. This will attempt a partial pull from share1 and then a partial push
    
    107 107
             # to share2
    
    108
    -        result = cli.run(project=project, args=['pull', element_name])
    
    108
    +        result = cli.run(project=project, args=['artifact', 'pull', element_name])
    
    109 109
             assert element_name in result.get_pulled_elements()
    
    110 110
             cli.configure({'artifacts': {'url': share2.repo, 'push': True}})
    
    111
    -        result = cli.run(project=project, args=['push', element_name])
    
    111
    +        result = cli.run(project=project, args=['artifact', 'push', element_name])
    
    112 112
             assert element_name not in result.get_pushed_elements()
    
    113 113
             assert not share2.has_artifact('test', element_name, cli.get_element_key(project, element_name))
    
    114 114
     
    
    ... ... @@ -116,10 +116,10 @@ def test_pullbuildtrees(cli, tmpdir, datafiles, integration_cache):
    116 116
             # successfully pushed to the remote. This will attempt to pull the buildtree
    
    117 117
             # from share1 and then a 'complete' push to share2
    
    118 118
             cli.configure({'artifacts': {'url': share1.repo, 'push': False}})
    
    119
    -        result = cli.run(project=project, args=['--pull-buildtrees', 'pull', element_name])
    
    119
    +        result = cli.run(project=project, args=['--pull-buildtrees', 'artifact', 'pull', element_name])
    
    120 120
             assert element_name in result.get_pulled_elements()
    
    121 121
             cli.configure({'artifacts': {'url': share2.repo, 'push': True}})
    
    122
    -        result = cli.run(project=project, args=['push', element_name])
    
    122
    +        result = cli.run(project=project, args=['artifact', 'push', element_name])
    
    123 123
             assert element_name in result.get_pushed_elements()
    
    124 124
             assert share2.has_artifact('test', element_name, cli.get_element_key(project, element_name))
    
    125 125
             default_state(cli, tmpdir, share1)
    
    ... ... @@ -128,10 +128,10 @@ def test_pullbuildtrees(cli, tmpdir, datafiles, integration_cache):
    128 128
             # if pull-buildtrees is set, however as share3 is the only defined remote and is empty,
    
    129 129
             # assert that no element artifact buildtrees are pulled (no available remote buildtree) and thus the
    
    130 130
             # artifact cannot be pushed.
    
    131
    -        result = cli.run(project=project, args=['pull', element_name])
    
    131
    +        result = cli.run(project=project, args=['artifact', 'pull', element_name])
    
    132 132
             assert element_name in result.get_pulled_elements()
    
    133 133
             cli.configure({'artifacts': {'url': share3.repo, 'push': True}})
    
    134
    -        result = cli.run(project=project, args=['--pull-buildtrees', 'push', element_name])
    
    134
    +        result = cli.run(project=project, args=['--pull-buildtrees', 'artifact', 'push', element_name])
    
    135 135
             assert "Attempting to fetch missing artifact buildtrees" in result.stderr
    
    136 136
             assert element_name not in result.get_pulled_elements()
    
    137 137
             assert not os.path.isdir(buildtreedir)
    
    ... ... @@ -143,7 +143,7 @@ def test_pullbuildtrees(cli, tmpdir, datafiles, integration_cache):
    143 143
            # to the empty share3. This gives the ability to attempt to push currently partial artifacts to a remote,
    
    144 144
            # without explicitly requiring a bst pull.
    
    145 145
             cli.configure({'artifacts': [{'url': share1.repo, 'push': False}, {'url': share3.repo, 'push': True}]})
    
    146
    -        result = cli.run(project=project, args=['--pull-buildtrees', 'push', element_name])
    
    146
    +        result = cli.run(project=project, args=['--pull-buildtrees', 'artifact', 'push', element_name])
    
    147 147
             assert "Attempting to fetch missing artifact buildtrees" in result.stderr
    
    148 148
             assert element_name in result.get_pulled_elements()
    
    149 149
             assert os.path.isdir(buildtreedir)
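The scenarios above boil down to one rule: an artifact missing its buildtree is partial and can only be pushed as complete if the buildtree can first be fetched, which requires both `--pull-buildtrees` and a remote that actually holds it. A toy model of that rule (illustration only, not BuildStream code):

```python
# Toy model of the push rule the pullbuildtrees test asserts: a partial
# artifact (no buildtree cached locally) can only be pushed complete once
# the buildtree is fetched, which needs --pull-buildtrees and a remote
# that actually has the buildtree.
def can_push_complete(have_buildtree, pull_buildtrees, remote_has_buildtree):
    if have_buildtree:
        return True
    return pull_buildtrees and remote_has_buildtree

# Partial artifact, no flag: push is refused
print(can_push_complete(False, False, True))   # False
# Partial artifact, flag set, a remote (like share1) has the buildtree: push works
print(can_push_complete(False, True, True))    # True
# Flag set but only an empty remote (like share3) is configured: still refused
print(can_push_complete(False, True, False))   # False
```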
    

  • tests/sandboxes/remote-exec-config.py
    ... ... @@ -42,7 +42,7 @@ def test_old_and_new_configs(cli, datafiles):
    42 42
         # Use `pull` here to ensure we try to initialize the remotes, triggering the error
    
    43 43
         #
    
    44 44
         # This does not happen for a simple `bst show`.
    
    45
    -    result = cli.run(project=project, args=['pull', 'element.bst'])
    
    45
    +    result = cli.run(project=project, args=['artifact', 'pull', 'element.bst'])
    
    46 46
         result.assert_main_error(ErrorDomain.LOAD, LoadErrorReason.INVALID_DATA, "specify one")
    
    47 47
     
    
    48 48
     
    
    ... ... @@ -97,5 +97,5 @@ def test_empty_config(cli, datafiles):
    97 97
         # Use `pull` here to ensure we try to initialize the remotes, triggering the error
    
    98 98
         #
    
    99 99
         # This does not happen for a simple `bst show`.
    
    100
    -    result = cli.run(project=project, args=['pull', 'element.bst'])
    
    100
    +    result = cli.run(project=project, args=['artifact', 'pull', 'element.bst'])
    
    101 101
         result.assert_main_error(ErrorDomain.LOAD, LoadErrorReason.INVALID_DATA, "specify one")

  • tests/sources/git.py
    ... ... @@ -883,6 +883,195 @@ def test_git_describe(cli, tmpdir, datafiles, ref_storage, tag_type):
    883 883
         assert p.returncode != 0
    
    884 884
     
    
    885 885
     
    
    886
    +@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
    
    887
    +@pytest.mark.datafiles(os.path.join(DATA_DIR, 'template'))
    
    888
    +@pytest.mark.parametrize("ref_storage", [('inline'), ('project.refs')])
    
    889
+@pytest.mark.parametrize("tag_type", [('annotated'), ('lightweight')])
+def test_git_describe_head_is_tagged(cli, tmpdir, datafiles, ref_storage, tag_type):
+    project = str(datafiles)
+
+    project_config = _yaml.load(os.path.join(project, 'project.conf'))
+    project_config['ref-storage'] = ref_storage
+    _yaml.dump(_yaml.node_sanitize(project_config), os.path.join(project, 'project.conf'))
+
+    repofiles = os.path.join(str(tmpdir), 'repofiles')
+    os.makedirs(repofiles, exist_ok=True)
+    file0 = os.path.join(repofiles, 'file0')
+    with open(file0, 'w') as f:
+        f.write('test\n')
+
+    repo = create_repo('git', str(tmpdir))
+
+    def tag(name):
+        if tag_type == 'annotated':
+            repo.add_annotated_tag(name, name)
+        else:
+            repo.add_tag(name)
+
+    ref = repo.create(repofiles)
+    tag('uselesstag')
+
+    file1 = os.path.join(str(tmpdir), 'file1')
+    with open(file1, 'w') as f:
+        f.write('test\n')
+    repo.add_file(file1)
+
+    file2 = os.path.join(str(tmpdir), 'file2')
+    with open(file2, 'w') as f:
+        f.write('test\n')
+    repo.branch('branch2')
+    repo.add_file(file2)
+
+    repo.checkout('master')
+    file3 = os.path.join(str(tmpdir), 'file3')
+    with open(file3, 'w') as f:
+        f.write('test\n')
+    repo.add_file(file3)
+
+    tagged_ref = repo.merge('branch2')
+    tag('tag')
+
+    config = repo.source_config()
+    config['track'] = repo.latest_commit()
+    config['track-tags'] = True
+
+    # Write out our test target
+    element = {
+        'kind': 'import',
+        'sources': [
+            config
+        ],
+    }
+    element_path = os.path.join(project, 'target.bst')
+    _yaml.dump(element, element_path)
+
+    if ref_storage == 'inline':
+        result = cli.run(project=project, args=['source', 'track', 'target.bst'])
+        result.assert_success()
+    else:
+        result = cli.run(project=project, args=['source', 'track', 'target.bst', '--deps', 'all'])
+        result.assert_success()
+
+    if ref_storage == 'inline':
+        element = _yaml.load(element_path)
+        tags = _yaml.node_sanitize(element['sources'][0]['tags'])
+        assert len(tags) == 1
+        for tag in tags:
+            assert 'tag' in tag
+            assert 'commit' in tag
+            assert 'annotated' in tag
+            assert tag['annotated'] == (tag_type == 'annotated')
+
+        assert set([(tag['tag'], tag['commit']) for tag in tags]) == set([('tag', repo.rev_parse('tag^{commit}'))])
+
+    checkout = os.path.join(str(tmpdir), 'checkout')
+
+    result = cli.run(project=project, args=['build', 'target.bst'])
+    result.assert_success()
+    result = cli.run(project=project, args=['checkout', 'target.bst', checkout])
+    result.assert_success()
+
+    if tag_type == 'annotated':
+        options = []
+    else:
+        options = ['--tags']
+    describe = subprocess.check_output(['git', 'describe'] + options,
+                                       cwd=checkout).decode('ascii')
+    assert describe.startswith('tag')
+
+    tags = subprocess.check_output(['git', 'tag'],
+                                   cwd=checkout).decode('ascii')
+    tags = set(tags.splitlines())
+    assert tags == set(['tag'])
+
+    rev_list = subprocess.check_output(['git', 'rev-list', '--all'],
+                                       cwd=checkout).decode('ascii')
+
+    assert set(rev_list.splitlines()) == set([tagged_ref])
+
+    p = subprocess.run(['git', 'log', repo.rev_parse('uselesstag')],
+                       cwd=checkout)
+    assert p.returncode != 0
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, 'template'))
+def test_git_describe_relevant_history(cli, tmpdir, datafiles):
+    project = str(datafiles)
+
+    project_config = _yaml.load(os.path.join(project, 'project.conf'))
+    project_config['ref-storage'] = 'project.refs'
+    _yaml.dump(_yaml.node_sanitize(project_config), os.path.join(project, 'project.conf'))
+
+    repofiles = os.path.join(str(tmpdir), 'repofiles')
+    os.makedirs(repofiles, exist_ok=True)
+    file0 = os.path.join(repofiles, 'file0')
+    with open(file0, 'w') as f:
+        f.write('test\n')
+
+    repo = create_repo('git', str(tmpdir))
+    repo.create(repofiles)
+
+    file1 = os.path.join(str(tmpdir), 'file1')
+    with open(file1, 'w') as f:
+        f.write('test\n')
+    repo.add_file(file1)
+    repo.branch('branch')
+    repo.checkout('master')
+
+    file2 = os.path.join(str(tmpdir), 'file2')
+    with open(file2, 'w') as f:
+        f.write('test\n')
+    repo.add_file(file2)
+
+    file3 = os.path.join(str(tmpdir), 'file3')
+    with open(file3, 'w') as f:
+        f.write('test\n')
+    branch_boundary = repo.add_file(file3)
+
+    repo.checkout('branch')
+    file4 = os.path.join(str(tmpdir), 'file4')
+    with open(file4, 'w') as f:
+        f.write('test\n')
+    tagged_ref = repo.add_file(file4)
+    repo.add_annotated_tag('tag1', 'tag1')
+
+    head = repo.merge('master')
+
+    config = repo.source_config()
+    config['track'] = head
+    config['track-tags'] = True
+
+    # Write out our test target
+    element = {
+        'kind': 'import',
+        'sources': [
+            config
+        ],
+    }
+    element_path = os.path.join(project, 'target.bst')
+    _yaml.dump(element, element_path)
+
+    result = cli.run(project=project, args=['source', 'track', 'target.bst', '--deps', 'all'])
+    result.assert_success()
+
+    checkout = os.path.join(str(tmpdir), 'checkout')
+
+    result = cli.run(project=project, args=['build', 'target.bst'])
+    result.assert_success()
+    result = cli.run(project=project, args=['checkout', 'target.bst', checkout])
+    result.assert_success()
+
+    describe = subprocess.check_output(['git', 'describe'],
+                                       cwd=checkout).decode('ascii')
+    assert describe.startswith('tag1-2-')
+
+    rev_list = subprocess.check_output(['git', 'rev-list', '--all'],
+                                       cwd=checkout).decode('ascii')
+
+    assert set(rev_list.splitlines()) == set([head, tagged_ref, branch_boundary])
+
+
 @pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
 @pytest.mark.datafiles(os.path.join(DATA_DIR, 'template'))
 def test_default_do_not_track_tags(cli, tmpdir, datafiles):
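The branch on `tag_type` above (passing `--tags` only for lightweight tags) reflects a real `git describe` behaviour: without `--tags`, only annotated tags are considered. A minimal standalone sketch of that distinction, assuming `git` is on `PATH`; the repository, tag name, and user identity below are illustrative only:

```python
import subprocess
import tempfile

def _git(*args, cwd):
    # Thin helper; raises CalledProcessError if git exits non-zero.
    return subprocess.check_output(('git',) + args, cwd=cwd,
                                   stderr=subprocess.DEVNULL).decode().strip()

with tempfile.TemporaryDirectory() as repo:
    _git('init', '-q', cwd=repo)
    _git('-c', 'user.name=test', '-c', 'user.email=test@example.com',
         'commit', '--allow-empty', '-q', '-m', 'initial', cwd=repo)
    _git('tag', 'light', cwd=repo)  # lightweight (unannotated) tag

    # Plain `git describe` only considers annotated tags, so it fails
    # here; `--tags` makes lightweight tags eligible as well.
    try:
        plain = _git('describe', cwd=repo)
    except subprocess.CalledProcessError:
        plain = None
    with_tags = _git('describe', '--tags', cwd=repo)

print(plain, with_tags)  # None light
```

This is why the annotated case in the test can drop `--tags` entirely: an annotated tag is describable by default.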
    

  • tests/testutils/runcli.py
    ... ... @@ -167,6 +167,23 @@ class Result():
    def assert_shell_error(self, fail_message=''):
        assert self.exit_code == 1, fail_message

+    # get_start_order()
+    #
+    # Gets the list of elements processed in a given queue, in the
+    # order of their first appearance in the session.
+    #
+    # Args:
+    #    activity (str): The queue activity name (like 'fetch')
+    #
+    # Returns:
+    #    (list): A list of element names, in the order in which they
+    #            first appeared in the result
+    #
+    def get_start_order(self, activity):
+        # re.findall() always returns a (possibly empty) list of the
+        # captured element names, so no None check is needed.
+        return re.findall(r'\[\s*{}:(\S+)\s*\]\s*START\s*.*\.log'.format(activity), self.stderr)
+
    # get_tracked_elements()
    #
    # Produces a list of element names on which tracking occurred
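The `START` lines that `get_start_order()` scans for can be exercised against a synthetic transcript. A minimal sketch using the same regex; the sample stderr text below is fabricated for illustration and is not real BuildStream output:

```python
import re

# Fabricated stderr resembling queue status lines that end in a log path.
stderr = """\
[fetch:core/foo.bst] START   fetch/core-foo.log
[fetch:core/bar.bst] START   fetch/core-bar.log
[fetch:core/foo.bst] SUCCESS fetch/core-foo.log
[build:core/foo.bst] START   build/core-foo.log
"""

def get_start_order(activity, stderr):
    # Same pattern as Result.get_start_order(): capture the element name
    # between '<activity>:' and ']' on START lines referencing a .log file.
    return re.findall(r'\[\s*{}:(\S+)\s*\]\s*START\s*.*\.log'.format(activity), stderr)

print(get_start_order('fetch', stderr))  # ['core/foo.bst', 'core/bar.bst']
print(get_start_order('build', stderr))  # ['core/foo.bst']
```

Note that non-START lines (like the SUCCESS line) and other activities are filtered out by the pattern itself.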
    


