Valentin David pushed to branch valentindavid/git_shallow_fetch at BuildStream / buildstream
Commits:
- 180fa774 by Benjamin Schubert at 2018-12-13T12:05:15Z
- 224aa4c2 by Benjamin Schubert at 2018-12-13T12:34:41Z
- 053beb66 by Tristan Van Berkom at 2018-12-13T14:23:19Z
- 29ab271c by Tristan Van Berkom at 2018-12-13T14:23:19Z
- ba955cf0 by Tristan Van Berkom at 2018-12-13T14:23:19Z
- 6010b5a4 by Tristan Van Berkom at 2018-12-13T14:23:19Z
- 3a6d27a4 by Tristan Van Berkom at 2018-12-13T14:23:19Z
- 4c0e602c by Tristan Van Berkom at 2018-12-13T14:23:19Z
- 60ddeeb9 by Tristan Van Berkom at 2018-12-13T14:58:28Z
- bf72cc42 by Angelos Evripiotis at 2018-12-13T17:31:17Z
- 13eb7ed2 by Angelos Evripiotis at 2018-12-13T18:01:56Z
- 540cb670 by Valentin David at 2018-12-14T11:23:56Z
15 changed files:
- CONTRIBUTING.rst
- buildstream/buildelement.py
- buildstream/element.py
- buildstream/plugins/elements/autotools.py
- buildstream/plugins/elements/cmake.py
- buildstream/plugins/elements/distutils.py
- buildstream/plugins/elements/make.py
- buildstream/plugins/elements/makemaker.py
- buildstream/plugins/elements/manual.py
- buildstream/plugins/elements/meson.py
- buildstream/plugins/elements/modulebuild.py
- buildstream/plugins/elements/pip.py
- buildstream/plugins/elements/qmake.py
- buildstream/plugins/sources/git.py
- tests/sources/git.py
Changes:
--- a/CONTRIBUTING.rst
+++ b/CONTRIBUTING.rst
@@ -1547,23 +1547,50 @@ Tests that run a sandbox should be decorated with::
 
 and use the integration cli helper.
 
-You should first aim to write tests that exercise your changes from the cli.
-This is so that the testing is end-to-end, and the changes are guaranteed to
-work for the end-user. The cli is considered stable, and so tests written in
-terms of it are unlikely to require updating as the internals of the software
-change over time.
-
-It may be impractical to sufficiently examine some changes this way. For
-example, the number of cases to test and the running time of each test may be
-too high. It may also be difficult to contrive circumstances to cover every
-line of the change. If this is the case, next you can consider also writing
-unit tests that work more directly on the changes.
-
-It is important to write unit tests in such a way that they do not break due to
-changes unrelated to what they are meant to test. For example, if the test
-relies on a lot of BuildStream internals, a large refactoring will likely
-require the test to be rewritten. Pure functions that only rely on the Python
-Standard Library are excellent candidates for unit testing.
+You must test your changes in an end-to-end fashion. Consider the first end to
+be the appropriate user interface, and the other end to be the change you have
+made.
+
+The aim for our tests is to make assertions about how you impact and define the
+outward user experience. You should be able to exercise all code paths via the
+user interface, just as one can test the strength of rivets by sailing dozens
+of ocean liners. Keep in mind that your ocean liners could be sailing properly
+*because* of a malfunctioning rivet. End-to-end testing will warn you that
+fixing the rivet will sink the ships.
+
+The primary user interface is the cli, so that should be the first target 'end'
+for testing. Most of the value of BuildStream comes from what you can achieve
+with the cli.
+
+We also have what we call a *"Public API Surface"*, as previously mentioned in
+:ref:`contributing_documenting_symbols`. You should consider this a secondary
+target. This is mainly for advanced users to implement their plugins against.
+
+Note that both of these targets for testing are guaranteed to continue working
+in the same way across versions. This means that tests written in terms of them
+will be robust to large changes to the code. This important property means that
+BuildStream developers can make large refactorings without needing to rewrite
+fragile tests.
+
+Another user to consider is the BuildStream developer, therefore internal API
+surfaces are also targets for testing. For example the YAML loading code, and
+the CasCache. Remember that these surfaces are still just a means to the end of
+providing value through the cli and the *"Public API Surface"*.
+
+It may be impractical to sufficiently examine some changes in an end-to-end
+fashion. The number of cases to test, and the running time of each test, may be
+too high. Such typically low-level things, e.g. parsers, may also be tested
+with unit tests; alongside the mandatory end-to-end tests.
+
+It is important to write unit tests that are not fragile, i.e. in such a way
+that they do not break due to changes unrelated to what they are meant to test.
+For example, if the test relies on a lot of BuildStream internals, a large
+refactoring will likely require the test to be rewritten. Pure functions that
+only rely on the Python Standard Library are excellent candidates for unit
+testing.
+
+Unit tests only make it easier to implement things correctly, end-to-end tests
+make it easier to implement the right thing.
 
 
 Measuring performance
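The contrast the new CONTRIBUTING.rst text draws between cli-driven end-to-end
tests and unit tests of pure functions might look like the following sketch in
practice. This is illustrative only: the cli and datafiles fixtures and the
DATA_DIR constant are assumed from the existing test helpers, and target.bst
and _parse_version() are hypothetical::

    import os
    import pytest

    @pytest.mark.datafiles(os.path.join(DATA_DIR, 'project'))
    def test_build_target(cli, datafiles):
        # End-to-end: drive the change through the stable cli surface
        project = str(datafiles)
        result = cli.run(project=project, args=['build', 'target.bst'])
        result.assert_success()

    def test_parse_version():
        # Unit test: a pure, hypothetical helper with no BuildStream
        # internals involved, so a refactoring will not break this test
        assert _parse_version('1.2.3') == (1, 2, 3)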
--- a/buildstream/buildelement.py
+++ b/buildstream/buildelement.py
@@ -215,10 +215,6 @@ class BuildElement(Element):
         # Setup environment
         sandbox.set_environment(self.get_environment())
 
-        # Enable command batching across prepare() and assemble()
-        self.batch_prepare_assemble(SandboxFlags.ROOT_READ_ONLY,
-                                    collect=self.get_variable('install-root'))
-
     def stage(self, sandbox):
 
         # Stage deps in the sandbox root
--- a/buildstream/element.py
+++ b/buildstream/element.py
@@ -1612,9 +1612,9 @@ class Element(Plugin):
                 sandbox_vpath = sandbox_vroot.descend(path_components)
                 try:
                     sandbox_vpath.import_files(workspace.get_absolute_path())
-                except UtilError as e:
+                except UtilError as e2:
                     self.warn("Failed to preserve workspace state for failed build sysroot: {}"
-                              .format(e))
+                              .format(e2))
 
             self.__set_build_result(success=False, description=str(e), detail=e.detail)
             self._cache_artifact(rootdir, sandbox, e.collect)
--- a/buildstream/plugins/elements/autotools.py
+++ b/buildstream/plugins/elements/autotools.py
@@ -55,7 +55,7 @@ See :ref:`built-in functionality documentation <core_buildelement_builtins>` for
 details on common configuration options for build elements.
 """
 
-from buildstream import BuildElement
+from buildstream import BuildElement, SandboxFlags
 
 
 # Element implementation for the 'autotools' kind.
@@ -63,6 +63,12 @@ class AutotoolsElement(BuildElement):
     # Supports virtual directories (required for remote execution)
     BST_VIRTUAL_DIRECTORY = True
 
+    # Enable command batching across prepare() and assemble()
+    def configure_sandbox(self, sandbox):
+        super().configure_sandbox(sandbox)
+        self.batch_prepare_assemble(SandboxFlags.ROOT_READ_ONLY,
+                                    collect=self.get_variable('install-root'))
+
 
 # Plugin entry point
 def setup():
--- a/buildstream/plugins/elements/cmake.py
+++ b/buildstream/plugins/elements/cmake.py
@@ -54,7 +54,7 @@ See :ref:`built-in functionality documentation <core_buildelement_builtins>` for
 details on common configuration options for build elements.
 """
 
-from buildstream import BuildElement
+from buildstream import BuildElement, SandboxFlags
 
 
 # Element implementation for the 'cmake' kind.
@@ -62,6 +62,12 @@ class CMakeElement(BuildElement):
     # Supports virtual directories (required for remote execution)
     BST_VIRTUAL_DIRECTORY = True
 
+    # Enable command batching across prepare() and assemble()
+    def configure_sandbox(self, sandbox):
+        super().configure_sandbox(sandbox)
+        self.batch_prepare_assemble(SandboxFlags.ROOT_READ_ONLY,
+                                    collect=self.get_variable('install-root'))
+
 
 # Plugin entry point
 def setup():
--- a/buildstream/plugins/elements/distutils.py
+++ b/buildstream/plugins/elements/distutils.py
@@ -31,12 +31,19 @@ See :ref:`built-in functionality documentation <core_buildelement_builtins>` for
 details on common configuration options for build elements.
 """
 
-from buildstream import BuildElement
+from buildstream import BuildElement, SandboxFlags
 
 
 # Element implementation for the python 'distutils' kind.
 class DistutilsElement(BuildElement):
-    pass
+    # Supports virtual directories (required for remote execution)
+    BST_VIRTUAL_DIRECTORY = True
+
+    # Enable command batching across prepare() and assemble()
+    def configure_sandbox(self, sandbox):
+        super().configure_sandbox(sandbox)
+        self.batch_prepare_assemble(SandboxFlags.ROOT_READ_ONLY,
+                                    collect=self.get_variable('install-root'))
 
 
 # Plugin entry point
--- a/buildstream/plugins/elements/make.py
+++ b/buildstream/plugins/elements/make.py
@@ -36,7 +36,7 @@ See :ref:`built-in functionality documentation <core_buildelement_builtins>` for
 details on common configuration options for build elements.
 """
 
-from buildstream import BuildElement
+from buildstream import BuildElement, SandboxFlags
 
 
 # Element implementation for the 'make' kind.
@@ -44,6 +44,12 @@ class MakeElement(BuildElement):
     # Supports virtual directories (required for remote execution)
     BST_VIRTUAL_DIRECTORY = True
 
+    # Enable command batching across prepare() and assemble()
+    def configure_sandbox(self, sandbox):
+        super().configure_sandbox(sandbox)
+        self.batch_prepare_assemble(SandboxFlags.ROOT_READ_ONLY,
+                                    collect=self.get_variable('install-root'))
+
 
 # Plugin entry point
 def setup():
--- a/buildstream/plugins/elements/makemaker.py
+++ b/buildstream/plugins/elements/makemaker.py
@@ -31,12 +31,19 @@ See :ref:`built-in functionality documentation <core_buildelement_builtins>` for
 details on common configuration options for build elements.
 """
 
-from buildstream import BuildElement
+from buildstream import BuildElement, SandboxFlags
 
 
 # Element implementation for the 'makemaker' kind.
 class MakeMakerElement(BuildElement):
-    pass
+    # Supports virtual directories (required for remote execution)
+    BST_VIRTUAL_DIRECTORY = True
+
+    # Enable command batching across prepare() and assemble()
+    def configure_sandbox(self, sandbox):
+        super().configure_sandbox(sandbox)
+        self.batch_prepare_assemble(SandboxFlags.ROOT_READ_ONLY,
+                                    collect=self.get_variable('install-root'))
 
 
 # Plugin entry point
--- a/buildstream/plugins/elements/manual.py
+++ b/buildstream/plugins/elements/manual.py
@@ -31,12 +31,19 @@ See :ref:`built-in functionality documentation <core_buildelement_builtins>` for
 details on common configuration options for build elements.
 """
 
-from buildstream import BuildElement
+from buildstream import BuildElement, SandboxFlags
 
 
 # Element implementation for the 'manual' kind.
 class ManualElement(BuildElement):
-    pass
+    # Supports virtual directories (required for remote execution)
+    BST_VIRTUAL_DIRECTORY = True
+
+    # Enable command batching across prepare() and assemble()
+    def configure_sandbox(self, sandbox):
+        super().configure_sandbox(sandbox)
+        self.batch_prepare_assemble(SandboxFlags.ROOT_READ_ONLY,
+                                    collect=self.get_variable('install-root'))
 
 
 # Plugin entry point
--- a/buildstream/plugins/elements/meson.py
+++ b/buildstream/plugins/elements/meson.py
@@ -51,7 +51,7 @@ See :ref:`built-in functionality documentation <core_buildelement_builtins>` for
 details on common configuration options for build elements.
 """
 
-from buildstream import BuildElement
+from buildstream import BuildElement, SandboxFlags
 
 
 # Element implementation for the 'meson' kind.
@@ -59,6 +59,12 @@ class MesonElement(BuildElement):
     # Supports virtual directories (required for remote execution)
     BST_VIRTUAL_DIRECTORY = True
 
+    # Enable command batching across prepare() and assemble()
+    def configure_sandbox(self, sandbox):
+        super().configure_sandbox(sandbox)
+        self.batch_prepare_assemble(SandboxFlags.ROOT_READ_ONLY,
+                                    collect=self.get_variable('install-root'))
+
 
 # Plugin entry point
 def setup():
--- a/buildstream/plugins/elements/modulebuild.py
+++ b/buildstream/plugins/elements/modulebuild.py
@@ -31,12 +31,19 @@ See :ref:`built-in functionality documentation <core_buildelement_builtins>` for
 details on common configuration options for build elements.
 """
 
-from buildstream import BuildElement
+from buildstream import BuildElement, SandboxFlags
 
 
 # Element implementation for the 'modulebuild' kind.
 class ModuleBuildElement(BuildElement):
-    pass
+    # Supports virtual directories (required for remote execution)
+    BST_VIRTUAL_DIRECTORY = True
+
+    # Enable command batching across prepare() and assemble()
+    def configure_sandbox(self, sandbox):
+        super().configure_sandbox(sandbox)
+        self.batch_prepare_assemble(SandboxFlags.ROOT_READ_ONLY,
+                                    collect=self.get_variable('install-root'))
 
 
 # Plugin entry point
--- a/buildstream/plugins/elements/pip.py
+++ b/buildstream/plugins/elements/pip.py
@@ -31,12 +31,19 @@ See :ref:`built-in functionality documentation <core_buildelement_builtins>` for
 details on common configuration options for build elements.
 """
 
-from buildstream import BuildElement
+from buildstream import BuildElement, SandboxFlags
 
 
 # Element implementation for the 'pip' kind.
 class PipElement(BuildElement):
-    pass
+    # Supports virtual directories (required for remote execution)
+    BST_VIRTUAL_DIRECTORY = True
+
+    # Enable command batching across prepare() and assemble()
+    def configure_sandbox(self, sandbox):
+        super().configure_sandbox(sandbox)
+        self.batch_prepare_assemble(SandboxFlags.ROOT_READ_ONLY,
+                                    collect=self.get_variable('install-root'))
 
 
 # Plugin entry point
--- a/buildstream/plugins/elements/qmake.py
+++ b/buildstream/plugins/elements/qmake.py
@@ -31,7 +31,7 @@ See :ref:`built-in functionality documentation <core_buildelement_builtins>` for
 details on common configuration options for build elements.
 """
 
-from buildstream import BuildElement
+from buildstream import BuildElement, SandboxFlags
 
 
 # Element implementation for the 'qmake' kind.
@@ -39,6 +39,12 @@ class QMakeElement(BuildElement):
     # Supports virtual directories (required for remote execution)
     BST_VIRTUAL_DIRECTORY = True
 
+    # Enable command batching across prepare() and assemble()
+    def configure_sandbox(self, sandbox):
+        super().configure_sandbox(sandbox)
+        self.batch_prepare_assemble(SandboxFlags.ROOT_READ_ONLY,
+                                    collect=self.get_variable('install-root'))
+
 
 # Plugin entry point
 def setup():
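All of the element plugins above repeat a single pattern: rather than the
BuildElement base class batching commands unconditionally (the code removed
from buildelement.py), each element that supports virtual directories opts in
from its own configure_sandbox(). A third-party element wanting the same
behaviour would, as a sketch, follow the same shape; the element name here is
hypothetical::

    from buildstream import BuildElement, SandboxFlags


    # Hypothetical out-of-tree element following the same opt-in pattern
    class MyToolElement(BuildElement):
        # Supports virtual directories (required for remote execution)
        BST_VIRTUAL_DIRECTORY = True

        # Enable command batching across prepare() and assemble()
        def configure_sandbox(self, sandbox):
            super().configure_sandbox(sandbox)
            self.batch_prepare_assemble(SandboxFlags.ROOT_READ_ONLY,
                                        collect=self.get_variable('install-root'))


    # Plugin entry point
    def setup():
        return MyToolElement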
--- a/buildstream/plugins/sources/git.py
+++ b/buildstream/plugins/sources/git.py
@@ -183,7 +183,7 @@ WARN_INVALID_SUBMODULE = "invalid-submodule"
 #
 class GitMirror(SourceFetcher):
 
-    def __init__(self, source, path, url, ref, *, primary=False, tags=[]):
+    def __init__(self, source, path, url, ref, *, primary=False, tags=[], tracking=None):
 
         super().__init__()
         self.source = source
@@ -192,11 +192,100 @@ class GitMirror(SourceFetcher):
         self.ref = ref
         self.tags = tags
         self.primary = primary
+        dirname = utils.url_directory_name(url)
         self.mirror = os.path.join(source.get_mirror_directory(), utils.url_directory_name(url))
+        self.fetch_mirror = os.path.join(source.get_mirror_directory(), '{}-{}'.format(dirname, ref))
         self.mark_download_url(url)
+        self.tracking = tracking
+
+    def mirror_path(self):
+        if os.path.exists(self.mirror):
+            return self.mirror
+        else:
+            assert os.path.exists(self.fetch_mirror)
+            return self.fetch_mirror
+
+    def ensure_fetchable(self, alias_override=None):
+
+        if os.path.exists(self.mirror):
+            return
+
+        if self.tags:
+            for tag, commit, _ in self.tags:
+                if commit != self.ref:
+                    self.source.status("{}: tag '{}' is not on commit '{}', so a full clone is required"
+                                       .format(self.source, tag, commit))
+                    self.ensure_trackable(alias_override=alias_override)
+                    return
+
+        if os.path.exists(self.fetch_mirror):
+            return
+
+        with self.source.tempdir() as tmpdir:
+            self.source.call([self.source.host_git, 'init', '--bare', tmpdir],
+                             fail="Failed to init git repository",
+                             fail_temporarily=True)
+
+            url = self.source.translate_url(self.url, alias_override=alias_override,
+                                            primary=self.primary)
+
+            self.source.call([self.source.host_git, 'remote', 'add', '--mirror=fetch', 'origin', url],
+                             cwd=tmpdir,
+                             fail="Failed to init git repository",
+                             fail_temporarily=True)
+
+            _, refs = self.source.check_output([self.source.host_git, 'ls-remote', 'origin'],
+                                               cwd=tmpdir,
+                                               fail="Failed to clone git repository {}".format(url),
+                                               fail_temporarily=True)
+
+            advertised = None
+            for ref_line in refs.splitlines():
+                commit, ref = ref_line.split('\t', 1)
+                if ref == 'HEAD':
+                    continue
+                if ref.endswith('^{}'):
+                    continue
+                if self.tracking:
+                    # For validate_cache to work
+                    if ref not in ['refs/heads/{}'.format(self.tracking),
+                                   'refs/tags/{}'.format(self.tracking)]:
+                        continue
+                if self.ref == commit:
+                    advertised = ref
+                    break
+
+            if advertised is None:
+                self.source.status("{}: {} is not advertised on {}, so a full clone is required"
+                                   .format(self.source, self.ref, url))
+
+                self.ensure_trackable(alias_override=alias_override)
+                return
+
+            self.source.call([self.source.host_git, 'fetch', '--depth=1', 'origin', advertised],
+                             cwd=tmpdir,
+                             fail="Failed to fetch repository",
+                             fail_temporarily=True)
+
+            # We need to have a ref to make it clonable
+            self.source.call([self.source.host_git, 'update-ref', 'HEAD', self.ref],
+                             cwd=tmpdir,
+                             fail="Failed to tag HEAD",
+                             fail_temporarily=True)
+
+            try:
+                move_atomic(tmpdir, self.fetch_mirror)
+            except DirectoryExistsError:
+                # Another process was quicker to download this repository.
+                # Let's discard our own
+                self.source.status("{}: Discarding duplicate clone of {}"
+                                   .format(self.source, url))
+            except OSError as e:
+                raise SourceError("{}: Failed to move cloned git repository {} from '{}' to '{}': {}"
+                                  .format(self.source, url, tmpdir, self.fetch_mirror, e)) from e
 
     # Ensures that the mirror exists
-    def ensure(self, alias_override=None):
+    def ensure_trackable(self, alias_override=None):
 
         # Unfortunately, git does not know how to only clone just a specific ref,
         # so we have to download all of those gigs even if we only need a couple
@@ -231,18 +320,20 @@ class GitMirror(SourceFetcher):
                                         alias_override=alias_override,
                                         primary=self.primary)
 
+        mirror = self.mirror_path()
+
         if alias_override:
             remote_name = utils.url_directory_name(alias_override)
             _, remotes = self.source.check_output(
                 [self.source.host_git, 'remote'],
-                fail="Failed to retrieve list of remotes in {}".format(self.mirror),
-                cwd=self.mirror
+                fail="Failed to retrieve list of remotes in {}".format(mirror),
+                cwd=mirror
             )
             if remote_name not in remotes:
                 self.source.call(
                     [self.source.host_git, 'remote', 'add', remote_name, url],
                     fail="Failed to add remote {} with url {}".format(remote_name, url),
-                    cwd=self.mirror
+                    cwd=mirror
                 )
         else:
             remote_name = "origin"
@@ -250,7 +341,7 @@ class GitMirror(SourceFetcher):
         self.source.call([self.source.host_git, 'fetch', remote_name, '--prune', '--force', '--tags'],
                          fail="Failed to fetch from remote git repository: {}".format(url),
                          fail_temporarily=True,
-                         cwd=self.mirror)
+                         cwd=mirror)
 
     def fetch(self, alias_override=None):
         # Resolve the URL for the message
@@ -261,7 +352,7 @@ class GitMirror(SourceFetcher):
         with self.source.timed_activity("Fetching from {}"
                                         .format(resolved_url),
                                         silent_nested=True):
-            self.ensure(alias_override)
+            self.ensure_fetchable(alias_override)
             if not self.has_ref():
                 self._fetch(alias_override)
             self.assert_ref()
@@ -270,12 +361,14 @@ class GitMirror(SourceFetcher):
         if not self.ref:
             return False
 
-        # If the mirror doesnt exist, we also dont have the ref
-        if not os.path.exists(self.mirror):
+        if not os.path.exists(self.mirror) and not os.path.exists(self.fetch_mirror):
+            # If the mirror doesnt exist, we also dont have the ref
             return False
 
+        mirror = self.mirror_path()
+
         # Check if the ref is really there
-        rc = self.source.call([self.source.host_git, 'cat-file', '-t', self.ref], cwd=self.mirror)
+        rc = self.source.call([self.source.host_git, 'cat-file', '-t', self.ref], cwd=mirror)
         return rc == 0
 
     def assert_ref(self):
@@ -325,11 +418,13 @@ class GitMirror(SourceFetcher):
     def stage(self, directory):
        fullpath = os.path.join(directory, self.path)
 
+        mirror = self.mirror_path()
+
         # Using --shared here avoids copying the objects into the checkout, in any
         # case we're just checking out a specific commit and then removing the .git/
         # directory.
-        self.source.call([self.source.host_git, 'clone', '--no-checkout', '--shared', self.mirror, fullpath],
-                         fail="Failed to create git mirror {} in directory: {}".format(self.mirror, fullpath),
+        self.source.call([self.source.host_git, 'clone', '--no-checkout', '--shared', mirror, fullpath],
+                         fail="Failed to create git mirror {} in directory: {}".format(mirror, fullpath),
                          fail_temporarily=True)
 
         self.source.call([self.source.host_git, 'checkout', '--force', self.ref],
@@ -359,9 +454,11 @@ class GitMirror(SourceFetcher):
 
     # List the submodules (path/url tuples) present at the given ref of this repo
     def submodule_list(self):
+        mirror = self.mirror_path()
+
         modules = "{}:{}".format(self.ref, GIT_MODULES)
         exit_code, output = self.source.check_output(
-            [self.source.host_git, 'show', modules], cwd=self.mirror)
+            [self.source.host_git, 'show', modules], cwd=mirror)
 
         # If git show reports error code 128 here, we take it to mean there is
         # no .gitmodules file to display for the given revision.
@@ -389,6 +486,8 @@ class GitMirror(SourceFetcher):
     # Fetch the ref which this mirror requires its submodule to have,
     # at the given ref of this mirror.
     def submodule_ref(self, submodule, ref=None):
+        mirror = self.mirror_path()
+
         if not ref:
             ref = self.ref
 
@@ -397,7 +496,7 @@ class GitMirror(SourceFetcher):
         _, output = self.source.check_output([self.source.host_git, 'ls-tree', ref, submodule],
                                              fail="ls-tree failed for commit {} and submodule: {}".format(
                                                  ref, submodule),
-                                             cwd=self.mirror)
+                                             cwd=mirror)
 
         # read the commit hash from the output
         fields = output.split()
@@ -514,8 +613,8 @@ class GitSource(Source):
         self.track_tags = self.node_get_member(node, bool, 'track-tags', False)
 
         self.original_url = self.node_get_member(node, str, 'url')
-        self.mirror = GitMirror(self, '', self.original_url, ref, tags=tags, primary=True)
         self.tracking = self.node_get_member(node, str, 'track', None)
+        self.mirror = GitMirror(self, '', self.original_url, ref, tags=tags, primary=True, tracking=self.tracking)
 
         self.ref_format = self.node_get_member(node, str, 'ref-format', 'sha1')
         if self.ref_format not in ['sha1', 'git-describe']:
@@ -633,7 +732,7 @@ class GitSource(Source):
         with self.timed_activity("Tracking {} from {}"
                                  .format(self.tracking, resolved_url),
                                  silent_nested=True):
-            self.mirror.ensure()
+            self.mirror.ensure_trackable()
             self.mirror._fetch()
 
             # Update self.mirror.ref and node.ref from the self.tracking branch
@@ -643,6 +742,7 @@ class GitSource(Source):
 
     def init_workspace(self, directory):
         # XXX: may wish to refactor this as some code dupe with stage()
+        self.mirror.ensure_trackable()
         self.refresh_submodules()
 
         with self.timed_activity('Setting up workspace "{}"'.format(directory), silent_nested=True):
@@ -717,15 +817,16 @@ class GitSource(Source):
         # Assert that the ref exists in the track tag/branch, if track has been specified.
         ref_in_track = False
         if self.tracking:
+            mirror = self.mirror.mirror_path()
            _, branch = self.check_output([self.host_git, 'branch', '--list', self.tracking,
                                           '--contains', self.mirror.ref],
-                                         cwd=self.mirror.mirror)
+                                         cwd=mirror)
            if branch:
                ref_in_track = True
            else:
                _, tag = self.check_output([self.host_git, 'tag', '--list', self.tracking,
                                            '--contains', self.mirror.ref],
-                                          cwd=self.mirror.mirror)
+                                          cwd=mirror)
                if tag:
                    ref_in_track = True
 
@@ -749,7 +850,7 @@ class GitSource(Source):
 
         self.refresh_submodules()
         for mirror in self.submodules:
-            if not os.path.exists(mirror.mirror):
+            if not os.path.exists(mirror.mirror) and not os.path.exists(mirror.fetch_mirror):
                 return False
             if not mirror.has_ref():
                 return False
@@ -761,7 +862,7 @@ class GitSource(Source):
     # Assumes that we have our mirror and we have the ref which we point to
     #
     def refresh_submodules(self):
-        self.mirror.ensure()
+        self.mirror.ensure_fetchable()
         submodules = []
 
         for path, url in self.mirror.submodule_list():
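Reduced to plain git, the new ensure_fetchable() path checks whether the wanted
commit is advertised by the remote and, only if it is, fetches that single
commit at depth 1, falling back to the full clone of ensure_trackable()
otherwise. A standalone sketch of that sequence, using subprocess directly with
a hypothetical URL and target directory instead of the Source call helpers::

    import subprocess

    def shallow_fetch_commit(url, commit, target_dir):
        # Hypothetical standalone equivalent of GitMirror.ensure_fetchable()
        subprocess.run(['git', 'init', '--bare', target_dir], check=True)
        subprocess.run(['git', 'remote', 'add', '--mirror=fetch', 'origin', url],
                       cwd=target_dir, check=True)

        # Only refs advertised by the remote can be fetched at depth 1
        refs = subprocess.run(['git', 'ls-remote', 'origin'],
                              cwd=target_dir, check=True,
                              stdout=subprocess.PIPE).stdout.decode()
        advertised = None
        for line in refs.splitlines():
            sha, ref = line.split('\t', 1)
            if sha == commit and ref != 'HEAD' and not ref.endswith('^{}'):
                advertised = ref
                break
        if advertised is None:
            return False    # caller falls back to a full clone

        subprocess.run(['git', 'fetch', '--depth=1', 'origin', advertised],
                       cwd=target_dir, check=True)
        # Point HEAD at the commit so the bare repository can be cloned from
        subprocess.run(['git', 'update-ref', 'HEAD', commit],
                       cwd=target_dir, check=True)
        return True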
--- a/tests/sources/git.py
+++ b/tests/sources/git.py
@@ -28,6 +28,7 @@ import shutil
 from buildstream._exceptions import ErrorDomain
 from buildstream import _yaml
 from buildstream.plugin import CoreWarnings
+from buildstream.utils import url_directory_name
 
 from tests.testutils import cli, create_repo
 from tests.testutils.site import HAVE_GIT
@@ -1018,3 +1019,193 @@ def test_overwrite_rogue_tag_multiple_remotes(cli, tmpdir, datafiles):
 
     result = cli.run(project=project, args=['build', 'target.bst'])
     result.assert_success()
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, 'template'))
+def test_fetch_shallow(cli, tmpdir, datafiles):
+    project = str(datafiles)
+
+    repo = create_repo('git', str(tmpdir))
+    previous_ref = repo.create(os.path.join(project, 'repofiles'))
+
+    file1 = os.path.join(str(tmpdir), 'file1')
+    with open(file1, 'w') as f:
+        f.write('test\n')
+    ref = repo.add_file(file1)
+
+    source_config = repo.source_config(ref=ref)
+
+    # Write out our test target with a bad ref
+    element = {
+        'kind': 'import',
+        'sources': [
+            source_config
+        ]
+    }
+    _yaml.dump(element, os.path.join(project, 'target.bst'))
+
+    sources_dir = os.path.join(str(tmpdir), 'sources')
+    os.makedirs(sources_dir, exist_ok=True)
+    config = {
+        'sourcedir': sources_dir
+    }
+    cli.configure(config)
+
+    result = cli.run(project=project, args=[
+        'fetch', 'target.bst'
+    ])
+    result.assert_success()
+
+    cache_dir_name = url_directory_name(source_config['url'])
+    full_cache_path = os.path.join(sources_dir, 'git', cache_dir_name)
+    shallow_cache_path = os.path.join(sources_dir, 'git', '{}-{}'.format(cache_dir_name, ref))
+
+    assert os.path.exists(shallow_cache_path)
+    assert not os.path.exists(full_cache_path)
+
+    output = subprocess.run(['git', 'log', '--format=format:%H'],
+                            cwd=shallow_cache_path,
+                            stdout=subprocess.PIPE).stdout.decode('ascii')
+    assert output.splitlines() == [ref]
+
+    result = cli.run(project=project, args=[
+        'build', 'target.bst'
+    ])
+    result.assert_success()
+
+    output = subprocess.run(['git', 'log', '--format=format:%H'],
+                            cwd=shallow_cache_path,
+                            stdout=subprocess.PIPE).stdout.decode('ascii')
+    assert output.splitlines() == [ref]
+
+    assert os.path.exists(shallow_cache_path)
+    assert not os.path.exists(full_cache_path)
+
+    result = cli.run(project=project, args=[
+        'track', 'target.bst'
+    ])
+    result.assert_success()
+
+    assert os.path.exists(full_cache_path)
+    output = subprocess.run(['git', 'log', '--format=format:%H'],
+                            cwd=full_cache_path,
+                            stdout=subprocess.PIPE).stdout.decode('ascii')
+    assert output.splitlines() == [ref, previous_ref]
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, 'template'))
+def test_fetch_shallow_not_tagged(cli, tmpdir, datafiles):
+    """When a ref is not tagged and not head of branch on remote we cannot
+    get a shallow clone. It should automatically get a full clone.
+    """
+
+    project = str(datafiles)
+
+    repo = create_repo('git', str(tmpdir))
+    previous_ref = repo.create(os.path.join(project, 'repofiles'))
+
+    file1 = os.path.join(str(tmpdir), 'file1')
+    with open(file1, 'w') as f:
+        f.write('test\n')
+    ref = repo.add_file(file1)
+
+    source_config = repo.source_config(ref=previous_ref)
+
+    # Write out our test target with a bad ref
+    element = {
+        'kind': 'import',
+        'sources': [
+            source_config
+        ]
+    }
+    _yaml.dump(element, os.path.join(project, 'target.bst'))
+
+    sources_dir = os.path.join(str(tmpdir), 'sources')
+    os.makedirs(sources_dir, exist_ok=True)
+    config = {
+        'sourcedir': sources_dir
+    }
+    cli.configure(config)
+
+    result = cli.run(project=project, args=[
+        'fetch', 'target.bst'
+    ])
+    result.assert_success()
+
+    cache_dir_name = url_directory_name(source_config['url'])
+    full_cache_path = os.path.join(sources_dir, 'git', cache_dir_name)
+    shallow_cache_path = os.path.join(sources_dir, 'git', '{}-{}'.format(cache_dir_name, previous_ref))
+
+    assert not os.path.exists(shallow_cache_path)
+    assert os.path.exists(full_cache_path)
+
+    output = subprocess.run(['git', 'log', '--format=format:%H'],
+                            cwd=full_cache_path,
+                            stdout=subprocess.PIPE).stdout.decode('ascii')
+    assert output.splitlines() == [ref, previous_ref]
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, 'template'))
+def test_fetch_shallow_workspace_open(cli, tmpdir, datafiles):
+    """
+    Workspaces should get a full clone.
+    """
+    project = str(datafiles)
+
+    repo = create_repo('git', str(tmpdir))
+    previous_ref = repo.create(os.path.join(project, 'repofiles'))
+
+    file1 = os.path.join(str(tmpdir), 'file1')
+    with open(file1, 'w') as f:
+        f.write('test\n')
+    ref = repo.add_file(file1)
+
+    source_config = repo.source_config(ref=ref)
+
+    # Write out our test target with a bad ref
+    element = {
+        'kind': 'import',
+        'sources': [
+            source_config
+        ]
+    }
+    _yaml.dump(element, os.path.join(project, 'target.bst'))
+
+    sources_dir = os.path.join(str(tmpdir), 'sources')
+    os.makedirs(sources_dir, exist_ok=True)
+    config = {
+        'sourcedir': sources_dir
+    }
+    cli.configure(config)
+
+    result = cli.run(project=project, args=[
+        'fetch', 'target.bst'
+    ])
+    result.assert_success()
+
+    cache_dir_name = url_directory_name(source_config['url'])
+    full_cache_path = os.path.join(sources_dir, 'git', cache_dir_name)
+    shallow_cache_path = os.path.join(sources_dir, 'git', '{}-{}'.format(cache_dir_name, ref))
+
+    assert os.path.exists(shallow_cache_path)
+    assert not os.path.exists(full_cache_path)
+
+    output = subprocess.run(['git', 'log', '--format=format:%H'],
+                            cwd=shallow_cache_path,
+                            stdout=subprocess.PIPE).stdout.decode('ascii')
+    assert output.splitlines() == [ref]
+
+    workspace = os.path.join(str(tmpdir), 'workspace')
+
+    result = cli.run(project=project, args=[
+        'workspace', 'open', 'target.bst', '--directory', workspace
+    ])
+    result.assert_success()
+
+    output = subprocess.run(['git', 'log', '--format=format:%H'],
+                            cwd=workspace,
+                            stdout=subprocess.PIPE).stdout.decode('ascii')
+    assert output.splitlines() == [ref, previous_ref]