gecko_taskgraph.util package
Submodules
gecko_taskgraph.util.attributes module
- gecko_taskgraph.util.attributes.copy_attributes_from_dependent_job(dep_job, denylist=())
- gecko_taskgraph.util.attributes.is_try(params)
Determine whether this graph is being built on a try project or for mach try fuzzy.
- gecko_taskgraph.util.attributes.match_run_on_hg_branches(hg_branch, run_on_hg_branches)
Determine whether the given project is included in the run-on-hg-branches parameter. Allows ‘all’.
- gecko_taskgraph.util.attributes.match_run_on_projects(project, run_on_projects)
Determine whether the given project is included in the run-on-projects parameter, applying expansions for things like “integration” mentioned in the attribute documentation.
- gecko_taskgraph.util.attributes.release_level(project)
Determine whether this is a staging or production release.
- Return str:
One of “production” or “staging”.
- gecko_taskgraph.util.attributes.sorted_unique_list(*args)
Join one or more lists and return a sorted list of unique members.
- gecko_taskgraph.util.attributes.task_name(task)
gecko_taskgraph.util.backstop module
- gecko_taskgraph.util.backstop.is_backstop(params, push_interval=20, time_interval=240, trust_domain='gecko', integration_projects={'autoland'}, backstop_strategy='backstop')
Determines whether the given parameters represent a backstop push.
- Parameters:
push_interval (int) – Number of pushes between backstops.
time_interval (int) – Minutes between forced schedules. Use 0 to disable.
trust_domain (str) – “gecko” for Firefox, “comm” for Thunderbird
integration_projects (set) – Projects that use the backstop optimization.
- Returns:
True if this is a backstop, otherwise False.
- Return type:
bool
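For illustration, a hedged sketch of consulting this helper from a scheduling decision, assuming params is the decision task's parameters mapping (the wrapper function itself is hypothetical):

    from gecko_taskgraph.util.backstop import is_backstop

    def wants_full_task_graph(params):
        # Hypothetical helper: treat every 20th autoland push, or any push
        # more than 240 minutes after the previous backstop, as a backstop.
        return is_backstop(params, push_interval=20, time_interval=240)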
gecko_taskgraph.util.bugbug module
- exception gecko_taskgraph.util.bugbug.BugbugTimeoutException
Bases:
Exception
- gecko_taskgraph.util.bugbug.get_session()
- gecko_taskgraph.util.bugbug.push_schedules(branch, rev)
- gecko_taskgraph.util.bugbug.translate_group(group)
gecko_taskgraph.util.cached_tasks module
- gecko_taskgraph.util.cached_tasks.add_optimization(config, taskdesc, cache_type, cache_name, digest=None, digest_data=None)
Allow the results of this task to be cached. This adds index routes to the task so it can be looked up for future runs, and optimization hints so that cached artifacts can be found. Exactly one of digest and digest_data must be passed.
- Parameters:
config (TransformConfig) – The configuration for the kind being transformed.
taskdesc (dict) – The description of the current task.
cache_type (str) – The type of task result being cached.
cache_name (str) – The name of the object being cached.
digest (bytes or None) – A unique string identifying this version of the artifacts being generated. Typically this will be the hash of inputs to the task.
digest_data (list of bytes or None) – A list of bytes representing the inputs of this task. They will be concatenated and hashed to create the digest for this task.
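As a hedged illustration (the cache type, cache name, and surrounding transform are placeholders, not taken from the source), a transform might register a cached task by deriving the digest from its inputs:

    from gecko_taskgraph.util.cached_tasks import add_optimization

    def cache_example_task(config, taskdesc, input_bytes):
        # Exactly one of digest/digest_data may be passed; here the digest is
        # computed by concatenating and hashing the task's raw inputs.
        add_optimization(
            config,
            taskdesc,
            cache_type="example.v1",       # placeholder cache type
            cache_name="linux64-example",  # placeholder cache name
            digest_data=[input_bytes],
        )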
gecko_taskgraph.util.chunking module
Utility functions to handle test chunking.
- class gecko_taskgraph.util.chunking.BaseManifestLoader(params)
Bases:
object
- abstract get_manifests(flavor, subsuite, mozinfo)
Compute which manifests should run for the given flavor, subsuite and mozinfo.
This function returns skipped manifests separately so that more balanced chunks can be achieved by only considering “active” manifests in the chunking algorithm.
- Parameters:
flavor (str) – The suite to run. Values are defined by the ‘build_flavor’ key in moztest.resolve.TEST_SUITES.
subsuite (str) – The subsuite to run or ‘undefined’ to denote no subsuite.
mozinfo (frozenset) – Set of data in the form of (<key>, <value>) used for filtering.
- Returns:
A tuple of two manifest lists. The first is the set of active manifests (which will run at least one test). The second is a list of skipped manifests (all tests are skipped).
- class gecko_taskgraph.util.chunking.BugbugLoader(*args, **kwargs)
Bases:
DefaultLoader
Load manifests using metadata from the TestResolver, and then filter them based on a query to bugbug.
- CONFIDENCE_THRESHOLD = 0.7
- get_manifests(suite, mozinfo)
Compute which manifests should run for the given flavor, subsuite and mozinfo.
This function returns skipped manifests separately so that more balanced chunks can be achieved by only considering “active” manifests in the chunking algorithm.
- Parameters:
flavor (str) – The suite to run. Values are defined by the ‘build_flavor’ key in moztest.resolve.TEST_SUITES.
subsuite (str) – The subsuite to run or ‘undefined’ to denote no subsuite.
mozinfo (frozenset) – Set of data in the form of (<key>, <value>) used for filtering.
- Returns:
A tuple of two manifest lists. The first is the set of active manifests (which will run at least one test). The second is a list of skipped manifests (all tests are skipped).
- class gecko_taskgraph.util.chunking.DefaultLoader(params)
Bases:
BaseManifestLoader
Load manifests using metadata from the TestResolver.
- get_manifests(suite, frozen_mozinfo)
Compute which manifests should run for the given flavor, subsuite and mozinfo.
This function returns skipped manifests separately so that more balanced chunks can be achieved by only considering “active” manifests in the chunking algorithm.
- Parameters:
flavor (str) – The suite to run. Values are defined by the ‘build_flavor’ key in moztest.resolve.TEST_SUITES.
subsuite (str) – The subsuite to run or ‘undefined’ to denote no subsuite.
mozinfo (frozenset) – Set of data in the form of (<key>, <value>) used for filtering.
- Returns:
A tuple of two manifest lists. The first is the set of active manifests (which will run at least one test). The second is a list of skipped manifests (all tests are skipped).
- get_tests(suite)
- gecko_taskgraph.util.chunking.chunk_manifests(suite, platform, chunks, manifests)
Run the chunking algorithm.
- Parameters:
platform (str) – Platform used to find runtime info.
chunks (int) – Number of chunks to split manifests into.
manifests (list) – Manifests to chunk.
- Returns:
A list of length chunks where each item contains a list of manifests that run in that chunk.
- gecko_taskgraph.util.chunking.get_manifest_loader(name, params)
- gecko_taskgraph.util.chunking.get_runtimes(platform, suite_name)
- gecko_taskgraph.util.chunking.get_test_tags(config, env)
- gecko_taskgraph.util.chunking.guess_mozinfo_from_task(task, repo='', app_version='', test_tags=[])
Attempt to build a mozinfo dict from a task definition.
This won’t be perfect and many values used in the manifests will be missing. But it should cover most of the major ones and be “good enough” for chunking in the taskgraph.
- Parameters:
task (dict) – A task definition.
- Returns:
A dict that can be used as a mozinfo replacement.
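A hedged end-to-end sketch tying the chunking helpers above together; the loader name, suite, platform, and mozinfo values are illustrative only:

    from gecko_taskgraph.util.chunking import chunk_manifests, get_manifest_loader

    def split_into_chunks(params, mozinfo):
        loader = get_manifest_loader("default", params)  # loader name assumed
        # Only "active" manifests (those that run at least one test) are chunked,
        # so the resulting chunks stay balanced.
        active, skipped = loader.get_manifests(
            "mochitest-plain", frozenset(mozinfo.items())
        )
        return chunk_manifests(
            "mochitest-plain", "linux1804-64", chunks=4, manifests=active
        )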
gecko_taskgraph.util.declarative_artifacts module
- gecko_taskgraph.util.declarative_artifacts.get_geckoview_artifact_id(config, platform, package, update_channel=None)
- gecko_taskgraph.util.declarative_artifacts.get_geckoview_artifact_map(config, job)
- gecko_taskgraph.util.declarative_artifacts.get_geckoview_template_vars(config, platform, package, update_channel)
- gecko_taskgraph.util.declarative_artifacts.get_geckoview_upstream_artifacts(config, job, package, platform='')
gecko_taskgraph.util.dependencies module
- gecko_taskgraph.util.dependencies.chunk_locale_grouping(config, tasks)
Split by a chunk_locale (but also by platform, build-type, product)
This grouping is written for mac signing with notarization, but might also be useful elsewhere.
- gecko_taskgraph.util.dependencies.partner_repack_ids_grouping(config, tasks)
Split by partner_repack_ids (but also by platform, build-type, product)
This grouping is written for release-{eme-free,partner}-repack-signing.
- gecko_taskgraph.util.dependencies.platform_grouping(config, tasks)
- gecko_taskgraph.util.dependencies.single_grouping(config, tasks)
- gecko_taskgraph.util.dependencies.single_locale_grouping(config, tasks)
Split by a single locale (but also by platform, build-type, product)
The locale can be None (en-US build/signing/repackage), a single locale, or multiple locales per task, e.g. for l10n chunking. In the case of a task with, say, five locales, the task will show up in all five locale groupings.
This grouping is written for non-partner-repack beetmover, but might also be useful elsewhere.
- gecko_taskgraph.util.dependencies.skip_only_or_not(config, task)
Return True if we should skip this task based on only_ or not_ config.
gecko_taskgraph.util.docker module
- class gecko_taskgraph.util.docker.HashingWriter(writer)
Bases:
object
A file object with write capabilities that hashes the written data at the same time it passes down to a real file object.
- hexdigest()
- write(buf)
- class gecko_taskgraph.util.docker.ImagePathsMap(config_path, image_dir='/builds/worker/checkouts/gecko/taskcluster/docker')
Bases:
Mapping
ImagePathsMap contains the mapping of Docker image names to their context location in the filesystem. The register function allows Thunderbird to define additional images under comm/taskcluster.
- register(jobs_config_path, image_dir)
Register additional image_paths. In this case, there is no ‘jobs’ key in the loaded YAML as this file is loaded via jobs-from in kind.yml.
- class gecko_taskgraph.util.docker.VoidWriter
Bases:
object
A file object with write capabilities that does nothing with the written data.
- write(buf)
- gecko_taskgraph.util.docker.create_context_tar(topsrcdir, context_dir, out_path, image_name, args)
Create a context tarball.
A directory context_dir containing a Dockerfile will be assembled into a gzipped tar file at out_path.
We also scan the source Dockerfile for special syntax that influences context generation.
If a line in the Dockerfile has the form # %include <path>, the relative path specified on that line will be matched against files in the source repository and added to the context under the path topsrcdir/. If an entry is a directory, we add all files under that directory.
Returns the SHA-256 hex digest of the created archive.
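A hedged sketch of the %include mechanism; the paths, image name, and build args below are placeholders:

    from gecko_taskgraph.util.docker import create_context_tar

    # The Dockerfile in the context directory may contain a line such as
    #   # %include taskcluster/scripts/run-task
    # which copies that file from the source tree into the context.
    digest = create_context_tar(
        topsrcdir="/path/to/mozilla-central",
        context_dir="/path/to/mozilla-central/taskcluster/docker/example-image",
        out_path="/tmp/example-image-context.tar.gz",
        image_name="example-image",
        args={"DOCKER_IMAGE_PARENT": "ubuntu:22.04"},
    )
    print(digest)  # SHA-256 hex digest of the created archive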
- gecko_taskgraph.util.docker.docker_image(name, by_tag=False)
Resolve in-tree prebuilt docker image to <registry>/<repository>@sha256:<digest>, or <registry>/<repository>:<tag> if by_tag is True.
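For example (the image name is hypothetical):

    from gecko_taskgraph.util.docker import docker_image

    pinned = docker_image("example-base")               # <registry>/<repository>@sha256:<digest>
    tagged = docker_image("example-base", by_tag=True)  # <registry>/<repository>:<tag>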
- gecko_taskgraph.util.docker.docker_url(path, **kwargs)
- gecko_taskgraph.util.docker.generate_context_hash(topsrcdir, image_path, image_name, args)
Generates a sha256 hash for context directory used to build an image.
- gecko_taskgraph.util.docker.image_path(name)
- gecko_taskgraph.util.docker.parse_volumes(image)
Parse VOLUME entries from a Dockerfile for an image.
- gecko_taskgraph.util.docker.post_to_docker(tar, api_path, **kwargs)
POSTs a tar file to a given docker API path.
The tar argument can be anything that can be passed to requests.post() as data (e.g. iterator or file object). The extra keyword arguments are passed as arguments to the docker API.
- gecko_taskgraph.util.docker.stream_context_tar(topsrcdir, context_dir, out_file, image_name, args)
Like create_context_tar, but streams the tar file to the out_file file object.
gecko_taskgraph.util.hash module
- gecko_taskgraph.util.hash.get_file_finder(base_path)
- gecko_taskgraph.util.hash.hash_path(path)
Hash a single file.
Returns the SHA-256 hash in hex form.
- gecko_taskgraph.util.hash.hash_paths(base_path, patterns)
Given a list of path patterns, return a digest of the contents of all the corresponding files, similar to git tree objects or mercurial manifests.
Each file is hashed. The list of all hashes and file paths is then itself hashed to produce the result.
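A brief usage sketch; the base path and patterns are placeholders:

    from gecko_taskgraph.util.hash import hash_path, hash_paths

    one_file = hash_path("/path/to/mozilla-central/taskcluster/docker/example/Dockerfile")
    # Hash every file matching the patterns (relative to the base path), then hash
    # the combined list of per-file hashes and paths into a single digest.
    combined = hash_paths("/path/to/mozilla-central", ["taskcluster/docker/example/**"])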
gecko_taskgraph.util.hg module
- gecko_taskgraph.util.hg.calculate_head_rev(root)
- gecko_taskgraph.util.hg.find_hg_revision_push_info(repository, revision)
Given the parameters for this action and a revision, find the pushlog_id of the revision.
- gecko_taskgraph.util.hg.get_hg_commit_message(root, rev='.')
- gecko_taskgraph.util.hg.get_hg_revision_branch(root, revision)
Given the parameters for a revision, find the hg_branch (aka relbranch) of the revision.
- gecko_taskgraph.util.hg.get_json_pushchangedfiles(repository, revision)
- gecko_taskgraph.util.hg.get_push_data(repository, project, push_id_start, push_id_end)
gecko_taskgraph.util.partials module
- gecko_taskgraph.util.partials.find_localtest(fileUrls)
- gecko_taskgraph.util.partials.get_balrog_platform_name(platform)
Convert build platform names into balrog platform names.
Remove known values instead to catch aarch64 and other platforms that may be added.
- gecko_taskgraph.util.partials.get_builds(release_history, platform, locale)
Examine cached balrog release history and return the list of builds we need to generate diffs from
- gecko_taskgraph.util.partials.get_partials_artifacts_from_params(release_history, platform, locale)
- gecko_taskgraph.util.partials.get_partials_info_from_params(release_history, platform, locale)
- gecko_taskgraph.util.partials.get_release_builds(release, branch)
- gecko_taskgraph.util.partials.get_sorted_releases(product, branch)
Returns a list of release names from Balrog.
- Parameters:
product – Product name, AKA appName.
branch – Branch name, e.g. mozilla-central.
- Returns:
A sorted list of release names, most recent first.
- gecko_taskgraph.util.partials.populate_release_history(product, branch, maxbuilds=4, maxsearch=10, partial_updates=None)
gecko_taskgraph.util.partners module
- gecko_taskgraph.util.partners.GITHUB_API_ENDPOINT = 'https://api.github.com/graphql'
LOGIN_QUERY, MANIFEST_QUERY, and REPACK_CFG_QUERY are all written against the GitHub v4 API, which uses GraphQL. See https://developer.github.com/v4/
- gecko_taskgraph.util.partners.apply_partner_priority(config, jobs)
- gecko_taskgraph.util.partners.check_if_partners_enabled(config, tasks)
- gecko_taskgraph.util.partners.check_login(token)
- gecko_taskgraph.util.partners.fix_partner_config(orig_config)
- gecko_taskgraph.util.partners.generate_attribution_code(defaults, partner)
- gecko_taskgraph.util.partners.get_attribution_config(manifestRepo, token)
- gecko_taskgraph.util.partners.get_ftp_platform(platform)
- gecko_taskgraph.util.partners.get_partner_config_by_kind(config, kind)
Retrieve partner data starting from the manifest url, which points to a repository containing a default.xml that is intended to drive the Google tool ‘repo’. It descends into each partner repo to look up and parse the repack.cfg file(s).
Supports caching data by kind to avoid repeated requests, relying on the related kinds for partner repacking, signing, repackage, repackage signing all having the same kind prefix.
- gecko_taskgraph.util.partners.get_partner_config_by_url(manifest_url, kind, token, partner_subset=None)
Retrieve partner data starting from the manifest url, which points to a repository containing a default.xml that is intended to drive the Google tool ‘repo’. It descends into each partner repo to look up and parse the repack.cfg file(s).
If partner_subset is a list of sub_config names only return data for those.
Supports caching data by kind to avoid repeated requests, relying on the related kinds for partner repacking, signing, repackage, repackage signing all having the same kind prefix.
- gecko_taskgraph.util.partners.get_partner_url_config(parameters, graph_config)
- gecko_taskgraph.util.partners.get_partners(manifestRepo, token)
Given the url to a manifest repository, retrieve the default.xml and parse it into a list of partner repos.
- gecko_taskgraph.util.partners.get_partners_to_be_published(config)
- gecko_taskgraph.util.partners.get_repack_configs(repackRepo, token)
For a partner repository, retrieve all the repack.cfg files and parse them into a dict
- gecko_taskgraph.util.partners.get_repack_ids_by_platform(config, build_platform)
- gecko_taskgraph.util.partners.get_repo_params(repo)
Parse the organisation and repo name from an https or git url for a repo
- gecko_taskgraph.util.partners.get_token(params)
We use a Personal Access Token from Github to look up partner config. No extra scopes are needed on the token to read public repositories, but the ‘repo’ scope is needed to see private repositories. This is not fine-grained and also grants r/w access, but access can be revoked at the repo level.
- gecko_taskgraph.util.partners.locales_per_build_platform(build_platform, locales)
- gecko_taskgraph.util.partners.parse_config(data)
Parse a single repack.cfg file into a python dictionary.
data is the contents of the file, in “foo=bar\nbaz=buzz” style. We do some translation on the locales and platforms data; everything else is passed through.
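A hedged sketch of the expected input, using made-up keys alongside the locales and platforms entries mentioned above:

    from gecko_taskgraph.util.partners import parse_config

    # Hypothetical repack.cfg contents in the "foo=bar\nbaz=buzz" style.
    data = 'dist_id=example-partner\nlocales="en-US de fr"\nplatforms="linux-x86_64 win64"\n'
    repack_cfg = parse_config(data)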
- gecko_taskgraph.util.partners.query_api(query, token)
Make a query with a Github auth header, returning the json
gecko_taskgraph.util.perfile module
- gecko_taskgraph.util.perfile.perfile_number_of_chunks(is_try, try_task_config, files_changed, type)
gecko_taskgraph.util.perftest module
- gecko_taskgraph.util.perftest.is_external_browser(label)
gecko_taskgraph.util.platforms module
- gecko_taskgraph.util.platforms.architecture(build_platform)
- gecko_taskgraph.util.platforms.archive_format(build_platform)
Given a build platform, return the archive format used on the platform.
- gecko_taskgraph.util.platforms.executable_extension(build_platform)
Given a build platform, return the executable extension used on the platform.
- gecko_taskgraph.util.platforms.platform_family(build_platform)
Given a build platform, return the platform family (linux, macosx, etc.)
gecko_taskgraph.util.scriptworker module
Make scriptworker.cot.verify more user friendly by making scopes dynamic.
Scriptworker uses certain scopes to determine which sets of credentials to use. Certain scopes are restricted by branch in chain of trust verification, and are checked again at the script level. This file provides functions to adjust these scopes automatically by project; this makes pushing to try, forking a project branch, and merge day uplifts more user friendly.
In the future, we may adjust scopes by other settings as well, e.g. different scopes for push-to-candidates rather than push-to-releases, even if both happen on mozilla-beta and mozilla-release.
Additional configuration is found in the graph config.
- gecko_taskgraph.util.scriptworker.BALROG_ACTIONS = ('submit-locale', 'submit-toplevel', 'schedule', 'v2-submit-locale', 'v2-submit-toplevel')
Known balrog actions.
- gecko_taskgraph.util.scriptworker.BALROG_SCOPE_ALIAS_TO_PROJECT = [['nightly', {'comm-central', 'larch', 'mozilla-central', 'pine'}], ['beta', {'comm-beta', 'mozilla-beta'}], ['release', {'comm-release', 'mozilla-release'}], ['esr115', {'comm-esr115', 'mozilla-esr115'}], ['esr128', {'comm-esr128', 'mozilla-esr128'}]]
Map balrog scope aliases to sets of projects. This is a list of list-pairs, for ordering.
- gecko_taskgraph.util.scriptworker.BALROG_SERVER_SCOPES = {'aurora': 'balrog:server:aurora', 'beta': 'balrog:server:beta', 'default': 'balrog:server:dep', 'esr115': 'balrog:server:esr', 'esr128': 'balrog:server:esr', 'nightly': 'balrog:server:nightly', 'release': 'balrog:server:release'}
Map the balrog scope aliases to the actual server scopes.
- gecko_taskgraph.util.scriptworker.BEETMOVER_ACTION_SCOPES = {'default': 'beetmover:action:push-to-candidates', 'nightly': 'beetmover:action:push-to-nightly', 'nightly-larch': 'beetmover:action:push-to-nightly', 'nightly-pine': 'beetmover:action:push-to-nightly'}
Map the beetmover task aliases to the actual action scopes. The action scopes are generic across different repo types.
- gecko_taskgraph.util.scriptworker.BEETMOVER_APT_REPO_SCOPES = {'all-nightly-branches': 'beetmover:apt-repo:nightly', 'all-release-branches': 'beetmover:apt-repo:release', 'default': 'beetmover:apt-repo:dep'}
Map the beetmover scope aliases to the actual APT repo scopes. These are the scopes needed to import artifacts into the product delivery APT repos.
- gecko_taskgraph.util.scriptworker.BEETMOVER_BUCKET_SCOPES = {'all-nightly-branches': 'beetmover:bucket:nightly', 'all-release-branches': 'beetmover:bucket:release', 'default': 'beetmover:bucket:dep'}
Map the beetmover scope aliases to the actual bucket scopes.
- gecko_taskgraph.util.scriptworker.BEETMOVER_REPO_ACTION_SCOPES = {'default': 'beetmover:action:import-from-gcs-to-artifact-registry'}
Map the beetmover task aliases to the action scopes used to import artifacts from GCS into the artifact registry.
- gecko_taskgraph.util.scriptworker.BEETMOVER_SCOPE_ALIAS_TO_PROJECT = [['all-nightly-branches', {'comm-central', 'larch', 'mozilla-central', 'pine'}], ['all-release-branches', {'comm-beta', 'comm-esr115', 'comm-esr128', 'comm-release', 'mozilla-beta', 'mozilla-esr115', 'mozilla-esr128', 'mozilla-release'}]]
Map beetmover scope aliases to sets of projects.
- gecko_taskgraph.util.scriptworker.DEVEDITION_SIGNING_CERT_SCOPES = {'beta': 'signing:cert:nightly-signing', 'default': 'signing:cert:dep-signing'}
Map the devedition signing scope aliases to the actual signing cert scopes.
- gecko_taskgraph.util.scriptworker.SIGNING_SCOPE_ALIAS_TO_PROJECT = [['all-nightly-branches', {'comm-central', 'larch', 'mozilla-central', 'pine'}], ['all-release-branches', {'comm-beta', 'comm-esr115', 'comm-esr128', 'comm-release', 'mozilla-beta', 'mozilla-esr115', 'mozilla-esr128', 'mozilla-release'}]]
Map signing scope aliases to sets of projects.
- gecko_taskgraph.util.scriptworker.add_scope_prefix(config, scope)
Prepends the scriptworker scope prefix from the graph config.
- Parameters:
config (TransformConfig) – The configuration for the kind being transformed.
scope (string) – The suffix of the scope
- Returns:
the scope to use.
- Return type:
string
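A hedged sketch, assuming config is the TransformConfig passed to a transform; the wrapper function and scope suffix are illustrative:

    from gecko_taskgraph.util.scriptworker import add_scope_prefix

    def dep_signing_scope(config):
        # With a graph-config prefix such as "project:releng", this would yield
        # "project:releng:signing:cert:dep-signing".
        return add_scope_prefix(config, "signing:cert:dep-signing")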
- gecko_taskgraph.util.scriptworker.generate_artifact_registry_gcs_sources(dep)
- gecko_taskgraph.util.scriptworker.generate_beetmover_artifact_map(config, job, **kwargs)
Generate the beetmover artifact map.
Currently only applies to beetmover tasks.
- Parameters:
config (TransformConfig) – Current taskgraph configuration.
job (dict) – The current job being generated.
- Common kwargs:
platform (str) – The current build platform.
locale (str) – The current locale being beetmoved.
- Returns:
A list of dictionaries containing source->destination maps for beetmover.
- Return type:
list
- gecko_taskgraph.util.scriptworker.generate_beetmover_partials_artifact_map(config, job, partials_info, **kwargs)
Generate the beetmover partials artifact map.
Currently only applies to beetmover tasks.
- Parameters:
config (TransformConfig) – Current taskgraph configuration.
job (dict) – The current job being generated.
partials_info (dict) – Current partials and information about them in a dict.
- Common kwargs:
platform (str) – The current build platform.
locale (str) – The current locale being beetmoved.
- Returns:
A list of dictionaries containing source->destination maps for beetmover.
- Return type:
list
- gecko_taskgraph.util.scriptworker.generate_beetmover_upstream_artifacts(config, job, platform, locale=None, dependencies=None, **kwargs)
Generate the upstream artifacts for beetmover, using the artifact map.
Currently only applies to beetmover tasks.
- Parameters:
job (dict) – The current job being generated
dependencies (list) – A list of the job’s dependency labels.
platform (str) – The current build platform
locale (str) – The current locale being beetmoved.
- Returns:
A list of dictionaries conforming to the upstream_artifacts spec.
- Return type:
list
- gecko_taskgraph.util.scriptworker.get_balrog_server_scope(config, *, alias_to_project_map=[['nightly', {'comm-central', 'larch', 'mozilla-central', 'pine'}], ['beta', {'comm-beta', 'mozilla-beta'}], ['release', {'comm-release', 'mozilla-release'}], ['esr115', {'comm-esr115', 'mozilla-esr115'}], ['esr128', {'comm-esr128', 'mozilla-esr128'}]], alias_to_scope_map={'aurora': 'balrog:server:aurora', 'beta': 'balrog:server:beta', 'default': 'balrog:server:dep', 'esr115': 'balrog:server:esr', 'esr128': 'balrog:server:esr', 'nightly': 'balrog:server:nightly', 'release': 'balrog:server:release'})
Determine the restricted scope from config.params[‘project’].
- Parameters:
config (TransformConfig) – The configuration for the kind being transformed.
alias_to_project_map (list of lists) – each list pair contains the alias and the set of projects that match. This is ordered.
alias_to_scope_map (dict) – the alias-to-scope map
- Returns:
the scope to use.
- Return type:
string
- gecko_taskgraph.util.scriptworker.get_beetmover_action_scope(config, *, release_type_to_scope_map={'default': 'beetmover:action:push-to-candidates', 'nightly': 'beetmover:action:push-to-nightly', 'nightly-larch': 'beetmover:action:push-to-nightly', 'nightly-pine': 'beetmover:action:push-to-nightly'})
Determine the restricted scope from config.params[‘target_tasks_method’].
- Parameters:
config (TransformConfig) – The configuration for the kind being transformed.
release_type_to_scope_map (dict) – a map of release types to scopes
- Returns:
the scope to use.
- Return type:
string
- gecko_taskgraph.util.scriptworker.get_beetmover_apt_repo_scope(config, *, alias_to_project_map=[['all-nightly-branches', {'comm-central', 'larch', 'mozilla-central', 'pine'}], ['all-release-branches', {'comm-beta', 'comm-esr115', 'comm-esr128', 'comm-release', 'mozilla-beta', 'mozilla-esr115', 'mozilla-esr128', 'mozilla-release'}]], alias_to_scope_map={'all-nightly-branches': 'beetmover:apt-repo:nightly', 'all-release-branches': 'beetmover:apt-repo:release', 'default': 'beetmover:apt-repo:dep'})
Determine the restricted scope from config.params[‘project’].
- Parameters:
config (TransformConfig) – The configuration for the kind being transformed.
alias_to_project_map (list of lists) – each list pair contains the alias and the set of projects that match. This is ordered.
alias_to_scope_map (dict) – the alias-to-scope map
- Returns:
the scope to use.
- Return type:
string
- gecko_taskgraph.util.scriptworker.get_beetmover_bucket_scope(config, *, alias_to_project_map=[['all-nightly-branches', {'comm-central', 'larch', 'mozilla-central', 'pine'}], ['all-release-branches', {'comm-beta', 'comm-esr115', 'comm-esr128', 'comm-release', 'mozilla-beta', 'mozilla-esr115', 'mozilla-esr128', 'mozilla-release'}]], alias_to_scope_map={'all-nightly-branches': 'beetmover:bucket:nightly', 'all-release-branches': 'beetmover:bucket:release', 'default': 'beetmover:bucket:dep'})
Determine the restricted scope from config.params[‘project’].
- Parameters:
config (TransformConfig) – The configuration for the kind being transformed.
alias_to_project_map (list of lists) – each list pair contains the alias and the set of projects that match. This is ordered.
alias_to_scope_map (dict) – the alias-to-scope map
- Returns:
the scope to use.
- Return type:
string
- gecko_taskgraph.util.scriptworker.get_beetmover_repo_action_scope(config, *, release_type_to_scope_map={'default': 'beetmover:action:import-from-gcs-to-artifact-registry'})
Determine the restricted scope from config.params[‘target_tasks_method’].
- Parameters:
config (TransformConfig) – The configuration for the kind being transformed.
release_type_to_scope_map (dict) – a map of release types to scopes
- Returns:
the scope to use.
- Return type:
string
- gecko_taskgraph.util.scriptworker.get_devedition_signing_cert_scope(config, *, alias_to_project_map=[['beta', {'mozilla-beta'}]], alias_to_scope_map={'beta': 'signing:cert:nightly-signing', 'default': 'signing:cert:dep-signing'})
Determine the restricted scope from config.params[‘project’].
- Parameters:
config (TransformConfig) – The configuration for the kind being transformed.
alias_to_project_map (list of lists) – each list pair contains the alias and the set of projects that match. This is ordered.
alias_to_scope_map (dict) – the alias-to-scope map
- Returns:
the scope to use.
- Return type:
string
- gecko_taskgraph.util.scriptworker.get_phase_from_target_method(config, alias_to_tasks_map, alias_to_phase_map)
Determine the phase from config.params[‘target_tasks_method’].
- Parameters:
config (TransformConfig) – The configuration for the kind being transformed.
alias_to_tasks_map (list of lists) – each list pair contains the alias and the set of target methods that match. This is ordered.
alias_to_phase_map (dict) – the alias to phase map
- Returns:
the phase to use.
- Return type:
string
- gecko_taskgraph.util.scriptworker.get_release_config(config)
Get the build number and version for a release task.
Currently only applies to beetmover tasks.
- Parameters:
config (TransformConfig) – The configuration for the kind being transformed.
- Returns:
A dict containing both build_number and version. This can be used to update task.payload.
- Return type:
dict
- gecko_taskgraph.util.scriptworker.get_scope_from_project(config, alias_to_project_map, alias_to_scope_map)
Determine the restricted scope from config.params[‘project’].
- Parameters:
config (TransformConfig) – The configuration for the kind being transformed.
alias_to_project_map (list of lists) – each list pair contains the alias and the set of projects that match. This is ordered.
alias_to_scope_map (dict) – the alias-to-scope map
- Returns:
the scope to use.
- Return type:
string
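A hedged sketch of how the two maps interact, reusing alias and scope values from the defaults shown above (the wrapper function itself is hypothetical):

    from gecko_taskgraph.util.scriptworker import get_scope_from_project

    def example_signing_cert_scope(config):
        # Ordered alias -> project-set pairs; the first alias whose set contains
        # config.params["project"] wins, otherwise the "default" scope is used.
        alias_to_project_map = [
            ["all-nightly-branches", {"mozilla-central", "comm-central"}],
            ["all-release-branches", {"mozilla-beta", "mozilla-release"}],
        ]
        alias_to_scope_map = {
            "all-nightly-branches": "signing:cert:nightly-signing",
            "all-release-branches": "signing:cert:release-signing",
            "default": "signing:cert:dep-signing",
        }
        return get_scope_from_project(config, alias_to_project_map, alias_to_scope_map)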
- gecko_taskgraph.util.scriptworker.get_scope_from_release_type(config, release_type_to_scope_map)
Determine the restricted scope from config.params[‘target_tasks_method’].
- Parameters:
config (TransformConfig) – The configuration for the kind being transformed.
release_type_to_scope_map (dict) – a map of release types to scopes
- Returns:
the scope to use.
- Return type:
string
- gecko_taskgraph.util.scriptworker.get_signing_cert_scope(config, *, alias_to_project_map=[['all-nightly-branches', {'comm-central', 'larch', 'mozilla-central', 'pine'}], ['all-release-branches', {'comm-beta', 'comm-esr115', 'comm-esr128', 'comm-release', 'mozilla-beta', 'mozilla-esr115', 'mozilla-esr128', 'mozilla-release'}]], alias_to_scope_map={'all-nightly-branches': 'signing:cert:nightly-signing', 'all-release-branches': 'signing:cert:release-signing', 'default': 'signing:cert:dep-signing'})
Determine the restricted scope from config.params[‘project’].
- Parameters:
config (TransformConfig) – The configuration for the kind being transformed.
alias_to_project_map (list of lists) – each list pair contains the alias and the set of projects that match. This is ordered.
alias_to_scope_map (dict) – the alias-to-scope map
- Returns:
the scope to use.
- Return type:
string
- gecko_taskgraph.util.scriptworker.get_signing_cert_scope_per_platform(build_platform, is_shippable, config)
- gecko_taskgraph.util.scriptworker.with_scope_prefix(f)
Wraps a function, calling add_scope_prefix() on the result of calling the wrapped function.
- Parameters:
f (callable) – A function that takes a config and some keyword arguments, and returns a scope suffix.
- Returns:
the wrapped function
- Return type:
callable
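A minimal decorator sketch under the same assumptions; the function name and returned suffix are illustrative:

    from gecko_taskgraph.util.scriptworker import with_scope_prefix

    @with_scope_prefix
    def example_action_scope(config, release_type="default"):
        # Returns a bare suffix; the decorator runs add_scope_prefix() on it,
        # producing e.g. "<prefix>:beetmover:action:push-to-candidates".
        return "beetmover:action:push-to-candidates"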
gecko_taskgraph.util.signed_artifacts module
Defines artifacts to sign before repackage.
- gecko_taskgraph.util.signed_artifacts.generate_specifications_of_artifacts_to_sign(config, job, keep_locale_template=True, kind=None, dep_kind=None)
- gecko_taskgraph.util.signed_artifacts.get_geckoview_artifacts_to_sign(config, job)
- gecko_taskgraph.util.signed_artifacts.get_signed_artifacts(input, formats, behavior=None)
Get the list of signed artifacts for the given input and formats.
- gecko_taskgraph.util.signed_artifacts.is_mac_signing_king(kind)
- gecko_taskgraph.util.signed_artifacts.is_notarization_kind(kind)
- gecko_taskgraph.util.signed_artifacts.is_partner_kind(kind)
gecko_taskgraph.util.taskcluster module
- gecko_taskgraph.util.taskcluster.find_task(index_path, use_proxy=False)
- gecko_taskgraph.util.taskcluster.insert_index(index_path, task_id, data=None, use_proxy=False)
- gecko_taskgraph.util.taskcluster.list_task_group_complete_tasks(task_group_id)
- gecko_taskgraph.util.taskcluster.list_task_group_incomplete_task_ids(task_group_id)
- gecko_taskgraph.util.taskcluster.list_task_group_tasks(task_group_id)
Generate the tasks in a task group
- gecko_taskgraph.util.taskcluster.state_task(task_id, use_proxy=False)
Gets the state of a task given a task_id.
In testing mode, just logs that it would have retrieved state. This is a subset of the data returned by status_task().
- Parameters:
task_id (str) – A task id.
use_proxy (bool) – Whether to use taskcluster-proxy (default: False)
- Returns:
The state of the task, one of pending, running, completed, failed, exception, unknown.
- Return type:
str
- gecko_taskgraph.util.taskcluster.status_task(task_id, use_proxy=False)
Gets the status of a task given a task_id.
In testing mode, just logs that it would have retrieved status.
- Parameters:
task_id (str) – A task id.
use_proxy (bool) – Whether to use taskcluster-proxy (default: False)
- Returns:
A dictionary object as defined here: https://docs.taskcluster.net/docs/reference/platform/queue/api#status
- Return type:
dict
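For example (the task id is a placeholder), combining state_task() and status_task():

    from gecko_taskgraph.util.taskcluster import state_task, status_task

    TASK_ID = "abcDEFghijKLmnOPqrstuv"  # placeholder task id

    state = state_task(TASK_ID)            # e.g. "completed"
    if state in ("failed", "exception"):
        details = status_task(TASK_ID)     # full status dict when more detail is needed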
- gecko_taskgraph.util.taskcluster.trigger_hook(hook_group_id, hook_id, hook_payload)
gecko_taskgraph.util.taskgraph module
Tools for interacting with existing taskgraphs.
- gecko_taskgraph.util.taskgraph.find_decision_task(parameters, graph_config)
Given the parameters for this action, find the taskId of the decision task
- gecko_taskgraph.util.taskgraph.find_existing_tasks(previous_graph_ids)
- gecko_taskgraph.util.taskgraph.find_existing_tasks_from_previous_kinds(full_task_graph, previous_graph_ids, rebuild_kinds)
Given a list of previous decision/action taskIds and kinds to ignore from the previous graphs, return a dictionary of labels-to-taskids to use as existing_tasks in the optimization step.
gecko_taskgraph.util.verify module
- class gecko_taskgraph.util.verify.DocPaths(paths=NOTHING)
Bases:
object
- add(path)
Projects that make use of Firefox’s taskgraph can extend it with their own task kinds by registering additional paths for documentation. documentation_paths.add() needs to be called by the project’s Taskgraph registration function. See taskgraph.config.
- get_files(filename)
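A hedged sketch of the registration hook described above, with an illustrative documentation path:

    from gecko_taskgraph.util.verify import documentation_paths

    def register(graph_config):
        # Hypothetical project registration function (see taskgraph.config):
        # expose the project's own kind documentation to the verify checks.
        documentation_paths.add("comm/taskcluster/docs")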
- gecko_taskgraph.util.verify.verify_aliases(task, taskgraph, scratch_pad, graph_config, parameters)
This function verifies that aliases are not reused.
- gecko_taskgraph.util.verify.verify_always_optimized(task, taskgraph, scratch_pad, graph_config, parameters)
This function ensures that always-optimized tasks have been optimized.
- gecko_taskgraph.util.verify.verify_attributes(task, taskgraph, scratch_pad, graph_config, parameters)
- gecko_taskgraph.util.verify.verify_dependency_tiers(task, taskgraph, scratch_pad, graph_config, parameters)
- gecko_taskgraph.util.verify.verify_docs(filename, identifiers, appearing_as)
Look for identifiers of the type appearing_as in the files returned by documentation_paths.get_files(). Firefox will have a single file in a list, but projects such as Thunderbird can have documentation in another location and may return multiple files.
- gecko_taskgraph.util.verify.verify_kinds_docs(kinds)
- gecko_taskgraph.util.verify.verify_parameters_docs(parameters)
- gecko_taskgraph.util.verify.verify_required_signoffs(task, taskgraph, scratch_pad, graph_config, parameters)
Tasks with required signoffs can’t be dependencies of tasks with fewer required signoffs.
- gecko_taskgraph.util.verify.verify_routes_notification_filters(task, taskgraph, scratch_pad, graph_config, parameters)
This function ensures that only understood filters for notifications are specified.
See: https://firefox-ci-tc.services.mozilla.com/docs/manual/using/task-notifications
- gecko_taskgraph.util.verify.verify_run_known_projects(task, taskgraph, scratch_pad, graph_config, parameters)
Validates the inputs in run-on-projects.
We should never let ‘try’ (or ‘try-comm-central’) be in run-on-projects even though it is valid because it is not considered for try pushes. While here we also validate for other unknown projects or typos.
- gecko_taskgraph.util.verify.verify_run_using()
- gecko_taskgraph.util.verify.verify_shippable_no_sccache(task, taskgraph, scratch_pad, graph_config, parameters)
- gecko_taskgraph.util.verify.verify_task_graph_symbol(task, taskgraph, scratch_pad, graph_config, parameters)
This function verifies that the tuple (collection.keys(), machine.platform, groupSymbol, symbol) is unique for a target task graph.
- gecko_taskgraph.util.verify.verify_test_packaging(task, taskgraph, scratch_pad, graph_config, parameters)
- gecko_taskgraph.util.verify.verify_trust_domain_v2_routes(task, taskgraph, scratch_pad, graph_config, parameters)
This function ensures that any two tasks have distinct index.{trust-domain}.v2 routes.
gecko_taskgraph.util.workertypes module
- gecko_taskgraph.util.workertypes.get_worker_type(graph_config, parameters, worker_type)
Get the worker type provisioner and worker-type, optionally evaluating aliases from the graph config.
- gecko_taskgraph.util.workertypes.worker_type_implementation(graph_config, parameters, worker_type)
Get the worker implementation and OS for the given workerType, where the OS represents the host system, not the target OS, in the case of cross-compiles.