mozbuild.vendor package

Submodules

mozbuild.vendor.host_angle module

class mozbuild.vendor.host_angle.AngleHost(manifest)

Bases: BaseHost

upstream_commit(revision)
upstream_snapshot(revision)
upstream_tag(revision)

Temporarily clone the repo to get the latest tag and timestamp

mozbuild.vendor.host_base module

class mozbuild.vendor.host_base.BaseHost(manifest)

Bases: object

download_single_file(url, destination)
upstream_path_to_file(revision, filepath)
upstream_release_artifact(revision, release_artifact)
upstream_snapshot(revision)
upstream_tag(revision)

Temporarily clone the repo to get the latest tag and timestamp

mozbuild.vendor.host_codeberg module

class mozbuild.vendor.host_codeberg.CodebergHost(manifest)

Bases: BaseHost

upstream_commit(revision)

Query the codeberg api for a git commit id and timestamp.

upstream_snapshot(revision)

mozbuild.vendor.host_git module

class mozbuild.vendor.host_git.GitHost(*args, **kwargs)

Bases: BaseHost

upstream_commit(revision)
upstream_snapshot(revision)

mozbuild.vendor.host_github module

class mozbuild.vendor.host_github.GitHubHost(manifest)

Bases: BaseHost

api_get(path)

Generic Github API get.

upstream_commit(revision)

Query the github api for a git commit id and timestamp.

upstream_path_to_file(revision, filepath)
upstream_release_artifact(revision, release_artifact)
upstream_snapshot(revision)

mozbuild.vendor.host_gitlab module

class mozbuild.vendor.host_gitlab.GitLabHost(manifest)

Bases: BaseHost

upstream_commit(revision)

Query the gitlab api for a git commit id and timestamp.

upstream_snapshot(revision)

mozbuild.vendor.host_googlesource module

class mozbuild.vendor.host_googlesource.GoogleSourceHost(manifest)

Bases: BaseHost

upstream_commit(revision)

Query for a git commit and timestamp.

upstream_path_to_file(revision, filepath)
upstream_snapshot(revision)

mozbuild.vendor.mach_commands module

mozbuild.vendor.mach_commands.check_modified_files(command_context)

Ensure that there aren’t any uncommitted changes to files in the working copy, since we’re going to change some state on the user’s behalf.

mozbuild.vendor.mach_commands.vendor(command_context, library, revision, ignore_modified=False, check_for_update=False, add_to_exports=False, force=False, verify=False, patch_mode=None)

Vendor third-party dependencies into the source repository.

Vendoring rust and python can be done with ./mach vendor [rust/python]. Vendoring other libraries can be done with ./mach vendor [arguments] path/to/file.yaml
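
For example (the moz.yaml path below is illustrative, not a required location):

    ./mach vendor rust
    ./mach vendor media/libdav1d/moz.yaml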

mozbuild.vendor.mach_commands.vendor_python(command_context, keep_extra_files, add, remove, upgrade, upgrade_package, force)
mozbuild.vendor.mach_commands.vendor_rust(command_context, **kwargs)

mozbuild.vendor.moz_yaml module

class mozbuild.vendor.moz_yaml.License

Bases: object

Voluptuous validator which verifies the license(s) are valid as per our allow list.

exception mozbuild.vendor.moz_yaml.MozYamlVerifyError(filename, error)

Bases: Exception

mozbuild.vendor.moz_yaml.RE_FIELD(string, pos=0, endpos=9223372036854775807)

Scan through string looking for a match, and return a corresponding match object instance.

Return None if no position in the string matches.

mozbuild.vendor.moz_yaml.RE_SECTION(string, pos=0, endpos=9223372036854775807)

Scan through string looking for a match, and return a corresponding match object instance.

Return None if no position in the string matches.

class mozbuild.vendor.moz_yaml.UpdateActions

Bases: object

Voluptuous validator which verifies the update action(s) are valid.

class mozbuild.vendor.moz_yaml.UpdatebotTasks

Bases: object

Voluptuous validator which verifies the updatebot task(s) are valid.

mozbuild.vendor.moz_yaml.VALID_SOURCE_HOSTS = ['gitlab', 'googlesource', 'github', 'angle', 'codeberg', 'git', 'yaml-dir']

---
# Third-Party Library Template
# All fields are mandatory unless otherwise noted

# Version of this schema
schema: 1

bugzilla:
  # Bugzilla product and component for this directory and subdirectories
  product: product name
  component: component name

# Document the source of externally hosted code
origin:

  # Short name of the package/library
  name: name of the package

  description: short (one line) description

  # Full URL for the package's homepage/etc
  # Usually different from repository url
  url: package's homepage url

  # Human-readable identifier for this version/release
  # Generally "version NNN", "tag SSS", "bookmark SSS"
  release: identifier

  # Revision to pull in
  # Must be a long or short commit SHA (long preferred)
  revision: sha

  # The package's license, where possible using the mnemonic from
  # https://spdx.org/licenses/
  # Multiple licenses can be specified (as a YAML list)
  # A "LICENSE" file must exist containing the full license text
  license: MPL-2.0

  # If the package's license is specified in a particular file,
  # this is the name of the file.
  # optional
  license-file: COPYING

  # If there are any mozilla-specific notes you want to put
  # about a library, they can be put here.
  notes: Notes about the library

# Configuration for the automated vendoring system.
# optional
vendoring:

  # Repository URL to vendor from
  # eg. https://github.com/kinetiknz/nestegg
  # Any repository host can be specified here, however initially we'll only
  # support automated vendoring from selected sources.
  url: source url (generally repository clone url)

  # Type of hosting for the upstream repository
  # Valid values are 'gitlab', 'github', googlesource
  source-hosting: gitlab

  # Type of Vendoring
  # This is either 'regular', 'individual-files', or 'rust'
  # If omitted, will default to 'regular'
  flavor: rust

  # Type of git reference (commit, tag) to track updates from.
  # You cannot use tag tracking with the individual-files flavor
  # If omitted, will default to tracking commits.
  tracking: commit

  # When using tag tracking (only on Github currently) use a release artifact
  # for the source code instead of the automatically built git-archive exports.
  # The source repository must build these artifacts with consistent filenames
  # for every tag. This is useful when the Github repository uses submodules
  # since they are not included in the git-archives.
  # Substitution is performed on the filename, {tag} is replaced with the tag name.
  # optional
  release-artifact: "rnp-{tag}.tar.gz"

  # Base directory of the location where the source files will live in-tree.
  # If omitted, will default to the location the moz.yaml file is in.
  vendor-directory: third_party/directory

  # Allows skipping certain steps of the vendoring process.
  # Most useful if e.g. vendoring upstream is complicated and should be done by a script
  # The valid steps that can be skipped are listed below
  skip-vendoring-steps:
    - fetch
    - keep
    - include
    - exclude
    - move-contents
    - hg-add
    - spurious-check
    - update-moz-yaml
    - update-moz-build

  # List of patch files to apply after vendoring. Applied in the order
  # specified, and alphabetically if globbing is used. Patches must apply
  # cleanly before changes are pushed.
  # Patch files should be relative to the vendor-directory rather than the gecko
  # root directory.
  # All patch files are implicitly added to the keep file list.
  # optional
  patches:
    - file
    - path/to/file
    - path/*.patch
    - path/**   # Captures all files and subdirectories below path
    - path/*    # Captures all files but _not_ subdirectories below path. Equivalent to path/

  # List of files that are not removed from the destination directory while vendoring
  # in a new version of the library. Intended for mozilla files not present in upstream.
  # Implicitly contains "moz.yaml", "moz.build", and any files referenced in
  # "patches"
  # optional
  keep:
    - file
    - path/to/file
    - another/path
    - *.mozilla

  # Files/paths that will not be vendored from the upstream repository
  # Implicitly contains ".git", and ".gitignore"
  # optional
  exclude:
    - file
    - path/to/file
    - another/path
    - docs
    - src/*.test

  # Files/paths that will always be vendored from source repository, even if
  # they would otherwise be excluded by "exclude".
  # optional
  include:
    - file
    - path/to/file
    - another/path
    - docs/LICENSE.*

  # Files that are modified as part of the update process.
  # To avoid creating updates that don't update anything, ./mach vendor will detect
  # if any in-tree files have changed. If there are files that are always changed
  # during an update process (e.g. version numbers or source revisions), list them
  # here to avoid having them counted as substantive changes.
  # This field does NOT support directories or globbing
  # optional
  generated:
    - '{yaml_dir}/vcs_version.h'

  # If neither "exclude" or "include" are set, all files will be vendored
  # Files/paths in "include" will always be vendored, even if excluded
  # eg. excluding "docs/" then including "docs/LICENSE" will vendor just the
  #     LICENSE file from the docs directory

  # All three file/path parameters ("keep", "exclude", and "include") support
  # filenames, directory names, and globs/wildcards.

  # Actions to take after updating. Applied in order.
  # The action subfield is required. It must be one of:
  #   - copy-file
  #   - move-file
  #   - move-dir
  #   - replace-in-file
  #   - replace-in-file-regex
  #   - delete-path
  #   - run-script
  # Unless otherwise noted, all subfields of action are required.
  #
  # If the action is copy-file, move-file, or move-dir:
  #   from is the source file
  #   to is the destination
  #
  # If the action is replace-in-file or replace-in-file-regex:
  #   pattern is what in the file to search for. It is an exact string match.
  #   with is the string to replace it with. Accepts the special keyword
  #     '{revision}' for the commit we are updating to.
  #   file is the file to replace it in.
  #
  # If the action is delete-path:
  #   path is the file or directory to recursively delete
  #
  # If the action is run-script:
  #   script is the script to run
  #   cwd is the directory the script should run with as its cwd
  #   args is a list of arguments to pass to the script
  #
  # If the action is run-command:
  #   command is the command to run
  #     Unlike run-script, command is _not_ processed to be relative
  #     to the vendor directory, and is passed directly to python's
  #     execution code without any path substitution or manipulation
  #   cwd is the directory the command should run with as its cwd
  #   args is a list of arguments to pass to the command
  #
  # Unless specified otherwise, all files/directories are relative to the
  # vendor-directory. If the vendor-directory is different from the
  # directory of the yaml file, the keyword '{yaml_dir}' may be used
  # to make the path relative to that directory.
  # 'run-script' supports the additional keyword {cwd} which, if used,
  # must only be used at the beginning of the path.
  #
  # optional
  update-actions:
    - action: copy-file
      from: include/vcs_version.h.in
      to: '{yaml_dir}/vcs_version.h'

    - action: replace-in-file
      pattern: '@VCS_TAG@'
      with: '{revision}'
      file: '{yaml_dir}/vcs_version.h'

    - action: delete-path
      path: '{yaml_dir}/config'

    - action: run-script
      script: '{cwd}/generate_sources.sh'
      cwd: '{yaml_dir}'

# Configuration for the automatic updating system.
# optional
updatebot:

  # TODO: allow multiple users to be specified
  # Phabricator username for a maintainer of the library, used for assigning
  # reviewers. For a review group, preface with #, such as "#build"
  maintainer-phab: tjr

  # Bugzilla email address for a maintainer of the library, used for needinfos
  maintainer-bz: tom@mozilla.com

  # Optional: A preset for ./mach try to use. If present, fuzzy-query and fuzzy-paths will
  # be ignored. If it, fuzzy-query, and fuzzy-paths are all omitted, ./mach try auto will be used
  try-preset: media

  # Optional: A query string for ./mach try fuzzy. If try-preset and fuzzy-paths are also omitted
  # then ./mach try auto will be used
  fuzzy-query: media

  # Optional: An array of test paths for ./mach try fuzzy. If try-preset and fuzzy-query are also
  # omitted then ./mach try auto will be used
  fuzzy-paths: ['media']

  # The tasks that Updatebot can run. Only one of each task is currently permitted
  # optional
  tasks:
    - type: commit-alert
      branch: upstream-branch-name
      cc: ["bugzilla@email.address", "another@example.com"]
      needinfo: ["bugzilla@email.address", "another@example.com"]
      enabled: True
      filter: security
      frequency: every
      platform: windows
      blocking: 1234
    - type: vendoring
      branch: master
      enabled: False

      # frequency can be 'every', 'release', 'N weeks', 'N commits'
      # or 'N weeks, M commits' requiring satisfying both constraints.
      frequency: 2 weeks

mozbuild.vendor.moz_yaml.load_moz_yaml(filename, verify=True, require_license_file=True)

Loads and verifies the specified manifest.

mozbuild.vendor.rewrite_mozbuild module

Problem:

./mach vendor needs to be able to add or remove files from moz.build files automatically to be able to effectively update a library automatically and send useful try runs in.

So far, it has been difficult to do that.

Why:
  • Some files need to go into UNIFIED_SOURCES vs SOURCES

  • Some files are os-specific, and need to go into per-OS conditionals

  • Some files are both UNIFIED_SOURCES/SOURCES sensitive and OS-specific.

Proposal:

Design an algorithm that maps a third party library file to a suspected moz.build location. Run the algorithm on all files specified in all third party libraries’ moz.build files. See if the proposed place in the moz.build file matches the actual place.

Initial Algorithm

Given a file, which includes the filename and the path from the gecko root, we want to find the correct moz.build file and the correct location within that file:

  • Take the path of the file and iterate up the directory tree, looking for moz.build files as we go. Consider each of these moz.build files, starting with the one closest to the file.

  • Within a moz.build file, identify the SOURCES or UNIFIED_SOURCES block(s) that contain a file in the same directory path as the file to be added. If there is only one such block, use it.

  • If there are multiple blocks, look at the files within each block and note the longest common prefix (including partial filenames - if we only compared full directories the result would be the same as the prior step and we would not narrow the results down). Use the block containing the longest prefix. (We call this ‘guessing’.)
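
A condensed sketch of that matching logic, assuming the source assignments have already been parsed out of a candidate moz.build file (the helper name and data shape are illustrative, not the module’s actual API):

    import os

    def propose_source_block(target_filename, source_assignments):
        # source_assignments: source-assignment-location -> list of normalized filenames
        target_dir = os.path.dirname(target_filename)
        # Keep only the blocks that already contain a file from the same directory.
        candidates = {
            loc: files
            for loc, files in source_assignments.items()
            if target_dir in {os.path.dirname(f) for f in files}
        }
        if len(candidates) == 1:
            return next(iter(candidates))
        # 'Guessing': prefer the block whose files share the longest common prefix
        # (including partial filenames) with the file being added.
        def longest_prefix(files):
            return max(len(os.path.commonprefix([target_filename, f])) for f in files)
        return max(candidates, key=lambda loc: longest_prefix(candidates[loc]), default=None)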

Result of the proposal:

The initial implementation works on 1675 of 1977 eligible files. The files it does not work on include:

  • General failures, such as when we find that avutil.cpp wants to be next to adler32.cpp, but avutil.cpp is in SOURCES and adler32.cpp is in UNIFIED_SOURCES. (And many similar cases.)

  • per-cpu-feature files, where only a single file is added under a conditional

  • When guessing, because of a len(…) > longest_so_far comparison, we would prefer the first block we found. Changing this to prefer UNIFIED_SOURCES in the event of a tie yielded 17 additional correct assignments (about a 1% improvement).

  • As a result of the change immediately above: when guessing, given equal prefixes we would prefer a UNIFIED_SOURCES block over other blocks, even if the other blocks are longer. Changing this (again) to prefer the block containing more files yielded 49 additional correct assignments (about a 2.5% improvement).

The files that are ineligible for consideration are:
  • Those in libwebrtc

  • Those specified in source assignments composed of generators (e.g. [f for f in ‘%.c’])

  • Those specified in source assignments to subscripted variables (e.g. SOURCES += foo[‘x86_files’])

We needed to iterate up the directory and look at a different moz.build file _zero_ times.

This indicates this code is probably not needed, and therefore we will remove it from the algorithm.

We needed to guess based on the longest prefix 944 times, indicating that this code is absolutely crucial and should be double-checked. (And indeed, upon double-checking it, bugs were identified.)

After some initial testing, it was determined that this code completely fell down when the vendoring directory differed from the moz.yaml directory (definitions below.) The code was slightly refactored to handle this case, primarily by (a) re-inserting the logic to check multiple moz.build files instead of the first and (b) handling some complicated normalization notions (details in comments).

Slightly Improved Algorithm Changes:

  • Don’t bother iterating up the directory tree looking for moz.build files; just take the first.

  • When guessing, in the event of a common-prefix tie, prefer the block containing more files.

With these changes, we now successfully matched 1724 of 1977 files.

CODE CONCEPTS

source-assignment

An assignment of files to a SOURCES or UNIFIED_SOURCES variable, such as SOURCES += [‘ffpvx.cpp’]

We specifically look only for these two variable names to avoid identifying things such as CXX_FLAGS.

Sometimes, however, there is an intermediary variable, such as SOURCES += celt_filenames. In this situation we find the celt_filenames assignment and treat it as a ‘source-assignment’.

source-assignment-location

source-assignment-location is a human-readable string that identifies where in the moz.build file the source-assignment is. It can be used to visually match the location upon manual inspection, and, given a source-assignment-location, to re-identify it when iterating over all source-assignments in a file.

The actual string consists of the path from the root of the moz.build file to the source-assignment, plus a suffix number.

We suffix the final value with an incrementing counter. This is to support moz.build files that, for whatever reason, use multiple SOURCES += [] lists in the same basic block. This index is per-file, so no two assignments in the same file (even if they have separate locations) should have the same suffix.

For example:

When SOURCES += [‘ffpvx.cpp’] appears as the first line of the file (or any other unindented location) its source-assignment-location will be > SOURCES 1.

When SOURCES += [‘ffpvx.cpp’] appears inside a conditional such as CONFIG[‘OS_TARGET’] == ‘WINNT’ then its source-assignment-location will be > if CONFIG[‘OS_TARGET’] == ‘WINNT’ > SOURCES 1.

When SOURCES += [‘ffpvx.cpp’] appears as the second line of the file, and a different SOURCES += [] was the first line, then its source-assignment-location will be “> SOURCES 2”.

No two source-assignments may have the same source-assignment-location. If they do, we raise an assert.

file vs filename

A ‘filename’ is a string specifying the name and sometimes the path of a file. A ‘file’ is an object you get from open()-ing a filename.

A variable that is a string should always use ‘filename’

vendoring directory vs moz.yaml directory

In many cases, a library’s moz.yaml file, moz.build file(s), and sources files will all live under a single directory. e.g. libjpeg

In other cases, a library’s source files are in one directory (we call this the ‘vendoring directory’) and the moz.yaml file and moz.build file(s) are in another directory (we call this the moz.yaml directory). e.g. libdav1d

normalized-filename

A filename is ‘normalized’ if it has been expanded to the full path from the gecko root. This requires a moz.build file.

For example a filename lib/opus.c may be specified inside the media/libopus/moz.build file. The filename is normalized by os.path.join()-ing the dirname of the moz.build file (i.e. media/libopus) to the filename, resulting in media/libopus/lib/opus.c

A filename that begins with ‘/’ is presumed to already be specified relative to the gecko root, and therefore is not modified.
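
A minimal sketch of these two rules for the simple (single-directory) case; the separate vendoring/moz.yaml directory case described below needs more care:

    import os

    def normalize_filename_sketch(normalized_mozbuild_filename, filename):
        # Filenames beginning with '/' are already relative to the gecko root.
        if filename.startswith("/"):
            return filename
        # Otherwise, prepend the directory containing the moz.build file.
        return os.path.join(os.path.dirname(normalized_mozbuild_filename), filename)

    normalize_filename_sketch("media/libopus/moz.build", "lib/opus.c")
    # -> 'media/libopus/lib/opus.c'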

Normalization gets more complicated when dealing with separate vendoring and moz.yaml directories, because a file can be considered normalized when it looks like third_party/libdav1d/src/a.cpp _or_ when it looks like media/libdav1d/../../third_party/libdav1d/src/a.cpp. The latter arises because in the moz.build file it will be specified as ../../third_party/libdav1d/src/a.cpp, and we ‘normalize’ it by prepending the path to the moz.build file.

Normalization is not just about having an ‘absolute’ path from gecko_root to file. In fact it’s not really about that at all - it’s about matching filenames. Therefore when we are dealing with separate vendoring and moz.yaml directories we will very quickly ‘re-normalize’ a normalized filename to get it into one of those foo/bar/../../third_party/… paths that will make sense for the moz.build file we are interested in.

Whenever a filename is normalized, it should be specified as such in the variable name, either as a prefix (normalized_filename) or a suffix (target_filename_normalized)

statistic

Using some hacky stuff, we report statistics about how many times we hit certain branches of the code. e.g.

  • “How many times did we refine a guess based on prefix length”

  • “How many times did we refine a guess based on the number of files in the block”

  • “What is the histogram of guess candidates”

We do this to identify how frequently certain code paths were taken, allowing us to identify strange behavior and investigate outliers. This process led to identifying bugs and small improvements.

exception mozbuild.vendor.rewrite_mozbuild.MozBuildRewriteException

Bases: Exception

mozbuild.vendor.rewrite_mozbuild.add_file_to_moz_build_file(normalized_filename_to_add, moz_yaml_dir=None, vendoring_dir=None)

This is the overall function. Given a filename, relative to the gecko root (aka normalized), we look for a moz.build file to add it to, look for the place in the moz.build file to add it, and then edit that moz.build file in-place.

It accepts two optional parameters. If one is specified, they both must be. If a library is vendored in a separate place from the moz.yaml file, these parameters specify those two directories.
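
A hypothetical call, reusing the separate-directory libdav1d layout discussed elsewhere on this page (the added filename is made up):

    add_file_to_moz_build_file(
        "third_party/libdav1d/src/new_file.c",
        moz_yaml_dir="media/libdav1d",
        vendoring_dir="third_party/libdav1d",
    )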

mozbuild.vendor.rewrite_mozbuild.assignment_node_to_source_filename_list(code, node)

If the list of filenames is not a list of constants (e.g. it’s a generated list) it’s (probably) infeasible to try and figure it out. At least we’re not going to try right now. Maybe in the future?

If this happens, we’ll return an empty list. The consequence of this is that we won’t be able to match a file against this list, so we may not be able to add it.

(But if the file matches a generated list, perhaps it will be included in the Sources list automatically?)

mozbuild.vendor.rewrite_mozbuild.ast_get_source_segment(code, node)
mozbuild.vendor.rewrite_mozbuild.edit_moz_build_file_to_add_file(normalized_mozbuild_filename, unnormalized_filename_to_add, unnormalized_list_of_files)

This function edits the moz.build file in-place

I had _really_ hoped to replace this whole damn thing with something that adds a node to the AST, dumps the AST out, and then runs black on the file, but there are some issues:

  • third party moz.build files (or maybe all moz.build files) aren’t always run through black

  • dumping the AST out loses comments

mozbuild.vendor.rewrite_mozbuild.edit_moz_build_file_to_remove_file(normalized_mozbuild_filename, unnormalized_filename_to_remove)

This function edits the moz.build file in-place

mozbuild.vendor.rewrite_mozbuild.filenames_directory_is_in_filename_list(filename_normalized, list_of_normalized_filenames)

Given a normalized filename and a list of normalized filenames, first turn them into a containing directory, and a list of containing directories. Then test if the containing directory of the filename is in the list.

ex:

    f = filenames_directory_is_in_filename_list
    f("foo/bar/a.c", ["foo/b.c"])                    -> false
    f("foo/bar/a.c", ["foo/b.c", "foo/bar/c.c"])     -> true
    f("foo/bar/a.c", ["foo/b.c", "foo/bar/baz/d.c"]) -> false
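
A minimal sketch of that check (illustrative, not the module’s exact code):

    import os

    def filenames_directory_is_in_filename_list(filename_normalized, list_of_normalized_filenames):
        # Compare containing directories only, never the filenames themselves.
        target_dir = os.path.dirname(filename_normalized)
        directories = {os.path.dirname(f) for f in list_of_normalized_filenames}
        return target_dir in directories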

mozbuild.vendor.rewrite_mozbuild.find_all_posible_assignments_from_filename(source_assignments, filename_normalized)

Given a list of source assignments and a normalized filename, narrow the list to assignments that contain a file whose directory matches the filename’s directory.

mozbuild.vendor.rewrite_mozbuild.get_all_mozbuild_filenames(gecko_root)

Find all the third party moz.build files in the gecko repo

mozbuild.vendor.rewrite_mozbuild.get_all_target_filenames_normalized(all_mozbuild_filenames_normalized)

Given a list of moz.build files, returns all the files listed in all the source assignments in those files.

This function is only used for debug/testing purposes - there is no reason to call this as part of ‘the algorithm’

mozbuild.vendor.rewrite_mozbuild.get_attribute_label(node)
mozbuild.vendor.rewrite_mozbuild.get_closest_mozbuild_file(normalized_filename, moz_yaml_dir=None, vendoring_dir=None, all_mozbuild_filenames_normalized=None)

Returns the closest moz.build file in the directory tree to a normalized filename

mozbuild.vendor.rewrite_mozbuild.get_file_reference_modes(source_assignments)

Given a set of source assignments, this function traverses the file references in those assignments to see if the files are referenced using absolute paths (relative to the gecko root) or relative paths.

It will return all the modes that are seen.

mozbuild.vendor.rewrite_mozbuild.get_gecko_root()

Using __file__ as a base, find the gecko root

mozbuild.vendor.rewrite_mozbuild.get_mozbuild_file_search_order(normalized_filename, moz_yaml_dir=None, vendoring_dir=None, all_mozbuild_filenames_normalized=None)

Returns an ordered list of normalized moz.build filenames to consider for a given filename

normalized_filename: a source filename normalized to the gecko root

moz_yaml_dir: the path from gecko_root to the moz.yaml file (which is the root of the moz.build files)

vendoring_dir: the path to where the library’s source files are

all_mozbuild_filenames_normalized: (optional) the list of all third-party moz.build files. If all_mozbuild_filenames_normalized is not specified, we look in the filesystem.

The list is built out of two distinct steps.

In Step 1 we will walk up a directory tree, looking for moz.build files. We append moz.build files in this order, preferring the lowest moz.build we find, then moving on to one in a higher directory. The directory we start in is a little complicated. We take the series of subdirectories between vendoring_dir and the file in question, and then append them to the moz.yaml directory.

Example:

When moz_yaml directory != vendoring_directory:
    moz_yaml_dir = foo/bar/
    vendoring_dir = third_party/baz/
    normalized_filename = third_party/baz/asm/arm/a.S
    starting_directory: foo/bar/asm/arm/
When moz_yaml directory == vendoring_directory
    (In this case, these variables will actually be 'None' but the algorithm is the same)
    moz_yaml_dir = foo/bar/
    vendoring_dir = foo/bar/
    normalized_filename = foo/bar/asm/arm/a.S
    starting_directory: foo/bar/asm/arm/

In Step 2 we get a bit desperate. When the vendoring directory and the moz_yaml directory are not the same, there is no guarantee that the moz_yaml directory will adhere to the same directory structure as the vendoring directory. And indeed it doesn’t in some cases (e.g. libdav1d). So in this situation we start at the root of the moz_yaml directory and walk downwards, adding _any_ moz.build file we encounter to the list. Later on (in all cases, not just moz_yaml_dir != vendoring_dir) we only consider a moz.build file if it has source files whose directory matches the normalized_filename, so this step, though desperate, is safe-ish and, believe it or not, has worked for some file additions.
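
A sketch of how the Step 1 starting directory can be computed (the helper name is hypothetical and the real code may differ in details):

    import os

    def starting_directory_sketch(normalized_filename, moz_yaml_dir=None, vendoring_dir=None):
        if moz_yaml_dir and vendoring_dir:
            # Take the subdirectories between the vendoring directory and the file,
            # then append them to the moz.yaml directory.
            subdirs = os.path.dirname(os.path.relpath(normalized_filename, vendoring_dir))
            return os.path.join(moz_yaml_dir, subdirs)
        # When the two directories coincide, start in the file's own directory.
        return os.path.dirname(normalized_filename)

    starting_directory_sketch("third_party/baz/asm/arm/a.S", "foo/bar/", "third_party/baz/")
    # -> 'foo/bar/asm/arm'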

mozbuild.vendor.rewrite_mozbuild.guess_best_assignment(source_assignments, filename_normalized)

Given several assignments, all of which contain the same directory as the filename, pick one we think is best and return its source-assignment-location.

We do this by looking at the filename itself (not just its directory) and picking the assignment which contains a filename with the longest matching prefix.

e.g.: comparing “foo/asm_neon.c” against the blocks [“foo/main.c”, “foo/all_utility.c”] and [“foo/asm_arm.c”]

-> the block [“foo/asm_arm.c”] is chosen (longest matching prefix: foo/asm_)
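
A sketch of that prefix-based guess, including the tie-break on block size described earlier (names are illustrative):

    import os

    def guess_best_assignment_sketch(source_assignments, filename_normalized):
        # source_assignments: source-assignment-location -> list of normalized filenames
        def score(location):
            files = source_assignments[location]
            prefix = max(len(os.path.commonprefix([filename_normalized, f])) for f in files)
            # Longest shared prefix wins; ties go to the block with more files.
            return (prefix, len(files))
        return max(source_assignments, key=score)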

mozbuild.vendor.rewrite_mozbuild.log(*args, **kwargs)
mozbuild.vendor.rewrite_mozbuild.mozbuild_file_to_source_assignments(normalized_mozbuild_filename, assignment_type)

Returns a dictionary of ‘source-assignment-location’ -> ‘normalized source filename list’ contained in the moz.build file specified

normalized_mozbuild_filename: the moz.build file to read

mozbuild.vendor.rewrite_mozbuild.node_to_name(code, node)
mozbuild.vendor.rewrite_mozbuild.node_to_readable_file_location(code, node, child_node=None)
mozbuild.vendor.rewrite_mozbuild.normalize_filename(normalized_mozbuild_filename, filename)
mozbuild.vendor.rewrite_mozbuild.remove_file_from_moz_build_file(normalized_filename_to_remove, moz_yaml_dir=None, vendoring_dir=None)

Given a filename, relative to the gecko root (aka normalized), we look for the nearest moz.build file, look in that file for the file, and then edit that moz.build file in-place.

mozbuild.vendor.rewrite_mozbuild.renormalize_filename(mode, moz_yaml_dir, vendoring_dir, normalized_mozbuild_filename, normalized_filename_to_act_on)
Edit the normalized_filename_to_act_on to either
  • Make it an absolute path from gecko root (if we’re in that mode)

  • Get a relative path from the vendoring directory to the yaml directory where the moz.build file is (If they are in separate directories)

mozbuild.vendor.rewrite_mozbuild.test_all_third_party_files(gecko_root, all_mozbuild_filenames_normalized)

Run the algorithm on every source file in a third party moz.build file and output the results

mozbuild.vendor.rewrite_mozbuild.try_to_match_target_file(all_mozbuild_filenames_normalized, target_filename_normalized)

Runs ‘the algorithm’ on a target file, and returns whether the algorithm was successful.

all_mozbuild_filenames_normalized: the list of all third-party moz.build files

target_filename_normalized: the target filename, normalized to the gecko root

mozbuild.vendor.rewrite_mozbuild.unnormalize_filename(normalized_mozbuild_filename, normalized_filename)
mozbuild.vendor.rewrite_mozbuild.validate_directory_parameters(moz_yaml_dir, vendoring_dir)

mozbuild.vendor.vendor_manifest module

class mozbuild.vendor.vendor_manifest.VendorManifest(topsrcdir, settings, log_manager, topobjdir=None, mozconfig=<object object>, virtualenv_name=None)

Bases: MozbuildObject

convert_patterns_to_paths(directory, patterns)
fetch_and_unpack(revision)

Fetch and unpack upstream source

fetch_individual(new_revision)
get_full_path(path, support_cwd=False)
get_source_host()
import_local_patches(patches, yaml_dir, vendor_dir)
process_individual(new_revision, timestamp, ignore_modified, add_to_exports)
process_regular(new_revision, timestamp, ignore_modified, add_to_exports)
process_regular_or_individual(is_individual, new_revision, timestamp, ignore_modified, add_to_exports)
process_rust(command_context, old_revision, new_revision, timestamp, ignore_modified)
should_perform_step(step)
spurious_check(revision, ignore_modified)
update_files(revision)
update_moz_build(vendoring_dir, moz_yaml_dir, add_to_exports)
update_yaml(revision, timestamp)
vendor(command_context, yaml_file, manifest, revision, ignore_modified, check_for_update, force, add_to_exports, patch_mode)
mozbuild.vendor.vendor_manifest.iglob_hidden(*args, **kwargs)
mozbuild.vendor.vendor_manifest.list_of_paths_to_readable_string(paths)
mozbuild.vendor.vendor_manifest.throwe()

mozbuild.vendor.vendor_python module

class mozbuild.vendor.vendor_python.VendorPython(*args, **kwargs)

Bases: MozbuildObject

vendor(keep_extra_files=False, add=None, remove=None, upgrade=False, upgrade_package=None, force=False)
mozbuild.vendor.vendor_python.hash_file_text(file_path)
mozbuild.vendor.vendor_python.remove_environment_markers_from_requirements_txt(requirements_txt: Path)

mozbuild.vendor.vendor_rust module

class mozbuild.vendor.vendor_rust.VendorRust(*args, **kwargs)

Bases: MozbuildObject

BUILDTIME_LICENSE_WHITELIST = {'BSD-3-Clause': ['bindgen', 'fuchsia-zircon', 'fuchsia-zircon-sys', 'fuchsia-cprng', 'glsl', 'instant']}
ICU4X_LICENSE_SHA256 = '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978'
RUNTIME_LICENSE_FILE_PACKAGE_WHITELIST = {'deque': '6485b8ed310d3f0340bf1ad1f47645069ce4069dcc6bb46c7d5c6faf41de1fdb', 'fuchsia-cprng': '03b114f53e6587a398931762ee11e2395bfdba252a329940e2c8c9e81813845b', 'icu_calendar': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978', 'icu_calendar_data': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978', 'icu_collections': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978', 'icu_locid': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978', 'icu_locid_transform': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978', 'icu_locid_transform_data': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978', 'icu_properties': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978', 'icu_properties_data': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978', 'icu_provider': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978', 'icu_provider_adapters': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978', 'icu_provider_macros': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978', 'icu_segmenter': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978', 'litemap': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978', 'tinystr': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978', 'writeable': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978', 'yoke': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978', 'yoke-derive': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978', 'zerofrom': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978', 'zerofrom-derive': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978', 'zerovec': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978', 'zerovec-derive': '853f87c96f3d249f200fec6db1114427bc8bdf4afddc93c576956d78152ce978'}
RUNTIME_LICENSE_PACKAGE_WHITELIST = {'BSD-2-Clause': ['arrayref', 'mach', 'qlog'], 'BSD-3-Clause': ['subtle']}
RUNTIME_LICENSE_WHITELIST = ['Apache-2.0', 'Apache-2.0 WITH LLVM-exception', 'CC0-1.0', 'ISC', 'MIT', 'MPL-2.0', 'Unicode-3.0', 'Unicode-DFS-2016', 'Unlicense', 'Zlib']
check_cargo_version(cargo)

Ensure that Cargo is new enough.

check_openssl()

Set environment flags for building with openssl.

MacOS doesn’t include openssl, but the openssl-sys crate used by mach-vendor expects one on the system. It’s common to have one installed in /usr/local/opt/openssl by homebrew, but custom link flags are necessary to build against it.

get_cargo_path()
has_modified_files()

Ensure that there aren’t any uncommitted changes to files in the working copy, since we’re going to change some state on the user’s behalf. Allow changes to Cargo.{toml,lock} since that’s likely to be a common use case.

log(level, action, params, format_str)

Log a structured log event.

A structured log event consists of a logging level, a string action, a dictionary of attributes, and a formatting string.

The logging level is one of the logging.* constants, such as logging.INFO.

The action string is essentially the enumeration of the event. Each different type of logged event should have a different action.

The params dict is the metadata constituting the logged event.

The formatting string is used to convert the structured message back to human-readable format. Conversion back to human-readable form is performed by calling format() on this string, feeding into it the dict of attributes constituting the event.

Example Usage:

self.log(logging.DEBUG, 'login', {'username': 'johndoe'},
    'User login: {username}')
static runtime_license(package, license_string)

Cargo docs say (https://doc.rust-lang.org/cargo/reference/manifest.html):

This is an SPDX 2.1 license expression for this package. Currently crates.io will validate the license provided against a whitelist of known license and exception identifiers from the SPDX license list 2.4. Parentheses are not currently supported.

Multiple licenses can be separated with a /, although that usage is deprecated. Instead, use a license expression with AND and OR operators to get more explicit semantics.

But I have no idea how you can meaningfully AND licenses, so we will abort if that is detected. We’ll handle / and OR as equivalent and approve if any is in our approved list.
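
A minimal sketch of that policy (the allow list is abbreviated and the helper is illustrative, not the actual implementation):

    import re

    APPROVED = {"Apache-2.0", "ISC", "MIT", "MPL-2.0"}  # abbreviated for illustration

    def runtime_license_sketch(package, license_string):
        # package is unused in this sketch; it mirrors the documented signature.
        # Abort on AND: combining licenses conjunctively has no clear meaning here.
        if re.search(r"\bAND\b", license_string):
            return False
        # Treat the deprecated '/' separator and 'OR' as equivalent alternatives
        # and approve if any alternative is on the approved list.
        alternatives = re.split(r"\s*/\s*|\s+OR\s+", license_string)
        return any(alt.strip() in APPROVED for alt in alternatives)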

serialize_issues_json()
vendor(ignore_modified=False, force=False)

Module contents