14 changes: 14 additions & 0 deletions lib/spack/docs/conf.py
@@ -135,6 +135,20 @@
# Enable todo items
todo_include_todos = True

#
# Disable duplicate cross-reference warnings.
#
from sphinx.domains.python import PythonDomain
class PatchedPythonDomain(PythonDomain):
    def resolve_xref(self, env, fromdocname, builder, typ, target, node, contnode):
        if 'refspecific' in node:
            del node['refspecific']
        return super(PatchedPythonDomain, self).resolve_xref(
            env, fromdocname, builder, typ, target, node, contnode)

def setup(sphinx):
    sphinx.override_domain(PatchedPythonDomain)

# -- General configuration -----------------------------------------------------

# If your documentation needs a minimal Sphinx version, state it here.
152 changes: 136 additions & 16 deletions lib/spack/docs/packaging_guide.rst
@@ -1212,27 +1212,47 @@ structure like this:
package.py
ad_lustre_rwcontig_open_source.patch

If you supply a URL instead of a filename, you need to supply a
``sha256`` checksum, like this:

.. code-block:: python

   patch('http://www.nwchem-sw.org/images/Tddft_mxvec20.patch',
         sha256='252c0af58be3d90e5dc5e0d16658434c9efa5d20a5df6c10bf72c2d77f780866')

Spack includes the hashes of patches in its versioning information, so
that the same package with different patches applied will have different
hash identifiers. To ensure that the hashing scheme is consistent, you
must use a ``sha256`` checksum for the patch. Patches will be fetched
from their URLs, checked, and applied to your source code. You can use
the ``spack sha256`` command to generate a checksum for a patch file or
URL.
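
If you prefer to generate the checksum yourself, here is a rough sketch
using only Python's standard ``hashlib`` module; the patch filename is
simply the example from above:

.. code-block:: python

   import hashlib

   def sha256_of(path, blocksize=1 << 20):
       """Return the hex sha256 digest of a file, reading it in chunks."""
       h = hashlib.sha256()
       with open(path, 'rb') as f:
           for chunk in iter(lambda: f.read(blocksize), b''):
               h.update(chunk)
       return h.hexdigest()

   print(sha256_of('Tddft_mxvec20.patch'))  # paste the digest into patch()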

Spack can also handle compressed patches. If you use these, Spack needs
a little more help. Specifically, it needs *two* checksums: the
``sha256`` of the patch and ``archive_sha256`` for the compressed
archive. ``archive_sha256`` helps Spack ensure that the downloaded
file is not corrupted or malicious, before running it through a tool like
``tar`` or ``zip``. The ``sha256`` of the patch is still required so
that it can be included in specs. Providing it in the package file
ensures that Spack won't have to download and decompress patches it won't
end up using at install time. Both the archive and patch checksums are
checked when patch archives are downloaded.

.. code-block:: python

   patch('http://www.nwchem-sw.org/images/Tddft_mxvec20.patch.gz',
         sha256='252c0af58be3d90e5dc5e0d16658434c9efa5d20a5df6c10bf72c2d77f780866',
         archive_sha256='4e8092a161ec6c3a1b5253176fcf33ce7ba23ee2ff27c75dbced589dabacd06e')

``patch`` keyword arguments are described below.

""""""""""""""""""""""""""""""
``sha256``, ``archive_sha256``
""""""""""""""""""""""""""""""

Hashes of downloaded patch and compressed archive, respectively. Only
needed for patches fetched from URLs.

""""""""
``when``
@@ -1309,6 +1329,21 @@ if you run install, hit ctrl-C, and run install again, the code in the
patch function is only run once. Also, you can tell Spack to run only
the patching part of the build using the :ref:`cmd-spack-patch` command.
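
For reference, a ``patch`` method is ordinary Python executed in the staged
source directory. A minimal sketch (the package name, file, and substitution
are hypothetical), using Spack's ``filter_file`` helper for in-place edits:

.. code-block:: python

   class Mylib(Package):
       ...
       def patch(self):
           # Runs once after staging, before the build; edits files in place.
           filter_file(r'^CC\s*=.*', 'CC = cc', 'Makefile')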

.. _patch_dependency_patching:

^^^^^^^^^^^^^^^^^^^
Dependency patching
^^^^^^^^^^^^^^^^^^^

So far we've covered how the ``patch`` directive can be used by a package
to patch *its own* source code. Packages can *also* specify patches to be
applied to their dependencies, if they require special modifications. As
with all packages in Spack, a patched dependency library can coexist with
other versions of that library. See the `section on depends_on
<dependency_dependency_patching_>`_ for more details.

.. _handling_rpaths:

---------------
Handling RPATHs
---------------
@@ -1482,7 +1517,7 @@ particular constraints, and package authors can use specs to describe
relationships between packages.

^^^^^^^^^^^^^^
Version ranges
^^^^^^^^^^^^^^

Although some packages require a specific version for their dependencies,
@@ -1530,7 +1565,7 @@ correct way to specify this would be:


^^^^^^^^^^^^^^^^
Dependency types
^^^^^^^^^^^^^^^^

Not all dependencies are created equal, and Spack allows you to specify
@@ -1566,6 +1601,91 @@ inject the dependency's ``prefix/lib`` directory, but the package needs to
be in ``PATH`` and ``PYTHONPATH`` during the build process and later when
a user wants to run the package.
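
In a package file, such a dependency might be declared like the following
sketch (the package name and version constraint are only illustrative):

.. code-block:: python

   class MyPythonTool(Package):
       ...
       # needed at build time *and* on PATH/PYTHONPATH at run time
       depends_on('python@2.7:', type=('build', 'run'))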

.. _dependency_dependency_patching:

^^^^^^^^^^^^^^^^^^^
Dependency patching
^^^^^^^^^^^^^^^^^^^

Some packages maintain special patches on their dependencies, either to
add new features or to fix bugs. This typically makes a package harder
to maintain, and we encourage developers to upstream (contribute back)
their changes rather than maintaining patches. However, in some cases
it's not possible to upstream. Maybe the dependency's developers don't
accept changes, or maybe they just haven't had time to integrate them.

For times like these, Spack's ``depends_on`` directive can optionally
take a patch or list of patches:

.. code-block:: python

   class SpecialTool(Package):
       ...
       depends_on('binutils', patches='special-binutils-feature.patch')
       ...

Here, the ``special-tool`` package requires a special feature in
``binutils``, so it provides an extra ``patches=<filename>`` keyword
argument. This is similar to the `patch directive <patching_>`_, with
one small difference. Here, ``special-tool`` is responsible for the
patch, so it should live in ``special-tool``'s directory in the package
repository, not the ``binutils`` directory.

If you need something more sophisticated than this, you can simply nest a
``patch()`` directive inside of ``depends_on``:

.. code-block:: python

   class SpecialTool(Package):
       ...
       depends_on(
           'binutils',
           patches=patch('special-binutils-feature.patch',
                         level=3,
                         when='@:1.3'),  # condition on binutils
           when='@2.0:')                 # condition on special-tool
       ...

Note that there are two optional ``when`` conditions here -- one on the
``patch`` directive and the other on ``depends_on``. The condition in
the ``patch`` directive applies to ``binutils`` (the package being
patched), while the condition in ``depends_on`` applies to
``special-tool``. See `patch directive <patching_>`_ for details on all
the arguments the ``patch`` directive can take.

Finally, if you need *multiple* patches on a dependency, you can provide
a list for ``patches``, e.g.:

.. code-block:: python

   class SpecialTool(Package):
       ...
       depends_on(
           'binutils',
           patches=[
               'binutils-bugfix1.patch',
               'binutils-bugfix2.patch',
               patch('https://example.com/special-binutils-feature.patch',
                     sha256='252c0af58be3d90e5dc5e0d16658434c9efa5d20a5df6c10bf72c2d77f780866',
                     when='@:1.3')],
           when='@2.0:')
       ...

As with ``patch`` directives, patches are applied in the order they
appear in the package file (or in this case, in the list).
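
For example, with two plain ``patch`` directives (hypothetical filenames),
the first one listed is applied before the second:

.. code-block:: python

   class SpecialTool(Package):
       ...
       patch('fix-build-system.patch')  # applied first
       patch('add-feature.patch')       # applied second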

.. note::

   You may wonder whether dependency patching will interfere with other
   packages that depend on ``binutils``. It won't.

   As described in patching_, patching a package adds the ``sha256`` of
   the patch to the package's spec, which means it will have a
   *different* unique hash than other versions without the patch. The
   patched version coexists with unpatched versions, and Spack's support
   for handling_rpaths_ guarantees that each installation finds the
   right version. If two packages depend on ``binutils`` patched *the
   same* way, they can both use a single installation of ``binutils``.
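
   For example (hypothetical packages; the URL is illustrative and the
   checksum is simply reused from the example above), both of these
   request the *same* patch, so they could share one patched ``binutils``
   installation:

   .. code-block:: python

      class ToolA(Package):
          depends_on('binutils', patches=patch(
              'https://example.com/fix-ld-bug.patch',
              sha256='252c0af58be3d90e5dc5e0d16658434c9efa5d20a5df6c10bf72c2d77f780866'))

      class ToolB(Package):
          depends_on('binutils', patches=patch(
              'https://example.com/fix-ld-bug.patch',
              sha256='252c0af58be3d90e5dc5e0d16658434c9efa5d20a5df6c10bf72c2d77f780866'))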

.. _setup-dependent-environment:

7 changes: 5 additions & 2 deletions lib/spack/spack/__init__.py
@@ -207,8 +207,11 @@
from spack.version import Version, ver
__all__ += ['Version', 'ver']

from spack.spec import Spec, alldeps
__all__ += ['Spec', 'alldeps']
from spack.spec import Spec
__all__ += ['Spec']

from spack.dependency import all_deptypes
__all__ += ['all_deptypes']

from spack.multimethod import when
__all__ += ['when']
89 changes: 2 additions & 87 deletions lib/spack/spack/cmd/checksum.py
@@ -25,13 +25,12 @@
from __future__ import print_function

import argparse
import hashlib

import llnl.util.tty as tty
import spack
import spack.cmd
import spack.util.crypto
from spack.stage import Stage, FailedDownloadError
import spack.util.web
from spack.util.naming import *
from spack.version import *

@@ -52,90 +51,6 @@ def setup_parser(subparser):
help='versions to generate checksums for')


def get_checksums(url_dict, name, **kwargs):
    """Fetches and checksums archives from URLs.

    This function is called by both ``spack checksum`` and ``spack create``.
    The ``first_stage_function`` kwarg allows ``spack create`` to determine
    things like the build system of the archive.

    Args:
        url_dict (dict): A dictionary of the form: version -> URL
        name (str): The name of the package
        first_stage_function (callable): Function to run on first staging area
        keep_stage (bool): Don't clean up staging area when command completes

    Returns:
        str: A multi-line string containing versions and corresponding hashes
    """
    first_stage_function = kwargs.get('first_stage_function', None)
    keep_stage = kwargs.get('keep_stage', False)

    sorted_versions = sorted(url_dict.keys(), reverse=True)

    # Find length of longest string in the list for padding
    max_len = max(len(str(v)) for v in sorted_versions)
    num_ver = len(sorted_versions)

    tty.msg("Found {0} version{1} of {2}:".format(
            num_ver, '' if num_ver == 1 else 's', name),
            "",
            *spack.cmd.elide_list(
                ["{0:{1}} {2}".format(str(v), max_len, url_dict[v])
                 for v in sorted_versions]))
    print()

    archives_to_fetch = tty.get_number(
        "How many would you like to checksum?", default=1, abort='q')

    if not archives_to_fetch:
        tty.die("Aborted.")

    versions = sorted_versions[:archives_to_fetch]
    urls = [url_dict[v] for v in versions]

    tty.msg("Downloading...")
    version_hashes = []
    i = 0
    for url, version in zip(urls, versions):
        try:
            with Stage(url, keep=keep_stage) as stage:
                # Fetch the archive
                stage.fetch()
                if i == 0 and first_stage_function:
                    # Only run first_stage_function the first time,
                    # no need to run it every time
                    first_stage_function(stage, url)

                # Checksum the archive and add it to the list
                version_hashes.append((version, spack.util.crypto.checksum(
                    hashlib.md5, stage.archive_file)))
            i += 1
        except FailedDownloadError:
            tty.msg("Failed to fetch {0}".format(url))
        except Exception as e:
            tty.msg("Something failed on {0}, skipping.".format(url),
                    " ({0})".format(e))

    if not version_hashes:
        tty.die("Could not fetch any versions for {0}".format(name))

    # Find length of longest string in the list for padding
    max_len = max(len(str(v)) for v, h in version_hashes)

    # Generate the version directives to put in a package.py
    version_lines = "\n".join([
        "    version('{0}', {1}'{2}')".format(
            v, ' ' * (max_len - len(str(v))), h) for v, h in version_hashes
    ])

    num_hash = len(version_hashes)
    tty.msg("Checksummed {0} version{1} of {2}".format(
        num_hash, '' if num_hash == 1 else 's', name))

    return version_lines


def checksum(parser, args):
    # Make sure the user provided a package and not a URL
    if not valid_fully_qualified_module_name(args.package):
@@ -160,7 +75,7 @@ def checksum(parser, args):
    if not url_dict:
        tty.die("Could not find any versions for {0}".format(pkg.name))

    version_lines = get_checksums(
    version_lines = spack.util.web.get_checksums_for_versions(
        url_dict, pkg.name, keep_stage=args.keep_stage)

    print()
3 changes: 1 addition & 2 deletions lib/spack/spack/cmd/create.py
@@ -30,7 +30,6 @@
import llnl.util.tty as tty
import spack
import spack.cmd
import spack.cmd.checksum
import spack.util.web
from llnl.util.filesystem import mkdirp
from spack.repository import Repo
@@ -587,7 +586,7 @@ def get_versions(args, name):
version = parse_version(args.url)
url_dict = {version: args.url}

versions = spack.cmd.checksum.get_checksums(
versions = spack.util.web.get_checksums_for_versions(
url_dict, name, first_stage_function=guesser,
keep_stage=args.keep_stage)

2 changes: 1 addition & 1 deletion lib/spack/spack/cmd/fetch.py
@@ -57,7 +57,7 @@ def fetch(parser, args):
    specs = spack.cmd.parse_specs(args.packages, concretize=True)
    for spec in specs:
        if args.missing or args.dependencies:
            for s in spec.traverse(deptype_query=spack.alldeps):
            for s in spec.traverse(deptype_query=all):
                package = spack.repo.get(s)
                if args.missing and package.installed:
                    continue
4 changes: 2 additions & 2 deletions lib/spack/spack/cmd/flake8.py
@@ -79,9 +79,9 @@

    # exemptions applied to all files.
    r'.py$': {
        # Exempt lines with URLs from overlong line errors.
        'E501': [
            r'(https?|ftp|file)\:',
            r'(https?|ftp|file)\:',        # URLs
            r'([\'"])[0-9a-fA-F]{32,}\1',  # long hex checksums
        ]
    },
}