Talk on Packaging 2010-03-13

These are the notes from the talks at Meeting20100313 on Python packaging for Fedora, RHEL and Windows (from David Fraser's talk), for Debian (from tumbleweed's talk), on the future of Python packaging (an informal discussion led by Simon Cross and others), etc

General Packaging

  • We have source code for Python libraries, applications and tools
  • These have dependencies on other Python libraries, as well as non-Python libraries (database drivers etc)
  • We need users to run our applications etc on various platforms

Terms of Endearment

  • Distribution here refers to a built Python thing
  • Distro refers to a Linux distribution
  • Package refers to a Python package (a module with submodules)
  • rpm refers to a package file in the RPM format (to distinguish from Python packages)

Distutils

  • distutils is the standard Python packaging system
  • Various extensions exist that augment its capabilities, and because it's all Python, you can do that too
  • Input: you write a setup.py script (and optionally a setup.cfg configuration file) that specifies:
    • Package metadata (name, version, authorship, licensing, dependencies, etc)
    • Package contents (pure Python modules/packages, extensions, executable Python scripts, data - non-code files that need to be in the package - scripts that aid installation, etc)
  • Output: running setup.py supports various commands that produce different outputs:
    • sdist produces a source distribution - it helps if you can regenerate the distribution from a clean source distribution
      • The MANIFEST.in template generates MANIFEST, the list of files to include - by default MANIFEST is not regenerated automatically (use --force-manifest)
    • bdist contains a generic built installation (but I've hardly ever seen it used as it's not a useful package format)
    • bdist_rpm generates RPM and SRPM package files
    • bdist_wininst produces a Windows executable installer that will install the distribution into an existing Python installation
    • The output generally gets put in a dist subdirectory
  • Commands include the above output commands, as well as intermediate commands that are run on the way to producing the above
    • Each command can have command-specific options (as well as the generic options passed to setup.py)
    • Options can be specified per-command either in setup.py or in setup.cfg, or on the command-line
    • build is an intermediate command which runs:
      • build_py for pure modules (simple copy to the build directory)
      • build_ext for compiling C/C++ extensions, links them to build directory
      • build_clib for building C/C++ libraries
      • build_scripts, which copies scripts to the build directory and alters their #! line
    • install then installs everything from the build directory to the target (separate steps like build)
    • clean cleans up the build directory
    • You can also register the distribution with the Python Package Index (PyPI), and upload it
  • Scripts can include things to run (which can be installed into the path), as well as a post-installation/pre-removal script
  • Things to consider:
    • If you've got a bunch of related files, should they be in a package? Otherwise they can clutter the standard Python site-packages directory
    • Can you cleanly regenerate your source distribution from itself?
    • Package dependencies: a distribution can provide, require or obsolete packages
    • Consider whether to create distributions on the same platform as you're targeting, or from other platforms, or both
    • How are you going to build your distribution for different platforms?
    • How are you going to deliver your distribution to people on different platforms, including dependencies?
    • How will your distribution interact with the native packaging on the target platforms (if any)?

The world of eggs

setuptools is a set of extensions to distutils that tries to bring it into the modern age:

  • Adds proper dependency support to Python packages
  • Lots of surrounding tools - easy_install, pkg_resources etc - very simple ways of installing stuff from the standard Python repository (PyPI)
  • Lots more tools being built around this format
  • Simply import setuptools instead of distutils
  • Does not integrate directly with distros' packaging systems
  • Supports parallel versions of the same library reasonably well (you can require a specific version and use it even if a different one is present - however conflicts can arise)
  • Not that good at uninstallation etc
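
The version-requirement behaviour can be seen through pkg_resources. This is a sketch; it assumes only that setuptools itself is installed:

```python
# pkg_resources (part of setuptools) resolves requirement strings against
# the distributions installed on sys.path.
import pkg_resources

# Look up an installed distribution and its version; setuptools is used
# here only because we can assume it is present.
dist = pkg_resources.get_distribution("setuptools")
print(dist.project_name, dist.version)

# Asking for an unsatisfiable version shows the conflict handling mentioned
# above: require() raises instead of silently using the wrong version.
try:
    pkg_resources.require("setuptools>=99999")
    caught = None
except pkg_resources.ResolutionError as exc:
    caught = type(exc).__name__
print("unsatisfiable requirement raised:", caught)
```

require() is also what activates a specific parallel-installed egg version when several are present.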

The world of RPM

  • RPM is a packaging format that originated with Red Hat, and is now used in RHEL/CentOS/Fedora, SUSE Enterprise/openSUSE and Mandriva, as well as being part of the Linux Standard Base
  • Input: a .spec file defines an rpm's metadata, how to build both a source and a binary rpm, what sources to use, what patches to apply, etc
    • This is a more data-driven format than setup.py
    • The .spec file contains a header section for general information as well as
      • script sections: %prep prepares the build (its %setup macro unpacks the source and applies patches), %build builds the source, %install installs into a buildroot directory, %check checks the results, and %clean cleans up
      • %pre and %post installation scripts, and %preun and %postun uninstallation scripts
      • A list of the %files included in the rpm (including categories like %doc, and file attributes)
      • A %changelog
    • Support for macros, including shell execution to define macros
    • Support for subpackages - multiple rpms can be built from the same .spec file
  • From distutils, bdist_rpm generates a source distribution, creates a spec file, generates a source rpm from that, and then a binary rpm
  • What's really useful about rpm is dependency based on repositories
    • Different distros have different tools for this, and they vary in capability between releases - RHEL/CentOS/Fedora use yum
    • There isn't that much functional difference between the .deb and .rpm formats, or between apt-get and yum etc, for the user. There is for the packager...
    • You can specify dependencies as options to bdist_rpm (in setup.py, setup.cfg or on the command line) - they don't seem to get included from the normal package metadata
  • Distros/repositories have different standards and requirements for inclusion of rpms and .spec files
    • bdist_rpm's automatically generated .spec files will generally not meet these requirements
    • The general feeling is that for inclusion into a distro, .spec files should be hand-generated and maintained
    • Are you targeting your rpms for inclusion in a Linux distro? Read and follow the rules and follow the procedures...
    • Otherwise you may be happy with the standard distutils stuff
    • Generally packages using setuptools can install their .egg information alongside the source code and include that in the rpm
  • Targeting older distros can be tricky if you are using lots of modern Python stuff
    • Distros have their own version of Python - you may require a newer one. Typically this can be installed alongside as something like python25.rpm - often these rpms exist, sometimes you have to rebuild them
    • In that case you will need to package all the dependencies for the new Python - usually called python25-babel etc
    • Doing this on multiple distros can be tiring. Since you're not targeting inclusion in the old distros, try to get away with murder (or at least, aim for functional packages rather than beautifully crafted ones)
  • We have a tool called centuryegg for targeting older distros
    • This is currently reasonably specific to our set of requirements
    • Order of priority:
      • Use existing rpms from the target distro
      • Backport rpms from newer releases of the target distro
      • Spin our own rpms from the eggs in PyPI
    • Target is to be able to automatically source and download all the requirements from a simple list, generate the rpms, and upload them into a repository
    • We should make it more generic - is anyone interested?
  • Generating your own repository
    • You will generally need different repositories for different distros and versions (even apparently equivalent ones like RHEL4/CentOS 4)
    • Usually this just involves compiling (we strongly recommend doing this on the target distro - we had strange crashes due to minor library versioning differences etc)
    • You then just need to run something like createrepo and put the files in a web space
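
For orientation, a hand-written .spec for a pure-Python package looks roughly like this. It is a heavily simplified sketch with made-up names; real distro guidelines require considerably more (including defining %{python_sitelib} on older distros):

```spec
Name:           python-hellopy
Version:        1.9
Release:        1%{?dist}
Summary:        Example pure-Python package
License:        MIT
Source0:        hellopy-%{version}.tar.gz
BuildArch:      noarch
BuildRequires:  python-devel

%description
Longer description of the example package.

%prep
%setup -q -n hellopy-%{version}

%build
python setup.py build

%install
python setup.py install --skip-build --root %{buildroot}

%files
%doc README
%{python_sitelib}/hellopy/

%changelog
* Sat Mar 13 2010 Example Packager <packager@example.com> - 1.9-1
- Initial package
```

Compare this with what bdist_rpm autogenerates to see why distros prefer hand-maintained .spec files.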

The world of Windows

  • bdist_wininst is fine for lots of purposes - especially if distributing packages to other developers
  • bdist_msi has now also been added; it produces packages in Windows Installer's MSI format (this is also used to produce the Python MSI installer itself)
  • Building extension libraries can be a nightmare (if they use the Visual Studio C library) due to DLL hell - they need to match both the exact VS C library that Python was built with, and the exact VS C library that any other libraries they use link against
  • For distributing applications, most Windows users expect a single install, and may be confused by having the Python runtime environment set up on their machine with lots of libraries
  • py2exe is the most popular of a variety of tools for producing a frozen Python distribution on Windows:
    • An extension to distutils
    • Packages up the Python runtime, and a set of Python packages, modules, extensions, scripts and data, into a target directory
    • Scripts are converted to stub win32 executables that load the Python dll and execute some code
    • Automatic search for Python library dependencies (by scanning your source code for import statements), as well as manual specification of requirements
    • All the libraries you depend on need to be included - sometimes having things installed as eggs on the build environment can produce problems
    • Running in frozen mode often requires some changes to the underlying code for compatibility - location of files etc - lots of tips on the wiki site
  • It's fairly common to use InnoSetup or NSIS to produce an installer containing all the required files, Start menu items, etc
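
A py2exe-style setup.py can be sketched as follows. All names are hypothetical, and the import is guarded so the configuration can be read on platforms without py2exe (which is Windows-only):

```python
# Hypothetical py2exe configuration; importing py2exe registers its
# distutils command, so the guard lets this file be inspected anywhere.
try:
    import py2exe  # noqa: F401
except ImportError:
    py2exe = None

config = dict(
    name="hellopy",
    version="1.9",
    # "console" builds CLI stub .exes; "windows" builds GUI stubs.
    console=[{"script": "bin/hellopy"}],
    options={"py2exe": {
        # Modules the automatic import scanner misses (plugins etc.)
        "includes": ["hellopy.extras"],
    }},
    # Non-code files copied next to the frozen executable.
    data_files=[("", ["README.txt"])],
)

if py2exe is not None:
    from distutils.core import setup
    setup(**config)   # run as: python setup.py py2exe
```

The resulting dist directory is what an InnoSetup or NSIS installer then wraps up.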

Putting it all together

  • It makes life easier if you can do all the above, and whatever other targets you require, from the same setup.py script
  • Separate out options for the different commands as much as possible
  • We found we had to hack distutils a lot with derived code to make it all work
  • Beware of dependencies in setup.py - somebody may be trying to run a command that doesn't need the dependency, so trap ImportErrors etc
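
The last point can be sketched like this: make an optional dependency fatal only for the command that actually needs it. Sphinx is used as the example optional dependency here; the pattern, not the names, is the point.

```python
# Sketch: a setup.py fragment that degrades gracefully when an optional
# build dependency is missing.
import sys

cmdclass = {}
try:
    from sphinx.setup_command import BuildDoc  # optional doc-building dep
    cmdclass["build_sphinx"] = BuildDoc
except ImportError:
    # Only fail if the user actually asked for the command that needs it.
    if "build_sphinx" in sys.argv:
        raise SystemExit("building the documentation requires Sphinx")

# ... later: setup(..., cmdclass=cmdclass)
print("available extra commands:", sorted(cmdclass))
```

A plain "setup.py sdist" then works on a machine without Sphinx installed.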

The world of Debian

  • Install devscripts, python-support and build-essential.
    • To download the sample package, you can run:
      dget -xu http://ctpug.org.za/raw-attachment/wiki/TalkPackaging/hellopy_1.9-1.dsc
      
  • Package your source up nicely using sdist
  • Things get named name_version.orig.tar.gz
    • You can use uscan to download, repack, and rename it for you if you create a debian/watch file appropriately. You should do this.
  • Unpack the source distribution, and run dh_make
    • Go for single binary mode. It's the simplest.
  • Look through the different example files that are produced:
    • There are loads of them, you don't have to use all of them
    • debian/control tells what different packages should be produced
      • The first block is for the source package (the one you are working on). Later blocks are for the binary packages you will get it to build.
      • Call the binary package python-$libname to follow convention
      • Architecture: any means build on every architecture; all means build just one package for all architectures (pure Python).
      • Description - first line is summary, the rest is detail
      • Format is the same as email: To continue on a new line, put a space as the first character; for a blank line use <space><dot> like so: .
      • XS-Python-Version: You can set this to all or >= 2.5, etc. (/usr/share/doc/python-support/README.gz)
      • Specify Build-Depends: debhelper (>= 7), python-support. Include anything here for building documentation and running tests. Depend on python-all if you want to test on all supported Python versions.
      • Depends: uses the ${python:Depends} substitution variable
    • debian/copyright is important to signal the licenses etc, following Debian rules.
    • debian/rules is a makefile that generates a deb
      • It's a lot of work to write all the rules, so there are a lot of helper tools: debhelper, cdbs, dh (debhelper 7)
      • This used to involve lots of separate build steps
      • debhelper 7 is the way to go nowadays... Now you just use dh:
        #!/usr/bin/make -f
        
        %:
                dh $@
        
    • Debhelper modules are mostly configured through files in debian/ named module (or package.module when building multiple packages).
      • You can have pre- and post- install scripts by creating debian/preinst etc. (man dh_installdeb)
    • For a python package, you just need a few files (those mentioned above)
    • dch lets you edit debian/changelog
    • To install scripts, create a little script that loads the module (debian/hellopy) and add an instruction to install it in debian/install: debian/hellopy /usr/bin
  • There are two competing ways to do Python dependencies on Debian: python-central and python-support
    • python-central is maintained by the Python maintainer, used by 10-20% of the packages.
    • python-support is much more popular, and its maintainer is more responsive. It integrates nicely with debhelper 7
  • Patching the source directly is frowned upon; there are systems for managing patches
    • Put 3.0 (quilt) in debian/source/format and then you can use quilt to maintain patches in debian/patches.
    • You should annotate your patches (probably in DEP3 format).
  • Source format 3.0 gives you a .debian.tar.gz instead of a .diff.gz containing the debian additions to a package.
  • Then you build with debuild -uc -us:
    • Your package should automatically clean everything it creates - clean gets run before build
    • Debian builds packages in an empty chroot, so you need to specify anything non-essential that your build requires. Test this with pbuilder / Launchpad PPAs.
    • debc lists the contents of the deb file just built.
    • lintian -Ivi --pedantic ../*.changes will tell you what you did wrong.
  • Then you install...
    • As you install, python-support will automatically byte compile your python code for you by calling update-python-modules
    • /usr/lib/pymodules/python2.x contains symlinks to the original code under /usr/share/pyshared/ as well as the byte-compiled files for that version
    • Extensions go into /usr/lib/python-support/python-$libname/python2.x/
    • These symlinks and byte-compiled files aren't actually owned by the package, but will be regenerated/cleaned up if you run update-python-modules for that package again.
    • All data files should go into /usr/share/$package.
    • If you are packaging an application rather than a library, you shouldn't install into the global Python namespace, but rather into /usr/share/$package, with appropriate modifications to the scripts.
  • More advanced ideas:
    • A lot of the packaging work is being pedantic, making sure it will be correct no matter who runs it.
    • You can override dh rules in debian/rules to do things like adjusting automatic compression of files
  • To get your package into Debian, hop onto the #debian-python IRC channel on OFTC and ask for someone to help - they'll check through your package and help you fix mistakes. (Or ping tumbleweed).
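
As a reference point, a debian/control for the hellopy example might look roughly like this. It is an illustrative sketch matching the python-support/debhelper 7 setup described above; field values should be checked against current Debian policy:

```control
Source: hellopy
Section: python
Priority: optional
Maintainer: Example Packager <packager@example.com>
Build-Depends: debhelper (>= 7), python-support
XS-Python-Version: >= 2.5
Standards-Version: 3.8.4

Package: python-hellopy
Architecture: all
Depends: ${misc:Depends}, ${python:Depends}
Description: example pure-Python package
 Longer description of the package, continued with a leading space.
 .
 A blank line is written as a lone dot.
```

With debian/rules as the three-line dh file shown earlier, this plus debian/changelog and debian/copyright is most of a working package.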

Future of Python packaging

  • Tarek Ziadé is working towards producing a decent standard packaging library for Python, codenamed distutils2.
  • This basically replaces things as follows:
    • distutils2 replaces distutils, as well as adding various things from setuptools: egg-info, recording installed files (for uninstall etc), entry points, and version info
    • Distribute replaces the rest of setuptools (focus on building)
    • pip replaces easy_install (focus on install/uninstall)
  • The idea is that distutils2 will be included in the standard library and will be extensible enough for Distribute (or any similar library) to produce building tools, or pip (or any similar library) to produce installation tools
  • Thus the Linux distros will be able to make their own tools to integrate with their packaging systems (if they so desire), on top of the standard distutils2
  • To produce low quality deb files there are a few options - stdeb is the most actively maintained and useful one at the moment