I've got a number of scripts that use common definitions. How do I split them into multiple files? Furthermore, the application cannot be installed in any way in my scenario; it must be possible to have an arbitrary number of versions running concurrently, and it must work without superuser rights. Solutions I've come up with are:
- Duplicate code in every script. Messy, and probably the worst scheme.
- Put all scripts and the common code in a single directory, and use `from . import` to load it. The downside of this approach is that I'd like to keep my libraries in a different directory than the applications.
- Put the common code in its own directory, write an `__init__.py` that imports all submodules, and finally use `from . import` to load it (sketched below). Keeps the code organized, but it's a little bit of overhead to maintain the `__init__.py` and to qualify names.
- Add the library directory to `sys.path` and `import` (sketched below). I tend toward this, but I'm not sure whether fiddling with `sys.path` is good practice.
- Load the modules using `execfile` (`exec` in Python 3), as sketched below. Combines the advantages of the previous two approaches: only one line per module is needed, and I can use a dedicated directory. On the other hand, this bypasses the Python module concept and pollutes the global namespace.
- Write and install a module using `distutils`. This installs the library for all Python scripts, needs superuser rights, impacts other applications, and is hence not applicable in my case.
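To make the package variant concrete, this is roughly the layout I have in mind (the names `common`, `util`, `config` and `do_something` are just placeholders):

```python
# Hypothetical layout:
#
#   app/
#       main.py
#       common/          <- shared code as a package
#           __init__.py
#           util.py
#           config.py

# --- common/__init__.py ---
# Pull the submodules into the package namespace so callers
# can simply do `from common import util, config`.
from . import util
from . import config

# --- main.py ---
from common import util, config

util.do_something()   # names stay qualified via the module
```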
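The `sys.path` variant would look something like this; the relative path `../lib` is only an example of a dedicated library directory:

```python
# main.py -- make the shared library directory importable before anything else.
import os
import sys

# Resolve the library directory relative to this script so it works
# regardless of the current working directory ("../lib" is a placeholder).
_LIB_DIR = os.path.join(os.path.dirname(os.path.abspath(__file__)), "..", "lib")
sys.path.insert(0, _LIB_DIR)

import util    # lives in ../lib/util.py
import config  # lives in ../lib/config.py

util.do_something()
```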
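And the `execfile`/`exec` variant, which is what I mean by "one line per module" (the path and the function name are placeholders):

```python
# main.py -- pull the shared definitions straight into this script's namespace.
# Python 2: execfile("/path/to/lib/util.py")
# Python 3 equivalent:
with open("/path/to/lib/util.py") as f:
    exec(f.read())   # everything util.py defines lands in this module's globals

do_something()       # no module qualification -- hence the namespace pollution
```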
What is the best method?