
I'm writing a small package for internal use and have come to a design problem. I define a few classes and constants (e.g., a server IP address) in one file, let's call it mathfunc.py. Now, some of these classes and constants will be used in other files in the same package. My current setup is like this:

/mypackage
   __init__.py
   mathfunc.py
   datefunc.py

So, at the moment I think I have to import mathfunc.py in datefunc.py to use the classes defined there (or alternatively import both of them all the time). This feels wrong to me, because I'll end up importing lots of files everywhere. Is this a proper design, or is there some other way? Maybe I can put all the shared definitions in one file that is not a subpackage of its own, but is used by all the other files?
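The idea in that last sentence is a common pattern: a plain module inside the package that holds the shared definitions, which every other module imports from. A hypothetical layout (the `common.py` name and its contents are illustrative, not from the question):

```
/mypackage
   __init__.py
   common.py      # shared classes and constants, e.g. SERVER_IP
   mathfunc.py    # from .common import SERVER_IP
   datefunc.py    # from .common import SERVER_IP, SomeSharedClass
```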

sashkello
  • See the answers to the questions [_Python namespacing and classes_](http://stackoverflow.com/questions/5117194/python-namespacing-and-classes/5118437#5118437) and [_Importing Python classes from different files in a subdirectory_](http://stackoverflow.com/questions/5134893/importing-python-classes-from-different-files-in-a-subdirectory). – martineau Oct 31 '13 at 01:35

1 Answer


Nope, that's pretty much how Python works. If you want to use objects declared in another file, you have to import from it.
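Here is a minimal, self-contained sketch of what that looks like for the layout in the question. The file contents (the `SERVER_IP` constant, the `Vector` class, the `describe` function) are invented for illustration; the snippet builds the package in a temporary directory just so it runs end to end, but in practice you would simply write the two files and use the relative import shown in datefunc.py:

```python
import os
import sys
import tempfile
import textwrap

# Recreate the questioner's package layout on disk, then show that
# datefunc.py can use names from mathfunc.py via a plain relative import.
pkg_root = tempfile.mkdtemp()
pkg = os.path.join(pkg_root, "mypackage")
os.makedirs(pkg)

files = {
    "__init__.py": "",
    "mathfunc.py": textwrap.dedent("""
        SERVER_IP = "192.0.2.10"   # hypothetical shared constant

        class Vector:              # hypothetical shared class
            def __init__(self, x, y):
                self.x, self.y = x, y
    """),
    "datefunc.py": textwrap.dedent("""
        # The key line: import just the names you need from the sibling module.
        from .mathfunc import SERVER_IP, Vector

        def describe():
            v = Vector(1, 2)
            return f"server={SERVER_IP}, v=({v.x}, {v.y})"
    """),
}
for name, body in files.items():
    with open(os.path.join(pkg, name), "w") as f:
        f.write(body)

sys.path.insert(0, pkg_root)
from mypackage import datefunc

print(datefunc.describe())   # server=192.0.2.10, v=(1, 2)
```

The import line in datefunc.py is the whole answer: one explicit import per module you depend on, which also documents where each name comes from.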

Tips:

  • You can keep your namespace clean by only importing the things you need, rather than using from foo import *.
  • If you really need to do a "circular import" (where A needs things in B, and B needs things in A) you can solve that by importing inside the functions where you need the object, rather than at the top of the file.
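A sketch of that second tip. The module names `a`/`b` and the functions are made up; as above, the files are written to a temporary directory only so the example runs. Module `a` imports from `b` at the top level, while `b` defers its import of `a` to inside the function that needs it, which avoids the circular-import error:

```python
import os
import sys
import tempfile
import textwrap

root = tempfile.mkdtemp()

with open(os.path.join(root, "a.py"), "w") as f:
    f.write(textwrap.dedent("""
        from b import double      # fine: b has no top-level import of a

        def base():
            return 21
    """))

with open(os.path.join(root, "b.py"), "w") as f:
    f.write(textwrap.dedent("""
        def double():
            from a import base    # deferred import breaks the cycle
            return base() * 2
    """))

sys.path.insert(0, root)
import a

print(a.double())   # 42
```

By the time `double()` actually runs, module `a` is fully initialized, so the import inside the function succeeds even though a top-level `import a` in b.py would have created a cycle.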
Christian Ternus