Mother Manual

Index:

  • Configuring Mother
  • The basic principles
  • The introspective Mother features
  • The database interface
  • Transactions, Connection Pool and Threaded Environments
  • Plugins
  • Appendix

Install Mother

Pre-install requirements: one of the two supported database driver libraries, psycopg2 (for the PostgreSQL backend) or the SQLite Python bindings.

Grab the DBMother tarball at:

Untar it and change into the newly created directory:

$ tar xzf mother-x.y.z.tgz
$ cd mother-x.y.z

To install Mother, just use distutils:

# python setup.py install

Be sure to run the previous command as root on Unix systems. On Windows, use the absolute path to the Python executable if "python" is not recognized as a valid command.

Besides the addition of the python module, a command line tool called "mothermapper" will also get installed (normally in "/usr/bin/" on Unix systems).

The Database example used in this guide

Example code has been included with DBMother. We use the following schema for the sample code and our tutorial. Please pay close attention to the following description, because we will use it throughout the manual.

Here's a graphic representation of the database:

   Stars
       \
        \
        Planets       Lifeforms
         /    \          /
        /      \        /
       /        \      /
Moons_info     Civilizations

We need to keep track of star systems. For each star system, we have a set of planets. For each planet we may have some information about its moons. Lifeforms is a table that stores information about the different forms of life present in our universe. Civilizations is a relation between Lifeforms and Planets: we will use it to store information like "humans live on earth".

Here's the sql script (shipped with the tarball):

Important

For PostgreSQL users: for Postgres versions >= 8.1 and < 8.2.3, you need to create tables "WITH OIDS". Remove "WITH OIDS" otherwise.

Note that Mother is scalable and changing the Postgres version is safe. If you upgrade your Postgres to 8.2.3, for example, you don't have to create tables WITH OIDS and your work environment will keep working.

Important

Field names are completely arbitrary: in particular, DbMother does not require keys and foreign keys to share the same name.

create table stars (
    star_id         serial,
    star_name       text,
    star_age        int,
    star_mass       int,

    primary key(star_id)
) WITH OIDS;

create table planets (
    star_id         int,
    planet_id       serial,
    planet_name     text,
    planet_mass     int,

    primary key(planet_id),
    foreign key(star_id) references stars(star_id)
) WITH OIDS;

create table moons_info (
    planet_id       int,
    moon_info_id    serial,
    num_moons       int,

    primary key(moon_info_id),
    foreign key(planet_id) references planets(planet_id)
) WITH OIDS;

create table lifeforms (
    life_id         serial,
    life_name       text,
    life_age        int,

    primary key(life_id)
) WITH OIDS;

create table civilizations (
    life_id         int,
    planet_id       int,
    age             int,

    foreign key(life_id) references lifeforms(life_id),
    foreign key(planet_id) references planets(planet_id)
) WITH OIDS;

Creating a Mother Environment

To handle a database with DBMother we need to create a DBMother environment, using the mothermapper tool. If you are ever in doubt about which parameters you can use with mothermapper, try:

$ mothermapper -h

Thanks to DBMother's introspection, there is no need to write XML or model files: DBMother is able to obtain the database structure automatically.

First of all, we need to create a configuration file which allows one to specify:

  • database parameters
  • logging features
  • the connection pool

To create a configuration file with default values (using Postgres as the backend), just do:

$ mothermapper -P /my/path/dbmother.conf

otherwise use -S for SQLite:

$ mothermapper -S /my/path/dbmother.conf

At this point you need to edit the newly created configuration file and set the parameters for your particular environment. Don't worry: the conf file is heavily commented and most options contain suitable default values.

Important

One value you should pay attention to is "MOTHER_MAP". Here you should place a reasonable directory and file name value like "/my/path/dbmother.map".
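
For example, a minimal sketch of the relevant line (the conf file is plain Python, loaded with execfile; the path here is just an example and the other generated parameters can keep their defaults):

# excerpt from /my/path/dbmother.conf
MOTHER_MAP= '/my/path/dbmother.map'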

When you are done with your edits, test your configuration like so:

$ mothermapper -c /my/path/dbmother.conf -t

Now the database needs tables. Although this is not mandatory, we can use "mothermapper" to run an SQL script (the example SQL script is shipped with the tarball; we assume it was saved to /my/path/db1.sql):

$ mothermapper -c /my/path/dbmother.conf -e /my/path/db1.sql

Finally we need to create the DBMother map file. The map is created automatically by "mothermapper" (assuming you remembered to set the "MOTHER_MAP" value in the DBMother config file). Try the following:

$ mothermapper -c /my/path/dbmother.conf -s

Done. If all went well you can do a final test by importing your configuration file like so:

>>> from mother.mothers import init_mother
>>> init_mother('/my/path/dbmother.conf')

Success!!

Simple Database Actions I

In this section we will handle simple, single records. We use the persistent connection, so make sure that the Mother configuration file enables it. Note that the fastest way to execute simple db actions is explained in the section Handling Many Records. Although that way is very efficient and easy, the DbMother classes explained in this section are more powerful, flexible and extendible.

Obviously it's always possible to use both.

The primary concept of DbMother classes is simple: declare a short but extendible class for a table; each instance of this class is a table record. To explain this concept, let's start handling records on the table 'stars'.

Put the following class declaration inside the file sample.py:

from mother.mothers import *

class ClsStars(DbMother):

    table_name= 'stars'

    def __init__(self, store= {}, flag= MO_NOA, session= None):
        DbMother.__init__(self, store, flag, session)

    def my_function(self):
        print "This is my function, this is my realm"

The argument session is used when dealing with the connection pool, which is useful for threaded environments such as web applications. For now, ignore it. Let's take a look at the store and flag arguments.

Suppose we want to insert the Sun star on the table stars. The Sun name is 'sun', the Sun age is 10 and the Sun mass is 20:

>>> # Mother Initialization
>>> from sample import *
>>> init_mother('/my/conf/file')
>>> # Working with Mother
>>> sun_dict= {'star_name': 'sun', 'star_mass': 20, 'star_age': 10}
>>> Sun= ClsStars(sun_dict)
>>> Sun.insert()
>>> print Sun
>>> sun_id = Sun.getField('star_id')

Note that Mother has automatically retrieved the primary key of Sun for us: Mother knows which field is the primary key of the table stars. Moreover, Mother knows the fields of that table, so the following attempt will be refused, because 'foo' is not a field of stars:

>>> wrong_dict= {'foo': 1}
>>> Sun= ClsStars(wrong_dict)
>>> Sun.insert()

A question arises: what about the initialization argument flag? This argument is useful to perform db actions during the initialization. In other words the Sun insertion could be made without the insert() call:

>>> Sun= ClsStars(sun_dict, MO_SAVE)

The flag MO_NOA means "No Action" and is the default value of this argument.

Now we want to update the Sun record, setting the star_mass to 15:

>>> Sun.setField('star_mass', 15)
>>> # or Sun.setFields({'star_mass': 15}) to change more fields
>>> Sun.update()

If we want to update a record without having its Mother instance, we have to use its primary key (otherwise we risk a dangerous update):

>>> Sun= ClsStars({'star_id': sun_id, 'star_mass': 15}, MO_UP)

Note that Mother understands automatically what has to be updated.

Now, we want to delete the sun record:

>>> Sun.delete()

or, if we don't have a Mother instance for this record, we can do:

>>> ClsStars({'star_id': sun_id}, MO_DEL)

To delete a record there is no need to specify its primary key:

>>> ClsStars({'star_name': 'sun'}, MO_DEL)

But note that this call will delete all records on the table stars with star_name= 'sun'. If you want to avoid this type of dangerous call, define your mother class to be "paranoid":

class ClsStars(DbMother):

    table_name= 'stars'
    paranoid = True

    def __init__(....):
        ...

Now Mother will refuse to perform db actions without a primary key.
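
For example, a minimal sketch of the paranoid behaviour (we catch a generic exception, as in the other examples of this manual):

>>> try:
...     ClsStars({'star_name': 'sun'}, MO_DEL)
... except:
...     print "refused: the paranoid class wants the primary key"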

Finally, we may want to select a record.

>>> Sun= ClsStars({'star_id': sun_id})
>>> Sun.load()
>>> # or Sun.load(fields= 'star_name') if we want to load only the name
>>> # or Sun= ClsStars({'star_id': sun_id}, MO_LOAD)

There is no need to use the primary key, but remember that this call will raise an exception if there isn't exactly one record with the specified fields (this fact can be used to verify whether a unique record with some values is present in the database).
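
For example, a minimal sketch that uses this behaviour to check whether a unique star named 'sun' exists:

>>> try:
...     ClsStars({'star_name': 'sun'}, MO_LOAD)
...     print "a unique star named 'sun' exists"
... except:
...     print "no unique star named 'sun'"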

If we want to give some SQL-specific values to some fields, such as DEFAULT, NULL, True or False, just use SQL_DEFAULT, SQL_NULL, SQL_FALSE, SQL_TRUE:

>>> from mother.mothers import SQL_NULL
>>> Sun= ClsStars({'star_mass': SQL_NULL, 'star_name': 'sun'}, MO_SAVE)
>>> print Sun
>>> print Sun.getFields()

Note that the methods getFields() and getField() accept an optional argument: autoload. If autoload is True, the requested fields will be retrieved from the database if necessary.
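
For example, a small sketch (assuming only the primary key is in the store, so that the star name has to be fetched from the database):

>>> Sun= ClsStars({'star_id': sun_id})
>>> print Sun.getField('star_name', autoload= True)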

All the described methods have a good inline doc:

>>> print Sun.getFields.__doc__

Now, play a little bit with your classes and try to understand them. Note that it's possible, important and encouraged to extend them, for example by redefining the __init__() method and/or adding custom methods:

class ClsStars(DbMother):
    table_name= 'stars'

    def __init__(self, myarg, otherarg, store= {},
                 flag= MO_NOA, session= None):

        work_with_args(myarg, otherarg)
        DbMother.__init__(self, store, flag, session)

    def my_function(self, *args):
        print args

It's also possible to modify and extend the Mother classes by writing plugins.

Simple Database Actions II

In the previous chapter we learned how to define DbMother classes, handling simple records. The goal of these classes is to build intelligent structures: they are extendible with our own functions and, as we will see in the next sections, they can be extended in a very simple way to handle children and relations with strong introspection.

But we don't always need such structures: for example, when we only need to handle records of an isolated table, without children or relations, and without the need to extend our Mother class with custom methods.

In such situations, it's possible to use a MotherFly class, without the need to define a DbMother class for each table.

In the previous section, we defined the class ClsStars:

class ClsStars(DbMother):

    table_name= 'stars'

    def __init__(self, store= {}, flag= MO_NOA, session= None):
        DbMother.__init__(self, store, flag, session)

To perform the same db actions explained before, we don't need to define this class: we can obtain it dynamically using the following methods:

  • getMotherBuilder(table_name)
  • getMotherObj(table_name [, store= {}, flag= MO_NOA, session= None])

For example:

>>> ClsStars= getMotherBuilder('stars')
>>> type(ClsStars)
>>> sun_dict= {'star_name': 'sun'}
>>> sun= ClsStars(sun_dict, MO_SAVE)

As you can see, ClsStars is now a MotherFly object. We can use a MotherFly instance in the same way we use a DbMother class. So, it's possible to perform the simple database actions explained in the previous section without the need to define any class.

The getMotherObj function is simply a wrapper:

>>> ClsStars= getMotherBuilder('stars')
>>> sun= ClsStars(sun_dict, MO_SAVE)

    is equal to:

>>> sun= getMotherObj('stars', sun_dict, MO_SAVE)

Although getMotherBuilder and getMotherObj are very useful and easy to use, a MotherFly instance does not allow children and relation handling. Moreover, MotherFly instances cannot (easily) be extended with our own methods, plugins and so on.

So, the decision to use a MotherFly class or a DbMother class depends on the situation. Obviously, it's always possible to use both.

It's highly recommended to use a MotherFly instance when we need to perform a simple db action because, generally, it's faster.

When, instead, we need to handle complex tasks, where the simple db actions are only a part of the job, a DbMother class is better, because we can handle these tasks in an object-oriented way, defining our custom methods:

class ClsStars(DbMother):

    table_name= 'stars'

    def __init__(self, my_arg1, my_arg2, store= {}, flag= MO_NOA, session= None):

        DbMother.__init__(self, store, flag, session)
        self.my_arg1= my_arg1
        self.my_arg2= my_arg2

    def get_arg1(self):

        return self.my_arg1

    def get_star_image(self):
        something.wget(self.my_arg2)

    ...

Handling Multiple Records

In the previous section we learned how to handle one record at a time. Now we begin to deal with sets of records. To do that, a specific class is used: MotherBox.

If an instance of a Mother class represents a record of a table, an instance of the MotherBox class is a set of records of one table. Unlike DbMother, we don't need to create a class for each table.

Let's start retrieving all records on the table stars:

>>> from sample import *
>>> init_mother('/my/conf/file')
>>> star_box= MotherBox(ClsStars, filter= None, flag= MO_LOAD)
>>> # it's also possible to use the table name instead of ClsStars:
>>> # star_box= MotherBox('stars', filter= None, flag= MO_LOAD)
>>> print len(star_box)

Now the star_box instance contains the records. len(star_box) gives us the number of retrieved records. We can choose to get them as dictionaries or as Mother instances:

>>> mommas= star_box.getRecords(flag_obj= True)
>>> dicts= star_box.getRecords()
>>> for momma in mommas:
...    print momma
...
>>> for d in dicts:
...    print d
...

When getting records as Mother instances, if a DbMother class was used to initialize the MotherBox, instances of this class will be returned. Otherwise, the return value is a list of MotherFly instances.

The MotherBox machinery is richer: several arguments can be specified during initialization. First of all, the filters. Let's take all records with star_mass equal to 10:

>>> star_box= MotherBox(ClsStars, filter= {'star_mass': 10}, flag= MO_LOAD)
>>> star_box= MotherBox(ClsStars, filter= 'star_mass = 10', flag= MO_LOAD)

As you can see, filters can be dicts or strings. Filters can also be MoFilter instances: this allows strings to be escaped, adding security. This type of filter is explained later: for now, know that all of Mother's internal operations are safe: SQL is escaped so that SQL injection is not possible.

We could be interested in retrieving only the star_name:

>>> star_box= MotherBox(ClsStars, fields= ['star_name'], flag= MO_LOAD)

We could be interested in retrieving the records ordered by star_id:

>>> star_box= MotherBox(ClsStars, order= ['star_id'], flag= MO_LOAD)

Now that we know how to load records, it's time to see how to delete and update them. To delete all records on stars with star_mass > 15, we can do:

>>> MotherBox(ClsStars, filter= 'star_mass > 15', flag= MO_DEL)

To update all records on stars with star_mass = 15, setting star_age = 2, just do:

>>> fup= {'star_age': 2}
>>> filter= {'star_mass': 15}
>>> MotherBox(ClsStars, filter= filter, fields= fup, flag= MO_UP)

Finally, a MotherBox instance is iterable:

>>> for record in MyBox:
...     print record

Handling Many Records

From Mother version 0.6.3 a new class is available: MotherMany. The goal of MotherMany is to speed up massive db actions and to provide an easy way to perform db actions without defining DbMother classes.

Often we don't need complex structures to execute SQL statements. A Mother class is a structured class: we can extend it with plugins, we can handle children and relations. But sometimes we don't need this machinery.

When, for example, we have an isolated table, without children and without relations, defining and using a Mother class can be avoided.

The same applies when we have to perform a great number of statements and need a fast approach.

The optimization problem is easy to understand: if you need to insert 1,000,000 records into a table, initializing one DbMother instance for each record is crazy.

Although custom queries are always available, MotherMany offers, as usual, SQL injection controls and a fast way to execute actions.

The initialization of a MotherMany instance is easy:

>>> MotherMany(builder, store= None, flag= MO_NOA,
...            session= None, fields= None)

The builder is a DbMother class or a table name. The store is a list of dicts, each dict representing a record: a single dict is also accepted. The flag, as usual, specifies an action. The argument fields is useful when using flag= MO_LOAD or flag= MO_UP. The session is used when dealing with the connection pool.

To insert a set of records (now using as builder a Mother class):

>>> star_records= [{'star_name': 'a'}, {'star_name': 'b'}]
>>> MotherMany(ClsStars, star_records, MO_SAVE)

If MO_NOA is used instead of MO_SAVE, we can do the following (using as builder a table name):

>>> star_records= [{'star_name': 'a'}, {'star_name': 'b'}]
>>> momma= MotherMany('stars', star_records, MO_NOA)
>>> other_records= [{'star_name': 'c'}, {'star_name': 'd'}]
>>> other_record= {'star_name': 'e'}
>>> momma.addRows(other_records)
>>> momma.addRows(other_record)
>>> momma.insert()

As you can see, addRows() accepts a list of dicts or a single dict.

Deleting records is exactly the same: just use the MO_DEL flag or the MotherMany.delete() call.
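
A short sketch of both forms, mirroring the insert examples above:

>>> MotherMany('stars', [{'star_name': 'a'}, {'star_name': 'b'}], MO_DEL)
>>> # or, with MO_NOA and an explicit call:
>>> momma= MotherMany('stars', [{'star_name': 'c'}], MO_NOA)
>>> momma.delete()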

To select records, it's possible to use the argument fields, which is a list of strings: only these fields will be loaded from the database. If the argument fields is omitted, all the table fields will be selected:

>>> momma= MotherMany(ClsStars, star_records, MO_LOAD)
>>> records= momma.getRecords()
>>> # momma= MotherMany('stars', star_records, MO_NOA)
>>> # momma.load()

When updating records, the argument fields is used to specify which fields have to be updated. For example:

>>> srecords= [{'star_id': 1, 'star_name': 'a'},
...            {'star_id': 2, 'star_name': 'b'}]
>>> MotherMany('stars', srecords, MO_UP, fields= ['star_name'])
>>> # MotherMany('stars', srecords, MO_NOA, fields= ['star_name'])
>>> # momma.update()

will perform a set of queries like: "UPDATE stars SET star_name= 'a' WHERE star_id = 1". If the argument fields is omitted, it defaults to all the dictionary keys that are not part of the primary key of the table. So, in the previous example, if fields is omitted, the result is the same.

Finally, it's possible to call the getRecords() method to fetch the worked records. This method accepts an optional argument: flag_obj. If flag_obj is False (the default), a list of dicts is returned. If flag_obj is True, a list of Mother instances is returned. In this case, if MotherMany was initialized with a Mother class as builder, this class will be used to initialize the instances. If the builder is a table name, the call getRecords(flag_obj= True) returns a list of MotherFly instances.

>>> dicts= momma.getRecords()
>>> mommas= momma.getRecords(flag_obj= True)

Note that for every action except MO_LOAD, the dicts returned by the previous calls are exactly the dictionaries used to initialize the MotherMany class or added with addRows(). For the MO_LOAD action, instead, the results are the records fetched from the database.

Remember that a MotherMany instance is iterable:

>>> for record in momma:
...     print record

Handling Children

The table planets is a child of the table stars: a planet could be in a particular star system. Now we learn how to handle children.

There are two ways to handle children: we can use the MotherMany way, or the classic way. The first is much faster, but less powerful. To begin, we will learn the second. The MotherMany way is explained in the section Handling Many Children.

First of all we need to create a Mother class for the table planets; after that we have to enable the children manager for the ClsStars class, specifying the list of children that we want to handle. Working on sample.py:

from mother.mothers import *

class ClsPlanets(DbMother):
    table_name= 'planets'
    def __init__(self, store= {}, flag= MO_NOA, session= None):
        DbMother.__init__(self, store, flag, session)

class ClsStars(DbMother, MotherManager):
    table_name= 'stars'
    def __init__(self, store= {}, flag= MO_NOA, session= None):
        self.initChildManager([ClsPlanets])
        DbMother.__init__(self, store, flag, session)

Note that we have subclassed ClsStars with MotherManager and called the initChildManager() method to specify that we want to handle planets children. There is no need to call initChildManager() in the __init__ method: it's possible to call it manually, whenever we need it.

Children can be specified also as table names:

class ClsStars(DbMother, MotherManager):
    table_name= 'stars'
    def __init__(self, store= {}, flag= MO_NOA, session= None):
        self.initChildManager(['planets'])
        DbMother.__init__(self, store, flag, session)

So, which is the best way? Using table names does not require class definitions: in this example there is no need to define ClsPlanets.

Using custom classes is more powerful: when children are handled, they can be handled with our custom classes instead of a MotherFly class.

Another important point: there is no need to specify a list of children at all. If initChildManager() is called without arguments, Mother is able to understand which children have to be handled. In other words, if no argument is specified, the children default to the list of all children, specified by table names.
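
A minimal sketch of the no-argument form:

class ClsStars(DbMother, MotherManager):
    table_name= 'stars'
    def __init__(self, store= {}, flag= MO_NOA, session= None):
        # no argument: every child of 'stars' is handled, by table name
        self.initChildManager()
        DbMother.__init__(self, store, flag, session)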

Now, let's create the Sun star and insert the planet Earth:

>>> from sample import *
>>> init_mother('/my/conf/file')
>>> Sun= ClsStars({'star_name': 'sun'}, MO_SAVE)
>>> Earth= Sun.insertPlanets({'planet_name': 'earth'})
>>> print Earth

Note that:

  • the method insertPlanets() is created automatically
  • the foreign key 'star_id' was not specified: it is assigned automatically

Now we want all the planets in the solar system:

>>> planet_box= Sun.getMultiplePlanets()

Now all planets on the solar system with planet_mass > 12, ordered by planet_id:

>>> ftr= 'planet_mass > 12'
>>> order= ['planet_id']
>>> planet_box= Sun.getMultiplePlanets(filter= ftr, order= order)

If we are interested only in planet names, we can specify that only this field should be loaded:

>>> planet_box= Sun.getMultiplePlanets(fields= ['planet_name'])

If we want to retrieve a unique planet we can do:

>>> Earth= Sun.getPlanets({'planet_name': 'earth'})

Note that this call will raise an exception if there isn't a unique record on the table planets with planet_name = 'earth' and star_id = Sun.getField('star_id').

We can use this fact to test if a unique planet exists:

>>> try:
...   planet= Sun.getPlanets({'planet_name': 'earth'})
... except:
...   print "No unique planet on the solar system with name earth"
>>>

Now it's time to update:

>>> ftr= {'planet_name': 'earth'}
>>> Sun.updateMultiplePlanets({'planet_mass': 42}, filter= ftr)

This call updates all planets in the solar system with planet_name = 'earth', setting planet_mass= 42.

Deleting children is similar:

>>> Sun.deleteMultiplePlanets("planet_mass > 12 ")
>>> Sun.deleteMultiplePlanets({'planet_mass': 2})

Handling Many Children

This section explains the second way to handle children. The difference between this way and the classic way is almost the same as the difference between a DbMother class and the MotherMany class.

The methods explained in this section are very, very fast, but less powerful than the methods created by initChildManager(), explained in the previous chapter.

Anyway, these methods should be used often, for two reasons: they are very fast, and they let us act on many children at once without building one DbMother instance per record.

As for initChildManager(), we can use the method initManyManager() to create our methods automatically:

from mother.mothers import *

class ClsPlanets(DbMother):
    table_name= 'planets'
    def __init__(self, store= {}, flag= MO_NOA, session= None):
        DbMother.__init__(self, store, flag, session)

class ClsStars(DbMother, MotherManager):
    table_name= 'stars'
    def __init__(self, store= {}, flag= MO_NOA, session= None):
        self.initManyManager([ClsPlanets])
        DbMother.__init__(self, store, flag, session)

It's not mandatory to call initManyManager() during initialization: it can be called whenever we need it. Calling it creates a family of 'Many' methods, one per action (like the insertManyPlanets() call used below).

Using these methods is very similar to using MotherMany. The only difference is the automatic export of the foreign key between parent and child. For example:

>>> sun= ClsStars({'star_id': 1})
>>> box= sun.insertManyPlanets([{'planet_name': 'earth'},
...                             {'planet_name': 'mars'}])

will produce the following queries:

INSERT INTO planets (planet_name, star_id) VALUES ('earth', 1);
INSERT INTO planets (planet_name, star_id) VALUES ('mars', 1);

The call returns a MotherMany instance.

As for initChildManager, it's possible to specify a list of table names instead of a list of Mother classes. Once more, there is no need to specify the list: if initManyManager is called without arguments, the magic methods will be created for each child.
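
The same no-argument form applies here; a minimal sketch:

class ClsStars(DbMother, MotherManager):
    table_name= 'stars'
    def __init__(self, store= {}, flag= MO_NOA, session= None):
        # no argument: the Many methods are created for every child
        self.initManyManager()
        DbMother.__init__(self, store, flag, session)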

Note that, as for the MotherMany class, for the SELECT and UPDATE actions we can provide a list of fields: this mirrors exactly the MotherMany behaviour.

Handling Relations

As for the children of a table, we can handle relations with Mother. Now we focus our attention on the tables Lifeforms, Planets and Civilizations.

A record in the Civilizations table means that a certain form of life lives on a certain planet.

To handle children we need to enable the Children Manager calling initChildManager(). To enable the relation manager we have to call initRelationManager().

Let's edit sample.py, inserting a class for the Lifeforms table:

from mother.mothers import *

class ClsLifeforms(DbMother):
    table_name= 'lifeforms'
    def __init__(self, store= {}, flag= MO_NOA, session= None):
        DbMother.__init__(self, store, flag, session)

class ClsPlanets(DbMother, MotherManager):
    table_name= 'planets'
    def __init__(self, store= {}, flag= MO_NOA, session= None):
        self.initRelationManager([ClsLifeforms])
        DbMother.__init__(self, store, flag, session)

Now we can start to handle lifeforms and civilizations in a powerful way:

>>> from sample import *
>>> init_mother('/my/conf/file')
>>> Mars= ClsPlanets({'planet_name': 'mars'}, MO_SAVE)
>>> martians_dict= {'life_name': 'green_people'}
>>> Martians= Mars.assignLifeforms(martians_dict, MO_SAVE)

What happens? The assignLifeforms method is a magic, auto-created method. A new record is inserted on the table Lifeforms (MO_SAVE) and, after that, a new record is inserted on the table Civilizations. At the end, Martians is a Mother instance for the record inserted on the table lifeforms.

We did two things in one: we inserted the martian life form and we assigned a relation between the planet Mars and this form of life. What if the martian life form is already present? It's simple:

>>> Martians= ClsLifeforms(martians_dict, MO_SAVE)
>>> Mars.assignLifeforms(Martians.getFields())

We can also load the related record:

>>> martians_dict= {'life_id': 1}
>>> Martians= Mars.assignLifeforms(martians_dict, MO_LOAD)
>>> print Martians

If we want to insert some information in the relation record, for example the field "age", which specifies that a certain form of life has lived on a certain planet for "age" years, we can do it:

>>> Mars.assignLifeforms(martians_dict, MO_NOA, params= {'age': 12})

Dropping relations is easy too. To drop all relations between Mars and Lifeforms with age > 5:

>>> Mars.dropMultipleLifeforms(filter= 'age > 5')

If we want to use a filter on the table Lifeforms, we have to specify the argument jfilter:

>>> Mars.dropMultipleLifeforms(jfilter= {'life_name': 'green_people'})

These calls delete records on the table Civilizations. Moreover, we can decide to act also on the table Lifeforms:

>>> Mars.dropMultipleLifeforms(filter= 'age > 5', flag= MO_DEL)

With this call we delete not only the relation records, but also the records of the table lifeforms that have lived on Mars for more than 5 years.

Now we want all forms of life that live on Mars:

>>> lifes_box= Mars.joinLifeforms()

If we are interested only in life names, we can specify this by passing a list of fields:

>>> lifes_box= Mars.joinLifeforms(fields= ['life_name'])

We can filter and/or order the records:

>>> filter= 'age > 5'
>>> jfilter= {'life_name': 'green_people'}
>>> lifes_box= Mars.joinLifeforms(order= ['life_id'], filter= filter)
>>> lifes_box= Mars.joinMultipleLifeforms(filter= filter, jfilter= jfilter)

Finally, we could be interested in the relation record between Mars and the Martians:

>>> rel= Mars.paramsLifeforms(Martians)
>>> # we can also use dicts:
>>> martian_dict= {'life_id': Martians.getField('life_id')}
>>> rel= Mars.paramsLifeforms(martian_dict)

We could also be interested in retrieving only a specific field:

>>> rel= Mars.paramsLifeforms(Martians, fields= ['age'])

It's possible to retrieve a Mother instance (there is no need to define the class in sample.py):

>>> momma=  Mars.paramsLifeforms(Martians, flag_obj= True)

Handling Complex Children

What if we want to retrieve the planets of the solar system with two moons, or the planets of the solar system where humans live?

Mother is able to handle this type of JOIN query in an easy and transparent way.

Let's start by retrieving all planets of the solar system where humans live:

>>> humans_filter = {'life_id': 1}
>>> box= Sun.getMultiplePlanets(jbuilder= ClsLifeforms, jfilter= humans_filter)

The same concept applies when dealing with moons (it's your job to define the class ClsMoonsInfo in sample.py):

>>> moons_filter= {'num_moons': 2}
>>> box= Sun.getMultiplePlanets(jbuilder= ClsMoonsInfo, jfilter= moons_filter)

Note that Mother is able to understand which tables have to be joined, although the two situations are a little bit different.

Obviously you can use at the same time the other arguments for the getMultiplePlanets() method:

>>> Sun.getMultiplePlanets(filter= 'planet_mass > 5',
...                 jbuilder= ClsLifeforms, jfilter= humans_filter)

The MotherFusion Class

The MotherFusion class was introduced with Mother version 0.6.2. All the Mother classes presented so far are able to load records only from a single table.

The MotherFusion class allows records to be loaded from two or three tables. For example, we could be interested in loading stars and planets at the same time, performing a join between the two tables:

>>> from mother.mothers import MotherFusion
>>> momma= MotherFusion(ClsStars, ClsPlanets)
>>> print len(momma)
>>> momma.getRecords()
>>> momma.getRecords(flag_obj= True)

The table 'planets' is a child of the table 'stars'. As we can expect, the MotherFusion class is able to understand the database structure. So we can use it for two tables linked with a relation, as for the tables 'planets' and 'lifeforms', which are linked through the table 'civilizations':

>>> momma= MotherFusion(ClsPlanets, ClsLifeforms)

Note that in both cases, you don't need to specify the nature of the tables relation: Mother is able to understand it.

As we can expect, we can use filters as usual:

>>> momma= MotherFusion(ClsPlanets, ClsStars, filter= 'planet_mass > 12')
>>> momma= MotherFusion(ClsPlanets, ClsStars, filter= {'star_mass': 14})

We can also use a MoFilter instance to specify a complex filter.
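
For example, a small sketch (Postgres placeholder syntax; with SQLite you would write ':m' instead of '%(m)s'):

>>> from mother.mothers import MoFilter
>>> ftr= MoFilter('planet_mass > %(m)s', store= {'m': 12})
>>> momma= MotherFusion(ClsPlanets, ClsStars, filter= ftr)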

We can also specify which fields we want to load; there are different ways to do it. If there is no ambiguity problem, we can simply list them:

>>> momma= MotherFusion(ClsStars, ClsPlanets, \
        fields= ['star_mass', 'planet_id'])

If there is an ambiguity problem, we can provide two dicts, or two lists, inside a tuple:

>>> momma= MotherFusion(ClsStars, ClsPlanets, \
        fields= ({'star_name': 'foo'}, {'planet_mass': 'bar'}))

This will produce: "SELECT stars.star_nams AS foo, planet.planet_mass AS bar". Following this case, remember that an empty dict means 'all table fields':

>>> momma= MotherFusion(ClsStars, ClsPlanets, \
        fields= ({}, {'planet_mass': 'bar'}))

will produce: "SELECT stars.*, planet.planet_mass as bar". The following fields specification is also valid:

>>> fields= (['star_name'], {'planet_mass': 'bar'})
>>> fields= (['star_name', 'star_id'], {})
>>> fields= (['star_name'], {})

Obviously, we can provide a MotherSession and an order:

>>> momma= MotherFusion(ClsLifeforms, ClsPlanets,
        session= MySession, order= ['star_name'])

If we set the 'distinct' argument to True, we will have a "SELECT DISTINCT" statement:

>>> momma= MotherFusion(ClsLifeforms, ClsPlanets, distinct= True)

If the two tables are not Father-Child (or Child-Father) tables, we can force the relation table to be used:

>>> momma= MotherFusion(ClsLifeforms, ClsPlanets, rtbl= 'civilizations')

In the same case, we could be interested in loading also the fields of the relation table (for example 'civilizations.age'). If we set params= True, MotherFusion will load all the fields of the relation table:

>>> momma= MotherFusion(ClsLifeforms, ClsPlanets, params= True)

We can also manually specify which fields to load from the relation table:

>>> momma= MotherFusion(ClsLifeforms, ClsPlanets, params= ['age'])

From version 0.6.3, the initialization of MotherFusion accepts a new argument, named 'side'. With this argument it is possible to perform a RIGHT or LEFT join:

>>> momma= MotherFusion(ClsLifeforms, ClsPlanets, side="RIGHT")
>>> momma= MotherFusion(ClsLifeforms, ClsPlanets, side="LEFT")

Note that, when using side, the order of the two Mother builders is important.

Finally, a MotherFusion instance is iterable:

>>> for record in momma:
...     print record

Performing Custom queries

SQL is a rich language: sometimes we need to perform custom queries. Custom queries can be executed by calling a set of functions. These functions are exported almost everywhere: for example, each Mother instance contains these methods.

These functions originate from the persistent connection (DbOne) and from each session of the connection pool: both the persistent connection and each session implement them.

The following methods are available:

  • oc_query(str, dict)
  • ov_query(str, dict)
  • or_query(str, dict)
  • mr_query(str, dict)
  • mg_query(str, list)
  • mq_query(str, list)
  • beginTrans()
  • commit()
  • rollback()

These methods, if used correctly, are SQL injection safe.

Before explaining what these methods do, let's look at some examples to understand the usage:

>>> from mother.mothers import *
>>> init_mother('/my/conf/file')
>>> from mother.abdbda import DbOne
>>> # All the following queries are in the same transaction:
>>> DbOne.beginTrans()
>>> sun= ClsStars(sun_dict, MO_SAVE)
>>> earth= sun.insertPlanets(earth_dict)
>>> earth.setField('planet_name', 'Earth')
>>> earth.update()
>>> sun.oc_query('DELETE FROM stars')
>>> DbOne.oc_query('DELETE FROM planets')
>>> sun.commit()

As you can see, when using the persistent connection it doesn't matter from which instance or class we call these methods: each db statement is executed inside the single persistent connection, shared by all instances.

When using sessions, the behaviour is similar (note that sessions are always in a transaction state):

>>> from mother.mothers import *
>>> init_mother('/my/conf/file')
>>> session= MotherSession('session_example')
>>> # All the following queries are in the same session-transaction:
>>> sun= ClsStars(sun_dict, MO_SAVE, session= session)
>>> sun.oc_query('DELETE FROM stars')
>>> earth= sun.insertPlanets(earth_dict)
>>> earth.setField('planet_name', 'Earth')
>>> earth.update()
>>> session.oc_query('DELETE FROM planets')
>>> session.endSession()

Now, let's explain the methods.

The methods to handle transactions are self explanatory. The methods oc_query, ov_query, or_query, mr_query have the same syntax:

function(str, dict)

where str is something like:

When using Postgres:
"INSERT INTO stars (star_name) VALUES (%(star_name)s)"
When using SQLite:
"INSERT INTO stars (star_name) VALUES (:star_name)"

and dict is a dictionary that contains the referenced values:

sun_dict= {'star_name': 'sun'}

These methods do:

- oc_query -> One Commit Query
   This method does not return any value, so it has to be used to
   perform db actions with no return: for example DELETE FROM...

- ov_query -> One Value Query
   This method has to return a unique value. For example the query

     SELECT COUNT(1) from stars

   will return an integer.

   If a different number of results is returned, an exception is raised.

- or_query -> One Record Query
   This method has to return a unique row: the return value is a dict,
   containing the value of each field. For example:

     SELECT star_name, star_age FROM stars WHERE stars.id = 1

   If a different number of rows is returned, an exception is raised.

- mr_query -> Multiple Record Query
   This method returns a list of dicts, one dict per fetched row.
   For example:

     SELECT * from planets

The other methods, mg_query and mq_query, are used to execute massive queries and behave like the executemany() function of the db drivers. Instead of a dict as the second argument, they take a list of dicts. These methods do:

- mq_query -> Multiple Quiet Query
   >>> li= [{'star_name': 'sun'}, {'star_name': 'alpha'}]
   >>> mq_query('DELETE FROM stars WHERE star_name = %(star_name)s', li)

   Returns nothing.

- mg_query -> Multiple Get Query
   >>> li= [{'star_name': 'sun'}, {'star_name': 'alpha'}]
   >>> mg_query('SELECT * FROM stars WHERE star_name = %(star_name)s', li)

   Returns a list of dicts

Transactions

Transactions are handled by Mother, which allows nested transactions. In this section we deal once more with the persistent connection.

As for the methods to perform action queries, beginTrans(), commit() and rollback() are DbOne methods, but they are exported to each Mother instance. Note that calling these methods from a Mother instance or calling them from the DbOne class produces the same effect:

>>> DbOne.beginTrans()
>>> try:
...     Sun.insert()
...     Sun.commit()
... except:
...     Sun.rollback()
>>>

The chance to nest transactions is very useful: if we call beginTrans() two times, we need to call commit() two times to commit our queries. rollback(), instead, needs to be called only once (and calling rollback() more than once does not produce any error).

This allows the following code:

>>> def foo():
...   Sun.beginTrans()
...   try:
...     Sun.insert()
...     Sun.commit()
...   except:
...     Sun.rollback()
...     raise Exception('Foo')
>>>
>>> def bar():
...   DbOne.beginTrans()
...   try:
...     DbOne.oc_query(myquery)
...     foo()
...     DbOne.commit()
...   except:
...     DbOne.rollback()
...     raise Exception('Bar')
>>>
>>> bar()

The function foo() is not dangerous for its transaction: we can safely call it from anywhere, because Mother is able to deal with nested transactions, doing exactly what you need: when calling bar(), queries will be committed only by the last commit() statement inside bar() itself.

Obviously, you can also call foo() directly, obtaining the classic behaviour.

Sessions and Threaded Environments

When we need to develop applications in a threaded environment, we need isolated transactions. The persistent connection is not enough, because different flows of code have to behave independently, while the persistent connection is shared by all Mother instances.

Mother implements a connection pool: by editing the Mother configuration file it's possible to tune it deeply. The file is heavily commented.

To get a session, the MotherSession() call is used; we can give a name to each session: this is very useful for debugging purposes, but you can safely call this function without arguments: Mother will assign a random name to your session:

>>> from mother.mothers import *
>>> init_mother('/my/conf/file')
>>> session= MotherSession('hello_world')

Now that a session is ready, we can begin to use it:

>>> Sun= ClsStars(sun_dict, MO_SAVE, session)
>>> earth= Sun.insertPlanets(earth_dict)
>>> earth.setField('planet_mass', 34)
>>> earth.update()

The db actions are now inside your session: note that this applies also to the db actions produced by earth, because earth was born inside the session.

Sessions are always in a transaction state. To commit the queries we can call commit() or endSession(). The endSession() call commits the transaction and puts the connection back into the pool:

>>> session.endSession()

The commit() call, instead, commits all pending queries to the database, but the session is not closed and it's possible to continue using it.
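
A small sketch of the difference (names are illustrative):

>>> session= MotherSession('partial_commit')
>>> ClsStars(sun_dict, MO_SAVE, session)
>>> session.commit()      # the insertion is committed here
>>> ClsStars({'star_name': 'alpha'}, MO_SAVE, session)
>>> session.endSession()  # commits the pending work and releases the connection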

To rollback the queries inside a session we use rollback():

>>> session= MotherSession('hello_world')
>>> try:
...   Sun= ClsStars(sun_dict, MO_SAVE, session)
...   earth= Sun.insertPlanets(earth_dict)
...   earth.setField('planet_mass', 34)
...   earth.update()
... except:
...   session.rollback()
>>>
>>> session.endSession()

To perform custom queries inside sessions, just use the session methods or the Mother instance methods:

>>> session= MotherSession('CustomQueries')
>>> try:
...   Sun= ClsStars(sun_dict, MO_SAVE, session)
...   Sun.oc_query('delete from planets')
...   session.or_query('select * from lifeforms where life_id = 1')
... except:
...   session.rollback()
>>>
>>> session.endSession()

When you develop internal methods, make sure to propagate the session:

>>> class ClsFoo(DbMother):
...   table_name= 'foo'
...   def __init__(....):
...     ...
...
...   def wrong_method(self, *args):
...     ClsBar(mydict, MO_SAVE)
...
...   def correct_method(self, *args):
...     # this works also if no session was used to
...     # initialize this instance:
...     ClsBar(mydict, MO_SAVE, self.session)
...
...   def always_correct(self, *args):
...     # this query is executed inside a session
...     # if this instance was initialized with a
...     # session, with the persistent connection
...     # otherwise.
...     self.oc_query('delete from foobar')

Finally, to monitor the connection pool, use the following methods:

>>> from mother.mothers import MotherPoolStatus, MotherPoolStratus
>>> print MotherPoolStatus()
>>> print MotherPoolStratus()

MotherTrigger

We can define triggers to be fired before or after DbMother actions. This feature requires the MotherTrigger class: to use it, we have to subclass our DbMother class with it.

Explaining this feature with an example is the fastest way. Suppose we want to send a mail to some god after a lifeform extinction from our universe. We can do the following:

from mother.mothers import *
from mother.plugins import MotherTrigger

class Lifeform(DbMother, MotherTrigger):

  table_name= 'lifeforms'

  def __init__(self, my_god= 'Shiva@Olympus.Paradise', store= {},
                   flag= MO_NOA, session= None):

      self.my_god= my_god
      self.add_trigger(MO_DEL, MO_AFTER, self.gods_alert)
      DbMother.__init__(self, store, flag, session)

  def gods_alert(self):

      my_god= self.my_god
      life_name= self.getField('life_name')
      sub= "Lifeform Extinction"
      body= "The Lifeform %s is dead." % life_name
      sendmailto(my_god, sub, body)

Each action flag (MO_SAVE, MO_DEL, MO_UP and MO_LOAD) can be used with the add_trigger() call. To trigger a function before an action, it's possible to use the MO_BEFORE symbol:

self.add_trigger(MO_DEL, MO_BEFORE, self.gods_alert)

Important

Triggers have to be set before the DbMother.__init__() call, otherwise they will be ignored when an action flag is used during initialization.

MotherCaster

The MotherCaster plugin has two goals:

  • Provide a validation for fields types
  • Provide a validation for required fields

The plugin is very useful when Mother is used with a web form, because it's possible to check field consistency (types) and field presence (required fields).

The plugin is simple: a DbMother class has to be subclassed with the MotherCaster class, which is contained in mother.plugins. After that, we need to define an internal dict.

Let's see an example:

from mother.mothers import DbMother, MO_NOA
from mother.plugins.mocaster import *

class ClsStars(DbMother, MotherCaster):

  cast_fields= {
      'star_mass': int,
      'star_age': int,
      'star_name': str }

  # Optionally...
  required_fields= ['star_age']

  def __init__(self, d= {}, flag= MO_NOA, session= None):

      MotherCaster.__init__(self, autocast= True)
      DbMother.__init__(self, d, flag, session)

Note that:

  • cast_fields does not have to contain all fields for the table.
  • required_fields is optional and the previous consideration applies to it as well.

When a ClsStars is initialized, the following happens:

  • fields are checked: if they have a wrong type, a cast is automatically performed (autocast= True).
  • if a provided field is not present in the table stars, it's dropped (note that this does not depend on cast_fields or required_fields declarations).
  • Mother controls if all required_fields are provided.

If an error occurs, the exception MoWrongFields will be raised. This is more than a simple exception: it has two attributes, ifields and mfields. ifields is a list of invalid fields, while mfields is a list of missing fields (required but absent):

>>> MyDict= dict(star_name= 19, star_age= '12', not_existent_field=1)
>>> MyStar= ClsStars(MyDict, MO_NOA)

# Here, star_name becomes '19' and star_age 12. not_existent_field
# is silently dropped.

>>> MyDict= dict(star_name= 'sun', star_mass='www')
>>> try:
...  MyStar= ClsStars(MyDict, MO_NOA)
... except MoWrongFields, errors:
...  print "Invalid types:", errors.ifields
...  print "Missing fields:", errors.mfields
>>>

Here, star_mass is invalid because it's impossible to cast it to an int, and star_age, which is a required field, is missing.

Logging

Logging, like the pool, is configurable in the Mother configuration file. It's possible to log to standard output, to file (with file rotation capabilities), to mail and to syslog (this requires enabling the syslog TCP daemon, because Windows does not support Unix sockets).

Once the logging features are tuned in the Mother configuration file, it's possible to start producing some logging activity:

>>> init_mother('/my/conf/file')
>>> from mother.speaker import RED, GREEN, YELLOW
>>> import datetime
>>> Sun= ClsStars()
>>> Sun.log_info("It's %s", GREEN(datetime.datetime.today()))
>>> Sun.log_warning('aia aia %s %s', RED(1), YELLOW('foo'))
>>> from mother.speaker import *
>>> Speaker.log_debug('the same methods are callable from the Speaker class')
>>> Speaker.log_noise('Noise Noise %s', RED('noise'))
>>> Speaker.log_insane('Insane Insane %s', RED('noise'))
>>> Sun.log_noise('Soft Soft %s', RED('noise'))

Remember that the strings are formatted in a C-like way: you don't have to use the Python % operator. Remember also to always use the %s symbol, even if you are printing integers or floats:

>>> # the python way
>>> "%s %d" % ('a', 1)
>>> # the mother way
>>> "%s %s", 'a', 1

Depending on the configured log level, some of the previously logged strings will be dropped. To use a custom logging level, use log_log():

>>> Speaker.log_log(23, 'hi %s %s %s', 1, 2 ,'a')

To turn off the logging feature, set the log level to 0.

If SMTP logging is enabled, the function log_mail() will be available: using it, you can send some logs via mail.

Custom Complex Filters

Sometimes we have to use strings as filters. For example, to get all planets with planet_mass > 5, we must do:

>>> MotherBox(ClsPlanets, filter= 'planet_mass > 5', flag= MO_LOAD)

When non-string filters are provided, Mother is able to correctly escape the various values, but when we work with strings, we are tempted to do:

>>> ftr= 'blabla %s %s' % (foo, bar)
>>> MotherBox(ClsPlanets, filter= ftr, flag= MO_LOAD)

This is not good, because SQL injection is possible. To let Mother escape your string filters, you have to use a class: MoFilter.

It's easy. Instead of:

>>> filter= 'blabla %(foo)s %(bar)s' % {'foo': foo, 'bar': bar}

it's possible to do:

>>> from mother.mothers import MoFilter
>>> store= {'foo': foo, 'bar': bar}
>>> filter= MoFilter('blabla %(foo)s %(bar)s', store= store)
>>> # For SQLite you have to use:   'blabla :foo :bar'

Now Mother will escape the filter for you, adding a security layer against SQL injection.

Moreover, we can add different types of filters to the same instance:

>>> filter= MoFilter('blabla %(foo)s %(bar)s', store= store)
>>> filter.add_filter({'age': 1})
>>> filter.add_filter('dkafsak %(az)s', store= {'az': 5})
>>> MotherBox(ClsPlanets, filter= filter, flag= MO_LOAD)

If you want to force a filter on a specific table, use the following argument:

>>> filter= MoFilter({'my': 'filter'}, table= 'my_table')

Copyright and License

Copyright (c) 2006-2007 Federico Tomassini aka efphe (effetom at gmail dot com) All rights reserved.

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

  • Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
  • Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
  • Neither the name of the University of California, Berkeley nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE REGENTS AND CONTRIBUTORS ''AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE REGENTS AND CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

Changelog History

VERSION = 0.6.4-r2

  • Mother does not have to crash if and when psycopg2.mogrify() crashes.

VERSION = 0.6.4-r1

  • registering atexit functions to close connections
  • mothermapper: new option -g

VERSION = 0.6.4

  • DbMother._update() debugged
  • DbMother.delete() does not clean the store
  • Manual updated
  • License is applied to each source file
  • MO_ULOAD and DbMother.uload dropped
  • MotherBox.getFields() dropped
  • triggers is now a plugin, so DbMother is lighter
  • imports are now absolute
  • Manual: new section for MotherFly
  • MotherBox: debugged updateBox
  • MotherMany is now iterable
  • MotherFusion is now iterable
  • MotherBox is now iterable
  • MotherBox: dropped triggers
  • MotherBox and MotherMany accept a table name instead of a DbMother class
  • initManyManager and initChildManager extended. children list is not mandatory and children can be specified by table names.
  • mothermapper uses MotherMany for -f option

VERSION = 0.6.3

  • Manual: introduces the new sections: copyright and changelog
  • Manual: new stylesheets
  • Manual: improved section for the Custom Queries
  • DbFly.ov_query() debugged
  • _DbMap inherits speaker methods directly (see previous change)
  • DbFly and DbOne are now handled in the same way
  • optimization: DbMother classes are lighter
  • sqlite bug: crash on empty results for get type queries
  • sqlite.py postgres.py: ifaces aligned
  • rewrote some methods on db interfaces
  • dropped the DictCursor psycopg2 feature
  • manual: corrections and new index layout
  • new MotherManager family functions: actionMany...
  • new MotherManager method: insertManyChildren
  • new class MotherMany
  • abdbda imports (and exports) multiple query methods
  • sqlite and postgres have now mq_query and mg_query
  • new method on postgres, sqlite and abdbda: multiple_query()
  • DbMother fields methods moved on _DbMap
  • MotherFusion: new __init__ arg side

VERSION = 0.6.2-r1

  • relParams debugged: session was not exported
  • MotherFusion: fields handler improved

VERSION = 0.6.2

  • cleaned doc: Lifes -> Lifeforms
  • DbOne was calling twice _connect()
  • Removed HeavyBug on Sqlite-Map creation
  • mothermapper -f: removed __import__, used execfile()
  • MotherFusion: it's now possible to specify jfilter
  • debugged arg_format usage
  • MotherFusion: debugged ({}, {}..) for selectWaht()
  • An interactive example for SQLite is provided
  • Examples for SQLite and Postgres are provided
  • MotherFusion: __len__ method
  • MotherFusion: introduces distinct
  • MotherFusion: introduced order
  • MotherFusion is explained in the Manual
  • mothermapper -f help prints a human readable help
  • MotherFusion is able to exclude redundant params
  • MotherFusion can load relation params
  • new class MotherFusion
  • code cleaned
  • new _DbMap._sqlFreeJoin method
  • _sqlJoin&C moved from MotherManager to _DbMap class
  • mothermapper --help: deleted -C option

VERSION = 0.6.1

  • defconf.py -> _defconf.py
  • def_pgres.py -> _def_pgres.py
  • def_sqlite.py -> _def_sqlite.py
  • mothermapper: removed C option, added -P and -S
  • added new modules: def_pgres and def_sqlite
  • debugging MotherManager methods

VERSION = 0.6.0

  • Sqlite Support
  • rewrote initialization methods
  • dbda removed: added abdbda
  • added specific db interface: postgres.py sqlite.py
  • mothermapper uses always a persistent connection
  • mothermapper meta queries reside now on the db core
  • MO_ULOAD action is now obsolete
  • no primary key is needed to load one record
  • no primary key is needed to modify db record
  • added the flag 'paranoid' to the Mother classes
  • added class filter MoFilter
  • Mother uses now MoFilter instances as internal filters
  • wrote a new documentation guide
  • _sqlBuildFilter() removed from DbMother
  • mogrify() is used only for postgres
  • logs are more clear
  • controlling fields on getField()
  • tester.py: a test unit

VERSION = 0.5.6

  • dbg: Psygres reconnection
  • logging is more accurate
  • Mother uses only the pkeys to build filters (MO_ULOAD exception)

VERSION = 0.5.5

  • exceptions are handled more simply
  • mother is now able to restore a broken connection. This is the case of a postgres restart, for example. Note that psycopg2 is broken and this feature is not working, depending on the psycopg2 version.
  • getChild() is now reimplemented
  • Avoid wildcard on select queries
  • DbMother.load() returns now the getFields() dict
  • fnaming function debugging
  • init_mother() tests the function fnaming when provided
  • MotherPoolStratus()
  • getField: obsolete parameter value. New paramater autoload.

VERSION = 0.5.4

  • new concept of calm for the pool
  • inserted controls testing session initialization
  • logs about pool are now cleaner
  • default configuration file changed
  • it's now possible, when using pool, to disable the base connection
  • implemented Pool Types: LIMITED, ELASTIC, GROWING
  • dropped sessions are now directly closed (reference leaks)
  • fixed the "Session VS Persistent" gap for internal transactions
  • DbMother class: three wrappers to handle transactions internally
  • new method: DbMother.getField()
  • mothermapper and symbols: no more massive DELETE (sync pattern)
  • mothermapper: no more transactions: now MotherSession

VERSION = 0.5.3-r1

  • Fixed bug about colors on win32: mothermapper/set_log_color()
  • Fixed a not compliant query (fields IS NULL (VS) field= NULL)
  • Fixed a bug on MotherBox._retrieve_mothers()

VERSION = 0.5.3

  • Some output changed
  • Colors are always disabled on win32 systems.
  • A stupid bug fixed on _moMap

VERSION = 0.5.2-r1

  • Bug fixed: broken map on win32 platforms (cPickle)

VERSION = 0.5.2

  • It's now possible to commit() queries inside sessions.
  • new method MotherPoolStatus()
  • PsygresPool.back_home -> PsygresPool.backHome
  • PsygresPool is not Speaker child
  • PsygresPoll attrs are now private _attrs
  • PsygresPool.session_number -> PsygresPoll.status
  • PsygresPool remembers the orphaned sessions
  • PsygresPool has now a detailed method: status()
  • introduced a new argument on relParams(): flag_obj

VERSION = 0.5.1

  • offensive logs removed
  • the map dicts are now copied when Mother reads them
  • log_rotate debugged
  • The Mother Map is saved as pickled dict
  • Various optimizations on code
  • MotherMapped family functions dropped
  • The DB map is loaded by init_mother(), once
  • init_dbda() bug fixed (DB_PORT when using Unix sockets)
  • assignRelation debugged
  • symbols on MotherMap removed (optimization)
  • initRelationManager builds the relParams() wrapper
  • new method on Mothermanager: relParams()
  • debugged getChildren (when jbuilder is specified)

VERSION = 0.5.0

  • mothermapper debugged
  • _insertNoOid() debugged when pkey is absent
  • inline docs more complete
  • lambdas removed

VERSION = 0.4.9

  • from mother.speaker import * not needed on the conf file
  • added a tester script
  • a lot of global vars removed
  • joining accepts new argument: jfilter!
  • all database actions converted to new OIDs conf
  • DbMother._sqlPkeysFromOid added
  • DbMother._sqlWhereOid dropped
  • MotherBox is not using OIDs anymore
  • init_mother test MOTHER_OIDS and does what has to be done
  • Mother conf file accepts keyword MOTHER_OIDS
  • Mother is preparing to drop completely OIDs
  • The MotherBox accepts now the new arg notriggers (defaults True)
  • oid is removed almost everywhere: it remains in _insert()

VERSION = 0.4.8-r2

  • a few bugs

VERSION = 0.4.8-r1

  • DbMother.update() bug fixed.
  • On install, check if psycopg2 is installed. If not, print a warning.

VERSION = 0.4.8

  • DbMother.update takes now an argument
  • MotherCaster accepts required_fields
  • filters fixed
  • inline docs are now fine
  • updateChildren had a bug due to the last updates
  • New Plugin Mothercaster
  • New submodule mother.plugins
  • Mother RAM Optimizations: _trigger_actions and _flag_actions are now staticmethod

VERSION = 0.4.7

  • SQL optimizations: oid loading is no longer needed everywhere. Note that queries will be executed with fewer controls, but faster.
  • SQL optimizations: MotherBox doesn't need to SELECT from DB to fire triggers
  • SQL optimizations: sql string methods are now faster and cleaner
  • Pretty Print: debug messages more friendly
  • Some debug on dbda.py
  • SQL_DEFAULT is always applied
  • Memory Optimizations: Multiple Variables inside functions
  • Color Wrappers OKI_COL, ERR_COL, INF_COL
  • MotherBox from scratch: unneeded methods removed and optimizations
  • DbMother controls fields

VERSION = 0.4.6

  • mothermapper has the new option '-v'
  • conf files and maps are loaded with execfile: import avoided
  • INSTALL
  • mothermapper accepts new options: Q, S, C. Obsolete options: r
  • multiple global variables removed from speaker
  • setup.py has been created: Mother become a standard python module!!
  • the map file could be created everywhere: init_mother(), init_dbda() and init_speaker depend now on the location of this file.
  • dbstruct.py and rels.py are merged in a unique file: the map file
  • dbmapper.py is moved on mothermapper and it's installed as system script
  • new documentation: mother_threads.txt
  • metaqry.py merged in dbmapper.py
  • Mothers classes store sessions
  • conf.py is now almost commented: default values defined
  • Sessions naming
  • New API MotherSession()
  • Initialization methods don't use globals() (more safe)
  • New Class PsygresPool to use a pool of persistent connections
  • New Class PsygresFly to use isolated DB Sessions

VERSION = 0.4.5

  • new svn layout
  • Added support for log file rotation
  • mothers should be Win32 compatible
  • speaker supports now: log_to_file, log_to_syslog, log_to_stdout, log_to_smtp
  • box.py removed: MotherBox is merged on mothers.py
  • dbda.py and speaker.py from scratch using Classes and staticmethods
  • supersingleton pattern removed
  • init_methods() introduced
  • shared.py removed
  • headers files removed (speaker_h.py, mothers_h.py)
  • speaker uses now the logging module instead of syslog: API is changed!!!

VERSION = 0.4.4-r1

  • mothers.py cleaned from old unused variables
  • mother_init works at the same time with ZpsycoPGDA and psycopg2
  • MotherBox debugged: store -> _store
  • MotherFly accepts now the argument oid

VERSION = 0.4.4

  • MotherBox debugged: select oid -> select tbl.oid
  • dbmapper uses now DEFAULT_LOGGER
  • pop inserted on MotherBox.getFields
  • MotherBox.getFields and MotherBox.filterBox optimized
  • Database adapter is implemented with SuperSingleton design pattern
  • Introduced test.py: example and test module

VERSION = 0.4.3

  • triggers debugged. The bug was heavy, so the new release.

VERSION = 0.4.2

  • debugged mothers.py:805
  • MotherBox._sqlBuildFilterBox: controls when WHERE is present in the filter.
  • MotherManager.deleteChildren: no more using getChildren
  • debug on MotherBox
  • Speaker.log_int_raise was printing 2 times the error string
  • _sqlJoinParent debugged

VERSION < 0.4.2

  • First official release