instead of "count".
- got rudimentary "mapping to multiple tables" functionality cleaned up,
more correctly documented
-
+- restored global_connect() function, which attaches to a DynamicMetaData
+instance called "default_metadata". leaving the MetaData arg out of a
+Table will use the default metadata.
0.2.1
- "pool" argument to create_engine() properly propigates
- fixes to URL, raises exception if not parsed, does not pass blank
`DynamicMetaData` is ideal for applications that need to use the same set of `Tables` for many different database connections in the same process, such as a CherryPy web application that handles multiple application instances in one process.
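As a brief sketch of that pattern (the table definition and the SQLite URI below are illustrative, not part of this change), each thread or application instance connects the shared metadata to its own engine:

    {python}
    meta = DynamicMetaData()

    users = Table('users', meta,
        Column('user_id', Integer, primary_key=True),
        Column('user_name', String(40))
    )

    # each thread points the same metadata at its own engine or URI
    # (an in-memory sqlite database here, purely for illustration)
    meta.connect('sqlite://')

    # Table operations in this thread now use that connection
    users.create()
    users.insert().execute(user_name='ed')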
+#### Using the global Metadata object
+
+Some users prefer to create `Table` objects without specifying a `MetaData` object, having Tables scoped on an application-wide basis. For them, the `default_metadata` object and the `global_connect()` function are supplied. `default_metadata` is simply an instance of `DynamicMetaData` that exists within the `sqlalchemy` namespace, and `global_connect()` is a synonym for `default_metadata.connect()`. Defining a `Table` with no `MetaData` argument will automatically use this default metadata, as follows:
+
+    {python}
+    from sqlalchemy import *
+
+    # a Table with just a name and its Columns
+    mytable = Table('mytable',
+        Column('col1', Integer, primary_key=True),
+        Column('col2', String(40))
+    )
+
+    # connect all the "anonymous" tables to a postgres uri in the current thread
+    global_connect('postgres://foo:bar@lala/test')
+
+    # create all tables in the default metadata
+    default_metadata.create_all()
+
+    # the table is bound
+    mytable.insert().execute(col1=5, col2='some value')
+
#### Reflecting Tables
Once you have a `BoundMetaData` or a connected `DynamicMetaData`, you can create `Table` objects without specifying their columns, just their names, using `autoload=True`:
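A minimal sketch of reflection, assuming `meta` is a connected metadata and a table named 'users' already exists in the database (both names are illustrative):

    {python}
    # column definitions are loaded from the database itself
    users = Table('users', meta, autoload=True)

    # reflected columns behave like explicitly declared ones
    print [c.name for c in users.columns]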
            if not hasattr(engine, '_legacy_metadata'):
                engine._legacy_metadata = BoundMetaData(engine)
            metadata = engine._legacy_metadata
+        elif metadata is not None and not isinstance(metadata, MetaData):
+            # they left MetaData out, so assume it's another SchemaItem; prepend it to *args
+            args = list(args)
+            args.insert(0, metadata)
+            metadata = None
+
+        if metadata is None:
+            # no metadata given at all - fall back to the module-level default_metadata
+            metadata = default_metadata
+
        name = str(name)    # in case of incoming unicode
        schema = kwargs.get('schema', None)
        autoload = kwargs.pop('autoload', False)
            for row in r:
                l.append(row)
            self.assert_(len(l) == 3)
-
+
+    def test_global_metadata(self):
+        t1 = Table('table1', Column('col1', Integer, primary_key=True),
+            Column('col2', String(20)))
+        t2 = Table('table2', Column('col1', Integer, primary_key=True),
+            Column('col2', String(20)))
+
+        assert t1.c.col1
+        global_connect(testbase.db)
+        default_metadata.create_all()
+        try:
+            assert t1.count().scalar() == 0
+        finally:
+            default_metadata.drop_all()
+            default_metadata.clear()
+
    def testpassiveoverride(self):
        """primarily for postgres, tests that when we get a primary key column back
        from reflecting a table which has a default value on it, we pre-execute