approach can be upgraded to this new
approach. [ticket:1401]
+ - [bug] The ORM will now perform extra checks to determine
+ that an FK dependency between two tables is
+ not significant during flush if the tables
+ are related via joined inheritance and the FK
+ dependency is not part of the inherit_condition,
+ sparing the user the need for a use_alter directive.
+ [ticket:2527]
+
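As a stand-alone sketch of the scenario this entry describes (Core-only, hypothetical table names; not taken from the patch itself): `b` is the child table of a joined-inheritance pair via `b.id -> a.id`, while `a` also carries an FK back to `b` that is not part of the inherit_condition.

```python
from sqlalchemy import MetaData, Table, Column, Integer, ForeignKey

metadata = MetaData()

# parent table; 'favorite_b_id' references the child table and is NOT
# part of the inherit_condition (which would be b.id == a.id)
a = Table('a', metadata,
    Column('id', Integer, primary_key=True),
    Column('favorite_b_id', Integer, ForeignKey('b.id')))

# child table of the joined-inheritance pair; b.id -> a.id is the
# inheritance dependency
b = Table('b', metadata,
    Column('id', Integer, ForeignKey('a.id'), primary_key=True))
```

The two FKs form a cycle between `a` and `b`; previously the non-inheritance dependency forced a use_alter directive so the flush-time table sort could proceed, whereas with this change the mapper's sort skips it automatically.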
- [feature] New standalone function with_polymorphic()
provides the functionality of query.with_polymorphic()
in a standalone form. It can be applied to any
local_table = None
"""The :class:`.Selectable` which this :class:`.Mapper` manages.
- Typically is an instance of :class:`.Table` or :class:`.Alias`.
- May also be ``None``.
+ Typically is an instance of :class:`.Table` or :class:`.Alias`.
+ May also be ``None``.
The "local" table is the
- selectable that the :class:`.Mapper` is directly responsible for
+ selectable that the :class:`.Mapper` is directly responsible for
managing from an attribute access and flush perspective. For
non-inheriting mappers, the local table is the same as the
"mapped" table. For joined-table inheritance mappers, local_table
will be the particular sub-table of the overall "join" which
- this :class:`.Mapper` represents. If this mapper is a
+ this :class:`.Mapper` represents. If this mapper is a
single-table inheriting mapper, local_table will be ``None``.
See also :attr:`~.Mapper.mapped_table`.
mapped_table = None
"""The :class:`.Selectable` to which this :class:`.Mapper` is mapped.
- Typically an instance of :class:`.Table`, :class:`.Join`, or
+ Typically an instance of :class:`.Table`, :class:`.Join`, or
:class:`.Alias`.
- The "mapped" table is the selectable that
- the mapper selects from during queries. For non-inheriting
+ The "mapped" table is the selectable that
+ the mapper selects from during queries. For non-inheriting
mappers, the mapped table is the same as the "local" table.
For joined-table inheritance mappers, mapped_table references the
full :class:`.Join` representing full rows for this particular
"""
inherits = None
- """References the :class:`.Mapper` which this :class:`.Mapper`
+ """References the :class:`.Mapper` which this :class:`.Mapper`
inherits from, if any.
This is a *read only* attribute determined during mapper construction.
"""
concrete = None
- """Represent ``True`` if this :class:`.Mapper` is a concrete
+ """Represent ``True`` if this :class:`.Mapper` is a concrete
inheritance mapper.
This is a *read only* attribute determined during mapper construction.
primary_key = None
"""An iterable containing the collection of :class:`.Column` objects
- which comprise the 'primary key' of the mapped table, from the
+ which comprise the 'primary key' of the mapped table, from the
perspective of this :class:`.Mapper`.
This list is against the selectable in :attr:`~.Mapper.mapped_table`. In the
referenced by the :class:`.Join`.
The list is also not necessarily the same as the primary key column
- collection associated with the underlying tables; the :class:`.Mapper`
+ collection associated with the underlying tables; the :class:`.Mapper`
features a ``primary_key`` argument that can override what the
:class:`.Mapper` considers as primary key columns.
"""
single = None
- """Represent ``True`` if this :class:`.Mapper` is a single table
+ """Represent ``True`` if this :class:`.Mapper` is a single table
inheritance mapper.
:attr:`~.Mapper.local_table` will be ``None`` if this flag is set.
"""
non_primary = None
- """Represent ``True`` if this :class:`.Mapper` is a "non-primary"
- mapper, e.g. a mapper that is used only to selet rows but not for
+ """Represent ``True`` if this :class:`.Mapper` is a "non-primary"
+ mapper, e.g. a mapper that is used only to select rows but not for
persistence management.
This is a *read only* attribute determined during mapper construction.
"""A mapping of "polymorphic identity" identifiers mapped to :class:`.Mapper`
instances, within an inheritance scenario.
- The identifiers can be of any type which is comparable to the
+ The identifiers can be of any type which is comparable to the
type of column represented by :attr:`~.Mapper.polymorphic_on`.
- An inheritance chain of mappers will all reference the same
+ An inheritance chain of mappers will all reference the same
polymorphic map object. The object is used to correlate incoming
result rows to target mappers.
"""
columns = None
- """A collection of :class:`.Column` or other scalar expression
+ """A collection of :class:`.Column` or other scalar expression
objects maintained by this :class:`.Mapper`.
- The collection behaves the same as that of the ``c`` attribute on
+ The collection behaves the same as that of the ``c`` attribute on
any :class:`.Table` object, except that only those columns included in
this mapping are present, and are keyed based on the attribute name
defined in the mapping, not necessarily the ``key`` attribute of the
validators = None
"""An immutable dictionary of attributes which have been decorated
- using the :func:`~.orm.validates` decorator.
-
+ using the :func:`~.orm.validates` decorator.
+
The dictionary contains string attribute names as keys
mapped to the actual validation method.
-
+
"""
c = None
self.inherits = class_mapper(self.inherits, compile=False)
if not issubclass(self.class_, self.inherits.class_):
raise sa_exc.ArgumentError(
- "Class '%s' does not inherit from '%s'" %
+ "Class '%s' does not inherit from '%s'" %
(self.class_.__name__, self.inherits.class_.__name__))
if self.non_primary != self.inherits.non_primary:
np = not self.non_primary and "primary" or "non-primary"
raise sa_exc.ArgumentError(
"Inheritance of %s mapper for class '%s' is "
- "only allowed from a %s mapper" %
+ "only allowed from a %s mapper" %
(np, self.class_.__name__, np))
# inherit_condition is optional.
if self.local_table is None:
self.inherits.local_table,
self.local_table)
self.mapped_table = sql.join(
- self.inherits.mapped_table,
+ self.inherits.mapped_table,
self.local_table,
self.inherit_condition)
"the inherited versioning column. "
"version_id_col should only be specified on "
"the base-most mapper that includes versioning." %
- (self.version_id_col.description,
+ (self.version_id_col.description,
self.inherits.version_id_col.description)
)
if self.mapped_table is None:
raise sa_exc.ArgumentError(
- "Mapper '%s' does not have a mapped_table specified."
+ "Mapper '%s' does not have a mapped_table specified."
% self)
def _set_with_polymorphic(self, with_polymorphic):
if self.inherits:
self.dispatch._update(self.inherits.dispatch)
super_extensions = set(
- chain(*[m._deprecated_extensions
+ chain(*[m._deprecated_extensions
for m in self.inherits.iterate_to_root()]))
else:
super_extensions = set()
def _configure_listeners(self):
if self.inherits:
super_extensions = set(
- chain(*[m._deprecated_extensions
+ chain(*[m._deprecated_extensions
for m in self.inherits.iterate_to_root()]))
else:
super_extensions = set()
"remove *all* current mappers from all classes." %
self.class_)
#else:
- # a ClassManager may already exist as
- # ClassManager.instrument_attribute() creates
+ # a ClassManager may already exist as
+ # ClassManager.instrument_attribute() creates
# new managers for each subclass if they don't yet exist.
_mapper_registry[self] = True
manager.mapper = self
manager.deferred_scalar_loader = self._load_scalar_attributes
-
- # The remaining members can be added by any mapper,
+
+ # The remaining members can be added by any mapper,
# entity_name None or not.
if manager.info.get(_INSTRUMENTOR, False):
return
self._readonly_props = set(
self._columntoproperty[col]
for col in self._columntoproperty
- if not hasattr(col, 'table') or
+ if not hasattr(col, 'table') or
col.table not in self._cols_by_table)
- # if explicit PK argument sent, add those columns to the
+ # if explicit PK argument sent, add those columns to the
# primary key mappings
if self._primary_key_argument:
for k in self._primary_key_argument:
len(self._pks_by_table[self.mapped_table]) == 0:
raise sa_exc.ArgumentError(
"Mapper %s could not assemble any primary "
- "key columns for mapped table '%s'" %
+ "key columns for mapped table '%s'" %
(self, self.mapped_table.description))
elif self.local_table not in self._pks_by_table and \
isinstance(self.local_table, schema.Table):
util.warn("Could not assemble any primary "
"keys for locally mapped table '%s' - "
- "no rows will be persisted in this Table."
+ "no rows will be persisted in this Table."
% self.local_table.description)
if self.inherits and \
not self.concrete and \
not self._primary_key_argument:
- # if inheriting, the "primary key" for this mapper is
+ # if inheriting, the "primary key" for this mapper is
# that of the inheriting (unless concrete or explicit)
self.primary_key = self.inherits.primary_key
else:
- # determine primary key from argument or mapped_table pks -
+ # determine primary key from argument or mapped_table pks -
# reduce to the minimal set of columns
if self._primary_key_argument:
primary_key = sql_util.reduce_columns(
if len(primary_key) == 0:
raise sa_exc.ArgumentError(
"Mapper %s could not assemble any primary "
- "key columns for mapped table '%s'" %
+ "key columns for mapped table '%s'" %
(self, self.mapped_table.description))
self.primary_key = tuple(primary_key)
if column in mapper._columntoproperty:
column_key = mapper._columntoproperty[column].key
- self._configure_property(column_key,
- column,
- init=False,
+ self._configure_property(column_key,
+ column,
+ init=False,
setparent=True)
def _configure_polymorphic_setter(self, init=False):
- """Configure an attribute on the mapper representing the
- 'polymorphic_on' column, if applicable, and not
+ """Configure an attribute on the mapper representing the
+ 'polymorphic_on' column, if applicable, and not
already generated by _configure_properties (which is typical).
Also create a setter function which will assign this
attribute to the value of the 'polymorphic_identity'
- upon instance construction, also if applicable. This
+ upon instance construction, also if applicable. This
routine will run when an instance is created.
"""
else:
# polymorphic_on is a Column or SQL expression and doesn't
# appear to be mapped.
- # this means it can be 1. only present in the with_polymorphic
+ # this means it can be 1. only present in the with_polymorphic
# selectable or 2. a totally standalone SQL expression which we'd
# hope is compatible with this mapper's mapped_table
col = self.mapped_table.corresponding_column(self.polymorphic_on)
if col is None:
- # polymorphic_on doesn't derive from any column/expression
+ # polymorphic_on doesn't derive from any column/expression
# isn't present in the mapped table.
- # we will make a "hidden" ColumnProperty for it.
- # Just check that if it's directly a schema.Column and we
+ # we will make a "hidden" ColumnProperty for it.
+ # Just check that if it's directly a schema.Column and we
# have with_polymorphic, it's likely a user error if the
# schema.Column isn't represented somehow in either mapped_table or
# with_polymorphic. Otherwise as of 0.7.4 we just go with it
"loads will not function properly"
% col.description)
else:
- # column/expression that polymorphic_on derives from
+ # column/expression that polymorphic_on derives from
# is present in our mapped table
# and is probably mapped, but polymorphic_on itself
- # is not. This happens when
- # the polymorphic_on is only directly present in the
+ # is not. This happens when
+ # the polymorphic_on is only directly present in the
# with_polymorphic selectable, as when using polymorphic_union.
# we'll make a separate ColumnProperty for it.
instrument = True
key = col.key
self._configure_property(
- key,
+ key,
properties.ColumnProperty(col, _instrument=instrument),
init=init, setparent=True)
polymorphic_key = key
self._configure_property(key, prop, init=False, setparent=False)
elif key not in self._props:
self._configure_property(
- key,
- properties.ConcreteInheritedProperty(),
+ key,
+ properties.ConcreteInheritedProperty(),
init=init, setparent=True)
def _configure_property(self, key, prop, init=True, setparent=True):
self._log("_configure_property(%s, %s)", key, prop.__class__.__name__)
if not isinstance(prop, MapperProperty):
- # we were passed a Column or a list of Columns;
+ # we were passed a Column or a list of Columns;
# generate a properties.ColumnProperty
columns = util.to_list(prop)
column = columns[0]
"explicitly."
% (prop.columns[-1], column, key))
- # existing properties.ColumnProperty from an inheriting
+ # existing properties.ColumnProperty from an inheriting
# mapper. make a copy and append our column to it
prop = prop.copy()
prop.columns.insert(0, column)
"(including its availability as a foreign key), "
"use the 'include_properties' or 'exclude_properties' "
"mapper arguments to control specifically which table "
- "columns get mapped." %
+ "columns get mapped." %
(key, self, column.key, prop))
if isinstance(prop, properties.ColumnProperty):
col = self.mapped_table.corresponding_column(prop.columns[0])
- # if the column is not present in the mapped table,
- # test if a column has been added after the fact to the
+ # if the column is not present in the mapped table,
+ # test if a column has been added after the fact to the
# parent table (or their parent, etc.) [ticket:1570]
if col is None and self.inherits:
path = [self]
break
path.append(m)
- # subquery expression, column not present in the mapped
+ # subquery expression, column not present in the mapped
# selectable.
if col is None:
col = prop.columns[0]
- # column is coming in after _readonly_props was
+ # column is coming in after _readonly_props was
# initialized; check for 'readonly'
if hasattr(self, '_readonly_props') and \
- (not hasattr(col, 'table') or
+ (not hasattr(col, 'table') or
col.table not in self._cols_by_table):
self._readonly_props.add(prop)
else:
- # if column is coming in after _cols_by_table was
+ # if column is coming in after _cols_by_table was
# initialized, ensure the col is in the right set
if hasattr(self, '_cols_by_table') and \
col.table in self._cols_by_table and \
def _log_desc(self):
return "(" + self.class_.__name__ + \
"|" + \
- (self.local_table is not None and
- self.local_table.description or
+ (self.local_table is not None and
+ self.local_table.description or
str(self.local_table)) +\
- (self.non_primary and
+ (self.non_primary and
"|non-primary" or "") + ")"
def _log(self, msg, *args):
def __str__(self):
return "Mapper|%s|%s%s" % (
self.class_.__name__,
- self.local_table is not None and
+ self.local_table is not None and
self.local_table.description or None,
self.non_primary and "|non-primary" or ""
)
for m in mappers:
if not m.isa(self):
raise sa_exc.InvalidRequestError(
- "%r does not inherit from %r" %
+ "%r does not inherit from %r" %
(m, self))
else:
mappers = []
self._mappers_from_spec(spec, selectable),
False)
- def _with_polymorphic_args(self, spec=None, selectable=False,
+ def _with_polymorphic_args(self, spec=None, selectable=False,
innerjoin=False):
if self.with_polymorphic:
if not spec:
if selectable is not None:
return mappers, selectable
else:
- return mappers, self._selectable_from_mappers(mappers,
+ return mappers, self._selectable_from_mappers(mappers,
innerjoin)
@_memoized_configured_property
mappers])
):
if getattr(c, '_is_polymorphic_discriminator', False) and \
- (self.polymorphic_on is None or
+ (self.polymorphic_on is None or
c.columns[0] is not self.polymorphic_on):
continue
yield c
return result
def _is_userland_descriptor(self, obj):
- if isinstance(obj, (MapperProperty,
+ if isinstance(obj, (MapperProperty,
attributes.QueryableAttribute)):
return False
elif not hasattr(obj, '__get__'):
return False
def common_parent(self, other):
- """Return true if the given mapper shares a
+ """Return true if the given mapper shares a
common inherited parent as this mapper."""
return self.base_mapper is other.base_mapper
for col in self.primary_key
]
- def _get_state_attr_by_column(self, state, dict_, column,
+ def _get_state_attr_by_column(self, state, dict_, column,
passive=attributes.PASSIVE_OFF):
prop = self._columntoproperty[column]
return state.manager[prop.key].impl.get(state, dict_, passive=passive)
dict_ = attributes.instance_dict(obj)
return self._get_committed_state_attr_by_column(state, dict_, column)
- def _get_committed_state_attr_by_column(self, state, dict_,
+ def _get_committed_state_attr_by_column(self, state, dict_,
column, passive=attributes.PASSIVE_OFF):
prop = self._columntoproperty[column]
if statement is not None:
result = loading.load_on_ident(
session.query(self).from_statement(statement),
- None,
- only_load_props=attribute_names,
+ None,
+ only_load_props=attribute_names,
refresh_state=state
)
_none_set.issuperset(identity_key):
util.warn("Instance %s to be refreshed doesn't "
"contain a full primary key - can't be refreshed "
- "(and shouldn't be expired, either)."
+ "(and shouldn't be expired, either)."
% state_str(state))
return
result = loading.load_on_ident(
session.query(self),
- identity_key,
- refresh_state=state,
+ identity_key,
+ refresh_state=state,
only_load_props=attribute_names)
- # if instance is pending, a refresh operation
+ # if instance is pending, a refresh operation
# may not complete (even if PK attributes are assigned)
if has_key and result is None:
raise orm_exc.ObjectDeletedError(state)
"""assemble a WHERE clause which retrieves a given state by primary
key, using a minimized set of tables.
- Applies to a joined-table inheritance mapper where the
+ Applies to a joined-table inheritance mapper where the
requested attribute names are only present on joined tables,
- not the base table. The WHERE clause attempts to include
+ not the base table. The WHERE clause attempts to include
only those tables to minimize joins.
"""
props = self._props
tables = set(chain(
- *[sql_util.find_tables(c, check_columns=True)
+ *[sql_util.find_tables(c, check_columns=True)
for key in attribute_names
for c in props[key].columns]
))
if leftcol.table not in tables:
leftval = self._get_committed_state_attr_by_column(
- state, state.dict,
- leftcol,
+ state, state.dict,
+ leftcol,
passive=attributes.PASSIVE_NO_INITIALIZE)
if leftval is attributes.PASSIVE_NO_RESULT or leftval is None:
raise ColumnsNotAvailable()
type_=binary.right.type)
elif rightcol.table not in tables:
rightval = self._get_committed_state_attr_by_column(
- state, state.dict,
- rightcol,
+ state, state.dict,
+ rightcol,
passive=attributes.PASSIVE_NO_INITIALIZE)
if rightval is attributes.PASSIVE_NO_RESULT or rightval is None:
raise ColumnsNotAvailable()
start = True
if start and not mapper.single:
allconds.append(visitors.cloned_traverse(
- mapper.inherit_condition,
- {},
+ mapper.inherit_condition,
+ {},
{'binary':visit_binary}
)
)
visited_states = set()
prp, mpp = object(), object()
- visitables = deque([(deque(self._props.values()), prp,
+ visitables = deque([(deque(self._props.values()), prp,
state, state.dict)])
while visitables:
prop = iterator.popleft()
if type_ not in prop.cascade:
continue
- queue = deque(prop.cascade_iterator(type_, parent_state,
+ queue = deque(prop.cascade_iterator(type_, parent_state,
parent_dict, visited_states, halt_on))
if queue:
visitables.append((queue, mpp, None, None))
corresponding_dict = iterator.popleft()
yield instance, instance_mapper, \
corresponding_state, corresponding_dict
- visitables.append((deque(instance_mapper._props.values()),
- prp, corresponding_state,
+ visitables.append((deque(instance_mapper._props.values()),
+ prp, corresponding_state,
corresponding_dict))
@_memoized_configured_property
table_to_mapper = {}
for mapper in self.base_mapper.self_and_descendants:
for t in mapper.tables:
- table_to_mapper[t] = mapper
+ table_to_mapper.setdefault(t, mapper)
+
+ def skip(fk):
+ # attempt to skip dependencies that are not
+ # significant to the inheritance chain
+ # for two tables that are related by inheritance.
+ # while that dependency may be important, it's technically
+ # not what we mean to sort on here.
+ parent = table_to_mapper.get(fk.parent.table)
+ dep = table_to_mapper.get(fk.column.table)
+ if parent is not None and \
+ dep is not None and \
+ dep is not parent and \
+ dep.inherit_condition is not None:
+ cols = set(sql_util.find_columns(dep.inherit_condition))
+ if parent.inherit_condition is not None:
+ cols = cols.union(sql_util.find_columns(
+ parent.inherit_condition))
+ return fk.parent not in cols and fk.column not in cols
+ else:
+ return fk.parent not in cols
+ return False
- sorted_ = sql_util.sort_tables(table_to_mapper.iterkeys())
+ sorted_ = sql_util.sort_tables(table_to_mapper.iterkeys(),
+ skip_fn=skip)
ret = util.OrderedDict()
for t in sorted_:
ret[t] = table_to_mapper[t]
@util.memoized_property
def _table_to_equated(self):
- """memoized map of tables to collections of columns to be
+ """memoized map of tables to collections of columns to be
synchronized upwards to the base mapper."""
result = util.defaultdict(list)
"""Initialize the inter-mapper relationships of all mappers that
have been constructed thus far.
- This function can be called any number of times, but in
+ This function can be called any number of times, but in
most cases is handled internally.
"""
global _new_mappers
if not _new_mappers:
- return
+ return
_call_configured = None
_COMPILE_MUTEX.acquire()
return
# initialize properties on all mappers
- # note that _mapper_registry is unordered, which
- # may randomly conceal/reveal issues related to
+ # note that _mapper_registry is unordered, which
+ # may randomly conceal/reveal issues related to
# the order of mapper compilation
for mapper in list(_mapper_registry):
if getattr(mapper, '_configure_failed', False):
condition which is not supported.
:param \*names: list of attribute names to be validated.
- :param include_removes: if True, "remove" events will be
+ :param include_removes: if True, "remove" events will be
sent as well - the validation function must accept an additional
argument "is_remove" which will be a boolean.
def _event_on_first_init(manager, cls):
"""Initial mapper compilation trigger.
-
+
instrumentation calls this one when InstanceState
is first generated, and is needed for legacy mutable
attributes to work.
def _event_on_init(state, args, kwargs):
"""Run init_instance hooks.
-
+
This also includes mapper compilation, normally not needed
here but helps with some piecemeal configuration
scenarios (such as in the ORM tutorial).
-
+
"""
instrumenting_mapper = state.manager.info.get(_INSTRUMENTOR)
"""Utility functions that build upon SQL and Schema constructs."""
-def sort_tables(tables):
+def sort_tables(tables, skip_fn=None):
"""sort a collection of Table objects in order of their foreign-key dependency."""
tables = list(tables)
def visit_foreign_key(fkey):
if fkey.use_alter:
return
+ elif skip_fn and skip_fn(fkey):
+ return
parent_table = fkey.column.table
if parent_table in tables:
child_table = fkey.parent.table
tuples.append((parent_table, child_table))
for table in tables:
- visitors.traverse(table,
- {'schema_visitor':True},
+ visitors.traverse(table,
+ {'schema_visitor':True},
{'foreign_key':visit_foreign_key})
tuples.extend(
return list(topological.sort(tuples, tables))
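The effect of the new ``skip_fn`` hook can be sketched stand-alone (plain Python, no SQLAlchemy; the tuple-of-names edge representation is a simplification of the ForeignKey objects the real function traverses):

```python
from collections import defaultdict

def sort_tables_demo(tables, edges, skip_fn=None):
    # edges are (child, parent) pairs: the child table carries an FK
    # to the parent table, so the parent must sort first.  skip_fn may
    # veto an edge, mirroring sort_tables(..., skip_fn=skip).
    deps = defaultdict(set)
    for child, parent in edges:
        if skip_fn is not None and skip_fn(child, parent):
            continue
        deps[child].add(parent)
    result, remaining = [], set(tables)
    while remaining:
        # emit every table whose remaining dependencies are satisfied
        ready = sorted(t for t in remaining if deps[t] <= set(result))
        if not ready:
            raise ValueError("circular FK dependency among: %s"
                             % sorted(remaining))
        result.extend(ready)
        remaining.difference_update(ready)
    return result
```

With a cyclic pair of edges (joined inheritance ``b -> a`` plus an unrelated FK ``a -> b``), the sort fails unless the non-inheritance edge is vetoed by the skip function, which is exactly the situation the mapper-level ``skip()`` above is built for.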
def find_join_source(clauses, join_to):
- """Given a list of FROM clauses and a selectable,
- return the first index and element from the list of
- clauses which can be joined against the selectable. returns
+ """Given a list of FROM clauses and a selectable,
+ return the first index and element from the list of
+ clauses which can be joined against the selectable. returns
None, None if no match is found.
e.g.::
def visit_binary_product(fn, expr):
"""Produce a traversal of the given expression, delivering
column comparisons to the given function.
-
+
The function is of the form::
-
+
def my_fn(binary, left, right)
-
- For each binary expression located which has a
+
+ For each binary expression located which has a
comparison operator, the product of "left" and
"right" will be delivered to that function,
in terms of that binary.
-
+
Hence an expression like::
-
+
and_(
(a + b) == q + func.sum(e + f),
j == r
)
-
+
would have the traversal::
-
+
a <eq> q
a <eq> e
a <eq> f
That is, every combination of "left" and
"right" that doesn't further contain
a binary comparison is passed as pairs.
-
+
"""
stack = []
def visit(element):
yield e
list(visit(expr))
-def find_tables(clause, check_columns=False,
- include_aliases=False, include_joins=False,
+def find_tables(clause, check_columns=False,
+ include_aliases=False, include_joins=False,
include_selects=False, include_crud=False):
"""locate Table objects within the given expression."""
(
not isinstance(t, expression._UnaryExpression) or \
not operators.is_ordering_modifier(t.modifier)
- ):
+ ):
cols.add(t)
else:
for c in t.get_children():
class _repr_params(object):
"""A string view of bound parameters, truncating
display to the given number of 'multi' parameter sets.
-
+
"""
def __init__(self, params, batches):
self.params = params
def expression_as_ddl(clause):
- """Given a SQL expression, convert for usage in DDL, such as
+ """Given a SQL expression, convert for usage in DDL, such as
CREATE INDEX and CHECK CONSTRAINT.
Converts bind params into quoted literals, column identifiers
return visitors.cloned_traverse(crit, {}, {'binary':visit_binary})
-def join_condition(a, b, ignore_nonexistent_tables=False,
+def join_condition(a, b, ignore_nonexistent_tables=False,
a_subset=None,
consider_as_foreign_keys=None):
"""create a join condition between two tables or selectables.
if left is None:
continue
for fk in sorted(
- b.foreign_keys,
+ b.foreign_keys,
key=lambda fk:fk.parent._creation_order):
if consider_as_foreign_keys is not None and \
fk.parent not in consider_as_foreign_keys:
constraints.add(fk.constraint)
if left is not b:
for fk in sorted(
- left.foreign_keys,
+ left.foreign_keys,
key=lambda fk:fk.parent._creation_order):
if consider_as_foreign_keys is not None and \
fk.parent not in consider_as_foreign_keys:
class Annotated(object):
"""clones a ClauseElement and applies an 'annotations' dictionary.
- Unlike regular clones, this clone also mimics __hash__() and
+ Unlike regular clones, this clone also mimics __hash__() and
__cmp__() of the original element so that it takes its place
in hashed collections.
A reference to the original element is maintained, for the important
- reason of keeping its hash value current. When GC'ed, the
+ reason of keeping its hash value current. When GC'ed, the
hash value may be reused, causing conflicts.
"""
try:
cls = annotated_classes[element.__class__]
except KeyError:
- cls = annotated_classes[element.__class__] = type.__new__(type,
- "Annotated%s" % element.__class__.__name__,
+ cls = annotated_classes[element.__class__] = type.__new__(type,
+ "Annotated%s" % element.__class__.__name__,
(Annotated, element.__class__), {})
return object.__new__(cls)
def __init__(self, element, values):
- # force FromClause to generate their internal
+ # force FromClause to generate their internal
# collections into __dict__
if isinstance(element, expression.FromClause):
element.c
exec "annotated_classes[cls] = Annotated%s" % (cls.__name__)
def _deep_annotate(element, annotations, exclude=None):
- """Deep copy the given ClauseElement, annotating each element
+ """Deep copy the given ClauseElement, annotating each element
with the given annotations dictionary.
Elements within the exclude collection will be cloned but not annotated.
element = clone(element)
return element
-def _shallow_annotate(element, annotations):
- """Annotate the given ClauseElement and copy its internals so that
- internal objects refer to the new annotated object.
+def _shallow_annotate(element, annotations):
+ """Annotate the given ClauseElement and copy its internals so that
+ internal objects refer to the new annotated object.
- Basically used to apply a "dont traverse" annotation to a
- selectable, without digging throughout the whole
- structure wasting time.
- """
- element = element._annotate(annotations)
- element._copy_internals()
- return element
+ Basically used to apply a "don't traverse" annotation to a
+ selectable, without digging throughout the whole
+ structure wasting time.
+ """
+ element = element._annotate(annotations)
+ element._copy_internals()
+ return element
def splice_joins(left, right, stop_on=None):
if left is None:
return expression.ColumnSet(columns.difference(omit))
-def criterion_as_pairs(expression, consider_as_foreign_keys=None,
+def criterion_as_pairs(expression, consider_as_foreign_keys=None,
consider_as_referenced_keys=None, any_operator=False):
"""traverse an expression and locate binary criterion pairs."""
if consider_as_foreign_keys:
if binary.left in consider_as_foreign_keys and \
- (col_is(binary.right, binary.left) or
+ (col_is(binary.right, binary.left) or
binary.right not in consider_as_foreign_keys):
pairs.append((binary.right, binary.left))
elif binary.right in consider_as_foreign_keys and \
- (col_is(binary.left, binary.right) or
+ (col_is(binary.left, binary.right) or
binary.left not in consider_as_foreign_keys):
pairs.append((binary.left, binary.right))
elif consider_as_referenced_keys:
if binary.left in consider_as_referenced_keys and \
- (col_is(binary.right, binary.left) or
+ (col_is(binary.right, binary.left) or
binary.right not in consider_as_referenced_keys):
pairs.append((binary.left, binary.right))
elif binary.right in consider_as_referenced_keys and \
- (col_is(binary.left, binary.right) or
+ (col_is(binary.left, binary.right) or
binary.left not in consider_as_referenced_keys):
pairs.append((binary.right, binary.left))
else:
def folded_equivalents(join, equivs=None):
"""Return a list of uniquely named columns.
- The column list of the given Join will be narrowed
+ The column list of the given Join will be narrowed
down to a list of all equivalently-named,
equated columns folded into one column, where 'equated' means they are
equated to each other in the ON clause of this join.
This function is used by Join.select(fold_equivalents=True).
- Deprecated. This function is used for a certain kind of
+ Deprecated. This function is used for a certain kind of
"polymorphic_union" which is designed to achieve joined
table inheritance where the base table has no "discriminator"
- column; [ticket:1131] will provide a better way to
+ column; [ticket:1131] will provide a better way to
achieve this.
"""
s.c.col1 == table2.c.col1
"""
- def __init__(self, selectable, equivalents=None,
- include=None, exclude=None,
- include_fn=None, exclude_fn=None,
+ def __init__(self, selectable, equivalents=None,
+ include=None, exclude=None,
+ include_fn=None, exclude_fn=None,
adapt_on_names=False):
self.__traverse_options__ = {'stop_on':[selectable]}
self.selectable = selectable
def _corresponding_column(self, col, require_embedded, _seen=util.EMPTY_SET):
newcol = self.selectable.corresponding_column(
- col,
+ col,
require_embedded=require_embedded)
if newcol is None and col in self.equivalents and col not in _seen:
for equiv in self.equivalents[col]:
- newcol = self._corresponding_column(equiv,
- require_embedded=require_embedded,
+ newcol = self._corresponding_column(equiv,
+ require_embedded=require_embedded,
_seen=_seen.union([col]))
if newcol is not None:
return newcol
class ColumnAdapter(ClauseAdapter):
"""Extends ClauseAdapter with extra utility functions.
- Provides the ability to "wrap" this ClauseAdapter
+ Provides the ability to "wrap" this ClauseAdapter
around another, a columns dictionary which returns
- adapted elements given an original, and an
+ adapted elements given an original, and an
adapted_row() factory.
"""
- def __init__(self, selectable, equivalents=None,
- chain_to=None, include=None,
+ def __init__(self, selectable, equivalents=None,
+ chain_to=None, include=None,
exclude=None, adapt_required=False):
ClauseAdapter.__init__(self, selectable, equivalents, include, exclude)
if chain_to:
c = c.label(None)
# adapt_required indicates that if we got the same column
- # back which we put in (i.e. it passed through),
+ # back which we put in (i.e. it passed through),
# it's not correct. this is used by eagerloading which
# knows that all columns and expressions need to be adapted
# to a result row, and a "passthrough" is definitely targeting
from test.lib import fixtures
from test.orm import _fixtures
from test.lib.schema import Table, Column
+from sqlalchemy.ext.declarative import declarative_base
class O2MTest(fixtures.MappedTest):
"""deals with inheritance and one-to-many relationships"""
class PolymorphicOnNotLocalTest(fixtures.MappedTest):
@classmethod
def define_tables(cls, metadata):
- t1 = Table('t1', metadata,
+ t1 = Table('t1', metadata,
Column('id', Integer, primary_key=True,
- test_needs_autoincrement=True),
+ test_needs_autoincrement=True),
Column('x', String(10)),
Column('q', String(10)))
- t2 = Table('t2', metadata,
+ t2 = Table('t2', metadata,
Column('id', Integer, primary_key=True,
- test_needs_autoincrement=True),
- Column('y', String(10)),
+ test_needs_autoincrement=True),
+ Column('y', String(10)),
Column('xid', ForeignKey('t1.id')))
@classmethod
"discriminator":column_property(expr)
}, polymorphic_identity="parent",
polymorphic_on=expr)
- mapper(Child, t2, inherits=Parent,
+ mapper(Child, t2, inherits=Parent,
polymorphic_identity="child")
self._roundtrip(parent_ident='p', child_ident='c')
self._roundtrip(parent_ident='p', child_ident='c')
def test_polymorphic_on_expr_implicit_map_no_label_single(self):
- """test that single_table_criterion is propagated
+ """test that single_table_criterion is propagated
with a standalone expr"""
t2, t1 = self.tables.t2, self.tables.t1
Parent, Child = self.classes.Parent, self.classes.Child
self._roundtrip(parent_ident='p', child_ident='c')
def test_polymorphic_on_expr_implicit_map_w_label_single(self):
- """test that single_table_criterion is propagated
+ """test that single_table_criterion is propagated
with a standalone expr"""
t2, t1 = self.tables.t2, self.tables.t1
Parent, Child = self.classes.Parent, self.classes.Child
"discriminator":cprop
}, polymorphic_identity="parent",
polymorphic_on=cprop)
- mapper(Child, t2, inherits=Parent,
+ mapper(Child, t2, inherits=Parent,
polymorphic_identity="child")
self._roundtrip(parent_ident='p', child_ident='c')
"discriminator":cprop
}, polymorphic_identity="parent",
polymorphic_on="discriminator")
- mapper(Child, t2, inherits=Parent,
+ mapper(Child, t2, inherits=Parent,
polymorphic_identity="child")
self._roundtrip(parent_ident='p', child_ident='c')
[Child]
)
+class SortOnlyOnImportantFKsTest(fixtures.MappedTest):
+ @classmethod
+ def define_tables(cls, metadata):
+ Table('a', metadata,
+ Column('id', Integer, primary_key=True),
+ Column('b_id', Integer,
+ ForeignKey('b.id', use_alter=True, name='b'))
+ )
+ Table('b', metadata,
+ Column('id', Integer, ForeignKey('a.id'), primary_key=True)
+ )
+
+ @classmethod
+ def setup_classes(cls):
+ Base = declarative_base()
+
+ class A(Base):
+ __tablename__ = "a"
+
+ id = Column(Integer, primary_key=True)
+ b_id = Column(Integer, ForeignKey('b.id'))
+
+ class B(A):
+ __tablename__ = "b"
+
+ id = Column(Integer, ForeignKey('a.id'), primary_key=True)
+
+ __mapper_args__ = {'inherit_condition': id == A.id}
+
+ cls.classes.A = A
+ cls.classes.B = B
+
+ def test_flush(self):
+ s = Session(testing.db)
+ s.add(self.classes.B())
+ s.flush()
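The behavior this test exercises (per the [ticket:2527] changelog entry) is a dependency sort during flush that is allowed to discard an FK edge when it is not part of the inherit_condition, breaking what would otherwise be a cycle between `a` and `b`. A minimal sketch of that idea, in plain Python and not SQLAlchemy's actual unit-of-work code (the function and edge format are hypothetical):

```python
from collections import defaultdict

def flush_order(tables, deps):
    """Order tables parent-first for INSERT.

    deps: list of (parent, child, important) edges; an edge means the
    child's rows must be inserted after the parent's.  Edges flagged
    important=False (e.g. an FK that is not part of a joined-inheritance
    inherit_condition) may be dropped to break a cycle, analogous to the
    use_alter-style relief described in [ticket:2527].
    """
    edges = [(p, c) for p, c, important in deps if important]
    incoming = defaultdict(set)
    for p, c in edges:
        incoming[c].add(p)
    ordered, remaining = [], set(tables)
    while remaining:
        # Tables whose remaining important parents are all placed.
        ready = sorted(t for t in remaining if not (incoming[t] & remaining))
        if not ready:
            raise ValueError("circular dependency among important FKs")
        ordered.extend(ready)
        remaining -= set(ready)
    return ordered
```

With `a.b_id -> b` marked unimportant and `b.id -> a` (the inherit condition) kept, only the `a`-before-`b` constraint survives, so the flush can proceed without a `use_alter` directive from the user.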
class FalseDiscriminatorTest(fixtures.MappedTest):
@classmethod
def define_tables(cls, metadata):
global t1
- t1 = Table('t1', metadata,
- Column('id', Integer, primary_key=True, test_needs_autoincrement=True),
+ t1 = Table('t1', metadata,
+ Column('id', Integer, primary_key=True, test_needs_autoincrement=True),
Column('type', Boolean, nullable=False))
def test_false_on_sub(self):
@classmethod
def define_tables(cls, metadata):
Table('table_a', metadata,
- Column('id', Integer, primary_key=True,
+ Column('id', Integer, primary_key=True,
test_needs_autoincrement=True),
Column('class_name', String(50))
)
Table('table_b', metadata,
- Column('id', Integer, ForeignKey('table_a.id'),
+ Column('id', Integer, ForeignKey('table_a.id'),
primary_key=True),
Column('class_name', String(50))
)
class C(B):
pass
- mapper(A, table_a,
- polymorphic_on=table_a.c.class_name,
+ mapper(A, table_a,
+ polymorphic_on=table_a.c.class_name,
polymorphic_identity='a')
- mapper(B, table_b, inherits=A,
- polymorphic_on=table_b.c.class_name,
+ mapper(B, table_b, inherits=A,
+ polymorphic_on=table_b.c.class_name,
polymorphic_identity='b')
- mapper(C, table_c, inherits=B,
+ mapper(C, table_c, inherits=B,
polymorphic_identity='c')
def test_poly_configured_immediate(self):
def define_tables(cls, metadata):
global t1, t2, t3, t4
t1= Table('t1', metadata,
- Column('id', Integer, primary_key=True,
+ Column('id', Integer, primary_key=True,
test_needs_autoincrement=True),
Column('data', String(30))
)
t2 = Table('t2', metadata,
- Column('id', Integer, primary_key=True,
+ Column('id', Integer, primary_key=True,
test_needs_autoincrement=True),
Column('t1id', Integer, ForeignKey('t1.id')),
Column('type', String(30)),
Column('data', String(30))
)
t3 = Table('t3', metadata,
- Column('id', Integer, ForeignKey('t2.id'),
+ Column('id', Integer, ForeignKey('t2.id'),
primary_key=True),
Column('moredata', String(30)))
t4 = Table('t4', metadata,
- Column('id', Integer, primary_key=True,
+ Column('id', Integer, primary_key=True,
test_needs_autoincrement=True),
Column('t3id', Integer, ForeignKey('t3.id')),
Column('data', String(30)))
self.assert_(len(q.first().eager) == 1)
class EagerTargetingTest(fixtures.MappedTest):
- """test a scenario where joined table inheritance might be
+ """test a scenario where joined table inheritance might be
confused as an eagerly loaded joined table."""
@classmethod
class B(A):
pass
- mapper(A, a_table, polymorphic_on=a_table.c.type, polymorphic_identity='A',
+ mapper(A, a_table, polymorphic_on=a_table.c.type, polymorphic_identity='A',
properties={
'children': relationship(A, order_by=a_table.c.name)
})
class Stuff(Base):
pass
mapper(Stuff, stuff)
- mapper(Base, base,
- polymorphic_on=base.c.discriminator,
- version_id_col=base.c.version_id,
+ mapper(Base, base,
+ polymorphic_on=base.c.discriminator,
+ version_id_col=base.c.version_id,
polymorphic_identity=1, properties={
'stuff':relationship(Stuff)
})
sess.flush()
assert_raises(orm_exc.StaleDataError,
- sess2.query(Base).with_lockmode('read').get,
+ sess2.query(Base).with_lockmode('read').get,
s1.id)
if not testing.db.dialect.supports_sane_rowcount:
class Sub(Base):
pass
- mapper(Base, base,
- polymorphic_on=base.c.discriminator,
+ mapper(Base, base,
+ polymorphic_on=base.c.discriminator,
version_id_col=base.c.version_id, polymorphic_identity=1)
mapper(Sub, subtable, inherits=Base, polymorphic_identity=2)
def test_explicit_props(self):
person_mapper = mapper(Person, person_table)
mapper(Employee, employee_table, inherits=person_mapper,
- properties={'pid':person_table.c.id,
+ properties={'pid':person_table.c.id,
'eid':employee_table.c.id})
self._do_test(False)
def test_explicit_composite_pk(self):
person_mapper = mapper(Person, person_table)
- mapper(Employee, employee_table,
- inherits=person_mapper,
+ mapper(Employee, employee_table,
+ inherits=person_mapper,
primary_key=[person_table.c.id, employee_table.c.id])
- assert_raises_message(sa_exc.SAWarning,
+ assert_raises_message(sa_exc.SAWarning,
r"On mapper Mapper\|Employee\|employees, "
"primary key column 'persons.id' is being "
"combined with distinct primary key column 'employees.id' "
def define_tables(cls, metadata):
global base, subtable
- base = Table('base', metadata,
+ base = Table('base', metadata,
Column('base_id', Integer, primary_key=True, test_needs_autoincrement=True),
Column('data', String(255)),
Column('sqlite_fixer', String(10))
class_mapper(Sub).get_property('id').columns,
[base.c.base_id, subtable.c.base_id]
)
-
+
s1 = Sub()
s1.id = 10
sess = create_session()
Column('type', String(50)),
Column('counter', Integer, server_default="1")
)
- Table('sub', metadata,
+ Table('sub', metadata,
Column('id', Integer, ForeignKey('base.id'), primary_key=True),
Column('sub', String(50)),
Column('counter', Integer, server_default="1"),
)
def test_optimized_passes(self):
- """"test that the 'optimized load' routine doesn't crash when
+ """"test that the 'optimized load' routine doesn't crash when
a column in the join condition is not available."""
base, sub = self.tables.base, self.tables.sub
# redefine Sub's "id" to favor the "id" col in the subtable.
# "id" is also part of the primary join condition
- mapper(Sub, sub, inherits=Base,
+ mapper(Sub, sub, inherits=Base,
polymorphic_identity='sub',
properties={'id':[sub.c.id, base.c.id]})
sess = sessionmaker()()
sess.commit()
sess.expunge_all()
- # load s1 via Base. s1.id won't populate since it's relative to
- # the "sub" table. The optimized load kicks in and tries to
+ # load s1 via Base. s1.id won't populate since it's relative to
+ # the "sub" table. The optimized load kicks in and tries to
# generate on the primary join, but cannot since "id" is itself unloaded.
# the optimized load needs to return "None" so regular full-row loading proceeds
s1 = sess.query(Base).first()
sess.expunge_all()
# query a bunch of rows to ensure there's no cartesian
# product against "base" occurring, it is in fact
- # detecting that "base" needs to be in the join
+ # detecting that "base" needs to be in the join
# criterion
eq_(
sess.query(Base).order_by(Base.id).all(),
pass
class Sub(Base):
pass
- mapper(Base, base, polymorphic_on=base.c.type,
+ mapper(Base, base, polymorphic_on=base.c.type,
polymorphic_identity='base')
m = mapper(Sub, sub, inherits=Base, polymorphic_identity='sub')
s1 = Sub()
- assert m._optimized_get_statement(attributes.instance_state(s1),
+ assert m._optimized_get_statement(attributes.instance_state(s1),
['counter2']) is None
# loads s1.id as None
eq_(s1.id, None)
# this now will come up with a value of None for id - should reject
- assert m._optimized_get_statement(attributes.instance_state(s1),
+ assert m._optimized_get_statement(attributes.instance_state(s1),
['counter2']) is None
s1.id = 1
attributes.instance_state(s1).commit_all(s1.__dict__, None)
- assert m._optimized_get_statement(attributes.instance_state(s1),
+ assert m._optimized_get_statement(attributes.instance_state(s1),
['counter2']) is not None
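The assertions above follow the rule spelled out in the earlier comments: `_optimized_get_statement` must return ``None`` whenever a column in the inheritance join condition is itself unloaded (or committed as ``None``), so that a regular full-row load runs instead. A hedged sketch of that guard, with hypothetical names rather than SQLAlchemy's internals:

```python
def optimized_get_statement(loaded, join_cols, wanted):
    """Build a narrow SELECT for just the expired attributes, but only
    if every column of the inheritance join condition has a usable
    committed value; otherwise return None and let the caller fall
    back to full-row loading.  Purely illustrative.
    """
    for col in join_cols:
        if col not in loaded or loaded[col] is None:
            return None  # can't formulate the join -- fall back
    return "SELECT %s WHERE %s" % (
        ", ".join(sorted(wanted)),
        " AND ".join("%s = %r" % (c, loaded[c]) for c in join_cols))
```

This matches the three states the test walks through: no value at all, a committed value of ``None`` (both rejected), then a real committed ``id`` (accepted).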
def test_load_expired_on_pending_twolevel(self):
class SubSub(Sub):
pass
- mapper(Base, base, polymorphic_on=base.c.type,
+ mapper(Base, base, polymorphic_on=base.c.type,
polymorphic_identity='base')
mapper(Sub, sub, inherits=Base, polymorphic_identity='sub')
mapper(SubSub, subsub, inherits=Sub, polymorphic_identity='subsub')
mapper(Base, base_table)
# succeeds, despite "owner" table not configured yet
- m2 = mapper(Derived, derived_table,
+ m2 = mapper(Derived, derived_table,
inherits=Base)
assert m2.inherit_condition.compare(
base_table.c.id==derived_table.c.id
Column("id", Integer, primary_key=True)
)
derived_table = Table("derived", m,
- Column("id", Integer, ForeignKey('base.id'),
+ Column("id", Integer, ForeignKey('base.id'),
primary_key=True),
Column('order_id', Integer, ForeignKey('order.foo'))
)
Column("id", Integer, primary_key=True)
)
derived_table = Table("derived", m2,
- Column("id", Integer, ForeignKey('base.id'),
+ Column("id", Integer, ForeignKey('base.id'),
primary_key=True),
)
Column("id", Integer, primary_key=True)
)
derived_table = Table("derived", m,
- Column("id", Integer, ForeignKey('base.q'),
+ Column("id", Integer, ForeignKey('base.q'),
primary_key=True),
)
@classmethod
def define_tables(cls, metadata):
parents = Table('parents', metadata,
- Column('id', Integer, primary_key=True,
+ Column('id', Integer, primary_key=True,
test_needs_autoincrement=True),
Column('name', String(60)))
children = Table('children', metadata,
- Column('id', Integer, ForeignKey('parents.id'),
+ Column('id', Integer, ForeignKey('parents.id'),
primary_key=True),
Column('type', Integer,primary_key=True),
Column('name', String(60)))
@classmethod
def define_tables(cls, metadata):
Table('base', metadata,
- Column('id', Integer, primary_key=True,
+ Column('id', Integer, primary_key=True,
test_needs_autoincrement=True),
Column('type', String(50), nullable=False),
)
"""Test the fairly obvious, that an error is raised
when attempting to insert an orphan.
- Previous SQLA versions would check this constraint
+ Previous SQLA versions would check this constraint
in memory which is the original rationale for this test.
"""
__dialect__ = 'default'
def _fixture(self):
- t1 = table('t1', column('c1', Integer),
- column('c2', Integer),
+ t1 = table('t1', column('c1', Integer),
+ column('c2', Integer),
column('c3', Integer))
- t2 = table('t2', column('c1', Integer), column('c2', Integer),
- column('c3', Integer),
+ t2 = table('t2', column('c1', Integer), column('c2', Integer),
+ column('c3', Integer),
column('c4', Integer))
- t3 = table('t3', column('c1', Integer),
- column('c3', Integer),
+ t3 = table('t3', column('c1', Integer),
+ column('c3', Integer),
column('c5', Integer))
return t1, t2, t3
@classmethod
def define_tables(cls, metadata):
content = Table('content', metadata,
- Column('id', Integer, primary_key=True,
+ Column('id', Integer, primary_key=True,
test_needs_autoincrement=True),
Column('type', String(30))
)
foo = Table('foo', metadata,
- Column('id', Integer, ForeignKey('content.id'),
+ Column('id', Integer, ForeignKey('content.id'),
primary_key=True),
Column('content_type', String(30))
)
pass
class Foo(Content):
pass
- mapper(Content, self.tables.content,
+ mapper(Content, self.tables.content,
polymorphic_on=self.tables.content.c.type)
- mapper(Foo, self.tables.foo, inherits=Content,
+ mapper(Foo, self.tables.foo, inherits=Content,
polymorphic_identity='foo')
sess = create_session()
f = Foo()