mentioned throughout many of the examples here. What does it actually do? Let's start
with the canonical ``User`` and ``Address`` scenario::
- from sqlalchemy import Integer, ForeignKey, String, Column
- from sqlalchemy.ext.declarative import declarative_base
- from sqlalchemy.orm import relationship
+ from sqlalchemy import Column, ForeignKey, Integer, String
+ from sqlalchemy.orm import declarative_base, relationship
Base = declarative_base()
+
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
id = Column(Integer, primary_key=True)
name = Column(String)
addresses = relationship("Address", backref="user")
+
class Address(Base):
- __tablename__ = 'address'
+ __tablename__ = "address"
id = Column(Integer, primary_key=True)
email = Column(String)
- user_id = Column(Integer, ForeignKey('user.id'))
+ user_id = Column(Integer, ForeignKey("user.id"))
The above configuration establishes a collection of ``Address`` objects on ``User`` called
``User.addresses``. It also establishes a ``.user`` attribute on ``Address`` which will
refer to the parent ``User`` object. This includes the establishment
of an event listener on both sides which will mirror attribute operations
in both directions. The above configuration is equivalent to::
- from sqlalchemy import Integer, ForeignKey, String, Column
- from sqlalchemy.ext.declarative import declarative_base
- from sqlalchemy.orm import relationship
+ from sqlalchemy import Column, ForeignKey, Integer, String
+ from sqlalchemy.orm import declarative_base, relationship
Base = declarative_base()
+
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
id = Column(Integer, primary_key=True)
name = Column(String)
addresses = relationship("Address", back_populates="user")
+
class Address(Base):
- __tablename__ = 'address'
+ __tablename__ = "address"
id = Column(Integer, primary_key=True)
email = Column(String)
- user_id = Column(Integer, ForeignKey('user.id'))
+ user_id = Column(Integer, ForeignKey("user.id"))
user = relationship("User", back_populates="addresses")
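The mirroring happens purely at the Python attribute level, so it can be verified without a :class:`.Session` or a database. A condensed, runnable sketch of the mapping above:

```python
from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()


class User(Base):
    __tablename__ = "user"
    id = Column(Integer, primary_key=True)
    name = Column(String)
    addresses = relationship("Address", back_populates="user")


class Address(Base):
    __tablename__ = "address"
    id = Column(Integer, primary_key=True)
    email = Column(String)
    user_id = Column(Integer, ForeignKey("user.id"))
    user = relationship("User", back_populates="addresses")


u1 = User(name="jack")
a1 = Address(email="jack@example.com")

# appending on one side of the relationship populates the other side
u1.addresses.append(a1)
```

After the append, ``a1.user`` refers to ``u1`` even though nothing has been flushed.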
:paramref:`_orm.relationship.primaryjoin` argument is discussed in :ref:`relationship_primaryjoin`). For example,
suppose we limited the list of ``Address`` objects to those which start with "tony"::
- from sqlalchemy import Integer, ForeignKey, String, Column
- from sqlalchemy.ext.declarative import declarative_base
- from sqlalchemy.orm import relationship
+ from sqlalchemy import Column, ForeignKey, Integer, String
+ from sqlalchemy.orm import declarative_base, relationship
Base = declarative_base()
+
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
id = Column(Integer, primary_key=True)
name = Column(String)
- addresses = relationship("Address",
- primaryjoin="and_(User.id==Address.user_id, "
- "Address.email.startswith('tony'))",
- backref="user")
+ addresses = relationship(
+ "Address",
+ primaryjoin=(
+ "and_(User.id==Address.user_id, Address.email.startswith('tony'))"
+ ),
+ backref="user",
+ )
+
class Address(Base):
- __tablename__ = 'address'
+ __tablename__ = "address"
id = Column(Integer, primary_key=True)
email = Column(String)
- user_id = Column(Integer, ForeignKey('user.id'))
+ user_id = Column(Integer, ForeignKey("user.id"))
We can observe, by inspecting the resulting property, that both sides
of the relationship have this join condition applied::
# <other imports>
from sqlalchemy.orm import backref
+
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
id = Column(Integer, primary_key=True)
name = Column(String)
- addresses = relationship("Address",
- backref=backref("user", lazy="joined"))
+ addresses = relationship(
+ "Address",
+ backref=backref("user", lazy="joined"),
+ )
Where above, we placed a ``lazy="joined"`` directive only on the ``Address.user``
side, indicating that when a query against ``Address`` is made, a join to the ``User``
entity should be emitted automatically. We can get the equivalent
of the "backref" behavior on the Python side by using two separate :func:`_orm.relationship` constructs,
placing :paramref:`_orm.relationship.back_populates` only on one side::
- from sqlalchemy import Integer, ForeignKey, String, Column
- from sqlalchemy.ext.declarative import declarative_base
- from sqlalchemy.orm import relationship
+ from sqlalchemy import Column, ForeignKey, Integer, String
+ from sqlalchemy.orm import declarative_base, relationship
Base = declarative_base()
+
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
id = Column(Integer, primary_key=True)
name = Column(String)
- addresses = relationship("Address",
- primaryjoin="and_(User.id==Address.user_id, "
- "Address.email.startswith('tony'))",
- back_populates="user")
+
+ addresses = relationship(
+ "Address",
+ primaryjoin="and_(User.id==Address.user_id, "
+ "Address.email.startswith('tony'))",
+ back_populates="user",
+ )
+
class Address(Base):
- __tablename__ = 'address'
+ __tablename__ = "address"
id = Column(Integer, primary_key=True)
email = Column(String)
- user_id = Column(Integer, ForeignKey('user.id'))
+ user_id = Column(Integer, ForeignKey("user.id"))
+
user = relationship("User")
With the above scenario, appending an ``Address`` object to the ``.addresses``
The imports used for each of the following sections are as follows::
- from sqlalchemy import Table, Column, Integer, ForeignKey
- from sqlalchemy.orm import relationship
- from sqlalchemy.ext.declarative import declarative_base
+ from sqlalchemy import Column, ForeignKey, Integer, Table
+ from sqlalchemy.orm import declarative_base, relationship
Base = declarative_base()
-
.. _relationship_patterns_o2m:
One To Many
a collection of items represented by the child::
class Parent(Base):
- __tablename__ = 'parent'
+ __tablename__ = "parent"
id = Column(Integer, primary_key=True)
children = relationship("Child")
+
class Child(Base):
- __tablename__ = 'child'
+ __tablename__ = "child"
id = Column(Integer, primary_key=True)
- parent_id = Column(Integer, ForeignKey('parent.id'))
+ parent_id = Column(Integer, ForeignKey("parent.id"))
To establish a bidirectional relationship in one-to-many, where the "reverse"
side is a many to one, specify an additional :func:`_orm.relationship` and connect
the two using the :paramref:`_orm.relationship.back_populates` parameter::
class Parent(Base):
- __tablename__ = 'parent'
+ __tablename__ = "parent"
id = Column(Integer, primary_key=True)
children = relationship("Child", back_populates="parent")
+
class Child(Base):
- __tablename__ = 'child'
+ __tablename__ = "child"
id = Column(Integer, primary_key=True)
- parent_id = Column(Integer, ForeignKey('parent.id'))
+ parent_id = Column(Integer, ForeignKey("parent.id"))
parent = relationship("Parent", back_populates="children")
``Child`` will get a ``parent`` attribute with many-to-one semantics.
Alternatively, the :paramref:`_orm.relationship.backref` parameter may be used in place of
:paramref:`_orm.relationship.back_populates`::
class Parent(Base):
- __tablename__ = 'parent'
+ __tablename__ = "parent"
id = Column(Integer, primary_key=True)
children = relationship("Child", backref="parent")
attribute will be created::
class Parent(Base):
- __tablename__ = 'parent'
+ __tablename__ = "parent"
id = Column(Integer, primary_key=True)
- child_id = Column(Integer, ForeignKey('child.id'))
+ child_id = Column(Integer, ForeignKey("child.id"))
child = relationship("Child")
+
class Child(Base):
- __tablename__ = 'child'
+ __tablename__ = "child"
id = Column(Integer, primary_key=True)
Bidirectional behavior is achieved by adding a second :func:`_orm.relationship`
and applying the :paramref:`_orm.relationship.back_populates` parameter
in both directions::
class Parent(Base):
- __tablename__ = 'parent'
+ __tablename__ = "parent"
id = Column(Integer, primary_key=True)
- child_id = Column(Integer, ForeignKey('child.id'))
+ child_id = Column(Integer, ForeignKey("child.id"))
child = relationship("Child", back_populates="parents")
+
class Child(Base):
- __tablename__ = 'child'
+ __tablename__ = "child"
id = Column(Integer, primary_key=True)
parents = relationship("Parent", back_populates="child")
Alternatively, the :paramref:`_orm.relationship.backref` parameter
may be applied to a single :func:`_orm.relationship`, such as ``Parent.child``::
class Parent(Base):
- __tablename__ = 'parent'
+ __tablename__ = "parent"
id = Column(Integer, primary_key=True)
- child_id = Column(Integer, ForeignKey('child.id'))
+ child_id = Column(Integer, ForeignKey("child.id"))
child = relationship("Child", backref="parents")
.. _relationships_one_to_one:
relationships::
class Parent(Base):
- __tablename__ = 'parent'
+ __tablename__ = "parent"
id = Column(Integer, primary_key=True)
# one-to-many collection
children = relationship("Child", back_populates="parent")
+
class Child(Base):
- __tablename__ = 'child'
+ __tablename__ = "child"
id = Column(Integer, primary_key=True)
- parent_id = Column(Integer, ForeignKey('parent.id'))
+ parent_id = Column(Integer, ForeignKey("parent.id"))
# many-to-one scalar
parent = relationship("Parent", back_populates="children")
renaming ``Parent.children`` to ``Parent.child`` for clarity::
class Parent(Base):
- __tablename__ = 'parent'
+ __tablename__ = "parent"
id = Column(Integer, primary_key=True)
# previously one-to-many Parent.children is now
# one-to-one Parent.child
child = relationship("Child", back_populates="parent", uselist=False)
+
class Child(Base):
- __tablename__ = 'child'
+ __tablename__ = "child"
id = Column(Integer, primary_key=True)
- parent_id = Column(Integer, ForeignKey('parent.id'))
+ parent_id = Column(Integer, ForeignKey("parent.id"))
# many-to-one side remains, see tip below
parent = relationship("Parent", back_populates="child")
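Because ``uselist=False`` makes ``Parent.child`` a scalar attribute, assignment takes the place of ``.append()``. A condensed sketch of the mapping above, exercised in memory:

```python
from sqlalchemy import Column, ForeignKey, Integer
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()


class Parent(Base):
    __tablename__ = "parent"
    id = Column(Integer, primary_key=True)
    child = relationship("Child", back_populates="parent", uselist=False)


class Child(Base):
    __tablename__ = "child"
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer, ForeignKey("parent.id"))
    parent = relationship("Parent", back_populates="child")


p = Parent()
c = Child()

# scalar assignment, not .append(); back_populates still mirrors it
p.child = c
```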
from sqlalchemy.orm import backref
+
class Parent(Base):
- __tablename__ = 'parent'
+ __tablename__ = "parent"
id = Column(Integer, primary_key=True)
+
class Child(Base):
- __tablename__ = 'child'
+ __tablename__ = "child"
id = Column(Integer, primary_key=True)
- parent_id = Column(Integer, ForeignKey('parent.id'))
+ parent_id = Column(Integer, ForeignKey("parent.id"))
parent = relationship("Parent", backref=backref("child", uselist=False))
-
-
.. _relationships_many_to_many:
Many To Many
class, so that the :class:`_schema.ForeignKey` directives can locate the
remote tables with which to link::
- association_table = Table('association', Base.metadata,
- Column('left_id', ForeignKey('left.id')),
- Column('right_id', ForeignKey('right.id'))
+ association_table = Table(
+ "association",
+ Base.metadata,
+ Column("left_id", ForeignKey("left.id")),
+ Column("right_id", ForeignKey("right.id")),
)
+
class Parent(Base):
- __tablename__ = 'left'
+ __tablename__ = "left"
id = Column(Integer, primary_key=True)
- children = relationship("Child",
- secondary=association_table)
+ children = relationship("Child", secondary=association_table)
+
class Child(Base):
- __tablename__ = 'right'
+ __tablename__ = "right"
id = Column(Integer, primary_key=True)
.. tip::
this ensures that duplicate rows won't be persisted within the table regardless
of issues on the application side::
- association_table = Table('association', Base.metadata,
- Column('left_id', ForeignKey('left.id'), primary_key=True),
- Column('right_id', ForeignKey('right.id'), primary_key=True)
+ association_table = Table(
+ "association",
+ Base.metadata,
+ Column("left_id", ForeignKey("left.id"), primary_key=True),
+ Column("right_id", ForeignKey("right.id"), primary_key=True),
)
For a bidirectional relationship, both sides of the relationship contain a
collection. Specify using :paramref:`_orm.relationship.back_populates`, and
for each :func:`_orm.relationship` specify the common association table::
- association_table = Table('association', Base.metadata,
- Column('left_id', ForeignKey('left.id'), primary_key=True),
- Column('right_id', ForeignKey('right.id'), primary_key=True)
+ association_table = Table(
+ "association",
+ Base.metadata,
+ Column("left_id", ForeignKey("left.id"), primary_key=True),
+ Column("right_id", ForeignKey("right.id"), primary_key=True),
)
+
class Parent(Base):
- __tablename__ = 'left'
+ __tablename__ = "left"
id = Column(Integer, primary_key=True)
children = relationship(
- "Child",
- secondary=association_table,
- back_populates="parents")
+ "Child", secondary=association_table, back_populates="parents"
+ )
+
class Child(Base):
- __tablename__ = 'right'
+ __tablename__ = "right"
id = Column(Integer, primary_key=True)
parents = relationship(
- "Parent",
- secondary=association_table,
- back_populates="children")
+ "Parent", secondary=association_table, back_populates="children"
+ )
+
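As with the one-to-many case, the two collections mirror each other at the attribute level before anything is flushed. A condensed, runnable sketch of the mapping above:

```python
from sqlalchemy import Column, ForeignKey, Integer, Table
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()

association_table = Table(
    "association",
    Base.metadata,
    Column("left_id", ForeignKey("left.id"), primary_key=True),
    Column("right_id", ForeignKey("right.id"), primary_key=True),
)


class Parent(Base):
    __tablename__ = "left"
    id = Column(Integer, primary_key=True)
    children = relationship(
        "Child", secondary=association_table, back_populates="parents"
    )


class Child(Base):
    __tablename__ = "right"
    id = Column(Integer, primary_key=True)
    parents = relationship(
        "Parent", secondary=association_table, back_populates="children"
    )


p = Parent()
c = Child()

# appending to Parent.children mirrors onto Child.parents
p.children.append(c)
```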
When using the :paramref:`_orm.relationship.backref` parameter instead of
:paramref:`_orm.relationship.back_populates`, the backref will automatically
use the same :paramref:`_orm.relationship.secondary` argument for the
reverse relationship::
- association_table = Table('association', Base.metadata,
- Column('left_id', ForeignKey('left.id'), primary_key=True),
- Column('right_id', ForeignKey('right.id'), primary_key=True)
+ association_table = Table(
+ "association",
+ Base.metadata,
+ Column("left_id", ForeignKey("left.id"), primary_key=True),
+ Column("right_id", ForeignKey("right.id"), primary_key=True),
)
+
class Parent(Base):
- __tablename__ = 'left'
+ __tablename__ = "left"
id = Column(Integer, primary_key=True)
- children = relationship("Child",
- secondary=association_table,
- backref="parents")
+ children = relationship(
+ "Child", secondary=association_table, backref="parents"
+ )
+
class Child(Base):
- __tablename__ = 'right'
+ __tablename__ = "right"
id = Column(Integer, primary_key=True)
The :paramref:`_orm.relationship.secondary` argument of :func:`_orm.relationship` also accepts a callable
that returns the ultimate argument, which is evaluated only when mappers are first
used. Using this, we can define the ``association_table`` at a later point, as long as it's
available to the callable after all module initialization is complete::
class Parent(Base):
- __tablename__ = 'left'
+ __tablename__ = "left"
id = Column(Integer, primary_key=True)
- children = relationship("Child",
- secondary=lambda: association_table,
- backref="parents")
+ children = relationship(
+ "Child",
+ secondary=lambda: association_table,
+ backref="parents",
+ )
With the declarative extension in use, the traditional "string name of the table"
is accepted as well, matching the name of the table as stored in ``Base.metadata.tables``::
class Parent(Base):
- __tablename__ = 'left'
+ __tablename__ = "left"
id = Column(Integer, primary_key=True)
- children = relationship("Child",
- secondary="association",
- backref="parents")
+ children = relationship("Child", secondary="association", backref="parents")
.. warning:: When passed as a Python-evaluable string, the
:paramref:`_orm.relationship.secondary` argument is interpreted using Python's
``Child``::
class Association(Base):
- __tablename__ = 'association'
- left_id = Column(ForeignKey('left.id'), primary_key=True)
- right_id = Column(ForeignKey('right.id'), primary_key=True)
+ __tablename__ = "association"
+ left_id = Column(ForeignKey("left.id"), primary_key=True)
+ right_id = Column(ForeignKey("right.id"), primary_key=True)
extra_data = Column(String(50))
child = relationship("Child")
+
class Parent(Base):
- __tablename__ = 'left'
+ __tablename__ = "left"
id = Column(Integer, primary_key=True)
children = relationship("Association")
+
class Child(Base):
- __tablename__ = 'right'
+ __tablename__ = "right"
id = Column(Integer, primary_key=True)
As always, the bidirectional version makes use of :paramref:`_orm.relationship.back_populates`
or :paramref:`_orm.relationship.backref`::
class Association(Base):
- __tablename__ = 'association'
- left_id = Column(ForeignKey('left.id'), primary_key=True)
- right_id = Column(ForeignKey('right.id'), primary_key=True)
+ __tablename__ = "association"
+ left_id = Column(ForeignKey("left.id"), primary_key=True)
+ right_id = Column(ForeignKey("right.id"), primary_key=True)
extra_data = Column(String(50))
child = relationship("Child", back_populates="parents")
parent = relationship("Parent", back_populates="children")
+
class Parent(Base):
- __tablename__ = 'left'
+ __tablename__ = "left"
id = Column(Integer, primary_key=True)
children = relationship("Association", back_populates="parent")
+
class Child(Base):
- __tablename__ = 'right'
+ __tablename__ = "right"
id = Column(Integer, primary_key=True)
parents = relationship("Association", back_populates="child")
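A condensed, runnable sketch of the mapping above; note that the ``Association`` object is what gets appended to ``Parent.children``, while the ``Child`` is attached to the ``Association`` directly:

```python
from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()


class Association(Base):
    __tablename__ = "association"
    left_id = Column(ForeignKey("left.id"), primary_key=True)
    right_id = Column(ForeignKey("right.id"), primary_key=True)
    extra_data = Column(String(50))
    child = relationship("Child", back_populates="parents")
    parent = relationship("Parent", back_populates="children")


class Parent(Base):
    __tablename__ = "left"
    id = Column(Integer, primary_key=True)
    children = relationship("Association", back_populates="parent")


class Child(Base):
    __tablename__ = "right"
    id = Column(Integer, primary_key=True)
    parents = relationship("Association", back_populates="child")


a = Association(extra_data="some data")
a.child = Child()

p = Parent()
# the association object, not the Child, goes into the collection
p.children.append(a)
```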
after :meth:`.Session.commit`::
class Association(Base):
- __tablename__ = 'association'
+ __tablename__ = "association"
- left_id = Column(ForeignKey('left.id'), primary_key=True)
- right_id = Column(ForeignKey('right.id'), primary_key=True)
+ left_id = Column(ForeignKey("left.id"), primary_key=True)
+ right_id = Column(ForeignKey("right.id"), primary_key=True)
extra_data = Column(String(50))
child = relationship("Child", backref="parent_associations")
parent = relationship("Parent", backref="child_associations")
+
class Parent(Base):
- __tablename__ = 'left'
+ __tablename__ = "left"
id = Column(Integer, primary_key=True)
children = relationship("Child", secondary="association")
+
class Child(Base):
- __tablename__ = 'right'
+ __tablename__ = "right"
id = Column(Integer, primary_key=True)
Additionally, just as changes to one relationship aren't reflected in the
children = relationship("Child", back_populates="parent")
+
class Parent(Base):
# ...
children = relationship(
"Child",
order_by="desc(Child.email_address)",
- primaryjoin="Parent.id == Child.parent_id"
+ primaryjoin="Parent.id == Child.parent_id",
)
For the case where more than one module contains a class of the same name,
string names can also be specified as module-qualified paths within any of these strings::
children = relationship(
"myapp.mymodel.Child",
order_by="desc(myapp.mymodel.Child.email_address)",
- primaryjoin="myapp.mymodel.Parent.id == myapp.mymodel.Child.parent_id"
+ primaryjoin="myapp.mymodel.Parent.id == myapp.mymodel.Child.parent_id",
)
The qualified path can be any partial path that removes ambiguity between
the names. For example, to disambiguate between ``myapp.model1.Child`` and
``myapp.model2.Child``, we can specify ``model1.Child``::
children = relationship(
"model1.Child",
        order_by="desc(model1.Child.email_address)",
- primaryjoin="Parent.id == model1.Child.parent_id"
+ primaryjoin="Parent.id == model1.Child.parent_id",
)
The :func:`_orm.relationship` construct also accepts Python functions or
lambdas as input for these arguments. A Python functional approach might look like the following::
from sqlalchemy import desc
+
def _resolve_child_model():
- from myapplication import Child
- return Child
+ from myapplication import Child
+
+ return Child
+
class Parent(Base):
# ...
children = relationship(
_resolve_child_model(),
order_by=lambda: desc(_resolve_child_model().email_address),
- primaryjoin=lambda: Parent.id == _resolve_child_model().parent_id
+ primaryjoin=lambda: Parent.id == _resolve_child_model().parent_id,
)
The full list of parameters which accept Python functions/lambdas or strings
# first, module A, where Child has not been created yet,
# we create a Parent class which knows nothing about Child
+
class Parent(Base):
- # ...
+ ...
+
-    #... later, in Module B, which is imported after module A:
+    # ... later, in Module B, which is imported after module A:
class Child(Base):
- # ...
+ ...
+
from module_a import Parent
    # assign the Parent.children relationship as a class variable. The
    # declarative base class will intercept this and map the relationship.
- Parent.children = relationship(
- Child,
- primaryjoin=Child.parent_id==Parent.id
- )
+ Parent.children = relationship(Child, primaryjoin=Child.parent_id == Parent.id)
.. note:: Assignment of mapped properties to a declaratively mapped class will only
function correctly if the "declarative base" class is used, which also
parameter::
keyword_author = Table(
- 'keyword_author', Base.metadata,
- Column('author_id', Integer, ForeignKey('authors.id')),
- Column('keyword_id', Integer, ForeignKey('keywords.id'))
- )
+ "keyword_author",
+ Base.metadata,
+ Column("author_id", Integer, ForeignKey("authors.id")),
+ Column("keyword_id", Integer, ForeignKey("keywords.id")),
+ )
+
class Author(Base):
- __tablename__ = 'authors'
+ __tablename__ = "authors"
id = Column(Integer, primary_key=True)
keywords = relationship("Keyword", secondary="keyword_author")
:func:`~sqlalchemy.orm.relationship`::
class Order(Base):
- __tablename__ = 'order'
+ __tablename__ = "order"
items = relationship("Item", cascade="all, delete-orphan")
customer = relationship("User", cascade="save-update")
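The effect of ``delete-orphan`` cascade can be sketched against an in-memory SQLite database. The mappings below are condensed stand-ins for the above; in particular the ``Item`` columns, including its ``order_id`` foreign key, are illustrative additions and not part of the original example:

```python
from sqlalchemy import Column, ForeignKey, Integer, create_engine
from sqlalchemy.orm import Session, declarative_base, relationship

Base = declarative_base()


class Order(Base):
    __tablename__ = "order"
    id = Column(Integer, primary_key=True)
    items = relationship("Item", cascade="all, delete-orphan")


class Item(Base):
    __tablename__ = "item"
    id = Column(Integer, primary_key=True)
    order_id = Column(Integer, ForeignKey("order.id"))


engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

session = Session(engine)
order = Order(items=[Item(), Item()])
session.add(order)
session.commit()

# deleting the Order cascades the delete to its Item rows
session.delete(order)
session.commit()
remaining = session.query(Item).count()
```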
its arguments back into :func:`~sqlalchemy.orm.relationship`::
class Item(Base):
- __tablename__ = 'item'
+ __tablename__ = "item"
- order = relationship("Order",
- backref=backref("items", cascade="all, delete-orphan")
- )
+ order = relationship(
+ "Order", backref=backref("items", cascade="all, delete-orphan")
+ )
.. sidebar:: The Origins of Cascade
bi-directionally to a series of ``Item`` objects via relationships
``Order.items`` and ``Item.order``::
- mapper_registry.map_imperatively(Order, order_table, properties={
- 'items' : relationship(Item, back_populates='order')
- })
+ mapper_registry.map_imperatively(
+ Order,
+ order_table,
+ properties={"items": relationship(Item, back_populates="order")},
+ )
- mapper_registry.map_imperatively(Item, item_table, properties={
- 'order' : relationship(Order, back_populates='items')
- })
+ mapper_registry.map_imperatively(
+ Item,
+ item_table,
+ properties={"order": relationship(Order, back_populates="items")},
+ )
If an ``Order`` is already associated with a :class:`_orm.Session`, and
an ``Item`` object is then created and appended to the ``Order.items``
illustrate the ``cascade="all, delete"`` setting on **one** side of the
association::
- association_table = Table('association', Base.metadata,
- Column('left_id', Integer, ForeignKey('left.id')),
- Column('right_id', Integer, ForeignKey('right.id'))
+ association_table = Table(
+ "association",
+ Base.metadata,
+ Column("left_id", Integer, ForeignKey("left.id")),
+ Column("right_id", Integer, ForeignKey("right.id")),
)
+
class Parent(Base):
- __tablename__ = 'left'
+ __tablename__ = "left"
id = Column(Integer, primary_key=True)
children = relationship(
"Child",
secondary=association_table,
back_populates="parents",
- cascade="all, delete"
+ cascade="all, delete",
)
+
class Child(Base):
- __tablename__ = 'right'
+ __tablename__ = "right"
id = Column(Integer, primary_key=True)
parents = relationship(
"Parent",
class Parent(Base):
- __tablename__ = 'parent'
+ __tablename__ = "parent"
id = Column(Integer, primary_key=True)
children = relationship(
- "Child", back_populates="parent",
+ "Child",
+ back_populates="parent",
cascade="all, delete",
- passive_deletes=True
+ passive_deletes=True,
)
+
class Child(Base):
- __tablename__ = 'child'
+ __tablename__ = "child"
id = Column(Integer, primary_key=True)
- parent_id = Column(Integer, ForeignKey('parent.id', ondelete="CASCADE"))
+ parent_id = Column(Integer, ForeignKey("parent.id", ondelete="CASCADE"))
parent = relationship("Parent", back_populates="children")
The behavior of the above configuration when a parent row is deleted
``passive_deletes=True`` on the **other** side of the bidirectional
relationship as illustrated below::
- association_table = Table('association', Base.metadata,
- Column('left_id', Integer, ForeignKey('left.id', ondelete="CASCADE")),
- Column('right_id', Integer, ForeignKey('right.id', ondelete="CASCADE"))
+ association_table = Table(
+ "association",
+ Base.metadata,
+ Column("left_id", Integer, ForeignKey("left.id", ondelete="CASCADE")),
+ Column("right_id", Integer, ForeignKey("right.id", ondelete="CASCADE")),
)
+
class Parent(Base):
- __tablename__ = 'left'
+ __tablename__ = "left"
id = Column(Integer, primary_key=True)
children = relationship(
"Child",
cascade="all, delete",
)
+
class Child(Base):
- __tablename__ = 'right'
+ __tablename__ = "right"
id = Column(Integer, primary_key=True)
parents = relationship(
"Parent",
secondary=association_table,
back_populates="children",
- passive_deletes=True
+ passive_deletes=True,
)
Using the above configuration, the deletion of a ``Parent`` object proceeds
illustrated in the example below::
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
# ...
- addresses = relationship(
- "Address", cascade="all, delete-orphan")
+ addresses = relationship("Address", cascade="all, delete-orphan")
# ...
# ...
preference = relationship(
- "Preference", cascade="all, delete-orphan",
- single_parent=True)
-
+ "Preference", cascade="all, delete-orphan", single_parent=True
+ )
Above, if a hypothetical ``Preference`` object is removed from a ``User``,
it will be deleted on flush::
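A minimal runnable sketch of that removal, using hypothetical ``User`` and ``Preference`` mappings whose columns are illustrative only:

```python
from sqlalchemy import Column, ForeignKey, Integer, create_engine
from sqlalchemy.orm import Session, declarative_base, relationship

Base = declarative_base()


class User(Base):
    __tablename__ = "user"
    id = Column(Integer, primary_key=True)
    preference = relationship(
        "Preference", cascade="all, delete-orphan", single_parent=True, uselist=False
    )


class Preference(Base):
    __tablename__ = "preference"
    id = Column(Integer, primary_key=True)
    user_id = Column(Integer, ForeignKey("user.id"))


engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

session = Session(engine)
user = User(preference=Preference())
session.add(user)
session.commit()

# de-associating the Preference makes it an orphan; it is deleted on flush
user.preference = None
session.commit()
remaining = session.query(Preference).count()
```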
offsets, either explicitly or via array slices::
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
posts = relationship(Post, lazy="dynamic")
+
jack = session.get(User, id)
# filter Jack's blog posts
- posts = jack.posts.filter(Post.headline=='this is a post')
+ posts = jack.posts.filter(Post.headline == "this is a post")
# apply array slices
posts = jack.posts[5:20]
The dynamic relationship supports limited write operations, via the
:meth:`_orm.AppenderQuery.append` and :meth:`_orm.AppenderQuery.remove` methods::
- oldpost = jack.posts.filter(Post.headline=='old post').one()
+ oldpost = jack.posts.filter(Post.headline == "old post").one()
jack.posts.remove(oldpost)
- jack.posts.append(Post('new post'))
+ jack.posts.append(Post("new post"))
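The full round trip can be sketched against an in-memory SQLite database; the ``User`` and ``Post`` mappings below are condensed stand-ins for the ones used throughout this section:

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base, relationship

Base = declarative_base()


class User(Base):
    __tablename__ = "user"
    id = Column(Integer, primary_key=True)
    posts = relationship("Post", lazy="dynamic")


class Post(Base):
    __tablename__ = "post"
    id = Column(Integer, primary_key=True)
    user_id = Column(Integer, ForeignKey("user.id"))
    headline = Column(String)


engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

session = Session(engine)
jack = User()
jack.posts.append(Post(headline="old post"))
jack.posts.append(Post(headline="new post"))
session.add(jack)
session.commit()

# the dynamic collection is queried on access; filter() runs SQL
oldpost = jack.posts.filter(Post.headline == "old post").one()
jack.posts.remove(oldpost)
session.commit()
headlines = [p.headline for p in jack.posts]
```

``remove()`` here de-associates the row rather than deleting it, so only the collection membership changes.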
Since the read side of the dynamic relationship always queries the
database, changes to the underlying collection will not be visible
class Post(Base):
__table__ = posts_table
- user = relationship(User,
- backref=backref('posts', lazy='dynamic')
- )
+ user = relationship(User, backref=backref("posts", lazy="dynamic"))
Note that eager/lazy loading options cannot be used in conjunction with dynamic relationships at this time.
accessed. It is configured using ``lazy='noload'``::
class MyClass(Base):
- __tablename__ = 'some_table'
+ __tablename__ = "some_table"
- children = relationship(MyOtherClass, lazy='noload')
+ children = relationship(MyOtherClass, lazy="noload")
Above, the ``children`` collection is fully writeable, and changes to it will
be persisted to the database as well as locally available for reading at the
emit a lazy load::
class MyClass(Base):
- __tablename__ = 'some_table'
+ __tablename__ = "some_table"
- children = relationship(MyOtherClass, lazy='raise')
+ children = relationship(MyOtherClass, lazy="raise")
Above, attribute access on the ``children`` collection will raise an exception
if it was not previously eagerloaded. This includes read access but for
collections will also affect write access, as collections cannot be mutated
without first loading them.
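A sketch of the raising behavior against an in-memory SQLite database; the ``MyOtherClass`` mapping and its ``some_id`` foreign key are assumptions added for illustration:

```python
from sqlalchemy import Column, ForeignKey, Integer, create_engine
from sqlalchemy.exc import InvalidRequestError
from sqlalchemy.orm import Session, declarative_base, relationship

Base = declarative_base()


class MyClass(Base):
    __tablename__ = "some_table"
    id = Column(Integer, primary_key=True)
    children = relationship("MyOtherClass", lazy="raise")


class MyOtherClass(Base):
    __tablename__ = "other_table"
    id = Column(Integer, primary_key=True)
    some_id = Column(Integer, ForeignKey("some_table.id"))


engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

session = Session(engine)
session.add(MyClass())
session.commit()

obj = session.query(MyClass).first()
try:
    # the collection was not eagerly loaded, so access raises
    obj.children
    raised = False
except InvalidRequestError:
    raised = True
```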
this collection is a ``list``::
class Parent(Base):
- __tablename__ = 'parent'
+ __tablename__ = "parent"
parent_id = Column(Integer, primary_key=True)
children = relationship(Child)
+
parent = Parent()
parent.children.append(Child())
print(parent.children[0])
:func:`~sqlalchemy.orm.relationship`::
class Parent(Base):
- __tablename__ = 'parent'
+ __tablename__ = "parent"
parent_id = Column(Integer, primary_key=True)
# use a set
children = relationship(Child, collection_class=set)
+
parent = Parent()
child = Child()
parent.children.add(child)
of the mapped class as a key. Below we map an ``Item`` class containing
a dictionary of ``Note`` items keyed to the ``Note.keyword`` attribute::
- from sqlalchemy import Column, Integer, String, ForeignKey
- from sqlalchemy.orm import relationship
+ from sqlalchemy import Column, ForeignKey, Integer, String
+ from sqlalchemy.orm import declarative_base, relationship
from sqlalchemy.orm.collections import attribute_mapped_collection
- from sqlalchemy.ext.declarative import declarative_base
Base = declarative_base()
+
class Item(Base):
- __tablename__ = 'item'
+ __tablename__ = "item"
id = Column(Integer, primary_key=True)
- notes = relationship("Note",
- collection_class=attribute_mapped_collection('keyword'),
- cascade="all, delete-orphan")
+ notes = relationship(
+ "Note",
+ collection_class=attribute_mapped_collection("keyword"),
+ cascade="all, delete-orphan",
+ )
+
class Note(Base):
- __tablename__ = 'note'
+ __tablename__ = "note"
id = Column(Integer, primary_key=True)
- item_id = Column(Integer, ForeignKey('item.id'), nullable=False)
+ item_id = Column(Integer, ForeignKey("item.id"), nullable=False)
keyword = Column(String)
text = Column(String)
``Item.notes`` is then a dictionary::
>>> item = Item()
- >>> item.notes['a'] = Note('a', 'atext')
+ >>> item.notes["a"] = Note("a", "atext")
-    >>> item.notes.items()
-    {'a': <__main__.Note object at 0x2eaaf0>}
+    >>> item.notes.items()
+    dict_items([('a', <__main__.Note object at 0x2eaaf0>)])
item = Item()
item.notes = {
- 'a': Note('a', 'atext'),
- 'b': Note('b', 'btext')
- }
+ "a": Note("a", "atext"),
+ "b": Note("b", "btext"),
+ }
The attribute which :func:`.attribute_mapped_collection` uses as a key
does not need to be mapped at all! Using a regular Python ``@property`` allows virtually
any detail or combination of details about the object to be used as the key, as below
when we establish it as a tuple of ``Note.keyword`` and the first ten letters
of the ``Note.text`` field::
class Item(Base):
- __tablename__ = 'item'
+ __tablename__ = "item"
id = Column(Integer, primary_key=True)
- notes = relationship("Note",
- collection_class=attribute_mapped_collection('note_key'),
- backref="item",
- cascade="all, delete-orphan")
+ notes = relationship(
+ "Note",
+ collection_class=attribute_mapped_collection("note_key"),
+ backref="item",
+ cascade="all, delete-orphan",
+ )
+
class Note(Base):
- __tablename__ = 'note'
+ __tablename__ = "note"
id = Column(Integer, primary_key=True)
- item_id = Column(Integer, ForeignKey('item.id'), nullable=False)
+ item_id = Column(Integer, ForeignKey("item.id"), nullable=False)
keyword = Column(String)
text = Column(String)
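The ``note_key`` attribute referenced by ``attribute_mapped_collection("note_key")`` above is a plain ``@property``; a runnable sketch, assuming the key is a tuple of ``Note.keyword`` and the first ten letters of ``Note.text``:

```python
from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.orm import declarative_base, relationship
from sqlalchemy.orm.collections import attribute_mapped_collection

Base = declarative_base()


class Item(Base):
    __tablename__ = "item"
    id = Column(Integer, primary_key=True)
    notes = relationship(
        "Note",
        collection_class=attribute_mapped_collection("note_key"),
        backref="item",
        cascade="all, delete-orphan",
    )


class Note(Base):
    __tablename__ = "note"
    id = Column(Integer, primary_key=True)
    item_id = Column(Integer, ForeignKey("item.id"), nullable=False)
    keyword = Column(String)
    text = Column(String)

    @property
    def note_key(self):
        # unmapped, computed key: keyword plus first ten letters of text
        return (self.keyword, self.text[0:10])


item = Item()
note = Note(keyword="a", text="atext")
item.notes[note.note_key] = note
```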
from sqlalchemy.orm.collections import column_mapped_collection
+
class Item(Base):
- __tablename__ = 'item'
+ __tablename__ = "item"
id = Column(Integer, primary_key=True)
- notes = relationship("Note",
- collection_class=column_mapped_collection(Note.__table__.c.keyword),
- cascade="all, delete-orphan")
+ notes = relationship(
+ "Note",
+ collection_class=column_mapped_collection(Note.__table__.c.keyword),
+ cascade="all, delete-orphan",
+ )
as well as :func:`.mapped_collection` which is passed any callable function.
Note that it's usually easier to use :func:`.attribute_mapped_collection` along
from sqlalchemy.orm.collections import mapped_collection
+
class Item(Base):
- __tablename__ = 'item'
+ __tablename__ = "item"
id = Column(Integer, primary_key=True)
- notes = relationship("Note",
- collection_class=mapped_collection(lambda note: note.text[0:10]),
- cascade="all, delete-orphan")
+ notes = relationship(
+ "Note",
+ collection_class=mapped_collection(lambda note: note.text[0:10]),
+ cascade="all, delete-orphan",
+ )
Dictionary mappings are often combined with the "Association Proxy" extension to produce
streamlined dictionary views. See :ref:`proxying_dictionaries` and :ref:`composite_association_proxy`
Setting ``b1.data`` after the fact does not update the collection::
- >>> b1.data = 'the key'
+ >>> b1.data = "the key"
>>> a1.bs
{None: <test3.B object at 0x7f7b1023ef70>}
This can also be seen if one attempts to set up ``B()`` in the constructor.
The order of arguments changes the result::
- >>> B(a=a1, data='the key')
+ >>> B(a=a1, data="the key")
<test3.B object at 0x7f7b10114280>
>>> a1.bs
{None: <test3.B object at 0x7f7b10114280>}
vs::
- >>> B(data='the key', a=a1)
+ >>> B(data="the key", a=a1)
<test3.B object at 0x7f7b10114340>
>>> a1.bs
{'the key': <test3.B object at 0x7f7b10114340>}
collection as well::
from sqlalchemy import event
-
from sqlalchemy.orm import attributes
+
@event.listens_for(B.data, "set")
def set_item(obj, value, previous, initiator):
if obj.a is not None:
    previous = None if previous == attributes.NO_VALUE else previous
    obj.a.bs[value] = obj
    obj.a.bs.pop(previous)
-
-
.. autofunction:: attribute_mapped_collection
.. autofunction:: column_mapped_collection
repeatedly, or inappropriately, leading to internal state corruption in
rare cases::
- from sqlalchemy.orm.collections import MappedCollection,\
- collection
+ from sqlalchemy.orm.collections import MappedCollection, collection
+
class MyMappedCollection(MappedCollection):
"""Use @internally_instrumented when your methods
of :class:`.MappedCollection` which uses :meth:`.collection.internally_instrumented`
can be used::
- from sqlalchemy.orm.collections import _instrument_class, MappedCollection
+ from sqlalchemy.orm.collections import MappedCollection, _instrument_class
+
_instrument_class(MappedCollection)
This will ensure that the :class:`.MappedCollection` has been properly
return self.x, self.y
def __repr__(self):
- return "Point(x=%r, y=%r)" % (self.x, self.y)
+ return f"Point(x={self.x!r}, y={self.y!r})"
def __eq__(self, other):
- return isinstance(other, Point) and \
- other.x == self.x and \
- other.y == self.y
+ return (
+ isinstance(other, Point)
+ and other.x == self.x
+ and other.y == self.y
+ )
def __ne__(self, other):
return not self.__eq__(other)
attributes that will represent sets of columns via the ``Point`` class::
from sqlalchemy import Column, Integer
- from sqlalchemy.orm import composite
- from sqlalchemy.orm import declarative_base
+ from sqlalchemy.orm import composite, declarative_base
Base = declarative_base()
+
class Vertex(Base):
- __tablename__ = 'vertices'
+ __tablename__ = "vertices"
id = Column(Integer, primary_key=True)
x1 = Column(Integer)
A classical mapping above would define each :func:`.composite`
against the existing table::
- mapper_registry.map_imperatively(Vertex, vertices_table, properties={
- 'start':composite(Point, vertices_table.c.x1, vertices_table.c.y1),
- 'end':composite(Point, vertices_table.c.x2, vertices_table.c.y2),
- })
+ mapper_registry.map_imperatively(
+ Vertex,
+ vertices_table,
+ properties={
+ "start": composite(Point, vertices_table.c.x1, vertices_table.c.y1),
+ "end": composite(Point, vertices_table.c.x2, vertices_table.c.y2),
+ },
+ )
We can now persist and use ``Vertex`` instances, as well as query for them,
using the ``.start`` and ``.end`` attributes against ad-hoc ``Point`` instances:
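Comparing against ad-hoc ``Point`` instances works because ``Point`` implements value-based equality and the ``__composite_values__`` protocol; a standalone sketch (plain Python, no ORM involved) shows both behaviors:

```python
# Standalone sketch of the value-based equality and the
# __composite_values__ protocol defined on the Point class above;
# two distinct instances compare equal when their column values match.


class Point:
    def __init__(self, x, y):
        self.x = x
        self.y = y

    def __composite_values__(self):
        # the protocol the ORM uses to flatten the object into columns
        return self.x, self.y

    def __eq__(self, other):
        return (
            isinstance(other, Point)
            and other.x == self.x
            and other.y == self.y
        )

    def __ne__(self, other):
        return not self.__eq__(other)


print(Point(3, 4) == Point(3, 4), Point(3, 4) == Point(5, 6))  # True False
```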
Below we illustrate the "greater than" operator, implementing
the same expression that the base "greater than" does::
- from sqlalchemy.orm.properties import CompositeProperty
from sqlalchemy import sql
+ from sqlalchemy.orm.properties import CompositeProperty
+
class PointComparator(CompositeProperty.Comparator):
def __gt__(self, other):
"""redefine the 'greater than' operation"""
- return sql.and_(*[a>b for a, b in
- zip(self.__clause_element__().clauses,
- other.__composite_values__())])
+ return sql.and_(
+ *[
+ a > b
+ for a, b in zip(
+ self.__clause_element__().clauses,
+ other.__composite_values__(),
+ )
+ ]
+ )
+
class Vertex(Base):
- ___tablename__ = 'vertices'
+ __tablename__ = "vertices"
id = Column(Integer, primary_key=True)
x1 = Column(Integer)
x2 = Column(Integer)
y2 = Column(Integer)
- start = composite(Point, x1, y1,
- comparator_factory=PointComparator)
- end = composite(Point, x2, y2,
- comparator_factory=PointComparator)
+ start = composite(Point, x1, y1, comparator_factory=PointComparator)
+ end = composite(Point, x2, y2, comparator_factory=PointComparator)
Nesting Composites
-------------------
from sqlalchemy.orm import composite
+
class Point:
def __init__(self, x, y):
self.x = x
return self.x, self.y
def __repr__(self):
- return "Point(x=%r, y=%r)" % (self.x, self.y)
+ return f"Point(x={self.x!r}, y={self.y!r})"
def __eq__(self, other):
- return isinstance(other, Point) and \
- other.x == self.x and \
- other.y == self.y
+ return (
+ isinstance(other, Point)
+ and other.x == self.x
+ and other.y == self.y
+ )
def __ne__(self, other):
return not self.__eq__(other)
+
class Vertex:
def __init__(self, start, end):
self.start = start
@classmethod
def _generate(self, x1, y1, x2, y2):
"""generate a Vertex from a row"""
- return Vertex(
- Point(x1, y1),
- Point(x2, y2)
- )
+ return Vertex(Point(x1, y1), Point(x2, y2))
def __composite_values__(self):
- return \
- self.start.__composite_values__() + \
- self.end.__composite_values__()
+ return (
+ self.start.__composite_values__()
+ + self.end.__composite_values__()
+ )
+
class HasVertex(Base):
- __tablename__ = 'has_vertex'
+ __tablename__ = "has_vertex"
id = Column(Integer, primary_key=True)
x1 = Column(Integer)
y1 = Column(Integer)
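The nested ``Vertex.__composite_values__`` concatenates the two ``Point`` tuples so the ORM receives exactly one value per mapped column. A plain-Python sketch (no SQLAlchemy) of that flattening:

```python
# Plain-Python sketch of the nested composite flattening: the outer
# Vertex concatenates the tuples produced by its two Points, yielding
# one flat value per mapped column (x1, y1, x2, y2).


class Point:
    def __init__(self, x, y):
        self.x = x
        self.y = y

    def __composite_values__(self):
        return self.x, self.y


class Vertex:
    def __init__(self, start, end):
        self.start = start
        self.end = end

    def __composite_values__(self):
        return (
            self.start.__composite_values__()
            + self.end.__composite_values__()
        )


v = Vertex(Point(1, 2), Point(3, 4))
print(v.__composite_values__())  # (1, 2, 3, 4)
```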
from sqlalchemy import orm
+
class MyMappedClass:
def __init__(self, data):
self.data = data
# mapping attributes using declarative with declarative table
# i.e. __tablename__
- from sqlalchemy import Column, Integer, String, Text, ForeignKey
- from sqlalchemy.orm import column_property, relationship, deferred
- from sqlalchemy.orm import declarative_base
+ from sqlalchemy import Column, ForeignKey, Integer, String, Text
+ from sqlalchemy.orm import (
+ column_property,
+ declarative_base,
+ deferred,
+ relationship,
+ )
Base = declarative_base()
+
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
id = Column(Integer, primary_key=True)
name = Column(String)
addresses = relationship("Address", back_populates="user")
+
class Address(Base):
- __tablename__ = 'address'
+ __tablename__ = "address"
id = Column(Integer, primary_key=True)
user_id = Column(ForeignKey("user.id"))
# mapping attributes using declarative with imperative table
# i.e. __table__
- from sqlalchemy import Table
- from sqlalchemy import Column, Integer, String, Text, ForeignKey
- from sqlalchemy.orm import column_property, relationship, deferred
- from sqlalchemy.orm import declarative_base
+ from sqlalchemy import Column, ForeignKey, Integer, String, Table, Text
+ from sqlalchemy.orm import (
+ column_property,
+ declarative_base,
+ deferred,
+ relationship,
+ )
Base = declarative_base()
+
class User(Base):
__table__ = Table(
"user",
Column("id", Integer, primary_key=True),
Column("name", String),
Column("firstname", String(50)),
- Column("lastname", String(50))
+ Column("lastname", String(50)),
)
- fullname = column_property(__table__.c.firstname + " " + __table__.c.lastname)
+ fullname = column_property(
+ __table__.c.firstname + " " + __table__.c.lastname
+ )
addresses = relationship("Address", back_populates="user")
+
class Address(Base):
__table__ = Table(
"address",
Column("id", Integer, primary_key=True),
Column("user_id", ForeignKey("user.id")),
Column("email_address", String),
- Column("address_statistics", Text)
+ Column("address_statistics", Text),
)
address_statistics = deferred(__table__.c.address_statistics)
from datetime import datetime
+
class Widget(Base):
- __tablename__ = 'widgets'
+ __tablename__ = "widgets"
id = Column(Integer, primary_key=True)
timestamp = Column(DateTime, nullable=False)
__mapper_args__ = {
- 'version_id_col': timestamp,
- 'version_id_generator': lambda v:datetime.now()
+ "version_id_col": timestamp,
+ "version_id_generator": lambda v: datetime.now(),
}
**Single Table Inheritance**
:paramref:`_orm.Mapper.polymorphic_identity` parameters::
class Person(Base):
- __tablename__ = 'person'
+ __tablename__ = "person"
person_id = Column(Integer, primary_key=True)
type = Column(String, nullable=False)
__mapper_args__ = dict(
polymorphic_on=type,
- polymorphic_identity="person"
+ polymorphic_identity="person",
)
+
class Employee(Person):
__mapper_args__ = dict(
- polymorphic_identity="employee"
+ polymorphic_identity="employee",
)
The ``__mapper_args__`` dictionary may be generated from a class-bound
reg = registry()
+
class BaseOne:
metadata = MetaData()
+
class BaseTwo:
metadata = MetaData()
+
@reg.mapped
class ClassOne:
- __tablename__ = 't1' # will use reg.metadata
+ __tablename__ = "t1" # will use reg.metadata
id = Column(Integer, primary_key=True)
+
@reg.mapped
class ClassTwo(BaseOne):
- __tablename__ = 't1' # will use BaseOne.metadata
+ __tablename__ = "t1" # will use BaseOne.metadata
id = Column(Integer, primary_key=True)
+
@reg.mapped
class ClassThree(BaseTwo):
- __tablename__ = 't1' # will use BaseTwo.metadata
+ __tablename__ = "t1" # will use BaseTwo.metadata
id = Column(Integer, primary_key=True)
-
.. versionchanged:: 1.4.3 The :meth:`_orm.registry.mapped` decorator will
honor an attribute named ``.metadata`` on the class as an alternate
:class:`_schema.MetaData` collection to be used in place of the
__abstract__ = True
def some_helpful_method(self):
- ""
+ """"""
@declared_attr
def __mapper_args__(cls):
- return {"helpful mapper arguments":True}
+ return {"helpful mapper arguments": True}
+
class MyMappedClass(SomeAbstractBase):
- ""
+ pass
One possible use of ``__abstract__`` is to use a distinct
:class:`_schema.MetaData` for different bases::
Base = declarative_base()
+
class DefaultBase(Base):
__abstract__ = True
metadata = MetaData()
+
class OtherBase(Base):
__abstract__ = True
metadata = MetaData()
DefaultBase.metadata.create_all(some_engine)
OtherBase.metadata.create_all(some_other_engine)
-
``__table_cls__``
~~~~~~~~~~~~~~~~~
class MyMixin:
@classmethod
def __table_cls__(cls, name, metadata_obj, *arg, **kw):
- return Table(
- "my_" + name,
- metadata_obj, *arg, **kw
- )
+ return Table(f"my_{name}", metadata_obj, *arg, **kw)
The above mixin would cause all :class:`_schema.Table` objects generated to include
the prefix ``"my_"``, followed by the name normally specified using the
@classmethod
def __table_cls__(cls, *arg, **kw):
for obj in arg[1:]:
- if (isinstance(obj, Column) and obj.primary_key) or \
- isinstance(obj, PrimaryKeyConstraint):
+ if (isinstance(obj, Column) and obj.primary_key) or isinstance(
+ obj, PrimaryKeyConstraint
+ ):
return Table(*arg, **kw)
return None
+
class Person(AutoTable, Base):
id = Column(Integer, primary_key=True)
+
class Employee(Person):
employee_name = Column(String)
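The decision logic inside ``__table_cls__`` above — build a table only when a primary key marker is present, otherwise return ``None`` so the class falls back to single-table inheritance — can be sketched in plain Python (the ``Column``/``PrimaryKeyConstraint`` classes below are dummy stand-ins, not SQLAlchemy types):

```python
# Plain-Python sketch of the __table_cls__ decision above: scan the
# positional arguments for a primary key marker; no marker means no
# table, i.e. single-table inheritance for that class.


class Column:
    def __init__(self, primary_key=False):
        self.primary_key = primary_key


class PrimaryKeyConstraint:
    pass


def should_build_table(*args):
    for obj in args:
        if (isinstance(obj, Column) and obj.primary_key) or isinstance(
            obj, PrimaryKeyConstraint
        ):
            return True
    return False


print(should_build_table(Column(primary_key=True)))  # True -> new table
print(should_build_table(Column()))  # False -> single-table inheritance
```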
An example of some commonly mixed-in idioms is below::
- from sqlalchemy.orm import declarative_mixin
- from sqlalchemy.orm import declared_attr
+ from sqlalchemy.orm import declarative_mixin, declared_attr
+
@declarative_mixin
class MyMixin:
-
@declared_attr
def __tablename__(cls):
return cls.__name__.lower()
- __table_args__ = {'mysql_engine': 'InnoDB'}
- __mapper_args__= {'always_refresh': True}
+ __table_args__ = {"mysql_engine": "InnoDB"}
+ __mapper_args__ = {"always_refresh": True}
+
- id = Column(Integer, primary_key=True)
+ id = Column(Integer, primary_key=True)
class MyModel(MyMixin, Base):
name = Column(String(1000))
should apply to all classes derived from a particular base. This is achieved
using the ``cls`` argument of the :func:`_orm.declarative_base` function::
- from sqlalchemy.orm import declared_attr
+ from sqlalchemy.orm import declarative_base, declared_attr
+
class Base:
@declared_attr
def __tablename__(cls):
return cls.__name__.lower()
- __table_args__ = {'mysql_engine': 'InnoDB'}
+ __table_args__ = {"mysql_engine": "InnoDB"}
- id = Column(Integer, primary_key=True)
+ id = Column(Integer, primary_key=True)
- from sqlalchemy.orm import declarative_base
Base = declarative_base(cls=Base)
+
class MyModel(Base):
name = Column(String(1000))
class TimestampMixin:
created_at = Column(DateTime, default=func.now())
+
class MyModel(TimestampMixin, Base):
- __tablename__ = 'test'
+ __tablename__ = "test"
- id = Column(Integer, primary_key=True)
+ id = Column(Integer, primary_key=True)
name = Column(String(1000))
Where above, all declarative classes that include ``TimestampMixin``
from sqlalchemy.orm import declared_attr
+
@declarative_mixin
class HasRelatedDataMixin:
@declared_attr
def related_data(cls):
- return deferred(Column(Text())
+ return deferred(Column(Text()))
+
class User(HasRelatedDataMixin, Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
id = Column(Integer, primary_key=True)
Where above, the ``related_data`` class-level callable is executed at the
def parent_id(cls):
return Column(Integer, ForeignKey(cls.id))
+
class A(SelfReferentialMixin, Base):
- __tablename__ = 'a'
+ __tablename__ = "a"
class B(SelfReferentialMixin, Base):
- __tablename__ = 'b'
+ __tablename__ = "b"
Above, both classes ``A`` and ``B`` will contain columns ``id`` and
``parent_id``, where ``parent_id`` refers to the ``id`` column local to the
@declarative_mixin
class RefTargetMixin:
- target_id = Column('target_id', ForeignKey('target.id'))
+ target_id = Column("target_id", ForeignKey("target.id"))
@declared_attr
def target(cls):
return relationship("Target")
+
class Foo(RefTargetMixin, Base):
- __tablename__ = 'foo'
+ __tablename__ = "foo"
id = Column(Integer, primary_key=True)
+
class Bar(RefTargetMixin, Base):
- __tablename__ = 'bar'
+ __tablename__ = "bar"
id = Column(Integer, primary_key=True)
+
class Target(Base):
- __tablename__ = 'target'
+ __tablename__ = "target"
id = Column(Integer, primary_key=True)
The canonical example is the primaryjoin condition that depends upon
another mixed-in column::
- @declarative_mixin
- class RefTargetMixin:
+ @declarative_mixin
+ class RefTargetMixin:
@declared_attr
def target_id(cls):
- return Column('target_id', ForeignKey('target.id'))
+ return Column("target_id", ForeignKey("target.id"))
@declared_attr
def target(cls):
- return relationship(Target,
- primaryjoin=Target.id==cls.target_id # this is *incorrect*
+ return relationship(
+ Target,
+ primaryjoin=Target.id == cls.target_id, # this is *incorrect*
)
Mapping a class using the above mixin, we will get an error like::
class RefTargetMixin:
@declared_attr
def target_id(cls):
- return Column('target_id', ForeignKey('target.id'))
+ return Column("target_id", ForeignKey("target.id"))
@declared_attr
def target(cls):
- return relationship("Target",
- primaryjoin="Target.id==%s.target_id" % cls.__name__
+ return relationship(
+ "Target", primaryjoin=f"Target.id=={cls.__name__}.target_id"
)
.. seealso::
@declarative_mixin
class SomethingMixin:
-
@declared_attr
def dprop(cls):
return deferred(Column(Integer))
+
class Something(SomethingMixin, Base):
__tablename__ = "something"
@declarative_mixin
class SomethingMixin:
x = Column(Integer)
-
y = Column(Integer)
@declared_attr
def x_plus_y(cls):
return column_property(cls.x + cls.y)
-
.. versionchanged:: 1.0.0 mixin columns are copied to the final mapped class
so that :class:`_orm.declared_attr` methods can access the actual column
that will be mapped.
:func:`.association_proxy` mixin example which provides a scalar list of
string values to an implementing class::
- from sqlalchemy import Column, Integer, ForeignKey, String
+ from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.ext.associationproxy import association_proxy
- from sqlalchemy.orm import declarative_base
- from sqlalchemy.orm import declarative_mixin
- from sqlalchemy.orm import declared_attr
- from sqlalchemy.orm import relationship
+ from sqlalchemy.orm import (
+ declarative_base,
+ declarative_mixin,
+ declared_attr,
+ relationship,
+ )
Base = declarative_base()
+
@declarative_mixin
class HasStringCollection:
@declared_attr
__tablename__ = cls.string_table_name
id = Column(Integer, primary_key=True)
value = Column(String(50), nullable=False)
- parent_id = Column(Integer,
- ForeignKey('%s.id' % cls.__tablename__),
- nullable=False)
+ parent_id = Column(
+ Integer,
+ ForeignKey(f"{cls.__tablename__}.id"),
+ nullable=False,
+ )
+
def __init__(self, value):
self.value = value
@declared_attr
def strings(cls):
- return association_proxy('_strings', 'value')
+ return association_proxy("_strings", "value")
+
class TypeA(HasStringCollection, Base):
- __tablename__ = 'type_a'
- string_table_name = 'type_a_strings'
+ __tablename__ = "type_a"
+ string_table_name = "type_a_strings"
id = Column(Integer(), primary_key=True)
+
class TypeB(HasStringCollection, Base):
- __tablename__ = 'type_b'
- string_table_name = 'type_b_strings'
+ __tablename__ = "type_b"
+ string_table_name = "type_b_strings"
id = Column(Integer(), primary_key=True)
Above, the ``HasStringCollection`` mixin produces a :func:`_orm.relationship`
``TypeA`` or ``TypeB`` can be instantiated given the constructor
argument ``strings``, a list of strings::
- ta = TypeA(strings=['foo', 'bar'])
- tb = TypeB(strings=['bat', 'bar'])
+ ta = TypeA(strings=["foo", "bar"])
+ tb = TypeB(strings=["bat", "bar"])
This list will generate a collection
of ``StringAttribute`` objects, which are persisted into a table that's
For example, to create a mixin that gives every class a simple table
name based on class name::
- from sqlalchemy.orm import declarative_mixin
- from sqlalchemy.orm import declared_attr
+ from sqlalchemy.orm import declarative_mixin, declared_attr
+
@declarative_mixin
class Tablename:
def __tablename__(cls):
return cls.__name__.lower()
+
class Person(Tablename, Base):
id = Column(Integer, primary_key=True)
- discriminator = Column('type', String(50))
- __mapper_args__ = {'polymorphic_on': discriminator}
+ discriminator = Column("type", String(50))
+ __mapper_args__ = {"polymorphic_on": discriminator}
+
class Engineer(Person):
__tablename__ = None
- __mapper_args__ = {'polymorphic_identity': 'engineer'}
+ __mapper_args__ = {"polymorphic_identity": "engineer"}
primary_language = Column(String(50))
Alternatively, we can modify our ``__tablename__`` function to return
the effect of those subclasses being mapped with single table inheritance
against the parent::
- from sqlalchemy.orm import declarative_mixin
- from sqlalchemy.orm import declared_attr
- from sqlalchemy.orm import has_inherited_table
+ from sqlalchemy.orm import (
+ declarative_mixin,
+ declared_attr,
+ has_inherited_table,
+ )
+
@declarative_mixin
class Tablename:
return None
return cls.__name__.lower()
+
class Person(Tablename, Base):
id = Column(Integer, primary_key=True)
- discriminator = Column('type', String(50))
- __mapper_args__ = {'polymorphic_on': discriminator}
+ discriminator = Column("type", String(50))
+ __mapper_args__ = {"polymorphic_on": discriminator}
+
class Engineer(Person):
primary_language = Column(String(50))
- __mapper_args__ = {'polymorphic_identity': 'engineer'}
+ __mapper_args__ = {"polymorphic_identity": "engineer"}
.. _mixin_inheritance_columns:
class HasId:
@declared_attr
def id(cls):
- return Column('id', Integer, primary_key=True)
+ return Column("id", Integer, primary_key=True)
+
class Person(HasId, Base):
- __tablename__ = 'person'
- discriminator = Column('type', String(50))
- __mapper_args__ = {'polymorphic_on': discriminator}
+ __tablename__ = "person"
+ discriminator = Column("type", String(50))
+ __mapper_args__ = {"polymorphic_on": discriminator}
+
class Engineer(Person):
- __tablename__ = 'engineer'
+ __tablename__ = "engineer"
primary_language = Column(String(50))
- __mapper_args__ = {'polymorphic_identity': 'engineer'}
+ __mapper_args__ = {"polymorphic_identity": "engineer"}
It is usually the case in joined-table inheritance that we want distinctly
named columns on each subclass. However in this case, we may want to have
@declared_attr.cascading
def id(cls):
if has_inherited_table(cls):
- return Column(ForeignKey('person.id'), primary_key=True)
+ return Column(ForeignKey("person.id"), primary_key=True)
else:
return Column(Integer, primary_key=True)
+
class Person(HasIdMixin, Base):
- __tablename__ = 'person'
- discriminator = Column('type', String(50))
- __mapper_args__ = {'polymorphic_on': discriminator}
+ __tablename__ = "person"
+ discriminator = Column("type", String(50))
+ __mapper_args__ = {"polymorphic_on": discriminator}
+
class Engineer(Person):
- __tablename__ = 'engineer'
+ __tablename__ = "engineer"
primary_language = Column(String(50))
- __mapper_args__ = {'polymorphic_identity': 'engineer'}
+ __mapper_args__ = {"polymorphic_identity": "engineer"}
.. warning::
here to create user-defined collation routines that pull
from multiple collections::
- from sqlalchemy.orm import declarative_mixin
- from sqlalchemy.orm import declared_attr
+ from sqlalchemy.orm import declarative_mixin, declared_attr
+
@declarative_mixin
class MySQLSettings:
- __table_args__ = {'mysql_engine':'InnoDB'}
+ __table_args__ = {"mysql_engine": "InnoDB"}
+
@declarative_mixin
class MyOtherMixin:
- __table_args__ = {'info':'foo'}
+ __table_args__ = {"info": "foo"}
+
class MyModel(MySQLSettings, MyOtherMixin, Base):
- __tablename__='my_model'
+ __tablename__ = "my_model"
@declared_attr
def __table_args__(cls):
args.update(MyOtherMixin.__table_args__)
return args
- id = Column(Integer, primary_key=True)
+ id = Column(Integer, primary_key=True)
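The ``__table_args__`` callable above layers one mixin's dictionary over the other via ``dict.update()``; in plain Python the merge behaves like this (the two dictionaries mirror the mixin values from the example):

```python
# Plain-Python sketch of the dictionary merge performed by the
# __table_args__ declared_attr above: start from one mixin's dict,
# then layer the other mixin's entries on top with update().

mysql_settings = {"mysql_engine": "InnoDB"}  # MySQLSettings.__table_args__
other_mixin = {"info": "foo"}  # MyOtherMixin.__table_args__

args = dict(mysql_settings)
args.update(other_mixin)
print(args)  # {'mysql_engine': 'InnoDB', 'info': 'foo'}
```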
Creating Indexes with Mixins
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@declarative_mixin
class MyMixin:
- a = Column(Integer)
- b = Column(Integer)
+ a = Column(Integer)
+ b = Column(Integer)
@declared_attr
def __table_args__(cls):
- return (Index('test_idx_%s' % cls.__tablename__, 'a', 'b'),)
+ return (
+ Index(f"test_idx_{cls.__tablename__}", "a", "b"),
+ )
+
class MyModel(MyMixin, Base):
- __tablename__ = 'atable'
- c = Column(Integer,primary_key=True)
+ __tablename__ = "atable"
+ c = Column(Integer, primary_key=True)
+
With the declarative base class, new mapped classes are declared as subclasses
of the base::
- from sqlalchemy import Column, Integer, String, ForeignKey
+ from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.orm import declarative_base
# declarative base class
Base = declarative_base()
+
# an example mapping using the base
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
id = Column(Integer, primary_key=True)
name = Column(String)
that can be applied to any Python class with no hierarchy in place. The
Python class otherwise is configured in declarative style normally::
- from sqlalchemy import Column, Integer, String, Text, ForeignKey
-
- from sqlalchemy.orm import registry
- from sqlalchemy.orm import relationship
+ from sqlalchemy import Column, ForeignKey, Integer, String, Text
+ from sqlalchemy.orm import registry, relationship
mapper_registry = registry()
+
@mapper_registry.mapped
class User:
- __tablename__ = 'user'
+ __tablename__ = "user"
id = Column(Integer, primary_key=True)
name = Column(String)
addresses = relationship("Address", back_populates="user")
+
@mapper_registry.mapped
class Address:
- __tablename__ = 'address'
+ __tablename__ = "address"
id = Column(Integer, primary_key=True)
user_id = Column(ForeignKey("user.id"))
mappings, the decorator should be applied to each subclass::
from sqlalchemy.orm import registry
+
mapper_registry = registry()
+
@mapper_registry.mapped
class Person:
__tablename__ = "person"
type = Column(String, nullable=False)
__mapper_args__ = {
-
"polymorphic_on": type,
- "polymorphic_identity": "person"
+ "polymorphic_identity": "person",
}
person_id = Column(ForeignKey("person.person_id"), primary_key=True)
__mapper_args__ = {
- "polymorphic_identity": "employee"
+ "polymorphic_identity": "employee",
}
Both the "declarative table" and "imperative table" styles of declarative
from __future__ import annotations
- from dataclasses import dataclass
- from dataclasses import field
- from typing import List
- from typing import Optional
+ from dataclasses import dataclass, field
+ from typing import List, Optional
- from sqlalchemy import Column
- from sqlalchemy import ForeignKey
- from sqlalchemy import Integer
- from sqlalchemy import String
- from sqlalchemy import Table
- from sqlalchemy.orm import registry
- from sqlalchemy.orm import relationship
+ from sqlalchemy import Column, ForeignKey, Integer, String, Table
+ from sqlalchemy.orm import registry, relationship
mapper_registry = registry()
nickname: Optional[str] = None
addresses: List[Address] = field(default_factory=list)
- __mapper_args__ = { # type: ignore
- "properties" : {
- "addresses": relationship("Address")
+ __mapper_args__ = { # type: ignore
+ "properties": {
+ "addresses": relationship("Address"),
}
}
+
@mapper_registry.mapped
@dataclass
class Address:
from __future__ import annotations
- from dataclasses import dataclass
- from dataclasses import field
+ from dataclasses import dataclass, field
from typing import List
- from sqlalchemy import Column
- from sqlalchemy import ForeignKey
- from sqlalchemy import Integer
- from sqlalchemy import String
- from sqlalchemy.orm import registry
- from sqlalchemy.orm import relationship
+ from sqlalchemy import Column, ForeignKey, Integer, String
+ from sqlalchemy.orm import registry, relationship
mapper_registry = registry()
class RefTargetMixin:
@declared_attr
def target_id(cls):
- return Column('target_id', ForeignKey('target.id'))
+ return Column("target_id", ForeignKey("target.id"))
@declared_attr
def target(cls):
default_factory=list, metadata={"sa": lambda: relationship("Address")}
)
+
@dataclass
class AddressMixin:
__tablename__ = "address"
default=None, metadata={"sa": Column(String(50))}
)
+
@mapper_registry.mapped
class User(UserMixin):
pass
+
@mapper_registry.mapped
class Address(AddressMixin):
- pass
+ pass
.. versionadded:: 1.4.2 Added support for "declared attr" style mixin attributes,
namely :func:`_orm.relationship` constructs as well as :class:`_schema.Column`
A mapping using ``@attr.s``, in conjunction with imperative table::
import attr
+ from sqlalchemy.orm import registry
# other imports
- from sqlalchemy.orm import registry
mapper_registry = registry()
nickname = attr.ib()
addresses = attr.ib()
+
# other classes...
+
``@dataclass`` and attrs_ mappings may also be used with classical mappings, i.e.
with the :meth:`_orm.registry.map_imperatively` function. See the section
:ref:`orm_imperative_dataclasses` for a similar example.
attribute ``__tablename__`` that indicates the name of a :class:`_schema.Table`
that should be generated along with the mapping::
- from sqlalchemy import Column, Integer, String, ForeignKey
+ from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.orm import declarative_base
Base = declarative_base()
+
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
id = Column(Integer, primary_key=True)
name = Column(String)
dictionary::
class MyClass(Base):
- __tablename__ = 'sometable'
- __table_args__ = {'mysql_engine':'InnoDB'}
+ __tablename__ = "sometable"
+ __table_args__ = {"mysql_engine": "InnoDB"}
The other, a tuple, where each argument is positional
(usually constraints)::
class MyClass(Base):
- __tablename__ = 'sometable'
+ __tablename__ = "sometable"
__table_args__ = (
- ForeignKeyConstraint(['id'], ['remote_table.id']),
- UniqueConstraint('foo'),
- )
+ ForeignKeyConstraint(["id"], ["remote_table.id"]),
+ UniqueConstraint("foo"),
+ )
Keyword arguments can be specified with the above form by
specifying the last argument as a dictionary::
class MyClass(Base):
- __tablename__ = 'sometable'
+ __tablename__ = "sometable"
__table_args__ = (
- ForeignKeyConstraint(['id'], ['remote_table.id']),
- UniqueConstraint('foo'),
- {'autoload':True}
- )
+ ForeignKeyConstraint(["id"], ["remote_table.id"]),
+ UniqueConstraint("foo"),
+ {"autoload": True},
+ )
A class may also specify the ``__table_args__`` declarative attribute,
as well as the ``__tablename__`` attribute, in a dynamic style using the
class MyClass(Base):
- __tablename__ = 'sometable'
- __table_args__ = {'schema': 'some_schema'}
-
+ __tablename__ = "sometable"
+ __table_args__ = {"schema": "some_schema"}
The schema name can also be applied to all :class:`_schema.Table` objects
globally by using the :paramref:`_schema.MetaData.schema` parameter documented
or :func:`_orm.declarative_base`::
from sqlalchemy import MetaData
+
metadata_obj = MetaData(schema="some_schema")
- Base = declarative_base(metadata = metadata_obj)
+ Base = declarative_base(metadata=metadata_obj)
class MyClass(Base):
# will use "some_schema" by default
- __tablename__ = 'sometable'
-
+ __tablename__ = "sometable"
.. seealso::
is that of simply assigning new :class:`_schema.Column` objects to the
class::
- MyClass.some_new_column = Column('data', Unicode)
+ MyClass.some_new_column = Column("data", Unicode)
The above operation performed against a declarative class that has been
mapped using the declarative base (note, not the decorator form of declarative)
directly::
+ from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.orm import declarative_base
- from sqlalchemy import Column, Integer, String, ForeignKey
-
Base = declarative_base()
Column("nickname", String),
)
+
# construct the User class using this table.
class User(Base):
__table__ = user_table
class Person(Base):
__table__ = Table(
- 'person',
+ "person",
Base.metadata,
- Column('id', Integer, primary_key=True),
- Column('name', String(50)),
- Column('type', String(50))
+ Column("id", Integer, primary_key=True),
+ Column("name", String(50)),
+ Column("type", String(50)),
)
__mapper_args__ = {
"polymorphic_on": __table__.c.type,
- "polymorhpic_identity": "person"
+ "polymorhpic_identity": "person",
}
The "imperative table" form is also used when a non-:class:`_schema.Table`
construct, such as a :class:`_sql.Join` or :class:`_sql.Subquery` object,
is to be mapped. An example is below::
- from sqlalchemy import select, func
+ from sqlalchemy import func, select
- subq = select(
- func.count(orders.c.id).label('order_count'),
- func.max(orders.c.price).label('highest_order'),
- orders.c.customer_id
- ).group_by(orders.c.customer_id).subquery()
+ subq = (
+ select(
+ func.count(orders.c.id).label("order_count"),
+ func.max(orders.c.price).label("highest_order"),
+ orders.c.customer_id,
+ )
+ .group_by(orders.c.customer_id)
+ .subquery()
+ )
+
- customer_select = select(customers, subq).join_from(
- customers, subq, customers.c.id == subq.c.customer_id
- ).subquery()
+ customer_select = (
+ select(customers, subq)
+ .join_from(customers, subq, customers.c.id == subq.c.customer_id)
+ .subquery()
+ )
class Customer(Base):
__table__ = customer_select
:paramref:`_schema.Table.autoload_with` parameter to the
:class:`_schema.Table`::
- engine = create_engine("postgresql+psycopg2://user:pass@hostname/my_existing_database")
+ engine = create_engine(
+ "postgresql+psycopg2://user:pass@hostname/my_existing_database"
+ )
+
class MyClass(Base):
__table__ = Table(
- 'mytable',
+ "mytable",
Base.metadata,
- autoload_with=engine
+ autoload_with=engine,
)
A major downside of the above approach however is that it requires the database
results with the declarative table mapping process, that is, classes which
use the ``__tablename__`` attribute::
- from sqlalchemy.orm import declarative_base
from sqlalchemy.ext.declarative import DeferredReflection
+ from sqlalchemy.orm import declarative_base
Base = declarative_base()
+
class Reflected(DeferredReflection):
__abstract__ = True
+
class Foo(Reflected, Base):
- __tablename__ = 'foo'
+ __tablename__ = "foo"
bars = relationship("Bar")
+
class Bar(Reflected, Base):
- __tablename__ = 'bar'
+ __tablename__ = "bar"
- foo_id = Column(Integer, ForeignKey('foo.id'))
+ foo_id = Column(Integer, ForeignKey("foo.id"))
Above, we create a mixin class ``Reflected`` that will serve as a base
for classes in our declarative hierarchy that should become mapped when
complete until we do so, given an :class:`_engine.Engine`::
- engine = create_engine("postgresql+psycopg2://user:pass@hostname/my_existing_database")
+ engine = create_engine(
+ "postgresql+psycopg2://user:pass@hostname/my_existing_database"
+ )
Reflected.prepare(engine)
The purpose of the ``Reflected`` class is to define the scope at which
Each ``User`` can have any number of ``Keyword`` objects, and vice-versa
(the many-to-many pattern is described at :ref:`relationships_many_to_many`)::
- from sqlalchemy import Column, Integer, String, ForeignKey, Table
+ from sqlalchemy import Column, ForeignKey, Integer, String, Table
from sqlalchemy.orm import declarative_base, relationship
Base = declarative_base()
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
id = Column(Integer, primary_key=True)
name = Column(String(64))
kw = relationship("Keyword", secondary=lambda: userkeywords_table)
def __init__(self, name):
self.name = name
+
class Keyword(Base):
- __tablename__ = 'keyword'
+ __tablename__ = "keyword"
id = Column(Integer, primary_key=True)
- keyword = Column('keyword', String(64))
+ keyword = Column("keyword", String(64))
def __init__(self, keyword):
self.keyword = keyword
- userkeywords_table = Table('userkeywords', Base.metadata,
- Column('user_id', Integer, ForeignKey("user.id"),
- primary_key=True),
- Column('keyword_id', Integer, ForeignKey("keyword.id"),
- primary_key=True)
+
+ userkeywords_table = Table(
+ "userkeywords",
+ Base.metadata,
+ Column("user_id", Integer, ForeignKey("user.id"), primary_key=True),
+ Column("keyword_id", Integer, ForeignKey("keyword.id"), primary_key=True),
)
Reading and manipulating the collection of "keyword" strings associated
with ``User`` requires traversal from each collection element to the ``.keyword``
attribute, which can be awkward::
- >>> user = User('jek')
- >>> user.kw.append(Keyword('cheese-inspector'))
+ >>> user = User("jek")
+ >>> user.kw.append(Keyword("cheese-inspector"))
>>> print(user.kw)
[<__main__.Keyword object at 0x12bf830>]
>>> print(user.kw[0].keyword)
from sqlalchemy.ext.associationproxy import association_proxy
+
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
id = Column(Integer, primary_key=True)
name = Column(String(64))
kw = relationship("Keyword", secondary=lambda: userkeywords_table)
self.name = name
# proxy the 'keyword' attribute from the 'kw' relationship
- keywords = association_proxy('kw', 'keyword')
+ keywords = association_proxy("kw", "keyword")
We can now reference the ``.keywords`` collection as a listing of strings,
which is both readable and writable. New ``Keyword`` objects are created
for us transparently::
- >>> user = User('jek')
- >>> user.keywords.append('cheese-inspector')
+ >>> user = User("jek")
+ >>> user.keywords.append("cheese-inspector")
>>> user.keywords
['cheese-inspector']
- >>> user.keywords.append('snack ninja')
+ >>> user.keywords.append("snack ninja")
>>> user.kw
[<__main__.Keyword object at 0x12cdd30>, <__main__.Keyword object at 0x12cde30>]
new instance of the "intermediary" object using its constructor, passing as a
single argument the given value. In our example above, an operation like::
- user.keywords.append('cheese-inspector')
+ user.keywords.append("cheese-inspector")
Is translated by the association proxy into the operation::
- user.kw.append(Keyword('cheese-inspector'))
+ user.kw.append(Keyword("cheese-inspector"))
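This translation step can be sketched in plain Python. ``ProxyList`` below is a hypothetical stand-in used purely for illustration, not SQLAlchemy's implementation: reads yield the proxied attribute, while ``append()`` runs the plain value through a creator callable.

```python
class Keyword:
    def __init__(self, keyword):
        self.keyword = keyword


class ProxyList:
    """Illustrative stand-in for a list-based association proxy."""

    def __init__(self, target, attr, creator):
        self._target = target    # the underlying collection, e.g. user.kw
        self._attr = attr        # the proxied attribute name, "keyword"
        self._creator = creator  # builds the intermediary object

    def append(self, value):
        # append("cheese-inspector") -> target.append(Keyword("cheese-inspector"))
        self._target.append(self._creator(value))

    def __iter__(self):
        return (getattr(obj, self._attr) for obj in self._target)


kw = []  # stands in for the user.kw relationship collection
keywords = ProxyList(kw, "keyword", creator=Keyword)
keywords.append("cheese-inspector")
print(list(keywords))  # -> ['cheese-inspector']
```

The essential trick is that the proxy never stores anything itself; both reads and writes are routed through the underlying collection.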
The example works here because we have designed the constructor for ``Keyword``
to accept a single positional argument, ``keyword``. For those cases where a
# ...
# use Keyword(keyword=kw) on append() events
- keywords = association_proxy('kw', 'keyword',
- creator=lambda kw: Keyword(keyword=kw))
+ keywords = association_proxy(
+ "kw", "keyword", creator=lambda kw: Keyword(keyword=kw)
+ )
The ``creator`` function accepts a single argument in the case of a list-
or set-based collection, or a scalar attribute. In the case of a dictionary-based
collection of ``User`` to the ``.keyword`` attribute present on each
``UserKeyword``::
- from sqlalchemy import Column, Integer, String, ForeignKey
+ from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.ext.associationproxy import association_proxy
from sqlalchemy.orm import backref, declarative_base, relationship
Base = declarative_base()
+
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
id = Column(Integer, primary_key=True)
name = Column(String(64))
# association proxy of "user_keywords" collection
# to "keyword" attribute
- keywords = association_proxy('user_keywords', 'keyword')
+ keywords = association_proxy("user_keywords", "keyword")
def __init__(self, name):
self.name = name
+
class UserKeyword(Base):
- __tablename__ = 'user_keyword'
- user_id = Column(Integer, ForeignKey('user.id'), primary_key=True)
- keyword_id = Column(Integer, ForeignKey('keyword.id'), primary_key=True)
+ __tablename__ = "user_keyword"
+ user_id = Column(Integer, ForeignKey("user.id"), primary_key=True)
+ keyword_id = Column(Integer, ForeignKey("keyword.id"), primary_key=True)
special_key = Column(String(50))
# bidirectional attribute/collection of "user"/"user_keywords"
- user = relationship(User,
- backref=backref("user_keywords",
- cascade="all, delete-orphan")
- )
+ user = relationship(
+ User, backref=backref("user_keywords", cascade="all, delete-orphan")
+ )
# reference to the "Keyword" object
keyword = relationship("Keyword")
self.keyword = keyword
self.special_key = special_key
+
class Keyword(Base):
- __tablename__ = 'keyword'
+ __tablename__ = "keyword"
id = Column(Integer, primary_key=True)
- keyword = Column('keyword', String(64))
+ keyword = Column("keyword", String(64))
def __init__(self, keyword):
self.keyword = keyword
def __repr__(self):
- return 'Keyword(%s)' % repr(self.keyword)
+ return "Keyword(%s)" % repr(self.keyword)
With the above configuration, we can operate upon the ``.keywords`` collection
of each ``User`` object, each of which exposes a collection of ``Keyword``
objects that are obtained from the underlying ``UserKeyword`` elements::
- >>> user = User('log')
- >>> for kw in (Keyword('new_from_blammo'), Keyword('its_big')):
+ >>> user = User("log")
+ >>> for kw in (Keyword("new_from_blammo"), Keyword("its_big")):
... user.keywords.append(kw)
...
>>> print(user.keywords)
a collection of strings, rather than a collection of composed objects.
In this case, each ``.keywords.append()`` operation is equivalent to::
- >>> user.user_keywords.append(UserKeyword(Keyword('its_heavy')))
+ >>> user.user_keywords.append(UserKeyword(Keyword("its_heavy")))
The ``UserKeyword`` association object has two attributes that are both
populated within the scope of the ``append()`` operation of the association
construction, has the effect of appending the new ``UserKeyword`` to
the ``User.user_keywords`` collection (via the relationship)::
- >>> UserKeyword(Keyword('its_wood'), user, special_key='my special key')
+ >>> UserKeyword(Keyword("its_wood"), user, special_key="my special key")
The association proxy returns to us a collection of ``Keyword`` objects represented
by all these operations::
argument to the ``User.keywords`` proxy so that these values are assigned appropriately
when new elements are added to the dictionary::
- from sqlalchemy import Column, Integer, String, ForeignKey
+ from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.ext.associationproxy import association_proxy
from sqlalchemy.orm import backref, declarative_base, relationship
from sqlalchemy.orm.collections import attribute_mapped_collection
Base = declarative_base()
+
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
id = Column(Integer, primary_key=True)
name = Column(String(64))
# proxy to 'user_keywords', instantiating UserKeyword
# assigning the new key to 'special_key', values to
# 'keyword'.
- keywords = association_proxy('user_keywords', 'keyword',
- creator=lambda k, v:
- UserKeyword(special_key=k, keyword=v)
- )
+ keywords = association_proxy(
+ "user_keywords",
+ "keyword",
+ creator=lambda k, v: UserKeyword(special_key=k, keyword=v),
+ )
def __init__(self, name):
self.name = name
+
class UserKeyword(Base):
- __tablename__ = 'user_keyword'
- user_id = Column(Integer, ForeignKey('user.id'), primary_key=True)
- keyword_id = Column(Integer, ForeignKey('keyword.id'), primary_key=True)
+ __tablename__ = "user_keyword"
+ user_id = Column(Integer, ForeignKey("user.id"), primary_key=True)
+ keyword_id = Column(Integer, ForeignKey("keyword.id"), primary_key=True)
special_key = Column(String)
# bidirectional user/user_keywords relationships, mapping
# user_keywords with a dictionary against "special_key" as key.
- user = relationship(User, backref=backref(
- "user_keywords",
- collection_class=attribute_mapped_collection("special_key"),
- cascade="all, delete-orphan"
- )
- )
+ user = relationship(
+ User,
+ backref=backref(
+ "user_keywords",
+ collection_class=attribute_mapped_collection("special_key"),
+ cascade="all, delete-orphan",
+ ),
+ )
keyword = relationship("Keyword")
+
class Keyword(Base):
- __tablename__ = 'keyword'
+ __tablename__ = "keyword"
id = Column(Integer, primary_key=True)
- keyword = Column('keyword', String(64))
+ keyword = Column("keyword", String(64))
def __init__(self, keyword):
self.keyword = keyword
def __repr__(self):
- return 'Keyword(%s)' % repr(self.keyword)
+ return "Keyword(%s)" % repr(self.keyword)
We illustrate the ``.keywords`` collection as a dictionary, mapping the
``UserKeyword.special_key`` value to ``Keyword`` objects::
- >>> user = User('log')
+ >>> user = User("log")
- >>> user.keywords['sk1'] = Keyword('kw1')
- >>> user.keywords['sk2'] = Keyword('kw2')
+ >>> user.keywords["sk1"] = Keyword("kw1")
+ >>> user.keywords["sk2"] = Keyword("kw2")
>>> print(user.keywords)
{'sk1': Keyword('kw1'), 'sk2': Keyword('kw2')}
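The two-argument creator used above can likewise be sketched without the ORM. ``ProxyDict`` below is a hypothetical illustration of how dictionary assignment is routed through ``creator(key, value)``; it is not SQLAlchemy's implementation.

```python
class Keyword:
    def __init__(self, keyword):
        self.keyword = keyword


class UserKeyword:
    def __init__(self, special_key, keyword):
        self.special_key = special_key
        self.keyword = keyword


class ProxyDict:
    """Illustrative stand-in for a dictionary-based association proxy."""

    def __init__(self, target, attr, creator):
        self._target = target    # dict of special_key -> UserKeyword
        self._attr = attr        # proxied attribute name, "keyword"
        self._creator = creator  # two-argument creator: (key, value)

    def __setitem__(self, key, value):
        # keywords["sk1"] = Keyword("kw1") becomes
        # target["sk1"] = UserKeyword(special_key="sk1", keyword=Keyword("kw1"))
        self._target[key] = self._creator(key, value)

    def __getitem__(self, key):
        return getattr(self._target[key], self._attr)


user_keywords = {}
keywords = ProxyDict(
    user_keywords,
    "keyword",
    creator=lambda k, v: UserKeyword(special_key=k, keyword=v),
)
keywords["sk1"] = Keyword("kw1")
print(keywords["sk1"].keyword)  # prints kw1
```

Note how the dictionary key does double duty: it indexes the collection and populates ``special_key`` on the intermediary object.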
an association proxy on ``User`` that refers to an association proxy
present on ``UserKeyword``::
- from sqlalchemy import Column, Integer, String, ForeignKey
+ from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.ext.associationproxy import association_proxy
from sqlalchemy.orm import backref, declarative_base, relationship
from sqlalchemy.orm.collections import attribute_mapped_collection
Base = declarative_base()
+
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
id = Column(Integer, primary_key=True)
name = Column(String(64))
# the same 'user_keywords'->'keyword' proxy as in
# the basic dictionary example.
keywords = association_proxy(
- 'user_keywords',
- 'keyword',
- creator=lambda k, v: UserKeyword(special_key=k, keyword=v)
+ "user_keywords",
+ "keyword",
+ creator=lambda k, v: UserKeyword(special_key=k, keyword=v),
)
# another proxy that is directly column-targeted
def __init__(self, name):
self.name = name
+
class UserKeyword(Base):
- __tablename__ = 'user_keyword'
- user_id = Column(ForeignKey('user.id'), primary_key=True)
- keyword_id = Column(ForeignKey('keyword.id'), primary_key=True)
+ __tablename__ = "user_keyword"
+ user_id = Column(ForeignKey("user.id"), primary_key=True)
+ keyword_id = Column(ForeignKey("keyword.id"), primary_key=True)
special_key = Column(String)
user = relationship(
User,
backref=backref(
"user_keywords",
collection_class=attribute_mapped_collection("special_key"),
- cascade="all, delete-orphan"
- )
+ cascade="all, delete-orphan",
+ ),
)
# the relationship to Keyword is now called
# 'keyword' is changed to be a proxy to the
# 'keyword' attribute of 'Keyword'
- keyword = association_proxy('kw', 'keyword')
+ keyword = association_proxy("kw", "keyword")
+
class Keyword(Base):
- __tablename__ = 'keyword'
+ __tablename__ = "keyword"
id = Column(Integer, primary_key=True)
- keyword = Column('keyword', String(64))
+ keyword = Column("keyword", String(64))
def __init__(self, keyword):
self.keyword = keyword
-
``User.keywords`` is now a dictionary of string to string, where
``UserKeyword`` and ``Keyword`` objects are created and removed for us
transparently using the association proxy. In the example below, we illustrate
Given a mapping as::
class A(Base):
- __tablename__ = 'test_a'
+ __tablename__ = "test_a"
id = Column(Integer, primary_key=True)
- ab = relationship(
- 'AB', backref='a', uselist=False)
+ ab = relationship("AB", backref="a", uselist=False)
b = association_proxy(
- 'ab', 'b', creator=lambda b: AB(b=b),
- cascade_scalar_deletes=True)
+ "ab", "b", creator=lambda b: AB(b=b), cascade_scalar_deletes=True
+ )
class B(Base):
- __tablename__ = 'test_b'
+ __tablename__ = "test_b"
id = Column(Integer, primary_key=True)
- ab = relationship('AB', backref='b', cascade='all, delete-orphan')
+ ab = relationship("AB", backref="b", cascade="all, delete-orphan")
class AB(Base):
- __tablename__ = 'test_ab'
+ __tablename__ = "test_ab"
a_id = Column(Integer, ForeignKey(A.id), primary_key=True)
b_id = Column(Integer, ForeignKey(B.id), primary_key=True)
from sqlalchemy.ext.asyncio import create_async_engine
+
async def async_main():
engine = create_async_engine(
- "postgresql+asyncpg://scott:tiger@localhost/test", echo=True,
+ "postgresql+asyncpg://scott:tiger@localhost/test",
+ echo=True,
)
async with engine.begin() as conn:
)
async with engine.connect() as conn:
-
# select a Result, which will be delivered with buffered
# results
result = await conn.execute(select(t1).where(t1.c.name == "some name 1"))
# clean-up pooled connections
await engine.dispose()
+
asyncio.run(async_main())
Above, the :meth:`_asyncio.AsyncConnection.run_sync` method may be used to
async_result = await conn.stream(select(t1))
async for row in async_result:
- print("row: %s" % (row, ))
+ print("row: %s" % (row,))
.. _asyncio_orm:
import asyncio
- from sqlalchemy import Column
- from sqlalchemy import DateTime
- from sqlalchemy import ForeignKey
- from sqlalchemy import func
- from sqlalchemy import Integer
- from sqlalchemy import String
- from sqlalchemy.ext.asyncio import AsyncSession
- from sqlalchemy.ext.asyncio import async_sessionmaker
- from sqlalchemy.ext.asyncio import create_async_engine
- from sqlalchemy.future import select
- from sqlalchemy.orm import declarative_base
- from sqlalchemy.orm import relationship
- from sqlalchemy.orm import selectinload
+ from sqlalchemy import (
+ Column,
+ DateTime,
+ ForeignKey,
+ Integer,
+ String,
+ func,
+ select,
+ )
+ from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine
+ from sqlalchemy.orm import declarative_base, relationship, selectinload
Base = declarative_base()
async_session = AsyncSession(engine, expire_on_commit=False)
# sessionmaker version
- async_session = async_sessionmaker(
- engine, expire_on_commit=False
- )
+ async_session = async_sessionmaker(engine, expire_on_commit=False)
async with async_session() as session:
-
result = await session.execute(select(A).order_by(A.id))
a1 = result.scalars().first()
import asyncio
- from sqlalchemy.ext.asyncio import create_async_engine
- from sqlalchemy.ext.asyncio import AsyncSession
+ from sqlalchemy import select
+ from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
+
def fetch_and_update_objects(session):
"""run traditional sync-style ORM code in a function that will be
async def async_main():
engine = create_async_engine(
- "postgresql+asyncpg://scott:tiger@localhost/test", echo=True,
+ "postgresql+asyncpg://scott:tiger@localhost/test",
+ echo=True,
)
async with engine.begin() as conn:
await conn.run_sync(Base.metadata.drop_all)
# clean-up pooled connections
await engine.dispose()
+
asyncio.run(async_main())
The above approach of running certain functions within a "sync" runner
import asyncio
- from sqlalchemy import text
+ from sqlalchemy import event, text
from sqlalchemy.engine import Engine
- from sqlalchemy import event
- from sqlalchemy.ext.asyncio import AsyncSession
- from sqlalchemy.ext.asyncio import create_async_engine
+ from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.orm import Session
## Core events ##
- engine = create_async_engine(
- "postgresql+asyncpg://scott:tiger@localhost:5432/test"
- )
+ engine = create_async_engine("postgresql+asyncpg://scott:tiger@localhost:5432/test")
+
# connect event on instance of Engine
@event.listens_for(engine.sync_engine, "connect")
cursor.execute("select 'execute from event'")
print(cursor.fetchone()[0])
+
# before_execute event on all Engine instances
@event.listens_for(Engine, "before_execute")
def my_before_execute(
- conn, clauseelement, multiparams, params, execution_options
+ conn,
+ clauseelement,
+ multiparams,
+ params,
+ execution_options,
):
print("before execute!")
session = AsyncSession(engine)
+
# before_commit event on instance of Session
@event.listens_for(session.sync_session, "before_commit")
def my_before_commit(session):
result = connection.execute(text("select 'execute from event'"))
print(result.first())
+
# after_commit event on all Session instances
@event.listens_for(Session, "after_commit")
def my_after_commit(session):
print("after commit!")
+
async def go():
await session.execute(text("select 1"))
await session.commit()
await session.close()
await engine.dispose()
+
asyncio.run(go())
+
The above example prints something along the lines of::
New DBAPI connection: <AdaptedConnection <asyncpg.connection.Connection ...>>
it's perfectly fine for it to be a Python ``lambda:``, as the awaitable return
value will be invoked after being returned::
- from sqlalchemy.ext.asyncio import create_async_engine
from sqlalchemy import event
+ from sqlalchemy.ext.asyncio import create_async_engine
engine = create_async_engine(...)
+
@event.listens_for(engine.sync_engine, "connect")
def register_custom_types(dbapi_connection, ...):
dbapi_connection.run_async(
- lambda connection: connection.set_type_codec('MyCustomType', encoder, decoder, ...)
+ lambda connection: connection.set_type_codec(
+ "MyCustomType", encoder, decoder, ...
+ )
)
Above, the object passed to the ``register_custom_types`` event handler
to disable pooling using :class:`~sqlalchemy.pool.NullPool`, preventing the Engine
from using any connection more than once::
+ from sqlalchemy.ext.asyncio import create_async_engine
from sqlalchemy.pool import NullPool
+
engine = create_async_engine(
- "postgresql+asyncpg://user:pass@host/dbname", poolclass=NullPool
+ "postgresql+asyncpg://user:pass@host/dbname",
+ poolclass=NullPool,
)
-
.. _asyncio_scoped_session:
Using asyncio scoped session
from asyncio import current_task
- from sqlalchemy.ext.asyncio import async_sessionmaker
- from sqlalchemy.ext.asyncio import async_scoped_session
- from sqlalchemy.ext.asyncio import AsyncSession
-
- async_session_factory = async_sessionmaker(some_async_engine, expire_on_commit=False)
- AsyncScopedSession = async_scoped_session(async_session_factory, scopefunc=current_task)
+ from sqlalchemy.ext.asyncio import (
+ async_scoped_session,
+ async_sessionmaker,
+ )
+ async_session_factory = async_sessionmaker(
+ some_async_engine,
+ expire_on_commit=False,
+ )
+ AsyncScopedSession = async_scoped_session(
+ async_session_factory,
+ scopefunc=current_task,
+ )
some_async_session = AsyncScopedSession()
:class:`_asyncio.async_scoped_session` also includes **proxy
import asyncio
- from sqlalchemy.ext.asyncio import create_async_engine
- from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import inspect
+ from sqlalchemy.ext.asyncio import create_async_engine
+
- engine = create_async_engine(
- "postgresql+asyncpg://scott:tiger@localhost/test"
- )
+ engine = create_async_engine("postgresql+asyncpg://scott:tiger@localhost/test")
def use_inspector(conn):
inspector = inspect(conn)
# return any value to the caller
return inspector.get_table_names()
+
async def async_main():
async with engine.connect() as conn:
tables = await conn.run_sync(use_inspector)
+
asyncio.run(async_main())
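The calling pattern of ``run_sync`` — handing a plain synchronous function to async code and awaiting its return value — can be imitated with the standard library alone. The sketch below uses ``asyncio.to_thread`` purely as an analogy; SQLAlchemy's actual ``run_sync`` is greenlet-based, which is what lets the sync function keep driving the same async connection.

```python
import asyncio


def use_inspector(conn):
    # a plain synchronous function, written as if for run_sync();
    # the return value is a stand-in for inspector.get_table_names()
    return ["user", "address"]


async def run_sync_sketch(fn, conn):
    # analogy only: SQLAlchemy uses greenlets rather than a worker thread,
    # so the sync function can transparently reuse the async connection
    return await asyncio.to_thread(fn, conn)


async def async_main():
    return await run_sync_sketch(use_inspector, conn=None)


tables = asyncio.run(async_main())
print(tables)  # -> ['user', 'address']
```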
.. seealso::
from sqlalchemy import bindparam
def search_for_user(session, username, email=None):
baked_query = bakery(lambda session: session.query(User))
- baked_query += lambda q: q.filter(User.name == bindparam('username'))
+ baked_query += lambda q: q.filter(User.name == bindparam("username"))
baked_query += lambda q: q.order_by(User.id)
if email:
- baked_query += lambda q: q.filter(User.email == bindparam('email'))
+ baked_query += lambda q: q.filter(User.email == bindparam("email"))
result = baked_query(session).params(username=username, email=email).all()
s = Session(bind=engine)
for id_ in random.sample(ids, n):
q = bakery(lambda s: s.query(Customer))
- q += lambda q: q.filter(Customer.id == bindparam('id'))
+ q += lambda q: q.filter(Customer.id == bindparam("id"))
q(s).params(id=id_).one()
The difference in Python function call count for an iteration of 10000
my_simple_cache = {}
+
def lookup(session, id_argument):
if "my_key" not in my_simple_cache:
- query = session.query(Model).filter(Model.id == bindparam('id'))
+ query = session.query(Model).filter(Model.id == bindparam("id"))
my_simple_cache["my_key"] = query.with_session(None)
else:
query = my_simple_cache["my_key"].with_session(session)
my_simple_cache = {}
def lookup(session, id_argument):
-
if "my_key" not in my_simple_cache:
- query = session.query(Model).filter(Model.id == bindparam('id'))
+ query = session.query(Model).filter(Model.id == bindparam("id"))
my_simple_cache["my_key"] = query.with_session(None).bake()
else:
query = my_simple_cache["my_key"].with_session(session)
bakery = baked.bakery()
+
def lookup(session, id_argument):
def create_model_query(session):
- return session.query(Model).filter(Model.id == bindparam('id'))
+ return session.query(Model).filter(Model.id == bindparam("id"))
parameterized_query = bakery.bake(create_model_query)
return parameterized_query(session).params(id=id_argument).all()
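The saving comes from keying the cache on the builder function itself, so the construction work runs only once per distinct function. ``Bakery`` below is a toy sketch of that idea, not the real ``BakedQuery`` API.

```python
class Bakery:
    """Sketch: cache the result of a builder function, keyed on the
    function's code object, so construction happens only once."""

    def __init__(self):
        self._cache = {}
        self.calls = 0  # counts how often a builder body actually ran

    def bake(self, builder, *args):
        # a function defined at one source location always carries the
        # same __code__ object, so it makes a stable cache key
        key = builder.__code__
        if key not in self._cache:
            self.calls += 1
            self._cache[key] = builder(*args)
        return self._cache[key]


bakery = Bakery()


def create_model_query():
    # stands in for an expensive session.query(...) construction
    return "SELECT * FROM model WHERE id = :id"


for _ in range(3):
    stmt = bakery.bake(create_model_query)

print(bakery.calls)  # -> 1
```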
my_simple_cache = {}
+
def lookup(session, id_argument, include_frobnizzle=False):
if include_frobnizzle:
cache_key = "my_key_with_frobnizzle"
cache_key = "my_key_without_frobnizzle"
if cache_key not in my_simple_cache:
- query = session.query(Model).filter(Model.id == bindparam('id'))
+ query = session.query(Model).filter(Model.id == bindparam("id"))
if include_frobnizzle:
query = query.filter(Model.frobnizzle == True)
bakery = baked.bakery()
+
def lookup(session, id_argument, include_frobnizzle=False):
def create_model_query(session):
- return session.query(Model).filter(Model.id == bindparam('id'))
+ return session.query(Model).filter(Model.id == bindparam("id"))
parameterized_query = bakery.bake(create_model_query)
return query.filter(Model.frobnizzle == True)
parameterized_query = parameterized_query.with_criteria(
- include_frobnizzle_in_query)
+ include_frobnizzle_in_query
+ )
return parameterized_query(session).params(id=id_argument).all()
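The incremental ``+=`` construction seen throughout these examples can be sketched generically: build steps are stored as callables and applied only when the query is finally used. ``DeferredQuery`` below is a hypothetical illustration operating on plain strings, not the ``BakedQuery`` implementation.

```python
class DeferredQuery:
    """Sketch: accumulate build steps as callables (as BakedQuery's
    ``+=`` does) and apply them only when the query is finally built."""

    def __init__(self, initial):
        self._steps = [initial]

    def __iadd__(self, step):
        # q += lambda ... appends another deferred transformation
        self._steps.append(step)
        return self

    def build(self, start):
        value = start
        for step in self._steps:
            value = step(value)
        return value


q = DeferredQuery(lambda base: base + " WHERE 1=1")
q += lambda s: s + " AND id = :id"
q += lambda s: s + " ORDER BY id"
print(q.build("SELECT * FROM model"))
# -> SELECT * FROM model WHERE 1=1 AND id = :id ORDER BY id
```

Because the steps are stored rather than executed, the chain of lambdas can also serve as a cache key, which is how the baked query extension avoids rebuilding the statement.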
bakery = baked.bakery()
+
def lookup(session, id_argument, include_frobnizzle=False):
parameterized_query = bakery.bake(
- lambda s: s.query(Model).filter(Model.id == bindparam('id'))
- )
+ lambda s: s.query(Model).filter(Model.id == bindparam("id"))
+ )
if include_frobnizzle:
parameterized_query += lambda q: q.filter(Model.frobnizzle == True)
baked_query = bakery(lambda session: session.query(User))
baked_query += lambda q: q.filter(
- User.name.in_(bindparam('username', expanding=True)))
+ User.name.in_(bindparam("username", expanding=True))
+ )
- result = baked_query.with_session(session).params(
- username=['ed', 'fred']).all()
+ result = baked_query.with_session(session).params(username=["ed", "fred"]).all()
.. seealso::
# select a correlated subquery in the top columns list,
# we have the "session" argument, pass that
- my_q = bakery(
- lambda s: s.query(Address.id, my_subq.to_query(s).as_scalar()))
+ my_q = bakery(lambda s: s.query(Address.id, my_subq.to_query(s).as_scalar()))
# use a correlated subquery in some of the criteria, we have
# the "query" argument, pass that.
still to allow the result to be cached, the event can be registered
by passing the ``bake_ok=True`` flag::
- @event.listens_for(
- Query, "before_compile", retval=True, bake_ok=True)
+ @event.listens_for(Query, "before_compile", retval=True, bake_ok=True)
def my_event(query):
for desc in query.column_descriptions:
- if desc['type'] is User:
- entity = desc['entity']
+ if desc["type"] is User:
+ entity = desc["entity"]
query = query.filter(entity.deleted == False)
return query
To cover the major areas where this occurs, consider the following ORM
mapping, using the typical example of the ``User`` class::
- from sqlalchemy import Column
- from sqlalchemy import Integer
- from sqlalchemy import String
- from sqlalchemy import select
+ from sqlalchemy import Column, Integer, String, select
from sqlalchemy.orm import declarative_base
# "Base" is a class that is created dynamically from the
# declarative_base() function
Base = declarative_base()
+
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
id = Column(Integer, primary_key=True)
name = Column(String)
+
# "some_user" is an instance of the User class, which
# accepts "id" and "name" kwargs based on the mapping
- some_user = User(id=5, name='user')
+ some_user = User(id=5, name="user")
# it has an attribute called .name that's a string
print(f"Username: {some_user.name}")
# a select() construct makes use of SQL expressions derived from the
# User class itself
- select_stmt = select(User).where(User.id.in_([3, 4, 5])).where(User.name.contains('s'))
+ select_stmt = (
+ select(User).where(User.id.in_([3, 4, 5])).where(User.name.contains("s"))
+ )
Above, the steps that the Mypy extension can take include:
definition and Python code passed to the Mypy tool is equivalent to the
following::
- from sqlalchemy import Column
- from sqlalchemy import Integer
- from sqlalchemy import String
- from sqlalchemy import select
- from sqlalchemy.orm import declarative_base
- from sqlalchemy.orm.decl_api import DeclarativeMeta
+ from typing import Optional
+
+ from sqlalchemy import Column, Integer, String, select
from sqlalchemy.orm import Mapped
+ from sqlalchemy.orm.decl_api import DeclarativeMeta
+
class Base(metaclass=DeclarativeMeta):
__abstract__ = True
+
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
id: Mapped[Optional[int]] = Mapped._special_method(
Column(Integer, primary_key=True)
)
- name: Mapped[Optional[str]] = Mapped._special_method(
- Column(String)
- )
+ name: Mapped[Optional[str]] = Mapped._special_method(Column(String))
- def __init__(self, id: Optional[int] = ..., name: Optional[str] = ...) -> None:
+ def __init__(
+ self, id: Optional[int] = ..., name: Optional[str] = ...
+ ) -> None:
...
- some_user = User(id=5, name='user')
+
+ some_user = User(id=5, name="user")
print(f"Username: {some_user.name}")
- select_stmt = select(User).where(User.id.in_([3, 4, 5])).where(User.name.contains('s'))
+ select_stmt = (
+ select(User).where(User.id.in_([3, 4, 5])).where(User.name.contains("s"))
+ )
+
The key steps which have been taken above include:
from sqlalchemy.orm import Mapped
+
class MyClass(Base):
# ...
Base = declarative_base()
+
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
id = Column(Integer, primary_key=True)
name = Column(String)
+
class Address(Base):
- __tablename__ = 'address'
+ __tablename__ = "address"
id = Column(Integer, primary_key=True)
user_id = Column(ForeignKey("user.id"))
column::
class Address(Base):
- __tablename__ = 'address'
+ __tablename__ = "address"
id = Column(Integer, primary_key=True)
user_id: int = Column(ForeignKey("user.id"))
Base.metadata,
Column(Integer, primary_key=True),
Column("employee_name", String(50), nullable=False),
- Column(String(50))
+ Column(String(50)),
)
id: Mapped[int]
is a string or callable, and not a class::
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
id = Column(Integer, primary_key=True)
name = Column(String)
+
class Address(Base):
- __tablename__ = 'address'
+ __tablename__ = "address"
id = Column(Integer, primary_key=True)
user_id: int = Column(ForeignKey("user.id"))
or by providing the type, in this case the scalar ``User`` object::
class Address(Base):
- __tablename__ = 'address'
+ __tablename__ = "address"
id = Column(Integer, primary_key=True)
user_id: int = Column(ForeignKey("user.id"))
the `TYPE_CHECKING block <https://www.python.org/dev/peps/pep-0484/#runtime-or-type-checking>`_
as appropriate::
- from typing import List, TYPE_CHECKING
+ from typing import TYPE_CHECKING, List
+
from .mymodel import Base
if TYPE_CHECKING:
# that cannot normally be imported at runtime
from .myaddressmodel import Address
+
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
id = Column(Integer, primary_key=True)
name = Column(String)
applied explicitly::
class User(Base):
- __tablename__ = 'user'
+ __tablename__ = "user"
id = Column(Integer, primary_key=True)
name = Column(String)
- addresses: Mapped[List["Address"]] = relationship("Address", back_populates="user")
+ addresses: Mapped[List["Address"]] = relationship(
+ "Address", back_populates="user"
+ )
+
class Address(Base):
- __tablename__ = 'address'
+ __tablename__ = "address"
id = Column(Integer, primary_key=True)
user_id: int = Column(ForeignKey("user.id"))
:func:`_orm.declarative_mixin` decorator, which provides a hint to the Mypy
plugin that a particular class intends to serve as a declarative mixin::
- from sqlalchemy.orm import declared_attr
- from sqlalchemy.orm import declarative_mixin
+ from sqlalchemy.orm import declarative_mixin, declared_attr
+
@declarative_mixin
class HasUpdatedAt:
def updated_at(cls) -> Column[DateTime]: # uses Column
return Column(DateTime)
+
@declarative_mixin
class HasCompany:
-
@declared_attr
def company_id(cls) -> Mapped[int]: # uses Mapped
return Column(ForeignKey("company.id"))
def company(cls) -> Mapped["Company"]:
return relationship("Company")
+
class Employee(HasUpdatedAt, HasCompany, Base):
- __tablename__ = 'employee'
+ __tablename__ = "employee"
id = Column(Integer, primary_key=True)
name = Column(String)
company_id: Mapped[int]
company: Mapped["Company"]
-
Combining with Dataclasses or Other Type-Sensitive Attribute Systems
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
is significant. That is, a class as follows has to be stated exactly
as it is in order to be accepted by dataclasses::
- mapper_registry : registry = registry()
+ mapper_registry: registry = registry()
@mapper_registry.mapped
addresses: List[Address] = field(default_factory=list)
__mapper_args__ = { # type: ignore
- "properties" : {
- "addresses": relationship("Address")
- }
+ "properties": {"addresses": relationship("Address")}
}
We can't apply our ``Mapped[]`` types to the attributes ``id``, ``name``,
_mypy_mapped_attrs = [id, name, "fullname", "nickname", addresses]
__mapper_args__ = { # type: ignore
- "properties" : {
- "addresses": relationship("Address")
- }
+ "properties": {"addresses": relationship("Address")}
}
With the above recipe, the attributes listed in ``_mypy_mapped_attrs``
column as well as the identifier for the base class::
class Employee(Base):
- __tablename__ = 'employee'
+ __tablename__ = "employee"
id = Column(Integer, primary_key=True)
name = Column(String(50))
type = Column(String(50))
__mapper_args__ = {
- 'polymorphic_identity':'employee',
- 'polymorphic_on':type
+ "polymorphic_identity": "employee",
+ "polymorphic_on": type,
}
Above, an additional column ``type`` is established to act as the
columns), as well as a foreign key reference to the parent table::
class Engineer(Employee):
- __tablename__ = 'engineer'
- id = Column(Integer, ForeignKey('employee.id'), primary_key=True)
+ __tablename__ = "engineer"
+ id = Column(Integer, ForeignKey("employee.id"), primary_key=True)
engineer_name = Column(String(30))
__mapper_args__ = {
- 'polymorphic_identity':'engineer',
+ "polymorphic_identity": "engineer",
}
+
class Manager(Employee):
- __tablename__ = 'manager'
- id = Column(Integer, ForeignKey('employee.id'), primary_key=True)
+ __tablename__ = "manager"
+ id = Column(Integer, ForeignKey("employee.id"), primary_key=True)
manager_name = Column(String(30))
__mapper_args__ = {
- 'polymorphic_identity':'manager',
+ "polymorphic_identity": "manager",
}
In the above example, each mapping specifies the
and ``Employee``::
class Company(Base):
- __tablename__ = 'company'
+ __tablename__ = "company"
id = Column(Integer, primary_key=True)
name = Column(String(50))
employees = relationship("Employee", back_populates="company")
+
class Employee(Base):
- __tablename__ = 'employee'
+ __tablename__ = "employee"
id = Column(Integer, primary_key=True)
name = Column(String(50))
type = Column(String(50))
- company_id = Column(ForeignKey('company.id'))
+ company_id = Column(ForeignKey("company.id"))
company = relationship("Company", back_populates="employees")
__mapper_args__ = {
- 'polymorphic_identity':'employee',
- 'polymorphic_on':type
+ "polymorphic_identity": "employee",
+ "polymorphic_on": type,
}
+
class Manager(Employee):
- # ...
+ ...
+
class Engineer(Employee):
- # ...
+ ...
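The ``polymorphic_identity`` / ``polymorphic_on`` pairing amounts to a discriminator lookup: the value found in the ``type`` column selects which class a row is instantiated as. The following is a bare-bones sketch of that dispatch, for illustration only; the ORM's actual loader is far more involved.

```python
# registry mapping discriminator value -> mapped class,
# playing the role of the mapper's polymorphic map
polymorphic_map = {}


def polymorphic_identity(name):
    def register(cls):
        polymorphic_map[name] = cls
        return cls
    return register


class Employee:
    def __init__(self, name):
        self.name = name


@polymorphic_identity("engineer")
class Engineer(Employee):
    pass


@polymorphic_identity("manager")
class Manager(Employee):
    pass


def load(row):
    # the "type" column picks the class, as polymorphic_on does
    cls = polymorphic_map[row["type"]]
    return cls(row["name"])


obj = load({"type": "engineer", "name": "wally"})
print(type(obj).__name__)  # -> Engineer
```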
If the foreign key constraint is on a table corresponding to a subclass,
the relationship should target that subclass instead. In the example
established between the ``Manager`` and ``Company`` classes::
class Company(Base):
- __tablename__ = 'company'
+ __tablename__ = "company"
id = Column(Integer, primary_key=True)
name = Column(String(50))
managers = relationship("Manager", back_populates="company")
+
class Employee(Base):
- __tablename__ = 'employee'
+ __tablename__ = "employee"
id = Column(Integer, primary_key=True)
name = Column(String(50))
type = Column(String(50))
__mapper_args__ = {
- 'polymorphic_identity':'employee',
- 'polymorphic_on':type
+ "polymorphic_identity": "employee",
+ "polymorphic_on": type,
}
+
class Manager(Employee):
- __tablename__ = 'manager'
- id = Column(Integer, ForeignKey('employee.id'), primary_key=True)
+ __tablename__ = "manager"
+ id = Column(Integer, ForeignKey("employee.id"), primary_key=True)
manager_name = Column(String(30))
- company_id = Column(ForeignKey('company.id'))
+ company_id = Column(ForeignKey("company.id"))
company = relationship("Company", back_populates="managers")
__mapper_args__ = {
- 'polymorphic_identity':'manager',
+ "polymorphic_identity": "manager",
}
+
class Engineer(Employee):
- # ...
+ ...
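The mapping above can be exercised end to end with a minimal runnable sketch. This is not part of the documented example: the in-memory SQLite database, the sample company name, and the query at the end are invented for illustration; the class definitions follow the pattern shown above.

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base, relationship

Base = declarative_base()


class Company(Base):
    __tablename__ = "company"
    id = Column(Integer, primary_key=True)
    name = Column(String(50))
    managers = relationship("Manager", back_populates="company")


class Employee(Base):
    __tablename__ = "employee"
    id = Column(Integer, primary_key=True)
    name = Column(String(50))
    type = Column(String(50))
    __mapper_args__ = {
        "polymorphic_identity": "employee",
        "polymorphic_on": type,
    }


class Manager(Employee):
    __tablename__ = "manager"
    id = Column(Integer, ForeignKey("employee.id"), primary_key=True)
    manager_name = Column(String(30))
    # the foreign key to Company lives on the subclass table
    company_id = Column(ForeignKey("company.id"))
    company = relationship("Company", back_populates="managers")
    __mapper_args__ = {"polymorphic_identity": "manager"}


engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    # hypothetical sample data
    session.add(Company(name="Widgets Inc", managers=[Manager(name="m1")]))
    session.commit()
    manager = session.query(Manager).one()
    # the relationship resolves against the subclass table
    print(manager.company.name)
```

Because the foreign key is on the ``manager`` table, the relationship is configured between ``Company`` and ``Manager`` directly, not against the ``Employee`` base.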
Above, the ``Manager`` class will have a ``Manager.company`` attribute;
``Company`` will have a ``Company.managers`` attribute that always
the :class:`_schema.Column` will be applied to the same base :class:`_schema.Table` object::
class Employee(Base):
- __tablename__ = 'employee'
+ __tablename__ = "employee"
id = Column(Integer, primary_key=True)
name = Column(String(50))
type = Column(String(20))
__mapper_args__ = {
- 'polymorphic_on':type,
- 'polymorphic_identity':'employee'
+ "polymorphic_on": type,
+ "polymorphic_identity": "employee",
}
+
class Manager(Employee):
manager_data = Column(String(50))
__mapper_args__ = {
- 'polymorphic_identity':'manager'
+ "polymorphic_identity": "manager",
}
+
class Engineer(Employee):
engineer_info = Column(String(50))
__mapper_args__ = {
- 'polymorphic_identity':'engineer'
+ "polymorphic_identity": "engineer",
}
Note that the mappers for the derived classes Manager and Engineer omit the
comes up when two subclasses want to specify *the same* column, as below::
class Employee(Base):
- __tablename__ = 'employee'
+ __tablename__ = "employee"
id = Column(Integer, primary_key=True)
name = Column(String(50))
type = Column(String(20))
__mapper_args__ = {
- 'polymorphic_on':type,
- 'polymorphic_identity':'employee'
+ "polymorphic_on": type,
+ "polymorphic_identity": "employee",
}
+
class Engineer(Employee):
- __mapper_args__ = {'polymorphic_identity': 'engineer'}
+ __mapper_args__ = {
+ "polymorphic_identity": "engineer",
+ }
start_date = Column(DateTime)
+
class Manager(Employee):
- __mapper_args__ = {'polymorphic_identity': 'manager'}
+ __mapper_args__ = {
+ "polymorphic_identity": "manager",
+ }
start_date = Column(DateTime)
Above, the ``start_date`` column declared on both ``Engineer`` and ``Manager``
from sqlalchemy.orm import declared_attr
+
class Employee(Base):
- __tablename__ = 'employee'
+ __tablename__ = "employee"
id = Column(Integer, primary_key=True)
name = Column(String(50))
type = Column(String(20))
__mapper_args__ = {
- 'polymorphic_on':type,
- 'polymorphic_identity':'employee'
+ "polymorphic_on": type,
+ "polymorphic_identity": "employee",
}
+
class Engineer(Employee):
- __mapper_args__ = {'polymorphic_identity': 'engineer'}
+ __mapper_args__ = {
+ "polymorphic_identity": "engineer",
+ }
@declared_attr
def start_date(cls):
"Start date column, if not present already."
- return Employee.__table__.c.get('start_date', Column(DateTime))
+ return Employee.__table__.c.get("start_date", Column(DateTime))
+
class Manager(Employee):
- __mapper_args__ = {'polymorphic_identity': 'manager'}
+ __mapper_args__ = {
+ "polymorphic_identity": "manager",
+ }
@declared_attr
def start_date(cls):
"Start date column, if not present already."
- return Employee.__table__.c.get('start_date', Column(DateTime))
+ return Employee.__table__.c.get("start_date", Column(DateTime))
Above, when ``Manager`` is mapped, the ``start_date`` column is
already present on the ``Employee`` class; by returning the existing
from a reusable mixin class::
class Employee(Base):
- __tablename__ = 'employee'
+ __tablename__ = "employee"
id = Column(Integer, primary_key=True)
name = Column(String(50))
type = Column(String(20))
__mapper_args__ = {
- 'polymorphic_on':type,
- 'polymorphic_identity':'employee'
+ "polymorphic_on": type,
+ "polymorphic_identity": "employee",
}
+
class HasStartDate:
@declared_attr
def start_date(cls):
- return cls.__table__.c.get('start_date', Column(DateTime))
+ return cls.__table__.c.get("start_date", Column(DateTime))
+
class Engineer(HasStartDate, Employee):
- __mapper_args__ = {'polymorphic_identity': 'engineer'}
+ __mapper_args__ = {
+ "polymorphic_identity": "engineer",
+ }
+
class Manager(HasStartDate, Employee):
- __mapper_args__ = {'polymorphic_identity': 'manager'}
+ __mapper_args__ = {
+ "polymorphic_identity": "manager",
+ }
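The mixin pattern above can be checked with a short self-contained sketch (the class bodies follow the documented example; the final assertions are added here for illustration). Whichever subclass is mapped first creates the column; the other receives the same :class:`_schema.Column` object back from ``cls.__table__.c.get()``.

```python
from sqlalchemy import Column, DateTime, Integer, String
from sqlalchemy.orm import declarative_base, declared_attr

Base = declarative_base()


class Employee(Base):
    __tablename__ = "employee"
    id = Column(Integer, primary_key=True)
    name = Column(String(50))
    type = Column(String(20))
    __mapper_args__ = {
        "polymorphic_on": type,
        "polymorphic_identity": "employee",
    }


class HasStartDate:
    @declared_attr
    def start_date(cls):
        # reuse the column if a sibling subclass already created it
        return cls.__table__.c.get("start_date", Column(DateTime))


class Engineer(HasStartDate, Employee):
    __mapper_args__ = {"polymorphic_identity": "engineer"}


class Manager(HasStartDate, Employee):
    __mapper_args__ = {"polymorphic_identity": "manager"}


# single-table inheritance: both subclasses share the parent's table,
# which ends up with exactly one start_date column
assert Manager.__table__ is Employee.__table__
assert "start_date" in Employee.__table__.c
```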
Relationships with Single Table Inheritance
+++++++++++++++++++++++++++++++++++++++++++
relationship::
class Company(Base):
- __tablename__ = 'company'
+ __tablename__ = "company"
id = Column(Integer, primary_key=True)
name = Column(String(50))
employees = relationship("Employee", back_populates="company")
+
class Employee(Base):
- __tablename__ = 'employee'
+ __tablename__ = "employee"
id = Column(Integer, primary_key=True)
name = Column(String(50))
type = Column(String(50))
- company_id = Column(ForeignKey('company.id'))
+ company_id = Column(ForeignKey("company.id"))
company = relationship("Company", back_populates="employees")
__mapper_args__ = {
- 'polymorphic_identity':'employee',
- 'polymorphic_on':type
+ "polymorphic_identity": "employee",
+ "polymorphic_on": type,
}
class Manager(Employee):
manager_data = Column(String(50))
__mapper_args__ = {
- 'polymorphic_identity':'manager'
+ "polymorphic_identity": "manager",
}
+
class Engineer(Employee):
engineer_info = Column(String(50))
__mapper_args__ = {
- 'polymorphic_identity':'engineer'
+ "polymorphic_identity": "engineer",
}
Also, like the case of joined inheritance, we can create relationships
or subclasses::
class Company(Base):
- __tablename__ = 'company'
+ __tablename__ = "company"
id = Column(Integer, primary_key=True)
name = Column(String(50))
managers = relationship("Manager", back_populates="company")
+
class Employee(Base):
- __tablename__ = 'employee'
+ __tablename__ = "employee"
id = Column(Integer, primary_key=True)
name = Column(String(50))
type = Column(String(50))
__mapper_args__ = {
- 'polymorphic_identity':'employee',
- 'polymorphic_on':type
+ "polymorphic_identity": "employee",
+ "polymorphic_on": type,
}
class Manager(Employee):
manager_name = Column(String(30))
- company_id = Column(ForeignKey('company.id'))
+ company_id = Column(ForeignKey("company.id"))
company = relationship("Company", back_populates="managers")
__mapper_args__ = {
- 'polymorphic_identity':'manager',
+ "polymorphic_identity": "manager",
}
class Engineer(Employee):
engineer_info = Column(String(50))
__mapper_args__ = {
- 'polymorphic_identity':'engineer'
+ "polymorphic_identity": "engineer",
}
Above, the ``Manager`` class will have a ``Manager.company`` attribute;
table should not be considered as part of the mapping::
class Employee(Base):
- __tablename__ = 'employee'
+ __tablename__ = "employee"
id = Column(Integer, primary_key=True)
name = Column(String(50))
+
class Manager(Employee):
- __tablename__ = 'manager'
+ __tablename__ = "manager"
id = Column(Integer, primary_key=True)
name = Column(String(50))
manager_data = Column(String(50))
__mapper_args__ = {
- 'concrete': True
+ "concrete": True,
}
+
class Engineer(Employee):
- __tablename__ = 'engineer'
+ __tablename__ = "engineer"
id = Column(Integer, primary_key=True)
name = Column(String(50))
engineer_info = Column(String(50))
__mapper_args__ = {
- 'concrete': True
+ "concrete": True,
}
Two critical points should be noted:
from sqlalchemy.ext.declarative import ConcreteBase
+
class Employee(ConcreteBase, Base):
- __tablename__ = 'employee'
+ __tablename__ = "employee"
id = Column(Integer, primary_key=True)
name = Column(String(50))
__mapper_args__ = {
- 'polymorphic_identity': 'employee',
- 'concrete': True
+ "polymorphic_identity": "employee",
+ "concrete": True,
}
+
class Manager(Employee):
- __tablename__ = 'manager'
+ __tablename__ = "manager"
id = Column(Integer, primary_key=True)
name = Column(String(50))
manager_data = Column(String(40))
__mapper_args__ = {
- 'polymorphic_identity': 'manager',
- 'concrete': True
+ "polymorphic_identity": "manager",
+ "concrete": True,
}
+
class Engineer(Employee):
- __tablename__ = 'engineer'
+ __tablename__ = "engineer"
id = Column(Integer, primary_key=True)
name = Column(String(50))
engineer_info = Column(String(40))
__mapper_args__ = {
- 'polymorphic_identity': 'engineer',
- 'concrete': True
+ "polymorphic_identity": "engineer",
+ "concrete": True,
}
Above, Declarative sets up the polymorphic selectable for the
class Employee(Base):
__abstract__ = True
+
class Manager(Employee):
- __tablename__ = 'manager'
+ __tablename__ = "manager"
id = Column(Integer, primary_key=True)
name = Column(String(50))
manager_data = Column(String(40))
__mapper_args__ = {
- 'polymorphic_identity': 'manager',
+ "polymorphic_identity": "manager",
}
+
class Engineer(Employee):
- __tablename__ = 'engineer'
+ __tablename__ = "engineer"
id = Column(Integer, primary_key=True)
name = Column(String(50))
engineer_info = Column(String(40))
__mapper_args__ = {
- 'polymorphic_identity': 'engineer',
+ "polymorphic_identity": "engineer",
}
Above, we are not actually making use of SQLAlchemy's inheritance mapping
from sqlalchemy.ext.declarative import AbstractConcreteBase
+
class Employee(AbstractConcreteBase, Base):
pass
+
class Manager(Employee):
- __tablename__ = 'manager'
+ __tablename__ = "manager"
id = Column(Integer, primary_key=True)
name = Column(String(50))
manager_data = Column(String(40))
__mapper_args__ = {
- 'polymorphic_identity': 'manager',
- 'concrete': True
+ "polymorphic_identity": "manager",
+ "concrete": True,
}
+
class Engineer(Employee):
- __tablename__ = 'engineer'
+ __tablename__ = "engineer"
id = Column(Integer, primary_key=True)
name = Column(String(50))
engineer_info = Column(String(40))
__mapper_args__ = {
- 'polymorphic_identity': 'engineer',
- 'concrete': True
+ "polymorphic_identity": "engineer",
+ "concrete": True,
}
The :class:`.AbstractConcreteBase` helper class has a more complex internal
metadata_obj = Base.metadata
employees_table = Table(
- 'employee', metadata_obj,
- Column('id', Integer, primary_key=True),
- Column('name', String(50)),
+ "employee",
+ metadata_obj,
+ Column("id", Integer, primary_key=True),
+ Column("name", String(50)),
)
managers_table = Table(
- 'manager', metadata_obj,
- Column('id', Integer, primary_key=True),
- Column('name', String(50)),
- Column('manager_data', String(50)),
+ "manager",
+ metadata_obj,
+ Column("id", Integer, primary_key=True),
+ Column("name", String(50)),
+ Column("manager_data", String(50)),
)
engineers_table = Table(
- 'engineer', metadata_obj,
- Column('id', Integer, primary_key=True),
- Column('name', String(50)),
- Column('engineer_info', String(50)),
+ "engineer",
+ metadata_obj,
+ Column("id", Integer, primary_key=True),
+ Column("name", String(50)),
+ Column("engineer_info", String(50)),
)
Next, the UNION is produced using :func:`.polymorphic_union`::
from sqlalchemy.orm import polymorphic_union
- pjoin = polymorphic_union({
- 'employee': employees_table,
- 'manager': managers_table,
- 'engineer': engineers_table
- }, 'type', 'pjoin')
+ pjoin = polymorphic_union(
+ {
+ "employee": employees_table,
+ "manager": managers_table,
+ "engineer": engineers_table,
+ },
+ "type",
+ "pjoin",
+ )
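To see concretely what :func:`.polymorphic_union` produces, the call can be run standalone (a plain :class:`_schema.MetaData` is used here instead of ``Base.metadata``): the result is a named subquery over a UNION ALL in which each table is padded with NULL for the columns it lacks, plus the ``type`` discriminator column.

```python
from sqlalchemy import Column, Integer, MetaData, String, Table
from sqlalchemy.orm import polymorphic_union

metadata_obj = MetaData()

employees_table = Table(
    "employee",
    metadata_obj,
    Column("id", Integer, primary_key=True),
    Column("name", String(50)),
)
managers_table = Table(
    "manager",
    metadata_obj,
    Column("id", Integer, primary_key=True),
    Column("name", String(50)),
    Column("manager_data", String(50)),
)
engineers_table = Table(
    "engineer",
    metadata_obj,
    Column("id", Integer, primary_key=True),
    Column("name", String(50)),
    Column("engineer_info", String(50)),
)

pjoin = polymorphic_union(
    {
        "employee": employees_table,
        "manager": managers_table,
        "engineer": engineers_table,
    },
    "type",
    "pjoin",
)

# the union exposes the full set of columns plus the discriminator
print(sorted(pjoin.c.keys()))
```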
With the above :class:`_schema.Table` objects, the mappings can be produced using "semi-classical" style,
where we use Declarative in conjunction with the ``__table__`` argument;
class Employee(Base):
- __table__ = employee_table
+ __table__ = employees_table
__mapper_args__ = {
- 'polymorphic_on': pjoin.c.type,
- 'with_polymorphic': ('*', pjoin),
- 'polymorphic_identity': 'employee'
+ "polymorphic_on": pjoin.c.type,
+ "with_polymorphic": ("*", pjoin),
+ "polymorphic_identity": "employee",
}
+
class Engineer(Employee):
- __table__ = engineer_table
+ __table__ = engineers_table
__mapper_args__ = {
- 'polymorphic_identity': 'engineer',
- 'concrete': True}
+ "polymorphic_identity": "engineer",
+ "concrete": True,
+ }
+
class Manager(Employee):
- __table__ = manager_table
+ __table__ = managers_table
__mapper_args__ = {
- 'polymorphic_identity': 'manager',
- 'concrete': True}
+ "polymorphic_identity": "manager",
+ "concrete": True,
+ }
Alternatively, the same :class:`_schema.Table` objects can be used in
fully "classical" style, without using Declarative at all.
for k in kw:
setattr(self, k, kw[k])
+
class Manager(Employee):
pass
+
class Engineer(Employee):
pass
+
employee_mapper = mapper_registry.map_imperatively(
Employee,
pjoin,
- with_polymorphic=('*', pjoin),
+ with_polymorphic=("*", pjoin),
polymorphic_on=pjoin.c.type,
)
manager_mapper = mapper_registry.map_imperatively(
Manager,
managers_table,
inherits=employee_mapper,
concrete=True,
- polymorphic_identity='manager',
+ polymorphic_identity="manager",
)
engineer_mapper = mapper_registry.map_imperatively(
Engineer,
engineers_table,
inherits=employee_mapper,
concrete=True,
- polymorphic_identity='engineer',
+ polymorphic_identity="engineer",
)
-
-
The "abstract" example can also be mapped using "semi-classical" or "classical"
style. The difference is that instead of applying the "polymorphic union"
to the :paramref:`.mapper.with_polymorphic` parameter, we apply it directly
from sqlalchemy.orm import polymorphic_union
- pjoin = polymorphic_union({
- 'manager': managers_table,
- 'engineer': engineers_table
- }, 'type', 'pjoin')
+ pjoin = polymorphic_union(
+ {
+ "manager": managers_table,
+ "engineer": engineers_table,
+ },
+ "type",
+ "pjoin",
+ )
+
class Employee(Base):
__table__ = pjoin
__mapper_args__ = {
- 'polymorphic_on': pjoin.c.type,
- 'with_polymorphic': '*',
- 'polymorphic_identity': 'employee'
+ "polymorphic_on": pjoin.c.type,
+ "with_polymorphic": "*",
+ "polymorphic_identity": "employee",
}
+
class Engineer(Employee):
- __table__ = engineer_table
+ __table__ = engineers_table
__mapper_args__ = {
- 'polymorphic_identity': 'engineer',
- 'concrete': True}
+ "polymorphic_identity": "engineer",
+ "concrete": True,
+ }
+
class Manager(Employee):
- __table__ = manager_table
+ __table__ = managers_table
__mapper_args__ = {
- 'polymorphic_identity': 'manager',
- 'concrete': True}
+ "polymorphic_identity": "manager",
+ "concrete": True,
+ }
+
Above, we use :func:`.polymorphic_union` in the same manner as before, except
that we omit the ``employee`` table.
class Company(Base):
- __tablename__ = 'company'
+ __tablename__ = "company"
id = Column(Integer, primary_key=True)
name = Column(String(50))
employees = relationship("Employee")
class Employee(ConcreteBase, Base):
- __tablename__ = 'employee'
+ __tablename__ = "employee"
id = Column(Integer, primary_key=True)
name = Column(String(50))
- company_id = Column(ForeignKey('company.id'))
+ company_id = Column(ForeignKey("company.id"))
__mapper_args__ = {
- 'polymorphic_identity': 'employee',
- 'concrete': True
+ "polymorphic_identity": "employee",
+ "concrete": True,
}
class Manager(Employee):
- __tablename__ = 'manager'
+ __tablename__ = "manager"
id = Column(Integer, primary_key=True)
name = Column(String(50))
manager_data = Column(String(40))
- company_id = Column(ForeignKey('company.id'))
+ company_id = Column(ForeignKey("company.id"))
__mapper_args__ = {
- 'polymorphic_identity': 'manager',
- 'concrete': True
+ "polymorphic_identity": "manager",
+ "concrete": True,
}
class Engineer(Employee):
- __tablename__ = 'engineer'
+ __tablename__ = "engineer"
id = Column(Integer, primary_key=True)
name = Column(String(50))
engineer_info = Column(String(40))
- company_id = Column(ForeignKey('company.id'))
+ company_id = Column(ForeignKey("company.id"))
__mapper_args__ = {
- 'polymorphic_identity': 'engineer',
- 'concrete': True
+ "polymorphic_identity": "engineer",
+ "concrete": True,
}
The next complexity with concrete inheritance and relationships involves
class Company(Base):
- __tablename__ = 'company'
+ __tablename__ = "company"
id = Column(Integer, primary_key=True)
name = Column(String(50))
employees = relationship("Employee", back_populates="company")
class Employee(ConcreteBase, Base):
- __tablename__ = 'employee'
+ __tablename__ = "employee"
id = Column(Integer, primary_key=True)
name = Column(String(50))
- company_id = Column(ForeignKey('company.id'))
+ company_id = Column(ForeignKey("company.id"))
company = relationship("Company", back_populates="employees")
__mapper_args__ = {
- 'polymorphic_identity': 'employee',
- 'concrete': True
+ "polymorphic_identity": "employee",
+ "concrete": True,
}
class Manager(Employee):
- __tablename__ = 'manager'
+ __tablename__ = "manager"
id = Column(Integer, primary_key=True)
name = Column(String(50))
manager_data = Column(String(40))
- company_id = Column(ForeignKey('company.id'))
+ company_id = Column(ForeignKey("company.id"))
company = relationship("Company", back_populates="employees")
__mapper_args__ = {
- 'polymorphic_identity': 'manager',
- 'concrete': True
+ "polymorphic_identity": "manager",
+ "concrete": True,
}
class Engineer(Employee):
- __tablename__ = 'engineer'
+ __tablename__ = "engineer"
id = Column(Integer, primary_key=True)
name = Column(String(50))
engineer_info = Column(String(40))
- company_id = Column(ForeignKey('company.id'))
+ company_id = Column(ForeignKey("company.id"))
company = relationship("Company", back_populates="employees")
__mapper_args__ = {
- 'polymorphic_identity': 'engineer',
- 'concrete': True
+ "polymorphic_identity": "engineer",
+ "concrete": True,
}
The above limitation is related to the current implementation, including