--- /dev/null
+.. change::
+ :tags: usecase, orm
+ :tickets: 12854
+
+ Improvements to the use case of using :ref:`Declarative Dataclass Mapping
+ <orm_declarative_native_dataclasses>` with intermediary classes that are
+ unmapped. As before, classes may subclass :class:`_orm.MappedAsDataclass`
+ alone, without a declarative base, to act as mixins, or combine a
+ declarative base with ``__abstract__ = True`` to define an abstract base.
+ The improved behavior now scans ORM attributes such as
+ :func:`_orm.mapped_column` on these classes to create correct
+ ``dataclasses.field()`` constructs based on their arguments, allowing for
+ more natural ordering of fields without dataclass errors being raised.
+ Additionally, added a new :func:`_orm.unmapped_dataclass` decorator
+ function, which may be used to create unmapped mixins in a mapped hierarchy
+ that is using the :func:`_orm.mapped_as_dataclass` decorator to create mapped
+ dataclasses.
+
+ .. seealso::
+
+ :ref:`orm_declarative_dc_mixins`
When transforming <cls> to a dataclass, attribute(s) originate from superclass <cls> which is not a dataclass.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-This warning occurs when using the SQLAlchemy ORM Mapped Dataclasses feature
+This error occurs when using the SQLAlchemy ORM Mapped Dataclasses feature
described at :ref:`orm_declarative_native_dataclasses` in conjunction with
any mixin class or abstract base that is not itself declared as a
dataclass, such as in the example below::
from __future__ import annotations
- import inspect
from typing import Optional
from uuid import uuid4
email: Mapped[str] = mapped_column()
Above, since ``Mixin`` does not itself extend from :class:`_orm.MappedAsDataclass`,
-the following warning is generated:
+the following error is generated:
.. sourcecode:: none
- SADeprecationWarning: When transforming <class '__main__.User'> to a
- dataclass, attribute(s) "create_user", "update_user" originates from
- superclass <class
- '__main__.Mixin'>, which is not a dataclass. This usage is deprecated and
- will raise an error in SQLAlchemy 2.1. When declaring SQLAlchemy
- Declarative Dataclasses, ensure that all mixin classes and other
- superclasses which include attributes are also a subclass of
- MappedAsDataclass.
+ sqlalchemy.exc.InvalidRequestError: When transforming <class
+ '__main__.User'> to a dataclass, attribute(s) 'create_user', 'update_user'
+ originates from superclass <class '__main__.Mixin'>, which is not a
+ dataclass. When declaring SQLAlchemy Declarative Dataclasses, ensure that
+ all mixin classes and other superclasses which include attributes are also
+ a subclass of MappedAsDataclass or make use of the @unmapped_dataclass
+ decorator.
The fix is to add :class:`_orm.MappedAsDataclass` to the signature of
``Mixin`` as well::
create_user: Mapped[int] = mapped_column()
update_user: Mapped[Optional[int]] = mapped_column(default=None, init=False)
+When using decorators such as :func:`_orm.mapped_as_dataclass` to map, the
+:func:`_orm.unmapped_dataclass` decorator may be used to declare mixins::
+
+ from __future__ import annotations
+
+ from typing import Optional
+ from uuid import uuid4
+
+ from sqlalchemy import String
+ from sqlalchemy.orm import Mapped
+ from sqlalchemy.orm import mapped_as_dataclass
+ from sqlalchemy.orm import mapped_column
+ from sqlalchemy.orm import registry
+ from sqlalchemy.orm import unmapped_dataclass
+
+
+ @unmapped_dataclass
+ class Mixin:
+ create_user: Mapped[int] = mapped_column()
+ update_user: Mapped[Optional[int]] = mapped_column(default=None, init=False)
+
+
+ reg = registry()
+
+
+ @mapped_as_dataclass(reg)
+ class User(Mixin):
+ __tablename__ = "sys_user"
+
+ uid: Mapped[str] = mapped_column(
+ String(50), init=False, default_factory=uuid4, primary_key=True
+ )
+ username: Mapped[str] = mapped_column()
+ email: Mapped[str] = mapped_column()
+
Python's :pep:`681` specification does not accommodate attributes declared
on superclasses of dataclasses that are not themselves dataclasses; per the
behavior of Python dataclasses, such fields are ignored, as in the following
nor will it attempt to interpret ``update_user`` as a dataclass attribute.
This is because ``Mixin`` is not a dataclass.
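This stdlib behavior can be demonstrated without SQLAlchemy at all; the
following sketch, using only the standard library ``dataclasses`` module,
shows that annotations on a non-dataclass superclass never become dataclass
fields:

```python
import dataclasses


class Mixin:
    # plain class: this annotation is invisible to the @dataclass
    # decorator applied to the subclass below
    create_user: int


@dataclasses.dataclass
class User(Mixin):
    username: str


# only the subclass's own annotation becomes a field / __init__ parameter
print([f.name for f in dataclasses.fields(User)])  # ['username']
```

This mirrors the situation type checkers assume under :pep:`681`, which is why
attributes from non-dataclass mixins are ambiguous.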
-SQLAlchemy's dataclasses feature within the 2.0 series does not honor this
-behavior correctly; instead, attributes on non-dataclass mixins and
-superclasses are treated as part of the final dataclass configuration. However
-type checkers such as Pyright and Mypy will not consider these fields as
-part of the dataclass constructor as they are to be ignored per :pep:`681`.
-Since their presence is ambiguous otherwise, SQLAlchemy 2.1 will require that
+Since type checkers such as Pyright and Mypy will not consider these fields
+part of the dataclass constructor, as they are to be ignored per :pep:`681`,
+their presence becomes ambiguous. Therefore SQLAlchemy requires that
mixin classes which have SQLAlchemy mapped attributes within a dataclass
-hierarchy have to themselves be dataclasses.
+hierarchy themselves be dataclasses, using SQLAlchemy's unmapped dataclass
+feature.
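The converse stdlib behavior is what making the mixin a dataclass achieves:
fields declared on a dataclass superclass are merged into the subclass
constructor. A stdlib-only sketch, with no SQLAlchemy involved:

```python
import dataclasses


@dataclasses.dataclass
class Mixin:
    create_user: int = 0


@dataclasses.dataclass
class User(Mixin):
    username: str = ""


# fields from the dataclass mixin participate in User's constructor,
# base-class fields first
u = User(create_user=7, username="alice")
print([f.name for f in dataclasses.fields(User)])  # ['create_user', 'username']
```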
.. _error_dcte:
Dataclass conversion may be added to any Declarative class either by adding the
:class:`_orm.MappedAsDataclass` mixin to a :class:`_orm.DeclarativeBase` class
hierarchy, or for decorator mapping by using the
-:meth:`_orm.registry.mapped_as_dataclass` class decorator.
+:meth:`_orm.registry.mapped_as_dataclass` class decorator or its
+functional variant :func:`_orm.mapped_as_dataclass`.
The :class:`_orm.MappedAsDataclass` mixin may be applied either
to the Declarative ``Base`` class or any superclass, as in the example
database-generated, is not part of the constructor at all::
from sqlalchemy.orm import Mapped
+ from sqlalchemy.orm import mapped_as_dataclass
from sqlalchemy.orm import mapped_column
from sqlalchemy.orm import registry
reg = registry()
- @reg.mapped_as_dataclass
+ @mapped_as_dataclass(reg)
class User:
__tablename__ = "user_account"
from sqlalchemy import func
from sqlalchemy.orm import Mapped
+ from sqlalchemy.orm import mapped_as_dataclass
from sqlalchemy.orm import mapped_column
from sqlalchemy.orm import registry
reg = registry()
- @reg.mapped_as_dataclass
+ @mapped_as_dataclass(reg)
class User:
__tablename__ = "user_account"
from typing import Annotated
from sqlalchemy.orm import Mapped
+ from sqlalchemy.orm import mapped_as_dataclass
from sqlalchemy.orm import mapped_column
from sqlalchemy.orm import registry
reg = registry()
- @reg.mapped_as_dataclass
+ @mapped_as_dataclass(reg)
class User:
__tablename__ = "user_account"
id: Mapped[intpk]
from typing import Annotated
from sqlalchemy.orm import Mapped
+ from sqlalchemy.orm import mapped_as_dataclass
from sqlalchemy.orm import mapped_column
from sqlalchemy.orm import registry
reg = registry()
- @reg.mapped_as_dataclass
+ @mapped_as_dataclass(reg)
class User:
__tablename__ = "user_account"
Using mixins and abstract superclasses
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-Any mixins or base classes that are used in a :class:`_orm.MappedAsDataclass`
-mapped class which include :class:`_orm.Mapped` attributes must themselves be
-part of a :class:`_orm.MappedAsDataclass`
-hierarchy, such as in the example below using a mixin::
+Mixins and abstract superclasses are supported with the Declarative Dataclass
+Mapping by defining classes that are part of the :class:`_orm.MappedAsDataclass`
+hierarchy, either without including a declarative base or by setting
+``__abstract__ = True``. The example below illustrates a class ``Mixin`` that is
+not itself mapped, but serves as a base for a mapped class::
+
+ from sqlalchemy.orm import DeclarativeBase
+ from sqlalchemy.orm import MappedAsDataclass
class Mixin(MappedAsDataclass):
create_user: Mapped[int] = mapped_column()
- update_user: Mapped[Optional[int]] = mapped_column(default=None, init=False)
+ update_user: Mapped[Optional[int]] = mapped_column(default=None)
class Base(DeclarativeBase, MappedAsDataclass):
username: Mapped[str] = mapped_column()
email: Mapped[str] = mapped_column()
-Python type checkers which support :pep:`681` will otherwise not consider
-attributes from non-dataclass mixins to be part of the dataclass.
+.. tip::
-.. deprecated:: 2.0.8 Using mixins and abstract bases within
- :class:`_orm.MappedAsDataclass` or
- :meth:`_orm.registry.mapped_as_dataclass` hierarchies which are not
- themselves dataclasses is deprecated, as these fields are not supported
- by :pep:`681` as belonging to the dataclass. A warning is emitted for this
- case which will later be an error.
+ When using :class:`_orm.MappedAsDataclass` without a declarative base in
+ the hierarchy, the target class is still turned into a real Python dataclass,
+ so that it may properly serve as a base for a mapped dataclass. Using
+ :class:`_orm.MappedAsDataclass` (or the :func:`_orm.unmapped_dataclass` decorator
+ described later in this section) is required in order for the class to be correctly
+ recognized by type checkers as a SQLAlchemy-enabled dataclass. Declarative
+ itself will reject mixins / abstract classes that are not themselves
+ Declarative Dataclasses (i.e. they may be neither plain classes nor plain
+ ``@dataclass`` classes).
- .. seealso::
+ .. seealso::
- :ref:`error_dcmx` - background on rationale
+ :ref:`error_dcmx` - further background
+Another example, where an abstract base combines :class:`_orm.MappedAsDataclass`
+with ``__abstract__ = True``::
+ from sqlalchemy.orm import DeclarativeBase
+ from sqlalchemy.orm import MappedAsDataclass
+
+
+ class Base(DeclarativeBase, MappedAsDataclass):
+ pass
+
+
+ class AbstractUser(Base):
+ __abstract__ = True
+
+ create_user: Mapped[int] = mapped_column()
+ update_user: Mapped[Optional[int]] = mapped_column(default=None)
+
+
+ class User(AbstractUser):
+ __tablename__ = "sys_user"
+
+ uid: Mapped[str] = mapped_column(
+ String(50), init=False, default_factory=uuid4, primary_key=True
+ )
+ username: Mapped[str] = mapped_column()
+ email: Mapped[str] = mapped_column()
+
+Finally, for a hierarchy that's based on use of the :func:`_orm.mapped_as_dataclass`
+decorator, mixins may be defined using the :func:`_orm.unmapped_dataclass` decorator::
+
+ from sqlalchemy.orm import registry
+ from sqlalchemy.orm import mapped_as_dataclass
+ from sqlalchemy.orm import unmapped_dataclass
+
+
+ @unmapped_dataclass()
+ class Mixin:
+ create_user: Mapped[int] = mapped_column()
+ update_user: Mapped[Optional[int]] = mapped_column(default=None, init=False)
+
+
+ reg = registry()
+
+
+ @mapped_as_dataclass(reg)
+ class User(Mixin):
+ __tablename__ = "sys_user"
+
+ uid: Mapped[str] = mapped_column(
+ String(50), init=False, default_factory=uuid4, primary_key=True
+ )
+ username: Mapped[str] = mapped_column()
+ email: Mapped[str] = mapped_column()
+
+.. versionadded:: 2.1 Added :func:`_orm.unmapped_dataclass`
.. _orm_declarative_dc_relationships:
from sqlalchemy import ForeignKey
from sqlalchemy.orm import Mapped
+ from sqlalchemy.orm import mapped_as_dataclass
from sqlalchemy.orm import mapped_column
from sqlalchemy.orm import registry
from sqlalchemy.orm import relationship
reg = registry()
- @reg.mapped_as_dataclass
+ @mapped_as_dataclass(reg)
class Parent:
__tablename__ = "parent"
id: Mapped[int] = mapped_column(primary_key=True)
)
- @reg.mapped_as_dataclass
+ @mapped_as_dataclass(reg)
class Child:
__tablename__ = "child"
id: Mapped[int] = mapped_column(primary_key=True)
from sqlalchemy.orm import Mapped
+ from sqlalchemy.orm import mapped_as_dataclass
from sqlalchemy.orm import mapped_column
from sqlalchemy.orm import registry
reg = registry()
- @reg.mapped_as_dataclass
+ @mapped_as_dataclass(reg)
class Data:
__tablename__ = "data"
from typing import Optional
from sqlalchemy.orm import Mapped
+ from sqlalchemy.orm import mapped_as_dataclass
from sqlalchemy.orm import mapped_column
from sqlalchemy.orm import registry
reg = registry()
- @reg.mapped_as_dataclass
+ @mapped_as_dataclass(reg)
class User:
__tablename__ = "user_account"
details which **explicitly resolve** these incompatibilities.
SQLAlchemy's :class:`_orm.MappedAsDataclass` class,
-and :meth:`_orm.registry.mapped_as_dataclass` method call directly into
+:meth:`_orm.registry.mapped_as_dataclass` method, and
+:func:`_orm.mapped_as_dataclass` function call directly into
the Python standard library ``dataclasses.dataclass`` class decorator, after
the declarative mapping process has been applied to the class. This
function call may be swapped out for alternative dataclass providers,
.. autofunction:: synonym_for
+.. autofunction:: unmapped_dataclass
+
from ..orm import exc as orm_exc
from ..orm import interfaces
from ..orm import relationship
-from ..orm.decl_base import _DeferredMapperConfig
+from ..orm.decl_base import _DeferredDeclarativeConfig
from ..orm.mapper import _CONFIGURE_MUTEX
from ..schema import ForeignKeyConstraint
from ..sql import and_
with _CONFIGURE_MUTEX:
table_to_map_config: Union[
- Dict[Optional[Table], _DeferredMapperConfig],
- Dict[Table, _DeferredMapperConfig],
+ Dict[Optional[Table], _DeferredDeclarativeConfig],
+ Dict[Table, _DeferredDeclarativeConfig],
] = {
cast("Table", m.local_table): m
- for m in _DeferredMapperConfig.classes_for_base(
+ for m in _DeferredDeclarativeConfig.classes_for_base(
cls, sort=False
)
}
(automap_base,),
clsdict,
)
- map_config = _DeferredMapperConfig.config_for_cls(
+ map_config = _DeferredDeclarativeConfig.config_for_cls(
mapped_cls
)
assert map_config.cls.__name__ == newname
generate_relationship,
)
- for map_config in _DeferredMapperConfig.classes_for_base(
+ for map_config in _DeferredDeclarativeConfig.classes_for_base(
automap_base
):
map_config.map()
def _relationships_for_fks(
automap_base: Type[Any],
- map_config: _DeferredMapperConfig,
+ map_config: _DeferredDeclarativeConfig,
table_to_map_config: Union[
- Dict[Optional[Table], _DeferredMapperConfig],
- Dict[Table, _DeferredMapperConfig],
+ Dict[Optional[Table], _DeferredDeclarativeConfig],
+ Dict[Table, _DeferredDeclarativeConfig],
],
collection_class: type,
name_for_scalar_relationship: NameForScalarRelationshipType,
m2m_const: List[ForeignKeyConstraint],
table: Table,
table_to_map_config: Union[
- Dict[Optional[Table], _DeferredMapperConfig],
- Dict[Table, _DeferredMapperConfig],
+ Dict[Optional[Table], _DeferredDeclarativeConfig],
+ Dict[Table, _DeferredDeclarativeConfig],
],
collection_class: type,
name_for_scalar_relationship: NameForCollectionRelationshipType,
from ...orm import relationships
from ...orm.base import _mapper_or_none
from ...orm.clsregistry import _resolver
-from ...orm.decl_base import _DeferredMapperConfig
+from ...orm.decl_base import _DeferredDeclarativeConfig
from ...orm.util import polymorphic_union
from ...schema import Table
from ...util import OrderedDict
if getattr(cls, "__mapper__", None):
return
- to_map = _DeferredMapperConfig.config_for_cls(cls)
+ to_map = _DeferredDeclarativeConfig.config_for_cls(cls)
# can't rely on 'self_and_descendants' here
# since technically an immediate subclass
"""
- to_map = _DeferredMapperConfig.classes_for_base(cls)
+ to_map = _DeferredDeclarativeConfig.classes_for_base(cls)
metadata_to_table = collections.defaultdict(set)
from .decl_api import MappedAsDataclass as MappedAsDataclass
from .decl_api import registry as registry
from .decl_api import synonym_for as synonym_for
+from .decl_api import unmapped_dataclass as unmapped_dataclass
from .decl_base import MappedClassProtocol as MappedClassProtocol
from .descriptor_props import Composite as Composite
from .descriptor_props import CompositeProperty as CompositeProperty
from .base import Mapped
from .base import ORMDescriptor
from .decl_base import _add_attribute
-from .decl_base import _as_declarative
-from .decl_base import _ClassScanMapperConfig
from .decl_base import _declarative_constructor
-from .decl_base import _DeferredMapperConfig
+from .decl_base import _DeclarativeMapperConfig
+from .decl_base import _DeferredDeclarativeConfig
from .decl_base import _del_attribute
-from .decl_base import _mapper
+from .decl_base import _ORMClassConfigurator
from .descriptor_props import Composite
from .descriptor_props import Synonym
from .descriptor_props import Synonym as _orm_synonym
cls._sa_registry = reg
if not cls.__dict__.get("__abstract__", False):
- _as_declarative(reg, cls, dict_)
+ _ORMClassConfigurator._as_declarative(reg, cls, dict_)
type.__init__(cls, classname, bases, dict_)
cls.__init__ = cls.registry.constructor
+def _generate_dc_transforms(
+ cls_: Type[_O],
+ init: Union[_NoArg, bool] = _NoArg.NO_ARG,
+ repr: Union[_NoArg, bool] = _NoArg.NO_ARG, # noqa: A002
+ eq: Union[_NoArg, bool] = _NoArg.NO_ARG,
+ order: Union[_NoArg, bool] = _NoArg.NO_ARG,
+ unsafe_hash: Union[_NoArg, bool] = _NoArg.NO_ARG,
+ match_args: Union[_NoArg, bool] = _NoArg.NO_ARG,
+ kw_only: Union[_NoArg, bool] = _NoArg.NO_ARG,
+ dataclass_callable: Union[
+ _NoArg, Callable[..., Type[Any]]
+ ] = _NoArg.NO_ARG,
+) -> None:
+ apply_dc_transforms: _DataclassArguments = {
+ "init": init,
+ "repr": repr,
+ "eq": eq,
+ "order": order,
+ "unsafe_hash": unsafe_hash,
+ "match_args": match_args,
+ "kw_only": kw_only,
+ "dataclass_callable": dataclass_callable,
+ }
+
+ if hasattr(cls_, "_sa_apply_dc_transforms"):
+ current = cls_._sa_apply_dc_transforms # type: ignore[attr-defined]
+
+ _DeclarativeMapperConfig._assert_dc_arguments(current)
+
+ cls_._sa_apply_dc_transforms = { # type: ignore # noqa: E501
+ k: current.get(k, _NoArg.NO_ARG) if v is _NoArg.NO_ARG else v
+ for k, v in apply_dc_transforms.items()
+ }
+ else:
+ setattr(cls_, "_sa_apply_dc_transforms", apply_dc_transforms)
+
+
class MappedAsDataclass(metaclass=DCTransformDeclarative):
"""Mixin class to indicate when mapping this class, also convert it to be
a dataclass.
.. seealso::
:ref:`orm_declarative_native_dataclasses` - complete background
- on SQLAlchemy native dataclass mapping
+ on SQLAlchemy native dataclass mapping with
+ :class:`_orm.MappedAsDataclass`.
+
+ :ref:`orm_declarative_dc_mixins` - examples specific to using
+ :class:`_orm.MappedAsDataclass` to create mixins
+
+ :func:`_orm.mapped_as_dataclass` / :func:`_orm.unmapped_dataclass` -
+ decorator versions with equivalent functionality
.. versionadded:: 2.0
] = _NoArg.NO_ARG,
**kw: Any,
) -> None:
- apply_dc_transforms: _DataclassArguments = {
- "init": init,
- "repr": repr,
- "eq": eq,
- "order": order,
- "unsafe_hash": unsafe_hash,
- "match_args": match_args,
- "kw_only": kw_only,
- "dataclass_callable": dataclass_callable,
- }
- current_transforms: _DataclassArguments
-
- if hasattr(cls, "_sa_apply_dc_transforms"):
- current = cls._sa_apply_dc_transforms
-
- _ClassScanMapperConfig._assert_dc_arguments(current)
-
- cls._sa_apply_dc_transforms = current_transforms = { # type: ignore # noqa: E501
- k: current.get(k, _NoArg.NO_ARG) if v is _NoArg.NO_ARG else v
- for k, v in apply_dc_transforms.items()
- }
- else:
- cls._sa_apply_dc_transforms = current_transforms = (
- apply_dc_transforms
- )
-
+ _generate_dc_transforms(
+ init=init,
+ repr=repr,
+ eq=eq,
+ order=order,
+ unsafe_hash=unsafe_hash,
+ match_args=match_args,
+ kw_only=kw_only,
+ dataclass_callable=dataclass_callable,
+ cls_=cls,
+ )
super().__init_subclass__(**kw)
if not _is_mapped_class(cls):
- new_anno = (
- _ClassScanMapperConfig._update_annotations_for_non_mapped_class
- )(cls)
- _ClassScanMapperConfig._apply_dataclasses_to_any_class(
- current_transforms, cls, new_anno
- )
+ # turn unmapped classes into "good enough" dataclasses to serve
+ # as a base or a mixin
+ _ORMClassConfigurator._as_unmapped_dataclass(cls, cls.__dict__)
class DeclarativeBase(
_check_not_declarative(cls, DeclarativeBase)
_setup_declarative_base(cls)
else:
- _as_declarative(cls._sa_registry, cls, cls.__dict__)
+ _ORMClassConfigurator._as_declarative(
+ cls._sa_registry, cls, cls.__dict__
+ )
super().__init_subclass__(**kw)
_check_not_declarative(cls, DeclarativeBaseNoMeta)
_setup_declarative_base(cls)
else:
- _as_declarative(cls._sa_registry, cls, cls.__dict__)
+ _ORMClassConfigurator._as_declarative(
+ cls._sa_registry, cls, cls.__dict__
+ )
super().__init_subclass__(**kw)
"""
- def decorate(cls: Type[_O]) -> Type[_O]:
- apply_dc_transforms: _DataclassArguments = {
- "init": init,
- "repr": repr,
- "eq": eq,
- "order": order,
- "unsafe_hash": unsafe_hash,
- "match_args": match_args,
- "kw_only": kw_only,
- "dataclass_callable": dataclass_callable,
- }
-
- setattr(cls, "_sa_apply_dc_transforms", apply_dc_transforms)
- _as_declarative(self, cls, cls.__dict__)
- return cls
+ decorate = mapped_as_dataclass(
+ self,
+ init=init,
+ repr=repr,
+ eq=eq,
+ order=order,
+ unsafe_hash=unsafe_hash,
+ match_args=match_args,
+ kw_only=kw_only,
+ dataclass_callable=dataclass_callable,
+ )
if __cls:
return decorate(__cls)
:meth:`_orm.registry.mapped_as_dataclass`
"""
- _as_declarative(self, cls, cls.__dict__)
+ _ORMClassConfigurator._as_declarative(self, cls, cls.__dict__)
return cls
def as_declarative_base(self, **kw: Any) -> Callable[[Type[_T]], Type[_T]]:
:meth:`_orm.registry.map_imperatively`
"""
- _as_declarative(self, cls, cls.__dict__)
+ _ORMClassConfigurator._as_declarative(self, cls, cls.__dict__)
return cls.__mapper__ # type: ignore
def map_imperatively(
:ref:`orm_declarative_mapping`
"""
- return _mapper(self, class_, local_table, kw)
+ return _ORMClassConfigurator._mapper(self, class_, local_table, kw)
RegistryType = registry
.. versionadded:: 2.0.44
"""
- return registry.mapped_as_dataclass(
- init=init,
- repr=repr,
- eq=eq,
- order=order,
- unsafe_hash=unsafe_hash,
- match_args=match_args,
- kw_only=kw_only,
- dataclass_callable=dataclass_callable,
- )
+
+ def decorate(cls: Type[_O]) -> Type[_O]:
+ _generate_dc_transforms(
+ init=init,
+ repr=repr,
+ eq=eq,
+ order=order,
+ unsafe_hash=unsafe_hash,
+ match_args=match_args,
+ kw_only=kw_only,
+ dataclass_callable=dataclass_callable,
+ cls_=cls,
+ )
+ _ORMClassConfigurator._as_declarative(registry, cls, cls.__dict__)
+ return cls
+
+ return decorate
@inspection._inspects(
def _inspect_decl_meta(cls: Type[Any]) -> Optional[Mapper[Any]]:
mp: Optional[Mapper[Any]] = _inspect_mapped_class(cls)
if mp is None:
- if _DeferredMapperConfig.has_cls(cls):
- _DeferredMapperConfig.raise_unmapped_for_cls(cls)
+ if _DeferredDeclarativeConfig.has_cls(cls):
+ _DeferredDeclarativeConfig.raise_unmapped_for_cls(cls)
return mp
+
+
+@compat_typing.dataclass_transform(
+ field_specifiers=(
+ MappedColumn,
+ RelationshipProperty,
+ Composite,
+ Synonym,
+ mapped_column,
+ relationship,
+ composite,
+ synonym,
+ deferred,
+ ),
+)
+@overload
+def unmapped_dataclass(__cls: Type[_O], /) -> Type[_O]: ...
+
+
+@overload
+def unmapped_dataclass(
+ __cls: Literal[None] = ...,
+ /,
+ *,
+ init: Union[_NoArg, bool] = ...,
+ repr: Union[_NoArg, bool] = ..., # noqa: A002
+ eq: Union[_NoArg, bool] = ...,
+ order: Union[_NoArg, bool] = ...,
+ unsafe_hash: Union[_NoArg, bool] = ...,
+ match_args: Union[_NoArg, bool] = ...,
+ kw_only: Union[_NoArg, bool] = ...,
+ dataclass_callable: Union[_NoArg, Callable[..., Type[Any]]] = ...,
+) -> Callable[[Type[_O]], Type[_O]]: ...
+
+
+def unmapped_dataclass(
+ __cls: Optional[Type[_O]] = None,
+ /,
+ *,
+ init: Union[_NoArg, bool] = _NoArg.NO_ARG,
+ repr: Union[_NoArg, bool] = _NoArg.NO_ARG, # noqa: A002
+ eq: Union[_NoArg, bool] = _NoArg.NO_ARG,
+ order: Union[_NoArg, bool] = _NoArg.NO_ARG,
+ unsafe_hash: Union[_NoArg, bool] = _NoArg.NO_ARG,
+ match_args: Union[_NoArg, bool] = _NoArg.NO_ARG,
+ kw_only: Union[_NoArg, bool] = _NoArg.NO_ARG,
+ dataclass_callable: Union[
+ _NoArg, Callable[..., Type[Any]]
+ ] = _NoArg.NO_ARG,
+) -> Union[Type[_O], Callable[[Type[_O]], Type[_O]]]:
+ """Decorator which allows the creation of dataclass-compatible mixins
+ within mapped class hierarchies based on the
+ :func:`_orm.mapped_as_dataclass` decorator.
+
+ Parameters are the same as those of :func:`_orm.mapped_as_dataclass`.
+ The decorator turns the given class into a SQLAlchemy-compatible dataclass
+ in the same way that :func:`_orm.mapped_as_dataclass` does, taking
+ into account :func:`_orm.mapped_column` and other attributes for
+ dataclass-specific directives, but not actually mapping the class.
+
+ To create unmapped dataclass mixins when using a class hierarchy defined
+ by :class:`.DeclarativeBase` and :class:`.MappedAsDataclass`, the
+ :class:`.MappedAsDataclass` class may be subclassed alone for a similar
+ effect.
+
+ .. versionadded:: 2.1
+
+ .. seealso::
+
+ :ref:`orm_declarative_dc_mixins` - background and example use.
+
+ """
+
+ def decorate(cls: Type[_O]) -> Type[_O]:
+ _generate_dc_transforms(
+ init=init,
+ repr=repr,
+ eq=eq,
+ order=order,
+ unsafe_hash=unsafe_hash,
+ match_args=match_args,
+ kw_only=kw_only,
+ dataclass_callable=dataclass_callable,
+ cls_=cls,
+ )
+ _ORMClassConfigurator._as_unmapped_dataclass(cls, cls.__dict__)
+ return cls
+
+ if __cls:
+ return decorate(__cls)
+ else:
+ return decorate
def _declared_mapping_info(
cls: Type[Any],
-) -> Optional[Union[_DeferredMapperConfig, Mapper[Any]]]:
+) -> Optional[Union[_DeferredDeclarativeConfig, Mapper[Any]]]:
# deferred mapping
- if _DeferredMapperConfig.has_cls(cls):
- return _DeferredMapperConfig.config_for_cls(cls)
+ if _DeferredDeclarativeConfig.has_cls(cls):
+ return _DeferredDeclarativeConfig.config_for_cls(cls)
# regular mapping
elif _is_mapped_class(cls):
return class_mapper(cls, configure=False)
mapper._set_concrete_base()
"""
- if _DeferredMapperConfig.has_cls(cls):
+ if _DeferredDeclarativeConfig.has_cls(cls):
return not _get_immediate_cls_attr(
cls, "_sa_decl_prepare_nocascade", strict=True
)
return None
-def _as_declarative(
- registry: _RegistryType, cls: Type[Any], dict_: _ClassDict
-) -> Optional[_MapperConfig]:
- # declarative scans the class for attributes. no table or mapper
- # args passed separately.
- return _MapperConfig.setup_mapping(registry, cls, dict_, None, {})
-
-
-def _mapper(
- registry: _RegistryType,
- cls: Type[_O],
- table: Optional[FromClause],
- mapper_kw: _MapperKwArgs,
-) -> Mapper[_O]:
- _ImperativeMapperConfig(registry, cls, table, mapper_kw)
- return cast("MappedClassProtocol[_O]", cls).__mapper__
-
-
@util.preload_module("sqlalchemy.orm.decl_api")
def _is_declarative_props(obj: Any) -> bool:
_declared_attr_common = util.preloaded.orm_decl_api._declared_attr_common
return False
-class _MapperConfig:
- __slots__ = (
- "cls",
- "classname",
- "properties",
- "declared_attr_reg",
- "__weakref__",
- )
+class _ORMClassConfigurator:
+ """Object that configures a class that's potentially going to be
+ mapped, and/or turned into an ORM dataclass.
+
+ This is the base class for all the configurator objects.
+
+ """
+
+ __slots__ = ("cls", "classname", "__weakref__")
cls: Type[Any]
classname: str
- properties: util.OrderedDict[
- str,
- Union[
- Sequence[NamedColumn[Any]], NamedColumn[Any], MapperProperty[Any]
- ],
- ]
- declared_attr_reg: Dict[declared_attr[Any], Any]
+
+ def __init__(self, cls_: Type[Any]):
+ self.cls = util.assert_arg_type(cls_, type, "cls_")
+ self.classname = cls_.__name__
@classmethod
- def setup_mapping(
- cls,
- registry: _RegistryType,
- cls_: Type[_O],
- dict_: _ClassDict,
- table: Optional[FromClause],
- mapper_kw: _MapperKwArgs,
+ def _as_declarative(
+ cls, registry: _RegistryType, cls_: Type[Any], dict_: _ClassDict
) -> Optional[_MapperConfig]:
- manager = attributes.opt_manager_of_class(cls)
+ manager = attributes.opt_manager_of_class(cls_)
if manager and manager.class_ is cls_:
raise exc.InvalidRequestError(
- f"Class {cls!r} already has been instrumented declaratively"
+ f"Class {cls_!r} already has been instrumented declaratively"
)
if cls_.__dict__.get("__abstract__", False):
) or hasattr(cls_, "_sa_decl_prepare")
if defer_map:
- return _DeferredMapperConfig(
- registry, cls_, dict_, table, mapper_kw
- )
+ return _DeferredDeclarativeConfig(registry, cls_, dict_)
else:
- return _ClassScanMapperConfig(
- registry, cls_, dict_, table, mapper_kw
- )
+ return _DeclarativeMapperConfig(registry, cls_, dict_)
+
+ @classmethod
+ def _as_unmapped_dataclass(
+ cls, cls_: Type[Any], dict_: _ClassDict
+ ) -> _UnmappedDataclassConfig:
+ return _UnmappedDataclassConfig(cls_, dict_)
+
+ @classmethod
+ def _mapper(
+ cls,
+ registry: _RegistryType,
+ cls_: Type[_O],
+ table: Optional[FromClause],
+ mapper_kw: _MapperKwArgs,
+ ) -> Mapper[_O]:
+ _ImperativeMapperConfig(registry, cls_, table, mapper_kw)
+ return cast("MappedClassProtocol[_O]", cls_).__mapper__
+
+
+class _MapperConfig(_ORMClassConfigurator):
+ """Configurator that configures a class that's potentially going to be
+ mapped, and optionally turned into a dataclass as well."""
+
+ __slots__ = (
+ "properties",
+ "declared_attr_reg",
+ )
+
+ properties: util.OrderedDict[
+ str,
+ Union[
+ Sequence[NamedColumn[Any]], NamedColumn[Any], MapperProperty[Any]
+ ],
+ ]
+ declared_attr_reg: Dict[declared_attr[Any], Any]
def __init__(
self,
registry: _RegistryType,
cls_: Type[Any],
- mapper_kw: _MapperKwArgs,
):
- self.cls = util.assert_arg_type(cls_, type, "cls_")
- self.classname = cls_.__name__
+ super().__init__(cls_)
self.properties = util.OrderedDict()
self.declared_attr_reg = {}
manager.install_member(attrname, value)
return value
- def map(self, mapper_kw: _MapperKwArgs = ...) -> Mapper[Any]:
+ def map(self, mapper_kw: _MapperKwArgs) -> Mapper[Any]:
raise NotImplementedError()
def _early_mapping(self, mapper_kw: _MapperKwArgs) -> None:
class _ImperativeMapperConfig(_MapperConfig):
+ """Configurator that configures a class for an imperative mapping."""
+
__slots__ = ("local_table", "inherits")
def __init__(
table: Optional[FromClause],
mapper_kw: _MapperKwArgs,
):
- super().__init__(registry, cls_, mapper_kw)
+ super().__init__(registry, cls_)
self.local_table = self.set_cls_attribute("__table__", table)
def _setup_inheritance(self, mapper_kw: _MapperKwArgs) -> None:
cls = self.cls
- inherits = mapper_kw.get("inherits", None)
+ inherits = None
+ inherits_search = []
- if inherits is None:
- # since we search for classical mappings now, search for
- # multiple mapped bases as well and raise an error.
- inherits_search = []
- for base_ in cls.__bases__:
- c = _resolve_for_abstract_or_classical(base_)
- if c is None:
- continue
+ # since we search for classical mappings now, search for
+ # multiple mapped bases as well and raise an error.
+ for base_ in cls.__bases__:
+ c = _resolve_for_abstract_or_classical(base_)
+ if c is None:
+ continue
- if _is_supercls_for_inherits(c) and c not in inherits_search:
- inherits_search.append(c)
+ if _is_supercls_for_inherits(c) and c not in inherits_search:
+ inherits_search.append(c)
- if inherits_search:
- if len(inherits_search) > 1:
- raise exc.InvalidRequestError(
- "Class %s has multiple mapped bases: %r"
- % (cls, inherits_search)
- )
- inherits = inherits_search[0]
- elif isinstance(inherits, Mapper):
- inherits = inherits.class_
+ if inherits_search:
+ if len(inherits_search) > 1:
+ raise exc.InvalidRequestError(
+ "Class %s has multiple mapped bases: %r"
+ % (cls, inherits_search)
+ )
+ inherits = inherits_search[0]
self.inherits = inherits
originating_class: Type[Any]
-class _ClassScanMapperConfig(_MapperConfig):
- __slots__ = (
- "registry",
- "clsdict_view",
- "collected_attributes",
- "collected_annotations",
- "local_table",
- "persist_selectable",
- "declared_columns",
- "column_ordering",
- "column_copies",
- "table_args",
- "tablename",
- "mapper_args",
- "mapper_args_fn",
- "table_fn",
- "inherits",
- "single",
- "allow_dataclass_fields",
- "dataclass_setup_arguments",
- "is_dataclass_prior_to_mapping",
- "allow_unmapped_annotations",
- )
+class _ClassScanAbstractConfig(_ORMClassConfigurator):
+ """Abstract base for a configurator that configures a class for a
+ declarative mapping, or an unmapped ORM dataclass.
+
+ Defines scanning of pep-484 annotations as well as ORM dataclass
+ applicators.
+
+ """
+
+ __slots__ = ()
- is_deferred = False
- registry: _RegistryType
clsdict_view: _ClassDict
collected_annotations: Dict[str, _CollectedAnnotation]
collected_attributes: Dict[str, Any]
- local_table: Optional[FromClause]
- persist_selectable: Optional[FromClause]
- declared_columns: util.OrderedSet[Column[Any]]
- column_ordering: Dict[Column[Any], int]
- column_copies: Dict[
- Union[MappedColumn[Any], Column[Any]],
- Union[MappedColumn[Any], Column[Any]],
- ]
- tablename: Optional[str]
- mapper_args: Mapping[str, Any]
- table_args: Optional[_TableArgsType]
- mapper_args_fn: Optional[Callable[[], Dict[str, Any]]]
- inherits: Optional[Type[Any]]
- single: bool
is_dataclass_prior_to_mapping: bool
allow_unmapped_annotations: bool
"""
- def __init__(
- self,
- registry: _RegistryType,
- cls_: Type[_O],
- dict_: _ClassDict,
- table: Optional[FromClause],
- mapper_kw: _MapperKwArgs,
- ):
- # grab class dict before the instrumentation manager has been added.
- # reduces cycles
- self.clsdict_view = (
- util.immutabledict(dict_) if dict_ else util.EMPTY_DICT
- )
- super().__init__(registry, cls_, mapper_kw)
- self.registry = registry
- self.persist_selectable = None
-
- self.collected_attributes = {}
- self.collected_annotations = {}
- self.declared_columns = util.OrderedSet()
- self.column_ordering = {}
- self.column_copies = {}
- self.single = False
- self.dataclass_setup_arguments = dca = getattr(
- self.cls, "_sa_apply_dc_transforms", None
- )
+ _include_dunders = {
+ "__table__",
+ "__mapper_args__",
+ "__tablename__",
+ "__table_args__",
+ }
- self.allow_unmapped_annotations = getattr(
- self.cls, "__allow_unmapped__", False
- ) or bool(self.dataclass_setup_arguments)
+ _match_exclude_dunders = re.compile(r"^(?:_sa_|__)")
- self.is_dataclass_prior_to_mapping = cld = dataclasses.is_dataclass(
- cls_
- )
+ def _scan_attributes(self) -> None:
+ raise NotImplementedError()
- sdk = _get_immediate_cls_attr(cls_, "__sa_dataclass_metadata_key__")
+ def _setup_dataclasses_transforms(
+ self, *, enable_descriptor_defaults: bool, revert: bool = False
+ ) -> None:
+ dataclass_setup_arguments = self.dataclass_setup_arguments
+ if not dataclass_setup_arguments:
+ return
- # we don't want to consume Field objects from a not-already-dataclass.
- # the Field objects won't have their "name" or "type" populated,
- # and while it seems like we could just set these on Field as we
- # read them, Field is documented as "user read only" and we need to
- # stay far away from any off-label use of dataclasses APIs.
- if (not cld or dca) and sdk:
+ # can't use is_dataclass since it uses hasattr
+ if "__dataclass_fields__" in self.cls.__dict__:
raise exc.InvalidRequestError(
- "SQLAlchemy mapped dataclasses can't consume mapping "
- "information from dataclass.Field() objects if the immediate "
- "class is not already a dataclass."
+ f"Class {self.cls} is already a dataclass; ensure that "
+ "base classes / decorator styles of establishing dataclasses "
+ "are not being mixed. "
+ "This can happen if a class that inherits from "
+ "'MappedAsDataclass', even indirectly, is being mapped with "
+ "'@registry.mapped_as_dataclass'"
)
- # if already a dataclass, and __sa_dataclass_metadata_key__ present,
- # then also look inside of dataclass.Field() objects yielded by
- # dataclasses.get_fields(cls) when scanning for attributes
- self.allow_dataclass_fields = bool(sdk and cld)
-
- self._setup_declared_events()
-
- self._scan_attributes()
-
- self._setup_dataclasses_transforms()
-
- with mapperlib._CONFIGURE_MUTEX:
- clsregistry._add_class(
- self.classname, self.cls, registry._class_registry
+ # can't create a dataclass if __table__ is already there. This would
+ # fail an assertion when calling _get_arguments_for_make_dataclass:
+ # assert False, "Mapped[] received without a mapping declaration"
+ if "__table__" in self.cls.__dict__:
+ raise exc.InvalidRequestError(
+ f"Class {self.cls} already defines a '__table__'. "
+ "ORM Annotated Dataclasses do not support a pre-existing "
+ "'__table__' element"
)
- self._setup_inheriting_mapper(mapper_kw)
-
- self._extract_mappable_attributes()
-
- self._extract_declared_columns()
-
- self._setup_table(table)
-
- self._setup_inheriting_columns(mapper_kw)
-
- self._early_mapping(mapper_kw)
-
- def _setup_declared_events(self) -> None:
- if _get_immediate_cls_attr(self.cls, "__declare_last__"):
-
- @event.listens_for(Mapper, "after_configured")
- def after_configured() -> None:
- cast(
- "_DeclMappedClassProtocol[Any]", self.cls
- ).__declare_last__()
-
- if _get_immediate_cls_attr(self.cls, "__declare_first__"):
-
- @event.listens_for(Mapper, "before_configured")
- def before_configured() -> None:
- cast(
- "_DeclMappedClassProtocol[Any]", self.cls
- ).__declare_first__()
+ raise_for_non_dc_attrs = collections.defaultdict(list)
- def _cls_attr_override_checker(
- self, cls: Type[_O]
- ) -> Callable[[str, Any], bool]:
- """Produce a function that checks if a class has overridden an
- attribute, taking SQLAlchemy-enabled dataclass fields into account.
+ def _allow_dataclass_field(
+ key: str, originating_class: Type[Any]
+ ) -> bool:
+ if (
+ originating_class is not self.cls
+ and "__dataclass_fields__" not in originating_class.__dict__
+ ):
+ raise_for_non_dc_attrs[originating_class].append(key)
- """
+ return True
- if self.allow_dataclass_fields:
- sa_dataclass_metadata_key = _get_immediate_cls_attr(
- cls, "__sa_dataclass_metadata_key__"
+ field_list = [
+ _AttributeOptions._get_arguments_for_make_dataclass(
+ self,
+ key,
+ anno,
+ mapped_container,
+ self.collected_attributes.get(key, _NoArg.NO_ARG),
+ dataclass_setup_arguments,
+ enable_descriptor_defaults,
+ )
+ for key, anno, mapped_container in (
+ (
+ key,
+ mapped_anno if mapped_anno else raw_anno,
+ mapped_container,
+ )
+ for key, (
+ raw_anno,
+ mapped_container,
+ mapped_anno,
+ is_dc,
+ attr_value,
+ originating_module,
+ originating_class,
+ ) in self.collected_annotations.items()
+ if _allow_dataclass_field(key, originating_class)
+ and (
+ key not in self.collected_attributes
+ # issue #9226; check for attributes that we've collected
+ # which are already instrumented, which we would assume
+ # mean we are in an ORM inheritance mapping and this
+ # attribute is already mapped on the superclass. Under
+ # no circumstance should any QueryableAttribute be sent to
+ # the dataclass() function; anything that's mapped should
+ # be Field and that's it
+ or not isinstance(
+ self.collected_attributes[key], QueryableAttribute
+ )
+ )
+ )
+ ]
+ if raise_for_non_dc_attrs:
+ for (
+ originating_class,
+ non_dc_attrs,
+ ) in raise_for_non_dc_attrs.items():
+ raise exc.InvalidRequestError(
+ f"When transforming {self.cls} to a dataclass, "
+ f"attribute(s) "
+ f"{', '.join(repr(key) for key in non_dc_attrs)} "
+ f"originate from superclass "
+ f"{originating_class}, which is not a dataclass. When "
+ f"declaring SQLAlchemy Declarative "
+ f"Dataclasses, ensure that all mixin classes and other "
+ f"superclasses which include attributes are also "
+ f"subclasses of MappedAsDataclass or make use of the "
+ f"@unmapped_dataclass decorator.",
+ code="dcmx",
+ )
+
+ annotations = {}
+ defaults = {}
+ for item in field_list:
+ if len(item) == 2:
+ name, tp = item
+ elif len(item) == 3:
+ name, tp, spec = item
+ defaults[name] = spec
+ else:
+ assert False
+ annotations[name] = tp
+
+ revert_dict = {}
+
+ for k, v in defaults.items():
+ if k in self.cls.__dict__:
+ revert_dict[k] = self.cls.__dict__[k]
+ setattr(self.cls, k, v)
+
+ self._apply_dataclasses_to_any_class(
+ dataclass_setup_arguments, self.cls, annotations
+ )
+
+ if revert:
+ # used for mixin dataclasses; we have to restore the
+ # mapped_column(), relationship(), etc. to the class so that
+ # they take effect when the class is later scanned for mapping
+ for k, v in revert_dict.items():
+ setattr(self.cls, k, v)
+
+ def _collect_annotation(
+ self,
+ name: str,
+ raw_annotation: _AnnotationScanType,
+ originating_class: Type[Any],
+ expect_mapped: Optional[bool],
+ attr_value: Any,
+ ) -> Optional[_CollectedAnnotation]:
+ if name in self.collected_annotations:
+ return self.collected_annotations[name]
+
+ if raw_annotation is None:
+ return None
+
+ is_dataclass = self.is_dataclass_prior_to_mapping
+ allow_unmapped = self.allow_unmapped_annotations
+
+ if expect_mapped is None:
+ is_dataclass_field = isinstance(attr_value, dataclasses.Field)
+ expect_mapped = (
+ not is_dataclass_field
+ and not allow_unmapped
+ and (
+ attr_value is None
+ or isinstance(attr_value, _MappedAttribute)
+ )
+ )
+
+ is_dataclass_field = False
+ extracted = _extract_mapped_subtype(
+ raw_annotation,
+ self.cls,
+ originating_class.__module__,
+ name,
+ type(attr_value),
+ required=False,
+ is_dataclass_field=is_dataclass_field,
+ expect_mapped=expect_mapped and not is_dataclass,
+ )
+ if extracted is None:
+ # ClassVar can come out here
+ return None
+
+ extracted_mapped_annotation, mapped_container = extracted
+
+ if attr_value is None and not is_literal(extracted_mapped_annotation):
+ for elem in get_args(extracted_mapped_annotation):
+ if is_fwd_ref(
+ elem, check_generic=True, check_for_plain_string=True
+ ):
+ elem = de_stringify_annotation(
+ self.cls,
+ elem,
+ originating_class.__module__,
+ include_generic=True,
+ )
+ # look in Annotated[...] for an ORM construct,
+ # such as Annotated[int, mapped_column(primary_key=True)]
+ if isinstance(elem, _IntrospectsAnnotations):
+ attr_value = elem.found_in_pep593_annotated()
+
+ self.collected_annotations[name] = ca = _CollectedAnnotation(
+ raw_annotation,
+ mapped_container,
+ extracted_mapped_annotation,
+ is_dataclass,
+ attr_value,
+ originating_class.__module__,
+ originating_class,
+ )
+ return ca
+
+ @classmethod
+ def _apply_dataclasses_to_any_class(
+ cls,
+ dataclass_setup_arguments: _DataclassArguments,
+ klass: Type[_O],
+ use_annotations: Mapping[str, _AnnotationScanType],
+ ) -> None:
+ cls._assert_dc_arguments(dataclass_setup_arguments)
+
+ dataclass_callable = dataclass_setup_arguments["dataclass_callable"]
+ if dataclass_callable is _NoArg.NO_ARG:
+ dataclass_callable = dataclasses.dataclass
+
+ restored: Optional[Any]
+
+ if use_annotations:
+ # apply constructed annotations that should look "normal" to a
+ # dataclasses callable, based on the fields present. This
+ # means remove the Mapped[] container and ensure all Field
+ # entries have an annotation
+ restored = getattr(klass, "__annotations__", None)
+ klass.__annotations__ = cast("Dict[str, Any]", use_annotations)
+ else:
+ restored = None
+
+ try:
+ dataclass_callable( # type: ignore[call-overload]
+ klass,
+ **{ # type: ignore[call-overload,unused-ignore]
+ k: v
+ for k, v in dataclass_setup_arguments.items()
+ if v is not _NoArg.NO_ARG
+ and k not in ("dataclass_callable",)
+ },
+ )
+ except (TypeError, ValueError) as ex:
+ raise exc.InvalidRequestError(
+ f"Python dataclasses error encountered when creating "
+ f"dataclass for {klass.__name__!r}: "
+ f"{ex!r}. Please refer to Python dataclasses "
+ "documentation for additional information.",
+ code="dcte",
+ ) from ex
+ finally:
+ # restore original annotations outside of the dataclasses
+ # process; for mixins and __abstract__ superclasses, SQLAlchemy
+ # Declarative will need to see the Mapped[] container inside the
+ # annotations in order to map subclasses
+ if use_annotations:
+ if restored is None:
+ del klass.__annotations__
+ else:
+ klass.__annotations__ = restored
+
+ @classmethod
+ def _assert_dc_arguments(cls, arguments: _DataclassArguments) -> None:
+ allowed = {
+ "init",
+ "repr",
+ "order",
+ "eq",
+ "unsafe_hash",
+ "kw_only",
+ "match_args",
+ "dataclass_callable",
+ }
+ disallowed_args = set(arguments).difference(allowed)
+ if disallowed_args:
+ msg = ", ".join(f"{arg!r}" for arg in sorted(disallowed_args))
+ raise exc.ArgumentError(
+ f"Dataclass argument(s) {msg} are not accepted"
+ )
+
+ def _cls_attr_override_checker(
+ self, cls: Type[_O]
+ ) -> Callable[[str, Any], bool]:
+ """Produce a function that checks if a class has overridden an
+ attribute, taking SQLAlchemy-enabled dataclass fields into account.
+
+ """
+
+ if self.allow_dataclass_fields:
+ sa_dataclass_metadata_key = _get_immediate_cls_attr(
+ cls, "__sa_dataclass_metadata_key__"
)
else:
sa_dataclass_metadata_key = None
if ret is obj:
return False
- # for dataclasses, this could be the
- # 'default' of the field. so filter more specifically
- # for an already-mapped InstrumentedAttribute
- if ret is not absent and isinstance(
- ret, InstrumentedAttribute
- ):
- return True
+ # for dataclasses, this could be the
+ # 'default' of the field. so filter more specifically
+ # for an already-mapped InstrumentedAttribute
+ if ret is not absent and isinstance(
+ ret, InstrumentedAttribute
+ ):
+ return True
+
+ if all_field is obj:
+ return False
+ elif all_field is not absent:
+ return True
+
+ # can't find another attribute
+ return False
+
+ return attribute_is_overridden
+
+ def _cls_attr_resolver(
+ self, cls: Type[Any]
+ ) -> Callable[[], Iterable[Tuple[str, Any, Any, bool]]]:
+ """produce a function to iterate the "attributes" of a class
+ which we want to consider for mapping, adjusting for SQLAlchemy fields
+ embedded in dataclass fields.
+
+ """
+ cls_annotations = util.get_annotations(cls)
+
+ cls_vars = vars(cls)
+
+ _include_dunders = self._include_dunders
+ _match_exclude_dunders = self._match_exclude_dunders
+
+ names = [
+ n
+ for n in util.merge_lists_w_ordering(
+ list(cls_vars), list(cls_annotations)
+ )
+ if not _match_exclude_dunders.match(n) or n in _include_dunders
+ ]
+
+ if self.allow_dataclass_fields:
+ sa_dataclass_metadata_key: Optional[str] = _get_immediate_cls_attr(
+ cls, "__sa_dataclass_metadata_key__"
+ )
+ else:
+ sa_dataclass_metadata_key = None
+
+ if not sa_dataclass_metadata_key:
+
+ def local_attributes_for_class() -> (
+ Iterable[Tuple[str, Any, Any, bool]]
+ ):
+ return (
+ (
+ name,
+ cls_vars.get(name),
+ cls_annotations.get(name),
+ False,
+ )
+ for name in names
+ )
+
+ else:
+ dataclass_fields = {
+ field.name: field for field in util.local_dataclass_fields(cls)
+ }
+
+ fixed_sa_dataclass_metadata_key = sa_dataclass_metadata_key
+
+ def local_attributes_for_class() -> (
+ Iterable[Tuple[str, Any, Any, bool]]
+ ):
+ for name in names:
+ field = dataclass_fields.get(name, None)
+ if field and sa_dataclass_metadata_key in field.metadata:
+ yield field.name, _as_dc_declaredattr(
+ field.metadata, fixed_sa_dataclass_metadata_key
+ ), cls_annotations.get(field.name), True
+ else:
+ yield name, cls_vars.get(name), cls_annotations.get(
+ name
+ ), False
+
+ return local_attributes_for_class
+
+
+class _DeclarativeMapperConfig(_MapperConfig, _ClassScanAbstractConfig):
+ """Configurator that will produce a declarative mapped class"""
+
+ __slots__ = (
+ "registry",
+ "local_table",
+ "persist_selectable",
+ "declared_columns",
+ "column_ordering",
+ "column_copies",
+ "table_args",
+ "tablename",
+ "mapper_args",
+ "mapper_args_fn",
+ "table_fn",
+ "inherits",
+ "single",
+ "clsdict_view",
+ "collected_attributes",
+ "collected_annotations",
+ "allow_dataclass_fields",
+ "dataclass_setup_arguments",
+ "is_dataclass_prior_to_mapping",
+ "allow_unmapped_annotations",
+ )
+
+ is_deferred = False
+ registry: _RegistryType
+ local_table: Optional[FromClause]
+ persist_selectable: Optional[FromClause]
+ declared_columns: util.OrderedSet[Column[Any]]
+ column_ordering: Dict[Column[Any], int]
+ column_copies: Dict[
+ Union[MappedColumn[Any], Column[Any]],
+ Union[MappedColumn[Any], Column[Any]],
+ ]
+ tablename: Optional[str]
+ mapper_args: Mapping[str, Any]
+ table_args: Optional[_TableArgsType]
+ mapper_args_fn: Optional[Callable[[], Dict[str, Any]]]
+ inherits: Optional[Type[Any]]
+ single: bool
+
+ def __init__(
+ self,
+ registry: _RegistryType,
+ cls_: Type[_O],
+ dict_: _ClassDict,
+ ):
+ # grab class dict before the instrumentation manager has been added.
+ # reduces cycles
+ self.clsdict_view = (
+ util.immutabledict(dict_) if dict_ else util.EMPTY_DICT
+ )
+ super().__init__(registry, cls_)
+ self.registry = registry
+ self.persist_selectable = None
+
+ self.collected_attributes = {}
+ self.collected_annotations = {}
+ self.declared_columns = util.OrderedSet()
+ self.column_ordering = {}
+ self.column_copies = {}
+ self.single = False
+ self.dataclass_setup_arguments = dca = getattr(
+ self.cls, "_sa_apply_dc_transforms", None
+ )
+
+ self.allow_unmapped_annotations = getattr(
+ self.cls, "__allow_unmapped__", False
+ ) or bool(self.dataclass_setup_arguments)
- if all_field is obj:
- return False
- elif all_field is not absent:
- return True
+ self.is_dataclass_prior_to_mapping = cld = dataclasses.is_dataclass(
+ cls_
+ )
- # can't find another attribute
- return False
+ sdk = _get_immediate_cls_attr(cls_, "__sa_dataclass_metadata_key__")
- return attribute_is_overridden
+ # we don't want to consume Field objects from a not-already-dataclass.
+ # the Field objects won't have their "name" or "type" populated,
+ # and while it seems like we could just set these on Field as we
+ # read them, Field is documented as "user read only" and we need to
+ # stay far away from any off-label use of dataclasses APIs.
+ if (not cld or dca) and sdk:
+ raise exc.InvalidRequestError(
+ "SQLAlchemy mapped dataclasses can't consume mapping "
+ "information from dataclass.Field() objects if the immediate "
+ "class is not already a dataclass."
+ )
- _include_dunders = {
- "__table__",
- "__mapper_args__",
- "__tablename__",
- "__table_args__",
- }
+ # if already a dataclass, and __sa_dataclass_metadata_key__ present,
+ # then also look inside of dataclass.Field() objects yielded by
+ # dataclasses.get_fields(cls) when scanning for attributes
+ self.allow_dataclass_fields = bool(sdk and cld)
- _match_exclude_dunders = re.compile(r"^(?:_sa_|__)")
+ self._setup_declared_events()
- def _cls_attr_resolver(
- self, cls: Type[Any]
- ) -> Callable[[], Iterable[Tuple[str, Any, Any, bool]]]:
- """produce a function to iterate the "attributes" of a class
- which we want to consider for mapping, adjusting for SQLAlchemy fields
- embedded in dataclass fields.
+ self._scan_attributes()
- """
- cls_annotations = util.get_annotations(cls)
+ self._setup_dataclasses_transforms(enable_descriptor_defaults=True)
- cls_vars = vars(cls)
+ with mapperlib._CONFIGURE_MUTEX:
+ clsregistry._add_class(
+ self.classname, self.cls, registry._class_registry
+ )
- _include_dunders = self._include_dunders
- _match_exclude_dunders = self._match_exclude_dunders
+ self._setup_inheriting_mapper()
- names = [
- n
- for n in util.merge_lists_w_ordering(
- list(cls_vars), list(cls_annotations)
- )
- if not _match_exclude_dunders.match(n) or n in _include_dunders
- ]
+ self._extract_mappable_attributes()
- if self.allow_dataclass_fields:
- sa_dataclass_metadata_key: Optional[str] = _get_immediate_cls_attr(
- cls, "__sa_dataclass_metadata_key__"
- )
- else:
- sa_dataclass_metadata_key = None
+ self._extract_declared_columns()
- if not sa_dataclass_metadata_key:
+ self._setup_table()
- def local_attributes_for_class() -> (
- Iterable[Tuple[str, Any, Any, bool]]
- ):
- return (
- (
- name,
- cls_vars.get(name),
- cls_annotations.get(name),
- False,
- )
- for name in names
- )
+ self._setup_inheriting_columns()
- else:
- dataclass_fields = {
- field.name: field for field in util.local_dataclass_fields(cls)
- }
+ self._early_mapping(util.EMPTY_DICT)
- fixed_sa_dataclass_metadata_key = sa_dataclass_metadata_key
+ def _setup_declared_events(self) -> None:
+ if _get_immediate_cls_attr(self.cls, "__declare_last__"):
- def local_attributes_for_class() -> (
- Iterable[Tuple[str, Any, Any, bool]]
- ):
- for name in names:
- field = dataclass_fields.get(name, None)
- if field and sa_dataclass_metadata_key in field.metadata:
- yield field.name, _as_dc_declaredattr(
- field.metadata, fixed_sa_dataclass_metadata_key
- ), cls_annotations.get(field.name), True
- else:
- yield name, cls_vars.get(name), cls_annotations.get(
- name
- ), False
+ @event.listens_for(Mapper, "after_configured")
+ def after_configured() -> None:
+ cast(
+ "_DeclMappedClassProtocol[Any]", self.cls
+ ).__declare_last__()
- return local_attributes_for_class
+ if _get_immediate_cls_attr(self.cls, "__declare_first__"):
+
+ @event.listens_for(Mapper, "before_configured")
+ def before_configured() -> None:
+ cast(
+ "_DeclMappedClassProtocol[Any]", self.cls
+ ).__declare_first__()
def _scan_attributes(self) -> None:
cls = self.cls
# dataclass-only path. if the name is only
# a dataclass field and isn't in local cls.__dict__,
# put the object there.
- # assert that the dataclass-enabled resolver agrees
- # with what we are seeing
-
- assert not attribute_is_overridden(name, obj)
-
- if _is_declarative_props(obj):
- obj = obj.fget()
-
- collected_attributes[name] = obj
- self._collect_annotation(
- name, annotation, base, False, obj
- )
- else:
- collected_annotation = self._collect_annotation(
- name, annotation, base, None, obj
- )
- is_mapped = (
- collected_annotation is not None
- and collected_annotation.mapped_container is not None
- )
- generated_obj = (
- collected_annotation.attr_value
- if collected_annotation is not None
- else obj
- )
- if obj is None and not fixed_table and is_mapped:
- collected_attributes[name] = (
- generated_obj
- if generated_obj is not None
- else MappedColumn()
- )
- elif name in clsdict_view:
- collected_attributes[name] = obj
- # else if the name is not in the cls.__dict__,
- # don't collect it as an attribute.
- # we will see the annotation only, which is meaningful
- # both for mapping and dataclasses setup
-
- if inherited_table_args and not tablename:
- table_args = None
-
- self.table_args = table_args
- self.tablename = tablename
- self.mapper_args_fn = mapper_args_fn
- self.table_fn = table_fn
-
- def _setup_dataclasses_transforms(self) -> None:
- dataclass_setup_arguments = self.dataclass_setup_arguments
- if not dataclass_setup_arguments:
- return
-
- # can't use is_dataclass since it uses hasattr
- if "__dataclass_fields__" in self.cls.__dict__:
- raise exc.InvalidRequestError(
- f"Class {self.cls} is already a dataclass; ensure that "
- "base classes / decorator styles of establishing dataclasses "
- "are not being mixed. "
- "This can happen if a class that inherits from "
- "'MappedAsDataclass', even indirectly, is been mapped with "
- "'@registry.mapped_as_dataclass'"
- )
-
- # can't create a dataclass if __table__ is already there. This would
- # fail an assertion when calling _get_arguments_for_make_dataclass:
- # assert False, "Mapped[] received without a mapping declaration"
- if "__table__" in self.cls.__dict__:
- raise exc.InvalidRequestError(
- f"Class {self.cls} already defines a '__table__'. "
- "ORM Annotated Dataclasses do not support a pre-existing "
- "'__table__' element"
- )
-
- warn_for_non_dc_attrs = collections.defaultdict(list)
-
- def _allow_dataclass_field(
- key: str, originating_class: Type[Any]
- ) -> bool:
- if (
- originating_class is not self.cls
- and "__dataclass_fields__" not in originating_class.__dict__
- ):
- warn_for_non_dc_attrs[originating_class].append(key)
-
- return True
-
- manager = instrumentation.manager_of_class(self.cls)
- assert manager is not None
-
- field_list = [
- _AttributeOptions._get_arguments_for_make_dataclass(
- self,
- key,
- anno,
- mapped_container,
- self.collected_attributes.get(key, _NoArg.NO_ARG),
- dataclass_setup_arguments,
- )
- for key, anno, mapped_container in (
- (
- key,
- mapped_anno if mapped_anno else raw_anno,
- mapped_container,
- )
- for key, (
- raw_anno,
- mapped_container,
- mapped_anno,
- is_dc,
- attr_value,
- originating_module,
- originating_class,
- ) in self.collected_annotations.items()
- if _allow_dataclass_field(key, originating_class)
- and (
- key not in self.collected_attributes
- # issue #9226; check for attributes that we've collected
- # which are already instrumented, which we would assume
- # mean we are in an ORM inheritance mapping and this
- # attribute is already mapped on the superclass. Under
- # no circumstance should any QueryableAttribute be sent to
- # the dataclass() function; anything that's mapped should
- # be Field and that's it
- or not isinstance(
- self.collected_attributes[key], QueryableAttribute
- )
- )
- )
- ]
- if warn_for_non_dc_attrs:
- for (
- originating_class,
- non_dc_attrs,
- ) in warn_for_non_dc_attrs.items():
- util.warn_deprecated(
- f"When transforming {self.cls} to a dataclass, "
- f"attribute(s) "
- f"{', '.join(repr(key) for key in non_dc_attrs)} "
- f"originates from superclass "
- f"{originating_class}, which is not a dataclass. This "
- f"usage is deprecated and will raise an error in "
- f"SQLAlchemy 2.1. When declaring SQLAlchemy Declarative "
- f"Dataclasses, ensure that all mixin classes and other "
- f"superclasses which include attributes are also a "
- f"subclass of MappedAsDataclass.",
- "2.0",
- code="dcmx",
- )
+ # assert that the dataclass-enabled resolver agrees
+ # with what we are seeing
- annotations = {}
- defaults = {}
- for item in field_list:
- if len(item) == 2:
- name, tp = item
- elif len(item) == 3:
- name, tp, spec = item
- defaults[name] = spec
- else:
- assert False
- annotations[name] = tp
+ assert not attribute_is_overridden(name, obj)
- for k, v in defaults.items():
- setattr(self.cls, k, v)
+ if _is_declarative_props(obj):
+ obj = obj.fget()
- self._apply_dataclasses_to_any_class(
- dataclass_setup_arguments, self.cls, annotations
- )
+ collected_attributes[name] = obj
+ self._collect_annotation(
+ name, annotation, base, False, obj
+ )
+ else:
+ collected_annotation = self._collect_annotation(
+ name, annotation, base, None, obj
+ )
+ is_mapped = (
+ collected_annotation is not None
+ and collected_annotation.mapped_container is not None
+ )
+ generated_obj = (
+ collected_annotation.attr_value
+ if collected_annotation is not None
+ else obj
+ )
+ if obj is None and not fixed_table and is_mapped:
+ collected_attributes[name] = (
+ generated_obj
+ if generated_obj is not None
+ else MappedColumn()
+ )
+ elif name in clsdict_view:
+ collected_attributes[name] = obj
+ # else if the name is not in the cls.__dict__,
+ # don't collect it as an attribute.
+ # we will see the annotation only, which is meaningful
+ # both for mapping and dataclasses setup
+
+ if inherited_table_args and not tablename:
+ table_args = None
+
+ self.table_args = table_args
+ self.tablename = tablename
+ self.mapper_args_fn = mapper_args_fn
+ self.table_fn = table_fn
@classmethod
def _update_annotations_for_non_mapped_class(
new_anno[name] = annotation
return new_anno
- @classmethod
- def _apply_dataclasses_to_any_class(
- cls,
- dataclass_setup_arguments: _DataclassArguments,
- klass: Type[_O],
- use_annotations: Mapping[str, _AnnotationScanType],
- ) -> None:
- cls._assert_dc_arguments(dataclass_setup_arguments)
-
- dataclass_callable = dataclass_setup_arguments["dataclass_callable"]
- if dataclass_callable is _NoArg.NO_ARG:
- dataclass_callable = dataclasses.dataclass
-
- restored: Optional[Any]
-
- if use_annotations:
- # apply constructed annotations that should look "normal" to a
- # dataclasses callable, based on the fields present. This
- # means remove the Mapped[] container and ensure all Field
- # entries have an annotation
- restored = getattr(klass, "__annotations__", None)
- klass.__annotations__ = cast("Dict[str, Any]", use_annotations)
- else:
- restored = None
-
- try:
- dataclass_callable( # type: ignore[call-overload]
- klass,
- **{ # type: ignore[call-overload,unused-ignore]
- k: v
- for k, v in dataclass_setup_arguments.items()
- if v is not _NoArg.NO_ARG
- and k not in ("dataclass_callable",)
- },
- )
- except (TypeError, ValueError) as ex:
- raise exc.InvalidRequestError(
- f"Python dataclasses error encountered when creating "
- f"dataclass for {klass.__name__!r}: "
- f"{ex!r}. Please refer to Python dataclasses "
- "documentation for additional information.",
- code="dcte",
- ) from ex
- finally:
- # restore original annotations outside of the dataclasses
- # process; for mixins and __abstract__ superclasses, SQLAlchemy
- # Declarative will need to see the Mapped[] container inside the
- # annotations in order to map subclasses
- if use_annotations:
- if restored is None:
- del klass.__annotations__
- else:
- klass.__annotations__ = restored
-
- @classmethod
- def _assert_dc_arguments(cls, arguments: _DataclassArguments) -> None:
- allowed = {
- "init",
- "repr",
- "order",
- "eq",
- "unsafe_hash",
- "kw_only",
- "match_args",
- "dataclass_callable",
- }
- disallowed_args = set(arguments).difference(allowed)
- if disallowed_args:
- msg = ", ".join(f"{arg!r}" for arg in sorted(disallowed_args))
- raise exc.ArgumentError(
- f"Dataclass argument(s) {msg} are not accepted"
- )
-
- def _collect_annotation(
- self,
- name: str,
- raw_annotation: _AnnotationScanType,
- originating_class: Type[Any],
- expect_mapped: Optional[bool],
- attr_value: Any,
- ) -> Optional[_CollectedAnnotation]:
- if name in self.collected_annotations:
- return self.collected_annotations[name]
-
- if raw_annotation is None:
- return None
-
- is_dataclass = self.is_dataclass_prior_to_mapping
- allow_unmapped = self.allow_unmapped_annotations
-
- if expect_mapped is None:
- is_dataclass_field = isinstance(attr_value, dataclasses.Field)
- expect_mapped = (
- not is_dataclass_field
- and not allow_unmapped
- and (
- attr_value is None
- or isinstance(attr_value, _MappedAttribute)
- )
- )
-
- is_dataclass_field = False
- extracted = _extract_mapped_subtype(
- raw_annotation,
- self.cls,
- originating_class.__module__,
- name,
- type(attr_value),
- required=False,
- is_dataclass_field=is_dataclass_field,
- expect_mapped=expect_mapped and not is_dataclass,
- )
- if extracted is None:
- # ClassVar can come out here
- return None
-
- extracted_mapped_annotation, mapped_container = extracted
-
- if attr_value is None and not is_literal(extracted_mapped_annotation):
- for elem in get_args(extracted_mapped_annotation):
- if is_fwd_ref(
- elem, check_generic=True, check_for_plain_string=True
- ):
- elem = de_stringify_annotation(
- self.cls,
- elem,
- originating_class.__module__,
- include_generic=True,
- )
- # look in Annotated[...] for an ORM construct,
- # such as Annotated[int, mapped_column(primary_key=True)]
- if isinstance(elem, _IntrospectsAnnotations):
- attr_value = elem.found_in_pep593_annotated()
-
- self.collected_annotations[name] = ca = _CollectedAnnotation(
- raw_annotation,
- mapped_container,
- extracted_mapped_annotation,
- is_dataclass,
- attr_value,
- originating_class.__module__,
- originating_class,
- )
- return ca
-
def _warn_for_decl_attributes(
self, cls: Type[Any], key: str, c: Any
) -> None:
else:
return manager.registry.metadata
- def _setup_inheriting_mapper(self, mapper_kw: _MapperKwArgs) -> None:
+ def _setup_inheriting_mapper(self) -> None:
cls = self.cls
- inherits = mapper_kw.get("inherits", None)
+ inherits = None
if inherits is None:
# since we search for classical mappings now, search for
if "__table__" not in clsdict_view and self.tablename is None:
self.single = True
- def _setup_inheriting_columns(self, mapper_kw: _MapperKwArgs) -> None:
+ def _setup_inheriting_columns(self) -> None:
table = self.local_table
cls = self.cls
table_args = self.table_args
)
+class _UnmappedDataclassConfig(_ClassScanAbstractConfig):
+ """Configurator that will produce an unmapped dataclass."""
+
+ __slots__ = (
+ "clsdict_view",
+ "collected_attributes",
+ "collected_annotations",
+ "allow_dataclass_fields",
+ "dataclass_setup_arguments",
+ "is_dataclass_prior_to_mapping",
+ "allow_unmapped_annotations",
+ )
+
+ def __init__(
+ self,
+ cls_: Type[_O],
+ dict_: _ClassDict,
+ ):
+ super().__init__(cls_)
+ self.clsdict_view = (
+ util.immutabledict(dict_) if dict_ else util.EMPTY_DICT
+ )
+ self.dataclass_setup_arguments = getattr(
+ self.cls, "_sa_apply_dc_transforms", None
+ )
+
+ self.is_dataclass_prior_to_mapping = dataclasses.is_dataclass(cls_)
+ self.allow_dataclass_fields = False
+ self.allow_unmapped_annotations = True
+ self.collected_attributes = {}
+ self.collected_annotations = {}
+
+ self._scan_attributes()
+
+ self._setup_dataclasses_transforms(
+ enable_descriptor_defaults=False, revert=True
+ )
+
+ def _scan_attributes(self) -> None:
+ cls = self.cls
+
+ clsdict_view = self.clsdict_view
+ collected_attributes = self.collected_attributes
+ _include_dunders = self._include_dunders
+
+ attribute_is_overridden = self._cls_attr_override_checker(self.cls)
+
+ local_attributes_for_class = self._cls_attr_resolver(cls)
+ for (
+ name,
+ obj,
+ annotation,
+ is_dataclass_field,
+ ) in local_attributes_for_class():
+ if name in _include_dunders:
+ continue
+ elif is_dataclass_field and (
+ name not in clsdict_view or clsdict_view[name] is not obj
+ ):
+ # here, we are definitely looking at the target class
+ # and not a superclass. this is currently a
+ # dataclass-only path. if the name is only
+ # a dataclass field and isn't in local cls.__dict__,
+ # put the object there.
+ # assert that the dataclass-enabled resolver agrees
+ # with what we are seeing
+
+ assert not attribute_is_overridden(name, obj)
+
+ if _is_declarative_props(obj):
+ obj = obj.fget()
+
+ collected_attributes[name] = obj
+ self._collect_annotation(name, annotation, cls, False, obj)
+ else:
+ self._collect_annotation(name, annotation, cls, None, obj)
+ if name in clsdict_view:
+ collected_attributes[name] = obj
+
+
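The ``_UnmappedDataclassConfig`` scan above translates ORM attribute objects into ``dataclasses.field()`` constructs honoring their ``init``/``repr``/``default`` arguments. A minimal stdlib-only sketch of that translation idea follows; ``MappedColumnStub`` and ``collect_field_specs`` are hypothetical names for illustration, not SQLAlchemy API:

```python
import dataclasses
from typing import Optional


class MappedColumnStub:
    """Hypothetical stand-in for mapped_column(), recording the
    dataclass-level arguments it was given."""

    def __init__(self, *, init=True, repr=True, default=dataclasses.MISSING):
        self.init = init
        self.repr = repr
        self.default = default


def collect_field_specs(annotations, namespace):
    """Build (name, type[, field]) tuples for dataclasses.make_dataclass(),
    translating stub arguments into dataclasses.field() arguments."""
    specs = []
    for name, typ in annotations.items():
        obj = namespace.get(name)
        if isinstance(obj, MappedColumnStub):
            specs.append(
                (
                    name,
                    typ,
                    dataclasses.field(
                        init=obj.init, repr=obj.repr, default=obj.default
                    ),
                )
            )
        else:
            specs.append((name, typ))
    return specs


Demo = dataclasses.make_dataclass(
    "Demo",
    collect_field_specs(
        {"id": int, "data": str, "x": Optional[int]},
        {"id": MappedColumnStub(init=False), "x": MappedColumnStub(default=7)},
    ),
)
```

Because ``id`` is translated to ``field(init=False)``, it is excluded from ``__init__`` and does not trip the "non-default argument follows default argument" check, even though ``data`` has no default.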
@util.preload_module("sqlalchemy.orm.decl_api")
def _as_dc_declaredattr(
field_metadata: Mapping[str, Any], sa_dataclass_metadata_key: str
return obj
-class _DeferredMapperConfig(_ClassScanMapperConfig):
+class _DeferredDeclarativeConfig(_DeclarativeMapperConfig):
+ """Configurator that extends _DeclarativeMapperConfig to add a
+ "deferred" step, to allow extensions like AbstractConcreteBase,
+ DeferredMapping to partially set up a mapping that is "prepared"
+ when table metadata is ready.
+
+ """
+
_cls: weakref.ref[Type[Any]]
is_deferred = True
_configs: util.OrderedDict[
- weakref.ref[Type[Any]], _DeferredMapperConfig
+ weakref.ref[Type[Any]], _DeferredDeclarativeConfig
] = util.OrderedDict()
def _early_mapping(self, mapper_kw: _MapperKwArgs) -> None:
)
@classmethod
- def config_for_cls(cls, class_: Type[Any]) -> _DeferredMapperConfig:
+ def config_for_cls(cls, class_: Type[Any]) -> _DeferredDeclarativeConfig:
return cls._configs[weakref.ref(class_)]
@classmethod
def classes_for_base(
cls, base_cls: Type[Any], sort: bool = True
- ) -> List[_DeferredMapperConfig]:
+ ) -> List[_DeferredDeclarativeConfig]:
classes_for_base = [
m
for m, cls_ in [(m, m.cls) for m in cls._configs.values()]
all_m_by_cls = {m.cls: m for m in classes_for_base}
- tuples: List[Tuple[_DeferredMapperConfig, _DeferredMapperConfig]] = []
+ tuples: List[
+ Tuple[_DeferredDeclarativeConfig, _DeferredDeclarativeConfig]
+ ] = []
for m_cls in all_m_by_cls:
tuples.extend(
(all_m_by_cls[base_cls], all_m_by_cls[m_cls])
from .attributes import InstrumentedAttribute
from .attributes import QueryableAttribute
from .context import _ORMCompileState
- from .decl_base import _ClassScanMapperConfig
+ from .decl_base import _ClassScanAbstractConfig
+ from .decl_base import _DeclarativeMapperConfig
from .interfaces import _DataclassArguments
from .mapper import Mapper
from .properties import ColumnProperty
@util.preload_module("sqlalchemy.orm.properties")
def declarative_scan(
self,
- decl_scan: _ClassScanMapperConfig,
+ decl_scan: _DeclarativeMapperConfig,
registry: _RegistryType,
cls: Type[Any],
originating_module: Optional[str],
@util.preload_module("sqlalchemy.orm.decl_base")
def _setup_for_dataclass(
self,
- decl_scan: _ClassScanMapperConfig,
+ decl_scan: _DeclarativeMapperConfig,
registry: _RegistryType,
cls: Type[Any],
originating_module: Optional[str],
def _get_dataclass_setup_options(
self,
- decl_scan: _ClassScanMapperConfig,
+ decl_scan: _ClassScanAbstractConfig,
key: str,
dataclass_setup_arguments: _DataclassArguments,
+ enable_descriptor_defaults: bool,
) -> _AttributeOptions:
dataclasses_default = self._attribute_options.dataclasses_default
if (
dataclasses_default is not _NoArg.NO_ARG
and not callable(dataclasses_default)
+ and enable_descriptor_defaults
and not getattr(
decl_scan.cls, "_sa_disable_descriptor_defaults", False
)
from .context import _ORMCompileState
from .context import QueryContext
from .decl_api import RegistryType
- from .decl_base import _ClassScanMapperConfig
+ from .decl_base import _ClassScanAbstractConfig
+ from .decl_base import _DeclarativeMapperConfig
from .loading import _PopulatorDict
from .mapper import Mapper
from .path_registry import _AbstractEntityRegistry
def declarative_scan(
self,
- decl_scan: _ClassScanMapperConfig,
+ decl_scan: _DeclarativeMapperConfig,
registry: RegistryType,
cls: Type[Any],
originating_module: Optional[str],
@classmethod
def _get_arguments_for_make_dataclass(
cls,
- decl_scan: _ClassScanMapperConfig,
+ decl_scan: _ClassScanAbstractConfig,
key: str,
annotation: _AnnotationScanType,
mapped_container: Optional[Any],
elem: Any,
dataclass_setup_arguments: _DataclassArguments,
+ enable_descriptor_defaults: bool,
) -> Union[
Tuple[str, _AnnotationScanType],
- Tuple[str, _AnnotationScanType, dataclasses.Field[Any]],
+ Tuple[str, _AnnotationScanType, dataclasses.Field[Any] | None],
]:
"""given attribute key, annotation, and value from a class, return
the argument tuple we would pass to dataclasses.make_dataclass()
"""
if isinstance(elem, _DCAttributeOptions):
attribute_options = elem._get_dataclass_setup_options(
- decl_scan, key, dataclass_setup_arguments
+ decl_scan,
+ key,
+ dataclass_setup_arguments,
+ enable_descriptor_defaults,
)
dc_field = attribute_options._as_dataclass_field(
key, dataclass_setup_arguments
return (key, annotation, elem)
elif mapped_container is not None:
# it's Mapped[], but there's no "element", which means declarative
- # did not actually do anything for this field. this shouldn't
- # happen.
- # previously, this would occur because _scan_attributes would
- # skip a field that's on an already mapped superclass, but it
- # would still include it in the annotations, leading
- # to issue #8718
-
- assert False, "Mapped[] received without a mapping declaration"
+ # did not actually do anything for this field.
+ # prior to 2.1, this would never happen and we had a false
+ # assertion here, because the mapper _scan_attributes always
+ # generates a MappedColumn when one is not present
+ # (see issue #8718). However, in 2.1 we handle this case for the
+            # non-mapped dataclass use case without the need to generate a
+            # MappedColumn that would be thrown away anyway.
+ return (key, annotation)
else:
# plain dataclass field, not mapped. Is only possible
def _get_dataclass_setup_options(
self,
- decl_scan: _ClassScanMapperConfig,
+ decl_scan: _ClassScanAbstractConfig,
key: str,
dataclass_setup_arguments: _DataclassArguments,
+ enable_descriptor_defaults: bool,
) -> _AttributeOptions:
return self._attribute_options
def _get_dataclass_setup_options(
self,
- decl_scan: _ClassScanMapperConfig,
+ decl_scan: _ClassScanAbstractConfig,
key: str,
dataclass_setup_arguments: _DataclassArguments,
+ enable_descriptor_defaults: bool,
) -> _AttributeOptions:
- disable_descriptor_defaults = getattr(
- decl_scan.cls, "_sa_disable_descriptor_defaults", False
+ disable_descriptor_defaults = (
+ not enable_descriptor_defaults
+ or getattr(decl_scan.cls, "_sa_disable_descriptor_defaults", False)
)
+ if disable_descriptor_defaults:
+ return self._attribute_options
+
dataclasses_default = self._attribute_options.dataclasses_default
dataclasses_default_factory = (
self._attribute_options.dataclasses_default_factory
)
- if (
- dataclasses_default is not _NoArg.NO_ARG
- and not callable(dataclasses_default)
- and not disable_descriptor_defaults
+ if dataclasses_default is not _NoArg.NO_ARG and not callable(
+ dataclasses_default
):
self._default_scalar_value = (
self._attribute_options.dataclasses_default
elif (
self._disable_dataclass_default_factory
and dataclasses_default_factory is not _NoArg.NO_ARG
- and not disable_descriptor_defaults
):
return self._attribute_options._replace(
dataclasses_default=DONT_SET,
from ._typing import _ORMColumnExprArgument
from ._typing import _RegistryType
from .base import Mapped
- from .decl_base import _ClassScanMapperConfig
+ from .decl_base import _DeclarativeMapperConfig
from .mapper import Mapper
from .session import Session
from .state import _InstallLoaderCallableProto
def declarative_scan(
self,
- decl_scan: _ClassScanMapperConfig,
+ decl_scan: _DeclarativeMapperConfig,
registry: _RegistryType,
cls: Type[Any],
originating_module: Optional[str],
def _adjust_for_existing_column(
self,
- decl_scan: _ClassScanMapperConfig,
+ decl_scan: _DeclarativeMapperConfig,
key: str,
given_column: Column[_T],
) -> Column[_T]:
def declarative_scan(
self,
- decl_scan: _ClassScanMapperConfig,
+ decl_scan: _DeclarativeMapperConfig,
registry: _RegistryType,
cls: Type[Any],
originating_module: Optional[str],
@util.preload_module("sqlalchemy.orm.decl_base")
def declarative_scan_for_composite(
self,
- decl_scan: _ClassScanMapperConfig,
+ decl_scan: _DeclarativeMapperConfig,
registry: _RegistryType,
cls: Type[Any],
originating_module: Optional[str],
def _init_column_for_annotation(
self,
cls: Type[Any],
- decl_scan: _ClassScanMapperConfig,
+ decl_scan: _DeclarativeMapperConfig,
key: str,
registry: _RegistryType,
argument: _AnnotationScanType,
from .base import Mapped
from .clsregistry import _class_resolver
from .clsregistry import _ModNS
- from .decl_base import _ClassScanMapperConfig
+ from .decl_base import _DeclarativeMapperConfig
from .dependency import _DependencyProcessor
from .mapper import Mapper
from .query import Query
def declarative_scan(
self,
- decl_scan: _ClassScanMapperConfig,
+ decl_scan: _DeclarativeMapperConfig,
registry: _RegistryType,
cls: Type[Any],
originating_module: Optional[str],
from sqlalchemy.orm import mapped_column
from sqlalchemy.orm import relationship
from sqlalchemy.orm import Session
-from sqlalchemy.orm.decl_base import _DeferredMapperConfig
+from sqlalchemy.orm.decl_base import _DeferredDeclarativeConfig
from sqlalchemy.testing import assert_raises_message
from sqlalchemy.testing import eq_
from sqlalchemy.testing import expect_raises_message
def setup_test(self):
global Base, registry
- _DeferredMapperConfig._configs.clear()
+ _DeferredDeclarativeConfig._configs.clear()
registry = decl.registry()
Base = registry.generate_base()
class DeferredReflectBase(DeclarativeReflectionBase):
def teardown_test(self):
super().teardown_test()
- _DeferredMapperConfig._configs.clear()
+ _DeferredDeclarativeConfig._configs.clear()
Base = None
class Address(DeferredReflection, ComparableEntity, Base):
__tablename__ = "addresses"
- eq_(len(_DeferredMapperConfig._configs), 2)
+ eq_(len(_DeferredDeclarativeConfig._configs), 2)
del Address
gc_collect()
gc_collect()
- eq_(len(_DeferredMapperConfig._configs), 1)
+ eq_(len(_DeferredDeclarativeConfig._configs), 1)
DeferredReflection.prepare(testing.db)
gc_collect()
- assert not _DeferredMapperConfig._configs
+ assert not _DeferredDeclarativeConfig._configs
class DeferredSecondaryReflectionTest(DeferredReflectBase):
from sqlalchemy.orm.decl_api import add_mapped_attribute
from sqlalchemy.orm.decl_api import DeclarativeBaseNoMeta
from sqlalchemy.orm.decl_api import DeclarativeMeta
-from sqlalchemy.orm.decl_base import _DeferredMapperConfig
+from sqlalchemy.orm.decl_base import _DeferredDeclarativeConfig
from sqlalchemy.orm.events import InstrumentationEvents
from sqlalchemy.orm.events import MapperEvents
from sqlalchemy.schema import PrimaryKeyConstraint
@classmethod
def prepare(cls):
"sample prepare method"
- to_map = _DeferredMapperConfig.classes_for_base(cls)
+ to_map = _DeferredDeclarativeConfig.classes_for_base(cls)
for thingy in to_map:
thingy.map({})
import contextlib
import dataclasses
from dataclasses import InitVar
-import functools
import inspect as pyinspect
from itertools import product
from typing import Annotated
from sqlalchemy.orm import relationship
from sqlalchemy.orm import Session
from sqlalchemy.orm import synonym
+from sqlalchemy.orm import unmapped_dataclass
from sqlalchemy.orm.attributes import LoaderCallableStatus
+from sqlalchemy.orm.base import _DeclarativeMapped
+from sqlalchemy.orm.base import _is_mapped_class
from sqlalchemy.sql.base import _NoArg
from sqlalchemy.testing import AssertsCompiledSQL
from sqlalchemy.testing import eq_
from sqlalchemy.testing import is_true
from sqlalchemy.testing import ne_
from sqlalchemy.testing import Variation
-
-
-def _dataclass_mixin_warning(clsname, attrnames):
- return testing.expect_deprecated(
- rf"When transforming .* to a dataclass, attribute\(s\) "
- rf"{attrnames} originates from superclass .*{clsname}"
- )
+from sqlalchemy.util.typing import de_stringify_annotation
class DCTransformsTest(AssertsCompiledSQL, fixtures.TestBase):
foo: Mapped[str]
bar: Mapped[str] = mapped_column()
- with (
- _dataclass_mixin_warning(
- "_BaseMixin", "'create_user', 'update_user'"
- ),
- _dataclass_mixin_warning("SubMixin", "'foo', 'bar'"),
+ with testing.expect_raises_message(
+ exc.InvalidRequestError,
+ r"When transforming .* to a dataclass, attribute\(s\) "
+ r"'foo', 'bar' originates from superclass .*SubMixin",
):
class User(SubMixin, Base):
class DataclassesForNonMappedClassesTest(fixtures.TestBase):
- """test for cases added in #9179"""
+ """test for cases added in #9179 as well as #12854"""
+
+ @testing.variation("target", ["base", "mixin", "abstract"])
+ def test_unmapped_mixin_valid_dataclass(self, target: Variation):
+ """test new capability as of #12854. The MappedAsDataclass mixin
+ creates the dataclass taking into account the mapped_column() and
+ other objects with dataclass attributes
+
+ """
+
+ if target.abstract:
+
+ class Base(DeclarativeBase):
+ pass
+
+ # A is only valid as a dataclass if the init=False parameters
+            # are taken into account. This class was not possible in 2.0;
+            # in this version, we go through the declarative process, and
+            # due to __abstract__ the class is not mapped; MappedAsDataclass
+            # turns it into an unmapped dataclass
+ class A(MappedAsDataclass, Base):
+ __abstract__ = True
+
+ id: Mapped[int] = mapped_column(primary_key=True, init=False)
+ data: Mapped[str]
+
+ some_int: Mapped[int] = mapped_column(init=False, repr=False)
+
+ x: Mapped[int | None] = mapped_column(default=7)
+
+ class B(A):
+ __tablename__ = "a"
+
+ elif target.mixin:
+
+ class Base(DeclarativeBase):
+ pass
+
+ # A is only valid as a dataclass if the init=False parameters
+            # are taken into account. This class was not possible in 2.0;
+            # in this version, the class does not go through the declarative
+            # process, and MappedAsDataclass again turns it into an unmapped
+            # dataclass
+ class A(MappedAsDataclass):
+
+ id: Mapped[int] = mapped_column(primary_key=True, init=False)
+ data: Mapped[str]
+
+ some_int: Mapped[int] = mapped_column(init=False, repr=False)
+
+ x: Mapped[int | None] = mapped_column(default=7)
+
+ class B(Base, A):
+ __tablename__ = "a"
+
+ elif target.base:
+
+ class A:
+ pass
+
+ # works on the base class too
+ class Base(MappedAsDataclass, DeclarativeBase):
+ id: Mapped[int] = mapped_column(primary_key=True, init=False)
+ data: Mapped[str]
+
+ some_int: Mapped[int] = mapped_column(init=False, repr=False)
+
+ x: Mapped[int | None] = mapped_column(default=7)
+
+ class B(Base, A):
+ __tablename__ = "a"
+
+ else:
+ target.fail()
+
+ # mixin elements took effect as mapped columns
+ is_(B.__table__.primary_key.columns[0], B.__table__.c.id)
+ assert B.__table__.c.some_int.type._type_affinity is Integer
+
+ eq_regex(
+ repr(B(data="some data")), r".*B\(id=None, data='some data', x=7\)"
+ )
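The test above exercises why the scan matters: without translating ``mapped_column(init=False)`` into ``dataclasses.field(init=False)``, the attribute's assigned object is treated as a dataclass default, and a no-default field declared after it raises the classic ordering error. A stdlib-only sketch of the failure and the fix (``MappedColumnStub`` is a hypothetical stand-in):

```python
import dataclasses


class MappedColumnStub:
    """Hypothetical stand-in for mapped_column(); if left untranslated,
    the instance is seen by @dataclass as a default value."""


# untranslated: "id" appears to have a default, so the following
# no-default field "data" raises TypeError at class creation
try:

    @dataclasses.dataclass
    class Broken:
        id: int = MappedColumnStub()
        data: str

    raised = False
except TypeError:
    raised = True

# translated: init=False fields are excluded from __init__, so the
# ordering check passes and "data" may follow "id" with no default
@dataclasses.dataclass
class Fixed:
    id: int = dataclasses.field(init=False)
    data: str
```

This mirrors the "more natural ordering of fields without dataclass errors" behavior the changelog describes, under stdlib semantics only.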
def test_base_is_dc(self):
class Parent(MappedAsDataclass, DeclarativeBase):
eq_regex(repr(Child(a=5, b=6, c=7)), r".*\.Child\(c=7\)")
- # TODO: get this test to work with future anno mode as well
- # anno only: @testing.exclusions.closed("doesn't work for future annotations mode yet") # noqa: E501
+ @testing.variation("decl_type", ["decorator", "mixin"])
+ def test_non_dc_mixin_error(self, decl_type: Variation):
+ class Mixin:
+ create_user: Mapped[int] = mapped_column()
+ update_user: Mapped[Optional[int]] = mapped_column(
+ default=None, init=False
+ )
+
+ if decl_type.mixin:
+
+ class Base(MappedAsDataclass, DeclarativeBase):
+ pass
+
+ bases = (Mixin, Base)
+ elif decl_type.decorator:
+ bases = (Mixin,)
+
+ else:
+ decl_type.fail()
+
+ with testing.expect_raises_message(
+ exc.InvalidRequestError,
+ r"When transforming .* to a dataclass, attribute\(s\) "
+ r"'create_user', 'update_user' originates from superclass .*Mixin",
+ ):
+
+ class User(*bases):
+ __tablename__ = "sys_user"
+
+ uid: Mapped[str] = mapped_column(
+ String(50),
+ init=False,
+ default_factory=lambda: "x",
+ primary_key=True,
+ )
+ username: Mapped[str] = mapped_column()
+ email: Mapped[str] = mapped_column()
+
+ if decl_type.decorator:
+ reg = registry()
+ User = mapped_as_dataclass(reg)(User)
+
@testing.variation(
"dataclass_scope",
- ["on_base", "on_mixin", "on_base_class", "on_sub_class"],
+ ["on_base", "on_mixin", "on_base_class"],
)
@testing.variation(
"test_alternative_callable",
collected_annotations = {}
def check_args(cls, **kw):
- collected_annotations[cls] = getattr(
- cls, "__annotations__", {}
- )
+ # de-stringify annotations to serve the cases
+ # in test_tm_future_annotations_sync.py
+ collected_annotations[cls] = {
+ k: de_stringify_annotation(
+ cls, v, __name__, locals(), include_generic=True
+ )
+ for k, v in getattr(cls, "__annotations__", {}).items()
+ }
return dataclasses.dataclass(cls, **kw)
klass_kw = {"dataclass_callable": check_args}
expected_annotations[Mixin] = {}
- non_dc_mixin = contextlib.nullcontext
+ class Book(Mixin, Base, **klass_kw):
+ id: Mapped[int] = mapped_column(
+ Integer,
+ primary_key=True,
+ init=False,
+ )
- else:
+ elif dataclass_scope.on_base_class:
class Mixin:
@declared_attr.directive
"polymorphic_on": "polymorphic_type",
}
- if dataclass_scope.on_base or dataclass_scope.on_base_class:
+ class Book(MappedAsDataclass, Mixin, Base, **klass_kw):
+ polymorphic_type: Mapped[str] = mapped_column(
+ String,
+ insert_default="book",
+ init=False,
+ )
- @declared_attr
- @classmethod
- def polymorphic_type(cls) -> Mapped[str]:
- return mapped_column(
- String,
- insert_default=cls.__name__,
- init=False,
- )
+ id: Mapped[int] = mapped_column(
+ Integer,
+ primary_key=True,
+ init=False,
+ )
- else:
+ else:
- @declared_attr
- @classmethod
- def polymorphic_type(cls) -> Mapped[str]:
- return mapped_column(
- String,
- insert_default=cls.__name__,
- )
+ class Mixin:
+ @declared_attr.directive
+ @classmethod
+ def __tablename__(cls) -> str:
+ return cls.__name__.lower()
+
+ @declared_attr.directive
+ @classmethod
+ def __mapper_args__(cls) -> Dict[str, Any]:
+ return {
+ "polymorphic_identity": cls.__name__,
+ "polymorphic_on": "polymorphic_type",
+ }
+
+ class Book(Mixin, Base):
+ polymorphic_type: Mapped[str] = mapped_column(
+ String,
+ insert_default="book",
+ init=False,
+ )
+
+ id: Mapped[int] = mapped_column( # noqa: A001
+ Integer, primary_key=True
+ )
- non_dc_mixin = functools.partial(
- _dataclass_mixin_warning, "Mixin", "'polymorphic_type'"
+ if MappedAsDataclass in Book.__mro__:
+ expected_annotations[Book] = {"id": int, "polymorphic_type": str}
+
+ class Novel(Book):
+ id: Mapped[int] = mapped_column(
+ ForeignKey("book.id"),
+ primary_key=True,
+ init=False,
)
+ description: Mapped[Optional[str]]
- if dataclass_scope.on_base_class:
- with non_dc_mixin():
+ expected_annotations[Novel] = {"id": int, "description": Optional[str]}
- class Book(Mixin, MappedAsDataclass, Base, **klass_kw):
- id: Mapped[int] = mapped_column(
- Integer,
- primary_key=True,
- init=False,
- )
+ if test_alternative_callable:
+ eq_(collected_annotations, expected_annotations)
- else:
- if dataclass_scope.on_base:
- local_non_dc_mixin = non_dc_mixin
- else:
- local_non_dc_mixin = contextlib.nullcontext
+ # check that mixin worked
+ eq_(inspect(Book).polymorphic_identity, "Book")
+ eq_(inspect(Novel).polymorphic_identity, "Novel")
- with local_non_dc_mixin():
+ n1 = Novel("the description")
+ eq_(n1.description, "the description")
- class Book(Mixin, Base):
- if not dataclass_scope.on_sub_class:
- id: Mapped[int] = mapped_column( # noqa: A001
- Integer, primary_key=True, init=False
- )
- else:
- id: Mapped[int] = mapped_column( # noqa: A001
- Integer,
- primary_key=True,
- )
+ @testing.variation(
+ "test_alternative_callable",
+ [True, False],
+ )
+ def test_unmapped_decorator(
+ self, registry: _RegistryType, test_alternative_callable
+ ):
+ expected_annotations = {}
- if MappedAsDataclass in Book.__mro__:
- expected_annotations[Book] = {"id": int, "polymorphic_type": str}
+ dc_kw: dict[str, Any]
- if dataclass_scope.on_sub_class:
- with non_dc_mixin():
+ if test_alternative_callable:
+ collected_annotations = {}
- class Novel(MappedAsDataclass, Book, **klass_kw):
- id: Mapped[int] = mapped_column( # noqa: A001
- ForeignKey("book.id"),
- primary_key=True,
- init=False,
+ def check_args(cls, **kw):
+ # de-stringify annotations to serve the cases
+ # in test_tm_future_annotations_sync.py
+ collected_annotations[cls] = {
+ k: de_stringify_annotation(
+ cls, v, __name__, locals(), include_generic=True
)
- description: Mapped[Optional[str]]
+ for k, v in getattr(cls, "__annotations__", {}).items()
+ }
+ return dataclasses.dataclass(cls, **kw)
+ dc_kw = {"dataclass_callable": check_args}
else:
- with non_dc_mixin():
+ dc_kw = {}
- class Novel(Book):
- id: Mapped[int] = mapped_column(
- ForeignKey("book.id"),
- primary_key=True,
- init=False,
- )
- description: Mapped[Optional[str]]
+ @unmapped_dataclass(**dc_kw)
+ class Mixin:
+ @declared_attr.directive
+ @classmethod
+ def __tablename__(cls) -> str:
+ return cls.__name__.lower()
+
+ @declared_attr.directive
+ @classmethod
+ def __mapper_args__(cls) -> Dict[str, Any]:
+ return {
+ "polymorphic_identity": cls.__name__,
+ "polymorphic_on": "polymorphic_type",
+ }
+
+ @registry.mapped_as_dataclass(**dc_kw)
+ class Book(Mixin):
+ polymorphic_type: Mapped[str] = mapped_column(
+ String,
+ insert_default="book",
+ init=False,
+ )
+
+ id: Mapped[int] = mapped_column(
+ Integer,
+ primary_key=True,
+ init=False,
+ )
+ @registry.mapped_as_dataclass(**dc_kw)
+ class Novel(Book):
+ id: Mapped[int] = mapped_column(
+ ForeignKey("book.id"),
+ primary_key=True,
+ init=False,
+ )
+ description: Mapped[Optional[str]]
+
+ expected_annotations[Book] = {"id": int, "polymorphic_type": str}
expected_annotations[Novel] = {"id": int, "description": Optional[str]}
+ expected_annotations[Mixin] = {}
if test_alternative_callable:
eq_(collected_annotations, expected_annotations)
+ # check that mixin worked
+ eq_(inspect(Book).polymorphic_identity, "Book")
+ eq_(inspect(Novel).polymorphic_identity, "Novel")
+
n1 = Novel("the description")
eq_(n1.description, "the description")
a1 = create("some data", 15)
some_int = a1.some_int
+
+ if not _is_mapped_class(cls):
+ a1.id = None
+ a1.some_int = some_int = 10
+
eq_(
dataclasses.asdict(a1),
{"data": "some data", "id": None, "some_int": some_int, "x": 15},
def _assert_repr(self, cls, create, dc_arguments):
assert "__repr__" in cls.__dict__
a1 = create("some data", 12)
- eq_regex(repr(a1), r".*A\(id=None, data='some data', x=12\)")
+
+ if _is_mapped_class(cls):
+ eq_regex(repr(a1), r".*A\(id=None, data='some data', x=12\)")
+ else:
+ eq_regex(
+ repr(a1), r".*A\(id=.*MappedColumn.*, data='some data', x=12\)"
+ )
def _assert_not_repr(self, cls, create, dc_arguments):
assert "__repr__" not in cls.__dict__
with expect_raises(TypeError):
cls("Some data", 5)
+ if not _is_mapped_class(cls):
+ # for an unmapped dataclass, assert we can construct it
+ a1 = cls()
+
+ # then it has no "data" attribute
+ assert not hasattr(a1, "data")
+
+ # dataclass defaults don't work because we necessarily restored
+            # the mapped_column() / column_property() / etc. constructs
+ assert isinstance(a1.x, _DeclarativeMapped)
+
+ return
+
# behavior change in 2.1, even if init=False we set descriptor
# defaults
-
a1 = cls(data="some data")
eq_(a1.data, "some data")
)
eq_(fas.kwonlyargs, [])
- @testing.variation("decorator_type", ["fn", "method"])
+ @testing.variation("decorator_type", ["unmapped", "fn", "method"])
def test_dc_arguments_decorator(
self,
dc_argument_fixture,
registry: _RegistryType,
decorator_type,
):
- if decorator_type.fn:
+ if decorator_type.unmapped:
+ dec = unmapped_dataclass(**dc_argument_fixture[0])
+ elif decorator_type.fn:
dec = mapped_as_dataclass(registry, **dc_argument_fixture[0])
else:
dec = registry.mapped_as_dataclass(**dc_argument_fixture[0])
import contextlib
import dataclasses
from dataclasses import InitVar
-import functools
import inspect as pyinspect
from itertools import product
from typing import Annotated
from sqlalchemy.orm import relationship
from sqlalchemy.orm import Session
from sqlalchemy.orm import synonym
+from sqlalchemy.orm import unmapped_dataclass
from sqlalchemy.orm.attributes import LoaderCallableStatus
+from sqlalchemy.orm.base import _DeclarativeMapped
+from sqlalchemy.orm.base import _is_mapped_class
from sqlalchemy.sql.base import _NoArg
from sqlalchemy.testing import AssertsCompiledSQL
from sqlalchemy.testing import eq_
from sqlalchemy.testing import is_true
from sqlalchemy.testing import ne_
from sqlalchemy.testing import Variation
-
-
-def _dataclass_mixin_warning(clsname, attrnames):
- return testing.expect_deprecated(
- rf"When transforming .* to a dataclass, attribute\(s\) "
- rf"{attrnames} originates from superclass .*{clsname}"
- )
+from sqlalchemy.util.typing import de_stringify_annotation
class DCTransformsTest(AssertsCompiledSQL, fixtures.TestBase):
foo: Mapped[str]
bar: Mapped[str] = mapped_column()
- with (
- _dataclass_mixin_warning(
- "_BaseMixin", "'create_user', 'update_user'"
- ),
- _dataclass_mixin_warning("SubMixin", "'foo', 'bar'"),
+ with testing.expect_raises_message(
+ exc.InvalidRequestError,
+ r"When transforming .* to a dataclass, attribute\(s\) "
+ r"'foo', 'bar' originates from superclass .*SubMixin",
):
class User(SubMixin, Base):
class DataclassesForNonMappedClassesTest(fixtures.TestBase):
- """test for cases added in #9179"""
+ """test for cases added in #9179 as well as #12854"""
+
+ @testing.variation("target", ["base", "mixin", "abstract"])
+ def test_unmapped_mixin_valid_dataclass(self, target: Variation):
+ """test new capability as of #12854. The MappedAsDataclass mixin
+ creates the dataclass taking into account the mapped_column() and
+ other objects with dataclass attributes
+
+ """
+
+ if target.abstract:
+
+ class Base(DeclarativeBase):
+ pass
+
+ # A is only valid as a dataclass if the init=False parameters
+            # are taken into account. This class was not possible in 2.0;
+            # in this version, we go through the declarative process, and
+            # due to __abstract__ the class is not mapped; MappedAsDataclass
+            # turns it into an unmapped dataclass
+ class A(MappedAsDataclass, Base):
+ __abstract__ = True
+
+ id: Mapped[int] = mapped_column(primary_key=True, init=False)
+ data: Mapped[str]
+
+ some_int: Mapped[int] = mapped_column(init=False, repr=False)
+
+ x: Mapped[int | None] = mapped_column(default=7)
+
+ class B(A):
+ __tablename__ = "a"
+
+ elif target.mixin:
+
+ class Base(DeclarativeBase):
+ pass
+
+ # A is only valid as a dataclass if the init=False parameters
+            # are taken into account. This class was not possible in 2.0;
+            # in this version, the class does not go through the declarative
+            # process, and MappedAsDataclass again turns it into an unmapped
+            # dataclass
+ class A(MappedAsDataclass):
+
+ id: Mapped[int] = mapped_column(primary_key=True, init=False)
+ data: Mapped[str]
+
+ some_int: Mapped[int] = mapped_column(init=False, repr=False)
+
+ x: Mapped[int | None] = mapped_column(default=7)
+
+ class B(Base, A):
+ __tablename__ = "a"
+
+ elif target.base:
+
+ class A:
+ pass
+
+ # works on the base class too
+ class Base(MappedAsDataclass, DeclarativeBase):
+ id: Mapped[int] = mapped_column(primary_key=True, init=False)
+ data: Mapped[str]
+
+ some_int: Mapped[int] = mapped_column(init=False, repr=False)
+
+ x: Mapped[int | None] = mapped_column(default=7)
+
+ class B(Base, A):
+ __tablename__ = "a"
+
+ else:
+ target.fail()
+
+ # mixin elements took effect as mapped columns
+ is_(B.__table__.primary_key.columns[0], B.__table__.c.id)
+ assert B.__table__.c.some_int.type._type_affinity is Integer
+
+ eq_regex(
+ repr(B(data="some data")), r".*B\(id=None, data='some data', x=7\)"
+ )
def test_base_is_dc(self):
class Parent(MappedAsDataclass, DeclarativeBase):
eq_regex(repr(Child(a=5, b=6, c=7)), r".*\.Child\(c=7\)")
- # TODO: get this test to work with future anno mode as well
- @testing.exclusions.closed(
- "doesn't work for future annotations mode yet"
- ) # noqa: E501
+ @testing.variation("decl_type", ["decorator", "mixin"])
+ def test_non_dc_mixin_error(self, decl_type: Variation):
+ class Mixin:
+ create_user: Mapped[int] = mapped_column()
+ update_user: Mapped[Optional[int]] = mapped_column(
+ default=None, init=False
+ )
+
+ if decl_type.mixin:
+
+ class Base(MappedAsDataclass, DeclarativeBase):
+ pass
+
+ bases = (Mixin, Base)
+ elif decl_type.decorator:
+ bases = (Mixin,)
+
+ else:
+ decl_type.fail()
+
+ with testing.expect_raises_message(
+ exc.InvalidRequestError,
+ r"When transforming .* to a dataclass, attribute\(s\) "
+ r"'create_user', 'update_user' originates from superclass .*Mixin",
+ ):
+
+ class User(*bases):
+ __tablename__ = "sys_user"
+
+ uid: Mapped[str] = mapped_column(
+ String(50),
+ init=False,
+ default_factory=lambda: "x",
+ primary_key=True,
+ )
+ username: Mapped[str] = mapped_column()
+ email: Mapped[str] = mapped_column()
+
+ if decl_type.decorator:
+ reg = registry()
+ User = mapped_as_dataclass(reg)(User)
+
@testing.variation(
"dataclass_scope",
- ["on_base", "on_mixin", "on_base_class", "on_sub_class"],
+ ["on_base", "on_mixin", "on_base_class"],
)
@testing.variation(
"test_alternative_callable",
collected_annotations = {}
def check_args(cls, **kw):
- collected_annotations[cls] = getattr(
- cls, "__annotations__", {}
- )
+ # de-stringify annotations to serve the cases
+ # in test_tm_future_annotations_sync.py
+ collected_annotations[cls] = {
+ k: de_stringify_annotation(
+ cls, v, __name__, locals(), include_generic=True
+ )
+ for k, v in getattr(cls, "__annotations__", {}).items()
+ }
return dataclasses.dataclass(cls, **kw)
klass_kw = {"dataclass_callable": check_args}
expected_annotations[Mixin] = {}
- non_dc_mixin = contextlib.nullcontext
+ class Book(Mixin, Base, **klass_kw):
+ id: Mapped[int] = mapped_column(
+ Integer,
+ primary_key=True,
+ init=False,
+ )
- else:
+ elif dataclass_scope.on_base_class:
class Mixin:
@declared_attr.directive
"polymorphic_on": "polymorphic_type",
}
- if dataclass_scope.on_base or dataclass_scope.on_base_class:
+ class Book(MappedAsDataclass, Mixin, Base, **klass_kw):
+ polymorphic_type: Mapped[str] = mapped_column(
+ String,
+ insert_default="book",
+ init=False,
+ )
- @declared_attr
- @classmethod
- def polymorphic_type(cls) -> Mapped[str]:
- return mapped_column(
- String,
- insert_default=cls.__name__,
- init=False,
- )
+ id: Mapped[int] = mapped_column(
+ Integer,
+ primary_key=True,
+ init=False,
+ )
- else:
+ else:
- @declared_attr
- @classmethod
- def polymorphic_type(cls) -> Mapped[str]:
- return mapped_column(
- String,
- insert_default=cls.__name__,
- )
+ class Mixin:
+ @declared_attr.directive
+ @classmethod
+ def __tablename__(cls) -> str:
+ return cls.__name__.lower()
- non_dc_mixin = functools.partial(
- _dataclass_mixin_warning, "Mixin", "'polymorphic_type'"
+ @declared_attr.directive
+ @classmethod
+ def __mapper_args__(cls) -> Dict[str, Any]:
+ return {
+ "polymorphic_identity": cls.__name__,
+ "polymorphic_on": "polymorphic_type",
+ }
+
+ class Book(Mixin, Base):
+ polymorphic_type: Mapped[str] = mapped_column(
+ String,
+ insert_default="book",
+ init=False,
+ )
+
+ id: Mapped[int] = mapped_column( # noqa: A001
+ Integer, primary_key=True
+ )
+
+ if MappedAsDataclass in Book.__mro__:
+ expected_annotations[Book] = {"id": int, "polymorphic_type": str}
+
+ class Novel(Book):
+ id: Mapped[int] = mapped_column(
+ ForeignKey("book.id"),
+ primary_key=True,
+ init=False,
)
+ description: Mapped[Optional[str]]
- if dataclass_scope.on_base_class:
- with non_dc_mixin():
+ expected_annotations[Novel] = {"id": int, "description": Optional[str]}
- class Book(Mixin, MappedAsDataclass, Base, **klass_kw):
- id: Mapped[int] = mapped_column(
- Integer,
- primary_key=True,
- init=False,
- )
+ if test_alternative_callable:
+ eq_(collected_annotations, expected_annotations)
- else:
- if dataclass_scope.on_base:
- local_non_dc_mixin = non_dc_mixin
- else:
- local_non_dc_mixin = contextlib.nullcontext
+ # check that mixin worked
+ eq_(inspect(Book).polymorphic_identity, "Book")
+ eq_(inspect(Novel).polymorphic_identity, "Novel")
- with local_non_dc_mixin():
+ n1 = Novel("the description")
+ eq_(n1.description, "the description")
- class Book(Mixin, Base):
- if not dataclass_scope.on_sub_class:
- id: Mapped[int] = mapped_column( # noqa: A001
- Integer, primary_key=True, init=False
- )
- else:
- id: Mapped[int] = mapped_column( # noqa: A001
- Integer,
- primary_key=True,
- )
+ @testing.variation(
+ "test_alternative_callable",
+ [True, False],
+ )
+ def test_unmapped_decorator(
+ self, registry: _RegistryType, test_alternative_callable
+ ):
+ expected_annotations = {}
- if MappedAsDataclass in Book.__mro__:
- expected_annotations[Book] = {"id": int, "polymorphic_type": str}
+ dc_kw: dict[str, Any]
- if dataclass_scope.on_sub_class:
- with non_dc_mixin():
+ if test_alternative_callable:
+ collected_annotations = {}
- class Novel(MappedAsDataclass, Book, **klass_kw):
- id: Mapped[int] = mapped_column( # noqa: A001
- ForeignKey("book.id"),
- primary_key=True,
- init=False,
+ def check_args(cls, **kw):
+ # de-stringify annotations to serve the cases
+ # in test_tm_future_annotations_sync.py
+ collected_annotations[cls] = {
+ k: de_stringify_annotation(
+ cls, v, __name__, locals(), include_generic=True
)
- description: Mapped[Optional[str]]
+ for k, v in getattr(cls, "__annotations__", {}).items()
+ }
+ return dataclasses.dataclass(cls, **kw)
+ dc_kw = {"dataclass_callable": check_args}
else:
- with non_dc_mixin():
+ dc_kw = {}
- class Novel(Book):
- id: Mapped[int] = mapped_column(
- ForeignKey("book.id"),
- primary_key=True,
- init=False,
- )
- description: Mapped[Optional[str]]
+ @unmapped_dataclass(**dc_kw)
+ class Mixin:
+ @declared_attr.directive
+ @classmethod
+ def __tablename__(cls) -> str:
+ return cls.__name__.lower()
+
+ @declared_attr.directive
+ @classmethod
+ def __mapper_args__(cls) -> Dict[str, Any]:
+ return {
+ "polymorphic_identity": cls.__name__,
+ "polymorphic_on": "polymorphic_type",
+ }
+
+ @registry.mapped_as_dataclass(**dc_kw)
+ class Book(Mixin):
+ polymorphic_type: Mapped[str] = mapped_column(
+ String,
+ insert_default="book",
+ init=False,
+ )
+
+ id: Mapped[int] = mapped_column(
+ Integer,
+ primary_key=True,
+ init=False,
+ )
+
+ @registry.mapped_as_dataclass(**dc_kw)
+ class Novel(Book):
+ id: Mapped[int] = mapped_column(
+ ForeignKey("book.id"),
+ primary_key=True,
+ init=False,
+ )
+ description: Mapped[Optional[str]]
+
+ expected_annotations[Book] = {"id": int, "polymorphic_type": str}
expected_annotations[Novel] = {"id": int, "description": Optional[str]}
+ expected_annotations[Mixin] = {}
if test_alternative_callable:
eq_(collected_annotations, expected_annotations)
+ # check that mixin worked
+ eq_(inspect(Book).polymorphic_identity, "Book")
+ eq_(inspect(Novel).polymorphic_identity, "Novel")
+
n1 = Novel("the description")
eq_(n1.description, "the description")
a1 = create("some data", 15)
some_int = a1.some_int
+
+ if not _is_mapped_class(cls):
+ a1.id = None
+ a1.some_int = some_int = 10
+
eq_(
dataclasses.asdict(a1),
{"data": "some data", "id": None, "some_int": some_int, "x": 15},
def _assert_repr(self, cls, create, dc_arguments):
assert "__repr__" in cls.__dict__
a1 = create("some data", 12)
- eq_regex(repr(a1), r".*A\(id=None, data='some data', x=12\)")
+
+ if _is_mapped_class(cls):
+ eq_regex(repr(a1), r".*A\(id=None, data='some data', x=12\)")
+ else:
+ eq_regex(
+ repr(a1), r".*A\(id=.*MappedColumn.*, data='some data', x=12\)"
+ )
def _assert_not_repr(self, cls, create, dc_arguments):
assert "__repr__" not in cls.__dict__
with expect_raises(TypeError):
cls("Some data", 5)
+ if not _is_mapped_class(cls):
+ # for an unmapped dataclass, assert we can construct it
+ a1 = cls()
+
+ # then it has no "data" attribute
+ assert not hasattr(a1, "data")
+
+ # dataclass defaults don't take effect because the MappedColumn /
+ # column_property() / etc. constructs were necessarily restored
+ # onto the class
+ assert isinstance(a1.x, _DeclarativeMapped)
+
+ return
+
# behavior change in 2.1: even with init=False, descriptor-level
# defaults are set
-
a1 = cls(data="some data")
eq_(a1.data, "some data")
)
eq_(fas.kwonlyargs, [])
- @testing.variation("decorator_type", ["fn", "method"])
+ @testing.variation("decorator_type", ["unmapped", "fn", "method"])
def test_dc_arguments_decorator(
self,
dc_argument_fixture,
registry: _RegistryType,
decorator_type,
):
- if decorator_type.fn:
+ if decorator_type.unmapped:
+ dec = unmapped_dataclass(**dc_argument_fixture[0])
+ elif decorator_type.fn:
dec = mapped_as_dataclass(registry, **dc_argument_fixture[0])
else:
dec = registry.mapped_as_dataclass(**dc_argument_fixture[0])
--- /dev/null
+from sqlalchemy import Integer
+from sqlalchemy.orm import Mapped
+from sqlalchemy.orm import mapped_as_dataclass
+from sqlalchemy.orm import mapped_column
+from sqlalchemy.orm import registry
+from sqlalchemy.orm import unmapped_dataclass
+
+
+@unmapped_dataclass(kw_only=True)
+class DataModel:
+ pass
+
+
+@unmapped_dataclass(init=False, kw_only=True)
+class RelationshipsModel(DataModel):
+ __tablename__ = "relationships"
+
+ entity_id1: Mapped[int] = mapped_column(primary_key=True)
+ entity_id2: Mapped[int] = mapped_column(primary_key=True)
+
+
+some_target_tables_registry = registry()
+
+
+@mapped_as_dataclass(some_target_tables_registry)
+class Relationships(RelationshipsModel):
+ im_going_to_be_mapped = True
+ level: Mapped[int] = mapped_column(Integer)
+
+
+# note init=True is implicit on Relationships
+# (the type checker infers this; it is not set at runtime)
+rs = Relationships(entity_id1=1, entity_id2=2, level=1)
+
+# EXPECTED_TYPE: int
+reveal_type(rs.entity_id1)
+
+# EXPECTED_TYPE: int
+reveal_type(rs.level)