de-associated from any of its orphan-enabled parents. Previously,
the pending object would be expunged only if de-associated
from all of its orphan-enabled parents. The new flag ``legacy_is_orphan``
- is added to :func:`_orm.mapper` which re-establishes the
+ is added to :class:`_orm.Mapper` which re-establishes the
legacy behavior.
See the change note and example case at :ref:`legacy_is_orphan_addition`
the full "instrumentation manager" for a class before it was mapped
for the purpose of the new ``@declared_attr`` features
described in :ref:`feature_3150`, but the change was also made
- against the classical use of :func:`.mapper` for consistency.
+ against the classical use of :class:`_orm.Mapper` for consistency.
However, SQLSoup relies upon the instrumentation event happening
before any instrumentation under classical mapping.
The behavior is reverted in the case of classical and declarative
Fixed ORM unit of work regression where an errant "assert primary_key"
statement interferes with primary key generation sequences that don't
actually consider the columns in the table to use a real primary key
- constraint, instead using :paramref:`_orm.mapper.primary_key` to establish
+ constraint, instead using :paramref:`_orm.Mapper.primary_key` to establish
certain columns as "primary".
.. change::
Relationship to AliasedClass replaces the need for non primary mappers
-----------------------------------------------------------------------
-The "non primary mapper" is a :func:`.mapper` created in the
+The "non primary mapper" is a :class:`_orm.Mapper` created in the
:ref:`classical_mapping` style, which acts as an additional mapper against an
already mapped class against a different kind of selectable. The non primary
mapper has its roots in the 0.1, 0.2 series of SQLAlchemy where it was
-anticipated that the :func:`.mapper` object was to be the primary query
+anticipated that the :class:`_orm.Mapper` object was to be the primary query
construction interface, before the :class:`_query.Query` object existed.
With the advent of :class:`_query.Query` and later the :class:`.AliasedClass`
* Using :meth:`_orm.registry.map_imperatively`
* :ref:`orm_imperative_dataclasses`
-The existing classical mapping function :func:`_orm.mapper` remains, however
-it is deprecated to call upon :func:`_orm.mapper` directly; the new
-:meth:`_orm.registry.map_imperatively` method now routes the request through
-the :meth:`_orm.registry` so that it integrates with other declarative mappings
-unambiguously.
+The existing classical mapping function ``sqlalchemy.orm.mapper()`` remains,
+however it is deprecated to call upon ``sqlalchemy.orm.mapper()`` directly; the
+new :meth:`_orm.registry.map_imperatively` method now routes the request
+through the :class:`_orm.registry` so that it integrates with other declarative
+mappings unambiguously.
The new approach interoperates with 3rd party class instrumentation systems
which necessarily must take place on the class before the mapping process
**Synopsis**
-The :func:`_orm.mapper` function moves behind the scenes to be invoked
-by higher level APIs. The new version of this function is the method
+The ``sqlalchemy.orm.mapper()`` standalone function moves behind the scenes to
+be invoked by higher level APIs. The new version of this function is the method
:meth:`_orm.registry.map_imperatively` taken from a :class:`_orm.registry`
object.
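As a sketch of the new calling form, using hypothetical ``User`` and
``user_table`` names:

```python
from sqlalchemy import Column, Integer, String, Table, inspect
from sqlalchemy.orm import registry

mapper_registry = registry()

user_table = Table(
    "user",
    mapper_registry.metadata,
    Column("id", Integer, primary_key=True),
    Column("name", String(50)),
)


class User:
    """An ordinary class, mapped imperatively below."""


# 1.x style: mapper(User, user_table); 2.0 routes through the registry
mapper_registry.map_imperatively(User, user_table)

# the registry produced a Mapper for the class, available via inspect()
user_mapper = inspect(User)
```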
:tickets: 7257
Migrated the codebase to remove all pre-2.0 behaviors and architectures
- that were previously noted as deprecated for removal in 2.0, including:
+ that were previously noted as deprecated for removal in 2.0, including,
+ but not limited to:
* removal of all Python 2 code, minimum version is now Python 3.7
:class:`.Table`, and from all DDL/DML/DQL elements that previously could
refer to a "bound engine"
+ * The standalone ``sqlalchemy.orm.mapper()`` function is removed; all
+ classical mapping should be done through the
+ :meth:`_orm.registry.map_imperatively` method of :class:`_orm.registry`.
+
* The :meth:`_orm.Query.join` method no longer accepts strings for
relationship names; the long-documented approach of using
``Class.attrname`` for join targets is now standard.
* ``Query.from_self()``, ``Query.select_entity_from()`` and
``Query.with_polymorphic()`` are removed.
+ * The :paramref:`_orm.relationship.cascade_backrefs` parameter must now
+ remain at its new default of ``False``; the ``save-update`` cascade
+ no longer cascades along a backref.
+
+ * The :paramref:`_orm.Session.future` parameter must always be set to
+ ``True``. 2.0-style transactional patterns for :class:`_orm.Session`
+ are now always in effect.
+
* Loader options no longer accept strings for attribute names. The
long-documented approach of using ``Class.attrname`` for loader option
targets is now standard.
``select([cols])``, the "whereclause" and keyword parameters of
``some_table.select()``.
- * More are in progress as development continues
+ * Legacy "in-place mutator" methods on :class:`_sql.Select` such as
+ ``append_whereclause()``, ``append_order_by()``, etc. are removed.
+
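For example, the attribute-based calling style for join targets and loader
options may be sketched as follows, using a hypothetical ``User``/``Address``
mapping:

```python
from sqlalchemy import Column, ForeignKey, Integer, String, select
from sqlalchemy.orm import declarative_base, joinedload, relationship

Base = declarative_base()


class User(Base):
    __tablename__ = "user_account"
    id = Column(Integer, primary_key=True)
    name = Column(String(50))
    addresses = relationship("Address")


class Address(Base):
    __tablename__ = "address"
    id = Column(Integer, primary_key=True)
    user_id = Column(Integer, ForeignKey("user_account.id"))


# join targets and loader option targets use the class-bound attribute,
# never the string name "addresses"
stmt = select(User).join(User.addresses)
opts = select(User).options(joinedload(User.addresses))
```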
:ref:`faq_new_caching` - in the :ref:`faq_toplevel` section
-.. _error_s9r1:
-
-Object is being merged into a Session along the backref cascade
----------------------------------------------------------------
-
-This message refers to the "backref cascade" behavior of SQLAlchemy,
-which is described at :ref:`backref_cascade`. This refers to the action of
-an object being added into a :class:`_orm.Session` as a result of another
-object that's already present in that session being associated with it.
-As this behavior has been shown to be more confusing than helpful,
-the :paramref:`_orm.relationship.cascade_backrefs` and
-:paramref:`_orm.backref.cascade_backrefs` parameters were added, which can
-be set to ``False`` to disable it, and in SQLAlchemy 2.0 the "cascade backrefs"
-behavior will be disabled completely.
-
-To set :paramref:`_orm.relationship.cascade_backrefs` to ``False`` on a
-backref that is currently configured using the
-:paramref:`_orm.relationship.backref` string parameter, the backref must
-be declared using the :func:`_orm.backref` function first so that the
-:paramref:`_orm.backref.cascade_backrefs` parameter may be passed.
-
-Alternatively, the entire "cascade backrefs" behavior can be turned off
-across the board by using the :class:`_orm.Session` in "future" mode,
-by passing ``True`` for the :paramref:`_orm.Session.future` parameter.
-
-.. seealso::
-
- :ref:`backref_cascade` - complete description of the cascade backrefs
- behavior
-
- :ref:`change_5150` - background on the change for SQLAlchemy 2.0.
-
.. _error_xaj1:
An alias is being generated automatically for raw clauseelement
:paramref:`_orm.Session.future` flag is set on :class:`_orm.sessionmaker`
or :class:`_orm.Session`.
+.. _error_s9r1:
+
+Object is being merged into a Session along the backref cascade
+---------------------------------------------------------------
+
+This message refers to the "backref cascade" behavior of SQLAlchemy,
+removed in version 2.0. This refers to the action of
+an object being added into a :class:`_orm.Session` as a result of another
+object that's already present in that session being associated with it.
+As this behavior has been shown to be more confusing than helpful,
+the :paramref:`_orm.relationship.cascade_backrefs` and
+:paramref:`_orm.backref.cascade_backrefs` parameters were added, which can
+be set to ``False`` to disable it, and in SQLAlchemy 2.0 the "cascade backrefs"
+behavior has been removed entirely.
+
+For older SQLAlchemy versions, to set
+:paramref:`_orm.relationship.cascade_backrefs` to ``False`` on a backref that
+is currently configured using the :paramref:`_orm.relationship.backref` string
+parameter, the backref must be declared using the :func:`_orm.backref` function
+first so that the :paramref:`_orm.backref.cascade_backrefs` parameter may be
+passed.
+
+Alternatively, under SQLAlchemy 1.4 the entire "cascade backrefs" behavior can
+be turned off across the board by using the :class:`_orm.Session` in "future"
+mode, by passing ``True`` for the :paramref:`_orm.Session.future` parameter.
+
+.. seealso::
+
+ :ref:`change_5150` - background on the change for SQLAlchemy 2.0.
+
Rows returned by an outer join may contain NULL for part of the primary key,
as the primary key is the composite of both tables. The :class:`_query.Query` object ignores incoming rows
that don't have an acceptable primary key. Based on the setting of the ``allow_partial_pks``
-flag on :func:`.mapper`, a primary key is accepted if the value has at least one non-NULL
+flag on :class:`_orm.Mapper`, a primary key is accepted if the value has at least one non-NULL
value, or alternatively if the value has no NULL values. See ``allow_partial_pks``
-at :func:`.mapper`.
+at :class:`_orm.Mapper`.
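As an illustration, the flag may be set through ``__mapper_args__``; a minimal
sketch with a hypothetical composite-primary-key mapping:

```python
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class Measurement(Base):
    __tablename__ = "measurement"
    station_id = Column(Integer, primary_key=True)
    metric = Column(String(20), primary_key=True)

    # require every column of the composite primary key to be non-NULL
    # for an incoming row to be accepted as an identity
    __mapper_args__ = {"allow_partial_pks": False}
```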
I'm using ``joinedload()`` or ``lazy=False`` to create a JOIN/OUTER JOIN and SQLAlchemy is not constructing the correct query when I try to add a WHERE, ORDER BY, LIMIT, etc. (which relies upon the (OUTER) JOIN)
mapping
mapped
mapped class
- We say a class is "mapped" when it has been passed through the
- :func:`_orm.mapper` function. This process associates the
- class with a database table or other :term:`selectable`
- construct, so that instances of it can be persisted
- and loaded using a :class:`.Session`.
+ We say a class is "mapped" when it has been associated with an
+ instance of the :class:`_orm.Mapper` class. This process associates
+ the class with a database table or other :term:`selectable` construct,
+ so that instances of it can be persisted and loaded using a
+ :class:`.Session`.
.. seealso::
it into a form that is interpreted by the receiving :func:`_orm.relationship` as additional
arguments to be applied to the new relationship it creates.
-Setting cascade for backrefs
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-A key behavior that occurs in the 1.x series of SQLAlchemy regarding backrefs
-is that :ref:`cascades <unitofwork_cascades>` will occur bidirectionally by
-default. This basically means, if one starts with an ``User`` object
-that's been persisted in the :class:`.Session`::
-
- user = session.query(User).filter(User.id == 1).first()
-
-The above ``User`` is :term:`persistent` in the :class:`.Session`. It usually
-is intuitive that if we create an ``Address`` object and append to the
-``User.addresses`` collection, it is automatically added to the
-:class:`.Session` as in the example below::
-
- user = session.query(User).filter(User.id == 1).first()
- address = Address(email_address='foo')
- user.addresses.append(address)
-
-The above behavior is known as the "save update cascade" and is described
-in the section :ref:`unitofwork_cascades`.
-
-However, if we instead created a new ``Address`` object, and associated the
-``User`` object with the ``Address`` as follows::
-
- address = Address(email_address='foo', user=user)
-
-In the above example, it is **not** as intuitive that the ``Address`` would
-automatically be added to the :class:`.Session`. However, the backref behavior
-of ``Address.user`` indicates that the ``Address`` object is also appended to
-the ``User.addresses`` collection. This in turn initiates a **cascade**
-operation which indicates that this ``Address`` should be placed into the
-:class:`.Session` as a :term:`pending` object.
-
-Since this behavior has been identified as counter-intuitive to most people,
-it can be disabled by setting :paramref:`_orm.relationship.cascade_backrefs`
-to False, as in::
-
-
- class User(Base):
- # ...
-
- addresses = relationship("Address", back_populates="user", cascade_backrefs=False)
-
-See the example in :ref:`backref_cascade` for further information.
-
-.. seealso::
-
- :ref:`backref_cascade`.
One Way Backrefs
that :class:`.Session` at once. While it can be disabled, there
is usually not a need to do so.
-One case where ``save-update`` cascade does sometimes get in the way is in that
-it takes place in both directions for bi-directional relationships, e.g.
-backrefs, meaning that the association of a child object with a particular parent
-can have the effect of the parent object being implicitly associated with that
-child object's :class:`.Session`; this pattern, as well as how to modify its
-behavior using the :paramref:`_orm.relationship.cascade_backrefs` flag,
-is discussed in the section :ref:`backref_cascade`.
.. _cascade_delete:
from the :class:`.Session` using :meth:`.Session.expunge`, the
operation should be propagated down to referred objects.
-.. _backref_cascade:
-
-Controlling Cascade on Backrefs
--------------------------------
-
-.. note:: This section applies to a behavior that is removed in SQLAlchemy 2.0.
- By setting the :paramref:`_orm.Session.future` flag on a given
- :class:`_orm.Session`, the 2.0 behavior will be achieved which is
- essentially that the :paramref:`_orm.relationship.cascade_backrefs` flag is
- ignored. See the section :ref:`change_5150` for notes.
-
-In :term:`1.x style` ORM usage, the :ref:`cascade_save_update` cascade by
-default takes place on attribute change events emitted from backrefs. This is
-probably a confusing statement more easily described through demonstration; it
-means that, given a mapping such as this::
-
- mapper_registry.map_imperatively(Order, order_table, properties={
- 'items' : relationship(Item, backref='order')
- })
-
-If an ``Order`` is already in the session, and is assigned to the ``order``
-attribute of an ``Item``, the backref appends the ``Item`` to the ``items``
-collection of that ``Order``, resulting in the ``save-update`` cascade taking
-place::
-
- >>> o1 = Order()
- >>> session.add(o1)
- >>> o1 in session
- True
-
- >>> i1 = Item()
- >>> i1.order = o1
- >>> i1 in o1.items
- True
- >>> i1 in session
- True
-
-This behavior can be disabled using the :paramref:`_orm.relationship.cascade_backrefs` flag::
-
- mapper_registry.map_imperatively(Order, order_table, properties={
- 'items' : relationship(Item, backref='order', cascade_backrefs=False)
- })
-
-So above, the assignment of ``i1.order = o1`` will append ``i1`` to the ``items``
-collection of ``o1``, but will not add ``i1`` to the session. You can, of
-course, :meth:`~.Session.add` ``i1`` to the session at a later point. This
-option may be helpful for situations where an object needs to be kept out of a
-session until it's construction is completed, but still needs to be given
-associations to objects which are already persistent in the target session.
-
-When relationships are created by the :paramref:`_orm.relationship.backref`
-parameter on :func:`_orm.relationship`, the :paramref:`_orm.cascade_backrefs`
-parameter may be set to ``False`` on the backref side by using the
-:func:`_orm.backref` function instead of a string. For example, the above relationship
-could be declared::
-
- mapper_registry.map_imperatively(Order, order_table, properties={
- 'items' : relationship(
- Item, backref=backref('order', cascade_backrefs=False), cascade_backrefs=False
- )
- })
-
-This sets the ``cascade_backrefs=False`` behavior on both relationships.
.. _session_deleting_from_collections:
With all mapping forms, the mapping of the class is configured through
parameters that become part of the :class:`_orm.Mapper` object.
The function which ultimately receives these arguments is the
-:func:`_orm.mapper` function, and are delivered to it from one of
+constructor of the :class:`_orm.Mapper` class, to which they are delivered from one of
the front-facing mapping functions defined on the :class:`_orm.registry`
object.
For the declarative form of mapping, mapper arguments are specified
using the ``__mapper_args__`` declarative class variable, which is a dictionary
-that is passed as keyword arguments to the :func:`_orm.mapper` function.
+that is passed as keyword arguments to the :class:`_orm.Mapper` constructor.
Some examples:
**Version ID Column**
-The :paramref:`_orm.mapper.version_id_col` and
-:paramref:`_orm.mapper.version_id_generator` parameters::
+The :paramref:`_orm.Mapper.version_id_col` and
+:paramref:`_orm.Mapper.version_id_generator` parameters::
from datetime import datetime
**Single Table Inheritance**
-The :paramref:`_orm.mapper.polymorphic_on` and
-:paramref:`_orm.mapper.polymorphic_identity` parameters::
+The :paramref:`_orm.Mapper.polymorphic_on` and
+:paramref:`_orm.Mapper.polymorphic_identity` parameters::
class Person(Base):
__tablename__ = 'person'
The :class:`.AssociationProxy` object produced by the :func:`.association_proxy` function
is an instance of a `Python descriptor <https://docs.python.org/howto/descriptor.html>`_.
It is always declared with the user-defined class being mapped, regardless of
-whether Declarative or classical mappings via the :func:`.mapper` function are used.
+whether Declarative or classical mappings via :meth:`_orm.registry.map_imperatively` are used.
The proxy functions by operating upon the underlying mapped attribute
or collection in response to operations, and changes made via the proxy are immediately
:paramref:`_schema.Column.default` parameter is assigned to a SQL expression
object. To access this value with asyncio, it has to be refreshed within the
flush process, which is achieved by setting the
- :paramref:`_orm.mapper.eager_defaults` parameter on the mapping::
+ :paramref:`_orm.Mapper.eager_defaults` parameter on the mapping::
class A(Base):
Installation
------------
-The Mypy plugin depends upon new stubs for SQLAlchemy packaged at
-`sqlalchemy2-stubs <https://pypi.org/project/sqlalchemy2-stubs/>`_. These
-stubs necessarily fully replace the previous ``sqlalchemy-stubs`` typing
-annotations published by Dropbox, as they occupy the same ``sqlalchemy-stubs``
-namespace as specified by :pep:`561`. The `Mypy <https://pypi.org/project/mypy/>`_
-package itself is also a dependency.
+TODO: document uninstallation of existing stubs:
+
+* ``sqlalchemy2-stubs``
+* ``sqlalchemy-stubs``
+
+SQLAlchemy 2.0 is expected to be directly typed.
+
+The `Mypy <https://pypi.org/project/mypy/>`_ package itself is a dependency.
-Both packages may be installed using the "mypy" extras hook using pip::
+The package may be installed using the "mypy" extras hook using pip::
.. autofunction:: synonym_for
-.. autofunction:: mapper
-
.. autofunction:: object_mapper
.. autofunction:: class_mapper
Mapping Table Columns
=====================
-The default behavior of :func:`_orm.mapper` is to assemble all the columns in
+The default behavior of :class:`_orm.Mapper` is to assemble all the columns in
the mapped :class:`_schema.Table` into mapped object attributes, each of which are
named according to the name of the column itself (specifically, the ``key``
attribute of :class:`_schema.Column`). This behavior can be
name = user_table.c.user_name
The corresponding technique for an :term:`imperative` mapping is
-to place the desired key in the :paramref:`_orm.mapper.properties`
+to place the desired key in the :paramref:`_orm.Mapper.properties`
dictionary with the desired key::
mapper_registry.map_imperatively(User, user_table, properties={
Options can be specified when mapping a :class:`_schema.Column` using the
:func:`.column_property` function. This function
explicitly creates the :class:`.ColumnProperty` used by the
-:func:`.mapper` to keep track of the :class:`_schema.Column`; normally, the
-:func:`.mapper` creates this automatically. Using :func:`.column_property`,
+:class:`_orm.Mapper` to keep track of the :class:`_schema.Column`; normally, the
+:class:`_orm.Mapper` creates this automatically. Using :func:`.column_property`,
we can pass additional arguments about how we'd like the :class:`_schema.Column`
to be mapped. Below, we pass an option ``active_history``,
which specifies that a change to this column's value should
Creating an Explicit Base Non-Dynamically (for use with mypy, similar)
----------------------------------------------------------------------
+TODO: update for 2.0 - code here may not be accurate
+
SQLAlchemy includes a :ref:`Mypy plugin <mypy_toplevel>` that automatically
accommodates for the dynamically generated ``Base`` class
delivered by SQLAlchemy functions like :func:`_orm.declarative_base`.
-This plugin works along with a new set of typing stubs published at
-`sqlalchemy2-stubs <https://pypi.org/project/sqlalchemy2-stubs>`_.
When this plugin is not in use, or when using other :pep:`484` tools which
may not know how to interpret this class, the declarative base class may
class Base(metaclass=DeclarativeMeta):
__abstract__ = True
- # these are supplied by the sqlalchemy2-stubs, so may be omitted
- # when they are installed
registry = mapper_registry
metadata = mapper_registry.metadata
allow them to be specified in the constructor explicitly, they would instead
be given a default value of ``None``.
-For a :func:`_orm.relationship` to be declared separately, it needs to
-be specified directly within the :paramref:`_orm.mapper.properties`
-dictionary passed to the :func:`_orm.mapper`. An alternative to this
+For a :func:`_orm.relationship` to be declared separately, it needs to be
+specified directly within the :paramref:`_orm.Mapper.properties` dictionary
+which itself is specified within the ``__mapper_args__`` dictionary, so that it
+is passed to the constructor for :class:`_orm.Mapper`. An alternative to this
approach is in the next example.
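A sketch of this ``__mapper_args__`` form, using hypothetical ``User`` and
``Address`` dataclasses (the ``__allow_unmapped__`` attribute accommodates the
plain, non-``Mapped`` annotations of this legacy pattern under 2.0):

```python
from dataclasses import dataclass, field
from typing import List, Optional

from sqlalchemy import Column, ForeignKey, Integer, String, Table
from sqlalchemy.orm import registry, relationship

mapper_registry = registry()


@mapper_registry.mapped
@dataclass
class Address:
    __allow_unmapped__ = True
    __table__ = Table(
        "address",
        mapper_registry.metadata,
        Column("id", Integer, primary_key=True),
        Column("user_id", Integer, ForeignKey("user.id")),
        Column("email", String(50)),
    )
    id: Optional[int] = None
    user_id: Optional[int] = None
    email: Optional[str] = None


@mapper_registry.mapped
@dataclass
class User:
    __allow_unmapped__ = True
    __table__ = Table(
        "user",
        mapper_registry.metadata,
        Column("id", Integer, primary_key=True),
        Column("name", String(50)),
    )
    id: Optional[int] = None
    name: Optional[str] = None
    addresses: List[Address] = field(default_factory=list)

    # the relationship is delivered to the Mapper constructor through
    # the "properties" dictionary inside __mapper_args__
    __mapper_args__ = {
        "properties": {"addresses": relationship("Address")}
    }


mapper_registry.configure()
```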
.. _orm_declarative_dataclasses_declarative_table:
An **imperative** or **classical** mapping refers to the configuration of a
mapped class using the :meth:`_orm.registry.map_imperatively` method,
where the target class does not include any declarative class attributes.
-The "map imperative" style has historically been achieved using the
-:func:`_orm.mapper` function directly, however this function now expects
-that a :meth:`_orm.registry` is present.
-
-.. deprecated:: 1.4 Using the :func:`_orm.mapper` function directly to
- achieve a classical mapping directly is deprecated. The
- :meth:`_orm.registry.map_imperatively` method retains the identical
- functionality while also allowing for string-based resolution of
- other mapped classes from within the registry.
+.. versionchanged:: 2.0 The :meth:`_orm.registry.map_imperatively` method
+ is now used to create classical mappings. The ``sqlalchemy.orm.mapper()``
+ standalone function is effectively removed.
In "classical" form, the table metadata is created separately with the
:class:`_schema.Table` construct, then associated with the ``User`` class via
Some examples in the documentation still use the classical approach, but note that
the classical as well as Declarative approaches are **fully interchangeable**. Both
systems ultimately create the same configuration, consisting of a :class:`_schema.Table`,
-user-defined class, linked together with a :func:`.mapper`. When we talk about
-"the behavior of :func:`.mapper`", this includes when using the Declarative system
+user-defined class, linked together with a :class:`_orm.Mapper` object. When we talk about
+"the behavior of :class:`_orm.Mapper`", this includes when using the Declarative system
as well - it's still used, just behind the scenes.
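This can be observed directly with the runtime inspection API; a short sketch
using a hypothetical declarative class:

```python
from sqlalchemy import Column, Integer, inspect
from sqlalchemy.orm import Mapper, declarative_base

Base = declarative_base()


class Widget(Base):
    __tablename__ = "widget"
    id = Column(Integer, primary_key=True)


# Declarative created a Mapper behind the scenes; inspect() returns it
widget_mapper = inspect(Widget)
assert isinstance(widget_mapper, Mapper)
```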
Mapper Configuration Overview
=============================
-With all mapping forms, the mapping of the class can be
-configured in many ways by passing construction arguments that become
-part of the :class:`_orm.Mapper` object. The function which ultimately
-receives these arguments is the :func:`_orm.mapper` function, which are delivered
-to it originating from one of the front-facing mapping functions defined
-on the :class:`_orm.registry` object.
+With all mapping forms, the mapping of the class can be configured in many ways
+by passing construction arguments that become part of the :class:`_orm.Mapper`
+object. The construct which ultimately receives these arguments is the
+constructor of the :class:`_orm.Mapper` class, and the arguments are delivered
+to it originating from one of the front-facing mapping functions defined on the
+:class:`_orm.registry` object.
There are four general classes of configuration information that the
-:func:`_orm.mapper` function looks for:
+:class:`_orm.Mapper` class looks for:
The class to be mapped
----------------------
When mapping with the :ref:`imperative <orm_imperative_mapping>` style, the
properties dictionary is passed directly as the ``properties`` argument
to :meth:`_orm.registry.map_imperatively`, which will pass it along to the
-:paramref:`_orm.mapper.properties` parameter.
+:paramref:`_orm.Mapper.properties` parameter.
Other mapper configuration parameters
-------------------------------------
-These flags are documented at :func:`_orm.mapper`.
+These flags are documented at :class:`_orm.Mapper`.
When mapping with the :ref:`declarative <orm_declarative_mapping>` mapping
style, additional mapper configuration arguments are configured via the
When mapping with the :ref:`imperative <orm_imperative_mapping>` style,
-keyword arguments are passed to the to :meth:`_orm.registry.map_imperatively`
+keyword arguments are passed to the :meth:`_orm.registry.map_imperatively`
-method which passes them along to the :func:`_orm.mapper` function.
+method which passes them along to the :class:`_orm.Mapper` class.
.. [1] When running under Python 2, a Python 2 "old style" class is the only
almost never needed; it necessarily tends to produce complex queries
which are often less efficient than that which would be produced
by direct query construction. The practice is to some degree
- based on the very early history of SQLAlchemy where the :func:`.mapper`
+ based on the very early history of SQLAlchemy where the :class:`_orm.Mapper`
construct was meant to represent the primary querying interface;
in modern usage, the :class:`_query.Query` object can be used to construct
virtually any SELECT statement, including complex composites, and should
**primary** mapper at a time. This mapper is involved in three main areas of
functionality: querying, persistence, and instrumentation of the mapped class.
The rationale of the primary mapper relates to the fact that the
-:func:`.mapper` modifies the class itself, not only persisting it towards a
+:class:`_orm.Mapper` modifies the class itself, not only persisting it towards a
particular :class:`_schema.Table`, but also :term:`instrumenting` attributes upon the
class which are structured specifically according to the table metadata. It's
not possible for more than one mapper to be associated with a class in equal
:ref:`passive_deletes` - supporting ON DELETE CASCADE with relationships
- :paramref:`.orm.mapper.passive_updates` - similar feature on :func:`.mapper`
+ :paramref:`_orm.Mapper.passive_updates` - similar feature on :class:`_orm.Mapper`
Simulating limited ON UPDATE CASCADE without foreign key support
The ORM typically does not actively fetch the values of database-generated
values when it emits an INSERT or UPDATE, instead leaving these columns as
"expired" and to be fetched when they are next accessed, unless the ``eager_defaults``
-:func:`.mapper` flag is set. However, when a
+parameter of :class:`_orm.Mapper` is set. However, when a
server side version column is used, the ORM needs to actively fetch the newly
generated value. This is so that the version counter is set up *before*
any concurrent transaction may update it again. This fetching is also
from ... import sql
from ... import util
from ...sql import sqltypes
-from ...sql.base import NO_ARG
class ENUM(sqltypes.NativeForEmulated, sqltypes.Enum, _StringType):
BINARY in schema. This does not affect the type of data stored,
only the collation of character data.
- :param quoting: Not used. A warning will be raised if provided.
-
"""
- if kw.pop("quoting", NO_ARG) is not NO_ARG:
- util.warn_deprecated_20(
- "The 'quoting' parameter to :class:`.mysql.ENUM` is deprecated"
- " and will be removed in a future release. "
- "This parameter now has no effect."
- )
kw.pop("strict", None)
self._enum_init(enums, kw)
_StringType.__init__(self, length=self.length, **kw)
.. versionadded:: 1.0.0
- :param quoting: Not used. A warning will be raised if passed.
-
"""
- if kw.pop("quoting", NO_ARG) is not NO_ARG:
- util.warn_deprecated_20(
- "The 'quoting' parameter to :class:`.mysql.SET` is deprecated"
- " and will be removed in a future release. "
- "This parameter now has no effect."
- )
self.retrieve_as_bitwise = kw.pop("retrieve_as_bitwise", False)
self.values = tuple(values)
if not self.retrieve_as_bitwise and "" in values:
conn, table_name, schema, info_cache=self.info_cache, **kw
)
- @util.deprecated_20(
- ":meth:`_reflection.Inspector.reflecttable`",
- "The :meth:`_reflection.Inspector.reflecttable` "
- "method was renamed to "
- ":meth:`_reflection.Inspector.reflect_table`. This deprecated alias "
- "will be removed in a future release.",
- )
- def reflecttable(self, *args, **kwargs):
- "See reflect_table. This method name is deprecated"
- return self.reflect_table(*args, **kwargs)
-
def reflect_table(
self,
table,
"""indicates an API that is in 'legacy' status, a long term deprecation."""
-class RemovedIn20Warning(Base20DeprecationWarning):
- """indicates an API that will be fully removed in SQLAlchemy 2.0."""
-
-
-class MovedIn20Warning(RemovedIn20Warning):
+class MovedIn20Warning(Base20DeprecationWarning):
"""Subtype of RemovedIn20Warning to indicate an API that moved only."""
from .extensions import AbstractConcreteBase
from .extensions import ConcreteBase
from .extensions import DeferredReflection
-from .extensions import instrument_declarative
from ... import util
from ...orm.decl_api import as_declarative as _as_declarative
from ...orm.decl_api import declarative_base as _declarative_base
from ... import inspection
-from ... import util
from ...orm import exc as orm_exc
-from ...orm import registry
from ...orm import relationships
from ...orm.base import _mapper_or_none
from ...orm.clsregistry import _resolver
from ...util import OrderedDict
-@util.deprecated(
- "2.0",
- "the instrument_declarative function is deprecated "
- "and will be removed in SQLAlhcemy 2.0. Please use "
- ":meth:`_orm.registry.map_declaratively",
-)
-def instrument_declarative(cls, cls_registry, metadata):
- """Given a class, configure the class declaratively,
- using the given registry, which can be any dictionary, and
- MetaData object.
-
- """
- registry(metadata=metadata, class_registry=cls_registry).map_declaratively(
- cls
- )
-
-
class ConcreteBase:
"""A helper class for 'concrete' declarative mappings.
from .. import inspect
from .. import types
from ..orm import Mapper
-from ..orm import mapper
from ..orm.attributes import flag_modified
from ..sql.base import SchemaEventTarget
from ..util import memoized_property
if isinstance(prop.columns[0].type, sqltype):
cls.associate_with_attribute(getattr(class_, prop.key))
- event.listen(mapper, "mapper_configured", listen_for_type)
+ event.listen(Mapper, "mapper_configured", listen_for_type)
@classmethod
def as_mutable(cls, sqltype):
) or (prop.columns[0].type is sqltype):
cls.associate_with_attribute(getattr(class_, prop.key))
- event.listen(mapper, "mapper_configured", listen_for_type)
+ event.listen(Mapper, "mapper_configured", listen_for_type)
return sqltype
from .util import with_polymorphic
from .. import sql as _sql
from .. import util as _sa_util
+from ..exc import InvalidRequestError
from ..util.langhelpers import public_factory
relationship = public_factory(RelationshipProperty, ".orm.relationship")
-@_sa_util.deprecated_20("relation", "Please use :func:`.relationship`.")
-def relation(*arg, **kw):
-    """A synonym for :func:`relationship`."""
-
-    return relationship(*arg, **kw)
+def mapper(*arg, **kw):
+    """Placeholder for the now-removed ``mapper()`` function.
+
+    Classical mappings should be performed using the
+    :meth:`_orm.registry.map_imperatively` method.
+
+ This symbol remains in SQLAlchemy 2.0 to suit the deprecated use case
+ of using the ``mapper()`` function as a target for ORM event listeners,
+ which failed to be marked as deprecated in the 1.4 series.
+
+ Global ORM mapper listeners should instead use the :class:`_orm.Mapper`
+ class as the target.
+
+ .. versionchanged:: 2.0 The ``mapper()`` function was removed; the
+ symbol remains temporarily as a placeholder for the event listening
+ use case.
+
+ """
+ raise InvalidRequestError(
+ "The 'sqlalchemy.orm.mapper()' function is removed as of "
+ "SQLAlchemy 2.0. Use the "
+ "'sqlalchemy.orm.registry.map_imperatively()` "
+ "method of the ``sqlalchemy.orm.registry`` class to perform "
+ "classical mapping."
+ )
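The placeholder pattern above can be sketched in plain Python (this is an illustrative stand-in, not SQLAlchemy's actual module): the removed function stays importable, so legacy code that merely references the symbol still imports, while any actual call fails loudly.

```python
# Illustrative sketch of a removed-function placeholder (not SQLAlchemy
# itself): the symbol remains importable, but calling it raises.
class InvalidRequestError(Exception):
    """Stand-in for sqlalchemy.exc.InvalidRequestError."""


def mapper(*arg, **kw):
    """Placeholder for a removed function; any call raises."""
    raise InvalidRequestError(
        "The 'mapper()' function is removed. Use "
        "registry.map_imperatively() to perform classical mapping."
    )


# The symbol itself can still be referenced (e.g. as an event target)
# without triggering the error; only a call raises.
target = mapper
```

This keeps `from sqlalchemy.orm import mapper` working for the event-listening use case while giving a descriptive error instead of an `ImportError` for call sites.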
def dynamic_loader(argument, **kw):
return prop
-mapper = public_factory(Mapper, ".orm.mapper")
-
synonym = public_factory(SynonymProperty, ".orm.synonym")
mapperlib._dispose_registries(mapperlib._all_registries(), False)
-@_sa_util.deprecated_20("eagerload", "Please use :func:`_orm.joinedload`.")
-def eagerload(*args, **kwargs):
- """A synonym for :func:`joinedload()`."""
- return joinedload(*args, **kwargs)
-
-
contains_alias = public_factory(AliasOption, ".orm.contains_alias")
if True:
def _simple_statement(self):
- if (
- self.compile_options._use_legacy_query_style
- and (self.distinct and not self.distinct_on)
- and self.order_by
- ):
- to_add = sql_util.expand_column_list_from_order_by(
- self.primary_columns, self.order_by
- )
- if to_add:
- util.warn_deprecated_20(
- "ORDER BY columns added implicitly due to "
- "DISTINCT is deprecated and will be removed in "
- "SQLAlchemy 2.0. SELECT statements with DISTINCT "
- "should be written to explicitly include the appropriate "
- "columns in the columns clause"
- )
- self.primary_columns += to_add
-
statement = self._select_statement(
self.primary_columns + self.secondary_columns,
tuple(self.from_clauses) + tuple(self.eager_joins.values()),
)
self.supports_single_entity = self.bundle.single_entity
- if (
- self.supports_single_entity
- and not compile_state.compile_options._use_legacy_query_style
- ):
- util.warn_deprecated_20(
- "The Bundle.single_entity flag has no effect when "
- "using 2.0 style execution."
- )
@property
def mapper(self):
The new base class will be given a metaclass that produces
appropriate :class:`~sqlalchemy.schema.Table` objects and makes
- the appropriate :func:`~sqlalchemy.orm.mapper` calls based on the
+ the appropriate :class:`_orm.Mapper` calls based on the
information provided declaratively in the class and any subclasses
of the class.
``metadata`` attribute of the generated declarative base class.
:param mapper:
- An optional callable, defaults to :func:`~sqlalchemy.orm.mapper`. Will
+ An optional callable, defaults to :class:`_orm.Mapper`. Will
be used to map subclasses to their Tables.
:param cls:
* :meth:`_orm.registry.map_imperatively` will produce a
:class:`_orm.Mapper` for a class without scanning the class for
declarative class attributes. This method suits the use case historically
- provided by the
- :func:`_orm.mapper` classical mapping function.
+ provided by the ``sqlalchemy.orm.mapper()`` classical mapping function,
+ which is removed as of SQLAlchemy 2.0.
.. versionadded:: 1.4
examples.
:param mapper:
- An optional callable, defaults to :func:`~sqlalchemy.orm.mapper`.
+ An optional callable, defaults to :class:`_orm.Mapper`.
This function is used to generate new :class:`_orm.Mapper` objects.
:param cls:
information. Instead, all mapping constructs are passed as
arguments.
- This method is intended to be fully equivalent to the classic
- SQLAlchemy :func:`_orm.mapper` function, except that it's in terms of
+ This method is intended to be fully equivalent to the now-removed
+ SQLAlchemy ``mapper()`` function, except that it's in terms of
a particular registry.
E.g.::
and usage examples.
:param class\_: The class to be mapped. Corresponds to the
- :paramref:`_orm.mapper.class_` parameter.
+ :paramref:`_orm.Mapper.class_` parameter.
:param local_table: the :class:`_schema.Table` or other
:class:`_sql.FromClause` object that is the subject of the mapping.
Corresponds to the
- :paramref:`_orm.mapper.local_table` parameter.
+ :paramref:`_orm.Mapper.local_table` parameter.
:param \**kw: all other keyword arguments are passed to the
- :func:`_orm.mapper` function directly.
+ :class:`_orm.Mapper` constructor directly.
.. seealso::
return _mapper(self, class_, local_table, kw)
-mapperlib._legacy_registry = registry()
-
-
def as_declarative(**kw):
"""
Class decorator which will adapt a given class into a
* unmapped superclasses of mapped or to-be-mapped classes
(using the ``propagate=True`` flag)
* :class:`_orm.Mapper` objects
- * the :class:`_orm.Mapper` class itself and the :func:`.mapper`
- function indicate listening for all mappers.
+ * the :class:`_orm.Mapper` class itself indicates listening for all
+ mappers.
Instance events are closely related to mapper events, but
are more specific to the instance and its instrumentation,
elif isinstance(target, mapperlib.Mapper):
return target.class_manager
elif target is orm.mapper:
+ util.warn_deprecated(
+ "The `sqlalchemy.orm.mapper()` symbol is deprecated and "
+ "will be removed in a future release. For the mapper-wide "
+ "event target, use the 'sqlalchemy.orm.Mapper' class.",
+ "2.0",
+ )
return instrumentation.ClassManager
elif isinstance(target, type):
if issubclass(target, mapperlib.Mapper):
* unmapped superclasses of mapped or to-be-mapped classes
(using the ``propagate=True`` flag)
* :class:`_orm.Mapper` objects
- * the :class:`_orm.Mapper` class itself and the :func:`.mapper`
- function indicate listening for all mappers.
+ * the :class:`_orm.Mapper` class itself indicates listening for all
+ mappers.
Mapper events provide hooks into critical sections of the
mapper, including those related to object instrumentation,
orm = util.preloaded.orm
if target is orm.mapper:
+ util.warn_deprecated(
+ "The `sqlalchemy.orm.mapper()` symbol is deprecated and "
+ "will be removed in a future release. For the mapper-wide "
+ "event target, use the 'sqlalchemy.orm.Mapper' class.",
+ "2.0",
+ )
return mapperlib.Mapper
elif isinstance(target, type):
if issubclass(target, mapperlib.Mapper):
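The two target-resolution hunks above follow the same shape, which can be sketched with stdlib tools only (names here are illustrative stand-ins, not SQLAlchemy's internals): the deprecated legacy symbol still resolves to the mapper-wide target but emits a deprecation warning, while the `Mapper` class resolves silently.

```python
# Hedged sketch of deprecated-symbol event-target resolution.
import warnings


class Mapper:
    """Stand-in for sqlalchemy.orm.Mapper."""


def legacy_mapper(*arg, **kw):
    """Stand-in for the deprecated sqlalchemy.orm.mapper symbol."""


def resolve_event_target(target):
    # Legacy symbol: still accepted, but warns and maps to the class.
    if target is legacy_mapper:
        warnings.warn(
            "The legacy mapper symbol is deprecated as an event target; "
            "use the Mapper class.",
            DeprecationWarning,
        )
        return Mapper
    # The Mapper class (or a subclass) is the supported mapper-wide target.
    if isinstance(target, type) and issubclass(target, Mapper):
        return Mapper
    return target
```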
):
util.warn(
"'before_configured' and 'after_configured' ORM events "
- "only invoke with the mapper() function or Mapper class "
+ "only invoke with the Mapper class "
"as the target."
)
new mappers have been made available and new mapper use is
detected.
- This event can **only** be applied to the :class:`_orm.Mapper` class
- or :func:`.mapper` function, and not to individual mappings or
- mapped classes. It is only invoked for all mappings as a whole::
+ This event can **only** be applied to the :class:`_orm.Mapper` class,
+ and not to individual mappings or mapped classes. It is only invoked
+ for all mappings as a whole::
- from sqlalchemy.orm import mapper
+ from sqlalchemy.orm import Mapper
- @event.listens_for(mapper, "before_configured")
+ @event.listens_for(Mapper, "before_configured")
def go():
# ...
Also contrast to :meth:`.MapperEvents.before_configured`,
which is invoked before the series of mappers has been configured.
- This event can **only** be applied to the :class:`_orm.Mapper` class
- or :func:`.mapper` function, and not to individual mappings or
+ This event can **only** be applied to the :class:`_orm.Mapper` class,
+ and not to individual mappings or
mapped classes. It is only invoked for all mappings as a whole::
- from sqlalchemy.orm import mapper
+ from sqlalchemy.orm import Mapper
- @event.listens_for(mapper, "after_configured")
+ @event.listens_for(Mapper, "after_configured")
def go():
# ...
The event is often called for a batch of objects of the
same class before their INSERT statements are emitted at
once in a later step. In the extremely rare case that
- this is not desirable, the :func:`.mapper` can be
+ this is not desirable, the :class:`_orm.Mapper` object can be
configured with ``batch=False``, which will cause
batches of instances to be broken up into individual
(and more poorly performing) event->persist->event
same class after their INSERT statements have been
emitted at once in a previous step. In the extremely
rare case that this is not desirable, the
- :func:`.mapper` can be configured with ``batch=False``,
+ :class:`_orm.Mapper` object can be configured with ``batch=False``,
which will cause batches of instances to be broken up
into individual (and more poorly performing)
event->persist->event steps.
The event is often called for a batch of objects of the
same class before their UPDATE statements are emitted at
once in a later step. In the extremely rare case that
- this is not desirable, the :func:`.mapper` can be
- this is not desirable, the :class:`_orm.Mapper` object can be
configured with ``batch=False``, which will cause
batches of instances to be broken up into individual
(and more poorly performing) event->persist->event
The event is often called for a batch of objects of the
same class after their UPDATE statements have been emitted at
once in a previous step. In the extremely rare case that
- this is not desirable, the :func:`.mapper` can be
- this is not desirable, the :class:`_orm.Mapper` object can be
configured with ``batch=False``, which will cause
batches of instances to be broken up into individual
(and more poorly performing) event->persist->event
session.autoflush = autoflush
-@util.deprecated_20(
+@util.became_legacy_20(
":func:`_orm.merge_result`",
alternative="The function as well as the method on :class:`_orm.Query` "
"is superseded by the :func:`_orm.merge_frozen_result` function.",
- becomes_legacy=True,
)
@util.preload_module("sqlalchemy.orm.context")
def merge_result(query, iterator, load=True):
_mapper_registries = weakref.WeakKeyDictionary()
-_legacy_registry = None
-
def _all_registries():
with _CONFIGURE_MUTEX:
):
r"""Direct constructor for a new :class:`_orm.Mapper` object.
- The :func:`_orm.mapper` function is normally invoked through the
+ The :class:`_orm.Mapper` constructor is not typically called
+ directly; it is normally invoked through the
use of the :class:`_orm.registry` object through either the
:ref:`Declarative <orm_declarative_mapping>` or
:ref:`Imperative <orm_imperative_mapping>` mapping styles.
- .. versionchanged:: 1.4 The :func:`_orm.mapper` function should not
- be called directly for classical mapping; for a classical mapping
- configuration, use the :meth:`_orm.registry.map_imperatively`
- method. The :func:`_orm.mapper` function may become private in a
- future release.
+ .. versionchanged:: 2.0 The public facing ``mapper()`` function is
+ removed; for a classical mapping configuration, use the
+ :meth:`_orm.registry.map_imperatively` method.
Parameters documented below may be passed to either the
:meth:`_orm.registry.map_imperatively` method, or may be passed in the
# we expect that declarative has applied the class manager
# already and set up a registry. if this is None,
- # we will emit a deprecation warning below when we also see that
- # it has no registry.
+ # this raises as of 2.0.
manager = attributes.manager_of_class(self.class_)
if self.non_primary:
)
if not manager.registry:
- util.warn_deprecated_20(
- "Calling the mapper() function directly outside of a "
- "declarative registry is deprecated."
+ raise sa_exc.InvalidRequestError(
+ "The _mapper() function may not be invoked directly outside "
+ "of a declarative registry."
" Please use the sqlalchemy.orm.registry.map_imperatively() "
"function for a classical mapping."
)
- assert _legacy_registry is not None
- _legacy_registry._add_manager(manager)
self.class_manager = manager
self.registry = manager.registry
self._compile_options += opt
return self
- @util.deprecated_20(
+ @util.became_legacy_20(
":meth:`_orm.Query.with_labels` and :meth:`_orm.Query.apply_labels`",
alternative="Use set_label_style(LABEL_STYLE_TABLENAME_PLUS_COL) "
"instead.",
self.load_options += {"_yield_per": count}
return self
- @util.deprecated_20(
+ @util.became_legacy_20(
":meth:`_orm.Query.get`",
alternative="The method is now available as :meth:`_orm.Session.get`",
- becomes_legacy=True,
)
def get(self, ident):
"""Return an instance based on the given primary key identifier,
:func:`~.expression.select`.
The method here accepts mapped classes, :func:`.aliased` constructs,
- and :func:`.mapper` constructs as arguments, which are resolved into
- expression constructs, in addition to appropriate expression
+ and :class:`_orm.Mapper` constructs as arguments, which are resolved
+ into expression constructs, in addition to appropriate expression
constructs.
The correlation arguments are ultimately passed to
self.load_options += {"_invoke_all_eagers": value}
return self
- @util.deprecated_20(
+ @util.became_legacy_20(
":meth:`_orm.Query.with_parent`",
alternative="Use the :func:`_orm.with_parent` standalone construct.",
- becomes_legacy=True,
)
@util.preload_module("sqlalchemy.orm.relationships")
def with_parent(self, instance, property=None, from_entity=None): # noqa
return orm_util._getitem(
self,
item,
- allow_negative=not self.session or not self.session.future,
)
@_generative
return result
- @util.deprecated_20(
+ @util.became_legacy_20(
":meth:`_orm.Query.merge_result`",
alternative="The method is superseded by the "
":func:`_orm.merge_frozen_result` function.",
- becomes_legacy=True,
enable_warnings=False, # warnings occur via loading.merge_result
)
def merge_result(self, iterator, load=True):
passive_updates=True,
enable_typechecks=True,
active_history=False,
- cascade_backrefs=True,
+ cascade_backrefs=False,
)
_dependency_processor = None
:ref:`tutorial_delete_cascade` - Tutorial example describing
a delete cascade.
- :param cascade_backrefs=True:
- A boolean value indicating if the ``save-update`` cascade should
- operate along an assignment event intercepted by a backref.
- When set to ``False``, the attribute managed by this relationship
- will not cascade an incoming transient object into the session of a
- persistent parent, if the event is received via backref.
+ :param cascade_backrefs=False:
+ Legacy; this flag is always False.
- .. deprecated:: 1.4 The
- :paramref:`_orm.relationship.cascade_backrefs`
- flag will default to False in all cases in SQLAlchemy 2.0.
-
- .. seealso::
-
- :ref:`backref_cascade` - Full discussion and examples on how
- the :paramref:`_orm.relationship.cascade_backrefs` option is used.
+ .. versionchanged:: 2.0 "cascade_backrefs" functionality has been
+ removed.
:param collection_class:
A class or callable that returns a new list-holding object. will
self._user_defined_foreign_keys = foreign_keys
self.collection_class = collection_class
self.passive_deletes = passive_deletes
- self.cascade_backrefs = cascade_backrefs
+
+ if cascade_backrefs:
+ raise sa_exc.ArgumentError(
+ "The 'cascade_backrefs' parameter passed to "
+ "relationship() may only be set to False."
+ )
+
self.passive_updates = passive_updates
self.remote_side = remote_side
self.enable_typechecks = enable_typechecks
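The validation added above keeps the legacy keyword in the signature so that old call sites get a descriptive error rather than a bare `TypeError`. A minimal stand-alone sketch of that pattern (not the real `relationship()` signature):

```python
# Sketch of accept-but-reject validation for a removed legacy flag.
class ArgumentError(Exception):
    """Stand-in for sqlalchemy.exc.ArgumentError."""


def relationship(argument, cascade_backrefs=False, **kw):
    # The parameter is still accepted so legacy callers reach this
    # descriptive error instead of an unexpected-keyword TypeError.
    if cascade_backrefs:
        raise ArgumentError(
            "The 'cascade_backrefs' parameter passed to relationship() "
            "may only be set to False."
        )
    return (argument, kw)
```

Passing `cascade_backrefs=False` explicitly remains a no-op, matching the "always False" behavior described in the docstring change.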
self,
bind=None,
autoflush=True,
- future=False,
+ future=True,
expire_on_commit=True,
twophase=False,
binds=None,
:ref:`session_committing`
- :param future: if True, use 2.0 style transactional and engine
- behavior. Future mode includes the following behaviors:
-
- * The :class:`_orm.Session` will not use "bound" metadata in order
- to locate an :class:`_engine.Engine`; the engine or engines in use
- must be specified to the constructor of :class:`_orm.Session` or
- otherwise be configured against the :class:`_orm.sessionmaker`
- in use
-
- * The behavior of the :paramref:`_orm.relationship.cascade_backrefs`
- flag on a :func:`_orm.relationship` will always assume
- "False" behavior.
-
- .. versionadded:: 1.4
+ :param future: Deprecated; this flag is always True.
.. seealso::
)
self.identity_map = identity.WeakInstanceDict()
+ if not future:
+ raise sa_exc.ArgumentError(
+ "The 'future' parameter passed to "
+ "Session() may only be set to True."
+ )
+
self._new = {} # InstanceState->object, strong refs object
self._deleted = {} # same
self.bind = bind
self._warn_on_events = False
self._transaction = None
self._nested_transaction = None
- self.future = future
self.hash_key = _new_sessionid()
self.autoflush = autoflush
self.expire_on_commit = expire_on_commit
with self.begin():
yield self
- @property
- @util.deprecated_20(
- ":attr:`_orm.Session.transaction`",
- alternative="For context manager use, use "
- ":meth:`_orm.Session.begin`. To access "
- "the current root transaction, use "
- ":meth:`_orm.Session.get_transaction`.",
- warn_on_attribute_access=True,
- )
- def transaction(self):
- """The current active or inactive :class:`.SessionTransaction`.
-
- May be None if no transaction has begun yet.
-
- .. versionchanged:: 1.4 the :attr:`.Session.transaction` attribute
- is now a read-only descriptor that also may return None if no
- transaction has begun yet.
-
-
- """
- return self._legacy_transaction()
-
- def _legacy_transaction(self):
- if not self.future:
- self._autobegin()
- return self._transaction
-
def in_transaction(self):
"""Return True if this :class:`_orm.Session` has begun a transaction.
If no transaction is in progress, this method is a pass-through.
- In :term:`1.x-style` use, this method rolls back the topmost
- database transaction if no nested transactions are in effect, or
- to the current nested transaction if one is in effect.
-
- When
- :term:`2.0-style` use is in effect via the
- :paramref:`_orm.Session.future` flag, the method always rolls back
+ The method always rolls back
the topmost database transaction, discarding any nested
transactions that may be in progress.
if self._transaction is None:
pass
else:
- self._transaction.rollback(_to_root=self.future)
+ self._transaction.rollback(_to_root=True)
def commit(self):
"""Flush pending changes and commit the current transaction.
If no transaction is in progress, the method will first
"autobegin" a new transaction and commit.
- If :term:`1.x-style` use is in effect and there are currently
- SAVEPOINTs in progress via :meth:`_orm.Session.begin_nested`,
- the operation will release the current SAVEPOINT but not commit
- the outermost database transaction.
-
- If :term:`2.0-style` use is in effect via the
- :paramref:`_orm.Session.future` flag, the outermost database
- transaction is committed unconditionally, automatically releasing any
- SAVEPOINTs in effect.
+ The outermost database transaction is committed unconditionally,
+ automatically releasing any SAVEPOINTs in effect.
.. seealso::
if not self._autobegin():
raise sa_exc.InvalidRequestError("No transaction is begun.")
- self._transaction.commit(_to_root=self.future)
+ self._transaction.commit(_to_root=True)
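The `_to_root=True` behavior can be modeled with a simple parent-linked stack (an assumption-laden sketch, not SQLAlchemy's real `SessionTransaction`): committing or rolling back always walks up to the outermost transaction, resolving any SAVEPOINT-style nested transactions along the way.

```python
# Simplified model of commit-to-root transaction resolution.
class Txn:
    def __init__(self, parent=None, nested=False):
        self.parent = parent      # enclosing transaction, if any
        self.nested = nested      # True for a SAVEPOINT-style transaction
        self.state = "active"

    def commit(self, _to_root=False):
        self.state = "committed"
        # With _to_root=True, resolution propagates to the outermost
        # transaction unconditionally.
        if _to_root and self.parent is not None:
            self.parent.commit(_to_root=True)


root = Txn()
savepoint = Txn(parent=root, nested=True)

# Committing with _to_root=True releases the savepoint AND commits root.
savepoint.commit(_to_root=True)
```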
def prepare(self):
"""Prepare the current transaction in progress for two phase commit.
self._transaction.prepare()
- def connection(self, bind_arguments=None, execution_options=None, **kw):
+ def connection(self, bind_arguments=None, execution_options=None):
r"""Return a :class:`_engine.Connection` object corresponding to this
:class:`.Session` object's transactional state.
"mapper", "bind", "clause", other custom arguments that are passed
to :meth:`.Session.get_bind`.
- :param bind:
- deprecated; use bind_arguments
-
- :param mapper:
- deprecated; use bind_arguments
-
- :param clause:
- deprecated; use bind_arguments
-
:param execution_options: a dictionary of execution options that will
be passed to :meth:`_engine.Connection.execution_options`, **when the
connection is first procured only**. If the connection is already
:ref:`session_transaction_isolation`
- :param \**kw:
- deprecated; use bind_arguments
-
"""
- if not bind_arguments:
- bind_arguments = kw
- bind = bind_arguments.pop("bind", None)
- if bind is None:
- bind = self.get_bind(**bind_arguments)
+ if bind_arguments:
+ bind = bind_arguments.pop("bind", None)
+ if bind is None:
+ bind = self.get_bind(**bind_arguments)
+ else:
+ bind = self.get_bind()
return self._connection_for_bind(
bind,
bind_arguments=None,
_parent_execute_state=None,
_add_event=None,
- **kw,
):
r"""Execute a SQL expression construct.
)
The API contract of :meth:`_orm.Session.execute` is similar to that
- of :meth:`_future.Connection.execute`, the :term:`2.0 style` version
- of :class:`_future.Connection`.
+ of :meth:`_engine.Connection.execute`, the :term:`2.0 style` version
+ of :class:`_engine.Connection`.
.. versionchanged:: 1.4 the :meth:`_orm.Session.execute` method is
now the primary point of ORM statement execution when using
Contents of this dictionary are passed to the
:meth:`.Session.get_bind` method.
- :param mapper:
- deprecated; use the bind_arguments dictionary
-
- :param bind:
- deprecated; use the bind_arguments dictionary
-
- :param \**kw:
- deprecated; use the bind_arguments dictionary
-
:return: a :class:`_engine.Result` object.
"""
statement = coercions.expect(roles.StatementRole, statement)
- if kw:
- util.warn_deprecated_20(
- "Passing bind arguments to Session.execute() as keyword "
- "arguments is deprecated and will be removed SQLAlchemy 2.0. "
- "Please use the bind_arguments parameter."
- )
- if not bind_arguments:
- bind_arguments = kw
- else:
- bind_arguments.update(kw)
- elif not bind_arguments:
+ if not bind_arguments:
bind_arguments = {}
if (
mapped.
:param bind: an :class:`_engine.Engine` or :class:`_engine.Connection`
- object.
+ object.
.. seealso::
:ref:`session_custom_partitioning`.
:param mapper:
- Optional :func:`.mapper` mapped class or instance of
- :class:`_orm.Mapper`. The bind can be derived from a
- :class:`_orm.Mapper`
- first by consulting the "binds" map associated with this
- :class:`.Session`, and secondly by consulting the
- :class:`_schema.MetaData`
- associated with the :class:`_schema.Table` to which the
- :class:`_orm.Mapper`
- is mapped for a bind.
+ Optional mapped class or corresponding :class:`_orm.Mapper` instance.
+ The bind can be derived from a :class:`_orm.Mapper` first by
+ consulting the "binds" map associated with this :class:`.Session`,
+ and secondly by consulting the :class:`_schema.MetaData` associated
+ with the :class:`_schema.Table` to which the :class:`_orm.Mapper` is
+ mapped for a bind.
:param clause:
A :class:`_expression.ClauseElement` (i.e.
)
.. versionadded:: 1.4 Added :meth:`_orm.Session.get`, which is moved
- from the now deprecated :meth:`_orm.Query.get` method.
+ from the now legacy :meth:`_orm.Query.get` method.
:meth:`_orm.Session.get` is special in that it provides direct
access to the identity map of the :class:`.Session`.
from ..util import topological
-def _warn_for_cascade_backrefs(state, prop):
- util.warn_deprecated_20(
- '"%s" object is being merged into a Session along the backref '
- 'cascade path for relationship "%s"; in SQLAlchemy 2.0, this '
- "reverse cascade will not take place. Set cascade_backrefs to "
- "False in either the relationship() or backref() function for "
- "the 2.0 behavior; or to set globally for the whole "
- "Session, set the future=True flag" % (state.class_.__name__, prop),
- code="s9r1",
- )
-
-
def track_cascade_events(descriptor, prop):
"""Establish event listeners on object attributes which handle
cascade-on-set/append.
if (
prop._cascade.save_update
- and (
- (prop.cascade_backrefs and not sess.future)
- or key == initiator.key
- )
+ and (key == initiator.key)
and not sess._contains_state(item_state)
):
- if key != initiator.key:
- _warn_for_cascade_backrefs(item_state, prop)
sess._save_or_update_state(item_state)
return item
newvalue_state = attributes.instance_state(newvalue)
if (
prop._cascade.save_update
- and (
- (prop.cascade_backrefs and not sess.future)
- or key == initiator.key
- )
+ and (key == initiator.key)
and not sess._contains_state(newvalue_state)
):
- if key != initiator.key:
- _warn_for_cascade_backrefs(newvalue_state, prop)
sess._save_or_update_state(newvalue_state)
if (
) = session.set = mapper.set = dependency.set = RandomSet
-def _getitem(iterable_query, item, allow_negative):
+def _getitem(iterable_query, item):
"""calculate __getitem__ in terms of an iterable query object
that also has a slice() method.
"""
def _no_negative_indexes():
- if not allow_negative:
- raise IndexError(
- "negative indexes are not accepted by SQL "
- "index / slice operators"
- )
- else:
- util.warn_deprecated_20(
- "Support for negative indexes for SQL index / slice operators "
- "will be "
- "removed in 2.0; these operators fetch the complete result "
- "and do not work efficiently."
- )
+ raise IndexError(
+ "negative indexes are not accepted by SQL "
+ "index / slice operators"
+ )
if isinstance(item, slice):
start, stop, step = util.decode_slice(item)
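The simplified index handling above now rejects negative values outright, since SQL LIMIT/OFFSET cannot count from the end of a result without fetching everything. A self-contained sketch of the same logic over a plain sequence (illustrative only):

```python
# Sketch of slice handling that rejects negative indexes, mirroring the
# behavior of the simplified _getitem above.
def getitem(rows, item):
    def _no_negative_indexes():
        raise IndexError(
            "negative indexes are not accepted by SQL "
            "index / slice operators"
        )

    if isinstance(item, slice):
        # Negative slice bounds would require the complete result set.
        if (item.start is not None and item.start < 0) or (
            item.stop is not None and item.stop < 0
        ):
            _no_negative_indexes()
        return rows[item]
    if item < 0:
        _no_negative_indexes()
    return rows[item]
```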
def _expression_collection_was_a_list(attrname, fnname, args):
if args and isinstance(args[0], (list, set, dict)) and len(args) == 1:
if isinstance(args[0], list):
- util.warn_deprecated_20(
- 'The "%s" argument to %s(), when referring to a sequence '
+ raise exc.ArgumentError(
+ f'The "{attrname}" argument to {fnname}(), when '
+ "referring to a sequence "
"of items, is now passed as a series of positional "
- "elements, rather than as a list. " % (attrname, fnname)
+ "elements, rather than as a list. "
)
return args[0]
- else:
- return args
+
+ return args
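The coercion above can be restated as a stand-alone sketch (stand-in exception class; not the real `sqlalchemy.sql.coercions` module): a single list argument, previously unpacked with a deprecation warning, now raises; a single set or dict is still passed through as the collection itself; plain positional elements are returned unchanged.

```python
# Sketch of the list-vs-positional coercion for expression collections.
class ArgumentError(Exception):
    """Stand-in for sqlalchemy.exc.ArgumentError."""


def collection_was_a_list(attrname, fnname, args):
    if args and isinstance(args[0], (list, set, dict)) and len(args) == 1:
        if isinstance(args[0], list):
            # Lists formerly warned; as of 2.0 they raise.
            raise ArgumentError(
                f'The "{attrname}" argument to {fnname}(), when referring '
                "to a sequence of items, is now passed as a series of "
                "positional elements, rather than as a list."
            )
        # A single set or dict is still accepted as the collection.
        return args[0]

    return args
```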
def expect(
original_element, resolved, argname=argname, **kw
)
- def _text_coercion(self, element, argname=None):
- util.warn_deprecated_20(
- "Using plain strings to indicate SQL statements without using "
- "the text() construct is "
- "deprecated and will be removed in version 2.0. Ensure plain "
- "SQL statements are passed using the text() construct."
- )
- return elements.TextClause(element)
-
class SelectStatementImpl(_NoTextCoercion, RoleImpl):
__slots__ = ()
is_dml = True
- @classmethod
- def _constructor_20_deprecations(cls, fn_name, clsname, names):
-
- param_to_method_lookup = dict(
- whereclause=(
- "The :paramref:`%(func)s.whereclause` parameter "
- "will be removed "
- "in SQLAlchemy 2.0. Please refer to the "
- ":meth:`%(classname)s.where` method."
- ),
- values=(
- "The :paramref:`%(func)s.values` parameter will be removed "
- "in SQLAlchemy 2.0. Please refer to the "
- ":meth:`%(classname)s.values` method."
- ),
- inline=(
- "The :paramref:`%(func)s.inline` parameter will be "
- "removed in "
- "SQLAlchemy 2.0. Please use the "
- ":meth:`%(classname)s.inline` method."
- ),
- prefixes=(
- "The :paramref:`%(func)s.prefixes parameter will be "
- "removed in "
- "SQLAlchemy 2.0. Please use the "
- ":meth:`%(classname)s.prefix_with` "
- "method."
- ),
- return_defaults=(
- "The :paramref:`%(func)s.return_defaults` parameter will be "
- "removed in SQLAlchemy 2.0. Please use the "
- ":meth:`%(classname)s.return_defaults` method."
- ),
- returning=(
- "The :paramref:`%(func)s.returning` parameter will be "
- "removed in SQLAlchemy 2.0. Please use the "
- ":meth:`%(classname)s.returning`` method."
- ),
- preserve_parameter_order=(
- "The :paramref:`%(func)s.preserve_parameter_order` parameter "
- "will be removed in SQLAlchemy 2.0. Use the "
- ":meth:`%(classname)s.ordered_values` method with a list "
- "of tuples. "
- ),
- )
-
- return util.deprecated_params(
- **{
- name: (
- "2.0",
- param_to_method_lookup[name]
- % {
- "func": "_expression.%s" % fn_name,
- "classname": "_expression.%s" % clsname,
- },
- )
- for name in names
- }
- )
-
def _generate_fromclause_column_proxies(self, fromclause):
fromclause._columns._populate_separate_keys(
col._make_proxy(fromclause) for col in self._returning
self._validate_dialect_kwargs(opt)
return self
- def _validate_dialect_kwargs_deprecated(self, dialect_kw):
- util.warn_deprecated_20(
- "Passing dialect keyword arguments directly to the "
- "%s constructor is deprecated and will be removed in SQLAlchemy "
- "2.0. Please use the ``with_dialect_options()`` method."
- % (self.__class__.__name__)
- )
- self._validate_dialect_kwargs(dialect_kw)
-
@_generative
def returning(self: SelfUpdateBase, *cols) -> SelfUpdateBase:
r"""Add a :term:`RETURNING` or equivalent clause to this statement.
_returning = ()
- def __init__(self, table, values, prefixes):
+ def __init__(self, table):
self.table = coercions.expect(
roles.DMLTableRole, table, apply_propagate_attrs=self
)
- if values is not None:
- self.values.non_generative(self, values)
- if prefixes:
- self._setup_prefixes(prefixes)
@_generative
@_exclusive_against(
:meth:`.ValuesBase.return_defaults` is used by the ORM to provide
an efficient implementation for the ``eager_defaults`` feature of
- :func:`.mapper`.
+ :class:`_orm.Mapper`.
:param cols: optional list of column key names or
:class:`_schema.Column`
select = None
include_insert_from_select_defaults = False
+ _inline = False
is_insert = True
+ HasCTE._has_ctes_traverse_internals
)
- @ValuesBase._constructor_20_deprecations(
- "insert",
- "Insert",
- [
- "values",
- "inline",
- "prefixes",
- "returning",
- "return_defaults",
- ],
- )
def __init__(
self,
table,
- values=None,
- inline=False,
- prefixes=None,
- returning=None,
- return_defaults=False,
- **dialect_kw,
):
"""Construct an :class:`_expression.Insert` object.
:ref:`inserts_and_updates` - SQL Expression Tutorial
"""
- super(Insert, self).__init__(table, values, prefixes)
- self._inline = inline
- if returning:
- self._returning = returning
- if dialect_kw:
- self._validate_dialect_kwargs_deprecated(dialect_kw)
-
- if return_defaults:
- self._return_defaults = True
- if not isinstance(return_defaults, bool):
- self._return_defaults_columns = return_defaults
+ super(Insert, self).__init__(table)
@_generative
def inline(self: SelfInsert) -> SelfInsert:
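The removed constructor parameters (`values=`, `returning=`, `inline=`, and so on) are replaced by generative methods. A toy sketch of that builder style (note: SQLAlchemy's real `@_generative` decorator copies the statement on each call; this simplified version mutates in place):

```python
# Sketch of the generative statement-builder pattern replacing
# constructor keyword arguments.
class Insert:
    def __init__(self, table):
        self.table = table
        self._values = {}
        self._returning = ()
        self._inline = False

    def values(self, **kw):
        self._values.update(kw)
        return self

    def returning(self, *cols):
        self._returning += cols
        return self

    def inline(self):
        self._inline = True
        return self


# Methods chain, so one expression builds the whole statement.
stmt = Insert("user_table").values(name="spongebob").returning("id")
```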
__visit_name__ = "update"
is_update = True
+ _preserve_parameter_order = False
+ _inline = False
_traverse_internals = (
[
+ HasCTE._has_ctes_traverse_internals
)
- @ValuesBase._constructor_20_deprecations(
- "update",
- "Update",
- [
- "whereclause",
- "values",
- "inline",
- "prefixes",
- "returning",
- "return_defaults",
- "preserve_parameter_order",
- ],
- )
def __init__(
self,
table,
- whereclause=None,
- values=None,
- inline=False,
- prefixes=None,
- returning=None,
- return_defaults=False,
- preserve_parameter_order=False,
- **dialect_kw,
):
r"""Construct an :class:`_expression.Update` object.
"""
- self._preserve_parameter_order = preserve_parameter_order
- super(Update, self).__init__(table, values, prefixes)
- if returning:
- self._returning = returning
- if whereclause is not None:
- self._where_criteria += (
- coercions.expect(roles.WhereHavingRole, whereclause),
- )
- self._inline = inline
- if dialect_kw:
- self._validate_dialect_kwargs_deprecated(dialect_kw)
- self._return_defaults = return_defaults
+ super(Update, self).__init__(table)
@_generative
def ordered_values(self: SelfUpdate, *args) -> SelfUpdate:
+ HasCTE._has_ctes_traverse_internals
)
- @ValuesBase._constructor_20_deprecations(
- "delete",
- "Delete",
- ["whereclause", "values", "prefixes", "returning"],
- )
def __init__(
self,
table,
- whereclause=None,
- returning=None,
- prefixes=None,
- **dialect_kw,
):
r"""Construct :class:`_expression.Delete` object.
self.table = coercions.expect(
roles.DMLTableRole, table, apply_propagate_attrs=self
)
- if returning:
- self._returning = returning
-
- if prefixes:
- self._setup_prefixes(prefixes)
-
- if whereclause is not None:
- self._where_criteria += (
- coercions.expect(roles.WhereHavingRole, whereclause),
- )
-
- if dialect_kw:
- self._validate_dialect_kwargs_deprecated(dialect_kw)
("else_", InternalTraversal.dp_clauseelement),
]
- # TODO: for Py2k removal, this will be:
- # def __init__(self, *whens, value=None, else_=None):
-
- def __init__(self, *whens, **kw):
+ def __init__(self, *whens, value=None, else_=None):
r"""Produce a ``CASE`` expression.
The ``CASE`` construct in SQL is a conditional object that
whether or not :paramref:`.case.value` is used.
.. versionchanged:: 1.4 the :func:`_sql.case`
- function now accepts the series of WHEN conditions positionally;
- passing the expressions within a list is deprecated.
+ function now accepts the series of WHEN conditions positionally.
In the first form, it accepts a list of 2-tuples; each 2-tuple
consists of ``(<sql expression>, <value>)``, where the SQL
"""
- if "whens" in kw:
- util.warn_deprecated_20(
- 'The "whens" argument to case() is now passed using '
- "positional style only, not as a keyword argument."
- )
- whens = (kw.pop("whens"),)
-
whens = coercions._expression_collection_was_a_list(
"whens", "case", whens
)
-
try:
whens = util.dictlike_iteritems(whens)
except TypeError:
pass
- value = kw.pop("value", None)
-
whenlist = [
(
coercions.expect(
self.type = type_
self.whens = whenlist
- else_ = kw.pop("else_", None)
if else_ is not None:
self.else_ = coercions.expect(roles.ExpressionElementRole, else_)
else:
self.else_ = None
- if kw:
- raise TypeError("unknown arguments: %s" % (", ".join(sorted(kw))))
-
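The positional WHEN style accepted by the revised ``case()`` constructor can be sketched as below; this is a minimal illustrative example (the ``x`` column name is arbitrary), assuming SQLAlchemy 1.4 or later is installed:

```python
from sqlalchemy import case, column

# WHEN criteria are passed positionally as 2-tuples of
# (condition, result), rather than inside a list
expr = case(
    (column("x") > 0, "positive"),
    (column("x") < 0, "negative"),
    else_="zero",
)

# renders a CASE WHEN ... THEN ... ELSE ... END expression
print(str(expr))
```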
@property
def _from_objects(self):
return list(
"1.4",
"Deprecated alias of :paramref:`_schema.Table.must_exist`",
),
- autoload=(
- "2.0",
- "The autoload parameter is deprecated and will be removed in "
- "version 2.0. Please use the "
- "autoload_with parameter, passing an engine or connection.",
- ),
)
def __new__(cls, *args, **kw):
if not args and not kw:
self.fullname = self.name
autoload_with = kwargs.pop("autoload_with", None)
- autoload = kwargs.pop("autoload", autoload_with is not None)
+ autoload = autoload_with is not None
# this argument is only used with _init_existing()
kwargs.pop("autoload_replace", True)
keep_existing = kwargs.pop("keep_existing", False)
.alias(name)
)
- @util.deprecated_20(
- ":meth:`_sql.Join.alias`",
- alternative="Create a select + subquery, or alias the "
- "individual tables inside the join, instead.",
- )
- def alias(self, name=None, flat=False):
- r"""Return an alias of this :class:`_expression.Join`.
-
- The default behavior here is to first produce a SELECT
- construct from this :class:`_expression.Join`, then to produce an
- :class:`_expression.Alias` from that. So given a join of the form::
-
- j = table_a.join(table_b, table_a.c.id == table_b.c.a_id)
-
- The JOIN by itself would look like::
-
- table_a JOIN table_b ON table_a.id = table_b.a_id
-
- Whereas the alias of the above, ``j.alias()``, would in a
- SELECT context look like::
-
- (SELECT table_a.id AS table_a_id, table_b.id AS table_b_id,
- table_b.a_id AS table_b_a_id
- FROM table_a
- JOIN table_b ON table_a.id = table_b.a_id) AS anon_1
-
- The equivalent long-hand form, given a :class:`_expression.Join`
- object ``j``, is::
-
- from sqlalchemy import select, alias
- j = alias(
- select(j.left, j.right).\
- select_from(j).\
- set_label_style(LABEL_STYLE_TABLENAME_PLUS_COL).\
- correlate(False),
- name=name
- )
-
- The selectable produced by :meth:`_expression.Join.alias`
- features the same
- columns as that of the two individual selectables presented under
- a single name - the individual columns are "auto-labeled", meaning
- the ``.c.`` collection of the resulting :class:`_expression.Alias`
- represents
- the names of the individual columns using a
- ``<tablename>_<columname>`` scheme::
-
- j.c.table_a_id
- j.c.table_b_a_id
-
- :meth:`_expression.Join.alias` also features an alternate
- option for aliasing joins which produces no enclosing SELECT and
- does not normally apply labels to the column names. The
- ``flat=True`` option will call :meth:`_expression.FromClause.alias`
- against the left and right sides individually.
- Using this option, no new ``SELECT`` is produced;
- we instead, from a construct as below::
-
- j = table_a.join(table_b, table_a.c.id == table_b.c.a_id)
- j = j.alias(flat=True)
-
- we get a result like this::
-
- table_a AS table_a_1 JOIN table_b AS table_b_1 ON
- table_a_1.id = table_b_1.a_id
-
- The ``flat=True`` argument is also propagated to the contained
- selectables, so that a composite join such as::
-
- j = table_a.join(
- table_b.join(table_c,
- table_b.c.id == table_c.c.b_id),
- table_b.c.a_id == table_a.c.id
- ).alias(flat=True)
-
- Will produce an expression like::
-
- table_a AS table_a_1 JOIN (
- table_b AS table_b_1 JOIN table_c AS table_c_1
- ON table_b_1.id = table_c_1.b_id
- ) ON table_a_1.id = table_b_1.a_id
-
- The standalone :func:`_expression.alias` function as well as the
- base :meth:`_expression.FromClause.alias`
- method also support the ``flat=True``
- argument as a no-op, so that the argument can be passed to the
- ``alias()`` method of any selectable.
-
- :param name: name given to the alias.
-
- :param flat: if True, produce an alias of the left and right
- sides of this :class:`_expression.Join` and return the join of those
- two selectables. This produces join expression that does not
- include an enclosing SELECT.
-
- .. seealso::
-
- :ref:`core_tutorial_aliases`
-
- :func:`_expression.alias`
-
- """
- return self._anonymous_fromclause(flat=flat, name=name)
-
@property
def _hide_froms(self):
return itertools.chain(
.. versionadded:: 1.3.18 :func:`_expression.table` can now
accept a ``schema`` argument.
"""
-
super(TableClause, self).__init__()
self.name = name
self._columns = DedupeColumnCollection()
c.table = self
@util.preload_module("sqlalchemy.sql.dml")
- def insert(self, values=None, inline=False, **kwargs):
+ def insert(self):
"""Generate an :func:`_expression.insert` construct against this
:class:`_expression.TableClause`.
See :func:`_expression.insert` for argument and usage information.
"""
- return util.preloaded.sql_dml.Insert(
- self, values=values, inline=inline, **kwargs
- )
+ return util.preloaded.sql_dml.Insert(self)
@util.preload_module("sqlalchemy.sql.dml")
- def update(self, whereclause=None, values=None, inline=False, **kwargs):
+ def update(self):
"""Generate an :func:`_expression.update` construct against this
:class:`_expression.TableClause`.
"""
- return util.preloaded.sql_dml.Update(
- self,
- whereclause=whereclause,
- values=values,
- inline=inline,
- **kwargs,
- )
+ return util.preloaded.sql_dml.Update(self)
@util.preload_module("sqlalchemy.sql.dml")
- def delete(self, whereclause=None, **kwargs):
+ def delete(self):
"""Generate a :func:`_expression.delete` construct against this
:class:`_expression.TableClause`.
See :func:`_expression.delete` for argument and usage information.
"""
- return util.preloaded.sql_dml.Delete(self, whereclause, **kwargs)
+ return util.preloaded.sql_dml.Delete(self)
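With the constructor arguments removed, criteria and values on ``TableClause``-generated DML are applied generatively. A minimal sketch (the ``account`` table and column names are illustrative), assuming SQLAlchemy 1.4 or later:

```python
from sqlalchemy import column, table

account = table("account", column("id"), column("name"))

# WHERE criteria and SET values are applied via generative
# methods rather than constructor keyword arguments
stmt = account.update().where(account.c.id == 5).values(name="ed")

# renders an UPDATE ... SET ... WHERE ... statement
print(str(stmt))
```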
@property
def _from_objects(self):
("literal_binds", InternalTraversal.dp_boolean),
]
- def __init__(self, *columns, **kw):
+ def __init__(self, *columns, name=None, literal_binds=False):
r"""Construct a :class:`_expression.Values` construct.
The column expressions and the actual data for
super(Values, self).__init__()
self._column_args = columns
- self.name = kw.pop("name", None)
- self.literal_binds = kw.pop("literal_binds", False)
+ self.name = name
+ self.literal_binds = literal_binds
self.named_with_column = self.name is not None
@property
return self.element._from_objects
-class DeprecatedSelectBaseGenerations:
- """A collection of methods available on :class:`_sql.Select` and
- :class:`_sql.CompoundSelect`, these are all **deprecated** methods as they
- modify the object in-place.
-
- """
-
- @util.deprecated(
- "1.4",
- "The :meth:`_expression.GenerativeSelect.append_order_by` "
- "method is deprecated "
- "and will be removed in a future release. Use the generative method "
- ":meth:`_expression.GenerativeSelect.order_by`.",
- )
- def append_order_by(self, *clauses):
- """Append the given ORDER BY criterion applied to this selectable.
-
- The criterion will be appended to any pre-existing ORDER BY criterion.
-
- This is an **in-place** mutation method; the
- :meth:`_expression.GenerativeSelect.order_by` method is preferred,
- as it
- provides standard :term:`method chaining`.
-
- .. seealso::
-
- :meth:`_expression.GenerativeSelect.order_by`
-
- """
- self.order_by.non_generative(self, *clauses)
-
- @util.deprecated(
- "1.4",
- "The :meth:`_expression.GenerativeSelect.append_group_by` "
- "method is deprecated "
- "and will be removed in a future release. Use the generative method "
- ":meth:`_expression.GenerativeSelect.group_by`.",
- )
- def append_group_by(self, *clauses):
- """Append the given GROUP BY criterion applied to this selectable.
-
- The criterion will be appended to any pre-existing GROUP BY criterion.
-
- This is an **in-place** mutation method; the
- :meth:`_expression.GenerativeSelect.group_by` method is preferred,
- as it
- provides standard :term:`method chaining`.
-
-
- """
- self.group_by.non_generative(self, *clauses)
-
-
SelfGenerativeSelect = typing.TypeVar(
"SelfGenerativeSelect", bound="GenerativeSelect"
)
-class GenerativeSelect(DeprecatedSelectBaseGenerations, SelectBase):
+class GenerativeSelect(SelectBase):
"""Base class for SELECT statements where additional elements can be
added.
_fetch_clause_options = None
_for_update_arg = None
- def __init__(
- self,
- _label_style=LABEL_STYLE_DEFAULT,
- use_labels=False,
- limit=None,
- offset=None,
- order_by=None,
- group_by=None,
- ):
- if use_labels:
- if util.SQLALCHEMY_WARN_20:
- util.warn_deprecated_20(
- "The use_labels=True keyword argument to GenerativeSelect "
- "is deprecated and will be removed in version 2.0. Please "
- "use "
- "select.set_label_style(LABEL_STYLE_TABLENAME_PLUS_COL) "
- "if you need to replicate this legacy behavior.",
- stacklevel=4,
- )
- _label_style = LABEL_STYLE_TABLENAME_PLUS_COL
-
+ def __init__(self, _label_style=LABEL_STYLE_DEFAULT):
self._label_style = _label_style
- if limit is not None:
- self.limit.non_generative(self, limit)
- if offset is not None:
- self.offset.non_generative(self, offset)
-
- if order_by is not None:
- self.order_by.non_generative(self, *util.to_list(order_by))
- if group_by is not None:
- self.group_by.non_generative(self, *util.to_list(group_by))
-
@_generative
def with_for_update(
self: SelfGenerativeSelect,
self._label_style = style
return self
- @util.deprecated_20(
- ":meth:`_sql.GenerativeSelect.apply_labels`",
- alternative="Use set_label_style(LABEL_STYLE_TABLENAME_PLUS_COL) "
- "instead.",
- )
- def apply_labels(self):
- return self.set_label_style(LABEL_STYLE_TABLENAME_PLUS_COL)
-
@property
def _group_by_clause(self):
"""ClauseList access to group_by_clauses for legacy dialects"""
INTERSECT_ALL = util.symbol("INTERSECT ALL")
_is_from_container = True
+ _auto_correlate = False
- def __init__(self, keyword, *selects, **kwargs):
- self._auto_correlate = kwargs.pop("correlate", False)
+ def __init__(self, keyword, *selects):
self.keyword = keyword
self.selects = [
coercions.expect(roles.CompoundElementRole, s).self_group(
for s in selects
]
- if kwargs and util.SQLALCHEMY_WARN_20:
- util.warn_deprecated_20(
- "Set functions such as union(), union_all(), extract(), etc. "
- "in SQLAlchemy 2.0 will accept a "
- "series of SELECT statements only. "
- "Please use generative methods such as order_by() for "
- "additional modifications to this CompoundSelect.",
- stacklevel=4,
- )
-
- GenerativeSelect.__init__(self, **kwargs)
+ GenerativeSelect.__init__(self)
@classmethod
- def _create_union(cls, *selects, **kwargs):
- return CompoundSelect(CompoundSelect.UNION, *selects, **kwargs)
+ def _create_union(cls, *selects):
+ return CompoundSelect(CompoundSelect.UNION, *selects)
@classmethod
- def _create_union_all(cls, *selects, **kwargs):
+ def _create_union_all(cls, *selects):
r"""Return a ``UNION ALL`` of multiple selectables.
The returned object is an instance of
:param \*selects:
a list of :class:`_expression.Select` instances.
- :param \**kwargs:
- available keyword arguments are the same as those of
- :func:`select`.
-
"""
- return CompoundSelect(CompoundSelect.UNION_ALL, *selects, **kwargs)
+ return CompoundSelect(CompoundSelect.UNION_ALL, *selects)
@classmethod
- def _create_except(cls, *selects, **kwargs):
+ def _create_except(cls, *selects):
r"""Return an ``EXCEPT`` of multiple selectables.
The returned object is an instance of
:param \*selects:
a list of :class:`_expression.Select` instances.
- :param \**kwargs:
- available keyword arguments are the same as those of
- :func:`select`.
-
"""
- return CompoundSelect(CompoundSelect.EXCEPT, *selects, **kwargs)
+ return CompoundSelect(CompoundSelect.EXCEPT, *selects)
@classmethod
- def _create_except_all(cls, *selects, **kwargs):
+ def _create_except_all(cls, *selects):
r"""Return an ``EXCEPT ALL`` of multiple selectables.
The returned object is an instance of
:param \*selects:
a list of :class:`_expression.Select` instances.
- :param \**kwargs:
- available keyword arguments are the same as those of
- :func:`select`.
-
"""
- return CompoundSelect(CompoundSelect.EXCEPT_ALL, *selects, **kwargs)
+ return CompoundSelect(CompoundSelect.EXCEPT_ALL, *selects)
@classmethod
- def _create_intersect(cls, *selects, **kwargs):
+ def _create_intersect(cls, *selects):
r"""Return an ``INTERSECT`` of multiple selectables.
The returned object is an instance of
:param \*selects:
a list of :class:`_expression.Select` instances.
- :param \**kwargs:
- available keyword arguments are the same as those of
- :func:`select`.
-
"""
- return CompoundSelect(CompoundSelect.INTERSECT, *selects, **kwargs)
+ return CompoundSelect(CompoundSelect.INTERSECT, *selects)
@classmethod
- def _create_intersect_all(cls, *selects, **kwargs):
+ def _create_intersect_all(cls, *selects):
r"""Return an ``INTERSECT ALL`` of multiple selectables.
The returned object is an instance of
:param \*selects:
a list of :class:`_expression.Select` instances.
- :param \**kwargs:
- available keyword arguments are the same as those of
- :func:`select`.
"""
- return CompoundSelect(CompoundSelect.INTERSECT_ALL, *selects, **kwargs)
+ return CompoundSelect(CompoundSelect.INTERSECT_ALL, *selects)
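Since the set functions now accept only a series of SELECT statements, modifiers such as ORDER BY are applied generatively to the resulting ``CompoundSelect``. A minimal sketch (table and column names are illustrative), assuming SQLAlchemy 1.4 or later:

```python
from sqlalchemy import column, select, table, union_all

t = table("t", column("q"))

# union_all() accepts SELECT statements only; ORDER BY is
# added with the generative order_by() method afterwards
stmt = union_all(select(t.c.q), select(t.c.q)).order_by(column("q"))

print(str(stmt))
```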
def _scalar_type(self):
return self.selects[0]._scalar_type()
return self.selects[0].selected_columns
-class DeprecatedSelectGenerations:
- """A collection of methods available on :class:`_sql.Select`, these
- are all **deprecated** methods as they modify the :class:`_sql.Select`
- object in -place.
-
- """
-
- @util.deprecated(
- "1.4",
- "The :meth:`_expression.Select.append_correlation` "
- "method is deprecated "
- "and will be removed in a future release. Use the generative "
- "method :meth:`_expression.Select.correlate`.",
- )
- def append_correlation(self, fromclause):
- """Append the given correlation expression to this select()
- construct.
-
- This is an **in-place** mutation method; the
- :meth:`_expression.Select.correlate` method is preferred,
- as it provides
- standard :term:`method chaining`.
-
- """
-
- self.correlate.non_generative(self, fromclause)
-
- @util.deprecated(
- "1.4",
- "The :meth:`_expression.Select.append_column` method is deprecated "
- "and will be removed in a future release. Use the generative "
- "method :meth:`_expression.Select.add_columns`.",
- )
- def append_column(self, column):
- """Append the given column expression to the columns clause of this
- select() construct.
-
- E.g.::
-
- my_select.append_column(some_table.c.new_column)
-
- This is an **in-place** mutation method; the
- :meth:`_expression.Select.add_columns` method is preferred,
- as it provides standard
- :term:`method chaining`.
-
- """
- self.add_columns.non_generative(self, column)
-
- @util.deprecated(
- "1.4",
- "The :meth:`_expression.Select.append_prefix` method is deprecated "
- "and will be removed in a future release. Use the generative "
- "method :meth:`_expression.Select.prefix_with`.",
- )
- def append_prefix(self, clause):
- """Append the given columns clause prefix expression to this select()
- construct.
-
- This is an **in-place** mutation method; the
- :meth:`_expression.Select.prefix_with` method is preferred,
- as it provides
- standard :term:`method chaining`.
-
- """
- self.prefix_with.non_generative(self, clause)
-
- @util.deprecated(
- "1.4",
- "The :meth:`_expression.Select.append_whereclause` "
- "method is deprecated "
- "and will be removed in a future release. Use the generative "
- "method :meth:`_expression.Select.where`.",
- )
- def append_whereclause(self, whereclause):
- """Append the given expression to this select() construct's WHERE
- criterion.
-
- The expression will be joined to existing WHERE criterion via AND.
-
- This is an **in-place** mutation method; the
- :meth:`_expression.Select.where` method is preferred,
- as it provides standard
- :term:`method chaining`.
-
- """
- self.where.non_generative(self, whereclause)
-
- @util.deprecated(
- "1.4",
- "The :meth:`_expression.Select.append_having` method is deprecated "
- "and will be removed in a future release. Use the generative "
- "method :meth:`_expression.Select.having`.",
- )
- def append_having(self, having):
- """Append the given expression to this select() construct's HAVING
- criterion.
-
- The expression will be joined to existing HAVING criterion via AND.
-
- This is an **in-place** mutation method; the
- :meth:`_expression.Select.having` method is preferred,
- as it provides standard
- :term:`method chaining`.
-
- """
-
- self.having.non_generative(self, having)
-
- @util.deprecated(
- "1.4",
- "The :meth:`_expression.Select.append_from` method is deprecated "
- "and will be removed in a future release. Use the generative "
- "method :meth:`_expression.Select.select_from`.",
- )
- def append_from(self, fromclause):
- """Append the given :class:`_expression.FromClause` expression
- to this select() construct's FROM clause.
-
- This is an **in-place** mutation method; the
- :meth:`_expression.Select.select_from` method is preferred,
- as it provides
- standard :term:`method chaining`.
-
- """
- self.select_from.non_generative(self, fromclause)
-
-
@CompileState.plugin_for("default", "select")
class SelectState(util.MemoizedSlots, CompileState):
__slots__ = (
HasSuffixes,
HasHints,
HasCompileState,
- DeprecatedSelectGenerations,
_SelectFromElements,
GenerativeSelect,
):
)
@_generative
- def with_only_columns(self: SelfSelect, *columns, **kw) -> SelfSelect:
+ def with_only_columns(
+ self: SelfSelect, *columns, maintain_column_froms=False
+ ) -> SelfSelect:
r"""Return a new :func:`_expression.select` construct with its columns
clause replaced with the given columns.
# is the case for now.
self._assert_no_memoizations()
- maintain_column_froms = kw.pop("maintain_column_froms", False)
- if kw:
- raise TypeError("unknown parameters: %s" % (", ".join(kw),))
-
if maintain_column_froms:
self.select_from.non_generative(self, *self.columns_clause_froms)
.. versionadded:: 1.3.8
:param omit_aliases: A boolean that when true will remove aliases from
- pep 435 enums. For backward compatibility it defaults to ``False``.
- A deprecation warning is raised if the enum has aliases and this
- flag was not set.
+ PEP 435 enums. Defaults to ``True``.
- .. versionadded:: 1.4.5
-
- .. deprecated:: 1.4 The default will be changed to ``True`` in
- SQLAlchemy 2.0.
+ .. versionchanged:: 2.0 This parameter now defaults to ``True``.
"""
self._enum_init(enums, kw)
self.values_callable = kw.pop("values_callable", None)
self._sort_key_function = kw.pop("sort_key_function", NO_ARG)
length_arg = kw.pop("length", NO_ARG)
- self._omit_aliases = kw.pop("omit_aliases", NO_ARG)
+ self._omit_aliases = kw.pop("omit_aliases", True)
values, objects = self._parse_into_values(enums, kw)
self._setup_for_values(values, objects, kw)
_members = self.enum_class.__members__
- aliases = [n for n, v in _members.items() if v.name != n]
- if self._omit_aliases is NO_ARG and aliases:
- util.warn_deprecated_20(
- "The provided enum %s contains the aliases %s. The "
- "``omit_aliases`` will default to ``True`` in SQLAlchemy "
- "2.0. Specify a value to silence this warning."
- % (self.enum_class.__name__, aliases)
- )
if self._omit_aliases is True:
# remove aliases
members = OrderedDict(
def expect_warnings(*messages, **kw):
"""Context manager which expects one or more warnings.
- With no arguments, squelches all SAWarning and RemovedIn20Warning emitted via
+ With no arguments, squelches all SAWarning emitted via
sqlalchemy.util.warn and sqlalchemy.util.warn_limited. Otherwise
pass string expressions that will match selected warnings via regex;
all non-matching warnings are sent through.
Note that the test suite sets SAWarning warnings to raise exceptions.
""" # noqa
- return _expect_warnings(
- (sa_exc.RemovedIn20Warning, sa_exc.SAWarning), messages, **kw
- )
+ return _expect_warnings(sa_exc.SAWarning, messages, **kw)
@contextlib.contextmanager
else:
real_warn(msg, *arg, **kw)
- with mock.patch("warnings.warn", our_warn), mock.patch(
- "sqlalchemy.util.SQLALCHEMY_WARN_20", True
- ), mock.patch("sqlalchemy.util.deprecations.SQLALCHEMY_WARN_20", True):
+ with mock.patch("warnings.warn", our_warn):
try:
yield
finally:
# likely due to the introduction of __signature__.
from sqlalchemy.util import decorator
- from sqlalchemy.util import deprecations
- from sqlalchemy.testing import mock
@decorator
def wrap(fn, *args, **kw):
- with mock.patch.object(deprecations, "SQLALCHEMY_WARN_20", False):
- for warm in range(warmup):
- fn(*args, **kw)
+ for warm in range(warmup):
+ fn(*args, **kw)
- timerange = range(times)
- with count_functions(variance=variance):
- for time in timerange:
- rv = fn(*args, **kw)
- return rv
+ timerange = range(times)
+ with count_functions(variance=variance):
+ for time in timerange:
+ rv = fn(*args, **kw)
+ return rv
return wrap
message="The loop argument is deprecated",
)
- # ignore things that are deprecated *as of* 2.0 :)
- warnings.filterwarnings(
- "ignore",
- category=sa_exc.SADeprecationWarning,
- message=r".*\(deprecated since: 2.0\)$",
- )
-
try:
import pytest
except ImportError:
from .concurrency import await_only
from .concurrency import greenlet_spawn
from .concurrency import is_exit_exception
+from .deprecations import became_legacy_20
from .deprecations import deprecated
-from .deprecations import deprecated_20
-from .deprecations import deprecated_20_cls
from .deprecations import deprecated_cls
from .deprecations import deprecated_params
from .deprecations import inject_docstring_text
from .deprecations import moved_20
-from .deprecations import SQLALCHEMY_WARN_20
from .deprecations import warn_deprecated
-from .deprecations import warn_deprecated_20
from .langhelpers import add_parameter_text
from .langhelpers import as_interface
from .langhelpers import asbool
"""Helpers related to deprecation of functions, methods, classes, other
functionality."""
-import os
import re
from . import compat
from .. import exc
-SQLALCHEMY_WARN_20 = False
-
-if os.getenv("SQLALCHEMY_WARN_20", "false").lower() in ("true", "yes", "1"):
- SQLALCHEMY_WARN_20 = True
-
-
def _warn_with_version(msg, version, type_, stacklevel, code=None):
- if (
- issubclass(type_, exc.Base20DeprecationWarning)
- and not SQLALCHEMY_WARN_20
- ):
- return
-
warn = type_(msg, code=code)
warn.deprecated_since = version
)
-def warn_deprecated_20(msg, stacklevel=3, code=None):
-
- _warn_with_version(
- msg,
- exc.RemovedIn20Warning.deprecated_since,
- exc.RemovedIn20Warning,
- stacklevel,
- code=code,
- )
-
-
def deprecated_cls(version, message, constructor="__init__"):
header = ".. deprecated:: %s %s" % (version, (message or ""))
return decorate
-def deprecated_20_cls(
- clsname, alternative=None, constructor="__init__", becomes_legacy=False
-):
- message = (
- ".. deprecated:: 1.4 The %s class is considered legacy as of the "
- "1.x series of SQLAlchemy and %s in 2.0."
- % (
- clsname,
- "will be removed"
- if not becomes_legacy
- else "becomes a legacy construct",
- )
- )
-
- if alternative:
- message += " " + alternative
-
- if becomes_legacy:
- warning_cls = exc.LegacyAPIWarning
- else:
- warning_cls = exc.RemovedIn20Warning
-
- def decorate(cls):
- return _decorate_cls_with_warning(
- cls,
- constructor,
- warning_cls,
- message,
- warning_cls.deprecated_since,
- message,
- )
-
- return decorate
-
-
def deprecated(
version,
message=None,
"""
- # nothing is deprecated "since" 2.0 at this time. All "removed in 2.0"
- # should emit the RemovedIn20Warning, but messaging should be expressed
- # in terms of "deprecated since 1.4".
-
- if version == "2.0":
- if warning is None:
- warning = exc.RemovedIn20Warning
- version = "1.4"
if add_deprecation_to_docstring:
header = ".. deprecated:: %s %s" % (
version,
if warning is None:
warning = exc.SADeprecationWarning
- if warning is not exc.RemovedIn20Warning:
- message += " (deprecated since: %s)" % version
+ message += " (deprecated since: %s)" % version
def decorate(fn):
return _decorate_with_warning(
)
-def deprecated_20(api_name, alternative=None, becomes_legacy=False, **kw):
+def became_legacy_20(api_name, alternative=None, **kw):
type_reg = re.match("^:(attr|func|meth):", api_name)
if type_reg:
type_ = {"attr": "attribute", "func": "function", "meth": "method"}[
% (
api_name,
type_,
- "will be removed"
- if not becomes_legacy
- else "becomes a legacy construct",
+ "becomes a legacy construct",
)
)
if alternative:
message += " " + alternative
- if becomes_legacy:
- warning_cls = exc.LegacyAPIWarning
- else:
- warning_cls = exc.RemovedIn20Warning
+ warning_cls = exc.LegacyAPIWarning
return deprecated("2.0", message=message, warning=warning_cls, **kw)
for param, (version, message) in specs.items():
versions[param] = version
messages[param] = _sanitize_restructured_text(message)
- version_warnings[param] = (
- exc.RemovedIn20Warning
- if version == "2.0"
- else exc.SADeprecationWarning
- )
+ version_warnings[param] = exc.SADeprecationWarning
def decorate(fn):
spec = compat.inspect_getfullargspec(fn)
@assert_cycles(4)
def go():
- result = s.connection(mapper=User).execute(stmt)
+ result = s.connection(bind_arguments=dict(mapper=User)).execute(
+ stmt
+ )
while True:
row = result.fetchone()
if row is None:
@profiling.function_call_count()
def test_create_enum_from_pep_435_w_expensive_members(self):
- Enum(self.SomeEnum)
+ Enum(self.SomeEnum, omit_aliases=False)
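The alias-handling behavior tested above can be sketched as follows; the ``Grade`` enum is a hypothetical example, and ``omit_aliases`` is passed explicitly so the sketch behaves the same on SQLAlchemy 1.4.5+ and 2.0, where it becomes the default:

```python
import enum

from sqlalchemy import Enum


class Grade(enum.Enum):
    high = 1
    top = 1  # alias for "high" under PEP 435
    low = 2


# with omit_aliases=True (the 2.0 default), aliased names
# are excluded from the generated list of values
typ = Enum(Grade, omit_aliases=True)

print(typ.enums)
```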
class CacheKeyTest(fixtures.TestBase):
# down from 185 on this; this is a small slice of a usually
# bigger operation so using a small variance
- sess2._legacy_transaction() # autobegin
+ sess2.connection() # autobegin
@profiling.function_call_count(variance=0.20)
def go1():
# third call, merge object already present. almost no calls.
- sess2._legacy_transaction() # autobegin
+ sess2.connection() # autobegin
@profiling.function_call_count(variance=0.10, warmup=1)
def go2():
# using sqlite3 the C extension took it back up to approx. 1257
# (py2.6)
- sess2._legacy_transaction() # autobegin
+ sess2.connection() # autobegin
@profiling.function_call_count(variance=0.10)
def go():
sa_exceptions.SADeprecationWarning,
sa_exceptions.Base20DeprecationWarning,
sa_exceptions.LegacyAPIWarning,
- sa_exceptions.RemovedIn20Warning,
sa_exceptions.MovedIn20Warning,
sa_exceptions.SAWarning,
],
from sqlalchemy import select
from sqlalchemy import table
from sqlalchemy.dialects.mysql import base as mysql
-from sqlalchemy.dialects.mysql import ENUM
-from sqlalchemy.dialects.mysql import SET
from sqlalchemy.testing import AssertsCompiledSQL
from sqlalchemy.testing import expect_deprecated
-from sqlalchemy.testing import expect_deprecated_20
from sqlalchemy.testing import fixtures
"dialect and will be removed in a future release"
):
self.assert_compile(s, "SELECT FOO * FROM foo")
-
-
-class DeprecateQuoting(fixtures.TestBase):
- def test_enum_warning(self):
- ENUM("a", "b")
- with expect_deprecated_20(
- "The 'quoting' parameter to :class:`.mysql.ENUM` is deprecated."
- ):
- ENUM("a", quoting="foo")
-
- def test_set_warning(self):
- SET("a", "b")
- with expect_deprecated_20(
- "The 'quoting' parameter to :class:`.mysql.SET` is deprecated.*"
- ):
- SET("a", quoting="foo")
from sqlalchemy import create_engine
from sqlalchemy import event
from sqlalchemy import exc
-from sqlalchemy import ForeignKey
from sqlalchemy import insert
-from sqlalchemy import inspect
from sqlalchemy import Integer
from sqlalchemy import MetaData
from sqlalchemy import pool
return str(select(1).compile(dialect=db.dialect))
-class DeprecatedReflectionTest(fixtures.TablesTest):
- @classmethod
- def define_tables(cls, metadata):
- Table(
- "user",
- metadata,
- Column("id", Integer, primary_key=True),
- Column("name", String(50)),
- )
- Table(
- "address",
- metadata,
- Column("id", Integer, primary_key=True),
- Column("user_id", ForeignKey("user.id")),
- Column("email", String(50)),
- )
-
- def test_reflecttable(self):
- inspector = inspect(testing.db)
- metadata = MetaData()
-
- table = Table("user", metadata)
- with testing.expect_deprecated_20(
- r"The Inspector.reflecttable\(\) method is considered "
- ):
- res = inspector.reflecttable(table, None)
- exp = inspector.reflect_table(table, None)
-
- eq_(res, exp)
-
-
class EngineEventsTest(fixtures.TestBase):
__requires__ = ("ad_hoc_engines",)
__backend__ = True
import sqlalchemy as sa
from sqlalchemy import inspect
from sqlalchemy.ext import declarative as legacy_decl
-from sqlalchemy.ext.declarative import instrument_declarative
-from sqlalchemy.orm import Mapper
from sqlalchemy.testing import eq_
from sqlalchemy.testing import expect_deprecated_20
from sqlalchemy.testing import fixtures
-from sqlalchemy.testing import is_
from sqlalchemy.testing import is_false
from sqlalchemy.testing import is_true
-class TestInstrumentDeclarative(fixtures.TestBase):
- def test_ok(self):
- class Foo:
- __tablename__ = "foo"
- id = sa.Column(sa.Integer, primary_key=True)
-
- meta = sa.MetaData()
- reg = {}
- with expect_deprecated_20(
- "the instrument_declarative function is deprecated"
- ):
- instrument_declarative(Foo, reg, meta)
-
- mapper = sa.inspect(Foo)
- is_true(isinstance(mapper, Mapper))
- is_(mapper.class_, Foo)
-
-
class DeprecatedImportsTest(fixtures.TestBase):
def _expect_warning(self, name):
return expect_deprecated_20(
for i in range(1, 5):
os.remove("shard%d_%s.db" % (i, provision.FOLLOWER_IDENT))
- @testing.combinations((True,), (False,))
- @testing.uses_deprecated("Using plain strings")
- def test_plain_core_textual_lookup_w_shard(self, use_legacy_text):
+ def test_plain_core_textual_lookup_w_shard(self):
sess = self._fixture_data()
- if use_legacy_text:
- stmt = "SELECT * FROM weather_locations"
- else:
- stmt = text("SELECT * FROM weather_locations")
+ stmt = text("SELECT * FROM weather_locations")
eq_(
- sess.execute(stmt, shard_id="asia").fetchall(),
+ sess.execute(
+ stmt, bind_arguments=dict(shard_id="asia")
+ ).fetchall(),
[(1, "Asia", "Tokyo")],
)
- @testing.combinations((True,), (False,))
- @testing.uses_deprecated("Using plain strings")
- def test_plain_core_textual_lookup(self, use_legacy_text):
+ def test_plain_core_textual_lookup(self):
sess = self._fixture_data()
- if use_legacy_text:
- stmt = "SELECT * FROM weather_locations WHERE id=1"
- else:
- stmt = text("SELECT * FROM weather_locations WHERE id=1")
+ stmt = text("SELECT * FROM weather_locations WHERE id=1")
eq_(
sess.execute(stmt).fetchall(),
[(1, "Asia", "Tokyo")],
from sqlalchemy.orm import descriptor_props
from sqlalchemy.orm import exc as orm_exc
from sqlalchemy.orm import joinedload
-from sqlalchemy.orm import mapper
+from sqlalchemy.orm import Mapper
from sqlalchemy.orm import registry
from sqlalchemy.orm import relationship
from sqlalchemy.orm import Session
def test_custom_mapper_attribute(self):
def mymapper(cls, tbl, **kwargs):
- m = sa.orm.mapper(cls, tbl, **kwargs)
+ m = sa.orm.Mapper(cls, tbl, **kwargs)
m.CHECK = True
return m
def test_custom_mapper_argument(self):
def mymapper(cls, tbl, **kwargs):
- m = sa.orm.mapper(cls, tbl, **kwargs)
+ m = sa.orm.Mapper(cls, tbl, **kwargs)
m.CHECK = True
return m
canary = mock.Mock()
- @event.listens_for(mapper, "instrument_class")
+ @event.listens_for(Mapper, "instrument_class")
def instrument_class(mp, cls):
canary.instrument_class(mp, cls)
from sqlalchemy import ForeignKey
from sqlalchemy import Integer
from sqlalchemy import String
-from sqlalchemy import testing
from sqlalchemy.orm import backref
-from sqlalchemy.orm import clear_mappers
from sqlalchemy.orm import configure_mappers
from sqlalchemy.orm import relationship
from sqlalchemy.testing import fixtures
@classmethod
def setup_mappers(cls):
- global Table1, Table1B, Table2, Table3, Data
table1, table2, table3, data = cls.tables(
"table1", "table2", "table3", "data"
)
- # join = polymorphic_union(
- # {
- # 'table3' : table1.join(table3),
- # 'table2' : table1.join(table2),
- # 'table1' : table1.select(table1.c.type.in_(['table1', 'table1b'])),
- # }, None, 'pjoin')
-
- with testing.expect_deprecated_20(
- r"The Join.alias\(\) method is considered legacy"
- ):
- join = table1.outerjoin(table2).outerjoin(table3).alias("pjoin")
- # join = None
-
- class Table1:
+
+ Base = cls.Basic
+
+ class Table1(Base):
def __init__(self, name, data=None):
self.name = name
if data is not None:
class Table3(Table1):
pass
- class Data:
+ class Data(Base):
def __init__(self, data):
self.data = data
repr(str(self.data)),
)
- try:
- # this is how the mapping used to work. ensure that this raises an
- # error now
- table1_mapper = cls.mapper_registry.map_imperatively(
- Table1,
- table1,
- select_table=join,
- polymorphic_on=table1.c.type,
- polymorphic_identity="table1",
- properties={
- "nxt": relationship(
- Table1,
- backref=backref(
- "prev", foreignkey=join.c.id, uselist=False
- ),
- uselist=False,
- primaryjoin=join.c.id == join.c.related_id,
- ),
- "data": relationship(
- cls.mapper_registry.map_imperatively(Data, data)
- ),
- },
- )
- configure_mappers()
- assert False
- except Exception:
- assert True
- clear_mappers()
-
# currently, the "eager" relationships degrade to lazy relationships
# due to the polymorphic load.
# the "nxt" relationship used to have a "lazy='joined'" on it, but the
), table1_mapper.primary_key
def test_one(self):
+ Table1, Table2 = self.classes("Table1", "Table2")
self._testlist([Table1, Table2, Table1, Table2])
def test_two(self):
+ Table3 = self.classes.Table3
self._testlist([Table3])
def test_three(self):
+ Table1, Table1B, Table2, Table3 = self.classes(
+ "Table1", "Table1B", "Table2", "Table3"
+ )
self._testlist(
[
Table2,
)
def test_four(self):
+ Table1, Table1B, Table2, Table3, Data = self.classes(
+ "Table1", "Table1B", "Table2", "Table3", "Data"
+ )
self._testlist(
[
Table2("t2", [Data("data1"), Data("data2")]),
)
def _testlist(self, classes):
+ Table1 = self.classes.Table1
+
sess = fixture_session()
# create objects in a linked list
sess.bind_mapper(Address, e2)
engine = {"e1": e1, "e2": e2, "e3": e3}[expected]
- conn = sess.connection(**testcase)
+ conn = sess.connection(bind_arguments=testcase)
is_(conn.engine, engine)
sess.close()
canary.get_bind(**kw)
return Session.get_bind(self, **kw)
- sess = GetBindSession(e3, future=True)
+ sess = GetBindSession(e3)
sess.bind_mapper(User, e1)
sess.bind_mapper(Address, e2)
c = testing.db.connect()
sess = Session(bind=c)
sess.begin()
- transaction = sess._legacy_transaction()
+ transaction = sess.get_transaction()
u = User(name="u1")
sess.add(u)
sess.flush()
stmt = select(b1).filter(b1.c.d1.between("d3d1", "d5d1"))
- with testing.expect_deprecated_20(
- "The Bundle.single_entity flag has no effect when "
- "using 2.0 style execution."
- ):
- rows = sess.execute(stmt).all()
+ rows = sess.execute(stmt).all()
eq_(
rows,
[(("d3d1", "d3d2"),), (("d4d1", "d4d2"),), (("d5d1", "d5d2"),)],
m2o_cascade=True,
o2m=False,
m2o=False,
- o2m_cascade_backrefs=True,
- m2o_cascade_backrefs=True,
):
Address, addresses, users, User = (
addresses_rel = {
"addresses": relationship(
Address,
- cascade_backrefs=o2m_cascade_backrefs,
cascade=o2m_cascade and "save-update" or "",
backref=backref(
"user",
cascade=m2o_cascade and "save-update" or "",
- cascade_backrefs=m2o_cascade_backrefs,
),
)
}
"addresses": relationship(
Address,
cascade=o2m_cascade and "save-update" or "",
- cascade_backrefs=o2m_cascade_backrefs,
)
}
user_rel = {}
"user": relationship(
User,
cascade=m2o_cascade and "save-update" or "",
- cascade_backrefs=m2o_cascade_backrefs,
)
}
addresses_rel = {}
bkd_cascade=True,
fwd=False,
bkd=False,
- fwd_cascade_backrefs=True,
- bkd_cascade_backrefs=True,
):
keywords, items, item_keywords, Keyword, Item = (
"keywords": relationship(
Keyword,
secondary=item_keywords,
- cascade_backrefs=fwd_cascade_backrefs,
cascade=fwd_cascade and "save-update" or "",
backref=backref(
"items",
cascade=bkd_cascade and "save-update" or "",
- cascade_backrefs=bkd_cascade_backrefs,
),
)
}
Keyword,
secondary=item_keywords,
cascade=fwd_cascade and "save-update" or "",
- cascade_backrefs=fwd_cascade_backrefs,
)
}
items_rel = {}
Item,
secondary=item_keywords,
cascade=bkd_cascade and "save-update" or "",
- cascade_backrefs=bkd_cascade_backrefs,
)
}
keywords_rel = {}
sess.flush()
a1 = Address(email_address="a1")
- with testing.expect_deprecated(
- '"Address" object is being merged into a Session along '
- 'the backref cascade path for relationship "User.addresses"'
- ):
- a1.user = u1
+ a1.user = u1
sess.add(a1)
sess.expunge(u1)
assert u1 not in sess
sess.flush()
a1 = Address(email_address="a1")
- with testing.expect_deprecated(
- '"Address" object is being merged into a Session along the '
- 'backref cascade path for relationship "User.addresses"'
- ):
- a1.user = u1
+ a1.user = u1
sess.add(a1)
sess.expunge(u1)
assert u1 not in sess
cls.mapper_registry.map_imperatively(
User,
users,
- properties={
- "addresses": relationship(
- Address, backref="user", cascade_backrefs=False
- )
- },
+ properties={"addresses": relationship(Address, backref="user")},
)
cls.mapper_registry.map_imperatively(
Dingaling,
dingalings,
properties={
- "address": relationship(
- Address, backref="dingalings", cascade_backrefs=False
- )
+ "address": relationship(Address, backref="dingalings")
},
)
assert a1 not in sess
- def test_o2m_flag_on_backref(self):
+ def test_o2m_on_backref_no_cascade(self):
Dingaling, Address = self.classes.Dingaling, self.classes.Address
sess = fixture_session()
sess.add(a1)
d1 = Dingaling()
- with testing.expect_deprecated(
- '"Dingaling" object is being merged into a Session along the '
- 'backref cascade path for relationship "Address.dingalings"'
- ):
- d1.address = a1
+ d1.address = a1
assert d1 in a1.dingalings
- assert d1 in sess
-
- sess.commit()
+ assert d1 not in sess
def test_m2o_basic(self):
Dingaling, Address = self.classes.Dingaling, self.classes.Address
a1.dingalings.append(d1)
assert a1 not in sess
- def test_m2o_flag_on_backref(self):
+ def test_m2o_on_backref_no_cascade(self):
User, Address = self.classes.User, self.classes.Address
sess = fixture_session()
sess.add(a1)
u1 = User(name="u1")
- with testing.expect_deprecated(
- '"User" object is being merged into a Session along the backref '
- 'cascade path for relationship "Address.user"'
- ):
- u1.addresses.append(a1)
- assert u1 in sess
+ u1.addresses.append(a1)
+ assert u1 not in sess
- def test_m2o_commit_warns(self):
+ def test_m2o_commit_no_cascade(self):
Dingaling, Address = self.classes.Dingaling, self.classes.Address
sess = fixture_session()
"child": relationship(
Child,
uselist=False,
- backref=backref("parent", cascade_backrefs=False),
+ backref=backref("parent"),
)
},
)
Child,
uselist=False,
cascade="all, delete, delete-orphan",
- backref=backref("parent", cascade_backrefs=False),
+ backref=backref("parent"),
)
},
)
single_parent=True,
backref=backref("child", uselist=False),
cascade="all,delete,delete-orphan",
- cascade_backrefs=False,
)
},
)
single_parent=True,
backref=backref("child", uselist=True),
cascade="all,delete,delete-orphan",
- cascade_backrefs=False,
)
},
)
"B",
backref="a",
collection_class=collection_class,
- cascade_backrefs=False,
)
@registry.mapped
(attribute_mapped_collection("key"), "update_kw"),
argnames="collection_class,methname",
)
- @testing.combinations((True,), (False,), argnames="future")
def test_cascades_on_collection(
- self, cascade_fixture, collection_class, methname, future
+ self, cascade_fixture, collection_class, methname
):
A, B = cascade_fixture(collection_class)
- s = Session(future=future)
+ s = Session()
a1 = A()
s.add(a1)
},
)
- def _fixture(self, future=False):
+ def _fixture(self):
Graph, Edge, Point = (
self.classes.Graph,
self.classes.Edge,
self.classes.Point,
)
- sess = Session(testing.db, future=future)
+ sess = Session(testing.db)
g = Graph(
id=1,
edges=[
def test_bulk_update_sql(self):
Edge, Point = (self.classes.Edge, self.classes.Point)
- sess = self._fixture(future=True)
+ sess = self._fixture()
e1 = sess.execute(
select(Edge).filter(Edge.start == Point(14, 5))
def test_bulk_update_evaluate(self):
Edge, Point = (self.classes.Edge, self.classes.Point)
- sess = self._fixture(future=True)
+ sess = self._fixture()
e1 = sess.execute(
select(Edge).filter(Edge.start == Point(14, 5))
from sqlalchemy.testing import eq_
from sqlalchemy.testing import fixtures
from sqlalchemy.testing import is_
-from sqlalchemy.testing.assertions import expect_raises_message
from sqlalchemy.testing.fixtures import fixture_session
from sqlalchemy.testing.util import resolve_lambda
from sqlalchemy.util.langhelpers import hybridproperty
)
eq_(len(froms), 1)
- def test_with_only_columns_unknown_kw(self):
- User, Address = self.classes("User", "Address")
-
- stmt = select(User.id)
-
- with expect_raises_message(TypeError, "unknown parameters: foo"):
- stmt.with_only_columns(User.id, foo="bar")
-
@testing.combinations((True,), (False,))
def test_replace_into_select_from_maintains_existing(self, use_flag):
User, Address = self.classes("User", "Address")
-from contextlib import nullcontext
from unittest.mock import call
from unittest.mock import Mock
from sqlalchemy import String
from sqlalchemy import testing
from sqlalchemy import text
-from sqlalchemy import true
from sqlalchemy.engine import default
from sqlalchemy.engine import result_tuple
from sqlalchemy.orm import aliased
from sqlalchemy.orm import attributes
-from sqlalchemy.orm import backref
from sqlalchemy.orm import clear_mappers
from sqlalchemy.orm import collections
from sqlalchemy.orm import column_property
from sqlalchemy.orm import defaultload
from sqlalchemy.orm import defer
from sqlalchemy.orm import deferred
-from sqlalchemy.orm import eagerload
from sqlalchemy.orm import foreign
from sqlalchemy.orm import instrumentation
from sqlalchemy.orm import joinedload
-from sqlalchemy.orm import mapper
-from sqlalchemy.orm import relation
from sqlalchemy.orm import relationship
from sqlalchemy.orm import scoped_session
from sqlalchemy.orm import Session
from sqlalchemy.orm import undefer
from sqlalchemy.orm import with_parent
from sqlalchemy.orm import with_polymorphic
-from sqlalchemy.orm.collections import attribute_mapped_collection
from sqlalchemy.orm.collections import collection
from sqlalchemy.orm.util import polymorphic_union
from sqlalchemy.testing import assert_raises_message
from .inheritance._poly_fixtures import Manager
from .inheritance._poly_fixtures import Person
from .test_deferred import InheritanceTest as _deferred_InheritanceTest
-from .test_dynamic import _DynamicFixture
from .test_events import _RemoveListeners
from .test_options import PathTest as OptionsPathTest
from .test_options import PathTest
from .test_options import QueryTest as OptionsQueryTest
from .test_query import QueryTest
-from .test_transaction import _LocalFixture
from ..sql.test_compare import CacheKeyFixture
if True:
"subquery object."
)
- def test_deprecated_negative_slices(self):
- User = self.classes.User
-
- sess = fixture_session()
- q = sess.query(User).order_by(User.id)
-
- with testing.expect_deprecated(
- "Support for negative indexes for SQL index / slice operators"
- ):
- eq_(q[-5:-2], [User(id=7), User(id=8)])
-
- with testing.expect_deprecated(
- "Support for negative indexes for SQL index / slice operators"
- ):
- eq_(q[-1], User(id=10))
-
- with testing.expect_deprecated(
- "Support for negative indexes for SQL index / slice operators"
- ):
- eq_(q[-2], User(id=9))
-
- with testing.expect_deprecated(
- "Support for negative indexes for SQL index / slice operators"
- ):
- eq_(q[:-2], [User(id=7), User(id=8)])
-
- # this doesn't evaluate anything because it's a net-negative
- eq_(q[-2:-5], [])
-
def test_deprecated_select_coercion_join_target(self):
User = self.classes.User
addresses = self.tables.addresses
"ON users.id = anon_1.user_id",
)
- def test_deprecated_negative_slices_compile(self):
- User = self.classes.User
-
- sess = fixture_session()
- q = sess.query(User).order_by(User.id)
-
- with testing.expect_deprecated(
- "Support for negative indexes for SQL index / slice operators"
- ):
- self.assert_sql(
- testing.db,
- lambda: q[-5:-2],
- [
- (
- "SELECT users.id AS users_id, users.name "
- "AS users_name "
- "FROM users ORDER BY users.id",
- {},
- )
- ],
- )
-
- with testing.expect_deprecated(
- "Support for negative indexes for SQL index / slice operators"
- ):
- self.assert_sql(
- testing.db,
- lambda: q[-5:],
- [
- (
- "SELECT users.id AS users_id, users.name "
- "AS users_name "
- "FROM users ORDER BY users.id",
- {},
- )
- ],
- )
-
def test_invalid_column(self):
User = self.classes.User
self.assert_sql_count(testing.db, go, expected)
-class DynamicTest(_DynamicFixture, _fixtures.FixtureTest):
- def test_negative_slice_access_raises(self):
- User, Address = self._user_address_fixture()
- sess = fixture_session()
- u1 = sess.get(User, 8)
+class DeprecatedInhTest(_poly_fixtures._Polymorphic):
+ def test_with_polymorphic(self):
+ Person = _poly_fixtures.Person
+ Engineer = _poly_fixtures.Engineer
- with testing.expect_deprecated_20(
- "Support for negative indexes for SQL index / slice"
- ):
- eq_(u1.addresses[-1], Address(id=4))
+ with DeprecatedQueryTest._expect_implicit_subquery():
+ p_poly = with_polymorphic(Person, [Engineer], select(Person))
- with testing.expect_deprecated_20(
- "Support for negative indexes for SQL index / slice"
- ):
- eq_(u1.addresses[-5:-2], [Address(id=2)])
+ is_true(
+ sa.inspect(p_poly).selectable.compare(select(Person).subquery())
+ )
- with testing.expect_deprecated_20(
- "Support for negative indexes for SQL index / slice"
- ):
- eq_(u1.addresses[-2], Address(id=3))
- with testing.expect_deprecated_20(
- "Support for negative indexes for SQL index / slice"
- ):
- eq_(u1.addresses[:-2], [Address(id=2)])
+class DeprecatedMapperTest(
+ fixtures.RemovesEvents, _fixtures.FixtureTest, AssertsCompiledSQL
+):
+ __dialect__ = "default"
+ def test_listen_on_mapper_mapper_event_fn(self, registry):
+ from sqlalchemy.orm import mapper
-class SessionTest(fixtures.RemovesEvents, _LocalFixture):
- def test_transaction_attr(self):
- s1 = Session(testing.db)
+ m1 = Mock()
- with testing.expect_deprecated_20(
- "The Session.transaction attribute is considered legacy as "
- "of the 1.x series"
+ with expect_deprecated(
+ r"The `sqlalchemy.orm.mapper\(\)` symbol is deprecated and "
+ "will be removed"
):
- s1.transaction
- def test_textual_execute(self, connection):
- """test that Session.execute() converts to text()"""
+ @event.listens_for(mapper, "before_configured")
+ def go():
+ m1()
- users = self.tables.users
+ @registry.mapped
+ class MyClass:
+ __tablename__ = "t1"
+ id = Column(Integer, primary_key=True)
- with Session(bind=connection) as sess:
- sess.execute(users.insert(), dict(id=7, name="jack"))
+ registry.configure()
+ eq_(m1.mock_calls, [call()])
- with testing.expect_deprecated_20(
- "Using plain strings to indicate SQL statements "
- "without using the text"
- ):
- # use :bindparam style
- eq_(
- sess.execute(
- "select * from users where id=:id", {"id": 7}
- ).fetchall(),
- [(7, "jack")],
- )
+ def test_listen_on_mapper_instrumentation_event_fn(self, registry):
+ from sqlalchemy.orm import mapper
- with testing.expect_deprecated_20(
- "Using plain strings to indicate SQL statements "
- "without using the text"
- ):
- # use :bindparam style
- eq_(
- sess.scalar(
- "select id from users where id=:id", {"id": 7}
- ),
- 7,
- )
+ m1 = Mock()
- def test_session_str(self):
- s1 = Session(testing.db)
- str(s1)
+ with expect_deprecated(
+ r"The `sqlalchemy.orm.mapper\(\)` symbol is deprecated and "
+ "will be removed"
+ ):
- @testing.combinations(
- {"mapper": None},
- {"clause": None},
- {"bind_arguments": {"mapper": None}, "clause": None},
- {"bind_arguments": {}, "clause": None},
- )
- def test_bind_kwarg_deprecated(self, kw):
- s1 = Session(testing.db)
-
- for meth in s1.execute, s1.scalar:
- m1 = mock.Mock(side_effect=s1.get_bind)
- with mock.patch.object(s1, "get_bind", m1):
- expr = text("select 1")
-
- with testing.expect_deprecated_20(
- r"Passing bind arguments to Session.execute\(\) as "
- "keyword "
- "arguments is deprecated and will be removed SQLAlchemy "
- "2.0"
- ):
- meth(expr, **kw)
-
- bind_arguments = kw.pop("bind_arguments", None)
- if bind_arguments:
- bind_arguments.update(kw)
-
- if "clause" not in kw:
- bind_arguments["clause"] = expr
- eq_(m1.mock_calls, [call(**bind_arguments)])
- else:
- if "clause" not in kw:
- kw["clause"] = expr
- eq_(m1.mock_calls, [call(**kw)])
+ @event.listens_for(mapper, "init")
+ def go(target, args, kwargs):
+ m1(target, args, kwargs)
+ @registry.mapped
+ class MyClass:
+ __tablename__ = "t1"
+ id = Column(Integer, primary_key=True)
-class DeprecatedInhTest(_poly_fixtures._Polymorphic):
- def test_with_polymorphic(self):
- Person = _poly_fixtures.Person
- Engineer = _poly_fixtures.Engineer
+ mc = MyClass(id=5)
+ eq_(m1.mock_calls, [call(mc, (), {"id": 5})])
- with DeprecatedQueryTest._expect_implicit_subquery():
- p_poly = with_polymorphic(Person, [Engineer], select(Person))
+ def test_we_couldnt_remove_mapper_yet(self):
+ """test that the mapper() function is present but raises an
+ informative error when used.
- is_true(
- sa.inspect(p_poly).selectable.compare(select(Person).subquery())
- )
+ The function itself was slated for removal in 2.0, however we
+ neglected to deprecate its use as an event target, so it needs
+ to stay around for at least one more release cycle.
+ """
-class DeprecatedMapperTest(_fixtures.FixtureTest, AssertsCompiledSQL):
- __dialect__ = "default"
+ class MyClass:
+ pass
+
+ t1 = Table("t1", MetaData(), Column("id", Integer, primary_key=True))
+
+ from sqlalchemy.orm import mapper
+
+ with assertions.expect_raises_message(
+ sa_exc.InvalidRequestError,
+ r"The 'sqlalchemy.orm.mapper\(\)' function is removed as of "
+ "SQLAlchemy 2.0.",
+ ):
+ mapper(MyClass, t1)
def test_deferred_scalar_loader_name_change(self):
class Foo:
("passive_updates", False),
("enable_typechecks", False),
("active_history", True),
- ("cascade_backrefs", False),
)
def test_viewonly_warning(self, flag, value):
Order = self.classes.Order
non_primary=True,
)
- def test_illegal_non_primary_legacy(self):
+ def test_illegal_non_primary_legacy(self, registry):
users, Address, addresses, User = (
self.tables.users,
self.classes.Address,
self.classes.User,
)
- with testing.expect_deprecated(
- "Calling the mapper.* function directly outside of a declarative "
- ):
- mapper(User, users)
- with testing.expect_deprecated(
- "Calling the mapper.* function directly outside of a declarative "
- ):
- mapper(Address, addresses)
+ registry.map_imperatively(User, users)
+ registry.map_imperatively(Address, addresses)
with testing.expect_deprecated(
"The mapper.non_primary parameter is deprecated"
):
- m = mapper( # noqa F841
+ m = registry.map_imperatively( # noqa: F841
User,
users,
non_primary=True,
configure_mappers,
)
- def test_illegal_non_primary_2_legacy(self):
+ def test_illegal_non_primary_2_legacy(self, registry):
User, users = self.classes.User, self.tables.users
- with testing.expect_deprecated(
- "The mapper.non_primary parameter is deprecated"
- ):
- assert_raises_message(
- sa.exc.InvalidRequestError,
- "Configure a primary mapper first",
- mapper,
- User,
- users,
- non_primary=True,
- )
+ assert_raises_message(
+ sa.exc.InvalidRequestError,
+ "Configure a primary mapper first",
+ registry.map_imperatively,
+ User,
+ users,
+ non_primary=True,
+ )
- def test_illegal_non_primary_3_legacy(self):
+ def test_illegal_non_primary_3_legacy(self, registry):
users, addresses = self.tables.users, self.tables.addresses
class Base:
class Sub(Base):
pass
- with testing.expect_deprecated(
- "Calling the mapper.* function directly outside of a declarative "
- ):
- mapper(Base, users)
- with testing.expect_deprecated(
- "The mapper.non_primary parameter is deprecated",
- ):
- assert_raises_message(
- sa.exc.InvalidRequestError,
- "Configure a primary mapper first",
- mapper,
- Sub,
- addresses,
- non_primary=True,
- )
+ registry.map_imperatively(Base, users)
+
+ assert_raises_message(
+ sa.exc.InvalidRequestError,
+ "Configure a primary mapper first",
+ registry.map_imperatively,
+ Sub,
+ addresses,
+ non_primary=True,
+ )
class InstancesTest(QueryTest, AssertsCompiledSQL):
self.assert_sql_count(testing.db, go, 1)
-class TestDeprecation20(QueryTest):
- def test_relation(self):
- User = self.classes.User
- with testing.expect_deprecated_20(".*relationship"):
- relation(User.addresses)
-
- def test_eagerloading(self):
- User = self.classes.User
- with testing.expect_deprecated_20(".*joinedload"):
- eagerload(User.addresses)
-
-
-class DistinctOrderByImplicitTest(QueryTest, AssertsCompiledSQL):
- __dialect__ = "default"
-
- def test_columns_augmented_roundtrip_three(self):
- User, Address = self.classes.User, self.classes.Address
-
- sess = fixture_session()
-
- q = (
- sess.query(User.id, User.name.label("foo"), Address.id)
- .join(Address, true())
- .filter(User.name == "jack")
- .filter(User.id + Address.user_id > 0)
- .distinct()
- .order_by(User.id, User.name, Address.email_address)
- )
-
- # even though columns are added, they aren't in the result
- with testing.expect_deprecated(
- "ORDER BY columns added implicitly due to "
- ):
- eq_(
- q.all(),
- [
- (7, "jack", 3),
- (7, "jack", 4),
- (7, "jack", 2),
- (7, "jack", 5),
- (7, "jack", 1),
- ],
- )
- for row in q:
- eq_(row._mapping.keys(), ["id", "foo", "id"])
-
- def test_columns_augmented_sql_one(self):
- User, Address = self.classes.User, self.classes.Address
-
- sess = fixture_session()
-
- q = (
- sess.query(User.id, User.name.label("foo"), Address.id)
- .distinct()
- .order_by(User.id, User.name, Address.email_address)
- )
-
- # Address.email_address is added because of DISTINCT,
- # however User.id, User.name are not b.c. they're already there,
- # even though User.name is labeled
- with testing.expect_deprecated(
- "ORDER BY columns added implicitly due to "
- ):
- self.assert_compile(
- q,
- "SELECT DISTINCT users.id AS users_id, users.name AS foo, "
- "addresses.id AS addresses_id, addresses.email_address AS "
- "addresses_email_address FROM users, addresses "
- "ORDER BY users.id, users.name, addresses.email_address",
- )
-
-
class SessionEventsTest(_RemoveListeners, _fixtures.FixtureTest):
run_inserts = None
)
-class CollectionCascadesDespiteBackrefTest(fixtures.TestBase):
- """test old cascade_backrefs behavior
-
- see test/orm/test_cascade.py::class CollectionCascadesNoBackrefTest
- for the future version
-
- """
-
- @testing.fixture
- def cascade_fixture(self, registry):
- def go(collection_class):
- @registry.mapped
- class A:
- __tablename__ = "a"
-
- id = Column(Integer, primary_key=True)
- bs = relationship(
- "B", backref="a", collection_class=collection_class
- )
-
- @registry.mapped
- class B:
- __tablename__ = "b_"
- id = Column(Integer, primary_key=True)
- a_id = Column(ForeignKey("a.id"))
- key = Column(String)
-
- return A, B
-
- yield go
-
- @testing.combinations(
- (set, "add"),
- (list, "append"),
- (attribute_mapped_collection("key"), "__setitem__"),
- (attribute_mapped_collection("key"), "setdefault"),
- (attribute_mapped_collection("key"), "update_dict"),
- (attribute_mapped_collection("key"), "update_kw"),
- argnames="collection_class,methname",
- )
- @testing.combinations((True,), (False,), argnames="future")
- def test_cascades_on_collection(
- self, cascade_fixture, collection_class, methname, future
- ):
- A, B = cascade_fixture(collection_class)
-
- s = Session(future=future)
-
- a1 = A()
- s.add(a1)
-
- b1 = B(key="b1")
- b2 = B(key="b2")
- b3 = B(key="b3")
-
- if future:
- dep_ctx = nullcontext
- else:
-
- def dep_ctx():
- return assertions.expect_deprecated_20(
- '"B" object is being merged into a Session along the '
- 'backref cascade path for relationship "A.bs"'
- )
-
- with dep_ctx():
- b1.a = a1
- with dep_ctx():
- b3.a = a1
-
- if future:
- assert b1 not in s
- assert b3 not in s
- else:
- assert b1 in s
- assert b3 in s
-
- if methname == "__setitem__":
- meth = getattr(a1.bs, methname)
- meth(b1.key, b1)
- meth(b2.key, b2)
- elif methname == "setdefault":
- meth = getattr(a1.bs, methname)
- meth(b1.key, b1)
- meth(b2.key, b2)
- elif methname == "update_dict" and isinstance(a1.bs, dict):
- a1.bs.update({b1.key: b1, b2.key: b2})
- elif methname == "update_kw" and isinstance(a1.bs, dict):
- a1.bs.update(b1=b1, b2=b2)
- else:
- meth = getattr(a1.bs, methname)
- meth(b1)
- meth(b2)
-
- assert b1 in s
- assert b2 in s
-
- # future version:
- if future:
- assert b3 not in s # the event never triggers from reverse
- else:
- # old behavior
- assert b3 in s
-
-
-class LoadOnFKsTest(fixtures.DeclarativeMappedTest):
- @classmethod
- def setup_classes(cls):
- Base = cls.DeclarativeBasic
-
- class Parent(Base):
- __tablename__ = "parent"
- __table_args__ = {"mysql_engine": "InnoDB"}
-
- id = Column(
- Integer, primary_key=True, test_needs_autoincrement=True
- )
-
- class Child(Base):
- __tablename__ = "child"
- __table_args__ = {"mysql_engine": "InnoDB"}
-
- id = Column(
- Integer, primary_key=True, test_needs_autoincrement=True
- )
- parent_id = Column(Integer, ForeignKey("parent.id"))
-
- parent = relationship(Parent, backref=backref("children"))
-
- @testing.fixture
- def parent_fixture(self, connection):
- Parent, Child = self.classes("Parent", "Child")
-
- sess = fixture_session(bind=connection, autoflush=False)
- p1 = Parent()
- p2 = Parent()
- c1, c2 = Child(), Child()
- c1.parent = p1
- sess.add_all([p1, p2])
- assert c1 in sess
-
- yield sess, p1, p2, c1, c2
-
- sess.close()
-
- def test_enable_rel_loading_on_persistent_allows_backref_event(
- self, parent_fixture
- ):
- sess, p1, p2, c1, c2 = parent_fixture
- Parent, Child = self.classes("Parent", "Child")
-
- c3 = Child()
- sess.enable_relationship_loading(c3)
- c3.parent_id = p1.id
- with assertions.expect_deprecated_20(
- '"Child" object is being merged into a Session along the '
- 'backref cascade path for relationship "Parent.children"'
- ):
- c3.parent = p1
-
- # backref fired off when c3.parent was set,
- # because the "old" value was None
- # change as of [ticket:3708]
- assert c3 in p1.children
-
- def test_enable_rel_loading_allows_backref_event(self, parent_fixture):
- sess, p1, p2, c1, c2 = parent_fixture
- Parent, Child = self.classes("Parent", "Child")
-
- c3 = Child()
- sess.enable_relationship_loading(c3)
- c3.parent_id = p1.id
-
- with assertions.expect_deprecated_20(
- '"Child" object is being merged into a Session along the '
- 'backref cascade path for relationship "Parent.children"'
- ):
- c3.parent = p1
-
- # backref fired off when c3.parent was set,
- # because the "old" value was None
- # change as of [ticket:3708]
- assert c3 in p1.children
-
-
-class LazyTest(_fixtures.FixtureTest):
- run_inserts = "once"
- run_deletes = None
-
- def test_backrefs_dont_lazyload(self):
- users, Address, addresses, User = (
- self.tables.users,
- self.classes.Address,
- self.tables.addresses,
- self.classes.User,
- )
-
- self.mapper_registry.map_imperatively(
- User,
- users,
- properties={"addresses": relationship(Address, backref="user")},
- )
- self.mapper_registry.map_imperatively(Address, addresses)
- sess = fixture_session(autoflush=False)
- ad = sess.query(Address).filter_by(id=1).one()
- assert ad.user.id == 7
-
- def go():
- ad.user = None
- assert ad.user is None
-
- self.assert_sql_count(testing.db, go, 0)
-
- u1 = sess.query(User).filter_by(id=7).one()
-
- def go():
- assert ad not in u1.addresses
-
- self.assert_sql_count(testing.db, go, 1)
-
- sess.expire(u1, ["addresses"])
-
- def go():
- assert ad in u1.addresses
-
- self.assert_sql_count(testing.db, go, 1)
-
- sess.expire(u1, ["addresses"])
- ad2 = Address()
-
- def go():
- with assertions.expect_deprecated_20(
- ".* object is being merged into a Session along the "
- "backref cascade path for relationship "
- ):
- ad2.user = u1
- assert ad2.user is u1
-
- self.assert_sql_count(testing.db, go, 0)
-
- def go():
- assert ad2 in u1.addresses
-
- self.assert_sql_count(testing.db, go, 1)
-
-
class MergeResultTest(_fixtures.FixtureTest):
run_setup_mappers = "once"
run_inserts = "once"
from sqlalchemy.orm import joinedload
from sqlalchemy.orm import lazyload
from sqlalchemy.orm import Mapper
-from sqlalchemy.orm import mapper
from sqlalchemy.orm import mapperlib
from sqlalchemy.orm import query
from sqlalchemy.orm import relationship
def init_e(target, args, kwargs):
canary.append(("init_e", target))
- event.listen(mapper, "init", init_a)
- event.listen(Mapper, "init", init_b)
- event.listen(class_mapper(A), "init", init_c)
- event.listen(A, "init", init_d)
- event.listen(A, "init", init_e, propagate=True)
+ event.listen(Mapper, "init", init_a)
+ event.listen(class_mapper(A), "init", init_b)
+ event.listen(A, "init", init_c)
+ event.listen(A, "init", init_d, propagate=True)
a = A()
eq_(
("init_b", a),
("init_c", a),
("init_d", a),
- ("init_e", a),
],
)
# test propagate flag
canary[:] = []
b = B()
- eq_(canary, [("init_a", b), ("init_b", b), ("init_e", b)])
+ eq_(canary, [("init_a", b), ("init_d", b)])
def listen_all(self, mapper, **kw):
canary = []
canary = Mock()
- event.listen(mapper, "before_configured", canary.listen1)
- event.listen(mapper, "before_configured", canary.listen2, insert=True)
- event.listen(mapper, "before_configured", canary.listen3)
- event.listen(mapper, "before_configured", canary.listen4, insert=True)
+ event.listen(Mapper, "before_configured", canary.listen1)
+ event.listen(Mapper, "before_configured", canary.listen2, insert=True)
+ event.listen(Mapper, "before_configured", canary.listen3)
+ event.listen(Mapper, "before_configured", canary.listen4, insert=True)
configure_mappers()
def load(obj, ctx):
canary.append("load")
- event.listen(mapper, "load", load)
+ event.listen(Mapper, "load", load)
s = fixture_session()
u = User(name="u1")
assert_raises_message(
sa.exc.SAWarning,
r"before_configured' and 'after_configured' ORM events only "
- r"invoke with the mapper\(\) function or Mapper class as "
+ r"invoke with the Mapper class as "
r"the target.",
event.listen,
User,
assert_raises_message(
sa.exc.SAWarning,
r"before_configured' and 'after_configured' ORM events only "
- r"invoke with the mapper\(\) function or Mapper class as "
+ r"invoke with the Mapper class as "
r"the target.",
event.listen,
User,
self.mapper_registry.map_imperatively(User, users)
- event.listen(mapper, "before_configured", m1)
- event.listen(mapper, "after_configured", m2)
+ event.listen(Mapper, "before_configured", m1)
+ event.listen(Mapper, "after_configured", m2)
inspect(User)._post_inspect
canary.init()
# mapper level event
- @event.listens_for(mapper, "instrument_class")
+ @event.listens_for(Mapper, "instrument_class")
def instrument_class(mp, class_):
canary.instrument_class(class_)
sess.rollback()
eq_(assertions, [True, True])
- @testing.combinations((True,), (False,))
- def test_autobegin_no_reentrant(self, future):
- s1 = fixture_session(future=future)
+ def test_autobegin_no_reentrant(self):
+ s1 = fixture_session()
canary = Mock()
properties={"addresses": relationship(Address, backref="user")},
)
self.mapper_registry.map_imperatively(Address, addresses)
- sess = fixture_session(autoflush=False, future=True)
+ sess = fixture_session(autoflush=False)
ad = sess.query(Address).filter_by(id=1).one()
assert ad.user.id == 7
self.assert_sql_count(testing.db, go, 0)
+ def test_enable_rel_loading_on_persistent_allows_backref_event(
+ self, parent_fixture
+ ):
+ sess, p1, p2, c1, c2 = parent_fixture
+ Parent, Child = self.classes("Parent", "Child")
+
+ c3 = Child()
+ sess.enable_relationship_loading(c3)
+ c3.parent_id = p1.id
+ c3.parent = p1
+
+ # the backref event did not fire off when c3.parent was set.
+ # originally this was impacted by #3708; it now does not happen
+ # at all due to the cascade_backrefs behavior being removed
+ assert c3 not in p1.children
+
+ def test_enable_rel_loading_allows_backref_event(self, parent_fixture):
+ sess, p1, p2, c1, c2 = parent_fixture
+ Parent, Child = self.classes("Parent", "Child")
+
+ c3 = Child()
+ sess.enable_relationship_loading(c3)
+ c3.parent_id = p1.id
+
+ c3.parent = p1
+
+ # the backref event did not fire off when c3.parent was set.
+ # originally this was impacted by #3708; it now does not happen
+ # at all due to the cascade_backrefs behavior being removed
+ assert c3 not in p1.children
+
def test_backref_doesnt_double(self, parent_fixture):
sess, p1, p2, c1, c2 = parent_fixture
Parent, Child = self.classes("Parent", "Child")
m = self.mapper(User, users)
session = fixture_session()
- session.connection(mapper=m)
+ session.connection(bind_arguments=dict(mapper=m))
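The change above reflects that per-call bind lookup hints such as `mapper=` are no longer standalone keywords on `Session.connection()`; they are grouped into the single `bind_arguments` dictionary. A small sketch of the new calling convention (the `User` model is illustrative):

```python
from sqlalchemy import Column, Integer, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

session = Session(engine)
# bind lookup hints are passed in one bind_arguments dictionary
conn = session.connection(bind_arguments={"mapper": User})
# within the same transaction the session reuses that connection
same = conn is session.connection()
session.close()
```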
def test_incomplete_columns(self):
"""Loading from a select which does not contain all columns"""
session.commit()
# testing #3108
- session.begin_nested()
+ nt1 = session.begin_nested()
a_published.status = ARCHIVED
a_editable.status = PUBLISHED
- session.commit()
+ nt1.commit()
session.rollback()
eq_(a_published.status, PUBLISHED)
def test_negative_indexes_raise(self):
User = self.classes.User
- sess = fixture_session(future=True)
+ sess = fixture_session()
q = sess.query(User).order_by(User.id)
with expect_raises_message(
[(7,), (8,), (9,)],
)
+ def test_no_string_execute(self, connection):
+
+ with Session(bind=connection) as sess:
+ with expect_raises_message(
+ sa.exc.ArgumentError,
+ r"Textual SQL expression 'select \* from users where.*' "
+ "should be explicitly declared",
+ ):
+ sess.execute("select * from users where id=:id", {"id": 7})
+
+ with expect_raises_message(
+ sa.exc.ArgumentError,
+ r"Textual SQL expression 'select id from users .*' "
+ "should be explicitly declared",
+ ):
+ sess.scalar("select id from users where id=:id", {"id": 7})
+
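The added `test_no_string_execute` verifies that plain strings are rejected by `Session.execute()` and `Session.scalar()`. The modern spelling wraps ad-hoc SQL in `text()` explicitly, e.g.:

```python
from sqlalchemy import create_engine, text
from sqlalchemy.orm import Session

engine = create_engine("sqlite://")
with Session(engine) as sess:
    # raw strings raise ArgumentError; textual SQL must be declared via text()
    value = sess.execute(text("select :x + 1"), {"x": 41}).scalar()
```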
class TransScopingTest(_fixtures.FixtureTest):
run_inserts = None
assert sess.is_active
def test_active_flag_autobegin_future(self):
- sess = Session(bind=config.db, future=True)
+ sess = Session(bind=config.db)
assert sess.is_active
assert not sess.in_transaction()
sess.begin()
assert sess.is_active
sess.begin(_subtrans=True)
sess.rollback()
- assert not sess.is_active
+ assert sess.is_active
sess.rollback()
assert sess.is_active
)
self.mapper_registry.map_imperatively(Address, addresses)
- session = fixture_session(future=True)
+ session = fixture_session()
@event.listens_for(session, "after_flush")
def load_collections(session, flush_context):
run_inserts = None
__backend__ = True
- @testing.fixture
- def conn(self):
- with testing.db.connect() as conn:
- yield conn
-
- @testing.fixture
- def future_conn(self):
-
- engine = testing.db
- with engine.connect() as conn:
- yield conn
-
- def test_no_close_transaction_on_flush(self, conn):
+ def test_no_close_transaction_on_flush(self, connection):
User, users = self.classes.User, self.tables.users
- c = conn
+ c = connection
self.mapper_registry.map_imperatively(User, users)
s = Session(bind=c)
s.begin()
- tran = s._legacy_transaction()
+ tran = s.get_transaction()
s.add(User(name="first"))
s.flush()
c.exec_driver_sql("select * from users")
u = User(name="third")
s.add(u)
s.flush()
- assert s._legacy_transaction() is tran
+ assert s.get_transaction() is tran
tran.close()
- def test_subtransaction_on_external_no_begin(self, conn):
+ def test_subtransaction_on_external_no_begin(self, connection_no_trans):
users, User = self.tables.users, self.classes.User
+ connection = connection_no_trans
self.mapper_registry.map_imperatively(User, users)
- trans = conn.begin()
- sess = Session(bind=conn, autoflush=True)
+ trans = connection.begin()
+ sess = Session(bind=connection, autoflush=True)
u = User(name="ed")
sess.add(u)
sess.flush()
sess.close()
@testing.requires.savepoints
- def test_external_nested_transaction(self, conn):
+ def test_external_nested_transaction(self, connection_no_trans):
users, User = self.tables.users, self.classes.User
self.mapper_registry.map_imperatively(User, users)
- trans = conn.begin()
- sess = Session(bind=conn, autoflush=True)
+
+ connection = connection_no_trans
+ trans = connection.begin()
+ sess = Session(bind=connection, autoflush=True)
u1 = User(name="u1")
sess.add(u1)
sess.flush()
trans.commit()
assert len(sess.query(User).all()) == 1
- def test_subtransaction_on_external_commit_future(self, future_conn):
+ def test_subtransaction_on_external_commit(self, connection_no_trans):
users, User = self.tables.users, self.classes.User
self.mapper_registry.map_imperatively(User, users)
- conn = future_conn
- conn.begin()
+ connection = connection_no_trans
+ connection.begin()
- sess = Session(bind=conn, autoflush=True)
+ sess = Session(bind=connection, autoflush=True)
u = User(name="ed")
sess.add(u)
sess.flush()
sess.commit() # commit does nothing
- conn.rollback() # rolls back
+ connection.rollback() # rolls back
assert len(sess.query(User).all()) == 0
sess.close()
- def test_subtransaction_on_external_rollback_future(self, future_conn):
+ def test_subtransaction_on_external_rollback(self, connection_no_trans):
users, User = self.tables.users, self.classes.User
self.mapper_registry.map_imperatively(User, users)
- conn = future_conn
- conn.begin()
+ connection = connection_no_trans
+ connection.begin()
- sess = Session(bind=conn, autoflush=True)
+ sess = Session(bind=connection, autoflush=True)
u = User(name="ed")
sess.add(u)
sess.flush()
sess.rollback() # rolls back
- conn.commit() # nothing to commit
+ connection.commit() # nothing to commit
assert len(sess.query(User).all()) == 0
sess.close()
@testing.requires.savepoints
- def test_savepoint_on_external_future(self, future_conn):
+ def test_savepoint_on_external(self, connection_no_trans):
users, User = self.tables.users, self.classes.User
self.mapper_registry.map_imperatively(User, users)
- conn = future_conn
- conn.begin()
- sess = Session(bind=conn, autoflush=True)
+ connection = connection_no_trans
+ connection.begin()
+ sess = Session(bind=connection, autoflush=True)
u1 = User(name="u1")
sess.add(u1)
sess.flush()
- sess.begin_nested()
+ n1 = sess.begin_nested()
u2 = User(name="u2")
sess.add(u2)
sess.flush()
- sess.rollback()
+ n1.rollback()
- conn.commit()
+ connection.commit()
assert len(sess.query(User).all()) == 1
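The savepoint hunks above rely on `Session.begin_nested()` returning a `SessionTransaction` handle that is committed or rolled back directly, rather than calling `sess.rollback()`. A self-contained sketch of that pattern against SQLite (the table name is illustrative; the two event hooks are the documented pysqlite workaround, since that driver's default transaction handling does not cooperate with SAVEPOINT):

```python
from sqlalchemy import create_engine, event, text
from sqlalchemy.orm import Session

engine = create_engine("sqlite://")

@event.listens_for(engine, "connect")
def _do_connect(dbapi_connection, connection_record):
    # disable pysqlite's own BEGIN emission
    dbapi_connection.isolation_level = None

@event.listens_for(engine, "begin")
def _do_begin(conn):
    # emit our own BEGIN so SAVEPOINT works
    conn.exec_driver_sql("BEGIN")

with Session(engine) as sess:
    sess.execute(text("create table t (x integer)"))
    sess.execute(text("insert into t (x) values (1)"))
    nested = sess.begin_nested()   # SAVEPOINT; returns a SessionTransaction
    sess.execute(text("insert into t (x) values (2)"))
    nested.rollback()              # rolls back to the SAVEPOINT only
    count = sess.execute(text("select count(*) from t")).scalar()
```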
@testing.requires.savepoints
session = fixture_session()
session.begin()
- session.begin_nested()
+ n1 = session.begin_nested()
u1 = User(name="u1")
session.add(u1)
- session.commit()
+ n1.commit()
assert u1 in session
session.rollback()
assert u1 not in session
session.begin()
u1 = session.query(User).first()
- session.begin_nested()
+ n1 = session.begin_nested()
session.delete(u1)
- session.commit()
+ n1.commit()
assert u1 not in session
session.rollback()
assert u1 in session
assert attributes.instance_state(u1) in nt2._dirty
assert attributes.instance_state(u1) not in nt1._dirty
- s.commit()
- assert attributes.instance_state(u1) in nt2._dirty
- assert attributes.instance_state(u1) in nt1._dirty
-
- s.rollback()
- assert attributes.instance_state(u1).expired
- eq_(u1.name, "u1")
-
- @testing.requires.savepoints
- def test_dirty_state_transferred_deep_nesting_future(self):
- User, users = self.classes.User, self.tables.users
-
- self.mapper_registry.map_imperatively(User, users)
-
- with fixture_session(future=True) as s:
- u1 = User(name="u1")
- s.add(u1)
- s.commit()
-
- nt1 = s.begin_nested()
- nt2 = s.begin_nested()
- u1.name = "u2"
- assert attributes.instance_state(u1) not in nt2._dirty
- assert attributes.instance_state(u1) not in nt1._dirty
- s.flush()
- assert attributes.instance_state(u1) in nt2._dirty
- assert attributes.instance_state(u1) not in nt1._dirty
-
nt2.commit()
assert attributes.instance_state(u1) in nt2._dirty
assert attributes.instance_state(u1) in nt1._dirty
sess.add(u)
sess.flush()
- sess.begin_nested() # nested transaction
+ n1 = sess.begin_nested() # nested transaction
u2 = User(name="u2")
sess.add(u2)
sess.flush()
- sess.rollback()
+ n1.rollback()
sess.commit()
assert len(sess.query(User).all()) == 1
sess.add(u2)
sess.flush()
- sess.rollback() # rolls back nested only
-
- sess.commit()
- assert len(sess.query(User).all()) == 1
- sess.close()
-
- @testing.requires.savepoints
- def test_nested_autotrans_future(self):
- User, users = self.classes.User, self.tables.users
-
- self.mapper_registry.map_imperatively(User, users)
- sess = fixture_session(future=True)
- u = User(name="u1")
- sess.add(u)
- sess.flush()
-
- sess.begin_nested() # nested transaction
-
- u2 = User(name="u2")
- sess.add(u2)
- sess.flush()
-
sess.rollback() # rolls back the whole trans
sess.commit()
sess.rollback()
sess.begin()
- sess.begin_nested()
+ n1 = sess.begin_nested()
u3 = User(name="u3")
sess.add(u3)
- sess.commit() # commit the nested transaction
+ n1.commit() # commit the nested transaction
sess.rollback()
eq_(set(sess.query(User).all()), set([u2]))
eq_(session.is_active, False)
session.rollback()
+ is_(session._transaction, None)
+
+ session.connection()
+
# back to normal
eq_(session._transaction._state, _session.ACTIVE)
eq_(session.is_active, True)
sess.add(User(id=5, name="some name"))
sess.commit()
- def test_no_autocommit_with_explicit_commit(self):
+ def test_no_autobegin_after_explicit_commit(self):
User, users = self.classes.User, self.tables.users
self.mapper_registry.map_imperatively(User, users)
session = fixture_session()
session.add(User(name="ed"))
- session._legacy_transaction().commit()
-
- is_not(session._legacy_transaction(), None)
-
- def test_no_autocommit_with_explicit_commit_future(self):
- User, users = self.classes.User, self.tables.users
+ session.get_transaction().commit()
- self.mapper_registry.map_imperatively(User, users)
- session = fixture_session(future=True)
- session.add(User(name="ed"))
- session._legacy_transaction().commit()
+ is_(session.get_transaction(), None)
- # new in 1.4
- is_(session._legacy_transaction(), None)
+ session.connection()
+ is_not(session.get_transaction(), None)
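The hunk above swaps the internal `_legacy_transaction()` accessor for the public `Session.get_transaction()`, paired with "autobegin": a transaction exists only once the session does work. A minimal sketch of inspecting that state:

```python
from sqlalchemy import create_engine
from sqlalchemy.orm import Session

engine = create_engine("sqlite://")
with Session(engine) as sess:
    sess.connection()              # ensures a transaction has begun
    tx = sess.get_transaction()    # public replacement for _legacy_transaction()
    in_txn = tx is not None and tx.nested is False
    sess.rollback()
```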
class _LocalFixture(FixtureTest):
argnames="target_recipe,recipe_rollsback_early",
id_="ns",
)
-@testing.combinations((True,), (False,), argnames="future", id_="s")
class SubtransactionRecipeTest(FixtureTest):
run_inserts = None
__backend__ = True
def test_recipe_heavy_nesting(self, subtransaction_recipe):
users = self.tables.users
- with fixture_session(future=self.future) as session:
+ with fixture_session() as session:
with subtransaction_recipe(session):
session.connection().execute(
users.insert().values(name="user1")
self.mapper_registry.map_imperatively(User, users)
with testing.db.connect() as conn:
trans = conn.begin()
- sess = Session(conn, future=self.future)
+ sess = Session(conn)
with subtransaction_recipe(sess):
u = User(name="ed")
User, users = self.classes.User, self.tables.users
self.mapper_registry.map_imperatively(User, users)
- with fixture_session(future=self.future) as sess:
+ with fixture_session() as sess:
with subtransaction_recipe(sess):
u = User(name="u1")
sess.add(u)
User, users = self.classes.User, self.tables.users
self.mapper_registry.map_imperatively(User, users)
- with fixture_session(future=self.future) as sess:
+ with fixture_session() as sess:
sess.begin()
with subtransaction_recipe(sess):
u = User(name="u1")
self.mapper_registry.map_imperatively(User, users)
- with fixture_session(future=self.future) as sess:
+ with fixture_session() as sess:
sess.begin()
sess.begin_nested()
sess.add(User(name="u2"))
t2.commit()
- assert sess._legacy_transaction() is t1
+ assert sess.get_transaction() is t1
def test_recipe_error_on_using_inactive_session_commands(
self, subtransaction_recipe
users, User = self.tables.users, self.classes.User
self.mapper_registry.map_imperatively(User, users)
- with fixture_session(future=self.future) as sess:
+ with fixture_session() as sess:
sess.begin()
try:
assert not sess.in_transaction()
def test_recipe_multi_nesting(self, subtransaction_recipe):
- with fixture_session(future=self.future) as sess:
+ with fixture_session() as sess:
with subtransaction_recipe(sess):
assert sess.in_transaction()
try:
with subtransaction_recipe(sess):
- assert sess._legacy_transaction()
+ assert sess.get_transaction()
raise Exception("force rollback")
except:
pass
assert not sess.in_transaction()
def test_recipe_deactive_status_check(self, subtransaction_recipe):
- with fixture_session(future=self.future) as sess:
+ with fixture_session() as sess:
sess.begin()
with subtransaction_recipe(sess):
run_inserts = None
__backend__ = True
- def _run_test(self, update_fn, future=False):
+ def _run_test(self, update_fn):
User, users = self.classes.User, self.tables.users
self.mapper_registry.map_imperatively(User, users)
- with fixture_session(future=future) as s:
+ with fixture_session() as s:
u1 = User(name="u1")
u2 = User(name="u2")
s.add_all([u1, u2])
eq_(u2.name, "u2modified")
s.rollback()
- if future:
- assert s._transaction is None
- assert "name" not in u1.__dict__
- else:
- assert s._transaction is trans
- eq_(u1.__dict__["name"], "u1")
+ assert s._transaction is None
+ assert "name" not in u1.__dict__
assert "name" not in u2.__dict__
eq_(u2.name, "u2")
u1.name = "edward"
a1.email_address = "foober"
- s.begin_nested()
+ nt1 = s.begin_nested()
s.add(u2)
with expect_warnings("New instance"):
assert_raises(sa_exc.IntegrityError, s.commit)
assert_raises(sa_exc.InvalidRequestError, s.commit)
- s.rollback()
+ nt1.rollback()
assert u2 not in s
assert a2 not in s
assert u1 in s
u2 = User(name="jack")
s.add_all([u1, u2])
- s.begin_nested()
+ nt1 = s.begin_nested()
u3 = User(name="wendy")
u4 = User(name="foo")
u1.name = "edward"
s.query(User.name).order_by(User.id).all(),
[("edward",), ("jackward",), ("wendy",), ("foo",)],
)
- s.rollback()
+ nt1.rollback()
assert u1.name == "ed"
assert u2.name == "jack"
eq_(s.query(User.name).order_by(User.id).all(), [("ed",), ("jack",)])
u2 = User(name="jack")
s.add_all([u1, u2])
- s.begin_nested()
+ nt1 = s.begin_nested()
u3 = User(name="wendy")
u4 = User(name="foo")
u1.name = "edward"
s.query(User.name).order_by(User.id).all(),
[("edward",), ("jackward",), ("wendy",), ("foo",)],
)
- s.commit()
+ nt1.commit()
def go():
assert u1.name == "edward"
u1.name = "edward"
u1.addresses.append(Address(email_address="bar"))
- s.begin_nested()
+ nt1 = s.begin_nested()
u2 = User(name="jack", addresses=[Address(email_address="bat")])
s.add(u2)
eq_(
User(name="jack", addresses=[Address(email_address="bat")]),
],
)
- s.rollback()
+ nt1.rollback()
eq_(
s.query(User).order_by(User.id).all(),
[
nested_trans = trans._connections[self.bind][1]
nested_trans._do_commit()
- is_(s._legacy_transaction(), trans)
+ is_(s.get_nested_transaction(), trans)
with expect_warnings("nested transaction already deassociated"):
# this previously would raise
assert u1 not in s.new
is_(trans._state, _session.CLOSED)
- is_not(s._legacy_transaction(), trans)
- is_(s._legacy_transaction()._state, _session.ACTIVE)
+ is_not(s.get_transaction(), trans)
- is_(s._legacy_transaction().nested, False)
+ s.connection()
+ is_(s.get_transaction()._state, _session.ACTIVE)
+
+ is_(s.get_transaction().nested, False)
- is_(s._legacy_transaction()._parent, None)
+ is_(s.get_transaction()._parent, None)
class AccountingFlagsTest(_LocalFixture):
def test_explicit_begin(self):
with fixture_session() as s1:
with s1.begin() as trans:
- is_(trans, s1._legacy_transaction())
+ is_(trans, s1.get_transaction())
s1.connection()
is_(s1._transaction, None)
)
@testing.requires.savepoints
- def test_future_rollback_is_global(self):
+ def test_rollback_is_global(self):
users = self.tables.users
- with fixture_session(future=True) as s1:
+ with fixture_session() as s1:
s1.begin()
s1.connection().execute(users.insert(), [{"id": 1, "name": "n1"}])
# rolls back the whole transaction
s1.rollback()
- is_(s1._legacy_transaction(), None)
+ is_(s1.get_transaction(), None)
eq_(
s1.connection().scalar(
)
s1.commit()
- is_(s1._legacy_transaction(), None)
-
- @testing.requires.savepoints
- def test_old_rollback_is_local(self):
- users = self.tables.users
-
- with fixture_session() as s1:
-
- t1 = s1.begin()
-
- s1.connection().execute(users.insert(), [{"id": 1, "name": "n1"}])
-
- s1.begin_nested()
-
- s1.connection().execute(
- users.insert(),
- [{"id": 2, "name": "n2"}, {"id": 3, "name": "n3"}],
- )
-
- eq_(
- s1.connection().scalar(
- select(func.count()).select_from(users)
- ),
- 3,
- )
-
- # rolls back only the savepoint
- s1.rollback()
-
- is_(s1._legacy_transaction(), t1)
-
- eq_(
- s1.connection().scalar(
- select(func.count()).select_from(users)
- ),
- 1,
- )
-
- s1.commit()
- eq_(
- s1.connection().scalar(
- select(func.count()).select_from(users)
- ),
- 1,
- )
- is_not(s1._legacy_transaction(), None)
+ is_(s1.get_transaction(), None)
def test_session_as_ctx_manager_one(self):
users = self.tables.users
with fixture_session() as sess:
- is_not(sess._legacy_transaction(), None)
-
- sess.connection().execute(
- users.insert().values(id=1, name="user1")
- )
-
- eq_(
- sess.connection().execute(users.select()).all(), [(1, "user1")]
- )
-
- is_not(sess._legacy_transaction(), None)
-
- is_not(sess._legacy_transaction(), None)
-
- # did not commit
- eq_(sess.connection().execute(users.select()).all(), [])
-
- def test_session_as_ctx_manager_future_one(self):
- users = self.tables.users
-
- with fixture_session(future=True) as sess:
- is_(sess._legacy_transaction(), None)
+ is_(sess.get_transaction(), None)
sess.connection().execute(
users.insert().values(id=1, name="user1")
sess.connection().execute(users.select()).all(), [(1, "user1")]
)
- is_not(sess._legacy_transaction(), None)
+ is_not(sess.get_transaction(), None)
- is_(sess._legacy_transaction(), None)
+ is_(sess.get_transaction(), None)
# did not commit
eq_(sess.connection().execute(users.select()).all(), [])
try:
with fixture_session() as sess:
- is_not(sess._legacy_transaction(), None)
-
- sess.connection().execute(
- users.insert().values(id=1, name="user1")
- )
-
- raise Exception("force rollback")
- except:
- pass
- is_not(sess._legacy_transaction(), None)
-
- def test_session_as_ctx_manager_two_future(self):
- users = self.tables.users
-
- try:
- with fixture_session(future=True) as sess:
- is_(sess._legacy_transaction(), None)
+ is_(sess.get_transaction(), None)
sess.connection().execute(
users.insert().values(id=1, name="user1")
raise Exception("force rollback")
except:
pass
- is_(sess._legacy_transaction(), None)
+ is_(sess.get_transaction(), None)
def test_begin_context_manager(self):
users = self.tables.users
eq_(sess.connection().execute(users.select()).all(), [(1, "user1")])
sess.close()
- @testing.combinations((True,), (False,), argnames="future")
- def test_interrupt_ctxmanager(self, trans_ctx_manager_fixture, future):
+ def test_interrupt_ctxmanager(self, trans_ctx_manager_fixture):
fn = trans_ctx_manager_fixture
- session = fixture_session(future=future)
+ session = fixture_session()
fn(session, trans_on_subject=True, execute_on_subject=True)
- @testing.combinations((True,), (False,), argnames="future")
@testing.combinations((True,), (False,), argnames="rollback")
@testing.combinations((True,), (False,), argnames="expire_on_commit")
@testing.combinations(
argnames="check_operation",
)
def test_interrupt_ctxmanager_ops(
- self, future, rollback, expire_on_commit, check_operation
+ self, rollback, expire_on_commit, check_operation
):
users, User = self.tables.users, self.classes.User
self.mapper_registry.map_imperatively(User, users)
- session = fixture_session(
- future=future, expire_on_commit=expire_on_commit
- )
+ session = fixture_session(expire_on_commit=expire_on_commit)
with session.begin():
u1 = User(id=7, name="u1")
s1.rollback()
- eq_(s1.in_transaction(), True)
- is_(s1._transaction, trans)
+ eq_(s1.in_transaction(), False)
+ is_(s1._transaction, None)
s1.rollback()
self.A = A
# bind an individual Session to the connection
- self.session = Session(bind=self.connection, future=True)
+ self.session = Session(bind=self.connection)
if testing.requires.savepoints.enabled:
self.nested = self.connection.begin_nested()
# outwit the database transaction isolation and SQLA's
# expiration at the same time by using different Session on
# same transaction
- s2 = Session(bind=s.connection(mapper=Node))
+ s2 = Session(bind=s.connection(bind_arguments=dict(mapper=Node)))
s2.query(Node).filter(Node.id == n2.id).update({"version_id": 3})
s2.commit()
), patch.object(
config.db.dialect, "supports_sane_multi_rowcount", False
):
- s2 = Session(bind=s.connection(mapper=Node))
+ s2 = Session(bind=s.connection(bind_arguments=dict(mapper=Node)))
s2.query(Node).filter(Node.id == n2.id).update({"version_id": 3})
s2.commit()
# outwit the database transaction isolation and SQLA's
# expiration at the same time by using different Session on
# same transaction
- s2 = Session(bind=s.connection(mapper=Node))
+ s2 = Session(bind=s.connection(bind_arguments=dict(mapper=Node)))
s2.query(Node).filter(Node.id == n1.id).update({"version_id": 3})
s2.commit()
def test_when_dicts(self, test_case, expected):
t = table("test", column("col1"))
- whens, value, else_ = testing.resolve_lambda(test_case, t=t)
-
- def _case_args(whens, value=None, else_=None):
- kw = {}
- if value is not None:
- kw["value"] = value
- if else_ is not None:
- kw["else_"] = else_
-
- return case(whens, **kw)
-
- # note: 1.3 also does not allow this form
- # case([whens], **kw)
+ when_dict, value, else_ = testing.resolve_lambda(test_case, t=t)
self.assert_compile(
- _case_args(whens=whens, value=value, else_=else_),
- expected,
+ case(when_dict, value=value, else_=else_), expected
)
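As the rewritten `test_when_dicts` shows, the helper that parsed optional keywords is gone: the dictionary of WHEN criteria is now passed positionally to `case()`, with `value=` and `else_=` as keywords. A small sketch of the modern shorthand form (the table is illustrative):

```python
from sqlalchemy import case, column, table

t = table("test", column("col1"))
# a single dict passed positionally, with value= supplying the test expression
expr = case({"x": "y", "p": "q"}, value=t.c.col1, else_="z")
sql = str(expr)
```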
def test_text_doesnt_explode(self, connection):
#! coding: utf-8
-import itertools
-import random
-
from sqlalchemy import alias
from sqlalchemy import and_
from sqlalchemy import bindparam
-from sqlalchemy import case
from sqlalchemy import CHAR
from sqlalchemy import column
from sqlalchemy import exc
from sqlalchemy.engine import default
from sqlalchemy.sql import coercions
from sqlalchemy.sql import LABEL_STYLE_TABLENAME_PLUS_COL
-from sqlalchemy.sql import literal
from sqlalchemy.sql import operators
from sqlalchemy.sql import quoted_name
from sqlalchemy.sql import roles
-from sqlalchemy.sql import update
from sqlalchemy.sql import visitors
from sqlalchemy.sql.selectable import SelectStatementGrouping
-from sqlalchemy.testing import assert_raises_message
from sqlalchemy.testing import assertions
from sqlalchemy.testing import AssertsCompiledSQL
from sqlalchemy.testing import eq_
from sqlalchemy.testing import mock
from sqlalchemy.testing.schema import Column
from sqlalchemy.testing.schema import Table
-from .test_update import _UpdateFromTestBase
class ToMetaDataTest(fixtures.TestBase):
):
eq_(stmt.froms, [t1])
- def test_case_list_legacy(self):
- t1 = table("t", column("q"))
-
- with testing.expect_deprecated(
- r"The \"whens\" argument to case\(\), when referring "
- r"to a sequence of items, is now passed"
- ):
- stmt = select(t1).where(
- case(
- [(t1.c.q == 5, "foo"), (t1.c.q == 10, "bar")], else_="bat"
- )
- != "bat"
- )
-
- self.assert_compile(
- stmt,
- "SELECT t.q FROM t WHERE CASE WHEN (t.q = :q_1) "
- "THEN :param_1 WHEN (t.q = :q_2) THEN :param_2 "
- "ELSE :param_3 END != :param_4",
- )
-
- def test_case_whens_kw(self):
- t1 = table("t", column("q"))
-
- with testing.expect_deprecated(
- r"The \"whens\" argument to case\(\), when referring "
- "to a sequence of items, is now passed"
- ):
- stmt = select(t1).where(
- case(
- whens=[(t1.c.q == 5, "foo"), (t1.c.q == 10, "bar")],
- else_="bat",
- )
- != "bat"
- )
-
- self.assert_compile(
- stmt,
- "SELECT t.q FROM t WHERE CASE WHEN (t.q = :q_1) "
- "THEN :param_1 WHEN (t.q = :q_2) THEN :param_2 "
- "ELSE :param_3 END != :param_4",
- )
-
- @testing.combinations(
- (
- (lambda t: ({"x": "y"}, t.c.col1, None)),
- "CASE test.col1 WHEN :param_1 THEN :param_2 END",
- ),
- (
- (lambda t: ({"x": "y", "p": "q"}, t.c.col1, None)),
- "CASE test.col1 WHEN :param_1 THEN :param_2 "
- "WHEN :param_3 THEN :param_4 END",
- ),
- (
- (lambda t: ({t.c.col1 == 7: "x"}, None, 10)),
- "CASE WHEN (test.col1 = :col1_1) THEN :param_1 ELSE :param_2 END",
- ),
- (
- (lambda t: ({t.c.col1 == 7: "x", t.c.col1 == 10: "y"}, None, 10)),
- "CASE WHEN (test.col1 = :col1_1) THEN :param_1 "
- "WHEN (test.col1 = :col1_2) THEN :param_2 ELSE :param_3 END",
- ),
- argnames="test_case, expected",
- )
- def test_when_kwarg(self, test_case, expected):
- t = table("test", column("col1"))
-
- whens, value, else_ = testing.resolve_lambda(test_case, t=t)
-
- def _case_args(whens, value=None, else_=None):
- kw = {}
- if value is not None:
- kw["value"] = value
- if else_ is not None:
- kw["else_"] = else_
-
- with testing.expect_deprecated_20(
- r'The "whens" argument to case\(\) is now passed using '
- r"positional style only, not as a keyword argument."
- ):
-
- return case(whens=whens, **kw)
-
- # note: 1.3 also does not allow this form
- # case([whens], **kw)
-
- self.assert_compile(
- _case_args(whens=whens, value=value, else_=else_),
- expected,
- )
-
- def test_case_whens_dict_kw(self):
- t1 = table("t", column("q"))
- with testing.expect_deprecated(
- r"The \"whens\" argument to case\(\) is now passed"
- ):
- stmt = select(t1).where(
- case(
- whens={t1.c.q == 5: "foo"},
- else_="bat",
- )
- != "bat"
- )
- self.assert_compile(
- stmt,
- "SELECT t.q FROM t WHERE CASE WHEN (t.q = :q_1) THEN "
- ":param_1 ELSE :param_2 END != :param_3",
- )
-
- def test_case_kw_arg_detection(self):
- # because we support py2k, case() has to parse **kw for now
-
- assert_raises_message(
- TypeError,
- "unknown arguments: bat, foo",
- case,
- (column("x") == 10, 5),
- else_=15,
- foo="bar",
- bat="hoho",
- )
-
- def test_with_only_generative(self):
- table1 = table(
- "table1",
- column("col1"),
- column("col2"),
- column("col3"),
- column("colx"),
- )
- s1 = table1.select().scalar_subquery()
-
- with testing.expect_deprecated_20(
- r"The \"columns\" argument to "
- r"Select.with_only_columns\(\), when referring "
- "to a sequence of items, is now passed"
- ):
- stmt = s1.with_only_columns([s1])
- self.assert_compile(
- stmt,
- "SELECT (SELECT table1.col1, table1.col2, "
- "table1.col3, table1.colx FROM table1) AS anon_1",
- )
-
- def test_from_list_with_columns(self):
- table1 = table("t1", column("a"))
- table2 = table("t2", column("b"))
- s1 = select(table1.c.a, table2.c.b)
- self.assert_compile(s1, "SELECT t1.a, t2.b FROM t1, t2")
-
- with testing.expect_deprecated_20(
- r"The \"columns\" argument to "
- r"Select.with_only_columns\(\), when referring "
- "to a sequence of items, is now passed"
- ):
- s2 = s1.with_only_columns([table2.c.b])
-
- self.assert_compile(s2, "SELECT t2.b FROM t2")
-
def test_column(self):
stmt = select(column("x"))
with testing.expect_deprecated(
"ON basefrom.a = joinfrom.a",
)
- with testing.expect_deprecated(r"The Select.append_column\(\)"):
- replaced.append_column(joinfrom.c.b)
-
- self.assert_compile(
- replaced,
- "SELECT basefrom.a, joinfrom.b FROM (SELECT 1 AS a) AS basefrom "
- "JOIN (SELECT 1 AS a, 2 AS b) AS joinfrom "
- "ON basefrom.a = joinfrom.a",
- )
-
def test_against_cloned_non_table(self):
# test that corresponding column digs across
# clone boundaries with anonymous labeled elements
assert u.corresponding_column(s2.c.table2_coly) is u.c.coly
assert s2.c.corresponding_column(u.c.coly) is s2.c.table2_coly
- def test_join_alias(self):
- j1 = self.table1.join(self.table2)
-
- with testing.expect_deprecated_20(
- r"The Join.alias\(\) method is considered legacy"
- ):
- self.assert_compile(
- j1.alias(),
- "SELECT table1.col1 AS table1_col1, table1.col2 AS "
- "table1_col2, table1.col3 AS table1_col3, table1.colx "
- "AS table1_colx, table2.col1 AS table2_col1, "
- "table2.col2 AS table2_col2, table2.col3 AS table2_col3, "
- "table2.coly AS table2_coly FROM table1 JOIN table2 "
- "ON table1.col1 = table2.col2",
- )
-
- with testing.expect_deprecated_20(
- r"The Join.alias\(\) method is considered legacy"
- ):
- self.assert_compile(
- j1.alias(flat=True),
- "table1 AS table1_1 JOIN table2 AS table2_1 "
- "ON table1_1.col1 = table2_1.col2",
- )
-
def test_join_against_self_implicit_subquery(self):
jj = select(self.table1.c.col1.label("bar_col1"))
with testing.expect_deprecated(
):
assert jjj.corresponding_column(jj.c.bar_col1) is jjj_bar_col1
- # test alias of the join
-
- with testing.expect_deprecated(
- r"The Join.alias\(\) method is considered legacy"
- ):
- j2 = jjj.alias("foo")
- assert (
- j2.corresponding_column(self.table1.c.col1) is j2.c.table1_col1
- )
-
def test_select_labels(self):
a = self.table1.select().set_label_style(
LABEL_STYLE_TABLENAME_PLUS_COL
eq_(t.c.c.type._type_affinity, String)
-class DeprecatedAppendMethTest(fixtures.TestBase, AssertsCompiledSQL):
- __dialect__ = "default"
-
- def _expect_deprecated(self, clsname, methname, newmeth):
- return testing.expect_deprecated(
- r"The %s.append_%s\(\) method is deprecated "
- r"and will be removed in a future release. Use the generative "
- r"method %s.%s\(\)." % (clsname, methname, clsname, newmeth)
- )
-
- def test_append_whereclause(self):
- t = table("t", column("q"))
- stmt = select(t)
-
- with self._expect_deprecated("Select", "whereclause", "where"):
- stmt.append_whereclause(t.c.q == 5)
-
- self.assert_compile(stmt, "SELECT t.q FROM t WHERE t.q = :q_1")
-
- def test_append_having(self):
- t = table("t", column("q"))
- stmt = select(t).group_by(t.c.q)
-
- with self._expect_deprecated("Select", "having", "having"):
- stmt.append_having(t.c.q == 5)
-
- self.assert_compile(
- stmt, "SELECT t.q FROM t GROUP BY t.q HAVING t.q = :q_1"
- )
-
- def test_append_order_by(self):
- t = table("t", column("q"), column("x"))
- stmt = select(t).where(t.c.q == 5)
-
- with self._expect_deprecated(
- "GenerativeSelect", "order_by", "order_by"
- ):
- stmt.append_order_by(t.c.x)
-
- self.assert_compile(
- stmt, "SELECT t.q, t.x FROM t WHERE t.q = :q_1 ORDER BY t.x"
- )
-
- def test_append_group_by(self):
- t = table("t", column("q"))
- stmt = select(t)
-
- with self._expect_deprecated(
- "GenerativeSelect", "group_by", "group_by"
- ):
- stmt.append_group_by(t.c.q)
-
- stmt = stmt.having(t.c.q == 5)
-
- self.assert_compile(
- stmt, "SELECT t.q FROM t GROUP BY t.q HAVING t.q = :q_1"
- )
-
- def test_append_correlation(self):
- t1 = table("t1", column("q"))
- t2 = table("t2", column("q"), column("p"))
-
- inner = select(t2.c.p).where(t2.c.q == t1.c.q)
-
- with self._expect_deprecated("Select", "correlation", "correlate"):
- inner.append_correlation(t1)
- stmt = select(t1).where(t1.c.q == inner.scalar_subquery())
-
- self.assert_compile(
- stmt,
- "SELECT t1.q FROM t1 WHERE t1.q = "
- "(SELECT t2.p FROM t2 WHERE t2.q = t1.q)",
- )
-
- def test_append_column(self):
- t1 = table("t1", column("q"), column("p"))
- stmt = select(t1.c.q)
- with self._expect_deprecated("Select", "column", "add_columns"):
- stmt.append_column(t1.c.p)
- self.assert_compile(stmt, "SELECT t1.q, t1.p FROM t1")
-
- def test_append_prefix(self):
- t1 = table("t1", column("q"), column("p"))
- stmt = select(t1.c.q)
- with self._expect_deprecated("Select", "prefix", "prefix_with"):
- stmt.append_prefix("FOO BAR")
- self.assert_compile(stmt, "SELECT FOO BAR t1.q FROM t1")
-
- def test_append_from(self):
- t1 = table("t1", column("q"))
- t2 = table("t2", column("q"))
-
- stmt = select(t1)
- with self._expect_deprecated("Select", "from", "select_from"):
- stmt.append_from(t1.join(t2, t1.c.q == t2.c.q))
- self.assert_compile(stmt, "SELECT t1.q FROM t1 JOIN t2 ON t1.q = t2.q")
-
-
class KeyTargetingTest(fixtures.TablesTest):
run_inserts = "once"
run_deletes = None
)
-class DMLTest(_UpdateFromTestBase, fixtures.TablesTest, AssertsCompiledSQL):
- __dialect__ = "default"
-
- def test_insert_inline_kw_defaults(self):
- m = MetaData()
- foo = Table("foo", m, Column("id", Integer))
-
- t = Table(
- "test",
- m,
- Column("col1", Integer, default=func.foo(1)),
- Column(
- "col2",
- Integer,
- default=select(func.coalesce(func.max(foo.c.id))),
- ),
- )
-
- with testing.expect_deprecated_20(
- "The insert.inline parameter will be removed in SQLAlchemy 2.0."
- ):
- stmt = t.insert(inline=True, values={})
-
- self.assert_compile(
- stmt,
- "INSERT INTO test (col1, col2) VALUES (foo(:foo_1), "
- "(SELECT coalesce(max(foo.id)) AS coalesce_1 FROM "
- "foo))",
- )
-
- def test_insert_inline_kw_default(self):
- metadata = MetaData()
- table = Table(
- "sometable",
- metadata,
- Column("id", Integer, primary_key=True),
- Column("foo", Integer, default=func.foobar()),
- )
-
- with testing.expect_deprecated_20(
- "The insert.inline parameter will be removed in SQLAlchemy 2.0."
- ):
- stmt = table.insert(values={}, inline=True)
-
- self.assert_compile(
- stmt,
- "INSERT INTO sometable (foo) VALUES (foobar())",
- )
-
- with testing.expect_deprecated_20(
- "The insert.inline parameter will be removed in SQLAlchemy 2.0."
- ):
- stmt = table.insert(inline=True)
-
- self.assert_compile(
- stmt,
- "INSERT INTO sometable (foo) VALUES (foobar())",
- params={},
- )
-
- def test_update_inline_kw_defaults(self):
- m = MetaData()
- foo = Table("foo", m, Column("id", Integer))
-
- t = Table(
- "test",
- m,
- Column("col1", Integer, onupdate=func.foo(1)),
- Column(
- "col2",
- Integer,
- onupdate=select(func.coalesce(func.max(foo.c.id))),
- ),
- Column("col3", String(30)),
- )
-
- with testing.expect_deprecated_20(
- "The update.inline parameter will be removed in SQLAlchemy 2.0."
- ):
- stmt = t.update(inline=True, values={"col3": "foo"})
-
- self.assert_compile(
- stmt,
- "UPDATE test SET col1=foo(:foo_1), col2=(SELECT "
- "coalesce(max(foo.id)) AS coalesce_1 FROM foo), "
- "col3=:col3",
- )
-
- def test_update_dialect_kwargs(self):
- t = table("foo", column("bar"))
-
- with testing.expect_deprecated_20("Passing dialect keyword arguments"):
- stmt = t.update(mysql_limit=10)
-
- self.assert_compile(
- stmt, "UPDATE foo SET bar=%s LIMIT 10", dialect="mysql"
- )
-
- def test_update_whereclause(self):
- table1 = table(
- "mytable",
- Column("myid", Integer),
- Column("name", String(30)),
- )
-
- with testing.expect_deprecated_20(
- "The update.whereclause parameter will be "
- "removed in SQLAlchemy 2.0"
- ):
- self.assert_compile(
- table1.update(table1.c.myid == 7),
- "UPDATE mytable SET myid=:myid, name=:name "
- "WHERE mytable.myid = :myid_1",
- )
-
- def test_update_values(self):
- table1 = table(
- "mytable",
- Column("myid", Integer),
- Column("name", String(30)),
- )
-
- with testing.expect_deprecated_20(
- "The update.values parameter will be removed in SQLAlchemy 2.0"
- ):
- self.assert_compile(
- table1.update(values={table1.c.myid: 7}),
- "UPDATE mytable SET myid=:myid",
- )
-
- def test_delete_whereclause(self):
- table1 = table(
- "mytable",
- Column("myid", Integer),
- )
-
- with testing.expect_deprecated_20(
- "The delete.whereclause parameter will be "
- "removed in SQLAlchemy 2.0"
- ):
- self.assert_compile(
- table1.delete(table1.c.myid == 7),
- "DELETE FROM mytable WHERE mytable.myid = :myid_1",
- )
-
- def test_update_ordered_parameters_fire_onupdate(self):
- table = self.tables.update_w_default
-
- values = [(table.c.y, table.c.x + 5), ("x", 10)]
-
- with testing.expect_deprecated_20(
- "The update.preserve_parameter_order parameter will be "
- "removed in SQLAlchemy 2.0."
- ):
- self.assert_compile(
- table.update(preserve_parameter_order=True).values(values),
- "UPDATE update_w_default "
- "SET ycol=(update_w_default.x + :x_1), "
- "x=:x, data=:data",
- )
-
- def test_update_ordered_parameters_override_onupdate(self):
- table = self.tables.update_w_default
-
- values = [
- (table.c.y, table.c.x + 5),
- (table.c.data, table.c.x + 10),
- ("x", 10),
- ]
-
- with testing.expect_deprecated_20(
- "The update.preserve_parameter_order parameter will be "
- "removed in SQLAlchemy 2.0."
- ):
- self.assert_compile(
- table.update(preserve_parameter_order=True).values(values),
- "UPDATE update_w_default "
- "SET ycol=(update_w_default.x + :x_1), "
- "data=(update_w_default.x + :x_2), x=:x",
- )
-
- def test_update_ordered_parameters_oldstyle_1(self):
- table1 = self.tables.mytable
-
- # Confirm that we can pass values as list value pairs
- # note these are ordered *differently* from table.c
- values = [
- (table1.c.name, table1.c.name + "lala"),
- (table1.c.myid, func.do_stuff(table1.c.myid, literal("hoho"))),
- ]
-
- with testing.expect_deprecated_20(
- "The update.preserve_parameter_order parameter will be "
- "removed in SQLAlchemy 2.0.",
- "The update.whereclause parameter will be "
- "removed in SQLAlchemy 2.0",
- "The update.values parameter will be removed in SQLAlchemy 2.0",
- ):
- self.assert_compile(
- update(
- table1,
- (table1.c.myid == func.hoho(4))
- & (
- table1.c.name
- == literal("foo") + table1.c.name + literal("lala")
- ),
- preserve_parameter_order=True,
- values=values,
- ),
- "UPDATE mytable "
- "SET "
- "name=(mytable.name || :name_1), "
- "myid=do_stuff(mytable.myid, :param_1) "
- "WHERE "
- "mytable.myid = hoho(:hoho_1) AND "
- "mytable.name = :param_2 || mytable.name || :param_3",
- )
-
- def test_update_ordered_parameters_oldstyle_2(self):
- table1 = self.tables.mytable
-
- # Confirm that we can pass values as list value pairs
- # note these are ordered *differently* from table.c
- values = [
- (table1.c.name, table1.c.name + "lala"),
- ("description", "some desc"),
- (table1.c.myid, func.do_stuff(table1.c.myid, literal("hoho"))),
- ]
-
- with testing.expect_deprecated_20(
- "The update.preserve_parameter_order parameter will be "
- "removed in SQLAlchemy 2.0.",
- "The update.whereclause parameter will be "
- "removed in SQLAlchemy 2.0",
- ):
- self.assert_compile(
- update(
- table1,
- (table1.c.myid == func.hoho(4))
- & (
- table1.c.name
- == literal("foo") + table1.c.name + literal("lala")
- ),
- preserve_parameter_order=True,
- ).values(values),
- "UPDATE mytable "
- "SET "
- "name=(mytable.name || :name_1), "
- "description=:description, "
- "myid=do_stuff(mytable.myid, :param_1) "
- "WHERE "
- "mytable.myid = hoho(:hoho_1) AND "
- "mytable.name = :param_2 || mytable.name || :param_3",
- )
-
- def test_update_preserve_order_reqs_listtups(self):
- table1 = self.tables.mytable
-
- with testing.expect_deprecated_20(
- "The update.preserve_parameter_order parameter will be "
- "removed in SQLAlchemy 2.0."
- ):
- testing.assert_raises_message(
- ValueError,
- r"When preserve_parameter_order is True, values\(\) "
- r"only accepts a list of 2-tuples",
- table1.update(preserve_parameter_order=True).values,
- {"description": "foo", "name": "bar"},
- )
-
- @testing.fixture
- def randomized_param_order_update(self):
- from sqlalchemy.sql.dml import UpdateDMLState
-
- super_process_ordered_values = UpdateDMLState._process_ordered_values
-
- # this fixture is needed for Python 3.6 and above to work around
- # dictionaries being insert-ordered. in python 2.7 the previous
- # logic fails pretty easily without this fixture.
- def _process_ordered_values(self, statement):
- super_process_ordered_values(self, statement)
-
- tuples = list(self._dict_parameters.items())
- random.shuffle(tuples)
- self._dict_parameters = dict(tuples)
-
- dialect = default.StrCompileDialect()
- dialect.paramstyle = "qmark"
- dialect.positional = True
-
- with mock.patch.object(
- UpdateDMLState, "_process_ordered_values", _process_ordered_values
- ):
- yield
-
- def random_update_order_parameters():
- from sqlalchemy import ARRAY
-
- t = table(
- "foo",
- column("data1", ARRAY(Integer)),
- column("data2", ARRAY(Integer)),
- column("data3", ARRAY(Integer)),
- column("data4", ARRAY(Integer)),
- )
-
- idx_to_value = [
- (t.c.data1, 5, 7),
- (t.c.data2, 10, 18),
- (t.c.data3, 8, 4),
- (t.c.data4, 12, 14),
- ]
-
- def combinations():
- while True:
- random.shuffle(idx_to_value)
- yield list(idx_to_value)
-
- return testing.combinations(
- *[
- (t, combination)
- for i, combination in zip(range(10), combinations())
- ],
- argnames="t, idx_to_value",
- )
-
- @random_update_order_parameters()
- def test_update_to_expression_ppo(
- self, randomized_param_order_update, t, idx_to_value
- ):
- dialect = default.StrCompileDialect()
- dialect.paramstyle = "qmark"
- dialect.positional = True
-
- with testing.expect_deprecated_20(
- "The update.preserve_parameter_order parameter will be "
- "removed in SQLAlchemy 2.0."
- ):
- stmt = t.update(preserve_parameter_order=True).values(
- [(col[idx], val) for col, idx, val in idx_to_value]
- )
-
- self.assert_compile(
- stmt,
- "UPDATE foo SET %s"
- % (
- ", ".join(
- "%s[?]=?" % col.key for col, idx, val in idx_to_value
- )
- ),
- dialect=dialect,
- checkpositional=tuple(
- itertools.chain.from_iterable(
- (idx, val) for col, idx, val in idx_to_value
- )
- ),
- )
-
-
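The `DMLTest` removals above cover the deprecated constructor-argument forms of `update()` / `delete()` (`whereclause`, `values`, `inline`, `preserve_parameter_order`). A minimal sketch of the generative replacements, assuming the 1.4/2.0-style API (`Update.where()`, `Update.values()`, and `Update.ordered_values()` in place of `preserve_parameter_order`):

```python
from sqlalchemy import Column, Integer, MetaData, String, Table, update

metadata = MetaData()
mytable = Table(
    "mytable",
    metadata,
    Column("myid", Integer),
    Column("name", String(30)),
)

# deprecated 1.x form: mytable.update(mytable.c.myid == 7, values={...})
# generative 2.0 form:
stmt = update(mytable).where(mytable.c.myid == 7).values(name="foo")

# deprecated: update(..., preserve_parameter_order=True).values([(col, val), ...])
# replacement: ordered_values() accepts (column, value) 2-tuples directly
stmt2 = update(mytable).ordered_values(
    (mytable.c.name, "n"),
    (mytable.c.myid, 5),
)
```

`ordered_values()` renders the SET clause in the order the tuples are given, which is what `preserve_parameter_order=True` previously provided.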
class TableDeprecationTest(fixtures.TestBase):
def test_mustexists(self):
with testing.expect_deprecated("Deprecated alias of .*must_exist"):
):
expect(roles.ExpressionElementRole, Thing())
- def test_statement_text_coercion(self):
- with testing.expect_deprecated_20(
- "Using plain strings to indicate SQL statements"
+ def test_no_statement_text_coercion(self):
+ with testing.expect_raises_message(
+ exc.ArgumentError,
+ r"Textual SQL expression 'select \* from table' should be "
+ "explicitly declared",
):
- is_true(
- expect(roles.StatementRole, "select * from table").compare(
- text("select * from table")
- )
- )
+ expect(roles.StatementRole, "select * from table")
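As the new `test_no_statement_text_coercion` asserts, a plain string is now rejected rather than silently coerced; the textual statement must be declared explicitly. A short sketch of the required form:

```python
from sqlalchemy import text

# plain strings are no longer coerced to SQL statements;
# textual SQL must be wrapped in text() explicitly
stmt = text("select * from table")
```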
def test_select_statement_no_text_coercion(self):
assert_raises_message(
"table1.col3, table1.colx FROM table1) AS anon_1",
)
+ def test_with_only_generative_no_list(self):
+ s1 = table1.select().scalar_subquery()
+
+ with testing.expect_raises_message(
+ exc.ArgumentError,
+ r"The \"columns\" argument to "
+ r"Select.with_only_columns\(\), when referring "
+ "to a sequence of items, is now passed",
+ ):
+ s1.with_only_columns([s1])
+
@testing.combinations(
(
[table1.c.col1],
from sqlalchemy.testing import AssertsExecutionResults
from sqlalchemy.testing import engines
from sqlalchemy.testing import eq_
-from sqlalchemy.testing import expect_deprecated_20
from sqlalchemy.testing import expect_raises
from sqlalchemy.testing import fixtures
from sqlalchemy.testing import is_
[(1, self.SomeEnum.three), (2, self.SomeEnum.three)],
)
- def test_omit_warn(self):
- with expect_deprecated_20(
- r"The provided enum someenum contains the aliases \['four'\]"
- ):
- Enum(self.SomeEnum)
-
@testing.combinations(
(True, "native"), (False, "non_native"), id_="ai", argnames="native"
)