From: Mike Bayer Date: Mon, 3 Jan 2022 18:49:26 +0000 (-0500) Subject: Remove all remaining removed_in_20 warnings slated for removal X-Git-Tag: rel_2_0_0b1~554^2 X-Git-Url: http://git.ipfire.org/cgi-bin/gitweb.cgi?a=commitdiff_plain;h=01c50c64e302c193733cef7fb146fbab8eaa44bd;p=thirdparty%2Fsqlalchemy%2Fsqlalchemy.git Remove all remaining removed_in_20 warnings slated for removal Finalize all remaining removed-in-2.0 changes so that we can begin doing pep-484 typing without old things getting in the way (we will also have to do public_factory). Note there are a few "moved_in_20()" and "became_legacy_in_20()" warnings still in place. The SQLALCHEMY_WARN_20 variable is now removed. Also removed here are the legacy "in place mutators" for Select statements, and some keyword-only argument signatures in Core have been added. Also in the big change department, the ORM mapper() function is removed entirely; the Mapper class is otherwise unchanged, just the public-facing API function. Mappers are now always given a registry in which to participate; however, the argument signature of Mapper is not changed. Ideally, "registry" would be the first positional argument. Fixes: #7257 Change-Id: Ic70c57b9f1cf7eb996338af5183b11bdeb3e1623 --- diff --git a/doc/build/changelog/changelog_08.rst b/doc/build/changelog/changelog_08.rst index 4b6b42ec73..f650061003 100644 --- a/doc/build/changelog/changelog_08.rst +++ b/doc/build/changelog/changelog_08.rst @@ -1896,7 +1896,7 @@ de-associated from any of its orphan-enabled parents. Previously, the pending object would be expunged only if de-associated from all of its orphan-enabled parents. The new flag ``legacy_is_orphan`` - is added to :func:`_orm.mapper` which re-establishes the + is added to :class:`_orm.Mapper` which re-establishes the legacy behavior. 
See the change note and example case at :ref:`legacy_is_orphan_addition` diff --git a/doc/build/changelog/changelog_10.rst b/doc/build/changelog/changelog_10.rst index 4d3b84d3b4..95a5da35b9 100644 --- a/doc/build/changelog/changelog_10.rst +++ b/doc/build/changelog/changelog_10.rst @@ -1606,7 +1606,7 @@ the full "instrumentation manager" for a class before it was mapped for the purpose of the new ``@declared_attr`` features described in :ref:`feature_3150`, but the change was also made - against the classical use of :func:`.mapper` for consistency. + against the classical use of :class:`_orm.Mapper` for consistency. However, SQLSoup relies upon the instrumentation event happening before any instrumentation under classical mapping. The behavior is reverted in the case of classical and declarative diff --git a/doc/build/changelog/changelog_14.rst b/doc/build/changelog/changelog_14.rst index 3c8678382e..635f996383 100644 --- a/doc/build/changelog/changelog_14.rst +++ b/doc/build/changelog/changelog_14.rst @@ -4796,7 +4796,7 @@ This document details individual issue-level changes made throughout Fixed ORM unit of work regression where an errant "assert primary_key" statement interferes with primary key generation sequences that don't actually consider the columns in the table to use a real primary key - constraint, instead using :paramref:`_orm.mapper.primary_key` to establish + constraint, instead using :paramref:`_orm.Mapper.primary_key` to establish certain columns as "primary". .. 
change:: diff --git a/doc/build/changelog/migration_13.rst b/doc/build/changelog/migration_13.rst index d7a26084e3..cdef36e12f 100644 --- a/doc/build/changelog/migration_13.rst +++ b/doc/build/changelog/migration_13.rst @@ -84,11 +84,11 @@ New Features and Improvements - ORM Relationship to AliasedClass replaces the need for non primary mappers ----------------------------------------------------------------------- -The "non primary mapper" is a :func:`.mapper` created in the +The "non primary mapper" is a :class:`_orm.Mapper` created in the :ref:`classical_mapping` style, which acts as an additional mapper against an already mapped class against a different kind of selectable. The non primary mapper has its roots in the 0.1, 0.2 series of SQLAlchemy where it was -anticipated that the :func:`.mapper` object was to be the primary query +anticipated that the :class:`_orm.Mapper` object was to be the primary query construction interface, before the :class:`_query.Query` object existed. With the advent of :class:`_query.Query` and later the :class:`.AliasedClass` diff --git a/doc/build/changelog/migration_14.rst b/doc/build/changelog/migration_14.rst index f1e56b391e..7001acb055 100644 --- a/doc/build/changelog/migration_14.rst +++ b/doc/build/changelog/migration_14.rst @@ -292,11 +292,11 @@ the :class:`_orm.registry` object, and fall into these categories: * Using :meth:`_orm.registry.map_imperatively` * :ref:`orm_imperative_dataclasses` -The existing classical mapping function :func:`_orm.mapper` remains, however -it is deprecated to call upon :func:`_orm.mapper` directly; the new -:meth:`_orm.registry.map_imperatively` method now routes the request through -the :meth:`_orm.registry` so that it integrates with other declarative mappings -unambiguously. 
+The existing classical mapping function ``sqlalchemy.orm.mapper()`` remains, +however it is deprecated to call upon ``sqlalchemy.orm.mapper()`` directly; the +new :meth:`_orm.registry.map_imperatively` method now routes the request +through the :meth:`_orm.registry` so that it integrates with other declarative +mappings unambiguously. The new approach interoperates with 3rd party class instrumentation systems which necessarily must take place on the class before the mapping process diff --git a/doc/build/changelog/migration_20.rst b/doc/build/changelog/migration_20.rst index 4c2ef22bdd..afab782770 100644 --- a/doc/build/changelog/migration_20.rst +++ b/doc/build/changelog/migration_20.rst @@ -1381,8 +1381,8 @@ The original "mapper()" function now a core element of Declarative, renamed **Synopsis** -The :func:`_orm.mapper` function moves behind the scenes to be invoked -by higher level APIs. The new version of this function is the method +The ``sqlalchemy.orm.mapper()`` standalone function moves behind the scenes to +be invoked by higher level APIs. The new version of this function is the method :meth:`_orm.registry.map_imperatively` taken from a :class:`_orm.registry` object. 
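The registry-routed replacement for the standalone ``mapper()`` function described in the hunks above can be sketched as follows; table and class names here are illustrative, not taken from the patch:

```python
from sqlalchemy import Column, Integer, String, Table
from sqlalchemy.orm import registry

mapper_registry = registry()

# Table metadata is defined separately, classical-mapping style.
user_table = Table(
    "user",
    mapper_registry.metadata,
    Column("id", Integer, primary_key=True),
    Column("name", String(50)),
)


class User:
    """A plain class with no declarative attributes."""


# 2.0 style: route the classical mapping through the registry
# instead of calling the removed sqlalchemy.orm.mapper() directly.
mapper_registry.map_imperatively(User, user_table)
```

Because the mapping now participates in a registry, string-based resolution of other mapped classes (e.g. inside ``relationship()`` arguments passed via ``properties``) behaves the same as it does under Declarative.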
diff --git a/doc/build/changelog/unreleased_20/7257.rst b/doc/build/changelog/unreleased_20/7257.rst index 6a3e4ffda4..079b0554e9 100644 --- a/doc/build/changelog/unreleased_20/7257.rst +++ b/doc/build/changelog/unreleased_20/7257.rst @@ -3,7 +3,8 @@ :tickets: 7257 Migrated the codebase to remove all pre-2.0 behaviors and architectures - that were previously noted as deprecated for removal in 2.0, including: + that were previously noted as deprecated for removal in 2.0, including, + but not limited to: * removal of all Python 2 code, minimum version is now Python 3.7 @@ -26,6 +27,10 @@ :class:`.Table`, and from all DDL/DML/DQL elements that previously could refer to a "bound engine" + * The standalone ``sqlalchemy.orm.mapper()`` function is removed; all + classical mapping should be done through the + :meth:`_orm.registry.map_imperatively` method of :class:`_orm.registry`. + * The :meth:`_orm.Query.join` method no longer accepts strings for relationship names; the long-documented approach of using ``Class.attrname`` for join targets is now standard. @@ -39,6 +44,14 @@ * ``Query.from_self()``, ``Query.select_entity_from()`` and ``Query.with_polymorphic()`` are removed. + * The :paramref:`_orm.relationship.cascade_backrefs` parameter must now + remain at its new default of ``False``; the ``save-update`` cascade + no longer cascades along a backref. + + * the :paramref:`_orm.Session.future` parameter must always be set to + ``True``. 2.0-style transactional patterns for :class:`_orm.Session` + are now always in effect. + * Loader options no longer accept strings for attribute names. The long-documented approach of using ``Class.attrname`` for loader option targets is now standard. @@ -47,4 +60,6 @@ ``select([cols])``, the "whereclause" and keyword parameters of ``some_table.select()``. 
- * More are in progress as development continues + * Legacy "in-place mutator" methods on :class:`_sql.Select` such as + ``append_whereclause()``, ``append_order_by()`` etc are removed. + diff --git a/doc/build/errors.rst b/doc/build/errors.rst index 4845963b02..e762636927 100644 --- a/doc/build/errors.rst +++ b/doc/build/errors.rst @@ -211,38 +211,6 @@ each, see the section :ref:`faq_new_caching`. :ref:`faq_new_caching` - in the :ref:`faq_toplevel` section -.. _error_s9r1: - -Object is being merged into a Session along the backref cascade ---------------------------------------------------------------- - -This message refers to the "backref cascade" behavior of SQLAlchemy, -which is described at :ref:`backref_cascade`. This refers to the action of -an object being added into a :class:`_orm.Session` as a result of another -object that's already present in that session being associated with it. -As this behavior has been shown to be more confusing than helpful, -the :paramref:`_orm.relationship.cascade_backrefs` and -:paramref:`_orm.backref.cascade_backrefs` parameters were added, which can -be set to ``False`` to disable it, and in SQLAlchemy 2.0 the "cascade backrefs" -behavior will be disabled completely. - -To set :paramref:`_orm.relationship.cascade_backrefs` to ``False`` on a -backref that is currently configured using the -:paramref:`_orm.relationship.backref` string parameter, the backref must -be declared using the :func:`_orm.backref` function first so that the -:paramref:`_orm.backref.cascade_backrefs` parameter may be passed. - -Alternatively, the entire "cascade backrefs" behavior can be turned off -across the board by using the :class:`_orm.Session` in "future" mode, -by passing ``True`` for the :paramref:`_orm.Session.future` parameter. - -.. seealso:: - - :ref:`backref_cascade` - complete description of the cascade backrefs - behavior - - :ref:`change_5150` - background on the change for SQLAlchemy 2.0. - .. 
_error_xaj1: An alias is being generated automatically for raw clauseelement @@ -1611,3 +1579,33 @@ In SQLAlchemy 1.4, this :term:`2.0 style` behavior is enabled when the :paramref:`_orm.Session.future` flag is set on :class:`_orm.sessionmaker` or :class:`_orm.Session`. +.. _error_s9r1: + +Object is being merged into a Session along the backref cascade +--------------------------------------------------------------- + +This message refers to the "backref cascade" behavior of SQLAlchemy, +removed in version 2.0. This refers to the action of +an object being added into a :class:`_orm.Session` as a result of another +object that's already present in that session being associated with it. +As this behavior has been shown to be more confusing than helpful, +the :paramref:`_orm.relationship.cascade_backrefs` and +:paramref:`_orm.backref.cascade_backrefs` parameters were added, which can +be set to ``False`` to disable it, and in SQLAlchemy 2.0 the "cascade backrefs" +behavior has been removed entirely. + +For older SQLAlchemy versions, to set +:paramref:`_orm.relationship.cascade_backrefs` to ``False`` on a backref that +is currently configured using the :paramref:`_orm.relationship.backref` string +parameter, the backref must be declared using the :func:`_orm.backref` function +first so that the :paramref:`_orm.backref.cascade_backrefs` parameter may be +passed. + +Alternatively, the entire "cascade backrefs" behavior can be turned off +across the board by using the :class:`_orm.Session` in "future" mode, +by passing ``True`` for the :paramref:`_orm.Session.future` parameter. + +.. seealso:: + + :ref:`change_5150` - background on the change for SQLAlchemy 2.0. 
+ diff --git a/doc/build/faq/sessions.rst b/doc/build/faq/sessions.rst index 0c03080f47..6027ab3714 100644 --- a/doc/build/faq/sessions.rst +++ b/doc/build/faq/sessions.rst @@ -305,9 +305,9 @@ I've created a mapping against an Outer Join, and while the query returns rows, Rows returned by an outer join may contain NULL for part of the primary key, as the primary key is the composite of both tables. The :class:`_query.Query` object ignores incoming rows that don't have an acceptable primary key. Based on the setting of the ``allow_partial_pks`` -flag on :func:`.mapper`, a primary key is accepted if the value has at least one non-NULL +flag on :class:`_orm.Mapper`, a primary key is accepted if the value has at least one non-NULL value, or alternatively if the value has no NULL values. See ``allow_partial_pks`` -at :func:`.mapper`. +at :class:`_orm.Mapper`. I'm using ``joinedload()`` or ``lazy=False`` to create a JOIN/OUTER JOIN and SQLAlchemy is not constructing the correct query when I try to add a WHERE, ORDER BY, LIMIT, etc. (which relies upon the (OUTER) JOIN) diff --git a/doc/build/glossary.rst b/doc/build/glossary.rst index 08a503ecb1..5b84e0a7b3 100644 --- a/doc/build/glossary.rst +++ b/doc/build/glossary.rst @@ -427,11 +427,11 @@ Glossary mapping mapped mapped class - We say a class is "mapped" when it has been passed through the - :func:`_orm.mapper` function. This process associates the - class with a database table or other :term:`selectable` - construct, so that instances of it can be persisted - and loaded using a :class:`.Session`. + We say a class is "mapped" when it has been associated with an + instance of the :class:`_orm.Mapper` class. This process associates + the class with a database table or other :term:`selectable` construct, + so that instances of it can be persisted and loaded using a + :class:`.Session`. .. 
seealso:: diff --git a/doc/build/orm/backref.rst b/doc/build/orm/backref.rst index 65d19eb185..2e1a1920cf 100644 --- a/doc/build/orm/backref.rst +++ b/doc/build/orm/backref.rst @@ -186,55 +186,6 @@ returned ``Address``. The :func:`.backref` function formatted the arguments we it into a form that is interpreted by the receiving :func:`_orm.relationship` as additional arguments to be applied to the new relationship it creates. -Setting cascade for backrefs -~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - -A key behavior that occurs in the 1.x series of SQLAlchemy regarding backrefs -is that :ref:`cascades ` will occur bidirectionally by -default. This basically means, if one starts with an ``User`` object -that's been persisted in the :class:`.Session`:: - - user = session.query(User).filter(User.id == 1).first() - -The above ``User`` is :term:`persistent` in the :class:`.Session`. It usually -is intuitive that if we create an ``Address`` object and append to the -``User.addresses`` collection, it is automatically added to the -:class:`.Session` as in the example below:: - - user = session.query(User).filter(User.id == 1).first() - address = Address(email_address='foo') - user.addresses.append(address) - -The above behavior is known as the "save update cascade" and is described -in the section :ref:`unitofwork_cascades`. - -However, if we instead created a new ``Address`` object, and associated the -``User`` object with the ``Address`` as follows:: - - address = Address(email_address='foo', user=user) - -In the above example, it is **not** as intuitive that the ``Address`` would -automatically be added to the :class:`.Session`. However, the backref behavior -of ``Address.user`` indicates that the ``Address`` object is also appended to -the ``User.addresses`` collection. This in turn initiates a **cascade** -operation which indicates that this ``Address`` should be placed into the -:class:`.Session` as a :term:`pending` object. 
- -Since this behavior has been identified as counter-intuitive to most people, -it can be disabled by setting :paramref:`_orm.relationship.cascade_backrefs` -to False, as in:: - - - class User(Base): - # ... - - addresses = relationship("Address", back_populates="user", cascade_backrefs=False) - -See the example in :ref:`backref_cascade` for further information. - -.. seealso:: - - :ref:`backref_cascade`. One Way Backrefs diff --git a/doc/build/orm/cascades.rst b/doc/build/orm/cascades.rst index 1a2a7804c2..eab49f40ca 100644 --- a/doc/build/orm/cascades.rst +++ b/doc/build/orm/cascades.rst @@ -122,13 +122,6 @@ for granted; it simplifies code by allowing a single call to that :class:`.Session` at once. While it can be disabled, there is usually not a need to do so. -One case where ``save-update`` cascade does sometimes get in the way is in that -it takes place in both directions for bi-directional relationships, e.g. -backrefs, meaning that the association of a child object with a particular parent -can have the effect of the parent object being implicitly associated with that -child object's :class:`.Session`; this pattern, as well as how to modify its -behavior using the :paramref:`_orm.relationship.cascade_backrefs` flag, -is discussed in the section :ref:`backref_cascade`. .. _cascade_delete: @@ -560,69 +553,6 @@ expunge from the :class:`.Session` using :meth:`.Session.expunge`, the operation should be propagated down to referred objects. -.. _backref_cascade: - -Controlling Cascade on Backrefs -------------------------------- - -.. note:: This section applies to a behavior that is removed in SQLAlchemy 2.0. - By setting the :paramref:`_orm.Session.future` flag on a given - :class:`_orm.Session`, the 2.0 behavior will be achieved which is - essentially that the :paramref:`_orm.relationship.cascade_backrefs` flag is - ignored. See the section :ref:`change_5150` for notes. 
- -In :term:`1.x style` ORM usage, the :ref:`cascade_save_update` cascade by -default takes place on attribute change events emitted from backrefs. This is -probably a confusing statement more easily described through demonstration; it -means that, given a mapping such as this:: - - mapper_registry.map_imperatively(Order, order_table, properties={ - 'items' : relationship(Item, backref='order') - }) - -If an ``Order`` is already in the session, and is assigned to the ``order`` -attribute of an ``Item``, the backref appends the ``Item`` to the ``items`` -collection of that ``Order``, resulting in the ``save-update`` cascade taking -place:: - - >>> o1 = Order() - >>> session.add(o1) - >>> o1 in session - True - - >>> i1 = Item() - >>> i1.order = o1 - >>> i1 in o1.items - True - >>> i1 in session - True - -This behavior can be disabled using the :paramref:`_orm.relationship.cascade_backrefs` flag:: - - mapper_registry.map_imperatively(Order, order_table, properties={ - 'items' : relationship(Item, backref='order', cascade_backrefs=False) - }) - -So above, the assignment of ``i1.order = o1`` will append ``i1`` to the ``items`` -collection of ``o1``, but will not add ``i1`` to the session. You can, of -course, :meth:`~.Session.add` ``i1`` to the session at a later point. This -option may be helpful for situations where an object needs to be kept out of a -session until it's construction is completed, but still needs to be given -associations to objects which are already persistent in the target session. - -When relationships are created by the :paramref:`_orm.relationship.backref` -parameter on :func:`_orm.relationship`, the :paramref:`_orm.cascade_backrefs` -parameter may be set to ``False`` on the backref side by using the -:func:`_orm.backref` function instead of a string. 
For example, the above relationship -could be declared:: - - mapper_registry.map_imperatively(Order, order_table, properties={ - 'items' : relationship( - Item, backref=backref('order', cascade_backrefs=False), cascade_backrefs=False - ) - }) - -This sets the ``cascade_backrefs=False`` behavior on both relationships. .. _session_deleting_from_collections: diff --git a/doc/build/orm/declarative_config.rst b/doc/build/orm/declarative_config.rst index 4a091c8483..d19e9ffc0c 100644 --- a/doc/build/orm/declarative_config.rst +++ b/doc/build/orm/declarative_config.rst @@ -152,19 +152,19 @@ Mapper Configuration Options with Declarative With all mapping forms, the mapping of the class is configured through parameters that become part of the :class:`_orm.Mapper` object. The function which ultimately receives these arguments is the -:func:`_orm.mapper` function, and are delivered to it from one of +:class:`_orm.Mapper` function, and are delivered to it from one of the front-facing mapping functions defined on the :class:`_orm.registry` object. For the declarative form of mapping, mapper arguments are specified using the ``__mapper_args__`` declarative class variable, which is a dictionary -that is passed as keyword arguments to the :func:`_orm.mapper` function. +that is passed as keyword arguments to the :class:`_orm.Mapper` function. 
Some examples: **Version ID Column** -The :paramref:`_orm.mapper.version_id_col` and -:paramref:`_orm.mapper.version_id_generator` parameters:: +The :paramref:`_orm.Mapper.version_id_col` and +:paramref:`_orm.Mapper.version_id_generator` parameters:: from datetime import datetime @@ -181,8 +181,8 @@ The :paramref:`_orm.mapper.version_id_col` and **Single Table Inheritance** -The :paramref:`_orm.mapper.polymorphic_on` and -:paramref:`_orm.mapper.polymorphic_identity` parameters:: +The :paramref:`_orm.Mapper.polymorphic_on` and +:paramref:`_orm.Mapper.polymorphic_identity` parameters:: class Person(Base): __tablename__ = 'person' diff --git a/doc/build/orm/extensions/associationproxy.rst b/doc/build/orm/extensions/associationproxy.rst index de2001e6f5..aef046b049 100644 --- a/doc/build/orm/extensions/associationproxy.rst +++ b/doc/build/orm/extensions/associationproxy.rst @@ -98,7 +98,7 @@ for us transparently:: The :class:`.AssociationProxy` object produced by the :func:`.association_proxy` function is an instance of a `Python descriptor `_. It is always declared with the user-defined class being mapped, regardless of -whether Declarative or classical mappings via the :func:`.mapper` function are used. +whether Declarative or classical mappings via the :class:`_orm.Mapper` function are used. The proxy functions by operating upon the underlying mapped attribute or collection in response to operations, and changes made via the proxy are immediately diff --git a/doc/build/orm/extensions/asyncio.rst b/doc/build/orm/extensions/asyncio.rst index a7d2fb16be..23eed9e769 100644 --- a/doc/build/orm/extensions/asyncio.rst +++ b/doc/build/orm/extensions/asyncio.rst @@ -295,7 +295,7 @@ prevent this: :paramref:`_schema.Column.default` parameter is assigned to a SQL expression object. 
To access this value with asyncio, it has to be refreshed within the flush process, which is achieved by setting the - :paramref:`_orm.mapper.eager_defaults` parameter on the mapping:: + :paramref:`_orm.Mapper.eager_defaults` parameter on the mapping:: class A(Base): diff --git a/doc/build/orm/extensions/mypy.rst b/doc/build/orm/extensions/mypy.rst index b710d1f443..fc26549969 100644 --- a/doc/build/orm/extensions/mypy.rst +++ b/doc/build/orm/extensions/mypy.rst @@ -17,12 +17,14 @@ Support for :pep:`484` typing annotations as well as the Installation ------------ -The Mypy plugin depends upon new stubs for SQLAlchemy packaged at -`sqlalchemy2-stubs `_. These -stubs necessarily fully replace the previous ``sqlalchemy-stubs`` typing -annotations published by Dropbox, as they occupy the same ``sqlalchemy-stubs`` -namespace as specified by :pep:`561`. The `Mypy `_ -package itself is also a dependency. +TODO: document uninstallation of existing stubs: + +* ``sqlalchemy2-stubs`` +* ``sqlalchemy-stubs`` + +SQLAlchemy 2.0 is expected to be directly typed. + +The `Mypy `_ package itself is a dependency. Both packages may be installed using the "mypy" extras hook using pip:: diff --git a/doc/build/orm/mapping_api.rst b/doc/build/orm/mapping_api.rst index 5d0b6c0d02..eeba54040c 100644 --- a/doc/build/orm/mapping_api.rst +++ b/doc/build/orm/mapping_api.rst @@ -20,8 +20,6 @@ Class Mapping API .. autofunction:: synonym_for -.. autofunction:: mapper - .. autofunction:: object_mapper .. 
autofunction:: class_mapper diff --git a/doc/build/orm/mapping_columns.rst b/doc/build/orm/mapping_columns.rst index 596c64f7c5..66fb22e01b 100644 --- a/doc/build/orm/mapping_columns.rst +++ b/doc/build/orm/mapping_columns.rst @@ -5,7 +5,7 @@ Mapping Table Columns ===================== -The default behavior of :func:`_orm.mapper` is to assemble all the columns in +The default behavior of :class:`_orm.Mapper` is to assemble all the columns in the mapped :class:`_schema.Table` into mapped object attributes, each of which are named according to the name of the column itself (specifically, the ``key`` attribute of :class:`_schema.Column`). This behavior can be @@ -43,7 +43,7 @@ can be referenced directly:: name = user_table.c.user_name The corresponding technique for an :term:`imperative` mapping is -to place the desired key in the :paramref:`_orm.mapper.properties` +to place the desired key in the :paramref:`_orm.Mapper.properties` dictionary with the desired key:: mapper_registry.map_imperatively(User, user_table, properties={ @@ -118,8 +118,8 @@ Using column_property for column level options Options can be specified when mapping a :class:`_schema.Column` using the :func:`.column_property` function. This function explicitly creates the :class:`.ColumnProperty` used by the -:func:`.mapper` to keep track of the :class:`_schema.Column`; normally, the -:func:`.mapper` creates this automatically. Using :func:`.column_property`, +:class:`_orm.Mapper` to keep track of the :class:`_schema.Column`; normally, the +:class:`_orm.Mapper` creates this automatically. Using :func:`.column_property`, we can pass additional arguments about how we'd like the :class:`_schema.Column` to be mapped. 
Below, we pass an option ``active_history``, which specifies that a change to this column's value should diff --git a/doc/build/orm/mapping_styles.rst b/doc/build/orm/mapping_styles.rst index 723b540684..f273196b4a 100644 --- a/doc/build/orm/mapping_styles.rst +++ b/doc/build/orm/mapping_styles.rst @@ -107,11 +107,11 @@ Documentation for Declarative mapping continues at :ref:`declarative_config_topl Creating an Explicit Base Non-Dynamically (for use with mypy, similar) ---------------------------------------------------------------------- +TODO: update for 2.0 - code here may not be accurate + SQLAlchemy includes a :ref:`Mypy plugin ` that automatically accommodates for the dynamically generated ``Base`` class delivered by SQLAlchemy functions like :func:`_orm.declarative_base`. -This plugin works along with a new set of typing stubs published at -`sqlalchemy2-stubs `_. When this plugin is not in use, or when using other :pep:`484` tools which may not know how to interpret this class, the declarative base class may @@ -126,8 +126,6 @@ be produced in a fully explicit fashion using the class Base(metaclass=DeclarativeMeta): __abstract__ = True - # these are supplied by the sqlalchemy2-stubs, so may be omitted - # when they are installed registry = mapper_registry metadata = mapper_registry.metadata @@ -339,9 +337,10 @@ during flush from autoincrement or other default value generator. To allow them to be specified in the constructor explicitly, they would instead be given a default value of ``None``. -For a :func:`_orm.relationship` to be declared separately, it needs to -be specified directly within the :paramref:`_orm.mapper.properties` -dictionary passed to the :func:`_orm.mapper`. 
An alternative to this +For a :func:`_orm.relationship` to be declared separately, it needs to be +specified directly within the :paramref:`_orm.Mapper.properties` dictionary +which itself is specified within the ``__mapper_args__`` dictionary, so that it +is passed to the constructor for :class:`_orm.Mapper`. An alternative to this approach is in the next example. .. _orm_declarative_dataclasses_declarative_table: @@ -527,16 +526,10 @@ Imperative (a.k.a. Classical) Mappings An **imperative** or **classical** mapping refers to the configuration of a mapped class using the :meth:`_orm.registry.map_imperatively` method, where the target class does not include any declarative class attributes. -The "map imperative" style has historically been achieved using the -:func:`_orm.mapper` function directly, however this function now expects -that a :meth:`_orm.registry` is present. - -.. deprecated:: 1.4 Using the :func:`_orm.mapper` function directly to - achieve a classical mapping directly is deprecated. The - :meth:`_orm.registry.map_imperatively` method retains the identical - functionality while also allowing for string-based resolution of - other mapped classes from within the registry. +.. versionchanged:: 2.0 The :meth:`_orm.registry.map_imperatively` method + is now used to create classical mappings. The ``sqlalchemy.orm.mapper()`` + standalone function is effectively removed. In "classical" form, the table metadata is created separately with the :class:`_schema.Table` construct, then associated with the ``User`` class via @@ -588,8 +581,8 @@ yet be linked to table metadata, nor can we specify a string here. Some examples in the documentation still use the classical approach, but note that the classical as well as Declarative approaches are **fully interchangeable**. Both systems ultimately create the same configuration, consisting of a :class:`_schema.Table`, -user-defined class, linked together with a :func:`.mapper`. 
When we talk about -"the behavior of :func:`.mapper`", this includes when using the Declarative system +user-defined class, linked together with a :class:`_orm.Mapper` object. When we talk about +"the behavior of :class:`_orm.Mapper`", this includes when using the Declarative system as well - it's still used, just behind the scenes. @@ -671,15 +664,15 @@ on the class itself as declarative class variables:: Mapper Configuration Overview ============================= -With all mapping forms, the mapping of the class can be -configured in many ways by passing construction arguments that become -part of the :class:`_orm.Mapper` object. The function which ultimately -receives these arguments is the :func:`_orm.mapper` function, which are delivered -to it originating from one of the front-facing mapping functions defined -on the :class:`_orm.registry` object. +With all mapping forms, the mapping of the class can be configured in many ways +by passing construction arguments that become part of the :class:`_orm.Mapper` +object. The construct which ultimately receives these arguments is the +constructor to the :class:`_orm.Mapper` class, and the arguments are delivered +to it originating from one of the front-facing mapping functions defined on the +:class:`_orm.registry` object. There are four general classes of configuration information that the -:func:`_orm.mapper` function looks for: +:class:`_orm.Mapper` class looks for: The class to be mapped ---------------------- @@ -746,12 +739,12 @@ the section :ref:`orm_declarative_properties` for notes on this process. When mapping with the :ref:`imperative ` style, the properties dictionary is passed directly as the ``properties`` argument to :meth:`_orm.registry.map_imperatively`, which will pass it along to the -:paramref:`_orm.mapper.properties` parameter. +:paramref:`_orm.Mapper.properties` parameter. 
Other mapper configuration parameters ------------------------------------- -These flags are documented at :func:`_orm.mapper`. +These flags are documented at :class:`_orm.Mapper`. When mapping with the :ref:`declarative ` mapping style, additional mapper configuration arguments are configured via the @@ -760,7 +753,7 @@ style, additional mapper configuration arguments are configured via the When mapping with the :ref:`imperative ` style, keyword arguments are passed to the to :meth:`_orm.registry.map_imperatively` -method which passes them along to the :func:`_orm.mapper` function. +method which passes them along to the :class:`_orm.Mapper` class. .. [1] When running under Python 2, a Python 2 "old style" class is the only diff --git a/doc/build/orm/nonstandard_mappings.rst b/doc/build/orm/nonstandard_mappings.rst index bf6b0f247d..ff02109e89 100644 --- a/doc/build/orm/nonstandard_mappings.rst +++ b/doc/build/orm/nonstandard_mappings.rst @@ -161,7 +161,7 @@ key. almost never needed; it necessarily tends to produce complex queries which are often less efficient than that which would be produced by direct query construction. The practice is to some degree - based on the very early history of SQLAlchemy where the :func:`.mapper` + based on the very early history of SQLAlchemy where the :class:`_orm.Mapper` construct was meant to represent the primary querying interface; in modern usage, the :class:`_query.Query` object can be used to construct virtually any SELECT statement, including complex composites, and should @@ -174,7 +174,7 @@ In modern SQLAlchemy, a particular class is mapped by only one so-called **primary** mapper at a time. This mapper is involved in three main areas of functionality: querying, persistence, and instrumentation of the mapped class. 
The rationale of the primary mapper relates to the fact that the -:func:`.mapper` modifies the class itself, not only persisting it towards a +:class:`_orm.Mapper` modifies the class itself, not only persisting it towards a particular :class:`_schema.Table`, but also :term:`instrumenting` attributes upon the class which are structured specifically according to the table metadata. It's not possible for more than one mapper to be associated with a class in equal diff --git a/doc/build/orm/relationship_persistence.rst b/doc/build/orm/relationship_persistence.rst index f843764741..765c5b69d9 100644 --- a/doc/build/orm/relationship_persistence.rst +++ b/doc/build/orm/relationship_persistence.rst @@ -213,7 +213,7 @@ should be enabled, using the configuration described at :ref:`passive_deletes` - supporting ON DELETE CASCADE with relationships - :paramref:`.orm.mapper.passive_updates` - similar feature on :func:`.mapper` + :paramref:`.orm.mapper.passive_updates` - similar feature on :class:`_orm.Mapper` Simulating limited ON UPDATE CASCADE without foreign key support diff --git a/doc/build/orm/versioning.rst b/doc/build/orm/versioning.rst index a141df6a0c..7aeca08738 100644 --- a/doc/build/orm/versioning.rst +++ b/doc/build/orm/versioning.rst @@ -175,7 +175,7 @@ automatically providing the new value of the version id counter. The ORM typically does not actively fetch the values of database-generated values when it emits an INSERT or UPDATE, instead leaving these columns as "expired" and to be fetched when they are next accessed, unless the ``eager_defaults`` -:func:`.mapper` flag is set. However, when a +:class:`_orm.Mapper` flag is set. However, when a server side version column is used, the ORM needs to actively fetch the newly generated value. This is so that the version counter is set up *before* any concurrent transaction may update it again. 
This fetching is also diff --git a/lib/sqlalchemy/dialects/mysql/enumerated.py b/lib/sqlalchemy/dialects/mysql/enumerated.py index b84608f580..f92e2cf199 100644 --- a/lib/sqlalchemy/dialects/mysql/enumerated.py +++ b/lib/sqlalchemy/dialects/mysql/enumerated.py @@ -12,7 +12,6 @@ from ... import exc from ... import sql from ... import util from ...sql import sqltypes -from ...sql.base import NO_ARG class ENUM(sqltypes.NativeForEmulated, sqltypes.Enum, _StringType): @@ -59,15 +58,7 @@ class ENUM(sqltypes.NativeForEmulated, sqltypes.Enum, _StringType): BINARY in schema. This does not affect the type of data stored, only the collation of character data. - :param quoting: Not used. A warning will be raised if provided. - """ - if kw.pop("quoting", NO_ARG) is not NO_ARG: - util.warn_deprecated_20( - "The 'quoting' parameter to :class:`.mysql.ENUM` is deprecated" - " and will be removed in a future release. " - "This parameter now has no effect." - ) kw.pop("strict", None) self._enum_init(enums, kw) _StringType.__init__(self, length=self.length, **kw) @@ -151,15 +142,7 @@ class SET(_StringType): .. versionadded:: 1.0.0 - :param quoting: Not used. A warning will be raised if passed. - """ - if kw.pop("quoting", NO_ARG) is not NO_ARG: - util.warn_deprecated_20( - "The 'quoting' parameter to :class:`.mysql.SET` is deprecated" - " and will be removed in a future release. " - "This parameter now has no effect." 
- ) self.retrieve_as_bitwise = kw.pop("retrieve_as_bitwise", False) self.values = tuple(values) if not self.retrieve_as_bitwise and "" in values: diff --git a/lib/sqlalchemy/engine/reflection.py b/lib/sqlalchemy/engine/reflection.py index e1b06a3149..92c243de3d 100644 --- a/lib/sqlalchemy/engine/reflection.py +++ b/lib/sqlalchemy/engine/reflection.py @@ -681,17 +681,6 @@ class Inspector: conn, table_name, schema, info_cache=self.info_cache, **kw ) - @util.deprecated_20( - ":meth:`_reflection.Inspector.reflecttable`", - "The :meth:`_reflection.Inspector.reflecttable` " - "method was renamed to " - ":meth:`_reflection.Inspector.reflect_table`. This deprecated alias " - "will be removed in a future release.", - ) - def reflecttable(self, *args, **kwargs): - "See reflect_table. This method name is deprecated" - return self.reflect_table(*args, **kwargs) - def reflect_table( self, table, diff --git a/lib/sqlalchemy/exc.py b/lib/sqlalchemy/exc.py index e51214fd9b..77edf98a0c 100644 --- a/lib/sqlalchemy/exc.py +++ b/lib/sqlalchemy/exc.py @@ -705,11 +705,7 @@ class LegacyAPIWarning(Base20DeprecationWarning): """indicates an API that is in 'legacy' status, a long term deprecation.""" -class RemovedIn20Warning(Base20DeprecationWarning): - """indicates an API that will be fully removed in SQLAlchemy 2.0.""" - - -class MovedIn20Warning(RemovedIn20Warning): +class MovedIn20Warning(Base20DeprecationWarning): """Subtype of RemovedIn20Warning to indicate an API that moved only.""" diff --git a/lib/sqlalchemy/ext/declarative/__init__.py b/lib/sqlalchemy/ext/declarative/__init__.py index b1c1d36912..ebb992742c 100644 --- a/lib/sqlalchemy/ext/declarative/__init__.py +++ b/lib/sqlalchemy/ext/declarative/__init__.py @@ -8,7 +8,6 @@ from .extensions import AbstractConcreteBase from .extensions import ConcreteBase from .extensions import DeferredReflection -from .extensions import instrument_declarative from ... 
import util from ...orm.decl_api import as_declarative as _as_declarative from ...orm.decl_api import declarative_base as _declarative_base diff --git a/lib/sqlalchemy/ext/declarative/extensions.py b/lib/sqlalchemy/ext/declarative/extensions.py index 1862592f5c..b7c0e78d94 100644 --- a/lib/sqlalchemy/ext/declarative/extensions.py +++ b/lib/sqlalchemy/ext/declarative/extensions.py @@ -8,9 +8,7 @@ from ... import inspection -from ... import util from ...orm import exc as orm_exc -from ...orm import registry from ...orm import relationships from ...orm.base import _mapper_or_none from ...orm.clsregistry import _resolver @@ -20,23 +18,6 @@ from ...schema import Table from ...util import OrderedDict -@util.deprecated( - "2.0", - "the instrument_declarative function is deprecated " - "and will be removed in SQLAlhcemy 2.0. Please use " - ":meth:`_orm.registry.map_declaratively", -) -def instrument_declarative(cls, cls_registry, metadata): - """Given a class, configure the class declaratively, - using the given registry, which can be any dictionary, and - MetaData object. - - """ - registry(metadata=metadata, class_registry=cls_registry).map_declaratively( - cls - ) - - class ConcreteBase: """A helper class for 'concrete' declarative mappings. diff --git a/lib/sqlalchemy/ext/mutable.py b/lib/sqlalchemy/ext/mutable.py index 7e277d3799..7a497abfcb 100644 --- a/lib/sqlalchemy/ext/mutable.py +++ b/lib/sqlalchemy/ext/mutable.py @@ -360,7 +360,6 @@ from .. import event from .. import inspect from .. 
import types from ..orm import Mapper -from ..orm import mapper from ..orm.attributes import flag_modified from ..sql.base import SchemaEventTarget from ..util import memoized_property @@ -567,7 +566,7 @@ class Mutable(MutableBase): if isinstance(prop.columns[0].type, sqltype): cls.associate_with_attribute(getattr(class_, prop.key)) - event.listen(mapper, "mapper_configured", listen_for_type) + event.listen(Mapper, "mapper_configured", listen_for_type) @classmethod def as_mutable(cls, sqltype): @@ -629,7 +628,7 @@ class Mutable(MutableBase): ) or (prop.columns[0].type is sqltype): cls.associate_with_attribute(getattr(class_, prop.key)) - event.listen(mapper, "mapper_configured", listen_for_type) + event.listen(Mapper, "mapper_configured", listen_for_type) return sqltype diff --git a/lib/sqlalchemy/orm/__init__.py b/lib/sqlalchemy/orm/__init__.py index 69a3e64da6..50b320cb2f 100644 --- a/lib/sqlalchemy/orm/__init__.py +++ b/lib/sqlalchemy/orm/__init__.py @@ -100,6 +100,7 @@ from .util import with_parent from .util import with_polymorphic from .. import sql as _sql from .. import util as _sa_util +from ..exc import InvalidRequestError from ..util.langhelpers import public_factory @@ -144,11 +145,31 @@ with_loader_criteria = public_factory(LoaderCriteriaOption, ".orm") relationship = public_factory(RelationshipProperty, ".orm.relationship") -@_sa_util.deprecated_20("relation", "Please use :func:`.relationship`.") -def relation(*arg, **kw): - """A synonym for :func:`relationship`.""" +def mapper(*arg, **kw): + """Placeholder for the now-removed ``mapper()`` function. - return relationship(*arg, **kw) + Classical mappings should be performed using the + :meth:`_orm.registry.map_imperatively` method. + + This symbol remains in SQLAlchemy 2.0 to suit the deprecated use case + of using the ``mapper()`` function as a target for ORM event listeners, + which failed to be marked as deprecated in the 1.4 series. 
+ + Global ORM mapper listeners should instead use the :class:`_orm.Mapper` + class as the target. + + .. versionchanged:: 2.0 The ``mapper()`` function was removed; the + symbol remains temporarily as a placeholder for the event listening + use case. + + """ + raise InvalidRequestError( + "The 'sqlalchemy.orm.mapper()' function is removed as of " + "SQLAlchemy 2.0. Use the " + "'sqlalchemy.orm.registry.map_imperatively()' " + "method of the ``sqlalchemy.orm.registry`` class to perform " + "classical mapping." + ) def dynamic_loader(argument, **kw): @@ -251,8 +272,6 @@ def query_expression(default_expr=_sql.null()): return prop -mapper = public_factory(Mapper, ".orm.mapper") - synonym = public_factory(SynonymProperty, ".orm.synonym") @@ -284,12 +303,6 @@ def clear_mappers(): mapperlib._dispose_registries(mapperlib._all_registries(), False) -@_sa_util.deprecated_20("eagerload", "Please use :func:`_orm.joinedload`.") -def eagerload(*args, **kwargs): - """A synonym for :func:`joinedload()`.""" - return joinedload(*args, **kwargs) - - contains_alias = public_factory(AliasOption, ".orm.contains_alias") if True: diff --git a/lib/sqlalchemy/orm/context.py b/lib/sqlalchemy/orm/context.py index 012db36c2c..36590f8ee9 100644 --- a/lib/sqlalchemy/orm/context.py +++ b/lib/sqlalchemy/orm/context.py @@ -1091,24 +1091,6 @@ class ORMSelectCompileState(ORMCompileState, SelectState): def _simple_statement(self): - if ( - self.compile_options._use_legacy_query_style - and (self.distinct and not self.distinct_on) - and self.order_by - ): - to_add = sql_util.expand_column_list_from_order_by( - self.primary_columns, self.order_by - ) - if to_add: - util.warn_deprecated_20( - "ORDER BY columns added implicitly due to " - "DISTINCT is deprecated and will be removed in " - "SQLAlchemy 2.0. 
SELECT statements with DISTINCT " - "should be written to explicitly include the appropriate " - "columns in the columns clause" - ) - self.primary_columns += to_add - statement = self._select_statement( self.primary_columns + self.secondary_columns, tuple(self.from_clauses) + tuple(self.eager_joins.values()), @@ -2293,14 +2275,6 @@ class _BundleEntity(_QueryEntity): ) self.supports_single_entity = self.bundle.single_entity - if ( - self.supports_single_entity - and not compile_state.compile_options._use_legacy_query_style - ): - util.warn_deprecated_20( - "The Bundle.single_entity flag has no effect when " - "using 2.0 style execution." - ) @property def mapper(self): diff --git a/lib/sqlalchemy/orm/decl_api.py b/lib/sqlalchemy/orm/decl_api.py index 7b6814863a..c973f5a4cc 100644 --- a/lib/sqlalchemy/orm/decl_api.py +++ b/lib/sqlalchemy/orm/decl_api.py @@ -370,7 +370,7 @@ def declarative_base( The new base class will be given a metaclass that produces appropriate :class:`~sqlalchemy.schema.Table` objects and makes - the appropriate :func:`~sqlalchemy.orm.mapper` calls based on the + the appropriate :class:`_orm.Mapper` calls based on the information provided declaratively in the class and any subclasses of the class. @@ -408,7 +408,7 @@ def declarative_base( ``metadata`` attribute of the generated declarative base class. :param mapper: - An optional callable, defaults to :func:`~sqlalchemy.orm.mapper`. Will + An optional callable, defaults to :class:`_orm.Mapper`. Will be used to map subclasses to their Tables. :param cls: @@ -479,8 +479,8 @@ class registry: * :meth:`_orm.registry.map_imperatively` will produce a :class:`_orm.Mapper` for a class without scanning the class for declarative class attributes. This method suits the use case historically - provided by the - :func:`_orm.mapper` classical mapping function. + provided by the ``sqlalchemy.orm.mapper()`` classical mapping function, + which is removed as of SQLAlchemy 2.0. .. 
versionadded:: 1.4 @@ -749,7 +749,7 @@ class registry: examples. :param mapper: - An optional callable, defaults to :func:`~sqlalchemy.orm.mapper`. + An optional callable, defaults to :class:`_orm.Mapper`. This function is used to generate new :class:`_orm.Mapper` objects. :param cls: @@ -923,8 +923,8 @@ class registry: information. Instead, all mapping constructs are passed as arguments. - This method is intended to be fully equivalent to the classic - SQLAlchemy :func:`_orm.mapper` function, except that it's in terms of + This method is intended to be fully equivalent to the now-removed + SQLAlchemy ``mapper()`` function, except that it's in terms of a particular registry. E.g.:: @@ -948,15 +948,15 @@ class registry: and usage examples. :param class\_: The class to be mapped. Corresponds to the - :paramref:`_orm.mapper.class_` parameter. + :paramref:`_orm.Mapper.class_` parameter. :param local_table: the :class:`_schema.Table` or other :class:`_sql.FromClause` object that is the subject of the mapping. Corresponds to the - :paramref:`_orm.mapper.local_table` parameter. + :paramref:`_orm.Mapper.local_table` parameter. :param \**kw: all other keyword arguments are passed to the - :func:`_orm.mapper` function directly. + :class:`_orm.Mapper` constructor directly. .. seealso:: @@ -968,9 +968,6 @@ class registry: return _mapper(self, class_, local_table, kw) -mapperlib._legacy_registry = registry() - - def as_declarative(**kw): """ Class decorator which will adapt a given class into a diff --git a/lib/sqlalchemy/orm/events.py b/lib/sqlalchemy/orm/events.py index a66bf64d8c..03e3796a66 100644 --- a/lib/sqlalchemy/orm/events.py +++ b/lib/sqlalchemy/orm/events.py @@ -152,8 +152,8 @@ class InstanceEvents(event.Events): * unmapped superclasses of mapped or to-be-mapped classes (using the ``propagate=True`` flag) * :class:`_orm.Mapper` objects - * the :class:`_orm.Mapper` class itself and the :func:`.mapper` - function indicate listening for all mappers. 
+ * the :class:`_orm.Mapper` class itself indicates listening for all + mappers. Instance events are closely related to mapper events, but are more specific to the instance and its instrumentation, @@ -200,6 +200,12 @@ class InstanceEvents(event.Events): elif isinstance(target, mapperlib.Mapper): return target.class_manager elif target is orm.mapper: + util.warn_deprecated( + "The `sqlalchemy.orm.mapper()` symbol is deprecated and " + "will be removed in a future release. For the mapper-wide " + "event target, use the 'sqlalchemy.orm.Mapper' class.", + "2.0", + ) return instrumentation.ClassManager elif isinstance(target, type): if issubclass(target, mapperlib.Mapper): @@ -638,8 +644,8 @@ class MapperEvents(event.Events): * unmapped superclasses of mapped or to-be-mapped classes (using the ``propagate=True`` flag) * :class:`_orm.Mapper` objects - * the :class:`_orm.Mapper` class itself and the :func:`.mapper` - function indicate listening for all mappers. + * the :class:`_orm.Mapper` class itself indicates listening for all + mappers. Mapper events provide hooks into critical sections of the mapper, including those related to object instrumentation, @@ -692,6 +698,12 @@ class MapperEvents(event.Events): orm = util.preloaded.orm if target is orm.mapper: + util.warn_deprecated( + "The `sqlalchemy.orm.mapper()` symbol is deprecated and " + "will be removed in a future release. For the mapper-wide " + "event target, use the 'sqlalchemy.orm.Mapper' class.", + "2.0", + ) return mapperlib.Mapper elif isinstance(target, type): if issubclass(target, mapperlib.Mapper): @@ -721,7 +733,7 @@ class MapperEvents(event.Events): ): util.warn( "'before_configured' and 'after_configured' ORM events " - "only invoke with the mapper() function or Mapper class " + "only invoke with the Mapper class " "as the target." ) @@ -896,13 +908,13 @@ class MapperEvents(event.Events): new mappers have been made available and new mapper use is detected. 
- This event can **only** be applied to the :class:`_orm.Mapper` class - or :func:`.mapper` function, and not to individual mappings or - mapped classes. It is only invoked for all mappings as a whole:: + This event can **only** be applied to the :class:`_orm.Mapper` class, + and not to individual mappings or mapped classes. It is only invoked + for all mappings as a whole:: - from sqlalchemy.orm import mapper + from sqlalchemy.orm import Mapper - @event.listens_for(mapper, "before_configured") + @event.listens_for(Mapper, "before_configured") def go(): # ... @@ -959,13 +971,13 @@ class MapperEvents(event.Events): Also contrast to :meth:`.MapperEvents.before_configured`, which is invoked before the series of mappers has been configured. - This event can **only** be applied to the :class:`_orm.Mapper` class - or :func:`.mapper` function, and not to individual mappings or + This event can **only** be applied to the :class:`_orm.Mapper` class, + and not to individual mappings or mapped classes. It is only invoked for all mappings as a whole:: - from sqlalchemy.orm import mapper + from sqlalchemy.orm import Mapper - @event.listens_for(mapper, "after_configured") + @event.listens_for(Mapper, "after_configured") def go(): # ... @@ -1005,7 +1017,7 @@ class MapperEvents(event.Events): The event is often called for a batch of objects of the same class before their INSERT statements are emitted at once in a later step. In the extremely rare case that - this is not desirable, the :func:`.mapper` can be + this is not desirable, the :class:`_orm.Mapper` object can be configured with ``batch=False``, which will cause batches of instances to be broken up into individual (and more poorly performing) event->persist->event @@ -1052,7 +1064,7 @@ class MapperEvents(event.Events): same class after their INSERT statements have been emitted at once in a previous step. 
In the extremely rare case that this is not desirable, the - :func:`.mapper` can be configured with ``batch=False``, + :class:`_orm.Mapper` object can be configured with ``batch=False``, which will cause batches of instances to be broken up into individual (and more poorly performing) event->persist->event steps. @@ -1116,7 +1128,7 @@ class MapperEvents(event.Events): The event is often called for a batch of objects of the same class before their UPDATE statements are emitted at once in a later step. In the extremely rare case that - this is not desirable, the :func:`.mapper` can be + this is not desirable, the :class:`_orm.Mapper` can be configured with ``batch=False``, which will cause batches of instances to be broken up into individual (and more poorly performing) event->persist->event @@ -1180,7 +1192,7 @@ class MapperEvents(event.Events): The event is often called for a batch of objects of the same class after their UPDATE statements have been emitted at once in a previous step. In the extremely rare case that - this is not desirable, the :func:`.mapper` can be + this is not desirable, the :class:`_orm.Mapper` can be configured with ``batch=False``, which will cause batches of instances to be broken up into individual (and more poorly performing) event->persist->event diff --git a/lib/sqlalchemy/orm/loading.py b/lib/sqlalchemy/orm/loading.py index 047971d356..796003ebba 100644 --- a/lib/sqlalchemy/orm/loading.py +++ b/lib/sqlalchemy/orm/loading.py @@ -262,11 +262,10 @@ def merge_frozen_result(session, statement, frozen_result, load=True): session.autoflush = autoflush -@util.deprecated_20( +@util.became_legacy_20( ":func:`_orm.merge_result`", alternative="The function as well as the method on :class:`_orm.Query` " "is superseded by the :func:`_orm.merge_frozen_result` function.", - becomes_legacy=True, ) @util.preload_module("sqlalchemy.orm.context") def merge_result(query, iterator, load=True): diff --git a/lib/sqlalchemy/orm/mapper.py 
b/lib/sqlalchemy/orm/mapper.py index 21da7c2f30..bc818559db 100644 --- a/lib/sqlalchemy/orm/mapper.py +++ b/lib/sqlalchemy/orm/mapper.py @@ -57,8 +57,6 @@ from ..util import HasMemoized _mapper_registries = weakref.WeakKeyDictionary() -_legacy_registry = None - def _all_registries(): with _CONFIGURE_MUTEX: @@ -148,16 +146,15 @@ class Mapper( ): r"""Direct constructor for a new :class:`_orm.Mapper` object. - The :func:`_orm.mapper` function is normally invoked through the + The :class:`_orm.Mapper` constructor is not called directly, and + is normally invoked through the use of the :class:`_orm.registry` object through either the :ref:`Declarative ` or :ref:`Imperative ` mapping styles. - .. versionchanged:: 1.4 The :func:`_orm.mapper` function should not - be called directly for classical mapping; for a classical mapping - configuration, use the :meth:`_orm.registry.map_imperatively` - method. The :func:`_orm.mapper` function may become private in a - future release. + .. versionchanged:: 2.0 The public facing ``mapper()`` function is + removed; for a classical mapping configuration, use the + :meth:`_orm.registry.map_imperatively` method. Parameters documented below may be passed to either the :meth:`_orm.registry.map_imperatively` method, or may be passed in the @@ -1202,8 +1199,7 @@ class Mapper( # we expect that declarative has applied the class manager # already and set up a registry. if this is None, - # we will emit a deprecation warning below when we also see that - # it has no registry. + # this raises as of 2.0. manager = attributes.manager_of_class(self.class_) if self.non_primary: @@ -1251,14 +1247,12 @@ class Mapper( ) if not manager.registry: - util.warn_deprecated_20( - "Calling the mapper() function directly outside of a " - "declarative registry is deprecated." + raise sa_exc.InvalidRequestError( + "The _mapper() function may not be invoked directly outside " + "of a declarative registry." 
" Please use the sqlalchemy.orm.registry.map_imperatively() " "function for a classical mapping." ) - assert _legacy_registry is not None - _legacy_registry._add_manager(manager) self.class_manager = manager self.registry = manager.registry diff --git a/lib/sqlalchemy/orm/query.py b/lib/sqlalchemy/orm/query.py index 2d574c6b22..e8ec5f156a 100644 --- a/lib/sqlalchemy/orm/query.py +++ b/lib/sqlalchemy/orm/query.py @@ -677,7 +677,7 @@ class Query( self._compile_options += opt return self - @util.deprecated_20( + @util.became_legacy_20( ":meth:`_orm.Query.with_labels` and :meth:`_orm.Query.apply_labels`", alternative="Use set_label_style(LABEL_STYLE_TABLENAME_PLUS_COL) " "instead.", @@ -803,10 +803,9 @@ class Query( self.load_options += {"_yield_per": count} return self - @util.deprecated_20( + @util.became_legacy_20( ":meth:`_orm.Query.get`", alternative="The method is now available as :meth:`_orm.Session.get`", - becomes_legacy=True, ) def get(self, ident): """Return an instance based on the given primary key identifier, @@ -933,8 +932,8 @@ class Query( :func:`~.expression.select`. The method here accepts mapped classes, :func:`.aliased` constructs, - and :func:`.mapper` constructs as arguments, which are resolved into - expression constructs, in addition to appropriate expression + and :class:`_orm.Mapper` constructs as arguments, which are resolved + into expression constructs, in addition to appropriate expression constructs. 
The correlation arguments are ultimately passed to @@ -997,10 +996,9 @@ class Query( self.load_options += {"_invoke_all_eagers": value} return self - @util.deprecated_20( + @util.became_legacy_20( ":meth:`_orm.Query.with_parent`", alternative="Use the :func:`_orm.with_parent` standalone construct.", - becomes_legacy=True, ) @util.preload_module("sqlalchemy.orm.relationships") def with_parent(self, instance, property=None, from_entity=None): # noqa @@ -2046,7 +2044,6 @@ class Query( return orm_util._getitem( self, item, - allow_negative=not self.session or not self.session.future, ) @_generative @@ -2405,11 +2402,10 @@ class Query( return result - @util.deprecated_20( + @util.became_legacy_20( ":meth:`_orm.Query.merge_result`", alternative="The method is superseded by the " ":func:`_orm.merge_frozen_result` function.", - becomes_legacy=True, enable_warnings=False, # warnings occur via loading.merge_result ) def merge_result(self, iterator, load=True): diff --git a/lib/sqlalchemy/orm/relationships.py b/lib/sqlalchemy/orm/relationships.py index dd58c0201d..522947cc50 100644 --- a/lib/sqlalchemy/orm/relationships.py +++ b/lib/sqlalchemy/orm/relationships.py @@ -111,7 +111,7 @@ class RelationshipProperty(StrategizedProperty): passive_updates=True, enable_typechecks=True, active_history=False, - cascade_backrefs=True, + cascade_backrefs=False, ) _dependency_processor = None @@ -403,21 +403,11 @@ class RelationshipProperty(StrategizedProperty): :ref:`tutorial_delete_cascade` - Tutorial example describing a delete cascade. - :param cascade_backrefs=True: - A boolean value indicating if the ``save-update`` cascade should - operate along an assignment event intercepted by a backref. - When set to ``False``, the attribute managed by this relationship - will not cascade an incoming transient object into the session of a - persistent parent, if the event is received via backref. + :param cascade_backrefs=False: + Legacy; this flag is always False. - .. 
deprecated:: 1.4 The - :paramref:`_orm.relationship.cascade_backrefs` - flag will default to False in all cases in SQLAlchemy 2.0. - - .. seealso:: - - :ref:`backref_cascade` - Full discussion and examples on how - the :paramref:`_orm.relationship.cascade_backrefs` option is used. + .. versionchanged:: 2.0 "cascade_backrefs" functionality has been + removed. :param collection_class: A class or callable that returns a new list-holding object. will @@ -1007,7 +997,13 @@ class RelationshipProperty(StrategizedProperty): self._user_defined_foreign_keys = foreign_keys self.collection_class = collection_class self.passive_deletes = passive_deletes - self.cascade_backrefs = cascade_backrefs + + if cascade_backrefs: + raise sa_exc.ArgumentError( + "The 'cascade_backrefs' parameter passed to " + "relationship() may only be set to False." + ) + self.passive_updates = passive_updates self.remote_side = remote_side self.enable_typechecks = enable_typechecks diff --git a/lib/sqlalchemy/orm/session.py b/lib/sqlalchemy/orm/session.py index 62e5604756..b1b7005fea 100644 --- a/lib/sqlalchemy/orm/session.py +++ b/lib/sqlalchemy/orm/session.py @@ -985,7 +985,7 @@ class Session(_SessionClassMethods): self, bind=None, autoflush=True, - future=False, + future=True, expire_on_commit=True, twophase=False, binds=None, @@ -1074,20 +1074,7 @@ class Session(_SessionClassMethods): :ref:`session_committing` - :param future: if True, use 2.0 style transactional and engine - behavior. Future mode includes the following behaviors: - - * The :class:`_orm.Session` will not use "bound" metadata in order - to locate an :class:`_engine.Engine`; the engine or engines in use - must be specified to the constructor of :class:`_orm.Session` or - otherwise be configured against the :class:`_orm.sessionmaker` - in use - - * The behavior of the :paramref:`_orm.relationship.cascade_backrefs` - flag on a :func:`_orm.relationship` will always assume - "False" behavior. - - .. 
versionadded:: 1.4 + :param future: Deprecated; this flag is always True. .. seealso:: @@ -1128,6 +1115,12 @@ class Session(_SessionClassMethods): ) self.identity_map = identity.WeakInstanceDict() + if not future: + raise sa_exc.ArgumentError( + "The 'future' parameter passed to " + "Session() may only be set to True." + ) + self._new = {} # InstanceState->object, strong refs object self._deleted = {} # same self.bind = bind @@ -1136,7 +1129,6 @@ class Session(_SessionClassMethods): self._warn_on_events = False self._transaction = None self._nested_transaction = None - self.future = future self.hash_key = _new_sessionid() self.autoflush = autoflush self.expire_on_commit = expire_on_commit @@ -1170,33 +1162,6 @@ class Session(_SessionClassMethods): with self.begin(): yield self - @property - @util.deprecated_20( - ":attr:`_orm.Session.transaction`", - alternative="For context manager use, use " - ":meth:`_orm.Session.begin`. To access " - "the current root transaction, use " - ":meth:`_orm.Session.get_transaction`.", - warn_on_attribute_access=True, - ) - def transaction(self): - """The current active or inactive :class:`.SessionTransaction`. - - May be None if no transaction has begun yet. - - .. versionchanged:: 1.4 the :attr:`.Session.transaction` attribute - is now a read-only descriptor that also may return None if no - transaction has begun yet. - - - """ - return self._legacy_transaction() - - def _legacy_transaction(self): - if not self.future: - self._autobegin() - return self._transaction - def in_transaction(self): """Return True if this :class:`_orm.Session` has begun a transaction. @@ -1350,13 +1315,7 @@ class Session(_SessionClassMethods): If no transaction is in progress, this method is a pass-through. - In :term:`1.x-style` use, this method rolls back the topmost - database transaction if no nested transactions are in effect, or - to the current nested transaction if one is in effect. 
- - When - :term:`2.0-style` use is in effect via the - :paramref:`_orm.Session.future` flag, the method always rolls back + The method always rolls back the topmost database transaction, discarding any nested transactions that may be in progress. @@ -1370,7 +1329,7 @@ class Session(_SessionClassMethods): if self._transaction is None: pass else: - self._transaction.rollback(_to_root=self.future) + self._transaction.rollback(_to_root=True) def commit(self): """Flush pending changes and commit the current transaction. @@ -1378,15 +1337,8 @@ class Session(_SessionClassMethods): If no transaction is in progress, the method will first "autobegin" a new transaction and commit. - If :term:`1.x-style` use is in effect and there are currently - SAVEPOINTs in progress via :meth:`_orm.Session.begin_nested`, - the operation will release the current SAVEPOINT but not commit - the outermost database transaction. - - If :term:`2.0-style` use is in effect via the - :paramref:`_orm.Session.future` flag, the outermost database - transaction is committed unconditionally, automatically releasing any - SAVEPOINTs in effect. + The outermost database transaction is committed unconditionally, + automatically releasing any SAVEPOINTs in effect. .. seealso:: @@ -1399,7 +1351,7 @@ class Session(_SessionClassMethods): if not self._autobegin(): raise sa_exc.InvalidRequestError("No transaction is begun.") - self._transaction.commit(_to_root=self.future) + self._transaction.commit(_to_root=True) def prepare(self): """Prepare the current transaction in progress for two phase commit. @@ -1418,7 +1370,7 @@ class Session(_SessionClassMethods): self._transaction.prepare() - def connection(self, bind_arguments=None, execution_options=None, **kw): + def connection(self, bind_arguments=None, execution_options=None): r"""Return a :class:`_engine.Connection` object corresponding to this :class:`.Session` object's transactional state. 
@@ -1437,15 +1389,6 @@ class Session(_SessionClassMethods): "mapper", "bind", "clause", other custom arguments that are passed to :meth:`.Session.get_bind`. - :param bind: - deprecated; use bind_arguments - - :param mapper: - deprecated; use bind_arguments - - :param clause: - deprecated; use bind_arguments - :param execution_options: a dictionary of execution options that will be passed to :meth:`_engine.Connection.execution_options`, **when the connection is first procured only**. If the connection is already @@ -1456,17 +1399,15 @@ class Session(_SessionClassMethods): :ref:`session_transaction_isolation` - :param \**kw: - deprecated; use bind_arguments - """ - if not bind_arguments: - bind_arguments = kw + if bind_arguments: + bind = bind_arguments.pop("bind", None) - bind = bind_arguments.pop("bind", None) - if bind is None: - bind = self.get_bind(**bind_arguments) + if bind is None: + bind = self.get_bind(**bind_arguments) + else: + bind = self.get_bind() return self._connection_for_bind( bind, @@ -1490,7 +1431,6 @@ class Session(_SessionClassMethods): bind_arguments=None, _parent_execute_state=None, _add_event=None, - **kw, ): r"""Execute a SQL expression construct. @@ -1505,8 +1445,8 @@ class Session(_SessionClassMethods): ) The API contract of :meth:`_orm.Session.execute` is similar to that - of :meth:`_future.Connection.execute`, the :term:`2.0 style` version - of :class:`_future.Connection`. + of :meth:`_engine.Connection.execute`, the :term:`2.0 style` version + of :class:`_engine.Connection`. .. versionchanged:: 1.4 the :meth:`_orm.Session.execute` method is now the primary point of ORM statement execution when using @@ -1539,32 +1479,13 @@ class Session(_SessionClassMethods): Contents of this dictionary are passed to the :meth:`.Session.get_bind` method. 
- :param mapper: - deprecated; use the bind_arguments dictionary - - :param bind: - deprecated; use the bind_arguments dictionary - - :param \**kw: - deprecated; use the bind_arguments dictionary - :return: a :class:`_engine.Result` object. """ statement = coercions.expect(roles.StatementRole, statement) - if kw: - util.warn_deprecated_20( - "Passing bind arguments to Session.execute() as keyword " - "arguments is deprecated and will be removed SQLAlchemy 2.0. " - "Please use the bind_arguments parameter." - ) - if not bind_arguments: - bind_arguments = kw - else: - bind_arguments.update(kw) - elif not bind_arguments: + if not bind_arguments: bind_arguments = {} if ( @@ -1848,7 +1769,7 @@ class Session(_SessionClassMethods): mapped. :param bind: an :class:`_engine.Engine` or :class:`_engine.Connection` - object. + object. .. seealso:: @@ -1913,15 +1834,12 @@ class Session(_SessionClassMethods): :ref:`session_custom_partitioning`. :param mapper: - Optional :func:`.mapper` mapped class or instance of - :class:`_orm.Mapper`. The bind can be derived from a - :class:`_orm.Mapper` - first by consulting the "binds" map associated with this - :class:`.Session`, and secondly by consulting the - :class:`_schema.MetaData` - associated with the :class:`_schema.Table` to which the - :class:`_orm.Mapper` - is mapped for a bind. + Optional mapped class or corresponding :class:`_orm.Mapper` instance. + The bind can be derived from a :class:`_orm.Mapper` first by + consulting the "binds" map associated with this :class:`.Session`, + and secondly by consulting the :class:`_schema.MetaData` associated + with the :class:`_schema.Table` to which the :class:`_orm.Mapper` is + mapped for a bind. :param clause: A :class:`_expression.ClauseElement` (i.e. @@ -2587,7 +2505,7 @@ class Session(_SessionClassMethods): ) .. versionadded:: 1.4 Added :meth:`_orm.Session.get`, which is moved - from the now deprecated :meth:`_orm.Query.get` method. + from the now legacy :meth:`_orm.Query.get` method. 
:meth:`_orm.Session.get` is special in that it provides direct access to the identity map of the :class:`.Session`. diff --git a/lib/sqlalchemy/orm/unitofwork.py b/lib/sqlalchemy/orm/unitofwork.py index 03456e5723..6f85508f3d 100644 --- a/lib/sqlalchemy/orm/unitofwork.py +++ b/lib/sqlalchemy/orm/unitofwork.py @@ -21,18 +21,6 @@ from .. import util from ..util import topological -def _warn_for_cascade_backrefs(state, prop): - util.warn_deprecated_20( - '"%s" object is being merged into a Session along the backref ' - 'cascade path for relationship "%s"; in SQLAlchemy 2.0, this ' - "reverse cascade will not take place. Set cascade_backrefs to " - "False in either the relationship() or backref() function for " - "the 2.0 behavior; or to set globally for the whole " - "Session, set the future=True flag" % (state.class_.__name__, prop), - code="s9r1", - ) - - def track_cascade_events(descriptor, prop): """Establish event listeners on object attributes which handle cascade-on-set/append. @@ -57,14 +45,9 @@ def track_cascade_events(descriptor, prop): if ( prop._cascade.save_update - and ( - (prop.cascade_backrefs and not sess.future) - or key == initiator.key - ) + and (key == initiator.key) and not sess._contains_state(item_state) ): - if key != initiator.key: - _warn_for_cascade_backrefs(item_state, prop) sess._save_or_update_state(item_state) return item @@ -119,14 +102,9 @@ def track_cascade_events(descriptor, prop): newvalue_state = attributes.instance_state(newvalue) if ( prop._cascade.save_update - and ( - (prop.cascade_backrefs and not sess.future) - or key == initiator.key - ) + and (key == initiator.key) and not sess._contains_state(newvalue_state) ): - if key != initiator.key: - _warn_for_cascade_backrefs(newvalue_state, prop) sess._save_or_update_state(newvalue_state) if ( diff --git a/lib/sqlalchemy/orm/util.py b/lib/sqlalchemy/orm/util.py index a282adea4b..739d8a4c2c 100644 --- a/lib/sqlalchemy/orm/util.py +++ b/lib/sqlalchemy/orm/util.py @@ -2046,25 
+2046,17 @@ def randomize_unitofwork(): ) = session.set = mapper.set = dependency.set = RandomSet -def _getitem(iterable_query, item, allow_negative): +def _getitem(iterable_query, item): """calculate __getitem__ in terms of an iterable query object that also has a slice() method. """ def _no_negative_indexes(): - if not allow_negative: - raise IndexError( - "negative indexes are not accepted by SQL " - "index / slice operators" - ) - else: - util.warn_deprecated_20( - "Support for negative indexes for SQL index / slice operators " - "will be " - "removed in 2.0; these operators fetch the complete result " - "and do not work efficiently." - ) + raise IndexError( + "negative indexes are not accepted by SQL " + "index / slice operators" + ) if isinstance(item, slice): start, stop, step = util.decode_slice(item) diff --git a/lib/sqlalchemy/sql/coercions.py b/lib/sqlalchemy/sql/coercions.py index d11ab6712b..fe4fc4b409 100644 --- a/lib/sqlalchemy/sql/coercions.py +++ b/lib/sqlalchemy/sql/coercions.py @@ -99,14 +99,15 @@ def _document_text_coercion(paramname, meth_rst, param_rst): def _expression_collection_was_a_list(attrname, fnname, args): if args and isinstance(args[0], (list, set, dict)) and len(args) == 1: if isinstance(args[0], list): - util.warn_deprecated_20( - 'The "%s" argument to %s(), when referring to a sequence ' + raise exc.ArgumentError( + f'The "{attrname}" argument to {fnname}(), when ' + "referring to a sequence " "of items, is now passed as a series of positional " - "elements, rather than as a list. " % (attrname, fnname) + "elements, rather than as a list. 
" ) return args[0] - else: - return args + + return args def expect( @@ -883,15 +884,6 @@ class StatementImpl(_CoerceLiterals, RoleImpl): original_element, resolved, argname=argname, **kw ) - def _text_coercion(self, element, argname=None): - util.warn_deprecated_20( - "Using plain strings to indicate SQL statements without using " - "the text() construct is " - "deprecated and will be removed in version 2.0. Ensure plain " - "SQL statements are passed using the text() construct." - ) - return elements.TextClause(element) - class SelectStatementImpl(_NoTextCoercion, RoleImpl): __slots__ = () diff --git a/lib/sqlalchemy/sql/dml.py b/lib/sqlalchemy/sql/dml.py index feb286d659..fad1728f86 100644 --- a/lib/sqlalchemy/sql/dml.py +++ b/lib/sqlalchemy/sql/dml.py @@ -237,66 +237,6 @@ class UpdateBase( is_dml = True - @classmethod - def _constructor_20_deprecations(cls, fn_name, clsname, names): - - param_to_method_lookup = dict( - whereclause=( - "The :paramref:`%(func)s.whereclause` parameter " - "will be removed " - "in SQLAlchemy 2.0. Please refer to the " - ":meth:`%(classname)s.where` method." - ), - values=( - "The :paramref:`%(func)s.values` parameter will be removed " - "in SQLAlchemy 2.0. Please refer to the " - ":meth:`%(classname)s.values` method." - ), - inline=( - "The :paramref:`%(func)s.inline` parameter will be " - "removed in " - "SQLAlchemy 2.0. Please use the " - ":meth:`%(classname)s.inline` method." - ), - prefixes=( - "The :paramref:`%(func)s.prefixes parameter will be " - "removed in " - "SQLAlchemy 2.0. Please use the " - ":meth:`%(classname)s.prefix_with` " - "method." - ), - return_defaults=( - "The :paramref:`%(func)s.return_defaults` parameter will be " - "removed in SQLAlchemy 2.0. Please use the " - ":meth:`%(classname)s.return_defaults` method." - ), - returning=( - "The :paramref:`%(func)s.returning` parameter will be " - "removed in SQLAlchemy 2.0. Please use the " - ":meth:`%(classname)s.returning`` method." 
- ), - preserve_parameter_order=( - "The :paramref:`%(func)s.preserve_parameter_order` parameter " - "will be removed in SQLAlchemy 2.0. Use the " - ":meth:`%(classname)s.ordered_values` method with a list " - "of tuples. " - ), - ) - - return util.deprecated_params( - **{ - name: ( - "2.0", - param_to_method_lookup[name] - % { - "func": "_expression.%s" % fn_name, - "classname": "_expression.%s" % clsname, - }, - ) - for name in names - } - ) - def _generate_fromclause_column_proxies(self, fromclause): fromclause._columns._populate_separate_keys( col._make_proxy(fromclause) for col in self._returning @@ -332,15 +272,6 @@ class UpdateBase( self._validate_dialect_kwargs(opt) return self - def _validate_dialect_kwargs_deprecated(self, dialect_kw): - util.warn_deprecated_20( - "Passing dialect keyword arguments directly to the " - "%s constructor is deprecated and will be removed in SQLAlchemy " - "2.0. Please use the ``with_dialect_options()`` method." - % (self.__class__.__name__) - ) - self._validate_dialect_kwargs(dialect_kw) - @_generative def returning(self: SelfUpdateBase, *cols) -> SelfUpdateBase: r"""Add a :term:`RETURNING` or equivalent clause to this statement. @@ -499,14 +430,10 @@ class ValuesBase(UpdateBase): _returning = () - def __init__(self, table, values, prefixes): + def __init__(self, table): self.table = coercions.expect( roles.DMLTableRole, table, apply_propagate_attrs=self ) - if values is not None: - self.values.non_generative(self, values) - if prefixes: - self._setup_prefixes(prefixes) @_generative @_exclusive_against( @@ -765,7 +692,7 @@ class ValuesBase(UpdateBase): :meth:`.ValuesBase.return_defaults` is used by the ORM to provide an efficient implementation for the ``eager_defaults`` feature of - :func:`.mapper`. + :class:`_orm.Mapper`. 
:param cols: optional list of column key names or :class:`_schema.Column` @@ -809,6 +736,7 @@ class Insert(ValuesBase): select = None include_insert_from_select_defaults = False + _inline = False is_insert = True @@ -835,26 +763,9 @@ class Insert(ValuesBase): + HasCTE._has_ctes_traverse_internals ) - @ValuesBase._constructor_20_deprecations( - "insert", - "Insert", - [ - "values", - "inline", - "prefixes", - "returning", - "return_defaults", - ], - ) def __init__( self, table, - values=None, - inline=False, - prefixes=None, - returning=None, - return_defaults=False, - **dialect_kw, ): """Construct an :class:`_expression.Insert` object. @@ -922,17 +833,7 @@ class Insert(ValuesBase): :ref:`inserts_and_updates` - SQL Expression Tutorial """ - super(Insert, self).__init__(table, values, prefixes) - self._inline = inline - if returning: - self._returning = returning - if dialect_kw: - self._validate_dialect_kwargs_deprecated(dialect_kw) - - if return_defaults: - self._return_defaults = True - if not isinstance(return_defaults, bool): - self._return_defaults_columns = return_defaults + super(Insert, self).__init__(table) @_generative def inline(self: SelfInsert) -> SelfInsert: @@ -1120,6 +1021,8 @@ class Update(DMLWhereBase, ValuesBase): __visit_name__ = "update" is_update = True + _preserve_parameter_order = False + _inline = False _traverse_internals = ( [ @@ -1142,30 +1045,9 @@ class Update(DMLWhereBase, ValuesBase): + HasCTE._has_ctes_traverse_internals ) - @ValuesBase._constructor_20_deprecations( - "update", - "Update", - [ - "whereclause", - "values", - "inline", - "prefixes", - "returning", - "return_defaults", - "preserve_parameter_order", - ], - ) def __init__( self, table, - whereclause=None, - values=None, - inline=False, - prefixes=None, - returning=None, - return_defaults=False, - preserve_parameter_order=False, - **dialect_kw, ): r"""Construct an :class:`_expression.Update` object. 
@@ -1275,18 +1157,7 @@ class Update(DMLWhereBase, ValuesBase): """ - self._preserve_parameter_order = preserve_parameter_order - super(Update, self).__init__(table, values, prefixes) - if returning: - self._returning = returning - if whereclause is not None: - self._where_criteria += ( - coercions.expect(roles.WhereHavingRole, whereclause), - ) - self._inline = inline - if dialect_kw: - self._validate_dialect_kwargs_deprecated(dialect_kw) - self._return_defaults = return_defaults + super(Update, self).__init__(table) @_generative def ordered_values(self: SelfUpdate, *args) -> SelfUpdate: @@ -1373,18 +1244,9 @@ class Delete(DMLWhereBase, UpdateBase): + HasCTE._has_ctes_traverse_internals ) - @ValuesBase._constructor_20_deprecations( - "delete", - "Delete", - ["whereclause", "values", "prefixes", "returning"], - ) def __init__( self, table, - whereclause=None, - returning=None, - prefixes=None, - **dialect_kw, ): r"""Construct :class:`_expression.Delete` object. @@ -1424,16 +1286,3 @@ class Delete(DMLWhereBase, UpdateBase): self.table = coercions.expect( roles.DMLTableRole, table, apply_propagate_attrs=self ) - if returning: - self._returning = returning - - if prefixes: - self._setup_prefixes(prefixes) - - if whereclause is not None: - self._where_criteria += ( - coercions.expect(roles.WhereHavingRole, whereclause), - ) - - if dialect_kw: - self._validate_dialect_kwargs_deprecated(dialect_kw) diff --git a/lib/sqlalchemy/sql/elements.py b/lib/sqlalchemy/sql/elements.py index 6e5fbba889..3ec702e10e 100644 --- a/lib/sqlalchemy/sql/elements.py +++ b/lib/sqlalchemy/sql/elements.py @@ -2785,10 +2785,7 @@ class Case(ColumnElement): ("else_", InternalTraversal.dp_clauseelement), ] - # TODO: for Py2k removal, this will be: - # def __init__(self, *whens, value=None, else_=None): - - def __init__(self, *whens, **kw): + def __init__(self, *whens, value=None, else_=None): r"""Produce a ``CASE`` expression. 
The ``CASE`` construct in SQL is a conditional object that @@ -2875,8 +2872,7 @@ class Case(ColumnElement): whether or not :paramref:`.case.value` is used. .. versionchanged:: 1.4 the :func:`_sql.case` - function now accepts the series of WHEN conditions positionally; - passing the expressions within a list is deprecated. + function now accepts the series of WHEN conditions positionally In the first form, it accepts a list of 2-tuples; each 2-tuple consists of ``(<sql expression>, <value>)``, where the SQL @@ -2911,24 +2907,14 @@ """ - if "whens" in kw: - util.warn_deprecated_20( - 'The "whens" argument to case() is now passed using ' - "positional style only, not as a keyword argument." - ) - whens = (kw.pop("whens"),) - whens = coercions._expression_collection_was_a_list( "whens", "case", whens ) - try: whens = util.dictlike_iteritems(whens) except TypeError: pass - value = kw.pop("value", None) - whenlist = [ ( coercions.expect( @@ -2954,15 +2940,11 @@ self.type = type_ self.whens = whenlist - else_ = kw.pop("else_", None) if else_ is not None: self.else_ = coercions.expect(roles.ExpressionElementRole, else_) else: self.else_ = None - if kw: - raise TypeError("unknown arguments: %s" % (", ".join(sorted(kw)))) - @property def _from_objects(self): return list( diff --git a/lib/sqlalchemy/sql/schema.py b/lib/sqlalchemy/sql/schema.py index a5fdd39d41..d836384b4b 100644 --- a/lib/sqlalchemy/sql/schema.py +++ b/lib/sqlalchemy/sql/schema.py @@ -538,12 +538,6 @@ class Table(DialectKWArgs, SchemaItem, TableClause): "1.4", "Deprecated alias of :paramref:`_schema.Table.must_exist`", ), - autoload=( - "2.0", - "The autoload parameter is deprecated and will be removed in " - "version 2.0. 
Please use the " - "autoload_with parameter, passing an engine or connection.", - ), ) def __new__(cls, *args, **kw): if not args and not kw: @@ -638,7 +632,7 @@ class Table(DialectKWArgs, SchemaItem, TableClause): self.fullname = self.name autoload_with = kwargs.pop("autoload_with", None) - autoload = kwargs.pop("autoload", autoload_with is not None) + autoload = autoload_with is not None # this argument is only used with _init_existing() kwargs.pop("autoload_replace", True) keep_existing = kwargs.pop("keep_existing", False) diff --git a/lib/sqlalchemy/sql/selectable.py b/lib/sqlalchemy/sql/selectable.py index 9c5850761f..046b41d892 100644 --- a/lib/sqlalchemy/sql/selectable.py +++ b/lib/sqlalchemy/sql/selectable.py @@ -1360,110 +1360,6 @@ class Join(roles.DMLTableRole, FromClause): .alias(name) ) - @util.deprecated_20( - ":meth:`_sql.Join.alias`", - alternative="Create a select + subquery, or alias the " - "individual tables inside the join, instead.", - ) - def alias(self, name=None, flat=False): - r"""Return an alias of this :class:`_expression.Join`. - - The default behavior here is to first produce a SELECT - construct from this :class:`_expression.Join`, then to produce an - :class:`_expression.Alias` from that. 
So given a join of the form:: - - j = table_a.join(table_b, table_a.c.id == table_b.c.a_id) - - The JOIN by itself would look like:: - - table_a JOIN table_b ON table_a.id = table_b.a_id - - Whereas the alias of the above, ``j.alias()``, would in a - SELECT context look like:: - - (SELECT table_a.id AS table_a_id, table_b.id AS table_b_id, - table_b.a_id AS table_b_a_id - FROM table_a - JOIN table_b ON table_a.id = table_b.a_id) AS anon_1 - - The equivalent long-hand form, given a :class:`_expression.Join` - object ``j``, is:: - - from sqlalchemy import select, alias - j = alias( - select(j.left, j.right).\ - select_from(j).\ - set_label_style(LABEL_STYLE_TABLENAME_PLUS_COL).\ - correlate(False), - name=name - ) - - The selectable produced by :meth:`_expression.Join.alias` - features the same - columns as that of the two individual selectables presented under - a single name - the individual columns are "auto-labeled", meaning - the ``.c.`` collection of the resulting :class:`_expression.Alias` - represents - the names of the individual columns using a - ``<tablename>_<columnname>`` scheme:: - - j.c.table_a_id - j.c.table_b_a_id - - :meth:`_expression.Join.alias` also features an alternate - option for aliasing joins which produces no enclosing SELECT and - does not normally apply labels to the column names. The - ``flat=True`` option will call :meth:`_expression.FromClause.alias` - against the left and right sides individually. 
- Using this option, no new ``SELECT`` is produced; - we instead, from a construct as below:: - - j = table_a.join(table_b, table_a.c.id == table_b.c.a_id) - j = j.alias(flat=True) - - we get a result like this:: - - table_a AS table_a_1 JOIN table_b AS table_b_1 ON - table_a_1.id = table_b_1.a_id - - The ``flat=True`` argument is also propagated to the contained - selectables, so that a composite join such as:: - - j = table_a.join( - table_b.join(table_c, - table_b.c.id == table_c.c.b_id), - table_b.c.a_id == table_a.c.id - ).alias(flat=True) - - Will produce an expression like:: - - table_a AS table_a_1 JOIN ( - table_b AS table_b_1 JOIN table_c AS table_c_1 - ON table_b_1.id = table_c_1.b_id - ) ON table_a_1.id = table_b_1.a_id - - The standalone :func:`_expression.alias` function as well as the - base :meth:`_expression.FromClause.alias` - method also support the ``flat=True`` - argument as a no-op, so that the argument can be passed to the - ``alias()`` method of any selectable. - - :param name: name given to the alias. - - :param flat: if True, produce an alias of the left and right - sides of this :class:`_expression.Join` and return the join of those - two selectables. This produces join expression that does not - include an enclosing SELECT. - - .. seealso:: - - :ref:`core_tutorial_aliases` - - :func:`_expression.alias` - - """ - return self._anonymous_fromclause(flat=flat, name=name) - @property def _hide_froms(self): return itertools.chain( @@ -2618,7 +2514,6 @@ class TableClause(roles.DMLTableRole, Immutable, FromClause): .. versionadded:: 1.3.18 :func:`_expression.table` can now accept a ``schema`` argument. 
""" - super(TableClause, self).__init__() self.name = name self._columns = DedupeColumnCollection() @@ -2665,7 +2560,7 @@ class TableClause(roles.DMLTableRole, Immutable, FromClause): c.table = self @util.preload_module("sqlalchemy.sql.dml") - def insert(self, values=None, inline=False, **kwargs): + def insert(self): """Generate an :func:`_expression.insert` construct against this :class:`_expression.TableClause`. @@ -2676,12 +2571,10 @@ class TableClause(roles.DMLTableRole, Immutable, FromClause): See :func:`_expression.insert` for argument and usage information. """ - return util.preloaded.sql_dml.Insert( - self, values=values, inline=inline, **kwargs - ) + return util.preloaded.sql_dml.Insert(self) @util.preload_module("sqlalchemy.sql.dml") - def update(self, whereclause=None, values=None, inline=False, **kwargs): + def update(self): """Generate an :func:`_expression.update` construct against this :class:`_expression.TableClause`. @@ -2694,14 +2587,10 @@ class TableClause(roles.DMLTableRole, Immutable, FromClause): """ return util.preloaded.sql_dml.Update( self, - whereclause=whereclause, - values=values, - inline=inline, - **kwargs, ) @util.preload_module("sqlalchemy.sql.dml") - def delete(self, whereclause=None, **kwargs): + def delete(self): """Generate a :func:`_expression.delete` construct against this :class:`_expression.TableClause`. @@ -2712,7 +2601,7 @@ class TableClause(roles.DMLTableRole, Immutable, FromClause): See :func:`_expression.delete` for argument and usage information. """ - return util.preloaded.sql_dml.Delete(self, whereclause, **kwargs) + return util.preloaded.sql_dml.Delete(self) @property def _from_objects(self): @@ -2806,7 +2695,7 @@ class Values(Generative, FromClause): ("literal_binds", InternalTraversal.dp_boolean), ] - def __init__(self, *columns, **kw): + def __init__(self, *columns, name=None, literal_binds=False): r"""Construct a :class:`_expression.Values` construct. 
The column expressions and the actual data for @@ -2844,8 +2733,8 @@ class Values(Generative, FromClause): super(Values, self).__init__() self._column_args = columns - self.name = kw.pop("name", None) - self.literal_binds = kw.pop("literal_binds", False) + self.name = name + self.literal_binds = literal_binds self.named_with_column = self.name is not None @property @@ -3286,65 +3175,12 @@ class SelectStatementGrouping(GroupedElement, SelectBase): return self.element._from_objects -class DeprecatedSelectBaseGenerations: - """A collection of methods available on :class:`_sql.Select` and - :class:`_sql.CompoundSelect`, these are all **deprecated** methods as they - modify the object in-place. - - """ - - @util.deprecated( - "1.4", - "The :meth:`_expression.GenerativeSelect.append_order_by` " - "method is deprecated " - "and will be removed in a future release. Use the generative method " - ":meth:`_expression.GenerativeSelect.order_by`.", - ) - def append_order_by(self, *clauses): - """Append the given ORDER BY criterion applied to this selectable. - - The criterion will be appended to any pre-existing ORDER BY criterion. - - This is an **in-place** mutation method; the - :meth:`_expression.GenerativeSelect.order_by` method is preferred, - as it - provides standard :term:`method chaining`. - - .. seealso:: - - :meth:`_expression.GenerativeSelect.order_by` - - """ - self.order_by.non_generative(self, *clauses) - - @util.deprecated( - "1.4", - "The :meth:`_expression.GenerativeSelect.append_group_by` " - "method is deprecated " - "and will be removed in a future release. Use the generative method " - ":meth:`_expression.GenerativeSelect.group_by`.", - ) - def append_group_by(self, *clauses): - """Append the given GROUP BY criterion applied to this selectable. - - The criterion will be appended to any pre-existing GROUP BY criterion. 
- - This is an **in-place** mutation method; the - :meth:`_expression.GenerativeSelect.group_by` method is preferred, - as it - provides standard :term:`method chaining`. - - - """ - self.group_by.non_generative(self, *clauses) - - SelfGenerativeSelect = typing.TypeVar( "SelfGenerativeSelect", bound="GenerativeSelect" ) -class GenerativeSelect(DeprecatedSelectBaseGenerations, SelectBase): +class GenerativeSelect(SelectBase): """Base class for SELECT statements where additional elements can be added. @@ -3368,39 +3204,9 @@ class GenerativeSelect(DeprecatedSelectBaseGenerations, SelectBase): _fetch_clause_options = None _for_update_arg = None - def __init__( - self, - _label_style=LABEL_STYLE_DEFAULT, - use_labels=False, - limit=None, - offset=None, - order_by=None, - group_by=None, - ): - if use_labels: - if util.SQLALCHEMY_WARN_20: - util.warn_deprecated_20( - "The use_labels=True keyword argument to GenerativeSelect " - "is deprecated and will be removed in version 2.0. Please " - "use " - "select.set_label_style(LABEL_STYLE_TABLENAME_PLUS_COL) " - "if you need to replicate this legacy behavior.", - stacklevel=4, - ) - _label_style = LABEL_STYLE_TABLENAME_PLUS_COL - + def __init__(self, _label_style=LABEL_STYLE_DEFAULT): self._label_style = _label_style - if limit is not None: - self.limit.non_generative(self, limit) - if offset is not None: - self.offset.non_generative(self, offset) - - if order_by is not None: - self.order_by.non_generative(self, *util.to_list(order_by)) - if group_by is not None: - self.group_by.non_generative(self, *util.to_list(group_by)) - @_generative def with_for_update( self: SelfGenerativeSelect, @@ -3516,14 +3322,6 @@ class GenerativeSelect(DeprecatedSelectBaseGenerations, SelectBase): self._label_style = style return self - @util.deprecated_20( - ":meth:`_sql.GenerativeSelect.apply_labels`", - alternative="Use set_label_style(LABEL_STYLE_TABLENAME_PLUS_COL) " - "instead.", - ) - def apply_labels(self): - return 
self.set_label_style(LABEL_STYLE_TABLENAME_PLUS_COL) - @property def _group_by_clause(self): """ClauseList access to group_by_clauses for legacy dialects""" @@ -3894,9 +3692,9 @@ class CompoundSelect(HasCompileState, GenerativeSelect): INTERSECT_ALL = util.symbol("INTERSECT ALL") _is_from_container = True + _auto_correlate = False - def __init__(self, keyword, *selects, **kwargs): - self._auto_correlate = kwargs.pop("correlate", False) + def __init__(self, keyword, *selects): self.keyword = keyword self.selects = [ coercions.expect(roles.CompoundElementRole, s).self_group( @@ -3905,17 +3703,7 @@ class CompoundSelect(HasCompileState, GenerativeSelect): for s in selects ] - if kwargs and util.SQLALCHEMY_WARN_20: - util.warn_deprecated_20( - "Set functions such as union(), union_all(), extract(), etc. " - "in SQLAlchemy 2.0 will accept a " - "series of SELECT statements only. " - "Please use generative methods such as order_by() for " - "additional modifications to this CompoundSelect.", - stacklevel=4, - ) - - GenerativeSelect.__init__(self, **kwargs) + GenerativeSelect.__init__(self) @classmethod def _create_union(cls, *selects, **kwargs): @@ -3938,7 +3726,7 @@ class CompoundSelect(HasCompileState, GenerativeSelect): return CompoundSelect(CompoundSelect.UNION, *selects, **kwargs) @classmethod - def _create_union_all(cls, *selects, **kwargs): + def _create_union_all(cls, *selects): r"""Return a ``UNION ALL`` of multiple selectables. The returned object is an instance of @@ -3950,15 +3738,11 @@ class CompoundSelect(HasCompileState, GenerativeSelect): :param \*selects: a list of :class:`_expression.Select` instances. - :param \**kwargs: - available keyword arguments are the same as those of - :func:`select`. 
- """ - return CompoundSelect(CompoundSelect.UNION_ALL, *selects, **kwargs) + return CompoundSelect(CompoundSelect.UNION_ALL, *selects) @classmethod - def _create_except(cls, *selects, **kwargs): + def _create_except(cls, *selects): r"""Return an ``EXCEPT`` of multiple selectables. The returned object is an instance of @@ -3967,15 +3751,11 @@ class CompoundSelect(HasCompileState, GenerativeSelect): :param \*selects: a list of :class:`_expression.Select` instances. - :param \**kwargs: - available keyword arguments are the same as those of - :func:`select`. - """ - return CompoundSelect(CompoundSelect.EXCEPT, *selects, **kwargs) + return CompoundSelect(CompoundSelect.EXCEPT, *selects) @classmethod - def _create_except_all(cls, *selects, **kwargs): + def _create_except_all(cls, *selects): r"""Return an ``EXCEPT ALL`` of multiple selectables. The returned object is an instance of @@ -3984,15 +3764,11 @@ class CompoundSelect(HasCompileState, GenerativeSelect): :param \*selects: a list of :class:`_expression.Select` instances. - :param \**kwargs: - available keyword arguments are the same as those of - :func:`select`. - """ - return CompoundSelect(CompoundSelect.EXCEPT_ALL, *selects, **kwargs) + return CompoundSelect(CompoundSelect.EXCEPT_ALL, *selects) @classmethod - def _create_intersect(cls, *selects, **kwargs): + def _create_intersect(cls, *selects): r"""Return an ``INTERSECT`` of multiple selectables. The returned object is an instance of @@ -4001,15 +3777,11 @@ class CompoundSelect(HasCompileState, GenerativeSelect): :param \*selects: a list of :class:`_expression.Select` instances. - :param \**kwargs: - available keyword arguments are the same as those of - :func:`select`. 
- """ - return CompoundSelect(CompoundSelect.INTERSECT, *selects, **kwargs) + return CompoundSelect(CompoundSelect.INTERSECT, *selects) @classmethod - def _create_intersect_all(cls, *selects, **kwargs): + def _create_intersect_all(cls, *selects): r"""Return an ``INTERSECT ALL`` of multiple selectables. The returned object is an instance of @@ -4018,12 +3790,9 @@ class CompoundSelect(HasCompileState, GenerativeSelect): :param \*selects: a list of :class:`_expression.Select` instances. - :param \**kwargs: - available keyword arguments are the same as those of - :func:`select`. """ - return CompoundSelect(CompoundSelect.INTERSECT_ALL, *selects, **kwargs) + return CompoundSelect(CompoundSelect.INTERSECT_ALL, *selects) def _scalar_type(self): return self.selects[0]._scalar_type() @@ -4117,134 +3886,6 @@ class CompoundSelect(HasCompileState, GenerativeSelect): return self.selects[0].selected_columns -class DeprecatedSelectGenerations: - """A collection of methods available on :class:`_sql.Select`, these - are all **deprecated** methods as they modify the :class:`_sql.Select` - object in -place. - - """ - - @util.deprecated( - "1.4", - "The :meth:`_expression.Select.append_correlation` " - "method is deprecated " - "and will be removed in a future release. Use the generative " - "method :meth:`_expression.Select.correlate`.", - ) - def append_correlation(self, fromclause): - """Append the given correlation expression to this select() - construct. - - This is an **in-place** mutation method; the - :meth:`_expression.Select.correlate` method is preferred, - as it provides - standard :term:`method chaining`. - - """ - - self.correlate.non_generative(self, fromclause) - - @util.deprecated( - "1.4", - "The :meth:`_expression.Select.append_column` method is deprecated " - "and will be removed in a future release. 
Use the generative " - "method :meth:`_expression.Select.add_columns`.", - ) - def append_column(self, column): - """Append the given column expression to the columns clause of this - select() construct. - - E.g.:: - - my_select.append_column(some_table.c.new_column) - - This is an **in-place** mutation method; the - :meth:`_expression.Select.add_columns` method is preferred, - as it provides standard - :term:`method chaining`. - - """ - self.add_columns.non_generative(self, column) - - @util.deprecated( - "1.4", - "The :meth:`_expression.Select.append_prefix` method is deprecated " - "and will be removed in a future release. Use the generative " - "method :meth:`_expression.Select.prefix_with`.", - ) - def append_prefix(self, clause): - """Append the given columns clause prefix expression to this select() - construct. - - This is an **in-place** mutation method; the - :meth:`_expression.Select.prefix_with` method is preferred, - as it provides - standard :term:`method chaining`. - - """ - self.prefix_with.non_generative(self, clause) - - @util.deprecated( - "1.4", - "The :meth:`_expression.Select.append_whereclause` " - "method is deprecated " - "and will be removed in a future release. Use the generative " - "method :meth:`_expression.Select.where`.", - ) - def append_whereclause(self, whereclause): - """Append the given expression to this select() construct's WHERE - criterion. - - The expression will be joined to existing WHERE criterion via AND. - - This is an **in-place** mutation method; the - :meth:`_expression.Select.where` method is preferred, - as it provides standard - :term:`method chaining`. - - """ - self.where.non_generative(self, whereclause) - - @util.deprecated( - "1.4", - "The :meth:`_expression.Select.append_having` method is deprecated " - "and will be removed in a future release. 
Use the generative " - "method :meth:`_expression.Select.having`.", - ) - def append_having(self, having): - """Append the given expression to this select() construct's HAVING - criterion. - - The expression will be joined to existing HAVING criterion via AND. - - This is an **in-place** mutation method; the - :meth:`_expression.Select.having` method is preferred, - as it provides standard - :term:`method chaining`. - - """ - - self.having.non_generative(self, having) - - @util.deprecated( - "1.4", - "The :meth:`_expression.Select.append_from` method is deprecated " - "and will be removed in a future release. Use the generative " - "method :meth:`_expression.Select.select_from`.", - ) - def append_from(self, fromclause): - """Append the given :class:`_expression.FromClause` expression - to this select() construct's FROM clause. - - This is an **in-place** mutation method; the - :meth:`_expression.Select.select_from` method is preferred, - as it provides - standard :term:`method chaining`. - - """ - self.select_from.non_generative(self, fromclause) - - @CompileState.plugin_for("default", "select") class SelectState(util.MemoizedSlots, CompileState): __slots__ = ( @@ -4700,7 +4341,6 @@ class Select( HasSuffixes, HasHints, HasCompileState, - DeprecatedSelectGenerations, _SelectFromElements, GenerativeSelect, ): @@ -5345,7 +4985,9 @@ class Select( ) @_generative - def with_only_columns(self: SelfSelect, *columns, **kw) -> SelfSelect: + def with_only_columns( + self: SelfSelect, *columns, maintain_column_froms=False + ) -> SelfSelect: r"""Return a new :func:`_expression.select` construct with its columns clause replaced with the given columns. @@ -5404,10 +5046,6 @@ class Select( # is the case for now. 
         self._assert_no_memoizations()
 
-        maintain_column_froms = kw.pop("maintain_column_froms", False)
-        if kw:
-            raise TypeError("unknown parameters: %s" % (", ".join(kw),))
-
         if maintain_column_froms:
             self.select_from.non_generative(self, *self.columns_clause_froms)
diff --git a/lib/sqlalchemy/sql/sqltypes.py b/lib/sqlalchemy/sql/sqltypes.py
index cb44fc0861..3c16321728 100644
--- a/lib/sqlalchemy/sql/sqltypes.py
+++ b/lib/sqlalchemy/sql/sqltypes.py
@@ -1207,14 +1207,9 @@ class Enum(Emulated, String, SchemaType):
            .. versionadded:: 1.3.8
 
         :param omit_aliases: A boolean that when true will remove aliases from
-          pep 435 enums. For backward compatibility it defaults to ``False``.
-          A deprecation warning is raised if the enum has aliases and this
-          flag was not set.
+          PEP 435 enums. Defaults to ``True``.
 
-          .. versionadded:: 1.4.5
-
-          .. deprecated:: 1.4 The default will be changed to ``True`` in
-             SQLAlchemy 2.0.
+          .. versionchanged:: 2.0 This parameter now defaults to ``True``.
 
         """
         self._enum_init(enums, kw)
@@ -1239,7 +1234,7 @@ class Enum(Emulated, String, SchemaType):
         self.values_callable = kw.pop("values_callable", None)
         self._sort_key_function = kw.pop("sort_key_function", NO_ARG)
         length_arg = kw.pop("length", NO_ARG)
-        self._omit_aliases = kw.pop("omit_aliases", NO_ARG)
+        self._omit_aliases = kw.pop("omit_aliases", True)
 
         values, objects = self._parse_into_values(enums, kw)
         self._setup_for_values(values, objects, kw)
@@ -1284,14 +1279,6 @@ class Enum(Emulated, String, SchemaType):
 
         _members = self.enum_class.__members__
 
-        aliases = [n for n, v in _members.items() if v.name != n]
-        if self._omit_aliases is NO_ARG and aliases:
-            util.warn_deprecated_20(
-                "The provided enum %s contains the aliases %s. The "
-                "``omit_aliases`` will default to ``True`` in SQLAlchemy "
-                "2.0. Specify a value to silence this warning."
-                % (self.enum_class.__name__, aliases)
-            )
         if self._omit_aliases is True:
             # remove aliases
             members = OrderedDict(
diff --git a/lib/sqlalchemy/testing/assertions.py b/lib/sqlalchemy/testing/assertions.py
index 795b538047..978e6764fd 100644
--- a/lib/sqlalchemy/testing/assertions.py
+++ b/lib/sqlalchemy/testing/assertions.py
@@ -31,7 +31,7 @@ from ..util import decorator
 def expect_warnings(*messages, **kw):
     """Context manager which expects one or more warnings.
 
-    With no arguments, squelches all SAWarning and RemovedIn20Warning emitted via
+    With no arguments, squelches all SAWarning emitted via
     sqlalchemy.util.warn and sqlalchemy.util.warn_limited.  Otherwise
     pass string expressions that will match selected warnings via regex;
    all non-matching warnings are sent through.
@@ -41,9 +41,7 @@ def expect_warnings(*messages, **kw):
    Note that the test suite sets SAWarning warnings to raise exceptions.
 
     """  # noqa
-    return _expect_warnings(
-        (sa_exc.RemovedIn20Warning, sa_exc.SAWarning), messages, **kw
-    )
+    return _expect_warnings(sa_exc.SAWarning, messages, **kw)
 
 
 @contextlib.contextmanager
@@ -199,9 +197,7 @@ def _expect_warnings(
             else:
                 real_warn(msg, *arg, **kw)
 
-    with mock.patch("warnings.warn", our_warn), mock.patch(
-        "sqlalchemy.util.SQLALCHEMY_WARN_20", True
-    ), mock.patch("sqlalchemy.util.deprecations.SQLALCHEMY_WARN_20", True):
+    with mock.patch("warnings.warn", our_warn):
         try:
             yield
         finally:
diff --git a/lib/sqlalchemy/testing/profiling.py b/lib/sqlalchemy/testing/profiling.py
index 2761d49875..6ccfe4ea7f 100644
--- a/lib/sqlalchemy/testing/profiling.py
+++ b/lib/sqlalchemy/testing/profiling.py
@@ -238,21 +238,18 @@ def function_call_count(variance=0.05, times=1, warmup=0):
     # likely due to the introduction of __signature__.
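The ``omit_aliases`` branch kept in the ``sqltypes.py`` hunk above filters a PEP 435 enum's ``__members__`` down to its canonical names. A minimal stdlib-only sketch of that filtering logic — the ``Color`` enum here is a hypothetical example, not part of the patch:

```python
import enum
from collections import OrderedDict


class Color(enum.Enum):
    red = 1
    crimson = 1  # PEP 435 alias: same value, so it resolves to "red"


# Mirrors the kept branch: drop members whose .name differs from their
# dict key, i.e. the aliases.
members = OrderedDict(
    (n, v) for n, v in Color.__members__.items() if v.name == n
)

assert list(members) == ["red"]
assert Color.crimson is Color.red  # the enum itself is untouched
```

With the 2.0 default of ``omit_aliases=True``, only the canonical names survive; passing ``omit_aliases=False`` (as the ``test_misc.py`` hunk now does explicitly) keeps the aliases.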
     from sqlalchemy.util import decorator
-    from sqlalchemy.util import deprecations
-    from sqlalchemy.testing import mock
 
     @decorator
     def wrap(fn, *args, **kw):
 
-        with mock.patch.object(deprecations, "SQLALCHEMY_WARN_20", False):
-            for warm in range(warmup):
-                fn(*args, **kw)
+        for warm in range(warmup):
+            fn(*args, **kw)
 
-            timerange = range(times)
-            with count_functions(variance=variance):
-                for time in timerange:
-                    rv = fn(*args, **kw)
-                return rv
+        timerange = range(times)
+        with count_functions(variance=variance):
+            for time in timerange:
+                rv = fn(*args, **kw)
+            return rv
 
     return wrap
diff --git a/lib/sqlalchemy/testing/warnings.py b/lib/sqlalchemy/testing/warnings.py
index c8e481a900..0c550731c0 100644
--- a/lib/sqlalchemy/testing/warnings.py
+++ b/lib/sqlalchemy/testing/warnings.py
@@ -46,13 +46,6 @@ def setup_filters():
         message="The loop argument is deprecated",
     )
 
-    # ignore things that are deprecated *as of* 2.0 :)
-    warnings.filterwarnings(
-        "ignore",
-        category=sa_exc.SADeprecationWarning,
-        message=r".*\(deprecated since: 2.0\)$",
-    )
-
     try:
         import pytest
     except ImportError:
diff --git a/lib/sqlalchemy/util/__init__.py b/lib/sqlalchemy/util/__init__.py
index b452a1fdae..eb9ddb3138 100644
--- a/lib/sqlalchemy/util/__init__.py
+++ b/lib/sqlalchemy/util/__init__.py
@@ -70,16 +70,13 @@ from .concurrency import await_fallback
 from .concurrency import await_only
 from .concurrency import greenlet_spawn
 from .concurrency import is_exit_exception
+from .deprecations import became_legacy_20
 from .deprecations import deprecated
-from .deprecations import deprecated_20
-from .deprecations import deprecated_20_cls
 from .deprecations import deprecated_cls
 from .deprecations import deprecated_params
 from .deprecations import inject_docstring_text
 from .deprecations import moved_20
-from .deprecations import SQLALCHEMY_WARN_20
 from .deprecations import warn_deprecated
-from .deprecations import warn_deprecated_20
 from .langhelpers import add_parameter_text
 from .langhelpers import as_interface
 from .langhelpers import asbool
diff --git a/lib/sqlalchemy/util/deprecations.py b/lib/sqlalchemy/util/deprecations.py
index 4d3e04fde8..6a0fdecb5f 100644
--- a/lib/sqlalchemy/util/deprecations.py
+++ b/lib/sqlalchemy/util/deprecations.py
@@ -8,7 +8,6 @@
 """Helpers related to deprecation of functions, methods, classes, other
 functionality."""
 
-import os
 import re
 
 from . import compat
@@ -20,19 +19,7 @@ from .langhelpers import inject_param_text
 from .. import exc
 
 
-SQLALCHEMY_WARN_20 = False
-
-if os.getenv("SQLALCHEMY_WARN_20", "false").lower() in ("true", "yes", "1"):
-    SQLALCHEMY_WARN_20 = True
-
-
 def _warn_with_version(msg, version, type_, stacklevel, code=None):
-    if (
-        issubclass(type_, exc.Base20DeprecationWarning)
-        and not SQLALCHEMY_WARN_20
-    ):
-        return
-
     warn = type_(msg, code=code)
     warn.deprecated_since = version
 
@@ -57,17 +44,6 @@ def warn_deprecated_limited(msg, args, version, stacklevel=3, code=None):
     )
 
 
-def warn_deprecated_20(msg, stacklevel=3, code=None):
-
-    _warn_with_version(
-        msg,
-        exc.RemovedIn20Warning.deprecated_since,
-        exc.RemovedIn20Warning,
-        stacklevel,
-        code=code,
-    )
-
-
 def deprecated_cls(version, message, constructor="__init__"):
     header = ".. deprecated:: %s %s" % (version, (message or ""))
 
@@ -84,41 +60,6 @@ def deprecated_cls(version, message, constructor="__init__"):
     return decorate
 
 
-def deprecated_20_cls(
-    clsname, alternative=None, constructor="__init__", becomes_legacy=False
-):
-    message = (
-        ".. deprecated:: 1.4 The %s class is considered legacy as of the "
-        "1.x series of SQLAlchemy and %s in 2.0."
-        % (
-            clsname,
-            "will be removed"
-            if not becomes_legacy
-            else "becomes a legacy construct",
-        )
-    )
-
-    if alternative:
-        message += " " + alternative
-
-    if becomes_legacy:
-        warning_cls = exc.LegacyAPIWarning
-    else:
-        warning_cls = exc.RemovedIn20Warning
-
-    def decorate(cls):
-        return _decorate_cls_with_warning(
-            cls,
-            constructor,
-            warning_cls,
-            message,
-            warning_cls.deprecated_since,
-            message,
-        )
-
-    return decorate
-
-
 def deprecated(
     version,
     message=None,
@@ -142,14 +83,6 @@ def deprecated(
 
     """
 
-    # nothing is deprecated "since" 2.0 at this time.  All "removed in 2.0"
-    # should emit the RemovedIn20Warning, but messaging should be expressed
-    # in terms of "deprecated since 1.4".
-
-    if version == "2.0":
-        if warning is None:
-            warning = exc.RemovedIn20Warning
-        version = "1.4"
     if add_deprecation_to_docstring:
         header = ".. deprecated:: %s %s" % (
             version,
@@ -164,8 +97,7 @@ def deprecated(
     if warning is None:
         warning = exc.SADeprecationWarning
 
-    if warning is not exc.RemovedIn20Warning:
-        message += " (deprecated since: %s)" % version
+    message += " (deprecated since: %s)" % version
 
     def decorate(fn):
         return _decorate_with_warning(
@@ -186,7 +118,7 @@ def moved_20(message, **kw):
     )
 
 
-def deprecated_20(api_name, alternative=None, becomes_legacy=False, **kw):
+def became_legacy_20(api_name, alternative=None, **kw):
     type_reg = re.match("^:(attr|func|meth):", api_name)
     if type_reg:
         type_ = {"attr": "attribute", "func": "function", "meth": "method"}[
@@ -200,9 +132,7 @@ def deprecated_20(api_name, alternative=None, becomes_legacy=False, **kw):
         % (
             api_name,
             type_,
-            "will be removed"
-            if not becomes_legacy
-            else "becomes a legacy construct",
+            "becomes a legacy construct",
         )
     )
 
@@ -219,10 +149,7 @@ def deprecated_20(api_name, alternative=None, becomes_legacy=False, **kw):
     if alternative:
         message += " " + alternative
 
-    if becomes_legacy:
-        warning_cls = exc.LegacyAPIWarning
-    else:
-        warning_cls = exc.RemovedIn20Warning
+    warning_cls = exc.LegacyAPIWarning
 
     return deprecated("2.0", message=message, warning=warning_cls, **kw)
 
@@ -250,11 +177,7 @@ def deprecated_params(**specs):
     for param, (version, message) in specs.items():
         versions[param] = version
         messages[param] = _sanitize_restructured_text(message)
-        version_warnings[param] = (
-            exc.RemovedIn20Warning
-            if version == "2.0"
-            else exc.SADeprecationWarning
-        )
+        version_warnings[param] = exc.SADeprecationWarning
 
     def decorate(fn):
         spec = compat.inspect_getfullargspec(fn)
diff --git a/test/aaa_profiling/test_memusage.py b/test/aaa_profiling/test_memusage.py
index 316e7d7d46..10ec4d3079 100644
--- a/test/aaa_profiling/test_memusage.py
+++ b/test/aaa_profiling/test_memusage.py
@@ -1449,7 +1449,9 @@ class CycleTest(_fixtures.FixtureTest):
 
         @assert_cycles(4)
         def go():
-            result = s.connection(mapper=User).execute(stmt)
+            result = s.connection(bind_arguments=dict(mapper=User)).execute(
+                stmt
+            )
             while True:
                 row = result.fetchone()
                 if row is None:
diff --git a/test/aaa_profiling/test_misc.py b/test/aaa_profiling/test_misc.py
index c0508a3cfd..13848a7bdd 100644
--- a/test/aaa_profiling/test_misc.py
+++ b/test/aaa_profiling/test_misc.py
@@ -46,7 +46,7 @@ class EnumTest(fixtures.TestBase):
 
     @profiling.function_call_count()
     def test_create_enum_from_pep_435_w_expensive_members(self):
-        Enum(self.SomeEnum)
+        Enum(self.SomeEnum, omit_aliases=False)
 
 
 class CacheKeyTest(fixtures.TestBase):
diff --git a/test/aaa_profiling/test_orm.py b/test/aaa_profiling/test_orm.py
index 128a78714f..cfc697d5e6 100644
--- a/test/aaa_profiling/test_orm.py
+++ b/test/aaa_profiling/test_orm.py
@@ -94,7 +94,7 @@ class MergeTest(NoCache, fixtures.MappedTest):
         # down from 185 on this this is a small slice of a usually
         # bigger operation so using a small variance
 
-        sess2._legacy_transaction()  # autobegin
+        sess2.connection()  # autobegin
 
         @profiling.function_call_count(variance=0.20)
         def go1():
@@ -104,7 +104,7 @@ class MergeTest(NoCache, fixtures.MappedTest):
 
         # third call, merge object already present. almost no calls.
 
-        sess2._legacy_transaction()  # autobegin
+        sess2.connection()  # autobegin
 
         @profiling.function_call_count(variance=0.10, warmup=1)
         def go2():
@@ -124,7 +124,7 @@ class MergeTest(NoCache, fixtures.MappedTest):
         # using sqlite3 the C extension took it back up to approx. 1257
         # (py2.6)
 
-        sess2._legacy_transaction()  # autobegin
+        sess2.connection()  # autobegin
 
         @profiling.function_call_count(variance=0.10)
         def go():
diff --git a/test/base/test_except.py b/test/base/test_except.py
index e73160cd85..c7ef304b3d 100644
--- a/test/base/test_except.py
+++ b/test/base/test_except.py
@@ -505,7 +505,6 @@ ALL_EXC = [
         sa_exceptions.SADeprecationWarning,
         sa_exceptions.Base20DeprecationWarning,
         sa_exceptions.LegacyAPIWarning,
-        sa_exceptions.RemovedIn20Warning,
         sa_exceptions.MovedIn20Warning,
         sa_exceptions.SAWarning,
     ],
diff --git a/test/dialect/mysql/test_deprecations.py b/test/dialect/mysql/test_deprecations.py
index 32ec5e3005..e66037d90a 100644
--- a/test/dialect/mysql/test_deprecations.py
+++ b/test/dialect/mysql/test_deprecations.py
@@ -1,11 +1,8 @@
 from sqlalchemy import select
 from sqlalchemy import table
 from sqlalchemy.dialects.mysql import base as mysql
-from sqlalchemy.dialects.mysql import ENUM
-from sqlalchemy.dialects.mysql import SET
 from sqlalchemy.testing import AssertsCompiledSQL
 from sqlalchemy.testing import expect_deprecated
-from sqlalchemy.testing import expect_deprecated_20
 from sqlalchemy.testing import fixtures
 
 
@@ -22,19 +19,3 @@ class CompileTest(AssertsCompiledSQL, fixtures.TestBase):
             "dialect and will be removed in a future release"
         ):
             self.assert_compile(s, "SELECT FOO * FROM foo")
-
-
-class DeprecateQuoting(fixtures.TestBase):
-    def test_enum_warning(self):
-        ENUM("a", "b")
-        with expect_deprecated_20(
-            "The 'quoting' parameter to :class:`.mysql.ENUM` is deprecated."
-        ):
-            ENUM("a", quoting="foo")
-
-    def test_set_warning(self):
-        SET("a", "b")
-        with expect_deprecated_20(
-            "The 'quoting' parameter to :class:`.mysql.SET` is deprecated.*"
-        ):
-            SET("a", quoting="foo")
diff --git a/test/engine/test_deprecations.py b/test/engine/test_deprecations.py
index 454a6c6290..8dc1f0f484 100644
--- a/test/engine/test_deprecations.py
+++ b/test/engine/test_deprecations.py
@@ -5,9 +5,7 @@ import sqlalchemy as tsa
 from sqlalchemy import create_engine
 from sqlalchemy import event
 from sqlalchemy import exc
-from sqlalchemy import ForeignKey
 from sqlalchemy import insert
-from sqlalchemy import inspect
 from sqlalchemy import Integer
 from sqlalchemy import MetaData
 from sqlalchemy import pool
@@ -228,37 +226,6 @@ def select1(db):
     return str(select(1).compile(dialect=db.dialect))
 
 
-class DeprecatedReflectionTest(fixtures.TablesTest):
-    @classmethod
-    def define_tables(cls, metadata):
-        Table(
-            "user",
-            metadata,
-            Column("id", Integer, primary_key=True),
-            Column("name", String(50)),
-        )
-        Table(
-            "address",
-            metadata,
-            Column("id", Integer, primary_key=True),
-            Column("user_id", ForeignKey("user.id")),
-            Column("email", String(50)),
-        )
-
-    def test_reflecttable(self):
-        inspector = inspect(testing.db)
-        metadata = MetaData()
-
-        table = Table("user", metadata)
-        with testing.expect_deprecated_20(
-            r"The Inspector.reflecttable\(\) method is considered "
-        ):
-            res = inspector.reflecttable(table, None)
-        exp = inspector.reflect_table(table, None)
-
-        eq_(res, exp)
-
-
 class EngineEventsTest(fixtures.TestBase):
     __requires__ = ("ad_hoc_engines",)
     __backend__ = True
diff --git a/test/ext/declarative/test_deprecations.py b/test/ext/declarative/test_deprecations.py
index 58ad2fced6..a4192fddbe 100644
--- a/test/ext/declarative/test_deprecations.py
+++ b/test/ext/declarative/test_deprecations.py
@@ -1,34 +1,13 @@
 import sqlalchemy as sa
 from sqlalchemy import inspect
 from sqlalchemy.ext import declarative as legacy_decl
-from sqlalchemy.ext.declarative import instrument_declarative
-from sqlalchemy.orm import Mapper
 from sqlalchemy.testing import eq_
 from sqlalchemy.testing import expect_deprecated_20
 from sqlalchemy.testing import fixtures
-from sqlalchemy.testing import is_
 from sqlalchemy.testing import is_false
 from sqlalchemy.testing import is_true
 
 
-class TestInstrumentDeclarative(fixtures.TestBase):
-    def test_ok(self):
-        class Foo:
-            __tablename__ = "foo"
-            id = sa.Column(sa.Integer, primary_key=True)
-
-        meta = sa.MetaData()
-        reg = {}
-        with expect_deprecated_20(
-            "the instrument_declarative function is deprecated"
-        ):
-            instrument_declarative(Foo, reg, meta)
-
-        mapper = sa.inspect(Foo)
-        is_true(isinstance(mapper, Mapper))
-        is_(mapper.class_, Foo)
-
-
 class DeprecatedImportsTest(fixtures.TestBase):
     def _expect_warning(self, name):
         return expect_deprecated_20(
diff --git a/test/ext/test_horizontal_shard.py b/test/ext/test_horizontal_shard.py
index a599560625..7cc6a6f790 100644
--- a/test/ext/test_horizontal_shard.py
+++ b/test/ext/test_horizontal_shard.py
@@ -695,30 +695,22 @@ class DistinctEngineShardTest(ShardTest, fixtures.MappedTest):
         for i in range(1, 5):
             os.remove("shard%d_%s.db" % (i, provision.FOLLOWER_IDENT))
 
-    @testing.combinations((True,), (False,))
-    @testing.uses_deprecated("Using plain strings")
-    def test_plain_core_textual_lookup_w_shard(self, use_legacy_text):
+    def test_plain_core_textual_lookup_w_shard(self):
         sess = self._fixture_data()
 
-        if use_legacy_text:
-            stmt = "SELECT * FROM weather_locations"
-        else:
-            stmt = text("SELECT * FROM weather_locations")
+        stmt = text("SELECT * FROM weather_locations")
 
         eq_(
-            sess.execute(stmt, shard_id="asia").fetchall(),
+            sess.execute(
+                stmt, bind_arguments=dict(shard_id="asia")
+            ).fetchall(),
             [(1, "Asia", "Tokyo")],
         )
 
-    @testing.combinations((True,), (False,))
-    @testing.uses_deprecated("Using plain strings")
-    def test_plain_core_textual_lookup(self, use_legacy_text):
+    def test_plain_core_textual_lookup(self):
         sess = self._fixture_data()
 
-        if use_legacy_text:
-            stmt = "SELECT * FROM weather_locations WHERE id=1"
-        else:
-            stmt = text("SELECT * FROM weather_locations WHERE id=1")
+        stmt = text("SELECT * FROM weather_locations WHERE id=1")
 
         eq_(
             sess.execute(stmt).fetchall(),
             [(1, "Asia", "Tokyo")],
diff --git a/test/orm/declarative/test_basic.py b/test/orm/declarative/test_basic.py
index a5c1ba08ab..87ef2e62f6 100644
--- a/test/orm/declarative/test_basic.py
+++ b/test/orm/declarative/test_basic.py
@@ -25,7 +25,7 @@ from sqlalchemy.orm import deferred
 from sqlalchemy.orm import descriptor_props
 from sqlalchemy.orm import exc as orm_exc
 from sqlalchemy.orm import joinedload
-from sqlalchemy.orm import mapper
+from sqlalchemy.orm import Mapper
 from sqlalchemy.orm import registry
 from sqlalchemy.orm import relationship
 from sqlalchemy.orm import Session
@@ -1489,7 +1489,7 @@ class DeclarativeTest(DeclarativeTestBase):
 
     def test_custom_mapper_attribute(self):
         def mymapper(cls, tbl, **kwargs):
-            m = sa.orm.mapper(cls, tbl, **kwargs)
+            m = sa.orm.Mapper(cls, tbl, **kwargs)
             m.CHECK = True
             return m
 
@@ -1504,7 +1504,7 @@ class DeclarativeTest(DeclarativeTestBase):
 
     def test_custom_mapper_argument(self):
         def mymapper(cls, tbl, **kwargs):
-            m = sa.orm.mapper(cls, tbl, **kwargs)
+            m = sa.orm.Mapper(cls, tbl, **kwargs)
             m.CHECK = True
             return m
 
@@ -2214,7 +2214,7 @@ class DeclarativeTest(DeclarativeTestBase):
 
         canary = mock.Mock()
 
-        @event.listens_for(mapper, "instrument_class")
+        @event.listens_for(Mapper, "instrument_class")
         def instrument_class(mp, cls):
             canary.instrument_class(mp, cls)
 
diff --git a/test/orm/inheritance/test_poly_linked_list.py b/test/orm/inheritance/test_poly_linked_list.py
index a501e027a5..9e973cc744 100644
--- a/test/orm/inheritance/test_poly_linked_list.py
+++ b/test/orm/inheritance/test_poly_linked_list.py
@@ -1,9 +1,7 @@
 from sqlalchemy import ForeignKey
 from sqlalchemy import Integer
 from sqlalchemy import String
-from sqlalchemy import testing
 from sqlalchemy.orm import backref
-from sqlalchemy.orm import clear_mappers
 from sqlalchemy.orm import configure_mappers
 from sqlalchemy.orm import relationship
 from sqlalchemy.testing import fixtures
@@ -54,24 +52,13 @@ class PolymorphicCircularTest(fixtures.MappedTest):
 
     @classmethod
     def setup_mappers(cls):
-        global Table1, Table1B, Table2, Table3, Data
         table1, table2, table3, data = cls.tables(
             "table1", "table2", "table3", "data"
         )
 
-        # join = polymorphic_union(
-        #     {
-        #     'table3' : table1.join(table3),
-        #     'table2' : table1.join(table2),
-        #     'table1' : table1.select(table1.c.type.in_(['table1', 'table1b'])),
-        #     }, None, 'pjoin')
-
-        with testing.expect_deprecated_20(
-            r"The Join.alias\(\) method is considered legacy"
-        ):
-            join = table1.outerjoin(table2).outerjoin(table3).alias("pjoin")
-        # join = None
-
-        class Table1:
+
+        Base = cls.Basic
+
+        class Table1(Base):
             def __init__(self, name, data=None):
                 self.name = name
                 if data is not None:
@@ -94,7 +81,7 @@ class PolymorphicCircularTest(fixtures.MappedTest):
         class Table3(Table1):
             pass
 
-        class Data:
+        class Data(Base):
             def __init__(self, data):
                 self.data = data
 
@@ -105,35 +92,6 @@ class PolymorphicCircularTest(fixtures.MappedTest):
                 repr(str(self.data)),
             )
 
-        try:
-            # this is how the mapping used to work. ensure that this raises an
-            # error now
-            table1_mapper = cls.mapper_registry.map_imperatively(
-                Table1,
-                table1,
-                select_table=join,
-                polymorphic_on=table1.c.type,
-                polymorphic_identity="table1",
-                properties={
-                    "nxt": relationship(
-                        Table1,
-                        backref=backref(
-                            "prev", foreignkey=join.c.id, uselist=False
-                        ),
-                        uselist=False,
-                        primaryjoin=join.c.id == join.c.related_id,
-                    ),
-                    "data": relationship(
-                        cls.mapper_registry.map_imperatively(Data, data)
-                    ),
-                },
-            )
-            configure_mappers()
-            assert False
-        except Exception:
-            assert True
-        clear_mappers()
-
         # currently, the "eager" relationships degrade to lazy relationships
         # due to the polymorphic load.
# the "nxt" relationship used to have a "lazy='joined'" on it, but the @@ -190,12 +148,17 @@ class PolymorphicCircularTest(fixtures.MappedTest): ), table1_mapper.primary_key def test_one(self): + Table1, Table2 = self.classes("Table1", "Table2") self._testlist([Table1, Table2, Table1, Table2]) def test_two(self): + Table3 = self.classes.Table3 self._testlist([Table3]) def test_three(self): + Table1, Table1B, Table2, Table3 = self.classes( + "Table1", "Table1B", "Table2", "Table3" + ) self._testlist( [ Table2, @@ -211,6 +174,9 @@ class PolymorphicCircularTest(fixtures.MappedTest): ) def test_four(self): + Table1, Table1B, Table2, Table3, Data = self.classes( + "Table1", "Table1B", "Table2", "Table3", "Data" + ) self._testlist( [ Table2("t2", [Data("data1"), Data("data2")]), @@ -221,6 +187,8 @@ class PolymorphicCircularTest(fixtures.MappedTest): ) def _testlist(self, classes): + Table1 = self.classes.Table1 + sess = fixture_session() # create objects in a linked list diff --git a/test/orm/test_bind.py b/test/orm/test_bind.py index 1d5af50643..ec62430ad4 100644 --- a/test/orm/test_bind.py +++ b/test/orm/test_bind.py @@ -286,7 +286,7 @@ class BindIntegrationTest(_fixtures.FixtureTest): sess.bind_mapper(Address, e2) engine = {"e1": e1, "e2": e2, "e3": e3}[expected] - conn = sess.connection(**testcase) + conn = sess.connection(bind_arguments=testcase) is_(conn.engine, engine) sess.close() @@ -355,7 +355,7 @@ class BindIntegrationTest(_fixtures.FixtureTest): canary.get_bind(**kw) return Session.get_bind(self, **kw) - sess = GetBindSession(e3, future=True) + sess = GetBindSession(e3) sess.bind_mapper(User, e1) sess.bind_mapper(Address, e2) @@ -422,7 +422,7 @@ class BindIntegrationTest(_fixtures.FixtureTest): c = testing.db.connect() sess = Session(bind=c) sess.begin() - transaction = sess._legacy_transaction() + transaction = sess.get_transaction() u = User(name="u1") sess.add(u) sess.flush() diff --git a/test/orm/test_bundle.py b/test/orm/test_bundle.py index 
92c4c19c28..6d613091de 100644 --- a/test/orm/test_bundle.py +++ b/test/orm/test_bundle.py @@ -318,11 +318,7 @@ class BundleTest(fixtures.MappedTest, AssertsCompiledSQL): stmt = select(b1).filter(b1.c.d1.between("d3d1", "d5d1")) - with testing.expect_deprecated_20( - "The Bundle.single_entity flag has no effect when " - "using 2.0 style execution." - ): - rows = sess.execute(stmt).all() + rows = sess.execute(stmt).all() eq_( rows, [(("d3d1", "d3d2"),), (("d4d1", "d4d2"),), (("d5d1", "d5d2"),)], diff --git a/test/orm/test_cascade.py b/test/orm/test_cascade.py index 51ed50255f..5a74d6ad95 100644 --- a/test/orm/test_cascade.py +++ b/test/orm/test_cascade.py @@ -1076,8 +1076,6 @@ class NoSaveCascadeFlushTest(_fixtures.FixtureTest): m2o_cascade=True, o2m=False, m2o=False, - o2m_cascade_backrefs=True, - m2o_cascade_backrefs=True, ): Address, addresses, users, User = ( @@ -1092,12 +1090,10 @@ class NoSaveCascadeFlushTest(_fixtures.FixtureTest): addresses_rel = { "addresses": relationship( Address, - cascade_backrefs=o2m_cascade_backrefs, cascade=o2m_cascade and "save-update" or "", backref=backref( "user", cascade=m2o_cascade and "save-update" or "", - cascade_backrefs=m2o_cascade_backrefs, ), ) } @@ -1107,7 +1103,6 @@ class NoSaveCascadeFlushTest(_fixtures.FixtureTest): "addresses": relationship( Address, cascade=o2m_cascade and "save-update" or "", - cascade_backrefs=o2m_cascade_backrefs, ) } user_rel = {} @@ -1116,7 +1111,6 @@ class NoSaveCascadeFlushTest(_fixtures.FixtureTest): "user": relationship( User, cascade=m2o_cascade and "save-update" or "", - cascade_backrefs=m2o_cascade_backrefs, ) } addresses_rel = {} @@ -1137,8 +1131,6 @@ class NoSaveCascadeFlushTest(_fixtures.FixtureTest): bkd_cascade=True, fwd=False, bkd=False, - fwd_cascade_backrefs=True, - bkd_cascade_backrefs=True, ): keywords, items, item_keywords, Keyword, Item = ( @@ -1155,12 +1147,10 @@ class NoSaveCascadeFlushTest(_fixtures.FixtureTest): "keywords": relationship( Keyword, secondary=item_keywords, 
-                cascade_backrefs=fwd_cascade_backrefs,
                 cascade=fwd_cascade and "save-update" or "",
                 backref=backref(
                     "items",
                     cascade=bkd_cascade and "save-update" or "",
-                    cascade_backrefs=bkd_cascade_backrefs,
                 ),
             )
         }
@@ -1171,7 +1161,6 @@ class NoSaveCascadeFlushTest(_fixtures.FixtureTest):
                 Keyword,
                 secondary=item_keywords,
                 cascade=fwd_cascade and "save-update" or "",
-                cascade_backrefs=fwd_cascade_backrefs,
             )
         }
         items_rel = {}
@@ -1181,7 +1170,6 @@ class NoSaveCascadeFlushTest(_fixtures.FixtureTest):
                 Item,
                 secondary=item_keywords,
                 cascade=bkd_cascade and "save-update" or "",
-                cascade_backrefs=bkd_cascade_backrefs,
             )
         }
         keywords_rel = {}
@@ -1404,11 +1392,7 @@ class NoSaveCascadeFlushTest(_fixtures.FixtureTest):
         sess.flush()
 
         a1 = Address(email_address="a1")
-        with testing.expect_deprecated(
-            '"Address" object is being merged into a Session along '
-            'the backref cascade path for relationship "User.addresses"'
-        ):
-            a1.user = u1
+        a1.user = u1
         sess.add(a1)
         sess.expunge(u1)
         assert u1 not in sess
@@ -1469,11 +1453,7 @@ class NoSaveCascadeFlushTest(_fixtures.FixtureTest):
         sess.flush()
 
         a1 = Address(email_address="a1")
-        with testing.expect_deprecated(
-            '"Address" object is being merged into a Session along the '
-            'backref cascade path for relationship "User.addresses"'
-        ):
-            a1.user = u1
+        a1.user = u1
         sess.add(a1)
         sess.expunge(u1)
         assert u1 not in sess
@@ -2761,20 +2741,14 @@ class NoBackrefCascadeTest(_fixtures.FixtureTest):
         cls.mapper_registry.map_imperatively(
             User,
             users,
-            properties={
-                "addresses": relationship(
-                    Address, backref="user", cascade_backrefs=False
-                )
-            },
+            properties={"addresses": relationship(Address, backref="user")},
         )
 
         cls.mapper_registry.map_imperatively(
             Dingaling,
             dingalings,
             properties={
-                "address": relationship(
-                    Address, backref="dingalings", cascade_backrefs=False
-                )
+                "address": relationship(Address, backref="dingalings")
             },
         )
 
@@ -2805,7 +2779,7 @@ class NoBackrefCascadeTest(_fixtures.FixtureTest):
 
         assert a1 not in sess
 
-    def test_o2m_flag_on_backref(self):
+    def test_o2m_on_backref_no_cascade(self):
         Dingaling, Address = self.classes.Dingaling, self.classes.Address
 
         sess = fixture_session()
@@ -2814,15 +2788,9 @@ class NoBackrefCascadeTest(_fixtures.FixtureTest):
         sess.add(a1)
 
         d1 = Dingaling()
-        with testing.expect_deprecated(
-            '"Dingaling" object is being merged into a Session along the '
-            'backref cascade path for relationship "Address.dingalings"'
-        ):
-            d1.address = a1
+        d1.address = a1
         assert d1 in a1.dingalings
-        assert d1 in sess
-
-        sess.commit()
+        assert d1 not in sess
 
     def test_m2o_basic(self):
         Dingaling, Address = self.classes.Dingaling, self.classes.Address
@@ -2836,7 +2804,7 @@ class NoBackrefCascadeTest(_fixtures.FixtureTest):
         a1.dingalings.append(d1)
 
         assert a1 not in sess
 
-    def test_m2o_flag_on_backref(self):
+    def test_m2o_on_backref_no_cascade(self):
         User, Address = self.classes.User, self.classes.Address
 
         sess = fixture_session()
@@ -2845,14 +2813,10 @@ class NoBackrefCascadeTest(_fixtures.FixtureTest):
         sess.add(a1)
 
         u1 = User(name="u1")
-        with testing.expect_deprecated(
-            '"User" object is being merged into a Session along the backref '
-            'cascade path for relationship "Address.user"'
-        ):
-            u1.addresses.append(a1)
-        assert u1 in sess
+        u1.addresses.append(a1)
+        assert u1 not in sess
 
-    def test_m2o_commit_warns(self):
+    def test_m2o_commit_no_cascade(self):
         Dingaling, Address = self.classes.Dingaling, self.classes.Address
 
         sess = fixture_session()
@@ -3844,7 +3808,9 @@ class O2MConflictTest(fixtures.MappedTest):
                 "child": relationship(
                     Child,
                     uselist=False,
-                    backref=backref("parent", cascade_backrefs=False),
+                    backref=backref(
+                        "parent",
+                    ),
                 )
             },
         )
@@ -3910,7 +3876,7 @@ class O2MConflictTest(fixtures.MappedTest):
                     Child,
                     uselist=False,
                     cascade="all, delete, delete-orphan",
-                    backref=backref("parent", cascade_backrefs=False),
+                    backref=backref("parent"),
                 )
             },
         )
@@ -3937,7 +3903,6 @@ class O2MConflictTest(fixtures.MappedTest):
                     single_parent=True,
                     backref=backref("child", uselist=False),
                     cascade="all,delete,delete-orphan",
-                    cascade_backrefs=False,
                 )
             },
         )
@@ -3963,7 +3928,6 @@ class O2MConflictTest(fixtures.MappedTest):
                     single_parent=True,
                     backref=backref("child", uselist=True),
                     cascade="all,delete,delete-orphan",
-                    cascade_backrefs=False,
                 )
             },
         )
@@ -4499,7 +4463,6 @@ class CollectionCascadesNoBackrefTest(fixtures.TestBase):
                 "B",
                 backref="a",
                 collection_class=collection_class,
-                cascade_backrefs=False,
             )
 
         @registry.mapped
@@ -4522,13 +4485,12 @@ class CollectionCascadesNoBackrefTest(fixtures.TestBase):
         (attribute_mapped_collection("key"), "update_kw"),
         argnames="collection_class,methname",
     )
-    @testing.combinations((True,), (False,), argnames="future")
     def test_cascades_on_collection(
-        self, cascade_fixture, collection_class, methname, future
+        self, cascade_fixture, collection_class, methname
     ):
         A, B = cascade_fixture(collection_class)
 
-        s = Session(future=future)
+        s = Session()
 
         a1 = A()
         s.add(a1)
diff --git a/test/orm/test_composites.py b/test/orm/test_composites.py
index 67ffae75da..19e090e0ed 100644
--- a/test/orm/test_composites.py
+++ b/test/orm/test_composites.py
@@ -90,14 +90,14 @@ class PointTest(fixtures.MappedTest, testing.AssertsCompiledSQL):
             },
         )
 
-    def _fixture(self, future=False):
+    def _fixture(self):
         Graph, Edge, Point = (
             self.classes.Graph,
             self.classes.Edge,
             self.classes.Point,
         )
 
-        sess = Session(testing.db, future=future)
+        sess = Session(testing.db)
         g = Graph(
             id=1,
             edges=[
@@ -231,7 +231,7 @@ class PointTest(fixtures.MappedTest, testing.AssertsCompiledSQL):
     def test_bulk_update_sql(self):
         Edge, Point = (self.classes.Edge, self.classes.Point)
 
-        sess = self._fixture(future=True)
+        sess = self._fixture()
 
         e1 = sess.execute(
             select(Edge).filter(Edge.start == Point(14, 5))
@@ -256,7 +256,7 @@ class PointTest(fixtures.MappedTest, testing.AssertsCompiledSQL):
     def test_bulk_update_evaluate(self):
         Edge, Point = (self.classes.Edge, self.classes.Point)
 
-        sess = self._fixture(future=True)
+        sess = self._fixture()
 
         e1 = sess.execute(
             select(Edge).filter(Edge.start == Point(14, 5))
diff --git a/test/orm/test_core_compilation.py b/test/orm/test_core_compilation.py
index 28f42797e4..2b0c570c49 100644
--- a/test/orm/test_core_compilation.py
+++ b/test/orm/test_core_compilation.py
@@ -35,7 +35,6 @@ from sqlalchemy.testing import AssertsCompiledSQL
 from sqlalchemy.testing import eq_
 from sqlalchemy.testing import fixtures
 from sqlalchemy.testing import is_
-from sqlalchemy.testing.assertions import expect_raises_message
 from sqlalchemy.testing.fixtures import fixture_session
 from sqlalchemy.testing.util import resolve_lambda
 from sqlalchemy.util.langhelpers import hybridproperty
@@ -232,14 +231,6 @@ class ColumnsClauseFromsTest(QueryTest, AssertsCompiledSQL):
         )
         eq_(len(froms), 1)
 
-    def test_with_only_columns_unknown_kw(self):
-        User, Address = self.classes("User", "Address")
-
-        stmt = select(User.id)
-
-        with expect_raises_message(TypeError, "unknown parameters: foo"):
-            stmt.with_only_columns(User.id, foo="bar")
-
     @testing.combinations((True,), (False,))
     def test_replace_into_select_from_maintains_existing(self, use_flag):
         User, Address = self.classes("User", "Address")
diff --git a/test/orm/test_deprecations.py b/test/orm/test_deprecations.py
index 64db9a893f..a567534c35 100644
--- a/test/orm/test_deprecations.py
+++ b/test/orm/test_deprecations.py
@@ -1,4 +1,3 @@
-from contextlib import nullcontext
 from unittest.mock import call
 from unittest.mock import Mock
 
@@ -16,12 +15,10 @@ from sqlalchemy import select
 from sqlalchemy import String
 from sqlalchemy import testing
 from sqlalchemy import text
-from sqlalchemy import true
 from sqlalchemy.engine import default
 from sqlalchemy.engine import result_tuple
 from sqlalchemy.orm import aliased
 from sqlalchemy.orm import attributes
-from sqlalchemy.orm import backref
 from sqlalchemy.orm import clear_mappers
 from sqlalchemy.orm import collections
 from sqlalchemy.orm import column_property
@@ -31,12 +28,9 @@ from sqlalchemy.orm import contains_eager
 from sqlalchemy.orm import defaultload
 from sqlalchemy.orm import defer
 from sqlalchemy.orm import deferred
-from sqlalchemy.orm import eagerload
 from sqlalchemy.orm import foreign
 from sqlalchemy.orm import instrumentation
 from sqlalchemy.orm import joinedload
-from sqlalchemy.orm import mapper
-from sqlalchemy.orm import relation
 from sqlalchemy.orm import relationship
 from sqlalchemy.orm import scoped_session
 from sqlalchemy.orm import Session
@@ -46,7 +40,6 @@ from sqlalchemy.orm import synonym
 from sqlalchemy.orm import undefer
 from sqlalchemy.orm import with_parent
 from sqlalchemy.orm import with_polymorphic
-from sqlalchemy.orm.collections import attribute_mapped_collection
 from sqlalchemy.orm.collections import collection
 from sqlalchemy.orm.util import polymorphic_union
 from sqlalchemy.testing import assert_raises_message
@@ -67,13 +60,11 @@ from .inheritance import _poly_fixtures
 from .inheritance._poly_fixtures import Manager
 from .inheritance._poly_fixtures import Person
 from .test_deferred import InheritanceTest as _deferred_InheritanceTest
-from .test_dynamic import _DynamicFixture
 from .test_events import _RemoveListeners
 from .test_options import PathTest as OptionsPathTest
 from .test_options import PathTest
 from .test_options import QueryTest as OptionsQueryTest
 from .test_query import QueryTest
-from .test_transaction import _LocalFixture
 from ..sql.test_compare import CacheKeyFixture
 
 if True:
@@ -413,35 +404,6 @@ class DeprecatedQueryTest(_fixtures.FixtureTest, AssertsCompiledSQL):
             "subquery object."
) - def test_deprecated_negative_slices(self): - User = self.classes.User - - sess = fixture_session() - q = sess.query(User).order_by(User.id) - - with testing.expect_deprecated( - "Support for negative indexes for SQL index / slice operators" - ): - eq_(q[-5:-2], [User(id=7), User(id=8)]) - - with testing.expect_deprecated( - "Support for negative indexes for SQL index / slice operators" - ): - eq_(q[-1], User(id=10)) - - with testing.expect_deprecated( - "Support for negative indexes for SQL index / slice operators" - ): - eq_(q[-2], User(id=9)) - - with testing.expect_deprecated( - "Support for negative indexes for SQL index / slice operators" - ): - eq_(q[:-2], [User(id=7), User(id=8)]) - - # this doesn't evaluate anything because it's a net-negative - eq_(q[-2:-5], []) - def test_deprecated_select_coercion_join_target(self): User = self.classes.User addresses = self.tables.addresses @@ -460,44 +422,6 @@ class DeprecatedQueryTest(_fixtures.FixtureTest, AssertsCompiledSQL): "ON users.id = anon_1.user_id", ) - def test_deprecated_negative_slices_compile(self): - User = self.classes.User - - sess = fixture_session() - q = sess.query(User).order_by(User.id) - - with testing.expect_deprecated( - "Support for negative indexes for SQL index / slice operators" - ): - self.assert_sql( - testing.db, - lambda: q[-5:-2], - [ - ( - "SELECT users.id AS users_id, users.name " - "AS users_name " - "FROM users ORDER BY users.id", - {}, - ) - ], - ) - - with testing.expect_deprecated( - "Support for negative indexes for SQL index / slice operators" - ): - self.assert_sql( - testing.db, - lambda: q[-5:], - [ - ( - "SELECT users.id AS users_id, users.name " - "AS users_name " - "FROM users ORDER BY users.id", - {}, - ) - ], - ) - def test_invalid_column(self): User = self.classes.User @@ -629,129 +553,91 @@ class LazyLoadOptSpecificityTest(fixtures.DeclarativeMappedTest): self.assert_sql_count(testing.db, go, expected) -class DynamicTest(_DynamicFixture, _fixtures.FixtureTest): - 
def test_negative_slice_access_raises(self): - User, Address = self._user_address_fixture() - sess = fixture_session() - u1 = sess.get(User, 8) +class DeprecatedInhTest(_poly_fixtures._Polymorphic): + def test_with_polymorphic(self): + Person = _poly_fixtures.Person + Engineer = _poly_fixtures.Engineer - with testing.expect_deprecated_20( - "Support for negative indexes for SQL index / slice" - ): - eq_(u1.addresses[-1], Address(id=4)) + with DeprecatedQueryTest._expect_implicit_subquery(): + p_poly = with_polymorphic(Person, [Engineer], select(Person)) - with testing.expect_deprecated_20( - "Support for negative indexes for SQL index / slice" - ): - eq_(u1.addresses[-5:-2], [Address(id=2)]) + is_true( + sa.inspect(p_poly).selectable.compare(select(Person).subquery()) + ) - with testing.expect_deprecated_20( - "Support for negative indexes for SQL index / slice" - ): - eq_(u1.addresses[-2], Address(id=3)) - with testing.expect_deprecated_20( - "Support for negative indexes for SQL index / slice" - ): - eq_(u1.addresses[:-2], [Address(id=2)]) +class DeprecatedMapperTest( + fixtures.RemovesEvents, _fixtures.FixtureTest, AssertsCompiledSQL +): + __dialect__ = "default" + def test_listen_on_mapper_mapper_event_fn(self, registry): + from sqlalchemy.orm import mapper -class SessionTest(fixtures.RemovesEvents, _LocalFixture): - def test_transaction_attr(self): - s1 = Session(testing.db) + m1 = Mock() - with testing.expect_deprecated_20( - "The Session.transaction attribute is considered legacy as " - "of the 1.x series" + with expect_deprecated( + r"The `sqlalchemy.orm.mapper\(\)` symbol is deprecated and " + "will be removed" ): - s1.transaction - def test_textual_execute(self, connection): - """test that Session.execute() converts to text()""" + @event.listens_for(mapper, "before_configured") + def go(): + m1() - users = self.tables.users + @registry.mapped + class MyClass: + __tablename__ = "t1" + id = Column(Integer, primary_key=True) - with Session(bind=connection) 
as sess: - sess.execute(users.insert(), dict(id=7, name="jack")) + registry.configure() + eq_(m1.mock_calls, [call()]) - with testing.expect_deprecated_20( - "Using plain strings to indicate SQL statements " - "without using the text" - ): - # use :bindparam style - eq_( - sess.execute( - "select * from users where id=:id", {"id": 7} - ).fetchall(), - [(7, "jack")], - ) + def test_listen_on_mapper_instrumentation_event_fn(self, registry): + from sqlalchemy.orm import mapper - with testing.expect_deprecated_20( - "Using plain strings to indicate SQL statements " - "without using the text" - ): - # use :bindparam style - eq_( - sess.scalar( - "select id from users where id=:id", {"id": 7} - ), - 7, - ) + m1 = Mock() - def test_session_str(self): - s1 = Session(testing.db) - str(s1) + with expect_deprecated( + r"The `sqlalchemy.orm.mapper\(\)` symbol is deprecated and " + "will be removed" + ): - @testing.combinations( - {"mapper": None}, - {"clause": None}, - {"bind_arguments": {"mapper": None}, "clause": None}, - {"bind_arguments": {}, "clause": None}, - ) - def test_bind_kwarg_deprecated(self, kw): - s1 = Session(testing.db) - - for meth in s1.execute, s1.scalar: - m1 = mock.Mock(side_effect=s1.get_bind) - with mock.patch.object(s1, "get_bind", m1): - expr = text("select 1") - - with testing.expect_deprecated_20( - r"Passing bind arguments to Session.execute\(\) as " - "keyword " - "arguments is deprecated and will be removed SQLAlchemy " - "2.0" - ): - meth(expr, **kw) - - bind_arguments = kw.pop("bind_arguments", None) - if bind_arguments: - bind_arguments.update(kw) - - if "clause" not in kw: - bind_arguments["clause"] = expr - eq_(m1.mock_calls, [call(**bind_arguments)]) - else: - if "clause" not in kw: - kw["clause"] = expr - eq_(m1.mock_calls, [call(**kw)]) + @event.listens_for(mapper, "init") + def go(target, args, kwargs): + m1(target, args, kwargs) + @registry.mapped + class MyClass: + __tablename__ = "t1" + id = Column(Integer, primary_key=True) -class 
DeprecatedInhTest(_poly_fixtures._Polymorphic): - def test_with_polymorphic(self): - Person = _poly_fixtures.Person - Engineer = _poly_fixtures.Engineer + mc = MyClass(id=5) + eq_(m1.mock_calls, [call(mc, (), {"id": 5})]) - with DeprecatedQueryTest._expect_implicit_subquery(): - p_poly = with_polymorphic(Person, [Engineer], select(Person)) + def test_we_couldnt_remove_mapper_yet(self): + """test that the mapper() function is present but raises an + informative error when used. - is_true( - sa.inspect(p_poly).selectable.compare(select(Person).subquery()) - ) + The function itself was to be removed as of 2.0, however we forgot + to mark deprecated the use of the function as an event target, + so it needs to stay around for another cycle at least. + """ -class DeprecatedMapperTest(_fixtures.FixtureTest, AssertsCompiledSQL): - __dialect__ = "default" + class MyClass: + pass + + t1 = Table("t1", MetaData(), Column("id", Integer, primary_key=True)) + + from sqlalchemy.orm import mapper + + with assertions.expect_raises_message( + sa_exc.InvalidRequestError, + r"The 'sqlalchemy.orm.mapper\(\)' function is removed as of " + "SQLAlchemy 2.0.", + ): + mapper(MyClass, t1) def test_deferred_scalar_loader_name_change(self): class Foo: @@ -1392,7 +1278,6 @@ class ViewonlyFlagWarningTest(fixtures.MappedTest): ("passive_updates", False), ("enable_typechecks", False), ("active_history", True), - ("cascade_backrefs", False), ) def test_viewonly_warning(self, flag, value): Order = self.classes.Order @@ -1504,7 +1389,7 @@ class NonPrimaryMapperTest(_fixtures.FixtureTest, AssertsCompiledSQL): non_primary=True, ) - def test_illegal_non_primary_legacy(self): + def test_illegal_non_primary_legacy(self, registry): users, Address, addresses, User = ( self.tables.users, self.classes.Address, @@ -1512,18 +1397,12 @@ class NonPrimaryMapperTest(_fixtures.FixtureTest, AssertsCompiledSQL): self.classes.User, ) - with testing.expect_deprecated( - "Calling the mapper.* function directly outside of 
a declarative " - ): - mapper(User, users) - with testing.expect_deprecated( - "Calling the mapper.* function directly outside of a declarative " - ): - mapper(Address, addresses) + registry.map_imperatively(User, users) + registry.map_imperatively(Address, addresses) with testing.expect_deprecated( "The mapper.non_primary parameter is deprecated" ): - m = mapper( # noqa F841 + m = registry.map_imperatively( # noqa F841 User, users, non_primary=True, @@ -1536,22 +1415,19 @@ class NonPrimaryMapperTest(_fixtures.FixtureTest, AssertsCompiledSQL): configure_mappers, ) - def test_illegal_non_primary_2_legacy(self): + def test_illegal_non_primary_2_legacy(self, registry): User, users = self.classes.User, self.tables.users - with testing.expect_deprecated( - "The mapper.non_primary parameter is deprecated" - ): - assert_raises_message( - sa.exc.InvalidRequestError, - "Configure a primary mapper first", - mapper, - User, - users, - non_primary=True, - ) + assert_raises_message( + sa.exc.InvalidRequestError, + "Configure a primary mapper first", + registry.map_imperatively, + User, + users, + non_primary=True, + ) - def test_illegal_non_primary_3_legacy(self): + def test_illegal_non_primary_3_legacy(self, registry): users, addresses = self.tables.users, self.tables.addresses class Base: @@ -1560,21 +1436,16 @@ class NonPrimaryMapperTest(_fixtures.FixtureTest, AssertsCompiledSQL): class Sub(Base): pass - with testing.expect_deprecated( - "Calling the mapper.* function directly outside of a declarative " - ): - mapper(Base, users) - with testing.expect_deprecated( - "The mapper.non_primary parameter is deprecated", - ): - assert_raises_message( - sa.exc.InvalidRequestError, - "Configure a primary mapper first", - mapper, - Sub, - addresses, - non_primary=True, - ) + registry.map_imperatively(Base, users) + + assert_raises_message( + sa.exc.InvalidRequestError, + "Configure a primary mapper first", + registry.map_imperatively, + Sub, + addresses, + non_primary=True, + ) class 
InstancesTest(QueryTest, AssertsCompiledSQL): @@ -1776,78 +1647,6 @@ class InstancesTest(QueryTest, AssertsCompiledSQL): self.assert_sql_count(testing.db, go, 1) -class TestDeprecation20(QueryTest): - def test_relation(self): - User = self.classes.User - with testing.expect_deprecated_20(".*relationship"): - relation(User.addresses) - - def test_eagerloading(self): - User = self.classes.User - with testing.expect_deprecated_20(".*joinedload"): - eagerload(User.addresses) - - -class DistinctOrderByImplicitTest(QueryTest, AssertsCompiledSQL): - __dialect__ = "default" - - def test_columns_augmented_roundtrip_three(self): - User, Address = self.classes.User, self.classes.Address - - sess = fixture_session() - - q = ( - sess.query(User.id, User.name.label("foo"), Address.id) - .join(Address, true()) - .filter(User.name == "jack") - .filter(User.id + Address.user_id > 0) - .distinct() - .order_by(User.id, User.name, Address.email_address) - ) - - # even though columns are added, they aren't in the result - with testing.expect_deprecated( - "ORDER BY columns added implicitly due to " - ): - eq_( - q.all(), - [ - (7, "jack", 3), - (7, "jack", 4), - (7, "jack", 2), - (7, "jack", 5), - (7, "jack", 1), - ], - ) - for row in q: - eq_(row._mapping.keys(), ["id", "foo", "id"]) - - def test_columns_augmented_sql_one(self): - User, Address = self.classes.User, self.classes.Address - - sess = fixture_session() - - q = ( - sess.query(User.id, User.name.label("foo"), Address.id) - .distinct() - .order_by(User.id, User.name, Address.email_address) - ) - - # Address.email_address is added because of DISTINCT, - # however User.id, User.name are not b.c. 
they're already there, - # even though User.name is labeled - with testing.expect_deprecated( - "ORDER BY columns added implicitly due to " - ): - self.assert_compile( - q, - "SELECT DISTINCT users.id AS users_id, users.name AS foo, " - "addresses.id AS addresses_id, addresses.email_address AS " - "addresses_email_address FROM users, addresses " - "ORDER BY users.id, users.name, addresses.email_address", - ) - - class SessionEventsTest(_RemoveListeners, _fixtures.FixtureTest): run_inserts = None @@ -2804,252 +2603,6 @@ class ParentTest(QueryTest, AssertsCompiledSQL): ) -class CollectionCascadesDespiteBackrefTest(fixtures.TestBase): - """test old cascade_backrefs behavior - - see test/orm/test_cascade.py::class CollectionCascadesNoBackrefTest - for the future version - - """ - - @testing.fixture - def cascade_fixture(self, registry): - def go(collection_class): - @registry.mapped - class A: - __tablename__ = "a" - - id = Column(Integer, primary_key=True) - bs = relationship( - "B", backref="a", collection_class=collection_class - ) - - @registry.mapped - class B: - __tablename__ = "b_" - id = Column(Integer, primary_key=True) - a_id = Column(ForeignKey("a.id")) - key = Column(String) - - return A, B - - yield go - - @testing.combinations( - (set, "add"), - (list, "append"), - (attribute_mapped_collection("key"), "__setitem__"), - (attribute_mapped_collection("key"), "setdefault"), - (attribute_mapped_collection("key"), "update_dict"), - (attribute_mapped_collection("key"), "update_kw"), - argnames="collection_class,methname", - ) - @testing.combinations((True,), (False,), argnames="future") - def test_cascades_on_collection( - self, cascade_fixture, collection_class, methname, future - ): - A, B = cascade_fixture(collection_class) - - s = Session(future=future) - - a1 = A() - s.add(a1) - - b1 = B(key="b1") - b2 = B(key="b2") - b3 = B(key="b3") - - if future: - dep_ctx = nullcontext - else: - - def dep_ctx(): - return assertions.expect_deprecated_20( - '"B" object is 
being merged into a Session along the ' - 'backref cascade path for relationship "A.bs"' - ) - - with dep_ctx(): - b1.a = a1 - with dep_ctx(): - b3.a = a1 - - if future: - assert b1 not in s - assert b3 not in s - else: - assert b1 in s - assert b3 in s - - if methname == "__setitem__": - meth = getattr(a1.bs, methname) - meth(b1.key, b1) - meth(b2.key, b2) - elif methname == "setdefault": - meth = getattr(a1.bs, methname) - meth(b1.key, b1) - meth(b2.key, b2) - elif methname == "update_dict" and isinstance(a1.bs, dict): - a1.bs.update({b1.key: b1, b2.key: b2}) - elif methname == "update_kw" and isinstance(a1.bs, dict): - a1.bs.update(b1=b1, b2=b2) - else: - meth = getattr(a1.bs, methname) - meth(b1) - meth(b2) - - assert b1 in s - assert b2 in s - - # future version: - if future: - assert b3 not in s # the event never triggers from reverse - else: - # old behavior - assert b3 in s - - -class LoadOnFKsTest(fixtures.DeclarativeMappedTest): - @classmethod - def setup_classes(cls): - Base = cls.DeclarativeBasic - - class Parent(Base): - __tablename__ = "parent" - __table_args__ = {"mysql_engine": "InnoDB"} - - id = Column( - Integer, primary_key=True, test_needs_autoincrement=True - ) - - class Child(Base): - __tablename__ = "child" - __table_args__ = {"mysql_engine": "InnoDB"} - - id = Column( - Integer, primary_key=True, test_needs_autoincrement=True - ) - parent_id = Column(Integer, ForeignKey("parent.id")) - - parent = relationship(Parent, backref=backref("children")) - - @testing.fixture - def parent_fixture(self, connection): - Parent, Child = self.classes("Parent", "Child") - - sess = fixture_session(bind=connection, autoflush=False) - p1 = Parent() - p2 = Parent() - c1, c2 = Child(), Child() - c1.parent = p1 - sess.add_all([p1, p2]) - assert c1 in sess - - yield sess, p1, p2, c1, c2 - - sess.close() - - def test_enable_rel_loading_on_persistent_allows_backref_event( - self, parent_fixture - ): - sess, p1, p2, c1, c2 = parent_fixture - Parent, Child = 
self.classes("Parent", "Child") - - c3 = Child() - sess.enable_relationship_loading(c3) - c3.parent_id = p1.id - with assertions.expect_deprecated_20( - '"Child" object is being merged into a Session along the ' - 'backref cascade path for relationship "Parent.children"' - ): - c3.parent = p1 - - # backref fired off when c3.parent was set, - # because the "old" value was None - # change as of [ticket:3708] - assert c3 in p1.children - - def test_enable_rel_loading_allows_backref_event(self, parent_fixture): - sess, p1, p2, c1, c2 = parent_fixture - Parent, Child = self.classes("Parent", "Child") - - c3 = Child() - sess.enable_relationship_loading(c3) - c3.parent_id = p1.id - - with assertions.expect_deprecated_20( - '"Child" object is being merged into a Session along the ' - 'backref cascade path for relationship "Parent.children"' - ): - c3.parent = p1 - - # backref fired off when c3.parent was set, - # because the "old" value was None - # change as of [ticket:3708] - assert c3 in p1.children - - -class LazyTest(_fixtures.FixtureTest): - run_inserts = "once" - run_deletes = None - - def test_backrefs_dont_lazyload(self): - users, Address, addresses, User = ( - self.tables.users, - self.classes.Address, - self.tables.addresses, - self.classes.User, - ) - - self.mapper_registry.map_imperatively( - User, - users, - properties={"addresses": relationship(Address, backref="user")}, - ) - self.mapper_registry.map_imperatively(Address, addresses) - sess = fixture_session(autoflush=False) - ad = sess.query(Address).filter_by(id=1).one() - assert ad.user.id == 7 - - def go(): - ad.user = None - assert ad.user is None - - self.assert_sql_count(testing.db, go, 0) - - u1 = sess.query(User).filter_by(id=7).one() - - def go(): - assert ad not in u1.addresses - - self.assert_sql_count(testing.db, go, 1) - - sess.expire(u1, ["addresses"]) - - def go(): - assert ad in u1.addresses - - self.assert_sql_count(testing.db, go, 1) - - sess.expire(u1, ["addresses"]) - ad2 = Address() - - 
def go(): - with assertions.expect_deprecated_20( - ".* object is being merged into a Session along the " - "backref cascade path for relationship " - ): - ad2.user = u1 - assert ad2.user is u1 - - self.assert_sql_count(testing.db, go, 0) - - def go(): - assert ad2 in u1.addresses - - self.assert_sql_count(testing.db, go, 1) - - class MergeResultTest(_fixtures.FixtureTest): run_setup_mappers = "once" run_inserts = "once" diff --git a/test/orm/test_events.py b/test/orm/test_events.py index 3437d79427..92ef241ed7 100644 --- a/test/orm/test_events.py +++ b/test/orm/test_events.py @@ -24,7 +24,6 @@ from sqlalchemy.orm import instrumentation from sqlalchemy.orm import joinedload from sqlalchemy.orm import lazyload from sqlalchemy.orm import Mapper -from sqlalchemy.orm import mapper from sqlalchemy.orm import mapperlib from sqlalchemy.orm import query from sqlalchemy.orm import relationship @@ -672,11 +671,10 @@ class MapperEventsTest(_RemoveListeners, _fixtures.FixtureTest): def init_e(target, args, kwargs): canary.append(("init_e", target)) - event.listen(mapper, "init", init_a) - event.listen(Mapper, "init", init_b) - event.listen(class_mapper(A), "init", init_c) - event.listen(A, "init", init_d) - event.listen(A, "init", init_e, propagate=True) + event.listen(Mapper, "init", init_a) + event.listen(class_mapper(A), "init", init_b) + event.listen(A, "init", init_c) + event.listen(A, "init", init_d, propagate=True) a = A() eq_( @@ -686,14 +684,13 @@ class MapperEventsTest(_RemoveListeners, _fixtures.FixtureTest): ("init_b", a), ("init_c", a), ("init_d", a), - ("init_e", a), ], ) # test propagate flag canary[:] = [] b = B() - eq_(canary, [("init_a", b), ("init_b", b), ("init_e", b)]) + eq_(canary, [("init_a", b), ("init_d", b)]) def listen_all(self, mapper, **kw): canary = [] @@ -809,10 +806,10 @@ class MapperEventsTest(_RemoveListeners, _fixtures.FixtureTest): canary = Mock() - event.listen(mapper, "before_configured", canary.listen1) - event.listen(mapper, 
"before_configured", canary.listen2, insert=True) - event.listen(mapper, "before_configured", canary.listen3) - event.listen(mapper, "before_configured", canary.listen4, insert=True) + event.listen(Mapper, "before_configured", canary.listen1) + event.listen(Mapper, "before_configured", canary.listen2, insert=True) + event.listen(Mapper, "before_configured", canary.listen3) + event.listen(Mapper, "before_configured", canary.listen4, insert=True) configure_mappers() @@ -864,7 +861,7 @@ class MapperEventsTest(_RemoveListeners, _fixtures.FixtureTest): def load(obj, ctx): canary.append("load") - event.listen(mapper, "load", load) + event.listen(Mapper, "load", load) s = fixture_session() u = User(name="u1") @@ -1065,7 +1062,7 @@ class MapperEventsTest(_RemoveListeners, _fixtures.FixtureTest): assert_raises_message( sa.exc.SAWarning, r"before_configured' and 'after_configured' ORM events only " - r"invoke with the mapper\(\) function or Mapper class as " + r"invoke with the Mapper class as " r"the target.", event.listen, User, @@ -1076,7 +1073,7 @@ class MapperEventsTest(_RemoveListeners, _fixtures.FixtureTest): assert_raises_message( sa.exc.SAWarning, r"before_configured' and 'after_configured' ORM events only " - r"invoke with the mapper\(\) function or Mapper class as " + r"invoke with the Mapper class as " r"the target.", event.listen, User, @@ -1092,8 +1089,8 @@ class MapperEventsTest(_RemoveListeners, _fixtures.FixtureTest): self.mapper_registry.map_imperatively(User, users) - event.listen(mapper, "before_configured", m1) - event.listen(mapper, "after_configured", m2) + event.listen(Mapper, "before_configured", m1) + event.listen(Mapper, "after_configured", m2) inspect(User)._post_inspect @@ -1135,7 +1132,7 @@ class MapperEventsTest(_RemoveListeners, _fixtures.FixtureTest): canary.init() # mapper level event - @event.listens_for(mapper, "instrument_class") + @event.listens_for(Mapper, "instrument_class") def instrument_class(mp, class_): 
canary.instrument_class(class_) @@ -2256,9 +2253,8 @@ class SessionEventsTest(_RemoveListeners, _fixtures.FixtureTest): sess.rollback() eq_(assertions, [True, True]) - @testing.combinations((True,), (False,)) - def test_autobegin_no_reentrant(self, future): - s1 = fixture_session(future=future) + def test_autobegin_no_reentrant(self): + s1 = fixture_session() canary = Mock() diff --git a/test/orm/test_lazy_relations.py b/test/orm/test_lazy_relations.py index cb83bb6f78..ee6d53652d 100644 --- a/test/orm/test_lazy_relations.py +++ b/test/orm/test_lazy_relations.py @@ -954,7 +954,7 @@ class LazyTest(_fixtures.FixtureTest): properties={"addresses": relationship(Address, backref="user")}, ) self.mapper_registry.map_imperatively(Address, addresses) - sess = fixture_session(autoflush=False, future=True) + sess = fixture_session(autoflush=False) ad = sess.query(Address).filter_by(id=1).one() assert ad.user.id == 7 diff --git a/test/orm/test_load_on_fks.py b/test/orm/test_load_on_fks.py index fda8be4236..f33a6881d9 100644 --- a/test/orm/test_load_on_fks.py +++ b/test/orm/test_load_on_fks.py @@ -245,6 +245,37 @@ class LoadOnFKsTest(fixtures.DeclarativeMappedTest): self.assert_sql_count(testing.db, go, 0) + def test_enable_rel_loading_on_persistent_allows_backref_event( + self, parent_fixture + ): + sess, p1, p2, c1, c2 = parent_fixture + Parent, Child = self.classes("Parent", "Child") + + c3 = Child() + sess.enable_relationship_loading(c3) + c3.parent_id = p1.id + c3.parent = p1 + + # backref did not fire off when c3.parent was set. 
+ # originally this was impacted by #3708, now does not happen + # due to backref_cascades behavior being removed + assert c3 not in p1.children + + def test_enable_rel_loading_allows_backref_event(self, parent_fixture): + sess, p1, p2, c1, c2 = parent_fixture + Parent, Child = self.classes("Parent", "Child") + + c3 = Child() + sess.enable_relationship_loading(c3) + c3.parent_id = p1.id + + c3.parent = p1 + + # backref did not fire off when c3.parent was set. + # originally this was impacted by #3708, now does not happen + # due to backref_cascades behavior being removed + assert c3 not in p1.children + def test_backref_doesnt_double(self, parent_fixture): sess, p1, p2, c1, c2 = parent_fixture Parent, Child = self.classes("Parent", "Child") diff --git a/test/orm/test_mapper.py b/test/orm/test_mapper.py index b491604f30..73288359e8 100644 --- a/test/orm/test_mapper.py +++ b/test/orm/test_mapper.py @@ -340,7 +340,7 @@ class MapperTest(_fixtures.FixtureTest, AssertsCompiledSQL): m = self.mapper(User, users) session = fixture_session() - session.connection(mapper=m) + session.connection(bind_arguments=dict(mapper=m)) def test_incomplete_columns(self): """Loading from a select which does not contain all columns""" diff --git a/test/orm/test_naturalpks.py b/test/orm/test_naturalpks.py index 05df71c6a3..f2700513b7 100644 --- a/test/orm/test_naturalpks.py +++ b/test/orm/test_naturalpks.py @@ -871,12 +871,12 @@ class ReversePKsTest(fixtures.MappedTest): session.commit() # testing #3108 - session.begin_nested() + nt1 = session.begin_nested() a_published.status = ARCHIVED a_editable.status = PUBLISHED - session.commit() + nt1.commit() session.rollback() eq_(a_published.status, PUBLISHED) diff --git a/test/orm/test_query.py b/test/orm/test_query.py index 9bfeb36d81..a53c90ed4e 100644 --- a/test/orm/test_query.py +++ b/test/orm/test_query.py @@ -2855,7 +2855,7 @@ class SliceTest(QueryTest): def test_negative_indexes_raise(self): User = self.classes.User - sess = 
fixture_session(future=True) + sess = fixture_session() q = sess.query(User).order_by(User.id) with expect_raises_message( diff --git a/test/orm/test_session.py b/test/orm/test_session.py index e821a7c20d..a53756d63a 100644 --- a/test/orm/test_session.py +++ b/test/orm/test_session.py @@ -83,6 +83,23 @@ class ExecutionTest(_fixtures.FixtureTest): [(7,), (8,), (9,)], ) + def test_no_string_execute(self, connection): + + with Session(bind=connection) as sess: + with expect_raises_message( + sa.exc.ArgumentError, + r"Textual SQL expression 'select \* from users where.*' " + "should be explicitly declared", + ): + sess.execute("select * from users where id=:id", {"id": 7}) + + with expect_raises_message( + sa.exc.ArgumentError, + r"Textual SQL expression 'select id from users .*' " + "should be explicitly declared", + ): + sess.scalar("select id from users where id=:id", {"id": 7}) + class TransScopingTest(_fixtures.FixtureTest): run_inserts = None @@ -734,7 +751,7 @@ class SessionStateTest(_fixtures.FixtureTest): assert sess.is_active def test_active_flag_autobegin_future(self): - sess = Session(bind=config.db, future=True) + sess = Session(bind=config.db) assert sess.is_active assert not sess.in_transaction() sess.begin() @@ -752,7 +769,7 @@ class SessionStateTest(_fixtures.FixtureTest): assert sess.is_active sess.begin(_subtrans=True) sess.rollback() - assert not sess.is_active + assert sess.is_active sess.rollback() assert sess.is_active @@ -888,7 +905,7 @@ class SessionStateTest(_fixtures.FixtureTest): ) self.mapper_registry.map_imperatively(Address, addresses) - session = fixture_session(future=True) + session = fixture_session() @event.listens_for(session, "after_flush") def load_collections(session, flush_context): diff --git a/test/orm/test_transaction.py b/test/orm/test_transaction.py index c5d06ba88b..96a00ff54c 100644 --- a/test/orm/test_transaction.py +++ b/test/orm/test_transaction.py @@ -41,26 +41,14 @@ class 
SessionTransactionTest(fixtures.RemovesEvents, FixtureTest): run_inserts = None __backend__ = True - @testing.fixture - def conn(self): - with testing.db.connect() as conn: - yield conn - - @testing.fixture - def future_conn(self): - - engine = testing.db - with engine.connect() as conn: - yield conn - - def test_no_close_transaction_on_flush(self, conn): + def test_no_close_transaction_on_flush(self, connection): User, users = self.classes.User, self.tables.users - c = conn + c = connection self.mapper_registry.map_imperatively(User, users) s = Session(bind=c) s.begin() - tran = s._legacy_transaction() + tran = s.get_transaction() s.add(User(name="first")) s.flush() c.exec_driver_sql("select * from users") @@ -70,15 +58,16 @@ class SessionTransactionTest(fixtures.RemovesEvents, FixtureTest): u = User(name="third") s.add(u) s.flush() - assert s._legacy_transaction() is tran + assert s.get_transaction() is tran tran.close() - def test_subtransaction_on_external_no_begin(self, conn): + def test_subtransaction_on_external_no_begin(self, connection_no_trans): users, User = self.tables.users, self.classes.User + connection = connection_no_trans self.mapper_registry.map_imperatively(User, users) - trans = conn.begin() - sess = Session(bind=conn, autoflush=True) + trans = connection.begin() + sess = Session(bind=connection, autoflush=True) u = User(name="ed") sess.add(u) sess.flush() @@ -88,12 +77,14 @@ class SessionTransactionTest(fixtures.RemovesEvents, FixtureTest): sess.close() @testing.requires.savepoints - def test_external_nested_transaction(self, conn): + def test_external_nested_transaction(self, connection_no_trans): users, User = self.tables.users, self.classes.User self.mapper_registry.map_imperatively(User, users) - trans = conn.begin() - sess = Session(bind=conn, autoflush=True) + + connection = connection_no_trans + trans = connection.begin() + sess = Session(bind=connection, autoflush=True) u1 = User(name="u1") sess.add(u1) sess.flush() @@ -107,60 +98,60 
@@ class SessionTransactionTest(fixtures.RemovesEvents, FixtureTest): trans.commit() assert len(sess.query(User).all()) == 1 - def test_subtransaction_on_external_commit_future(self, future_conn): + def test_subtransaction_on_external_commit(self, connection_no_trans): users, User = self.tables.users, self.classes.User self.mapper_registry.map_imperatively(User, users) - conn = future_conn - conn.begin() + connection = connection_no_trans + connection.begin() - sess = Session(bind=conn, autoflush=True) + sess = Session(bind=connection, autoflush=True) u = User(name="ed") sess.add(u) sess.flush() sess.commit() # commit does nothing - conn.rollback() # rolls back + connection.rollback() # rolls back assert len(sess.query(User).all()) == 0 sess.close() - def test_subtransaction_on_external_rollback_future(self, future_conn): + def test_subtransaction_on_external_rollback(self, connection_no_trans): users, User = self.tables.users, self.classes.User self.mapper_registry.map_imperatively(User, users) - conn = future_conn - conn.begin() + connection = connection_no_trans + connection.begin() - sess = Session(bind=conn, autoflush=True) + sess = Session(bind=connection, autoflush=True) u = User(name="ed") sess.add(u) sess.flush() sess.rollback() # rolls back - conn.commit() # nothing to commit + connection.commit() # nothing to commit assert len(sess.query(User).all()) == 0 sess.close() @testing.requires.savepoints - def test_savepoint_on_external_future(self, future_conn): + def test_savepoint_on_external(self, connection_no_trans): users, User = self.tables.users, self.classes.User self.mapper_registry.map_imperatively(User, users) - conn = future_conn - conn.begin() - sess = Session(bind=conn, autoflush=True) + connection = connection_no_trans + connection.begin() + sess = Session(bind=connection, autoflush=True) u1 = User(name="u1") sess.add(u1) sess.flush() - sess.begin_nested() + n1 = sess.begin_nested() u2 = User(name="u2") sess.add(u2) sess.flush() - 
sess.rollback() + n1.rollback() - conn.commit() + connection.commit() assert len(sess.query(User).all()) == 1 @testing.requires.savepoints @@ -171,10 +162,10 @@ class SessionTransactionTest(fixtures.RemovesEvents, FixtureTest): session = fixture_session() session.begin() - session.begin_nested() + n1 = session.begin_nested() u1 = User(name="u1") session.add(u1) - session.commit() + n1.commit() assert u1 in session session.rollback() assert u1 not in session @@ -194,9 +185,9 @@ class SessionTransactionTest(fixtures.RemovesEvents, FixtureTest): session.begin() u1 = session.query(User).first() - session.begin_nested() + n1 = session.begin_nested() session.delete(u1) - session.commit() + n1.commit() assert u1 not in session session.rollback() assert u1 in session @@ -221,34 +212,6 @@ class SessionTransactionTest(fixtures.RemovesEvents, FixtureTest): assert attributes.instance_state(u1) in nt2._dirty assert attributes.instance_state(u1) not in nt1._dirty - s.commit() - assert attributes.instance_state(u1) in nt2._dirty - assert attributes.instance_state(u1) in nt1._dirty - - s.rollback() - assert attributes.instance_state(u1).expired - eq_(u1.name, "u1") - - @testing.requires.savepoints - def test_dirty_state_transferred_deep_nesting_future(self): - User, users = self.classes.User, self.tables.users - - self.mapper_registry.map_imperatively(User, users) - - with fixture_session(future=True) as s: - u1 = User(name="u1") - s.add(u1) - s.commit() - - nt1 = s.begin_nested() - nt2 = s.begin_nested() - u1.name = "u2" - assert attributes.instance_state(u1) not in nt2._dirty - assert attributes.instance_state(u1) not in nt1._dirty - s.flush() - assert attributes.instance_state(u1) in nt2._dirty - assert attributes.instance_state(u1) not in nt1._dirty - nt2.commit() assert attributes.instance_state(u1) in nt2._dirty assert attributes.instance_state(u1) in nt1._dirty @@ -341,13 +304,13 @@ class SessionTransactionTest(fixtures.RemovesEvents, FixtureTest): sess.add(u) sess.flush() 
- sess.begin_nested() # nested transaction + n1 = sess.begin_nested() # nested transaction u2 = User(name="u2") sess.add(u2) sess.flush() - sess.rollback() + n1.rollback() sess.commit() assert len(sess.query(User).all()) == 1 @@ -369,28 +332,6 @@ class SessionTransactionTest(fixtures.RemovesEvents, FixtureTest): sess.add(u2) sess.flush() - sess.rollback() # rolls back nested only - - sess.commit() - assert len(sess.query(User).all()) == 1 - sess.close() - - @testing.requires.savepoints - def test_nested_autotrans_future(self): - User, users = self.classes.User, self.tables.users - - self.mapper_registry.map_imperatively(User, users) - sess = fixture_session(future=True) - u = User(name="u1") - sess.add(u) - sess.flush() - - sess.begin_nested() # nested transaction - - u2 = User(name="u2") - sess.add(u2) - sess.flush() - sess.rollback() # rolls back the whole trans sess.commit() @@ -423,11 +364,11 @@ class SessionTransactionTest(fixtures.RemovesEvents, FixtureTest): sess.rollback() sess.begin() - sess.begin_nested() + n1 = sess.begin_nested() u3 = User(name="u3") sess.add(u3) - sess.commit() # commit the nested transaction + n1.commit() # commit the nested transaction sess.rollback() eq_(set(sess.query(User).all()), set([u2])) @@ -686,6 +627,10 @@ class SessionTransactionTest(fixtures.RemovesEvents, FixtureTest): eq_(session.is_active, False) session.rollback() + is_(session._transaction, None) + + session.connection() + # back to normal eq_(session._transaction._state, _session.ACTIVE) eq_(session.is_active, True) @@ -878,26 +823,18 @@ class SessionTransactionTest(fixtures.RemovesEvents, FixtureTest): sess.add(User(id=5, name="some name")) sess.commit() - def test_no_autocommit_with_explicit_commit(self): + def test_no_autobegin_after_explicit_commit(self): User, users = self.classes.User, self.tables.users self.mapper_registry.map_imperatively(User, users) session = fixture_session() session.add(User(name="ed")) - session._legacy_transaction().commit() - - 
is_not(session._legacy_transaction(), None) - - def test_no_autocommit_with_explicit_commit_future(self): - User, users = self.classes.User, self.tables.users + session.get_transaction().commit() - self.mapper_registry.map_imperatively(User, users) - session = fixture_session(future=True) - session.add(User(name="ed")) - session._legacy_transaction().commit() + is_(session.get_transaction(), None) - # new in 1.4 - is_(session._legacy_transaction(), None) + session.connection() + is_not(session.get_transaction(), None) class _LocalFixture(FixtureTest): @@ -989,7 +926,6 @@ def subtransaction_recipe_three(self): argnames="target_recipe,recipe_rollsback_early", id_="ns", ) -@testing.combinations((True,), (False,), argnames="future", id_="s") class SubtransactionRecipeTest(FixtureTest): run_inserts = None __backend__ = True @@ -1002,7 +938,7 @@ class SubtransactionRecipeTest(FixtureTest): def test_recipe_heavy_nesting(self, subtransaction_recipe): users = self.tables.users - with fixture_session(future=self.future) as session: + with fixture_session() as session: with subtransaction_recipe(session): session.connection().execute( users.insert().values(name="user1") @@ -1046,7 +982,7 @@ class SubtransactionRecipeTest(FixtureTest): self.mapper_registry.map_imperatively(User, users) with testing.db.connect() as conn: trans = conn.begin() - sess = Session(conn, future=self.future) + sess = Session(conn) with subtransaction_recipe(sess): u = User(name="ed") @@ -1061,7 +997,7 @@ class SubtransactionRecipeTest(FixtureTest): User, users = self.classes.User, self.tables.users self.mapper_registry.map_imperatively(User, users) - with fixture_session(future=self.future) as sess: + with fixture_session() as sess: with subtransaction_recipe(sess): u = User(name="u1") sess.add(u) @@ -1074,7 +1010,7 @@ class SubtransactionRecipeTest(FixtureTest): User, users = self.classes.User, self.tables.users self.mapper_registry.map_imperatively(User, users) - with 
fixture_session(future=self.future) as sess: + with fixture_session() as sess: sess.begin() with subtransaction_recipe(sess): u = User(name="u1") @@ -1090,7 +1026,7 @@ class SubtransactionRecipeTest(FixtureTest): self.mapper_registry.map_imperatively(User, users) - with fixture_session(future=self.future) as sess: + with fixture_session() as sess: sess.begin() sess.begin_nested() @@ -1111,7 +1047,7 @@ class SubtransactionRecipeTest(FixtureTest): sess.add(User(name="u2")) t2.commit() - assert sess._legacy_transaction() is t1 + assert sess.get_transaction() is t1 def test_recipe_error_on_using_inactive_session_commands( self, subtransaction_recipe @@ -1119,7 +1055,7 @@ class SubtransactionRecipeTest(FixtureTest): users, User = self.tables.users, self.classes.User self.mapper_registry.map_imperatively(User, users) - with fixture_session(future=self.future) as sess: + with fixture_session() as sess: sess.begin() try: @@ -1141,13 +1077,13 @@ class SubtransactionRecipeTest(FixtureTest): assert not sess.in_transaction() def test_recipe_multi_nesting(self, subtransaction_recipe): - with fixture_session(future=self.future) as sess: + with fixture_session() as sess: with subtransaction_recipe(sess): assert sess.in_transaction() try: with subtransaction_recipe(sess): - assert sess._legacy_transaction() + assert sess.get_transaction() raise Exception("force rollback") except: pass @@ -1160,7 +1096,7 @@ class SubtransactionRecipeTest(FixtureTest): assert not sess.in_transaction() def test_recipe_deactive_status_check(self, subtransaction_recipe): - with fixture_session(future=self.future) as sess: + with fixture_session() as sess: sess.begin() with subtransaction_recipe(sess): @@ -1217,12 +1153,12 @@ class CleanSavepointTest(FixtureTest): run_inserts = None __backend__ = True - def _run_test(self, update_fn, future=False): + def _run_test(self, update_fn): User, users = self.classes.User, self.tables.users self.mapper_registry.map_imperatively(User, users) - with 
fixture_session(future=future) as s: + with fixture_session() as s: u1 = User(name="u1") u2 = User(name="u2") s.add_all([u1, u2]) @@ -1236,12 +1172,8 @@ class CleanSavepointTest(FixtureTest): eq_(u2.name, "u2modified") s.rollback() - if future: - assert s._transaction is None - assert "name" not in u1.__dict__ - else: - assert s._transaction is trans - eq_(u1.__dict__["name"], "u1") + assert s._transaction is None + assert "name" not in u1.__dict__ assert "name" not in u2.__dict__ eq_(u2.name, "u2") @@ -1498,13 +1430,13 @@ class RollbackRecoverTest(_LocalFixture): u1.name = "edward" a1.email_address = "foober" - s.begin_nested() + nt1 = s.begin_nested() s.add(u2) with expect_warnings("New instance"): assert_raises(sa_exc.IntegrityError, s.commit) assert_raises(sa_exc.InvalidRequestError, s.commit) - s.rollback() + nt1.rollback() assert u2 not in s assert a2 not in s assert u1 in s @@ -1534,7 +1466,7 @@ class SavepointTest(_LocalFixture): u2 = User(name="jack") s.add_all([u1, u2]) - s.begin_nested() + nt1 = s.begin_nested() u3 = User(name="wendy") u4 = User(name="foo") u1.name = "edward" @@ -1544,7 +1476,7 @@ class SavepointTest(_LocalFixture): s.query(User.name).order_by(User.id).all(), [("edward",), ("jackward",), ("wendy",), ("foo",)], ) - s.rollback() + nt1.rollback() assert u1.name == "ed" assert u2.name == "jack" eq_(s.query(User.name).order_by(User.id).all(), [("ed",), ("jack",)]) @@ -1575,7 +1507,7 @@ class SavepointTest(_LocalFixture): u2 = User(name="jack") s.add_all([u1, u2]) - s.begin_nested() + nt1 = s.begin_nested() u3 = User(name="wendy") u4 = User(name="foo") u1.name = "edward" @@ -1585,7 +1517,7 @@ class SavepointTest(_LocalFixture): s.query(User.name).order_by(User.id).all(), [("edward",), ("jackward",), ("wendy",), ("foo",)], ) - s.commit() + nt1.commit() def go(): assert u1.name == "edward" @@ -1613,7 +1545,7 @@ class SavepointTest(_LocalFixture): u1.name = "edward" u1.addresses.append(Address(email_address="bar")) - s.begin_nested() + nt1 = 
s.begin_nested() u2 = User(name="jack", addresses=[Address(email_address="bat")]) s.add(u2) eq_( @@ -1629,7 +1561,7 @@ class SavepointTest(_LocalFixture): User(name="jack", addresses=[Address(email_address="bat")]), ], ) - s.rollback() + nt1.rollback() eq_( s.query(User).order_by(User.id).all(), [ @@ -1751,7 +1683,7 @@ class SavepointTest(_LocalFixture): nested_trans = trans._connections[self.bind][1] nested_trans._do_commit() - is_(s._legacy_transaction(), trans) + is_(s.get_nested_transaction(), trans) with expect_warnings("nested transaction already deassociated"): # this previously would raise @@ -1762,12 +1694,14 @@ class SavepointTest(_LocalFixture): assert u1 not in s.new is_(trans._state, _session.CLOSED) - is_not(s._legacy_transaction(), trans) - is_(s._legacy_transaction()._state, _session.ACTIVE) + is_not(s.get_transaction(), trans) - is_(s._legacy_transaction().nested, False) + s.connection() + is_(s.get_transaction()._state, _session.ACTIVE) + + is_(s.get_transaction().nested, False) - is_(s._legacy_transaction()._parent, None) + is_(s.get_transaction()._parent, None) class AccountingFlagsTest(_LocalFixture): @@ -1851,7 +1785,7 @@ class ContextManagerPlusFutureTest(FixtureTest): def test_explicit_begin(self): with fixture_session() as s1: with s1.begin() as trans: - is_(trans, s1._legacy_transaction()) + is_(trans, s1.get_transaction()) s1.connection() is_(s1._transaction, None) @@ -1866,10 +1800,10 @@ class ContextManagerPlusFutureTest(FixtureTest): ) @testing.requires.savepoints - def test_future_rollback_is_global(self): + def test_rollback_is_global(self): users = self.tables.users - with fixture_session(future=True) as s1: + with fixture_session() as s1: s1.begin() s1.connection().execute(users.insert(), [{"id": 1, "name": "n1"}]) @@ -1890,7 +1824,7 @@ class ContextManagerPlusFutureTest(FixtureTest): # rolls back the whole transaction s1.rollback() - is_(s1._legacy_transaction(), None) + is_(s1.get_transaction(), None) eq_( s1.connection().scalar( 
@@ -1900,79 +1834,13 @@ class ContextManagerPlusFutureTest(FixtureTest): ) s1.commit() - is_(s1._legacy_transaction(), None) - - @testing.requires.savepoints - def test_old_rollback_is_local(self): - users = self.tables.users - - with fixture_session() as s1: - - t1 = s1.begin() - - s1.connection().execute(users.insert(), [{"id": 1, "name": "n1"}]) - - s1.begin_nested() - - s1.connection().execute( - users.insert(), - [{"id": 2, "name": "n2"}, {"id": 3, "name": "n3"}], - ) - - eq_( - s1.connection().scalar( - select(func.count()).select_from(users) - ), - 3, - ) - - # rolls back only the savepoint - s1.rollback() - - is_(s1._legacy_transaction(), t1) - - eq_( - s1.connection().scalar( - select(func.count()).select_from(users) - ), - 1, - ) - - s1.commit() - eq_( - s1.connection().scalar( - select(func.count()).select_from(users) - ), - 1, - ) - is_not(s1._legacy_transaction(), None) + is_(s1.get_transaction(), None) def test_session_as_ctx_manager_one(self): users = self.tables.users with fixture_session() as sess: - is_not(sess._legacy_transaction(), None) - - sess.connection().execute( - users.insert().values(id=1, name="user1") - ) - - eq_( - sess.connection().execute(users.select()).all(), [(1, "user1")] - ) - - is_not(sess._legacy_transaction(), None) - - is_not(sess._legacy_transaction(), None) - - # did not commit - eq_(sess.connection().execute(users.select()).all(), []) - - def test_session_as_ctx_manager_future_one(self): - users = self.tables.users - - with fixture_session(future=True) as sess: - is_(sess._legacy_transaction(), None) + is_(sess.get_transaction(), None) sess.connection().execute( users.insert().values(id=1, name="user1") @@ -1982,9 +1850,9 @@ class ContextManagerPlusFutureTest(FixtureTest): sess.connection().execute(users.select()).all(), [(1, "user1")] ) - is_not(sess._legacy_transaction(), None) + is_not(sess.get_transaction(), None) - is_(sess._legacy_transaction(), None) + is_(sess.get_transaction(), None) # did not commit 
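These context-manager tests rest on two behaviors finalized here: the public `Session.get_transaction()` accessor replacing `_legacy_transaction()`, and autobegin, where a new or freshly rolled-back session has no transaction until one is needed. A sketch of the lifecycle, assuming SQLAlchemy 2.0 semantics:

```python
from sqlalchemy import create_engine
from sqlalchemy.orm import Session

engine = create_engine("sqlite://")

with Session(engine) as sess:
    # no transaction exists until something needs a connection (autobegin)
    assert sess.get_transaction() is None

    sess.connection()
    started = sess.get_transaction() is not None

    sess.rollback()
    # rollback ends the transaction; nothing new begins until next use
    ended = sess.get_transaction() is None

    # an explicit begin() returns the same object get_transaction() reports
    trans = sess.begin()
    same = sess.get_transaction() is trans
    trans.commit()
```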
eq_(sess.connection().execute(users.select()).all(), []) @@ -1994,23 +1862,7 @@ class ContextManagerPlusFutureTest(FixtureTest): try: with fixture_session() as sess: - is_not(sess._legacy_transaction(), None) - - sess.connection().execute( - users.insert().values(id=1, name="user1") - ) - - raise Exception("force rollback") - except: - pass - is_not(sess._legacy_transaction(), None) - - def test_session_as_ctx_manager_two_future(self): - users = self.tables.users - - try: - with fixture_session(future=True) as sess: - is_(sess._legacy_transaction(), None) + is_(sess.get_transaction(), None) sess.connection().execute( users.insert().values(id=1, name="user1") @@ -2019,7 +1871,7 @@ class ContextManagerPlusFutureTest(FixtureTest): raise Exception("force rollback") except: pass - is_(sess._legacy_transaction(), None) + is_(sess.get_transaction(), None) def test_begin_context_manager(self): users = self.tables.users @@ -2151,15 +2003,13 @@ class ContextManagerPlusFutureTest(FixtureTest): eq_(sess.connection().execute(users.select()).all(), [(1, "user1")]) sess.close() - @testing.combinations((True,), (False,), argnames="future") - def test_interrupt_ctxmanager(self, trans_ctx_manager_fixture, future): + def test_interrupt_ctxmanager(self, trans_ctx_manager_fixture): fn = trans_ctx_manager_fixture - session = fixture_session(future=future) + session = fixture_session() fn(session, trans_on_subject=True, execute_on_subject=True) - @testing.combinations((True,), (False,), argnames="future") @testing.combinations((True,), (False,), argnames="rollback") @testing.combinations((True,), (False,), argnames="expire_on_commit") @testing.combinations( @@ -2170,15 +2020,13 @@ class ContextManagerPlusFutureTest(FixtureTest): argnames="check_operation", ) def test_interrupt_ctxmanager_ops( - self, future, rollback, expire_on_commit, check_operation + self, rollback, expire_on_commit, check_operation ): users, User = self.tables.users, self.classes.User 
self.mapper_registry.map_imperatively(User, users) - session = fixture_session( - future=future, expire_on_commit=expire_on_commit - ) + session = fixture_session(expire_on_commit=expire_on_commit) with session.begin(): u1 = User(id=7, name="u1") @@ -2266,8 +2114,8 @@ class TransactionFlagsTest(fixtures.TestBase): s1.rollback() - eq_(s1.in_transaction(), True) - is_(s1._transaction, trans) + eq_(s1.in_transaction(), False) + is_(s1._transaction, None) s1.rollback() @@ -2560,7 +2408,7 @@ class NewStyleJoinIntoAnExternalTransactionTest( self.A = A # bind an individual Session to the connection - self.session = Session(bind=self.connection, future=True) + self.session = Session(bind=self.connection) if testing.requires.savepoints.enabled: self.nested = self.connection.begin_nested() diff --git a/test/orm/test_versioning.py b/test/orm/test_versioning.py index 9d14ceba15..b5955e4a6b 100644 --- a/test/orm/test_versioning.py +++ b/test/orm/test_versioning.py @@ -748,7 +748,7 @@ class VersionOnPostUpdateTest(fixtures.MappedTest): # outwit the database transaction isolation and SQLA's # expiration at the same time by using different Session on # same transaction - s2 = Session(bind=s.connection(mapper=Node)) + s2 = Session(bind=s.connection(bind_arguments=dict(mapper=Node))) s2.query(Node).filter(Node.id == n2.id).update({"version_id": 3}) s2.commit() @@ -770,7 +770,7 @@ class VersionOnPostUpdateTest(fixtures.MappedTest): ), patch.object( config.db.dialect, "supports_sane_multi_rowcount", False ): - s2 = Session(bind=s.connection(mapper=Node)) + s2 = Session(bind=s.connection(bind_arguments=dict(mapper=Node))) s2.query(Node).filter(Node.id == n2.id).update({"version_id": 3}) s2.commit() @@ -791,7 +791,7 @@ class VersionOnPostUpdateTest(fixtures.MappedTest): # outwit the database transaction isolation and SQLA's # expiration at the same time by using different Session on # same transaction - s2 = Session(bind=s.connection(mapper=Node)) + s2 = 
Session(bind=s.connection(bind_arguments=dict(mapper=Node))) s2.query(Node).filter(Node.id == n1.id).update({"version_id": 3}) s2.commit() diff --git a/test/sql/test_case_statement.py b/test/sql/test_case_statement.py index db7f16194f..6893a94427 100644 --- a/test/sql/test_case_statement.py +++ b/test/sql/test_case_statement.py @@ -215,23 +215,10 @@ class CaseTest(fixtures.TestBase, AssertsCompiledSQL): def test_when_dicts(self, test_case, expected): t = table("test", column("col1")) - whens, value, else_ = testing.resolve_lambda(test_case, t=t) - - def _case_args(whens, value=None, else_=None): - kw = {} - if value is not None: - kw["value"] = value - if else_ is not None: - kw["else_"] = else_ - - return case(whens, **kw) - - # note: 1.3 also does not allow this form - # case([whens], **kw) + when_dict, value, else_ = testing.resolve_lambda(test_case, t=t) self.assert_compile( - _case_args(whens=whens, value=value, else_=else_), - expected, + case(when_dict, value=value, else_=else_), expected ) def test_text_doesnt_explode(self, connection): diff --git a/test/sql/test_deprecations.py b/test/sql/test_deprecations.py index ea075b36d2..f07410110b 100644 --- a/test/sql/test_deprecations.py +++ b/test/sql/test_deprecations.py @@ -1,12 +1,8 @@ #! 
coding: utf-8 -import itertools -import random - from sqlalchemy import alias from sqlalchemy import and_ from sqlalchemy import bindparam -from sqlalchemy import case from sqlalchemy import CHAR from sqlalchemy import column from sqlalchemy import exc @@ -29,14 +25,11 @@ from sqlalchemy import text from sqlalchemy.engine import default from sqlalchemy.sql import coercions from sqlalchemy.sql import LABEL_STYLE_TABLENAME_PLUS_COL -from sqlalchemy.sql import literal from sqlalchemy.sql import operators from sqlalchemy.sql import quoted_name from sqlalchemy.sql import roles -from sqlalchemy.sql import update from sqlalchemy.sql import visitors from sqlalchemy.sql.selectable import SelectStatementGrouping -from sqlalchemy.testing import assert_raises_message from sqlalchemy.testing import assertions from sqlalchemy.testing import AssertsCompiledSQL from sqlalchemy.testing import eq_ @@ -46,7 +39,6 @@ from sqlalchemy.testing import is_true from sqlalchemy.testing import mock from sqlalchemy.testing.schema import Column from sqlalchemy.testing.schema import Table -from .test_update import _UpdateFromTestBase class ToMetaDataTest(fixtures.TestBase): @@ -313,165 +305,6 @@ class SelectableTest(fixtures.TestBase, AssertsCompiledSQL): ): eq_(stmt.froms, [t1]) - def test_case_list_legacy(self): - t1 = table("t", column("q")) - - with testing.expect_deprecated( - r"The \"whens\" argument to case\(\), when referring " - r"to a sequence of items, is now passed" - ): - stmt = select(t1).where( - case( - [(t1.c.q == 5, "foo"), (t1.c.q == 10, "bar")], else_="bat" - ) - != "bat" - ) - - self.assert_compile( - stmt, - "SELECT t.q FROM t WHERE CASE WHEN (t.q = :q_1) " - "THEN :param_1 WHEN (t.q = :q_2) THEN :param_2 " - "ELSE :param_3 END != :param_4", - ) - - def test_case_whens_kw(self): - t1 = table("t", column("q")) - - with testing.expect_deprecated( - r"The \"whens\" argument to case\(\), when referring " - "to a sequence of items, is now passed" - ): - stmt = select(t1).where( 
- case( - whens=[(t1.c.q == 5, "foo"), (t1.c.q == 10, "bar")], - else_="bat", - ) - != "bat" - ) - - self.assert_compile( - stmt, - "SELECT t.q FROM t WHERE CASE WHEN (t.q = :q_1) " - "THEN :param_1 WHEN (t.q = :q_2) THEN :param_2 " - "ELSE :param_3 END != :param_4", - ) - - @testing.combinations( - ( - (lambda t: ({"x": "y"}, t.c.col1, None)), - "CASE test.col1 WHEN :param_1 THEN :param_2 END", - ), - ( - (lambda t: ({"x": "y", "p": "q"}, t.c.col1, None)), - "CASE test.col1 WHEN :param_1 THEN :param_2 " - "WHEN :param_3 THEN :param_4 END", - ), - ( - (lambda t: ({t.c.col1 == 7: "x"}, None, 10)), - "CASE WHEN (test.col1 = :col1_1) THEN :param_1 ELSE :param_2 END", - ), - ( - (lambda t: ({t.c.col1 == 7: "x", t.c.col1 == 10: "y"}, None, 10)), - "CASE WHEN (test.col1 = :col1_1) THEN :param_1 " - "WHEN (test.col1 = :col1_2) THEN :param_2 ELSE :param_3 END", - ), - argnames="test_case, expected", - ) - def test_when_kwarg(self, test_case, expected): - t = table("test", column("col1")) - - whens, value, else_ = testing.resolve_lambda(test_case, t=t) - - def _case_args(whens, value=None, else_=None): - kw = {} - if value is not None: - kw["value"] = value - if else_ is not None: - kw["else_"] = else_ - - with testing.expect_deprecated_20( - r'The "whens" argument to case\(\) is now passed using ' - r"positional style only, not as a keyword argument." 
- ): - - return case(whens=whens, **kw) - - # note: 1.3 also does not allow this form - # case([whens], **kw) - - self.assert_compile( - _case_args(whens=whens, value=value, else_=else_), - expected, - ) - - def test_case_whens_dict_kw(self): - t1 = table("t", column("q")) - with testing.expect_deprecated( - r"The \"whens\" argument to case\(\) is now passed" - ): - stmt = select(t1).where( - case( - whens={t1.c.q == 5: "foo"}, - else_="bat", - ) - != "bat" - ) - self.assert_compile( - stmt, - "SELECT t.q FROM t WHERE CASE WHEN (t.q = :q_1) THEN " - ":param_1 ELSE :param_2 END != :param_3", - ) - - def test_case_kw_arg_detection(self): - # because we support py2k, case() has to parse **kw for now - - assert_raises_message( - TypeError, - "unknown arguments: bat, foo", - case, - (column("x") == 10, 5), - else_=15, - foo="bar", - bat="hoho", - ) - - def test_with_only_generative(self): - table1 = table( - "table1", - column("col1"), - column("col2"), - column("col3"), - column("colx"), - ) - s1 = table1.select().scalar_subquery() - - with testing.expect_deprecated_20( - r"The \"columns\" argument to " - r"Select.with_only_columns\(\), when referring " - "to a sequence of items, is now passed" - ): - stmt = s1.with_only_columns([s1]) - self.assert_compile( - stmt, - "SELECT (SELECT table1.col1, table1.col2, " - "table1.col3, table1.colx FROM table1) AS anon_1", - ) - - def test_from_list_with_columns(self): - table1 = table("t1", column("a")) - table2 = table("t2", column("b")) - s1 = select(table1.c.a, table2.c.b) - self.assert_compile(s1, "SELECT t1.a, t2.b FROM t1, t2") - - with testing.expect_deprecated_20( - r"The \"columns\" argument to " - r"Select.with_only_columns\(\), when referring " - "to a sequence of items, is now passed" - ): - s2 = s1.with_only_columns([table2.c.b]) - - self.assert_compile(s2, "SELECT t2.b FROM t2") - def test_column(self): stmt = select(column("x")) with testing.expect_deprecated( @@ -504,16 +337,6 @@ class 
SelectableTest(fixtures.TestBase, AssertsCompiledSQL): "ON basefrom.a = joinfrom.a", ) - with testing.expect_deprecated(r"The Select.append_column\(\)"): - replaced.append_column(joinfrom.c.b) - - self.assert_compile( - replaced, - "SELECT basefrom.a, joinfrom.b FROM (SELECT 1 AS a) AS basefrom " - "JOIN (SELECT 1 AS a, 2 AS b) AS joinfrom " - "ON basefrom.a = joinfrom.a", - ) - def test_against_cloned_non_table(self): # test that corresponding column digs across # clone boundaries with anonymous labeled elements @@ -567,31 +390,6 @@ class SelectableTest(fixtures.TestBase, AssertsCompiledSQL): assert u.corresponding_column(s2.c.table2_coly) is u.c.coly assert s2.c.corresponding_column(u.c.coly) is s2.c.table2_coly - def test_join_alias(self): - j1 = self.table1.join(self.table2) - - with testing.expect_deprecated_20( - r"The Join.alias\(\) method is considered legacy" - ): - self.assert_compile( - j1.alias(), - "SELECT table1.col1 AS table1_col1, table1.col2 AS " - "table1_col2, table1.col3 AS table1_col3, table1.colx " - "AS table1_colx, table2.col1 AS table2_col1, " - "table2.col2 AS table2_col2, table2.col3 AS table2_col3, " - "table2.coly AS table2_coly FROM table1 JOIN table2 " - "ON table1.col1 = table2.col2", - ) - - with testing.expect_deprecated_20( - r"The Join.alias\(\) method is considered legacy" - ): - self.assert_compile( - j1.alias(flat=True), - "table1 AS table1_1 JOIN table2 AS table2_1 " - "ON table1_1.col1 = table2_1.col2", - ) - def test_join_against_self_implicit_subquery(self): jj = select(self.table1.c.col1.label("bar_col1")) with testing.expect_deprecated( @@ -613,16 +411,6 @@ class SelectableTest(fixtures.TestBase, AssertsCompiledSQL): ): assert jjj.corresponding_column(jj.c.bar_col1) is jjj_bar_col1 - # test alias of the join - - with testing.expect_deprecated( - r"The Join.alias\(\) method is considered legacy" - ): - j2 = jjj.alias("foo") - assert ( - j2.corresponding_column(self.table1.c.col1) is j2.c.table1_col1 - ) - def 
test_select_labels(self): a = self.table1.select().set_label_style( LABEL_STYLE_TABLENAME_PLUS_COL @@ -759,104 +547,6 @@ class TextualSelectTest(fixtures.TestBase, AssertsCompiledSQL): eq_(t.c.c.type._type_affinity, String) -class DeprecatedAppendMethTest(fixtures.TestBase, AssertsCompiledSQL): - __dialect__ = "default" - - def _expect_deprecated(self, clsname, methname, newmeth): - return testing.expect_deprecated( - r"The %s.append_%s\(\) method is deprecated " - r"and will be removed in a future release. Use the generative " - r"method %s.%s\(\)." % (clsname, methname, clsname, newmeth) - ) - - def test_append_whereclause(self): - t = table("t", column("q")) - stmt = select(t) - - with self._expect_deprecated("Select", "whereclause", "where"): - stmt.append_whereclause(t.c.q == 5) - - self.assert_compile(stmt, "SELECT t.q FROM t WHERE t.q = :q_1") - - def test_append_having(self): - t = table("t", column("q")) - stmt = select(t).group_by(t.c.q) - - with self._expect_deprecated("Select", "having", "having"): - stmt.append_having(t.c.q == 5) - - self.assert_compile( - stmt, "SELECT t.q FROM t GROUP BY t.q HAVING t.q = :q_1" - ) - - def test_append_order_by(self): - t = table("t", column("q"), column("x")) - stmt = select(t).where(t.c.q == 5) - - with self._expect_deprecated( - "GenerativeSelect", "order_by", "order_by" - ): - stmt.append_order_by(t.c.x) - - self.assert_compile( - stmt, "SELECT t.q, t.x FROM t WHERE t.q = :q_1 ORDER BY t.x" - ) - - def test_append_group_by(self): - t = table("t", column("q")) - stmt = select(t) - - with self._expect_deprecated( - "GenerativeSelect", "group_by", "group_by" - ): - stmt.append_group_by(t.c.q) - - stmt = stmt.having(t.c.q == 5) - - self.assert_compile( - stmt, "SELECT t.q FROM t GROUP BY t.q HAVING t.q = :q_1" - ) - - def test_append_correlation(self): - t1 = table("t1", column("q")) - t2 = table("t2", column("q"), column("p")) - - inner = select(t2.c.p).where(t2.c.q == t1.c.q) - - with 
self._expect_deprecated("Select", "correlation", "correlate"): - inner.append_correlation(t1) - stmt = select(t1).where(t1.c.q == inner.scalar_subquery()) - - self.assert_compile( - stmt, - "SELECT t1.q FROM t1 WHERE t1.q = " - "(SELECT t2.p FROM t2 WHERE t2.q = t1.q)", - ) - - def test_append_column(self): - t1 = table("t1", column("q"), column("p")) - stmt = select(t1.c.q) - with self._expect_deprecated("Select", "column", "add_columns"): - stmt.append_column(t1.c.p) - self.assert_compile(stmt, "SELECT t1.q, t1.p FROM t1") - - def test_append_prefix(self): - t1 = table("t1", column("q"), column("p")) - stmt = select(t1.c.q) - with self._expect_deprecated("Select", "prefix", "prefix_with"): - stmt.append_prefix("FOO BAR") - self.assert_compile(stmt, "SELECT FOO BAR t1.q FROM t1") - - def test_append_from(self): - t1 = table("t1", column("q")) - t2 = table("t2", column("q")) - - stmt = select(t1) - with self._expect_deprecated("Select", "from", "select_from"): - stmt.append_from(t1.join(t2, t1.c.q == t2.c.q)) - self.assert_compile(stmt, "SELECT t1.q FROM t1 JOIN t2 ON t1.q = t2.q") - - class KeyTargetingTest(fixtures.TablesTest): run_inserts = "once" run_deletes = None @@ -971,365 +661,6 @@ class PKIncrementTest(fixtures.TablesTest): ) -class DMLTest(_UpdateFromTestBase, fixtures.TablesTest, AssertsCompiledSQL): - __dialect__ = "default" - - def test_insert_inline_kw_defaults(self): - m = MetaData() - foo = Table("foo", m, Column("id", Integer)) - - t = Table( - "test", - m, - Column("col1", Integer, default=func.foo(1)), - Column( - "col2", - Integer, - default=select(func.coalesce(func.max(foo.c.id))), - ), - ) - - with testing.expect_deprecated_20( - "The insert.inline parameter will be removed in SQLAlchemy 2.0." 
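The deleted `DeprecatedAppendMethTest` covered the in-place mutators (`append_whereclause()`, `append_order_by()`, `append_column()`, ...); their replacements are the generative methods, which return a new statement rather than mutating the existing one:

```python
from sqlalchemy import column, select, table

t = table("t", column("q"), column("x"))

# formerly stmt.append_whereclause(...) / stmt.append_order_by(...), in place;
# each generative call returns a new Select
stmt = select(t)
stmt = stmt.where(t.c.q == 5)
stmt = stmt.order_by(t.c.x)

sql = str(stmt)
```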
- ): - stmt = t.insert(inline=True, values={}) - - self.assert_compile( - stmt, - "INSERT INTO test (col1, col2) VALUES (foo(:foo_1), " - "(SELECT coalesce(max(foo.id)) AS coalesce_1 FROM " - "foo))", - ) - - def test_insert_inline_kw_default(self): - metadata = MetaData() - table = Table( - "sometable", - metadata, - Column("id", Integer, primary_key=True), - Column("foo", Integer, default=func.foobar()), - ) - - with testing.expect_deprecated_20( - "The insert.inline parameter will be removed in SQLAlchemy 2.0." - ): - stmt = table.insert(values={}, inline=True) - - self.assert_compile( - stmt, - "INSERT INTO sometable (foo) VALUES (foobar())", - ) - - with testing.expect_deprecated_20( - "The insert.inline parameter will be removed in SQLAlchemy 2.0." - ): - stmt = table.insert(inline=True) - - self.assert_compile( - stmt, - "INSERT INTO sometable (foo) VALUES (foobar())", - params={}, - ) - - def test_update_inline_kw_defaults(self): - m = MetaData() - foo = Table("foo", m, Column("id", Integer)) - - t = Table( - "test", - m, - Column("col1", Integer, onupdate=func.foo(1)), - Column( - "col2", - Integer, - onupdate=select(func.coalesce(func.max(foo.c.id))), - ), - Column("col3", String(30)), - ) - - with testing.expect_deprecated_20( - "The update.inline parameter will be removed in SQLAlchemy 2.0." 
- ): - stmt = t.update(inline=True, values={"col3": "foo"}) - - self.assert_compile( - stmt, - "UPDATE test SET col1=foo(:foo_1), col2=(SELECT " - "coalesce(max(foo.id)) AS coalesce_1 FROM foo), " - "col3=:col3", - ) - - def test_update_dialect_kwargs(self): - t = table("foo", column("bar")) - - with testing.expect_deprecated_20("Passing dialect keyword arguments"): - stmt = t.update(mysql_limit=10) - - self.assert_compile( - stmt, "UPDATE foo SET bar=%s LIMIT 10", dialect="mysql" - ) - - def test_update_whereclause(self): - table1 = table( - "mytable", - Column("myid", Integer), - Column("name", String(30)), - ) - - with testing.expect_deprecated_20( - "The update.whereclause parameter will be " - "removed in SQLAlchemy 2.0" - ): - self.assert_compile( - table1.update(table1.c.myid == 7), - "UPDATE mytable SET myid=:myid, name=:name " - "WHERE mytable.myid = :myid_1", - ) - - def test_update_values(self): - table1 = table( - "mytable", - Column("myid", Integer), - Column("name", String(30)), - ) - - with testing.expect_deprecated_20( - "The update.values parameter will be removed in SQLAlchemy 2.0" - ): - self.assert_compile( - table1.update(values={table1.c.myid: 7}), - "UPDATE mytable SET myid=:myid", - ) - - def test_delete_whereclause(self): - table1 = table( - "mytable", - Column("myid", Integer), - ) - - with testing.expect_deprecated_20( - "The delete.whereclause parameter will be " - "removed in SQLAlchemy 2.0" - ): - self.assert_compile( - table1.delete(table1.c.myid == 7), - "DELETE FROM mytable WHERE mytable.myid = :myid_1", - ) - - def test_update_ordered_parameters_fire_onupdate(self): - table = self.tables.update_w_default - - values = [(table.c.y, table.c.x + 5), ("x", 10)] - - with testing.expect_deprecated_20( - "The update.preserve_parameter_order parameter will be " - "removed in SQLAlchemy 2.0." 
- ): - self.assert_compile( - table.update(preserve_parameter_order=True).values(values), - "UPDATE update_w_default " - "SET ycol=(update_w_default.x + :x_1), " - "x=:x, data=:data", - ) - - def test_update_ordered_parameters_override_onupdate(self): - table = self.tables.update_w_default - - values = [ - (table.c.y, table.c.x + 5), - (table.c.data, table.c.x + 10), - ("x", 10), - ] - - with testing.expect_deprecated_20( - "The update.preserve_parameter_order parameter will be " - "removed in SQLAlchemy 2.0." - ): - self.assert_compile( - table.update(preserve_parameter_order=True).values(values), - "UPDATE update_w_default " - "SET ycol=(update_w_default.x + :x_1), " - "data=(update_w_default.x + :x_2), x=:x", - ) - - def test_update_ordered_parameters_oldstyle_1(self): - table1 = self.tables.mytable - - # Confirm that we can pass values as list value pairs - # note these are ordered *differently* from table.c - values = [ - (table1.c.name, table1.c.name + "lala"), - (table1.c.myid, func.do_stuff(table1.c.myid, literal("hoho"))), - ] - - with testing.expect_deprecated_20( - "The update.preserve_parameter_order parameter will be " - "removed in SQLAlchemy 2.0.", - "The update.whereclause parameter will be " - "removed in SQLAlchemy 2.0", - "The update.values parameter will be removed in SQLAlchemy 2.0", - ): - self.assert_compile( - update( - table1, - (table1.c.myid == func.hoho(4)) - & ( - table1.c.name - == literal("foo") + table1.c.name + literal("lala") - ), - preserve_parameter_order=True, - values=values, - ), - "UPDATE mytable " - "SET " - "name=(mytable.name || :name_1), " - "myid=do_stuff(mytable.myid, :param_1) " - "WHERE " - "mytable.myid = hoho(:hoho_1) AND " - "mytable.name = :param_2 || mytable.name || :param_3", - ) - - def test_update_ordered_parameters_oldstyle_2(self): - table1 = self.tables.mytable - - # Confirm that we can pass values as list value pairs - # note these are ordered *differently* from table.c - values = [ - (table1.c.name, 
table1.c.name + "lala"), - ("description", "some desc"), - (table1.c.myid, func.do_stuff(table1.c.myid, literal("hoho"))), - ] - - with testing.expect_deprecated_20( - "The update.preserve_parameter_order parameter will be " - "removed in SQLAlchemy 2.0.", - "The update.whereclause parameter will be " - "removed in SQLAlchemy 2.0", - ): - self.assert_compile( - update( - table1, - (table1.c.myid == func.hoho(4)) - & ( - table1.c.name - == literal("foo") + table1.c.name + literal("lala") - ), - preserve_parameter_order=True, - ).values(values), - "UPDATE mytable " - "SET " - "name=(mytable.name || :name_1), " - "description=:description, " - "myid=do_stuff(mytable.myid, :param_1) " - "WHERE " - "mytable.myid = hoho(:hoho_1) AND " - "mytable.name = :param_2 || mytable.name || :param_3", - ) - - def test_update_preserve_order_reqs_listtups(self): - table1 = self.tables.mytable - - with testing.expect_deprecated_20( - "The update.preserve_parameter_order parameter will be " - "removed in SQLAlchemy 2.0." - ): - testing.assert_raises_message( - ValueError, - r"When preserve_parameter_order is True, values\(\) " - r"only accepts a list of 2-tuples", - table1.update(preserve_parameter_order=True).values, - {"description": "foo", "name": "bar"}, - ) - - @testing.fixture - def randomized_param_order_update(self): - from sqlalchemy.sql.dml import UpdateDMLState - - super_process_ordered_values = UpdateDMLState._process_ordered_values - - # this fixture is needed for Python 3.6 and above to work around - # dictionaries being insert-ordered. in python 2.7 the previous - # logic fails pretty easily without this fixture. 
- def _process_ordered_values(self, statement): - super_process_ordered_values(self, statement) - - tuples = list(self._dict_parameters.items()) - random.shuffle(tuples) - self._dict_parameters = dict(tuples) - - dialect = default.StrCompileDialect() - dialect.paramstyle = "qmark" - dialect.positional = True - - with mock.patch.object( - UpdateDMLState, "_process_ordered_values", _process_ordered_values - ): - yield - - def random_update_order_parameters(): - from sqlalchemy import ARRAY - - t = table( - "foo", - column("data1", ARRAY(Integer)), - column("data2", ARRAY(Integer)), - column("data3", ARRAY(Integer)), - column("data4", ARRAY(Integer)), - ) - - idx_to_value = [ - (t.c.data1, 5, 7), - (t.c.data2, 10, 18), - (t.c.data3, 8, 4), - (t.c.data4, 12, 14), - ] - - def combinations(): - while True: - random.shuffle(idx_to_value) - yield list(idx_to_value) - - return testing.combinations( - *[ - (t, combination) - for i, combination in zip(range(10), combinations()) - ], - argnames="t, idx_to_value", - ) - - @random_update_order_parameters() - def test_update_to_expression_ppo( - self, randomized_param_order_update, t, idx_to_value - ): - dialect = default.StrCompileDialect() - dialect.paramstyle = "qmark" - dialect.positional = True - - with testing.expect_deprecated_20( - "The update.preserve_parameter_order parameter will be " - "removed in SQLAlchemy 2.0." - ): - stmt = t.update(preserve_parameter_order=True).values( - [(col[idx], val) for col, idx, val in idx_to_value] - ) - - self.assert_compile( - stmt, - "UPDATE foo SET %s" - % ( - ", ".join( - "%s[?]=?" 
% col.key for col, idx, val in idx_to_value - ) - ), - dialect=dialect, - checkpositional=tuple( - itertools.chain.from_iterable( - (idx, val) for col, idx, val in idx_to_value - ) - ), - ) - - class TableDeprecationTest(fixtures.TestBase): def test_mustexists(self): with testing.expect_deprecated("Deprecated alias of .*must_exist"): diff --git a/test/sql/test_roles.py b/test/sql/test_roles.py index abbef8e280..5c9ed3588a 100644 --- a/test/sql/test_roles.py +++ b/test/sql/test_roles.py @@ -226,15 +226,13 @@ class RoleTest(fixtures.TestBase): ): expect(roles.ExpressionElementRole, Thing()) - def test_statement_text_coercion(self): - with testing.expect_deprecated_20( - "Using plain strings to indicate SQL statements" + def test_no_statement_text_coercion(self): + with testing.expect_raises_message( + exc.ArgumentError, + r"Textual SQL expression 'select \* from table' should be " + "explicitly declared", ): - is_true( - expect(roles.StatementRole, "select * from table").compare( - text("select * from table") - ) - ) + expect(roles.StatementRole, "select * from table") def test_select_statement_no_text_coercion(self): assert_raises_message( diff --git a/test/sql/test_selectable.py b/test/sql/test_selectable.py index c3a2d8d3c6..a63ae5be46 100644 --- a/test/sql/test_selectable.py +++ b/test/sql/test_selectable.py @@ -608,6 +608,17 @@ class SelectableTest( "table1.col3, table1.colx FROM table1) AS anon_1", ) + def test_with_only_generative_no_list(self): + s1 = table1.select().scalar_subquery() + + with testing.expect_raises_message( + exc.ArgumentError, + r"The \"columns\" argument to " + r"Select.with_only_columns\(\), when referring " + "to a sequence of items, is now passed", + ): + s1.with_only_columns([s1]) + @testing.combinations( ( [table1.c.col1], diff --git a/test/sql/test_types.py b/test/sql/test_types.py index fb1bdad2e6..79b77581d0 100644 --- a/test/sql/test_types.py +++ b/test/sql/test_types.py @@ -79,7 +79,6 @@ from sqlalchemy.testing import 
AssertsCompiledSQL from sqlalchemy.testing import AssertsExecutionResults from sqlalchemy.testing import engines from sqlalchemy.testing import eq_ -from sqlalchemy.testing import expect_deprecated_20 from sqlalchemy.testing import expect_raises from sqlalchemy.testing import fixtures from sqlalchemy.testing import is_ @@ -2529,12 +2528,6 @@ class EnumTest(AssertsCompiledSQL, fixtures.TablesTest): [(1, self.SomeEnum.three), (2, self.SomeEnum.three)], ) - def test_omit_warn(self): - with expect_deprecated_20( - r"The provided enum someenum contains the aliases \['four'\]" - ): - Enum(self.SomeEnum) - @testing.combinations( (True, "native"), (False, "non_native"), id_="ai", argnames="native" )
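
Note for anyone migrating code hit by the removed in-place mutators covered by `DeprecatedAppendMethTest`: the generative methods named in the old deprecation messages are the replacements. A minimal sketch against the public 1.4/2.0 `Select` API (the `t` table here is illustrative, not part of the patch):

```python
# Generative replacements for the removed Select "append_*" mutators;
# each call returns a new Select rather than mutating in place.
from sqlalchemy import column, select, table

t = table("t", column("q"), column("x"))

stmt = select(t.c.q)
stmt = stmt.where(t.c.q == 5)    # was stmt.append_whereclause(...)
stmt = stmt.add_columns(t.c.x)   # was stmt.append_column(...)
stmt = stmt.order_by(t.c.x)      # was stmt.append_order_by(...)

print(str(stmt))
```

The same pattern applies to `having()`, `group_by()`, `correlate()`, `prefix_with()`, and `select_from()` for the other removed `append_*` methods.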
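
The removed `DMLTest` cases exercised the `inline=` and `preserve_parameter_order=` keyword arguments; their generative replacements are `Insert.inline()` / `Update.inline()` and `Update.ordered_values()`. A sketch, with illustrative table and function names (`sometable`, `mytable`, `foobar` are not from the patch):

```python
# Generative replacements for the removed DML keyword arguments.
from sqlalchemy import Column, Integer, MetaData, Table, column, func, table

m = MetaData()
sometable = Table(
    "sometable",
    m,
    Column("id", Integer, primary_key=True),
    Column("foo", Integer, default=func.foobar()),
)

# was: sometable.insert(inline=True) -- defaults render inline in VALUES
ins = sometable.insert().inline()

mytable = table("mytable", column("myid"), column("name"))

# was: mytable.update(preserve_parameter_order=True).values([...]);
# ordered_values() renders the SET clause in exactly this tuple order
upd = mytable.update().ordered_values(
    (mytable.c.name, func.lower(mytable.c.name)),
    (mytable.c.myid, mytable.c.myid + 1),
)

print(str(upd))
```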
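
The `test_roles.py` and `test_selectable.py` hunks above reflect two hard errors that replace old coercions: plain strings are no longer accepted as SQL statements, and `with_only_columns()` no longer takes a single list. A sketch of the modern forms (the `some_table` / `t` names are illustrative):

```python
# Textual SQL must be wrapped in text() explicitly, and columns are
# passed to with_only_columns() positionally rather than as a list.
from sqlalchemy import column, select, table, text

txt = text("select * from some_table")  # was a bare string

t = table("t", column("a"), column("b"))
s1 = select(t).with_only_columns(t.c.a)  # was with_only_columns([t.c.a])

print(str(txt))
print(str(s1))
```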