:ref:`migration_14_toplevel`.
.. changelog_imports::

    .. include:: changelog_13.rst
        :start-line: 5

.. changelog::
:version: 1.4.48
.. seealso::
:ref:`azure_synapse_ignore_no_transaction_on_rollback`
.. change::
:tags: bug, mypy
.. seealso::
:ref:`postgresql_constraint_options`
.. change::
:tags: bug, orm, regression
.. seealso::
:ref:`asyncio_events_run_async`
.. change::
.. seealso::
:ref:`asyncmy`
.. change::
:tags: bug, asyncio
.. seealso::
:ref:`asyncio_scoped_session`
.. change::
:tags: usecase, mysql
.. seealso::
:ref:`mypy_declarative_mixins`
.. change::
.. seealso::
:ref:`aiosqlite`
.. change::
:tags: bug, regression, orm, declarative
.. seealso::
:ref:`pysqlcipher`
.. change::
.. seealso::
:ref:`orm_declarative_dataclasses_mixin`
.. change::
:tags: bug, sql, regression
.. seealso::
:ref:`mssql_pyodbc_setinputsizes`
.. change::
:tags: bug, orm, regression
.. seealso::
:ref:`mypy_toplevel`
.. change::
:tags: bug, sql
.. seealso::
:ref:`orm_declarative_dataclasses_declarative_table`
.. change::
:tags: bug, sql
.. seealso::
:ref:`asyncpg_prepared_statement_cache`
.. change::
:tags: feature, mysql
.. seealso::
:ref:`aiomysql`
.. change::
:tags: bug, reflection
.. seealso::
:ref:`sqlite_on_conflict_insert`
.. change::
:tags: bug, asyncio
:ref:`mapper_automated_reflection_schemes` - in the ORM mapping documentation
:ref:`automap_intercepting_columns` - in the :ref:`automap_toplevel` documentation
.. seealso::
:ref:`metadata_reflection_dbagnostic_types` - example usage
.. change::
:tags: bug, sql
:tickets: 3414
SQLAlchemy now includes support for Python asyncio within both Core and
ORM, using the included :ref:`asyncio extension <asyncio_toplevel>`. The
extension makes use of the `greenlet
<https://greenlet.readthedocs.io/en/latest/>`_ library in order to adapt
SQLAlchemy's sync-oriented internals such that an asyncio interface that
ultimately interacts with an asyncio database adapter is now feasible. The
single driver supported at the moment is the
:ref:`dialect-postgresql-asyncpg` driver for PostgreSQL.
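
A minimal Core-level sketch of the new extension, assuming the asyncpg driver
is installed and using an illustrative connection URL::

    import asyncio

    from sqlalchemy import text
    from sqlalchemy.ext.asyncio import create_async_engine


    async def main():
        # URL and credentials are placeholders for a real PostgreSQL database
        engine = create_async_engine("postgresql+asyncpg://scott:tiger@localhost/test")

        async with engine.connect() as conn:
            result = await conn.execute(text("SELECT 'hello'"))
            print(result.all())

        await engine.dispose()


    asyncio.run(main())
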
.. seealso::
Microsoft SQL Server. This removes the deprecated feature of using
:class:`.Sequence` objects to manipulate IDENTITY characteristics which
should now be performed using ``mssql_identity_start`` and
``mssql_identity_increment`` as documented at :ref:`mssql_identity`. The
change includes a new parameter :paramref:`.Sequence.data_type` to
accommodate SQL Server's choice of datatype, which for that backend
includes INTEGER, BIGINT, and DECIMAL(n, 0). The default starting value
.. seealso::
:ref:`oracle_max_identifier_lengths` - in the Oracle dialect documentation
.. change::
.. seealso::
:ref:`postgresql_readonly_deferrable`
.. change::
:tags: mysql, feature
Remove deprecated method ``Session.prune`` and parameter
``Session.weak_identity_map``. See the recipe at
:ref:`session_referencing_behavior` for an event-based approach to
maintaining strong identity references.
This change also removes the class ``StrongInstanceDict``.
.. seealso::
:ref:`asyncpg_prepared_statement_name`
.. change::
:tags: typing, bug
:ref:`error_dcmx` - background on rationale
:ref:`orm_declarative_dc_mixins`
.. change::
:tags: bug, postgresql
SQLAlchemy now computes rowcount for a RETURNING statement in this specific
case by counting the rows returned, rather than relying upon
``cursor.rowcount``. In particular, the ORM versioned rows use case
(documented at :ref:`mapper_version_counter`) should now be fully
supported with the SQL Server pyodbc dialect.
:tags: usecase, typing
:tickets: 9321
Improved the typing support for the :ref:`hybrids_toplevel`
extension, updated all documentation to use ORM Annotated Declarative
mappings, and added a new modifier called :attr:`.hybrid_property.inplace`.
This modifier provides a way to alter the state of a :class:`.hybrid_property`
.. seealso::
:ref:`hybrid_pep484_naming`
.. change::
:tags: bug, orm
:tickets: 9295
Adjusted the behavior of the ``thick_mode`` parameter for the
:ref:`oracledb` dialect to correctly accept ``False`` as a value.
Previously, only ``None`` would indicate that thick mode should be
disabled.
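
A hedged sketch of the corrected usage (the connection URL is illustrative)::

    from sqlalchemy import create_engine

    # thick_mode=False now explicitly requests python-oracledb "thin" mode
    engine = create_engine(
        "oracle+oracledb://scott:tiger@dbhost:1521/?service_name=freepdb1",
        thick_mode=False,
    )
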
.. seealso::
:ref:`dataclasses_pydantic`
.. changelog::
:class:`_orm.Mapper` object is created within the class creation process,
there was no documented means of running code at this point. The change
is to immediately benefit custom mapping schemes such as that
of the :ref:`examples_versioned_history` example, which generate additional
mappers and tables in response to the creation of mapped classes.
:tags: bug, examples
:tickets: 9220
Reworked the :ref:`examples_versioned_history` to work with
version 2.0, while at the same time improving the overall working of
this example to use newer APIs, including a newly added hook
:meth:`_orm.MapperEvents.after_mapper_constructed`.
.. seealso::
:ref:`automap_by_module` - illustrates use of both techniques at once.
.. change::
:tags: orm, bug
.. seealso::
:ref:`mssql_comment_support`
.. changelog::
.. seealso::
:ref:`orm_declarative_native_dataclasses_non_mapped_fields`
.. change::
:tags: bug, orm
:tickets: 8880
Fixed bug in :ref:`orm_declarative_native_dataclasses` feature where using
plain dataclass fields with the ``__allow_unmapped__`` directive in a
mapping would not create a dataclass with the correct class-level state for
those fields, copying the raw ``Field`` object to the class inappropriately
Added support for the :func:`.association_proxy` extension function to
take part within Python ``dataclasses`` configuration, when using
the native dataclasses feature described at
:ref:`orm_declarative_native_dataclasses`. Included are attribute-level
arguments including :paramref:`.association_proxy.init` and
:paramref:`.association_proxy.default_factory`.
attribute constructs including :func:`_orm.mapped_column`,
:func:`_orm.relationship` etc. to provide for the Python dataclasses
``compare`` parameter on ``field()``, when using the
:ref:`orm_declarative_native_dataclasses` feature. Pull request courtesy
Simon Schiele.
.. change::
.. seealso::
:ref:`sqlite_include_internal`
.. change::
:tags: feature, postgresql
:ref:`ticket_8054`
:ref:`oracledb`
.. change::
:tags: bug, engine
:ref:`ticket_6842`
:ref:`postgresql_psycopg`
Additionally, classes mapped by :class:`_orm.composite` now support
ordering comparison operations, e.g. ``<``, ``>=``, etc.
See the new documentation at :ref:`mapper_composite` for examples.
.. change::
:tags: engine, bug
``'value'``. For normal bound value handling, the :class:`_types.Unicode`
datatype also may have implications for passing values to the DBAPI, again
in the case of SQL Server, the pyodbc driver supports the use of
:ref:`setinputsizes mode <mssql_pyodbc_setinputsizes>` which will handle
:class:`_types.String` versus :class:`_types.Unicode` differently.
Expanding IN feature now supports empty lists
---------------------------------------------
The "expanding IN" feature introduced in version 1.2 at :ref:`change_3953` now
supports empty lists passed to the :meth:`.ColumnOperators.in_` operator. The implementation
for an empty list will produce an "empty set" expression that is specific to a target
backend, such as "SELECT CAST(NULL AS INTEGER) WHERE 1!=1" for PostgreSQL,
---------------------------------------------------------
The warnings that were first added in version 1.0, described at
:ref:`migration_2992`, have now been converted into exceptions. Continued
concerns have been raised regarding the automatic coercion of string fragments
passed to methods like :meth:`_query.Query.filter` and :meth:`_expression.Select.order_by` being
converted to :func:`_expression.text` constructs, even though this has emitted a warning.
.. seealso::
:ref:`mysql_insert_on_duplicate_key_update`
Dialect Improvements and Changes - SQLite
=============================================
.. seealso::
:ref:`sqlite_on_conflict_ddl`
:ticket:`4360`
.. seealso::
:ref:`mssql_pyodbc_fastexecutemany`
:ticket:`4158`
.. seealso::
:ref:`mssql_identity`
:ticket:`4362`
and construct ORM objects from result sets.
To introduce the general idea of the feature, given code from the
:ref:`examples_performance` suite as follows, which will invoke
a very simple query "n" times, for a default value of n=10000. The
query returns only a single row, as the overhead we are looking to decrease
is that of **many small queries**. The optimization is not as significant
deprecated, superseded by :meth:`_orm.registry.map_declaratively`. The
:class:`_declarative.ConcreteBase`, :class:`_declarative.AbstractConcreteBase`,
and :class:`_declarative.DeferredReflection` classes remain as extensions in the
:ref:`declarative_toplevel` package.
Mapping styles have now been organized such that they all extend from
the :class:`_orm.registry` object, and fall into these categories:
* Using :meth:`_orm.registry.mapped` Declarative Decorator
* Declarative Table
* Imperative Table (Hybrid)
* :ref:`orm_declarative_dataclasses`
* :ref:`Imperative (a.k.a. "classical" mapping) <orm_imperative_mapping>`
* Using :meth:`_orm.registry.map_imperatively`
* :ref:`orm_imperative_dataclasses`
The existing classical mapping function ``sqlalchemy.orm.mapper()`` remains,
however it is deprecated to call upon ``sqlalchemy.orm.mapper()`` directly; the
.. seealso::
:ref:`orm_declarative_dataclasses`

:ref:`orm_imperative_dataclasses`
:ticket:`5027`
the initial releases of SQLAlchemy 1.4. This is super new stuff that uses
some previously unfamiliar programming techniques.
The initial database API supported is the :ref:`dialect-postgresql-asyncpg`
asyncio driver for PostgreSQL.
The internal features of SQLAlchemy are fully integrated by making use of
feature** such that applications that wish to make use of such ORM features
can opt to organize database-related code into functions which can then be
run within greenlets using the :meth:`_asyncio.AsyncSession.run_sync`
method. See the ``greenlet_orm.py`` example at :ref:`examples_asyncio`
for a demonstration.
Support for asynchronous cursors is also provided using new methods
.. seealso::
:ref:`asyncio_toplevel`

:ref:`examples_asyncio`
:meth:`_sql.ColumnOperators.regexp_replace`
:ref:`pysqlite_regexp` - SQLite implementation notes
:ticket:`1390`
All IN expressions render parameters for each value in the list on the fly (e.g. expanding parameters)
------------------------------------------------------------------------------------------------------
The "expanding IN" feature, first introduced in :ref:`change_3953`, has matured
enough such that it is clearly superior to the previous method of rendering IN
expressions. As the approach was improved to handle empty lists of values, it
is now the only means that Core / ORM will use to render lists of IN
statements that included IN expressions generally.
In order to service the "baked query" feature described at
:ref:`baked_toplevel`, a cacheable version of IN was needed, which is what
brought about the "expanding IN" feature. In contrast to the existing behavior
whereby the parameter list is expanded at statement construction time into
individual :class:`.BindParameter` objects, the feature instead uses a single
:paramref:`_sa.create_engine.empty_in_strategy` parameter, introduced in version
1.2 as a means for migrating for how this case was treated for the previous IN
system, is now deprecated and this flag no longer has an effect; as described
in :ref:`change_3907`, this flag allowed a dialect to switch between the
original system of comparing a column against itself, which turned out to be a
huge performance issue, and a newer system of comparing "1 != 1" in
order to produce a "false" expression. The 1.3 introduced behavior which
to PREPARE the statement ahead of time as would normally be expected for this
approach to be performant.
SQLAlchemy includes a :ref:`performance suite <examples_performance>` within
its examples, where we can compare the times generated for the "batch_inserts"
runner against 1.3 and 1.4, revealing a 3x-5x speedup for most flavors
of batch insert:
The feature batches rows into groups of 1000 by default which can be affected
using the ``executemany_values_page_size`` argument documented at
:ref:`psycopg2_executemany_mode`.
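
A rough sketch of adjusting the page size at engine configuration time,
assuming the 1.4-era psycopg2 dialect arguments and an illustrative URL::

    from sqlalchemy import create_engine

    engine = create_engine(
        "postgresql+psycopg2://scott:tiger@localhost/test",
        executemany_mode="values_plus_batch",
        # number of rows rendered per VALUES clause for INSERT statements
        executemany_values_page_size=5000,
    )
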
:ticket:`5263`
would skip including these values within an INSERT so that SQL-level defaults
take place, if any, else the value defaults to NULL on the database side.
In version 1.0 as part of :ref:`migration_3061`, this behavior was refined so
that the ``None`` value was no longer populated into ``__dict__``, only
returned. Besides removing the mutating side effect of a getter operation,
this change also made it possible to set columns that did have server defaults
be cached before the VALUES are rendered.
A quick test of the ``execute_values()`` approach using the
``bulk_inserts.py`` script in the :ref:`examples_performance` example
suite reveals an approximate **fivefold performance increase**:
.. sourcecode:: text
test_core_insert : A single Core INSERT construct inserting mappings in bulk. (100000 iterations); total time 0.944007 sec
Support for the "batch" extension was added in version 1.2 in
:ref:`change_4109`, and enhanced to include support for the ``execute_values``
extension in 1.3 in :ticket:`4623`. In 1.4 the ``execute_values`` extension is
now being turned on by default for INSERT statements; the "batch" extension
for UPDATE and DELETE remains off by default.
.. seealso::
:ref:`psycopg2_executemany_mode`
:ticket:`5401`
any modern Python versions rely upon this limitation.
The behavior was first introduced in 0.9 and was part of the larger change of
allowing for right nested joins as described at :ref:`feature_joins_09`.
However the SQLite workaround produced many regressions in the 2013-2014
period due to its complexity. In 2016, the dialect was modified so that the
join rewriting logic would only occur for SQLite versions prior to 3.7.16 after
.. seealso::
:ref:`defaults_sequences`
:ticket:`4976`
.. seealso::
:ref:`mssql_identity`
:ticket:`4235`
:class:`_orm.Mapped` which link to constructs such as :func:`_orm.relationship`
will raise errors in Python, as they suggest mis-configurations.
SQLAlchemy applications that use the :ref:`Mypy plugin <mypy_toplevel>` with
explicit annotations that don't use :class:`_orm.Mapped` in their annotations
are subject to these errors, as would occur in the example below::
remove the base class requirement, a first class :ref:`decorator
<declarative_config_toplevel>` form has been added.
As yet another separate but related enhancement, support for :ref:`Python
dataclasses <orm_declarative_dataclasses>` is added as well to both
declarative decorator and classical mapping forms.
.. seealso::
)
# or
session.scalar(
select(func.count(User.id))
)
**Synopsis**
The ``lazy="dynamic"`` relationship loader strategy, discussed at
:ref:`dynamic_relationship`, makes use of the :class:`_query.Query` object
which is legacy in 2.0. The "dynamic" relationship is not directly compatible
with asyncio without workarounds, and additionally it does not fulfill its
original purpose of preventing iteration of large collections as it has several
:ref:`change_7123`
:ref:`write_only_relationship`
.. _migration_20_session_autocommit:
~~~~~~~~~~~~~~~~~~~~~~
SQLAlchemy 1.4 introduced the first SQLAlchemy-native ORM typing support
using a combination of sqlalchemy2-stubs_ and the :ref:`Mypy Plugin <mypy_toplevel>`.
In SQLAlchemy 2.0, the Mypy plugin **remains available, and has been updated
to work with SQLAlchemy 2.0's typing system**. However, it should now be
considered **deprecated**, as applications now have a straightforward path to adopting the
Using Legacy Mypy-Typed Models
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
SQLAlchemy applications that use the :ref:`Mypy plugin <mypy_toplevel>` with
explicit annotations that don't use :class:`_orm.Mapped` in their annotations
are subject to errors under the new system, as such annotations are flagged as
errors when using constructs such as :func:`_orm.relationship`.
.. seealso::
:ref:`orm_declarative_native_dataclasses`
.. _change_6047:
Benchmarks
~~~~~~~~~~
SQLAlchemy includes a :ref:`Performance Suite <examples_performance>` within
the ``examples/`` directory, where we can make use of the ``bulk_insert``
suite to benchmark INSERTs of many rows using both Core and ORM in different
ways.
The :class:`_orm.WriteOnlyCollection` also integrates with the new
:ref:`ORM bulk dml <change_8360>` features, including support for bulk INSERT
and UPDATE/DELETE with WHERE criteria, all including RETURNING support as
well. See the complete documentation at :ref:`write_only_relationship`.
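
A brief sketch of the write-only pattern; ``Base``, ``Entry``, ``some_account``
and ``session`` are assumed to exist and the names are invented for
illustration::

    from sqlalchemy.orm import Mapped, WriteOnlyMapped, mapped_column, relationship

    class Account(Base):
        __tablename__ = "account"

        id: Mapped[int] = mapped_column(primary_key=True)

        # the collection is never loaded implicitly; changes are staged for flush
        entries: WriteOnlyMapped["Entry"] = relationship()

    # stage a new row to be INSERTed at flush time
    some_account.entries.add(Entry(amount=10))

    # read rows explicitly using a SELECT we construct ourselves
    entries = session.scalars(some_account.entries.select().limit(10)).all()
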
.. seealso::
:ref:`write_only_relationship`
New pep-484 / type annotated mapping support for Dynamic Relationships
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. seealso::
:ref:`dynamic_relationship`
:ticket:`7123`
-----------------------------------------------------------------------------
The internal system by which :class:`.Table` objects and their components are
:ref:`reflected <metadata_reflection>` has been completely rearchitected to
allow high performance bulk reflection of thousands of tables at once for
participating dialects. Currently, the **PostgreSQL** and **Oracle** dialects
participate in the new architecture, where the PostgreSQL dialect can now
.. seealso::
:ref:`postgresql_psycopg`
.. _ticket_8054:
.. seealso::
:ref:`oracledb`
.. _ticket_7631:
.. seealso::
:ref:`pysqlite_threading_pooling`
:ticket:`7490`
:data:`.func` to generate PostgreSQL-specific functions and
:meth:`.Operators.bool_op` (a boolean-typed version of :meth:`.Operators.op`)
to generate arbitrary operators, in the same manner as they are available
in previous versions. See the examples at :ref:`postgresql_match`.
Existing SQLAlchemy projects that make use of PG-specific directives within
:meth:`.Operators.match` should make use of ``func.to_tsquery()`` directly.
To render SQL in exactly the same form as would be present
in 1.4, see the version note at :ref:`postgresql_simple_match`.
.. seealso::
:ref:`SQLite Transaction Isolation <sqlite_isolation_level>`

:ref:`PostgreSQL Transaction Isolation <postgresql_isolation_level>`

:ref:`MySQL Transaction Isolation <mysql_isolation_level>`

:ref:`SQL Server Transaction Isolation <mssql_isolation_level>`

:ref:`Oracle Transaction Isolation <oracle_isolation_level>`
:ref:`session_transaction_isolation` - for the ORM
values into account for individual objects.
To use a single :class:`_orm.Session` with multiple ``schema_translate_map``
configurations, the :ref:`horizontal_sharding_toplevel` extension may
be used. See the example at :ref:`examples_sharding`.
.. _sql_caching:
For a series of examples of "lambda" caching with performance comparisons,
see the "short_selects" test suite within the :ref:`examples_performance`
performance example.
.. _engine_insertmanyvalues:
execution, which occurs when passing a list of dictionaries to the
:paramref:`_engine.Connection.execute.parameters` parameter of the
:meth:`_engine.Connection.execute` or :meth:`_orm.Session.execute` methods (as
well as equivalent methods under :ref:`asyncio <asyncio_toplevel>` and
shorthand methods like :meth:`_orm.Session.scalars`). It also takes place
within the ORM :term:`unit of work` process when using methods such as
:meth:`_orm.Session.add` and :meth:`_orm.Session.add_all` to add rows.
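
A minimal, self-contained sketch (table and column names invented) of the
list-of-dictionaries calling style that this feature optimizes::

    from sqlalchemy import Column, Integer, MetaData, String, Table, create_engine, insert

    metadata = MetaData()
    user_table = Table(
        "user_account",
        metadata,
        Column("id", Integer, primary_key=True),
        Column("name", String(50)),
    )

    engine = create_engine("sqlite://")
    metadata.create_all(engine)

    with engine.begin() as conn:
        # a list of dictionaries invokes "executemany" style execution
        conn.execute(insert(user_table), [{"name": "spongebob"}, {"name": "sandy"}])
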
The :class:`_schema.Table` is the SQLAlchemy Core construct that allows one to define
table metadata, which among other things can be used by the SQLAlchemy ORM
as a target to map a class. The :ref:`Declarative <declarative_toplevel>`
extension allows the :class:`_schema.Table` object to be created automatically, given
the contents of the table primarily as a mapping of :class:`_schema.Column` objects.
To apply table-level constraint objects such as :class:`_schema.ForeignKeyConstraint`
to a table defined using Declarative, use the ``__table_args__`` attribute,
described at :ref:`declarative_table_args`.
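
A short sketch of the ``__table_args__`` approach; the classes and the
referenced ``user_account`` table are invented for illustration::

    from sqlalchemy import ForeignKeyConstraint, UniqueConstraint
    from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

    class Base(DeclarativeBase):
        pass

    class Address(Base):
        __tablename__ = "address"
        __table_args__ = (
            ForeignKeyConstraint(["user_id"], ["user_account.id"]),
            UniqueConstraint("email_address"),
        )

        id: Mapped[int] = mapped_column(primary_key=True)
        user_id: Mapped[int]
        email_address: Mapped[str]
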
.. _constraint_naming_conventions:
.. seealso::
:ref:`mutable_toplevel`
Dealing with Comparison Operations
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
additional in-Python behaviors, including types based on
:class:`.TypeDecorator` as well as other user-defined subclasses of datatypes,
do not have any representation within a database schema. When using database
reflection, the introspection features described at :ref:`metadata_reflection`, SQLAlchemy
makes use of a fixed mapping which links the datatype information reported by a
database server to a SQLAlchemy datatype object. For example, if we look
inside of a PostgreSQL schema at the definition for a particular database
taken to allow this.
The most straightforward is to override specific columns as described at
:ref:`reflection_overriding_columns`. In this technique, we simply
use reflection in combination with explicit :class:`_schema.Column` objects for those
columns for which we want to use a custom or decorated datatype::
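
    # illustrative sketch only: ``MyEpochType`` stands in for a custom
    # TypeDecorator and ``some_engine`` for an existing Engine; the remaining
    # columns are filled in by reflection
    t = Table(
        "some_table",
        metadata,
        Column("id", Integer, primary_key=True),
        Column("timestamp", MyEpochType),
        autoload_with=some_engine,
    )
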
# PyMySQL
engine = create_engine("mysql+pymysql://scott:tiger@localhost/foo")
More notes on connecting to MySQL at :ref:`mysql_toplevel`.
Oracle
^^^^^^^^^^
engine = create_engine("oracle+cx_oracle://scott:tiger@tnsname")
More notes on connecting to Oracle at :ref:`oracle_toplevel`.
Microsoft SQL Server
^^^^^^^^^^^^^^^^^^^^
# pymssql
engine = create_engine("mssql+pymssql://scott:tiger@hostname:port/dbname")
More notes on connecting to SQL Server at :ref:`mssql_toplevel`.
SQLite
^^^^^^^
engine = create_engine("sqlite://")
More notes on connecting to SQLite at :ref:`sqlite_toplevel`.
Others
^^^^^^
.. note::
:class:`.QueuePool` is not used by default for SQLite engines. See
:ref:`sqlite_toplevel` for details on SQLite connection pool usage.
For more information on connection pooling, see :ref:`pooling_toplevel`.
.. seealso::
:ref:`mssql_pyodbc_access_tokens` - a more concrete example involving
SQL Server
Modifying the DBAPI connection after connect, or running commands after connect
.. seealso::
:ref:`multipart_schema_names` - describes use of dotted schema names
with the SQL Server dialect.
:ref:`metadata_reflection_schemas`
.. _schema_metadata_schema_name:
.. seealso::
:ref:`postgresql_alternate_search_path` - in the :ref:`postgresql_toplevel` dialect documentation.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The schema feature of SQLAlchemy interacts with the table reflection
feature introduced at :ref:`metadata_reflection_toplevel`. See the section
:ref:`metadata_reflection_schemas` for additional details on how this works.
Backend-Specific Options
.. seealso::
* :ref:`mssql_reset_on_return` - in the :ref:`mssql_toplevel` documentation
* :ref:`postgresql_reset_on_return` in the :ref:`postgresql_toplevel` documentation
as handled by the ORM will not automatically detect in-place changes to
a particular list value; to update list values with the ORM, either re-assign
a new list to the attribute, or use the :class:`.MutableList`
type modifier. See the section :ref:`mutable_toplevel` for background.
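
A rough sketch of the ``MutableList`` approach, shown here against a plain
ARRAY column rather than a multirange type; the class and column names are
invented for illustration::

    from typing import List

    from sqlalchemy import Integer
    from sqlalchemy.dialects.postgresql import ARRAY
    from sqlalchemy.ext.mutable import MutableList
    from sqlalchemy.orm import Mapped, mapped_column

    class EventLog(Base):  # Base is assumed to be an existing Declarative base
        __tablename__ = "event_log"

        id: Mapped[int] = mapped_column(primary_key=True)

        # in-place mutations such as .append() are now tracked and flushed
        event_ids: Mapped[List[int]] = mapped_column(
            MutableList.as_mutable(ARRAY(Integer))
        )
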
The available multirange datatypes are as follows:
:ref:`session_committing` - background on session commit
:ref:`session_expire` - background on attribute expiry
.. _error_7s2a:
that all ORM annotations must make use of a generic container called
:class:`_orm.Mapped` to be properly annotated. Legacy SQLAlchemy mappings which
include explicit :pep:`484` typing annotations, such as those which use the
:ref:`legacy Mypy extension <mypy_toplevel>` for typing support, may include
directives such as those for :func:`_orm.relationship` that don't include this
generic.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
This warning occurs when using the SQLAlchemy ORM Mapped Dataclasses feature
described at :ref:`orm_declarative_native_dataclasses` in conjunction with
any mixin class or abstract base that is not itself declared as a
dataclass, such as in the example below::
.. seealso::
:ref:`orm_declarative_native_dataclasses` - SQLAlchemy dataclasses documentation
`Python dataclasses <dataclasses_>`_ - on the python.org website
.. seealso::
:ref:`asyncio_toplevel`
.. _error_xd2s:
.. seealso::
:ref:`asyncio_orm_avoid_lazyloads` - covers most ORM scenarios where
this problem can occur and how to mitigate.
.. _error_xd3s:
.. seealso::
:ref:`asyncio_inspector` - additional examples of using :func:`_sa.inspect`
with the asyncio extension.
.. seealso::
:ref:`pysqlite_threading_pooling` - info on PySQLite's behavior.
.. _faq_dbapi_connection:
pip install sqlalchemy[asyncio]
For more background, see :ref:`asyncio_install`.
.. seealso::
:ref:`asyncio_install`
for user, address in session.execute(select(u_b, a_b).join(User.addresses)):
...
* Use result caching - see :ref:`examples_caching` for an in-depth example
of this.
* Consider a faster interpreter like that of PyPy.
.. seealso::
:ref:`examples_performance` - a suite of performance demonstrations
with bundled profiling capabilities.
I'm inserting 400,000 rows with the ORM and it's really slow!
2. We tell our :class:`.Session` to re-read rows that it has already read,
either when we next query for them using :meth:`.Session.expire_all`
or :meth:`.Session.expire`, or immediately on an object using
:meth:`.Session.refresh`. See :ref:`session_expire` for detail on this;
a short sketch of this and the next option follows the list.
3. We can run whole queries while setting them to definitely overwrite
already-loaded objects as they read rows by using "populate existing".
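
A compact sketch of options 2 and 3 above, assuming an existing ``session``
and a mapped ``User`` class::

    from sqlalchemy import select

    # option 2: mark objects as expired so the next access re-SELECTs them
    session.expire_all()        # everything currently in the Session
    session.expire(some_user)   # a single object
    session.refresh(some_user)  # expire and re-SELECT immediately

    # option 3: overwrite already-loaded objects as rows are fetched
    stmt = select(User).execution_options(populate_existing=True)
    for user in session.scalars(stmt):
        ...
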
.. seealso::
:ref:`metadata_reflection_toplevel` - complete background on
database reflection.
:ref:`orm_declarative_reflected` - background on integrating
.. seealso::
:ref:`mapper_version_counter` - SQLAlchemy's built-in version id feature.

:ref:`examples_versioning` - other examples of mappings that version rows
temporally.
registry
:doc:`orm/queryguide/relationships` - includes information on lazy
loading of ORM related objects
:ref:`asyncio_orm_avoid_lazyloads` - tips on avoiding lazy loading
when using the :ref:`asyncio_toplevel` extension
eager load
eager loads
.. seealso::
:ref:`session_expire`
Session
The container or scope for ORM database operations. Sessions
.. seealso::
:ref:`session_object_states`
pending
This describes one of the major object states which
.. seealso::
:ref:`session_object_states`
deleted
This describes one of the major object states which
.. seealso::
:ref:`session_object_states`
persistent
This describes one of the major object states which
.. seealso::
:ref:`session_object_states`
detached
This describes one of the major object states which
.. seealso::
:ref:`session_object_states`
attached
Indicates an ORM object that is presently associated with a specific
.. seealso::
:ref:`session_object_states`
* **Using the ORM:**
:doc:`Using the ORM Session <orm/session>` |
:doc:`ORM Querying Guide <orm/queryguide/index>` |
:doc:`Using AsyncIO <orm/extensions/asyncio>`
* **Configuration Extensions:**
:doc:`Association Proxy <orm/extensions/associationproxy>` |
:doc:`Hybrid Attributes <orm/extensions/hybrid>` |
:doc:`Mutable Scalars <orm/extensions/mutable>` |
:doc:`Automap <orm/extensions/automap>` |
:doc:`All extensions <orm/extensions/index>`
* **Extending the ORM:**
:doc:`ORM Events and Internals <orm/extending>`
* **Other:**
:doc:`Introduction to Examples <orm/examples>`
.. container:: core
* **Engines, Connections, Pools:**
:doc:`Engine Configuration <core/engines>` |
:doc:`Connections, Transactions, Results <core/connections>` |
:doc:`AsyncIO Support <orm/extensions/asyncio>` |
:doc:`Connection Pooling <core/pooling>`
* **Schema Definition:**
:doc:`Overview <core/schema>` |
:ref:`Tables and Columns <metadata_describing_toplevel>` |
:ref:`Database Introspection (Reflection) <metadata_reflection_toplevel>` |
:ref:`Insert/Update Defaults <metadata_defaults_toplevel>` |
:ref:`Constraints and Indexes <metadata_constraints_toplevel>` |
:ref:`Using Data Definition Language (DDL) <metadata_ddl_toplevel>`
:doc:`Operator Reference <core/operators>` |
:doc:`SELECT and related constructs <core/selectable>` |
:doc:`INSERT, UPDATE, DELETE <core/dml>` |
:doc:`SQL Functions <core/functions>` |
:doc:`Table of Contents <core/expression_api>`
* **Core Basics:**
:doc:`Overview <core/api_basics>` |
:doc:`Runtime Inspection API <core/inspection>` |
:doc:`Event System <core/event>` |
:doc:`Core Event Interfaces <core/events>` |
:doc:`Creating Custom SQL Constructs <core/compiler>`
This section describes notes, options, and usage patterns regarding individual dialects.
:doc:`PostgreSQL <dialects/postgresql>` |
:doc:`MySQL <dialects/mysql>` |
:doc:`SQLite <dialects/sqlite>` |
:doc:`Oracle <dialects/oracle>` |
:doc:`Microsoft SQL Server <dialects/mssql>`
:doc:`More Dialects ... <dialects/index>`
Working code examples, mostly regarding the ORM, are included in the
SQLAlchemy distribution. A description of all the included example
applications is at :ref:`examples_toplevel`.
There is also a wide variety of examples involving both core SQLAlchemy
constructs as well as the ORM on the wiki. See
`greenlet <https://pypi.org/project/greenlet/>`_ project. This dependency
will be installed by default on common machine platforms, however is not
supported on every architecture and also may not install by default on
less common architectures. See the section :ref:`asyncio_install` for
additional details on ensuring asyncio support is present.
Supported Installation Methods
To enhance the association object pattern such that direct
access to the ``Association`` object is optional, SQLAlchemy
provides the :ref:`associationproxy_toplevel` extension. This
extension allows the configuration of attributes which will
access two "hops" with a single access, one "hop" to the
associated object, and a second to a target attribute.
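
A condensed sketch of how such a proxy is declared; all class and attribute
names here are invented for illustration::

    from typing import List

    from sqlalchemy.ext.associationproxy import AssociationProxy, association_proxy
    from sqlalchemy.orm import Mapped, relationship

    class User(Base):  # Base and the other mapped classes are assumed to exist
        __tablename__ = "user_account"

        # ... column mappings omitted ...

        user_keyword_associations: Mapped[List["UserKeywordAssociation"]] = relationship()

        # one "hop" to UserKeywordAssociation, a second to its .keyword attribute
        keywords: AssociationProxy[List["Keyword"]] = association_proxy(
            "user_keyword_associations", "keyword"
        )
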
.. seealso::
:ref:`associationproxy_toplevel` - allows direct "many to many" style
access between parent and child for a three-class association object mapping.
.. warning::
Avoid mixing the association object pattern with the :ref:`many-to-many <relationships_many_to_many>`
pattern directly, as this produces conditions where data may be read
and written in an inconsistent fashion without special steps;
the :ref:`association proxy <associationproxy_toplevel>` is typically
used to provide more succinct access. For more detailed background
on the caveats introduced by this combination, see the next section
:ref:`association_pattern_w_m2m`.
``Parent.children`` and ``Child.parents`` relationships are replaced with
an extension that will transparently proxy through the ``Association``
class, while keeping everything consistent from the ORM's point of
view. This extension is known as the :ref:`Association Proxy <associationproxy_toplevel>`.
.. seealso::
:ref:`associationproxy_toplevel` - allows direct "many to many" style
access between parent and child for a three-class association object mapping.
.. _orm_declarative_relationship_eval:
.. warning:: The ``all`` cascade option implies the
:ref:`cascade_refresh_expire`
cascade setting which may not be desirable when using the
:ref:`asyncio_toplevel` extension, as it will expire related objects
more aggressively than is typically appropriate in an explicit IO context.
See the notes at :ref:`asyncio_orm_avoid_lazyloads` for further background.
The list of available values which can be specified for
the :paramref:`_orm.relationship.cascade` parameter are described in the following subsections.
support ``FOREIGN KEY`` constraints and they must be enforcing:
* When using MySQL, an appropriate storage engine must be
selected. See :ref:`mysql_storage_engines` for details.
* When using SQLite, foreign key support must be enabled explicitly.
See :ref:`sqlite_foreign_keys` for details; a minimal event-based sketch
follows this list.
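
For SQLite, a commonly used event-based sketch (the database path is
illustrative) enables the pragma on each new DBAPI connection::

    from sqlalchemy import create_engine, event

    engine = create_engine("sqlite:///mydatabase.db")

    @event.listens_for(engine, "connect")
    def _set_sqlite_pragma(dbapi_connection, connection_record):
        # issued for every new connection checked out from the pool
        cursor = dbapi_connection.cursor()
        cursor.execute("PRAGMA foreign_keys=ON")
        cursor.close()
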
.. topic:: Notes on Passive Deletes
)
Dictionary mappings are often combined with the "Association Proxy" extension to produce
streamlined dictionary views. See :ref:`proxying_dictionaries` and :ref:`composite_association_proxy`
for examples.
.. _key_collections_mutations:
of validation and simple marshaling. See :ref:`simple_validators`
for an example of this.
For the second use case, the :ref:`associationproxy_toplevel` extension is a
well-tested, widely used system that provides a read/write "view" of a
collection in terms of some attribute present on the target object. As the
target attribute can be a ``@property`` that returns virtually anything, a
"deferred" basis as defined
by the :paramref:`_orm.mapped_column.deferred` keyword. More documentation
on these particular concepts may be found at :ref:`relationship_patterns`,
:ref:`mapper_column_property_sql_expressions`, and :ref:`orm_queryguide_column_deferral`.
Properties may be specified with a declarative mapping as above using
"hybrid table" style as well; the :class:`_schema.Column` objects that
.. seealso::
:ref:`mapper_version_counter` - background on the ORM version counter feature
**Single Table Inheritance**
``__abstract__`` causes declarative to skip the production
of a table or mapper for the class entirely. A class can be added within a
hierarchy in the same way as a mixin (see :ref:`declarative_mixins`), allowing
subclasses to extend just from the special class::
class SomeAbstractBase(Base):
:func:`_orm.mapped_column`.
.. versionchanged:: 2.0 For users coming from the 1.4 series of SQLAlchemy
who may have been using the :ref:`mypy plugin <mypy_toplevel>`, the
:func:`_orm.declarative_mixin` class decorator is no longer needed
to mark declarative mixins, assuming the mypy plugin is no longer in use.
.. seealso::
:ref:`mapping_columns_toplevel` - contains additional notes on affecting
how :class:`_orm.Mapper` interprets incoming :class:`.Column` objects.
.. _orm_declarative_mapped_column:
The :func:`_orm.mapped_column` construct integrates with SQLAlchemy's
"native dataclasses" feature, discussed at
:ref:`orm_declarative_native_dataclasses`. See that section for current
background on additional directives supported by :func:`_orm.mapped_column`.
The above table is ultimately the same one that corresponds to the
:attr:`_orm.Mapper.local_table` attribute, which we can see through the
:ref:`runtime inspection system <inspection_toplevel>`::
from sqlalchemy import inspect
The "imperative table" form is of particular use when the class itself
is using an alternative form of attribute declaration, such as Python
dataclasses. See the section :ref:`orm_declarative_dataclasses` for detail.
.. seealso::
:ref:`metadata_describing`
:ref:`orm_declarative_dataclasses`
.. _orm_imperative_table_column_naming:
* :ref:`maptojoin`
* :ref:`mapper_sql_expressions`
For Declarative Table configuration with :func:`_orm.mapped_column`,
most options are available directly; see the section
There are several patterns available which provide for producing mapped
classes against a series of :class:`_schema.Table` objects that were
introspected from the database, using the reflection process described at
:ref:`metadata_reflection`.
A simple way to map a class to a table reflected from the database is to
use a declarative hybrid mapping, passing the
^^^^^^^^^^^^^^
A more automated solution to mapping against an existing database where table
reflection is to be used is to use the :ref:`automap_toplevel` extension. This
extension will generate entire mapped classes from a database schema, including
relationships between classes based on observed foreign key constraints. While
it includes hooks for customization, such as hooks that allow custom
.. seealso::
:ref:`automap_toplevel`
.. _mapper_automated_reflection_schemes:
__table__ = Table("some_table", Base.metadata, autoload_with=some_engine)
The approach also works with both the :class:`.DeferredReflection` base class
as well as with the :ref:`automap_toplevel` extension. For automap
specifically, see the section :ref:`automap_intercepting_columns` for
background.
.. seealso::
:meth:`_events.DDLEvents.column_reflect`
:ref:`automap_intercepting_columns` - in the :ref:`automap_toplevel` documentation
.. _mapper_primary_key:
even though they may be excluded from the ORM mapping.
"Schema level column defaults" refers to the defaults described at
:ref:`metadata_defaults` including those configured by the
:paramref:`_schema.Column.default`, :paramref:`_schema.Column.onupdate`,
:paramref:`_schema.Column.server_default` and
:paramref:`_schema.Column.server_onupdate` parameters. These constructs
Attribute events are triggered as things occur on individual attributes of
ORM mapped objects. These events form the basis for things like
:ref:`custom validation functions <simple_validators>` as well as
:ref:`backref handlers <relationships_backref>`.
.. seealso::
:ref:`loading_joined_inheritance` - in the :ref:`queryguide_toplevel`
:ref:`examples_inheritance` - complete examples of joined, single and
concrete inheritance
.. _joined_inheritance:
.. autoclass:: Composite
.. autoclass:: CompositeProperty
    :members:
.. autoclass:: AttributeEventToken
:members:
.. warning:: When passed as a Python-evaluable string, the
:paramref:`_orm.relationship.foreign_keys` argument is interpreted using Python's
``eval()`` function. **DO NOT PASS UNTRUSTED INPUT TO THIS STRING**. See
:ref:`declarative_relationship_eval` for details on declarative
evaluation of :func:`_orm.relationship` arguments.
:paramref:`_orm.relationship.primaryjoin` argument is interpreted using
Python's
``eval()`` function. **DO NOT PASS UNTRUSTED INPUT TO THIS STRING**. See
:ref:`declarative_relationship_eval` for details on declarative
evaluation of :func:`_orm.relationship` arguments.
:paramref:`_orm.relationship.primaryjoin` and
:paramref:`_orm.relationship.secondaryjoin` arguments are interpreted using
Python's ``eval()`` function. **DO NOT PASS UNTRUSTED INPUT TO THESE
STRINGS**. See :ref:`declarative_relationship_eval` for details on
declarative evaluation of :func:`_orm.relationship` arguments.
(0, 12, 'address')
{stop}
Read more about Hybrids at :ref:`hybrids_toplevel`.
.. _synonyms:
.. seealso::
:ref:`declarative_inheritance`
:ref:`mixin_inheritance_columns`
.. tip::
The :ref:`orm_declarative_native_dataclasses` feature provides an alternate
means of generating a default ``__init__()`` method by using
Python dataclasses, and allows for a highly configurable constructor
form.
As illustrated in the previous section, the :class:`_orm.Mapper` object is
available from any mapped class, regardless of method, using the
:ref:`core_inspection_toplevel` system. Using the
:func:`_sa.inspect` function, one can acquire the :class:`_orm.Mapper` from a
mapped class::
>>> insp.session
<sqlalchemy.orm.session.Session object at 0x7f07e614f160>
Information about the current :ref:`persistence state <session_object_states>`
for the object::
>>> insp.persistent
Fetching Server-Generated Defaults
===================================
As introduced in the sections :ref:`server_defaults` and :ref:`triggered_columns`,
the Core supports the notion of database columns for which the database
itself generates a value upon INSERT and in less common cases upon UPDATE
statements. The ORM features support for such columns regarding being
.. seealso::
:ref:`mssql_insert_behavior` - background on the SQL Server dialect's
methods of fetching newly generated primary key values
Case 3: non primary key, RETURNING or equivalent is not supported or not needed
.. seealso::
:ref:`metadata_defaults_toplevel`
Notes on eagerly fetching client invoked SQL expressions used for INSERT or UPDATE
-----------------------------------------------------------------------------------
DDL.
SQLAlchemy also supports non-DDL server side defaults, as documented at
:ref:`defaults_client_invoked_sql`; these "client invoked SQL expressions"
are set up using the :paramref:`_schema.Column.default` and
:paramref:`_schema.Column.onupdate` parameters.
tables) across multiple databases. The SQLAlchemy :class:`.Session`
contains support for this concept, however to use it fully requires that
:class:`.Session` and :class:`_query.Query` subclasses are used. A basic version
of these subclasses are available in the :ref:`horizontal_sharding_toplevel`
ORM extension. An example of use is at: :ref:`examples_sharding`.
.. _bulk_operations:
:ref:`faq_session_identity` - in :doc:`/faq/index`
:ref:`session_expire` - in the ORM :class:`_orm.Session`
documentation
.. _orm_queryguide_autoflush:
.. doctest-disable:
.. deepalchemy:: This option is an advanced-use feature mostly intended
to be used with the :ref:`horizontal_sharding_toplevel` extension. For
typical cases of loading objects with identical primary keys from different
"shards" or partitions, consider using individual :class:`_orm.Session`
objects per shard first.
where objects may be loaded from any number of replicas of a particular
database table that nonetheless have overlapping primary key values.
The primary consumer of "identity token" is the
:ref:`horizontal_sharding_toplevel` extension, which supplies a general
framework for persisting objects among multiple "shards" of a particular
database table.
The :class:`_orm.Session` objects above are independent. If we wanted to
persist both objects in one transaction, we would need to use the
:ref:`horizontal_sharding_toplevel` extension to do this.
However, we can illustrate querying for these objects in one session as follows:
The above logic takes place automatically when using the
:ref:`horizontal_sharding_toplevel` extension.
.. versionadded:: 2.0.0rc1 - added the ``identity_token`` ORM level execution
option.
.. seealso::
:ref:`examples_sharding` - in the :ref:`examples_toplevel` section.
See the script ``separate_schema_translates.py`` for a demonstration of
the above use case using the full sharding API.
background and examples.
.. tip:: as noted elsewhere, lazy loading is not available when using
:ref:`asyncio_toplevel`.
Using ``load_only()`` with multiple entities
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. seealso::
:ref:`mapper_column_property_sql_expressions` - in the section
:ref:`mapper_sql_expressions`
:ref:`orm_imperative_table_column_options` - in the section
:ref:`orm_declarative_table_config_toplevel`
The :func:`_orm.with_expression` option is a special option used to
apply SQL expressions to mapped classes dynamically at query time.
For ordinary fixed SQL expressions configured on mappers,
see the section :ref:`mapper_sql_expressions`.
.. _orm_queryguide_with_expression_unions:
The parameter dictionaries contain key/value pairs which may correspond to ORM
mapped attributes that line up with mapped :class:`._schema.Column`
or :func:`_orm.mapped_column` declarations, as well as with
:ref:`composite <mapper_composite>` declarations. The keys should match
the **ORM mapped attribute name** and **not** the actual database column name,
if these two names happen to be different.
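
For example, in the minimal sketch below (the ``User`` class, its attributes,
and the ``session`` are assumed), the keys name the mapped attributes even if
the underlying column names differ::

    from sqlalchemy import insert

    session.execute(
        insert(User),
        [
            {"name": "spongebob", "fullname": "Spongebob Squarepants"},
            {"name": "sandy", "fullname": "Sandy Cheeks"},
        ],
    )
    session.commit()
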
The dialects included with SQLAlchemy that include dialect-specific "upsert"
API features are:
* SQLite - using :class:`_sqlite.Insert` documented at :ref:`sqlite_on_conflict_insert` (a short sketch follows this list)
* PostgreSQL - using :class:`_postgresql.Insert` documented at :ref:`postgresql_insert_on_conflict`
* MySQL/MariaDB - using :class:`_mysql.Insert` documented at :ref:`mysql_insert_on_duplicate_key_update`
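
As a brief sketch of the SQLite form above (table, column, and connection
names are invented), the statement is typically built in two steps so that the
``excluded`` namespace can be referenced::

    from sqlalchemy.dialects.sqlite import insert as sqlite_insert

    stmt = sqlite_insert(user_table).values(id=1, name="spongebob")
    stmt = stmt.on_conflict_do_update(
        index_elements=[user_table.c.id],
        set_={"name": stmt.excluded.name},
    )
    conn.execute(stmt)
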
Users should review the above sections for background on proper construction
of these objects; in particular, the "upsert" method typically needs to
subclass-specific attributes, as this would be an example of the
:term:`N plus one` problem that emits additional SQL per row. This additional SQL can
impact performance and also be incompatible with approaches such as
using :ref:`asyncio <asyncio_toplevel>`. Additionally, in our query for
``Employee`` objects, since the query is against the base table only, we did
not have a way to add SQL criteria involving subclass-specific attributes in
terms of ``Manager`` or ``Engineer``. The next two sections detail two
:meth:`.WriteOnlyCollection.add_all` and :meth:`.WriteOnlyCollection.remove`
methods. Querying the collection is performed by invoking a SELECT statement
which is constructed using the :meth:`.WriteOnlyCollection.select`
method. Write only loading is discussed at :ref:`write_only_relationship`.
* **dynamic loading** - available via ``lazy='dynamic'``, or by
annotating the left side of the :class:`_orm.Relationship` object using the
contents. However, dynamic loaders will implicitly iterate the underlying
collection in various circumstances which makes them less useful for managing
truly large collections. Dynamic loaders are superseded by
:ref:`"write only" <write_only_relationship>` collections, which will prevent
the underlying collection from being implicitly loaded under any
circumstances. Dynamic loaders are discussed at :ref:`dynamic_relationship`.
.. _relationship_lazy_option:
Finally, the above example classes include a ``__repr__()`` method, which is
not required but is useful for debugging. Mapped classes can be created with
methods such as ``__repr__()`` generated automatically, using dataclasses. More
on dataclass mapping at :ref:`orm_declarative_native_dataclasses`.
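
A minimal sketch of the dataclass-enabled form; the class and column names are
illustrative::

    from typing import Optional

    from sqlalchemy.orm import DeclarativeBase, Mapped, MappedAsDataclass, mapped_column

    class Base(MappedAsDataclass, DeclarativeBase):
        """Mapped subclasses of this base are also Python dataclasses."""

    class User(Base):
        __tablename__ = "user_account"

        id: Mapped[int] = mapped_column(init=False, primary_key=True)
        name: Mapped[str]
        fullname: Mapped[Optional[str]] = mapped_column(default=None)

    # __init__() and __repr__() are generated by the dataclass machinery
    print(User(name="sandy"))
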
Create an Engine
The :class:`_engine.Engine` is a **factory** that can create new
database connections for us, which also holds onto connections inside
of a :ref:`Connection Pool <pooling_toplevel>` for fast reuse. For learning
-purposes, we normally use a :ref:`SQLite <sqlite_toplevel>` memory-only database
+purposes, we normally use a ref_sqlite_toplevel memory-only database
for convenience::
>>> from sqlalchemy import create_engine
>>> engine = create_engine("sqlite+pysqlite:///:memory:", echo=True)
which, on a MySQL backend, ensures that the ``InnoDB`` engine supporting
referential integrity is used. When using SQLite, referential integrity
should be enabled, using the configuration described at
-:ref:`sqlite_foreign_keys`.
+ref_sqlite_foreign_keys.
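That configuration typically takes the form of an event listener that emits the
pragma on each new DBAPI connection; a sketch::

    from sqlalchemy import event
    from sqlalchemy.engine import Engine

    @event.listens_for(Engine, "connect")
    def set_sqlite_pragma(dbapi_connection, connection_record):
        cursor = dbapi_connection.cursor()
        cursor.execute("PRAGMA foreign_keys=ON")
        cursor.close()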
.. seealso::
- :ref:`examples_adjacencylist` - working example
+ ref_examples_adjacencylist - working example
Composite Adjacency Lists
~~~~~~~~~~~~~~~~~~~~~~~~~
Further discussion on the refresh / expire concept can be found at
-:ref:`session_expire`.
+ref_session_expire.
.. seealso::
- :ref:`session_expire`
+ ref_session_expire
:ref:`faq_session_identity`
global object from which everyone consults as a "registry" of objects.
That's more the job of a **second level cache**. SQLAlchemy provides
a pattern for implementing second level caching using `dogpile.cache <https://dogpilecache.readthedocs.io/>`_,
-via the :ref:`examples_caching` example.
+via the ref_examples_caching example.
How can I get the :class:`~sqlalchemy.orm.session.Session` for a certain object?
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
session = Session.object_session(someobject)
-The newer :ref:`core_inspection_toplevel` system can also be used::
+The newer ref_core_inspection_toplevel system can also be used::
from sqlalchemy import inspect
session = inspect(someobject).session
The :class:`.Session` should be used in such a way that one
instance exists for a single series of operations within a single
transaction. One expedient way to get this effect is by associating
-a :class:`.Session` with the current thread (see :ref:`unitofwork_contextual`
+a :class:`.Session` with the current thread (see ref_unitofwork_contextual
for background). Another is to use a pattern
where the :class:`.Session` is passed between functions and is otherwise
not shared with other threads.
lazy loads, selectinloads, etc.
For a series of classes that all feature some common column structure,
-if the classes are composed using a :ref:`declarative mixin <declarative_mixins>`,
+if the classes are composed using a ref_declarative_mixins,
the mixin class itself may be used in conjunction with the :func:`_orm.with_loader_criteria`
option by making use of a Python lambda. The Python lambda will be invoked at
query compilation time against the specific entities which match the criteria.
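For instance, a sketch using a hypothetical ``HasSoftDelete`` mixin with a ``deleted``
column, applied to a hypothetical ``Order`` entity::

    from sqlalchemy import select
    from sqlalchemy.orm import with_loader_criteria

    stmt = select(Order).options(
        with_loader_criteria(
            HasSoftDelete,
            lambda cls: cls.deleted == False,
            include_aliases=True,
        )
    )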
.. seealso::
- :ref:`examples_session_orm_events` - includes working examples of the
+ ref_examples_session_orm_events - includes working examples of the
above :func:`_orm.with_loader_criteria` recipes.
.. _do_orm_execute_re_executing:
function is used to merge the "frozen" data from the result object into the
current session.
-The above example is implemented as a complete example in :ref:`examples_caching`.
+The above example is implemented as a complete example in ref_examples_caching.
The :meth:`_orm.ORMExecuteState.invoke_statement` method may also be called
multiple times, passing along different information to the
:class:`_engine.Engine` objects each time. This will return a different
:class:`_engine.Result` object each time; these results can be merged together
using the :meth:`_engine.Result.merge` method. This is the technique employed
-by the :ref:`horizontal_sharding_toplevel` extension; see the source code to
+by the ref_horizontal_sharding_toplevel extension; see the source code to
familiarize.
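In outline (a sketch only), such a handler hooks into
:meth:`_orm.SessionEvents.do_orm_execute` and returns the :class:`_engine.Result`
it produces::

    from sqlalchemy import event
    from sqlalchemy.orm import Session

    @event.listens_for(Session, "do_orm_execute")
    def _route_statement(orm_execute_state):
        if orm_execute_state.is_select:
            # re-invoke the statement; the Result returned here is used
            # in place of the default execution
            return orm_execute_state.invoke_statement()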
.. seealso::
- :ref:`examples_caching`
+ ref_examples_caching
- :ref:`examples_sharding`
+ ref_examples_sharding
where something will be happening.
For illustrations of :meth:`.SessionEvents.before_flush`, see
-examples such as :ref:`examples_versioned_history` and
-:ref:`examples_versioned_rows`.
+examples such as ref_examples_versioned_history and
+ref_examples_versioned_rows.
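A skeletal :meth:`.SessionEvents.before_flush` hook, for orientation, might look like::

    from sqlalchemy import event
    from sqlalchemy.orm import Session

    @event.listens_for(Session, "before_flush")
    def receive_before_flush(session, flush_context, instances):
        for obj in session.dirty:
            # inspect or adjust modified objects before the flush proceeds
            ...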
``after_flush()``
^^^^^^^^^^^^^^^^^
-----------------------
Another use case for events is to track the lifecycle of objects. This
-refers to the states first introduced at :ref:`session_object_states`.
+refers to the states first introduced at ref_session_object_states.
All the states above can be tracked fully with events. Each event
represents a distinct state transition, meaning, the starting state
uses these hooks behind the scenes; see :ref:`simple_validators` for
background on this. The attribute events are also behind the mechanics
of backreferences. An example illustrating use of attribute events
-is in :ref:`examples_instrumentation`.
+is in ref_examples_instrumentation.
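A minimal attribute-event sketch (``MyClass.data`` is a hypothetical mapped attribute)::

    from sqlalchemy import event

    @event.listens_for(MyClass.data, "set", retval=True)
    def receive_set(target, value, oldvalue, initiator):
        # intercept the value being assigned; returning it (retval=True)
        # allows the value to be replaced if desired
        return value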
.. seealso::
- :ref:`postgresql_table_valued_overview` - in the :ref:`postgresql_toplevel` documentation.
+ ref_postgresql_table_valued_overview - in the :ref:`postgresql_toplevel` documentation.
While many databases support table valued and other special
forms, PostgreSQL tends to be where there is the most demand for these
.. seealso::
- :ref:`postgresql_table_valued` - in the :ref:`postgresql_toplevel` documentation -
+ ref_postgresql_table_valued - in the :ref:`postgresql_toplevel` documentation -
this section will detail additional syntaxes such as special column derivations
and "WITH ORDINALITY" that are known to work with PostgreSQL.
.. seealso::
- :ref:`postgresql_column_valued` - in the :ref:`postgresql_toplevel` documentation.
+ ref_postgresql_column_valued - in the :ref:`postgresql_toplevel` documentation.
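For orientation, the generic table-valued form looks roughly like the following
(the JSON literal is only an illustration)::

    from sqlalchemy import func, select

    onetwothree = func.json_each('["one", "two", "three"]').table_valued("value")
    stmt = select(onetwothree).where(onetwothree.c.value.in_(["two", "three"]))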
.. _tutorial_casts:
* "rowcount" is used by the ORM :term:`unit of work` process to validate that
an UPDATE or DELETE statement matched the expected number of rows, and is
also essential for the ORM versioning feature documented at
- :ref:`mapper_version_counter`.
+ ref_mapper_version_counter.
Using RETURNING with UPDATE, DELETE
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
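A sketch of an UPDATE with RETURNING (assuming a ``user_table`` :class:`_schema.Table`
and a ``conn`` connection; RETURNING support depends on the backend in use)::

    from sqlalchemy import update

    stmt = (
        update(user_table)
        .where(user_table.c.name == "patrick")
        .values(fullname="Patrick the Star")
        .returning(user_table.c.id, user_table.c.fullname)
    )
    result = conn.execute(stmt)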
To automatically generate a full-featured ``__init__()`` method which
provides for positional arguments as well as arguments with default keyword
values, the dataclasses feature introduced at
- :ref:`orm_declarative_native_dataclasses` may be used. It's of course
+ ref_orm_declarative_native_dataclasses may be used. It's of course
always an option to use an explicit ``__init__()`` method as well.
* The ``__repr__()`` methods are added so that we get a readable string output;
there's no requirement for these methods to be here. As is the case
with ``__init__()``, a ``__repr__()`` method
can be generated automatically by using the
- :ref:`dataclasses <orm_declarative_native_dataclasses>` feature.
+ ref_orm_declarative_native_dataclasses feature.
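A minimal sketch of that dataclass form::

    from sqlalchemy.orm import DeclarativeBase, Mapped, MappedAsDataclass, mapped_column

    class Base(MappedAsDataclass, DeclarativeBase):
        pass

    class User(Base):
        __tablename__ = "user_account"

        id: Mapped[int] = mapped_column(init=False, primary_key=True)
        name: Mapped[str]

    # __init__() and __repr__() are generated by the dataclass machinery
    spongebob = User(name="spongebob")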
.. topic:: Where'd the old Declarative go?
checkers such as Mypy and Pyright, without the need for plugins. Secondly,
deriving the declarations from type annotations is part of SQLAlchemy's
integration with Python dataclasses, which can now be
- :ref:`generated natively <orm_declarative_native_dataclasses>` from mappings.
+ ref_orm_declarative_native_dataclasses from mappings.
For users who like the "old" way, but still desire their IDEs to not
mistakenly report typing errors for their declarative mappings, the
.. seealso::
- Read more about table and schema reflection at :ref:`metadata_reflection_toplevel`.
+ Read more about table and schema reflection at ref_metadata_reflection_toplevel.
For ORM-related variants of table reflection, the section
:ref:`orm_declarative_reflected` includes an overview of the available
SQLAlchemy’s continuous integration. As of September, 2021 the driver
appears to be unmaintained and no longer functions for Python version 3.10,
and additionally depends on a significantly outdated version of PyMySQL.
- Please refer to the :ref:`asyncmy` dialect for current MySQL/MariaDB asyncio
+ Please refer to the ref_asyncmy dialect for current MySQL/MariaDB asyncio
functionality.
The aiomysql dialect is SQLAlchemy's second Python asyncio dialect.
Using a special asyncio mediation layer, the aiomysql dialect is usable
-as the backend for the :ref:`SQLAlchemy asyncio <asyncio_toplevel>`
+as the backend for the ref_asyncio_toplevel
extension package.
This dialect should normally be used only with the
:func:`_asyncio.create_async_engine` engine creation function.
:url: https://github.com/long2ice/asyncmy
.. note:: The asyncmy dialect as of September, 2021 was added to provide
- MySQL/MariaDB asyncio compatibility given that the :ref:`aiomysql` database
+ MySQL/MariaDB asyncio compatibility given that the ref_aiomysql database
driver has become unmaintained, however asyncmy is itself very new.
Using a special asyncio mediation layer, the asyncmy dialect is usable
-as the backend for the :ref:`SQLAlchemy asyncio <asyncio_toplevel>`
+as the backend for the ref_asyncio_toplevel
extension package.
This dialect should normally be used only with the
:func:`_asyncio.create_async_engine` engine creation function.
available.
* INSERT..ON DUPLICATE KEY UPDATE: See
- :ref:`mysql_insert_on_duplicate_key_update`
+ ref_mysql_insert_on_duplicate_key_update
* SELECT pragma, use :meth:`_expression.Select.prefix_with` and
:meth:`_query.Query.prefix_with`::
.. seealso::
- :ref:`mysql_storage_engines`
+ ref_mysql_storage_engines
.. _mysql_unique_constraints:
.. seealso::
- :ref:`mysql_insert_on_duplicate_key_update` - example of how
+ ref_mysql_insert_on_duplicate_key_update - example of how
to use :attr:`_expression.Insert.inserted`
"""
.. seealso::
- :ref:`mysql_insert_on_duplicate_key_update`
+ ref_mysql_insert_on_duplicate_key_update
"""
if args and kw:
The asyncpg dialect is SQLAlchemy's first Python asyncio dialect.
Using a special asyncio mediation layer, the asyncpg dialect is usable
-as the backend for the :ref:`SQLAlchemy asyncio <asyncio_toplevel>`
+as the backend for the ref_asyncio_toplevel
extension package.
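Typical usage is through :func:`_asyncio.create_async_engine`; a sketch with a
hypothetical database URL::

    from sqlalchemy import text
    from sqlalchemy.ext.asyncio import create_async_engine

    engine = create_async_engine("postgresql+asyncpg://scott:tiger@localhost/test")

    async def check_connection():
        async with engine.connect() as conn:
            result = await conn.execute(text("SELECT 1"))
            print(result.scalar())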
This dialect should normally be used only with the
:func:`_asyncio.create_async_engine` engine creation function.
:ref:`dbapi_autocommit`
- :ref:`postgresql_readonly_deferrable`
+ ref_postgresql_readonly_deferrable
:ref:`psycopg2_isolation_level`
attribute set up.
The PostgreSQL dialect can reflect tables from any schema, as outlined in
-:ref:`metadata_reflection_schemas`.
+ref_metadata_reflection_schemas.
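For example (the schema and table names are hypothetical; ``engine`` is assumed)::

    from sqlalchemy import MetaData, Table

    metadata_obj = MetaData()
    account_table = Table(
        "account",
        metadata_obj,
        schema="finance",
        autoload_with=engine,
    )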
With regards to tables which these :class:`_schema.Table`
objects refer to via foreign key constraint, a decision must be made as to how
.. seealso::
- :ref:`postgresql_isolation_level`
+ ref_postgresql_isolation_level
.. _postgresql_index_reflection:
.. seealso::
- :ref:`postgresql_insert_on_conflict` - example of how
+ ref_postgresql_insert_on_conflict - example of how
to use :attr:`_expression.Insert.excluded`
"""
.. seealso::
- :ref:`postgresql_insert_on_conflict`
+ ref_postgresql_insert_on_conflict
"""
self._post_values_clause = OnConflictDoUpdate(
.. seealso::
- :ref:`postgresql_insert_on_conflict`
+ ref_postgresql_insert_on_conflict
"""
self._post_values_clause = OnConflictDoNothing(
:param ops:
Optional dictionary. Used to define operator classes for the
elements; works the same way as that of the
- :ref:`postgresql_ops <postgresql_operator_classes>`
+ ref_postgresql_operator_classes
parameter specified to the :class:`_schema.Index` construct.
.. versionadded:: 1.3.21
.. seealso::
- :ref:`postgresql_operator_classes` - general description of how
+ ref_postgresql_operator_classes - general description of how
PostgreSQL operator classes are specified.
"""
.. seealso::
- :ref:`postgresql_isolation_level`
+ ref_postgresql_isolation_level
:ref:`psycopg2_isolation_level`
.. seealso::
- :ref:`psycopg2_executemany_mode`
+ ref_psycopg2_executemany_mode
.. tip::
Psycopg2 Transaction Isolation Level
-------------------------------------
-As discussed in :ref:`postgresql_isolation_level`,
+As discussed in ref_postgresql_isolation_level,
all PostgreSQL dialects support setting of transaction isolation level
via the ``isolation_level`` parameter passed to :func:`_sa.create_engine`.
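For example, a sketch using a hypothetical local database::

    from sqlalchemy import create_engine

    engine = create_engine(
        "postgresql+psycopg2://scott:tiger@localhost/test",
        isolation_level="REPEATABLE READ",
    )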
.. seealso::
- :ref:`postgresql_isolation_level`
+ ref_postgresql_isolation_level
:ref:`pg8000_isolation_level`
.. seealso::
- :ref:`postgresql_match`
+ ref_postgresql_match
"""
interface that's useful for testing and prototyping purposes.
Using a special asyncio mediation layer, the aiosqlite dialect is usable
-as the backend for the :ref:`SQLAlchemy asyncio <asyncio_toplevel>`
+as the backend for the ref_asyncio_toplevel
extension package.
This dialect should normally be used only with the
:func:`_asyncio.create_async_engine` engine creation function.
SQLite's transactional scope is impacted by unresolved
issues in the pysqlite driver, which defers BEGIN statements to a greater
- degree than is often feasible. See the section :ref:`pysqlite_serializable`
+ degree than is often feasible. See the section ref_pysqlite_serializable
for techniques to work around this behavior.
.. seealso::
SQLite's SAVEPOINT feature is impacted by unresolved
issues in the pysqlite driver, which defers BEGIN statements to a greater
- degree than is often feasible. See the section :ref:`pysqlite_serializable`
+ degree than is often feasible. See the section ref_pysqlite_serializable
for techniques to work around this behavior.
Transactional DDL
SQLite's transactional DDL is impacted by unresolved issues
in the pysqlite driver, which fails to emit BEGIN and additionally
forces a COMMIT to cancel any transaction when DDL is encountered.
- See the section :ref:`pysqlite_serializable`
+ See the section ref_pysqlite_serializable
for techniques to work around this behavior.
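The workaround described in that section generally takes a form along these lines
(a sketch; the database path is hypothetical)::

    from sqlalchemy import create_engine, event

    engine = create_engine("sqlite:///myfile.db")

    @event.listens_for(engine, "connect")
    def do_connect(dbapi_connection, connection_record):
        # disable pysqlite's emitting of the BEGIN statement entirely
        dbapi_connection.isolation_level = None

    @event.listens_for(engine, "begin")
    def do_begin(conn):
        # emit our own BEGIN
        conn.exec_driver_sql("BEGIN")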
.. _sqlite_foreign_keys:
.. seealso:: This section describes the :term:`DDL` version of "ON CONFLICT" for
SQLite, which occurs within a CREATE TABLE statement. For "ON CONFLICT" as
- applied to an INSERT statement, see :ref:`sqlite_on_conflict_insert`.
+ applied to an INSERT statement, see ref_sqlite_on_conflict_insert.
SQLite supports a non-standard DDL clause known as ON CONFLICT which can be applied
to primary key, unique, check, and not null constraints. In DDL, it is
.. seealso:: This section describes the :term:`DML` version of "ON CONFLICT" for
SQLite, which occurs within an INSERT statement. For "ON CONFLICT" as
- applied to a CREATE TABLE statement, see :ref:`sqlite_on_conflict_ddl`.
+ applied to a CREATE TABLE statement, see ref_sqlite_on_conflict_ddl.
From version 3.24.0 onwards, SQLite supports "upserts" (update or insert)
of rows into a table via the ``ON CONFLICT`` clause of the ``INSERT``
.. seealso::
- :ref:`sqlite_on_conflict_insert`
+ ref_sqlite_on_conflict_insert
"""
----------------
The driver makes a change to the default pool behavior of pysqlite
-as described in :ref:`pysqlite_threading_pooling`. The pysqlcipher driver
+as described in ref_pysqlite_threading_pooling. The pysqlcipher driver
has been observed to be significantly slower on connection than the
pysqlite driver, most likely due to the encryption overhead, so the
dialect here defaults to using the :class:`.SingletonThreadPool`
.. note:: As indicated below, in current SQLAlchemy versions this
accessor is only useful beyond what's already supplied by
:attr:`_engine.CursorResult.inserted_primary_key` when using the
- :ref:`postgresql_psycopg2` dialect. Future versions hope to
+ psycopg2 dialect. Future versions hope to
generalize this feature to more dialects.
This accessor is added to support dialects that offer the feature
- that is currently implemented by the :ref:`psycopg2_executemany_mode`
+ that is currently implemented by the ref_psycopg2_executemany_mode
feature, currently **only the psycopg2 dialect**, which provides
for many rows to be INSERTed at once while still retaining the
behavior of being able to return server-generated primary key values.
"""A dictionary of parameters applied to the current row.
This attribute is only available in the context of a user-defined default
- generation function, e.g. as described at :ref:`context_default_functions`.
+ generation function, e.g. as described at ref_context_default_functions.
It consists of a dictionary which includes entries for each column/value
pair that is to be part of the INSERT or UPDATE statement. The keys of the
dictionary will be the key value of each :class:`_schema.Column`,
:meth:`.DefaultExecutionContext.get_current_parameters`
- :ref:`context_default_functions`
+ ref_context_default_functions
"""
This method can only be used in the context of a user-defined default
generation function, e.g. as described at
- :ref:`context_default_functions`. When invoked, a dictionary is
+ ref_context_default_functions. When invoked, a dictionary is
returned which includes entries for each column/value pair that is part
of the INSERT or UPDATE statement. The keys of the dictionary will be
the key value of each :class:`_schema.Column`,
:attr:`.DefaultExecutionContext.current_parameters`
- :ref:`context_default_functions`
+ ref_context_default_functions
"""
try:
.. seealso::
- :ref:`mssql_pyodbc_setinputsizes`
+ ref_mssql_pyodbc_setinputsizes
.. versionadded:: 1.2.9
.. seealso::
- :ref:`cx_oracle_setinputsizes`
+ ref_cx_oracle_setinputsizes
"""
pass
.. seealso::
- :ref:`asyncio_events_run_async`
+ ref_asyncio_events_run_async
"""
return await_only(fn(self._connection))
internal list upon discovery. This feature is not typically used or
recommended by the SQLAlchemy maintainers, but is provided to ensure
certain user defined functions can run before others, such as when
- :ref:`Changing the sql_mode in MySQL <mysql_sql_mode>`.
+ ref_mysql_sql_mode.
:param bool named: When using named argument passing, the names listed in
the function argument specification will be used as keys in the
:ref:`cascade_scalar_deletes` - complete usage example
- :param init: Specific to :ref:`orm_declarative_native_dataclasses`,
+ :param init: Specific to ref_orm_declarative_native_dataclasses,
specifies if the mapped attribute should be part of the ``__init__()``
method as generated by the dataclass process.
.. versionadded:: 2.0.0b4
- :param repr: Specific to :ref:`orm_declarative_native_dataclasses`,
+ :param repr: Specific to ref_orm_declarative_native_dataclasses,
specifies if the attribute established by this :class:`.AssociationProxy`
should be part of the ``__repr__()`` method as generated by the dataclass
process.
.. versionadded:: 2.0.0b4
:param default_factory: Specific to
- :ref:`orm_declarative_native_dataclasses`, specifies a default-value
+ ref_orm_declarative_native_dataclasses, specifies a default-value
generation function that will take place as part of the ``__init__()``
method as generated by the dataclass process.
.. versionadded:: 2.0.0b4
:param compare: Specific to
- :ref:`orm_declarative_native_dataclasses`, indicates if this field
+ ref_orm_declarative_native_dataclasses, indicates if this field
should be included in comparison operations when generating the
``__eq__()`` and ``__ne__()`` methods for the mapped class.
.. versionadded:: 2.0.0b4
- :param kw_only: Specific to :ref:`orm_declarative_native_dataclasses`,
+ :param kw_only: Specific to ref_orm_declarative_native_dataclasses,
indicates if this field should be marked as keyword-only when generating
the ``__init__()`` method as generated by the dataclass process.
Arguments passed to :func:`_asyncio.create_async_engine` are mostly
identical to those passed to the :func:`_sa.create_engine` function.
The specified dialect must be an asyncio-compatible dialect
- such as :ref:`dialect-postgresql-asyncpg`.
+ such as ref_dialect-postgresql-asyncpg.
.. versionadded:: 1.4
This function is analogous to the :func:`_sa.engine_from_config` function
in SQLAlchemy Core, except that the requested dialect must be an
- asyncio-compatible dialect such as :ref:`dialect-postgresql-asyncpg`.
+ asyncio-compatible dialect such as ref_dialect-postgresql-asyncpg.
The argument signature of the function is identical to that
of :func:`_sa.engine_from_config`.
Arguments passed to :func:`_asyncio.create_async_pool_from_url` are mostly
identical to those passed to the :func:`_sa.create_pool_from_url` function.
The specified dialect must be an asyncio-compatible dialect
- such as :ref:`dialect-postgresql-asyncpg`.
+ such as ref_dialect-postgresql-asyncpg.
.. versionadded:: 2.0.10
class async_scoped_session(Generic[_AS]):
"""Provides scoped management of :class:`.AsyncSession` objects.
- See the section :ref:`asyncio_scoped_session` for usage details.
+ See the section ref_asyncio_scoped_session for usage details.
.. versionadded:: 1.4.19
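Constructing one generally looks like the following sketch (``some_async_engine``
is assumed to exist)::

    from asyncio import current_task

    from sqlalchemy.ext.asyncio import async_scoped_session, async_sessionmaker

    session_factory = async_sessionmaker(some_async_engine, expire_on_commit=False)
    AsyncScopedSession = async_scoped_session(session_factory, scopefunc=current_task)

    some_async_session = AsyncScopedSession()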
.. seealso::
- :ref:`session_expire` - introductory material
+ ref_session_expire - introductory material
:meth:`.Session.expire`
.. seealso::
- :ref:`session_expire` - introductory material
+ ref_session_expire - introductory material
:meth:`.Session.expire`
.. seealso::
- :ref:`session_expire` - introductory material
+ ref_session_expire - introductory material
:meth:`.Session.expire`
.. seealso::
- :ref:`session_expire` - introductory material
+ ref_session_expire - introductory material
:meth:`.Session.expire`
a well-integrated approach to the issue of expediently auto-generating ad-hoc
mappings.
-.. tip:: The :ref:`automap_toplevel` extension is geared towards a
+.. tip:: The ref_automap_toplevel extension is geared towards a
"zero declaration" approach, where a complete ORM model including classes
and pre-named relationships can be generated on the fly from a database
schema. For applications that still want to use explicit class declarations
.. seealso::
- :ref:`automap_toplevel`
+ ref_automap_toplevel
"""
.. seealso::
- :ref:`automap_by_module`
+ ref_automap_by_module
"""
.. seealso::
- :ref:`automap_by_module`
+ ref_automap_by_module
:param name_for_scalar_relationship: callable function which will be
used to produce relationship names for scalar relationships. Defaults
For an overview of multiple-schema automap including the use
of additional naming conventions to resolve table name
- conflicts, see the section :ref:`automap_by_module`.
+ conflicts, see the section ref_automap_by_module.
.. versionadded:: 2.0 :meth:`.AutomapBase.prepare` supports being
directly invoked any number of times, keeping track of tables
Using this approach, we can specify columns and properties
that will take place on mapped subclasses, in the way that
- we normally do as in :ref:`declarative_mixins`::
+ we normally do as in ref_declarative_mixins::
from sqlalchemy.ext.declarative import AbstractConcreteBase
Defines a rudimental 'horizontal sharding' system which allows a Session to
distribute queries and persistence operations across multiple databases.
-For a usage example, see the :ref:`examples_sharding` example included in
+For a usage example, see the ref_examples_sharding example included in
the source distribution.
.. deepalchemy:: The horizontal sharding extension is an advanced feature,
:meth:`.hybrid_property.expression` modifier should mutate the
existing hybrid object at ``Interval.radius`` in place, without creating a
new object. Notes on this modifier and its
-rationale are discussed in the next section :ref:`hybrid_pep484_naming`.
+rationale are discussed in the next section ref_hybrid_pep484_naming.
The use of ``@classmethod`` is optional, and is strictly to give typing
tools a hint that ``cls`` in this case is expected to be the ``Interval``
class, and not an instance of ``Interval``.
of joins in favor of the correlated subquery, which can portably be packed
into a single column expression. A correlated subquery is more portable, but
often performs more poorly at the SQL level. Using the same technique
-illustrated at :ref:`mapper_column_property_sql_expressions`,
+illustrated at ref_mapper_column_property_sql_expressions,
we can adjust our ``SavingsAccount`` example to aggregate the balances for
*all* accounts, and use a correlated subquery for the column expression::
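    from decimal import Decimal
    from typing import Optional

    from sqlalchemy import func, select
    from sqlalchemy.orm import Mapped, column_property, mapped_column

    class User(Base):
        __tablename__ = "user"

        id: Mapped[int] = mapped_column(primary_key=True)

        # sketch only: aggregate the balances of *all* of this user's accounts
        # via a correlated scalar subquery (SavingsAccount assumed mapped elsewhere)
        balance: Mapped[Optional[Decimal]] = column_property(
            select(func.sum(SavingsAccount.balance))
            .where(SavingsAccount.owner_id == id)
            .correlate_except(SavingsAccount)
            .scalar_subquery()
        )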
.. seealso::
- :ref:`hybrid_pep484_naming`
+ ref_hybrid_pep484_naming
"""
return self
.. seealso::
- :ref:`hybrid_pep484_naming`
+ ref_hybrid_pep484_naming
"""
return hybrid_property._InPlace(self)
their own instrumentation. It is not intended for general use.
For examples of how the instrumentation extension is used,
-see the example :ref:`examples_instrumentation`.
+see the example ref_examples_instrumentation.
"""
import weakref
Composites are a special ORM feature which allow a single scalar attribute to
be assigned an object value which represents information "composed" from one
or more columns from the underlying mapped table. The usual example is that of
-a geometric "point", and is introduced in :ref:`mapper_composite`.
+a geometric "point", and is introduced in ref_mapper_composite.
As is the case with :class:`.Mutable`, the user-defined composite class
subclasses :class:`.MutableComposite` as a mixin, and detects and delivers
change events to its parents via the :meth:`.MutableComposite.changed` method.
In the case of a composite class, the detection is usually via the usage of the
special Python method ``__setattr__()``. In the example below, we expand upon the ``Point``
-class introduced in :ref:`mapper_composite` to include
+class introduced in ref_mapper_composite to include
:class:`.MutableComposite` in its bases and to route attribute set events via
``__setattr__`` to the :meth:`.MutableComposite.changed` method::
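    import dataclasses

    from sqlalchemy.ext.mutable import MutableComposite

    # sketch of the pattern described above, following the usual "Point" example
    @dataclasses.dataclass
    class Point(MutableComposite):
        x: int
        y: int

        def __setattr__(self, key, value):
            # set the attribute, then alert all parents to the change
            object.__setattr__(self, key, value)
            self.changed()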
:param default: Passed directly to the
:paramref:`_schema.Column.default` parameter if the
:paramref:`_orm.mapped_column.insert_default` parameter is not present.
- Additionally, when used with :ref:`orm_declarative_native_dataclasses`,
+ Additionally, when used with ref_orm_declarative_native_dataclasses,
indicates a default Python value that should be applied to the keyword
constructor within the generated ``__init__()`` method.
.. versionadded:: 2.0.10
- :param init: Specific to :ref:`orm_declarative_native_dataclasses`,
+ :param init: Specific to ref_orm_declarative_native_dataclasses,
specifies if the mapped attribute should be part of the ``__init__()``
method as generated by the dataclass process.
- :param repr: Specific to :ref:`orm_declarative_native_dataclasses`,
+ :param repr: Specific to ref_orm_declarative_native_dataclasses,
specifies if the mapped attribute should be part of the ``__repr__()``
method as generated by the dataclass process.
:param default_factory: Specific to
- :ref:`orm_declarative_native_dataclasses`,
+ ref_orm_declarative_native_dataclasses,
specifies a default-value generation function that will take place
as part of the ``__init__()``
method as generated by the dataclass process.
:param compare: Specific to
- :ref:`orm_declarative_native_dataclasses`, indicates if this field
+ ref_orm_declarative_native_dataclasses, indicates if this field
should be included in comparison operations when generating the
``__eq__()`` and ``__ne__()`` methods for the mapped class.
.. versionadded:: 2.0.0b4
:param kw_only: Specific to
- :ref:`orm_declarative_native_dataclasses`, indicates if this field
+ ref_orm_declarative_native_dataclasses, indicates if this field
should be marked as keyword-only when generating the ``__init__()``.
:param \**kw: All remaining keyword arguments are passed through to the
.. seealso::
- :ref:`mapper_column_property_sql_expressions` - general use of
+ ref_mapper_column_property_sql_expressions - general use of
:func:`_orm.column_property` to map SQL expressions
:ref:`orm_imperative_table_column_options` - usage of
) -> Composite[Any]:
r"""Return a composite column-based property for use with a Mapper.
- See the mapping documentation section :ref:`mapper_composite` for a
+ See the mapping documentation section ref_mapper_composite for a
full usage example.
The :class:`.MapperProperty` returned by :func:`.composite`
:param info: Optional data dictionary which will be populated into the
:attr:`.MapperProperty.info` attribute of this object.
- :param init: Specific to :ref:`orm_declarative_native_dataclasses`,
+ :param init: Specific to ref_orm_declarative_native_dataclasses,
specifies if the mapped attribute should be part of the ``__init__()``
method as generated by the dataclass process.
- :param repr: Specific to :ref:`orm_declarative_native_dataclasses`,
+ :param repr: Specific to ref_orm_declarative_native_dataclasses,
specifies if the mapped attribute should be part of the ``__repr__()``
method as generated by the dataclass process.
:param default_factory: Specific to
- :ref:`orm_declarative_native_dataclasses`,
+ ref_orm_declarative_native_dataclasses,
specifies a default-value generation function that will take place
as part of the ``__init__()``
method as generated by the dataclass process.
:param compare: Specific to
- :ref:`orm_declarative_native_dataclasses`, indicates if this field
+ ref_orm_declarative_native_dataclasses, indicates if this field
should be included in comparison operations when generating the
``__eq__()`` and ``__ne__()`` methods for the mapped class.
.. versionadded:: 2.0.0b4
:param kw_only: Specific to
- :ref:`orm_declarative_native_dataclasses`, indicates if this field
+ ref_orm_declarative_native_dataclasses, indicates if this field
should be marked as keyword-only when generating the ``__init__()``.
"""
.. seealso::
- :ref:`examples_session_orm_events` - includes examples of using
+ ref_examples_session_orm_events - includes examples of using
:func:`_orm.with_loader_criteria`.
:ref:`do_orm_execute_global_criteria` - basic example on how to
.. warning:: When passed as a Python-evaluable string, the
argument is interpreted using Python's ``eval()`` function.
**DO NOT PASS UNTRUSTED INPUT TO THIS STRING**.
- See :ref:`declarative_relationship_eval` for details on
+ See ref_declarative_relationship_eval for details on
declarative evaluation of :func:`_orm.relationship` arguments.
The :paramref:`_orm.relationship.secondary` keyword argument is
:ref:`self_referential_many_to_many` - Specifics on using
many-to-many in a self-referential case.
- :ref:`declarative_many_to_many` - Additional options when using
+ ref_declarative_many_to_many - Additional options when using
Declarative.
:ref:`association_pattern` - an alternative to
.. seealso::
- :ref:`relationships_backref` - notes on using
+ ref_relationships_backref - notes on using
:paramref:`_orm.relationship.backref`
:ref:`tutorial_orm_related_objects` - in the :ref:`unified_tutorial`,
.. warning:: When passed as a Python-evaluable string, the
argument is interpreted using Python's ``eval()`` function.
**DO NOT PASS UNTRUSTED INPUT TO THIS STRING**.
- See :ref:`declarative_relationship_eval` for details on
+ See ref_declarative_relationship_eval for details on
declarative evaluation of :func:`_orm.relationship` arguments.
.. seealso::
* ``noload`` - no loading should occur at any time. The related
collection will remain empty. The ``noload`` strategy is not
recommended for general use. For a general use "never load"
- approach, see :ref:`write_only_relationship`
+ approach, see ref_write_only_relationship
* ``raise`` - lazy loading is disallowed; accessing
the attribute, if its value were not already loaded via eager
The ``write_only`` loader style is configured automatically when
the :class:`_orm.WriteOnlyMapped` annotation is provided on the
left hand side within a Declarative mapping. See the section
- :ref:`write_only_relationship` for examples.
+ ref_write_only_relationship for examples.
.. versionadded:: 2.0
.. seealso::
- :ref:`write_only_relationship` - in the :ref:`queryguide_toplevel`
+ ref_write_only_relationship - in the :ref:`queryguide_toplevel`
* ``dynamic`` - the attribute will return a pre-configured
:class:`_query.Query` object for all read
The ``dynamic`` loader style is configured automatically when
the :class:`_orm.DynamicMapped` annotation is provided on the
left hand side within a Declarative mapping. See the section
- :ref:`dynamic_relationship` for examples.
+ ref_dynamic_relationship for examples.
.. legacy:: The "dynamic" lazy loader strategy is the legacy form of
what is now the "write_only" strategy described in the section
- :ref:`write_only_relationship`.
+ ref_write_only_relationship.
.. seealso::
- :ref:`dynamic_relationship` - in the :ref:`queryguide_toplevel`
+ ref_dynamic_relationship - in the :ref:`queryguide_toplevel`
- :ref:`write_only_relationship` - more generally useful approach
+ ref_write_only_relationship - more generally useful approach
for large collections that should not fully load into memory
* True - a synonym for 'select'
.. warning:: When passed as a Python-evaluable string, the
argument is interpreted using Python's ``eval()`` function.
**DO NOT PASS UNTRUSTED INPUT TO THIS STRING**.
- See :ref:`declarative_relationship_eval` for details on
+ See ref_declarative_relationship_eval for details on
declarative evaluation of :func:`_orm.relationship` arguments.
:param passive_deletes=False:
.. warning:: When passed as a Python-evaluable string, the
argument is interpreted using Python's ``eval()`` function.
**DO NOT PASS UNTRUSTED INPUT TO THIS STRING**.
- See :ref:`declarative_relationship_eval` for details on
+ See ref_declarative_relationship_eval for details on
declarative evaluation of :func:`_orm.relationship` arguments.
.. seealso::
.. warning:: When passed as a Python-evaluable string, the
argument is interpreted using Python's ``eval()`` function.
**DO NOT PASS UNTRUSTED INPUT TO THIS STRING**.
- See :ref:`declarative_relationship_eval` for details on
+ See ref_declarative_relationship_eval for details on
declarative evaluation of :func:`_orm.relationship` arguments.
.. seealso::
.. seealso::
- :ref:`dynamic_relationship` - Introduction to "dynamic"
+ ref_dynamic_relationship - Introduction to "dynamic"
relationship loaders.
:param secondaryjoin:
.. warning:: When passed as a Python-evaluable string, the
argument is interpreted using Python's ``eval()`` function.
**DO NOT PASS UNTRUSTED INPUT TO THIS STRING**.
- See :ref:`declarative_relationship_eval` for details on
+ See ref_declarative_relationship_eval for details on
declarative evaluation of :func:`_orm.relationship` arguments.
.. seealso::
.. versionadded:: 1.3
- :param init: Specific to :ref:`orm_declarative_native_dataclasses`,
+ :param init: Specific to ref_orm_declarative_native_dataclasses,
specifies if the mapped attribute should be part of the ``__init__()``
method as generated by the dataclass process.
- :param repr: Specific to :ref:`orm_declarative_native_dataclasses`,
+ :param repr: Specific to ref_orm_declarative_native_dataclasses,
specifies if the mapped attribute should be part of the ``__repr__()``
method as generated by the dataclass process.
:param default_factory: Specific to
- :ref:`orm_declarative_native_dataclasses`,
+ ref_orm_declarative_native_dataclasses,
specifies a default-value generation function that will take place
as part of the ``__init__()``
method as generated by the dataclass process.
:param compare: Specific to
- :ref:`orm_declarative_native_dataclasses`, indicates if this field
+ ref_orm_declarative_native_dataclasses, indicates if this field
should be included in comparison operations when generating the
``__eq__()`` and ``__ne__()`` methods for the mapped class.
.. versionadded:: 2.0.0b4
:param kw_only: Specific to
- :ref:`orm_declarative_native_dataclasses`, indicates if this field
+ ref_orm_declarative_native_dataclasses, indicates if this field
should be marked as keyword-only when generating the ``__init__()``.
relationship(SomeClass, lazy="dynamic")
- See the section :ref:`dynamic_relationship` for more details
+ See the section ref_dynamic_relationship for more details
on dynamic loading.
"""
.. seealso::
- :ref:`relationships_backref` - background on backrefs
+ ref_relationships_backref - background on backrefs
"""
The :class:`_orm.Mapped` class represents attributes that are handled
directly by the :class:`_orm.Mapper` class. It does not include other
Python descriptor classes that are provided as extensions, including
- :ref:`hybrids_toplevel` and the :ref:`associationproxy_toplevel`.
+ ref_hybrids_toplevel and the ref_associationproxy_toplevel.
While these systems still make use of ORM-specific superclasses
and structures, they are not :term:`instrumented` by the
:class:`_orm.Mapper` and instead provide their own functionality
.. legacy:: The "dynamic" lazy loader strategy is the legacy form of what
is now the "write_only" strategy described in the section
- :ref:`write_only_relationship`.
+ ref_write_only_relationship.
E.g.::
cascade="all,delete-orphan"
)
- See the section :ref:`dynamic_relationship` for background.
+ See the section ref_dynamic_relationship for background.
.. versionadded:: 2.0
.. seealso::
- :ref:`dynamic_relationship` - complete background
+ ref_dynamic_relationship - complete background
:class:`.WriteOnlyMapped` - fully 2.0 style version
cascade="all,delete-orphan"
)
- See the section :ref:`write_only_relationship` for background.
+ See the section ref_write_only_relationship for background.
.. versionadded:: 2.0
.. seealso::
- :ref:`write_only_relationship` - complete background
+ ref_write_only_relationship - complete background
:class:`.DynamicMapped` - includes legacy :class:`_orm.Query` support
The :func:`_orm.declarative_mixin` decorator currently does not modify
the given class in any way; its current purpose is strictly to assist
- the :ref:`Mypy plugin <mypy_toplevel>` in being able to identify
+ the ref_mypy_toplevel in being able to identify
SQLAlchemy declarative mixin classes when no other context is present.
.. versionadded:: 1.4.6
:ref:`orm_mixins_toplevel`
- :ref:`mypy_declarative_mixins` - in the
- :ref:`Mypy plugin documentation <mypy_toplevel>`
+ ref_mypy_declarative_mixins - in the
+ ref_mypy_toplevel
""" # noqa: E501
.. seealso::
- :ref:`orm_declarative_native_dataclasses` - complete background
+ ref_orm_declarative_native_dataclasses - complete background
on SQLAlchemy native dataclass mapping
.. versionadded:: 2.0
.. seealso::
- :ref:`orm_declarative_native_dataclasses` - complete background
+ ref_orm_declarative_native_dataclasses - complete background
on SQLAlchemy native dataclass mapping
.. seealso::
- :ref:`mapper_composite`
+ ref_mapper_composite
"""
"""Produce boolean, comparison, and other operators for
:class:`.Composite` attributes.
- See the example in :ref:`composite_operations` for an overview
+ See the example in ref_composite_operations for an overview
of usage, as well as the documentation for :class:`.PropComparator`.
.. seealso::
.. seealso::
- :ref:`mapper_composite`
+ ref_mapper_composite
"""
:ref:`orm_server_defaults`
- :ref:`metadata_defaults_toplevel`
+ ref_metadata_defaults_toplevel
"""
.. seealso::
- :ref:`examples_versioning` - an example which illustrates the use
+ ref_examples_versioning - an example which illustrates the use
of the :meth:`_orm.MapperEvents.before_mapper_configured`
event to create new mappers to record change-audit histories on
objects.
and parameters as well as an option that allows programmatic
invocation of the statement at any point.
- :ref:`examples_session_orm_events` - includes examples of using
+ ref_examples_session_orm_events - includes examples of using
:meth:`_orm.SessionEvents.do_orm_execute`
- :ref:`examples_caching` - an example of how to integrate
+ ref_examples_caching - an example of how to integrate
Dogpile caching with the ORM :class:`_orm.Session` making use
of the :meth:`_orm.SessionEvents.do_orm_execute` event hook.
- :ref:`examples_sharding` - the Horizontal Sharding example /
+ ref_examples_sharding - the Horizontal Sharding example /
extension relies upon the
:meth:`_orm.SessionEvents.do_orm_execute` event hook to invoke a
SQL statement on multiple backends and return a merged result.
we indicate that this value is to be persisted to the database.
This supersedes the use of ``SOME_CONSTANT`` in the default generator
for the :class:`_schema.Column`. The ``active_column_defaults.py``
- example given at :ref:`examples_instrumentation` illustrates using
+ example given at ref_examples_instrumentation illustrates using
the same approach for a changing default, e.g. a timestamp
generator. In this particular example, it is not strictly
necessary to do this since ``SOME_CONSTANT`` would be part of the
:class:`.AttributeEvents` - background on listener options such
as propagation to subclasses.
- :ref:`examples_instrumentation` - see the
+ ref_examples_instrumentation - see the
``active_column_defaults.py`` example.
"""
the :meth:`_orm.QueryEvents.before_compile` event is **no longer
used** for ORM-level attribute loads, such as loads of deferred
or expired attributes as well as relationship loaders. See the
- new examples in :ref:`examples_session_orm_events` which
+ new examples in ref_examples_session_orm_events which
illustrate new ways of intercepting and modifying ORM queries
for the most common purpose of adding arbitrary filter criteria.
:meth:`.QueryEvents.before_compile_delete`
- :ref:`baked_with_before_compile`
+ ref_baked_with_before_compile
"""
be consistent in more scenarios independently of whether or not an
orphan object has been flushed yet.
- See the change note and example at :ref:`legacy_is_orphan_addition`
+ See the change note and example at ref_legacy_is_orphan_addition
for more detail on this change.
:param non_primary: Specify that this :class:`_orm.Mapper`
.. seealso::
- :ref:`mapper_version_counter` - discussion of version counting
+ ref_mapper_version_counter - discussion of version counting
and rationale.
:param version_id_generator: Define how new version ids should
Alternatively, server-side versioning functions such as triggers,
or programmatic versioning schemes outside of the version id
generator may be used, by specifying the value ``False``.
- Please see :ref:`server_side_version_counter` for a discussion
+ Please see ref_server_side_version_counter for a discussion
of important points when using this option.
.. seealso::
- :ref:`custom_version_counter`
+ ref_custom_version_counter
- :ref:`server_side_version_counter`
+ ref_server_side_version_counter
:param with_polymorphic: A tuple in the form ``(<classes>,
.. seealso::
- :ref:`mapper_column_property_sql_expressions_composed`
+ ref_mapper_column_property_sql_expressions_composed
"""
return self.columns[0]
statement executions, as the :class:`_orm.Session` will not track
objects from different schema translate maps within a single
session. For multiple schema translate maps within the scope of a
- single :class:`_orm.Session`, see :ref:`examples_sharding`.
+ single :class:`_orm.Session`, see ref_examples_sharding.
.. seealso::
The 'load' argument is the same as that of :meth:`.Session.merge`.
For an example of how :meth:`_query.Query.merge_result` is used, see
- the source code for the example :ref:`examples_caching`, where
+ the source code for the example ref_examples_caching, where
:meth:`_query.Query.merge_result` is used to efficiently restore state
from a cache back into a target :class:`.Session`.
class scoped_session(Generic[_S]):
"""Provides scoped management of :class:`.Session` objects.
- See :ref:`unitofwork_contextual` for a tutorial.
+ See ref_unitofwork_contextual for a tutorial.
.. note::
- When using :ref:`asyncio_toplevel`, the async-compatible
+ When using ref_asyncio_toplevel, the async-compatible
:class:`_asyncio.async_scoped_session` class should be
used in place of :class:`.scoped_session`.
:ref:`session_begin_nested`
- :ref:`pysqlite_serializable` - special workarounds required
+ ref_pysqlite_serializable - special workarounds required
with the SQLite driver in order for SAVEPOINT to work
correctly.
:ref:`unitofwork_transaction`
- :ref:`asyncio_orm_avoid_lazyloads`
+ ref_asyncio_orm_avoid_lazyloads
""" # noqa: E501
.. seealso::
- :ref:`session_expire` - introductory material
+ ref_session_expire - introductory material
:meth:`.Session.expire`
.. seealso::
- :ref:`session_expire` - introductory material
+ ref_session_expire - introductory material
:meth:`.Session.expire`
This operation cascades to associated instances if the association is
mapped with ``cascade="merge"``.
- See :ref:`unitofwork_merging` for a detailed discussion of merging.
+ See ref_unitofwork_merging for a detailed discussion of merging.
:param instance: Instance to be merged.
:param load: Boolean, when False, :meth:`.merge` switches into
.. seealso::
- :ref:`session_expire` - introductory material
+ ref_session_expire - introductory material
:meth:`.Session.expire`
.. tip:: When using SQLite, the SQLite driver included through
Python 3.11 does not handle SAVEPOINTs correctly in all cases
without workarounds. See the section
- :ref:`pysqlite_serializable` for details on current workarounds.
+ ref_pysqlite_serializable for details on current workarounds.
* ``"control_fully"`` - the :class:`_orm.Session` will take
control of the given transaction as its own;
:ref:`session_begin_nested`
- :ref:`pysqlite_serializable` - special workarounds required
+ ref_pysqlite_serializable - special workarounds required
with the SQLite driver in order for SAVEPOINT to work
correctly.
:ref:`unitofwork_transaction`
- :ref:`asyncio_orm_avoid_lazyloads`
+ ref_asyncio_orm_avoid_lazyloads
"""
trans = self._transaction
.. seealso::
- :ref:`session_expire` - introductory material
+ ref_session_expire - introductory material
:meth:`.Session.expire`
.. seealso::
- :ref:`session_expire` - introductory material
+ ref_session_expire - introductory material
:meth:`.Session.expire`
.. seealso::
- :ref:`session_expire` - introductory material
+ ref_session_expire - introductory material
:meth:`.Session.expire`
This operation cascades to associated instances if the association is
mapped with ``cascade="merge"``.
- See :ref:`unitofwork_merging` for a detailed discussion of merging.
+ See ref_unitofwork_merging for a detailed discussion of merging.
:param instance: Instance to be merged.
:param load: Boolean, when False, :meth:`.merge` switches into
.. seealso::
- :ref:`session_object_states`
+ ref_session_object_states
"""
return self.key is None and not self._attached
.. seealso::
- :ref:`session_object_states`
+ ref_session_object_states
"""
return self.key is None and self._attached
.. seealso::
- :ref:`session_object_states`
+ ref_session_object_states
"""
return self.key is not None and self._attached and self._deleted
:func:`.orm.util.was_deleted` - standalone function
- :ref:`session_object_states`
+ ref_session_object_states
"""
return self._deleted
.. seealso::
- :ref:`session_object_states`
+ ref_session_object_states
"""
return self.key is not None and self._attached and not self._deleted
.. seealso::
- :ref:`session_object_states`
+ ref_session_object_states
"""
return self.key is not None and not self._attached
.. seealso::
- :ref:`asyncio_toplevel`
+ ref_asyncio_toplevel
"""
if _async_provider is None:
same attribute and method interface as the original mapped
class, allowing :class:`.AliasedClass` to be compatible
with any attribute technique which works on the original class,
- including hybrid attributes (see :ref:`hybrids_toplevel`).
+ including hybrid attributes (see ref_hybrids_toplevel).
The :class:`.AliasedClass` can be inspected for its underlying
:class:`_orm.Mapper`, aliased selectable, and other information
.. seealso::
- :ref:`inspection_toplevel`
+ ref_inspection_toplevel
"""
The :class:`.WriteOnlyCollection` is used in a mapping by
using the ``"write_only"`` lazy loading strategy with
:func:`_orm.relationship`. For background on this configuration,
- see :ref:`write_only_relationship`.
+ see ref_write_only_relationship.
.. versionadded:: 2.0
.. seealso::
- :ref:`write_only_relationship`
+ ref_write_only_relationship
"""
:class:`.SingletonThreadPool` is used by the SQLite dialect
automatically when a memory-based database is used.
- See :ref:`sqlite_toplevel`.
+ See ref_sqlite_toplevel.
"""
:meth:`.ColumnOperators.in_`
- :ref:`baked_in` - with baked queries
+ ref_baked_in - with baked queries
.. note:: The "expanding" feature does not support "executemany"-
style parameter sets.
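A sketch of the "expanding" usage (``some_table`` and ``conn`` are assumed)::

    from sqlalchemy import bindparam, select

    stmt = select(some_table).where(
        some_table.c.id.in_(bindparam("ids", expanding=True))
    )
    conn.execute(stmt, {"ids": [1, 2, 3]})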
correspond.
:param include_defaults: if True, non-server default values and
SQL expressions as specified on :class:`_schema.Column` objects
- (as documented in :ref:`metadata_defaults_toplevel`) not
+ (as documented in ref_metadata_defaults_toplevel) not
otherwise specified in the list of names will be rendered
into the INSERT and SELECT statements, so that these values are also
included in the data to be inserted.
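A sketch of how this parameter is passed (the table names are hypothetical)::

    from sqlalchemy import select

    sel = select(table1.c.a, table1.c.b).where(table1.c.c > 5)
    ins = table2.insert().from_select(
        ["a", "b"], sel, include_defaults=True
    )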
:ref:`mapper_automated_reflection_schemes` -
in the ORM mapping documentation
- :ref:`automap_intercepting_columns` -
- in the :ref:`automap_toplevel` documentation
+ ref_automap_intercepting_columns -
+ in the ref_automap_toplevel documentation
- :ref:`metadata_reflection_dbagnostic_types` - in
- the :ref:`metadata_reflection_toplevel` documentation
+ ref_metadata_reflection_dbagnostic_types - in
+ the ref_metadata_reflection_toplevel documentation
"""
:ref:`tutorial_functions_table_valued` - in the :ref:`unified_tutorial`
- :ref:`postgresql_table_valued` - in the :ref:`postgresql_toplevel` documentation
+ ref_postgresql_table_valued - in the :ref:`postgresql_toplevel` documentation
:meth:`_functions.FunctionElement.scalar_table_valued` - variant of
:meth:`_functions.FunctionElement.table_valued` which delivers the
:ref:`tutorial_functions_column_valued` - in the :ref:`unified_tutorial`
- :ref:`postgresql_column_valued` - in the :ref:`postgresql_toplevel` documentation
+ ref_postgresql_column_valued - in the :ref:`postgresql_toplevel` documentation
:meth:`_functions.FunctionElement.table_valued`
Functions which are interpreted as "generic" functions know how to
calculate their return type automatically. For a listing of known generic
- functions, see :ref:`generic_functions`.
+ functions, see ref_generic_functions.
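For instance, ``count()`` is a known generic function, so its return type is set up
automatically (``user_table`` is assumed to exist)::

    from sqlalchemy import func, select

    stmt = select(func.count()).select_from(user_table)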
.. note::
.. versionchanged:: 2.0 ``plainto_tsquery()`` is used instead
of ``to_tsquery()`` for PostgreSQL now; for compatibility with
- other forms, see :ref:`postgresql_match`.
+ other forms, see ref_postgresql_match.
* MySQL - renders ``MATCH (x) AGAINST (y IN BOOLEAN MODE)``
.. seealso::
- :ref:`metadata_reflection_toplevel`
+ ref_metadata_reflection_toplevel
:meth:`_events.DDLEvents.column_reflect`
- :ref:`metadata_reflection_dbagnostic_types`
+ ref_metadata_reflection_dbagnostic_types
:param extend_existing: When ``True``, indicates that if this
:class:`_schema.Table` is already present in the given
In modern SQLAlchemy there is generally no reason to alter this
setting, except for some backend specific cases
- (see :ref:`mssql_triggers` in the SQL Server dialect documentation
+ (see ref_mssql_triggers in the SQL Server dialect documentation
for one such example).
:param include_columns: A list of strings indicating a subset of
to render the special SQLite keyword ``AUTOINCREMENT``
is not included as this is unnecessary and not recommended
by the database vendor. See the section
- :ref:`sqlite_autoincrement` for more background.
+ ref_sqlite_autoincrement for more background.
* Oracle - The Oracle dialect has no default "autoincrement"
feature available at this time, instead the :class:`.Identity`
construct is recommended to achieve this (the :class:`.Sequence`
.. seealso::
- :ref:`metadata_defaults_toplevel`
+ ref_metadata_defaults_toplevel
:param doc: optional String that can be used by the ORM or similar
to document attributes on the Python side. This attribute does
.. seealso::
- :ref:`metadata_defaults` - complete discussion of onupdate
+ ref_metadata_defaults - complete discussion of onupdate
:param primary_key: If ``True``, marks this column as a primary key
column. Multiple columns can have this flag set to specify
.. seealso::
- :ref:`server_defaults` - complete discussion of server side
+ ref_server_defaults - complete discussion of server side
defaults
:param server_onupdate: A :class:`.FetchedValue` instance
.. warning:: This directive **does not** currently produce MySQL's
"ON UPDATE CURRENT_TIMESTAMP()" clause. See
- :ref:`mysql_timestamp_onupdate` for background on how to
+ ref_mysql_timestamp_onupdate for background on how to
produce this clause.
.. seealso::
- :ref:`triggered_columns`
+ ref_triggered_columns
:param quote: Force quoting of this column's name on or off,
corresponding to ``True`` or ``False``. When left at its default
.. seealso::
- :ref:`defaults_sequences`
+ ref_defaults_sequences
:class:`.CreateSequence`
.. seealso::
- :ref:`triggered_columns`
+ ref_triggered_columns
"""
:ref:`schema_indexes` - General information on :class:`.Index`.
- :ref:`postgresql_indexes` - PostgreSQL-specific options available for
+ ref_postgresql_indexes - PostgreSQL-specific options available for
the :class:`.Index` construct.
- :ref:`mysql_indexes` - MySQL-specific options available for the
+ ref_mysql_indexes - MySQL-specific options available for the
:class:`.Index` construct.
- :ref:`mssql_indexes` - MSSQL-specific options available for the
+ ref_mssql_indexes - MSSQL-specific options available for the
:class:`.Index` construct.
"""
.. seealso::
- :ref:`metadata_reflection_toplevel`
+ ref_metadata_reflection_toplevel
:meth:`_events.DDLEvents.column_reflect` - Event used to customize
the reflected columns. Usually used to generalize the types using
:meth:`_types.TypeEngine.as_generic`
- :ref:`metadata_reflection_dbagnostic_types` - describes how to
+ ref_metadata_reflection_dbagnostic_types - describes how to
reflect tables using general types.
"""
a serialized binary field.
To allow ORM change events to propagate for elements associated
- with :class:`.PickleType`, see :ref:`mutable_toplevel`.
+ with :class:`.PickleType`, see ref_mutable_toplevel.
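One hedged sketch of that combination uses :class:`.MutableList` from the mutable
extension::

    from sqlalchemy import Column, PickleType
    from sqlalchemy.ext.mutable import MutableList

    data = Column("data", MutableList.as_mutable(PickleType))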
"""
.. seealso::
- :ref:`metadata_reflection_dbagnostic_types` - describes the
+ ref_metadata_reflection_dbagnostic_types - describes the
use of :meth:`_types.TypeEngine.as_generic` in conjunction with
the :meth:`_sql.DDLEvents.column_reflect` event, which is its
intended use.