:ticket:`12736`
+PostgreSQL
+==========
+
+.. _change_10594_postgresql:
+
+Changes to Named Type Handling in PostgreSQL
+---------------------------------------------
+
+Named types such as :class:`_postgresql.ENUM`, :class:`_postgresql.DOMAIN` and
+the dialect-agnostic :class:`._types.Enum` have undergone behavioral changes in
+SQLAlchemy 2.1 to better reflect how a distinct type object, which may be
+shared among many tables, works in practice.
+
+Named Types are Now Associated with MetaData
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+Named types are now more strongly associated with the :class:`_schema.MetaData`
+at the top of the table hierarchy and are de-associated with any particular
+:class:`_schema.Table` they may be a part of. This better represents how
+PostgreSQL named types exist independently of any particular table, and that
+they may be used across many tables simultaneously.
+
+:class:`_types.Enum` and :class:`_postgresql.DOMAIN` now have their
+:attr:`~_types.SchemaType.metadata` attribute set as soon as they are
+associated with a table, and no longer refer to the :class:`_schema.Table`
+or tables they are within (a table of course still refers to the named types
+that it uses).
+
+Schema Inheritance from MetaData
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+Named types will now "inherit" the schema of the :class:`_schema.MetaData`
+by default. For example, ``MetaData(schema="myschema")`` will cause all
+:class:`_types.Enum` and :class:`_postgresql.DOMAIN` to use the schema
+"myschema"::
+
+    metadata = MetaData(schema="myschema")
+
+    table = Table(
+        "mytable",
+        metadata,
+        Column("status", Enum("active", "inactive", name="status_enum")),
+    )
+
+    # The enum will be created as "myschema.status_enum"
+
+To have named types use the schema name of an immediate :class:`_schema.Table`
+that they are associated with, set the :paramref:`~_types.SchemaType.schema`
+parameter of the type to be that same schema name::
+
+    table = Table(
+        "mytable",
+        metadata,
+        Column(
+            "status",
+            Enum("active", "inactive", name="status_enum", schema="tableschema"),
+        ),
+        schema="tableschema",
+    )
+
+The :paramref:`_types.SchemaType.inherit_schema` parameter remains available
+for this release but is deprecated for eventual removal, and will emit a
+deprecation warning when used.
+
+Modified Create and Drop Behavior
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+The rules by which named types are created and dropped are also modified to
+be organized around the :class:`_schema.MetaData`:
+
+1. :meth:`_schema.MetaData.create_all` and :meth:`_schema.Table.create` will
+ create any named types needed
+2. :meth:`_schema.Table.drop` will not drop any named types
+3. :meth:`_schema.MetaData.drop_all` will drop named types after all tables
+ are dropped
+
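The three rules above can be condensed into a small lookup; a hypothetical sketch (the operation names and return strings are illustrative only, not part of the API):

```python
# Hypothetical condensation of the 2.1 create/drop rules for named types.
# Operation names and return strings are illustrative, not SQLAlchemy API.
def named_type_ddl(operation: str) -> "str | None":
    rules = {
        "MetaData.create_all": "CREATE TYPE (if needed)",  # rule 1
        "Table.create": "CREATE TYPE (if needed)",         # rule 1
        "Table.drop": None,                                # rule 2: types survive
        "MetaData.drop_all": "DROP TYPE (after tables)",   # rule 3
    }
    return rules[operation]
```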
+Refined CheckFirst Behavior
+^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+The "checkfirst" behavior has also been refined. A new enumeration
+:class:`_schema.CheckFirst` is introduced which allows fine-grained control
+within :meth:`_schema.MetaData.create_all`, :meth:`_schema.MetaData.drop_all`,
+:meth:`_schema.Table.create`, and :meth:`_schema.Table.drop` as to what "check"
+queries are emitted, allowing checks for types, sequences, etc. to be
+included or not::
+
+    from sqlalchemy import CheckFirst
+
+    # Only check for table existence, skip type checks
+    metadata.create_all(engine, checkfirst=CheckFirst.TABLES)
+
+    # Check for both tables and types
+    metadata.create_all(engine, checkfirst=CheckFirst.TABLES | CheckFirst.TYPES)
+
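Because :class:`_schema.CheckFirst` is a standard ``enum.Flag`` that also accepts plain booleans, combination and conversion follow ordinary ``Flag`` semantics. A standalone sketch, re-declaring a minimal flag locally (this mirrors, but is not, the actual class):

```python
from enum import Flag, auto

class CheckFirstSketch(Flag):
    """Minimal local stand-in for the 2.1 CheckFirst enumeration."""
    NONE = 0
    TABLES = 2          # avoid value 1 so that bool True doesn't match by value
    INDEXES = auto()
    SEQUENCES = auto()
    TYPES = auto()
    ALL = TABLES | INDEXES | SEQUENCES | TYPES

    @classmethod
    def _missing_(cls, value):
        # plain booleans coerce to the extreme members
        if isinstance(value, bool):
            return cls.ALL if value else cls.NONE
        return super()._missing_(value)

# booleans convert to ALL / NONE
assert CheckFirstSketch(True) is CheckFirstSketch.ALL
assert CheckFirstSketch(False) is CheckFirstSketch.NONE

# members combine and mask with bitwise operators
combo = CheckFirstSketch.TABLES | CheckFirstSketch.TYPES
assert combo & CheckFirstSketch.TYPES          # truthy: TYPES was requested
assert not (combo & CheckFirstSketch.INDEXES)  # falsy: INDEXES was not
```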
+inherit_schema is Deprecated
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+Because named types now inherit the schema of :class:`.MetaData` automatically
+and remain agnostic of which :class:`.Table` objects refer to them, the
+:paramref:`_types.Enum.inherit_schema` parameter is deprecated. In 2.1 it
+still works the old way, associating the type with the parent
+:class:`.Table`; however, since this binds the type to a single
+:class:`.Table` even though the type can be used by any number of tables,
+it's preferred to set :paramref:`_types.Enum.schema` directly when the
+schema used by the :class:`.MetaData` is not the one desired.
+
+:ticket:`10594`
+
+
+++ /dev/null
-.. change::
- :tags: change, schema
- :tickets: 10594
-
- Changed the default value of :paramref:`_types.Enum.inherit_schema` to
- ``True`` when :paramref:`_types.Enum.schema` and
- :paramref:`_types.Enum.metadata` parameters are not provided.
- The same behavior has been applied also to PostgreSQL
- :class:`_postgresql.DOMAIN` type.
--- /dev/null
+.. change::
+ :tags: change, postgresql
+ :tickets: 10594, 12690
+
+ Named types such as :class:`_postgresql.ENUM` and
+ :class:`_postgresql.DOMAIN` (as well as the dialect-agnostic
+ :class:`_types.Enum` version) are now more strongly associated with the
+ :class:`_schema.MetaData` at the top of the table hierarchy and are
+ de-associated with any particular :class:`_schema.Table` they may be a part
+ of. This better represents how PostgreSQL named types exist independently
+ of any particular table, and that they may be used across many tables
+ simultaneously. The change impacts the behavior of the "default schema"
+ for a named type, as well as the CREATE/DROP behavior in relationship to
+ the :class:`.MetaData` and :class:`.Table` construct. The change also
+ includes a new :class:`.CheckFirst` enumeration which allows fine grained
+ control over "check" queries during DDL operations, as well as that the
+ :paramref:`_types.SchemaType.inherit_schema` parameter is deprecated and
+ will emit a deprecation warning when used. See the migration notes for
+ full details.
+
+ .. seealso::
+
+ :ref:`change_10594_postgresql` - Complete details on PostgreSQL named type changes
.. autoclass:: DDL
:members:
+.. autoclass:: CheckFirst
+ :members:
+
.. autoclass:: _CreateDropBase
.. autoclass:: CreateTable
from .schema import BaseDDLElement as BaseDDLElement
from .schema import BLANK_SCHEMA as BLANK_SCHEMA
from .schema import CheckConstraint as CheckConstraint
+from .schema import CheckFirst as CheckFirst
from .schema import Column as Column
from .schema import ColumnDefault as ColumnDefault
from .schema import Computed as Computed
from ...sql import sqltypes
from ...sql import type_api
from ...sql.base import _NoArg
+from ...sql.ddl import CheckFirst
from ...sql.ddl import InvokeCreateDDLBase
from ...sql.ddl import InvokeDropDDLBase
bind._run_ddl_visitor(self.DDLDropper, self, checkfirst=checkfirst)
def _check_for_name_in_memos(
- self, checkfirst: bool, kw: Dict[str, Any]
+ self, checkfirst: CheckFirst, kw: Dict[str, Any]
) -> bool:
"""Look in the 'ddl runner' for 'memos', then
note our name in that collection.
def _on_table_create(
self,
- target: Any,
+ target: schema.Table,
bind: _CreateDropBind,
- checkfirst: bool = False,
+ checkfirst: Union[bool, CheckFirst] = CheckFirst.NONE,
**kw: Any,
) -> None:
- if (
- checkfirst
- or (
- not self.metadata
- and not kw.get("_is_metadata_operation", False)
- )
- ) and not self._check_for_name_in_memos(checkfirst, kw):
- self.create(bind=bind, checkfirst=checkfirst)
+ checkfirst = CheckFirst(checkfirst) & CheckFirst.TYPES
+ if not self._check_for_name_in_memos(checkfirst, kw):
+ self.create(bind=bind, checkfirst=bool(checkfirst))
def _on_table_drop(
self,
target: Any,
bind: _CreateDropBind,
- checkfirst: bool = False,
+ checkfirst: CheckFirst = CheckFirst.NONE,
**kw: Any,
) -> None:
- if (
- not self.metadata
- and not kw.get("_is_metadata_operation", False)
- and not self._check_for_name_in_memos(checkfirst, kw)
- ):
- self.drop(bind=bind, checkfirst=checkfirst)
+ # do nothing since the enum is attached to a metadata
+ assert self.metadata is not None
def _on_metadata_create(
self,
- target: Any,
+ target: schema.MetaData,
bind: _CreateDropBind,
- checkfirst: bool = False,
+ checkfirst: Union[bool, CheckFirst] = CheckFirst.NONE,
**kw: Any,
) -> None:
+ checkfirst = CheckFirst(checkfirst) & CheckFirst.TYPES
if not self._check_for_name_in_memos(checkfirst, kw):
- self.create(bind=bind, checkfirst=checkfirst)
+ self.create(bind=bind, checkfirst=bool(checkfirst))
def _on_metadata_drop(
self,
- target: Any,
+ target: schema.MetaData,
bind: _CreateDropBind,
- checkfirst: bool = False,
+ checkfirst: Union[bool, CheckFirst] = CheckFirst.NONE,
**kw: Any,
) -> None:
+ checkfirst = CheckFirst(checkfirst) & CheckFirst.TYPES
if not self._check_for_name_in_memos(checkfirst, kw):
- self.drop(bind=bind, checkfirst=checkfirst)
+ self.drop(bind=bind, checkfirst=bool(checkfirst))
class NamedTypeGenerator(InvokeCreateDDLBase):
type as the implementation, so the special create/drop rules
will be used.
- The create/drop behavior of ENUM is necessarily intricate, due to the
- awkward relationship the ENUM type has in relationship to the
- parent table, in that it may be "owned" by just a single table, or
- may be shared among many tables.
+ The create/drop behavior of ENUM follows the PostgreSQL behavior, with a
+ usability improvement described below.
When using :class:`_types.Enum` or :class:`_postgresql.ENUM`
- in an "inline" fashion, the ``CREATE TYPE`` and ``DROP TYPE`` is emitted
- corresponding to when the :meth:`_schema.Table.create` and
- :meth:`_schema.Table.drop`
- methods are called::
+ in an "inline" fashion, the ``CREATE TYPE`` is emitted
+ corresponding to when the :meth:`_schema.Table.create` method is called::
table = Table(
"sometable",
Column("some_enum", ENUM("a", "b", "c", name="myenum")),
)
- table.create(engine) # will emit CREATE ENUM and CREATE TABLE
- table.drop(engine) # will emit DROP TABLE and DROP ENUM
+ # will check if enum exists and emit CREATE ENUM then CREATE TABLE
+ table.create(engine)
+ table.drop(engine) # will *not* drop the enum.
+
+ The enum will not be dropped when the table is dropped, since it's
+ associated with the metadata, not the table itself. Call drop on the
+ :class:`_postgresql.ENUM` directly to drop the type::
+
+ metadata.get_schema_object_by_name(ENUM, "myenum").drop(engine)
To use a common enumerated type between multiple tables, the best
practice is to declare the :class:`_types.Enum` or
- :class:`_postgresql.ENUM` independently, and associate it with the
- :class:`_schema.MetaData` object itself::
+ :class:`_postgresql.ENUM` independently::
my_enum = ENUM("a", "b", "c", name="myenum", metadata=metadata)
t2 = Table("sometable_two", metadata, Column("some_enum", myenum))
- When this pattern is used, care must still be taken at the level
- of individual table creates. Emitting CREATE TABLE without also
- specifying ``checkfirst=True`` will still cause issues::
-
- t1.create(engine) # will fail: no such type 'myenum'
+ As before, the type will be created if it does not exist::
- If we specify ``checkfirst=True``, the individual table-level create
- operation will check for the ``ENUM`` and create if not exists::
+ # will check if enum exists and emit CREATE ENUM then CREATE TABLE
+ t1.create(engine)
- # will check if enum exists, and emit CREATE TYPE if not
- t1.create(engine, checkfirst=True)
-
- When using a metadata-level ENUM type, the type will always be created
- and dropped if either the metadata-wide create/drop is called::
+ The type will always be created and dropped if either the metadata-wide
+ create/drop is called::
metadata.create_all(engine) # will emit CREATE TYPE
metadata.drop_all(engine) # will emit DROP TYPE
my_enum.create(engine)
my_enum.drop(engine)
+ .. versionchanged:: 2.1 The behavior of :class:`_postgresql.ENUM` and
+ other named types has been changed to better reflect how PostgreSQL
+ handles CREATE TYPE and DROP TYPE operations.
+ Named types are still created when needed during table
+ creation if they do not already exist. However, they are no longer
+ dropped for an individual :meth:`.Table.drop` operation, since the type
+ may be referenced by other tables as well. Instead,
+ :meth:`.Enum.drop` may be used or :meth:`.MetaData.drop_all` will drop
+ all associated types.
+
"""
native_enum = True
Indicates that ``CREATE TYPE`` should be
emitted, after optionally checking for the
presence of the type, when the parent
- table is being created; and additionally
- that ``DROP TYPE`` is called when the table
- is dropped. When ``False``, no check
+ table is being created. When ``False``, no check
will be performed and no ``CREATE TYPE``
or ``DROP TYPE`` is emitted, unless
:meth:`~.postgresql.ENUM.create`
kw.setdefault("name", impl.name)
kw.setdefault("create_type", impl.create_type)
kw.setdefault("schema", impl.schema)
- kw.setdefault("inherit_schema", impl.inherit_schema)
kw.setdefault("metadata", impl.metadata)
kw.setdefault("_create_events", False)
kw.setdefault("values_callable", impl.values_callable)
check="VALUE ~ '^\d{5}$' OR VALUE ~ '^\d{5}-\d{4}$'",
)
+ :class:`_postgresql.DOMAIN` has the same create/drop behavior as
+ :class:`_postgresql.ENUM`.
+
See the `PostgreSQL documentation`__ for additional details
__ https://www.postgresql.org/docs/current/sql-createdomain.html
# existing properties.ColumnProperty from an inheriting
# mapper. make a copy and append our column to it
- # breakpoint()
new_prop = existing_prop.copy()
new_prop.columns.insert(0, incoming_column)
from .sql.ddl import _DropView as _DropView
from .sql.ddl import AddConstraint as AddConstraint
from .sql.ddl import BaseDDLElement as BaseDDLElement
+from .sql.ddl import CheckFirst as CheckFirst
from .sql.ddl import CreateColumn as CreateColumn
from .sql.ddl import CreateIndex as CreateIndex
from .sql.ddl import CreateSchema as CreateSchema
from .compiler import NO_LINTING as NO_LINTING
from .compiler import WARN_LINTING as WARN_LINTING
from .ddl import BaseDDLElement as BaseDDLElement
+from .ddl import CheckFirst as CheckFirst
from .ddl import DDL as DDL
from .ddl import DDLElement as DDLElement
from .ddl import ExecutableDDLElement as ExecutableDDLElement
from __future__ import annotations
import contextlib
+from enum import auto
+from enum import Flag
import typing
from typing import Any
from typing import Callable
)
+class CheckFirst(Flag):
+ """Enumeration for the :paramref:`.MetaData.create_all.checkfirst`
+ parameter passed to methods like :meth:`.MetaData.create_all`,
+ :meth:`.MetaData.drop_all`, :meth:`.Table.create`, :meth:`.Table.drop` and
+ others.
+
+ This enumeration indicates what kinds of objects should be "checked"
+ with a separate query before emitting CREATE or DROP for that object.
+
+ ``CheckFirst(bool_value)`` may be used to convert from a boolean value.
+
+ .. versionadded:: 2.1
+
+ """
+
+ NONE = 0 # equivalent to False
+ """No items should be checked"""
+
+ # avoid 1 so that bool True doesn't match by value
+ TABLES = 2
+ """Check for tables"""
+
+ INDEXES = auto()
+ """Check for indexes"""
+
+ SEQUENCES = auto()
+ """Check for sequences"""
+
+ TYPES = auto()
+ """Check for custom datatypes that are created server-side
+
+ This is currently used by PostgreSQL.
+
+ """
+
+ ALL = TABLES | INDEXES | SEQUENCES | TYPES # equivalent to True
+
+ @classmethod
+ def _missing_(cls, value: object) -> Any:
+ if isinstance(value, bool):
+ return cls.ALL if value else cls.NONE
+ return super()._missing_(value)
+
+
class SchemaGenerator(InvokeCreateDDLBase):
def __init__(
- self, dialect, connection, checkfirst=False, tables=None, **kwargs
+ self,
+ dialect,
+ connection,
+ checkfirst=CheckFirst.NONE,
+ tables=None,
+ **kwargs,
):
super().__init__(connection, **kwargs)
- self.checkfirst = checkfirst
+ self.checkfirst = CheckFirst(checkfirst)
self.tables = tables
self.preparer = dialect.identifier_preparer
self.dialect = dialect
effective_schema = self.connection.schema_for_object(table)
if effective_schema:
self.dialect.validate_identifier(effective_schema)
- return not self.checkfirst or not self.dialect.has_table(
- self.connection, table.name, schema=effective_schema
+ return (
+ not self.checkfirst & CheckFirst.TABLES
+ or not self.dialect.has_table(
+ self.connection, table.name, schema=effective_schema
+ )
)
def _can_create_index(self, index):
effective_schema = self.connection.schema_for_object(index.table)
if effective_schema:
self.dialect.validate_identifier(effective_schema)
- return not self.checkfirst or not self.dialect.has_index(
- self.connection,
- index.table.name,
- index.name,
- schema=effective_schema,
+ return (
+ not self.checkfirst & CheckFirst.INDEXES
+ or not self.dialect.has_index(
+ self.connection,
+ index.table.name,
+ index.name,
+ schema=effective_schema,
+ )
)
def _can_create_sequence(self, sequence):
return self.dialect.supports_sequences and (
(not self.dialect.sequences_optional or not sequence.optional)
and (
- not self.checkfirst
+ not self.checkfirst & CheckFirst.SEQUENCES
or not self.dialect.has_sequence(
self.connection, sequence.name, schema=effective_schema
)
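The gating pattern used by each ``_can_create_*`` method above (mask the flag for the relevant kind, and only then pay for a database round trip) can be sketched standalone; the names here are illustrative, not the actual API:

```python
from enum import Flag, auto

class Kind(Flag):
    # minimal stand-in for CheckFirst
    NONE = 0
    TABLES = 2
    SEQUENCES = auto()

def should_create(checkfirst: Kind, kind: Kind, exists_check) -> bool:
    # If this kind wasn't requested, create unconditionally;
    # otherwise create only when the existence probe says "absent".
    return not (checkfirst & kind) or not exists_check()

probes = []
def probe():            # records that a "has_table"-style query ran
    probes.append(1)
    return True         # pretend the object already exists

# kind not requested: no probe is issued, creation proceeds
assert should_create(Kind.NONE, Kind.TABLES, probe) is True
assert probes == []

# kind requested and object exists: probe runs, creation is skipped
assert should_create(Kind.TABLES, Kind.TABLES, probe) is False
assert probes == [1]
```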
class SchemaDropper(InvokeDropDDLBase):
def __init__(
- self, dialect, connection, checkfirst=False, tables=None, **kwargs
+ self,
+ dialect,
+ connection,
+ checkfirst=CheckFirst.NONE,
+ tables=None,
+ **kwargs,
):
super().__init__(connection, **kwargs)
- self.checkfirst = checkfirst
+ self.checkfirst = CheckFirst(checkfirst)
self.tables = tables
self.preparer = dialect.identifier_preparer
self.dialect = dialect
effective_schema = self.connection.schema_for_object(table)
if effective_schema:
self.dialect.validate_identifier(effective_schema)
- return not self.checkfirst or self.dialect.has_table(
- self.connection, table.name, schema=effective_schema
+ return (
+ not self.checkfirst & CheckFirst.TABLES
+ or self.dialect.has_table(
+ self.connection, table.name, schema=effective_schema
+ )
)
def _can_drop_index(self, index):
effective_schema = self.connection.schema_for_object(index.table)
if effective_schema:
self.dialect.validate_identifier(effective_schema)
- return not self.checkfirst or self.dialect.has_index(
- self.connection,
- index.table.name,
- index.name,
- schema=effective_schema,
+ return (
+ not self.checkfirst & CheckFirst.INDEXES
+ or self.dialect.has_index(
+ self.connection,
+ index.table.name,
+ index.name,
+ schema=effective_schema,
+ )
)
def _can_drop_sequence(self, sequence):
return self.dialect.supports_sequences and (
(not self.dialect.sequences_optional or not sequence.optional)
and (
- not self.checkfirst
+ not self.checkfirst & CheckFirst.SEQUENCES
or self.dialect.has_sequence(
self.connection, sequence.name, schema=effective_schema
)
from typing import Sequence as _typing_Sequence
from typing import Set
from typing import Tuple
+from typing import Type
from typing import TYPE_CHECKING
from typing import TypedDict
from typing import TypeGuard
from .base import SchemaEventTarget as SchemaEventTarget
from .base import SchemaVisitable as SchemaVisitable
from .coercions import _document_text_coercion
+from .ddl import CheckFirst
from .elements import ClauseElement
from .elements import ColumnClause
from .elements import ColumnElement
from .elements import BindParameter
from .elements import KeyedColumnElement
from .functions import Function
+ from .sqltypes import SchemaType
from .type_api import TypeEngine
from .visitors import anon_map
from ..engine import Connection
metadata._add_table(self.name, self.schema, self)
self.metadata = metadata
- def create(self, bind: _CreateDropBind, checkfirst: bool = False) -> None:
+ def create(
+ self,
+ bind: _CreateDropBind,
+ checkfirst: Union[bool, CheckFirst] = CheckFirst.TYPES,
+ ) -> None:
"""Issue a ``CREATE`` statement for this
:class:`_schema.Table`, using the given
:class:`.Connection` or :class:`.Engine`
"""
+ # the default is to only check for schema objects
bind._run_ddl_visitor(ddl.SchemaGenerator, self, checkfirst=checkfirst)
- def drop(self, bind: _CreateDropBind, checkfirst: bool = False) -> None:
+ def drop(
+ self,
+ bind: _CreateDropBind,
+ checkfirst: Union[bool, CheckFirst] = CheckFirst.NONE,
+ ) -> None:
"""Issue a ``DROP`` statement for this
:class:`_schema.Table`, using the given
:class:`.Connection` or :class:`.Engine` for connectivity.
column: Optional[Column[Any]]
data_type: Optional[TypeEngine[int]]
+ metadata: Optional[MetaData]
+
@util.deprecated_params(
order=(
"2.1",
self.schema = schema = metadata.schema
else:
self.schema = quoted_name.construct(schema, quote_schema)
- self.metadata = metadata
self._key = _get_table_key(name, schema)
- if metadata:
- self._set_metadata(metadata)
if data_type is not None:
self.data_type = to_instance(data_type)
else:
self.data_type = None
+ if metadata:
+ self._set_metadata(metadata)
+ else:
+ self.metadata = None
+
@util.preload_module("sqlalchemy.sql.functions")
def next_value(self) -> Function[int]:
"""Return a :class:`.next_value` function element
"""
return util.preloaded.sql_functions.func.next_value(self)
- def _set_parent(self, parent: SchemaEventTarget, **kw: Any) -> None:
- column = parent
- assert isinstance(column, Column)
- super()._set_parent(column)
- column._on_table_attach(self._set_table)
-
def _copy(self) -> Sequence:
return Sequence(
name=self.name,
**self.dialect_kwargs,
)
+ def _set_parent(self, parent: SchemaEventTarget, **kw: Any) -> None:
+ assert isinstance(parent, Column)
+ super()._set_parent(parent, **kw)
+ parent._on_table_attach(self._set_table)
+
def _set_table(self, column: Column[Any], table: Table) -> None:
self._set_metadata(table.metadata)
def _set_metadata(self, metadata: MetaData) -> None:
self.metadata = metadata
- self.metadata._sequences[self._key] = self
+ self.metadata._register_object(self)
+ metadata._sequences[self._key] = self
- def create(self, bind: _CreateDropBind, checkfirst: bool = True) -> None:
+ def create(
+ self,
+ bind: _CreateDropBind,
+ checkfirst: Union[bool, CheckFirst] = CheckFirst.SEQUENCES,
+ ) -> None:
"""Creates this sequence in the database."""
bind._run_ddl_visitor(ddl.SchemaGenerator, self, checkfirst=checkfirst)
- def drop(self, bind: _CreateDropBind, checkfirst: bool = True) -> None:
+ def drop(
+ self,
+ bind: _CreateDropBind,
+ checkfirst: Union[bool, CheckFirst] = CheckFirst.SEQUENCES,
+ ) -> None:
"""Drops this sequence from the database."""
bind._run_ddl_visitor(ddl.SchemaDropper, self, checkfirst=checkfirst)
assert False
self.expressions = self._table_bound_expressions = exprs
- def create(self, bind: _CreateDropBind, checkfirst: bool = False) -> None:
+ def create(
+ self,
+ bind: _CreateDropBind,
+ checkfirst: Union[bool, CheckFirst] = CheckFirst.NONE,
+ ) -> None:
"""Issue a ``CREATE`` statement for this
:class:`.Index`, using the given
:class:`.Connection` or :class:`.Engine` for connectivity.
"""
bind._run_ddl_visitor(ddl.SchemaGenerator, self, checkfirst=checkfirst)
- def drop(self, bind: _CreateDropBind, checkfirst: bool = False) -> None:
+ def drop(
+ self,
+ bind: _CreateDropBind,
+ checkfirst: Union[bool, CheckFirst] = CheckFirst.NONE,
+ ) -> None:
"""Issue a ``DROP`` statement for this
:class:`.Index`, using the given
:class:`.Connection` or :class:`.Engine` for connectivity.
self._fk_memos: Dict[Tuple[str, Optional[str]], List[ForeignKey]] = (
collections.defaultdict(list)
)
+ self._objects: Set[Union[HasSchemaAttr, SchemaType]] = set()
tables: util.FacadeDict[str, Table]
"""A dictionary of :class:`_schema.Table`
"sequences": self._sequences,
"fk_memos": self._fk_memos,
"naming_convention": self.naming_convention,
+ "objects": self._objects,
}
def __setstate__(self, state: Dict[str, Any]) -> None:
self._sequences = state["sequences"]
self._schemas = state["schemas"]
self._fk_memos = state["fk_memos"]
+ self._objects = state.get("objects", set())
def clear(self) -> None:
- """Clear all Table objects from this MetaData."""
+ """Clear all objects from this MetaData."""
dict.clear(self.tables)
self._schemas.clear()
self._fk_memos.clear()
+ self._sequences.clear()
+ self._objects.clear()
def remove(self, table: Table) -> None:
"""Remove the given Table object from this MetaData."""
self,
bind: _CreateDropBind,
tables: Optional[_typing_Sequence[Table]] = None,
- checkfirst: bool = True,
+ checkfirst: Union[bool, CheckFirst] = CheckFirst.ALL,
) -> None:
"""Create all tables stored in this metadata.
Optional list of ``Table`` objects, which is a subset of the total
tables in the ``MetaData`` (others are ignored).
- :param checkfirst:
- Defaults to True, don't issue CREATEs for tables already present
- in the target database.
+ :param checkfirst: A boolean value or instance of :class:`.CheckFirst`.
+ Indicates which objects should be checked for within a separate pass
+ before creating schema objects.
"""
bind._run_ddl_visitor(
self,
bind: _CreateDropBind,
tables: Optional[_typing_Sequence[Table]] = None,
- checkfirst: bool = True,
+ checkfirst: Union[bool, CheckFirst] = CheckFirst.ALL,
) -> None:
"""Drop all tables stored in this metadata.
Optional list of ``Table`` objects, which is a subset of the
total tables in the ``MetaData`` (others are ignored).
- :param checkfirst:
- Defaults to True, only issue DROPs for tables confirmed to be
- present in the target database.
+ :param checkfirst: A boolean value or instance of :class:`.CheckFirst`.
+ Indicates which objects should be checked for within a separate pass
+ before dropping schema objects.
"""
bind._run_ddl_visitor(
ddl.SchemaDropper, self, checkfirst=checkfirst, tables=tables
)
+ @property
+ def schemas(self) -> _typing_Sequence[str]:
+ """A sequence of schema names that are present in this MetaData."""
+ schemas = self._schemas
+ if self.schema:
+ schemas = schemas | {self.schema}
+ return tuple(schemas)
+
+ def get_schema_objects(
+ self,
+ kind: Type[_T],
+ *,
+ schema: Union[str, None, Literal[_NoArg.NO_ARG]] = _NoArg.NO_ARG,
+ ) -> _typing_Sequence[_T]:
+ """Return a sequence of schema objects of the given kind.
+
+ This method can be used to return :class:`_sqltypes.Enum`,
+ :class:`.Sequence`, etc. objects registered in this
+ :class:`_schema.MetaData`.
+
+ :param kind: a type that indicates what object to return, such as
+ :class:`Enum` or :class:`Sequence`.
+ :param schema: Optional, a schema name to filter the objects by. If
+ not provided, the default schema of the metadata is used.
+
+ """
+
+ if schema is _NoArg.NO_ARG:
+ schema = self.schema
+ return tuple(
+ obj
+ for obj in self._objects
+ if isinstance(obj, kind) and obj.schema == schema
+ )
+
+ def get_schema_object_by_name(
+ self,
+ kind: Type[_T],
+ name: str,
+ *,
+ schema: Union[str, None, Literal[_NoArg.NO_ARG]] = _NoArg.NO_ARG,
+ ) -> Optional[_T]:
+ """Return a schema object of the given kind and name, if found.
+
+ This method can be used to return :class:`_sqltypes.Enum`,
+ :class:`.Sequence`, etc. objects registered in this
+ :class:`_schema.MetaData`.
+
+ :param kind: a type that indicates what object to return, such as
+ :class:`Enum` or :class:`Sequence`.
+ :param name: the name of the object to return.
+ :param schema: Optional, a schema name to filter the objects by. If
+ not provided, the default schema of the metadata is used.
+
+ """
+
+ for obj in self.get_schema_objects(kind, schema=schema):
+ if getattr(obj, "name", None) == name:
+ return obj
+ return None
+
+ def _register_object(self, obj: Union[HasSchemaAttr, SchemaType]) -> None:
+ self._objects.add(obj)
+
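The lookup performed by ``get_schema_objects`` / ``get_schema_object_by_name`` above is an isinstance-and-schema filter over the registered set; a standalone sketch with stub objects (all class and attribute names here are illustrative stand-ins):

```python
from dataclasses import dataclass
from typing import Optional

_NO_ARG = object()  # stand-in for sqlalchemy's _NoArg.NO_ARG sentinel

@dataclass(frozen=True)
class StubEnum:          # stand-in for a registered named type
    name: str
    schema: Optional[str]

@dataclass(frozen=True)
class StubSequence:      # stand-in for a registered Sequence
    name: str
    schema: Optional[str]

class StubMetaData:
    def __init__(self, schema=None):
        self.schema = schema
        self._objects = set()

    def _register_object(self, obj):
        self._objects.add(obj)

    def get_schema_objects(self, kind, *, schema=_NO_ARG):
        if schema is _NO_ARG:        # default to the MetaData's own schema
            schema = self.schema
        return tuple(
            o for o in self._objects
            if isinstance(o, kind) and o.schema == schema
        )

    def get_schema_object_by_name(self, kind, name, *, schema=_NO_ARG):
        for o in self.get_schema_objects(kind, schema=schema):
            if o.name == name:
                return o
        return None

md = StubMetaData(schema="myschema")
md._register_object(StubEnum("status", "myschema"))
md._register_object(StubEnum("other", "elsewhere"))
md._register_object(StubSequence("ids", "myschema"))

# filtered by kind and by the MetaData's default schema
assert md.get_schema_objects(StubEnum) == (StubEnum("status", "myschema"),)
assert md.get_schema_object_by_name(StubEnum, "status").schema == "myschema"
assert md.get_schema_object_by_name(StubEnum, "status", schema="elsewhere") is None
```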
class Computed(FetchedValue, SchemaItem):
"""Defines a generated column, i.e. "GENERATED ALWAYS AS" syntax.
from .. import inspection
from .. import util
from ..engine import processors
+from ..util import deprecated_params
from ..util import langhelpers
from ..util import OrderedDict
+from ..util import warn_deprecated
from ..util.typing import is_literal
from ..util.typing import is_pep695
from ..util.typing import TupleAny
from .elements import ColumnElement
from .operators import OperatorType
from .schema import MetaData
- from .schema import SchemaConst
from .type_api import _BindProcessorType
from .type_api import _ComparatorFactory
from .type_api import _LiteralProcessorType
_use_schema_map = True
name: Optional[str]
+ metadata: Optional[MetaData]
+ @deprecated_params(
+ inherit_schema=(
+ "2.1",
+ "the ``inherit_schema`` parameter is deprecated. "
+ "The schema will be inherited from the ``metadata`` object. "
+ "To keep using the table schema as the type schema, pass the "
+ "``schema`` parameter directly.",
+ )
+ )
def __init__(
self,
name: Optional[str] = None,
- schema: Optional[Union[str, Literal[SchemaConst.BLANK_SCHEMA]]] = None,
+ schema: Optional[Union[str, _NoArg]] = NO_ARG,
metadata: Optional[MetaData] = None,
- inherit_schema: Union[bool, _NoArg] = NO_ARG,
+ inherit_schema: bool = False,
quote: Optional[bool] = None,
create_type: bool = True,
_create_events: bool = True,
self.name = quoted_name(name, quote)
else:
self.name = None
- self.schema = schema
- self.metadata = metadata
- self.create_type = create_type
- if inherit_schema is True and schema is not None:
+
+ if schema is NO_ARG:
+ self._schema_provided = False
+ self.schema = None
+ elif inherit_schema:
raise exc.ArgumentError(
"Ambiguously setting inherit_schema=True while "
- "also passing a non-None schema argument"
+ "also passing a schema argument"
)
- self.inherit_schema = (
- inherit_schema
- if inherit_schema is not NO_ARG
- else (schema is None and metadata is None)
- )
+ else:
+ self._schema_provided = True
+ self.schema = schema
+ self._inherit_schema = inherit_schema
+ if metadata:
+ self._set_metadata(metadata)
+ else:
+ self.metadata = None
+
+ self.create_type = create_type
self._create_events = _create_events
if _create_events and self.metadata:
if _adapted_from:
self.dispatch = self.dispatch._join(_adapted_from.dispatch)
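The schema-resolution precedence implemented across the constructor and ``_set_metadata`` (explicit ``schema`` first, then the deprecated ``inherit_schema`` table copy, then the MetaData's schema) can be distilled into a hypothetical helper; the function is illustrative and not part of the API:

```python
from typing import Optional

def resolve_type_schema(
    schema_arg: Optional[str],
    inherit_schema: bool,          # deprecated path
    table_schema: Optional[str],
    metadata_schema: Optional[str],
) -> Optional[str]:
    """Hypothetical distillation of the 2.1 schema-precedence rules."""
    if schema_arg is not None and inherit_schema:
        # mirrors the ArgumentError raised for this ambiguous combination
        raise ValueError("ambiguous: both schema and inherit_schema given")
    if schema_arg is not None:     # an explicit schema always wins
        return schema_arg
    if inherit_schema:             # deprecated: copy the parent table's schema
        return table_schema
    return metadata_schema         # default: inherit from the MetaData

assert resolve_type_schema("s", False, "t", "m") == "s"
assert resolve_type_schema(None, True, "t", "m") == "t"
assert resolve_type_schema(None, False, "t", "m") == "m"
```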
+ @property
+ def inherit_schema(self) -> bool:
+ "Deprecated property ``inherit_schema``."
+ warn_deprecated(
+ "The ``inherit_schema`` property is deprecated.", "2.1"
+ )
+ return self._inherit_schema
+
def _set_parent(self, parent, **kw):
# set parent hook is when this type is associated with a column.
# Column calls it for all SchemaEventTarget instances, either the
return variant_mapping
def _set_table(self, column, table):
- if self.inherit_schema:
+ metadata_was_none = self._set_metadata(table.metadata)
+ if self._inherit_schema:
self.schema = table.schema
- elif self.metadata and self.schema is None and self.metadata.schema:
- self.schema = self.metadata.schema
-
- if self.schema is not None:
- self.inherit_schema = False
+ self._inherit_schema = False
if not self._create_events:
return
self._on_table_drop, variant_mapping=variant_mapping
),
)
- if self.metadata is None:
+ if metadata_was_none or self.metadata is not table.metadata:
# if SchemaType were created w/ a metadata argument, these
# events would already have been associated with that metadata
# and would preclude an association with table.metadata
),
)
+ def _set_metadata(self, metadata: MetaData) -> bool:
+ # when called from the ctor metadata is not assigned yet
+ if getattr(self, "metadata", None) is None:
+ self.metadata = metadata
+ if (
+ not self._inherit_schema
+ and not self._schema_provided
+ and metadata.schema
+ ):
+ self.schema = metadata.schema
+ self.metadata._register_object(self)
+ return True
+ else:
+ return False
+
def copy(self, **kw):
return self.adapt(
cast("Type[TypeEngine[Any]]", self.__class__),
if self.metadata is not None
else None
),
+ **(
+ {"schema": kw["schema"]}
+ if kw.get("schema", NO_ARG) is not NO_ARG
+ else {}
+ ),
)
@overload
:param metadata: Associate this type directly with a ``MetaData``
object. For types that exist on the target database as an
independent schema construct (PostgreSQL), this type will be
- created and dropped within ``create_all()`` and ``drop_all()``
- operations. If the type is not associated with any ``MetaData``
- object, it will associate itself with each ``Table`` in which it is
- used, and will be created when any of those individual tables are
- created, after a check is performed for its existence. The type is
- only dropped when ``drop_all()`` is called for that ``Table``
- object's metadata, however.
+ created and dropped within :meth:`_schema.MetaData.create_all` and
+ :meth:`_schema.MetaData.drop_all` operations. If the type is not
+ associated with any :class:`_schema.MetaData` object, it will
+ automatically associate itself with the :class:`_schema.MetaData`
+ object from the first :class:`_schema.Table` it's used in.
+ The type will be created when any table that uses it is created,
+ after a check is performed for its existence.
+ The type is only dropped when :meth:`_schema.MetaData.drop_all`
+ is called for the associated metadata.
+
+ .. versionchanged:: 2.1 named types like :class:`.Enum`,
+ :class:`_postgresql.ENUM` and :class:`_postgresql.DOMAIN` now
+ inherit the schema of the associated :class:`.MetaData`
+ automatically, even when only first associated with a
+ :class:`.Table`.
The value of the :paramref:`_schema.MetaData.schema` parameter of
the :class:`_schema.MetaData` object, if set, will be used as the
default value of the :paramref:`_types.Enum.schema` on this object
if an explicit value is not otherwise supplied.
- .. versionchanged:: 1.4.12 :class:`_types.Enum` inherits the
- :paramref:`_schema.MetaData.schema` parameter of the
- :class:`_schema.MetaData` object if present, when passed using
- the :paramref:`_types.Enum.metadata` parameter.
+ .. versionchanged:: 2.1 :class:`_types.Enum` will associate itself
+ with the metadata of the first ``Table`` it is used in, if a
+ metadata object is not provided.
:param name: The name of this type. This is required for PostgreSQL
and any future supported database which requires an explicitly
that are created and dropped separately from the table(s) they
are used by. Indicates that ``CREATE TYPE`` should be emitted,
after optionally checking for the presence of the type, when the
- parent table is being created; and additionally that ``DROP TYPE`` is
- called when the table is dropped. This parameter is equivalent to the
+ parent table is being created. This parameter is equivalent to the
parameter of the same name on the PostgreSQL-specific
:class:`_postgresql.ENUM` datatype.
present.
If not present, the schema name will be taken from the
- :class:`_schema.MetaData` collection if passed as
- :paramref:`_types.Enum.metadata`, for a :class:`_schema.MetaData`
- that includes the :paramref:`_schema.MetaData.schema` parameter.
-
- .. versionchanged:: 1.4.12 :class:`_types.Enum` inherits the
- :paramref:`_schema.MetaData.schema` parameter of the
- :class:`_schema.MetaData` object if present, when passed using
- the :paramref:`_types.Enum.metadata` parameter.
-
- Otherwise, the schema will be inherited from the associated
- :class:`_schema.Table` object if any; when
- :paramref:`_types.Enum.inherit_schema` is set to
- ``False``, the owning table's schema is **not** used.
-
+ :class:`_schema.MetaData` collection if it
+ includes the :paramref:`_schema.MetaData.schema` parameter,
+ unless the deprecated :paramref:`.Enum.inherit_schema` parameter
+ is set to ``True``.
:param quote: Set explicit quoting preferences for the type's name.
:param inherit_schema: When ``True``, the "schema" from the owning
- :class:`_schema.Table` will be copied to the "schema"
- attribute of this :class:`.Enum`, replacing whatever value was
- passed for the :paramref:`_types.Enum.schema` attribute.
- This also takes effect when using the
- :meth:`_schema.Table.to_metadata` operation.
- Set to ``False`` to retain the schema value provided.
- By default the behavior will be to inherit the table schema unless
- either :paramref:`_types.Enum.schema` and / or
- :paramref:`_types.Enum.metadata` are set.
-
- .. versionchanged:: 2.1 The default value of this parameter
- was changed to ``True`` when :paramref:`_types.Enum.schema`
- and :paramref:`_types.Enum.metadata` are not provided.
+ :class:`_schema.Table`
+ will be copied to the "schema" attribute of this
+ :class:`.Enum`.
+
+ .. deprecated:: 2.1 Setting the :paramref:`.Enum.inherit_schema`
+ parameter is deprecated. Provide the schema directly if
+ the default behavior of using the :class:`_schema.MetaData`
+ schema is not desired.
:param validate_strings: when True, string values that are being
passed to the database in a SQL statement will be checked
self,
name=kw.pop("name", None),
create_type=kw.pop("create_type", True),
- inherit_schema=kw.pop("inherit_schema", NO_ARG),
- schema=kw.pop("schema", None),
+ inherit_schema=kw.pop("inherit_schema", False),
+ schema=kw.pop("schema", NO_ARG),
metadata=kw.pop("metadata", None),
quote=kw.pop("quote", None),
_create_events=kw.pop("_create_events", True),
("native_enum", True),
("create_constraint", False),
("length", self._default_length),
+ ("schema", None),
],
to_inspect=[Enum, SchemaType],
+ omit_kwarg=["schema", "inherit_schema"],
)
def as_generic(self, allow_nulltype=False):
if self.name:
kw.setdefault("name", self.name)
kw.setdefault("schema", self.schema)
- kw.setdefault("inherit_schema", self.inherit_schema)
kw.setdefault("metadata", self.metadata)
kw.setdefault("native_enum", self.native_enum)
kw.setdefault("values_callable", self.values_callable)
"CREATE DOMAIN arraydomain_2d AS INTEGER[][]"
)
connection.exec_driver_sql(
- "CREATE DOMAIN arraydomain_3d AS INTEGER[][][]"
+ "CREATE DOMAIN arraydomain_3d AS INTEGER[][][]"
)
yield
connection.exec_driver_sql("DROP DOMAIN arraydomain")
from sqlalchemy.dialects.postgresql import INT8RANGE
from sqlalchemy.dialects.postgresql import JSON
from sqlalchemy.dialects.postgresql import JSONB
-from sqlalchemy.dialects.postgresql import NamedType
from sqlalchemy.dialects.postgresql import NUMMULTIRANGE
from sqlalchemy.dialects.postgresql import NUMRANGE
from sqlalchemy.dialects.postgresql import pg8000
from sqlalchemy.sql import bindparam
from sqlalchemy.sql import operators
from sqlalchemy.sql import sqltypes
+from sqlalchemy.sql.ddl import CheckFirst
+from sqlalchemy.testing import expect_deprecated
from sqlalchemy.testing import expect_raises
from sqlalchemy.testing import expect_raises_message
from sqlalchemy.testing import fixtures
("create_type", False, "create_type"),
("create_type", True, "create_type"),
("schema", "someschema", "schema"),
- ("inherit_schema", False, "inherit_schema"),
("metadata", MetaData(), "metadata"),
("values_callable", lambda x: None, "values_callable"),
)
eq_(getattr(e1_copy, attrname), value)
- def test_enum_create_table(self, metadata, connection):
+ @testing.variation("type_exists", [True, False])
+ @testing.variation("type_meta", [True, False])
+ def test_enum_create_table(
+ self, metadata, connection, type_exists, type_meta
+ ):
metadata = self.metadata
t1 = Table(
"table",
metadata,
Column("id", Integer, primary_key=True),
Column(
- "value", Enum("one", "two", "three", name="onetwothreetype")
+ "value",
+ Enum(
+ "one",
+ "two",
+ "three",
+ name="onetwothreetype",
+ metadata=metadata if type_meta else None,
+ ),
),
)
- t1.create(connection)
+ if type_exists:
+ t1.c.value.type.create(connection)
+ t1.create(connection)  # checks for the type by default
t1.create(connection, checkfirst=True) # check the create
connection.execute(t1.insert(), dict(value="two"))
connection.execute(t1.insert(), dict(value="three"))
[(1, "two"), (2, "three"), (3, "three")],
)
- def test_domain_create_table(self, metadata, connection):
+ @testing.variation("type_exists", [True, False])
+ def test_domain_create_table(self, metadata, connection, type_exists):
metadata = self.metadata
Email = DOMAIN(
name="email",
data_type=Text,
check=r"VALUE ~ '[^@]+@[^@]+\.[^@]+'",
+ metadata=metadata,
)
PosInt = DOMAIN(
name="pos_int",
Column("email", Email),
Column("number", PosInt),
)
+ if type_exists:
+ Email.create(connection)
+ PosInt.create(connection)
t1.create(connection)
t1.create(connection, checkfirst=True) # check the create
connection.execute(
)
@testing.combinations(
- (ENUM("one", "two", "three", name="mytype"), "get_enums"),
+ CheckFirst.ALL, CheckFirst.TYPES, True, argnames="checkfirst"
+ )
+ def test_checkfirst(self, metadata, connection, checkfirst):
+ Value = Enum("a", "b", "c", name="value", metadata=metadata)
+ Value.create(connection)
+
+ # no error the second time
+ Table("t", metadata, Column("c", Value)).create(
+ connection, checkfirst=checkfirst
+ )
+
+ @testing.combinations(
+ CheckFirst.NONE,
+ CheckFirst.INDEXES,
+ CheckFirst.TABLES,
+ CheckFirst.SEQUENCES,
+ False,
+ argnames="checkfirst",
+ )
+ def test_no_checkfirst(self, metadata, connection, checkfirst):
+ Value = Enum("a", "b", "c", name="value", metadata=metadata)
+ Value.create(connection)
+
+ with expect_raises_message(exc.ProgrammingError, "value"):
+ Table("t", metadata, Column("c", Value)).create(
+ connection, checkfirst=checkfirst
+ )
+
+ @testing.variation("kind", ["enum", "domain"])
+ @testing.combinations(
+ {},
+ {"checkfirst": True},
+ {"checkfirst": False},
+ {"checkfirst": CheckFirst.TYPES},
+ {"checkfirst": CheckFirst.TABLES},
+ argnames="initial_checkfirst",
+ )
+ def test_create_behavior(
+ self, metadata, connection, kind: testing.Variation, initial_checkfirst
+ ):
+ metadata = self.metadata
+ Value = Enum("a", "b", "c", name="value", metadata=metadata)
+ PosInt = DOMAIN(
+ name="value",
+ data_type=Integer,
+ not_null=True,
+ check=r"VALUE > 0",
+ )
+ t1 = Table(
+ "tbl",
+ metadata,
+ Column("id", Integer, primary_key=True),
+ )
+ if kind.domain:
+ exists_fn = self._domain_exists
+ t1.append_column(Column("number", PosInt))
+ elif kind.enum:
+ exists_fn = self._enum_exists
+ t1.append_column(Column("value", Value))
+ else:
+ kind.fail()
+
+ t1.create(connection, **initial_checkfirst)
+
+ assert exists_fn("value", connection)
+ t1.drop(connection)
+ # drop did not remove named type
+ assert exists_fn("value", connection)
+ t1.create(connection) # by default it checks for named types
+ with connection.begin_nested() as tr:
+ with expect_raises_message(exc.ProgrammingError, "tbl"):
+ t1.create(connection) # but not for the table
+ tr.rollback()
+ t1.create(connection, checkfirst=True) # check the create
+
+ @testing.combinations(
(
- DOMAIN(
+ lambda **kw: ENUM("one", "two", "three", name="mytype", **kw),
+ "get_enums",
+ ),
+ (
+ lambda **kw: DOMAIN(
name="mytype",
data_type=Text,
check=r"VALUE ~ '[^@]+@[^@]+\.[^@]+'",
+ **kw,
),
"get_domains",
),
- argnames="datatype, method",
+ argnames="datatype_fn, method",
)
+ @testing.variation("type_meta", [True, False])
def test_drops_on_table(
- self, connection, metadata, datatype: "NamedType", method
+ self, connection, metadata, datatype_fn, method, type_meta
):
+ datatype = (
+ datatype_fn(metadata=metadata) if type_meta else datatype_fn()
+ )
table = Table("e1", metadata, Column("e1", datatype))
table.create(connection)
table.drop(connection)
- assert "mytype" not in [
+ assert "mytype" in [
e["name"] for e in getattr(inspect(connection), method)()
]
table.create(connection)
e["name"] for e in getattr(inspect(connection), method)()
]
table.drop(connection)
- assert "mytype" not in [
+ assert "mytype" in [
e["name"] for e in getattr(inspect(connection), method)()
]
t1.drop(conn)
- assert "schema_mytype" not in [
+ assert "schema_mytype" in [
e["name"]
for e in getattr(inspect(conn), method)(
schema=testing.config.test_schema
@testing.combinations(
("inherit_schema_false",),
("inherit_schema_not_provided",),
- ("metadata_schema_only",),
+ ("metadata_only",),
+ ("schema_only",),
("inherit_table_schema",),
("override_metadata_schema",),
argnames="test_case",
else:
assert False
- if test_case == "metadata_schema_only":
+ dep = expect_deprecated(
+ "the ``inherit_schema`` parameter is deprecated"
+ )
+
+ if test_case == "metadata_only":
enum = make_type(metadata=metadata)
assert_schema = testing.config.test_schema
+ elif test_case == "schema_only":
+ enum = make_type(schema=default_schema)
+ assert_schema = default_schema
elif test_case == "override_metadata_schema":
enum = make_type(
metadata=metadata,
)
assert_schema = testing.config.test_schema_2
elif test_case == "inherit_table_schema":
- enum = make_type(metadata=metadata, inherit_schema=True)
+ with dep:
+ enum = make_type(metadata=metadata, inherit_schema=True)
assert_schema = testing.config.test_schema_2
elif test_case == "inherit_schema_not_provided":
enum = make_type()
- assert_schema = testing.config.test_schema_2
+ assert_schema = testing.config.test_schema
elif test_case == "inherit_schema_false":
enum = make_type(inherit_schema=False)
- assert_schema = default_schema
+ assert_schema = testing.config.test_schema
else:
assert False
dialect="postgresql",
),
)
- expected_drop.append(
- RegexSQL("DROP DOMAIN mytype", dialect="postgresql")
- )
else:
type_exists = functools.partial(
self._enum_exists, "mytype", connection
dialect="postgresql",
),
)
- expected_drop.append(
- RegexSQL("DROP TYPE mytype", dialect="postgresql")
- )
t1 = Table("e1", metadata, Column("c1", dt))
assert type_exists()
- with self.sql_execution_asserter(connection) as drop_asserter:
- t1.drop(connection, checkfirst=False)
else:
dt.create(bind=connection, checkfirst=False)
assert type_exists()
with self.sql_execution_asserter(connection) as create_asserter:
t1.create(connection, checkfirst=False)
- with self.sql_execution_asserter(connection) as drop_asserter:
- t1.drop(connection, checkfirst=False)
- assert type_exists()
- dt.drop(bind=connection, checkfirst=False)
+ with self.sql_execution_asserter(connection) as drop_asserter:
+ t1.drop(connection, checkfirst=False)
+
+ assert type_exists()
+ dt.drop(bind=connection, checkfirst=False)
assert not type_exists()
assert_raises(exc.ProgrammingError, e1.drop, conn, checkfirst=False)
- def test_remain_on_table_metadata_wide(self, metadata, future_connection):
- connection = future_connection
-
- e1 = Enum("one", "two", "three", name="myenum", metadata=metadata)
- table = Table("e1", metadata, Column("c1", e1))
-
- # need checkfirst here, otherwise enum will not be created
- assert_raises_message(
- sa.exc.ProgrammingError,
- '.*type "myenum" does not exist',
- table.create,
- connection,
- )
- connection.rollback()
-
- table.create(connection, checkfirst=True)
- table.drop(connection)
- table.create(connection, checkfirst=True)
- table.drop(connection)
- assert "myenum" in [e["name"] for e in inspect(connection).get_enums()]
- metadata.drop_all(connection)
- assert "myenum" not in [
- e["name"] for e in inspect(connection).get_enums()
- ]
-
def test_non_native_dialect(self, metadata, testing_engine):
engine = testing_engine()
engine.connect()
{"my_enum_1", "my_enum_2", "my_enum_3"},
)
t.drop(connection)
- eq_(inspect(connection).get_enums(), [])
+ eq_(
+ {e["name"] for e in inspect(connection).get_enums()},
+ {"my_enum_1", "my_enum_2", "my_enum_3"},
+ )
def _type_combinations(
exclude_json=False,
from sqlalchemy.schema import DropConstraint
from sqlalchemy.schema import ForeignKeyConstraint
from sqlalchemy.schema import Sequence
+from sqlalchemy.sql import CheckFirst
from sqlalchemy.testing import AssertsCompiledSQL
from sqlalchemy.testing import config
from sqlalchemy.testing import engines
mock.call.before_create(
table,
self.bind,
- checkfirst=False,
+ checkfirst=CheckFirst.NONE,
_ddl_runner=mock.ANY,
_is_metadata_operation=mock.ANY,
)
mock.call.after_create(
table,
self.bind,
- checkfirst=False,
+ checkfirst=CheckFirst.NONE,
_ddl_runner=mock.ANY,
_is_metadata_operation=mock.ANY,
)
mock.call.before_create(
table,
self.bind,
- checkfirst=False,
+ checkfirst=CheckFirst.NONE,
_ddl_runner=mock.ANY,
_is_metadata_operation=mock.ANY,
),
mock.call.after_create(
table,
self.bind,
- checkfirst=False,
+ checkfirst=CheckFirst.NONE,
_ddl_runner=mock.ANY,
_is_metadata_operation=mock.ANY,
),
mock.call.before_drop(
table,
self.bind,
- checkfirst=False,
+ checkfirst=CheckFirst.NONE,
_ddl_runner=mock.ANY,
_is_metadata_operation=mock.ANY,
)
mock.call.after_drop(
table,
self.bind,
- checkfirst=False,
+ checkfirst=CheckFirst.NONE,
_ddl_runner=mock.ANY,
_is_metadata_operation=mock.ANY,
)
mock.call.before_drop(
table,
self.bind,
- checkfirst=False,
+ checkfirst=CheckFirst.NONE,
_ddl_runner=mock.ANY,
_is_metadata_operation=mock.ANY,
),
mock.call.after_drop(
table,
self.bind,
- checkfirst=False,
+ checkfirst=CheckFirst.NONE,
_ddl_runner=mock.ANY,
_is_metadata_operation=mock.ANY,
),
mock.call.before_create(
table,
self.bind,
- checkfirst=False,
+ checkfirst=CheckFirst.NONE,
_ddl_runner=mock.ANY,
_is_metadata_operation=mock.ANY,
),
mock.call.after_create(
table,
self.bind,
- checkfirst=False,
+ checkfirst=CheckFirst.NONE,
_ddl_runner=mock.ANY,
_is_metadata_operation=mock.ANY,
),
mock.call.before_drop(
table,
self.bind,
- checkfirst=False,
+ checkfirst=CheckFirst.NONE,
_ddl_runner=mock.ANY,
_is_metadata_operation=mock.ANY,
),
mock.call.after_drop(
table,
self.bind,
- checkfirst=False,
+ checkfirst=CheckFirst.NONE,
_ddl_runner=mock.ANY,
_is_metadata_operation=mock.ANY,
),
# used in the current testing strategy.
metadata,
self.bind,
- checkfirst=False,
+ checkfirst=CheckFirst.NONE,
tables=list(metadata.tables.values()),
_ddl_runner=mock.ANY,
)
mock.call.after_create(
metadata,
self.bind,
- checkfirst=False,
+ checkfirst=CheckFirst.NONE,
tables=list(metadata.tables.values()),
_ddl_runner=mock.ANY,
)
mock.call.before_create(
metadata,
self.bind,
- checkfirst=False,
+ checkfirst=CheckFirst.NONE,
tables=list(metadata.tables.values()),
_ddl_runner=mock.ANY,
),
mock.call.after_create(
metadata,
self.bind,
- checkfirst=False,
+ checkfirst=CheckFirst.NONE,
tables=list(metadata.tables.values()),
_ddl_runner=mock.ANY,
),
mock.call.before_drop(
metadata,
self.bind,
- checkfirst=False,
+ checkfirst=CheckFirst.NONE,
tables=list(metadata.tables.values()),
_ddl_runner=mock.ANY,
)
mock.call.after_drop(
metadata,
self.bind,
- checkfirst=False,
+ checkfirst=CheckFirst.NONE,
tables=list(metadata.tables.values()),
_ddl_runner=mock.ANY,
)
mock.call.before_drop(
metadata,
self.bind,
- checkfirst=False,
+ checkfirst=CheckFirst.NONE,
tables=list(metadata.tables.values()),
_ddl_runner=mock.ANY,
),
mock.call.after_drop(
metadata,
self.bind,
- checkfirst=False,
+ checkfirst=CheckFirst.NONE,
tables=list(metadata.tables.values()),
_ddl_runner=mock.ANY,
),
mock.call.before_create(
table,
self.bind,
- checkfirst=False,
+ # checkfirst is forced to false by the mock connection
+ checkfirst=CheckFirst.NONE,
_ddl_runner=mock.ANY,
_is_metadata_operation=mock.ANY,
)
+from unittest import mock
from unittest.mock import Mock
from sqlalchemy import Column
from sqlalchemy import schema
from sqlalchemy import Sequence
from sqlalchemy import Table
+from sqlalchemy import testing
+from sqlalchemy.sql.ddl import CheckFirst
from sqlalchemy.sql.ddl import SchemaDropper
from sqlalchemy.sql.ddl import SchemaGenerator
+from sqlalchemy.testing import eq_
from sqlalchemy.testing import fixtures
+from sqlalchemy.testing import is_
class EmitDDLTest(fixtures.TestBase):
self._assert_create_comment([t1, t1, c1], generator, m)
- def test_create_seq_checkfirst(self):
+ _true_seq = testing.combinations(
+ True,
+ CheckFirst.ALL,
+ CheckFirst.SEQUENCES | CheckFirst.TABLES,
+ argnames="checkfirst",
+ )
+
+ @_true_seq
+ def test_create_seq_checkfirst(self, checkfirst):
m, t1, t2, s1, s2 = self._table_seq_fixture()
generator = self._mock_create_fixture(
- True, [t1, t2], item_exists=lambda t: t not in ("t1", "s1")
+ checkfirst, [t1, t2], item_exists=lambda t: t not in ("t1", "s1")
)
self._assert_create([t1, s1], generator, m)
- def test_drop_seq_checkfirst(self):
+ @_true_seq
+ def test_drop_seq_checkfirst(self, checkfirst):
m, t1, t2, s1, s2 = self._table_seq_fixture()
generator = self._mock_drop_fixture(
- True, [t1, t2], item_exists=lambda t: t in ("t1", "s1")
+ checkfirst, [t1, t2], item_exists=lambda t: t in ("t1", "s1")
)
self._assert_drop([t1, s1], generator, m)
return True
generator = self._mock_drop_fixture(True, [t1], item_exists=exists)
- self._assert_drop_tables([t1], generator, t1)
+ self._assert_drop_tables([t1], generator, t1, True)
- def test_create_index_checkfirst_exists(self):
- m, t1, i1 = self._table_index_fixture()
- generator = self._mock_create_fixture(
- True, [i1], item_exists=lambda idx: True
- )
- self._assert_create_index([], generator, i1)
+ _true_index = testing.combinations(
+ True, CheckFirst.ALL, CheckFirst.INDEXES, argnames="checkfirst"
+ )
- def test_create_index_checkfirst_doesnt_exist(self):
+ @_true_index
+ def test_create_index_checkfirst_exists(self, checkfirst):
m, t1, i1 = self._table_index_fixture()
generator = self._mock_create_fixture(
- True, [i1], item_exists=lambda idx: False
+ checkfirst, [i1], item_exists=lambda idx: True
)
- self._assert_create_index([i1], generator, i1)
+ self._assert_create_index([], generator, i1, checkfirst)
- def test_create_index_nocheck_exists(self):
+ @_true_index
+ def test_create_index_checkfirst_doesnt_exist(self, checkfirst):
m, t1, i1 = self._table_index_fixture()
generator = self._mock_create_fixture(
- False, [i1], item_exists=lambda idx: True
+ checkfirst, [i1], item_exists=lambda idx: False
)
- self._assert_create_index([i1], generator, i1)
+ self._assert_create_index([i1], generator, i1, checkfirst)
- def test_create_index_nocheck_doesnt_exist(self):
+ _false_index = testing.combinations(
+ False,
+ CheckFirst.NONE,
+ CheckFirst.TABLES,
+ CheckFirst.SEQUENCES,
+ CheckFirst.TYPES,
+ argnames="checkfirst",
+ )
+
+ @_false_index
+ def test_create_index_nocheck_doesnt_exist(self, checkfirst):
m, t1, i1 = self._table_index_fixture()
generator = self._mock_create_fixture(
- False, [i1], item_exists=lambda idx: False
+ checkfirst, [i1], item_exists=lambda idx: False
)
- self._assert_create_index([i1], generator, i1)
+ self._assert_create_index([i1], generator, i1, checkfirst)
- def test_drop_index_checkfirst_exists(self):
+ @_true_index
+ def test_drop_index_checkfirst_exists(self, checkfirst):
m, t1, i1 = self._table_index_fixture()
generator = self._mock_drop_fixture(
- True, [i1], item_exists=lambda idx: True
+ checkfirst, [i1], item_exists=lambda idx: True
)
- self._assert_drop_index([i1], generator, i1)
+ self._assert_drop_index([i1], generator, i1, checkfirst)
- def test_drop_index_checkfirst_doesnt_exist(self):
+ @_true_index
+ def test_drop_index_checkfirst_doesnt_exist(self, checkfirst):
m, t1, i1 = self._table_index_fixture()
generator = self._mock_drop_fixture(
- True, [i1], item_exists=lambda idx: False
+ checkfirst, [i1], item_exists=lambda idx: False
)
- self._assert_drop_index([], generator, i1)
+ self._assert_drop_index([], generator, i1, checkfirst)
- def test_drop_index_nocheck_exists(self):
+ @_false_index
+ def test_drop_index_nocheck_exists(self, checkfirst):
m, t1, i1 = self._table_index_fixture()
generator = self._mock_drop_fixture(
- False, [i1], item_exists=lambda idx: True
+ checkfirst, [i1], item_exists=lambda idx: True
)
- self._assert_drop_index([i1], generator, i1)
+ self._assert_drop_index([i1], generator, i1, checkfirst)
- def test_drop_index_nocheck_doesnt_exist(self):
+ @_false_index
+ def test_drop_index_nocheck_doesnt_exist(self, checkfirst):
m, t1, i1 = self._table_index_fixture()
generator = self._mock_drop_fixture(
- False, [i1], item_exists=lambda idx: False
+ checkfirst, [i1], item_exists=lambda idx: False
)
- self._assert_drop_index([i1], generator, i1)
+ self._assert_drop_index([i1], generator, i1, checkfirst)
+
+ _true_table = testing.combinations(
+ True, CheckFirst.ALL, CheckFirst.TABLES, argnames="checkfirst"
+ )
- def test_create_collection_checkfirst(self):
+ @_true_table
+ def test_create_collection_checkfirst(self, checkfirst):
m, t1, t2, t3, t4, t5 = self._table_fixture()
generator = self._mock_create_fixture(
- True, [t2, t3, t4], item_exists=lambda t: t not in ("t2", "t4")
+ checkfirst,
+ [t2, t3, t4],
+ item_exists=lambda t: t not in ("t2", "t4"),
)
- self._assert_create_tables([t2, t4], generator, m)
+ self._assert_create_tables([t2, t4], generator, m, checkfirst)
def test_drop_collection_checkfirst(self):
m, t1, t2, t3, t4, t5 = self._table_fixture()
True, [t2, t3, t4], item_exists=lambda t: t in ("t2", "t4")
)
- self._assert_drop_tables([t2, t4], generator, m)
+ self._assert_drop_tables([t2, t4], generator, m, True)
+
+ _false_table = testing.combinations(
+ False,
+ CheckFirst.NONE,
+ CheckFirst.INDEXES,
+ CheckFirst.SEQUENCES,
+ CheckFirst.TYPES,
+ argnames="checkfirst",
+ )
- def test_create_collection_nocheck(self):
+ @_false_table
+ def test_create_collection_nocheck(self, checkfirst):
m, t1, t2, t3, t4, t5 = self._table_fixture()
generator = self._mock_create_fixture(
- False, [t2, t3, t4], item_exists=lambda t: t not in ("t2", "t4")
+ checkfirst,
+ [t2, t3, t4],
+ item_exists=lambda t: t not in ("t2", "t4"),
)
- self._assert_create_tables([t2, t3, t4], generator, m)
+ self._assert_create_tables([t2, t3, t4], generator, m, checkfirst)
def test_create_empty_collection(self):
m, t1, t2, t3, t4, t5 = self._table_fixture()
True, [], item_exists=lambda t: t not in ("t2", "t4")
)
- self._assert_create_tables([], generator, m)
+ self._assert_create_tables([], generator, m, True)
def test_drop_empty_collection(self):
m, t1, t2, t3, t4, t5 = self._table_fixture()
True, [], item_exists=lambda t: t in ("t2", "t4")
)
- self._assert_drop_tables([], generator, m)
+ self._assert_drop_tables([], generator, m, True)
def test_drop_collection_nocheck(self):
m, t1, t2, t3, t4, t5 = self._table_fixture()
False, [t2, t3, t4], item_exists=lambda t: t in ("t2", "t4")
)
- self._assert_drop_tables([t2, t3, t4], generator, m)
+ self._assert_drop_tables([t2, t3, t4], generator, m, False)
def test_create_metadata_checkfirst(self):
m, t1, t2, t3, t4, t5 = self._table_fixture()
True, None, item_exists=lambda t: t not in ("t2", "t4")
)
- self._assert_create_tables([t2, t4], generator, m)
+ self._assert_create_tables([t2, t4], generator, m, True)
def test_drop_metadata_checkfirst(self):
m, t1, t2, t3, t4, t5 = self._table_fixture()
True, None, item_exists=lambda t: t in ("t2", "t4")
)
- self._assert_drop_tables([t2, t4], generator, m)
+ self._assert_drop_tables([t2, t4], generator, m, True)
def test_create_metadata_nocheck(self):
m, t1, t2, t3, t4, t5 = self._table_fixture()
False, None, item_exists=lambda t: t not in ("t2", "t4")
)
- self._assert_create_tables([t1, t2, t3, t4, t5], generator, m)
+ self._assert_create_tables([t1, t2, t3, t4, t5], generator, m, False)
def test_drop_metadata_nocheck(self):
m, t1, t2, t3, t4, t5 = self._table_fixture()
False, None, item_exists=lambda t: t in ("t2", "t4")
)
- self._assert_drop_tables([t1, t2, t3, t4, t5], generator, m)
+ self._assert_drop_tables([t1, t2, t3, t4, t5], generator, m, False)
def test_create_metadata_auto_alter_fk(self):
m, t1, t2 = self._use_alter_fixture_one()
m,
)
- def _assert_create_tables(self, elements, generator, argument):
+ def _assert_create_tables(self, elements, generator, argument, checkfirst):
self._assert_ddl(schema.CreateTable, elements, generator, argument)
- def _assert_drop_tables(self, elements, generator, argument):
+ if CheckFirst(checkfirst) & CheckFirst.TABLES:
+ if generator.tables is not None:
+ tables = generator.tables
+ elif isinstance(argument, MetaData):
+ tables = argument.tables.values()
+ else:
+ assert False, "don't know what tables we are checking"
+ eq_(
+ generator.dialect.has_table.mock_calls,
+ [
+ mock.call(mock.ANY, tablename, schema=mock.ANY)
+ for tablename in [t.name for t in tables]
+ ],
+ )
+ else:
+ eq_(
+ generator.dialect.has_table.mock_calls,
+ [],
+ )
+
+ def _assert_drop_tables(self, elements, generator, argument, checkfirst):
self._assert_ddl(schema.DropTable, elements, generator, argument)
+ if CheckFirst(checkfirst) & CheckFirst.TABLES:
+ if generator.tables is not None:
+ tables = generator.tables
+ elif isinstance(argument, MetaData):
+ tables = argument.tables.values()
+ else:
+ assert False, "don't know what tables we are checking"
+ eq_(
+ generator.dialect.has_table.mock_calls,
+ [
+ mock.call(mock.ANY, tablename, schema=mock.ANY)
+ for tablename in [t.name for t in tables]
+ ],
+ )
+ else:
+ eq_(
+ generator.dialect.has_table.mock_calls,
+ [],
+ )
+
def _assert_create(self, elements, generator, argument):
self._assert_ddl(
(schema.CreateTable, schema.CreateSequence, schema.CreateIndex),
argument,
)
- def _assert_create_index(self, elements, generator, argument):
+ def _assert_create_index(self, elements, generator, argument, checkfirst):
self._assert_ddl((schema.CreateIndex,), elements, generator, argument)
- def _assert_drop_index(self, elements, generator, argument):
+ if CheckFirst(checkfirst) & CheckFirst.INDEXES:
+ tablename = argument.table.name
+ indexname = argument.name
+ eq_(
+ generator.dialect.has_index.mock_calls,
+ [mock.call(mock.ANY, tablename, indexname, schema=mock.ANY)],
+ )
+ else:
+ eq_(
+ generator.dialect.has_index.mock_calls,
+ [],
+ )
+
+ def _assert_drop_index(self, elements, generator, argument, checkfirst):
self._assert_ddl((schema.DropIndex,), elements, generator, argument)
+ if CheckFirst(checkfirst) & CheckFirst.INDEXES:
+ tablename = argument.table.name
+ indexname = argument.name
+ eq_(
+ generator.dialect.has_index.mock_calls,
+ [mock.call(mock.ANY, tablename, indexname, schema=mock.ANY)],
+ )
+ else:
+ eq_(
+ generator.dialect.has_index.mock_calls,
+ [],
+ )
+
def _assert_ddl(self, ddl_cls, elements, generator, argument):
+ elements = list(elements)
generator.traverse_single(argument)
for call_ in generator.connection.execute.mock_calls:
c = call_[1][0]
if e not in set(c.include_foreign_key_constraints)
]
assert not elements, "elements remain in list: %r" % elements
+
+
+class MiscTests(fixtures.TestBase):
+ def test_checkfirst_values(self):
+ # ensure that `if checkfirst:` keeps working
+ for m in CheckFirst:
+ if m is CheckFirst.NONE:
+ is_(bool(m), False)
+ else:
+ is_(bool(m), True)
+
+ def test_checkfirst_from_bool(self):
+ eq_(CheckFirst(True), CheckFirst.ALL)
+ eq_(CheckFirst(False), CheckFirst.NONE)
+ eq_(CheckFirst(0), CheckFirst.NONE)
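The boolean-compatibility behavior exercised by ``MiscTests`` above can be reproduced with a small ``enum.Flag`` sketch. The flag names mirror the test suite, but the bit values and the ``_missing_`` hook are illustrative assumptions, not necessarily how SQLAlchemy's ``CheckFirst`` is implemented:

```python
from enum import Flag


class CheckFirstSketch(Flag):
    """Sketch of a checkfirst flag with legacy-bool compatibility.

    No member uses the value 1, so ``CheckFirstSketch(True)`` (True == 1)
    misses the direct value lookup and falls through to ``_missing_``,
    which maps legacy booleans onto ALL/NONE.
    """

    NONE = 0
    TABLES = 2
    INDEXES = 4
    SEQUENCES = 8
    TYPES = 16
    ALL = TABLES | INDEXES | SEQUENCES | TYPES

    @classmethod
    def _missing_(cls, value):
        # map legacy booleans: True -> ALL, False -> NONE
        if isinstance(value, bool):
            return cls.ALL if value else cls.NONE
        return super()._missing_(value)
```

Since ``NONE`` has value 0, ``bool(CheckFirstSketch.NONE)`` is ``False`` and every other member is truthy, so existing ``if checkfirst:`` call sites keep working.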
from contextlib import contextmanager
+from contextlib import nullcontext
import pickle
import sqlalchemy as tsa
from sqlalchemy.testing import emits_warning
from sqlalchemy.testing import eq_
from sqlalchemy.testing import eq_ignore_whitespace
+from sqlalchemy.testing import expect_deprecated
from sqlalchemy.testing import expect_raises_message
+from sqlalchemy.testing import expect_warnings
from sqlalchemy.testing import fixtures
from sqlalchemy.testing import is_
from sqlalchemy.testing import is_false
from sqlalchemy.testing import is_true
from sqlalchemy.testing import mock
from sqlalchemy.testing import Variation
-from sqlalchemy.testing.assertions import expect_warnings
class MetaDataTest(fixtures.TestBase, ComparesTables):
else:
MetaData(42) # type: ignore
+ def test_metadata_schemas(self):
+ eq_(MetaData().schemas, ())
+ eq_(MetaData(schema="foo").schemas, ("foo",))
+
+ m1 = MetaData()
+ Table("t", m1, schema="x")
+ Table("t2", m1, schema="y")
+ eq_(sorted(m1.schemas), ["x", "y"])
+
+ m2 = MetaData(schema="w")
+ Table("t", m2, schema="x")
+ Table("t2", m2, schema="y")
+ eq_(sorted(m2.schemas), ["w", "x", "y"])
+
+ def test_get_schema_objects_empty(self):
+ eq_(MetaData().get_schema_objects(Enum), ())
+
+ def test_get_schema_objects(self):
+ class MyEnum(Enum):
+ pass
+
+ m1 = MetaData()
+ e = Enum("a", "b", name="foo")
+ e2 = MyEnum("a", "b", name="foo", schema="t", metadata=m1)
+ s = Sequence("s")
+ Table("t", m1, Column("c", e), Column("s", Integer, s))
+ eq_(m1.get_schema_objects(Enum), (e,))
+ eq_(m1.get_schema_objects(MyEnum), ())
+ eq_(m1.get_schema_objects(Enum, schema="t"), (e2,))
+ eq_(m1.get_schema_objects(MyEnum, schema="t"), (e2,))
+ eq_(m1.get_schema_objects(Sequence), (s,))
+
+ def test_get_schema_object_by_name(self):
+ class MyEnum(Enum):
+ pass
+
+ m1 = MetaData()
+ e = Enum("a", "b", name="foo")
+ e2 = Enum("x", "y", name="bar", metadata=m1)
+ e3 = MyEnum("x", "b", name="foo", schema="t", metadata=m1)
+ e4 = Enum("a", "y", name="baz", schema="t")
+ Table("t", m1, Column("c", e), Column("c2", e4))
+ eq_(m1.get_schema_object_by_name(Enum, "foo"), e)
+ eq_(m1.get_schema_object_by_name(Enum, "bar"), e2)
+ eq_(m1.get_schema_object_by_name(MyEnum, "bar"), None)
+ eq_(m1.get_schema_object_by_name(Enum, "baz"), None)
+ eq_(m1.get_schema_object_by_name(Enum, "foo", schema="t"), e3)
+ eq_(m1.get_schema_object_by_name(Enum, "foo", schema="t"), e3)
+ eq_(m1.get_schema_object_by_name(MyEnum, "foo", schema="t"), e3)
+ eq_(m1.get_schema_object_by_name(Enum, "baz", schema="t"), e4)
+ eq_(m1.get_schema_object_by_name(Enum, "bar", schema="t"), None)
+
+ def test_custom_schematype(self):
+ class FooType(sqltypes.SchemaType, sqltypes.TypeEngine):
+ pass
+
+ m1 = MetaData()
+ t1 = FooType(name="foo")
+ t2 = FooType(name="bar", metadata=m1)
+ Table("t", m1, Column("c", t1))
+ eq_(set(m1.get_schema_objects(FooType)), {t1, t2})
+ eq_(m1.get_schema_object_by_name(FooType, "foo"), t1)
+ eq_(m1.get_schema_object_by_name(FooType, "bar"), t2)
+ eq_(m1.get_schema_object_by_name(Enum, "bar"), None)
+
class ToMetaDataTest(fixtures.TestBase, AssertsCompiledSQL, ComparesTables):
def test_adapt_to_schema(self):
m = MetaData()
type_ = self.MyType()
- eq_(type_.inherit_schema, True)
t1 = Table("x", m, Column("y", type_), schema="z")
- eq_(t1.c.y.type.schema, "z")
+ eq_(t1.c.y.type.schema, None)
adapted = t1.c.y.type.adapt(self.MyType)
- eq_(type_.inherit_schema, False)
- eq_(adapted.inherit_schema, False)
-
- eq_(adapted.schema, "z")
+ eq_(adapted.schema, None)
adapted2 = t1.c.y.type.adapt(self.MyType, schema="q")
eq_(adapted2.schema, "q")
t1 = Table("x", m, Column("y", type_), schema="z")
eq_(t1.c.y.type.schema, "q")
+ @property
+ def inherit_schema_deprecated(self):
+ return expect_deprecated(
+ "the ``inherit_schema`` parameter is deprecated"
+ )
+
+ @testing.combinations(
+ {}, {"inherit_schema": True}, {"inherit_schema": False}
+ )
+ def test_inherit_schema_deprecated(self, args):
+ if args.get("inherit_schema", False):
+ dep = self.inherit_schema_deprecated
+ else:
+ dep = nullcontext()
+ with dep:
+ typ = self.MyType(**args)
+ with expect_deprecated(
+ "The ``inherit_schema`` property is deprecated"
+ ):
+ eq_(typ.inherit_schema, args.get("inherit_schema", False))
+
def test_inherit_schema_from_metadata(self):
"""test #6373"""
m = MetaData(schema="q")
def test_inherit_schema_from_table_override_metadata(self):
"""test #6373"""
m = MetaData(schema="q")
- type_ = self.MyType(metadata=m, inherit_schema=True)
+ with self.inherit_schema_deprecated:
+ type_ = self.MyType(metadata=m, inherit_schema=True)
t1 = Table("x", m, Column("y", type_), schema="z")
eq_(t1.c.y.type.schema, "z")
t1 = Table("x", m, Column("y", type_), schema="z")
eq_(t1.c.y.type.schema, "e")
+ @combinations({}, {"schema": "m"})
+ def test_inherit_schema_meta(self, arg):
+ m = MetaData(**arg)
+ type_ = self.MyType()
+ t1 = Table("x", m, Column("y", type_), schema="z")
+ eq_(t1.c.y.type.schema, arg.get("schema"))
+
def test_inherit_schema(self):
m = MetaData()
- type_ = self.MyType(inherit_schema=True)
+ with self.inherit_schema_deprecated:
+ type_ = self.MyType(inherit_schema=True)
t1 = Table("x", m, Column("y", type_), schema="z")
eq_(t1.c.y.type.schema, "z")
- @combinations({}, {"inherit_schema": False}, argnames="enum_kw")
@combinations({}, {"schema": "m"}, argnames="meta_kw")
@combinations({}, {"schema": "t"}, argnames="table_kw")
- def test_independent_schema_enum_explicit_schema(
- self, enum_kw, meta_kw, table_kw
- ):
+ def test_independent_schema_enum_explicit_schema(self, meta_kw, table_kw):
m = MetaData(**meta_kw)
- type_ = sqltypes.Enum("a", schema="e", **enum_kw)
+ type_ = sqltypes.Enum("a", schema="e")
t1 = Table("x", m, Column("y", type_), **table_kw)
eq_(t1.c.y.type.schema, "e")
def test_explicit_schema_w_inherit_raises(self):
- with expect_raises_message(
- exc.ArgumentError,
- "Ambiguously setting inherit_schema=True while also passing "
- "a non-None schema argument",
+ with (
+ expect_raises_message(
+ exc.ArgumentError,
+ "Ambiguously setting inherit_schema=True while also passing "
+ "a schema argument",
+ ),
+ self.inherit_schema_deprecated,
):
sqltypes.Enum("a", schema="e", inherit_schema=True)
- def test_independent_schema_off_no_explicit_schema(self):
+ def test_schema_none_does_not_inherit(self):
m = MetaData(schema="m")
- type_ = sqltypes.Enum("a", inherit_schema=False)
+ type_ = sqltypes.Enum("a", schema=None)
t1 = Table("x", m, Column("y", type_), schema="z")
eq_(t1.c.y.type.schema, None)
- def test_inherit_schema_enum_auto(self):
- m = MetaData()
- type_ = sqltypes.Enum("a", "b", "c")
- t1 = Table("x", m, Column("y", type_), schema="z")
- eq_(t1.c.y.type.schema, "z")
-
- def test_inherit_schema_enum_meta(self):
- m = MetaData(schema="q")
- type_ = sqltypes.Enum("a", "b", "c")
+ @combinations("set_meta", "no_set_meta")
+ def test_schema_blank_schema_set_none(self, case):
+ m = MetaData(schema="m")
+ kw = {"set_meta": {"metadata": m}, "no_set_meta": {}}
+ type_ = sqltypes.Enum("a", schema=None, **kw[case])
t1 = Table("x", m, Column("y", type_), schema="z")
- eq_(t1.c.y.type.schema, "z")
+ eq_(t1.c.y.type.schema, None)
def test_inherit_schema_enum_set_meta(self):
m = MetaData(schema="q")
type_ = sqltypes.Enum("a", "b", "c", metadata=m)
+ eq_(type_.schema, "q")
t1 = Table("x", m, Column("y", type_), schema="z")
eq_(t1.c.y.type.schema, "q")
def test_inherit_schema_enum_set_meta_explicit(self):
m = MetaData(schema="q")
type_ = sqltypes.Enum("a", "b", "c", metadata=m, schema="e")
+ eq_(type_.schema, "e")
t1 = Table("x", m, Column("y", type_), schema="z")
eq_(t1.c.y.type.schema, "e")
m2 = MetaData()
t2 = t1.to_metadata(m2)
- if assign_metadata:
- # metadata was transferred
- # issue #11802
- is_(t2.c.y.type.metadata, m2)
- else:
- # metadata isn't set
- is_(t2.c.y.type.metadata, None)
+ # metadata was transferred
+ # issue #11802
+ is_(t2.c.y.type.metadata, m2)
# our test type sets table, though
is_(t2.c.y.type.table, t2)
t2 = t1.to_metadata(m2)
eq_(t2.c.y.type.schema, "z")
- @testing.variation("inherit_schema", ["novalue", True, False])
- def test_to_metadata_independent_schema(self, inherit_schema):
- m1 = MetaData()
-
- if inherit_schema.novalue:
- type_ = self.MyType()
- else:
- type_ = self.MyType(inherit_schema=bool(inherit_schema))
-
- t1 = Table("x", m1, Column("y", type_))
-
- m2 = MetaData()
- t2 = t1.to_metadata(m2, schema="bar")
-
- if inherit_schema.novalue or inherit_schema:
- eq_(t2.c.y.type.schema, "bar")
- else:
- eq_(t2.c.y.type.schema, None)
-
@testing.combinations(
("name", "foobar", "name"),
("schema", "someschema", "schema"),
- ("inherit_schema", True, "inherit_schema"),
("metadata", MetaData(), "metadata"),
)
def test_copy_args(self, argname, value, attrname):
eq_(getattr(e1_copy, attrname), value)
- def test_to_metadata_inherit_schema(self):
- m1 = MetaData()
+ @testing.combinations({}, {"schema": "m"}, argnames="meta_schema")
+ @testing.combinations(True, False, argnames="inherit_schema")
+ @testing.combinations({}, {"schema": "t"}, argnames="table_schema")
+ def test_to_metadata_inherit_schema(
+ self, meta_schema, inherit_schema, table_schema
+ ):
+ m1 = MetaData(**meta_schema)
- type_ = self.MyType(inherit_schema=True)
+ if inherit_schema:
+ with self.inherit_schema_deprecated:
+ type_ = self.MyType(inherit_schema=True)
+ schema_from = meta_schema | table_schema
+ else:
+ type_ = self.MyType()
+ schema_from = meta_schema
- t1 = Table("x", m1, Column("y", type_))
- # note that inherit_schema means the schema mutates to be that
- # of the table
- is_(type_.schema, None)
+ t1 = Table("x", m1, Column("y", type_), **table_schema)
+ eq_(type_.schema, schema_from.get("schema"))
m2 = MetaData()
t2 = t1.to_metadata(m2, schema="bar")
- eq_(t1.c.y.type.schema, None)
+ eq_(t1.c.y.type.schema, schema_from.get("schema"))
eq_(t2.c.y.type.schema, "bar")
def test_to_metadata_independent_events(self):
m2 = MetaData()
t2 = t1.to_metadata(m2)
- t1.dispatch.before_create(t1, testing.db)
+ t1.dispatch.before_create(t1, testing.db, checkfirst=False)
eq_(t1.c.y.type.evt_targets, (t1,))
eq_(t2.c.y.type.evt_targets, ())
- t2.dispatch.before_create(t2, testing.db)
- t2.dispatch.before_create(t2, testing.db)
+ t2.dispatch.before_create(t2, testing.db, checkfirst=False)
+ t2.dispatch.before_create(t2, testing.db, checkfirst=False)
eq_(t1.c.y.type.evt_targets, (t1,))
eq_(t2.c.y.type.evt_targets, (t2, t2))
is_true(y_copy.type._create_events)
# for PostgreSQL, this will emit CREATE TYPE
- m.dispatch.before_create(t1, testing.db)
+ m.dispatch.before_create(t1, testing.db, checkfirst=False)
try:
eq_(t1.c.y.type.evt_targets, (t1,))
finally:
("info", {"foo": "bar"}),
argnames="paramname, value",
)
- def test_merge_column(
- self,
- paramname,
- value,
- ):
+ def test_merge_column(self, paramname, value):
args = []
params = {}
if paramname == "type" or isinstance(
"metadata",
"name",
"dispatch",
+ "_schema_provided",
):
continue
# assert each value was copied, or that
"y",
name="somename",
quote=True,
- inherit_schema=True,
native_enum=False,
)
eq_(
repr(e),
- "Enum('x', 'y', name='somename', "
- "inherit_schema=True, native_enum=False)",
+ "Enum('x', 'y', name='somename', native_enum=False)",
)
def test_repr_two(self):
e = Enum("x", "y", name="somename", create_constraint=True)
eq_(
repr(e),
- "Enum('x', 'y', name='somename', inherit_schema=True, "
- "create_constraint=True)",
+ "Enum('x', 'y', name='somename', create_constraint=True)",
)
def test_repr_three(self):
e = Enum("x", "y", native_enum=False, length=255)
eq_(
repr(e),
- "Enum('x', 'y', inherit_schema=True, "
- "native_enum=False, length=255)",
+ "Enum('x', 'y', native_enum=False, length=255)",
)
def test_repr_four(self):
e = Enum("x", "y", length=255)
eq_(
repr(e),
- "Enum('x', 'y', inherit_schema=True, length=255)",
+ "Enum('x', 'y', length=255)",
)
def test_length_native(self):
eq_(e.length, None)
eq_(
repr(e),
- "Enum('x', 'y', inherit_schema=True, "
- "native_enum=False, length=None)",
+ "Enum('x', 'y', native_enum=False, length=None)",
)
self.assert_compile(e, "VARCHAR", dialect="default")