:tickets:
a wide refactoring to "attribute loader" and "options" architectures.
- ColumnProperty and PropertyLoader define their loading behaivor via switchable
+ ColumnProperty and PropertyLoader define their loading behavior via switchable
"strategies", and MapperOptions no longer use mapper/property copying
in order to function; they are instead propagated via QueryContext
and SelectionContext objects at query/instances time.
itself differently to some MapperExtensions.
The change also affects the internal attribute API, but not
- the AttributeExtension interface nor any of the publically
+ the AttributeExtension interface nor any of the publicly
documented attribute functions.
- The unit of work no longer genererates a graph of "dependency"
+ The unit of work no longer generates a graph of "dependency"
Query.from_self() as well as query.subquery() both disable
the rendering of eager joins inside the subquery produced.
- The "disable all eager joins" feature is available publically
+ The "disable all eager joins" feature is available publicly
via a new query.enable_eagerloads() generative.
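A minimal sketch of the public flag (the ``session`` and ``User`` mapping
here are assumptions, not part of the changelog entry)::

    # suppress eager joins explicitly before wrapping in a subquery
    subq = session.query(User).enable_eagerloads(False).subquery()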
.. change::
:tickets:
The construction of types within dialects has been totally
- overhauled. Dialects now define publically available types
+ overhauled. Dialects now define publicly available types
as UPPERCASE names exclusively, and internal implementation
types using underscore identifiers (i.e. are private).
The system by which types are expressed in SQL and DDL
PickleType now uses == for comparison of values when
mutable=True, unless the "comparator" argument with a
- comparsion function is specified to the type. Objects
+ comparison function is specified to the type. Objects
being pickled will be compared based on identity (which
defeats the purpose of mutable=True) if __eq__() is not
overridden or a comparison function is not provided.
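A minimal sketch of supplying the comparator (the table and column are
illustrative only, and the era-specific ``mutable=True`` flag is not shown)::

    import operator

    from sqlalchemy import Column, Integer, MetaData, PickleType, Table

    metadata = MetaData()
    stuff = Table(
        "stuff", metadata,
        Column("id", Integer, primary_key=True),
        # values are compared with == via the supplied function, rather
        # than falling back to identity comparison of the pickled objects
        Column("data", PickleType(comparator=operator.eq)),
    )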
Fixed bug where the declarative ``__abstract__`` flag was not being
distinguished for when it was actually the value ``False``.
- The ``__abstract__`` flag needs to acutally evaluate to a True
+ The ``__abstract__`` flag needs to actually evaluate to a True
value at the level being tested.
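A minimal sketch (the class names are illustrative; ``Base`` is an assumed
declarative base)::

    from sqlalchemy.ext.declarative import declarative_base

    Base = declarative_base()

    class CommonMixin(Base):
        # must evaluate to a truthy value; __abstract__ = False (or 0)
        # means the class is treated as a regular mapped class
        __abstract__ = True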
.. changelog::
fully usable within declarative relationship configuration, as its
string classname would not be available in the registry of classnames
at mapper configuration time. The class now explicitly adds itself
- to the class regsitry, and additionally both :class:`.AbstractConcreteBase`
+ to the class registry, and additionally both :class:`.AbstractConcreteBase`
as well as :class:`.ConcreteBase` set themselves up *before* mappers
are configured within the :func:`.configure_mappers` setup, using
the new :meth:`.MapperEvents.before_configured` event.
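A rough sketch of the pattern this fix targets (names are illustrative
and not taken from the changelog entry)::

    from sqlalchemy import Column, Integer, String
    from sqlalchemy.ext.declarative import (AbstractConcreteBase,
                                             declarative_base)

    Base = declarative_base()

    class Employee(AbstractConcreteBase, Base):
        pass

    class Manager(Employee):
        __tablename__ = 'manager'
        id = Column(Integer, primary_key=True)
        name = Column(String(50))
        __mapper_args__ = {'polymorphic_identity': 'manager',
                           'concrete': True}

    # "Employee" is present in the declarative class registry before
    # configure_mappers() runs, so relationship("Employee") elsewhere
    # can resolve the string name at configuration time.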
provided by :class:`.Session` and :class:`.Connection`, and taking
place for operations such as a failure on "RELEASE SAVEPOINT".
Previously, the fix was only in place for a specific path within
- the ORM flush/commit process; it now takes place for all transational
+ the ORM flush/commit process; it now takes place for all transactional
context managers as well.
.. change::
Highlights of these changes include:
* The construction of types within dialects has been totally
- overhauled. Dialects now define publically available types
+ overhauled. Dialects now define publicly available types
as UPPERCASE names exclusively, and internal
implementation types using underscore identifiers (i.e.
are private). The system by which types are expressed in
# in 0.8, this would fail to load the unloaded state.
assert attributes.get_history(a1, 'data') == ((), ['a1',], ())
- # load_history() is now equiavlent to get_history() with
+ # load_history() is now equivalent to get_history() with
# passive=PASSIVE_OFF ^ INIT_OK
assert inspect(a1).attrs.data.load_history() == ((), ['a1',], ())
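The calls above assume imports along these lines (the ``A`` mapping and the
``a1`` instance come from the surrounding example)::

    from sqlalchemy import inspect
    from sqlalchemy.orm import attributes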
The :class:`.quoted_name` object is used internally as needed; however if
other keywords require fixed quoting preferences, the class is available
-publically.
+publicly.
:ticket:`2812`
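A minimal usage sketch (the connection URL and schema name are placeholders)::

    from sqlalchemy import create_engine, inspect
    from sqlalchemy.sql import quoted_name

    engine = create_engine("postgresql://scott:tiger@localhost/test")
    inspector = inspect(engine)
    # keep the given schema name exactly as written, quoted as-is
    inspector.get_table_names(schema=quoted_name("MySchema", quote=True))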
For an example of subclassing a built in type directly, we subclass
:class:`.postgresql.BYTEA` to provide a ``PGPString``, which will make use of the
-Postgresql ``pgcrypto`` extension to encrpyt/decrypt values
+Postgresql ``pgcrypto`` extension to encrypt/decrypt values
transparently::
from sqlalchemy import create_engine, String, select, func, \
:param isolation_level: Available on: :class:`.Connection`.
Set the transaction isolation level for
the lifespan of this :class:`.Connection` object (*not* the
- underyling DBAPI connection, for which the level is reset
+ underlying DBAPI connection, for which the level is reset
to its original setting upon termination of this
:class:`.Connection` object).
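A minimal sketch (the ``engine`` is assumed to exist)::

    # the setting lasts only for this Connection's lifespan, after which
    # the DBAPI-level isolation is reset to its original value
    conn = engine.connect().execution_options(
        isolation_level="SERIALIZABLE")
    try:
        # ... work with conn ...
        pass
    finally:
        conn.close()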
:meth:`.Connection.execute` method or similar),
this :class:`.Connection` will attempt to
procure a new DBAPI connection using the services of the
- :class:`.Pool` as a source of connectivty (e.g. a "reconnection").
+ :class:`.Pool` as a source of connectivity (e.g. a "reconnection").
If a transaction was in progress (e.g. the
:meth:`.Connection.begin` method has been called) when
raise NotImplementedError()
def do_recover_twophase(self, connection):
- """Recover list of uncommited prepared two phase transaction
+ """Recover list of uncommitted prepared two phase transaction
identifiers on the given connection.
:param connection: a :class:`.Connection`.
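A rough sketch of what a dialect-level override might look like (the
dialect class and the SQL it issues are purely hypothetical)::

    from sqlalchemy import text
    from sqlalchemy.engine import default

    class MyDialect(default.DefaultDialect):
        def do_recover_twophase(self, connection):
            # return the ids of transactions still sitting in the
            # prepared state; the table name here is made up
            rows = connection.execute(
                text("SELECT gid FROM prepared_xacts"))
            return [row[0] for row in rows]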
In the case of a result that is the product of
:ref:`connectionless execution <dbengine_implicit>`,
- the underyling :class:`.Connection` object is also closed, which
+ the underlying :class:`.Connection` object is also closed, which
:term:`releases` DBAPI connection resources.
After this method is called, it is no longer valid to call upon
discarded.
Calls to :meth:`.ResultProxy.fetchmany` after all rows have been
- exhuasted will return
+ exhausted will return
an empty list. After the :meth:`.ResultProxy.close` method is
called, the method will raise :class:`.ResourceClosedError`.
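A minimal consumption loop (``conn``, ``users`` and ``handle()`` are
assumptions, not part of the documentation above)::

    result = conn.execute(users.select())
    while True:
        chunk = result.fetchmany(100)
        if not chunk:
            # all rows exhausted; subsequent calls keep returning []
            break
        for row in chunk:
            handle(row)   # placeholder for per-row processing
    result.close()        # further fetches now raise ResourceClosedError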
the default data structure will be a Python list of ``None`` values,
at least as long as the index value; the value is then set at its
place in the list. This means for an index value of zero, the list
- will be initalized to ``[None]`` before setting the given value,
+ will be initialized to ``[None]`` before setting the given value,
and for an index value of five, the list will be initialized to
``[None, None, None, None, None]`` before setting the fifth element
to the given value. Note that an existing list is **not** extended
deletes are not supported by SQL** as well as that the
**join condition of an inheritance mapper is not
automatically rendered**. Care must be taken in any
- multiple-table delete to first accomodate via some other means
+ multiple-table delete to first accommodate via some other means
how the related table will be deleted, as well as to
explicitly include the joining
condition between those tables, even in mappings where
returned yet. Offers a slight performance advantage at the
cost of individual transactions by default. The
:meth:`.Pool.unique_connection` method is provided to return
- a consistenty unique connection to bypass this behavior
+ a consistently unique connection to bypass this behavior
when the flag is set.
.. warning:: The :paramref:`.Pool.use_threadlocal` flag
``(Table, [ForeignKeyConstraint, ...])`` such that each
:class:`.Table` follows its dependent :class:`.Table` objects.
Remaining :class:`.ForeignKeyConstraint` objects that are separate due to
- dependency rules not satisifed by the sort are emitted afterwards
+ dependency rules not satisfied by the sort are emitted afterwards
as ``(None, [ForeignKeyConstraint ...])``.
Tables are dependent on another based on the presence of
if self._has_multi_parameters:
raise exc.ArgumentError(
"Can't pass kwargs and multiple parameter sets "
- "simultaenously")
+ "simultaneously")
else:
self.parameters.update(kwargs)
def test_rec_unconnected(self):
# test production of a _ConnectionRecord with an
- # initally unconnected state.
+ # initially unconnected state.
dbapi = MockDBAPI()
p1 = pool.Pool(
self.assert_sql_count(testing.db, go, 1)
def test_double_same_mappers(self):
- """Eager loading with two relationships simulatneously,
+ """Eager loading with two relationships simultaneously,
from the same table, using aliases."""
addresses, items, order_items, orders, \
)
def test_double(self):
- """tests lazy loading with two relationships simulatneously,
+ """tests lazy loading with two relationships simultaneously,
from the same table, using aliases. """
users, orders, User, Address, Order, addresses = (
self.assert_sql_count(testing.db, go, 4)
def test_double_same_mappers(self):
- """Eager loading with two relationships simulatneously,
+ """Eager loading with two relationships simultaneously,
from the same table, using aliases."""
addresses, items, order_items, orders, Item, User, Address, Order, users = (self.tables.addresses,
i = i.values((5, 6, 7))
eq_(i.parameters, {"col1": 5, "col2": 6, "col3": 7})
- def test_kw_and_dict_simulatenously_single(self):
+ def test_kw_and_dict_simultaneously_single(self):
i = t1.insert()
i = i.values({"col1": 5}, col2=7)
eq_(i.parameters, {"col1": 5, "col2": 7})
i = t1.insert()
assert_raises_message(
exc.ArgumentError,
- "Can't pass kwargs and multiple parameter sets simultaenously",
+ "Can't pass kwargs and multiple parameter sets simultaneously",
i.values, [{"col1": 5}], col2=7
)
def test_no_clause_element_on_clauseelement(self):
# re [ticket:3802], there are in the wild examples
# of looping over __clause_element__, therefore the
- # absense of __clause_element__ as a test for "this is the clause
+ # absence of __clause_element__ as a test for "this is the clause
# element" must be maintained
x = Column('foo', Integer)
dict(user_id=1, user_name='john'),
)
- # unary experssions
+ # unary expressions
r = select([users.c.user_name.distinct()]).order_by(
users.c.user_name).execute().first()
eq_(r[users.c.user_name], 'john')