:ticket:`4826`
+.. _change_5263:
+
+ORM Batch inserts with psycopg2 now batch statements with RETURNING in most cases
+---------------------------------------------------------------------------------
+
+The change in :ref:`change_5401` adds support for "executemany" + "RETURNING"
+at the same time in Core, which is now enabled for the psycopg2 dialect
+by default using the psycopg2 ``execute_values()`` extension. The ORM flush
+process now makes use of this feature such that the retrieval of newly generated
+primary key values and server defaults can be achieved while not losing the
+performance benefits of being able to batch INSERT statements together. Additionally,
+psycopg2's ``execute_values()`` extension itself provides a five-fold performance
+improvement over psycopg2's default "executemany" implementation, by rewriting
+an INSERT statement to include many "VALUES" expressions all in one statement
+rather than invoking the same statement repeatedly; psycopg2 lacks the ability
+to PREPARE the statement ahead of time, as would normally be expected for the
+repeated-invocation approach to be performant.
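As a rough illustration of what ``execute_values()`` does, the statement rewrite can be sketched in plain Python. This is an illustrative sketch only, not psycopg2's internals; the ``expand_values`` name is hypothetical and the naive quoting stands in for the proper parameter escaping psycopg2 performs:

```python
# Illustrative sketch (NOT psycopg2 internals): execute_values() takes a
# statement with a single "%s" placeholder standing in for the VALUES list
# and expands it to many row expressions, so one round trip inserts many rows.
def expand_values(template, rows):
    # naive quoting for illustration only; psycopg2 performs real escaping
    values_clause = ",".join(
        "(" + ",".join("'%s'" % v for v in row) + ")" for row in rows
    )
    return template % values_clause

sql = expand_values(
    "INSERT INTO a (data) VALUES %s RETURNING a.id",
    [("data 1",), ("data 2",), ("data 3",)],
)
# sql is now a single multi-row statement:
# INSERT INTO a (data) VALUES ('data 1'),('data 2'),('data 3') RETURNING a.id
```

psycopg2 then sends the expanded statement once per page of rows, rather than once per parameter set.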
+
+SQLAlchemy includes a :ref:`performance suite <examples_performance>` within
+its examples, where we can compare the times generated for the "batch_inserts"
+runner against 1.3 and 1.4, revealing a 3x-5x speedup for most flavors
+of batch insert::
+
+ # 1.3
+ $ python -m examples.performance bulk_inserts --dburl postgresql://scott:tiger@localhost/test
+ test_flush_no_pk : (100000 iterations); total time 14.051527 sec
+ test_bulk_save_return_pks : (100000 iterations); total time 15.002470 sec
+ test_flush_pk_given : (100000 iterations); total time 7.863680 sec
+ test_bulk_save : (100000 iterations); total time 6.780378 sec
+ test_bulk_insert_mappings : (100000 iterations); total time 5.363070 sec
+ test_core_insert : (100000 iterations); total time 5.362647 sec
+
+ # 1.4 with enhancement
+ $ python -m examples.performance bulk_inserts --dburl postgresql://scott:tiger@localhost/test
+ test_flush_no_pk : (100000 iterations); total time 3.820807 sec
+ test_bulk_save_return_pks : (100000 iterations); total time 3.176378 sec
+ test_flush_pk_given : (100000 iterations); total time 4.037789 sec
+ test_bulk_save : (100000 iterations); total time 2.604446 sec
+ test_bulk_insert_mappings : (100000 iterations); total time 1.204897 sec
+ test_core_insert : (100000 iterations); total time 0.958976 sec
+
+Note that the ``execute_values()`` extension modifies the INSERT statement in the psycopg2
+layer, **after** it's been logged by SQLAlchemy. So with SQL logging, one will see the
+parameter sets batched together, but the joining of multiple "values" will not be visible
+on the application side::
+
+ 2020-06-27 19:08:18,166 INFO sqlalchemy.engine.Engine INSERT INTO a (data) VALUES (%(data)s) RETURNING a.id
+ 2020-06-27 19:08:18,166 INFO sqlalchemy.engine.Engine [generated in 0.00698s] ({'data': 'data 1'}, {'data': 'data 2'}, {'data': 'data 3'}, {'data': 'data 4'}, {'data': 'data 5'}, {'data': 'data 6'}, {'data': 'data 7'}, {'data': 'data 8'} ... displaying 10 of 4999 total bound parameter sets ... {'data': 'data 4998'}, {'data': 'data 4999'})
+ 2020-06-27 19:08:18,254 INFO sqlalchemy.engine.Engine COMMIT
+
+The ultimate INSERT statement can be seen by enabling statement logging on the PostgreSQL side::
+
+ 2020-06-27 19:08:18.169 EDT [26960] LOG: statement: INSERT INTO a (data)
+ VALUES ('data 1'),('data 2'),('data 3'),('data 4'),('data 5'),('data 6'),('data
+ 7'),('data 8'),('data 9'),('data 10'),('data 11'),('data 12'),
+ ... ('data 999'),('data 1000') RETURNING a.id
+
+ 2020-06-27 19:08:18.175 EDT
+ [26960] LOG: statement: INSERT INTO a (data) VALUES ('data 1001'),('data
+    1002'),('data 1003'),('data 1004'),('data 1005'),('data 1006'),('data
+ 1007'),('data 1008'),('data 1009'),('data 1010'),('data 1011'), ...
+
+The feature batches rows into groups of 1000 by default; this page size can be
+adjusted using the ``executemany_values_page_size`` argument documented at
+:ref:`psycopg2_executemany_mode`.
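For example, the executemany mode and page sizes are configured at engine creation time; the connection URL and numbers below are illustrative values, not recommendations:

```python
# Sketch: configuring the psycopg2 executemany behavior when creating
# the engine.  The URL and page sizes are illustrative values.
from sqlalchemy import create_engine

engine = create_engine(
    "postgresql+psycopg2://scott:tiger@localhost/test",
    executemany_mode="values_plus_batch",
    executemany_values_page_size=10000,  # rows per expanded VALUES statement
    executemany_batch_page_size=500,  # page size for execute_batch() fallback
)
```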
+
+:ticket:`5263`
+
+
.. _change_orm_update_returning_14:
ORM Bulk Update and Delete use RETURNING for "fetch" strategy when available
The first half of a significant performance enhancement for PostgreSQL when
using both Core and ORM, the psycopg2 dialect now uses
``psycopg2.extras.execute_values()`` by default for compiled INSERT statements
-and also implements RETURNING support in this mode.
+and also implements RETURNING support in this mode. The other half of this
+change is :ref:`change_5263` which allows the ORM to take advantage of
+RETURNING with executemany (i.e. batching of INSERT statements) so that ORM
+bulk inserts with psycopg2 are up to 400% faster, depending on the use case.
This extension method allows many rows to be INSERTed within a single
statement, using an extended VALUES clause for the statement. While
--- /dev/null
+.. change::
+ :tags: orm, performance, postgresql
+ :tickets: 5263
+
+ Implemented support for the psycopg2 ``execute_values()`` extension
+ within the ORM flush process via the enhancements to Core made
+ in :ticket:`5401`, so that this extension is used
+    both as a strategy to batch INSERT statements together and so that
+    RETURNING may now be used among multiple parameter sets to
+    retrieve primary key values back in batch. This allows nearly
+    all INSERT statements emitted by the ORM on behalf of PostgreSQL
+    to be submitted in batch and also via the ``execute_values()``
+    extension, which benchmarks at five times faster than plain
+    executemany() for this particular backend.
+
+ .. seealso::
+
+ :ref:`change_5263`
pass
-EXECUTEMANY_DEFAULT = util.symbol("executemany_default", canonical=0)
+EXECUTEMANY_PLAIN = util.symbol("executemany_plain", canonical=0)
EXECUTEMANY_BATCH = util.symbol("executemany_batch", canonical=1)
EXECUTEMANY_VALUES = util.symbol("executemany_values", canonical=2)
EXECUTEMANY_VALUES_PLUS_BATCH = util.symbol(
class PGDialect_psycopg2(PGDialect):
driver = "psycopg2"
if util.py2k:
+ # turn off supports_unicode_statements for Python 2. psycopg2 supports
+ # unicode statements in Py2K. But! it does not support unicode *bound
+ # parameter names* because it uses the Python "%" operator to
+ # interpolate these into the string, and this fails. So for Py2K, we
+ # have to use full-on encoding for statements and parameters before
+ # passing to cursor.execute().
supports_unicode_statements = False
supports_server_side_cursors = True
self.executemany_mode = util.symbol.parse_user_argument(
executemany_mode,
{
- EXECUTEMANY_DEFAULT: [None],
+ EXECUTEMANY_PLAIN: [None],
EXECUTEMANY_BATCH: ["batch"],
EXECUTEMANY_VALUES: ["values_only"],
EXECUTEMANY_VALUES_PLUS_BATCH: ["values_plus_batch", "values"],
and self._hstore_oids(connection.connection) is not None
)
- # http://initd.org/psycopg/docs/news.html#what-s-new-in-psycopg-2-0-9
+ # PGDialect.initialize() checks server version for <= 8.2 and sets
+ # this flag to False if so
+ if not self.full_returning:
+ self.insert_executemany_returning = False
+ self.executemany_mode = EXECUTEMANY_PLAIN
+
self.supports_sane_multi_rowcount = not (
self.executemany_mode & EXECUTEMANY_BATCH
)
executemany_values = (
"(%s)" % context.compiled.insert_single_values_expr
)
+ if not self.supports_unicode_statements:
+ executemany_values = executemany_values.encode(self.encoding)
+
# guard for statement that was altered via event hook or similar
if executemany_values not in statement:
executemany_values = None
executemany_values = None
if executemany_values:
- # Currently, SQLAlchemy does not pass "RETURNING" statements
- # into executemany(), since no DBAPI has ever supported that
- # until the introduction of psycopg2's executemany_values, so
- # we are not yet using the fetch=True flag.
statement = statement.replace(executemany_values, "%s")
if self.executemany_values_page_size:
kwargs = {"page_size": self.executemany_values_page_size}
if self.isinsert or self.isupdate or self.isdelete:
self.is_crud = True
self._is_explicit_returning = bool(compiled.statement._returning)
- self._is_implicit_returning = (
+ self._is_implicit_returning = bool(
compiled.returning and not compiled.statement._returning
)
result.out_parameters = out_parameters
def _setup_dml_or_text_result(self):
- if self.isinsert and not self.executemany:
+ if self.isinsert:
if (
not self._is_implicit_returning
and not self.compiled.inline
and self.dialect.postfetch_lastrowid
+ and not self.executemany
):
self._setup_ins_pk_from_lastrowid()
getter = self.compiled._inserted_primary_key_from_lastrowid_getter
self.inserted_primary_key_rows = [
- getter(None, self.compiled_parameters[0])
+ getter(None, param) for param in self.compiled_parameters
]
def _setup_ins_pk_from_implicit_returning(self, result, rows):
c.context.compiled_parameters[0],
value_params,
True,
+ c.returned_defaults,
)
rows += c.rowcount
check_rowcount = assert_singlerow
c.context.compiled_parameters[0],
value_params,
True,
+ c.returned_defaults,
)
rows += c.rowcount
else:
c.context.compiled_parameters[0],
value_params,
True,
+ c.returned_defaults
+ if not c.context.executemany
+ else None,
)
if check_rowcount:
and has_all_pks
and not hasvalue
):
-
+ # the "we don't need newly generated values back" section.
+ # here we have all the PKs, all the defaults or we don't want
+ # to fetch them, or the dialect doesn't support RETURNING at all
+ # so we have to post-fetch / use lastrowid anyway.
records = list(records)
multiparams = [rec[2] for rec in records]
last_inserted_params,
value_params,
False,
+ c.returned_defaults
+ if not c.context.executemany
+ else None,
)
else:
_postfetch_bulk_save(mapper_rec, state_dict, table)
else:
+ # here, we need defaults and/or pk values back.
+
+ records = list(records)
+ if (
+ not hasvalue
+ and connection.dialect.insert_executemany_returning
+ and len(records) > 1
+ ):
+ do_executemany = True
+ else:
+ do_executemany = False
+
if not has_all_defaults and base_mapper.eager_defaults:
statement = statement.return_defaults()
elif mapper.version_id_col is not None:
statement = statement.return_defaults(mapper.version_id_col)
+ elif do_executemany:
+ statement = statement.return_defaults(*table.primary_key)
- for (
- state,
- state_dict,
- params,
- mapper_rec,
- connection,
- value_params,
- has_all_pks,
- has_all_defaults,
- ) in records:
+ if do_executemany:
+ multiparams = [rec[2] for rec in records]
- if value_params:
- result = connection.execute(
- statement.values(value_params), params
- )
- else:
- result = cached_connections[connection].execute(
- statement, params
- )
+ c = cached_connections[connection].execute(
+ statement, multiparams
+ )
+ if bookkeeping:
+ for (
+ (
+ state,
+ state_dict,
+ params,
+ mapper_rec,
+ conn,
+ value_params,
+ has_all_pks,
+ has_all_defaults,
+ ),
+ last_inserted_params,
+ inserted_primary_key,
+ returned_defaults,
+ ) in util.zip_longest(
+ records,
+ c.context.compiled_parameters,
+ c.inserted_primary_key_rows,
+ c.returned_defaults_rows or (),
+ ):
+ for pk, col in zip(
+ inserted_primary_key, mapper._pks_by_table[table],
+ ):
+ prop = mapper_rec._columntoproperty[col]
+ if state_dict.get(prop.key) is None:
+ state_dict[prop.key] = pk
+
+ if state:
+ _postfetch(
+ mapper_rec,
+ uowtransaction,
+ table,
+ state,
+ state_dict,
+ c,
+ last_inserted_params,
+ value_params,
+ False,
+ returned_defaults,
+ )
+ else:
+ _postfetch_bulk_save(mapper_rec, state_dict, table)
+ else:
+ for (
+ state,
+ state_dict,
+ params,
+ mapper_rec,
+ connection,
+ value_params,
+ has_all_pks,
+ has_all_defaults,
+ ) in records:
+
+ if value_params:
+ result = connection.execute(
+ statement.values(value_params), params
+ )
+ else:
+ result = cached_connections[connection].execute(
+ statement, params
+ )
- primary_key = result.inserted_primary_key
- if primary_key is not None:
- # set primary key attributes
+ primary_key = result.inserted_primary_key
+ assert primary_key
for pk, col in zip(
primary_key, mapper._pks_by_table[table]
):
prop = mapper_rec._columntoproperty[col]
- if pk is not None and (
+ if (
col in value_params
or state_dict.get(prop.key) is None
):
state_dict[prop.key] = pk
- if bookkeeping:
- if state:
- _postfetch(
- mapper_rec,
- uowtransaction,
- table,
- state,
- state_dict,
- result,
- result.context.compiled_parameters[0],
- value_params,
- False,
- )
- else:
- _postfetch_bulk_save(mapper_rec, state_dict, table)
+ if bookkeeping:
+ if state:
+ _postfetch(
+ mapper_rec,
+ uowtransaction,
+ table,
+ state,
+ state_dict,
+ result,
+ result.context.compiled_parameters[0],
+ value_params,
+ False,
+ result.returned_defaults
+ if not result.context.executemany
+ else None,
+ )
+ else:
+ _postfetch_bulk_save(mapper_rec, state_dict, table)
def _emit_post_update_statements(
params,
value_params,
isupdate,
+ returned_defaults,
):
"""Expire attributes in need of newly persisted database state,
after an INSERT or UPDATE statement has proceeded for that
load_evt_attrs = []
if returning_cols:
- row = result.returned_defaults
+ row = returned_defaults
if row is not None:
for row_value, col in zip(row, returning_cols):
# pk cols returned from insert are handled
values = _extend_values_for_multiparams(
compiler, stmt, compile_state, values, kw
)
+ elif not values and compiler.for_executemany:
+ # convert an "INSERT DEFAULT VALUES"
+ # into INSERT (firstcol) VALUES (DEFAULT) which can be turned
+ # into an in-place multi values. This supports
+ # insert_executemany_returning mode :)
+ values = [(stmt.table.columns[0], "DEFAULT")]
return values
result,
params=None,
checkparams=None,
+ for_executemany=False,
check_literal_execute=None,
check_post_param=None,
dialect=None,
if render_postcompile:
compile_kwargs["render_postcompile"] = True
+ if for_executemany:
+ kw["for_executemany"] = True
+
if render_schema_translate:
kw["render_schema_translate"] = True
super(EachOf, self).no_more_statements()
+class Conditional(EachOf):
+ def __init__(self, condition, rules, else_rules):
+ if condition:
+ super(Conditional, self).__init__(*rules)
+ else:
+ super(Conditional, self).__init__(*else_rules)
+
+
class Or(AllOf):
def process_statement(self, execute_observed):
for rule in self.rules:
# coding: utf-8
import datetime
+import itertools
import logging
import logging.handlers
from sqlalchemy import testing
from sqlalchemy import text
from sqlalchemy import TypeDecorator
+from sqlalchemy import util
from sqlalchemy.dialects.postgresql import base as postgresql
from sqlalchemy.dialects.postgresql import psycopg2 as psycopg2_dialect
from sqlalchemy.dialects.postgresql.psycopg2 import EXECUTEMANY_BATCH
-from sqlalchemy.dialects.postgresql.psycopg2 import EXECUTEMANY_DEFAULT
+from sqlalchemy.dialects.postgresql.psycopg2 import EXECUTEMANY_PLAIN
from sqlalchemy.dialects.postgresql.psycopg2 import EXECUTEMANY_VALUES
from sqlalchemy.engine import cursor as _cursor
from sqlalchemy.engine import engine_from_config
from sqlalchemy.testing.assertions import eq_
from sqlalchemy.testing.assertions import eq_regex
from sqlalchemy.testing.assertions import ne_
+from sqlalchemy.util import u
+from sqlalchemy.util import ue
from ...engine import test_execute
if True:
__backend__ = True
run_create_tables = "each"
+ run_deletes = None
options = None
Column("z", Integer, server_default="5"),
)
+ Table(
+ u("Unitéble2"),
+ metadata,
+ Column(u("méil"), Integer, primary_key=True),
+ Column(ue("\u6e2c\u8a66"), Integer),
+ )
+
def setup(self):
super(ExecuteManyMode, self).setup()
self.engine = engines.testing_engine(options=self.options)
],
)
+ def test_insert_unicode_keys(self, connection):
+ table = self.tables[u("Unitéble2")]
+
+ stmt = table.insert()
+
+ connection.execute(
+ stmt,
+ [
+ {u("méil"): 1, ue("\u6e2c\u8a66"): 1},
+ {u("méil"): 2, ue("\u6e2c\u8a66"): 2},
+ {u("méil"): 3, ue("\u6e2c\u8a66"): 3},
+ ],
+ )
+
+ eq_(connection.execute(table.select()).all(), [(1, 1), (2, 2), (3, 3)])
+
def test_update_fallback(self):
from psycopg2 import extras
class ExecutemanyValuesInsertsTest(ExecuteManyMode, fixtures.TablesTest):
options = {"executemany_mode": "values_only"}
- def test_insert_returning_values(self):
+ def test_insert_returning_values(self, connection):
"""the psycopg2 dialect needs to assemble a fully buffered result
with the return value of execute_values().
"""
t = self.tables.data
- with self.engine.connect() as conn:
- page_size = conn.dialect.executemany_values_page_size or 100
- data = [
- {"x": "x%d" % i, "y": "y%d" % i}
- for i in range(1, page_size * 5 + 27)
- ]
- result = conn.execute(t.insert().returning(t.c.x, t.c.y), data)
-
- eq_([tup[0] for tup in result.cursor.description], ["x", "y"])
- eq_(result.keys(), ["x", "y"])
- assert t.c.x in result.keys()
- assert t.c.id not in result.keys()
- assert not result._soft_closed
- assert isinstance(
- result.cursor_strategy,
- _cursor.FullyBufferedCursorFetchStrategy,
- )
- assert not result.cursor.closed
- assert not result.closed
- eq_(result.mappings().all(), data)
+ conn = connection
+ page_size = conn.dialect.executemany_values_page_size or 100
+ data = [
+ {"x": "x%d" % i, "y": "y%d" % i}
+ for i in range(1, page_size * 5 + 27)
+ ]
+ result = conn.execute(t.insert().returning(t.c.x, t.c.y), data)
+
+ eq_([tup[0] for tup in result.cursor.description], ["x", "y"])
+ eq_(result.keys(), ["x", "y"])
+ assert t.c.x in result.keys()
+ assert t.c.id not in result.keys()
+ assert not result._soft_closed
+ assert isinstance(
+ result.cursor_strategy, _cursor.FullyBufferedCursorFetchStrategy,
+ )
+ assert not result.cursor.closed
+ assert not result.closed
+ eq_(result.mappings().all(), data)
+
+ assert result._soft_closed
+ # assert result.closed
+ assert result.cursor is None
+
+ @testing.provide_metadata
+ def test_insert_returning_preexecute_pk(self, connection):
+ counter = itertools.count(1)
+
+ t = Table(
+ "t",
+ self.metadata,
+ Column(
+ "id",
+ Integer,
+ primary_key=True,
+ default=lambda: util.next(counter),
+ ),
+ Column("data", Integer),
+ )
+ self.metadata.create_all(connection)
+
+ result = connection.execute(
+ t.insert().return_defaults(),
+ [{"data": 1}, {"data": 2}, {"data": 3}],
+ )
- assert result._soft_closed
- # assert result.closed
- assert result.cursor is None
+ eq_(result.inserted_primary_key_rows, [(1,), (2,), (3,)])
- def test_insert_returning_defaults(self):
+ def test_insert_returning_defaults(self, connection):
t = self.tables.data
- with self.engine.connect() as conn:
+ conn = connection
- result = conn.execute(t.insert(), {"x": "x0", "y": "y0"})
- first_pk = result.inserted_primary_key[0]
+ result = conn.execute(t.insert(), {"x": "x0", "y": "y0"})
+ first_pk = result.inserted_primary_key[0]
- page_size = conn.dialect.executemany_values_page_size or 100
- total_rows = page_size * 5 + 27
- data = [
- {"x": "x%d" % i, "y": "y%d" % i} for i in range(1, total_rows)
- ]
- result = conn.execute(t.insert().returning(t.c.id, t.c.z), data)
+ page_size = conn.dialect.executemany_values_page_size or 100
+ total_rows = page_size * 5 + 27
+ data = [{"x": "x%d" % i, "y": "y%d" % i} for i in range(1, total_rows)]
+ result = conn.execute(t.insert().returning(t.c.id, t.c.z), data)
- eq_(
- result.all(),
- [(pk, 5) for pk in range(1 + first_pk, total_rows + first_pk)],
- )
+ eq_(
+ result.all(),
+ [(pk, 5) for pk in range(1 + first_pk, total_rows + first_pk)],
+ )
+
+ def test_insert_return_pks_default_values(self, connection):
+ """test sending multiple, empty rows into an INSERT and getting primary
+ key values back.
+
+ This has to use a format that indicates at least one DEFAULT in
+        multiple parameter sets, i.e. "INSERT INTO table (anycol) VALUES
+        (DEFAULT), (DEFAULT), (DEFAULT) ... RETURNING col"
+
+ """
+ t = self.tables.data
+
+ conn = connection
+
+ result = conn.execute(t.insert(), {"x": "x0", "y": "y0"})
+ first_pk = result.inserted_primary_key[0]
+
+ page_size = conn.dialect.executemany_values_page_size or 100
+ total_rows = page_size * 5 + 27
+ data = [{} for i in range(1, total_rows)]
+ result = conn.execute(t.insert().returning(t.c.id), data)
+
+ eq_(
+ result.all(),
+ [(pk,) for pk in range(1 + first_pk, total_rows + first_pk)],
+ )
def test_insert_w_newlines(self):
from psycopg2 import extras
def test_executemany_correct_flag_options(self):
for opt, expected in [
- (None, EXECUTEMANY_DEFAULT),
+ (None, EXECUTEMANY_PLAIN),
("batch", EXECUTEMANY_BATCH),
("values_only", EXECUTEMANY_VALUES),
("values_plus_batch", EXECUTEMANY_VALUES_PLUS_BATCH),
from sqlalchemy.testing import mock
from sqlalchemy.testing.assertsql import AllOf
from sqlalchemy.testing.assertsql import CompiledSQL
+from sqlalchemy.testing.assertsql import Conditional
from sqlalchemy.testing.assertsql import Or
from sqlalchemy.testing.assertsql import RegexSQL
from sqlalchemy.testing.schema import Column
self.assert_sql_execution(
testing.db,
sess.flush,
- CompiledSQL("INSERT INTO a () VALUES ()", {}),
- CompiledSQL("INSERT INTO a () VALUES ()", {}),
- CompiledSQL("INSERT INTO a () VALUES ()", {}),
- CompiledSQL("INSERT INTO a () VALUES ()", {}),
+ Conditional(
+ testing.db.dialect.insert_executemany_returning,
+ [
+ CompiledSQL(
+ "INSERT INTO a (id) VALUES (DEFAULT)", [{}, {}, {}, {}]
+ ),
+ ],
+ [
+ CompiledSQL("INSERT INTO a () VALUES ()", {}),
+ CompiledSQL("INSERT INTO a () VALUES ()", {}),
+ CompiledSQL("INSERT INTO a () VALUES ()", {}),
+ CompiledSQL("INSERT INTO a () VALUES ()", {}),
+ ],
+ ),
AllOf(
CompiledSQL(
"INSERT INTO b (id) VALUES (:id)", [{"id": 1}, {"id": 3}]
from sqlalchemy.testing import fixtures
from sqlalchemy.testing import mock
from sqlalchemy.testing.assertsql import CompiledSQL
+from sqlalchemy.testing.assertsql import Conditional
from sqlalchemy.testing.schema import Column
from sqlalchemy.testing.schema import Table
from test.orm import _fixtures
s.bulk_save_objects(objects, return_defaults=True)
asserter.assert_(
- CompiledSQL(
- "INSERT INTO users (name) VALUES (:name)", [{"name": "u1"}]
- ),
- CompiledSQL(
- "INSERT INTO users (name) VALUES (:name)", [{"name": "u2"}]
- ),
- CompiledSQL(
- "INSERT INTO users (name) VALUES (:name)", [{"name": "u3"}]
- ),
+ Conditional(
+ testing.db.dialect.insert_executemany_returning,
+ [
+ CompiledSQL(
+ "INSERT INTO users (name) VALUES (:name)",
+ [{"name": "u1"}, {"name": "u2"}, {"name": "u3"}],
+ ),
+ ],
+ [
+ CompiledSQL(
+ "INSERT INTO users (name) VALUES (:name)",
+ [{"name": "u1"}],
+ ),
+ CompiledSQL(
+ "INSERT INTO users (name) VALUES (:name)",
+ [{"name": "u2"}],
+ ),
+ CompiledSQL(
+ "INSERT INTO users (name) VALUES (:name)",
+ [{"name": "u3"}],
+ ),
+ ],
+ )
)
eq_(objects[0].__dict__["id"], 1)
"VALUES (:person_id, :status, :manager_name)",
[{"person_id": 1, "status": "s1", "manager_name": "mn1"}],
),
- CompiledSQL(
- "INSERT INTO people (name, type) VALUES (:name, :type)",
- [{"type": "engineer", "name": "e1"}],
- ),
- CompiledSQL(
- "INSERT INTO people (name, type) VALUES (:name, :type)",
- [{"type": "engineer", "name": "e2"}],
+ Conditional(
+ testing.db.dialect.insert_executemany_returning,
+ [
+ CompiledSQL(
+ "INSERT INTO people (name, type) "
+ "VALUES (:name, :type)",
+ [
+ {"type": "engineer", "name": "e1"},
+ {"type": "engineer", "name": "e2"},
+ ],
+ ),
+ ],
+ [
+ CompiledSQL(
+ "INSERT INTO people (name, type) "
+ "VALUES (:name, :type)",
+ [{"type": "engineer", "name": "e1"}],
+ ),
+ CompiledSQL(
+ "INSERT INTO people (name, type) "
+ "VALUES (:name, :type)",
+ [{"type": "engineer", "name": "e2"}],
+ ),
+ ],
),
CompiledSQL(
"INSERT INTO engineers (person_id, status, primary_language) "
)
asserter.assert_(
- CompiledSQL(
- "INSERT INTO people (name) VALUES (:name)", [{"name": "b1"}]
- ),
- CompiledSQL(
- "INSERT INTO people (name) VALUES (:name)", [{"name": "b2"}]
- ),
- CompiledSQL(
- "INSERT INTO people (name) VALUES (:name)", [{"name": "b3"}]
+ Conditional(
+ testing.db.dialect.insert_executemany_returning,
+ [
+ CompiledSQL(
+ "INSERT INTO people (name) VALUES (:name)",
+ [{"name": "b1"}, {"name": "b2"}, {"name": "b3"}],
+ ),
+ ],
+ [
+ CompiledSQL(
+ "INSERT INTO people (name) VALUES (:name)",
+ [{"name": "b1"}],
+ ),
+ CompiledSQL(
+ "INSERT INTO people (name) VALUES (:name)",
+ [{"name": "b2"}],
+ ),
+ CompiledSQL(
+ "INSERT INTO people (name) VALUES (:name)",
+ [{"name": "b3"}],
+ ),
+ ],
),
CompiledSQL(
"INSERT INTO managers (person_id, status, manager_name) "
from sqlalchemy.testing import mock
from sqlalchemy.testing.assertsql import AllOf
from sqlalchemy.testing.assertsql import CompiledSQL
+from sqlalchemy.testing.assertsql import Conditional
from sqlalchemy.testing.assertsql import RegexSQL
from sqlalchemy.testing.schema import Column
from sqlalchemy.testing.schema import Table
testing.db,
sess.flush,
RegexSQL("^INSERT INTO person", {"data": "some data"}),
- RegexSQL(
- "^INSERT INTO ball",
- lambda c: {"person_id": p.id, "data": "some data"},
- ),
- RegexSQL(
- "^INSERT INTO ball",
- lambda c: {"person_id": p.id, "data": "some data"},
- ),
- RegexSQL(
- "^INSERT INTO ball",
- lambda c: {"person_id": p.id, "data": "some data"},
- ),
- RegexSQL(
- "^INSERT INTO ball",
- lambda c: {"person_id": p.id, "data": "some data"},
+ Conditional(
+ testing.db.dialect.insert_executemany_returning,
+ [
+ RegexSQL(
+ "^INSERT INTO ball",
+ lambda c: [
+ {"person_id": p.id, "data": "some data"},
+ {"person_id": p.id, "data": "some data"},
+ {"person_id": p.id, "data": "some data"},
+ {"person_id": p.id, "data": "some data"},
+ ],
+ )
+ ],
+ [
+ RegexSQL(
+ "^INSERT INTO ball",
+ lambda c: {"person_id": p.id, "data": "some data"},
+ ),
+ RegexSQL(
+ "^INSERT INTO ball",
+ lambda c: {"person_id": p.id, "data": "some data"},
+ ),
+ RegexSQL(
+ "^INSERT INTO ball",
+ lambda c: {"person_id": p.id, "data": "some data"},
+ ),
+ RegexSQL(
+ "^INSERT INTO ball",
+ lambda c: {"person_id": p.id, "data": "some data"},
+ ),
+ ],
),
CompiledSQL(
"UPDATE person SET favorite_ball_id=:favorite_ball_id "
self.assert_sql_execution(
testing.db,
sess.flush,
- CompiledSQL(
- "INSERT INTO ball (person_id, data) "
- "VALUES (:person_id, :data)",
- {"person_id": None, "data": "some data"},
- ),
- CompiledSQL(
- "INSERT INTO ball (person_id, data) "
- "VALUES (:person_id, :data)",
- {"person_id": None, "data": "some data"},
- ),
- CompiledSQL(
- "INSERT INTO ball (person_id, data) "
- "VALUES (:person_id, :data)",
- {"person_id": None, "data": "some data"},
- ),
- CompiledSQL(
- "INSERT INTO ball (person_id, data) "
- "VALUES (:person_id, :data)",
- {"person_id": None, "data": "some data"},
+ Conditional(
+ testing.db.dialect.insert_executemany_returning,
+ [
+ CompiledSQL(
+ "INSERT INTO ball (person_id, data) "
+ "VALUES (:person_id, :data)",
+ [
+ {"person_id": None, "data": "some data"},
+ {"person_id": None, "data": "some data"},
+ {"person_id": None, "data": "some data"},
+ {"person_id": None, "data": "some data"},
+ ],
+ ),
+ ],
+ [
+ CompiledSQL(
+ "INSERT INTO ball (person_id, data) "
+ "VALUES (:person_id, :data)",
+ {"person_id": None, "data": "some data"},
+ ),
+ CompiledSQL(
+ "INSERT INTO ball (person_id, data) "
+ "VALUES (:person_id, :data)",
+ {"person_id": None, "data": "some data"},
+ ),
+ CompiledSQL(
+ "INSERT INTO ball (person_id, data) "
+ "VALUES (:person_id, :data)",
+ {"person_id": None, "data": "some data"},
+ ),
+ CompiledSQL(
+ "INSERT INTO ball (person_id, data) "
+ "VALUES (:person_id, :data)",
+ {"person_id": None, "data": "some data"},
+ ),
+ ],
),
CompiledSQL(
"INSERT INTO person (favorite_ball_id, data) "
from sqlalchemy.testing import fixtures
from sqlalchemy.testing.assertsql import assert_engine
from sqlalchemy.testing.assertsql import CompiledSQL
+from sqlalchemy.testing.assertsql import Conditional
from sqlalchemy.testing.schema import Column
from sqlalchemy.testing.schema import Table
eq_(t1.bar, 5 + 42)
eq_(t2.bar, 10 + 42)
- if eager and testing.db.dialect.implicit_returning:
- asserter.assert_(
- CompiledSQL(
- "INSERT INTO test (id, foo) VALUES (%(id)s, %(foo)s) "
- "RETURNING test.bar",
- [{"foo": 5, "id": 1}],
- dialect="postgresql",
- ),
- CompiledSQL(
- "INSERT INTO test (id, foo) VALUES (%(id)s, %(foo)s) "
- "RETURNING test.bar",
- [{"foo": 10, "id": 2}],
- dialect="postgresql",
- ),
- )
- else:
- asserter.assert_(
- CompiledSQL(
- "INSERT INTO test (id, foo) VALUES (:id, :foo)",
- [{"foo": 5, "id": 1}, {"foo": 10, "id": 2}],
- ),
- CompiledSQL(
- "SELECT test.bar AS test_bar FROM test "
- "WHERE test.id = :param_1",
- [{"param_1": 1}],
- ),
- CompiledSQL(
- "SELECT test.bar AS test_bar FROM test "
- "WHERE test.id = :param_1",
- [{"param_1": 2}],
- ),
+ asserter.assert_(
+ Conditional(
+ eager and testing.db.dialect.implicit_returning,
+ [
+ Conditional(
+ testing.db.dialect.insert_executemany_returning,
+ [
+ CompiledSQL(
+ "INSERT INTO test (id, foo) "
+ "VALUES (%(id)s, %(foo)s) "
+ "RETURNING test.bar",
+ [{"foo": 5, "id": 1}, {"foo": 10, "id": 2}],
+ dialect="postgresql",
+ ),
+ ],
+ [
+ CompiledSQL(
+ "INSERT INTO test (id, foo) "
+ "VALUES (%(id)s, %(foo)s) "
+ "RETURNING test.bar",
+ [{"foo": 5, "id": 1}],
+ dialect="postgresql",
+ ),
+ CompiledSQL(
+ "INSERT INTO test (id, foo) "
+ "VALUES (%(id)s, %(foo)s) "
+ "RETURNING test.bar",
+ [{"foo": 10, "id": 2}],
+ dialect="postgresql",
+ ),
+ ],
+ )
+ ],
+ [
+ CompiledSQL(
+ "INSERT INTO test (id, foo) VALUES (:id, :foo)",
+ [{"foo": 5, "id": 1}, {"foo": 10, "id": 2}],
+ ),
+ CompiledSQL(
+ "SELECT test.bar AS test_bar FROM test "
+ "WHERE test.id = :param_1",
+ [{"param_1": 1}],
+ ),
+ CompiledSQL(
+ "SELECT test.bar AS test_bar FROM test "
+ "WHERE test.id = :param_1",
+ [{"param_1": 2}],
+ ),
+ ],
)
+ )
@testing.combinations(
(
from sqlalchemy.testing import fixtures
from sqlalchemy.testing.assertsql import AllOf
from sqlalchemy.testing.assertsql import CompiledSQL
+from sqlalchemy.testing.assertsql import Conditional
from sqlalchemy.testing.schema import Column
from sqlalchemy.testing.schema import Table
from sqlalchemy.util import OrderedDict
self.assert_sql_execution(
testing.db,
session.flush,
- CompiledSQL(
- "INSERT INTO users (name) VALUES (:name)", {"name": "u1"}
- ),
- CompiledSQL(
- "INSERT INTO users (name) VALUES (:name)", {"name": "u2"}
- ),
- CompiledSQL(
- "INSERT INTO addresses (user_id, email_address) "
- "VALUES (:user_id, :email_address)",
- {"user_id": 1, "email_address": "a1"},
- ),
- CompiledSQL(
- "INSERT INTO addresses (user_id, email_address) "
- "VALUES (:user_id, :email_address)",
- {"user_id": 2, "email_address": "a2"},
+ Conditional(
+ testing.db.dialect.insert_executemany_returning,
+ [
+ CompiledSQL(
+ "INSERT INTO users (name) VALUES (:name)",
+ [{"name": "u1"}, {"name": "u2"}],
+ ),
+ CompiledSQL(
+ "INSERT INTO addresses (user_id, email_address) "
+ "VALUES (:user_id, :email_address)",
+ [
+ {"user_id": 1, "email_address": "a1"},
+ {"user_id": 2, "email_address": "a2"},
+ ],
+ ),
+ ],
+ [
+ CompiledSQL(
+ "INSERT INTO users (name) VALUES (:name)",
+ {"name": "u1"},
+ ),
+ CompiledSQL(
+ "INSERT INTO users (name) VALUES (:name)",
+ {"name": "u2"},
+ ),
+ CompiledSQL(
+ "INSERT INTO addresses (user_id, email_address) "
+ "VALUES (:user_id, :email_address)",
+ {"user_id": 1, "email_address": "a1"},
+ ),
+ CompiledSQL(
+ "INSERT INTO addresses (user_id, email_address) "
+ "VALUES (:user_id, :email_address)",
+ {"user_id": 2, "email_address": "a2"},
+ ),
+ ],
),
)
from sqlalchemy.testing import fixtures
from sqlalchemy.testing.assertsql import AllOf
from sqlalchemy.testing.assertsql import CompiledSQL
+from sqlalchemy.testing.assertsql import Conditional
from sqlalchemy.testing.mock import Mock
from sqlalchemy.testing.mock import patch
from sqlalchemy.testing.schema import Column
CompiledSQL(
"INSERT INTO users (name) VALUES (:name)", {"name": "u1"}
),
- CompiledSQL(
- "INSERT INTO addresses (user_id, email_address) "
- "VALUES (:user_id, :email_address)",
- lambda ctx: {"email_address": "a1", "user_id": u1.id},
- ),
- CompiledSQL(
- "INSERT INTO addresses (user_id, email_address) "
- "VALUES (:user_id, :email_address)",
- lambda ctx: {"email_address": "a2", "user_id": u1.id},
+ Conditional(
+ testing.db.dialect.insert_executemany_returning,
+ [
+ CompiledSQL(
+ "INSERT INTO addresses (user_id, email_address) "
+ "VALUES (:user_id, :email_address)",
+ lambda ctx: [
+ {"email_address": "a1", "user_id": u1.id},
+ {"email_address": "a2", "user_id": u1.id},
+ ],
+ ),
+ ],
+ [
+ CompiledSQL(
+ "INSERT INTO addresses (user_id, email_address) "
+ "VALUES (:user_id, :email_address)",
+ lambda ctx: {"email_address": "a1", "user_id": u1.id},
+ ),
+ CompiledSQL(
+ "INSERT INTO addresses (user_id, email_address) "
+ "VALUES (:user_id, :email_address)",
+ lambda ctx: {"email_address": "a2", "user_id": u1.id},
+ ),
+ ],
),
)
CompiledSQL(
"INSERT INTO users (name) VALUES (:name)", {"name": "u1"}
),
- CompiledSQL(
- "INSERT INTO addresses (user_id, email_address) "
- "VALUES (:user_id, :email_address)",
- lambda ctx: {"email_address": "a1", "user_id": u1.id},
- ),
- CompiledSQL(
- "INSERT INTO addresses (user_id, email_address) "
- "VALUES (:user_id, :email_address)",
- lambda ctx: {"email_address": "a2", "user_id": u1.id},
+ Conditional(
+ testing.db.dialect.insert_executemany_returning,
+ [
+ CompiledSQL(
+ "INSERT INTO addresses (user_id, email_address) "
+ "VALUES (:user_id, :email_address)",
+ lambda ctx: [
+ {"email_address": "a1", "user_id": u1.id},
+ {"email_address": "a2", "user_id": u1.id},
+ ],
+ ),
+ ],
+ [
+ CompiledSQL(
+ "INSERT INTO addresses (user_id, email_address) "
+ "VALUES (:user_id, :email_address)",
+ lambda ctx: {"email_address": "a1", "user_id": u1.id},
+ ),
+ CompiledSQL(
+ "INSERT INTO addresses (user_id, email_address) "
+ "VALUES (:user_id, :email_address)",
+ lambda ctx: {"email_address": "a2", "user_id": u1.id},
+ ),
+ ],
),
)
"(:parent_id, :data)",
{"parent_id": None, "data": "n1"},
),
- AllOf(
- CompiledSQL(
- "INSERT INTO nodes (parent_id, data) VALUES "
- "(:parent_id, :data)",
- lambda ctx: {"parent_id": n1.id, "data": "n2"},
- ),
- CompiledSQL(
- "INSERT INTO nodes (parent_id, data) VALUES "
- "(:parent_id, :data)",
- lambda ctx: {"parent_id": n1.id, "data": "n3"},
- ),
+ Conditional(
+ testing.db.dialect.insert_executemany_returning,
+ [
+ CompiledSQL(
+ "INSERT INTO nodes (parent_id, data) VALUES "
+ "(:parent_id, :data)",
+ lambda ctx: [
+ {"parent_id": n1.id, "data": "n2"},
+ {"parent_id": n1.id, "data": "n3"},
+ ],
+ ),
+ ],
+ [
+ AllOf(
+ CompiledSQL(
+ "INSERT INTO nodes (parent_id, data) VALUES "
+ "(:parent_id, :data)",
+ lambda ctx: {"parent_id": n1.id, "data": "n2"},
+ ),
+ CompiledSQL(
+ "INSERT INTO nodes (parent_id, data) VALUES "
+ "(:parent_id, :data)",
+ lambda ctx: {"parent_id": n1.id, "data": "n3"},
+ ),
+ ),
+ ],
),
)
"(:parent_id, :data)",
{"parent_id": None, "data": "n1"},
),
- AllOf(
- CompiledSQL(
- "INSERT INTO nodes (parent_id, data) VALUES "
- "(:parent_id, :data)",
- lambda ctx: {"parent_id": n1.id, "data": "n2"},
- ),
- CompiledSQL(
- "INSERT INTO nodes (parent_id, data) VALUES "
- "(:parent_id, :data)",
- lambda ctx: {"parent_id": n1.id, "data": "n3"},
- ),
+ Conditional(
+ testing.db.dialect.insert_executemany_returning,
+ [
+ CompiledSQL(
+ "INSERT INTO nodes (parent_id, data) VALUES "
+ "(:parent_id, :data)",
+ lambda ctx: [
+ {"parent_id": n1.id, "data": "n2"},
+ {"parent_id": n1.id, "data": "n3"},
+ ],
+ ),
+ ],
+ [
+ AllOf(
+ CompiledSQL(
+ "INSERT INTO nodes (parent_id, data) VALUES "
+ "(:parent_id, :data)",
+ lambda ctx: {"parent_id": n1.id, "data": "n2"},
+ ),
+ CompiledSQL(
+ "INSERT INTO nodes (parent_id, data) VALUES "
+ "(:parent_id, :data)",
+ lambda ctx: {"parent_id": n1.id, "data": "n3"},
+ ),
+ ),
+ ],
),
)
"(:parent_id, :data)",
lambda ctx: {"parent_id": None, "data": "n1"},
),
- CompiledSQL(
- "INSERT INTO nodes (parent_id, data) VALUES "
- "(:parent_id, :data)",
- lambda ctx: {"parent_id": n1.id, "data": "n11"},
- ),
- CompiledSQL(
- "INSERT INTO nodes (parent_id, data) VALUES "
- "(:parent_id, :data)",
- lambda ctx: {"parent_id": n1.id, "data": "n12"},
- ),
- CompiledSQL(
- "INSERT INTO nodes (parent_id, data) VALUES "
- "(:parent_id, :data)",
- lambda ctx: {"parent_id": n1.id, "data": "n13"},
- ),
- CompiledSQL(
- "INSERT INTO nodes (parent_id, data) VALUES "
- "(:parent_id, :data)",
- lambda ctx: {"parent_id": n12.id, "data": "n121"},
- ),
- CompiledSQL(
- "INSERT INTO nodes (parent_id, data) VALUES "
- "(:parent_id, :data)",
- lambda ctx: {"parent_id": n12.id, "data": "n122"},
+ Conditional(
+ testing.db.dialect.insert_executemany_returning,
+ [
+ CompiledSQL(
+ "INSERT INTO nodes (parent_id, data) VALUES "
+ "(:parent_id, :data)",
+ lambda ctx: [
+ {"parent_id": n1.id, "data": "n11"},
+ {"parent_id": n1.id, "data": "n12"},
+ {"parent_id": n1.id, "data": "n13"},
+ ],
+ ),
+ ],
+ [
+ CompiledSQL(
+ "INSERT INTO nodes (parent_id, data) VALUES "
+ "(:parent_id, :data)",
+ lambda ctx: {"parent_id": n1.id, "data": "n11"},
+ ),
+ CompiledSQL(
+ "INSERT INTO nodes (parent_id, data) VALUES "
+ "(:parent_id, :data)",
+ lambda ctx: {"parent_id": n1.id, "data": "n12"},
+ ),
+ CompiledSQL(
+ "INSERT INTO nodes (parent_id, data) VALUES "
+ "(:parent_id, :data)",
+ lambda ctx: {"parent_id": n1.id, "data": "n13"},
+ ),
+ ],
),
- CompiledSQL(
- "INSERT INTO nodes (parent_id, data) VALUES "
- "(:parent_id, :data)",
- lambda ctx: {"parent_id": n12.id, "data": "n123"},
+ Conditional(
+ testing.db.dialect.insert_executemany_returning,
+ [
+ CompiledSQL(
+ "INSERT INTO nodes (parent_id, data) VALUES "
+ "(:parent_id, :data)",
+ lambda ctx: [
+ {"parent_id": n12.id, "data": "n121"},
+ {"parent_id": n12.id, "data": "n122"},
+ {"parent_id": n12.id, "data": "n123"},
+ ],
+ ),
+ ],
+ [
+ CompiledSQL(
+ "INSERT INTO nodes (parent_id, data) VALUES "
+ "(:parent_id, :data)",
+ lambda ctx: {"parent_id": n12.id, "data": "n121"},
+ ),
+ CompiledSQL(
+ "INSERT INTO nodes (parent_id, data) VALUES "
+ "(:parent_id, :data)",
+ lambda ctx: {"parent_id": n12.id, "data": "n122"},
+ ),
+ CompiledSQL(
+ "INSERT INTO nodes (parent_id, data) VALUES "
+ "(:parent_id, :data)",
+ lambda ctx: {"parent_id": n12.id, "data": "n123"},
+ ),
+ ],
),
)
self.assert_sql_execution(
testing.db,
sess.flush,
- CompiledSQL("INSERT INTO t (data) VALUES (:data)", {"data": "t1"}),
- CompiledSQL("INSERT INTO t (data) VALUES (:data)", {"data": "t2"}),
+ Conditional(
+ testing.db.dialect.insert_executemany_returning,
+ [
+ CompiledSQL(
+ "INSERT INTO t (data) VALUES (:data)",
+ [{"data": "t1"}, {"data": "t2"}],
+ ),
+ ],
+ [
+ CompiledSQL(
+ "INSERT INTO t (data) VALUES (:data)", {"data": "t1"}
+ ),
+ CompiledSQL(
+ "INSERT INTO t (data) VALUES (:data)", {"data": "t2"}
+ ),
+ ],
+ ),
CompiledSQL(
"INSERT INTO t (id, data) VALUES (:id, :data)",
[
s.add_all([t1, t2])
- if testing.db.dialect.implicit_returning:
- self.assert_sql_execution(
- testing.db,
- s.commit,
- CompiledSQL(
- "INSERT INTO test (id) VALUES (%(id)s) RETURNING test.foo",
- [{"id": 1}],
- dialect="postgresql",
- ),
- CompiledSQL(
- "INSERT INTO test (id) VALUES (%(id)s) RETURNING test.foo",
- [{"id": 2}],
- dialect="postgresql",
- ),
- )
- else:
- self.assert_sql_execution(
- testing.db,
- s.commit,
- CompiledSQL(
- "INSERT INTO test (id) VALUES (:id)",
- [{"id": 1}, {"id": 2}],
- ),
- CompiledSQL(
- "SELECT test.foo AS test_foo FROM test "
- "WHERE test.id = :param_1",
- [{"param_1": 1}],
- ),
- CompiledSQL(
- "SELECT test.foo AS test_foo FROM test "
- "WHERE test.id = :param_1",
- [{"param_1": 2}],
- ),
- )
+ self.assert_sql_execution(
+ testing.db,
+ s.commit,
+ Conditional(
+ testing.db.dialect.implicit_returning,
+ [
+ Conditional(
+ testing.db.dialect.insert_executemany_returning,
+ [
+ CompiledSQL(
+ "INSERT INTO test (id) VALUES (%(id)s) "
+ "RETURNING test.foo",
+ [{"id": 1}, {"id": 2}],
+ dialect="postgresql",
+ ),
+ ],
+ [
+ CompiledSQL(
+ "INSERT INTO test (id) VALUES (%(id)s) "
+ "RETURNING test.foo",
+ [{"id": 1}],
+ dialect="postgresql",
+ ),
+ CompiledSQL(
+ "INSERT INTO test (id) VALUES (%(id)s) "
+ "RETURNING test.foo",
+ [{"id": 2}],
+ dialect="postgresql",
+ ),
+ ],
+ ),
+ ],
+ [
+ CompiledSQL(
+ "INSERT INTO test (id) VALUES (:id)",
+ [{"id": 1}, {"id": 2}],
+ ),
+ CompiledSQL(
+ "SELECT test.foo AS test_foo FROM test "
+ "WHERE test.id = :param_1",
+ [{"param_1": 1}],
+ ),
+ CompiledSQL(
+ "SELECT test.foo AS test_foo FROM test "
+ "WHERE test.id = :param_1",
+ [{"param_1": 2}],
+ ),
+ ],
+ ),
+ )
def test_update_defaults_nonpresent(self):
Thing2 = self.classes.Thing2
t4.foo = 8
t4.bar = 12
- if testing.db.dialect.implicit_returning:
- self.assert_sql_execution(
- testing.db,
- s.flush,
- CompiledSQL(
- "UPDATE test2 SET foo=%(foo)s "
- "WHERE test2.id = %(test2_id)s "
- "RETURNING test2.bar",
- [{"foo": 5, "test2_id": 1}],
- dialect="postgresql",
- ),
- CompiledSQL(
- "UPDATE test2 SET foo=%(foo)s, bar=%(bar)s "
- "WHERE test2.id = %(test2_id)s",
- [{"foo": 6, "bar": 10, "test2_id": 2}],
- dialect="postgresql",
- ),
- CompiledSQL(
- "UPDATE test2 SET foo=%(foo)s "
- "WHERE test2.id = %(test2_id)s "
- "RETURNING test2.bar",
- [{"foo": 7, "test2_id": 3}],
- dialect="postgresql",
- ),
- CompiledSQL(
- "UPDATE test2 SET foo=%(foo)s, bar=%(bar)s "
- "WHERE test2.id = %(test2_id)s",
- [{"foo": 8, "bar": 12, "test2_id": 4}],
- dialect="postgresql",
- ),
- )
- else:
- self.assert_sql_execution(
- testing.db,
- s.flush,
- CompiledSQL(
- "UPDATE test2 SET foo=:foo WHERE test2.id = :test2_id",
- [{"foo": 5, "test2_id": 1}],
- ),
- CompiledSQL(
- "UPDATE test2 SET foo=:foo, bar=:bar "
- "WHERE test2.id = :test2_id",
- [{"foo": 6, "bar": 10, "test2_id": 2}],
- ),
- CompiledSQL(
- "UPDATE test2 SET foo=:foo WHERE test2.id = :test2_id",
- [{"foo": 7, "test2_id": 3}],
- ),
- CompiledSQL(
- "UPDATE test2 SET foo=:foo, bar=:bar "
- "WHERE test2.id = :test2_id",
- [{"foo": 8, "bar": 12, "test2_id": 4}],
- ),
- CompiledSQL(
- "SELECT test2.bar AS test2_bar FROM test2 "
- "WHERE test2.id = :param_1",
- [{"param_1": 1}],
- ),
- CompiledSQL(
- "SELECT test2.bar AS test2_bar FROM test2 "
- "WHERE test2.id = :param_1",
- [{"param_1": 3}],
- ),
- )
+ self.assert_sql_execution(
+ testing.db,
+ s.flush,
+ Conditional(
+ testing.db.dialect.implicit_returning,
+ [
+ CompiledSQL(
+ "UPDATE test2 SET foo=%(foo)s "
+ "WHERE test2.id = %(test2_id)s "
+ "RETURNING test2.bar",
+ [{"foo": 5, "test2_id": 1}],
+ dialect="postgresql",
+ ),
+ CompiledSQL(
+ "UPDATE test2 SET foo=%(foo)s, bar=%(bar)s "
+ "WHERE test2.id = %(test2_id)s",
+ [{"foo": 6, "bar": 10, "test2_id": 2}],
+ dialect="postgresql",
+ ),
+ CompiledSQL(
+ "UPDATE test2 SET foo=%(foo)s "
+ "WHERE test2.id = %(test2_id)s "
+ "RETURNING test2.bar",
+ [{"foo": 7, "test2_id": 3}],
+ dialect="postgresql",
+ ),
+ CompiledSQL(
+ "UPDATE test2 SET foo=%(foo)s, bar=%(bar)s "
+ "WHERE test2.id = %(test2_id)s",
+ [{"foo": 8, "bar": 12, "test2_id": 4}],
+ dialect="postgresql",
+ ),
+ ],
+ [
+ CompiledSQL(
+ "UPDATE test2 SET foo=:foo WHERE test2.id = :test2_id",
+ [{"foo": 5, "test2_id": 1}],
+ ),
+ CompiledSQL(
+ "UPDATE test2 SET foo=:foo, bar=:bar "
+ "WHERE test2.id = :test2_id",
+ [{"foo": 6, "bar": 10, "test2_id": 2}],
+ ),
+ CompiledSQL(
+ "UPDATE test2 SET foo=:foo WHERE test2.id = :test2_id",
+ [{"foo": 7, "test2_id": 3}],
+ ),
+ CompiledSQL(
+ "UPDATE test2 SET foo=:foo, bar=:bar "
+ "WHERE test2.id = :test2_id",
+ [{"foo": 8, "bar": 12, "test2_id": 4}],
+ ),
+ CompiledSQL(
+ "SELECT test2.bar AS test2_bar FROM test2 "
+ "WHERE test2.id = :param_1",
+ [{"param_1": 1}],
+ ),
+ CompiledSQL(
+ "SELECT test2.bar AS test2_bar FROM test2 "
+ "WHERE test2.id = :param_1",
+ [{"param_1": 3}],
+ ),
+ ],
+ ),
+ )
def go():
eq_(t1.bar, 2)
dialect = default.DefaultDialect()
dialect.supports_empty_insert = dialect.supports_default_values = True
- stmt = table1.insert().values({}) # hide from 2to3
+ stmt = table1.insert().values({})
self.assert_compile(
stmt, "INSERT INTO mytable DEFAULT VALUES", dialect=dialect
)
+ def test_supports_empty_insert_true_executemany_mode(self):
+ table1 = self.tables.mytable
+
+ dialect = default.DefaultDialect()
+ dialect.supports_empty_insert = dialect.supports_default_values = True
+
+ stmt = table1.insert().values({})
+ self.assert_compile(
+ stmt,
+ "INSERT INTO mytable (myid) VALUES (DEFAULT)",
+ dialect=dialect,
+ for_executemany=True,
+ )
+
def test_supports_empty_insert_false(self):
table1 = self.tables.mytable