foo.bars.append(Bar(name='lala'))
for bar in foo.bars.filter(Bar.name=='lala'):
-     print bar
+     print(bar)
session.commit()
::
for row in session.query(User.name, func.count(Address.id).label('numaddresses')).join(Address).group_by(User.name):
-     print "name", row.name, "number", row.numaddresses
+     print("name", row.name, "number", row.numaddresses)
``Query`` has a ``statement`` accessor, as well as a
``subquery()`` method which allow ``Query`` to be used to
::
for col in table.c:
-     print col
+     print(col)
Work with a specific column:
from sqlalchemy.orm import aliased
address_alias = aliased(Address)
- print session.query(User, address_alias).join((address_alias, User.addresses)).all()
+ print(session.query(User, address_alias).join((address_alias, User.addresses)).all())
* ``sqlalchemy.orm.Mapper``
::
>>> if column('foo') == 5:
- ...     print "yes"
+ ...     print("yes")
...
In previous versions of SQLAlchemy, the returned
::
if expression:
-     print "the expression is:", expression
+     print("the expression is:", expression)
Would not evaluate if ``expression`` was a binary clause.
Since the above pattern should never be used, the base
::
if expression is not None:
-     print "the expression is:", expression
+     print("the expression is:", expression)
Keep in mind, **this applies to Table and Column objects
too**.
create = CreateTable(mytable)
# dumps the CREATE TABLE as a string
- print create
+ print(create)
# executes the CREATE TABLE statement
engine.execute(create)
from sqlalchemy.engine.reflection import Inspector
insp = Inspector.from_engine(my_engine)
- print insp.get_schema_names()
+ print(insp.get_schema_names())
the ``from_engine()`` method will in some cases provide a
backend-specific inspector with additional capabilities,
my_engine = create_engine('postgresql://...')
pg_insp = Inspector.from_engine(my_engine)
- print pg_insp.get_table_oid('my_table')
+ print(pg_insp.get_table_oid('my_table'))
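The inspection round trip can be shown end-to-end; in this sketch an in-memory SQLite database and a hypothetical ``my_table`` stand in for the engines above, and the modern ``inspect()`` entry point is used in place of ``Inspector.from_engine()``:

```python
from sqlalchemy import Column, Integer, MetaData, Table, create_engine, inspect

# an in-memory SQLite database stands in for my_engine above
engine = create_engine("sqlite://")
metadata = MetaData()
Table("my_table", metadata, Column("id", Integer, primary_key=True))
metadata.create_all(engine)

# inspect(engine) returns a backend-specific Inspector
insp = inspect(engine)
print(insp.get_schema_names())
print(insp.get_table_names())
```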
RETURNING Support
=================
table.insert().values(data='some data').returning(table.c.id, table.c.timestamp)
)
row = result.first()
- print "ID:", row['id'], "Timestamp:", row['timestamp']
+ print("ID:", row['id'], "Timestamp:", row['timestamp'])
The implementation of RETURNING across the four supported
backends varies wildly, in the case of Oracle requiring an
label('avg')
])
- print s
+ print(s)
SQL:
::
- print s.query(Parent).with_polymorphic([Child]).filter(Child.id > 7)
+ print(s.query(Parent).with_polymorphic([Child]).filter(Child.id > 7))
Which on both 0.6 and 0.7 renders:
<Mapper at 0x101521950; User>
>>> # an expression
- >>> print b.expression
+ >>> print(b.expression)
"user".id = address.user_id
>>> # inspect works on instances
@event.listens_for("load", Base, propagate=True)
def on_load(target, context):
-     print "New instance loaded:", target
+     print("New instance loaded:", target)
# on_load() will be applied to SomeClass
class SomeClass(Base):
)
stmt = select([data.c.x.log(data.c.y)]).where(data.c.x.log(2) < value)
- print conn.execute(stmt).fetchall()
+ print(conn.execute(stmt).fetchall())
New features which have come from this immediately include
whenever the ``test_table.c.data`` column is rendered in the columns
clause of a SELECT statement::
- >>> print select([test_table]).where(test_table.c.data == 'HI')
+ >>> print(select([test_table]).where(test_table.c.data == 'HI'))
SELECT lower(test_table.data) AS data
FROM test_table
WHERE test_table.data = lower(:data_1)
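The type behind ``test_table.c.data`` is not included in this excerpt; a minimal reconstruction using the documented ``bind_expression()`` / ``column_expression()`` hooks (the ``LowerString`` name is illustrative) might look like:

```python
from sqlalchemy import Column, MetaData, String, Table, func, select
from sqlalchemy.types import TypeDecorator

class LowerString(TypeDecorator):
    """Apply lower() at the SQL level, both on the way in and out."""

    impl = String
    cache_ok = True

    def bind_expression(self, bindvalue):
        # wrap incoming bound parameters in lower()
        return func.lower(bindvalue)

    def column_expression(self, col):
        # wrap the column in lower() in the columns clause
        return func.lower(col)

test_table = Table("test_table", MetaData(), Column("data", LowerString(50)))
print(select(test_table).where(test_table.c.data == "HI"))
```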
engine = create_engine("postgresql://scott:tiger@localhost/test")
insp = inspect(engine)
- print insp.get_table_names()
+ print(insp.get_table_names())
It can also be applied to any :class:`.ClauseElement`, which returns
the :class:`.ClauseElement` itself, such as :class:`.Table`, :class:`.Column`,
when features such as :meth:`.MetaData.create_all` and :func:`.cast` is used::
>>> stmt = select([cast(sometable.c.somechar, String(20, collation='utf8'))])
- >>> print stmt
+ >>> print(stmt)
SELECT CAST(sometable.somechar AS VARCHAR(20) COLLATE "utf8") AS anon_1
FROM sometable
s2 = select([t1, t2]).where(t1.c.x == t2.c.y).where(t1.c.x == s)
- print (s2)
+ print(s2)
SELECT t1.x, t2.y FROM t1, t2
WHERE t1.x = t2.y AND t1.x =
Previously, an expression like the following::
- print (column('x') == 'somevalue').collate("en_EN")
+ print((column('x') == 'somevalue').collate("en_EN"))
would produce an expression like this::
The potentially backwards incompatible change arises if the :meth:`.collate`
operator is being applied to the right-hand column, as follows::
- print column('x') == literal('somevalue').collate("en_EN")
+ print(column('x') == literal('somevalue').collate("en_EN"))
In 0.8, this produces::
generated::
>>> # 0.8
- >>> print column('x').collate('en_EN').desc()
+ >>> print(column('x').collate('en_EN').desc())
(x COLLATE en_EN) DESC
>>> # 0.9
- >>> print column('x').collate('en_EN').desc()
+ >>> print(column('x').collate('en_EN').desc())
x COLLATE en_EN DESC
:ticket:`2879`
>>> from sqlalchemy.dialects import postgresql
>>> type = postgresql.ENUM('one', 'two', "three's", name="myenum")
>>> from sqlalchemy.dialects.postgresql import base
- >>> print base.CreateEnumType(type).compile(dialect=postgresql.dialect())
+ >>> print(base.CreateEnumType(type).compile(dialect=postgresql.dialect()))
CREATE TYPE myenum AS ENUM ('one','two','three''s')
Existing workarounds which already escape single quote signs will need to be
session.commit()
# collection-based relationships are by default named "<classname>_collection"
- print (u1.address_collection)
+ print(u1.address_collection)
Beyond that, the :class:`.AutomapBase` class is a declarative base, and supports
all the features that declarative does. The "automapping" feature can be used
>>> from sqlalchemy import select, and_, false, true
>>> from sqlalchemy.dialects import mysql, postgresql
- >>> print select([t1]).where(t1.c.x).compile(dialect=mysql.dialect())
+ >>> print(select([t1]).where(t1.c.x).compile(dialect=mysql.dialect()))
SELECT t.x, t.y FROM t WHERE t.x = 1
The :func:`.and_` and :func:`.or_` constructs will now exhibit quasi
"short circuit" behavior, that is truncating a rendered expression, when a
:func:`.true` or :func:`.false` constant is present::
- >>> print select([t1]).where(and_(t1.c.y > 5, false())).compile(
- ... dialect=postgresql.dialect())
+ >>> print(select([t1]).where(and_(t1.c.y > 5, false())).compile(
+ ... dialect=postgresql.dialect()))
SELECT t.x, t.y FROM t WHERE false
:func:`.true` can be used as the base to build up an expression::
>>> expr = true()
>>> expr = expr & (t1.c.y > 5)
- >>> print select([t1]).where(expr)
+ >>> print(select([t1]).where(expr))
SELECT t.x, t.y FROM t WHERE t.y > :y_1
The boolean constants :func:`.true` and :func:`.false` themselves render as
``0 = 1`` and ``1 = 1`` for a backend with no boolean constants::
- >>> print select([t1]).where(and_(t1.c.y > 5, false())).compile(
- ... dialect=mysql.dialect())
+ >>> print(select([t1]).where(and_(t1.c.y > 5, false())).compile(
+ ... dialect=mysql.dialect()))
SELECT t.x, t.y FROM t WHERE 0 = 1
Interpretation of ``None``, while not particularly valid SQL, is at least
now consistent::
- >>> print select([t1.c.x]).where(None)
+ >>> print(select([t1.c.x]).where(None))
SELECT t.x FROM t WHERE NULL
- >>> print select([t1.c.x]).where(None).where(None)
+ >>> print(select([t1.c.x]).where(None).where(None))
SELECT t.x FROM t WHERE NULL AND NULL
- >>> print select([t1.c.x]).where(and_(None, None))
+ >>> print(select([t1.c.x]).where(and_(None, None)))
SELECT t.x FROM t WHERE NULL AND NULL
:ticket:`2804`
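A runnable sketch of the short-circuit behavior above, written in the modern ``select()`` calling style with an assumed table definition:

```python
from sqlalchemy import Column, Integer, MetaData, Table, and_, false, select

t1 = Table("t", MetaData(), Column("x", Integer), Column("y", Integer))

# false() inside and_() short-circuits: the "y > 5" criterion is dropped
# and only the false constant remains in the WHERE clause
stmt = select(t1.c.x).where(and_(t1.c.y > 5, false()))
print(stmt)
```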
stmt = select([expr]).order_by(expr)
- print stmt
+ print(stmt)
Prior to 0.9 would render as::
A simple scenario that included "A.b" twice would fail to render
correctly::
- print sess.query(A, a1).order_by(a1.b)
+ print(sess.query(A, a1).order_by(a1.b))
This would order by the wrong column::
Column('y', Integer, default=func.somefunction()))
stmt = select([t.c.x])
- print t.insert().from_select(['x'], stmt)
+ print(t.insert().from_select(['x'], stmt))
Will render::
A query that joins to ``A.bs`` twice::
- print s.query(A).join(A.bs).join(A.bs)
+ print(s.query(A).join(A.bs).join(A.bs))
Will render::
The bigger change involves when joining to an entity without using a
relationship-bound path. If we join to ``B`` twice::
- print s.query(A).join(B, B.a_id == A.id).join(B, B.a_id == A.id)
+ print(s.query(A).join(B, B.a_id == A.id).join(B, B.a_id == A.id))
In 0.9, this would render as follows::
s = Session()
- print s.query(ASub1).join(B, ASub1.b).join(ASub2, B.a)
+ print(s.query(ASub1).join(B, ASub1.b).join(ASub2, B.a))
- print s.query(ASub1).join(B, ASub1.b).join(ASub2, ASub2.id == B.a_id)
+ print(s.query(ASub1).join(B, ASub1.b).join(ASub2, ASub2.id == B.a_id))
The two queries at the bottom are equivalent, and should both render
the identical SQL::
asub2_alias = aliased(ASub2)
- print s.query(ASub1).join(B, ASub1.b).join(asub2_alias, B.a.of_type(asub2_alias))
+ print(s.query(ASub1).join(B, ASub1.b).join(asub2_alias, B.a.of_type(asub2_alias)))
:ticket:`3233`
:ticket:`3367`
>>> books = table('books', column('book_id'), column('owner_id'))
>>> subq = select([books.c.book_id]).\
... where(books.c.owner_id == people.c.people_id).lateral("book_subq")
- >>> print (select([people]).select_from(people.join(subq, true())))
+ >>> print(select([people]).select_from(people.join(subq, true())))
SELECT people.people_id, people.age, people.name
FROM people JOIN LATERAL (SELECT books.book_id AS book_id
FROM books WHERE books.owner_id = people.people_id)
>4=4:PGJ7HQ ... (4703 characters truncated) ... J6IK546AJMB4N6S9L;;9AKI;=
RJPHDSSOTNBUEEC9@Q:RCL:I@5?FO<9K>KJAGAO@E6@A7JI8O:J7B69T6<8;F:S;4BEIJS9HM
K:;5OLPM@JR;R:J6<SOTTT=>Q>7T@I::OTDC:CC<=NGP6C>BC8N',)
- >>> print row
+ >>> print(row)
(u'E6@?>9HPOJB<<BHR:@=TS:5ILU=;JLM<4?B9<S48PTNG9>:=TSTLA;9K;9FPM4M8M@;NM6
GULUAEBT9QGHNHTHR5EP75@OER4?SKC;D:TFUMD:M>;C6U:JLM6R67GEK<A6@S@C@J7>4
=4:PGJ7HQ ... (4703 characters truncated) ... J6IK546AJMB4N6S9L;;9AKI;
>>> engine.execute("create table s (x varchar(max), y varbinary(max))")
>>> insp = inspect(engine)
>>> for col in insp.get_columns("s"):
- ...     print col['type'].__class__, col['type'].length
+ ...     print(col['type'].__class__, col['type'].length)
...
<class 'sqlalchemy.sql.sqltypes.VARCHAR'> max
<class 'sqlalchemy.dialects.mssql.base.VARBINARY'> max
out as None, so that the type objects work in non-SQL Server contexts::
>>> for col in insp.get_columns("s"):
- ...     print col['type'].__class__, col['type'].length
+ ...     print(col['type'].__class__, col['type'].length)
...
<class 'sqlalchemy.sql.sqltypes.VARCHAR'> None
<class 'sqlalchemy.dialects.mssql.base.VARBINARY'> None
connection = engine.connect()
result = connection.execute("select username from users")
for row in result:
-     print "username:", row['username']
+     print("username:", row['username'])
connection.close()
The connection is an instance of :class:`.Connection`,
result = engine.execute("select username from users")
for row in result:
-     print "username:", row['username']
+     print("username:", row['username'])
Where above, the :meth:`~.Engine.execute` method acquires a new
:class:`.Connection` on its own, executes the statement with that object,
result = engine.execute("select username from users")
for row in result:
-     print "username:", row['username']
+     print("username:", row['username'])
In addition to "connectionless" execution, it is also possible
to use the :meth:`~.Executable.execute` method of
Column('geom_data', Geometry)
)
- print select([geometry]).where(
- geometry.c.geom_data == 'LINESTRING(189412 252431,189631 259122)')
+ print(select([geometry]).where(
+ geometry.c.geom_data == 'LINESTRING(189412 252431,189631 259122)'))
The resulting SQL embeds both functions as appropriate. ``ST_AsText``
is applied to the columns clause so that the return value is run through
a :func:`.select` against a :func:`.label` of our expression, the string
label is moved to the outside of the wrapped expression::
- print select([geometry.c.geom_data.label('my_data')])
+ print(select([geometry.c.geom_data.label('my_data')]))
Output::
conn.execute(message.insert(), username="some user",
message="this is my message")
- print conn.scalar(
+ print(conn.scalar(
select([message.c.message]).\
where(message.c.username == "some user")
- )
+ ))
The ``pgp_sym_encrypt`` and ``pgp_sym_decrypt`` functions are applied
to the INSERT and SELECT statements::
Usage::
>>> sometable = Table("sometable", metadata, Column("data", MyInt))
- >>> print sometable.c.data + 5
+ >>> print(sometable.c.data + 5)
sometable.data goofy :data_1
The implementation for :meth:`.ColumnOperators.__add__` is consulted
Using the above type::
- >>> print sometable.c.data.log(5)
+ >>> print(sometable.c.data.log(5))
log(:log_1, :log_2)
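The definition of the type assumed by this ``log()`` example sits outside this excerpt; one plausible reconstruction, following the documented ``comparator_factory`` pattern (``LogInt`` is an illustrative name), is:

```python
from sqlalchemy import Integer, func
from sqlalchemy.sql import column

class LogInt(Integer):
    """An Integer subtype whose comparator grows a log() method."""

    class comparator_factory(Integer.Comparator):
        def log(self, other):
            # self.expr is the column expression the comparator wraps
            return func.log(self.expr, other)

print(column("data", LogInt).log(5))
```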
Using the above type::
>>> from sqlalchemy.sql import column
- >>> print column('x', MyInteger).factorial()
+ >>> print(column('x', MyInteger).factorial())
x !
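The type assumed by the ``factorial()`` example can likewise be sketched with ``custom_op()``; this is a reconstruction, not necessarily the original definition:

```python
from sqlalchemy import Integer
from sqlalchemy.sql import column, operators
from sqlalchemy.sql.expression import UnaryExpression

class MyInteger(Integer):
    class comparator_factory(Integer.Comparator):
        def factorial(self):
            # a postfix "!" operator, rendered after the operand
            return UnaryExpression(
                self.expr, modifier=operators.custom_op("!"), type_=MyInteger
            )

print(column("x", MyInteger).factorial())
```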
See also:
from sqlalchemy.pool import Pool
def my_on_connect(dbapi_con, connection_record):
-     print "New DBAPI connection:", dbapi_con
+     print("New DBAPI connection:", dbapi_con)
listen(Pool, 'connect', my_on_connect)
@listens_for(Pool, "connect")
def my_on_connect(dbapi_con, connection_record):
-     print "New DBAPI connection:", dbapi_con
+     print("New DBAPI connection:", dbapi_con)
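A runnable variant of the listener above, using an in-memory SQLite engine; an ``Engine`` is also accepted as the target for ``Pool`` events:

```python
from sqlalchemy import create_engine, event

engine = create_engine("sqlite://")
connections = []

@event.listens_for(engine, "connect")
def my_on_connect(dbapi_con, connection_record):
    # fires when the pool creates a new DBAPI connection
    connections.append(dbapi_con)
    print("New DBAPI connection:", dbapi_con)

# the first connect triggers the event
with engine.connect() as conn:
    conn.exec_driver_sql("select 1")
```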
Named Argument Styles
---------------------
references)::
>>> for t in metadata.sorted_tables:
- ...     print t.name
+ ...     print(t.name)
user
user_preference
invoice
# iterate through all columns
for c in employees.c:
-     print c
+     print(c)
# get the table's primary key columns
for primary_key in employees.primary_key:
-     print primary_key
+     print(primary_key)
# get the table's foreign key objects:
for fkey in employees.foreign_keys:
-     print fkey
+     print(fkey)
# access the table's MetaData:
employees.metadata
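A self-contained version of these accessors, with a hypothetical ``employees`` table filled in:

```python
from sqlalchemy import Column, Integer, MetaData, String, Table

employees = Table(
    "employees",
    MetaData(),
    Column("employee_id", Integer, primary_key=True),
    Column("employee_name", String(60)),
)

# iterate through all columns
for c in employees.c:
    print(c)

# iterate through the primary key columns
for pk in employees.primary_key:
    print(pk)
```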
- except exc.DBAPIError, e:
+ except exc.DBAPIError as e:
    # an exception is raised, Connection is invalidated.
    if e.connection_invalidated:
-         print "Connection was invalidated!"
+         print("Connection was invalidated!")
# after the invalidate event, a new connection
# starts with a new Pool
from sqlalchemy.engine import reflection
engine = create_engine('...')
insp = reflection.Inspector.from_engine(engine)
- print insp.get_table_names()
+ print(insp.get_table_names())
.. autoclass:: sqlalchemy.engine.reflection.Inspector
:members:
>>> books = table('books', column('book_id'), column('owner_id'))
>>> subq = select([books.c.book_id]).\
... where(books.c.owner_id == people.c.people_id).lateral("book_subq")
- >>> print (select([people]).select_from(people.join(subq, true())))
+ >>> print(select([people]).select_from(people.join(subq, true())))
SELECT people.people_id, people.age, people.name
FROM people JOIN LATERAL (SELECT books.book_id AS book_id
FROM books WHERE books.owner_id = people.people_id)
# ... add Table objects to metadata
ti = metadata.sorted_tables
for t in ti:
-     print t
+     print(t)
How can I get the CREATE TABLE/ DROP TABLE output as a string?
===========================================================================
from sqlalchemy.schema import CreateTable
- print CreateTable(mytable)
+ print(CreateTable(mytable))
To get the string specific to a certain engine::
- print CreateTable(mytable).compile(engine)
+ print(CreateTable(mytable).compile(engine))
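Put together as a runnable sketch, with a ``mytable`` definition assumed for illustration:

```python
from sqlalchemy import Column, Integer, MetaData, Table
from sqlalchemy.schema import CreateTable

# a hypothetical table to make the example self-contained
mytable = Table("mytable", MetaData(), Column("id", Integer, primary_key=True))

# str() of the CreateTable construct yields the generic DDL string
print(CreateTable(mytable))
```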
There's also a special form of :class:`.Engine` that can let you dump an entire
metadata creation sequence, using this recipe::
def dump(sql, *multiparams, **params):
-     print sql.compile(dialect=engine.dialect)
+     print(sql.compile(dialect=engine.dialect))
engine = create_engine('postgresql://', strategy='mock', executor=dump)
metadata.create_all(engine, checkfirst=False)
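Note that ``strategy='mock'`` was removed in SQLAlchemy 1.4; on 1.4 and later the same recipe can be written with ``create_mock_engine()``. A sketch, again assuming a hypothetical ``mytable``:

```python
from sqlalchemy import Column, Integer, MetaData, Table, create_mock_engine

metadata = MetaData()
Table("mytable", metadata, Column("id", Integer, primary_key=True))

statements = []

def dump(sql, *multiparams, **params):
    # collect each DDL statement as a string instead of executing it
    statements.append(str(sql.compile(dialect=engine.dialect)))

engine = create_mock_engine("sqlite://", dump)
metadata.create_all(engine, checkfirst=False)
print("\n".join(statements))
```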
ps.print_stats()
# uncomment this to see who's calling what
# ps.print_callers()
- print s.getvalue()
+ print(s.getvalue())
To profile a section of code::
class Iterates(object):
    def __len__(self):
-         print "LEN!"
+         print("LEN!")
        return 5

    def __iter__(self):
-         print "ITER!"
+         print("ITER!")
        return iter([1, 2, 3, 4, 5])
list(Iterates())
for obj in walk(a1):
- print obj
+ print(obj)
Output::
``__delete__()`` methods. The :class:`.InstrumentedAttribute`
will generate a SQL expression when used at the class level::
- >>> print MyClass.data == 5
+ >>> print(MyClass.data == 5)
data = :data_1
and at the instance level, keeps track of changes to values,
>>> a1 = Address()
>>> u1.addresses
[]
- >>> print a1.user
+ >>> print(a1.user)
None
However, once the ``Address`` is appended to the ``u1.addresses`` collection,
We can observe, by inspecting the resulting property, that both sides
of the relationship have this join condition applied::
- >>> print User.addresses.property.primaryjoin
+ >>> print(User.addresses.property.primaryjoin)
"user".id = address.user_id AND address.email LIKE :email_1 || '%%'
>>>
- >>> print Address.user.property.primaryjoin
+ >>> print(Address.user.property.primaryjoin)
"user".id = address.user_id AND address.email LIKE :email_1 || '%%'
>>>
# iterate through child objects via association, including association
# attributes
for assoc in p.children:
-     print assoc.extra_data
-     print assoc.child
+     print(assoc.extra_data)
+     print(assoc.child)
To enhance the association object pattern such that direct
access to the ``Association`` object is optional, SQLAlchemy
parent = Parent()
parent.children.append(Child())
- print parent.children[0]
+ print(parent.children[0])
Collections are not limited to lists. Sets, mutable sequences and almost any
other Python object that can act as a container can be used in place of the
>>> v = Vertex(start=Point(3, 4), end=Point(5, 6))
>>> session.add(v)
>>> q = session.query(Vertex).filter(Vertex.start == Point(3, 4))
- {sql}>>> print q.first().start
+ {sql}>>> print(q.first().start)
BEGIN (implicit)
INSERT INTO vertice (x1, y1, x2, y2) VALUES (?, ?, ?, ?)
(3, 4, 5, 6)
# equivalent to:
#
# session = Session()
- # print session.query(MyClass).all()
+ # print(session.query(MyClass).all())
#
- print Session.query(MyClass).all()
+ print(Session.query(MyClass).all())
The above code accomplishes the same task as that of acquiring the current
:class:`.Session` by calling upon the registry, then using that :class:`.Session`.
of ``StringAttribute`` objects, which are persisted into a table that's
local to either the ``type_a_strings`` or ``type_b_strings`` table::
- >>> print ta._strings
+ >>> print(ta._strings)
[<__main__.StringAttribute object at 0x10151cd90>,
<__main__.StringAttribute object at 0x10151ce10>]
bn = Bundle('mybundle', MyClass.data1, MyClass.data2)
for row in session.query(bn).filter(bn.c.data1 == 'd1'):
-     print row.mybundle.data1, row.mybundle.data2
+     print(row.mybundle.data1, row.mybundle.data2)
The bundle can be subclassed to provide custom behaviors when results
are fetched. The method :meth:`.Bundle.create_row_processor` is given
bn = DictBundle('mybundle', MyClass.data1, MyClass.data2)
for row in session.query(bn).filter(bn.c.data1 == 'd1'):
-     print row.mybundle['data1'], row.mybundle['data2']
+     print(row.mybundle['data1'], row.mybundle['data2'])
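The ``DictBundle`` subclass used above is defined outside this excerpt; a self-contained sketch based on the documented ``create_row_processor()`` hook (the mapping and table names here are illustrative):

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Bundle, Session, declarative_base

Base = declarative_base()

# a hypothetical mapping to make the example self-contained
class MyClass(Base):
    __tablename__ = "my_stuff"
    id = Column(Integer, primary_key=True)
    data1 = Column(String)
    data2 = Column(String)

class DictBundle(Bundle):
    def create_row_processor(self, query, procs, labels):
        """Return rows as plain dictionaries instead of named tuples."""
        def proc(row):
            return dict(zip(labels, (p(row) for p in procs)))
        return proc

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = Session(engine)
session.add(MyClass(data1="d1", data2="d2"))
session.commit()

bn = DictBundle("mybundle", MyClass.data1, MyClass.data2)
for row in session.query(bn).filter(bn.c.data1 == "d1"):
    print(row.mybundle["data1"], row.mybundle["data2"])
```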
The :class:`.Bundle` construct is also integrated into the behavior
of :func:`.composite`, where it is used to return composite attributes as objects
``.status`` that will behave as one attribute, both at the expression
level::
- >>> print MyClass.job_status == 'some_status'
+ >>> print(MyClass.job_status == 'some_status')
my_table.job_status = :job_status_1
- >>> print MyClass.status == 'some_status'
+ >>> print(MyClass.status == 'some_status')
my_table.job_status = :job_status_1
and at the instance level::
class level, so that it is available from an instance::
some_user = session.query(User).first()
- print some_user.fullname
+ print(some_user.fullname)
as well as usable within queries::
interface::
for obj in session:
-     print obj
+     print(obj)
And presence may be tested for using regular "contains" semantics::
if obj in session:
-     print "Object is present"
+     print("Object is present")
The session is also keeping track of all newly created (i.e. pending) objects,
all objects which have had changes since they were last loaded or saved (i.e.
with session.begin_nested():
    session.merge(record)
except:
-     print "Skipped record %s" % record
+     print("Skipped record %s" % record)
session.commit()
.. _session_autocommit: