- if [[ $TRAVIS_PYTHON_VERSION != 'pypy'* ]]; then travis_retry pip install pycurl; fi
# Twisted runs on 2.x and 3.3+, but is flaky on pypy.
- if [[ $TRAVIS_PYTHON_VERSION != 'pypy'* ]]; then travis_retry pip install Twisted; fi
- - if [[ $TRAVIS_PYTHON_VERSION == '2.7' || $TRAVIS_PYTHON_VERSION == '3.6' ]]; then travis_retry pip install sphinx sphinx_rtd_theme; fi
+ - if [[ $TRAVIS_PYTHON_VERSION == '3.6' ]]; then travis_retry pip install sphinx sphinx_rtd_theme; fi
- if [[ $TRAVIS_PYTHON_VERSION == '3.6' ]]; then travis_retry pip install flake8; fi
# On travis the extension should always be built
- if [[ $TRAVIS_PYTHON_VERSION != 'pypy'* ]]; then export TORNADO_EXTENSION=1; fi
- if [[ "$RUN_COVERAGE" == 1 ]]; then coverage xml; fi
- export TORNADO_EXTENSION=0
- if [[ $TRAVIS_PYTHON_VERSION == 3.6 ]]; then cd ../docs && mkdir sphinx-out && sphinx-build -E -n -W -b html . sphinx-out; fi
- - if [[ $TRAVIS_PYTHON_VERSION == '2.7' || $TRAVIS_PYTHON_VERSION == 3.6 ]]; then cd ../docs && mkdir sphinx-doctest-out && sphinx-build -E -n -b doctest . sphinx-out; fi
+ - if [[ $TRAVIS_PYTHON_VERSION == 3.6 ]]; then cd ../docs && mkdir sphinx-doctest-out && sphinx-build -E -n -b doctest . sphinx-out; fi
- if [[ $TRAVIS_PYTHON_VERSION == '3.6' ]]; then flake8; fi
after_success:
code must be replaced with non-blocking equivalents. This means one of three things:
1. *Find a coroutine-friendly equivalent.* For `time.sleep`, use
- `tornado.gen.sleep` instead::
+ `tornado.gen.sleep` (or `asyncio.sleep`) instead::
class CoroutineSleepHandler(RequestHandler):
- @gen.coroutine
- def get(self):
+ async def get(self):
for i in range(5):
print(i)
- yield gen.sleep(1)
+ await gen.sleep(1)
When this option is available, it is usually the best approach.
See the `Tornado wiki <https://github.com/tornadoweb/tornado/wiki/Links>`_
2. *Find a callback-based equivalent.* Similar to the first option,
callback-based libraries are available for many tasks, although they
are slightly more complicated to use than a library designed for
- coroutines. These are typically used with `tornado.gen.Task` as an
- adapter::
+ coroutines. Wrap the callback-based function in a `.Future`::
class CoroutineTimeoutHandler(RequestHandler):
- @gen.coroutine
- def get(self):
+ async def get(self):
io_loop = IOLoop.current()
for i in range(5):
print(i)
- yield gen.Task(io_loop.add_timeout, io_loop.time() + 1)
+ f = tornado.concurrent.Future()
+ do_something_with_callback(f.set_result)
+ result = await f
Again, the
`Tornado wiki <https://github.com/tornadoweb/tornado/wiki/Links>`_
that can be used for any blocking function whether an asynchronous
counterpart exists or not::
- executor = concurrent.futures.ThreadPoolExecutor(8)
-
class ThreadPoolHandler(RequestHandler):
- @gen.coroutine
- def get(self):
+ async def get(self):
for i in range(5):
print(i)
- yield executor.submit(time.sleep, 1)
+ await IOLoop.current().run_in_executor(None, time.sleep, 1)
See the :doc:`Asynchronous I/O <guide/async>` chapter of the Tornado
user's guide for more on blocking and asynchronous functions.
comparable to asynchronous systems, but they do not actually make
things asynchronous).
+Asynchronous operations in Tornado generally return placeholder
+objects (``Futures``), with the exception of some low-level components
+like the `.IOLoop` that use callbacks. ``Futures`` are usually
+transformed into their result with the ``await`` or ``yield``
+keywords.
+
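As a minimal sketch of this idea (an editor's illustration, not part of
the original page; ``wait_for_result`` is a made-up name)::

    from tornado.concurrent import Future
    from tornado.ioloop import IOLoop

    async def wait_for_result():
        fut = Future()  # placeholder for a value that is not ready yet
        # Arrange for the result to be filled in one second from now.
        IOLoop.current().call_later(1, fut.set_result, 42)
        value = await fut  # suspends here until set_result() is called
        return value
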
Examples
~~~~~~~~
.. testoutput::
:hide:
-And here is the same function rewritten to be asynchronous with a
-callback argument:
+And here is the same function rewritten asynchronously as a native coroutine:
.. testcode::
- from tornado.httpclient import AsyncHTTPClient
+ from tornado.httpclient import AsyncHTTPClient
- def asynchronous_fetch(url, callback):
- http_client = AsyncHTTPClient()
- def handle_response(response):
- callback(response.body)
- http_client.fetch(url, callback=handle_response)
+ async def asynchronous_fetch(url):
+ http_client = AsyncHTTPClient()
+ response = await http_client.fetch(url)
+ return response.body
.. testoutput::
:hide:
-And again with a `.Future` instead of a callback:
+Or for compatibility with older versions of Python, using the `tornado.gen` module:
-.. testcode::
+.. testcode::
- from tornado.concurrent import Future
+ from tornado.httpclient import AsyncHTTPClient
+ from tornado import gen
- def async_fetch_future(url):
+ @gen.coroutine
+ def async_fetch_gen(url):
http_client = AsyncHTTPClient()
- my_future = Future()
- fetch_future = http_client.fetch(url)
- fetch_future.add_done_callback(
- lambda f: my_future.set_result(f.result()))
- return my_future
-
-.. testoutput::
- :hide:
+ response = yield http_client.fetch(url)
+ raise gen.Return(response.body)
-The raw `.Future` version is more complex, but ``Futures`` are
-nonetheless recommended practice in Tornado because they have two
-major advantages. Error handling is more consistent since the
-``Future.result`` method can simply raise an exception (as opposed to
-the ad-hoc error handling common in callback-oriented interfaces), and
-``Futures`` lend themselves well to use with coroutines. Coroutines
-will be discussed in depth in the next section of this guide. Here is
-the coroutine version of our sample function, which is very similar to
-the original synchronous version:
+Coroutines are a little magical, but what they do internally is something like this:
.. testcode::
- from tornado import gen
+ from tornado.concurrent import Future
- @gen.coroutine
- def fetch_coroutine(url):
+ def async_fetch_manual(url):
http_client = AsyncHTTPClient()
- response = yield http_client.fetch(url)
- raise gen.Return(response.body)
+ my_future = Future()
+ fetch_future = http_client.fetch(url)
+ def on_fetch(f):
+ my_future.set_result(f.result().body)
+ fetch_future.add_done_callback(on_fetch)
+ return my_future
.. testoutput::
:hide:
-The statement ``raise gen.Return(response.body)`` is an artifact of
-Python 2, in which generators aren't allowed to return
-values. To overcome this, Tornado coroutines raise a special kind of
-exception called a `.Return`. The coroutine catches this exception and
-treats it like a returned value. In Python 3.3 and later, a ``return
-response.body`` achieves the same result.
+Notice that the coroutine returns its `.Future` before the fetch is
+done. This is what makes coroutines *asynchronous*.
+
+Anything you can do with coroutines you can also do by passing
+callback objects around, but coroutines provide an important
+simplification by letting you organize your code in the same way you
+would if it were synchronous. This is especially important for error
+handling, since ``try``/``except`` blocks work as you would expect in
+coroutines while this is difficult to achieve with callbacks.
+Coroutines will be discussed in depth in the next section of this
+guide.
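
As a quick illustration of this point (an editor's sketch, not part of
the original page; ``fetch_or_log`` is a hypothetical wrapper around the
``asynchronous_fetch`` coroutine shown above)::

    async def fetch_or_log(url):
        try:
            return await asynchronous_fetch(url)
        except Exception as e:
            # The exception raised inside the awaited coroutine surfaces
            # here, just as it would in synchronous code.
            print("fetch failed:", e)
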
from tornado import gen
**Coroutines** are the recommended way to write asynchronous code in
-Tornado. Coroutines use the Python ``yield`` keyword to suspend and
-resume execution instead of a chain of callbacks (cooperative
-lightweight threads as seen in frameworks like `gevent
+Tornado. Coroutines use the Python ``await`` or ``yield`` keyword to
+suspend and resume execution instead of a chain of callbacks
+(cooperative lightweight threads as seen in frameworks like `gevent
<http://www.gevent.org>`_ are sometimes called coroutines as well, but
in Tornado all coroutines use explicit context switches and are called
as asynchronous functions).
Example::
- from tornado import gen
-
- @gen.coroutine
- def fetch_coroutine(url):
+ async def fetch_coroutine(url):
http_client = AsyncHTTPClient()
- response = yield http_client.fetch(url)
- # In Python versions prior to 3.3, returning a value from
- # a generator is not allowed and you must use
- # raise gen.Return(response.body)
- # instead.
+ response = await http_client.fetch(url)
return response.body
.. _native_coroutines:
-Python 3.5: ``async`` and ``await``
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+Native vs decorated coroutines
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-Python 3.5 introduces the ``async`` and ``await`` keywords (functions
-using these keywords are also called "native coroutines"). Starting in
-Tornado 4.3, you can use them in place of most ``yield``-based
-coroutines (see the following paragraphs for limitations). Simply use
-``async def foo()`` in place of a function definition with the
-``@gen.coroutine`` decorator, and ``await`` in place of yield. The
-rest of this document still uses the ``yield`` style for compatibility
-with older versions of Python, but ``async`` and ``await`` will run
-faster when they are available::
+Python 3.5 introduced the ``async`` and ``await`` keywords (functions
+using these keywords are also called "native coroutines"). For
+compatibility with older versions of Python, you can use "decorated"
+or "yield-based" coroutines using the `tornado.gen.coroutine`
+decorator.
- async def fetch_coroutine(url):
- http_client = AsyncHTTPClient()
- response = await http_client.fetch(url)
- return response.body
+Native coroutines are the recommended form whenever possible. Only use
+decorated coroutines when compatibility with older versions of Python
+is required. Examples in the Tornado documentation will generally use
+the native form.
-The ``await`` keyword is less versatile than the ``yield`` keyword.
-For example, in a ``yield``-based coroutine you can yield a list of
-``Futures``, while in a native coroutine you must wrap the list in
-`tornado.gen.multi`. This also eliminates the integration with
-`concurrent.futures`. You can use `tornado.gen.convert_yielded`
-to convert anything that would work with ``yield`` into a form that
-will work with ``await``::
+Translation between the two forms is generally straightforward::
- async def f():
- executor = concurrent.futures.ThreadPoolExecutor()
- await tornado.gen.convert_yielded(executor.submit(g))
+    # Decorated:                    # Native:
+
+    # Normal function declaration
+    # with decorator                # "async def" keywords
+    @gen.coroutine
+    def a():                        async def a():
+        # "yield" all async funcs   # "await" all async funcs
+        b = yield c()               b = await c()
+        # "return" and "yield"
+        # cannot be mixed in
+        # Python 2, so raise a
+        # special exception.        # Return normally
+        raise gen.Return(b)         return b
+
+Other differences between the two forms of coroutine are:
+
+- Native coroutines are generally faster.
+- Native coroutines can use ``async for`` and ``async with``
+ statements which make some patterns much simpler.
+- Native coroutines do not run at all unless you ``await`` or
+ ``yield`` them. Decorated coroutines can start running "in the
+ background" as soon as they are called. Note that for both kinds of
+ coroutines it is important to use ``await`` or ``yield`` so that
+ any exceptions have somewhere to go.
+- Decorated coroutines have additional integration with the
+ `concurrent.futures` package, allowing the result of
+ ``executor.submit`` to be yielded directly. For native coroutines,
+ use `.IOLoop.run_in_executor` instead (see the sketch after this list).
+- Decorated coroutines support some shorthand for waiting on multiple
+ objects by yielding a list or dict. Use `tornado.gen.multi` to do
+ this in native coroutines.
+- Decorated coroutines can support integration with other packages
+ including Twisted via a registry of conversion functions.
+ To access this functionality in native coroutines, use
+ `tornado.gen.convert_yielded`.
+- Decorated coroutines always return a `.Future` object. Native
+ coroutines return an *awaitable* object that is not a `.Future`. In
+ Tornado the two are mostly interchangeable.
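
As an illustration of the `concurrent.futures` point above (an editor's
sketch, not part of the patch; ``blocking_io`` is a hypothetical
blocking function)::

    import concurrent.futures

    from tornado import gen
    from tornado.ioloop import IOLoop

    executor = concurrent.futures.ThreadPoolExecutor(4)

    @gen.coroutine
    def decorated():
        # Decorated coroutines may yield the concurrent.futures.Future
        # returned by executor.submit() directly.
        result = yield executor.submit(blocking_io)

    async def native():
        # Native coroutines go through the IOLoop instead.
        result = await IOLoop.current().run_in_executor(executor, blocking_io)
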
How it works
~~~~~~~~~~~~
+This section explains the operation of decorated coroutines. Native
+coroutines are conceptually similar, but a little more complicated
+because of the extra integration with the Python runtime.
+
A function containing ``yield`` is a **generator**. All generators
are asynchronous; when called they return a generator object instead
of running to completion. The ``@gen.coroutine`` decorator
~~~~~~~~~~~~~~~~~~~~~~~
Coroutines do not raise exceptions in the normal way: any exception
-they raise will be trapped in the `.Future` until it is yielded. This
-means it is important to call coroutines in the right way, or you may
-have errors that go unnoticed::
+they raise will be trapped in the awaitable object until it is
+yielded. This means it is important to call coroutines in the right
+way, or you may have errors that go unnoticed::
- @gen.coroutine
- def divide(x, y):
+ async def divide(x, y):
return x / y
def bad_call():
divide(1, 0)
In nearly all cases, any function that calls a coroutine must be a
-coroutine itself, and use the ``yield`` keyword in the call. When you
-are overriding a method defined in a superclass, consult the
-documentation to see if coroutines are allowed (the documentation
-should say that the method "may be a coroutine" or "may return a
-`.Future`")::
-
- @gen.coroutine
- def good_call():
- # yield will unwrap the Future returned by divide() and raise
+coroutine itself, and use the ``await`` or ``yield`` keyword in the
+call. When you are overriding a method defined in a superclass,
+consult the documentation to see if coroutines are allowed (the
+documentation should say that the method "may be a coroutine" or "may
+return a `.Future`")::
+
+ async def good_call():
+ # await will unwrap the object returned by divide() and raise
# the exception.
- yield divide(1, 0)
+ await divide(1, 0)
Sometimes you may want to "fire and forget" a coroutine without waiting
for its result. In this case it is recommended to use `.IOLoop.spawn_callback`,
use `.IOLoop.run_in_executor`, which returns
``Futures`` that are compatible with coroutines::
- @gen.coroutine
- def call_blocking():
- yield IOLoop.current().run_in_executor(blocking_func, args)
+ async def call_blocking():
+ await IOLoop.current().run_in_executor(None, blocking_func, args)
Parallelism
^^^^^^^^^^^
-The `.coroutine` decorator recognizes lists and dicts whose values are
+The `.multi` function accepts lists and dicts whose values are
``Futures``, and waits for all of those ``Futures`` in parallel:
.. testcode::
- @gen.coroutine
- def parallel_fetch(url1, url2):
- resp1, resp2 = yield [http_client.fetch(url1),
- http_client.fetch(url2)]
+ from tornado.gen import multi
- @gen.coroutine
- def parallel_fetch_many(urls):
- responses = yield [http_client.fetch(url) for url in urls]
+ async def parallel_fetch(url1, url2):
+ resp1, resp2 = await multi([http_client.fetch(url1),
+ http_client.fetch(url2)])
+
+ async def parallel_fetch_many(urls):
+ responses = await multi([http_client.fetch(url) for url in urls])
# responses is a list of HTTPResponses in the same order
- @gen.coroutine
- def parallel_fetch_dict(urls):
- responses = yield {url: http_client.fetch(url)
- for url in urls}
+ async def parallel_fetch_dict(urls):
+ responses = await multi({url: http_client.fetch(url)
+ for url in urls})
# responses is a dict {url: HTTPResponse}
.. testoutput::
:hide:
-Lists and dicts must be wrapped in `tornado.gen.multi` for use with
-``await``::
+In decorated coroutines, it is possible to ``yield`` the list or dict directly::
- async def parallel_fetch(url1, url2):
- resp1, resp2 = await gen.multi([http_client.fetch(url1),
- http_client.fetch(url2)])
+ @gen.coroutine
+ def parallel_fetch_decorated(url1, url2):
+ resp1, resp2 = yield [http_client.fetch(url1),
+ http_client.fetch(url2)]
Interleaving
^^^^^^^^^^^^
Sometimes it is useful to save a `.Future` instead of yielding it
-immediately, so you can start another operation before waiting:
+immediately, so you can start another operation before waiting.
+
+.. testcode::
+
+ from tornado.gen import convert_yielded
+
+ async def get(self):
+ # convert_yielded() starts the native coroutine in the background.
+ # This is equivalent to asyncio.ensure_future() (both work in Tornado).
+ fetch_future = convert_yielded(self.fetch_next_chunk())
+ while True:
+ chunk = await fetch_future
+ if chunk is None: break
+ self.write(chunk)
+ fetch_future = convert_yielded(self.fetch_next_chunk())
+ await self.flush()
+
+.. testoutput::
+ :hide:
+
+This is a little easier to do with decorated coroutines, because they
+start immediately when called:
.. testcode::
.. testoutput::
:hide:
-This pattern is most usable with ``@gen.coroutine``. If
-``fetch_next_chunk()`` uses ``async def``, then it must be called as
-``fetch_future =
-tornado.gen.convert_yielded(self.fetch_next_chunk())`` to start the
-background processing.
-
Looping
^^^^^^^
coroutine can contain a ``while True:`` loop and use
`tornado.gen.sleep`::
- @gen.coroutine
- def minute_loop():
+ async def minute_loop():
while True:
- yield do_something()
- yield gen.sleep(60)
+ await do_something()
+ await gen.sleep(60)
# Coroutines that loop forever are generally started with
# spawn_callback().
time of ``do_something()``. To run exactly every 60 seconds, use the
interleaving pattern from above::
- @gen.coroutine
- def minute_loop2():
+ async def minute_loop2():
while True:
nxt = gen.sleep(60) # Start the clock.
- yield do_something() # Run while the clock is ticking.
- yield nxt # Wait for the timer to run out.
+ await do_something() # Run while the clock is ticking.
+ await nxt # Wait for the timer to run out.
components and can also be used to implement other protocols.
* A coroutine library (`tornado.gen`) which allows asynchronous
code to be written in a more straightforward way than chaining
- callbacks.
+ callbacks. This is similar to the native coroutine feature introduced
+ in Python 3.5 (``async def``). Native coroutines are recommended
+ in place of the `tornado.gen` module when available.
The Tornado web framework and HTTP server together offer a full-stack
alternative to `WSGI <http://www.python.org/dev/peps/pep-3333/>`_.
class GoogleOAuth2LoginHandler(tornado.web.RequestHandler,
tornado.auth.GoogleOAuth2Mixin):
- @tornado.gen.coroutine
- def get(self):
+ async def get(self):
if self.get_argument('code', False):
- user = yield self.get_authenticated_user(
+ user = await self.get_authenticated_user(
redirect_uri='http://your.site.com/auth/google',
code=self.get_argument('code'))
# Save the user with e.g. set_secure_cookie
else:
- yield self.authorize_redirect(
+ await self.authorize_redirect(
redirect_uri='http://your.site.com/auth/google',
client_id=self.settings['google_oauth']['key'],
scope=['profile', 'email'],
etc. If the URL regular expression contains capturing groups, they
are passed as arguments to this method.
5. When the request is finished, `~.RequestHandler.on_finish()` is
- called. For synchronous handlers this is immediately after
- ``get()`` (etc) return; for asynchronous handlers it is after the
- call to `~.RequestHandler.finish()`.
+ called. For most handlers this is immediately after ``get()`` (etc)
+ return; for handlers using the `tornado.web.asynchronous` decorator
+ it is after the call to `~.RequestHandler.finish()`.
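
For illustration (an editor's sketch, not part of the original page), a
handler that overrides ``on_finish()`` in step 5 for post-request
logging::

    class LoggingHandler(tornado.web.RequestHandler):
        def get(self):
            self.write("Hello, world")

        def on_finish(self):
            # Runs after the response has been sent: suitable for
            # logging or cleanup, but too late to modify the output.
            print("finished request for", self.request.uri)
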
All methods designed to be overridden are noted as such in the
`.RequestHandler` documentation. Some of the most commonly
Asynchronous handlers
~~~~~~~~~~~~~~~~~~~~~
-Tornado handlers are synchronous by default: when the
-``get()``/``post()`` method returns, the request is considered
-finished and the response is sent. Since all other requests are
-blocked while one handler is running, any long-running handler should
-be made asynchronous so it can call its slow operations in a
-non-blocking way. This topic is covered in more detail in
-:doc:`async`; this section is about the particulars of
-asynchronous techniques in `.RequestHandler` subclasses.
-
-The simplest way to make a handler asynchronous is to use the
-`.coroutine` decorator or ``async def``. This allows you to perform
-non-blocking I/O with the ``yield`` or ``await`` keywords, and no
-response will be sent until the coroutine has returned. See
-:doc:`coroutines` for more details.
-
-In some cases, coroutines may be less convenient than a
-callback-oriented style, in which case the `.tornado.web.asynchronous`
-decorator can be used instead. When this decorator is used the response
-is not automatically sent; instead the request will be kept open until
-some callback calls `.RequestHandler.finish`. It is up to the application
-to ensure that this method is called, or else the user's browser will
-simply hang.
-
-Here is an example that makes a call to the FriendFeed API using
-Tornado's built-in `.AsyncHTTPClient`:
+Certain handler methods (including ``prepare()`` and the HTTP verb
+methods ``get()``/``post()``/etc) may be overridden as coroutines to
+make the handler asynchronous.
-.. testcode::
-
- class MainHandler(tornado.web.RequestHandler):
- @tornado.web.asynchronous
- def get(self):
- http = tornado.httpclient.AsyncHTTPClient()
- http.fetch("http://friendfeed-api.com/v2/feed/bret",
- callback=self.on_response)
-
- def on_response(self, response):
- if response.error: raise tornado.web.HTTPError(500)
- json = tornado.escape.json_decode(response.body)
- self.write("Fetched " + str(len(json["entries"])) + " entries "
- "from the FriendFeed API")
- self.finish()
+Tornado also supports a callback-based style of asynchronous handler
+with the `tornado.web.asynchronous` decorator, but this style is
+deprecated and will be removed in Tornado 6.0. New applications should
+use coroutines instead.
-.. testoutput::
- :hide:
-
-When ``get()`` returns, the request has not finished. When the HTTP
-client eventually calls ``on_response()``, the request is still open,
-and the response is finally flushed to the client with the call to
-``self.finish()``.
-
-For comparison, here is the same example using a coroutine:
+For example, here is a simple handler using a coroutine:
.. testcode::
class MainHandler(tornado.web.RequestHandler):
- @tornado.gen.coroutine
- def get(self):
+ async def get(self):
http = tornado.httpclient.AsyncHTTPClient()
- response = yield http.fetch("http://friendfeed-api.com/v2/feed/bret")
+ response = await http.fetch("http://friendfeed-api.com/v2/feed/bret")
json = tornado.escape.json_decode(response.body)
self.write("Fetched " + str(len(json["entries"])) + " entries "
"from the FriendFeed API")
source tarball or clone the `git repository
<https://github.com/tornadoweb/tornado>`_ as well.
-**Prerequisites**: Tornado runs on Python 2.7, and 3.4+.
-The updates to the `ssl` module in Python 2.7.9 are required
-(in some distributions, these updates may be available in
-older python versions). In addition to the requirements
-which will be installed automatically by ``pip`` or ``setup.py install``,
-the following optional packages may be useful:
+**Prerequisites**: Tornado 5.x runs on Python 2.7, and 3.4+ (Tornado
+6.0 will require Python 3.5+; Python 2 will no longer be supported).
+The updates to the `ssl` module in Python 2.7.9 are required (in some
+distributions, these updates may be available in older Python
+versions). In addition to the requirements which will be installed
+automatically by ``pip`` or ``setup.py install``, the following
+optional packages may be useful:
* `pycurl <http://pycurl.sourceforge.net>`_ is used by the optional
``tornado.curl_httpclient``. Libcurl version 7.22 or higher is required.
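
  For example (an editor's sketch, not part of the original list), the
  curl-based client can be selected at startup::

      import tornado.httpclient

      # Requires pycurl; all AsyncHTTPClient instances will then use
      # the libcurl-based implementation.
      tornado.httpclient.AsyncHTTPClient.configure(
          "tornado.curl_httpclient.CurlAsyncHTTPClient")
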
class GoogleOAuth2LoginHandler(tornado.web.RequestHandler,
tornado.auth.GoogleOAuth2Mixin):
- @tornado.gen.coroutine
- def get(self):
+ async def get(self):
if self.get_argument('code', False):
- user = yield self.get_authenticated_user(
+ user = await self.get_authenticated_user(
redirect_uri='http://your.site.com/auth/google',
code=self.get_argument('code'))
# Save the user with e.g. set_secure_cookie
else:
- yield self.authorize_redirect(
+ await self.authorize_redirect(
redirect_uri='http://your.site.com/auth/google',
client_id=self.settings['google_oauth']['key'],
scope=['profile', 'email'],
class MainHandler(tornado.web.RequestHandler,
tornado.auth.FacebookGraphMixin):
@tornado.web.authenticated
- @tornado.gen.coroutine
- def get(self):
- new_entry = yield self.oauth2_request(
+ async def get(self):
+ new_entry = await self.oauth2_request(
"https://graph.facebook.com/me/feed",
post_args={"message": "I am posting from my Tornado application!"},
access_token=self.current_user["access_token"])
if not new_entry:
# Call failed; perhaps missing permission?
- yield self.authorize_redirect()
+ await self.authorize_redirect()
return
self.finish("Posted a message!")
class TwitterLoginHandler(tornado.web.RequestHandler,
tornado.auth.TwitterMixin):
- @tornado.gen.coroutine
- def get(self):
+ async def get(self):
if self.get_argument("oauth_token", None):
- user = yield self.get_authenticated_user()
+ user = await self.get_authenticated_user()
# Save the user using e.g. set_secure_cookie()
else:
- yield self.authorize_redirect()
+ await self.authorize_redirect()
.. testoutput::
:hide:
class MainHandler(tornado.web.RequestHandler,
tornado.auth.TwitterMixin):
@tornado.web.authenticated
- @tornado.gen.coroutine
- def get(self):
- new_entry = yield self.twitter_request(
+ async def get(self):
+ new_entry = await self.twitter_request(
"/statuses/update",
post_args={"status": "Testing Tornado Web Server"},
access_token=self.current_user["access_token"])
class GoogleOAuth2LoginHandler(tornado.web.RequestHandler,
tornado.auth.GoogleOAuth2Mixin):
- @tornado.gen.coroutine
- def get(self):
+ async def get(self):
if self.get_argument('code', False):
- access = yield self.get_authenticated_user(
+ access = await self.get_authenticated_user(
redirect_uri='http://your.site.com/auth/google',
code=self.get_argument('code'))
- user = yield self.oauth2_request(
+ user = await self.oauth2_request(
"https://www.googleapis.com/oauth2/v1/userinfo",
access_token=access["access_token"])
# Save the user and access token with
# e.g. set_secure_cookie.
else:
- yield self.authorize_redirect(
+ await self.authorize_redirect(
redirect_uri='http://your.site.com/auth/google',
client_id=self.settings['google_oauth']['key'],
scope=['profile', 'email'],
class FacebookGraphLoginHandler(tornado.web.RequestHandler,
tornado.auth.FacebookGraphMixin):
- @tornado.gen.coroutine
- def get(self):
+ async def get(self):
if self.get_argument("code", False):
- user = yield self.get_authenticated_user(
+ user = await self.get_authenticated_user(
redirect_uri='/auth/facebookgraph/',
client_id=self.settings["facebook_api_key"],
client_secret=self.settings["facebook_secret"],
code=self.get_argument("code"))
# Save the user with e.g. set_secure_cookie
else:
- yield self.authorize_redirect(
+ await self.authorize_redirect(
redirect_uri='/auth/facebookgraph/',
client_id=self.settings["facebook_api_key"],
extra_params={"scope": "read_stream,offline_access"})
class MainHandler(tornado.web.RequestHandler,
tornado.auth.FacebookGraphMixin):
@tornado.web.authenticated
- @tornado.gen.coroutine
- def get(self):
- new_entry = yield self.facebook_request(
+ async def get(self):
+ new_entry = await self.facebook_request(
"/me/feed",
post_args={"message": "I am posting from my Tornado application!"},
access_token=self.current_user["access_token"])
If no callback is given, the caller should use the ``Future`` to
wait for the function to complete (perhaps by yielding it in a
- `.gen.engine` function, or passing it to `.IOLoop.add_future`).
+ coroutine, or passing it to `.IOLoop.add_future`).
Usage:
# Do stuff (possibly asynchronous)
callback(result)
- @gen.engine
- def caller(callback):
- yield future_func(arg1, arg2)
- callback()
+ async def caller():
+ await future_func(arg1, arg2)
..
technically asynchronous, but it is written as a single generator
instead of a collection of separate functions.
-For example, the following asynchronous handler:
+For example, the following callback-based asynchronous handler:
.. testcode::
Example usage::
- def handle_response(response):
- if response.error:
- print("Error: %s" % response.error)
+ async def f():
+ http_client = AsyncHTTPClient()
+ try:
+ response = await http_client.fetch("http://www.google.com")
+ except Exception as e:
+ print("Error: %s" % e)
else:
print(response.body)
- http_client = AsyncHTTPClient()
- http_client.fetch("http://www.google.com/", handle_response)
-
The constructor for this class is magic in several respects: It
actually creates an instance of an implementation-specific
subclass, and instances are reused as a kind of pseudo-singleton
from tornado import gen
from tornado.iostream import IOStream
- @gen.coroutine
- def handle_connection(connection, address):
+ async def handle_connection(connection, address):
stream = IOStream(connection)
- message = yield stream.read_until_close()
+ message = await stream.read_until_close()
print("message from client:", message.decode().strip())
def connection_ready(sock, fd, events):
If the event loop is not currently running, the next call to `start()`
will return immediately.
- To use asynchronous methods from otherwise-synchronous code (such as
- unit tests), you can start and stop the event loop like this::
-
- ioloop = IOLoop()
- async_method(ioloop=ioloop, callback=ioloop.stop)
- ioloop.start()
-
- ``ioloop.start()`` will return after ``async_method`` has run
- its callback, whether that callback was invoked before or
- after ``ioloop.start``.
-
Note that even after `stop` has been called, the `IOLoop` is not
completely stopped until `IOLoop.start` has also returned.
Some work that was scheduled before the call to `stop` may still
def run_sync(self, func, timeout=None):
"""Starts the `IOLoop`, runs the given function, and stops the loop.
- The function must return either a yieldable object or
- ``None``. If the function returns a yieldable object, the
- `IOLoop` will run until the yieldable is resolved (and
- `run_sync()` will return the yieldable's result). If it raises
+ The function must return either an awaitable object or
+ ``None``. If the function returns an awaitable object, the
+ `IOLoop` will run until the awaitable is resolved (and
+ `run_sync()` will return the awaitable's result). If it raises
an exception, the `IOLoop` will stop and the exception will be
re-raised to the caller.
a maximum duration for the function. If the timeout expires,
a `tornado.util.TimeoutError` is raised.
- This method is useful in conjunction with `tornado.gen.coroutine`
- to allow asynchronous calls in a ``main()`` function::
+ This method is useful to allow asynchronous calls in a
+ ``main()`` function::
- @gen.coroutine
- def main():
+ async def main():
# do stuff...
if __name__ == '__main__':
IOLoop.current().run_sync(main)
.. versionchanged:: 4.3
- Returning a non-``None``, non-yieldable value is now an error.
+ Returning a non-``None``, non-awaitable value is now an error.
.. versionchanged:: 5.0
If a timeout occurs, the ``func`` coroutine will be cancelled.
+
"""
future_cell = [None]
The callback is invoked with one argument, the
`.Future`.
+
+ This method only accepts `.Future` objects and not other
+ awaitables (unlike most of Tornado where the two are
+ interchangeable).
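
For example (an illustrative sketch, not part of the original
docstring; ``some_future`` stands for any Tornado `.Future`, such as
the result of ``AsyncHTTPClient.fetch``)::

    def on_done(completed):
        # ``completed`` is the same Future, now resolved.
        print("result:", completed.result())

    IOLoop.current().add_future(some_future, on_done)
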
"""
assert is_future(future)
callback = stack_context.wrap(callback)
import tornado.iostream
import socket
- def send_request():
- stream.write(b"GET / HTTP/1.0\r\nHost: friendfeed.com\r\n\r\n")
- stream.read_until(b"\r\n\r\n", on_headers)
-
- def on_headers(data):
+ async def main():
+ s = socket.socket(socket.AF_INET, socket.SOCK_STREAM, 0)
+ stream = tornado.iostream.IOStream(s)
+ await stream.connect(("friendfeed.com", 80))
+ await stream.write(b"GET / HTTP/1.0\r\nHost: friendfeed.com\r\n\r\n")
+ header_data = await stream.read_until(b"\r\n\r\n")
headers = {}
- for line in data.split(b"\r\n"):
- parts = line.split(b":")
- if len(parts) == 2:
- headers[parts[0].strip()] = parts[1].strip()
- stream.read_bytes(int(headers[b"Content-Length"]), on_body)
-
- def on_body(data):
- print(data)
+ for line in header_data.split(b"\r\n"):
+ parts = line.split(b":")
+ if len(parts) == 2:
+ headers[parts[0].strip()] = parts[1].strip()
+ body_data = await stream.read_bytes(int(headers[b"Content-Length"]))
+ print(body_data)
stream.close()
- tornado.ioloop.IOLoop.current().stop()
if __name__ == '__main__':
+ tornado.ioloop.IOLoop.current().run_sync(main)
- s = socket.socket(socket.AF_INET, socket.SOCK_STREAM, 0)
- stream = tornado.iostream.IOStream(s)
- stream.connect(("friendfeed.com", 80), send_request)
condition = Condition()
- @gen.coroutine
- def waiter():
+ async def waiter():
print("I'll wait right here")
- yield condition.wait() # Yield a Future.
+ await condition.wait()
print("I'm done waiting")
- @gen.coroutine
- def notifier():
+ async def notifier():
print("About to notify")
condition.notify()
print("Done notifying")
- @gen.coroutine
- def runner():
- # Yield two Futures; wait for waiter() and notifier() to finish.
- yield [waiter(), notifier()]
+ async def runner():
+ # Wait for waiter() and notifier() in parallel
+ await gen.multi([waiter(), notifier()])
IOLoop.current().run_sync(runner)
io_loop = IOLoop.current()
# Wait up to 1 second for a notification.
- yield condition.wait(timeout=io_loop.time() + 1)
+ await condition.wait(timeout=io_loop.time() + 1)
...or a `datetime.timedelta` for a timeout relative to the current time::
# Wait up to 1 second.
- yield condition.wait(timeout=datetime.timedelta(seconds=1))
+ await condition.wait(timeout=datetime.timedelta(seconds=1))
The method returns False if there's no notification before the deadline.
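
For example (an editor's sketch, not part of the original docstring)::

    ok = await condition.wait(timeout=datetime.timedelta(seconds=1))
    if not ok:
        print("timed out waiting for a notification")
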
event = Event()
- @gen.coroutine
- def waiter():
+ async def waiter():
print("Waiting for event")
- yield event.wait()
+ await event.wait()
print("Not waiting this time")
- yield event.wait()
+ await event.wait()
print("Done")
- @gen.coroutine
- def setter():
+ async def setter():
print("About to set the event")
event.set()
- @gen.coroutine
- def runner():
- yield [waiter(), setter()]
+ async def runner():
+ await gen.multi([waiter(), setter()])
IOLoop.current().run_sync(runner)
# Ensure reliable doctest output: resolve Futures one at a time.
futures_q = deque([Future() for _ in range(3)])
- @gen.coroutine
- def simulator(futures):
+ async def simulator(futures):
for f in futures:
# simulate the asynchronous passage of time
- yield gen.moment
- yield gen.moment
+ await gen.sleep(0)
+ await gen.sleep(0)
f.set_result(None)
IOLoop.current().add_callback(simulator, list(futures_q))
sem = Semaphore(2)
- @gen.coroutine
- def worker(worker_id):
- yield sem.acquire()
+ async def worker(worker_id):
+ await sem.acquire()
try:
print("Worker %d is working" % worker_id)
- yield use_some_resource()
+ await use_some_resource()
finally:
print("Worker %d is done" % worker_id)
sem.release()
- @gen.coroutine
- def runner():
+ async def runner():
# Join all workers.
- yield [worker(i) for i in range(3)]
+ await gen.multi([worker(i) for i in range(3)])
IOLoop.current().run_sync(runner)
Workers 0 and 1 are allowed to run concurrently, but worker 2 waits until
the semaphore has been released once, by worker 0.
- `.acquire` is a context manager, so ``worker`` could be written as::
+ The semaphore can be used as an async context manager::
- @gen.coroutine
- def worker(worker_id):
- with (yield sem.acquire()):
+ async def worker(worker_id):
+ async with sem:
print("Worker %d is working" % worker_id)
- yield use_some_resource()
+ await use_some_resource()
# Now the semaphore has been released.
print("Worker %d is done" % worker_id)
- In Python 3.5, the semaphore itself can be used as an async context
- manager::
+ For compatibility with older versions of Python, `.acquire` is a
+ context manager, so ``worker`` could also be written as::
- async def worker(worker_id):
- async with sem:
+ @gen.coroutine
+ def worker(worker_id):
+ with (yield sem.acquire()):
print("Worker %d is working" % worker_id)
- await use_some_resource()
+ yield use_some_resource()
# Now the semaphore has been released.
print("Worker %d is done" % worker_id)
.. versionchanged:: 4.3
Added ``async with`` support in Python 3.5.
+
"""
def __init__(self, value=1):
super(Semaphore, self).__init__()
Releasing an unlocked lock raises `RuntimeError`.
- `acquire` supports the context manager protocol in all Python versions:
+ A Lock can be used as an async context manager with the ``async
+ with`` statement:
>>> from tornado import gen, locks
>>> lock = locks.Lock()
>>>
- >>> @gen.coroutine
- ... def f():
- ... with (yield lock.acquire()):
+ >>> async def f():
+ ... async with lock:
... # Do something holding the lock.
... pass
...
... # Now the lock is released.
- In Python 3.5, `Lock` also supports the async context manager
- protocol. Note that in this case there is no `acquire`, because
- ``async with`` includes both the ``yield`` and the ``acquire``
- (just as it does with `threading.Lock`):
+ For compatibility with older versions of Python, the `.acquire`
+ method asynchronously returns a regular context manager:
- >>> async def f2(): # doctest: +SKIP
- ... async with lock:
+ >>> @gen.coroutine
+ ... def f2():
+ ...    with (yield lock.acquire()):
... # Do something holding the lock.
... pass
...
q = Queue(maxsize=2)
- @gen.coroutine
- def consumer():
- while True:
- item = yield q.get()
+ async def consumer():
+ async for item in q:
try:
print('Doing work on %s' % item)
- yield gen.sleep(0.01)
+ await gen.sleep(0.01)
finally:
q.task_done()
- @gen.coroutine
- def producer():
+ async def producer():
for item in range(5):
- yield q.put(item)
+ await q.put(item)
print('Put %s' % item)
- @gen.coroutine
- def main():
+ async def main():
# Start consumer without waiting (since it never finishes).
IOLoop.current().spawn_callback(consumer)
- yield producer() # Wait for producer to put all tasks.
- yield q.join() # Wait for consumer to finish all tasks.
+ await producer() # Wait for producer to put all tasks.
+ await q.join() # Wait for consumer to finish all tasks.
print('Done')
IOLoop.current().run_sync(main)
Doing work on 4
Done
- In Python 3.5, `Queue` implements the async iterator protocol, so
- ``consumer()`` could be rewritten as::
- async def consumer():
- async for item in q:
+ In versions of Python without native coroutines (before 3.5),
+ ``consumer()`` could be written as::
+
+ @gen.coroutine
+ def consumer():
+ while True:
+ item = yield q.get()
try:
print('Doing work on %s' % item)
yield gen.sleep(0.01)
from tornado import gen
class EchoServer(TCPServer):
- @gen.coroutine
- def handle_stream(self, stream, address):
+ async def handle_stream(self, stream, address):
while True:
try:
- data = yield stream.read_until(b"\n")
- yield stream.write(data)
+ data = await stream.read_until(b"\n")
+ await stream.write(data)
except StreamClosedError:
break
asynchronous code.
The unittest framework is synchronous, so the test must be
- complete by the time the test method returns. This means that
- asynchronous code cannot be used in quite the same way as usual.
- To write test functions that use the same ``yield``-based patterns
- used with the `tornado.gen` module, decorate your test methods
- with `tornado.testing.gen_test` instead of
- `tornado.gen.coroutine`. This class also provides the `stop()`
- and `wait()` methods for a more manual style of testing. The test
- method itself must call ``self.wait()``, and asynchronous
- callbacks should call ``self.stop()`` to signal completion.
+ complete by the time the test method returns. This means that
+ asynchronous code cannot be used in quite the same way as usual
+ and must be adapted to fit. To write your tests with coroutines,
+ decorate your test methods with `tornado.testing.gen_test` instead
+ of `tornado.gen.coroutine`.
+
+ This class also provides the (deprecated) `stop()` and `wait()`
+ methods for a more manual style of testing. The test method itself
+ must call ``self.wait()``, and asynchronous callbacks should call
+ ``self.stop()`` to signal completion.
By default, a new `.IOLoop` is constructed for each test and is available
as ``self.io_loop``. If the code being tested requires a
response = self.wait()
# Test contents of response
self.assertIn("FriendFeed", response.body)
-
- # This test uses an explicit callback-based style.
- class MyTestCase3(AsyncTestCase):
- def test_http_fetch(self):
- client = AsyncHTTPClient()
- client.fetch("http://www.tornadoweb.org/", self.handle_fetch)
- self.wait()
-
- def handle_fetch(self, response):
- # Test contents of response (failures and exceptions here
- # will cause self.wait() to throw an exception and end the
- # test).
- # Exceptions thrown here are magically propagated to
- # self.wait() in test_http_fetch() via stack_context.
- self.assertIn("FriendFeed", response.body)
- self.stop()
"""
def __init__(self, methodName='runTest'):
super(AsyncTestCase, self).__init__(methodName)
.. versionchanged:: 4.0
Now returns a `.Future` if no callback is given.
+
+ .. deprecated:: 5.1
+
+ The ``callback`` argument is deprecated and will be removed in
+ Tornado 6.0.
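
      For example (an editor's sketch), await the `.Future` returned by
      ``write`` instead of passing a callback::

          await stream.write(b"some data")
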
"""
chunk = b"".join(self._write_buffer)
self._write_buffer = []
py3-sphinx-docs,
# Run the doctests via sphinx (which covers things not run
# in the regular test suite and vice versa)
- {py2,py3}-sphinx-doctest,
+ py3-sphinx-doctest,
py3-lint
commands =
sphinx-build -q -E -n -W -b html . {envtmpdir}/html
-[testenv:py2-sphinx-doctest]
-changedir = docs
-setenv = TORNADO_EXTENSION=0
-# No -W for doctests because that disallows tests with empty output.
-commands =
- sphinx-build -q -E -n -b doctest . {envtmpdir}/doctest
-
[testenv:py3-sphinx-doctest]
changedir = docs
setenv = TORNADO_EXTENSION=0
+# No -W for doctests because that disallows tests with empty output.
commands =
sphinx-build -q -E -n -b doctest . {envtmpdir}/doctest