-``tornado.tcpserver`` --- Basic `IOStream`-based TCP server
-===========================================================
+``tornado.tcpserver`` --- Basic `.IOStream`-based TCP server
+============================================================
.. automodule:: tornado.tcpserver
:members:
Twisted on Tornado
------------------
-`TornadoReactor` implements the Twisted reactor interface on top of
-the Tornado IOLoop. To use it, simply call `install` at the beginning
+``TornadoReactor`` implements the Twisted reactor interface on top of
+the Tornado IOLoop. To use it, simply call ``install`` at the beginning
of the application::
import tornado.platform.twisted
tornado.platform.twisted.install()
from twisted.internet import reactor
-When the app is ready to start, call `IOLoop.instance().start()`
-instead of `reactor.run()`.
+When the app is ready to start, call ``IOLoop.instance().start()``
+instead of ``reactor.run()``.
It is also possible to create a non-global reactor by calling
-`tornado.platform.twisted.TornadoReactor(io_loop)`. However, if
-the `IOLoop` and reactor are to be short-lived (such as those used in
+``tornado.platform.twisted.TornadoReactor(io_loop)``. However, if
+the `.IOLoop` and reactor are to be short-lived (such as those used in
unit tests), additional cleanup may be required. Specifically, it is
recommended to call::
reactor.fireSystemEvent('shutdown')
reactor.disconnectAll()
-before closing the `IOLoop`.
+before closing the `.IOLoop`.
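As a rough end-to-end sketch of this pattern (the echo protocol and port here
are only illustrative Twisted usage, not part of this module)::

    import tornado.ioloop
    import tornado.platform.twisted
    tornado.platform.twisted.install()        # must run before importing the reactor
    from twisted.internet import reactor      # now backed by TornadoReactor

    from twisted.internet.protocol import Protocol, Factory

    class Echo(Protocol):
        def dataReceived(self, data):
            self.transport.write(data)

    factory = Factory()
    factory.protocol = Echo
    reactor.listenTCP(8888, factory)          # ordinary Twisted API
    tornado.ioloop.IOLoop.instance().start()  # instead of reactor.run()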
Tornado on Twisted
------------------
-`TwistedIOLoop` implements the Tornado IOLoop interface on top of the Twisted
+``TwistedIOLoop`` implements the Tornado IOLoop interface on top of the Twisted
reactor. Recommended usage::
from tornado.platform.twisted import TwistedIOLoop
# Set up your tornado application as usual using `IOLoop.instance`
reactor.run()
-`TwistedIOLoop` always uses the global Twisted reactor.
+``TwistedIOLoop`` always uses the global Twisted reactor.
This module has been tested with Twisted versions 11.0.0 and newer.
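Expanded slightly, that recommended usage might look like the following
sketch (``MainHandler`` stands in for whatever handlers the application
defines)::

    from tornado.platform.twisted import TwistedIOLoop
    from twisted.internet import reactor
    TwistedIOLoop().install()

    # Set up the Tornado application as usual, using IOLoop.instance()
    import tornado.web
    import tornado.httpserver

    app = tornado.web.Application([(r"/", MainHandler)])
    tornado.httpserver.HTTPServer(app).listen(8888)

    reactor.run()  # instead of IOLoop.instance().start()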
Authentication and security settings:
* ``cookie_secret``: Used by `RequestHandler.get_secure_cookie`
- and `set_secure_cookie` to sign cookies.
+ and `.set_secure_cookie` to sign cookies.
* ``login_url``: The `authenticated` decorator will redirect
to this url if the user is not logged in. Can be further
customized by overriding `RequestHandler.get_login_url`
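Both settings are passed to the `Application` constructor; a minimal sketch
(the handler classes and the secret value are placeholders)::

    import tornado.web

    application = tornado.web.Application(
        [(r"/", MainHandler), (r"/login", LoginHandler)],
        cookie_secret="__GENERATE_YOUR_OWN_RANDOM_VALUE__",
        login_url="/login",
    )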
"""Blocking and non-blocking HTTP client interfaces.
This module defines a common interface shared by two implementations,
-`simple_httpclient` and `curl_httpclient`. Applications may either
+``simple_httpclient`` and ``curl_httpclient``. Applications may either
instantiate their chosen implementation class directly or use the
`AsyncHTTPClient` class from this module, which selects an implementation
that can be overridden with the `AsyncHTTPClient.configure` method.
-The default implementation is `simple_httpclient`, and this is expected
+The default implementation is ``simple_httpclient``, and this is expected
to be suitable for most users' needs. However, some applications may wish
-to switch to `curl_httpclient` for reasons such as the following:
+to switch to ``curl_httpclient`` for reasons such as the following:
-* `curl_httpclient` has some features not found in `simple_httpclient`,
+* ``curl_httpclient`` has some features not found in ``simple_httpclient``,
including support for HTTP proxies and the ability to use a specified
network interface.
-* `curl_httpclient` is more likely to be compatible with sites that are
+* ``curl_httpclient`` is more likely to be compatible with sites that are
not-quite-compliant with the HTTP spec, or sites that use little-exercised
features of HTTP.
-* `simple_httpclient` only supports SSL on Python 2.6 and above.
+* ``simple_httpclient`` only supports SSL on Python 2.6 and above.
-* `curl_httpclient` is faster
+* ``curl_httpclient`` is faster
-* `curl_httpclient` was the default prior to Tornado 2.0.
+* ``curl_httpclient`` was the default prior to Tornado 2.0.
-Note that if you are using `curl_httpclient`, it is highly recommended that
+Note that if you are using ``curl_httpclient``, it is highly recommended that
you use a recent version of ``libcurl`` and ``pycurl``. Currently the minimum
supported version is 7.18.2, and the recommended version is 7.21.1 or newer.
"""
client_key=None, client_cert=None):
r"""Creates an `HTTPRequest`.
- All parameters except `url` are optional.
+ All parameters except ``url`` are optional.
:arg string url: URL to fetch
:arg string method: HTTP method, e.g. "GET" or "POST"
header
:arg bool follow_redirects: Should redirects be followed automatically
or return the 3xx response?
- :arg int max_redirects: Limit for `follow_redirects`
+ :arg int max_redirects: Limit for ``follow_redirects``
:arg string user_agent: String to send as ``User-Agent`` header
:arg bool use_gzip: Request gzip encoding from the server
:arg string network_interface: Network interface to use for request
- :arg callable streaming_callback: If set, `streaming_callback` will
+ :arg callable streaming_callback: If set, ``streaming_callback`` will
be run with each chunk of data as it is received, and
- `~HTTPResponse.body` and `~HTTPResponse.buffer` will be empty in
+ ``HTTPResponse.body`` and ``HTTPResponse.buffer`` will be empty in
the final response.
- :arg callable header_callback: If set, `header_callback` will
+ :arg callable header_callback: If set, ``header_callback`` will
be run with each header line as it is received (including the
first line, e.g. ``HTTP/1.0 200 OK\r\n``, and a final line
containing only ``\r\n``. All lines include the trailing newline
- characters). `~HTTPResponse.headers` will be empty in the final
+ characters). ``HTTPResponse.headers`` will be empty in the final
response. This is most useful in conjunction with
- `streaming_callback`, because it's the only way to get access to
+ ``streaming_callback``, because it's the only way to get access to
header data while the request is in progress.
:arg callable prepare_curl_callback: If set, will be called with
- a `pycurl.Curl` object to allow the application to make additional
- `setopt` calls.
+ a ``pycurl.Curl`` object to allow the application to make additional
+ ``setopt`` calls.
:arg string proxy_host: HTTP proxy hostname. To use proxies,
- `proxy_host` and `proxy_port` must be set; `proxy_username` and
- `proxy_pass` are optional. Proxies are currently only support
- with `curl_httpclient`.
+ ``proxy_host`` and ``proxy_port`` must be set; ``proxy_username`` and
+ ``proxy_password`` are optional. Proxies are currently only supported
+ with ``curl_httpclient``.
:arg int proxy_port: HTTP proxy port
:arg string proxy_username: HTTP proxy username
:arg string proxy_password: HTTP proxy password
- :arg bool allow_nonstandard_methods: Allow unknown values for `method`
+ :arg bool allow_nonstandard_methods: Allow unknown values for ``method``
argument?
:arg bool validate_cert: For HTTPS requests, validate the server's
certificate?
:arg string ca_certs: filename of CA certificates in PEM format,
- or None to use defaults. Note that in `curl_httpclient`, if
- any request uses a custom `ca_certs` file, they all must (they
- don't have to all use the same `ca_certs`, but it's not possible
+ or None to use defaults. Note that in ``curl_httpclient``, if
+ any request uses a custom ``ca_certs`` file, they all must (they
+ don't have to all use the same ``ca_certs``, but it's not possible
to mix requests with ``ca_certs`` and requests that use the defaults).
:arg bool allow_ipv6: Use IPv6 when available? Default is false in
- `simple_httpclient` and true in `curl_httpclient`
+ ``simple_httpclient`` and true in ``curl_httpclient``
:arg string client_key: Filename for client SSL key, if any
:arg string client_cert: Filename for client SSL certificate, if any
"""
server.start(0) # Forks multiple sub-processes
IOLoop.instance().start()
- When using this interface, an `IOLoop` must *not* be passed
- to the `HTTPServer` constructor. `start` will always start
- the server on the default singleton `IOLoop`.
+ When using this interface, an `.IOLoop` must *not* be passed
+ to the `HTTPServer` constructor. `~.TCPServer.start` will always start
+ the server on the default singleton `.IOLoop`.
3. `~tornado.tcpserver.TCPServer.add_sockets`: advanced multi-process::
server.add_sockets(sockets)
IOLoop.instance().start()
- The `add_sockets` interface is more complicated, but it can be
- used with `tornado.process.fork_processes` to give you more
- flexibility in when the fork happens. `add_sockets` can
- also be used in single-process servers if you want to create
- your listening sockets in some way other than
- `tornado.netutil.bind_sockets`.
+ The `~.TCPServer.add_sockets` interface is more complicated,
+ but it can be used with `tornado.process.fork_processes` to
+ give you more flexibility in when the fork happens.
+ `~.TCPServer.add_sockets` can also be used in single-process
+ servers if you want to create your listening sockets in some
+ way other than `tornado.netutil.bind_sockets`.
"""
def __init__(self, request_callback, no_keep_alive=False, io_loop=None,
def set_close_callback(self, callback):
"""Sets a callback that will be run when the connection is closed.
- Use this instead of accessing `HTTPConnection.stream.set_close_callback`
- directly (which was the recommended approach prior to Tornado 3.0).
+ Use this instead of accessing
+ `HTTPConnection.stream.set_close_callback
+ <.BaseIOStream.set_close_callback>` directly (which was the
+ recommended approach prior to Tornado 3.0).
"""
self._close_callback = stack_context.wrap(callback)
self.stream.set_close_callback(self._on_connection_close)
.. attribute:: headers
- `HTTPHeader` dictionary-like object for request headers. Acts like
+ `.HTTPHeaders` dictionary-like object for request headers. Acts like
a case-insensitive dictionary with additional methods for repeated
headers.
.. attribute:: remote_ip
- Client's IP address as a string. If `HTTPServer.xheaders` is set,
+ Client's IP address as a string. If ``HTTPServer.xheaders`` is set,
will pass along the real IP address provided by a load balancer
in the ``X-Real-Ip`` header
.. attribute:: protocol
- The protocol used, either "http" or "https". If `HTTPServer.xheaders`
+ The protocol used, either "http" or "https". If ``HTTPServer.xheaders``
is set, will pass along the protocol used by a load balancer if
reported via an ``X-Scheme`` header.
maps arguments names to lists of values (to support multiple values
for individual names). Names are of type `str`, while arguments
are byte strings. Note that this is different from
- `RequestHandler.get_argument`, which returns argument values as
+ `.RequestHandler.get_argument`, which returns argument values as
unicode strings.
.. attribute:: files
File uploads are available in the files property, which maps file
- names to lists of :class:`HTTPFile`.
+ names to lists of `.HTTPFile`.
.. attribute:: connection
def format_timestamp(ts):
"""Formats a timestamp in the format used by HTTP.
- The argument may be a numeric timestamp as returned by `time.time()`,
- a time tuple as returned by `time.gmtime()`, or a `datetime.datetime`
+ The argument may be a numeric timestamp as returned by `time.time`,
+ a time tuple as returned by `time.gmtime`, or a `datetime.datetime`
object.
>>> format_timestamp(1359312200)
def run_sync(self, func, timeout=None):
"""Starts the `IOLoop`, runs the given function, and stops the loop.
- If the function returns a `Future`, the `IOLoop` will run until
- the future is resolved. If it raises an exception, the `IOLoop`
- will stop and the exception will be re-raised to the caller.
+ If the function returns a `~concurrent.futures.Future`, the
+ `IOLoop` will run until the future is resolved. If it raises
+ an exception, the `IOLoop` will stop and the exception will be
+ re-raised to the caller.
The keyword-only argument ``timeout`` may be used to set
a maximum duration for the function. If the timeout expires,
a `TimeoutError` is raised.
This method is useful in conjunction with `tornado.gen.coroutine`
- to allow asynchronous calls in a `main()` function::
+ to allow asynchronous calls in a ``main()`` function::
@gen.coroutine
def main():
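    # (sketch of a possible body; AsyncHTTPClient and the URL are
    #  placeholders for whatever asynchronous work main() does)
    response = yield AsyncHTTPClient().fetch("http://www.example.com")
    print(response.body)

IOLoop.instance().run_sync(main)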
All of the methods take callbacks (since writing and reading are
non-blocking and asynchronous).
- When a stream is closed due to an error, the IOStream's `error`
+ When a stream is closed due to an error, the IOStream's ``error``
attribute contains the exception object.
Subclasses must implement `fileno`, `close_fd`, `write_to_fd`,
"""Close this stream.
If ``exc_info`` is true, set the ``error`` attribute to the current
- exception from `sys.exc_info()` (or if ``exc_info`` is a tuple,
+ exception from `sys.exc_info` (or if ``exc_info`` is a tuple,
use that instead of `sys.exc_info`).
"""
if not self.closed():
class Resolver(Configurable):
"""Configurable asynchronous DNS resolver interface.
- By default, a blocking implementation is used (which simply
- calls `socket.getaddrinfo`). An alternative implementation
- can be chosen with the `Resolver.configure` class method::
+ By default, a blocking implementation is used (which simply calls
+ `socket.getaddrinfo`). An alternative implementation can be
+ chosen with the ``Resolver.configure`` class method::
Resolver.configure('tornado.netutil.ThreadedResolver')
* `tornado.netutil.BlockingResolver`
* `tornado.netutil.ThreadedResolver`
* `tornado.netutil.OverrideResolver`
- * `tornado.platform.twisted.TwistedResolver`
- * `tornado.platform.caresresolver.CaresResolver`
+ * ``tornado.platform.twisted.TwistedResolver``
+ * ``tornado.platform.caresresolver.CaresResolver``
"""
@classmethod
def configurable_base(cls):
The ``host`` argument is a string which may be a hostname or a
literal IP address.
- Returns a `Future` whose result is a list of (family, address)
- pairs, where address is a tuple suitable to pass to
- `socket.connect` (i.e. a (host, port) pair for IPv4;
- additional fields may be present for IPv6). If a callback is
- passed, it will be run with the result as an argument when
- it is complete.
+ Returns a `~concurrent.futures.Future` whose result is a list
+ of (family, address) pairs, where address is a tuple suitable
+ to pass to `socket.socket.connect` (i.e. a (host, port) pair for
+ IPv4; additional fields may be present for IPv6). If a
+ callback is passed, it will be run with the result as an
+ argument when it is complete.
"""
raise NotImplementedError()
class BlockingResolver(ExecutorResolver):
"""Default `Resolver` implementation, using `socket.getaddrinfo`.
- The `IOLoop` will be blocked during the resolution, although the
- callback will not be run until the next `IOLoop` iteration.
+ The `.IOLoop` will be blocked during the resolution, although the
+ callback will not be run until the next `.IOLoop` iteration.
"""
def initialize(self, io_loop=None):
super(BlockingResolver, self).initialize(io_loop=io_loop)
"""Try to Convert an ssl_options dictionary to an SSLContext object.
The ``ssl_options`` dictionary contains keywords to be passed to
- `ssl.wrap_sockets`. In Python 3.2+, `ssl.SSLContext` objects can
+ `ssl.wrap_socket`. In Python 3.2+, `ssl.SSLContext` objects can
be used instead. This function converts the dict form to its
- `SSLContext` equivalent, and may be used when a component which
- accepts both forms needs to upgrade to the `SSLContext` version
+ `~ssl.SSLContext` equivalent, and may be used when a component which
+ accepts both forms needs to upgrade to the `~ssl.SSLContext` version
to use features like SNI or NPN.
"""
if isinstance(ssl_options, dict):
def ssl_wrap_socket(socket, ssl_options, server_hostname=None, **kwargs):
- """Returns an `ssl.SSLSocket` wrapping the given socket.
+ """Returns an ``ssl.SSLSocket`` wrapping the given socket.
``ssl_options`` may be either a dictionary (as accepted by
- `ssl_options_to_context) or an `ssl.SSLContext` object.
- Additional keyword arguments are passed to `wrap_socket`
- (either the `SSLContext` method or the `ssl` module function
+ `ssl_options_to_context`) or an `ssl.SSLContext` object.
+ Additional keyword arguments are passed to ``wrap_socket``
+ (either the `~ssl.SSLContext` method or the `ssl` module function
as appropriate).
"""
context = ssl_options_to_context(ssl_options)
callback()
def mockable(self):
- """Returns a wrapper around self that is compatible with `mock.patch`.
-
- The `mock.patch` function (included in the standard library
- `unittest.mock` package since Python 3.3, or in the
- third-party `mock` package for older versions of Python) is
- incompatible with objects like ``options`` that override
- ``__getattr__`` and ``__setattr__``. This function returns an
- object that can be used with `mock.patch.object` to modify
- option values::
+ """Returns a wrapper around self that is compatible with
+ `mock.patch <unittest.mock.patch>`.
+
+ The `mock.patch <unittest.mock.patch>` function (included in
+ the standard library `unittest.mock` package since Python 3.3,
+ or in the third-party ``mock`` package for older versions of
+ Python) is incompatible with objects like ``options`` that
+ override ``__getattr__`` and ``__setattr__``. This function
+ returns an object that can be used with `mock.patch.object
+ <unittest.mock.patch.object>` to modify option values::
with mock.patch.object(options.mockable(), 'name', value):
assert options.name == value
additions:
* ``stdin``, ``stdout``, and ``stderr`` may have the value
- `tornado.process.Subprocess.STREAM`, which will make the corresponding
- attribute of the resulting Subprocess a `PipeIOStream`.
+ ``tornado.process.Subprocess.STREAM``, which will make the corresponding
+ attribute of the resulting Subprocess a `.PipeIOStream`.
* A new keyword argument ``io_loop`` may be used to pass in an IOLoop.
"""
STREAM = object()
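A small sketch of the ``STREAM`` option (the command line is just an example)::

    from tornado.process import Subprocess

    proc = Subprocess(["tail", "-f", "/var/log/syslog"],
                      stdout=Subprocess.STREAM)

    def on_line(data):
        print(data)

    # proc.stdout is a PipeIOStream; read from it asynchronously
    proc.stdout.read_until(b"\n", on_line)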
* If you're writing an asynchronous library that doesn't rely on a
stack_context-aware library like `tornado.ioloop` or `tornado.iostream`
(for example, if you're writing a thread pool), use
- `stack_context.wrap()` before any asynchronous operations to capture the
+ `.stack_context.wrap()` before any asynchronous operations to capture the
stack context from where the operation was started.
* If you're writing an asynchronous library that has some shared
server.start(0) # Forks multiple sub-processes
IOLoop.instance().start()
- When using this interface, an `IOLoop` must *not* be passed
+ When using this interface, an `.IOLoop` must *not* be passed
to the `TCPServer` constructor. `start` will always start
- the server on the default singleton `IOLoop`.
+ the server on the default singleton `.IOLoop`.
3. `add_sockets`: advanced multi-process::
flexibility in when the fork happens. `add_sockets` can
also be used in single-process servers if you want to create
your listening sockets in some way other than
- `bind_sockets`.
+ `~tornado.netutil.bind_sockets`.
"""
def __init__(self, io_loop=None, ssl_options=None):
self.io_loop = io_loop
This method may be called more than once to listen on multiple ports.
`listen` takes effect immediately; it is not necessary to call
`TCPServer.start` afterwards. It is, however, necessary to start
- the `IOLoop`.
+ the `.IOLoop`.
"""
sockets = bind_sockets(port, address=address)
self.add_sockets(sockets)
"""Makes this server start accepting connections on the given sockets.
The ``sockets`` parameter is a list of socket objects such as
- those returned by `bind_sockets`.
+ those returned by `~tornado.netutil.bind_sockets`.
`add_sockets` is typically used in combination with that
method and `tornado.process.fork_processes` to provide greater
control over the initialization of a multi-process server.
both will be used if available.
The ``backlog`` argument has the same meaning as for
- `socket.listen`.
+ `socket.socket.listen`.
This method may be called multiple times prior to `start` to listen
on multiple ports or interfaces.
sock.close()
def handle_stream(self, stream, address):
- """Override to handle a new `IOStream` from an incoming connection."""
+ """Override to handle a new `.IOStream` from an incoming connection."""
raise NotImplementedError()
def _handle_connection(self, connection, address):
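As a sketch, a trivial line-echo server built on `handle_stream` might look
like this::

    from tornado.tcpserver import TCPServer

    class EchoServer(TCPServer):
        def handle_stream(self, stream, address):
            def on_line(data):
                # write the line back, then close the connection
                stream.write(data, stream.close)
            stream.read_until(b"\n", on_line)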
to all templates by default.
Typical applications do not create `Template` or `Loader` instances by
-hand, but instead use the `render` and `render_string` methods of
+hand, but instead use the `~.RequestHandler.render` and
+`~.RequestHandler.render_string` methods of
`tornado.web.RequestHandler`, which load templates automatically based
-on the ``template_path`` `Application` setting.
+on the ``template_path`` `.Application` setting.
Syntax Reference
----------------
``{% autoescape *function* %}``
Sets the autoescape mode for the current file. This does not affect
other files, even those referenced by ``{% include %}``. Note that
- autoescaping can also be configured globally, at the `Application`
+ autoescaping can also be configured globally, at the `.Application`
or `Loader`.::
{% autoescape xhtml_escape %}
def gen_test(f):
"""Testing equivalent of ``@gen.coroutine``, to be applied to test methods.
- ``@gen.coroutine`` cannot be used on tests because the `IOLoop` is not
+ ``@gen.coroutine`` cannot be used on tests because the `.IOLoop` is not
already running. ``@gen_test`` should be applied to test methods
on subclasses of `AsyncTestCase`.
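A minimal sketch (the URL is a placeholder)::

    from tornado.httpclient import AsyncHTTPClient
    from tornado.testing import AsyncTestCase, gen_test

    class MyTestCase(AsyncTestCase):
        @gen_test
        def test_fetch(self):
            client = AsyncHTTPClient(io_loop=self.io_loop)
            response = yield client.fetch("http://www.example.com")
            self.assertEqual(response.code, 200)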
@property
def settings(self):
- """An alias for `self.application.settings`."""
+ """An alias for ``self.application.settings``."""
return self.application.settings
def head(self, *args, **kwargs):
def set_status(self, status_code, reason=None):
"""Sets the status code for our response.
- :arg int status_code: Response status code. If `reason` is ``None``,
- it must be present in `httplib.responses`.
+ :arg int status_code: Response status code. If ``reason`` is ``None``,
+ it must be present in `httplib.responses <http.client.responses>`.
:arg string reason: Human-readable reason phrase describing the status
- code. If ``None``, it will be filled in from `httplib.responses`.
+ code. If ``None``, it will be filled in from
+ `httplib.responses <http.client.responses>`.
"""
self._status_code = status_code
if reason is not None:
return handler
def reverse_url(self, name, *args):
- """Returns a URL path for handler named `name`
+ """Returns a URL path for handler named ``name``
The handler must be added to the application as a named URLSpec.
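For example (``UserHandler`` is a placeholder handler class)::

    from tornado.web import Application, url

    application = Application([
        url(r"/user/([0-9]+)", UserHandler, name="user"),
    ])
    application.reverse_url("user", 42)   # returns "/user/42"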
"""An exception that will turn into an HTTP error response.
:arg int status_code: HTTP status code. Must be listed in
- `httplib.responses` unless the ``reason`` keyword argument is given.
+ `httplib.responses <http.client.responses>` unless the ``reason``
+ keyword argument is given.
:arg string log_message: Message to be written to the log for this error
(will not be shown to the user unless the `Application` is in debug
mode). May contain ``%s``-style placeholders, which will be filled
def websocket_connect(url, io_loop=None, callback=None):
"""Client-side websocket support.
- Takes a url and returns a Future whose result is a `WebSocketConnection`.
+ Takes a url and returns a Future whose result is a
+ `WebSocketClientConnection`.
"""
if io_loop is None:
io_loop = IOLoop.current()
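A rough client sketch (the URL is a placeholder)::

    from tornado import gen, ioloop
    from tornado.websocket import websocket_connect

    @gen.coroutine
    def talk(url):
        conn = yield websocket_connect(url)
        conn.write_message("hello")
        reply = yield conn.read_message()
        print(reply)

    ioloop.IOLoop.instance().run_sync(
        lambda: talk("ws://echo.example.com/ws"))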