Address may be either an IP address or hostname. If it's a hostname,
the server will listen on all IP addresses associated with the
name. Address may be an empty string or None to listen on all
- available interfaces. Family may be set to either socket.AF_INET
- or socket.AF_INET6 to restrict to ipv4 or ipv6 addresses, otherwise
+ available interfaces. Family may be set to either `socket.AF_INET`
+ or `socket.AF_INET6` to restrict to IPv4 or IPv6 addresses, otherwise
both will be used if available.
The ``backlog`` argument has the same meaning as for
- ``socket.listen()``.
+ `socket.listen() <socket.socket.listen>`.
- ``flags`` is a bitmask of AI_* flags to ``getaddrinfo``, like
+ ``flags`` is a bitmask of AI_* flags to `~socket.getaddrinfo`, like
``socket.AI_PASSIVE | socket.AI_NUMERICHOST``.
"""
sockets = []
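The binding behavior described above can be sketched with only the stdlib. `bind_sockets_sketch` below is a hypothetical, simplified stand-in for the real function (no IPv6 dual-stack handling, no error recovery):

```python
import socket

def bind_sockets_sketch(port, address=None, family=socket.AF_UNSPEC, backlog=128):
    """Bind one listening socket per address the name resolves to."""
    if address == "":
        address = None
    sockets = []
    # AI_PASSIVE yields wildcard addresses when address is None, matching
    # the "listen on all available interfaces" behavior described above.
    for res in socket.getaddrinfo(address, port, family, socket.SOCK_STREAM,
                                  0, socket.AI_PASSIVE):
        af, socktype, proto, _canonname, sockaddr = res
        sock = socket.socket(af, socktype, proto)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        sock.bind(sockaddr)
        sock.listen(backlog)
        sockets.append(sock)
    return sockets
```

Passing ``family=socket.AF_INET`` or ``socket.AF_INET6`` restricts the `getaddrinfo` results, which is how the family restriction works in practice.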
def add_accept_handler(sock, callback, io_loop=None):
- """Adds an ``IOLoop`` event handler to accept new connections on ``sock``.
+ """Adds an `.IOLoop` event handler to accept new connections on ``sock``.
When a connection is accepted, ``callback(connection, address)`` will
be run (``connection`` is a socket object, and ``address`` is the
address of the other end of the connection). Note that this signature
is different from the ``callback(fd, events)`` signature used for
- ``IOLoop`` handlers.
+ `.IOLoop` handlers.
"""
if io_loop is None:
io_loop = IOLoop.current()
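The signature adaptation described above can be sketched as follows; `make_accept_handler` is a hypothetical helper showing the kind of ``(fd, events)`` wrapper a real ``add_accept_handler`` would register with the `.IOLoop`:

```python
import socket

def make_accept_handler(sock, callback):
    """Build an IOLoop-style (fd, events) handler that adapts to the
    callback(connection, address) signature described above."""
    def accept_handler(fd, events):
        # Accept every connection that is ready on this readable event.
        while True:
            try:
                connection, address = sock.accept()
            except BlockingIOError:
                # No more pending connections to accept right now.
                return
            callback(connection, address)
    return accept_handler
```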
Returns a `~concurrent.futures.Future` whose result is a list
of (family, address) pairs, where address is a tuple suitable
- to pass to `socket.socket.connect` (i.e. a (host, port) pair for
- IPv4; additional fields may be present for IPv6). If a
- callback is passed, it will be run with the result as an
- argument when it is complete.
+ to pass to `socket.connect <socket.socket.connect>` (i.e. a
+ ``(host, port)`` pair for IPv4; additional fields may be
+ present for IPv6). If a ``callback`` is passed, it will be run
+ with the result as an argument when it is complete.
"""
raise NotImplementedError()
def ssl_options_to_context(ssl_options):
- """Try to Convert an ssl_options dictionary to an SSLContext object.
+ """Try to convert an ``ssl_options`` dictionary to an
+ `~ssl.SSLContext` object.
The ``ssl_options`` dictionary contains keywords to be passed to
`ssl.wrap_socket`. In Python 3.2+, `ssl.SSLContext` objects can
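A minimal sketch of that conversion, handling only a few representative `ssl.wrap_socket` keywords (the real function covers more keys and validates its input; `PROTOCOL_TLS_SERVER` is a modern stand-in for the protocol handling):

```python
import ssl

def ssl_options_to_context_sketch(ssl_options):
    """Turn a wrap_socket-style keyword dict into an SSLContext."""
    if isinstance(ssl_options, ssl.SSLContext):
        # Already a context; pass it through unchanged.
        return ssl_options
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    if "certfile" in ssl_options:
        context.load_cert_chain(ssl_options["certfile"],
                                ssl_options.get("keyfile"))
    if "cert_reqs" in ssl_options:
        context.verify_mode = ssl_options["cert_reqs"]
    if "ca_certs" in ssl_options:
        context.load_verify_locations(ssl_options["ca_certs"])
    return context
```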
db = database.Connection(options.mysql_host)
...
-The main() method of your application does not need to be aware of all of
+The ``main()`` method of your application does not need to be aware of all of
the options used throughout your program; they are all automatically loaded
when the modules are loaded. However, all modules that define options
must have been imported before the command line is parsed.
-Your main() method can parse the command line or parse a config file with
+Your ``main()`` method can parse the command line or parse a config file with
either::
tornado.options.parse_command_line()
# or
tornado.options.parse_config_file("/etc/server.conf")
-Command line formats are what you would expect ("--myoption=myvalue").
+Command line formats are what you would expect (``--myoption=myvalue``).
Config files are just Python files. Global names become options, e.g.::
myoption = "myvalue"
myotheroption = "myothervalue"
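The "config files are just Python files" behavior can be sketched with a hypothetical helper: execute the file and copy matching globals into the defined options.

```python
def parse_config_file_sketch(path, options):
    """Execute the config file as Python; globals whose names match
    defined options become the new option values."""
    config = {}
    with open(path) as f:
        exec(f.read(), config)
    for name in options:
        if name in config:
            options[name] = config[name]
    return options
```

Names in the file that do not correspond to defined options are simply ignored, as are options the file does not mention.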
-We support datetimes, timedeltas, ints, and floats (just pass a 'type'
-kwarg to define). We also accept multi-value options. See the documentation
-for define() below.
+We support `datetimes <datetime.datetime>`, `timedeltas
+<datetime.timedelta>`, ints, and floats (just pass a ``type`` kwarg to
+`define`). We also accept multi-value options. See the documentation for
+`define()` below.
`tornado.options.options` is a singleton instance of `OptionParser`, and
the top-level functions in this module (`define`, `parse_command_line`, etc)
multiple=False, group=None, callback=None):
"""Defines a new command line option.
- If type is given (one of str, float, int, datetime, or timedelta)
- or can be inferred from the default, we parse the command line
- arguments based on the given type. If multiple is True, we accept
+ If ``type`` is given (one of str, float, int, datetime, or timedelta)
+ or can be inferred from the ``default``, we parse the command line
+ arguments based on the given type. If ``multiple`` is True, we accept
comma-separated values, and the option value is always a list.
- For multi-value integers, we also accept the syntax x:y, which
- turns into range(x, y) - very useful for long integer ranges.
+ For multi-value integers, we also accept the syntax ``x:y``, which
+ turns into ``range(x, y)`` - very useful for long integer ranges.
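That comma-and-range syntax could be parsed along these lines (a sketch, not the real parser):

```python
def parse_multi_int(value):
    """Parse a multi-value integer option, e.g. "1,3,5:8"."""
    result = []
    for part in value.split(","):
        if ":" in part:
            lo, hi = part.split(":")
            # x:y expands to range(x, y), i.e. y itself is excluded.
            result.extend(range(int(lo), int(hi)))
        else:
            result.append(int(part))
    return result
```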
- help and metavar are used to construct the automatically generated
- command line help string. The help message is formatted like::
+ ``help`` and ``metavar`` are used to construct the
+ automatically generated command line help string. The help
+ message is formatted like::
--name=METAVAR help string
- group is used to group the defined options in logical
+ ``group`` is used to group the defined options in logical
groups. By default, command line options are grouped by the
file in which they are defined.
Command line option names must be unique globally. They can be parsed
- from the command line with parse_command_line() or parsed from a
- config file with parse_config_file.
+ from the command line with `parse_command_line` or parsed from a
+ config file with `parse_config_file`.
- If a callback is given, it will be run with the new value whenever
+ If a ``callback`` is given, it will be run with the new value whenever
the option is changed. This can be used to combine command-line
and file-based options::
callback=callback)
def parse_command_line(self, args=None, final=True):
- """Parses all options given on the command line (defaults to sys.argv).
+ """Parses all options given on the command line (defaults to
+ `sys.argv`).
- Note that args[0] is ignored since it is the program name in sys.argv.
+ Note that ``args[0]`` is ignored since it is the program name
+ in `sys.argv`.
We return a list of all arguments that are not parsed as options.
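A sketch of those rules (a hypothetical helper; the real parser also handles option types, ``multiple``, and ``--help``):

```python
def parse_command_line_sketch(args):
    """Return (options, remaining); args[0] is skipped as the program name."""
    options, remaining = {}, []
    for i, arg in enumerate(args[1:], start=1):
        if arg == "--":
            # Everything after a bare -- is left unparsed.
            remaining.extend(args[i + 1:])
            break
        if arg.startswith("--"):
            name, _sep, value = arg[2:].partition("=")
            options[name] = value
        else:
            remaining.append(arg)
    return options, remaining
```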
# License for the specific language governing permissions and limitations
# under the License.
-"""Utilities for working with multiple processes."""
+"""Utilities for working with multiple processes, including both forking
+the server into multiple processes and managing subprocesses.
+"""
from __future__ import absolute_import, division, print_function, with_statement
def initialize(cls, io_loop=None):
"""Initializes the ``SIGCHILD`` handler.
- The signal handler is run on an IOLoop to avoid locking issues.
- Note that the IOLoop used for signal handling need not be the
+ The signal handler is run on an `.IOLoop` to avoid locking issues.
+ Note that the `.IOLoop` used for signal handling need not be the
same one used by individual Subprocess objects (as long as the
- IOLoops are each running in separate threads).
+ ``IOLoops`` are each running in separate threads).
"""
if cls._initialized:
return
# License for the specific language governing permissions and limitations
# under the License.
-"""StackContext allows applications to maintain threadlocal-like state
+"""`StackContext` allows applications to maintain threadlocal-like state
that follows execution as it moves to other execution contexts.
The motivating examples are to eliminate the need for explicit
-async_callback wrappers (as in tornado.web.RequestHandler), and to
+``async_callback`` wrappers (as in `tornado.web.RequestHandler`), and to
allow some additional context to be kept for logging.
-This is slightly magic, but it's an extension of the idea that an exception
-handler is a kind of stack-local state and when that stack is suspended
-and resumed in a new context that state needs to be preserved. StackContext
-shifts the burden of restoring that state from each call site (e.g.
-wrapping each AsyncHTTPClient callback in async_callback) to the mechanisms
-that transfer control from one context to another (e.g. AsyncHTTPClient
-itself, IOLoop, thread pools, etc).
+This is slightly magic, but it's an extension of the idea that an
+exception handler is a kind of stack-local state and when that stack
+is suspended and resumed in a new context that state needs to be
+preserved. `StackContext` shifts the burden of restoring that state
+from each call site (e.g. wrapping each `.AsyncHTTPClient` callback
+in ``async_callback``) to the mechanisms that transfer control from
+one context to another (e.g. `.AsyncHTTPClient` itself, `.IOLoop`,
+thread pools, etc).
Example usage::
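Since `StackContext` is essentially threadlocal-like state that travels with callbacks, the core idea behind ``wrap`` can be sketched in plain Python (the names below are illustrative, not Tornado's API):

```python
# A stand-in for the ambient per-request state that StackContext preserves.
current_context = {"request_id": None}

def wrap_sketch(fn):
    """Snapshot the context now; restore it whenever fn runs later."""
    snapshot = dict(current_context)
    def wrapped(*args, **kwargs):
        saved = dict(current_context)
        current_context.clear()
        current_context.update(snapshot)
        try:
            return fn(*args, **kwargs)
        finally:
            current_context.clear()
            current_context.update(saved)
    return wrapped
```

The transfer mechanisms (the IOLoop, thread pools, etc.) call the equivalent of ``wrap_sketch`` once when a callback is scheduled, so call sites no longer have to.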
class ExceptionStackContext(object):
"""Specialization of StackContext for exception handling.
- The supplied exception_handler function will be called in the
+ The supplied ``exception_handler`` function will be called in the
event of an uncaught exception in this context. The semantics are
similar to a try/finally clause, and intended use cases are to log
an error, close a socket, or similar cleanup actions. The
- exc_info triple (type, value, traceback) will be passed to the
+ ``exc_info`` triple ``(type, value, traceback)`` will be passed to the
exception_handler function.
If the exception handler returns true, the exception will be
class NullContext(object):
- """Resets the StackContext.
+ """Resets the `StackContext`.
- Useful when creating a shared resource on demand (e.g. an AsyncHTTPClient)
- where the stack that caused the creating is not relevant to future
- operations.
+ Useful when creating a shared resource on demand (e.g. an
+ `.AsyncHTTPClient`) where the stack that caused its creation is
+ not relevant to future operations.
"""
def __enter__(self):
self.old_contexts = _state.contexts
def wrap(fn):
- """Returns a callable object that will restore the current StackContext
+ """Returns a callable object that will restore the current `StackContext`
when executed.
Use this whenever saving a callback to be executed later in a
To use `TCPServer`, define a subclass which overrides the `handle_stream`
method.
- `TCPServer` can serve SSL traffic with Python 2.6+ and OpenSSL.
To make this server serve SSL traffic, pass the ``ssl_options`` dictionary
argument with the arguments required for the `ssl.wrap_socket` method,
including ``"certfile"`` and ``"keyfile"``::
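For example (``data_dir`` and the file names are placeholders; the server construction lines are shown for context only):

```python
import os

data_dir = "/etc/myserver/ssl"  # hypothetical certificate directory

ssl_options = {
    "certfile": os.path.join(data_dir, "mydomain.crt"),
    "keyfile": os.path.join(data_dir, "mydomain.key"),
}
# server = TCPServer(ssl_options=ssl_options)
# server.listen(443)
```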
Address may be either an IP address or hostname. If it's a hostname,
the server will listen on all IP addresses associated with the
name. Address may be an empty string or None to listen on all
- available interfaces. Family may be set to either ``socket.AF_INET``
- or ``socket.AF_INET6`` to restrict to ipv4 or ipv6 addresses, otherwise
+ available interfaces. Family may be set to either `socket.AF_INET`
+ or `socket.AF_INET6` to restrict to IPv4 or IPv6 addresses, otherwise
both will be used if available.
The ``backlog`` argument has the same meaning as for
- `socket.socket.listen`.
+ `socket.listen <socket.socket.listen>`.
This method may be called multiple times prior to `start` to listen
on multiple ports or interfaces.
self._pending_sockets.extend(sockets)
def start(self, num_processes=1):
- """Starts this server in the IOLoop.
+ """Starts this server in the `.IOLoop`.
By default, we run the server in this process and do not fork any
additional child process.
class BaseLoader(object):
- """Base class for template loaders."""
- def __init__(self, autoescape=_DEFAULT_AUTOESCAPE, namespace=None):
- """Creates a template loader.
-
- root_directory may be the empty string if this loader does not
- use the filesystem.
+ """Base class for template loaders.
- autoescape must be either None or a string naming a function
+ You must use a template loader to use template constructs like
+ ``{% extends %}`` and ``{% include %}``. The loader caches all
+ templates after they are loaded the first time.
+ """
+ def __init__(self, autoescape=_DEFAULT_AUTOESCAPE, namespace=None):
+ """``autoescape`` must be either None or a string naming a function
in the template namespace, such as "xhtml_escape".
"""
self.autoescape = autoescape
class Loader(BaseLoader):
"""A template loader that loads from a single root directory.
-
- You must use a template loader to use template constructs like
- {% extends %} and {% include %}. Loader caches all templates after
- they are loaded the first time.
"""
def __init__(self, root_directory, **kwargs):
super(Loader, self).__init__(**kwargs)
#!/usr/bin/env python
"""Support classes for automated testing.
-This module contains three parts:
+* `AsyncTestCase` and `AsyncHTTPTestCase`: Subclasses of unittest.TestCase
+ with additional support for testing asynchronous (`.IOLoop` based) code.
-* `AsyncTestCase`/`AsyncHTTPTestCase`: Subclasses of unittest.TestCase
- with additional support for testing asynchronous (IOLoop-based) code.
-
-* `LogTrapTestCase`: Subclass of unittest.TestCase that discards log output
- from tests that pass and only produces output for failing tests.
+* `ExpectLog` and `LogTrapTestCase`: Make test logs less spammy.
* `main()`: A simple test runner (wrapper around unittest.main()) with support
for the tornado.autoreload module to rerun the tests when code changes.
-
-These components may be used together or independently. In particular,
-it is safe to combine AsyncTestCase and LogTrapTestCase via multiple
-inheritance. See the docstrings for each class/function below for more
-information.
"""
from __future__ import absolute_import, division, print_function, with_statement
class AsyncTestCase(unittest.TestCase):
- """TestCase subclass for testing IOLoop-based asynchronous code.
-
- The unittest framework is synchronous, so the test must be complete
- by the time the test method returns. This method provides the stop()
- and wait() methods for this purpose. The test method itself must call
- self.wait(), and asynchronous callbacks should call self.stop() to signal
- completion.
-
- By default, a new IOLoop is constructed for each test and is available
- as self.io_loop. This IOLoop should be used in the construction of
+ """`~unittest.TestCase` subclass for testing `.IOLoop`-based
+ asynchronous code.
+
+ The unittest framework is synchronous, so the test must be
+ complete by the time the test method returns. This class provides
+ the `stop()` and `wait()` methods for this purpose. The test
+ method itself must call ``self.wait()``, and asynchronous
+ callbacks should call ``self.stop()`` to signal completion.
+ Alternately, the `gen_test` decorator can be used to use yield points
+ from the `tornado.gen` module.
+
+ By default, a new `.IOLoop` is constructed for each test and is available
+ as ``self.io_loop``. This `.IOLoop` should be used in the construction of
HTTP clients/servers, etc. If the code being tested requires a
- global IOLoop, subclasses should override get_new_ioloop to return it.
+ global `.IOLoop`, subclasses should override `get_new_ioloop` to return it.
- The IOLoop's start and stop methods should not be called directly.
- Instead, use self.stop self.wait. Arguments passed to self.stop are
- returned from self.wait. It is possible to have multiple
- wait/stop cycles in the same test.
+ The `.IOLoop`'s ``start`` and ``stop`` methods should not be
+ called directly. Instead, use `self.stop <stop>` and `self.wait
+ <wait>`. Arguments passed to ``self.stop`` are returned from
+ ``self.wait``. It is possible to have multiple ``wait``/``stop``
+ cycles in the same test.
Example::
- # This test uses an asynchronous style similar to most async
- # application code.
+ # This test uses argument passing between self.stop and self.wait.
class MyTestCase(AsyncTestCase):
+ def test_http_fetch(self):
+ client = AsyncHTTPClient(self.io_loop)
+ client.fetch("http://www.tornadoweb.org/", self.stop)
+ response = self.wait()
+ # Test contents of response
+ self.assertIn("FriendFeed", response.body)
+
+ # This test uses an explicit callback-based style.
+ class MyTestCase2(AsyncTestCase):
def test_http_fetch(self):
client = AsyncHTTPClient(self.io_loop)
client.fetch("http://www.tornadoweb.org/", self.handle_fetch)
# self.wait() in test_http_fetch() via stack_context.
self.assertIn("FriendFeed", response.body)
self.stop()
-
- # This test uses the argument passing between self.stop and self.wait
- # for a simpler, more synchronous style.
- # This style is recommended over the preceding example because it
- # keeps the assertions in the test method itself, and is therefore
- # less sensitive to the subtleties of stack_context.
- class MyTestCase2(AsyncTestCase):
- def test_http_fetch(self):
- client = AsyncHTTPClient(self.io_loop)
- client.fetch("http://www.tornadoweb.org/", self.stop)
- response = self.wait()
- # Test contents of response
- self.assertIn("FriendFeed", response.body)
"""
def __init__(self, *args, **kwargs):
super(AsyncTestCase, self).__init__(*args, **kwargs)
self.__rethrow()
def get_new_ioloop(self):
- """Creates a new IOLoop for this test. May be overridden in
- subclasses for tests that require a specific IOLoop (usually
- the singleton).
+ """Creates a new `.IOLoop` for this test. May be overridden in
+ subclasses for tests that require a specific `.IOLoop` (usually
+ the singleton `.IOLoop.instance()`).
"""
return IOLoop()
self.__rethrow()
def stop(self, _arg=None, **kwargs):
- """Stops the ioloop, causing one pending (or future) call to wait()
+ """Stops the `.IOLoop`, causing one pending (or future) call to `wait()`
to return.
- Keyword arguments or a single positional argument passed to stop() are
- saved and will be returned by wait().
+ Keyword arguments or a single positional argument passed to `stop()` are
+ saved and will be returned by `wait()`.
"""
assert _arg is None or not kwargs
self.__stop_args = kwargs or _arg
self.__stopped = True
def wait(self, condition=None, timeout=5):
- """Runs the IOLoop until stop is called or timeout has passed.
+ """Runs the `.IOLoop` until stop is called or timeout has passed.
In the event of a timeout, an exception will be thrown.
- If condition is not None, the IOLoop will be restarted after stop()
- until condition() returns true.
+ If ``condition`` is not None, the `.IOLoop` will be restarted
+ after `stop()` until ``condition()`` returns true.
"""
if not self.__stopped:
if timeout:
class AsyncHTTPTestCase(AsyncTestCase):
"""A test case that starts up an HTTP server.
- Subclasses must override get_app(), which returns the
- tornado.web.Application (or other HTTPServer callback) to be tested.
- Tests will typically use the provided self.http_client to fetch
+ Subclasses must override `get_app()`, which returns the
+ `tornado.web.Application` (or other `.HTTPServer` callback) to be tested.
+ Tests will typically use the provided ``self.http_client`` to fetch
URLs from this server.
Example::
def get_app(self):
"""Should be overridden by subclasses to return a
- tornado.web.Application or other HTTPServer callback.
+ `tornado.web.Application` or other `.HTTPServer` callback.
"""
raise NotImplementedError()
def fetch(self, path, **kwargs):
"""Convenience method to synchronously fetch a url.
- The given path will be appended to the local server's host and port.
- Any additional kwargs will be passed directly to
- AsyncHTTPClient.fetch (and so could be used to pass method="POST",
- body="...", etc).
+ The given path will be appended to the local server's host and
+ port. Any additional kwargs will be passed directly to
+ `.AsyncHTTPClient.fetch` (and so could be used to pass
+ ``method="POST"``, ``body="..."``, etc).
"""
self.http_client.fetch(self.get_url(path), self.stop, **kwargs)
return self.wait()
the test succeeds, so this class can be useful to minimize the noise.
Simply use it as a base class for your test case. It is safe to combine
with AsyncTestCase via multiple inheritance
- ("class MyTestCase(AsyncHTTPTestCase, LogTrapTestCase):")
+ (``class MyTestCase(AsyncHTTPTestCase, LogTrapTestCase):``)
- This class assumes that only one log handler is configured and that
- it is a StreamHandler. This is true for both logging.basicConfig
- and the "pretty logging" configured by tornado.options. It is not
- compatible with other log buffering mechanisms, such as those provided
- by some test runners.
+ This class assumes that only one log handler is configured and
+ that it is a `~logging.StreamHandler`. This is true for both
+ `logging.basicConfig` and the "pretty logging" configured by
+ `tornado.options`. It is not compatible with other log buffering
+ mechanisms, such as those provided by some test runners.
"""
def run(self, result=None):
logger = logging.getLogger()
be specified.
Projects with many tests may wish to define a test script like
- tornado/test/runtests.py. This script should define a method all()
- which returns a test suite and then call tornado.testing.main().
- Note that even when a test script is used, the all() test suite may
- be overridden by naming a single test on the command line::
+ ``tornado/test/runtests.py``. This script should define a method
+ ``all()`` which returns a test suite and then call
+ `tornado.testing.main()`. Note that even when a test script is
+ used, the ``all()`` test suite may be overridden by naming a
+ single test on the command line::
# Runs all tests
python -m tornado.test.runtests