+++ /dev/null
-Client(...)
-
- .request(method, url, ...)
-
- .get(url, ...)
- .options(url, ...)
- .head(url, ...)
- .post(url, ...)
- .put(url, ...)
- .patch(url, ...)
- .delete(url, ...)
-
- .prepare_request(request)
- .send(request, ...)
- .close()
-
-
-Adapter()
-
- .prepare_request(request)
- .send(request)
- .close()
-
-
-+ EnvironmentAdapter
-+ RedirectAdapter
-+ CookieAdapter
-+ AuthAdapter
-+ ConnectionPool
- + HTTPConnection
- + HTTP11Connection
- + HTTP2Connection
-
-
-
-Response(...)
- .status_code - int
- .reason_phrase - str
- .protocol - "HTTP/2" or "HTTP/1.1"
- .url - URL
- .headers - Headers
-
- .content - bytes
- .text - str
- .encoding - str
- .json() - Any
-
- .read() - bytes
- .stream() - bytes iterator
- .raw() - bytes iterator
- .close() - None
-
- .is_redirect - bool
- .request - Request
- .cookies - Cookies
- .history - List[Response]
-
- .raise_for_status()
- .next()
-
-
-Request(...)
- .method
- .url
- .headers
-
- ...
-
-
-Headers
-
-URL
-
-Origin
-
-Cookies
-
-
-# Sync
-
-SyncClient
-SyncResponse
-SyncRequest
-SyncAdapter
-
-
-
-SSE
-HTTP/2 server push support
-Concurrency
# HTTPCore
-A low-level async HTTP library.
-
-## Proposed functionality
-
-* Support for streaming requests and responses. (Done)
-* Support for connection pooling. (Done)
-* gzip, deflate, and brotli decoding. (Done)
-* SSL verification. (Done)
-* Proxy support. (Not done)
-* HTTP/2 support. (Not done)
-* Support *both* async and sync operations. (Done)
-
-## Motivation
-
-Some of the trickier remaining issues on `requests-async` such as request/response streaming, connection pooling, proxy support, would require a fully async variant of urllib3. I considered and started work on a straight port of `urllib3-async`, but having started to dive into it, my judgement is that a from-scratch implementation will be less overall work to achieve.
-
-The intent is that this library could be the low-level implementation, that `requests-async` would then wrap up.
-
-## Credit
-
-* Some inspiration from the design-work of `urllib3`, but redone from scratch, and built as an async-first library.
-* Dependant on the absolutely excellent `h11` package.
-* Uses the `certifi` package for the default SSL verification.
+A proposal for [requests III](https://github.com/kennethreitz/requests3).
+
+## Feature support
+
+* `HTTP/1.1` and `HTTP/2` Support.
+* `async`/`await` support for non-thread-blocking HTTP requests.
+* Fully type annotated.
+* 100% test coverage. *TODO - We're on ~97% right now*
+
+Plus all the standard features of requests...
+
+* International Domains and URLs
+* Keep-Alive & Connection Pooling
+* Sessions with Cookie Persistence *TODO*
+* Browser-style SSL Verification
+* Basic/Digest Authentication *TODO*
+* Elegant Key/Value Cookies *TODO*
+* Automatic Decompression
+* Automatic Content Decoding
+* Unicode Response Bodies
+* Multipart File Uploads *TODO*
+* HTTP(S) Proxy Support *TODO*
+* Connection Timeouts
+* Streaming Downloads
+* .netrc Support *TODO*
+* Chunked Requests
## Usage
Making a request:
```python
-import httpcore
-
-http = httpcore.ConnectionPool()
-response = await http.request('GET', 'http://example.com')
-assert response.status_code == 200
-assert response.body == b'Hello, world'
+>>> import httpcore
+>>>
+>>> client = httpcore.Client()
+>>> response = await client.get('http://example.com')
+>>> response.status_code
+<StatusCode.ok: 200>
+>>> response.text
+'<!doctype html>\n<html>\n<head>\n<title>Example Domain</title>\n...'
```
-Top-level API:
+Alternatively, thread-synchronous requests:
```python
-http = httpcore.ConnectionPool([ssl], [timeout], [limits])
-response = await http.request(method, url, [headers], [body], [stream])
+>>> import httpcore
+>>>
+>>> client = httpcore.SyncClient()
+>>> response = client.get('http://example.com')
+>>> response.status_code
+<StatusCode.ok: 200>
+>>> response.text
+'<!doctype html>\n<html>\n<head>\n<title>Example Domain</title>\n...'
```
-ConnectionPool as a context-manager:
+---
+
+## API Reference
+
+#### `Client([ssl], [timeout], [pool_limits], [max_redirects])`
+
+* `.request(method, url, ...)`
+* `.get(url, ...)`
+* `.options(url, ...)`
+* `.head(url, ...)`
+* `.post(url, ...)`
+* `.put(url, ...)`
+* `.patch(url, ...)`
+* `.delete(url, ...)`
+* `.prepare_request(request)`
+* `.send(request, ...)`
+* `.close()`
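+
+A rough sketch of how the constructor configuration and the `prepare_request()`/`send()` methods fit together. It is illustrative rather than definitive: it assumes `content` and `headers` are optional when constructing a `Request`, and that `prepare_request()` returns the request to pass on to `send()`:
+
+```python
+>>> import httpcore
+>>>
+>>> timeout = httpcore.TimeoutConfig(connect_timeout=5.0, read_timeout=30.0)
+>>> pool_limits = httpcore.PoolLimits(soft_limit=10, hard_limit=100)
+>>> client = httpcore.Client(timeout=timeout, pool_limits=pool_limits)
+>>>
+>>> request = httpcore.Request('GET', 'http://example.com')
+>>> request = client.prepare_request(request)
+>>> response = await client.send(request)
+>>> response.status_code
+<StatusCode.ok: 200>
+```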
+
+### Models
+
+#### `Response(...)`
+
+* `.status_code` - **int**
+* `.reason_phrase` - **str**
+* `.protocol` - `"HTTP/2"` or `"HTTP/1.1"`
+* `.url` - **URL**
+* `.headers` - **Headers**
+* `.content` - **bytes**
+* `.text` - **str**
+* `.encoding` - **str**
+* `.json()` - **Any** *TODO*
+* `.read()` - **bytes**
+* `.stream()` - **bytes iterator**
+* `.raw()` - **bytes iterator**
+* `.close()` - **None**
+* `.is_redirect` - **bool**
+* `.request` - **Request**
+* `.cookies` - **Cookies** *TODO*
+* `.history` - **List[Response]**
+* `.raise_for_status()` - **Response** *TODO*
+* `.next()` - **Response**
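+
+A minimal streaming sketch, assuming `stream=True` defers reading the body and that `.stream()` is an async iterator of decoded chunks (with `.raw()` yielding the undecoded bytes):
+
+```python
+>>> response = await client.get('http://example.com', stream=True)
+>>> async for chunk in response.stream():
+...     print(len(chunk))
+```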
+
+#### `Request(method, url, content, headers)`
+
+...
+
+#### `URL(url, allow_relative=False)`
+
+*A normalized, IDNA-supporting URL.*
+
+* `.scheme` - **str**
+* `.authority` - **str**
+* `.host` - **str**
+* `.port` - **int**
+* `.path` - **str**
+* `.query` - **str**
+* `.full_path` - **str**
+* `.fragment` - **str**
+* `.is_ssl` - **bool**
+* `.origin` - **Origin**
+* `.is_absolute_url` - **bool**
+* `.is_relative_url` - **bool**
+* `.copy_with([scheme], [authority], [path], [query], [fragment])` - **URL**
+* `.resolve_with(url)` - **URL**
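+
+An illustrative sketch of the component properties (the example values are assumptions about how normalization behaves):
+
+```python
+>>> url = URL('https://example.org/path?search=filter#fragment')
+>>> url.scheme
+'https'
+>>> url.host
+'example.org'
+>>> url.path
+'/path'
+>>> url.query
+'search=filter'
+>>> url.is_ssl
+True
+```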
+
+#### `Origin(url)`
+
+*A normalized, IDNA-supporting set of scheme/host/port info.*
```python
-async with httpcore.ConnectionPool([ssl], [timeout], [limits]) as http:
- response = await http.request(method, url, [headers], [body], [stream])
+>>> Origin('https://example.org') == Origin('HTTPS://EXAMPLE.ORG:443')
+True
```
-Streaming responses:
+* `.is_ssl` - **bool**
+* `.host` - **str**
+* `.port` - **int**
-```python
-http = httpcore.ConnectionPool()
-response = await http.request(method, url, stream=True)
-async for part in response.stream():
- ...
-```
+#### `Headers(headers)`
-Raw data without gzip/deflate/brotli decompression applied:
+*A case-insensitive multi-dict.*
```python
-http = httpcore.ConnectionPool()
-response = await http.request(method, url, stream=True)
-async for part in response.raw():
- ...
+>>> headers = Headers({'Content-Type': 'application/json'})
+>>> headers['content-type']
+'application/json'
```
-Thread-synchronous requests:
+---
-```python
-http = httpcore.SyncConnectionPool()
-response = http.request('GET', 'http://example.com')
-assert response.status_code == 200
-assert response.body == b'Hello, world'
-```
+## Alternate backends
-## Building a Gateway Server
+### `SyncClient`
-The level of abstraction fits in really well if you're just writing at
-the raw ASGI level. Eg. Here's an how an ASGI gateway server looks against the
-API, including streaming uploads and downloads...
+A thread-synchronous client.
-```python
-import httpcore
-
-
-class GatewayServer:
- def __init__(self, base_url):
- self.base_url = base_url
- self.http = httpcore.ConnectionPool()
-
- async def __call__(self, scope, receive, send):
- assert scope['type'] == 'http'
- path = scope['path']
- query = scope['query_string']
- method = scope['method']
- headers = [
- (k, v) for (k, v) in scope['headers']
- if k not in (b'host', b'transfer-encoding')
- ]
-
- url = self.base_url + path
- if query:
- url += '?' + query.decode()
-
- initial_body, more_body = await self.initial_body(receive)
- if more_body:
- # Streaming request.
- body = self.stream_body(receive, initial_body)
- else:
- # Standard request.
- body = initial_body
-
- response = await self.http.request(
- method, url, headers=headers, body=body, stream=True
- )
-
- await send({
- 'type': 'http.response.start',
- 'status': response.status_code,
- 'headers': response.headers
- })
- data = b''
- async for next_data in response.raw():
- if data:
- await send({
- 'type': 'http.response.body',
- 'body': data,
- 'more_body': True
- })
- data = next_data
- await send({'type': 'http.response.body', 'body': data})
-
- async def initial_body(self, receive):
- """
- Pull the first body message off the 'receive' channel.
- Allows us to determine if we should use a streaming request or not.
- """
- message = await receive()
- body = message.get('body', b'')
- more_body = message.get('more_body', False)
- return (body, more_body)
-
- async def stream_body(self, receive, initial_body):
- """
- Async iterator returning bytes for the request body.
- """
- yield initial_body
- while True:
- message = await receive()
- yield message.get('body', b'')
- if not message.get('more_body', False):
- break
-
-
-app = GatewayServer('http://example.org')
-```
+### `TrioClient`
-Run with...
+*TODO*
-```shell
-uvicorn example:app
-```
+---
+
+## The Stack
+
+The `httpcore` client builds up behavior in a modular way.
+
+This makes it easier to dig into and understand the behavior of any one aspect in isolation, as well as making it easier to test or adapt for custom behaviors.
+
+You can also use lower-level components in isolation if required, e.g. a `ConnectionPool` on its own, without the cookie, redirect, or authentication layers. (See the sketch after the list below.)
+
+* `RedirectAdapter` - Adds redirect support.
+* `EnvironmentAdapter` - Adds `.netrc` support and environment variables such as `REQUESTS_CA_BUNDLE`.
+* `CookieAdapter` - Adds cookie persistence.
+* `AuthAdapter` - Adds authentication support.
+* `ConnectionPool` - Connection pooling & keep-alive.
+ * `HTTPConnection` - A single connection.
+ * `HTTP11Connection` - A single HTTP/1.1 connection.
+ * `HTTP2Connection` - A single HTTP/2 connection, with multiple streams.
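+
+For example, a `ConnectionPool` can be used directly as a minimal dispatcher, skipping the redirect, cookie, and auth layers (a sketch based on the usage in the test suite, not a definitive API):
+
+```python
+>>> http = httpcore.ConnectionPool()
+>>> response = await http.request('GET', 'http://example.com')
+>>> response.status_code
+<StatusCode.ok: 200>
+```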
from .models import URL, Headers, Origin, Request, Response
from .status_codes import codes
from .streams import BaseReader, BaseWriter, Protocol, Reader, Writer, connect
-from .sync import SyncClient, SyncConnectionPool
+from .sync import SyncClient
__version__ = "0.2.1"
self,
ssl: SSLConfig = DEFAULT_SSL_CONFIG,
timeout: TimeoutConfig = DEFAULT_TIMEOUT_CONFIG,
- limits: PoolLimits = DEFAULT_POOL_LIMITS,
+ pool_limits: PoolLimits = DEFAULT_POOL_LIMITS,
max_redirects: int = DEFAULT_MAX_REDIRECTS,
):
- connection_pool = ConnectionPool(ssl=ssl, timeout=timeout, limits=limits)
+ connection_pool = ConnectionPool(
+ ssl=ssl, timeout=timeout, pool_limits=pool_limits
+ )
cookie_adapter = CookieAdapter(dispatch=connection_pool)
auth_adapter = AuthenticationAdapter(dispatch=cookie_adapter)
redirect_adapter = RedirectAdapter(
*,
ssl: SSLConfig = DEFAULT_SSL_CONFIG,
timeout: TimeoutConfig = DEFAULT_TIMEOUT_CONFIG,
- limits: PoolLimits = DEFAULT_POOL_LIMITS,
+ pool_limits: PoolLimits = DEFAULT_POOL_LIMITS,
):
self.ssl = ssl
self.timeout = timeout
- self.limits = limits
+ self.pool_limits = pool_limits
self.is_closed = False
- self.max_connections = PoolSemaphore(limits)
+ self.max_connections = PoolSemaphore(pool_limits)
self.keepalive_connections = ConnectionStore()
self.active_connections = ConnectionStore()
self.active_connections.remove(connection)
self.max_connections.release()
elif (
- self.limits.soft_limit is not None
- and self.num_connections > self.limits.soft_limit
+ self.pool_limits.soft_limit is not None
+ and self.num_connections > self.pool_limits.soft_limit
):
self.active_connections.remove(connection)
self.max_connections.release()
request: Request = None,
history: typing.List["Response"] = None,
):
- self.status_code = status_code
+ try:
+ # Use a StatusCode IntEnum if possible, for a nicer representation.
+ self.status_code = codes(status_code)
+ except ValueError:
+ self.status_code = status_code
self.reason_phrase = reason_phrase or get_reason_phrase(status_code)
self.protocol = protocol
self.headers = Headers(headers)
class PoolSemaphore(BasePoolSemaphore):
- def __init__(self, limits: PoolLimits):
- self.limits = limits
+ def __init__(self, pool_limits: PoolLimits):
+ self.pool_limits = pool_limits
@property
def semaphore(self) -> typing.Optional[asyncio.BoundedSemaphore]:
if not hasattr(self, "_semaphore"):
- max_connections = self.limits.hard_limit
+ max_connections = self.pool_limits.hard_limit
if max_connections is None:
self._semaphore = None
else:
if self.semaphore is None:
return
- timeout = self.limits.pool_timeout
+ timeout = self.pool_limits.pool_timeout
try:
await asyncio.wait_for(self.semaphore.acquire(), timeout)
except asyncio.TimeoutError:
import typing
from types import TracebackType
-from .config import SSLConfig, TimeoutConfig
-from .dispatch.connection_pool import ConnectionPool
-from .interfaces import Adapter
-from .models import URL, Headers, Response
+from .client import Client
+from .config import (
+ DEFAULT_MAX_REDIRECTS,
+ DEFAULT_POOL_LIMITS,
+ DEFAULT_SSL_CONFIG,
+ DEFAULT_TIMEOUT_CONFIG,
+ PoolLimits,
+ SSLConfig,
+ TimeoutConfig,
+)
+from .models import (
+ URL,
+ ByteOrByteStream,
+ Headers,
+ HeaderTypes,
+ Request,
+ Response,
+ URLTypes,
+)
class SyncResponse:
class SyncClient:
- def __init__(self, adapter: Adapter):
- self._client = adapter
+ def __init__(
+ self,
+ ssl: SSLConfig = DEFAULT_SSL_CONFIG,
+ timeout: TimeoutConfig = DEFAULT_TIMEOUT_CONFIG,
+ pool_limits: PoolLimits = DEFAULT_POOL_LIMITS,
+ max_redirects: int = DEFAULT_MAX_REDIRECTS,
+ ) -> None:
+ self._client = Client(
+ ssl=ssl,
+ timeout=timeout,
+ pool_limits=pool_limits,
+ max_redirects=max_redirects,
+ )
self._loop = asyncio.new_event_loop()
def request(
self,
method: str,
- url: typing.Union[str, URL],
+ url: URLTypes,
*,
- headers: typing.List[typing.Tuple[bytes, bytes]] = [],
- body: typing.Union[bytes, typing.AsyncIterator[bytes]] = b"",
- **options: typing.Any
+ content: ByteOrByteStream = b"",
+ headers: HeaderTypes = None,
+ stream: bool = False,
+ allow_redirects: bool = True,
+ ssl: SSLConfig = None,
+ timeout: TimeoutConfig = None,
) -> SyncResponse:
response = self._loop.run_until_complete(
- self._client.request(method, url, headers=headers, body=body, **options)
+ self._client.request(
+ method,
+ url,
+ content=content,
+ headers=headers,
+ stream=stream,
+ allow_redirects=allow_redirects,
+ ssl=ssl,
+ timeout=timeout,
+ )
)
return SyncResponse(response, self._loop)
+ def get(
+ self,
+ url: URLTypes,
+ *,
+ headers: HeaderTypes = None,
+ stream: bool = False,
+ allow_redirects: bool = True,
+ ssl: SSLConfig = None,
+ timeout: TimeoutConfig = None,
+ ) -> SyncResponse:
+ return self.request(
+ "GET",
+ url,
+ headers=headers,
+ stream=stream,
+ allow_redirects=allow_redirects,
+ ssl=ssl,
+ timeout=timeout,
+ )
+
+ def options(
+ self,
+ url: URLTypes,
+ *,
+ headers: HeaderTypes = None,
+ stream: bool = False,
+ allow_redirects: bool = True,
+ ssl: SSLConfig = None,
+ timeout: TimeoutConfig = None,
+ ) -> SyncResponse:
+ return self.request(
+ "OPTIONS",
+ url,
+ headers=headers,
+ stream=stream,
+ allow_redirects=allow_redirects,
+ ssl=ssl,
+ timeout=timeout,
+ )
+
+ def head(
+ self,
+ url: URLTypes,
+ *,
+ headers: HeaderTypes = None,
+ stream: bool = False,
+        allow_redirects: bool = False,  # Note: Differs from the usual default.
+ ssl: SSLConfig = None,
+ timeout: TimeoutConfig = None,
+ ) -> SyncResponse:
+ return self.request(
+ "HEAD",
+ url,
+ headers=headers,
+ stream=stream,
+ allow_redirects=allow_redirects,
+ ssl=ssl,
+ timeout=timeout,
+ )
+
+ def post(
+ self,
+ url: URLTypes,
+ *,
+ content: ByteOrByteStream = b"",
+ headers: HeaderTypes = None,
+ stream: bool = False,
+ allow_redirects: bool = True,
+ ssl: SSLConfig = None,
+ timeout: TimeoutConfig = None,
+ ) -> SyncResponse:
+ return self.request(
+ "POST",
+ url,
+ content=content,
+ headers=headers,
+ stream=stream,
+ allow_redirects=allow_redirects,
+ ssl=ssl,
+ timeout=timeout,
+ )
+
+ def put(
+ self,
+ url: URLTypes,
+ *,
+ content: ByteOrByteStream = b"",
+ headers: HeaderTypes = None,
+ stream: bool = False,
+ allow_redirects: bool = True,
+ ssl: SSLConfig = None,
+ timeout: TimeoutConfig = None,
+ ) -> SyncResponse:
+ return self.request(
+ "PUT",
+ url,
+ content=content,
+ headers=headers,
+ stream=stream,
+ allow_redirects=allow_redirects,
+ ssl=ssl,
+ timeout=timeout,
+ )
+
+ def patch(
+ self,
+ url: URLTypes,
+ *,
+ content: ByteOrByteStream = b"",
+ headers: HeaderTypes = None,
+ stream: bool = False,
+ allow_redirects: bool = True,
+ ssl: SSLConfig = None,
+ timeout: TimeoutConfig = None,
+ ) -> SyncResponse:
+ return self.request(
+ "PATCH",
+ url,
+ content=content,
+ headers=headers,
+ stream=stream,
+ allow_redirects=allow_redirects,
+ ssl=ssl,
+ timeout=timeout,
+ )
+
+ def delete(
+ self,
+ url: URLTypes,
+ *,
+ content: ByteOrByteStream = b"",
+ headers: HeaderTypes = None,
+ stream: bool = False,
+ allow_redirects: bool = True,
+ ssl: SSLConfig = None,
+ timeout: TimeoutConfig = None,
+ ) -> SyncResponse:
+ return self.request(
+ "DELETE",
+ url,
+ content=content,
+ headers=headers,
+ stream=stream,
+ allow_redirects=allow_redirects,
+ ssl=ssl,
+ timeout=timeout,
+ )
+
def close(self) -> None:
self._loop.run_until_complete(self._client.close())
traceback: TracebackType = None,
) -> None:
self.close()
-
-
-def SyncConnectionPool(*args: typing.Any, **kwargs: typing.Any) -> SyncClient:
- client = ConnectionPool(*args, **kwargs) # type: ignore
- return SyncClient(client)
"""
The soft_limit config should limit the maximum number of keep-alive connections.
"""
- limits = httpcore.PoolLimits(soft_limit=1)
+ pool_limits = httpcore.PoolLimits(soft_limit=1)
- async with httpcore.ConnectionPool(limits=limits) as http:
+ async with httpcore.ConnectionPool(pool_limits=pool_limits) as http:
response = await http.request("GET", "http://127.0.0.1:8000/")
assert len(http.active_connections) == 0
assert len(http.keepalive_connections) == 1
@threadpool
def test_get(server):
- with httpcore.SyncConnectionPool() as http:
- response = http.request("GET", "http://127.0.0.1:8000/")
+ with httpcore.SyncClient() as http:
+ response = http.get("http://127.0.0.1:8000/")
assert response.status_code == 200
assert response.content == b"Hello, world!"
assert response.text == "Hello, world!"
@threadpool
def test_post(server):
- with httpcore.SyncConnectionPool() as http:
- response = http.request("POST", "http://127.0.0.1:8000/", body=b"Hello, world!")
+ with httpcore.SyncClient() as http:
+ response = http.post("http://127.0.0.1:8000/", content=b"Hello, world!")
assert response.status_code == 200
assert response.reason_phrase == "OK"
@threadpool
def test_stream_response(server):
- with httpcore.SyncConnectionPool() as http:
- response = http.request("GET", "http://127.0.0.1:8000/", stream=True)
+ with httpcore.SyncClient() as http:
+ response = http.get("http://127.0.0.1:8000/", stream=True)
assert response.status_code == 200
content = response.read()
assert content == b"Hello, world!"
@threadpool
def test_stream_iterator(server):
- with httpcore.SyncConnectionPool() as http:
- response = http.request("GET", "http://127.0.0.1:8000/", stream=True)
+ with httpcore.SyncClient() as http:
+ response = http.get("http://127.0.0.1:8000/", stream=True)
assert response.status_code == 200
body = b""
for chunk in response.stream():
@threadpool
def test_raw_iterator(server):
- with httpcore.SyncConnectionPool() as http:
- response = http.request("GET", "http://127.0.0.1:8000/", stream=True)
+ with httpcore.SyncClient() as http:
+ response = http.get("http://127.0.0.1:8000/", stream=True)
assert response.status_code == 200
body = b""
for chunk in response.raw():
import pytest
-import httpcore
+from httpcore import (
+ Client,
+ ConnectTimeout,
+ PoolLimits,
+ PoolTimeout,
+ ReadTimeout,
+ TimeoutConfig,
+)
@pytest.mark.asyncio
async def test_read_timeout(server):
- timeout = httpcore.TimeoutConfig(read_timeout=0.0001)
+ timeout = TimeoutConfig(read_timeout=0.0001)
- async with httpcore.ConnectionPool(timeout=timeout) as http:
- with pytest.raises(httpcore.ReadTimeout):
- await http.request("GET", "http://127.0.0.1:8000/slow_response")
+ async with Client(timeout=timeout) as client:
+ with pytest.raises(ReadTimeout):
+ await client.get("http://127.0.0.1:8000/slow_response")
@pytest.mark.asyncio
async def test_connect_timeout(server):
- timeout = httpcore.TimeoutConfig(connect_timeout=0.0001)
+ timeout = TimeoutConfig(connect_timeout=0.0001)
- async with httpcore.ConnectionPool(timeout=timeout) as http:
- with pytest.raises(httpcore.ConnectTimeout):
+ async with Client(timeout=timeout) as client:
+ with pytest.raises(ConnectTimeout):
# See https://stackoverflow.com/questions/100841/
- await http.request("GET", "http://10.255.255.1/")
+ await client.get("http://10.255.255.1/")
@pytest.mark.asyncio
async def test_pool_timeout(server):
- limits = httpcore.PoolLimits(hard_limit=1, pool_timeout=0.0001)
+ pool_limits = PoolLimits(hard_limit=1, pool_timeout=0.0001)
- async with httpcore.ConnectionPool(limits=limits) as http:
- response = await http.request("GET", "http://127.0.0.1:8000/", stream=True)
+ async with Client(pool_limits=pool_limits) as client:
+ response = await client.get("http://127.0.0.1:8000/", stream=True)
- with pytest.raises(httpcore.PoolTimeout):
- await http.request("GET", "http://localhost:8000/")
+ with pytest.raises(PoolTimeout):
+ await client.get("http://localhost:8000/")
await response.read()