A simple Python caching proxy for web development and more, built on aiohttp and aiohttp-client-cache (a sibling project of requests-cache).
Useful for avoiding heavy, repeated access to external APIs during development, with little code change and without preparing mocks. Not recommended for production.
pip install cacheproxy
$ cacheproxy sqlite -c ./cache --expire-after 1800
Cache database: /tmp/cache.sqlite
======== Running on http://0.0.0.0:8080 ========
(Press CTRL+C to quit)
Other backends:
cacheproxy # in-memory (default)
cacheproxy memory # in-memory
cacheproxy file -c ./cache # file-based, saved under ./cache/
cacheproxy sqlite -c ./cache # sqlite, saved to ./cache.sqlite
cURL:
curl http://0.0.0.0:8080/api.github.com/repos/nolze/cacheproxy # This request is cached until the expiration time
# → {"id":...,"node_id":"...","name":"cacheproxy", ...
Python (requests):
import requests
base_url = "http://0.0.0.0:8080/api.github.com" # Just replace with "https://api.github.com" in production
resp = requests.get(f"{base_url}/repos/nolze/cacheproxy") # or use urljoin()
print(resp.json())
# → {'id': ...., 'node_id': '....', 'name': 'cacheproxy', ...
JavaScript/Node:
const baseURL = "http://0.0.0.0:8080/api.github.com"; // Just replace with "https://api.github.com" in production
const resp = await fetch(`${baseURL}/repos/nolze/cacheproxy`);
const data = await resp.json();
console.log(data);
// → Object { id: ..., node_id: "...", name: "cacheproxy", ...
To load or inspect an existing cache database from your own code, use aiohttp-client-cache directly.
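A minimal sketch of opening such a database with aiohttp-client-cache's CachedSession and SQLiteBackend, assuming the sqlite backend and a database at ./cache.sqlite (adjust the path to match your -c option). Whether a given request is served from an entry written by the proxy depends on the cache key, so treat this as a starting point rather than a guaranteed cache hit:

import asyncio
from aiohttp_client_cache import CachedSession, SQLiteBackend

async def main():
    # Point the backend at the database written by cacheproxy (path is an assumption)
    backend = SQLiteBackend(cache_name="./cache.sqlite")
    async with CachedSession(cache=backend) as session:
        async with session.get("https://api.github.com/repos/nolze/cacheproxy") as resp:
            data = await resp.json()
            print(data["name"])  # → cacheproxy

asyncio.run(main())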
Todos:
- Write basic tests
- Support POST/PUT
- Write docs
- Better error handling
- Better logging
- Support switching http/https (with --http/--https flags)
- Write more tests
- Support DynamoDB, MongoDB, and Redis backends
License: MIT