Compare commits


6 commits

25 changed files with 703 additions and 635 deletions


@ -22,9 +22,7 @@ So I built one.
`spiderweb` is a small web framework, just big enough to hold a spider. Getting started is easy:
```shell
uv add spiderweb-framework
# or
pip install spiderweb-framework
poetry add spiderweb-framework
```
Create a new file and drop this in it:
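(The starter snippet itself is unchanged in this diff and therefore collapsed; the sketch below is a rough guess at a minimal app. `SpiderwebRouter`, `@app.route`, and `HttpResponse` appear elsewhere in this changeset, but the import location and the `app.start()` call are assumptions.)
```python
from spiderweb import SpiderwebRouter, HttpResponse  # import path assumed

app = SpiderwebRouter()


@app.route("/")
def index(request):
    return HttpResponse(body="HELLO, WORLD!")


if __name__ == "__main__":
    app.start()  # method name assumed
```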
@ -45,17 +43,6 @@ if __name__ == "__main__":
## [View the docs here!](https://itsthejoker.github.io/spiderweb/#/)
### Development (using uv)
This repository uses uv for local development and testing.
- Create a virtual environment: `uv venv`
- Activate it (Windows): `.venv\Scripts\activate`
- Activate it (POSIX): `source .venv/bin/activate`
- Install deps (editable + dev): `uv pip install -e .[dev]`
- Run tests: `uv run python -m pytest`
- Lint/format: `uv run ruff check .` and `uv run black .`
My goal with this framework was to do three things:
1. Learn a lot
@ -81,5 +68,5 @@ And, honestly, I think I got there. Here's a non-exhaustive list of things this
- CORS middleware
- Optional POST data validation middleware with Pydantic
- Session middleware with built-in session store
- Database support (using SQLAlchemy, but you can use whatever you want as long as there's a SQLAlchemy driver for it)
- Database support (using Peewee, but you can use whatever you want as long as there's a Peewee driver for it)
- Tests (currently roughly 89% coverage)


@ -62,12 +62,11 @@ Spiderweb provides five types of responses out of the box:
### Database Agnosticism (Mostly)
Spiderweb persists its internal data (like sessions) using Advanced Alchemy built on SQLAlchemy. Your application can use this same setup out of the box, or bring any ORM you prefer.
One of the largest selling points of Django is the Django Object Relational Mapper (ORM); while there's nothing that compares to it in functionality, there are many other ORMs and database management solutions for developers to choose from.
- By default, Spiderweb creates a SQLite database file `spiderweb.db` next to your app.
- You can pass the `db` argument to `SpiderwebRouter` as a filesystem path (for SQLite), a SQLAlchemy database URL string, or a SQLAlchemy Engine instance.
In order to use a database internally (and since this is not about writing an ORM too), Spiderweb depends on [peewee, a small ORM](https://github.com/coleifer/peewee). Applications using Spiderweb are more than welcome to use peewee models with first-class support or use whatever they're familiar with. Peewee supports PostgreSQL, MySQL, SQLite, and CockroachDB; if you use one of these, Spiderweb can create the tables it needs in your database and stay out of the way. By default, Spiderweb creates a SQLite database in the application directory for its own use.
> [Read more about databases and migrations](db.md)
> [Read more about using a database in Spiderweb](db.md)
### Easy to configure
@ -130,4 +129,4 @@ Here's a non-exhaustive list of things this can do:
- Database support (using Peewee, but you can use whatever you want as long as there's a Peewee driver for it)
- Tests (currently a little over 80% coverage)
[^1]: I mostly succeeded. The way that I'm approaching this is that I did my level best, then looked at (and copied) existing solutions where necessary. At the time of this writing, I did all of it solo except for the CORS middleware. [Read more about it here.](middleware/cors.md)


@ -1,39 +1,19 @@
# databases
Spiderweb is intentionally ORM-agnostic. Internally, it now uses Advanced Alchemy (built on SQLAlchemy) to persist first-party data like sessions. You can choose one of the following approaches for your application data:
- Option 1: Use the builtin Advanced Alchemy/SQLAlchemy setup
- Option 2: Bring your own ORM and manage its lifecycle
- Option 3: Use separate databases for Spiderweb internals and your app data
## Option 1: Use the builtin Advanced Alchemy/SQLAlchemy setup
By default, Spiderweb will create and use a SQLite database file named `spiderweb.db` in your application directory. You can change this by passing the `db` argument to `SpiderwebRouter` as any of the following:
- A SQLAlchemy Engine instance
- A database URL string (e.g., `sqlite:///path/to.db`, `postgresql+psycopg://user:pass@host/db`)
- A filesystem path string for SQLite (e.g., `my_db.sqlite`)
Examples:
```python
from sqlalchemy import create_engine
from spiderweb import SpiderwebRouter

# Use a SQLite file by passing a path
app = SpiderwebRouter(db="my_db.sqlite")

# Or pass a SQLAlchemy engine
engine = create_engine("postgresql+psycopg://user:pass@localhost/myapp")
app = SpiderwebRouter(db=engine)

# Or pass a full URL string
app = SpiderwebRouter(db="sqlite:///./local.db")
```
## Option 2: Bring your own ORM
If you are using another ORM or data layer, create and manage it as you normally would. If you need per-request access to a connection or session, you can attach it via custom middleware (see the example below).
It's hard to find a server-side app without a database these days, and for good reason: there are a lot of things to keep track of. Spiderweb does its best to remain database-agnostic, though it does utilize `peewee` internally to handle its own data (such as session data). This means that you have three options for how to handle databases in your app.
## Option 1: Using Peewee
If you'd just like to use the same system that's already in place, you can import `SpiderwebModel` and get to work writing your own models for Peewee. See below for notes on writing your own database models, fitting them into the server, and changing the driver to a different type of database.
## Option 2: Using your own database ORM
You may not want to use Peewee, and that's totally fine; in that case, you will want to tell Spiderweb where the database is so that it can create the tables that it needs. To do this, you'll need to be using a database type that Peewee supports; at this time, the options are SQLite, MySQL, MariaDB, and Postgres.
You'll want to instantiate your own ORM in the way that works best for you and let Spiderweb know where to find the database. See "Changing the Peewee Database Target" below for information on how to adjust where Spiderweb places data.
Instantiating your own ORM depends on whether your ORM can maintain an application-wide connection or if it needs a new connection on a per-request basis. For example, SQLAlchemy prefers that you use an `engine` to access the database. Since it's not clear at any given point which view will be receiving a request, this might be a good reason for some custom middleware to add an `engine` attribute onto the request that can be retrieved later:
@ -41,60 +21,105 @@
```python
from spiderweb.middleware import SpiderwebMiddleware
from sqlalchemy import create_engine


class SQLAlchemyMiddleware(SpiderwebMiddleware):
    # there's only one of these, so we can just make it a top-level attr
    engine = None

    def process_request(self, request) -> None:
        # provide handles for the default `spiderweb.db` sqlite3 db
        if not self.engine:
            self.engine = create_engine("sqlite:///spiderweb.db")
        request.engine = self.engine
```
Now any view that receives the incoming request object can access `request.engine` and interact with the database as needed.
Now, any view that receives the incoming request object will be able to access `request.engine` and interact with the database as needed.
> See [Writing Your Own Middleware](middleware/custom_middleware.md) for more information.
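As a concrete follow-up, a view might use that attached engine directly. This is only a sketch: the `HttpResponse` import location, the middleware's dotted path, and the `users` table are assumptions, and `sqlalchemy.text` is just one way to issue a raw query.
```python
from sqlalchemy import text

from spiderweb import SpiderwebRouter, HttpResponse  # import locations assumed

app = SpiderwebRouter(
    middleware=["myapp.middleware.SQLAlchemyMiddleware"],  # hypothetical dotted path
)


@app.route("/user-count/")
def user_count(request):
    # request.engine was attached by SQLAlchemyMiddleware in process_request
    with request.engine.connect() as conn:
        # "users" is a hypothetical table in your own schema
        count = conn.execute(text("SELECT COUNT(*) FROM users")).scalar()
    return HttpResponse(body=f"There are {count} users.")
```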
## Option 3: Use two databases
## Option 3: Using two databases
If your application requires a database not supported by SQLAlchemy or you prefer to keep concerns separated, you can run two databases: one for Spiderweb internals (sessions, etc.) and one for your application logic.
Admittedly, this isn't the most delightful of options, but if your application needs a database that Peewee doesn't natively support, you will want to set aside a database connection specifically for Spiderweb so that its internal functions continue to work as expected while your app uses the database you need for business logic.
## Migrations
Advanced Alchemy works seamlessly with Alembic (SQLAlchemy's migration tool). To manage schema changes:
1. Install Alembic:
```bash
pip install alembic
```
2. Initialize a migration repository:
```bash
alembic init migrations
```
3. Configure Alembic to use Spiderweb's metadata. In `migrations/env.py`, set:
```python
from spiderweb.db import Base

target_metadata = Base.metadata
```
Also set the database URL either in `alembic.ini` (`sqlalchemy.url = ...`) or dynamically in `env.py` (read from environment variables or config).
4. Generate migrations from model changes:
```bash
alembic revision --autogenerate -m "add my table"
```
5. Apply migrations:
```bash
alembic upgrade head
```
Notes:
- If you define your own SQLAlchemy models, make sure they inherit from `spiderweb.db.Base` (or include their metadata in `target_metadata`) so Alembic can discover them.
- For multi-database setups, you can configure multiple Alembic contexts or run Alembic separately per database.
- Advanced Alchemy provides additional helpers on top of SQLAlchemy; you can use them freely alongside the guidance above.
## Changing the Peewee Database Target
By default, Spiderweb will create and use a SQLite db in the application directory named `spiderweb.db`. You can change this by selecting the right driver from Peewee and passing it to Spiderweb during the server instantiation, like this:
```python
from spiderweb import SpiderwebRouter
from peewee import SqliteDatabase

app = SpiderwebRouter(
    db=SqliteDatabase("my_db.sqlite")
)
```
Peewee supports the following databases at this time:
- SQLite
- MySQL
- MariaDB
- Postgres
Connecting Spiderweb to Postgres would look like this:
```python
from spiderweb import SpiderwebRouter
from peewee import PostgresqlDatabase

app = SpiderwebRouter(
    db=PostgresqlDatabase(
        'my_app',
        user='postgres',
        password='secret',
        host='10.1.0.9',
        port=5432
    )
)
```
## Writing Peewee Models
```python
from spiderweb.db import SpiderwebModel
```
If you'd like to use Peewee, then you can use the model code written for Spiderweb. There are two special powers this grants you: migration checking and automatic database assignments.
### Automatic Database Assignments
One of the odder quirks of Peewee is that you must specify what database object a model is attached to. From [the docs](https://docs.peewee-orm.com/en/latest/peewee/quickstart.html#model-definition):
```python
from peewee import *

db = SqliteDatabase('people.db')


class Person(Model):
    name = CharField()
    birthday = DateField()

    class Meta:
        database = db  # This model uses the "people.db" database.
```
Spiderweb handles the database assignment for you so that your model is added to the same database that is already in use, regardless of driver:
```python
from spiderweb.db import SpiderwebModel
from peewee import *


class Person(SpiderwebModel):
    name = CharField()
    birthday = DateField()
```
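As a usage sketch (assuming standard peewee query methods, that the `Person` table already exists, and that `HttpResponse` can be imported as shown, which is an assumption):
```python
import datetime

from peewee import CharField, DateField
from spiderweb import SpiderwebRouter, HttpResponse  # import locations assumed
from spiderweb.db import SpiderwebModel

app = SpiderwebRouter()


class Person(SpiderwebModel):
    name = CharField()
    birthday = DateField()


@app.route("/people/")
def people(request):
    # Plain peewee calls work; the table lives in whichever database
    # Spiderweb was configured with (spiderweb.db by default).
    Person.create(name="Charlotte", birthday=datetime.date(1952, 10, 15))
    names = ", ".join(person.name for person in Person.select())
    return HttpResponse(body=names)
```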
### Migration Checking
Spiderweb also watches your model and raises an error if the state of the database and your model schema differ. This check attempts to be as thorough as possible, but may not be appropriate for you; if that's the case, then add the magic variable `skip_migration_check` to the `Meta` class for your model. For example:
```python
class Person(SpiderwebModel):
    name = CharField()
    birthday = DateField()

    class Meta:
        skip_migration_check = True
```


@ -9,9 +9,6 @@ app = SpiderwebRouter(
```
When working with form data, you may not always want to perform your own validation on the incoming data. Spiderweb gives you a way out of the box to perform this validation using Pydantic.
> [!WARNING]
> Pydantic is not installed by default. Install it with `pip install pydantic` or `pip install spiderweb[pydantic]`.
Let's assume that we have a form view that looks like this:
```python
@ -57,4 +54,4 @@ The Pydantic middleware will automatically detect that the model that you want t
If the validation fails, the middleware will call `on_error`, which by default will return a 400 with a list of the broken fields. You may not want this behavior, so the easiest way to address it is to subclass PydanticMiddleware with your own version and override `on_error` to do whatever you'd like.
If validation succeeds, the data from the validator will appear on the request object under `request.validated_data` — to access it, just call `.dict()` on the validated data.
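As a sketch of both of these points: the `on_error` signature, the middleware's import path, the `HttpResponse` import location, and the `name` field are all assumptions here; only `request.validated_data` and the default 400 behavior are documented above.
```python
from spiderweb import HttpResponse  # import location assumed
from spiderweb.middleware.pydantic import PydanticMiddleware  # module path assumed


class QuietPydanticMiddleware(PydanticMiddleware):
    def on_error(self, request, errors):  # signature assumed
        # Replace the default 400-with-field-list behavior with a terse response.
        return HttpResponse(body="Invalid form submission.", status_code=400)


def comment_form(request):
    # On success, the validated model is available on the request object.
    data = request.validated_data.dict()
    return HttpResponse(body=f"Thanks, {data['name']}!")  # 'name' is an example field
```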


@ -4,10 +4,10 @@ Start by installing the package with your favorite package manager:
<!-- tabs:start -->
<!-- tab:uv -->
<!-- tab:poetry -->
```shell
uv add spiderweb-framework
poetry add spiderweb-framework
```
<!-- tab:pip -->


@ -29,6 +29,7 @@ app = SpiderwebRouter(
    append_slash=False,  # default
    cors_allow_all_origins=True,
    static_url="static_stuff",
    media_dir="media",
    debug=True,
    case_transform_middleware_type="spongebob",
)
@ -71,6 +72,26 @@ def example(request, id):
return HttpResponse(body=f"Example with id {id}")
@app.route("file_upload/")
def file_upload(request):
if request.method == "POST":
if "file" not in request.FILES:
return HttpResponse(body="No file uploaded", status_code=400)
file = request.FILES["file"]
content = file.read()
filepath = file.save() # Save the file to the media directory
try:
return HttpResponse(body=f"File content: {content.decode('utf-8')}")
except UnicodeDecodeError:
return HttpResponse(
body=f"The file has been uploaded, but it is not a text file."
f" Saved to {filepath}",
status_code=400,
)
else:
return TemplateResponse(request, "file_upload.html")
@app.error(405)
def http405(request) -> HttpResponse:
return HttpResponse(body="Method not allowed", status_code=405)


@ -48,14 +48,12 @@ def form(request):
app = SpiderwebRouter(
    templates_dirs=["templates"],
    middleware=[
        "spiderweb.middleware.sessions.SessionMiddleware",
        "spiderweb.middleware.csrf.CSRFMiddleware",
        "example_middleware.TestMiddleware",
        "example_middleware.RedirectMiddleware",
        "example_middleware.ExplodingMiddleware",
    ],
    staticfiles_dirs=["static_files"],
    debug=True,
    routes=[
        ("/", index),
        ("/redirect", redirect),

poetry.lock generated

@ -1,61 +1,12 @@
# This file is automatically @generated by Poetry 2.1.1 and should not be changed by hand.
[[package]]
name = "advanced-alchemy"
version = "1.6.3"
description = "Ready-to-go SQLAlchemy concoctions."
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "advanced_alchemy-1.6.3-py3-none-any.whl", hash = "sha256:d905f51affb427a13787f4b562d283ee891fc3c3865b5d7869f2f483533f1546"},
{file = "advanced_alchemy-1.6.3.tar.gz", hash = "sha256:b0ed313c0e1b7ac3c1a9caf8349d1742099b1c1d5f73e8926826da942aa1bf6c"},
]
[package.dependencies]
alembic = ">=1.12.0"
greenlet = "*"
sqlalchemy = ">=2.0.20"
typing-extensions = ">=4.0.0"
[package.extras]
argon2 = ["argon2-cffi"]
cli = ["rich-click"]
fsspec = ["fsspec"]
nanoid = ["fastnanoid (>=0.4.1)"]
obstore = ["obstore"]
passlib = ["passlib[argon2]"]
pwdlib = ["pwdlib[argon2]"]
uuid = ["uuid-utils (>=0.6.1)"]
[[package]]
name = "alembic"
version = "1.17.0"
description = "A database migration tool for SQLAlchemy."
optional = false
python-versions = ">=3.10"
groups = ["main"]
files = [
{file = "alembic-1.17.0-py3-none-any.whl", hash = "sha256:80523bc437d41b35c5db7e525ad9d908f79de65c27d6a5a5eab6df348a352d99"},
{file = "alembic-1.17.0.tar.gz", hash = "sha256:4652a0b3e19616b57d652b82bfa5e38bf5dbea0813eed971612671cb9e90c0fe"},
]
[package.dependencies]
Mako = "*"
SQLAlchemy = ">=1.4.0"
typing-extensions = ">=4.12"
[package.extras]
tz = ["tzdata"]
# This file is automatically @generated by Poetry 2.1.3 and should not be changed by hand.
[[package]]
name = "annotated-types"
version = "0.7.0"
description = "Reusable constraint types to use with typing.Annotated"
optional = true
optional = false
python-versions = ">=3.8"
groups = ["main"]
markers = "extra == \"pydantic\""
files = [
{file = "annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53"},
{file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
@ -126,6 +77,87 @@ d = ["aiohttp (>=3.7.4) ; sys_platform != \"win32\" or implementation_name != \"
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
[[package]]
name = "cffi"
version = "1.17.1"
description = "Foreign Function Interface for Python calling C code."
optional = false
python-versions = ">=3.8"
groups = ["main"]
markers = "platform_python_implementation != \"PyPy\""
files = [
{file = "cffi-1.17.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:df8b1c11f177bc2313ec4b2d46baec87a5f3e71fc8b45dab2ee7cae86d9aba14"},
{file = "cffi-1.17.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8f2cdc858323644ab277e9bb925ad72ae0e67f69e804f4898c070998d50b1a67"},
{file = "cffi-1.17.1-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:edae79245293e15384b51f88b00613ba9f7198016a5948b5dddf4917d4d26382"},
{file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:45398b671ac6d70e67da8e4224a065cec6a93541bb7aebe1b198a61b58c7b702"},
{file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ad9413ccdeda48c5afdae7e4fa2192157e991ff761e7ab8fdd8926f40b160cc3"},
{file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5da5719280082ac6bd9aa7becb3938dc9f9cbd57fac7d2871717b1feb0902ab6"},
{file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bb1a08b8008b281856e5971307cc386a8e9c5b625ac297e853d36da6efe9c17"},
{file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:045d61c734659cc045141be4bae381a41d89b741f795af1dd018bfb532fd0df8"},
{file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:6883e737d7d9e4899a8a695e00ec36bd4e5e4f18fabe0aca0efe0a4b44cdb13e"},
{file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:6b8b4a92e1c65048ff98cfe1f735ef8f1ceb72e3d5f0c25fdb12087a23da22be"},
{file = "cffi-1.17.1-cp310-cp310-win32.whl", hash = "sha256:c9c3d058ebabb74db66e431095118094d06abf53284d9c81f27300d0e0d8bc7c"},
{file = "cffi-1.17.1-cp310-cp310-win_amd64.whl", hash = "sha256:0f048dcf80db46f0098ccac01132761580d28e28bc0f78ae0d58048063317e15"},
{file = "cffi-1.17.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a45e3c6913c5b87b3ff120dcdc03f6131fa0065027d0ed7ee6190736a74cd401"},
{file = "cffi-1.17.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:30c5e0cb5ae493c04c8b42916e52ca38079f1b235c2f8ae5f4527b963c401caf"},
{file = "cffi-1.17.1-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f75c7ab1f9e4aca5414ed4d8e5c0e303a34f4421f8a0d47a4d019ceff0ab6af4"},
{file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a1ed2dd2972641495a3ec98445e09766f077aee98a1c896dcb4ad0d303628e41"},
{file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:46bf43160c1a35f7ec506d254e5c890f3c03648a4dbac12d624e4490a7046cd1"},
{file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a24ed04c8ffd54b0729c07cee15a81d964e6fee0e3d4d342a27b020d22959dc6"},
{file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:610faea79c43e44c71e1ec53a554553fa22321b65fae24889706c0a84d4ad86d"},
{file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a9b15d491f3ad5d692e11f6b71f7857e7835eb677955c00cc0aefcd0669adaf6"},
{file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:de2ea4b5833625383e464549fec1bc395c1bdeeb5f25c4a3a82b5a8c756ec22f"},
{file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:fc48c783f9c87e60831201f2cce7f3b2e4846bf4d8728eabe54d60700b318a0b"},
{file = "cffi-1.17.1-cp311-cp311-win32.whl", hash = "sha256:85a950a4ac9c359340d5963966e3e0a94a676bd6245a4b55bc43949eee26a655"},
{file = "cffi-1.17.1-cp311-cp311-win_amd64.whl", hash = "sha256:caaf0640ef5f5517f49bc275eca1406b0ffa6aa184892812030f04c2abf589a0"},
{file = "cffi-1.17.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:805b4371bf7197c329fcb3ead37e710d1bca9da5d583f5073b799d5c5bd1eee4"},
{file = "cffi-1.17.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:733e99bc2df47476e3848417c5a4540522f234dfd4ef3ab7fafdf555b082ec0c"},
{file = "cffi-1.17.1-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1257bdabf294dceb59f5e70c64a3e2f462c30c7ad68092d01bbbfb1c16b1ba36"},
{file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:da95af8214998d77a98cc14e3a3bd00aa191526343078b530ceb0bd710fb48a5"},
{file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d63afe322132c194cf832bfec0dc69a99fb9bb6bbd550f161a49e9e855cc78ff"},
{file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f79fc4fc25f1c8698ff97788206bb3c2598949bfe0fef03d299eb1b5356ada99"},
{file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b62ce867176a75d03a665bad002af8e6d54644fad99a3c70905c543130e39d93"},
{file = "cffi-1.17.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:386c8bf53c502fff58903061338ce4f4950cbdcb23e2902d86c0f722b786bbe3"},
{file = "cffi-1.17.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4ceb10419a9adf4460ea14cfd6bc43d08701f0835e979bf821052f1805850fe8"},
{file = "cffi-1.17.1-cp312-cp312-win32.whl", hash = "sha256:a08d7e755f8ed21095a310a693525137cfe756ce62d066e53f502a83dc550f65"},
{file = "cffi-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:51392eae71afec0d0c8fb1a53b204dbb3bcabcb3c9b807eedf3e1e6ccf2de903"},
{file = "cffi-1.17.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f3a2b4222ce6b60e2e8b337bb9596923045681d71e5a082783484d845390938e"},
{file = "cffi-1.17.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0984a4925a435b1da406122d4d7968dd861c1385afe3b45ba82b750f229811e2"},
{file = "cffi-1.17.1-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d01b12eeeb4427d3110de311e1774046ad344f5b1a7403101878976ecd7a10f3"},
{file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:706510fe141c86a69c8ddc029c7910003a17353970cff3b904ff0686a5927683"},
{file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:de55b766c7aa2e2a3092c51e0483d700341182f08e67c63630d5b6f200bb28e5"},
{file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c59d6e989d07460165cc5ad3c61f9fd8f1b4796eacbd81cee78957842b834af4"},
{file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dd398dbc6773384a17fe0d3e7eeb8d1a21c2200473ee6806bb5e6a8e62bb73dd"},
{file = "cffi-1.17.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:3edc8d958eb099c634dace3c7e16560ae474aa3803a5df240542b305d14e14ed"},
{file = "cffi-1.17.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:72e72408cad3d5419375fc87d289076ee319835bdfa2caad331e377589aebba9"},
{file = "cffi-1.17.1-cp313-cp313-win32.whl", hash = "sha256:e03eab0a8677fa80d646b5ddece1cbeaf556c313dcfac435ba11f107ba117b5d"},
{file = "cffi-1.17.1-cp313-cp313-win_amd64.whl", hash = "sha256:f6a16c31041f09ead72d69f583767292f750d24913dadacf5756b966aacb3f1a"},
{file = "cffi-1.17.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:636062ea65bd0195bc012fea9321aca499c0504409f413dc88af450b57ffd03b"},
{file = "cffi-1.17.1-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c7eac2ef9b63c79431bc4b25f1cd649d7f061a28808cbc6c47b534bd789ef964"},
{file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e221cf152cff04059d011ee126477f0d9588303eb57e88923578ace7baad17f9"},
{file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:31000ec67d4221a71bd3f67df918b1f88f676f1c3b535a7eb473255fdc0b83fc"},
{file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6f17be4345073b0a7b8ea599688f692ac3ef23ce28e5df79c04de519dbc4912c"},
{file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0e2b1fac190ae3ebfe37b979cc1ce69c81f4e4fe5746bb401dca63a9062cdaf1"},
{file = "cffi-1.17.1-cp38-cp38-win32.whl", hash = "sha256:7596d6620d3fa590f677e9ee430df2958d2d6d6de2feeae5b20e82c00b76fbf8"},
{file = "cffi-1.17.1-cp38-cp38-win_amd64.whl", hash = "sha256:78122be759c3f8a014ce010908ae03364d00a1f81ab5c7f4a7a5120607ea56e1"},
{file = "cffi-1.17.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:b2ab587605f4ba0bf81dc0cb08a41bd1c0a5906bd59243d56bad7668a6fc6c16"},
{file = "cffi-1.17.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:28b16024becceed8c6dfbc75629e27788d8a3f9030691a1dbf9821a128b22c36"},
{file = "cffi-1.17.1-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1d599671f396c4723d016dbddb72fe8e0397082b0a77a4fab8028923bec050e8"},
{file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ca74b8dbe6e8e8263c0ffd60277de77dcee6c837a3d0881d8c1ead7268c9e576"},
{file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f7f5baafcc48261359e14bcd6d9bff6d4b28d9103847c9e136694cb0501aef87"},
{file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:98e3969bcff97cae1b2def8ba499ea3d6f31ddfdb7635374834cf89a1a08ecf0"},
{file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cdf5ce3acdfd1661132f2a9c19cac174758dc2352bfe37d98aa7512c6b7178b3"},
{file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:9755e4345d1ec879e3849e62222a18c7174d65a6a92d5b346b1863912168b595"},
{file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:f1e22e8c4419538cb197e4dd60acc919d7696e5ef98ee4da4e01d3f8cfa4cc5a"},
{file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:c03e868a0b3bc35839ba98e74211ed2b05d2119be4e8a0f224fba9384f1fe02e"},
{file = "cffi-1.17.1-cp39-cp39-win32.whl", hash = "sha256:e31ae45bc2e29f6b2abd0de1cc3b9d5205aa847cafaecb8af1476a609a2f6eb7"},
{file = "cffi-1.17.1-cp39-cp39-win_amd64.whl", hash = "sha256:d016c76bdd850f3c626af19b0542c9677ba156e4ee4fccfdd7848803533ef662"},
{file = "cffi-1.17.1.tar.gz", hash = "sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824"},
]
[package.dependencies]
pycparser = "*"
[[package]]
name = "click"
version = "8.1.7"
@ -240,72 +272,91 @@ files = [
toml = ["tomli ; python_full_version <= \"3.11.0a6\""]
[[package]]
name = "greenlet"
version = "3.2.4"
description = "Lightweight in-process concurrent programming"
name = "cryptography"
version = "43.0.1"
description = "cryptography is a package which provides cryptographic recipes and primitives to Python developers."
optional = false
python-versions = ">=3.9"
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "greenlet-3.2.4-cp310-cp310-macosx_11_0_universal2.whl", hash = "sha256:8c68325b0d0acf8d91dde4e6f930967dd52a5302cd4062932a6b2e7c2969f47c"},
{file = "greenlet-3.2.4-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:94385f101946790ae13da500603491f04a76b6e4c059dab271b3ce2e283b2590"},
{file = "greenlet-3.2.4-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f10fd42b5ee276335863712fa3da6608e93f70629c631bf77145021600abc23c"},
{file = "greenlet-3.2.4-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:c8c9e331e58180d0d83c5b7999255721b725913ff6bc6cf39fa2a45841a4fd4b"},
{file = "greenlet-3.2.4-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:58b97143c9cc7b86fc458f215bd0932f1757ce649e05b640fea2e79b54cedb31"},
{file = "greenlet-3.2.4-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c2ca18a03a8cfb5b25bc1cbe20f3d9a4c80d8c3b13ba3df49ac3961af0b1018d"},
{file = "greenlet-3.2.4-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:9fe0a28a7b952a21e2c062cd5756d34354117796c6d9215a87f55e38d15402c5"},
{file = "greenlet-3.2.4-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:8854167e06950ca75b898b104b63cc646573aa5fef1353d4508ecdd1ee76254f"},
{file = "greenlet-3.2.4-cp310-cp310-win_amd64.whl", hash = "sha256:73f49b5368b5359d04e18d15828eecc1806033db5233397748f4ca813ff1056c"},
{file = "greenlet-3.2.4-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:96378df1de302bc38e99c3a9aa311967b7dc80ced1dcc6f171e99842987882a2"},
{file = "greenlet-3.2.4-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:1ee8fae0519a337f2329cb78bd7a8e128ec0f881073d43f023c7b8d4831d5246"},
{file = "greenlet-3.2.4-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:94abf90142c2a18151632371140b3dba4dee031633fe614cb592dbb6c9e17bc3"},
{file = "greenlet-3.2.4-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:4d1378601b85e2e5171b99be8d2dc85f594c79967599328f95c1dc1a40f1c633"},
{file = "greenlet-3.2.4-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:0db5594dce18db94f7d1650d7489909b57afde4c580806b8d9203b6e79cdc079"},
{file = "greenlet-3.2.4-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2523e5246274f54fdadbce8494458a2ebdcdbc7b802318466ac5606d3cded1f8"},
{file = "greenlet-3.2.4-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:1987de92fec508535687fb807a5cea1560f6196285a4cde35c100b8cd632cc52"},
{file = "greenlet-3.2.4-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:55e9c5affaa6775e2c6b67659f3a71684de4c549b3dd9afca3bc773533d284fa"},
{file = "greenlet-3.2.4-cp311-cp311-win_amd64.whl", hash = "sha256:9c40adce87eaa9ddb593ccb0fa6a07caf34015a29bf8d344811665b573138db9"},
{file = "greenlet-3.2.4-cp312-cp312-macosx_11_0_universal2.whl", hash = "sha256:3b67ca49f54cede0186854a008109d6ee71f66bd57bb36abd6d0a0267b540cdd"},
{file = "greenlet-3.2.4-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:ddf9164e7a5b08e9d22511526865780a576f19ddd00d62f8a665949327fde8bb"},
{file = "greenlet-3.2.4-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f28588772bb5fb869a8eb331374ec06f24a83a9c25bfa1f38b6993afe9c1e968"},
{file = "greenlet-3.2.4-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:5c9320971821a7cb77cfab8d956fa8e39cd07ca44b6070db358ceb7f8797c8c9"},
{file = "greenlet-3.2.4-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c60a6d84229b271d44b70fb6e5fa23781abb5d742af7b808ae3f6efd7c9c60f6"},
{file = "greenlet-3.2.4-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3b3812d8d0c9579967815af437d96623f45c0f2ae5f04e366de62a12d83a8fb0"},
{file = "greenlet-3.2.4-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:abbf57b5a870d30c4675928c37278493044d7c14378350b3aa5d484fa65575f0"},
{file = "greenlet-3.2.4-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:20fb936b4652b6e307b8f347665e2c615540d4b42b3b4c8a321d8286da7e520f"},
{file = "greenlet-3.2.4-cp312-cp312-win_amd64.whl", hash = "sha256:a7d4e128405eea3814a12cc2605e0e6aedb4035bf32697f72deca74de4105e02"},
{file = "greenlet-3.2.4-cp313-cp313-macosx_11_0_universal2.whl", hash = "sha256:1a921e542453fe531144e91e1feedf12e07351b1cf6c9e8a3325ea600a715a31"},
{file = "greenlet-3.2.4-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:cd3c8e693bff0fff6ba55f140bf390fa92c994083f838fece0f63be121334945"},
{file = "greenlet-3.2.4-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:710638eb93b1fa52823aa91bf75326f9ecdfd5e0466f00789246a5280f4ba0fc"},
{file = "greenlet-3.2.4-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:c5111ccdc9c88f423426df3fd1811bfc40ed66264d35aa373420a34377efc98a"},
{file = "greenlet-3.2.4-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:d76383238584e9711e20ebe14db6c88ddcedc1829a9ad31a584389463b5aa504"},
{file = "greenlet-3.2.4-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:23768528f2911bcd7e475210822ffb5254ed10d71f4028387e5a99b4c6699671"},
{file = "greenlet-3.2.4-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:00fadb3fedccc447f517ee0d3fd8fe49eae949e1cd0f6a611818f4f6fb7dc83b"},
{file = "greenlet-3.2.4-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:d25c5091190f2dc0eaa3f950252122edbbadbb682aa7b1ef2f8af0f8c0afefae"},
{file = "greenlet-3.2.4-cp313-cp313-win_amd64.whl", hash = "sha256:554b03b6e73aaabec3745364d6239e9e012d64c68ccd0b8430c64ccc14939a8b"},
{file = "greenlet-3.2.4-cp314-cp314-macosx_11_0_universal2.whl", hash = "sha256:49a30d5fda2507ae77be16479bdb62a660fa51b1eb4928b524975b3bde77b3c0"},
{file = "greenlet-3.2.4-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:299fd615cd8fc86267b47597123e3f43ad79c9d8a22bebdce535e53550763e2f"},
{file = "greenlet-3.2.4-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:c17b6b34111ea72fc5a4e4beec9711d2226285f0386ea83477cbb97c30a3f3a5"},
{file = "greenlet-3.2.4-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b4a1870c51720687af7fa3e7cda6d08d801dae660f75a76f3845b642b4da6ee1"},
{file = "greenlet-3.2.4-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:061dc4cf2c34852b052a8620d40f36324554bc192be474b9e9770e8c042fd735"},
{file = "greenlet-3.2.4-cp314-cp314-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:44358b9bf66c8576a9f57a590d5f5d6e72fa4228b763d0e43fee6d3b06d3a337"},
{file = "greenlet-3.2.4-cp314-cp314-win_amd64.whl", hash = "sha256:e37ab26028f12dbb0ff65f29a8d3d44a765c61e729647bf2ddfbbed621726f01"},
{file = "greenlet-3.2.4-cp39-cp39-macosx_11_0_universal2.whl", hash = "sha256:b6a7c19cf0d2742d0809a4c05975db036fdff50cd294a93632d6a310bf9ac02c"},
{file = "greenlet-3.2.4-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:27890167f55d2387576d1f41d9487ef171849ea0359ce1510ca6e06c8bece11d"},
{file = "greenlet-3.2.4-cp39-cp39-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:18d9260df2b5fbf41ae5139e1be4e796d99655f023a636cd0e11e6406cca7d58"},
{file = "greenlet-3.2.4-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:671df96c1f23c4a0d4077a325483c1503c96a1b7d9db26592ae770daa41233d4"},
{file = "greenlet-3.2.4-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:16458c245a38991aa19676900d48bd1a6f2ce3e16595051a4db9d012154e8433"},
{file = "greenlet-3.2.4-cp39-cp39-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c9913f1a30e4526f432991f89ae263459b1c64d1608c0d22a5c79c287b3c70df"},
{file = "greenlet-3.2.4-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:b90654e092f928f110e0007f572007c9727b5265f7632c2fa7415b4689351594"},
{file = "greenlet-3.2.4-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:81701fd84f26330f0d5f4944d4e92e61afe6319dcd9775e39396e39d7c3e5f98"},
{file = "greenlet-3.2.4-cp39-cp39-win32.whl", hash = "sha256:65458b409c1ed459ea899e939f0e1cdb14f58dbc803f2f93c5eab5694d32671b"},
{file = "greenlet-3.2.4-cp39-cp39-win_amd64.whl", hash = "sha256:d2e685ade4dafd447ede19c31277a224a239a0a1a4eca4e6390efedf20260cfb"},
{file = "greenlet-3.2.4.tar.gz", hash = "sha256:0dca0d95ff849f9a364385f36ab49f50065d76964944638be9691e1832e9f86d"},
{file = "cryptography-43.0.1-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:8385d98f6a3bf8bb2d65a73e17ed87a3ba84f6991c155691c51112075f9ffc5d"},
{file = "cryptography-43.0.1-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:27e613d7077ac613e399270253259d9d53872aaf657471473ebfc9a52935c062"},
{file = "cryptography-43.0.1-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:68aaecc4178e90719e95298515979814bda0cbada1256a4485414860bd7ab962"},
{file = "cryptography-43.0.1-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:de41fd81a41e53267cb020bb3a7212861da53a7d39f863585d13ea11049cf277"},
{file = "cryptography-43.0.1-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:f98bf604c82c416bc829e490c700ca1553eafdf2912a91e23a79d97d9801372a"},
{file = "cryptography-43.0.1-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:61ec41068b7b74268fa86e3e9e12b9f0c21fcf65434571dbb13d954bceb08042"},
{file = "cryptography-43.0.1-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:014f58110f53237ace6a408b5beb6c427b64e084eb451ef25a28308270086494"},
{file = "cryptography-43.0.1-cp37-abi3-win32.whl", hash = "sha256:2bd51274dcd59f09dd952afb696bf9c61a7a49dfc764c04dd33ef7a6b502a1e2"},
{file = "cryptography-43.0.1-cp37-abi3-win_amd64.whl", hash = "sha256:666ae11966643886c2987b3b721899d250855718d6d9ce41b521252a17985f4d"},
{file = "cryptography-43.0.1-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:ac119bb76b9faa00f48128b7f5679e1d8d437365c5d26f1c2c3f0da4ce1b553d"},
{file = "cryptography-43.0.1-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1bbcce1a551e262dfbafb6e6252f1ae36a248e615ca44ba302df077a846a8806"},
{file = "cryptography-43.0.1-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:58d4e9129985185a06d849aa6df265bdd5a74ca6e1b736a77959b498e0505b85"},
{file = "cryptography-43.0.1-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:d03a475165f3134f773d1388aeb19c2d25ba88b6a9733c5c590b9ff7bbfa2e0c"},
{file = "cryptography-43.0.1-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:511f4273808ab590912a93ddb4e3914dfd8a388fed883361b02dea3791f292e1"},
{file = "cryptography-43.0.1-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:80eda8b3e173f0f247f711eef62be51b599b5d425c429b5d4ca6a05e9e856baa"},
{file = "cryptography-43.0.1-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:38926c50cff6f533f8a2dae3d7f19541432610d114a70808f0926d5aaa7121e4"},
{file = "cryptography-43.0.1-cp39-abi3-win32.whl", hash = "sha256:a575913fb06e05e6b4b814d7f7468c2c660e8bb16d8d5a1faf9b33ccc569dd47"},
{file = "cryptography-43.0.1-cp39-abi3-win_amd64.whl", hash = "sha256:d75601ad10b059ec832e78823b348bfa1a59f6b8d545db3a24fd44362a1564cb"},
{file = "cryptography-43.0.1-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:ea25acb556320250756e53f9e20a4177515f012c9eaea17eb7587a8c4d8ae034"},
{file = "cryptography-43.0.1-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:c1332724be35d23a854994ff0b66530119500b6053d0bd3363265f7e5e77288d"},
{file = "cryptography-43.0.1-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:fba1007b3ef89946dbbb515aeeb41e30203b004f0b4b00e5e16078b518563289"},
{file = "cryptography-43.0.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:5b43d1ea6b378b54a1dc99dd8a2b5be47658fe9a7ce0a58ff0b55f4b43ef2b84"},
{file = "cryptography-43.0.1-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:88cce104c36870d70c49c7c8fd22885875d950d9ee6ab54df2745f83ba0dc365"},
{file = "cryptography-43.0.1-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:9d3cdb25fa98afdd3d0892d132b8d7139e2c087da1712041f6b762e4f807cc96"},
{file = "cryptography-43.0.1-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:e710bf40870f4db63c3d7d929aa9e09e4e7ee219e703f949ec4073b4294f6172"},
{file = "cryptography-43.0.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:7c05650fe8023c5ed0d46793d4b7d7e6cd9c04e68eabe5b0aeea836e37bdcec2"},
{file = "cryptography-43.0.1.tar.gz", hash = "sha256:203e92a75716d8cfb491dc47c79e17d0d9207ccffcbcb35f598fbe463ae3444d"},
]
[package.dependencies]
cffi = {version = ">=1.12", markers = "platform_python_implementation != \"PyPy\""}
[package.extras]
docs = ["sphinx (>=5.3.0)", "sphinx-rtd-theme (>=1.1.1)"]
docstest = ["pyenchant (>=1.6.11)", "readme-renderer", "sphinxcontrib-spelling (>=4.0.1)"]
nox = ["nox"]
pep8test = ["check-sdist", "click", "mypy", "ruff"]
sdist = ["build"]
ssh = ["bcrypt (>=3.1.5)"]
test = ["certifi", "cryptography-vectors (==43.0.1)", "pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
[[package]]
name = "dnspython"
version = "2.6.1"
description = "DNS toolkit"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "dnspython-2.6.1-py3-none-any.whl", hash = "sha256:5ef3b9680161f6fa89daf8ad451b5f1a33b18ae8a1c6778cdf4b43f08c0a6e50"},
{file = "dnspython-2.6.1.tar.gz", hash = "sha256:e8f0f9c23a7b7cb99ded64e6c3a6f3e701d78f50c55e002b839dea7225cff7cc"},
]
[package.extras]
docs = ["Sphinx", "furo"]
test = ["objgraph", "psutil", "setuptools"]
dev = ["black (>=23.1.0)", "coverage (>=7.0)", "flake8 (>=7)", "mypy (>=1.8)", "pylint (>=3)", "pytest (>=7.4)", "pytest-cov (>=4.1.0)", "sphinx (>=7.2.0)", "twine (>=4.0.0)", "wheel (>=0.42.0)"]
dnssec = ["cryptography (>=41)"]
doh = ["h2 (>=4.1.0)", "httpcore (>=1.0.0)", "httpx (>=0.26.0)"]
doq = ["aioquic (>=0.9.25)"]
idna = ["idna (>=3.6)"]
trio = ["trio (>=0.23)"]
wmi = ["wmi (>=1.5.1)"]
[[package]]
name = "email-validator"
version = "2.2.0"
description = "A robust email address syntax and deliverability validation library."
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "email_validator-2.2.0-py3-none-any.whl", hash = "sha256:561977c2d73ce3611850a06fa56b414621e0c8faa9d66f2611407d87465da631"},
{file = "email_validator-2.2.0.tar.gz", hash = "sha256:cb690f344c617a714f22e66ae771445a1ceb46821152df8e165c5f9a364582b7"},
]
[package.dependencies]
dnspython = ">=2.0.0"
idna = ">=2.0.0"
[[package]]
name = "gunicorn"
@ -362,6 +413,21 @@ pytz = ["pytz (>=2014.1)"]
redis = ["redis (>=3.0.0)"]
zoneinfo = ["backports.zoneinfo (>=0.2.1) ; python_version < \"3.9\"", "tzdata (>=2024.1) ; sys_platform == \"win32\" or sys_platform == \"emscripten\""]
[[package]]
name = "idna"
version = "3.10"
description = "Internationalized Domain Names in Applications (IDNA)"
optional = false
python-versions = ">=3.6"
groups = ["main"]
files = [
{file = "idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3"},
{file = "idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9"},
]
[package.extras]
all = ["flake8 (>=7.1.1)", "mypy (>=1.11.2)", "pytest (>=8.3.2)", "ruff (>=0.6.2)"]
[[package]]
name = "iniconfig"
version = "2.0.0"
@ -375,24 +441,22 @@ files = [
]
[[package]]
name = "mako"
version = "1.3.10"
description = "A super-fast templating language that borrows the best ideas from the existing templating languages."
name = "jinja2"
version = "3.1.4"
description = "A very fast and expressive template engine."
optional = false
python-versions = ">=3.8"
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "mako-1.3.10-py3-none-any.whl", hash = "sha256:baef24a52fc4fc514a0887ac600f9f1cff3d82c61d4d700a1fa84d597b88db59"},
{file = "mako-1.3.10.tar.gz", hash = "sha256:99579a6f39583fa7e5630a28c3c1f440e4e97a414b80372649c0ce338da2ea28"},
{file = "jinja2-3.1.4-py3-none-any.whl", hash = "sha256:bc5dd2abb727a5319567b7a813e6a2e7318c39f4f487cfe6c89c6f9c7d25197d"},
{file = "jinja2-3.1.4.tar.gz", hash = "sha256:4a3aee7acbbe7303aede8e9648d13b8bf88a429282aa6122a993f0ac800cb369"},
]
[package.dependencies]
MarkupSafe = ">=0.9.2"
MarkupSafe = ">=2.0"
[package.extras]
babel = ["Babel"]
lingua = ["lingua"]
testing = ["pytest"]
i18n = ["Babel (>=2.7)"]
[[package]]
name = "markupsafe"
@ -464,6 +528,22 @@ files = [
{file = "MarkupSafe-2.1.5.tar.gz", hash = "sha256:d283d37a890ba4c1ae73ffadf8046435c76e7bc2247bbb63c00bd1a709c6544b"},
]
[[package]]
name = "multipart"
version = "1.2.1"
description = "Parser for multipart/form-data"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "multipart-1.2.1-py3-none-any.whl", hash = "sha256:c03dc203bc2e67f6b46a599467ae0d87cf71d7530504b2c1ff4a9ea21d8b8c8c"},
{file = "multipart-1.2.1.tar.gz", hash = "sha256:829b909b67bc1ad1c6d4488fcdc6391c2847842b08323addf5200db88dbe9480"},
]
[package.extras]
dev = ["build", "pytest", "pytest-cov", "twine"]
docs = ["sphinx (>=8,<9)", "sphinx-autobuild"]
[[package]]
name = "mypy-extensions"
version = "1.0.0"
@ -500,6 +580,17 @@ files = [
{file = "pathspec-0.12.1.tar.gz", hash = "sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712"},
]
[[package]]
name = "peewee"
version = "3.17.6"
description = "a little orm"
optional = false
python-versions = "*"
groups = ["main"]
files = [
{file = "peewee-3.17.6.tar.gz", hash = "sha256:cea5592c6f4da1592b7cff8eaf655be6648a1f5857469e30037bf920c03fb8fb"},
]
[[package]]
name = "platformdirs"
version = "4.3.6"
@ -533,14 +624,26 @@ files = [
dev = ["pre-commit", "tox"]
testing = ["pytest", "pytest-benchmark"]
[[package]]
name = "pycparser"
version = "2.22"
description = "C parser in Python"
optional = false
python-versions = ">=3.8"
groups = ["main"]
markers = "platform_python_implementation != \"PyPy\""
files = [
{file = "pycparser-2.22-py3-none-any.whl", hash = "sha256:c3702b6d3dd8c7abc1afa565d7e63d53a1d0bd86cdc24edd75470f4de499cfcc"},
{file = "pycparser-2.22.tar.gz", hash = "sha256:491c8be9c040f5390f5bf44a5b07752bd07f56edf992381b05c701439eec10f6"},
]
[[package]]
name = "pydantic"
version = "2.9.2"
description = "Data validation using Python type hints"
optional = true
optional = false
python-versions = ">=3.8"
groups = ["main"]
markers = "extra == \"pydantic\""
files = [
{file = "pydantic-2.9.2-py3-none-any.whl", hash = "sha256:f048cec7b26778210e28a0459867920654d48e5e62db0958433636cde4254f12"},
{file = "pydantic-2.9.2.tar.gz", hash = "sha256:d155cef71265d1e9807ed1c32b4c8deec042a44a50a4188b25ac67ecd81a9c0f"},
@ -562,10 +665,9 @@ timezone = ["tzdata ; python_version >= \"3.9\" and sys_platform == \"win32\""]
name = "pydantic-core"
version = "2.23.4"
description = "Core functionality for Pydantic validation and serialization"
optional = true
optional = false
python-versions = ">=3.8"
groups = ["main"]
markers = "extra == \"pydantic\""
files = [
{file = "pydantic_core-2.23.4-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:b10bd51f823d891193d4717448fab065733958bdb6a6b351967bd349d48d5c9b"},
{file = "pydantic_core-2.23.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:4fc714bdbfb534f94034efaa6eadd74e5b93c8fa6315565a222f7b6f42ca1166"},
@ -722,102 +824,6 @@ files = [
{file = "sortedcontainers-2.4.0.tar.gz", hash = "sha256:25caa5a06cc30b6b83d11423433f65d1f9d76c4c6a0c90e3379eaa43b9bfdb88"},
]
[[package]]
name = "sqlalchemy"
version = "2.0.44"
description = "Database Abstraction Library"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "SQLAlchemy-2.0.44-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:471733aabb2e4848d609141a9e9d56a427c0a038f4abf65dd19d7a21fd563632"},
{file = "SQLAlchemy-2.0.44-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:48bf7d383a35e668b984c805470518b635d48b95a3c57cb03f37eaa3551b5f9f"},
{file = "SQLAlchemy-2.0.44-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bf4bb6b3d6228fcf3a71b50231199fb94d2dd2611b66d33be0578ea3e6c2726"},
{file = "SQLAlchemy-2.0.44-cp37-cp37m-musllinux_1_2_aarch64.whl", hash = "sha256:e998cf7c29473bd077704cea3577d23123094311f59bdc4af551923b168332b1"},
{file = "SQLAlchemy-2.0.44-cp37-cp37m-musllinux_1_2_x86_64.whl", hash = "sha256:ebac3f0b5732014a126b43c2b7567f2f0e0afea7d9119a3378bde46d3dcad88e"},
{file = "SQLAlchemy-2.0.44-cp37-cp37m-win32.whl", hash = "sha256:3255d821ee91bdf824795e936642bbf43a4c7cedf5d1aed8d24524e66843aa74"},
{file = "SQLAlchemy-2.0.44-cp37-cp37m-win_amd64.whl", hash = "sha256:78e6c137ba35476adb5432103ae1534f2f5295605201d946a4198a0dea4b38e7"},
{file = "sqlalchemy-2.0.44-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:7c77f3080674fc529b1bd99489378c7f63fcb4ba7f8322b79732e0258f0ea3ce"},
{file = "sqlalchemy-2.0.44-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:4c26ef74ba842d61635b0152763d057c8d48215d5be9bb8b7604116a059e9985"},
{file = "sqlalchemy-2.0.44-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f4a172b31785e2f00780eccab00bc240ccdbfdb8345f1e6063175b3ff12ad1b0"},
{file = "sqlalchemy-2.0.44-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f9480c0740aabd8cb29c329b422fb65358049840b34aba0adf63162371d2a96e"},
{file = "sqlalchemy-2.0.44-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:17835885016b9e4d0135720160db3095dc78c583e7b902b6be799fb21035e749"},
{file = "sqlalchemy-2.0.44-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:cbe4f85f50c656d753890f39468fcd8190c5f08282caf19219f684225bfd5fd2"},
{file = "sqlalchemy-2.0.44-cp310-cp310-win32.whl", hash = "sha256:2fcc4901a86ed81dc76703f3b93ff881e08761c63263c46991081fd7f034b165"},
{file = "sqlalchemy-2.0.44-cp310-cp310-win_amd64.whl", hash = "sha256:9919e77403a483ab81e3423151e8ffc9dd992c20d2603bf17e4a8161111e55f5"},
{file = "sqlalchemy-2.0.44-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:0fe3917059c7ab2ee3f35e77757062b1bea10a0b6ca633c58391e3f3c6c488dd"},
{file = "sqlalchemy-2.0.44-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:de4387a354ff230bc979b46b2207af841dc8bf29847b6c7dbe60af186d97aefa"},
{file = "sqlalchemy-2.0.44-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c3678a0fb72c8a6a29422b2732fe423db3ce119c34421b5f9955873eb9b62c1e"},
{file = "sqlalchemy-2.0.44-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3cf6872a23601672d61a68f390e44703442639a12ee9dd5a88bbce52a695e46e"},
{file = "sqlalchemy-2.0.44-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:329aa42d1be9929603f406186630135be1e7a42569540577ba2c69952b7cf399"},
{file = "sqlalchemy-2.0.44-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:70e03833faca7166e6a9927fbee7c27e6ecde436774cd0b24bbcc96353bce06b"},
{file = "sqlalchemy-2.0.44-cp311-cp311-win32.whl", hash = "sha256:253e2f29843fb303eca6b2fc645aca91fa7aa0aa70b38b6950da92d44ff267f3"},
{file = "sqlalchemy-2.0.44-cp311-cp311-win_amd64.whl", hash = "sha256:7a8694107eb4308a13b425ca8c0e67112f8134c846b6e1f722698708741215d5"},
{file = "sqlalchemy-2.0.44-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:72fea91746b5890f9e5e0997f16cbf3d53550580d76355ba2d998311b17b2250"},
{file = "sqlalchemy-2.0.44-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:585c0c852a891450edbb1eaca8648408a3cc125f18cf433941fa6babcc359e29"},
{file = "sqlalchemy-2.0.44-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9b94843a102efa9ac68a7a30cd46df3ff1ed9c658100d30a725d10d9c60a2f44"},
{file = "sqlalchemy-2.0.44-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:119dc41e7a7defcefc57189cfa0e61b1bf9c228211aba432b53fb71ef367fda1"},
{file = "sqlalchemy-2.0.44-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:0765e318ee9179b3718c4fd7ba35c434f4dd20332fbc6857a5e8df17719c24d7"},
{file = "sqlalchemy-2.0.44-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:2e7b5b079055e02d06a4308d0481658e4f06bc7ef211567edc8f7d5dce52018d"},
{file = "sqlalchemy-2.0.44-cp312-cp312-win32.whl", hash = "sha256:846541e58b9a81cce7dee8329f352c318de25aa2f2bbe1e31587eb1f057448b4"},
{file = "sqlalchemy-2.0.44-cp312-cp312-win_amd64.whl", hash = "sha256:7cbcb47fd66ab294703e1644f78971f6f2f1126424d2b300678f419aa73c7b6e"},
{file = "sqlalchemy-2.0.44-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:ff486e183d151e51b1d694c7aa1695747599bb00b9f5f604092b54b74c64a8e1"},
{file = "sqlalchemy-2.0.44-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0b1af8392eb27b372ddb783b317dea0f650241cea5bd29199b22235299ca2e45"},
{file = "sqlalchemy-2.0.44-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2b61188657e3a2b9ac4e8f04d6cf8e51046e28175f79464c67f2fd35bceb0976"},
{file = "sqlalchemy-2.0.44-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b87e7b91a5d5973dda5f00cd61ef72ad75a1db73a386b62877d4875a8840959c"},
{file = "sqlalchemy-2.0.44-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:15f3326f7f0b2bfe406ee562e17f43f36e16167af99c4c0df61db668de20002d"},
{file = "sqlalchemy-2.0.44-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:1e77faf6ff919aa8cd63f1c4e561cac1d9a454a191bb864d5dd5e545935e5a40"},
{file = "sqlalchemy-2.0.44-cp313-cp313-win32.whl", hash = "sha256:ee51625c2d51f8baadf2829fae817ad0b66b140573939dd69284d2ba3553ae73"},
{file = "sqlalchemy-2.0.44-cp313-cp313-win_amd64.whl", hash = "sha256:c1c80faaee1a6c3428cecf40d16a2365bcf56c424c92c2b6f0f9ad204b899e9e"},
{file = "sqlalchemy-2.0.44-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2fc44e5965ea46909a416fff0af48a219faefd5773ab79e5f8a5fcd5d62b2667"},
{file = "sqlalchemy-2.0.44-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:dc8b3850d2a601ca2320d081874033684e246d28e1c5e89db0864077cfc8f5a9"},
{file = "sqlalchemy-2.0.44-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d733dec0614bb8f4bcb7c8af88172b974f685a31dc3a65cca0527e3120de5606"},
{file = "sqlalchemy-2.0.44-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:22be14009339b8bc16d6b9dc8780bacaba3402aa7581658e246114abbd2236e3"},
{file = "sqlalchemy-2.0.44-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:357bade0e46064f88f2c3a99808233e67b0051cdddf82992379559322dfeb183"},
{file = "sqlalchemy-2.0.44-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:4848395d932e93c1595e59a8672aa7400e8922c39bb9b0668ed99ac6fa867822"},
{file = "sqlalchemy-2.0.44-cp38-cp38-win32.whl", hash = "sha256:2f19644f27c76f07e10603580a47278abb2a70311136a7f8fd27dc2e096b9013"},
{file = "sqlalchemy-2.0.44-cp38-cp38-win_amd64.whl", hash = "sha256:1df4763760d1de0dfc8192cc96d8aa293eb1a44f8f7a5fbe74caf1b551905c5e"},
{file = "sqlalchemy-2.0.44-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:f7027414f2b88992877573ab780c19ecb54d3a536bef3397933573d6b5068be4"},
{file = "sqlalchemy-2.0.44-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:3fe166c7d00912e8c10d3a9a0ce105569a31a3d0db1a6e82c4e0f4bf16d5eca9"},
{file = "sqlalchemy-2.0.44-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3caef1ff89b1caefc28f0368b3bde21a7e3e630c2eddac16abd9e47bd27cc36a"},
{file = "sqlalchemy-2.0.44-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cc2856d24afa44295735e72f3c75d6ee7fdd4336d8d3a8f3d44de7aa6b766df2"},
{file = "sqlalchemy-2.0.44-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:11bac86b0deada30b6b5f93382712ff0e911fe8d31cb9bf46e6b149ae175eff0"},
{file = "sqlalchemy-2.0.44-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:4d18cd0e9a0f37c9f4088e50e3839fcb69a380a0ec957408e0b57cff08ee0a26"},
{file = "sqlalchemy-2.0.44-cp39-cp39-win32.whl", hash = "sha256:9e9018544ab07614d591a26c1bd4293ddf40752cc435caf69196740516af7100"},
{file = "sqlalchemy-2.0.44-cp39-cp39-win_amd64.whl", hash = "sha256:8e0e4e66fd80f277a8c3de016a81a554e76ccf6b8d881ee0b53200305a8433f6"},
{file = "sqlalchemy-2.0.44-py3-none-any.whl", hash = "sha256:19de7ca1246fbef9f9d1bff8f1ab25641569df226364a0e40457dc5457c54b05"},
{file = "sqlalchemy-2.0.44.tar.gz", hash = "sha256:0ae7454e1ab1d780aee69fd2aae7d6b8670a581d8847f2d1e0f7ddfbf47e5a22"},
]
[package.dependencies]
greenlet = {version = ">=1", markers = "platform_machine == \"aarch64\" or platform_machine == \"ppc64le\" or platform_machine == \"x86_64\" or platform_machine == \"amd64\" or platform_machine == \"AMD64\" or platform_machine == \"win32\" or platform_machine == \"WIN32\""}
typing-extensions = ">=4.6.0"
[package.extras]
aiomysql = ["aiomysql (>=0.2.0)", "greenlet (>=1)"]
aioodbc = ["aioodbc", "greenlet (>=1)"]
aiosqlite = ["aiosqlite", "greenlet (>=1)", "typing_extensions (!=3.10.0.1)"]
asyncio = ["greenlet (>=1)"]
asyncmy = ["asyncmy (>=0.2.3,!=0.2.4,!=0.2.6)", "greenlet (>=1)"]
mariadb-connector = ["mariadb (>=1.0.1,!=1.1.2,!=1.1.5,!=1.1.10)"]
mssql = ["pyodbc"]
mssql-pymssql = ["pymssql"]
mssql-pyodbc = ["pyodbc"]
mypy = ["mypy (>=0.910)"]
mysql = ["mysqlclient (>=1.4.0)"]
mysql-connector = ["mysql-connector-python"]
oracle = ["cx_oracle (>=8)"]
oracle-oracledb = ["oracledb (>=1.0.1)"]
postgresql = ["psycopg2 (>=2.7)"]
postgresql-asyncpg = ["asyncpg", "greenlet (>=1)"]
postgresql-pg8000 = ["pg8000 (>=1.29.1)"]
postgresql-psycopg = ["psycopg (>=3.0.7)"]
postgresql-psycopg2binary = ["psycopg2-binary"]
postgresql-psycopg2cffi = ["psycopg2cffi"]
postgresql-psycopgbinary = ["psycopg[binary] (>=3.0.7)"]
pymysql = ["pymysql"]
sqlcipher = ["sqlcipher3_binary"]
[[package]]
name = "typing-extensions"
version = "4.12.2"
@ -830,10 +836,7 @@ files = [
{file = "typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8"},
]
[extras]
pydantic = ["pydantic"]
[metadata]
lock-version = "2.1"
python-versions = "^3.11"
content-hash = "002b27684674c055d24144488bc7ad9f564b2788ab51d985c4b64fe1589b7ccb"
content-hash = "81a82cbbeda308234dd260a0d9edfd0e4e6264e2f40eb908049e2461c1438eaa"

View file

@ -1,11 +1,11 @@
[project]
name = "spiderweb-framework"
version = "2.0.0"
version = "1.4.0"
description = "A small web framework, just big enough for a spider."
authors = [{name="Joe Kaufeld", email="opensource@joekaufeld.com"}]
readme = "README.md"
packages = [{include = "spiderweb"}]
license = "MIT"
license = "LICENSE.txt"
exclude = [
"tests/*",
"example.py",
@ -27,19 +27,15 @@ classifiers = [
"Topic :: Internet :: WWW/HTTP :: WSGI :: Application",
"Topic :: Software Development :: Libraries :: Application Frameworks",
]
dependencies = [
"advanced-alchemy (>=1.6.3,<2.0.0)"
]
[tool.poetry.dependencies]
python = "^3.11"
SQLAlchemy = "^2.0.32"
peewee = "^3.17.6"
jinja2 = "^3.1.4"
cryptography = "^43.0.0"
email-validator = "^2.2.0"
[project.optional-dependencies]
pydantic = ["pydantic>=2.8.2,<3"]
pydantic = "^2.8.2"
multipart = "^1.2.1"
[tool.poetry.group.dev.dependencies]
ruff = "^0.5.5"
@ -50,11 +46,8 @@ hypothesis = "^6.111.2"
coverage = "^7.6.1"
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.hatch.build.targets.wheel]
packages = ["spiderweb/spiderweb-framework"]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
[tool.poetry_bumpversion.file."spiderweb/constants.py"]
@ -93,4 +86,4 @@ exclude_also = [
"if TYPE_CHECKING:",
]
ignore_errors = true

View file

@ -1,3 +1,5 @@
from peewee import DatabaseProxy
DEFAULT_ALLOWED_METHODS = ["POST", "GET", "PUT", "PATCH", "DELETE"]
DEFAULT_ENCODING = "UTF-8"
__version__ = "1.3.1"
@ -5,6 +7,8 @@ __version__ = "1.3.1"
# https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Set-Cookie
REGEX_COOKIE_NAME = r"^[a-zA-Z0-9\s\(\)<>@,;:\/\\\[\]\?=\{\}\"\t]*$"
DATABASE_PROXY = DatabaseProxy()
DEFAULT_CORS_ALLOW_METHODS = (
"DELETE",
"GET",

View file

@ -1,28 +1,101 @@
from __future__ import annotations
from peewee import Model, Field, SchemaManager
from pathlib import Path
from typing import Optional, Union
from sqlalchemy import create_engine
from sqlalchemy.engine import Engine
from sqlalchemy.orm import declarative_base, sessionmaker, Session as SASession
# Base class for SQLAlchemy models used internally by Spiderweb
Base = declarative_base()
# Type alias for sessions
DBSession = SASession
from spiderweb.constants import DATABASE_PROXY
def create_sqlite_engine(db_path: Union[str, Path]) -> Engine:
"""Create a SQLite engine from a file path."""
path = Path(db_path)
# Ensure directory exists
if path.parent and not path.parent.exists():
path.parent.mkdir(parents=True, exist_ok=True)
return create_engine(f"sqlite:///{path}", future=True)
class MigrationsNeeded(ExceptionGroup): ...
def create_session_factory(engine: Engine):
"""Return a configured sessionmaker bound to the given engine."""
return sessionmaker(bind=engine, autoflush=False, autocommit=False, future=True)
class MigrationRequired(Exception): ...
class SpiderwebModel(Model):
@classmethod
def check_for_needed_migration(cls):
if hasattr(cls._meta, "skip_migration_check"):
return
current_model_fields: dict[str, Field] = cls._meta.fields
current_db_fields = {
c.name: {
"data_type": c.data_type,
"null": c.null,
"primary_key": c.primary_key,
"default": c.default,
}
for c in cls._meta.database.get_columns(cls._meta.table_name)
}
problems = []
s = SchemaManager(cls, cls._meta.database)
ctx = s._create_context()
for field_name, field_obj in current_model_fields.items():
db_version = current_db_fields.get(field_obj.column_name)
if not db_version:
problems.append(
MigrationRequired(f"Field {field_name} not found in DB.")
)
continue
if field_obj.field_type == "VARCHAR":
field_obj.max_length = field_obj.max_length or 255
if (
cls._meta.fields[field_name].ddl_datatype(ctx).sql
!= db_version["data_type"]
):
problems.append(
MigrationRequired(
f"CharField `{field_name}` has changed the field type."
)
)
else:
if (
cls._meta.database.get_context_options()["field_types"][
field_obj.field_type
]
!= db_version["data_type"]
):
problems.append(
MigrationRequired(
f"Field `{field_name}` has changed the field type."
)
)
if field_obj.null != db_version["null"]:
problems.append(
MigrationRequired(
f"Field `{field_name}` has changed the nullability."
)
)
if field_obj.__class__.__name__ == "BooleanField":
if field_obj.default is False and db_version["default"] not in (
False,
None,
0,
):
problems.append(
MigrationRequired(
f"BooleanField `{field_name}` has changed the default value."
)
)
elif field_obj.default is True and db_version["default"] not in (
True,
1,
):
problems.append(
MigrationRequired(
f"BooleanField `{field_name}` has changed the default value."
)
)
else:
if field_obj.default != db_version["default"]:
problems.append(
MigrationRequired(
f"Field `{field_name}` has changed the default value."
)
)
if problems:
raise MigrationsNeeded(f"The model {cls} requires migrations.", problems)
class Meta:
database = DATABASE_PROXY
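
A quick sketch of how an application model would plug into this: any subclass of `SpiderwebModel` inherits the `DATABASE_PROXY` connection, gets its table created at router startup, and is run through `check_for_needed_migration()`. The `Spider` model below is a hypothetical example, not part of this changeset.

```python
# Hypothetical application model (not part of this diff): subclassing
# SpiderwebModel means the table is created via db.create_tables() at startup
# and check_for_needed_migration() raises MigrationsNeeded if the schema drifts.
from peewee import CharField, IntegerField

from spiderweb.db import SpiderwebModel


class Spider(SpiderwebModel):
    name = CharField(max_length=100)
    leg_count = IntegerField(default=8)
```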

spiderweb/files.py (new file, 49 lines)
View file

@ -0,0 +1,49 @@
import random
import string
from multipart import MultipartPart
class MediaFile:
# This class acts as a sort of container for uploaded files.
# Rather than trying to subclass the MultipartPart class and deal with
# the complexities of multipart parsing, we just use this class to
# add the save functionality for the media folder. Also makes the most
# common attributes available directly on the instance.
def __init__(self, server, multipart_part: MultipartPart):
self._file: MultipartPart = multipart_part
self.filename: str = self._file.filename
self.content_type: str = self._file.content_type
self.server = server
#: Part size in bytes.
self.size = self._file.size
#: Part name.
self.name = self._file.name
#: Charset as defined in the part header, or the parser default charset.
self.charset = self._file.charset
#: All part headers as a list of (name, value) pairs.
self.headerlist = self._file.headerlist
self.memfile_limit = self._file.memfile_limit
self.buffer_size = self._file.buffer_size
def get_random_suffix(self) -> str:
"""Generate a random 6 character suffix."""
return "".join(random.choices(string.ascii_letters, k=6))
def save(self):
file_path = self.server.BASE_DIR / self.server.media_dir / self._file.filename
if file_path.exists():
# If the file already exists, append a random suffix to the filename
suffix = self.get_random_suffix()
file_path = file_path.with_name(
f"{file_path.stem}_[{suffix}]{file_path.suffix}"
)
self._file.save_as(file_path)
return file_path
def read(self):
return self._file.file.read()
def seek(self, offset: int, whence: int = 0):
return self._file.file.seek(offset, whence)
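
A minimal sketch of how `MediaFile` surfaces in a handler, assuming the router was constructed with a valid `media_dir` (see the configuration hunk further down): `parse_form_data` puts `MediaFile` instances into `request.FILES`, and `save()` writes into the media directory, appending a random suffix on filename collisions.

```python
# Hypothetical upload handler (assumes a router with media_dir configured).
from spiderweb.response import HttpResponse


def upload(request):
    media_file = request.FILES["file"]   # MediaFile wrapping a MultipartPart
    saved_path = media_file.save()       # Path; suffixed if the name collides
    return HttpResponse(f"Saved {saved_path.name} ({media_file.size} bytes)")
```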

View file

@ -3,15 +3,14 @@ import logging
import pathlib
import re
import traceback
import urllib.parse as urlparse
from logging import Logger
from pathlib import Path
from threading import Thread
from typing import Optional, Callable, Sequence, Literal
from wsgiref.simple_server import WSGIServer
from jinja2 import BaseLoader, FileSystemLoader
from sqlalchemy import create_engine
from sqlalchemy.engine import Engine
from peewee import Database, SqliteDatabase
from spiderweb.middleware import MiddlewareMixin
from spiderweb.constants import (
@ -19,10 +18,11 @@ from spiderweb.constants import (
DEFAULT_CORS_ALLOW_HEADERS,
)
from spiderweb.constants import (
DATABASE_PROXY,
DEFAULT_ENCODING,
DEFAULT_ALLOWED_METHODS,
)
from spiderweb.db import Base, create_sqlite_engine, create_session_factory
from spiderweb.db import SpiderwebModel
from spiderweb.default_views import (
http403, # noqa: F401
http404, # noqa: F401
@ -67,15 +67,17 @@ class SpiderwebRouter(LocalServerMixin, MiddlewareMixin, RoutesMixin, FernetMixi
cors_allow_credentials: bool = False,
cors_allow_private_network: bool = False,
csrf_trusted_origins: Sequence[str] = None,
db: Optional[Engine | str] = None,
db: Optional[Database] = None,
debug: bool = False,
gzip_compression_level: int = 6,
gzip_minimum_response_length: int = 500,
templates_dirs: Sequence[str] = None,
middleware: Sequence[str] = None,
append_slash: bool = False,
staticfiles_dirs: Sequence[str] = None,
staticfiles_dirs: Sequence[str | Path] = None,
static_url: str = "static",
media_dir: str | Path = None,
media_url: str = "media",
routes: Sequence[tuple[str, Callable] | tuple[str, Callable, dict]] = None,
error_routes: dict[int, Callable] = None,
secret_key: str = None,
@ -96,9 +98,12 @@ class SpiderwebRouter(LocalServerMixin, MiddlewareMixin, RoutesMixin, FernetMixi
self.port = port if port else 8000
self.server_address = (self.addr, self.port)
self.append_slash = append_slash
self.fix_route_starting_slash = True
self.templates_dirs = templates_dirs
self.staticfiles_dirs = staticfiles_dirs
self.media_dir = media_dir
self.static_url = static_url
self.media_url = media_url
self._middleware: list[str] = middleware or []
self.middleware: list[Callable] = []
self.secret_key = secret_key if secret_key else self.generate_key()
@ -148,21 +153,12 @@ class SpiderwebRouter(LocalServerMixin, MiddlewareMixin, RoutesMixin, FernetMixi
self.init_fernet()
self.init_middleware()
# Database setup (SQLAlchemy)
if isinstance(db, Engine):
self.db_engine = db
elif isinstance(db, str):
# treat as URL if it looks like one, otherwise as a filesystem path
if "://" in db:
self.db_engine = create_engine(db, future=True)
else:
self.db_engine = create_sqlite_engine(self.BASE_DIR / db)
else:
self.db_engine = create_sqlite_engine(self.BASE_DIR / "spiderweb.db")
self.db_session_factory = create_session_factory(self.db_engine)
# Create internal tables (e.g., sessions)
Base.metadata.create_all(self.db_engine)
self.db = db or SqliteDatabase(self.BASE_DIR / "spiderweb.db")
# give the models the db connection
DATABASE_PROXY.initialize(self.db)
self.db.create_tables(SpiderwebModel.__subclasses__())
for model in SpiderwebModel.__subclasses__():
model.check_for_needed_migration()
if self.routes:
self.add_routes()
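
Since `db` now takes a peewee `Database` rather than a SQLAlchemy engine or URL, pointing Spiderweb at another backend is a matter of passing a different peewee database object. A hedged sketch (the PostgreSQL name and credentials are placeholders):

```python
# Hypothetical configuration: any peewee Database works here, since it is
# handed to DATABASE_PROXY.initialize() and used for create_tables().
from peewee import PostgresqlDatabase

from spiderweb import SpiderwebRouter

app = SpiderwebRouter(
    db=PostgresqlDatabase("myapp", user="postgres", password="***", host="localhost"),
)
```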
@ -209,6 +205,21 @@ class SpiderwebRouter(LocalServerMixin, MiddlewareMixin, RoutesMixin, FernetMixi
" files will not be served."
)
if self.media_dir:
self.media_dir = pathlib.Path(self.media_dir)
if not pathlib.Path(self.BASE_DIR / self.media_dir).exists():
self.log.error(
f"Media directory '{str(self.media_dir)}' does not exist."
)
raise ConfigError
if self.debug:
self.add_route(rf"/{self.media_url}/<path:filename>", send_file)
else:
self.log.warning(
"`media_dir` is set, but `debug` is set to FALSE."
" Media files will not be served."
)
# finally, run the startup checks to verify everything is correct and happy.
self.log.info("Run startup checks...")
self.run_middleware_checks()
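
The media handling above mirrors the existing staticfiles behaviour: the directory must exist (otherwise `ConfigError` is raised), and the `/<media_url>/...` route is only registered when `debug` is true. A minimal sketch of the corresponding configuration:

```python
# Hypothetical configuration matching the checks above.
from spiderweb import SpiderwebRouter

app = SpiderwebRouter(
    media_dir="media",   # resolved against BASE_DIR; must already exist
    media_url="media",   # served at /media/<filename>, but only when debug=True
    debug=True,
)
```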
@ -274,10 +285,6 @@ class SpiderwebRouter(LocalServerMixin, MiddlewareMixin, RoutesMixin, FernetMixi
server=self,
)
def get_db_session(self):
"""Return a new SQLAlchemy session bound to the application's engine."""
return self.db_session_factory()
def send_error_response(
self, start_response, request: Request, e: SpiderwebNetworkException
):
@ -296,6 +303,7 @@ class SpiderwebRouter(LocalServerMixin, MiddlewareMixin, RoutesMixin, FernetMixi
return resp
except ConnectionAbortedError as e:
self.log.error(f"{request.method} {request.path} : {e}")
return HttpResponse(status_code=500)
def prepare_and_fire_response(self, start_response, request, resp) -> list[bytes]:
try:
@ -316,7 +324,7 @@ class SpiderwebRouter(LocalServerMixin, MiddlewareMixin, RoutesMixin, FernetMixi
except Exception:
self.log.error(traceback.format_exc())
self.fire_response(
return self.fire_response(
start_response, request, self.get_error_route(500)(request)
)
@ -347,13 +355,6 @@ class SpiderwebRouter(LocalServerMixin, MiddlewareMixin, RoutesMixin, FernetMixi
if not self.check_valid_host(request):
handler = self.get_error_route(403)
if request.is_form_request():
form_data = urlparse.parse_qs(request.content)
for key, value in form_data.items():
if len(value) == 1:
form_data[key] = value[0]
setattr(request, request.method, form_data)
try:
if handler:
abort_view = self.process_request_middleware(request)

View file

@ -13,9 +13,7 @@ from spiderweb.server_checks import ServerCheck
class CheckForSessionMiddleware(ServerCheck):
SESSION_MIDDLEWARE_NOT_FOUND = (
"Session middleware is not enabled. It must be listed above"
"CSRFMiddleware in the middleware list. Add"
" 'spiderweb.middleware.sessions.SessionMiddleware' to your"
" `middleware` list."
"CSRFMiddleware in the middleware list."
)
def check(self) -> Optional[Exception]:
@ -28,8 +26,8 @@ class CheckForSessionMiddleware(ServerCheck):
class VerifyCorrectMiddlewarePlacement(ServerCheck):
SESSION_MIDDLEWARE_BELOW_CSRF = (
"Session middleware is enabled, but must be listed above"
" CSRFMiddleware in the middleware list."
"SessionMiddleware is enabled, but it must be listed above"
"CSRFMiddleware in the middleware list."
)
def check(self) -> Optional[Exception]:
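
Both checks boil down to one ordering rule. A minimal sketch of a middleware list that satisfies them, using the dotted paths that appear elsewhere in this changeset:

```python
# Hypothetical router setup: SessionMiddleware listed above CSRFMiddleware.
from spiderweb import SpiderwebRouter

app = SpiderwebRouter(
    middleware=[
        "spiderweb.middleware.sessions.SessionMiddleware",
        "spiderweb.middleware.csrf.CSRFMiddleware",
    ],
)
```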

View file

@ -1,26 +1,8 @@
import inspect
from typing import get_type_hints
try: # pragma: no cover - import guard
from pydantic import BaseModel # type: ignore
from pydantic_core._pydantic_core import ValidationError # type: ignore
PYDANTIC_AVAILABLE = True
except Exception: # pragma: no cover - executed only when pydantic isn't installed
PYDANTIC_AVAILABLE = False
class BaseModel: # minimal stub to allow module import without pydantic
@classmethod
def parse_obj(cls, *args, **kwargs): # noqa: D401 - simple shim
raise RuntimeError(
"Pydantic is not installed. Install with 'pip install"
" spiderweb-framework[pydantic]' or 'pip install pydantic'"
" to use PydanticMiddleware."
)
class ValidationError(Exception): # simple stand-in so type hints resolve
def errors(self): # match pydantic's ValidationError API used below
return []
from pydantic import BaseModel
from pydantic_core._pydantic_core import ValidationError
from spiderweb import SpiderwebMiddleware
from spiderweb.request import Request
from spiderweb.response import JsonResponse
@ -37,12 +19,6 @@ class PydanticMiddleware(SpiderwebMiddleware):
def process_request(self, request):
if not request.method == "POST":
return
if not PYDANTIC_AVAILABLE:
raise RuntimeError(
"Pydantic is not installed. Install with 'pip install"
" spiderweb-framework[pydantic]' or 'pip install pydantic'"
" to use PydanticMiddleware."
)
types = get_type_hints(request.handler)
# we don't know what the user named the request object, but
# we know that it's first in the list, and it's always an arg.
@ -58,15 +34,7 @@ class PydanticMiddleware(SpiderwebMiddleware):
# Separated out into its own method so that it can be overridden
errors = e.errors()
error_dict = {"message": "Validation error", "errors": []}
# [
# {
# 'type': 'missing',
# 'loc': ('comment',),
# 'msg': 'Field required',
# 'input': {'email': 'a@a.com'},
# 'url': 'https://errors.pydantic.dev/2.8/v/missing'
# }
# ]
# [{'type': 'missing', 'loc': ('comment',), 'msg': 'Field required', 'input': {'email': 'a@a.com'}, 'url': 'https://errors.pydantic.dev/2.8/v/missing'}]
for error in errors:
field = error["loc"][0]
msg = error["msg"]

View file

@ -1,71 +1,61 @@
from datetime import datetime, timedelta
import json
from sqlalchemy import Column, Integer, String, Text, DateTime
from sqlalchemy.orm import Mapped
from peewee import CharField, TextField, DateTimeField
from spiderweb.middleware import SpiderwebMiddleware
from spiderweb.request import Request
from spiderweb.response import HttpResponse
from spiderweb.db import Base
from spiderweb.db import SpiderwebModel
from spiderweb.utils import generate_key, is_jsonable
class Session(Base):
__tablename__ = "spiderweb_sessions"
class Session(SpiderwebModel):
session_key = CharField(max_length=64)
csrf_token = CharField(max_length=64, null=True)
user_id = CharField(max_length=64, null=True)
session_data = TextField()
created_at = DateTimeField()
last_active = DateTimeField()
ip_address = CharField(max_length=30)
user_agent = TextField()
id: Mapped[int] = Column(Integer, primary_key=True, autoincrement=True)
session_key: Mapped[str] = Column(String(64), index=True, nullable=False)
csrf_token: Mapped[str | None] = Column(String(64), nullable=True)
user_id: Mapped[str | None] = Column(String(64), nullable=True)
session_data: Mapped[str] = Column(Text, nullable=False)
created_at: Mapped[datetime] = Column(DateTime, nullable=False)
last_active: Mapped[datetime] = Column(DateTime, nullable=False)
ip_address: Mapped[str] = Column(String(30), nullable=False)
user_agent: Mapped[str] = Column(Text, nullable=False)
class Meta:
table_name = "spiderweb_sessions"
class SessionMiddleware(SpiderwebMiddleware):
def process_request(self, request: Request):
dbsession = self.server.get_db_session()
try:
existing_session = (
dbsession.query(Session)
.filter(
Session.session_key
== request.COOKIES.get(self.server.session_cookie_name),
Session.ip_address == request.META.get("client_address"),
Session.user_agent == request.headers.get("HTTP_USER_AGENT"),
)
.order_by(Session.id.desc())
.first()
existing_session = (
Session.select()
.where(
Session.session_key
== request.COOKIES.get(self.server.session_cookie_name),
Session.ip_address == request.META.get("client_address"),
Session.user_agent == request.headers.get("HTTP_USER_AGENT"),
)
new_session = False
if not existing_session:
new_session = True
elif datetime.now() - existing_session.created_at > timedelta(
seconds=self.server.session_max_age
):
dbsession.delete(existing_session)
dbsession.commit()
new_session = True
.first()
)
new_session = False
if not existing_session:
new_session = True
elif datetime.now() - existing_session.created_at > timedelta(
seconds=self.server.session_max_age
):
existing_session.delete_instance()
new_session = True
if new_session:
request.SESSION = {}
request._session["id"] = generate_key()
request._session["new_session"] = True
request.META["SESSION"] = None
return
if new_session:
request.SESSION = {}
request._session["id"] = generate_key()
request._session["new_session"] = True
request.META["SESSION"] = None
return
request.SESSION = json.loads(existing_session.session_data)
request.META["SESSION"] = existing_session
request._session["id"] = existing_session.session_key
# touch last_active
existing_session.last_active = datetime.now()
dbsession.add(existing_session)
dbsession.commit()
finally:
dbsession.close()
request.SESSION = json.loads(existing_session.session_data)
request.META["SESSION"] = existing_session
request._session["id"] = existing_session.session_key
existing_session.save()
def process_response(self, request: Request, response: HttpResponse):
cookie_settings = {
@ -88,25 +78,19 @@ class SessionMiddleware(SpiderwebMiddleware):
)
if not is_jsonable(request.SESSION):
raise ValueError("Session data is not JSON serializable.")
dbsession = self.server.get_db_session()
try:
session = Session(
session_key=session_key,
session_data=json.dumps(request.SESSION),
created_at=datetime.now(),
last_active=datetime.now(),
ip_address=request.META.get("client_address"),
user_agent=request.headers.get("HTTP_USER_AGENT"),
)
dbsession.add(session)
dbsession.commit()
finally:
dbsession.close()
session = Session(
session_key=session_key,
session_data=json.dumps(request.SESSION),
created_at=datetime.now(),
last_active=datetime.now(),
ip_address=request.META.get("client_address"),
user_agent=request.headers.get("HTTP_USER_AGENT"),
)
session.save()
return
# Otherwise, we can save the one we already have.
# Use the cached session id to avoid touching a detached SQLAlchemy instance.
session_key = request._session["id"]
session_key = request.META["SESSION"].session_key
# update the session expiration time
response.set_cookie(
self.server.session_cookie_name,
@ -114,18 +98,7 @@ class SessionMiddleware(SpiderwebMiddleware):
**cookie_settings,
)
dbsession = self.server.get_db_session()
try:
session = (
dbsession.query(Session)
.filter(Session.session_key == session_key)
.order_by(Session.id.desc())
.first()
)
if session:
session.session_data = json.dumps(request.SESSION)
session.last_active = datetime.now()
dbsession.add(session)
dbsession.commit()
finally:
dbsession.close()
session = request.META["SESSION"]
session.session_data = json.dumps(request.SESSION)
session.last_active = datetime.now()
session.save()

View file

@ -1,9 +1,17 @@
import json
from urllib.parse import urlparse
from urllib.parse import urlparse, parse_qs
from spiderweb.constants import DEFAULT_ENCODING
from spiderweb.files import MediaFile
from spiderweb.utils import get_client_address, Headers
from multipart import (
parse_form_data,
is_form_request as m_is_form_request,
MultiDict,
MultipartPart,
)
class Request:
def __init__(
@ -24,8 +32,9 @@ class Request:
self.query_params = []
self.server = server
self.handler = handler # the view function that will be called
self.GET = {}
self.POST = {}
self.GET = MultiDict()
self.POST = MultiDict()
self.FILES = MultiDict()
self.META = {}
self.COOKIES = {}
# only used for the session middleware
@ -39,10 +48,21 @@ class Request:
self.populate_cookies()
content_length = int(self.headers.get("content_length") or 0)
if content_length:
self.content = (
self.environ["wsgi.input"].read(content_length).decode(DEFAULT_ENCODING)
)
if self.is_form_request():
if self.method == "POST":
# this pulls from wsgi.input, so we don't have to do it ourselves
self.POST, self.FILES = parse_form_data(self.environ)
for key, value in self.FILES.items():
if isinstance(value, MultipartPart):
self.FILES[key] = MediaFile(self.server, value)
else:
if content_length:
self.content = (
self.environ["wsgi.input"]
.read(content_length)
.decode(DEFAULT_ENCODING)
)
self.GET.update(parse_qs(self.content))
def populate_headers(self) -> None:
data = self.headers
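
With this change, form-encoded and multipart POST bodies land in `request.POST` and `request.FILES` as `MultiDict`s rather than plain dicts, which is why the CSRF test further down now expects `{"name": ["bob"]}`. A hedged sketch of a handler that round-trips the parsed form:

```python
# Hypothetical handler: JsonResponse.render() converts a MultiDict via .dict,
# so repeated form fields come back as lists.
from spiderweb.response import JsonResponse


def echo_form(request):
    return JsonResponse(data=request.POST)
```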
@ -80,27 +100,14 @@ class Request:
self.META["client_address"] = get_client_address(self.environ)
def populate_cookies(self) -> None:
cookies_header = self.environ.get("HTTP_COOKIE")
if not cookies_header:
return
cookies: dict[str, str] = {}
# Split on ';' and be tolerant of optional spaces and malformed segments
for segment in cookies_header.split(";"):
part = segment.strip()
if not part:
continue
if "=" not in part:
# Ignore flag-like segments that don't conform to name=value
continue
name, _, value = part.partition("=") # only split on first '='
cookies[name.strip()] = value.strip()
self.COOKIES = cookies
if cookies := self.environ.get("HTTP_COOKIE"):
self.COOKIES = {
option.split("=")[0]: option.split("=")[1]
for option in cookies.split("; ")
}
def json(self):
return json.loads(self.content)
def is_form_request(self) -> bool:
return (
"content_type" in self.headers
and self.headers["content_type"] == "application/x-www-form-urlencoded"
)
return m_is_form_request(self.environ)

View file

@ -12,6 +12,8 @@ from spiderweb.exceptions import GeneralException
from spiderweb.request import Request
from spiderweb.utils import Headers
from multipart import MultiDict
mimetypes.init()
@ -122,6 +124,8 @@ class JsonResponse(HttpResponse):
self.headers["content-type"] = "application/json"
def render(self) -> str:
if isinstance(self.data, MultiDict):
self.data = self.data.dict
return json.dumps(self.data)

View file

@ -40,6 +40,7 @@ class RoutesMixin:
_error_routes: dict
error_routes: dict[int, Callable]
append_slash: bool
fix_route_starting_slash: bool
def route(self, path, allowed_methods=None, name=None) -> Callable:
"""
@ -134,7 +135,8 @@ class RoutesMixin:
or allowed_methods
or DEFAULT_ALLOWED_METHODS
)
if not path.startswith("/") and self.fix_route_starting_slash:
path = "/" + path
reverse_path = re.sub(r"<(.*?):(.*?)>", r"{\2}", path) if "<" in path else path
def get_packet(func):
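
With `fix_route_starting_slash` defaulting to `True` in the router, a route registered without a leading slash is normalized rather than silently never matching. A small sketch:

```python
# Hypothetical route: "about" is rewritten to "/about" before registration.
from spiderweb import SpiderwebRouter
from spiderweb.response import HttpResponse

app = SpiderwebRouter()


@app.route("about")
def about(request):
    return HttpResponse("served from /about")
```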

View file

@ -1,6 +1,9 @@
from wsgiref.util import setup_testing_defaults
from peewee import SqliteDatabase
from spiderweb import SpiderwebRouter
from spiderweb.request import Request
class StartResponse:
@ -20,9 +23,40 @@ def setup(**kwargs):
environ = {}
setup_testing_defaults(environ)
if "db" not in kwargs:
kwargs["db"] = "spiderweb-tests.db"
kwargs["db"] = SqliteDatabase("spiderweb-tests.db")
return (
SpiderwebRouter(**kwargs),
environ,
StartResponse(),
)
class TestClient:
def __init__(self, **kwargs):
self.app, self.environ, self.start_response = setup(**kwargs)
...
class RequestFactory:
@staticmethod
def create_request(
environ=None,
content=None,
headers=None,
path=None,
server=None,
handler=None,
):
if not environ:
environ = {}
setup_testing_defaults(environ)
environ["HTTP_USER_AGENT"] = "Mozilla/5.0 (testrequest)"
environ["REMOTE_ADDR"] = "1.1.1.1"
return Request(
environ=environ,
content=content,
headers=headers,
path=path,
server=server,
handler=handler,
)
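
A hedged sketch of how the new `RequestFactory` might be used in a test; it assumes `Request` keeps the path and environ it was constructed with.

```python
# Hypothetical test using the factory above.
def test_request_factory_defaults():
    request = RequestFactory.create_request(path="/spiders")
    assert request.path == "/spiders"
    assert request.environ["REMOTE_ADDR"] == "1.1.1.1"
```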

View file

@ -147,71 +147,3 @@ def test_setting_multiple_cookies():
app(environ, start_response)
assert start_response.headers[-1] == ("set-cookie", "cookie2=value2")
assert start_response.headers[-2] == ("set-cookie", "cookie1=value1")
import json as _json
@pytest.mark.parametrize(
"cookie_header,expected",
[
("", {}),
(" ", {}),
(";", {}),
(";; ; ", {}),
("a=1", {"a": "1"}),
("a=1; b=2", {"a": "1", "b": "2"}),
("a=1; b", {"a": "1"}), # flag-like segment ignored
("flag", {}), # single flag ignored
("a=1; flag; c=3", {"a": "1", "c": "3"}),
("a=1; c=", {"a": "1", "c": ""}), # empty value allowed
("token=abc=def==", {"token": "abc=def=="}), # values may contain '='
(" d = q ", {"d": "q"}), # tolerate spaces around name/value
("a=1; ; ; c=3", {"a": "1", "c": "3"}), # empty segments ignored
("a=1; a=2", {"a": "2"}), # last duplicate wins
("q=\"a b c\"", {"q": '"a b c"'}), # quotes preserved
("u=hello%3Dworld", {"u": "hello%3Dworld"}), # url-encoded preserved
("=novalue; a=1", {"": "novalue", "a": "1"}), # empty name retained per current parser
("lead=1; ; trail=2;", {"lead": "1", "trail": "2"}),
(" spaced = value ; another= thing ", {"spaced": "value", "another": "thing"}),
("a=1; b=2; flag; c=; token=abc=def==; d = q ; ;", {"a": "1", "b": "2", "c": "", "token": "abc=def==", "d": "q"}),
],
ids=[
"empty",
"space-only",
"single-semicolon",
"many-empty",
"single-pair",
"two-pairs",
"flag-after",
"single-flag",
"mix-flag",
"empty-value",
"value-with-equals",
"spaces-around",
"ignore-empty-segments",
"duplicate-last-wins",
"quoted-value",
"url-encoded",
"empty-name",
"lead-trail-with-empties",
"spaces-around-multi",
"mixed-case-from-original",
],
)
def test_cookie_parsing_tolerates_malformed_segments(cookie_header, expected):
app, environ, start_response = setup()
from spiderweb.response import JsonResponse
@app.route("/")
def index(request):
return JsonResponse(data=request.COOKIES)
environ["HTTP_COOKIE"] = cookie_header
body = app(environ, start_response)[0].decode("utf-8")
data = _json.loads(body)
assert data == expected

View file

@ -0,0 +1,3 @@
from io import BytesIO
...

View file

@ -2,6 +2,7 @@ from io import BytesIO, BufferedReader
from datetime import timedelta
import pytest
from peewee import SqliteDatabase
from spiderweb import SpiderwebRouter, HttpResponse, StartupErrors, ConfigError
from spiderweb.constants import DEFAULT_ENCODING
@ -51,11 +52,7 @@ def test_session_middleware():
assert app(environ, start_response) == [bytes(str(0), DEFAULT_ENCODING)]
_s = app.get_db_session()
try:
session_key = _s.query(Session).order_by(Session.id.asc()).first().session_key
finally:
_s.close()
session_key = Session.select().first().session_key
environ["HTTP_COOKIE"] = f"swsession={session_key}"
assert app(environ, start_response) == [bytes(str(1), DEFAULT_ENCODING)]
@ -74,25 +71,17 @@ def test_expired_session():
assert app(environ, start_response) == [bytes(str(0), DEFAULT_ENCODING)]
_s = app.get_db_session()
try:
session = _s.query(Session).order_by(Session.id.asc()).first()
session.created_at = session.created_at - timedelta(seconds=app.session_max_age)
_s.add(session)
_s.commit()
environ["HTTP_COOKIE"] = f"swsession={session.session_key}"
finally:
_s.close()
session = Session.select().first()
session.created_at = session.created_at - timedelta(seconds=app.session_max_age)
session.save()
environ["HTTP_COOKIE"] = f"swsession={session.session_key}"
# it shouldn't increment because we get a new session
assert app(environ, start_response) == [bytes(str(0), DEFAULT_ENCODING)]
_s2 = app.get_db_session()
try:
session2 = _s2.query(Session).order_by(Session.id.desc()).first()
assert session2.session_key != session.session_key
finally:
_s2.close()
session2 = list(Session.select())[-1]
assert session2.session_key != session.session_key
def test_exploding_middleware():
@ -121,7 +110,7 @@ def test_csrf_middleware_without_session_middleware():
with pytest.raises(StartupErrors) as e:
SpiderwebRouter(
middleware=["spiderweb.middleware.csrf.CSRFMiddleware"],
db="spiderweb-tests.db",
db=SqliteDatabase("spiderweb-tests.db"),
)
exceptiongroup = e.value.args[1]
assert (
@ -168,12 +157,9 @@ def test_csrf_middleware():
formdata = f"name=bob&csrf_token={token}"
environ["CONTENT_TYPE"] = "application/x-www-form-urlencoded"
_s = app.get_db_session()
try:
_last = _s.query(Session).order_by(Session.id.desc()).first()
environ["HTTP_COOKIE"] = f"swsession={_last.session_key}"
finally:
_s.close()
environ["HTTP_COOKIE"] = (
f"swsession={[i for i in Session.select().dicts()][-1]['session_key']}"
)
environ["REQUEST_METHOD"] = "POST"
environ["HTTP_X_CSRF_TOKEN"] = token
environ["CONTENT_LENGTH"] = len(formdata)
@ -231,12 +217,9 @@ def test_csrf_expired_token():
formdata = f"name=bob&csrf_token={token}"
environ["CONTENT_TYPE"] = "application/x-www-form-urlencoded"
_s = app.get_db_session()
try:
_last = _s.query(Session).order_by(Session.id.desc()).first()
environ["HTTP_COOKIE"] = f"swsession={_last.session_key}"
finally:
_s.close()
environ["HTTP_COOKIE"] = (
f"swsession={[i for i in Session.select().dicts()][-1]['session_key']}"
)
environ["REQUEST_METHOD"] = "POST"
environ["HTTP_ORIGIN"] = "example.com"
environ["HTTP_X_CSRF_TOKEN"] = token
@ -317,7 +300,7 @@ def test_csrf_trusted_origins():
environ["HTTP_ORIGIN"] = "example.com"
resp2 = app(environ, start_response)[0].decode(DEFAULT_ENCODING)
assert resp2 == '{"name": "bob"}'
assert resp2 == '{"name": ["bob"]}'
def test_post_process_middleware():

View file

@ -36,6 +36,7 @@ def get_http_status_by_code(code: int) -> Optional[str]:
resp = HTTPStatus(code)
if resp:
return f"{resp.value} {resp.phrase}"
return None
def is_form_request(request: "Request") -> bool:

View file

@ -0,0 +1,13 @@
{% extends 'base.html' %}
{% block content %}
<form action="" method="post" enctype='multipart/form-data'>
<div class="mb-3">
<label for="formFile" class="form-label">Default file input example</label>
<input name="file" class="form-control" type="file" id="formFile">
</div>
{{ csrf_token }}
<button type="submit" class="btn btn-primary">Submit</button>
</form>
{% endblock %}
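
A companion view for this template is not included in the changeset. The sketch below assumes a `TemplateResponse(request, template_name)` call and a hypothetical template filename; treat both as assumptions rather than something this diff establishes.

```python
# Hypothetical view pairing with the upload form template above.
from spiderweb.response import HttpResponse, TemplateResponse


def upload_form(request):
    if request.method == "POST" and "file" in request.FILES:
        saved_path = request.FILES["file"].save()
        return HttpResponse(f"Uploaded {saved_path.name}")
    # Assumed TemplateResponse signature and template name.
    return TemplateResponse(request, "file_upload.html")
```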