Package maintenance

pydantic-slim

pydantic-slim This is a placeholder in case we want to use this package name in the future.

pypi package. Binary | Source

Latest version: 0.0.0 Released: 2024-03-29

stac-pydantic

stac-pydantic Pydantic models for STAC Catalogs, Collections, Items, and the STAC API spec. Initially developed by arturo-ai.

The main purpose of this library is to provide reusable request/response models for tools such as fastapi. For more comprehensive schema validation and robust extension support, use pystac.

Installation

```shell
python -m pip install stac-pydantic

# or with validation extras:
python -m pip install stac-pydantic["validation"]
```

For local development:

```shell
python -m pip install -e '.[dev,lint]'
```

| stac-pydantic | STAC Version | STAC API Version | Pydantic Version |
|---------------|--------------|------------------|------------------|
| 1.2.x         | 1.0.0-beta.1 | <1               | ^1.6             |
| 1.3.x         | 1.0.0-beta.2 | <1               | ^1.6             |
| 2.0.x         | 1.0.0        | <1*              | ^1.6             |
| 3.0.x         | 1.0.0        | 1.0.0            | ^2.4             |
| 3.1.x         | 1.0.0        | 1.0.0            | ^2.4             |

\* various beta releases, specs not fully implemented

Development

Install the pre-commit hooks:

```shell
pre-commit install
```

Testing

Ensure you have all Python versions installed that the tests will be run against. If using pyenv, run:

```shell
pyenv install 3.8.18
pyenv install 3.9.18
pyenv install 3.10.13
pyenv install 3.11.5
pyenv local 3.8.18 3.9.18 3.10.13 3.11.5
```

Run the entire test suite:

```shell
tox
```

Run a single test case using the standard pytest convention:

```shell
python -m pytest -v tests/test_models.py::test_item_extensions
```

Usage

Loading Models

Load data into models with standard pydantic:

```python
from stac_pydantic import Catalog

stac_catalog = {
    "type": "Catalog",
    "stac_version": "1.0.0",
    "id": "sample",
    "description": "This is a very basic sample catalog.",
    "links": [
        {
            "href": "item.json",
            "rel": "item"
        }
    ]
}

catalog = Catalog(**stac_catalog)
assert catalog.id == "sample"
assert catalog.links[0].href == "item.json"
```

Extensions

STAC defines many extensions which let the user customize the data in their catalog. stac_pydantic.extensions.validate_extensions fetches the JSON schemas from the URLs provided in the stac_extensions property (caching the last fetched ones), and validates a dict, Item, Collection, or Catalog against those fetched schemas:

```python
from stac_pydantic import Item
from stac_pydantic.extensions import validate_extensions

stac_item = {
    "id": "12345",
    "type": "Feature",
    "stac_extensions": [
        "https://stac-extensions.github.io/eo/v1.0.0/schema.json"
    ],
    "geometry": {"type": "Point", "coordinates": [0, 0]},
    "bbox": [0.0, 0.0, 0.0, 0.0],
    "properties": {
        "datetime": "2020-03-09T14:53:23.262208+00:00",
        "eo:cloud_cover": 25,
    },
    "links": [],
    "assets": {},
}

model = Item(**stac_item)
validate_extensions(model, reraise_exception=True)
assert getattr(model.properties, "eo:cloud_cover") == 25
```

The complete list of current STAC Extensions can be found here.

Vendor Extensions

The same procedure described above works for any STAC extension schema, as long as it can be loaded from a public URL.

STAC API

The STAC API specs extend the core STAC specification for implementing dynamic catalogs. STAC objects used in an API context should always import models from the api subpackage. This subpackage extends the Catalog, Collection, and Item models with additional fields and validation rules, introduces Collections and ItemCollections models and pagination/search links, and implements models for defining ItemSearch queries.
```python
from stac_pydantic.api import Item, ItemCollection

stac_item = Item(**{
    "id": "12345",
    "type": "Feature",
    "stac_extensions": [],
    "geometry": {"type": "Point", "coordinates": [0, 0]},
    "bbox": [0.0, 0.0, 0.0, 0.0],
    "properties": {
        "datetime": "2020-03-09T14:53:23.262208+00:00",
    },
    "collection": "CS3",
    "links": [
        {
            "rel": "self",
            "href": "http://stac.example.com/catalog/collections/CS3-20160503_132130_04/items/CS3-20160503_132130_04.json"
        },
        {
            "rel": "collection",
            "href": "http://stac.example.com/catalog/CS3-20160503_132130_04/catalog.json"
        },
        {
            "rel": "root",
            "href": "http://stac.example.com/catalog"
        }
    ],
    "assets": {},
})

stac_item_collection = ItemCollection(**{
    "type": "FeatureCollection",
    "features": [stac_item],
    "links": [
        {
            "rel": "self",
            "href": "http://stac.example.com/catalog/search?collection=CS3",
            "type": "application/geo+json"
        },
        {
            "rel": "root",
            "href": "http://stac.example.com/catalog",
            "type": "application/json"
        }
    ],
})
```

Exporting Models

Most STAC extensions are namespaced with a colon (e.g. eo:gsd) to keep them distinct from other extensions. Because Python doesn't support colons in variable names, we use Pydantic aliasing to add the namespace upon model export. This requires exporting the model with the by_alias=True parameter. Export methods (model_dump() and model_dump_json()) for models in this library have by_alias and exclude_unset set to True by default:

```python
item_dict = item.model_dump()
assert item_dict['properties']['landsat:row'] == item.properties.row == 250
```

CLI

```text
Usage: stac-pydantic [OPTIONS] COMMAND [ARGS]...

  stac-pydantic cli group

Options:
  --help  Show this message and exit.

Commands:
  validate-item  Validate STAC Item
```
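To round this out, here is a minimal sketch of constructing an ItemSearch query with the api subpackage described above; the Search import path and the exact field names are assumptions based on the STAC API spec, so verify them against the installed version:

```python
from stac_pydantic.api.search import Search  # assumed import path

# hypothetical query: one collection, a bounding box, and a result limit
search = Search(
    collections=["CS3"],
    bbox=[-10.0, -10.0, 10.0, 10.0],
    limit=10,
)
print(search.model_dump_json(exclude_unset=True))
```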

pypi package. Binary | Source

Latest version: 3.2.0 Released: 2025-03-20

pydantic-core

pydantic-core This package provides the core functionality for pydantic validation and serialization. Pydantic-core is currently around 17x faster than pydantic V1. See tests/benchmarks/ for details.

Example of direct usage

NOTE: You should not need to use pydantic-core directly; instead, use pydantic, which in turn uses pydantic-core.

```py
from pydantic_core import SchemaValidator, ValidationError

v = SchemaValidator(
    {
        'type': 'typed-dict',
        'fields': {
            'name': {
                'type': 'typed-dict-field',
                'schema': {'type': 'str'},
            },
            'age': {
                'type': 'typed-dict-field',
                'schema': {'type': 'int', 'ge': 18},
            },
            'is_developer': {
                'type': 'typed-dict-field',
                'schema': {
                    'type': 'default',
                    'schema': {'type': 'bool'},
                    'default': True,
                },
            },
        },
    }
)

r1 = v.validate_python({'name': 'Samuel', 'age': 35})
assert r1 == {'name': 'Samuel', 'age': 35, 'is_developer': True}

# pydantic-core can also validate JSON directly
r2 = v.validate_json('{"name": "Samuel", "age": 35}')
assert r1 == r2

try:
    v.validate_python({'name': 'Samuel', 'age': 11})
except ValidationError as e:
    print(e)
    """
    1 validation error for model
    age
      Input should be greater than or equal to 18
      [type=greater_than_equal, context={ge: 18}, input_value=11, input_type=int]
    """
```

Getting Started

You'll need rust stable installed, or rust nightly if you want to generate accurate coverage.

With rust and python 3.9+ installed, compiling pydantic-core should be possible with roughly the following:

```bash
# clone this repo or your fork
git clone git@github.com:pydantic/pydantic-core.git
cd pydantic-core

# create a new virtual env
python3 -m venv env
source env/bin/activate

# install dependencies and install pydantic-core
make install
```

That should be it, the example shown above should now run.

You might find it useful to look at python/pydantic_core/_pydantic_core.pyi and python/pydantic_core/core_schema.py for more information on the python API; beyond that, tests/ provide a large number of examples of usage.

If you want to contribute to pydantic-core, you'll want to use some other make commands:
* make build-dev to build the package during development
* make build-prod to perform an optimised build for benchmarking
* make test to run the tests
* make testcov to run the tests and generate a coverage report
* make lint to run the linter
* make format to format python and rust code
* make to run format build-dev lint test

Profiling

It's possible to profile the code using the flamegraph utility from flamegraph-rs. (Tested on Linux.) You can install this with cargo install flamegraph.

Run make build-profiling to install a release build with debugging symbols included (needed for profiling). Once that is built, you can profile pytest benchmarks with (e.g.):

```bash
flamegraph -- pytest tests/benchmarks/test_micro_benchmarks.py -k test_list_of_ints_core_py --benchmark-enable
```

The flamegraph command will produce an interactive SVG at flamegraph.svg.

Releasing

1. Bump the package version locally. Do not just edit Cargo.toml on GitHub; both Cargo.toml and Cargo.lock need to be updated.
2. Make a PR for the version bump and merge it.
3. Go to https://github.com/pydantic/pydantic-core/releases and click "Draft a new release".
4. In the "Choose a tag" dropdown enter the new tag (in the form v<version>) and select "Create new tag on publish" when the option appears.
5. Enter the release title (also in the form v<version>).
6. Click the "Generate release notes" button.
7. Click "Publish release".
8. Go to https://github.com/pydantic/pydantic-core/actions and ensure that all builds for the release complete successfully.
9. Go to https://pypi.org/project/pydantic-core/ and ensure that the latest release is published.

Done πŸŽ‰
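As a companion to the direct-usage example above: the same typed-dict schema can presumably also be built with the helper functions from python/pydantic_core/core_schema.py mentioned earlier. This is a minimal sketch, assuming the helpers mirror the raw dict form:

```py
from pydantic_core import SchemaValidator, core_schema

# each helper is assumed to return the corresponding core-schema dict
schema = core_schema.typed_dict_schema(
    {
        'name': core_schema.typed_dict_field(core_schema.str_schema()),
        'age': core_schema.typed_dict_field(core_schema.int_schema(ge=18)),
        'is_developer': core_schema.typed_dict_field(
            core_schema.with_default_schema(core_schema.bool_schema(), default=True)
        ),
    }
)
v = SchemaValidator(schema)
assert v.validate_python({'name': 'Samuel', 'age': 35}) == {
    'name': 'Samuel',
    'age': 35,
    'is_developer': True,
}
```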

pypi package. Binary | Source

Latest version: 2.34.1 Released: 2025-04-23

pydantic-pkgr

pydantic-pkgr πŸ“¦ apt brew pip npm β‚Šβ‚Šβ‚Š

Simple Pydantic interfaces for package managers + installed binaries. It's an ORM for your package managers, providing nice Python types for packages + installers.

This is a Python library for installing & managing packages locally with a variety of package managers. It's designed for when pip dependencies aren't enough, and your app has to check for & install dependencies at runtime.

```shell
pip install pydantic-pkgr
```

✨ Built with pydantic v2 for strong static typing guarantees and json import/export compatibility
πŸ“¦ Provides consistent cross-platform interfaces for dependency resolution & installation at runtime
🌈 Integrates with django >= 4.0, django-ninja, and OpenAPI + django-jsonform out-of-the-box
πŸ¦„ Uses pyinfra / ansible for the actual install operations whenever possible (with internal fallbacks)

Built by ArchiveBox to install & auto-update our extractor dependencies at runtime (chrome, wget, curl, etc.) on macOS/Linux/Docker.

[!WARNING] This is BETA software, the API is mostly stable but there may be minor changes later on.

Source Code: https://github.com/ArchiveBox/pydantic-pkgr/
Documentation: https://github.com/ArchiveBox/pydantic-pkgr/blob/main/README.md

```python
from pydantic_pkgr import *

apt, brew, pip, npm, env = AptProvider(), BrewProvider(), PipProvider(), NpmProvider(), EnvProvider()

dependencies = [
    Binary(name='curl', binproviders=[env, apt, brew]),
    Binary(name='wget', binproviders=[env, apt, brew]),
    Binary(name='yt-dlp', binproviders=[env, pip, apt, brew]),
    Binary(name='playwright', binproviders=[env, pip, npm]),
    Binary(name='puppeteer', binproviders=[env, npm]),
]

for binary in dependencies:
    binary = binary.load_or_install()
    print(binary.abspath, binary.version, binary.binprovider, binary.is_valid, binary.sha256)
    # Path('/usr/bin/curl') SemVer('7.81.0') AptProvider() True abc134...

    binary.exec(cmd=['--version'])
    # curl 7.81.0 (x86_64-apple-darwin23.0) libcurl/7.81.0 ...
```

```python
from typing import List
from pydantic import InstanceOf
from pydantic_pkgr import Binary, BinProvider, BrewProvider, EnvProvider

# you can also define binaries as classes, making them usable for type checking
class CurlBinary(Binary):
    name: str = 'curl'
    binproviders: List[InstanceOf[BinProvider]] = [BrewProvider(), EnvProvider()]

curl = CurlBinary().install()
assert isinstance(curl, CurlBinary)  # CurlBinary is a unique type you can use in annotations now
print(curl.abspath, curl.version, curl.binprovider, curl.is_valid)
# Path('/opt/homebrew/bin/curl') SemVer('8.4.0') BrewProvider() True

curl.exec(cmd=['--version'])
# curl 8.4.0 (x86_64-apple-darwin23.0) libcurl/8.4.0 ...
```

```python
from pydantic_pkgr import Binary, AptProvider

# We also provide direct package manager (aka BinProvider) APIs
apt = AptProvider()
apt.install('wget')
print(apt.PATH, apt.get_abspaths('wget'), apt.get_version('wget'))

# even if packages are installed by tools we don't control (e.g. pyinfra/ansible/puppet/etc.)
from pyinfra.operations import apt
apt.packages(name="Install ffmpeg", packages=['ffmpeg'], _sudo=True)

# our Binary API provides a nice type-checkable, validated, serializable handle
ffmpeg = Binary(name='ffmpeg').load()
print(ffmpeg)                      # name=ffmpeg abspath=/usr/bin/ffmpeg version=3.3.0 is_valid=True ...
print(ffmpeg.loaded_abspaths)      # show all the ffmpeg binaries found in $PATH (in case there's more than one available)
print(ffmpeg.model_dump_json())    # ... everything can also be dumped/loaded as json
print(ffmpeg.model_json_schema())  # ...
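# A sketch of round-tripping (assumption: Binary behaves as a standard
# pydantic v2 model, so model_validate_json() restores an equivalent instance)
ffmpeg_copy = Binary.model_validate_json(ffmpeg.model_dump_json())
assert ffmpeg_copy.name == 'ffmpeg'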
# all types provide OpenAPI-ready JSON schemas
```

Supported Package Managers

So far it supports installing/finding installed/~~updating/removing~~ packages on Linux/macOS with:

* apt (Ubuntu/Debian/etc.)
* brew (macOS/Linux)
* pip (Linux/macOS/Windows)
* npm (Linux/macOS/Windows)
* env (looks for existing version of binary in user's $PATH at runtime)
* vendor (you can bundle vendored copies of packages you depend on within your source)

Planned: docker, cargo, nix, apk, go get, gem, pkg, and more using ansible/pyinfra...

Usage

```bash
pip install pydantic-pkgr
```

BinProvider

Implementations: EnvProvider, AptProvider, BrewProvider, PipProvider, NpmProvider

This type represents a "provider of binaries", e.g. a package manager like apt/pip/npm, or env (which finds binaries in your $PATH). BinProviders implement the following interface:

* .INSTALLER_BIN -> /opt/homebrew/bin/brew (provider's pkg manager location)
* .PATH -> PATHStr('/opt/homebrew/bin:/usr/local/bin:...') (where provider stores bins)
* get_packages(bin_name: str) -> InstallArgs(['curl', 'libcurl4', ...]) (find pkg dependencies for a bin)
* install(bin_name: str) (install a bin using binprovider to install needed packages)
* load(bin_name: str) (find an existing installed binary)
* load_or_install(bin_name: str) -> Binary (find existing / install if needed)
* get_version(bin_name: str) -> SemVer('1.0.0') (get currently installed version)
* get_abspath(bin_name: str) -> Path('/absolute/path/to/bin') (get installed bin abspath)
* get_abspaths(bin_name: str) -> [Path('/opt/homebrew/bin/curl'), Path('/other/paths/to/curl'), ...] (get all matching bins found)
* get_sha256(bin_name: str) -> str (get sha256 hash hexdigest of the binary)

```python
import inspect
import platform
from typing import List
from pydantic_pkgr import EnvProvider, PipProvider, AptProvider, BrewProvider

# Example: Finding an existing install of bash using the system $PATH environment
env = EnvProvider()
bash = env.load(bin_name='bash')  # Binary('bash', provider=env)
print(bash.abspath)               # Path('/opt/homebrew/bin/bash')
print(bash.version)               # SemVer('5.2.26')
bash.exec(['-c', 'echo hi'])      # hi

# Example: Installing curl using the apt package manager
apt = AptProvider()
curl = apt.install(bin_name='curl')  # Binary('curl', provider=apt)
print(curl.abspath)                  # Path('/usr/bin/curl')
print(curl.version)                  # SemVer('8.4.0')
print(curl.sha256)                   # 9fd780521c97365f94c90724d80a889097ae1eeb2ffce67b87869cb7e79688ec
curl.exec(['--version'])             # curl 7.81.0 (x86_64-pc-linux-gnu) libcurl/7.81.0 ...

# Example: Finding/Installing django with pip (w/ customized binpath resolution behavior)
pip = PipProvider(
    abspath_handler={'*': lambda self, bin_name, **context: inspect.getfile(bin_name)},  # use python inspect to get path instead of os.which
)
django_bin = pip.load_or_install('django')  # Binary('django', provider=pip)
print(django_bin.abspath)                   # Path('/usr/lib/python3.10/site-packages/django/__init__.py')
print(django_bin.version)                   # SemVer('5.0.2')
```

Binary

This type represents a single binary dependency aka a package (e.g. wget, curl, ffmpeg, etc.). It can define one or more BinProviders that it supports, along with overrides to customize the behavior for each.
Binaries implement the following interface:

* load(), install(), load_or_install() -> Binary
* binprovider: InstanceOf[BinProvider]
* abspath: Path
* abspaths: List[Path]
* version: SemVer
* sha256: str

```python
import json
import platform
from pathlib import Path
from typing import Dict, List
from pydantic_pkgr import (
    AptProvider, BinName, BinProvider, BinProviderName, Binary,
    BrewProvider, EnvProvider, PipProvider, ProviderLookupDict, SemVer,
)

class CustomBrewProvider(BrewProvider):
    name: str = 'custom_brew'

    def get_macos_packages(self, bin_name: str, **context) -> List[str]:
        extra_packages_lookup_table = json.load(Path('macos_packages.json'))
        return extra_packages_lookup_table.get(platform.machine(), [bin_name])


# Example: Create a re-usable class defining a binary and its providers
class YtdlpBinary(Binary):
    name: BinName = 'ytdlp'
    description: str = 'YT-DLP (Replacement for YouTube-DL) Media Downloader'

    binproviders_supported: List[BinProvider] = [EnvProvider(), PipProvider(), AptProvider(), CustomBrewProvider()]

    # customize installed package names for specific package managers
    provider_overrides: Dict[BinProviderName, ProviderLookupDict] = {
        'pip': {'packages': ['yt-dlp[default,curl-cffi]']},  # can use literal values (packages -> List[str], version -> SemVer, abspath -> Path, install -> str log)
        'apt': {'packages': lambda: ['yt-dlp', 'ffmpeg']},   # also accepts any pure Callable that returns a list of packages
        'brew': {'packages': 'self.get_macos_packages'},     # also accepts string reference to function on self (where self is the BinProvider)
    }

ytdlp = YtdlpBinary().load_or_install()
print(ytdlp.binprovider)  # BrewProvider(...)
print(ytdlp.abspath)      # Path('/opt/homebrew/bin/yt-dlp')
print(ytdlp.abspaths)     # [Path('/opt/homebrew/bin/yt-dlp'), Path('/usr/local/bin/yt-dlp')]
print(ytdlp.version)      # SemVer('2024.4.9')
print(ytdlp.sha256)       # 46c3518cfa788090c42e379971485f56d007a6ce366dafb0556134ca724d6a36
print(ytdlp.is_valid)     # True
```

```python
import os
import platform
from typing import Dict, List
from pydantic_pkgr import AptProvider, BinName, BinProvider, BinProviderName, Binary, EnvProvider, ProviderLookupDict

# Example: Create a binary that uses Podman if available, or Docker otherwise
class DockerBinary(Binary):
    name: BinName = 'docker'

    binproviders_supported: List[BinProvider] = [EnvProvider(), AptProvider()]

    provider_overrides: Dict[BinProviderName, ProviderLookupDict] = {
        'env': {
            # example: prefer podman if installed (falling back to docker)
            'abspath': lambda: os.which('podman') or os.which('docker') or os.which('docker-ce'),
        },
        'apt': {
            # example: vary installed package name based on your CPU architecture
            'packages': {
                'amd64': ['docker'],
                'armv7l': ['docker-ce'],
                'arm64': ['docker-ce'],
            }.get(platform.machine(), 'docker'),
        },
    }

docker = DockerBinary().load_or_install()
print(docker.binprovider)  # EnvProvider()
print(docker.abspath)      # Path('/usr/local/bin/podman')
print(docker.abspaths)     # [Path('/usr/local/bin/podman'), Path('/opt/homebrew/bin/podman')]
print(docker.version)      # SemVer('6.0.2')
print(docker.is_valid)     # True

# You can also pass **kwargs to override properties at runtime,
# e.g. if you want to force the abspath to be at a specific path:
custom_docker = DockerBinary(abspath='~/custom/bin/podman').load()
print(custom_docker.name)         # 'docker'
print(custom_docker.binprovider)  # EnvProvider()
print(custom_docker.abspath)      # Path('/Users/example/custom/bin/podman')
print(custom_docker.version)      # SemVer('5.0.2')
print(custom_docker.is_valid)     # True
```

SemVer

```python
from pydantic_pkgr import SemVer

# Example: Use the SemVer type directly for parsing & verifying version strings
SemVer.parse('Google Chrome 124.0.6367.208+beta_234.234.234.123')  # SemVer(124, 0, 6367)
SemVer.parse('2024.04.05')  # SemVer(2024, 4, 5)
SemVer.parse('1.9+beta')    # SemVer(1, 9, 0)
str(SemVer(1, 9, 0))        # '1.9.0'
```

These types are all meant to be used library-style to make writing your own apps easier, e.g. you can use them to build things like playwright install --with-deps.

Django Usage

The pydantic ecosystem helps us get auto-generated, type-checked Django fields & forms that support BinProvider and Binary.

[!TIP] For the full Django experience, we recommend installing these 3 excellent packages:
- django-admin-data-views
- django-pydantic-field
- django-jsonform

```bash
pip install pydantic-pkgr django-admin-data-views django-pydantic-field django-jsonform
```

Django Model Usage: Store BinProvider and Binary entries in your model fields

```bash
pip install django-pydantic-field
```

For more info see the django-pydantic-field docs...

Example Django models.py showing how to store Binary and BinProvider instances in DB fields:

```python
from typing import List
from django.db import models
from pydantic import InstanceOf
from pydantic_pkgr import BinProvider, Binary, SemVer
from django_pydantic_field import SchemaField

class InstalledBinary(models.Model):
    name = models.CharField(max_length=63)
    binary: Binary = SchemaField()
    binproviders: List[InstanceOf[BinProvider]] = SchemaField(default=[])
    version: SemVer = SchemaField(default=(0, 0, 1))
```

And here's how to save a Binary using the example model:

```python
# find existing curl Binary in $PATH
curl = Binary(name='curl').load()

# save it to the DB using our new model
obj = InstalledBinary(
    name='curl',
    binary=curl,           # store Binary/BinProvider/SemVer values directly in fields
    binproviders=[env],    # no need for manual JSON serialization / schema checking
    min_version=SemVer('6.5.0'),
)
obj.save()
```

When fetching it back from the DB, the Binary field is auto-deserialized / immediately usable:

```python
obj = InstalledBinary.objects.get(name='curl')

# everything is transparently serialized to/from the DB,
# and is ready to go immediately after querying:
assert obj.binary.abspath == curl.abspath
print(obj.binary.abspath)       # Path('/usr/local/bin/curl')
obj.binary.exec(['--version'])  # curl 7.81.0 (x86_64-apple-darwin23.0) libcurl/7.81.0 ...
```

For a full example see our provided django_example_project/...

Django Admin Usage: Display Binary objects nicely in the Admin UI

```bash
pip install pydantic-pkgr django-admin-data-views
```

For more info see the django-admin-data-views docs...

Then add this to your settings.py:

```python
INSTALLED_APPS = [
    # ...
    'admin_data_views',
    'pydantic_pkgr',
    # ...
]

# point these to a function that gets the list of all binaries / a single binary
PYDANTIC_PKGR_GET_ALL_BINARIES = 'pydantic_pkgr.views.get_all_binaries'
PYDANTIC_PKGR_GET_BINARY = 'pydantic_pkgr.views.get_binary'

ADMIN_DATA_VIEWS = {
    "NAME": "Environment",
    "URLS": [
        {
            "route": "binaries/",
            "view": "pydantic_pkgr.views.binaries_list_view",
            "name": "binaries",
            "items": {
                "route": "/",
                "view": "pydantic_pkgr.views.binary_detail_view",
                "name": "binary",
            },
        },
        # Coming soon: binprovider_list_view + binprovider_detail_view ...
    ],
}
```

For a full example see our provided [django_example_project/](https://github.com/ArchiveBox/pydantic-pkgr/tree/main/django_example_project)...

Note: If you override the default site admin, you must register the views manually...

admin.py:

```python
class YourSiteAdmin(admin.AdminSite):
    """Your customized version of admin.AdminSite"""
    ...

custom_admin = YourSiteAdmin()
custom_admin.register(get_user_model())
...
from pydantic_pkgr.admin import register_admin_views
register_admin_views(custom_admin)
```

~~Django Admin Usage: JSONFormWidget for editing BinProvider and Binary data~~

[!IMPORTANT] This feature is coming soon but is blocked on a few issues being fixed first:
- https://github.com/surenkov/django-pydantic-field/issues/64
- https://github.com/surenkov/django-pydantic-field/issues/65
- https://github.com/surenkov/django-pydantic-field/issues/66

Expand to see more...

~~Install django-jsonform to get auto-generated Forms for editing BinProvider, Binary, etc. data~~

```bash
pip install django-pydantic-field django-jsonform
```

For more info see the django-jsonform docs...

admin.py:

```python
from django.contrib import admin
from django_jsonform.widgets import JSONFormWidget
from django_pydantic_field.v2.fields import PydanticSchemaField

class MyModelAdmin(admin.ModelAdmin):
    formfield_overrides = {PydanticSchemaField: {"widget": JSONFormWidget}}

admin.site.register(MyModel, MyModelAdmin)
```

For a full example see our provided django_example_project/...

Examples

Advanced: Implement your own package manager behavior by subclassing BinProvider

```python
import os
import sys
from pathlib import Path
from subprocess import run, PIPE
from pydantic_pkgr import BinProvider, BinProviderName, BinName, InstallArgs, SemVer

class CargoProvider(BinProvider):
    name: BinProviderName = 'cargo'

    def on_setup_paths(self):
        if '~/.cargo/bin' not in sys.path:
            sys.path.append('~/.cargo/bin')

    def on_install(self, bin_name: BinName, **context):
        packages = self.on_get_packages(bin_name)
        installer_process = run(['cargo', 'install', *packages], capture_output=True, text=True)
        assert installer_process.returncode == 0

    def on_get_packages(self, bin_name: BinName, **context) -> InstallArgs:
        # optionally remap bin_names to strings passed to installer
        # e.g. 'yt-dlp' -> ['yt-dlp', 'ffmpeg', 'libcffi', 'libaac']
        return [bin_name]

    def on_get_abspath(self, bin_name: BinName, **context) -> Path | None:
        self.on_setup_paths()
        return Path(os.which(bin_name))

    def on_get_version(self, bin_name: BinName, **context) -> SemVer | None:
        self.on_setup_paths()
        return SemVer(run([bin_name, '--version'], stdout=PIPE).stdout.decode())

cargo = CargoProvider()
rg = cargo.install(bin_name='ripgrep')
print(rg.binprovider)  # CargoProvider()
print(rg.version)      # SemVer(14, 1, 0)
```

TODO

- [x] Implement initial basic support for apt, brew, and pip
- [x] Provide editability and actions via Django Admin UI using django-pydantic-field and django-jsonform
- [ ] Implement update and remove actions on BinProviders
- [ ] Add preinstall and postinstall hooks for things like adding apt sources and running cleanup scripts
- [ ] Implement more package managers (cargo, gem, go get, ppm, nix, docker, etc.)
- [ ] Add Binary.min_version that affects .is_valid based on whether it meets minimum SemVer threshold

Other Packages We Like

https://github.com/MrThearMan/django-signal-webhooks
https://github.com/MrThearMan/django-admin-data-views
https://github.com/lazybird/django-solo
https://github.com/joshourisman/django-pydantic-settings
https://github.com/surenkov/django-pydantic-field
https://github.com/jordaneremieff/djantic

pypi package. Binary | Source

Latest version: 0.5.4 Released: 2024-10-21

pydantic-view

Pydantic view helper decorator

Installation

```bash
pip install pydantic_view
```

Usage

```python
In [1]: from pydantic import BaseModel, Field
   ...: from pydantic_view import view
   ...:
   ...:
   ...: class User(BaseModel):
   ...:     id: int
   ...:     username: str
   ...:     password: str
   ...:     address: str
   ...:
   ...:
   ...: @view("Create", exclude={"id"})
   ...: class UserCreate(User):
   ...:     pass
   ...:
   ...:
   ...: @view("Update")
   ...: class UserUpdate(User):
   ...:     pass
   ...:
   ...:
   ...: @view("Patch")
   ...: class UserPatch(User):
   ...:     username: str = None
   ...:     password: str = None
   ...:     address: str = None
   ...:
   ...:
   ...: @view("Out", exclude={"password"})
   ...: class UserOut(User):
   ...:     pass

In [2]: user = User(id=0, username="human", password="iamaman", address="Earth")
   ...: user.Out()
   ...:
Out[2]: UserOut(id=0, username='human', address='Earth')

In [3]: User.Update(id=0, username="human", password="iamasuperman", address="Earth")
   ...:
Out[3]: UserUpdate(id=0, username='human', password='iamasuperman', address='Earth')

In [4]: User.Patch(id=0, address="Mars")
   ...:
Out[4]: UserPatch(id=0, username=None, password=None, address='Mars')
```

FastAPI example

```python
from typing import Optional

from fastapi import FastAPI
from fastapi.testclient import TestClient
from pydantic import BaseModel, ConfigDict, Field
from pydantic_view import view, view_field_validator


class UserSettings(BaseModel):
    model_config = ConfigDict(extra="forbid")

    public: Optional[str] = None
    secret: Optional[str] = None


@view("Out", exclude={"secret"})
class UserSettingsOut(UserSettings):
    pass


@view("Create")
class UserSettingsCreate(UserSettings):
    pass


@view("Update")
class UserSettingsUpdate(UserSettings):
    pass


@view("Patch")
class UserSettingsPatch(UserSettings):
    public: str = None
    secret: str = None


class User(BaseModel):
    model_config = ConfigDict(extra="forbid")

    id: int
    username: str
    password: str = Field(default_factory=lambda: "password")
    settings: UserSettings

    @view_field_validator({"Create", "Update", "Patch"}, "username")
    @classmethod
    def validate_username(cls, v):
        if len(v) < 3:
            raise ValueError
        return v


@view("Out", exclude={"password"})
class UserOut(User):
    pass


@view("Create", exclude={"id"})
class UserCreate(User):
    settings: UserSettings = Field(default_factory=UserSettings)


@view("Update", exclude={"id"})
class UserUpdate(User):
    pass


@view("Patch", exclude={"id"})
class UserPatch(User):
    username: str = None
    password: str = None
    settings: UserSettings = None


app = FastAPI()

db = {}


@app.get("/users/{user_id}", response_model=User.Out)
async def get(user_id: int) -> User.Out:
    return db[user_id]


@app.post("/users", response_model=User.Out)
async def post(user: User.Create) -> User.Out:
    user_id = 0  # generate_user_id()
    db[0] = User(id=user_id, **user.model_dump())
    return db[0]


@app.put("/users/{user_id}", response_model=User.Out)
async def put(user_id: int, user: User.Update) -> User.Out:
    db[user_id] = User(id=user_id, **user.model_dump())
    return db[user_id]


@app.patch("/users/{user_id}", response_model=User.Out)
async def patch(user_id: int, user: User.Patch) -> User.Out:
    db[user_id] = User(**{**db[user_id].model_dump(), **user.model_dump(exclude_unset=True)})
    return db[user_id]


def test_fastapi():
    client = TestClient(app)

    # POST
    response = client.post(
        "/users",
        json={
            "username": "admin",
            "password": "admin",
        },
    )
    assert response.status_code == 200, response.text
    assert response.json() == {
        "id": 0,
        "username": "admin",
        "settings": {"public": None},
    }

    # GET
    response = client.get("/users/0")
    assert response.status_code == 200, response.text
    assert response.json() == {
        "id": 0,
        "username": "admin",
        "settings": {"public": None},
    }

    # PUT
    response = client.put(
        "/users/0",
        json={
            "username": "superadmin",
            "password": "superadmin",
            "settings": {"public": "foo", "secret": "secret"},
        },
    )
    assert response.status_code == 200, response.text
    assert response.json() == {
        "id": 0,
        "username": "superadmin",
        "settings": {"public": "foo"},
    }

    # PATCH
    response = client.patch(
        "/users/0",
        json={
            "username": "guest",
            "settings": {"public": "bar"},
        },
    )
    assert response.status_code == 200, response.text
    assert response.json() == {
        "id": 0,
        "username": "guest",
        "settings": {"public": "bar"},
    }
```

pypi package. Binary

Latest version: 2.0.1 Released: 2024-10-14

pydantic-glue

JSON Schema to AWS Glue schema converter

Contents:
- Installation
- What?
- Why?
- Example
- Override the type for the AWS Glue Schema
- How it works?
- Future work

Installation

```bash
pip install pydantic-glue
```

What?

Converts pydantic schemas to JSON Schema and then to AWS Glue schema, so in theory anything that can be converted to JSON Schema could also work.

Why?

When using AWS Kinesis Firehose in a configuration that receives JSONs and writes parquet files on S3, one needs to define an AWS Glue table so Firehose knows what schema to use when creating the parquet files.

AWS Glue lets you define a schema using Avro or JSON Schema and then create a table from that schema, but as of May 2022 there are limitations on AWS such that tables created that way can't be used with Kinesis Firehose. This is also confirmed by AWS support.

What one could do is create a table and set the columns manually, but this means you now have two sources of truth to maintain.

This tool allows you to define a table in pydantic and generate a JSON with column types that can be used with terraform to create a Glue table.

Example

Take the following pydantic class:

```python title="example.py"
from pydantic import BaseModel
from typing import List

class Bar(BaseModel):
    name: str
    age: int

class Foo(BaseModel):
    nums: List[int]
    bars: List[Bar]
    other: str
```

Running pydantic-glue:

```bash
pydantic-glue -f example.py -c Foo
```

you get this JSON in the terminal:

```json
{
  "//": "Generated by pydantic-glue at 2022-05-25 12:35:55.333570. DO NOT EDIT",
  "columns": {
    "nums": "array<int>",
    "bars": "array<struct<name:string,age:int>>",
    "other": "string"
  }
}
```

and can be used in terraform like that:

```terraform
locals {
  columns = jsondecode(file("${path.module}/glue_schema.json")).columns
}

resource "aws_glue_catalog_table" "table" {
  name          = "table_name"
  database_name = "db_name"

  storage_descriptor {
    dynamic "columns" {
      for_each = local.columns
      content {
        name = columns.key
        type = columns.value
      }
    }
  }
}
```

Alternatively you can run the CLI with the -o flag to set the output file location:

```bash
pydantic-glue -f example.py -c Foo -o example.json -l
```

If your Pydantic models use field aliases but you prefer to display the field names in the JSON schema, you can enable this behavior by using the --schema-by-name flag. Here you can find the details regarding pydantic aliases. The following model will be converted differently with the --schema-by-name argument.

```python
from pydantic import BaseModel, Field

class A(BaseModel):
    hey: str = Field(alias="h")
    ho: str
```

```bash
pydantic-glue -f tests/data/input.py -c A

2025-02-01 00:08:45,046 - INFO - Generated file content:
{
    "//": "Generated by pydantic-glue at 2025-01-31 23:08:45.046012+00:00. DO NOT EDIT",
    "columns": {
        "h": "string",
        "ho": "string"
    }
}
```

```bash
pydantic-glue -f tests/data/input.py -c A --schema-by-name

2025-02-01 00:09:18,381 - INFO - Generated file content:
{
    "//": "Generated by pydantic-glue at 2025-01-31 23:09:18.380586+00:00. DO NOT EDIT",
    "columns": {
        "hey": "string",
        "ho": "string"
    }
}
```

Override the type for the AWS Glue Schema

Wherever there is a type key in the input JSON Schema, an additional key glue_type may be defined to override the type that is used in the AWS Glue Schema. This is, for example, useful for a pydantic model that has a field of type int that is unix epoch time, while the column type you would like in Glue is of type timestamp.
Additional JSON Schema keys can be added to a pydantic model by using the Field function with the argument json_schema_extra, like so:

```python
from pydantic import BaseModel, Field

class A(BaseModel):
    epoch_time: int = Field(
        ...,
        json_schema_extra={
            "glue_type": "timestamp",
        },
    )
```

The resulting JSON Schema will be:

```json
{
  "properties": {
    "epoch_time": {
      "glue_type": "timestamp",
      "title": "Epoch Time",
      "type": "integer"
    }
  },
  "required": [
    "epoch_time"
  ],
  "title": "A",
  "type": "object"
}
```

And the result after processing with pydantic-glue:

```json
{
  "//": "Generated by pydantic-glue at 2022-05-25 12:35:55.333570. DO NOT EDIT",
  "columns": {
    "epoch_time": "timestamp"
  }
}
```

Recursing through object properties terminates when you supply a glue_type to use. If the type is complex, you must supply the full complex type yourself, as sketched below.

How it works?

- pydantic gets converted to JSON Schema
- the JSON Schema types get mapped to Glue types recursively

Future work

- Not all types are supported; I just add types as I need them. Adding types is very easy, so feel free to open issues or send a PR if you stumbled upon a non-supported use case.
- The tool could easily be extended to work with JSON Schema directly; thus, anything that can be converted to a JSON Schema should also work.
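For the complex-type case mentioned above, here is a hypothetical sketch (the Event model and its override value are illustrative, not from the project docs): supplying the full Glue complex type stops the recursion at that property:

```python
from typing import List
from pydantic import BaseModel, Field

class Event(BaseModel):
    # hypothetical override: the complete Glue type is written out by hand,
    # so pydantic-glue uses it verbatim instead of recursing into the list
    tags: List[str] = Field(
        ...,
        json_schema_extra={"glue_type": "array<string>"},
    )
```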

pypi package. Binary | Source

Latest version: 0.6.1 Released: 2025-02-01

pydantic-zarr

pydantic-zarr Pydantic models for Zarr.

⚠️ Disclaimer ⚠️ This project is under a lot of flux -- I want to add zarr version 3 support to this project, but the reference python implementation doesn't support version 3 yet. Also, the key ideas in this repo may change in the process of being formalized over in this specification (currently just a draft). As the ecosystem evolves I will be breaking things (and versioning the project accordingly), so be advised!

Installation

```shell
pip install -U pydantic-zarr
```

Help

See the documentation for detailed information about this project.

Example

```python
import zarr
from pydantic_zarr import GroupSpec

group = zarr.group(path='foo')
array = zarr.create(store=group.store, path='foo/bar', shape=10, dtype='uint8')
array.attrs.put({'metadata': 'hello'})

# this is a pydantic model
spec = GroupSpec.from_zarr(group)
print(spec.model_dump())
"""
{
    'zarr_version': 2,
    'attributes': {},
    'members': {
        'bar': {
            'zarr_version': 2,
            'attributes': {'metadata': 'hello'},
            'shape': (10,),
            'chunks': (10,),
            'dtype': '|u1',
            'fill_value': 0,
            'order': 'C',
            'filters': None,
            'dimension_separator': '.',
            'compressor': {
                'id': 'blosc',
                'cname': 'lz4',
                'clevel': 5,
                'shuffle': 1,
                'blocksize': 0,
            },
        }
    },
}
"""
```
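Going the other direction, here is a minimal sketch of materializing the spec above back into storage; the to_zarr method is assumed from the project documentation, so check it against the installed version:

```python
from zarr.storage import MemoryStore

# assumed API: GroupSpec.to_zarr(store, path) creates the group, its members,
# and their attributes from the spec built in the example above
store = MemoryStore()
group2 = spec.to_zarr(store, path='foo2')
print(group2['bar'].attrs['metadata'])  # 'hello'
```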

pypi package. Binary

Latest version: 0.7.0 Released: 2024-03-20

pydantic-conf

pydantic-conf Overview

pydantic-conf is a Python library for managing application configuration using Pydantic. It supports loading configuration from environment variables and allows for custom startup actions.

Installation

To install the package, use:

```sh
pip install pydantic-conf
```

Usage

Defining Configuration

Create a configuration class by inheriting from EnvAppConfig:

```python
from pydantic_conf.config import EnvAppConfig

class MyConfig(EnvAppConfig):
    app_name: str
    debug: bool = False
```

Loading Configuration

Load the configuration using the load method:

```python
config = MyConfig.load()
print(config.app_name)
print(config.debug)
```

Adding Startup Actions

Add startup actions by appending to the STARTUP list:

```python
def startup_action(config):
    print(f"Starting up with {config.app_name}")

MyConfig.STARTUP.append(startup_action)

config = MyConfig.load()
```

License

This project is licensed under the MIT License.
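Since EnvAppConfig loads values from the environment, here is a quick sketch of what that presumably looks like for the MyConfig class above; the upper-cased variable naming is an assumption, so check the project docs for the exact convention:

```python
import os

# assumption: field names map to upper-cased environment variables
os.environ["APP_NAME"] = "my-app"
os.environ["DEBUG"] = "true"

config = MyConfig.load()
assert config.app_name == "my-app"
assert config.debug is True
```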

pypi package. Binary

Latest version: 1.0.2 Released: 2025-03-18

bump-pydantic

Bump Pydantic ♻️

Bump Pydantic is a tool to help you migrate your code from Pydantic V1 to V2.

[!NOTE]\
If you find bugs, please report them on the issue tracker.

Table of contents

- Bump Pydantic ♻️
- Table of contents
- Installation
- Usage
  - Check diff before applying changes
  - Apply changes
- Rules
  - BP001: Add default None to Optional[T], Union[T, None] and Any fields
  - BP002: Replace Config class by model_config attribute
  - BP003: Replace Field old parameters to new ones
  - BP004: Replace imports
  - BP005: Replace GenericModel by BaseModel
  - BP006: Replace __root__ by RootModel
  - BP007: Replace decorators
  - BP008: Replace con* functions by Annotated versions
  - BP009: Mark pydantic "protocol" functions in custom types with proper TODOs
- License

Installation

The installation is as simple as:

```bash
pip install bump-pydantic
```

Usage

bump-pydantic is a CLI tool, hence you can use it from your terminal. It's easy to use.

If your project structure is:

```bash
repository/
└── my_package/
    └── ...
```

Then you'll want to do:

```bash
cd /path/to/repository
bump-pydantic my_package
```

Check diff before applying changes

To check the diff before applying the changes, you can run:

```bash
bump-pydantic --diff
```

Apply changes

To apply the changes, you can run:

```bash
bump-pydantic
```

Rules

You can find below the list of rules that are applied by bump-pydantic. It's also possible to disable rules by using the --disable option.

BP001: Add default None to Optional[T], Union[T, None] and Any fields

βœ… Add default None to Optional[T] fields.

The following code will be transformed:

```py
class User(BaseModel):
    name: Optional[str]
```

Into:

```py
class User(BaseModel):
    name: Optional[str] = None
```

BP002: Replace Config class by model_config attribute

βœ… Replace Config class by model_config = ConfigDict().
βœ… Rename old Config attributes to new model_config attributes.
βœ… Add a TODO comment in case the transformation can't be done automatically.
βœ… Replace Extra enum by string values.

The following code will be transformed:

```py
from pydantic import BaseModel, Extra


class User(BaseModel):
    name: str

    class Config:
        extra = Extra.forbid
```

Into:

```py
from pydantic import ConfigDict, BaseModel


class User(BaseModel):
    name: str
    model_config = ConfigDict(extra="forbid")
```

BP003: Replace Field old parameters to new ones

βœ… Replace Field old parameters to new ones.
βœ… Replace field: Enum = Field(Enum.VALUE, const=True) by field: Literal[Enum.VALUE] = Enum.VALUE.

The following code will be transformed:

```py
from typing import List

from pydantic import BaseModel, Field


class User(BaseModel):
    name: List[str] = Field(..., min_items=1)
```

Into:

```py
from typing import List

from pydantic import BaseModel, Field


class User(BaseModel):
    name: List[str] = Field(..., min_length=1)
```

BP004: Replace imports

βœ… Replace BaseSettings from pydantic to pydantic_settings.
βœ… Replace Color and PaymentCardNumber from pydantic to pydantic_extra_types.

BP005: Replace GenericModel by BaseModel

βœ… Replace GenericModel by BaseModel.

The following code will be transformed:

```py
from typing import Generic, TypeVar

from pydantic.generics import GenericModel

T = TypeVar('T')


class User(GenericModel, Generic[T]):
    name: str
```

Into:

```py
from typing import Generic, TypeVar

from pydantic import BaseModel

T = TypeVar('T')


class User(BaseModel, Generic[T]):
    name: str
```

BP006: Replace __root__ by RootModel

βœ… Replace __root__ by RootModel.
The following code will be transformed:

```py
from typing import List

from pydantic import BaseModel


class User(BaseModel):
    age: int
    name: str


class Users(BaseModel):
    __root__: List[User]
```

Into:

```py
from typing import List

from pydantic import RootModel, BaseModel


class User(BaseModel):
    age: int
    name: str


class Users(RootModel[List[User]]):
    pass
```

BP007: Replace decorators

βœ… Replace @validator by @field_validator.
βœ… Replace @root_validator by @model_validator.

The following code will be transformed:

```py
from pydantic import BaseModel, validator, root_validator


class User(BaseModel):
    name: str

    @validator('name', pre=True)
    def validate_name(cls, v):
        return v

    @root_validator(pre=True)
    def validate_root(cls, values):
        return values
```

Into:

```py
from pydantic import BaseModel, field_validator, model_validator


class User(BaseModel):
    name: str

    @field_validator('name', mode='before')
    def validate_name(cls, v):
        return v

    @model_validator(mode='before')
    def validate_root(cls, values):
        return values
```

BP008: Replace con* functions by Annotated versions

βœ… Replace constr(*args) by Annotated[str, StringConstraints(*args)].
βœ… Replace conint(*args) by Annotated[int, Field(*args)].
βœ… Replace confloat(*args) by Annotated[float, Field(*args)].
βœ… Replace conbytes(*args) by Annotated[bytes, Field(*args)].
βœ… Replace condecimal(*args) by Annotated[Decimal, Field(*args)].
βœ… Replace conset(T, *args) by Annotated[Set[T], Field(*args)].
βœ… Replace confrozenset(T, *args) by Annotated[FrozenSet[T], Field(*args)].
βœ… Replace conlist(T, *args) by Annotated[List[T], Field(*args)].

The following code will be transformed:

```py
from pydantic import BaseModel, constr


class User(BaseModel):
    name: constr(min_length=1)
```

Into:

```py
from pydantic import BaseModel, StringConstraints
from typing_extensions import Annotated


class User(BaseModel):
    name: Annotated[str, StringConstraints(min_length=1)]
```

BP009: Mark Pydantic "protocol" functions in custom types with proper TODOs

βœ… Mark __get_validators__ as to be replaced by __get_pydantic_core_schema__.
βœ… Mark __modify_schema__ as to be replaced by __get_pydantic_json_schema__.

The following code will be transformed:

```py
class SomeThing:
    @classmethod
    def __get_validators__(cls):
        yield from []

    @classmethod
    def __modify_schema__(cls, field_schema, field):
        if field:
            field_schema['example'] = "Weird example"
```

Into:

```py
class SomeThing:
    @classmethod
    # TODO[pydantic]: We couldn't refactor `__get_validators__`, please create the `__get_pydantic_core_schema__` manually.
    # Check https://docs.pydantic.dev/latest/migration/#defining-custom-types for more information.
    def __get_validators__(cls):
        yield from []

    @classmethod
    # TODO[pydantic]: We couldn't refactor `__modify_schema__`, please create the `__get_pydantic_json_schema__` manually.
    # Check https://docs.pydantic.dev/latest/migration/#defining-custom-types for more information.
    def __modify_schema__(cls, field_schema, field):
        if field:
            field_schema['example'] = "Weird example"
```

License

This project is licensed under the terms of the MIT license.

pypi package. Binary | Source

Latest version: 0.8.0 Released: 2023-12-28

pydantic-fhir

This is a stub for FHIR generated by fhirzeug.

Format

All profiles are in one file.

FHIR-Specific JSON Representation

Generally this generated code tries to stick as close as possible to the FHIR JSON spec. Another important reference is the FHIR Datatypes spec.

Empty Strings

String property values can never be empty. Either the property is absent, or it is present with at least one character of content. - https://www.hl7.org/fhir/STU3/json.html

Additionally, whitespace is stripped: Note: This means that a string that consists only of whitespace could be trimmed to nothing, which would be treated as an invalid element value. Therefore strings SHOULD always contain non-whitespace content. - https://www.hl7.org/fhir/datatypes.html#primitive

That means empty strings are interpreted as null values. This could lead to invalid arrays ([""]). This follows the behavior of HAPI and Vonk.

DateTime Values

Datetime values are strings as well. That means an empty string, or a string of only whitespace, is treated as a null value, which then is not set at all.

null Values

Just as in XML, JSON objects and arrays are never empty, and properties never have null values (except for a special case documented below). Omit a property if it is empty. - https://www.hl7.org/fhir/json.html#xml

That means specifically that if a property contains a null value, it is as if it had never been set. Example:

```python
from r4 import Patient

Patient(name=None).dict()
# {}
```

ValueSets and CodeSystems

The FHIR Specification provides different ways to define a ValueSet. The implementation varies depending on the use case:
- If a ValueSet is based on a single CodeSystem and this CodeSystem is defined in FHIR, then the ValueSet is validated by an enum.
- If a ValueSet is based on a single CodeSystem that is not included in the FHIR specification, but FHIR provides an exhaustive list of possible values, then the ValueSet is validated by a typing.Literal.
- Otherwise, the field is validated by a very permissive regex [^\s]+(\s[^\s]+)*.
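Here is a hypothetical sketch of the whitespace rule above, reusing the README's r4 import (the id field and the exact trimming behavior are assumptions, not taken from the generated code):

```python
from r4 import Patient

# assumption: a whitespace-only string is trimmed to nothing and then treated
# as if the property had never been set, per the rules quoted above
patient = Patient(id="  ")
patient.dict()
# {}
```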

pypi package. Binary | Source

Latest version: 0.0.1a18 Released: 2021-11-22