Package maintenance

pydantic-zarr

pydantic-zarr: Pydantic models for Zarr.

⚠️ Disclaimer ⚠️ This project is in a lot of flux -- I want to add Zarr version 3 support to this project, but the reference Python implementation doesn't support version 3 yet. Also, the key ideas in this repo may change in the process of being formalized over in this specification (currently just a draft). As the ecosystem evolves I will be breaking things (and versioning the project accordingly), so be advised!

Installation

```shell
pip install -U pydantic-zarr
```

Help

See the documentation for detailed information about this project.

Example

```python
import zarr
from pydantic_zarr import GroupSpec

group = zarr.group(path='foo')
array = zarr.create(store=group.store, path='foo/bar', shape=10, dtype='uint8')
array.attrs.put({'metadata': 'hello'})

# this is a pydantic model
spec = GroupSpec.from_zarr(group)
print(spec.model_dump())
"""
{
    'zarr_version': 2,
    'attributes': {},
    'members': {
        'bar': {
            'zarr_version': 2,
            'attributes': {'metadata': 'hello'},
            'shape': (10,),
            'chunks': (10,),
            'dtype': '|u1',
            'fill_value': 0,
            'order': 'C',
            'filters': None,
            'dimension_separator': '.',
            'compressor': {
                'id': 'blosc',
                'cname': 'lz4',
                'clevel': 5,
                'shuffle': 1,
                'blocksize': 0,
            },
        }
    },
}
"""
```
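The dumped structure above can be mirrored with plain dataclasses. The sketch below is a library-free illustration of the idea (hypothetical `ArraySpec`/`GroupSpec` stand-ins, not pydantic-zarr's actual models):

```python
from dataclasses import asdict, dataclass, field
from typing import Any, Dict, Tuple


@dataclass
class ArraySpec:
    # Mirrors the per-array metadata shown in the dump above
    shape: Tuple[int, ...]
    chunks: Tuple[int, ...]
    dtype: str
    attributes: Dict[str, Any] = field(default_factory=dict)
    zarr_version: int = 2


@dataclass
class GroupSpec:
    # A group is attributes plus named members
    members: Dict[str, ArraySpec] = field(default_factory=dict)
    attributes: Dict[str, Any] = field(default_factory=dict)
    zarr_version: int = 2


spec = GroupSpec(members={"bar": ArraySpec(shape=(10,), chunks=(10,),
                                           dtype="|u1",
                                           attributes={"metadata": "hello"})})
dumped = asdict(spec)  # nested dict shaped like the model_dump() output above
```

The real library layers pydantic validation on top of this shape, so malformed metadata fails at construction time rather than at store-access time.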

pypi package. Binary

Latest version: 0.7.0 Released: 2024-03-20

pydantic-meta

pypi package. Binary

Latest version: 0.3.3 Released: 2023-04-03

pydantic-dict

pydantic_dict: a pydantic model subclass that implements Python's dictionary interface.

Example:

```python
from pydantic_dict import BaseModelDict

class User(BaseModelDict):
    id: int
    name: str = "Jane Doe"

user = User(id=42)

user["session_id"] = "95607c42-250a-4913-9dfb-00eb6535e685"
assert user.session_id == "95607c42-250a-4913-9dfb-00eb6535e685"
assert user["session_id"] == "95607c42-250a-4913-9dfb-00eb6535e685"

user.pop("session_id")
assert "session_id" not in user
assert user.get("last_name", None) is None

user.update({"email": "jane.doe@email.com"})
print(user.json())
# >>> {"id": 42, "name": "Jane Doe", "email": "jane.doe@email.com"}

user.clear()  # fields are NOT removed; only non-fields are removed
print(user.json())
# >>> {"id": 42, "name": "Jane Doe"}

user.setdefault("last_active", "2023-01-01T19:56:10Z")
del user["last_active"]
```

Unset marker type

The Unset marker type provides a way to "mark" that an optional model field is not set by default and is not required to construct the model. This enables more semantic usage of built-in dict methods like get() and setdefault() that can return or set a default value. Likewise, fields that are Unset are not considered members of a BaseModelDict dictionary (e.g. "unset_field" not in model_dict) and are not included in __iter__(), keys(), values(), or len(model_dict). This feature is especially useful when refactoring existing code to use pydantic.

Example:

```python
from typing import Optional

from pydantic_dict import BaseModelDict, Unset

class User(BaseModelDict):
    id: int
    name: str = "Jane Doe"
    email: Optional[str] = Unset

user = User(id=42)

assert "email" not in user
# user["email"] raises KeyError

assert len(user) == 2
assert set(user.keys()) == {"id", "name"}

user.setdefault("email", f"{user.id}@service.com")  # returns "42@service.com"
assert "email" in user
```

Install

```shell
pip install pydantic_dict
```
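The Unset behaviour is an instance of the standard sentinel pattern. The following is a minimal library-free sketch of that idea (hypothetical classes, not pydantic_dict's actual implementation):

```python
class _Unset:
    """Sentinel marking a field as 'not set' (illustration only)."""
    def __repr__(self):
        return "Unset"

Unset = _Unset()


class DictModel:
    """Tiny dict-like model: attributes equal to Unset are treated as absent."""

    def __init__(self, **kwargs):
        self.__dict__.update(kwargs)

    def keys(self):
        # Unset fields are invisible to the dict interface
        return [k for k, v in self.__dict__.items() if v is not Unset]

    def __contains__(self, key):
        return key in self.keys()

    def __getitem__(self, key):
        if key not in self:
            raise KeyError(key)
        return self.__dict__[key]

    def setdefault(self, key, default):
        if key not in self:
            self.__dict__[key] = default
        return self.__dict__[key]


user = DictModel(id=42, name="Jane Doe", email=Unset)
```

Because membership is defined via the sentinel check rather than attribute existence, `email` stays declared on the object yet behaves as absent until explicitly set.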

pypi package. Binary

Latest version: 0.0.3 Released: 2023-06-30

pydantic-scim

pydantic-scim Largely generated by running datamodel-codegen on the files in schema/, and then cleaned up by hand.

pypi package. Binary

Latest version: 0.0.8 Released: 2023-10-26

pydantic-slim

pydantic-slim This is a placeholder in case we want to use this package name in future.

pypi package. Binary | Source

Latest version: 0.0.0 Released: 2024-03-29

ocsf-pydantic

OCSF-Pydantic: Pydantic validated models for OCSF schema events & objects.

Installation

```shell
pip install ocsf-pydantic
```

Usage

All event & object models are available via ocsf:

```python
from ocsf.events.application.api import Api

apievent: dict = ...  # an OCSF-normalized dict
Api(**apievent)
```

License

This project is licensed under the MIT License.

pypi package. Binary | Source

Latest version: 0.0.6 Released: 2024-11-11

pydantic-pony

pydantic-pony Tools to generate Pydantic models from Pony ORM models.

pypi package. Binary | Source

Latest version: 0.0.1 Released: 2023-03-31

pydantic-bind

pydantic-bind

Table of Contents: Overview | Getting Started | Why Not Protobufs? | No Copy | Supported Types | Inheritance | Msgpack | Namespaces | Generated Code | Other Languages

Overview

Python is the language of choice for finance, data science, etc. Python calling C++ (and increasingly, Rust) is a common pattern, leveraging packages such as pybind11. A common problem is how best to represent data to be shared between Python and C++ code. One would like idiomatic representations in each language, and this may be necessary to fully utilise certain Python packages. E.g., FastAPI is a popular way to create REST services, using Open API definitions derived from pydantic classes. Therefore, a data model authored using pydantic classes, or native Python dataclasses, from which sensible C++ structs and appropriate marshalling can automatically be generated, is desirable.

This package provides such tools: a cmake rule allows you to generate C++ structs (with msgpack serialisation) and corresponding pybind11 bindings. Python functions allow you to navigate between the C++ pybind11 objects and the native Python objects. There is also an option for all Python operations to be directed to an owned pybind11 object (see No Copy).

Note that the typical Python developer experience is now somewhat changed, in that it's necessary to build/install the project. I personally use JetBrains CLion, in place of PyCharm, for such projects.
For an example of the kind of behaviour-less object model this package is intended to help with, please see (the rather nascent) fin-data-model.

Getting Started

pydantic_bind adds a custom cmake rule: pydantic_bind_add_package(). This rule will do the following:

- scan for sub-packages
- scan each sub-package for all .py files
- add custom steps for generating .cpp/.h files from any of the following, encountered in the .py files:
  - dataclasses
  - classes derived from pydantic's BaseModel
  - enums

C++ directory and namespace structure will match the Python package structure (see Namespaces). You can create an instance of the pybind11 class from your original using get_pybind_value(), e.g.,

my_class.py:

```python
from dataclasses import dataclass

@dataclass
class MyClass:
    my_int: int
    my_string: str | None
```

CMakeLists.txt:

```cmake
cmake_minimum_required(VERSION 3.9)
project(my_project)

set(CMAKE_CXX_STANDARD 20)

find_package(Python3 REQUIRED COMPONENTS Interpreter Development)
find_package(pydantic_bind REQUIRED HINTS "${Python3_SITELIB}")

pydantic_bind_add_package(my_package)
```

my_util.py:

```python
from pydantic_bind import get_pybind_value
from my_package.my_class import MyClass

orig = MyClass(my_int=123, my_string="hello")
generated = get_pybind_value(orig)

print(f"my_int: {orig.my_int}, {generated.my_int}")
```

Why Not Protobufs?

I personally find protobufs to be a PITA to use: they have poor to no variant support, the generated code is ugly and idiosyncratic, they're large and painful to copy around, etc. AVRO is more friendly but generates Python classes dynamically, which confuses IDEs like PyCharm. I do think a good solution is something like pydantic_avro, where one can define the classes using pydantic, generate the AVRO schema and then the generated C++, etc. I might well try to converge this project with that approach. I was inspired to some degree by this blog.
No Copy

One annoyance of multi-language representations of data objects is that you often end up copying data around where you'd prefer to share a single copy. This is the raison d'être for protobufs and their ilk. In this project I've created implementations of BaseModel and dataclass which allow Python to use the underlying C++ data representation, rather than holding its own copy.

Deriving from this BaseModel will give you functionality equivalent to pydantic's BaseModel. The annotations are re-written using computed_field, with property getters and setters operating on the generated pybind class, which is instantiated behind the scenes in __init__. Note that this will make some operations (especially those that access __dict__) less efficient. I've also plumbed the computed fields into the JSON schema, so these objects can be used with FastAPI.

dataclass works similarly, adding properties to the dataclass, so that the existing get and set functionality works seamlessly in accessing the generated pybind11 class (also set via a shimmed __init__).

Using regular dataclass or BaseModel as members of classes defined with the pydantic_bind versions is very inefficient and not recommended.

Supported Types

The following Python --> C++ mappings are supported (there are likely others I should consider):

- bool --> bool
- float --> double
- int --> int
- str --> std::string
- datetime.date --> std::chrono::system_clock::time_point
- datetime.datetime --> std::chrono::system_clock::time_point
- datetime.time --> std::chrono::system_clock::time_point
- datetime.timedelta --> std::chrono::duration
- pydantic.BaseModel --> struct
- pydantic_bind.BaseModel --> struct
- dataclass --> struct
- pydantic_bind.dataclass --> struct
- Enum --> enum class

Inheritance

I have tested single inheritance (see Generated Code). Multiple inheritance may work... or it may not. I'd generally advise against using it for data classes.
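The Python --> C++ mapping in Supported Types amounts to a lookup table. As a hypothetical illustration (a sketch, not pydantic-bind's API), it could be expressed like this:

```python
import datetime as dt
from dataclasses import is_dataclass
from enum import Enum

# Python type -> generated C++ type, per the Supported Types table (sketch only)
CPP_TYPES = {
    bool: "bool",
    float: "double",
    int: "int",
    str: "std::string",
    dt.date: "std::chrono::system_clock::time_point",
    dt.datetime: "std::chrono::system_clock::time_point",
    dt.time: "std::chrono::system_clock::time_point",
    dt.timedelta: "std::chrono::duration",
}


def cpp_type(py_type) -> str:
    """Best-effort mapping of a Python annotation to its C++ counterpart."""
    if isinstance(py_type, type) and issubclass(py_type, Enum):
        return "enum class"
    if is_dataclass(py_type):
        return "struct"
    # Models and dataclasses not in the table fall through to struct
    return CPP_TYPES.get(py_type, "struct")
```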
Msgpack

A rather rudimentary msgpack implementation is added to the generated C++ structs, using a slightly modified version of cpppack. It wasn't clear to me whether that package is maintained or accepting submissions, so I copied and slightly modified msgpack.h (also, I couldn't work out how to add it to my project with my rather rudimentary cmake skills!) Changes include:

- fixing includes
- support for std::optional
- support for std::variant
- support for enums

A likely future enhancement will be to use cereal and add a msgpack adaptor. The no-copy Python objects add to_msg_pack() and from_msg_pack() (the latter being a class method) to access this functionality.

Namespaces

Directory structure and namespaces in the generated C++ match the Python package and module names. cmake requires unique target names, and pybind11 requires that the filename (minus the OS-specific qualifiers) matches the module name.

Generated Code

Code is generated into a directory structure underneath /generated. Headers are installed to /include. Compiled pybind11 modules are installed into /__pybind__. For C++ usage, you need only the headers; the compiled code is for pybind/Python usage only. For the example below, common_object_model/common_object_model/v1/common/__pybind__/foo.cpython-311-darwin.so will be installed (obviously with corresponding qualifiers for Linux/Windows). get_pybind_value() searches this directory. Imports/includes should work seamlessly (the Python import scheme will be copied). I have tested this, but not completely rigorously.
common_object_model/common_object_model/v1/common/foo.py:

```python
from dataclasses import dataclass
import datetime as dt
from enum import Enum, auto
from typing import Union

from pydantic_bind import BaseModel


class Weekday(Enum):
    MONDAY = auto()
    TUESDAY = auto()
    WEDNESDAY = auto()
    THURSDAY = auto()
    FRIDAY = auto()
    SATURDAY = auto()
    SUNDAY = auto()


@dataclass
class DCFoo:
    my_int: int
    my_string: str | None


class Foo(BaseModel):
    my_bool: bool = True
    my_day: Weekday = Weekday.SUNDAY


class Bar(Foo):
    my_int: int = 123
    my_string: str
    my_optional_string: str | None = None


class Baz(BaseModel):
    my_variant: Union[str, float] = 123.
    my_date: dt.date
    my_foo: Foo
    my_dc_foo: DCFoo
```

will generate the following files (template arguments and includes reconstructed here from the struct definitions; the original listing lost them to formatting):

common_object_model/generated/common_object_model/v1/common/foo.h:

```cpp
#ifndef COMMON_OBJECT_MODEL_FOO_H
#define COMMON_OBJECT_MODEL_FOO_H

#include <chrono>
#include <optional>
#include <string>
#include <variant>

#include "msgpack.h"

namespace common_object_model::v1::common
{
    enum Weekday
    {
        MONDAY = 1,
        TUESDAY = 2,
        WEDNESDAY = 3,
        THURSDAY = 4,
        FRIDAY = 5,
        SATURDAY = 6,
        SUNDAY = 7
    };

    struct DCFoo
    {
        DCFoo() :
            my_string(), my_int()
        {
        }

        DCFoo(std::optional<std::string> my_string, int my_int) :
            my_string(my_string), my_int(my_int)
        {
        }

        std::optional<std::string> my_string;
        int my_int;

        MSGPACK_DEFINE(my_string, my_int);
    };

    struct Foo
    {
        Foo(bool my_bool=true, Weekday my_day=SUNDAY) :
            my_bool(my_bool), my_day(my_day)
        {
        }

        bool my_bool;
        Weekday my_day;

        MSGPACK_DEFINE(my_bool, my_day);
    };

    struct Bar : public Foo
    {
        Bar() :
            Foo(), my_string(), my_int(123), my_optional_string(std::nullopt)
        {
        }

        Bar(std::string my_string, bool my_bool=true, Weekday my_day=SUNDAY, int my_int=123,
            std::optional<std::string> my_optional_string=std::nullopt) :
            Foo(my_bool, my_day), my_string(std::move(my_string)), my_int(my_int),
            my_optional_string(my_optional_string)
        {
        }

        std::string my_string;
        int my_int;
        std::optional<std::string> my_optional_string;

        MSGPACK_DEFINE(my_string, my_bool, my_day, my_int, my_optional_string);
    };

    struct Baz
    {
        Baz() :
            my_dc_foo(), my_foo(), my_date(), my_variant(123.0)
        {
        }

        Baz(DCFoo my_dc_foo, Foo my_foo, std::chrono::system_clock::time_point my_date,
            std::variant<std::string, double> my_variant=123.0) :
            my_dc_foo(std::move(my_dc_foo)), my_foo(std::move(my_foo)), my_date(my_date),
            my_variant(my_variant)
        {
        }

        DCFoo my_dc_foo;
        Foo my_foo;
        std::chrono::system_clock::time_point my_date;
        std::variant<std::string, double> my_variant;

        MSGPACK_DEFINE(my_dc_foo, my_foo, my_date, my_variant);
    };
} // common_object_model

#endif // COMMON_OBJECT_MODEL_FOO_H
```

common_object_model/generated/common_object_model/v1/common/foo.cpp:

```cpp
#include <pybind11/pybind11.h>
#include <pybind11/chrono.h>
#include <pybind11/stl.h>

#include "foo.h"

namespace py = pybind11;
using namespace common_object_model::v1::common;

PYBIND11_MODULE(common_object_model_v1_common_foo, m)
{
    py::enum_<Weekday>(m, "Weekday")
        .value("MONDAY", Weekday::MONDAY)
        .value("TUESDAY", Weekday::TUESDAY)
        .value("WEDNESDAY", Weekday::WEDNESDAY)
        .value("THURSDAY", Weekday::THURSDAY)
        .value("FRIDAY", Weekday::FRIDAY)
        .value("SATURDAY", Weekday::SATURDAY)
        .value("SUNDAY", Weekday::SUNDAY);

    py::class_<DCFoo>(m, "DCFoo")
        .def(py::init<>())
        .def(py::init<std::optional<std::string>, int>(), py::arg("my_string"), py::arg("my_int"))
        .def("to_msg_pack", &DCFoo::to_msg_pack)
        .def_static("from_msg_pack", &DCFoo::from_msg_pack)
        .def_readwrite("my_string", &DCFoo::my_string)
        .def_readwrite("my_int", &DCFoo::my_int);

    py::class_<Foo>(m, "Foo")
        .def(py::init<bool, Weekday>(), py::arg("my_bool")=true, py::arg("my_day")=SUNDAY)
        .def("to_msg_pack", &Foo::to_msg_pack)
        .def_static("from_msg_pack", &Foo::from_msg_pack)
        .def_readwrite("my_bool", &Foo::my_bool)
        .def_readwrite("my_day", &Foo::my_day);

    py::class_<Bar, Foo>(m, "Bar")
        .def(py::init<>())
        .def(py::init<std::string, bool, Weekday, int, std::optional<std::string>>(),
             py::arg("my_string"), py::arg("my_bool")=true, py::arg("my_day")=SUNDAY,
             py::arg("my_int")=123, py::arg("my_optional_string")=std::nullopt)
        .def("to_msg_pack", &Bar::to_msg_pack)
        .def_static("from_msg_pack", &Bar::from_msg_pack)
        .def_readwrite("my_string", &Bar::my_string)
        .def_readwrite("my_int", &Bar::my_int)
        .def_readwrite("my_optional_string", &Bar::my_optional_string);

    py::class_<Baz>(m, "Baz")
        .def(py::init<>())
        .def(py::init<DCFoo, Foo, std::chrono::system_clock::time_point,
                      std::variant<std::string, double>>(),
             py::arg("my_dc_foo"), py::arg("my_foo"), py::arg("my_date"),
             py::arg("my_variant")=123.0)
        .def("to_msg_pack", &Baz::to_msg_pack)
        .def_static("from_msg_pack", &Baz::from_msg_pack)
        .def_readwrite("my_dc_foo", &Baz::my_dc_foo)
        .def_readwrite("my_foo", &Baz::my_foo)
        .def_readwrite("my_date", &Baz::my_date)
        .def_readwrite("my_variant", &Baz::my_variant);
}
```

Other languages

When time allows, I will look at adding support for Rust. There is limited value in generating Java or C# classes; calling those VM-based languages in-process from Python has never worked well, in my experience.

pypi package. Binary

Latest version: 1.0.5 Released: 2023-10-27

BIDS-pydantic

BIDS-pydantic

Overview

BIDS-pydantic will pull a specified version (from v1.7.0 onwards, tested up to v1.7.0) of the BIDS metadata schema which is used in the JSON BIDS sidecar, from the official BIDS GitHub page, and create corresponding pydantic models, which will provide BIDS data validation using Python type annotations. Alternatively, the BIDS-pydantic-models package will only install the models for direct use in your Python software. More information on the use of the models can be found here.

Table of Contents: Quickstart | Installation | Usage | Development | Acknowledgements | License

How To Contribute

Got a great idea for something to implement in BIDS-pydantic, or maybe you have just found a bug? Create an issue to get in touch with the development team and we'll take it from there.

Quickstart

If you just want to use the models in your project, download the pydantic models file for the BIDS schema version you wish to use from the models directory, and add it to your code-base. These files are generated using the bids-pydantic make -a command (see below). Alternatively, you can just run:

```shell
$ pip install bids-pydantic-models
```

More information on the use of the models can be found here. If you want to use the command line tool to generate the models, keep reading this README.

Installation

Install with:

```shell
$ pip install bids-pydantic
```

BIDS-pydantic can be installed as a module directly from the Python package index. For more information on how to use a Python package in this way, please see https://docs.python.org/3/installing/index.html

Python Version

We recommend using the latest version of Python. BIDS-pydantic supports Python 3.9 and newer.

Dependencies

These distributions will be installed automatically when installing BIDS-pydantic:

- pydantic
- datamodel-code-generator

Usage

The primary commands can be viewed with the command bids-pydantic:

```
usage: bids-pydantic [-h] {list,make} ...

Run one of a set of commands. For example: bids-pydantic list, or
bids-pydantic make. Run either command with -h e.g. bids-pydantic make -h to
get help for that command.

optional arguments:
  -h, --help   show this help message and exit

command:
  {list,make}  subcommand to run
```

The list command help can be viewed with the command bids-pydantic list -h:

```
usage: bids-pydantic list [-h]

Queries the GitHub API and lists the available supported BIDS schema
versions. Only tested up to v1.7.0.

optional arguments:
  -h, --help  show this help message and exit
```

The make command help can be viewed with the command bids-pydantic make -h:

```
usage: bids-pydantic make [-h] [--output OUTPUT] [--output-all OUTPUT_ALL]
                          [--schema-version [SCHEMA_VERSION]]

Make a new python file(s) containing BIDS compliant pydantic models

optional arguments:
  -h, --help            show this help message and exit
  --output OUTPUT, -o OUTPUT
                        The output python filename to create (will output to
                        stdout console if not specified).
  --output-all OUTPUT_ALL, -a OUTPUT_ALL
                        Find all parsable schemas and output each to the
                        provided directory. Will create filenames such as
                        bids_schema_model_v_1_7_0.py, etc. Will overwrite any
                        files in that directory with the same name.
  --schema-version [SCHEMA_VERSION]
                        The BIDS schema version to use. e.g. 1.7.0 -
                        supported versions can be discovered using the list
                        command. If a version is not specified v1.7.0 will be
                        used.
  --input INPUT, -i INPUT
                        Specify an input BIDS metadata (yml) file to use
                        instead of downloading a version from GitHub. Cannot
                        be used with --schema-version or --output-all.
```

Development

Development dependencies should be installed using pip install -r requirements/dev.txt -U, and pre-commit install then run to install code-quality Git hooks. Development should be carried out using Python 3.8.

Development must comply with a few code styling/quality rules and processes:

- Before pushing any code, make sure the CHANGELOG.md is updated as per the instructions in the CHANGELOG.md file. tox should also be run to ensure that tests and code-quality checks pass.
- The README.md file should be updated with any usage or development instructions.
- Ensure that a good level of test coverage is kept. The test reports will be committed to the CI system when testing is run, and these will be made available during code review. If you wish to view test coverage locally, run coverage report.
- To ensure these code quality rules are kept to, pre-commit should be installed (see requirements/dev.txt), and pre-commit install run when first cloning this repo. This will install some pre-commit hooks that will ensure any committed code meets the minimum code-quality bar and is formatted correctly before being committed to Git. This will ensure that tests will pass on the CI system after code is pushed. The tools should also be included in any IDEs/editors used, where possible. To run manually, run pre-commit run --all-files. The following software tools will be run:
  - mypy
  - pylint
  - black
  - isort
  - pyupgrade

Acknowledgements

Conversion from schema to pydantic models is carried out using datamodel-code-generator. Data validation is performed using pydantic.

License

You can check out the full license here. This project is licensed under the terms of the MIT license.

pypi package. Binary | Source

Latest version: 0.0.3 Released: 2023-06-08

pydantic-conf

pydantic-conf

Overview

pydantic-conf is a Python library for managing application configuration using Pydantic. It supports loading configuration from environment variables and allows for custom startup actions.

Installation

To install the package, use:

```shell
pip install pydantic-conf
```

Usage

Defining Configuration

Create a configuration class by inheriting from EnvAppConfig:

```python
from pydantic_conf.config import EnvAppConfig


class MyConfig(EnvAppConfig):
    app_name: str
    debug: bool = False
```

Loading Configuration

Load the configuration using the load method:

```python
config = MyConfig.load()
print(config.app_name)
print(config.debug)
```

Adding Startup Actions

Add startup actions by appending to the STARTUP list:

```python
def startup_action(config):
    print(f"Starting up with {config.app_name}")


MyConfig.STARTUP.append(startup_action)
config = MyConfig.load()
```

License

This project is licensed under the MIT License.
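Loading typed config from environment variables is a common pattern. As a library-free sketch of the idea (a hypothetical dataclass version, not pydantic-conf's implementation, which adds Pydantic validation on top):

```python
import os
from dataclasses import dataclass, fields


@dataclass
class EnvConfig:
    """Minimal sketch of env-variable config loading (illustration only)."""

    @classmethod
    def load(cls, prefix: str = ""):
        kwargs = {}
        for f in fields(cls):
            raw = os.environ.get(prefix + f.name.upper())
            if raw is not None:
                # Coerce the env string to the field's annotated type
                if f.type is bool:
                    kwargs[f.name] = raw.lower() in ("1", "true")
                else:
                    kwargs[f.name] = f.type(raw)
        return cls(**kwargs)


@dataclass
class MyConfig(EnvConfig):
    app_name: str = "demo"
    debug: bool = False


os.environ["APP_NAME"] = "my-service"
os.environ["DEBUG"] = "true"
config = MyConfig.load()
```

Fields fall back to their declared defaults when the corresponding variable is absent, which is the same ergonomics the `MyConfig.load()` call above relies on.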

pypi package. Binary

Latest version: 1.0.2 Released: 2025-03-18