# Pydantic

Data validation using Python type hints.

Fast and extensible, Pydantic plays nicely with your linters/IDE/brain. Define how data should be in pure, canonical Python 3.9+; validate it with Pydantic.

## Pydantic Logfire :fire:

We've recently launched Pydantic Logfire to help you monitor your applications. Learn more.

## Pydantic V1.10 vs. V2

Pydantic V2 is a ground-up rewrite that offers many new features and performance improvements, along with some breaking changes, compared to Pydantic V1.

If you're using Pydantic V1, you may want to look at the Pydantic V1.10 documentation or the `1.10.X-fixes` git branch. Pydantic V2 also ships with the latest version of Pydantic V1 built in, so that you can incrementally upgrade your code base and projects: `from pydantic import v1 as pydantic_v1`.

## Help

See the documentation for more details.

## Installation

Install using `pip install -U pydantic` or `conda install pydantic -c conda-forge`. For more installation options to make Pydantic even faster, see the Install section in the documentation.

## A Simple Example

```python
from datetime import datetime
from typing import Optional

from pydantic import BaseModel


class User(BaseModel):
    id: int
    name: str = 'John Doe'
    signup_ts: Optional[datetime] = None
    friends: list[int] = []


external_data = {'id': '123', 'signup_ts': '2017-06-01 12:22', 'friends': [1, '2', b'3']}
user = User(**external_data)
print(user)
#> id=123 name='John Doe' signup_ts=datetime.datetime(2017, 6, 1, 12, 22) friends=[1, 2, 3]
print(user.id)
#> 123
```

## Contributing

For guidance on setting up a development environment and how to make a contribution to Pydantic, see Contributing to Pydantic.

## Reporting a Security Vulnerability

See our security policy.

## Changelog

### v2.11.4 (2025-04-29)

GitHub release

#### What's Changed

##### Packaging

* Bump mkdocs-llmstxt to v0.2.0 by @Viicos in #11725

##### Changes

* Allow config and bases to be specified together in `create_model()` by @Viicos in #11714.
  This change was backported as it was previously possible (although not meant to be supported) to provide `model_config` as a field, which would make it possible to provide both configuration and bases.

##### Fixes

* Remove generics cache workaround by @Viicos in #11755
* Remove coercion of decimal constraints by @Viicos in #11772
* Fix crash when expanding root type in the mypy plugin by @Viicos in #11735
* Fix issue with recursive generic models by @Viicos in #11775
* Traverse function-before schemas during schema gathering by @Viicos in #11801

### v2.11.3 (2025-04-08)

GitHub release

#### What's Changed

##### Packaging

* Update V1 copy to v1.10.21 by @Viicos in #11706

##### Fixes

* Preserve field description when rebuilding model fields by @Viicos in #11698

### v2.11.2 (2025-04-03)

GitHub release

#### What's Changed

##### Fixes

* Bump pydantic-core to v2.33.1 by @Viicos in #11678
* Make sure `__pydantic_private__` exists before setting private attributes by @Viicos in #11666
* Do not override `FieldInfo._complete` when using field from parent class by @Viicos in #11668
* Provide the available definitions when applying discriminated unions by @Viicos in #11670
* Do not expand root type in the mypy plugin for variables by @Viicos in #11676
* Mention the attribute name in model fields deprecation message by @Viicos in #11674
* Properly validate parameterized mappings by @Viicos in #11658

### v2.11.1 (2025-03-28)

GitHub release

#### What's Changed

##### Fixes

* Do not override 'definitions-ref' schemas containing serialization schemas or metadata by @Viicos in #11644

### v2.11.0 (2025-03-27)

GitHub release

#### What's Changed

Pydantic v2.11 is a version strongly focused on the build-time performance of Pydantic models (and core schema generation in general). See the blog post for more details.
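Several fixes in the releases above touch discriminated unions. For context, a minimal sketch of the feature using the standard `Field(discriminator=...)` API (model names are illustrative):

```python
from typing import Literal, Union

from pydantic import BaseModel, Field


class Cat(BaseModel):
    pet_type: Literal['cat']
    meows: int


class Dog(BaseModel):
    pet_type: Literal['dog']
    barks: float


class Owner(BaseModel):
    # The discriminator tells Pydantic which union member to validate
    # against, based solely on the value of the 'pet_type' field.
    pet: Union[Cat, Dog] = Field(discriminator='pet_type')


owner = Owner.model_validate({'pet': {'pet_type': 'dog', 'barks': 2.0}})
print(type(owner.pet).__name__)
```

Discriminated unions avoid trying every union member in turn, which gives better performance and much clearer validation errors.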
##### Packaging

* Bump pydantic-core to v2.33.0 by @Viicos in #11631

##### New Features

* Add `encoded_string()` method to the URL types by @YassinNouh21 in #11580
* Add support for `defer_build` with the `@validate_call` decorator by @Viicos in #11584
* Allow the `@with_config` decorator to be used with keyword arguments by @Viicos in #11608
* Simplify customization of default value inclusion in JSON Schema generation by @Viicos in #11634
* Add `generate_arguments_schema()` function by @Viicos in #11572

##### Fixes

* Allow generic typed dictionaries to be used for unpacked variadic keyword parameters by @Viicos in #11571
* Fix runtime error when computing model string representation involving cached properties and self-referenced models by @Viicos in #11579
* Preserve other steps when using the ellipsis in the pipeline API by @Viicos in #11626
* Fix deferred discriminator application logic by @Viicos in #11591

##### New Contributors

* @cmenon12 made their first contribution in #11562
* @Jeukoh made their first contribution in #11611

### v2.11.0b2 (2025-03-17)

GitHub release

#### What's Changed

##### Packaging

* Bump pydantic-core to v2.32.0 by @Viicos in #11567

##### New Features

* Add experimental support for free threading by @Viicos in #11516

##### Fixes

* Fix `NotRequired` qualifier not taken into account in stringified annotation by @Viicos in #11559

##### New Contributors

* @joren485 made their first contribution in #11547

### v2.11.0b1 (2025-03-06)

GitHub release

#### What's Changed

##### Packaging

* Add a `check_pydantic_core_version()` function by @Viicos in https://github.com/pydantic/pydantic/pull/11324
* Remove greenlet development dependency by @Viicos in https://github.com/pydantic/pydantic/pull/11351
* Use the typing-inspection library by @Viicos in https://github.com/pydantic/pydantic/pull/11479
* Bump pydantic-core to v2.31.1 by @sydney-runkle in https://github.com/pydantic/pydantic/pull/11526

##### New Features

* Support unsubstituted type variables with both a default and a bound or constraints by @FyZzyss in https://github.com/pydantic/pydantic/pull/10789
* Add a `default_factory_takes_validated_data` property to `FieldInfo` by @Viicos in https://github.com/pydantic/pydantic/pull/11034
* Raise a better error when a generic alias is used inside `type[]` by @Viicos in https://github.com/pydantic/pydantic/pull/11088
* Properly support PEP 695 generics syntax by @Viicos in https://github.com/pydantic/pydantic/pull/11189
* Properly support type variable defaults by @Viicos in https://github.com/pydantic/pydantic/pull/11332
* Add support for validating v6, v7, v8 UUIDs by @astei in https://github.com/pydantic/pydantic/pull/11436
* Improve alias configuration APIs by @sydney-runkle in https://github.com/pydantic/pydantic/pull/11468

##### Changes

* Rework `create_model` field definitions format by @Viicos in https://github.com/pydantic/pydantic/pull/11032
* Raise a deprecation warning when a field is annotated as final with a default value by @Viicos in https://github.com/pydantic/pydantic/pull/11168
* Deprecate accessing `model_fields` and `model_computed_fields` on instances by @Viicos in https://github.com/pydantic/pydantic/pull/11169
* **Breaking Change:** Move core schema generation logic for path types inside the `GenerateSchema` class by @sydney-runkle in https://github.com/pydantic/pydantic/pull/10846
* Remove Python 3.8 support by @sydney-runkle in https://github.com/pydantic/pydantic/pull/11258
* Optimize calls to `get_type_ref` by @Viicos in https://github.com/pydantic/pydantic/pull/10863
* Disable pydantic-core core schema validation by @sydney-runkle in https://github.com/pydantic/pydantic/pull/11271

##### Performance

* Only evaluate `FieldInfo` annotations if required during schema building by @Viicos in https://github.com/pydantic/pydantic/pull/10769
* Improve `__setattr__` performance of Pydantic models by caching setter functions by @MarkusSintonen in https://github.com/pydantic/pydantic/pull/10868
* Improve annotation application performance by @Viicos in https://github.com/pydantic/pydantic/pull/11186
* Improve performance of the `_typing_extra` module by @Viicos in https://github.com/pydantic/pydantic/pull/11255
* Refactor and optimize schema cleaning logic by @Viicos in https://github.com/pydantic/pydantic/pull/11244
* Create a single dictionary when creating a `CoreConfig` instance by @sydney-runkle in https://github.com/pydantic/pydantic/pull/11384
* Bump pydantic-core and thus use `SchemaValidator` and `SchemaSerializer` caching by @sydney-runkle in https://github.com/pydantic/pydantic/pull/11402
* Reuse cached core schemas for parametrized generic Pydantic models by @MarkusSintonen in https://github.com/pydantic/pydantic/pull/11434

##### Fixes

* Improve `TypeAdapter` instance repr by @sydney-runkle in https://github.com/pydantic/pydantic/pull/10872
* Use the correct frame when instantiating a parametrized `TypeAdapter` by @Viicos in https://github.com/pydantic/pydantic/pull/10893
* Infer final fields with a default value as class variables in the mypy plugin by @Viicos in https://github.com/pydantic/pydantic/pull/11121
* Recursively unpack `Literal` values if using PEP 695 type aliases by @Viicos in https://github.com/pydantic/pydantic/pull/11114
* Override `__subclasscheck__` on `ModelMetaclass` to avoid memory leak and performance issues by @Viicos in https://github.com/pydantic/pydantic/pull/11116
* Remove unused `_extract_get_pydantic_json_schema()` parameter by @Viicos in https://github.com/pydantic/pydantic/pull/11155
* Improve discriminated union error message for invalid union variants by @Viicos in https://github.com/pydantic/pydantic/pull/11161
* Unpack PEP 695 type aliases if using the `Annotated` form by @Viicos in https://github.com/pydantic/pydantic/pull/11109
* Add missing stacklevel in `deprecated_instance_property` warning by @Viicos in https://github.com/pydantic/pydantic/pull/11200
* Copy `WithJsonSchema` schema to avoid sharing mutated data by @thejcannon in https://github.com/pydantic/pydantic/pull/11014
* Do not cache parametrized models when in the process of parametrizing another model by @Viicos in https://github.com/pydantic/pydantic/pull/10704
* Add discriminated union related metadata entries to the `CoreMetadata` definition by @Viicos in https://github.com/pydantic/pydantic/pull/11216
* Consolidate schema definitions logic in the `_Definitions` class by @Viicos in https://github.com/pydantic/pydantic/pull/11208
* Support initializing root model fields with values of the root type in the mypy plugin by @Viicos in https://github.com/pydantic/pydantic/pull/11212
* Fix various issues with dataclasses and `use_attribute_docstrings` by @Viicos in https://github.com/pydantic/pydantic/pull/11246
* Only compute normalized decimal places if necessary in `decimal_places_validator` by @misrasaurabh1 in https://github.com/pydantic/pydantic/pull/11281
* Add support for `validation_alias` in the mypy plugin by @Viicos in https://github.com/pydantic/pydantic/pull/11295
* Fix JSON Schema reference collection with `"examples"` keys by @Viicos in https://github.com/pydantic/pydantic/pull/11305
* Do not transform model serializer functions as class methods in the mypy plugin by @Viicos in https://github.com/pydantic/pydantic/pull/11298
* Simplify `GenerateJsonSchema.literal_schema()` implementation by @misrasaurabh1 in https://github.com/pydantic/pydantic/pull/11321
* Add additional allowed schemes for `ClickHouseDsn` by @Maze21127 in https://github.com/pydantic/pydantic/pull/11319
* Coerce decimal constraints to `Decimal` instances by @Viicos in https://github.com/pydantic/pydantic/pull/11350
* Use the correct JSON Schema mode when handling function schemas by @Viicos in https://github.com/pydantic/pydantic/pull/11367
* Improve exception message when encountering recursion errors during type evaluation by @Viicos in https://github.com/pydantic/pydantic/pull/11356
* Always include `additionalProperties: True` for arbitrary dictionary schemas by @austinyu in https://github.com/pydantic/pydantic/pull/11392
* Expose the `fallback` parameter in serialization methods by @Viicos in https://github.com/pydantic/pydantic/pull/11398
* Fix path serialization behavior by @sydney-runkle in https://github.com/pydantic/pydantic/pull/11416
* Do not reuse validators and serializers during model rebuild by @Viicos in https://github.com/pydantic/pydantic/pull/11429
* Collect model fields when rebuilding a model by @Viicos in https://github.com/pydantic/pydantic/pull/11388
* Allow cached properties to be altered on frozen models by @Viicos in https://github.com/pydantic/pydantic/pull/11432
* Fix tuple serialization for `Sequence` types by @sydney-runkle in https://github.com/pydantic/pydantic/pull/11435
* Fix: do not check for `__get_validators__` on classes where `__get_pydantic_core_schema__` is also defined by @tlambert03 in https://github.com/pydantic/pydantic/pull/11444
* Allow callable instances to be used as serializers by @Viicos in https://github.com/pydantic/pydantic/pull/11451
* Improve error thrown when overriding a field with a property by @sydney-runkle in https://github.com/pydantic/pydantic/pull/11459
* Fix JSON Schema generation with referenceable core schemas holding JSON metadata by @Viicos in https://github.com/pydantic/pydantic/pull/11475
* Support `strict` specification on union member types by @sydney-runkle in https://github.com/pydantic/pydantic/pull/11481
* Implicitly set `validate_by_name` to `True` when `validate_by_alias` is `False` by @sydney-runkle in https://github.com/pydantic/pydantic/pull/11503
* Change type of `Any` when synthesizing `BaseSettings.__init__` signature in the mypy plugin by @Viicos in https://github.com/pydantic/pydantic/pull/11497
* Support type variable defaults referencing other type variables by @Viicos in https://github.com/pydantic/pydantic/pull/11520
* Fix `ValueError` on year zero by @davidhewitt in https://github.com/pydantic/pydantic-core/pull/1583
* dataclass `InitVar` shouldn't be required on serialization by @sydney-runkle in https://github.com/pydantic/pydantic-core/pull/1602

##### New Contributors

* @FyZzyss made their first contribution in https://github.com/pydantic/pydantic/pull/10789
* @tamird made their first contribution in https://github.com/pydantic/pydantic/pull/10948
* @felixxm made their first contribution in https://github.com/pydantic/pydantic/pull/11077
* @alexprabhat99 made their first contribution in https://github.com/pydantic/pydantic/pull/11082
* @Kharianne made their first contribution in https://github.com/pydantic/pydantic/pull/11111
* @mdaffad made their first contribution in https://github.com/pydantic/pydantic/pull/11177
* @thejcannon made their first contribution in https://github.com/pydantic/pydantic/pull/11014
* @thomasfrimannkoren made their first contribution in https://github.com/pydantic/pydantic/pull/11251
* @usernameMAI made their first contribution in https://github.com/pydantic/pydantic/pull/11275
* @ananiavito made their first contribution in https://github.com/pydantic/pydantic/pull/11302
* @pawamoy made their first contribution in https://github.com/pydantic/pydantic/pull/11311
* @Maze21127 made their first contribution in https://github.com/pydantic/pydantic/pull/11319
* @kauabh made their first contribution in https://github.com/pydantic/pydantic/pull/11369
* @jaceklaskowski made their first contribution in https://github.com/pydantic/pydantic/pull/11353
* @tmpbeing made their first contribution in https://github.com/pydantic/pydantic/pull/11375
* @petyosi made their first contribution in https://github.com/pydantic/pydantic/pull/11405
* @austinyu made their first contribution in https://github.com/pydantic/pydantic/pull/11392
* @mikeedjones made their first contribution in https://github.com/pydantic/pydantic/pull/11402
* @astei made their first contribution in https://github.com/pydantic/pydantic/pull/11436
* @dsayling made their first contribution in https://github.com/pydantic/pydantic/pull/11522
* @sobolevn made their first contribution in https://github.com/pydantic/pydantic-core/pull/1645

### v2.11.0a2 (2025-02-10)

GitHub release

#### What's Changed

Pydantic v2.11 is a version strongly focused on the build-time performance of Pydantic models (and core schema generation in general).
This is another early alpha release, meant to collect early feedback from users having issues with core schema builds.

##### Packaging

* Bump ruff from 0.9.2 to 0.9.5 by @Viicos in #11407
* Bump pydantic-core to v2.29.0 by @mikeedjones in #11402
* Use locally-built rust with symbols & pgo by @davidhewitt in #11403

##### Performance

* Create a single dictionary when creating a `CoreConfig` instance by @sydney-runkle in #11384

##### Fixes

* Use the correct JSON Schema mode when handling function schemas by @Viicos in #11367
* Fix JSON Schema reference logic with `examples` keys by @Viicos in #11366
* Improve exception message when encountering recursion errors during type evaluation by @Viicos in #11356
* Always include `additionalProperties: True` for arbitrary dictionary schemas by @austinyu in #11392
* Expose the `fallback` parameter in serialization methods by @Viicos in #11398
* Fix path serialization behavior by @sydney-runkle in #11416

##### New Contributors

* @kauabh made their first contribution in #11369
* @jaceklaskowski made their first contribution in #11353
* @tmpbeing made their first contribution in #11375
* @petyosi made their first contribution in #11405
* @austinyu made their first contribution in #11392
* @mikeedjones made their first contribution in #11402

### v2.11.0a1 (2025-01-30)

GitHub release

#### What's Changed

Pydantic v2.11 is a version strongly focused on the build-time performance of Pydantic models (and core schema generation in general). This is an early alpha release, meant to collect early feedback from users having issues with core schema builds.
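As context for "core schema builds": a model's validator and serializer are compiled from a core schema when the class is created, and that work can be postponed with the stable `defer_build` config option (a long-standing setting, not specific to this alpha). A minimal sketch:

```python
from pydantic import BaseModel, ConfigDict


class Deferred(BaseModel):
    # With defer_build=True, the core schema is not built at class
    # definition time; it is built lazily on first use instead.
    model_config = ConfigDict(defer_build=True)

    n: int


# The first validation triggers the deferred schema build transparently.
d = Deferred.model_validate({'n': '5'})
print(d.n)
```

Deferring the build shifts the cost away from import time, which is exactly the kind of build-time work these alpha releases aim to speed up.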
##### Packaging

* Bump dawidd6/action-download-artifact from 6 to 7 by @dependabot in #11018
* Re-enable memray related tests on Python 3.12+ by @Viicos in #11191
* Bump astral-sh/setup-uv to 5 by @dependabot in #11205
* Bump ruff to v0.9.0 by @sydney-runkle in #11254
* Regular uv.lock deps update by @sydney-runkle in #11333
* Add a `check_pydantic_core_version()` function by @Viicos in #11324
* Remove greenlet development dependency by @Viicos in #11351
* Bump pydantic-core to v2.28.0 by @Viicos in #11364

##### New Features

* Support unsubstituted type variables with both a default and a bound or constraints by @FyZzyss in #10789
* Add a `default_factory_takes_validated_data` property to `FieldInfo` by @Viicos in #11034
* Raise a better error when a generic alias is used inside `type[]` by @Viicos in #11088
* Properly support PEP 695 generics syntax by @Viicos in #11189
* Properly support type variable defaults by @Viicos in #11332

##### Changes

* Rework `create_model` field definitions format by @Viicos in #11032
* Raise a deprecation warning when a field is annotated as final with a default value by @Viicos in #11168
* Deprecate accessing `model_fields` and `model_computed_fields` on instances by @Viicos in #11169
* Move core schema generation logic for path types inside the `GenerateSchema` class by @sydney-runkle in #10846
* Move deque schema gen to the `GenerateSchema` class by @sydney-runkle in #11239
* Move `Mapping` schema gen to `GenerateSchema` to complete removal of the `prepare_annotations_for_known_type` workaround by @sydney-runkle in #11247
* Remove Python 3.8 support by @sydney-runkle in #11258
* Disable pydantic-core core schema validation by @sydney-runkle in #11271

##### Performance

* Only evaluate `FieldInfo` annotations if required during schema building by @Viicos in #10769
* Optimize calls to `get_type_ref` by @Viicos in #10863
* Improve `__setattr__` performance of Pydantic models by caching setter functions by @MarkusSintonen in #10868
* Improve annotation application performance by @Viicos in #11186
* Improve performance of the `_typing_extra` module by @Viicos in #11255
* Refactor and optimize schema cleaning logic by @Viicos and @MarkusSintonen in #11244

##### Fixes

* Add validation tests for `_internal/_validators.py` by @tkasuz in #10763
* Improve `TypeAdapter` instance repr by @sydney-runkle in #10872
* Revert "ci: use locally built pydantic-core with debug symbols" by @sydney-runkle in #10942
* Re-enable all FastAPI tests by @tamird in #10948
* Fix typo in HISTORY.md by @felixxm in #11077
* Infer final fields with a default value as class variables in the mypy plugin by @Viicos in #11121
* Recursively unpack `Literal` values if using PEP 695 type aliases by @Viicos in #11114
* Override `__subclasscheck__` on `ModelMetaclass` to avoid memory leak and performance issues by @Viicos in #11116
* Remove unused `_extract_get_pydantic_json_schema()` parameter by @Viicos in #11155
* Add FastAPI and SQLModel to third-party tests by @sydney-runkle in #11044
* Fix conditional expressions syntax for third-party tests by @Viicos in #11162
* Move FastAPI tests to third-party workflow by @Viicos in #11164
* Improve discriminated union error message for invalid union variants by @Viicos in #11161
* Unpack PEP 695 type aliases if using the `Annotated` form by @Viicos in #11109
* Include openapi-python-client check in issue creation for third-party failures, use main branch by @sydney-runkle in #11182
* Add pandera third-party tests by @Viicos in #11193
* Add ODMantic third-party tests by @sydney-runkle in #11197
* Add missing stacklevel in `deprecated_instance_property` warning by @Viicos in #11200
* Copy `WithJsonSchema` schema to avoid sharing mutated data by @thejcannon in #11014
* Do not cache parametrized models when in the process of parametrizing another model by @Viicos in #10704
* Re-enable Beanie third-party tests by @Viicos in #11214
* Add discriminated union related metadata entries to the `CoreMetadata` definition by @Viicos in #11216
* Consolidate schema definitions logic in the `_Definitions` class by @Viicos in #11208
* Support initializing root model fields with values of the root type in the mypy plugin by @Viicos in #11212
* Fix various issues with dataclasses and `use_attribute_docstrings` by @Viicos in #11246
* Only compute normalized decimal places if necessary in `decimal_places_validator` by @misrasaurabh1 in #11281
* Fix two misplaced sentences in validation errors documentation by @ananiavito in #11302
* Fix mkdocstrings inventory example in documentation by @pawamoy in #11311
* Add support for `validation_alias` in the mypy plugin by @Viicos in #11295
* Do not transform model serializer functions as class methods in the mypy plugin by @Viicos in #11298
* Simplify `GenerateJsonSchema.literal_schema()` implementation by @misrasaurabh1 in #11321
* Add additional allowed schemes for `ClickHouseDsn` by @Maze21127 in #11319
* Coerce decimal constraints to `Decimal` instances by @Viicos in #11350
* Fix `ValueError` on year zero by @davidhewitt in pydantic-core#1583

##### New Contributors

* @FyZzyss made their first contribution in #10789
* @tamird made their first contribution in #10948
* @felixxm made their first contribution in #11077
* @alexprabhat99 made their first contribution in #11082
* @Kharianne made their first contribution in #11111
* @mdaffad made their first contribution in #11177
* @thejcannon made their first contribution in #11014
* @thomasfrimannkoren made their first contribution in #11251
* @usernameMAI made their first contribution in #11275
* @ananiavito made their first contribution in #11302
* @pawamoy made their first contribution in #11311
* @Maze21127 made their first contribution in #11319

### v2.10.6 (2025-01-23)

GitHub release

#### What's Changed

##### Fixes

* Fix JSON Schema reference collection with `'examples'` keys by @Viicos in #11325
* Fix URL python serialization by @sydney-runkle in #11331

### v2.10.5 (2025-01-08)

GitHub release

#### What's Changed

##### Fixes

* Remove custom MRO implementation of Pydantic models by @Viicos in #11184
* Fix URL serialization for unions by @sydney-runkle in #11233

### v2.10.4 (2024-12-18)

GitHub release

#### What's Changed

##### Packaging

* Bump pydantic-core to v2.27.2 by @davidhewitt in #11138

##### Fixes

* Fix comparison of `AnyUrl` objects by @alexprabhat99 in #11082
* Properly fetch PEP 695 type params for functions, do not fetch annotations from signature by @Viicos in #11093
* Include JSON Schema input core schema in function schemas by @Viicos in #11085
* Add `len` to `_BaseUrl` to avoid `TypeError` by @Kharianne in #11111
* Make sure the type reference is removed from the seen references by @Viicos in #11143

##### New Contributors

* @FyZzyss made their first contribution in #10789
* @tamird made their first contribution in #10948
* @felixxm made their first contribution in #11077
* @alexprabhat99 made their first contribution in #11082
* @Kharianne made their first contribution in #11111

### v2.10.3 (2024-12-03)

GitHub release

#### What's Changed

##### Fixes

* Set fields when `defer_build` is set on Pydantic dataclasses by @Viicos in #10984
* Do not resolve the JSON Schema reference for dict core schema keys by @Viicos in #10989
* Use the globals of the function when evaluating the return type for `PlainSerializer` and `WrapSerializer` functions by @Viicos in #11008
* Fix host required enforcement for URLs to be compatible with v2.9 behavior by @sydney-runkle in #11027
* Add a `default_factory_takes_validated_data` property to `FieldInfo` by @Viicos in #11034
* Fix URL JSON schema in serialization mode by @sydney-runkle in #11035

### v2.10.2 (2024-11-25)

GitHub release

#### What's Changed

##### Fixes

* Only evaluate `FieldInfo` annotations if required during schema building by @Viicos in #10769
* Do not evaluate annotations for private fields by @Viicos in #10962
* Support serialization as any for `Secret` types and `Url` types by @sydney-runkle in #10947
* Fix type hint of `Field.default` to be compatible with Python 3.8 and 3.9 by @Viicos in #10972
* Add hashing support for URL types by @sydney-runkle in #10975
* Hide `BaseModel.__replace__` definition from type checkers by @Viicos in #10979

### v2.10.1 (2024-11-21)

GitHub release

#### What's Changed

##### Packaging

* Bump pydantic-core version to v2.27.1 by @sydney-runkle in #10938

##### Fixes

* Use the correct frame when instantiating a parametrized `TypeAdapter` by @Viicos in #10893
* Relax check for validated data in `default_factory` utils by @sydney-runkle in #10909
* Fix type checking issue with `model_fields` and `model_computed_fields` by @sydney-runkle in #10911
* Use the parent configuration during schema generation for stdlib dataclasses by @sydney-runkle in #10928
* Use the globals of the function when evaluating the return type of serializers and `computed_field`s by @Viicos in #10929
* Fix URL constraint application by @sydney-runkle in #10922
* Fix URL equality with different validation methods by @sydney-runkle in #10934
* Fix JSON schema title when specified as `''` by @sydney-runkle in #10936
* Fix python mode serialization for `complex` inference by @sydney-runkle in pydantic-core#1549

### v2.10.0 (2024-11-20)

GitHub release

The code released in v2.10.0 is practically identical to that of v2.10.0b2.

See the v2.10 release blog post for the highlights!

#### What's Changed

##### Packaging

* Bump pydantic-core to v2.27.0 by @sydney-runkle in #10825
* Replace pdm with uv by @frfahim in #10727

##### New Features

* Support `fractions.Fraction` by @sydney-runkle in #10318
* Support `Hashable` for JSON validation by @sydney-runkle in #10324
* Add a `SocketPath` type for linux systems by @theunkn0wn1 in #10378
* Allow arbitrary refs in JSON schema examples by @sydney-runkle in #10417
* Support `defer_build` for Pydantic dataclasses by @Viicos in #10313
* Add v1 / v2 incompatibility warning for nested v1 models by @sydney-runkle in #10431
* Add support for unpacked `TypedDict` to type hint variadic keyword arguments with `@validate_call` by @Viicos in #10416
* Support compiled patterns in `protected_namespaces` by @sydney-runkle in #10522
* Add support for `propertyNames` in JSON schema by @FlorianSW in #10478
* Add `__replace__` protocol for Python 3.13+ support by @sydney-runkle in #10596
* Expose public `sort` method for JSON schema generation by @sydney-runkle in #10595
* Add runtime validation of `@validate_call` callable argument by @kc0506 in #10627
* Add `experimental_allow_partial` support by @samuelcolvin in #10748
* Support default factories taking validated data as an argument by @Viicos in #10678
* Allow subclassing `ValidationError` and `PydanticCustomError` by @Youssefares in pydantic/pydantic-core#1413
* Add trailing-strings support to `experimental_allow_partial` by @sydney-runkle in #10825
* Add `rebuild()` method for `TypeAdapter` and simplify `defer_build` patterns by @sydney-runkle in #10537
* Improve `TypeAdapter` instance repr by @sydney-runkle in #10872

##### Changes

* Don't allow customization of `SchemaGenerator` until the interface is more stable by @sydney-runkle in #10303
* Cleanly support `defer_build` on `TypeAdapter`s, removing the experimental flag by @sydney-runkle in #10329
* Fix MRO of generic subclasses by @kc0506 in #10100
* Strip whitespace on JSON Schema title generation by @sydney-runkle in #10404
* Use `b64decode` and `b64encode` for the `Base64Bytes` type by @sydney-runkle in #10486
* Relax protected namespace config default by @sydney-runkle in #10441
* Revalidate parametrized generics if the instance's origin is a subclass of the original class by @sydney-runkle in #10666
* Warn if configuration is specified both on the `@dataclass` decorator and with the `__pydantic_config__` attribute by @sydney-runkle in #10406
* Recommend against using `Ellipsis` (`...`) with `Field` by @Viicos in #10661
* Migrate to subclassing instead of the annotated approach for Pydantic URL types by @sydney-runkle in #10662
* Change JSON schema generation of `Literal`s and `Enum`s by @Viicos in #10692
* Simplify unions involving `Any` or `Never` when replacing type variables by @Viicos in #10338
* Do not require padding when decoding base64 bytes by @bschoenmaeckers in pydantic/pydantic-core#1448
* Support dates all the way to 1BC by @changhc in pydantic/speedate#77

##### Performance

* Schema cleaning: skip unnecessary copies during schema walking by @Viicos in #10286
* Refactor namespace logic for annotations evaluation by @Viicos in #10530
* Improve email regexp on edge cases by @AlekseyLobanov in #10601
* `CoreMetadata` refactor with an emphasis on documentation, schema build time performance, and reducing complexity by @sydney-runkle in #10675

##### Fixes

* Remove guarding check on `computed_field` with `field_serializer` by @nix010 in #10390
* Fix `Predicate` issue in v2.9.0 by @sydney-runkle in #10321
* Fix annotated-types bound by @sydney-runkle in #10327
* Turn the tzdata install requirement into an optional timezone dependency by @jakob-keller in #10331
* Use correct types namespace when building namedtuple core schemas by @Viicos in #10337
* Fix evaluation of stringified annotations during namespace inspection by @Viicos in #10347
* Fix `IncEx` type alias definition by @Viicos in #10339
* Do not error when trying to evaluate annotations of private attributes by @Viicos in #10358
* Fix nested type statement by @kc0506 in #10369
* Improve typing of `ModelMetaclass.mro` by @Viicos in #10372
* Fix class access of deprecated `computed_field`s by @Viicos in #10391
* Make sure `inspect.iscoroutinefunction` works on coroutines decorated with `@validate_call` by @MovisLi in #10374
* Fix `NameError` when using `validate_call` with PEP 695 on a class by @kc0506 in #10380
* Fix `ZoneInfo` with various invalid types by @sydney-runkle in #10408
* Fix `PydanticUserError` on empty `model_config` with annotations by @cdwilson in #10412
* Fix variance issue in the `_IncEx` type alias, only allow `True` by @Viicos in #10414
* Fix serialization schema generation when using `PlainValidator` by @Viicos in #10427
* Fix schema generation error when the serialization schema holds references by @Viicos in #10444
* Inline references if possible when generating schema for `json_schema_input_type` by @Viicos in #10439
* Fix recursive arguments in `Representation` by @Viicos in #10480
* Fix representation for builtin function types by @kschwab in #10479
* Add python validators for decimal constraints (`max_digits` and `decimal_places`) by @sydney-runkle in #10506
* Only fetch `__pydantic_core_schema__` from the current class during schema generation by @Viicos in #10518
* Fix stacklevel on deprecation warnings for `BaseModel` by @sydney-runkle in #10520
* Fix warning stacklevel in `BaseModel.__init__` by @Viicos in #10526
* Improve error handling for in-evaluable refs for discriminator application by @sydney-runkle in #10440
* Change the signature of `ConfigWrapper.core_config` to take the title directly by @Viicos in #10562
* Do not use the previous config from the stack for dataclasses without config by @Viicos in #10576
* Fix serialization for IP types with `mode='python'` by @sydney-runkle in #10594
* Support constraint application for `Base64Etc` types by @sydney-runkle in #10584
* Fix `validate_call` ignoring `Field` in `Annotated` by @kc0506 in #10610
* Raise an error when `Self` is invalid by @kc0506 in #10609
* Use `core_schema.InvalidSchema` instead of metadata injection + checks by @sydney-runkle in #10523
* Tweak type alias logic by @kc0506 in #10643
* Support usage of `type` with `typing.Self` and type aliases by @kc0506 in #10621
* Use overloads for `Field` and `PrivateAttr` functions by @Viicos in #10651
* Clean up the mypy plugin implementation by @Viicos in #10669
* Properly check for the typing_extensions variant of `TypeAliasType` by @Daraan in #10713
* Allow any mapping in `BaseModel.model_copy()` by @Viicos in #10751
* Fix `isinstance` behavior for URLs by @sydney-runkle in #10766
* Ensure `cached_property` can be set on Pydantic models by @Viicos in #10774
* Fix equality checks for primitives in literals by @sydney-runkle in pydantic/pydantic-core#1459
* Properly enforce `host_required` for URLs by @Viicos in pydantic/pydantic-core#1488
* Fix error when `coerce_numbers_to_str` is enabled and the string has an invalid Unicode character by @andrey-berenda in pydantic/pydantic-core#1515
* Fix serializing complex values in `Enum`s by @changhc in pydantic/pydantic-core#1524
* Refactor the `_typing_extra` module by @Viicos in #10725
* Support intuitive equality for URLs by @sydney-runkle in #10798
* Add `bytearray` to `TypeAdapter.validate_json` signature by @samuelcolvin in #10802
* Ensure class access of method descriptors is performed when used as a default with `Field` by @Viicos in #10816
* Fix circular import with `validate_call` by @sydney-runkle in #10807
* Fix error when using type aliases referencing other type aliases by @Viicos in #10809
* Fix `IncEx` type alias to be compatible with mypy by @Viicos in #10813
* Make `__signature__` a lazy property, do not deepcopy defaults by @Viicos in #10818
* Make `__signature__` lazy for dataclasses, too by @sydney-runkle in #10832
* Subclass all single host URL classes from `AnyUrl` to preserve behavior from v2.9 by @sydney-runkle in #10856

##### New Contributors

* @jakob-keller made their first contribution in #10331
* @MovisLi made their first contribution in #10374
* @joaopalmeiro made their first contribution in #10405
* @theunkn0wn1 made their first contribution in #10378
* @cdwilson made their first contribution in #10412
* @dlax made their first contribution in #10421
* @kschwab made their first contribution in #10479
* @santibreo made their first contribution in #10453
* @FlorianSW made their first contribution in #10478
* @tkasuz made their first contribution in #10555
* @AlekseyLobanov made their first contribution in #10601
* @NiclasvanEyk made their first contribution in #10667
* @mschoettle made their first contribution in #10677
* @Daraan made their first contribution in #10713
* @k4nar made their first contribution in #10736
@UriyaHarpeness made their first contribution in #10740 @frfahim made their first contribution in #10727 v2.10.0b2 (2024-11-13) Pre-release, see the GitHub release for details. v2.10.0b1 (2024-11-06) Pre-release, see the GitHub release for details. ... see here for earlier changes.
Latest version: 2.11.4 Released: 2025-04-29
A Python package that uses Pydantic to validate rows in data tables.
Latest version: 0.3.1 Released: 2024-11-07
qt-pydantic The qt-pydantic package adds support for Qt types in Pydantic BaseModels. Using these annotations allows for easy serialization and deserialization of Qt types. Installation Install using pip:

```shell
pip install qt-pydantic
```

Usage

```python
from PySide6 import QtCore, QtGui
from pydantic import BaseModel

from qt_pydantic import QSize, QColor, QDate


# Define a model with Qt types
class Settings(BaseModel):
    size: QSize
    date: QDate
    color: QColor


# Parse a JSON string into the model
json_data = '{"size": [720, 480], "date": "2021-01-01", "color": [255, 95, 135]}'
settings = Settings.model_validate_json(json_data)

# Model types are actual Qt types
assert isinstance(settings.size, QtCore.QSize)
assert isinstance(settings.date, QtCore.QDate)
assert isinstance(settings.color, QtGui.QColor)

# Turn the model into a JSON string
data = settings.model_dump_json(indent=2)
```

Contributing To contribute please refer to the Contributing Guide. License MIT License. Copyright 2024 - Beat Reichenbach. See the License file for details.
Streamlit Pydantic Auto-generate Streamlit UI elements from Pydantic models. Getting Started • Documentation • Support • Report a Bug • Contribution • Changelog st-pydantic is a fork of the fantastic streamlit-pydantic package, which is no longer maintained by its original author, @LukasMasuch. I tried reaching out to the original maintainer, but I did not get a response, so I created this fork. I intend to maintain it and add new features as needed. The original README is below. st-pydantic makes it easy to auto-generate UI elements from Pydantic models or dataclasses. Just define your data model and turn it into a full-fledged UI form. It supports data validation, nested models, and field limitations. st-pydantic can be easily integrated into any Streamlit app. Beta Version: Only suggested for experimental usage. Try out and explore various examples in our playground here. Highlights 🪄 Auto-generated UI elements from Pydantic models & dataclasses. 📇 Out-of-the-box data validation. 📑 Supports nested Pydantic models. 📏 Supports field limits and customizations. 🎈 Easy to integrate into any Streamlit app. Getting Started Installation Requirements: Python 3.6+.

```bash
pip install st-pydantic
```

Usage Create a script (my_script.py) with a Pydantic model and render it via pydantic_form:

```python
import streamlit as st
from pydantic import BaseModel

import st_pydantic as sp


class ExampleModel(BaseModel):
    some_text: str
    some_number: int
    some_boolean: bool


data = sp.pydantic_form(key="my_form", model=ExampleModel)
if data:
    st.json(data.json())
```

Run the Streamlit server on the Python script: streamlit run my_script.py You can find additional examples in the examples section below. Examples 👉 Try out and explore these examples in our playground here The following collection of examples demonstrates how Streamlit Pydantic can be applied in more advanced scenarios. You can find additional - even more advanced - examples in the examples folder or in the playground.
Simple Form

```python
import streamlit as st
from pydantic import BaseModel

import st_pydantic as sp


class ExampleModel(BaseModel):
    some_text: str
    some_number: int
    some_boolean: bool


data = sp.pydantic_form(key="my_form", model=ExampleModel)
if data:
    st.json(data.json())
```

Date Validation

```python
import streamlit as st
from pydantic import BaseModel, Field, HttpUrl
from pydantic.color import Color

import st_pydantic as sp


class ExampleModel(BaseModel):
    url: HttpUrl
    color: Color
    email: str = Field(..., max_length=100, regex=r"^\S+@\S+$")


data = sp.pydantic_form(key="my_form", model=ExampleModel)
if data:
    st.json(data.json())
```

Dataclasses Support

```python
import dataclasses
import json

import streamlit as st
from pydantic.json import pydantic_encoder

import st_pydantic as sp


@dataclasses.dataclass
class ExampleModel:
    some_number: int
    some_boolean: bool
    some_text: str = "default input"


data = sp.pydantic_form(key="my_form", model=ExampleModel)
if data:
    st.json(json.dumps(data, default=pydantic_encoder))
```

Complex Nested Model

```python
from enum import Enum
from typing import Set

import streamlit as st
from pydantic import BaseModel, Field

import st_pydantic as sp


class OtherData(BaseModel):
    text: str
    integer: int


class SelectionValue(str, Enum):
    FOO = "foo"
    BAR = "bar"


class ExampleModel(BaseModel):
    long_text: str = Field(..., description="Unlimited text property")
    integer_in_range: int = Field(
        20,
        ge=10,
        lt=30,
        multiple_of=2,
        description="Number property with a limited range.",
    )
    single_selection: SelectionValue = Field(
        ..., description="Only select a single item from a set."
    )
    multi_selection: Set[SelectionValue] = Field(
        ..., description="Allows multiple items from a set."
    )
    single_object: OtherData = Field(
        ...,
        description="Another object embedded into this model.",
    )


data = sp.pydantic_form(key="my_form", model=ExampleModel)
if data:
    st.json(data.json())
```

Render Input

```python
from pydantic import BaseModel

import st_pydantic as sp


class ExampleModel(BaseModel):
    some_text: str
    some_number: int = 10  # Optional
    some_boolean: bool = True  # Optional


input_data = sp.pydantic_input("model_input", ExampleModel, use_sidebar=True)
```

Render Output

```python
import datetime

from pydantic import BaseModel, Field

import st_pydantic as sp


class ExampleModel(BaseModel):
    text: str = Field(..., description="A text property")
    integer: int = Field(..., description="An integer property.")
    date: datetime.date = Field(..., description="A date.")


instance = ExampleModel(text="Some text", integer=40, date=datetime.date.today())
sp.pydantic_output(instance)
```

Custom Form

```python
import streamlit as st
from pydantic import BaseModel

import st_pydantic as sp


class ExampleModel(BaseModel):
    some_text: str
    some_number: int = 10
    some_boolean: bool = True


with st.form(key="pydantic_form"):
    sp.pydantic_input(key="my_input_model", model=ExampleModel)
    submit_button = st.form_submit_button(label="Submit")
```

Support & Feedback

| Type | Channel |
| --- | --- |
| 🚨 Bug Reports | |
| 🎁 Feature Requests | |
| 👩‍💻 Usage Questions | |
| 📢 Announcements | |

Documentation The API documentation can be found here. To generate UI elements, you can use the high-level pydantic_form method, or the more flexible lower-level pydantic_input and pydantic_output methods. See the examples section on how to use those methods. Contribution Pull requests are encouraged and always welcome. Read our contribution guidelines and check out help-wanted issues. Submit GitHub issues for any feature request, enhancement, bug, or documentation problem.
By participating in this project, you agree to abide by its Code of Conduct. The development section below contains information on how to build and test the project after you have implemented some changes. Development To build the project and run the style/linter checks, execute: bash make install make check Run make help to see additional commands for development. Licensed MIT. Created and maintained with ❤️ by developers from Berlin.
Latest version: 0.3.1 Released: 2024-07-21
pydantic-db aims to be a database-framework-agnostic modeling library, providing functionality to convert database result objects into Pydantic models. The aim is not to provide an ORM, but to target users who prefer raw SQL interactions over obfuscated, ORM-built query layers. For those who prefer libraries like pypika to build their queries, this library can still provide a nice layer between raw query results and database models. So long as the database framework you are using returns result objects that can be converted to a dictionary, pydantic-db will interact cleanly with your results. Usage All examples assume the existence of the underlying tables and data; they are not intended to run as is. from_result To convert a single result object into a model, use Model.from_result.

```python
import sqlite3

from pydantic_db import Model


class User(Model):
    id: int
    name: str


db = sqlite3.connect(":memory:")
db.row_factory = sqlite3.Row
cursor = db.cursor()

stmt = "SELECT * FROM my_user LIMIT 1"
cursor.execute(stmt)
r = cursor.fetchone()

user = User.from_result(r)
```

from_results To convert a list of result objects into models, use Model.from_results.

```python
import sqlite3

from pydantic_db import Model


class User(Model):
    id: int
    name: str


db = sqlite3.connect(":memory:")
db.row_factory = sqlite3.Row
cursor = db.cursor()

stmt = "SELECT * FROM my_user"
cursor.execute(stmt)
results = cursor.fetchall()

users = User.from_results(results)
```

Nested models For more complicated queries returning a nested object, models can be nested. To parse them automatically, prefix query fields with name__-style prefixes. Say we have a Vehicle table with a reference to an owner (User).
```python
import sqlite3

from pydantic_db import Model


class User(Model):
    id: int
    name: str


class Vehicle(Model):
    id: int
    name: str
    owner: User


db = sqlite3.connect(":memory:")
db.row_factory = sqlite3.Row
cursor = db.cursor()

stmt = """
    SELECT
        v.id,
        v.name,
        u.id AS owner__id,
        u.name AS owner__name
    FROM my_vehicle v
    JOIN my_user u ON v.owner_id = u.id
"""
cursor.execute(stmt)
results = cursor.fetchall()

vehicles = Vehicle.from_results(results)
```

Optional nested models When a nested model is optional (i.e. owner: User | None), the library checks for an id field by default; if that field is empty (None), the nested field is set to None. If your nested model has a differently named primary key, or some other field that can be relied on to detect that a join did not match (and so the nested model should be None), override the _skip_prefix_field class var:

```python
class User(Model):
    primary_key: int
    name: str


class Vehicle(Model):
    _skip_prefix_field = {"owner": "primary_key"}

    id: int
    name: str
    owner: User | None
```
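As an illustration of the name__ prefix convention (an independent sketch, not pydantic-db's actual implementation — the function name here is hypothetical), prefixed columns from a joined query can be regrouped into a nested dict before model validation:

```python
def nest_prefixed(row: dict) -> dict:
    """Group keys like 'owner__id' into {'owner': {'id': ...}} (illustrative sketch)."""
    out: dict = {}
    for key, value in row.items():
        if "__" in key:
            prefix, _, field = key.partition("__")
            out.setdefault(prefix, {})[field] = value
        else:
            out[key] = value
    return out


row = {"id": 1, "name": "Truck", "owner__id": 7, "owner__name": "Ada"}
print(nest_prefixed(row))
# {'id': 1, 'name': 'Truck', 'owner': {'id': 7, 'name': 'Ada'}}
```

The nested dict produced this way is exactly the shape a nested Pydantic model expects to validate.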
Latest version: 0.1.4 Released: 2025-05-12
Pydantic BR This is an extension library that provides fields with Brazilian validations for the pydantic library. Compatible with both Pydantic v1 and v2. Source code: https://github.com/scjorge/pydantic_br Documentation: https://pydantic-br.readthedocs.io Available fields

| Field | Document Group | Document Name | Validation Method |
|---|---|---|---|
| CPF | Individual | Cadastro de Pessoa Física | Check digit |
| CNH | Individual | Carteira Nacional de Habilitação | Check digit |
| TE | Individual | Título de Eleitor | Check digit |
| PIS | Individual | Programa de Integração Social | Check digit |
| CERT | Individual | Certidão (birth/marriage/death certificate) | Check digit |
| CNS | Individual | Cartão Nacional de Saúde | Check digit |
| CNPJ | Legal entity | Cadastro Nacional da Pessoa Jurídica | Check digit |
| CEP | Addresses | Código de Endereçamento Postal | Regex |
| SiglaEstado | Addresses | Official Brazilian state abbreviation | Regex |
| RENAVAM | Vehicles | Registro Nacional de Veículos Automotores | Check digit |
| PlacaVeiculo | Vehicles | Vehicle license plate | Regex |

Installation Using pip: pip install pydantic-br Using Poetry: poetry add pydantic-br Usage Examples The sample data used in the examples was taken from the following sites: geradordecpf 4devs Any misuse of the data is entirely the user's responsibility.
Valid CPF

```python
from pprint import pprint

from pydantic import BaseModel

from pydantic_br import CPF, CPFDigits, CPFMask


class Pessoa(BaseModel):
    nome: str
    cpf: CPF  # accepts a valid CPF with or without the mask
    cpf_mask: CPFMask  # accepts a valid CPF only with the mask
    cpf_digits: CPFDigits  # accepts a valid CPF only as digits


p1 = Pessoa(
    nome="João",
    cpf="53221394780",
    cpf_mask="532.213.947-80",
    cpf_digits="53221394780",
)
pprint(p1.model_dump())
```

Output

{'cpf': '53221394780', 'cpf_digits': '53221394780', 'cpf_mask': '532.213.947-80', 'nome': 'João'}

Invalid CPF

```python
from pprint import pprint

from pydantic import BaseModel

from pydantic_br import CPF, CPFDigits, CPFMask


class Pessoa(BaseModel):
    nome: str
    cpf: CPF  # accepts a valid CPF with or without the mask
    cpf_mask: CPFMask  # accepts a valid CPF only with the mask
    cpf_digits: CPFDigits  # accepts a valid CPF only as digits


p1 = Pessoa(
    nome="João",
    cpf="00000000000",
    cpf_mask="53221394780",
    cpf_digits="532.213.947-80",
)
pprint(p1.model_dump())
```

Output

Traceback (most recent call last):
  p1 = Pessoa(
  File "pydantic\main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 3 validation errors for Pessoa
cpf
  invalid data [type=invalid_data, input_value='00000000000', input_type=str]
cpf_mask
  invalid mask format [type=invalid_mask, input_value='53221394780', input_type=str]
cpf_digits
  field only accept digits as string [type=not_digits, input_value='532.213.947-80', input_type=str]

Pydantic versions The examples above are written for Pydantic v1, but the library works perfectly with v2 as well. So what changes? One thing is that the models' 'presentation' methods were renamed in v2: dict() was changed to model_dump() and model_dump_json(), and schema() was changed to model_json_schema().
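For intuition about the check-digit ("dígito verificador") method listed in the table, here is an illustrative sketch of the standard CPF check-digit scheme in plain Python — this is not pydantic-br's implementation, just the well-known algorithm:

```python
def cpf_check_digit(digits: str) -> int:
    """Compute one CPF check digit: weights descend from len(digits)+1 down to 2."""
    total = sum(int(d) * w for d, w in zip(digits, range(len(digits) + 1, 1, -1)))
    remainder = total % 11
    return 0 if remainder < 2 else 11 - remainder


def is_valid_cpf(cpf: str) -> bool:
    """Validate an unmasked 11-digit CPF string (illustrative sketch)."""
    if len(cpf) != 11 or not cpf.isdigit():
        return False
    if cpf == cpf[0] * 11:  # repeated digits satisfy the math but are not real CPFs
        return False
    d1 = cpf_check_digit(cpf[:9])
    d2 = cpf_check_digit(cpf[:9] + str(d1))
    return cpf.endswith(f"{d1}{d2}")


print(is_valid_cpf("53221394780"))  # True: the valid CPF from the example above
print(is_valid_cpf("00000000000"))  # False
```

This is why '00000000000' is rejected in the invalid-CPF example even though its check-digit arithmetic happens to work out.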
Latest version: 1.1.0 Released: 2024-08-02
Agent Framework / shim to use Pydantic with LLMs Documentation: ai.pydantic.dev PydanticAI is a Python agent framework designed to make it less painful to build production grade applications with Generative AI. FastAPI revolutionized web development by offering an innovative and ergonomic design, built on the foundation of Pydantic. Similarly, virtually every agent framework and LLM library in Python uses Pydantic, yet when we began to use LLMs in Pydantic Logfire, we couldn't find anything that gave us the same feeling. We built PydanticAI with one simple aim: to bring that FastAPI feeling to GenAI app development. Why use PydanticAI Built by the Pydantic Team Built by the team behind Pydantic (the validation layer of the OpenAI SDK, the Anthropic SDK, LangChain, LlamaIndex, AutoGPT, Transformers, CrewAI, Instructor and many more). Model-agnostic Supports OpenAI, Anthropic, Gemini, Deepseek, Ollama, Groq, Cohere, and Mistral, and there is a simple interface to implement support for other models. Pydantic Logfire Integration Seamlessly integrates with Pydantic Logfire for real-time debugging, performance monitoring, and behavior tracking of your LLM-powered applications. Type-safe Designed to make type checking as powerful and informative as possible for you. Python-centric Design Leverages Python's familiar control flow and agent composition to build your AI-driven projects, making it easy to apply standard Python best practices you'd use in any other (non-AI) project. Structured Responses Harnesses the power of Pydantic to validate and structure model outputs, ensuring responses are consistent across runs. Dependency Injection System Offers an optional dependency injection system to provide data and services to your agent's system prompts, tools and output validators. This is useful for testing and eval-driven iterative development. 
Streamed Responses Provides the ability to stream LLM outputs continuously, with immediate validation, ensuring rapid and accurate outputs. Graph Support Pydantic Graph provides a powerful way to define graphs using type hints, which is useful in complex applications where standard control flow can degrade to spaghetti code. Hello World Example Here's a minimal example of PydanticAI:

```python
from pydantic_ai import Agent

# Define a very simple agent, including the model to use; you can also set the
# model when running the agent.
agent = Agent(
    'google-gla:gemini-1.5-flash',
    # Register a static system prompt using a keyword argument to the agent.
    # For more complex dynamically-generated system prompts, see the example below.
    system_prompt='Be concise, reply with one sentence.',
)

# Run the agent synchronously, conducting a conversation with the LLM. Here the
# exchange should be very short: PydanticAI will send the system prompt and the
# user query to the LLM, and the model will return a text response. See below
# for a more complex run.
result = agent.run_sync('Where does "hello world" come from?')
print(result.output)
"""
The first known use of "hello, world" was in a 1974 textbook about the C programming language.
"""
```

(This example is complete, it can be run "as is") Not very interesting yet, but we can easily add "tools", dynamic system prompts, and structured responses to build more powerful agents. Tools & Dependency Injection Example Here is a concise example using PydanticAI to build a support agent for a bank: (Better documented example in the docs)

```python
from dataclasses import dataclass

from pydantic import BaseModel, Field
from pydantic_ai import Agent, RunContext

from bank_database import DatabaseConn


# SupportDependencies is used to pass data, connections, and logic into the
# model that will be needed when running system prompt and tool functions.
# Dependency injection provides a type-safe way to customise the behavior of
# your agents.
@dataclass
class SupportDependencies:
    customer_id: int
    db: DatabaseConn


# This Pydantic model defines the structure of the output returned by the agent.
class SupportOutput(BaseModel):
    support_advice: str = Field(description='Advice returned to the customer')
    block_card: bool = Field(description="Whether to block the customer's card")
    risk: int = Field(description='Risk level of query', ge=0, le=10)


# This agent will act as first-tier support in a bank. Agents are generic in
# the type of dependencies they accept and the type of output they return. In
# this case, the support agent has type Agent[SupportDependencies, SupportOutput].
support_agent = Agent(
    'openai:gpt-4o',
    deps_type=SupportDependencies,
    # The response from the agent will be guaranteed to be a SupportOutput;
    # if validation fails the agent is prompted to try again.
    output_type=SupportOutput,
    system_prompt=(
        'You are a support agent in our bank, give the '
        'customer support and judge the risk level of their query.'
    ),
)


# Dynamic system prompts can make use of dependency injection. Dependencies are
# carried via the RunContext argument, which is parameterized with the
# deps_type from above. If the type annotation here is wrong, static type
# checkers will catch it.
@support_agent.system_prompt
async def add_customer_name(ctx: RunContext[SupportDependencies]) -> str:
    customer_name = await ctx.deps.db.customer_name(id=ctx.deps.customer_id)
    return f"The customer's name is {customer_name!r}"


# `tool` lets you register functions which the LLM may call while responding to
# a user. Again, dependencies are carried via RunContext, and any other
# arguments become the tool schema passed to the LLM. Pydantic is used to
# validate these arguments, and errors are passed back to the LLM so it can retry.
@support_agent.tool
async def customer_balance(
    ctx: RunContext[SupportDependencies], include_pending: bool
) -> float:
    """Returns the customer's current account balance."""
    # The docstring of a tool is also passed to the LLM as the description of
    # the tool. Parameter descriptions are extracted from the docstring and
    # added to the parameter schema sent to the LLM.
    balance = await ctx.deps.db.customer_balance(
        id=ctx.deps.customer_id,
        include_pending=include_pending,
    )
    return balance


...  # In a real use case, you'd add more tools and a longer system prompt


async def main():
    deps = SupportDependencies(customer_id=123, db=DatabaseConn())
    # Run the agent asynchronously, conducting a conversation with the LLM
    # until a final response is reached. Even in this fairly simple case, the
    # agent will exchange multiple messages with the LLM as tools are called
    # to retrieve an output.
    result = await support_agent.run('What is my balance?', deps=deps)
    # result.output will be validated with Pydantic to guarantee it is a
    # SupportOutput. Since the agent is generic, it'll also be typed as a
    # SupportOutput to aid with static type checking.
    print(result.output)
    """
    support_advice='Hello John, your current account balance, including pending transactions, is $123.45.' block_card=False risk=1
    """

    result = await support_agent.run('I just lost my card!', deps=deps)
    print(result.output)
    """
    support_advice="I'm sorry to hear that, John. We are temporarily blocking your card to prevent unauthorized transactions." block_card=True risk=8
    """
```

Next Steps To try PydanticAI yourself, follow the instructions in the examples. Read the docs to learn more about building applications with PydanticAI. Read the API Reference to understand PydanticAI's interface.
Latest version: 0.2.1 Released: 2025-05-13
Pydantic MQL This library can parse and evaluate conditions using Pydantic. The usage is similar to the MongoDB Query Language, but instead of filtering documents within a database collection, you can use the library to filter arbitrary application data in memory. Example Usage Parsing

```python
from pydantic_mql import Condition

test_json = '{"operator": "$eq", "field": "label", "value": "lab"}'
condition = Condition.model_validate_json(test_json)
print(condition)
```

Serializing

```python
from pydantic_mql import Condition

condition = Condition(operator='$and', conditions=[
    Condition(operator='$eq', field='label', value='lab'),
    Condition(operator='$lte', field='rating', value=80),
])
print(condition.model_dump())
```

Condition Eval

```python
test_data = {'rating': 60, 'label': 'lab'}
result = condition.eval(test_data)
print(result)
```
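To illustrate how such Mongo-style conditions evaluate against a plain dict, here is an independent sketch of a recursive evaluator — this is not pydantic_mql's implementation, and the exact operator set it supports is an assumption:

```python
import operator

# Comparison operators in the Mongo-like style; $and/$or recurse over sub-conditions.
OPS = {"$eq": operator.eq, "$ne": operator.ne, "$lt": operator.lt,
       "$lte": operator.le, "$gt": operator.gt, "$gte": operator.ge}


def eval_condition(cond: dict, data: dict) -> bool:
    op = cond["operator"]
    if op == "$and":
        return all(eval_condition(c, data) for c in cond["conditions"])
    if op == "$or":
        return any(eval_condition(c, data) for c in cond["conditions"])
    return OPS[op](data.get(cond["field"]), cond["value"])


cond = {"operator": "$and", "conditions": [
    {"operator": "$eq", "field": "label", "value": "lab"},
    {"operator": "$lte", "field": "rating", "value": 80},
]}
print(eval_condition(cond, {"rating": 60, "label": "lab"}))  # True
print(eval_condition(cond, {"rating": 90, "label": "lab"}))  # False
```

The `$and` node simply requires every sub-condition to hold, which is why the second record (rating 90 > 80) is rejected.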
Latest version: 0.1.0 Released: 2023-09-11
Pydantic Commandline Tool Interface Turn Pydantic-defined data models into CLI tools and enable loading values from JSON files. Requires Pydantic >=2.8.2. Installation

```bash
pip install pydantic-cli
```

Features and Requirements Thin, schema-driven interfaces constructed from Pydantic-defined data models. Validation is performed in a single location, as defined by Pydantic's validation model and defined types. The CLI parsing level only structurally validates that the args or optional arguments/flags are provided. Enables loading config defined in JSON to override or set specific values (e.g. mytool -i in.csv --json-conf config.json). Clear interface between the CLI and your application code. Leverage the static analysis tool mypy to catch type errors in your commandline tool. Easy to test (for the reasons above). Motivating Use cases Quick, scrappy commandline tools for local development (e.g., a webscraper CLI tool, or a CLI application that runs a training algo). Internal tools driven by a Pydantic data model/schema. Configuration-heavy tools driven by either partial (i.e., "preset") or complete configuration files defined using JSON. Note: newer versions of pydantic-settings have support for commandline functionality. It allows mixing of "sources", such as ENV, YAML, and JSON, and might satisfy your requirements. https://docs.pydantic.dev/2.8/concepts/pydantic_settings/#settings-management pydantic-cli predates the CLI component of pydantic-settings and has a few different requirements and a different design approach. Quick Start To create a commandline tool that takes an input file and a max number of records to process as arguments:

```bash
my-tool --input_file /path/to/file.txt --max_records 1234
```

This requires a few components: create a Pydantic data model of type T; write a function that takes an instance of T and returns an exit code (e.g., 0 for success, non-zero for failure); then pass T into the to_runner function, or run_and_exit. An explicit example is shown below.
```python
import sys

from pydantic_cli import run_and_exit, to_runner, Cmd


class MinOptions(Cmd):
    input_file: str
    max_records: int

    def run(self) -> None:
        print(f"Mock example running with {self}")


if __name__ == '__main__':
    # to_runner will return a function that takes the args list to run and
    # will return an integer exit code
    sys.exit(to_runner(MinOptions, version='0.1.0')(sys.argv[1:]))
```

Or to implicitly use sys.argv[1:], leverage run_and_exit (to_runner is also useful for testing).

```python
if __name__ == '__main__':
    run_and_exit(MinOptions, description="My Tool Description", version='0.1.0')
```

Customizing Description and Commandline Flags If the Pydantic data model fields are reasonably well named (e.g., 'min_score' or 'max_records'), this can yield a good enough description when --help is called. Customizing the commandline flags or the description can be done by leveraging the description keyword argument in Field from Pydantic. See the Field model in Pydantic for more details. Custom 'short' or 'long' forms of the commandline args can be provided by using a Tuple[str] or Tuple2[str, str]. For example, cli=('-m', '--max-records') or cli=('--max-records',). Note, Pydantic interprets ... as a "required" value when used in Field.
https://docs.pydantic.dev/latest/concepts/models/#required-fields

```python
from pydantic import Field

from pydantic_cli import run_and_exit, Cmd


class MinOptions(Cmd):
    input_file: str = Field(..., description="Path to Input H5 file", cli=('-i', '--input-file'))
    max_records: int = Field(..., description="Max records to process", cli=('-m', '--max-records'))
    debug: bool = Field(False, description="Enable debugging mode", cli=('-d', '--debug'))

    def run(self) -> None:
        print(f"Mock example running with options {self}")


if __name__ == '__main__':
    run_and_exit(MinOptions, description="My Tool Description", version='0.1.0')
```

Running

```bash
$> mytool -i input.hdf5 --max-records 100 --debug y
Mock example running with options MinOptions(input_file="input.hdf5", max_records=100, debug=True)
```

Leveraging Field is also useful for validating inputs using standard Pydantic validation.

```python
from pydantic import Field

from pydantic_cli import Cmd


class MinOptions(Cmd):
    input_file: str = Field(..., description="Path to Input H5 file", cli=('-i', '--input-file'))
    max_records: int = Field(..., gt=0, le=1000, description="Max records to process", cli=('-m', '--max-records'))

    def run(self) -> None:
        print(f"Mock example running with options {self}")
```

See the Pydantic docs for more details. Loading Configuration using JSON Commandline tools created with pydantic-cli can also load entire models, or partially defined Pydantic data models, from JSON files. For example, consider the following Pydantic data model with cli_json_enable = True in CliConfig. The cli_json_key defines the commandline argument (e.g., "config" translates to --config); the default value is json-config (--json-config).
```python
from pydantic_cli import CliConfig, run_and_exit, Cmd


class Opts(Cmd):
    model_config = CliConfig(
        frozen=True, cli_json_key="json-training", cli_json_enable=True
    )

    hdf_file: str
    max_records: int = 10
    min_filter_score: float
    alpha: float
    beta: float

    def run(self) -> None:
        print(f"Running with opts:{self}")


if __name__ == '__main__':
    run_and_exit(Opts, description="My Tool Description", version='0.1.0')
```

It can be run with a JSON file that defines all the (required) values:

```json
{"hdf_file": "/path/to/file.hdf5", "max_records": 5, "min_filter_score": 1.5, "alpha": 1.0, "beta": 1.0}
```

The tool can be executed as shown below. Note, the options required at the commandline as defined in the Opts model (e.g., 'hdf_file', 'min_filter_score', 'alpha' and 'beta') are NO longer required values supplied to the commandline tool.

```bash
my-tool --json-training /path/to/file.json
```

To override values in the JSON config file, or to provide missing required values, simply provide the values at the commandline; they will override values defined in the JSON config file. This provides a general mechanism for using configuration "preset" files.

```bash
my-tool --json-training /path/to/file.json --alpha -1.8 --max_records 100
```

Similarly, a partially described data model can be combined with explicit values provided at the commandline. In this example, hdf_file and min_filter_score are still required values that need to be provided to the commandline tool.

```json
{"max_records": 10, "alpha": 1.234, "beta": 9.876}
```

```bash
my-tool --json-training /path/to/file.json --hdf_file /path/to/file.hdf5 --min_filter_score -12.34
```

Note: the mixing and matching of a config/preset JSON file and commandline args is the fundamental design requirement of pydantic-cli. Catching Type Errors with mypy If you've used argparse, you've probably been bitten by an AttributeError exception raised on the Namespace instance returned from parsing the raw args.
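The precedence described above — commandline values win over JSON preset values, which win over the model's defaults — can be sketched as a simple dict merge. This is an illustration of the idea, not pydantic-cli's implementation:

```python
import json

defaults = {"max_records": 10}                           # defaults from the model
preset = json.loads('{"max_records": 5, "alpha": 1.0}')  # the JSON preset file
cli_args = {"alpha": -1.8}                               # values typed at the commandline

# Later sources override earlier ones.
resolved = {**defaults, **preset, **cli_args}
print(resolved)  # {'max_records': 5, 'alpha': -1.8}
```

Any required field still missing after the merge (e.g. hdf_file here) would then fail Pydantic validation, which is how partially described presets surface missing values.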
For example,

```python
import sys
from argparse import ArgumentParser


def to_parser() -> ArgumentParser:
    p = ArgumentParser(description="Example")
    f = p.add_argument
    f('hdf5_file', type=str, help="Path to HDF5 records")
    f("--num_records", required=True, type=int, help="Number of records to filter over")
    f('-f', '-filter-score', required=True, type=float, default=1.234, help="Min filter score")
    f('-g', '--enable-gamma-filter', action="store_true", help="Enable gamma filtering")
    return p


def my_library_code(path: str, num_records: float, min_filter_score, enable_gamma=True) -> int:
    print("Mock running of code")
    return 0


def main(argv) -> int:
    p = to_parser()
    pargs = p.parse_args(argv)
    return my_library_code(pargs.hdf5_file, pargs.num_record, pargs.min_filter_score, pargs.enable_gamma_filter)


if __name__ == '__main__':
    sys.exit(main(sys.argv[1:]))
```

The first error found at runtime is shown below.

```bash
Traceback (most recent call last):
  File "junk.py", line 35, in <module>
    sys.exit(main(sys.argv[1:]))
  File "junk.py", line 31, in main
    return my_library_code(pargs.hdf5_file, pargs.num_record, pargs.min_filter_score, pargs.enable_gamma_filter)
AttributeError: 'Namespace' object has no attribute 'num_record'
```

The accesses pargs.num_record and pargs.min_filter_score are inconsistent with what is defined in the to_parser method, and each error has to be manually hunted down at runtime. With pydantic-cli, it's possible to catch these errors by running mypy. This also enables you to refactor your code with more confidence. For example,

```python
from pydantic_cli import run_and_exit, Cmd


class Options(Cmd):
    input_file: str
    max_records: int

    def run(self) -> None:
        # mypy will flag this: `max_score` is not a field of Options
        print(f"Mock example running with {self.max_score}")


if __name__ == "__main__":
    run_and_exit(Options, version="0.1.0")
```

With mypy, it's possible to proactively catch these types of errors. Using Boolean Flags There's an ergonomic tradeoff to lean on Pydantic and avoid some friction points at the CLI level.
This yields an explicit model at the cost of slight added verbosity.

Summary:

- `xs: bool` can be set from the command line as `--xs true` or `--xs false`, or, using Pydantic's casting, `--xs yes` or `--xs y`.
- `xs: Optional[bool]` can be set from the command line as `--xs true`, `--xs false`, or `--xs none`.

For the `None` case, you can configure your Pydantic model to handle the casting/coercing/validation. Similarly, the `bool` casting should be configured in Pydantic. Consider a basic model:

```python
from typing import Optional

from pydantic import Field
from pydantic_cli import run_and_exit, Cmd


class Options(Cmd):
    input_file: str
    max_records: int = Field(100, cli=('-m', '--max-records'))
    dry_run: bool = Field(default=False, description="Enable dry run mode", cli=('-d', '--dry-run'))
    filtering: Optional[bool]

    def run(self) -> None:
        print(f"Mock example running with {self}")


if __name__ == "__main__":
    run_and_exit(Options, description=__doc__, version="0.1.0")
```

In this case:

- `dry_run` is an optional value with a default and can be set as `--dry-run yes` or `--dry-run no`.
- `filtering` is a required value and can be set as `--filtering true`, `--filtering false`, or `--filtering none`.

See the Pydantic docs for more details on boolean casting: https://docs.pydantic.dev/2.8/api/standard_library_types/#booleans

## Customization and Hooks

### Hooks into the CLI Execution

There are three core hooks into the customization of CLI execution:

- exception handler (log or write to stderr and map specific exception classes to integer exit codes)
- prologue handler (pre-execution hook)
- epilogue handler (post-execution hook)

Each of these can be customized by passing a function to the running/execution method. The exception handler should handle any logging or writing to stderr, as well as map the specific exception to a non-zero integer exit code.
For example:

```python
import sys

from pydantic import Field
from pydantic_cli import run_and_exit, Cmd


class MinOptions(Cmd):
    input_file: str = Field(..., cli=('-i',))
    max_records: int = Field(10, cli=('-m', '--max-records'))

    def run(self) -> None:
        # example/mock error raised. Will be mapped to exit code 3
        raise ValueError(f"No records found in input file {self.input_file}")


def custom_exception_handler(ex: Exception) -> int:
    exception_map = dict(ValueError=3, IOError=7)
    sys.stderr.write(str(ex))
    exit_code = exception_map.get(type(ex).__name__, 1)
    return exit_code


if __name__ == '__main__':
    run_and_exit(MinOptions, exception_handler=custom_exception_handler)
```

A general pre-execution hook can be supplied via the `prologue_handler`. This function is a `Callable[[T], None]`, where `T` is an instance of your Pydantic data model. This setup hook will be called before the execution of your main function (e.g., `example_runner`).

```python
import sys
import logging


def custom_prologue_handler(opts) -> None:
    logging.basicConfig(level="DEBUG", stream=sys.stdout)


if __name__ == '__main__':
    run_and_exit(MinOptions, prologue_handler=custom_prologue_handler)
```

Similarly, a post-execution hook can be supplied. This function is a `Callable[[int, float], None]` that takes the exit code and program runtime in seconds as input.

```python
from pydantic_cli import run_and_exit


def custom_epilogue_handler(exit_code: int, run_time_sec: float) -> None:
    m = "Success" if exit_code == 0 else "Failed"
    msg = f"Completed running ({m}) in {run_time_sec:.2f} sec"
    print(msg)


if __name__ == '__main__':
    run_and_exit(MinOptions, epilogue_handler=custom_epilogue_handler)
```

## SubParsers

Defining subcommands for your command line tool is enabled by passing a `dict[str, Cmd]` (where the `str` is the subcommand name) to `run_and_exit` (or `to_runner`).
```python
"""Example Subcommand Tool"""
from pydantic import AnyUrl, Field
from pydantic_cli import run_and_exit, Cmd


class AlphaOptions(Cmd):
    input_file: str = Field(..., cli=('-i',))
    max_records: int = Field(10, cli=('-m', '--max-records'))

    def run(self) -> None:
        print(f"Running alpha with {self}")


class BetaOptions(Cmd):
    """Beta command for testing. Description of tool"""
    url: AnyUrl = Field(..., cli=('-u', '--url'))
    num_retries: int = Field(3, cli=('-n', '--num-retries'))

    def run(self) -> None:
        print(f"Running beta with {self}")


if __name__ == "__main__":
    run_and_exit({"alpha": AlphaOptions, "beta": BetaOptions}, description=__doc__, version='0.1.0')
```

## Configuration Details and Advanced Features

`pydantic-cli` attempts to stylistically follow Pydantic's approach of class-style configuration. See `CliConfig` in `pydantic_cli` for more details.

```python
import typing as T
from pydantic import ConfigDict


class CliConfig(ConfigDict, total=False):
    # value used to generate the CLI format --{key}
    cli_json_key: str
    # Enable JSON config loading
    cli_json_enable: bool
    # Set the default ENV var for defining the JSON config path
    cli_json_config_env_var: str
    # Set the default Path for the JSON config file
    cli_json_config_path: T.Optional[str]
    # Validate the path if a default is provided or one is supplied at the command line
    cli_json_validate_path: bool
    # Add a flag that will emit the shell completion;
    # this requires 'shtab' (https://github.com/iterative/shtab)
    cli_shell_completion_enable: bool
    cli_shell_completion_flag: str
```

## AutoComplete leveraging shtab

There is support for `zsh` and `bash` autocomplete generation using shtab. The optional dependency can be installed as follows:

```bash
pip install "pydantic-cli[shtab]"
```

To enable emitting bash/zsh autocomplete files from shtab, set `CliConfig(cli_shell_completion_enable=True)` in your data model config. Then use your executable (or `.py` file) to emit the autocomplete file to the necessary output directory.
For example, using `zsh` and a script called `my-tool.py`:

```bash
my-tool.py --emit-completion zsh > ~/.zsh/completions/_my-tool.py
```

By convention/default, the completion file name must be prefixed with an underscore. When using autocomplete, it should look similar to this:

```bash
./my-tool.py --emit-completion zsh > ~/.zsh/completions/_my-tool.py
Completed writing zsh shell output to stdout
./my-tool.py --max
 -- option --
--max_filter_score  -- (type:int default:1.0)
--max_length        -- (type:int default:12)
--max_records       -- (type:int default:123455)
--max_size          -- (type:int default:13)
```

See shtab for more details. Note that due to the (typically) global zsh completions directory, this can create friction points between different virtual (or conda) ENVs that share the same executable name.

## General Suggested Testing Model

At a high level, `pydantic_cli` is (hopefully) a thin bridge between your options defined as a Pydantic model and your main `Cmd.run() -> None` method, with hooks into the startup, shutdown, and error handling of the command line tool. It also supports loading config files defined as JSON. By design, `pydantic_cli` explicitly does not expose or leak the `argparse` instance or its implementation details. Argparse is a bit thorny and was written in a different era of Python; exposing those implementation details would add too much surface area and would let users start mucking with the `argparse` instance in all kinds of unexpected ways.

Testing can be done by leveraging the `to_runner` interface. It's recommended to do the majority of testing via unit tests (independent of `pydantic_cli`) with your main function and different instances of your Pydantic data model. Once this test coverage is reasonable, it can be useful to add a few smoke tests at the integration level leveraging `to_runner` to make sure the tool is functional. Any bugs at this level are probably at the `pydantic_cli` level, not in your library code.
Note that `to_runner(Opts)` returns a `Callable[[List[str]], int]` that can be called with `sys.argv[1:]` to return an integer exit code for your program. The `to_runner` layer will also catch any exceptions.

```python
import unittest

from pydantic_cli import to_runner, Cmd


class Options(Cmd):
    alpha: int

    def run(self) -> None:
        if self.alpha < 0:
            raise Exception(f"Got options {self}. Forced raise for testing.")


class TestExample(unittest.TestCase):
    def test_core(self):
        # Note, this has nothing to do with pydantic_cli.
        # If possible, this is where the bulk of the testing should be.
        # Your code should raise exceptions here or return None on success.
        self.assertTrue(Options(alpha=1).run() is None)

    def test_example(self):
        # This is intended to mimic end-to-end testing from argv[1:].
        # The exception handler will map exceptions to int exit codes.
        f = to_runner(Options)
        self.assertEqual(0, f(["--alpha", "100"]))

    def test_expected_error(self):
        f = to_runner(Options)
        self.assertEqual(1, f(["--alpha", "-10"]))
```

For more scrappy, interactive local development, it can be useful to add `ipdb` or `pdb` via a custom `exception_handler`.

```python
from pydantic_cli import default_exception_handler, run_and_exit, Cmd


class Options(Cmd):
    alpha: int

    def run(self) -> None:
        if self.alpha < 0:
            raise Exception(f"Got options {self}. Forced raise for testing.")


def exception_handler(ex: BaseException) -> int:
    exit_code = default_exception_handler(ex)
    import ipdb; ipdb.set_trace()
    return exit_code


if __name__ == "__main__":
    run_and_exit(Options, exception_handler=exception_handler)
```

The core design choice in `pydantic_cli` is leveraging composable functions in an `f(g(x))` style, providing a straightforward mechanism to plug into.

## More Examples

More examples are provided here, and testing examples can be seen here.
The TestHarness might provide examples of how to test your CLI tool(s).

## Limitations

- Positional arguments are not supported (see more info in the next subsection).
- Using Pydantic `BaseSettings` to set values from `dotenv` or ENV variables is not supported. Loading `dotenv` or similar in Pydantic overlapped and competed too much with the "preset" JSON loading model in `pydantic-cli`.
- Currently only "simple" types (e.g., floats, ints, strings, booleans) are supported, with limited support for fields defined as `List[T]`, `Set[T]`, and simple `Enum`s. There is no support for nested models. Pydantic-settings might be a better fit for these cases.
- Leverages `argparse` under the hood (`argparse` is a bit thorny of an API to build on top of).

## Why are Positional Arguments not supported?

The core features of `pydantic-cli` are:

- Define and validate models using Pydantic and use these schemas as an interface to the command line.
- Leverage `mypy` (or a similar static analyzer) to validate/check type-safety prior to runtime.
- Load partial or complete models using JSON (these are, essentially, partial or complete config or "preset" files).

Positional arguments create friction points when combined with loading model values from a JSON file. More specifically, (required) positional values of the model could be supplied in the JSON and would no longer be required at the command line. This can fundamentally change the command line interface and create ambiguities/bugs. For example:

```python
from pydantic_cli import CliConfig, Cmd


class MinOptions(Cmd):
    model_config = CliConfig(cli_json_enable=True)

    input_file: str
    input_hdf: str
    max_records: int = 100

    def run(self) -> None:
        print(f"Running with mock {self}")
```

The vanilla case running from the command line works as expected.
```bash
my-tool /path/to/file.txt /path/to/file.h5 --max_records 200
```

However, when using the JSON "preset" feature, the positional arguments of the tool can shift around depending on which fields have been defined in the JSON preset. For example, running with this `preset.json`, the `input_file` positional argument is no longer required.

```json
{"input_file": "/system/config.txt", "max_records": 12345}
```

The vanilla case works as expected.

```bash
my-tool file.txt /path/to/file.h5 --json-config ./preset.json
```

However, this works as well.

```bash
my-tool /path/to/file.h5 --json-config ./preset.json
```

In my experience, having the semantic meaning of the command line tool's positional arguments change depending on the contents of `preset.json` created issues and bugs. The simplest fix was to remove positional arguments in favor of `-i` or similar, which eliminated the issue.

```python
from pydantic import Field
from pydantic_cli import CliConfig, Cmd


class MinOptions(Cmd):
    model_config = CliConfig(cli_json_enable=True)

    input_file: str = Field(..., cli=('-i', ))
    input_hdf: str = Field(..., cli=('-d', '--hdf'))
    max_records: int = Field(100, cli=('-m', '--max-records'))

    def run(self) -> None:
        print(f"Running {self}")
```

Running with the `preset.json` defined above works as expected.

```bash
my-tool --hdf /path/to/file.h5 --json-config ./preset.json
```

As does overriding `-i`:

```bash
my-tool -i file.txt --hdf /path/to/file.h5 --json-config ./preset.json
```

Or:

```bash
my-tool --hdf /path/to/file.h5 -i file.txt --json-config ./preset.json
```

This consistency was the motivation for removing positional argument support in earlier versions of `pydantic-cli`.

## Other Related Tools

Other tools that leverage type annotations to create CLI tools:

- pydantic-settings: Pydantic >= 2.8.2 supports CLI as a settings "source".
- cyto: Pydantic-based model leveraging Pydantic's settings sources. Supports nested values. Optional TOML support.
  (Leverages: click, pydantic)
- typer: A library for building CLI applications that users will love using and developers will love creating. Based on Python 3.6+ type hints. (Leverages: click)
- glacier: Building Python CLIs using docstrings and type hints. (Leverages: click)
- Typed-Settings: Manage typed settings with attrs classes – for server processes as well as click applications. (Leverages: attrs, click)
- cliche: Build a simple command-line interface from your functions. (Leverages: argparse and type annotations/hints)
- SimpleParsing: Simple, elegant, typed argument parsing with argparse. (Leverages: dataclasses, argparse)
- recline: Helps you quickly implement an interactive command-based application in Python. (Leverages: argparse and type annotations/hints)
- clippy: Crawls the abstract syntax tree (AST) of a Python file and generates a simple command-line interface.
- clize: Turn Python functions into command-line interfaces. (Leverages: attrs)
- plac: Parsing the command line the easy way.
- typedparse: Parser for command-line options based on type hints. (Leverages: argparse and type annotations/hints)
- paiargparse: Extension to the Python argparser allowing automatic generation of a hierarchical argument list based on dataclasses. (Leverages: argparse, dataclasses)
- piou: A CLI tool to build beautiful command-line interfaces with type validation.
- pyrallis: A framework for simple dataclass-based configurations.
- ConfigArgParse: A drop-in replacement for argparse that allows options to also be set via config files and/or environment variables.
- spock: A framework that helps manage complex parameter configurations during research and development of Python applications. (Leverages: argparse, attrs, and type annotations/hints)
- oneFace: Generating interfaces (CLI, Qt GUI, Dash web app) from a Python function.
- configpile: Overlay for argparse that takes additional parameters from environment variables and configuration files.

## Stats

GitHub Star Growth of pydantic-cli
Latest version: 9.1.0 Released: 2024-09-20
# Use pydantic with Django REST framework

Contents: Introduction · Performance · Installation · Usage (General, Pydantic Validation, Updating Field Values, Validation Errors, Existing Models, Nested Models, Manual Serializer Configuration, Per-Field Configuration, Custom Serializer, Additional Properties)

## Introduction

Pydantic is a Python library used to perform data serialization and validation. Django REST framework is a framework built on top of Django used to write REST APIs. If you develop DRF APIs and rely on pydantic for data validation/(de)serialization, then `drf-pydantic` is for you :heart_eyes:.

> [!NOTE]
> The latest version of `drf_pydantic` only supports pydantic v2. Support for pydantic v1 is available in the `1.*` versions.

## Performance

Translation between pydantic models and DRF serializers is done during class creation (e.g., when you first import the model). This means there will be zero runtime impact when using `drf_pydantic` in your application.

> [!NOTE]
> There is a minor penalty if `validate_pydantic` is set to `True`, due to the pydantic model validation. It is minimal compared to the already-present overhead of DRF itself, because pydantic runs its validation in Rust while DRF is pure Python.

## Installation

```shell
pip install drf-pydantic
```

## Usage

### General

Use `drf_pydantic.BaseModel` instead of `pydantic.BaseModel` when creating your models:

```python
from drf_pydantic import BaseModel


class MyModel(BaseModel):
    name: str
    addresses: list[str]
```

`MyModel.drf_serializer` is equivalent to the following DRF serializer class:

```python
class MyModelSerializer:
    name = CharField(allow_null=False, required=True)
    addresses = ListField(
        allow_empty=True,
        allow_null=False,
        child=CharField(allow_null=False),
        required=True,
    )
```

Whenever you need a DRF serializer, you can get it from the model like this:

```python
my_value = MyModel.drf_serializer(data={"name": "Van", "addresses": ["Gym"]})
my_value.is_valid(raise_exception=True)
```

> [!NOTE]
> Models created using `drf_pydantic` are fully identical to those created by pydantic.
The only change is the addition of the `drf_serializer` and `drf_config` attributes.

### Pydantic Validation

By default, the generated serializer only uses DRF's validation; however, pydantic models are often more complex, and their numerous validation rules cannot be fully translated to DRF. To enable pydantic validators to run whenever the generated DRF serializer validates its data (e.g., via `.is_valid()`), set `"validate_pydantic": True` within the `drf_config` property of your model:

```python
from drf_pydantic import BaseModel


class MyModel(BaseModel):
    name: str
    addresses: list[str]

    drf_config = {"validate_pydantic": True}


my_serializer = MyModel.drf_serializer(data={"name": "Van", "addresses": []})
my_serializer.is_valid()  # this will also validate MyModel
```

With this option enabled, every time you validate data using your DRF serializer, the parent pydantic model is also validated. If it fails, its `ValidationError` exception is wrapped within DRF's `ValidationError`. Per-field and non-field (object-level) errors are wrapped similarly to how DRF handles them. This ensures your complex pydantic validation logic is properly evaluated wherever a DRF serializer is used.

> [!NOTE]
> All `drf_config` values are properly inherited by child classes, just like pydantic's `model_config`.

### Updating Field Values

By default, `drf_pydantic` updates values in the DRF serializer with those from the validated pydantic model:

```python
import pydantic

from drf_pydantic import BaseModel


class MyModel(BaseModel):
    name: str
    addresses: list[str]

    @pydantic.field_validator("name")
    @classmethod
    def validate_name(cls, v):
        assert isinstance(v, str)
        return v.strip().title()

    drf_config = {"validate_pydantic": True}


my_serializer = MyModel.drf_serializer(data={"name": "van herrington", "addresses": []})
my_serializer.is_valid()
print(my_serializer.data)  # {"name": "Van Herrington", "addresses": []}
```

This is handy when you dynamically modify field values within your pydantic validators.
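Conceptually, this back-population amounts to overlaying the validated (and possibly transformed) model values onto the serializer's raw input. The sketch below is plain Python illustrating only the data flow, not `drf_pydantic`'s actual implementation, which hooks into the serializer's validation cycle; the `backpopulate` helper name is hypothetical.

```python
# Conceptual sketch only: model the "backpopulate after validation" data flow.
def backpopulate(raw_data: dict, validated_fields: dict) -> dict:
    """Overlay pydantic-validated values onto the serializer's raw input."""
    # Validated values win wherever the validator transformed them.
    return {**raw_data, **validated_fields}


raw = {"name": "van herrington", "addresses": []}
# What a validator like `v.strip().title()` would have produced:
validated = {"name": "Van Herrington", "addresses": []}

print(backpopulate(raw, validated))  # {'name': 'Van Herrington', 'addresses': []}
```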
You can disable this behavior by setting `"backpopulate_after_validation": False`:

```python
class MyModel(BaseModel):
    ...

    drf_config = {"validate_pydantic": True, "backpopulate_after_validation": False}
```

### Validation Errors

By default, pydantic's `ValidationError` is wrapped within DRF's `ValidationError`. If you want to raise pydantic's `ValidationError` directly, set `"validation_error": "pydantic"` in the `drf_config` property of your model:

```python
import pydantic

from drf_pydantic import BaseModel


class MyModel(BaseModel):
    name: str
    addresses: list[str]

    @pydantic.field_validator("name")
    @classmethod
    def validate_name(cls, v):
        assert isinstance(v, str)
        if v != "Billy":
            raise ValueError("Wrong door")
        return v

    drf_config = {"validate_pydantic": True, "validation_error": "pydantic"}


my_serializer = MyModel.drf_serializer(data={"name": "Van", "addresses": []})
my_serializer.is_valid()  # this will raise pydantic.ValidationError
```

> [!NOTE]
> When a model is invalid from both DRF's and pydantic's perspectives and exceptions are enabled (`.is_valid(raise_exception=True)`), DRF's `ValidationError` will be raised regardless of the `validation_error` setting, because DRF validation always runs first.

> [!CAUTION]
> Setting `validation_error` to `pydantic` has side effects:
>
> - It may break your views, because they expect DRF's `ValidationError`.
> - Calling `.is_valid()` will always raise `pydantic.ValidationError` if the data is invalid, even without `.is_valid(raise_exception=True)`.

### Existing Models

If you have an existing code base and want to add the `drf_serializer` attribute to only some of your models, you can extend your existing pydantic models by adding `drf_pydantic.BaseModel` as a parent class of the models you want to extend.
Your existing pydantic models:

```python
from pydantic import BaseModel


class Pet(BaseModel):
    name: str


class Dog(Pet):
    breed: str
```

Update your `Dog` model and get the serializer via `drf_serializer`:

```python
from drf_pydantic import BaseModel as DRFBaseModel
from pydantic import BaseModel


class Pet(BaseModel):
    name: str


class Dog(DRFBaseModel, Pet):
    breed: str


Dog.drf_serializer
```

> [!IMPORTANT]
> Inheritance order is important: `drf_pydantic.BaseModel` must always come before `pydantic.BaseModel`.

### Nested Models

If you have nested models and want to generate a serializer for only one of them, you don't need to update all of the models. Simply update the model you need, and `drf_pydantic` will automatically generate serializers for all standard nested pydantic models:

```python
from drf_pydantic import BaseModel as DRFBaseModel
from pydantic import BaseModel


class Apartment(BaseModel):
    floor: int
    tenant: str


class Building(BaseModel):
    address: str
    apartments: list[Apartment]


class Block(DRFBaseModel):
    buildings: list[Building]


Block.drf_serializer
```

### Manual Serializer Configuration

If `drf_pydantic` doesn't generate the serializer you need, you can configure the DRF serializer fields for each pydantic field manually, or create a custom serializer for the model altogether.

> [!IMPORTANT]
> When manually configuring the serializer, you are responsible for setting all properties of the fields (e.g., `allow_null`, `required`, `default`, etc.). `drf_pydantic` does not perform any introspection for fields that are manually configured, or for any fields if a custom serializer is used.

#### Per-Field Configuration

```python
from typing import Annotated

from drf_pydantic import BaseModel
from rest_framework.serializers import IntegerField


class Person(BaseModel):
    name: str
    age: Annotated[float, IntegerField(min_value=0, max_value=100)]
```

#### Custom Serializer

In the example below, `Person` will use `MyCustomSerializer` as its DRF serializer.
`Employee` will have its own serializer generated by `drf_pydantic`, since it doesn't inherit a user-defined `drf_serializer` attribute. `Company` will use `Person`'s manually defined serializer for its `ceo` field.

```python
from drf_pydantic import BaseModel, DrfPydanticSerializer
from rest_framework.serializers import CharField, IntegerField


class MyCustomSerializer(DrfPydanticSerializer):
    name = CharField(allow_null=False, required=True)
    age = IntegerField(allow_null=False, required=True)


class Person(BaseModel):
    name: str
    age: float

    drf_serializer = MyCustomSerializer


class Employee(Person):
    salary: float


class Company(BaseModel):
    ceo: Person
```

> [!IMPORTANT]
> Added in version v2.6.0
>
> A manual `drf_serializer` must have `DrfPydanticSerializer` as its base class in order for pydantic validation to work properly. You can still use the standard `Serializer` from `rest_framework`, but automatic pydantic model validation will not work consistently and you will get a warning.

### Additional Properties

Additional field properties are mapped as follows (pydantic -> DRF):

- `description` -> `help_text`
- `title` -> `label`
- `StringConstraints` -> `min_length` and `max_length`
- `pattern` -> uses the specialized `RegexField` serializer field
- `max_digits` and `decimal_places` are carried over (used for `Decimal` types, with the current decimal context precision)
- `ge` / `gt` -> `min_value` (only for numeric types)
- `le` / `lt` -> `max_value` (only for numeric types)
Latest version: 2.7.1 Released: 2025-03-28