# pydantic-apply

## Installation

Just use `pip install pydantic-apply` to install the library.

**Note:** `pydantic-apply` is compatible with pydantic version 2.x on Python 3.9, 3.10, 3.11, 3.12 and 3.13. This is also ensured by running all tests on all of those versions using tox.

## About

With `pydantic-apply` you can apply changes to your pydantic models by using the `ApplyModelMixin` it provides:

```python
import pydantic

from pydantic_apply import ApplyModelMixin


class Something(ApplyModelMixin, pydantic.BaseModel):
    name: str
    age: int


obj = Something(name='John Doe', age=42)
obj.model_apply({
    "age": 43,
})
assert obj.age == 43
```

As the apply data you may pass any dictionary or other pydantic object. pydantic objects will be converted to dicts when being applied - but only fields that were explicitly set on the model instance will be used. Also note that `model_apply()` will ignore all fields not present in the model, just like the model constructor would.

### Nested models

`pydantic-apply` also knows how to apply changes to nested models. If those models are themselves subclasses of `ApplyModelMixin`, it will call `model_apply()` on those fields as well. Otherwise the whole attribute will be replaced.

### Apply changes when using `validate_assignment`

When your models have `validate_assignment` enabled, it may become tricky to apply changes to the model. This is because you can only assign fields one at a time; with `validate_assignment` enabled, each field assignment triggers its own validation, and this validation might fail because the model state has not been fully changed yet and is thus in a "broken" intermediate state.

`pydantic-apply` takes care of this issue and disables validation for each assignment while applying the changes. It also ensures the resulting object still passes validation, so you don't have to care about this case at all. (A short sketch of this behaviour follows at the end of this README.)

## Contributing

If you want to contribute to this project, feel free to just fork the project, create a dev branch in your fork and then create a pull request (PR). If you are unsure about whether your changes really suit the project, please create an issue first to talk about it.
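Below is a minimal sketch (not taken from the project docs; the `Interval` model and its validator are made up for illustration) of the `validate_assignment` behaviour described above:

```python
import pydantic

from pydantic_apply import ApplyModelMixin


class Interval(ApplyModelMixin, pydantic.BaseModel):
    model_config = pydantic.ConfigDict(validate_assignment=True)

    start: int
    end: int

    @pydantic.model_validator(mode="after")
    def check_order(self):
        if self.start > self.end:
            raise ValueError("start must not be greater than end")
        return self


interval = Interval(start=1, end=2)

# Assigning field by field (interval.start = 10, then interval.end = 20) would fail on the
# first assignment, because the intermediate state start=10, end=2 is invalid.
# model_apply() applies both changes before validation runs again.
interval.model_apply({"start": 10, "end": 20})

assert interval.start == 10 and interval.end == 20
```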
Latest version: 0.7.1 Released: 2025-05-05
# Pydantic Mongo

A Python library that offers an easy-to-use Repository pattern for MongoDB, supporting both synchronous and asynchronous operations. It simplifies working with databases by providing a clear interface for CRUD (Create, Read, Update, Delete) operations using Pydantic models. With built-in data validation and serialization from Pydantic, it helps manage your MongoDB data safely.

Read the documentation

## Features

- Asynchronous and synchronous support
- Pydantic models integration
- Type-safe MongoDB operations
- Cursor-based pagination

## Installation

```bash
pip install pydantic-mongo
```

## Usage Examples

### Defining Models and Repository

```python
from bson import ObjectId
from pydantic import BaseModel
from pydantic_mongo import AbstractRepository, PydanticObjectId
from pymongo import MongoClient
from typing import Optional, List

# Define your models
class Foo(BaseModel):
    count: int
    size: float = None

class Bar(BaseModel):
    apple: str = 'x'
    banana: str = 'y'

class Spam(BaseModel):
    # PydanticObjectId is an alias to Annotated[ObjectId, ObjectIdAnnotation]
    id: Optional[PydanticObjectId] = None
    foo: Foo
    bars: List[Bar]

# Create a repository
class SpamRepository(AbstractRepository[Spam]):
    class Meta:
        collection_name = 'spams'

# Connect to the database
client = MongoClient("mongodb://localhost:27017")
database = client["example"]

repo = SpamRepository(database)
```

### Creating and Saving Documents

```python
# Create a new document
spam = Spam(foo=Foo(count=1, size=1.0), bars=[Bar()])

# Create a document with a predefined ID
spam_with_predefined_id = Spam(
    id=ObjectId("611827f2878b88b49ebb69fc"),
    foo=Foo(count=2, size=2.0),
    bars=[Bar()]
)

# Save a single document
repo.save(spam)  # spam.id is now set to an ObjectId

# Save multiple documents
repo.save_many([spam, spam_with_predefined_id])
```

### Querying Documents

```python
# Find by ID
result = repo.find_one_by_id(spam.id)

# Find by ObjectId
result = repo.find_one_by_id(ObjectId('611827f2878b88b49ebb69fc'))
assert result.foo.count == 2

# Find one by custom query
result = repo.find_one_by({'foo.count': 1})

# Find multiple documents by query
results = repo.find_by({'foo.count': {'$gte': 1}})
```

### Pagination

```python
# Get the first page
edges = repo.paginate({'foo.count': {'$gte': 1}}, limit=10)

# Get the next page using the last cursor
more_edges = repo.paginate(
    {'foo.count': {'$gte': 1}},
    limit=10,
    after=list(edges)[-1].cursor
)
```

### Deleting Documents

```python
# Delete a document
repo.delete(spam)

# Delete by ID
repo.delete_by_id(ObjectId("..."))
```

### Async Support

For asynchronous applications, you can use AsyncAbstractRepository, which provides the same functionality as AbstractRepository but with async/await support:

```python
from typing import Optional

from pymongo import AsyncMongoClient
from pydantic import BaseModel
from pydantic_mongo import AsyncAbstractRepository

class User(BaseModel):
    id: Optional[str] = None
    name: str
    email: str

class UserRepository(AsyncAbstractRepository[User]):
    class Meta:
        collection_name = 'users'

# Initialize the database connection
client = AsyncMongoClient('mongodb://localhost:27017')
database = client["mydb"]

# Create a repository instance
user_repo = UserRepository(database)

# Example usage
user = User(name='John Doe', email='john@example.com')
await user_repo.save(user)
user = await user_repo.find_one_by_id(user_id)  # look up by an existing id
```

## License

MIT License
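Building on the pagination example above, here is a minimal sketch of walking through all pages with a hypothetical helper (assumption: each returned edge exposes the document as `.node` in addition to its `.cursor`):

```python
def iter_all_spams(repo, query, page_size=10):
    """Yield every matching document by following pagination cursors."""
    cursor = None
    while True:
        edges = list(repo.paginate(query, limit=page_size, after=cursor))
        if not edges:
            break
        for edge in edges:
            yield edge.node  # assumption: the document sits on `.node`
        cursor = edges[-1].cursor


# Usage with the repository from the examples above:
# for spam in iter_all_spams(repo, {'foo.count': {'$gte': 1}}):
#     print(spam.id)
```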
Latest version: 3.1.0 Released: 2025-04-18
# pydantic-sweep

`pydantic_sweep` is a library to programmatically, safely and flexibly define complex parameter sweeps over pydantic models in Python.

Highlights:

- Specify parameter sweeps in Python
- Flexibility: specify complex parameter combinations by chaining simple functional operations
- Safety checks for parameter combinations (get meaningful errors early)
- pydantic field validation

For example, the following code will instantiate models with `(x=5, sub=Sub1(s=1))` and `(x=6, sub=Sub1(s=2))` and try each of those with seed values of `seed=43` and `seed=44`, leading to four different configurations:

```python
import pydantic_sweep as ps


class Sub1(ps.BaseModel):
    s: int = 5


class Sub2(ps.BaseModel):
    y: str = "hi"


class Model(ps.BaseModel):
    seed: int = 42
    x: int = 5
    sub: Sub1 | Sub2


# We want to test across two seeds
configs = ps.config_product(
    ps.field("seed", [43, 44]),
    # And two specific Sub1 and x combinations
    ps.config_zip(
        ps.field("x", [ps.DefaultValue, 6]),
        ps.field("sub.s", [1, 2]),
    )
)
# This includes safety checks that Sub1 / Sub2 are uniquely specified
models = ps.initialize(Model, configs)

# The code above is equivalent to
models_manual = [
    Model(seed=43, sub=Sub1(s=1)),
    Model(seed=43, x=6, sub=Sub1(s=2)),
    Model(seed=44, sub=Sub1(s=1)),
    Model(seed=44, x=6, sub=Sub1(s=2)),
]
assert models == models_manual

# We can also check that we didn't accidentally duplicate a setting
ps.check_unique(models)
```

While manually specifying the combinations may still be viable in this toy example, the library allows arbitrarily combining different configs and sub-models, making it a powerful tool for large-scale experiment definition. To learn more about the capabilities of the library, please visit the documentation page.

## Installation

You can install the library by running

```bash
pip install 'pydantic-sweep'
```

## License

The main code-base is licensed under MPL-2.0, a weak copyleft license that allows commercial use. See the license file for the exact clauses and this FAQ for a high-level description.

An exception to this are the documentation in the docs and example folders as well as this README file, which are licensed under 0BSD: a highly permissive license that does not require attribution. That way, you are free to copy & paste example code into your use-case. See here for a high-level description.
Latest version: 0.3.3 Released: 2025-02-23
# pydantic-ai-mlx

MLX local inference for Pydantic AI, through LM Studio or mlx-lm directly. Run MLX-compatible HuggingFace models on Apple silicon locally with Pydantic AI.

Two options are provided as backends:

- LM Studio backend (an OpenAI-compatible server that can also utilize mlx-lm; the model runs in a separate background process)
- mlx-lm backend (direct integration with Apple's library; the model runs within your application; experimental support)

STILL IN DEVELOPMENT, NOT RECOMMENDED FOR PRODUCTION USE YET. Contributions are welcome!

## Features

- [x] LM Studio backend, should be fully supported
- [x] Streaming text support for mlx-lm backend
- [ ] Tool calling support for mlx-lm backend

Apple's MLX seems more performant on Apple silicon than llama.cpp (Ollama), as of January 2025.

## Installation

```bash
uv add pydantic-ai-mlx
```

## Usage

### LM Studio backend

```python
from pydantic_ai import Agent
from pydantic_ai.messages import ModelMessage

from pydantic_ai_lm_studio import LMStudioModel

model = LMStudioModel(model_name="mlx-community/Qwen2.5-7B-Instruct-4bit")  # supports tool calling
agent = Agent(model, system_prompt="You are a chatbot.")


async def stream_response(user_prompt: str, message_history: list[ModelMessage]):
    async with agent.run_stream(user_prompt, message_history) as result:
        async for message in result.stream():
            yield message
```

### mlx-lm backend

```python
from pydantic_ai import Agent
from pydantic_ai.messages import ModelMessage

from pydantic_ai_mlx_lm import MLXModel

model = MLXModel(model_name="mlx-community/Llama-3.2-3B-Instruct-4bit")
# See https://github.com/ml-explore/mlx-examples/blob/main/llms/README.md#supported-models
# and https://huggingface.co/mlx-community for supported models

agent = Agent(model, system_prompt="You are a chatbot.")


async def stream_response(user_prompt: str, message_history: list[ModelMessage]):
    async with agent.run_stream(user_prompt, message_history) as result:
        async for message in result.stream():
            yield message
```
Latest version: 0.2.5 Released: 2025-03-09
# pydantic-tensor

Support for parsing, validation, and serialization of common tensors (np.ndarray, torch.Tensor, tensorflow.Tensor, jax.Array) for Pydantic.

## Installation

```console
pip install pydantic-tensor
```

## Usage

### Validation

```python
from typing import Annotated, Any, Literal

import numpy as np
import tensorflow as tf
import torch
from pydantic import BaseModel, Field

from pydantic_tensor import Tensor

# allow only integers greater than or equal to 2 and less than or equal to 3
DimType = Annotated[int, Field(ge=2, le=3)]


class Model(BaseModel):
    #              tensor type                           shape                     dtype
    tensor: Tensor[torch.Tensor | np.ndarray[Any, Any], tuple[DimType, DimType], Literal["int32", "int64"]]


parsed = Model.model_validate({"tensor": np.ones((2, 2), dtype="int32")})
# access the parsed tensor via the "value" property
parsed.tensor.value

# invalid shapes
Model.model_validate({"tensor": np.ones((1, 1), dtype="int32")})
Model.model_validate({"tensor": np.ones((4, 4), dtype="int32")})
Model.model_validate({"tensor": np.ones(2, dtype="int32")})
Model.model_validate({"tensor": np.ones((2, 2, 2), dtype="int32")})

# invalid dtype
Model.model_validate({"tensor": np.ones((2, 2), dtype="float32")})

# successfully validate np.ndarray
Model.model_validate({"tensor": np.ones((2, 2), dtype="int32")})

# convert tf.Tensor to torch.Tensor
Model.model_validate({"tensor": tf.ones((2, 2), dtype=tf.int32)})
```

### Parsing

The JSON representation of the tensor contains:

- the binary data of the tensor in little-endian format, encoded in Base64
- the shape of the tensor
- the datatype of the tensor

```python
from typing import Any

import numpy as np
from pydantic import BaseModel

from pydantic_tensor import Tensor


class Model(BaseModel):
    tensor: Tensor[Any, Any, Any]


parsed = Model.model_validate({"tensor": np.ones((2, 2), dtype="float32")})

# dump to JSON: {"tensor":{"shape":[2,2],"dtype":"float32","data":"AACAPwAAgD8AAIA/AACAPw=="}}
json_dump = parsed.model_dump_json()

# parse back to a tensor: array([[1., 1.], [1., 1.]], dtype=float32)
Model.model_validate_json(json_dump).tensor.value
```

### DType Collections

The types Int, UInt, Float, Complex, and BFloat from pydantic_tensor.types are unions of dtypes according to their names. For example, Int is defined as Literal["int8", "int16", "int32", "int64"].

```python
from typing import Any

import numpy as np
from pydantic import BaseModel

from pydantic_tensor import Tensor
from pydantic_tensor.types import Int


class Model(BaseModel):
    tensor: Tensor[Any, Any, Int]


for dtype in ["int8", "int16", "int32", "int64"]:
    Model.model_validate({"tensor": np.ones((2, 2), dtype=dtype)})  # success

Model.model_validate({"tensor": np.ones((2, 2), dtype="float32")})  # failure
```

### Lazy Tensors

Use JaxArray, NumpyNDArray, TensorflowTensor, TorchTensor for lazy versions of the tensor types. They only handle tensors when their corresponding libraries (jax, numpy, tensorflow, torch) are imported somewhere else in the program.

```python
from typing import Any

import numpy as np
from pydantic import BaseModel

from pydantic_tensor import Tensor
from pydantic_tensor.backend.torch import TorchTensor


class Model(BaseModel):
    tensor: Tensor[TorchTensor, Any, Any]


Model.model_validate({"tensor": np.ones((2, 2), dtype="float32")})  # failure

import torch

Model.model_validate({"tensor": np.ones((2, 2), dtype="float32")})  # success
```

## Development

- Install pre-commit hooks: `pre-commit install`
- Lint: `hatch run lint:all`
- Test: `hatch run test:test`
- Check spelling: `hatch run spell`
Latest version: 0.2.0 Released: 2024-05-09
pylint-pydantic
===============

A Pylint plugin to help Pylint understand Pydantic.

How to use
----------

Installation

.. code:: shell

    pip install pylint-pydantic

Use in console

.. code:: shell

    pylint --load-plugins pylint_pydantic xxxxx

Use in VS Code: add an item to settings.json

.. code:: shell

    "pylint.args": ["--load-plugins", "pylint_pydantic"]

    # in old VS Code versions maybe
    "python.linting.pylintArgs": ["--load-plugins", "pylint_pydantic"]

Tests
-----

.. code:: shell

    pylint --rcfile=pylintrc --load-plugins pylint_pydantic tests/

    ------------------------------------
    Your code has been rated at 10.00/10

FAQ
---

How to resolve ``pylint: No name 'BaseModel' in module 'pydantic'``?

Add the ``--extension-pkg-whitelist='pydantic'`` parameter (see #1961_).

Other
-----

If you have any questions, please create an issue: https://github.com/fcfangcc/pylint-pydantic/issues

Changelog
---------

- v0.3.4: fixed #35 #34
- v0.3.1: fixed #29
- v0.3.0: support Pylint 3
- v0.2.4: fix pydantic.Field with BaseModel support
- v0.2.2: fix model_validator keyword mode, pydantic>=2.0.3
- v0.2.1: support model_validator
- v0.2.0: support Pydantic V2
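For context, here is a minimal, hypothetical example (not from the project docs) of the kind of model this plugin is meant for: without ``--load-plugins pylint_pydantic``, vanilla Pylint has historically raised false positives (such as ``no-self-argument`` on validator methods) on code like this.

.. code:: python

    from pydantic import BaseModel, field_validator


    class User(BaseModel):
        name: str
        age: int

        @field_validator("age")
        def check_age(cls, value):  # vanilla Pylint may flag `no-self-argument` here
            if value < 0:
                raise ValueError("age must be non-negative")
            return value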
Latest version: 0.3.5 Released: 2025-01-07
# Pydantic Config

Support for Pydantic settings configuration file loading.

## Installation

Pydantic Config can be installed via pip:

```
pip install pydantic-config
```

Pydantic Config is also available on conda under the conda-forge channel:

```
conda install pydantic-config -c conda-forge
```

### Optional Dependencies

Pydantic-Config has the following optional dependencies:

- yaml - `pip install pydantic-config[yaml]`
- toml - `pip install pydantic-config[toml]` (only for Python < 3.11)

You can install all the optional dependencies with `pip install pydantic-config[all]`.

## Usage

```toml
# config.toml
app_name = "Python Application"
description = "Test application description"
```

```python
from pydantic_config import SettingsModel, SettingsConfig


class Settings(SettingsModel):
    app_id: str = 1
    app_name: str = None
    description: str = None
    log_level: str = 'INFO'

    model_config = SettingsConfig(
        config_file='config.toml',
    )


settings = Settings()
print(settings)
# app_id='1' app_name='Python Application' description='Test application description' log_level='INFO'
```

## Using multiple config files

Multiple config files can be loaded by passing a list of file names. Files will be loaded in the order they are listed, meaning later files in the list take priority over earlier files.

```toml
# config.toml
app_name = "Python Application"
description = "Test application description"
```

```json
// config.json
{
  "description": "Description from JSON file",
  "log_level": "WARNING"
}
```

```python
from pydantic_config import SettingsModel, SettingsConfig


class Settings(SettingsModel):
    app_id: str = 1
    app_name: str = 'App Name'
    description: str = None
    log_level: str = 'INFO'

    model_config = SettingsConfig(
        config_file=['config.toml', 'config.json']  # config.json takes priority over config.toml
    )


settings = Settings()
print(settings)
# app_id='1' app_name='Python Application' description='Description from JSON file' log_level='WARNING'
```

## Supported file formats

Currently, the following file formats are supported:

- `.yaml` - requires the `pyyaml` package
- `.toml` - requires the `tomli` package for Python < 3.11
- `.json`
- `.ini`

## Requiring config files to load

Config files are loaded from the specified file path. By default, if no file is found, the file is simply not loaded (no error occurs). This may be useful if you want to specify config files that may or may not exist; for example, you may have different config files per environment: `config-prod.toml` and `config-dev.toml`.

To disable this behavior, set `config_file_required=True`. This will cause an error to be raised if the specified config file(s) do not exist. Setting this to `True` will also prohibit the `config_file` parameter from being set to `None` or empty `[]`.

## Merging

If your configurations have existing list or dict variables, the contents will be merged by default. To disable this behavior and override the contents instead, set the `config_merge` option to `False` in the settings `Config` class.
```toml
# config.toml
[foo]
item1 = "value1"
```

```toml
# config2.toml
[foo]
item2 = "value2"
```

```python
from pydantic_config import SettingsModel, SettingsConfig


class Settings(SettingsModel):
    foo: dict = {}

    model_config = SettingsConfig(
        config_file=['config.toml', 'config2.toml'],
        config_merge=True,
    )


settings = Settings()
print(settings)
# foo={'item1': 'value1', 'item2': 'value2'}

# If config_merge=False then config2.toml would override the values from config.toml:
# foo={'item2': 'value2'}
```

### Duplicate items in merged lists

By default, all list items will be merged into a single list regardless of duplicated items. To only keep unique list items, set `config_merge_unique=True`; this keeps only the unique items within a merged list (see the sketch below).
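A minimal sketch of the `config_merge_unique` option described above, assuming it is passed to `SettingsConfig` alongside `config_merge`, and assuming two hypothetical files `config3.toml` containing `bar = ["a", "b"]` and `config4.toml` containing `bar = ["b", "c"]`:

```python
from pydantic_config import SettingsModel, SettingsConfig


class Settings(SettingsModel):
    bar: list = []

    model_config = SettingsConfig(
        config_file=['config3.toml', 'config4.toml'],
        config_merge=True,
        config_merge_unique=True,  # drop duplicates while merging lists
    )


settings = Settings()
print(settings)
# bar=['a', 'b', 'c']   (with config_merge_unique=False the duplicate 'b' would be kept)
```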
Latest version: 0.3.0 Released: 2023-12-06
# xml-to-pydantic

xml-to-pydantic is a library for Python to convert XML or HTML to pydantic models. This can be used to:

- Parse and validate a scraped HTML page into a Python object
- Parse and validate an XML response from an XML-based API
- Parse and validate data stored in XML format

(Please note that this project is not affiliated in any way with the great team at pydantic.)

pydantic is a Python library for data validation, applying type hints / annotations. It enables the creation of easy or complex data validation rules for processing external data. That data usually comes in JSON format or from a Python dictionary. Processing and validating HTML or XML into pydantic models would otherwise require two steps: convert the HTML or XML to a Python dictionary, then convert that to the pydantic model. This library provides a convenient way to combine those steps.

Note: if you are using this library to parse external, uncontrolled HTML or XML, you should be aware of possible attack vectors through XML: https://github.com/tiran/defusedxml. This library uses lxml under the hood.

## Installation

Use pip, or your favorite Python package manager (pipenv, poetry, pdm, ...):

```bash
pip install xml-to-pydantic
```

## Usage

The HTML or XML data is extracted using XPath. For simple documents, the XPath can be calculated from the model:

```py
from xml_to_pydantic import ConfigDict, XmlBaseModel

html_bytes = b"""<html>
<head><title>My page title</title></head>
<body>
  <h1>Header</h1>
  <main>
    <p>Paragraph1</p>
    <p>Paragraph2</p>
    <p>Paragraph3</p>
  </main>
</body>
</html>"""


class MainContent(XmlBaseModel):
    model_config = ConfigDict(xpath_root="/html/body/main")
    p: list[str]


result = MainContent.model_validate_html(html_bytes)
print(result)
#> p=['Paragraph1', 'Paragraph2', 'Paragraph3']
```

```py
from xml_to_pydantic import XmlBaseModel

xml_bytes = b"""<?xml version="1.0" encoding="UTF-8"?>
<root>
  <element>4.53</element>
  <element>3.25</element>
</root>"""


class MyModel(XmlBaseModel):
    element: list[float]


model = MyModel.model_validate_xml(xml_bytes)
print(model)
#> element=[4.53, 3.25]
```

However, for more complicated XML, this one-to-one correspondence may not be convenient, and a better approach is supplying the XPath directly (similar to how pydantic allows specifying an alias for a field):

```py
from xml_to_pydantic import XmlBaseModel, XmlField

xml_bytes = b"""<?xml version="1.0" encoding="UTF-8"?>
<root>
  <element>4.53</element>
  <a href="https://example.com">Link</a>
</root>"""


class MyModel(XmlBaseModel):
    number: float = XmlField(xpath="./element/text()")
    href: str = XmlField(xpath="./a/@href")


model = MyModel.model_validate_xml(xml_bytes)
print(model)
#> number=4.53 href='https://example.com'
```

The parsing can also deal with nested models and lists:

```py
from xml_to_pydantic import XmlBaseModel, XmlField

xml_bytes = b"""<?xml version="1.0" encoding="UTF-8"?>
<root>
  <level1>
    <level2>value1</level2>
    <level2>value2</level2>
    <level2>value3</level2>
  </level1>
  <level11>value11</level11>
</root>"""


class NextLevel(XmlBaseModel):
    level2: list[str] = XmlField(xpath="./level2/text()")


class MyModel(XmlBaseModel):
    next_level: NextLevel = XmlField(xpath="./level1")
    level_11: list[str] = XmlField(xpath="./level11/text()")


model = MyModel.model_validate_xml(xml_bytes)
print(model)
#> next_level=NextLevel(level2=['value1', 'value2', 'value3']) level_11=['value11']
```

## Development

Prerequisites:

- Any Python 3.8 through 3.12
- poetry for dependency management
- git
- make (to use the helper scripts in the Makefile)

Autoformatting can be applied by running

```bash
make lintable
```

Before committing, remember to run

```bash
make lint
make test
```
Latest version: 0.2 Released: 2024-06-11
# apollo_pydantic

The apollo_pydantic package is a client wrapper for Apollo. It uses the pydantic_settings type system to automatically parse configuration and convert types, so Apollo configuration is easy to consume, and it can synchronize configuration automatically or manually. For details on pydantic_settings, see pydantic_settings.

When using Apollo as the configuration center for microservices, apollo_pydantic is recommended.

## Installation

apollo_pydantic is published on PyPI and can be installed with:

```bash
pip install -U apollo-pydantic
```

## Notes

1. It is recommended to use a Base class as the base class for your settings classes and configure the Apollo connection parameters there. When using multiple appids and clusters, configure multiple base classes.
2. Each settings class corresponds to one namespace; settings classes inheriting from the same base class must not repeat a namespace.
3. Each settings class corresponds to one label, which is the label used for Apollo gray releases.

```python
class Base(ApolloSettings):
    model_config = ApolloSettingsConfigDict(
        # official demo server
        config_server='http://81.68.181.139:8080',
        appid='0001234',
        cluster='dev',
        secret_key='0398e769780c4e6399d9e6f73910e155'  # if set
    )


class ApolloSettingsModel(Base):
    namespace = 'application'
    label = None
    ...
```

4. Nested configuration items are supported, but note that nested items must inherit from `pydantic.BaseModel` and must be defined before the root settings class, otherwise parsing may fail. For details, see [pydantic_settings](https://docs.pydantic.dev/latest/concepts/pydantic_settings/).

```python
class Base(ApolloSettings):
    ...


class Nested(BaseModel):
    some_name: str


class ApolloSettingsModel(Base):
    namespace = 'application'
    # config item: nested.some_name = xxx
    nested: Nested


await ApolloSettingsModel.refresh()  # manual refresh
print(ApolloSettingsModel().nested.some_name)  # xxx
```

5. The `validation_alias` and `alias` attributes of configuration items are supported. They let you give a config item an alias: for a config item `alias_name = some_value`, setting `validation_alias = 'alias_name'` or `alias = 'alias_name'` makes the field parse to `some_value`. Note that an alias must not contain the `.` character, because `.` is parsed as a nested config item. With the Apollo configuration `some_name = xxx`, the Python code looks like this:

```python
class Base(ApolloSettings):
    ...


class ApolloSettingsModel(Base):
    namespace = 'application'
    label = None
    # config item: some_name = xxx
    some_value: str = Field(alias='some_name')


await ApolloSettingsModel.refresh()  # manual refresh
print(ApolloSettingsModel().some_value)  # xxx
```

Use `AliasPath` with caution unless you are proficient with `pydantic_settings`.

6. Custom `Field`s are supported; for the related validation, see [pydantic_settings](https://docs.pydantic.dev/latest/api/fields/).
7. Array-type configuration is supported: config items `some_name[0] = xxx` and `some_name[1] = yyy` are parsed into `['xxx', 'yyy']`. This style currently does not support nesting config item classes inside arrays; if you need that, use a JSON value instead.

```python
class Base(ApolloSettings):
    ...


class ApolloSettingsModel(Base):
    namespace = 'application'
    label = None
    # config items:
    # some_name[0] = xxx
    # some_name[1] = yyy
    some_name: List[str]
    # config item:
    # some_json_value = ["xxx", "yyy"]
    some_json_value: List[str]


await ApolloSettingsModel.refresh()  # manual refresh
print(ApolloSettingsModel().some_name)  # ['xxx', 'yyy']
print(ApolloSettingsModel().some_json_value)  # ['xxx', 'yyy']
```

8. Synchronizing configuration from a background thread is not supported yet; support for a synchronization thread will be added later.

## Examples

See the apollo_pydantic/examples directory for full examples. Apollo's official demo server is http://81.68.181.139.

### Using pydantic_settings

```python
import asyncio
import datetime
from typing import List, Optional

from pydantic import AliasPath, BaseModel, Field, HttpUrl

from apollo_pydantic import (ApolloSettings, ApolloSettingsConfigDict,
                             enable_debug)


class Base(ApolloSettings):
    model_config = ApolloSettingsConfigDict(
        # official demo server
        config_server='http://81.68.181.139:8080',
        appid='0001234',
        cluster='dev',
        secret_key='0398e769780c4e6399d9e6f73910e155'  # if set
    )


class B(BaseModel):
    bb: List[int]
    cc: int = Field(validation_alias=AliasPath('bb', 0))


class ApolloSettingsModel(Base):
    namespace = 'application'
    label = None
    # Apollo's official demo server
    url: HttpUrl = 'http://81.68.181.139:8080'
    key1: str
    white_list: List[int] = Field(default_factory=lambda: ['1', '2', '3'])
    port: int
    redis_port: int
    is_encrypted: bool
    # a.bb[0] = 123
    # using AliasPath here requires this settings class to also have an `a` field
    some_value: Optional[str] = Field(validation_alias=AliasPath('a', 'bb', 0))
    # c-dd = xxx
    c_dd: str = Field(alias='c-dd')
    a: B
    start: datetime.timedelta = 1
    arr: List


class Abc(Base):
    label = None
    namespace = 'aUQTr9'
    aaa: str = Field(alias='abc')
    abc: int


async def main():
    # enable_debug()
    # automatic refresh
    await ApolloSettings.start()
    while True:
        # manual refresh
        # await ApolloSettingsModel.refresh()
        print(ApolloSettingsModel(), datetime.datetime.now())
        print(Abc(), '###' * 20)
        await asyncio.sleep(10)


if __name__ == '__main__':
    # The Apollo configuration looks like this:
    """
    key1 = value1
    white_list = [1, 2, 3,"5",3734985]
    port = 3306
    redis_port = 6379
    is_encrypted = true
    123 = fsfg范德萨gf
    xxx = {"业务编号":"123","通道":"123"}  # ignore
    a.bb[0] = 1
    a.bb[1] = 2
    c-dd = fgfg
    arr[0] = 123
    arr[1] = 2
    """
    enable_debug()
    print('##' * 20)
    asyncio.run(main())
```

### Integrating with FastAPI

```python
from fastapi import FastAPI
# ... other imports


class Base(ApolloSettings):
    model_config = ApolloSettingsConfigDict(
        # official demo server
        config_server='http://81.68.181.139:8080',
        appid='0001234',
        cluster='dev',
        secret_key='0398e769780c4e6399d9e6f73910e155'  # if set
    )


# ... define your settings


async def sync_settings(app: FastAPI):
    # automatic sync
    await ApolloSettings.start()
    yield app
    await ApolloSettings.stop()


app = FastAPI(lifespan=sync_settings)

if __name__ == '__main__':
    import uvicorn
    uvicorn.run(app, host='0.0.0.0', port=8000)
```

## Features

- Automatic configuration synchronization
Latest version: 0.0.3 Released: 2025-04-25