85 Commits

Author SHA1 Message Date
d431dca353 Merge pull request #31 from HideyoshiNakazone/project/revises-roadmap
Revises Roadmap
2025-06-23 16:28:40 -03:00
bf42ad638f Adds Docs to README 2025-06-23 16:27:54 -03:00
1bb0995d79 Revises Roadmap 2025-06-23 16:24:45 -03:00
b56598eb02 Merge pull request #30 from HideyoshiNakazone/feature/adds-const
Feature/adds const
2025-06-23 15:32:37 -03:00
60ac12fe39 Adds Docs to Const Type 2025-06-23 15:30:07 -03:00
198ebecef0 Adds Final Tests for Const Type 2025-06-23 15:18:34 -03:00
65a81a8da5 Initial Const Implementation 2025-06-22 22:13:53 -03:00
42a1ae24fe Merge pull request #29 from HideyoshiNakazone/feature/adds-enums
Adds Aditional Validations in Enum
2025-06-22 17:39:54 -03:00
450d44c064 Adds Aditional Validations in Enum 2025-06-22 17:38:58 -03:00
92172c8711 Merge pull request #28 from HideyoshiNakazone/feature/adds-enums
[Feature] Adds Enums
2025-06-22 17:25:27 -03:00
6c94047ec0 Adds Docs for Enum 2025-06-22 17:21:28 -03:00
ef66903948 Minor Fixes in EnumTypeParser and Adds Better UnitTests 2025-06-22 16:43:53 -03:00
7e591f0525 Initial Implementation of Enum 2025-06-22 11:18:42 -03:00
4ba5d83df6 Merge pull request #27 from HideyoshiNakazone/project/add-funding
Create FUNDING.yml
2025-06-22 10:16:58 -03:00
bdaa0cb5b1 Create FUNDING.yml 2025-06-22 10:16:42 -03:00
0ede98fcf0 Merge pull request #26 from HideyoshiNakazone/feature/add-doc
Fix ReadTheDocs Config File
2025-06-22 08:45:32 -03:00
ed2bb35d45 Fix ReadTheDocs Config File 2025-06-22 08:45:05 -03:00
798ea1d601 Merge pull request #21 from HideyoshiNakazone/feature/add-doc
Feature/add doc
2025-06-21 18:43:38 -03:00
02d11f57b2 Adds Config ReadTheDocs 2025-06-21 18:40:41 -03:00
bcbc83e502 Fixes Minor Fields in Docs 2025-06-21 18:26:14 -03:00
ac239c2617 Adds Docs for AllOf and AnyOf 2025-06-21 18:20:44 -03:00
dee8b02d26 Adds Docs for Ref Type 2025-06-21 11:46:47 -03:00
12471ac804 Adds Docs for Object 2025-06-21 08:39:01 -03:00
b92cf37145 Adds Docs for Array, Bool and Numeric 2025-06-20 23:12:33 -03:00
249195ff26 Finalizes String Doc 2025-06-20 22:54:24 -03:00
c504efe23b Initial Work on Documentation 2025-06-19 23:51:33 -03:00
040ffcba66 Merge pull request #20 from HideyoshiNakazone/feature/ref-type-parser
[FEATURE] Implementation of $ref JSON Schema Keyword
2025-06-19 22:09:11 -03:00
58d4cd9707 Adds Feature Example of the New Feature to the ReadMe 2025-06-19 22:03:28 -03:00
607555898e Final and Tested Version of Ref 2025-06-19 00:39:54 -03:00
37cf59078e Working Version of Root Level Reference 2025-06-13 01:52:20 -03:00
f4effac41c Initial Working $ref Keyword with: ForwardRef, Partial Root Ref and Recursive Ref 2025-06-13 01:36:16 -03:00
188cd28586 **BROKEN INITIAL FOWARDREF** 2025-06-12 02:35:09 -03:00
760f30d08f Initial Implementation of $ref 2025-06-12 01:54:52 -03:00
129114a85f Merge pull request #19 from HideyoshiNakazone/project/fixes-feature-request-issue-template
Update issue templates
2025-06-12 00:53:38 -03:00
3e7d796ef7 Update issue templates 2025-06-12 00:53:28 -03:00
fd967cf6fe Merge pull request #17 from HideyoshiNakazone/project/adds-issue-template
Update issue templates
2025-06-12 00:45:39 -03:00
21c4e4ab75 Update issue templates 2025-06-12 00:45:14 -03:00
cbef7104c4 Merge pull request #16 from HideyoshiNakazone/improvement/better-internal-structure
Better Object Internal Structure and Type Selection
2025-06-04 01:27:29 -03:00
dbbb8e0419 Fixes Tests 2025-06-04 01:26:06 -03:00
4bbb896c46 Fixes Default Values in StringTypeParser 2025-06-04 01:12:45 -03:00
3273fd84bf Fixes Test and Reports 2025-06-03 03:00:49 -03:00
782e09d5e3 Adds Test to SchemaConverter.build Schema Validation 2025-06-03 02:35:25 -03:00
66ca341bb2 Adds Test to AllOf 2025-06-03 02:31:13 -03:00
25d8e68e95 Fixes Test and Reports 2025-06-03 02:20:15 -03:00
be7f04e20d Better TypeParser Kwargs 2025-06-03 02:05:21 -03:00
2b2c823e27 Fixes Test of AllOf 2025-06-03 00:49:54 -03:00
e37e9818ed Initial Work on TypeParser Kwargs 2025-06-03 00:48:22 -03:00
bef42e4cdb Better Object Internal Structure and Type Selection 2025-06-03 00:15:19 -03:00
894969332d Merge pull request #15 from HideyoshiNakazone/fix/better-typing-output
Fixes Typing Output
2025-06-02 20:43:13 -03:00
9e52783b22 Fixes Typing Output 2025-06-02 20:41:50 -03:00
393eaa5e0a Merge pull request #12 from PuChenTW/main
feat(parser): first‑class support for JSON string.format
2025-05-10 20:09:14 -03:00
b9c36a46b4 Merge pull request #13 from HideyoshiNakazone/adds-test-execution-pr
Adds PRs to the Test Execution GithubAction
2025-05-10 20:06:40 -03:00
db3d0eee45 Adds PRs to the Test Execution GithubAction 2025-05-10 20:05:26 -03:00
Pu Chen
b52997633c Support string format 2025-05-06 22:52:08 +08:00
Pu Chen
7a3266e4cc Install email-validator 2025-05-06 21:54:02 +08:00
cba4ef0e21 Merge pull request #11 from HideyoshiNakazone/any-all-ref-implementation
Implements: allOf, anyOf

Finalizes the implementation of allOf and anyOf, but the implementation of oneOf was cancelled for the time being
2025-04-19 17:32:58 -03:00
f9f986e3c8 Fixes Minor Element in AnyOf Test 2025-04-19 17:30:11 -03:00
1c546d252f Omits Test Dir in Test Coverage 2025-04-19 17:26:33 -03:00
b409ce49a5 Fixes Validation of JsonSchema 2025-04-19 17:23:38 -03:00
863494ab9c Finalizes AnyOfTypeParser Tests 2025-04-19 16:57:56 -03:00
509ee60b75 Fixes Import Order jambo.parser 2025-04-19 16:51:27 -03:00
20e4a69968 Move Aux Function to the GenericTypeParser Class 2025-04-19 16:45:32 -03:00
d74e700233 Removes Unecessary Case from ArrayParser 2025-04-19 15:48:54 -03:00
42bc0148b8 Adds Test for Boolean Default Value 2025-04-19 15:46:37 -03:00
c6a37dab74 Better Defaults Validation Implementation 2025-04-19 15:44:27 -03:00
5c3d3a39ba Implements Feature Complete AnyOf Keyword 2025-04-19 15:23:22 -03:00
5fdb4fa724 Removes OneOf due to complexity and niche use case
After further analysis, the functionality was deemed too complex to implement for such a niche use case and will therefore be removed from the implementation backlog
2025-04-17 16:06:55 -03:00
dc350aaa8b Adds Test for AllOfTypeParser Case 2025-04-17 03:07:08 -03:00
d5149061a1 Formats Import Orders 2025-04-17 03:04:38 -03:00
459d9da0b9 Final Implementation of AllOf Keyword 2025-04-17 03:03:22 -03:00
6d1febbcc1 Initial allOf Implementation 2025-04-14 03:22:42 -03:00
eb501fec74 Merge pull request #10 from HideyoshiNakazone/implements-object-default
Implements Object Defaults
2025-04-13 02:48:42 -03:00
7272b1a74b Implements Object Defaults 2025-04-13 02:40:07 -03:00
62f3f9b1c5 Merge pull request #9 from HideyoshiNakazone/better-tests
Implements Better Tests
2025-04-13 02:15:08 -03:00
af0a69ed35 Implements Better Object Tests 2025-04-13 02:13:01 -03:00
970aa50845 Implements Better Tests For: Int, Float, Bool 2025-04-13 01:45:28 -03:00
76b40847ce Implements Better String Tests 2025-04-12 19:37:53 -03:00
ec9171ba8f Implements Better Array Tests 2025-04-12 03:49:50 -03:00
4f68c49658 Merge pull request #8 from HideyoshiNakazone/codecov-report
Codecov report
2025-04-11 23:58:13 -03:00
22677e9811 Merge remote-tracking branch 'origin/main' into codecov-report 2025-04-11 23:57:03 -03:00
e8321f7d94 Adds Codecov Badge to README 2025-04-11 23:56:51 -03:00
470f322ff5 Merge pull request #7 from HideyoshiNakazone/codecov-report
Adds Codecov
2025-04-11 23:52:37 -03:00
21e64be29b Adds Codecov 2025-04-11 23:49:47 -03:00
df1df0daab Merge pull request #6 from HideyoshiNakazone/adds-description
Adds Description
2025-04-11 21:59:14 -03:00
e803e39a92 Adds Description 2025-04-11 21:55:32 -03:00
63 changed files with 5027 additions and 746 deletions

.github/FUNDING.yml (new file)

@@ -0,0 +1,15 @@
# These are supported funding model platforms
github: # Replace with up to 4 GitHub Sponsors-enabled usernames e.g., [user1, user2]
patreon: # Replace with a single Patreon username
open_collective: # Replace with a single Open Collective username
ko_fi: hideyoshinakazone
tidelift: # Replace with a single Tidelift platform-name/package-name e.g., npm/babel
community_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry
liberapay: # Replace with a single Liberapay username
issuehunt: # Replace with a single IssueHunt username
lfx_crowdfunding: # Replace with a single LFX Crowdfunding project-name e.g., cloud-foundry
polar: # Replace with a single Polar username
buy_me_a_coffee: # Replace with a single Buy Me a Coffee username
thanks_dev: # Replace with a single thanks.dev username
custom: # Replace with up to 4 custom sponsorship URLs e.g., ['link1', 'link2']

.github/ISSUE_TEMPLATE/bug_report.md (new file)

@@ -0,0 +1,25 @@
---
name: Bug report
about: Create a report to help us improve
title: "[BUG] Title Here"
labels: enhancement
assignees: HideyoshiNakazone
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior
**Expected behavior**
A clear and concise description of what you expected to happen.
**Environment Information:**
- Python:
- Jambo Version:
- Pydantic:
**Additional context**
Add any other context about the problem here.


@@ -0,0 +1,16 @@
---
name: Feature request
about: Suggest an idea for this project
title: "[FEATURE REQUEST] Title Here"
labels: enhancement
assignees: HideyoshiNakazone
---
**Is this a [Json Schema](https://json-schema.org/specification) Keyword that is missing?** [yes|no]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Additional context**
Add any other context or screenshots about the feature request here.


@@ -2,7 +2,10 @@ name: Test and Publish
on:
push
push:
pull_request:
branches:
- main
permissions:
contents: read
@@ -11,6 +14,8 @@ jobs:
test:
name: run-tests
runs-on: ubuntu-latest
if: github.event_name != 'pull_request' ||
github.event.pull_request.head.repo.full_name != github.event.pull_request.base.repo.full_name
strategy:
matrix:
python-version:
@@ -35,7 +40,15 @@ jobs:
run: uv sync --all-extras --dev
- name: Run tests
run: uv run poe tests
run: |
uv run poe tests
uv run poe tests-report
- name: Upload coverage reports to Codecov
uses: codecov/codecov-action@v5
with:
token: ${{ secrets.CODECOV_TOKEN }}
if: matrix.python-version == '3.10'
publish:
name: publish

.readthedocs.yaml (new file)

@@ -0,0 +1,22 @@
version: 2
# Specify os and python version
build:
os: "ubuntu-24.04"
tools:
python: "3.12"
jobs:
create_environment:
- asdf plugin add uv
- asdf install uv latest
- asdf global uv latest
- UV_PROJECT_ENVIRONMENT=$READTHEDOCS_VIRTUALENV_PATH uv sync --all-extras
install:
- "true"
# Build documentation in the docs/ directory with Sphinx
sphinx:
configuration: docs/source/conf.py
# Optionally build your docs in additional formats such as PDF and ePub
formats: all


@@ -5,6 +5,9 @@
<img src="https://img.shields.io/github/last-commit/HideyoshiNakazone/jambo.svg">
<img src="https://github.com/HideyoshiNakazone/jambo/actions/workflows/build.yml/badge.svg" alt="Tests">
</a>
<a href="https://codecov.io/gh/HideyoshiNakazone/jambo" target="_blank">
<img src="https://codecov.io/gh/HideyoshiNakazone/jambo/branch/main/graph/badge.svg" alt="Coverage">
</a>
<br />
<a href="https://pypi.org/project/jambo" target="_blank">
<img src="https://badge.fury.io/py/jambo.svg" alt="Package version">
@@ -24,10 +27,21 @@ Created to simplifying the process of dynamically generating Pydantic models for
## ✨ Features
- ✅ Convert JSON Schema into Pydantic models dynamically
- 🔒 Supports validation for strings, integers, floats, booleans, arrays, and nested objects
- ⚙️ Enforces constraints like `minLength`, `maxLength`, `pattern`, `minimum`, `maximum`, `uniqueItems`, and more
- 📦 Zero config — just pass your schema and get a model
- ✅ Convert JSON Schema into Pydantic models dynamically;
- 🔒 Supports validation for:
- strings
- integers
- floats
- booleans
- arrays
- nested objects
- allOf
- anyOf
- ref
- enum
- const
- ⚙️ Enforces constraints like `minLength`, `maxLength`, `pattern`, `minimum`, `maximum`, `uniqueItems`, and more;
- 📦 Zero config — just pass your schema and get a model.
---
@@ -42,7 +56,8 @@ pip install jambo
## 🚀 Usage
```python
from jambo.schema_converter import SchemaConverter
from jambo import SchemaConverter
schema = {
"title": "Person",
@@ -64,9 +79,14 @@ print(obj)
## ✅ Example Validations
The following are some examples of how to use Jambo to create Pydantic models with various JSON Schema features. For more information, please refer to the [documentation](https://jambo.readthedocs.io/).
### Strings with constraints
```python
from jambo import SchemaConverter
schema = {
"title": "EmailExample",
"type": "object",
@@ -89,6 +109,9 @@ print(obj)
### Integers with bounds
```python
from jambo import SchemaConverter
schema = {
"title": "AgeExample",
"type": "object",
@@ -106,6 +129,9 @@ print(obj)
### Nested Objects
```python
from jambo import SchemaConverter
schema = {
"title": "NestedObjectExample",
"type": "object",
@@ -127,6 +153,41 @@ obj = Model(address={"street": "Main St", "city": "Gotham"})
print(obj)
```
### References
```python
from jambo import SchemaConverter
schema = {
"title": "person",
"$ref": "#/$defs/person",
"$defs": {
"person": {
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
"emergency_contact": {
"$ref": "#/$defs/person",
},
},
}
},
}
model = SchemaConverter.build(schema)
obj = model(
name="John",
age=30,
emergency_contact=model(
name="Jane",
age=28,
),
)
```
---
## 🧪 Running Tests
@@ -167,9 +228,6 @@ poe create-hooks
## 📌 Roadmap / TODO
- [ ] Support for `enum` and `const`
- [ ] Support for `anyOf`, `allOf`, `oneOf`
- [ ] Schema ref (`$ref`) resolution
- [ ] Better error reporting for unsupported schema types
---

docs/Makefile (new file)

@@ -0,0 +1,29 @@
# Minimal makefile for Sphinx documentation
#
# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS ?=
SPHINXBUILD ?= sphinx-build
SPHINXAPIDOC ?= sphinx-apidoc
SOURCEDIR = source
BUILDDIR = build
SCANEDDIR = ../jambo
# Put it first so that "make" without argument is like "make help".
help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
.PHONY: help Makefile
rescan:
$(SPHINXAPIDOC) -f -o $(SOURCEDIR) $(SCANEDDIR) $(EXCLUDEDIR)
clean:
rm -rf $(BUILDDIR)/*
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

docs/make.bat (new file)

@@ -0,0 +1,35 @@
@ECHO OFF
pushd %~dp0
REM Command file for Sphinx documentation
if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=source
set BUILDDIR=build
%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
echo.
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
echo.installed, then set the SPHINXBUILD environment variable to point
echo.to the full path of the 'sphinx-build' executable. Alternatively you
echo.may add the Sphinx directory to PATH.
echo.
echo.If you don't have Sphinx installed, grab it from
echo.https://www.sphinx-doc.org/
exit /b 1
)
if "%1" == "" goto help
%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
goto end
:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
:end
popd

docs/source/conf.py (new file)

@@ -0,0 +1,37 @@
# Configuration file for the Sphinx documentation builder.
#
# For the full list of built-in configuration values, see the documentation:
# https://www.sphinx-doc.org/en/master/usage/configuration.html
# -- Project information -----------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#project-information
project = "jambo"
copyright = "2025, Vitor Hideyoshi"
author = "Vitor Hideyoshi"
# -- General configuration ---------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration
extensions = [
"sphinx.ext.todo",
"sphinx.ext.viewcode",
"sphinx.ext.autodoc",
"sphinx.ext.napoleon",
]
templates_path = ["_templates"]
exclude_patterns = []
# -- Options for HTML output -------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#options-for-html-output
html_theme = "sphinx_rtd_theme"
html_static_path = ["_static"]
# -- Options for autodoc -----------------------------------------------------
add_module_names = False
python_use_unqualified_type_names = True

docs/source/index.rst (new file)

@@ -0,0 +1,33 @@
.. jambo documentation master file, created by
sphinx-quickstart on Thu Jun 19 22:20:35 2025.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Jambo - JSON Schema to Pydantic Converter
=========================================
This is the documentation for Jambo, a tool that converts JSON Schema definitions into Pydantic models.
Welcome to Jambo's documentation!
Jambo is a Python package that automatically converts JSON Schema definitions into Pydantic models. It's designed to streamline schema validation and enforce type safety using Pydantic's powerful validation features.
Created to simplify the process of dynamically generating Pydantic models for AI frameworks like LangChain, CrewAI, and others.
Installation
------------------
You can install Jambo using pip:
.. code-block:: bash
pip install jambo
.. toctree::
:maxdepth: 2
:caption: Contents:
usage
modules


@@ -0,0 +1,85 @@
jambo.parser package
====================
Submodules
----------
jambo.parser.allof\_type\_parser module
---------------------------------------
.. automodule:: jambo.parser.allof_type_parser
:members:
:show-inheritance:
:undoc-members:
jambo.parser.anyof\_type\_parser module
---------------------------------------
.. automodule:: jambo.parser.anyof_type_parser
:members:
:show-inheritance:
:undoc-members:
jambo.parser.array\_type\_parser module
---------------------------------------
.. automodule:: jambo.parser.array_type_parser
:members:
:show-inheritance:
:undoc-members:
jambo.parser.boolean\_type\_parser module
-----------------------------------------
.. automodule:: jambo.parser.boolean_type_parser
:members:
:show-inheritance:
:undoc-members:
jambo.parser.float\_type\_parser module
---------------------------------------
.. automodule:: jambo.parser.float_type_parser
:members:
:show-inheritance:
:undoc-members:
jambo.parser.int\_type\_parser module
-------------------------------------
.. automodule:: jambo.parser.int_type_parser
:members:
:show-inheritance:
:undoc-members:
jambo.parser.object\_type\_parser module
----------------------------------------
.. automodule:: jambo.parser.object_type_parser
:members:
:show-inheritance:
:undoc-members:
jambo.parser.ref\_type\_parser module
-------------------------------------
.. automodule:: jambo.parser.ref_type_parser
:members:
:show-inheritance:
:undoc-members:
jambo.parser.string\_type\_parser module
----------------------------------------
.. automodule:: jambo.parser.string_type_parser
:members:
:show-inheritance:
:undoc-members:
Module contents
---------------
.. automodule:: jambo.parser
:members:
:show-inheritance:
:undoc-members:

docs/source/jambo.rst (new file)

@@ -0,0 +1,30 @@
jambo package
=============
Subpackages
-----------
.. toctree::
:maxdepth: 4
jambo.parser
jambo.types
Submodules
----------
jambo.schema\_converter module
------------------------------
.. automodule:: jambo.schema_converter
:members:
:show-inheritance:
:undoc-members:
Module contents
---------------
.. automodule:: jambo
:members:
:show-inheritance:
:undoc-members:


@@ -0,0 +1,29 @@
jambo.types package
===================
Submodules
----------
jambo.types.json\_schema\_type module
-------------------------------------
.. automodule:: jambo.types.json_schema_type
:members:
:show-inheritance:
:undoc-members:
jambo.types.type\_parser\_options module
----------------------------------------
.. automodule:: jambo.types.type_parser_options
:members:
:show-inheritance:
:undoc-members:
Module contents
---------------
.. automodule:: jambo.types
:members:
:show-inheritance:
:undoc-members:

docs/source/modules.rst (new file)

@@ -0,0 +1,7 @@
jambo
=====
.. toctree::
:maxdepth: 4
jambo


@@ -0,0 +1,39 @@
AllOf Type
=================
The AllOf type is used to combine multiple schemas into a single schema. It allows you to specify that an object must conform to all of the specified schemas.
Examples
-----------------
.. code-block:: python
from jambo import SchemaConverter
schema = {
"title": "Person",
"description": "A person",
"type": "object",
"properties": {
"name": {
"allOf": [
{"type": "string", "maxLength": 11},
{"type": "string", "maxLength": 4},
{"type": "string", "minLength": 1},
{"type": "string", "minLength": 2},
]
},
},
}
Model = SchemaConverter.build(schema)
obj = Model(name="Jo")
print(obj)  # Output: Person(name='Jo')
try:
obj = Model(name="") # This will raise a validation error
except ValueError as e:
print("Validation fails as expected:", e) # Output: Validation fails as expected: 1 validation error for Person


@@ -0,0 +1,41 @@
AnyOf Type
=================
The AnyOf type is used to specify that an object can conform to any one of the specified schemas. It allows for flexibility in the structure of the data, as it can match multiple possible schemas.
Examples
-----------------
.. code-block:: python
from jambo import SchemaConverter
schema = {
"title": "Person",
"description": "A person",
"type": "object",
"properties": {
"id": {
"anyOf": [
{"type": "integer"},
{"type": "string"},
]
},
},
}
Model = SchemaConverter.build(schema)
obj1 = Model(id="1")
print(obj1) # Output: Person(id='1')
obj2 = Model(id=1)
print(obj2) # Output: Person(id=1)
try:
obj3 = Model(id=1.1)  # This will raise a validation error
except ValueError as e:
print("Validation fails as expected:", e) # Output: Validation fails as expected: 1 validation error for Person


@@ -0,0 +1,86 @@
Array Type
=================
The Array type has the following required property:
- items: Schema for the items in the array, which can be a type or a schema object.
And the additional supported properties:
- maxItems: Maximum number of items in the array.
- minItems: Minimum number of items in the array.
- uniqueItems: If true, all items in the array must be unique.
And the additional generic properties:
- default: Default value for the array.
- description: Description of the array field.
Examples
-----------------
1. Basic Array with maxItems and minItems:
.. code-block:: python
from jambo import SchemaConverter
schema = {
"title": "ArrayExample",
"type": "object",
"properties": {
"tags": {
"type": "array",
"items": {"type": "string"},
"minItems": 1,
"maxItems": 5,
},
},
"required": ["tags"],
}
Model = SchemaConverter.build(schema)
obj = Model(tags=["python", "jambo", "pydantic"])
print(obj) # Output: ArrayExample(tags=['python', 'jambo', 'pydantic'])
try:
obj = Model(tags=[]) # This will raise a validation error
except ValueError as e:
print("Validation fails as expected:", e) # Output: Validation fails as expected: 1 validation error for ArrayExample
2. Array with uniqueItems:
.. code-block:: python
from jambo import SchemaConverter
schema = {
"title": "UniqueArrayExample",
"type": "object",
"properties": {
"unique_tags": {
"type": "array",
"items": {"type": "string"},
"uniqueItems": True,
},
},
"required": ["unique_tags"],
}
Model = SchemaConverter.build(schema)
obj = Model(unique_tags=["python", "jambo", "pydantic"])
print(obj) # Output: UniqueArrayExample(unique_tags={'python', 'jambo', 'pydantic'})
try:
obj = Model(unique_tags=["python", "jambo", "python"]) # This will raise a validation error
except ValueError as e:
print("Validation fails as expected:", e) # Output: Validation fails as expected: 1 validation error for UniqueArrayExample


@@ -0,0 +1,34 @@
Bool Types
=================
The Bool type has no specific properties; it only supports the generic properties:
- default: Default value for the boolean.
- description: Description of the boolean field.
Examples
-----------------
.. code-block:: python
from jambo import SchemaConverter
schema = {
"title": "BoolExample",
"type": "object",
"properties": {
"is_active": {
"type": "boolean",
},
},
"required": ["is_active"],
}
Model = SchemaConverter.build(schema)
obj = Model(is_active=True)
print(obj) # Output: BoolExample(is_active=True)


@@ -0,0 +1,40 @@
Const Type
=================
The const type is a special data type that allows a variable to be a single, fixed value.
It does not share the properties of the other types; it supports the following properties:
- const: The fixed value that the variable must always hold.
- description: Description of the const field.
Examples
-----------------
.. code-block:: python
from jambo import SchemaConverter
schema = {
"title": "Country",
"type": "object",
"properties": {
"name": {
"const": "United States of America",
}
},
"required": ["name"],
}
Model = SchemaConverter.build(schema)
obj = Model()
print(obj.name)  # Output: United States of America
try:
    obj.name = "Canada"  # Assigning a different value also fails
except ValueError as e:
    print("Validation fails as expected:", e)
try:
    Model(name="Canada")  # This will raise a validation error
except ValueError as e:
    print("Validation fails as expected:", e)
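Conceptually, a `const` field behaves like a field constrained to a single literal value. The sketch below illustrates that validation idea using only the standard library; the helper `validate_const` is hypothetical and not part of Jambo.

```python
from typing import Literal, get_args

# Hypothetical stand-in for the constraint a const field enforces
NameConst = Literal["United States of America"]

def validate_const(value: str) -> str:
    # Accept only the single allowed literal value
    if value not in get_args(NameConst):
        raise ValueError(f"{value!r} does not match the const value")
    return value

print(validate_const("United States of America"))
```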


@@ -0,0 +1,37 @@
Enum Type
==================
An enum type is a special data type that restricts a variable to a set of predefined constants: the variable can only take one out of a small set of possible values.
It has no specific properties, but it supports the generic properties:
- default: Default value for the enum.
- description: Description of the enum field.
Examples
-----------------
.. code-block:: python
from jambo import SchemaConverter
schema = {
"title": "EnumExample",
"type": "object",
"properties": {
"status": {
"type": "string",
"enum": ["active", "inactive", "pending"],
"description": "The status of the object.",
"default": "active",
},
},
"required": ["status"],
}
Model = SchemaConverter.build(schema)
obj = Model(status="active")
print(obj) # Output: EnumExample(status=status.ACTIVE)
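The output above hints at what the converter generates: the `enum` keyword maps naturally onto a Python `Enum`. A standard-library sketch of the equivalent membership check (the class name `Status` is illustrative, not Jambo's generated name):

```python
from enum import Enum

# Illustrative equivalent of the generated enum for the "status" property
class Status(str, Enum):
    ACTIVE = "active"
    INACTIVE = "inactive"
    PENDING = "pending"

print(Status("active"))  # a valid value resolves to an enum member
try:
    Status("archived")   # not in the enum, raises ValueError
except ValueError as e:
    print("Validation fails as expected:", e)
```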


@@ -0,0 +1,118 @@
Numeric Types
=================
The Numeric Types (integer, number) have the following supported properties:
- minimum: Minimum value for the number.
- maximum: Maximum value for the number.
- exclusiveMinimum: If true, the value must be greater than the minimum.
- exclusiveMaximum: If true, the value must be less than the maximum.
- multipleOf: The value must be a multiple of this number.
And the additional generic properties:
- default: Default value for the number.
- description: Description of the number field.
Examples
-----------------
1. Basic Integer with minimum and maximum:
.. code-block:: python
from jambo import SchemaConverter
schema = {
"title": "IntegerExample",
"type": "object",
"properties": {
"age": {
"type": "integer",
"minimum": 0,
"maximum": 120,
},
},
"required": ["age"],
}
Model = SchemaConverter.build(schema)
obj = Model(age=30)
print(obj) # Output: IntegerExample(age=30)
try:
obj = Model(age=-5) # This will raise a validation error
except ValueError as e:
print("Validation fails as expected:", e) # Output: Validation fails as expected: 1 validation error for IntegerExample
2. Number with exclusiveMinimum and exclusiveMaximum:
.. code-block:: python
from jambo import SchemaConverter
schema = {
"title": "NumberExample",
"type": "object",
"properties": {
"price": {
"type": "number",
"exclusiveMinimum": 0,
"exclusiveMaximum": 1000,
},
},
"required": ["price"],
}
Model = SchemaConverter.build(schema)
obj = Model(price=1)
print(obj) # Output: NumberExample(price=1)
try:
obj = Model(price=0) # This will raise a validation error
except ValueError as e:
print("Validation fails as expected:", e) # Output: Validation fails as expected: 1 validation error for NumberExample
obj = Model(price=999)
print(obj) # Output: NumberExample(price=999)
try:
obj = Model(price=1000) # This will raise a validation error
except ValueError as e:
print("Validation fails as expected:", e) # Output: Validation fails as expected: 1 validation error for NumberExample
3. Number with multipleOf:
.. code-block:: python
from jambo import SchemaConverter
schema = {
"title": "MultipleOfExample",
"type": "object",
"properties": {
"quantity": {
"type": "number",
"multipleOf": 0.5,
},
},
"required": ["quantity"],
}
Model = SchemaConverter.build(schema)
obj = Model(quantity=2.5)
print(obj) # Output: MultipleOfExample(quantity=2.5)
try:
obj = Model(quantity=2.3) # This will raise a validation error
except ValueError as e:
print("Validation fails as expected:", e) # Output: Validation fails as expected: 1 validation error for MultipleOfExample


@@ -0,0 +1,46 @@
Object Type
=================
The Object type has no specific properties; it only supports the generic properties:
- default: Default value for the object.
- description: Description of the object field.
Examples
-----------------
.. code-block:: python
from jambo import SchemaConverter
schema = {
"title": "Person",
"type": "object",
"properties": {
"address": {
"type": "object",
"properties": {
"street": {"type": "string"},
"city": {"type": "string"},
},
"default": {
"street": "Unknown Street",
"city": "Unknown City",
},
},
},
"description": "A person object containing an address.",
"required": ["address"],
}
Person = SchemaConverter.build(schema)
obj = Person.model_validate({ "address": {"street": "123 Main St", "city": "Springfield"} })
print(obj) # Output: Person(address=Address(street='123 Main St', city='Springfield'))
obj_default = Person() # Uses default values
print(obj_default) # Output: Person(address=Address(street='Unknown Street', city='Unknown City'))


@@ -0,0 +1,85 @@
Reference Type
===================
The Reference type allows you to reference another schema by its `$ref` property. This is useful for reusing schemas across your application.
The Reference type has no specific properties; it only supports the generic properties:
- default: Default value for the reference.
- description: Description of the reference field.
Examples
-----------------
1. Reference to the Root schema:
.. code-block:: python
from jambo import SchemaConverter
schema = {
"title": "Person",
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
"emergency_contact": {
"$ref": "#"
}
},
"required": ["name"],
}
Model = SchemaConverter.build(schema)
obj = Model(name="Alice", age=30, emergency_contact=Model(name="Bob", age=25))
print(obj) # Output: Person(name='Alice', age=30, emergency_contact=Person(name='Bob', age=25))
2. Reference to a Def Schema:
.. code-block:: python
from jambo import SchemaConverter
schema = {
"title": "Person",
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
"address": {
"$ref": "#/$defs/Address"
}
},
"required": ["name"],
"$defs": {
"Address": {
"type": "object",
"properties": {
"street": {"type": "string"},
"city": {"type": "string"},
},
"required": ["street", "city"],
}
},
}
Model = SchemaConverter.build(schema)
obj = Model(name="Alice", age=30, address={"street": "123 Main St", "city": "Springfield"})
print(obj) # Output: Person(name='Alice', age=30, address=Address(street='123 Main St', city='Springfield'))
.. note::
At the moment, Jambo doesn't expose the class definition :py:class:`Address` defined inside the `$defs` property,
but you can inspect the generated fields through the `Model.model_fields` attribute,
which maps each field name to its field definition.
.. warning::
The JSON Schema Reference specification allows for URI references,
but Jambo currently only supports root references (using the `#` symbol)
and def references (using the `$defs` property).

docs/source/usage.rst Normal file

@@ -0,0 +1,49 @@
Using Jambo
===================
Jambo is designed to be easy to use; it doesn't require any complex setup or configuration.
Below is an example of how to use Jambo to convert a JSON Schema into a Pydantic model.
.. code-block:: python
from jambo import SchemaConverter
schema = {
"title": "Person",
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
},
"required": ["name"],
}
Person = SchemaConverter.build(schema)
obj = Person(name="Alice", age=30)
print(obj)
# Output: Person(name='Alice', age=30)
The :py:meth:`SchemaConverter.build <jambo.SchemaConverter.build>` static method takes a JSON Schema dictionary and returns a Pydantic model class. You can then instantiate this class with the required fields, and it will automatically validate the data according to the schema.
If a description is present in the schema, it is also attached to the Pydantic model via the `description` field. This is useful for AI frameworks such as LangChain and CrewAI, which use the description to pass context to LLMs.
For more complex schemas and types, see our documentation on:
.. toctree::
:maxdepth: 2
:caption: Contents:
usage.string
usage.numeric
usage.bool
usage.array
usage.object
usage.reference
usage.allof
usage.anyof
usage.enum
usage.const


@@ -0,0 +1,107 @@
String Type
=================
The String type has the following supported properties:
- maxLength: Maximum length of the string.
- minLength: Minimum length of the string.
- pattern: Regular expression pattern that the string must match.
- format: A string format that can be used to validate the string (e.g., "email", "uri").
And the additional generic properties:
- default: Default value for the string.
- description: Description of the string field.
Examples
-----------------
1. Basic String with maxLength and minLength:
.. code-block:: python
from jambo import SchemaConverter
schema = {
"title": "StringExample",
"type": "object",
"properties": {
"attr1": {
"type": "string",
"minLength": 5,
"maxLength": 50,
},
},
"required": ["attr1"],
}
Model = SchemaConverter.build(schema)
obj = Model(attr1="this_is_a_valid_string")
print(obj)
# Output: StringExample(attr1='this_is_a_valid_string')
2. String with pattern and format:
Pattern example:
.. code-block:: python
from jambo import SchemaConverter
schema = {
"title": "StringExample",
"type": "object",
"properties": {
"email": {
"type": "string",
"pattern": r"^[a-zA-Z0-9_.+-]+@[a-zA-Z0-9-]+\.[a-zA-Z0-9-.]+$",
},
},
"required": ["email"],
}
Model = SchemaConverter.build(schema)
obj = Model(email="test@email.com")
print(obj)
# Output: StringExample(email='test@email.com')
try:
Model(email="invalid-email")
except ValueError as e:
print("Validation Failed as Expected") # Output: Validation Failed as Expected
Format example:
.. code-block:: python
from jambo import SchemaConverter
schema = {
"title": "StringExample",
"type": "object",
"properties": {
"email": {
"type": "string",
"format": "email",
},
},
"required": ["email"],
}
Model = SchemaConverter.build(schema)
obj = Model(email="test@email.com")
print(obj)
# Output: StringExample(email='test@email.com')
try:
Model(email="invalid-email")
except ValueError as e:
print("Validation Failed as Expected") # Output: Validation Failed as Expected


@@ -0,0 +1,6 @@
from .schema_converter import SchemaConverter
__all__ = [
"SchemaConverter" # Exports the schema converter class for external use
]


@@ -1,10 +1,28 @@
from ._type_parser import GenericTypeParser
from .allof_type_parser import AllOfTypeParser
from .anyof_type_parser import AnyOfTypeParser
from .array_type_parser import ArrayTypeParser
from .boolean_type_parser import BooleanTypeParser
from .const_type_parser import ConstTypeParser
from .enum_type_parser import EnumTypeParser
from .float_type_parser import FloatTypeParser
from .int_type_parser import IntTypeParser
from .object_type_parser import ObjectTypeParser
from .ref_type_parser import RefTypeParser
from .string_type_parser import StringTypeParser
__all__ = [
"GenericTypeParser",
"EnumTypeParser",
"ConstTypeParser",
"AllOfTypeParser",
"AnyOfTypeParser",
"ArrayTypeParser",
"BooleanTypeParser",
"FloatTypeParser",
"IntTypeParser",
"ObjectTypeParser",
"StringTypeParser",
"RefTypeParser",
]


@@ -1,31 +1,126 @@
from jambo.types.type_parser_options import TypeParserOptions

from pydantic import Field, TypeAdapter
from typing_extensions import Annotated, Any, Generic, Self, TypeVar, Unpack

from abc import ABC, abstractmethod


T = TypeVar("T")


class GenericTypeParser(ABC, Generic[T]):
    json_schema_type: str = None

    type_mappings: dict[str, str] = {}

    default_mappings = {
        "default": "default",
        "description": "description",
    }

    @abstractmethod
    def from_properties_impl(
        self, name: str, properties: dict[str, Any], **kwargs: Unpack[TypeParserOptions]
    ) -> tuple[T, dict]:
        """
        Abstract method to convert properties to a type and its fields properties.

        :param name: The name of the type.
        :param properties: The properties of the type.
        :param kwargs: Additional options for type parsing.
        :return: A tuple containing the type and its properties.
        """

    def from_properties(
        self, name: str, properties: dict[str, Any], **kwargs: Unpack[TypeParserOptions]
    ) -> tuple[T, dict]:
        """
        Converts properties to a type and its fields properties.

        :param name: The name of the type.
        :param properties: The properties of the type.
        :param kwargs: Additional options for type parsing.
        :return: A tuple containing the type and its properties.
        """
        parsed_type, parsed_properties = self.from_properties_impl(
            name, properties, **kwargs
        )

        if not self._validate_default(parsed_type, parsed_properties):
            raise ValueError(
                f"Default value {properties.get('default')} is not valid for type {parsed_type.__name__}"
            )

        return parsed_type, parsed_properties

    @classmethod
    def type_from_properties(
        cls, name: str, properties: dict[str, Any], **kwargs: Unpack[TypeParserOptions]
    ) -> tuple[type, dict]:
        """
        Factory method to fetch the appropriate type parser based on properties
        and generate the equivalent type and fields.

        :param name: The name of the type to be created.
        :param properties: The properties that define the type.
        :param kwargs: Additional options for type parsing.
        :return: A tuple containing the type and its properties.
        """
        parser = cls._get_impl(properties)
        return parser().from_properties(name=name, properties=properties, **kwargs)

    @classmethod
    def _get_impl(cls, properties: dict[str, Any]) -> type[Self]:
        for subcls in cls.__subclasses__():
            schema_type, schema_value = subcls._get_schema_type()

            if schema_type not in properties:
                continue

            if schema_value is None or schema_value == properties[schema_type]:
                return subcls

        raise ValueError("Unknown type")

    @classmethod
    def _get_schema_type(cls) -> tuple[str, str | None]:
        if cls.json_schema_type is None:
            raise RuntimeError(
                f"TypeParser: json_schema_type not defined for subclass {cls.__name__}"
            )

        schema_definition = cls.json_schema_type.split(":")
        if len(schema_definition) == 1:
            return schema_definition[0], None
        return schema_definition[0], schema_definition[1]

    def mappings_properties_builder(
        self, properties, **kwargs: Unpack[TypeParserOptions]
    ) -> dict[str, Any]:
        if not kwargs.get("required", False):
            properties["default"] = properties.get("default", None)

        mappings = self.default_mappings | self.type_mappings
        return {
            mappings[key]: value for key, value in properties.items() if key in mappings
        }

    @staticmethod
    def _validate_default(field_type: type, field_prop: dict) -> bool:
        value = field_prop.get("default")
        if value is None and field_prop.get("default_factory") is not None:
            value = field_prop["default_factory"]()

        if value is None:
            return True

        try:
            field = Annotated[field_type, Field(**field_prop)]
            TypeAdapter(field).validate_python(value)
        except Exception:
            return False

        return True


@@ -0,0 +1,98 @@
from jambo.parser._type_parser import GenericTypeParser
from jambo.types.type_parser_options import TypeParserOptions
from typing_extensions import Any, Unpack
class AllOfTypeParser(GenericTypeParser):
mapped_type = any
json_schema_type = "allOf"
def from_properties_impl(
self, name, properties, **kwargs: Unpack[TypeParserOptions]
):
sub_properties = properties.get("allOf", [])
root_type = properties.get("type")
if root_type is not None:
for sub_property in sub_properties:
sub_property["type"] = root_type
parser = self._get_type_parser(sub_properties)
combined_properties = self._rebuild_properties_from_subproperties(
sub_properties
)
return parser().from_properties_impl(name, combined_properties, **kwargs)
@staticmethod
def _get_type_parser(
sub_properties: list[dict[str, Any]],
) -> type[GenericTypeParser]:
if not sub_properties:
raise ValueError("Invalid JSON Schema: 'allOf' is empty.")
parsers = set(
GenericTypeParser._get_impl(sub_property) for sub_property in sub_properties
)
if len(parsers) != 1:
raise ValueError("Invalid JSON Schema: allOf types do not match.")
return parsers.pop()
@staticmethod
def _rebuild_properties_from_subproperties(
sub_properties: list[dict[str, Any]],
) -> dict[str, Any]:
properties = {}
for subProperty in sub_properties:
for name, prop in subProperty.items():
if name not in properties:
properties[name] = prop
else:
# Merge properties if they exist in both sub-properties
properties[name] = AllOfTypeParser._validate_prop(
name, properties[name], prop
)
return properties
@staticmethod
def _validate_prop(prop_name, old_value, new_value):
if prop_name == "description":
return f"{old_value} | {new_value}"
if prop_name == "default":
if old_value != new_value:
raise ValueError(
f"Invalid JSON Schema: conflicting defaults for '{prop_name}'"
)
return old_value
if prop_name == "required":
return old_value + new_value
if prop_name in ("maxLength", "maximum", "exclusiveMaximum"):
return old_value if old_value > new_value else new_value
if prop_name in ("minLength", "minimum", "exclusiveMinimum"):
return old_value if old_value < new_value else new_value
if prop_name == "properties":
for key, value in new_value.items():
if key not in old_value:
old_value[key] = value
continue
for sub_key, sub_value in value.items():
if sub_key not in old_value[key]:
old_value[key][sub_key] = sub_value
else:
# Merge properties if they exist in both sub-properties
old_value[key][sub_key] = AllOfTypeParser._validate_prop(
sub_key, old_value[key][sub_key], sub_value
)
# Handle other properties by just returning the first value
return old_value


@@ -0,0 +1,41 @@
from jambo.parser._type_parser import GenericTypeParser
from jambo.types.type_parser_options import TypeParserOptions
from pydantic import Field
from typing_extensions import Annotated, Union, Unpack
class AnyOfTypeParser(GenericTypeParser):
mapped_type = Union
json_schema_type = "anyOf"
def from_properties_impl(
self, name, properties, **kwargs: Unpack[TypeParserOptions]
):
if "anyOf" not in properties:
raise ValueError(f"Invalid JSON Schema: {properties}")
if not isinstance(properties["anyOf"], list):
raise ValueError(f"Invalid JSON Schema: {properties['anyOf']}")
mapped_properties = self.mappings_properties_builder(properties, **kwargs)
sub_properties = properties["anyOf"]
sub_types = [
GenericTypeParser.type_from_properties(name, subProperty, **kwargs)
for subProperty in sub_properties
]
if not kwargs.get("required", False):
mapped_properties["default"] = mapped_properties.get("default")
# By defining the type as Union of Annotated type we can use the Field validator
# to enforce the constraints of each union type when needed.
# We use Annotated to attach the Field validators to the type.
field_types = [
Annotated[t, Field(**v)] if v is not None else t for t, v in sub_types
]
return Union[(*field_types,)], mapped_properties
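The trick described in the comment above, a `Union` of `Annotated` members so each branch keeps its own `Field` constraints, can be exercised directly with a `TypeAdapter`; a minimal sketch assuming Pydantic v2:

```python
from typing import Annotated, Union

from pydantic import Field, TypeAdapter

# Each union branch carries its own constraints via Annotated + Field.
PositiveIntOrLongStr = Union[
    Annotated[int, Field(gt=0)],
    Annotated[str, Field(min_length=3)],
]

adapter = TypeAdapter(PositiveIntOrLongStr)
print(adapter.validate_python(42))     # passes the int branch
print(adapter.validate_python("abc"))  # passes the str branch
try:
    adapter.validate_python(-1)        # fails both branches
except ValueError:
    print("rejected")
```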


@@ -1,12 +1,10 @@
from jambo.parser._type_parser import GenericTypeParser
from jambo.types.type_parser_options import TypeParserOptions

from typing_extensions import Iterable, TypeVar, Unpack

import copy


V = TypeVar("V")
@@ -14,54 +12,43 @@ V = TypeVar("V")
class ArrayTypeParser(GenericTypeParser):
    mapped_type = list

    json_schema_type = "type:array"

    type_mappings = {
        "maxItems": "max_length",
        "minItems": "min_length",
    }

    def from_properties_impl(
        self, name, properties, **kwargs: Unpack[TypeParserOptions]
    ):
        item_properties = kwargs.copy()
        item_properties["required"] = True
        _item_type, _item_args = GenericTypeParser.type_from_properties(
            name, properties["items"], **item_properties
        )

        wrapper_type = set if properties.get("uniqueItems", False) else list
        field_type = wrapper_type[_item_type]

        mapped_properties = self.mappings_properties_builder(properties, **kwargs)

        if "default" not in mapped_properties:
            mapped_properties["default_factory"] = self._build_default_factory(
                properties.get("default"), wrapper_type
            )

        return field_type, mapped_properties

    def _build_default_factory(self, default_list, wrapper_type):
        if default_list is None:
            return lambda: None

        if not isinstance(default_list, Iterable):
            raise ValueError(
                f"Default value for array must be an iterable, got {type(default_list)}"
            )

        return lambda: copy.deepcopy(wrapper_type(default_list))


@@ -1,17 +1,25 @@
from jambo.parser._type_parser import GenericTypeParser
from jambo.types.type_parser_options import TypeParserOptions

from typing_extensions import Unpack


class BooleanTypeParser(GenericTypeParser):
    mapped_type = bool

    json_schema_type = "type:boolean"

    type_mappings = {
        "default": "default",
    }

    def from_properties_impl(
        self, name, properties, **kwargs: Unpack[TypeParserOptions]
    ):
        mapped_properties = self.mappings_properties_builder(properties, **kwargs)

        default_value = properties.get("default")
        if default_value is not None and not isinstance(default_value, bool):
            raise ValueError(f"Default value for {name} must be a boolean.")

        return bool, mapped_properties


@@ -0,0 +1,43 @@
from jambo.parser._type_parser import GenericTypeParser
from jambo.types.json_schema_type import JSONSchemaNativeTypes
from jambo.types.type_parser_options import TypeParserOptions
from pydantic import AfterValidator
from typing_extensions import Annotated, Any, Unpack
class ConstTypeParser(GenericTypeParser):
json_schema_type = "const"
default_mappings = {
"const": "default",
"description": "description",
}
def from_properties_impl(
self, name, properties, **kwargs: Unpack[TypeParserOptions]
):
if "const" not in properties:
raise ValueError(f"Const type {name} must have 'const' property defined.")
const_value = properties["const"]
if not isinstance(const_value, JSONSchemaNativeTypes):
raise ValueError(
f"Const type {name} must have 'const' value of allowed types: {JSONSchemaNativeTypes}."
)
const_type = self._build_const_type(const_value)
parsed_properties = self.mappings_properties_builder(properties, **kwargs)
return const_type, parsed_properties
def _build_const_type(self, const_value):
def _validate_const_value(value: Any) -> Any:
if value != const_value:
raise ValueError(
f"Value must be equal to the constant value: {const_value}"
)
return value
return Annotated[type(const_value), AfterValidator(_validate_const_value)]
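The pattern used by `_build_const_type` above, an `Annotated` type carrying an `AfterValidator` that rejects everything but the constant, can be reproduced standalone; a minimal sketch assuming Pydantic v2, with `42` as an illustrative constant:

```python
from typing import Annotated, Any

from pydantic import AfterValidator, TypeAdapter


def _must_be_answer(value: Any) -> Any:
    # Mirrors the closure built by _build_const_type for const_value = 42.
    if value != 42:
        raise ValueError("Value must be equal to the constant value: 42")
    return value


ConstAnswer = Annotated[int, AfterValidator(_must_be_answer)]

adapter = TypeAdapter(ConstAnswer)
print(adapter.validate_python(42))  # 42
try:
    adapter.validate_python(7)      # rejected by the validator
except ValueError:
    print("rejected")
```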


@@ -0,0 +1,40 @@
from jambo.parser._type_parser import GenericTypeParser
from jambo.types.json_schema_type import JSONSchemaNativeTypes
from jambo.types.type_parser_options import TypeParserOptions
from typing_extensions import Unpack
from enum import Enum
class EnumTypeParser(GenericTypeParser):
json_schema_type = "enum"
def from_properties_impl(
self, name, properties, **kwargs: Unpack[TypeParserOptions]
):
if "enum" not in properties:
raise ValueError(f"Enum type {name} must have 'enum' property defined.")
enum_values = properties["enum"]
if not isinstance(enum_values, list):
raise ValueError(f"Enum type {name} must have 'enum' as a list of values.")
if any(
not isinstance(value, JSONSchemaNativeTypes) for value in enum_values
):
raise ValueError(
f"Enum type {name} must have 'enum' values of allowed types: {JSONSchemaNativeTypes}."
)
# Create a new Enum type dynamically
enum_type = Enum(name, {str(value).upper(): value for value in enum_values})
parsed_properties = self.mappings_properties_builder(properties, **kwargs)
if (
"default" in parsed_properties and parsed_properties["default"] is not None
):
parsed_properties["default"] = enum_type(parsed_properties["default"])
return enum_type, parsed_properties
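The dynamic `Enum(...)` call above uses the standard library's functional API; a minimal standalone sketch with illustrative values:

```python
from enum import Enum

# The functional API builds an Enum class at runtime, exactly as
# EnumTypeParser does with the schema's "enum" list.
Color = Enum("Color", {"RED": "red", "GREEN": "green"})

print(Color("red"))     # Color.RED  (lookup by value)
print(Color.RED.value)  # red
```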


@@ -1,12 +1,24 @@
from jambo.parser._type_parser import GenericTypeParser
from jambo.types.type_parser_options import TypeParserOptions

from typing_extensions import Unpack


class FloatTypeParser(GenericTypeParser):
    mapped_type = float

    json_schema_type = "type:number"

    type_mappings = {
        "minimum": "ge",
        "exclusiveMinimum": "gt",
        "maximum": "le",
        "exclusiveMaximum": "lt",
        "multipleOf": "multiple_of",
        "default": "default",
    }

    def from_properties_impl(
        self, name, properties, **kwargs: Unpack[TypeParserOptions]
    ):
        return float, self.mappings_properties_builder(properties, **kwargs)


@@ -1,12 +1,24 @@
from jambo.parser._type_parser import GenericTypeParser
from jambo.types.type_parser_options import TypeParserOptions

from typing_extensions import Unpack


class IntTypeParser(GenericTypeParser):
    mapped_type = int

    json_schema_type = "type:integer"

    type_mappings = {
        "minimum": "ge",
        "exclusiveMinimum": "gt",
        "maximum": "le",
        "exclusiveMaximum": "lt",
        "multipleOf": "multiple_of",
        "default": "default",
    }

    def from_properties_impl(
        self, name, properties, **kwargs: Unpack[TypeParserOptions]
    ):
        return int, self.mappings_properties_builder(properties, **kwargs)


@@ -1,19 +1,70 @@
from jambo.parser._type_parser import GenericTypeParser
from jambo.types.type_parser_options import TypeParserOptions

from pydantic import BaseModel, ConfigDict, Field, create_model
from typing_extensions import Any, Unpack


class ObjectTypeParser(GenericTypeParser):
    mapped_type = object

    json_schema_type = "type:object"

    def from_properties_impl(
        self, name: str, properties: dict[str, Any], **kwargs: Unpack[TypeParserOptions]
    ) -> tuple[type[BaseModel], dict]:
        type_parsing = self.to_model(
            name,
            properties.get("properties", {}),
            properties.get("required", []),
            **kwargs,
        )

        type_properties = {}
        if "default" in properties:
            type_properties["default_factory"] = lambda: type_parsing.model_validate(
                properties["default"]
            )

        return type_parsing, type_properties

    @classmethod
    def to_model(
        cls,
        name: str,
        schema: dict[str, Any],
        required_keys: list[str],
        **kwargs: Unpack[TypeParserOptions],
    ) -> type[BaseModel]:
        """
        Converts JSON Schema object properties to a Pydantic model.

        :param name: The name of the model.
        :param schema: The properties of the JSON Schema object.
        :param required_keys: List of required keys in the schema.
        :return: A Pydantic model class.
        """
        model_config = ConfigDict(validate_assignment=True)
        fields = cls._parse_properties(schema, required_keys, **kwargs)
        return create_model(name, __config__=model_config, **fields)

    @classmethod
    def _parse_properties(
        cls,
        properties: dict[str, Any],
        required_keys: list[str],
        **kwargs: Unpack[TypeParserOptions],
    ) -> dict[str, tuple[type, Field]]:
        required_keys = required_keys or []

        fields = {}
        for name, prop in properties.items():
            sub_property = kwargs.copy()
            sub_property["required"] = name in required_keys

            parsed_type, parsed_properties = GenericTypeParser.type_from_properties(
                name, prop, **sub_property
            )
            fields[name] = (parsed_type, Field(**parsed_properties))

        return fields


@@ -0,0 +1,125 @@
from jambo.parser import GenericTypeParser
from jambo.types.type_parser_options import TypeParserOptions
from typing_extensions import Any, ForwardRef, Literal, TypeVar, Union, Unpack
RefType = TypeVar("RefType", bound=Union[type, ForwardRef])
RefStrategy = Literal["forward_ref", "def_ref"]
class RefTypeParser(GenericTypeParser):
json_schema_type = "$ref"
def from_properties_impl(
self, name: str, properties: dict[str, Any], **kwargs: Unpack[TypeParserOptions]
) -> tuple[RefType, dict]:
if "$ref" not in properties:
raise ValueError(f"RefTypeParser: Missing $ref in properties for {name}")
context = kwargs.get("context")
if context is None:
raise RuntimeError(
f"RefTypeParser: Missing `context` in properties for {name}"
)
ref_cache = kwargs.get("ref_cache")
if ref_cache is None:
raise RuntimeError(
f"RefTypeParser: Missing `ref_cache` in properties for {name}"
)
mapped_properties = self.mappings_properties_builder(properties, **kwargs)
ref_strategy, ref_name, ref_property = self._examine_ref_strategy(
name, properties, **kwargs
)
ref_state = self._get_ref_from_cache(ref_name, ref_cache)
if ref_state is not None:
# If the reference is either processing or already cached
return ref_state, mapped_properties
ref_cache[ref_name] = self._parse_from_strategy(
ref_strategy, ref_name, ref_property, **kwargs
)
return ref_cache[ref_name], mapped_properties
def _parse_from_strategy(
self,
ref_strategy: RefStrategy,
ref_name: str,
ref_property: dict[str, Any],
**kwargs: Unpack[TypeParserOptions],
):
match ref_strategy:
case "forward_ref":
mapped_type = ForwardRef(ref_name)
case "def_ref":
mapped_type, _ = GenericTypeParser.type_from_properties(
ref_name, ref_property, **kwargs
)
case _:
raise ValueError(
f"RefTypeParser: Unsupported $ref {ref_property['$ref']}"
)
return mapped_type
def _get_ref_from_cache(
self, ref_name: str, ref_cache: dict[str, type]
) -> RefType | type | None:
try:
ref_state = ref_cache[ref_name]
if ref_state is None:
# If the reference is being processed, we return a ForwardRef
return ForwardRef(ref_name)
# If the reference is already cached, we return it
return ref_state
except KeyError:
# If the reference is not in the cache, we will set it to None
ref_cache[ref_name] = None
def _examine_ref_strategy(
self, name: str, properties: dict[str, Any], **kwargs: Unpack[TypeParserOptions]
) -> tuple[RefStrategy, str, dict] | None:
if properties["$ref"] == "#":
ref_name = kwargs["context"].get("title")
if ref_name is None:
raise ValueError(
"RefTypeParser: Missing title in properties for $ref of Root Reference"
)
return "forward_ref", ref_name, {}
if properties["$ref"].startswith("#/$defs/"):
target_name, target_property = self._extract_target_ref(
name, properties, **kwargs
)
return "def_ref", target_name, target_property
raise ValueError(
"RefTypeParser: Only Root and $defs references are supported at the moment"
)
def _extract_target_ref(
self, name: str, properties: dict[str, Any], **kwargs: Unpack[TypeParserOptions]
) -> tuple[str, dict]:
target_name = None
target_property = kwargs["context"]
for prop_name in properties["$ref"].split("/")[1:]:
if prop_name not in target_property:
raise ValueError(
f"RefTypeParser: Missing {prop_name} in"
f" properties for $ref {properties['$ref']}"
)
target_name = prop_name
target_property = target_property[prop_name]
if target_name is None or target_property is None:
raise ValueError(f"RefTypeParser: Invalid $ref {properties['$ref']}")
return target_name, target_property
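The cache protocol used above (a `None` entry marks a reference as still being built, which turns into a `ForwardRef` to break the recursion) can be sketched in isolation; `resolve` and `build` below are illustrative names, not Jambo's API:

```python
from typing import ForwardRef

ref_cache: dict[str, object] = {}


def resolve(name: str, build) -> object:
    # None in the cache means the reference is in progress,
    # so a ForwardRef is returned to break the cycle.
    if name in ref_cache:
        cached = ref_cache[name]
        return cached if cached is not None else ForwardRef(name)
    ref_cache[name] = None          # mark as in progress
    ref_cache[name] = build(name)   # store the finished type
    return ref_cache[name]
```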


@@ -1,40 +1,55 @@
from jambo.parser._type_parser import GenericTypeParser
from jambo.types.type_parser_options import TypeParserOptions

from pydantic import EmailStr, HttpUrl, IPvAnyAddress
from typing_extensions import Unpack

from datetime import date, datetime, time


class StringTypeParser(GenericTypeParser):
    mapped_type = str

    json_schema_type = "type:string"

    type_mappings = {
        "maxLength": "max_length",
        "minLength": "min_length",
        "pattern": "pattern",
        "format": "format",
    }

    format_type_mapping = {
        "email": EmailStr,
        "uri": HttpUrl,
        "ipv4": IPvAnyAddress,
        "ipv6": IPvAnyAddress,
        "hostname": str,
        "date": date,
        "time": time,
        "date-time": datetime,
    }

    format_pattern_mapping = {
        "hostname": r"^[a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?(\.[a-zA-Z0-9]([a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?)*$",
    }

    def from_properties_impl(
        self, name, properties, **kwargs: Unpack[TypeParserOptions]
    ):
        mapped_properties = self.mappings_properties_builder(properties, **kwargs)

        format_type = properties.get("format")
        if not format_type:
            return str, mapped_properties

        if format_type not in self.format_type_mapping:
            raise ValueError(f"Unsupported string format: {format_type}")

        mapped_type = self.format_type_mapping[format_type]

        if format_type in self.format_pattern_mapping:
            mapped_properties["pattern"] = self.format_pattern_mapping[format_type]

        return mapped_type, mapped_properties


@@ -1,13 +1,9 @@
from jambo.parser import GenericTypeParser
from jambo.parser import ObjectTypeParser, RefTypeParser
from jambo.types.json_schema_type import JSONSchema
from jsonschema.exceptions import SchemaError
from jsonschema.protocols import Validator
from pydantic import create_model
from pydantic.fields import Field
from typing import Type
from jambo.types.json_schema_type import JSONSchema
from jsonschema.validators import validator_for
from pydantic import BaseModel
class SchemaConverter:
@@ -20,80 +16,53 @@ class SchemaConverter:
"""
@staticmethod
def build(schema: JSONSchema) -> Type:
def build(schema: JSONSchema) -> type[BaseModel]:
"""
Converts a JSON Schema to a Pydantic model.
:param schema: The JSON Schema to convert.
:return: A Pydantic model class.
"""
if "title" not in schema:
raise ValueError("JSON Schema must have a title.")
return SchemaConverter.build_object(schema["title"], schema)
@staticmethod
def build_object(
name: str,
schema: JSONSchema,
) -> Type:
"""
Converts a JSON Schema object to a Pydantic model given a name.
:param name:
:param schema:
:return:
"""
try:
Validator.check_schema(schema)
validator = validator_for(schema)
validator.check_schema(schema)
except SchemaError as e:
raise ValueError(f"Invalid JSON Schema: {e}")
if schema["type"] != "object":
raise TypeError(
f"Invalid JSON Schema: {schema['type']}. Only 'object' can be converted to Pydantic models."
)
if "title" not in schema:
raise ValueError("JSON Schema must have a title.")
return SchemaConverter._build_model_from_properties(
name, schema["properties"], schema.get("required", [])
)
schema_type = SchemaConverter._get_schema_type(schema)
match schema_type:
case "object":
return ObjectTypeParser.to_model(
schema["title"],
schema["properties"],
schema.get("required", []),
context=schema,
ref_cache=dict(),
)
case "$ref":
parsed_model, _ = RefTypeParser().from_properties(
schema["title"],
schema,
context=schema,
ref_cache=dict(),
)
return parsed_model
case _:
raise TypeError(f"Unsupported schema type: {schema_type}")
@staticmethod
def _build_model_from_properties(
model_name: str, model_properties: dict, required_keys: list[str]
) -> Type:
properties = SchemaConverter._parse_properties(model_properties, required_keys)
return create_model(model_name, **properties)
@staticmethod
def _get_schema_type(schema: JSONSchema) -> str:
"""
Returns the type of the schema.
:param schema: The JSON Schema to check.
:return: The type of the schema.
"""
if "$ref" in schema:
return "$ref"
return schema.get("type", "undefined")
@staticmethod
def _parse_properties(
properties: dict, required_keys=None
) -> dict[str, tuple[type, Field]]:
required_keys = required_keys or []
fields = {}
for name, prop in properties.items():
fields[name] = SchemaConverter._build_field(name, prop, required_keys)
return fields
@staticmethod
def _build_field(
name, properties: dict, required_keys: list[str]
) -> tuple[type, dict]:
_field_type, _field_args = GenericTypeParser.get_impl(
properties["type"]
).from_properties(name, properties)
_field_args = _field_args or {}
if description := properties.get("description"):
_field_args["description"] = description
if name not in required_keys:
_field_args["default"] = properties.get("default", None)
if "default_factory" in _field_args and "default" in _field_args:
del _field_args["default"]
return _field_type, Field(**_field_args)
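The dispatch above hinges on `_get_schema_type`: a `$ref` key takes precedence over any declared `type`, and a schema with neither falls back to `"undefined"`. The rule in isolation, as a plain-dict sketch:

```python
def get_schema_type(schema: dict) -> str:
    # "$ref" wins over any declared "type"; with neither
    # present, fall back to the sentinel "undefined".
    if "$ref" in schema:
        return "$ref"
    return schema.get("type", "undefined")

assert get_schema_type({"$ref": "#/$defs/person", "type": "object"}) == "$ref"
assert get_schema_type({"type": "object"}) == "object"
assert get_schema_type({}) == "undefined"
```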


@@ -1,4 +1,6 @@
from typing_extensions import Dict, List, Literal, TypedDict, Union
from types import NoneType
JSONSchemaType = Literal[
@@ -6,6 +8,17 @@ JSONSchemaType = Literal[
]
JSONSchemaNativeTypes: tuple[type, ...] = (
str,
int,
float,
bool,
list,
set,
NoneType,
)
JSONType = Union[str, int, float, bool, None, Dict[str, "JSONType"], List["JSONType"]]


@@ -0,0 +1,9 @@
from jambo.types.json_schema_type import JSONSchema
from typing_extensions import TypedDict
class TypeParserOptions(TypedDict):
required: bool
context: JSONSchema
ref_cache: dict[str, type]


@@ -1,11 +0,0 @@
def mappings_properties_builder(properties, mappings, default_mappings=None):
default_mappings = default_mappings or {
"default": "default",
"description": "description",
}
mappings = default_mappings | mappings
return {
mappings[key]: value for key, value in properties.items() if key in mappings
}
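Before its removal in this change, this helper simply renamed JSON Schema keys to their Pydantic `Field` keyword equivalents and dropped everything unmapped. Reproduced standalone for illustration (the sample inputs are hypothetical):

```python
def mappings_properties_builder(properties, mappings, default_mappings=None):
    # "default" and "description" pass through unless explicitly overridden.
    default_mappings = default_mappings or {
        "default": "default",
        "description": "description",
    }
    mappings = default_mappings | mappings
    # Rename mapped keys; silently drop unmapped ones.
    return {mappings[k]: v for k, v in properties.items() if k in mappings}

result = mappings_properties_builder(
    {"minimum": 1, "maximum": 10, "unknownKey": True},
    {"minimum": "ge", "maximum": "le"},
)
assert result == {"ge": 1, "le": 10}  # unknownKey was dropped
```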


@@ -1,51 +0,0 @@
from jambo.utils.properties_builder.mappings_properties_builder import (
mappings_properties_builder,
)
def numeric_properties_builder(properties):
_mappings = {
"minimum": "ge",
"exclusiveMinimum": "gt",
"maximum": "le",
"exclusiveMaximum": "lt",
"multipleOf": "multiple_of",
"default": "default",
}
mapped_properties = mappings_properties_builder(properties, _mappings)
if "default" in properties:
default_value = properties["default"]
if not isinstance(default_value, (int, float)):
raise ValueError(
f"Default value must be a number, got {type(default_value).__name__}"
)
if default_value >= properties.get("maximum", float("inf")):
raise ValueError(
f"Default value exceeds maximum limit of {properties.get('maximum')}"
)
if default_value <= properties.get("minimum", float("-inf")):
raise ValueError(
f"Default value is below minimum limit of {properties.get('minimum')}"
)
if default_value > properties.get("exclusiveMaximum", float("inf")):
raise ValueError(
f"Default value exceeds exclusive maximum limit of {properties.get('exclusiveMaximum')}"
)
if default_value < properties.get("exclusiveMinimum", float("-inf")):
raise ValueError(
f"Default value is below exclusive minimum limit of {properties.get('exclusiveMinimum')}"
)
if "multipleOf" in properties:
if default_value % properties["multipleOf"] != 0:
raise ValueError(
f"Default value {default_value} is not a multiple of {properties['multipleOf']}"
)
return mapped_properties


@@ -1,7 +1,7 @@
[project]
name = "jambo"
dynamic = ["version"]
description = "Jambo - JSON Schema to Pydantic Converter"
requires-python = ">=3.10,<4.0"
maintainers = [
{ name = "Vitor Hideyoshi", email = "vitor.h.n.batista@gmail.com" },
@@ -23,6 +23,7 @@ readme = "README.md"
# Project Dependencies
dependencies = [
"email-validator>=2.2.0",
"jsonschema>=4.23.0",
"pydantic>=2.10.6",
]
@@ -33,6 +34,9 @@ dev = [
"poethepoet>=0.33.1",
"pre-commit>=4.2.0",
"ruff>=0.11.4",
"sphinx>=8.1.3",
"sphinx-autobuild>=2024.10.3",
"sphinx-rtd-theme>=3.0.2",
]
@@ -44,7 +48,9 @@ repository = "https://github.com/HideyoshiNakazone/jambo.git"
# POE Tasks
[tool.poe.tasks]
create-hooks = "bash .githooks/set-hooks.sh"
tests = "python -m coverage run -m unittest discover -v"
tests-report = "python -m coverage xml"
serve-docs = "sphinx-autobuild docs/source docs/build"
# Build System
[tool.hatch.version]
@@ -55,8 +61,20 @@ requires = ["hatchling", "hatch-vcs"]
build-backend = "hatchling.build"
# Tests
[tool.coverage.run]
omit = [
"tests/*",
]
# Linters
[tool.ruff.lint]
extend-select = ["I"]
[tool.ruff.lint.isort]
known-first-party = ["jambo"]
section-order=[
"future",
"first-party",
@@ -64,3 +82,4 @@ section-order=[
"third-party",
"standard-library",
]
lines-after-imports = 2


@@ -0,0 +1,308 @@
from jambo.parser.allof_type_parser import AllOfTypeParser
from unittest import TestCase
class TestAllOfTypeParser(TestCase):
def test_all_of_type_parser_object_type(self):
"""
Test the AllOfTypeParser with an object type and validate the properties.
When using allOf with object it should be able to validate the properties
and join them correctly.
"""
properties = {
"type": "object",
"allOf": [
{
"properties": {
"name": {
"type": "string",
"minLength": 1,
}
},
},
{
"type": "object",
"properties": {
"name": {
"type": "string",
"maxLength": 4,
},
"age": {
"type": "integer",
"maximum": 100,
"minimum": 0,
},
},
},
],
}
type_parsing, type_validator = AllOfTypeParser().from_properties(
"placeholder", properties
)
with self.assertRaises(ValueError):
type_parsing(name="John", age=101)
with self.assertRaises(ValueError):
type_parsing(name="", age=30)
with self.assertRaises(ValueError):
type_parsing(name="John Invalid", age=30)
obj = type_parsing(name="John", age=30)
self.assertEqual(obj.name, "John")
self.assertEqual(obj.age, 30)
def test_all_of_type_parser_object_type_required(self):
"""
Tests the required properties of the AllOfTypeParser with an object type.
"""
properties = {
"type": "object",
"allOf": [
{
"properties": {
"name": {
"type": "string",
}
},
"required": ["name"],
},
{
"type": "object",
"properties": {
"age": {
"type": "integer",
}
},
"required": ["age"],
},
],
}
type_parsing, type_validator = AllOfTypeParser().from_properties(
"placeholder", properties
)
with self.assertRaises(ValueError):
type_parsing(name="John")
with self.assertRaises(ValueError):
type_parsing(age=30)
obj = type_parsing(name="John", age=30)
self.assertEqual(obj.name, "John")
self.assertEqual(obj.age, 30)
def test_all_of_type_top_level_type(self):
"""
Tests the AllOfTypeParser with a top-level type and validate the properties.
"""
properties = {
"type": "string",
"allOf": [
{"maxLength": 11},
{"maxLength": 4},
{"minLength": 1},
{"minLength": 2},
],
}
type_parsing, type_validator = AllOfTypeParser().from_properties(
"placeholder", properties
)
self.assertEqual(type_parsing, str)
self.assertEqual(type_validator["max_length"], 11)
self.assertEqual(type_validator["min_length"], 1)
def test_all_of_type_parser_in_fields(self):
"""
Tests the AllOfTypeParser when set in the fields of a model.
"""
properties = {
"allOf": [
{"type": "string", "maxLength": 11},
{"type": "string", "maxLength": 4},
{"type": "string", "minLength": 1},
{"type": "string", "minLength": 2},
]
}
type_parsing, type_validator = AllOfTypeParser().from_properties(
"placeholder", properties
)
self.assertEqual(type_parsing, str)
self.assertEqual(type_validator["max_length"], 11)
self.assertEqual(type_validator["min_length"], 1)
def test_invalid_all_of(self):
"""
Tests that an error is raised when the allOf type is not present.
"""
properties = {
"wrongKey": [
{"type": "string", "maxLength": 11},
{"type": "string", "maxLength": 4},
{"type": "string", "minLength": 1},
{"type": "string", "minLength": 2},
]
}
with self.assertRaises(ValueError):
AllOfTypeParser().from_properties("placeholder", properties)
def test_all_of_invalid_type_not_present(self):
properties = {
"allOf": [
{"maxLength": 11},
{"maxLength": 4},
{"minLength": 1},
{"minLength": 2},
]
}
with self.assertRaises(ValueError):
AllOfTypeParser().from_properties("placeholder", properties)
def test_all_of_invalid_type_in_fields(self):
properties = {
"allOf": [
{"type": "string", "maxLength": 11},
{"type": "integer", "maxLength": 4},
{"type": "string", "minLength": 1},
{"minLength": 2},
]
}
with self.assertRaises(ValueError):
AllOfTypeParser().from_properties("placeholder", properties)
def test_all_of_invalid_type_not_all_equal(self):
"""
Tests that an error is raised when the allOf types are not all equal.
"""
properties = {
"allOf": [
{"type": "string", "maxLength": 11},
{"type": "integer", "maxLength": 4},
{"type": "string", "minLength": 1},
]
}
with self.assertRaises(ValueError):
AllOfTypeParser().from_properties("placeholder", properties)
def test_all_of_description_field(self):
"""
Tests the AllOfTypeParser with a description field.
"""
properties = {
"type": "object",
"allOf": [
{
"properties": {
"name": {
"type": "string",
"description": "One",
}
},
},
{
"properties": {
"name": {
"type": "string",
"description": "Of",
}
},
},
{
"properties": {
"name": {
"type": "string",
"description": "Us",
}
},
},
],
}
type_parsing, _ = AllOfTypeParser().from_properties("placeholder", properties)
self.assertEqual(
type_parsing.model_json_schema()["properties"]["name"]["description"],
"One | Of | Us",
)
def test_all_of_with_defaults(self):
"""
Tests the AllOfTypeParser with a default value.
"""
properties = {
"type": "object",
"allOf": [
{
"properties": {
"name": {
"type": "string",
"default": "John",
}
},
},
{
"properties": {
"name": {
"type": "string",
"default": "John",
},
"age": {
"type": "integer",
"default": 30,
},
},
},
],
}
type_parsing, _ = AllOfTypeParser().from_properties("placeholder", properties)
obj = type_parsing()
self.assertEqual(obj.name, "John")
self.assertEqual(obj.age, 30)
def test_all_of_with_conflicting_defaults(self):
"""
Tests the AllOfTypeParser with conflicting default values.
"""
properties = {
"type": "object",
"allOf": [
{
"properties": {
"name": {
"type": "string",
"default": "John",
}
},
},
{
"properties": {
"name": {
"type": "string",
"default": "Doe",
}
},
},
],
}
with self.assertRaises(ValueError):
AllOfTypeParser().from_properties("placeholder", properties)


@@ -0,0 +1,99 @@
from jambo.parser.anyof_type_parser import AnyOfTypeParser
from typing_extensions import Annotated, Union, get_args, get_origin
from unittest import TestCase
class TestAnyOfTypeParser(TestCase):
def test_any_with_missing_properties(self):
properties = {
"notAnyOf": [
{"type": "string"},
{"type": "integer"},
],
}
with self.assertRaises(ValueError):
AnyOfTypeParser().from_properties("placeholder", properties)
def test_any_of_with_invalid_properties(self):
properties = {
"anyOf": None,
}
with self.assertRaises(ValueError):
AnyOfTypeParser().from_properties("placeholder", properties)
def test_any_of_string_or_int(self):
"""
Tests the AnyOfTypeParser with a string or int type.
"""
properties = {
"anyOf": [
{"type": "string"},
{"type": "integer"},
],
}
type_parsing, _ = AnyOfTypeParser().from_properties(
"placeholder", properties, required=True
)
# check union type has string and int
self.assertEqual(get_origin(type_parsing), Union)
type_1, type_2 = get_args(type_parsing)
self.assertEqual(get_origin(type_1), Annotated)
self.assertIn(str, get_args(type_1))
self.assertEqual(get_origin(type_2), Annotated)
self.assertIn(int, get_args(type_2))
def test_any_of_string_or_int_with_default(self):
"""
Tests the AnyOfTypeParser with a string or int type and a default value.
"""
properties = {
"anyOf": [
{"type": "string"},
{"type": "integer"},
],
"default": 42,
}
type_parsing, type_validator = AnyOfTypeParser().from_properties(
"placeholder", properties
)
# check union type has string and int
self.assertEqual(get_origin(type_parsing), Union)
type_1, type_2 = get_args(type_parsing)
self.assertEqual(get_origin(type_1), Annotated)
self.assertIn(str, get_args(type_1))
self.assertEqual(get_origin(type_2), Annotated)
self.assertIn(int, get_args(type_2))
self.assertEqual(type_validator["default"], 42)
def test_any_string_or_int_with_invalid_defaults(self):
"""
Tests the AnyOfTypeParser with a string or int type and an invalid default value.
"""
properties = {
"anyOf": [
{"type": "string"},
{"type": "integer"},
],
"default": 3.14,
}
with self.assertRaises(ValueError):
AnyOfTypeParser().from_properties("placeholder", properties)
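The assertions in these tests rely on how `typing` introspects unions of annotated members; the same checks work on a hand-built type, independent of jambo (the annotation payloads here are hypothetical stand-ins for per-member constraints):

```python
from typing import Annotated, Union, get_args, get_origin

# A union of two annotated members, shaped like what the anyOf tests expect.
T = Union[Annotated[str, "str-constraints"], Annotated[int, "int-constraints"]]

assert get_origin(T) is Union
member_1, member_2 = get_args(T)                        # Annotated members survive the union
assert get_origin(member_1) is Annotated and str in get_args(member_1)
assert get_origin(member_2) is Annotated and int in get_args(member_2)
```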


@@ -0,0 +1,99 @@
from jambo.parser import ArrayTypeParser
from typing_extensions import get_args
from unittest import TestCase
class TestArrayTypeParser(TestCase):
def test_array_parser_no_options(self):
parser = ArrayTypeParser()
properties = {"items": {"type": "string"}}
type_parsing, type_validator = parser.from_properties("placeholder", properties)
element_type = get_args(type_parsing)[0]
self.assertEqual(type_parsing.__origin__, list)
self.assertEqual(element_type, str)
def test_array_parser_with_options_unique(self):
parser = ArrayTypeParser()
properties = {"items": {"type": "string"}, "uniqueItems": True}
type_parsing, type_validator = parser.from_properties("placeholder", properties)
self.assertEqual(type_parsing.__origin__, set)
def test_array_parser_with_options_max_min(self):
parser = ArrayTypeParser()
properties = {"items": {"type": "string"}, "maxItems": 10, "minItems": 1}
type_parsing, type_validator = parser.from_properties("placeholder", properties)
self.assertEqual(type_parsing.__origin__, list)
self.assertEqual(type_validator["max_length"], 10)
self.assertEqual(type_validator["min_length"], 1)
def test_array_parser_with_options_default_list(self):
parser = ArrayTypeParser()
properties = {"items": {"type": "string"}, "default": ["a", "b", "c"]}
type_parsing, type_validator = parser.from_properties("placeholder", properties)
self.assertEqual(type_parsing.__origin__, list)
self.assertEqual(type_validator["default_factory"](), ["a", "b", "c"])
def test_array_parse_with_options_default_set(self):
parser = ArrayTypeParser()
properties = {
"items": {"type": "string"},
"uniqueItems": True,
"default": ["a", "b", "c"],
}
type_parsing, type_validator = parser.from_properties("placeholder", properties)
self.assertEqual(type_parsing.__origin__, set)
self.assertEqual(type_validator["default_factory"](), {"a", "b", "c"})
def test_array_parser_with_invalid_default_elem_type(self):
parser = ArrayTypeParser()
properties = {"items": {"type": "string"}, "default": ["a", 1, "c"]}
with self.assertRaises(ValueError):
parser.from_properties("placeholder", properties)
def test_array_parser_with_invalid_default_type(self):
parser = ArrayTypeParser()
properties = {"items": {"type": "string"}, "default": 0}
with self.assertRaises(ValueError):
parser.from_properties("placeholder", properties)
def test_array_parser_with_invalid_default_min(self):
parser = ArrayTypeParser()
properties = {"items": {"type": "string"}, "default": ["a"], "minItems": 2}
with self.assertRaises(ValueError):
parser.from_properties("placeholder", properties)
def test_array_parser_with_invalid_default_max(self):
parser = ArrayTypeParser()
properties = {
"items": {"type": "string"},
"default": ["a", "b", "c", "d"],
"maxItems": 3,
}
with self.assertRaises(ValueError):
parser.from_properties("placeholder", properties)


@@ -0,0 +1,43 @@
from jambo.parser import BooleanTypeParser
from unittest import TestCase
class TestBoolTypeParser(TestCase):
def test_bool_parser_no_options(self):
parser = BooleanTypeParser()
properties = {"type": "boolean"}
type_parsing, type_validator = parser.from_properties_impl(
"placeholder", properties
)
self.assertEqual(type_parsing, bool)
self.assertEqual(type_validator, {"default": None})
def test_bool_parser_with_default(self):
parser = BooleanTypeParser()
properties = {
"type": "boolean",
"default": True,
}
type_parsing, type_validator = parser.from_properties_impl(
"placeholder", properties
)
self.assertEqual(type_parsing, bool)
self.assertEqual(type_validator["default"], True)
def test_bool_parser_with_invalid_default(self):
parser = BooleanTypeParser()
properties = {
"type": "boolean",
"default": "invalid",
}
with self.assertRaises(ValueError):
parser.from_properties_impl("placeholder", properties)


@@ -0,0 +1,49 @@
from jambo.parser import ConstTypeParser
from typing_extensions import Annotated, get_args, get_origin
from unittest import TestCase
class TestConstTypeParser(TestCase):
def test_const_type_parser(self):
parser = ConstTypeParser()
expected_const_value = "United States of America"
properties = {"const": expected_const_value}
parsed_type, parsed_properties = parser.from_properties_impl(
"country", properties
)
self.assertEqual(get_origin(parsed_type), Annotated)
self.assertIn(str, get_args(parsed_type))
self.assertEqual(parsed_properties["default"], expected_const_value)
def test_const_type_parser_invalid_properties(self):
parser = ConstTypeParser()
expected_const_value = "United States of America"
properties = {"notConst": expected_const_value}
with self.assertRaises(ValueError) as context:
parser.from_properties_impl("invalid_country", properties)
self.assertIn(
"Const type invalid_country must have 'const' property defined",
str(context.exception),
)
def test_const_type_parser_invalid_const_value(self):
parser = ConstTypeParser()
properties = {"const": {}}
with self.assertRaises(ValueError) as context:
parser.from_properties_impl("invalid_country", properties)
self.assertIn(
"Const type invalid_country must have 'const' value of allowed types",
str(context.exception),
)


@@ -0,0 +1,90 @@
from jambo.parser import EnumTypeParser
from enum import Enum
from unittest import TestCase
class TestEnumTypeParser(TestCase):
def test_enum_type_parser_throws_enum_not_defined(self):
parser = EnumTypeParser()
schema = {}
with self.assertRaises(ValueError):
parsed_type, parsed_properties = parser.from_properties_impl(
"TestEnum",
schema,
)
def test_enum_type_parser_throws_enum_not_list(self):
parser = EnumTypeParser()
schema = {
"enum": "not_a_list",
}
with self.assertRaises(ValueError):
parsed_type, parsed_properties = parser.from_properties_impl(
"TestEnum",
schema,
)
def test_enum_type_parser_creates_enum(self):
parser = EnumTypeParser()
schema = {
"enum": ["value1", "value2", "value3"],
}
parsed_type, parsed_properties = parser.from_properties_impl(
"TestEnum",
schema,
)
self.assertIsInstance(parsed_type, type)
self.assertTrue(issubclass(parsed_type, Enum))
self.assertEqual(
set(parsed_type.__members__.keys()), {"VALUE1", "VALUE2", "VALUE3"}
)
self.assertEqual(parsed_properties, {"default": None})
def test_enum_type_parser_creates_enum_with_default(self):
parser = EnumTypeParser()
schema = {
"enum": ["value1", "value2", "value3"],
"default": "value2",
}
parsed_type, parsed_properties = parser.from_properties_impl(
"TestEnum",
schema,
)
self.assertIsInstance(parsed_type, type)
self.assertTrue(issubclass(parsed_type, Enum))
self.assertEqual(
set(parsed_type.__members__.keys()), {"VALUE1", "VALUE2", "VALUE3"}
)
self.assertEqual(parsed_properties["default"].value, "value2")
def test_enum_type_parser_throws_invalid_default(self):
parser = EnumTypeParser()
schema = {
"enum": ["value1", "value2", "value3"],
"default": "invalid_value",
}
with self.assertRaises(ValueError):
parser.from_properties_impl("TestEnum", schema)
def test_enum_type_parser_throws_invalid_enum_value(self):
parser = EnumTypeParser()
schema = {
"enum": ["value1", 42, dict()],
}
with self.assertRaises(ValueError):
parser.from_properties_impl("TestEnum", schema)
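The member-name convention these tests assert (upper-cased names, original strings preserved as values) matches what the stdlib `Enum` functional API produces, which is presumably what `EnumTypeParser` builds on:

```python
from enum import Enum

values = ["value1", "value2", "value3"]
# Functional API: member names are the upper-cased strings,
# member values are the original schema strings.
TestEnum = Enum("TestEnum", {v.upper(): v for v in values})

assert set(TestEnum.__members__) == {"VALUE1", "VALUE2", "VALUE3"}
assert TestEnum.VALUE2.value == "value2"
assert TestEnum("value2") is TestEnum.VALUE2  # lookup by value
```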


@@ -0,0 +1,135 @@
from jambo.parser import FloatTypeParser
from unittest import TestCase
class TestFloatTypeParser(TestCase):
def test_float_parser_no_options(self):
parser = FloatTypeParser()
properties = {"type": "number"}
type_parsing, type_validator = parser.from_properties("placeholder", properties)
self.assertEqual(type_parsing, float)
self.assertEqual(type_validator, {"default": None})
def test_float_parser_with_options(self):
parser = FloatTypeParser()
properties = {
"type": "number",
"maximum": 10.5,
"minimum": 1.0,
"multipleOf": 0.5,
}
type_parsing, type_validator = parser.from_properties("placeholder", properties)
self.assertEqual(type_parsing, float)
self.assertEqual(type_validator["le"], 10.5)
self.assertEqual(type_validator["ge"], 1.0)
self.assertEqual(type_validator["multiple_of"], 0.5)
def test_float_parser_with_default(self):
parser = FloatTypeParser()
properties = {
"type": "number",
"default": 5.0,
"maximum": 10.5,
"minimum": 1.0,
"multipleOf": 0.5,
}
type_parsing, type_validator = parser.from_properties("placeholder", properties)
self.assertEqual(type_parsing, float)
self.assertEqual(type_validator["default"], 5.0)
self.assertEqual(type_validator["le"], 10.5)
self.assertEqual(type_validator["ge"], 1.0)
self.assertEqual(type_validator["multiple_of"], 0.5)
def test_float_parser_with_default_invalid_type(self):
parser = FloatTypeParser()
properties = {
"type": "number",
"default": "invalid", # Invalid default value
"maximum": 10.5,
"minimum": 1.0,
"multipleOf": 0.5,
}
with self.assertRaises(ValueError):
parser.from_properties("placeholder", properties)
def test_float_parser_with_default_invalid_maximum(self):
parser = FloatTypeParser()
properties = {
"type": "number",
"default": 15.0,
"maximum": 10.5,
"minimum": 1.0,
"multipleOf": 0.5,
}
with self.assertRaises(ValueError):
parser.from_properties("placeholder", properties)
def test_float_parser_with_default_invalid_minimum(self):
parser = FloatTypeParser()
properties = {
"type": "number",
"default": -5.0,
"maximum": 10.5,
"minimum": 1.0,
"multipleOf": 0.5,
}
with self.assertRaises(ValueError):
parser.from_properties("placeholder", properties)
def test_float_parser_with_default_invalid_exclusive_maximum(self):
parser = FloatTypeParser()
properties = {
"type": "number",
"default": 10.5,
"exclusiveMaximum": 10.5,
"minimum": 1.0,
"multipleOf": 0.5,
}
with self.assertRaises(ValueError):
parser.from_properties("placeholder", properties)
def test_float_parser_with_default_invalid_exclusive_minimum(self):
parser = FloatTypeParser()
properties = {
"type": "number",
"default": 1.0,
"maximum": 10.5,
"exclusiveMinimum": 1.0,
"multipleOf": 0.5,
}
with self.assertRaises(ValueError):
parser.from_properties("placeholder", properties)
def test_float_parser_with_default_invalid_multiple(self):
parser = FloatTypeParser()
properties = {
"type": "number",
"default": 5.0,
"maximum": 10.5,
"minimum": 1.0,
"multipleOf": 2.0,
}
with self.assertRaises(ValueError):
parser.from_properties("placeholder", properties)
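The last case rejects `default=5.0` against `multipleOf=2.0` because the check reduces to a float modulo test; note that such checks are only exact when the operands are exactly representable in binary:

```python
# multipleOf validation on floats reduces to a modulo test.
assert 5.0 % 2.0 != 0      # 5.0 is not a multiple of 2.0, so the default is rejected
assert 10.0 % 0.5 == 0.0   # exact: 0.5 is a power of two
assert 0.3 % 0.1 != 0      # inexact: neither 0.3 nor 0.1 has an exact binary form
```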


@@ -0,0 +1,135 @@
from jambo.parser import IntTypeParser
from unittest import TestCase
class TestIntTypeParser(TestCase):
def test_int_parser_no_options(self):
parser = IntTypeParser()
properties = {"type": "integer"}
type_parsing, type_validator = parser.from_properties("placeholder", properties)
self.assertEqual(type_parsing, int)
self.assertEqual(type_validator, {"default": None})
def test_int_parser_with_options(self):
parser = IntTypeParser()
properties = {
"type": "integer",
"maximum": 10,
"minimum": 1,
"multipleOf": 2,
}
type_parsing, type_validator = parser.from_properties("placeholder", properties)
self.assertEqual(type_parsing, int)
self.assertEqual(type_validator["le"], 10)
self.assertEqual(type_validator["ge"], 1)
self.assertEqual(type_validator["multiple_of"], 2)
def test_int_parser_with_default(self):
parser = IntTypeParser()
properties = {
"type": "integer",
"default": 6,
"maximum": 10,
"minimum": 1,
"multipleOf": 2,
}
type_parsing, type_validator = parser.from_properties("placeholder", properties)
self.assertEqual(type_parsing, int)
self.assertEqual(type_validator["default"], 6)
self.assertEqual(type_validator["le"], 10)
self.assertEqual(type_validator["ge"], 1)
self.assertEqual(type_validator["multiple_of"], 2)
def test_int_parser_with_default_invalid_type(self):
parser = IntTypeParser()
properties = {
"type": "integer",
"default": "invalid", # Invalid default value
"maximum": 10,
"minimum": 1,
"multipleOf": 2,
}
with self.assertRaises(ValueError):
parser.from_properties("placeholder", properties)
def test_int_parser_with_default_invalid_maximum(self):
parser = IntTypeParser()
properties = {
"type": "integer",
"default": 15,
"maximum": 10,
"minimum": 1,
"multipleOf": 2,
}
with self.assertRaises(ValueError):
parser.from_properties("placeholder", properties)
def test_int_parser_with_default_invalid_minimum(self):
parser = IntTypeParser()
properties = {
"type": "integer",
"default": -5,
"maximum": 10,
"minimum": 1,
"multipleOf": 2,
}
with self.assertRaises(ValueError):
parser.from_properties("placeholder", properties)
def test_int_parser_with_default_invalid_exclusive_maximum(self):
parser = IntTypeParser()
properties = {
"type": "integer",
"default": 10,
"exclusiveMaximum": 10,
"minimum": 1,
"multipleOf": 2,
}
with self.assertRaises(ValueError):
parser.from_properties("placeholder", properties)
def test_int_parser_with_default_invalid_exclusive_minimum(self):
parser = IntTypeParser()
properties = {
"type": "integer",
"default": 1,
"exclusiveMinimum": 1,
"maximum": 10,
"multipleOf": 2,
}
with self.assertRaises(ValueError):
parser.from_properties("placeholder", properties)
def test_int_parser_with_default_invalid_multipleOf(self):
parser = IntTypeParser()
properties = {
"type": "integer",
"default": 5,
"maximum": 10,
"minimum": 1,
"multipleOf": 2,
}
with self.assertRaises(ValueError):
parser.from_properties("placeholder", properties)


@@ -0,0 +1,49 @@
from jambo.parser import ObjectTypeParser
from unittest import TestCase
class TestObjectTypeParser(TestCase):
def test_object_type_parser(self):
parser = ObjectTypeParser()
properties = {
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
},
}
Model, _args = parser.from_properties_impl("placeholder", properties)
obj = Model(name="name", age=10)
self.assertEqual(obj.name, "name")
self.assertEqual(obj.age, 10)
def test_object_type_parser_with_default(self):
parser = ObjectTypeParser()
properties = {
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
},
"default": {
"name": "default_name",
"age": 20,
},
}
_, type_validator = parser.from_properties_impl("placeholder", properties)
# Check default value
default_obj = type_validator["default_factory"]()
self.assertEqual(default_obj.name, "default_name")
self.assertEqual(default_obj.age, 20)
# Check that the default factory returns a new object on each call
new_obj = type_validator["default_factory"]()
self.assertNotEqual(id(default_obj), id(new_obj))
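The identity check at the end guards against the classic shared-mutable-default pitfall, which is why object defaults are wrapped in a `default_factory` rather than stored as a single instance. The underlying idea in isolation:

```python
def default_factory():
    # Rebuilding the default on every call yields a fresh object each time.
    return {"name": "default_name", "age": 20}

first = default_factory()
second = default_factory()
assert first == second      # equal in value...
assert first is not second  # ...but never the same instance, so mutating
                            # one model's default cannot leak into another
```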


@@ -0,0 +1,484 @@
from jambo.parser import ObjectTypeParser, RefTypeParser
from typing import ForwardRef
from unittest import TestCase
class TestRefTypeParser(TestCase):
def test_ref_type_parser_throws_without_ref(self):
properties = {
"title": "person",
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
},
"required": ["name", "age"],
}
with self.assertRaises(ValueError):
RefTypeParser().from_properties(
"person",
properties,
context=properties,
ref_cache={},
required=True,
)
def test_ref_type_parser_throws_without_context(self):
properties = {
"title": "person",
"$ref": "#/$defs/person",
"$defs": {
"person": {
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
},
}
},
}
with self.assertRaises(RuntimeError):
RefTypeParser().from_properties(
"person",
properties,
ref_cache={},
required=True,
)
def test_ref_type_parser_throws_without_ref_cache(self):
properties = {
"title": "person",
"$ref": "#/$defs/person",
"$defs": {
"person": {
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
},
}
},
}
with self.assertRaises(RuntimeError):
RefTypeParser().from_properties(
"person",
properties,
context=properties,
required=True,
)
def test_ref_type_parser_throws_if_network_ref_type(self):
properties = {
"title": "person",
"$ref": "https://example.com/schemas/person.json",
}
with self.assertRaises(ValueError):
RefTypeParser().from_properties(
"person",
properties,
context=properties,
ref_cache={},
required=True,
)
def test_ref_type_parser_throws_if_non_root_or_def_ref(self):
# This is invalid because object3 is referencing object2,
# but object2 is not defined in $defs or as a root reference.
properties = {
"title": "object1",
"type": "object",
"properties": {
"object2": {
"type": "object",
"properties": {
"attr1": {
"type": "string",
},
"attr2": {
"type": "integer",
},
},
},
"object3": {
"$ref": "#/$defs/object2",
},
},
}
with self.assertRaises(ValueError):
ObjectTypeParser().from_properties(
"person",
properties,
context=properties,
ref_cache={},
required=True,
)
def test_ref_type_parser_throws_if_def_doesnt_exists(self):
properties = {
"title": "person",
"$ref": "#/$defs/employee",
"$defs": {},
}
with self.assertRaises(ValueError):
RefTypeParser().from_properties(
"person",
properties,
context=properties,
ref_cache={},
required=True,
)
def test_ref_type_parser_throws_if_ref_property_doesnt_exists(self):
properties = {
"title": "person",
"$ref": "#/$defs/person",
"$defs": {"person": None},
}
with self.assertRaises(ValueError):
RefTypeParser().from_properties(
"person",
properties,
context=properties,
ref_cache={},
required=True,
)
def test_ref_type_parser_with_def(self):
properties = {
"title": "person",
"$ref": "#/$defs/person",
"$defs": {
"person": {
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
},
}
},
}
type_parsing, type_validator = RefTypeParser().from_properties(
"person",
properties,
context=properties,
ref_cache={},
required=True,
)
self.assertIsInstance(type_parsing, type)
obj = type_parsing(name="John", age=30)
self.assertEqual(obj.name, "John")
self.assertEqual(obj.age, 30)
def test_ref_type_parser_with_forward_ref(self):
properties = {
"title": "person",
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
"emergency_contact": {
"$ref": "#",
},
},
"required": ["name", "age"],
}
model, type_validator = ObjectTypeParser().from_properties(
"person",
properties,
context=properties,
ref_cache={},
required=True,
)
obj = model(
name="John",
age=30,
emergency_contact=model(
name="Jane",
age=28,
),
)
self.assertEqual(obj.name, "John")
self.assertEqual(obj.age, 30)
self.assertIsInstance(obj.emergency_contact, model)
self.assertEqual(obj.emergency_contact.name, "Jane")
self.assertEqual(obj.emergency_contact.age, 28)
def test_ref_type_parser_invalid_forward_ref(self):
properties = {
# Doesn't have a title, which is required for forward references
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
"emergency_contact": {
"$ref": "#",
},
},
"required": ["name", "age"],
}
with self.assertRaises(ValueError):
ObjectTypeParser().from_properties(
"person",
properties,
context=properties,
ref_cache={},
required=True,
)
def test_ref_type_parser_forward_ref_validates(self):
properties = {
"title": "person",
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
"emergency_contact": {
"$ref": "#",
},
},
"required": ["name", "age"],
}
model, type_validator = ObjectTypeParser().from_properties(
"person",
properties,
context=properties,
ref_cache={},
required=True,
)
# Checks that a model created via ForwardRef is validated correctly.
with self.assertRaises(ValueError):
model(
name="John",
age=30,
emergency_contact=model(
name="Jane",
),
)
def test_ref_type_parser_with_cyclic_def(self):
properties = {
"title": "person",
"$ref": "#/$defs/person",
"$defs": {
"person": {
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
"emergency_contact": {
"$ref": "#/$defs/person",
},
},
}
},
}
model, type_validator = RefTypeParser().from_properties(
"person",
properties,
context=properties,
ref_cache={},
required=True,
)
obj = model(
name="John",
age=30,
emergency_contact=model(
name="Jane",
age=28,
),
)
self.assertEqual(obj.name, "John")
self.assertEqual(obj.age, 30)
self.assertIsInstance(obj.emergency_contact, model)
self.assertEqual(obj.emergency_contact.name, "Jane")
self.assertEqual(obj.emergency_contact.age, 28)
def test_ref_type_parser_with_repeated_ref(self):
properties = {
"title": "person",
"$ref": "#/$defs/person",
"$defs": {
"person": {
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
"emergency_contact": {
"$ref": "#/$defs/person",
},
"friends": {
"type": "array",
"items": {
"$ref": "#/$defs/person",
},
},
},
}
},
}
model, type_validator = RefTypeParser().from_properties(
"person",
properties,
context=properties,
ref_cache={},
required=True,
)
obj = model(
name="John",
age=30,
emergency_contact=model(
name="Jane",
age=28,
),
friends=[
model(name="Alice", age=25),
model(name="Bob", age=26),
],
)
self.assertEqual(
type(obj.emergency_contact),
type(obj.friends[0]),
"Emergency contact and friends should be of the same type",
)
def test_ref_type_parser_pre_computed_ref_cache(self):
ref_cache = {}
parent_properties = {
"$defs": {
"person": {
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
},
}
},
}
properties1 = {
"title": "person1",
"$ref": "#/$defs/person",
}
model1, _ = RefTypeParser().from_properties(
"person",
properties1,
context=parent_properties,
ref_cache=ref_cache,
required=True,
)
properties2 = {
"title": "person2",
"$ref": "#/$defs/person",
}
model2, _ = RefTypeParser().from_properties(
"person",
properties2,
context=parent_properties,
ref_cache=ref_cache,
required=True,
)
self.assertIs(model1, model2, "Models should be the same instance")
def test_parse_from_strategy_invalid_ref_strategy(self):
properties = {
"title": "person",
"$ref": "#/$defs/person",
"$defs": {
"person": {
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
},
}
},
}
with self.assertRaises(ValueError):
RefTypeParser()._parse_from_strategy(
"invalid_strategy",
"person",
properties,
)
def test_parse_from_strategy_forward_ref(self):
properties = {
"title": "person",
"$ref": "#/$defs/person",
"$defs": {
"person": {
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
},
}
},
}
parsed_type = RefTypeParser()._parse_from_strategy(
"forward_ref",
"person",
properties,
)
self.assertIsInstance(parsed_type, ForwardRef)
def test_parse_from_strategy_def_ref(self):
properties = {
"title": "person",
"$ref": "#/$defs/person",
"$defs": {
"person": {
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
},
}
},
}
parsed_type = RefTypeParser()._parse_from_strategy(
"def_ref",
"person",
properties,
context=properties,
ref_cache={},
required=True,
)
obj = parsed_type(
name="John",
age=30,
)
self.assertEqual(obj.name, "John")
self.assertEqual(obj.age, 30)
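The `$ref` strings exercised above ("#" for the schema root, "#/$defs/person" for a definition) resolve via JSON-Pointer-style lookup. A minimal stdlib sketch of that resolution step — `resolve_ref` is a hypothetical helper for illustration, not jambo's actual API:

```python
def resolve_ref(ref: str, document: dict) -> dict:
    """Resolve a JSON-Pointer-style $ref against the schema document."""
    if ref == "#":
        # "#" refers to the whole document (the root schema).
        return document
    node = document
    # Strip the leading "#/" and walk each path segment, e.g.
    # "#/$defs/person" -> ["$defs", "person"].
    for part in ref.lstrip("#/").split("/"):
        node = node[part]
    return node


schema = {"$defs": {"person": {"type": "object"}}}
person_schema = resolve_ref("#/$defs/person", schema)
```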


@@ -0,0 +1,199 @@
from jambo.parser import StringTypeParser
from pydantic import EmailStr, HttpUrl, IPvAnyAddress
from datetime import date, datetime, time
from unittest import TestCase
class TestStringTypeParser(TestCase):
def test_string_parser_no_options(self):
parser = StringTypeParser()
properties = {"type": "string"}
type_parsing, type_validator = parser.from_properties("placeholder", properties)
self.assertEqual(type_parsing, str)
def test_string_parser_with_options(self):
parser = StringTypeParser()
properties = {
"type": "string",
"maxLength": 10,
"minLength": 1,
"pattern": "^[a-zA-Z]+$",
}
type_parsing, type_validator = parser.from_properties("placeholder", properties)
self.assertEqual(type_parsing, str)
self.assertEqual(type_validator["max_length"], 10)
self.assertEqual(type_validator["min_length"], 1)
self.assertEqual(type_validator["pattern"], "^[a-zA-Z]+$")
def test_string_parser_with_default_value(self):
parser = StringTypeParser()
properties = {
"type": "string",
"default": "default_value",
"maxLength": 20,
"minLength": 5,
}
type_parsing, type_validator = parser.from_properties("placeholder", properties)
self.assertEqual(type_parsing, str)
self.assertEqual(type_validator["default"], "default_value")
self.assertEqual(type_validator["max_length"], 20)
self.assertEqual(type_validator["min_length"], 5)
def test_string_parser_with_invalid_default_value_type(self):
parser = StringTypeParser()
properties = {
"type": "string",
"default": 12345, # Invalid default value
"maxLength": 20,
"minLength": 5,
}
with self.assertRaises(ValueError):
parser.from_properties("placeholder", properties)
def test_string_parser_with_default_invalid_maxlength(self):
parser = StringTypeParser()
properties = {
"type": "string",
"default": "default_value",
"maxLength": 2,
"minLength": 1,
}
with self.assertRaises(ValueError):
parser.from_properties("placeholder", properties)
def test_string_parser_with_default_invalid_minlength(self):
parser = StringTypeParser()
properties = {
"type": "string",
"default": "a",
"maxLength": 20,
"minLength": 2,
}
with self.assertRaises(ValueError):
parser.from_properties("placeholder", properties)
def test_string_parser_with_email_format(self):
parser = StringTypeParser()
properties = {
"type": "string",
"format": "email",
}
type_parsing, type_validator = parser.from_properties("placeholder", properties)
self.assertEqual(type_parsing, EmailStr)
def test_string_parser_with_uri_format(self):
parser = StringTypeParser()
properties = {
"type": "string",
"format": "uri",
}
type_parsing, type_validator = parser.from_properties("placeholder", properties)
self.assertEqual(type_parsing, HttpUrl)
def test_string_parser_with_ip_formats(self):
parser = StringTypeParser()
for ip_format in ["ipv4", "ipv6"]:
properties = {
"type": "string",
"format": ip_format,
}
type_parsing, type_validator = parser.from_properties(
"placeholder", properties
)
self.assertEqual(type_parsing, IPvAnyAddress)
def test_string_parser_with_time_format(self):
parser = StringTypeParser()
properties = {
"type": "string",
"format": "time",
}
type_parsing, type_validator = parser.from_properties("placeholder", properties)
self.assertEqual(type_parsing, time)
def test_string_parser_with_pattern_based_formats(self):
parser = StringTypeParser()
for format_type in ["hostname"]:
properties = {
"type": "string",
"format": format_type,
}
type_parsing, type_validator = parser.from_properties(
"placeholder", properties
)
self.assertEqual(type_parsing, str)
self.assertIn("pattern", type_validator)
self.assertEqual(
type_validator["pattern"], parser.format_pattern_mapping[format_type]
)
def test_string_parser_with_unsupported_format(self):
parser = StringTypeParser()
properties = {
"type": "string",
"format": "unsupported-format",
}
with self.assertRaises(ValueError) as context:
parser.from_properties("placeholder", properties)
self.assertEqual(
str(context.exception), "Unsupported string format: unsupported-format"
)
def test_string_parser_with_date_format(self):
parser = StringTypeParser()
properties = {
"type": "string",
"format": "date",
}
type_parsing, type_validator = parser.from_properties("placeholder", properties)
self.assertEqual(type_parsing, date)
def test_string_parser_with_datetime_format(self):
parser = StringTypeParser()
properties = {
"type": "string",
"format": "date-time",
}
type_parsing, type_validator = parser.from_properties("placeholder", properties)
self.assertEqual(type_parsing, datetime)
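The format-handling behavior these tests assert amounts to a lookup table from JSON Schema `format` values to Python types, with a `ValueError` for anything unknown. A stdlib-only sketch covering the date/time cases — `python_type_for_format` and `FORMAT_TYPE_MAP` are hypothetical names, and jambo's real parser additionally maps "email", "uri", and "ipv4"/"ipv6" to pydantic types:

```python
from datetime import date, datetime, time

# Assumed mapping, mirroring what the tests above check.
FORMAT_TYPE_MAP = {
    "date": date,
    "date-time": datetime,
    "time": time,
}


def python_type_for_format(fmt: str) -> type:
    """Return the Python type for a JSON Schema string format."""
    if fmt not in FORMAT_TYPE_MAP:
        # Matches the error message asserted in the tests above.
        raise ValueError(f"Unsupported string format: {fmt}")
    return FORMAT_TYPE_MAP[fmt]
```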


@@ -0,0 +1,21 @@
from jambo.parser import StringTypeParser
from jambo.parser._type_parser import GenericTypeParser
from unittest import TestCase
class TestGenericTypeParser(TestCase):
def test_get_impl(self):
parser = GenericTypeParser._get_impl({"type": "string"})
self.assertIsInstance(parser(), StringTypeParser)
def test_get_impl_invalid_json_schema(self):
with self.assertRaises(RuntimeError):
StringTypeParser.json_schema_type = None
GenericTypeParser._get_impl({"type": "string"})
StringTypeParser.json_schema_type = "type:string"
def test_get_impl_invalid_type(self):
with self.assertRaises(ValueError):
GenericTypeParser._get_impl({"type": "invalid_type"})


@@ -1,7 +1,8 @@
-from jambo.schema_converter import SchemaConverter
+from jambo import SchemaConverter
-from pydantic import BaseModel
+from pydantic import BaseModel, HttpUrl
+from ipaddress import IPv4Address, IPv6Address
 from unittest import TestCase
@@ -10,6 +11,60 @@ def is_pydantic_model(cls):
class TestSchemaConverter(TestCase):
def test_invalid_schema(self):
schema = {
"title": 1,
"description": "A person",
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
},
}
with self.assertRaises(ValueError):
SchemaConverter.build(schema)
def test_build_expects_title(self):
schema = {
"description": "A person",
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
},
}
with self.assertRaises(ValueError):
SchemaConverter.build(schema)
def test_build_expects_object(self):
schema = {
"title": "Person",
"description": "A person",
"type": "string",
}
with self.assertRaises(TypeError):
SchemaConverter.build(schema)
def test_is_invalid_field(self):
schema = {
"title": "Person",
"description": "A person",
"type": "object",
"properties": {
"id": {
"notType": "string",
}
},
# 'required': ['name', 'age', 'is_active', 'friends', 'address'],
}
with self.assertRaises(ValueError) as context:
SchemaConverter.build(schema)
self.assertTrue("Unknown type" in str(context.exception))
def test_jsonschema_to_pydantic(self):
schema = {
"title": "Person",
@@ -200,6 +255,7 @@ class TestSchemaConverter(TestCase):
self.assertEqual(obj.name, "John")
def test_invalid_default_for_string(self):
# Test for default with maxLength
schema_max_length = {
"title": "Person",
@@ -256,3 +312,348 @@ class TestSchemaConverter(TestCase):
model_set = SchemaConverter.build(schema_set)
self.assertEqual(model_set().friends, {"John", "Jane"})
def test_default_for_object(self):
schema = {
"title": "Person",
"description": "A person",
"type": "object",
"properties": {
"address": {
"type": "object",
"properties": {
"street": {"type": "string"},
"city": {"type": "string"},
},
"default": {"street": "123 Main St", "city": "Springfield"},
},
},
"required": ["address"],
}
model = SchemaConverter.build(schema)
obj = model(address={"street": "123 Main St", "city": "Springfield"})
self.assertEqual(obj.address.street, "123 Main St")
self.assertEqual(obj.address.city, "Springfield")
def test_all_of(self):
schema = {
"title": "Person",
"description": "A person",
"type": "object",
"properties": {
"name": {
"allOf": [
{"type": "string", "maxLength": 11},
{"type": "string", "maxLength": 4},
{"type": "string", "minLength": 1},
{"type": "string", "minLength": 2},
]
},
},
}
Model = SchemaConverter.build(schema)
obj = Model(
name="J",
)
self.assertEqual(obj.name, "J")
with self.assertRaises(ValueError):
Model(name="John Invalid")
with self.assertRaises(ValueError):
Model(name="")
def test_any_of(self):
schema = {
"title": "Person",
"description": "A person",
"type": "object",
"properties": {
"id": {
"anyOf": [
{"type": "string", "maxLength": 11, "minLength": 1},
{"type": "integer", "maximum": 10},
]
},
},
}
Model = SchemaConverter.build(schema)
obj = Model(id=1)
self.assertEqual(obj.id, 1)
obj = Model(id="12345678901")
self.assertEqual(obj.id, "12345678901")
with self.assertRaises(ValueError):
Model(id="")
with self.assertRaises(ValueError):
Model(id="12345678901234567890")
with self.assertRaises(ValueError):
Model(id=11)
def test_string_format_email(self):
schema = {
"title": "EmailTest",
"type": "object",
"properties": {"email": {"type": "string", "format": "email"}},
}
model = SchemaConverter.build(schema)
self.assertEqual(model(email="test@example.com").email, "test@example.com")
with self.assertRaises(ValueError):
model(email="invalid-email")
def test_string_format_uri(self):
schema = {
"title": "UriTest",
"type": "object",
"properties": {"website": {"type": "string", "format": "uri"}},
}
model = SchemaConverter.build(schema)
self.assertEqual(
model(website="https://example.com").website, HttpUrl("https://example.com")
)
with self.assertRaises(ValueError):
model(website="invalid-uri")
def test_string_format_ipv4(self):
schema = {
"title": "IPv4Test",
"type": "object",
"properties": {"ip": {"type": "string", "format": "ipv4"}},
}
model = SchemaConverter.build(schema)
self.assertEqual(model(ip="192.168.1.1").ip, IPv4Address("192.168.1.1"))
with self.assertRaises(ValueError):
model(ip="256.256.256.256")
def test_string_format_ipv6(self):
schema = {
"title": "IPv6Test",
"type": "object",
"properties": {"ip": {"type": "string", "format": "ipv6"}},
}
model = SchemaConverter.build(schema)
self.assertEqual(
model(ip="2001:0db8:85a3:0000:0000:8a2e:0370:7334").ip,
IPv6Address("2001:0db8:85a3:0000:0000:8a2e:0370:7334"),
)
with self.assertRaises(ValueError):
model(ip="invalid-ipv6")
def test_string_format_hostname(self):
schema = {
"title": "HostnameTest",
"type": "object",
"properties": {"hostname": {"type": "string", "format": "hostname"}},
}
model = SchemaConverter.build(schema)
self.assertEqual(model(hostname="example.com").hostname, "example.com")
with self.assertRaises(ValueError):
model(hostname="invalid..hostname")
def test_string_format_datetime(self):
schema = {
"title": "DateTimeTest",
"type": "object",
"properties": {"timestamp": {"type": "string", "format": "date-time"}},
}
model = SchemaConverter.build(schema)
self.assertEqual(
model(timestamp="2024-01-01T12:00:00Z").timestamp.isoformat(),
"2024-01-01T12:00:00+00:00",
)
with self.assertRaises(ValueError):
model(timestamp="invalid-datetime")
def test_string_format_time(self):
schema = {
"title": "TimeTest",
"type": "object",
"properties": {"time": {"type": "string", "format": "time"}},
}
model = SchemaConverter.build(schema)
self.assertEqual(
model(time="20:20:39+00:00").time.isoformat(), "20:20:39+00:00"
)
with self.assertRaises(ValueError):
model(time="25:00:00")
def test_string_format_unsupported(self):
schema = {
"title": "InvalidFormat",
"type": "object",
"properties": {"field": {"type": "string", "format": "unsupported"}},
}
with self.assertRaises(ValueError):
SchemaConverter.build(schema)
def test_ref_with_root_ref(self):
schema = {
"title": "Person",
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
"emergency_contact": {
"$ref": "#",
},
},
"required": ["name", "age"],
}
model = SchemaConverter.build(schema)
obj = model(
name="John",
age=30,
emergency_contact=model(
name="Jane",
age=28,
),
)
self.assertEqual(obj.name, "John")
self.assertEqual(obj.age, 30)
self.assertIsInstance(obj.emergency_contact, model)
self.assertEqual(obj.emergency_contact.name, "Jane")
self.assertEqual(obj.emergency_contact.age, 28)
def test_ref_with_def(self):
schema = {
"title": "person",
"$ref": "#/$defs/person",
"$defs": {
"person": {
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
"emergency_contact": {
"$ref": "#/$defs/person",
},
},
}
},
}
model = SchemaConverter.build(schema)
obj = model(
name="John",
age=30,
emergency_contact=model(
name="Jane",
age=28,
),
)
self.assertEqual(obj.name, "John")
self.assertEqual(obj.age, 30)
self.assertIsInstance(obj.emergency_contact, model)
self.assertEqual(obj.emergency_contact.name, "Jane")
self.assertEqual(obj.emergency_contact.age, 28)
def test_ref_with_def_another_model(self):
schema = {
"title": "Person",
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
"address": {"$ref": "#/$defs/Address"},
},
"required": ["name"],
"$defs": {
"Address": {
"type": "object",
"properties": {
"street": {"type": "string"},
"city": {"type": "string"},
},
"required": ["street", "city"],
}
},
}
Model = SchemaConverter.build(schema)
obj = Model(
name="John",
age=30,
address={"street": "123 Main St", "city": "Springfield"},
)
self.assertEqual(obj.name, "John")
self.assertEqual(obj.age, 30)
self.assertEqual(obj.address.street, "123 Main St")
self.assertEqual(obj.address.city, "Springfield")
def test_enum_type_parser(self):
schema = {
"title": "Person",
"type": "object",
"properties": {
"status": {
"type": "string",
"enum": ["active", "inactive", "pending"],
}
},
"required": ["status"],
}
Model = SchemaConverter.build(schema)
obj = Model(status="active")
self.assertEqual(obj.status.value, "active")
def test_enum_type_parser_with_default(self):
schema = {
"title": "Person",
"type": "object",
"properties": {
"status": {
"type": "string",
"enum": ["active", "inactive", "pending"],
"default": "active",
}
},
"required": ["status"],
}
Model = SchemaConverter.build(schema)
obj = Model()
self.assertEqual(obj.status.value, "active")
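The enum behavior exercised here (note `obj.status.value` — members carry the schema's string values) can be approximated with the stdlib functional `Enum` API. This is an assumed sketch of what an `"enum"` schema plausibly lowers to, not jambo's actual internals, and `enum_from_schema` is a hypothetical helper:

```python
from enum import Enum


def enum_from_schema(name: str, values: list[str]) -> type[Enum]:
    """Build an Enum whose member names and values mirror the schema values."""
    return Enum(name, {v: v for v in values})


Status = enum_from_schema("Status", ["active", "inactive", "pending"])
member = Status("active")  # lookup by value, as pydantic validation would do
```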
def test_const_type_parser(self):
schema = {
"title": "Country",
"type": "object",
"properties": {
"name": {
"const": "United States of America",
}
},
"required": ["name"],
}
Model = SchemaConverter.build(schema)
obj = Model()
self.assertEqual(obj.name, "United States of America")
with self.assertRaises(ValueError):
obj.name = "Canada"
with self.assertRaises(ValueError):
Model(name="Canada")


@@ -1,139 +0,0 @@
from jambo.parser import (
ArrayTypeParser,
FloatTypeParser,
GenericTypeParser,
IntTypeParser,
ObjectTypeParser,
StringTypeParser,
)
import unittest
from typing import get_args
class TestTypeParser(unittest.TestCase):
def test_get_impl(self):
self.assertEqual(GenericTypeParser.get_impl("integer"), IntTypeParser)
self.assertEqual(GenericTypeParser.get_impl("string"), StringTypeParser)
self.assertEqual(GenericTypeParser.get_impl("number"), FloatTypeParser)
self.assertEqual(GenericTypeParser.get_impl("object"), ObjectTypeParser)
self.assertEqual(GenericTypeParser.get_impl("array"), ArrayTypeParser)
def test_int_parser(self):
parser = IntTypeParser()
type_parsing, type_validator = parser.from_properties(
"placeholder",
{
"type": "integer",
"minimum": 0,
"exclusiveMinimum": 1,
"maximum": 10,
"exclusiveMaximum": 11,
"multipleOf": 2,
},
)
self.assertEqual(type_parsing, int)
self.assertEqual(type_validator["ge"], 0)
self.assertEqual(type_validator["gt"], 1)
self.assertEqual(type_validator["le"], 10)
self.assertEqual(type_validator["lt"], 11)
self.assertEqual(type_validator["multiple_of"], 2)
def test_float_parser(self):
parser = FloatTypeParser()
type_parsing, type_validator = parser.from_properties(
"placeholder",
{
"type": "number",
"minimum": 0,
"exclusiveMinimum": 1,
"maximum": 10,
"exclusiveMaximum": 11,
"multipleOf": 2,
},
)
self.assertEqual(type_parsing, float)
self.assertEqual(type_validator["ge"], 0)
self.assertEqual(type_validator["gt"], 1)
self.assertEqual(type_validator["le"], 10)
self.assertEqual(type_validator["lt"], 11)
self.assertEqual(type_validator["multiple_of"], 2)
def test_string_parser(self):
parser = StringTypeParser()
type_parsing, type_validator = parser.from_properties(
"placeholder",
{
"type": "string",
"maxLength": 10,
"minLength": 1,
"pattern": "[a-zA-Z0-9]",
},
)
self.assertEqual(type_parsing, str)
self.assertEqual(type_validator["max_length"], 10)
self.assertEqual(type_validator["min_length"], 1)
self.assertEqual(type_validator["pattern"], "[a-zA-Z0-9]")
def test_object_parser(self):
parser = ObjectTypeParser()
properties = {
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
},
}
Model, _args = parser.from_properties("placeholder", properties)
obj = Model(name="name", age=10)
self.assertEqual(obj.name, "name")
self.assertEqual(obj.age, 10)
def test_array_of_string_parser(self):
parser = ArrayTypeParser()
expected_definition = (list[str], {})
properties = {"items": {"type": "string"}}
self.assertEqual(
parser.from_properties("placeholder", properties), expected_definition
)
def test_array_of_object_parser(self):
parser = ArrayTypeParser()
properties = {
"type": "array",
"items": {
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
},
},
"maxItems": 10,
"minItems": 1,
"uniqueItems": True,
}
type_parsing, type_validator = parser.from_properties("placeholder", properties)
self.assertEqual(type_parsing.__origin__, set)
self.assertEqual(type_validator["max_length"], 10)
self.assertEqual(type_validator["min_length"], 1)
Model = get_args(type_parsing)[0]
obj = Model(name="name", age=10)
self.assertEqual(obj.name, "name")
self.assertEqual(obj.age, 10)

uv.lock (generated): diff suppressed because it is too large.