Commit

simple old style nwb-linkml test
sneakers-the-rat committed Feb 5, 2024
1 parent d5a3a09 commit 5d3cd95
Showing 20 changed files with 335 additions and 46 deletions.
9 changes: 9 additions & 0 deletions docs/api/index.md
@@ -0,0 +1,9 @@
# numpydantic

Top-level API contents

```{eval-rst}
.. automodule:: numpydantic
:members:
:imported-members:
```
10 changes: 10 additions & 0 deletions docs/api/linkml/index.md
@@ -0,0 +1,10 @@
# linkml

```{toctree}
:caption: LinkML
ndarraygen
pydanticgen
template
```

6 changes: 6 additions & 0 deletions docs/api/linkml/ndarraygen.md
@@ -0,0 +1,6 @@
# ndarraygen

```{eval-rst}
.. automodule:: numpydantic.linkml.ndarraygen
:members:
```
6 changes: 6 additions & 0 deletions docs/api/linkml/pydanticgen.md
@@ -0,0 +1,6 @@
# pydanticgen

```{eval-rst}
.. automodule:: numpydantic.linkml.pydanticgen
:members:
```
6 changes: 6 additions & 0 deletions docs/api/linkml/template.md
@@ -0,0 +1,6 @@
# template

```{eval-rst}
.. automodule:: numpydantic.linkml.template
:members:
```
6 changes: 6 additions & 0 deletions docs/api/maps.md
@@ -0,0 +1,6 @@
# maps

```{eval-rst}
.. automodule:: numpydantic.maps
:members:
```
6 changes: 6 additions & 0 deletions docs/api/monkeypatch.md
@@ -0,0 +1,6 @@
# monkeypatch

```{eval-rst}
.. automodule:: numpydantic.monkeypatch
:members:
```
6 changes: 6 additions & 0 deletions docs/api/ndarray.md
@@ -0,0 +1,6 @@
# ndarray

```{eval-rst}
.. automodule:: numpydantic.ndarray
:members:
```
6 changes: 6 additions & 0 deletions docs/api/proxy.md
@@ -0,0 +1,6 @@
# proxy

```{eval-rst}
.. automodule:: numpydantic.proxy
:members:
```
53 changes: 29 additions & 24 deletions docs/conf.py
@@ -6,48 +6,53 @@
# -- Project information -----------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#project-information

project = 'numpydantic'
copyright = '2024, Jonny Saunders'
author = 'Jonny Saunders'
release = 'v0.0.0'
project = "numpydantic"
copyright = "2024, Jonny Saunders"
author = "Jonny Saunders"
release = "v0.0.0"

# -- General configuration ---------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration

extensions = [
'sphinx.ext.napoleon',
'sphinx.ext.autodoc',
'sphinxcontrib.autodoc_pydantic',
'sphinx.ext.intersphinx',
"sphinx.ext.napoleon",
"sphinx.ext.autodoc",
"sphinxcontrib.autodoc_pydantic",
"sphinx.ext.intersphinx",
"sphinx_design",
'myst_parser',
'sphinx.ext.todo'
"myst_parser",
"sphinx.ext.todo",
]

templates_path = ['_templates']
exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']
templates_path = ["_templates"]
exclude_patterns = ["_build", "Thumbs.db", ".DS_Store"]

intersphinx_mapping = {
'python': ('https://docs.python.org/3', None),
'numpy': ('https://numpy.org/doc/stable/', None),
'pydantic': ('https://docs.pydantic.dev/latest/', None),
'linkml': ('https://linkml.io/linkml/', None),
'linkml_runtime': ('https://linkml.io/linkml/', None),
'linkml-runtime': ('https://linkml.io/linkml/', None)
"python": ("https://docs.python.org/3", None),
"numpy": ("https://numpy.org/doc/stable/", None),
"pydantic": ("https://docs.pydantic.dev/latest/", None),
"linkml": ("https://linkml.io/linkml/", None),
"linkml_runtime": ("https://linkml.io/linkml/", None),
"linkml-runtime": ("https://linkml.io/linkml/", None),
}

# -- Options for HTML output -------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#options-for-html-output

html_theme = 'furo'
html_static_path = ['_static']
html_theme = "furo"
html_static_path = ["_static"]

# autodoc
autodoc_pydantic_model_show_json_error_strategy = 'coerce'
autodoc_pydantic_model_show_json_error_strategy = "coerce"
autodoc_pydantic_model_show_json = False
autodoc_mock_imports = []
autodoc_mock_imports = [
"dask",
"h5py",
"linkml",
"linkml-runtime",
]
autoclass_content = "both"
autodoc_member_order='bysource'
autodoc_member_order = "bysource"
add_module_names = False

# Napoleon settings
@@ -68,4 +73,4 @@

# todo
todo_include_todos = True
todo_link_only = True
todo_link_only = True
10 changes: 8 additions & 2 deletions docs/hooks.md
@@ -1,5 +1,11 @@
# Hooks

## TODO
What hooks do we want to expose to downstream users so they can use this without needing
to override everything?

- nwb compatibility: allowable precision map in dtype check
```{todo}
**NWB Compatibility**
**Precision:** NWB allows for a sort of hierarchy of type specification:
a less precise declared type also allows the data to be provided in a more precise type.
```
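As a rough sketch of what such a hook could look like (everything below is hypothetical: `PRECISION_MAP` and `dtype_is_allowed` are illustrations, not part of this package), a downstream user might supply a map of allowed "widenings" that the dtype check consults:

```python
import numpy as np

# Hypothetical hook: map a declared (less precise) dtype to the set of
# more precise dtypes that should also pass validation.
PRECISION_MAP = {
    np.uint8: {np.uint8, np.uint16, np.uint32, np.uint64},
    np.float32: {np.float32, np.float64},
}

def dtype_is_allowed(declared: type, actual: np.dtype) -> bool:
    """True if `actual` is the declared dtype or an allowed, more precise one."""
    allowed = PRECISION_MAP.get(declared, {declared})
    return any(np.dtype(candidate) == actual for candidate in allowed)

assert dtype_is_allowed(np.uint8, np.dtype(np.uint16))
assert not dtype_is_allowed(np.float32, np.dtype(np.int32))
```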
38 changes: 20 additions & 18 deletions docs/index.md
@@ -14,30 +14,32 @@ It does two primary things:
- **Generate models from LinkML** - extend the LinkML pydantic generator to create models
  that use the [linkml-arrays](https://github.com/linkml/linkml-arrays) syntax

## Overview

The Python type annotation system is weird and not like the rest of Python!
(at least until [PEP 0649](https://peps.python.org/pep-0649/) gets mainlined).
Similarly, Pydantic 2's core_schema system is wonderful but still relatively poorly
documented for custom types! This package does the work of plugging them
together to make some kind of type validation Frankenstein.

The first problem is that type annotations are evaluated statically by Python, mypy,
etc. This means you can't use typical Python syntax for declaring types - it has to
be present at the time `__new__` is called, rather than `__init__`.

- pydantic schema
- validation
- serialization
- lazy loading
- compression

```{toctree}
:maxdepth: 2
:caption: Contents
:hidden: true
overview
ndarray
linkml
hooks
todo
```

```{toctree}
:maxdepth: 2
:caption: Contents:
:hidden:
:caption: API
:hidden: true
api/index
api/ndarray
api/proxy
api/linkml/index
api/maps
api/monkeypatch
hooks
```

2 changes: 2 additions & 0 deletions docs/linkml.md
@@ -0,0 +1,2 @@
# LinkML Generation

134 changes: 134 additions & 0 deletions docs/ndarray.md
@@ -0,0 +1,134 @@
# Constrained Arrays

## Implementation details

```{todo}
**Docs:**
Describe implementation details!
```

## Examples

### Declaration

Annotate a field with a single {class}`~numpydantic.NDArray` class, or use a {class}`~typing.Union`
to express more complex array constraints.

This package is effectively a Pydantic interface to [nptyping](https://github.com/ramonhagenaars/nptyping),
so any array syntax that is valid there also works here (see [TODO](todo) for caveats).

```python
from typing import Union
from pydantic import BaseModel
from numpydantic import NDArray, Shape, UInt8, Float, Int

class Image(BaseModel):
    """
    Data values. Data can be in 1-D, 2-D, 3-D, or 4-D. The first dimension should always represent time. This can also be used to store binary data (e.g., image frames). This can also be a link to data stored in an external file.
    """
    array: Union[
        NDArray[Shape["* x, * y"], UInt8],
        NDArray[Shape["* x, * y, 3 rgb"], UInt8],
        NDArray[Shape["* x, * y, 4 rgba"], UInt8],
        NDArray[Shape["* t, * x, * y, 3 rgb"], UInt8],
        NDArray[Shape["* t, * x, * y, 4 rgba"], Float]
    ]
```

### Validation

```python
import numpy as np
# works
frame_gray = Image(array=np.ones((1280, 720), dtype=np.uint8))
frame_rgb = Image(array=np.ones((1280, 720, 3), dtype=np.uint8))
frame_rgba = Image(array=np.ones((1280, 720, 4), dtype=np.uint8))
video_rgb = Image(array=np.ones((100, 1280, 720, 3), dtype=np.uint8))

# fails
wrong_n_dimensions = Image(array=np.ones((1280,), dtype=np.uint8))
wrong_shape = Image(array=np.ones((1280,720,10), dtype=np.uint8))
wrong_type = Image(array=np.ones((1280,720,3), dtype=np.float64))

# shapes and types are checked together
float_video = Image(array=np.ones((100, 1280, 720, 4),dtype=float))
wrong_shape_float_video = Image(array=np.ones((100, 1280, 720, 3),dtype=float))
```
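Each of the failing cases raises a regular pydantic `ValidationError`, so invalid arrays can be handled like any other validation failure (a minimal sketch):

```python
from pydantic import ValidationError

try:
    Image(array=np.ones((1280,), dtype=np.uint8))
except ValidationError as e:
    # the error reports which members of the Union failed and why
    print(e)
```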

### JSON schema generation

```python
class MyArray(BaseModel):
    array: NDArray[Shape["2 x, * y, 4 z"], Float]
```

```python
>>> import json
>>> print(json.dumps(MyArray.model_json_schema(), indent=2))
```

```json
{
  "properties": {
    "array": {
      "items": {
        "items": {
          "items": {
            "type": "number"
          },
          "maxItems": 4,
          "minItems": 4,
          "type": "array"
        },
        "type": "array"
      },
      "maxItems": 2,
      "minItems": 2,
      "title": "Array",
      "type": "array"
    }
  },
  "required": [
    "array"
  ],
  "title": "MyArray",
  "type": "object"
}
```

### Serialization

```python
class SmolArray(BaseModel):
    array: NDArray[Shape["2 x, 2 y"], Int]

class BigArray(BaseModel):
    array: NDArray[Shape["1000 x, 1000 y"], Int]
```

Serialize small arrays as lists of lists, and big arrays as a b64-encoded, blosc-compressed string:

```python
>>> smol = SmolArray(array=np.array([[1,2],[3,4]], dtype=int))
>>> big = BigArray(array=np.random.randint(0,255,(1000,1000),int))

>>> print(smol.model_dump_json())
{"array":[[1,2],[3,4]]}
>>> print(big.model_dump_json())
{
  "array": "( long b64 encoded string )",
  "shape": [1000, 1000],
  "dtype": "int64",
  "unpack_fns": ["base64.b64decode", "blosc2.unpack_array2"]
}
```
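The compressed output above is aspirational (the serializer itself is not implemented in this commit), but the round trip it implies could be sketched with `base64` and `blosc2`'s `pack_array2`/`unpack_array2` helpers roughly like this:

```python
import base64

import blosc2
import numpy as np

arr = np.random.randint(0, 255, (1000, 1000), int)

# compress + pack the array, then b64-encode so it can live inside a JSON string
packed = base64.b64encode(blosc2.pack_array2(arr)).decode("ascii")

# apply the listed "unpack_fns" in reverse to recover the array
roundtrip = blosc2.unpack_array2(base64.b64decode(packed))
assert (roundtrip == arr).all()
```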

## TODO

```{todo}
Implement structured arrays
```

```{todo}
Implement pandas dataframe validation?
```
17 changes: 17 additions & 0 deletions docs/overview.md
@@ -0,0 +1,17 @@
# Overview

The Python type annotation system is weird and not like the rest of Python!
(at least until [PEP 0649](https://peps.python.org/pep-0649/) gets mainlined).
Similarly, Pydantic 2's core_schema system is wonderful but still relatively poorly
documented for custom types! This package does the work of plugging them
together to make some kind of type validation Frankenstein.

The first problem is that type annotations are evaluated statically by Python, mypy,
etc. This means you can't use typical Python syntax for declaring types - it has to
be present at the time `__new__` is called, rather than `__init__`.

- pydantic schema
- validation
- serialization
- lazy loading
- compression
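
As a concrete (and deliberately generic) sketch of that "plugging together", this is roughly how any custom type hooks into Pydantic 2's core_schema machinery; the `AnyNumpyArray` class below is an illustration, not this package's implementation:

```python
from typing import Any

import numpy as np
from pydantic import BaseModel, GetCoreSchemaHandler
from pydantic_core import core_schema


class AnyNumpyArray:
    """Toy annotation: accept anything that np.asarray can coerce."""

    @classmethod
    def __get_pydantic_core_schema__(
        cls, source_type: Any, handler: GetCoreSchemaHandler
    ) -> core_schema.CoreSchema:
        # replace pydantic's default handling for this annotation with our own validator
        return core_schema.no_info_plain_validator_function(np.asarray)


class Model(BaseModel):
    array: AnyNumpyArray


model = Model(array=[[1, 2], [3, 4]])
print(type(model.array))  # <class 'numpy.ndarray'>
```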
5 changes: 5 additions & 0 deletions docs/todo.md
@@ -0,0 +1,5 @@
# TODO

```{todolist}
```