Support jinja templates #1156
Conversation
Run prompts in the sandbox environment; debatable, but it seems recommended for untrusted templates.
"GitPython",
"requests",
"chevron",
"jinja2>=3.1.6",
Jinja appeared to have some bad vulnerabilities prior to 3.1.5, so I chose 3.1.6 as the minimum version.
def render_templated_object(obj: Any, args: Any) -> Any:
class _JinjaSafeDict:
I ran into Jinja resolving dotted access to dict methods during evaluation (i.e. TypeError: 'builtin_function_or_method' object is not iterable). Wrapping the dictionary values prevents that. Some users may find this and the sandbox environment restrictive, but we start with safety.
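For context, a hypothetical sketch of this kind of wrapper (the PR's real `_JinjaSafeDict` may differ): on a plain dict, `{{ input.items }}` resolves to the bound `dict.items` method, while a wrapper can force dotted access to resolve to keys.

```python
# Hypothetical SafeDict sketch, not the actual _JinjaSafeDict from the PR.
from jinja2.sandbox import SandboxedEnvironment

class SafeDict:
    def __init__(self, data: dict):
        self._data = data

    def __getattr__(self, name):
        # Only called when normal attribute lookup fails, so "items",
        # "keys", etc. resolve to dict keys instead of dict methods.
        try:
            return SafeDict.wrap(self._data[name])
        except KeyError:
            raise AttributeError(name) from None

    def __getitem__(self, key):
        return SafeDict.wrap(self._data[key])

    @staticmethod
    def wrap(value):
        # Wrap nested dicts so dotted access stays safe all the way down.
        return SafeDict(value) if isinstance(value, dict) else value

env = SandboxedEnvironment()
out = env.from_string("{{ input.items }}").render(input=SafeDict({"items": ["a", "b"]}))
print(out)  # ['a', 'b']
```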
@cpinn is this one still needed?
template_format: TemplateFormat | None = None
@classmethod
def from_dict_deep(cls, d: dict):
Welcoming input here. I want the API to accept nunjucks, but I want users of the SDK to interact with jinja.
ibolmo left a comment
left a thinker
template_obj = env.from_string(template)
if isinstance(data, dict):
    wrapped_data = {k: _JinjaSafeDict.wrap(v) for k, v in data.items()}
    variables = {"input": _JinjaSafeDict(data), **wrapped_data}
#nit: it may be useful to call out that in JS we expand the input inside the main variables, so we're just keeping similar behavior here.
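As a sketch of the behavior being mirrored (assumed; variable names illustrative): expanding the input dict into the top-level namespace lets templates reference fields either way.

```python
# Illustrative sketch: with input expanded into the top-level variables,
# both {{ name }} and {{ input.name }} resolve to the same value.
from jinja2.sandbox import SandboxedEnvironment

env = SandboxedEnvironment()
data = {"name": "Ada"}
variables = {"input": data, **data}  # expand input into the main variables
out = env.from_string("{{ name }} / {{ input.name }}").render(**variables)
print(out)  # Ada / Ada
```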
variables = {"input": data}
return template_obj.render(**variables)
except jinja2.UndefinedError as e:
    raise ValueError(f"Template rendering failed: {str(e)}") from e
#aside: pretty sure we don't do this, but in the future we should have custom error classes.
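A hypothetical sketch of what such a custom error class could look like (`TemplateRenderError` is an invented name, not part of the SDK; `StrictUndefined` is assumed so that any missing variable raises `jinja2.UndefinedError`):

```python
# Hedged sketch: wrap Jinja's UndefinedError in a domain-specific error
# class instead of a bare ValueError. Names here are illustrative.
import jinja2
from jinja2.sandbox import SandboxedEnvironment

class TemplateRenderError(ValueError):
    """Raised when a prompt template fails to render."""

def render_template(template: str, **variables) -> str:
    env = SandboxedEnvironment(undefined=jinja2.StrictUndefined)
    try:
        return env.from_string(template).render(**variables)
    except jinja2.UndefinedError as e:
        raise TemplateRenderError(f"Template rendering failed: {e}") from e

try:
    render_template("{{ missing }}")
except TemplateRenderError as e:
    print(e)  # Template rendering failed: 'missing' is undefined
```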
def from_dict_deep(cls, d: dict):
    d2 = dict(d)
    if d2.get("template_format") == "nunjucks":
        d2["template_format"] = "jinja"
I take it we're aliasing jinja <=> nunjucks? I'm not seeing something similar in TS. 🤔
Also curious whether template_format should be in the PromptData or in PromptOptions. I take it the latter would be saved in the prompt? Would this make the prompt interoperable in TS and Python? Would that also avoid hiding the template_format within the data itself?
Do you remember how I got really worried about the SDK release with nunjucks? That was because of the PromptData vs. PromptOptions question. I realized we might have wanted this in PromptOptions, but it already existed and went out as PromptData. Some people had already started using that on the API, so I didn't feel I could roll back that choice, unfortunately. It sort of makes sense in PromptData, since PromptOptions currently appears to be more about options for the AI model, but I think there was a case for putting it in either location.
Yeah, I wanted to alias jinja <==> nunjucks for users of the Python SDK. The other option would be to accept and treat both in the API, but that felt like it messed up the zod type and how the data got saved and read, so I ended up just aliasing in the SDK.
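A hedged sketch of the round-trip aliasing described here (helper names invented for illustration; the SDK's actual read-side code is `from_dict_deep` in the diff):

```python
# Illustrative round trip: the API stores "nunjucks", Python SDK users
# only ever see "jinja". Function names are invented for this sketch.
def from_api(d: dict) -> dict:
    d2 = dict(d)
    if d2.get("template_format") == "nunjucks":
        d2["template_format"] = "jinja"
    return d2

def to_api(d: dict) -> dict:
    d2 = dict(d)
    if d2.get("template_format") == "jinja":
        d2["template_format"] = "nunjucks"
    return d2

prompt = {"template_format": "nunjucks", "prompt": "Hi {{ input.name }}"}
sdk_view = from_api(prompt)
print(sdk_view["template_format"])          # jinja
print(to_api(sdk_view)["template_format"])  # nunjucks
```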
# Use the prompt's template_format
template_format = self.template_format
params = self.options.get("params") or {}
Yeah, if template_format were in the options we could grab it from here and avoid some of the state?
If I did that it would not match the API, as template_format is currently top-level on the prompt. Maybe that's fine in the Python SDK(?), but it wouldn't match the current modelling. When I looked (and maybe it was not the best modelling choice), "options" appeared to really be options for the actual model being used (i.e. which model, temperature for the model) and not options for prompt setup.
Holding off; we may have another plan for supporting advanced templating in other languages.