
add declare fields action and docs updates #1483

Merged
merged 3 commits on Mar 19, 2025
Changes from all commits
4 changes: 2 additions & 2 deletions CHANGELOG.md
@@ -31,8 +31,8 @@ and this project aspires to adhere to [Semantic Versioning](https://semver.org/s
- Added `fields` option to the project 2d to support scalar rendering of specific fields.
- Added `dataset_bounds` option to the project 2d, which can be used instead of a full 3D camera specification
- Added support for triggers to execute actions from multiple files via an `actions_files` option that takes a list of actions files.
-- Added an `external_surfaces` transform filter, that can be used to reduce memory requriments in pipelines where you plan to only process the external faces of a data set.
+- Added an `external_surfaces` transform filter that can be used to reduce memory requirements in pipelines where you plan to only process the external faces of a data set.
+- Added a `declare_fields` action that allows users to explicitly list the fields to return for field filtering. This option avoids complex field parsing logic.

### Changed
- Changed the replay utility's binary names such that `replay_ser` is now `ascent_replay` and `replay_mpi` is now `ascent_replay_mpi`. This will help prevent potential name collisions with other tools that also have replay utilities.
60 changes: 25 additions & 35 deletions src/docs/sphinx/Actions/Scenes.rst
@@ -15,7 +15,7 @@ A scene defined in this way uses the default data source, which is all of the da

Default Images
--------------
-When creating a scene, Ascent will set up camera and color table defualts.
+When creating a scene, Ascent will set up camera and color table defaults.
The only requirement is that either an ``image_name`` or ``image_prefix``
be provided. Minimally, a scene consists of one plot and a parameter
to specify the output file name. Default images have a resolution
@@ -45,7 +45,7 @@ within the image prefix. Assuming the cycle is ``10``, here are some examples:

Image Name
^^^^^^^^^^
-The ``image_name`` parameter speficies the excact file name of the of the output
+The ``image_name`` parameter specifies the exact file name of the output
image, and Ascent will append the ``.png`` to the image file name. If not changed,
the image file will be overwritten.
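
For context, here is a minimal sketch of a scene that sets ``image_name`` inside an actions file; the plot type and field name are hypothetical placeholders:

.. code-block:: yaml

    -
      action: "add_scenes"
      scenes:
        s1:
          plots:
            p1:
              type: "pseudocolor"
              field: "energy"
          image_name: "my_output_image"

With this configuration, Ascent would write ``my_output_image.png`` and overwrite it on each execution unless the name is changed.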

@@ -58,13 +58,11 @@ Plots optionally consume the result of a pipeline, but if none is specified, the
Each scene can contain one or more plots.
The plot interface is simply:

-.. code-block:: json
+.. code-block:: yaml

-    {
-      "type" : "plot_name",
-      "pipeline" : "pipeline_name",
-      "field" : "field_name"
-    }
+    type: "plot_name"
+    pipeline: "pipeline_name"
+    field: "field_name"

In C++, the equivalent declarations would be as follows:

@@ -384,33 +382,25 @@ Now we add a second render to the same example using every available parameter:
scenes["s1/renders/r2/camera/near_plane"] = 0.1;
scenes["s1/renders/r2/camera/far_plane"] = 33.1;

-.. code-block:: json
+.. code-block:: yaml

-    {
-      "renders":
-      {
-        "r1":
-        {
-          "image_width": 300,
-          "image_height": 400,
-          "image_name": "some_image",
-          "camera":
-          {
-            "look_at": [1.0, 1.0, 1.0],
-            "position": [0.0, 25.0, 15.0],
-            "up": [0.0, -1.0, 0.0],
-            "fov": 60.0,
-            "xpan": 0.0,
-            "ypan": 0.0,
-            "elevation": 10.0,
-            "azimuth": -10.0,
-            "zoom": 0.0,
-            "near_plane": 0.1,
-            "far_plane": 100.1
-          }
-        }
-      }
-    }
+    renders:
+      r1:
+        image_width: 300
+        image_height: 400
+        image_name: "some_image"
+        camera:
+          look_at: [1.0, 1.0, 1.0]
+          position: [0.0, 25.0, 15.0]
+          up: [0.0, -1.0, 0.0]
+          fov: 60.0
+          xpan: 0.0
+          ypan: 0.0
+          elevation: 10.0
+          azimuth: -10.0
+          zoom: 0.0
+          near_plane: 0.1
+          far_plane: 100.1


Additional Render Options
65 changes: 33 additions & 32 deletions src/docs/sphinx/AscentAPI.rst
@@ -21,16 +21,16 @@ open
Open provides the initial setup of Ascent from a Conduit Node.
Options include runtime type (e.g., ascent, flow, or empty) and associated backend if available.
If running in parallel (i.e., MPI), then an MPI comm handle must be supplied.
-Ascent will always check the file system for a file called ``ascent_options.json`` that will override compiled in options, and for obvious reasons, a MPI communicator cannot be specified in the file.
+Ascent will always check the file system for a file called ``ascent_options.yaml`` that will override compiled-in options; for obvious reasons, an MPI communicator cannot be specified in the file.
Here is a file that would set the runtime to the main ascent runtime using an OpenMP backend (inside VTK-m):


-.. code-block:: json
+.. code-block:: yaml

-    {
-      "runtime/type" : "ascent",
-      "runtime/vtkm/backend" : "openmp"
-    }
+    runtime:
+      type: "ascent"
+      vtkm:
+        backend: "openmp"

Example Options
"""""""""""""""
@@ -53,19 +53,18 @@ A typical integration will include the following code:
Default Directory
"""""""""""""""""
By default, Ascent will output files in the current working directory.
-This can be overrided by specifying the ``default_dir``. This directory
+This can be overridden by specifying the ``default_dir``. This directory
must be a valid directory, i.e., Ascent will not create this directory for
you. Many Ascent filters have parameters that specify output files, and Ascent
-will only place files that do not have an absolue path specified.
+will only place files that do not have an absolute path specified.
For example, ``my_image`` would be written to the default directory, but
``/some/other/path/my_image`` would be written in the directory
``/some/other/path/``.

-.. code-block:: json
+.. code-block:: yaml

-    {
-      "default_dir" : "/path/to/output/dir"
-    }
+    default_dir: "/path/to/output/dir"

High Order Mesh Refinement
""""""""""""""""""""""""""
@@ -76,13 +75,12 @@ high-order elements are discretized into many linear elements. The minimum value
is ``1``, i.e., no refinement. There is a memory-accuracy trade-off when using refinement.
The higher the value,
the more accurate the low-order representation is, but more discretization means more memory
-usage and more time tp process the additional elements.
+usage and more time to process the additional elements.

-.. code-block:: json
+.. code-block:: yaml

-    {
-      "refinement_level" : 4
-    }
+    refinement_level: 4

Runtime Options
"""""""""""""""
@@ -125,7 +123,7 @@ There are often warnings and other information that can indicate potential issue
- ``catch`` Catches conduit::Error exceptions at the Ascent interface and prints info about the error to standard out.
This provides an easy way to prevent host program crashes when something goes wrong in Ascent.
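
As a sketch, these error-handling settings would appear in the options file as follows, assuming the option names ``messages`` and ``exceptions``:

.. code-block:: yaml

    messages: "verbose"
    exceptions: "catch"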

-By default, Ascent looks for a file called ``ascent_actions.json`` that can append additional actions at runtime.
+By default, Ascent looks for a file called ``ascent_actions.yaml`` that can append additional actions at runtime.
This default file name can be overridden in the Ascent options:

.. code-block:: c++
@@ -147,11 +145,10 @@ Filter Timings
Ascent has internal timings for filters. The timings output is one csv file
per MPI rank.

-.. code-block:: json
+.. code-block:: yaml

-    {
-      "timings" : "true"
-    }
+    timings: "true"


Field Filtering
@@ -162,19 +159,23 @@ publish 100s of variables to Ascent. In this case, it's undesirable to
use all fields when the actions only need a single variable. This reduces
the memory overhead Ascent uses.


.. code-block:: yaml

    field_filtering: "true"


Field filtering scans the user's actions to identify what fields are required,
only passing the required fields into Ascent. However, there are several
-actions where the required fields cannot be resolved. For example, saving simulation
-data to the file system saves all fields, and in this case, it is not possible to resolve
-the required fields. If field filtering encounters this case, then an error is generated.
-Alternatively, if the actions specify which fields to save, then this field filtering
-can resolve the fields.
+actions where the required fields cannot be resolved. To support field filtering
+for all cases, we added the ``declare_fields`` action, which allows a user
+to explicitly control the list of active fields.

.. code-block:: json
.. code-block:: yaml

{
"field_filtering" : "true"
}
-
action: "declare_fields"
fields: ["my_field", "my_other_field", ...]
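
For a fuller picture, here is a hedged sketch of an actions list that pairs ``declare_fields`` with an extract whose required fields could not otherwise be resolved; the field names and extract parameters are illustrative:

.. code-block:: yaml

    -
      action: "declare_fields"
      fields: ["density", "pressure"]
    -
      action: "add_extracts"
      extracts:
        e1:
          type: "relay"
          params:
            path: "my_output"
            protocol: "blueprint/mesh/hdf5"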



@@ -263,7 +264,7 @@ Here is a simple example of adding a plot using the C++ API:

info
----
-Info populates a conduit Node with infomation about Ascent including runtime execution and outputted results.
+Info populates a conduit Node with information about Ascent, including runtime execution and outputted results.
This information can be used to return data back to the simulation and for debugging purposes.

.. code-block:: c++
23 changes: 10 additions & 13 deletions src/docs/sphinx/developer_docs/Flow_Filter.rst
@@ -103,18 +103,15 @@ interface looks like this in C++:
filter["params/double_param"] = 2.0;


-or equivalently in json:
+or equivalently in yaml:

-.. code-block:: json
+.. code-block:: yaml

-    {
-      "type" : "filter_name",
-      "params":
-      {
-        "string_param" : "string",
-        "double_param" : 2.0
-      }
-    }
+    type: "filter_name"
+    params:
+      string_param: "string"
+      double_param: 2.0

The Ascent runtime looks for the ``params`` node and passes it to the filter
upon creation. Parameters are verified when the filter is created during execution.
@@ -288,17 +285,17 @@ is where all builtin filters are registered. Following the NoOp example:

Filter registration is templated on the filter type and takes two arguments.

-* arg1: the type of the fitler. Valid values are ``transforms`` and ``extracts``
+* arg1: the type of the filter. Valid values are ``transforms`` and ``extracts``
* arg2: the front-facing API name of the filter. This is what a user would declare in an actions file.

Accessing Metadata
------------------
-We currently populate a limited set of metadata that is accessable to flow filters.
+We currently populate a limited set of metadata that is accessible to flow filters.
We place a Conduit node containing the metadata inside the registry which can be
accessed in the following manner:

.. code-block:: c++
-   :caption: Accessing the regsitry metadata inside a flow filter
+   :caption: Accessing the registry metadata inside a flow filter

conduit::Node * meta = graph().workspace().registry().fetch<Node>("metadata");
int cycle = -1;
5 changes: 5 additions & 0 deletions src/libs/ascent/runtimes/ascent_main_runtime.cpp
@@ -1946,6 +1946,11 @@ AscentRuntime::BuildGraph(const conduit::Node &actions)
// the workspace executes.
m_save_info_actions.append() = action;
}
else if(action_name == "declare_fields")
{
// Used with field filtering; nothing needs to be
// processed as part of execution
}
else if(action_name == "open_log")
{
// Open Ascent Logging Stream
25 changes: 25 additions & 0 deletions src/libs/ascent/utils/ascent_actions_utils.cpp
@@ -174,11 +174,36 @@ void filter_fields(const conduit::Node &node,
std::set<std::string> &fields,
conduit::Node &info)
{

const int num_children = node.number_of_children();
const std::vector<std::string> names = node.child_names();
for(int i = 0; i < num_children; ++i)
{
const conduit::Node &child = node.child(i);

// to avoid complex parsing, we added a shortcut case
// action: `declare_fields`
// fields: ["field1", "field2", .... "fieldN-1"]

if(child.has_child("action") && child.has_child("fields"))
{
if(child["action"].as_string() == "declare_fields")
{
const conduit::Node &fields_list = child["fields"];
// iterate the declared fields list itself, not the parent node,
// so only the list entries are collected
const int num_entries = fields_list.number_of_children();
for(int e = 0; e < num_entries; e++)
{
const conduit::Node &item = fields_list.child(e);
if(item.dtype().is_string())
{
fields.insert(item.as_string());
}
} // for list entries
// early return, user needs to provide a definitive list
return;
}
}

bool is_leaf = child.number_of_children() == 0;
if(is_leaf)
{