Merged
4 changes: 2 additions & 2 deletions docs/data-ai/capture-data/conditional-sync.md
@@ -126,7 +126,7 @@ In this example we will continue to use [`sync-at-time:timesyncsensor`](https://
You will need to follow the same steps with your module:

{{< table >}}
-{{% tablestep number=1 %}}
+{{% tablestep start=1 %}}
**Add the sensor to your machine**

On your machine's **CONFIGURE** page, click the **+** button next to your machine part in the left menu.
@@ -138,7 +138,7 @@ Click **Add module**, then enter a name or use the suggested name for your senso

<!-- markdownlint-disable-file MD034 -->

-{{% tablestep number=2 %}}
+{{% tablestep %}}
**Configure your time frame**

Go to the new component panel and copy and paste the following attribute template into your sensor’s attributes field:
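The template itself is not shown in this diff; as an illustrative sketch only (the `start`, `end`, and `zone` field names are assumptions based on common time-sync sensor modules, not copied from this module), the attributes typically define a sync window and time zone:

```json
{
  "start": "10:00:00",
  "end": "18:00:00",
  "zone": "UTC"
}
```

Copy the actual template from the module's README rather than this sketch.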
16 changes: 8 additions & 8 deletions docs/data-ai/capture-data/filter-before-sync.md
@@ -16,20 +16,20 @@ Contributors have written several filtering {{< glossary_tooltip term_id="module
The following steps use the [`filtered_camera`](https://app.viam.com/module/viam/filtered-camera) module:

{{< table >}}
-{{% tablestep number=1 %}}
+{{% tablestep start=1 %}}
**Add an ML model service to your machine**

Add an ML model service on your machine that is compatible with the ML model you want to use, for example [TFLite CPU](https://github.com/viam-modules/mlmodel-tflite).

{{% /tablestep %}}
-{{% tablestep number=2 %}}
+{{% tablestep %}}
**Select a suitable ML model**

Click **Select model** on the ML model service configuration panel, then select an [existing model](https://app.viam.com/registry?type=ML+Model) you want to use, or click **Upload a new model** to upload your own.
If you're not sure which model to use, you can use [`EfficientDet-COCO`](https://app.viam.com/ml-model/viam-labs/EfficientDet-COCO) from the **Registry**, which can detect people and animals, among other things.

{{% /tablestep %}}
-{{% tablestep number=3 %}}
+{{% tablestep %}}
**Add a vision service to use with the ML model**

You can think of the vision service as the bridge between the ML model service and the output from your camera.
@@ -38,7 +38,7 @@ Add and configure the `vision / ML model` service on your machine.
From the **Select model** dropdown, select the name of your ML model service (for example, `mlmodel-1`).

{{% /tablestep %}}
-{{% tablestep number=4 %}}
+{{% tablestep %}}
**Configure the filtered camera**

The `filtered-camera` {{< glossary_tooltip term_id="modular-resource" text="modular component" >}} pulls the stream of images from the camera you configured earlier, and applies the vision service to it.
@@ -68,7 +68,7 @@ Additionally, you can also add a buffer window with `window_seconds` which contr
If you were to set `window_seconds` to `3`, the camera would also capture and sync images from the 3 seconds before a person appeared in the camera stream.
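For illustration only, a filtered camera configured this way might use attributes along the following lines. The attribute names here are assumptions based on the module's documented pattern, not verbatim from this PR; consult the `filtered-camera` module README for the authoritative schema:

```json
{
  "camera": "camera-1",
  "vision": "vision-1",
  "window_seconds": 3,
  "classifications": {
    "person": 0.8
  }
}
```

In this hypothetical configuration, images sync only when the vision service detects a person with at least 0.8 confidence, plus the 3 seconds of footage preceding the detection.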

{{% /tablestep %}}
-{{% tablestep number=5 %}}
+{{% tablestep %}}
**Configure data capture and sync on the filtered camera**

Configure data capture and sync on the filtered camera just as you did before for the physical camera.
@@ -77,14 +77,14 @@ The filtered camera will only capture image data that passes the filters you con
Turn off data capture on your original camera if you haven't already, so that you don't capture duplicate or unfiltered images.

{{% /tablestep %}}
-{{% tablestep number=6 %}}
+{{% tablestep %}}
**Save to start capturing**

Save the config.
With cloud sync enabled, captured data is automatically uploaded to Viam after a short delay.

{{% /tablestep %}}
-{{% tablestep number=7 %}}
+{{% tablestep %}}
**View filtered data on Viam**

Once you save your configuration, place something that your ML model is trained to recognize within view of your camera.
@@ -96,7 +96,7 @@ If no data appears after the sync interval, check the [**Logs**](/manage/troubl
You can test the vision service from the [**CONTROL** tab](/manage/troubleshoot/teleoperate/default-interface/) to see its classifications and detections live.

{{% /tablestep %}}
-{{% tablestep number=8 %}}
+{{% tablestep %}}
**(Optional) Trigger sync with custom logic**

By default, the captured data syncs at the regular interval you specified in the data capture config.
6 changes: 3 additions & 3 deletions docs/data-ai/data/export.md
@@ -40,15 +40,15 @@ Then authenticate your CLI session with Viam using one of the following options:
To export your data from the cloud using the Viam CLI:

{{< table >}}
-{{% tablestep number=1 %}}
+{{% tablestep start=1 %}}
**Filter the data you want to download**

Navigate to the [**DATA**](https://app.viam.com/data/view) page.

Use the filters on the left side of the page to filter only the data you wish to export.

{{% /tablestep %}}
-{{% tablestep number=2 %}}
+{{% tablestep %}}
**Copy the export command from the DATA page**

In the upper right corner of the **DATA** page, click the **Export** button.
@@ -57,7 +57,7 @@ Click **Copy export command**.
This copies the command, including your org ID and the filters you selected, to your clipboard.

{{% /tablestep %}}
-{{% tablestep number=3 %}}
+{{% tablestep %}}
**Run the command**

Run the copied command in a terminal:
10 changes: 5 additions & 5 deletions docs/data-ai/data/query.md
@@ -42,14 +42,14 @@ You must have the [owner role](/manage/manage/rbac/) in order to query data on V
{{% tab name="Web UI" %}}

{{< table >}}
-{{% tablestep number=1 %}}
+{{% tablestep start=1 %}}
**Query with SQL or MQL**

Navigate to the [**Query** page](https://app.viam.com/data/query).
Then, select either **SQL** or **MQL** from the **Query mode** dropdown menu on the right-hand side.

{{% /tablestep %}}
-{{% tablestep number=2 %}}
+{{% tablestep %}}
**Run your query**

This example query returns the last 5 readings from any components named `my-sensor` in your organization:
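The full query is elided in this diff; a hedged SQL sketch of such a query, assuming the `readings` table and `component_name` column that appear elsewhere on this page, would look like:

```sql
SELECT * FROM readings
WHERE component_name = 'my-sensor'
LIMIT 5
```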
@@ -203,7 +203,7 @@ WHERE component_name = 'sensor-1'
{{% /expand %}}

{{% /tablestep %}}
-{{% tablestep number=3 %}}
+{{% tablestep %}}
**Review results**

Click **Run query** when ready to perform your query and get matching results.
@@ -299,7 +299,7 @@ If you want to query data from third party tools, you have to configure data que
You can use third-party tools, such as the [`mongosh` shell](https://www.mongodb.com/docs/mongodb-shell/) or [MongoDB Compass](https://www.mongodb.com/docs/compass/current/), to query captured sensor data.

{{< table >}}
-{{% tablestep number=1 %}}
+{{% tablestep start=1 %}}
**Connect to your Viam organization's data**

Run the following command to connect to your Viam organization's MongoDB Atlas instance from `mongosh` using the connection URI you obtained during query configuration:
@@ -309,7 +309,7 @@ mongosh "mongodb://db-user-abcd1e2f-a1b2-3c45-de6f-ab123456c123:YOUR-PASSWORD-HE

{{% /tablestep %}}
-{{% tablestep number=2 %}}
+{{% tablestep %}}
**Query data from a compatible client**

Once connected, you can run SQL or MQL statements to query captured data directly.
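As a hedged sketch of an MQL query from `mongosh` (the `sensorData` database and `readings` collection names are assumptions; check what `show dbs` reports on your connection):

```javascript
// Switch to the federated sensor database (name assumed)
db = db.getSiblingDB("sensorData");
// Find recent readings from a hypothetical component named "my-sensor"
db.readings.find({ component_name: "my-sensor" }).limit(5);
```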
14 changes: 7 additions & 7 deletions docs/data-ai/data/visualize.md
@@ -205,14 +205,14 @@ Select a tab below to learn how to configure your visualization tool for use wit
#### Grafana

{{< table >}}
-{{% tablestep number=1 %}}
+{{% tablestep start=1 %}}
**Choose Grafana instance**

[Install](https://grafana.com/docs/grafana/latest/setup-grafana/installation/) or set up Grafana.
You can use either a local instance of Grafana Enterprise or Grafana Cloud; the free trial of Grafana Cloud works for this purpose.

{{% /tablestep %}}
-{{% tablestep number=2 %}}
+{{% tablestep %}}
**Install connector to MongoDB data source**

Navigate to your Grafana web UI.
@@ -223,7 +223,7 @@ Go to **Connections > Add new connection** and add the [Grafana MongoDB data sou
Install the datasource plugin.

{{% /tablestep %}}
-{{% tablestep number=3 %}}
+{{% tablestep %}}
**Configure a data connection**

Navigate to the Grafana MongoDB data source that you just installed.
@@ -249,7 +249,7 @@ Enter the following information in the configuration UI for the plugin:
{{<imgproc src="/tutorials/visualize-data-grafana/configure-grafana-mongodb-datasource.png" resize="800x" declaredimensions=true alt="The Grafana data source plugin configuration page, showing the connection string and username filled in with the configuration determined from the previous steps" class="shadow" >}}

{{< /tablestep >}}
-{{% tablestep number=4 %}}
+{{% tablestep %}}
**Use Grafana for dashboards**

With your data connection established, you can then build dashboards that provide insight into your data.
@@ -282,7 +282,7 @@ See the [guide on querying data](/data-ai/data/query/) for more information.
#### Other visualization tools

{{< table >}}
-{{% tablestep number=1 %}}
+{{% tablestep start=1 %}}
**Install connector to MongoDB data source**

Some visualization clients are able to connect to the Viam MongoDB Atlas Data Federation instance natively, while others require that you install and configure an additional plugin or connector.
@@ -291,7 +291,7 @@ For example, Tableau requires both the [Atlas SQL JDBC Driver](https://www.mongo
Check with the documentation for your third-party visualization tool to be sure you have the required additional software installed to connect to a MongoDB Atlas Data Federation instance.

{{% /tablestep %}}
-{{% tablestep number=2 %}}
+{{% tablestep %}}
**Configure a data connection**

Most third-party visualization tools require the _connection URI_ (also called the connection string) to that database server, and the _credentials_ to authenticate to that server in order to visualize your data.
@@ -344,7 +344,7 @@ Substitute your organization ID for `<YOUR-ORG-ID>`.
{{% /tabs %}}

{{% /tablestep %}}
-{{% tablestep number=3 %}}
+{{% tablestep %}}
**Use visualization tools for dashboards**

Some third-party visualization tools support the ability to directly query your data within their platform to generate more granular visualizations of specific data.
12 changes: 6 additions & 6 deletions docs/data-ai/train/train-tf-tflite.md
@@ -54,13 +54,13 @@ Follow the guide to [create a dataset](/data-ai/train/create-dataset/).
Now that you have a dataset that contains your labeled images, you are ready to train a machine learning model.

{{< table >}}
-{{% tablestep number=1 %}}
+{{% tablestep start=1 %}}
**Find your training dataset**

Navigate to your list of [**DATASETS**](https://app.viam.com/data/datasets) and select the one you want to train on.

{{% /tablestep %}}
-{{% tablestep number=2 %}}
+{{% tablestep %}}
**Train an ML model**

Click **Train model** and follow the prompts.
@@ -71,7 +71,7 @@ You can train a TFLite model using **Built-in training**.
Click **Next steps**.

{{% /tablestep %}}
-{{% tablestep number=3 %}}
+{{% tablestep %}}
**Select model type**

Select between:
@@ -83,7 +83,7 @@ Select between:
| **TensorFlow (TF)** | Best for general-purpose tasks with more computational power. |

{{% /tablestep %}}
-{{% tablestep number=4 %}}
+{{% tablestep %}}
**Fill in the details for your ML model**

Enter a name for your new model.
@@ -102,7 +102,7 @@ Click **Train model**.
{{< imgproc src="/tutorials/data-management/train-model.png" alt="The data tab showing the train a model pane" style="width:500px" resize="1200x" class="imgzoom fill shadow" >}}

{{% /tablestep %}}
-{{% tablestep number=5 %}}
+{{% tablestep %}}
**Wait for your model to train**

The model now starts training and you can follow its progress on the [**TRAINING** tab](https://app.viam.com/training).
@@ -112,7 +112,7 @@ Once the model has finished training, it becomes visible on the [**MODELS** tab]
You will receive an email when your model finishes training.

{{% /tablestep %}}
-{{% tablestep number=6 %}}
+{{% tablestep %}}
**Debug your training job**

From the [**TRAINING** tab](https://app.viam.com/training), click on your training job's ID to see its logs.