**advocacy_docs/edb-postgres-ai/ai-accelerator/installing/complete.mdx**
---
title: "Completing and verifying the extension installation"
navTitle: "Completing the installation"
description: "Completing and verifying the installation of the AI Database and File System extensions."
deepToC: true
---

### Installing the AI Database extension
```
CREATE EXTENSION
edb=#
```

#### Proxy settings

If you are using an HTTP proxy, you may need to set the `HTTP_PROXY` and `HTTPS_PROXY` environment variables in Postgres's environment. You can do this by adding the following lines to the `environment` file in the `main` directory of the Postgres data directory.
For example, on Ubuntu with Community Postgres, the `environment` file is located at `/etc/postgresql/16/main/environment`. Run the following commands to append the proxy settings to the `environment` file and restart the Postgres service:

```bash
echo "HTTP_PROXY = 'http://<yourproxysettings>/'" | sudo tee -a /etc/postgresql/16/main/environment
echo "HTTPS_PROXY = 'http://<yourproxysettings>/'" | sudo tee -a /etc/postgresql/16/main/environment
sudo systemctl restart postgresql@16-main
```
Replace `<yourproxysettings>` with your proxy settings. If you're using a different version of Postgres, replace `16` with your version number. Consult the documentation for your Postgres distribution for the location of the `environment` file.

Air-gapped environments aren't supported at this time.
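Before editing the real `environment` file, the append-and-verify pattern can be rehearsed against a throwaway file. This sketch uses a temporary file and a placeholder proxy URL, neither of which comes from this guide:

```shell
# Rehearse the append against a temporary file instead of the real
# /etc/postgresql/16/main/environment (proxy URL is a placeholder)
TMPENV="$(mktemp)"
echo "HTTP_PROXY = 'http://proxy.example.com:3128/'" | tee -a "$TMPENV" > /dev/null
echo "HTTPS_PROXY = 'http://proxy.example.com:3128/'" | tee -a "$TMPENV" > /dev/null
# Both proxy lines should now be present in the file
grep -c "_PROXY" "$TMPENV"
```

Once the pattern looks right, run the same `tee -a` commands against the actual `environment` file with `sudo`, as shown above.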
### Installing the File System extension

The File System extension provides a set of functions for interacting with the file system from within the database. Install it using the `CREATE EXTENSION` command.
**advocacy_docs/edb-postgres-ai/ai-accelerator/models/using-models.mdx**

Pipelines has a model registry that manages configured instances of models.
## Discover the preloaded models

Pipelines comes with a set of pre-created models that you can use out of the box.

To find them, run the following query:

```sql
SELECT * FROM aidb.list_models();
```

This returns a list of all the models that are currently configured in the system. If you haven't created any models, you'll see the default models that come with Pipelines.

```text
 name  | provider | options
...
 dummy | dummy    | {"config={}"}
```

The `bert`, `clip`, and `t5` models are pre-configured and ready to use. The `dummy` model is a placeholder model that you can use for testing purposes.

!!! note First use of local models
    The first time you use any of the local models, the model is downloaded from [HuggingFace](https://huggingface.co/). The model is then run locally. Subsequent uses of the model are faster, as the model is cached locally.

    If you use a proxy, ensure that it's [configured on the Postgres server](../installing/complete#proxy-settings).
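Because `aidb.list_models()` is an ordinary set-returning function, its output can be filtered like any table. The `WHERE` clause below is an illustrative sketch, not an example from the Pipelines docs:

```sql
-- Illustrative: list every configured model except the dummy placeholder
SELECT name, provider
FROM aidb.list_models()
WHERE provider <> 'dummy';
```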
- Install the [plpython3u](https://www.postgresql.org/docs/current/plpython.html) procedural language to run ldap2pg. Note that `plpython3u` is a potential security risk as it's an “untrusted” language.
**advocacy_docs/pg_extensions/pgrouting/configuring.mdx**

---
title: Configuring pgRouting
navTitle: Configuring
---

!!! Note
    pgRouting depends on PostGIS. To install PostGIS, see [EDB PostGIS](https://www.enterprisedb.com/docs/postgis/latest/).

To enable pgRouting, create the extension:

```sql
CREATE EXTENSION pgrouting;
```

For additional configuration information, see [Configuring pgRouting](https://docs.pgrouting.org/latest/en/pgRouting-installation.html#configuring) in the official pgRouting documentation.
**advocacy_docs/pg_extensions/pgrouting/installing.mdx**

---
title: Installing pgRouting
navTitle: Installing
---

pgRouting is supported on the same platforms as the Postgres distribution you're using. Support for pgRouting starts with Postgres 13. For details, see:

- [EDB Postgres Advanced Server Product Compatibility](https://www.enterprisedb.com/platform-compatibility#epas)
**advocacy_docs/pg_extensions/pgrouting/using.mdx**

---
title: Using pgRouting
navTitle: Using
---

For more information on using pgRouting, see [Getting Started](https://github.com/pgRouting/pgRouting?tab=readme-ov-file#getting-started) in the official pgRouting documentation.
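As a brief illustration of what a pgRouting query looks like, the widely documented `pgr_dijkstra` function computes a shortest path over an edge table. The `edges` table and its columns below are assumptions for the sketch, not objects this guide creates:

```sql
-- Sketch only: assumes an "edges" table with id, source, target, and cost columns
SELECT seq, node, edge, cost
FROM pgr_dijkstra(
    'SELECT id, source, target, cost FROM edges',
    1,      -- start vertex id
    5,      -- end vertex id
    false   -- treat the graph as undirected
);
```

See the pgRouting documentation for the full set of routing functions and their signatures.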
To enable `pgtt`, create the extension in the database:

```sql
CREATE EXTENSION pgtt;
```

For additional configuration information after enabling the extension, see the [pgtt official documentation](https://github.com/darold/pgtt?tab=readme-ov-file#configuration).

The `pgtt` documentation describes the latest version of `pgtt`, including minor releases and patches. These release notes cover what was new in each release. For new functionality introduced in a minor or patch release, there are also indicators in the content about the release that introduced the feature.

For more information on using `pgtt`, see [Use of the extension](https://github.com/darold/pgtt?tab=readme-ov-file#use-of-the-extension) in the `pgtt` official documentation.
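As a rough sketch of the kind of usage the pgtt README describes, a session loads the extension and declares a global temporary table. The table name and columns here are illustrative; check the exact syntax against the pgtt documentation:

```sql
-- Illustrative sketch; verify the exact syntax in the pgtt README
LOAD 'pgtt';

CREATE /*GLOBAL*/ TEMPORARY TABLE session_scratch (
    id   integer,
    note text
) ON COMMIT PRESERVE ROWS;
```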
**advocacy_docs/pg_extensions/query_advisor/installing.mdx**

---
title: Installing Query Advisor
navTitle: Installing
---

EDB Query Advisor is supported on the same platforms as the Postgres distribution you're using. Support for EDB Query Advisor starts with Postgres 13. For details, see:

- [EDB Postgres Advanced Server Product Compatibility](https://www.enterprisedb.com/platform-compatibility#epas)
| Enhancement | New columns are added to the output of the `query_advisor_index_recommendations` and `query_advisor_statistics_recommendations` functions. For more information, see [Using Query Advisor](../using.mdx). |
By default, only predicates filtering at least 1000 rows and 30% of the rows on average are considered. You can use the `min_filter` and `min_selectivity` parameters to override the defaults.

The function generates one- and two-column index candidates based on the predicates it collects. It replans all related workload queries in the presence of a hypothetical index for each candidate and recommends the list of indexes that bring the most value to the workload. It also shows the estimated index size, the estimated percentage cost reduction, the total cost, the absolute benefit, and the query IDs of the workload queries that benefit. Based on the size-to-benefit ratio, you can decide which indexes are the most useful for you.
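Assuming the recommendation function is called as an ordinary set-returning function, a session might look like the sketch below. The named-parameter notation is a guess based on the `min_filter` and `min_selectivity` parameters described above, not a confirmed signature; check the Query Advisor reference for the exact one:

```sql
-- Hypothetical invocation with the default thresholds made explicit
SELECT *
FROM query_advisor_index_recommendations(
    min_filter      => 1000,  -- only predicates filtering at least 1000 rows
    min_selectivity => 30     -- ...and at least 30% of the rows on average
);
```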
By default, `min_err_estimate_num` and `min_err_estimate_ratio` are set to `0`.

The function generates potential candidates from the multi-column filters of your queries. These candidates are then processed by exploring the different possible combinations. Currently, the focus is on statistics for two columns at a time.

It also shows the weight of each candidate. Weights are based on how many queries would benefit from the extended statistics and what the execution cost of those queries would be. It also shows the query IDs of the queries for which the recommendations are beneficial.