This repository was archived by the owner on Sep 25, 2025. It is now read-only.

Commit dec8238

Add 'how to use vscode env_vars' and changes to Decorator docs (#66)

1 parent 8ec8370 commit dec8238

File tree: 8 files changed (+55 −5 lines)


docs/_sidebar.md (1 addition, 0 deletions)

````diff
@@ -59,6 +59,7 @@
 - [Configure Service Connections](/how-tos/datacoves/how_to_service_connections.md)
 - [Manage Users](/how-tos/datacoves/how_to_manage_users.md)
 - [Update Repository](/getting-started/Admin/configure-repository.md)
+- [Configure VSCode Environment Variables](/how-tos/datacoves/how_to_environment_variables.md)
 - [Datahub](/how-tos/datahub/)
 - [Configure dbt metadata ingestion](/how-tos/datahub/how_to_datahub_dbt.md)
 - [Configure Snowflake metadata ingestion](/how-tos/datahub/how_to_datahub_snowflake.md)
````
Four binary image assets added (181 KB, 88.3 KB, 147 KB, and 45.6 KB); previews omitted.
docs/how-tos/datacoves/how_to_environment_variables.md (30 additions, 0 deletions)

````diff
@@ -0,0 +1,30 @@
+# How to use custom VSCode Environment Variables
+
+Even though you can create environment variables in your VSCode instance with the traditional `export` command, configuring your Environment with custom variables can be an important time-saver: they are not cleared when your Environment restarts (due to inactivity, maintenance, or changes to its settings).
+
+You can configure environment variables in 3 different places:
+
+- `Project Settings`: applies to the entire Project (and all of its Environments)
+- `Environment Settings`: applies to the desired Environment
+- `User Settings`: applies to the user's VSCode instance
+
+As the UI is almost the same on all 3 pages, we'll illustrate the process using the `Environment Settings` screen.
+
+To configure custom VSCode environment variables:
+
+- Navigate to `VS Code Environment Variables` inside your Environment settings. Once there, click `Add`.
+
+![VS Code Environment Variables](./assets/env_vars_1.png)
+
+- A pop-up will appear where you must specify the `Key` and `Value` for your environment variable. Once set, click `Add`.
+
+![Configure ENV VAR](./assets/env_vars_2.png)
+
+- You will be sent back to your Environment settings, where you should see the newly created environment variable. Once there, make sure to `Save Changes` to your Environment.
+
+![Save Changes](./assets/env_vars_3.png)
+
+That's all! You can now use this new persistent variable in your VSCode instance.
+
+![VSCode](./assets/env_vars_4.png)
````
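Once saved, the variable is injected into your VSCode environment and can be read by any process started from the integrated terminal. A minimal sketch (the variable name `MY_API_URL` and its value are hypothetical; we set it manually here only so the snippet is self-contained):

```python
import os

# In a real Datacoves VSCode session the platform injects the configured
# key/value at startup; we set a hypothetical value so this runs anywhere.
os.environ.setdefault("MY_API_URL", "https://api.example.com")

# Read it like any other environment variable:
print(os.environ["MY_API_URL"])  # https://api.example.com
```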

docs/reference/airflow/datacoves-decorators.md (23 additions, 5 deletions)

````diff
@@ -18,7 +18,7 @@ This custom decorator is an extension of Airflow's default @task decorator and s
 
 **Params:**
 
-- `env`: Pass in a dictionary of variables. eg) `"my_var": "{{ var.value.my_var }}"` Please use {{ var.value.my_var }} syntax to avoid parsing every 30 seconds.
+- `env`: Pass in a dictionary of variables, e.g. `"my_var": "{{ var.value.my_var }}"`. Please use the `{{ var.value.my_var }}` syntax to avoid parsing every 30 seconds.
 - `outlets`: Used to connect a task to an object in datahub or update a dataset
 - `append_env`: Add env vars to existing ones like `DATACOVES__DBT_HOME`
````

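The `env` parameter above is a plain Python dictionary. Writing values as Jinja templates such as `{{ var.value.my_var }}` defers Variable resolution to task runtime, instead of hitting the Airflow metadata database on every DAG parse (roughly every 30 seconds). A minimal sketch (the keys and the static path are hypothetical):

```python
# Hypothetical `env` mapping to pass to a Datacoves decorator's env= param.
# The Jinja template string is stored verbatim at parse time;
# Airflow renders it only when the task actually runs.
env = {
    "my_var": "{{ var.value.my_var }}",  # deferred Variable lookup
    "DATACOVES__DBT_HOME": "/config/workspace/transform",  # hypothetical static value
}

print(env["my_var"])  # {{ var.value.my_var }}
```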
````diff
@@ -43,8 +43,23 @@ This custom decorator is an extension of the @task decorator and simplifies runn
 - It runs dbt commands inside the dbt Project Root, not the Repository root.
 
 **Params:**
+
+The Datacoves dbt decorator supports all the [Datacoves dbt Operator params](/reference/airflow/datacoves-operator#datacoves-dbt-operator) plus:
+
 - `connection_id`: This is the [service connection](/how-tos/datacoves/how_to_service_connections.md) which is automatically added to airflow if you select `Airflow Connection` as the `Delivery Mode`.
-- `overrides`: Pass in a dictionary with override parameters such as warehouse, role, or database.
+
+**dbt profile generation:**
+
+With the `connection_id` mentioned above, we create a temporary dbt profile (it exists only at runtime inside the Airflow DAG's worker). By default, this dbt profile contains the selected Service Credential connection details.
+
+The dbt profile `name` is defined in either the Project or Environment settings, in their `Profile name` field. It can be overridden by passing a custom `DATACOVES__DBT_PROFILE` environment variable to the decorator.
+
+Users can also customize this dbt profile's connection details and/or target with the following params:
+
+- `overrides`: a dictionary with override parameters such as warehouse, role, database, etc.
+- `target`: the target name this temporary dbt profile will receive. Defaults to `default`.
+
+Basic example:
 
 ```python
 def my_dbt_dag():
````
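The temporary profile described above is generated at runtime; it might look roughly like the following `profiles.yml` sketch (all values are hypothetical, shown only to connect the settings to dbt's profile format):

```yaml
# Hypothetical rendering of the runtime-generated dbt profile.
my_profile_name:        # `Profile name` from Project/Environment settings,
                        # or the DATACOVES__DBT_PROFILE env var if provided
  target: testing       # from the `target` param; defaults to `default`
  outputs:
    testing:
      type: snowflake   # from the Service Credential connection details
      account: my_account
      user: svc_user
      role: transformer
      database: analytics
      warehouse: my_custom_wh   # value supplied via `overrides`
      schema: dbt_dev
```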
````diff
@@ -63,9 +78,12 @@ Example with overrides.
 def my_dbt_dag():
     @task.datacoves_dbt(
         connection_id="main",
-        overrides={"warehouse": "my_custom_wh"})
+        overrides={"warehouse": "my_custom_wh"},
+        env={"DATACOVES__DBT_PROFILE": "prod"},
+        target="testing"
+    )
     def dbt_test() -> str:
-        return "dbt debug"
+        return "dbt debug -t testing"  # Make sure to pass `-t {target}` if you are using a custom target name.
 
 dag = my_dbt_dag()
 ```
````
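Because the decorated function returns the dbt command string that gets executed, the `-t {target}` advice above is just string construction. A minimal, self-contained sketch (`with_target` is a hypothetical helper, not part of Datacoves):

```python
def with_target(base_command: str, target: str = "default") -> str:
    """Append `-t {target}` when a custom target name is used,
    matching the temporary profile generated by the decorator."""
    if target == "default":
        return base_command
    return f"{base_command} -t {target}"

print(with_target("dbt debug", target="testing"))  # dbt debug -t testing
```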
````diff
@@ -114,7 +132,7 @@ The new datacoves_dbt parameters are:
 
 - `db_type`: The data warehouse you are using. Currently supports `redshift` or `snowflake`.
 - `destination_schema`: The destination schema where the Airflow tables will end up. By default, the schema will be named as follows: `airflow-{datacoves environment slug}`, for example `airflow-qwe123`.
-- `connection_id`: The name of your Airflow [service connection](/how-tos/datacoves/how_to_service_connections.md) which is automatically added to airflow if you select `Airflow Connection` as the `Delivery Mode`.
+- `connection_id`: The name of your Airflow [service connection](/how-tos/datacoves/how_to_service_connections.md) which is automatically added to Airflow if you select `Airflow Connection` as the `Delivery Mode`.
 - `additional_tables`: A list of additional tables you would want to add to the default set.
 - `tables`: A list of tables to override the default ones from above. Warning: an empty list `[]` will perform a full-database sync.
````

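The default `destination_schema` naming above can be expressed as a one-line rule (a sketch; `default_destination_schema` is a hypothetical helper illustrating the documented `airflow-{datacoves environment slug}` pattern):

```python
def default_destination_schema(environment_slug: str) -> str:
    """Default schema name for synced Airflow tables:
    `airflow-{datacoves environment slug}`."""
    return f"airflow-{environment_slug}"

print(default_destination_schema("qwe123"))  # airflow-qwe123
```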
docs/reference/airflow/datacoves-operator.md (1 addition, 0 deletions)

````diff
@@ -73,6 +73,7 @@ Params:
 
 - `bash_command`: command to run
 - `project_dir` (optional): relative path from repo root to a specific dbt project.
+- `run_dbt_deps` (optional): boolean to force a `dbt deps` run.
 
 ```python
 import datetime
````
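The `run_dbt_deps` flag above forces a `dbt deps` run before your command. Conceptually it amounts to prefixing the command, sketched below (`effective_command` is a hypothetical helper; the operator's real implementation may differ):

```python
def effective_command(bash_command: str, run_dbt_deps: bool = False) -> str:
    """Prefix the task's command with `dbt deps` when requested,
    so dbt packages are installed before the main invocation."""
    if run_dbt_deps:
        return f"dbt deps && {bash_command}"
    return bash_command

print(effective_command("dbt run", run_dbt_deps=True))  # dbt deps && dbt run
```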

0 commit comments