Even though you can create environment variables in your VSCode instance using the traditional `export` command, having your Environment configured with custom variables can be an important time-saver: these are not cleared when your Environment restarts (due to inactivity, maintenance, or changes in its settings).
You can configure environment variables in three different places:
- `Project Settings`: applies to the entire Project (and all of its Environments)
- `Environment Settings`: applies to the desired Environment
- `User Settings`: applies to the user's VSCode instance.
As the UI is almost the same on all three pages, we'll illustrate the process using the `Environment Settings` screen.
To configure custom VSCode environment variables:
- Navigate to `VS Code Environment Variables` inside your Environment settings. Once there, click `Add`. (A quick way to verify the new variable is shown below.)
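Once saved, the variable is available to any process launched inside the Environment. As a quick sanity check, a script run from the VSCode terminal should be able to read it; `MY_VAR` below is a hypothetical name used only for illustration:

```python
import os

# "MY_VAR" is a hypothetical variable name; substitute the one you configured.
value = os.environ.get("MY_VAR")

if value is None:
    print("MY_VAR is not set; check your Environment settings and reload VSCode.")
else:
    print(f"MY_VAR = {value}")
```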
From `docs/reference/airflow/datacoves-decorators.md`:

This custom decorator is an extension of Airflow's default `@task` decorator.
**Params:**
- `env`: Pass in a dictionary of variables, e.g. `"my_var": "{{ var.value.my_var }}"`. Please use the `{{ var.value.my_var }}` syntax to avoid parsing every 30 seconds.
- `outlets`: Used to connect a task to an object in DataHub or to update a dataset
- `append_env`: Add env vars to existing ones like `DATACOVES__DBT_HOME` (all three params appear in the sketch below)
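A minimal sketch of these params used together. The decorator name `datacoves_bash`, the DAG scaffolding, and the `Dataset` URI are assumptions made for illustration, not confirmed by this page:

```python
from datetime import datetime

from airflow import Dataset
from airflow.decorators import dag, task

@dag(schedule=None, start_date=datetime(2024, 1, 1), catchup=False)  # assumed minimal DAG config
def my_env_dag():
    @task.datacoves_bash(  # decorator name assumed from the params documented above
        env={"my_var": "{{ var.value.my_var }}"},  # templated to avoid parsing every 30 seconds
        append_env=True,  # keep existing vars such as DATACOVES__DBT_HOME
        outlets=[Dataset("snowflake://my_db/my_schema/my_table")],  # hypothetical dataset URI
    )
    def echo_var() -> str:
        return "echo $my_var"

    echo_var()

dag = my_env_dag()
```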
This custom decorator is an extension of the `@task` decorator and simplifies running dbt commands.
- It runs dbt commands inside the dbt Project Root, not the Repository root.
**Params:**
The Datacoves dbt decorator supports all the [Datacoves dbt Operator params](/reference/airflow/datacoves-operator#datacoves-dbt-operator) plus:
- `connection_id`: This is the [service connection](/how-tos/datacoves/how_to_service_connections.md) that is automatically added to Airflow if you select `Airflow Connection` as the `Delivery Mode`.
**dbt profile generation:**
With the `connection_id` mentioned above, we create a temporary dbt profile (it only exists at runtime inside the Airflow DAG's worker). By default, this dbt profile contains the selected Service Credential connection details.
The dbt profile `name` is defined in either the Project or Environment settings, in their `Profile name` field. It can be overridden by passing a custom `DATACOVES__DBT_PROFILE` environment variable to the decorator.
Users can also customize this dbt profile's connection details and/or target with the following params:
- `overrides`: a dictionary with override parameters such as warehouse, role, database, etc.
- `target`: the target name this temporary dbt profile will receive. Defaults to `default`.
Basic example:
```python
# assumes: from airflow.decorators import task (imports omitted in this excerpt)
def my_dbt_dag():
    @task.datacoves_dbt(connection_id="main")
    def dbt_test() -> str:
        return "dbt debug"

    dbt_test()  # invoke the task so it is registered in the DAG

dag = my_dbt_dag()
```
Example with overrides:

```python
def my_dbt_dag():
    @task.datacoves_dbt(
        connection_id="main",
        overrides={"warehouse": "my_custom_wh"},
        env={"DATACOVES__DBT_PROFILE": "prod"},
        target="testing"
    )
    def dbt_test() -> str:
        return "dbt debug -t testing"  # Make sure to pass `-t {target}` if you are using a custom target name.

    dbt_test()  # invoke the task so it is registered in the DAG

dag = my_dbt_dag()
```
The new datacoves_dbt parameters are:
- `db_type`: The data warehouse you are using. Currently supports `redshift` or `snowflake`.
- `destination_schema`: The destination schema where the Airflow tables will end up. By default, the schema is named `airflow-{datacoves environment slug}`, for example `airflow-qwe123`.
- `connection_id`: The name of your Airflow [service connection](/how-tos/datacoves/how_to_service_connections.md), which is automatically added to Airflow if you select `Airflow Connection` as the `Delivery Mode`.
- `additional_tables`: A list of additional tables you want to add to the default set.
- `tables`: A list of tables to override the default set above. Warning: an empty list `[]` will perform a full-database sync. (A hedged usage sketch follows below.)
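To make the parameter list concrete, here is a hypothetical sketch. The lead-in above calls these datacoves_dbt parameters, so the sketch reuses `@task.datacoves_dbt`; the DAG scaffolding, the returned command, and every value are illustrative assumptions, so consult the operator reference linked earlier for the exact signature:

```python
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule=None, start_date=datetime(2024, 1, 1), catchup=False)  # assumed minimal DAG config
def airflow_db_sync_dag():
    @task.datacoves_dbt(  # parameter names from the list above; all values are hypothetical
        connection_id="main",
        db_type="snowflake",                  # or "redshift"
        destination_schema="airflow-qwe123",  # defaults to airflow-{environment slug}
        additional_tables=["rendered_task_instance_fields"],  # hypothetical extra table
    )
    def sync_airflow_db() -> str:
        return "dbt debug"  # placeholder command; the sync is driven by the params above

    sync_airflow_db()

dag = airflow_db_sync_dag()
```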