**docs/concepts/state.md** (+6 −5)
````diff
@@ -92,7 +92,7 @@ The state file is a simple `json` file that looks like:
 You can export a specific environment like so:

 ```sh
-$ sqlmesh state export --environment my_dev -o my_dev_state.json
+sqlmesh state export --environment my_dev -o my_dev_state.json
 ```

 Note that every snapshot that is part of the environment will be exported, not just the differences from `prod`. The reason for this is so that the environment can be fully imported elsewhere without any assumptions about which snapshots are already present in state.
````
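Since the export is a self-contained JSON document, it can be sanity-checked before being moved to another machine for import. A minimal sketch, assuming only that the export parses as JSON (the `snapshots` key in the stand-in file below is hypothetical, used purely so the demo has something to read):

```python
import json
import pathlib

def check_state_export(path):
    """Verify an exported state file parses as JSON before importing it elsewhere."""
    data = json.loads(pathlib.Path(path).read_text())
    return type(data).__name__

# Stand-in file; a real one comes from `sqlmesh state export` as shown above.
pathlib.Path("my_dev_state.json").write_text(json.dumps({"snapshots": {}}))
print(check_state_export("my_dev_state.json"))  # dict
```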
````diff
@@ -102,7 +102,7 @@ Note that every snapshot that is part of the environment will be exported, not j
 You can export local state like so:

 ```bash
-$ sqlmesh state export --local -o local_state.json
+sqlmesh state export --local -o local_state.json
 ```

 This essentially just exports the state of the local context which includes local changes that have not been applied to any virtual data environments.
````
````diff
@@ -174,10 +174,11 @@ If your project has [multiple gateways](../guides/configuration.md#gateways) wit

 ```bash
 # state export
-$ sqlmesh --gateway <gateway> state export -o state.json
-
+sqlmesh --gateway <gateway> state export -o state.json
+```
+```bash
 # state import
-$ sqlmesh --gateway <gateway> state import -i state.json
+sqlmesh --gateway <gateway> state import -i state.json
````
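When migrating state across several gateways, the export/import pair repeats once per gateway. A hedged sketch that merely assembles the command strings shown above for each gateway name (the gateway name is a placeholder, and this helper is illustrative, not part of SQLMesh):

```python
def migration_commands(gateway, path="state.json"):
    # Build the export/import command pair from the docs above for one gateway.
    return [
        f"sqlmesh --gateway {gateway} state export -o {path}",
        f"sqlmesh --gateway {gateway} state import -i {path}",
    ]

for cmd in migration_commands("my_gateway"):
    print(cmd)
```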
**docs/guides/configuration.md** (+2 −2)
````diff
@@ -269,7 +269,7 @@ gateways:
 We can override the `dummy_pw` value with the true password `real_pw` by creating the environment variable. This example demonstrates creating the variable with the bash `export` function:
 After the initial string `SQLMESH__`, the environment variable name components move down the key hierarchy in the YAML specification: `GATEWAYS` --> `MY_GATEWAY` --> `CONNECTION` --> `PASSWORD`.
@@ -1492,7 +1492,7 @@ Example enabling debug mode for the CLI command `sqlmesh plan`:
````
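The `SQLMESH__` naming scheme described above can be illustrated with a small helper. This is not SQLMesh's actual parsing code, just a sketch of the documented convention that double underscores separate levels of the YAML key hierarchy:

```python
def env_var_to_key_path(name, prefix="SQLMESH__"):
    """Map an override variable name to the YAML key path it targets."""
    if not name.startswith(prefix):
        raise ValueError(f"expected {prefix} prefix: {name}")
    # Split on double underscores only, so names like MY_GATEWAY stay intact.
    return [part.lower() for part in name[len(prefix):].split("__")]

print(env_var_to_key_path("SQLMESH__GATEWAYS__MY_GATEWAY__CONNECTION__PASSWORD"))
# ['gateways', 'my_gateway', 'connection', 'password']
```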
**docs/integrations/dbt.md** (+6 −6)
````diff
@@ -19,19 +19,19 @@ Therefore, SQLMesh is packaged with multiple "extras," which you may optionally
 At minimum, using the SQLMesh dbt adapter requires installing the dbt extra:

 ```bash
->pip install "sqlmesh[dbt]"
+pip install "sqlmesh[dbt]"
 ```

 If your project uses any SQL execution engine other than DuckDB, you must install the extra for that engine. For example, if your project runs on the Postgres SQL engine:

 ```bash
->pip install "sqlmesh[dbt,postgres]"
+pip install "sqlmesh[dbt,postgres]"
 ```

 If you would like to use the [SQLMesh Browser UI](../guides/ui.md) to view column-level lineage, include the `web` extra:

 ```bash
->pip install "sqlmesh[dbt,web]"
+pip install "sqlmesh[dbt,web]"
 ```

 Learn more about [SQLMesh installation and extras here](../installation.md#install-extras).
````
````diff
@@ -41,7 +41,7 @@ Learn more about [SQLMesh installation and extras here](../installation.md#insta
 Prepare an existing dbt project to be run by SQLMesh by executing the `sqlmesh init` command *within the dbt project root directory* and with the `dbt` template option:

 ```bash
-$ sqlmesh init -t dbt
+sqlmesh init -t dbt
 ```

 This will create a file called `sqlmesh.yaml` containing the [default model start date](../reference/model_configuration.md#model-defaults). This configuration file is a minimum starting point for enabling SQLMesh to work with your dbt project.
````
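The generated `sqlmesh.yaml` might look roughly like the following. The exact keys and values here are an assumption for illustration, not copied from the generated file; see the linked model configuration reference for the authoritative fields:

```yaml
# Hypothetical minimal sqlmesh.yaml for a dbt project (field names assumed).
model_defaults:
  start: 2024-01-01
```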
````diff
@@ -247,8 +247,8 @@ Instead, SQLMesh provides predefined time macro variables that can be used in th
 For example, the SQL `WHERE` clause with the "ds" column goes in a new jinja block gated by `{% if sqlmesh_incremental is defined %}` as follows:

 ```bash
-> WHERE
-> ds BETWEEN '{{ start_ds }}' AND '{{ end_ds }}'
+WHERE
+ds BETWEEN '{{ start_ds }}' AND '{{ end_ds }}'
 ```

 `{{ start_ds }}` and `{{ end_ds }}` are the jinja equivalents of SQLMesh's `@start_ds` and `@end_ds` predefined time macro variables. See all [predefined time variables](../concepts/macros/macro_variables.md) available in jinja.
````
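To make the substitution concrete: on a given incremental run, SQLMesh fills the jinja time variables with the interval endpoints, so the clause above renders to plain SQL. A rough illustration, where the dates are made up and python string formatting merely mimics the jinja rendering:

```python
# Stand-in for the jinja template above, with {start_ds}/{end_ds} as the slots.
clause = "WHERE\n  ds BETWEEN '{start_ds}' AND '{end_ds}'"
print(clause.format(start_ds="2024-01-01", end_ds="2024-01-31"))
# WHERE
#   ds BETWEEN '2024-01-01' AND '2024-01-31'
```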
**docs/integrations/dlt.md** (+7 −7)
````diff
@@ -8,7 +8,7 @@ SQLMesh enables effortless project generation using data ingested through [dlt](h
 To load data from a dlt pipeline into SQLMesh, ensure the dlt pipeline has been run or restored locally. Then simply execute the sqlmesh `init` command *within the dlt project root directory* using the `dlt` template option and specifying the pipeline's name with the `dlt-pipeline` option:
 This will create the configuration file and directories, which are found in all SQLMesh projects:
````
````diff
@@ -33,7 +33,7 @@ SQLMesh will also automatically generate models to ingest data from the pipeline
 The default location for dlt pipelines is `~/.dlt/pipelines/<pipeline_name>`. If your pipelines are in a [different directory](https://dlthub.com/docs/general-usage/pipeline#separate-working-environments-with-pipelines_dir), use the `--dlt-path` argument to specify the path explicitly:
````
````diff
@@ -83,7 +83,7 @@ Load package 1728074157.660565 is LOADED and contains no failed jobs
 After the pipeline has run, generate a SQLMesh project by executing:

 ```bash
-$ sqlmesh init -t dlt --dlt-pipeline sushi duckdb
+sqlmesh init -t dlt --dlt-pipeline sushi duckdb
 ```

 Then the SQLMesh project is all set up. You can then proceed to run the SQLMesh `plan` command to ingest the dlt pipeline data and populate the SQLMesh tables:
````