@@ -179,7 +179,7 @@ After initializing a project with `datacustomcode init my_package`, you might no
 that include [native features](https://spark.apache.org/docs/latest/api/python/user_guide/python_packaging.html#using-pyspark-native-features)
 like C++ or C interop, the platform and architecture may be different between your machine and Data Cloud compute. This is all taken care of
 in the `zip` and `deploy` commands, which utilize the Dockerfile, which starts `FROM` an image compatible with Data Cloud. However, you may
-want to build, run, and test your script on your machine using the same platform and architecture as Data Cloud. You can use the sections above
+want to build, run, and test your script on your machine using the same platform and architecture as Data Cloud. You can use the sections below
 to test your script in this manner.

 ### VS Code Dev Containers
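As a sketch of the local-testing workflow the hunk above describes — building and running your script with Docker so the platform and architecture match Data Cloud — the commands below are illustrative assumptions: the `--platform` value, image tag, and entry-point module name are not taken from the project, so check the generated Dockerfile's `FROM` line for the authoritative target.

```shell
# Build the project image on the same platform/architecture as Data Cloud.
# `linux/amd64` is an assumption; verify against the Dockerfile's base image.
docker build --platform linux/amd64 -t my_package:local .

# Run the script inside the container to test it before `deploy`.
# The module path is a hypothetical example, not the package's real entry point.
docker run --rm my_package:local python -m my_package.entrypoint
```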
@@ -201,8 +201,8 @@ Read more about Dev Containers here: https://code.visualstudio.com/docs/devconta
 
 Within your `init`ed package, you will find a `jupyterlab.sh` file that can open a jupyter notebook for you. Jupyter notebooks, in
 combination with Data Cloud's [Query Editor](https://help.salesforce.com/s/articleView?id=data.c360_a_add_queries_to_a_query_workspace.htm&type=5)
-and [Data Explorer](https://help.salesforce.com/s/articleView?id=data.c360_a_data_explorer.htm&type=5) can be extremely helpful for data
-exploration. Instead of running an entire script, one can run one cell at a time as they discover and experiment with the DLO or DMO data.
+and [Data Explorer](https://help.salesforce.com/s/articleView?id=data.c360_a_data_explorer.htm&type=5), can be extremely helpful for data
+exploration. Instead of running an entire script, one can run one code cell at a time as they discover and experiment with the DLO or DMO data.
 
 You can read more about Jupyter Notebooks here: https://jupyter.org/
 
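A minimal sketch of that notebook workflow, assuming `jupyterlab.sh` is a thin wrapper around `jupyter lab` (the exact flags it passes are not shown in this diff, so the direct invocation below is illustrative):

```shell
# From the root of the init'ed package, launch JupyterLab with the provided script.
./jupyterlab.sh

# Roughly equivalent direct invocation (port and flags are assumptions):
# jupyter lab --no-browser --port 8888
```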