Commit 57b3f41

Release/0.3.1 (#154)

* Update setup.md
* Updated instructions to address dashboard permissions.
* Update clusters to SINGLE_USER (#153)

Co-authored-by: Arun Pamulapati <[email protected]>

1 parent 348cea3 commit 57b3f41

File tree

4 files changed: +20 −6 lines changed


dabs/dabs_template/template/tmp/resources/sat_driver_job.yml.tmpl

Lines changed: 1 addition & 0 deletions

```diff
@@ -17,6 +17,7 @@ resources:
       job_clusters:
         - job_cluster_key: job_cluster
           new_cluster:
+            data_security_mode: SINGLE_USER
             num_workers: 5
             spark_version: {{.latest_lts}}
             runtime_engine: "PHOTON"
```

dabs/dabs_template/template/tmp/resources/sat_initiliazer_job.yml.tmpl

Lines changed: 1 addition & 0 deletions

```diff
@@ -15,6 +15,7 @@ resources:
       job_clusters:
         - job_cluster_key: job_cluster
           new_cluster:
+            data_security_mode: SINGLE_USER
             num_workers: 5
             spark_version: {{.latest_lts}}
             runtime_engine: "PHOTON"
```
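Taken together, the two template changes pin both bundle job clusters to single-user access mode. A sketch of how the rendered `new_cluster` block might look after template expansion (the job name `sat_driver` and the `spark_version` value are illustrative assumptions; `{{.latest_lts}}` resolves to whatever LTS runtime the template picks at deploy time):

```yaml
resources:
  jobs:
    sat_driver:                              # hypothetical job name for illustration
      job_clusters:
        - job_cluster_key: job_cluster
          new_cluster:
            data_security_mode: SINGLE_USER  # added in this commit
            num_workers: 5
            spark_version: 14.3.x-scala2.12  # placeholder for {{.latest_lts}}
            runtime_engine: "PHOTON"
```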

docs/setup.md

Lines changed: 11 additions & 1 deletion

```diff
@@ -77,9 +77,19 @@ You now have two jobs (SAT Initializer Notebook & SAT Driver Notebook). Run SAT
 
 ### 2. Access Databricks SQL Dashboards
 
-> **Note:** You can also use Lakeview Dashboards to view the results.
+> **Note:** You can use Lakeview Dashboards to view the results.
+
+By default, the dashboard is owned by the profile you used to set up SAT, or by the Service Principal. If you see errors when running the dashboard, you are likely running into permission issues.
+1. Go to the dashboard and click the "Share" button in the top right.
+2. Click the cogwheel and select "Assign new owner".
+3. Assign yourself as the new owner of the dashboard, or assign someone who has access to the SAT catalog/schema and tables.
+4. Click the "Published" option at the top to switch to the draft version of the dashboard. Click the "Publish" button next to the Share option.
+5. In the general settings section, choose one of two options:
+   Embed credentials (default): All viewers run queries using the owner's credentials and compute. This may expose data to users who normally wouldn't have access.
+   Don't embed credentials: Each viewer needs access to this workspace, the associated data, and the compute to view this dashboard. We recommend using this option.
 
 
+> **Note:** We are switching SAT to Lakeview Dashboards, but the classic dashboard is still available.
 In DBSQL find "SAT - Security Analysis Tool" dashboard to see the report. You can filter the dashboard by **SAT** tag. (The old classic legacy dashboard can be found in Workspace -> Home -> SAT_dashboard)
 
 <img src="./images/sat_dashboard_loc.png" width="70%" height="70%">
```

terraform/common/jobs.tf

Lines changed: 7 additions & 5 deletions

```diff
@@ -3,10 +3,11 @@ resource "databricks_job" "initializer" {
   job_cluster {
     job_cluster_key = "job_cluster"
     new_cluster {
-      num_workers    = 5
-      spark_version  = data.databricks_spark_version.latest_lts.id
-      node_type_id   = data.databricks_node_type.smallest.id
-      runtime_engine = "PHOTON"
+      data_security_mode = "SINGLE_USER"
+      num_workers        = 5
+      spark_version      = data.databricks_spark_version.latest_lts.id
+      node_type_id       = data.databricks_node_type.smallest.id
+      runtime_engine     = "PHOTON"
       dynamic "gcp_attributes" {
         for_each = var.gcp_impersonate_service_account == "" ? [] : [var.gcp_impersonate_service_account]
         content {
@@ -17,7 +18,7 @@ resource "databricks_job" "initializer" {
   }
 
   task {
-    task_key = "Initializer"
+    task_key        = "Initializer"
     job_cluster_key = "job_cluster"
     library {
       pypi {
@@ -36,6 +37,7 @@ resource "databricks_job" "driver" {
   job_cluster {
     job_cluster_key = "job_cluster"
     new_cluster {
+      data_security_mode = "SINGLE_USER"
       num_workers    = 5
       spark_version  = data.databricks_spark_version.latest_lts.id
       node_type_id   = data.databricks_node_type.smallest.id
```
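With all three hunks applied, each job's cluster definition in `terraform/common/jobs.tf` ends up like the sketch below (argument alignment as `terraform fmt` would produce it; the `dynamic "gcp_attributes"` block shown in the diff context is elided here for brevity):

```hcl
job_cluster {
  job_cluster_key = "job_cluster"
  new_cluster {
    data_security_mode = "SINGLE_USER" # added in this commit
    num_workers        = 5
    spark_version      = data.databricks_spark_version.latest_lts.id
    node_type_id       = data.databricks_node_type.smallest.id
    runtime_engine     = "PHOTON"
  }
}
```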
