@SQLDBAWithABeard
Last active February 12, 2026 15:25
Fabric RLS
I mean, funny like I'm a clown? I amuse you?
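The listing below reads like the output of a column-metadata query against the FUAM lakehouse's SQL analytics endpoint, one row per column with its data type, maximum length, precision, and scale. A minimal sketch of such a query, assuming the standard `INFORMATION_SCHEMA.COLUMNS` view (the `WHERE` filter and ordering are assumptions, not taken from the gist):

```sql
-- Hedged sketch: enumerate every column in the dbo schema, in the same shape
-- as the listing below (schema, table, column, type, max length, precision, scale).
SELECT
    c.TABLE_SCHEMA,
    c.TABLE_NAME,
    c.COLUMN_NAME,
    c.DATA_TYPE,
    c.CHARACTER_MAXIMUM_LENGTH,  -- 8000 for varchar, NULL otherwise
    c.NUMERIC_PRECISION,         -- e.g. 19 for bigint, 10 for int, 53 for float
    c.NUMERIC_SCALE              -- 0 for integer types, NULL otherwise
FROM INFORMATION_SCHEMA.COLUMNS AS c
WHERE c.TABLE_SCHEMA = 'dbo'
ORDER BY c.TABLE_NAME, c.COLUMN_NAME;
```

A listing like this is a useful starting point for the row-level security work the gist title suggests: it shows which tables carry candidate predicate columns such as `WorkspaceId` and `CapacityId`.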
TABLE_SCHEMA TABLE_NAME COLUMN_NAME DATA_TYPE CHARACTER_MAXIMUM_LENGTH NUMERIC_PRECISION NUMERIC_SCALE
dbo Workspace_For_Automation DefaultDatasetStorageFormat varchar 8000 NULL NULL
dbo Workspace_For_Automation fuam_modified_at datetime2 NULL NULL NULL
dbo Workspace_For_Automation State varchar 8000 NULL NULL
dbo Workspace_For_Automation CapacityName varchar 8000 NULL NULL
dbo Workspace_For_Automation WorkspaceName varchar 8000 NULL NULL
dbo Workspace_For_Automation WorkspaceId varchar 8000 NULL NULL
dbo Workspace_For_Automation Type varchar 8000 NULL NULL
dbo Workspace_For_Automation Description varchar 8000 NULL NULL
dbo active_items type varchar 8000 NULL NULL
dbo active_items capacityId varchar 8000 NULL NULL
dbo active_items state varchar 8000 NULL NULL
dbo active_items description varchar 8000 NULL NULL
dbo active_items name varchar 8000 NULL NULL
dbo active_items lastUpdatedDate varchar 8000 NULL NULL
dbo active_items workspaceId varchar 8000 NULL NULL
dbo active_items fuam_deleted bit NULL NULL NULL
dbo active_items id varchar 8000 NULL NULL
dbo active_items fuam_modified_at datetime2 NULL NULL NULL
dbo active_items creatorUserPrincipalName varchar 8000 NULL NULL
dbo active_items folderId varchar 8000 NULL NULL
dbo active_items_history Snapshot_Date date NULL NULL NULL
dbo active_items_history type varchar 8000 NULL NULL
dbo active_items_history id varchar 8000 NULL NULL
dbo active_items_history capacityId varchar 8000 NULL NULL
dbo active_items_history Snapshot_id varchar 8000 NULL NULL
dbo active_items_history description varchar 8000 NULL NULL
dbo active_items_history creatorUserPrincipalName varchar 8000 NULL NULL
dbo active_items_history fuam_deleted bit NULL NULL NULL
dbo active_items_history workspaceId varchar 8000 NULL NULL
dbo active_items_history fuam_modified_at datetime2 NULL NULL NULL
dbo active_items_history Snapshot_workspaceId varchar 8000 NULL NULL
dbo active_items_history Snapshot_capacityId varchar 8000 NULL NULL
dbo active_items_history name varchar 8000 NULL NULL
dbo active_items_history lastUpdatedDate varchar 8000 NULL NULL
dbo active_items_history state varchar 8000 NULL NULL
dbo activities FolderDisplayName varchar 8000 NULL NULL
dbo activities DataflowType varchar 8000 NULL NULL
dbo activities CapacityUsers varchar 8000 NULL NULL
dbo activities PaginatedReportDataSources varchar 8000 NULL NULL
dbo activities ImportSource varchar 8000 NULL NULL
dbo activities OrgAppPermission varchar 8000 NULL NULL
dbo activities DeploymentPipelineAccesses varchar 8000 NULL NULL
dbo activities OrganizationalGalleryItemId bigint NULL 19 0
dbo activities RefreshType varchar 8000 NULL NULL
dbo activities UserId varchar 8000 NULL NULL
dbo activities Id varchar 8000 NULL NULL
dbo activities Schedules varchar 8000 NULL NULL
dbo activities TemplateAppObjectId varchar 8000 NULL NULL
dbo activities TemplateAppIsInstalledWithAutomation bit NULL NULL NULL
dbo activities TenantAdminFeatureSwitches varchar 8000 NULL NULL
dbo activities SwitchState varchar 8000 NULL NULL
dbo activities DataConnectivityMode varchar 8000 NULL NULL
dbo activities DatasourceId varchar 8000 NULL NULL
dbo activities GatewayClustersObjectIds varchar 8000 NULL NULL
dbo activities DeploymentPipelineObjectId varchar 8000 NULL NULL
dbo activities ExportEventStartDateTimeParameter varchar 8000 NULL NULL
dbo activities ExternalResource varchar 8000 NULL NULL
dbo activities DashboardName varchar 8000 NULL NULL
dbo activities ItemName varchar 8000 NULL NULL
dbo activities UserType bigint NULL 19 0
dbo activities AppId varchar 8000 NULL NULL
dbo activities IncludeSubartifacts bit NULL NULL NULL
dbo activities SubfolderObjectId varchar 8000 NULL NULL
dbo activities DeploymentPipelineDisplayName varchar 8000 NULL NULL
dbo activities ResultStatus varchar 8000 NULL NULL
dbo activities SubfolderId bigint NULL 19 0
dbo activities CustomVisualAccessTokenSiteUri varchar 8000 NULL NULL
dbo activities DistributionMethod varchar 8000 NULL NULL
dbo activities SharingInformation varchar 8000 NULL NULL
dbo activities RequestId varchar 8000 NULL NULL
dbo activities GatewayType varchar 8000 NULL NULL
dbo activities ArtifactKind varchar 8000 NULL NULL
dbo activities FolderAccessRequests varchar 8000 NULL NULL
dbo activities CreationDate date NULL NULL NULL
dbo activities DeploymentPipelineStageOrder bigint NULL 19 0
dbo activities IsTenantAdminApi bit NULL NULL NULL
dbo activities CapacityName varchar 8000 NULL NULL
dbo activities ImportType varchar 8000 NULL NULL
dbo activities RequiredWorkspaces varchar 8000 NULL NULL
dbo activities ExportedArtifactInfo varchar 8000 NULL NULL
dbo activities TemplatePackageName varchar 8000 NULL NULL
dbo activities PackageId bigint NULL 19 0
dbo activities DatasourceDetails bit NULL NULL NULL
dbo activities Workload varchar 8000 NULL NULL
dbo activities RefreshEnforcementPolicy bigint NULL 19 0
dbo activities ReportMobileLayoutAction varchar 8000 NULL NULL
dbo activities Lineage bit NULL NULL NULL
dbo activities CapacityId varchar 8000 NULL NULL
dbo activities DataflowRefreshScheduleType varchar 8000 NULL NULL
dbo activities DataflowName varchar 8000 NULL NULL
dbo activities Datasets varchar 8000 NULL NULL
dbo activities ImportDisplayName varchar 8000 NULL NULL
dbo activities DatasourceInformations varchar 8000 NULL NULL
dbo activities Activity varchar 8000 NULL NULL
dbo activities IsUpdateAppActivity bit NULL NULL NULL
dbo activities GatewayStatus varchar 8000 NULL NULL
dbo activities TenantAdminFeatureSwitchProperties varchar 8000 NULL NULL
dbo activities RecordType bigint NULL 19 0
dbo activities WorkSpaceName varchar 8000 NULL NULL
dbo activities ReportId varchar 8000 NULL NULL
dbo activities ObjectType varchar 8000 NULL NULL
dbo activities AppName varchar 8000 NULL NULL
dbo activities GatewayId varchar 8000 NULL NULL
dbo activities ModelsSnapshots varchar 8000 NULL NULL
dbo activities CreationHour varchar 8000 NULL NULL
dbo activities ArtifactAccessRequestInfo varchar 8000 NULL NULL
dbo activities CopiedReportId varchar 8000 NULL NULL
dbo activities Operation varchar 8000 NULL NULL
dbo activities IsTemplateAppFromMarketplace bit NULL NULL NULL
dbo activities GitIntegrationRequest varchar 8000 NULL NULL
dbo activities SubscriptionDetails varchar 8000 NULL NULL
dbo activities GatewayClusterId varchar 8000 NULL NULL
dbo activities ObjectId varchar 8000 NULL NULL
dbo activities HasFullReportAttachment bit NULL NULL NULL
dbo activities ObjectDisplayName varchar 8000 NULL NULL
dbo activities SubscribeeInformation varchar 8000 NULL NULL
dbo activities ExportEventEndDateTimeParameter varchar 8000 NULL NULL
dbo activities AggregatedWorkspaceInformation varchar 8000 NULL NULL
dbo activities ArtifactName varchar 8000 NULL NULL
dbo activities CreationTime varchar 8000 NULL NULL
dbo activities UserAgent varchar 8000 NULL NULL
dbo activities GatewayName varchar 8000 NULL NULL
dbo activities SubfolderName varchar 8000 NULL NULL
dbo activities TemplateAppOwnerTenantObjectId varchar 8000 NULL NULL
dbo activities ActivityId varchar 8000 NULL NULL
dbo activities ArtifactId varchar 8000 NULL NULL
dbo activities DeploymentPipelineId bigint NULL 19 0
dbo activities CopiedReportName varchar 8000 NULL NULL
dbo activities OrganizationalGalleryItemDisplayName varchar 8000 NULL NULL
dbo activities FolderObjectId varchar 8000 NULL NULL
dbo activities AppReportId varchar 8000 NULL NULL
dbo activities DatasetId varchar 8000 NULL NULL
dbo activities WorkspacesSemicolonDelimitedList varchar 8000 NULL NULL
dbo activities ItemsCount varchar 8000 NULL NULL
dbo activities ExternalSubscribeeInformation varchar 8000 NULL NULL
dbo activities ClientIP varchar 8000 NULL NULL
dbo activities TemplateAppVersion varchar 8000 NULL NULL
dbo activities TemplateAppFolderObjectId varchar 8000 NULL NULL
dbo activities SharingAction varchar 8000 NULL NULL
dbo activities DatasourceObjectIds varchar 8000 NULL NULL
dbo activities IncludeExpressions bit NULL NULL NULL
dbo activities Experience varchar 8000 NULL NULL
dbo activities OrganizationId varchar 8000 NULL NULL
dbo activities DatasetName varchar 8000 NULL NULL
dbo activities SharingScope varchar 8000 NULL NULL
dbo activities ReportName varchar 8000 NULL NULL
dbo activities TileText varchar 8000 NULL NULL
dbo activities CustomVisualAccessTokenResourceId varchar 8000 NULL NULL
dbo activities CapacityState varchar 8000 NULL NULL
dbo activities OrganizationalGalleryItemPublishTime varchar 8000 NULL NULL
dbo activities CreationMinute varchar 8000 NULL NULL
dbo activities GatewayClusterDatasources varchar 8000 NULL NULL
dbo activities SubscriptionSchedule varchar 8000 NULL NULL
dbo activities UserInformation varchar 8000 NULL NULL
dbo activities LastRefreshTime varchar 8000 NULL NULL
dbo activities ModelId varchar 8000 NULL NULL
dbo activities IsSuccess bit NULL NULL NULL
dbo activities CredentialSetupMode varchar 8000 NULL NULL
dbo activities ExcludePersonalWorkspaces bit NULL NULL NULL
dbo activities DashboardId varchar 8000 NULL NULL
dbo activities BillingType bigint NULL 19 0
dbo activities DeploymentPipelineOperationId varchar 8000 NULL NULL
dbo activities WorkspaceAccessList varchar 8000 NULL NULL
dbo activities TargetWorkspaceId varchar 8000 NULL NULL
dbo activities Monikers varchar 8000 NULL NULL
dbo activities WorkspaceId varchar 8000 NULL NULL
dbo activities ItemId varchar 8000 NULL NULL
dbo activities ExportedArtifactDownloadInfo varchar 8000 NULL NULL
dbo activities UpdateDatasourcesRequest varchar 8000 NULL NULL
dbo activities CreationDateKey varchar 8000 NULL NULL
dbo activities DataflowId varchar 8000 NULL NULL
dbo activities ConsumptionMethod varchar 8000 NULL NULL
dbo activities ImportId varchar 8000 NULL NULL
dbo activities ReportType varchar 8000 NULL NULL
dbo activities UserKey varchar 8000 NULL NULL
dbo activities ModelSettings varchar 8000 NULL NULL
dbo activities GatewayClusters varchar 8000 NULL NULL
dbo activities AccessRequestMessage varchar 8000 NULL NULL
dbo activities AccessRequestType varchar 8000 NULL NULL
dbo activities ActionSource varchar 8000 NULL NULL
dbo activities ActionSourceDetail varchar 8000 NULL NULL
dbo activities ArtifactType varchar 8000 NULL NULL
dbo activities AuditedArtifactInformation varchar 8000 NULL NULL
dbo activities DataflowAccessTokenRequestParameters varchar 8000 NULL NULL
dbo activities DataflowAllowNativeQueries bit NULL NULL NULL
dbo activities DatasetCertificationStage varchar 8000 NULL NULL
dbo activities Datasources varchar 8000 NULL NULL
dbo activities DomainId varchar 8000 NULL NULL
dbo activities EmbedTokenId varchar 8000 NULL NULL
dbo activities EndPoint varchar 8000 NULL NULL
dbo activities GatewayMemberId varchar 8000 NULL NULL
dbo activities InPlaceSharingEnabled bit NULL NULL NULL
dbo activities ItemType varchar 8000 NULL NULL
dbo activities LabelEventType varchar 8000 NULL NULL
dbo activities MentionedUsersInformation varchar 8000 NULL NULL
dbo activities OldSensitivityLabelId varchar 8000 NULL NULL
dbo activities OriginalOwner varchar 8000 NULL NULL
dbo activities OwningAppContext varchar 8000 NULL NULL
dbo activities PinReportToTabInformation varchar 8000 NULL NULL
dbo activities PowerPlatformEnvironmentId varchar 8000 NULL NULL
dbo activities ReportCertificationStage varchar 8000 NULL NULL
dbo activities SensitivityLabelEventData varchar 8000 NULL NULL
dbo activities SensitivityLabelId varchar 8000 NULL NULL
dbo activities ShareLinkId varchar 8000 NULL NULL
dbo activities ShareWithCurrentFilter bit NULL NULL NULL
dbo activities SingleSignOnType varchar 8000 NULL NULL
dbo activities TableName varchar 8000 NULL NULL
dbo activities TakingOverOwner varchar 8000 NULL NULL
dbo activities UpdateFeaturedTables varchar 8000 NULL NULL
dbo activities ImpersonatedUserId varchar 8000 NULL NULL
dbo activities MembershipInformation varchar 8000 NULL NULL
dbo activities SnapshotId bigint NULL 19 0
dbo activities EmbedTokenDatasets varchar 8000 NULL NULL
dbo activities EmbedTokenReports varchar 8000 NULL NULL
dbo activities EmbedTokenTargetWorkspaces varchar 8000 NULL NULL
dbo activities CustomVisualAccessTokenEtag varchar 8000 NULL NULL
dbo activities PaginatedReportMonikers varchar 8000 NULL NULL
dbo activities ItemNewName varchar 8000 NULL NULL
dbo activities ContentProviderCertificationStage varchar 8000 NULL NULL
dbo activities WorkspacesModifiedSince varchar 8000 NULL NULL
dbo activities PreppedForCopilot bit NULL NULL NULL
dbo activities GatewayState varchar 8000 NULL NULL
dbo activities ExportEventActivityTypeParameter varchar 8000 NULL NULL
dbo activities DatasourceName varchar 8000 NULL NULL
dbo activities DatasourceType varchar 8000 NULL NULL
dbo aggregated_activities_last_30days CreationDate date NULL NULL NULL
dbo aggregated_activities_last_30days ObjectType varchar 8000 NULL NULL
dbo aggregated_activities_last_30days Experience varchar 8000 NULL NULL
dbo aggregated_activities_last_30days CapacityId varchar 8000 NULL NULL
dbo aggregated_activities_last_30days CreationDateKey varchar 8000 NULL NULL
dbo aggregated_activities_last_30days WorkspaceId varchar 8000 NULL NULL
dbo aggregated_activities_last_30days CountOfActivities bigint NULL 19 0
dbo aggregated_activities_last_30days ItemId varchar 8000 NULL NULL
dbo aggregated_activities_last_30days ResultStatus varchar 8000 NULL NULL
dbo aggregated_activities_last_30days IsSuccess bit NULL NULL NULL
dbo aggregated_activities_last_30days Activity varchar 8000 NULL NULL
dbo aggregated_activities_last_30days Workload varchar 8000 NULL NULL
dbo aggregated_activities_last_30days fuam_item_id varchar 8000 NULL NULL
dbo aggregated_activities_last_30days fuam_activity_status varchar 8000 NULL NULL
dbo aggregated_activities_last_30days CreationHour varchar 8000 NULL NULL
dbo aggregated_activities_last_30days fuam_item_kind varchar 8000 NULL NULL
dbo aggregated_activities_last_30days DistinctCountOfUsers bigint NULL 19 0
dbo aggregated_activities_last_30days GatewayId varchar 8000 NULL NULL
dbo audit_granted_ws_access_for_analyzer workspace_id varchar 8000 NULL NULL
dbo audit_granted_ws_access_for_analyzer reason varchar 8000 NULL NULL
dbo audit_granted_ws_access_for_analyzer entity varchar 8000 NULL NULL
dbo audit_granted_ws_access_for_analyzer principal_type varchar 8000 NULL NULL
dbo audit_granted_ws_access_for_analyzer timestamp datetime2 NULL NULL NULL
dbo audit_granted_ws_access_for_analyzer access_right varchar 8000 NULL NULL
dbo audit_latest_available_fuam_version current_fuam_version varchar 8000 NULL NULL
dbo audit_latest_available_fuam_version major_version bigint NULL 19 0
dbo audit_latest_available_fuam_version build_version bigint NULL 19 0
dbo audit_latest_available_fuam_version last_check_timestamp datetime2 NULL NULL NULL
dbo audit_latest_available_fuam_version minor_version bigint NULL 19 0
dbo calendar IsPreviousMonth int NULL 10 0
dbo calendar DayOfWeekNum int NULL 10 0
dbo calendar DayOfWeek varchar 8000 NULL NULL
dbo calendar Month int NULL 10 0
dbo calendar Day int NULL 10 0
dbo calendar Year int NULL 10 0
dbo calendar IsInLast30Days int NULL 10 0
dbo calendar WeekOfYear int NULL 10 0
dbo calendar IsPreviousYear int NULL 10 0
dbo calendar IsCurrentYear int NULL 10 0
dbo calendar IsInLast14Days int NULL 10 0
dbo calendar YearMonth varchar 8000 NULL NULL
dbo calendar Date varchar 8000 NULL NULL
dbo calendar DateKey varchar 8000 NULL NULL
dbo calendar IsCurrentQuarter int NULL 10 0
dbo calendar Date2Key date NULL NULL NULL
dbo calendar IsCurrentMonth int NULL 10 0
dbo calendar_timepoints Hour datetime2 NULL NULL NULL
dbo calendar_timepoints Date date NULL NULL NULL
dbo calendar_timepoints TimePoint datetime2 NULL NULL NULL
dbo capacities CapacityId varchar 8000 NULL NULL
dbo capacities region varchar 8000 NULL NULL
dbo capacities fuam_deleted bit NULL NULL NULL
dbo capacities sku varchar 8000 NULL NULL
dbo capacities displayName varchar 8000 NULL NULL
dbo capacities fuam_modified_at datetime2 NULL NULL NULL
dbo capacities state varchar 8000 NULL NULL
dbo capacities capacityUserAccessRight varchar 8000 NULL NULL
dbo capacities_history capacityUserAccessRight varchar 8000 NULL NULL
dbo capacities_history displayName varchar 8000 NULL NULL
dbo capacities_history fuam_modified_at datetime2 NULL NULL NULL
dbo capacities_history Snapshot_CapacityId varchar 8000 NULL NULL
dbo capacities_history CapacityId varchar 8000 NULL NULL
dbo capacities_history state varchar 8000 NULL NULL
dbo capacities_history fuam_deleted bit NULL NULL NULL
dbo capacities_history region varchar 8000 NULL NULL
dbo capacities_history sku varchar 8000 NULL NULL
dbo capacities_history Snapshot_Date date NULL NULL NULL
dbo capacity_metrics_by_item_by_operation_by_day InvalidOperationCount bigint NULL 19 0
dbo capacity_metrics_by_item_by_operation_by_day Date datetime2 NULL NULL NULL
dbo capacity_metrics_by_item_by_operation_by_day ThrottlingInMin float NULL 53 NULL
dbo capacity_metrics_by_item_by_operation_by_day OperationCount bigint NULL 19 0
dbo capacity_metrics_by_item_by_operation_by_day SuccessOperationCount bigint NULL 19 0
dbo capacity_metrics_by_item_by_operation_by_day OperationName varchar 8000 NULL NULL
dbo capacity_metrics_by_item_by_operation_by_day FailureOperationCount bigint NULL 19 0
dbo capacity_metrics_by_item_by_operation_by_day DurationInSec float NULL 53 NULL
dbo capacity_metrics_by_item_by_operation_by_day ItemKind varchar 8000 NULL NULL
dbo capacity_metrics_by_item_by_operation_by_day CapacityId varchar 8000 NULL NULL
dbo capacity_metrics_by_item_by_operation_by_day UserCount bigint NULL 19 0
dbo capacity_metrics_by_item_by_operation_by_day WorkspaceId varchar 8000 NULL NULL
dbo capacity_metrics_by_item_by_operation_by_day TotalCUs float NULL 53 NULL
dbo capacity_metrics_by_item_by_operation_by_day RejectedOperationCount bigint NULL 19 0
dbo capacity_metrics_by_item_by_operation_by_day ItemId varchar 8000 NULL NULL
dbo capacity_metrics_by_item_by_operation_by_day CancelledOperationCount bigint NULL 19 0
dbo capacity_metrics_by_item_by_operation_by_day DateKey varchar 8000 NULL NULL
dbo capacity_metrics_by_item_kind_by_day WorkspaceId varchar 8000 NULL NULL
dbo capacity_metrics_by_item_kind_by_day CapacityId varchar 8000 NULL NULL
dbo capacity_metrics_by_item_kind_by_day SuccessOperationCount bigint NULL 19 0
dbo capacity_metrics_by_item_kind_by_day RejectedOperationCount bigint NULL 19 0
dbo capacity_metrics_by_item_kind_by_day DurationInSec float NULL 53 NULL
dbo capacity_metrics_by_item_kind_by_day DateKey varchar 8000 NULL NULL
dbo capacity_metrics_by_item_kind_by_day UserCount bigint NULL 19 0
dbo capacity_metrics_by_item_kind_by_day CancelledOperationCount bigint NULL 19 0
dbo capacity_metrics_by_item_kind_by_day ItemKind varchar 8000 NULL NULL
dbo capacity_metrics_by_item_kind_by_day Date datetime2 NULL NULL NULL
dbo capacity_metrics_by_item_kind_by_day OperationCount bigint NULL 19 0
dbo capacity_metrics_by_item_kind_by_day TotalCUs float NULL 53 NULL
dbo capacity_metrics_by_item_kind_by_day InvalidOperationCount bigint NULL 19 0
dbo capacity_metrics_by_item_kind_by_day ThrottlingInMin float NULL 53 NULL
dbo capacity_metrics_by_item_kind_by_day FailureOperationCount bigint NULL 19 0
dbo capacity_metrics_by_timepoint BackgroundNonBillablePercentage float NULL 53 NULL
dbo capacity_metrics_by_timepoint TotalCUs float NULL 53 NULL
dbo capacity_metrics_by_timepoint BackgroundRejectionPercentage float NULL 53 NULL
dbo capacity_metrics_by_timepoint InteractivePercentage float NULL 53 NULL
dbo capacity_metrics_by_timepoint BackgroundPercentage float NULL 53 NULL
dbo capacity_metrics_by_timepoint CarryOverCumulativePercentage float NULL 53 NULL
dbo capacity_metrics_by_timepoint OverageReferenceLine bigint NULL 19 0
dbo capacity_metrics_by_timepoint CarryOverAddedPercentage float NULL 53 NULL
dbo capacity_metrics_by_timepoint InteractiveRejectionPercentage float NULL 53 NULL
dbo capacity_metrics_by_timepoint AutoscalePercentage float NULL 53 NULL
dbo capacity_metrics_by_timepoint TimePoint datetime2 NULL NULL NULL
dbo capacity_metrics_by_timepoint InteractiveRejectionThreshold bigint NULL 19 0
dbo capacity_metrics_by_timepoint InteractiveDelayPercentage float NULL 53 NULL
dbo capacity_metrics_by_timepoint ExpectedBurndownInMin float NULL 53 NULL
dbo capacity_metrics_by_timepoint BackgroundRejectionThreshold bigint NULL 19 0
dbo capacity_metrics_by_timepoint CarryOverBurndownPercentage float NULL 53 NULL
dbo capacity_metrics_by_timepoint CULimitPercentage bigint NULL 19 0
dbo capacity_metrics_by_timepoint CapacityId varchar 8000 NULL NULL
dbo capacity_metrics_by_timepoint TotalCUUsagePercentage float NULL 53 NULL
dbo capacity_metrics_by_timepoint InteractiveNonBillablePercentage float NULL 53 NULL
dbo capacity_metrics_by_timepoint SKUCUByTimePoint bigint NULL 19 0
dbo capacity_metrics_item_size_by_day ItemId varchar 8000 NULL NULL
dbo capacity_metrics_item_size_by_day CapacityId varchar 8000 NULL NULL
dbo capacity_metrics_item_size_by_day ItemKind varchar 8000 NULL NULL
dbo capacity_metrics_item_size_by_day ItemSize float NULL 53 NULL
dbo capacity_metrics_item_size_by_day DateKey varchar 8000 NULL NULL
dbo capacity_metrics_item_size_by_day Date datetime2 NULL NULL NULL
dbo capacity_metrics_item_size_by_day WorkspaceId varchar 8000 NULL NULL
dbo capacity_refreshable_days ItemId varchar 8000 NULL NULL
dbo capacity_refreshable_days CapacityId varchar 8000 NULL NULL
dbo capacity_refreshable_days Day varchar 8000 NULL NULL
dbo capacity_refreshable_days WorkspaceId varchar 8000 NULL NULL
dbo capacity_refreshable_details ItemId varchar 8000 NULL NULL
dbo capacity_refreshable_details CapacityId varchar 8000 NULL NULL
dbo capacity_refreshable_details Status varchar 8000 NULL NULL
dbo capacity_refreshable_details LastRefreshStartHour varchar 8000 NULL NULL
dbo capacity_refreshable_details LastRefreshEndHour varchar 8000 NULL NULL
dbo capacity_refreshable_details LastRefreshStartDate date NULL NULL NULL
dbo capacity_refreshable_details LastRefreshStartTime datetime2 NULL NULL NULL
dbo capacity_refreshable_details LastRefreshEndDate date NULL NULL NULL
dbo capacity_refreshable_details ItemName varchar 8000 NULL NULL
dbo capacity_refreshable_details LastRefreshStartDateKey varchar 8000 NULL NULL
dbo capacity_refreshable_details DurationInSeconds bigint NULL 19 0
dbo capacity_refreshable_details RequestId varchar 8000 NULL NULL
dbo capacity_refreshable_details WorkspaceId varchar 8000 NULL NULL
dbo capacity_refreshable_details LastRefreshEndTime datetime2 NULL NULL NULL
dbo capacity_refreshable_details RefreshType varchar 8000 NULL NULL
dbo capacity_refreshable_summaries ConsideredEndDate date NULL NULL NULL
dbo capacity_refreshable_summaries ConsideredStartDate date NULL NULL NULL
dbo capacity_refreshable_summaries WorkspaceId varchar 8000 NULL NULL
dbo capacity_refreshable_summaries RefreshMedianDuration float NULL 53 NULL
dbo capacity_refreshable_summaries ItemId varchar 8000 NULL NULL
dbo capacity_refreshable_summaries CapacityId varchar 8000 NULL NULL
dbo capacity_refreshable_summaries ConsideredStartTime datetime2 NULL NULL NULL
dbo capacity_refreshable_summaries RefreshAverageDuration float NULL 53 NULL
dbo capacity_refreshable_summaries ItemName varchar 8000 NULL NULL
dbo capacity_refreshable_summaries ConsideredEndTime datetime2 NULL NULL NULL
dbo capacity_refreshable_summaries RefreshCount bigint NULL 19 0
dbo capacity_refreshable_summaries ConsideredStartDateKey varchar 8000 NULL NULL
dbo capacity_refreshable_summaries ConsiderationDurationSeconds bigint NULL 19 0
dbo capacity_refreshable_times Time varchar 8000 NULL NULL
dbo capacity_refreshable_times WorkspaceId varchar 8000 NULL NULL
dbo capacity_refreshable_times CapacityId varchar 8000 NULL NULL
dbo capacity_refreshable_times ItemId varchar 8000 NULL NULL
dbo capacity_refreshables IsRefreshEnabled bit NULL NULL NULL
dbo capacity_refreshables ItemName varchar 8000 NULL NULL
dbo capacity_refreshables CapacityId varchar 8000 NULL NULL
dbo capacity_refreshables LocalTimeZoneId varchar 8000 NULL NULL
dbo capacity_refreshables ItemId varchar 8000 NULL NULL
dbo capacity_refreshables WorkspaceId varchar 8000 NULL NULL
dbo capacity_refreshables Kind varchar 8000 NULL NULL
dbo capacity_regions Continent varchar 8000 NULL NULL
dbo capacity_regions FabricRegion varchar 8000 NULL NULL
dbo capacity_regions Latitude float NULL 53 NULL
dbo capacity_regions Longitude float NULL 53 NULL
dbo capacity_regions Location varchar 8000 NULL NULL
dbo capacity_users role varchar 8000 NULL NULL
dbo capacity_users CapacityId varchar 8000 NULL NULL
dbo capacity_users user varchar 8000 NULL NULL
dbo dashboards Name varchar 8000 NULL NULL
dbo dashboards WorkspaceId varchar 8000 NULL NULL
dbo dashboards tiles varchar 8000 NULL NULL
dbo dashboards DashboardId varchar 8000 NULL NULL
dbo dashboards tags varchar 8000 NULL NULL
dbo dashboards appId varchar 8000 NULL NULL
dbo dashboards sensitivityLabel.labelId varchar 8000 NULL NULL
dbo dataflows Name varchar 8000 NULL NULL
dbo dataflows relations varchar 8000 NULL NULL
dbo dataflows DataflowId varchar 8000 NULL NULL
dbo dataflows Generation float NULL 53 NULL
dbo dataflows datasourceUsages varchar 8000 NULL NULL
dbo dataflows ModifiedDateTime varchar 8000 NULL NULL
dbo dataflows WorkspaceId varchar 8000 NULL NULL
dbo dataflows description varchar 8000 NULL NULL
dbo dataflows tags varchar 8000 NULL NULL
dbo dataflows refreshSchedule.days varchar 8000 NULL NULL
dbo dataflows refreshSchedule.times varchar 8000 NULL NULL
dbo dataflows refreshSchedule.enabled bit NULL NULL NULL
dbo dataflows refreshSchedule.localTimeZoneId varchar 8000 NULL NULL
dbo dataflows refreshSchedule.notifyOption varchar 8000 NULL NULL
dbo dataflows sensitivityLabel.labelId varchar 8000 NULL NULL
dbo dataflows upstreamDataflows varchar 8000 NULL NULL
dbo dataflows upstreamDatamarts varchar 8000 NULL NULL
dbo datamarts WorkspaceId varchar 8000 NULL NULL
dbo datamarts DatamartId varchar 8000 NULL NULL
dbo datamarts Name varchar 8000 NULL NULL
dbo datamarts Type varchar 8000 NULL NULL
dbo datamarts ConfiguredById varchar 8000 NULL NULL
dbo datamarts ModifiedById varchar 8000 NULL NULL
dbo datamarts ModifiedDateTime varchar 8000 NULL NULL
dbo datamarts tags varchar 8000 NULL NULL
dbo datamarts sensitivityLabel.labelId varchar 8000 NULL NULL
dbo datamarts datasourceUsages varchar 8000 NULL NULL
dbo datamarts refreshSchedule.days varchar 8000 NULL NULL
dbo datamarts refreshSchedule.times varchar 8000 NULL NULL
dbo datamarts refreshSchedule.enabled bit NULL NULL NULL
dbo datamarts refreshSchedule.localTimeZoneId varchar 8000 NULL NULL
dbo datamarts refreshSchedule.notifyOption varchar 8000 NULL NULL
dbo datamarts upstreamDataflows varchar 8000 NULL NULL
dbo datamarts upstreamDatamarts varchar 8000 NULL NULL
dbo datamarts description varchar 8000 NULL NULL
dbo datamarts endorsementDetails.endorsement varchar 8000 NULL NULL
dbo datasource_instances connectionDetails.extensionDataSourcePath varchar 8000 NULL NULL
dbo datasource_instances connectionDetails.account varchar 8000 NULL NULL
dbo datasource_instances connectionDetails.sharePointSiteUrl varchar 8000 NULL NULL
dbo datasource_instances connectionDetails.url varchar 8000 NULL NULL
dbo datasource_instances connectionDetails.path varchar 8000 NULL NULL
dbo datasource_instances DatasourceType varchar 8000 NULL NULL
dbo datasource_instances GatewayId varchar 8000 NULL NULL
dbo datasource_instances DatasourceId varchar 8000 NULL NULL
dbo datasource_instances connectionDetails.extensionDataSourceKind varchar 8000 NULL NULL
dbo datasource_instances connectionDetails.database varchar 8000 NULL NULL
dbo datasource_instances connectionDetails.server varchar 8000 NULL NULL
dbo datasource_instances connectionDetails.domain varchar 8000 NULL NULL
dbo datasource_instances connectionDetails.loginServer varchar 8000 NULL NULL
dbo datasource_instances connectionDetails.classInfo varchar 8000 NULL NULL
dbo datasource_instances connectionDetails.systemNumber varchar 8000 NULL NULL
dbo datasource_instances connectionDetails.clientId varchar 8000 NULL NULL
dbo datasource_instances connectionDetails.connectionString varchar 8000 NULL NULL
dbo datasource_instances connectionDetails.emailAddress varchar 8000 NULL NULL
dbo delegated_tenant_settings_overrides settingName varchar 8000 NULL NULL
dbo delegated_tenant_settings_overrides title varchar 8000 NULL NULL
dbo delegated_tenant_settings_overrides TransferDateTime datetime2 NULL NULL NULL
dbo delegated_tenant_settings_overrides canSpecifySecurityGroups bit NULL NULL NULL
dbo delegated_tenant_settings_overrides OverrideId varchar 8000 NULL NULL
dbo delegated_tenant_settings_overrides TransferDate date NULL NULL NULL
dbo delegated_tenant_settings_overrides CapacityId varchar 8000 NULL NULL
dbo delegated_tenant_settings_overrides tenantSettingGroup varchar 8000 NULL NULL
dbo delegated_tenant_settings_overrides delegatedFrom varchar 8000 NULL NULL
dbo delegated_tenant_settings_overrides enabled bit NULL NULL NULL
dbo environments State varchar 8000 NULL NULL
dbo environments CreatedDateTime varchar 8000 NULL NULL
dbo environments EnvironmentId varchar 8000 NULL NULL
dbo environments CreatedDate varchar 8000 NULL NULL
dbo environments LastUpdatedDateTime varchar 8000 NULL NULL
dbo environments CreatedById varchar 8000 NULL NULL
dbo environments Name varchar 8000 NULL NULL
dbo environments WorkspaceId varchar 8000 NULL NULL
dbo environments Description varchar 8000 NULL NULL
dbo environments LastUpdatedDate varchar 8000 NULL NULL
dbo environments ModifiedById varchar 8000 NULL NULL
dbo environments tags varchar 8000 NULL NULL
dbo environments sensitivityLabel.labelId varchar 8000 NULL NULL
dbo eventhouses CreatedById varchar 8000 NULL NULL
dbo eventhouses State varchar 8000 NULL NULL
dbo eventhouses EventhouseId varchar 8000 NULL NULL
dbo eventhouses CreatedDate varchar 8000 NULL NULL
dbo eventhouses CreatedDateTime varchar 8000 NULL NULL
dbo eventhouses ModifiedById varchar 8000 NULL NULL
dbo eventhouses Name varchar 8000 NULL NULL
dbo eventhouses WorkspaceId varchar 8000 NULL NULL
dbo eventhouses LastUpdatedDate varchar 8000 NULL NULL
dbo eventhouses Description varchar 8000 NULL NULL
dbo eventhouses LastUpdatedDateTime varchar 8000 NULL NULL
dbo eventhouses tags varchar 8000 NULL NULL
dbo eventhouses sensitivityLabel.labelId varchar 8000 NULL NULL
dbo eventstreams LastUpdatedDateTime varchar 8000 NULL NULL
dbo eventstreams Name varchar 8000 NULL NULL
dbo eventstreams extendedProperties.EnableEventStreamApiV2 float NULL 53 NULL
dbo eventstreams WorkspaceId varchar 8000 NULL NULL
dbo eventstreams relations varchar 8000 NULL NULL
dbo eventstreams CreatedById varchar 8000 NULL NULL
dbo eventstreams Description varchar 8000 NULL NULL
dbo eventstreams CreatedDate varchar 8000 NULL NULL
dbo eventstreams ModifiedById varchar 8000 NULL NULL
dbo eventstreams extendedProperties.EnableSchemaAssociation float NULL 53 NULL
dbo eventstreams LastUpdatedDate varchar 8000 NULL NULL
dbo eventstreams EventstreamId varchar 8000 NULL NULL
dbo eventstreams extendedProperties.EventStreamVersion varchar 8000 NULL NULL
dbo eventstreams State varchar 8000 NULL NULL
dbo eventstreams CreatedDateTime varchar 8000 NULL NULL
dbo eventstreams tags varchar 8000 NULL NULL
dbo eventstreams extendedProperties.RetentionTimeInDays float NULL 53 NULL
dbo eventstreams extendedProperties.ThroughputLevel varchar 8000 NULL NULL
dbo eventstreams sensitivityLabel.labelId varchar 8000 NULL NULL
dbo gateway_logs_inventory GatewayID varchar 8000 NULL NULL
dbo gateway_logs_inventory Date date NULL NULL NULL
dbo gateway_logs_inventory path varchar 8000 NULL NULL
dbo gateway_logs_inventory name varchar 8000 NULL NULL
dbo gateway_logs_inventory size bigint NULL 19 0
dbo gateway_logs_inventory Type varchar 8000 NULL NULL
dbo gateway_logs_inventory Category varchar 8000 NULL NULL
dbo git_connections repositoryName varchar 8000 NULL NULL
dbo git_connections workspaceId varchar 8000 NULL NULL
dbo git_connections gitProviderType varchar 8000 NULL NULL
dbo item_users DisplayName varchar 8000 NULL NULL
dbo item_users ItemName varchar 8000 NULL NULL
dbo item_users UserType varchar 8000 NULL NULL
dbo item_users ItemType varchar 8000 NULL NULL
dbo item_users GraphId varchar 8000 NULL NULL
dbo item_users WorkspaceId varchar 8000 NULL NULL
dbo item_users ItemId varchar 8000 NULL NULL
dbo item_users Identifier varchar 8000 NULL NULL
dbo item_users AccessRight varchar 8000 NULL NULL
dbo item_users EmailAddress varchar 8000 NULL NULL
dbo item_users PrincipalType varchar 8000 NULL NULL
dbo kql_databases KQLDatabaseId varchar 8000 NULL NULL
dbo kql_databases extendedProperties.IngestionServiceUri varchar 8000 NULL NULL
dbo kql_databases createdBy varchar 8000 NULL NULL
dbo kql_databases State varchar 8000 NULL NULL
dbo kql_databases WorkspaceId varchar 8000 NULL NULL
dbo kql_databases lastUpdatedDate varchar 8000 NULL NULL
dbo kql_databases relations varchar 8000 NULL NULL
dbo kql_databases Name varchar 8000 NULL NULL
dbo kql_databases extendedProperties.Region varchar 8000 NULL NULL
dbo kql_databases createdDate varchar 8000 NULL NULL
dbo kql_databases modifiedBy varchar 8000 NULL NULL
dbo kql_databases createdById varchar 8000 NULL NULL
dbo kql_databases extendedProperties.KustoDatabaseType varchar 8000 NULL NULL
dbo kql_databases modifiedById varchar 8000 NULL NULL
dbo kql_databases extendedProperties.QueryServiceUri varchar 8000 NULL NULL
dbo kql_databases Description varchar 8000 NULL NULL
dbo kql_databases tags varchar 8000 NULL NULL
dbo kql_databases sensitivityLabel.labelId varchar 8000 NULL NULL
dbo lakehouses Name varchar 8000 NULL NULL
dbo lakehouses Description varchar 8000 NULL NULL
dbo lakehouses extendedProperties.DefaultSchema varchar 8000 NULL NULL
dbo lakehouses State varchar 8000 NULL NULL
dbo lakehouses datasourceUsages varchar 8000 NULL NULL
dbo lakehouses LakehouseId varchar 8000 NULL NULL
dbo lakehouses CreatedById varchar 8000 NULL NULL
dbo lakehouses ModifiedById varchar 8000 NULL NULL
dbo lakehouses relations varchar 8000 NULL NULL
dbo lakehouses extendedProperties.DwProperties varchar 8000 NULL NULL
dbo lakehouses extendedProperties.OneLakeTablesPath varchar 8000 NULL NULL
dbo lakehouses CreatedDateTime varchar 8000 NULL NULL
dbo lakehouses WorkspaceId varchar 8000 NULL NULL
dbo lakehouses LastUpdatedDateTime varchar 8000 NULL NULL
dbo lakehouses CreatedDate varchar 8000 NULL NULL
dbo lakehouses extendedProperties.OneLakeFilesPath varchar 8000 NULL NULL
dbo lakehouses LastUpdatedDate varchar 8000 NULL NULL
dbo lakehouses tags varchar 8000 NULL NULL
dbo lakehouses sensitivityLabel.labelId varchar 8000 NULL NULL
dbo ml_models WorkspaceId varchar 8000 NULL NULL
dbo ml_models MLModelId varchar 8000 NULL NULL
dbo ml_models Name varchar 8000 NULL NULL
dbo ml_models Description varchar 8000 NULL NULL
dbo ml_models State varchar 8000 NULL NULL
dbo ml_models LastUpdatedDateTime varchar 8000 NULL NULL
dbo ml_models CreatedDateTime varchar 8000 NULL NULL
dbo ml_models ModifiedById varchar 8000 NULL NULL
dbo ml_models CreatedById varchar 8000 NULL NULL
dbo ml_models relations varchar 8000 NULL NULL
dbo ml_models tags varchar 8000 NULL NULL
dbo ml_models CreatedDate varchar 8000 NULL NULL
dbo ml_models LastUpdatedDate varchar 8000 NULL NULL
dbo ml_models sensitivityLabel.labelId varchar 8000 NULL NULL
dbo notebooks Description varchar 8000 NULL NULL
dbo notebooks LastUpdatedDateTime varchar 8000 NULL NULL
dbo notebooks ModifiedById varchar 8000 NULL NULL
dbo notebooks CreatedDate varchar 8000 NULL NULL
dbo notebooks CreatedById varchar 8000 NULL NULL
dbo notebooks relations varchar 8000 NULL NULL
dbo notebooks LastUpdatedDate varchar 8000 NULL NULL
dbo notebooks WorkspaceId varchar 8000 NULL NULL
dbo notebooks Name varchar 8000 NULL NULL
dbo notebooks State varchar 8000 NULL NULL
dbo notebooks NotebookId varchar 8000 NULL NULL
dbo notebooks CreatedDateTime varchar 8000 NULL NULL
dbo notebooks tags varchar 8000 NULL NULL
dbo notebooks sensitivityLabel.labelId varchar 8000 NULL NULL
dbo notebooks extendedProperties.TemporarySource varchar 8000 NULL NULL
dbo notebooks extendedProperties.AutoBindingLakehouse varchar 8000 NULL NULL
dbo pipelines ModifiedById varchar 8000 NULL NULL
dbo pipelines CreatedDateTime varchar 8000 NULL NULL
dbo pipelines Description varchar 8000 NULL NULL
dbo pipelines Name varchar 8000 NULL NULL
dbo pipelines relations varchar 8000 NULL NULL
dbo pipelines PipelineId varchar 8000 NULL NULL
dbo pipelines WorkspaceId varchar 8000 NULL NULL
dbo pipelines LastUpdatedDateTime varchar 8000 NULL NULL
dbo pipelines CreatedById varchar 8000 NULL NULL
dbo pipelines datasourceUsages varchar 8000 NULL NULL
dbo pipelines LastUpdatedDate varchar 8000 NULL NULL
dbo pipelines CreatedDate varchar 8000 NULL NULL
dbo pipelines State varchar 8000 NULL NULL
dbo pipelines tags varchar 8000 NULL NULL
dbo pipelines sensitivityLabel.labelId varchar 8000 NULL NULL
dbo reflexes Description varchar 8000 NULL NULL
dbo reflexes extendedProperties.Version varchar 8000 NULL NULL
dbo reflexes Name varchar 8000 NULL NULL
dbo reflexes LastUpdatedDateTime varchar 8000 NULL NULL
dbo reflexes ModifiedById varchar 8000 NULL NULL
dbo reflexes ReflexId varchar 8000 NULL NULL
dbo reflexes CreatedDate varchar 8000 NULL NULL
dbo reflexes CreatedDateTime varchar 8000 NULL NULL
dbo reflexes State varchar 8000 NULL NULL
dbo reflexes WorkspaceId varchar 8000 NULL NULL
dbo reflexes extendedProperties.Aria.DocumentId varchar 8000 NULL NULL
dbo reflexes LastUpdatedDate varchar 8000 NULL NULL
dbo reflexes CreatedById varchar 8000 NULL NULL
dbo reflexes relations varchar 8000 NULL NULL
dbo reflexes tags varchar 8000 NULL NULL
dbo reflexes sensitivityLabel.labelId varchar 8000 NULL NULL
dbo reports OriginalReportObjectId varchar 8000 NULL NULL
dbo reports SemanticModelId varchar 8000 NULL NULL
dbo reports ModifiedDateTime varchar 8000 NULL NULL
dbo reports CreatedDateTime varchar 8000 NULL NULL
dbo reports CreatedById varchar 8000 NULL NULL
dbo reports description varchar 8000 NULL NULL
dbo reports ModifiedById varchar 8000 NULL NULL
dbo reports ReportType varchar 8000 NULL NULL
dbo reports AppId varchar 8000 NULL NULL
dbo reports WorkspaceId varchar 8000 NULL NULL
dbo reports users varchar 8000 NULL NULL
dbo reports SemanticModelWorkspaceId varchar 8000 NULL NULL
dbo reports ReportId varchar 8000 NULL NULL
dbo reports Name varchar 8000 NULL NULL
dbo reports tags varchar 8000 NULL NULL
dbo reports sensitivityLabel.labelId varchar 8000 NULL NULL
dbo reports endorsementDetails.endorsement varchar 8000 NULL NULL
dbo reports endorsementDetails.certifiedBy varchar 8000 NULL NULL
dbo reports format varchar 8000 NULL NULL
dbo semantic_model_analyzer_runs TotalSize bigint NULL 19 0
dbo semantic_model_analyzer_runs SemanticModelRunKey varchar 8000 NULL NULL
dbo semantic_model_analyzer_runs HierarchySize bigint NULL 19 0
dbo semantic_model_analyzer_runs UserHierarchySize bigint NULL 19 0
dbo semantic_model_analyzer_runs DictionarySize bigint NULL 19 0
dbo semantic_model_analyzer_runs DataSize bigint NULL 19 0
dbo semantic_model_analyzer_runs WorkspaceId varchar 8000 NULL NULL
dbo semantic_model_analyzer_runs CompatibilityLevel bigint NULL 19 0
dbo semantic_model_analyzer_runs SemanticModelId varchar 8000 NULL NULL
dbo semantic_model_analyzer_runs RunName varchar 8000 NULL NULL
dbo semantic_model_analyzer_runs RelationshipSize bigint NULL 19 0
dbo semantic_model_analyzer_runs AnalyzerMode varchar 8000 NULL NULL
dbo semantic_model_analyzer_runs TableCount bigint NULL 19 0
dbo semantic_model_analyzer_runs RunId bigint NULL 19 0
dbo semantic_model_analyzer_runs RunTimestamp datetime2 NULL NULL NULL
dbo semantic_model_bpa_results ObjectName varchar 8000 NULL NULL
dbo semantic_model_bpa_results SemanticModelRunKey varchar 8000 NULL NULL
dbo semantic_model_bpa_results WorkspaceId varchar 8000 NULL NULL
dbo semantic_model_bpa_results RuleId bigint NULL 19 0
dbo semantic_model_bpa_results Timestamp datetime2 NULL NULL NULL
dbo semantic_model_bpa_results SemanticModelId varchar 8000 NULL NULL
dbo semantic_model_bpa_results RunName varchar 8000 NULL NULL
dbo semantic_model_bpa_results Scope varchar 8000 NULL NULL
dbo semantic_model_bpa_results RunId bigint NULL 19 0
dbo semantic_model_bpa_rule_catalog RuleId bigint NULL 19 0
dbo semantic_model_bpa_rule_catalog Category varchar 8000 NULL NULL
dbo semantic_model_bpa_rule_catalog WithDependency bit NULL NULL NULL
dbo semantic_model_bpa_rule_catalog Scope varchar 8000 NULL NULL
dbo semantic_model_bpa_rule_catalog Severity varchar 8000 NULL NULL
dbo semantic_model_bpa_rule_catalog ImpactArea varchar 8000 NULL NULL
dbo semantic_model_bpa_rule_catalog RuleName varchar 8000 NULL NULL
dbo semantic_model_bpa_rule_catalog Description varchar 8000 NULL NULL
dbo semantic_model_bpa_rule_catalog URL varchar 8000 NULL NULL
dbo semantic_model_content_types ContentProviderType varchar 8000 NULL NULL
dbo semantic_model_content_types MappedName varchar 8000 NULL NULL
dbo semantic_model_content_types IsIncludedInOptimizationModule bit NULL NULL NULL
dbo semantic_model_content_types IsDefaultSemanticModel bit NULL NULL NULL
dbo semantic_model_vertipaq_relationships UsedSize bigint NULL 19 0
dbo semantic_model_vertipaq_relationships State varchar 8000 NULL NULL
dbo semantic_model_vertipaq_relationships WorkspaceId varchar 8000 NULL NULL
dbo semantic_model_vertipaq_relationships ModifiedTime datetime2 NULL NULL NULL
dbo semantic_model_vertipaq_relationships FromObject varchar 8000 NULL NULL
dbo semantic_model_vertipaq_relationships SemanticModelRunKey varchar 8000 NULL NULL
dbo semantic_model_vertipaq_relationships RelyOnReferentialIntegrity bit NULL NULL NULL
dbo semantic_model_vertipaq_relationships Active bit NULL NULL NULL
dbo semantic_model_vertipaq_relationships FromColumn varchar 8000 NULL NULL
dbo semantic_model_vertipaq_relationships CrossFilteringBehavior varchar 8000 NULL NULL
dbo semantic_model_vertipaq_relationships JoinOnDateBehavior varchar 8000 NULL NULL
dbo semantic_model_vertipaq_relationships ToObject varchar 8000 NULL NULL
dbo semantic_model_vertipaq_relationships ToColumn varchar 8000 NULL NULL
dbo semantic_model_vertipaq_relationships SemanticModelId varchar 8000 NULL NULL
dbo semantic_model_vertipaq_relationships FromTable varchar 8000 NULL NULL
dbo semantic_model_vertipaq_relationships RunId bigint NULL 19 0
dbo semantic_model_vertipaq_relationships ToTable varchar 8000 NULL NULL
dbo semantic_model_vertipaq_relationships Multiplicity varchar 8000 NULL NULL
dbo semantic_model_vertipaq_relationships SecurityFilteringBehavior varchar 8000 NULL NULL
dbo semantic_model_vertipaq_relationships RelationshipName varchar 8000 NULL NULL
dbo semantic_model_vertipaq_tables Hidden bit NULL NULL NULL
dbo semantic_model_vertipaq_tables HierarchySize bigint NULL 19 0
dbo semantic_model_vertipaq_tables Columns bigint NULL 19 0
dbo semantic_model_vertipaq_tables RefreshPolicy bit NULL NULL NULL
dbo semantic_model_vertipaq_tables WorkspaceId varchar 8000 NULL NULL
dbo semantic_model_vertipaq_tables RowCount bigint NULL 19 0
dbo semantic_model_vertipaq_tables Type varchar 8000 NULL NULL
dbo semantic_model_vertipaq_tables DataSize bigint NULL 19 0
dbo semantic_model_vertipaq_tables RunId bigint NULL 19 0
dbo semantic_model_vertipaq_tables Name varchar 8000 NULL NULL
dbo semantic_model_vertipaq_tables DataCategory varchar 8000 NULL NULL
dbo semantic_model_vertipaq_tables TotalSize bigint NULL 19 0
dbo semantic_model_vertipaq_tables SemanticModelId varchar 8000 NULL NULL
dbo semantic_model_vertipaq_tables Description varchar 8000 NULL NULL
dbo semantic_model_vertipaq_tables DictionarySize bigint NULL 19 0
dbo semantic_model_vertipaq_tables PercentageOfDB float NULL 53 NULL
dbo semantic_model_vertipaq_tables Partitions bigint NULL 19 0
dbo semantic_model_vertipaq_tables RelationshipSize bigint NULL 19 0
dbo semantic_model_vertipaq_tables UserHierarchySize bigint NULL 19 0
dbo semantic_model_vertipaq_tables SemanticModelRunKey varchar 8000 NULL NULL
dbo semantic_models refreshSchedule.localTimeZoneId varchar 8000 NULL NULL
dbo semantic_models SemanticModelId varchar 8000 NULL NULL
dbo semantic_models isEffectiveIdentityRolesRequired bit NULL NULL NULL
dbo semantic_models WorkspaceId varchar 8000 NULL NULL
dbo semantic_models CreatedDateTime varchar 8000 NULL NULL
dbo semantic_models CreatedDate varchar 8000 NULL NULL
dbo semantic_models refreshSchedule.enabled bit NULL NULL NULL
dbo semantic_models ContentProviderType varchar 8000 NULL NULL
dbo semantic_models description varchar 8000 NULL NULL
dbo semantic_models directQueryRefreshSchedule.localTimeZoneId varchar 8000 NULL NULL
dbo semantic_models Name varchar 8000 NULL NULL
dbo semantic_models refreshSchedule.notifyOption varchar 8000 NULL NULL
dbo semantic_models ConfiguredById varchar 8000 NULL NULL
dbo semantic_models isEffectiveIdentityRequired bit NULL NULL NULL
dbo semantic_models StorageMode varchar 8000 NULL NULL
dbo semantic_models directQueryRefreshSchedule.frequency float NULL 53 NULL
dbo semantic_models tags varchar 8000 NULL NULL
dbo semantic_models sensitivityLabel.labelId varchar 8000 NULL NULL
dbo semantic_models endorsementDetails.endorsement varchar 8000 NULL NULL
dbo semantic_models upstreamDataflows varchar 8000 NULL NULL
dbo semantic_models endorsementDetails.certifiedBy varchar 8000 NULL NULL
dbo tenant_settings TransferDate date NULL NULL NULL
dbo tenant_settings enabled bit NULL NULL NULL
dbo tenant_settings tenantSettingGroup varchar 8000 NULL NULL
dbo tenant_settings canSpecifySecurityGroups bit NULL NULL NULL
dbo tenant_settings settingName varchar 8000 NULL NULL
dbo tenant_settings delegateToDomain bit NULL NULL NULL
dbo tenant_settings delegateToWorkspace bit NULL NULL NULL
dbo tenant_settings TransferDateTime datetime2 NULL NULL NULL
dbo tenant_settings title varchar 8000 NULL NULL
dbo tenant_settings delegateToCapacity bit NULL NULL NULL
dbo tenant_settings_enabled_security_groups TransferDateTime datetime2 NULL NULL NULL
dbo tenant_settings_enabled_security_groups name varchar 8000 NULL NULL
dbo tenant_settings_enabled_security_groups settingName varchar 8000 NULL NULL
dbo tenant_settings_enabled_security_groups graphId varchar 8000 NULL NULL
dbo warehouses ModifiedDateTime varchar 8000 NULL NULL
dbo warehouses Name varchar 8000 NULL NULL
dbo warehouses WarehouseId varchar 8000 NULL NULL
dbo warehouses relations varchar 8000 NULL NULL
dbo warehouses ModifiedById varchar 8000 NULL NULL
dbo warehouses ModifiedDate varchar 8000 NULL NULL
dbo warehouses WorkspaceId varchar 8000 NULL NULL
dbo warehouses ConfiguredById varchar 8000 NULL NULL
dbo warehouses tags varchar 8000 NULL NULL
dbo warehouses sensitivityLabel.labelId varchar 8000 NULL NULL
dbo warehouses Description varchar 8000 NULL NULL
dbo workspace_monitoring_reports SemanticModelId varchar 8000 NULL NULL
dbo workspace_monitoring_reports ReportId varchar 8000 NULL NULL
dbo workspace_monitoring_reports KQLDatabaseName varchar 8000 NULL NULL
dbo workspace_monitoring_reports KQLDatabaseId varchar 8000 NULL NULL
dbo workspace_monitoring_reports ReportName varchar 8000 NULL NULL
dbo workspace_monitoring_reports WorkspaceId varchar 8000 NULL NULL
dbo workspaces IsReadOnly bit NULL NULL NULL
dbo workspaces WorkspaceId varchar 8000 NULL NULL
dbo workspaces Type varchar 8000 NULL NULL
dbo workspaces fuam_deleted bit NULL NULL NULL
dbo workspaces CapacityMigrationStatus varchar 8000 NULL NULL
dbo workspaces Description varchar 8000 NULL NULL
dbo workspaces fuam_modified_at datetime2 NULL NULL NULL
dbo workspaces State varchar 8000 NULL NULL
dbo workspaces IsOnDedicatedCapacity bit NULL NULL NULL
dbo workspaces WorkspaceName varchar 8000 NULL NULL
dbo workspaces HasWorkspaceLevelSettings bit NULL NULL NULL
dbo workspaces CapacityId varchar 8000 NULL NULL
dbo workspaces DefaultDatasetStorageFormat varchar 8000 NULL NULL
dbo workspaces_history IsOnDedicatedCapacity bit NULL NULL NULL
dbo workspaces_history WorkspaceName varchar 8000 NULL NULL
dbo workspaces_history HasWorkspaceLevelSettings bit NULL NULL NULL
dbo workspaces_history State varchar 8000 NULL NULL
dbo workspaces_history IsReadOnly bit NULL NULL NULL
dbo workspaces_history DefaultDatasetStorageFormat varchar 8000 NULL NULL
dbo workspaces_history Snapshot_WorkspaceId varchar 8000 NULL NULL
dbo workspaces_history fuam_modified_at datetime2 NULL NULL NULL
dbo workspaces_history fuam_deleted bit NULL NULL NULL
dbo workspaces_history CapacityId varchar 8000 NULL NULL
dbo workspaces_history Snapshot_CapacityId varchar 8000 NULL NULL
dbo workspaces_history Type varchar 8000 NULL NULL
dbo workspaces_history WorkspaceId varchar 8000 NULL NULL
dbo workspaces_history CapacityMigrationStatus varchar 8000 NULL NULL
dbo workspaces_history Snapshot_Date date NULL NULL NULL
dbo workspaces_history Description varchar 8000 NULL NULL
dbo workspaces_scanned_users GraphId varchar 8000 NULL NULL
dbo workspaces_scanned_users PrincipalType varchar 8000 NULL NULL
dbo workspaces_scanned_users WorkspaceId varchar 8000 NULL NULL
dbo workspaces_scanned_users UserType varchar 8000 NULL NULL
dbo workspaces_scanned_users Identifier varchar 8000 NULL NULL
dbo workspaces_scanned_users GroupUserAccessRight varchar 8000 NULL NULL
dbo workspaces_scanned_users_history GraphId varchar 8000 NULL NULL
dbo workspaces_scanned_users_history Snapshot_Date date NULL NULL NULL
dbo workspaces_scanned_users_history PrincipalType varchar 8000 NULL NULL
dbo workspaces_scanned_users_history UserType varchar 8000 NULL NULL
dbo workspaces_scanned_users_history Snapshot_WorkspaceId varchar 8000 NULL NULL
dbo workspaces_scanned_users_history Identifier varchar 8000 NULL NULL
dbo workspaces_scanned_users_history GroupUserAccessRight varchar 8000 NULL NULL
dbo workspaces_scanned_users_history WorkspaceId varchar 8000 NULL NULL
queryinsights exec_requests_history distributed_statement_id uniqueidentifier NULL NULL NULL
queryinsights exec_requests_history database_name varchar 200 NULL NULL
queryinsights exec_requests_history submit_time datetime2 NULL NULL NULL
queryinsights exec_requests_history start_time datetime2 NULL NULL NULL
queryinsights exec_requests_history end_time datetime2 NULL NULL NULL
queryinsights exec_requests_history is_distributed int NULL 10 0
queryinsights exec_requests_history statement_type varchar 128 NULL NULL
queryinsights exec_requests_history total_elapsed_time_ms bigint NULL 19 0
queryinsights exec_requests_history login_name varchar 200 NULL NULL
queryinsights exec_requests_history row_count bigint NULL 19 0
queryinsights exec_requests_history status varchar 200 NULL NULL
queryinsights exec_requests_history session_id int NULL 10 0
queryinsights exec_requests_history connection_id uniqueidentifier NULL NULL NULL
queryinsights exec_requests_history program_name varchar 128 NULL NULL
queryinsights exec_requests_history batch_id uniqueidentifier NULL NULL NULL
queryinsights exec_requests_history root_batch_id uniqueidentifier NULL NULL NULL
queryinsights exec_requests_history query_hash varchar 200 NULL NULL
queryinsights exec_requests_history label varchar 8000 NULL NULL
queryinsights exec_requests_history result_cache_hit int NULL 10 0
queryinsights exec_requests_history allocated_cpu_time_ms bigint NULL 19 0
queryinsights exec_requests_history data_scanned_remote_storage_mb decimal NULL 18 3
queryinsights exec_requests_history data_scanned_memory_mb decimal NULL 18 3
queryinsights exec_requests_history data_scanned_disk_mb decimal NULL 18 3
queryinsights exec_requests_history command varchar 8000 NULL NULL
queryinsights exec_sessions_history session_id int NULL 10 0
queryinsights exec_sessions_history connection_id uniqueidentifier NULL NULL NULL
queryinsights exec_sessions_history session_start_time datetime2 NULL NULL NULL
queryinsights exec_sessions_history session_end_time datetime2 NULL NULL NULL
queryinsights exec_sessions_history program_name varchar 256 NULL NULL
queryinsights exec_sessions_history login_name varchar 256 NULL NULL
queryinsights exec_sessions_history status varchar 100 NULL NULL
queryinsights exec_sessions_history context_info varchar 128 NULL NULL
queryinsights exec_sessions_history total_query_elapsed_time_ms bigint NULL 19 0
queryinsights exec_sessions_history last_request_start_time datetime2 NULL NULL NULL
queryinsights exec_sessions_history last_request_end_time datetime2 NULL NULL NULL
queryinsights exec_sessions_history is_user_process bit NULL NULL NULL
queryinsights exec_sessions_history prev_error int NULL 10 0
queryinsights exec_sessions_history group_id bigint NULL 19 0
queryinsights exec_sessions_history database_id int NULL 10 0
queryinsights exec_sessions_history authenticating_database_id int NULL 10 0
queryinsights exec_sessions_history open_transaction_count bigint NULL 19 0
queryinsights exec_sessions_history text_size int NULL 10 0
queryinsights exec_sessions_history language varchar 256 NULL NULL
queryinsights exec_sessions_history date_format varchar 20 NULL NULL
queryinsights exec_sessions_history date_first int NULL 10 0
queryinsights exec_sessions_history quoted_identifier bit NULL NULL NULL
queryinsights exec_sessions_history arithabort bit NULL NULL NULL
queryinsights exec_sessions_history ansi_null_dflt_on bit NULL NULL NULL
queryinsights exec_sessions_history ansi_defaults bit NULL NULL NULL
queryinsights exec_sessions_history ansi_warnings bit NULL NULL NULL
queryinsights exec_sessions_history ansi_padding bit NULL NULL NULL
queryinsights exec_sessions_history ansi_nulls bit NULL NULL NULL
queryinsights exec_sessions_history concat_null_yields_null bit NULL NULL NULL
queryinsights exec_sessions_history transaction_isolation_level int NULL 10 0
queryinsights exec_sessions_history lock_timeout bigint NULL 19 0
queryinsights exec_sessions_history deadlock_priority int NULL 10 0
queryinsights exec_sessions_history original_security_id varchar 200 NULL NULL
queryinsights exec_sessions_history database_name varchar 200 NULL NULL
queryinsights frequently_run_queries database_name varchar 200 NULL NULL
queryinsights frequently_run_queries number_of_runs int NULL 10 0
queryinsights frequently_run_queries min_run_total_elapsed_time_ms bigint NULL 19 0
queryinsights frequently_run_queries max_run_total_elapsed_time_ms bigint NULL 19 0
queryinsights frequently_run_queries avg_total_elapsed_time_ms bigint NULL 19 0
queryinsights frequently_run_queries number_of_successful_runs int NULL 10 0
queryinsights frequently_run_queries number_of_failed_runs int NULL 10 0
queryinsights frequently_run_queries number_of_canceled_runs int NULL 10 0
queryinsights frequently_run_queries last_run_total_elapsed_time_ms bigint NULL 19 0
queryinsights frequently_run_queries last_run_start_time datetime2 NULL NULL NULL
queryinsights frequently_run_queries last_dist_statement_id uniqueidentifier NULL NULL NULL
queryinsights frequently_run_queries query_hash varchar 200 NULL NULL
queryinsights frequently_run_queries last_run_command varchar 8000 NULL NULL
queryinsights long_running_queries database_name varchar 200 NULL NULL
queryinsights long_running_queries median_total_elapsed_time_ms float NULL 53 NULL
queryinsights long_running_queries last_run_total_elapsed_time_ms bigint NULL 19 0
queryinsights long_running_queries last_run_start_time datetime2 NULL NULL NULL
queryinsights long_running_queries last_dist_statement_id uniqueidentifier NULL NULL NULL
queryinsights long_running_queries last_run_session_id int NULL 10 0
queryinsights long_running_queries number_of_runs int NULL 10 0
queryinsights long_running_queries query_hash varchar 200 NULL NULL
queryinsights long_running_queries last_run_command varchar 8000 NULL NULL
sys external_delta_tables table_id bigint NULL 19 0
sys external_delta_tables is_blocked bit NULL NULL NULL
sys external_delta_tables relative_path nvarchar 2000 NULL NULL
sys external_delta_tables latest_manifest_version bigint NULL 19 0
sys external_delta_tables latest_checkpoint_version bigint NULL 19 0
sys external_delta_tables latest_etag nvarchar 128 NULL NULL
sys external_delta_tables last_update_time datetime NULL NULL NULL
sys managed_delta_table_checkpoints delta_log_commit_sequence_id bigint NULL 19 0
sys managed_delta_table_checkpoints part int NULL 10 0
sys managed_delta_table_checkpoints file_guid uniqueidentifier NULL NULL NULL
sys managed_delta_table_checkpoints version bigint NULL 19 0
sys managed_delta_table_checkpoints source_table_guid uniqueidentifier NULL NULL NULL
sys managed_delta_table_checkpoints source_database_guid uniqueidentifier NULL NULL NULL
sys managed_delta_table_checkpoints table_guid uniqueidentifier NULL NULL NULL
sys managed_delta_table_checkpoints checkpoint_file_name nvarchar 256 NULL NULL
sys managed_delta_table_checkpoints manifest_root nvarchar 256 NULL NULL
sys managed_delta_table_forks commit_sequence_id bigint NULL 19 0
sys managed_delta_table_forks fork_guid uniqueidentifier NULL NULL NULL
sys managed_delta_table_forks source_table_guid uniqueidentifier NULL NULL NULL
sys managed_delta_table_forks source_database_guid uniqueidentifier NULL NULL NULL
sys managed_delta_table_forks xdes_ts bigint NULL 19 0
sys managed_delta_table_forks commit_time datetime NULL NULL NULL
sys managed_delta_table_forks table_guid uniqueidentifier NULL NULL NULL
sys managed_delta_table_forks folder_name nvarchar 40 NULL NULL
sys managed_delta_table_log_files commit_sequence_id bigint NULL 19 0
sys managed_delta_table_log_files file_guid uniqueidentifier NULL NULL NULL
sys managed_delta_table_log_files xdes_ts bigint NULL 19 0
sys managed_delta_table_log_files append_only bit NULL NULL NULL
sys managed_delta_table_log_files rows_inserted bigint NULL 19 0
sys managed_delta_table_log_files commit_time datetime NULL NULL NULL
sys managed_delta_table_log_files source_table_guid uniqueidentifier NULL NULL NULL
sys managed_delta_table_log_files source_database_guid uniqueidentifier NULL NULL NULL
sys managed_delta_table_log_files manifest_file_name nvarchar 256 NULL NULL
sys managed_delta_table_log_files manifest_root nvarchar 256 NULL NULL
sys managed_delta_table_log_files table_guid uniqueidentifier NULL NULL NULL
sys managed_delta_tables table_id bigint NULL 19 0
sys managed_delta_tables object_id int NULL 10 0
sys managed_delta_tables table_guid uniqueidentifier NULL NULL NULL
sys managed_delta_tables fork_guid uniqueidentifier NULL NULL NULL
sys managed_delta_tables delta_log_feature_status int NULL 10 0
sys managed_delta_tables manifest_root nvarchar 256 NULL NULL
sys managed_delta_tables system_task_consideration_bitmask int NULL 10 0
sys managed_delta_tables drop_commit_time datetime NULL NULL NULL
sys sys_dw_schemas name nvarchar 128 NULL NULL
sys sys_dw_schemas schema_id int NULL 10 0
sys sys_dw_schemas principal_id int NULL 10 0
sys sys_dw_schemas is_internal bit NULL NULL NULL
table Measures
lineageTag: a1b2c3d4-0007-0001-0001-000000000001
measure 'Total RLS Models' = COUNTROWS(rls_semantic_models)
lineageTag: a1b2c3d4-0007-0001-0001-000000000002
formatString: #,##0
measure 'Total Roles' = COUNTROWS(rls_roles)
lineageTag: a1b2c3d4-0007-0001-0001-000000000003
formatString: #,##0
measure 'Total Role Assignments' = COUNTROWS(rls_role_members)
lineageTag: a1b2c3d4-0007-0001-0001-000000000004
formatString: #,##0
measure 'Unique Members' = DISTINCTCOUNT(rls_role_members[MemberName])
lineageTag: a1b2c3d4-0007-0001-0001-000000000005
formatString: #,##0
measure 'Linked Reports' = COUNTROWS(rls_reports)
lineageTag: a1b2c3d4-0007-0001-0001-000000000006
formatString: #,##0
measure 'XMLA Capacities' = CALCULATE(COUNTROWS(rls_capacity_xmla_status), rls_capacity_xmla_status[XmlaEndpointEnabled] = TRUE())
lineageTag: a1b2c3d4-0007-0001-0001-000000000007
formatString: #,##0
measure 'Total Capacities' = COUNTROWS(rls_capacity_xmla_status)
lineageTag: a1b2c3d4-0007-0001-0001-000000000008
formatString: #,##0
measure 'Empty Roles' = CALCULATE(COUNTROWS(rls_roles), rls_roles[MemberCount] = 0)
lineageTag: a1b2c3d4-0007-0001-0001-000000000009
formatString: #,##0
measure 'Successful Extractions' = CALCULATE(COUNTROWS(rls_semantic_models), rls_semantic_models[ExtractionStatus] = "Success")
lineageTag: a1b2c3d4-0007-0001-0001-000000000010
formatString: #,##0
measure 'Failed Extractions' = CALCULATE(COUNTROWS(rls_semantic_models), CONTAINSSTRING(rls_semantic_models[ExtractionStatus], "Failed"))
lineageTag: a1b2c3d4-0007-0001-0001-000000000011
formatString: #,##0
measure 'XMLA Coverage %' = DIVIDE([XMLA Capacities], [Total Capacities], 0)
lineageTag: a1b2c3d4-0007-0001-0001-000000000012
formatString: 0.0%
measure 'Avg Members Per Role' = DIVIDE([Total Role Assignments], [Total Roles], 0)
lineageTag: a1b2c3d4-0007-0001-0001-000000000013
formatString: #,##0.0
partition Measures = calculated
mode: import
source = {BLANK()}
model Model
culture: en-US
defaultPowerBIDataSourceVersion: powerBI_V3
sourceQueryCulture: en-US
annotation PBI_QueryOrder = ["rls_semantic_models","rls_roles","rls_role_table_permissions","rls_role_members","rls_capacity_xmla_status","rls_reports"]
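The ratio measures above ('XMLA Coverage %', 'Avg Members Per Role') lean on the third argument of DAX `DIVIDE` to return 0 instead of an error when the denominator is blank or zero. A minimal Python sketch of that semantics (the `dax_divide` helper is illustrative, not part of the model):

```python
def dax_divide(numerator, denominator, alternate=None):
    """Mimic DAX DIVIDE(): return `alternate` when the denominator is 0,
    rather than raising ZeroDivisionError."""
    if denominator == 0:
        return alternate
    return numerator / denominator

# 'XMLA Coverage %' and 'Avg Members Per Role' both pass 0 as the alternate,
# so an empty rls_capacity_xmla_status or rls_roles table yields 0, not an error.
print(dax_divide(12, 15, 0))  # capacities with XMLA enabled / total capacities
print(dax_divide(7, 0, 0))    # no roles discovered yet -> 0
```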
# Fabric notebook source
# METADATA ********************
# META {
# META "kernel_info": {
# META "name": "synapse_pyspark"
# META },
# META "dependencies": {
# META "lakehouse": {
# META "default_lakehouse_name": "RLS_Audit",
# META "default_lakehouse_workspace_id": ""
# META }
# META }
# META }
# MARKDOWN ********************
# # RLS Discovery
#
# This notebook:
# 1. Reads FUAM Lakehouse to find all semantic models with RLS enabled
# 2. Tests XMLA endpoint connectivity per capacity (15+ capacities)
# 3. Saves discovery results as temp Delta tables for downstream notebooks
# PARAMETERS CELL ********************
# Parameters
run_date = ""
# CELL ********************
from datetime import datetime, date
from pyspark.sql.functions import lit, current_timestamp, col
from pyspark.sql.types import (
StructType, StructField, StringType, BooleanType, TimestampType, IntegerType
)
if not run_date:
run_date = date.today().isoformat()
print(f"RLS Discovery - Run Date: {run_date}")
# CELL ********************
# ============================================================
# Step 1: Read FUAM Lakehouse - Identify RLS-enabled semantic models
# ============================================================
# Query FUAM for semantic models with RLS enabled, enriched with workspace/capacity info
df_rls_models = spark.sql("""
SELECT
sm.SemanticModelId,
sm.Name AS SemanticModelName,
sm.WorkspaceId,
w.WorkspaceName,
w.CapacityId,
c.displayName AS CapacityName,
sm.isEffectiveIdentityRequired,
sm.isEffectiveIdentityRolesRequired,
sm.ContentProviderType,
sm.StorageMode
FROM fuam_lakehouse.dbo.semantic_models sm
INNER JOIN fuam_lakehouse.dbo.workspaces w
ON sm.WorkspaceId = w.WorkspaceId
LEFT JOIN fuam_lakehouse.dbo.capacities c
ON w.CapacityId = c.CapacityId
WHERE sm.isEffectiveIdentityRolesRequired = true
AND (w.fuam_deleted IS NULL OR w.fuam_deleted = false)
""")
rls_model_count = df_rls_models.count()
print(f"Found {rls_model_count} semantic models with RLS enabled")
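One detail in the query above is deliberate: `workspaces` is INNER JOINed (a model with no workspace row is noise), but `capacities` is LEFT JOINed, so RLS models in workspaces without a capacity row still appear, with a NULL CapacityName, instead of being silently dropped. A pure-Python sketch of that join semantics, with made-up rows for illustration:

```python
workspaces = [
    {"WorkspaceId": "w1", "CapacityId": "c1"},
    {"WorkspaceId": "w2", "CapacityId": None},  # workspace with no dedicated capacity
]
capacities = {"c1": "F64-Prod"}  # CapacityId -> displayName

# LEFT JOIN: every workspace survives; unmatched rows get CapacityName = None
joined = [{**w, "CapacityName": capacities.get(w["CapacityId"])} for w in workspaces]

for row in joined:
    print(row["WorkspaceId"], row["CapacityName"])
```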
# CELL ********************
# Also get ALL semantic models (including non-RLS) for completeness in reporting
df_all_models = spark.sql("""
SELECT
sm.SemanticModelId,
sm.Name AS SemanticModelName,
sm.WorkspaceId,
w.WorkspaceName,
w.CapacityId,
c.displayName AS CapacityName,
sm.isEffectiveIdentityRequired,
sm.isEffectiveIdentityRolesRequired,
sm.ContentProviderType,
sm.StorageMode
FROM fuam_lakehouse.dbo.semantic_models sm
INNER JOIN fuam_lakehouse.dbo.workspaces w
ON sm.WorkspaceId = w.WorkspaceId
LEFT JOIN fuam_lakehouse.dbo.capacities c
ON w.CapacityId = c.CapacityId
WHERE (w.fuam_deleted IS NULL OR w.fuam_deleted = false)
""")
total_model_count = df_all_models.count()
print(f"Total semantic models across all workspaces: {total_model_count}")
print(f"RLS-enabled: {rls_model_count} ({rls_model_count/max(total_model_count,1)*100:.1f}%)")
# CELL ********************
# Show RLS models by capacity
print("\nRLS-enabled models by capacity:")
df_rls_models.groupBy("CapacityId", "CapacityName").count().orderBy("count", ascending=False).show(50, truncate=False)
# CELL ********************
# ============================================================
# Step 2: Get distinct capacities and test XMLA connectivity
# ============================================================
import sempy.fabric as fabric
# Get distinct capacities that have RLS-enabled models
capacities = (
df_rls_models
.select("CapacityId", "CapacityName")
.distinct()
.collect()
)
print(f"\nTesting XMLA connectivity for {len(capacities)} capacities...")
# CELL ********************
# For XMLA testing, we need a sample workspace per capacity to attempt connection
# Get one workspace per capacity that has an RLS model
sample_workspaces = (
df_rls_models
.select("CapacityId", "WorkspaceId", "WorkspaceName", "SemanticModelId", "SemanticModelName")
.dropDuplicates(["CapacityId"])
.collect()
)
# Build a lookup: CapacityId -> (WorkspaceId, WorkspaceName, SemanticModelName)
capacity_sample = {}
for row in sample_workspaces:
capacity_sample[row.CapacityId] = {
"WorkspaceId": row.WorkspaceId,
"WorkspaceName": row.WorkspaceName,
"SemanticModelId": row.SemanticModelId,
"SemanticModelName": row.SemanticModelName,
}
# CELL ********************
# Test XMLA connectivity per capacity
xmla_status_records = []
for cap in capacities:
capacity_id = cap.CapacityId
capacity_name = cap.CapacityName
sample = capacity_sample.get(capacity_id)
if not sample:
xmla_status_records.append({
"CapacityId": capacity_id,
"CapacityName": capacity_name,
"XmlaEndpointEnabled": False,
"FallbackMethod": "REST",
"TestError": "No sample workspace found",
"LastCheckedTimestamp": datetime.now()
})
continue
workspace_id = sample["WorkspaceId"]
workspace_name = sample["WorkspaceName"]
dataset_name = sample["SemanticModelName"]
print(f"\nTesting XMLA for capacity '{capacity_name}' ({capacity_id})")
print(f" Using workspace: {workspace_name}, dataset: {dataset_name}")
try:
# Attempt to create a TOM server connection
tom_server = fabric.create_tom_server(readonly=True, workspace=workspace_id)
# If we get here, XMLA is enabled - try to list databases
db_count = tom_server.Databases.Count
tom_server.Disconnect()
xmla_status_records.append({
"CapacityId": capacity_id,
"CapacityName": capacity_name,
"XmlaEndpointEnabled": True,
"FallbackMethod": "None",
"TestError": None,
"LastCheckedTimestamp": datetime.now()
})
print(f" XMLA: ENABLED (found {db_count} databases)")
except Exception as e:
error_msg = str(e)
xmla_status_records.append({
"CapacityId": capacity_id,
"CapacityName": capacity_name,
"XmlaEndpointEnabled": False,
"FallbackMethod": "REST",
"TestError": error_msg[:500],
"LastCheckedTimestamp": datetime.now()
})
print(f" XMLA: DISABLED or FAILED - {error_msg[:200]}")
# CELL ********************
# Create XMLA status DataFrame
xmla_schema = StructType([
StructField("CapacityId", StringType(), True),
StructField("CapacityName", StringType(), True),
StructField("XmlaEndpointEnabled", BooleanType(), True),
StructField("FallbackMethod", StringType(), True),
StructField("TestError", StringType(), True),
StructField("LastCheckedTimestamp", TimestampType(), True),
])
df_xmla_status = spark.createDataFrame(xmla_status_records, schema=xmla_schema)
print("\nXMLA Status Summary:")
df_xmla_status.select("CapacityId", "CapacityName", "XmlaEndpointEnabled", "FallbackMethod").show(50, truncate=False)
xmla_enabled_count = len([r for r in xmla_status_records if r["XmlaEndpointEnabled"]])
xmla_disabled_count = len(xmla_status_records) - xmla_enabled_count
print(f"XMLA Enabled: {xmla_enabled_count} capacities")
print(f"XMLA Disabled (REST fallback): {xmla_disabled_count} capacities")
# CELL ********************
# ============================================================
# Step 3: Enrich RLS models with extraction method based on XMLA status
# ============================================================
# Join XMLA status to RLS models to determine extraction method per model
df_xmla_lookup = df_xmla_status.select(
col("CapacityId"),
col("XmlaEndpointEnabled"),
col("FallbackMethod")
)
from pyspark.sql.functions import when
df_rls_enriched = (
df_rls_models
.join(df_xmla_lookup, on="CapacityId", how="left")
.withColumn("ExtractionMethod",
when(col("XmlaEndpointEnabled") == True, lit("XMLA")).otherwise(lit("REST"))
)
.withColumn("ExtractionTimestamp", current_timestamp())
.withColumn("ExtractionStatus", lit("Pending"))
.withColumn("RoleCount", lit(0).cast(IntegerType()))
.drop("XmlaEndpointEnabled", "FallbackMethod")
)
print(f"\nEnriched RLS models: {df_rls_enriched.count()} rows")
print("\nExtraction method distribution:")
df_rls_enriched.groupBy("ExtractionMethod").count().show()
# CELL ********************
# ============================================================
# Step 4: Save discovery results as temp tables for downstream notebooks
# ============================================================
# Save RLS semantic models (to be updated with RoleCount/Status by extraction notebooks)
df_rls_enriched.write.format("delta").mode("overwrite").saveAsTable("RLS_Audit.dbo.tmp_rls_semantic_models")
print("Saved: tmp_rls_semantic_models")
# Save XMLA status
df_xmla_status.write.format("delta").mode("overwrite").saveAsTable("RLS_Audit.dbo.tmp_rls_capacity_xmla_status")
print("Saved: tmp_rls_capacity_xmla_status")
# Save reports linked to RLS models for the Power BI report
df_rls_reports = spark.sql("""
SELECT
r.ReportId,
r.Name AS ReportName,
r.ReportType,
r.WorkspaceId,
r.SemanticModelId,
r.SemanticModelWorkspaceId,
r.AppId
FROM fuam_lakehouse.dbo.reports r
INNER JOIN RLS_Audit.dbo.tmp_rls_semantic_models sm
ON r.SemanticModelId = sm.SemanticModelId
""")
rls_report_count = df_rls_reports.count()
df_rls_reports.write.format("delta").mode("overwrite").saveAsTable("RLS_Audit.dbo.tmp_rls_reports")
print(f"Saved: tmp_rls_reports ({rls_report_count} reports linked to RLS models)")
# CELL ********************
# Discovery Summary
print("\n" + "=" * 60)
print("DISCOVERY SUMMARY")
print("=" * 60)
print(f"Total semantic models in FUAM: {total_model_count}")
print(f"RLS-enabled semantic models: {rls_model_count}")
print(f"Reports linked to RLS models: {rls_report_count}")
print(f"Capacities with RLS models: {len(capacities)}")
print(f" - XMLA enabled: {xmla_enabled_count}")
print(f" - REST fallback: {xmla_disabled_count}")
print(f"Run date: {run_date}")
mssparkutils.notebook.exit(f"Discovery complete: {rls_model_count} RLS models across {len(capacities)} capacities")
# Fabric notebook source
# METADATA ********************
# META {
# META "kernel_info": {
# META "name": "synapse_pyspark"
# META },
# META "dependencies": {
# META "lakehouse": {
# META "default_lakehouse_name": "RLS_Audit",
# META "default_lakehouse_workspace_id": ""
# META }
# META }
# META }
# MARKDOWN ********************
# # RLS Audit Orchestrator
#
# This notebook orchestrates the full RLS audit pipeline:
# 1. **Discovery** — Identify RLS-enabled semantic models from FUAM, test XMLA per capacity
# 2. **Role Extraction** — Extract RLS role definitions and DAX filter expressions
# 3. **Role Members** — Extract role membership (users/groups)
# 4. **Persistence** — Write results to RLS_Audit Lakehouse with history
# 5. **Summary** — Generate operational summary
# PARAMETERS ********************
# CELL ********************
# Parameters
run_date = "" # Leave empty for today's date
# CELL ********************
from datetime import datetime, date
import json
if not run_date:
run_date = date.today().isoformat()
print(f"RLS Audit Pipeline - Run Date: {run_date}")
print("=" * 60)
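# CELL ********************
# Optional sketch (not wired into the pipeline): validating the run_date format
# up front surfaces a bad parameter immediately instead of producing oddly-named
# snapshots downstream. Pure Python, no Fabric dependencies; the helper name is
# illustrative.
def resolve_run_date(value: str) -> str:
    """Return a validated YYYY-MM-DD run date, defaulting to today when empty."""
    from datetime import date, datetime
    if not value:
        return date.today().isoformat()
    # Raises ValueError on anything that is not a valid ISO date
    datetime.strptime(value, "%Y-%m-%d")
    return value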
# CELL ********************
# Pipeline execution tracking
pipeline_results = {}
def run_notebook(notebook_name, params=None):
"""Run a child notebook and capture the result."""
if params is None:
params = {}
params["run_date"] = run_date
print(f"\n{'=' * 60}")
print(f"Starting: {notebook_name}")
print(f"Parameters: {json.dumps(params, indent=2)}")
print(f"Start time: {datetime.now().isoformat()}")
print(f"{'=' * 60}")
try:
result = mssparkutils.notebook.run(notebook_name, timeout_seconds=3600, arguments=params)
pipeline_results[notebook_name] = {
"status": "Success",
"result": result,
"timestamp": datetime.now().isoformat()
}
print(f"Completed: {notebook_name} - Success")
return result
except Exception as e:
pipeline_results[notebook_name] = {
"status": "Failed",
"error": str(e),
"timestamp": datetime.now().isoformat()
}
print(f"FAILED: {notebook_name} - {str(e)}")
raise
# CELL ********************
# Step 1: Discovery - Read FUAM, identify RLS models, test XMLA per capacity
try:
run_notebook("nb_rls_discovery")
print("Step 1 complete: Discovery finished successfully")
except Exception as e:
print(f"FATAL: Discovery failed, cannot continue pipeline. Error: {e}")
raise
# CELL ********************
# Step 2: Role Extraction - Extract RLS role definitions via XMLA/REST
try:
run_notebook("nb_rls_role_extraction")
print("Step 2 complete: Role extraction finished successfully")
except Exception as e:
print(f"WARNING: Role extraction failed. Error: {e}")
print("Continuing pipeline - partial results may be available")
# CELL ********************
# Step 3: Role Membership - Extract users/groups assigned to roles
try:
run_notebook("nb_rls_role_members")
print("Step 3 complete: Role membership extraction finished successfully")
except Exception as e:
print(f"WARNING: Role membership extraction failed. Error: {e}")
print("Continuing pipeline - partial results may be available")
# CELL ********************
# Step 4: Persistence - Write to Lakehouse with history
try:
run_notebook("nb_rls_persistence")
print("Step 4 complete: Persistence finished successfully")
except Exception as e:
print(f"WARNING: Persistence failed. Error: {e}")
print("Data may not have been saved - check Lakehouse tables")
# CELL ********************
# Step 5: Summary Report
try:
run_notebook("nb_rls_summary")
print("Step 5 complete: Summary generated successfully")
except Exception as e:
print(f"WARNING: Summary generation failed. Error: {e}")
# CELL ********************
# Pipeline Summary
print("\n" + "=" * 60)
print("PIPELINE EXECUTION SUMMARY")
print("=" * 60)
print(f"Run Date: {run_date}")
print(f"Completed: {datetime.now().isoformat()}")
print()
all_success = True
for notebook, result in pipeline_results.items():
status_icon = "OK" if result["status"] == "Success" else "FAIL"
print(f" [{status_icon}] {notebook}: {result['status']}")
if result["status"] != "Success":
all_success = False
print(f" Error: {result.get('error', 'Unknown')}")
print()
if all_success:
print("Pipeline completed successfully.")
else:
print("Pipeline completed with errors - review failures above.")
# Exit with the result for upstream callers
mssparkutils.notebook.exit(json.dumps(pipeline_results))
# Fabric notebook source
# METADATA ********************
# META {
# META "kernel_info": {
# META "name": "synapse_pyspark"
# META },
# META "dependencies": {
# META "lakehouse": {
# META "default_lakehouse_name": "RLS_Audit",
# META "default_lakehouse_workspace_id": ""
# META }
# META }
# META }
# MARKDOWN ********************
# # RLS Persistence & History
#
# This notebook:
# 1. Promotes temp tables to final current-state tables (overwrite)
# 2. Appends current state to history tables with snapshot date
# 3. Cleans up temp tables
# PARAMETERS ********************
# CELL ********************
# Parameters
run_date = ""
# CELL ********************
from datetime import datetime, date
from pyspark.sql.functions import lit, current_date, col
if not run_date:
run_date = date.today().isoformat()
snapshot_date = run_date
print(f"RLS Persistence - Snapshot Date: {snapshot_date}")
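# CELL ********************
# Optional sketch: the history DELETE further down interpolates snapshot_date
# straight into a SQL string. Validating it as an ISO date first guarantees the
# literal is safe to embed. (Helper name is illustrative, not part of the pipeline.)
def as_sql_date_literal(value: str) -> str:
    """Validate an ISO date string and return it quoted for SQL interpolation."""
    from datetime import date
    return f"'{date.fromisoformat(value).isoformat()}'"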
# CELL ********************
# ============================================================
# Table mapping: temp -> final, with history counterpart
# ============================================================
tables = [
{
"temp": "RLS_Audit.dbo.tmp_rls_semantic_models",
"final": "RLS_Audit.dbo.rls_semantic_models",
"history": "RLS_Audit.dbo.rls_semantic_models_history",
},
{
"temp": "RLS_Audit.dbo.tmp_rls_roles",
"final": "RLS_Audit.dbo.rls_roles",
"history": "RLS_Audit.dbo.rls_roles_history",
},
{
"temp": "RLS_Audit.dbo.tmp_rls_role_table_permissions",
"final": "RLS_Audit.dbo.rls_role_table_permissions",
"history": "RLS_Audit.dbo.rls_role_table_permissions_history",
},
{
"temp": "RLS_Audit.dbo.tmp_rls_role_members",
"final": "RLS_Audit.dbo.rls_role_members",
"history": "RLS_Audit.dbo.rls_role_members_history",
},
{
"temp": "RLS_Audit.dbo.tmp_rls_capacity_xmla_status",
"final": "RLS_Audit.dbo.rls_capacity_xmla_status",
"history": "RLS_Audit.dbo.rls_capacity_xmla_status_history",
},
]
# Also persist the reports linked to RLS models
tables.append({
"temp": "RLS_Audit.dbo.tmp_rls_reports",
"final": "RLS_Audit.dbo.rls_reports",
"history": "RLS_Audit.dbo.rls_reports_history",
})
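# CELL ********************
# Optional sanity check on the mapping above: every entry should follow the
# tmp_<name> -> <name> -> <name>_history convention, so a typo in one of the
# table names fails fast instead of silently writing to the wrong table.
def check_table_mapping(entry: dict) -> bool:
    """Return True when temp/final/history names follow the expected convention."""
    temp_name = entry["temp"].rsplit(".", 1)[-1]
    final_name = entry["final"].rsplit(".", 1)[-1]
    hist_name = entry["history"].rsplit(".", 1)[-1]
    return temp_name == f"tmp_{final_name}" and hist_name == f"{final_name}_history"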
# CELL ********************
# ============================================================
# Step 1: Write current state tables (overwrite)
# ============================================================
print("Writing current-state tables...")
for t in tables:
try:
df = spark.sql(f"SELECT * FROM {t['temp']}")
row_count = df.count()
df.write.format("delta").mode("overwrite").saveAsTable(t["final"])
print(f" {t['final']}: {row_count} rows (overwrite)")
except Exception as e:
print(f" {t['final']}: FAILED - {str(e)[:200]}")
# CELL ********************
# ============================================================
# Step 2: Append to history tables with snapshot date
# ============================================================
print(f"\nAppending to history tables (Snapshot_Date: {snapshot_date})...")
for t in tables:
try:
df = spark.sql(f"SELECT * FROM {t['final']}")
df_hist = df.withColumn("Snapshot_Date", lit(snapshot_date).cast("date"))
row_count = df_hist.count()
# Check if history table exists; if not, create it
try:
spark.sql(f"SELECT 1 FROM {t['history']} LIMIT 1")
table_exists = True
except Exception:
table_exists = False
if table_exists:
# Delete existing rows for this snapshot date to allow re-runs
spark.sql(f"""
DELETE FROM {t['history']}
WHERE Snapshot_Date = '{snapshot_date}'
""")
df_hist.write.format("delta").mode("append").saveAsTable(t["history"])
else:
df_hist.write.format("delta").mode("overwrite").saveAsTable(t["history"])
print(f" {t['history']}: {row_count} rows appended")
except Exception as e:
print(f" {t['history']}: FAILED - {str(e)[:200]}")
# CELL ********************
# ============================================================
# Step 3: Clean up temp tables
# ============================================================
print("\nCleaning up temp tables...")
for t in tables:
try:
spark.sql(f"DROP TABLE IF EXISTS {t['temp']}")
print(f" Dropped: {t['temp']}")
except Exception as e:
print(f" Failed to drop {t['temp']}: {str(e)[:100]}")
# CELL ********************
# ============================================================
# Verification: Row counts for all final tables
# ============================================================
print("\n" + "=" * 60)
print("PERSISTENCE SUMMARY")
print("=" * 60)
print(f"Snapshot Date: {snapshot_date}")
print()
print("Current State Tables:")
for t in tables:
try:
count = spark.sql(f"SELECT COUNT(*) AS cnt FROM {t['final']}").collect()[0].cnt
print(f" {t['final']}: {count} rows")
except Exception:
print(f" {t['final']}: NOT FOUND")
print()
print("History Tables:")
for t in tables:
try:
total = spark.sql(f"SELECT COUNT(*) AS cnt FROM {t['history']}").collect()[0].cnt
snapshots = spark.sql(f"SELECT COUNT(DISTINCT Snapshot_Date) AS cnt FROM {t['history']}").collect()[0].cnt
print(f" {t['history']}: {total} rows across {snapshots} snapshots")
except Exception:
print(f" {t['history']}: NOT FOUND")
mssparkutils.notebook.exit(f"Persistence complete for snapshot {snapshot_date}")
# Fabric notebook source
# METADATA ********************
# META {
# META "kernel_info": {
# META "name": "synapse_pyspark"
# META },
# META "dependencies": {
# META "lakehouse": {
# META "default_lakehouse_name": "RLS_Audit",
# META "default_lakehouse_workspace_id": ""
# META }
# META }
# META }
# MARKDOWN ********************
# # RLS Role Extraction
#
# This notebook extracts RLS role definitions from semantic models:
# - **XMLA path**: Uses TOM (Tabular Object Model) via semantic-link to get full role details
# - **REST path**: Uses Power BI REST API as fallback for capacities without XMLA
#
# Outputs: `tmp_rls_roles`, `tmp_rls_role_table_permissions`
# PARAMETERS ********************
# CELL ********************
# Parameters
run_date = ""
# CELL ********************
from datetime import datetime, date
from pyspark.sql.functions import lit, current_timestamp, col
from pyspark.sql.types import (
StructType, StructField, StringType, TimestampType, IntegerType
)
import sempy.fabric as fabric
import requests
import json
import time
if not run_date:
run_date = date.today().isoformat()
print(f"RLS Role Extraction - Run Date: {run_date}")
# CELL ********************
# ============================================================
# Load discovery results
# ============================================================
df_models = spark.sql("SELECT * FROM RLS_Audit.dbo.tmp_rls_semantic_models")
df_xmla_status = spark.sql("SELECT * FROM RLS_Audit.dbo.tmp_rls_capacity_xmla_status")
total_models = df_models.count()
print(f"Total RLS models to process: {total_models}")
# Separate XMLA and REST models
xmla_models = df_models.filter(col("ExtractionMethod") == "XMLA").collect()
rest_models = df_models.filter(col("ExtractionMethod") == "REST").collect()
print(f"XMLA extraction: {len(xmla_models)} models")
print(f"REST extraction: {len(rest_models)} models")
# CELL ********************
# ============================================================
# XMLA Role Extraction
# ============================================================
roles_records = []
table_perms_records = []
model_updates = [] # Track extraction status per model
# Group XMLA models by workspace for efficient connection reuse
from collections import defaultdict
workspace_groups = defaultdict(list)
for row in xmla_models:
workspace_groups[row.WorkspaceId].append(row)
print(f"\nProcessing {len(workspace_groups)} workspaces via XMLA...")
# CELL ********************
# Process each workspace (one TOM connection per workspace)
for workspace_id, models in workspace_groups.items():
workspace_name = models[0].WorkspaceName
print(f"\n--- Workspace: {workspace_name} ({workspace_id}) ---")
print(f" Models to process: {len(models)}")
tom_server = None
try:
tom_server = fabric.create_tom_server(readonly=True, workspace=workspace_id)
for model_row in models:
dataset_name = model_row.SemanticModelName
dataset_id = model_row.SemanticModelId
print(f" Processing: {dataset_name}")
try:
database = tom_server.Databases.GetByName(dataset_name)
role_count = 0
for role in database.Model.Roles:
role_count += 1
role_name = role.Name
role_desc = role.Description if role.Description else ""
model_permission = str(role.ModelPermission)
member_count = role.Members.Count
roles_records.append({
"SemanticModelId": dataset_id,
"SemanticModelName": dataset_name,
"WorkspaceId": workspace_id,
"WorkspaceName": workspace_name,
"RoleName": role_name,
"RoleDescription": role_desc,
"ModelPermission": model_permission,
"MemberCount": member_count,
"ExtractionMethod": "XMLA",
"ExtractionTimestamp": datetime.now(),
})
# Extract table permissions (DAX filter expressions)
for table_perm in role.TablePermissions:
table_name = table_perm.Table.Name
filter_expr = table_perm.FilterExpression if table_perm.FilterExpression else ""
table_perms_records.append({
"SemanticModelId": dataset_id,
"SemanticModelName": dataset_name,
"WorkspaceId": workspace_id,
"RoleName": role_name,
"TableName": table_name,
"FilterExpression": filter_expr,
"ExtractionMethod": "XMLA",
"ExtractionTimestamp": datetime.now(),
})
model_updates.append({
"SemanticModelId": dataset_id,
"RoleCount": role_count,
"ExtractionStatus": "Success",
"ExtractionMethod": "XMLA",
})
print(f" Found {role_count} roles")
except Exception as e:
print(f" FAILED: {str(e)[:200]}")
model_updates.append({
"SemanticModelId": dataset_id,
"RoleCount": 0,
"ExtractionStatus": f"Failed: {str(e)[:200]}",
"ExtractionMethod": "XMLA",
})
except Exception as e:
print(f" Workspace connection FAILED: {str(e)[:200]}")
for model_row in models:
model_updates.append({
"SemanticModelId": model_row.SemanticModelId,
"RoleCount": 0,
"ExtractionStatus": f"Connection failed: {str(e)[:200]}",
"ExtractionMethod": "XMLA",
})
finally:
if tom_server:
try:
tom_server.Disconnect()
except Exception:
pass
print(f"\nXMLA extraction complete: {len(roles_records)} roles found")
# CELL ********************
# ============================================================
# REST API Fallback for capacities without XMLA
# ============================================================
if rest_models:
print(f"\nProcessing {len(rest_models)} models via REST API...")
    # Acquire a Power BI API access token using the notebook's executing identity
    from notebookutils import mssparkutils
    access_token = mssparkutils.credentials.getToken("https://analysis.windows.net/powerbi/api")
headers = {
"Authorization": f"Bearer {access_token}",
"Content-Type": "application/json"
}
for model_row in rest_models:
dataset_id = model_row.SemanticModelId
dataset_name = model_row.SemanticModelName
workspace_id = model_row.WorkspaceId
workspace_name = model_row.WorkspaceName
print(f" REST: {dataset_name} in {workspace_name}")
        try:
            # Get dataset details via the standard "Get Dataset In Group" endpoint.
            # Note: this endpoint does not return RLS role definitions - those are
            # only available via XMLA/TOM - so any REST-only capacity is recorded
            # as a limited extraction.
            url = f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/datasets/{dataset_id}"
            response = requests.get(url, headers=headers)
            if response.status_code == 200:
                dataset_info = response.json()
# Attempt to get role assignments
roles_url = f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/datasets/{dataset_id}/users"
roles_response = requests.get(roles_url, headers=headers)
if roles_response.status_code == 200:
users_data = roles_response.json().get("value", [])
# This gives us dataset users, not RLS role definitions
# RLS role definitions are NOT available via standard REST API
# Mark as limited extraction
model_updates.append({
"SemanticModelId": dataset_id,
"RoleCount": -1, # -1 indicates unknown (REST limitation)
"ExtractionStatus": "REST-Limited",
"ExtractionMethod": "REST",
})
print(f" REST: Dataset found but role details require XMLA (limited extraction)")
            elif response.status_code == 429:
                # Rate limited: wait out Retry-After so later requests in this loop
                # aren't also throttled, then skip this model until the next run
                retry_after = int(response.headers.get("Retry-After", 30))
                print(f"    Rate limited, waiting {retry_after}s...")
                time.sleep(retry_after)
model_updates.append({
"SemanticModelId": dataset_id,
"RoleCount": 0,
"ExtractionStatus": "Rate-Limited-Skipped",
"ExtractionMethod": "REST",
})
else:
model_updates.append({
"SemanticModelId": dataset_id,
"RoleCount": 0,
"ExtractionStatus": f"REST-Error-{response.status_code}",
"ExtractionMethod": "REST",
})
print(f" REST Error: {response.status_code}")
except Exception as e:
model_updates.append({
"SemanticModelId": dataset_id,
"RoleCount": 0,
"ExtractionStatus": f"REST-Failed: {str(e)[:200]}",
"ExtractionMethod": "REST",
})
print(f" REST Failed: {str(e)[:200]}")
# Small delay to avoid throttling
time.sleep(0.5)
print("REST extraction complete")
else:
print("No models require REST fallback - all capacities support XMLA")
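# CELL ********************
# Optional sketch: an RFC-compliant Retry-After header may also carry an HTTP
# date, in which case the bare int() cast above would raise. A tolerant parser
# (illustrative, not wired into the loop) falls back to a default on anything
# non-numeric and clamps negatives to zero.
def retry_after_seconds(headers, default: int = 30) -> int:
    """Parse a Retry-After header given in seconds, tolerating bad values."""
    try:
        return max(0, int(headers.get("Retry-After", default)))
    except (TypeError, ValueError):
        return default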
# CELL ********************
# ============================================================
# Save extraction results
# ============================================================
# Save roles
roles_schema = StructType([
    StructField("SemanticModelId", StringType(), True),
    StructField("SemanticModelName", StringType(), True),
    StructField("WorkspaceId", StringType(), True),
    StructField("WorkspaceName", StringType(), True),
    StructField("RoleName", StringType(), True),
    StructField("RoleDescription", StringType(), True),
    StructField("ModelPermission", StringType(), True),
    StructField("MemberCount", IntegerType(), True),
    StructField("ExtractionMethod", StringType(), True),
    StructField("ExtractionTimestamp", TimestampType(), True),
])
# An empty records list yields an empty table with the same schema
df_roles = spark.createDataFrame(roles_records, schema=roles_schema)
df_roles.write.format("delta").mode("overwrite").saveAsTable("RLS_Audit.dbo.tmp_rls_roles")
print(f"Saved: tmp_rls_roles ({len(roles_records)} rows)")
# Save table permissions
perms_schema = StructType([
    StructField("SemanticModelId", StringType(), True),
    StructField("SemanticModelName", StringType(), True),
    StructField("WorkspaceId", StringType(), True),
    StructField("RoleName", StringType(), True),
    StructField("TableName", StringType(), True),
    StructField("FilterExpression", StringType(), True),
    StructField("ExtractionMethod", StringType(), True),
    StructField("ExtractionTimestamp", TimestampType(), True),
])
df_perms = spark.createDataFrame(table_perms_records, schema=perms_schema)
df_perms.write.format("delta").mode("overwrite").saveAsTable("RLS_Audit.dbo.tmp_rls_role_table_permissions")
print(f"Saved: tmp_rls_role_table_permissions ({len(table_perms_records)} rows)")
# CELL ********************
# Update model extraction status
if model_updates:
update_schema = StructType([
StructField("SemanticModelId", StringType(), True),
StructField("RoleCount", IntegerType(), True),
StructField("ExtractionStatus", StringType(), True),
StructField("ExtractionMethod", StringType(), True),
])
df_updates = spark.createDataFrame(model_updates, schema=update_schema)
# Update the tmp_rls_semantic_models table with extraction results
df_models_original = spark.sql("SELECT * FROM RLS_Audit.dbo.tmp_rls_semantic_models")
    from pyspark.sql.functions import coalesce
    df_models_updated = (
        df_models_original.alias("m")
        .join(df_updates.alias("u"), on="SemanticModelId", how="left")
        .select(
            col("m.SemanticModelId"),
            col("m.SemanticModelName"),
            col("m.WorkspaceId"),
            col("m.WorkspaceName"),
            col("m.CapacityId"),
            col("m.CapacityName"),
            col("m.isEffectiveIdentityRequired"),
            col("m.isEffectiveIdentityRolesRequired"),
            col("m.ContentProviderType"),
            col("m.StorageMode"),
            # Keep the original values for any model that has no extraction result
            coalesce(col("u.RoleCount"), col("m.RoleCount")).alias("RoleCount"),
            coalesce(col("u.ExtractionMethod"), col("m.ExtractionMethod")).alias("ExtractionMethod"),
            col("m.ExtractionTimestamp"),
            coalesce(col("u.ExtractionStatus"), col("m.ExtractionStatus")).alias("ExtractionStatus"),
        )
    )
df_models_updated.write.format("delta").mode("overwrite").saveAsTable("RLS_Audit.dbo.tmp_rls_semantic_models")
print("Updated: tmp_rls_semantic_models with extraction status")
# CELL ********************
# Extraction Summary
print("\n" + "=" * 60)
print("ROLE EXTRACTION SUMMARY")
print("=" * 60)
print(f"Models processed: {len(model_updates)}")
print(f"Roles found: {len(roles_records)}")
print(f"Table filter expressions: {len(table_perms_records)}")
success_count = len([u for u in model_updates if u["ExtractionStatus"] == "Success"])
failed_count = len([u for u in model_updates if "fail" in u["ExtractionStatus"].lower() or "error" in u["ExtractionStatus"].lower()])
limited_count = len([u for u in model_updates if u["ExtractionStatus"] == "REST-Limited"])
skipped_count = len([u for u in model_updates if u["ExtractionStatus"] == "Rate-Limited-Skipped"])
print(f"Successful extractions: {success_count}")
print(f"Failed extractions: {failed_count}")
print(f"Limited (REST): {limited_count}")
print(f"Rate-limited (skipped): {skipped_count}")
mssparkutils.notebook.exit(f"Roles: {len(roles_records)}, Table filters: {len(table_perms_records)}")
# Fabric notebook source
# METADATA ********************
# META {
# META "kernel_info": {
# META "name": "synapse_pyspark"
# META },
# META "dependencies": {
# META "lakehouse": {
# META "default_lakehouse_name": "RLS_Audit",
# META "default_lakehouse_workspace_id": ""
# META }
# META }
# META }
# MARKDOWN ********************
# # RLS Role Membership Extraction
#
# This notebook extracts which users and groups are assigned to each RLS role:
# - **XMLA path**: Uses TOM to enumerate role members directly
# - **REST path**: Uses Power BI REST API (limited - dataset users, not role-level)
#
# Outputs: `tmp_rls_role_members`
# PARAMETERS ********************
# CELL ********************
# Parameters
run_date = ""
# CELL ********************
from datetime import datetime, date
from pyspark.sql.functions import lit, current_timestamp, col
from pyspark.sql.types import (
StructType, StructField, StringType, TimestampType
)
from collections import defaultdict
import sempy.fabric as fabric
import requests
import time
if not run_date:
run_date = date.today().isoformat()
print(f"RLS Role Members Extraction - Run Date: {run_date}")
# CELL ********************
# ============================================================
# Load discovery results
# ============================================================
df_models = spark.sql("""
    SELECT *
    FROM RLS_Audit.dbo.tmp_rls_semantic_models
    WHERE ExtractionStatus IN ('Success', 'REST-Limited')
""")
df_xmla_status = spark.sql("""
SELECT CapacityId, XmlaEndpointEnabled
FROM RLS_Audit.dbo.tmp_rls_capacity_xmla_status
""")
models_list = df_models.collect()
xmla_capacities = set(
row.CapacityId for row in df_xmla_status.filter(col("XmlaEndpointEnabled") == True).collect()
)
xmla_models = [m for m in models_list if m.CapacityId in xmla_capacities]
rest_models = [m for m in models_list if m.CapacityId not in xmla_capacities]
print(f"Models eligible for member extraction: {len(models_list)}")
print(f" XMLA path: {len(xmla_models)}")
print(f" REST path: {len(rest_models)}")
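# CELL ********************
# The member-type heuristic applied inside the extraction loop below, factored
# out as a pure function for clarity (a sketch - the inline logic below is what
# actually runs). Assumption: TOM external members have class names containing
# "External", and names without an "@" are treated as groups/service principals.
def classify_member(member_type_name: str, member_name: str) -> str:
    """Classify a TOM role member as ExternalUser, User, or Group."""
    if "External" in member_type_name:
        return "ExternalUser"
    if member_name and "@" in member_name:
        return "User"
    return "Group"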
# CELL ********************
# ============================================================
# XMLA Role Membership Extraction
# ============================================================
member_records = []
# Group by workspace for connection reuse
workspace_groups = defaultdict(list)
for row in xmla_models:
workspace_groups[row.WorkspaceId].append(row)
print(f"\nProcessing {len(workspace_groups)} workspaces via XMLA for role members...")
for workspace_id, models in workspace_groups.items():
workspace_name = models[0].WorkspaceName
print(f"\n--- Workspace: {workspace_name} ---")
tom_server = None
try:
tom_server = fabric.create_tom_server(readonly=True, workspace=workspace_id)
for model_row in models:
dataset_name = model_row.SemanticModelName
dataset_id = model_row.SemanticModelId
print(f" Processing: {dataset_name}")
try:
database = tom_server.Databases.GetByName(dataset_name)
for role in database.Model.Roles:
role_name = role.Name
if role.Members.Count == 0:
print(f" Role '{role_name}': 0 members (empty role)")
continue
for member in role.Members:
member_name = member.Name if member.Name else ""
member_id = member.MemberID if member.MemberID else ""
identity_provider = member.IdentityProvider if member.IdentityProvider else ""
# Determine member type from the member object
# ExternalModelRoleMember = external, WindowsModelRoleMember = Windows/AAD
member_type_name = type(member).__name__
if "External" in member_type_name:
member_type = "ExternalUser"
elif member_name and "@" in member_name:
member_type = "User"
                        else:
                            # No "@" in the name: typically a security group or
                            # service principal; MemberID alone can't distinguish them
                            member_type = "Group"
member_records.append({
"SemanticModelId": dataset_id,
"SemanticModelName": dataset_name,
"WorkspaceId": workspace_id,
"WorkspaceName": workspace_name,
"RoleName": role_name,
"MemberName": member_name,
"MemberType": member_type,
"IdentityProvider": identity_provider,
"MemberId": member_id,
"ExtractionMethod": "XMLA",
"ExtractionTimestamp": datetime.now(),
})
print(f" Role '{role_name}': {role.Members.Count} members")
except Exception as e:
print(f" FAILED for {dataset_name}: {str(e)[:200]}")
except Exception as e:
print(f" Workspace connection FAILED: {str(e)[:200]}")
finally:
if tom_server:
try:
tom_server.Disconnect()
except Exception:
pass
print(f"\nXMLA member extraction complete: {len(member_records)} member assignments")
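The member-type classification in the loop above is a best-effort heuristic, not an authoritative lookup: TOM's `ExternalModelRoleMember` maps to `ExternalUser`, names containing `@` are treated as users, and everything else (GUID-only identities) is assumed to be a group or app. The same rules as a standalone function:

```python
def classify_member(member_type_name, member_name):
    """Best-effort member classification from TOM type name and member name."""
    if "External" in member_type_name:
        return "ExternalUser"
    if member_name and "@" in member_name:
        return "User"
    # GUID-only identities: assumed group or app registration
    return "Group"

print(classify_member("WindowsModelRoleMember", "alice@contoso.com"))      # User
print(classify_member("ExternalModelRoleMember", "bob@fabrikam.com"))      # ExternalUser
print(classify_member("WindowsModelRoleMember", "11111111-2222-3333-4444-555555555555"))  # Group
```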
# CELL ********************
# ============================================================
# REST API Fallback for Role Members
# ============================================================
if rest_models:
print(f"\nProcessing {len(rest_models)} models via REST API for members...")
from notebookutils import mssparkutils
access_token = mssparkutils.credentials.getToken("https://analysis.windows.net/powerbi/api")
headers = {
"Authorization": f"Bearer {access_token}",
"Content-Type": "application/json"
}
for model_row in rest_models:
dataset_id = model_row.SemanticModelId
dataset_name = model_row.SemanticModelName
workspace_id = model_row.WorkspaceId
workspace_name = model_row.WorkspaceName
print(f" REST: {dataset_name}")
try:
# The standard REST API doesn't expose RLS role members directly.
# The closest is GET /datasets/{id}/users which shows dataset-level permissions.
# For actual RLS role membership, the Admin API enhanced scan is needed.
# We'll record what we can from the dataset users endpoint.
url = f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/datasets/{dataset_id}/users"
response = requests.get(url, headers=headers)
if response.status_code == 200:
users = response.json().get("value", [])
for user in users:
member_records.append({
"SemanticModelId": dataset_id,
"SemanticModelName": dataset_name,
"WorkspaceId": workspace_id,
"WorkspaceName": workspace_name,
"RoleName": "__dataset_level__", # Not role-specific via REST
"MemberName": user.get("displayName", user.get("emailAddress", "")),
"MemberType": user.get("principalType", "Unknown"),
"IdentityProvider": "AzureAD",
"MemberId": user.get("identifier", ""),
"ExtractionMethod": "REST",
"ExtractionTimestamp": datetime.now(),
})
print(f" Found {len(users)} dataset-level users (not role-specific)")
elif response.status_code == 429:
retry_after = int(response.headers.get("Retry-After", 30))
print(f" Rate limited, waiting {retry_after}s...")
time.sleep(retry_after)
                # note: the request is not re-issued after the wait; this model is skipped for this run
else:
print(f" Error: {response.status_code}")
except Exception as e:
print(f" Failed: {str(e)[:200]}")
time.sleep(0.5)
    print("REST member extraction complete")
else:
print("No models require REST fallback for members")
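The 429 branch above sleeps for the `Retry-After` interval but never re-issues the request, so a throttled model yields no rows for that run. A minimal retry wrapper could close that gap; this is a hypothetical helper (not part of the notebook) that takes the request as a callable so it works with any HTTP client:

```python
import time

def get_with_retry(do_get, max_attempts=3, default_wait=30):
    """Call do_get() until it returns a non-429 response or attempts run out.

    do_get is any zero-argument callable returning an object with
    .status_code and .headers, e.g. lambda: requests.get(url, headers=headers).
    """
    response = None
    for _ in range(max_attempts):
        response = do_get()
        if response.status_code != 429:
            return response
        # Honor the server's Retry-After hint, falling back to a default
        wait = int(response.headers.get("Retry-After", default_wait))
        time.sleep(wait)
    return response  # still 429 after exhausting attempts
```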
# CELL ********************
# ============================================================
# Save member records
# ============================================================
member_schema = StructType([
StructField("SemanticModelId", StringType(), True),
StructField("SemanticModelName", StringType(), True),
StructField("WorkspaceId", StringType(), True),
StructField("WorkspaceName", StringType(), True),
StructField("RoleName", StringType(), True),
StructField("MemberName", StringType(), True),
StructField("MemberType", StringType(), True),
StructField("IdentityProvider", StringType(), True),
StructField("MemberId", StringType(), True),
StructField("ExtractionMethod", StringType(), True),
StructField("ExtractionTimestamp", TimestampType(), True),
])
if member_records:
df_members = spark.createDataFrame(member_records, schema=member_schema)
else:
df_members = spark.createDataFrame([], schema=member_schema)
df_members.write.format("delta").mode("overwrite").saveAsTable("RLS_Audit.dbo.tmp_rls_role_members")
print(f"\nSaved: tmp_rls_role_members ({len(member_records)} rows)")
# CELL ********************
# Member Extraction Summary
print("\n" + "=" * 60)
print("ROLE MEMBERSHIP SUMMARY")
print("=" * 60)
print(f"Total member assignments: {len(member_records)}")
xmla_members = [r for r in member_records if r["ExtractionMethod"] == "XMLA"]
rest_members = [r for r in member_records if r["ExtractionMethod"] == "REST"]
print(f" XMLA (role-level): {len(xmla_members)}")
print(f" REST (dataset-level): {len(rest_members)}")
# Unique members
unique_members = set(r["MemberName"] for r in member_records if r["MemberName"])
print(f"Unique members: {len(unique_members)}")
# Unique roles with members
unique_roles = set((r["SemanticModelId"], r["RoleName"]) for r in member_records)
print(f"Unique role assignments: {len(unique_roles)}")
if rest_members:
print(f"\nNOTE: {len(rest_members)} members extracted via REST are at dataset-level only.")
print("Enable XMLA on those capacities for role-level member details.")
from notebookutils import mssparkutils  # imported here because the REST fallback cell may not have run
mssparkutils.notebook.exit(f"Members: {len(member_records)}, Unique: {len(unique_members)}")
# Fabric notebook source
# METADATA ********************
# META {
# META "kernel_info": {
# META "name": "synapse_pyspark"
# META },
# META "dependencies": {
# META "lakehouse": {
# META "default_lakehouse_name": "RLS_Audit",
# META "default_lakehouse_workspace_id": ""
# META }
# META }
# META }
# MARKDOWN ********************
# # RLS Audit Summary Report
#
# This notebook generates an operational summary of the RLS audit results,
# rendered as HTML tables directly in the notebook output.
# PARAMETERS ********************
# CELL ********************
# Parameters
run_date = ""
# CELL ********************
from datetime import datetime, date
from pyspark.sql.functions import col, count, sum as spark_sum, when
if not run_date:
run_date = date.today().isoformat()
print(f"RLS Summary Report - {run_date}")
# CELL ********************
# ============================================================
# Load final tables
# ============================================================
df_models = spark.sql("SELECT * FROM RLS_Audit.dbo.rls_semantic_models")
df_roles = spark.sql("SELECT * FROM RLS_Audit.dbo.rls_roles")
df_perms = spark.sql("SELECT * FROM RLS_Audit.dbo.rls_role_table_permissions")
df_members = spark.sql("SELECT * FROM RLS_Audit.dbo.rls_role_members")
df_xmla = spark.sql("SELECT * FROM RLS_Audit.dbo.rls_capacity_xmla_status")
df_reports = spark.sql("SELECT * FROM RLS_Audit.dbo.rls_reports")
# CELL ********************
# ============================================================
# KPI Metrics
# ============================================================
total_rls_models = df_models.count()
total_roles = df_roles.count()
total_members = df_members.count()
total_reports = df_reports.count()
total_capacities = df_xmla.count()
xmla_enabled = df_xmla.filter(col("XmlaEndpointEnabled") == True).count()
xmla_disabled = total_capacities - xmla_enabled
unique_members = df_members.select("MemberName").distinct().count()
successful_extractions = df_models.filter(col("ExtractionStatus") == "Success").count()
failed_extractions = df_models.filter(col("ExtractionStatus").contains("Failed")).count()
kpi_html = f"""
<div style="font-family: Segoe UI, sans-serif; padding: 20px;">
<h1 style="color: #2b579a;">RLS Audit Summary - {run_date}</h1>
<div style="display: flex; flex-wrap: wrap; gap: 15px; margin: 20px 0;">
<div style="background: #2b579a; color: white; padding: 20px; border-radius: 8px; min-width: 150px; text-align: center;">
<div style="font-size: 36px; font-weight: bold;">{total_rls_models}</div>
<div>RLS Models</div>
</div>
<div style="background: #217346; color: white; padding: 20px; border-radius: 8px; min-width: 150px; text-align: center;">
<div style="font-size: 36px; font-weight: bold;">{total_roles}</div>
<div>Roles Defined</div>
</div>
<div style="background: #7719aa; color: white; padding: 20px; border-radius: 8px; min-width: 150px; text-align: center;">
<div style="font-size: 36px; font-weight: bold;">{total_members}</div>
<div>Role Assignments</div>
</div>
<div style="background: #ca5010; color: white; padding: 20px; border-radius: 8px; min-width: 150px; text-align: center;">
<div style="font-size: 36px; font-weight: bold;">{unique_members}</div>
<div>Unique Members</div>
</div>
<div style="background: #0078d4; color: white; padding: 20px; border-radius: 8px; min-width: 150px; text-align: center;">
<div style="font-size: 36px; font-weight: bold;">{total_reports}</div>
<div>Linked Reports</div>
</div>
<div style="background: #107c10; color: white; padding: 20px; border-radius: 8px; min-width: 150px; text-align: center;">
<div style="font-size: 36px; font-weight: bold;">{xmla_enabled}/{total_capacities}</div>
<div>XMLA Capacities</div>
</div>
</div>
<div style="display: flex; gap: 15px; margin: 10px 0;">
<div style="background: #dff6dd; padding: 10px 20px; border-radius: 6px; border-left: 4px solid #107c10;">
Successful extractions: <strong>{successful_extractions}</strong>
</div>
<div style="background: #fde7e9; padding: 10px 20px; border-radius: 6px; border-left: 4px solid #d13438;">
Failed extractions: <strong>{failed_extractions}</strong>
</div>
</div>
</div>
"""
displayHTML(kpi_html)
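The six KPI tiles above repeat the same inline-styled `div` with different values and colors. They could be generated by a small helper; a sketch (hypothetical function, reusing the styles and palette from the cell above):

```python
def kpi_card(value, label, background):
    """Return one KPI tile div matching the inline style used in the report."""
    return (
        f'<div style="background: {background}; color: white; padding: 20px; '
        f'border-radius: 8px; min-width: 150px; text-align: center;">'
        f'<div style="font-size: 36px; font-weight: bold;">{value}</div>'
        f"<div>{label}</div></div>"
    )

# Build all tiles from (value, label, color) triples instead of repeating markup
cards = "".join(
    kpi_card(v, l, c)
    for v, l, c in [(12, "RLS Models", "#2b579a"), (34, "Roles Defined", "#217346")]
)
print(cards[:80])
```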
# CELL ********************
# ============================================================
# RLS Models by Capacity
# ============================================================
df_by_capacity = (
df_models
.groupBy("CapacityId", "CapacityName", "ExtractionMethod")
.agg(
count("*").alias("ModelCount"),
spark_sum(when(col("ExtractionStatus") == "Success", 1).otherwise(0)).alias("Successful"),
spark_sum(when(col("ExtractionStatus").contains("Failed"), 1).otherwise(0)).alias("Failed"),
)
.orderBy("CapacityName")
)
rows = df_by_capacity.collect()
table_rows = ""
for r in rows:
status_color = "#107c10" if r.Failed == 0 else "#d13438"
table_rows += f"""
<tr>
<td>{r.CapacityName or r.CapacityId}</td>
<td>{r.ExtractionMethod}</td>
<td style="text-align:center;">{r.ModelCount}</td>
<td style="text-align:center; color: #107c10;">{r.Successful}</td>
<td style="text-align:center; color: {status_color};">{r.Failed}</td>
</tr>
"""
capacity_html = f"""
<div style="font-family: Segoe UI, sans-serif; padding: 20px;">
<h2>RLS Models by Capacity</h2>
<table style="border-collapse: collapse; width: 100%;">
<thead>
<tr style="background: #f3f3f3; border-bottom: 2px solid #2b579a;">
<th style="padding: 10px; text-align: left;">Capacity</th>
<th style="padding: 10px; text-align: left;">Method</th>
<th style="padding: 10px; text-align: center;">Models</th>
<th style="padding: 10px; text-align: center;">Successful</th>
<th style="padding: 10px; text-align: center;">Failed</th>
</tr>
</thead>
<tbody>{table_rows}</tbody>
</table>
</div>
"""
displayHTML(capacity_html)
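Throughout these report cells, capacity, workspace, role, and member names are interpolated into HTML unescaped; a name containing `<`, `>`, or `&` would break the table markup. A defensive wrapper using only the standard library (a suggested hardening, not present in the original):

```python
import html

def cell_text(value):
    """Escape a value for safe interpolation into an HTML table cell."""
    # None renders as an empty cell; everything else is stringified and escaped
    return html.escape(str(value)) if value is not None else ""

print(cell_text("Sales & Marketing"))  # Sales &amp; Marketing
print(cell_text(None))                 # (empty string)
```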
# CELL ********************
# ============================================================
# XMLA Capacity Status
# ============================================================
xmla_rows = df_xmla.collect()
xmla_table = ""
for r in xmla_rows:
icon = "&#9989;" if r.XmlaEndpointEnabled else "&#10060;"
error_text = r.TestError[:100] if r.TestError else ""
xmla_table += f"""
<tr>
<td>{r.CapacityName or r.CapacityId}</td>
<td style="text-align:center;">{icon}</td>
<td>{r.FallbackMethod}</td>
<td style="font-size: 0.85em; color: #666;">{error_text}</td>
</tr>
"""
xmla_html = f"""
<div style="font-family: Segoe UI, sans-serif; padding: 20px;">
<h2>XMLA Endpoint Status by Capacity</h2>
<table style="border-collapse: collapse; width: 100%;">
<thead>
<tr style="background: #f3f3f3; border-bottom: 2px solid #2b579a;">
<th style="padding: 10px; text-align: left;">Capacity</th>
<th style="padding: 10px; text-align: center;">XMLA</th>
<th style="padding: 10px; text-align: left;">Fallback</th>
<th style="padding: 10px; text-align: left;">Error</th>
</tr>
</thead>
<tbody>{xmla_table}</tbody>
</table>
</div>
"""
displayHTML(xmla_html)
# CELL ********************
# ============================================================
# Gaps & Risks: Models with potential issues
# ============================================================
# Models with RLS enabled but no roles extracted
df_no_roles = df_models.filter(
(col("RoleCount") == 0) & (col("ExtractionStatus") == "Success")
)
no_roles_count = df_no_roles.count()
# Roles with no members
df_empty_roles = df_roles.filter(col("MemberCount") == 0)
empty_roles_count = df_empty_roles.count()
# Roles with no filter expressions (DAX)
df_roles_with_perms = df_perms.select("SemanticModelId", "RoleName").distinct()
df_all_roles = df_roles.select("SemanticModelId", "RoleName").distinct()
df_no_filters = df_all_roles.subtract(df_roles_with_perms)
no_filter_count = df_no_filters.count()
# Build risk items
risk_items = ""
if no_roles_count > 0:
models_list_html = ""
for r in df_no_roles.select("SemanticModelName", "WorkspaceName").collect()[:10]:
models_list_html += f"<li>{r.SemanticModelName} ({r.WorkspaceName})</li>"
risk_items += f"""
<div style="background: #fff4ce; padding: 15px; border-radius: 6px; border-left: 4px solid #ca5010; margin: 10px 0;">
<strong>RLS Enabled but No Roles Defined: {no_roles_count} models</strong>
<p>These models have <code>isEffectiveIdentityRolesRequired = true</code> but no roles were found.
This may indicate misconfiguration or that roles were removed.</p>
<ul>{models_list_html}</ul>
</div>
"""
if empty_roles_count > 0:
roles_list_html = ""
for r in df_empty_roles.select("SemanticModelName", "RoleName").collect()[:10]:
roles_list_html += f"<li>{r.RoleName} on {r.SemanticModelName}</li>"
risk_items += f"""
<div style="background: #fde7e9; padding: 15px; border-radius: 6px; border-left: 4px solid #d13438; margin: 10px 0;">
<strong>Roles with No Members: {empty_roles_count} roles</strong>
<p>These roles exist but have no users or groups assigned.</p>
<ul>{roles_list_html}</ul>
</div>
"""
if no_filter_count > 0:
risk_items += f"""
<div style="background: #fff4ce; padding: 15px; border-radius: 6px; border-left: 4px solid #ca5010; margin: 10px 0;">
<strong>Roles with No Filter Expressions: {no_filter_count} roles</strong>
<p>These roles have no DAX table filter expressions, meaning they may grant full access.</p>
</div>
"""
if xmla_disabled > 0:
risk_items += f"""
<div style="background: #deecf9; padding: 15px; border-radius: 6px; border-left: 4px solid #0078d4; margin: 10px 0;">
<strong>Limited Extraction: {xmla_disabled} capacities without XMLA</strong>
<p>Role definitions and member details could not be fully extracted for models on these capacities.
Enable XMLA read endpoint on these capacities for complete coverage.</p>
</div>
"""
if not risk_items:
risk_items = """
<div style="background: #dff6dd; padding: 15px; border-radius: 6px; border-left: 4px solid #107c10; margin: 10px 0;">
<strong>No issues detected.</strong> All RLS configurations appear healthy.
</div>
"""
risk_html = f"""
<div style="font-family: Segoe UI, sans-serif; padding: 20px;">
<h2>Gaps &amp; Risks</h2>
{risk_items}
</div>
"""
displayHTML(risk_html)
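The "roles with no filter expressions" check above is a set difference: all (model, role) pairs minus the pairs that have at least one table permission. The same logic in plain Python, with made-up sample data:

```python
# All defined roles vs. roles that carry at least one DAX table filter
all_roles = {("model-1", "Sales"), ("model-1", "HR"), ("model-2", "Readers")}
roles_with_filters = {("model-1", "Sales"), ("model-2", "Readers")}

# Equivalent of df_all_roles.subtract(df_roles_with_perms)
roles_without_filters = all_roles - roles_with_filters
print(sorted(roles_without_filters))  # [('model-1', 'HR')]
```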
# CELL ********************
# ============================================================
# Role Details Table
# ============================================================
df_role_detail = (
df_roles
.join(df_models.select("SemanticModelId", "CapacityName"), on="SemanticModelId", how="left")
.select(
"CapacityName", "WorkspaceName", "SemanticModelName",
"RoleName", "RoleDescription", "ModelPermission", "MemberCount"
)
.orderBy("CapacityName", "WorkspaceName", "SemanticModelName", "RoleName")
)
detail_rows = df_role_detail.collect()
detail_table = ""
for r in detail_rows:
member_color = "#d13438" if r.MemberCount == 0 else "#333"
detail_table += f"""
<tr>
<td>{r.CapacityName or ''}</td>
<td>{r.WorkspaceName or ''}</td>
<td>{r.SemanticModelName}</td>
<td><strong>{r.RoleName}</strong></td>
<td>{r.RoleDescription or ''}</td>
<td>{r.ModelPermission or ''}</td>
<td style="text-align:center; color:{member_color};">{r.MemberCount}</td>
</tr>
"""
detail_html = f"""
<div style="font-family: Segoe UI, sans-serif; padding: 20px;">
<h2>All RLS Roles ({len(detail_rows)} roles)</h2>
<table style="border-collapse: collapse; width: 100%; font-size: 0.9em;">
<thead>
<tr style="background: #f3f3f3; border-bottom: 2px solid #2b579a;">
<th style="padding: 8px; text-align: left;">Capacity</th>
<th style="padding: 8px; text-align: left;">Workspace</th>
<th style="padding: 8px; text-align: left;">Semantic Model</th>
<th style="padding: 8px; text-align: left;">Role</th>
<th style="padding: 8px; text-align: left;">Description</th>
<th style="padding: 8px; text-align: left;">Permission</th>
<th style="padding: 8px; text-align: center;">Members</th>
</tr>
</thead>
<tbody>{detail_table}</tbody>
</table>
</div>
"""
displayHTML(detail_html)
# CELL ********************
# ============================================================
# Role Membership Detail
# ============================================================
df_member_detail = (
df_members
.filter(col("RoleName") != "__dataset_level__")
.select(
"WorkspaceName", "SemanticModelName", "RoleName",
"MemberName", "MemberType", "MemberId"
)
.orderBy("WorkspaceName", "SemanticModelName", "RoleName", "MemberName")
)
member_detail_rows = df_member_detail.collect()
member_table = ""
for r in member_detail_rows[:200]: # Limit to first 200 for display
member_table += f"""
<tr>
<td>{r.WorkspaceName or ''}</td>
<td>{r.SemanticModelName}</td>
<td>{r.RoleName}</td>
<td>{r.MemberName}</td>
<td>{r.MemberType}</td>
<td style="font-size: 0.8em; color: #666;">{r.MemberId or ''}</td>
</tr>
"""
truncation_note = ""
if len(member_detail_rows) > 200:
truncation_note = f"<p style='color: #666;'>Showing first 200 of {len(member_detail_rows)} rows. See Power BI report for full details.</p>"
member_html = f"""
<div style="font-family: Segoe UI, sans-serif; padding: 20px;">
<h2>Role Membership ({len(member_detail_rows)} assignments)</h2>
{truncation_note}
<table style="border-collapse: collapse; width: 100%; font-size: 0.9em;">
<thead>
<tr style="background: #f3f3f3; border-bottom: 2px solid #2b579a;">
<th style="padding: 8px; text-align: left;">Workspace</th>
<th style="padding: 8px; text-align: left;">Semantic Model</th>
<th style="padding: 8px; text-align: left;">Role</th>
<th style="padding: 8px; text-align: left;">Member</th>
<th style="padding: 8px; text-align: left;">Type</th>
<th style="padding: 8px; text-align: left;">ID</th>
</tr>
</thead>
<tbody>{member_table}</tbody>
</table>
</div>
"""
displayHTML(member_html)
# CELL ********************
from notebookutils import mssparkutils  # not imported earlier in this notebook
mssparkutils.notebook.exit("Summary complete")
relationship 'roles_to_models'
fromColumn: rls_roles.SemanticModelId
toColumn: rls_semantic_models.SemanticModelId
relationship 'table_perms_to_models'
fromColumn: rls_role_table_permissions.SemanticModelId
toColumn: rls_semantic_models.SemanticModelId
relationship 'members_to_models'
fromColumn: rls_role_members.SemanticModelId
toColumn: rls_semantic_models.SemanticModelId
relationship 'reports_to_models'
fromColumn: rls_reports.SemanticModelId
toColumn: rls_semantic_models.SemanticModelId
relationship 'models_to_xmla'
fromColumn: rls_semantic_models.CapacityId
toColumn: rls_capacity_xmla_status.CapacityId
table rls_capacity_xmla_status
lineageTag: a1b2c3d4-0005-0001-0001-000000000001
column CapacityId
dataType: string
lineageTag: a1b2c3d4-0005-0001-0001-000000000002
summarizeBy: none
sourceColumn: CapacityId
column CapacityName
dataType: string
lineageTag: a1b2c3d4-0005-0001-0001-000000000003
summarizeBy: none
sourceColumn: CapacityName
column XmlaEndpointEnabled
dataType: boolean
lineageTag: a1b2c3d4-0005-0001-0001-000000000004
summarizeBy: none
sourceColumn: XmlaEndpointEnabled
column FallbackMethod
dataType: string
lineageTag: a1b2c3d4-0005-0001-0001-000000000005
summarizeBy: none
sourceColumn: FallbackMethod
column TestError
dataType: string
lineageTag: a1b2c3d4-0005-0001-0001-000000000006
summarizeBy: none
sourceColumn: TestError
column LastCheckedTimestamp
dataType: dateTime
lineageTag: a1b2c3d4-0005-0001-0001-000000000007
summarizeBy: none
sourceColumn: LastCheckedTimestamp
partition rls_capacity_xmla_status = m
mode: directLake
source
entityName: rls_capacity_xmla_status
schemaName: dbo
table rls_reports
lineageTag: a1b2c3d4-0006-0001-0001-000000000001
column ReportId
dataType: string
lineageTag: a1b2c3d4-0006-0001-0001-000000000002
summarizeBy: none
sourceColumn: ReportId
column ReportName
dataType: string
lineageTag: a1b2c3d4-0006-0001-0001-000000000003
summarizeBy: none
sourceColumn: ReportName
column ReportType
dataType: string
lineageTag: a1b2c3d4-0006-0001-0001-000000000004
summarizeBy: none
sourceColumn: ReportType
column WorkspaceId
dataType: string
lineageTag: a1b2c3d4-0006-0001-0001-000000000005
summarizeBy: none
sourceColumn: WorkspaceId
column SemanticModelId
dataType: string
lineageTag: a1b2c3d4-0006-0001-0001-000000000006
summarizeBy: none
sourceColumn: SemanticModelId
column SemanticModelWorkspaceId
dataType: string
lineageTag: a1b2c3d4-0006-0001-0001-000000000007
summarizeBy: none
sourceColumn: SemanticModelWorkspaceId
column AppId
dataType: string
lineageTag: a1b2c3d4-0006-0001-0001-000000000008
summarizeBy: none
sourceColumn: AppId
partition rls_reports = m
mode: directLake
source
entityName: rls_reports
schemaName: dbo
table rls_role_members
lineageTag: a1b2c3d4-0004-0001-0001-000000000001
column SemanticModelId
dataType: string
lineageTag: a1b2c3d4-0004-0001-0001-000000000002
summarizeBy: none
sourceColumn: SemanticModelId
column SemanticModelName
dataType: string
lineageTag: a1b2c3d4-0004-0001-0001-000000000003
summarizeBy: none
sourceColumn: SemanticModelName
column WorkspaceId
dataType: string
lineageTag: a1b2c3d4-0004-0001-0001-000000000004
summarizeBy: none
sourceColumn: WorkspaceId
column WorkspaceName
dataType: string
lineageTag: a1b2c3d4-0004-0001-0001-000000000005
summarizeBy: none
sourceColumn: WorkspaceName
column RoleName
dataType: string
lineageTag: a1b2c3d4-0004-0001-0001-000000000006
summarizeBy: none
sourceColumn: RoleName
column MemberName
dataType: string
lineageTag: a1b2c3d4-0004-0001-0001-000000000007
summarizeBy: none
sourceColumn: MemberName
column MemberType
dataType: string
lineageTag: a1b2c3d4-0004-0001-0001-000000000008
summarizeBy: none
sourceColumn: MemberType
column IdentityProvider
dataType: string
lineageTag: a1b2c3d4-0004-0001-0001-000000000009
summarizeBy: none
sourceColumn: IdentityProvider
column MemberId
dataType: string
lineageTag: a1b2c3d4-0004-0001-0001-000000000010
summarizeBy: none
sourceColumn: MemberId
column ExtractionMethod
dataType: string
lineageTag: a1b2c3d4-0004-0001-0001-000000000011
summarizeBy: none
sourceColumn: ExtractionMethod
column ExtractionTimestamp
dataType: dateTime
lineageTag: a1b2c3d4-0004-0001-0001-000000000012
summarizeBy: none
sourceColumn: ExtractionTimestamp
partition rls_role_members = m
mode: directLake
source
entityName: rls_role_members
schemaName: dbo
table rls_role_table_permissions
lineageTag: a1b2c3d4-0003-0001-0001-000000000001
column SemanticModelId
dataType: string
lineageTag: a1b2c3d4-0003-0001-0001-000000000002
summarizeBy: none
sourceColumn: SemanticModelId
column SemanticModelName
dataType: string
lineageTag: a1b2c3d4-0003-0001-0001-000000000003
summarizeBy: none
sourceColumn: SemanticModelName
column WorkspaceId
dataType: string
lineageTag: a1b2c3d4-0003-0001-0001-000000000004
summarizeBy: none
sourceColumn: WorkspaceId
column RoleName
dataType: string
lineageTag: a1b2c3d4-0003-0001-0001-000000000005
summarizeBy: none
sourceColumn: RoleName
column TableName
dataType: string
lineageTag: a1b2c3d4-0003-0001-0001-000000000006
summarizeBy: none
sourceColumn: TableName
column FilterExpression
dataType: string
lineageTag: a1b2c3d4-0003-0001-0001-000000000007
summarizeBy: none
sourceColumn: FilterExpression
column ExtractionMethod
dataType: string
lineageTag: a1b2c3d4-0003-0001-0001-000000000008
summarizeBy: none
sourceColumn: ExtractionMethod
column ExtractionTimestamp
dataType: dateTime
lineageTag: a1b2c3d4-0003-0001-0001-000000000009
summarizeBy: none
sourceColumn: ExtractionTimestamp
partition rls_role_table_permissions = m
mode: directLake
source
entityName: rls_role_table_permissions
schemaName: dbo
table rls_roles
lineageTag: a1b2c3d4-0002-0001-0001-000000000001
column SemanticModelId
dataType: string
lineageTag: a1b2c3d4-0002-0001-0001-000000000002
summarizeBy: none
sourceColumn: SemanticModelId
column SemanticModelName
dataType: string
lineageTag: a1b2c3d4-0002-0001-0001-000000000003
summarizeBy: none
sourceColumn: SemanticModelName
column WorkspaceId
dataType: string
lineageTag: a1b2c3d4-0002-0001-0001-000000000004
summarizeBy: none
sourceColumn: WorkspaceId
column WorkspaceName
dataType: string
lineageTag: a1b2c3d4-0002-0001-0001-000000000005
summarizeBy: none
sourceColumn: WorkspaceName
column RoleName
dataType: string
lineageTag: a1b2c3d4-0002-0001-0001-000000000006
summarizeBy: none
sourceColumn: RoleName
column RoleDescription
dataType: string
lineageTag: a1b2c3d4-0002-0001-0001-000000000007
summarizeBy: none
sourceColumn: RoleDescription
column ModelPermission
dataType: string
lineageTag: a1b2c3d4-0002-0001-0001-000000000008
summarizeBy: none
sourceColumn: ModelPermission
column MemberCount
dataType: int64
lineageTag: a1b2c3d4-0002-0001-0001-000000000009
summarizeBy: sum
sourceColumn: MemberCount
column ExtractionMethod
dataType: string
lineageTag: a1b2c3d4-0002-0001-0001-000000000010
summarizeBy: none
sourceColumn: ExtractionMethod
column ExtractionTimestamp
dataType: dateTime
lineageTag: a1b2c3d4-0002-0001-0001-000000000011
summarizeBy: none
sourceColumn: ExtractionTimestamp
partition rls_roles = m
mode: directLake
source
entityName: rls_roles
schemaName: dbo
table rls_semantic_models
lineageTag: a1b2c3d4-0001-0001-0001-000000000001
column SemanticModelId
dataType: string
lineageTag: a1b2c3d4-0001-0001-0001-000000000002
summarizeBy: none
sourceColumn: SemanticModelId
column SemanticModelName
dataType: string
lineageTag: a1b2c3d4-0001-0001-0001-000000000003
summarizeBy: none
sourceColumn: SemanticModelName
column WorkspaceId
dataType: string
lineageTag: a1b2c3d4-0001-0001-0001-000000000004
summarizeBy: none
sourceColumn: WorkspaceId
column WorkspaceName
dataType: string
lineageTag: a1b2c3d4-0001-0001-0001-000000000005
summarizeBy: none
sourceColumn: WorkspaceName
column CapacityId
dataType: string
lineageTag: a1b2c3d4-0001-0001-0001-000000000006
summarizeBy: none
sourceColumn: CapacityId
column CapacityName
dataType: string
lineageTag: a1b2c3d4-0001-0001-0001-000000000007
summarizeBy: none
sourceColumn: CapacityName
column isEffectiveIdentityRequired
dataType: boolean
lineageTag: a1b2c3d4-0001-0001-0001-000000000008
summarizeBy: none
sourceColumn: isEffectiveIdentityRequired
column isEffectiveIdentityRolesRequired
dataType: boolean
lineageTag: a1b2c3d4-0001-0001-0001-000000000009
summarizeBy: none
sourceColumn: isEffectiveIdentityRolesRequired
column ContentProviderType
dataType: string
lineageTag: a1b2c3d4-0001-0001-0001-000000000010
summarizeBy: none
sourceColumn: ContentProviderType
column StorageMode
dataType: string
lineageTag: a1b2c3d4-0001-0001-0001-000000000011
summarizeBy: none
sourceColumn: StorageMode
column RoleCount
dataType: int64
lineageTag: a1b2c3d4-0001-0001-0001-000000000012
summarizeBy: sum
sourceColumn: RoleCount
column ExtractionMethod
dataType: string
lineageTag: a1b2c3d4-0001-0001-0001-000000000013
summarizeBy: none
sourceColumn: ExtractionMethod
column ExtractionTimestamp
dataType: dateTime
lineageTag: a1b2c3d4-0001-0001-0001-000000000014
summarizeBy: none
sourceColumn: ExtractionTimestamp
column ExtractionStatus
dataType: string
lineageTag: a1b2c3d4-0001-0001-0001-000000000015
summarizeBy: none
sourceColumn: ExtractionStatus
partition rls_semantic_models = m
mode: directLake
source
entityName: rls_semantic_models
schemaName: dbo