  1. Is there a way to use parameters in Databricks in SQL with parameter ...

    Sep 29, 2024 · EDIT: I got a message from a Databricks employee that currently (DBR 15.4 LTS) the parameter marker syntax is not supported in this scenario. It might work in future versions. …
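
    A minimal sketch of the parameter-marker syntax in question, assuming Spark 3.4+ (recent DBR) where spark.sql accepts an args mapping; the table and parameter names are illustrative:

    ```python
    # Named parameter markers (:name) bound via the args mapping,
    # supported by spark.sql since Spark 3.4 / recent DBR versions.
    df = spark.sql(
        "SELECT * FROM samples.nyctaxi.trips WHERE trip_distance > :min_distance",
        args={"min_distance": 5.0},
    )
    df.show()
    ```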

  2. Printing secret value in Databricks - Stack Overflow

    Nov 11, 2021 · Building on @camo's answer, since you're looking to use the secret value outside Databricks, you can use the Databricks Python SDK to fetch the byte representation of the secret …
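
    A minimal sketch of that approach, assuming the databricks-sdk package and workspace auth via environment variables; the scope and key names are illustrative:

    ```python
    import base64
    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()  # reads DATABRICKS_HOST / DATABRICKS_TOKEN from the environment
    # The Secrets API returns the value base64-encoded.
    resp = w.secrets.get_secret(scope="my-scope", key="my-key")
    print(base64.b64decode(resp.value).decode("utf-8"))
    ```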

  3. Databricks shows REDACTED on a hardcoded value - Stack Overflow

    Mar 16, 2023 · It's not possible; Databricks just scans the entire output for occurrences of secret values and replaces them with "[REDACTED]". It is helpless if you transform the value. For example, like you …
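
    A short illustration of the behavior described, runnable in a notebook where dbutils is available; the scope and key names are illustrative:

    ```python
    secret = dbutils.secrets.get(scope="my-scope", key="my-key")
    print(secret)            # the output scanner shows [REDACTED]
    print(" ".join(secret))  # a transformed value slips past the literal-match scan
    ```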

  4. Databricks Permissions Required to Create a Cluster

    Nov 9, 2023 · To create new clusters, a user needs either: permission to use a cluster policy. This is the recommended way of giving users the ability to create new clusters (DLT, Jobs as well) …
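
    A hedged sketch of granting a user CAN_USE on an existing cluster policy through the Permissions API via the Python SDK; the policy ID and user name are illustrative:

    ```python
    from databricks.sdk import WorkspaceClient
    from databricks.sdk.service import iam

    w = WorkspaceClient()
    # CAN_USE on a policy lets the user create clusters governed by that policy.
    w.permissions.update(
        request_object_type="cluster-policies",
        request_object_id="ABC123DEF456",  # illustrative policy ID
        access_control_list=[
            iam.AccessControlRequest(
                user_name="someone@example.com",
                permission_level=iam.PermissionLevel.CAN_USE,
            )
        ],
    )
    ```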

  5. Convert string to date in databricks SQL - Stack Overflow

    Jul 9, 2021 · Use Databricks datetime patterns. According to the Spark SQL documentation on the Databricks website, you can use datetime patterns specific to Databricks to convert to and from date …
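
    For instance, with a dd/MM/yyyy source string (pattern letters per the Spark/Databricks datetime pattern reference):

    ```python
    # to_date with an explicit pattern; returns a DATE column.
    spark.sql("SELECT to_date('09/07/2021', 'dd/MM/yyyy') AS d").show()
    # +----------+
    # |         d|
    # +----------+
    # |2021-07-09|
    # +----------+
    ```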

  6. Installing multiple libraries 'permanently' on Databricks' cluster ...

    Feb 28, 2024 · The easiest way is to use the Databricks CLI's libraries command for an existing cluster (or the create job command, specifying appropriate params for your job cluster). You can use the REST API itself, same …
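
    A sketch of the same idea through the Python SDK's Libraries API (equivalent to the CLI's libraries command); the cluster ID and packages are illustrative:

    ```python
    from databricks.sdk import WorkspaceClient
    from databricks.sdk.service import compute

    w = WorkspaceClient()
    # Installs persist on the cluster and are re-applied on restart.
    w.libraries.install(
        cluster_id="0123-456789-abcde123",
        libraries=[
            compute.Library(pypi=compute.PythonPyPiLibrary(package="pandas==2.2.2")),
            compute.Library(pypi=compute.PythonPyPiLibrary(package="requests")),
        ],
    )
    ```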

  7. How to import own modules from repo on Databricks?

    Dec 7, 2022 · I have connected a GitHub repository to my Databricks workspace and am trying to import a module that's in this repo into a notebook, also within the repo. The structure is as such: …
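
    A common pattern for this, assuming the notebook lives inside the repo; the package and module names are hypothetical:

    ```python
    import os
    import sys

    # Append the repo root so sibling packages resolve from the notebook.
    sys.path.append(os.path.abspath(".."))
    from my_package import my_module  # hypothetical module inside the repo
    ```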

  8. REST API to query Databricks table - Stack Overflow

    Jul 24, 2022 · Is Databricks designed for such use cases, or is a better approach to copy this table (gold layer) into an operational database such as Azure SQL DB after the transformations are done in PySpark …
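
    If REST access to the table is the goal, one hedged option is the SQL Statement Execution API via the Python SDK; the warehouse ID and table name are illustrative:

    ```python
    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()
    resp = w.statement_execution.execute_statement(
        warehouse_id="abcdef1234567890",
        statement="SELECT * FROM main.gold.my_table LIMIT 10",
    )
    # Short statements usually finish within the default wait; long-running
    # ones need polling with get_statement(resp.statement_id).
    if resp.result and resp.result.data_array:
        for row in resp.result.data_array:
            print(row)
    ```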

  9. Databricks - Download a dbfs:/FileStore file to my Local Machine

    Method 1: Using the Databricks portal GUI, you can download full results (max 1 million rows). Method 2: Using the Databricks CLI. To download full results, first save the file to DBFS and then copy the file to local …
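
    A sketch of Method 2's copy step done with the Python SDK instead of the CLI (databricks fs cp); the DBFS path is illustrative, and the read API returns base64-encoded chunks:

    ```python
    import base64
    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()
    src, dst, offset = "/FileStore/results.csv", "results.csv", 0
    with open(dst, "wb") as f:
        while True:
            chunk = w.dbfs.read(src, offset=offset, length=1024 * 1024)
            if not chunk.bytes_read:
                break
            f.write(base64.b64decode(chunk.data))
            offset += chunk.bytes_read
    ```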

  10. databricks - How to create Storage Credential using Service Principal ...

    Sep 24, 2024 · An Azure Databricks access connector is a first-party Azure resource that lets you connect managed identities to an Azure Databricks account. You must have the Contributor role or …
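
    A hedged sketch of creating a storage credential from such an access connector with the Python SDK; the connector resource ID and credential name are illustrative:

    ```python
    from databricks.sdk import WorkspaceClient
    from databricks.sdk.service import catalog

    w = WorkspaceClient()
    cred = w.storage_credentials.create(
        name="my_storage_cred",
        azure_managed_identity=catalog.AzureManagedIdentityRequest(
            # Resource ID of the Azure Databricks access connector (illustrative).
            access_connector_id=(
                "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
                "Microsoft.Databricks/accessConnectors/<connector-name>"
            )
        ),
    )
    print(cred.name)
    ```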