Product Feedback

  1. Ability to Upload Python Wheel Libraries through API 2.0

    It would be great if we could just upload libraries through the 2.0 API into a Workspace folder rather than a DBFS folder. We will want to use this in our CI/CD pipeline to upload the newest version of our libraries.

    3 votes  ·  0 comments  ·  External libraries / applications
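For reference, the DBFS-based route this request wants to replace looks roughly like the sketch below. The `build_dbfs_put_payload` helper is hypothetical; the `POST /api/2.0/dbfs/put` endpoint is real but limited to roughly 1 MB of contents, so larger wheels need the streaming `create`/`add-block`/`close` calls instead.

```python
import base64

def build_dbfs_put_payload(dbfs_path: str, wheel_bytes: bytes) -> dict:
    # Hypothetical helper: builds the JSON body for the existing
    # POST /api/2.0/dbfs/put endpoint, which expects base64-encoded contents.
    return {
        "path": dbfs_path,
        "contents": base64.b64encode(wheel_bytes).decode("ascii"),
        "overwrite": True,
    }

# In a CI/CD step this would be POSTed with the usual bearer token, e.g.:
#   requests.post(f"{host}/api/2.0/dbfs/put",
#                 headers={"Authorization": f"Bearer {token}"},
#                 json=build_dbfs_put_payload(...))
payload = build_dbfs_put_payload("/FileStore/jars/mylib-1.0-py3-none-any.whl",
                                 b"<wheel bytes>")
print(payload["path"])
```

A workspace-folder upload endpoint, as requested, would remove the base64/size juggling entirely.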
  2. Harden the Databricks Secrets API to prevent leaking redacted secrets

    credentials = dbutils.secrets.get(scope = "<scope-name>", key = "<key-name>")

    print(credentials)

    ## output: [REDACTED]

    ## But by iterating over it character by character, the hidden secret can be recovered:

    for single_key_character in credentials:
        print(single_key_character)

    Output:
    Q
    2
    $
    %
    5
    y
    U
    =
    ;
    5

    ----

    As shown above, the secret can be recovered by iterating over the variable that holds it, one character at a time.
    To address this, iteration over secret-holding variables should be restricted (or per-character output should be redacted as well).

    1 vote  ·  0 comments  ·  External libraries / applications
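One possible shape for the requested restriction, as a plain-Python sketch: `RedactedSecret` below is a hypothetical wrapper, not an existing Databricks type, and the real redaction happens in notebook output filtering (which only matches the whole secret, which is exactly what the per-character loop above bypasses).

```python
class RedactedSecret:
    """Hypothetical wrapper: keeps a secret usable for explicit access
    but blocks casual introspection such as character-by-character iteration."""

    def __init__(self, value: str):
        self._value = value

    def __repr__(self) -> str:
        return "[REDACTED]"

    __str__ = __repr__

    def __iter__(self):
        # Defining __iter__ to raise blocks both `for ch in secret` and iter(secret).
        raise TypeError("iterating over a secret is not allowed")

    def reveal(self) -> str:
        # Single explicit, auditable access point for code that needs the raw value.
        return self._value


secret = RedactedSecret("s3cr3t")
print(secret)             # prints [REDACTED]
try:
    for ch in secret:     # the per-character leak is now blocked
        print(ch)
except TypeError as e:
    print(e)
```

This is only a sketch of the API shape; a determined user could still reach `_value`, so a real fix would have to live on the platform side.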
  3. Implement REST API Documentation Service like Swagger

    Implement a REST API Documentation Service like Swagger.

    This would make the Databricks APIs easier to use.

    Sometimes the available API documentation and examples are not granular enough:
    some API features are not covered by any example, and they can be difficult to implement because the API documentation isn't as detailed as it could be.

    23 votes  ·  2 comments  ·  REST API
  4. Bug in graph order when displaying in chart

    Creating a bar chart based on a temp view or saved table that is sorted by the column it is summarized on (the y-axis) produces an error when trying to aggregate over all results.

    1 vote  ·  0 comments  ·  Notebooks
  5. aliases in URL

    The "databricks URL" seems to be just some kind of account ID.

    We have multiple accounts and it is almost impossible to tell which account I am logged in to on any given webpage. This makes it really easy to accidentally change clusters in "prod" and not in "dev"...

    Most SaaS products show some kind of account name next to the login data. (E.g. in your case that could go in the little pull-down in the upper right: "Logged in as ...".) Also, a URL containing a name rather than a semi-random string (dbc-0a0a0a0a-0000.cloud...) would help.

    1 vote  ·  0 comments
  6. ADLS AD integration

    When an Azure Data Lake Store Gen 2 is mounted on the cluster, I'd like users to be able to mount only the ADLS file system folders that they have access to through Active Directory.

    At the moment if a user mounts a folder from ADLS, that folder is visible on the whole cluster.

    3 votes  ·  1 comment  ·  Other
  7. Bug in display Histogram

    There seems to be a bug in the display of the histogram, where the bars extend beyond the frame of the plot. Please find attached a screenshot of this happening.

    1 vote  ·  0 comments  ·  Dashboards
  8. (untitled)  ·  1 vote  ·  0 comments
  9. jobs could be cloned in the Job UI

    Sometimes I'd like to spin up a new job with all the same cluster settings, etc, as an existing job, but pointing at a different notebook. A "clone" feature would be super useful!

    6 votes  ·  1 comment
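Pending a UI button, a manual clone via the REST API might look like the sketch below. `clone_job_settings` is a hypothetical helper; it assumes the `settings` object returned by `GET /api/2.0/jobs/get` contains a `notebook_task`, and reuses everything else (cluster config, schedule, libraries) unchanged.

```python
import copy

def clone_job_settings(settings: dict, new_name: str, new_notebook_path: str) -> dict:
    # Hypothetical helper: deep-copy an existing job's settings so the
    # original is untouched, then point the copy at a different notebook.
    cloned = copy.deepcopy(settings)
    cloned["name"] = new_name
    cloned["notebook_task"]["notebook_path"] = new_notebook_path
    return cloned

# Sketch of the round trip (requests/auth omitted):
#   settings = GET /api/2.0/jobs/get?job_id=123   ->  response["settings"]
#   POST /api/2.0/jobs/create with
#       clone_job_settings(settings, "my-job-copy", "/Users/me/other-notebook")
```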
  10. Live updates to Spark UI

    Right now I have to refresh the page to see updated data; it would be great if it updated live.

    1 vote  ·  0 comments  ·  Cluster management
  11. Markdown as Base Notebook Format

    If a notebook's syntax could default to Markdown, this would make git integration more seamless (e.g. for a README.md) and would also make it easier to write similarly documentation-focused notebooks.

    3 votes  ·  0 comments  ·  Notebooks
  12. Notebooks required git merge prior to push

    Currently, we have multiple people collaborating on notebooks through GitHub, with each person having checked out a version in their own directory. However, if Person 1 commits code, Person 2 often doesn't notice that this code has been committed. If Person 2 commits code without first pulling, it will effectively undo the changes made by Person 1 and produce no warning at all. This seems to entirely defeat the purpose of source control. Unless I'm mistaken, Databricks doesn't seem to have a "pull and merge" functionality at all. So Person 2 needs to notice the commit, and manually merge the…

    3 votes  ·  0 comments  ·  Notebooks
  13. (untitled)  ·  1 vote  ·  0 comments
  14. visual studio

    For notebooks, a plug-in for Visual Studio would be really nice on Azure Databricks, as an alternative to the Web UI.

    15 votes  ·  0 comments  ·  External libraries / applications
  15. Change favicon of page when a cell is running in Notebook

    Jupyter Notebook does this.

    When a cell is executing, the favicon changes to a "spinner".

    That way, when we come back to the tab, we know whether the job is done.

    1 vote  ·  0 comments  ·  Notebooks
  16. Full-screen file browser

    The side drawer is better than nothing. But actual persistent pages for navigating the filesystem (like in Jupyter Notebook) would be much more helpful.

    3 votes  ·  0 comments  ·  Navigation UI
  17. dbutils.fs.ls would return the lastmodified timestamp of files in addition to the size.

    The dbutils.fs.ls command returns the path, filename and size of the files it lists. With the timestamp the input files can be processed in the proper sequence.

    1 vote  ·  0 comments  ·  Other
  18. libraries load correctly on cluster restart

    When a notebook is attached to a cluster with some user-installed libraries, the notebook does not find the the libraries when the cluster is restarted, or when the cluster was auto-terminated and is started again since last execution. When executing, the notebook reports that the library is not found until I detach the notebook and then attach to the same cluster again.

    1 vote  ·  0 comments  ·  Notebooks
  19. dbutils.fs operations supported wildcards

    Support some sort of wildcard or partial filename matching for the 'from' argument to cp(), rm(), and mv().

    44 votes  ·  0 comments  ·  Data import / export
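Until built-in wildcard support exists, a common workaround is to glob-filter the listing yourself. `match_paths` below is a hypothetical helper using the standard library's `fnmatch`:

```python
import fnmatch
import posixpath

def match_paths(paths, pattern):
    # Hypothetical helper: glob-filter a list of file paths (for example
    # the .path fields returned by dbutils.fs.ls) so the matches can be
    # fed to dbutils.fs.cp/rm/mv one at a time.
    return [p for p in paths
            if fnmatch.fnmatch(posixpath.basename(p), pattern)]

# Sketch of use on a cluster (dbutils only exists there):
#   for p in match_paths([f.path for f in dbutils.fs.ls("dbfs:/data/")], "*.csv"):
#       dbutils.fs.rm(p)
```

Native support in cp()/rm()/mv() would also make the operation atomic from the caller's point of view instead of one call per file.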
  20. Adding an implicit method "display" to Datasets

    This is a little thing, but it would be great if you could include something equivalent to the attached code snippet in the notebook scope:

    import org.apache.spark.sql.Dataset

    implicit def aux[T](df: Dataset[T])(implicit _display: Dataset[T] => Unit) = new {
      def display = _display(df)
    }
    implicit val auxHelp = display(_: Dataset[_])

    This allows the following code to work:

    spark.range(10).display

    1 vote  ·  0 comments  ·  Notebooks