Product Feedback

  1. Add Git support for Visual Studio Team Services to Azure Databricks

    Most people on Azure are likely to also use VSTS as their Git repository. VSTS supports authorization tokens and Git operations just like Git itself, so I would think it should be an easy addition.

    582 votes
    32 comments  ·  Notebooks
  2. Collapsible Headings

    It would be great to have collapsible Markdown headings, just like the "Collapsible Headings" notebook extension in Jupyter. Notebooks tend to become very large very quickly, and this feature would greatly help us.

    72 votes
    1 comment  ·  Notebooks
  3. Support Search and Replace within a Notebook across all Cells

    It would really aid productivity if we could search and replace across all cells within a notebook.

    49 votes
    7 comments  ·  Notebooks
  4. Auto-shutdown and auto-relaunch clusters outside of office hours and working days

    The idea is that when we create a cluster used only for data/model exploration, we can select an option to ensure the cluster gets shut down outside of business hours/days and relaunched in time for the next day of work.
    Of course we can do this manually, but 99% of the time we would forget.

    47 votes
    5 comments  ·  Cluster management

    Happy to announce that the most voted-for feature has been released. You can now set up auto-termination on any cluster that is launched, and the idle timeout before shutdown is configurable.
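
    For API-driven workflows, a minimal sketch of the same setting through the Databricks Clusters REST API (2.0) could look like the following; the workspace URL, token, node type, and Spark runtime version are placeholders rather than values from this thread.

        import requests

        WORKSPACE = "https://<your-workspace>.cloud.databricks.com"  # placeholder
        TOKEN = "<personal-access-token>"                             # placeholder

        payload = {
            "cluster_name": "exploration-cluster",
            "spark_version": "7.3.x-scala2.12",   # assumed runtime; pick one your workspace offers
            "node_type_id": "i3.xlarge",          # assumed node type
            "num_workers": 2,
            "autotermination_minutes": 60,        # shut the cluster down after 60 idle minutes
        }

        resp = requests.post(
            f"{WORKSPACE}/api/2.0/clusters/create",
            headers={"Authorization": f"Bearer {TOKEN}"},
            json=payload,
        )
        resp.raise_for_status()
        print(resp.json())  # response includes the new cluster_id

    Note that auto-termination only covers the shutdown half of the original request; relaunching on a schedule would still need something like a scheduled job or a scripted call to /api/2.0/clusters/start.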

  5. Line Numbers

    Debugging in Databricks is plenty annoying as is. Letting us see line numbers should be a simple change that would make debugging much less frustrating. (Sometimes I've ended up pasting the contents of a cell into a text editor just to find the line producing the error!)

    40 votes
    2 comments  ·  Notebooks

    Both line numbers and command numbers are now available. Under the VIEW menu you can show/hide them. Your preferences will be remembered. Command numbers can be clicked on, which gives you a permalink to that particular command. Such permalinks are great for sharing and pointing collaborators to specific commands in notebooks.

  6. A cluster can scale down to 1 worker if idle for 30 minutes

    If a cluster is idle for more than 30 minutes (or some configurable period), it would be great if it scaled down automatically to just one worker.

    33 votes
    2 comments  ·  Cluster management

    Auto-scaling is now ready (both up-scaling and down-scaling). Due to legacy contracts, older customers need to contact their SA to discuss how to get this enabled.
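
    As a rough sketch under the same Clusters REST API assumptions as the earlier example, an autoscaling range is requested with an "autoscale" block instead of a fixed num_workers:

        # Same placeholder workspace/token assumptions as the earlier sketch.
        autoscaling_cluster = {
            "cluster_name": "shared-autoscaling-cluster",  # illustrative name
            "spark_version": "7.3.x-scala2.12",            # assumed runtime
            "node_type_id": "i3.xlarge",                   # assumed node type
            "autoscale": {
                "min_workers": 1,   # floor the cluster can shrink to when idle
                "max_workers": 8,   # assumed ceiling under load
            },
        }
        # POST this payload to /api/2.0/clusters/create, as in the earlier example.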

  7. Support the ability to group Spark tables into "databases" or folders

    Support the ability to group Spark tables into folders or "databases" so we can organize our datasets more easily per client. Right now it's one long list of tables, which is hard to manage.

    29 votes
    0 comments  ·  Other

    Happy to announce that our third most requested feature has been released. You can now browse all databases in the UI, as well as all the tables inside those databases. Thus, you can organize your tables into different databases.
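
    For reference, organizing tables into per-client databases is standard Spark SQL; the database and table names below are purely illustrative, and this assumes a Databricks notebook where spark and display are available.

        # Create a per-client database and a table inside it (names are illustrative).
        spark.sql("CREATE DATABASE IF NOT EXISTS client_a")
        spark.sql("""
            CREATE TABLE IF NOT EXISTS client_a.orders (id INT, amount DOUBLE)
        """)

        # Tables can then be referenced by qualified name (client_a.orders),
        # or unqualified after running: spark.sql("USE client_a")
        display(spark.sql("SHOW TABLES IN client_a"))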

  8. Allow a repository branch to be specified with the GitHub integration

    Allow a repository branch to be specified with the GitHub integration. For more complicated Git workflows it isn't always best to make changes to the master branch. Being able to specify a custom branch would allow users to create a dev branch for testing and merge it into the master branch when the notebook is complete.

    28 votes
    0 comments  ·  Notebooks
  9. Ability to manage notebook/folder permissions with custom ACL Groups

    Define groups with a list of users, so we can set permissions on a notebook/folder for that group.

    Currently, whenever we add a user to a team, there is a lot of manual work adding that user to many notebooks/folders.

    26 votes
    1 comment  ·  Notebooks
  10. Support for the Bokeh and Plotly Python visualization libraries

    I have unsuccessfully tried many times to add these libraries for awesome visualizations and wasn't able to make them work.
    For the sake of great visualizations, these two libraries should be included by default, or users should at least be able to use them successfully in notebooks (see the sketch below this item).

    26 votes
    5 comments  ·  External libraries / applications
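
    For the Plotly half of this request, a minimal sketch of one workaround, assuming the library has been attached to the cluster (for example installed as a cluster library) and that the notebook exposes displayHTML:

        import plotly.graph_objects as go

        fig = go.Figure(data=go.Scatter(x=[1, 2, 3], y=[4, 1, 7], mode="lines+markers"))

        # Databricks notebooks do not render Plotly figures inline by default;
        # converting the figure to HTML and passing it to displayHTML is one workaround.
        displayHTML(fig.to_html(include_plotlyjs="cdn"))
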
  11. Support Launching Spark Clusters with different HDFS/Hadoop versions

    Choose the HDFS/Hadoop version for the Spark Cluster.

    25 votes
    0 comments
  12. Show which notebooks are running specific commands

    It would be nice to be able to find out which notebook ran a specific job on a cluster.

    22 votes
    0 comments  ·  Cluster management

    It is now possible to see in the detailed clusters page which notebooks are actively running vs being idle. It is also possible to go into a notebook, click Schedule and see the list of jobs that are using that notebook. If you have additional use cases you’d like us to cover w.r.t. insight into what’s running, please contact us.

  13. See which notebooks are currently running any command

    For small, one-off tasks we use a shared default cluster. When we need to restart that cluster we do not want to kill any running tasks, so we have to go to the Clusters UI, expand the cluster in question, and open each notebook attached to that cluster to check whether it's running any command. This is tedious when we have a lot of notebooks attached to the cluster.

    It would be cool if the UI showed whether the notebook is running any command or not.

    22 votes
    1 comment  ·  Notebooks

    This is now available.
    1. Click on Clusters in the navigation bar.
    2. Then click on the relevant cluster in the cluster list. This takes you to a detailed page about the selected cluster.
    3. Now click on the “Notebooks” tab. The middle “status” column will say either “Idle” or “Running”, depending on whether the listed notebook is currently running a command.

  14. Version control for notebooks

    Provide version control for notebooks that allows for easy viewing and reversion to earlier versions.

    20 votes
    1 comment  ·  Notebooks

    Version control is now available in the professional tier (Databricks 1.4.1). Please try it out and let us know what you think.

    Sorry about the earlier typo. Versioning is indeed available.

  15. Add Multi-Factor Authentication (MFA) to login

    Adding Multi-Factor Authentication (MFA) to the login would be nice.

    20 votes
    4 comments  ·  Other

    This is now available through most identity providers, which you can connect to using SAML 2.0; this is how most Databricks customers do MFA. Please let us know if you would still like to have MFA outside of SSO/SAML connectivity.

  16. Need IDE support

    We need IDE support (or at least support for syntax highlighting / auto-completion / auto-indentation) for usability and ease of development / test.

    19 votes
    completed  ·  4 comments  ·  Notebooks
  17. Scala 2.11

    It would be great if Databricks would support the latest version of Scala and a few previous versions similar to the way that it allows you to pick your version of Spark when you create a cluster.

    19 votes
    0 comments  ·  Other
  18. 19 votes
    1 comment  ·  Cluster management
  19. "Run all" should reset scope

    "Run all" should reset scope, similar to how detach+attach does.

    This would make it easy to "validate" the notebook works correctly and to clear the slate after some manual fussing around.

    13 votes
    2 comments  ·  Dashboards
  20. Store notebooks in a Git repo accessible from outside the cluster

    My preferred method of offline editing would involve cloning a git repo, and pushing my changes when I got back online.

    13 votes
    3 comments  ·  Notebooks