Product Feedback

  1. UX fix

    Hi,

    If you work with multiple people inside the same notebook, the other person's cursor is reflected incorrectly (it jumps to my current edit position when that person is idle).
    Please fix.
    Thanks

    3 votes  ·  1 comment  ·  Navigation UI
  2. It would be great if cells showed the amount of time they have been running (not just when finished)

    When you run a cell, it shows you the amount of time it took, *when the cell finished*.

    It would be great if it showed you the time it has taken so far, *while* executing.

    2 votes  ·  0 comments  ·  Notebooks
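
    Until the UI shows elapsed time during execution, one rough notebook-side workaround is a context manager that prints progress from a background thread. This is only a sketch (not a Databricks feature), and whether the periodic prints appear while the cell is still running may depend on how the notebook buffers output.

    ```
    import threading
    import time

    class ElapsedTimer:
        """Periodically print how long the wrapped block has been running."""

        def __init__(self, interval=10):
            self.interval = interval       # seconds between progress prints
            self._stop = threading.Event()

        def _tick(self, start):
            while not self._stop.wait(self.interval):
                print(f"... still running after {time.time() - start:.0f}s")

        def __enter__(self):
            self._start = time.time()
            threading.Thread(target=self._tick, args=(self._start,), daemon=True).start()
            return self

        def __exit__(self, *exc):
            self._stop.set()
            print(f"finished in {time.time() - self._start:.1f}s")

    # Usage inside a cell:
    with ElapsedTimer(interval=10):
        time.sleep(25)  # stand-in for a long-running computation
    ```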
  3. Include more metrics in the historical Ganglia snapshot

    It's nice that clusters have historical snapshots of the ganglia UI. It currently shows the "load_one" metric for all nodes on the cluster. It would also be nice to add some report metrics to this snapshot -- e.g. mem_report, cpu_report, network_report, and disk_report. This makes it easier to debug issues with jobs that terminate clusters upon completion or failure.
    Thanks!

    1 vote  ·  0 comments  ·  Other
  4. Support for JSON representation to exceed 10,000 bytes

    In an HTTP POST request, the JSON representation cannot exceed 10,000 bytes.
    If I have a large request body, how can I overcome this limit?

    1 vote  ·  0 comments  ·  REST API
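
    One possible workaround (an assumption about this use case, not an official recommendation) is to stage large payloads through the DBFS streaming endpoints, which accept data in base64-encoded blocks, and then reference the DBFS path instead of posting the body inline. The host, token, and paths below are placeholders.

    ```
    import base64
    import requests

    HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
    TOKEN = "<personal-access-token>"                        # placeholder
    HEADERS = {"Authorization": f"Bearer {TOKEN}"}

    def dbfs_upload(local_path, dbfs_path, chunk_size=512 * 1024):
        """Upload a large file to DBFS in chunks via create/add-block/close."""
        r = requests.post(f"{HOST}/api/2.0/dbfs/create", headers=HEADERS,
                          json={"path": dbfs_path, "overwrite": True})
        r.raise_for_status()
        handle = r.json()["handle"]
        with open(local_path, "rb") as f:
            while True:
                chunk = f.read(chunk_size)  # kept small to stay under the per-block limit
                if not chunk:
                    break
                requests.post(f"{HOST}/api/2.0/dbfs/add-block", headers=HEADERS,
                              json={"handle": handle,
                                    "data": base64.b64encode(chunk).decode()}
                              ).raise_for_status()
        requests.post(f"{HOST}/api/2.0/dbfs/close", headers=HEADERS,
                      json={"handle": handle}).raise_for_status()

    # Example: stage a large JSON body on DBFS rather than posting it inline.
    # dbfs_upload("payload.json", "/tmp/payload.json")
    ```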
  5. Plot options does not support ambiguous column names

    Given I am in Plot Options
    When a dataframe has two columns with the same name
    Then I can use an alias to differentiate the columns

    My situation is that I left joined two tables. Nulls from non-joined rows take precedence when column names are the same, and I am unable to specify an alias for the tables on the Plot Options screen.

    1 vote  ·  0 comments  ·  Visualizations
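
    Until Plot Options can disambiguate duplicate names, one workaround is to alias the clashing columns in the DataFrame before calling display(). The tables and column names below are invented purely for illustration.

    ```
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Toy tables whose value column shares the name "amount".
    orders = spark.createDataFrame([(1, 100.0), (2, 250.0)], ["order_id", "amount"])
    refunds = spark.createDataFrame([(1, 40.0)], ["order_id", "amount"])

    # Left join, then give each duplicate column a distinct alias so the
    # Plot Options screen sees unambiguous names.
    joined = (orders.alias("o")
              .join(refunds.alias("r"),
                    F.col("o.order_id") == F.col("r.order_id"), "left")
              .select(F.col("o.order_id"),
                      F.col("o.amount").alias("order_amount"),
                      F.col("r.amount").alias("refund_amount")))

    display(joined)  # in a Databricks notebook; use joined.show() elsewhere
    ```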
  6. Make the green dot next to the active clusters more colorblind friendly

    I have moderate red-green colorblindness and sometimes it's hard for me to tell whether a cluster is active in the interface. If you could change the color of the active dot to a brighter, more colorblind-friendly green like #00FF48, or even add a little check mark inside the circle, it would be much appreciated.

    1 vote  ·  0 comments  ·  Notebooks
  7. Not able to add external library glpk (Python)

    After adding it via an init script, the cluster is not able to start.

    1 vote  ·  0 comments  ·  External libraries / applications
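
    For reference, a hedged sketch of a cluster init script that installs GLPK and writes its own log, so a failed cluster start leaves something to inspect. The paths and the pip package name (swiglpk) are assumptions and may need adjusting for this environment.

    ```
    # Run once from a notebook to stage the init script on DBFS.
    script = """#!/bin/bash
    set -ex
    exec > /tmp/install-glpk.log 2>&1  # keep a log to inspect if the cluster fails to start

    apt-get update
    apt-get install -y glpk-utils libglpk-dev

    # Python bindings -- the package name is an assumption (swiglpk is one option).
    /databricks/python/bin/pip install swiglpk
    """

    dbutils.fs.put("dbfs:/databricks/init-scripts/install-glpk.sh", script, True)
    ```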
  8. Support the new us-west-2d availability zone

    us-west-2d has consistently low spot pricing for some instance types. Please add support for it.

    1 vote  ·  0 comments  ·  Cluster management
  9. I'd like a searchable way to access workspaces instead of a non-scrollable menu.

    I will have many Databricks workspaces for my clients to search through and manage, possibly 100+. The current list is not scrollable when you have many workspaces to juggle; I've already run out of space with 8 workspaces. I'd like to be able to search by DBX resource name and by the Azure subscription it's deployed in.

    3 votes  ·  0 comments  ·  Other
  10. Use AAD authentication in the REST API and Databricks CLI instead of user tokens

    Use Azure Active Directory (AAD) authentication in the REST API (and Databricks CLI) instead of user tokens

    33 votes  ·  5 comments  ·  REST API
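
    For context, a hypothetical sketch of what the requested flow could look like on the client side: acquire an AAD token with MSAL for the Azure Databricks resource and send it as a bearer token. The IDs, secret, and workspace URL are placeholders, and whether the service accepts such a token for a given endpoint is exactly what this idea is asking for.

    ```
    import msal
    import requests

    TENANT_ID = "<tenant-id>"              # placeholders for illustration
    CLIENT_ID = "<app-registration-id>"
    CLIENT_SECRET = "<client-secret>"
    WORKSPACE_URL = "https://<region>.azuredatabricks.net"

    # Commonly cited resource ID for the Azure Databricks first-party application.
    DATABRICKS_SCOPE = "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default"

    app = msal.ConfidentialClientApplication(
        CLIENT_ID,
        authority=f"https://login.microsoftonline.com/{TENANT_ID}",
        client_credential=CLIENT_SECRET,
    )
    token = app.acquire_token_for_client(scopes=[DATABRICKS_SCOPE])
    assert "access_token" in token, token.get("error_description")

    # Send the AAD token where a personal access token would normally go.
    resp = requests.get(
        f"{WORKSPACE_URL}/api/2.0/clusters/list",
        headers={"Authorization": f"Bearer {token['access_token']}"},
    )
    print(resp.status_code)
    ```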
  11. Command hangs for a long time in "Running command"

    Sometimes my command will hang in "Running command". The code is really simple, like print("hello world"), which makes my experience really, really bad! Could you provide more support or information about why this happens?

    1 vote  ·  0 comments
  12. Add an S3 PutObjectAcl function to dbfsutils

    Our clusters write to cross-account S3 buckets. I have already configured the BucketOwnerFullControl ACL in the Spark configuration,
    but this output data also needs to be accessible from additional account roles for audit purposes, etc.

    I would like dbfsutils (or a small set of helper functions) to support the S3 PutObjectAcl operation.

    1 vote  ·  1 comment  ·  Data import / export
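
    Until something like this exists in dbfsutils, one interim workaround is to call S3 directly from a notebook with boto3, assuming the cluster's instance profile is allowed s3:PutObjectAcl on the bucket. The bucket, prefix, and canonical user ID below are placeholders.

    ```
    import boto3

    s3 = boto3.client("s3")

    BUCKET = "my-cross-account-bucket"                          # placeholder
    PREFIX = "output/run-2019-05-01/"                           # placeholder
    AUDIT_ACCOUNT_ID = "<canonical-user-id-of-audit-account>"   # placeholder

    # Grant read access on each written object to the additional account.
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
        for obj in page.get("Contents", []):
            s3.put_object_acl(
                Bucket=BUCKET,
                Key=obj["Key"],
                GrantRead=f'id="{AUDIT_ACCOUNT_ID}"',
            )
    ```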
  13. New Age Roofers

    Address:
    6506 Elkhurst Dr #10
    Plano, Texas
    75023

    Primary phone:
    469-995-8967

    Website
    http://www.planocontractor.co/

    Primary category:
    Roofing Contractor

    Hours:
    24 hours

    Owner:
    Betty Jenkins

    Business Email:
    info@planocontractor.co

    Keywords:
    Roofing Contractor, Siding Contractor, Window Installation Service, Door Supplier, Gutter Installation, Commercial Services

    Description:
    The roofers at New Age Roofers Plano TX can make small repairs, fix significant problems, and install brand-new products for one or many structures. We provide roof inspections to assess the safety of your roofing system. The team has over a decade of experience installing doors, windows, and roofing systems. We additionally supply doors…

    1 vote  ·  0 comments
  14. Support GPU clusters (g3) in North California region

    When we try to create a cluster with GPU instances, we see that the GPU instance types available are p2 and p3, which are not available in our current installation region (North California).

    For the us-west-1 region, where our current account is deployed, the only GPU instance types supported by AWS are the g3 instance types.

    1 vote  ·  0 comments  ·  Cluster management
  15. We could have multiple regions in the same account

    Currently only one region is supported per account. It would be great if we could add more than one region and choose which region to create our clusters in.

    1 vote  ·  0 comments  ·  Account management
  16. Azure - Make NCv3 available in North Europe

    Currently, only NCv1 instances (K80 GPUs) are available for GPU-accelerated ML workflows; these are excruciatingly slow and small compared to the NCv3 instances that run V100s.
    We have all our data in North Europe and are therefore stuck with K80s, putting us at a major competitive disadvantage.
    Furthermore, a lot of frameworks have cumbersome multi-GPU training workflows, so it is preferable to use one larger GPU over several smaller ones.

    3 votes  ·  0 comments  ·  Cluster management
  17. We could recreate the key after it has been (mistakenly) deleted

    After running successfully for a few days on AWS, we mistakenly deleted the key pair from AWS.

    Now we are getting the following error when trying to start new clusters:

    ```
    Time
    2019-05-08 12:14:02 CEST
    Message
    Cluster terminated. Reason: Cloud Provider Launch Failure

    A cloud provider error was encountered while launching worker nodes. See the Databricks guide for more information.

    AWS API error code: InvalidKeyPair.NotFound

    AWS error message: The key pair 'dbe-worker-XYZ' does not exist

    ```

    How can we force Databricks to recreate the key?
    Deleting the IAM role and adding it back to Databricks didn't help; it still expects the same…

    6 votes  ·  0 comments
  18. Add option to upgrade workspace to premium from standard

    There is currently no way to upgrade a workspace from the standard tier to the premium tier. Rather than having to export all notebooks, create a new workspace, port everything over, re-mount storage endpoints, re-add all users, and import the notebooks, it would be preferable to be able to click a button.

    52 votes  ·  2 comments
  19. We need the ability to mask/obfuscate sensitive data in Databricks notebooks.

    We have sensitive data on S3 that we'd like to be able to mask/obfuscate. Traditional databases have functionality to mask columns based on security permissions; the link below shows how SQL Server handles it. For example, a user might only be able to see the last 4 digits of an SSN. We need the same functionality when viewing data in Databricks notebooks.

    https://docs.microsoft.com/en-us/sql/relational-databases/security/dynamic-data-masking?view=sql-server-2017

    1 vote  ·  0 comments
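
    Until column-level masking exists natively, one rough workaround is to expose only masked derivations of the sensitive columns, for example through a temp view that notebook users query instead of the raw table. The data and column names below are invented; unlike SQL Server's dynamic data masking, this is enforced only by what the view selects, not by per-user permissions.

    ```
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Toy data standing in for a sensitive table on S3.
    people = spark.createDataFrame(
        [("Alice", "123-45-6789"), ("Bob", "987-65-4321")],
        ["name", "ssn"],
    )

    # Keep only the last 4 digits of the SSN visible.
    masked = people.select(
        "name",
        F.concat(F.lit("XXX-XX-"), F.substring("ssn", -4, 4)).alias("ssn_masked"),
    )
    masked.createOrReplaceTempView("people_masked")
    masked.show()
    ```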
  20. dbutils.widgets would become available in Databricks Connect

    I use dbutils.widgets to set variables in my Databricks notebooks. It would be nice if there were an implementation of this submodule in Databricks Connect so that I could use an IDE for Databricks development.

    3 votes  ·  0 comments  ·  External libraries / applications
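
    As a stopgap, a small shim can mimic the widget calls a notebook relies on and fall back to environment variables when dbutils is not available (e.g. when running from an IDE). The fallback behaviour and the WIDGET_ environment-variable convention are assumptions for illustration, not part of Databricks Connect.

    ```
    import os

    class LocalWidgets:
        """Minimal stand-in for dbutils.widgets when running outside a notebook."""

        def __init__(self):
            self._defaults = {}

        def text(self, name, defaultValue="", label=None):
            self._defaults[name] = defaultValue

        def get(self, name):
            # Values come from WIDGET_<NAME> environment variables, else the default.
            return os.environ.get(f"WIDGET_{name.upper()}", self._defaults.get(name, ""))

    try:
        widgets = dbutils.widgets   # inside a Databricks notebook
    except NameError:
        widgets = LocalWidgets()    # local IDE / Databricks Connect session

    widgets.text("run_date", "2019-01-01")
    print(widgets.get("run_date"))
    ```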