Product Feedback

  • Collapsible Headings in HTML export

    It would be great if the HTML export of a notebook with collapsible headings would also have collapsible headings.

    39 votes · 0 comments · Notebooks
  • Azure Active Directory

    I would like to grant users access based on Azure AD security groups, instead of managing users individually and directly in Databricks.

    57 votes · 0 comments · Account management
  • We can provide relationships between Spark SQL tables

    Add the capability to provide informational relationships, constraints and hints that would help with building a data model within Databricks itself. I understand this is a Spark feature to provide informational referential integrity [SPARK-19842] that has been pending since 2017, but it hasn't moved on. If Databricks can provide a similar feature, at least with Delta, then there is no need to use an intermediary model-staging service such as an RDBMS, OLAP or MPP system when visualizing with tools like Power BI or Tableau.

    19 votes · 0 comments · Other
  • Undo delete cell

    As the title says!

    Or, in fact, undo any non-editor command.

    6 votes · 1 comment · Notebooks
  • Add option to upgrade workspace to premium from standard

    There is currently no way to upgrade a workspace from the standard tier to the premium tier. Rather than having to export all notebooks, create a new workspace, port everything over, re-mount storage endpoints, re-add all users, and import the notebooks, it would be preferable to be able to click a button.

    50 votes · 1 comment
  • Allow management of Groups through the UI

    Allow management of Groups through the UI rather than just through the Groups API.

    Only being able to Add/Remove/List Users in a Group through the Groups API is inconvenient and restrictive.

    The current state of Permissions management for administrators of Databricks is not a great experience.

    When a new User is granted access, the Administrators need to go and manually grant the User Permission to all of the Libraries/Folders/Jobs/Clusters, etc.

    This manual process has resulted in Users being granted Permissions that they shouldn't have or not being granted Permissions that they should have.

    We want to use Groups to help…

    35 votes · 2 comments · Navigation UI
  • Use AAD authentication in the REST API and Databricks CLI instead of user tokens

    Use Azure Active Directory (AAD) authentication in the REST API (and Databricks CLI) instead of user tokens.

    17 votes · 3 comments · REST API
  • Display Function in Notebooks could be used to generate multiple plots

    I would like to be able to generate a set of visuals from a single display function call or SQL query output without having to rerun the same chunk multiple times.

    13 votes · 0 comments · Notebooks
  • ADLS AD integration

    When an Azure Data Lake Store Gen 2 is mounted on the cluster, I'd like users to be able to mount only the ADLS file system folders that they have access to through Active Directory.

    At the moment, if a user mounts a folder from ADLS, that folder is visible on the whole cluster.
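
    For context, here is a minimal sketch of how an ADLS Gen2 mount is typically created today with a single service principal (the secret scope, keys, storage account and container names below are placeholders, not from this request). Because the mount is backed by the service principal's credentials rather than the calling user's AAD identity, the mounted path ends up visible to everyone on the cluster:

      // Sketch only: placeholder secret scope/keys and storage names.
      val clientId     = dbutils.secrets.get(scope = "adls", key = "client-id")
      val clientSecret = dbutils.secrets.get(scope = "adls", key = "client-secret")
      val tenantId     = dbutils.secrets.get(scope = "adls", key = "tenant-id")

      val configs = Map(
        "fs.azure.account.auth.type"              -> "OAuth",
        "fs.azure.account.oauth.provider.type"    -> "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id"       -> clientId,
        "fs.azure.account.oauth2.client.secret"   -> clientSecret,
        "fs.azure.account.oauth2.client.endpoint" -> s"https://login.microsoftonline.com/$tenantId/oauth2/token"
      )

      // The mount runs with the service principal's identity, not the user's,
      // so every user on the cluster sees the same mounted folder.
      dbutils.fs.mount(
        source       = "abfss://container@storageaccount.dfs.core.windows.net/",
        mountPoint   = "/mnt/data",
        extraConfigs = configs
      )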

    3 votes · 1 comment · Other
  • Jobs could be cloned in the Job UI

    Sometimes I'd like to spin up a new job with all the same cluster settings, etc., as an existing job, but pointing at a different notebook. A "clone" feature would be super useful!

    6 votes · 1 comment
  • Databricks could integrate better with a Python IDE

    Similar to the RStudio Server integration which has become available recently, it would be great to have an integration with at least one Python IDE (PyCharm, Spyder, etc.). Is such an integration planned for the near future? The development of code in notebooks can be rather cumbersome for larger projects.

    14 votes · 1 comment · Other
  • Markdown as Base Notebook Format

    If a notebook's syntax could default to Markdown, this would make git integration more seamless for e.g. a README.md, and would also make it easier to write similarly documentation-focused notebooks.

    3 votes · 0 comments · Notebooks
  • Notebooks required git merge prior to push

    Currently, we have multiple people collaborating on notebooks through GitHub, with each person having checked out a version in their own directory. However, if Person 1 commits code, Person 2 often doesn't notice that this code has been committed. If Person 2 commits code without first pulling, it will effectively undo the changes made by Person 1 and produce no warning at all. This seems to entirely defeat the purpose of source control. Unless I'm mistaken, Databricks doesn't seem to have a "pull and merge" functionality at all. So Person 2 needs to notice the commit, and manually merge the…

    3 votes · 0 comments · Notebooks
  • GitLab integration

    Add GitLab support, like the existing GitHub integration, so I can store my notebooks there, and optionally add an automatic PDF export when committing so I can keep everything in the same location.

    11 votes · 0 comments · Notebooks
  • Change favicon of page when a cell is running in Notebook

    Jupyter Notebook does this.

    When a cell is running, it changes the favicon to a "spinner".

    This way we can go back to the tab once we know that the job is done.

    1 vote · 0 comments · Notebooks
  • dbutils.fs.ls would return the last-modified timestamp of files in addition to the size

    The dbutils.fs.ls command returns the path, filename and size of the files it lists. With the timestamp, the input files can be processed in the proper sequence.
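
    In the meantime, a possible workaround is to list the directory through the Hadoop FileSystem API, which does expose the modification time (a sketch, assuming a Scala notebook; "/mnt/input" is a placeholder path):

      import org.apache.hadoop.fs.{FileStatus, Path}

      // List files under a directory with their last-modified timestamps,
      // then sort so they can be processed in order of arrival.
      val dir = new Path("/mnt/input")
      val fs  = dir.getFileSystem(spark.sparkContext.hadoopConfiguration)

      val filesByTime: Array[FileStatus] = fs
        .listStatus(dir)
        .filter(_.isFile)
        .sortBy(_.getModificationTime)

      filesByTime.foreach { s =>
        println(s"${s.getPath} ${s.getLen} ${s.getModificationTime}")
      }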

    1 vote · 0 comments · Other
  • Libraries load correctly on cluster restart

    When a notebook is attached to a cluster with some user-installed libraries, the notebook does not find the libraries when the cluster is restarted, or when the cluster was auto-terminated and has been started again since the last execution. When executing, the notebook reports that the library is not found until I detach the notebook and then attach it to the same cluster again.

    1 vote · 0 comments · Notebooks
  • Adding an implicit method "display" to Datasets

    This is a little thing, but it would be great if you could include something equivalent to the attached code snippet in the notebook scope:

      import org.apache.spark.sql.Dataset

      // Adds a .display method to any Dataset by delegating to the
      // notebook-provided display(...) function, captured implicitly.
      implicit def aux[T](df: Dataset[T])(implicit _display: Dataset[T] => Unit) = new {
        def display = _display(df)
      }
      implicit val auxHelp = display(_: Dataset[_])

    This allows the following code to work:

      spark.range(10).display

    1 vote · 0 comments · Notebooks
  • Editing notebooks didn't have so much lag. It is almost unusable.

    Editing notebooks is so slow sometimes, regardless of computer, internet connection, etc. Rebooting doesn't help, more RAM or CPU doesn't help, a different browser doesn't help. It's awful, with 10 or more seconds of lag just typing individual characters or clicking/highlighting different parts of the notebook. HELP!

    12 votes · 1 comment · Notebooks