Please enable server mode with industry-standard JDBC and ODBC connectivity options11 votes
It would be great if Databricks would support the latest version of Scala and a few previous versions similar to the way that it allows you to pick your version of Spark when you create a cluster.19 votes
We now support Scala 2.11 JARs for Spark 2.0. The feature is experimental but will be stabilized over the coming months.
Needed for auditing.4 votes
Databricks' enterprise offering now comes with complete audit logs that cover most of the actions taken on the platform.
Support the ability to group Spark tables into folders or "databases" so we can organize our datasets more easily per client. Right now it's a long list of tables that is hard to manage.29 votes
Happy to announce that our third most requested feature has been released. You can now browse all databases in the UI, as well as all the tables inside those databases, so you can organize your tables into different databases.
Chapter 2 notebook should have the right path: "val lines = sc.textFile("file:///dbfs/learning-spark-master/README.md") // Create an RDD ..."
Learning Spark Chapter 2 Examples in Scala
Example 2-2. Scala line count
should have the right path:
val lines = sc.textFile("file:///dbfs/learning-spark-master/README.md") // Create an RDD called lines
Because this was executed from Initial Setup in the Overview notebook when the following was evaluated:
os.system("cd /dbfs; rm master.zip; rm -rf learning-spark-master; wget https://github.com/databricks/learning-spark/archive/master.zip; unzip master.zip;ls")1 vote
Thank you. This will be fixed in the next release of Databricks.
Without a date/time stamp it's hard to tell when an idea turned into a planned item or when a response to feedback was submitted.1 vote
Unfortunately this is a limitation of the UserVoice portal software. Note, however, that you can get the list of the newest suggestions:
If you enable the GitHub integration option at the basic account level, it would help us feel safe investing our time and energy into developing our solution on Databricks. As we grow we will move up to the pro-level account, but right now we are unclear whether our work is backed up and safe. This should not be a cost to you at all, so please provide it. Thanks.3 votes
All databases run distributed, with a replica in another datacenter. Furthermore, we do database dumps every day and upload them to S3, which has eleven 9s of durability.
DBC has a login screen that can be accessed over non-SSL. It first redirects you to SSL, but once you have the jsessionid you can request the non-SSL page.
This sets off warnings and alerts when our security team does an audit of our publicly reachable servers.
The request is to please always restrict the login screen to SSL.3 votes
This is now going out in Databricks 2.2, which is being released this coming week.
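For anyone fronting their own web application with the same problem, the fix requested above can be sketched as a reverse-proxy rule. This is a minimal illustration assuming an nginx front end with a hypothetical hostname; it says nothing about how Databricks actually implemented it:

```
server {
    listen 80;
    server_name example.cloud.databricks.com;  # hypothetical hostname
    # Redirect every plain-HTTP request to HTTPS, including requests
    # that already carry a jsessionid cookie, so the login screen is
    # never served over non-SSL.
    return 301 https://$host$request_uri;
}
```

A permanent redirect on every port-80 request (rather than only on the login URL) avoids the gap described in the report, where a session obtained over SSL could still fetch pages over plain HTTP.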
Add a Search text field to find threads in the Forum.1 vote
This is now possible. Inside Databricks, click on the question mark at the top right and then enter your search term where it says "Search Guide & Forum". This will search all the forum posts and give you links to them.
In the Jobs page, it would be good to have a new column for the status of the most recent job1 vote
This is an excellent suggestion and we have now implemented it. It will be released in about three weeks after QA is done. Thanks Mohan.
Adding Multi-Factor Authentication (MFA) to the login would be nice.20 votes
This is now available through most identity providers, which you can connect to using SAML 2.0. This is how most of the Databricks customers are doing MFA. Please let us know if you still would like to have MFA outside the SSO/SAML connectivity.
Add a scroll bar to the notebook-attach list in Chrome; if the list is long you have to use Ctrl+F.1 vote
Thanks for reporting this bug. We have fixed it and a scroll bar now appears.
Add windowing functions to Spark SQL, such as lead, lag, first_value, and last_value.11 votes
This was just completed for Spark 1.4. This version of Spark is still being QA'd, but we're hoping to have an early preview of it in Databricks Cloud before the official 1.4 release.
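lead, lag, and first_value are standard SQL window functions, so their semantics can be illustrated without a Spark cluster. The sketch below uses Python's sqlite3 module (SQLite 3.25+ supports window functions); the `sales` table and its columns are made up for the demo and are not part of any Spark example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day INTEGER, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [(1, 100), (2, 150), (3, 120)])

rows = conn.execute("""
    SELECT day,
           amount,
           LAG(amount)  OVER (ORDER BY day) AS prev_amount,
           LEAD(amount) OVER (ORDER BY day) AS next_amount,
           FIRST_VALUE(amount) OVER (ORDER BY day) AS first_amount
    FROM sales
    ORDER BY day
""").fetchall()

# LAG is NULL on the first row and LEAD is NULL on the last,
# since there is no earlier/later row inside the window.
for row in rows:
    print(row)
```

The same OVER (ORDER BY ...) clauses carry over to Spark SQL once windowing support landed in 1.4, though Spark also offers a DataFrame-level Window API.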
Allow users to set export limits and remove existing default limit for CSV exports9 votes
The default limits have been improved. We are also considering making the limits configurable.
To be able to manage files on the cluster: including delete and download.7 votes