Product Feedback
- Use Azure Active Directory (AAD) authentication in the REST API and the Databricks CLI instead of user tokens
81 votes
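Azure Databricks can accept AAD access tokens as bearer tokens in place of personal access tokens. A minimal sketch, assuming the Azure CLI is logged in and that the GUID below is the Azure Databricks first-party application (resource) ID; the workspace URL is a placeholder:

```python
# Sketch: authenticate to the Databricks REST API with an AAD token
# instead of a personal access token. Assumes the Azure CLI is logged in;
# the resource GUID is the Azure Databricks first-party application ID.
import json
import subprocess
import requests

DATABRICKS_RESOURCE_ID = "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d"
WORKSPACE_URL = "https://<workspace-host>"  # placeholder

# Obtain an AAD access token for the Databricks resource via the Azure CLI.
raw = subprocess.check_output(
    ["az", "account", "get-access-token", "--resource", DATABRICKS_RESOURCE_ID]
)
aad_token = json.loads(raw)["accessToken"]

# Use the AAD token as the bearer token in place of a user token.
resp = requests.get(
    f"{WORKSPACE_URL}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {aad_token}"},
)
resp.raise_for_status()
print(resp.json())
```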
- Support for JSON representation to exceed 10,000 bytes
In an HTTP POST request, the JSON representation cannot exceed 10,000 bytes. If I have a large request body, how do I overcome this limit?
1 vote
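Until the cap is raised, the practical mitigation is client-side: measure the serialized body before posting and slim or split it when it is too large. A minimal sketch, assuming the 10,000-byte limit applies to the UTF-8 encoded JSON body; URL and token are placeholders:

```python
# Sketch: check the serialized size of a request body before POSTing,
# assuming the service rejects JSON representations over 10,000 bytes.
import json
import requests

MAX_JSON_BYTES = 10_000

def post_json(url: str, payload: dict, token: str) -> requests.Response:
    body = json.dumps(payload)
    size = len(body.encode("utf-8"))
    if size > MAX_JSON_BYTES:
        # The caller must slim down or split the payload; the API
        # will otherwise reject it.
        raise ValueError(f"JSON body is {size} bytes; limit is {MAX_JSON_BYTES}")
    return requests.post(
        url,
        data=body,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
```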
- Upload library for all clusters through API
If a library has been set to be installed on all clusters, is_library_for_all_clusters will be true. Right now, this can only be done through the home page. It would be great if this could be done through the API or CLI.
3 votes
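In the meantime, a workaround is to iterate over clusters with the Clusters API and install the library on each one via the Libraries API. A sketch, with placeholder host and token and an example PyPI library:

```python
# Sketch of a workaround: install a library on every cluster by iterating
# the Clusters API, since the "all clusters" flag is UI-only today.
import requests

HOST = "https://<workspace-host>"              # placeholder
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder
LIBRARY = {"pypi": {"package": "requests==2.31.0"}}  # example library

clusters = requests.get(f"{HOST}/api/2.0/clusters/list", headers=HEADERS).json()
for cluster in clusters.get("clusters", []):
    requests.post(
        f"{HOST}/api/2.0/libraries/install",
        headers=HEADERS,
        json={"cluster_id": cluster["cluster_id"], "libraries": [LIBRARY]},
    ).raise_for_status()
```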
- Implement a REST API documentation service like Swagger
This would make the Databricks APIs easier to use. Sometimes the available API documentation and examples are not granular enough: there are API features not covered by the examples, and these can be difficult to implement because the API documentation isn't as detailed as it could be.
23 votes
- Queue jobs when the maximum number of concurrent runs is reached
I run my jobs through the API, with parameters. There is a limit on concurrent jobs, which is fine, but when the limit is hit, jobs should not be skipped; they should instead be queued and executed once the number of running jobs allows it again.
25 votes
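Until queuing is built in, it can be approximated client-side by polling the active run count and submitting only when a slot is free. A sketch against the 2.0 Jobs API; host, token, and the concurrency limit are placeholders, and the limit must match the job's max_concurrent_runs:

```python
# Sketch of a client-side queue: wait until the number of active runs for a
# job drops below a limit, then trigger the next run instead of skipping it.
import time
import requests

HOST = "https://<workspace-host>"              # placeholder
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder
MAX_CONCURRENT = 5  # assumed to match the job's max_concurrent_runs

def active_run_count(job_id: int) -> int:
    resp = requests.get(
        f"{HOST}/api/2.0/jobs/runs/list",
        headers=HEADERS,
        params={"job_id": job_id, "active_only": "true"},
    )
    resp.raise_for_status()
    return len(resp.json().get("runs", []))

def run_when_slot_free(job_id: int, notebook_params: dict) -> dict:
    while active_run_count(job_id) >= MAX_CONCURRENT:
        time.sleep(30)  # poll until a slot frees up
    resp = requests.post(
        f"{HOST}/api/2.0/jobs/run-now",
        headers=HEADERS,
        json={"job_id": job_id, "notebook_params": notebook_params},
    )
    resp.raise_for_status()
    return resp.json()
```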
- Trigger a webhook on job completion
You can write your own code at the end of a notebook to trigger actions on job completion, but there's no way to get programmatically notified of job failure (short of building an email integration to handle the notification email, or constantly polling the job status endpoint).
20 votes
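As a stopgap, the polling approach mentioned above can be wrapped in a small watcher that calls a webhook once the run terminates. A sketch; the webhook URL is hypothetical, and host and token are placeholders:

```python
# Sketch: poll a run until it terminates, then call a webhook -- a stand-in
# for the requested built-in notification. The webhook URL is hypothetical.
import time
import requests

HOST = "https://<workspace-host>"              # placeholder
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder
WEBHOOK_URL = "https://example.com/hooks/databricks-job"  # hypothetical

def notify_on_completion(run_id: int, poll_seconds: int = 30) -> None:
    while True:
        run = requests.get(
            f"{HOST}/api/2.0/jobs/runs/get",
            headers=HEADERS,
            params={"run_id": run_id},
        ).json()
        state = run["state"]
        if state["life_cycle_state"] in ("TERMINATED", "SKIPPED", "INTERNAL_ERROR"):
            requests.post(WEBHOOK_URL, json={
                "run_id": run_id,
                "result_state": state.get("result_state"),
                "state_message": state.get("state_message"),
            })
            return
        time.sleep(poll_seconds)
```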
- API endpoint to list open or running execution contexts
It would be great to be able to list the open execution contexts that count against the per-cluster limit of 150. Aside from capturing the ID when a context is created, there is no other way to determine the ID needed to access a context.
9 votes
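Until a list endpoint exists, the only option is to record each context ID at creation time. A sketch against the 1.2 command execution API; the field names are written from memory, so treat them as assumptions:

```python
# Sketch of a workaround: record execution-context IDs at creation time,
# since the 1.2 API offers no way to list them later.
import requests

HOST = "https://<workspace-host>"              # placeholder
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder

# Client-side registry of contexts we created: cluster_id -> [context ids]
context_registry = {}

def create_context(cluster_id: str, language: str = "python") -> str:
    resp = requests.post(
        f"{HOST}/api/1.2/contexts/create",
        headers=HEADERS,
        json={"clusterId": cluster_id, "language": language},
    )
    resp.raise_for_status()
    context_id = resp.json()["id"]
    context_registry.setdefault(cluster_id, []).append(context_id)
    return context_id
```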
- REST API requests to get a direct link, set permissions, and run a notebook
We are trying to make the Azure Databricks platform our ultimate investigation platform, but its notebook model doesn't work well when we try to assign each investigation case. So we use the REST API to clone a template notebook and assign the clone to each investigator. To make the process smooth, we need APIs to get a direct access link, set permissions, and run a notebook.
2 votes
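The cloning step, at least, is possible today with the Workspace API: export the template and import it at a per-investigator path. A sketch; paths, host, and token are illustrative placeholders:

```python
# Sketch: clone a template notebook for each investigator using the
# Workspace API export/import endpoints. Paths are illustrative.
import requests

HOST = "https://<workspace-host>"              # placeholder
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder
TEMPLATE_PATH = "/Shared/templates/investigation"  # hypothetical template

def clone_notebook(target_path: str) -> None:
    # Export the template notebook as base64-encoded source.
    exported = requests.get(
        f"{HOST}/api/2.0/workspace/export",
        headers=HEADERS,
        params={"path": TEMPLATE_PATH, "format": "SOURCE"},
    ).json()
    # Import the content at the investigator-specific path.
    requests.post(
        f"{HOST}/api/2.0/workspace/import",
        headers=HEADERS,
        json={
            "path": target_path,
            "format": "SOURCE",
            "language": "PYTHON",
            "content": exported["content"],
            "overwrite": False,
        },
    ).raise_for_status()

clone_notebook("/Users/investigator1@example.com/case-0001")
```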
- timeout_seconds
I set timeout_seconds (https://docs.databricks.com/api/latest/jobs.html#runs-submit) via the REST API, but I cannot see the timeout_seconds value on the job run UI page, even though timeout_seconds does work.
3 votes
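For reference, a minimal runs/submit payload with timeout_seconds set, matching the report above that the timeout is enforced even though the UI does not display it; cluster and notebook values are illustrative:

```python
# Sketch: a minimal runs/submit payload with timeout_seconds. The timeout
# is enforced even though the run UI does not display it.
import requests

HOST = "https://<workspace-host>"              # placeholder
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder

payload = {
    "run_name": "timed-run",
    "timeout_seconds": 3600,  # fail the run if it exceeds one hour
    "existing_cluster_id": "<cluster-id>",  # placeholder
    "notebook_task": {"notebook_path": "/Users/someone@example.com/my-notebook"},
}
resp = requests.post(f"{HOST}/api/2.0/jobs/runs/submit", headers=HEADERS, json=payload)
resp.raise_for_status()
print(resp.json()["run_id"])
```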
- REST API to upload a notebook into a user's directory without having to use the UI
This would help immensely in code deployment automation.
1 vote
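The Workspace API's import endpoint (used in the cloning sketch above) covers this case; here is a sketch uploading a local source file, with path, host, and token as placeholders:

```python
# Sketch: upload a local notebook source file into a user's workspace
# directory via the Workspace API import endpoint, skipping the UI.
import base64
import requests

HOST = "https://<workspace-host>"              # placeholder
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder

with open("my_notebook.py", "rb") as f:
    content = base64.b64encode(f.read()).decode("ascii")

requests.post(
    f"{HOST}/api/2.0/workspace/import",
    headers=HEADERS,
    json={
        "path": "/Users/someone@example.com/my_notebook",  # illustrative
        "format": "SOURCE",
        "language": "PYTHON",
        "content": content,
        "overwrite": True,
    },
).raise_for_status()
```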
- Update specific fields in a job configuration
Instead of resetting every field in a job with the REST API 2.0 /jobs/reset request, a new endpoint could be supplied where only the key:value pairs defined in the JSON request body would be applied to the specified job's configuration.
4 votes
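Until a partial-update endpoint exists, a get-merge-reset round trip approximates it: fetch the current settings, merge in only the changed keys, and send the merged result back through /jobs/reset. A sketch with placeholder host and token:

```python
# Sketch of a workaround for partial updates: fetch the current job
# settings, merge in only the changed keys, and send the merged result
# back through /jobs/reset.
import requests

HOST = "https://<workspace-host>"              # placeholder
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder

def update_job_fields(job_id: int, changes: dict) -> None:
    current = requests.get(
        f"{HOST}/api/2.0/jobs/get",
        headers=HEADERS,
        params={"job_id": job_id},
    ).json()["settings"]
    merged = {**current, **changes}  # shallow merge; nested fields replace wholesale
    requests.post(
        f"{HOST}/api/2.0/jobs/reset",
        headers=HEADERS,
        json={"job_id": job_id, "new_settings": merged},
    ).raise_for_status()

# Example: change only the timeout, leaving everything else as-is.
update_job_fields(123, {"timeout_seconds": 7200})
```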
- Run jobs with a specific cluster_id
In the REST API 2.0, a "cluster_id" field could be given so that a job could be attached to an existing cluster before running.
3 votes
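For one-off runs, runs/submit already accepts an existing_cluster_id field, which comes close to the requested behavior. A short sketch with placeholder values:

```python
# Sketch: attach a one-off run to an existing cluster via runs/submit and
# its existing_cluster_id field, approximating the requested behavior.
import requests

HOST = "https://<workspace-host>"              # placeholder
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder

resp = requests.post(
    f"{HOST}/api/2.0/jobs/runs/submit",
    headers=HEADERS,
    json={
        "run_name": "run-on-existing-cluster",
        "existing_cluster_id": "<cluster-id>",  # placeholder
        "notebook_task": {"notebook_path": "/Users/someone@example.com/etl"},
    },
)
resp.raise_for_status()
```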