Product Feedback

  1. Allow the JSON representation to exceed 10,000 bytes

    In an HTTP POST request, the JSON representation cannot exceed 10,000 bytes. If I have a large request body, how can I work around this limit?

    1 vote  ·  0 comments  ·  REST API
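
    One possible workaround, sketched below under the assumption that the 10,000-byte cap applies to inline job parameters: stage the large payload in DBFS using the dbfs/create, dbfs/add-block, and dbfs/close endpoints, then pass only its path to the job. HOST, the token, and the job ID are placeholders.

    ```python
    # Sketch only: stage a large payload in DBFS, then pass a small reference
    # to the job instead of inlining the data in notebook_params.
    import base64
    import json
    import requests

    HOST = "https://<your-workspace>.cloud.databricks.com"   # placeholder
    HEADERS = {"Authorization": "Bearer <TOKEN>"}             # placeholder

    payload = json.dumps({"rows": list(range(100_000))}).encode()

    # dbfs/create opens a streaming handle; add-block appends base64 chunks
    # (at most 1 MB each); close finalizes the file.
    handle = requests.post(f"{HOST}/api/2.0/dbfs/create", headers=HEADERS,
                           json={"path": "/tmp/big_payload.json",
                                 "overwrite": True}).json()["handle"]
    for i in range(0, len(payload), 1024 * 1024):
        chunk = base64.b64encode(payload[i:i + 1024 * 1024]).decode()
        requests.post(f"{HOST}/api/2.0/dbfs/add-block", headers=HEADERS,
                      json={"handle": handle, "data": chunk})
    requests.post(f"{HOST}/api/2.0/dbfs/close", headers=HEADERS,
                  json={"handle": handle})

    # The job receives only a path, which stays well under the size cap.
    requests.post(f"{HOST}/api/2.0/jobs/run-now", headers=HEADERS,
                  json={"job_id": 123,   # placeholder job ID
                        "notebook_params": {"payload_path": "/tmp/big_payload.json"}})
    ```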
  2. Use AAD authentication in the REST API and Databricks CLI instead of user tokens

    Use Azure Active Directory (AAD) authentication in the REST API (and Databricks CLI) instead of user tokens

    33 votes  ·  5 comments  ·  REST API
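
    A rough sketch of what AAD-based authentication could look like, assuming it follows the usual Azure pattern: acquire an AAD token for the Azure Databricks resource via a client-credentials grant and send it as the Bearer token. The tenant, application, and workspace values are placeholders, and a real deployment may need additional setup.

    ```python
    # Sketch only: client-credentials flow against AAD, then call the
    # Databricks REST API with the resulting token as a Bearer token.
    import requests

    TENANT_ID = "<tenant-id>"   # placeholder
    token = requests.post(
        f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/token",
        data={
            "grant_type": "client_credentials",
            "client_id": "<application-id>",           # placeholder
            "client_secret": "<application-secret>",   # placeholder
            # Azure Databricks resource ID
            "resource": "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d",
        },
    ).json()["access_token"]

    resp = requests.get(
        "https://<region>.azuredatabricks.net/api/2.0/clusters/list",  # placeholder
        headers={"Authorization": f"Bearer {token}"},
    )
    print(resp.status_code)
    ```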
  3. Upload library for all clusters through API

    If a library has been set to be installed on all clusters, is_library_for_all_clusters will be true. Right now, this can only be done through the home page. It would be great if this could be done through the API or CLI.

    3 votes  ·  0 comments  ·  REST API
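
    Until such a flag exists in the API, a workaround is to enumerate clusters with clusters/list and call libraries/install on each one, as sketched below; note this only covers clusters that exist at the time, so new clusters would not pick the library up. HOST and the token are placeholders.

    ```python
    # Sketch only: install one library on every currently registered cluster,
    # approximating an "all clusters" install via the API.
    import requests

    HOST = "https://<your-workspace>.cloud.databricks.com"   # placeholder
    HEADERS = {"Authorization": "Bearer <TOKEN>"}             # placeholder
    LIBRARY = {"pypi": {"package": "simplejson"}}             # example library

    clusters = requests.get(f"{HOST}/api/2.0/clusters/list",
                            headers=HEADERS).json().get("clusters", [])
    for cluster in clusters:
        requests.post(f"{HOST}/api/2.0/libraries/install", headers=HEADERS,
                      json={"cluster_id": cluster["cluster_id"],
                            "libraries": [LIBRARY]})
    ```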
  4. Implement REST API Documentation Service like Swagger

    Implement a REST API Documentation Service like Swagger.

    This would make the Databricks APIs easier to use.

    Sometimes the available API documentation and examples are not granular enough. Some API features are not covered by the examples and can be difficult to implement, because the documentation isn't as detailed as it could be.

    23 votes  ·  2 comments  ·  REST API
  5. Queue jobs when the maximum number of concurrent runs is reached

    I run my jobs via the API, with parameters. The limit on concurrent jobs is fine, but when it is hit, jobs should not be skipped; they should be queued and executed once the number of running jobs allows it again.

    22 votes  ·  1 comment  ·  REST API
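
    A client-side approximation of this queuing, assuming runs are still skipped at the limit: poll jobs/runs/list for active runs and hold the run-now call until a slot frees up. HOST, the token, the job ID, and the limit are placeholders, and the check is racy if several submitters run at once.

    ```python
    # Sketch only: hold the trigger until active runs drop below the job's
    # max_concurrent_runs, approximating server-side queuing on the client.
    import time
    import requests

    HOST = "https://<your-workspace>.cloud.databricks.com"   # placeholder
    HEADERS = {"Authorization": "Bearer <TOKEN>"}             # placeholder
    JOB_ID, LIMIT = 123, 5                                    # placeholders

    def active_run_count(job_id):
        resp = requests.get(f"{HOST}/api/2.0/jobs/runs/list", headers=HEADERS,
                            params={"job_id": job_id, "active_only": "true"})
        return len(resp.json().get("runs", []))

    while active_run_count(JOB_ID) >= LIMIT:
        time.sleep(30)   # back off until a slot frees up

    requests.post(f"{HOST}/api/2.0/jobs/run-now", headers=HEADERS,
                  json={"job_id": JOB_ID,
                        "notebook_params": {"run_date": "2019-01-01"}})
    ```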
  6. Trigger a webhook on job completion

    You can write your own code at the end of a notebook to trigger actions on job completion, but there is no way to be programmatically notified of a job failure (short of building an email integration to handle the notification email, or constantly polling the job status endpoint).

    17 votes  ·  0 comments  ·  REST API
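
    The polling workaround the idea mentions could look roughly like this: poll jobs/runs/get until the run reaches a terminal state, then call a webhook of your own. The webhook URL, HOST, token, and run ID are placeholders.

    ```python
    # Sketch only: poll runs/get until the run terminates, then notify a
    # webhook of your own. This is the workaround the idea wants to replace.
    import time
    import requests

    HOST = "https://<your-workspace>.cloud.databricks.com"   # placeholder
    HEADERS = {"Authorization": "Bearer <TOKEN>"}             # placeholder

    def wait_and_notify(run_id, webhook_url):
        while True:
            run = requests.get(f"{HOST}/api/2.0/jobs/runs/get", headers=HEADERS,
                               params={"run_id": run_id}).json()
            state = run["state"]
            if state["life_cycle_state"] in ("TERMINATED", "SKIPPED",
                                             "INTERNAL_ERROR"):
                # result_state is e.g. SUCCESS or FAILED once the run ends
                requests.post(webhook_url, json={
                    "run_id": run_id,
                    "result_state": state.get("result_state", "UNKNOWN"),
                })
                return
            time.sleep(60)

    wait_and_notify(456, "https://example.com/hooks/databricks")  # placeholders
    ```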
  7. Add an API endpoint to list open or running execution contexts

    It would be great to be able to list the open execution contexts that count against the per-cluster limit of 150. Apart from capturing the ID when a context is created, there is no way to determine the ID needed to access a given context.

    9 votes  ·  0 comments  ·  REST API
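
    In the absence of a list endpoint, one workaround is to keep a client-side registry of context IDs as they are created and destroyed through the 1.2 contexts API, as sketched below with a placeholder HOST and token.

    ```python
    # Sketch only: track context IDs client-side, since the 1.2 API offers
    # create/status/destroy but no list endpoint.
    import requests

    HOST = "https://<your-workspace>.cloud.databricks.com"   # placeholder
    HEADERS = {"Authorization": "Bearer <TOKEN>"}             # placeholder
    open_contexts = {}   # cluster_id -> list of context IDs we created

    def create_context(cluster_id, language="python"):
        resp = requests.post(f"{HOST}/api/1.2/contexts/create", headers=HEADERS,
                             json={"clusterId": cluster_id, "language": language})
        ctx_id = resp.json()["id"]
        open_contexts.setdefault(cluster_id, []).append(ctx_id)
        return ctx_id

    def destroy_context(cluster_id, ctx_id):
        requests.post(f"{HOST}/api/1.2/contexts/destroy", headers=HEADERS,
                      json={"clusterId": cluster_id, "contextId": ctx_id})
        open_contexts[cluster_id].remove(ctx_id)
    ```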
  8. REST API endpoints to get a direct link, set permissions, and run a notebook

    We are trying to use the Azure Databricks platform as our primary investigation platform, but its notebook model doesn't work well when we assign a notebook to each investigation case. So we use the REST API to clone a template notebook and assign it to each investigator. To make the process smooth, we need APIs to get a direct access link, set permissions, and run the notebook.

    2 votes  ·  0 comments  ·  REST API
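
    The cloning step is already possible with the Workspace API's export and import endpoints, sketched below with placeholder paths; the direct-link and permission pieces are the missing APIs this idea asks for.

    ```python
    # Sketch only: clone a template notebook by exporting its base64 source
    # and importing it under a new path. Paths are placeholders.
    import requests

    HOST = "https://<your-workspace>.cloud.databricks.com"   # placeholder
    HEADERS = {"Authorization": "Bearer <TOKEN>"}             # placeholder

    template = requests.get(f"{HOST}/api/2.0/workspace/export", headers=HEADERS,
                            params={"path": "/Templates/investigation",
                                    "format": "SOURCE"}).json()["content"]

    requests.post(f"{HOST}/api/2.0/workspace/import", headers=HEADERS,
                  json={"path": "/Cases/case-0042",
                        "format": "SOURCE",
                        "language": "PYTHON",
                        "content": template,
                        "overwrite": False})
    ```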
  9. Show timeout_seconds in the job run UI

    I set timeout_seconds (https://docs.databricks.com/api/latest/jobs.html#runs-submit) via the REST API, but I cannot see timeout_seconds on the job run UI page, even though it does work.

    3 votes  ·  0 comments  ·  REST API
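
    For reference, a minimal runs/submit payload with timeout_seconds set; the complaint is that the value is enforced but not displayed in the run UI. All identifiers are placeholders.

    ```python
    # For reference: runs/submit accepts timeout_seconds; the run is stopped
    # after the timeout, but the value is not shown in the run UI.
    import requests

    HOST = "https://<your-workspace>.cloud.databricks.com"   # placeholder
    HEADERS = {"Authorization": "Bearer <TOKEN>"}             # placeholder

    requests.post(f"{HOST}/api/2.0/jobs/runs/submit", headers=HEADERS, json={
        "run_name": "nightly-model-training",                 # placeholder
        "timeout_seconds": 3600,
        "existing_cluster_id": "<cluster-id>",                # placeholder
        "notebook_task": {"notebook_path": "/Jobs/train"},    # placeholder
    })
    ```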
  10. Update specific fields in a job configuration

    Instead of resetting every field in a job via the REST API 2.0 /jobs/reset request, a new endpoint could be supplied where only the key:value pairs defined in the JSON request body would be used to change the specified job configuration.

    4 votes  ·  0 comments  ·  REST API
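
    Until a partial-update endpoint exists, the usual workaround is read-merge-write: fetch the full settings with jobs/get, change the desired keys, and post everything back through jobs/reset, as sketched below with a placeholder job ID.

    ```python
    # Sketch only: read-merge-write with the existing endpoints, changing a
    # single field without dropping the rest of the configuration.
    import requests

    HOST = "https://<your-workspace>.cloud.databricks.com"   # placeholder
    HEADERS = {"Authorization": "Bearer <TOKEN>"}             # placeholder
    JOB_ID = 123                                              # placeholder

    settings = requests.get(f"{HOST}/api/2.0/jobs/get", headers=HEADERS,
                            params={"job_id": JOB_ID}).json()["settings"]
    settings["max_concurrent_runs"] = 3   # the one field we want to change

    requests.post(f"{HOST}/api/2.0/jobs/reset", headers=HEADERS,
                  json={"job_id": JOB_ID, "new_settings": settings})
    ```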
  11. Run Jobs with a specific cluster_id

    In the REST API 2.0, a "cluster_id" field could be given so that a job could be attached to an existing cluster before running.

    3 votes  ·  0 comments  ·  REST API
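
    For comparison, the 2.0 Jobs API already accepts an existing_cluster_id when creating a job, which attaches it to a running cluster, as sketched below with placeholder IDs; this idea asks for the same control when triggering individual runs.

    ```python
    # For comparison: attaching a job to an existing cluster at creation
    # time with existing_cluster_id. IDs and paths are placeholders.
    import requests

    HOST = "https://<your-workspace>.cloud.databricks.com"   # placeholder
    HEADERS = {"Authorization": "Bearer <TOKEN>"}             # placeholder

    requests.post(f"{HOST}/api/2.0/jobs/create", headers=HEADERS, json={
        "name": "report-on-shared-cluster",
        "existing_cluster_id": "0123-456789-abcde123",
        "notebook_task": {"notebook_path": "/Jobs/report"},
    })
    ```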