Product Feedback

  1. Automatically load externally pushed Github changes

    Consider this scenario:

    * I create a notebook and sync it with Github.
    * I create a scheduled task to run that notebook.
    * Outside of the Databricks UI, I edit the notebook or merge changes from another Git branch, then push the changes to Github. (Both of these are expected scenarios in my company.)

    Currently I have to open the notebook in the Databricks UI for it to sync the external changes. If I don't, my scheduled task will continue running the old version of the notebook.

    I am requesting that Databricks automatically sync external changes to the Github…

    20 votes
    1 comment
    started  ·  Rakesh responded

    We will have a new Workspace API that allows you to import and export notebooks into the Databricks Workspace, and therefore lets you sync externally with your version control system.
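
    The response above suggests external syncing could be scripted against the Workspace API's import endpoint (`POST /api/2.0/workspace/import`). As a minimal sketch of how a Git post-push hook might push edited notebook source back into the workspace, the helper below only builds the request payload; the host, token, and workspace path are placeholders, and actually sending the request (e.g. with an HTTP client plus a bearer token) is left out:

    ```python
    import base64

    def build_import_payload(workspace_path, source_bytes, language="PYTHON", overwrite=True):
        """Construct the JSON body for a Workspace API import call
        (POST /api/2.0/workspace/import). A CI webhook could use this
        after a push to GitHub to overwrite the stale notebook copy."""
        return {
            "path": workspace_path,          # target path inside the workspace
            "format": "SOURCE",              # import raw notebook source
            "language": language,
            "overwrite": overwrite,          # replace the existing notebook
            # Notebook content is sent base64-encoded.
            "content": base64.b64encode(source_bytes).decode("ascii"),
        }

    # Hypothetical usage: the path and source are illustrative only.
    payload = build_import_payload("/Users/me/my_notebook", b"print('hello')")
    ```

    A scheduled job picking up the notebook afterwards would then run the freshly imported version rather than the stale one.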

  2. Cluster names could be long-lived

    Cluster names should persist even across stops and starts (for example, when we shut down clusters overnight or on weekends). It would be nice to always have a cluster name you could depend on. This might tie back to the other comments about applying ACLs to a cluster.

    6 votes
    3 comments  ·  Cluster management
