TensorFlow 1.13.1 contains major bug fixes compared to 1.12.0. Here are some highlights:
* TensorFlow GPU binaries are now built against CUDA 10 and TensorRT 5.0. (This is important for leveraging the full capabilities of V100 GPUs.)
* Fixes a potential security vulnerability where a carefully crafted GIF image could trigger a null pointer dereference during decoding.
* Improves performance of GPU cumsum/cumprod by up to 300x.
There are plenty more, but these three alone are enough to warrant an upgrade, not to mention the compatibility ops and renamings that make models less prone to breaking in TF 2.0.
3 votes
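For readers unfamiliar with the operations mentioned above, cumsum and cumprod compute running sums and running products (in TensorFlow, `tf.cumsum` and `tf.cumprod` apply them along a tensor axis). A minimal plain-Python illustration of the semantics:

```python
from itertools import accumulate
import operator

xs = [1, 2, 3, 4]

# Running sum: each element is the sum of all elements up to that position.
print(list(accumulate(xs)))                # [1, 3, 6, 10]

# Running product: same idea with multiplication (what cumprod computes).
print(list(accumulate(xs, operator.mul)))  # [1, 2, 6, 24]
```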
Some of the common Python packages (e.g., scipy, numpy, and scikit-learn) that came with the Databricks default machine image were out of date. It would be great if Databricks could keep these commonly used Python packages up to date in the default machine image.
11 votes
We have now substantially upgraded the software on the images. In fact, we regularly update various software installed on the clusters. The release notes cover the exact versions of all the installed software. Here is an example of this:
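If you want to verify what a given cluster actually ships, you can also query the installed package versions directly from a notebook. A small sketch using only the standard library (assumes Python 3.8+; the package names are just the ones mentioned above):

```python
import importlib.metadata as md

# Print the installed version of each package, or note its absence.
for pkg in ("numpy", "scipy", "scikit-learn"):
    try:
        print(pkg, md.version(pkg))
    except md.PackageNotFoundError:
        print(pkg, "not installed")
```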
I unsuccessfully tried many times to add these libraries for awesome visualizations and wasn't able to make them work.
For the sake of great visualizations, these two libraries should be included by default, or at least users should be able to use them successfully in notebooks.
26 votes
Databricks supports the Bokeh and Plotly libraries, and you can install them in your workspace to develop custom visualizations.
Documentation and example notebooks are listed below.
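As a concrete example, on recent Databricks runtimes a notebook-scoped install can be done with the `%pip` magic (on older runtimes, attach the libraries to the cluster through the Workspace/Libraries UI instead):

```
%pip install bokeh plotly
```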
It would be good to have something like OpenBLAS compiled and installed on all the Spark nodes by default: https://github.com/xianyi/OpenBLAS
This could make a number of ML workloads run much faster.
9 votes
This has now been completed.
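One way to confirm which BLAS/LAPACK implementation the cluster's NumPy is linked against (and hence whether an optimized BLAS such as OpenBLAS is in use) is:

```python
import numpy as np

# Prints the BLAS/LAPACK libraries NumPy was built against,
# e.g. openblas or mkl.
np.show_config()
```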
Be able to install custom libraries (in addition to just Java and Python libraries)
6 votes
We now support running a script on every node of a cluster when it’s launched. That script can run arbitrary shell commands, e.g. downloading and installing arbitrary libraries. The script has to be located in DBFS (the Databricks File System). Please contact a field engineer at Databricks for help setting this up.
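As an illustrative sketch only (the package names and the pip path below are assumptions for this example, not something prescribed by Databricks), such an init script might look like:

```bash
#!/bin/bash
# Hypothetical cluster init script: runs on every node at launch.
set -euo pipefail

# Install a system library (e.g. OpenBLAS) via the OS package manager.
apt-get update -y
apt-get install -y libopenblas-dev

# Install Python libraries into the cluster's Python environment.
# (The interpreter path is an assumption; adjust for your runtime.)
/databricks/python/bin/pip install bokeh plotly
```

The script itself would then be uploaded to a DBFS location and configured as the cluster's init script.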
Provide the ability for users to organize their uploaded libraries inside folders. Putting everything in the Workspace root folder is very confusing.
1 vote