
Big Data Tools Update 11 Is Out

JetBrains corporate blog · Java · Scala · Big Data

EAP 11 of the Big Data Tools plugin for IntelliJ IDEA Ultimate, PyCharm, and DataGrip is available starting today. You can install it from the JetBrains Plugin Repository or inside your IDE.

Big Data Tools is a new JetBrains plugin that allows you to connect to Hadoop and Spark clusters and monitor nodes, applications, and jobs. It also brings support for editing and running Zeppelin notebooks inside IntelliJ IDEA and DataGrip, so you can create, edit, and run Zeppelin notebooks without ever having to leave your favorite IDE. The plugin offers smart navigation, code completion, inspections, quick-fixes, and refactoring inside notebooks.

New run/debug configuration for spark-submit

One of the most significant improvements in this release is the ability to submit Spark applications from your IDE without using a console. This feature is available in all supported IDEs. The spark-submit script in the Spark bin directory is used to launch applications on a cluster. It can use all of Spark's supported cluster managers through a single interface, so you don't have to configure your application specifically for each one.

The major drawback of using the spark-submit script directly is the number of manual steps it involves. You have to build all the artifacts, copy them to the destination server over SSH, and run spark-submit with the right set of options or, in the worst case, write a separate bash script to do it for you. This new feature eliminates these tedious tasks: all you need to do is create a new run configuration and fill in the parameters, and that's it.
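For comparison, the manual workflow described above might look something like the following sketch. The project layout, host name, and class name are hypothetical placeholders, not values taken from the plugin:

```shell
# Hypothetical manual spark-submit workflow (placeholder names throughout).

# 1. Build the application JAR locally.
./gradlew jar

# 2. Copy the artifact to an edge node of the cluster over SSH.
scp build/libs/my-spark-app.jar user@edge-node:/tmp/

# 3. Launch the application on the cluster with spark-submit.
ssh user@edge-node \
  spark-submit \
    --class com.example.MyApp \
    --master yarn \
    --deploy-mode cluster \
    /tmp/my-spark-app.jar
```

With the new run configuration, these three steps are replaced by a single Run action in the IDE.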

Apache Zeppelin support for DataGrip

You can now also use the Zeppelin integration inside DataGrip. The Big Data Tools plugin provides advanced data visualization and context-sensitive code completion inside Zeppelin notebooks to help you write SQL code faster. Please note that Zeppelin support in DataGrip is limited to SQL; for example, Matplotlib or Scala code inspections will not work.

Change log

This update also brings the following fixes and improvements:


  • Spark-submit support for IntelliJ IDEA, DataGrip, and PyCharm (BDIDE-843).
  • Apache Zeppelin support for DataGrip (BDIDE-1045).
  • Titles of Zeppelin paragraphs can now be edited (BDIDE-906).
  • Advanced search filtering in the Spark Monitoring tool (BDIDE-1159).
  • Zeppelin text output containing tables is automatically converted to the table view (BDIDE-1172).

Usability Problems

  • Zeppelin: The export feature (for charts, tables, and HTML content in Zeppelin note outputs) now saves and restores the last used path (BDIDE-1132).

Bug fixes

  • Zeppelin: Fixed – IntelliJ IDEA 2020.1 crashes when displaying Zeppelin note outputs with embedded images (BDIDE-1184).
  • Zeppelin: Fixed – It is not possible to connect to the server if the /api/version method requires authentication on the server (BDIDE-1199).
  • Zeppelin: Fixed – The Stop Paragraph Action on a pending paragraph stops all paragraphs in the notebook (BDIDE-1171).
  • Zeppelin: Fixed – Paragraph numbering in the Structure tool window doesn't match the numbering inside Zeppelin notes (BDIDE-1135).
  • Zeppelin: Fixed – IDEA tries to download sources for dependencies every time it starts (BDIDE-1208).
  • Zeppelin: Fixed – The "Open in browser" button is now disabled for local notes (BDIDE-1142).
  • Zeppelin: Fixed – The inlay is no longer empty after restarting the interpreter (BDIDE-1129).
  • Spark Monitoring: Fixed – Storage JSON parsing to prevent failures when loading Storages (BDIDE-1162).
  • Remote FS: Fixed – Copying folders inside an Azure container no longer clears the files in the folder (BDIDE-1141).
  • Remote FS: Fixed – Files with an unknown content type are now opened in text mode (BDIDE-1192).
  • Remote FS: Fixed – Exception when opening files without extensions (BDIDE-1202).
  • General: Fixed – Problems with copying and pasting files between projects (BDIDE-1195).
  • HTTP Proxy: Fixed – An error that appeared when connecting to Spark and Hadoop with SOCKS proxy authorization enabled (BDIDE-1209).
  • HTTP Proxy: Fixed – An error involving invalid proxy authorization parameters (BDIDE-1215).
  • HTTP Proxy: Fixed – Custom proxy settings no longer replace global proxy settings (BDIDE-1216).

The full release notes are available here.

Documentation and Social Networks

And last but not least, if you need help learning to use any of the plugin’s features, make sure to check out the documentation. And if you still have questions, please don’t hesitate to leave us a message either here in the comments or on Twitter.

The Big Data Tools team
