When you have Java code for Spark, what you can do is export that code as an executable jar. After that, go to the Databricks cluster, open the Libraries tab, and upload the jar file as a library there; the jar contains your whole code. Now you can have a Scala notebook in which you import the Java classes from that library and use them.

Figure 1: Upload the Java jar file as a library

Approach 2: Using Databricks Connect

In this approach there is no need to export the Java code as a jar. Use the Java classes directly and execute the Scala notebook.
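As a minimal sketch of the jar-based approach, here is the kind of plain Java helper class you might package into the jar and then call from a Scala notebook after attaching the jar as a cluster library. The class name `TextHelper` and its method are hypothetical stand-ins, not from the original article:

```java
// Hypothetical helper class you might export as a jar and attach
// to a Databricks cluster as a library.
public class TextHelper {

    // Stand-in for real business logic (e.g. logic you would call
    // from a Spark UDF): trims and upper-cases a string.
    public static String normalize(String s) {
        return s == null ? "" : s.trim().toUpperCase();
    }

    public static void main(String[] args) {
        // Quick local check before packaging the jar.
        System.out.println(normalize("  hello "));
    }
}
```

Once the jar is attached to the cluster, a Scala notebook cell can simply call `TextHelper.normalize("  hello ")` (importing the package if the class lives in one), since the library's classes are on the cluster's classpath.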