"how to run scala code in spark container using docker?" Code Answer


Based on the error message that you have:

:: org.mongodb.spark#mongo-spark-connector_2.11;2.2.0: not found

it indicates that the package is missing. Checking the currently available MongoDB Connector for Spark packages confirms that version 2.2.0 is no longer available (it was replaced with the patched v2.2.6).
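For instance, bumping the version in the package coordinate should resolve the error. A minimal sketch, assuming the patched 2.2.6 artifact is fetched from the usual Maven repository:

    spark-shell --packages org.mongodb.spark:mongo-spark-connector_2.11:2.2.6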

You can find an updated example of the MongoDB Spark connector with Docker at sindbach/mongodb-spark-docker.
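If your Spark installation runs inside a Docker container, one way to reach the shell is docker exec. A sketch, assuming a running container named spark (the container name and the MongoDB URI are illustrative, not taken from the linked repo):

    # Open an interactive spark-shell inside the running container
    docker exec -it spark spark-shell \
      --packages org.mongodb.spark:mongo-spark-connector_2.11:2.2.6 \
      --conf "spark.mongodb.input.uri=mongodb://mongo:27017/test.coll"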

Additional information: spark-shell is a REPL (read-evaluate-print loop) tool, an interactive shell that programmers use to interact with the framework. You don't need to run an explicit build step before execution. When you pass the --packages argument to spark-shell, it automatically fetches the package and includes it in your shell's environment, so you can use the connector's classes right away, as in the sketch below.
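Once the shell is up with the connector on the classpath, Scala code can be typed directly at the prompt. A minimal sketch, assuming spark.mongodb.input.uri was set as above (the collection and its contents are illustrative):

    import com.mongodb.spark.MongoSpark

    // Load a DataFrame from the collection named by spark.mongodb.input.uri;
    // "spark" is the SparkSession that spark-shell creates for you.
    val df = MongoSpark.load(spark)
    df.printSchema()
    df.show(5)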

By Michael J. Barber on May 12 2022
