sparkr - JVM is not ready after 10 seconds


I configured SparkR following the tutorials, and it was working. I am able to read a database with read.df, but nothing else works, and the following error appears:

Error in sparkR.init(master = "local") : JVM is not ready after 10 seconds

Why did this suddenly start appearing? I've read about other users with the same problem, but the solutions given did not work. Below is my code:

    Sys.setenv(SPARK_HOME = "C:/spark")
    Sys.setenv(HADOOP_HOME = "C:/hadoop")
    .libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
    library(SparkR)

    # Initialize the SparkR environment
    Sys.setenv("SPARKR_SUBMIT_ARGS" = '"--packages" "com.databricks:spark-csv_2.11:1.2.0" "sparkr-shell"')
    Sys.setenv(SPARK_MEM = "4g")

    # Create the Spark context and the SQL context
    sc <- sparkR.init(master = "local")
    sqlContext <- sparkRSQL.init(sc)

Try the few things below:

  1. Check if C:/Windows/System32/ is there in the PATH.

  2. Check if spark-submit.cmd has proper execute permissions.

  3. If both of the above are true and you still get the same error, delete the Spark directory and create a fresh one again by unzipping the Spark gzip file.
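Steps 1 and 2 can also be scripted instead of checked by hand. A minimal POSIX-shell sketch of both checks; the /usr/bin stand-in and the C:/spark default are assumptions for illustration (on Windows the directory to verify is C:/Windows/System32, and the launcher is %SPARK_HOME%\bin\spark-submit.cmd):

```shell
# Step 1: check whether a given directory appears in PATH.
# DIR is a stand-in here; on Windows it would be C:/Windows/System32.
DIR=/usr/bin
case ":$PATH:" in
  *":$DIR:"*) echo "$DIR is on PATH" ;;
  *)          echo "$DIR is NOT on PATH" ;;
esac

# Step 2: check that the Spark launcher exists and is executable.
# The C:/spark default mirrors the SPARK_HOME used in the question.
SPARK_HOME=${SPARK_HOME:-C:/spark}
if [ -x "$SPARK_HOME/bin/spark-submit.cmd" ]; then
  echo "spark-submit.cmd is executable"
else
  echo "spark-submit.cmd is missing or not executable"
fi
```

If either check fails, fix it before retrying sparkR.init, since the init call simply times out when the launcher never starts the JVM.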

