scala - How to load data into a Hive external table using Spark?


I want to load data into a Hive external table using Spark. Please help me with this: how can I load data into Hive using Scala or Java code?

Thanks in advance.

Assuming the Hive external table was created with something like:

create external table external_parquet(c1 int, c2 string, c3 timestamp)
  stored as parquet
  location '/user/etl/destination';   -- the location is a directory on HDFS
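If you prefer to keep everything in Spark, the same DDL can also be issued from code. This is only a minimal sketch, assuming sqlContext is a HiveContext (as it is in a Hive-enabled spark-shell on Spark 1.x) so the statement reaches the Hive metastore:

// assumes sqlContext is a HiveContext backed by the Hive metastore
sqlContext.sql(
  """create external table if not exists external_parquet(c1 int, c2 string, c3 timestamp)
    |stored as parquet
    |location '/user/etl/destination'""".stripMargin)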

and you have an existing DataFrame / RDD in Spark that you want to write:

import java.sql.Timestamp
import org.apache.spark.sql.SaveMode
import sqlContext.implicits._

val rdd = sc.parallelize(List(
  (1, "a", new Timestamp(System.currentTimeMillis())),
  (2, "b", new Timestamp(System.currentTimeMillis())),
  (3, "c", new Timestamp(System.currentTimeMillis()))))
val df = rdd.toDF("c1", "c2", "c3")   // column names of the DataFrame

// overwrite the existing dataset (full reimport from the source)
df.write.mode(SaveMode.Overwrite).parquet("/user/etl/destination")
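A side note, not part of the original answer: the table above is unpartitioned, so Hive sees the new files immediately. If the external table were partitioned, Hive would not automatically discover newly written partition directories, and something like the following would be needed after the write (again assuming sqlContext is a HiveContext):

// only relevant for partitioned external tables: ask Hive to discover new partition directories
sqlContext.sql("MSCK REPAIR TABLE external_parquet")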

If you don't want to overwrite the existing data in the dataset...

df.write.mode(SaveMode.Append).parquet("/user/etl/destination")  // append to the existing dataset (incremental imports)
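To sanity-check the result, you can query the external table back through Spark SQL. A small sketch, assuming sqlContext is a HiveContext so the Hive table is visible:

// read back via the Hive metastore to verify the write landed in the table's location
val result = sqlContext.sql("select * from external_parquet")
result.show()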
