scala - Generic map method for Spark RDD doesn't compile -


I can't figure out why this doesn't compile:

implicit class DebugHelper[T](ar: RDD[T]) {
  def debug_restrainer(implicit sc: SparkContext): RDD[T] = {
    if (debug_size.isDefined) sc.parallelize(ar.take(debug_size.get)) else ar
  }
}

It gives the error: "No ClassTag available for T".

Does anyone know what it's complaining about?

If the compiler asks for a ClassTag, you need to provide one. Also, the SparkContext can be retrieved from the existing RDD, so you don't need to take it as a parameter.
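For context, the missing evidence comes from SparkContext.parallelize itself, which requires a ClassTag for the element type; its signature in the Spark API is roughly:

def parallelize[T: ClassTag](seq: Seq[T], numSlices: Int = defaultParallelism): RDD[T]

Since your method is generic in T, that implicit ClassTag[T] has to be threaded through from the call site, which is what the fix below does.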

import scala.reflect.ClassTag

implicit class DebugHelper[T](ar: RDD[T])(implicit val t: ClassTag[T]) {
  def debug_restrainer: RDD[T] = {
    if (debug_size.isDefined)
      ar.sparkContext.parallelize(ar.take(debug_size.get))
    else ar
  }
}
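As a quick sanity check, here is a minimal self-contained sketch of how the extension would be used. It assumes debug_size is an Option[Int] defined in scope (the question defines it elsewhere); the object name, SparkConf setup, and local master are illustrative:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD
import scala.reflect.ClassTag

object DebugRestrainerDemo {
  // Assumption: the question defines debug_size elsewhere; Some(n) caps an RDD at n rows.
  val debug_size: Option[Int] = Some(10)

  implicit class DebugHelper[T](ar: RDD[T])(implicit val t: ClassTag[T]) {
    def debug_restrainer: RDD[T] =
      if (debug_size.isDefined)
        // take() pulls the first n elements to the driver; parallelize redistributes them,
        // using the ClassTag[T] already in scope from the implicit parameter above.
        ar.sparkContext.parallelize(ar.take(debug_size.get))
      else ar
  }

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("demo").setMaster("local[*]"))
    val capped = sc.parallelize(1 to 1000).debug_restrainer
    println(capped.count()) // prints 10, since debug_size = Some(10)
    sc.stop()
  }
}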
