Creating User Defined Function in Spark-SQL

Answer #1

You can do this, at least for filtering, if you're willing to use a language-integrated query.

For a data file dates.txt containing, say (the contents here are illustrative):

one,2014-06-01
two,2014-07-01
three,2014-08-01
four,2014-08-15
five,2014-09-15
You can pack as much Scala date magic into your UDF as you want, but I'll keep it simple:

def myDateFilter(date: String) = date contains "-08-"
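If you do want some of that date magic, a slightly more robust predicate (a sketch using java.time from the JDK; the ISO yyyy-MM-dd format is an assumption about the data, and the isAugust name is mine) could parse the string instead of matching a substring:

```scala
import java.time.LocalDate
import java.time.format.DateTimeParseException

// Parse the string as an ISO date (yyyy-MM-dd) and test the month directly;
// unparseable input yields false instead of throwing.
def isAugust(date: String): Boolean =
  try LocalDate.parse(date.trim).getMonthValue == 8
  catch { case _: DateTimeParseException => false }
```

This also quietly skips malformed rows, which the substring check would let slip through or mis-match.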

Set it all up as follows -- a lot of this is from the Spark SQL programming guide.

val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext._

// case class for your records
case class Entry(name: String, when: String)

// read and parse the data
val entries = sc.textFile("dates.txt").map(_.split(",")).map(e => Entry(e(0),e(1)))
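The per-line parsing inside that map() can be checked in isolation, without a SparkContext (a sketch assuming the simple two-field, comma-separated format above, with no quoting or embedded commas; parseLine is a name I've introduced):

```scala
case class Entry(name: String, when: String)

// Split one CSV line into an Entry, the same work the map() above does per record
def parseLine(line: String): Entry = {
  val fields = line.split(",")
  Entry(fields(0), fields(1))
}
```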

You can use the UDF as part of your WHERE clause:

val augustEntries = entries.where('when)(myDateFilter).select('name, 'when)

and see the results:

augustEntries.map(r => r(0)).collect().foreach(println)

Notice the version of the where method I've used, declared as follows in the doc:

def where[T1](arg1: Symbol)(udf: (T1) => Boolean): SchemaRDD

So, the UDF can only take one argument, but you can compose several .where() calls to filter on multiple columns.
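For example, filtering on two columns composes like this (a sketch against the entries SchemaRDD above; nameFilter is a hypothetical second predicate, not from the original answer, and this won't run outside a Spark 1.x session):

```scala
// Hypothetical second single-argument predicate, this one on the name column
def nameFilter(name: String) = name.startsWith("f")

// Each where() binds one column Symbol to one single-argument UDF;
// chaining them ANDs the filters together
val augustFs = entries
  .where('when)(myDateFilter)
  .where('name)(nameFilter)
  .select('name, 'when)
```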

Edit for Spark 1.2.0 (and really 1.1.0 too)

While it's not really documented, Spark now supports registering a UDF so it can be queried from SQL.

The above UDF could be registered using:

sqlContext.registerFunction("myDateFilter", myDateFilter)

and if the RDD has been registered as a table

sqlContext.registerRDDAsTable(entries, "entries")

it could be queried using

sqlContext.sql("SELECT * FROM entries WHERE myDateFilter(when)")
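Unlike the language-integrated where, registerFunction is not limited to a single argument; it has overloads for multi-argument Scala lambdas. A sketch (the strLenAtLeast name and lambda are illustrative, not from the original answer, and this assumes the sqlContext and entries table above):

```scala
// Register a two-argument UDF directly as a Scala lambda
sqlContext.registerFunction("strLenAtLeast", (s: String, n: Int) => s.length >= n)

sqlContext.sql("SELECT name FROM entries WHERE strLenAtLeast(name, 4)")
```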

