Salesforce connector in Spark
Salesforce is a customer relationship management solution that brings companies and customers together. It’s one integrated CRM platform that gives all your departments — including marketing, sales, commerce, and service — a single, shared view of every customer.
Download the Salesforce JDBC connector (the Progress DataDirect SForce driver used in the example below) from the Progress DataDirect website.
Code:
import org.apache.spark.sql.SparkSession

object Sample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("salesforce")
      .master("local[*]")
      .getOrCreate()

    val tableName = "account"
    val outputPath = "output/result" + tableName

    // Read the Salesforce "account" table over JDBC
    val salesforceDf = spark.read
      .format("jdbc")
      .option("url", "jdbc:datadirect:sforce://login.salesforce.com;")
      .option("driver", "com.ddtek.jdbc.sforce.SForceDriver")
      .option("dbtable", tableName)
      .option("user", "<Username>")
      .option("password", "<password>")
      .option("securitytoken", "<security_token>")
      .load()

    salesforceDf.createOrReplaceTempView("account")
    spark.sql("select * from account").collect.foreach(println)

    // save the result
    salesforceDf.write.save(outputPath)
  }
}
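The chained .option(...) calls above can equivalently be collected into a Map and passed in one go with .options(...), which keeps the connection settings reusable and easy to test. The helper below is a hypothetical sketch (SforceOptions and jdbcOptions are names introduced here, not part of Spark or the driver):

```scala
object SforceOptions {
  // Hypothetical helper: builds the same JDBC option map that the
  // chained .option(...) calls above pass to spark.read
  def jdbcOptions(user: String, password: String, token: String, table: String): Map[String, String] =
    Map(
      "url"           -> "jdbc:datadirect:sforce://login.salesforce.com;",
      "driver"        -> "com.ddtek.jdbc.sforce.SForceDriver",
      "dbtable"       -> table,
      "user"          -> user,
      "password"      -> password,
      "securitytoken" -> token
    )

  def main(args: Array[String]): Unit = {
    val opts = jdbcOptions("<Username>", "<password>", "<security_token>", "account")
    // With these options the read becomes: spark.read.format("jdbc").options(opts).load()
    println(opts("driver")) // prints com.ddtek.jdbc.sforce.SForceDriver
  }
}
```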
Open your terminal and run the following command to start the Spark shell, passing the path to the Salesforce JDBC driver jar via --jars:
spark-shell --jars /path_to_driver/sforce.jar
Spark Submit:
spark-submit --jars sforce.jar --class <your_class_name> <your_jar_file>
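Filled in for the example above, the invocation might look like the following. This is a sketch: Sample matches the object defined in the code, while sample-app.jar is a hypothetical name for whatever jar your build tool produces:

```shell
# Hypothetical invocation: 'Sample' is the object from the code above;
# replace sample-app.jar with the jar produced by your build (e.g. sbt package)
spark-submit \
  --jars /path_to_driver/sforce.jar \
  --class Sample \
  sample-app.jar
```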