
SparkSQL External Datasource: Simple Usage with CSV

Posted: 2014-12-24 11:26:05

Download and build the source:

git clone https://github.com/databricks/spark-csv.git
sbt/sbt package

 

Maven GAV:

groupId: com.databricks.spark
artifactId: spark-csv_2.10
version: 0.1

 

Add the jar to the Spark classpath in $SPARK_HOME/conf/spark-env.sh:

export SPARK_CLASSPATH=/home/spark/software/source/spark_package/spark-csv/target/scala-2.10/spark-csv-assembly-0.1.jar:$SPARK_CLASSPATH

 

Download the test data:

wget https://github.com/databricks/spark-csv/raw/master/src/test/resources/cars.csv 

 

Scala API:

import org.apache.spark.sql.SQLContext
val sqlContext = new SQLContext(sc)
import com.databricks.spark.csv._
val cars = sqlContext.csvFile("file:///home/spark/software/data/cars.csv")
cars.collect
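In Spark 1.x, `csvFile` returns a SchemaRDD, so the loaded data can also be registered as a temporary table and queried with SQL directly from the Scala shell. A minimal sketch continuing the session above — note that the column names come from the CSV header row, so inspecting the schema first is an assumption-checking step, not part of the original post:

```scala
// Continuing the spark-shell session above (Spark 1.x).
// Print the inferred schema; column names are taken from the CSV header.
cars.printSchema()

// Register the SchemaRDD as a temporary table and query it with SQL.
cars.registerTempTable("cars")
sqlContext.sql("SELECT * FROM cars").collect().foreach(println)
```

This is equivalent to the `CREATE TEMPORARY TABLE` approach shown in the SQL section below, just driven from Scala instead of pure SQL.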

 

SQL:

CREATE TEMPORARY TABLE cars
USING com.databricks.spark.csv
OPTIONS (path "file:///home/spark/software/data/cars.csv", header "true");

select * from cars;

 


Original post: http://www.cnblogs.com/luogankun/p/4181884.html
