run the SparkSQL code on the local machine. In this program the sqlContext has already been created, so everything after this point is the SparkSQL tutorial proper. This is the program as revised after updating to version 1.3; barring surprises, every 1.x version works the same way.
PS: note that this is the Python API, not the Scala one.
import os
import sys
import traceback
# Path for spark source folder
os.environ['SPARK_HOME']="/Users/jilu/Downloads/spark-1.3.0-bin-hadoop2.4"
# Append pyspark to Python Path
sys.path.append("/Users/jilu/Downloads/spark-1.3.0-bin-hadoop2.4/python/")
sys.path.append("/Users/jilu/Downloads/spark-1.3.0-bin-hadoop2.4/python/lib/py4j-0.8.2.1-src.zip")
# try to import the needed modules
try:
    from pyspark import SparkContext
    from pyspark import SparkConf
    from pyspark.sql import SQLContext, Row
    print("Successfully imported Spark Modules")
except ImportError:
    print("Cannot import Spark Modules {}".format(traceback.format_exc()))
    sys.exit(1)
# config spark env
conf = SparkConf().setAppName("myApp").setMaster("local")
sc = SparkContext(conf=conf)
sqlContext = SQLContext(sc)
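With the environment configured and sqlContext created, a quick sanity check is to build a small DataFrame from Row objects, register it as a temporary table, and query it with SQL. The sketch below uses made-up sample data and a hypothetical table name "people"; createDataFrame, registerTempTable, and sql are the standard SQLContext/DataFrame calls in Spark 1.3.
# minimal sanity check: the sample rows here are made up for illustration
people = sc.parallelize([Row(name="Alice", age=25),
                         Row(name="Bob", age=30)])
df = sqlContext.createDataFrame(people)   # schema is inferred from the Rows
df.registerTempTable("people")            # expose the DataFrame to SQL queries
adults = sqlContext.sql("SELECT name FROM people WHERE age >= 18")
for row in adults.collect():
    print(row.name)
If this prints the two names, the setup above is working and you can move on to the rest of the tutorial.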