Converting between pyspark.sql.DataFrame and pandas.DataFrame
# -*- coding: utf-8 -*-
import pandas as pd
from pyspark.sql import SparkSession
from pyspark.sql import SQLContext
from pyspark import SparkContext

# Initialize the SparkContext
sc = SparkContext()

if __name__ == "__main__":
    print("1. Initialize a pandas DataFrame")
    # Initialize a pandas DataFrame
    df = pd.DataFrame([[1, 2, 3], [4, 5, 6]],
                      index=['row1', 'row2'], columns=['c1', 'c2', 'c3'])
    # Print the data
    print(df)

    # Initialize a Spark session (reuses the existing SparkContext)
    spark = SparkSession \
        .builder \
        .appName("testDataFrame") \
        .getOrCreate()
    sentenceData = spark.createDataFrame([(0.0, "I like Spark"),
                                          (1.0, "Pandas is useful"),
                                          (2.0, "They are coded by Python")],
                                         ["label", "sentence"])
    # Show the data
    sentenceData.select("label").show()

    print("2. Convert the pandas.DataFrame to a spark.DataFrame")
    # pandas.DataFrame -> spark.DataFrame
    sqlContext = SQLContext(sc)
    spark_df = sqlContext.createDataFrame(df)
    # Show the data
    spark_df.select("c1").show()

    print("3. Convert the spark.DataFrame back to a pandas.DataFrame")
    # spark.DataFrame -> pandas.DataFrame
    pandas_df = sentenceData.toPandas()
    # Print the data
    print(pandas_df)
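For large DataFrames, both conversion directions can be accelerated by enabling Apache Arrow for columnar data transfer. This is a configuration sketch, assuming Spark 3.x (the config key name changed between versions, as noted in the comment):

```python
# Enable Arrow-based columnar transfers between Spark and pandas.
# Spark 3.x key shown; in Spark 2.3/2.4 the key was
# "spark.sql.execution.arrow.enabled".
spark.conf.set("spark.sql.execution.arrow.pyspark.enabled", "true")
```

With Arrow enabled, `spark.createDataFrame(pandas_df)` and `spark_df.toPandas()` avoid row-by-row serialization; Spark falls back to the non-Arrow path if the column types are not Arrow-compatible.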
Output:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
18/05/21 19:47:21 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/05/21 19:47:22 WARN Utils: Your hostname, localhost resolves to a loopback address: 127.0.0.1; using 10.2.33.229 instead (on interface en0)
18/05/21 19:47:22 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
18/05/21 19:47:22 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
1. Initialize a pandas DataFrame
      c1  c2  c3
row1   1   2   3
row2   4   5   6
+-----+
|label|
+-----+
|  0.0|
|  1.0|
|  2.0|
+-----+
2. Convert the pandas.DataFrame to a spark.DataFrame
+---+
| c1|
+---+
|  1|
|  4|
+---+
3. Convert the spark.DataFrame back to a pandas.DataFrame
label                  sentence
0    0.0              I like Spark
1    1.0          Pandas is useful
2    2.0  They are coded by Python
Process finished with exit code 0
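Note that `createDataFrame` drops the pandas index: the `row1`/`row2` labels never appear in `spark_df` above, so they are lost after a round trip through Spark. A minimal sketch, using only pandas, of how to preserve the index by promoting it to an ordinary column before converting:

```python
import pandas as pd

df = pd.DataFrame([[1, 2, 3], [4, 5, 6]],
                  index=['row1', 'row2'], columns=['c1', 'c2', 'c3'])

# Promote the index to a regular column so it survives the round trip
# through spark.createDataFrame(...).toPandas(); with an unnamed index,
# pandas names the new column "index".
df_with_index = df.reset_index()
print(df_with_index.columns.tolist())  # ['index', 'c1', 'c2', 'c3']
print(df_with_index['index'].tolist())  # ['row1', 'row2']
```

After converting back with `toPandas()`, `set_index('index')` restores the original row labels.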
