
Run PySpark Local Python Windows Notebook

Patricia Arquette
Release: 2025-01-21 18:15:08
PySpark Getting Started Guide: Configuring and Using PySpark in Your Local Environment

PySpark is the Python API of Apache Spark, an open source distributed computing system that enables fast and scalable data processing. PySpark allows Python developers to leverage the power of Spark for big data analysis, machine learning, and data engineering tasks without having to delve into the complexities of Java or Scala.

Using PySpark, users can process large data sets, perform distributed data transformations, and run machine learning algorithms in a cluster. It integrates seamlessly with popular data processing frameworks like Hadoop and supports multiple data formats, making it a versatile tool in the field of data science and analytics.

This guide provides an overview of PySpark configuration to help you easily set up and use it in your local computer environment.

Installation

  1. Install Python: https://www.php.cn/link/70fa3e3aed5e5da45f0114c00fadfb41
  2. Install Java: Please download the latest version of Java first: https://www.php.cn/link/8513351ff7f10b0f156c9d1f669e1210 (This article uses Java 23)
  3. Install PySpark: first, download Apache Spark from the official download page. This article uses https://www.php.cn/link/8f7b2d9100577f77aa8fbb4f51c0366e (spark-3.5.4-bin-hadoop3) as the tutorial example.
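Before continuing, it is worth confirming from a terminal that Python and Java resolve correctly. The commands below are a quick sanity check; the `pip` line is an alternative route that installs a bundled Spark from PyPI (Java is still required either way):

```shell
# Check that Python and Java are visible from the command line
python --version
java -version

# Alternative: install PySpark from PyPI (bundles Spark itself)
pip install pyspark==3.5.4
```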

Python configuration

  1. Java configuration:
<code class="language-python">import os

# Point JAVA_HOME at your local JDK install (Java 23 in this article)
os.environ["JAVA_HOME"] = r"D:\Soft\JAVA\jdk-23.0.1"
os.environ["PATH"] = os.path.join(os.environ["JAVA_HOME"], "bin") + os.pathsep + os.environ["PATH"]</code>
  2. PySpark configuration:
<code class="language-python">import os

# Point SPARK_HOME at the extracted Spark distribution
os.environ["SPARK_HOME"] = r"D:\Soft\pyspark\spark-3.5.4-bin-hadoop3"
os.environ["PATH"] = os.path.join(os.environ["SPARK_HOME"], "bin") + os.pathsep + os.environ["PATH"]</code>

After the configuration is complete, you can try checking PySpark in the command line:

PySpark Notebook Example

<code class="language-python">import numpy as np
import pandas as pd
from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .appName("DebugExample") \
    .master("local[*]") \
    .config("spark.eventLog.enabled", "true") \
    .config("spark.sql.shuffle.partitions", "1") \
    .getOrCreate()

spark.sparkContext.setLogLevel("DEBUG")
# Enable Arrow-based columnar data transfers
# (on Spark 3.x the current key is spark.sql.execution.arrow.pyspark.enabled)
spark.conf.set("spark.sql.execution.arrow.pyspark.enabled", "true")

# Generate a pandas DataFrame
pdf = pd.DataFrame(np.random.rand(100, 3))

# Create a Spark DataFrame from the pandas DataFrame using Arrow
df = spark.createDataFrame(pdf)
# Rename the columns
df = df.toDF("a", "b", "c")
df.show(5)  # Use df.show(5) to inspect the PySpark test output</code>


Machine learning data example:

<code class="language-python">import requests
from pyspark.sql import SparkSession

# Dataset URL
url = "https://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data"

# Download the dataset and save it locally
response = requests.get(url)
with open("iris.data", "wb") as file:
    file.write(response.content)

# Create a SparkSession
spark = SparkSession.builder \
    .appName("IrisDataAnalysis") \
    .master("local[*]") \
    .getOrCreate()

# Path of the locally downloaded iris dataset
iris_data_path = "iris.data"

# Define the column names for the data
columns = ["sepal_length", "sepal_width", "petal_length", "petal_width", "species"]

# Load the data into a DataFrame
df = spark.read.csv(iris_data_path, header=False, inferSchema=True)

# Set the column names
df = df.toDF(*columns)

# Show the first few rows of the DataFrame
df.show()

# Stop the SparkSession when done
spark.stop()</code>


Run successfully!
