Ben Chuanlong Du's Blog


Process Big Data Using PySpark

  1. PySpark 2.4 and older do not support Python 3.8. You have to use Python 3.7 with PySpark 2.4 or older.

  2. It can be extremely helpful to run a PySpark application locally to detect possible issues before submitting it to the Spark cluster; a minimal local-mode sketch is included below.

    #!/usr/bin/env bash …
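
A minimal sketch of such a local run (the session settings and sample data here are only illustrative): point the SparkSession at local mode and exercise a small piece of the application logic.

    from pyspark.sql import SparkSession

    # Run the application logic locally with 2 threads instead of on the cluster.
    spark = (
        SparkSession.builder
        .master("local[2]")
        .appName("local_smoke_test")
        .getOrCreate()
    )

    # Exercise a small piece of the pipeline to catch obvious errors early.
    df = spark.createDataFrame([(1, "a"), (2, "b"), (3, "a")], ["id", "val"])
    df.groupBy("val").count().show()

    spark.stop()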

Get CentOS Version

You can get the version of CentOS using the following command.

rpm -q centos-release

This trick can be used to get the version of the CentOS distribution on a Spark cluster. Basically, you run this command on the driver or the workers to print the versions and then parse the log …
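
A rough PySpark sketch of this trick, assuming a CentOS-based cluster where rpm is available on the driver and the workers (subprocess.run with capture_output requires Python 3.7+):

    import subprocess
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    sc = spark.sparkContext

    def centos_version(_):
        """Run `rpm -q centos-release` on whatever machine executes this partition."""
        out = subprocess.run(
            ["rpm", "-q", "centos-release"], capture_output=True, text=True
        )
        yield out.stdout.strip()

    # Version on the driver.
    print(subprocess.run(["rpm", "-q", "centos-release"], capture_output=True, text=True).stdout.strip())
    # Versions reported by the workers (deduplicated).
    print(set(sc.parallelize(range(8), 8).mapPartitions(centos_version).collect()))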

Spark Issue: Shell Related

Symptom 1

/bin/sh: hdfs: command not found

Possible Causes of Symptom 1

The command hdfs is not on the search path.

Possible Solutions to Symptom 1

  1. Use the full path to the command.
  2. Configure the environment variable PATH before you use the command; both of these fixes are sketched after this list.
  3. Find other alternatives to the command …
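
A sketch of the first two fixes from driver-side Python code; /usr/local/hadoop/bin is only an assumed install location, so adjust it for your cluster:

    import os
    import subprocess

    # Fix 1: call the command by its full path.
    subprocess.run(["/usr/local/hadoop/bin/hdfs", "dfs", "-ls", "/"], check=True)

    # Fix 2: put the Hadoop bin directory on PATH before invoking the command.
    os.environ["PATH"] = "/usr/local/hadoop/bin:" + os.environ["PATH"]
    subprocess.run(["hdfs", "dfs", "-ls", "/"], check=True)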

Spark Issue: Namespace Quota Is Exceeded

Symptom

Caused by: org.apache.hadoop.hdfs.protocol.NSQuotaExceededException: The NameSpace quota (directories and files) of directory /user/user_name is exceeded: quota=163840 file count=163841

Cause

The namespace quota of the directory /user/user_name is exceeded.

Solutions

  1. Remove unneeded files from the directory /user/user_name to release some namespace …
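
One way to inspect the quota and free up namespace, sketched by wrapping the hdfs CLI in Python; /user/user_name/old_output is a hypothetical path used only for illustration:

    import subprocess

    # Inspect the namespace quota and current usage of the directory.
    subprocess.run(["hdfs", "dfs", "-count", "-q", "/user/user_name"], check=True)

    # Remove an unneeded directory to free up namespace. -skipTrash matters here
    # because trashed files stay under /user/user_name/.Trash and keep counting
    # toward the same quota.
    subprocess.run(
        ["hdfs", "dfs", "-rm", "-r", "-skipTrash", "/user/user_name/old_output"],
        check=True,
    )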

Spark Issue: Rust Panic

If you use Rust with Spark/PySpark and there are issues in the Rust code, you might get Rust panic error messages.

Symptom

Error: b"thread 'main' panicked at 'index out of bounds: the len is 15 but the index is 15', src/game.rs:131:39\nnote: run with …

Spark Issue: RuntimeException: Unsupported Literal Type Class

Symptom

java.lang.RuntimeException: Unsupported literal type class java.util.ArrayList [1]

Possible Causes

This happens in PySpark when a Python list is provided where a scalar is required. Assuming id0 is an integer column in the DataFrame df, the following code throws the above error.

v = [1, 2, 3 …
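
A self-contained sketch of the problem and one fix (the DataFrame here is made up for illustration): comparing the column to a Python list with == asks Spark to turn the list into a single literal and fails, while Column.isin expresses the intended membership test.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.getOrCreate()
    df = spark.createDataFrame([(1,), (2,), (5,)], ["id0"])

    v = [1, 2, 3]
    # df.filter(col("id0") == v)  # raises: Unsupported literal type class java.util.ArrayList
    df.filter(col("id0").isin(v)).show()  # membership tests need Column.isin, not ==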