
Spark Issue: RuntimeException: Unsupported Literal Type Class

Symptom

java.lang.RuntimeException: Unsupported literal type class java.util.ArrayList [1]

Possible Causes

This happens in PySpark when a Python list is provided where a scalar value is required. Assuming id0 is an integer column in the DataFrame df, the following code throws the above error.

from pyspark.sql.functions import col

v = [1, 2, 3]
df.filter(col("id0") == v)
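For reference, a minimal self-contained reproduction is sketched below; the SparkSession setup and the toy DataFrame are assumptions added for illustration, not part of the original example.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("repro").getOrCreate()
df = spark.createDataFrame([(1,), (2,), (5,)], ["id0"])

try:
    # Comparing a column to a Python list makes Spark try to build a single
    # literal from a java.util.ArrayList, which it rejects.
    df.filter(col("id0") == [1, 2, 3]).show()
except Exception as e:
    print(e)  # Unsupported literal type class java.util.ArrayList ...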

Possible Solutions

  1. Use a scalar value for v in the code example above (see the sketch after this list).
  2. Use isin to check whether the value of id0 is in the list v.
    v = [1, 2, 3]
    df.filter(col("id0").isin(v))
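
Both fixes are sketched below as a minimal example, reusing the hypothetical DataFrame from the reproduction above; adapt the scalar and the list to your data.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("fix").getOrCreate()
df = spark.createDataFrame([(1,), (2,), (5,)], ["id0"])

# Fix 1: compare the column to a single scalar value.
df.filter(col("id0") == 1).show()

# Fix 2: use isin to test membership in the list.
v = [1, 2, 3]
df.filter(col("id0").isin(v)).show()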
    
