
I want to add a new column to the dataframe whose values are either 0 or 1. I tried the randint function:

from random import randint

df1 = df.withColumn('isVal',randint(0,1))

But I get the following error,

File "/spark/python/pyspark/sql/dataframe.py", line 1313, in withColumn
    assert isinstance(col, Column), "col should be Column"
AssertionError: col should be Column

How can I use a custom function, or randint, to generate random values for this column?

Dilma

2 Answers


You are using Python's built-in random module. randint(0, 1) is evaluated once on the driver and returns a plain int, so even if it were accepted, every row would get the same constant value.

As the error message shows, withColumn expects a Column, i.e. an expression that is evaluated per row.

To get a per-row random 0 or 1, use rand from pyspark.sql.functions:

from pyspark.sql.functions import rand, when
df1 = df.withColumn('isVal', when(rand() > 0.5, 1).otherwise(0))

rand() draws from a uniform distribution on [0, 1), so thresholding at 0.5 yields 0 or 1 with equal probability. See the functions documentation for more options: http://spark.apache.org/docs/latest/api/python/pyspark.sql.html#module-pyspark.sql.functions

Assaf Mendelson

I had a similar problem generating integer values from 5 to 10, and used the rand() function from pyspark.sql.functions:

from pyspark.sql.functions import rand, round
df1 = df.withColumn("random", round(rand()*(10-5)+5, 0))
gogogod