
I do not understand the behavior below. I am trying to divide bkg_fx_rate by sys_currency to get an FX cross rate. Both fields are of type Decimal(36, 16). With this precision, in the first row, I would expect 1.5387893576534500 / 0.9330900000000000 = 1.6491328356894297. Yet what I get instead is 1.649133. Why is that? I have tried to cast and to change the DecimalType() arguments, but it never gives me more than 6 places after the decimal point.
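For reference, the expected quotient can be checked outside Spark with Python's decimal module; rounding the full-precision result to 6 decimal places reproduces exactly the value Spark returns:

```python
from decimal import Decimal, ROUND_HALF_UP

a = Decimal("1.5387893576534500")  # bkg_fx_rate
b = Decimal("0.9330900000000000")  # sys_currency

q = a / b  # default context: 28 significant digits

print(q.quantize(Decimal("1e-16")))                          # 1.6491328356894297
print(q.quantize(Decimal("1e-6"), rounding=ROUND_HALF_UP))   # 1.649133 (what Spark shows)
```

So the arithmetic itself is fine; only the scale of the result column is being cut down to 6.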

Thank you.

[screenshot: DataFrame output showing the division result truncated to 1.649133]

Grevioos

1 Answer


Cast them to DoubleType(). With two Decimal(36, 16) operands, the exact quotient type would exceed Spark's 38-digit precision cap, so Spark shrinks the result's scale down to its minimum of 6 decimal places; double arithmetic avoids that truncation.

>>> from pyspark.sql.types import StructType, StructField, DoubleType
>>>
>>> data = [(1.5387893576534500, 0.9330900000000000)]
>>> 
>>> schema = StructType([
...     StructField('a', DoubleType(),True),
...     StructField('b', DoubleType(),True)
... ])
>>> 
>>> df = spark.createDataFrame(data=data,schema=schema)
>>> df.withColumn('c', df.a/df.b).show()
+----------------+-------+------------------+
|               a|      b|                 c|
+----------------+-------+------------------+
|1.53878935765345|0.93309|1.6491328356894297|
+----------------+-------+------------------+
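The 6-place truncation comes from Spark's Hive-style decimal type rules (the DecimalPrecision logic, with the default spark.sql.decimalOperations.allowPrecisionLoss=true). A plain-Python sketch of that division rule, assuming this behavior, shows why two Decimal(36, 16) operands produce a Decimal(38, 6) result:

```python
MAX_PRECISION = 38          # Spark's cap on decimal precision
MINIMUM_ADJUSTED_SCALE = 6  # the scale floor Spark keeps when it must drop digits

def divide_result_type(p1, s1, p2, s2):
    """Result (precision, scale) of Decimal(p1, s1) / Decimal(p2, s2) in Spark SQL."""
    # Ideal result type per Hive / SQL Server-style rules
    scale = max(6, s1 + p2 + 1)
    precision = p1 - s1 + s2 + scale
    if precision <= MAX_PRECISION:
        return precision, scale
    # Overflow: keep the integral digits, shrink the scale (but not below 6)
    int_digits = precision - scale
    min_scale = min(scale, MINIMUM_ADJUSTED_SCALE)
    adjusted_scale = max(MAX_PRECISION - int_digits, min_scale)
    return MAX_PRECISION, adjusted_scale

print(divide_result_type(36, 16, 36, 16))  # (38, 6): only 6 decimal places survive
```

The ideal type here would be Decimal(89, 53), far over the 38-digit cap, so Spark keeps the 36 integral digits and clamps the scale to 6 — which is exactly the 1.649133 you saw.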
Bala