
Just trying to find out whether there is a way to detect overflow/underflow for a DecimalType column in Spark (DataFrame API)? There is a method to detect this happening for numeric types with a small range of values. I understand that it could be done at a lower level by using a UDF and checking whether the corresponding BigDecimal result overflows, but I would prefer to use existing DataFrame functions etc.
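For what it's worth, here is a minimal sketch of one DataFrame-only approach, assuming ANSI mode is off (the Spark default): a cast or arithmetic result that exceeds the target decimal precision comes back as null, so overflow can be flagged by checking for a null result where the input was non-null. The column names and the decimal(4,2)/decimal(3,2) types below are purely illustrative:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object DecimalOverflowCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("decimal-overflow-check")
      .master("local[*]")
      // With ANSI mode off (the default), decimal results that exceed
      // the target precision become null instead of raising an error.
      .config("spark.sql.ansi.enabled", "false")
      .getOrCreate()
    import spark.implicits._

    // decimal(4,2) holds up to 99.99; decimal(3,2) only up to 9.99.
    val df = Seq("99.99", "1.00").toDF("s")
      .select(col("s").cast("decimal(4,2)").as("x"))

    val checked = df
      .withColumn("y", col("x").cast("decimal(3,2)"))
      // Overflow detector: the input was non-null but the result is null.
      .withColumn("overflowed", col("x").isNotNull && col("y").isNull)

    checked.show()
    // 99.99 -> y = null, overflowed = true
    // 1.00  -> y = 1.00, overflowed = false
  }
}
```

The same null check works for decimal arithmetic (e.g. a product of two decimals that no longer fits the inferred result type), not just casts.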

Pavel
  • Can you tell us what logic you were thinking of applying in your UDF? – Sanket9394 Jul 10 '21 at 16:35
  • I guess I've totally undervalued the BigDecimal type: https://stackoverflow.com/questions/31974837/can-doubles-or-bigdecimal-overflow — as I understand it, overflow is simply not the right question to ask about BigDecimal. It's unlikely the whole power of BigDecimal would be needed in practice. – Pavel Jul 10 '21 at 16:55
  • 1
    One thing is you can always compute the maximum limit of your Decimal Column using precision and scale, and probably compare. But I am not sure if that would suffice your usecase. see :https://stackoverflow.com/questions/2377174/how-do-i-interpret-precision-and-scale-of-a-number-in-a-database – Sanket9394 Jul 10 '21 at 17:50
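Building on that last comment, a minimal sketch of the precision/scale bound check it describes. The helper names decimalBounds and wouldOverflow are hypothetical, and the assumed formula is that decimal(p, s) can represent magnitudes up to 10^(p-s) - 10^(-s), e.g. decimal(5, 2) tops out at 999.99:

```scala
import org.apache.spark.sql.Column
import org.apache.spark.sql.functions.lit

// Smallest/largest values representable by decimal(precision, scale),
// e.g. decimal(5, 2) ranges from -999.99 to 999.99.
def decimalBounds(precision: Int, scale: Int): (java.math.BigDecimal, java.math.BigDecimal) = {
  val max = java.math.BigDecimal.TEN.pow(precision - scale)
    .subtract(java.math.BigDecimal.ONE.scaleByPowerOfTen(-scale))
  (max.negate, max)
}

// Flags rows whose value would not fit into decimal(precision, scale).
def wouldOverflow(c: Column, precision: Int, scale: Int): Column = {
  val (lo, hi) = decimalBounds(precision, scale)
  c < lit(lo) || c > lit(hi)
}

// Usage (hypothetical column name):
// df.withColumn("overflow", wouldOverflow(col("amount"), 5, 2))
```

This avoids a UDF entirely: the comparison is an ordinary Column expression that the optimizer can see through.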

0 Answers