
I want to replicate a CASE WHEN statement written in T-SQL in PySpark, but I am brand new to PySpark.

SELECT Table1.Part
    , Table1.Serial
    , Table1.AIRCRAFT_NUMBER
    , Table1.date_removed
    , Table2.dbo.E15.TIME
    , Table2.dbo.E15.TSO
    , data.dbo.EE18.Allowable_Time
    , CASE 
         WHEN (data.dbo.EE18.Allowable_Time > 0) 
            THEN data.dbo.EE18.Allowable_Time - Table2.dbo.E15.TSO 
      END AS CAL
FROM Table1 t1
  • there's an inner join in there – PineNuts0 Jun 08 '18 at 14:40
  • 1
    This query *should* work in pyspark-sql, but maybe you're looking for [`pyspark.sql.functions.when()`](http://spark.apache.org/docs/2.1.0/api/python/pyspark.sql.html#pyspark.sql.functions.when) – pault Jun 08 '18 at 14:41

0 Answers