The question is not really related to Spark; it's about how dates are represented in different systems.
The point is that one of the ways Excel represents dates is as the number of days elapsed since 1/1/1900, while a UNIX timestamp is the well-known number of seconds since the epoch (1/1/1970).
You can check here what date the value 43517 corresponds to when interpreted as a timestamp:
https://www.epochconverter.com/timezones?q=43517&tz=UTC
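To illustrate the mismatch, here is a minimal Python sketch (the helper name is my own) that converts an Excel serial day number into a proper date and a Unix timestamp. It assumes the common convention where the offset base 1899-12-30 absorbs both Excel's 1-based counting and its fictitious 1900-02-29 leap day, which holds for dates after March 1900:

```python
from datetime import datetime, timedelta, timezone

def excel_serial_to_datetime(serial: int) -> datetime:
    # Base date 1899-12-30 accounts for Excel's 1-based serials and
    # its (incorrect) treatment of 1900 as a leap year.
    return datetime(1899, 12, 30, tzinfo=timezone.utc) + timedelta(days=serial)

dt = excel_serial_to_datetime(43517)
print(dt.date())              # 2019-02-21
print(int(dt.timestamp()))    # 1550707200 (seconds since the epoch)
```

So 43517 read as an Excel serial means 2019-02-21, whereas read directly as a Unix timestamp it means a few hours into 1970-01-01.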
I guess you should instead ask whether there is a way to represent dates as timestamps in Excel, or, alternatively, how to map the values using DataFrames.