You cannot run these two actions within the same Spark job. What you are most likely looking for is running the two jobs in parallel within the same application.
As the documentation says, you can run multiple jobs in parallel in the same application if those jobs are submitted from different threads:
Inside a given Spark application (SparkContext instance), multiple parallel jobs can run simultaneously if they were submitted from separate threads. By “job”, in this section, we mean a Spark action (e.g. save, collect) and any tasks that need to run to evaluate that action. Spark’s scheduler is fully thread-safe and supports this use case to enable applications that serve multiple requests (e.g. queries for multiple users).
In other words, this should run both actions in parallel (the CompletableFuture API is used here, but any async execution or multithreading mechanism would work):
// Each call submits its action from a separate thread, so Spark can run the two jobs concurrently
CompletableFuture<Void> esWrite = CompletableFuture.runAsync(() -> writeToES(df));
CompletableFuture<Void> cassandraWrite = CompletableFuture.runAsync(() -> writeToCassandra(df));
You can then join on one or both of these futures to wait for completion (see the sketch below). As noted in the documentation, you also need to pay attention to the configured scheduler mode: with the default FIFO scheduler the first submitted job gets priority on the cluster's resources, whereas the FAIR scheduler shares resources between concurrent jobs in a round-robin fashion, so both writes make progress in parallel:
conf.set("spark.scheduler.mode", "FAIR")