In PySpark, you can search file contents like this:
from pyspark.sql.functions import input_file_name

input_path = "data/"  # this can also be an S3 location

# Read every file under input_path as lines of text and tag each row
# with the name of the file it came from. (No need to round-trip through
# an RDD; the select already produces the DataFrame we want.)
df = spark.read.text(input_path).select(input_file_name(), "value")

# Keep only the rows whose line contains the search term.
df2 = df.filter(df["value"].contains("F1"))
>>> df.show()
+--------------------+--------------------+
| input_file_name()| value|
+--------------------+--------------------+
|file:///Users/hbo...|"`F1`","`F2`","`F3`"|
|file:///Users/hbo...|         "a","b","c"|
|file:///Users/hbo...| "d","e","f"|
|file:///Users/hbo...| "F1","F2","F3"|
|file:///Users/hbo...| "a","b","c"|
|file:///Users/hbo...| "d","e","f"|
+--------------------+--------------------+
>>> df2.show()
+--------------------+--------------------+
| input_file_name()| value|
+--------------------+--------------------+
|file:///Users/hbo...|"`F1`","`F2`","`F3`"|
|file:///Users/hbo...| "F1","F2","F3"|
+--------------------+--------------------+
Let me know if this works for you.