You were overwriting the CSV for each iteration of the loop.
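For context, here is the kind of pattern that causes it (a minimal hypothetical sketch, not your exact code): with Export-Csv inside the loop and no -Append, each iteration replaces the output file, so only the last search's results survive.

# Hypothetical sketch of the problem, using the same variables as the
# corrected script below: Export-Csv runs once per iteration and,
# without -Append, replaces TestFileLocation.csv every time.
$searchFiles | ForEach-Object {
    Get-ChildItem $source -Filter $_.Name -Recurse -ErrorAction SilentlyContinue |
        Where-Object { -not $_.PSIsContainer } |
        Select-Object FullName |
        Export-Csv -NoTypeInformation -Delimiter '|' -Path $outputPath   # overwrites the previous iteration's results
}

The corrected script keeps Export-Csv outside the loop so it runs exactly once over the combined output: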
$searchFiles = Import-Csv 'C:\Data\SCRIPTS\PS1\TestFindFile.csv' -Header 'Name'
$source = 'C:\Data'
$outputPath = 'c:\data\scripts\ps1\TestFileLocation.csv'
$searchFiles | ForEach-Object {
    # Use SilentlyContinue to ignore errors such as
    # not being able to read paths that are too long
    Get-ChildItem $source -Filter $_.Name -Recurse -ErrorAction SilentlyContinue |
        Where-Object { -not $_.PSIsContainer } |
        Select-Object FullName
} | Export-Csv -NoTypeInformation -Delimiter '|' -Path $outputPath
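Because only FullName is selected, the result is a single-column file; TestFileLocation.csv ends up looking something like this (the paths are hypothetical; Export-Csv quotes every field by default in Windows PowerShell):

"FullName"
"C:\Data\Reports\budget.xlsx"
"C:\Data\Archive\2016\budget.xlsx"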
Example using AlphaFS
A comment asked for an example using AlphaFS, since it claims to overcome the long-path issue. I won't go into all the details, but here is how I got it to work.
# Download the AlphaFS package and unzip it to C:\AlphaFS, then unblock the
# downloaded files so the DLL will load:
# dir C:\AlphaFS\* -Recurse -File | Unblock-File
[System.Reflection.Assembly]::LoadFrom('C:\AlphaFS\lib\net451\AlphaFS.dll')
$searchFiles = Import-Csv 'C:\Data\SCRIPTS\PS1\TestFindFile.csv' -Header 'Name'
$source = 'C:\Data'
$outputPath = 'c:\data\scripts\ps1\TestFileLocation.csv'
$searchFiles | ForEach-Object {
    # AlphaFS's Directory.EnumerateFiles handles paths longer than MAX_PATH;
    # search for the current file name from the CSV under $source
    $files = [Alphaleonis.Win32.Filesystem.Directory]::EnumerateFiles($source, $_.Name, [System.IO.SearchOption]::AllDirectories)
    $files | ForEach-Object { [PSCustomObject] @{ FileName = $_ } }
} | Export-Csv -NoTypeInformation -Delimiter '|' -Path $outputPath
# type $outputPath
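If you want to process the results afterwards rather than just view them, the pipe-delimited file reads straight back into objects (a small usage sketch; each row from this AlphaFS version has a FileName property):

# Read the exported list back in and show the first few entries
$results = Import-Csv $outputPath -Delimiter '|'
$results | Select-Object -First 5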