Since one file contains only unique keys, try using a hash table instead of an array of (expensive) PSObject entries. This uses a "hash of hashes" for the BORROW.TXT file. Once the hash has been built, you only need to process the Loan.txt file line by line.
#**********************************************
#*** BORROW.TXT File Setup ***
#**********************************************
#This sets up the array of objects for the Nelnet BORROW.TXT file
echo "Processing file: $FileSource\BORROW.TXT"
$BrrwHash = @{}
Get-Content $BorrowFile |
ForEach-Object {
    $BrrwHash[$_.Substring(0,9)] = @{
        BRBNMF = $_.Substring(9,14)
        BRBNML = $_.Substring(23,25)
        BRBNMM = $_.Substring(48,1)
        BRBADS = $_.Substring(49,1)
        BRBPA1 = $_.Substring(56,30)
        BRBPA2 = $_.Substring(86,30)
        BRBRPC = $_.Substring(116,18)
        BRBPZC = $_.Substring(134,15)
        BRQSTA = $_.Substring(153,2)
        BRBPAC = $_.Substring(162,5)
        BRBPPN = $_.Substring(167,11)
        BRBDOB = $_.Substring(355,7)
    }
}
echo "Merging Files to: $OutFile"
Get-Content "Loan.txt" |
ForEach-Item{
$lnbss = $_.substring(0,9)
If ($BrrwHash.ContainsKey($lnbss){
$BrrwValues = $BrrwHash[$lnbss]
#merge data
# using, for example $brrwValues['BRBNMF'] and $_.Substring(x,y)
}
}
Because the hash is fairly large, you might want to create it with the System.Collections.Hashtable class and specify an initial capacity. If you expect the file to contain about 50K items, use 50000 (or a little more, say 55000). That avoids the table being repeatedly resized as entries are added and should reduce the time needed to build the hash.
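A minimal sketch of that pre-sizing (the 50000 capacity is just an estimate; pick a number at or slightly above your file's expected line count):

```powershell
# Pre-size the Hashtable so it is not repeatedly resized
# while ~50K entries are added.
$BrrwHash = New-Object System.Collections.Hashtable 50000

# Fill it exactly as before; only the construction changes.
Get-Content $BorrowFile |
ForEach-Object {
    $BrrwHash[$_.Substring(0,9)] = @{
        BRBNMF = $_.Substring(9,14)
        # ... remaining fields as in the snippet above ...
    }
}
```

The `@{}` literal starts with the default (small) capacity; `New-Object System.Collections.Hashtable 50000` calls the Hashtable(int capacity) constructor instead, so the only change is that one line.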