
I have developed a file upload and processing feature with Laravel, but the runtime is very long.

The file looks like this (very large, about 50k hands per file):

QhQs3s2s@86,QdQs3s2s@86,QcQs3s2s@86,KhKs3s2s@100,KdKs3s2s@100,KcKs3s2s@100,AhAs3s2s@86,AdAs3s2s@86,AcAs3s2s@86

It is uploaded via a .txt upload and then chunked into sets of 1000 hands:

/**
 * Upload and split the range file.
 */
public function uploadFile(Request $request)
{
    // process SituationName
    $name = $request->input('name');
    $situation = Situation::firstOrCreate(['name' => $name, 'active' => 1]);

    //process RaiseRange
    $action = Action::where('name', 'Raise')->first();
    $path = $request->file('rangeRaise')->store('ranges');

    //Split Files
    $content = Storage::disk('local')->get($path);
    $array = explode(",", $content);
    $arrayFinal = array_chunk($array, 1000);

    foreach($arrayFinal as $arrayJob){
        $filename = 'ranges/RaiseFile'.uniqid().'.txt';
        Storage::disk('local')->put($filename, json_encode($arrayJob));
        ProcessRangeFiles::dispatch($action, $situation, $filename);
    }
}
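For reference, the split-and-chunk step of `uploadFile()` can be reproduced in isolation. This is plain PHP with no Laravel dependencies; `$raw` is a shortened stand-in for the real file contents, and the chunk size is 2 instead of 1000 so the result is easy to inspect:

```php
<?php
// Shortened stand-in for the uploaded file contents.
$raw = 'QhQs3s2s@86,QdQs3s2s@86,KhKs3s2s@100,AhAs3s2s@86,AcAs3s2s@86';

// Same logic as uploadFile(): split on commas, then chunk.
$hands  = explode(',', $raw);
$chunks = array_chunk($hands, 2);

// In the real code, each chunk is written to its own file
// and dispatched as a ProcessRangeFiles job.
var_dump(count($chunks)); // 3 chunks of 2, 2 and 1 hands
```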

Each chunk is then dispatched as a job with the following handle() method:

public function handle()
{
    Log::info('File Processing started');
    $array = null;
    $content = null;
    $found = null;

    $path = $this->path;
    $action = $this->action;
    $situation = $this->situation;

    $hands = Hand::all();

    $content = json_decode(Storage::disk('local')->get($path));

    foreach ($content as $key=>$line){
        $array[$key] = explode('@', $line);
        foreach($hands as $hand){
            if($hand->hand == $array[$key][0]){
                $found = $hand;
                break;
            }
        }
        DB::table('hands_to_situations_to_actions')->insert(
            ['hand_id' => $found->id, 'action_id' => $action->id, 'situation_id' => $situation->id, 'percentage' => $array[$key][1], 'created_at' => Carbon::now()->toDateTimeString(), 'updated_at' => Carbon::now()->toDateTimeString()]
        );
    }
    Log::info('File Processing finished');
}

`$hands` is filled with every possible Omaha poker hand.

Does anyone have an idea how to optimize this code? The runtime for each chunk of 1000 hands is about 12 minutes.

Yann1ck
    Try doing a batch insert when an array of data reaches a certain threshold instead of single inserts on each maybe? https://stackoverflow.com/questions/41871287/batch-insert-in-laravel-5-2 – Jeremy Harris Mar 20 '19 at 18:51
  • So isn't the search for the hand in the big $hands array the problem? Or do the DB inserts take too much time on each iteration? – Yann1ck Mar 20 '19 at 18:54
  • Possibly both, but that many DB calls is gonna be slow, regardless – infamoustrey Mar 20 '19 at 18:55
  • Do the inserts inside of a transaction to streamline IO. – Sammitch Mar 20 '19 at 19:02
  • You need to optimize the `$hands` foreach loop as well as the DB inserts as mentioned above. How many rows does `$hands` have and what is a `Hand` by the way? – nice_dev Mar 20 '19 at 19:50
  • $hands has 270,725 entries. A Hand represents a single Omaha poker hand, like AcAsKd2h for Ace of clubs, Ace of spades, King of diamonds and 2 of hearts. – Yann1ck Mar 20 '19 at 23:01
  • So what would be a way to optimize the foreach loop? I know that every hand can occur only once in the file. Would it be good to delete the found hands from the $hands array? – Yann1ck Mar 20 '19 at 23:27
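Putting the comment suggestions together: replace the inner loop over ~270k Hand models with a map keyed by the hand string (in Laravel, `Hand::pluck('id', 'hand')` would build exactly that in one query), and collect each chunk into a single multi-row `DB::table(...)->insert($rows)` instead of 1000 single inserts. The array side of that rework is sketched below in plain PHP; the `$handIds` map and the literal id values are tiny stand-ins for the real data, not the actual application code:

```php
<?php
// Stand-in for Hand::pluck('id', 'hand')->all(): hand string => id.
// One query up front; each lookup afterwards is O(1).
$handIds = ['QhQs3s2s' => 1, 'QdQs3s2s' => 2, 'KhKs3s2s' => 3];

// One decoded chunk, as produced by the upload step.
$lines = ['QhQs3s2s@86', 'QdQs3s2s@86', 'KhKs3s2s@100'];

$now  = '2019-03-20 18:51:00';   // Carbon::now()->toDateTimeString(), computed once
$rows = [];
foreach ($lines as $line) {
    [$hand, $percentage] = explode('@', $line);
    $rows[] = [
        'hand_id'      => $handIds[$hand], // map lookup replaces the inner foreach
        'action_id'    => 4,               // $action->id in the real job
        'situation_id' => 5,               // $situation->id in the real job
        'percentage'   => $percentage,
        'created_at'   => $now,
        'updated_at'   => $now,
    ];
}

// In the job, this becomes one statement per chunk instead of 1000:
// DB::table('hands_to_situations_to_actions')->insert($rows);
var_dump(count($rows)); // 3
```

With 6 columns × 1000 rows the batch stays well under typical placeholder limits; if you want extra safety, split `$rows` with `array_chunk($rows, 500)` before inserting.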
