
Trying to figure out why I am getting the following error:

Undefined index: Plugin ID

I am using Maatwebsite\Excel for my import and tried using the guide here:

https://appdividend.com/2017/06/12/import-export-data-csv-excel-laravel-5-4/

I think I have everything in the right place, but I am getting the above error from this code:

public function import(Request $request)
{
    if ($request->file('imported-file'))
    {
        $path = $request->file('imported-file')->getRealPath();
        $data = Excel::load($path, function($reader)
        {
        })->get();

        if (!empty($data) && $data->count())
        {
            foreach ($data->toArray() as $row)
            {
                if (!empty($row))
                {
                    $dataArray[] =
                    [
                        'plugin_id' => $row['Plugin ID'],
                        'cve' => $row['CVE'],
                        'cvss' => $row['CVSS'],
                        'risk' => $row['Risk'],
                        'host' => $row['Host'],
                        'protocol' => $row['Protocol'],
                        'port' => $row['Port'],
                        'name' => $row['Name'],
                        'synopsis' => $row['Synopsis'],
                        'description' => $row['Description'],
                        'solution' => $row['Solution'],
                        'see_also' => $row['See Also'],
                        'plugin_output' => $row['Plugin Output']
                    ];
                }
            }

            if (!empty($dataArray))
            {
                Shipping::insert($dataArray);
                return back();
            }
        }
    }
}

This is in my controller file and is trying to account for the headers in the CSV being different from the column names in my database.

Any idea why it would be complaining about an undefined index for a column on the CSV side of things?
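For reference, one way to see exactly which keys the reader produces (this snippet is not part of my code above, just a debugging aid) is to dump them at the top of the loop. Laravel Excel 2.x slugs the heading row by default, so a 'Plugin ID' column would typically come through as 'plugin_id':

foreach ($data->toArray() as $row)
{
    // Temporary debug line: stops execution and prints the keys of the first row.
    // If the headings are being slugged, expect 'plugin_id' rather than 'Plugin ID'.
    dd(array_keys($row));
}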

  • try a simple `dd($row);` in the `foreach` to see the `$row` array – ljubadr Oct 19 '17 at 16:19
  • can you give an example of the syntax for that? I am not getting it right (pretty new to php and laravel, etc) – wahyzcrak Oct 19 '17 at 16:34
  • [dd() helper function](https://laravel.com/docs/5.5/helpers#method-dd) – ljubadr Oct 19 '17 at 16:36
  • just copy `dd($row);` and paste it after `foreach ($data->toArray() as $row) {` – ljubadr Oct 19 '17 at 16:37
  • That is returning array:13[ with one of the records on the screen, but not inserting anything into the database – wahyzcrak Oct 19 '17 at 16:51
  • `dd()` will print to the screen and stop execution (nothing after that point is executed). It should be helpful to see if your `$row` array contains `'Plugin ID'` – ljubadr Oct 19 '17 at 16:55
  • I tried it with dump and that seems to return all of the records one row at a time – wahyzcrak Oct 19 '17 at 16:56
  • I see, yes it appears to be correctly mapping values from the csv and then returning them with the database column name – wahyzcrak Oct 19 '17 at 16:57
  • You should also add `$dataArray = [];` before the `foreach` – ljubadr Oct 19 '17 at 17:06
  • check your `$dataArray` before you insert to database `dd($dataArray);` – ljubadr Oct 19 '17 at 17:09
  • Thank you for all of the help so far, this is teaching me a lot. It seems that the array is getting populated correctly (I used your recommendation to check what was in it after the fill-array section). Unfortunately now I am timing out again. Are there logs I can look at to see if insert statements are being processed by mysql? – wahyzcrak Oct 19 '17 at 17:25
  • you should look in the Laravel log file `storage/logs/laravel.log` for errors; you can see the time the error happened and the `Stack trace`. It can be overwhelming at the beginning, but you'll get used to it :) – ljubadr Oct 19 '17 at 17:30
  • Also the log should tell you where the error occurred (file and line), so it's easier to track the errors down. In the `.env` file in the project root, set `APP_DEBUG=true` to show errors in the browser – ljubadr Oct 19 '17 at 17:32
  • I don't really see any errors in the logs with debugging enabled. It looks like it is making the call to insert the data, but I can't see any records getting inserted, and the process ends up timing out after 30 seconds in the web interface – wahyzcrak Oct 19 '17 at 18:12
  • If the error is gone, that's good. How many rows are in the csv file? – ljubadr Oct 19 '17 at 20:25
  • If you have a lot of rows, insert could take a while. You can always [increase php timeout](https://stackoverflow.com/questions/3829403/how-to-increase-the-execution-timeout-in-php) – ljubadr Oct 19 '17 at 20:33
  • There aren't very many rows in this file – wahyzcrak Oct 19 '17 at 21:15
  • Thank you very much for your help. I ended up changing the way I was doing this a bit, by following another guide that used the Maatwebsite/excel functionality. It also had issues, but I ended up being able to use var_dump($value) to determine that I was having problems because I was trying to pass an array as values into another array. Will post the code that ended up working shortly. – wahyzcrak Oct 20 '17 at 13:17

1 Answer


I ended up with this for now, from another post. That post still had an extra section in it, but by doing a var_dump on $value (which I left in, commented out) I could see that $value was already an array, so instead of passing it into another array I tried inserting with it directly, and that seems to be working.

Still working on placing the error and success messages.

Thanks to ljubadr for helping me learn how to put some print-type statements in the code to see what was getting output at various places.

public function importExcel(Request $request)
{
    if ($request->hasFile('import_file')) {
        $path = $request->file('import_file')->getRealPath();
        $data = Excel::load($path, function($reader) {})->get();

        if (!empty($data) && $data->count()) {
            foreach ($data->toArray() as $key => $value) {
                if (!empty($value)) {
                    #var_dump($value);
                    Item::insert($value);
                }
            }
            return back()->with('success','Insert Record successfully.');
        }
    }
    #return back()->with('error','Please Check your file, Something is wrong there.');
}
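If the row-by-row inserts turn out to be slow on bigger files, a possible variation (not part of the code above, and assuming the row keys already match the table's column names) is to batch the rows and insert them a few hundred at a time:

public function importExcel(Request $request)
{
    if ($request->hasFile('import_file')) {
        $path = $request->file('import_file')->getRealPath();
        $data = Excel::load($path, function($reader) {})->get();

        if (!empty($data) && $data->count()) {
            // Drop empty rows, then insert in batches of 500 so each query
            // is one multi-row INSERT instead of one query per row.
            $rows = array_filter($data->toArray());
            foreach (array_chunk($rows, 500) as $chunk) {
                Item::insert($chunk);
            }
            return back()->with('success','Insert Record successfully.');
        }
    }
}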
  • A couple thousand csv records work pretty well. I need this to work with between 50k and 100k records and I don't really want to increase the timeout. Anything bigger than 100k records starts to be kind of silly to do all in one csv, but anything up to that should be able to happen. For example, the csv import query I made (a shell script) could import all three of my example files (one with 1k records, one with 2k records, and one with 15k records) and put them all into the database in about 3 seconds. – wahyzcrak Oct 20 '17 at 15:53
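For files in that 50k-100k row range, the Laravel Excel 2.x documentation also describes a chunked reader, so the whole sheet never has to be loaded into memory at once. A rough sketch (untested here, reusing the same Item model and the $path from the controller above) might look like:

Excel::filter('chunk')->load($path)->chunk(250, function($results) {
    // $results holds 250 rows at a time; drop empty rows and
    // insert each batch as a single multi-row INSERT.
    $rows = array_filter($results->toArray());
    if (!empty($rows)) {
        Item::insert($rows);
    }
});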