
Thank you for the response. I will give it a try and update my question. I have my own code, but it is a bit messy to show in full. My problem is that I do not get the indexes right.

I use:

$products = array();
$lines = file('data_stock.csv', FILE_IGNORE_NEW_LINES);

foreach ($lines as $key => $value) {
    $products[$key] = str_getcsv($value);
}

I manage to read the data, but this line gives me an error:

 if ((int)$products[$_sku] > 0 && isset($products[$_sku])) {

Error: Notice: Undefined index: test-product-1 in.... The 'test-product-1' is from the sku column in the CSV file.

Output from

echo '<pre>';
print_r($products);
echo '</pre>';

gives:

Array
(
[0] => Array
    (
        [0] => sku
        [1] => qty
    )

[1] => Array
    (
        [0] => test-product-1
        [1] => 3
    )

[2] => Array
    (
        [0] => test-product-2
        [1] => 6
    )

[3] => Array
    (
        [0] => test-product-3
        [1] => 30
    )

)

I am trying to import a CSV file into an array to replace

$products = [
'test-product-1' => 3,
'test-product-2' => 6,
'test-product-3' => 30
];

But I cannot produce the same array when I import from the CSV file, which causes problems. Examples for CSV to array: http://php.net/manual/en/function.str-getcsv.php

CSV file:

sku,qty
test-product-1,3
test-product-2,6
test-product-3,30
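For reference, a minimal sketch of how that exact sku => qty mapping could be built from this CSV layout (the data is inlined here for illustration; in practice the lines would come from file('data_stock.csv', FILE_IGNORE_NEW_LINES)):

```php
<?php
// Sample rows matching the CSV above, inlined for illustration.
$lines = ['sku,qty', 'test-product-1,3', 'test-product-2,6', 'test-product-3,30'];
array_shift($lines); // drop the "sku,qty" header row

$products = [];
foreach ($lines as $line) {
    [$sku, $qty] = str_getcsv($line);
    $products[$sku] = (int) $qty;
}
// $products is now ['test-product-1' => 3, 'test-product-2' => 6, 'test-product-3' => 30]
```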

The next step is to extend the script to handle prices. I need to be able to pick up these variables from the CSV file too, and use them inside the for loop.

sku,qty,price,special_price
test-product-1,3,100,50
test-product-2,6,99,
test-product-3,30,500,300
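Not part of the question's code, but a sketch of how those extra columns could be picked up once each row is keyed by its header name via array_combine() (data inlined for illustration; note that the empty special_price cell of test-product-2 comes through as an empty string):

```php
<?php
// Extended CSV from above, inlined for illustration.
$lines = [
    'sku,qty,price,special_price',
    'test-product-1,3,100,50',
    'test-product-2,6,99,',
    'test-product-3,30,500,300',
];
$headers = str_getcsv(array_shift($lines));

$products = [];
foreach ($lines as $line) {
    // Pair each header name with its value for this row.
    $row = array_combine($headers, str_getcsv($line));
    $products[$row['sku']] = $row;
}

// Inside the loop, each column is available by name; an empty
// special_price arrives as '' and can be treated as "no special price".
foreach ($products as $sku => $row) {
    $price   = (float) $row['price'];
    $special = $row['special_price'] !== '' ? (float) $row['special_price'] : null;
}
```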
M2Newbie
    A good starting point to create your own code would be https://stackoverflow.com/questions/1269562/how-to-create-an-array-from-a-csv-file-using-php-and-the-fgetcsv-function. Try that and see how far you get, then ask for help if you are stuck at any point. – Nigel Ren Nov 06 '18 at 07:58
  • Thank you. I updated my question with some code I use and what kind of error I get – M2Newbie Nov 06 '18 at 09:12

2 Answers


I used the following code in my project and it's working fine for me.

I used the csv_reader PHP library for it. You have to put this library in your library folder and include it in the file where you want to read your CSV.

include_once('../csv_reader.php');

$read = new CSV_Reader;

$read->strFilePath = "file_name_with_path";
$read->strOutPutMode = 0;  // 1 will show as HTML, 0 will return an array

$read->setDefaultConfiguration();
$read->readTheCsv();

$dataArr = $read->arrOutPut;

In $dataArr, I will get the result.

Madhuri Patel

I think the problem is that when you store the row, you are storing it indexed by the row number ($key is the line number in the file). Instead, you want to index it by the first column of the CSV file. So extract the data first (using str_getcsv() as you do already) and index by the first column ([0])...

$products = array();
$lines = file('data_stock.csv', FILE_IGNORE_NEW_LINES);

foreach ($lines as $value)
{
    $data = str_getcsv($value);
    $products[$data[0]] = $data;
}
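A note on using this result (the variable names here are illustrative): the rows are now keyed by sku, but each value is still a numeric array, so qty sits at index 1 and isset() should be checked before the cast, which is what avoids the "Undefined index" notice from the question:

```php
<?php
// Minimal stand-in for the array built above
// (rows keyed by sku, columns still numerically indexed).
$products = ['test-product-1' => ['test-product-1', '3']];
$_sku = 'test-product-1';

// isset() first, so an unknown sku never triggers a notice.
if (isset($products[$_sku]) && (int) $products[$_sku][1] > 0) {
    echo "{$_sku} is in stock\n";
}
```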

If you want to add the first row as a header and use it to key the data...

$products = array();
$lines = file('data_stock.csv', FILE_IGNORE_NEW_LINES);
$headers = str_getcsv(array_shift($lines));

foreach ($lines as $value) {
    $data = str_getcsv($value);
    $products[$data[0]] = array_combine($headers, $data);
}

This removes the first row of the array using array_shift() and then uses that row in array_combine() as the keys for each row. With your test data, you would get something like...

Array
(
    [test-product-1] => Array
        (
            [sku] => test-product-1
            [qty] => 3
            [price] => 100
            [special_price] => 50
        )
    ...
)
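With that structure, the failing check from the question could be rewritten along these lines (sample data hard-coded to match the output above; note isset() goes before the array access):

```php
<?php
// Hard-coded sample matching the header-keyed output above.
$products = [
    'test-product-1' => ['sku' => 'test-product-1', 'qty' => '3'],
    'test-product-2' => ['sku' => 'test-product-2', 'qty' => '6'],
];

$_sku = 'test-product-1';
// Check existence first, then read qty by column name.
if (isset($products[$_sku]) && (int) $products[$_sku]['qty'] > 0) {
    echo "{$_sku}: qty {$products[$_sku]['qty']}\n";
}
```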
Nigel Ren
  • Hi Nigel, This was exactly what I needed. A small change in the code and some basics when reading from the array. Now my headache is gone :) Thank you so much. – M2Newbie Nov 06 '18 at 11:43