I need some help.

P.S. in advance: the code works, but it does not work reliably under load testing.

Overall, the PHP code receives a file from the user and saves it on a cluster (4 nodes) as three parts.

The simplified code looks like this:

$user_file = get_full_filename_of_userfile_from_post_array('user_file');

$fin = fopen($user_file,'rb');

$fout1 = fopen(get_uniq_name('p1'),'wb');
$fout2 = fopen(get_uniq_name('p2'),'wb');
$fout3 = fopen(get_uniq_name('p3'),'wb');

// fread() requires a chunk length, so read the upload in fixed-size chunks
while (!feof($fin))
{
    $part = fread($fin, 8192);
    fwrite($fout1, get_1_part($part));
    fwrite($fout2, get_2_part($part));
    fwrite($fout3, get_3_part($part));
}
fclose($fin);
fclose($fout1);
fclose($fout2);
fclose($fout3);

$location = get_random_nodes(3,$array_of_existing_nodes);

foreach ($location as $key => $node) // key 1..3
{
    if (is_local_node($node))
    {
        save_local_file(get_part_file_name($key));
    }
    else
    {
        save_remote_file(get_part_file_name($key), $node);
    }
}

//delete tmp files, logs,...etc

save_remote_file() uses cURL to send the file via POST, like in the docs:

$post_data = array(
    'md5sum' => $md5sum,
    'ctime'  => $ctime,
    // ...
    'file'   => '@' . $upload_file_full_name,  // legacy '@' upload syntax
);

$ch = curl_init();
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, Config::get('connect_curl_timeout'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/4.0 (compatible;)");
curl_setopt($ch, CURLOPT_URL, $URL.'/index.php');
curl_setopt($ch, CURLOPT_PORT, Config::get('NODES_PORT'));
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_FAILONERROR, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $post_data);
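
For reference, on PHP 5.5+ the same upload can be written with CURLFile instead of the legacy '@' prefix. This is only a sketch (it reuses $URL, $upload_file_full_name and the Config values from above), but it means cURL never has to parse an '@'-prefixed string as a file path:

// Sketch only, assuming PHP >= 5.5; reuses $URL, $upload_file_full_name and Config from above.
$post_data = array(
    'md5sum' => $md5sum,
    'ctime'  => $ctime,
    'file'   => new CURLFile($upload_file_full_name),
);

$ch = curl_init($URL . '/index.php');
curl_setopt($ch, CURLOPT_PORT, Config::get('NODES_PORT'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $post_data);
$response = curl_exec($ch);
if ($response === false) {
    error_log('curl error ' . curl_errno($ch) . ': ' . curl_error($ch));
}
curl_close($ch);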

During the test I upload 14000 files, one file per request, 10 requests per node (in parallel). The PHP code checks the file, returns an answer to the client, and then saves the file on the cluster in the background. (Yes, I know it would be nicer to have a daemon for the saving; that is a task for the future.)

So at times there may be 100 or even 200 PHP processes per node running in the background (using the php-fpm mechanism):

ignore_user_abort(true); // keep running even if the client disconnects
set_time_limit(0);       // no time limit for the background save
session_commit();        // release the session lock

if (function_exists('fastcgi_finish_request'))
    fastcgi_finish_request(); // flush the response to the client, keep working in the background

A few calculations: 14000 files = 14000 * 3 = 42000 parts, saved on 3 random nodes out of 4, so 25% of the parts are saved locally and 75% remotely: 0.75 * 42000 = 31500 remote saves.

During the test I get about 100 errors across all the nodes from cURL: errno = 26, error = couldn't open file "##ІИP_zOж лЅHж". That is strange, because the original file name is about 124 characters long, for example:

/var/vhosts/my.domains.com/www/process/r_5357bc33f3686_h1398258739.9968.758df087f8db9b340653ceb1abf160ae8512db03.chunk0.part2

Before the cURL code I added checks: file_exists($upload_file_full_name) and is_readable($upload_file_full_name); if they fail, it is logged.

The checks pass fine, but cURL still returns the error (about 100 times out of the 31500 requests).
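
The check before the upload is roughly the following sketch (log_error() here is a hypothetical stand-in for whatever logging the project actually uses):

// Sketch of the pre-upload check described above; log_error() is a hypothetical logging helper.
if (file_exists($upload_file_full_name) && is_readable($upload_file_full_name)) {
    // proceed with the cURL upload shown above
} else {
    log_error("part file missing or unreadable: $upload_file_full_name");
}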

I also added retry code: on error, wait 10 seconds and try again, wait 10 seconds, try, wait 10, try.

Whenever the first try fails, all the following tries fail too, but according to the logs, other PHP processes saving other files at the same time send their parts via cURL just fine.
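
The retry loop is roughly the following sketch (try_curl_upload() and log_error() are hypothetical helpers wrapping the cURL upload and logging shown earlier):

// Sketch of the retry logic described above; try_curl_upload() and log_error() are hypothetical.
$ok = false;
for ($attempt = 1; $attempt <= 4 && !$ok; $attempt++) {
    $ok = try_curl_upload($upload_file_full_name, $node);
    if (!$ok && $attempt < 4) {
        sleep(10); // wait 10 seconds before the next attempt
    }
}
if (!$ok) {
    log_error("upload failed after 4 attempts: $upload_file_full_name");
}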

So I don't understand how I can find the reason and fix it.

New information: I wrote that the error appears during the load test. I guessed that maybe the error always appears (about one error per 100 good requests), so I ran the test with one request per node with a delay, i.e. without loading the system, in simple single-request mode. And yes, the error appears there too. – user3032727 May 05 '14 at 09:59
