I am sending data from a web page to another API that I've designed, which receives it as JSON. The page has a form, and the form data consists of two images plus some text fields. I buffer each image with Multer and reassign the original req.body fields to the buffered images; the API on the other end receives the entire req.body as JSON and saves it to a MongoDB database, where the "userPicture" field has the Buffer type.

The problem is that the request with the buffered images can't be sent. Since this is a Node.js project I've used Node's built-in https module, and I've also tried some other modules from NPM, but all of them give me the same error code: 413 (Payload Too Large).

How can I send the buffered images? Should I redesign the API on the other end to receive the raw images first and buffer them there? If so, how can I send the raw images to that API?

const express = require("express");
const multer = require("multer");
const sharp = require("sharp");
const https = require("https");

const app = express();

const upload = multer({
  limits: {
    fileSize: 2000000, // 2 MB per uploaded file
  },
  fileFilter(req, file, cb) {
    // Accept only common image extensions.
    if (!file.originalname.match(/\.(jpg|png|jpeg|svg)$/)) {
      return cb(new Error("Please upload an image"));
    }
    cb(null, true); // go ahead and accept the given upload
  },
});

// `.any()` accepts files from any field name; it takes no arguments.
app.post("/postData", upload.any(), async (req, res) => {

  // Resize both uploaded images to 300x300 PNGs and attach the
  // resulting buffers to the body that will be forwarded.
  let distroLogo = await sharp(req.files[0].buffer)
    .resize({
      width: 300,
      height: 300
    })
    .png()
    .toBuffer();

  req.body.logo = distroLogo

  let userPicture = await sharp(req.files[1].buffer)
    .resize({
      width: 300,
      height: 300
    })
    .png()
    .toBuffer();

  req.body.userProfilePicture = userPicture

  const data = JSON.stringify(req.body)
  const options = {
    hostname: '***API ADDRESS***',
    port: 443,
    path: '/**PATH**',
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // Measure the body in bytes, not characters.
      'Content-Length': Buffer.byteLength(data)
    }
  }

  // Name the response `apiRes` so it doesn't shadow the Express `res`.
  const reque = https.request(options, (apiRes) => {
    console.log(`statusCode: ${apiRes.statusCode}`)

    apiRes.on('data', (d) => {
      process.stdout.write(d)
    })
    // Reply to the original browser request once the API has answered.
    apiRes.on('end', () => res.sendStatus(apiRes.statusCode))
  })

  reque.on('error', (error) => {
    console.error(error)
  })

  reque.write(data)
  reque.end()
}, (error, req, res, next) => {
  // Multer and Sharp errors land in this error-handling middleware.
  res.status(400).send({
    error: error.message
  })
})

2 Answers

You can use the "Transfer-Encoding: chunked" header. If your requests aren't already compressed, you can also apply a Content-Encoding such as gzip, br, or deflate.
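A minimal sketch of that idea, assuming the receiving API parses its body with body-parser (which inflates gzip- and deflate-encoded bodies by default); the hostname, path, and body below are placeholders:

const https = require("https");
const zlib = require("zlib");

const data = JSON.stringify({ name: "example" /* ...body with image buffers... */ });

const req = https.request({
  hostname: "api.example.com", // placeholder
  port: 443,
  path: "/postData",           // placeholder
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Content-Encoding": "gzip",
    // No Content-Length header: Node then switches the request
    // to chunked transfer encoding automatically.
  },
}, (apiRes) => {
  console.log(`statusCode: ${apiRes.statusCode}`);
});

req.on("error", console.error);

// Compress the JSON and stream it into the request; ending the
// gzip stream ends the request as well.
const gzip = zlib.createGzip();
gzip.pipe(req);
gzip.end(data);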

I fixed the issue by using body-parser's limit and extended options; that is, I used the following code to raise body-parser's limit above its default value. Note that the limit has to be raised on the receiving API, since that is the side responding with the 413.

Added code:

const bodyParser = require("body-parser");

app.use(bodyParser.json({
  limit: "50mb" // raise the JSON body limit from the 100kb default
}));
app.use(bodyParser.urlencoded({
  limit: "50mb",
  extended: true,
  parameterLimit: 50000
}));

I used the following post to solve my problem:

Node.js Express.js bodyParser POST limit
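
As a side note, on Express 4.16 and later the same limits can be set with the JSON and urlencoded parsers built into Express itself, so the separate body-parser package isn't required:

app.use(express.json({
  limit: "50mb"
}));
app.use(express.urlencoded({
  limit: "50mb",
  extended: true,
  parameterLimit: 50000
}));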
