I spent quite a bit of time getting Laravel 9 Browsershot working on a job queue server running PHP 8.2 on Arm Ubuntu 20.04 on AWS EC2. I finally got it working and am documenting the steps here in case they help anyone else googling this.
The main reason it was so hard was that I was unable to install the local Chromium build that ships with Puppeteer. In my case the only thing that worked was the regular snap distribution of Chromium.
The problem with snap is that it only works when run by a real user with a home directory under /home. Unfortunately, www-data is not set up that way by default. So to make it work I needed to convert www-data into a full user and make nginx operate from /home/www-data rather than /var/www.
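For reference, here is a quick way to see the stock setup that trips snap up (the values shown in the comments are Ubuntu's defaults; yours may differ):

```shell
# Print www-data's home dir and shell from its passwd entry.
# On stock Ubuntu these are /var/www and /usr/sbin/nologin, which is
# exactly the kind of account snap-packaged Chromium refuses to run under.
entry="$(getent passwd www-data || true)"
printf 'home:  %s\n' "$(printf '%s' "$entry" | cut -d: -f6)"
printf 'shell: %s\n' "$(printf '%s' "$entry" | cut -d: -f7)"
```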
Note: I am assuming you have already installed PHP 8.2 and your Laravel project is set up with Browsershot.
- Upgrade everything & install Node plus Chromium's runtime dependencies
sudo apt-get update && sudo apt-get -y upgrade
curl -sL https://deb.nodesource.com/setup_14.x | sudo -E bash -
sudo apt-get install -y nodejs gconf-service libasound2 libatk1.0-0 libc6 libcairo2 libcups2 libdbus-1-3 libexpat1 libfontconfig1 libgbm1 libgcc1 libgconf-2-4 libgdk-pixbuf2.0-0 libglib2.0-0 libgtk-3-0 libnspr4 libpango-1.0-0 libpangocairo-1.0-0 libstdc++6 libx11-6 libx11-xcb1 libxcb1 libxcomposite1 libxcursor1 libxdamage1 libxext6 libxfixes3 libxi6 libxrandr2 libxrender1 libxss1 libxtst6 ca-certificates fonts-liberation libappindicator1 libnss3 lsb-release xdg-utils wget libgbm-dev libxshmfence-dev
- Set up the "real" www-data user
sudo mkdir /home/www-data
sudo usermod -d /home/www-data -s /bin/bash www-data
- Install the snap Chromium
sudo apt-get install -y chromium-browser
- Set up the temp write dirs required by Browsershot & Chromium
sudo mkdir -p /home/www-data/browsershot-html
sudo mkdir -p /home/www-data/user-data/Default/Cache
sudo chown -R www-data:www-data /home/www-data
- Install your project in /home/www-data/your-laravel-project and make sure nginx is configured to serve from that directory
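For the nginx side, here is a minimal sketch of a server block rooted in the new location (server_name, the project path, and the PHP-FPM socket path are assumptions; match them to your setup):

```nginx
server {
    listen 80;
    server_name example.com;

    # Serve the Laravel public dir from www-data's home
    root /home/www-data/your-laravel-project/public;
    index index.php;

    location / {
        try_files $uri $uri/ /index.php?$query_string;
    }

    location ~ \.php$ {
        include snippets/fastcgi-php.conf;
        fastcgi_pass unix:/run/php/php8.2-fpm.sock;
    }
}
```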
- Install Puppeteer
cd /home/www-data/your-laravel-project
sudo npm install --location=global --unsafe-perm puppeteer --ignore-scripts
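Before wiring anything into Laravel, it's worth confirming that the binaries Browsershot will shell out to actually resolve. A small sketch (the binary names match what this setup installs):

```shell
# Report whether each binary Browsershot needs is on the PATH.
check() {
    if command -v "$1" >/dev/null 2>&1; then
        echo "ok $1"
    else
        echo "missing $1"
    fi
}
check node
check npm
check chromium-browser
```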
- Create a test console script in your Laravel project
<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Spatie\Browsershot\Browsershot;

class TestBrowsershot extends Command
{
    /**
     * The name and signature of the console command.
     *
     * @var string
     */
    protected $signature = 'test:browsershot';

    /**
     * The console command description.
     *
     * @var string
     */
    protected $description = 'Test Browsershot';

    /**
     * Execute the console command.
     *
     * @return mixed
     */
    public function handle()
    {
        Browsershot::html('<h1>Hello world</h1>')
            ->setOption('args', ['--disable-web-security'])
            ->ignoreHttpsErrors()
            ->noSandbox()
            ->showBackground()
            ->setOption('scale', 0.9)
            ->emulateMedia('print')
            ->setNodeBinary('/usr/bin/node')
            ->setNpmBinary('/usr/bin/npm')
            ->setChromePath('chromium-browser')
            ->setCustomTempPath('/home/www-data/browsershot-html')
            ->addChromiumArguments([
                'lang' => "en-US,en;q=0.9",
                'hide-scrollbars',
                'enable-font-antialiasing',
                'force-device-scale-factor' => 1,
                'font-render-hinting' => 'none',
                'user-data-dir' => '/home/www-data/user-data',
                'disk-cache-dir' => '/home/www-data/user-data/Default/Cache',
            ])
            ->save('hello.pdf');
    }
}
Note: Some of these options open security holes when you load external sites. In my case I am only feeding in HTML that I generate myself, so that's not an issue. If you are taking shots of external sites, you may want a sandboxed machine with no other code or access to databases etc. (Or find a way to make it work without the security holes noSandbox() and --disable-web-security.)
- Finally, test via the www-data user (which is how nginx will be running)
sudo su www-data
cd /home/www-data/your-laravel-project
php artisan test:browsershot
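If the command exits cleanly, hello.pdf should appear in the project dir. A quick sanity check is the file's magic bytes, since every PDF starts with "%PDF":

```shell
# Check whether a file carries the PDF magic bytes ("%PDF").
is_pdf() {
    [ "$(head -c 4 "$1" 2>/dev/null)" = "%PDF" ]
}

if is_pdf hello.pdf; then
    echo "hello.pdf looks like a PDF"
else
    echo "hello.pdf missing or not a PDF"
fi
```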
Good luck!