I'll abstract away things like grid width vs. viewport width, etc., because they're immaterial to the problem. You basically want to find out how many items of a given dimension (cardWidth * cardHeight) will fit into a container of another given dimension (totalWidth * totalHeight), provided that there needs to be a specified amount of spacing between the items (gutterSize).
You first need to determine the number of columns, i.e. how many cards will fit in a row. Basically, you're trying to find the maximum cols value that satisfies the following inequality:
(cols * cardWidth) + ((cols - 1) * gutterSize) <= totalWidth
Expanding the left-hand side gives cols * (cardWidth + gutterSize) - gutterSize <= totalWidth, so the inequality can be rewritten as:
cols <= (totalWidth + gutterSize) / (cardWidth + gutterSize)
In JavaScript this would be:
const totalWidth = 500;
const cardWidth = 40;
const gutterSize = 25;
const cols = Math.floor((totalWidth + gutterSize) / (cardWidth + gutterSize));
console.log(cols); // 8
The logic for the number of rows is exactly the same. You need to find the maximum rows value that satisfies:
(rows * cardHeight) + ((rows - 1) * gutterSize) <= totalHeight
Which can be rewritten as:
rows <= (totalHeight + gutterSize) / (cardHeight + gutterSize)
Or in JavaScript:
const totalHeight = 800;
const cardHeight = 60;
const gutterSize = 25;
const rows = Math.floor((totalHeight + gutterSize) / (cardHeight + gutterSize));
console.log(rows); // 9
Then, calculating the number of cards is just a matter of multiplying both values:
cards = rows * cols
Putting everything together, the JavaScript looks like this:
const totalWidth = 500;
const totalHeight = 800;
const cardWidth = 40;
const cardHeight = 60;
const gutterSize = 25;
const cols = Math.floor((totalWidth + gutterSize) / (cardWidth + gutterSize));
const rows = Math.floor((totalHeight + gutterSize) / (cardHeight + gutterSize));
const cards = rows * cols;
console.log(cards); // 72
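If you need this in more than one place, the same arithmetic can be wrapped in a small reusable helper. This is just a sketch: the countFittingCards name and its options-object signature are my own, not from any particular library, and it assumes non-negative numeric inputs:
// Hypothetical helper: returns how many whole cards fit in the container,
// given the card size and the gutter between adjacent cards.
const countFittingCards = ({ totalWidth, totalHeight, cardWidth, cardHeight, gutterSize }) => {
  // Maximum number of whole columns and rows, as derived above.
  const cols = Math.floor((totalWidth + gutterSize) / (cardWidth + gutterSize));
  const rows = Math.floor((totalHeight + gutterSize) / (cardHeight + gutterSize));
  // If the container is narrower or shorter than a single card,
  // cols or rows is 0, so the product is simply 0.
  return cols * rows;
};

console.log(countFittingCards({
  totalWidth: 500,
  totalHeight: 800,
  cardWidth: 40,
  cardHeight: 60,
  gutterSize: 25,
})); // 72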