I'm writing a program in C to print Pascal's triangle. I wrote a function for the factorial, then set the variable c equal to the binomial coefficient formula n! / (r! * (n - r)!). This works up until the row n = 13, where it produces the output:
1
1 1
1 2 1
1 3 3 1
1 4 6 4 1
1 5 10 10 5 1
1 6 15 20 15 6 1
1 7 21 35 35 21 7 1
1 8 28 56 70 56 28 8 1
1 9 36 84 126 126 84 36 9 1
1 10 45 120 210 252 210 120 45 10 1
1 11 55 165 330 462 462 330 165 55 11 1
1 12 66 220 495 792 924 792 495 220 66 12 1
0 4 24 88 221 399 532 532 399 221 88 24 4 0
with the last line being n = 13. I originally thought it was breaking because the factorials would be too big for 32-bit integers, but the error is still present with 64-bit integers.
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>
#include <inttypes.h>

int64_t factorial(int k);

int main(int argc, char **argv)
{
    int c, n, r, i;

    if (argc < 2) {
        printf("[function]: %s [number of lines to print]", argv[0]);
        return -1;
    }
    n = atoi(argv[1]);
    for (i = 0; i <= n; i++) {
        for (r = 0; r <= i; r++) {
            c = (int) factorial(i) / (factorial(r) * factorial((i - r)));
            printf("%d ", c);
        }
        printf("\n");
    }
    return 1;
}

int64_t factorial(int k)
{
    int64_t j;

    if (k == 0 || k == 1)
        return 1;
    int n = k;
    for (j = k; n > 1;)
        j = j * --n;
    return j;
}
I would greatly appreciate some help with this...