Edit: Aha! I have just noticed the 1024-bit requirement. This makes it more complicated, but the idea remains the same. Instead of just having an int number, you need int number[32] (or long number[16], what have you).
The math at the borders is annoying, but not impossible. Let me know if you can't figure it out.
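To give a flavour of that border math, here is a minimal sketch of shifting such an array left by one bit, assuming 32 unsigned 32-bit words stored least-significant word first (NWORDS, shift_left_one and the uint32_t layout are my own illustrative choices, not a fixed API):

    #include <stdint.h>

    #define NWORDS 32  // 32 words * 32 bits = 1024 bits

    // Shift the whole multi-word number left by one bit, carrying the
    // top bit of each word across the "border" into the next word.
    void shift_left_one(uint32_t number[NWORDS]) {
        uint32_t carry = 0;
        for (int i = 0; i < NWORDS; i++) {
            uint32_t next_carry = number[i] >> 31; // bit that crosses the word boundary
            number[i] = (number[i] << 1) | carry;
            carry = next_carry;
        }
    }

Bringing in the next binary digit is then just ORing (curr - asciiZero) into number[0], same as in the single-int version below.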
This works for me. Decomposition and support for values larger than an (int) offers are left as an exercise for the reader...
#include <stdio.h> // only to print - not needed in computation

int main(int argc, char *argv[]) {
    if (argc < 2) {
        printf("Usage: %s <binary string>\n", argv[0]);
        return 1;
    }
    printf("Converting: %s\n", argv[1]);

    // Read the binary string, most significant bit first.
    int number = 0;
    char *binaryString = argv[1];
    int index = 0;
    int asciiZero = '0';
    char curr = binaryString[index];
    while (curr != '\0') {
        number = (number << 1) | (curr - asciiZero);
        index++;
        curr = binaryString[index];
    }
    printf("As number: %d\n", number);

    // Decompose into decimal digits, least significant first.
    int MAX_DIGITS = 12; // enough for a 32-bit int plus '\0' - adjust accordingly...
    char buffer[MAX_DIGITS];
    index = 0;
    if (number == 0) {
        buffer[index++] = (char) asciiZero; // special case: "0" has one digit
    }
    while (number > 0) {
        buffer[index] = (char) (number % 10) + asciiZero;
        index++;
        number = number / 10;
    }
    buffer[index] = '\0';

    // The digits came out backwards, so reverse them in place.
    for (int i = 0, j = index - 1; i < j; i++, j--) {
        char tmp = buffer[i];
        buffer[i] = buffer[j];
        buffer[j] = tmp;
    }
    printf("As string: %s\n", buffer);
    return 0;
}
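For example, compiled with something like gcc and run as ./a.out 1010, this prints Converting: 1010, then As number: 10, then As string: 10.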
If you want to support values larger than the primitives available to you can hold, you can make a struct containing multiple ints/longs/etc.
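As a rough sketch of what that could look like (the BigNum1024 name, the 32 x 32-bit least-significant-word-first layout, and bignum_divmod10 are assumptions for illustration, not a definitive design), dividing the whole thing by 10 is the same kind of border math as the shift above, just in the other direction:

    #include <stdint.h>

    // A fixed-width 1024-bit unsigned integer built from 32-bit words,
    // stored least significant word first. Purely illustrative.
    typedef struct {
        uint32_t words[32]; // 32 * 32 = 1024 bits
    } BigNum1024;

    // Divide the whole number by 10 in place and return the remainder.
    // Classic long division, one word at a time, most significant word first.
    uint32_t bignum_divmod10(BigNum1024 *n) {
        uint64_t rem = 0;
        for (int i = 31; i >= 0; i--) {
            uint64_t cur = (rem << 32) | n->words[i];
            n->words[i] = (uint32_t) (cur / 10);
            rem = cur % 10;
        }
        return (uint32_t) rem; // the next decimal digit
    }

Calling bignum_divmod10 repeatedly and collecting the remainders gives you the decimal digits least significant first, exactly like the single-int loop in the program above.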