I've missed posting here; you always teach me a lot!
So, I have an assignment for an embedded system to write two functions of my own: one that converts an integer to ASCII and one that converts ASCII back to an integer. They have very specific requirements.
Here are the provided function declarations that I should build on; they go in a data.h file:
#include <stdint.h>
uint8_t my_itoa(int32_t data, uint8_t * ptr, uint32_t base);
int32_t my_atoi(uint8_t * ptr, uint8_t digits, uint32_t base);
These functions are meant for basic data manipulation. Here is how they are going to be called from another file, course1.c, which is built together with data.c:
digits = my_itoa( num, ptr, BASE_16);
value = my_atoi( ptr, digits, BASE_16);
and:
digits = my_itoa( num, ptr, BASE_10);
value = my_atoi( ptr, digits, BASE_10);
There are certain requirements that apply to both functions:
- they should support base 2 all the way to base 16.
- string functions or libraries shouldn't be used.
- All operations need to be performed using pointer arithmetic, not array indexing.
- function needs to handle signed data.
for my_itoa:
- Integer-to-ASCII needs to convert data from a standard integer type into an ASCII string.
- The number you wish to convert is passed in as a signed 32-bit integer.
- Copy the converted character string to the uint8_t* pointer passed in as a parameter (ptr)
- The signed 32-bit number will have a maximum string size (Hint: Think base 2).
- You must place a null terminator at the end of the converted c-string
- Function should return the length of the converted data (including a negative sign). Example: my_itoa(1234, ptr, BASE_10) should return an ASCII string length of 5 (including the null terminator).
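From these requirements, here is how I currently picture my_itoa. This is only my own untested sketch of what the requirements seem to ask for, not the official solution: I figure the "maximum string size" hint means 32 binary digits plus a sign plus the null terminator, i.e. a 34-byte buffer in the worst case (base 2).

```c
#include <stdint.h>

/* Sketch of my_itoa: write 'data' as an ASCII string in 'base' (2..16)
 * into 'ptr', null-terminate it, and return the total length including
 * the null terminator. Pointer arithmetic only, no string.h. */
uint8_t my_itoa(int32_t data, uint8_t *ptr, uint32_t base)
{
    uint8_t *p = ptr;
    int negative = (data < 0);
    /* Work on an unsigned magnitude so INT32_MIN does not overflow. */
    uint32_t mag = negative ? (uint32_t)(-(int64_t)data) : (uint32_t)data;

    /* Emit digits least-significant first. */
    do {
        uint32_t d = mag % base;
        *p++ = (uint8_t)(d < 10 ? ('0' + d) : ('A' + (d - 10)));
        mag /= base;
    } while (mag != 0);

    if (negative)
        *p++ = '-';
    *p++ = '\0';

    /* The characters (including the sign) came out reversed;
     * swap them in place, excluding the null terminator. */
    {
        uint8_t *lo = ptr, *hi = p - 2;
        while (lo < hi) {
            uint8_t tmp = *lo;
            *lo++ = *hi;
            *hi-- = tmp;
        }
    }
    return (uint8_t)(p - ptr);  /* length including '\0' */
}
```

If this is right, my_itoa(1234, ptr, BASE_10) fills the buffer with "1234" and returns 5, matching the example above.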
for my_atoi:
- ASCII-to-Integer needs to convert data back from an ASCII represented string into an integer type.
- The character string to convert is passed in as a uint8_t * pointer (ptr).
- The number of digits in your character set is passed in as a uint8_t integer (digits).
- The converted 32-bit signed integer should be returned.
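Similarly, this is my mental model of what my_atoi is supposed to do: again just a sketch of my own, with no string.h and only pointer arithmetic, assuming digit characters are '0'-'9' and uppercase 'A'-'F' (that assumption is mine, not from the handout):

```c
#include <stdint.h>

/* Sketch of my_atoi: parse up to 'digits' characters at 'ptr'
 * (possibly starting with '-') as a number in 'base' and return it.
 * 'digits' is what my_itoa returned, so it includes the terminator. */
int32_t my_atoi(uint8_t *ptr, uint8_t digits, uint32_t base)
{
    int32_t result = 0;
    int negative = 0;
    uint8_t *p = ptr;

    if (*p == '-') {
        negative = 1;
        p++;
    }
    /* Stop at the null terminator or after 'digits' characters. */
    while (*p != '\0' && (p - ptr) < (int)digits) {
        uint8_t c = *p++;
        uint32_t d;
        if (c >= '0' && c <= '9')
            d = c - '0';
        else if (c >= 'A' && c <= 'F')
            d = c - 'A' + 10;
        else
            break;              /* not a recognized digit: stop */
        if (d >= base)
            break;              /* digit out of range for this base */
        result = result * (int32_t)base + (int32_t)d;
    }
    return negative ? -result : result;
}
```

The key idea I take from this is that no strlen is needed at all: the loop just walks the pointer until it hits '\0'.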
After researching this thread: Writing my own atoi function
I was able to write the following code in my data.c:
#include <stdlib.h>
#include <string.h>
#include "data.h"

int main() {
    return 0;
}

uint8_t my_itoa(int32_t data, uint8_t *ptr, uint32_t base)
{
    return *ptr;
}

int32_t my_atoi(uint8_t *ptr, uint8_t digits, uint32_t base)
{
    const char *str = ptr;
    uint8_t len = strlen(str);
    str = (uint8_t *) malloc(len * sizeof(uint8_t));
    while (*str != '\0')
    {
        uint8_t a;
        a = *str - '0';
        *ptr = a;
        str++;
        ptr++;
    }
    str = str - len;
    ptr = ptr - len;
    return *ptr;
}
As I understand it, this part converts an ASCII digit character into its numeric value by subtracting the character code of '0':
a = *str - '0';
The main problem with this code is that when I compile it, I get errors in my_atoi saying the pointer targets differ in signedness, both in the initialization and in the assignment:
src/data.c: In function ‘my_atoi’:
src/data.c:20:19: error: pointer targets in initialization differ in
signedness [-Werror=pointer-sign]
const char* str= ptr;
^~~
src/data.c:22:6: error: pointer targets in assignment differ in
signedness [-Werror=pointer-sign]
str = (uint8_t*) malloc(len * sizeof(uint8_t));
^
and when I edit it to this:
uint8_t len = strlen(str);
I get an error that the pointer target in passing argument 1 of strlen differs in signedness, and a note from string.h that strlen expects a const char *:
src/data.c: In function ‘my_atoi’:
src/data.c:21:19: error: pointer targets in passing argument 1 of
‘strlen’ differ in signedness [-Werror=pointer-sign]
int len = strlen(str);
^~~
In file included from src/data.c:3:0:
/usr/include/string.h:384:15: note: expected ‘const char *’ but
argument is of type ‘unsigned char *’
extern size_t strlen (const char *__s)
^~~~~~
src/data.c:22:6: error: pointer targets in assignment differ in
signedness [-Werror=pointer-sign]
str = (char*) malloc(len * sizeof(char));
^
There is also the fact that I need to get rid of strlen and string.h altogether. I looked at this thread: Converting ASCII to Hex and vice versa - strange issue, but it was not very useful to me; I don't actually understand the code in it.
Another challenge is the third parameter, which should be BASE_10, BASE_16, BASE_2, etc. How can the variable
uint32_t base
which is an integer, store a value like BASE_10 so that I can compare it in an if conditional? And how should I handle the conversion for the different bases?
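My current guess about the base question is that BASE_2, BASE_10, and BASE_16 are just preprocessor macros for plain integers, something like the following. These definitions are my assumption, not from the assignment handout:

```c
#include <stdint.h>

/* If the macros are defined like this, then 'uint32_t base' simply
 * receives the integer 2, 10, or 16; there is no string "BASE_10"
 * stored anywhere. */
#define BASE_2  2
#define BASE_10 10
#define BASE_16 16

/* Comparing in an if conditional is then a plain integer compare. */
static int is_hex_base(uint32_t base)
{
    return base == BASE_16;
}
```

So a call like my_itoa(num, ptr, BASE_16) would just pass the number 16, and the function can use that value directly in its modulo/divide arithmetic.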
I'm stuck on several fronts: how to approach this challenge, where to learn more, what the maximum string size requirement means, and how to handle signed data. I'm seeking help. Sorry for the long question, but I needed to include all the factors.
EDIT: after editing the code like this:
uint8_t* str= ptr;
uint8_t len = strlen((const char *)str);
following this thread: cast unsigned char * (uint8_t *) to const char *
I have no errors now, but I still need to get rid of string.h completely.
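To drop string.h entirely, I think the strlen call could be replaced by a small pointer-walking helper of my own, something like this (my own hypothetical sketch, name and all):

```c
#include <stdint.h>

/* Hypothetical replacement for strlen using only pointer arithmetic:
 * walk forward until the null terminator and return how many
 * characters came before it. */
static uint8_t my_strlen(const uint8_t *ptr)
{
    const uint8_t *p = ptr;
    while (*p != '\0')
        p++;
    return (uint8_t)(p - ptr);  /* length excluding '\0' */
}
```

With a helper like this, both the string.h include and the signedness casts around strlen could go away, since it takes uint8_t * directly.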