#define STRIP0 0
#define STRIP1 1
#define STRIP2 2
#define STRIP3 3
#define PINS0 2,3,4
#define PINS1 5,6,7
#define PINS2 8,9,10
#define PINS3 11,12,13
#define PINS(STRIP) { (STRIP) == (STRIP0) ? PINS0 : \
                      (STRIP) == (STRIP1) ? PINS1 : \
                      (STRIP) == (STRIP2) ? PINS2 : PINS3 }
Now, if I call a function that takes three arguments, all of type int, declared as foo(int, int, int);, directly with one of the pin lists, like foo(PINS1);, then it compiles and works as expected: all arguments are passed as if the #define had been replaced by "5,6,7".
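For reference, the preprocessor output for that call (e.g. from gcc -E; I am assuming nothing beyond the defines above, and exact spacing may differ) is simply:

foo(5,6,7);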
But if I use the macro for selecting the set of pins, like foo(PINS(STRIP1));, the argument selection goes haywire: in this specific case the faulty argument list becomes "7,12,13", and for foo(PINS(STRIP0)); it becomes "4,12,13". There is a pattern that I can see, but I don't have the expertise to tell, and rectify, what is happening at compile time.
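Running the failing call through the preprocessor the same way shows what the compiler actually gets to parse:

foo({ (1) == (0) ? 2,3,4 : (1) == (1) ? 5,6,7 : (1) == (2) ? 8,9,10 : 11,12,13 });

My guess at the pattern: the middle operand of ?: may contain comma operators, but the operand after each : cannot, so only the 11 belongs to the conditional and the trailing ",12,13" ends up separating arguments of the call. If I drop the outer braces (I am not sure they belong in the macro at all) and add parentheses by hand according to that reading, the call would group like this:

foo(((1) == (0) ? (2,3,4) : ((1) == (1) ? (5,6,7) : ((1) == (2) ? (8,9,10) : 11))), 12, 13);

The comma operator in (5,6,7) yields 7, which would match the "7,12,13" I observe, and (2,3,4) yielding 4 would match "4,12,13". Is that reading correct, and how do I rewrite the macro so that the whole pin list survives intact?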