
I'm a week into an intro programming class, and I'm having trouble fixing what's supposed to be a relatively simple program. I keep getting an invalid type argument of unary '*' error.

```c
#include <stdio.h>
#define PI 3.14159; 
int main()
{
   float r;
   float area;
   scanf("%f", &r);
   area = PI * r * r;
   printf("Area is %f", area);
   return 0; 
}
```

Could someone explain this, and how to fix it?

Lidong Guo

2 Answers

```c
#define PI 3.14159; 
                  ^
```

Drop the semicolon. Leaving it in, the code will expand to:

```c
area = 3.14159; * r * r;
```
cnicutar
  • Or better yet, stop using #define. – vipw Sep 03 '13 at 21:55
  • @RobertAleksanderYevdokimov Don't `#define PI 3.14159` in the first place. `#include <math.h>` and use `M_PI`. Generally: don't reinvent the wheel; use the standard library whenever possible. –  Sep 03 '13 at 21:56
  • `M_PI` isn't part of standard C, is it? – Carl Norum Sep 03 '13 at 21:58
  • Indeed it isn't: http://stackoverflow.com/questions/5007925/using-m-pi-with-c89-standard – Carl Norum Sep 03 '13 at 22:01
  • @CarlNorum **Must not**. Didn't think of that :-) – cnicutar Sep 03 '13 at 22:01
  • @RyanAmos: What's wrong with using a macro where none is needed is that it circumvents the language, opening up weird errors like this. If the OP had instead written `const float PI = 3.14159f;`, this question wouldn't exist. – Chuck Sep 03 '13 at 23:03

You have to remove the extra `;` in the definition of the macro `PI`. It is unnecessary in a macro definition and, in your case, results in a syntax error after expansion.

rems4e