
Possible Duplicate:
Best Practices: Option Infer
What is the best way to mix VB.NET's Option Strict and the new Option Infer directives?

I am maintaining an old solution that was translated from VB6 to VB.NET.

Currently, the default options in the files are:

    Option Strict On
    Option Explicit On

I want to use LINQ, and found that it is easier when Option Infer is also On.

Less to write, and less (so easier) to read.
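
To make the difference concrete, here is a minimal sketch (the `Customer` type and the `customers` list are hypothetical stand-ins):

    ' With Option Infer On, the compiler deduces the element and result types:
    Dim adults = From c In customers
                 Where c.Age >= 18
                 Select New With {.Name = c.Name, .Age = c.Age}

    ' With Option Infer Off (and Option Strict On), every declaration needs an
    ' explicit type, and a query that projects an anonymous type cannot be
    ' assigned to a declared variable at all:
    Dim names As IEnumerable(Of String) =
        From c As Customer In customers
        Where c.Age >= 18
        Select c.Name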

However, part of the team (conservative, from my point of view) keeps Option Infer Off and insists on not using it at all, without explicitly explaining their reasons.

In your opinion, what are the "dangers" of using Option Infer On, along with the other two options (Strict and Explicit, both On)?

jmoreno
serhio
  • I always turn `Option Infer` off. But yes, I'm old school and "conservative" when it comes to things like this. I'd rather the compiler catch my errors than have to debug them at run time. I don't mind the extra typing; between IntelliSense and being able to type quickly, it's not much of a problem. – Cody Gray - on strike Dec 19 '11 at 15:32
  • As for the other two, there is absolutely no excuse: `Option Explicit` and `Option Strict` should **always** be on. – Cody Gray - on strike Dec 19 '11 at 15:33
  • @CodyGray, this is infer; infer works like `var` in C#, so errors are still caught at compile time. – Fredou Dec 19 '11 at 15:35
  • For all the people voting to close as "not constructive", I advise switching your vote to "exact duplicate": [Best Practices: Option Infer](http://stackoverflow.com/questions/667851/best-practices-option-infer) or [What is the best way to mix VB.NET's Option Strict and the new Option Infer directives?](http://stackoverflow.com/questions/194278/what-is-the-best-way-to-mix-vb-nets-option-strict-and-the-new-option-infer-dire) – Cody Gray - on strike Dec 19 '11 at 15:38
  • @Fredou: Right, I'm aware of that. I still like to be explicit. Unlike C#, VB.NET *doesn't* have a `var` or `auto` keyword. The syntax with Option Infer looks broken to me, but that's probably because I still work in VB 6 occasionally. I'm all for the compromise of turning Option Infer off at the project level, but selectively turning it on in individual code files where extensive use of LINQ means it actually improves readability. – Cody Gray - on strike Dec 19 '11 at 15:40
  • @Cody Gray: what about **objective** (not subjective) observations and differences? – serhio Dec 19 '11 at 15:51
  • I'm not really sure how this could be objective. The only objective thing would be contained in the documentation for the `Option Infer` directive. Whatever it *does* would be what changes by having it on or off, objectively speaking. Everything else is stylistic and inherently subjective. Not that there's anything *wrong* with talking about style. – Cody Gray - on strike Dec 19 '11 at 15:55
  • I hate static binding and explicit declarations. And if you have good test coverage (you do have test coverage, don't you?) then catching runtime errors shouldn't be a problem. – Wayne Werner Dec 19 '11 at 17:52
  • Wow, hating static binding and explicit declarations? I suspect someone might be a scripting language programmer... – Cody Gray - on strike Dec 20 '11 at 00:16
  • This question differs from the referenced question because it's about the dangers of Option Infer, not about whether it is a good practice. – jmoreno Jan 19 '19 at 11:25

3 Answers


Code written with Option Infer on is no different in performance or type safety from code written with the same types explicitly declared. With that in mind, the arguments I could come up with against Option Infer are:

  • Inconsistency between cases where the type must be specified and cases where it can be inferred (see the sketch after this list).

    • Class fields, for one, cannot be inferred, even if initialized inline.
    • Variables holding lambdas (Dim f = Function(x) ...) do not always infer types.
    • Variables that are not initialized must be given an explicit type.

    The strength of this argument is directly proportional to the consistency of style in your existing codebase. (For example, I sometimes still use underscores to continue lines when working with older code even when the newer compiler does not require them, if the rest of the code around it uses them.)

  • Sometimes the type is not immediately obvious when looking through code.

    Dim temp = Foo() 'temp has the type of Foo's return, which is...
    

    Workaround: Declare the variable's type when you feel the need.

    This is not a "danger" so much as a potential inconvenience, more so if you are working in an environment where IntelliSense cannot tell you the inferred type.

  • The inferred type may end up being more specific than you really want (also shown in the sketch after this list).

    Workaround: Specifically declare the type you want in that case.

    As the compiler catches the cases where this is an issue, I wouldn't call it a "danger" per se. The only case I can think of that the compiler does not catch is when you have different overloads of a method for the base and derived types, or are shadowing methods in a derived type. I would argue that either of those cases is a problem with the existing code and not with Option Infer.

  • The anonymous types that come up in LINQ queries can lead to larger methods than normal, since anonymous types cannot be passed between methods.

    Workaround: Define named types when this occurs and break up methods as normal.

    This is only a danger insofar as long methods are dangerous. The usual "how long is too long" discussions apply.

  • It makes me look less productive because there are fewer KB in my code files from all those type names I don't have to type. (OK, this one is a joke.)
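
To make the inconsistencies and the "too specific" case above concrete, here is a small sketch (all names are hypothetical):

    Option Strict On
    Option Explicit On
    Option Infer On

    Public Class Example
        ' Fields are never inferred, even when initialized inline:
        ' Private limit = 100        ' error: Option Strict On requires an As clause
        Private limit As Integer = 100

        Public Sub Demo()
            Dim count = 100          ' local variable: inferred as Integer
            Dim ratio = 0.5          ' inferred as Double, even if Decimal was wanted

            ' Uninitialized variables still need an explicit type:
            Dim total As Integer

            ' Inference picks the most specific type; if you want the base
            ' type, you have to declare it:
            Dim specific = New IO.MemoryStream()             ' IO.MemoryStream
            Dim general As IO.Stream = New IO.MemoryStream() ' IO.Stream

            ' A lambda-holding variable infers an anonymous delegate type;
            ' declaring the delegate type is often clearer:
            Dim square As Func(Of Integer, Integer) = Function(x) x * x
            total = square(count)
        End Sub
    End Class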

Gideon Engelberth

More and more languages infer the types of their variables: consider C#, F#, and a whole host of non-.NET languages as well.

You keep type safety with Option Infer on. People who like to specify their types explicitly can still do so. But sometimes that is next to impossible, and it definitely makes code harder to read, given the cryptic type names you end up with when using LINQ.
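
For example, a grouping query produces a type that is next to impossible to write out by hand (a sketch; `orders` and its fields are hypothetical):

    ' The result is a sequence of an anonymous type {CustomerName, Total};
    ' there is no type name you could put after an "As" clause here:
    Dim totalsByCustomer = From o In orders
                           Group By o.CustomerName Into Total = Sum(o.Amount)

If the result has to cross a method boundary, defining a small named type (as Gideon suggests) gives you an explicit signature again.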

I used to be old school. But when type inference entered the C# world, after a while I simply had to admit it: it improves coding speed and readability, and thus quality, and it makes your code easier to maintain. This doesn't mean you should stop specifying types altogether. In many cases it is still better to specify them, regardless of whether Infer is on or off. Again: for readability's sake.

Explain to the old-school people why you'd want it on by default, and that they can still type their type names if they want to.

Abel

In my case, I prefer having them all On,

but having Infer Off is "OK" too; you just need to type MORE ;-)

Fredou