281

I am doing some research into common errors and poor assumptions made by junior (and perhaps senior) software engineers.

What was your longest-held assumption that was eventually corrected?

For example, I wrongly assumed that the size of an integer was standard, when in fact it depends on the language and target. A bit embarrassing to state, but there it is.
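
A minimal Java sketch of that example (purely illustrative): Java pins its primitive widths down in the language specification, whereas in C the width of int is implementation-defined and varies with the compiler and target.

    // Java fixes primitive sizes in the spec; in C, sizeof(int) * CHAR_BIT
    // depends on the implementation and the target platform.
    public class IntegerSizes {
        public static void main(String[] args) {
            System.out.println("int:  " + Integer.SIZE + " bits"); // always 32 in Java
            System.out.println("long: " + Long.SIZE + " bits");    // always 64 in Java
        }
    }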

Be frank; what firm belief did you have, and roughly how long did you maintain the assumption? It can be about an algorithm, a language, a programming concept, testing, or anything else about programming, programming languages, or computer science.

skaffman
Demi
  • 3
    You may be interested http://doi.acm.org/10.1145/1364782.1364795 http://doi.acm.org/10.1145/984458.984495 http://doi.acm.org/10.1145/1142031.1142053 – Simon Gibbs May 20 '09 at 14:28

195 Answers

545

For a long time I assumed that everyone else had this super-mastery of all programming concepts (design patterns, the latest new language, computational complexity, lambda expressions, you name it).

Reading blogs, Stack Overflow and programming books always seemed to make me feel that I was behind the curve on the things that all programmers must just know intuitively.

I've realized over time that I'm effectively comparing my knowledge to the collective knowledge of many people, not a single individual and that is a pretty high bar for anyone. Most programmers in the real world have a cache of knowledge that is required to do their jobs and have more than a few areas that they are either weak or completely ignorant of.

Peter Mortensen
JohnFx
  • 68
    So true! That's the problem of this age. Information is also discouraging. I had this revelation a few weeks ago when I felt like a complete loser in everything I did (not the first time) regarding research. Guys who get their papers published in IEEE Transactions do not necessarily have the same skills as guys who work at Google, boast in StackOverflow, are excellent professors, or write great programming blogs. Of course, the best guys are exponentially cooler than we are, but they don't know everything you know that you don't know. So, stay cool. – jbasko May 21 '09 at 05:19
  • 40
    It also helps to understand that those bloggers aren't writing everything off the top of their heads either. Good bloggers research their topics and learn new things while writing posts. – JohnFx May 21 '09 at 14:35
  • 47
    I obsess daily about the stuff I don't have time to read about and learn. It leaves me with a horrendous feeling of guilt sometimes. – brad May 21 '09 at 20:55
  • 2
    I know how you feel. I try really hard to keep up with this stuff, but I do have a day job, after all! – JohnFx May 21 '09 at 22:19
  • The url is self explanatory: http://www.liveintentionally.com/Too_Many_Choices.htm – corlettk May 23 '09 at 07:13
  • 3
    Totally! Thanks for putting it like that, I felt so alienated when I started reading this site thinking "shit, I don't know anything!". – Alex May 23 '09 at 10:39
  • I do exactly the same and had never realized it... I felt for a long time too pessimistic and modest about my skills based on the phenomenon described above... – Kostas Konstantinidis May 24 '09 at 13:14
  • 9
    @Zilupe: Amen to that. I've published a few international conference papers and journals. In the eyes of some people, that sounded cool. Until you realize that it doesn't really take much effort to publish a paper. We're no geniuses. We're just like everyone else. We make mistakes, and we publish crap papers. Well, except for some minority group of real geniuses... – Hao Wooi Lim May 25 '09 at 03:18
  • 2
    Absolutely agree!! Most developers working on commercial projects are expected to deliver against tough deadlines. I wonder how many follow OO practices, good coding guidelines, test driven development etc. I have seen many .NET projects that don't follow a proper layered architecture mainly due to time and resource constraints. – Shaw May 25 '09 at 09:26
  • oh well, I guess I was a little off topic there. What I meant to say is not all are genius developers, and so when you see someone at SO espousing the importance of TDD and OO, all perfectly given with good intentions, don't feel bad that you aren't good enough because you aren't doing them. We all work under the circumstances we find ourselves in, and sometimes it's not possible to go by good practices. – Shaw May 25 '09 at 09:28
  • 1
    "[...] I'm effectively comparing my knowledge to the collective knowledge of many people". <<<< Exactly!! – hasen May 27 '09 at 18:27
  • 3
    Maybe it is that tinge of guilt/regret that pushes us to get better at our craft. I suppose it is ultimately a good thing. – JohnFx May 27 '09 at 18:47
  • 4
    +1 Good thing I read this. I thought I was the only one. – Randell Jul 29 '09 at 09:57
  • 2
    This saved my brain from frying. Lately I've been thinking more about what I don't know, and how it seems that everyone knows everything so perfectly, than actually learning something! Glad to see I'm not the only one! – Michal Ciechan Apr 17 '10 at 21:55
  • Man I couldn't have read this at a better time.. – Jeriko May 23 '10 at 13:06
  • I agree with JohnFx's posting. However, FWIW @JohnFx... I DO know more than you ;-) – user279521 Aug 11 '10 at 11:33
  • 1
    @user279521 - Oh yeah? How many fingers am I holding up? – JohnFx Oct 11 '10 at 14:56
  • @JohnFx 2months later..... +1 (damn you are slow dude).... – user279521 Oct 11 '10 at 15:05
308

That people knew what they wanted.

For the longest time I thought I would talk with people, they would describe a problem or workflow and I would put it into code and automate it. Turns out every time that happens, what they thought they wanted wasn't actually what they wanted.

Edit: I agree with most of the comments. This is not a technical answer and may not be what the questioner was looking for. It doesn't apply only to programming. I'm sure it's not my longest-held assumption either, but it was the most striking thing I've learned in the 10 short years I've been doing this. I'm sure it was pure naivete on my part but the way my brain is/was wired and the teaching and experiences I had prior to entering the business world led me to believe that I would be doing what I answered; that I would be able to use code and computers to fix people's problems.

I guess this answer is similar to Robin's about non-programmers understanding/caring about what I'm talking about. It's about learning the business as an agile, iterative, interactive process. It's about learning the difference between being a programming-code-monkey and being a software developer. It's about realizing that there is a difference between the two and that to be really good in the field, it's not just syntax and typing speed.

Edit: This answer is now community-wiki to appease people upset at this answer giving me rep.

Instantsoup
  • 9
    Or change what they want after seeing what they previously wanted. People like to change their minds. I know, cuz I'm a people. – J. Polfer May 20 '09 at 14:36
  • 13
    You were giving them what they asked for, not what they wanted. – Brent Baisley May 20 '09 at 15:43
  • 47
    Why do boring uncontroversial no-answers get up-voted so excessively?! – nes1983 May 20 '09 at 16:50
  • I mean, dude, that is not a statement that can be "wrong" or "false"; it isn't really a statement, more your perception that changed. It doesn't satisfy the terms of the original question and describes your increasingly pessimistic outlook on mankind more than your progress as a scientist. – nes1983 May 20 '09 at 16:52
  • 39
    Wow. Sounds like someone needs a hug. – bzlm May 20 '09 at 20:26
  • 6
    @niko- this is a good answer..even I upvoted it and my answer has to compete ;) – TStamper May 21 '09 at 00:44
  • 1
    @Brent Baisley : Worse, you are giving them what you think they asked for based on what they asked for, not what they wanted. – Hao Wooi Lim May 21 '09 at 05:20
  • What lead you to make that assumption in the first place? – Daniel Daranas May 21 '09 at 09:33
  • Not to be pedantic, but it seems like what you're finding is that building systems is an ITERATIVE process. Idea, prototype (and sometimes deployment), trash bin, new idea... etc. Or you can see it as "people are stupid" which is true as well... – Dan Rosenstark May 21 '09 at 13:29
  • That's why I like agile project management. – Scoregraphic May 21 '09 at 13:33
  • 1
    @Niko: The answer is OK. The fact that he got nearly 400 Rep off of it is BS. – gnovice May 21 '09 at 14:43
  • On the other hand, some people know what they want but it's absolutely non-sensical. For example, one of our clients "I know this guy, and personally I know that he has this much money, we want to highlight him on our web page, but we can't store anywhere anything that indicates this, can you do it?". Hmmm. – Kieran Senior May 21 '09 at 14:56
  • "I think you can have what you want or what you need, but you can't have both... usually." (from Hal Hartley's "Simple Men") – Daniel Daranas May 21 '09 at 15:38
  • @Daniel Daranas: Sounds like the uncertainty principle for software... – Treb May 21 '09 at 21:01
  • 24
    My god @ people complaining, stackoverflow rep is not a competition. Upvote if you enjoyed the answer, don't downvote because you are jealous you didn't post it first. – Dmitri Farkov May 21 '09 at 21:12
  • @All - expounded on my reasonings and stopped the rep gain. Thanks for the comments, I was surprised this answer generated what it did! – Instantsoup May 21 '09 at 21:28
  • 3
    This definitely deserves to be the highest upvoted answer. The idea that naively asking users what they want will produce the best product is widespread and very, very wrong. Especially important to keep in mind when building "shrinkwrap" or web software: the users that are giving you feedback are only a subset of your total users, and this will skew your perspective. – Wedge May 21 '09 at 21:36
  • 1
    Maybe I want it to be the highest voted answer. Maybe I don't. I'm not sure. – Daniel Daranas May 21 '09 at 21:57
  • 2
    My favorite is when you see something wrong with what the customer asked for, and point it out. "When I build it like this, it's going to do X, which I'm almost certain you don't want it to do". They tell you to build it anyway, and then 6 months later panic when the release is pushed back because it's doing X and it needs to be 'fixed'. – Jherico May 21 '09 at 22:40
  • @Jherico this is called a "change request" in the contract = $$$. – Daniel Daranas May 22 '09 at 08:46
  • Sometimes customers make you implement something just to help them decide what they want. Ideally, a "proof of concept" should do it, but that happens very rarely. – Sergiu May 22 '09 at 13:31
292

That I know where the performance problem is without profiling
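
A minimal sketch of the corrected habit (the workloads below are hypothetical stand-ins for real code): measure, or better yet run a real profiler, before deciding where the time goes.

    import java.util.Arrays;
    import java.util.Random;

    // Measure before assuming where the time goes; a sampling profiler is better still.
    public class ProfileFirst {
        static long timeIt(Runnable task) {
            long start = System.nanoTime();
            task.run();
            return System.nanoTime() - start;
        }

        // Hypothetical stand-ins for the code under suspicion.
        static void parseInput()  { for (int i = 0; i < 1_000_000; i++) Integer.toString(i); }
        static void sortRecords() { Arrays.sort(new Random(42).ints(1_000_000).toArray()); }

        public static void main(String[] args) {
            System.out.println("parsing: " + timeIt(ProfileFirst::parseInput)  + " ns");
            System.out.println("sorting: " + timeIt(ProfileFirst::sortRecords) + " ns");
        }
    }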

lothar
232

That I should have only one exit point from a function/method.
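
A small Java sketch of the realization (the shipping example is made up): early returns keep the happy path flat, while forcing a single exit point pushes the same logic into nested conditionals.

    import java.util.List;

    public class GuardClauses {
        // Early exits: each precondition bails out as soon as it fails.
        static String describe(List<String> items, boolean paid) {
            if (items == null) return "no order";
            if (items.isEmpty()) return "empty order";
            if (!paid) return "awaiting payment";
            return "ready to ship: " + items.size() + " item(s)";
        }

        // The same logic with one exit point ends up nested three levels deep.
        static String describeSingleExit(List<String> items, boolean paid) {
            String result;
            if (items == null) {
                result = "no order";
            } else {
                if (items.isEmpty()) {
                    result = "empty order";
                } else {
                    if (!paid) {
                        result = "awaiting payment";
                    } else {
                        result = "ready to ship: " + items.size() + " item(s)";
                    }
                }
            }
            return result;
        }

        public static void main(String[] args) {
            System.out.println(describe(List.of("book"), true)); // ready to ship: 1 item(s)
            System.out.println(describe(null, false));           // no order
        }
    }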

Dug
  • 91
    Excellent realization; exit as often as necessary. One should bail out of a function as soon as it makes no sense to continue further into it. Doing this can reduce complexity and increase readability by, for example, avoiding deeply nested conditionals, when they are preconditions required for the method to run properly. In modern languages with memory management and resource constructs like using/finally, continuing all the way to the end of a method dogmatically makes no sense. – Triynko May 20 '09 at 17:35
  • 24
    Who came up with this, by the way? It's like a programming urban legend. – brad May 21 '09 at 20:54
  • 49
    People who have to debug other people's code are who came up with this. – gatorfax May 21 '09 at 23:44
  • 4
    If your code is so deeply nested that you can't try to make one exit point, then refactor the code. Move stuff into methods with names that describe what's happening, etc.... Sure you have to follow each method to understand exactly what each method is doing, but at least you can get the big picture just by looking at which methods are called, under which conditions. Then you can't put in many exit points can you? So by all your exit points you make someone have to totally reorganize your code, so they can refactor tons of nested conditionals into separate methods. – Richard Anthony Hein May 21 '09 at 23:52
  • 2
    My Data structures teacher used to teach that. I always thought it was confusing and unnecessary, however, his exams were based on that assumption. – Gustavo Muenz May 22 '09 at 17:52
  • 23
    I think this commonly-held but wrong idea is based on a misunderstanding. When you exit a function, you should always *return* to the same point. That was an important rule in languages like BASIC that didn't enforce it: The rule meant, for instance, that you should use GOSUB instead of GOTO. In languages like C# or Java that call methods, it's automatic. But because it's automatic, I think it morphed from the logical "only one return-to point" to the nonsensical "only one exit point". – Ryan Lundy May 22 '09 at 17:53
  • 35
    From languages like C where you need to manually release resources. Multiple exit points were a good chance for leaking resources. IMO there's no point to it in languages with exceptions, as you often don't know your exit points anymore, or they are in the middle of a statement. -- In these languages, all that remains is "structure for readability". – peterchen May 27 '09 at 22:16
  • 5
    I think this *rule* is made up by flow-chart oriented people (like some stupid instructors in my college :-P) – Mehrdad Afshari Jun 01 '09 at 17:31
  • 5
    -1. A single exit point is the only sane way to enforce post-conditions. – Adrian McCarthy Jun 03 '09 at 21:19
  • 7
    @Adrian McCarthy - only if you're using C. Most other languages have something like try/finally, some languages/platforms have built-in support for postconditions. – Daniel Earwicker Jul 30 '09 at 22:25
  • 2
    I have never had a problem with multiple exits as long as they are used responsibly. Use them either at the top of the function to return early on unexpected params (usually null) or in an if/switch/try-catch statement in the logic. Just don't hide them everywhere. – PeteT Dec 17 '09 at 12:00
  • 2
    'return', 'continue', and 'exit' are my favourite keywords. – Evan Plaice Jun 14 '10 at 19:10
  • 4
    People often confuse *bad programming* with *disobeying a rule*. So they try to enforce rules like "one exit point" globally because they think it will eliminate bad programming. OEP makes sense when you need your method to do cleanup. In the other 95% of cases it leads to hideous nesting and *less readable* code. It's all about using the right tool for the job. For some functions in some languages OEP is a very good idea. – Jason Williams Jul 12 '10 at 19:53
  • @Adrian: If you've got something complex enough that you've got real problems with an early exit, you might be best to divide the function into two pieces, an outer which handles resources and pre/post-conditions, and an inner which does the real work and which can use early exit as necessary. Most of the time you don't need that sort of complexity though. – Donal Fellows Jul 13 '10 at 10:18
  • @Donal Fellows: I'd say just the opposite. If you find yourself needing multiple exit points, then you probably haven't decomposed the problem properly. – Adrian McCarthy Jul 13 '10 at 18:03
  • @Adrian: This sounds like one of these arguments where different sides think that the other is utterly wrong. – Donal Fellows Jul 13 '10 at 19:50
  • 1
    If you need multiple exit points in non-trivial code, you need to think about refactoring your function. – Joe Zitzelberger Jan 05 '11 at 19:11
228

That nonprogrammers understand what I'm talking about.

Peter Mortensen
Robin Day
219

That bugfree software was possible.

Peter Mortensen
JaredPar
  • 35
    +1, although NASA almost managed it – Patrick McDonald May 20 '09 at 14:31
  • 55
    Yes but the "almost" cost a few million of dollars :) – Jem May 20 '09 at 15:04
  • 4
    It hasn't yet been achieved, but it's definitely possible (as long as we continue to work with deterministic digital hardware). We'd have to start from scratch with a new, carefully engineered OS and hardware free of serious design faults. – Triynko May 20 '09 at 16:45
  • What's deterministic digital hardware? – Liran Orevi May 20 '09 at 19:28
  • 1
    To be a bit more specific, "Bug free software was possible on a normal budget" – Frank Farmer May 20 '09 at 23:11
  • 4
    You will never have bug-free software, just software that hasn't had any bugs found yet. – Mark Glorie May 21 '09 at 00:38
  • +1 to Patrick; it's almost a utopia, but I think everyone knows it's not bug-free software, it's almost-bug-free software. – dr. evil May 21 '09 at 14:25
  • I don't want bug-free programs; it's not fun, and it would drop the employment rate of developers by 10! – Nicolas Dorier May 21 '09 at 14:52
  • Wow, got a -1 vote for this answer. Do people really think bug free software is possible? – JaredPar May 21 '09 at 17:10
  • 15
    @Triynko your "possible" and @JaredPar's "possible" are not the same. Theory and practice might be the same in theory but are very different in practice. – wilhelmtell May 22 '09 at 02:30
  • 2
    Depends if you mean algorithmic correctness or 'does what the user wants it to do'. The former is certainly possible. – Steven Evers May 22 '09 at 02:31
  • @wilhelmtell: Theory and practice are only different when you don't know what "theory" means. People seem to confuse it with "conjecture". @Jared: yeah, I always thought if I could just be smarter, try harder, learn more, I could do it. This is the same reason we make so many laws, have so many years of schooling, take so many drugs, etc.: maybe with a little more control we can make everything perfect, right? (Off topic much?) – Jay Bazuzi May 22 '09 at 15:02
  • 3
    @JaredPar I've seen many Hello World applications that are COMPLETELY bug free =P In fact, a lot of programming books I pick up have this little guy in one fashion or another, and I can't think of a one that had a bug in it =P – Joseph May 22 '09 at 17:51
  • 13
    @Joseph, part of the problem is people think Hello World programs are bug free. They're not. Most do not check for errors in printf for instance or account for other failed IO attempts. – JaredPar May 22 '09 at 18:01
  • @JaredPar's right--most trivial software has bugs. All non-trivial software has bugs. It is simply not possible to write interesting, bug-free software with today's tools; and we seem to be cool with that for the most part. Software really does an amazing job considering how complex it gets. Even notorious office always seems to recover my unsaved documents should I cross it :). We've come a long way this decade in *handling* the inevitable bugs better. – Michael Haren May 23 '09 at 13:17
  • 3
    Autopilots with Autoland systems for aircraft must be capable of more than 1 million landings without an accident. Most use three computers working in parallel: 1 flying the plane, 2 checking the first for mistakes. That is about as close to bug-free as you will get. For the record, I worked on those systems, but I did not program them. – Jim C May 29 '09 at 20:04
  • 4
    I'll have to downvote this. It *is* possible to have bug-free software, and there is a constantly-developing science to achieve this. Convincing yourself that there is no bug-free software is just an excuse so that you don't have to care. That helps to stagnate an important research subject in Software Engineering. – Juliano Jun 03 '09 at 21:43
  • 2
    I know one - TeX. There's even a prize for each bug afaik. – Tobias Langner Aug 28 '09 at 07:54
  • Isn't this C program bug free? void main() {} ;-) – RussellH Sep 29 '09 at 04:27
  • 9
    @RussellH, no. You've failed to specify a return value and the resulting process will return random garbage memory. – JaredPar Sep 29 '09 at 05:37
  • @JaredPar: Bug free software is completely possible. However, it requires patience, and very skilled programmers, both of which are in short supply. Which makes Bug free software possible, just unlikely. – NotMe Jul 12 '10 at 16:08
  • "Bug free" software has to account for EVERY eventuality. Including bad design and UI, faulty components, and to take it to the extreme then maybe even power failures? – Robin Day Jul 12 '10 at 17:43
  • I still don't understand what a bug actually is? :) – THEn Jul 16 '10 at 17:02
  • Please don't go to dissect the literature of the meaning of the word "possible." We're computer programmers. Writing bug free software is simply not possible. :) – Donotalo Sep 28 '10 at 20:02
  • @JaredPar -- "the resulting process will return random garbage memory". That's a feature, not a bug. :-). – RussellH Dec 01 '10 at 18:06
199

That private member variables were private to the instance and not the class.
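
A minimal sketch of the corrected view, using a made-up Account class: in Java (and C#), access control is per class, not per instance, so one instance can read another instance's private field directly.

    // "private" means private to the class: one Account may read another
    // Account's private field, because the accessing code lives in the same class.
    public class Account {
        private long balanceCents;

        public Account(long balanceCents) {
            this.balanceCents = balanceCents;
        }

        public boolean richerThan(Account other) {
            // Legal, even though 'other' is a different instance.
            return this.balanceCents > other.balanceCents;
        }

        public static void main(String[] args) {
            System.out.println(new Account(500).richerThan(new Account(300))); // true
        }
    }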

David Webb
166

I thought that static typing was sitting very still at your keyboard.

edralph
162

That you can fully understand a problem before you start developing.

Jeffrey Hines
  • 32
    This, my friend, should be: "That you can fully understand a problem." But it is so true. And apparently a hard concept to understand or even accept. – KarlP May 21 '09 at 15:12
  • 4
    You cannot understand the problem "fully", but certainly you MUST understand the problem ( at some degree ) before you start developing. http://bit.ly/kLXgL – OscarRyz May 21 '09 at 17:58
  • Sometimes you have to start developing to understand the problem. And/or, the problem changes the more you develop. – Evan Plaice Jun 14 '10 at 19:15
158

Smart People are Always Smarter than Me.

I can really beat myself up when I make mistakes and often get told off for being self-deprecating. I used to look up in awe at a lot of developers and often assumed that since they knew more than me on X, they knew more than me across the board.

As I have continued to gain experience and meet more people, I have started to realise that oftentimes, while they know more than me in a particular subject, they are not necessarily smarter than me/you.

Moral of the story: Never underestimate what you can bring to the table.

NomeN
Rob Cooper
  • Good one! I am currently working with a colleague who really knows A LOT about .NET development. Took me some time to realise that I am better at understanding what the customer needs. – Treb May 21 '09 at 21:40
  • 58
    And on the other hand, that I know more than other people. It turns out that they just know different stuff. The other moral: Never underestimate what someone else can bring to the table. – thursdaysgeek May 21 '09 at 23:19
  • 1
    Here's that old "Do unto others" thing again... I'm coining a new phrase: Tech bullying ~ The state of feeling superior because you know some stuff, and making the mistake of letting everyone else know it. @seealso: smartass. – corlettk May 23 '09 at 08:17
  • 1
    Excellent observation - my version is more negative "Everyone does stupid now and then". Somewhat related to "don't flip the bozo bit". – peterchen May 27 '09 at 22:17
  • 2
    You only have to worry when stupid people are smarter than you. – Brad Gilbert Jul 29 '09 at 21:16
  • We confide in our strength, without boasting of it; we respect that of others, without fearing it. -- Thomas Jefferson – krishna Aug 22 '09 at 15:01
131

For the longest time I thought that Bad Programming was something that happened on the fringe.. that Doing Things Correctly was the norm. I'm not so naive these days.

Sam Axe
  • 30
    I used to think Bad Programming was only done by other programmers, until I was done in by one of my Bad Programs. Now I Do Things Correctly! (You believe me this time, right?) – Jared Updike May 21 '09 at 04:26
  • OMG, I would upvote this again and again, till my clickfinger gets as tired as my code-reviewing eyes... – AviD May 21 '09 at 21:43
  • 2
    Totally. I've gone from "That never happens" to "That never happens except at *this* job" to "Every place has bad code." – Ryan Lundy May 22 '09 at 17:55
  • I kinda still hold that belief. I guess I haven't seen much of how code is written in the real world (typical companies). – hasen May 22 '09 at 18:07
  • 1
    Hacking is the norm. Engineering is the purview of the truly competent. If I ever meet a software engineer I'll let you know. – corlettk May 23 '09 at 06:30
  • 3
    @corlettk: You mean monkey-coding is the norm, no? Hacking is an art, a high-level of art mind you, that I'm far far away from achieving. – hasen May 27 '09 at 18:33
  • 2
    @Hasen: No, hacking is an analogy to unskillfully taking an axe to a tree, chiseling off tiny pieces in a mad panic with no real plan, and creating a bloody great mess until the tree finally falls on your head. A "hack" is "one who produces banal and mediocre work in the hope of gaining commercial success". Why it was that the computer field changed "hack" to mean "skilled", I'll never know. – Lawrence Dol Oct 23 '09 at 01:23
113

I thought I should move towards abstracting as much as possible. I got hit in the head hard with this, because of too many intertwined little bits of functionality.

Now I try keep things as simple and decoupled as possible. Refactoring to make something abstract is much easier than predicting how I need to abstract something.

Thus I moved from developing the framework that rules them all, to snippets of functionality that get the job done. Never looked back, except when I think about the time I naively thought I would be the one developing the next big thing.

Evert
  • 26
    Decoupled = true Abstraction. Abstraction for its own sake is... premature optimization. – Jared Updike May 21 '09 at 04:28
  • 1
    This goes along with what I've found doing performance tuning. There can be a lovely program with multiple layers of abstraction. Then the workload gets heavy, and guess what is costing all the time ... all the abstractions. Computers execute instructions, not abstractions. – Mike Dunlavey May 21 '09 at 14:02
  • 5
    Abstraction and generalisation are powerful tools, sadly used to generalise an abstract use case with one single implementation. The funny thing is that whenever there is a need to change the implementation, the abstractions and generalisations have to change too... – KarlP May 21 '09 at 15:08
  • I totally agree with Jared ... if you have managed to get to "simple and decoupled" you have achieved true abstraction. How can things be decoupled if you haven't abstracted things out into interfaces and factories etc...? How can it be simple unless you remove all the "if type = this then do this, or if the type is that then do something else code"? – Richard Anthony Hein May 21 '09 at 23:58
  • Same here. I think I learned about abstraction *before* making a whole lot of spaghetti code. They should've taught how to get things done even if the code is spaghetti, and *then* teach us about OO and abstraction. – hasen May 22 '09 at 18:09
  • What about abstraction in the event that you may make use of it in the future, even if it's inefficient now? – Andrew Weir May 27 '09 at 12:32
  • Andrew, The point I tried to convey, was that you can probably refactor your code when you do need it. I realize this is a blanket generalized statement, but it's a good rule of thumb. – Evert May 27 '09 at 14:22
  • This is biting me right now! I think I fell into "architecture astronaut" mode when I was designing my software, and now that I'm implementing it, I see that I wasted a lot of time making it much more flexible than necessary. I'm still in favor of abstraction, but you pay for it up front with a lot of effort. I think I should have focused on making something that works well instead of the "one true system". – A. Levy May 27 '09 at 14:57
  • "Refactoring to make something abstract is much easier than predicting how I need to abstract something." <-- – Alex Baranosky Aug 17 '09 at 14:37
  • I've never really understood the importance of abstract classes. Sure, there's a 1/100 case where it makes sense to template the sub-classes but IMHO, abstract base classes are wayyyy overused. It has to be a backlash of Univ style teaching that assumes theory==practice. – Evan Plaice Jun 14 '10 at 19:19
103

That women find computer programmers sexy...

V'rasana Oannes
100

That the quality of software will lead to greater sales. Sometimes it does but not always.

Ian Ringrose
81

That all languages are (mostly) created equal.

For a good long while I figured that the language of choice didn't really make much of a difference in the difficulty of the development process and the potential for project success. This is definitely not true.

Choosing the right language for the job is as important/critical as any other single project decision that is made.

Overhed
  • 13
    I feel that chosing the right libraries is what matters. It just so happens there's often a 1-to-1 correspondence between languages and libraries... – Kevin Montrose May 21 '09 at 05:07
  • 7
    But if two programming languages are both Turing complete then what's the difference? You can write any program in either language! ;) – Bill the Lizard May 21 '09 at 14:17
  • 8
    I disagree, the decision what language to use is way less important than who will actually be implementing the project. As just one example of many other more important decisions. – Boris Terzic May 22 '09 at 13:53
  • 1
    A study referred to in the (still) wonderful project management book "Peopleware" concluded that despite the strong opinions language choice engenders, there wasn't a strong difference in productivity between languages (except for assembly, which notably underperformed higher-level languages). However, that study was from around the time when Ada was the hot new language on the block, so it may be time to redo such a study. – Daniel Martin May 24 '09 at 23:51
  • 1
    While Turing complete languages do make it possible to implement the same application, the syntax and metaphors of some languages do often lend themselves to particular problem types – Crippledsmurf May 27 '09 at 11:15
  • 13
    BrainFu** is as turing complete as python is. – hasen May 27 '09 at 18:31
  • Think about doing COM Integration with C# vs VB.NET. The optional parameters... This goes away with C# 4.0, but to this point it was/is true. – Nate May 28 '09 at 20:24
  • 2
    The usual wisdom is that productivity in lines of debugged code per unit time is roughly the same across languages, while the number of lines needed for a given amount of functionality can vary considerably. – David Thornley Jun 03 '09 at 20:44
  • 9
    That Turing complete languages are somehow equally applicable is a common misconception. A Turing complete language can compute everything that a Turing machine can (and often implied the other way around too). There is absolutely no implications regarding performance. An operation that takes linear time in one language could very well take exponential time on another and they could still both be Turing complete. There's a huge difference between what's theoretically computable and what is feasible in practice. – TrayMan Jun 10 '09 at 06:34
  • 2
    @TrayMan, I just assumed the comment from Bill the Lizard was a joke. – Alex Baranosky Aug 17 '09 at 14:36
81

That a large comment/code ratio is a good thing.

It took me a while to realize that code should be self documenting. Sure, a comment here and there is helpful if the code can't be made clearer or if there's an important reason why something is being done. But, in general, it's better to spend that comment time renaming variables. It's cleaner, clearer and the comments don't get "out of sync" with the code.
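
A tiny Java illustration of the trade-off (all names invented for the example): the first version needs a comment to explain itself, the second says the same thing through its names.

    public class SelfDocumenting {
        // Before: the comment carries the meaning, and it can silently go stale.
        static boolean check(double a, double b) {
            // true if the order total exceeds the free-shipping threshold
            return a > b;
        }

        // After: renaming makes the comment unnecessary.
        static boolean qualifiesForFreeShipping(double orderTotal, double freeShippingThreshold) {
            return orderTotal > freeShippingThreshold;
        }

        public static void main(String[] args) {
            System.out.println(check(120.0, 100.0));                    // true, but only the comment tells you what it means
            System.out.println(qualifiesForFreeShipping(120.0, 100.0)); // true, and the call site explains itself
        }
    }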

Clay Nichols
  • 1
    I agree _in_ the actual code... excluding javadoc comments (or equivalent). – corlettk May 23 '09 at 05:13
  • 11
    +1, don't even get me started on the treatises I used to write for 10 line functions – wds May 27 '09 at 08:11
  • To add to this, an assert() statement is better than documenting a precondition/postcondition. .NET 4 code contracts can automatically be turned into documentation, too! – Robert Fraser May 31 '10 at 10:59
66

That programming is impossible.

Not kidding, I always thought that programming was some impossible thing to learn, and I always stayed away from it. And when I got near code, I could never understand it.

Then one day I just sat down and read some basic beginner tutorials, and worked my way from there. And today I work as a programmer and I love every minute of it.

To add, I don't think programming is easy, it's a challenge and I love learning more and there is nothing more fun than to solve some programming problem.

Peter Mortensen
Ólafur Waage
65

"On Error Resume Next" was some kind of error handling

Paulo Guedes
  • 6
    I feel you...but in vbscript (esp. asp), it was the ONLY "error handling" option available, combined with judicious checking whether an error actually occurred, and a fair amount of prayer. – flatline May 20 '09 at 15:37
  • 2
    Yeah... it is some kind... just a kind that we are glad to be getting away from – Matthew Whited May 20 '09 at 19:41
  • 6
    Well?! but it is. You start your error-handling block with On Error Resume Next, try something, and then If (err.number<>0) then – jpinto3912 May 20 '09 at 21:22
  • Isn't this the only vbscript equivalent to a try catch? – James May 22 '09 at 12:07
  • -1: It is a kind of error handling. It just isn't that elegant. – JohnFx May 23 '09 at 16:11
  • Resume Next thing is awesome :) Ignoring anything and saying "Of course it will work, I wrote it!" (I did that before.) – JCasso Dec 07 '09 at 12:43
  • I know it is fun to pick on this construct, but to be fair it was INTENDED to be used for inline error handling If (Err.code=12...) – JohnFx Jan 27 '10 at 00:44
  • 1 more reason why VB should die a quick and painful death. @jpinto3912 Add, '<>' to the die/quick/painful list too while you're at it. – Evan Plaice Jun 14 '10 at 19:28
  • @Evan yeah, I'm still typing != for the first 30mins I start vba-ing. But to wish VBA would die quickly over try-catch semantics and the odd <> is bit too much, hmm? – jpinto3912 Jun 15 '10 at 17:18
  • @jpinto3912 Obviously, I'm not a fan of VB. The language has awkward semantics and a lot of awkward practices that are special to VB. IMHO, it's a MS word macro language that grew to be way more than it ever should have been. Mostly, it irritates me that It's a lot harder to find C# info on Goog because everything about .NET is flooded with really old VB examples and Goog gives age precedence over usefulness in that case. I'll admit. I have religious anti-VB views. Hence the whole 'death to the infidels' speak ;) – Evan Plaice Jun 15 '10 at 21:06
  • @jpinto3912 '<>' isn't really so much a vb specific thing. It just rubs me the wrong way because 'x <> y' literally means 'x is less-than greater-than y'. So, x becomes omniponent when y is involved ;). I like != because it's a simple 'not equal' – Evan Plaice Jun 15 '10 at 21:09
64

That programming software requires a strong foundation in higher math.

For years before I started coding I was always told that to be a good programmer you had to be good at advanced algebra, geometry, calculus, trig, etc.

Ten years later and I have only once had to do anything that an eighth grader couldn't.

Twipped
  • 5
    Very true. In most cases you don't need to be a math expert. The only time I ever really needed to know any advanced math was when I was doing 3D programming as a hobby. In fact, it was actually the 3D programming during high school that inspired me to pay better attention in trig and pre-cal classes. Other than that though, very basic math is usually all you need. – Steve Wortham May 20 '09 at 21:43
  • 29
    I think you were misinformed. Sure, to be a good *programmer*, you don't really need to use much higher level math, but to truly understand and apply certain computer science concepts, you're going to need more than just eighth grade math. – hbw May 21 '09 at 00:43
  • 2
    I would say being comfortable with binary logic (which isn't that much math) and how the CPU actually works (memory allocation, device communication, ALU and the interaction with whatever registers you have on your platform) is far more important to be a good programmer than a thorough understanding of advanced mathematics. – Martin P. Hellwig May 21 '09 at 10:01
  • 12
    I think the emphasis on math is to teach critical thinking skills and problem solving not as something that you would use in every day computer programming. – Zack May 21 '09 at 14:54
  • 14
    The kind of abstraction you need to understand advanced mathematics is very similar to the abstraction you need to create software. – OscarRyz May 21 '09 at 18:04
  • @Zack, @Oscar - I think thats the idea behind it, and thats what academics (my professors included) would like to believe. However, the abstractions for advanced mathematics are actually VERY different from abstractions you need to create actual software. In fact, it's rare to find people that do both well. – AviD May 21 '09 at 21:55
  • 6
    I think functional programming concepts is much easier to understand if you have a stronger foundation in mathematics, simply because you aren't frightened off by the syntax as much. It looks familiar. I made the mistake of using simple mathematical functions to demonstrate the functional programming concepts new to C#. Some people were immediately declaring that it was too complex. – Richard Anthony Hein May 22 '09 at 00:08
  • 1
    I would also have to disagree to an extent. It does not *REQUIRE* a good foundation in higher maths, but so many patterns and concepts are based on mathematical techniques, and to create more complex software, having a good mathematical understanding will be useful when writing your logic and algorithms. – Sheff May 22 '09 at 10:09
  • In good programming, you're right. In computer science, a strong foundation in mathematics is needed to really understand the depths of some of the topics. – Mike Tunnicliffe May 24 '09 at 05:48
  • I think there's just some correlation, not causation. If you enjoy math, you migth be more likely to enjoy certain aspects of programming. – peterchen May 27 '09 at 22:25
  • 1
    Once I started working in relational theory and data management, I found that many of my iterative calculations could have been better handled by a better understanding of mathematical concepts and algorithms. Calculus II? Maybe not. But pre-calculus and trig, absolutely. – Chris K Jun 04 '09 at 03:52
  • 1
    In my programming career the only thing I regret is not having done MORE math. There's many advanced programming concepts that have me befuddled on a daily basis. (That said, it's mostly related to functional programming and type systems) – Rehno Lindeque Jan 04 '10 at 10:12
  • The only reason they gave us so many maths at the university was to make sure not all students would pass the first two years. Testing a student on his skills in mathematics is definitely more straightforward than testing on programming. – Dimitri C. Jan 04 '10 at 10:41
  • I completely and whole heartedly disagree with this answer. – Akusete Jun 07 '10 at 03:56
  • I've had to use algorithms based on `mod` functions all the time (usually in conditionals based on indexes), and I never learned that in eighth grade. – Lance Roberts Jul 12 '10 at 18:44
63

That optimizing == rewriting in assembly language.

When I first really understood assembly (coming from BASIC) it seemed that the only way to make code run faster was to rewrite it in assembly. Took quite a few years to realize that compilers can be very good at optimization and especially with CPUs with branch prediction etc they can probably do a better job than a human can do in a reasonable amount of time. Also that spending time on optimizing the algorithm is likely to give you a better win than spending time converting from a high to a low level language. Also that premature optimization is the root of all evil...
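
A small sketch of that last point (the data and names are made up): an algorithmic change, here swapping a nested scan for a hash lookup, routinely buys far more than dropping to a lower-level language ever would.

    import java.util.HashSet;
    import java.util.Set;

    public class AlgorithmFirst {
        // O(n*m): scan the whole haystack for every query.
        static int countHitsNaive(int[] haystack, int[] queries) {
            int hits = 0;
            for (int q : queries)
                for (int h : haystack)
                    if (h == q) { hits++; break; }
            return hits;
        }

        // O(n + m): one pass to build a set, then constant-time lookups.
        static int countHitsWithSet(int[] haystack, int[] queries) {
            Set<Integer> seen = new HashSet<>();
            for (int h : haystack) seen.add(h);
            int hits = 0;
            for (int q : queries)
                if (seen.contains(q)) hits++;
            return hits;
        }

        public static void main(String[] args) {
            int[] haystack = {1, 2, 3, 4, 5};
            int[] queries = {2, 5, 9};
            System.out.println(countHitsNaive(haystack, queries) + " vs " + countHitsWithSet(haystack, queries)); // 2 vs 2
        }
    }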

danio
63
  • That the company executives care about the quality of the code.
  • That fewer lines is better.
Trampas Kirk
  • 2
    they DO care, but you have to combine artist-skills with worker-skills. Every piece of algorithm can't be a piece of art too. Some of it will be plumbing, so reuse the "less used". Remember the old 80/20 rule. 80% of the program is used 20% of the time. So focus 80% on 20% of the code and make that a REAL PIECE OF ART! :OP – BerggreenDK May 20 '09 at 23:12
  • 71
    fewer lines are better! part of the reason I dislike java as a language is that doing anything takes up so many lines of code. less lines of code means it is easier to change your code. – Claudiu May 21 '09 at 05:56
  • 7
    It depends on what you're removing to get fewer lines. If the code is still readable with fewer lines then it's good. However, there are plenty of ways to reduce the number of lines of code that make the code worse. – Herms May 21 '09 at 14:57
  • 2
    Except when people take the "fewer lines is better" mentality too far with chained method calls 7 deep, so that when one of them throws a null pointer, you have no idea which it was. Or they condense so many actions into one line that it's 150 characters long and performs 4 operations. This makes it both harder to read and harder to debug, but it is not any faster nor does it use less memory during execution. – Trampas Kirk May 21 '09 at 19:24
  • The real killer to the fewer-lines issue is the PHB. When the manager doesn't read 10k lines of your buddy's work and compares it to your 1k lines which he also didn't read, he is likely to assume that you work only 1 hour per day. – SingleNegationElimination May 22 '09 at 00:23
  • 1
    I think we need to make the distinction between "fewer lines" and "less code". – Shalom Craimer Jul 29 '09 at 09:08
  • 17
    If your line ends in ))))) and you're not writing Lisp, you have too-few lines. –  Jul 30 '09 at 22:23
  • It SHOULDN'T (!) be like that - but how do you expect a company's "X (feel free to add any type) managers" to understand/care about the quality of the code, if the only thing (at least from my own experience) they understand (sadly) is this: more $$$, more $$$, etc. – sabiland Aug 28 '09 at 08:13
58

I would say that storing the year element of a date as 2 digits was an assumption that afflicted an entire generation of developers. The money that was blown on Y2K was pretty horrific.
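
The failure mode in miniature (a deliberately simplified Java sketch): store only the last two digits and reconstruct the century by assumption, and the year 2000 comes back as 1900.

    public class TwoDigitYear {
        public static void main(String[] args) {
            int stored = 2000 % 100;           // the year saved as "00" to spare two digits
            int reconstructed = 1900 + stored; // read back under the old assumption
            System.out.println(reconstructed); // prints 1900, not 2000
        }
    }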

Peter Mortensen
MikeJ
  • 1
    This is the only answer that I'll upvote, though it's a CW so it doesn't matter... – Dan Rosenstark May 21 '09 at 13:32
  • Good thing I am just in this for the babes. :-) – MikeJ May 21 '09 at 23:55
  • 4
    IIRC some systems back in the 60's and maybe 70's only used one digit because it used less memory. I have even seen paper forms where "196_" and "197_" were preprinted. – some Jun 05 '09 at 00:03
  • 3
    I still see forms with 200_ and presumably there are some now with 201_ printed. – Macha Apr 03 '10 at 20:25
  • 5
    The sad part is... Unix will have their second round at this in 2038 – Evan Plaice Jun 14 '10 at 19:32
  • @Evan: If, that is, they're still using 32 bit machines in 2038. – Billy ONeal Jul 12 '10 at 15:57
  • 4
    @Billy Just because the machine architecture changes doesn't mean the data format will. Storing 2 digits of resolution in int format would make a byte (8bit) date format and, yet, it affected tons of 32bit hardware architecture machines in 2k. This is just one more example of why you don't let low level hardware guys specify data formats. They penny pinch bits with the knowledge that there will be a scheduled SNAFU in the distant future. – Evan Plaice Jul 12 '10 at 16:38
  • Specifying two-digit years was entirely reasonable in many cases, and two-digit years are entirely reasonable today in many cases as an input or reporting format. The computer formats that claim to accommodate dates down to Year One are silly, given that any date before 1753 is going to be meaningless without context (does 1-1-1600 mean the 14,650 days before 1-1-2000 or something else?) Unix-style times will have some difficulties in 2038, but by then migrating to 64 bits for "live" systems with new data shouldn't be a problem; historical data would be unambiguous. – supercat Dec 20 '10 at 01:11
57

That anything other than insertion/bubble sort was quite simply dark magic.
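
The demystifying version, as a short Java sketch: merge sort is just "sort each half, then merge the two sorted halves", yet it drops from bubble sort's O(n²) to O(n log n).

    import java.util.Arrays;

    public class MergeSort {
        static int[] sort(int[] a) {
            if (a.length <= 1) return a;
            int mid = a.length / 2;
            int[] left = sort(Arrays.copyOfRange(a, 0, mid));
            int[] right = sort(Arrays.copyOfRange(a, mid, a.length));
            return merge(left, right);
        }

        // Merge two already-sorted arrays into one sorted array.
        static int[] merge(int[] left, int[] right) {
            int[] out = new int[left.length + right.length];
            int i = 0, j = 0, k = 0;
            while (i < left.length && j < right.length)
                out[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
            while (i < left.length) out[k++] = left[i++];
            while (j < right.length) out[k++] = right[j++];
            return out;
        }

        public static void main(String[] args) {
            System.out.println(Arrays.toString(sort(new int[]{5, 2, 9, 1, 5, 6}))); // [1, 2, 5, 5, 6, 9]
        }
    }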

Robin Day
  • Haha, I like this one, as it hits close to home. Sort in faster than n-squared time?? Unpossible! – Ross May 20 '09 at 14:54
  • It's amazing how simple and obvious most sorting algorithms seem once you have a strong feel for recursion and divide&conquer. Until then, most of them feel like black magic. – Brian May 20 '09 at 18:08
  • 74
    I am a RESEARCHER in sorting algorithms! And they STILL feel like dark magic. – SPWorley May 20 '09 at 19:31
  • No amount of recursion helps me grok sorting networks, such as bitonic sorts. – SingleNegationElimination May 22 '09 at 00:42
  • 14
    I once had a line of code in my program that was long and complicated and I didn't feel like breaking it up or explaining it (it was some complicated lighting formula), so I put it all on one line and #define'd it to be DARK_MAGICK, and the only comment was a warning against trying to unravel the mysteries of the dark magick – Alex May 23 '09 at 10:26
  • Arno, I guess that's to be expected. It seems a fundamental property of our world that the closer you look at something, the crazier it gets. Atoms, people, algorithms... you name it. (As a side note, I just realized that if I'd care about such things I'd accept that as the base for a proof of the existence of a god.) – peterchen May 27 '09 at 22:22
  • 2
    Bogosort is the most mysterious of them all. – Alex Beardsley Jun 02 '09 at 05:06
50

That XML would be a truly interoperable and human readable data format.

Alex. S.
  • 7
    XML isn't a panacea but I wouldn't like to go back to the days where I regularly saw applications trying to squeeze relational data into single csv files. – Tony Edgecombe May 21 '09 at 07:43
  • 4
    It's an interoperable syntax, no doubt about that. It's just that syntax is often the least important aspect of any solution. – Simon Gibbs May 21 '09 at 14:35
  • 2
    +1, you could add small and fast to the wishlist too. – MarkJ Jun 11 '09 at 16:30
  • 1
    True but an improvement over csv and fixed length where without the documentation you are screwed. – PeteT Dec 17 '09 at 13:32
  • 7
    I love XML for the standardization it brought to data formats and for correctly handling character encodings. I hate what is sometimes done *using* XML, however. – Joachim Sauer Apr 21 '10 at 14:36
48

That C++ was somehow intrinsically better than all other languages.

This I received from a friend a couple of years ahead of me in college. I kept it with me for an embarrassingly long time (I'm blushing right now). It was only after working with it for 2 years or so before I could see the cracks for what they were.

No one - and nothing - is perfect, there is always room for improvement.

Jesse Beder
Binary Worrier
  • 5
    "better" will bring you tons of less-than-hateful comments. But I would say it is one of the most fast-executing, flexible, free-from-hurdles one. It's also one that takes your youth to proper learn it, only to find you could do more or less the same app. (albeit requiring some extra tonne or two of electricity-generating coal) with java or C#. – jpinto3912 May 20 '09 at 21:35
  • @JP: I'm happy with my choice of words :) – Binary Worrier May 21 '09 at 08:30
  • Productivity matters more in the world of business applications. of course, there are some niches that c++ is required, and the only option. – Shaw May 25 '09 at 09:42
  • @Shaw: Indeed recently - for a pet project - I chose to write one particular component in Managed C++, purely for performance reasons. It's just that I no longer believe it's "intrinsically superior" to all other languages. – Binary Worrier May 25 '09 at 09:57
  • 7
    I've always assumed C++ is worse than straight ANSI C, simply because the kind of trouble that I've seen C++ programmers get into is so much more complicated than the kind of trouble I've seen C programmers get into. – Nosredna May 27 '09 at 21:54
  • @Nosredna: Is that an assumption for the list? :) – Dale Reidy May 29 '09 at 20:17
  • 1
    Actually, the language that's better than all other is Common Lisp. C++ isn't bad, though. – David Thornley Jun 03 '09 at 20:49
47

I believed that creating programs would be exactly like what was taught in class...you sit down with a group of people, go over a problem, come up with a solution, etc. etc. Instead, the real world is "Here is my problem, I need it solved, go" and ten minutes later you get another, leaving you no real time to plan out your solution efficiently.

Aaron
42

I thought mainstream design patterns were awesome, when they were introduced in a CS class. I had programmed about 8 years as hobby before that, and I really didn't have solid understanding of how to create good abstractions.

Design patterns felt like magic; you could do really neat stuff. Later I discovered functional programming (via Mozart/Oz, OCaml, later Scala, Haskell, and Clojure), and then I understood that many of the patterns were just boilerplate, or additional complexity, because the language wasn't expressive enough.

Of course there are almost always some kind of patterns, but they are in a higher level in expressive languages. Now I've been doing some professional coding in Java, and I really feel the pain when I have to use a convention such as visitor or command pattern, instead of pattern matching and higher order functions.
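
A rough Java illustration of that pain (the names are invented): the command pattern needs an interface plus a class per action, where a higher-order function needs one lambda.

    import java.util.List;
    import java.util.function.Consumer;

    public class CommandsVsLambdas {
        // The classic command-pattern ceremony.
        interface Command { void execute(String target); }

        static class PrintCommand implements Command {
            public void execute(String target) { System.out.println("print " + target); }
        }

        public static void main(String[] args) {
            List<Command> commands = List.of(new PrintCommand());
            for (Command c : commands) c.execute("report.pdf");

            // The same behaviour as a higher-order function: the "pattern" disappears.
            Consumer<String> print = target -> System.out.println("print " + target);
            print.accept("report.pdf");
        }
    }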

egaga
  • "many of the patterns were just boilerplate, or additional complexity, because the language wasn't expressive enough." Expressiveness is simply boilerplate code hardwired into the language. – Unknown May 21 '09 at 09:51
  • 4
    Not true, how is it boilerplate to have first class stuff instead of limiting the capabilities of a programmer, like in the case of higher order functions. Lisps are beautiful example of this. – egaga May 21 '09 at 10:01
38

For the first few years I was programming I didn't catch on that 1 Kbyte is technically 1024 bytes, not 1000. I was always a little perplexed by the fact that the sizes of my data files seemed slightly off from what I expected them to be.
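
The arithmetic behind that slight mismatch, as a one-method Java sketch using the 300 GB drive that comes up in the comments below:

    public class KibiVsKilo {
        public static void main(String[] args) {
            long marketedBytes = 300L * 1000 * 1000 * 1000;              // "300 GB" on the box
            double reportedGiB = marketedBytes / (1024.0 * 1024 * 1024); // what the OS reports
            System.out.printf("300 GB on the box = %.1f GiB on screen%n", reportedGiB); // ~279.4
        }
    }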

gnovice
  • 114
    Hard drive manufacturers still haven't caught on... – Michael Myers May 20 '09 at 14:31
  • 10
    @mmyers I think you mean hard drive marketers right? Or are the drives actually built like that? – Instantsoup May 20 '09 at 14:38
  • well they'll manufacture a drive with 300 billion bytes and call it 300GB, when 300GB (is for most intents) 300 x 2^30. (7.3% difference there!) – nickf May 20 '09 at 15:18
  • 5
    There's a silly (and rarely used) "formal" definition of 1024: "kibi-". As in "A Commodore 64 has 64 Kibibytes of memory". Ugh. It is confusing in some instances. Certain specific areas of computing use K=1000 like networking bitrates (Kbit/s), and others use it for marketing reasons (HD's mostly). Memory and file sizes are normally quoted in K=1024/etc. – jesup May 20 '09 at 15:55
  • 3
    I may date myself with this statement, but every time I see/hear the word "Kibibytes", I think of that character from "Fat Albert" (Mushmouth, I think) who added B's to everything he said. "Probagrambing is rebabeally harbd." – gnovice May 20 '09 at 16:03
  • 16
    Hey, stop the kibi hating. MeBi and KiBi are at least unbambiguobus. – bzlm May 20 '09 at 16:30
  • 21
    Kilo means 1000, Mega means 1000000, Giga means 1000000000. It's the RAM and OS makers that got it wrong, not the drive makers. – Mark Ransom May 20 '09 at 17:26
  • 7
    @Mark Ransom: Actually it's not the RAM/OS makers that got it wrong. In binary, 1024 is 10 bits all set to 1, it is a nice round figure [10000000000], as all data is stored in binary it makes sense to use a KB as 1024. What is retarded is that everyone uses different standards. They should all use one or the other, regardless of which they pick and stop confusing the hell out of everyone... – BenAlabaster May 20 '09 at 18:38
  • 8
    Those prefixes had a defined meaning long before someone tried to adapt them to binary numbers. You shouldn't need to know the context to decipher the prefix meaning. I understand how it came about, but I think it was a mistake that has caused confusion for far too long. – Mark Ransom May 20 '09 at 19:00
  • Wow, looks like I unintentionally set off quite the feud. =) – gnovice May 20 '09 at 19:00
  • 39
    No one's going to do it? Seriously? Okay, I'll do it... http://xkcd.com/394/ – Erik Forbes May 20 '09 at 19:15
  • 1
    Maybe we should go back to binary coded decimal... math functions would make less mistakes... of course they would be slower and use more RAM but who cares :) (4th times a charm) – Matthew Whited May 20 '09 at 19:50
  • IIRC IBM used 2k48 and 4k96 (about the memory size of their mainframes) during the sixties, and in informal talk it was just "4k". In the beginning only small numbers were used and the difference was so small that nobody cared, or "everyone" knew by the context whether 1k meant 1000 or 1024. – some Jun 04 '09 at 23:40
  • 5
    @BenAlabaster: Actually 1024 (1 KiB) is 2^10 or binary 10000000000, which is certainly *not* 10 bits all set to 1. And Mark is quite correct, the definitions of Kilo and Mega were around and defined in engineering circles long before computer guys borrowed them for their own (inexact) uses. It's time for computer geeks like us to let it go, admit we were wrong, and start using the right notation to mean the right thing. – Lawrence Dol Oct 23 '09 at 01:05
36

That condition checks like:

if (condition1 && condition2 && condition3)

are performed in an unspecified order...
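
The corrected belief, in a short Java sketch: && and || evaluate left to right and short-circuit, which is exactly what makes the usual null-then-dereference check safe (C, C++, and C# give the same guarantee for these operators).

    import java.util.List;

    public class ShortCircuit {
        static boolean hasItems(List<String> list) {
            // If list is null, the right-hand side is never evaluated, so no NullPointerException.
            return list != null && !list.isEmpty();
        }

        public static void main(String[] args) {
            System.out.println(hasItems(null));         // false
            System.out.println(hasItems(List.of("x"))); // true
        }
    }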

User
  • 15
    In what language? Languages like C/C++, Java, and Python guarantee that the conditions are evaluated left to right and that evaluation stops at the first condition that returns false. It's part of the langauge spec. I assume that most other languages make the same guarantee. – Clint Miller May 20 '09 at 15:49
  • 44
    @Clint: Yeah, hence "that turned out to be incorrect". – bzlm May 20 '09 at 16:22
  • yeah, this one is cool. it makes writing stuff like if(myList != null && myList.Count >= 0) { doStuff(); } a lot easier – Zack May 20 '09 at 19:49
  • 4
    actually, this one depends on the language, and & will evaluate all conditions (not shortcut). And I've seen many people use And (&) in VB instead of AndAlso (&&) – Lucas May 20 '09 at 20:44
  • Wait, How does this work in C#? If I do if (object != null && object.property == true) I am pretty sure it would actually crash on null.... – Damien May 21 '09 at 08:20
  • @Damien: In C# it won't; try it out yourself. In pre-.NET VB that would crash. – Binary Worrier May 21 '09 at 08:27
  • New thing learned, you are correct it is in order. – Damien May 21 '09 at 08:28
  • 2
    . . . Actually it will crash in VB.net too unless you use AndAlso re Lucas' comment – Binary Worrier May 21 '09 at 08:28
  • Some languages don't have an order specified. For the languages that do, they usually brag about how they have "short-circuit" logic. – Unknown May 21 '09 at 09:49
  • In pre-.NET VB, there were no short circuit operators so && wouldn't compile. – Richard Anthony Hein May 22 '09 at 00:00
  • LOL, and to be precise, as Binary said, there is no && or &. – Richard Anthony Hein May 22 '09 at 00:01
  • in C++ they may be performed in an unspecified order AFAIR. – Krzysztof Kozmic May 27 '09 at 11:53
  • in C++ it is left to right. This allows you to check that the object isn't null in the leftmost spot first: if (object != NULL && object->getItem() == 5) – Alex May 29 '09 at 15:39
  • As I read this I remembered that I was actually told in my CS class that I couldn't trust the execution order and we were told to write => if(x!=null) if(x.getX()==5) <= so it would always work. I totally forgot about that until now :) – edorian Aug 11 '10 at 12:19
  • Note that in C/C++, function arguments are performed in an unspecified order. For example:`printf("%d, %d\n", i++, i);` – Joey Adams Sep 18 '10 at 20:26
35

That my programming would be faster and better if I performed it alone.

bzlm
  • 9,626
  • 6
  • 65
  • 92
31

"The project will be done in 2 weeks"

and

"That will take 2 hours to implement"

Marc
  • 1,537
  • 11
  • 21
  • 1
    Now I always multiply that time by 2 or 3. If I deliver "on time", then I get praised for how fast that was – Eric May 21 '09 at 15:20
  • 7
    Yeah, and then you spend 3 hours just fighting a stupid bug, and they think you're not doing anything. Tell me about it. – hasen May 22 '09 at 18:21
  • @Eric: Yes, I've been doing this for the past while and it's working out great. I even get to take time off (I'm self-employed, not a work truant!). – DisgruntledGoat Jul 05 '09 at 01:28
  • 1
    +1 because "It will be done in two weeks!" has become such a running joke with me that I have to mentally bitch slap myself every time I earnestly give an estimate that is, yet again, two weeks. – BlairHippo Nov 06 '09 at 16:18
30

That I can understand my own code without comments!!!

Pratik Deoghare
  • 35,497
  • 30
  • 100
  • 146
28

I thought I would need it.

MrValdez
  • 8,515
  • 10
  • 56
  • 79
  • 3
    Joke explanation: The line is the opposite of YAGNI (You ain't gonna need it). In essence, I thought I would need a class/module/functionality/etc before I can complete my program. – MrValdez May 21 '09 at 05:41
  • 5
    I've long thought there should be an opposing principle: BWIIDNI? – Daniel Earwicker May 27 '09 at 16:53
25

That dynamically typed languages like Python or Ruby are somehow less qualified for use on large projects.

FogleBird
  • 74,300
  • 25
  • 125
  • 131
  • 7
    I had this same awakening circa 2000. I read some stuff on the original wiki at www.c2.com and ended up starting this page: http://www.c2.com/cgi/wiki?UnificationOfStaticTypesAndUnitTests and was on the verge of concluding that I was irrationally attached to static typing. But I've since begun using an environment (C#) in which static typing really brings the IDE to life during editing, and I'm now pretty convinced that statically typed languages are better because they are easier to work with. There is no dynamically typed language that would not be improved by some static type info! :) – Daniel Earwicker May 27 '09 at 17:09
  • Statically typed can make the IDE infinitely better while dynamically typed can make the code infinitely shorter. It takes black magic to bend a statically typed language to break the boundaries whereas dynamically typed languages oft times don't have enough or clearly defined boundaries. Choose your poison. – Evan Plaice Jun 14 '10 at 19:43
  • It is possible to have nice short code in a statically typed language; see Haskell and boo. – Qwertie Jul 08 '10 at 20:27
25

One assumption I had as a rookie in those days was that people with more years in the field are automatically better developers.

Arcturus
  • 26,677
  • 10
  • 92
  • 107
24

This is embarrassing, but for the longest time I didn't really grasp the difference between reference types and value types. I thought you had to use the ref keyword to change an object in a different method.

This is one of the most fundamental concepts to C# that I should have known.

Aaron Daniels
  • 9,563
  • 6
  • 45
  • 58
  • You'd be surprise how many developers do not know the difference. – Richard Anthony Hein May 22 '09 at 00:02
  • When I did phone interviews, I would always have a question for the candidate about this because it's so commonly misunderstood. – Ryan Lundy May 22 '09 at 17:59
  • ...and most of them missed it. They could explain the difference between pass-by-value and pass-by-reference for value types, but few of them grokked it for reference types. – Ryan Lundy May 22 '09 at 18:00
  • Correct me if I'm wrong (not 100% sure): value types = int/bool/decimal etc. and reference types = classes? – Michal Ciechan Apr 17 '10 at 22:07
  • @LnDCobra you're right and wrong. Classes are fundamentally reference types but value types (int, bool, decimal) can also be passed by reference using the ref keyword. – Evan Plaice Jun 14 '10 at 19:39
22

This is really embarrassing but when I was starting to learn how to program nothing could satisfy me. I wanted to write video games. Not the trivial little programs all these books wanted me to write. So I decided I could easily skip 10 chapters and ignore the basics.

So I basically ignored variables!

The problem was that I could not tell keywords apart from conventions:

Car car = new Car(); //good
Car test = new Car(); //wrong must be lowercase car!

for (int i = 0; i < 10; i++) //good
for (int test = 0; test < 10; test++)//wrong must be i

I did this for over a year and even made a tic-tac-toe game in 3000 lines! I was thrilled by my awesomeness at that point, until I found a tic-tac-toe in 150 lines on the Internet. Then I realized I was an idiot and started over again.

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
MrHus
  • 32,888
  • 6
  • 31
  • 31
20

That programming is easy.

D'Arcy Rittich
  • 167,292
  • 40
  • 290
  • 283
20

That Unix and Linux OSs are well designed ... I should probably qualify this(!)

Firstly, the view is reinforced by such anti-truisms as:

  • every subsequent OS developed ends up redesigning Unix poorly (it's said about Lisp as well, where it is more true).
  • the list of rules that make the 'Unix philosophy'. It's not that they are wrong, it's the implication that Unix itself follows them closely.

It may be more true to say that they were well designed/well done, and surely parts of them are, but even this is just a relative judgment, relative to some awful versions of Windows. Here are some examples of things that are done badly:

  • configuration is a mess, ad-hoc flat file configs are not good
  • the C programming language should have been replaced (by something like D) a long time ago
  • shell scripting is schizophrenic. It is not good for development as it is shorthand designed for quick typing.
  • directory structures are badly named
  • the GNU tool chain is unnecessarily arcane
  • the belief that general purpose always trumps special purpose

Overall they require unnecessary expertise to operate. Or rather a lot of knowledge where there is only a moderate amount of understanding.

It's not all bad. Linux is politically better and not corrupted by business needs, but sadly to a large degree a lot of the technical high ground has been lost.

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
mike g
  • 1,791
  • 1
  • 18
  • 25
  • 17
    It's better designed than Windows... – Zifre May 21 '09 at 21:31
  • 4
    I still think that the flat-file config is better, but ad-hoc is a disaster. Seems to me that the MacOS X plist mechanism makes a very good compromise – SingleNegationElimination May 22 '09 at 00:05
  • 4
    I think the general point stands: Linux carries around a lot of cruft a bit like windows does. but your specific points don't really convince me, configuration works fine for most things (depends on how the configuration files are implemented), shell scripting is now done with python a lot, directory structure is down to taste, ... Bit weak really – wds May 27 '09 at 07:25
  • 1
    There are probably 'better' negative points - i haven't used it regularly for a while, but do you think that maybe you have low standards on this. Conf works, but not well - but really there should be a standard (with type information) which would make things like context sensitive help, and decent gui tools possible (that can handle all versions of conf files for instance). IMHO your POV lacks vision on this. – mike g May 27 '09 at 13:25
  • 1
  • It isn't just shell scripting. There is a complete lack of separation between the human interface (the interactive shell) and the engineering interface (the programs). The shell has perverted the API for programs, and torpedoed one of the goals of Unix (everything as small programs): output is human-readable text, and input is save-on-typing single-char switches. Take rm (a toy example): `rm -f` is fine for an expert's command line, but in scripts it should be something like `remove force=true`. The human layer should be a separate layer. – mike g May 27 '09 at 13:35
  • I won't rant much on dir naming, but the idea that its completely subjective is wrong i think. – mike g May 27 '09 at 13:46
  • 4
    Some of us thought that Unix redesigned Multics poorly. Unix was intentionally designed to avoid being everything Multics was, and then the rest was hacked in instead of being designed in. – Windows programmer Jun 08 '09 at 03:10
  • I'd argue that *nix in general are more secure and some brands (in my case Linux Mint) are more stable. Toss in Norton on Windows and my *nix will burn windows any day in performance. Better designed? not necessarily. Better in general? Yep. SideNote: I have rarely if ever touched a config file in Mint. Just about everything can be done with the GUI. – Evan Plaice Jun 14 '10 at 20:42
  • 1
    Linux not corrupted by business needs?! So many superior ideas have been rejected for inferior ones. Linux is today mostly about politics: my ex-colleague worked his ass off to get his patch accepted into the linux kernel (a slight enhancement of the TCP protocol). He could tell many "interesting" stories about people trying to block/sabotage patch acceptance on very dubious and sometimes incorrect technical grounds and assumptions. – zvrba Jul 12 '10 at 16:18
19

When I first started after graduating from university I expected that more senior developers would know what they were doing. Boy was I wrong....

Mark
  • 28,783
  • 8
  • 63
  • 92
  • @Mark, and that people who told you "what was correct" wouldn't just be "saying something" because they didn't actually know the answer. (-: – Rob Wells Jun 19 '09 at 12:49
  • 17
    That's funny, my biggest misconception when I became a senior developer was that I expected university graduates would know what they were doing. :-) – tnyfst Jun 19 '09 at 12:51
  • 1
    @Mark: LOL @tnyfst: LOL again ;-) – Treb Jun 20 '09 at 16:59
19

Ok, I learned programming rather early. I was 14 or so. And I held all kinds of crazy beliefs, but don't ask me about the precise timing, because that was a … long while ago.

  • Ok, so, I believed for a while that if you use the synchronized keyword in Java, then Java solves this nasty synchronization thing for you

  • I believed for at least half a year, likely more, that static typing would improve performance.

  • I believed that freeing something would return memory back to the OS.

  • I believed that malloc calls boil down to checking if there is enough free space on the OS, so malloc would be inexpensive.

  • I thought a long while that Java was built with all the benefits and flaws of the other languages in mind, into a "perfect blend" that would take the best properties of the other languages and reject the mistakes.

  • I vastly overestimated the number of cases where LinkedLists outperform ArrayLists.

  • For a while I thought that NP-hardness was a proof that no INSTANCE could be solved efficiently, which is trivially false.

  • I thought that finding the best flight-plan on travel agency web sites would take so long because of the "Travelling Salesman Problem", as I proudly chuckled to my relatives (when I was small, alright?!)

Could come up with more. No idea how long I stuck to each of them. Sorry.

PS:
Ahh, ok, this one got cleared up not so slowly, but I see newbies do this every now and then, so I thought you might be interested: I also thought that to store an uncertain number of things, you'd need to declare a new variable for each. So I'd create variables a1, a2, a3, ..., rather than using one variable a, which I would declare to be a vector.
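
A minimal C++ sketch of the fix for that last one (the names are illustrative): one growable container instead of a family of numbered variables.

#include <iostream>
#include <vector>

int main() {
    std::vector<int> a;              // a single variable that holds an uncertain number of things
    for (int i = 0; i < 5; ++i)
        a.push_back(i * i);          // no need for a1, a2, a3, ...
    for (int value : a)
        std::cout << value << ' ';
    std::cout << '\n';
    return 0;
}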

nes1983
  • 15,209
  • 4
  • 44
  • 64
18

I used to believe that the majority of work on an application was actually programming. I'm sure this is true in some cases, but in my experience I spend more time researching, documenting, discussing, and analyzing than actually coding. (I work on software that operates a laser-based sensor, and determining how best to control the hardware is much more challenging than writing the code to do so.)

I also used to think that open environments where programmers can look over their shoulder and ask the guy (usually) next to them a question were the best environments for a team of programmers to hammer out a solution. It turns out that a dark lonely room is more productive, team or no team.

When I graduated, I assumed that programming professionally would be like programming in college, meaning that I would be given the inputs and expected outputs and asked to fill in the black box that does the conversion. In reality, I have to figure out the inputs, outputs and the black box.

I didn't used to think marketing and sales guys were the scourge of the human race, so naive.

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
Atilio Jobson
  • 719
  • 1
  • 8
  • 11
  • Someone else lost a vote just so I could vote this up. – UnkwnTech May 21 '09 at 10:25
  • 2
    My personal favourite is this sort of conversation: BA: "The system requires these outputs. || Me: OK, we'll need these inputs. || BA: But the data-entry will cost millions! || Me: Yes, and where did you expect the system to get this data? || BA: Can't you make it up?" – corlettk May 23 '09 at 06:37
  • You're last point is missing a Lawyers and Bean-counters to go with the whole scourge-of-the-human-race part – Evan Plaice Jun 14 '10 at 19:52
18

That it's a 9-5 job

Nir
  • 24,619
  • 25
  • 81
  • 117
17

That having no defects is possible before going live.

It is definitely not true; even P2 defects get left open at times.

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
TStamper
  • 30,098
  • 10
  • 66
  • 73
  • 8
    How about the assumption that your internal names for priority levels are my internal names for priority levels? Over here, what y'all call TPS reports are called SRP reports! ;) – Doug McClean Jun 19 '09 at 14:44
17

That code reviews are a waste of time.

Having moved from a company where they were entirely optional to one where they are mandatory (even audited) I've come to understand their usefulness. Having a second set of eyes on code, even on the most trivial pieces, can:

A) save you embarrassment when you screw up something trivial (a trivial code review, for instance, would have prevented us from spamming hundreds of emails to our customers, at my previous job)

B) can teach you things that you didn't know in the first place (I'm ever learning new libraries at my current job - inevitably at a big company, someone has already stumbled upon the problem you have and done a better job solving it - it's just a matter of knowing where to look)

C) at the very least ensure that someone other than yourself knows how things work.

In the end, I wind up happier with the code I submit here, than in my previous employment, even though back then I thought I knew everything :)

new123456
  • 873
  • 1
  • 12
  • 21
James
  • 8,512
  • 1
  • 26
  • 28
  • My first introduction to code reviews was in an organization that didn't actually believe in them, but wanted to say they did them. When I had my first experience of a real, honest code review, it was a bit of a shock. – Mark Bessey Aug 13 '09 at 17:46
16

That if conditions were re-evaluated on every line, so that if you wrote code like this:

Dim a as Boolean = True
If a Then
    Console.WriteLine("1")
    a = False
    Console.WriteLine("2")
Else
    Console.WriteLine("3")
End If

Then the output would be:

1
3
MiffTheFox
  • 21,302
  • 14
  • 69
  • 94
  • 39
    This is one misconception I never had/heard of. – Brad Gilbert May 20 '09 at 21:52
  • 4
    Some of my friends used to play this robot-programming game where this was actually the case in the half-assed language you programmed your 'bot in. – Zarkonnen May 21 '09 at 09:51
  • This is awesome. This is the only answer that I'm upvoting, except for that other answer that I upvoted. Something like this does happen when you iterate through an Array and try to remove elements from the array... – Dan Rosenstark May 21 '09 at 13:39
  • Holy crap, wouldn't that be a show stopper? – NTDLS May 28 '09 at 20:54
  • 4
    Wouldn't you find out that wasn't true the first time you stepped through it with the debugger? – John MacIntyre Jun 03 '09 at 22:24
  • It's roughly how conditional instructions on ARM work. They all have an `"if (a) then ..."` pattern (or `NOT a`), where `a` is one of the CPU flags. Since conditional jumps aren't very fast, it makes sense to have multiple conditional instructions with the same condition in a row. But if you did change that condition flag halfway, subsequent instructions will use the new flag value. – MSalters Aug 31 '09 at 15:17
  • I wonder, are there situations where this logic could actually be useful? – Max Yankov Jul 06 '11 at 11:07
16

That the design of the NT operating system is flawed compared to UNIX. It turned out that the NT kernel and its design decisions are very similar to those of any modern UNIX-like system, and that most of the problems you get in the kernel result from buggy third-party drivers written by buggy companies.

  • 2
    I protest. One fundamental thing delineates Windows from Unix: memory management. Windows detects an attempt to break in; Unix detects an attempt to break out... so Windows programs can and do use unallocated memory. Yeck! – corlettk May 23 '09 at 08:24
  • 2
    @corlettk - do you have any references for what you mean by that? – Daniel Earwicker May 27 '09 at 16:52
  • 6
    It's wrong anyway. The relevant windows mechanism is page tables. He's suggesting that Windows VirtualAlloc()s everything, and you only need VirtualProtect to ask permission. The whole need for VirtualAlloc() pretty much proves him wrong. – MSalters Aug 31 '09 at 15:23
  • Windows blocks RAW packets for 'security reasons'. If security is an excuse to block something, shouldn't they block the whole internet? Nuff said... – Evan Plaice Jun 14 '10 at 19:55
15

That .NET structs (C# and VB.NET) were reference types, just like classes.

I "received" that piece of wisdom at some point shortly before or after .NET 1.0 arrived on the scene (I've no idea where from, it may have sprung whole from my mind, like Athena from the brow of Zeus), and kept it until disabused of the notion by Jon Skeet about 4 months ago.

Thanks Jon.

P.S. Not programming related, but I also believed (until about 5 minutes ago) that "Apollo sprang whole from the brow of Zeus".

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
Binary Worrier
  • 50,774
  • 20
  • 136
  • 184
15

That bytes and characters were the practically same thing - "ASCII" was just a way of mapping a byte value to a glyph on the screen.

Reading about Unicode really opened my eyes (although I still don't fully understand it).
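
A minimal C++ sketch of the distinction, assuming a UTF-8 terminal (the string literal below is just the two-byte UTF-8 encoding of é):

#include <iostream>
#include <string>

int main() {
    std::string s = "\xC3\xA9";                  // é: one character to the reader, two bytes in UTF-8
    std::cout << s << " is 1 character but occupies "
              << s.size() << " bytes\n";         // prints 2
    return 0;
}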

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
Cybis
  • 9,773
  • 2
  • 36
  • 37
  • 3
    Great article: http://www.joelonsoftware.com/articles/Unicode.html – corlettk May 23 '09 at 05:15
  • Indeed, things don't get hairy at all until you find about the transcoding tables that are in the individual font files. – SingleNegationElimination Jul 05 '09 at 02:49
  • This very common misconception is slowly being replaced by the misconception that "a Unicode character is 2 bytes". The whole standard is defined for extension, and characters will **always** be variable length -- especially when Unicode normalization comes into the picture. – André Caron Jun 01 '11 at 18:05
15

That I was a good programmer!

ddd
  • 1,399
  • 3
  • 19
  • 44
  • 1
    For my first several jobs, I was the only programmer in the department, and I thought I was pretty hot stuff. Then I got a job working in a team with other programmers. That was an eye-opener. – Joe White Feb 03 '11 at 13:23
14

I can read SO and get any work done.

FastAl
  • 6,194
  • 2
  • 36
  • 60
14

I used to assume it's enough to program Win32 applications.

Also that every program must come with a GUI, because command-line is "outdated".

Peter Perháč
  • 20,434
  • 21
  • 120
  • 152
14

That one day I'd have a realistic idea how long it would take to build some nontrivial code/system/whatever.

DarkSquid
  • 2,646
  • 1
  • 21
  • 19
13

I thought all I needed to do to improve database performance was put the database in 3rd normal form.

Oorang
  • 6,630
  • 1
  • 35
  • 52
13

That object orientation is always the best way to design source code and will always be.

Viktor Sehr
  • 12,825
  • 5
  • 58
  • 90
12

That this:

SomeClass object(initialValue);

and this:

SomeClass object = initialValue;

were guaranteed to be equivalent in C++. I thought the second form was guaranteed to be interpreted as if it had been written as the first form. Not so: see C++ Initialization Syntax.
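
One concrete place the two forms diverge, as a minimal sketch (this SomeClass and its explicit constructor are illustrative): copy-initialization will not use an explicit converting constructor, while direct-initialization will.

struct SomeClass {
    explicit SomeClass(int) {}
};

int main() {
    SomeClass a(42);        // direct-initialization: fine
    // SomeClass b = 42;    // copy-initialization: does not compile, the constructor is explicit
    (void)a;                // suppress the unused-variable warning
    return 0;
}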

Community
  • 1
  • 1
Kristopher Johnson
  • 81,409
  • 55
  • 245
  • 302
11

Some of the things that I still have trouble with are the following misconceptions - I still try and hold on to them even though I know better:

  • All stakeholders will make decisions about software design objectively. Those that aren't embroiled in writing the code make all sorts of decisions based entirely on emotion that don't always make sense to us developers.
  • Project budgets always make sense - I've seen companies that are quite happy to drop [just for example] $50,000 a month for years rather than pay $250,000 to have a project completed in 6 months. The government for one loses their annual budget if they don't spend it - so spend it they will, come hell or high water. It astounds me at how many project dollars are wasted on things like this.
  • You should always use the right tools for the right job - sometimes this decision is not in your hands. Sometimes it comes down from on high that "thou shalt use X technology" for this project, leaving you thinking "WTF! Who came up with that ridiculous idea?"... the guy paying your paycheque, that's who, now get it done.
  • Programming ideology comes first and foremost, everything else is secondary. In reality, deadlines and business objectives need to be met in order to get your paycheque. Sometimes you make the worst decisions because you just don't have time to do it the right way... just as sometimes that word is on the tip of your tongue but the minute it takes to recall it makes you choose a different and less ideal word. There isn't always time to do it right, sometimes there is only time to do it - however that may be. Hence oft' seen anti-patterns used by so called experienced developers who have to knock out a solution to a problem 10 minutes before the presentation deadline for the software being delivered to your best client tomorrow.
BenAlabaster
  • 39,070
  • 21
  • 110
  • 151
11

Back when I programmed on the TI-83, I thought you couldn't assign a variable to itself. So (ignoring that this is C code, not TI-BASIC) instead of writing

c = c + 1;

I would write

d = c + 1;
c = d;

When I learned about += and ++ it blew my mind.

Chris Lutz
  • 73,191
  • 16
  • 130
  • 183
11

That IDEs would get faster.

Anirudh
  • 2,209
  • 4
  • 25
  • 32
10

That I should always optimize my code. That's not to say I shouldn't think through it before I write it, but that I should think hard about how to squeeze every bit of performance out of each statement, even to the point of sacrificing readability.

Jimmy
  • 27,142
  • 5
  • 87
  • 100
9

I think I was 10 years old when someone convinced me that there will be a computer capable of running an infinite loop in under 3 seconds.

Shalom Craimer
  • 20,659
  • 8
  • 70
  • 106
9

That XML namespaces (or worse, well formedness) are in some way more difficult than trying to do without them.

A very common blunder, even at the W3C!

Simon Gibbs
  • 4,737
  • 6
  • 50
  • 80
  • It's not that they're worse. It's that they take a language that's already pretty ugly/verbose and make it a lot more ugly/verbose. – Evan Plaice Jun 14 '10 at 20:45
9

My incorrect assumption: That while there's always some room for improvement, in my case, I am pretty much as good a programmer as I can be.

When I first got out of college, I'd already been programming C for 6 years, knew all about "structured programming", thought "OO" was just a fad, and thought "man, I am good!!"

10 years later, I was thinking "OK, back then I was nowhere near as good as I thought I was... now I get the ideas of polymorphism and how to write clean OO programs... now I'm really good".

So somehow, I was always really good, yet also always getting way better than I was earlier.

The penny dropped not long after that and I finally have "some" humility. There's always more to learn (have yet to write a proper program in a purely functional language like Haskell).

Paul Hollingsworth
  • 13,124
  • 12
  • 51
  • 68
  • I second the motion. Nobody is anywhere near half as good as they think they are, but that doesn't seem to prevent the smart ones from learning. The dumb ones persist with their delusions of adequacy despite all the evidence to the contrary, and refuse to learn, or be taught. – corlettk May 23 '09 at 08:21
8

I was convinced, for at least 6 years, that every problem had exactly 1 solution.

Utterly unaware of multiple algorithms with differing complexities, space/time tradeoffs, OOP vs. Functional vs. Imperative, levels of abstraction and undecidable problems. When that blissful naivety broke, it opened up a world of possibilities and slammed the door on simply sitting down and building things. Took me a long time to figure out how to just pick one and run with it.

Kim Reece
  • 1,260
  • 9
  • 11
8

In C++, for a long time I was thinking that the compiler rejects your code when you give a definition for a pure virtual method.

I was astonished when I realized that I was mistaken.

Many times when I tell someone to give a default implementation of the pure virtual destructor of their abstract class, they look back at me with BIG eyes, and I know from there that a long discussion will follow... It seems to be a common belief among C++ beginners (among whom I still count myself; I am still learning!)

wikipedia link to c++'s pure virtual methods
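
A minimal sketch of what the compiler actually accepts (class names are illustrative): both a pure virtual method and a pure virtual destructor may be given definitions, and a pure virtual destructor in fact must have one, because base destructors are always run.

#include <iostream>

struct Abstract {
    virtual ~Abstract() = 0;      // pure virtual destructor
    virtual void run() = 0;       // pure virtual method
};

// ...and yet both can be given definitions.
Abstract::~Abstract() { std::cout << "base cleanup\n"; }
void Abstract::run()  { std::cout << "default behaviour\n"; }

struct Concrete : Abstract {
    virtual void run() { Abstract::run(); }   // explicitly reuse the "pure" default
};

int main() {
    Concrete c;
    c.run();        // prints "default behaviour"
    return 0;
}                   // destroying c also runs Abstract::~Abstract, printing "base cleanup"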

yves Baumes
  • 8,836
  • 7
  • 45
  • 74
  • Holy crap! I am gonna quiz all my friends with C++ experience, see if any of them know this, 'cause I sure didn't. – KeyserSoze May 20 '09 at 19:17
  • Most of the time it doesn't make sense - if you're forced to override a method anyway, why waste time on an implementation? Destructors are a special case, because they're always called even when they're overridden. – Mark Ransom May 20 '09 at 19:50
  • Heh :). I've spent *way* too much time debugging problems that resulted from having forgotten to add a virtual destructor to a base class. – reuben May 21 '09 at 04:29
  • Mark: It allows you to provide a "default" implementation while still requiring the author of the derived class to think about whether they should use the default implementation. Rarely useful really. But it is there is that's the style you want. – jmucchiello May 21 '09 at 15:29
7

G'day,

That I'd be just designing and writing code.

No requirements gathering, documentation or supporting.

cheers,

Rob Wells
  • 36,220
  • 13
  • 81
  • 146
  • Thankfully all of that was drilled into me at university! I would have been given the shock of my life otherwise ;-) – Barry Gallagher Jun 19 '09 at 13:27
  • Ah... the number 1 reason why I got my diploma in IT, then went on straight to sign on in law enforcement. (ironically, I'm now a cop assigned to an IT project, doing requirements gathering, documentation and users-vendors liaison.) =P – Darkwoof Jun 16 '10 at 02:47
7
  • My co-workers were/are producing supposedly bad code because they sucked/suck. It took me a while to learn that I should first check what really happened. Most of the time, bad code was caused by lack of management, customers who didn't want to check what they really wanted and started changing their minds like there's no tomorrow, or other circumstances out of anyone's control, like an economic crisis.
  • Customers demand "for yesterday" features because they are stupid: Not really. It's about communication. If someone tells them everything can really be done in 1 week, guess what? They'll want it in 1 week.
  • "Never change code that works". This is not a good thing IMO. You obviously don't have to change what's really working. However, if you never change a piece of code because it's supposedly working and it's too complex to change, you may end up finding out that the code isn't really doing what it's supposed to do. E.g.: I've seen sales commission calculation software doing wrong calculations for two years because nobody wanted to maintain the software. Nobody in sales knew about it. The formula was so complex they didn't really know how to check the numbers.
João Marcus
  • 1,610
  • 1
  • 13
  • 20
7

As an old procedural programmer, I didn't really understand OO when I first started programming in Java for a hobby project. I wrote lots of code without really understanding the point of interfaces, tried to maximize code re-use by forcing everything into an inheritance hierarchy - wishing Java had multiple inheritance when things wouldn't fit cleanly into one hierarchy. My code worked, but I wince at that early stuff now.

When I started reading about dynamic languages and trying to figure out a good one to learn, reading about Python's significant whitespace turned me off - I was convinced that I would hate that. But when I eventually learned Python, it became something I really like. We generally make the effort in whatever language to have consistent indent levels, but get nothing for it in return (other than the visual readability). In Python, I found that I wasn't doing any more effort than I had before with regard to indent levels, and Python handled what I'd been having to use braces or whatever for in other languages. It makes Python feel cleaner to me now.

Anon
  • 11,870
  • 3
  • 23
  • 19
6

That more comments are better. I've always tried to make my code as readable as possible--mainly because I'm almost certainly the guy that's going to fix the bug that I let slip by. So in years past, I used to have paragraphs after paragraphs of comments.

Eventually it dawned on me that there's a point where more comments--no matter how neatly structured--add no value and actually becomes a hassle to maintain. These days, I take the table-of-contents + footnotes approach and everyone's happier for it.

hythlodayr
  • 2,377
  • 15
  • 23
6

I used to think that Internet Explorer 6 box model is an evil dumb idea MS came up with only to break compatibility with other browsers.

Lots of CSSing convinced me that it's much more logical, and can make page design maintenance (changing block paddings/borders/margins) much easier.

Think about the physical world: changing the padding or border width of an A4 page doesn't change the page width, it only reduces the space for the content.

theosp
  • 7,439
  • 3
  • 23
  • 24
6

That the only localization/internationalization issue is translating messages.

I used to think that all other languages (and I had no concept of locales) were like English in all ways except for words and grammar. To localize/internationalize a piece of software, therefore, you only needed to have a translator translate the strings that are shown to the user. Then I began realizing:

  • Some languages are written right-to-left.
  • Some scripts use contextual shaping.
  • There is large variation in the way that dates, times, numbers, etc. are formatted.
  • Program icons and graphics can be meaningless or offensive to some groups of people.
  • Some languages have more than one "plural form".
  • ...

Even today I sometimes read about internationalization issues that surprise me.

Daniel Trebbien
  • 38,421
  • 18
  • 121
  • 193
6

I had never met integer promotion before, and thought that 'z' would hold 255 in this code:

unsigned char x = 1;
unsigned char y = 2;
unsigned char z = abs(x - y);

The correct value of z is 1.
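
A minimal sketch of what actually happens, assuming the common 8-bit char / 32-bit int setup (as the comments below discuss, more exotic implementations are allowed to differ):

#include <cstdlib>
#include <iostream>

int main() {
    unsigned char x = 1;
    unsigned char y = 2;
    int difference = x - y;                   // both operands are promoted to int, so this is -1, not 255
    unsigned char z = std::abs(difference);   // abs(-1) == 1, which fits back into an unsigned char
    std::cout << difference << ' ' << int(z) << '\n';   // prints: -1 1
    return 0;
}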

Andrey
  • 4,216
  • 1
  • 23
  • 31
  • Depending on the implementation, z could be 65535. Or various other values. – Windows programmer Jun 08 '09 at 03:13
  • no, it could not. described behavior is correct according to standard. do you know a compiler that acts like you described? – Andrey Jun 08 '09 at 07:27
  • The standard allows a conforming implementation to define unsigned char (and plain char and signed char) as 16 bits, and to define int (and unsigned int) as 16 bits. It doesn't matter if I know a compiler that defines char as 16 bits. It doesn't matter if I used compilers that defined int as 16 bits. The standard allows it. – Windows programmer Jun 08 '09 at 08:26
  • sizeof(char) is not important in this case because due to promotion, the given expression that is passed as argument to abs becomes (int)x - (int)y, and abs(-1) would always be 1. – Andrey Jun 08 '09 at 10:49
  • 1
    Of course sizeof(char) is not important. sizeof(char) is always 1. Meanwhile, the standard allows a conforming implementation to define unsigned char as 16 bits and unsigned int as 16 bits. In that implementation, perfectly legally, x is 1, y is 2, for the subtraction x promotes to unsigned int with value 1, y promotes to unsigned int with value 2, the result of the subtraction is 65535, and abs(65535) is 65535. In that implementation the standard requires unsigned char to promote to unsigned int because plain (signed) int can't hold all the values that unsigned char can hold. – Windows programmer Jun 09 '09 at 01:11
  • I've just figured out that Andrey didn't know that CHAR_BIT is implementation defined. 16 is allowed. – Windows programmer Jun 10 '09 at 00:28
  • I got your point; but why in what case would unsigned char be promoted to unsigned int? it might happen only in case if int is not capable to hold values of unsigned char. If this is the case - then abs((int)65535) would return 1, because 65535 would represent -1 for int. If it is not the case (int is capable to hold unsigned char values) then promotion would be to int, not to unsigned int. z would still be 1. – Andrey Jun 10 '09 at 06:10
  • "it might happen only in case if int is not capable to hold values of unsigned char" -- Bingo, exactly as I wrote above. 16 bits from 16 bits. Thank you for finally understanding. ... "z would still be 1" -- oops, try again to read what you wrote a few lines earlier. – Windows programmer Jun 10 '09 at 23:50
  • OK, I see what Andrey almost said. We need to know which programming language is in use. If it's C then abs isn't overloaded and the argument will be demoted from unsigned int to int. – Windows programmer Jun 11 '09 at 03:52
  • ok, and if C++? wouldn't the same thing happen in C++? abs has overloads for int and long (it wouldn't make sense to have abs overloads for unsigned types) – Andrey Jun 11 '09 at 06:15
  • In my example with 16 bit char and 16 bit int, when C++ chooses an overload that takes a built in promotion from unsigned int, that built in promotion is going to be long not int. Plain int can lose some values of unsigned int but plain long cannot. (Again remember, this is in my example with 16 bit char and 16 bit int. In a different example where CHAR_BIT is 64 and int and long are all 64 bits, long would also lose some values of unsigned int.) – Windows programmer Jun 11 '09 at 23:39
6

OO is not necessarily better than non-OO.

I assumed that OO was always better... then I discovered other techniques, such as functional programming, and had the realization that OO is not always better.

SingleNegationElimination
  • 151,563
  • 33
  • 264
  • 304
sean riley
  • 2,633
  • 1
  • 22
  • 22
  • You assumed that "OO is not necessarily better than non-OO" and your assumption turned out to be false, i.e. OO _is_ necessarily better than non-OO? or you assumed that OO was necessarily better than non-OO and then you learnt that it is not necessarily better? – Daniel Daranas May 21 '09 at 09:14
  • 2
    Sorry, that was ambiguous. i assumed that OO was always better.. then i discovered other techniques, such as functional programming, and had the realization that OO is not always better. – sean riley May 21 '09 at 23:52
  • Thanks for the clarification - that's what I imagined, but I wanted to have your precise thoughts! – Daniel Daranas May 22 '09 at 20:29
6

That gotos are harmful.

Now we use continue or break.

hWorks
  • 51
  • 4
6

I just recently found out that over a million instructions are executed in a Hello World! C++ program I wrote. I never would have expected so many for something as simple as a single cout statement.

rzrgenesys187
  • 3,246
  • 2
  • 17
  • 7
6

Don't use advanced implementation-specific features because you might want to switch implementations "sometime". I've done this time and again, and almost invariably the switch never happened.

Martin DeMello
  • 11,876
  • 7
  • 49
  • 64
6

I am a young, fledgling developer hoping to do this professionally because it's what I love, and this is a list of opinions I once held that I have learned, through my brief experience, are wrong:

The horrible mess you end up with when you don't separate user interface from logic at all is acceptable and is how everyone writes software

There's no such thing as too much complexity, or abstraction

One Class, One Responsibility - I never really had this concept; it's been very formative for me

Testing is something I don't need to do when I'm coding in my bedroom

I don't need source control because it's overkill for the projects I do

Developers do everything, we're supposed to know how to design icons and make awesome looking layouts

Dispose doesn't always need a finaliser

An exception should be thrown whenever any type of error occurs

Exceptions are for error cases, and a lot of the time it's OK to just return a value indicating failure. I've come to understand this recently, I've been saying it and still throwing exceptions for much longer

I can write an application that has no bugs at all

Crippledsmurf
  • 3,982
  • 1
  • 31
  • 50
  • Those are nice lessons, but... Which one(s) of those assumptions turned out to be incorrect? – Windows programmer Jun 08 '09 at 02:57
  • Recently, I've learned: GIT is amazing, and I thought the same thing. I'm also learning tests (other than manually testing... time consuming). One thing you might be missing-- debug using debuggers, not printing out at various execution times. (If possible). Coding to no errors, don't ever try to write a reliable program that relies on an external source. My only problem with a super-simple CMS was I relied on yahoo and f_open which hosting disabled, and yahoo changed the endpoint... – CodeJoust Oct 13 '09 at 02:19
  • If you're talking about .NET, Dispose *doesn't* always need a finalizer -- that one isn't a misconception. In fact, since SafeHandle was added in .NET 2.0, finalizers should be pretty rare. – Joe White Feb 03 '11 at 13:48
6

That we as software engineers can understand what the user really wants.

Jim Evans
  • 6,285
  • 10
  • 37
  • 60
5

That my schooling would prepare me for a job in the field.

Steven Evers
  • 16,649
  • 19
  • 79
  • 126
5

That learning the language is just learning the syntax, and the most common parts of the standard library.

Macha
  • 14,366
  • 14
  • 57
  • 69
5

That bytecode-interpreted languages (like C# or F#) are slower than those reset-button hogs that compile directly to machine code.

Well, when I started holding that belief (in the 80s), it was true. However, even in the C# era I sometimes wondered if "putting that inner loop into a .cpp file" would make my app go faster.

Luckily, no.

Sadly, I only realized that a few years ago.

Turing Complete
  • 929
  • 2
  • 12
  • 19
  • 2
    Here's another: C# is not a bytecode interpreted language. There is a "bytecode" analog in IL, but C# IL is compiled upfront to fully native code before your program starts running. – Joel Coehoorn Jul 12 '10 at 17:22
  • Thats only part of what I meant. My belief was that the JIT was far inferior to directly compiled code, which is wrong. – Turing Complete Jul 13 '10 at 07:24
5

"It's going to work this time"

Jeffrey Greenham
  • 1,382
  • 5
  • 16
  • 33
5
  • Programming Language == Compiler/Interpreter
  • Programming Language == IDE
  • Programming Language == Standard Library
Tahir Akhtar
  • 11,385
  • 7
  • 42
  • 69
5

I used to think I was a pretty good programmer. Held that position for 2 years.

When you work in a vacuum, it's easy to fill the room :-D

Zee Spencer
  • 3,460
  • 6
  • 29
  • 31
5

That the now-popular $ sign was illegal as part of a Java/JavaScript identifier.

Hannes de Jager
  • 2,903
  • 6
  • 37
  • 57
5

Thinking that I know everything about a certain language / topic in programming. Just not possible.

Dmitri Farkov
  • 9,133
  • 1
  • 29
  • 45
5

That virtual-machine architectures like Java and .NET were essentially worthless for anything except toy projects because of performance issues.

(Well, to be fair, maybe that WAS true at some point.)

JCCyC
  • 16,140
  • 11
  • 48
  • 75
  • That myth persists to this day. Counter argument: http://cplus.about.com/od/programmingchallenges/a/challenge12.htm java 0.02688359274 seconds; C# 0.166 secs; C++ 429.46 secs; http://forums.sun.com/thread.jspa?messageID=10435068#10435068 1st and 2nd are both VM's so don't tell be C++ is inherently faster, or slower. A bad craftsman blames his tools. The best violins where made before we know how to measure anything with sufficient precision to reproduce them. Aside: Bob Wilson on quantum physics: http://www.videosift.com/video/Robert-Anton-Wilson-explains-Quantum-Physics – corlettk May 23 '09 at 05:37
  • Just to nitpick, but .Net isn't a virtual machine. It's a just-in-time compiler, such that the IL is compiled to native machine code one time per deployment. – Joel Coehoorn Jun 03 '09 at 21:21
  • True, it uses a JIT, but using .NET "feels" the same as a Java-style VM design (and of course Java has a JIT too). – Qwertie Jul 08 '10 at 20:53
5

That C++ was the coolest language out there!

hasen
  • 161,647
  • 65
  • 194
  • 231
  • Of course it is. Don't you know? – jrharshath May 27 '09 at 11:52
  • Yea, I used to think so, and I even used to argue *for* it. – hasen May 27 '09 at 18:25
  • What's wrong with C++? I mean, I know there are things wrong with it, but it is pretty cool. I would argue for it. – Carson Myers Jun 02 '09 at 09:02
  • It's definetly not the coolest – hasen Jun 02 '09 at 10:20
  • Not the coolest? You can do OOP and metaprogramming in an *efficient* way! – isekaijin Aug 31 '09 at 13:23
  • It was cool... now it is sadly old and unDRY :(. Sure sure, still the best for efficient code, but not *that* cool. Metaprogramming? You mean Template Black Magic Trickery that Halts Compilers? Python is the new cool kid around... sure, it's a somewhat slow kid... but cool. Anyway, C++ is going through some surgeries to come out as the new C++0x... oops, C++1x kid. Then it will be cool again, like a 60 years old man dressing like a 15 years old teenager! – e.tadeu Feb 10 '10 at 10:28
  • 4
    -1. Template meta programming in C++ is the coolest thing there is. – Viktor Sehr May 15 '10 at 20:06
5

It's important to subscribe to many RSS feeds, read many blogs and participate in open source projects.

I realized that what is really important is that I spend more time actually coding. I have had the habit of reading and following many blogs, and while they are a rich source of information, it's really impossible to assimilate everything. It's very important to have balanced reading, and to put more emphasis on practice.

Regarding open source, I'm afraid this won't be popular. I have tried participating in open source projects, mostly in .NET. I'm appalled to see that many open source projects don't even follow a proper architecture. I saw one system in .NET that didn't use a layered architecture, with database connection code all over the place, including code-behind, and I gave up.

Shaw
  • 1,484
  • 4
  • 20
  • 31
5

That managers know what they are talking about.

Andriy Volkov
  • 18,653
  • 9
  • 68
  • 83
4

That people would care about best practices, or even consistency.

l0b0
  • 55,365
  • 30
  • 138
  • 223
4

That I need to define all the variables I'll use in my function in its beginning (Pascal style).

I used to believe I needed to think about ALL the resources to be used by my function and define them before I started coding; this is probably because my first language was Pascal, where that's the requirement. Then when I moved to C, I would define temp variables that are used only within loops outside those loops, disregarding in-loop scope, just so that "everything will be defined in the beginning".

It took me several years to understand that defining all the resources in advance is not a sacred cow, and that scoping is in itself hugely important to code readability.
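
A minimal C++ sketch of the contrast (names are illustrative): declaring things where they are used keeps their scope, and the reader's mental load, small.

#include <iostream>

int main() {
    // The old Pascal habit would be: int i; int square;  declared up front, alive for the whole function.
    for (int i = 0; i < 3; ++i) {    // i exists only inside the loop
        int square = i * i;          // square exists only inside one iteration
        std::cout << square << '\n';
    }
    // i and square are out of scope here; nothing lingers past its point of use.
    return 0;
}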

Roee Adler
  • 33,434
  • 32
  • 105
  • 133
4

I thought "duck typing" was actually "duct typing" when I first heard of it, similar to the way people often say duck tape. "Duck typing" just sounded wrong, while "duct typing" made a weird kind of sense (cobbled-together types).

Chinmay Kanchi
  • 62,729
  • 22
  • 87
  • 114
4

That programming is for juniors and that the best project managers are people who can’t program.

Kdeveloper
  • 13,679
  • 11
  • 41
  • 49
4

That you never finish the project you didn't start.

Seems really stupid but I put off so many projects because the scale was simply overwhelming. Having just finished a monster of a project I realized I never would have started had I realized the scope of it. In reality though, even the most complex system is pretty simple when broken into discrete and defined pieces. Yet looked at on the macro level it is quickly overwhelming.

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
Serapth
  • 7,122
  • 4
  • 31
  • 39
4

That procedural developers/programmers unfamiliar with SQL and relational databases don't need any formal training or understanding of how to work with SQL, and that a quick read of something like SQL For Dummies is enough to work competently with relational databases like Oracle and SQL Server.

Far too often, errors in applications dealing with data stored in a relational database like Oracle or SQL Server are caused by a lack of understanding of how to use the language of relational databases: SQL.

I used to work for a software vendor who had the mentality that all a developer needed was the SQL For Dummies book or something similar and they would be fully equipped to handle any relational database issue. Now that the clients of this vendor have databases measuring in hundreds of gigabytes, this lack of SQL knowledge is coming back around in a negative way. It's not just badly performing lookups, updates and inserts that are a problem, but the actual design of the database itself that is the real obstacle.

All of that could have been avoided, at far less cost now, if at the time the development lead had treated SQL and relational databases with the same level of respect that they gave the language they built the application with.

Don't dismiss SQL as unimportant because it WILL come back to haunt you eventually. You may be able to get away with it for a while, even years but you will eventually hit that breaking point where you can't progress without a complete re-design of your database and that is when the costs will be highest.

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
user337500
  • 1
  • 1
  • 1
4

That there is always a "right" way of doing things. I held onto this assumption for far too long after leaving university.

Of course I came to realise that there are always many ways a task can be completed. There are always advantages and disadvantages to each method. Look at the information available, decide, then make sure you can justify it to your boss.

Phil Hale
  • 3,453
  • 2
  • 36
  • 50
4

Back at the beginning of my C++ days (a lot of hair ago) I was surrounded by Java academics. When asked for an advantage of C++ over Java (typically a question I try to dismiss as contrived, but there you go), I'd include in my answer that C++ gave you references and pointers. The Java guys would look incredulous and suggest that references are pointers, and laugh me out of the room. I insisted that references and pointers are distinct in C++.

And, to be fair, I was right. References and pointers are different semantically and syntactically. Unfortunately, I backed up my claim with a fallacy: that the underlying implementation was different.

It was my firm belief that references were, by standardisation, name aliases in the syntax in the same way that a typedef is a type alias with no storage.

I was sure that references were not objects and had no storage, that they just provided multiple top-level mappings of "name" to "object". In that regard, I thought that they were like soft-links in a filesystem:

Code: int a = 3; int& b = a;

 Names          Objects           Memory

+-----+     +-------------+     +-------+
|  a  |---->|             |     |       |
+-----+     |             |     |       |
            |     int     |---->|   3   |
+-----+     |             |     |       |
|  b  |---->|             |     |       |
+-----+     +-------------+     +-------+

Of course, although optimisations may lead to this, references do have storage. They are distinct objects, even if the syntax does its best to abstract that away from the programmer.

Suffice it to say, I was disappointed to learn that a compiler with optimisations turned off may implement a reference as a pointer, requiring a dereference operation: that I was actually creating the analogy to a hard-link in a filesystem:

Code: int a = 3; int& b = a;

 Names          Objects           Memory

+-----+     +-------------+     +-------+
|  a  |---->|     int     |---->|       |
+-----+     +-------------+     |       |
                                |   3   |
+-----+     +-------------+     |       |
|  b  |---->|     int&    |---->|       |
+-----+     +-------------+     +-------+

Standard C++ doesn't actually specify how references ought to be implemented, so my theory could hold true for some toolchains, but it doesn't in any mainstream compiler... and it's certainly not stated in the standard.

Lightness Races in Orbit
  • 378,754
  • 76
  • 643
  • 1,055
4

I could spend days trying to reduce the amount of memory my business layer used, just to later realize that the WinForms (GUI) of my project used 4 times more memory than the rest of the application.

Burnsys
  • 854
  • 1
  • 8
  • 11
4

The assumption that I was to make the program 100% complete and bug-free before reporting it as "completed". Sometimes the company wants to release the program while there are still many bugs, in order to gain market share first.

nonopolarity
  • 146,324
  • 131
  • 460
  • 740
4

That after I finished CS school, I could start a job and use the knowledge I learned in school for real-world applications. (I actually wish I hadn't wasted 4 years of my life learning operating systems and Prolog.)

jDeveloper
  • 2,096
  • 2
  • 21
  • 27
  • What's sad is that (to me at least) operating systems, prolog, and similar subjects (esp. AI and 3d graphics) were fun. I probably would have chose a different career if I knew the "real world" was far more mundane. – Cybis May 20 '09 at 21:08
  • 2
    agreed. It seems like most of us get stuck doing web applications and fairly simple database work after studying some hard core C/C++ development. – jDeveloper May 21 '09 at 15:03
  • 2
    On the other hand, the reverse is just as true: "That I can build real world applications (well) without understanding the basics such as operating systems and prolog" - I find this very common amongst the bad programmers I meet... – AviD Aug 28 '09 at 10:16
4

Bitwise comparisons on integers in SQL WHERE clauses are practically free in terms of query performance.

As it happens, this is somewhat true for the first half-million rows or so. After that it turns out to be extremely UN-free.

JohnFx
  • 34,542
  • 18
  • 104
  • 162
  • 1
    UN-free == expensive? Is this a hidden political statement about the United Nations? Awesomes. – Kieveli May 21 '09 at 16:42
  • Please, which RDBMS does this apply to? I've never had a problemo in Access, Sequal, Ingres, Postgres, Informix, or MySql... though I've only (knowingly) dealt with multimillion row tables in Ingres and Informix. – corlettk May 23 '09 at 07:10
  • In my case SQL Server, but I think it would apply to any RDBMS. The trouble is that bitwise operations are not sargable and won't use indexes efficiently. The operation, however, is so fast even witha table scan I didn't notice it until it got really large. – JohnFx May 23 '09 at 15:50
  • The DB goes fast for the first half million and then slows down? – Qwertie Jul 08 '10 at 20:57
  • Just saying the performance profile resembles O(N) – JohnFx Jul 09 '10 at 16:45
4

For a long time (about 5 years) I thought that PHP rocks.

I thought that I know algorithms. And then I joined Topcoder.com

jbasko
  • 7,028
  • 1
  • 38
  • 51
4

That ASCII was stored in a different way to binary

  • What?! It is... ASCII is a character code, binary is a way of writing numbers... – Zifre May 21 '09 at 21:30
  • I meant that I thought an image and a text file were stored differently on disk: that an image was binary and text was something else. –  May 21 '09 at 21:38
  • 1
    There is a nugget of truth in this. a few filesystems, especially network filesystems, handle bytes corresponding to newlines differently depending on whether they think the file is text or non-text. In particular, some made it very difficult to fix this when it happens to be wrong. Few new technologies do this because its a terrible idea. – SingleNegationElimination May 21 '09 at 23:46
  • 2
    (Open)VMS for instance does it, so technically not entirely wrong. And the reason why C supports both file modes. – MSalters Aug 31 '09 at 15:24
  • Actually, when I started to program C, I believed that the different modes were because of DOS were retarded. Well, as 10yrs old you get all crazy sorts of ideas ;) – Frank Feb 21 '11 at 01:44
4

In the early days, most personal computers had a cassette tape interface for loading and storing programs. I did not have a computer at this time but read everything I could get my hands on (mostly magazines) that had anything to do with computers (this was the late 70's - no internet for me). For some reason I was under the impression that programs were executed directly from the cassette tape and that the only reason computers had any RAM was to store variables while the program ran. I figured that when the code had to execute a jump instruction, it would somehow rewind or advance the tape to the correct position and continue from there.

Ferruccio
  • 98,941
  • 38
  • 226
  • 299
  • I still, to this day, have not been able to adequately explain the difference between volatile memory (RAM) and non-volatile storage (hard disk etc.) to my mother. – Dan Diplo Jul 29 '09 at 09:56
  • Gotta love it... Amazing how things have changed... was that bytecode on the tapes? No high-level languages there. – CodeJoust Oct 13 '09 at 02:14
4

That everyone else is using the latest and greatest technology, while my team is the only one stuck with inferior outdated tools. (Except for the mystic cobol dinosaurs)

Erich Kitzmueller
  • 36,381
  • 5
  • 80
  • 102
4

That everyone wants to produce the best/most suitable code possible for a problem...

AwkwardCoder
  • 24,893
  • 27
  • 82
  • 152
4

That, being the owner of the code I write, I'm the only person who should understand or touch it.

Rômulo Ceccon
  • 10,081
  • 5
  • 39
  • 47
3

In school, you are taught that programming is "read input, process data, write output". In reality, there is rarely a processing step -- most coding is just "read input, write output".

Generally, it's either "read from user, write to database" or "read from database, display on screen". Those two cases cover about 95% of the work you'll ever do.

James Curran
  • 101,701
  • 37
  • 181
  • 258
3

Satisfy a customer by implementing what he wants - unfortunately this implies that a customer knows what he wants.

tanascius
  • 53,078
  • 22
  • 114
  • 136
3

The less code the better. Now I know that sometimes it's worth having more lines of code if it makes the code easier to read/understand.

Sergio
  • 8,125
  • 10
  • 46
  • 77
3

That Python was an impractical, annoying language (I can still read some comments in my early code, complaining about it) and C++ was the only true object-oriented language.

I was so wrong I still feel ashamed.

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
Stefano Borini
  • 138,652
  • 96
  • 297
  • 431
3

That other people would be as bothered by known bugs as I was, and would make fixing them a priority over project work.

MartW
  • 12,348
  • 3
  • 44
  • 68
3

That the benefit of OOP is that you get to reuse the object, when in reality it's the reuse of the rest of the code by creating a new object that has the same interface.

In reality, the object might be 2% of the code, so reusing it gets you only a 2% benefit. The real benefit is reusing the other 98% of the code by creating a new object that allows all the other code to do something completely different. Now you have reuse of 98% of the code. Well worth the 3x longer it takes to write something as an object.

E.g., if you have a drawing program and suddenly there is a new shape you want to be able to draw, you just change the ShapeObject (while keeping the interface the same). Nothing else in the program has to change.
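
A minimal C++ sketch of that drawing-program idea (Shape, Circle and Star here are invented names, not from any real codebase): the rendering loop below is the reusable 98%, and adding a new shape never touches it.

#include <iostream>
#include <memory>
#include <vector>

// The stable interface that the other "98%" of the program is written against.
struct Shape {
    virtual ~Shape() = default;
    virtual void draw() const = 0;
};

struct Circle : Shape {
    void draw() const override { std::cout << "circle\n"; }
};

// Adding a new shape means adding one class; the rendering loop in main()
// (and everything else that only knows about Shape) stays untouched.
struct Star : Shape {
    void draw() const override { std::cout << "star\n"; }
};

int main() {
    std::vector<std::unique_ptr<Shape>> scene;
    scene.push_back(std::make_unique<Circle>());
    scene.push_back(std::make_unique<Star>());
    for (const auto& s : scene)
        s->draw();  // reused code, new behaviour
}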

Clay Nichols
  • 11,848
  • 30
  • 109
  • 170
3

That I wouldn't need to rapidly refactor my object-oriented code. Martin Fowler finally opened my eyes.

Ritesh M Nayak
  • 8,001
  • 14
  • 49
  • 78
3

That I would never find a practical use in programming for the Karnaugh maps I was taught in my computer science curriculum.

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
schemathings
  • 47
  • 1
  • 5
3

That tests were just another method of procrastination.

Macha
  • 14,366
  • 14
  • 57
  • 69
3

That PHP's mysql_fetch_row was the only way to retrieve data from an executed SQL query.

Honestly - I programmed an entire web application without using mysql_fetch_array, and had to change bunches of numbers every time I wanted to change the function to add an extra column.

Julian H. Lam
  • 25,501
  • 13
  • 46
  • 73
3

I taught myself C by reading K&R. Unfortunately, I did not read it word for word and must have missed a few things. I wrote my own versions of malloc and calloc that I carried around with me from job to job, because I didn't realize you could just link in with existing libraries. I did this for several years until someone finally asked me why I was carting that stuff around, "um ... you DO realize you could just link in the existing libraries, right?"

elbillaf
  • 1,952
  • 10
  • 37
  • 73
3

Turns out it doesn't matter much whether you check the pointer a memory allocation returns under Linux: with overcommit enabled it will happily lie to you, only actually backing the memory when you first touch it, or letting your program be killed outright if the memory you were promised isn't really there.
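
Roughly what that looks like from user space; this is only a sketch, since the actual outcome depends on the kernel's overcommit policy (/proc/sys/vm/overcommit_memory) and the amount of RAM and swap, and it assumes a 64-bit build:

#include <cstdio>
#include <cstdlib>

int main() {
    // Ask for far more memory than most machines have. Under the default
    // Linux overcommit policy this may still return a non-null pointer:
    // the kernel hands out address space now and commits real pages only
    // when they are first written to.
    const std::size_t huge = 64ULL * 1024 * 1024 * 1024;  // 64 GiB
    void* p = std::malloc(huge);
    std::printf("malloc of 64 GiB returned %p\n", p);

    // Writing to all of that memory is what can later wake the OOM killer,
    // which may terminate this process (or some other one) outright.
    std::free(p);
}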

zaratustra
  • 8,148
  • 8
  • 36
  • 42
3

Since college days, I thought of myself as a master of programming, since I could code and others couldn't. But when I joined a company, I was struck by my ignorance of the basics. All my assumptions about myself turned out to be wrong! Now I know what I need to know and what I do not know!

Manohar
  • 1,270
  • 7
  • 17
  • 28
3

When at college (mid 90's) they only had Windows 3.11 machines in the computer lab (I know, weird college).

For a while I thought that only the Windows platform was relevant to me as a professional programmer and that all other platforms were only interesting from an historical academic point of view.

After graduating from school and learning about modern unixes and linux environments I couldn't help feeling angry and disappointed about my lame school.

I still cannot believe I graduated with a computer engineering degree without ever seeing a bash shell or even hearing about emacs or vim.

Sergio Acosta
  • 11,418
  • 12
  • 62
  • 91
  • That's... impressive, is about the only word I can think of. – mavnn May 21 '09 at 08:04
  • I got lucky... We had a xenix (an early Microsoft unix port to Intel) system at my TAFE college. I got to play, and one of my mates was hired back as the sys-admin... and we figured it out together. When I started work on Solaris I was streets ahead of my compatriots. Yep, a Windows only university environment is totally suckful. – corlettk May 23 '09 at 07:22
  • 1
    Who uses Unix anyway? At least that's what I thought when I was FORCED to learn ONLY Unix in uni, basically treating non-Unix environments as either toys for home (Windows) or nonexistent legacy (mainframes etc). – AviD Aug 28 '09 at 10:12
3

That it was so important to make efficient programs without wasting a byte or a CPU cycle.

But with more experience, it's not about bytes or CPU cycles; it's about your flow of thought, continuous, uninterrupted, much like a poem.

Essentially, don't try too hard.

lprsd
  • 84,407
  • 47
  • 135
  • 168
3

I always assumed that anyone writing any code for any language used an editing program.

I was working with a client of mine who had me on mostly as support and to write some of the more complex things for him. Well one day he messed up a file, big time. He accidentally saved over three hours' worth of his own work, and when I asked him why he didn't save more often he replied with, "because I wasn't done". Naturally, this was not an acceptable answer, and I poked and prodded a little further. I eventually came to find out that he had never used any editing program, EVER! Not even notepad.exe! He had been using an online CPanel editor for files! It didn't even have a 'Find' function. He couldn't ever save until he was done because he was editing the live file on the site!

Needless to say I was flabbergasted, and he's still using the CPanel editor to this day...

  • 1
    Cpanel's editor! Cpanel is a good management tool, but seriously... I only use that for on-the-road patches... Never trust a remote server; sometimes I just copy a long comment to the clipboard so I don't have to worry if it doesn't post... (too many things online, like killing sessions when you have a good, long comment or post etc.) – CodeJoust Oct 13 '09 at 02:21
  • For quick patches, I admit I have done that. – Macha Apr 03 '10 at 20:55
3

Learning regular expressions will save you time

rjdevereux
  • 1,842
  • 2
  • 21
  • 35
  • 1
    really? They haven't saved you time? They save me a ton of work daily. – Demi May 22 '09 at 17:59
  • 1
    LOL. There's only two types of regex /complicated/ && /far king complicated/. – corlettk May 23 '09 at 05:17
  • 3
    LOL @ that, this reminds me of the quote: Some people, when confronted with a problem, think "I know, I'll use regular expressions." Now they have two problems. Thanks Jeff :D – Leo Jweda Jun 07 '09 at 06:04
  • They will, if you don't overuse them. Knowing the basics saved me a ton of time! (Try doing 4 chained str_replace's in a row...). – CodeJoust Oct 13 '09 at 02:15
3

My longest held (and therefore most costly) incorrect assumption was: "The business's requirements are sane and reasonable, I'm just not understanding them yet."

100 green assumptions sitting on the wall,
and if one green assumption should accidentally fall,
there'd be 99 green assumptions sitting on the wall.

Alternately:

Humpty Dumpty sat on the wall.
Humpty Dumpty had a great fall,
and all the king's horses and all the king's men,
said Effim, he's only a tech.

corlettk
  • 13,288
  • 7
  • 38
  • 52
3

That, by learning an exact science, I wouldn't need to improve my limited social skills.

Rômulo Ceccon
  • 10,081
  • 5
  • 39
  • 47
3

That the evaluation order of the operands of && in C/C++ was compiler-specific. So writing:

if ( ( pointer != NULL ) && ( pointer->doSomething() ) )

was unsafe because the evaluation order could be swapped. I found out recently (after many years of spouting that lie) that it's part of the ANSI C specification: && guarantees left-to-right evaluation, with a sequence point after the first operand, so the idiom is perfectly safe.
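
A minimal sketch of that guarantee (Widget is a made-up type for illustration): the left operand of && is evaluated first, and the right operand is skipped entirely when the left one is false, so the null check reliably protects the call.

#include <cstdio>

struct Widget {
    bool doSomething() { std::puts("doSomething() ran"); return true; }
};

int main() {
    Widget w;
    Widget* valid = &w;
    Widget* missing = nullptr;

    // Left-to-right, with a sequence point after the first operand.
    if (valid != nullptr && valid->doSomething())
        std::puts("valid branch taken");

    // The first operand is false, so doSomething() is never called and
    // the null pointer is never dereferenced.
    if (missing != nullptr && missing->doSomething())
        std::puts("this line is never printed");
}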

James

  • http://stackoverflow.com/questions/888224/what-is-your-longest-held-programming-assumption-that-turned-out-to-be-incorrect/888259#888259 – Michael Myers Jun 10 '09 at 15:40
  • mmyers, you mentioned exactly this question, which this answerer's answer answered almost perfectly. Did you forget to add something else? – Windows programmer Jun 11 '09 at 23:42
  • Meanwhile, the evaluation of most expressions, including expressions in if statements, can often be compiler-specific. James Norris's if expression contains three operators. Two of the three do not impose any ordering. – Windows programmer Jun 11 '09 at 23:44
  • Yes I didn't spot the previous point, thanks! If you look under conditional operators in the ANSI C specification: http://std.dkuug.dk/JTC1/SC22/WG14/www/docs/n843.pdf 6.5.13 Logical AND operator ... Unlike the bitwise binary & operator, the && operator guarantees left-to-right evaluation; there is a sequence point after the evaluation of the first operand. If the first operand compares equal to 0, the second operand is not evaluated. Further: I find it difficult to believe that languages like C++/Java built after the C spec do not follow this rule too. –  Jun 12 '09 at 08:54
  • I don't know of any language that doesn't use short-circuit and/or logic. Well, okay, VB. But it seems like I rely on short-circuit and/or logic just about every day. Hard to imagine what my code would look like if I didn't know this basic principle. – Qwertie Jul 08 '10 at 21:02
3

That I would ever become wealthy programming software for someone else

slf
  • 22,595
  • 11
  • 77
  • 101
2

I never thought I would be a professional programmer, I thought I would be working with electronics. But in the end, programming is so much easier and pays so much better that what started as a side job became my main thing.

Otávio Décio
  • 73,752
  • 17
  • 161
  • 228
2

My biggest preconception was that I would be allowed program the way I wanted to. Then of course I left university and got employed by a company that had ridiculous frameworks, rules and procedures in place that not only meant I wasn't programming the way I wanted to, but meant I was programming badly.

Barry Gallagher
  • 6,156
  • 4
  • 26
  • 30
2
  • I thought I'd be coding for 8 hours straight. Realistically, I get 4 hours a day of coding, 1 hour for lunch, 1 for coffee breaks, and 2 for screwing around / chit chatting/ stack over and under flowing.

  • Prior to working, I thought that all clients would be idiots who didn't know two craps about computers. Boy was I wrong on that one. Sometimes we get projects from people who could do them better than we can; they just don't have the time to do it.

  • I thought cubicles were bad, Right now I love them :D I actually moved from a door-ed office to a cubicle. I like the openness.

  • All programmers are not athletic. I thought that I was the only one that goes to the gym. Where I work, at least 10 of us go to the gym every day at 5 am.

  • I thought there would be no women programmers. A couple of our leads are ladies.

dassouki
  • 6,286
  • 7
  • 51
  • 81
2

That Java passes copies of objects to functions, not references.

In other words, I thought that if you pass an object into a method, then change the object in some way, it doesn't change the object in the calling scope. I always passed objects into methods, manipulated them, then returned them!

DisgruntledGoat
  • 70,219
  • 68
  • 205
  • 290
2

I always believed that to be a good programmer one has to know all the inner workings of the system. I was ashamed of the fact that I didn't know everything there is to know about the language (its libraries, patterns, snippets) before starting to code. Well, I am not so naive anymore.

lune
  • 185
  • 2
  • 6
2

That I could convince traditional procedural programmers of why OOP oft-times provides a better solution.

That is, a language that describes the world needs the ability to describe complex objects and their relationships.

Arguments usually included nonsense about abstract classes, which I responded to with "not all OOP programmers are fresh out of Uni and still obsessed with abstracts". Or the classic, "there's nothing you could do in OOP that I couldn't do with strictly procedural programming", which I usually replied to with, "It's not that you could, it's whether you would if you had a more extensive toolset".

I've learned to just accept that they don't see the world through the same lens I do.

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
Evan Plaice
  • 13,944
  • 6
  • 76
  • 94
  • Traditional procedural programmers have a different view of life. To them, a computer uses a program to process data. Input->Program->Output. Entangling data with procedures adds no value. In other words, in the mindset of a traditional programmer, the program is not even trying to describe complex objects and their relationships. It is not making a model of anything. It's using algorithms that read input and write output. – Erich Kitzmueller Aug 11 '10 at 11:26
2

That simplicity almost always beats complexity. KISS - Keep It Simple Stupid rules.

Edit: As Georg states below I got this one reversed. My mind must have gotten lost in the replies. Simplicity almost always makes your code better if used correctly.

mwgriffith
  • 550
  • 3
  • 6
  • 3
    You might have misread the question. According to the title it sounds like the belief in simplicity turned out to be incorrect? – Georg Fritzsche Jul 12 '10 at 16:26
  • I'd actually agree with that. The best and fastest software in the world is incredibly complex, and it got there for a reason. – Andres Jaan Tack Jul 12 '10 at 18:35
  • Sorry, I must have lost the question after reading too many replies. But you're right, I should have said that complexity was better than simplicity. Meaning that simplicity is usually the best way to go when programming. It's easier to maintain, easier to debug, and it occasionally even runs faster. – mwgriffith Jul 12 '10 at 19:02
2

That... who needs JUnit testing when breakpoints are effective? (when testing applications in debug mode). I realised later why that was wrong....

Buhake Sindi
  • 87,898
  • 29
  • 167
  • 228
  • My clients wanted a documented report of failed tests. Just saying "I've tested this & it works" scared the hell out of them. – Buhake Sindi Jul 13 '10 at 02:01
2

That temporary solutions are not permanent solutions,
or in other words: workarounds are not forever :)).

  • That's what you say! I don't wanna know how many of my workarounds are still floating around... – Bobby Aug 11 '10 at 11:41
  • well yeah, that's why what I said is wrong, isn't it? My point is that workarounds are forever; the world is just not perfect at all... –  Aug 11 '10 at 13:20
2

That all OOP languages have the same concept of object orientation.

  • A Java interface != a method's interface.
  • A Java interface is a language-specific solution for the need to have multiple inheritance. Ruby's mixins attempt to solve the same problem.
  • Inheritance provided out of the box in Javascript is very different from how Java implements inheritance.
Alan
  • 7,066
  • 5
  • 30
  • 38
2

If you can't read my code, you just don't know the language. I had a few code reviews where I tried to counter with that.

Took a couple more years to learn there's a time and place to be magical with your code and it is in the libraries, not the application. The app is for clarity and readability. Magic is best used when hidden behind extension methods and frameworks.

Ball
  • 2,591
  • 3
  • 19
  • 26
  • 1
    In fact, you should never be magical. It's simple to write code to do what you want. I can imagine 6-7 ways of doing the same thing. Only a couple of those are easy for others, or yourself in 6 months, to read. That's the real challenge. That's the real goal of programming - to make it easy for other humans to read. Even in a library, other people will need to extend or modify it. Always keep it readable. – Kieveli May 21 '09 at 16:45
  • Hmm- I think there are often some trade-offs here. Sometimes a small bit of magic at a lower layer can make a lot of code at a higher layer both smaller and more readable. It's not that you should never be magical- but you should be judicious in your application of magic. – Tagore Smith Dec 06 '10 at 04:54
2

That I grok programming. By studying the SICP book I saw that I knew nothing. At least now I am delving into programming more.

Nick Dandoulakis
  • 42,588
  • 16
  • 104
  • 136
2

The assumption that if I write code really well and as bug-free as possible, that's the best thing I can do. Turns out sometimes managers prefer people who try to become their favorite instead of doing nice work.

nonopolarity
  • 146,324
  • 131
  • 460
  • 740
  • 1
    A good programmer is both the favorite and codes really well! :D – Nicolas Dorier May 21 '09 at 15:38
  • 1
    Slashene: Great comment :-). But obviously, people who try to please their manager are not the ones who are the most serious about their work (are you? :-) )... And most of the time, when trying to do better work (with fewer bugs), you take more time to do it: something your manager will always disagree with (even when you know you HAVE to do it). – yves Baumes May 21 '09 at 23:08
  • What if your manager makes weird noises all the time and is really relaxed with all his friends who also work at the company, but is really strict and has the highest expectations of you? His friends don't have to respond to you since they know they cannot get fired, while on the other hand your manager will call you on your cell phone when he has a question, expecting an immediate response, and will even yell at you because he thinks he pays you and can yell at you. – nonopolarity May 22 '09 at 21:04
2

That programming elegance combined with automation was an adequate substitute for good old-fashioned testing.

ep4169
  • 2,345
  • 2
  • 17
  • 20
2

I used to think that I would never program like a top-tier developer (like the MS developers), but now I think I can write code that is just as clean, or even better.

Fred Yang
  • 2,521
  • 3
  • 21
  • 29
  • Go and have a look at the example code in the WDK (Windows Driver Kit), most of it is part of the Windows build and to my eyes pretty horrible. – Tony Edgecombe May 21 '09 at 07:40
2

That somehow a company that runs a large number of fairly high-profile/high-traffic websites actually knew what the heck they were doing. It turned out they were for the most part clueless and extremely lucky to be in the position that they were in. So I guess the moral would be,

solid software engineering && best practices != business success

or....

most critical software systems == crap

Tom Willis
  • 5,250
  • 23
  • 34
  • The instance doesn't always represent the whole. I guess the company concerned must be very very lucky to be in their current position... that or they're actually a front for a US bank. – corlettk May 23 '09 at 08:26
2

This is embarrassing, but for the longest time I had believed it was more memory efficient to nest my method calls, or make multiple method calls, than to create a variable to store the value for each method call in C#.

Alexander Kahoun
  • 2,458
  • 24
  • 36
  • You mean, more efficient not to store intermediate results in temporary variables? In .NET temporary variables do have a tiny bit of overhead compared to intermediate values, but the compiler will often create temporaries without you asking for them anyway, which you'll often see if you disassemble to CIL. You generally don't have to create "an object" to store the result of a method; I assume you mean "variable". – Qwertie Jul 09 '10 at 16:27
  • @Qwertie: Thanks. I updated the answer to read more clearly. – Alexander Kahoun Jul 12 '10 at 15:49
  • Doesn't the compiler optimize this? –  Feb 13 '11 at 22:28
2

Not longest-held, but at some point and for several years I:

  • Thought Microsoft Windows was the only Operating System in the world ( it was 1992 )
  • Knowing DOS was more than enough to have "advanced" OS knowledge.

That's why I didn't choose "computer course" in high school. I thought that I knew already enough about computers.

Later at university and out of my mistake:

  • I thought that UNIX OSes/programs were perfect and DOS/Windows would never come anywhere close to them (back then it looked so true; I guess Linus et al. thought the same, and that's why Linux is sooo similar to UNIX and not... well, other OSs)

Finally and for a long time, I thought that:

  • Only my software sucks and commercial software was flawless, because... it was "COMMERCIAL" software
  • USA software/engineers/products were synonymous with excellence and anything from outside was just a poor attempt.
OscarRyz
  • 196,001
  • 113
  • 385
  • 569
2

I thought Windows 3.1 was only a platform to play Solitaire on. And DOS was a platform for BASICA.

henry
  • 969
  • 2
  • 11
  • 21
2

Error handling is unnecessary when you have tested your code thoroughly.

too much php
  • 88,666
  • 34
  • 128
  • 138
2

That there is never enough time to finish it before the deadline.

Lukas Šalkauskas
  • 14,191
  • 20
  • 61
  • 77
2

That a WTF is always evidence of a bad professional.

In fact I've been realizing recently how many WTFs I committed myself throughout my career, but I was comforted when Stack Overflow showed me they are just another software metric.

Community
  • 1
  • 1
Rômulo Ceccon
  • 10,081
  • 5
  • 39
  • 47
2

That variables are actually just names for specific areas in memory.

Gumbo
  • 643,351
  • 109
  • 780
  • 844
2

That creating a successful application can easily be done by programmers alone. Software is also about ease of use, good looks, documentation and proper marketing. Software development is multidisciplinary, and failing at one discipline will probably doom the application.

Tarscher
  • 1,923
  • 1
  • 22
  • 45
2

That a language suitable for systems programming must support [mutable] variables.

james woodyatt
  • 2,170
  • 17
  • 17
2

A common poor assumption: "Quality of code is secondary." An even poorer assumption: "Quality of code is not important at all."

Quality of code can be a very broad concept. I discussed it quite thoroughly here.

Daniel Ribeiro
  • 3,110
  • 3
  • 23
  • 49
2

That the more lines of code, the better the software would be.

  • 1
    Wow, that's one you definitely don't want. I spend a lot of time cleaning up code. The less lines the better. (and clearer syntax). – CodeJoust Oct 13 '09 at 02:22
2

That you could memset( this, 0, sizeof(TheObject) ) a C++ object in its constructor with no negative consequences

bobobobo
  • 64,917
  • 62
  • 258
  • 363
  • You'll zero out the vtable! If there's a vtable, I think it can only work if there is a derived class (which overwrites the vtable pointer when its constructor starts). – Qwertie Jul 09 '10 at 16:17
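
A sketch of that failure mode and the safer alternative (TheObject's members here are invented for illustration): zero the data members explicitly instead of memset-ing the whole object, because in a polymorphic class the blanket memset also wipes the hidden vtable pointer, as the comment above notes.

#include <cstring>
#include <iostream>

struct TheObject {
    int count;
    double total;

    // Safe: initialize the data members explicitly.
    TheObject() : count(0), total(0.0) {}

    // Unsafe in a class with virtual functions: this would also zero the
    // hidden vtable pointer set up just before the constructor body runs,
    // so later virtual calls can crash (unless a derived class constructor
    // happens to overwrite the pointer again afterwards).
    // TheObject() { std::memset(this, 0, sizeof(TheObject)); }

    virtual void describe() const { std::cout << count << ' ' << total << '\n'; }
    virtual ~TheObject() {}
};

int main() {
    TheObject obj;
    obj.describe();  // prints "0 0"; the vtable pointer is intact
}
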
2

That marketing guys care about what you do.

crauscher
  • 6,528
  • 14
  • 59
  • 85
  • 1
    Actually, that marketing guys UNDERSTAND what is possible and what isn't, so they don't try to sell the solution to famine everywhere in the world. – isekaijin Aug 31 '09 at 13:18
2

That you needed a client specification to complete a project. More often than not you start with a sales meeting and a notepad. Of course, at the end of the meeting they would like a deadline: "just ballpark it".

threadhack
  • 127
  • 4
  • 6
1

I assumed it was going to be a rollercoaster ride of fast cars, loose women, private jets and daring escapades. Just wait until I get my hands on that career advisor....

pauljwilliams
  • 19,079
  • 3
  • 51
  • 79
1

The specs are complete and sufficient

SeanX
  • 1,851
  • 20
  • 28
1

That an HTML element's id and name attributes were interchangeable.

It turns out that 'name' attributes are what get used/referenced for form POSTs etc., while 'id' attributes are used for DOM references.

Mark Redman
  • 24,079
  • 20
  • 92
  • 147
1

thread = process

Radi
  • 6,548
  • 18
  • 63
  • 91
1

That software engineers are always honest about what they are doing now or have done to your software in the past.

u07ch
  • 13,324
  • 5
  • 42
  • 48
1

That 640K should be enough for anybody (DOS). That was widely believed by a lot of people for a number of years.

The first time I had a system with 8MB of RAM, I thought that was far more than I needed. That ran the OS (Mac) plus all the applications I was using (Word, Email, Firefox, etc).

Brent Baisley
  • 962
  • 1
  • 6
  • 4
  • 5
    You ran firefox on an 8MB machine? What decade was this, and how did you get a hold of such an early copy ;) (intended sarcarm) – Evert May 20 '09 at 15:53
  • How is this assumption programming related? Did you use Word, Email (is that an actual application?) and Firefox to program? – bzlm May 20 '09 at 16:25
  • His statement was to memory usage from programming... while is examples were not. I don't see why this was down voted. – Matthew Whited May 20 '09 at 19:36
  • Dude, there wasn't a firefox back then. and word was probably notepad, lol. – hasen May 23 '09 at 04:05
  • 1
    You're right, it was Mosaic (NCSA?). I actually meant to say FoxBase, not Firefox. And there was a program called "Mail", which Microsoft bought. They also bought Fox Software, makers of Foxbase. – Brent Baisley May 23 '09 at 23:25
  • 2
    @Brent Baisley: then why don't you edit your answer? – Cristian Ciupitu Jun 25 '10 at 16:29
1

That threads in Windows are cheap.

Turns out this is only somewhat true. A thread has a certain amount of overhead and needs its own stack (around a megabyte of reserved address space by default on Windows) plus kernel bookkeeping. So if I find myself dealing with dozens of threads within a single application, I ask myself how I can simplify and consolidate everything into fewer threads.

Steve Wortham
  • 21,740
  • 5
  • 68
  • 90
1

That everything I wrote would fail at some point in the foreseeable future.

Not that everything won't eventually fall apart, but early on in my programming education, when I found try..catch blocks...I wrapped EVERYTHING in them....things that, if they failed, would have represented much bigger problems than my programs would be handling (e.g., the north and south pole have switched places)

Pete H.
  • 1,427
  • 1
  • 12
  • 16
  • My favourite bug: Apparently, the first time an F111 flew over the equator in "terrain following mode" (500ft above the ocean at about mach 1) it turned itself over... that's the only way the software could make sense of "left" and "right" at a negative latitude. Oops! – corlettk May 23 '09 at 07:01
1

That learning a whole new language would be really really hard.

too much php
  • 88,666
  • 34
  • 128
  • 138
1

That run-time performance mattered. Total solution time is what matters, often.

Since learning python, I have weaned myself from my attachment to static typing.

Andrej Panjkov
  • 1,448
  • 2
  • 13
  • 17
  • 1
    I have tried Python before, but, believe it or not, I write more bugs in Python than C++ (and I don't have a whole lot of C++ experience). Static typing is just so much more productive. – Zifre May 21 '09 at 21:33
  • @Zifre: there is some truth there, but it also matters how quick you can fix them and how fast you can write the whole program. I had my share of bugs caused by dynamic typing, but since they were easy to fix they didn't bother me too much. – Cristian Ciupitu Jun 25 '10 at 16:36
1

I did not know that a nonzero number divided by 0 in JavaScript is Infinity (IEEE 754 arithmetic; 0/0 is NaN). Learnt it the hard way recently.

Chetan S
  • 23,637
  • 2
  • 63
  • 78
1

That profiling and performance analysis were the same thing.

Then I found out that profilers, while better than nothing, contain faulty assumptions, such as:

  • only aggregates matter, not details
  • statistical precision is necessary in locating performance problems
  • measuring time, and locating unnecessary time-consuming operations, are the same thing
Mike Dunlavey
  • 40,059
  • 14
  • 91
  • 135
  • A profiler is a generic solution which was only ever intended to put you in "the ball park". Don't bother optimising code which the profile doesn't prove is a performance bottle-neck. I agree that this can be misleading. Once upon a time I found myself optimising an equals method, which was called literally trillions of times... until I said to myself "Hang on, millions yes, trillions no. Why is equals called trillions of times?" The moral of the story is that a profiler isn't a replacement for an IQ. Cheers. Keith. – corlettk May 23 '09 at 08:30
  • @corlettk: What I do now is wait until the program is being slow, and then take several samples of the call stack, using the "pause" button. Then I look for call sites that appear on multiple samples. Any such call site is a spot that, if I can optimize it, will speed up my program substantially. This flies in the face of all accepted wisdom about profiling. – Mike Dunlavey May 23 '09 at 13:47
1

That an identity column cannot contain duplicate values: identity column in SQL Server

Community
  • 1
  • 1
Justin Ethier
  • 131,333
  • 52
  • 229
  • 284
1

That because I built the software on my 'standard' environment it would work on everyone's machine/server. Only to discover that I had installed some obscure libraries and services that actually were being used. And then to discover that I had leveraged a bug that was subsequently patched.

Bluephlame
  • 3,901
  • 4
  • 35
  • 51
1

You can't diagnose 'intermittent errors' in production. Rebooting the server is the only way to fix it.

Maybe it was MORE true in my early days of ASP coding. But there are a lot of good profiling tools to find memory leaks and other weird issues. Perfmon also provides lots of good diagnostic data. Plus you should be coding diagnostic logging into your application.

russau
  • 8,928
  • 6
  • 39
  • 49
1

That I knew how to write a proper web application and had it all clear; when I had to design stuff that works in all the browsers, it screwed me.

Pranali Desai
  • 974
  • 1
  • 7
  • 22
1

That understanding pointers and recursion would be freakin' hard.

That Integers in VB6 have a different size than in .NET.

That VB6 could do bit-level operations.

Professional programmers make bug-less software.

Broken_Window
  • 2,037
  • 3
  • 21
  • 47
1

That OOP was obsolete :( I still regret thinking that till this very day.

Leo Jweda
  • 2,481
  • 3
  • 23
  • 34
1

If I have a powerful static type system like the one in ML or Haskell, I should use it to encode as many invariants as possible. Only with experience did I learn that sometimes it's better to let the invariants be dynamic.

Norman Ramsey
  • 198,648
  • 61
  • 360
  • 533
1

That full Unicode support was a prerequisite for successfully deploying software to Asian regions.

1

I thought writing good-enough software was an easy task

Upul Bandara
  • 5,973
  • 4
  • 37
  • 60
1

That our development methods were chosen and used because they were the best of breed.

Then I figured out that the tools we use had a much greater impact on what we did, when we did it, and how we did it than what I thought.

Karen Lopez
  • 483
  • 3
  • 5
1

That people actually cared about the technologies being used (open source/ closed source).

Ritesh M Nayak
  • 8,001
  • 14
  • 49
  • 78
1

In the early eighties when I started playing around with computers (ZX81 with 1K of memory), I used to spend hours typing in reams of machine code (bytes, not human-readable assembly language) for games from magazines, essentially using BASIC POKE instructions.

I believed that if I ever entered a single instruction incorrectly then I'd have to go back to the beginning and start entering the machine code again from the start.

Damian
  • 4,723
  • 2
  • 32
  • 53
  • Ow. I've never had that one (other than writing classes in interactive consoles for fun... just because macs have ruby installed out of the box). – CodeJoust Oct 13 '09 at 02:24
0

A program can eventually have all of its problems ironed out.

HalliHax
  • 816
  • 2
  • 11
  • 26
0

that:

for (int i = 0; i < myObj.variable; i = i + 1)

gets optimized to:

int j = myObj.variable; 
for (int i = 0; i < j; i = i + 1)

Wow, I stopped putting function calls in the loop condition (in place of j) when I realized that they were being run EVERY time!

The reason the compiler can't do that rewrite for you:

for (int i = 0; i < myObj.variable; i = i + 1){ 
    if (function_argument == NULL){ 
        myObj.variable++; 
    } else { 
        printf("%d", myObj.variable);
    }
}

is not the same as:

int j = myObj.variable;
for (int i = 0; i < j; i = i + 1){ 
    if (function_argument == NULL){ 
        myObj.variable++; 
    } else { 
        printf("%d", myObj.variable);
    }
}

An arbitrary example, but you can see how the "optimization" would change execution: the loop body may modify myObj.variable, so the condition has to re-read it on every iteration.

chacham15
  • 13,719
  • 26
  • 104
  • 207
0

Of course you could look at FindBugs and PMD but these are my favorite gotchas and tricks (all Java):

Fields are not overridden, they are shadowed.

There is no explicit super.super access.

Classes with no constructors defined have an implicit zero-argument constructor. I made a practical error related to this one this year.

To get a reference to an inner class's enclosing instance you can use the syntax "Outer.this", for example to disambiguate method calls or to synchronize on it.

Classes are "friends of themselves" in C++ terms, private methods and fields of any instance of that class can be referenced from any method of the same class, even static methods. This would have made some of my early clone() and copy constructors much simpler.

Protected methods and fields are accessible in a static context of extending classes, but only if that class is in the same package. I'm glad that flex.messaging.io.amf isn't a sealed package.

Karl the Pagan
  • 1,944
  • 17
  • 17
0

That salesmen manage customer expectations realistically. (Trained in under-promising and over-delivering)

That software requirements generally come from market research.

Maltrap
  • 2,620
  • 1
  • 33
  • 32
0

He said he knew programming, so it must be true!

Alfre2
  • 2,039
  • 3
  • 19
  • 22
0

That dimension n is an instance of dimension (n+1) when they're equivalent.

Larsson
  • 21
  • 1
0

Thinking I was the only person who makes a piece of code... then, when I need that routine again, can't remember what I did and simply copy/paste my own code.

Now, I know that everybody does that.

Broken_Window
  • 2,037
  • 3
  • 21
  • 47
0

When I was learning algorithms in junior middle school, I thought of NPC as just meaning non-polynomial problems, i.e. that the complexity of such a problem was no simpler than polynomial. I didn't recognize I was wrong until I learned computational theory in college -_-b

ZelluX
  • 69,107
  • 19
  • 71
  • 104
-1

@Kyralessa: It's worth noting that on most processors, in assembly/machine language, it's possible for functions to return someplace other than their caller while leaving the stack in good condition. Indeed, there are a variety of situations where this can be useful. One variation I first saw on the 6502, though it works even better on the Z80, was a print-message routine where the text to be printed immediately followed the call instruction; execution would resume after the zero terminator (or, as a slight optimization when using the Z80, at the zero terminator, since letting the zero byte be executed as a NOP would be cheaper than trying to avoid it).

It's also interesting to note that in many modern languages, functions have one normal exit point (which will resume execution following the call) but can also exit by throwing an exception. Even in C, one can use setjmp/longjmp to simulate such behavior.

supercat
  • 77,689
  • 9
  • 166
  • 211
  • 3
    Who are you speaking to, exactly? If you had made this a comment under @Kyralessa's answer, rather than a separate answer, we might have been able to follow the thread a little better. – Robert Harvey Dec 19 '10 at 19:11
  • @Robert Harvey: My bad. I accidentally hit "answer" rather than "comment". I'll try to find the proper spot for posting. – supercat Dec 20 '10 at 01:04
-2

Java is slow. So many Perl fanbois on Slashdot regurgitate this; it's sad.

mP.
  • 18,002
  • 10
  • 71
  • 105
-2

That I would be programming in VB forever; I am now doing C#.