
According to this (the "Orders of Growth" section of SICP):

Let n be a parameter that measures the size of the problem, and let R(n) be the amount of resources the process requires for a problem of size n. In our previous examples we took n to be the number for which a given function is to be computed, but there are other possibilities. For instance, if our goal is to compute an approximation to the square root of a number, we might take n to be the number of digits accuracy required. For matrix multiplication we might take n to be the number of rows in the matrices. In general there are a number of properties of the problem with respect to which it will be desirable to analyze a given process. Similarly, R(n) might measure the number of internal storage registers used, the number of elementary machine operations performed, and so on. In computers that do only a fixed number of operations at a time, the time required will be proportional to the number of elementary machine operations performed.

and

... For instance, if we count process steps as “machine operations” we are making the assumption that the number of machine operations needed to perform, say, a multiplication is independent of the size of the numbers to be multiplied, which is false if the numbers are sufficiently large. Similar remarks hold for the estimates of space. Like the design and description of a process, the analysis of a process can be carried out at various levels of abstraction.

I usually try to count the number of steps by counting how many times each individual operation is executed and totalling them all (e.g. 1 if statement, n multiplications, and n additions gives 2n + 1 operations, which simplifies to O(n), where n is the input size; see the sketch after the questions below). This book makes it appear that there are different things to count when counting the number of steps. That is, what steps to count might vary. My questions are:

  • Other than registers and elementary machine operations, are there other examples of things that can be counted when counting the number of steps?
  • I believe I am counting elementary machine operations when determining order of growth. Are there example programs where the steps we're counting should not be elementary machine operations (e.g. registers)?
  • This statement is confusing: "In computers that do only a fixed number of operations at a time, the time required will be proportional to the number of elementary machine operations performed." Does this talk of all computers in general? Are there computers that do different numbers of operations at a time?
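
For reference, here is a minimal Python sketch of the kind of counting I described above (the function and the `ops` counter are made up purely for illustration):

    # counting "steps" by hand: one comparison, then one multiplication
    # and one addition per element: the 2n + 1 bookkeeping from above
    def dot_product(xs, ys):
        ops = 0                     # manual step counter
        if len(xs) != len(ys):      # 1 comparison
            raise ValueError("length mismatch")
        ops += 1
        total = 0
        for x, y in zip(xs, ys):
            total += x * y          # 1 multiplication + 1 addition
            ops += 2
        return total, ops           # ops == 2n + 1, which is O(n)

    print(dot_product([1, 2, 3], [4, 5, 6]))   # -> (32, 7)
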
lightning_missile
  • If you insist on counting each individual machine operation, you might as well count the CPU cycles they take. That leads to complications which are not even worth discussing in detail (optimizations, CPU architecture, caching, etc.): too many variables to consider. Bottom line: if you see a group of instructions which do not depend on any size parameter such as `n`, just take them to be O(1). Besides, no amount of theoretical musing will ever be more robust than real-world benchmarks. – meowgoesthedog Apr 14 '18 at 21:11

1 Answer


It is up to you what you count.

What you count defines the "model" you're using in your analysis, e.g. the RAM model.

With a model you get some predictions about your program's behaviour. Whether these predictions are any good (something we can actually measure, by running our programs at different problem sizes) tells us whether the execution model we used is any good: whether it corresponds well to what is actually going on inside our computer, and whether it is a useful one.
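
For instance, here is a minimal sketch (the `measure` helper is made up for illustration) that checks one such prediction: under the RAM model, summing n numbers should take time proportional to n, so doubling n should roughly double the measured time:

    import time

    def measure(f, n, repeats=5):
        """Best-of-repeats wall-clock time of f on a problem of size n."""
        xs = list(range(n))
        best = float("inf")
        for _ in range(repeats):
            start = time.perf_counter()
            f(xs)
            best = min(best, time.perf_counter() - start)
        return best

    for n in (10**5, 2 * 10**5, 4 * 10**5):
        print(n, measure(sum, n))
    # if successive times roughly double as n doubles, the linear-time
    # prediction (and thus the model behind it) is serving us well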

Analyzing empirical run-time behavior can be done by plotting the run times versus the problem sizes on a log-log plot. We'd get some curve; any curve can be approximated by a sequence of straight lines; and a straight line on a log-log plot expresses a power law of growth:

      t ~ n^a     <==>     log(t) ~ a·log(n).

Empirical orders of growth for the win!
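
In the simplest case this doesn't even need a plot: the exponent is just the slope between two measured points on the log-log scale (a sketch, assuming timings t1, t2 taken at sizes n1, n2, e.g. with the `measure` helper above):

    from math import log

    def order_of_growth(n1, t1, n2, t2):
        # t ~ n^a  means  a = log(t2/t1) / log(n2/n1)
        return log(t2 / t1) / log(n2 / n1)

    # e.g. quadrupling n multiplied the time by ~16: a ≈ 2, quadratic growth
    print(order_of_growth(1000, 0.01, 4000, 0.16))   # -> 2.0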

cf. this Q&A entry of mine on this site. Also, great examples of log-log plots in answers by Todd Lehman here.


Update: as to your last bullet point, parallel computers could indeed be doing different numbers of operations at a time, depending on the available resources and/or on the stage of the computation currently being executed. And, as the quoted material says, multiplying some numbers early on in the process (say), while they are still small enough, takes O(1) machine operations; not so when they become bigger.
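
This is easy to observe with arbitrary-precision integers, e.g. in Python (a quick illustrative sketch; the digit counts are arbitrary):

    import time

    def mul_time(digits, repeats=100):
        a = 10 ** digits - 1        # a number with `digits` decimal digits
        b = 10 ** digits - 3
        start = time.perf_counter()
        for _ in range(repeats):
            a * b
        return (time.perf_counter() - start) / repeats

    for d in (10, 1_000, 100_000):
        print(d, mul_time(d))
    # the time per multiplication grows with the number of digits: counting
    # each multiplication as one "step" is only valid while the numbers are small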

Also, modern CPUs can (and often will) perform several operations at once, depending on the specific instructions currently in the pipeline(*) (some combinations are more amenable to this than others).


(*) Wikipedia says: "Instruction pipelining is a technique for implementing instruction-level parallelism within a single processor."

Will Ness