
I have already resolved several similar bottlenecks using `Application.Index` with `Application.WorksheetFunction.Match`, reducing the run time from about 7-8 seconds to milliseconds. But I feel there is still room for improvement.

Should I use an array with Index and Match?

I was also told to use `Scripting.Dictionary`, but I am looking for someone who can demonstrate the right way to use it in this scenario, because in my head I have to populate the dictionary with a loop before I can even use it, so won't it be similar in terms of speed?
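For reference, the populate-once idea can be sketched as follows (a hedged sketch only: sheet names and the column layout are taken from the code later in this question, `Total_rows_Dash` is assumed to be set already, and `someKey`/`targetRow` are hypothetical names). The dictionary does need one loop to populate, but that loop runs exactly once; every lookup afterwards is a hash lookup rather than a fresh scan of column A, which is what each `Match` call performs.

```vba
' Sketch only - sheet/column layout assumed from the code below.
Dim dict As Object, dashKeys As Variant, r As Long
Set dict = CreateObject("Scripting.Dictionary")

' One O(n) pass to populate the lookup table...
dashKeys = ThisWorkbook.Worksheets("Dashboard").Range("A2:A" & Total_rows_Dash).Value
For r = 1 To UBound(dashKeys, 1)
    dict(dashKeys(r, 1)) = r + 1    ' key -> worksheet row
Next r

' ...then every lookup is O(1) instead of Match's O(n) column scan:
If dict.Exists(someKey) Then targetRow = dict(someKey)
```

So the cost is one extra pass up front in exchange for constant-time lookups in the main loop, which is usually a net win when the loop runs many times.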

'Production Quantity for Dashboard
For i = 2 To Total_rows_Prod
    For j = 2 To Total_rows_Dash
        If ThisWorkbook.Worksheets("Prod. Qty.").Cells(i, 5) = ThisWorkbook.Worksheets("Dashboard").Cells(j, 1) Then
           ThisWorkbook.Worksheets("Dashboard").Cells(j, 4) = ThisWorkbook.Worksheets("Dashboard").Cells(j, 4) + ThisWorkbook.Worksheets("Prod. Qty.").Cells(i, 31) / ThisWorkbook.Worksheets("Prod. Qty.").Cells(i, 4)
        End If
    Next j
Next i

Bottleneck testing confirmed that the nested For-Next loop is extremely slow and bad practice (the run time of the code is shown below, at row 10):

[screenshot: run-time measurements for the nested-loop version]

However, when using Index and Match with only one For-Next loop, as in the code below:

'Production Quantity for Dashboard
For i = 2 To Total_rows_Prod
    m = Application.Match(ThisWorkbook.Worksheets("Prod. Qty.").Cells(i, 5), ThisWorkbook.Worksheets("Dashboard").Range("A:A"), 0)
    If Not IsError(m) Then
        ThisWorkbook.Worksheets("Dashboard").Cells(Application.WorksheetFunction.Match(ThisWorkbook.Worksheets("Prod. Qty.").Cells(i, 5), ThisWorkbook.Worksheets("Dashboard").Range("A:A"), 0), 4) = ThisWorkbook.Worksheets("Dashboard").Cells(Application.WorksheetFunction.Match(ThisWorkbook.Worksheets("Prod. Qty.").Cells(i, 5), ThisWorkbook.Worksheets("Dashboard").Range("A:A"), 0), 4) + ThisWorkbook.Worksheets("Prod. Qty.").Cells(i, 31) / ThisWorkbook.Worksheets("Prod. Qty.").Cells(i, 4)
    End If
Next i

the run time becomes negligible, as shown below (still at row 10):

[screenshot: run-time measurements for the Index/Match version]

After replacing everything with Index and Match, the final total run time (as of 1/21/19) was 2 seconds:

[screenshot: total run time, 1/21/19]

On a slower netbook running a Pentium Atom processor, the same code takes 26 seconds (13 times longer), so I am wondering if there is a way to bring that 26 seconds down.

Any improvements that minimize the run time would be great; ideally I would like to make it as fast as possible.

Update as of 2/23/19 (Code Revision):

Dim rangeProdQtySum As Range, rangeProdQtyPass As Range
Set rangeProdQtySum = ThisWorkbook.Worksheets("Prod. Qty.").Range("AE1:AE" & Total_rows_Prod)
Set rangeProdQtyPass = ThisWorkbook.Worksheets("Prod. Qty.").Range("D1:D" & Total_rows_Prod)

ReDim arrProdQtySum(1 To Total_rows_Prod) As Variant
ReDim arrProdQtyPass(1 To Total_rows_Prod) As Variant

arrProdQtySum = rangeProdQtySum
arrProdQtyPass = rangeProdQtyPass

ReDim arrProdQtyDash(1 To Total_rows_Dash) As Variant

'Production Quantity for Dashboard
For i = 2 To Total_rows_Prod
    m = Application.Match(arrProdQtyJTJN(i, 1), Application.Index(Application.WorksheetFunction.Transpose(arrDashInfo), 0, 1), 0)
    If Not IsError(m) Then
        arrProdQtyDash(Application.Match(arrProdQtyJTJN(i, 1), Application.Index(Application.WorksheetFunction.Transpose(arrDashInfo), 0, 1), 0)) = arrProdQtyDash(Application.Match(arrProdQtyJTJN(i, 1), Application.Index(Application.WorksheetFunction.Transpose(arrDashInfo), 0, 1), 0)) + arrProdQtySum(i, 1) / arrProdQtyPass(i, 1)
    End If
Next i

arrProdQtyDash(1) = "Production Quantity"
ThisWorkbook.Worksheets("Dashboard").Range("D1:D" & Total_rows_Dash) = Application.WorksheetFunction.Transpose(arrProdQtyDash)

With arrays, minimized use of For-Next loops, Index and Match applied to arrays, and memoization of repeated expressions (assigning them to variables once), the timings are as follows:

[screenshot: 2/23/19 attempt timings]

On the slower Pentium Atom machine, the same set of code now takes 10 seconds, as shown below:

[screenshot: improved timings on the Pentium Atom machine]

This is about 2.6x faster than using the Index and Match code alone. I am wondering whether any further improvements can be suggested for the snippet displayed here, because it is the bottleneck (it takes 5 seconds on the Pentium Atom, or about 50% of the total processing time).
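One further idea, sketched below under the same assumptions as the 2/23/19 revision (column E of "Prod. Qty." holds the match key as in the original nested loop, `Total_rows_Prod`/`Total_rows_Dash` are set elsewhere, and the `dict`/`arrKey` names are new): build a `Scripting.Dictionary` from the Dashboard keys once, so the per-iteration `Transpose`/`Index`/`Match` calls disappear and the main loop does only array reads plus one hash lookup.

```vba
' Sketch only - same ranges as the 2/23/19 revision; column E is assumed
' to hold the match key, as in the original nested loop.
Dim dict As Object, i As Long, k As Long
Set dict = CreateObject("Scripting.Dictionary")

Dim arrKey As Variant, arrSum As Variant, arrPass As Variant
With ThisWorkbook.Worksheets("Prod. Qty.")
    arrKey = .Range("E1:E" & Total_rows_Prod).Value
    arrSum = .Range("AE1:AE" & Total_rows_Prod).Value
    arrPass = .Range("D1:D" & Total_rows_Prod).Value
End With

Dim arrDashKey As Variant
arrDashKey = ThisWorkbook.Worksheets("Dashboard").Range("A1:A" & Total_rows_Dash).Value

' Build the key -> dashboard-row table once (O(n))...
For i = 2 To Total_rows_Dash
    dict(arrDashKey(i, 1)) = i
Next i

' ...so the main loop needs no Match/Index/Transpose at all.
ReDim arrProdQtyDash(1 To Total_rows_Dash) As Variant
For i = 2 To Total_rows_Prod
    If dict.Exists(arrKey(i, 1)) Then
        k = dict(arrKey(i, 1))
        arrProdQtyDash(k) = arrProdQtyDash(k) + arrSum(i, 1) / arrPass(i, 1)
    End If
Next i

arrProdQtyDash(1) = "Production Quantity"
ThisWorkbook.Worksheets("Dashboard").Range("D1:D" & Total_rows_Dash).Value = _
    Application.Transpose(arrProdQtyDash)
```

The design point: `Match` against an array is still a linear scan on every iteration, so the loop is O(n·m); hashing the keys first makes it O(n + m).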

Pherdindy
  • One thing is Memoization: For example you use `.Worksheets("Dashboard").Range("A:A")` in a loop, but as the result of this expression will never change, re-executing it repeatedly is wasteful. Store the result in a variable outside of the loop, then use that variable inside the loop. (Probably not a major speedup, but it's good practice & makes more readable/maintainable code) – Alex K. Jan 18 '19 at 11:36
  • @AlexK. Right, that makes sense. I will have to incorporate that minor improvement; looks like it'll speed things up a bit – Pherdindy Jan 18 '19 at 11:44
  • You could also read your data into an array first. This will be **much** faster than accessing the cells each time. Eg `MyArray = Range("A1:B10").Value` will read the range A1:B10 into the array. You can then access it with `MyArray(row, column)` and eg write it back into cells with `Range("A1:B10").Value = MyArray`. Attention: when writing back, the range size must match the array size (or you lose data). – Pᴇʜ Jan 18 '19 at 12:19
  • @Pᴇʜ Thanks. I should really start using an array rather than keep referencing the workbook objects – Pherdindy Jan 21 '19 at 09:07
  • One concern I always think about is the mathematical side of the loops. With arrays, I believe I have to have a nested `for-next` loop (`for i` and `for j`). I have 3 months worth of data and the total rows for the variable i has 1,500 rows while the variable j has 100 rows. That is about 150,000 loops. In 1 year it'll be 600,000 loops. What I used to do was use `Autofilter` to actually reduce the number of loops significantly. But I believe `Autofilter` is also very taxing on the computation speed. Is having a nested loop and using an array the fastest way to go despite high number of loops? – Pherdindy Jan 21 '19 at 09:36
  • While filters can also produce heavy load they are performed by Excel (which supports multi-threading) and not by VBA (which doesn't support multi-threading). That means they can still be a performance boost if they reduce data for the loop. So you will need to test which one is faster in your case. In general we can say the less data you loop through the less time it takes. Also the less cell read/write actions the less time it takes. Therefore 1 read into an array and 1 write back is already a huge performance boost compared with 600,000 read/writes. – Pᴇʜ Jan 21 '19 at 10:10
  • [How do I copy a filtered range into an array? (Excel VBA)](https://stackoverflow.com/questions/11995761/how-do-i-copy-a-filtered-range-into-an-array-excel-vba) might be of interest if you want to use filtered data in combination with the array method. – Pᴇʜ Jan 21 '19 at 10:11
  • @Pᴇʜ I'll let the timer decide which one is faster (filter with arrays or just the arrays), although it's a little more tedious as I have to write two sets of code for my application. I will make an update in a few days once I finish. Thanks as always – Pherdindy Jan 21 '19 at 10:17
  • @Pᴇʜ I have improved the code where I assign a range to an array, I am not sure whether it was necessary to assign the worksheet range to a range variable prior to assigning that range variable to an array though. Also I am not sure if `Application.Transpose` should be used instead of `Application.WorksheetFunction.Transpose` in terms of the speed and any loss in functionality – Pherdindy Feb 23 '19 at 10:33
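On the first question in the last comment: the intermediate `Range` variable in the 2/23/19 revision does not appear to be necessary, and neither does the `ReDim` before the assignment. A sketch of the shorter form (variable name and range taken from the revision above):

```vba
' Assigning a Range's value to a plain Variant builds the 2-D array
' directly, bounds included - no prior ReDim and no Range variable needed:
Dim arrProdQtySum As Variant
arrProdQtySum = ThisWorkbook.Worksheets("Prod. Qty.") _
    .Range("AE1:AE" & Total_rows_Prod).Value    ' (1 To rows, 1 To 1)
```

As for `Application.Transpose` versus `Application.WorksheetFunction.Transpose`: for a valid input array both return the same result, and the practical difference is generally error handling (the `WorksheetFunction` form raises a VBA run-time error on failure, while the `Application` form returns an error value) rather than speed.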

0 Answers