
Question for you Access gurus out there:

I have an Oracle pass-through query which returns a result set in about 2 seconds in the Access program. However, when I try to use that pass-through query inside a make-table query, it drags for about 7 minutes. (The file currently resides on my desktop and all tables are local (not linked)).

In an effort to improve performance, I created delete/append queries and ran them with VBA. Then I tried DAO db.Execute with explicit string SQL queries, referencing the pass-through query (INSERT/SELECT). All methods are extremely SLOW! I would love to get it close to the actual run time of the pass-through (again, 2 seconds).

Any insight?

JimT
    How are you returning the result set? In query window? Try scrolling down the window after 2 seconds and see if query continues to calculate. My guess is you *see* results in 2 seconds but query is not finished processing. – Parfait Jan 07 '20 at 20:47
  • Yeah - that's totally it. It's returning ~16,000 records, which probably kills MS Access. In Toad I can get to the end of the result set in about 5 seconds (after an almost immediate return on the query). Can you put this as an answer? I'll mark it. – JimT Jan 07 '20 at 20:54
  • Show code (vba/sql)! We may improve speed ( e.g. you tried ADODB,or link AccessDB to Oracle http://www.orafaq.com/node/60) – ComputerVersteher Jan 07 '20 at 22:19
  • 16,000 records isn't all that much, reading the whole recordset should only take a few seconds (judging from ODBC to SQL Server via LAN). – Andre Jan 07 '20 at 22:50

2 Answers


Likely there is no large performance difference between the make-table query and the pass-through query themselves. The query actually takes minutes to run even though you can see some results within seconds. Scrolling down the query window may show the query still being rendered.

Because pass-through queries interface with an ODBC driver/OLEDB provider to pass SQL statements between the client (i.e., MS Access) and the database (i.e., Oracle) across a network, several environmental or software factors can be the cause:

  1. Network traffic between the client machine and the server. Prefer a hard-wired LAN connection over a WAN. Never use internet folders or run Access from flash drives or drives with limited disk space.
  2. An ODBC driver/OLEDB provider that may be outmoded. Try finding the most recent version compatible with your Oracle version and MS Office version, and try running on a 64-bit architecture.
  3. Complex or large Oracle data types that cannot map to JET/ACE data types, such as high-precision number types or large binary object types. See this Oracle doc.
  4. A bloated Access app that should be compacted and even decompiled.
  5. A database without split architecture, where application objects (queries, forms, reports, macros, modules) are divorced from tables.
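
As a rough way to time the actual population step, the saved make-table query can be executed directly with DAO, which skips the query designer's warn-and-cache processing. A minimal sketch; the query name `qryMakeLocal` is hypothetical:

```vba
' Minimal sketch, assuming a saved make-table query named "qryMakeLocal"
' that selects from the pass-through query. The name is hypothetical.
Sub RunMakeTableTimed()
    Dim db As DAO.Database
    Dim t As Single

    Set db = CurrentDb
    t = Timer

    ' db.Execute bypasses the designer's confirm/rollback overhead
    db.Execute "qryMakeLocal", dbFailOnError

    Debug.Print "Rows affected: " & db.RecordsAffected
    Debug.Print "Seconds elapsed: " & Format(Timer - t, "0.00")
End Sub
```

If this still takes minutes, the time is going into pulling rows across the network, not into the GUI.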
Parfait

Is the make-table target local to Access? I mean, a dog and say a bus are two VERY different things here.

Is that make-table local to Oracle, or are you talking about the PT query being used as a make-table into a LOCAL table in Access? If the latter, your comparison is a Mount Everest of a difference here.

I suspect that a plain-Jane linked table to Oracle would likely perform about the same as the PT query if the target table is local. That's because you're pulling the whole table down the network pipe into a local table.

If the PT query is a make table 100% to another table on the Oracle server? Then a PT query should perform just as fast as if you typed that command in the Oracle console - there should be no difference.
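
To illustrate, a server-side make-table can be issued through a temporary pass-through QueryDef with `ReturnsRecords` set to `False`, so Oracle does all the work and no rows travel to Access. This is a sketch only; the DSN, credentials, and table names are hypothetical:

```vba
' Sketch: run the whole make-table on the Oracle side, so no rows
' cross the network. DSN and table names are hypothetical.
Sub MakeTableServerSide()
    Dim qdf As DAO.QueryDef

    Set qdf = CurrentDb.CreateQueryDef("")   ' temporary, unsaved QueryDef
    qdf.Connect = "ODBC;DSN=MyOracleDSN;UID=myuser;PWD=mypwd"
    qdf.ReturnsRecords = False               ' required for action statements
    qdf.SQL = "CREATE TABLE my_copy AS SELECT * FROM source_table"
    qdf.Execute dbFailOnError
    qdf.Close
End Sub
```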

You don't mention the row count (EDIT - I see you note only 16,000 rows - that is rather small). So the issue then is the network connection. Is this a LAN (local area network), or are you using say a VPN over the internet? I mean, a connection that is 100x slower than a typical LAN is going to take 100 times longer - no surprise here.

So, details as to the type of connection being used and the row count would shed significant light here. But if a VPN and a connection over the internet are involved, then such connections are about 50 times slower than your super cheap-o office local area network. You can read this article of mine that explains this speed issue: http://www.kallal.ca//Wan/Wans.html

I don't see why 16,000 rows would take so long - that should take well under a second on the cheapest office network. Using any other client to pull that data should not run any faster than a PT query. I mean, ADO.NET, ODBC, or OLEDB don't cause the server to say, hum, gee - let's send the data slower. All of these clients should pull data at the same rate.

Albert D. Kallal
  • There are actually four pass-through queries running and placing the results into a table. They all return >10,000 records. Also, and I failed to mention this previously, but the Oracle data warehouse I pull the data from returns about 90 rows (+/- based on the table). My end users want all of these rows. I assume this is why there is such a performance hit. Instead of using make-table queries (which used the pass-through queries), I ended up writing four DAO queries and db.Execute with SQL. This seemed to work better. – JimT Jan 08 '20 at 14:16
  • Note: you can run four saved make-table action queries in code with `db.Execute mySavedQuery`. The only thing you really change here is avoiding any GUI query processing. The other commands, `DoCmd.OpenQuery`, `DoCmd.RunSQL`, or `!`, run the action in the query designer, which does extra work since it warns the user of the action (i.e., saves/caches data temporarily to be able to roll back/cancel). – Parfait Jan 08 '20 at 16:22
  • It's not clear what you mean by Oracle returning about 90 rows? You note 10,000 rows, then you mention 90 rows? Are you attempting to state that your local tables are doing some kind of join against the Oracle table? It would seem whatever that join is attempting to do should include the joined data on those 10,000+ row queries - that way you don't attempt, after pulling data, to do joins with local tables over that ODBC connection (that is slow). And remember, Access cannot filter a PT query in ANY WAY (it can filter a non-PT query). – Albert D. Kallal Jan 10 '20 at 21:32
  • So any filter against a PT query will run MUCH slower than a NON-PT query. If you can, by some kind of group by, create or determine the 90 rows that you need (or eventually join on), then pull those 90 rows local, and THEN join the data. However, it's not at all clear why you are pulling data from Oracle and THEN attempting some kind of join operation on local tables to the server (nor is it clear what the difference is between the 10,000+ row queries vs the about 90 rows you are talking about, nor is the context clear). – Albert D. Kallal Jan 10 '20 at 21:34