• DSum performance issue (2000(SP3))

    #439333

    I have an inventory query for an art gallery that filters about 7,500 records down to 3,500 to create inventory statements for the individual artists. A setup query uses DSum to add up the number of items sold for any given catalogue number; this is subtracted from the number received to give the number currently in stock. The second (and final) query is based on the first, and the only difference is an expression to filter out anything that comes up 0. The problem is that the process takes a good five minutes to run, and the manager is complaining about the time.

    Is there a way around the processor-intensive DSum? I ran JetShowPlan as suggested in a recent post and discovered that the output for the second query was a staggering number of pages in a Word document (well over 1,000): the same plan repeated over and over, once for each record of output. The first query's plan is about half a page. Clearly the second query is a pig for processor time and not very efficient. Any suggestions would be most welcome. I suppose I could tell her to go for a coffee whenever she runs this one.
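
    For what it's worth, the slow pattern described above presumably looks something like this. The table and field names (tblInventory, tblSales, CatalogueNo, Qty, NumReceived) are invented for illustration; the point is that DSum is a domain aggregate, so Jet re-runs the lookup against the sales table once per output row:

    ```sql
    -- Hypothetical setup query: DSum() fires for every row,
    -- rescanning tblSales once per catalogue number.
    SELECT i.CatalogueNo,
           i.NumReceived,
           i.NumReceived - Nz(DSum("Qty", "tblSales",
               "CatalogueNo = '" & [CatalogueNo] & "'"), 0) AS InStock
    FROM tblInventory AS i;
    ```

    With ~3,500 output rows, that is ~3,500 separate scans of the sales table, which matches the repeated plan seen in the ShowPlan output.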

    • #1049612

      If you can transfer the work of the DSum into a Sum/Group By totals query, that should speed things up a lot.
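
      A sketch of that approach, using the same invented table and field names as above (tblInventory, tblSales, CatalogueNo, Qty, NumReceived): aggregate the sales once with GROUP BY, then join the totals back to the inventory so Jet does a single pass instead of one DSum per row.

      ```sql
      -- Totals query, saved as qrySoldTotals: one pass over tblSales.
      SELECT CatalogueNo, Sum(Qty) AS TotalSold
      FROM tblSales
      GROUP BY CatalogueNo;

      -- Final query: LEFT JOIN so items with no sales still appear,
      -- with Nz() turning the missing total into 0.
      SELECT i.CatalogueNo,
             i.NumReceived - Nz(q.TotalSold, 0) AS InStock
      FROM tblInventory AS i
      LEFT JOIN qrySoldTotals AS q
             ON i.CatalogueNo = q.CatalogueNo
      WHERE i.NumReceived - Nz(q.TotalSold, 0) > 0;
      ```

      The LEFT JOIN matters: an INNER JOIN would silently drop any catalogue number that has never sold.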

      • #1049689

        Thanks John. It pays to have someone else do your thinking for you. I suspect I used DSum at the time because it seemed like a good idea, not really thinking about the performance cost of the function being called over and over. With the grouping query, the report generates in a couple of seconds. Cheers.
