PQ - iterate over transactions and use aggregate sum as target

Djpejsen 41 Reputation points
2021-02-08T03:00:18.69+00:00

Hi guys

I have some forecast transactions that I need to adjust if a Sales Order comes in during the same week as the forecast, based on the Sales Order qty. and the aggregate sum.

In general there are 4 conditions for the iteration:

Condition 1

Sales order in WK02 of 10 pcs

Forecast before adjustment
W01 : 50
W02 : 50
W03 : 50
W04 : 50
Aggregate sum 200

Forecast after adjustment
W01 : 50
W02 : 40 (Forecast)
W02 : 10 (Sales order)
W03 : 50
W04 : 50
Aggregate sum 200
——————————————
Condition 2

Sales order in WK02 of 60 pcs

Forecast before adjustment
W01 : 50
W02 : 50
W03 : 50
W04 : 50
Aggregate sum 200

Forecast after adjustment
W01 : 50
W02 : 0 (Forecast)
W02 : 60 (Sales order)
W03 : 40 (Forecast)
W04 : 50
Aggregate sum 200

——————————————
Condition 3

Sales order in WK02 of 110 pcs

Forecast before adjustment
W01 : 50
W02 : 50
W03 : 50
W04 : 50
Aggregate sum 200

Forecast after adjustment
W01 : 50
W02 : 0 (Forecast)
W02 : 110 (Sales order)
W03 : 0 (Forecast)
W04 : 40
Aggregate sum 200

——————————————
Condition 4

Sales order in WK02 of 50 pcs

Forecast before adjustment
W01 : 50
W02 : 50
W03 : 50
W04 : 50
Aggregate sum 200

Forecast after adjustment
W01 : 50
W02 : 0 (Forecast)
W02 : 50 (Sales order)
W03 : 50
W04 : 50
Aggregate sum 200
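
To make the rule explicit, here is a rough M sketch of what the adjustment should do (illustration only, not my real query - the list holds the weekly forecast and the order consumes it from its own week onwards, so the aggregate sum stays the same):

    let
        // Sketch only: adjust a list of weekly forecasts for one sales order of SOQty pcs in week SOWeek,
        // carrying any excess into the following weeks so the aggregate sum is unchanged
        AdjustForecast = (Forecast as list, SOWeek as number, SOQty as number) as list =>
            let
                Walked = List.Accumulate(
                    List.Positions(Forecast),
                    [rest = SOQty, out = {}],    // rest = order qty still to consume
                    (state, i) =>
                        let
                            fc       = Forecast{i},
                            consumed = if i + 1 < SOWeek then 0 else List.Min({fc, state[rest]}),
                            newRest  = state[rest] - consumed
                        in
                            [rest = newRest, out = state[out] & {fc - consumed}]
                )
            in
                Walked[out],

        // Condition 3: order of 110 pcs in W02 against {50,50,50,50} returns {50,0,0,40}
        Demo = AdjustForecast({50, 50, 50, 50}, 2, 110)
    in
        Demo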


Accepted answer
  1. Lz._ 8,991 Reputation points
    2021-02-17T11:51:51.113+00:00

    Hi @Djpejsen
    Could you play with this workbook and let me know? Hoping this is good this time, as I've run out of aspirin :-)


9 additional answers

  1. Lz._ 8,991 Reputation points
    2021-02-19T07:56:01.22+00:00

    Hi @Djpejsen

    Working inside nested tables is no problem (sample available here), and I thought this was going to be your approach given your initial request

    Regarding optimization with a 100k-row table, I'm not sure what to suggest without the complete picture (number of columns in the Source table, what your whole query should do - it might not only be what we discussed...) + I'm not a PQ guru :(

    As a general rule you should get rid of the columns and rows you don't need as early as possible. In other words, always "use" the smallest possible object (table, list...). If you look at the above sample I get rid of the FakeWeek column I created earlier, before calling the function inside each nested table
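
    As a rough sketch of that idea (the Item/FakeWeek column names and the toy rows are assumptions, not your actual workbook), the pruning happens inside each nested table before the per-item function runs:

    let
        // Toy source: one item, two weeks, plus a helper column that is no longer needed
        Source = Table.FromRecords({
            [Item = "A", FakeWeek = 1, #"Current FC" = 50, #"Sales Order" = 60],
            [Item = "A", FakeWeek = 2, #"Current FC" = 50, #"Sales Order" = 0]
        }),
        Grouped = Table.Group(Source, {"Item"}, {{"Rows", each _, type table}}),
        // Drop the helper column inside every nested table before calling the per-item function
        Pruned = Table.TransformColumns(Grouped, {{"Rows", each Table.RemoveColumns(_, {"FakeWeek"})}})
    in
        Pruned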

    With your 100k-row table, buffering might not be an option - see Curt Hagenlocher's explanation below the chart in this article - but who knows?
    I've also seen a number of posts where Ehren from the dev team has recommended using a 64-bit version of Excel - this doesn't change anything with regard to the 256K RAM limit discussed by Curt though
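
    If you do want to try buffering, the call itself is only one extra step (toy sketch below - whether it actually helps at 100k rows is exactly the open question):

    let
        Source   = Table.FromRecords({[Item = "A", #"Current FC" = 50]}),
        // Table.Buffer pins a stable copy of the table in memory so later steps don't re-read the source;
        // with ~100k rows this may hit the limits Curt discusses, so test rather than assume
        Buffered = Table.Buffer(Source)
    in
        Buffered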

    Hope this helps a bit


  2. Lz._ 8,991 Reputation points
    2021-02-19T14:48:12.337+00:00

    @Djpejsen. According to your Case #3, you expect some nested tables where the incoming Sales Order will be >= the SUM([Current FC]), in which case the correction will be -[Current FC] for every row. The revised version of the function below avoids the iteration in this case:

    (SourceTable as table) as table =>
    let
        Source = SourceTable,
        // Weekly forecast quantities and the Sales Order qty read from the first row
        Forecast = Table.Column(Source, "Current FC"),
        Sales = Table.First(Source)[Sales Order],
        // Walk the weeks, consuming the order against each week's forecast and carrying
        // the remainder ("rest") forward until it is used up
        Correction = List.Generate(
        ()=>
            [
                i = 0,
                forecast = Forecast{i},
                correction = if Sales > 0 then -forecast else 0,
                rest = if Sales = 0 then 0 else forecast-Sales
            ],
            each [i] < List.Count(Forecast),
            each
                [
                    i = [i] + 1,
                    forecast = Forecast{i},
                    correction = if [rest] = 0 then 0
                        else if [rest] > 0 then List.Min({forecast, [rest]})
                        else List.Max({-forecast, [rest]}),
                    rest = if [rest] < 0 then
                        (if [rest] > -forecast then 0 else [rest]+forecast)
                        else List.Max({[rest]-forecast, 0})
                ],
            each [correction]
        ),
        // Append the Correction list as an extra column and type it
        ToColumns = Table.ToColumns(Source) & {Correction},
        ToTable = Table.FromColumns(ToColumns, Table.ColumnNames(Source) & {"Correction"}),
        TypedCorrection = Table.TransformColumnTypes(ToTable, {{"Correction", Int64.Type}}),

        // Shortcut for Case #3: the order covers the whole forecast, so every correction is -[Current FC]
        NoIteration = Table.AddColumn(Source, "Correction", each -[Current FC], Int64.Type)
    in
        if Sales >= List.Sum(Forecast) then NoIteration else TypedCorrection
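
    A usage sketch (not part of the original answer): assuming the function above is saved as a query named fxCorrection and that each item's rows carry the order qty on their first row (which is where the function reads it from), it can be applied to every nested table like this - the Item/Week column names and the toy rows are made up:

    let
        Source = Table.FromRecords({
            [Item = "A", Week = "W01", #"Current FC" = 50, #"Sales Order" = 60],
            [Item = "A", Week = "W02", #"Current FC" = 50, #"Sales Order" = 0],
            [Item = "A", Week = "W03", #"Current FC" = 50, #"Sales Order" = 0],
            [Item = "A", Week = "W04", #"Current FC" = 50, #"Sales Order" = 0]
        }),
        Grouped  = Table.Group(Source, {"Item"}, {{"Rows", each _, type table}}),
        // Run the function above (saved as fxCorrection) against every nested per-item table
        Applied  = Table.TransformColumns(Grouped, {{"Rows", each fxCorrection(_)}}),
        Expanded = Table.ExpandTableColumn(Applied, "Rows",
            {"Week", "Current FC", "Sales Order", "Correction"})
    in
        Expanded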
    

  3. Djpejsen 41 Reputation points
    2021-02-20T12:36:49.927+00:00

    Thank you, but then I have to replace the null values with 0 (zero), as a null value cannot be converted to type Logical.

    correction = if Sales > 0 then -forecast else 0,
    rest = if Sales = 0 then 0 else forecast-Sales
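
    One possible way to do that replacement up front (a sketch with made-up rows, not my real file) is to coalesce the nulls to 0 before the function compares Sales against 0:

    let
        Source  = Table.FromRecords({
            [#"Current FC" = 50, #"Sales Order" = null],
            [#"Current FC" = 50, #"Sales Order" = 10]
        }),
        // Turn null into 0 so "Sales > 0" / "Sales = 0" never see a null value
        NoNulls = Table.TransformColumns(Source,
            {{"Sales Order", each if _ = null then 0 else _, Int64.Type}})
    in
        NoNulls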

    The below will exclude all the Forecast rows for the items where Sales = 0 in the first week.
    I cannot do that, as the number of rows in the Source (input) file must be the same as the number of rows in the Adjusted Forecast (output).

    NoIteration = Table.AddColumn(Source, "Correction", each -[Current FC], Int64.Type)
    in
    if Sales >= List.Sum(Forecast) then NoIteration else TypedCorrection


  4. Djpejsen 41 Reputation points
    2021-02-20T12:43:41.61+00:00

    Maybe your solution is okay, but then I must filter out the FC rows where the Sales qty = 0 in the first week and then combine Source and Adjusted FC at the end.
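
    Roughly what I mean, as a sketch (toy rows, not the real file): split the items, run the adjustment only on those with a sales order, then stack the two parts again so no rows are lost:

    let
        Source    = Table.FromRecords({
            [Item = "A", #"Sales Order" = 0,  #"Current FC" = 50],
            [Item = "B", #"Sales Order" = 60, #"Current FC" = 50]
        }),
        // Items with no sales order in the first week are set aside unchanged ...
        Untouched = Table.SelectRows(Source, each [Sales Order] = 0),
        // ... the rest would go through the adjustment function (omitted here) before recombining
        ToAdjust  = Table.SelectRows(Source, each [Sales Order] <> 0),
        Combined  = Table.Combine({Untouched, ToAdjust})
    in
        Combined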
