Parallel-machine scheduling of jobs with mixed job-, machine- and position-dependent processing times

We consider a number of parallel-machine scheduling problems in which jobs have variable processing times. The actual processing time of each job is described by an arbitrary positive function of the position it holds on a machine. Moreover, the function itself may additionally depend on the job or on the machine this job was assigned to. Our aim is to find a schedule that minimizes the maximum completion time or the total completion time. We present a full set of polynomial-time algorithms for the cases of jobs with no precedence constraints. We also show that the case of single-chained jobs may not be easier in general, but some polynomial results can be obtained for it, too.


Introduction
In many real-world scheduling problems one cannot assume that job processing times are fixed. Wright (1936) noticed that the time required to perform an aircraft production task decreases with the worker's experience. Today, it is widely observed that the processing time of a job may vary in reaction to different environmental factors, such as the amounts of resources available, the starting time of the job, or the set of jobs performed earlier. In the latter case, it is often assumed that the actual processing time of a job depends either on its position in a schedule, or on the sum of basic processing times of jobs that have already been executed. If all the jobs are assumed to be identical, both these approaches are, in some sense, exchangeable. In this paper, the actual processing time of a job is described by a positive function of at most three arguments: the index of the job, the machine to which the job is assigned, and the position it holds on this machine. We also assume that the jobs are either independent, or they need to be processed in a given order, i.e. the precedence constraints take the form of a single chain.
Recently, one can observe a growing number of papers related to scheduling problems with the learning effect, where the processing time of a job decreases as the set of already executed jobs grows (see, e.g., Wang et al. 2020). A dual group of problems with the aging effect is also widely considered in the literature (see, e.g., Lu et al. 2018; Liu et al. 2018). Having this in mind, we analyze a number of general scheduling problems where job processing times may, but do not have to, follow such a monotonic pattern. In particular, we show polynomial algorithms for scheduling problems in which job processing times are described by an arbitrary positive and discrete function. The analyzed problems vary in: the set of factors that impact the actual processing times of jobs (position, position and machine, position and job, or all the factors together), the precedence constraints among jobs (independent or chained jobs), and the objective function (maximum completion time or total completion time).
In practice, it is often assumed that job processing times vary among jobs. As the machines may have different capabilities, the processing time may also depend on the machine chosen (scheduling on unrelated machines). However, the assumption that the actual processing time of a job freely depends on the position it holds on a machine may seem artificial and impractical. As indicated above and further discussed in Sect. 2, position-dependent models are usually monotonic with respect to the job position. By assuming that job processing time may freely depend on its position, we obtain general results that can be successfully applied both to learning and to aging models.
The contribution of the paper is two-fold. First, we analyze various parallel-machine scheduling problems where the actual processing times of independent jobs may freely depend on the three factors: machine, job, and its position on a machine. We present the full classification of computational complexity for these problems and their subproblems. It can be shown that most of such problems can be solved in polynomial time. Second, we show that although the case of single-chained jobs is not easier in general, some of the corresponding problems in this group can also be solved in polynomial time.
This paper is organized as follows. In Sect. 2, we briefly review the main position-dependent learning and aging models. We also present the main results in the area of job- and machine-dependent processing times. In Sect. 3, we formalize the problem considered in this paper. In Sect. 4, we present the full classification of the complexity of problems with independent jobs. Then, in Sect. 5, we show how the assumption that the jobs are chained influences the analyzed problems. We complete the paper with Sect. 6, which includes conclusions and remarks on future research.

Literature review
To the best of our knowledge, the first position-dependent model, where the actual job processing time depends on the number of jobs executed earlier, was proposed by Gawiejnowicz (1996). In this model, the actual processing time of the i-th job scheduled in the r-th position, p_{i,r}, depends on its basic processing time, p̄_i, and the value of ν(r) ∈ [0, 1]. In particular, Gawiejnowicz focused on makespan minimization on a single machine within the general p_{i,r} = p̄_i/ν(r) model. Similar results, limited to the learning effect, were presented by Biskup (1999), who introduced the p_{i,r} = p̄_i · r^a model, where a < 0 is a constant learning index. He considered single-machine scheduling problems to minimize the total flow time and the maximum deviation from the common due date. A few years later, Mosheiov and Sidney (2003) analyzed a more general case where the learning indices are job-dependent, that is, p_{i,r} = p̄_i · r^{a_i}. All these results constituted a base for rich research in the area of position-dependent scheduling.
Another popular approach to position-dependent scheduling is based on the paper by DeJong (1957). Okołowski and Gawiejnowicz (2010) considered a group of parallel-machine scheduling problems within the model of generalized DeJong's learning effect. In this model, it is assumed that p_{i,r} = p̄_i · [M + (1 − M) · r^a], where 0 ≤ M < 1 is called the incompressibility factor. The same model was analyzed later by Ji et al. (2015).
The list of models, where job processing time depends on the number of jobs executed earlier, is still expanding. There are a few models that take into account that job processing times may be affected simultaneously by factors of different kind. For example, a number of models in which time-dependent job deterioration and DeJong's learning effect are mixed were analyzed by Zhang et al. (2018) and Sun et al. (2020). We refer the reader to Agnetis et al. (2014), Strusevich and Rustogi (2017), Azzouz et al. (2017), and Gawiejnowicz (2020) for more details on different scheduling models of the kind discussed in this section.
There are two main limitations of the models of scheduling with variable job processing times, especially those that are position-dependent. First, the majority of results are related to single-machine scheduling (see, e.g., Wang 2010; Wang and Wang 2013; Debczynski and Gawiejnowicz 2013 or Huang and Wang 2014). At the same time, one can find only a limited number of polynomial results on parallel-machine scheduling of position-dependent jobs (see, e.g., Mosheiov and Sidney 2003; Mosheiov 2008 or Przybylski 2017). Second, most papers related to position-dependent scheduling focus on particular learning or aging models. In fact, there are only individual results independent of the form of the function describing the actual processing time of a job (e.g. Mosheiov 2008). Inspired by the latter paper, we assume that the actual processing time of a job may freely depend on the position it holds on a machine. In particular, we do not make any assumption on the form of such a relation, including its monotonicity. This way, our results can be successfully applied to almost all position-dependent models discussed in the literature.
For many practical applications it is also justified to assume that the actual processing time of a job freely depends on the machine it is assigned to. The general problems of scheduling fixed-time jobs on unrelated machines to minimize the makespan or the total weighted completion time are known to be NP-Hard. For this reason, approximation algorithms have been proposed (see, e.g., Lenstra et al. 1990; Schulz and Skutella 2002 or Bansal et al. 2019). On the other hand, Bruno et al. (1974) showed that there exists a polynomial algorithm for the case of the total completion time as the objective.
Having this in mind, we also assume that the actual processing time of a job may depend on the machine it is assigned to. In the next section, we present the formal definition of this dependency.

Problem formulation
In this section, we formulate the set of parallel-machine scheduling problems with job-, machine-, and position-dependent processing times which are discussed henceforth. Let us be given m parallel machines, P_1, P_2, ..., P_m, and n non-preemptable jobs, J_1, ..., J_n. The processing time of job J_i assigned to a given machine P_j is described by a function ϕ_i^j : N_+ → Q_+. The value of the function freely depends on the position r the job holds on the j-th machine. In other words, the processing time of the i-th job holding the r-th position on the j-th machine is equal to p_{i,j,r} = ϕ_i^j(r). If the functions are job-independent, they will be denoted by ϕ^j, and then it holds that p_{j,r} = ϕ^j(r) for each job J_i. Similarly, if the functions are machine-independent, they will be denoted by ϕ_i, and then it holds that p_{i,r} = ϕ_i(r) for each feasible i. If the functions are both job- and machine-independent, they will be denoted by ϕ, and then it holds that p_r = ϕ(r). The jobs may be either independent (i.e. with no precedence constraints) or chained. Our aim is to find an optimal schedule in the sense of at least one of the following objective functions: maximum completion time (C_max) or total completion time (ΣC_i). Using the traditional three-field notation (see Gawiejnowicz 2020 for a concise description), we can denote these problems by Pm|p_{i,j,r} = ϕ_i^j(r)|C_max and Pm|p_{i,j,r} = ϕ_i^j(r)|ΣC_i, together with their restricted variants. Throughout the paper we will assume that the set of ϕ_i^j functions is a part of the problem description, not the input, and that the values of such ϕ_i^j functions are precomputed and thus obtainable in constant time. This assumption helps us focus on the essence of the presented results. However, if the values of the ϕ_i^j functions were computed on the fly and could be obtained in polynomial time, then all the polynomial algorithms presented henceforth would remain polynomial.
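The model can be made concrete with a short sketch. The following Python fragment (our illustration; all names and the sample functions are made up, not taken from the paper) evaluates the two objective functions for an explicit job-to-machine-to-position assignment:

```python
# phi[j][i] plays the role of phi_i^j: the actual time of job i when it holds
# position r on machine j (indices are 0-based here, positions are 1-based).

def cmax(schedule, phi):
    """Maximum completion time; schedule[j] lists the job indices on machine j in order."""
    return max(sum(phi[j][i](r + 1) for r, i in enumerate(jobs))
               for j, jobs in enumerate(schedule))

def total_completion(schedule, phi):
    """Sum of completion times over all jobs."""
    total = 0
    for j, jobs in enumerate(schedule):
        t = 0
        for r, i in enumerate(jobs):
            t += phi[j][i](r + 1)   # position-dependent actual processing time
            total += t
    return total

# A toy instance: machine 0 exhibits aging, machine 1 a learning effect.
phi = [[lambda r, i=i: (i + 1) * r for i in range(3)],
       [lambda r, i=i: (i + 1) / r for i in range(3)]]

print(cmax([[0, 2], [1]], phi))              # machine 0 runs J0 then J2; machine 1 runs J1
print(total_completion([[0, 2], [1]], phi))
```

Note that the same job has a different actual processing time on each machine and in each position, which is exactly the freedom the ϕ_i^j functions provide.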

Independent jobs
In this section, we consider problems in which position-dependent actual processing times are additionally job-dependent, machine-dependent, or mixed job- and machine-dependent, i.e. p_{i,r} = ϕ_i(r), p_{j,r} = ϕ^j(r), or p_{i,j,r} = ϕ_i^j(r), respectively. However, we assume that there are no precedence constraints among jobs.
Let us start this section by observing that the Pm|p_r = ϕ(r)|C_max and Pm|p_r = ϕ(r)|ΣC_i problems can be easily solved by assigning at most ⌈n/m⌉ jobs to each of the machines in such a way that the numbers of jobs executed on any pair of machines do not differ by more than one. As it will be shown, in the case of machine-dependent processing times, some of the problems can still be solved in polynomial time even if chain precedence constraints are defined. However, if the actual processing times are job-dependent, then they might be, as a special case, constant. As, for m = 2, this leads to the P2||C_max problem, the following proposition holds.

Proposition The P2|p_{i,r} = ϕ_i(r)|C_max problem is NP-Hard.
It is worth noticing that although the P2|p_{i,r} = ϕ_i(r)|C_max problem is NP-Hard, the 1|p_{i,r} = ϕ_i(r)|C_max problem can be rewritten as the following balanced Assignment problem:

minimize Σ_{i=1}^{n} Σ_{r=1}^{n} ϕ_i(r) · x_{i,r}    (1)

s.t.

Σ_{i=1}^{n} x_{i,r} = 1, r = 1, 2, ..., n,    (2)
Σ_{r=1}^{n} x_{i,r} = 1, i = 1, 2, ..., n,    (3)
x_{i,r} ∈ {0, 1}.
Here, x_{i,r} is an indicator variable that is equal to 1 if and only if job J_i holds the r-th position in a schedule. Condition (2) guarantees that exactly one job will be assigned to each of the positions, from 1 to n. Similarly, condition (3) ensures that each job will be assigned to exactly one position. Finally, (1) represents the value of the C_max objective function for a given job-to-position assignment, as on a single machine the makespan is simply the sum of the actual processing times. As shown by Fredman and Tarjan (1987), a balanced Assignment problem can be solved in O(n^3) time. We refer the reader to the book by Burkard et al. (2012) for a wide review of the Assignment problem.

The 1|p_{i,r} = ϕ_i(r)|ΣC_i problem can also be rewritten as a variant of the Assignment problem. Assume that n jobs are executed on a single machine, one after another, with their actual processing times equal to p_1, p_2, ..., p_n, respectively. Then,

ΣC_i = Σ_{r=1}^{n} (n − r + 1) · p_r.

For this reason, the objective takes the form of

Σ_{i=1}^{n} Σ_{r=1}^{n} (n − r + 1) · ϕ_i(r) · x_{i,r}.    (4)

As n is a constant, one can introduce a new function, φ̃_i(r) = (n − r + 1) · ϕ_i(r), and rewrite the objective as

Σ_{i=1}^{n} Σ_{r=1}^{n} φ̃_i(r) · x_{i,r},

which is equivalent to (1). Thus, the 1|p_{i,r} = ϕ_i(r)|ΣC_i problem can be solved in polynomial time, too. What is more interesting, the latter result can be generalized to the case of parallel machines, mixed position-, job- and machine-dependent processing times, and the ΣC_i objective function, as follows. Let us start with a two-machine case in which p_{i,j,r} = ϕ_i^j(r). Consider an instance of the Assignment problem, denoted by I_c. In this instance, we are given n jobs which should be assigned both to a position and to a machine. However, we assume a priori that exactly c jobs will be assigned to the first machine. This means that the remaining n − c jobs will be assigned to the second machine. Given an instance I_c and any corresponding feasible schedule T_c with no idle times between jobs, we have

ΣC_i(T_c) = Σ_{r=1}^{c} (c − r + 1) · ϕ^1_[r](r) + Σ_{r=1}^{n−c} (n − c − r + 1) · ϕ^2_[r](r),    (5)

where ϕ^j_[r] is the function associated with the job holding the r-th position on the j-th machine.
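The identity behind (4), namely that the total completion time of a fixed job order equals an assignment cost with weights (n − r + 1) · ϕ_i(r), can be checked by brute force on a toy instance (the instance below is our own, not from the paper):

```python
from itertools import permutations

# phi[i](r): actual time of job i in position r (a hypothetical 3-job instance).
phi = [lambda r: 4 - r, lambda r: r * r, lambda r: 2]
n = len(phi)

def total_completion(order):
    """Sum of completion times when jobs run in the given order on one machine."""
    t = total = 0
    for r, i in enumerate(order, start=1):
        t += phi[i](r)
        total += t
    return total

def assignment_cost(order):
    """The same order costed with the weights phi_tilde_i(r) = (n - r + 1) * phi_i(r)."""
    return sum((n - r + 1) * phi[i](r) for r, i in enumerate(order, start=1))

# The two formulations agree on every job-to-position assignment, so minimizing
# the assignment cost minimizes the total completion time.
assert all(total_completion(o) == assignment_cost(o) for o in permutations(range(n)))
print(min(permutations(range(n)), key=assignment_cost))
```

In practice the minimum would, of course, be found with an O(n^3) Assignment algorithm rather than by enumerating permutations; the enumeration here only verifies the equivalence.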
Figure 1 shows an example of a corresponding bipartite graph for n = 5 and c = 2. The weights of the edges correspond to the impact of a job-to-machine-to-position assignment on the final value of the ΣC_i objective function, based on (5). With reasoning similar to the one presented for (4), instance I_c can be formulated as the following ILP program for the Assignment problem:

minimize Σ_{i=1}^{n} [ Σ_{r=1}^{c} (c − r + 1) · ϕ_i^1(r) · x_{i,1,r} + Σ_{r=1}^{n−c} (n − c − r + 1) · ϕ_i^2(r) · x_{i,2,r} ]    (6)

s.t.

Σ_{i=1}^{n} x_{i,1,r} = 1, r = 1, 2, ..., c,    (7)
Σ_{i=1}^{n} x_{i,2,r} = 1, r = 1, 2, ..., n − c,    (8)
Σ_{j=1}^{2} Σ_r x_{i,j,r} = 1, i = 1, 2, ..., n,    (9)
x_{i,j,r} ∈ {0, 1}.    (10)
Here, conditions (7) and (8) guarantee that exactly one job will be assigned to each of the c positions on the first machine and to each of the n − c positions on the second machine. Similarly, condition (9) ensures that each job will be assigned to exactly one spot.

Fig. 1 The I_2 instance with 5 jobs

As we can now solve the case with a fixed number of jobs assigned to the first machine, we may construct an enumerating algorithm for the general problem, presented as Algorithm 1.

Algorithm 1
Input: An instance of the P2|p_{i,j,r} = ϕ_i^j(r)|ΣC_i problem
Output: An optimal schedule T
1: C ← ∞
2: T ← an empty schedule
3: for c ← 0, 1, ..., n do
4: Create the I_c instance of the Assignment problem.
5: Find an optimal schedule T_c for I_c based on the (6-10) program.
6: if ΣC_i(T_c) < C then
7: T ← T_c
8: C ← ΣC_i(T_c)
9: return T

Theorem Algorithm 1 generates an optimal schedule for the P2|p_{i,j,r} = ϕ_i^j(r)|ΣC_i problem in O(n^4) time.

Proof In order to find an optimal schedule for the P2|p_{i,j,r} = ϕ_i^j(r)|ΣC_i problem, we consider all the possible numbers of jobs c assigned to the first machine. Then, we select the c value that guarantees the lowest value of the objective function. Thus, the algorithm must lead to an optimal schedule. Notice that the optimal c may be any value between 0 and n, as the actual processing times of jobs are also machine-dependent. For example, if ϕ_i^1(r) < 1/n and ϕ_i^2(r) ≥ 1 for each i and r, then c = n (in the optimal schedule all n jobs are assigned to the first machine). Similarly, if ϕ_i^1(r) ≥ 1 and ϕ_i^2(r) < 1/n for each i and r, then c = 0 (in the optimal schedule all n jobs are assigned to the second machine). Notice that if the processing times were machine-independent, then it would be enough to consider only the cases in which 0 ≤ c ≤ ⌊n/2⌋. Indeed, given the optimal schedule T_c inside the for loop (line 3), one could immediately obtain the optimal schedule T_{n−c} with the same value of the objective function, simply by swapping the machines.
As the for loop (line 3) is executed n + 1 times, and each time an optimal assignment for I_c is found in O(n^3), the total required number of operations is O(n^4).

Now, let us consider the general Pm|p_{i,j,r} = ϕ_i^j(r)|ΣC_i problem. One can notice that the same approach can be applied here. Indeed, let AP(m, n) be the number of Assignment problems that need to be solved in order to find an optimal solution for m machines and n jobs. It holds that

AP(1, n) = 1 and AP(m, n) = Σ_{c=0}^{n} AP(m − 1, n − c) = O(n^{m−1}).

This means that the Pm|p_{i,j,r} = ϕ_i^j(r)|ΣC_i problem can be solved in time O(n^{m−1} · n^3) = O(n^{m+2}). The same asymptotic result can be justified by the following reasoning. Given a fixed sequence of non-negative numbers of jobs (n_1, n_2, ..., n_m) that have to be assigned to each of the machines, one can find an optimal solution by solving the Assignment problem. The number of (n_1, n_2, ..., n_m) sequences such that Σ_{j=1}^{m} n_j = n is given by the binomial coefficient (n+m−1 choose n) = O(n^{m−1}/(m − 1)!), for a fixed m. Thus, the Pm|p_{i,j,r} = ϕ_i^j(r)|ΣC_i problem can be solved in O(n^{m+2}).

We will finish this section by considering the P2|p_{j,r} = ϕ^j(r)|C_max and the P2|p_{j,r} = ϕ^j(r)|ΣC_i problems. Let us observe that for any schedule T the values of C_max(T) and ΣC_i(T) depend only on the number of jobs executed on each of the machines. For example, if in the T_c schedule there are c jobs assigned to machine P_1 and n − c jobs assigned to machine P_2, then

C_max(T_c) = max { Σ_{r=1}^{c} ϕ^1(r), Σ_{r=1}^{n−c} ϕ^2(r) }

and

ΣC_i(T_c) = Σ_{r=1}^{c} (c − r + 1) · ϕ^1(r) + Σ_{r=1}^{n−c} (n − c − r + 1) · ϕ^2(r),

respectively. An optimal schedule can thus be found by choosing the value of c that minimizes the respective expression, and then assigning any c jobs to machine P_1 and the remaining n − c jobs to machine P_2, both in the case of the C_max objective function and in the case of the ΣC_i objective function.
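For the two-machine, machine-dependent case just discussed, both objectives depend only on the number c of jobs assigned to P_1, so an optimal c can be found by a direct scan over c = 0, ..., n. A sketch with made-up ϕ^1 and ϕ^2:

```python
# phi1(r), phi2(r): position-dependent times on machines P1 and P2 (illustrative).
phi1 = lambda r: 2 * r     # P1 slows down with every position (aging)
phi2 = lambda r: 5         # P2 is position-independent

def cmax_for(c, n):
    """C_max(T_c): the longer of the two machine loads."""
    return max(sum(phi1(r) for r in range(1, c + 1)),
               sum(phi2(r) for r in range(1, n - c + 1)))

def sum_ci_for(c, n):
    """Total completion time of T_c, via the (k - r + 1) weighting on each machine."""
    return (sum((c - r + 1) * phi1(r) for r in range(1, c + 1)) +
            sum((n - c - r + 1) * phi2(r) for r in range(1, n - c + 1)))

n = 4
print(min(range(n + 1), key=lambda c: cmax_for(c, n)))    # best split for C_max
print(min(range(n + 1), key=lambda c: sum_ci_for(c, n)))  # best split for sum C_i
```

The scan evaluates n + 1 candidate values of c, and each evaluation is linear in n, so the whole procedure is polynomial.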
In general, the Pm|p_{j,r} = ϕ^j(r)|C_max and Pm|p_{j,r} = ϕ^j(r)|ΣC_i problems can be solved by greedily assigning each of the n jobs to a machine that guarantees the lowest C_max at the moment of assignment, as shown in Algorithm 2.

Algorithm 2
Input: An instance of the Pm|p_{j,r} = ϕ^j(r)|C_max problem
Output: An optimal schedule T
1: T ← an empty schedule for m machines
2: for j ← 1, 2, ..., m do
3: c_j ← 0 (the number of jobs assigned to machine P_j so far)
4: for i ← 1, 2, ..., n do
5: p ← the index of a machine on which job J_i would obtain the lowest completion time
6: In schedule T, assign job J_i to machine P_p so it is executed as soon as possible.
7: The job is executed in the (c_p + 1)-th position on P_p; c_p ← c_p + 1.
8: return T

Theorem Algorithm 2 generates an optimal schedule for the Pm|p_{j,r} = ϕ^j(r)|C_max problem.

Proof One can notice that for a single job it is optimal to select a machine that guarantees the lowest completion time of this job. Assume that n > 1 is the lowest number of jobs for which Algorithm 2 leads to a suboptimal schedule. Let T_n be the schedule generated by Algorithm 2 for n jobs and let T*_n be an optimal schedule for the same instance. It must hold that C_max(T*_n) < C_max(T_n), because T_n is suboptimal. Let us remove from schedule T*_n a job for which the completion time is equal to C_max(T*_n). Call the resulting schedule T*_{n−1}.
By the construction of Algorithm 2, the schedule T_{n−1} obtained by removing the last-assigned job from T_n is exactly the schedule generated by Algorithm 2 for n − 1 jobs and, as n is the smallest counterexample, an optimal schedule of n − 1 jobs. For that reason it must hold that C_max(T_{n−1}) ≤ C_max(T*_{n−1}) ≤ C_max(T*_n) < C_max(T_n). At the same time, there exists at least one machine P_j such that the number of jobs assigned to it in the T_{n−1} schedule is lower than in the T*_n schedule. After assigning the new job to the last position on that machine we obtain a new schedule, S_n; since all the ϕ^j functions are positive, the load of P_j in S_n does not exceed its load in T*_n, and hence C_max(S_n) ≤ C_max(T*_n). Again, by the construction of the algorithm, it must hold that C_max(T_n) ≤ C_max(S_n). Consequently, C_max(T_n) ≤ C_max(S_n) ≤ C_max(T*_n) < C_max(T_n), which is a contradiction. A similar reasoning can be provided for the ΣC_i objective function.
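The greedy rule of Algorithm 2 can be sketched with a heap over machines; the heap keeps, for each machine, the completion time its next job would attain. The instance below is our own illustration:

```python
import heapq

def greedy_cmax(n, phis):
    """Greedy for Pm|p_{j,r} = phi_j(r)|C_max: each job goes where it finishes earliest.
    phis[j](r) is the time of the r-th job on machine j."""
    # Heap entries: (finish time of the next job on this machine, machine, load, jobs so far).
    heap = [(phi(1), j, 0, 0) for j, phi in enumerate(phis)]
    heapq.heapify(heap)
    assignment = []
    for _ in range(n):
        _, j, load, k = heapq.heappop(heap)
        load += phis[j](k + 1)                  # the job takes position k + 1 on machine j
        k += 1
        assignment.append(j)
        heapq.heappush(heap, (load + phis[j](k + 1), j, load, k))
    return assignment, max(entry[2] for entry in heap)

phis = [lambda r: r, lambda r: 3]               # P1 ages linearly, P2 is constant
assignment, cmax = greedy_cmax(4, phis)
print(assignment, cmax)
```

With the machines kept in a priority queue, each of the n assignments costs O(log m) time.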
Before the for loop (line 4), one needs to perform O(m) operations. In the for loop, the number of operations is bounded by O(nm). However, if one implements P as a priority queue using a strict Fibonacci heap (Brodal et al. 2012), then the machine guaranteeing the lowest completion time of the next job can be found and updated in O(log m) worst-case time, and the total running time drops to O(m + n log m).

We present a summary of the results discussed up to this moment in Table 1. The results proved in this section are printed in bold. Other results in the table follow either from their generalizations (if they are polynomial) or from their special cases (in the case of NP-Hardness).

Chained jobs
In the previous section, we analyzed the Pm|p_{i,j,r} = ϕ_i^j(r)|C_max and Pm|p_{i,j,r} = ϕ_i^j(r)|ΣC_i problems of scheduling jobs with no precedence constraints. Now, we will assume that the jobs need to be executed in a given order, i.e. that they are chained.
It might be observed that when one schedules jobs with fixed processing times, the problems in which jobs are single-chained are usually easier than the ones with independent jobs. For example, the P||C max problem is NP-Hard, while the P|chain|C max problem can be solved in linear time. It is also true for more general R||C max and R|chain|C max problems.
However, it seems that the case of chained jobs is generally harder than the case of independent jobs when it comes to mixed job-, machine- and position-dependent processing times. Despite this, in this section we will present polynomial algorithms for the Pm|chain, p_{j,r} = ϕ^j(r)|C_max and Pm|chain, p_{j,r} = ϕ^j(r)|ΣC_i problems. We will analyze these problems one after another.
Let us begin with the P2|chain, p_{j,r} = ϕ^j(r)|C_max problem. Assume that we are given n chained jobs, i.e. J_1 → J_2 → · · · → J_n. One can observe that in any optimal schedule related to the P2|chain, p_{j,r} = ϕ^j(r)|C_max problem jobs must be executed one after another, without unnecessary idleness. This means that the value of the C_max objective function is simply the sum of the actual processing times of all jobs. Moreover, the objective value does not depend on the exact job-to-machine assignment, but rather on the number of jobs assigned to the first (and, consequently, to the second) machine. If one assigns c jobs to the first machine, then the sum of their actual processing times is equal to Σ_{r=1}^{c} ϕ^1(r). As there are n jobs in total, the remaining n − c jobs must be assigned to the second machine, with the sum of their actual processing times equal to Σ_{r=1}^{n−c} ϕ^2(r). This means that for any schedule T_c with exactly c jobs assigned to the first machine and with no idle times between consecutive jobs, we have

C_max(T_c) = Σ_{r=1}^{c} ϕ^1(r) + Σ_{r=1}^{n−c} ϕ^2(r).

These considerations lead us to a linear algorithm for the P2|chain, p_{j,r} = ϕ^j(r)|C_max problem, presented as Algorithm 3.
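The scan behind Algorithm 3 can be sketched directly from the formula for C_max(T_c); the functions ϕ^1 and ϕ^2 below are made up for illustration:

```python
def chain_cmax(c, n, phi1, phi2):
    """C_max of a single-chain schedule with c jobs on P1: chained jobs never
    overlap, so the makespan is the plain sum of all actual processing times."""
    return (sum(phi1(r) for r in range(1, c + 1)) +
            sum(phi2(r) for r in range(1, n - c + 1)))

phi1 = lambda r: r * r       # P1 ages quickly
phi2 = lambda r: 10 - r      # P2 speeds up slowly (positive for r < 10)
n = 5
best_c = min(range(n + 1), key=lambda c: chain_cmax(c, n, phi1, phi2))
print(best_c, chain_cmax(best_c, n, phi1, phi2))
```

If the prefix sums of ϕ^1 and ϕ^2 are accumulated incrementally, the whole scan runs in O(n) time, matching the linear bound claimed for Algorithm 3.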

Fig. 2 Notation for a set of chains
It is worth noticing, based on the results by Przybylski (2017), that if the ϕ function is non-increasing, then the Pm|chains, p_r = ϕ(r)|C_max problem with more than one chain of jobs to be scheduled can be solved in O(n) time.
In the case of the P2|chain, p_{j,r} = ϕ^j(r)|C_max problem, one can find an optimal schedule by answering a simple question: how many jobs should be assigned to each of the two machines? Given an optimal number c of jobs that should be assigned to machine P_1, one of (n choose c) optimal schedules can be obtained by choosing any c jobs from the original chain of n jobs. However, in the case of the ΣC_i objective function, there are two questions to be answered: first, how many jobs should be assigned to each of the two machines, and second, which jobs these should be. Thus, the P2|chain, p_{i,j,r} = ϕ_i^j(r)|ΣC_i problem cannot be reformulated as the Assignment problem, even if we assume that the number of jobs executed on each of the machines is constant. This is so because jobs need to be executed in a given order. As a consequence, the actual processing time of a job J_i in any feasible schedule directly depends on how many jobs from the set {J_1, J_2, ..., J_{i−1}} were assigned to each of the machines.
It turns out that the Pm|chain, p_{j,r} = ϕ^j(r)|ΣC_i problem can still be solved by a polynomial algorithm that uses, as a subroutine, a known algorithm for the 1|chains|ΣC_i problem. For the latter case, assume that we are given a set of m chains of jobs, containing n_1, n_2, ..., n_m jobs, respectively, where Σ_{j=1}^{m} n_j = n. Denote those jobs as presented in Fig. 2. Without loss of generality, we may assume that if n_j = 0 for some j, then the chain is empty and the index j is omitted.
Given such a set of chains, one can find a single-machine schedule that is optimal in the sense of the ΣC_i objective function, based on Algorithm 5 by Conway et al. (1967). This algorithm requires O(n^2) time and is presented in its original form, i.e. as a list of steps.

Algorithm 5
Input: The set of chains of jobs, as described in Fig. 2, with the processing time of each job J_{j,k} equal to p_{j,k}
Output: An optimal schedule for the 1|chains|ΣC_i problem
Step 1. For each job J_{j,k} compute x_{j,k} = (1/k) · Σ_{l=1}^{k} p_{j,l}.
Step 2. For each chain j = 1, ..., m compute y_j = min_k x_{j,k} and h_j = arg min_k x_{j,k}.
Step 3. Choose j* such that y_{j*} ≤ y_j for all possible values of j and place the first h_{j*} jobs from chain j* in the schedule.
Step 4. Neglecting all scheduled jobs, reindex the remaining jobs and recompute y j and h j .
Step 5. Repeat Steps 3 and 4 until all the jobs are scheduled.
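A compact sketch of Algorithm 5 follows, assuming the classic rule in which x_{j,k} is the average processing time of the first k unscheduled jobs of chain j (treat that exact form as an assumption of this sketch):

```python
def conway_chains(chains):
    """1|chains|sum C_i by repeatedly scheduling the chain prefix with the
    smallest average processing time. chains[j] lists the times of chain j in order."""
    chains = [list(c) for c in chains if c]
    schedule = []
    while chains:
        best = None                                # (x_{j,k}, j, k) with the smallest x
        for j, chain in enumerate(chains):
            prefix = 0
            for k, p in enumerate(chain, start=1):
                prefix += p
                x = prefix / k                     # Step 1: x_{j,k}
                if best is None or x < best[0]:
                    best = (x, j, k)               # Steps 2-3: y_j, h_j and the best chain
        _, j, h = best
        schedule += chains[j][:h]                  # Step 3: place the first h_j jobs
        del chains[j][:h]                          # Step 4: neglect the scheduled jobs
        chains = [c for c in chains if c]
    return schedule

def sum_ci(times):
    """Total completion time of a single-machine sequence."""
    t = total = 0
    for p in times:
        t += p
        total += t
    return total

print(conway_chains([[4, 1], [2, 5]]))   # two chains: J_{1,1} -> J_{1,2} and J_{2,1} -> J_{2,2}
```

On this two-chain example the rule first schedules the short job of the second chain, then the whole first chain, and finally the remaining job.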
Algorithm 5 can be adapted to the Pm|chain, p_{j,r} = ϕ^j(r)|ΣC_i problem in the following way. Given an instance of the latter problem, with a chain of length n, consider any feasible schedule. In such a schedule, n_j jobs are executed on machine P_j, with their actual processing times equal to ϕ^j(1), ϕ^j(2), ..., ϕ^j(n_j), respectively, for each 1 ≤ j ≤ m. As no two jobs are executed at the same time, this can be viewed as a single-machine scheduling problem in which we schedule m independent chains of jobs with fixed processing times to minimize the value of ΣC_i.
Let us be given an instance of the Pm|chain, p_{j,r} = ϕ^j(r)|ΣC_i problem. Also, let (n_1, n_2, ..., n_m) be a sequence of non-negative integers such that Σ_{j=1}^{m} n_j = n. For such a sequence, we generate an instance of the 1|chains|ΣC_i problem, in which we have at most m chains of jobs. The j-th chain consists of n_j jobs, J_{j,1} → J_{j,2} → ... → J_{j,n_j}, with their fixed processing times equal to ϕ^j(1), ϕ^j(2), ..., ϕ^j(n_j), respectively. Given a single-machine schedule that provides the lowest value of ΣC_i over all the (n_1, n_2, ..., n_m) sequences, a corresponding m-machine optimal schedule with the same value of ΣC_i can be generated, as shown in Algorithm 6.

Algorithm 6
Input: An instance of the Pm|chain, p_{j,r} = ϕ^j(r)|ΣC_i problem with a chain of n jobs
Output: An optimal schedule T
1: C_s ← ∞
2: T_s ← an empty schedule
3: J_s ← an empty set of chains
4: for all sequences (n_1, n_2, ..., n_m) of non-negative integers such that n_1 + n_2 + ... + n_m = n do
5: J ← an empty set of chains
6: for j ← 1, 2, ..., m do
7: Add to J a chain of n_j jobs, J_{j,1} → J_{j,2} → ... → J_{j,n_j},
8: with their processing times equal to ϕ^j(1), ϕ^j(2), ..., ϕ^j(n_j).
9: Find an optimal single-machine schedule T for J, based on Alg. 5.
10: if ΣC_i(T) < C_s then
11: T_s ← T
12: J_s ← J
13: C_s ← ΣC_i(T)
14: T ← an empty schedule for m identical parallel machines
15: for i ← 1, 2, ..., n do
16: J_{j,k} ← the i-th job in the T_s schedule, as represented in the J_s set
17: (The processing time of job J_{j,k} is equal to ϕ^j(k).)
18: Assign the i-th job from the original chain of jobs to machine P_j.
19: The processing time of this job will be equal to ϕ^j(k).
20: return T

Notice that inside Algorithm 6 we perform Algorithm 5 O(n^{m−1}/(m − 1)!) times. As the latter algorithm needs O(n^2) operations, the first for loop (line 4) requires O(n^{m+1}/(m − 1)!) operations in total. The second for loop (line 15) can be performed in O(n) time. So, Algorithm 6 requires a total running time of O(n^{m+1}).

Conclusions
We considered a set of parallel-machine scheduling problems with jobs that are position-dependent. However, the actual processing time of a job might also freely depend on the job itself and on the machine it has been assigned to. We analyzed the cases of independent and chained jobs, with the maximum completion time and the total completion time as the objective functions.
As the Pm|p_{i,j,r} = ϕ_i^j(r)|C_max problem is NP-Hard in general, we proposed a number of polynomial algorithms that solve some of its special cases. Those special cases have been obtained by reducing the number of different ϕ functions. In the case of the Pm|p_{i,j,r} = ϕ_i^j(r)|ΣC_i problem, we presented a full set of polynomial algorithms for it and its subproblems. Then, for the case of chained jobs, we presented a series of polynomial algorithms for the Pm|chain, p_{j,r} = ϕ^j(r)|C_max and Pm|chain, p_{j,r} = ϕ^j(r)|ΣC_i problems.
Although the presented results are theoretical, they can be applied in practice. As the analyzed model is very general, it covers various job-, machine-, and position-dependent models, including ones that might not have been discussed yet. For example, the presented results hold for classic learning and aging models (e.g. p_{i,r} = p_i · r^{a_i}, where a_i < 0 or a_i > 0, respectively). They also remain valid when DeJong's learning effect is considered. Our results can be a base for further research on parallel-machine scheduling of variable-time jobs for which no assumptions on the model of positional dependency are made.
There are still some open questions related to the problems discussed in this paper. Namely, we now focus on answering the question of whether the Pm|chain, p_{i,j,r} = ϕ_i^j(r)|C_max and Pm|chain, p_{i,j,r} = ϕ_i^j(r)|ΣC_i problems are polynomially solvable. It is known that the answer is positive for job processing times that are only job- and machine-dependent. However, for position-dependent jobs, the answer requires further analysis.