Abstract
Structural properties of optimal preemptive schedules have been studied in a number of recent papers with a primary focus on two structural parameters: the minimum number of preemptions necessary, and a tight lower bound on shifts, i.e., the sizes of intervals bounded by the times created by preemptions, job starts, or completions. These two parameters have been investigated for a large class of preemptive scheduling problems, but so far only rough bounds have been derived for specific problems. This paper sharpens the bounds on these structural parameters for a well-known open problem in the theory of preemptive scheduling: instances consist of in-trees of n unit-execution-time jobs with release dates, and the objective is to minimize the total completion time on two processors. This is among the current, tantalizing "threshold" problems of scheduling theory: our literature survey reveals that any significant generalization leads to an NP-hard problem, while any slight simplification leads to a tractable problem with a polynomial-time solution. For the above problem, we show that the number of preemptions necessary for optimality need not exceed \(2n-1\); that the number must be of order \({\varOmega }(\log n)\) for some instances; and that the minimum shift need not be less than \(2^{-(2n+1)}\). These bounds are obtained by combinatorial analysis of optimal preemptive schedules rather than by the analysis of polytope corners for linear-program formulations of the problem, the approach found in earlier papers. The bounds follow immediately from a fundamental structural property called normality, by which minimal shifts of a job are exponentially decreasing functions. In particular, the first interval between a preempted job's start and its preemption must be a multiple of 1/2, the second such interval must be a multiple of 1/4, and in general, the ith preemption must occur at a multiple of \(2^{-i}\).
We expect the new structural properties to play a prominent role in finally settling a vexing, still-open question of complexity.
1 Introduction
We study structural properties of optimal preemptive schedules for a classic problem of scheduling unit-execution-time (UET) jobs with precedence constraints and release dates on two processors. Optimal nonpreemptive schedules for this and related problems have been well researched in the literature for various objective functions and restrictions. Fujii et al. (1971) present a matching-based algorithm, and Coffman and Graham (1972) devise a job-labeling algorithm for minimum-makespan nonpreemptive schedules. Garey and Johnson introduce \(O(n^2)\)- and \(O(n^{2.81})\)-time algorithms for minimizing maximum lateness for jobs without release dates (Garey and Johnson 1976) and with release dates (Garey and Johnson 1977), respectively. Gabow (1982) designs an almost linear-time algorithm for the minimum-makespan problem. Leung et al. (2001) and Carlier et al. (2014) extend these results to precedence delays. Baptiste and Timkovsky (2004) focus on minimization of total completion time and present an \(O(n^9)\)-time shortest-path optimization algorithm for scheduling jobs with release dates. They also conjecture that there always exist so-called ideal schedules that minimize both maximum completion time and total completion time for jobs with release dates. This has been known to hold true for equal release dates without preemptions (Coffman and Graham 1972) and with preemptions (Coffman et al. 2003). Coffman et al. (2012) prove the Baptiste–Timkovsky conjecture and give an \(O(n^3)\) algorithm for the minimization of total completion time for jobs with release dates—a major improvement over the \(O(n^9)\)-time algorithm in Baptiste and Timkovsky (2004).
Optimal preemptive schedules have proven more challenging to compute efficiently, especially for jobs with release dates and the total-completion-time criterion. Coffman et al. (2012) prove that these schedules are not ideal, that is, for some instances any schedule minimizing total completion time will be longer than the schedule minimizing maximum completion time. That holds even for in-tree precedence constraints. This last result serves as a point of departure for this paper, with its focus on in-tree precedence constraints, release dates, and the total-completion-time criterion: the problem \(P2|pmtn,in\text {-}tree,r_j,p_j=1|\sum C_j\) in the well-known three-field notation. Despite numerous efforts, the computational complexity of the problem remains open: reducing the number of processors to \(m=1\) renders the problem polynomially solvable (Baptiste et al. 2004); and so does dropping the precedence constraints (Herrbach and Leung 1990), dropping the release dates (Coffman et al. 2003), or assuming out-trees instead of in-trees (Baptiste and Timkovsky 2001; Brucker et al. 2003; Huo and Leung 2005). With this background in mind, we focus on key structural properties of optimal preemptive schedules for the problem \(P2|pmtn,in\text {-}tree,r_j,p_j=1|\sum C_j\). The complexity of the problem also remains open for \(m=3\) machines; this holds for both the nonpreemptive and preemptive cases and for both the makespan and total-completion-time objectives.
Sauer and Stone (1987) study the problem with no release dates and maximum-completion-time (makespan) minimization. They show that, for every instance, there is an optimal preemptive schedule with at most n preemptions, where preemptions occur at multiples of 1/2, and go on to define a shift as the duration of an interval between two consecutive time points, each of which is a job start, a job end, a job resumption, or a job preemption. The shortest necessary shift in an optimal schedule is then called its resolution. The minimum resolution over all instances of a given preemptive scheduling problem is called the problem resolution. Following Sauer and Stone (1987), the minimum number of preemptions and the minimum resolution necessary for optimal schedules have become two main structural parameters in preemptive scheduling. They have been investigated for a large class of preemptive scheduling problems by Baptiste et al. (2011), who give general bounds for these parameters.
Coffman et al. (2015) provide bounds on the resolutions of various scheduling problems—we refer the reader to their work for a comprehensive overview. In particular, they show upper bounds of \(m^{-n/(m+1)}\) and \(m^{-(n-1)/(m+1)}\) on the resolutions of the problems \(P|pmtn,in\text {-}tree,r_j,p_j=1|C_{\max }\) and \(P|pmtn,in\text {-}tree,r_j,p_j=1|\sum C_j\), respectively, where n is the number of jobs and m is the number of processors. Thus, for the problem \(P2|pmtn,in\text {-}tree,r_j,p_j=1|\sum C_j\) studied in this paper one immediately obtains an upper bound of \(2^{-(n-1)/3}\) on its resolution. As for lower bounds, Coffman et al. (2015) show that the resolution of \(P|pmtn,prec,r_j|\sum w_jC_j\) is at least \((m+n)^{-(2n+1)/2}\).
The papers of Sauer and Stone (1987), Baptiste et al. (2011), and Coffman et al. (2015) obtain their resolution bounds by analyzing the corners of feasibility regions of linear programs designed for specific problems. Our approach is combinatorial and does not make use of the theory of linear programming. It yields a lower bound of \(2^{-(2n+1)}\) on the problem resolution of \(P2|pmtn,in\text {-}tree,r_j,p_j=1|\sum C_j\), which is a significant improvement for in-trees over the lower bound of \((n+2)^{-(2n+1)/2}\) that can be derived directly from Coffman et al. (2015).
We introduce in this paper the concept of normal schedules, in which shifts decrease as a function of time: the first shift is a multiple of 1/2, the second one is a multiple of 1/4, and in general, the ith shift is a multiple of \(2^{-i}\). We prove that there exist optimal schedules that are normal for in-trees. However, we conjecture that this is no longer the case for arbitrary precedence constraints, i.e., that there are instances for which no optimal schedule is normal. The normality of a schedule implies that each shift is a multiple of \(2^{-(2n+1)}\), which is a much stronger claim than the usual requirement that all shifts be no shorter than the problem resolution. Normality also implies that there exists an optimal schedule with a finite number (in particular, a number not exceeding \(2n-1\)) of events, which are times when jobs start, end, or are preempted. Thus, \(2n-1\) is an upper bound on the number of preemptions necessary for optimality. We also observe that a job may be required to preempt at a point which is neither the start nor the end of another job in order to ensure optimality. These preemption events unrelated to job starts or completions seem to be confined to rather contrived instances; they are more the exception than the rule in preemptive scheduling. We also prove that there exists a sequence of problem instances indexed by n for which the number of preemptions in the corresponding optimal schedules is \({\varOmega }(\log n)\). Thus, a tight upper bound on the number of preemptions required for optimality must be at least logarithmic in n.
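The shift condition of normality is easy to check mechanically. A minimal sketch in Python (our own representation, not from the paper), assuming a schedule is given by its sorted vector of event times as exact fractions:

```python
from fractions import Fraction

def is_normal(events):
    """Check the shift condition of normality: the ith shift, i.e.,
    the length of the interval between the ith and (i+1)st events,
    must be a multiple of 1/2**i.  `events` is a sorted list of
    Fraction time points with events[0] == 0."""
    for i in range(1, len(events)):
        shift = events[i] - events[i - 1]
        # shift is a multiple of 1/2**i  iff  shift * 2**i is an integer
        if (shift * 2 ** i).denominator != 1:
            return False
    return True
```

For instance, an event list \(0, 3/2, 5/2\) passes the check (shifts 3/2 and 1 are multiples of 1/2 and 1/4, respectively), while a first shift of 1/3 fails it.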
2 Our approach and results: a general overview
In Sect. 3, we exhibit a sequence of problem instances indexed by n where the number of preemptions of a single job is of order \({\varOmega }(\log n)\). In the remaining sections we focus on proving our lower bound on the minimum shift. We start by showing that an optimal schedule is a concatenation of blocks, each with at most three jobs. No job starts or completes inside a block, but there is at least one job start at the beginning of a block, and/or at least one job completion at the end of a block. This is done in Sects. 4.2 and 4.3. A block is called l-normal if each job duration in the block is a multiple of \(1/2^{l+1}\) and the block length is a multiple of \(1/2^{l}\). In a normal schedule the first block must be 1-normal, the second 2-normal, and so on. These concepts are introduced in Sect. 4.4, where it is verified that, in a normal schedule with q blocks, each preemption occurs at a multiple of \(1/2^{q+1}\), where \(q\le 2n-1\). Our goal is to show that there exists an optimal schedule that is normal. Our proof is by contradiction. We begin by assuming an optimal schedule that is also maximal in the sense that it has a latest possible abnormality point i, i.e., a latest block i which is not i-normal. We show that such a block must have exactly three jobs. One completes at the end of the block and has an \((i+1)\)-normal duration, but the durations of the other two are not \((i+1)\)-normal, as shown by Lemma 12. These two jobs then trigger an alternating chain of jobs to which they also belong, as shown in Sect. 6. The completion times of the jobs in the chain are not \((i+1)\)-normal, which makes it possible, under normal-block circumstances, to either extend the chain by one job or prove that the abnormality point must exceed i; this is our main result in Theorem 3. Thus, we get a contradiction in either case since the number of jobs is finite and the schedule is maximal.
The normal-block circumstances here mean that the alternating chain does not end with a certain structure that we call an A-configuration, a configuration that prevents us from extending the alternating chain. However, we show that there always exists a maximal schedule that does not include an A-configuration. This is done in Sect. 5, where the key result is Theorem 2. The main result of the paper follows and states that there is a normal schedule that is optimal for \(P2|pmtn,in\text {-}tree,r_j,p_j=1|\sum C_j\).
3 How many preemptions of a job are required?
In this section we show that, for any given number n of jobs, it is sometimes necessary to preempt a job \(p={\varOmega }(\log n)\) times. Let \(A_i\), \(i\ge 0\), be a set of four jobs \(a_1^i,a_2^i,a_3^i,a_4^i\) such that \(r(a_j^i)=2i\) for each \(j\in \{1,2,3\}\), \(r(a_4^i)=2i+1\) and \(a_j^i\prec a_4^i\) for each \(j\in \{1,2,3\}\). Then, define \(\mathcal {J}_p=\bigcup _{i=0}^{p}A_i\), where \(a_4^{i}\prec a_4^{i+1}\) for each \(i\in \{0,\ldots ,p-1\}\). (See Fig. 1.)
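The instance \(\mathcal {J}_p\) is straightforward to generate programmatically. A small sketch (the tuple labels \((i,j)\), standing for \(a_j^i\), are our own encoding):

```python
def build_instance(p):
    """Build J_p from Sect. 3: release dates and the in-tree
    immediate-successor map.  Job (i, j) stands for a_j^i."""
    release, succ = {}, {}
    for i in range(p + 1):
        for j in (1, 2, 3):
            release[(i, j)] = 2 * i      # r(a_j^i) = 2i for j = 1, 2, 3
            succ[(i, j)] = (i, 4)        # a_j^i precedes a_4^i
        release[(i, 4)] = 2 * i + 1      # r(a_4^i) = 2i + 1
        if i < p:
            succ[(i, 4)] = (i + 1, 4)    # a_4^i precedes a_4^{i+1}
    return release, succ
```

Since every job has at most one immediate successor, the constraints indeed form an in-tree, and the instance has \(4(p+1)\) jobs.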
We prove that the job \(a_4^p\) should complete exactly at \(2p+3-1/2^{p+1}\) in any optimal schedule. This is done by first proving that no feasible schedule (optimal or not) can complete \(a_4^p\) earlier (cf. Proposition 1), and then by proving that starting \(a_4^p\) later leads to a schedule that cannot be optimal (cf. Proposition 3).
Proposition 1
Let \(p\ge 0\) be any integer. If \(\mathcal {P}\) is a preemptive schedule for \(\mathcal {J}_p\), then the total length of the job \(a_4^p\) executing in \([2p+2,+\infty )\) is at least \(1-1/2^{p+1}\).
Proof
We prove the lemma by induction on p. Let \(p=0\). (Note that \(\mathcal {J}_0=A_0\).) Executing less than 1/2 units of \(a_4^0\) in \([2,+\infty )\) implies that \(a_4^0\) completes at \(5/2-\varepsilon \) for some \(\varepsilon >0\). This, however, requires completing each job in \(A_0\setminus \{a_4^0\}\) at \(3/2-\varepsilon \) or earlier, which is not possible.
Suppose now that the lemma holds for all integers smaller than p; we prove it for p. Let \(\mathcal {P}\) be a preemptive schedule for \(\mathcal {J}_p\). We consider all jobs that must execute in the time interval \(I=[2p,s(\mathcal {P},a_4^p)]\). Each job in \(A_{p}\setminus \{a_4^{p}\}\) executes in this interval, because \(r(a)=2p\) and \(a\prec a_4^{p}\) for each \(a\in A_{p}\setminus \{a_4^{p}\}\). By the induction hypothesis and by the fact that \(a\prec a_4^p\) for each \(a\in \mathcal {J}_{p-1}\), we obtain that the part of \(a_4^{p-1}\) that executes in I has length at least \(1-1/2^{p}\). Thus, the total length of all jobs that execute in I in \(\mathcal {P}\) is at least \(4-1/2^{p}\). Therefore, \(s(\mathcal {P},a_4^p)\ge 2p+(4-1/2^{p})/2=2p+2-1/2^{p+1}\).
Thus, at least \(1-1/2^{p+1}\) units of \(a_4^p\) execute in \([2p+2,+\infty )\), as required. \(\square \)
We iteratively construct a schedule \(\mathcal {P}_p\). Let \(\mathcal {P}_0\) be such that the jobs \(a_1^0,a_2^0,a_3^0\) fill both processors in the interval [0, 3/2] and \(a_4^0\) executes in [3/2, 5/2]. For \(p>0\), first take \(\mathcal {P}_{p-1}\) and then execute the jobs in \(A_p\) as follows:
(See Fig. 2.)
Note that, when \(p>0\), \(C(\mathcal {P}_{p},a_4^{p-1})=C(\mathcal {P}_{p-1},a_4^{p-1})=2p+1-1/2^{p}\) and hence \(\mathcal {P}_p\) is feasible. This gives the following.
Proposition 2
Let \(p\ge 0\) be any integer. There exists a schedule for \(\mathcal {J}_p\) that completes \(a_4^p\) at \(2p+3-1/2^{p+1}\). \(\square \)
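The value \(2p+3-1/2^{p+1}\) in Propositions 1 and 2 can be cross-checked with exact arithmetic: the averaging argument from the proof of Proposition 1 yields exactly the completion time achieved by \(\mathcal {P}_p\). A sketch (function names are ours, not the paper's):

```python
from fractions import Fraction as F

def claimed_completion(p):
    """C(P_p, a_4^p) = 2p + 3 - 1/2**(p+1), as in Proposition 2."""
    return 2 * p + 3 - F(1, 2 ** (p + 1))

def area_lower_bound(p):
    """Earliest completion of a_4^p by the averaging argument of
    Proposition 1: at least 4 - 1/2**p units of work (a_1^p, a_2^p,
    a_3^p and the tail of a_4^{p-1}) must run on two processors
    from time 2p before a_4^p starts, and a_4^p has unit length."""
    forced_work = 4 - F(1, 2 ** p)
    earliest_start = 2 * p + forced_work / 2
    return earliest_start + 1
```

The two values agree for every p, which is precisely why \(\mathcal {P}_p\) is optimal with respect to the completion time of \(a_4^p\).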
Proposition 3
Let \(p\ge 0\) be any integer. Each optimal schedule for \(\mathcal {J}_p\) completes \(a_4^p\) at \(2p+3-1/2^{p+1}\) and satisfies the following: the total length of idle time in \([0,2(j+1)]\) is \(1-1/2^{j+1}\) for each \(j\in \{0,\ldots ,p\}\), and there is no time interval contained in \([0,2(p+1)]\) in which both processors are idle.
Proof
We prove the lemma by contradiction, i.e., suppose that, for some \(p\ge 0\), there exists an optimal schedule \(\mathcal {P}_p'\) such that \(C(\mathcal {P}_p',a_4^p)\ne 2p+3-1/2^{p+1}\). Let, without loss of generality, p be the minimum integer for which this holds. One can verify the lemma for \(p=0\) and hence \(p>0\). By Proposition 1, \(C(\mathcal {P}_p',a_4^p)>2p+3-1/2^{p+1}\). (1)
By Proposition 2 and by the minimality of p, there exists an optimal schedule \(\mathcal {P}_{p-1}\) for \(\mathcal {J}_{p-1}\) that executes at least \(1-1/2^{p}\) units of \(a_4^{p-1}\) in \([2p,+\infty )\).
We argue that the total length of \(a_4^{p-1}\), denoted by x, that executes in \(\mathcal {P}_p'\) in the interval \([2p,+\infty )\) equals exactly \(1-1/2^{p}\). By Proposition 1, \(x\ge 1-1/2^{p}\).
Define a schedule \(\mathcal {P}'\) that equals \(\mathcal {P}_{p-1}\) in the interval [0, 2p] and equals \(\mathcal {P}_p'\) in the interval \([2p,+\infty )\). Note that \(\mathcal {P}'\) is not feasible only if the total length of \(a_4^{p-1}\) executing in \(\mathcal {P}'\) (which equals \(1/2^{p}+x\)) is greater than 1. However, \(1/2^{p}+x\ge 1\) because \(x\ge 1-1/2^{p}\), so \(\mathcal {P}'\) may execute a surplus of \(x-1+1/2^{p}\ge 0\) units of \(a_4^{p-1}\).
We then obtain a schedule \(\mathcal {P}\) by removing a total length of \(x-1+1/2^{p}\) of \(a_4^{p-1}\) from \(\mathcal {P}'\) in a way that minimizes the completion time of \(a_4^{p-1}\) in \(\mathcal {P}\), which gives \(C(\mathcal {P},a_4^{p-1})\le C(\mathcal {P}',a_4^{p-1})-(x-1+1/2^{p})\). The schedule \(\mathcal {P}\) is feasible and \(C(\mathcal {P},a)=C(\mathcal {P}',a)\) for each \(a\in \mathcal {J}_p\setminus \{a_4^{p-1}\}\). Hence, by (3),
Thus, by the optimality of \(\mathcal {P}_p'\), \(x-1+1/2^{p}=0\), i.e., \(x=1-1/2^{p}\) as required.
In the schedule \(\mathcal {P}_p'\), the jobs that are executed in the time interval \([2p,C(\mathcal {P}_p',a_4^p)]\) are the ones in \(A_p\) and x units of \(a_4^{p-1}\). By a case analysis one can prove that
Note that, by construction,
Moreover, \(x=1-1/2^{p}\) and the minimality of p imply that
This, (1) and \(C(\mathcal {P}_p,a_4^p)=2p+3-1/2^{p+1}\) imply that \(\mathcal {P}_p'\) is not optimal. This gives the desired contradiction.
Note that it follows that in each optimal schedule \(\mathcal {P}\) for \(\mathcal {J}_p\), \(C(\mathcal {P},a_4^j)=2j+3-1/2^{j+1}\) for each \(j\in \{0,\ldots ,p\}\). Since \(a_4^j\) is a successor of all jobs in \(\mathcal {J}_j\), we obtain that there is, in \(\mathcal {P}\), idle time on exactly one processor in the interval \((2(j+1)-1/2^{j+1},2(j+1))\). Also note that, for each \(j\in \{0,\ldots ,p\}\), no idle time is possible in the interval \(I=(2j,2(j+1)-1/2^{j+1})\) because, for \(j>0\), \(a_1^j,a_2^j,a_3^j\) and a part of \(a_4^{j-1}\) of length \(1-1/2^{j}\) must execute in I, and for \(j=0\), \(a_1^0,a_2^0\) and \(a_3^0\) must execute in I. This completes the proof of the claim. \(\square \)
Note that, for any given \(p\ge 0\), the number of jobs in \(\mathcal {J}_p\) equals \(4(p+1)\). Proposition 3 implies that each optimal schedule for \(\mathcal {J}_p\) has a job that completes at a time point that is a multiple of \(1/2^{p+1}\) but is not a multiple of \(1/2^{p}\). It follows that there exists a set \(\mathcal {J}\) of n jobs such that no optimal solution to \(P2|pmtn,in\text {-}tree,r_j,p_j=1|\sum C_j\) for \(\mathcal {J}\) has each job start, completion and preemption occurring at a time point that is a multiple of \(1/2^{n/4-1}\). We note that this upper bound on the resolution of the problem is slightly weaker than the bound \(2^{-(n-1)/3}\) proved in Coffman et al. (2015). The main result of this section is the following lower bound on the number of preemptions of one job in an optimal schedule.
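The dyadic claim above is easy to verify numerically: \(2p+3-1/2^{p+1}\) is a multiple of \(1/2^{p+1}\) but not of \(1/2^{p}\). A sketch with exact rationals (function name is ours):

```python
from fractions import Fraction as F

def is_multiple(t, unit):
    """True iff t is an integer multiple of `unit`."""
    return (t / unit).denominator == 1

for p in range(8):
    t = 2 * p + 3 - F(1, 2 ** (p + 1))   # completion time of a_4^p
    assert is_multiple(t, F(1, 2 ** (p + 1)))
    assert not is_multiple(t, F(1, 2 ** p))
```

For example, with \(p=1\) the completion time is 19/4, a multiple of 1/4 but not of 1/2.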
Theorem 1
Given any positive integer p, there exists an instance of the problem \(P2|pmtn,in\text {-}tree,r_j,p_j{=}1|\sum C_j\) such that, in any optimal schedule for the instance, there is a job that is preempted at least \(p={\varOmega }(\log |\mathcal {J}|)\) times, where \(\mathcal {J}\) is the job set of the instance.
Proof
Let \(\mathcal {P}_p\) be an optimal schedule for \(\mathcal {J}_p\). Thus \(\mathcal {P}_p\) satisfies the conditions in Proposition 3. By Theorem 4, we may assume that \(\mathcal {P}_p\) is normal. Define \(l=C(\mathcal {P}_p,a_4^p)\cdot 2^{c}\), where \(c=2|\mathcal {J}_p|+3\).
Take \(\mathcal {J}=\{a\}\cup \mathcal {J}_p\cup \{b_1,\ldots ,b_l\}\). The precedence constraints between the jobs in \(\mathcal {J}_p\) are as in Fig. 1. We extend the precedence relation to \(\mathcal {J}\) by additionally enforcing:
We first construct a schedule \(\mathcal {P}'\) as follows. Take \(\mathcal {P}_p\) and extend it by executing a so that \(C(\mathcal {P}',a)=2(p+1)+1/2^{p+1}\) and executing \(b_1,\ldots ,b_l\) so that \(C(\mathcal {P}',b_i)=C(\mathcal {P}',b_{i1})+1\) for each \(i\in \{1,\ldots ,l\}\), where \(b_0=a_4^p\). By Proposition 3, such a schedule \(\mathcal {P}'\) exists and a is preempted p times in \(\mathcal {P}'\).
Let \(\mathcal {P}\) be an optimal normal schedule for \(\mathcal {J}\). By Theorem 4, such a schedule exists. Suppose for a contradiction that a is preempted at most \(p-1\) times in \(\mathcal {P}\). By Proposition 3, we have \(C(\mathcal {P},a)<C(\mathcal {P}',a)\) and \(C(\mathcal {P},a_4^p)>C(\mathcal {P}',a_4^p)\). The number of events in \(\mathcal {P}\) (respectively, in \(\mathcal {P}'\)) in the interval \([0,C(\mathcal {P},a_4^p)]\) (respectively, \([0,C(\mathcal {P}',a_4^p)]\)) is at most \(2|\mathcal {J}_p|+3\) because each event either equals 0 or is the start or completion time of a job. Since both schedules are normal,
Thus, by definition of l,
By Proposition 3, \(\mathcal {P}\) restricted to the jobs in \(\mathcal {J}_p\) is not optimal for \(\mathcal {J}_p\). This in particular implies
This, together with (4), implies
Since, by construction of \(\mathcal {P}'\),
we obtain that the total completion time of \(\mathcal {P}'\) is strictly smaller than that of \(\mathcal {P}\), which gives a required contradiction.
Finally, note that \(C(\mathcal {P}_p,a_4^p)\le |\mathcal {J}_p|\le c\) and hence \(l\le 2^{2c}\). Thus, \(|\mathcal {J}|=|\mathcal {J}_p|+1+l=2^{O(p)}\) because \(c=O(p)\) and \(|\mathcal {J}_p|=O(p)\). This implies that \(p={\varOmega }(\log |\mathcal {J}|)\) as required. \(\square \)
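The size bookkeeping at the end of the proof can be replayed numerically. A sketch (names are ours; the constant 1/24 below is merely a convenient witness for the \({\varOmega }(\log )\) claim on small instances, not a bound from the paper):

```python
from math import log2
from fractions import Fraction as F

def instance_size(p):
    """|J| = |J_p| + 1 + l for the instance in the proof of
    Theorem 1, with |J_p| = 4(p+1), c = 2|J_p| + 3 and
    l = C(P_p, a_4^p) * 2**c."""
    jp = 4 * (p + 1)
    c = 2 * jp + 3
    l = (2 * p + 3 - F(1, 2 ** (p + 1))) * 2 ** c
    assert l.denominator == 1        # an integer, since c > p + 1
    return jp + 1 + int(l)

for p in range(1, 8):
    n = instance_size(p)
    assert p >= log2(n) / 24         # p grows logarithmically in |J|
```

The dominant term is \(l\le 2^{2c}=2^{O(p)}\), so the number of forced preemptions is logarithmic in the instance size.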
4 Optimal, normal and maximal schedules
4.1 Preliminaries
Let \(\mathcal {J}\) be a set of n unit-execution-time (UET) jobs. The release date for job a, denoted by r(a), is the earliest start time for a in any feasible schedule of \(\mathcal {J}\). We assume that r(a) is an integer for all \(a\in \mathcal {J}\).
For two jobs a and b, we say that a is a predecessor of b, and that b is a successor of a, if all feasible schedules require that b not start until a has finished. We write \(a\prec b\) to denote this relation. In contrast, \(a\nprec b\) means that b can start prior to the completion time of a. Two jobs a and b are said to be independent if \(a\nprec b\) and \(b\nprec a\). For \(B\subseteq \mathcal {J}\), we say that the jobs in B are independent if each pair of jobs in B is independent. This work deals with in-tree precedence constraints, i.e., each job has at most one immediate successor.
The symbol \(\mathbb {R}_+\) denotes the set of nonnegative real numbers. Given a schedule \(\mathcal {P}\) and a job \(a\in \mathcal {J}\), define \(s(\mathcal {P},a)\) and \(C(\mathcal {P},a)\) to be the start and completion times of a in \(\mathcal {P}\), respectively. A job is called release date pinned in \(\mathcal {P}\) if it starts at its release date in \(\mathcal {P}\). The total completion time of a schedule \(\mathcal {P}\) of \(\mathcal {J}\) is given by \(\sum _{a\in \mathcal {J}}C(\mathcal {P},a)\). We say that a preemptive schedule \(\mathcal {P}\) is optimal if the sum of its job completion times is minimum among all preemptive schedules for \(\mathcal {J}\).
4.2 Events, partitions and basic schedule transformations
For a given schedule \(\mathcal {P}\), define a vector \({\varvec{e}}=(e_1,\ldots ,e_q)\), where \(0=e_1<e_2<\cdots <e_q\), such that \({\varvec{e}}\) consists of 0 together with all time points at which some job starts, completes, is preempted, or is resumed in \(\mathcal {P}\).
The elements of \({\varvec{e}}\) are called the events of \(\mathcal {P}\). The part of \(\mathcal {P}\) in the time interval \([e_i,e_{i+1}]\) is called the ith block of \(\mathcal {P}\), or simply a block of \(\mathcal {P}\), \(i\in \{1,\ldots ,q-1\}\). Given \(i\in \{1,\ldots ,q-1\}\), let \(\xi _i:\mathcal {J}\rightarrow \mathbb {R}_+\) be a function such that for each \(a\in \mathcal {J}\), \(\xi _i(a)\) is the total length of a executed in the ith block of \(\mathcal {P}\). Then, \((\xi _1,\ldots ,\xi _{q-1})\) is called the partition of \(\mathcal {P}\). Denote by \((\mathcal {P},{\varvec{e}},{\varvec{\xi }})\) the schedule \(\mathcal {P}\) with events \({\varvec{e}}\) and partition \({\varvec{\xi }}\). Unless specified otherwise, it is understood that \({\varvec{e}}\) has q components. For each \(a\in \mathcal {J}\), \(\tau _{\mathcal {P}}(a)\) is the integer \(i\in \{1,\ldots ,q-1\}\) such that \(C(\mathcal {P},a)=e_{i+1}\). In other words, the \(\tau _{\mathcal {P}}(a)\)th block is the last block in which job a appears. Whenever \(\mathcal {P}\) is clear from context we will simply write \(\tau (a)\). For any function \(f:\mathcal {J}\rightarrow \mathbb {R}_+\), let \(\mathcal {J}(f)=\{a\in \mathcal {J}\mid f(a)>0\}\).
In the following we will analyze schedules by investigating their events and partitions. Informally speaking, the events and the partition of a schedule \(\mathcal {P}\) are insufficient to uniquely reconstruct the schedule \(\mathcal {P}\), but they suffice to build a schedule with the same total completion time as \(\mathcal {P}\). The schedules built from a list of events and a partition may differ in how pieces of jobs are executed within the blocks. The main advantage of our approach is that in order to construct a block in \([e_i,e_{i+1}]\) one only needs to solve the problem \(P2|p_j,pmtn|C_{\max }\) where the execution time of a job a is \(\xi _i(a)\); the proof of Lemma 1 gives more details. We formalize this observation in the next proposition and lemma.
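To make the notions concrete, events and partitions can be extracted from an explicit piece-based schedule representation. A minimal sketch (our own representation), assuming the schedule is a list of maximal (job, start, end) pieces with exact endpoints, so that every piece endpoint is an event:

```python
from fractions import Fraction as F

def events_and_partition(pieces):
    """Return the event vector e and the partition (xi_1, ..., xi_{q-1})
    of a schedule given as (job, start, end) pieces.  Each xi_i maps a
    job to the total length it executes inside the block [e_i, e_{i+1}]."""
    e = sorted({t for (_, s, c) in pieces for t in (s, c)})
    xi = []
    for lo, hi in zip(e, e[1:]):
        block = {}
        for job, s, c in pieces:
            overlap = min(c, hi) - max(s, lo)
            if overlap > 0:
                block[job] = block.get(job, F(0)) + overlap
        xi.append(block)
    return e, xi
```

For a two-job example in which job b is preempted at 1/2 and resumed at 1, the events are 0, 1/2, 1, 3/2, and b contributes 1/2 to the first and last blocks.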
Proposition 4
Let \((\mathcal {P},{\varvec{e}},{\varvec{\xi }})\) be a schedule. The following conditions hold for each \(i\in \{1,\ldots ,q1\}\):

(i)
For each \(a\in \mathcal {J}(\xi _i)\): \(r(a)\le e_i\);

(ii)
For each \(a\in \mathcal {J}(\xi _i)\): \(\xi _i(a)\le e_{i+1}e_i\) and \(\sum _{a\in \mathcal {J}(\xi _i)}\xi _i(a)\le 2(e_{i+1}e_i)\);

(iii)
For each \(a\in \mathcal {J}(\xi _i)\) and \(b\in \mathcal {J}(\xi _j)\), where \(i\le j<q\): \(b\nprec a\).
Proof
Condition (i) follows from the fact that no job in \(\mathcal {J}(\xi _i)\) starts or completes in \((e_i,e_{i+1})\), \(i\in \{1,\ldots ,q1\}\). (Note that \(r(a)>e_i\) is not possible for \(a\in \mathcal {J}(\xi _i)\) because then we would have \(s(\mathcal {P},a)\in (e_i,e_{i+1})\) which would contradict \(e_i\) and \(e_{i+1}\) being two consecutive events of \(\mathcal {P}\).) Conditions (ii) and (iii) follow directly from the fact that \(\mathcal {P}\) is a feasible schedule for \(\mathcal {J}\). (Note that (iii) in particular implies that the jobs in \(\mathcal {J}(\xi _i)\) are independent.) \(\square \)
We often rely on rearrangements of the events \({\varvec{e}}\) of a schedule \(\mathcal {P}\) which result in new schedules \(\mathcal {P}'\) with events that differ from those in \({\varvec{e}}\). The resulting schedule \(\mathcal {P}'\), however, may still be analyzed in the time intervals \([e_i,e_{i+1}]\), \(i\in \{1,\ldots ,q1\}\) defined by the original \({\varvec{e}}\). For this analysis, we need the following lemma, in which vectors of increasing real numbers beginning with 0 are regarded as sequences of time points.
Lemma 1
If there exist q time points \(e_1 < \cdots < e_q\) and \(q1\) functions \(\xi _i:\mathcal {J}\rightarrow \mathbb {R}_+\) (\(i=1,\ldots ,q1\)) such that for each \(a\in \mathcal {J}\), \(\sum _{i=1}^{q1}\xi _i(a)=1\) and conditions (i)–(iii) in Proposition 4 are satisfied, then there exists a schedule \(\mathcal {P}\) such that for each \(i\in \{1,\ldots ,q1\}\) and for each \(a\in \mathcal {J}\) the total length of all pieces of a executed in \([e_i,e_{i+1}]\) equals \(\xi _i(a)\).
Proof
For any given \(i\in \{1,\ldots ,q-1\}\), it is enough to construct the part of schedule \(\mathcal {P}\), denoted by \(\mathcal {P}_i\), in the time interval \([e_i,e_{i+1}]\). By (i) and (ii), this is equivalent to solving the problem \(P2|p_j,pmtn|C_{\max }\) where the execution time of each job a is \(\xi _i(a)\). It is easy to see that such a schedule \(\mathcal {P}_i\) exists if and only if the duration of \([e_i,e_{i+1}]\) is at least the larger of the maximum execution time and the sum of the execution times averaged over the two processors, i.e., \(e_{i+1}-e_i\ge \max \left\{ \max _{a\in \mathcal {J}(\xi _i)}\xi _i(a),\ \tfrac{1}{2}\sum _{a\in \mathcal {J}(\xi _i)}\xi _i(a)\right\} \).
Thus, (ii) guarantees that \(\mathcal {P}_i\) exists. Finally note that (iii) guarantees that the precedence constraints between jobs in different blocks are met. \(\square \)
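The per-block construction in the proof of Lemma 1 can be carried out by McNaughton's classical wrap-around rule for preemptive makespan scheduling on identical processors. A minimal sketch (our own code), assuming condition (ii) of Proposition 4 holds for the block:

```python
from fractions import Fraction as F

def fill_block(lo, hi, xi):
    """Pack the lengths xi[a] into the block [lo, hi] on two
    processors by McNaughton's wrap-around rule.  Assumes
    condition (ii): each xi[a] <= hi - lo and
    sum(xi.values()) <= 2 * (hi - lo).
    Returns (job, start, end, processor) pieces."""
    pieces, proc, t = [], 0, lo
    for job, length in xi.items():
        while length > 0:
            if t == hi:                  # wrap to the next processor
                proc, t = proc + 1, lo
            run = min(length, hi - t)
            pieces.append((job, t, t + run, proc))
            length -= run
            t += run
    return pieces
```

Condition (ii) is exactly what guarantees that at most two processors are used and that the two pieces of a wrapped job do not overlap in time.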
We close this section by introducing two basic transformations of a given schedule \((\mathcal {P},{\varvec{e}},{\varvec{\xi }})\): the cyclic shift and the swapping of two jobs. Let \(\varepsilon >0\) and \(j>0\). Let \(B=\{a_1,\ldots ,a_j\}\subseteq \mathcal {J}\) be j different jobs and let \(\{i_1,\ldots ,i_j\}\subseteq \{1,\ldots ,q-1\}\) be j blocks of \(\mathcal {P}\) such that \(\xi _{i_k}(a_k)\ge \varepsilon \) and \(\xi _{i_{k+1}}(a_k)\le e_{i_{k+1}+1}-e_{i_{k+1}}-\varepsilon \) for \(k\in \{1,\ldots ,j\}\), where \(i_{j+1}=i_{1}\). We define a cyclic shift of B by \(\varepsilon \) on \(\{i_1,\ldots ,i_j\}\) in \(\mathcal {P}\), or just a cyclic shift if it is clear from context, as follows. Let \({\varvec{e}}'\) and \({\varvec{\xi }}'\) be the events and the partition, respectively, obtained by replacing a piece of \(a_{k+1}\) of length \(\varepsilon \) in block \(i_{k+1}\) of \(\mathcal {P}\) with a piece of \(a_{k}\) of length \(\varepsilon \) for each \(k\in \{1,\ldots ,j\}\), where \(i_{j+1}=i_{1}\). This transformation may not result in a feasible schedule because the precedence constraints or release dates may be violated. However, if neither is violated, then the assumptions of Lemma 1 are met for the events \({\varvec{e}}'\) and the partition \({\varvec{\xi }}'\), and the corresponding schedule \(\mathcal {P}'\) exists. If \(\mathcal {P}'\) exists, then in addition we assume that the blocks of \(\mathcal {P}'\) enforce the following restrictions:

For each \(a_k\in B\), if \(C(\mathcal {P},a_k)=e_{i_k+1}\) and \(i_{k+1}<i_k\) (taking \(i_{j+1}=i_{1}\)), then \(C(\mathcal {P}',a_k)=e_{i_k+1}-\varepsilon \), which reduces the completion time of job \(a_k\) by as much as possible with respect to the cyclic shift.

If \(C(\mathcal {P},a_k)\le e_{i_{k+1}}\) (taking \(i_{j+1}=i_{1}\)), then \(C(\mathcal {P}',a_k)=e_{i_{k+1}}+\varepsilon \), which increases the completion time of job \(a_k\) by as little as possible with respect to the cyclic shift.
Note that, in general, \({\varvec{e}}\) does not consist of the events of \(\mathcal {P}'\), and the number of events of \(\mathcal {P}'\) may be different than the number of events of \(\mathcal {P}\).
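At the level of partitions, the cyclic shift is pure bookkeeping. A minimal sketch (ours), which updates only the partition; checking release dates and precedence constraints, and rebuilding the blocks via Lemma 1, is left to the caller:

```python
def cyclic_shift(xi, jobs, blocks, eps):
    """Apply a cyclic shift of `jobs` = [a_1, ..., a_j] by eps on
    `blocks` = [i_1, ..., i_j] (0-based indices into xi): each a_k
    gives up eps of its length in block i_k and gains eps in block
    i_{k+1}, indices taken cyclically.  The caller must ensure
    xi[i_k][a_k] >= eps and that a_k still fits in block i_{k+1}."""
    xi = [dict(block) for block in xi]       # leave the input intact
    for k, job in enumerate(jobs):
        nxt = blocks[(k + 1) % len(blocks)]
        xi[blocks[k]][job] -= eps
        if xi[blocks[k]][job] == 0:
            del xi[blocks[k]][job]
        xi[nxt][job] = xi[nxt].get(job, 0) + eps
    return xi
```

Note that the total length of every job is preserved: the transformation only redistributes work among blocks, never creates or destroys it.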
Finally, we introduce the notion of swapping two jobs, which is used in Sects. 4.4 and 6 to reduce the total completion time of a schedule by applying the shortest-processing-time (SPT) rule to two jobs that complete in consecutive blocks. Let \(\mathcal {P}\) be a schedule with events \({\varvec{e}}\) and partition \({\varvec{\xi }}\). Let a and \(a'\) be two jobs such that \(C(\mathcal {P},a')=e_{\tau (a)}\), \(s(\mathcal {P},a)\le s(\mathcal {P},a')\) and \(a'\) is independent of any job in \(\mathcal {J}(\xi _{\tau (a)})\). We define a transformation of swapping a and \(a'\) that results in a new schedule \(\mathcal {P}'\) as follows (see Fig. 3). Find a set of indices \(I\subseteq \{1,\ldots ,\tau (a)-1\}\) such that for each \(j\in I\),
\(\varepsilon (\max I)\) is minimum and \(\sum _{j\in I}\varepsilon (j)=\xi _{\tau (a)}(a)\). Such a set I exists because of the constraints imposed on a and \(a'\). The schedule \(\mathcal {P}'\) is obtained by performing the following three steps:

For each \(j\in I\), remove a piece of \(a'\) of length \(\varepsilon (j)\) from the jth block of \(\mathcal {P}\).

Remove the piece of a executing in the \(\tau (a)\)th block and add a piece of \(a'\) of length \(\xi _{\tau (a)}(a)\) to the \(\tau (a)\)th block of \(\mathcal {P}\).

Add a piece of a of length \(\varepsilon (j)\) to the jth block of \(\mathcal {P}\) for each \(j\in I\).
Lemma 2
Given schedule \((\mathcal {P},{\varvec{e}},{\varvec{\xi }})\), let \(a,a'\) be two jobs such that \(C(\mathcal {P},a')=e_{k}\), \(s(\mathcal {P},a)\le s(\mathcal {P},a')\) and \(a'\) is independent of any job in \(\mathcal {J}(\xi _{k})\), where \(k=\tau _{\mathcal {P}}(a)\). Then, the schedule \(\mathcal {P}'\) obtained by swapping a and \(a'\) in \(\mathcal {P}\) is feasible and \(\sum _{a''\in \mathcal {J}}C(\mathcal {P}',a'')\le \sum _{a''\in \mathcal {J}}C(\mathcal {P},a'')\), with the inequality being strict when \(s(\mathcal {P},a)<s(\mathcal {P},a')\) and \(\xi _{k-1}(a)<e_k-e_{k-1}\).
Proof
The fact that \(\mathcal {P}'\) is feasible follows directly from its construction. Suppose that \(s(\mathcal {P},a)<s(\mathcal {P},a')\) and \(\xi _{k-1}(a)<e_k-e_{k-1}\). If \(k-1\notin I\), then \(C(\mathcal {P}',a)\le e_{k-1}+\xi _{k-1}(a)<e_k\). Otherwise, the restriction on taking \(\varepsilon (\max I)=\varepsilon (k-1)\) to be minimum implies, due to \(s(\mathcal {P},a)<s(\mathcal {P},a')\), that \(\xi _{k-1}(a)+\varepsilon (k-1)<e_k-e_{k-1}\) and hence \(C(\mathcal {P}',a)=e_{k-1}+\xi _{k-1}(a)+\varepsilon (k-1)<e_k\). Thus, the total completion time of \(\mathcal {P}'\) is strictly smaller than that of \(\mathcal {P}\) as required. \(\square \)
4.3 Properties of optimal schedules
We now give some key properties of optimal schedules and describe three configurations that are forbidden in optimal schedules. These results will be used in subsequent sections. The following lemma states that if a job a completes in the ith block of an optimal schedule \(\mathcal {P}\), i.e., \(\tau (a)=i\), then the part of a that executes in that block spans the block. Such a job a is called a spanning job in block i.
Lemma 3
Given an optimal schedule \((\mathcal {P},{\varvec{e}},{\varvec{\xi }})\), each job \(a\in \mathcal {J}\) is a spanning job in block \(\tau (a)\), i.e., \(\xi _{\tau (a)}(a)=e_{\tau (a)+1}-e_{\tau (a)}\).
Proof
The proof is by contradiction. There exists \(\varepsilon >0\) such that at most one job executes in \(I=[e_{\tau (a)+1}-\varepsilon ,e_{\tau (a)+1}]\) on each machine in \(\mathcal {P}\) and \(\varepsilon \le e_{\tau (a)+1}-e_{\tau (a)}-\xi _{\tau (a)}(a)\). Let B be the set of the jobs that execute in I. Clearly, \(a\in B\) and \(1\le |B|\le 2\). There exists a job \(b'\in \mathcal {J}\setminus B\) such that \(\xi _{\tau (a)}(b')\ne 0\). Indeed, otherwise a could be executed in \([e_{\tau (a)},e_{\tau (a)}+\xi _{\tau (a)}(a)]\) without making any other changes in the schedule. Since the new schedule completes a earlier (because \(\xi _{\tau (a)}(a)<e_{\tau (a)+1}-e_{\tau (a)}\)), this would contradict the optimality of \(\mathcal {P}\). Then \(C(\mathcal {P},b')>e_{\tau (a)+1}\) and we can use some of the space of \(\xi _{\tau (a)}(b')\) for job a to complete a earlier. More formally, define \(\varepsilon '=\min \{\varepsilon ,\xi _{\tau (a)}(b')\}\). Let \(e'=e_{\tau (a)+1}-\varepsilon '\) and for each job \(c\in \mathcal {J}\) let
and
By Lemma 1, there exists a schedule \(\mathcal {P}'\) such that for each \(t\in \{1,\ldots ,q-1\}\setminus \{\tau (a)\}\) the total length of all pieces of each job \(c\in \mathcal {J}\) executed in \([e_t,e_{t+1}]\) is \(\xi _t(c)\), the total length of all pieces of each job c executed in \([e_{\tau (a)},e']\) equals \(\xi '(c)\), and the total length of all pieces of each job c executed in \([e',e_{\tau (a)+1}]\) equals \(\xi ''(c)\). Then, \(C(\mathcal {P}',c)=C(\mathcal {P},c)\) for each \(c\in \mathcal {J}\setminus \{a\}\) and \(C(\mathcal {P}',a)<C(\mathcal {P},a)\), which contradicts the optimality of \(\mathcal {P}\). \(\square \)
Lemma 4
Given an optimal schedule \((\mathcal {P},{\varvec{e}},{\varvec{\xi }})\), if \(a\in \mathcal {J}\) is not a spanning job in block i (\(i\in \{1,\ldots ,q-2\}\)), \(s(\mathcal {P},a)\le e_i\) and \(C(\mathcal {P},a)\ge e_{i+1}\), then there is no idle time in the ith block of \(\mathcal {P}\).
Proof
Suppose for a contradiction that there is idle time of length \(\varepsilon >0\) in \([e_i,e_{i+1}]\) on one of the processors in \(\mathcal {P}\). We get a contradiction by obtaining another schedule \(\mathcal {P}'\) such that \(C(\mathcal {P},b)=C(\mathcal {P}',b)\) for each \(b\in \mathcal {J}\setminus \{a\}\) and \(C(\mathcal {P}',a)<C(\mathcal {P},a)\). Namely, take \(\varepsilon '=\min \{\varepsilon ,\xi _{\tau (a)}(a),e_{i+1}-e_i-\xi _i(a)\}\). By Lemma 3, \(\tau (a)>i\) and hence \(\varepsilon '>0\). By Lemma 1, the desired schedule \(\mathcal {P}'\) obtained from \(\mathcal {P}\) by moving the piece of a that executes in \([C(\mathcal {P},a)-\varepsilon ',C(\mathcal {P},a)]\) to the ith block of \(\mathcal {P}\) is feasible. \(\square \)
Given schedule \((\mathcal {P},{\varvec{e}},{\varvec{\xi }})\), two jobs a and b with \(\tau (a)<\tau (b)\) are said to interlace if job b is not spanning in block \(\tau (a)\) and there exists \(t<\tau (a)\) such that job a is not spanning in block t, \(\xi _t(b)>0\), \(r(a)<e_{t+1}\) and a is independent of all jobs in \(\mathcal {J}(\xi _t)\cup \cdots \cup \mathcal {J}(\xi _{\tau (a)})\). Note that, informally speaking, the above constraints imply that a piece of a executed in \([C(\mathcal {P},a)-\varepsilon ,C(\mathcal {P},a)]\), for some \(\varepsilon >0\), can be exchanged with a piece of b of length \(\varepsilon \) executing in the tth block of \(\mathcal {P}\). We formalize this observation in the next lemma.
Lemma 5
If \(\mathcal {P}\) is an optimal schedule, then no two jobs interlace in \(\mathcal {P}\).
Proof
Let \({\varvec{e}}\) and \({\varvec{\xi }}\) be the events and the partition of \(\mathcal {P}\), respectively. Suppose for a contradiction that two jobs a and b with \(\tau (a)<\tau (b)\) interlace and t is the block in the definition. Let
Note that \(\varepsilon >0\). By Lemma 1, there exists a schedule \(\mathcal {P}'\) with events \({\varvec{e}}'\) and partition \({\varvec{\xi }}'\) such that
The schedule \(\mathcal {P}'\) is feasible for two reasons. First, \(r(a)<e_{t+1}\) implies that if \(s(\mathcal {P},a)<e_{t+1}\), then \(r(a)\le e_t\), and if \(s(\mathcal {P},a)\ge e_{t+1}\), then \(s(\mathcal {P}',a)\ge e_{t+1}-\varepsilon \) according to the definition of the transformation, which implies that a does not start prior to its release date in \(\mathcal {P}'\). Second, the fact that a is independent of all jobs in \(\mathcal {J}(\xi _t)\cup \cdots \cup \mathcal {J}(\xi _{\tau (a)})\) implies that a does not violate the precedence constraints in \(\mathcal {P}'\). For each \(c\in \mathcal {J}\setminus \{a\}\), \(C(\mathcal {P},c)=C(\mathcal {P}',c)\) and \(C(\mathcal {P},a)>C(\mathcal {P}',a)\). This contradicts the optimality of \(\mathcal {P}\). \(\square \)
Lemma 6
Let \((\mathcal {P},{\varvec{e}},{\varvec{\xi }})\) be an optimal schedule. Let \(I=[x,y]\) be an interval and let \(B\subseteq \mathcal {J}\) be such that \(a\in B\) if and only if the total length of job a executing in I is strictly between 0 and \(yx\).
If jobs in B are independent, \(C(\mathcal {P},a)\ge y\) and \(r(a)\le x\) for each \(a\in B\), then \(|B|\le 2\).
Proof
It follows from the definition of the set B that no job completes in \((x,y)\). We first argue that
\(C(\mathcal {P},b)>y\) for each \(b\in B\). (5)
Suppose for a contradiction that \(C(\mathcal {P},b)=y\) for some job \(b\in B\). Since the total length of b in I is less than \(y-x\), there exists a nonempty interval \(I'\subseteq I\) such that no part of b executes in \(I'\). We obtain a schedule \(\mathcal {P}'\) by exchanging the part of \(\mathcal {P}\) that executes in \(I'\) with the part of \(\mathcal {P}\) that executes in \([y-|I'|,y]\). Since the release date of each job that executes in I is at most x and the jobs whose parts execute in I are independent, we obtain that \(\mathcal {P}'\) is indeed a feasible schedule. Then, \(C(\mathcal {P}',b)=y-|I'|<y=C(\mathcal {P},b)\) and \(C(\mathcal {P}',a)\le C(\mathcal {P},a)\) for each \(a\in \mathcal {J}\setminus \{b\}\), which completes the proof of (5).
We now prove the lemma. Suppose for a contradiction that \(|B|>2\). Let b be a job in B with minimum completion time in \(\mathcal {P}\). Since \(|B|>2\), Lemma 3 implies that there exists \(b'\in B\) such that \(\tau (b)<\tau (b')\) and \(b'\) is not a spanning job in block \(\tau (b)\). Define \(\varepsilon =\min \{y-x-p,\xi _{\tau (b)}(b),p',e_{\tau (b)+1}-e_{\tau (b)}-\xi _{\tau (b)}(b')\}\), where p and \(p'\) are the total lengths of b and \(b'\), respectively, executing in I. Due to the choice of \(b'\), \(\varepsilon >0\). We obtain a schedule \(\mathcal {P}'\) by first exchanging the pieces of \(b'\) of total length \(\varepsilon \) executing in I with a piece of b of length \(\varepsilon \) executing in block \(\tau (b)\). The resulting \(\mathcal {P}'\) may not be feasible in I; however, McNaughton's rule can readily turn this part into a feasible schedule. This provides a feasible schedule \(\mathcal {P}'\) because the release date of each job whose part executes in I is at most x and the jobs that execute in I in \(\mathcal {P}\) are independent. By (5), \(C(\mathcal {P}',b)=C(\mathcal {P},b)-\varepsilon \). Note that if a job completes at y in \(\mathcal {P}\), then the total length of this job in I equals \(y-x\); otherwise the job would belong to B, contradicting (5). Thus, no job completes later in \(\mathcal {P}'\) than in \(\mathcal {P}\)—a contradiction with the optimality of \(\mathcal {P}\). \(\square \)
Lemma 7
Let schedule \((\mathcal {P},{\varvec{e}},{\varvec{\xi }})\) be optimal. If \(\mathcal {J}(\xi _i)\ne \emptyset \) (\(i\in \{1,\ldots ,q-1\}\)), then:

(i)
There exists a job in \(\mathcal {J}\) that is spanning in block i;

(ii)
\(|\mathcal {J}(\xi _i)|\le 3\) and if \(|\mathcal {J}(\xi _i)|=3\), then some job in \(\mathcal {J}(\xi _i)\) completes at \(e_{i+1}\) in \(\mathcal {P}\).
Proof
Note that \(|\mathcal {J}(\xi _i)|>3\) would lead to a contradiction to Lemma 6 with \(I=[e_i,e_{i+1}]\). Moreover, if \(i=\tau (a)\) for some \(a\in \mathcal {J}\), then by Lemma 3, a is spanning in block i and the lemma holds.
Thus, assume that no job finishes in the ith block of \(\mathcal {P}\). If \(|\mathcal {J}(\xi _i)|\le 2\), then it remains to prove (i): if no job is spanning in block i, then by Lemma 4 there is no idle time in the ith block of \(\mathcal {P}\), which is impossible when at most two jobs execute in the block and none of them spans it. This completes the proof of case \(|\mathcal {J}(\xi _i)|\le 2\). We prove, by contradiction, that \(|\mathcal {J}(\xi _i)|=3\) is not possible if no job completes at \(e_{i+1}\). Denote \(B=\{a\in \mathcal {J}\,:\,0<\xi _i(a)<e_{i+1}-e_i\}\). Clearly, \(|B|>1\). On the other hand, \(|B|<3\), for otherwise the job in B with the smallest completion time interlaces with one of the two other jobs in B, which contradicts Lemma 5. Thus, \(|B|=2\). Denote \(B=\{b,b'\}\) and assume without loss of generality that \(C(\mathcal {P},b)\le C(\mathcal {P},b')\). According to Lemma 3, job b is spanning in block \(\tau (b)\). Also job \(b'\) is spanning in block \(\tau (b)\), since otherwise b and \(b'\) interlace, which is not possible according to Lemma 5. The only job, call it \(c'\), in \(\mathcal {J}(\xi _i)\setminus \{b,b'\}\) completes in \((e_{i+1},e_{\tau (b)})\), for otherwise this job and b interlace—again a contradiction with Lemma 5. Thus, in particular, \(\tau (b)>i+1\). This situation is depicted in Fig. 4.
Let \(Y=\{a\in \mathcal {J}\setminus \mathcal {J}(\xi _i)\,:\,e_{i+1}\le s(\mathcal {P},a)\le e_{\tau (b)}\}\). Since \(e_{i+1}\) is an event of \(\mathcal {P}\), \(Y\ne \emptyset \). By Lemma 5, if \(e_{i+1}\le C(\mathcal {P},a)\le e_{\tau (b)}\), then \(a\in Y\) or \(a=c'\). If there exists \(c\in Y\) such that \(C(\mathcal {P},c)=e_{\tau (b)}\), then we obtain a schedule \(\mathcal {P}'\) by swapping b and c. By Lemma 2, \(\mathcal {P}'\) is feasible. Moreover, if job b is nonspanning in block \(\tau (b)-1\), then the total completion time of \(\mathcal {P}'\) is smaller than that of \(\mathcal {P}\), which completes the proof. If, on the other hand, job b is spanning in block \(\tau (b)-1\), then \(C(\mathcal {P}',b)=e_{\tau (b)}\) and \(\xi '_{\tau _{\mathcal {P}'}(b)-1}(b')=0\), where \({\varvec{\xi }}'\) is the partition of \(\mathcal {P}'\), in which case b and \(b'\) interlace in \(\mathcal {P}'\)—a contradiction with Lemma 5. Thus, it remains to consider the situation when no such c exists. This, since \(e_{\tau (b)}\) is an event of \(\mathcal {P}\), implies that \(c'\) ends at \(e_{\tau (b)}\) in \(\mathcal {P}\). Moreover, \(\mathcal {J}(\xi _{\tau (c')})\subseteq \{c',b,b'\}\) for otherwise \(\mathcal {P}\) would not be optimal. Thus, some job \(c\in Y\) ends at \(e_{\tau (c')}\) because \(e_{\tau (c')}\) is an event of \(\mathcal {P}\) and no job in Y can start at \(e_{\tau (c')}\). Therefore, one of the jobs \(c'\) and b must be nonspanning in block \(\tau (c)\). Swapping this job with c gives, by Lemma 2, a schedule with smaller total completion time than that of \(\mathcal {P}\), which provides the required contradiction and completes the proof of the lemma. \(\square \)
The following two lemmas describe additional configurations that cannot be present in an optimal schedule. The first situation is depicted in Fig. 5a, while the statement of Lemma 9 is shown in Fig. 5b.
Lemma 8
Given schedule \((\mathcal {P},{\varvec{e}},{\varvec{\xi }})\), let \(e_j\) be an event in \(\mathcal {P}\) and jobs c, \(c'\), and d be such that

(i)
\(C(\mathcal {P},c)=C(\mathcal {P},c')=e_j\);

(ii)
\(C(\mathcal {P},d)=e_{j+1}\) and \(s(\mathcal {P},d)<e_j\);

(iii)
Jobs in \(\{c,c'\}\cup \mathcal {J}(\xi _j)\) are independent.
Then, \(\mathcal {P}\) is not optimal.
Proof
By Lemma 7 (ii), one of the jobs c or \(c'\), say c, satisfies \(\xi _t(c)=0\), where \(e_t=s(\mathcal {P}, d)\). We then have \(s(\mathcal {P}, d)<s(\mathcal {P},c)\), for otherwise c and d would interlace, contradicting Lemma 5. Therefore, we can swap jobs c and d. By Lemma 2, the resulting schedule is feasible and has smaller total completion time than \(\mathcal {P}\), as required. \(\square \)
Lemma 9
Let schedule \((\mathcal {P},{\varvec{e}},{\varvec{\xi }})\) be optimal and jobs a and \(a'\) be such that

\(\xi _j(a)>0\), \(\xi _{j'}(a)>0\), \(j<j'-1\) and job a is spanning in block t for each \(t\in \{j+1,\ldots ,j'-1\}\);

\(\xi _{j}(a')=0\) and \(C(\mathcal {P},a')=e_{j'}\);

No successor of \(a'\) starts at \(e_{j'}\).
Then, \(s(\mathcal {P},a')\ge e_{j+1}\) and \(\tau (a)>j'\).
Proof
If \(s(\mathcal {P},a')<e_{j+1}\), then due to \(\xi _j(a')=0\), \(s(\mathcal {P},a')<e_j\). But then, a and \(a'\) would interlace, which is not possible in an optimal schedule according to Lemma 5.
By assumption, \(a'\) is independent of any job in \(\mathcal {J}(\xi _{j'})\). Also, \(s(\mathcal {P},a)\le e_{j+1}\le s(\mathcal {P},a')\). Then, \(\tau (a)>j'\) follows from an observation that otherwise swapping a and \(a'\) in \(\mathcal {P}\) would produce, by Lemma 2, a schedule with smaller total completion time than that of \(\mathcal {P}\). \(\square \)
4.4 Abnormality points and maximal schedules
We now define normal schedules, abnormality points and maximal schedules. In particular, Lemma 12 gives key necessary conditions for an abnormality point.
For any \(x\in \mathbb {R}_+\) and nonnegative integer l, we say that x is l-normal if \(x=l'/2^l\) for some integer \(l'\). We say that a block of a schedule \(\mathcal {P}\) is l-normal if the length of the block is l-normal and the total execution time of each job in the block is \((l+1)\)-normal. A preemptive schedule \(\mathcal {P}\) with q events is normal if the ith block of \(\mathcal {P}\) is i-normal for each \(i\in \{1,\ldots ,q-1\}\). If a schedule \(\mathcal {P}\) with q events \({\varvec{e}}\) and partition \({\varvec{\xi }}\) is not normal, then the minimum index \(i\in \{1,\ldots ,q-1\}\) such that the ith block of \(\mathcal {P}\) is not i-normal is called the abnormality point of \(\mathcal {P}\). If a schedule is normal, then its abnormality point is denoted by \(\infty \) for convenience. We have the following simple observations.
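These definitions are easy to check with exact rational arithmetic. The sketch below is illustrative only (the list-based encoding of blocks is our own assumption, not the paper's); it returns the abnormality point as a 1-based index, with `None` standing in for \(\infty \):

```python
from fractions import Fraction

def is_l_normal(x, l):
    """x is l-normal iff x = l'/2**l for some integer l'."""
    return (Fraction(x) * 2**l).denominator == 1

def abnormality_point(block_lengths, block_job_times):
    """Smallest i whose ith block is not i-normal; None means the
    schedule is normal (abnormality point infinity).

    block_lengths[i-1]   -- length of the ith block
    block_job_times[i-1] -- per-job execution times inside the ith block
    """
    for i, (length, times) in enumerate(zip(block_lengths, block_job_times), start=1):
        if not is_l_normal(length, i):
            return i
        if not all(is_l_normal(t, i + 1) for t in times):
            return i
    return None
```

For example, a first block of length 1/2 whose job pieces have length 1/4 is 1-normal, while a second block of length 1/3 triggers abnormality point 2.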
Observation 1
If x is l-normal, then x is \(l'\)-normal for each \(l'\ge l\). \(\square \)
Observation 2
If \(i\ne \infty \) is the abnormality point of a schedule \(\mathcal {P}\) with events \({\varvec{e}}\), then \(e_i\) is \((i-1)\)-normal. \(\square \)
According to our definition, if the ith block of a schedule \(\mathcal {P}\) is i-normal, then \(\xi _i(a)\) is \((i+1)\)-normal for each \(a\in \mathcal {J}\); however, this does not necessarily imply that job preemptions occur at \((i+1)\)-normal time points in the ith block of \(\mathcal {P}\). Such job preemptions can possibly take place only strictly between \(e_i\) and \(e_{i+1}\) since both \(e_i\) and \(e_{i+1}\) are i-normal by assumption. By the next observation, we may assume without loss of generality that i-normal blocks have job preemptions only at \((i+1)\)-normal time points.
Observation 3
If the ith block of a schedule \(\mathcal {P}\) is i-normal, then there exists a schedule \(\mathcal {P}'\) with the same events, partition and total completion time as \(\mathcal {P}\), in which each preemption, resumption, job start and job completion in the ith block occurs at an \((i+1)\)-normal time point.
Proof
This follows from McNaughton's algorithm. \(\square \)
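McNaughton's wrap-around rule, invoked above, fills the first processor from the start of the block and wraps the current job onto the second processor when the block end is reached. A minimal two-machine sketch (our own encoding; it presupposes that each job's amount is at most the block length and the total is at most twice the block length, so the wrapped job never overlaps itself):

```python
from fractions import Fraction

def mcnaughton_two_machines(block_start, block_end, amounts):
    """McNaughton's wrap-around rule for one block on two machines.

    amounts: dict job -> total time the job must receive in
    [block_start, block_end]. Returns a list of (job, machine, start, end)
    pieces; a job split by the wrap resumes on the other machine.
    """
    pieces, t, machine = [], block_start, 0
    for job, amount in amounts.items():
        while amount > 0:
            take = min(amount, block_end - t)
            pieces.append((job, machine, t, t + take))
            amount -= take
            t += take
            if t == block_end:          # wrap to the second machine
                t, machine = block_start, 1
    return pieces
```

Since every start and end produced this way is the block start plus a sum of job amounts, running the rule inside an i-normal block yields only \((i+1)\)-normal preemption points, which is the content of Observation 3.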
Let us introduce a partial order, denoted by \(\trianglelefteq \), on the set of all schedules. For schedules \(\mathcal {P}\) and \(\mathcal {P}'\), we write \(\mathcal {P}\trianglelefteq \mathcal {P}'\) if and only if one of the following holds:

\(\mathcal {P}= \mathcal {P}'\);

\(\mathcal {P}'\) is optimal, while \(\mathcal {P}\) is not;

Both \(\mathcal {P}\) and \(\mathcal {P}'\) are optimal and, additionally, \(\mathcal {P}'\) is normal while \(\mathcal {P}\) is not;

Both \(\mathcal {P}\) and \(\mathcal {P}'\) are optimal, but neither is normal. Additionally \(i\le i'\), where i and \(i'\) are the abnormality points of \(\mathcal {P}\) and \(\mathcal {P}'\), respectively.
Any schedule that is maximal under the partial order \(\trianglelefteq \) is called a maximal schedule.
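The order \(\trianglelefteq \) compares schedules only through optimality, normality and the abnormality point. On summary tuples `(is_optimal, abnormality_point)`, with `math.inf` standing in for \(\infty \) (an encoding of our own choosing), it can be sketched as:

```python
import math

def preceq(p, q):
    """p ⊴ q for schedule summaries (is_optimal, abnormality_point);
    math.inf encodes abnormality point infinity, i.e., a normal schedule."""
    if p == q:                                     # P = P'
        return True
    if q[0] and not p[0]:                          # P' optimal, P not
        return True
    if p[0] and q[0]:
        if q[1] == math.inf and p[1] != math.inf:  # P' normal, P not
            return True
        if p[1] != math.inf and q[1] != math.inf:  # neither is normal
            return p[1] <= q[1]
    return False
```

A maximal schedule is then one whose summary no other schedule strictly dominates; in particular, any normal optimal schedule is maximal.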
Lemma 10
Let schedule \((\mathcal {P},{\varvec{e}},{\varvec{\xi }})\) have abnormality point \(i\ne \infty \). For each \(a\in \mathcal {J}\) and for each \(i'\le i\), \(\sum _{j=i'}^{q-1}\xi _j(a)\) is \(i'\)-normal.
Proof
Let \(a\in \mathcal {J}\) be selected arbitrarily. Since each job has unit execution time, \(\sum _{j=i'}^{q-1}\xi _j(a)=1-\sum _{j=1}^{i'-1}\xi _j(a)\).
Since \(i\ge i'\) is the abnormality point of \(\mathcal {P}\), Observation 1 implies that \(\xi _j(a)\) is \(i'\)-normal for each \(j\in \{1,\ldots ,i'-1\}\), and hence so is the sum above. \(\square \)
The next lemma, informally speaking, allows us to further restrict attention to those maximal schedules with abnormality point \(i\ne \infty \) in which the abnormality of the ith block is due to the lengths of the jobs in this block, and not due to the length of the block itself.
Lemma 11
Let \(\mathcal {P}\) be a maximal schedule with the events \(e_1,\ldots ,e_q\). If \(i\ne \infty \) is the abnormality point of \(\mathcal {P}\), then there exists a maximal schedule \(\mathcal {P}'\) with abnormality point i such that \(e_1,\ldots ,e_i,e_{i+1}',\ldots ,e_{q'}'\) are its events and \(e_{i+1}'-e_i\) is i-normal.
Proof
If \(i=\tau (a)\) for some \(a\in \mathcal {J}\), then by Lemma 3, \(e_{i+1}=e_i+\xi _i(a)\). By Lemma 10, \(\xi _i(a)\) is i-normal. Thus, \(\mathcal {P}'=\mathcal {P}\) is the required schedule, which proves the lemma. Hence, assume \(i\ne \tau (a)\) for each \(a\in \mathcal {J}\), i.e., no job completes at \(e_{i+1}\). Note that \(e_{i+1}=s(\mathcal {P},a)\) for some \(a\in \mathcal {J}\). Suppose for a contradiction that \(e_{i+1}-e_i\) is not i-normal. Thus, \(e_{i+1}\) is not i-normal. Lemma 4 implies that there is no idle time in the ith block of \(\mathcal {P}\). Thus, \(|\mathcal {J}(\xi _i)|\ge 2\) and therefore there exists \(d\in \mathcal {J}(\xi _i)\) that is nonspanning in block \(i+1\) because a starts at \(e_{i+1}\).
Case 1 There is an i-normal number in \((e_i,e_{i+1}]\). Let x be the maximal i-normal number in \((e_i,e_{i+1}]\). Then, \(r(a)\le x\) because \(r(a)\le e_{i+1}\), \(r(a)\) is i-normal by Observation 1, and there is no i-normal number in \((x,e_{i+1}]\). Let
No job completes at \(e_{i+1}\) and therefore the jobs in \(\mathcal {J}(\xi _i)\cup \mathcal {J}(\xi _{i+1})\) are independent. Thus, by Lemma 1, there exists a schedule \(\mathcal {P}'\) with events \({\varvec{e}}'\) and partition \({\varvec{\xi }}'\), where
Moreover, due to McNaughton's rule, one can assume that \(s(\mathcal {P}',a)=x\). By Observation 2, \(e_i\) is \((i-1)\)-normal and hence, by Observation 1, \(x-e_i\) is i-normal. Since the first \(i-1\) blocks are identical in \(\mathcal {P}\) and \(\mathcal {P}'\), and \(e'_{i+1}=x\), \(\mathcal {P}'\) is the desired schedule, which completes the proof in this case.
Case 2 There is no i-normal number in \((e_i,e_{i+1}]\). By Observation 1, there is no \((i-1)\)-normal number in \((e_i,e_{i+1}]\). Let \(x>e_i\) be the minimum \((i-1)\)-normal number. Since \(i+1<q\), more than one block intersects \((e_i,x)\).
Suppose first that exactly two blocks intersect \((e_i,x)\), and there exists a job b such that \(\xi _i(b)+\xi _{i+1}(b)=e_{i+2}-e_i\). One of the two blocks is of length at least \((x-e_i)/2\). By Observation 2, \(e_i+(x-e_i)/2\) is i-normal. Hence, due to the condition in Case 2, this must be the \((i+1)\)st block. However, the schedule with events \((e_1,\ldots ,e_{i},e_i+e_{i+2}-e_{i+1},e_{i+2},\ldots ,e_q)\) and partition \((\xi _1,\ldots ,\xi _i,\xi _{i+2},\xi _{i+1},\xi _{i+3},\ldots ,\xi _{q-1})\) would satisfy the assumption in Case 1. This allows us to construct the desired schedule \(\mathcal {P}'\) as in Case 1.
Suppose now that exactly two blocks intersect \((e_i,x)\) and there exists no job b such that \(\xi _i(b)+\xi _{i+1}(b)=e_{i+2}-e_i\). Lemma 6 applied to \(I=[e_i,e_{i+2}]\) gives a contradiction. Observe that the corresponding set \(\{a,b,d\}\subseteq B\) in Lemma 6 is of size at least 3. Moreover, since no job completes in \((e_i,e_{i+2})\), B contains only independent jobs.
Finally, we consider the case when more than two blocks intersect \((e_i,x)\). Thus, the job a does not complete before x. Moreover, no job completes at \(e_{i+2}\) because otherwise either \(\mathcal {P}\) is not optimal or \(e_{i+2}\) is i-normal by Lemma 10. Since \(e_{i+2}<x\), we get a contradiction in either case. Therefore, there is a job \(a'\) that starts at \(e_{i+2}\). Clearly, \(a'\) does not complete before x. Thus, Lemma 6 for \(I=[e_i,\min \{x,e_{i+3}\}]\) again gives a contradiction. Observe that the corresponding set \(\{a,a',d\}\subseteq B\) in Lemma 6 is of size at least 3. Moreover, since no job completes in \((e_i,e_{i+3})\), B contains only independent jobs. \(\square \)
Given schedule \((\mathcal {P}, {\varvec{e}}, {\varvec{\xi }})\), for \(i\in \{1,\ldots ,q-1\}\) define \(A_i(\mathcal {P})\) to be the set of all jobs \(a\in \mathcal {J}(\xi _i)\) such that \(\xi _i(a)\) is not \((i+1)\)-normal.
Lemma 12
Let \(\mathcal {P}\) be a maximal schedule. If \(i\ne \infty \) is the abnormality point of \(\mathcal {P}\), then \(|A_i(\mathcal {P})|=2\) and \(|\mathcal {J}(\xi _i)|=3\).
Proof
Let \({\varvec{e}}\) and \({\varvec{\xi }}\) be the events and the partition of \(\mathcal {P}\), respectively. By Lemma 11, \(e_{i+1}-e_i\) is i-normal. We have \(|\mathcal {J}(\xi _i)|>2\) because otherwise, by Lemmas 3 and 4, \(\xi _i(a)=e_{i+1}-e_i\) for each \(a\in \mathcal {J}(\xi _i)\), which, since \(e_{i+1}-e_i\) is i-normal, would contradict the fact that i is the abnormality point of \(\mathcal {P}\). Lemma 7 implies that \(|\mathcal {J}(\xi _i)|=3\) and there exists \(a\in \mathcal {J}(\xi _i)\) such that \(\xi _i(a)=e_{i+1}-e_i\). Thus, \(a\notin A_i(\mathcal {P})\), and we have that \(|A_i(\mathcal {P})|\le 2\) because \(A_i(\mathcal {P})\subseteq \mathcal {J}(\xi _i)\). Also, \(|A_i(\mathcal {P})|>1\) by Lemma 4. \(\square \)
Finally, the following lemma shows that the abnormality point of a schedule does not decrease after a certain type of schedule modification. It directly follows from the definition of normality.
Lemma 13
Let \(\mathcal {P}\) be a schedule and \(\varepsilon =l'/2^{l}\) for some integers l and \(l'\). If \(\mathcal {P}'\) is a schedule that is obtained from \(\mathcal {P}\) by a sequence of modifications, each modification being a removal of a piece whose length is a multiple of \(\varepsilon \) from the jth block and an insertion of this piece into the \(j'\)th block, where \(\min \{j,j'\}\ge l-1\), then the abnormality point of \(\mathcal {P}'\) is not smaller than that of \(\mathcal {P}\). \(\square \)
We close this section with a brief summary of key properties of optimal and maximal schedules obtained in this section.
An optimal schedule is a concatenation of blocks. The structure of any block is as follows. Each nonempty block has a spanning job, and pieces of at most three jobs can be present in a block; if two jobs are present in a block, then both are spanning (cf. Lemmas 4 and 7) while if three jobs are present in a block, then one of them completes in the block (cf. Lemma 7) and it is the only spanning job in this block (cf. Lemma 3). The latter property from Lemma 3 is general—each job that ends in a block is spanning. We also emphasize that interlacing of jobs is not allowed in optimal schedules (see Lemma 5). This observation is heavily used throughout the paper.
In Sect. 4.4 we switched our focus to optimal schedules that are also maximal. Lemma 11 shows that we may assume without loss of generality that, in a maximal schedule with abnormality point \(i\ne \infty \), the length of block i is i-normal and thus there exists a job a in block i such that the length of a in this block is not i-normal. Then, Lemma 12 says that there must be another job b, different from a, that has its total length in block i not i-normal either. Note that, due to Lemmas 3, 4 and 7, this implies that neither a nor b is spanning in block i, none of them finishes in this block and there must be a third, spanning job that finishes in block i.
5 \(\text {A}\)-configurations
In this section we first define an undesirable structure that may appear in a schedule; we refer to this structure as an \(\text {A}\)-configuration. Our proof that there exists a normal optimal schedule for each \(\mathcal {J}\), given in Sect. 6, relies on the key assumption that there exists a maximal schedule without \(\text {A}\)-configurations, or \(\text {A}\)-free maximal schedule, for each set of jobs \(\mathcal {J}\). The proof proceeds by contradiction. The abnormality point implies the existence of an alternating chain in a maximal schedule (see Sect. 6.1 for the definition of alternating chains). In Sect. 6.3, we show that if an \(\text {A}\)-configuration is not present in the part of the schedule that follows the alternating chain, then we are able to extend this chain by adding one more job. Thus we arrive at the required contradiction, since this method allows us to extend the alternating chain ad infinitum with only a finite number of jobs.
Therefore, \(\text {A}\)-configurations are undesirable, and the main goal of this section is to prove that an \(\text {A}\)-free maximal schedule exists for each \(\mathcal {J}\). Our proof is by contradiction: informally speaking, we take a maximal schedule having an \(\text {A}\)-configuration as early as possible, and, after some schedule transformations, we either obtain a new schedule with smaller total completion time or with an earlier \(\text {A}\)-configuration. In the former case we clearly obtain a contradiction. In the latter case, a contradiction occurs only if the new schedule is maximal, i.e., its abnormality point is not smaller than that of the initial schedule. For this reason, while performing the initial schedule transformations we must ensure that they do not change the abnormality point in the latter case. The proof works for intrees; however, it does not extend to general precedence constraints. The question whether there is an \(\text {A}\)-free maximal schedule for each \(\mathcal {J}\) under general precedence constraints remains open.
Let \((\mathcal {P},{\varvec{e}},{\varvec{\xi }})\) be a schedule. We say that \(\mathcal {P}\) has an \(\text {A}\)-configuration of length \(\ell \) (\(\ell >0\)) starting at \(e_j\) if there exist two jobs a and b such that

\(C(\mathcal {P},a)=e_{j}\) and \(C(\mathcal {P},b)=e_{j'}\) for some \(j'>j\);

\([e_{j}-\ell , e_{j}]\) is a maximal interval where a executes nonpreemptively;

b executes nonpreemptively in \([e_j,e_{j'}]\), and b does not execute in \([e_{j}-\ell , e_{j}]\);

\(s(\mathcal {P},b)<e_{j}-\ell \);

a and each job in \(\mathcal {J}(\xi _j)\cup \cdots \cup \mathcal {J}(\xi _{j'})\) are independent.
We also say that the jobs a and b form the \(\text {A}\)-configuration. See Fig. 6 for an example of an \(\text {A}\)-configuration.
If no pair of jobs forms an \(\text {A}\)-configuration in \(\mathcal {P}\), then \(\mathcal {P}\) is called \(\text {A}\)-free. For any time interval I, if for every \(x\in I\cap \{e_i:i=1,\ldots ,q\}\) there is no \(\text {A}\)-configuration at x in \(\mathcal {P}\), then \(\mathcal {P}\) is \(\text {A}\)-free in I. The main result of this section is the following theorem.
Theorem 2
If every maximal schedule has an abnormality point \(i\ne \infty \), then there exists an \(\text {A}\)-free maximal schedule.
We first provide several technical lemmas before presenting our proof of Theorem 2. A schedule \(\mathcal {P}\) with abnormality point i is said to be \(\text {A}\)-maximal if it is maximal and, unless \(i=\infty \), one of the following two statements is true:

\(\mathcal {P}\) is \(\text {A}\)free;

no maximal schedule is \(\text {A}\)-free, and \(\mathcal {P}\) has the earliest-starting \(\text {A}\)-configuration among maximal schedules.
Proposition 5
Let \(\mathcal {P}\) be \(\text {A}\)-maximal. If a and b form an \(\text {A}\)-configuration in \(\mathcal {P}\) with \(C(\mathcal {P},a)<C(\mathcal {P},b)\), then \(s(\mathcal {P},a)\le s(\mathcal {P},b)\).
Proof
Suppose for a contradiction that \(s(\mathcal {P},a)>s(\mathcal {P},b)\). Then, swapping jobs a and b in \(\mathcal {P}\) produces, by Lemma 2, a schedule with smaller total completion time than that of \(\mathcal {P}\)—a contradiction. \(\square \)
The first of the following two lemmas describes a situation that guarantees an \(\text {A}\)-configuration, while the second describes a situation that cannot occur in an \(\text {A}\)-maximal schedule with an \(\text {A}\)-configuration.
Proposition 6
Given maximal schedule \((\mathcal {P},{\varvec{e}},{\varvec{\xi }})\), let \(e_j\) be an event in \(\mathcal {P}\) and jobs a, c, \(c_1\), and d be such that

(i)
\(C(\mathcal {P},c_1)=e_{j}\), \(C(\mathcal {P},c)=e_{j+1}\), and \(C(\mathcal {P},d)=e_{j+2}\);

(ii)
\(\mathcal {J}(\xi _{j-1})=\{c,c_1\}\), \(\mathcal {J}(\xi _{j})=\{a,c\}\) and \(\mathcal {J}(\xi _{j+1})=\{a,d\}\);

(iii)
\(s(\mathcal {P},d)<e_{j-1}\).
Then, jobs c and d form an \(\text {A}\)-configuration. (See Fig. 7a for an illustration.)
Proof
Let \(k<j-1\) be the maximum index such that job c is nonspanning in block \(k-1\) but spanning in block t for each \(t\in \{k,\ldots ,j-1\}\). Note that by (i), (ii) and Lemma 4, k is well defined. We prove, by induction on \(t\in \{1,\ldots ,j-k\}\), that
which immediately follows from (i), (ii) and Lemma 4 for \(t=1\). So assume inductively that the claim holds for some \(t-1\ge 1\); we prove it for t. It suffices to argue that some job \(c_t\) completes at \(e_{j-t+1}\), since then Lemma 3 implies that \(c_t\) is spanning in block \(j-t\). By the induction hypothesis and the fact that all jobs have the same execution time, neither \(c_{t-1}\) nor c starts at \(e_{j-t+1}\). Since \(e_{j-t+1}\) is an event of \(\mathcal {P}\), some job \(c_{t}\) completes at \(e_{j-t+1}\) as required. We have \(s(\mathcal {P},c)<e_{j-t}\), for otherwise we can swap jobs c and d in \([s(\mathcal {P},c),C(\mathcal {P},d)]\); the resulting schedule is feasible and has smaller total completion time than \(\mathcal {P}\), so \(\mathcal {P}\) is not optimal—a contradiction. This proves (6).
If \(0<\xi _{k-1}(c)<e_{k}-e_{k-1}\), then by modifying the schedule in block \(k-1\) we may without loss of generality assume that c resumes at \(e_k\). Thus, (6) implies that c and d form an \(\text {A}\)-configuration of length \(e_{j+1}-e_k\) at \(e_{j+1}\). \(\square \)
Proposition 7
Let \((\mathcal {P},{\varvec{e}},{\varvec{\xi }})\) be an \(\text {A}\)-maximal schedule. Suppose that jobs a and b form an \(\text {A}\)-configuration at \(e_j\) in \(\mathcal {P}\) with \(C(\mathcal {P},a)<C(\mathcal {P},b)\). Then there exists no \(e_t\le s(\mathcal {P},b)\) such that (see Fig. 7b for an illustration, where it is possible that job c ends at the start of b):

(i)
\(r(a)<e_t\), \(\xi _t(a)=e_{t+1}-e_t\), and job a is nonspanning in block \(t-1\);

(ii)
Jobs in \(\mathcal {J}(\xi _{t-1})\cup \{a\}\) are independent;

(iii)
Some job c in \(\mathcal {J}(\xi _{t-1})\) satisfies \(C(\mathcal {P},c)\ge e_{j'}\), \(\xi _{j'}(c)=0\), and if \(C(\mathcal {P},c)=e_{j'}\), then the jobs in \((\mathcal {J}(\xi _{j'})\setminus \{b\})\cup \{c\}\) are independent, where \(e_{j'}=s(\mathcal {P},b)\);

(iv)
The abnormality point of \(\mathcal {P}\) is not in \(\{t,\ldots ,j'\}\).
Proof
Suppose for a contradiction that such an \(e_t\) exists. Let \(\ell >0\) be the length of the \(\text {A}\)-configuration formed by a and b. Define
By (i) and (iii), we have \(\varepsilon >0\). Let \(\mathcal {P}'\) be a schedule obtained by moving a piece of c of length \(\varepsilon \) from the \((t-1)\)st block to the \(j'\)th block, a piece of b of length \(\varepsilon \) from the \(j'\)th block to \([C(\mathcal {P},a)-\varepsilon ,C(\mathcal {P},a)]\), and a piece of a from \([C(\mathcal {P},a)-\varepsilon ,C(\mathcal {P},a)]\) to the \((t-1)\)st block. By (i), (ii), (iii) and Lemma 1, the schedule \(\mathcal {P}'\) is feasible. This transformation is shown in Fig. 8 when \(\varepsilon =e_{t}-e_{t-1}-\xi _{t-1}(a)\) and \(j'=t\). Clearly, \(C(\mathcal {P},d)=C(\mathcal {P}',d)\) for each \(d\in \mathcal {J}\setminus \{a,c\}\) and, by (iii), \(C(\mathcal {P}',c)\le C(\mathcal {P},c)+\varepsilon \).
If \(\varepsilon =\ell \), then the total completion time of \(\mathcal {P}'\) is strictly smaller than that of \(\mathcal {P}\), because a resumes at \(C(\mathcal {P},a)-\ell \) in \(\mathcal {P}\), i.e., \(C(\mathcal {P}',a)<C(\mathcal {P},a)-\ell \). We get a contradiction since \(\mathcal {P}\) is optimal.
Otherwise, if \(\varepsilon <\ell \), then a and b form an \(\text {A}\)-configuration in \(\mathcal {P}'\) at \(C(\mathcal {P},a)-\varepsilon \). Also, \(C(\mathcal {P}',a)=C(\mathcal {P},a)-\varepsilon \). Let i be the abnormality point of \(\mathcal {P}\). If \(i\le t-1\), then, since \(\mathcal {P}\) and \(\mathcal {P}'\) are the same in \([0,e_{t-1}]\), we obtain that the abnormality point of \(\mathcal {P}'\) equals i and \(\mathcal {P}'\) is \(\text {A}\)-maximal. If \(i>t-1\), then by (iv), \(i>j'\) and hence \(\varepsilon \) is t-normal and, by Lemma 13, \(\mathcal {P}'\) is \(\text {A}\)-maximal. Therefore, we obtain a contradiction in both cases, which proves the proposition. \(\square \)
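The piece-moving step in this proof only permutes where fixed amounts of each job execute; per-job processing totals are conserved. A minimal numeric sketch of this bookkeeping (all lengths and block labels below are illustrative, not taken from the paper):

```python
# A toy check of the three-piece move in the proof of Proposition 7: a piece
# of c of length eps moves from block t-1 to block j', a piece of b moves
# from block j' to a's final interval, and eps of a's final piece moves to
# block t-1. Numbers are illustrative.
eps = 0.25

# per-block allocation: block label -> {job: length executed there}
P = {
    "t-1": {"a": 0.5, "c": 0.75},   # a is nonspanning in block t-1
    "j'":  {"b": 0.5, "c": 0.0},
    "end": {"a": 0.5},              # a's final piece before C(P, a)
}

Pp = {
    "t-1": {"a": P["t-1"]["a"] + eps, "c": P["t-1"]["c"] - eps},
    "j'":  {"b": P["j'"]["b"] - eps, "c": P["j'"]["c"] + eps},
    "end": {"a": P["end"]["a"] - eps, "b": eps},
}

def total(sched, job):
    # total processing a job receives over all blocks
    return sum(blk.get(job, 0.0) for blk in sched.values())

# feasibility bookkeeping: every job keeps its total processing time
for job in "abc":
    assert abs(total(P, job) - total(Pp, job)) < 1e-12
```

The move is pure reallocation, which is why feasibility reduces to the independence and release-date conditions (i)-(iii) rather than to any accounting of processing amounts.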
The following lemma describes how two jobs that form an \(\text {A}\)-configuration start in an \(\text {A}\)-maximal schedule. See Fig. 9 for an illustration of the two possible cases: the two jobs can start at different time points, or at the same time.
Proposition 8
Suppose that each \(\text {A}\)-maximal schedule has an \(\text {A}\)-configuration. Then, there exists an \(\text {A}\)-maximal schedule \((\mathcal {P},{\varvec{e}},{\varvec{\xi }})\) such that the earliest \(\text {A}\)-configuration formed by a and b with \(C(\mathcal {P},a)<C(\mathcal {P},b)\) satisfies the following properties:

(i)
\(\mathcal {J}(\xi _{j'})=\{a,b\}\), \(|\mathcal {J}(\xi _j)|=2\) and some job completes at \(e_{j'}\), where \(e_{j'}=s(\mathcal {P},b)\) and \(e_j=s(\mathcal {P},a)\),

(ii)
\(0\le j'-j\le 1\), and

(iii)
\(e_{j'+1}\) is an integer.
Proof
Let \(\mathcal {P}\) be \(\text {A}\)-maximal. Let a and b form the earliest \(\text {A}\)-configuration in \(\mathcal {P}\). By Proposition 5, \(s(\mathcal {P},a)\le s(\mathcal {P},b)=e_{j'}\). Without loss of generality assume that \(s(\mathcal {P},b)\) is as late as possible.
Then job a is spanning in block \(j'\) since otherwise a and b interlace and we get a contradiction by Lemma 5. Moreover,
since otherwise, by Lemma 7, we would have \(\mathcal {J}(\xi _{j'})=\{a,b,c\}\) and \(C(\mathcal {P},c)=e_{j'+1}\). The latter implies, by Lemma 3, that job c is spanning in block \(j'\), which contradicts that job a is spanning in block \(j'\) and proves (7).
We prove (iii) first. Suppose for a contradiction that \(e_{j'+1}\) is not an integer and let h be the greatest integer smaller than \(e_{j'+1}\). Since \(e_{j'+1}\) is an event and, by definition of \(\text {A}\)-configuration, neither a nor b ends at \(e_{j'+1}\), (7) implies that some job c starts at \(e_{j'+1}\).
We show that \(\xi _{j'+1}(b)=0\), which will make our first transformation in (9) feasible. This holds for \(j'+1=\tau (a)\), since by definition of \(\text {A}\)-configuration \(b\notin \mathcal {J}(\xi _{\tau (a)})\). For \(j'+1<\tau (a)\), we have \(\xi _{j'+1}(b)=0\) or job a is spanning in block \(j'+1\), for otherwise a and b interlace and we get a contradiction by Lemma 5. However, if \(\xi _{j'+1}(b)>0\) and job a is spanning in block \(j'+1\), then \(\mathcal {J}(\xi _{j'+1})=\{a,b,c\}\). Thus, by Lemma 7, some job must finish at \(e_{j'+1}\) and since \(j'+1<\tau (a)<\tau (b)\), this job must be c. By Lemma 3, job c is spanning in block \(j'+1\)—a contradiction. Therefore,
Since \(\tau (a)>j'+1\), we obtain by (7) and definition of \(\text {A}\)-configuration that no job ends at \(e_{j'+2}\) and hence Lemma 7 implies that job c is spanning in block \(j'+1\). Now take
and let \(\mathcal {P}'\) be a schedule with events \({\varvec{e'}}\) and partition \({\varvec{\xi '}}\), where
Figure 10a illustrates the transformation from \(\mathcal {P}\) to \(\mathcal {P}'\) for \(\varepsilon =e_{j'+1}-h\), when \(h>e_{j'}\). Observe that (8) and \(\xi _{j'}(b)\ge \varepsilon \) ensure the feasibility of \(\mathcal {P}'\). Also, by (7) and Lemma 12, we have \(i\ne j'\), where i is the abnormality point of \(\mathcal {P}\), possibly equal to \(\infty \). Clearly, if \(i<j'\), then \(\mathcal {P}\) and \(\mathcal {P}'\) have the same abnormality point i since the two schedules are identical in \([0,e_{j'}]\). If \(i>j'\), then by Lemma 11, \(\xi _{j'+1}(c)\) is \((j'+1)\)-normal and Lemma 13 implies the same abnormality point i for both \(\mathcal {P}\) and \(\mathcal {P}'\). Finally, the \(\text {A}\)-maximality of \(\mathcal {P}\) implies that a and b form an \(\text {A}\)-configuration in \(\mathcal {P}'\). Therefore, if \(h\le e_{j'}\), then \(\mathcal {P}'\) is \(\text {A}\)-maximal and it can be ensured that \(s(\mathcal {P}',b)>s(\mathcal {P},b)\), which contradicts our assumption about \(\mathcal {P}\). If \(h>e_{j'}\), then \(\mathcal {P}'\) is \(\text {A}\)-maximal and satisfies (iii) as required. To simplify notation we set \(\mathcal {P}:=\mathcal {P}'\) in the remainder of the proof.
We now prove (i) and (ii). Observe that by (iii), \(s(\mathcal {P},b)\) is not an integer and thus
for otherwise \(\mathcal {P}\) would not be optimal—a contradiction.
Suppose first that \(s(\mathcal {P},a)=s(\mathcal {P},b)=e_{j'}\). If a job \(a'\) in \(\mathcal {J}(\xi _{j'-1})\) does not complete at \(e_{j'}\), i.e., \(a'\) is preempted at \(e_{j'}\), then \(C(\mathcal {P},a')> e_{\tau (b)}\), for otherwise, by Lemma 3, at most one of the jobs \(\{a,b\}\) can be spanning in block \(\tau (a')\) and thus the other job in \(\{a,b\}\) and \(a'\) would interlace, which contradicts Lemma 5. However, if \(C(\mathcal {P},a')>e_{\tau (b)}\), then job \(a'\) is spanning in block \(\tau (b)\), for otherwise \(a'\) and b interlace, which again contradicts Lemma 5. Thus, \(|\mathcal {J}(\xi _{\tau (b)})|=2\) by Lemma 3. Therefore, a job in \(\mathcal {J}(\xi _{j'-1})\setminus \{a'\}\) completes at \(e_{j'}\). The other conditions of the proposition trivially follow when \(s(\mathcal {P},a)=s(\mathcal {P},b)\).
Now let \(s(\mathcal {P},a)\ne s(\mathcal {P},b)\). By assumption, \(s(\mathcal {P},a)<s(\mathcal {P},b)\). Informally, the proof is divided into two stages. In the first stage we consider block \(j'-1\) and prove that \(\mathcal {J}(\xi _{j'-1})=\{a,c\}\) and that \(\tau (c)=j'-1\)—see Eqs. (11), (12) and (13) and Fig. 10b. In the second stage we prove that a starts at \(e_{j'-1}\). The proof of the latter is by contradiction, i.e., we suppose that a starts before \(e_{j'-1}\). This assumption implies that \(\mathcal {P}\) looks as shown in Fig. 10c in the interval \([e_{j'-3},e_{j'+1}]\), which allows us to obtain the desired contradiction thanks to Proposition 6.
First we prove by contradiction that
By (10), \(\mathcal {J}(\xi _{j'-1})\setminus \{a\}\ne \emptyset \). Take any \(c\in \mathcal {J}(\xi _{j'-1})\setminus \{a\}\). By (7), \(\xi _{j'}(c)=0\). Since job a is nonspanning in block \(j'-1\), the conditions (i)–(iv) of Proposition 7 are all satisfied by jobs a and c, and \(t=j'\). (Condition (iv) holds as \(j'\) is not the abnormality point of \(\mathcal {P}\) by (7) and Lemma 12.) Therefore we get a contradiction, and (11) holds.
Next, we show that
If no job completes at \(e_{j'}\), then Lemma 7 and (10) immediately imply (12). If some job, say c, completes at \(e_{j'}\), then Lemma 3 implies that job c is spanning in block \(j'-1\). Since a completes after \(e_{j'}\), \(a\ne c\). This and (11) imply (12).
Finally to complete the first stage, we prove by contradiction that
To that end take \(\varepsilon =\min \{\xi _{j'-1}(c),\xi _{j'}(b)\}\) and let \(\mathcal {P}'\) be a schedule with events \({\varvec{e'}}\) and partition \({\varvec{\xi '}}\), where
Note that
which, by (iii), implies that \(s(\mathcal {P}',b)\ge r(b)\). Thus, \(\mathcal {P}'\) is feasible and, by assumption, optimal. Also, by (7), (12) and Lemma 12, we have \(i\notin \{j'-1,j'\}\), where i is the abnormality point of \(\mathcal {P}\). Thus, as before, i is the abnormality point of \(\mathcal {P}'\). Indeed, this follows from the fact that the two schedules are identical in \([0,e_{j'}]\) (which covers the case \(i<j'\)), and from Lemma 13 (which covers the case \(i>j'\)). Moreover, \(\mathcal {P}'\) contains a block that ends at \(e_{j'+1}\) and contains the jobs a, b and c, none of which completes at \(e_{j'+1}\)—a contradiction with Lemma 7. Therefore, (13) holds, and thus, due to Eqs. (11), (12) and (13), the schedule in the interval \([e_{j'-1},e_{j'+1}]\) looks as in Fig. 10b.
In the second stage we argue that
Suppose for a contradiction that this is not the case. By (11), c does not start at \(e_{j'-1}\). Since a does not start at \(e_{j'-1}\) either, there is a job, say \(c'\), that ends at \(e_{j'-1}\), for otherwise \(e_{j'-1}\) would not be an event. By Lemma 3,
which implies
as follows: First we observe that there is no job \(d\ne c'\) that completes at \(e_{j'-1}\). Indeed, otherwise Lemma 8 applied to \(c=d\), \(c'\), \(d=c\), and \(e_j=e_{j'-1}\) gives the required contradiction. Now, if \(c\in \mathcal {J}(\xi _{j'-2})\), then the conditions (i)–(iv) of Proposition 7 are all satisfied by jobs a, b and c, and \(t=j'-1\)—a contradiction. (Condition (iv) holds as neither \(j'-1\) nor \(j'\) is the abnormality point of \(\mathcal {P}\) by (7), (12), and Lemma 12.) Therefore, \(\xi _{j'-2}(c)=0\). Thus, \(\mathcal {J}(\xi _{j'-2})\subseteq \{a,c'\}\), because if a job different from a and \(c'\) that does not complete at \(e_{j'-1}\) were present in \(\mathcal {J}(\xi _{j'-2})\), then, by (12) and (13), this job would interlace with c, which contradicts Lemma 5. This implies (16) as required.
If job \(c'\) is nonspanning in block \(j'-3\), then by (12), (16) and \(C(\mathcal {P},c')=e_{j'-1}\), we have \(s(\mathcal {P},c')<e_{j'-2}\), which implies that c and \(c'\) form an \(\text {A}\)-configuration of length \(e_{j'-1}-e_{j'-2}\) at \(e_{j'-1}\), which leads to a contradiction with the \(\text {A}\)-maximality of \(\mathcal {P}\). Thus we have
We prove that
i.e., we prove that \(\mathcal {P}\) in the interval \([e_{j'-3},e_{j'+1}]\) is as shown in Fig. 10c. First, we have \(\xi _{j'-3}(c_1)>0\) for some \(c_1\notin \{a,c,c'\}\). Otherwise \(\mathcal {J}(\xi _{j'-3})\subseteq \{a,c,c'\}\), and since \(e_{j'-2}\) is an event, \(s(\mathcal {P},a)=e_{j'-2}\). Then, however, conditions (i)–(iv) of Proposition 7 are all satisfied by jobs a, b, c, and \(t=j'-2\)—a contradiction (observe that \(h-s(\mathcal {P},a)<1\), thus (i) is satisfied; condition (iv) holds as none of \(j'-2\), \(j'-1\), \(j'\) is the abnormality point of \(\mathcal {P}\) by (7), (12), (15), (16), and Lemma 12). Second, each such \(c_1\) completes at \(e_{j'-2}\), for otherwise, by (12), (16) and (17), \(c_1\) and c interlace—a contradiction by Lemma 5. Thus, by Lemma 3, job \(c_1\) is spanning in block \(j'-3\). This and (17) imply (18). Thus, \(\mathcal {P}\) looks in the interval \([e_{j'-3},e_{j'+1}]\) as shown in Fig. 10c. Finally, by Proposition 6 applied to \(c=c'\), \(c_1\), \(d=c\), a, and \(e_j=e_{j'-2}\), we obtain that c and \(c'\) form an \(\text {A}\)-configuration at \(e_{j'-1}\). Thus, again, we get a contradiction since \(\mathcal {P}\) is \(\text {A}\)-maximal. Hence, (14) holds. Therefore the proposition follows by (7), (12), and (14). \(\square \)
Given schedule \((\mathcal {P}, {\varvec{e}},{\varvec{\xi }})\), \(l\ge 1\) and \(\{a_1,\ldots ,a_l\}\subseteq \mathcal {J}\), job sequence \((a_1,\ldots , a_l)\) is called a subchain starting at t in \(\mathcal {P}\) if:

(S1)
For each \(j\in \{1,\ldots ,l-1\}\), \(a_j\preceq a_{j+1}\);

(S2)
For each \(j\in \{1,\ldots ,l-1\}\), \(C(\mathcal {P},a_j)=s(\mathcal {P},a_{j+1})\);

(S3)
Job \(a_1\) executes nonpreemptively in \([t,C(\mathcal {P},a_1)]\).
Moreover, job sequence \((a_1,\ldots ,a_l)\) is a chain in \(\mathcal {P}\) if it satisfies conditions (S1), (S2) and additionally

(S4)
Time t is the earliest moment such that \(a_1\) executes with no preemption in \([t,C(\mathcal {P},a_1)]\);

(S5)
No predecessor of \(a_1\) ends at t.
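Conditions (S1)–(S3) lend themselves to a direct check. A minimal sketch in Python, under the simplifying assumption that a job is abstracted to a (name, start, completion) triple and precedence is given as a set of pairs; a full check of (S3) would additionally need the block partition to verify nonpreemptive execution:

```python
# Checker for the subchain conditions (S1)-(S3); the data structures are
# illustrative abstractions, not the paper's notation.
def is_subchain(jobs, prec, t):
    """jobs: list of (name, start, completion) in chain order;
    prec: set of pairs (u, v) meaning u precedes v; t: claimed start."""
    for (u, _, cu), (v, sv, _) in zip(jobs, jobs[1:]):
        if (u, v) not in prec:   # (S1): consecutive jobs form a precedence pair
            return False
        if cu != sv:             # (S2): each job completes where the next starts
            return False
    # (S3), simplified: t lies within the first job's final execution interval
    _, s, c = jobs[0]
    return s <= t < c

chain = [("a1", 0.0, 1.0), ("a2", 1.0, 2.0), ("a3", 2.0, 3.0)]
prec = {("a1", "a2"), ("a2", "a3")}
assert is_subchain(chain, prec, 0.0)
assert not is_subchain(chain, prec, 3.5)
```

A chain additionally pins down t by (S4) and (S5); those conditions depend on the whole schedule, so they are omitted from this local check.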
Suppose that jobs a and b form an \(\text {A}\)-configuration in \(\mathcal {P}\) with \(C(\mathcal {P},a)<C(\mathcal {P},b)\). For \(\varepsilon \ge 0\), we define an operation of \(\varepsilon \)-exchanging a and b in an interval \([e_k,C(\mathcal {P},b)]\), \(k<\tau (b)\), as follows. First, all pieces of a and b are removed from the blocks \(k,\ldots ,\tau (b)\). Note that the total lengths of all removed pieces of a and b are \(\sum _{t=k}^{\tau (b)}\xi _t(a)\) and \(\sum _{t=k}^{\tau (b)}\xi _t(b)\), respectively. Then, the empty gaps are filled with a total length \(\sum _{t=k}^{\tau (b)}\xi _t(a)-\varepsilon \) of a and a total length \(\sum _{t=k}^{\tau (b)}\xi _t(b)+\varepsilon \) of b in such a way that b completes as early as possible. Note that the new schedule is guaranteed to be feasible only if \(\varepsilon =0\). Whenever the transformation of \(\varepsilon \)-exchanging is used with \(\varepsilon >0\), some other appropriate changes in the schedule will be made to ensure feasibility.
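The operation conserves the total amount of processing in the emptied interval: refilling with \(\varepsilon \) units shifted from a to b reuses exactly the freed capacity. A toy check under illustrative piece lengths:

```python
# Bookkeeping for eps-exchanging of a and b in blocks k..tau(b): the removed
# pieces have total lengths sum(xi_a) and sum(xi_b); the gaps are refilled
# with sum(xi_a) - eps of a and sum(xi_b) + eps of b. Numbers are illustrative.
xi_a = [0.5, 0.25, 0.25]   # pieces of a in blocks k..tau(b)
xi_b = [0.25, 0.5, 0.25]   # pieces of b in blocks k..tau(b)
eps = 0.125

filled_a = sum(xi_a) - eps
filled_b = sum(xi_b) + eps

# the exchange shifts eps units of processing from a to b, so the total
# capacity of the emptied gaps is exactly reused
assert filled_a + filled_b == sum(xi_a) + sum(xi_b)
```

This conservation is what lets the definition speak only about where the refilled pieces go, never about their total amount.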
For \(\varepsilon >0\), we extend the operation of \(\varepsilon \)-exchanging of two jobs to subchains as follows. Let \(A=(a_1,\ldots ,a_l=a)\) and \(B=(b_1,\ldots ,b_{l'}=b)\) be two subchains in \(\mathcal {P}\) that start at t, and such that a and b form an \(\text {A}\)-configuration in \(\mathcal {P}\), where \(s(\mathcal {P},a)\le s(\mathcal {P},b)\). (Note that we either have \(l=l'\) or \(l=l'-1\).) Let d be any job that executes in \([t-\varepsilon ,t]\). The operation of \((\varepsilon ,d)\)-exchanging of A and B in \(\mathcal {P}\) leads to a schedule \(\mathcal {P}'\) obtained by making the following changes to \(\mathcal {P}\):

For each \(j\in \{1,\ldots ,l\}\), \(a_j\) is executed in \([t_j-\varepsilon ,t_{j+1}-\varepsilon ]\) in \(\mathcal {P}'\), where \(t_1=t\), \(t_j=s(\mathcal {P},a_j)\) for \(j\in \{2,\ldots ,l\}\) and \(t_{l+1}=e_{j'+1}\) such that \(e_{j'}=s(\mathcal {P},b)\);

For each \(j\in \{1,\ldots ,l'\}\), \(b_j\) is executed in \([u_j+\varepsilon ,u_{j+1}+\varepsilon ]\) in \(\mathcal {P}'\), where \(u_1=t\), \(u_j=s(\mathcal {P},b_j)\) for \(j\in \{2,\ldots ,l'\}\) and \(u_{l'+1}=e_{j'+1}\);

A piece of d executing in \([t-\varepsilon ,t]\) is placed in \([t,t+\varepsilon ]\) in \(\mathcal {P}'\) (the “room” for this job is made by postponing \(b_1\));

In the interval \([e_{j'+1},C(\mathcal {P},b)]\), \(\varepsilon \)-exchanging of a and b is performed.
The transformation is illustrated in Fig. 11 for \(d=b_1\). Note that in this particular case the total completion times of \(\mathcal {P}\) and \(\mathcal {P}'\) are equal.
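One way the equality of total completion times noted above can arise is when \(d=b_1\) and \(l=l'\): each job of A finishes \(\varepsilon \) earlier while each job of B finishes \(\varepsilon \) later. A sketch of this accounting, with illustrative counts that are an assumption rather than data from Fig. 11:

```python
# Completion-time deltas under (eps, d)-exchanging with d = b_1 and l = l'.
# Each a_j is shifted eps earlier and each b_j is shifted eps later; the
# specific values below are illustrative.
eps, l = 0.125, 3
delta_A = [-eps] * l   # a_1, ..., a_l finish eps earlier
delta_B = [+eps] * l   # b_1, ..., b_l finish eps later

# the total completion time of the new schedule equals that of the old one
assert sum(delta_A) + sum(delta_B) == 0
```

When fewer jobs are delayed than advanced, as in Case 1 of the proof of Theorem 2 below, the same accounting yields a strict decrease.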
The new schedule \(\mathcal {P}'\) is feasible under certain conditions. First, the value of \(\varepsilon \) must be selected in such a way that \(\varepsilon \)-exchanging of a and b is possible in the above-mentioned interval. Second, d should not be a predecessor of \(a_1\). Also, the release dates of jobs \(a_1,\ldots ,a_l\) need to be respected and \(a_1\) must be executed nonpreemptively in \([t,t+\varepsilon ]\). We summarize these conditions in the following proposition.
Proposition 9
Let \((a_1,\ldots ,a_l=a)\) and \((b_1,\ldots ,b_{l'}=b)\), starting at t, be two subchains in \(\mathcal {P}\) such that a and b form an \(\text {A}\)-configuration of length \(\ell \) in \(\mathcal {P}\). If \(\varepsilon \le \ell \), \(r(a_1)\le t-\varepsilon \) and \(r(a_j)\le s(\mathcal {P},a_j)-\varepsilon \) for each \(j\in \{2,\ldots ,l\}\), \(a_1\) executes nonpreemptively in \([t,t+\varepsilon ]\), and some job d that is not a predecessor of \(a_1\) executes nonpreemptively in \([t-\varepsilon ,t]\), then the schedule \(\mathcal {P}'\) obtained by \((\varepsilon ,d)\)-exchanging of the two subchains in \(\mathcal {P}\) is feasible. \(\square \)
5.1 Proof of Theorem 2
Let \((\mathcal {P}, {\varvec{e}}, {\varvec{\xi }})\) be a maximal schedule that satisfies the properties in Proposition 8. Let \((a_1,\ldots ,a_l=a)\) be the chain in \(\mathcal {P}\) that starts at \(t_a\) and let \((b_1,\ldots ,b_{l'}=b)\) be the chain in \(\mathcal {P}\) that starts at \(t_b\). By definition of chains and Proposition 8 we have
Case 1 \(t_a\ge t_b\).
In this case we perform the transformation shown in Fig. 12, described below. By Proposition 8, there exists an integer h such that both jobs a and b execute nonpreemptively in \([s(\mathcal {P},a),h]\) and \([s(\mathcal {P},b),h]\), respectively. We have \(e_p=t_a\) for some event \(e_p\).
Let \(e_{j'}=s(\mathcal {P},b)\). By Proposition 8, \(h-e_{j'}=\xi _{j'}(b)\). Let \(\ell \) be the length of the \(\text {A}\)-configuration formed by a and b. Clearly \(C(\mathcal {P},a)-\ell > h\) by definition of \(\text {A}\)-configuration and Proposition 8.
Let \(A=(a_1,\ldots ,a_l)\) and let B be the subchain of the chain \((b_1,\ldots ,b_{l'}=b)\) that starts at \(t_a\) with a job \(b'\) and ends with the job b. By definition of \(t_a\), \(\xi _{p-1}(a_1)<e_p-e_{p-1}\). Thus, \(|\mathcal {J}(\xi _{p-1})\setminus \{a_1\}|\ge 2\). Let \(d\in \mathcal {J}( \xi _{p-1})\setminus \{a_1\}\) be a job that does not complete at \(e_p\) (possibly \(b'\)), if such a job exists. Otherwise, let d be any job in \(\mathcal {J}( \xi _{p-1})\setminus \{a_1\}\). Take
where \(h'\) is the greatest integer smaller than \(t_a\) and
The latter ensures that d, if it completes at \(t_a\) in \(\mathcal {P}\), does not complete after \(s(\mathcal {P}',a_2)\) in \(\mathcal {P}'\). Note that, by definition of \(t_a\), no predecessor of \(a_1\) ends at \(t_a\) and \(\xi _{p-1}(a_1)<e_p-e_{p-1}\). Hence, in particular, \(\varepsilon >0\). Let \(\mathcal {P}'\) be the schedule obtained by \((\varepsilon ,d)\)-exchanging of A and B in \(\mathcal {P}\). By Proposition 9, \(\mathcal {P}'\) is feasible. If \(\varepsilon =\ell \), then the total completion time of \(\mathcal {P}'\) is strictly smaller than that of \(\mathcal {P}\) and we get a contradiction since \(\mathcal {P}\) is optimal.
Thus, consider \(\varepsilon <\ell \). Then, the total completion time of \(\mathcal {P}'\) does not exceed that of \(\mathcal {P}\). To see this, observe that by (19) and (20) we have \(s(\mathcal {P},b)-t_a<l\). Also, if two jobs in \(\mathcal {J}(\xi _{p-1})\setminus \{a_1\}\) complete at \(t_a\), then either at least one of them is a predecessor of \(b'\), which implies that \(s(\mathcal {P},b')=t_a\), or otherwise we obtain from Lemma 8 that \(s(\mathcal {P},b')=t_a\). Therefore, no more than l jobs in \(\{d,b_1,\ldots ,b_{l'}\}\) complete in \([t_a,h]\) in \(\mathcal {P}\). Thus, no more than l jobs get delayed by \(\varepsilon \) each as a result of the exchange while, at the same time, each job in the chain \((a_1,\ldots ,a_l=a)\) completes earlier by \(\varepsilon \).
Finally, we show that \(\mathcal {P}\) and \(\mathcal {P}'\) have the same abnormality point. Clearly, this holds if \(i<p-1\). Also, if \(i>j'\), then \(\varepsilon \) is p-normal. To see this we observe that \(e_p\), \(e_{p-1}\), \(\xi _{p-1}(a_1)\) and \(t_a\) are clearly all p-normal. By Lemma 10, \(C(\mathcal {P},a_1)-t_a\) is p-normal. Also \(e_{j'}=t_a+(C(\mathcal {P},b')-t_a)+k-2\), where k is the number of jobs in B, is p-normal. If \(l=1\), then \(s(\mathcal {P},a)\) and h are p-normal, which implies p-normality of y. For \(l>1\), we argue that y is also p-normal, and for that we need only consider \(y=(C(\mathcal {P},a_1)-e_p)/2\). Then, \(b'\) is not present in the \((p-1)\)st block, for otherwise \(b'\) would be selected as d. The length of the \((p-1)\)st block, \(e_p-e_{p-1}\), is by definition \((p-1)\)-normal. By Lemma 3, \(\xi _{p-1}(d)=e_{p}-e_{p-1}\). By Lemma 5, each job in \(\mathcal {J}(\xi _{p-1})\setminus \{a_1\}\) must complete at \(e_{p}\). This proves, again by Lemma 3, that \(|\mathcal {J}(\xi _{p-1})|=2\). By Lemma 10, \(\xi _{p-1}(a_1)+\xi _p(a_1)+\xi _{p+1}(a_1)=\xi _{p-1}(a_1)+C(\mathcal {P},a_1)-t_a\) is \((p-1)\)-normal. Since \(\xi _{p-1}(a_1)\in \{0,e_p-e_{p-1}\}\), we obtain that \(\xi _{p-1}(a_1)\) is \((p-1)\)-normal. Thus, \((C(\mathcal {P},a_1)-t_a)/2\) is p-normal as required. Therefore, \(\varepsilon \) is p-normal and, by Lemma 13, \(\mathcal {P}\) and \(\mathcal {P}'\) have the same abnormality point for \(i>j'\). Also, by Proposition 8 and the chain definition, we have \(|\mathcal {J}(\xi _k)|=2\) for each \(k\in \{p,\ldots ,j'\}\). Thus, by Lemma 12, \(i\notin \{p,\ldots ,j'\}\). Finally, consider \(i=p-1\). Then, if i is no longer the abnormality point of \(\mathcal {P}'\), the abnormality point \(i'\) of \(\mathcal {P}'\) satisfies \(i'>i\)—a contradiction since \(\mathcal {P}\) is \(\text {A}\)-maximal. Therefore, i is the abnormality point of \(\mathcal {P}'\), and hence we have proved that \(\mathcal {P}\) and \(\mathcal {P}'\) have the same abnormality point.
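The halving step for y can be checked directly under the convention, consistent with the dyadic arithmetic used throughout the normality arguments, that a value is p-normal when it is an integer multiple of \(2^{-p}\); this is a worked restatement, not an additional claim:

```latex
C(\mathcal{P},a_1)-t_a=\frac{s}{2^{p-1}}\ \text{ for some } s\in\mathbb{Z}
\quad\Longrightarrow\quad
\frac{C(\mathcal{P},a_1)-t_a}{2}=\frac{s}{2^{p}},
```

so \((p-1)\)-normality of \(C(\mathcal {P},a_1)-t_a\) immediately yields p-normality of its half, which is exactly what the case \(y=(C(\mathcal {P},a_1)-e_p)/2\) requires.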
To complete the proof of this case we note that a and b form an \(\text {A}\)-configuration in \(\mathcal {P}'\) at \(C(\mathcal {P}',a)=C(\mathcal {P},a)-\varepsilon \), which contradicts the \(\text {A}\)-maximality of \(\mathcal {P}\).
Case 2 \(t_a<t_b\). We first define
and \(a'\) to be the job from the chain \((a_{1},\ldots ,a_{l})\) that starts or resumes at \(t'_a\). By (19)–(22), it holds that \(a'=a_{l-l'+1}\) or \(a'=a_{l-l'+2}\), and only one job, namely \(a'\), from the chain \((a_1,\ldots ,a_l)\) is executed in \((t'_a,t_b)\). By definition \(t'_a<t_b\); also we have \(t'_a=e_{p}\) for some event \(e_p\).
We first prove that exactly one job that is not in the chain \((a_{1},\ldots ,a_{l})\), call it d, executes in \([t'_a,t_b]\) and completes at \(t_b\). Indeed, if \(l'=1\), then this follows from Proposition 8. If \(l'>1\), then any job not in the chain \((a_{1},\ldots ,a_{l})\) that executes in \([t'_a,t_b]\) completes in \([t'_a,t_b]\), for otherwise this job interlaces with \(b_1\)—a contradiction with Lemma 5. Finally, we show that two or more jobs not in the chain \((a_{1},\ldots ,a_{l})\) cannot complete in \([t'_a,t_b]\). If there are at least three such jobs, then the last two of them form an \(\text {A}\)-configuration, which contradicts the \(\text {A}\)-maximality of \(\mathcal {P}\). For exactly two, \(c'\) and c completing in this order, by Claim 1, c and \(c'\) form an \(\text {A}\)-configuration at \(e_{p+1}\)—a contradiction since \(\mathcal {P}\) is \(\text {A}\)-maximal. Also, observe that for \(l'=l+1\), job \(b_1\) resumes at \(t_b\) and thus \(b_1\) and d form an \(\text {A}\)-configuration at \(e_{p+1}\) by Claim 1—a contradiction since \(\mathcal {P}\) is \(\text {A}\)-maximal. Thus, let \(l\ge l'\) in the remainder of the proof.
Now we prove that our schedule \(\mathcal {P}\) satisfies the following claim that we have used above (see Fig. 13a for illustration of Claim 1):
Claim 1
Suppose that \(t'_a=e_p\) is an event in \(\mathcal {P}\) and that there exist jobs \(a'\), c, and d such that

(i)
\(C(\mathcal {P},c)=e_{p+1}\), and \(C(\mathcal {P},d)=e_{p+2}\);

(ii)
\(\mathcal {J}(\xi _{p})=\{a',c\}\) and \(\mathcal {J}(\xi _{p+1})=\{a',d\}\);

(iii)
\(s(\mathcal {P},d)<e_{p}\);

(iv)
if \(l'=l+1\), then \(d=b_1\); otherwise \(C(\mathcal {P},d)=t_b\).
Then jobs c and d form an \(\text {A}\)-configuration at \(e_{p+1}\).
Proof
If \(\xi _{p-1}(c)<e_p-e_{p-1}\), then the jobs c and d form an \(\text {A}\)-configuration of length \(e_{p+1}-e_{p}\) at \(e_{p+1}\)—the claim holds. Thus,
We now prove that in interval \([e_{p1},e_{p+2}]\) the schedule \(\mathcal {P}\) is as in Fig. 13, i.e., there exists a job \(c_1\) such that
First, we show that \(\xi _{p-1}(c_1)>0\) for some \(c_1\notin \{a',c,d\}\). Otherwise, by (23) and (ii), \(s(\mathcal {P},a')=e_p\) since \(e_p\) is an event. Thus, \(\mathcal {J}(\xi _{p-1})=\{c,d\}\). Now, take
where \(h'\) is the greatest integer smaller than \(t'_a\). Observe that \(\varepsilon >0\). Let \(A=(a_1,\ldots ,a_l)\) and let \(B=(d,b_1,\ldots ,b_{l'}=b)\) be the subchain that starts at \(e_{p+1}\). Perform the \((\varepsilon ,d)\)-exchanging of A and B in \(\mathcal {P}\) as in Case 1 (the completion time of c does not change in this transformation when \(d\ne b_1\) because \(a'\) from the \(\tau (a')\)th block is placed in the \((p-1)\)st block and d from the \((p-1)\)st block is placed in the \((\tau (d)+1)\)st block) to get a contradiction. Observe that, by (iv), \(d=b_1\) for \(l'=l+1\) and thus the \((\varepsilon ,d)\)-exchanging of A and B indeed produces a schedule \(\mathcal {P}'\) with total completion time that does not exceed that of schedule \(\mathcal {P}\). Also, by Lemma 4, there is no idle time in the \((p-1)\)st block of \(\mathcal {P}\). This implies, by Lemma 10, that \(\xi _{p-1}(d)=e_{p}-e_{p-1}\) is \((p-1)\)-normal, which by the arguments in Case 1 implies that the abnormality points of \(\mathcal {P}\) and \(\mathcal {P}'\) are the same.
Second, by Lemma 5, \(c_1\) and d cannot interlace, which implies \(C(\mathcal {P},c_1)=e_{p}\). By Lemma 3, job \(c_1\) is spanning in block \(p-1\). Thus, by (23) we have \(\{c_1,c\}=\mathcal {J}(\xi _{p-1})\). Finally, \(s(\mathcal {P},d)<e_{p-1}\) is due to (iii) and \(\mathcal {J}(\xi _{p-1})=\{c_1,c\}\). This completes the proof of (24). Equation (24) allows us to apply Proposition 6 to c, \(c_1\), d, \(a=a'\), and \(e_j=e_{p}\) to conclude that c and d form an \(\text {A}\)-configuration at \(e_{p+1}\). This contradicts the \(\text {A}\)-maximality of \(\mathcal {P}\) and completes the proof of Claim 1. \(\square \)
Let \(e_{j'}=s(\mathcal {P},b)\). Now, let \(y=\sum _{t\ge p}\xi _t(b_1)\) and \(z=\sum _{t\ge p}\xi _t(a_{l-l'+1})\). First we prove that \(z\le y\). This holds due to Proposition 8 when \(l'=1\), and hence let \(l\ge l'>1\). If \(z>y\), then swap a and b and then do the \((\varepsilon ,a_{l-l'+1})\)-exchanging of \((b_1,\ldots ,b_{l'})\) and \((a_{l-l'+1},\ldots ,a_l)\) (note the order of the chains, which is important), both starting at \(t_b\), where \(\varepsilon =s(\mathcal {P},b)-s(\mathcal {P},a) + \lambda \) and \(0<\lambda <\min \{\ell ,z-y,\xi _{j'}(b)\}\). This transformation is shown in Fig. 14a. Let the resulting schedule be \(\mathcal {P}'\). The swapping increases the total completion time by \(s(\mathcal {P},b)-s(\mathcal {P},a)\) and the \((\varepsilon ,a_{l-l'+1})\)-exchanging decreases it by \((l'-(l'-1))\varepsilon =\varepsilon \)—observe that after the swapping of a and b the completion time of job \(a=a_l\) does not change in the exchange. Therefore, the overall decrease equals \(\lambda \), and thus to get a contradiction it suffices to prove that \(\mathcal {P}'\) is feasible.
Observe that \(s(\mathcal {P},b)-s(\mathcal {P},a)=C(\mathcal {P},b_1)-C(\mathcal {P},a_{l-l'+1})\). By Proposition 8, \(e_{j'+1}\) is an integer. Thus, \(r(b)<e_{j'+1}\) implies \(r(b)\le e_{j'+1}-1\). Moreover, \(s(\mathcal {P}',b)\ge e_{j'+1}-1\). Therefore, by the definition of a subchain, all jobs \(b_{2},\ldots ,b_{l'}\) respect their release dates in \(\mathcal {P}'\). Since \(z>y\), we have \(s(\mathcal {P},b_1)<t'_a\) and hence \(b_1\) respects its release date in \(\mathcal {P}'\). Thus, \(z\le y\) for the remainder of the proof. We consider the following three subcases.
Case 2a \(t'_a=s(\mathcal {P},a_{l-l'+1})\). (The schedule transformation performed in this case is shown in Fig. 14b.) Then, \(z=1\). Since \(z\le y\), we have \(y=1\). If some job in \(\mathcal {J}(\xi _{p-1})\) does not complete at \(t'_a\), then this case reduces to Case 1. Otherwise, two jobs in \(\mathcal {J}(\xi _{p-1})\) complete at \(t'_a\). Thus, by Lemma 7, for at least one job in \(\mathcal {J}(\xi _{p-1})\), say job \(c'\), we have \(\xi _t(c')=0\), where \(e_t=s(\mathcal {P},d)\). Therefore, \(c'\) and d interlace if \(s(\mathcal {P},c')<s(\mathcal {P},d)\)—a contradiction by Lemma 5—or we can swap jobs \(c'\) and d if \(s(\mathcal {P},c')>s(\mathcal {P},d)\). In the latter case the resulting schedule (see Fig. 14b) reduces the total completion time of \(\mathcal {P}\) by Lemma 2. This schedule is not feasible when \(c'\prec a_{l-l'+1}\), and we restore feasibility by applying a 0-exchanging of b and a in \([e_{j'+1},C(\mathcal {P},b)]\) followed by a \((t_b-t'_a,a_{l-l'+1})\)-exchanging of \((b_1,\ldots ,b_{l'})\) and \((a_{l-l'+1},\ldots ,a_l)\), both starting at \(t_b\). The new schedule \(\mathcal {P}'\) is feasible since \(c'\nprec b_1\) for intrees, and since, by Proposition 8, \(s(\mathcal {P}',b)\ge e_{j'+1}-1\ge r(b)\), which shows that all of \(b_1,\ldots ,b_{l'}\) respect their release dates in \(\mathcal {P}'\). Thus, we get a contradiction since \(\mathcal {P}\) is optimal.
Case 2b \(t'_a=s(\mathcal {P},a_{l-l'+2})\). Since \(l\ge l'\), \(\xi _{p-1}(a_{l-l'+1})=e_p-e_{p-1}\). Also, \(t'_a<t_b\) implies that \(b_1\) resumes at \(t_b=e_{p+1}\). Thus, by Claim 1, d and \(b_1\) form an \(\text {A}\)-configuration at \(e_{p+1}\), which contradicts the \(\text {A}\)-maximality of \(\mathcal {P}\).
Case 2c \(t'_a\ne s(\mathcal {P},a_{l-l'+1})\) and \(t'_a\ne s(\mathcal {P},a_{l-l'+2})\). By definition of \(t'_a\), we have \(t'_a=t_a\) and \(a_1\) resumes at \(t_a\). Since \(t_a\) is an event of \(\mathcal {P}\), some job, say c, completes at \(e_p\). By Lemma 3, job c is spanning in block \(p-1\). If another job completes at \(e_p\), then we get a contradiction by Lemma 8. Hence, by Lemma 5, \(\mathcal {J}(\xi _{p-1})\subseteq \{c,d,b_1,a_1\}\). Note that \(z\le y\), \(l\ge l'\) and \(t_a<t_b\) imply that \(l'=l\). By definition of \(t_a\), job \(a_1\) is nonspanning in block \(p-1\). By Lemma 5, d and \(b_1\) do not interlace, which implies \(\xi _{p-1}(b_1)=0\). Therefore, \(\xi _{p-1}(d)>0\). This allows us to obtain a contradiction by performing a transformation analogous to that in Claim 1.
Observe that for the proof of Theorem 2 it is crucial to show that \(\mathcal {P}\) and \(\mathcal {P}'\) have the same abnormality point. This needs to be proven in Case 1, Claim 1, and Cases 2a and 2c. In Cases 2a and 2c the proof reduces to the proofs for Case 1 and Claim 1, respectively. In Claim 1 the proof also reduces to the proof for Case 1, but \(\xi _{p-1}(d)\) is new in the definition of \(\varepsilon \) as compared to Case 1, so we provide an appropriate comment about \(\xi _{p-1}(d)\) in Claim 1. Finally, in Case 1 we explicitly prove that \(\mathcal {P}\) and \(\mathcal {P}'\) have the same abnormality point.
6 Alternating chains
In this section we prove that there always exists an optimal normal schedule for \(P2\mid pmtn,in\text {-}tree,r_j,p_j{=}1\mid \sum C_j\). Our proof is by contradiction. We show that an abnormality point \(i\ne \infty \) in a maximal schedule implies that there is an alternating chain (see Sect. 6.1 for its definition) in the schedule. Each job in that chain completes at a moment that is not i-normal. This fact allows us either to make the alternating chain longer, which is shown in Sect. 6.3, or to find an optimal schedule with an abnormality point higher than i. Thus in either case we get a contradiction: in the former, since the number of jobs is finite and we cannot extend the chain ad infinitum; in the latter, since the initial schedule is maximal.
6.1 Basic definitions and properties
Given schedule \((\mathcal {P},{\varvec{e}},{\varvec{\xi }})\), let \(I=\{j,\ldots ,j'\}\subseteq \{1,\ldots ,q-1\}\). For two jobs a and \(a'\), we say that a covers \(a'\) in I if for each \(t\in I\), \(\xi _t(a')>0\) implies \(\xi _t(a)=e_{t+1}-e_t\). Let \(\mathcal {P}\) be a maximal schedule of abnormality point \(i\ne \infty \). Job sequence \((d_1,\ldots ,d_l)\) is called an alternating chain in \(\mathcal {P}\) if \(d_1\in A_i(\mathcal {P})\) and \(d_1\) executes nonpreemptively in \([e_{i+1},C(\mathcal {P},d_1)]\) and additionally, unless \(l=1\), the following conditions are satisfied:

(C1)
\(A_i(\mathcal {P})=\{d_1,d_2\}\) and \(\tau (d_1)=i+1\),

(C2)
\(C(\mathcal {P},d_j)<C(\mathcal {P},d_{j+1})\) for each \(j\in \{1,\ldots ,l-1\}\).

(C3)
the job \(d_j\) executes nonpreemptively in the interval \([C(\mathcal {P},d_{j-2}),C(\mathcal {P},d_j)]\) for each \(j\in \{2,\ldots ,l\}\), where \(C(\mathcal {P},d_0)=e_{i+1}\).
If \((d_1,\ldots ,d_l)\) (\(l\ge 2\)) satisfies (C1), (C3) and

(C2’)
\(C(\mathcal {P},d_j)<C(\mathcal {P},d_{j+1})\) for each \(j\in \{1,\ldots ,l-2\}\) and \(C(\mathcal {P},d_{l-1})=C(\mathcal {P},d_l)\),
then \((d_1,\ldots ,d_l)\) is said to be almost alternating.
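The chain conditions above, and all proofs below, repeatedly use the arithmetic of normal time values. As a reading aid, here is a minimal sketch in code, assuming (consistently with the calculations in this section) that a value is i-normal exactly when it is an integer multiple of \(1/2^i\); the function name is ours.

```python
from fractions import Fraction

def is_normal(t: Fraction, i: int) -> bool:
    # t is i-normal iff t is an integer multiple of 1/2^i,
    # i.e. t * 2^i is an integer.
    return (t * 2 ** i).denominator == 1

# i-normality weakens as i grows: every i-normal value is also
# (i+1)-normal, since 1/2^i = 2 * (1/2^(i+1)).
assert is_normal(Fraction(3, 4), 2) and is_normal(Fraction(3, 4), 3)
# ...but not conversely: 1/8 is 3-normal yet not 2-normal.
assert is_normal(Fraction(1, 8), 3) and not is_normal(Fraction(1, 8), 2)
```

Sums and differences of i-normal values are again i-normal, which is the closure property the lemmas below exploit.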
The following lemma excludes almost alternating chains from maximal schedules. The lemma does not require that the maximal schedules are \(\text {A}\)-free.
Lemma 14
Let \(\mathcal {P}\) be a maximal schedule of abnormality point \(i\ne \infty \). There exists no almost alternating chain in \(\mathcal {P}\).
Proof
Suppose for a contradiction that \((d_1,\ldots ,d_l)\) is an alternating chain with \(C(\mathcal {P},d_{l-1})=C(\mathcal {P},d_l)\). First, let \(l>3\) and let U be the set of odd indices in \(\{3,\ldots ,l\}\). Denote by \({\varvec{\xi }}\) the partition of \(\mathcal {P}\). We first prove that the total length of \(d_2\) executed in \([C(\mathcal {P},d_1),C(\mathcal {P},d_l)]\), namely \(\xi _{i+2}(d_2)\), is i-normal. From the definition of alternating chain we know that there is no idle time in this interval and only the jobs from the chain execute in it. Hence,
By Lemma 10, \(\sum _{j'=1}^i\xi _{j'}(d_j)\) is i-normal for each \(j\in \{3,\ldots ,l\}\). Therefore, \(\xi _{i+2}(d_2)\) is i-normal. This implies, by Lemmas 10 and 11, that the three following numbers are i-normal:
since \(\xi _{i+1}(d_1)=\xi _{i+1}(d_2)\). Therefore, \(\xi _i(d_1)\) and \(\xi _i(d_2)\) are \((i+1)\)-normal, and since \(e_{i+1}\) is i-normal due to Lemma 3, i is not the abnormality point of \(\mathcal {P}\)—a contradiction.
Now consider \(l=2\). Let \({\varvec{e}}\) be the events of \(\mathcal {P}\). Denote \(x=\xi _{i+1}(d_1)=\xi _{i+1}(d_2)\). By assumption and by the definition of alternating chain, \(d_1\) and \(d_2\) complete at \(e_{i+2}\), and hence \(\xi _i(d_1)+x=s_1/2^i\) and \(\xi _i(d_2)+x=s_2/2^i\) for some integers \(s_1\) and \(s_2\). By the definition of alternating chain, \(A_i(\mathcal {P})=\{d_1,d_2\}\) and hence x is not \((i+1)\)-normal. Thus, \(x=s'/2^{i+1}+\varepsilon \) for some integer \(s'\) and some \(0<\varepsilon <1/2^{i+1}\). Then, \(\xi _i(d_1)=(2s_1-s')/2^{i+1}-\varepsilon \) and \(\xi _i(d_2)=(2s_2-s')/2^{i+1}-\varepsilon \). By Lemma 11, \(e_{i+1}-e_{i}\) is i-normal. By Lemma 12,
Thus, \(2\varepsilon \) is i-normal, which implies that \(\varepsilon \) is \((i+1)\)-normal—a contradiction with \(0<\varepsilon <1/2^{i+1}\). \(\square \)
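The final step of the \(l=2\) case can also be checked mechanically: no dyadic \(\varepsilon \) with \(0<\varepsilon <1/2^{i+1}\) makes \(2\varepsilon \) i-normal. A small exhaustive sketch over dyadic candidates (our own encoding, with i-normal meaning an integer multiple of \(1/2^i\)):

```python
from fractions import Fraction

def is_normal(t: Fraction, i: int) -> bool:
    # t is i-normal iff t is an integer multiple of 1/2^i
    return (t * 2 ** i).denominator == 1

i = 3
# dyadic candidates 0 < eps < 1/2^(i+1), denominators up to 2^(i+6)
candidates = [
    Fraction(k, 2 ** m)
    for m in range(i + 2, i + 7)
    for k in range(1, 2 ** m)
    if Fraction(k, 2 ** m) < Fraction(1, 2 ** (i + 1))
]
# were 2*eps i-normal, eps would be (i+1)-normal, which is impossible
# anywhere in the open interval (0, 1/2^(i+1))
assert all(not is_normal(2 * eps, i) for eps in candidates)
```

The same check succeeds for any i; the choice \(i=3\) is only to keep the enumeration small.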
Lemma 15
Let \(\mathcal {P}\) be a maximal schedule of abnormality point \(i\ne \infty \). If \((d_1,\ldots ,d_l)\) (\(l\ge 1\)) is an alternating chain in \(\mathcal {P}\), then \(C(\mathcal {P},d_j)\) is not \((i+1)\)-normal for each \(j\in \{1,\ldots ,l\}\).
Proof
First, let \(1\le j\le \min \{2,l\}\). Then, \(C(\mathcal {P},d_j) = e_{i+1}+1-\xi _i(d_j)-\sum _{t<i}\xi _t(d_j)\). By Lemma 10, the latter sum is i-normal. By Observations 1 and 2 and by Lemma 11, \(e_{i+1}\) is i-normal. However, \(\xi _i(d_j)\) is not \((i+1)\)-normal because \(d_j\in A_i(\mathcal {P})\) and hence, again by Observation 1, \(C(\mathcal {P},d_j)\) is not \((i+1)\)-normal.
For \(j>2\), if j is even (respectively, odd), then let U be the set of even (respectively, odd) integers in \(\{1,\ldots ,j-1\}\). Let \(u=1\) if j is odd and let \(u=2\) if j is even. Then,
Again, by Lemma 10, \(\xi _i(d_u)\) is the only term in the above expression that is not \((i+1)\)-normal. Thus, \(C(\mathcal {P},d_j)\) is not \((i+1)\)-normal. \(\square \)
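The "only term that is not \((i+1)\)-normal" step rests on the fact that adding \((i+1)\)-normal values to a non-\((i+1)\)-normal one cannot restore normality. A small numeric sketch (our own example values, with i-normal meaning an integer multiple of \(1/2^i\)):

```python
from fractions import Fraction

def is_normal(t: Fraction, i: int) -> bool:
    # t is i-normal iff t is an integer multiple of 1/2^i
    return (t * 2 ** i).denominator == 1

i = 2
# a few (i+1)-normal terms, i.e. multiples of 1/8 here
normal_terms = [Fraction(1, 4), Fraction(3, 8), Fraction(1, 2)]
bad = Fraction(1, 16)  # not (i+1)-normal

assert all(is_normal(t, i + 1) for t in normal_terms)
assert not is_normal(bad, i + 1)
# the sum inherits the single abnormal term's non-normality
assert not is_normal(sum(normal_terms) + bad, i + 1)
```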
6.2 Transformations using alternating chains
Consider a schedule \(\mathcal {P}\) of abnormality point i with an alternating chain \((d_1,\ldots ,d_l)\) (\(l>1\)) such that \(\mathcal {J}(\xi _{\tau (d_l)})=\{x,y,d_l\}\). Let \(u=2\) if l is odd, and \(u=1\) if l is even. Let \(\varepsilon >0\) be the largest value such that
where U is the set of the indices in \(\{1,\ldots ,l\}\) having the same parity as l, and
We define a transformation called \(\varepsilon \)-pushing of \(d_l\) that produces a schedule \(\mathcal {P}'\) as follows (see Fig. 15 for an illustration):

the schedules \(\mathcal {P}\) and \(\mathcal {P}'\) are identical in time intervals \([0,e_j]\) and \([C(\mathcal {P},d_l),\infty )\).

To obtain the part of \(\mathcal {P}'\) in \([e_i,e_{i+1}]\), we increase (with respect to \(\mathcal {P}\)) the amount of \(d_{3-u}\) by \(\varepsilon \) and decrease the amount of \(d_u\) by \(\varepsilon \). Then, a part of job \(d_{3-u}\) executes in \([e_{i+1},C(\mathcal {P},d_{3-u})-\varepsilon ]\) and a part of job \(d_u\) executes in \([e_{i+1},C(\mathcal {P},d_u)+\varepsilon ]\). This in particular characterizes the execution of \(d_1\) and \(d_2\) in \(\mathcal {P}'\).

For each \(j\in U\setminus \{1,2\}\), the part of \(d_j\) that executes in \([C(\mathcal {P},d_{j-2}),C(\mathcal {P},d_j)]\) in \(\mathcal {P}\) is executed in \([C(\mathcal {P},d_{j-2})-\varepsilon ,C(\mathcal {P},d_j)-\varepsilon ]\) in \(\mathcal {P}'\). In this way we ensure that each job \(d_j\), \(j\in U\), completes \(\varepsilon \) earlier in \(\mathcal {P}'\) than in \(\mathcal {P}\).

For each \(j\in \{3,\ldots ,l\}\setminus U\), the part of \(d_j\) that executes in \([C(\mathcal {P},d_{j-2}),C(\mathcal {P},d_j)]\) in \(\mathcal {P}\) is executed in \([C(\mathcal {P},d_{j-2})+\varepsilon ,C(\mathcal {P},d_j)+\varepsilon ]\) in \(\mathcal {P}'\). In this way we ensure that each job \(d_j\), \(j\notin U\), completes \(\varepsilon \) later in \(\mathcal {P}'\) than in \(\mathcal {P}\).

Finally, the two jobs x and y are executed in the remaining free space in \([C(\mathcal {P}',d_{l-1}),C(\mathcal {P},d_l)]\) on one machine and in \([C(\mathcal {P}',d_l),C(\mathcal {P},d_l)]\) on the other machine.
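It is worth sanity-checking the effect of this transformation on the objective: chain jobs with indices in U complete \(\varepsilon \) earlier and the remaining chain jobs \(\varepsilon \) later, while x and y keep their completion times. The following sketch (our own bookkeeping of the shifts described above) confirms that the net change depends only on the parity of l:

```python
from fractions import Fraction

def completion_shift_total(l: int, eps: Fraction) -> Fraction:
    # U: indices in {1,...,l} with the same parity as l; these chain
    # jobs complete eps earlier, the remaining chain jobs eps later.
    U = {j for j in range(1, l + 1) if j % 2 == l % 2}
    return -eps * len(U) + eps * (l - len(U))

eps = Fraction(1, 8)
# even chain length: the gains and losses cancel exactly
assert completion_shift_total(4, eps) == 0
# odd chain length: the total completion time would strictly decrease
assert completion_shift_total(5, eps) == -eps
```

This is exactly the parity observation invoked in the proof of Lemma 16 to conclude that l must be even in an optimal schedule.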
The \(\varepsilon \)-pushing of \(d_l\) will be a key transformation used to extend an alternating chain of a maximal schedule \(\mathcal {P}\) in the proof of Theorem 3—the main result of the next section. The extension, as alluded to earlier, requires that \(\mathcal {P}\) is \(\text {A}\)-free. Actually, it suffices that \(\mathcal {P}\) is \(\text {A}\)-free in the interval that starts with the completion of \(d_l\), the last job of the chain. However, since the \(\varepsilon \)-pushing of \(d_l\) may change \(\mathcal {P}\) itself, we need to prove that the resulting schedule is \(\text {A}\)-free in the interval that starts with the completion of \(d_l\), which the transformation may have changed, in order to enable further extensions of the chain. Thus, we need the following lemma.
Lemma 16
Suppose that \((d_1,\ldots ,d_l)\) (where \(l>1\)) is an alternating chain in a maximal schedule \(\mathcal {P}\) that is \(\text {A}\)-free in \([C(\mathcal {P},d_l),\infty )\), and \(\mathcal {J}(\xi _{\tau (d_l)})=\{x,y,d_l\}\). If \(\varepsilon \) is selected as in (25) and (26), then the schedule \(\mathcal {P}'\) obtained from \(\mathcal {P}\) by \(\varepsilon \)-pushing of \(d_l\) is maximal, \(\text {A}\)-free in \([C(\mathcal {P}',d_l),\infty )\), and \((d_1,\ldots ,d_l)\) is an alternating chain in \(\mathcal {P}'\).
Proof
An \(\varepsilon \)-pushing of \(d_l\) results in a feasible schedule \(\mathcal {P}'\) (note that at most one of \(d_1\) and \(d_2\) can have its release date in \([e_i,e_{i+1}]\)) with the total completion time not exceeding that of \(\mathcal {P}\). Thus, \(\mathcal {P}'\) is optimal. Note that an odd l would result in \(\mathcal {P}'\) having a smaller total completion time than that of \(\mathcal {P}\). Thus, l is even. If the \(\varepsilon \) makes at least one of the three inequalities in (25) an equality, then the ith block becomes i-normal and we get a contradiction with the maximality of \(\mathcal {P}\). On the other hand, if the \(\varepsilon \) makes all three inequalities in (25) hold strictly, then \((d_1,\ldots ,d_l)\), \(l>1\), is an alternating chain in \(\mathcal {P}'\).
We prove, by contradiction, that the schedule \(\mathcal {P}'\) is \(\text {A}\)-free in \([C(\mathcal {P}',d_l),\infty )\). Suppose that some jobs a and b form an \(\text {A}\)-configuration at a point \(t\ge C(\mathcal {P}',d_l)\) in \(\mathcal {P}'\). Note that \(t\ge C(\mathcal {P},d_l)\) is not possible because \(\mathcal {P}\) and \(\mathcal {P}'\) are identical from \(C(\mathcal {P},d_l)\) on and \(\mathcal {P}\) is \(\text {A}\)-free in \([C(\mathcal {P},d_l),\infty )\) by assumption. Thus, \(t=C(\mathcal {P}',d_l)\). Therefore, \(a=d_{l}\) and \(b\in \{x,y\}\).
Let \(\lambda \in [0,C(\mathcal {P}',b)-C(\mathcal {P}',d_l)]\) be a maximal real number such that for each \(\varepsilon '\in [0,\lambda )\) there exists a feasible schedule \(\mathcal {P}_{\varepsilon '}\) such that \((d_1,\ldots ,d_l)\) is an alternating chain in \(\mathcal {P}_{\varepsilon '}\) and \(\varepsilon '\)-pushing of \(d_l\) in \(\mathcal {P}_{\varepsilon '}\) results in \(\mathcal {P}\). Since l is even, the total completion time of \(\mathcal {P}_{\varepsilon '}\) is the same as the total completion time of \(\mathcal {P}'\) for each \(\varepsilon '\in [0,\lambda )\). Informally speaking, \(\mathcal {P}_{\varepsilon '}\) is obtained by performing a modification that is ‘opposite’ to the pushing of \(d_l\). By the definition of \(\text {A}\)-configuration, \(d_l\) is independent of any job that executes in the interval \((C(\mathcal {P}',d_l),C(\mathcal {P}',b))\) in \(\mathcal {P}'\). Thus, the maximality of \(\lambda \) implies that taking \(\varepsilon '=\lambda \) would result in a schedule \(\mathcal {P}_{\varepsilon '}\) in which one of the following holds:

Job sequence \((d_1,\ldots ,d_l)\) is not an alternating chain in \(\mathcal {P}_{\varepsilon '}\). Then we have two possibilities. The first possibility is that \(t=C(\mathcal {P}_{\varepsilon '},d_j)=C(\mathcal {P}_{\varepsilon '},d_{j+1})\) for some \(j\in \{1,\ldots ,l-2\}\). Then, \((d_1,\ldots ,d_{j+1})\) is an almost alternating chain—a contradiction with Lemma 14. The second possibility is that either \(d_1\) or \(d_2\) is not present in the ith block of \(\mathcal {P}_{\varepsilon '}\). Then, the abnormality point of \(\mathcal {P}_{\varepsilon '}\) is greater than i—a contradiction with the maximality of \(\mathcal {P}\).

\(C(\mathcal {P}_{\varepsilon '},d_l)=C(\mathcal {P},b)\). This would imply that \(C(\mathcal {P}_{\varepsilon '},b)<C(\mathcal {P},b)\), which is not possible due to the optimality of \(\mathcal {P}\).

\(s(\mathcal {P}_{\varepsilon '},d_j)=r(d_j)\) for some \(j\in \{3,\ldots ,l-1\}\). In this case a contradiction follows from Lemma 15.

\(s(\mathcal {P}_{\varepsilon '},b)=r(b)\). Since \(C(\mathcal {P}_{\varepsilon '},d_{l-1})=s(\mathcal {P}_{\varepsilon '},b)\), we again obtain a contradiction with Lemma 15.
Therefore, the lemma is proved. \(\square \)
Finally, we observe that the \(\varepsilon \)-pushing of \(d_l\) can readily be extended to the case where one of the jobs x or y starts in \((C(\mathcal {P},d_{l-1}),C(\mathcal {P},d_{l}))\) but neither of them completes in that interval.
6.3 Extending an alternating chain
We first prove that a single-job alternating chain is present in each maximal (and thus in each \(\text {A}\)-free maximal) schedule of abnormality point \(i\ne \infty \).
Proposition 10
If \(\mathcal {P}\) is a maximal schedule of abnormality point \(i\ne \infty \), then a job in \(A_i(\mathcal {P})\) with minimum completion time forms an alternating chain in \(\mathcal {P}\).
Proof
Suppose that schedule \((\mathcal {P},{\varvec{e}},{\varvec{\xi }})\) is maximal. According to Observations 1 and 2 and Lemma 11, \(e_{i+1}\) is i-normal. By Lemma 12, we have \(|A_i(\mathcal {P})|=2\). Let \(A_i(\mathcal {P})=\{b,c\}\), where \(C(\mathcal {P},b)\le C(\mathcal {P},c)\). Denote \(I=\{i+1,\ldots ,\tau (b)\}\). Note that, by Lemma 10, \(I\ne \emptyset \).
We prove the proposition by contradiction. More precisely, the assumption that b does not form an alternating chain in \(\mathcal {P}\) allows us to conclude that \(\mathcal {P}\) is not maximal. We may assume without loss of generality that c covers b in I. Indeed, if this is not the case, then we transform \(\mathcal {P}\) as follows. Let \(t\in I\) be such that \(\xi _t(b)>0\) and job c is non-spanning in block t. Take \(\varepsilon =\min \{\xi _t(b),\xi _i(c),e_{t+1}-e_t-\xi _t(c)\}\). Note that \(\varepsilon >0\) and, by Lemmas 4 and 7, \(\xi _i(c)=e_{i+1}-e_i-\xi _i(b)\). The schedule obtained by the transformation
has the same total completion time and the same events as \(\mathcal {P}\) and either: \(\xi _i'(b)=e_{i+1}-e_i\) (which happens when \(\varepsilon =\xi _i(c)\)); or \(\xi _t'(b)=0\) (which happens when \(\varepsilon =\xi _t(b)\)); or \(\xi _t'(c)=e_{t+1}-e_t\) (which happens when \(\varepsilon =e_{t+1}-e_t-\xi _t(c)\)). In the first case we would obtain a schedule with abnormality point greater than i, which is not possible due to the maximality of \(\mathcal {P}\). After repeating this transformation as long as c does not cover b in I, we obtain the desired schedule.
Now we prove that \(\tau (b)=i+1\). Suppose for a contradiction that \(\tau (b)>i+1\). Thus, since c covers b in I, and \(e_{\tau (b)}\) is an event of \(\mathcal {P}\), we have \(\xi _{\tau (b)-1}(b)=\xi _{\tau (b)-1}(c)=0\) and there exists \(a\in \mathcal {J}\setminus \{b,c\}\) such that \(C(\mathcal {P},a)=e_{\tau (b)}\). Find the maximum j, \(j<\tau (b)-1\), such that \(\xi _j(b)\ne 0\). Note that \(j\ge i\). Since job c covers b in I, c is spanning in block j. By Lemma 7, \(a\notin \mathcal {J}(\xi _j)\). Lemma 9 applied to \(a=b\), \(a'=a\), j and \(j'=\tau (b)\) gives \(\tau (b)>j'=\tau (b)\)—a contradiction. This proves that (b) is an alternating chain in \(\mathcal {P}\). \(\square \)
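The covering transformation used in this proof swaps a piece of b of length \(\varepsilon \) between blocks t and i with a piece of c, leaving every block's load unchanged. A minimal sketch, with our own dictionary representation of a partition (the displayed formula for the transformation is paraphrased, not quoted):

```python
from fractions import Fraction

def cover_exchange(xi, e, b, c, i, t):
    # Move eps of job b from block t to block i, and eps of job c from
    # block i to block t; eps is the largest amount for which the move
    # stays feasible, as in the proof of Proposition 10.
    eps = min(xi[t][b], xi[i][c], (e[t + 1] - e[t]) - xi[t][c])
    xi[i][b] += eps
    xi[t][b] -= eps
    xi[i][c] -= eps
    xi[t][c] += eps
    return eps

# toy instance: two blocks with events 0, 1/2, 1
e = [Fraction(0), Fraction(1, 2), Fraction(1)]
xi = [
    {'b': Fraction(1, 4), 'c': Fraction(1, 4)},  # block 0 (plays the role of block i)
    {'b': Fraction(1, 8), 'c': Fraction(1, 4)},  # block 1 (plays the role of block t)
]
eps = cover_exchange(xi, e, 'b', 'c', 0, 1)
# here eps = xi_t(b) = 1/8, so b vanishes from block t
assert eps == Fraction(1, 8) and xi[1]['b'] == 0
# the per-block loads are preserved by the swap
assert xi[0]['b'] + xi[0]['c'] == Fraction(1, 2)
```

Each repetition of the swap reaches one of the three stopping conditions listed above, so the loop terminates.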
Proposition 11
Let \((d_1,\ldots ,d_l)\) be an alternating chain in a maximal schedule \(\mathcal {P}\). Then, there is no idle time in block \(\tau (d_{l-1})+1\) of \(\mathcal {P}\).
Proof
Let \({\varvec{e}}\) be the q events of \(\mathcal {P}\) and let \({\varvec{\xi }}\) be its partition. Let i be the abnormality point of \(\mathcal {P}\). Since \(\mathcal {P}\) has an alternating chain, \(i\ne \infty \). Suppose for a contradiction that there is idle time in the \((\tau (d_{l-1})+1)\)st block of \(\mathcal {P}\). By Lemma 3, at most one job completes in the \((\tau (d_{l-1})+1)\)st block of \(\mathcal {P}\). By Lemma 4, no job that does not complete in the \((\tau (d_{l-1})+1)\)st block can be present in this block. Thus, \(d_l\) is the only job in block \(\tau (d_{l-1})+1\) and the total length of the idle time is \(x=e_{\tau (d_{l-1})+2}-e_{\tau (d_{l-1})+1}\). Construct a schedule \(\mathcal {P}'\) by performing an \(\varepsilon \)-pushing of \(d_l\) in \(\mathcal {P}\) with
To complete the proof we observe that \(C(\mathcal {P}',d_j)=r(a)\) for some job a and some \(j\in \{3,\ldots ,l\}\) (when \(\varepsilon =\gamma \)), or \(C(\mathcal {P}',d_{j-1})=C(\mathcal {P}',d_j)\) for some \(j\in \{2,\ldots ,l\}\) (when \(\varepsilon \in \{\beta ,x/2\}\)), or there is no \(d_j\) in the ith block of \(\mathcal {P}'\) for some \(j\in \{1,2\}\) (when \(\varepsilon =\alpha \)). Therefore, the choice of \(\varepsilon \) always results in a schedule \(\mathcal {P}'\) that has all blocks j, \(j\in \{1,\ldots ,i\}\), being j-normal—a contradiction with the assumption that \(\mathcal {P}\) is maximal. \(\square \)
The next theorem states that if a maximal schedule \(\mathcal {P}\) with an alternating chain \((d_1,\ldots ,d_l)\) has no \(\text {A}\)-configuration at \(t\ge C(\mathcal {P},d_{l-1})\), then there exists another maximal schedule \(\mathcal {P}'\) with a longer alternating chain \((d_1,\ldots ,d_l,d_{l+1})\) and with no \(\text {A}\)-configuration at \(t\ge C(\mathcal {P}',d_l)\).
Theorem 3
Let \(\mathcal {P}\) be a maximal schedule of abnormality point \(i\ne \infty \). If \((d_1,\ldots ,d_l)\), \(l\ge 1\), is an alternating chain in \(\mathcal {P}\) and \(\mathcal {P}\) is \(\text {A}\)-free in \([C(\mathcal {P},d_{l-1}),\infty )\), where \(C(\mathcal {P},d_0)\) is the \((i+1)\)st event of \(\mathcal {P}\), then there exists a job \(d_{l+1}\) such that \((d_1,\ldots ,d_l,d_{l+1})\) is an alternating chain in some maximal schedule \(\mathcal {P}'\) that is \(\text {A}\)-free in \([C(\mathcal {P}',d_{l}),\infty )\).
We leave the proof of the theorem to the end of the section. Theorem 2 guarantees that a maximal \(\text {A}\)-free schedule exists for in-trees. If this schedule is not normal, then by Proposition 10 it contains a single-job alternating chain. Theorem 3 guarantees that the alternating chain can always be extended by one job. However, the process of extending the alternating chain may result in a schedule which is not \(\text {A}\)-free in general—the source of this lies in Lemma 16. Luckily, we do not need the resulting schedule to be \(\text {A}\)-free—it suffices that it has no \(\text {A}\)-configuration at the completion of \(d_l\), the last job of the alternating chain, or later—see the assumptions of Theorem 3. The above gives a sketch of the proof of the following theorem.
Theorem 4
There exists a normal optimal schedule for each instance of problem \(P2|pmtn,in\text {-}tree,r_j,p_j{=}1|\sum C_j\).
Proof
Take a maximal schedule \(\mathcal {P}\) and suppose for a contradiction that \(\mathcal {P}\) is not normal. Let \(i\ne \infty \) be its abnormality point. By Proposition 10, \(A_i(\mathcal {P})=\{d_1,d_2\}\), and \((d_1)\) is an alternating chain in \(\mathcal {P}\). Next, by Theorem 2, there is a maximal schedule \(\mathcal {P}^1\) with abnormality point i and alternating chain \((d_1)\) that is also \(\text {A}\)-free. Thus, in particular, \(\mathcal {P}^1\) is \(\text {A}\)-free in \([x,\infty )\), where x is the \((i+1)\)st event of \(\mathcal {P}^1\). Finally, by Theorem 3 and a simple inductive argument, there exists a schedule \(\mathcal {P}^{n+1}\) with an alternating chain of \(l=n+1\) jobs, contradicting the fact that the number of jobs equals n. \(\square \)
Corollary 1
For a given set of n jobs, there exists an optimal schedule for \(P2|pmtn,in\text {-}tree,r_j,p_j{=}1|\sum C_j\) such that each job start, preemption, resumption, or completion occurs at a time point that is a multiple of \(1/2^{2n}\).
6.3.1 Proof of Theorem 3
By Lemma 12, \(|A_i(\mathcal {P})|=2\). If \(l=1\), then take \(d_{l+1}\) to be the job in \(A_i(\mathcal {P})\setminus \{d_1\}\). If \(l>1\), then by Proposition 11 there is no idle time in the \((\tau (d_{l-1})+1)\)st block of \(\mathcal {P}\). Thus, \(|\mathcal {J}(\xi _{\tau (d_{l-1})+1})|>1\). Take \(d_{l+1}\) to be a job in \(\mathcal {J}(\xi _{\tau (d_{l-1})+1})\setminus \{d_l\}\) that starts or resumes at \(C(\mathcal {P},d_{l-1})\).
We will perform several schedule modifications leading to some maximal schedule \(\mathcal {P}'\) that is \(\text {A}\)-free in \([C(\mathcal {P}',d_l),\infty )\) and has an alternating chain \((d_1,\ldots ,d_{l+1})\). We point out that some steps of the proof redefine the job \(d_{l+1}\) selected above. By assumption, \(d_1\in A_i(\mathcal {P})\), and by the choice of \(d_2\), \(d_2\in A_i(\mathcal {P})\). Thus, (C1) follows for \((d_1,\ldots ,d_{l+1})\).
We now prove that
Note that if \(l=1\), then (27) follows from Proposition 10 and from the choice of \(d_1\) and \(d_2\). Thus, assume \(l\ge 2\) from now on.
We begin by proving, by contradiction, that \(d_{l+1}\) executes non-preemptively in \([C(\mathcal {P},d_{l-1}),C(\mathcal {P},d_{l})]\). First, we observe that no job, except for \(d_{l}\), completes in the interval \((C(\mathcal {P},d_{l-1}),C(\mathcal {P},d_{l})]\), for otherwise job \(d_{l+1}\) must complete in \((C(\mathcal {P},d_{l-1}),C(\mathcal {P},d_{l})]\) and thus jobs \(d_{l-1}\) and \(d_{l+1}\) form an \(\text {A}\)-configuration in \(\mathcal {P}\)—a contradiction since \(\mathcal {P}\) is \(\text {A}\)-free in \([C(\mathcal {P},d_{l-1}),\infty )\).
Second, \(|J\setminus \{d_l,d_{l+1}\}|\le 1\), where J is the set of jobs executed in \((C(\mathcal {P},d_{l-1}),C(\mathcal {P},d_{l})]\). Otherwise, there is a pair of jobs in \(\{x,y,d_l\}\), where \(\{x,y\} \subseteq J\setminus \{d_l,d_{l+1}\}\), that interlace—a contradiction by Lemma 5 and by the fact that \(d_l\) is preempted in \((C(\mathcal {P},d_{l-1}),C(\mathcal {P},d_{l})]\). Indeed, this pair of jobs consists of a job \(z\in \{x,y,d_l\}\) with minimum completion time among those three jobs, and a job in \(\{x,y,d_l\}\setminus \{z\}\) that is non-spanning in block \(\tau (z)\).
Finally, we show that without loss of generality \(J=\{d_l,d_{l+1}\}\). Suppose otherwise, i.e., \(J=\{x,d_l,d_{l+1}\}\). The \(\varepsilon \)-pushing of \(d_l\) with
results in a schedule \(\mathcal {P}'\) that either has all blocks j, \(j\in \{1,\ldots ,i\}\), being j-normal (this happens when \(\varepsilon \in \{\alpha ,\beta ,\gamma \}\))—a contradiction with the assumption that \(\mathcal {P}\) is maximal; or has \(d_l\) and \(d_{l+1}\) as the only two jobs executed in the interval \([C(\mathcal {P}',d_{l-1}),C(\mathcal {P}',d_{l})]\); or has \(d_l\) and x as the only two jobs executed in the interval \([C(\mathcal {P}',d_{l-1}),C(\mathcal {P}',d_{l})]\). (See Fig. 16a for this transformation when \(\varepsilon =C(\mathcal {P},d_{l})-C(\mathcal {P},d_{l-1})-\xi _{\tau (d_{l-1})+1}(d_{l+1})\).) In the latter case, i.e., when \(\varepsilon =\xi _{\tau (d_{l-1})+1}(d_{l+1})\), we take x as \(d_{l+1}\) from now on. The schedule \(\mathcal {P}'\) is maximal, by Lemma 16 it is \(\text {A}\)-free in \([C(\mathcal {P},d_{l}),\infty )\), and \((d_1,\ldots ,d_{l})\) is an alternating chain in \(\mathcal {P}'\). Thus, without loss of generality we can take \(\mathcal {P}\) to be \(\mathcal {P}'\) from now on. Then, \(d_{l+1}\) executes non-preemptively in \([C(\mathcal {P},d_{l-1}),C(\mathcal {P},d_{l})]\) and, by Lemma 14, (27) holds. Thus, (C2) holds for \((d_1,\ldots ,d_{l+1})\) in \(\mathcal {P}\).
We argue that there is no idle time in the \((\tau (d_l)+1)\)st block of \(\mathcal {P}\). Our argument is by contradiction. If there is idle time in the block, then by (27), \(d_{l+1}\) is the only job there. Thus, \(d_{l+1}\) completes at \(e_{\tau (d_l)+2}\) or there is a release-date pinned job a that starts at \(e_{\tau (d_l)+2}\), i.e., \(r(a)=e_{\tau (d_l)+2}\). In the former case we have an alternating chain \((d_1,\ldots ,d_{l+1})\) with idle time in block \(\tau (d_l)+1\), which contradicts Proposition 11 and completes the proof. In the latter case we use an extended \(\varepsilon \)-pushing of \(d_{l+1}\) (the operation of \(\varepsilon \)-pushing can be generalized in a straightforward way to the case when \(d_{l+1}\) is preempted in the interval \([C(\mathcal {P},d_{l}),C(\mathcal {P},d_{l+1})]\), see also Fig. 16b—we omit a formal definition of the extended pushing) with
which results in a schedule \(\mathcal {P}'\) that has all blocks j, \(j\in \{1,\ldots ,i\}\), being j-normal—a contradiction with the assumption that \(\mathcal {P}\) is maximal. (See Fig. 16b for an illustration of this schedule transformation when \(\varepsilon =r(a)-C(\mathcal {P},d_l)\).) Therefore, without loss of generality we may assume there is no idle time in block \(\tau (d_l)+1\), and thus some job \(a\ne d_{l+1}\) starts or resumes at \(e_{\tau (d_l)+1}\).
We next describe a finite iterative process that starts with \(\mathcal {P}\), produces a schedule \(\mathcal {P}_u\) in its uth iteration, \(u\ge 1\), and stops after \(T\ge 1\) iterations. We then show that if \(T=1\), then the schedule \(\mathcal {P}_1\) satisfies the conditions of the theorem. However, if \(T>1\), then we prove that there is a pair of jobs x and y that allows the iterative process to construct maximal schedules \(\mathcal {P}_1,\ldots ,\mathcal {P}_{T-1}\), each with alternating chain \((d_1,\ldots ,d_l)\), yet at the same time the pair prevents \(d_{l+1}\) from satisfying (C3) in \(\mathcal {P}_1,\ldots ,\mathcal {P}_{T-1}\). However, an exit after \(T>1\) iterations is only possible through a schedule \(\mathcal {P}_T\) such that either the total completion time of \(\mathcal {P}_T\) is smaller than that of \(\mathcal {P}\), or both schedules have the same total completion time but the abnormality point of \(\mathcal {P}_T\) is greater than i, which contradicts the maximality of \(\mathcal {P}\). Therefore the iterative process must exit after exactly one iteration, producing the desired schedule \(\mathcal {P}_1\)—more than one iteration leads to a contradiction.
In order to describe the iterative process formally, we introduce a key definition and related notation. We say that a schedule \((\mathcal {P}_u, {\varvec{\xi }}^u, {\varvec{e}}^u)\) with \(\tau _u=\tau _{\mathcal {P}_{u}}\) is \(d_{l+1}\)-preempted if there exists a pair of jobs x and y such that the following conditions are satisfied:

(I1)
\(\mathcal {P}_u\) is maximal, \((d_1,\ldots ,d_l)\) is an alternating chain in \(\mathcal {P}_u\), i is the abnormality point of \(\mathcal {P}_u\), \(d_l\) and \(d_{l+1}\) are the only two jobs executed in \([C(\mathcal {P}_u,d_{l-1}),C(\mathcal {P}_u,d_l)]\), and \(C(\mathcal {P}_u,d_{l+1})>C(\mathcal {P}_u,d_l)\);

(I2)
Some job \(a_u\) starts or resumes at \(C(\mathcal {P}_u,d_l)\);

(I3)
\(d_{l+1}\) covers each job in \(\{a_1,\ldots ,a_u\}\) in \(I_u=\{\tau _u(d_l)+1,\ldots ,k_u\}\), where \(k_u=\min \{\tau _u(a_u),\tau _{u}(d_{l+1})\}\);

(I4)
\(C(\mathcal {P}_u,a_u)>C(\mathcal {P}_u,d_{l+1})\);

(I5)
There exists \(j_u<\tau _{u}(d_{l+1})\) such that \(\mathcal {J}(\xi ^u_{j_u})=\{x,y\}\) and \(\min \{r(x),r(y)\}>C(\mathcal {P}_u,d_l)\).
In the following we prove that all schedules \(\mathcal {P}_1,\ldots ,\mathcal {P}_{T-1}\) are \(d_{l+1}\)-preempted when \(T>1\). Initially let \(u=1\); the iterative process is as follows. (\(\mathcal {P}_0\) refers to \(\mathcal {P}\).)
Step 1: Moving \(a_{u-1}\) from block \(\tau _{u-1}(d_l)+1\) to block \(k_{u-1}\).
If \(u=1\), then let \(\mathcal {P}_{u}'=\mathcal {P}\) and go to Step 2. If \(u>1\), then we construct \(\mathcal {P}_{u}'\) by an extended \(\varepsilon \)-pushing of \(d_{l+1}\) in \(\mathcal {P}_{u-1}\) and moving a piece of \(a_{u-1}\) of length \(\varepsilon \) from block \(\tau _{u-1}(d_l)+1\) to block \(k_{u-1}\), where
Denote by \({\varvec{e}}'\) and \({\varvec{\xi }}'\) the events and the partition of \(\mathcal {P}_{u}'\), respectively. Let for brevity \(\tau _{\mathcal {P}_{u}'}=\tau '\). Figure 17 depicts the transition from \(\mathcal {P}_{u-1}\) to \(\mathcal {P}_{u}'\) for \(u>1\). Note that \(\mathcal {P}_{u}'\) and \(\mathcal {P}_{u-1}\) have the same total completion time.
If \(\varepsilon \in \{\alpha , \beta , \gamma , \lceil C(\mathcal {P}'_u,d_l) \rceil - C(\mathcal {P}'_u,d_l)\}\), then the abnormality point of \(\mathcal {P}_u'\) is greater than i and, having the required contradiction, we stop the iterative process with \(T=u\).
For the two remaining values we have that the number of blocks in \([C(\mathcal {P}_{u}',d_l),C(\mathcal {P}_{u}',d_{l+1})]\) in \(\mathcal {P}_{u}'\) is one less than the number of blocks in \([C(\mathcal {P}_{u-1},d_l),C(\mathcal {P}_{u-1},d_{l+1})]\) in \(\mathcal {P}_{u-1}\). (We give an appropriate argument at the end of the proof.)
Also, if \(\varepsilon =e^{u-1}_{k_{u-1}+1}-e^{u-1}_{k_{u-1}}-\xi ^{u-1}_{k_{u-1}}(a_{u-1})\), then the total completion time of \(\mathcal {P}'_u\) is smaller than the total completion time of \(\mathcal {P}_{u-1}\) provided that \(\xi ^{u-1}_{k_{u-1}}(a_{u-1})=0\) and \(\xi ^{u-1}_{k_{u-1}-1}(d_{l+1})<e^{u-1}_{k_{u-1}}-e^{u-1}_{k_{u-1}-1}\), because \(d_{l+1}\) completes in \(\mathcal {P}_u'\) strictly prior to \(e_{\tau _u(d_{l+1})}=C(\mathcal {P}_u,d_{l+1})-\varepsilon \). Then, we stop the iterative process with \(T=u\). Otherwise, set \(a_u:=a_{u-1}\) and go to Step 2.
If \(\varepsilon =\xi ^{u-1}_{\tau _{u-1}(d_l)+1}(a_{u-1})\), then there is a job \(a_u\) that starts or resumes at \(C(\mathcal {P}_{u}',d_l)\). Go to Step 2.
Step 2: Making \(d_{l+1}\) cover \(a_u\).
Denote for brevity \(\tau '=\tau _{\mathcal {P}_u'}\) and let \({\varvec{e}}'\) and \({\varvec{\xi }}'\) be the events and the partition of \(\mathcal {P}_u'\), respectively. If \(d_{l+1}\) covers \(a_u\) in
then set \(\mathcal {P}_{u}=\mathcal {P}_{u}'\) and go to Step 3. Otherwise we obtain \(\mathcal {P}_{u}\) from \(\mathcal {P}'_{u}\) as follows. Find \(t\in I_u'\) such that \(\xi _t'(a_u)>0\) and \(\xi _t'(d_{l+1})<e_{t+1}'-e_t'\). By Lemma 5, \(t<\tau '(a_u)\), and by Lemma 3, \(t<\tau '(d_{l+1})\). Let
We construct \(\mathcal {P}_{u}\) with events \({\varvec{e}}^u\) and partition \({\varvec{\xi }}^u\), where
and then:

If \(\varepsilon '\in \{e'_{\tau '(d_l)+1}-r(a_u)-\xi '_{\tau '(d_{l})}(a_u),(e'_{\tau '(d_l)+1}-e'_{\tau '(d_l)})/2-\xi '_{\tau '(d_{l})}(a_u)\}\), then we do an \(\varepsilon ''\)-pushing of \(d_l\) in \(\mathcal {P}'_{u}\) with
$$\begin{aligned} \varepsilon ''=\min \left\{ \alpha , \beta , \gamma , \varepsilon '\right\} \end{aligned}$$to get a schedule \(\mathcal {P}_{u}\) that has all blocks j, \(j\in \{1,\ldots ,i\}\), being j-normal—in this case we stop the iterative process with \(T=u\). (This \(\varepsilon ''\)-pushing is shown in Fig. 16a with \(x=a_u\) and \(\varepsilon '=\varepsilon ''\).)

If \(\varepsilon '=\xi '_t(a_u)\), then \(\xi ^{u}_t(a_u)=0\), i.e., \(a_u\) is no longer in block t as required;

If \(\varepsilon '=e'_{t+1}-e'_t-\xi '_t(d_{l+1})\), then \(\xi ^{u}_t(d_{l+1})=e^{u}_{t+1}-e^{u}_t\) as required.
If \(d_{l+1}\) does not cover \(a_u\) in \(I_u\) and \(\varepsilon '>0\), then repeat Step 2 for \(\mathcal {P}_u\). Thus, either in the resulting schedule \(\mathcal {P}_{u}\), \(d_{l+1}\) covers \(a_u\) in \(I_u\), in which case go to Step 3, or \(d_{l+1}\) does not cover \(a_u\) in \(I_u\) and \(\varepsilon '=0\) (then \(\xi ^u_{\tau _u(d_l)}(a_u)=\xi ^u_{\tau _u(d_l)}(d_{l+1})\) or \(e^u_{\tau _u(d_l)+1}=r(a_u)\)), in which case the abnormality point of \(\mathcal {P}_u\) is greater than i and, having the required contradiction, we stop the iterative process with \(T=u\).
Step 3: Pushing \(a_u\) out of \([C(\mathcal {P}_u,d_{l-1}),C(\mathcal {P}_u,d_l)]\). If a part of \(a_u\) executes in \([C(\mathcal {P}_u,d_{l-1}),C(\mathcal {P}_u,d_l)]\) in \(\mathcal {P}_u\), then perform an \(\varepsilon ''\)-pushing of \(d_l\) in \(\mathcal {P}_{u}\) as in Fig. 16a with \(x=a_u\), where
If \(\varepsilon ''=\min \left\{ \alpha , \beta , \gamma , \lceil C(\mathcal {P}_{u},d_l) \rceil - C(\mathcal {P}_{u},d_l)\right\} \), then the abnormality point of the resulting schedule is greater than i and, having the required contradiction, we stop the iterative process with \(T=u\). If \((d_1,\ldots ,d_{l+1})\) is an alternating chain in \(\mathcal {P}_u\), then also stop with \(T=u\). Otherwise, go to Step 4.
Step 4: Moving to the next iteration. Set \(u:=u+1\) and return to Step 1.
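For orientation only, the control flow of Steps 1–4 can be summarized as a loop; the step bodies are abstracted into callbacks (all names here are ours), so this is a reading aid rather than an implementation of the transformations:

```python
def iterate(step1, step2, step3):
    # Skeleton of the iterative process: each step either transforms the
    # current schedule or signals termination (T = u) by returning stop=True.
    u, schedule = 1, None
    while True:
        schedule, stop = step1(u, schedule)   # Step 1: move a_{u-1} between blocks
        if stop:
            return u, schedule
        schedule, stop = step2(u, schedule)   # Step 2: make d_{l+1} cover a_u
        if stop:
            return u, schedule
        schedule, stop = step3(u, schedule)   # Step 3: push a_u out of the interval
        if stop:
            return u, schedule
        u += 1                                # Step 4: next iteration

# toy run that stops in Step 3 of the first iteration, i.e. T = 1
T, _ = iterate(
    lambda u, s: (s, False),
    lambda u, s: (s, False),
    lambda u, s: (s, True),
)
assert T == 1
```

The proof below argues precisely that every admissible run of this loop in fact terminates with \(T=1\).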
We now briefly sketch the remainder of the proof. For the time being let us assume that the iterative process ends after \(T\ge 1\) iterations, and that \(\varepsilon > 0\) in Step 1 for \(u>1\). We justify these two assumptions at the end of the proof. In the following, \(\mathcal {P}_u'\), \(\mathcal {P}_u''\), and \(\mathcal {P}_u\) refer to the schedules obtained at the end of Steps 1, 2 and 3, respectively, \(u\ge 1\).
Let \(u=1\). Note that \(\mathcal {P}_1'=\mathcal {P}\). Then either \(d_{l+1}\) executes without preemption in \([C(\mathcal {P}_1'',d_l),C(\mathcal {P}_1'',d_{l+1})]\) or not. In the former case, Step 3 ensures that \(d_{l+1}\) executes in \(\mathcal {P}_1\) without preemption in \([C(\mathcal {P}_1,d_{l-1}),C(\mathcal {P}_1,d_{l+1})]\). This implies that (C3) holds for \(\mathcal {P}_1\) and, by Claim 4 below, that \(\mathcal {P}_1\) is \(\text {A}\)-free, which proves the theorem. We then have \(T=1\). If \(d_{l+1}\) is preempted in \([C(\mathcal {P}_1'',d_l),C(\mathcal {P}_1'',d_{l+1})]\), then \(T>1\) and it suffices to show that in this case we get a contradiction. To that end we show that, if \(T>1\), then \(\mathcal {P}_1\) is \(d_{l+1}\)-preempted (see Claims 2, 3, and 5 below), and if \(\mathcal {P}_{u-1}\) is \(d_{l+1}\)-preempted, then \(\mathcal {P}_u\) is \(d_{l+1}\)-preempted as well, \(u\in \{2,\ldots ,T-1\}\) (see Claims 2, 3, and 6 below). This process of generating \(d_{l+1}\)-preempted schedules cannot continue ad infinitum since the process exits after T iterations. However, any exit schedule \(\mathcal {P}_T\) certifies that \(\mathcal {P}\) is not maximal, which gives the required contradiction.
We now proceed with details. Note that, for each \(u\in \{1,\ldots ,T\}\), (C1) and (C2) hold for \((d_1,\ldots ,d_{l+1})\) in \(\mathcal {P}_{u}\), and that (I1), (I2) and (I3) follow directly from the definition of the iterative process above:
Claim 2
Let \(T>1\). If \(u=1\), or \(u\in \{2,\ldots ,T-1\}\) and \(\mathcal {P}_{u-1}\) is \(d_{l+1}\)-preempted, then \(\mathcal {P}_{u}\) satisfies conditions (I1), (I2) and (I3).
Proof
Note that, by construction, the total completion times and abnormality points of the schedules \(\mathcal {P},\mathcal {P}_1,\ldots ,\mathcal {P}_{T-1}\) are the same. Thanks to Step 3 and the fact that \(u<T\), \(d_l\) and \(d_{l+1}\) are the only two jobs executed in \([C(\mathcal {P}_u,d_{l-1}),C(\mathcal {P}_u,d_l)]\) for each \(u\in \{1,\ldots ,T-1\}\). Finally, by Lemma 14, we have \(C(\mathcal {P}_u,d_{l+1})>C(\mathcal {P}_u,d_l)\) for \(u=1\), and by (I5) for schedule \(\mathcal {P}_{u-1}\) we have \(C(\mathcal {P}_{u-1},d_{l})<\min \{r(x),r(y)\}<C(\mathcal {P}_{u-1},d_{l+1})\) for \(u>1\). By construction, \(C(\mathcal {P}_{u},d_{l}) <\lceil C(\mathcal {P}_{u-1},d_{l}) \rceil \), and there is \(j_u<\tau _{u}(d_{l+1})\) such that \(\mathcal {J}(\xi ^u_{j_u})=\{x,y\}\). Thus, \(C(\mathcal {P}_{u},d_{l})<\min \{r(x),r(y)\}<C(\mathcal {P}_{u},d_{l+1})\). Therefore, \(\mathcal {P}_u\) satisfies (I1).
By Proposition 11, there is no idle time in block \(\tau _{\mathcal {P}}(d_l)+1\) in \(\mathcal {P}\). Thus the choice of \(a_1\) ensures that it starts or resumes at \(C(\mathcal {P}_1,d_l)\) for \(u=1\). For \(u>1\), the job \(a_u\) always exists because there is no idle time in block \(\tau _{u}(d_l)+1\) in \(\mathcal {P}_u\). This follows from the fact that otherwise, by construction, there would be idle time in \([C(\mathcal {P}_{u-1},d_{l}), C(\mathcal {P}_{u-1},d_{l+1})]\) in \(\mathcal {P}_{u-1}\), which, since (I3) and (I4) hold for \(\mathcal {P}_{u-1}\), implies that \(a_{u-1}\) can be completed earlier in \(\mathcal {P}_{u-1}\), which contradicts its optimality. Hence, \(\mathcal {P}_u\) satisfies (I2).
Finally, Steps 2 and 3 and \(u<T\) ensure that (I3) holds for \(\mathcal {P}_u\). \(\square \)
Claim 3
Let \(T>1\). If \(u=1\), or \(u\in \{2,\ldots ,T-1\}\) and \(\mathcal {P}_{u-1}\) is \(d_{l+1}\)-preempted, then \(\mathcal {P}_{u}\) satisfies condition (I4).
Proof
It suffices to argue that \(k_u\ne \tau _u(a_u)\) (29).
Denote for brevity \(k=k_u\). Suppose for a contradiction that \(k=\tau _u(a_u)\). By Lemma 3, \(\xi _k^u(a_u)=e^u_{k+1}-e^u_k\). By Claim 2, (I3) holds for \(\mathcal {P}_u\). Thus, \(d_{l+1}\) covers \(a_u\) in \(I_u\) and, by Lemma 5, \(\xi _k^u(d_{l+1})=e^u_{k+1}-e^u_k\). Hence, there exists \(a'\in \mathcal {J}\) such that \(C(\mathcal {P}_{u},a')=e^u_{k}\), because \(e^u_k\) is an event in \(\mathcal {P}_u\).
If \(a'\ne d_l\), then \(\tau _u(d_l)<k\) and \(\xi ^u_{k-1}(a_u)=0\) because, again, \(d_{l+1}\) covers \(a_u\) in \(I_u\). Then, let \(j<k-1\) be such that \(\xi ^u_j(a_u)>0\) and \(\xi ^u_{j'}(a_u)=0\) for each \(j'\in \{j+1,\ldots ,k-1\}\). By (28) such a j exists. By Lemma 7 and the fact that \(d_{l+1}\) covers \(a_u\) in \(I_u\), \(\xi ^u_j(a_u)=\xi ^u_j(d_{l+1})=e^u_{j+1}-e^u_j\), which implies \(\xi ^u_j(a')=0\). Lemma 9, applied to \(a=a_u\), \(a'\), j and \(j'=k\), leads to a contradiction.
It remains to consider the case when \(a'=d_l\). We have that \(u>1\), because otherwise the jobs \(d_l\) and \(a_1\) form an \(\text {A}\)-configuration in \(\mathcal {P}\)—a contradiction since \(\mathcal {P}\) is \(\text {A}\)-free. We have
The first inequality follows from (I4) for \(\mathcal {P}_{u-1}\), which is \(d_{l+1}\)-preempted; the second inequality follows by construction of \(\mathcal {P}_u\), while the last one holds by the assumption that \(k=\tau _u(a_u)\). Moreover, the construction ensures that the inequality \(C(\mathcal {P}_u,d_{l+1})\ge C(\mathcal {P}_u,a_u)\) implies
Thus, by (30), we have \(C(\mathcal {P}_{u-1},a_{u-1})>C(\mathcal {P}_{u-1},a_u)\). Therefore, (31) and conditions (I2) and (I3) for \(\mathcal {P}_{u-1}\) imply that \(a_{u}\) and \(a_{u-1}\) interlace in \(\mathcal {P}_{u-1}\). Finally, by (I1) applied to \(\mathcal {P}_{u-1}\), \(\mathcal {P}_{u-1}\) is optimal, and hence we arrive at a contradiction with Lemma 5. Hence, (29) follows, and the proof of the claim is completed. \(\square \)
Claim 4
If the job \(d_{l+1}\) executes with no preemption in the interval \([C(\mathcal {P}_1,d_{l-1}),C(\mathcal {P}_1,d_{l+1})]\) in \(\mathcal {P}_1\), then \(\mathcal {P}_1\) is \(\text {A}\)-free in \([C(\mathcal {P}_1,d_{l}),\infty )\).
Proof
In view of Lemma 16, it suffices to prove that \(\mathcal {P}''_{1}\) at the end of Step 2 is \(\text {A}\)-free in \([C(\mathcal {P}'_1,d_{l}),\infty )\), if the step is not vacuous. For convenience, let \(\tau \equiv \tau _{\mathcal {P}''_1}\) and \(k=\min \{\tau (a_1),\tau (d_{l+1})\}\). Let \({\varvec{e}}''\) and \({\varvec{\xi }}''\) be the events and the partition of \(\mathcal {P}_{1}''\), respectively. We start with an observation concerning the construction of \(\mathcal {P}''_1\): the sequence of events, that is, the start and completion times of jobs, is the same in \(\mathcal {P}=\mathcal {P}'_1\) and \(\mathcal {P}''_{1}\). More precisely,

(B1)
\(\mathcal {J}(\xi _j)=\mathcal {J}(\xi ''_j)\) for \(j\in \{1,\ldots ,\tau (d_{l})-1\}\) and \(j\ge k+1\);

(B2)
\(\mathcal {J}(\xi _{\tau _{\mathcal {P}}(d_{l})})=\{d_l,d_{l+1}\}\) and \(\mathcal {J}(\xi ''_{\tau (d_{l})})=\{d_l,d_{l+1},a_1\}\);

(B3)
\(\xi ''_j(x)=\xi _j(x)\) for \(x\notin \{a_1,d_{l+1}\}\), \(\xi ''_j(a_1)\le \xi _j(a_1)\), \(\xi ''_j(d_{l+1})\ge \xi _j(d_{l+1})\), and \(\xi ''_j(a_1)+\xi ''_j(d_{l+1})=\xi _j(a_1)+\xi _j(d_{l+1})\) for each \(j\in \{\tau (d_l)+1,\ldots ,k\}\).
Suppose for a contradiction that schedule \(\mathcal {P}''_{1}\) is not \(\text {A}\)-free in the interval \([C(\mathcal {P}''_{1},d_{l}),\infty )\). Then, by (B1)–(B3) and since the schedule \(\mathcal {P}\) is \(\text {A}\)-free in \([C(\mathcal {P},d_{l-1}),\infty )\), \(a=a_1\) must be one of the two jobs that form an \(\text {A}\)-configuration in \(\mathcal {P}''_{1}\).
Let a and a job x form an \(\text {A}\)-configuration at \(e^{''}_{\tau (a)}\) in \(\mathcal {P}''_{1}\). By the definition of \(\text {A}\)-configuration, x is not preempted in \((e''_j, e''_{\tau (a)}]\) for some \(j<\tau (a)\) in \(\mathcal {P}''_{1}\) and \(\xi ''_{j-1}(x)<e''_{j}-e''_{j-1}\). By (B2), \(\tau (d_l)+1\le j\). Thus, \(\tau (d_l)+1\le j \le k\), for otherwise, by (B1), a and x form an \(\text {A}\)-configuration in \(\mathcal {P}\)—a contradiction. Moreover, there is a block \(t\in \{j,\ldots ,k\}\) in \(\mathcal {P}\) with \(\{a,x\}\subseteq \mathcal {J}(\xi _t)\), for otherwise again a and x form an \(\text {A}\)-configuration in \(\mathcal {P}\)—a contradiction. Therefore, if \(x=d_{l+1}\), then \(\xi _t(a)>0\) and \(\xi _t(d_{l+1})=e_{t+1}-e_t\) in \(\mathcal {P}\), and it remains so in \(\mathcal {P}''_{1}\), which follows from the transformation in Step 1. Thus, a and \(d_{l+1}\) do not form an \(\text {A}\)-configuration in \(\mathcal {P}''_{1}\), which contradicts our assumption that \(x=d_{l+1}\). If \(x\ne d_{l+1}\), then \(\xi _t(d_{l+1})<e_{t+1}-e_t\), and hence a job \(a'\in \{a,x\}\) with \(\xi _{\tau _{\mathcal {P}}(d_{l+1})}(d_{l+1})<e_{\tau _{\mathcal {P}}(d_{l+1})+1}-e_{\tau _{\mathcal {P}}(d_{l+1})}\) and \(d_{l+1}\) interlace in \(\mathcal {P}\) when \(\tau (a)>\tau (d_{l+1})=k\)—a contradiction with Lemma 5. Thus, it remains to consider \(\tau (a)\le \tau (d_{l+1})\). By Step 3, no preemption of \(d_{l+1}\) in \([C(\mathcal {P}_1,d_{l-1}),C(\mathcal {P}_1,d_{l+1})]\) in \(\mathcal {P}_1\) implies no preemption of \(d_{l+1}\) in \([C(\mathcal {P}''_1,d_{l}),C(\mathcal {P}''_1,d_{l+1})]\) in \(\mathcal {P}''_1\). Therefore, the jobs a and x interlace in \(\mathcal {P}''_1\) if \(\tau (d_l)\ge \tau (a)\)—a contradiction by Lemma 5.
Now, suppose that a and a job x such that a completes at \(e''_{\tau (x)}\) form an \(\text {A}\)-configuration in \(\mathcal {P}''_{1}\). Since \(d_{l+1}\) covers a in \(I_1'\) in \(\mathcal {P}''_{1}\), we have \(x\ne d_{l+1}\). By the definition of \(\text {A}\)-configuration, a is not preempted in \((e''_j, e''_{\tau (x)}]\) for some \(j\le \tau (a)\) in \(\mathcal {P}''_{1}\) and \(\xi ''_{j-1}(a)<e''_{j}-e''_{j-1}\). Also, \(\tau (d_l)+1\le j \le k\), for otherwise, by (B1), a and x form an \(\text {A}\)-configuration in \(\mathcal {P}\)—a contradiction. Moreover, there is a block \(t\in \{j,\ldots ,k-1\}\) in \(\mathcal {P}\) with \(\{a,x\}\subseteq \mathcal {J}(\xi _t)\), for otherwise again a and x form an \(\text {A}\)-configuration in \(\mathcal {P}\)—a contradiction. Therefore, \(\{d_{l+1},x\}\subseteq \mathcal {J}(\xi ''_t)\), and a and x interlace in \(\mathcal {P}''_{1}\)—a contradiction by Lemma 5. \(\square \)
Claim 5
If \(T=1\), then \(\mathcal {P}_1\) is maximal and \(\text {A}\)-free in the interval \([C(\mathcal {P},d_{l}),\infty )\), and \((d_1,\ldots ,d_{l+1})\) is an alternating chain in \(\mathcal {P}_1\). If \(T>1\), then \(\mathcal {P}_1\) satisfies condition (I5).
Proof
If \(T=1\), then by the maximality of \(\mathcal {P}\) we have that \(\mathcal {P}_1\) is maximal and \((d_1,\ldots ,d_{l+1})\) is an alternating chain in \(\mathcal {P}_1\). By Claim 4, \(\mathcal {P}_1\) is \((\text {A},[C(\mathcal {P},d_{l}),\infty ))\)-free, which completes the proof in this case.
Suppose now that \(T>1\). For brevity, let \(k=k_1\) and \(\tau _1=\tau _{\mathcal {P}_1}\) in the proof of Claim 5. By Claim 3, \(\mathcal {P}_1\) satisfies (I4). Hence, \(k=\tau _1(d_{l+1})\), and \(\xi ^1_k(d_{l+1})=e^1_{k+1}-e^1_k\) by Lemma 3. Note that \(\xi ^1_j(d_{l+1})=e^1_{j+1}-e^1_j\) for each \(j\in \{\tau _1(d_l)+1,\ldots ,k\}\) is not possible, because then \((d_1,\ldots ,d_{l+1})\) would be an alternating chain in \(\mathcal {P}_1\), and hence the iterative process would stop with \(T=1\) in Step 3. We show that otherwise we can find the desired jobs x and y in (I5). The key to finding the jobs is the existence of a block \(j\in \{\tau _1(d_l)+1,\ldots ,k-1\}\) such that \(\xi ^1_j(d_{l+1})<e^1_{j+1}-e^1_j\). Take the smallest such j. Let \(\{x,y\}\subseteq \mathcal {J}(\xi ^1_j)\setminus \{a_1, d_{l+1}\}\). Two such jobs exist since there is no idle time in block j and \(a_1\notin \mathcal {J}(\xi ^1_j)\) because, by Claim 2, \(d_{l+1}\) covers \(a_1\) in \(I_1\). We now prove that
To that end, we first argue that no predecessor of x or y is executed in \((C(\mathcal {P}_1,d_l),e^1_j]\). By contradiction, suppose z is a predecessor of x or y that completes in \((C(\mathcal {P}_1,d_l),e^1_j]\). Then, z must also start in \([C(\mathcal {P}_1,d_l),e^1_j]\), for otherwise, by (28) and the fact that \(d_{l+1}\) covers \(a_1\) in \(I_1\) (by Claim 2), we get that z interlaces with either \(a_1\) or \(d_{l+1}\) in \(\mathcal {P}_1\)—a contradiction by Lemma 5. Therefore, z starts in \([C(\mathcal {P}_1,d_l),e^1_j]\), and thus there is a block \(j'\in \{\tau _1(d_l)+1,\ldots ,j-1\}\) such that \(\xi ^1_{j'}(z)>\xi ^1_{j'}(d_{l+1})\). The latter is guaranteed by the fact that z executes in \([C(\mathcal {P}_1,d_l),e_j^1]\). Since, by Claim 2, \(d_{l+1}\) covers \(a_1\) in \(I_1\), we have \(\xi ^1_{j'}(a_1)=0\). Therefore, there is a job w, \(w\notin \{a_1,d_{l+1}\}\), such that \(\xi ^1_{j'}(w)>0\). Thus, we get a contradiction with our definition of j, since \(j'<j\).
Second, arguing by contradiction, suppose without loss of generality that \(r(x)\le C(\mathcal {P}_1,d_l)\). If \(\xi ^1_k(a_1)<e^1_{k+1}-e^1_k\), then take
and let \(\mathcal {P}''\) be a schedule with events \({\varvec{e}}''\) and partition \({\varvec{\xi }}''\), where
Note that \(j>\tau _1(d_l)+1\) because \(\xi ^1_j(a_1)=0\). The assumption \(r(x)\le C(\mathcal {P}_1,d_l)\) implies that \(\mathcal {P}''\) is feasible. Thus, we get a contradiction since the total completion time of \(\mathcal {P}''\) is smaller than that of \(\mathcal {P}_1\).
On the other hand, if \(\xi ^1_k(a_1)=e^1_{k+1}-e^1_k\), then \(x\notin \mathcal {J}(\xi ^1_k)\), and (28) and the fact that \(d_{l+1}\) covers \(a_1\) in \(I_1\) (by Claim 2) imply that \(\mathcal {J}(\xi ^1_{\tau _1(d_l)+1})\cap \{x,y\}=\emptyset \). Thus, \(C(\mathcal {P}_1,x)\le e^1_k\) or \(C(\mathcal {P}_1,x)>e^1_{k+1}\). In the former case \(a_1\) and x interlace, and in the latter case x and \(d_{l+1}\) interlace—a contradiction with Lemma 5. Therefore, (32) holds and \(\mathcal {P}_1\) satisfies (I5), as required. \(\square \)
Claim 6
Let \(T>1\). Then, \(\mathcal {P}_{u}\) satisfies condition (I5) for \(u\in \{1,\ldots ,T-1\}\).
Proof
By Claim 5, \(\mathcal {P}_1\) satisfies (I5). By induction on \(u=1,\ldots ,T-1\), we have \(\min \{r(x),r(y)\}>C(\mathcal {P}_u,d_l)\). Let \(j_u\) be the earliest j such that \(\mathcal {J}(\xi ^u_j)=\{x,y\}\) in \(\mathcal {P}_u\). Again by induction on \(u=1,\ldots ,T-1\), we have \(j_u<\tau _u(d_{l+1})\). Therefore, \(\mathcal {P}_{u}\) satisfies condition (I5) for \(u\in \{1,\ldots ,T-1\}\). \(\square \)
Claim 7
For each \(u\in \{1,\ldots ,T\}\), if \(\mathcal {P}_{u}\) is \(d_{l+1}\)-preempted, then \(\xi ^{u}_{k_{u}}(a_{u})<e^{u}_{k_{u}+1}-e^{u}_{k_{u}}\).
Proof
Suppose that \(\xi ^{u}_{k_{u}}(a_{u})=e^{u}_{k_{u}+1}-e^{u}_{k_{u}}\) and \(\mathcal {P}_u\) is \(d_{l+1}\)-preempted. By (I4), it holds that \(k_u=\tau _u(d_{l+1})\), and by Lemma 3, \(\xi ^{u}_{k_{u}}(d_{l+1})=e^{u}_{k_{u}+1}-e^{u}_{k_{u}}\). Then, there is a job \(a'\) that completes at \(e_{k_{u}}\), because \(e_{k_u}\) is an event in \(\mathcal {P}_u\). The job \(a'\) starts after \(C(\mathcal {P}_{u},d_l)\), for otherwise \(a'\) interlaces with either \(a_u\) or \(d_{l+1}\) in the optimal \(\mathcal {P}_{u}\)—a contradiction with Lemma 5. Perform a swap of \(a'\) and \(d_{l+1}\). By Lemma 2, this leads to a feasible schedule \(\mathcal {P}'\). If \(\xi _{k_u-1}(d_{l+1})<e_{k_u}-e_{k_u-1}\), then the total completion time of the new schedule is smaller than that of \(\mathcal {P}_{u}\)—a contradiction again. If \(\xi _{k_u-1}(d_{l+1})=e_{k_u}-e_{k_u-1}\), then there is a job \(a''\) that completes at \(e_{k_{u}-1}\) in \(\mathcal {P}_u\), because \(e_{k_u-1}\) is an event in \(\mathcal {P}_u\). Again, the job \(a''\) starts after \(C(\mathcal {P}_{u},d_l)\), for otherwise \(a''\) interlaces with either \(a_u\) or \(d_{l+1}\) in the optimal \(\mathcal {P}_{u}\)—a contradiction with Lemma 5. Clearly, \(C(\mathcal {P}_u,a'')=C(\mathcal {P}',a'')\) and \(s(\mathcal {P}_u,a'')=s(\mathcal {P}',a'')\). Take \(\mathcal {P}_u:=\mathcal {P}'\) and \(a':=a''\), and repeat the above swap.
After a finite number of such swaps we obtain a feasible schedule with a lower total completion time than the initial one—a contradiction. This completes the proof of Claim 7. \(\square \)
Now we return to the proof of Theorem 3. If \(T=1\), then the lemma holds by Claim 5. To complete the proof, we show that \(T>1\) leads to a contradiction. First, for \(T>1\), Claims 2, 3, and 5 imply that the schedule \(\mathcal {P}_1\) is \(d_{l+1}\)-preempted. Thus, Claims 2, 3 and 6, together with induction on \(u\in \{1,\ldots ,T-1\}\), give that each of the schedules \(\mathcal {P}_1,\ldots ,\mathcal {P}_{T-1}\) is \(d_{l+1}\)-preempted. Since the iterative process exits in iteration \(u=T\), producing either \(\mathcal {P}'_T\) or \(\mathcal {P}''_T\), we get a contradiction: either these two exit schedules have smaller total completion times than \(\mathcal {P}_{T-1}\), or they have the same total completion times but their abnormality points are greater than i. Hence, \(\mathcal {P}_{T-1}\) is not maximal, contradicting the fact that it is \(d_{l+1}\)-preempted.
It remains to show that T exists, i.e., that the number of iterations is finite. To that end, let \(c_u\) and \(C_u\) be, respectively, the number of jobs executed in \(\mathcal {P}_u\) that are not covered by \(d_{l+1}\), and the number of blocks, in \([C(\mathcal {P}_u,d_{l}),C(\mathcal {P}_u,d_{l+1})]\).
By Claim 7, \(\varepsilon >0\) in Step 1 of the construction of \(\mathcal {P}_u'\) for each \(u\in \{2,\ldots ,T\}\). If \(\varepsilon =\xi ^{u-1}_{\tau _{u-1}(d_l)+1}(a_{u-1})\) in Step 1, then the block \(\tau _{u-1}(d_l)+1\) disappears. Otherwise, \(\varepsilon =e^{u-1}_{k_{u-1}+1}-e^{u-1}_{k_{u-1}}-\xi ^{u-1}_{k_{u-1}}(a_{u-1})\) in Step 1. If \(\xi ^{u-1}_{k_{u-1}}(a_{u-1})>0\), then \(a_u=a_{u-1}\) at the end of Step 1 and \(\xi ^{u}_{k_{u}}(a_{u})=e^{u}_{k_{u}+1}-e^{u}_{k_{u}}\). Therefore, both Steps 2 and 3 are vacuous, and thus \(\mathcal {P}'_u=\mathcal {P}_u\) is \(d_{l+1}\)-preempted. However, \(\xi ^{u}_{k_{u}}(a_{u})=e^{u}_{k_{u}+1}-e^{u}_{k_{u}}\) contradicts Claim 7. Thus, we have \(\xi ^{u-1}_{k_{u-1}}(a_{u-1})=0\) in Step 1, and therefore the block \(k_{u-1}\) disappears.
Next, Step 2 does not change the number of blocks. Finally, Step 3, if not vacuous, may increase the number of blocks by at most one. Thus, we have \(C_u=C_{u-1}-1\) and \(c_u=c_{u-1}\) if Step 3 is vacuous, and \(C_u\le C_{u-1}\) and \(c_u=c_{u-1}-1\) if Step 3 is not vacuous. Consequently, \(C_u+c_u<C_{u-1}+c_{u-1}\) and \(T\le C_1+c_1\le 3n\). Thus, the iterative process indeed stops with some schedule \(\mathcal {P}'_T\) or \(\mathcal {P}''_T\). This, by Claim 5, completes the proof of the theorem.
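The counting argument above is a standard potential-function termination proof: the nonnegative integer \(C_u+c_u\) strictly decreases in each iteration, so the number of iterations cannot exceed its initial value. The sketch below is a toy illustration of this accounting, assuming each iteration removes exactly one block or one uncovered job; all function names are ours.

```python
def iteration_bound(C1, c1):
    """Bound on T from the potential argument: C_u + c_u is a
    nonnegative integer that strictly decreases each iteration,
    so T is at most the initial potential C_1 + c_1."""
    return C1 + c1

def simulate(C1, c1, choose_vacuous):
    """Toy run of the accounting: a vacuous Step 3 removes a block
    (C drops by one); a non-vacuous Step 3 removes an uncovered job
    (c drops by one, C does not grow)."""
    C, c, T = C1, c1, 0
    while C + c > 0:
        if choose_vacuous(T) and C > 0:
            C -= 1   # C_u = C_{u-1} - 1, c unchanged
        elif c > 0:
            c -= 1   # c_u = c_{u-1} - 1, C does not increase
        else:
            C -= 1   # only blocks remain
        T += 1
    return T
```

In this simplified model every run terminates after exactly `C1 + c1` iterations, matching the bound \(T\le C_1+c_1\le 3n\) in the proof.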
7 Summary
In this paper, we have provided a structural characterization of the preemptions in optimal schedules for the problem \(P2|pmtn, in\text {-}tree, r_j, p_j=1|\sum C_j\). The advantages of our characterization are as follows:

It narrows the search space of optimal solutions from an infinite one down to a finite one.

Understanding the possible structure of preemptions is a step towards determining the complexity of the problem \(P2|pmtn, in\text {-}tree, r_j, p_j=1|\sum C_j\). On the one hand, the normality of an optimal schedule may lead us to a polynomial-time algorithm. On the other hand, the fact that a single job may need many (of the order of \(\log n\)) preemptions, as stated in Theorem 1, could be useful in proving NP-completeness. The complexity of the problem is left as an interesting and challenging open problem.

It significantly improves the lower bound on the resolution for the problem \(P2|pmtn, in\text {-}tree, r_j, p_j=1|\sum C_j\).
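The finiteness claimed in the first item can be made concrete: by normality, the \(i\)-th preemption of a job can occur only at a multiple of \(2^{-i}\), so over a bounded time horizon there are only finitely many candidate preemption times at each level. The sketch below is an illustration of this counting, not part of the paper; the function and its interface are ours.

```python
from fractions import Fraction

def candidate_times(i, horizon):
    """Candidate times for the i-th preemption under normality:
    the multiples of 2^(-i) in [0, horizon].  Exact rational
    arithmetic (Fraction) avoids floating-point drift at fine
    resolutions."""
    step = Fraction(1, 2 ** i)
    return [k * step for k in range(int(Fraction(horizon) / step) + 1)]
```

On a horizon of integer length \(L\), this yields \(L\cdot 2^{i}+1\) candidates for the \(i\)-th preemption, a finite set at every level.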
Note that our proof relies on the in-tree precedence constraints between the jobs. This assumption is crucial when proving in Sect. 5 that a maximal \(\text {A}\)-free schedule exists. Generalizing our result to arbitrary precedence constraints, or providing an example showing that a statement analogous to the one in Corollary 1 is false for more general precedence constraints, is left as an open problem.
Notes
The term resolution was introduced in Coffman et al. (2015).
References
Baptiste, P., Brucker, P., Knust, S., & Timkovsky, V. (2004). Ten notes on equal-processing-time scheduling. 4OR, 2(2), 111–127.
Baptiste, P., Carlier, J., Kononov, A., Queyranne, M., Sevastyanov, S., & Sviridenko, M. (2011). Properties of optimal schedules in preemptive shop scheduling. Discrete Applied Mathematics, 159(5), 272–280.
Baptiste, P., & Timkovsky, V. (2001). On preemption redundancy in scheduling unit processing time jobs on two parallel machines. Operations Research Letters, 28(5), 205–212.
Baptiste, P., & Timkovsky, V. (2004). Shortest path to nonpreemptive schedules of unit-time jobs on two identical parallel machines with minimum total completion time. Mathematical Methods of Operations Research, 60(1), 145–153.
Brucker, P., Hurink, J., & Knust, S. (2003). A polynomial algorithm for \(P \mid p_j=1, r_j, \text{out-tree} \mid \sum C_j\). Mathematical Methods of Operations Research, 56(3), 407–412.
Carlier, A., Hanen, C., & Munier-Kordon, A. (2014). Equivalence of two classical list scheduling algorithms for dependent typed tasks with release dates, due dates and precedence delays. New Challenges in Scheduling Theory, March 31–April 4, Aussois, France.
Coffman, E., Jr., Dereniowski, D., & Kubiak, W. (2012). An efficient algorithm for an ideal scheduling problem. Acta Informatica, 6, 1–14.
Coffman, E., Jr., & Graham, R. (1972). Optimal scheduling for two-processor systems. Acta Informatica, 1, 200–213.
Coffman, E., Jr., Ng, C., & Timkovsky, V. (2015). How small are shifts required in optimal preemptive schedules? Journal of Scheduling, 18(2), 155–163.
Coffman, E., Jr., Sethuraman, J., & Timkovsky, V. (2003). Ideal preemptive schedules on two processors. Acta Informatica, 39(8), 597–612.
Fujii, M., Kasami, T., & Ninomiya, K. (1971). Optimal sequencing of two equivalent processors. SIAM Journal on Applied Mathematics, 17, 784–789. Erratum 20 (1971) 141.
Gabow, H. (1982). An almost-linear algorithm for two-processor scheduling. Journal of ACM, 29, 766–780.
Garey, M., & Johnson, D. (1976). Scheduling tasks with nonuniform deadlines on two processors. Journal of ACM, 23(3), 461–467.
Garey, M., & Johnson, D. (1977). Twoprocessor scheduling with starttimes and deadlines. SIAM Journal on Computing, 6(3), 416–426.
Herrbach, L. A., & Leung, J. Y. T. (1990). Preemptive scheduling of equal length jobs on two machines to minimize mean flow time. Operations Research, 38(3), 487–494.
Huo, Y., & Leung, J. Y. (2005). Minimizing total completion time for UET tasks with release time and outtree precedence constraints. Mathematical Methods of Operations Research, 62(2), 275–279.
Leung, A., Palem, K., & Pnueli, A. (2001). Scheduling time-constrained instructions on pipelined processors. ACM Transactions on Programming Languages and Systems, 23(1), 73–103.
Sauer, N. W., & Stone, M. G. (1987). Rational preemptive scheduling. Order, 4, 195–206.
Acknowledgments
This research has been supported by the Natural Sciences and Engineering Research Council of Canada (NSERC) Grant OPG0105675. Dariusz Dereniowski was partially supported by a scholarship for outstanding young researchers funded by the Polish Ministry of Science and Higher Education, and by the Polish National Science Center under Contract DEC-2011/02/A/ST6/00201.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Chen, B., Coffman, E., Dereniowski, D. et al. Normal-form preemption sequences for an open problem in scheduling theory. J Sched 19, 701–728 (2016). https://doi.org/10.1007/s10951-015-0446-9