TRIOID: A GENERALIZATION OF MATROID AND THE ASSOCIATED
POLYTOPE
SANTOSH N. KABADI AND ABRAHAM P. PUNNEN
Abstract. We consider a generalization of the well-known greedy algorithm, called the m-step
greedy algorithm, where m elements are examined in each iteration. When m = 1
or 2, the algorithm reduces to the standard greedy algorithm. For m = 3 we provide a
complete characterization of the independence systems, called trioids, for which the m-step greedy
algorithm guarantees an optimal solution for all weight functions. We also characterize the
trioid polytope and propose a generalization of submodular functions.
1. Introduction
Let E = {1, 2, . . ., n} and F ⊆ 2^E so that (E, F) is an independence system (i.e., if A ∈ F
and B ⊆ A then B ∈ F). Let w : E → R be a prescribed weight function. We consider the
following linear combinatorial optimization problem (LCOP):
Maximize {w[S] : S ∈ F},
where w[S] = Σ_{i∈S} w(i) if S ≠ ∅, and w[∅] = 0.
The well-known greedy algorithm for LCOP can be described as follows.
Algorithm 1: The Greedy Algorithm
Input: E = {1, 2, . . ., n}; F: the family of feasible solutions (possibly given as an
oracle).
Output: X, the solution obtained.
Order the elements of E such that
w(1) ≥ w(2) ≥ ... ≥ w(r) > 0 ≥ w(r + 1) ≥ ... ≥ w(n);
X ← ∅;
k ← 1;
while k ≤ r do
    if X ∪ {k} ∈ F then
        X ← X ∪ {k};
    end
    k ← k + 1;
end
Output X
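For illustration, the following Python sketch implements Algorithm 1 against a feasibility oracle. The oracle interface is_independent and the toy rank-2 example are our own illustrative assumptions, not constructs from the paper.

from typing import Callable, Iterable, Set

def greedy(E: Iterable[int],
           w: Callable[[int], float],
           is_independent: Callable[[Set[int]], bool]) -> Set[int]:
    """Sketch of Algorithm 1: scan the positive-weight elements in
    non-increasing order of weight and keep an element whenever
    feasibility is preserved."""
    order = sorted((e for e in E if w(e) > 0), key=w, reverse=True)
    X: Set[int] = set()
    for e in order:
        if is_independent(X | {e}):
            X.add(e)
    return X

# Toy example: the uniform matroid of rank 2 on {1, ..., 5}.
weights = {1: 4.0, 2: 3.0, 3: 5.0, 4: -1.0, 5: 2.0}
print(greedy(range(1, 6), weights.get, lambda S: len(S) <= 2))  # -> {1, 3}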
Key words and phrases. Matroid, independence system, greedy algorithm, trioid, combinatorial
optimization.
This work was supported by NSERC discovery grants awarded to Santosh N. Kabadi and Abraham P.
Punnen.
In his path-breaking work [7], Edmonds showed that an independence system (E, F) is
a matroid [22] if and only if the greedy algorithm computes an optimal solution to the
corresponding instances of LCOP for all weight functions. By relaxing the restrictions of an
independence system and/or modifying the greedy algorithm appropriately, various classes of
discrete systems (E, F) have been identified by researchers that guarantee optimality of the solution
produced by the algorithm for the corresponding instances of LCOP for all weight functions.
These discrete systems include pseudomatroids (delta matroids) [2, 5], greedoids [17], matroid
embeddings [12], and supermatroids [6, 10, 21], among others [3, 9, 13, 15]. Various modifications
of the greedy algorithm have also been analyzed extensively as approximation strategies with
guaranteed average performance [18] and worst-case performance [19] for various classes of
linear combinatorial optimization problems. In each of these algorithms, in each iteration,
exactly one element is added to the current solution to build the optimal (approximate)
solution.
Jenkyns [14] considered a generalization of the greedy algorithm, called the J-greedy algorithm,
where more than one element may be added in each iteration (see also [16]). His algorithm
can be described as follows:
Algorithm 2: The J-Greedy Algorithm
Input: E = {1, 2, . . ., n}; an independence system (E, F) (possibly given as an
oracle); and a function J : F × E → Z_+.
Output: X, the solution obtained.
X ← ∅;
ℓ ← 1;
i ← 1;
while ℓ ≠ 0 do
    Choose S ⊆ E \ X such that w[S] = Σ_{e∈S} w(e) is maximized subject to
    |S| ≤ J(X, i) and X ∪ S ∈ F;
    X ← X ∪ S;
    ℓ ← |S|;
    i ← i + 1;
end
Output X
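A brute-force Python sketch of the J-greedy algorithm, written against the description above; the feasibility oracle, the explicit subset enumeration and the stopping rule (stop as soon as an iteration adds nothing) are illustrative assumptions rather than part of Jenkyns' original description.

from itertools import combinations
from typing import Callable, Dict, Set

def j_greedy(E: Set[int],
             w: Dict[int, float],
             is_independent: Callable[[Set[int]], bool],
             J: Callable[[Set[int], int], int]) -> Set[int]:
    """Sketch of Algorithm 2: in iteration i, add a feasible set S of at most
    J(X, i) new elements maximizing w[S]; stop once nothing more is added."""
    X: Set[int] = set()
    i, added = 1, 1
    while added > 0:
        rest = [e for e in E if e not in X]
        best, best_w = set(), 0.0   # only strictly improving sets are added
        for size in range(1, J(X, i) + 1):
            for S in map(set, combinations(rest, size)):
                gain = sum(w[e] for e in S)
                if gain > best_w and is_independent(X | S):
                    best, best_w = S, gain
        X |= best
        added = len(best)
        i += 1
    return X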
If J(X, i) = 1 for all i, the above algorithm reduces to the greedy algorithm. Unlike
the greedy algorithm, no simple characterization is known of the independence systems for which
the J-greedy algorithm is guaranteed to produce an optimal solution. Further, while the
matroid polytope can be elegantly defined, no similar representation of such an independence
system is known.
In this paper, we consider another generalization of the greedy algorithm which we call the
m-step greedy algorithm. As in the case of Jenkyns' algorithm, our algorithm allows more
than one element (in fact, at most m elements, for a given integer m) to be selected in each
iteration. However, the two algorithms are quite different. For m = 3, we give a complete
characterization of the class of independence systems for which the m-step greedy algorithm
guarantees an optimal solution to the associated LCOP for all weight functions. The resulting
mathematical structure generalizes the class of matroids. We also characterize the polytopes
associated with this class of systems. Further, our study leads to the identification of an
interesting new class of set functions that are closely related to submodular functions [11].
2. Notations and Basic Definitions
Definition 1. A discrete system (E, F), where E = {1, 2, . . ., n} and F ⊆ 2^E, is an
independence system if and only if A ∈ F and B ⊆ A imply that B ∈ F. Each element of
F is called an independent set of the system.
Throughout the paper, we only consider discrete systems that are independence systems.
For a given positive integer m, we introduce the m-step greedy algorithm, which can be
summarized as follows. We order the elements of E = {1, 2, . . ., n} such that w(1) ≥ w(2) ≥
... ≥ w(n), and we start with the empty set as the initial solution and with all the elements of
E as unscanned. In each iteration, we scan the first m of the currently unscanned elements
of E in the order 1, 2, 3, . . ., n and augment the current solution by adding all these m
elements or a subset of them (the subset could be empty as well) so that the resulting solution
is feasible and gives maximum improvement. The m elements of E scanned in this iteration
are then marked as scanned. In the last iteration, depending on the value of n, the number
of unscanned elements could be less than m, and all of them are scanned in this iteration.
In every other iteration, we always scan exactly m elements. A formal description of the
m-step greedy algorithm is given below.
Algorithm 3: The m-step Greedy Algorithm
Input: E = {1, 2, . . ., n}; an independence system (E, F) (possibly given as an
oracle); integer m.
Output: X^G, the solution obtained.
Order the elements of E such that w(1) ≥ w(2) ≥ ... ≥ w(n);
S^0 ← ∅;
k ← 0;
while k < ⌈n/m⌉ do
    T ← {km + 1, . . ., min{(k + 1)m, n}};
    Choose Q ⊆ T maximizing w[Q] subject to S^k ∪ Q ∈ F (Q may be empty);
    S^{k+1} ← S^k ∪ Q;
    k ← k + 1;
end
X^G ← S^k;
Output X^G
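A Python sketch of the m-step greedy algorithm described above; the feasibility oracle and the brute-force enumeration of all subsets of each scanned block are our own illustrative choices. With m = 3 this is the 3-step greedy algorithm used throughout the paper.

from itertools import combinations
from typing import Callable, Dict, List, Set

def m_step_greedy(E: List[int],
                  w: Dict[int, float],
                  is_independent: Callable[[Set[int]], bool],
                  m: int = 3) -> Set[int]:
    """Scan the elements in non-increasing weight order in blocks of m and, in
    each iteration, add the feasible subset of the scanned block (possibly
    empty) that gives the largest improvement."""
    order = sorted(E, key=lambda e: w[e], reverse=True)
    S: Set[int] = set()
    for start in range(0, len(order), m):
        block = order[start:start + m]
        best, best_gain = set(), 0.0
        for r in range(1, len(block) + 1):
            for Q in map(set, combinations(block, r)):
                gain = sum(w[e] for e in Q)
                if gain > best_gain and is_independent(S | Q):
                    best, best_gain = Q, gain
        S |= best
    return S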
w[Y]. Hence the solution X^G ∈ F output by the 3-step greedy algorithm must contain Y ∪ {ℓ} for some ℓ ∈ X \ Y. This proves part (1).
Let us now prove part (2), where |Y| = 3k + 1 for some integer k. Since |X| > |Y| and
there does not exist ℓ ∈ X \ Y such that Y ∪ {ℓ} ∈ F, it follows that |X \ Y| ≥ 2. Consider
the weight function
w_ℓ = 1 + 3ε   for all ℓ ∈ Y \ {i},
      1 + 2ε   for ℓ = i,
      1 + ε    for ℓ ∈ {j, p},
      1        for all ℓ ∈ X \ (Y ∪ {j, p}),
      -1       otherwise,
where ε > 0 is sufficiently small.
Since |Y| = 3k + 1, the set S^k in the 3-step greedy algorithm is precisely Y \ {i}. In
the (k + 1)th iteration, the algorithm considers the triplet {i, j, p}. If i ∈ S^{k+1}, then as in
the previous case, the algorithm must choose some ℓ ∈ X \ Y and hence Y ∪ {ℓ} ∈ F, a
contradiction. If i ∉ S^{k+1}, then it follows from the definition of the 3-step greedy algorithm
that it must choose S^{k+1} = (Y \ {i}) ∪ {j, p} ∈ F.
This proves the theorem.
Corollary 6. Let (E, F) be a trioid. For any A = {i, j, p} ⊆ E, if B1 = {i} and B2 = {j, p}
are maximal independent subsets of A, then F\A = ∅.
Proof. Suppose there exists {ℓ} ∈ F\A. Then {i, ℓ} ∈ F and {j, p, ℓ} ∈ F. By Theorem 5,
therefore, either {i, j, ℓ} ∈ F or {i, p, ℓ} ∈ F, contradicting the maximality of the sets B1 and B2.
Theorem 7. Let (E, F) be a trioid. Let X, Y ∈ F with |X| > |Y| and suppose there does not exist ℓ ∈ X \ Y
such that Y ∪ {ℓ} ∈ F. Let R ⊆ Y be such that |R| = 3r for some integer r. Then:
(1) For any i, j ∈ Y \ R, i ≠ j, and p ∈ E \ Y, R ∪ {i, j, p} ∈ F.
(2) For any i ∈ Y \ R and {j, p} ⊆ E \ Y, j ≠ p, R ∪ {i, j} ∈ F or R ∪ {i, p} ∈ F or
R ∪ {j, p} ∈ F.
(3) If R ⊆ X ∩ Y, then for any i ∈ (X ∩ Y) \ R and {j, p} ⊆ E \ Y, j ≠ p, R ∪ {i, j} ∈ F
or R ∪ {i, p} ∈ F.
Proof. Since there does not exist ℓ ∈ X \ Y such that Y ∪ {ℓ} ∈ F, by Theorem 5, |Y| = 3k + 1
for some integer k and |X \ Y| ≥ 2.
Suppose part (1) of the theorem is not true. Then there exist
i, j ∈ Y \ R and p ∈ E \ Y such that R ∪ {i, j, p} ∉ F.    (1)
Assign the following weights to the elements of E:
w_ℓ = 1 + 4ε   for ℓ ∈ R,
      1 + 3ε   for ℓ = i, j,
      1 + 2ε   for ℓ = p,
      1 + ε    for ℓ ∈ Y \ (R ∪ {i, j}),
      1        for ℓ ∈ X \ (Y ∪ {p}),
      -1       otherwise,
where ε > 0 is sufficiently small.
The optimal objective function value of the corresponding instance of LCOP is at least
w[X] > w[Y]. The 3-step greedy algorithm chooses S^r = R and S^{r+1} as either R ∪ {i, j}
or R ∪ {i, j, p}. In view of (1), S^{r+1} = R ∪ {i, j}. Since |Y| = 3k + 1, S^k = Y \ {u, v}
for some u, v ∈ Y \ R. The next triplet scanned by the algorithm is {u, v, z} for some
z ∈ X \ (Y ∪ {p}). Hence the algorithm must choose S^{k+1} such that Y ⊆ S^{k+1}, and there-
fore the solution X^G output by the algorithm satisfies Y ∪ {ℓ} ⊆ X^G for some ℓ ∈ X \ Y, a
contradiction.
Let us now consider part (2) of the theorem. If this is not true, then there exist
i ∈ Y \ R and {j, p} ⊆ E \ Y, j ≠ p, such that R ∪ {i, j} ∉ F, R ∪ {i, p} ∉ F, and R ∪ {j, p} ∉ F.    (2)
Assign the following weights to the elements of E:
w_ℓ = 1 + 4ε   for ℓ ∈ R,
      1 + 3ε   for ℓ = i,
      1 + 2ε   for ℓ = j, p,
      1 + ε    for ℓ ∈ Y \ (R ∪ {i}),
      1        for ℓ ∈ X \ (Y ∪ {j, p}),
      -1       otherwise.
The algorithm now chooses S^r = R and S^{r+1} as either R ∪ {i} or R ∪ {i, j} or R ∪ {i, p}
or R ∪ {j, p} or R ∪ {i, j, p}. In view of (2), we have S^{r+1} = R ∪ {i} and hence S^{k+1} = Y.
Since w[X] > w[Y], we must have X^G ⊇ Y ∪ {ℓ} for some ℓ ∈ X \ Y, a contradiction.
Let us now consider part (3) of the theorem. If this is not true, then there exist
R ⊆ X ∩ Y, i ∈ (X ∩ Y) \ R and {j, p} ⊆ E \ Y, j ≠ p, such that R ∪ {i, j} ∉ F and R ∪ {i, p} ∉ F.    (3)
Assign the following weights to the elements of E:
w_ℓ = 2 + 6ε   for ℓ ∈ R,
      2 + 5ε   for ℓ = i,
      1 + 2ε   for ℓ = j, p,
      1 + ε    for ℓ ∈ Y \ (R ∪ {i}),
      1        for ℓ ∈ X \ (Y ∪ {j, p}),
      -1       otherwise.
The algorithm now chooses S^r = R and S^{r+1} as either R ∪ {i} or R ∪ {i, j} or R ∪ {i, p} or
R ∪ {i, j, p}. In view of (3), we have S^{r+1} = R ∪ {i} and hence S^{k+1} = Y. Since w[X] > w[Y],
we must have X^G ⊇ Y ∪ {ℓ} for some ℓ ∈ X \ Y, a contradiction.
This proves the theorem.
Corollary 8. Let (E, F) be a trioid. If there exists e ∈ E such that {e} ∉ F (i.e., e is a loop), then for
any X, Y ∈ F such that |X| > |Y|, there exists ℓ ∈ X \ Y such that Y ∪ {ℓ} ∈ F. In particular, all
the maximal elements of F are of the same cardinality.
Proof. If possible, let X, Y ∈ F be such that |X| > |Y| and there does not exist ℓ ∈ X \ Y
such that Y ∪ {ℓ} ∈ F. If |Y| ≥ 2, then from part (1) of Theorem 7, by choosing any distinct
i, j ∈ Y, p = e and R = ∅, we have a contradiction. If |Y| = 1, i.e., Y = {i}, then from part
(2) of Theorem 7, by choosing this i, p = e, any j ∈ X \ Y and R = ∅, we have a contradiction.
This proves the result.
Theorem 9. Let (E, F) be a trioid and let X, Y ∈ F with |X \ Y| ≥ 2. Then for any
e ∈ Y \ X, there exists {x, y} ⊆ X \ Y such that X ∪ {e} \ {x, y} ∈ F.
Proof. For any e ∈ Y \ X, assign the following weights to the elements of E:
w_ℓ = 2 + 2ε   for ℓ ∈ X ∩ Y,
      2 + ε    for ℓ = e,
      1        for ℓ ∈ X \ Y,
      -1       otherwise.
Since Y ∈ F, (X ∩ Y) ∪ {e} ∈ F. Let k = ⌈|X ∩ Y|/3⌉. Then the algorithm chooses S^{k+1} ⊇
(X ∩ Y) ∪ {e}, and therefore (X ∩ Y) ∪ {e} ⊆ X^G. Since the optimal objective function
value is at least w[X], X^G must contain all but at most two elements, say {x, y}, of X \ Y.
Hence X ∪ {e} \ {x, y} ∈ F.
If X, Y ∈ F are such that X \ Y = {x}, then obviously, for any e ∈ Y \ X, X ∪ {e} \ {x} ∈ F.
Corollary 10. Let (E, F) be an independence system that satisfies the conditions of Theorems 5 and 9. Let
X, Y ∈ F with |X \ Y| ≥ 3. Then for any e ∈ Y \ X, there exists z ∈ X \ Y such that X ∪ {e} \ {z} ∈ F.
Proof. By Theorem 9, there exists {x, y} ⊆ X \ Y such that X' = X ∪ {e} \ {x, y} ∈ F. If there exists
j ∈ {x, y} such that X' ∪ {j} = X ∪ {e} \ {z} ∈ F, where {z} = {x, y} \ {j}, then the
result is proved. Else, using Theorem 5 with X, X' and any z ∈ (X \ Y) \ {x, y}, we have
(X' \ {z}) ∪ {x, y} = X ∪ {e} \ {z} ∈ F. This proves the result.
Theorem 11. Let (E, F) be a trioid and X, Y ∈ F with |X \ Y| = 2. Then:
(1) If |X ∩ Y| ≥ 1, then for any e ∈ Y \ X, (i) there exists z ∈ X \ Y such that
X ∪ {e} \ {z} ∈ F, or (ii) for any j ∈ X ∩ Y, X ∪ {e} \ {j} ∈ F.
(2) If |X ∩ Y| = 3k for some integer k ≥ 0, and |X \ Y| = |Y \ X| = 2, then let
X \ Y = {i, j} and Y \ X = {e, f}. Then X ∪ {e} \ {i} ∈ F or X ∪ {e} \ {j} ∈ F
or X ∪ {f} ∈ F.
Proof. To prove part (1), let Y' = (X ∩ Y) ∪ {e}. If there exists j ∈ X \ Y such that Y' ∪ {j} =
X ∪ {e} \ {z} ∈ F for {z} = (X \ Y) \ {j}, the result follows. Else, by Theorem 5 with
Y', X and any j ∈ X ∩ Y, we get (Y' \ {j}) ∪ (X \ Y) = X ∪ {e} \ {j} ∈ F.
To prove part (2), let us assign the following weights to the elements of E:
w_ℓ = 3       for ℓ ∈ X ∩ Y,
      2 + ε   for ℓ = e,
      1 + ε   for ℓ ∈ {i, j},
      1       for ℓ = f,
      -1      otherwise.
The 3-step greedy algorithm chooses S^{k+1} = X ∪ {e} or X ∪ {e} \ {i} or X ∪ {e} \ {j} or
X. In the first three cases, the result is proved. In the last case, since w[Y] > w[X], the
algorithm must choose X^G ⊇ X ∪ {f}, so X ∪ {f} ∈ F. This proves the result.
Theorem 12. Suppose an independence system (E, F) satisfies the conditions of Theorems 5, 7, 9 and 11. Then:
(1) For any A ⊆ E, (E \ A, F/A) satisfies the conditions of Theorems 5, 7, 9 and 11.
(2) For any B ⊆ A ⊆ E such that |A| = 3 and B is a maximal subset of A in F,
(E \ A, F\(A, B)) satisfies the conditions of Theorems 5, 7, 9 and 11.
Proof. The proof of part (1) is straightforward and hence omitted.
We now prove part (2). Thus, suppose (E, F) satisfies the conditions of Theorems 5, 7, 9 and 11. It is easy to
see that (E \ A, F\(A, B)) satisfies Theorem 9 and part (1) of Theorem 11.
To prove that (E \ A, F\(A, B)) satisfies Theorem 5, consider any X, Y ∈ F\(A, B) with
|X| > |Y|. Then X ∪ B ∈ F and Y ∪ B ∈ F. The only non-trivial case is when |Y ∪ B| = 3k + 1
and |Y| = 3k or 3k - 1. But in this case, using the fact that (E, F) satisfies parts (1) and (3)
of Theorem 7 with X ∪ B, Y ∪ B and with R a subset of B of cardinality 3⌊|B|/3⌋, we get
B ∪ {x} ∈ F for some x ∈ A \ B, contradicting the fact that B is a maximal subset of A in F.
Now let us prove that (E \ A, F\(A, B)) satisfies Theorem 7. Thus, consider any X, Y ∈
F\(A, B) with |X| > |Y| such that there does not exist ℓ ∈ X \ Y with Y ∪ {ℓ} ∈ F\(A, B). Let R ⊆ Y
be such that |R| = 3r for some integer r. Since (E \ A, F\(A, B)) satisfies Theorem 5,
|Y| = 3k + 1 for some integer k. Also, since (E, F) satisfies Theorem 5, X ∪ B ∈ F, Y ∪ B ∈ F,
and no element of X \ Y can be added to Y ∪ B within F, we get |Y ∪ B| = 3k' + 1 for some
integer k'. This implies that |B| = 0 or 3. The result now follows by
applying Theorem 7 to (E, F) with X ∪ B, Y ∪ B and R' = R ∪ B.
To prove that (E \ A, F\(A, B)) satisfies part (2) of Theorem 11, consider any X, Y ∈ F\(A, B)
with |X ∩ Y| = 3k for some integer k ≥ 0, X \ Y = {i, j} and Y \ X = {e, f}. Let
Y' = (Y ∪ B) \ {f} and X' = X ∪ B. Then X', Y' ∈ F. If Y' ∪ {y} = X' ∪ {e} \ {x} ∈ F for
some y ∈ {i, j}, where {x} = {i, j} \ {y}, then the result is proved. Else, by Theorem 5,
|Y'| = 3k' + 1, which implies that |B| = 0 or 3 and therefore |X' ∩ Y'| = 3(k + 1). The result
now follows by applying part (2) of Theorem 11 to X' and Y'.
Corollary 13. Let (E, F) be an independence system such that all the maximal elements of
F are of the same cardinality. Then (E, F) is a trioid if and only if it is a matroid.
Proof. The 'if' part of the corollary follows from Theorem 4 and the facts that (1) the
greedy algorithm (Algorithm 1) produces an optimal solution to LCOP on a matroid and
(2) any deletion/contraction of a matroid is a matroid.
To prove the 'only if' part, consider any pair of maximal elements X, Y ∈ F and any
e ∈ Y \ X. It is sufficient to show that there exists i ∈ X \ Y such that X ∪ {e} \ {i} ∈ F. If
|X \ Y| = |Y \ X| = 1, this is trivially true. If |X \ Y| = |Y \ X| ≥ 3, then the result follows
from Corollary 10. Suppose |X \ Y| = |Y \ X| = 2. Let X \ Y = {i, j}, Y \ X = {e, f} and
Y' = Y \ {f}. If Y' ∪ {y} = X ∪ {e} \ {x} ∈ F for some y ∈ {i, j}, where {x} = {i, j} \ {y},
then the result is proved. Else, it follows from Theorem 5 that |Y'| = 3k + 1 and therefore
|X ∩ Y| = 3k. From Theorem 11, X ∪ {e} \ {i} ∈ F or X ∪ {e} \ {j} ∈ F or X ∪ {f} ∈ F.
In the first two cases, the result is proved. In the last case, we have a contradiction to the fact
that X is a maximal element of F.
This proves the corollary.
We now prove the main result of this paper.
Theorem 14. Let (E, F) be an independence system. Then (E, F) is a trioid if and only if
it satisfies the conditions of Theorems 5, 7, 9 and 11.
Proof. The 'only if' part of the theorem follows from Theorems 5, 7, 9 and 11. Let us now
prove the 'if' part. If the result is not true, then choose a counter-example with minimum
value of |E| and choose a weight function w : E → R such that a corresponding solution
produced by the 3-step greedy algorithm is not optimal. Obviously w(i) > 0 for all i ∈ E, for
otherwise it follows from Theorem 12 that we could delete elements of E with non-positive
weights to obtain a counter-example with a smaller value of |E|, contradicting the minimal-
ity of |E|. Without loss of generality, we assume that all weights are distinct and the values
w[A] = Σ_{i∈A} w(i) are distinct for all A ⊆ E. Hence the solution X^G produced by the 3-step
greedy algorithm and the optimal solution X* are unique, with X^G ≠ X*. Obviously, X^G
and X* are maximal elements of F. Let E = {1, 2, . . ., n} and w(1) > w(2) > ... > w(n).
The 3-step greedy algorithm first considers the triplet {1, 2, 3}, and S^1 ⊆ {1, 2, 3}. Note
that S^1 ≠ ∅, for otherwise (E \ {1, 2, 3}, F) is a smaller counter-example. If {1, 2, 3} ∩
(X* Δ X^G) = ∅, where Δ is the symmetric difference operator, then S^1 ⊆ X^G ∩ X* and,
by Theorem 12, (E \ {1, 2, 3}, F\({1, 2, 3}, S^1)) is a smaller counter-example, contradicting
the minimality of |E|. Hence the set {1, 2, 3} contains the element e ∈ X* Δ X^G with w(e)
maximum. If e ∈ X^G \ X*, then S^1 ∩ (X^G \ X*) ≠ ∅. If e ∈ X* \ X^G, then since e ∉ X^G
it follows that
{1, 2, 3} \ {e} = {f, g} = S^1 ⊆ X^G \ X*
and
w(f) + w(g) > w(e).
Case (1): |X^G| < |X*|. In this case, by the maximality of X^G and X* and by Theorem 5, it
follows that |X^G| = 3k + 1 for some integer k ≥ 0. This implies that in some iteration
i ≤ k + 1, the 3-step greedy algorithm scans a triplet {3i - 2, 3i - 1, 3i} ⊄ X^G. Let i
be the first such iteration and let {3i - 2, 3i - 1, 3i} = {x1, x2, x3} with x1 ∉ X^G. Then
S^{i-1} ⊆ X^G and |S^{i-1}| = 3(i - 1).
Case (1(a)): |{x2, x3} ∩ X^G| = 1: Without loss of generality, let x2 ∈ X^G. Then it follows
from the definition of X^G that the algorithm sets S^i = S^{i-1} ∪ {x2}. By part (2) of Theorem
7, one of the following holds:
(i) S^{i-1} ∪ {x2, x3} ∈ F,
(ii) S^{i-1} ∪ {x1, x2} ∈ F,
(iii) S^{i-1} ∪ {x1, x3} ∈ F.
In the first two cases, we have a contradiction to the choice of S^i by the 3-step greedy
algorithm. In the third case, the choice of S^i by the 3-step greedy algorithm implies that
w(x2) > w(x1) + w(x3). Since x1 and x3 are the first two elements of E \ X^G scanned by the
algorithm, we have the following:
for all distinct x, y ∈ X* \ X^G, w(x2) > w(x1) + w(x3) ≥ w(x) + w(y). Since S^1 ∩ (X^G \ X*) ≠ ∅,
there exists g ∈ S^i ∩ (X^G \ X*), and by Theorem 9 there exist x, y ∈ X* \ X^G such that
X' = X* ∪ {g} \ {x, y} ∈ F. But w(g) ≥ w(x2) > w(x) + w(y) and hence w[X'] > w[X*],
contradicting the definition of X*.
Case (1(b)): {x2, x3} ⊆ X^G: In this case, the 3-step greedy algorithm sets S^i = S^{i-1} ∪
{x2, x3}. But by part (1) of Theorem 7, S^{i-1} ∪ {x1, x2, x3} ∈ F, contradicting the choice of
S^i by the algorithm.
Case (1(c)): {x2, x3} ∩ X^G = ∅: In this case, by the definition of X^G, S^i = S^{i-1}. But
S^{i-1} ⊆ X^G, |S^{i-1}| = 3(i - 1), |X^G| = 3k + 1 and (i - 1) ≤ k. Hence, using part (2) of
Theorem 7 with some i' ∈ X^G \ S^{i-1} and {j, p} ⊆ {x1, x2, x3}, we get that S^{i-1} ∪ {ℓ} ∈ F
for some ℓ ∈ {j, p}, contradicting the choice of S^i by the algorithm.
Case (2): |X^G| > |X*|. By Theorem 5 and the maximality of X* and X^G, we have |X*| = 3k + 1
for some integer k ≥ 0. If the largest-weight element e of X* Δ X^G is in X^G, then for any
x ∈ X* \ X^G, w(e) > w(x). But, by Theorem 5, for any x ∈ X* \ X^G, X' = X* ∪ {e} \ {x} ∈ F
and w[X'] > w[X*], contradicting the definition of X*. Hence e ∈ X* \ X^G and therefore, as
shown before, e = 1, {2, 3} = S^1 ⊆ X^G \ X* and w(2) + w(3) > w(1). By Theorem 5,
X' = X* ∪ {2, 3} \ {1} ∈ F and w[X'] > w[X*], contradict-
ing the definition of X*.
Case (3): |X^G| = |X*|. Let e be the element of X^G Δ X* with the maximum value of w(e).
Then, as shown before, e ∈ {1, 2, 3}.
Suppose e ∈ X^G \ X*. Then w(e) > w(x) for all x ∈ X* \ X^G. Since w[X*] > w[X^G], this
implies that |X* \ X^G| ≥ 2. If |X* \ X^G| ≥ 3, then by Corollary 10 there exists z ∈ X* \ X^G
such that X' = X* ∪ {e} \ {z} ∈ F. But w[X'] > w[X*], contradicting the definition of X*.
Thus |X^G \ X*| = |X* \ X^G| = 2.
Let X^G \ X* = {e, f} and X* \ X^G = {x, y}. Since w[X*] > w[X^G], w(x) + w(y) >
w(e) + w(f) and hence min{w(x), w(y)} > w(f). If there exists z ∈ X^G \ X* such that
w (z ) w [X ] and w [X ] > w [X ], a contradiction.
or X
Hence X G X {1, 2, 3} {e} and so (X G X ) {e} S 1 . Since w [X ] > w [X G ],
w (f ) w [X ], we have a contradiction. Else, by part (2) of Theo-
rem 7 we have either {e, i} ∈ F or {e, j} ∈ F or {i, j} ∈ F. From this and the fact that
w(i) + w(j) ≥ w(x) + w(y) > w(e), we get a contradiction to the choice of S^1.
If X^G ∩ X* = ∅, then |S^1| ≠ 3k + 1. Hence, by Theorem 5, there exists z ∈ {x, y} such
that X' = S^1 ∪ {z} ∈ F. But w[X'] > w[X*] and we have a contradiction.
Suppose e ∈ X* \ X^G. Then, as shown before, e = 1, {2, 3} = S^1 ⊆ X^G \ X* and
w(2) + w(3) > w(1).
If |X* \ X^G| ≥ 3, then by Corollary 10 there exists j ∈ X^G \ X* such that X^G ∪ {1} \ {j} ∈
F. But this implies that {1, 2} ∈ F or {1, 3} ∈ F, contradicting the choice of S^1. Hence
|X* \ X^G| = |X^G \ X*| = 2, with X^G \ X* = {2, 3}. Let X* \ X^G = {1, y}.
If X^G ∩ X* ≠ ∅, then by Theorem 11 there exists z ∈ {2, 3} such that X^G ∪ {1} \ {z} ∈ F,
or, for any j ∈ X^G ∩ X*, X^G ∪ {1} \ {j} ∈ F. In either case, we get a contradiction to the
choice of S^1.
If X^G ∩ X* = ∅, then by Theorem 11, {1, 2} ∈ F or {1, 3} ∈ F or {y, 2, 3} ∈ F. We thus
have either a contradiction to the choice of S^1 or to the choice of X^G.
This proves the theorem.
4. Trioid polytope
For any discrete system (E, F), its rank function f : 2^E → Z_+ is defined as follows:
f(A) = max{w^A[Y] : Y ∈ F},
where w^A_i = 1 if i ∈ A and w^A_i = 0 otherwise.
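For small ground sets the rank function can be evaluated by brute force when F is listed explicitly; the following fragment is only an illustrative sketch, and the representation of F as an explicit list is our assumption.

from typing import FrozenSet, Iterable, Set

def rank(A: Set[int], F: Iterable[FrozenSet[int]]) -> int:
    """f(A) = max{ w^A[Y] : Y in F }, i.e. the largest |Y ∩ A| over Y in F."""
    return max(len(Y & A) for Y in F)

# Example: F given explicitly as the list of all independent sets.
F = [frozenset(), frozenset({1}), frozenset({2}), frozenset({1, 2})]
print(rank({1, 2, 3}, F))  # -> 2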
Let (E, F) be a trioid. Consider any A, B ⊆ E. Let X ∈ F be a solution to max{w^{A∪B}[Y] :
Y ∈ F} obtained by using the 3-step greedy algorithm with the elements of E arranged in the
following order: elements of A ∩ B, followed by elements of A \ B, then elements of B \ A,
and finally elements of E \ (A ∪ B). The observations (1) to (5) below can be easily verified
using the definition of the rank function, the 3-step greedy algorithm and the properties of a trioid.
(1) f(A ∪ B) = |X|.
(2) f(A) ≥ |X ∩ A|.
(3) f(B) ≥ |X ∩ B|.
(4) If |A ∩ B| ≡ 0 or 2 (mod 3), then |X ∩ A ∩ B| = f(A ∩ B) and hence
f(A) + f(B) ≥ |X ∩ A| + |X ∩ B| = |X| + |X ∩ A ∩ B| = f(A ∪ B) + f(A ∩ B).
(5) If |A ∩ B| ≡ 1 (mod 3), then |X ∩ A ∩ B| ≥ f(A ∩ B) - 1. Hence
f(A) + f(B) ≥ |X ∩ A| + |X ∩ B| = |X| + |X ∩ A ∩ B| ≥ f(A ∪ B) + f(A ∩ B) - 1.
We thus have the following theorem:
Theorem 15. Let (E, F) be a trioid with rank function f : 2^E → Z_+. Then for any
A, B ⊆ E,
(1) if |A ∩ B| ≡ 0 or 2 (mod 3), then f(A) + f(B) ≥ f(A ∪ B) + f(A ∩ B);
(2) if |A ∩ B| ≡ 1 (mod 3), then f(A) + f(B) ≥ f(A ∪ B) + f(A ∩ B) - 1.
A set function satisfying conditions (1) and (2) of Theorem 15 is called an almost submodular function.
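Theorem 15 can be checked numerically on small instances. The sketch below tests conditions (1) and (2) for every pair of subsets, using a brute-force rank function; it is exponential in |E| and intended only as an illustration.

from itertools import combinations
from typing import FrozenSet, Iterable, List, Set

def subsets(E: List[int]) -> Iterable[Set[int]]:
    return (set(c) for r in range(len(E) + 1) for c in combinations(E, r))

def is_almost_submodular(E: List[int], F: List[FrozenSet[int]]) -> bool:
    """Check conditions (1) and (2) of Theorem 15 for the rank function of (E, F)."""
    f = lambda A: max(len(Y & A) for Y in F)  # brute-force rank function
    for A in subsets(E):
        for B in subsets(E):
            slack = 0 if len(A & B) % 3 in (0, 2) else 1
            if f(A) + f(B) < f(A | B) + f(A & B) - slack:
                return False
    return True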
Consider the polytope defined by
P = {x ∈ R^n : Σ_{j∈S} x_j ≤ f(S) for all S ∈ 2^E; x_j ≥ 0 for all j ∈ E}.
We now show that if (E, F) is a trioid, then for any w ∈ R^n, the solution generated
by the 3-step greedy algorithm is an optimal solution to the linear program (LP-trioid) given
below.
LP-trioid:   Maximize Σ_{j=1}^{n} w_j x_j
             subject to x ∈ P.
In other words, we show that P represents the convex hull of the incidence vectors of the elements
of F when (E, F) is a trioid. The dual of LP-trioid (which we denote by D-trioid) can be
written as follows:
D-trioid:   Minimize Σ_{S⊆E} f(S) y_S
            subject to Σ {y_S : j ∈ S ⊆ E} ≥ w(j) for all j ∈ E,
                       y_S ≥ 0 for all S ⊆ E.
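As an illustration, LP-trioid can be set up and solved explicitly for very small ground sets, assuming the rank function f is available as a Python callable; the use of scipy.optimize.linprog and the explicit enumeration of all constraints of P are choices made for this sketch only.

import numpy as np
from itertools import combinations
from scipy.optimize import linprog

def solve_lp_trioid(E, w, f):
    """Solve LP-trioid: maximize sum_j w_j x_j over
    P = { x >= 0 : sum_{j in S} x_j <= f(S) for every nonempty S subseteq E }."""
    E = list(E)
    sets = [set(c) for r in range(1, len(E) + 1) for c in combinations(E, r)]
    A_ub = np.array([[1.0 if e in S else 0.0 for e in E] for S in sets])
    b_ub = np.array([float(f(S)) for S in sets])
    c = -np.array([float(w[e]) for e in E])  # linprog minimizes, so negate w
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * len(E), method="highs")
    return dict(zip(E, res.x))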
Consider the trioid (E, F) with w_1 ≥ w_2 ≥ ... ≥ w_n ≥ 0. Each iteration i of the 3-step
greedy algorithm will be of one of the following three types:
Type 1: n ≥ 3i, S^i = S^{i-1} ∪ {3i - 2} and S^{i-1} ∪ {3i - 1, 3i} ∈ F. (This implies that
w_{3i-2} ≥ w_{3i-1} + w_{3i}.)
Type 2: n ≥ 3i, S^i = S^{i-1} ∪ {3i - 1, 3i} and S^{i-1} ∪ {3i - 2} ∈ F. (This implies that
w_{3i-2} ≤ w_{3i-1} + w_{3i}.)
Type 3: All other cases.
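The type of an iteration can be detected directly from a run of the 3-step greedy algorithm. The following sketch recomputes S^i for the scanned block and classifies the iteration according to the definitions above; the oracle interface is, as before, an assumption made for illustration.

from itertools import combinations
from typing import Callable, Dict, Set, Tuple

def iteration_type(S_prev: Set[int],
                   block: Tuple[int, int, int],
                   w: Dict[int, float],
                   is_independent: Callable[[Set[int]], bool]) -> int:
    """Classify one iteration of the 3-step greedy algorithm; block lists the
    scanned elements 3i-2, 3i-1, 3i in non-increasing weight order."""
    a, b, c = block
    # Recompute S^i exactly as the 3-step greedy algorithm would.
    best, best_gain = set(), 0.0
    for r in (1, 2, 3):
        for Q in map(set, combinations(block, r)):
            gain = sum(w[e] for e in Q)
            if gain > best_gain and is_independent(S_prev | Q):
                best, best_gain = Q, gain
    S_i = S_prev | best
    if S_i == S_prev | {a} and is_independent(S_prev | {b, c}):
        return 1
    if S_i == S_prev | {b, c} and is_independent(S_prev | {a}):
        return 2
    return 3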
Let k = ⌈n/3⌉. For i ∈ {1, 2, . . ., k}, we recursively define a class of n_i × a_i matrices B^i
and a class of m_i × a_i matrices D^i as follows, where a_i = min{3i, n} and n_i, m_i are some
integers. (For convenience, we assume that n = 3k.)
If iteration 1 is of type 1 or 2, then D^1 = [1 1 0 ; 1 0 1]; else D^1 = [1 1 1]. (Here the rows
of a matrix are listed within brackets and separated by semicolons, and a matrix name appearing
as a row stands for the block consisting of all of its rows.)
If iteration 1 is of type 1, then B^1 = [1 0 0 ; D^1].
If iteration 1 is of type 2, then B^1 = [D^1 ; 0 1 1].
If iteration 1 is of type 3, then B^1 = [1 0 0 ; 1 1 0 ; D^1].
For i = 2, 3, . . ., k, the matrices are obtained from D^{i-1} by appending three new columns;
a block written as [D^{i-1} 1 1 0] denotes the matrix in which every row of D^{i-1} is extended by
the entries 1, 1, 0 in the three new columns, and blocks separated by semicolons are stacked.
If iteration i is of type 1 or 2, then
D^i = [D^{i-1} 1 1 0 ; D^{i-1} 1 0 1];
else
D^i = [D^{i-1} 1 1 1].
If iteration i is of type 1, then
B^i = [D^{i-1} 1 0 0 ; D^i].
If iteration i is of type 2, then
B^i = [D^i ; 0 0 ... 0 1 1],
where the last row has 1s only in its final two positions.
Otherwise,
B^i = [D^{i-1} 1 0 0 ; D^{i-1} 1 1 0 ; D^{i-1} 1 1 1].
Note that each column of B^i represents a unique element of E. Let L^i be the n_i × n matrix
whose (k, j)th element is L^i_{kj} = B^i_{kj} if j ≤ a_i, and 0 otherwise. We now define a p × n
matrix B as
B = [L^1 ; L^2 ; . . . ; L^k].
Each row i of B is the incidence vector of some subset S_i of E. Let b be the p-vector whose
ith component is f(S_i). It is not difficult to verify that the incidence vector x̄ of X^G ∈ F
(the output of the 3-step greedy algorithm) belongs to P and satisfies
B x̄ = b.
It follows from LP duality [20] that, to show that x̄ is an optimal solution to LP-trioid, it is
sufficient to produce a dual p-vector y ≥ 0 such that y^T B = w and the dual objective
function value satisfies Σ_{j=1}^{p} f(S_j) y_j = w x̄.
We assign values to the components of y recursively. Note that, for convenience, we
assume that n = 3k. Let j_i = Σ_{t=1}^{i} n_t for all i ∈ {1, 2, . . ., k}. Let w^k = w. For
i = k, k - 1, . . ., 1:
(1) If iteration i is of type 1, we define
y_r = (1/m_{i-1}) w^i_{3i}                              for r = j_i - m_{i-1} + 1, . . ., j_i,
y_r = (1/m_{i-1}) w^i_{3i-1}                            for r = j_i - 2m_{i-1} + 1, . . ., j_i - m_{i-1},
y_r = (1/m_{i-1}) (w^i_{3i-2} - w^i_{3i} - w^i_{3i-1})  for r = j_{i-1} + 1, . . ., j_i - 2m_{i-1}.
Let ȳ = (y_{j_{i-1}+1}, . . ., y_{j_i}) and w^{i-1} = w^i - ȳ^T B^i.
(2) If iteration i is of type 2, we define
y_r = (1/m_{i-1}) (w^i_{3i-2} + w^i_{3i-1} - w^i_{3i})/2   for r = j_{i-1} + 1, . . ., j_{i-1} + m_{i-1},
y_r = (1/m_{i-1}) (w^i_{3i-2} + w^i_{3i} - w^i_{3i-1})/2   for r = j_{i-1} + m_{i-1} + 1, . . ., j_i - 1,
y_r = (w^i_{3i-1} + w^i_{3i} - w^i_{3i-2})/2                for r = j_i.
Let ȳ = (y_{j_{i-1}+1}, . . ., y_{j_i}) and w^{i-1} = w^i - ȳ^T B^i.
(3) If iteration i is of type 3, we define
y_r = (1/m_{i-1}) w^i_{3i}                     for r = j_i - m_{i-1} + 1, . . ., j_i,
y_r = (1/m_{i-1}) (w^i_{3i-1} - w^i_{3i})      for r = j_i - 2m_{i-1} + 1, . . ., j_i - m_{i-1},
y_r = (1/m_{i-1}) (w^i_{3i-2} - w^i_{3i-1})    for r = j_{i-1} + 1, . . ., j_{i-1} + m_{i-1}.
Let w^{i-1}_r = w^i_r - w^i_{3i-2}.
The foregoing discussion leads to the following theorem.
Theorem 16. For any trioid (E, F), the polytope P gives the convex hull of the incidence vectors
of the elements of F.
5. Conclusion
We proposed a generalization of the greedy algorithm, called the m-step greedy algorithm,
and provided a complete characterization of the independence systems, called trioids, for which the
3-step greedy algorithm guarantees an optimal solution. Trioids form a proper generalization of
the well-studied class of matroids. We also characterized the trioid polytope, gener-
alizing the matroid polytope. Further, we introduced a class of set functions, called almost
submodular functions, that generalizes submodular functions. It is shown that the rank
function of a trioid is almost submodular. We conjecture that the converse of this result
is also true, i.e., that almost submodularity of the rank function is a necessary and sufficient
condition for an independence system to be a trioid. It would be interesting to investigate the
mathematical properties of the m-step greedy algorithm for m ≥ 4. This is left as a topic for
future research.
References
[1] M. Barnabei, G. Nicoletti, L. Pezzoli: Matroids on partially ordered sets. Adv. in Appl. Math. 21 (1998)
78 - 112.
[2] A. Bouchet: Matching and Δ-matroids, Discrete Math. 24 (1989) 55-62.
[3] A. Bouchet, and W. H. Cunningham, Delta-matroids, jump systems, and bisubmodular polyhedra,
SIAM J. Disc. Math. 8 (1995) 17-32.
[4] C. Croitoru, On approximate algorithms for combinatorial linear maximization problems, Mathematical
Methods of Operations Research 24 (1980) 171-175.
[5] R. Chandrasekaran and S. N. Kabadi, Pseudomatroids, Discrete Math., 71 (1988) 205-217.
[6] F.D.J. Dunstan, A.W. Ingleton, D.J.A. Welsh, Supermatroids, in: Proceedings of the Conference on
Combinatorial Mathematics, Mathematical Institute Oxford, 1972, Combinatorics (1972), 72 - 122
[7] J. Edmonds, Matroids and the greedy algorithm, Mathematical Programming 1 (1971) 127-36.
[8] J. Edmonds, Submodular functions, matroids, and certain polyhedra. In: Combinatorial Structures and
Their Applications, R. Guy et al. eds., Gordon and Breach, New York, 1970, 69 - 87.
[9] U. Faigle, S. Fujishige, A general model for matroids and the greedy algorithm, Mathematical Program-
ming 2008, DOI: 10.1007/s10107-008-0213-1
[10] U. Faigle: On supermatroids with submodular rank functions. In: Algebraic Methods in Graph Theory
I, Colloq. Math. Soc. J. Bolyai 25 (1981), 149-158.
[11] S. Fujishige: Submodular Functions and Optimization, 2nd ed., Annals of Discrete Mathematics 58,
2005
[12] P. Helman, B. M. E. Moret, H. D. Shapiro, An exact characterization of greedy structures, SIAM
Journal of Discrete Mathematics 6 (1993) 274-283.
[13] V. Il'ev, Hereditary systems and greedy-type algorithms, Discrete Applied Mathematics 132 (2003)
137-148.
[14] T.A. Jenkyns, The e cacy of the greedy algorithm, Proceedings of the 7th S-E Conference of Combi-
natorics, Graph Theory and Computing, Utilitas Math. Winnipeg, 1976, pp. 341-350
[15] S. N. Kabadi and R. Sridhar: Δ-matroid and jump system, Journal of Applied Mathematics and Decision
Sciences (2005) 95-106.
[16] B. Korte and D. Hausmann, An analysis of the greedy algorithm for independence systems, Ann. Disc.
Math. 2 (1978) 65-74.
[17] B. Korte and L. Lovasz, Greedoids - a structural framework for the greedy algorithm. In Progress in
Combinatorial Optimization, pages 221-243, 1984.
[18] C. McDiarmid, On the greedy algorithm with random costs, Mathematical Programming, 36 (1986)
245-255.
[19] J. Mestre, Greedy in Approximation Algorithms, manuscript, http://www.mpi-inf.mpg.de/~jmestre/papers/greedy.pdf
[20] K. G. Murty, Linear Programming, Wiley, 1983
[21] E. Tardos, An intersection theorem for supermatroids. J. Combin. Theory B 50 (1990) 150-159.
[22] H. Whitney, On the abstract properties of linear dependence. American Journal of Mathematics, 57
(1935) 509-533.
Faculty of Business Administration, University of New Brunswick, Fredericton, New
Brunswick, Canada
E-mail address : ******@***.**
Department of Mathematics, Simon Fraser University Surrey, Central City, 250-13450
102nd AV, Surrey, British Columbia,V3T 0A3, Canada
E-mail address : *******@***.**