
Basic feasible solution

In the theory of linear programming, a basic feasible solution (BFS) is a solution with a minimal set of non-zero variables. Geometrically, each BFS corresponds to a vertex of the polyhedron of feasible solutions. If there exists an optimal solution, then there exists an optimal BFS. Hence, to find an optimal solution, it is sufficient to consider the BFS-s. This fact is used by the simplex algorithm, which essentially travels from one BFS to another until an optimal solution is found.[1]

Definitions

Preliminaries: equational form with linearly-independent rows

For the definitions below, we first present the linear program in the so-called equational form:

maximize  $\mathbf{c}^T \mathbf{x}$
subject to  $A\mathbf{x} = \mathbf{b}$  and  $\mathbf{x} \geq 0$

where:

  • $\mathbf{c}$ and $\mathbf{x}$ are vectors of size n (the number of variables);
  • $\mathbf{b}$ is a vector of size m (the number of constraints);
  • $A$ is an m-by-n matrix;
  • $\mathbf{x} \geq 0$ means that all variables are non-negative.

Any linear program can be converted into an equational form by adding slack variables.
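
For instance, an LP whose constraints are inequalities $A'\mathbf{x} \leq \mathbf{b}$, $\mathbf{x} \geq 0$ can be brought into equational form by appending one slack variable per constraint. A minimal sketch of this conversion, assuming NumPy (the function and variable names are illustrative, not from any particular library):

```python
import numpy as np

def to_equational_form(A_ineq, b, c):
    """Convert  max c^T x  s.t.  A_ineq x <= b, x >= 0
    into equational form  max c_eq^T x'  s.t.  A_eq x' = b, x' >= 0
    by adding one non-negative slack variable per constraint."""
    m, n = A_ineq.shape
    A_eq = np.hstack([A_ineq, np.eye(m)])      # slack columns form an identity block
    c_eq = np.concatenate([c, np.zeros(m)])    # slacks do not appear in the objective
    return A_eq, b, c_eq

# Example: 2 constraints and 3 original variables -> 5 variables in equational form.
A_ineq = np.array([[1.0, 2.0, 1.0],
                   [3.0, 0.0, 2.0]])
b = np.array([4.0, 6.0])
c = np.array([1.0, 1.0, 1.0])
A_eq, b_eq, c_eq = to_equational_form(A_ineq, b, c)
print(A_eq)   # the original columns followed by a 2x2 identity block
```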

As a preliminary clean-up step, we verify that:

  • The system $A\mathbf{x} = \mathbf{b}$ has at least one solution (otherwise the whole LP has no solution and there is nothing more to do);
  • All m rows of the matrix $A$ are linearly independent, i.e., its rank is m (otherwise we can just delete redundant rows without changing the LP); a sketch of this check appears below.
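
Both checks can be carried out with standard linear algebra. A minimal sketch, assuming NumPy: it tests whether $A\mathbf{x} = \mathbf{b}$ is consistent (by comparing the ranks of $A$ and of the augmented matrix) and then keeps a maximal set of linearly independent rows.

```python
import numpy as np

def clean_up(A, b, tol=1e-9):
    """Return (A', b') with linearly independent rows, or None if A x = b
    has no solution (detected via the rank of the augmented matrix)."""
    rank_A = np.linalg.matrix_rank(A, tol=tol)
    rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]), tol=tol)
    if rank_Ab > rank_A:
        return None                      # the system A x = b is inconsistent
    keep, kept_rows = [], np.zeros((0, A.shape[1]))
    for i in range(A.shape[0]):
        candidate = np.vstack([kept_rows, A[i]])
        if np.linalg.matrix_rank(candidate, tol=tol) > kept_rows.shape[0]:
            keep.append(i)               # row i is independent of the rows kept so far
            kept_rows = candidate
    return A[keep], b[keep]
```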

Feasible solution

A feasible solution of the LP is any vector $\mathbf{x} \geq 0$ such that $A\mathbf{x} = \mathbf{b}$. We assume that there is at least one feasible solution. If m = n, then there is only one feasible solution. Typically m < n, so the system $A\mathbf{x} = \mathbf{b}$ has many solutions; each such solution with $\mathbf{x} \geq 0$ is a feasible solution of the LP.

Basis

A basis of the LP is a nonsingular submatrix of A, with all m rows and only m (out of n) columns.

Sometimes, the term basis is used not for the submatrix itself, but for the set of indices of its columns. Let B be a subset of m indices from {1,...,n}. Denote by $A_B$ the square m-by-m matrix made of the m columns of $A$ indexed by B. If $A_B$ is nonsingular, the columns indexed by B are a basis of the column space of $A$. In this case, we call B a basis of the LP.

Since the rank of $A$ is m, it has at least one basis; since $A$ has n columns, it has at most $\binom{n}{m}$ bases.

Basic feasible solution

Given a basis B, we say that a feasible solution $\mathbf{x}$ is a basic feasible solution with basis B if all its non-zero variables are indexed by B, that is, for all $j \notin B$: $x_j = 0$.

Properties

1. A BFS is determined only by the constraints of the LP (the matrix $A$ and the vector $\mathbf{b}$); it does not depend on the optimization objective.

2. By definition, a BFS has at most m non-zero variables and at least n-m zero variables. A BFS can have fewer than m non-zero variables; in that case, it can have many different bases, all of which contain the indices of its non-zero variables.

3. A feasible solution $\mathbf{x}$ is basic if and only if the columns of the matrix $A_K$ are linearly independent, where K is the set of indices of the non-zero elements of $\mathbf{x}$.[1]: 45 
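
Property 3 gives a direct computational test for basicness. A minimal sketch, assuming NumPy (the helper name is my own):

```python
import numpy as np

def is_basic(A, x, tol=1e-9):
    """Check Property 3: x is basic iff the columns of A indexed by the
    support of x (its non-zero entries) are linearly independent."""
    K = np.flatnonzero(np.abs(x) > tol)          # indices of non-zero entries of x
    return np.linalg.matrix_rank(A[:, K], tol=tol) == len(K)
```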

4. Each basis determines a unique BFS: for each basis B of m indices, there is at most one BFS $\mathbf{x}_B$ with basis B. This is because $\mathbf{x}_B$ must satisfy the constraint $A_B \mathbf{x}_B = \mathbf{b}$, and by definition of basis the matrix $A_B$ is non-singular, so the constraint has a unique solution:

$$\mathbf{x}_B = A_B^{-1}\cdot \mathbf{b}$$

The opposite is not true: each BFS can come from many different bases. If the unique solution of $\mathbf{x}_B = A_B^{-1}\cdot \mathbf{b}$ satisfies the non-negativity constraints $\mathbf{x}_B \geq 0$, then B is called a feasible basis.

5. If a linear program has an optimal solution (i.e., it has a feasible solution, and the set of feasible solutions is bounded), then it has an optimal BFS. This is a consequence of the Bauer maximum principle: the objective of a linear program is linear, hence convex; the set of feasible solutions is convex (it is an intersection of hyperplanes and half-spaces); therefore the objective attains its maximum at an extreme point of the set of feasible solutions.

Since the number of BFS-s is finite and bounded by $\binom{n}{m}$, an optimal solution to any LP can be found in finite time by just evaluating the objective function in all $\binom{n}{m}$ BFS-s. This is not the most efficient way to solve an LP; the simplex algorithm examines the BFS-s in a much more efficient way.
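
Such exhaustive enumeration is practical only for tiny instances, but it is easy to write down. A sketch, assuming NumPy: it loops over all $\binom{n}{m}$ column subsets, keeps those that are bases with a feasible basic solution, and returns the BFS with the largest objective value.

```python
import itertools
import numpy as np

def best_bfs_by_enumeration(A, b, c, tol=1e-9):
    """Enumerate every m-subset of columns, solve A_B x_B = b when A_B is a
    basis, keep the non-negative (feasible) solutions, and return the BFS
    maximizing c^T x.  Exponential in general -- for illustration only."""
    m, n = A.shape
    best_x, best_val = None, -np.inf
    for cols in itertools.combinations(range(n), m):
        B = list(cols)
        A_B = A[:, B]
        if np.linalg.matrix_rank(A_B, tol=tol) < m:
            continue                          # columns are dependent: not a basis
        x_B = np.linalg.solve(A_B, b)
        if np.any(x_B < -tol):
            continue                          # basic solution, but not feasible
        x = np.zeros(n)
        x[B] = x_B
        if c @ x > best_val:
            best_x, best_val = x, c @ x
    return best_x, best_val
```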

Examples

Consider a linear program with the following constraints:

$$
\begin{aligned}
x_1 + 5x_2 + 3x_3 + 4x_4 + 6x_5 &= 14 \\
x_2 + 3x_3 + 5x_4 + 6x_5 &= 7 \\
x_i &\geq 0 \quad \text{for all } i \in \{1,\ldots,5\}
\end{aligned}
$$

The matrix A and the vector $\mathbf{b}$ are:

$$A = \begin{pmatrix} 1 & 5 & 3 & 4 & 6 \\ 0 & 1 & 3 & 5 & 6 \end{pmatrix}, \qquad \mathbf{b} = (14, 7).$$

Here, m=2 and there are 10 subsets of 2 indices; however, not all of them are bases: the set {3,5} is not a basis, since columns 3 and 5 are linearly dependent.

The set B={2,4} is a basis, since the matrix $A_B = \begin{pmatrix} 5 & 4 \\ 1 & 5 \end{pmatrix}$ is non-singular.

The unique BFS corresponding to this basis is $\mathbf{x}_B = (0, 2, 0, 1, 0)$.
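
This example can be checked numerically; a short sketch, assuming NumPy (indices are 0-based in the code, so the basis {2,4} becomes [1, 3]):

```python
import numpy as np

A = np.array([[1.0, 5.0, 3.0, 4.0, 6.0],
              [0.0, 1.0, 3.0, 5.0, 6.0]])
b = np.array([14.0, 7.0])

# {3,5} is not a basis: columns 3 and 5 (0-based indices 2 and 4) are dependent.
print(np.linalg.matrix_rank(A[:, [2, 4]]))   # 1, so not a basis

# B = {2,4} (0-based [1, 3]) is a basis; the corresponding BFS is (0, 2, 0, 1, 0).
B = [1, 3]
x_B = np.linalg.solve(A[:, B], b)            # -> [2., 1.]
x = np.zeros(5)
x[B] = x_B
print(x)                                     # [0. 2. 0. 1. 0.]
```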

Geometric interpretation

The set of all feasible solutions is an intersection of hyperplanes and half-spaces. Therefore, it is a convex polyhedron. If it is bounded, then it is a convex polytope. Each BFS corresponds to a vertex of this polytope.[1]: 53–56 

Basic feasible solutions for the dual problem

As mentioned above, every basis B defines a unique basic feasible solution $\mathbf{x}_B = A_B^{-1}\cdot \mathbf{b}$. In a similar way, each basis defines a solution to the dual linear program:

minimize  $\mathbf{b}^T \mathbf{y}$
subject to  $A^T \mathbf{y} \geq \mathbf{c}$.

The solution is $\mathbf{y}_B = (A_B^T)^{-1}\cdot \mathbf{c}_B$, where $\mathbf{c}_B$ is the restriction of $\mathbf{c}$ to the coordinates in B.
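
A minimal sketch of this computation, assuming NumPy and reusing the matrix from the Examples section together with a hypothetical objective vector c (the objective is not part of the original example):

```python
import numpy as np

A = np.array([[1.0, 5.0, 3.0, 4.0, 6.0],
              [0.0, 1.0, 3.0, 5.0, 6.0]])
c = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # hypothetical objective, for illustration only
B = [1, 3]                                 # the basis {2,4} in 0-based indexing

A_B = A[:, B]
y_B = np.linalg.solve(A_B.T, c[B])         # y_B = (A_B^T)^{-1} c_B
print(y_B)
# y_B is feasible for the dual iff A^T y_B >= c holds componentwise:
print(np.all(A.T @ y_B >= c - 1e-9))
```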

Finding an optimal BFS

There are several methods for finding a BFS that is also optimal.

Using the simplex algorithm

In practice, the easiest way to find an optimal BFS is to use the simplex algorithm. It keeps, at each point of its execution, a "current basis" B (a subset of m out of n variables), a "current BFS", and a "current tableau". The tableau is a representation of the linear program where the basic variables are expressed in terms of the non-basic ones:[1]: 65 

$$\begin{aligned} x_B &= p + Q x_N \\ z &= z_0 + r^T x_N \end{aligned}$$

where $x_B$ is the vector of m basic variables, $x_N$ is the vector of the n-m non-basic variables, and $z$ is the maximization objective. Since the non-basic variables equal 0, the current BFS is $p$ (padded with zeros in the non-basic coordinates), and the current maximization objective is $z_0$.

If all coefficients in $r$ are negative, then the current BFS is an optimal solution with value $z_0$, since all variables (including all non-basic variables) must be at least 0, so the second line implies $z \leq z_0$.

If some coefficients in $r$ are positive, then it may be possible to increase the maximization target. For example, if $x_5$ is non-basic and its coefficient in $r$ is positive, then increasing it above 0 may make $z$ larger. If it is possible to do so without violating other constraints, then the increased variable becomes basic (it "enters the basis"), while some basic variable is decreased to 0 to keep the equality constraints and thus becomes non-basic (it "exits the basis").

If this process is done carefully, then it is possible to guarantee that $z$ increases until it reaches an optimal BFS.
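
The bookkeeping of a single iteration can be made concrete. The sketch below, assuming NumPy, computes the tableau quantities $p$, $Q$, $z_0$, $r$ for a given basis under the conventions above and lists the non-basic variables with positive coefficients in $r$ (the candidates for entering the basis); it is an illustration of the data the simplex algorithm maintains, not a full implementation.

```python
import numpy as np

def tableau(A, b, c, B):
    """Compute the tableau for basis B:  x_B = p + Q x_N,  z = z_0 + r^T x_N."""
    n = A.shape[1]
    N = [j for j in range(n) if j not in B]   # non-basic indices
    A_B_inv = np.linalg.inv(A[:, B])
    p = A_B_inv @ b                           # values of the basic variables in the BFS
    Q = -A_B_inv @ A[:, N]                    # effect of raising the non-basic variables
    z0 = c[B] @ p                             # current objective value
    r = c[N] + Q.T @ c[B]                     # reduced costs of the non-basic variables
    return p, Q, z0, r, N

# Example with the matrix from the Examples section and a hypothetical objective c.
A = np.array([[1.0, 5.0, 3.0, 4.0, 6.0],
              [0.0, 1.0, 3.0, 5.0, 6.0]])
b = np.array([14.0, 7.0])
c = np.array([1.0, 2.0, 3.0, 4.0, 5.0])       # hypothetical, for illustration only
p, Q, z0, r, N = tableau(A, b, c, [1, 3])
entering = [N[j] for j in range(len(N)) if r[j] > 1e-9]
print(z0, entering)   # if 'entering' is empty, the current BFS is already optimal
```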

Converting any optimal solution to an optimal BFS

In the worst case, the simplex algorithm may require exponentially many steps to complete. There are algorithms for solving an LP in weakly-polynomial time, such as the ellipsoid method; however, they usually return optimal solutions that are not basic.

However, given any optimal solution to the LP, it is easy to find an optimal feasible solution that is also basic.[2] See also "external links" below.
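
One standard way to do this conversion (a sketch of the usual support-reduction argument, not of Megiddo's method [2]): while the columns of $A$ at the support of $\mathbf{x}$ are linearly dependent, move along a null-space direction of those columns until some positive coordinate reaches zero; since the direction is in the null space of $A$ and, by optimality of $\mathbf{x}$, orthogonal to $\mathbf{c}$, the move preserves both feasibility and the objective value while shrinking the support. A sketch, assuming NumPy:

```python
import numpy as np

def optimal_to_optimal_bfs(A, c, x, tol=1e-9):
    """Given an optimal solution x of  max c^T x  s.t.  Ax = b, x >= 0,
    return an optimal solution whose support columns are linearly
    independent, i.e. an optimal basic feasible solution (Property 3)."""
    x = np.asarray(x, dtype=float).copy()
    while True:
        K = np.flatnonzero(x > tol)                     # support of x
        if np.linalg.matrix_rank(A[:, K], tol=tol) == len(K):
            return x                                    # already basic
        # A non-zero direction d supported on K with A d = 0
        # (the last right-singular vector of A_K lies in its null space).
        d = np.zeros_like(x)
        d[K] = np.linalg.svd(A[:, K])[2][-1]
        # For an optimal x, c @ d must vanish, so the move keeps the objective.
        if abs(c @ d) > 1e-7 * (np.linalg.norm(c) + 1.0):
            raise ValueError("x does not appear to be optimal")
        if not np.any(d < -tol):                        # ensure some coordinate decreases
            d = -d
        dec = d < -tol
        t = np.min(x[dec] / -d[dec])                    # largest step keeping x >= 0
        x = np.maximum(x + t * d, 0.0)                  # a support coordinate hits zero
```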

Finding a basis that is both primal-optimal and dual-optimal

A basis B of the LP is called dual-optimal if the solution $\mathbf{y}_B = (A_B^T)^{-1}\cdot \mathbf{c}_B$ is an optimal solution to the dual linear program, that is, it minimizes $\mathbf{b}^T \mathbf{y}$. In general, a primal-optimal basis is not necessarily dual-optimal, and a dual-optimal basis is not necessarily primal-optimal (in fact, the solution of a primal-optimal basis may even be infeasible for the dual, and vice versa).

If both $\mathbf{x}_B = A_B^{-1}\cdot \mathbf{b}$ is an optimal BFS of the primal LP, and $\mathbf{y}_B = (A_B^T)^{-1}\cdot \mathbf{c}_B$ is an optimal solution of the dual LP, then the basis B is called PD-optimal. Every LP with an optimal solution has a PD-optimal basis, and one is found by the simplex algorithm. However, its run-time is exponential in the worst case. Nimrod Megiddo proved the following theorems:[2]

  • There exists a strongly-polynomial-time algorithm that takes as input an optimal solution to the primal LP and an optimal solution to the dual LP, and returns an optimal basis.
  • If there exists a strongly-polynomial-time algorithm that takes as input an optimal solution to only the primal LP (or only the dual LP) and returns an optimal basis, then there exists a strongly-polynomial-time algorithm for solving any linear program (the latter is a famous open problem).

Megiddo's algorithms can be executed using a tableau, just like the simplex algorithm.
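
Independently of how such a basis is found, testing whether a given basis B is PD-optimal is straightforward from the two formulas above: B is PD-optimal exactly when $A_B^{-1}\mathbf{b} \geq 0$ (primal feasibility) and $A^T \mathbf{y}_B \geq \mathbf{c}$ (dual feasibility), because the two objective values then coincide and weak duality makes both solutions optimal. A minimal sketch, assuming NumPy:

```python
import numpy as np

def is_pd_optimal(A, b, c, B, tol=1e-9):
    """Check whether basis B is PD-optimal: x_B = A_B^{-1} b is primal-feasible
    and y_B = (A_B^T)^{-1} c_B is dual-feasible.  When both hold, their
    objective values coincide, so both are optimal by weak duality."""
    A_B = A[:, B]
    x_B = np.linalg.solve(A_B, b)        # basic part of the primal solution
    y_B = np.linalg.solve(A_B.T, c[B])   # dual solution defined by the basis
    primal_feasible = np.all(x_B >= -tol)
    dual_feasible = np.all(A.T @ y_B >= c - tol)
    return primal_feasible and dual_feasible
```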

External links

  • How to move from an optimal feasible solution to an optimal basic feasible solution. Paul Robin, Operations Research Stack Exchange.

References

  1. Gärtner, Bernd; Matoušek, Jiří (2006). Understanding and Using Linear Programming. Berlin: Springer. ISBN 3-540-30697-8.
  2. Megiddo, Nimrod (1991-02-01). "On Finding Primal- and Dual-Optimal Bases". ORSA Journal on Computing. 3 (1): 63–65. CiteSeerX 10.1.1.11.427. doi:10.1287/ijoc.3.1.63. ISSN 0899-1499.
