Commit bd50ba0

fixes
1 parent 44e7365 · commit bd50ba0

File tree

1 file changed: +5 -5 lines changed


src/graph/hungarian-algorithm.md

Lines changed: 5 additions & 5 deletions

@@ -142,21 +142,21 @@ The key idea is to **consider matrix rows one by one**, and not all at once. Thu

 2. While there is no increasing path starting in this row, recalculate the potential.

-3. As soon as an augmenting path is found, alternate the matching along it (thus including the last row in the matching), and restart from point 1 (to consider the next line).
+3. As soon as an augmenting path is found, propagate the matching along it (thus including the last edge in the matching), and restart from step 1 (to consider the next line).

-To achieve the required asymptotic complexity, it is necessary to implement steps 2-3, which are performed for each row of the matrix, in time $\mathcal{O}(n^2)$ (for rectangular problems in $\mathcal{O}(nm)$).
+To achieve the required complexity, it is necessary to implement steps 2-3, which are performed for each row of the matrix, in time $\mathcal{O}(n^2)$ (for rectangular problems in $\mathcal{O}(nm)$).

 To do this, recall two facts proved above:

 - With a change in the potential, the vertices that were reachable by Kuhn's traversal will remain reachable.

 - In total, only $\mathcal{O}(n)$ recalculations of the potential could occur before an augmenting path was found.

-From this follow these **key ideas** that allow us to achieve the required asymptotics:
+From this follow these **key ideas** that allow us to achieve the required complexity:

 - To check for the presence of an augmenting path, there is no need to start the Kuhn traversal again after each potential recalculation. Instead, you can make the Kuhn traversal in an **iterative form**: after each recalculation of the potential, look at the added rigid edges and, if their left ends were reachable, mark their right ends reachable as well and continue the traversal from them.

-- Developing this idea further, we can present the algorithm as follows: a cyclic process is employed where, at each step, the potential is recalculated. Subsequently, a column that has become reachable is identified (which will always exist as new reachable vertices emerge after every potential recalculation). If the column is unsaturated, an augmenting chain is discovered. Conversely, if the column is saturated, the corresponding row in the matching also becomes reachable.
+- Developing this idea further, we can present the algorithm as follows: at each step of the loop, the potential is recalculated. Subsequently, a column that has become reachable is identified (which will always exist as new reachable vertices emerge after every potential recalculation). If the column is unsaturated, an augmenting chain is discovered. Conversely, if the column is saturated, the matching row also becomes reachable.

 - To quickly recalculate the potential (faster than the $\mathcal{O}(n^2)$ naive version), you need to maintain auxiliary minima for each of the columns:

@@ -168,7 +168,7 @@ $$\Delta=\min_{j\notin Z_2} minv[j].$$

 Thus, finding $\Delta$ can now be done in $\mathcal{O}(n)$.

-It is necessary to update the array $minv$ when new visited rows appear. This can be done in $\mathcal{O}(n)$ for the added row (which adds up to $\mathcal{O}(n^2)$). It is also necessary to update the array $minv$ when recalculating the potential, which is also done in time $\mathcal{O}(n)$ ($minv$ changes only for columns that have not yet been reached: namely, it decreases by $\Delta$).
+It is necessary to update the array $minv$ when new visited rows appear. This can be done in $\mathcal{O}(n)$ for the added row (which adds up over all rows to $\mathcal{O}(n^2)$). It is also necessary to update the array $minv$ when recalculating the potential, which is also done in time $\mathcal{O}(n)$ ($minv$ changes only for columns that have not yet been reached: namely, it decreases by $\Delta$).

 Thus, the algorithm takes the following form: in the outer loop, we consider matrix rows one by one. Each row is processed in time $\mathcal{O}(n^2)$, since only $\mathcal{O}(n)$ potential recalculations could occur (each in time $\mathcal{O}(n)$), and the array $minv$ is maintained in time $\mathcal{O}(n^2)$; Kuhn's algorithm will work in time $\mathcal{O}(n^2)$ (since it is presented in the form of $\mathcal{O}(n)$ iterations, each of which visits a new column).
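
For reference, the row-by-row scheme described in the changed paragraphs can be sketched in C++ roughly as follows. This is a minimal illustrative sketch rather than the article's reference implementation: the identifiers ($u$, $v$, $p$, $way$, $minv$), the 1-based indexing with a dummy column 0, and the assumption $n \le m$ are conventions chosen here for the example.

```cpp
#include <vector>
#include <limits>
using namespace std;

// a is a 1-based (n+1) x (m+1) cost matrix with n <= m; returns the minimum assignment cost.
long long hungarian(const vector<vector<long long>>& a, int n, int m) {
    const long long INF = numeric_limits<long long>::max();
    vector<long long> u(n + 1), v(m + 1);   // potentials for rows and columns
    vector<int> p(m + 1), way(m + 1);       // p[j]: row matched to column j (0 = unmatched); way[j]: previous column on the path
    for (int i = 1; i <= n; ++i) {          // step 1: add row i to the matching
        p[0] = i;                           // dummy column 0 temporarily holds the new row
        int j0 = 0;                         // column currently being scanned
        vector<long long> minv(m + 1, INF); // auxiliary minima for the columns
        vector<bool> used(m + 1, false);    // columns already reached by the (iterative) Kuhn traversal
        do {                                // steps 2-3: recalculate the potential until an augmenting path appears
            used[j0] = true;
            int i0 = p[j0], j1 = -1;
            long long delta = INF;
            for (int j = 1; j <= m; ++j)    // refresh minv with the newly visited row i0 and pick Delta
                if (!used[j]) {
                    long long cur = a[i0][j] - u[i0] - v[j];
                    if (cur < minv[j]) { minv[j] = cur; way[j] = j0; }
                    if (minv[j] < delta) { delta = minv[j]; j1 = j; }
                }
            for (int j = 0; j <= m; ++j)    // shift the potential by Delta; unreached columns just lower minv
                if (used[j]) { u[p[j]] += delta; v[j] -= delta; }
                else minv[j] -= delta;
            j0 = j1;                        // the column that has just become reachable
        } while (p[j0] != 0);               // stop as soon as that column is unsaturated
        do {                                // propagate the matching along the augmenting path
            int j1 = way[j0];
            p[j0] = p[j1];
            j0 = j1;
        } while (j0);
    }
    long long cost = 0;
    for (int j = 1; j <= m; ++j)
        if (p[j]) cost += a[p[j]][j];
    return cost;
}
```

Each pass of the inner do-while marks a new column as used, so processing one row takes $\mathcal{O}(nm)$ ($\mathcal{O}(n^2)$ for a square matrix), which matches the per-row bound stated in the text and gives $\mathcal{O}(n^3)$ in total.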
