# Hacker Cup 2016 Finals Solutions

Here are the solutions to the Hacker Cup 2016 Finals problems. If you had a rejected solution and want to find out where you went wrong, read on and download the official test data and solutions!

Snake and Ladder, Boomerang Crews, RNG, Maximinimax Flow, and Rainbow String were written by Jacob Plachta. Grundy Graph was written by Vladislav Isenbaev.

### Snake and Ladder

Let’s first consider fully-blocked rungs (those with two flowers, one per rail). If the first A rungs and the last B rungs are fully blocked in this way (for some non-negative integers A and B), then those rungs can be completely ignored, effectively decreasing N by A + B and leaving us with a smaller ladder. After this reduction, if there are any more fully-blocked rungs on the ladder, then the answer must be 0.

Now, let’s consider the remaining flowers. First, what happens if there are in fact no more flowers? In this case, if N = 1, then there are 2 arrangements. Otherwise, for every pair of columns i and j (such that |i – j| ≠ 1), and each rail k, it can be shown that there’s exactly 1 valid arrangement for the snake if its head goes at the intersection of rung i and rail k, and its tail goes somewhere on rung j (the snake will zig-zag between rungs i and j). This results in a total of 2N^2 – 2N + 4 arrangements.

What if there’s at least one flower in the way? After sorting the flowers by rung, we can observe that the snake has no choice about what to do between consecutive flowers — it must zig-zag between them (similar to the case with no flowers). In fact, if the distance between 2 consecutive flowers is an even number of rungs and they’re on the same rail (or if the distance is odd and they’re on different rails), then the snake can’t navigate between them and the answer must be 0. Otherwise, the snake can choose what to do before the first flower and after the last flower. In particular, the snake can choose to zig-zag for some number of rungs before the first flower, before proceeding to the first rung in a straight line. If the first flower is on rung i, then the snake has max{ 1, i – 1 } options. This should then be multiplied by the number of options after the last flower to produce the number of valid paths for the snake’s body, which should in turn be multiplied by 2 to yield the final answer, since the snake’s head can go at either end of the path (except in the special case of the snake having 0 length, in which case the answer should be 1 rather than 2).
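The feasibility check and counting above can be sketched as follows. This is a minimal sketch assuming the fully-blocked rungs at both ends have already been trimmed and at least one flower remains; the max{ 1, N – j } factor for the options after the last flower (on rung j) mirrors the stated formula for the first flower and is an assumption here, and the 0-length special case is omitted.

```python
def count_arrangements(n, flowers):
    """flowers: list of (rung, rail) pairs, rungs 1..n, rails 0 and 1.
    Assumes end-blocked rungs were trimmed and no fully-blocked rung remains."""
    flowers = sorted(flowers)
    for (r1, k1), (r2, k2) in zip(flowers, flowers[1:]):
        # an even gap on the same rail, or an odd gap on different rails,
        # makes the forced zig-zag between the two flowers impossible
        if ((r2 - r1) % 2 == 0) == (k1 == k2):
            return 0
    before = max(1, flowers[0][0] - 1)   # options before the first flower
    after = max(1, n - flowers[-1][0])   # assumed symmetric count after the last
    return 2 * before * after            # the head can go at either end
```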

The total time complexity of this algorithm is O(K log K).

### Boomerang Crews

Let us consider a member of the opposing crew to be “strong” if they are stronger than your crew’s strongest member. Note that you can always arrange for your strongest member to defeat all non-strong opponents without issue. What remains is to determine the largest number of throwing contests that can be won against strong opponents. This value can be found with binary search — if a strategy exists to defeat at least x strong opponents, then a strategy also exists to defeat at least x – 1 of them.
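This binary search over the monotone property can be sketched like so (a generic skeleton; feasible(x) stands for the feasibility test developed in the next paragraph, and is assumed to be monotone):

```python
def max_defeatable(num_strong, feasible):
    """Largest x in [0, num_strong] with feasible(x) true,
    given that feasible(x) implies feasible(x - 1)."""
    lo, hi = 0, num_strong
    while lo < hi:
        mid = (lo + hi + 1) // 2  # round up so lo always advances
        if feasible(mid):
            lo = mid
        else:
            hi = mid - 1
    return lo
```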

The problem has been reduced to testing whether or not some number x of the strong opponents may be defeated. You might as well plan to defeat the x weakest strong opponents. Each of them must be “whittled down” as necessary by winning against one or more of your own crew members before being defeated. You might as well plan to have your x strongest crew members defeat them, while your remaining N – x crew members are sent in to lose and whittle the opponents down.

One question remains — how should you match up your x strongest crew members against the x weakest strong opponents? Each of your members in turn (in an arbitrary order) can be greedily matched against a remaining opponent by minimizing the difference between your crew member’s strength and their opponent’s final strength (after D has been repeatedly subtracted from it as necessary). This can be done efficiently by maintaining a set of the opponents’ strengths modulo D, for example. Finally, once a matching has been decided on, we can compute the number of times each opponent must be whittled down to lose their throwing contest (keeping in mind that each opponent after the first will get whittled down 1 extra time automatically from beating your previous crew member after their win), and check if the sum of those values is no greater than N – x. The total time complexity of this algorithm is O(N * log^2(N) + M).

### Grundy Graph

Let’s think of the given graph with 2N nodes as being an implication graph for a system of N Boolean variables. In particular, we’ll have each pair of nodes 2k + 1 and 2k + 2 (for k ≥ 0) represent the (k + 1)th variable and its negation, respectively. Colouring a node black can then represent selecting that value. As usual, an edge from node i to node j will represent the fact that i implies j. To complete the implication graph, we should also fill in any missing implied edges (for example, if there’s an edge from node 1 to node 3, then the 1st variable implies the 2nd variable, which means that the negation of the 2nd variable implies the negation of the 1st variable and there should be an edge from node 4 to node 2).
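Filling in the missing contrapositive edges can be sketched as follows (nodes are numbered 1..2N as above, with comp(v) mapping each node to its complement):

```python
def complete_implications(n_vars, edges):
    """Add the contrapositive comp(v) -> comp(u) for every edge u -> v,
    where nodes 2k + 1 and 2k + 2 are complements of each other."""
    def comp(v):
        return v + 1 if v % 2 == 1 else v - 1
    full = set(edges)
    for u, v in edges:
        full.add((comp(v), comp(u)))
    return full
```

For instance, the edge from node 1 to node 3 in the example above induces the edge from node 4 to node 2.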

Under this interpretation, Alice will start by getting to choose the value of the 1st variable (if she colours node 1 black and node 2 white, then it’ll be true, and if she instead colours node 1 white and node 2 black, then it’ll be false). Bob will then get to choose the value of the 2nd variable, Alice will choose the value of the 3rd variable, and so on until all N variables have been assigned a Boolean value.

This implication graph conceptualization also ties nicely into determining the winner of the game. Bob wins if there exists at least one edge from a black node to a white node, which is equivalent to a constraint that hasn’t been satisfied by the selection of variable values. Therefore, Alice is trying to assign values to variables to satisfy the given 2-satisfiability problem, while Bob is trying to prevent that.

Expressing this game in a more mathematical sense (in a manner common to 2-player turn-based games), Alice wins if the following sequence of N + 1 statements is true:

- 1. For at least one value of variable 1,
- 2. For all values of variable 2,
- 3. For at least one value of variable 3,
- …
- N + 1. The system of constraints is satisfied.

In other words, we have a 2-satisfiability problem prefixed by a series of existential quantifiers (for Alice’s turns) and universal quantifiers (for Bob’s turns). Let’s call each node whose associated quantifier is existential an “existential node” (these are nodes 1, 2, 5, 6, etc.). Let’s call each of the remaining nodes a “universal node” (nodes 3, 4, 7, 8, etc.). It can then be shown that Bob wins (the expression is not satisfiable) if and only if at least one of the following three conditions holds:

- An existential node is in the same strongly connected component as its complement node.
- An existential node i is in the same strongly connected component as a universal node j, such that j’s associated quantifier is within the scope of i’s quantifier (in other words, i < j).
- A universal node is reachable from another universal node.

The graph’s strongly connected components can be found in linear time, for example using Kosaraju’s algorithm, which will make the first two conditions simple to check. The third condition can also be handled in linear time in total with a depth first search from each universal node, taking care to not visit any node multiple times. As such, the time complexity of this algorithm is O(N + M).
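Checking the three conditions can be sketched like this, assuming the SCC ids have already been computed (e.g. with Kosaraju’s algorithm). For brevity, condition 3 here runs a plain DFS per universal node rather than the shared-visited-marks linear-time version:

```python
def bob_wins(n_vars, scc, adj):
    """scc[v]: SCC id of node v (1..2*n_vars); adj[v]: out-neighbours of v
    in the completed implication graph."""
    def comp(v):
        return v + 1 if v % 2 == 1 else v - 1
    def universal(v):
        # Alice picks odd-numbered variables, Bob picks even-numbered ones
        return ((v + 1) // 2) % 2 == 0
    nodes = range(1, 2 * n_vars + 1)
    # 1) an existential node shares an SCC with its complement
    if any(not universal(v) and scc[v] == scc[comp(v)] for v in nodes):
        return True
    # 2) existential i shares an SCC with universal j, with i < j
    if any(not universal(i) and universal(j) and i < j and scc[i] == scc[j]
           for i in nodes for j in nodes):
        return True
    # 3) a universal node is reachable from another universal node
    for s in (v for v in nodes if universal(v)):
        stack, seen = [s], {s}
        while stack:
            u = stack.pop()
            for w in adj.get(u, []):
                if w not in seen:
                    if universal(w):
                        return True
                    seen.add(w)
                    stack.append(w)
    return False
```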

### RNG

This problem can be solved with bitmask DP. For convenience, let I_0 = 1, and then let DP[b][i] be the minimum expected time to finish, given that you’ve collected the items in bitmask b so far, and are currently in area I_i (for 0 ≤ b < 2^K and 0 ≤ i ≤ K). We can initialize DP[2^K – 1][0..K] to be 0, and the final answer will be DP[0][0].

Before proceeding any further, we’ll need to precompute some things. For each pair of i and j (0 ≤ i, j ≤ K), we’d like to compute the minimum distance T[i][j] from area I_i to area I_j (if it’s reachable). For each item i (1 ≤ i ≤ K), we’d also like to compute the minimum distance L[i] from area I_i to any “leaf” (area with no outgoing paths). All of these distances can be found in O(NK) time by running BFS K + 1 times.

We’ll also need to precompute some expected values. First, let eDown[d] be the expected time to travel from area 1 to an area which is a distance of d away from it. eDown[0] = 0, and we can use the recurrence eDown[d] = eDown[d – 1] + D + (1 – P) / P * (R + eDown[d – 1]). Conversely, let eUp[d] be the expected time to get back to area 1 from an area which is a distance of d away from the nearest leaf (an area i such that L[i] = d). eUp[0] = R, and eUp[d] = P * (D + eUp[d – 1]) + (1 – P) * R. Finally, let eFail[d] be the expected time spent getting sent back to area 1 while attempting to take d paths in a row (failure occurs with probability 1 – P^d, and this probability is already factored into eFail[d], so it isn’t conditioned on failing). eFail[0] = 0, and eFail[d] = eFail[d – 1] + P^(d – 1) * (1 – P) * (D * (d – 1) + R). Precomputing all of these expected values takes O(N) time.
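A direct transcription of these recurrences (the parameter names are assumptions: P is the per-path success probability, D the path traversal time, and R the reset time back to area 1):

```python
def precompute_expectations(max_d, P, D, R):
    """Tables eDown[0..max_d], eUp[0..max_d], eFail[0..max_d]
    per the recurrences above."""
    eDown = [0.0] * (max_d + 1)
    eUp = [0.0] * (max_d + 1)
    eFail = [0.0] * (max_d + 1)
    eUp[0] = R
    pw = 1.0  # maintains P^(d - 1)
    for d in range(1, max_d + 1):
        eDown[d] = eDown[d - 1] + D + (1 - P) / P * (R + eDown[d - 1])
        eUp[d] = P * (D + eUp[d - 1]) + (1 - P) * R
        eFail[d] = eFail[d - 1] + pw * (1 - P) * (D * (d - 1) + R)
        pw *= P
    return eDown, eUp, eFail
```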

At last, we’re ready to proceed with the DP. We should iterate over the remaining bitmasks in descending order (from 2^K – 2 down to 0). For each bitmask b, and for each i between 1 and K which is not in bitmask b, we can first set DP[b][i] to be equal to DP[b | (2^(i – 1))][i]. Next, we can use those values to compute DP[b][0] as the minimum value of DP[b][i] + eUp[T[0][i]], for any i between 1 and K which is not in bitmask b. Finally, for each i between 1 and K which is in bitmask b, we can compute DP[b][i] as the minimum of two strategies. We can attempt to reach the root as soon as possible by heading towards the nearest leaf, with an expected time of DP[b][0] + eUp[L[i]]. Otherwise, we can attempt to directly reach some other item j (which is not in bitmask b but is reachable from area I_i), possibly failing and returning to area 1 along the way — if we let x = T[i][j], this has an expected time of P^x * (D * x + DP[b][j]) + (1 – P^x) * DP[b][0] + eFail[x].

With the above DP evaluation, the states are computed in an appropriate order, and all potentially optimal strategies are considered. The total time complexity of this solution is O(NK + 2^K * K^2).

### Maximinimax Flow

With N nodes and N edges, the graph is sure to have exactly 1 cycle. Our first step should be to find all of the edges which belong to that cycle, which can be done in O(N) time by performing a BFS from an arbitrary node until a node is visited twice, and then retracing the cycle.

Beyond that, it can be shown that the layout of the graph in fact doesn’t matter at all — if we let x be the smallest capacity of any non-cycle edge, and y and z be the two smallest capacities of any cycle edges, then the minimax flow in the graph can be shown to be min{ x, y + z }.

This insight allows us to approach the efficient handling of the operations. For the cycle edges, we’d like to maintain two Binary Indexed Trees — one to query the number of cycle edges with capacities no larger than some value v, and another to query their sum. We’ll want a corresponding pair of BITs for the non-cycle edges. Additionally, for just the cycle edges, we’ll want to maintain an ordered set of their capacities (for the purpose of looking up the two smallest ones). All 5 of these data structures can be initialized and modified in response to update operations quite easily.

What remains is answering the queries. To answer a given query with a budget of a edge augmentations, we can binary search on the answer (since, if it’s possible to achieve a maximinimax flow of f, it must be possible to also achieve f – 1). For a given value of f, we’ll need to compute the number of edge augmentations which must be used on cycle and non-cycle edges independently, and check if the sum of those two values is no greater than a. For non-cycle edges, this is fairly straightforward — we can use the count BIT to determine how many of them have capacities smaller than f, and then use the sum BIT to compute the cost of augmenting all of those up to a capacity of exactly f. The computation for cycle edges involves the same concept, but is trickier. If the smallest cycle edge capacity is c1 and the second-smallest one is c2, then there are the following cases to consider:

- f ≤ c1 + c2 — no augmentations are required
- c1 + c2 < f ≤ 2 * c2 — f – (c1 + c2) augmentations are required
- 2 * c2 < f — we need to augment all of the edges to have a capacity of at least ceil(f / 2) (using the same BIT approach as for non-cycle edges), and then if f is odd, we can subtract one edge augmentation (since one edge may have a capacity of floor(f / 2) instead)
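The cycle-edge case analysis can be sketched directly (brute force over a plain sorted list here, where the editorial’s solution answers the same counts and sums with the BITs):

```python
def cycle_augmentations(caps, f):
    """Minimum augmentations so the two smallest cycle capacities sum to >= f.
    caps: capacities of the cycle edges (at least two of them)."""
    caps = sorted(caps)
    c1, c2 = caps[0], caps[1]
    if f <= c1 + c2:
        return 0
    if f <= 2 * c2:
        return f - (c1 + c2)  # only the smallest edge needs raising
    target = (f + 1) // 2     # ceil(f / 2)
    cost = sum(max(0, target - c) for c in caps)
    if f % 2 == 1:
        cost -= 1             # one edge may stay at floor(f / 2)
    return cost
```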

The total time complexity of this algorithm is O(N * log(N) + M * log(Z) * log(N)).

### Rainbow String

As a first step, we should build a suffix array from the string in order to get a lexicographical ordering of its suffixes. For each index i (1 ≤ i ≤ N), let O[i] be the number of suffixes which are lexicographically smaller than the suffix starting at index i. These values can be computed in O(N) time, though an O(N log^2 N) implementation is simpler and still fast enough.

To answer the ith query, we’ll need to consider the set of valid starting indices for length L_i (indices j such that j + L_i – 1 ≤ N, and the substring j..(j + L_i – 1) contains at least one green letter and no red letters), and find the one with the K_ith-smallest O value (this is the case because, in general, the length-l prefix of the kth-smallest suffix with length no less than l is equal to the kth-smallest substring of length l).

For each index j, let G[j] and R[j] respectively be the indices of the first green and red letters at index j or later (or N + 1 if there are none). These values can be computed in O(N) time by iterating over the string in reverse. If R[j] ≤ G[j], then index j is not valid for any query lengths. Otherwise, we can observe that it’s valid for the interval of query lengths (G[j] – j + 1)..(R[j] – j).
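Computing these validity intervals can be sketched as follows (0-based indices here; the encoding of colors as a string with 'G' for green, 'R' for red, and anything else otherwise is an assumption):

```python
def valid_length_intervals(colors):
    """For each 0-based start index j, the inclusive (lo, hi) range of
    substring lengths containing at least one green and no red letter,
    or None if no length is valid at j."""
    n = len(colors)
    g = r = n  # n stands in for "no such letter" (N + 1 in 1-based terms)
    out = [None] * n
    for j in range(n - 1, -1, -1):
        if colors[j] == 'G':
            g = j
        elif colors[j] == 'R':
            r = j
        if g < r:  # nearest green strictly precedes nearest red
            out[j] = (g - j + 1, r - j)
    return out
```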

This suggests the following approach: we can perform a line sweep over the possible query lengths from 1 up to N, while maintaining the set of currently valid starting indices in some data structure. In particular, we need to be able to efficiently insert and remove the indices’ O values, and find the kth-smallest of them, which are operations that a Binary Indexed Tree can support in O(log N) time each. So, for each query length l, we should update the BIT by inserting the O values of indices which start being valid at length l, removing those of indices which stop being valid at length l, and then processing each query i for which L_i = l. This can be done by determining the K_ith-smallest value currently in the BIT, looking up the index j which has that O value, and then for each letter from A to Z, adding its number of occurrences within the range of indices j..(j + l – 1) onto its total answer (which can be done in constant time assuming that we’ve precomputed a prefix sum array of its occurrences in O(N) time).
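The insert / remove / kth-smallest data structure can be sketched like this (a standard Fenwick tree with binary lifting for the order-statistic query; names are illustrative):

```python
class OrderStatBIT:
    """Fenwick tree over values 1..n supporting multiset insert, remove,
    and k-th smallest query, each in O(log n)."""
    def __init__(self, n):
        self.n = n
        self.t = [0] * (n + 1)

    def add(self, v, delta=1):
        while v <= self.n:
            self.t[v] += delta
            v += v & -v

    def remove(self, v):
        self.add(v, -1)

    def kth_smallest(self, k):
        # binary lifting: descend the implicit prefix-sum tree
        pos = 0
        for j in range(self.n.bit_length(), -1, -1):
            nxt = pos + (1 << j)
            if nxt <= self.n and self.t[nxt] < k:
                pos = nxt
                k -= self.t[nxt]
        return pos + 1
```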

The total time complexity of this algorithm is O((N + Q) log N).
