Editorial: String Cutting
Given a string $s$ of length $n$ and integers $l, k$, cut $s$ into $k$ or more contiguous parts, each of length at least $l$. Sort the parts lexicographically and look at the $k$-th smallest. Maximize it.
- If $l \cdot k > n$, we cannot even fit $k$ parts of length $\ge l$: output NO.
- If $k = 1$, the $1$st smallest is the smallest part. In any partition with at least one cut, the first part in original order is a proper prefix of $s$, and a proper prefix of $s$ is strictly smaller than $s$; so making any cut at all leaves the smallest part strictly smaller than $s$. Output $s$ as a single part.
Assume from now on $l \cdot k \le n$ and $k \ge 2$.
Every part is a contiguous substring of $s$, so the $k$-th sorted part of the optimal partition is $s[i..j]$ for some $0 \le i \le j \le n-1$.
In any partition where $s[i..j]$ is a part, the other parts are split cleanly into left parts (lying entirely in $s[0..i-1]$) and right parts (lying entirely in $s[j+1..n-1]$). Let there be $m$ left parts and $r$ right parts. The total part count is $m + 1 + r \ge k$.
Length constraints (each part has length $\ge l$):
- Left parts cover $s[0..i-1]$ of length $i$, so $m\cdot l \le i$. This means: either $i = 0$ (no left parts), or $i \ge l$ with $m \in [1,\lfloor i/l\rfloor]$. $i \in [1, l-1]$ is impossible.
- Right parts cover $s[j+1..n-1]$ of length $n-1-j$, so symmetrically: either $n-1-j = 0$, or $n-1-j \ge l$ with $r \in [1, \lfloor (n-1-j)/l\rfloor]$.
You would think that to evaluate whether $s[i..j]$ is our answer, we have to decide:
Is there a valid partition in which $s[i..j]$ is exactly the $k$-th sorted part?
That question is genuinely hard to answer directly — we don’t want to touch it. So let’s ask an easier, weaker question instead:
Is there a valid partition (with $\ge k$ parts, each of length $\ge l$) in which $s[i..j]$ appears as a part at all, not necessarily as the $k$-th one?
If the answer is “yes”, we call the pair $(i, j)$ feasible. Note that this completely ignores sort order and ranks — it’s a pure counting question about whether we can place $\ge k-1$ other parts around $s[i..j]$.
From the length constraints above, $(i, j)$ is feasible if and only if all of the following hold:
1. Candidate long enough: $j - i + 1 \ge l$.
2. Prefix partitionable: $i = 0$ or $i \ge l$.
3. Suffix partitionable: $n-1-j = 0$ or $n-1-j \ge l$.
4. Enough other parts: $$ \left\lfloor\tfrac{i}{l}\right\rfloor \;+\; \left\lfloor\tfrac{n-1-j}{l}\right\rfloor \;\ge\; k-1. $$
Condition (4) just says: the prefix can carry up to $\lfloor i/l\rfloor$ length-$\ge l$ parts, the suffix can carry up to $\lfloor(n-1-j)/l\rfloor$ such parts, and together they should be able to carry the $k-1$ parts we need besides $s[i..j]$.
Nothing more; nothing about sort order.
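The four conditions translate directly into a predicate on indices. A sketch (the function name and signature are ours, not the editorial's):

```python
def feasible(i: int, j: int, n: int, l: int, k: int) -> bool:
    """Can s[i..j] appear as a part in some valid partition
    (>= k parts, each of length >= l)?  Depends only on the indices."""
    pre, suf = i, n - 1 - j
    if j - i + 1 < l:                      # condition 1: candidate long enough
        return False
    if pre != 0 and pre < l:               # condition 2: prefix partitionable
        return False
    if suf != 0 and suf < l:               # condition 3: suffix partitionable
        return False
    return pre // l + suf // l >= k - 1    # condition 4: enough other parts
```

For example, with $n=5$, $l=2$, $k=2$: the pair $(0, 2)$ is feasible (one part on the right covers the length-2 suffix), while $(1, 4)$ is not (the length-1 prefix cannot hold any part).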
Answer $=$ lex-maximum $s[i..j]$ over feasible $(i, j)$.
This is the entire heart of the O(n³) solution. It says: to find the optimal $k$-th part, forget about ranks — just ask “where can this substring possibly fit as some part?” and take the lex-largest such substring.
Let’s prove both directions carefully, because this is where your intuition was probably rebelling.
Suppose the optimal partition has $k$-th sorted part $P^* = s[i^*..j^*]$. $P^*$ is literally a part of this partition. So the partition witnesses that $(i^*, j^*)$ is feasible (the other parts are the $\ge k - 1$ witnesses). Therefore
$$ \text{answer} \;=\; P^* \;\le\; \max\{s[i..j] : (i, j)\text{ feasible}\}. $$

No sort-order reasoning is needed for this direction.
This is the step that might feel like a leap. The question: if $(i^*, j^*)$ is feasible (i.e. $s[i^*..j^*]$ can appear as some part in some partition), why can we always produce a partition where the $k$-th sorted part is at least $s[i^*..j^*]$?
The trick is that we get to choose how many left / right parts to use. We’ll build a partition with exactly $k$ parts total. In a $k$-part partition, the $k$-th sorted part is automatically the maximum. And since our candidate is one of the parts, the maximum is automatically $\ge$ our candidate.
Construction. Let $(i^*, j^*)$ be any pair attaining the lex-max feasible candidate (if there are ties, any of them works).
- Put $m$ left parts covering $s[0..i^*-1]$, each of length $\ge l$.
- Put the candidate $s[i^*..j^*]$ in the middle.
- Put $r$ right parts covering $s[j^*+1..n-1]$, each of length $\ge l$.
- Pick $m, r \ge 0$ with $m + r = k-1$, subject to:
- $m \in [\alpha, \lfloor i^*/l\rfloor]$ where $\alpha = \mathbb{1}[i^* > 0]$ (need $\ge 1$ left part if prefix is non-empty);
- $r \in [\beta, \lfloor (n-1-j^*)/l\rfloor]$ where $\beta = \mathbb{1}[j^* < n-1]$ (similarly on the right).
Why such $m, r$ exist. Feasibility gives $\lfloor i^*/l\rfloor + \lfloor(n-1-j^*)/l\rfloor \ge k-1$, so the largest achievable value of $m + r$ is $\ge k - 1$; the smallest allowed value is $\alpha + \beta$. Every integer in between is achievable (raise $m$ or $r$ one step at a time), so as long as $\alpha + \beta \le k-1$, a choice with $m + r = k-1$ exists.
- For $k \ge 3$: $\alpha + \beta \le 2 \le k-1$. ✓
- For $k = 2$: we need $\alpha + \beta \le 1$, which can fail only when $i^* > 0$ and $j^* < n-1$ simultaneously. But in that case $i^* \ge l$ (prefix partitionability), so $(i^*, n-1)$ is also feasible: the candidate only got longer, the suffix becomes empty, and condition (4) reads $\lfloor i^*/l\rfloor + 0 \ge 1$, which holds. Moreover $s[i^*..n-1]$ has $s[i^*..j^*]$ as a proper prefix and is therefore lex-strictly larger, contradicting the maximality of $(i^*, j^*)$. So the lex-max feasible pair can always be taken with $j^* = n-1$ or $i^* = 0$, i.e. $\alpha + \beta \le 1$. (Equivalently: extend $j^*$ to $\mathrm{rp}(i^*)$ as in the O(n²) section; that extension is lex $\ge s[i^*..j^*]$ and supports an exactly-$k$-part partition.)
Why the partition is valid. It covers $s$ exactly, has exactly $k$ parts, each of length $\ge l$.
Why its $k$-th sorted part $\ge s[i^*..j^*]$. Among exactly $k$ parts, sorted order puts the largest at position $k$. Our candidate is one of the $k$ parts, so the max $\ge$ candidate. Therefore the $k$-th sorted part of this partition $\ge s[i^*..j^*]$.
Hence answer $\ge s[i^*..j^*] = \max\text{-feasible}$.
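The existence argument above is constructive; here is a sketch in code (the function name and the greedy choice of $m$ are ours; it assumes $(i, j)$ is feasible with $\alpha + \beta \le k - 1$):

```python
def build_partition(s: str, i: int, j: int, l: int, k: int) -> list[str]:
    """Given a feasible (i, j) with alpha + beta <= k - 1, build a partition
    of s into exactly k parts, each of length >= l, one of them s[i..j]."""
    n = len(s)
    alpha = 1 if i > 0 else 0          # need >= 1 left part iff prefix non-empty
    beta = 1 if j < n - 1 else 0       # need >= 1 right part iff suffix non-empty
    m_max = i // l
    r_max = (n - 1 - j) // l
    # take m as large as possible while leaving r >= beta; any valid split works
    m = min(m_max, k - 1 - beta)
    r = k - 1 - m
    assert alpha <= m <= m_max and beta <= r <= r_max
    # m left parts: first m-1 of length exactly l, the last takes the remainder
    parts = [s[t * l:(t + 1) * l] for t in range(m - 1)]
    if m > 0:
        parts.append(s[(m - 1) * l:i])
    parts.append(s[i:j + 1])           # the candidate itself
    # r right parts, symmetric to the left side
    start = j + 1
    for t in range(r - 1):
        parts.append(s[start:start + l])
        start += l
    if r > 0:
        parts.append(s[start:])
    return parts
```

Each "remainder" part has length $\ge l$ because $m \le \lfloor i/l\rfloor$ (and symmetrically $r \le \lfloor (n-1-j)/l\rfloor$).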
Both directions together give answer $=$ max feasible $s[i..j]$. This is the entire logical foundation of the O(n³) algorithm.
There is nothing fancy now — we just enumerate every $(i, j)$, test feasibility, and track the lex-max among feasible candidates.
```python
best_l, best_r = -1, -1                  # "best" = empty, i.e. worse than any real candidate
for i in range(n):
    for j in range(i + l - 1, n):        # guarantees j - i + 1 >= l (condition 1)
        pre = i
        suf = n - 1 - j
        # conditions 2 and 3
        if pre != 0 and pre < l:
            continue
        if suf != 0 and suf < l:
            continue
        # condition 4
        if pre // l + suf // l < k - 1:
            continue
        # (i, j) is feasible; see if it's the new lex-max
        if best_l == -1 or s[i:j + 1] > s[best_l:best_r + 1]:
            best_l, best_r = i, j
print(s[best_l:best_r + 1])
```
This is exactly the part that looked magical before. It isn’t.
The main claim says: the answer is the lex-maximum feasible $s[i..j]$. To compute a maximum over a set, you just iterate through the set and remember the largest element seen so far; that is exactly what the compare-and-update at the bottom of the inner loop does. Nothing about sort orders or the $k$-th position happens inside the loop; all of that reasoning was absorbed once and for all into the main claim.
- Outer pairs $(i, j)$: $O(n^2)$.
- Each lex comparison of two substrings: naive scan, $O(n)$.
- Total: $O(n^3)$ per test case.
That’s enough to understand the problem and pass tests with small $n$ (a few hundred). The rest of the editorial shortens this, first to $O(n^2)$ by avoiding redundant candidates, then to $O(n \log n)$ by speeding up the comparison.
For a fixed $i$, among all feasible $(i, j)$, the one with the largest $j$ gives the lex-largest candidate (every smaller feasible $j$ yields a strict prefix, which is lex $\le$). Solving condition (4) for the largest feasible $j$:
$$ \mathrm{rp}(i) \;=\; n - 1 - \max\bigl(k - 1 - \lfloor i/l\rfloor,\; 0\bigr)\cdot l. $$

So we only need to try $s[i..\mathrm{rp}(i)]$ for each valid $i$ (either $i = 0$ or $i \in [l, n-l]$; discard any $i$ with $\mathrm{rp}(i) - i + 1 < l$, since then no feasible $j$ exists for that $i$ at all), not all $(i, j)$ pairs. This drops the outer enumeration from $O(n^2)$ to $O(n)$, giving $O(n^2)$ overall (the inner comparison is still $O(n)$).
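The $O(n^2)$ routine then becomes one loop over starting positions. A sketch (the function name is ours; it assumes $l \cdot k \le n$ and $k \ge 2$):

```python
def solve(s: str, l: int, k: int) -> str:
    """O(n^2): for each valid start i, only the longest feasible candidate
    s[i..rp(i)] needs to be compared."""
    n = len(s)
    best = ""                                   # empty: lex-smaller than any candidate
    starts = [0] + list(range(l, n - l + 1))    # i = 0 or i in [l, n-l]
    for i in starts:
        rp = n - 1 - max(k - 1 - i // l, 0) * l
        if rp - i + 1 < l:                      # no feasible j for this i
            continue
        if s[i:rp + 1] > best:                  # O(n) comparison
            best = s[i:rp + 1]
    return best
```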
Preprocessing a suffix array of $s$, its LCP array, and a sparse table over the LCP array lets us lex-compare any two substrings of $s$ in $O(1)$ after $O(n \log n)$ setup. Plug into the $O(n^2)$ loop above: the comparison becomes $O(1)$ and total time is $O(n \log n)$. This is what the primary model solution does. You don’t need any of it to understand the algorithm — the $O(n^3)$ brute force already contains all the combinatorial content.
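For completeness, the comparison machinery can be sketched as follows. This is our illustration, not the model solution's actual code; for brevity it uses an $O(n \log^2 n)$ prefix-doubling suffix-array construction (an $O(n \log n)$ or $O(n)$ construction slots in the same way), a Kasai-style LCP array, and a sparse table for $O(1)$ range minima:

```python
def build_suffix_array(s: str) -> list[int]:
    """Suffix array by prefix doubling; O(n log^2 n) with Python's sort."""
    n = len(s)
    rank = [ord(c) for c in s]
    sa = list(range(n))
    step = 1
    while True:
        key = lambda p: (rank[p], rank[p + step] if p + step < n else -1)
        sa.sort(key=key)
        new_rank = [0] * n
        for t in range(1, n):
            new_rank[sa[t]] = new_rank[sa[t - 1]] + (key(sa[t]) != key(sa[t - 1]))
        rank = new_rank
        if rank[sa[-1]] == n - 1:      # all ranks distinct: done
            break
        step *= 2
    return sa

def build_lcp(s: str, sa: list[int], rank: list[int]) -> list[int]:
    """Kasai-style LCP array: lcp[t] = LCP(suffix sa[t], suffix sa[t+1]); O(n)."""
    n = len(s)
    lcp = [0] * max(n - 1, 1)
    h = 0
    for p in range(n):                 # positions in text order, h drops by <= 1
        if rank[p] + 1 < n:
            q = sa[rank[p] + 1]
            while p + h < n and q + h < n and s[p + h] == s[q + h]:
                h += 1
            lcp[rank[p]] = h
            h = max(h - 1, 0)
        else:
            h = 0
    return lcp

class SubstringComparer:
    """O(1) lex comparison of substrings after O(n log n) preprocessing."""

    def __init__(self, s: str):
        self.n = len(s)
        self.sa = build_suffix_array(s)
        self.rank = [0] * self.n       # position -> index in suffix array
        for t, p in enumerate(self.sa):
            self.rank[p] = t
        lcp = build_lcp(s, self.sa, self.rank)
        self.table = [lcp[:]]          # table[e][t] = min(lcp[t .. t + 2^e - 1])
        width = 1
        while 2 * width <= len(lcp):
            prev = self.table[-1]
            self.table.append([min(prev[t], prev[t + width])
                               for t in range(len(prev) - width)])
            width *= 2

    def lcp(self, a: int, b: int) -> int:
        """Longest common prefix of the suffixes starting at a and b."""
        if a == b:
            return self.n - a
        x, y = sorted((self.rank[a], self.rank[b]))
        e = (y - x).bit_length() - 1
        return min(self.table[e][x], self.table[e][y - (1 << e)])

    def compare(self, i1: int, j1: int, i2: int, j2: int) -> int:
        """Compare s[i1..j1] with s[i2..j2]: returns -1, 0, or 1."""
        len1, len2 = j1 - i1 + 1, j2 - i2 + 1
        h = self.lcp(i1, i2)
        if h >= min(len1, len2):       # one substring is a prefix of the other
            return (len1 > len2) - (len1 < len2)
        return 1 if self.rank[i1] > self.rank[i2] else -1
```

Replacing the naive substring comparison in the $O(n^2)$ loop with `compare` gives the $O(n \log n)$ total.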