Advent of Code 2023 Days 12 and 17 Highlights
After some family holiday celebrations, I’m catching up on the remaining parts of Advent of Code 2023. In this blog post I’ll give you a short walkthrough of my approaches for days 12 and 17.
Day 12 in Elixir: Recursion and Memoization
I struggled a lot with the day 12 task. To a large extent, this was due to my choice of technology - I was not sure how to represent the task efficiently in Elixir, since one thing it sucks at is indexes and arrays. Built-in lists are inefficient for index access, and though there are options such as Nx, they are usually too specialized and not a good match for this particular task. So I spent a lot of time trying various split/chunk approaches, but after solving part 1 I ultimately gave up on the task until yesterday. Another place where I messed up was trying to actually construct all the possibilities - that clearly became a bad idea once I looked at the numbers in part 2. Inspiration came from the day 19 task - there, the only thing you need to do is count things, and that immediately reminded me that in day 12 I also only need to figure out the counts. After that, the solution was comparatively straightforward, barring some debugging.
Instead of trying to parse things in chunks, I just keep the states of the springs as a binary.
def parse_input(input) do
input
|> String.split("\n", trim: true)
|> Enum.map(fn line ->
[states, damaged] = String.split(line)
damaged =
damaged
|> String.split(",")
|> Enum.map(&String.to_integer/1)
{states, damaged}
end)
end
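For example, the first line of the puzzle’s example input parses into a tuple of the raw states string and the list of damaged group sizes:
iex> parse_input("???.### 1,1,3")
[{"???.###", [1, 1, 3]}]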
The crux of the solution is using recursion to examine both the # and . possibilities when encountering a ?, plus base cases where we return 1 if the given state is valid and 0 otherwise. Then, the answer is just the sum of the returns of all sub-branches of the recursion. So, I was able to get the first part solved again with this code:
def p1(input) do
input
|> Enum.map(fn {states, damaged} -> solve(states, damaged) end)
|> Enum.sum()
end
def solve(states, damaged), do: solve(states, damaged, 0)
def solve("", [], _), do: 1
def solve("", [0], _), do: 1
def solve(".", [], _), do: 1
def solve(".", [0], _), do: 1
def solve("?", [], _), do: 1
def solve("?", [0], _), do: 1
def solve("?", [1], _), do: 1
def solve("#", [], _), do: 0
def solve("#", [1], _), do: 1
def solve("#", damaged, _) when length(damaged) > 0, do: 0
def solve(<<"?", rest::binary>>, [springs | damaged], continuing) do
if continuing > 0 do
if springs > 0 do
solve("#" <> rest, [springs | damaged], continuing)
else
solve("#" <> rest, damaged, continuing) + solve("." <> rest, [continuing | damaged], 0)
end
else
solve("#" <> rest, [springs | damaged], 0) + solve("." <> rest, [springs | damaged], 0)
end
end
def solve(<<"?", rest::binary>>, [], continuing) do
if continuing > 0 do
solve("." <> rest, [continuing], 0)
else
solve("." <> rest, [], 0)
end
end
def solve(<<".", rest::binary>>, damaged, continuing) do
if continuing > 0 do
solve(rest, [continuing | damaged])
else
solve(rest, damaged, 0)
end
end
def solve(
<<a::binary-size(1), b::binary-size(1), rest::binary>>,
[springs | damaged],
continuing
) do
case {a, b, springs} do
{"#", ".", 1} ->
solve(b <> rest, damaged, 0)
{"#", ".", _} ->
0
{"#", "#", springs} when springs < 2 ->
0
{"#", "#", springs} when springs >= 2 ->
solve(b <> rest, [springs - 1 | damaged], continuing + 1)
{"#", "?", 1} ->
solve("." <> rest, damaged, 0)
{"#", "?", springs} when springs > 1 ->
solve(b <> rest, [springs - 1 | damaged], continuing + 1)
{"#", "?", _} ->
0
{".", b, springs} ->
if continuing > 0 do
solve(b <> rest, [continuing | springs], 0)
else
solve(b <> rest, springs, 0)
end
end
end
def solve(<<"#", _rest::binary>>, [], _), do: 0
def solve("", _, _), do: 0
Those pattern matches were painfully constructed by going through the test examples and then debugging the results of the run on the actual input :) But the basic idea is easy to see: if we arrive at any state we know cannot occur, like still having damaged spring groups in the numbers list while running out of the states string, we return 0. Getting a string of only one # and a list of [1] is one possibility, so we return 1. We examine a maximum of 2 characters at a time. We also maintain a continuing variable, an integer that tells us how many # have been consumed in the current group. That allows us to backtrack on group ends, for example:
def solve(<<"?", rest::binary>>, [springs | damaged], continuing) do
if continuing > 0 do
if springs > 0 do
solve("#" <> rest, [springs | damaged], continuing)
else
solve("#" <> rest, damaged, continuing) + solve("." <> rest, [continuing | damaged], 0)
end
else
solve("#" <> rest, [springs | damaged], 0) + solve("." <> rest, [springs | damaged], 0)
end
end
Here, we essentially say:
- if we are running through a group of # and encounter a ?, and we have exhausted the current damaged spring group, examine both the case of replacing the ? with # and continuing the group (solve("#" <> rest, damaged, continuing)) and the case of breaking the group and putting the currently running group back on the damaged list (solve("." <> rest, [continuing | damaged], 0)) - essentially, a poor man’s backtracking.
- if we didn’t exhaust the current group, continue trying to enlarge it: solve("#" <> rest, [springs | damaged], continuing).
- if we are not running through a group, just try replacing the ? with both # and .: solve("#" <> rest, [springs | damaged], 0) + solve("." <> rest, [springs | damaged], 0).
This strategy gives us the correct answer for part 1, but it’s too slow for part 2. As usual with recursion, the solution is to memoize it :) The fastest way to modify the code was to add a helper (solve_with_memo) that maintains a memoization map, and to call it when continuing == 0 (“clean” states). I also cleaned up some no-longer-needed pattern matches.
def solve(states, damaged, memo \\ %{}), do: solve(states, damaged, 0, memo) |> elem(0)
def solve("", [], _, memo), do: {1, memo}
def solve("", _, _, memo), do: {0, memo}
def solve("#", [1], _, memo), do: {1, memo}
def solve("#", damaged, _, memo) when length(damaged) > 0, do: {0, memo}
def solve(<<"?", rest::binary>>, [springs | damaged], continuing, memo) do
if continuing > 0 do
if springs > 0 do
solve_with_memo("#" <> rest, [springs | damaged], continuing, memo)
else
{dot_ans, _} = solve_with_memo("." <> rest, [continuing | damaged], 0, memo)
{hash_ans, _} = solve_with_memo("#" <> rest, damaged, continuing, memo)
{hash_ans + dot_ans, memo}
end
else
{hash_ans, memo} = solve_with_memo("#" <> rest, [springs | damaged], 0, memo)
{dot_ans, memo} = solve_with_memo("." <> rest, [springs | damaged], 0, memo)
{hash_ans + dot_ans, memo}
end
end
def solve(<<"?", rest::binary>>, [], _continuing, memo) do
solve_with_memo("." <> rest, [], 0, memo)
end
def solve(<<".", rest::binary>>, damaged, _continuing, memo) do
solve_with_memo(rest, damaged, 0, memo)
end
def solve(
<<a::binary-size(1), b::binary-size(1), rest::binary>>,
[springs | damaged],
continuing,
memo
) do
case {a, b, springs} do
{"#", ".", 1} ->
solve_with_memo(b <> rest, damaged, 0, memo)
{"#", ".", _} ->
{0, memo}
{"#", "#", springs} when springs < 2 ->
{0, memo}
{"#", "#", springs} when springs >= 2 ->
solve(b <> rest, [springs - 1 | damaged], continuing + 1, memo)
{"#", "?", 1} ->
solve_with_memo("." <> rest, damaged, 0, memo)
{"#", "?", springs} when springs > 1 ->
solve(b <> rest, [springs - 1 | damaged], continuing + 1, memo)
{"#", "?", _} ->
{0, memo}
{".", b, springs} ->
if continuing > 0 do
solve_with_memo(b <> rest, [continuing | springs], 0, memo)
else
solve_with_memo(b <> rest, springs, 0, memo)
end
end
end
def solve(<<"#", _rest::binary>>, [], _, memo), do: {0, memo}
def solve_with_memo(states, damaged, continuing, memo) do
case Map.get(memo, {states, damaged}) do
nil ->
{answer, memo} = solve(states, damaged, continuing, memo)
memo = if continuing == 0, do: Map.put(memo, {states, damaged}, answer), else: memo
{answer, memo}
answer ->
{answer, memo}
end
end
And, to add a final twist, we can also run part 2 in parallel over the input:
def p2(input) do
input
|> Task.async_stream(
fn {states, damaged} ->
states = List.duplicate(states, 5) |> Enum.intersperse("?") |> Enum.join()
damaged = List.duplicate(damaged, 5) |> List.flatten()
solve(states, damaged)
end,
timeout: :infinity
)
|> Enum.map(&elem(&1, 1))
|> Enum.sum()
end
The tests solving both parts on the test and actual inputs run in ~0.1 seconds :)
After looking at the Reddit thread with solutions, it seems that an even better approach is to eagerly match whole groups of # (essentially, doing a look-ahead), but the basic idea of the solution is the same in most cases. Here is a good thread explaining a very neat Python solution and giving a basic intro to recursion and memoization.
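To illustrate the idea, here is a minimal sketch of that look-ahead style in Elixir. This is not the code I used above - the count/2 and consume_group/2 names are mine, and it would still need memoization on the {states, groups} pairs to be fast enough for part 2:
# Count arrangements by consuming whole damaged groups eagerly (illustrative sketch).
def count("", []), do: 1
def count("", _groups), do: 0
# no groups left: valid only if no further # remains
def count(states, []), do: if(String.contains?(states, "#"), do: 0, else: 1)
def count(<<".", rest::binary>>, groups), do: count(rest, groups)
# a # must start the next damaged group right here
def count(<<"#", _::binary>> = states, groups), do: consume_group(states, groups)
# a ? is either a working spring, or the start of the next damaged group
def count(<<"?", rest::binary>> = states, groups) do
  count(rest, groups) + consume_group(states, groups)
end

defp consume_group(states, [group | groups]) do
  len = String.length(states)

  cond do
    # not enough characters left to fit the group
    len < group -> 0
    # the group itself may not contain a working spring
    states |> String.slice(0, group) |> String.contains?(".") -> 0
    # the group ends exactly at the end of the string
    len == group -> count("", groups)
    # the character right after the group must not be another #
    String.at(states, group) == "#" -> 0
    # drop the group plus the separator character and keep counting
    true -> count(String.slice(states, group + 1, len), groups)
  end
end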
Day 17 in Rust: Dijkstra with a Twist
Day 17 immediately felt like the Dijkstra’s algorithm day to me :) I chose Rust for this reason too - it has a built-in BinaryHeap data structure. Let’s start with parsing and the data model:
use Direction::*;
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub enum Direction {
Left,
Right,
Up,
Down,
}
static ALL_DIRECTIONS: [Direction; 4] = [Left, Right, Up, Down];
impl Direction {
pub fn opposite(&self) -> Direction {
match self {
Left => Right,
Right => Left,
Up => Down,
Down => Up,
}
}
pub fn turns(&self) -> [Direction; 2] {
if *self == Left || *self == Right {
[Up, Down]
} else {
[Left, Right]
}
}
}
impl ToString for Direction {
fn to_string(&self) -> String {
let ch = match self {
Left => "<",
Right => ">",
Up => "^",
Down => "v",
};
ch.to_string()
}
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub struct Pos {
row: usize,
col: usize,
}
impl Pos {
pub fn walk(&self, direction: Direction) -> Option<Pos> {
match direction {
Left => {
if self.col >= 1 {
Some(Pos { row: self.row, col: self.col - 1 })
} else {
None
}
}
Right => Some(Pos { row: self.row, col: self.col + 1 }),
Up => {
if self.row >= 1 {
Some(Pos { row: self.row - 1, col: self.col })
} else {
None
}
}
Down => Some(Pos { row: self.row + 1, col: self.col }),
}
}
}
#[derive(Debug, Clone)]
pub struct Map {
rows: Vec<Vec<usize>>,
}
impl Map {
pub fn max_row(&self) -> usize {
self.rows.len()
}
pub fn max_col(&self) -> usize {
self.rows[0].len()
}
pub fn heat_loss(&self, pos: &Pos) -> usize {
self.rows[pos.row][pos.col]
}
}
pub fn parse_input(input: &str) -> Map {
let mut rows = vec![];
for row in input.trim().split("\n") {
let row = row.chars().map(|ch| ch.to_digit(10).unwrap() as usize).collect();
rows.push(row);
}
Map { rows }
}
Let’s also model the allowed moves:
impl Map {
pub fn next_moves(
&self,
pos: &Pos,
direction: &Direction,
steps: usize
) -> Vec<(Pos, Direction)> {
let allowed_directions = p1_allowed_directions(direction, steps);
allowed_directions
.into_iter()
.filter_map(|d| {
pos.walk(d).and_then(|new_pos| {
if new_pos.row < self.max_row() && new_pos.col < self.max_col() {
Some((new_pos, d))
} else {
None
}
})
})
.collect()
}
}
fn p1_allowed_directions(direction: &Direction, steps: usize) -> Vec<Direction> {
if steps == 3 {
direction.turns().to_vec()
} else {
let turns = direction.turns();
vec![*direction, turns[0], turns[1]]
}
}
Overall, it was a comparatively straightforward implementation, but there’s a twist - we cannot use plain coordinates to represent the “states” between which we move in Dijkstra’s algorithm. Because the crucible can only go straight for a limited number of blocks, and there are essentially rules that determine which blocks you can move to based on your previous path, we need to account not only for the block’s position on the grid, but also for the direction the crucible is moving in and the number of steps it has already moved in that direction. And, of course, what we want to minimize is not plain distance, it’s the cumulative heat loss. So, the state for which we need to find the minimum distance looks like this:
#[derive(Debug, Clone, PartialEq, Eq)]
struct State {
pos: Pos,
heat_loss: usize,
direction: Direction,
steps: usize,
}
We also need to be aware that the built-in binary heap is a max-heap, so it will put the maximum element on top. We need the minimum element, so we use the handy .reverse() method on our Ordering when defining PartialOrd and Ord for our State struct:
impl PartialOrd for State {
fn partial_cmp(&self, other: &Self) -> Option<std::cmp::Ordering> {
self.heat_loss.partial_cmp(&other.heat_loss).map(|ordering| ordering.reverse())
}
}
impl Ord for State {
fn cmp(&self, other: &Self) -> std::cmp::Ordering {
self.partial_cmp(other).unwrap()
}
}
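As an aside, another common way to get min-heap behaviour out of BinaryHeap (not what the solution here does) is to wrap the priority in std::cmp::Reverse instead of writing a custom Ord; a tiny self-contained sketch:
use std::cmp::Reverse;
use std::collections::BinaryHeap;

fn min_heap_demo() {
    // Reverse flips the ordering, so the default max-heap pops the smallest value first.
    let mut queue = BinaryHeap::new();
    queue.push(Reverse(3usize));
    queue.push(Reverse(1usize));
    assert_eq!(queue.pop(), Some(Reverse(1)));
}
In this solution it felt simpler to keep the ordering on the State struct itself, since the whole state goes onto the heap anyway.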
And now we can implement our Dijkstra’s algorithm:
pub fn dijkstra(map: &Map) -> usize {
let start = Pos { row: 0, col: 0 };
let target = Pos { row: map.max_row() - 1, col: map.max_col() - 1 };
let mut visited: HashSet<(Pos, Direction, usize)> = HashSet::new();
let mut queue = BinaryHeap::new();
queue.push(State { pos: start, heat_loss: 0, direction: Right, steps: 0 });
queue.push(State { pos: start, heat_loss: 0, direction: Down, steps: 0 });
let mut heat_losses = HashMap::new();
for direction in ALL_DIRECTIONS {
heat_losses.insert((start, direction, 0), 0);
}
while let Some(State { pos, heat_loss, direction, steps }) = queue.pop() {
for (neighbour, new_direction) in map.next_moves(&pos, &direction, steps + 1) {
let steps = if new_direction == direction { steps + 1 } else { 0 };
if !visited.contains(&(neighbour, new_direction, steps)) {
let new_heat_loss = heat_loss + map.heat_loss(&neighbour);
let new_state = State {
pos: neighbour,
heat_loss: new_heat_loss,
direction: new_direction,
steps
};
match heat_losses.entry((neighbour, new_direction, steps)) {
Entry::Occupied(mut entry) => {
if new_heat_loss < *entry.get() {
entry.insert(new_heat_loss);
queue.push(new_state);
}
}
Entry::Vacant(entry) => {
entry.insert(new_heat_loss);
queue.push(new_state);
}
};
}
}
visited.insert((pos, direction, steps));
}
ALL_DIRECTIONS
.iter()
.flat_map(|direction| {
(0..=3).into_iter().flat_map(|steps| heat_losses.get(&(target, *direction, steps)))
})
.min()
.map(|x| *x)
.unwrap()
}
So:
- We start at the top-left corner and our target is the bottom-right corner. Initially, we don’t mark any states as visited.
- From the top-left corner we can either walk right or down - we put those states in our queue of states to examine. Our heat loss is 0 in the top-left block at the start.
- Since Rust’s BinaryHeap doesn’t allow hash-map-like access, we maintain a separate HashMap of heat losses at different states - we already know that for the top-left corner (the start block, in any direction, with 0 steps walked) it will be 0.
- The main loop of Dijkstra’s algorithm starts now: we pop the lowest-heat-loss state from the queue and look at its neighbours using the Map::next_moves method, not forgetting to update our steps counter appropriately. If a new state has already been visited, we ignore it. If it wasn’t, we calculate the new_heat_loss and new_state. We then use the Entry API for efficient state lookup and update in the heat_losses hash map. If we found a better heat-loss solution (or the first solution) for the given state (a tuple of block coordinates, direction, and walked steps), we update the heat_losses hash map with it and put the current state on the queue.
- After examining all neighbours of a state, we mark that state as visited.
- Once we run out of items in the queue, we look at all possible states in the target block, i.e. all directions and all step counts, and select the minimum one.
And that solves part 1:
pub fn p1(map: &Map) -> usize {
dijkstra(map)
}
To solve part 2, we only need a few minor updates. Let’s add another allowed-directions checker:
fn p2_allowed_directions(direction: &Direction, steps: usize) -> Vec<Direction> {
if steps == 10 {
direction.turns().to_vec()
} else if steps < 4 {
vec![*direction]
} else {
let turns = direction.turns();
vec![*direction, turns[0], turns[1]]
}
}
And an option to use it in the next_moves method:
impl Map {
pub fn next_moves(
&self,
pos: &Pos,
// cont
p2: bool
) -> Vec<(Pos, Direction)> {
let allowed_directions =
if p2 {
p2_allowed_directions(direction, steps)
} else {
p1_allowed_directions(direction, steps)
};
// cont
}
}
And in our Dijkstra’s implementation we just need to accept this argument and pass it on to next_moves, plus make sure that we only look at solutions where we walked at least 4 steps in one direction to arrive at the target:
pub fn dijkstra(map: &Map, p2: bool) -> usize {
let start = Pos { row: 0, col: 0 };
// cont
while let Some(State { pos, heat_loss, direction, steps }) = queue.pop() {
for (neighbour, new_direction) in map.next_moves(&pos, &direction, steps + 1, p2) {
// cont
}
visited.insert((pos, direction, steps));
}
ALL_DIRECTIONS
.iter()
.flat_map(|direction| {
let min_steps = if p2 { 3 } else { 0 };
(min_steps..=10).into_iter().flat_map(|steps|
heat_losses.get(&(target, *direction, steps))
)
})
.min()
.map(|x| *x)
.unwrap()
}
And we’re done:
pub fn p2(map: &Map) -> usize {
dijkstra(map, true)
}
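For completeness, wiring everything together might look roughly like this (the input path is just an assumption, and p1 is assumed to have been updated to call dijkstra(map, false) after the part 2 refactor):
fn main() {
    // Hypothetical entry point; adjust the path to wherever the puzzle input lives.
    let input = std::fs::read_to_string("inputs/day17.txt").expect("puzzle input");
    let map = parse_input(&input);
    println!("part 1: {}", p1(&map));
    println!("part 2: {}", p2(&map));
}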
If you enjoyed this content, you can sponsor me on GitHub to help me produce more videos and educational blog posts.
And if you're looking for consulting services, feel free to contact me.