mmcknett/advent-of-code-2022
Advent of Code 2022 solutions

For this year, I decided to use Advent of Code to do more Rust practice. I decided to get a head start by learning how to do some of the more complex parsing tasks and data structures ahead of time, since it's a big struggle to learn those while also trying to solve the harder AoC puzzles. Those will go in utils, and there's a little module for trying things out in testbed.

I put steps for quickly bootstrapping solutions in setup. I kept a running log of solutions in this readme as well.

Plug for Rust: I really enjoyed using the language for these toy problems! It's much more ergonomic than C++, both in how the type system works and in how often the code is correct once it successfully compiles. The compiler also saved me from introducing incorrect behavior at least once, by stopping me from passing a dependent function a reference to a list I had mutably borrowed. (I've hit that error in production code before; it's very hard to track down!)

Solution Log

Day 25

Day 25 prompt

The last one was a short one. At first I misunderstood the problem (I thought the powers of 5 were alternating their sign), but test cases quickly corrected that mistake. I decided the easiest way for me to get my head around this would be to turn the number into a base-5 number first, then correct that into a "snafu" number. The tricky part there is figuring out how to adjust the more-significant values of a number based on the least significant value adjustments. That wasn't too bad, as you can actually only carry at most 1 out of a less-significant place. So for example, the largest base-5 number less than 1000, 444, gets converted like...

444 --> (4)(4)(4) --> 4(4+1)(-1) --> (4+1)(0)(-1) --> (1)(0)(0)(-1) --> 100-

So breaking it into converting to a list of base 5 digits, then transforming to a snafu number, made it pretty simple.
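The carry-based conversion above can be sketched directly. This is an illustrative version (the function name is mine, not from the repo), assuming a positive input; digits 3 and 4 become `=` and `-` with a carry of 1 into the next place:

```rust
// Decimal -> snafu, via repeated division by 5 with a carry for digits > 2.
fn to_snafu(mut n: i64) -> String {
    let mut digits = Vec::new();
    while n > 0 {
        let rem = n % 5;
        n /= 5;
        if rem <= 2 {
            digits.push(char::from_digit(rem as u32, 10).unwrap());
        } else {
            // 3 -> '=' (worth -2), 4 -> '-' (worth -1); carry 1 upward.
            digits.push(if rem == 3 { '=' } else { '-' });
            n += 1;
        }
    }
    digits.iter().rev().collect()
}

fn main() {
    // 444 in base 5 is 124 decimal; this is the readme's worked example.
    assert_eq!(to_snafu(124), "100-");
    // 1*3125 - 2*625 + 1*125 + 1*25 - 1*5 + 2 = 2022
    assert_eq!(to_snafu(2022), "1=11-2");
}
```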

Day 24

Day 24 prompt

BFS day again! But today's was another slog for me. I pretty quickly realized that the repeating "blizzards" meant I didn't have to simulate the board state forever; I could cut off simulating it at t = width * height. (Technically the LCM of width & height, but for the puzzle input they're coprime.) I could also directly calculate the position of any given blizzard at time t.
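Calculating a blizzard's position at time t is just a wrapped add. A sketch (my own function, not the repo's), working in interior coordinates with the walls stripped, so x is in 0..width and dx is -1, 0, or +1 for the blizzard's direction:

```rust
// rem_euclid keeps the wrapped coordinate non-negative even when a
// left-moving blizzard goes past 0.
fn blizzard_x(start_x: i64, dx: i64, t: i64, width: i64) -> i64 {
    (start_x + dx * t).rem_euclid(width)
}

fn main() {
    // A right-moving blizzard starting at x = 3 on a width-5 interior:
    assert_eq!(blizzard_x(3, 1, 0, 5), 3);
    assert_eq!(blizzard_x(3, 1, 2, 5), 0); // wrapped past the right wall
    // A left-moving blizzard wraps around to the right side:
    assert_eq!(blizzard_x(1, -1, 3, 5), 3);
}
```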

The easiest way for me to think about finding the shortest path through the grid was to calculate all width * height boards and do a breadth-first search through the volume, layer by layer. For that, I rendered all the blizzards to a grid as a "wall" if there was a blizzard present at that time and "open" if not, and then did the usual breadth-first search where you don't go outside the bounds of the grid and don't visit walled squares.

It just took forever to actually write everything out! It made me think I might have saved time if I had had a super generic BFS algorithm that let me provide adjacency rules (and which worked in 3D).

Part 2 was conceptually simple; just modify the part 1 algorithm to do paths multiple times. Unfortunately, I ran into a bunch of off-by-one errors I had to debug, and then I realized that one of my original assumptions (that you can immediately move into the board) was wrong...but only for the sample. I ended up solving the puzzle input before having a solution that worked for the sample.

Day 23

Day 23 prompt

This one was straightforward. I felt like I was basically following the instructions directly from the prompt. The practice with using tuples for day 22 came in handy again. I wrote a pretty inefficient algorithm, but the input for part 2 in release mode ran relatively quickly (order of a few dozen seconds), so I didn't need to optimize it to get the answer.

Day 22

Day 22 prompt

Day 22 was a slog for me. It's a lot of bounds checking. The tricky part (and from Reddit it looks like it was tricky for others, too) was figuring out how to get the wrapping rules right for part 2, treating the input as a flattened cube. I drew a bunch of pictures and imagined the sides wrapping in 3D space a bunch, and ultimately came up with the general idea that you can find the 90-degree concave corners first. Those are guaranteed to be adjacent edges. Once you've found those, you can expand outward from them to find more edges that are adjacent to each other when re-combined into a cube.

The trick there was realizing that, as you expand out, if you change direction following a cube edge twice, you're no longer on adjacent edges. For example, say you start at a corner. You can go up to follow one edge and left to follow the other edge from the corner. Those two edges are obviously adjacent. Then, if you turn left from the left edge and go down a new edge while the other edge continues upward, that downward edge and the second upward edge are adjacent. However, if the first upward edge turns right and continues along an edge going to the right, that rightward edge is not adjacent to the downward edge (the one you reached after going left).

# C and D adjacent edges      # C and D are NOT adjacent.

  D.                            
  D.                            
  B.                              BD
  B.                              B.
AA..                            AA..
C...                            C...

The only exception to that rule is the very last edge.

It definitely would have been easier to hard-code the cube geometry. I also have a feeling it would have been easier to try to represent the cube in 3D, but I had already invested enough time in calculating the adjacent edges on the 2D version that I didn't want to try to go down the 3D route.

But hey, at least this wasn't an exponential runtime problem!

Day 21

Day 21 prompt

This one was fun. Basically, write an arithmetic expression parser. For Part 2 I had to switch to parsing the input into an expression tree so that I could implement algebra to find X. This was good practice for making boxed recursive enum types. I have a feeling my functions are overly verbose and that there are probably some nifty Rust pattern-matching ways to handle some of the tasks here. I considered trying out the unstable box patterns feature, but I was able to accomplish what I needed with plain if and match expressions. Still, I'm sure there must be some way to make the rules for reduce and balance less verbose.
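A minimal sketch of the boxed recursive enum idea (illustrative names and a trimmed-down operator set, not the repo's actual types):

```rust
// Recursive enums need indirection (Box) so the type has a known size.
enum Expr {
    Num(i64),
    Add(Box<Expr>, Box<Expr>),
    Mul(Box<Expr>, Box<Expr>),
}

// Evaluation is a straightforward recursive match.
fn eval(e: &Expr) -> i64 {
    match e {
        Expr::Num(n) => *n,
        Expr::Add(l, r) => eval(l) + eval(r),
        Expr::Mul(l, r) => eval(l) * eval(r),
    }
}

fn main() {
    // (2 + 3) * 4
    let e = Expr::Mul(
        Box::new(Expr::Add(Box::new(Expr::Num(2)), Box::new(Expr::Num(3)))),
        Box::new(Expr::Num(4)),
    );
    assert_eq!(eval(&e), 20);
}
```

Matching through the `Box` is where the verbosity creeps in: without the unstable box patterns feature, each arm destructures the box with an explicit deref or a nested match.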

Day 20

Day 20 prompt

I picked back up on new days today, after two weeks of travel and then spending yesterday getting back into the groove by finishing Day 16.

This problem seems simple when you first look at it, but the requirements made it tricky. I took a little time to think about it, and ultimately landed on turning the numbers into a doubly-linked list. That let me maintain the original order of the numbers (by reference to the linked list nodes), but freely rearrange them according to the rules of the problem. Since I needed to reference the nodes directly, I decided it would be easiest to write my own Linked List struct, which meant learning about how Rust handles heap allocation and pointers. (I got surprisingly far into coding in Rust without needing to really figure them out!) More on my learning experience in the Day 20 readme.

I spent most of my time figuring out how to work with Rust Rc (ref-counted pointers) and RefCell (the abstraction that lets you mutate what those reference-counted pointers point to at all). As long as you have the right data structure, you can basically follow the problem statement instructions blindly. For part 2, the big wrinkle to deal with is the size of the numbers getting big, which means you loop around the list many times. To deal with that, just eliminate the duplicate moves by moving the distance you're supposed to move mod list.len() - 1. (It's length - 1 because [0, 1, 2] and [1, 2, 0] are equivalent. So if you're moving 0 to the right, you go from [1, 2, 0] to [1, 0, 2], not to [0, 1, 2].)
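The mod list.len() - 1 reduction can be sketched on its own (function name mine): a move of k steps in a circular list of n elements is equivalent to k mod (n - 1) steps, because once the moving element is lifted out there are only n - 1 gaps to land in.

```rust
// Collapse a large (possibly negative) move into an equivalent short one.
// Assumes len >= 2, so the modulus is positive.
fn effective_shift(k: i64, len: usize) -> i64 {
    k.rem_euclid(len as i64 - 1)
}

fn main() {
    // In a 3-element list, moving 2 steps right is the same as staying put:
    assert_eq!(effective_shift(2, 3), 0);
    // Negative moves wrap the other way:
    assert_eq!(effective_shift(-1, 5), 3);
    // Part-2-sized numbers collapse to something walkable:
    assert_eq!(effective_shift(811589153, 7), 811589153 % 6);
}
```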

Day 19

Day 19 prompt

(We were preparing for our two-week international trip, so I didn't have time to immediately finish this one.) I started with a naive, aggressively-exponential algorithm for part 1. Same issue as day 16; caching params (with the cached library!) isn't sufficient to shortcut the processing. This one seems suited to dynamic programming, but I don't know exactly what intermediate values to build up yet.

Rather than trying to build up intermediate values, I went ahead with branch-and-bound like Day 16. I really struggled to figure out how to do the bound, because it's not easy to come up with a heuristic for the max possible number of geodes. I couldn't come up with a better one than "add the geodes you have to the ones that will be produced by geode-cracking robots you have now, plus however many more you'll have assuming you can build one of those robots every minute". This just doesn't narrow it down very much. But, it was enough, and that method finished while I was still trying to optimize the performance.
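The optimistic bound described above can be written as a tiny function. This is a sketch with my own names, assuming the same "pretend you could build a geode robot every remaining minute" heuristic:

```rust
// Upper bound on geodes reachable from a search state; used to prune
// branches that can't beat the best complete answer found so far.
fn geode_upper_bound(geodes: u32, geode_robots: u32, minutes_left: u32) -> u32 {
    // Existing robots each crack one geode per remaining minute.
    let from_existing = geode_robots * minutes_left;
    // A hypothetical robot built with m minutes left yields m - 1 geodes;
    // building one every minute sums to the triangular number below.
    let from_hypothetical = minutes_left * minutes_left.saturating_sub(1) / 2;
    geodes + from_existing + from_hypothetical
}

fn main() {
    // 3 geodes, 2 robots, 4 minutes left: 3 + 2*4 + (3 + 2 + 1 + 0) = 17.
    assert_eq!(geode_upper_bound(3, 2, 4), 17);
}
```

If this bound is no better than the best total already found, the branch can be cut off without exploring it.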

I'll move on, but...

TODO: Improve the performance of Day 19! It takes a few minutes.

Day 18

Day 18 prompt

Voxels! For part 1, I used a BFS algorithm to search each voxel that was part of a "droplet" and count the number of neighbors it was missing to find its surface count. For part 2, I tried filling a volume of air around the droplet and then skipping any voxel that would have counted as an outside surface if that voxel wasn't in the air volume.

That would have worked fine, but I had a bug in my min/max algorithm to find the bounds of the air volume that gave me a voxel cube of 5000 on a side! I assumed that was the real size, but it was actually much smaller than that (maybe 50 to a side?), so I spent time trying to figure out an efficient way of determining if an "air" voxel was trapped using a DFS-out method. Implementing that was when I found the problem with the min/max algorithm.
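The part-1 surface count boils down to "one face per missing neighbor." A sketch (my own shape, not the repo's code), storing the droplet as a set of voxel coordinates:

```rust
use std::collections::HashSet;

// Each droplet voxel contributes one face per neighbor that is NOT
// also part of the droplet.
fn surface_area(voxels: &HashSet<(i32, i32, i32)>) -> usize {
    let deltas = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)];
    voxels
        .iter()
        .map(|&(x, y, z)| {
            deltas
                .iter()
                .filter(|&&(dx, dy, dz)| !voxels.contains(&(x + dx, y + dy, z + dz)))
                .count()
        })
        .sum()
}

fn main() {
    // Two adjacent cubes share one face: 12 - 2 = 10 exposed faces.
    let droplet: HashSet<_> = [(1, 1, 1), (2, 1, 1)].into_iter().collect();
    assert_eq!(surface_area(&droplet), 10);
}
```

Part 2 then subtracts faces that border trapped "air" voxels rather than the outside volume.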

Lessons learned:

  • Double check that the problem is really as big as you think it is. Don't add 1 to max as you accumulate it to find max. :eyeroll:
  • Lean more on tests for "trivial"-seeming functions; they'll be easy to write, and they catch exactly this kind of silly bug.

Day 17

Day 17 prompt

I built a solution that does the whole "tetris" simulation, and that ended up taking hours to run (after I did some small perf tweaks) for part 2.

TODO: Make it run in less than 39 hours.

There are signs that this problem is just some math operation, under the covers. You could think of the tetris board as a long string of base-128 digits (or base-256 if u8s are easier to work with), and each drop is a comparison with some 1-4 digits (each one times 2 or divided by 2, depending on what direction they're "pushed"). But I haven't figured out yet how to model the "push and drop" operation as a math expression.

Day 16

Day 16 prompt

I wrote a recursive find_max_release search algorithm that ended up being exponential run-time. Caching parameters isn't a good enough strategy for this one; the intermediate results are too diverse. I let part 1 run overnight (<8 hours I suppose) and it produced a result, but I haven't gotten time to optimize part 1 and then finish part 2.

Before starting part 2, I knew I needed to optimize. I started with the shortest-path-finding algorithm: I used Dijkstra's algorithm to generate the shortest path between each pair of valves and kept a matrix of distances. That drastically improved the run time.

For part 2, I started by adapting find_max_release to make a recursive call for each step, so that I could track the narrator and an elephant independently using the same branching mechanism. I also created a smaller example that helped me debug the two independent walkers. This ended up working great for the sample input, but it was just too slow for the puzzle input. I tried optimizing with memoization and reducing the extra copying of vectors that I had for debugging paths, but that also wasn't sufficient. Reddit solutions mentioned "branch and bound", and looking that up ended up being the key for me.

Caching solutions for these problems isn't enough; sometimes you need to cut off the algorithm when it's possible to tell that you won't be able to do better by going down a particular branch. For this problem, my heuristic function checks if you can beat the current best by immediately opening the best valve, then walking to the next best valve, opening it, and so on. It's not possible to do better than that, since the real walks between valves are at least that long. That was sufficient to bring down the run time of part 2 for the puzzle input to a second or so. (Upon reflection, the heuristic function isn't wholly valid for part 2. Valves can be opened slightly faster with two walkers, so I should only decrease time remaining by 1 minute instead of 2. Luckily, it still let the algorithm produce the right output, and it's faster as is. However, this function is liable to be a source of weird bugs, generally.)

Lessons learned

  • Don't forget the "bound" part of branch & bound! I often reach for memoization, and then forget that I could be tracking a "best result so far" that I could use to prune branches of the algorithm that are provably worse, based on a heuristic function.
  • Find a graph library so you don't have to implement shortest-path-finding yourself.
  • Don't try to reinvent the wheel on graphs, either. Use a HashMap of (Node, HashSet) for edges. Or better yet, some library that holds a graph representation.
  • Making aliased types for different things like release rate, distance, etc. made it easier to keep things straight in a problem that has lots of moving parts. Use the type system!

Day 15

Day 15 prompt

Part 1 was a really good setup to part 2 for this one. I used my Range struct from utils and modified it to add some features for this problem, including making it generic. It probably wasn’t worth the effort, but I did learn more about Rust generics by taking it on, so it had some benefits.

Day 14

Day 14 prompt

I opted to use a set of points instead of a grid for this one. I got a little ratholed trying to decide if Rust would do something reasonable with a backward range like 5..=2. It doesn't; I resorted to min(y1, y2)..=max(y1, y2). My Coord struct and the existing parser for it came in handy, though I did need to make it handle multiple "->"-separated coordinates (vs. parsing just two coords to make a line).

I learned that Rust lets you label nested loops so you can have finer-grained control over break & continue. That was handy! It let me do an inner for loop over the three directions and have control over the loops like:

'outer: while /* ... */ {
    // ...
    for /* ... */ {
        if /* ... */ {
            continue; // skip to the next inner-loop thing
        } else {
            // ...
            continue 'outer; // Finish early with the inner loop
        }
    }

    // ...
    return true; // Running out of the inner loop ever means the function terminates!
}

return false; // Running out of the outer loop means the function terminates differently.

Lessons learned

  • Don't fiddle with "backward" ranges, just use min and max to get the right forward range.

Day 13

Day 13 prompt

I can't tell how much time I wasted today. Part 1 took a lot of time because I decided to use an enum type to handle the lists-or-values aspects of the packets. I decided to overload the <= operator for the type because I realized it's fundamentally what I was writing. I think that implementing the PartialOrd interface for the type was technically overkill, since we didn't care as much about equality, but it did make writing the logic a little more straightforward. That decision ended up making part 2 dead simple, because I could just sort the vector of packets. (LorVs, in my implementation. Probably should have called it Packet; whoops.)

So ultimately, I spent an hour and a half setting up the parsing and ordering of the packet type, then debugging all that for part 1, just to make part 2 take 10 minutes. I guess I must not have wasted too much time, since I got my 3rd best ranking with it even after an hour-delayed start.
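The ordering idea can be sketched with an illustrative `Packet` name (the repo's type is LorVs). Implementing Ord is what makes part 2's "just sort the vector" possible; the value-vs-list case promotes the value to a one-element list, and Vec's built-in lexicographic Ord handles the list-vs-list rules:

```rust
use std::cmp::Ordering;

#[derive(PartialEq, Eq)]
enum Packet {
    Val(u32),
    List(Vec<Packet>),
}

impl Ord for Packet {
    fn cmp(&self, other: &Self) -> Ordering {
        match (self, other) {
            (Packet::Val(a), Packet::Val(b)) => a.cmp(b),
            // A bare value compares as a one-element list.
            (Packet::Val(a), Packet::List(_)) => {
                Packet::List(vec![Packet::Val(*a)]).cmp(other)
            }
            (Packet::List(_), Packet::Val(b)) => {
                self.cmp(&Packet::List(vec![Packet::Val(*b)]))
            }
            // Vec's lexicographic Ord matches the puzzle's list rules.
            (Packet::List(a), Packet::List(b)) => a.cmp(b),
        }
    }
}

impl PartialOrd for Packet {
    fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
        Some(self.cmp(other))
    }
}

fn main() {
    let left = Packet::List(vec![Packet::Val(1), Packet::Val(1)]);
    let right = Packet::List(vec![Packet::Val(1), Packet::Val(2)]);
    assert!(left < right);
    // [2] vs 3: the value gets promoted to a list for the comparison.
    assert!(Packet::List(vec![Packet::Val(2)]) < Packet::Val(3));
}
```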

Lessons learned

  • Pay attention to the problem asking for 1-based indexing.
  • It pays to use standard operators (overloading <= for my enum type meant I could finish part 2 very quickly).
  • unzip did something I didn't expect -- if I skipped zip'd iterators until one of those iterators ran out, calling unzip would yield two empty vectors instead of an empty vector paired with the remainder of the other vector.
    • This ate up some debugging time while I tried to figure out my <= algorithm.
    • Turns out, I should have just used the built-in Vec cmp method. 🤦
    • BUT, at least I have learned how I could rewrite vector comparison using iterators. So that's fun.

Day 12

Day 12 prompt

Today was BFS day! Find the shortest path given [whatever] constraints on going one square away, so naturally breadth-first-searching a grid is a good choice. I actually remembered all the pieces I needed for BFS this time around, and tried to learn my lesson from a recent interview experience and simplify the up/down/left/right inner loop. This worked out nicely:

let deltas: [(isize, isize); 4] = [(0, 1), (-1, 0), (0, -1), (1, 0)]; // right, up, left, down

Combine that with checked_add_signed, a nightly feature of 1.66.0 (and the realization that all I need to do to get those is rustup install nightly then cargo +nightly run!), to get a simple way to visit the neighbors with bounds-checking:

for (d_r, d_c) in &deltas {
    let next_r = match curr_pt.0.checked_add_signed(*d_r) {
        Some(val) if val < terrain.rows() => val,
        _ => continue
    };
    // ...

The only issue I ran into with part 2 was that some lowest points couldn't reach the end; for that I just switched to returning an optional path length.

Lessons learned

  • It's easier than I thought to get unstable Rust features (or unreleased stabilized features)!
    • rustup install nightly
    • Add the feature to the top of main.rs (e.g. for today I added #![feature(mixed_integer_ops)]).
    • Run with +nightly, e.g.: cargo +nightly run input.txt
    • Add "+nightly" to the list of args in launch.json if you're set up for debugging like this project template has it. (I needed this for debugging part 2 where a path isn't always possible.)

Day 11

Day 11 prompt

Since I overdid parsing in previous days, I decided to just double down on splitting today. Oddly enough, that probably slowed me down, compared to picking LALRPOP! It basically took half an hour to parse each of the "monkeys" and turn their worry-increase operations into something where I didn't have to deal with &str lifetimes.

Part 2 was the big boondoggle. I thought I might be able to brute-force it and figured out how to refactor it for BigInts, but even those blow up, since the worry values themselves grow exponentially. The trick, which I realized late, was to work in values mod the product of the divisors, since the remainder of a product is the product of the remainders. The same goes for sums.
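The mod-the-product trick can be demonstrated in a few lines (function name mine; the divisors below are the ones from the day 11 sample input). Reducing worry mod the product of all divisors preserves every monkey's divisibility test, and the property survives multiplication and addition:

```rust
// Keep worry levels small by reducing mod the product of all divisors.
fn reduce_worry(worry: u64, divisors: &[u64]) -> u64 {
    let modulus: u64 = divisors.iter().product();
    worry % modulus
}

fn main() {
    let divisors = [23u64, 19, 13, 17]; // product = 96577
    let big = 1_000_003u64;
    let small = reduce_worry(big, &divisors);
    // Every divisibility test gives the same answer on the reduced value...
    for d in divisors {
        assert_eq!(big % d, small % d);
    }
    // ...and the property survives the monkeys' squaring operation.
    assert_eq!(
        reduce_worry(big * big, &divisors),
        reduce_worry(small * small, &divisors)
    );
}
```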

Lessons learned

  • Sometimes it is better to use the fancy parser.

Day 10

Day 10 prompt

Today was the off-by-one-error challenge day. I had a lot of trouble overthinking the processor "stall" logic, then had to hack around the fact that I was expecting to read the "X register" after an operation and not during an operation. For part 2, Grid came in handy, though I was disappointed that I couldn't find a straightforward print function. I ended up using:

for row in 0..display.rows() {
    let r: String = display.iter_row(row).collect();
    println!("{r}");
}

I realized after, though, that Grid has a prettier "alternate" representation, which would have been harder to read for this day, but might come in handy. Just use the standard "alternate" formatting:

print!("{:#?}", display);

Lessons learned

  • Don't try to anticipate part 2. I assumed I'd get more instructions to add, which had longer stall times, and generalized the "instruction stall" feature. It was unnecessary for part 2 and confused me for part 1.

Day 9

Day 9 prompt

I thought it was smart to use my Coord util, but it probably ended up wasting time. However, it was smart to make the Rope class, because it was very easy to make a collection of them (Knots) and refactor it for use in part 2.

Lessons learned

  • Tuples are probably better than hand-rolling coordinate classes. A real 2d/3d math library might be better still.

To do:

  • Pick a Rust 2d/3d math crate. Maybe nalgebra?

Day 8

Day 8 prompt

I felt like I should've been able to use iterators for the viewing_distance code for part 2, but couldn't think of how. I ended up with the usual "look up/left/down/right" kind of code you get when searching a grid is involved. The grid crate did come in handy, though, especially given the input format was rows of numbers separated by newlines. If we get another one of these, I'll copy-pasta:

    let field: Grid<u8> = Grid::from_vec(
        input.chars().filter(|c| *c != '\n').map(|c| c.to_digit(10).unwrap() as u8).collect(),
        input.chars().find_position(|c| *c == '\n').unwrap().0
    );

Day 7

Day 7 prompt

The parser was probably overkill for this one, but because I've been leaning on it so much, it was easier to reach for it than to try to relearn Regex and build a match statement on regexes.

The hardest part of Day 7 was keeping track of things.

  • Keeping track of what terminal commands could just be ignored
  • Keeping track of duplicated values for summing the smallest directories (for part 1).

I didn't bother to benchmark part 2, but it clearly does redundant work (each directory's size is recalculated recursively every time it's needed). I could cache the calculated sizes, but since it gave me the right answer instantaneously without caching, it wasn't necessary.

Day 6

Day 6 prompt

I was just waiting for Rust's windows method to come in handy! It was a little cumbersome to figure out how to turn a &str into the kind of slice that has windows on it, but once I did that it was just a matter of using sets to find the first run of characters that were all unique. And since the second part was the same algorithm with a different window size, it was just a matter of generalizing the function to take n instead of using 4 everywhere.
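The windows-plus-sets search generalized over n looks something like this (function name mine, not the repo's). Working on the byte slice is one way around the &str-to-slice awkwardness:

```rust
use std::collections::HashSet;

// Returns the 1-based position just past the first run of n unique
// characters (n = 4 for part 1, 14 for part 2).
fn find_marker(signal: &str, n: usize) -> Option<usize> {
    signal
        .as_bytes()
        .windows(n)
        .position(|w| w.iter().collect::<HashSet<_>>().len() == n)
        .map(|i| i + n)
}

fn main() {
    // First sample from the day 6 prompt:
    assert_eq!(find_marker("mjqjpqmgbljsphdztnvjfqwrcgsmlb", 4), Some(7));
    assert_eq!(find_marker("mjqjpqmgbljsphdztnvjfqwrcgsmlb", 14), Some(19));
}
```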

Day 5

Day 5 prompt

I got myself really wound up around parsing on this one. I went down a rabbit hole trying to use LALRPOP to parse the representation of the stacks, but hit a snag because I needed a lexer that doesn't ignore whitespace. Then it took me time to decide how I wanted to handle the custom parsing.

Lessons learned

  • Don't even bother with LALRPOP's default lexer if whitespace matters in the parsing.

To do:

  • See if it's possible to write a LALRPOP lexer that can consider whitespace. This could even be useful for when newline-separated input needs to get parsed as a vector.
  • Look for the faster way of initializing a vector of objects. I got hung up trying to figure out how to repeat a VecDeque::new.

Day 4

Day 4 prompt

LALRPOP paid off for this one! It was really simple to write a parser that extracted text into a pair of ranges. I created a Range struct with start & end, and then implemented fully_contains and overlaps for inclusive-end ranges. Instead of thinking too hard about the overlaps logic, I just gave it my best shot on the boolean operations and refined by adding a bunch of test cases. With those things in place, it was fairly straightforward to turn the text of the prompt into code that said what it was supposed to do ("sum the assignment pairs that overlap each other", basically, for part 2).
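The two predicates for inclusive-end ranges are short enough to sketch here (the repo's actual Range struct may differ; names mirror the description above):

```rust
// An inclusive range [start, end].
struct Range {
    start: i32,
    end: i32,
}

impl Range {
    // self fully contains other when other's endpoints both fall inside self.
    fn fully_contains(&self, other: &Range) -> bool {
        self.start <= other.start && other.end <= self.end
    }
    // Two inclusive ranges overlap unless one ends before the other starts.
    fn overlaps(&self, other: &Range) -> bool {
        self.start <= other.end && other.start <= self.end
    }
}

fn main() {
    let a = Range { start: 2, end: 8 };
    let b = Range { start: 3, end: 7 };
    let c = Range { start: 9, end: 10 };
    assert!(a.fully_contains(&b));
    assert!(a.overlaps(&b));
    assert!(!a.overlaps(&c));
}
```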

Day 3

Day 3 prompt

Parsing was again the big challenge. I had to sort out the mechanics of splitting lines in two and converting chars to the right number. The other big thing for Part 2 was doing an intersect_all. I used fold, and the debugger came in handy for that. (I discovered I needed to special-case the initial value, which intersects with nothing.)

This intersect_all might come in handy later! It would have been handy for Part 1, too, if I'd thought to put the two halves of each input into sets instead of vectors. When I refactor, I'll probably use it.
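An intersect_all along these lines can be written with fold (this is my sketch, not the repo's version); seeding the fold with the first set is one way to sidestep the "intersection with nothing" special case:

```rust
use std::collections::HashSet;

// Intersect a slice of sets; an empty slice yields the empty set.
fn intersect_all(sets: &[HashSet<u32>]) -> HashSet<u32> {
    let mut iter = sets.iter();
    let first = iter.next().cloned().unwrap_or_default();
    // copied() turns the &u32 items from intersection() back into u32.
    iter.fold(first, |acc, s| acc.intersection(s).copied().collect())
}

fn main() {
    let sets: Vec<HashSet<u32>> = vec![
        [1, 2, 3].into_iter().collect(),
        [2, 3, 4].into_iter().collect(),
        [3, 4, 5].into_iter().collect(),
    ];
    let expected: HashSet<u32> = [3].into_iter().collect();
    assert_eq!(intersect_all(&sets), expected);
}
```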

Lessons learned

  1. Get comfortable with HashSet, because sets are probably going to come in handy and accumulating them was a little strange. E.g. copied() to turn &u32 into u32.
  2. Study up on group_by, because it would have been helpful for Day 1 and today.
  3. The equivalent to Python's ord() is char as u32.
  4. Using #[test]s to try things out was handy.

To do:

  • Get comfortable with group_by

Day 2

Day 2 prompt

Keeping the various cases straight was the hardest part. For simplicity I didn't initially introduce Enums, and kept having to mentally map "A" to Rock, "Y" to Paper, etc.

Refactoring this one exposed something weird in LALRPOP, where I can't seem to use regex rules on my Play terminal. That is, I really wanted to write r"[AX]" => ..., but had to resort to "A" => ... and "X" => ....

Lessons learned

  1. It's more readable to use sum() instead of fold() with an implicit sum.
  2. LALRPOP has some rough edges; if you're time-pressured, work around them and move on.

Day 1

First see: Day 1 prompt

The first thing that happened was I realized my parsing practice was limited entirely to parsing a single line. Suddenly, I wanted to collect lines of input together into vectors of parsed ints, using a blank line to separate each group. I couldn't think of how to do it with iterators, so I gave up on iterators and just used a for loop to accumulate into a mutable vector of vectors.

That wasn't actually necessary! It would have been simpler to sum as the input gets read. Also, the hard problem of collecting lines delimited by blank lines into a vector would have been much simpler if I'd just read the whole file into a string and split around blank lines, instead of trying to think of how to segment off each vector as I iterated the input.
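The read-whole-file-and-split approach, combined with summing as the input gets read, is only a couple of lines (function name mine):

```rust
// Split the input on blank lines and sum each group of numbers.
fn elf_totals(input: &str) -> Vec<u32> {
    input
        .split("\n\n")
        .map(|group| group.lines().filter_map(|l| l.parse::<u32>().ok()).sum())
        .collect()
}

fn main() {
    let input = "1000\n2000\n\n3000\n\n4000\n5000";
    assert_eq!(elf_totals(input), vec![3000, 3000, 9000]);
}
```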

I refactored day 1 into a more template-able example that I liked better, and split off some functions that might be useful utilities.

Lessons Learned

  1. Just solve the part 1 problem without trying to anticipate the part 2 problem. I preserved lists of ints instead of summing as I went.
  2. Read files instead of relying on standard input. That way the program is debuggable!

To do:

  • Create a template for days that includes a launch.json to provide a test runner and two configs for loading sample.txt and input.txt.
  • Create a string-to-vector utility to parse things out of strings separated by [whatever].
  • Extract an argument-reading utility. I decided I don't want to deal with a utility function that takes a mutable iterator and returns a string.

About

My solutions for the 2022 Advent of Code
