diff --git a/docs/challenging-sql-problems/challenging-sql-problems.md b/docs/challenging-sql-problems/challenging-sql-problems.md
new file mode 100644
index 0000000..fadb33b
--- /dev/null
+++ b/docs/challenging-sql-problems/challenging-sql-problems.md
@@ -0,0 +1,30 @@
+# Challenging SQL questions
+
+> [!WARNING]
+>
+> These questions are not for people new to SQL! They expect you to use advanced SQL techniques that most people don't know.
+
+## Problems
+
+### 🟤 Bronze Tier
+
+As long as you know your database features, these are straightforward.
+
+1. [Fibonacci sequence](problems/bronze/fibonacci-sequence.md)
+2. [Temperature anomaly detection](problems/bronze/temperature-anomaly-detection.md)
+3. [UK bank holidays](problems/bronze/uk-bank-holidays.md)
+
+### ⚪ Silver Tier
+
+These require a bit more thinking.
+
+1. [Suspicious login activity](problems/silver/suspicious-login-activity.md)
+2. [Funnel analytics](problems/silver/funnel-analytics.md)
+3. [Customer sales running totals](problems/silver/customer-sales-running-totals.md)
+
+### 🟡 Gold Tier
+
+Expect to spend a bit of time on these.
+
+1. [Loan repayment schedules](problems/gold/loan-repayment-schedule.md)
+2. [Travel plans](problems/gold/travel-plans.md)
diff --git a/docs/challenging-sql-problems/problems/bronze/fibonacci-sequence.md b/docs/challenging-sql-problems/problems/bronze/fibonacci-sequence.md
new file mode 100644
index 0000000..6301042
--- /dev/null
+++ b/docs/challenging-sql-problems/problems/bronze/fibonacci-sequence.md
@@ -0,0 +1,48 @@
+# Fibonacci sequence 🔢
+
+> [!QUESTION]
+>
+> Generate the first 45 terms of the [Fibonacci sequence](https://en.wikipedia.org/wiki/Fibonacci_sequence).
+>
+> The output should have one row per term in the sequence and the columns:
+>
+> - `n` as the term number
+> - `f_n` as the corresponding Fibonacci number
+>
+> Order the output by `n`.
+
+The [Fibonacci sequence](https://en.wikipedia.org/wiki/Fibonacci_sequence) is defined by the recurrence _f<sub>n</sub> = f<sub>n-1</sub> + f<sub>n-2</sub>_, where _f<sub>1</sub> = f<sub>2</sub> = 1_.
+
+For example:
+
+- The third term is _f<sub>3</sub> = f<sub>2</sub> + f<sub>1</sub> = 1 + 1 = 2_
+- The fourth term is _f<sub>4</sub> = f<sub>3</sub> + f<sub>2</sub> = 2 + 1 = 3_
+- ...
+- The tenth term is _f<sub>10</sub> = f<sub>9</sub> + f<sub>8</sub> = 34 + 21 = 55_
+
+The solution can be found at:
+
+- [fibonacci-sequence.md](../../solutions/bronze/fibonacci-sequence.md)
+
+---
+
+
+>? INFO: **Sample output**
+>
+> | n | f_n |
+> | --: | --: |
+> | 1 | 1 |
+> | 2 | 1 |
+> | 3 | 2 |
+> | 4 | 3 |
+> | ... | ... |
+
+
+>? TIP: **Hint 1**
+>
+> Use a [recursive CTE](../../../from-excel-to-sql/advanced-concepts/recursive-ctes.md) to generate the sequence.
+
+
+>? TIP: **Hint 2**
+>
+> Use the columns `n`, `f_n`, and `f_m` to keep track of the current term, the current Fibonacci number, and the previous Fibonacci number.
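+
+
+If you've not written a [recursive CTE](../../../from-excel-to-sql/advanced-concepts/recursive-ctes.md) before, here's a minimal counting example (deliberately _not_ the solution) to show the general shape -- an anchor row, a recursive step, and a stopping condition:
+
+```sql
+with recursive counter(n) as (
+    select 1  /* anchor: the first row */
+    union all
+    select n + 1  /* recursive step: build each row from the previous one */
+    from counter
+    where n < 10  /* stopping condition */
+)
+
+select n
+from counter
+```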
diff --git a/docs/challenging-sql-problems/problems/bronze/temperature-anomaly-detection.md b/docs/challenging-sql-problems/problems/bronze/temperature-anomaly-detection.md
new file mode 100644
index 0000000..5caf9be
--- /dev/null
+++ b/docs/challenging-sql-problems/problems/bronze/temperature-anomaly-detection.md
@@ -0,0 +1,48 @@
+# Temperature anomaly detection 🔍
+
+> [!QUESTION]
+>
+> Given the temperature data below, identify the temperature readings that are at least 10% higher than the average of the previous 2 and following 2 readings.
+>
+> Do _not_ include the current reading in the average calculation, and use the calculated average temperature as the denominator for the 10% calculation.
+>
+> If there are fewer than 2 readings before or fewer than 2 readings after the current reading, do not include that reading in the output.
+>
+> The output should have the columns:
+>
+> - `site_id`
+> - `reading_datetime`
+> - `temperature`
+> - `average_temperature` as the average of the 4 readings around the current reading (2 each side), rounded to 4 decimal places
+> - `percentage_increase` as the percentage increase of the current reading over the `average_temperature`, rounded to 4 decimal places
+>
+> Order the output by `site_id` then `reading_datetime`.
+
+
+Expand for the DDL
+--8<-- "docs/challenging-sql-problems/problems/bronze/temperature-anomaly-detection.sql"
+
+
+The solution can be found at:
+
+- [temperature-anomaly-detection.md](../../solutions/bronze/temperature-anomaly-detection.md)
+
+---
+
+
+>? INFO: **Sample output**
+>
+> | site_id | reading_datetime | temperature | average_temperature | percentage_increase |
+> |--------:|:--------------------|------------:|--------------------:|--------------------:|
+> | 1 | 2021-01-02 02:01:17 | 22.43 | 20.0525 | 11.8564 |
+> | ... | ... | ... | ... | ... |
+
+
+>? TIP: **Hint 1**
+>
+> Use a [window function](../../../from-excel-to-sql/main-concepts/window-functions.md) (or two!) to calculate the average temperature of the surrounding readings.
+
+
+>? TIP: **Hint 2**
+>
+> Use another [window function](../../../from-excel-to-sql/main-concepts/window-functions.md) to identify rows with at least 4 surrounding readings (2 before and 2 after).
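+
+
+As a generic illustration of the frame syntax (not the full solution), the following computes a centred average over the two rows either side of the current row. Note that this frame _includes_ the current row, which the problem asks you to exclude -- some databases support `exclude current row` for exactly this:
+
+```sql
+select
+    site_id,
+    reading_datetime,
+    temperature,
+    avg(temperature) over (
+        partition by site_id
+        order by reading_datetime
+        rows between 2 preceding and 2 following
+    ) as centred_average  /* includes the current row's temperature */
+from readings
+```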
diff --git a/docs/challenging-sql-problems/problems/bronze/temperature-anomaly-detection.sql b/docs/challenging-sql-problems/problems/bronze/temperature-anomaly-detection.sql
new file mode 100644
index 0000000..31d42bf
--- /dev/null
+++ b/docs/challenging-sql-problems/problems/bronze/temperature-anomaly-detection.sql
@@ -0,0 +1,35 @@
+```sql
+create table readings (
+ site_id integer,
+ reading_datetime timestamp,
+ temperature decimal(5, 2) not null,
+ primary key (site_id, reading_datetime)
+);
+insert into readings
+values
+ (1, '2021-01-01 03:26:23', 20.02),
+ (1, '2021-01-01 19:52:46', 20.17),
+ (1, '2021-01-02 02:01:17', 22.43),
+ (1, '2021-01-02 21:02:34', 19.91),
+ (1, '2021-01-03 04:12:56', 20.11),
+ (1, '2021-01-03 20:23:12', 20.22),
+ (1, '2021-01-04 05:34:23', 20.04),
+ (1, '2021-01-04 21:45:34', 22.69),
+ (1, '2021-01-05 06:56:45', 20.50),
+ (1, '2021-01-05 22:07:56', 20.32),
+ (1, '2021-01-06 07:18:07', 20.17),
+ (1, '2021-01-06 23:29:18', 23.58),
+ (2, '2021-01-01 04:30:10', 21.52),
+ (2, '2021-01-01 17:12:25', 21.48),
+ (2, '2021-01-02 01:59:43', 23.10),
+ (2, '2021-01-02 20:05:53', 18.19),
+ (2, '2021-01-03 03:17:12', 21.23),
+ (2, '2021-01-03 19:25:20', 21.27),
+ (2, '2021-01-04 04:33:34', 21.51),
+ (2, '2021-01-04 20:41:45', 21.49),
+ (2, '2021-01-05 05:49:56', 21.52),
+ (2, '2021-01-05 21:58:07', 21.48),
+ (2, '2021-01-06 07:06:18', 21.50),
+ (2, '2021-01-06 23:14:29', 21.52)
+;
+```
diff --git a/docs/challenging-sql-problems/problems/bronze/uk-bank-holidays.md b/docs/challenging-sql-problems/problems/bronze/uk-bank-holidays.md
new file mode 100644
index 0000000..eecc700
--- /dev/null
+++ b/docs/challenging-sql-problems/problems/bronze/uk-bank-holidays.md
@@ -0,0 +1,47 @@
+# UK bank holidays 📅
+
+> [!QUESTION]
+>
+> Using [DuckDB](https://duckdb.org/), parse the [UK bank holiday endpoint](https://www.gov.uk/bank-holidays.json) into a table.
+>
+> - [https://www.gov.uk/bank-holidays.json](https://www.gov.uk/bank-holidays.json)
+>
+> Each row in the output should correspond to a single event, and the column headers (below) should map directly to the JSON properties with the same names:
+>
+> - `division`
+> - `title`
+> - `date`
+> - `notes`
+> - `bunting`
+
+Here's a starting point:
+
+```sql
+from 'https://www.gov.uk/bank-holidays.json'
+```
+
+The solution can be found at:
+
+- [uk-bank-holidays.md](../../solutions/bronze/uk-bank-holidays.md)
+
+---
+
+
+>? INFO: **Sample output**
+>
+> | division | title | date | notes | bunting |
+> |:------------------|:---------------|:-----------|:------|:--------|
+> | england-and-wales | New Year’s Day | 2018-01-01 | | true |
+> | england-and-wales | Good Friday | 2018-03-30 | | false |
+> | england-and-wales | Easter Monday | 2018-04-02 | | true |
+> | ... | ... | ... | ... | ... |
+
+
+>? TIP: **Hint 1**
+>
+> Use [`UNPIVOT`](https://duckdb.org/docs/sql/statements/unpivot.html) to move the separate columns for each division into a single column.
+
+
+>? TIP: **Hint 2**
+>
+> Use [`UNNEST`](https://duckdb.org/docs/sql/query_syntax/unnest.html) to explode the event JSON into separate rows and columns.
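+
+
+If you've not used `UNNEST` with nested data before, here's a toy example (hypothetical values, DuckDB syntax) showing how `recursive := true` explodes a list of structs into rows _and_ columns in one go:
+
+```sql
+select unnest(
+    [
+        {'title': 'New Year''s Day', 'bunting': true},
+        {'title': 'Good Friday', 'bunting': false}
+    ],
+    recursive := true
+)  /* two rows, with columns `title` and `bunting` */
+```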
diff --git a/docs/challenging-sql-problems/problems/gold/loan-repayment-schedule.md b/docs/challenging-sql-problems/problems/gold/loan-repayment-schedule.md
new file mode 100644
index 0000000..d04c4af
--- /dev/null
+++ b/docs/challenging-sql-problems/problems/gold/loan-repayment-schedule.md
@@ -0,0 +1,141 @@
+# Loan repayment schedule 💰
+
+> [!QUESTION]
+>
+> For the loans below, generate their repayment schedules.
+>
+> The output should have a row per loan per repayment and the columns:
+>
+> - `loan_id`
+> - `repayment_number` as the repayment number
+> - `repayment_date` as the date of the repayment
+> - `interest` as the interest component of the repayment
+> - `principal` as the principal component of the repayment
+> - `total` as the total repayment
+> - `balance` as the outstanding balance _after_ the repayment
+>
+> Order the output by `loan_id` and `repayment_number`.
+
+
+Expand for the DDL
+--8<-- "docs/challenging-sql-problems/problems/gold/loan-repayment-schedule.sql"
+
+
+The loans have the following details:
+
+- `loan_id`: The unique identifier for the loan
+- `loan_value`: The total value of the loan
+- `interest_rate`: The monthly interest rate
+- `repayments`: The number of monthly repayments to make on the loan
+- `start_date`: The date the loan was taken out
+
+The repayments are due exactly one month apart (no need to account for weekends or holidays), and the first repayment is due one month after the `start_date`. The `start_date` will never be on the 29th, 30th, or 31st of the month.
+
+For each loan, every monthly repayment is for the same amount (except the final one), which you need to calculate -- or check **Hint 0** below. The monthly repayment must be rounded to two decimal places, but any rounding error should be accounted for in the final repayment so that the outstanding balance is exactly zero after the final repayment.
+
+For each repayment, the interest is calculated and added to the balance first, and then the repayment is subtracted from the balance. The interest is calculated on the current outstanding balance and rounded to two decimal places.
+
+A monthly repayment is made up of two parts: the interest and the principal. The interest is calculated as described above, and the principal is the difference between the monthly repayment and the interest -- the amount that actually goes towards paying off the loan.
+
+The solution can be found at:
+
+- [loan-repayment-schedule.md](../../solutions/gold/loan-repayment-schedule.md)
+
+A worked example is provided below to help illustrate the loan calculations.
+
+---
+
+
+>? INFO: **Sample output**
+>
+> | loan_id | repayment_number | repayment_date | interest | principal | total | balance |
+> |--------:|-----------------:|:---------------|---------:|----------:|---------:|---------:|
+> | 1 | 1 | 2024-02-01 | 1600.00 | 12682.06 | 14282.06 | 67317.94 |
+> | 1 | 2 | 2024-03-01 | 1346.36 | 12935.70 | 14282.06 | 54382.24 |
+> | 1 | 3 | 2024-04-01 | 1087.64 | 13194.42 | 14282.06 | 41187.82 |
+> | 1 | 4 | ... | ... | ... | ... | ... |
+> | 2 | 1 | 2024-02-02 | 1125.00 | 5751.00 | 6876.00 | 69249.00 |
+> | 2 | 2 | 2024-03-02 | 1038.74 | 5837.26 | 6876.00 | 63411.74 |
+> | 2 | 3 | 2024-04-02 | 951.18 | 5924.82 | 6876.00 | 57486.92 |
+> | 2 | 4 | ... | ... | ... | ... | ... |
+> | 3 | 1 | 2024-02-03 | 1000.00 | 3707.35 | 4707.35 | 96292.65 |
+> | 3 | 2 | 2024-03-03 | 962.93 | 3744.42 | 4707.35 | 92548.23 |
+> | 3 | 3 | 2024-04-03 | 925.48 | 3781.87 | 4707.35 | 88766.36 |
+> | 3 | 4 | ... | ... | ... | ... | ... |
+
+
+>? TIP: **Hint 0**
+>
+> The formula for calculating the monthly repayment is:
+>
+> - `(1 + interest_rate)` to the power of `repayments` as `amortised_rate`, then
+> - `loan_value * interest_rate * amortised_rate / (amortised_rate - 1)` as `monthly_repayment`
+
+
+>? TIP: **Hint 1**
+>
+> Use a [recursive CTE](../../../from-excel-to-sql/advanced-concepts/recursive-ctes.md) to generate and calculate the rows for the repayment schedule.
+
+
+>? TIP: **Hint 2**
+>
+> For the recursive CTE's anchor statement, start with a dummy row for each loan with only the loan value and the start date. Then, recursively calculate the interest, principal, and balance for each repayment in the recursive statement.
+
+
+>? TIP: **Hint 3**
+>
+> Calculate the final repayment's details separately to account for any rounding errors.
+
+---
+
+### Worked example
+
+To help illustrate the loan calculations, consider the following loan:
+
+- `loan_value`: 10,000.00
+- `interest_rate`: 0.01
+- `repayments`: 6
+- `start_date`: 2024-01-01
+
+A loan with these details will have a monthly repayment value of 1,725.48 (rounded to 2 decimal places).
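+
+If you want to sanity-check that value, the formula from **Hint 0** can be evaluated directly; for example, in DuckDB:
+
+```sql
+select round(
+    10000.00 * 0.01 * power(1.01, 6) / (power(1.01, 6) - 1),
+    2
+) as monthly_repayment  /* 1725.48 */
+```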
+
+Let's walk through a few repayments.
+
+#### The first repayment
+
+- The first repayment is due on 2024-02-01
+- The interest is calculated on the outstanding balance of 10,000.00
+- The interest is 1%, so the interest for the month is 100.00 (10,000.00 \* 0.01)
+- The repayment is 1,725.48, so the outstanding balance after the repayment is 8,374.52 (10,000.00 + 100.00 - 1,725.48)
+- We note that the principal component of the repayment is 1,625.48 (1,725.48 - 100.00)
+
+#### The second repayment
+
+- The second repayment is due on 2024-03-01
+- The interest is calculated on the outstanding balance of 8,374.52
+- The interest is 1%, so the interest for the month is 83.75 (8,374.52 \* 0.01)
+- The repayment is 1,725.48, so the outstanding balance after the repayment is 6,732.79 (8,374.52 + 83.75 - 1,725.48)
+- We note that the principal component of the repayment is 1,641.73 (1,725.48 - 83.75)
+
+#### The third, fourth, and fifth repayments
+
+- The interest and principal components are calculated in the same way as above
+- The outstanding balance after the fifth repayment is 1,708.43
+
+#### The final repayment
+
+- The final repayment is due on 2024-07-01
+- The interest is calculated on the outstanding balance of 1,708.43
+- The interest is 1%, so the interest for the month is 17.08 (1,708.43 \* 0.01)
+- Since this is the final repayment and we want to account for any rounding errors, the repayment is the outstanding balance plus the interest: 1,725.51 (1,708.43 + 17.08)
+
+Therefore, the repayment schedule for this loan would look like:
+
+| loan_id | repayment_number | repayment_date | interest | principal | total | balance |
+| ------: | ---------------: | :------------- | -------: | --------: | ------: | ------: |
+| 1 | 1 | 2024-02-01 | 100.00 | 1625.48 | 1725.48 | 8374.52 |
+| 1 | 2 | 2024-03-01 | 83.75 | 1641.73 | 1725.48 | 6732.79 |
+| 1 | 3 | 2024-04-01 | 67.33 | 1658.15 | 1725.48 | 5074.64 |
+| 1 | 4 | 2024-05-01 | 50.75 | 1674.73 | 1725.48 | 3399.91 |
+| 1 | 5 | 2024-06-01 | 34.00 | 1691.48 | 1725.48 | 1708.43 |
+| 1 | 6 | 2024-07-01 | 17.08 | 1708.43 | 1725.51 | 0.00 |
diff --git a/docs/challenging-sql-problems/problems/gold/loan-repayment-schedule.sql b/docs/challenging-sql-problems/problems/gold/loan-repayment-schedule.sql
new file mode 100644
index 0000000..88f582c
--- /dev/null
+++ b/docs/challenging-sql-problems/problems/gold/loan-repayment-schedule.sql
@@ -0,0 +1,15 @@
+```sql
+create table loans (
+ loan_id integer primary key,
+ loan_value decimal(10, 2) not null,
+ interest_rate decimal(5, 4) not null,
+ repayments integer not null,
+ start_date date not null
+);
+insert into loans
+values
+ (1, 80000.00, 0.020, 6, '2024-01-01'),
+ (2, 75000.00, 0.015, 12, '2024-01-02'),
+ (3, 100000.00, 0.010, 24, '2024-01-03')
+;
+```
diff --git a/docs/challenging-sql-problems/problems/gold/travel-plans.md b/docs/challenging-sql-problems/problems/gold/travel-plans.md
new file mode 100644
index 0000000..2e44ab1
--- /dev/null
+++ b/docs/challenging-sql-problems/problems/gold/travel-plans.md
@@ -0,0 +1,55 @@
+# Travel plans 🌍
+
+> [!QUESTION]
+>
+> Journey information has been collected into two tables:
+>
+> - `routes_timetable`
+> - `routes_schedule`
+>
+> The timetable table records individual routes with their full departure/arrival timestamps and cost, while the schedule table records repeated routes via their schedule definition (departure window, frequency, and duration).
+>
+> You need to get from New York to Paris, and the earliest you can leave from New York is 2024-01-01 12:00:00-05:00.
+>
+> Allow a minimum of 30 minutes and a maximum of 6 hours for "interchange" time (the time between arrival and departure at the same location). All costs are in the same currency (with no currency specified).
+>
+> What's the fastest route, and what's the cheapest route?
+>
+> The output should have the columns:
+>
+> - `route` which is each location in the route separated by a hyphen, e.g. `New York - London - Paris`
+> - `departure_datetime_utc` as the departure time (UTC) from New York
+> - `arrival_datetime_utc` as the arrival time (UTC) in Paris
+> - `duration` as the total duration of the journey
+> - `cost` as the total cost of the journey
+>
+> Order the output by `arrival_datetime_utc`.
+
+
+Expand for the DDL
+--8<-- "docs/challenging-sql-problems/problems/gold/travel-plans.sql"
+
+
+The solution can be found at:
+
+- [travel-plans.md](../../solutions/gold/travel-plans.md)
+
+---
+
+
+>? INFO: **Sample output**
+>
+> | route | departure_datetime_utc | arrival_datetime_utc | duration | cost |
+> |:-----------------|:-----------------------|:---------------------|:---------|-------:|
+> | New York - Paris | 2024-01-01 23:00:00 | 2024-01-02 16:45:00 | 17:45:00 | 279.00 |
+> | ... | ... | ... | ... | ... |
+
+
+>? TIP: **Hint 1**
+>
+> Expand the `routes_schedule` table into individual routes and then union with the `routes_timetable` table for a full list of routes to consider.
+
+
+>? TIP: **Hint 2**
+>
+> Use a [recursive CTE](../../../from-excel-to-sql/advanced-concepts/recursive-ctes.md) to build up the journey from New York to Paris, considering the interchange time between routes.
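+
+
+As a rough sketch of **Hint 1** (using the `routes_schedule` columns, and casting the `time` frequency to an `interval` via `varchar` as a DuckDB workaround), a schedule row can be expanded into its concrete departure times like this:
+
+```sql
+with recursive departures as (
+    select
+        schedule_id,
+        earliest_departure as departure_time,
+        latest_departure,
+        frequency::varchar::interval as frequency
+    from routes_schedule
+    where frequency is not null  /* daily schedules have nothing to expand */
+    union all
+    select
+        schedule_id,
+        departure_time + frequency,
+        latest_departure,
+        frequency
+    from departures
+    where departure_time + frequency <= latest_departure
+)
+
+select schedule_id, departure_time
+from departures
+order by schedule_id, departure_time
+```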
diff --git a/docs/challenging-sql-problems/problems/gold/travel-plans.sql b/docs/challenging-sql-problems/problems/gold/travel-plans.sql
new file mode 100644
index 0000000..c80f5ab
--- /dev/null
+++ b/docs/challenging-sql-problems/problems/gold/travel-plans.sql
@@ -0,0 +1,47 @@
+```sql
+create table routes_schedule (
+ schedule_id int primary key,
+ mode_of_transport varchar not null,
+ from_location varchar not null,
+ to_location varchar not null,
+ earliest_departure timetz not null,
+ latest_departure timetz not null,
+ frequency time, /* `null` means that it's daily */
+ duration time not null,
+ cost decimal(8, 2) not null
+);
+insert into routes_schedule
+values
+ (1, 'train', 'London St Pancras', 'London Gatwick', '08:00:00+00:00', '20:00:00+00:00', '01:00:00', '00:30:00', 17.50),
+ (2, 'train', 'London St Pancras', 'London Gatwick', '07:30:00+00:00', '22:30:00+00:00', '02:30:00', '01:15:00', 12.00),
+ (3, 'bus', 'London St Pancras', 'London Gatwick', '06:15:00+00:00', '06:15:00+00:00', null, '03:30:00', 6.75),
+ (4, 'bus', 'London St Pancras', 'London Gatwick', '19:30:00+00:00', '19:30:00+00:00', null, '03:30:00', 6.75),
+ (5, 'train', 'London Gatwick', 'London St Pancras', '09:00:00+00:00', '21:00:00+00:00', '01:00:00', '00:30:00', 17.50),
+ (6, 'train', 'London Gatwick', 'London St Pancras', '07:15:00+00:00', '22:15:00+00:00', '02:30:00', '01:15:00', 12.00),
+ (7, 'bus', 'London Gatwick', 'London St Pancras', '06:00:00+00:00', '06:00:00+00:00', null, '03:30:00', 6.75),
+ (8, 'bus', 'London Gatwick', 'London St Pancras', '20:00:00+00:00', '20:00:00+00:00', null, '03:30:00', 6.75)
+;
+
+create table routes_timetable (
+ route_id int primary key,
+ mode_of_transport varchar not null,
+ from_location varchar not null,
+ to_location varchar not null,
+ departure_datetime timestamptz not null,
+ arrival_datetime timestamptz not null,
+ cost decimal(8, 2) not null
+);
+insert into routes_timetable
+values
+ (1, 'boat', 'London St Pancras', 'Paris', '2024-01-01 06:00:00+00:00', '2024-01-01 07:30:00+01:00', 45.00),
+ (2, 'plane', 'London Gatwick', 'New York', '2024-01-01 13:05:00+00:00', '2024-01-01 20:55:00-05:00', 158.00),
+ (3, 'plane', 'London Gatwick', 'New York', '2024-01-02 20:40:00+00:00', '2024-01-03 04:30:00-05:00', 147.00),
+ (4, 'plane', 'London St Pancras', 'Paris', '2024-01-03 07:00:00+00:00', '2024-01-03 08:30:00+01:00', 70.00),
+ (5, 'plane', 'Paris', 'New York', '2024-01-02 12:00:00+01:00', '2024-01-02 20:30:00-05:00', 180.00),
+ (6, 'plane', 'New York', 'London Gatwick', '2024-01-01 13:00:00-05:00', '2024-01-02 05:45:00+00:00', 160.00),
+ (7, 'boat', 'New York', 'London Gatwick', '2024-01-01 05:30:00-05:00', '2024-01-01 23:00:00+00:00', 195.00),
+ (8, 'boat', 'London St Pancras', 'Paris', '2024-01-01 18:00:00+00:00', '2024-01-01 19:30:00+01:00', 95.00),
+ (9, 'boat', 'London St Pancras', 'Paris', '2024-01-02 14:00:00+00:00', '2024-01-02 15:30:00+01:00', 40.00),
+ (10, 'plane', 'New York', 'Paris', '2024-01-01 18:00:00-05:00', '2024-01-02 17:45:00+01:00', 279.00)
+;
+```
diff --git a/docs/challenging-sql-problems/problems/silver/customer-sales-running-totals.md b/docs/challenging-sql-problems/problems/silver/customer-sales-running-totals.md
new file mode 100644
index 0000000..12f15bd
--- /dev/null
+++ b/docs/challenging-sql-problems/problems/silver/customer-sales-running-totals.md
@@ -0,0 +1,63 @@
+# Customer sales running totals 📈
+
+> [!QUESTION]
+>
+> Using _only_ the `Sales.SalesOrderHeader` table in the [AdventureWorks](https://learn.microsoft.com/en-us/sql/samples/adventureworks-install-configure) database, calculate the running total of `TotalDue` per customer for the customers with `CustomerID` values of `11176`, `11091`, and `11287`.
+>
+> The output should have a row per customer for each day in June 2014, but the running totals should include all the historic sales for the customers.
+>
+> The output should have 90 rows (30 days in June for 3 customers) and the columns:
+>
+> - `BalanceDate` as the date that the end-of-day balance corresponds to
+> - `CustomerID`
+> - `RunningTotal` as the sum of the `TotalDue` values up to and including the `BalanceDate`
+>
+> Order the output by `BalanceDate` and `CustomerID`.
+
+> [!NOTE]
+>
+> You can access this table on the [db<>fiddle](https://dbfiddle.uk/8VEWSCRd) website at:
+>
+> - [https://dbfiddle.uk/8VEWSCRd](https://dbfiddle.uk/8VEWSCRd)
+
+Since the running totals include all of the customers' historic sales, the rows for 2014-06-01 should be:
+
+| BalanceDate | CustomerID | RunningTotal |
+| :---------- | ---------: | -----------: |
+| 2014-06-01 | 11091 | 1243.5234 |
+| 2014-06-01 | 11176 | 1222.8820 |
+| 2014-06-01 | 11287 | 1115.2109 |
+
+However, you should calculate this yourself (don't just copy the above values).
+
+The solution can be found at:
+
+- [customer-sales-running-totals.md](../../solutions/silver/customer-sales-running-totals.md)
+
+---
+
+
+>? INFO: **Sample output**
+>
+> | BalanceDate | CustomerID | RunningTotal |
+> |:------------|-----------:|-------------:|
+> | 2014-06-01 | 11091 | 1243.5234 |
+> | 2014-06-01 | 11176 | 1222.8820 |
+> | 2014-06-01 | 11287 | 1115.2109 |
+> | 2014-06-02 | 11091 | 1243.5234 |
+> | 2014-06-02 | 11176 | 1222.8820 |
+> | 2014-06-02 | 11287 | 1115.2109 |
+> | 2014-06-03 | 11091 | 1243.5234 |
+> | 2014-06-03 | 11176 | 1222.8820 |
+> | 2014-06-03 | 11287 | 1115.2109 |
+> | ... | ... | ... |
+
+
+>? TIP: **Hint 1**
+>
+> Use a [recursive CTE](../../../from-excel-to-sql/advanced-concepts/recursive-ctes.md) (or equivalent) to generate the June 2014 date axis, and then join the customers' sales to it.
+
+
+>? TIP: **Hint 2**
+>
+> Use the `SUM` function with [the `OVER` clause](../../../from-excel-to-sql/main-concepts/window-functions.md) to calculate the running total.
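+
+
+For reference, the window-function half of the problem looks like this in SQL Server -- a running total per customer over their orders (you'd still need to join this onto a June 2014 date axis and keep each day's closing value):
+
+```sql
+select
+    CustomerID,
+    OrderDate,
+    sum(TotalDue) over (
+        partition by CustomerID
+        order by OrderDate
+        rows between unbounded preceding and current row
+    ) as RunningTotal
+from Sales.SalesOrderHeader
+where CustomerID in (11176, 11091, 11287)
+```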
diff --git a/docs/challenging-sql-problems/problems/silver/funnel-analytics.md b/docs/challenging-sql-problems/problems/silver/funnel-analytics.md
new file mode 100644
index 0000000..a7290f5
--- /dev/null
+++ b/docs/challenging-sql-problems/problems/silver/funnel-analytics.md
@@ -0,0 +1,68 @@
+# Funnel analytics ⏬
+
+> [!QUESTION]
+>
+> A prominent UK bank is interested in understanding the conversion rates of their mortgage application process.
+>
+> Their application funnel consists of the following stages, in order:
+>
+> 1. full application
+> 2. decision
+> 3. documentation
+> 4. valuation inspection
+> 5. valuation made
+> 6. valuation submitted
+> 7. solicitation
+> 8. funds released
+>
+> The table `applications` tracks the dates that each mortgage application reached each stage.
+>
+> Write a query that calculates the conversion rates between each stage for each cohort (defined below).
+>
+> The output should have a row per cohort and stage, with the columns:
+>
+> - `cohort` as the month that the applications were started; e.g., an application started on `2024-01-15` would be cohort `2024-01-01`.
+> - `stage`
+> - `mortgages` as the number of mortgages that reached the stage
+> - `step_rate` as the percentage of mortgages that reached the stage compared to the previous stage
+> - `total_rate` as the percentage of mortgages that reached the stage compared to the first stage
+>
+> Note that each cohort should have _all_ the stages, even if there are no mortgages that reached that stage -- the `mortgages` column should be `0` in that case.
+>
+> The output should be ordered by `cohort` and _the `stage` order_ (e.g. `full application` should come before `decision`, and so on).
+
+
+Expand for the DDL
+--8<-- "docs/challenging-sql-problems/problems/silver/funnel-analytics.sql"
+
+
+The solution can be found at:
+
+- [funnel-analytics.md](../../solutions/silver/funnel-analytics.md)
+
+---
+
+
+>? INFO: **Sample output**
+>
+> | cohort | stage | mortgages | step_rate | total_rate |
+> |:-----------|:---------------------|----------:|----------:|-----------:|
+> | 2024-01-01 | full application | 4 | 100.00 | 100.00 |
+> | 2024-01-01 | decision | 4 | 100.00 | 100.00 |
+> | 2024-01-01 | documentation | 3 | 75.00 | 75.00 |
+> | 2024-01-01 | valuation inspection | 3 | 100.00 | 75.00 |
+> | 2024-01-01 | valuation made | 3 | 100.00 | 75.00 |
+> | 2024-01-01 | valuation submitted | 3 | 100.00 | 75.00 |
+> | 2024-01-01 | solicitation | 1 | 33.33 | 25.00 |
+> | 2024-01-01 | funds released | 1 | 100.00 | 25.00 |
+> | ... | ... | ... | ... | ... |
+
+
+>? TIP: **Hint 1**
+>
+> Determine each row's cohort before calculating the rates.
+
+
+>? TIP: **Hint 2**
+>
+> Use [window functions](../../../from-excel-to-sql/main-concepts/window-functions.md) to compare the current row with the historic rows.
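+
+
+As a sketch of **Hint 2** (DuckDB/PostgreSQL syntax), assuming you've already built a hypothetical `cohort_stage_counts` set with one row per cohort and stage, plus a `stage_number` column for the stage order, `lag` and `first_value` give you the two denominators for the rates:
+
+```sql
+select
+    cohort,
+    stage,
+    mortgages,
+    lag(mortgages) over cohort_stages as previous_stage_mortgages,  /* denominator for `step_rate` */
+    first_value(mortgages) over cohort_stages as first_stage_mortgages  /* denominator for `total_rate` */
+from cohort_stage_counts
+window cohort_stages as (
+    partition by cohort
+    order by stage_number
+)
+```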
diff --git a/docs/challenging-sql-problems/problems/silver/funnel-analytics.sql b/docs/challenging-sql-problems/problems/silver/funnel-analytics.sql
new file mode 100644
index 0000000..b4924d5
--- /dev/null
+++ b/docs/challenging-sql-problems/problems/silver/funnel-analytics.sql
@@ -0,0 +1,77 @@
+```sql
+create table applications (
+ event_id int primary key,
+ event_date date not null,
+ mortgage_id int not null,
+ stage varchar not null
+);
+insert into applications
+values
+ (1, '2024-01-02', 1, 'full application'),
+ (2, '2024-01-06', 1, 'decision'),
+ (3, '2024-01-12', 1, 'documentation'),
+ (4, '2024-01-14', 1, 'valuation inspection'),
+ (5, '2024-01-16', 2, 'full application'),
+ (6, '2024-01-17', 3, 'full application'),
+ (7, '2024-01-19', 2, 'decision'),
+ (8, '2024-01-25', 2, 'documentation'),
+ (9, '2024-01-27', 1, 'valuation made'),
+ (10, '2024-01-27', 4, 'full application'),
+ (11, '2024-01-29', 3, 'decision'),
+ (12, '2024-02-02', 1, 'valuation submitted'),
+ (13, '2024-02-03', 4, 'decision'),
+ (14, '2024-02-04', 5, 'full application'),
+ (15, '2024-02-05', 4, 'documentation'),
+ (16, '2024-02-06', 4, 'valuation inspection'),
+ (17, '2024-02-09', 6, 'full application'),
+ (18, '2024-02-11', 5, 'decision'),
+ (19, '2024-02-11', 7, 'full application'),
+ (20, '2024-02-12', 2, 'valuation inspection'),
+ (21, '2024-02-12', 6, 'decision'),
+ (22, '2024-02-12', 7, 'decision'),
+ (23, '2024-02-13', 2, 'valuation made'),
+ (24, '2024-02-13', 6, 'documentation'),
+ (25, '2024-02-14', 2, 'valuation submitted'),
+ (26, '2024-02-15', 6, 'valuation inspection'),
+ (27, '2024-02-16', 4, 'valuation made'),
+ (28, '2024-02-17', 5, 'documentation'),
+ (29, '2024-02-19', 4, 'valuation submitted'),
+ (30, '2024-02-20', 5, 'valuation inspection'),
+ (31, '2024-02-21', 8, 'full application'),
+ (32, '2024-02-22', 5, 'valuation made'),
+ (33, '2024-02-23', 9, 'full application'),
+ (34, '2024-02-25', 6, 'valuation made'),
+ (35, '2024-02-27', 5, 'valuation submitted'),
+ (36, '2024-02-27', 6, 'valuation submitted'),
+ (37, '2024-02-29', 9, 'decision'),
+ (38, '2024-02-29', 10, 'full application'),
+ (39, '2024-03-01', 9, 'documentation'),
+ (40, '2024-03-02', 8, 'decision'),
+ (41, '2024-03-05', 11, 'full application'),
+ (42, '2024-03-07', 9, 'valuation inspection'),
+ (43, '2024-03-07', 12, 'full application'),
+ (44, '2024-03-08', 9, 'valuation made'),
+ (45, '2024-03-08', 13, 'full application'),
+ (46, '2024-03-10', 10, 'decision'),
+ (47, '2024-03-12', 12, 'decision'),
+ (48, '2024-03-15', 10, 'documentation'),
+ (49, '2024-03-15', 13, 'decision'),
+ (50, '2024-03-16', 11, 'decision'),
+ (51, '2024-03-17', 13, 'documentation'),
+ (52, '2024-03-18', 9, 'valuation submitted'),
+ (53, '2024-03-18', 10, 'valuation inspection'),
+ (54, '2024-03-20', 13, 'valuation inspection'),
+ (55, '2024-03-21', 10, 'valuation made'),
+ (56, '2024-03-22', 13, 'valuation made'),
+ (57, '2024-03-27', 10, 'valuation submitted'),
+ (58, '2024-03-28', 13, 'valuation submitted'),
+ (59, '2024-04-12', 6, 'solicitation'),
+ (60, '2024-04-17', 6, 'funds released'),
+ (61, '2024-04-26', 1, 'solicitation'),
+ (62, '2024-05-02', 1, 'funds released'),
+ (63, '2024-05-17', 5, 'solicitation'),
+ (64, '2024-05-28', 5, 'funds released'),
+ (65, '2024-06-03', 9, 'solicitation'),
+ (66, '2024-06-04', 9, 'funds released')
+;
+```
diff --git a/docs/challenging-sql-problems/problems/silver/suspicious-login-activity.md b/docs/challenging-sql-problems/problems/silver/suspicious-login-activity.md
new file mode 100644
index 0000000..473c018
--- /dev/null
+++ b/docs/challenging-sql-problems/problems/silver/suspicious-login-activity.md
@@ -0,0 +1,47 @@
+# Suspicious login activity 🤔
+
+> [!QUESTION]
+>
+> The `events` table captures `login`, `logout`, and `login failed` events for users.
+>
+> Consecutive failed login attempts are considered suspicious once they reach a certain threshold.
+>
+> For the events below, identify the users who have `login failed` events at least five times in a row.
+>
+> Keep only the user ID and their _greatest_ number of consecutive failed login attempts.
+>
+> The output should have the columns:
+>
+> - `user_id`
+> - `consecutive_failures` as the greatest number of consecutive failed login attempts for the user
+>
+> Order by `user_id`.
+
+
+Expand for the DDL
+--8<-- "docs/challenging-sql-problems/problems/silver/suspicious-login-activity.sql"
+
+
+The solution can be found at:
+
+- [suspicious-login-activity.md](../../solutions/silver/suspicious-login-activity.md)
+
+---
+
+
+>? INFO: **Sample output**
+>
+> | user_id | consecutive_failures |
+> |--------:|---------------------:|
+> | 1 | 5 |
+> | ... | ... |
+
+
+>? TIP: **Hint 1**
+>
+> This is a typical "gaps and islands" problem.
+
+
+>? TIP: **Hint 2**
+>
+> Use the difference between two `ROW_NUMBER()` functions to create a group for each user and event type: partition on `user_id` for one and on both `user_id` and `event_type` for the other, ordering both by `event_id`.
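+
+
+Concretely, the difference between the two `ROW_NUMBER()`s is constant within each unbroken run of the same event type, so it acts as a group identifier. A sketch of just that step:
+
+```sql
+select
+    user_id,
+    event_type,
+    row_number() over (partition by user_id order by event_id)
+        - row_number() over (partition by user_id, event_type order by event_id)
+        as streak_id  /* same value for consecutive events of one type */
+from events
+```
+
+Grouping by `user_id`, `event_type`, and `streak_id` then gives you the length of each run of `login failed` events.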
diff --git a/docs/challenging-sql-problems/problems/silver/suspicious-login-activity.sql b/docs/challenging-sql-problems/problems/silver/suspicious-login-activity.sql
new file mode 100644
index 0000000..2c1326a
--- /dev/null
+++ b/docs/challenging-sql-problems/problems/silver/suspicious-login-activity.sql
@@ -0,0 +1,67 @@
+```sql
+create table events (
+ event_id integer primary key,
+ user_id integer not null,
+ event_datetime timestamp not null,
+ event_type varchar not null
+);
+insert into events
+values
+ (1, 1, '2024-01-01 11:00:00', 'login'),
+ (2, 1, '2024-01-01 12:00:00', 'logout'),
+ (3, 1, '2024-01-03 03:00:00', 'login failed'),
+ (4, 1, '2024-01-03 03:05:00', 'login'),
+ (5, 2, '2024-01-03 10:00:00', 'login'),
+ (6, 2, '2024-01-03 15:00:00', 'logout'),
+ (7, 1, '2024-01-03 23:00:00', 'logout'),
+ (8, 2, '2024-01-04 22:00:00', 'login failed'),
+ (9, 2, '2024-01-04 22:05:00', 'login'),
+ (10, 3, '2024-01-05 20:00:00', 'login'),
+ (11, 3, '2024-01-06 04:00:00', 'logout'),
+ (12, 2, '2024-01-09 15:00:00', 'logout'),
+ (13, 3, '2024-01-11 21:00:00', 'login'),
+ (14, 1, '2024-01-12 12:00:00', 'login failed'),
+ (15, 1, '2024-01-12 13:00:00', 'login failed'),
+ (16, 1, '2024-01-13 03:00:00', 'login failed'),
+ (17, 2, '2024-01-13 10:00:00', 'login failed'),
+ (18, 2, '2024-01-13 10:05:00', 'login'),
+ (19, 2, '2024-01-13 15:00:00', 'logout'),
+ (20, 1, '2024-01-13 23:00:00', 'login failed'),
+ (21, 2, '2024-01-14 22:00:00', 'login'),
+ (22, 3, '2024-01-15 20:00:00', 'login'),
+ (23, 3, '2024-01-16 04:00:00', 'logout'),
+ (24, 2, '2024-01-19 15:00:00', 'logout'),
+ (25, 3, '2024-01-21 21:00:00', 'login'),
+ (26, 1, '2024-01-22 12:00:00', 'login failed'),
+ (27, 1, '2024-01-22 12:05:00', 'password reset'),
+ (28, 1, '2024-01-22 12:10:00', 'login'),
+ (29, 1, '2024-01-22 13:00:00', 'logout'),
+ (30, 1, '2024-01-23 03:00:00', 'login'),
+ (31, 2, '2024-01-23 10:00:00', 'login'),
+ (32, 2, '2024-01-23 15:00:00', 'logout'),
+ (33, 1, '2024-01-23 23:00:00', 'logout'),
+ (34, 2, '2024-01-24 22:00:00', 'login'),
+ (35, 3, '2024-01-25 20:00:00', 'login'),
+ (36, 3, '2024-01-26 04:00:00', 'logout'),
+ (37, 2, '2024-01-29 15:00:00', 'logout'),
+ (38, 3, '2024-01-31 21:00:00', 'login failed'),
+ (39, 3, '2024-01-31 21:01:00', 'login failed'),
+ (40, 3, '2024-01-31 21:02:00', 'login failed'),
+ (41, 3, '2024-01-31 21:03:00', 'login failed'),
+ (42, 3, '2024-01-31 21:04:00', 'login failed'),
+ (43, 3, '2024-01-31 21:05:00', 'password reset'),
+ (44, 3, '2024-01-31 21:06:00', 'password reset'),
+ (45, 3, '2024-01-31 21:07:00', 'password reset'),
+ (46, 3, '2024-01-31 21:08:00', 'password reset'),
+ (47, 3, '2024-01-31 21:09:00', 'password reset'),
+ (48, 3, '2024-01-31 21:10:00', 'password reset'),
+ (49, 3, '2024-01-31 21:11:00', 'login failed'),
+ (50, 3, '2024-01-31 21:12:00', 'login failed'),
+ (51, 3, '2024-01-31 21:13:00', 'login failed'),
+ (52, 3, '2024-01-31 21:14:00', 'login failed'),
+ (53, 3, '2024-01-31 21:15:00', 'login failed'),
+ (54, 3, '2024-01-31 21:16:00', 'login failed'),
+ (55, 3, '2024-01-31 21:17:00', 'login failed'),
+ (56, 3, '2024-01-31 21:18:00', 'login failed')
+;
+```
diff --git a/docs/challenging-sql-problems/solutions/bronze/fibonacci-sequence--duckdb.sql b/docs/challenging-sql-problems/solutions/bronze/fibonacci-sequence--duckdb.sql
new file mode 100644
index 0000000..cff686d
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/bronze/fibonacci-sequence--duckdb.sql
@@ -0,0 +1,16 @@
+```sql
+with recursive fibonacci(n, f_n, f_m) as (
+ select 1, 1, 0
+ union all
+ select
+ n + 1,
+ f_n + f_m,
+ f_n
+ from fibonacci
+ where n < 45
+)
+
+select n, f_n
+from fibonacci
+order by n
+```
diff --git a/docs/challenging-sql-problems/solutions/bronze/fibonacci-sequence--sql-server.sql b/docs/challenging-sql-problems/solutions/bronze/fibonacci-sequence--sql-server.sql
new file mode 100644
index 0000000..c324647
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/bronze/fibonacci-sequence--sql-server.sql
@@ -0,0 +1,16 @@
+```sql
+with fibonacci(n, f_n, f_m) as (
+ select 1, 1, 0
+ union all
+ select
+ n + 1,
+ f_n + f_m,
+ f_n
+ from fibonacci
+ where n < 45
+)
+
+select n, f_n
+from fibonacci
+order by n
+```
diff --git a/docs/challenging-sql-problems/solutions/bronze/fibonacci-sequence.md b/docs/challenging-sql-problems/solutions/bronze/fibonacci-sequence.md
new file mode 100644
index 0000000..920944d
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/bronze/fibonacci-sequence.md
@@ -0,0 +1,78 @@
+# Fibonacci sequence 🔢
+
+> [!TIP]
+>
+> Solution to the following problem:
+>
+> - [fibonacci-sequence.md](../../problems/bronze/fibonacci-sequence.md)
+
+## Result Set
+
+Regardless of the database, the result set should look like:
+
+| n | f_n |
+| --: | ---------: |
+| 1 | 1 |
+| 2 | 1 |
+| 3 | 2 |
+| 4 | 3 |
+| 5 | 5 |
+| 6 | 8 |
+| 7 | 13 |
+| 8 | 21 |
+| 9 | 34 |
+| 10 | 55 |
+| 11 | 89 |
+| 12 | 144 |
+| 13 | 233 |
+| 14 | 377 |
+| 15 | 610 |
+| 16 | 987 |
+| 17 | 1597 |
+| 18 | 2584 |
+| 19 | 4181 |
+| 20 | 6765 |
+| 21 | 10946 |
+| 22 | 17711 |
+| 23 | 28657 |
+| 24 | 46368 |
+| 25 | 75025 |
+| 26 | 121393 |
+| 27 | 196418 |
+| 28 | 317811 |
+| 29 | 514229 |
+| 30 | 832040 |
+| 31 | 1346269 |
+| 32 | 2178309 |
+| 33 | 3524578 |
+| 34 | 5702887 |
+| 35 | 9227465 |
+| 36 | 14930352 |
+| 37 | 24157817 |
+| 38 | 39088169 |
+| 39 | 63245986 |
+| 40 | 102334155 |
+| 41 | 165580141 |
+| 42 | 267914296 |
+| 43 | 433494437 |
+| 44 | 701408733 |
+| 45 | 1134903170 |
+
+
+Expand for the DDL
+--8<-- "docs/challenging-sql-problems/solutions/bronze/fibonacci-sequence.sql"
+
+
+## Solution
+
+Some SQL solutions per database are provided below.
+
+
+> SUCCESS: **DuckDB, SQLite, PostgreSQL, Snowflake**
+>
+--8<-- "docs/challenging-sql-problems/solutions/bronze/fibonacci-sequence--duckdb.sql"
+
+
+> SUCCESS: **SQL Server**
+>
+--8<-- "docs/challenging-sql-problems/solutions/bronze/fibonacci-sequence--sql-server.sql"
diff --git a/docs/challenging-sql-problems/solutions/bronze/fibonacci-sequence.sql b/docs/challenging-sql-problems/solutions/bronze/fibonacci-sequence.sql
new file mode 100644
index 0000000..72e452a
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/bronze/fibonacci-sequence.sql
@@ -0,0 +1,50 @@
+```sql
+select *
+from values
+ (1, 1),
+ (2, 1),
+ (3, 2),
+ (4, 3),
+ (5, 5),
+ (6, 8),
+ (7, 13),
+ (8, 21),
+ (9, 34),
+ (10, 55),
+ (11, 89),
+ (12, 144),
+ (13, 233),
+ (14, 377),
+ (15, 610),
+ (16, 987),
+ (17, 1597),
+ (18, 2584),
+ (19, 4181),
+ (20, 6765),
+ (21, 10946),
+ (22, 17711),
+ (23, 28657),
+ (24, 46368),
+ (25, 75025),
+ (26, 121393),
+ (27, 196418),
+ (28, 317811),
+ (29, 514229),
+ (30, 832040),
+ (31, 1346269),
+ (32, 2178309),
+ (33, 3524578),
+ (34, 5702887),
+ (35, 9227465),
+ (36, 14930352),
+ (37, 24157817),
+ (38, 39088169),
+ (39, 63245986),
+ (40, 102334155),
+ (41, 165580141),
+ (42, 267914296),
+ (43, 433494437),
+ (44, 701408733),
+ (45, 1134903170)
+as solution(n, f_n)
+```
diff --git a/docs/challenging-sql-problems/solutions/bronze/temperature-anomaly-detection--duckdb.sql b/docs/challenging-sql-problems/solutions/bronze/temperature-anomaly-detection--duckdb.sql
new file mode 100644
index 0000000..0ac61b0
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/bronze/temperature-anomaly-detection--duckdb.sql
@@ -0,0 +1,32 @@
+```sql
+with temperatures as (
+ select
+ site_id,
+ reading_datetime,
+ temperature,
+ avg(temperature) over rows_around_site_reading as average_temperature,
+ count(temperature) over rows_around_site_reading as count_of_rows
+ from readings
+ window rows_around_site_reading as (
+ partition by site_id
+ order by reading_datetime
+ rows between 2 preceding
+ and 2 following
+ exclude current row
+ )
+)
+
+select
+ site_id,
+ reading_datetime,
+ temperature,
+ round(average_temperature, 4) as average_temperature,
+ round(100.0 * (temperature - average_temperature) / average_temperature, 4) as percentage_increase
+from temperatures
+where 1=1
+ and count_of_rows = 4
+ and (temperature - average_temperature) / average_temperature > 0.1
+order by
+ site_id,
+ reading_datetime
+```
diff --git a/docs/challenging-sql-problems/solutions/bronze/temperature-anomaly-detection--sql-server.sql b/docs/challenging-sql-problems/solutions/bronze/temperature-anomaly-detection--sql-server.sql
new file mode 100644
index 0000000..d81d9d8
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/bronze/temperature-anomaly-detection--sql-server.sql
@@ -0,0 +1,42 @@
+```sql
+with temps as (
+ select
+ site_id,
+ reading_datetime,
+ temperature,
+ (
+ avg(temperature) over rows_before_site_reading
+ + avg(temperature) over rows_after_site_reading
+ ) / 2 as average_temperature,
+ (
+ count(temperature) over rows_before_site_reading
+ + count(temperature) over rows_after_site_reading
+ ) as count_of_rows
+ from readings
+ window
+ rows_before_site_reading as (
+ partition by site_id
+ order by reading_datetime
+ rows between 2 preceding and 1 preceding
+ ),
+ rows_after_site_reading as (
+ partition by site_id
+ order by reading_datetime
+ rows between 1 following and 2 following
+ )
+)
+
+select
+ site_id,
+ reading_datetime,
+ temperature,
+ round(average_temperature, 4) as average_temperature,
+ round(100.0 * (temperature - average_temperature) / average_temperature, 4) as percentage_increase
+from temps
+where 1=1
+ and count_of_rows = 4
+ and (temperature - average_temperature) / average_temperature > 0.1
+order by
+ site_id,
+ reading_datetime
+```
diff --git a/docs/challenging-sql-problems/solutions/bronze/temperature-anomaly-detection.md b/docs/challenging-sql-problems/solutions/bronze/temperature-anomaly-detection.md
new file mode 100644
index 0000000..eb8e0de
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/bronze/temperature-anomaly-detection.md
@@ -0,0 +1,36 @@
+# Temperature anomaly detection 🔍
+
+> [!TIP]
+>
+> Solution to the following problem:
+>
+> - [temperature-anomaly-detection.md](../../problems/bronze/temperature-anomaly-detection.md)
+
+## Result Set
+
+Regardless of the database, the result set should look like:
+
+| site_id | reading_datetime | temperature | average_temperature | percentage_increase |
+| ------: | :------------------ | ----------: | ------------------: | ------------------: |
+| 1 | 2021-01-02 02:01:17 | 22.43 | 20.0525 | 11.8564 |
+| 1 | 2021-01-04 21:45:34 | 22.69 | 20.2700 | 11.9388 |
+| 2 | 2021-01-02 01:59:43 | 23.10 | 20.6050 | 12.1087 |
+
+
+Expand for the DDL
+--8<-- "docs/challenging-sql-problems/solutions/bronze/temperature-anomaly-detection.sql"
+
+
+## Solution
+
+Some SQL solutions per database are provided below.
+
+
+> SUCCESS: **DuckDB, SQLite, PostgreSQL**
+>
+--8<-- "docs/challenging-sql-problems/solutions/bronze/temperature-anomaly-detection--duckdb.sql"
+
+
+> SUCCESS: **SQL Server**
+>
+--8<-- "docs/challenging-sql-problems/solutions/bronze/temperature-anomaly-detection--sql-server.sql"
diff --git a/docs/challenging-sql-problems/solutions/bronze/temperature-anomaly-detection.sql b/docs/challenging-sql-problems/solutions/bronze/temperature-anomaly-detection.sql
new file mode 100644
index 0000000..553fe3d
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/bronze/temperature-anomaly-detection.sql
@@ -0,0 +1,8 @@
+```sql
+select *
+from values
+ (1, '2021-01-02 02:01:17', 22.43, 20.0525, 11.8564),
+ (1, '2021-01-04 21:45:34', 22.69, 20.2700, 11.9388),
+ (2, '2021-01-02 01:59:43', 23.10, 20.6050, 12.1087)
+as solution(site_id, reading_datetime, temperature, average_temperature, percentage_increase)
+```
diff --git a/docs/challenging-sql-problems/solutions/bronze/uk-bank-holidays--duckdb.sql b/docs/challenging-sql-problems/solutions/bronze/uk-bank-holidays--duckdb.sql
new file mode 100644
index 0000000..314b22a
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/bronze/uk-bank-holidays--duckdb.sql
@@ -0,0 +1,12 @@
+```sql
+select
+ division,
+ unnest(events.events, recursive:=true)
+from (
+ unpivot 'https://www.gov.uk/bank-holidays.json'
+ on "england-and-wales", "scotland", "northern-ireland"
+ into
+ name division
+ value events
+)
+```
diff --git a/docs/challenging-sql-problems/solutions/bronze/uk-bank-holidays.md b/docs/challenging-sql-problems/solutions/bronze/uk-bank-holidays.md
new file mode 100644
index 0000000..1fe9e20
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/bronze/uk-bank-holidays.md
@@ -0,0 +1,38 @@
+# UK bank holidays 📅
+
+> [!TIP]
+>
+> Solution to the following problem:
+>
+> - [uk-bank-holidays.md](../../problems/bronze/uk-bank-holidays.md)
+
+## Result Set
+
+A reduced result set (as at 2024-04-13) should look like:
+
+| division | title | date | notes | bunting |
+| :---------------- | :--------------------- | :--------- | :---- | :------ |
+| england-and-wales | New Year’s Day | 2018-01-01 | | true |
+| england-and-wales | Good Friday | 2018-03-30 | | false |
+| england-and-wales | Easter Monday | 2018-04-02 | | true |
+| england-and-wales | Early May bank holiday | 2018-05-07 | | true |
+| england-and-wales | Spring bank holiday | 2018-05-28 | | true |
+| england-and-wales | Summer bank holiday | 2018-08-27 | | true |
+| england-and-wales | Christmas Day | 2018-12-25 | | true |
+| england-and-wales | Boxing Day | 2018-12-26 | | true |
+| england-and-wales | New Year’s Day | 2019-01-01 | | true |
+| ... | ... | ... | ... | ... |
+
+
+Expand for the DDL
+--8<-- "docs/challenging-sql-problems/solutions/bronze/uk-bank-holidays.sql"
+
+
+## Solution
+
+The solution for DuckDB is provided below.
+
+
+> SUCCESS: **DuckDB**
+>
+--8<-- "docs/challenging-sql-problems/solutions/bronze/uk-bank-holidays--duckdb.sql"
diff --git a/docs/challenging-sql-problems/solutions/bronze/uk-bank-holidays.sql b/docs/challenging-sql-problems/solutions/bronze/uk-bank-holidays.sql
new file mode 100644
index 0000000..849be45
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/bronze/uk-bank-holidays.sql
@@ -0,0 +1,14 @@
+```sql
+select *
+from values
+ ('england-and-wales', 'New Year’s Day', '2018-01-01', '', true),
+ ('england-and-wales', 'Good Friday', '2018-03-30', '', false),
+ ('england-and-wales', 'Easter Monday', '2018-04-02', '', true),
+ ('england-and-wales', 'Early May bank holiday', '2018-05-07', '', true),
+ ('england-and-wales', 'Spring bank holiday', '2018-05-28', '', true),
+ ('england-and-wales', 'Summer bank holiday', '2018-08-27', '', true),
+ ('england-and-wales', 'Christmas Day', '2018-12-25', '', true),
+ ('england-and-wales', 'Boxing Day', '2018-12-26', '', true),
+ ('england-and-wales', 'New Year’s Day', '2019-01-01', '', true)
+as solution(division, title, date, notes, bunting)
+```
diff --git a/docs/challenging-sql-problems/solutions/gold/loan-repayment-schedule--duckdb.sql b/docs/challenging-sql-problems/solutions/gold/loan-repayment-schedule--duckdb.sql
new file mode 100644
index 0000000..89e8e52
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/gold/loan-repayment-schedule--duckdb.sql
@@ -0,0 +1,67 @@
+```sql
+with recursive
+
+monthly_repayment_value as (
+ select
+ loan_id,
+ power(1 + interest_rate, repayments) as amortised_rate,
+ round(
+ (loan_value * interest_rate * amortised_rate) / (amortised_rate - 1),
+ 2
+ ) as monthly_repayment
+ from loans
+),
+
+schedule as (
+ select
+ /* loan details */
+ loans.loan_id,
+ loans.interest_rate,
+ loans.repayments,
+ monthly_repayment_value.monthly_repayment,
+
+ /* repayment details */
+ 0 as repayment_number,
+ loans.start_date as repayment_date,
+ 0::decimal(10, 2) as starting_balance,
+ 0::decimal(10, 2) as interest,
+ 0::decimal(10, 2) as principal,
+ 0::decimal(10, 2) as total,
+ loans.loan_value as remaining_balance
+ from loans
+ inner join monthly_repayment_value
+ using (loan_id)
+ union all
+ select
+ loan_id,
+ interest_rate,
+ repayments,
+ monthly_repayment,
+
+ repayment_number + 1,
+ repayment_date + interval '1 month',
+ remaining_balance,
+ round(remaining_balance * interest_rate, 2) as interest_,
+ monthly_repayment - interest_ as principal_,
+ monthly_repayment,
+ remaining_balance - principal_
+ from schedule
+ where repayment_number < repayments
+)
+
+select
+ loan_id,
+ repayment_number,
+ repayment_date,
+ interest,
+
+ /* adjust the final repayment with the rounding error */
+ if(repayment_number = repayments, starting_balance, principal) as principal,
+ if(repayment_number = repayments, starting_balance + interest, total) as total,
+ if(repayment_number = repayments, 0, remaining_balance) as balance
+from schedule
+where repayment_number > 0
+order by
+ loan_id,
+ repayment_number
+```
diff --git a/docs/challenging-sql-problems/solutions/gold/loan-repayment-schedule.md b/docs/challenging-sql-problems/solutions/gold/loan-repayment-schedule.md
new file mode 100644
index 0000000..d3f6d77
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/gold/loan-repayment-schedule.md
@@ -0,0 +1,70 @@
+# Loan repayment schedule 💰
+
+> [!TIP]
+>
+> Solution to the following problem:
+>
+> - [loan-repayment-schedule.md](../../problems/gold/loan-repayment-schedule.md)
+
+## Result Set
+
+Regardless of the database, the result set should look like:
+
+| loan_id | repayment_number | repayment_date | interest | principal | total | balance |
+| ------: | ---------------: | :------------- | -------: | --------: | -------: | -------: |
+| 1 | 1 | 2024-02-01 | 1600.00 | 12682.06 | 14282.06 | 67317.94 |
+| 1 | 2 | 2024-03-01 | 1346.36 | 12935.70 | 14282.06 | 54382.24 |
+| 1 | 3 | 2024-04-01 | 1087.64 | 13194.42 | 14282.06 | 41187.82 |
+| 1 | 4 | 2024-05-01 | 823.76 | 13458.30 | 14282.06 | 27729.52 |
+| 1 | 5 | 2024-06-01 | 554.59 | 13727.47 | 14282.06 | 14002.05 |
+| 1 | 6 | 2024-07-01 | 280.04 | 14002.05 | 14282.09 | 0.00 |
+| 2 | 1 | 2024-02-02 | 1125.00 | 5751.00 | 6876.00 | 69249.00 |
+| 2 | 2 | 2024-03-02 | 1038.74 | 5837.26 | 6876.00 | 63411.74 |
+| 2 | 3 | 2024-04-02 | 951.18 | 5924.82 | 6876.00 | 57486.92 |
+| 2 | 4 | 2024-05-02 | 862.30 | 6013.70 | 6876.00 | 51473.22 |
+| 2 | 5 | 2024-06-02 | 772.10 | 6103.90 | 6876.00 | 45369.32 |
+| 2 | 6 | 2024-07-02 | 680.54 | 6195.46 | 6876.00 | 39173.86 |
+| 2 | 7 | 2024-08-02 | 587.61 | 6288.39 | 6876.00 | 32885.47 |
+| 2 | 8 | 2024-09-02 | 493.28 | 6382.72 | 6876.00 | 26502.75 |
+| 2 | 9 | 2024-10-02 | 397.54 | 6478.46 | 6876.00 | 20024.29 |
+| 2 | 10 | 2024-11-02 | 300.36 | 6575.64 | 6876.00 | 13448.65 |
+| 2 | 11 | 2024-12-02 | 201.73 | 6674.27 | 6876.00 | 6774.38 |
+| 2 | 12 | 2025-01-02 | 101.62 | 6774.38 | 6876.00 | 0.00 |
+| 3 | 1 | 2024-02-03 | 1000.00 | 3707.35 | 4707.35 | 96292.65 |
+| 3 | 2 | 2024-03-03 | 962.93 | 3744.42 | 4707.35 | 92548.23 |
+| 3 | 3 | 2024-04-03 | 925.48 | 3781.87 | 4707.35 | 88766.36 |
+| 3 | 4 | 2024-05-03 | 887.66 | 3819.69 | 4707.35 | 84946.67 |
+| 3 | 5 | 2024-06-03 | 849.47 | 3857.88 | 4707.35 | 81088.79 |
+| 3 | 6 | 2024-07-03 | 810.89 | 3896.46 | 4707.35 | 77192.33 |
+| 3 | 7 | 2024-08-03 | 771.92 | 3935.43 | 4707.35 | 73256.90 |
+| 3 | 8 | 2024-09-03 | 732.57 | 3974.78 | 4707.35 | 69282.12 |
+| 3 | 9 | 2024-10-03 | 692.82 | 4014.53 | 4707.35 | 65267.59 |
+| 3 | 10 | 2024-11-03 | 652.68 | 4054.67 | 4707.35 | 61212.92 |
+| 3 | 11 | 2024-12-03 | 612.13 | 4095.22 | 4707.35 | 57117.70 |
+| 3 | 12 | 2025-01-03 | 571.18 | 4136.17 | 4707.35 | 52981.53 |
+| 3 | 13 | 2025-02-03 | 529.82 | 4177.53 | 4707.35 | 48804.00 |
+| 3 | 14 | 2025-03-03 | 488.04 | 4219.31 | 4707.35 | 44584.69 |
+| 3 | 15 | 2025-04-03 | 445.85 | 4261.50 | 4707.35 | 40323.19 |
+| 3 | 16 | 2025-05-03 | 403.23 | 4304.12 | 4707.35 | 36019.07 |
+| 3 | 17 | 2025-06-03 | 360.19 | 4347.16 | 4707.35 | 31671.91 |
+| 3 | 18 | 2025-07-03 | 316.72 | 4390.63 | 4707.35 | 27281.28 |
+| 3 | 19 | 2025-08-03 | 272.81 | 4434.54 | 4707.35 | 22846.74 |
+| 3 | 20 | 2025-09-03 | 228.47 | 4478.88 | 4707.35 | 18367.86 |
+| 3 | 21 | 2025-10-03 | 183.68 | 4523.67 | 4707.35 | 13844.19 |
+| 3 | 22 | 2025-11-03 | 138.44 | 4568.91 | 4707.35 | 9275.28 |
+| 3 | 23 | 2025-12-03 | 92.75 | 4614.60 | 4707.35 | 4660.68 |
+| 3 | 24 | 2026-01-03 | 46.61 | 4660.68 | 4707.29 | 0.00 |
+
+
+Expand for the DDL
+--8<-- "docs/challenging-sql-problems/solutions/gold/loan-repayment-schedule.sql"
+
+
+## Solution
+
+Some SQL solutions per database are provided below.
+
+
+> SUCCESS: **DuckDB**
+>
+--8<-- "docs/challenging-sql-problems/solutions/gold/loan-repayment-schedule--duckdb.sql"
diff --git a/docs/challenging-sql-problems/solutions/gold/loan-repayment-schedule.sql b/docs/challenging-sql-problems/solutions/gold/loan-repayment-schedule.sql
new file mode 100644
index 0000000..c2987ab
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/gold/loan-repayment-schedule.sql
@@ -0,0 +1,47 @@
+```sql
+select *
+from values
+ (1, 1, '2024-02-01', 1600.00, 12682.06, 14282.06, 67317.94),
+ (1, 2, '2024-03-01', 1346.36, 12935.70, 14282.06, 54382.24),
+ (1, 3, '2024-04-01', 1087.64, 13194.42, 14282.06, 41187.82),
+ (1, 4, '2024-05-01', 823.76, 13458.30, 14282.06, 27729.52),
+ (1, 5, '2024-06-01', 554.59, 13727.47, 14282.06, 14002.05),
+ (1, 6, '2024-07-01', 280.04, 14002.05, 14282.09, 0.00),
+ (2, 1, '2024-02-02', 1125.00, 5751.00, 6876.00, 69249.00),
+ (2, 2, '2024-03-02', 1038.74, 5837.26, 6876.00, 63411.74),
+ (2, 3, '2024-04-02', 951.18, 5924.82, 6876.00, 57486.92),
+ (2, 4, '2024-05-02', 862.30, 6013.70, 6876.00, 51473.22),
+ (2, 5, '2024-06-02', 772.10, 6103.90, 6876.00, 45369.32),
+ (2, 6, '2024-07-02', 680.54, 6195.46, 6876.00, 39173.86),
+ (2, 7, '2024-08-02', 587.61, 6288.39, 6876.00, 32885.47),
+ (2, 8, '2024-09-02', 493.28, 6382.72, 6876.00, 26502.75),
+ (2, 9, '2024-10-02', 397.54, 6478.46, 6876.00, 20024.29),
+ (2, 10, '2024-11-02', 300.36, 6575.64, 6876.00, 13448.65),
+ (2, 11, '2024-12-02', 201.73, 6674.27, 6876.00, 6774.38),
+ (2, 12, '2025-01-02', 101.62, 6774.38, 6876.00, 0.00),
+ (3, 1, '2024-02-03', 1000.00, 3707.35, 4707.35, 96292.65),
+ (3, 2, '2024-03-03', 962.93, 3744.42, 4707.35, 92548.23),
+ (3, 3, '2024-04-03', 925.48, 3781.87, 4707.35, 88766.36),
+ (3, 4, '2024-05-03', 887.66, 3819.69, 4707.35, 84946.67),
+ (3, 5, '2024-06-03', 849.47, 3857.88, 4707.35, 81088.79),
+ (3, 6, '2024-07-03', 810.89, 3896.46, 4707.35, 77192.33),
+ (3, 7, '2024-08-03', 771.92, 3935.43, 4707.35, 73256.90),
+ (3, 8, '2024-09-03', 732.57, 3974.78, 4707.35, 69282.12),
+ (3, 9, '2024-10-03', 692.82, 4014.53, 4707.35, 65267.59),
+ (3, 10, '2024-11-03', 652.68, 4054.67, 4707.35, 61212.92),
+ (3, 11, '2024-12-03', 612.13, 4095.22, 4707.35, 57117.70),
+ (3, 12, '2025-01-03', 571.18, 4136.17, 4707.35, 52981.53),
+ (3, 13, '2025-02-03', 529.82, 4177.53, 4707.35, 48804.00),
+ (3, 14, '2025-03-03', 488.04, 4219.31, 4707.35, 44584.69),
+ (3, 15, '2025-04-03', 445.85, 4261.50, 4707.35, 40323.19),
+ (3, 16, '2025-05-03', 403.23, 4304.12, 4707.35, 36019.07),
+ (3, 17, '2025-06-03', 360.19, 4347.16, 4707.35, 31671.91),
+ (3, 18, '2025-07-03', 316.72, 4390.63, 4707.35, 27281.28),
+ (3, 19, '2025-08-03', 272.81, 4434.54, 4707.35, 22846.74),
+ (3, 20, '2025-09-03', 228.47, 4478.88, 4707.35, 18367.86),
+ (3, 21, '2025-10-03', 183.68, 4523.67, 4707.35, 13844.19),
+ (3, 22, '2025-11-03', 138.44, 4568.91, 4707.35, 9275.28),
+ (3, 23, '2025-12-03', 92.75, 4614.60, 4707.35, 4660.68),
+ (3, 24, '2026-01-03', 46.61, 4660.68, 4707.29, 0.00)
+as solution(loan_id, repayment_number, repayment_date, interest, principal, total, balance)
+```
diff --git a/docs/challenging-sql-problems/solutions/gold/travel-plans--duckdb.sql b/docs/challenging-sql-problems/solutions/gold/travel-plans--duckdb.sql
new file mode 100644
index 0000000..b9acd60
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/gold/travel-plans--duckdb.sql
@@ -0,0 +1,118 @@
+```sql
+/*
+ In DuckDB, we can convert a timestamp with a timezone to a timestamp
+ in UTC by casting it to a varchar and then to a timestamp.
+*/
+with recursive
+
+date_axis as (
+ select
+ /* give us a day either side to be safe */
+ min(departure_datetime::date) - 1 as date_,
+ max(arrival_datetime::date) + 1 as max_date,
+ from routes_timetable
+ union all
+ select
+ date_ + interval '1 day',
+ max_date,
+ from date_axis
+ where date_ < max_date - 1 /* account for final loop */
+),
+
+scheduled_timetable as (
+ select
+ schedule_id,
+ from_location,
+ to_location,
+ cost,
+ duration::varchar::interval as duration,
+ earliest_departure::varchar::time as departure_time_utc,
+ departure_time_utc + duration::varchar::interval as arrival_time_utc,
+ latest_departure::varchar::time as limit_,
+ frequency::varchar::interval as frequency,
+ from routes_schedule
+ union all
+ select
+ schedule_id,
+ from_location,
+ to_location,
+ cost,
+ duration,
+ departure_time_utc + frequency as departure_time_utc,
+ arrival_time_utc + frequency as arrival_time_utc,
+ limit_,
+ frequency,
+ from scheduled_timetable
+ where departure_time_utc <= limit_ - frequency /* account for the final loop */
+ and frequency is not null /* daily schedules don't need to be expanded */
+),
+
+timetable as (
+ select
+ from_location,
+ to_location,
+ departure_datetime::varchar::timestamp as departure_datetime_utc,
+ arrival_datetime::varchar::timestamp as arrival_datetime_utc,
+ arrival_datetime_utc - departure_datetime_utc as duration,
+ cost,
+ from routes_timetable
+ union all
+ select
+ scheduled_timetable.from_location,
+ scheduled_timetable.to_location,
+ scheduled_timetable.departure_time_utc + date_axis.date_,
+ scheduled_timetable.arrival_time_utc + date_axis.date_,
+ scheduled_timetable.duration,
+ scheduled_timetable.cost,
+ from scheduled_timetable
+ cross join date_axis
+),
+
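+/*
+ Walk the network: start from legs leaving New York, then repeatedly
+ append any leg that departs between 30 minutes and 6 hours after
+ the previous leg arrives, accumulating the cost and the route.
+*/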
+routes as (
+ select
+ from_location as starting_location,
+ departure_datetime_utc as starting_departure_datetime_utc,
+
+ from_location,
+ to_location,
+ departure_datetime_utc,
+ arrival_datetime_utc,
+ cost,
+ from_location || ' - ' || to_location as route,
+ from timetable
+ where 1=1
+ and from_location = 'New York'
+ and departure_datetime_utc >= '2024-01-01 12:00:00-05:00'
+ union all
+ select
+ routes.starting_location,
+ routes.starting_departure_datetime_utc,
+
+ timetable.from_location,
+ timetable.to_location,
+ timetable.departure_datetime_utc,
+ timetable.arrival_datetime_utc,
+ routes.cost + timetable.cost,
+ routes.route || ' - ' || timetable.to_location,
+ from routes
+ inner join timetable
+ on routes.to_location = timetable.from_location
+ and timetable.departure_datetime_utc between routes.arrival_datetime_utc + interval '30 minutes'
+ and routes.arrival_datetime_utc + interval '6 hours'
+)
+
+select distinct
+ route,
+ starting_departure_datetime_utc as departure_datetime_utc,
+ arrival_datetime_utc,
+ arrival_datetime_utc - starting_departure_datetime_utc as duration,
+ cost
+from routes
+where to_location = 'Paris'
+qualify 0=1
+ /* fastest route */
+ or 1 = row_number() over (order by duration, cost, route)
+ /* cheapest route */
+ or 1 = row_number() over (order by cost, duration, route)
+order by arrival_datetime_utc
+```
diff --git a/docs/challenging-sql-problems/solutions/gold/travel-plans.md b/docs/challenging-sql-problems/solutions/gold/travel-plans.md
new file mode 100644
index 0000000..582b0e8
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/gold/travel-plans.md
@@ -0,0 +1,32 @@
+# Travel plans π
+
+> [!TIP]
+>
+> Solution to the following problem:
+>
+> - [travel-plans.md](../../problems/gold/travel-plans.md)
+
+## Result Set
+
+Regardless of the database, the result set should look like:
+
+| route | departure_datetime_utc | arrival_datetime_utc | duration | cost |
+| :---------------------------------------------------- | :--------------------- | :------------------- | :------- | -----: |
+| New York - London Gatwick - London St Pancras - Paris | 2024-01-01 18:00:00 | 2024-01-02 14:30:00 | 20:30:00 | 212.00 |
+| New York - Paris | 2024-01-01 23:00:00 | 2024-01-02 16:45:00 | 17:45:00 | 279.00 |
+
+
+>? INFO: **Expand for the DDL**
+>
+--8<-- "docs/challenging-sql-problems/solutions/gold/travel-plans.sql"
+
+
+## Solution
+
+SQL solutions for a selection of databases are provided below.
+
+
+> SUCCESS: **DuckDB**
+>
+> There's probably a better way to do this. Please let me know if you find it!
+>
+--8<-- "docs/challenging-sql-problems/solutions/gold/travel-plans--duckdb.sql"
diff --git a/docs/challenging-sql-problems/solutions/gold/travel-plans.sql b/docs/challenging-sql-problems/solutions/gold/travel-plans.sql
new file mode 100644
index 0000000..8da856b
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/gold/travel-plans.sql
@@ -0,0 +1,7 @@
+```sql
+select *
+from values
+ ('New York - London Gatwick - London St Pancras - Paris', '2024-01-01 18:00:00', '2024-01-02 14:30:00', '20:30:00', 212.00),
+ ('New York - Paris', '2024-01-01 23:00:00', '2024-01-02 16:45:00', '17:45:00', 279.00)
+as solution(route, departure_datetime_utc, arrival_datetime_utc, duration, cost)
+```
diff --git a/docs/challenging-sql-problems/solutions/silver/customer-sales-running-totals--sql-server.sql b/docs/challenging-sql-problems/solutions/silver/customer-sales-running-totals--sql-server.sql
new file mode 100644
index 0000000..e04362f
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/silver/customer-sales-running-totals--sql-server.sql
@@ -0,0 +1,59 @@
+```sql
+with
+
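+/*
+ Recursive date spine covering every day of June 2014: 29 recursive
+ steps, comfortably within SQL Server's default MAXRECURSION of 100.
+*/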
+Dates as (
+ select cast('2014-06-01' as date) as BalanceDate
+ union all
+ select dateadd(day, 1, BalanceDate)
+ from Dates
+ where BalanceDate < '2014-06-30'
+),
+
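+/*
+ Collapse everything ordered before June into a single opening
+ balance dated 2014-05-31, then union the June orders by day.
+*/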
+CustomerSalesUnioned as (
+ select
+ CustomerID,
+ '2014-05-31' as OrderDate,
+ sum(TotalDue) as TotalDue
+ from Sales.SalesOrderHeader
+ where 1=1
+ and CustomerID in (11176, 11091, 11287)
+ and OrderDate < '2014-06-01'
+ group by CustomerID
+ union all
+ select
+ CustomerID,
+ OrderDate,
+ sum(TotalDue) as TotalDue
+ from Sales.SalesOrderHeader
+ where 1=1
+ and CustomerID in (11176, 11091, 11287)
+ and OrderDate between '2014-06-01' and '2014-06-30'
+ group by
+ CustomerID,
+ OrderDate
+),
+
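+/*
+ NextOrderDate is the day before the customer's next order (or
+ 2014-06-30 for their latest order), so each row's RunningTotal is
+ valid from OrderDate through NextOrderDate, which is the range that
+ the BETWEEN join below depends on.
+*/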
+CustomerSales as (
+ select
+ CustomerID,
+ OrderDate,
+ -1 + lead(OrderDate, 1, '2014-07-01') over CustomerByOrderDate as NextOrderDate,
+ sum(TotalDue) over CustomerByOrderDate as RunningTotal
+ from CustomerSalesUnioned
+ window CustomerByOrderDate as (
+ partition by CustomerID
+ order by OrderDate
+ )
+)
+
+select
+ Dates.BalanceDate,
+ CustomerSales.CustomerID,
+ CustomerSales.RunningTotal
+from Dates
+ left join CustomerSales
+ on Dates.BalanceDate between CustomerSales.OrderDate and CustomerSales.NextOrderDate
+order by
+ Dates.BalanceDate,
+ CustomerSales.CustomerID
+```
diff --git a/docs/challenging-sql-problems/solutions/silver/customer-sales-running-totals.md b/docs/challenging-sql-problems/solutions/silver/customer-sales-running-totals.md
new file mode 100644
index 0000000..4a9ca69
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/silver/customer-sales-running-totals.md
@@ -0,0 +1,118 @@
+# Customer sales running totals π
+
+> [!TIP]
+>
+> Solution to the following problem:
+>
+> - [customer-sales-running-totals.md](../../problems/silver/customer-sales-running-totals.md)
+
+## Result Set
+
+The result set should look like:
+
+| BalanceDate | CustomerID | RunningTotal |
+| :---------- | ---------: | -----------: |
+| 2014-06-01 | 11091 | 1243.5234 |
+| 2014-06-01 | 11176 | 1222.8820 |
+| 2014-06-01 | 11287 | 1115.2109 |
+| 2014-06-02 | 11091 | 1243.5234 |
+| 2014-06-02 | 11176 | 1222.8820 |
+| 2014-06-02 | 11287 | 1115.2109 |
+| 2014-06-03 | 11091 | 1243.5234 |
+| 2014-06-03 | 11176 | 1222.8820 |
+| 2014-06-03 | 11287 | 1115.2109 |
+| 2014-06-04 | 11091 | 1243.5234 |
+| 2014-06-04 | 11176 | 1222.8820 |
+| 2014-06-04 | 11287 | 1115.2109 |
+| 2014-06-05 | 11091 | 1243.5234 |
+| 2014-06-05 | 11176 | 1222.8820 |
+| 2014-06-05 | 11287 | 1158.1733 |
+| 2014-06-06 | 11091 | 1243.5234 |
+| 2014-06-06 | 11176 | 1222.8820 |
+| 2014-06-06 | 11287 | 1158.1733 |
+| 2014-06-07 | 11091 | 1282.1874 |
+| 2014-06-07 | 11176 | 1222.8820 |
+| 2014-06-07 | 11287 | 1158.1733 |
+| 2014-06-08 | 11091 | 1282.1874 |
+| 2014-06-08 | 11176 | 1222.8820 |
+| 2014-06-08 | 11287 | 1158.1733 |
+| 2014-06-09 | 11091 | 1282.1874 |
+| 2014-06-09 | 11176 | 1222.8820 |
+| 2014-06-09 | 11287 | 1158.1733 |
+| 2014-06-10 | 11091 | 1314.2103 |
+| 2014-06-10 | 11176 | 1222.8820 |
+| 2014-06-10 | 11287 | 1158.1733 |
+| 2014-06-11 | 11091 | 1314.2103 |
+| 2014-06-11 | 11176 | 1222.8820 |
+| 2014-06-11 | 11287 | 1158.1733 |
+| 2014-06-12 | 11091 | 1314.2103 |
+| 2014-06-12 | 11176 | 1222.8820 |
+| 2014-06-12 | 11287 | 1158.1733 |
+| 2014-06-13 | 11091 | 1314.2103 |
+| 2014-06-13 | 11176 | 1222.8820 |
+| 2014-06-13 | 11287 | 1158.1733 |
+| 2014-06-14 | 11091 | 1314.2103 |
+| 2014-06-14 | 11176 | 1228.3960 |
+| 2014-06-14 | 11287 | 1191.3012 |
+| 2014-06-15 | 11091 | 1314.2103 |
+| 2014-06-15 | 11176 | 1388.0354 |
+| 2014-06-15 | 11287 | 1191.3012 |
+| 2014-06-16 | 11091 | 1314.2103 |
+| 2014-06-16 | 11176 | 1388.0354 |
+| 2014-06-16 | 11287 | 1191.3012 |
+| 2014-06-17 | 11091 | 1314.2103 |
+| 2014-06-17 | 11176 | 1430.9978 |
+| 2014-06-17 | 11287 | 1191.3012 |
+| 2014-06-18 | 11091 | 1314.2103 |
+| 2014-06-18 | 11176 | 1430.9978 |
+| 2014-06-18 | 11287 | 1191.3012 |
+| 2014-06-19 | 11091 | 1314.2103 |
+| 2014-06-19 | 11176 | 1430.9978 |
+| 2014-06-19 | 11287 | 1191.3012 |
+| 2014-06-20 | 11091 | 1314.2103 |
+| 2014-06-20 | 11176 | 1430.9978 |
+| 2014-06-20 | 11287 | 1191.3012 |
+| 2014-06-21 | 11091 | 1314.2103 |
+| 2014-06-21 | 11176 | 1430.9978 |
+| 2014-06-21 | 11287 | 1191.3012 |
+| 2014-06-22 | 11091 | 1314.2103 |
+| 2014-06-22 | 11176 | 1430.9978 |
+| 2014-06-22 | 11287 | 1191.3012 |
+| 2014-06-23 | 11091 | 1314.2103 |
+| 2014-06-23 | 11176 | 1430.9978 |
+| 2014-06-23 | 11287 | 1191.3012 |
+| 2014-06-24 | 11091 | 1314.2103 |
+| 2014-06-24 | 11176 | 1430.9978 |
+| 2014-06-24 | 11287 | 1191.3012 |
+| 2014-06-25 | 11091 | 1314.2103 |
+| 2014-06-25 | 11176 | 1430.9978 |
+| 2014-06-25 | 11287 | 1191.3012 |
+| 2014-06-26 | 11091 | 1314.2103 |
+| 2014-06-26 | 11176 | 1430.9978 |
+| 2014-06-26 | 11287 | 1191.3012 |
+| 2014-06-27 | 11091 | 1314.2103 |
+| 2014-06-27 | 11176 | 1430.9978 |
+| 2014-06-27 | 11287 | 1191.3012 |
+| 2014-06-28 | 11091 | 1314.2103 |
+| 2014-06-28 | 11176 | 1430.9978 |
+| 2014-06-28 | 11287 | 1191.3012 |
+| 2014-06-29 | 11091 | 1314.2103 |
+| 2014-06-29 | 11176 | 1458.6118 |
+| 2014-06-29 | 11287 | 1191.3012 |
+| 2014-06-30 | 11091 | 1314.2103 |
+| 2014-06-30 | 11176 | 1458.6118 |
+| 2014-06-30 | 11287 | 1289.6131 |
+
+
+>? INFO: **Expand for the DDL**
+>
+--8<-- "docs/challenging-sql-problems/solutions/silver/customer-sales-running-totals.sql"
+
+
+## Solution
+
+The solution for SQL Server is provided below.
+
+
+> SUCCESS: **SQL Server**
+>
+--8<-- "docs/challenging-sql-problems/solutions/silver/customer-sales-running-totals--sql-server.sql"
diff --git a/docs/challenging-sql-problems/solutions/silver/customer-sales-running-totals.sql b/docs/challenging-sql-problems/solutions/silver/customer-sales-running-totals.sql
new file mode 100644
index 0000000..4a484e9
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/silver/customer-sales-running-totals.sql
@@ -0,0 +1,95 @@
+```sql
+select *
+from (values
+ ('2014-06-01', 11091, 1243.5234),
+ ('2014-06-01', 11176, 1222.8820),
+ ('2014-06-01', 11287, 1115.2109),
+ ('2014-06-02', 11091, 1243.5234),
+ ('2014-06-02', 11176, 1222.8820),
+ ('2014-06-02', 11287, 1115.2109),
+ ('2014-06-03', 11091, 1243.5234),
+ ('2014-06-03', 11176, 1222.8820),
+ ('2014-06-03', 11287, 1115.2109),
+ ('2014-06-04', 11091, 1243.5234),
+ ('2014-06-04', 11176, 1222.8820),
+ ('2014-06-04', 11287, 1115.2109),
+ ('2014-06-05', 11091, 1243.5234),
+ ('2014-06-05', 11176, 1222.8820),
+ ('2014-06-05', 11287, 1158.1733),
+ ('2014-06-06', 11091, 1243.5234),
+ ('2014-06-06', 11176, 1222.8820),
+ ('2014-06-06', 11287, 1158.1733),
+ ('2014-06-07', 11091, 1282.1874),
+ ('2014-06-07', 11176, 1222.8820),
+ ('2014-06-07', 11287, 1158.1733),
+ ('2014-06-08', 11091, 1282.1874),
+ ('2014-06-08', 11176, 1222.8820),
+ ('2014-06-08', 11287, 1158.1733),
+ ('2014-06-09', 11091, 1282.1874),
+ ('2014-06-09', 11176, 1222.8820),
+ ('2014-06-09', 11287, 1158.1733),
+ ('2014-06-10', 11091, 1314.2103),
+ ('2014-06-10', 11176, 1222.8820),
+ ('2014-06-10', 11287, 1158.1733),
+ ('2014-06-11', 11091, 1314.2103),
+ ('2014-06-11', 11176, 1222.8820),
+ ('2014-06-11', 11287, 1158.1733),
+ ('2014-06-12', 11091, 1314.2103),
+ ('2014-06-12', 11176, 1222.8820),
+ ('2014-06-12', 11287, 1158.1733),
+ ('2014-06-13', 11091, 1314.2103),
+ ('2014-06-13', 11176, 1222.8820),
+ ('2014-06-13', 11287, 1158.1733),
+ ('2014-06-14', 11091, 1314.2103),
+ ('2014-06-14', 11176, 1228.3960),
+ ('2014-06-14', 11287, 1191.3012),
+ ('2014-06-15', 11091, 1314.2103),
+ ('2014-06-15', 11176, 1388.0354),
+ ('2014-06-15', 11287, 1191.3012),
+ ('2014-06-16', 11091, 1314.2103),
+ ('2014-06-16', 11176, 1388.0354),
+ ('2014-06-16', 11287, 1191.3012),
+ ('2014-06-17', 11091, 1314.2103),
+ ('2014-06-17', 11176, 1430.9978),
+ ('2014-06-17', 11287, 1191.3012),
+ ('2014-06-18', 11091, 1314.2103),
+ ('2014-06-18', 11176, 1430.9978),
+ ('2014-06-18', 11287, 1191.3012),
+ ('2014-06-19', 11091, 1314.2103),
+ ('2014-06-19', 11176, 1430.9978),
+ ('2014-06-19', 11287, 1191.3012),
+ ('2014-06-20', 11091, 1314.2103),
+ ('2014-06-20', 11176, 1430.9978),
+ ('2014-06-20', 11287, 1191.3012),
+ ('2014-06-21', 11091, 1314.2103),
+ ('2014-06-21', 11176, 1430.9978),
+ ('2014-06-21', 11287, 1191.3012),
+ ('2014-06-22', 11091, 1314.2103),
+ ('2014-06-22', 11176, 1430.9978),
+ ('2014-06-22', 11287, 1191.3012),
+ ('2014-06-23', 11091, 1314.2103),
+ ('2014-06-23', 11176, 1430.9978),
+ ('2014-06-23', 11287, 1191.3012),
+ ('2014-06-24', 11091, 1314.2103),
+ ('2014-06-24', 11176, 1430.9978),
+ ('2014-06-24', 11287, 1191.3012),
+ ('2014-06-25', 11091, 1314.2103),
+ ('2014-06-25', 11176, 1430.9978),
+ ('2014-06-25', 11287, 1191.3012),
+ ('2014-06-26', 11091, 1314.2103),
+ ('2014-06-26', 11176, 1430.9978),
+ ('2014-06-26', 11287, 1191.3012),
+ ('2014-06-27', 11091, 1314.2103),
+ ('2014-06-27', 11176, 1430.9978),
+ ('2014-06-27', 11287, 1191.3012),
+ ('2014-06-28', 11091, 1314.2103),
+ ('2014-06-28', 11176, 1430.9978),
+ ('2014-06-28', 11287, 1191.3012),
+ ('2014-06-29', 11091, 1314.2103),
+ ('2014-06-29', 11176, 1458.6118),
+ ('2014-06-29', 11287, 1191.3012),
+ ('2014-06-30', 11091, 1314.2103),
+ ('2014-06-30', 11176, 1458.6118),
+ ('2014-06-30', 11287, 1289.6131)
+) as solution(BalanceDate, CustomerID, RunningTotal)
+```
diff --git a/docs/challenging-sql-problems/solutions/silver/funnel-analytics--duckdb.sql b/docs/challenging-sql-problems/solutions/silver/funnel-analytics--duckdb.sql
new file mode 100644
index 0000000..0988596
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/silver/funnel-analytics--duckdb.sql
@@ -0,0 +1,76 @@
+```sql
+with
+
+stages(stage, step) as (
+ values
+ ('full application', 1),
+ ('decision', 2),
+ ('documentation', 3),
+ ('valuation inspection', 4),
+ ('valuation made', 5),
+ ('valuation submitted', 6),
+ ('solicitation', 7),
+ ('funds released', 8)
+),
+
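+/*
+ A mortgage's cohort is the month of its earliest event: first_value
+ ordered by step picks the event date of the lowest stage, which
+ should be the full application.
+*/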
+cohorts as (
+ select
+ applications.event_id,
+ applications.event_date,
+ applications.mortgage_id,
+ applications.stage,
+ stages.step,
+ datetrunc('month', first_value(applications.event_date) over (
+ partition by applications.mortgage_id
+ order by stages.step
+ )) as cohort
+ from applications
+ inner join stages
+ using (stage)
+),
+
+cohorts_by_stage as (
+ select
+ cohort,
+ stage,
+ any_value(step) as step,
+ count(*) as mortgages
+ from cohorts
+ group by
+ cohort,
+ stage
+),
+
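+/*
+ Build the complete cohort-by-stage grid so that stages that no
+ mortgage reached still appear in the funnel with a zero count.
+*/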
+axis as (
+ select
+ cohort,
+ stages.stage,
+ stages.step
+ from (select distinct cohort from cohorts_by_stage)
+ cross join stages
+),
+
+funnel as (
+ select
+ axis.cohort,
+ axis.stage,
+ axis.step,
+ coalesce(cohorts_by_stage.mortgages, 0) as mortgages,
+ from axis
+ left join cohorts_by_stage
+ using (cohort, step)
+)
+
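+/*
+ step_rate is the conversion from the previous stage: lag() defaults
+ to the current row's value, so the first stage always reads 100.00.
+ total_rate is the conversion from the cohort's first stage.
+*/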
+select
+ cohort,
+ stage,
+ mortgages,
+ round(100.0 * coalesce(mortgages / lag(mortgages, 1, mortgages) over cohort_by_step, 0), 2) as step_rate,
+ round(100.0 * mortgages / first_value(mortgages) over cohort_by_step, 2) as total_rate,
+from funnel
+window cohort_by_step as (
+ partition by cohort
+ order by step
+)
+order by cohort, step
+```
diff --git a/docs/challenging-sql-problems/solutions/silver/funnel-analytics.md b/docs/challenging-sql-problems/solutions/silver/funnel-analytics.md
new file mode 100644
index 0000000..fdab633
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/silver/funnel-analytics.md
@@ -0,0 +1,27 @@
+# Funnel analytics β¬
+
+> [!TIP]
+>
+> Solution to the following problem:
+>
+> - [funnel-analytics.md](../../problems/silver/funnel-analytics.md)
+
+## Result Set
+
+Regardless of the database, the result set should look like:
+
+| cohort     | stage                | mortgages | step_rate | total_rate |
+| :--------- | :------------------- | --------: | --------: | ---------: |
+| 2024-01-01 | full application     |         4 |    100.00 |     100.00 |
+| 2024-01-01 | decision             |         4 |    100.00 |     100.00 |
+| 2024-01-01 | documentation        |         3 |     75.00 |      75.00 |
+| 2024-01-01 | valuation inspection |         3 |    100.00 |      75.00 |
+| 2024-01-01 | valuation made       |         3 |    100.00 |      75.00 |
+| 2024-01-01 | valuation submitted  |         3 |    100.00 |      75.00 |
+| 2024-01-01 | solicitation         |         1 |     33.33 |      25.00 |
+| 2024-01-01 | funds released       |         1 |    100.00 |      25.00 |
+| 2024-02-01 | full application     |         6 |    100.00 |     100.00 |
+| 2024-02-01 | decision             |         6 |    100.00 |     100.00 |
+| 2024-02-01 | documentation        |         4 |     66.67 |      66.67 |
+| 2024-02-01 | valuation inspection |         4 |    100.00 |      66.67 |
+| 2024-02-01 | valuation made       |         4 |    100.00 |      66.67 |
+| 2024-02-01 | valuation submitted  |         4 |    100.00 |      66.67 |
+| 2024-02-01 | solicitation         |         3 |     75.00 |      50.00 |
+| 2024-02-01 | funds released       |         3 |    100.00 |      50.00 |
+| 2024-03-01 | full application     |         3 |    100.00 |     100.00 |
+| 2024-03-01 | decision             |         3 |    100.00 |     100.00 |
+| 2024-03-01 | documentation        |         1 |     33.33 |      33.33 |
+| 2024-03-01 | valuation inspection |         1 |    100.00 |      33.33 |
+| 2024-03-01 | valuation made       |         1 |    100.00 |      33.33 |
+| 2024-03-01 | valuation submitted  |         1 |    100.00 |      33.33 |
+| 2024-03-01 | solicitation         |         0 |      0.00 |       0.00 |
+| 2024-03-01 | funds released       |         0 |      0.00 |       0.00 |
+
+>? INFO: **Expand for the DDL**
+>
+--8<-- "docs/challenging-sql-problems/solutions/silver/funnel-analytics.sql"
+
+
+## Solution
+
+SQL solutions for a selection of databases are provided below.
+
+[//]: # "What about a solution like: https://popsql.com/sql-templates/marketing/running-a-funnel-analysis"
+
+
+> SUCCESS: **DuckDB**
+>
+--8<-- "docs/challenging-sql-problems/solutions/silver/funnel-analytics--duckdb.sql"
diff --git a/docs/challenging-sql-problems/solutions/silver/funnel-analytics.sql b/docs/challenging-sql-problems/solutions/silver/funnel-analytics.sql
new file mode 100644
index 0000000..493817a
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/silver/funnel-analytics.sql
@@ -0,0 +1,29 @@
+```sql
+select *
+from values
+ ('2024-01-01', 'full application', 4, 100.00, 100.00),
+ ('2024-01-01', 'decision', 4, 100.00, 100.00),
+ ('2024-01-01', 'documentation', 3, 75.00, 75.00),
+ ('2024-01-01', 'valuation inspection', 3, 100.00, 75.00),
+ ('2024-01-01', 'valuation made', 3, 100.00, 75.00),
+ ('2024-01-01', 'valuation submitted', 3, 100.00, 75.00),
+ ('2024-01-01', 'solicitation', 1, 33.33, 25.00),
+ ('2024-01-01', 'funds released', 1, 100.00, 25.00),
+ ('2024-02-01', 'full application', 6, 100.00, 100.00),
+ ('2024-02-01', 'decision', 6, 100.00, 100.00),
+ ('2024-02-01', 'documentation', 4, 66.67, 66.67),
+ ('2024-02-01', 'valuation inspection', 4, 100.00, 66.67),
+ ('2024-02-01', 'valuation made', 4, 100.00, 66.67),
+ ('2024-02-01', 'valuation submitted', 4, 100.00, 66.67),
+ ('2024-02-01', 'solicitation', 3, 75.00, 50.00),
+ ('2024-02-01', 'funds released', 3, 100.00, 50.00),
+ ('2024-03-01', 'full application', 3, 100.00, 100.00),
+ ('2024-03-01', 'decision', 3, 100.00, 100.00),
+ ('2024-03-01', 'documentation', 1, 33.33, 33.33),
+ ('2024-03-01', 'valuation inspection', 1, 100.00, 33.33),
+ ('2024-03-01', 'valuation made', 1, 100.00, 33.33),
+ ('2024-03-01', 'valuation submitted', 1, 100.00, 33.33),
+ ('2024-03-01', 'solicitation', 0, 0.00, 0.00),
+ ('2024-03-01', 'funds released', 0, 0.00, 0.00)
+as solution(cohort, stage, mortgages, step_rate, total_rate)
+```
diff --git a/docs/challenging-sql-problems/solutions/silver/suspicious-login-activity--duckdb.sql b/docs/challenging-sql-problems/solutions/silver/suspicious-login-activity--duckdb.sql
new file mode 100644
index 0000000..c48423a
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/silver/suspicious-login-activity--duckdb.sql
@@ -0,0 +1,22 @@
+```sql
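+/*
+ Gaps-and-islands: the difference between the overall row number and
+ the per-event-type row number is constant within each unbroken run
+ of the same event_type for a user, so it labels each run.
+*/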
+with event_groups as (
+ select
+ *,
+ (0
+ + row_number() over (partition by user_id order by event_id)
+ - row_number() over (partition by user_id, event_type order by event_id)
+ ) as event_group
+ from events
+)
+
+select
+ user_id,
+ count(*) as consecutive_failures
+from event_groups
+group by user_id, event_group
+having 1=1
+ and consecutive_failures >= 5
+ and any_value(event_type) = 'login failed'
+qualify consecutive_failures = max(consecutive_failures) over (partition by user_id)
+order by user_id
+```
diff --git a/docs/challenging-sql-problems/solutions/silver/suspicious-login-activity.md b/docs/challenging-sql-problems/solutions/silver/suspicious-login-activity.md
new file mode 100644
index 0000000..07801df
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/silver/suspicious-login-activity.md
@@ -0,0 +1,30 @@
+# Suspicious login activity π€
+
+> [!TIP]
+>
+> Solution to the following problem:
+>
+> - [suspicious-login-activity.md](../../problems/silver/suspicious-login-activity.md)
+
+## Result Set
+
+Regardless of the database, the result set should look like:
+
+| user_id | consecutive_failures |
+| ------: | -------------------: |
+| 1 | 5 |
+| 3 | 8 |
+
+
+>? INFO: **Expand for the DDL**
+>
+--8<-- "docs/challenging-sql-problems/solutions/silver/suspicious-login-activity.sql"
+
+
+## Solution
+
+SQL solutions for a selection of databases are provided below.
+
+
+> SUCCESS: **DuckDB**
+>
+--8<-- "docs/challenging-sql-problems/solutions/silver/suspicious-login-activity--duckdb.sql"
diff --git a/docs/challenging-sql-problems/solutions/silver/suspicious-login-activity.sql b/docs/challenging-sql-problems/solutions/silver/suspicious-login-activity.sql
new file mode 100644
index 0000000..3797138
--- /dev/null
+++ b/docs/challenging-sql-problems/solutions/silver/suspicious-login-activity.sql
@@ -0,0 +1,7 @@
+```sql
+select *
+from values
+ (1, 5),
+ (3, 8)
+as solution(user_id, consecutive_failures)
+```
diff --git a/docs/index.md b/docs/index.md
index 98ca16f..aaa565f 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -27,3 +27,4 @@ This project has a few different series:
- [From Excel to SQL](from-excel-to-sql/from-excel-to-sql.md), which is aimed at Excel users who want to learn SQL
- [Everything About Joins](everything-about-joins/everything-about-joins.md), which is a comprehensive guide to SQL joins
+- [Challenging SQL questions](challenging-sql-problems/challenging-sql-problems.md), which is a set of problems that really test your SQL skills
diff --git a/mkdocs.yml b/mkdocs.yml
index 96ee622..9f4a6ea 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -7,7 +7,10 @@ repo_url: https://github.com/Bilbottom/sql-learning-materials/
theme:
name: material
- features: ["navigation.instant", "navigation.footer"]
+ features:
+ - navigation.instant
+ - navigation.footer
+ - content.code.copy
palette:
- accent: default
primary: default
@@ -68,12 +71,39 @@ nav:
- everything-about-joins/under-the-hood/hash-joins.md
- everything-about-joins/under-the-hood/merge-joins.md
- everything-about-joins/under-the-hood/range-intersection-joins.md
+ - Challenging SQL problems:
+ - challenging-sql-problems/challenging-sql-problems.md
+ - Problems:
+ - π€ Bronze Tier:
+ - challenging-sql-problems/problems/bronze/fibonacci-sequence.md # recursive CTE
+ - challenging-sql-problems/problems/bronze/temperature-anomaly-detection.md # window functions
+ - challenging-sql-problems/problems/bronze/uk-bank-holidays.md # unpivot and unnest
+ - βͺ Silver Tier:
+ - challenging-sql-problems/problems/silver/suspicious-login-activity.md # window function (gaps and islands)
+ - challenging-sql-problems/problems/silver/funnel-analytics.md # custom axis (TODO: this might be bronze tier)
+ - challenging-sql-problems/problems/silver/customer-sales-running-totals.md # window functions
+ - π‘ Gold Tier:
+ - challenging-sql-problems/problems/gold/loan-repayment-schedule.md # recursive CTE
+ - challenging-sql-problems/problems/gold/travel-plans.md # recursive CTEs
+ - Solutions:
+ - π€ Bronze Tier:
+ - challenging-sql-problems/solutions/bronze/fibonacci-sequence.md
+ - challenging-sql-problems/solutions/bronze/temperature-anomaly-detection.md
+ - challenging-sql-problems/solutions/bronze/uk-bank-holidays.md
+ - βͺ Silver Tier:
+ - challenging-sql-problems/solutions/silver/suspicious-login-activity.md
+ - challenging-sql-problems/solutions/silver/funnel-analytics.md
+ - challenging-sql-problems/solutions/silver/customer-sales-running-totals.md
+ - π‘ Gold Tier:
+ - challenging-sql-problems/solutions/gold/loan-repayment-schedule.md
+ - challenging-sql-problems/solutions/gold/travel-plans.md
markdown_extensions:
- admonition
- callouts # https://github.com/oprypin/markdown-callouts
- mdx_truly_sane_lists # Support nested lists
- pymdownx.details # Support collapsible sections with ???
+ - pymdownx.snippets
- pymdownx.superfences:
custom_fences:
- { name: mermaid, class: mermaid }