Compare commits

...

51 commits

Author SHA1 Message Date
Gabe Farrell
0ec7b458cc
ui: tweaks and fixes (#194)
* reduce min width of top chart on mobile

* adjust error page style

* adjust h1 line height
2026-02-04 13:41:12 -05:00
Gabe Farrell
531c72899c
fix: add null check for top charts bg gradient (#193) 2026-02-03 11:23:30 -05:00
Gabe Farrell
b06685c1af
fix: rewind navigation (#191) 2026-02-02 15:06:13 -05:00
Gabe Farrell
64236c99c9
fix: invalid json response when login gate is disabled (#184) 2026-01-26 14:49:30 -05:00
Gabe Farrell
42b32c7920
feat: add api key auth to web api (#183) 2026-01-26 13:48:43 -05:00
PythonGermany
bf1c03e9fd
docs: fix typo in index.mdx (#182) 2026-01-26 13:43:01 -05:00
Gabe Farrell
35e104c97e
fix: gradient background on top charts (#181) 2026-01-26 13:03:27 -05:00
Gabe Farrell
c8a11ef018
fix: ensure mbids in mbidmapping are discovered (#180) 2026-01-25 15:51:07 -05:00
Gabe Farrell
937f9062b5
fix: include time zone name overrides and add KOITO_FORCE_TZ cfg option (#176)
* timezone overrides and force_tz option

* docs for force_tz

* add link to time zone names in docs
2026-01-24 13:19:04 -05:00
Gabe Farrell
1ed055d098
fix: ui tweaks and fixes (#170)
* add subtle gradient to home page

* tweak autumn theme primary color

* reduce home page top margin on mobile

* use focus-active instead of focus for outline

* fix gradient on rewind page

* align checkbox on login form

* i forgot what the pseudo class was called
2026-01-22 21:31:14 -05:00
Gabe Farrell
08fc9eed86
fix: correct interest bucket queries (#169) 2026-01-22 17:01:46 -05:00
Gabe Farrell
cb4d177875
fix: release associations and add cleanup migration (#168)
* fix: release associations and add cleanup migration

* fix: incorrect test
2026-01-22 15:33:38 -05:00
Gabe Farrell
16cee8cfca
fix: speedup top-artists and top-albums queries (#167) 2026-01-21 17:30:59 -05:00
onespaceman
c59c6c3baa
QOL changes to client (#165) 2026-01-21 16:03:27 -05:00
Gabe Farrell
e7ba34710c
feat: lastfm image support (#166)
* feat: lastfm image support

* docs
2026-01-21 16:03:05 -05:00
Gabe Farrell
56ac73d12b
fix: improve subsonic image searching (#164) 2026-01-21 14:54:52 -05:00
Gabe Farrell
1a8099e902
feat: refetch missing images on startup (#160)
* artist image refetching

* album image refetching

* remove unused var
2026-01-20 12:10:54 -05:00
Gabe Farrell
5e294b839c
feat: all time rank display (#149)
* add all time rank to item pages

* fix artist albums component

* add no rows check

* fix rewind page
2026-01-16 01:03:23 -05:00
d08e05220f docs: add disclaimer about subsonic config 2026-01-15 22:01:25 -05:00
c0de721a7c chore: ignore README for docker workflow 2026-01-15 21:27:59 -05:00
Gabe Farrell
d2d6924e05
fix: use sql rank (#148) 2026-01-15 21:08:30 -05:00
Gabe Farrell
aa7fddd518
fix: a couple ui fixes (#147)
* fix: reduce loading component width

* improve theme selector for mobile

* match interest graph width to activity grid
2026-01-15 20:21:05 -05:00
Gabe Farrell
1eb1cd0fd5
chore: call relay early to prevent missed relays (#145)
* chore: call relay early to prevent missed relays

* fix: get current time in tz for listen activity (#146)

* fix: get current time in tz for listen activity

* fix: adjust test to prevent timezone errors
2026-01-15 19:40:38 -05:00
Gabe Farrell
92648167f0
fix: get current time in tz for listen activity (#146)
* fix: get current time in tz for listen activity

* fix: adjust test to prevent timezone errors
2026-01-15 19:36:48 -05:00
Gabe Farrell
9dbdfe5e41
update README 2026-01-15 18:21:51 -05:00
Gabe Farrell
94108953ec
fix: conditional rendering on artist and album pages (#140) 2026-01-14 22:12:57 -05:00
Gabe Farrell
d87ed2eb97
fix: ensure listen activity correctly sums listen activity in step (#139)
* remove impossible nil check

* fix listen activity not correctly aggregating step

* remove stray log

* fix test
2026-01-14 21:35:01 -05:00
Gabe Farrell
3305ad269e
Add Star History section to README
Added Star History section with visualization.
2026-01-14 17:21:52 -05:00
Gabe Farrell
20bbf62254
update README
Added logo and Ko-Fi badge to README.
2026-01-14 14:47:21 -05:00
Gabe Farrell
a94584da23
create FUNDING.yml 2026-01-14 14:06:14 -05:00
Gabe Farrell
8223a29be6
fix: correctly cycle tracks in backfill (#138) 2026-01-14 12:46:17 -05:00
231e751be3 docs: add navidrome quickstart guide 2026-01-14 01:26:01 -05:00
feef66da12 fix: add required parameters for subsonic request 2026-01-14 01:09:17 -05:00
Gabe Farrell
25d7bb41c1
Revise README for project status and update screenshots
Updated project status to reflect active development and instability. Added new images to the screenshots section and made minor text adjustments.

Also since when does AI write GitHub default commit messages...
2026-01-14 00:24:19 -05:00
Gabe Farrell
df59605418
feat: backfill duration from musicbrainz (#135)
* feat: backfill durations from musicbrainz

* chore: make request body dump info level
2026-01-14 00:08:05 -05:00
Gabe Farrell
288d04d714
fix: ui tweaks and fixes (#134) 2026-01-13 23:25:31 -05:00
Gabe Farrell
c2a0987946
fix: improved mobile ui for rewind (#133) 2026-01-13 11:13:54 -05:00
6e7b4e0522 fix: rewind ui bug 2026-01-13 01:02:25 -05:00
Gabe Farrell
62267652ba
feat: improve rewind page (#130)
* add timeframe selectors for rewind

* alter rewind nav to default to monthly rewind

* fix rewind default page

* remove superfluous parameters
2026-01-12 23:22:29 -05:00
Gabe Farrell
ddb0becc0f
fix: ui fixes and koito import time config fix (#128)
* fix: add import time checking to koito import

* adjust interest graph css

* show musicbrainz link when not logged in

* remove chart animation

* change interest steps to 16
2026-01-12 17:44:33 -05:00
Gabe Farrell
231eb1b0fb
feat: interest over time graph (#127)
* api

* ui

* test

* add margin to prevent clipping
2026-01-12 16:20:31 -05:00
Gabe Farrell
e45099c71a
fix: improve matching with identically named albums (#126)
* fix: improve matching with identically named albums

* fix: incorrect sql query
2026-01-12 13:03:04 -05:00
Gabe Farrell
97cd378535
feat: add endpoint and ui to update mbz id (#125)
* wip

* wip

* feat: add endpoint and ui to update mbz id
2026-01-11 01:50:27 -05:00
Gabe Farrell
7cf7cd3a10
feat: add musicbrainz link where possible (#124) 2026-01-11 01:39:56 -05:00
Gabe Farrell
d61e814306
fix: do not update mbz id when one already exists (#123) 2026-01-11 01:39:41 -05:00
Gabe Farrell
f51771bc34
feat: add ranks to top items charts (#122) 2026-01-11 00:15:46 -05:00
d3faa9728e chore: use named volume in dev 2026-01-11 00:03:46 -05:00
Gabe Farrell
f48dd6c039
fix: respect client timezone for requests (#119)
* maybe fixed for total listen activity

* maybe actually fixed now

* fix unset location panics
2026-01-10 01:45:31 -05:00
2925425750 docs: only release docs on new version 2026-01-01 18:41:07 -05:00
Gabe Farrell
c346c7cb31
fix: associate tracks with release when scrobbling (#118) 2026-01-01 02:40:27 -05:00
Gabe Farrell
d327729bff
transition time ranged queries to timeframe (#117) 2026-01-01 01:56:16 -05:00
128 changed files with 5625 additions and 2227 deletions

.env.example Normal file

@@ -0,0 +1,5 @@
KOITO_ALLOWED_HOSTS=*
KOITO_LOG_LEVEL=debug
KOITO_CONFIG_DIR=test_config_dir
KOITO_DATABASE_URL=postgres://postgres:secret@localhost:5432?sslmode=disable
TZ=Etc/UTC

.github/FUNDING.yml vendored Normal file

@@ -0,0 +1,3 @@
# These are supported funding model platforms
ko_fi: gabehf

View file

@@ -2,10 +2,13 @@ name: Deploy to GitHub Pages
 on:
   push:
-    branches: [main]
+    tags:
+      - "v*"
     paths:
-      - 'docs/**'
-      - '.github/workflows/**'
+      - "docs/**"
+      - ".github/workflows/**"
+  workflow_dispatch:

 permissions:
   contents: read
@@ -21,9 +24,9 @@ jobs:
       - name: Install, build, and upload your site output
         uses: withastro/action@v4
         with:
           path: ./docs # The root location of your Astro project inside the repository. (optional)
           node-version: 20 # The specific version of Node that should be used to build your site. Defaults to 22. (optional)
           package-manager: yarn@1.22.22 # The Node package manager that should be used to install dependencies and build your site. Automatically detected based on your lockfile. (optional)

   deploy:
     needs: build
@@ -34,4 +37,4 @@ jobs:
     steps:
       - name: Deploy to GitHub Pages
         id: deployment
         uses: actions/deploy-pages@v4

View file

@@ -17,6 +17,7 @@ on:
       - main
     paths-ignore:
       - "docs/**"
+      - "README.md"
   workflow_dispatch:

.gitignore vendored

@@ -1 +1,2 @@
 test_config_dir
+.env

View file

@ -1,3 +1,8 @@
ifneq (,$(wildcard ./.env))
include .env
export
endif
.PHONY: all test clean client .PHONY: all test clean client
postgres.schemadump: postgres.schemadump:
@ -10,7 +15,7 @@ postgres.schemadump:
-v --dbname="koitodb" -f "/tmp/dump/schema.sql" -v --dbname="koitodb" -f "/tmp/dump/schema.sql"
postgres.run: postgres.run:
docker run --name koito-db -p 5432:5432 -e POSTGRES_PASSWORD=secret -d postgres docker run --name koito-db -p 5432:5432 -v koito_dev_db:/var/lib/postgresql -e POSTGRES_PASSWORD=secret -d postgres
postgres.run-scratch: postgres.run-scratch:
docker run --name koito-scratch -p 5433:5432 -e POSTGRES_PASSWORD=secret -d postgres docker run --name koito-scratch -p 5433:5432 -e POSTGRES_PASSWORD=secret -d postgres
@ -28,10 +33,10 @@ postgres.remove-scratch:
docker stop koito-scratch && docker rm koito-scratch docker stop koito-scratch && docker rm koito-scratch
api.debug: postgres.start api.debug: postgres.start
KOITO_ALLOWED_HOSTS=* KOITO_LOG_LEVEL=debug KOITO_CONFIG_DIR=test_config_dir KOITO_DATABASE_URL=postgres://postgres:secret@localhost:5432?sslmode=disable go run cmd/api/main.go go run cmd/api/main.go
api.scratch: postgres.run-scratch api.scratch: postgres.run-scratch
KOITO_ALLOWED_HOSTS=* KOITO_LOG_LEVEL=debug KOITO_CONFIG_DIR=test_config_dir/scratch KOITO_DATABASE_URL=postgres://postgres:secret@localhost:5433?sslmode=disable go run cmd/api/main.go KOITO_DATABASE_URL=postgres://postgres:secret@localhost:5433?sslmode=disable go run cmd/api/main.go
api.test: api.test:
go test ./... -timeout 60s go test ./... -timeout 60s

View file

@@ -1,9 +1,21 @@
-# Koito
+<div align="center">
+
+![Koito logo](https://github.com/user-attachments/assets/bd69a050-b40f-4da7-8ff1-4607554bfd6d)
+
+*Koito (小糸) is a Japanese surname. It is also homophonous with the words 恋と (koi to), meaning "and/with love".*
+
+</div>
+<div align="center">
+
+[![Ko-Fi](https://img.shields.io/badge/Ko--fi-F16061?style=for-the-badge&logo=ko-fi&logoColor=white)](https://ko-fi.com/gabehf)
+
+</div>

 Koito is a modern, themeable ListenBrainz-compatible scrobbler for self-hosters who want control over their data and insights into their listening habits.

 It supports relaying to other compatible scrobblers, so you can try it safely without replacing your current setup.

-> This project is currently pre-release, and therefore you can expect rapid development and some bugs. If you don't want to replace your current scrobbler
+> This project is under active development and still considered "unstable", and therefore you can expect some bugs. If you don't want to replace your current scrobbler
 with Koito quite yet, you can [set up a relay](https://koito.io/guides/scrobbler/#set-up-a-relay) from Koito to another ListenBrainz-compatible
 scrobbler. This is what I've been doing for the entire development of this app and it hasn't failed me once. Or, you can always use something
 like [multi-scrobbler](https://github.com/FoxxMD/multi-scrobbler).
@@ -23,8 +35,9 @@ You can view my public instance with my listening data at https://koito.mnrva.de
 ## Screenshots

 ![screenshot one](assets/screenshot1.png)
-![screenshot two](assets/screenshot2.png)
-![screenshot three](assets/screenshot3.png)
+<img width="2021" height="1330" alt="image" src="https://github.com/user-attachments/assets/956748ff-f61f-4102-94b2-50783d9ee72b" />
+<img width="1505" height="1018" alt="image" src="https://github.com/user-attachments/assets/5f7e1162-f723-4e4b-a528-06cf26d1d870" />

 ## Installation
@@ -75,6 +88,16 @@ There are currently some known issues that I am actively working on, in addition
 If you have any feature ideas, open a GitHub issue to let me know. I'm sorting through ideas to decide which data visualizations and customization options to add next.

+## Star History
+
+<a href="https://www.star-history.com/#gabehf/koito&type=date&legend=top-left">
+  <picture>
+    <source media="(prefers-color-scheme: dark)" srcset="https://api.star-history.com/svg?repos=gabehf/koito&type=date&theme=dark&legend=top-left" />
+    <source media="(prefers-color-scheme: light)" srcset="https://api.star-history.com/svg?repos=gabehf/koito&type=date&legend=top-left" />
+    <img alt="Star History Chart" src="https://api.star-history.com/svg?repos=gabehf/koito&type=date&legend=top-left" />
+  </picture>
+</a>

 ## Albums that fueled development + notes

 More relevant here than any of my other projects...
@@ -84,5 +107,4 @@ Not just during development, you can see my complete listening data on my [live
 #### Random notes

 - I find it a little annoying when READMEs use emoji but everyone else is doing it so I felt like I had to...
-- It's funny how you can see the days in my listening history when I was just working on this project because they have way more listens than other days.
 - About 50% of the reason I built this was minor/not-so-minor greivances with Maloja. Could I have just contributed to Maloja? Maybe, but I like building stuff and I like Koito's UI a lot more anyways.

View file

@@ -23,6 +23,12 @@ interface timeframe {
   to?: number;
   period?: string;
 }
+
+interface getInterestArgs {
+  buckets: number;
+  artist_id: number;
+  album_id: number;
+  track_id: number;
+}

 async function handleJson<T>(r: Response): Promise<T> {
   if (!r.ok) {
@@ -42,32 +48,32 @@ async function getLastListens(

 async function getTopTracks(
   args: getItemsArgs
-): Promise<PaginatedResponse<Track>> {
+): Promise<PaginatedResponse<Ranked<Track>>> {
   let url = `/apis/web/v1/top-tracks?period=${args.period}&limit=${args.limit}&page=${args.page}`;
   if (args.artist_id) url += `&artist_id=${args.artist_id}`;
   else if (args.album_id) url += `&album_id=${args.album_id}`;
   const r = await fetch(url);
-  return handleJson<PaginatedResponse<Track>>(r);
+  return handleJson<PaginatedResponse<Ranked<Track>>>(r);
 }

 async function getTopAlbums(
   args: getItemsArgs
-): Promise<PaginatedResponse<Album>> {
+): Promise<PaginatedResponse<Ranked<Album>>> {
   let url = `/apis/web/v1/top-albums?period=${args.period}&limit=${args.limit}&page=${args.page}`;
   if (args.artist_id) url += `&artist_id=${args.artist_id}`;
   const r = await fetch(url);
-  return handleJson<PaginatedResponse<Album>>(r);
+  return handleJson<PaginatedResponse<Ranked<Album>>>(r);
 }

 async function getTopArtists(
   args: getItemsArgs
-): Promise<PaginatedResponse<Artist>> {
+): Promise<PaginatedResponse<Ranked<Artist>>> {
   const url = `/apis/web/v1/top-artists?period=${args.period}&limit=${args.limit}&page=${args.page}`;
   const r = await fetch(url);
-  return handleJson<PaginatedResponse<Artist>>(r);
+  return handleJson<PaginatedResponse<Ranked<Artist>>>(r);
 }

 async function getActivity(
@@ -79,6 +85,13 @@ async function getActivity(
   return handleJson<ListenActivityItem[]>(r);
 }

+async function getInterest(args: getInterestArgs): Promise<InterestBucket[]> {
+  const r = await fetch(
+    `/apis/web/v1/interest?buckets=${args.buckets}&album_id=${args.album_id}&artist_id=${args.artist_id}&track_id=${args.track_id}`
+  );
+  return handleJson<InterestBucket[]>(r);
+}

 async function getStats(period: string): Promise<Stats> {
   const r = await fetch(`/apis/web/v1/stats?period=${period}`);
@@ -270,6 +283,19 @@ function setPrimaryAlias(
     body: form,
   });
 }

+function updateMbzId(
+  type: string,
+  id: number,
+  mbzid: string
+): Promise<Response> {
+  const form = new URLSearchParams();
+  form.append(`${type}_id`, String(id));
+  form.append("mbz_id", mbzid);
+  return fetch(`/apis/web/v1/mbzid`, {
+    method: "PATCH",
+    body: form,
+  });
+}

 function getAlbum(id: number): Promise<Album> {
   return fetch(`/apis/web/v1/album?id=${id}`).then(
     (r) => r.json() as Promise<Album>
@@ -302,6 +328,7 @@ export {
   getTopAlbums,
   getTopArtists,
   getActivity,
+  getInterest,
   getStats,
   search,
   replaceImage,
@@ -318,6 +345,7 @@ export {
   createAlias,
   deleteAlias,
   setPrimaryAlias,
+  updateMbzId,
   getApiKeys,
   createApiKey,
   deleteApiKey,
@@ -339,6 +367,7 @@ type Track = {
   musicbrainz_id: string;
   time_listened: number;
   first_listen: number;
+  all_time_rank: number;
 };

 type Artist = {
   id: number;
@@ -350,6 +379,7 @@ type Artist = {
   time_listened: number;
   first_listen: number;
   is_primary: boolean;
+  all_time_rank: number;
 };

 type Album = {
   id: number;
@@ -361,6 +391,7 @@ type Album = {
   musicbrainz_id: string;
   time_listened: number;
   first_listen: number;
+  all_time_rank: number;
 };

 type Alias = {
   id: number;
@@ -379,10 +410,19 @@ type PaginatedResponse<T> = {
   current_page: number;
   items_per_page: number;
 };

+type Ranked<T> = {
+  item: T;
+  rank: number;
+};

 type ListenActivityItem = {
   start_time: Date;
   listens: number;
 };

+type InterestBucket = {
+  bucket_start: Date;
+  bucket_end: Date;
+  listen_count: number;
+};

 type SimpleArtists = {
   name: string;
   id: number;
@@ -422,9 +462,9 @@ type NowPlaying = {
 };

 type RewindStats = {
   title: string;
-  top_artists: Artist[];
-  top_albums: Album[];
-  top_tracks: Track[];
+  top_artists: Ranked<Artist>[];
+  top_albums: Ranked<Album>[];
+  top_tracks: Ranked<Track>[];
   minutes_listened: number;
   avg_minutes_listened_per_day: number;
   plays: number;
@@ -440,13 +480,16 @@ type RewindStats = {
 export type {
   getItemsArgs,
   getActivityArgs,
+  getInterestArgs,
   Track,
   Artist,
   Album,
   Listen,
   SearchResponse,
   PaginatedResponse,
+  Ranked,
   ListenActivityItem,
+  InterestBucket,
   User,
   Alias,
   ApiKey,
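The hunks above wrap every top-item response in the new `Ranked<T>` type, so callers now read `entry.item` and `entry.rank` instead of the item directly. A minimal sketch of consuming the new shape (the `unwrapRanked` helper, the trimmed `Page` envelope, and the sample payload are hypothetical, not part of the diff):

```typescript
// The Ranked<T> wrapper from the diff, plus a trimmed-down stand-in
// for PaginatedResponse<T> (hypothetical, for illustration only).
type Ranked<T> = { item: T; rank: number };
type Page<T> = { items: T[] };

// Hypothetical helper: drop the ranks when a caller only needs the
// inner items, as components rendering plain lists would.
function unwrapRanked<T>(page: Page<Ranked<T>>): T[] {
  return page.items.map((r) => r.item);
}

// Hypothetical sample payload in the new response shape.
const page: Page<Ranked<{ title: string }>> = {
  items: [
    { item: { title: "OK Computer" }, rank: 1 },
    { item: { title: "In Rainbows" }, rank: 2 },
  ],
};

console.log(unwrapRanked(page).map((a) => a.title));
```

This is the same reshaping that forces the component diffs further down to switch from `item.title` to `item.item.title`.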

View file

@@ -58,6 +58,7 @@
   --header-sm: 16px;
   --header-xl-weight: 600;
   --header-weight: 600;
+  --header-line-height: 3rem;
 }

 @media (min-width: 60rem) {
@@ -68,6 +69,7 @@
     --header-sm: 16px;
     --header-xl-weight: 600;
     --header-weight: 600;
+    --header-line-height: 1.3em;
   }
 }
@@ -98,6 +100,7 @@ h1 {
   font-family: "League Spartan";
   font-weight: var(--header-weight);
   font-size: var(--header-xl);
+  line-height: var(--header-line-height);
 }

 h2 {
   font-family: "League Spartan";
@@ -130,30 +133,21 @@ h4 {
   text-decoration: underline;
 }

-input[type="text"] {
-  border: 1px solid var(--color-bg);
-}
-input[type="text"]:focus {
-  outline: none;
-  border: 1px solid var(--color-fg-tertiary);
-}
-textarea {
+input[type="text"],
+input[type="password"],
+textarea {
   border: 1px solid var(--color-bg);
 }
-textarea:focus {
-  outline: none;
-  border: 1px solid var(--color-fg-tertiary);
-}
-input[type="password"] {
-  border: 1px solid var(--color-bg);
-}
-input[type="password"]:focus {
-  outline: none;
-  border: 1px solid var(--color-fg-tertiary);
-}
-input[type="checkbox"]:focus {
-  outline: none;
-  border: 1px solid var(--color-fg-tertiary);
+input[type="checkbox"] {
+  height: fit-content;
+}
+
+input:focus-visible,
+button:focus-visible,
+a:focus-visible,
+select:focus-visible,
+textarea:focus-visible {
+  border-color: transparent;
+  outline: 2px solid var(--color-fg-tertiary);
 }

 button:hover {

View file

@@ -63,19 +63,19 @@ export default function ActivityGrid({
     queryFn: ({ queryKey }) => getActivity(queryKey[1] as getActivityArgs),
   });

-  const { theme, themeName } = useTheme();
+  const { theme } = useTheme();
   const color = getPrimaryColor(theme);

   if (isPending) {
     return (
-      <div className="w-[500px]">
+      <div className="w-[350px]">
         <h3>Activity</h3>
         <p>Loading...</p>
       </div>
     );
   } else if (isError) {
     return (
-      <div className="w-[500px]">
+      <div className="w-[350px]">
         <h3>Activity</h3>
         <p className="error">Error: {error.message}</p>
       </div>
@@ -129,14 +129,7 @@ export default function ActivityGrid({
     }
     v = Math.min(v, t);
-    if (themeName === "pearl") {
-      // special case for the only light theme lol
-      // could be generalized by pragmatically comparing the
-      // lightness of the bg vs the primary but eh
-      return (t - v) / t;
-    } else {
-      return ((v - t) / t) * 0.8;
-    }
+    return ((v - t) / t) * 0.8;
   };

   const CHUNK_SIZE = 26 * 7;
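After this change the cell-shading calculation is a single expression for every theme. Pulled out as a standalone function (the `intensity` wrapper name is mine, not the component's), it clamps the listen count `v` to the saturation threshold `t` and maps it to a value in `[-0.8, 0]`:

```typescript
// Standalone version of the remaining formula from the diff above.
// v = listens in a grid cell, t = saturation threshold; output rises
// from -0.8 (no listens) to 0 (at or above the threshold).
function intensity(v: number, t: number): number {
  v = Math.min(v, t); // counts above the threshold saturate
  return ((v - t) / t) * 0.8;
}
```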

View file

@@ -7,10 +7,12 @@ export default function AllTimeStats() {
     queryFn: ({ queryKey }) => getStats(queryKey[1]),
   });

+  const header = "All time stats";
+
   if (isPending) {
     return (
-      <div className="w-[200px]">
-        <h3>All Time Stats</h3>
+      <div>
+        <h3>{header}</h3>
         <p>Loading...</p>
       </div>
     );
@@ -18,7 +20,7 @@ export default function AllTimeStats() {
     return (
       <>
         <div>
-          <h3>All Time Stats</h3>
+          <h3>{header}</h3>
           <p className="error">Error: {error.message}</p>
         </div>
       </>
@@ -29,7 +31,7 @@ export default function AllTimeStats() {
   return (
     <div>
-      <h3>All Time Stats</h3>
+      <h3>{header}</h3>
       <div>
         <span
           className={numberClasses}

View file

@@ -8,11 +8,11 @@ interface Props {
   period: string;
 }

-export default function ArtistAlbums({ artistId, name, period }: Props) {
+export default function ArtistAlbums({ artistId, name }: Props) {
   const { isPending, isError, data, error } = useQuery({
     queryKey: [
       "top-albums",
-      { limit: 99, period: "all_time", artist_id: artistId, page: 0 },
+      { limit: 99, period: "all_time", artist_id: artistId },
     ],
     queryFn: ({ queryKey }) => getTopAlbums(queryKey[1] as getItemsArgs),
   });
@@ -39,16 +39,20 @@ export default function ArtistAlbums({ artistId, name }: Props) {
       <h3>Albums featuring {name}</h3>
       <div className="flex flex-wrap gap-8">
         {data.items.map((item) => (
-          <Link to={`/album/${item.id}`} className="flex gap-2 items-start">
+          <Link
+            to={`/album/${item.item.id}`}
+            className="flex gap-2 items-start"
+          >
             <img
-              src={imageUrl(item.image, "medium")}
-              alt={item.title}
+              src={imageUrl(item.item.image, "medium")}
+              alt={item.item.title}
               style={{ width: 130 }}
             />
             <div className="w-[180px] flex flex-col items-start gap-1">
-              <p>{item.title}</p>
+              <p>{item.item.title}</p>
               <p className="text-sm color-fg-secondary">
-                {item.listen_count} play{item.listen_count > 1 ? "s" : ""}
+                {item.item.listen_count} play
+                {item.item.listen_count > 1 ? "s" : ""}
               </p>
             </div>
           </Link>

View file

@@ -0,0 +1,112 @@
import { useQuery } from "@tanstack/react-query";
import { getInterest, type getInterestArgs } from "api/api";
import { useTheme } from "~/hooks/useTheme";
import type { Theme } from "~/styles/themes.css";
import { Area, AreaChart } from "recharts";
import { RechartsDevtools } from "@recharts/devtools";

function getPrimaryColor(theme: Theme): string {
  const value = theme.primary;
  const rgbMatch = value.match(
    /^rgb\(\s*(\d{1,3})\s*,\s*(\d{1,3})\s*,\s*(\d{1,3})\s*\)$/
  );
  if (rgbMatch) {
    const [, r, g, b] = rgbMatch.map(Number);
    return "#" + [r, g, b].map((n) => n.toString(16).padStart(2, "0")).join("");
  }
  return value;
}

interface Props {
  buckets?: number;
  artistId?: number;
  albumId?: number;
  trackId?: number;
}

export default function InterestGraph({
  buckets = 16,
  artistId = 0,
  albumId = 0,
  trackId = 0,
}: Props) {
  const { isPending, isError, data, error } = useQuery({
    queryKey: [
      "interest",
      {
        buckets: buckets,
        artist_id: artistId,
        album_id: albumId,
        track_id: trackId,
      },
    ],
    queryFn: ({ queryKey }) => getInterest(queryKey[1] as getInterestArgs),
  });

  const { theme } = useTheme();
  const color = getPrimaryColor(theme);

  if (isPending) {
    return (
      <div className="w-[350px] sm:w-[500px]">
        <h3>Interest over time</h3>
        <p>Loading...</p>
      </div>
    );
  } else if (isError) {
    return (
      <div className="w-[350px] sm:w-[500px]">
        <h3>Interest over time</h3>
        <p className="error">Error: {error.message}</p>
      </div>
    );
  }

  // Note: I would really like to have the animation for the graph, however
  // the line graph can get weirdly clipped before the animation is done
  // so I think I just have to remove it for now.
  return (
    <div className="flex flex-col items-start w-full max-w-[335px] sm:max-w-[500px]">
      <h3>Interest over time</h3>
      <AreaChart
        style={{
          width: "100%",
          aspectRatio: 3.5,
          maxWidth: 440,
          overflow: "visible",
        }}
        data={data}
        margin={{ top: 15, bottom: 20 }}
      >
        <defs>
          <linearGradient id="colorGradient" x1="0" y1="0" x2="0" y2="1">
            <stop offset="5%" stopColor={color} stopOpacity={0.5} />
            <stop offset="95%" stopColor={color} stopOpacity={0} />
          </linearGradient>
        </defs>
        <Area
          dataKey="listen_count"
          type="natural"
          stroke="none"
          fill="url(#colorGradient)"
          animationDuration={0}
          animationEasing="ease-in-out"
          activeDot={false}
        />
        <Area
          dataKey="listen_count"
          type="natural"
          stroke={color}
          fill="none"
          strokeWidth={2}
          animationDuration={0}
          animationEasing="ease-in-out"
          dot={false}
          activeDot={false}
          style={{ filter: `drop-shadow(0px 0px 0px ${color})` }}
        />
      </AreaChart>
    </div>
  );
}
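`getPrimaryColor` in this new file normalizes a theme's `rgb(r, g, b)` value to a hex string before handing it to the chart. The same conversion in isolation, as a standalone sketch (the `rgbToHex` name is mine; the regex is copied from the component):

```typescript
// Standalone rgb() → hex conversion mirroring getPrimaryColor above.
function rgbToHex(value: string): string {
  const m = value.match(
    /^rgb\(\s*(\d{1,3})\s*,\s*(\d{1,3})\s*,\s*(\d{1,3})\s*\)$/
  );
  if (!m) return value; // not rgb(): assume already hex, pass through
  const [r, g, b] = m.slice(1).map(Number);
  // Each channel becomes two lowercase hex digits, zero-padded.
  return "#" + [r, g, b].map((n) => n.toString(16).padStart(2, "0")).join("");
}

console.log(rgbToHex("rgb(255, 128, 0)")); // "#ff8000"
```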

View file

@@ -42,6 +42,8 @@ export default function LastPlays(props: Props) {
     queryFn: () => getNowPlaying(),
   });

+  const header = "Last played";
+
   const [items, setItems] = useState<Listen[] | null>(null);

   const handleDelete = async (listen: Listen) => {
@@ -63,14 +65,14 @@ export default function LastPlays(props: Props) {
   if (isPending) {
     return (
       <div className="w-[300px] sm:w-[500px]">
-        <h3>Last Played</h3>
+        <h3>{header}</h3>
         <p>Loading...</p>
       </div>
     );
   } else if (isError) {
     return (
       <div className="w-[300px] sm:w-[500px]">
-        <h3>Last Played</h3>
+        <h3>{header}</h3>
         <p className="error">Error: {error.message}</p>
       </div>
     );
@@ -86,7 +88,7 @@ export default function LastPlays(props: Props) {
   return (
     <div className="text-sm sm:text-[16px]">
       <h3 className="hover:underline">
-        <Link to={`/listens?period=all_time${params}`}>Last Played</Link>
+        <Link to={`/listens?period=all_time${params}`}>{header}</Link>
       </h3>
       <table className="-ml-4">
         <tbody>

View file

@@ -30,17 +30,19 @@ export default function TopAlbums(props: Props) {
     queryFn: ({ queryKey }) => getTopAlbums(queryKey[1] as getItemsArgs),
   });

+  const header = "Top albums";
+
   if (isPending) {
     return (
       <div className="w-[300px]">
-        <h3>Top Albums</h3>
+        <h3>{header}</h3>
         <p>Loading...</p>
       </div>
     );
   } else if (isError) {
     return (
       <div className="w-[300px]">
-        <h3>Top Albums</h3>
+        <h3>{header}</h3>
         <p className="error">Error: {error.message}</p>
       </div>
     );
@@ -54,7 +56,7 @@ export default function TopAlbums(props: Props) {
           props.artistId ? `&artist_id=${props.artistId}` : ""
         }`}
       >
-        Top Albums
+        {header}
       </Link>
     </h3>
     <div className="max-w-[300px]">


@@ -21,17 +21,19 @@ export default function TopArtists(props: Props) {
     queryFn: ({ queryKey }) => getTopArtists(queryKey[1] as getItemsArgs),
   });

+  const header = "Top artists";
+
   if (isPending) {
     return (
       <div className="w-[300px]">
-        <h3>Top Artists</h3>
+        <h3>{header}</h3>
         <p>Loading...</p>
       </div>
     );
   } else if (isError) {
     return (
       <div className="w-[300px]">
-        <h3>Top Artists</h3>
+        <h3>{header}</h3>
         <p className="error">Error: {error.message}</p>
       </div>
     );
@@ -40,9 +42,7 @@ export default function TopArtists(props: Props) {
   return (
     <div>
       <h3 className="hover:underline">
-        <Link to={`/chart/top-artists?period=${props.period}`}>
-          Top Artists
-        </Link>
+        <Link to={`/chart/top-artists?period=${props.period}`}>{header}</Link>
       </h3>
       <div className="max-w-[300px]">
         <TopItemList type="artist" data={data} />


@@ -1,102 +1,171 @@
 import { Link, useNavigate } from "react-router";
 import ArtistLinks from "./ArtistLinks";
-import { imageUrl, type Album, type Artist, type Track, type PaginatedResponse } from "api/api";
+import {
+  imageUrl,
+  type Album,
+  type Artist,
+  type Track,
+  type PaginatedResponse,
+  type Ranked,
+} from "api/api";

 type Item = Album | Track | Artist;

-interface Props<T extends Item> {
-  data: PaginatedResponse<T>
-  separators?: ConstrainBoolean
-  type: "album" | "track" | "artist";
-  className?: string,
-}
+interface Props<T extends Ranked<Item>> {
+  data: PaginatedResponse<T>;
+  separators?: ConstrainBoolean;
+  ranked?: boolean;
+  type: "album" | "track" | "artist";
+  className?: string;
+}

-export default function TopItemList<T extends Item>({ data, separators, type, className }: Props<T>) {
-  return (
-    <div className={`flex flex-col gap-1 ${className} min-w-[200px]`}>
-      {data.items.map((item, index) => {
-        const key = `${type}-${item.id}`;
-        return (
-          <div
-            key={key}
-            style={{ fontSize: 12 }}
-            className={`${
-              separators && index !== data.items.length - 1 ? 'border-b border-(--color-fg-tertiary) mb-1 pb-2' : ''
-            }`}
-          >
-            <ItemCard item={item} type={type} key={type+item.id} />
-          </div>
-        );
-      })}
-    </div>
-  );
-}
+export default function TopItemList<T extends Ranked<Item>>({
+  data,
+  separators,
+  type,
+  className,
+  ranked,
+}: Props<T>) {
+  return (
+    <div className={`flex flex-col gap-1 ${className} min-w-[200px]`}>
+      {data.items.map((item, index) => {
+        const key = `${type}-${item.item.id}`;
+        return (
+          <div
+            key={key}
+            style={{ fontSize: 12 }}
+            className={`${
+              separators && index !== data.items.length - 1
+                ? "border-b border-(--color-fg-tertiary) mb-1 pb-2"
+                : ""
+            }`}
+          >
+            <ItemCard
+              ranked={ranked}
+              rank={item.rank}
+              item={item.item}
+              type={type}
+              key={type + item.item.id}
+            />
+          </div>
+        );
+      })}
+    </div>
+  );
+}

-function ItemCard({ item, type }: { item: Item; type: "album" | "track" | "artist" }) {
-  const itemClasses = `flex items-center gap-2`
-  switch (type) {
-    case "album": {
-      const album = item as Album;
-      return (
-        <div style={{fontSize: 12}} className={itemClasses}>
-          <Link to={`/album/${album.id}`}>
-            <img loading="lazy" src={imageUrl(album.image, "small")} alt={album.title} className="min-w-[48px]" />
-          </Link>
-          <div>
-            <Link to={`/album/${album.id}`} className="hover:text-(--color-fg-secondary)">
-              <span style={{fontSize: 14}}>{album.title}</span>
-            </Link>
-            <br />
-            {album.is_various_artists ?
-              <span className="color-fg-secondary">Various Artists</span>
-              :
-              <div>
-                <ArtistLinks artists={album.artists ? [album.artists[0]] : [{id: 0, name: 'Unknown Artist'}]}/>
-              </div>
-            }
-            <div className="color-fg-secondary">{album.listen_count} plays</div>
-          </div>
-        </div>
-      );
-    }
-    case "track": {
-      const track = item as Track;
-      return (
-        <div style={{fontSize: 12}} className={itemClasses}>
-          <Link to={`/track/${track.id}`}>
-            <img loading="lazy" src={imageUrl(track.image, "small")} alt={track.title} className="min-w-[48px]" />
-          </Link>
-          <div>
-            <Link to={`/track/${track.id}`} className="hover:text-(--color-fg-secondary)">
-              <span style={{fontSize: 14}}>{track.title}</span>
-            </Link>
-            <br />
-            <div>
-              <ArtistLinks artists={track.artists || [{id: 0, Name: 'Unknown Artist'}]}/>
-            </div>
-            <div className="color-fg-secondary">{track.listen_count} plays</div>
-          </div>
-        </div>
-      );
-    }
-    case "artist": {
-      const artist = item as Artist;
-      return (
-        <div style={{fontSize: 12}}>
-          <Link className={itemClasses+' mt-1 mb-[6px] hover:text-(--color-fg-secondary)'} to={`/artist/${artist.id}`}>
-            <img loading="lazy" src={imageUrl(artist.image, "small")} alt={artist.name} className="min-w-[48px]" />
-            <div>
-              <span style={{fontSize: 14}}>{artist.name}</span>
-              <div className="color-fg-secondary">{artist.listen_count} plays</div>
-            </div>
-          </Link>
-        </div>
-      );
-    }
-  }
-}
+function ItemCard({
+  item,
+  type,
+  rank,
+  ranked,
+}: {
+  item: Item;
+  type: "album" | "track" | "artist";
+  rank: number;
+  ranked?: boolean;
+}) {
+  const itemClasses = `flex items-center gap-2`;
+
+  switch (type) {
+    case "album": {
+      const album = item as Album;
+      return (
+        <div style={{ fontSize: 12 }} className={itemClasses}>
+          {ranked && <div className="w-7 text-end">{rank}</div>}
+          <Link to={`/album/${album.id}`}>
+            <img
+              loading="lazy"
+              src={imageUrl(album.image, "small")}
+              alt={album.title}
+              className="min-w-[48px]"
+            />
+          </Link>
+          <div>
+            <Link
+              to={`/album/${album.id}`}
+              className="hover:text-(--color-fg-secondary)"
+            >
+              <span style={{ fontSize: 14 }}>{album.title}</span>
+            </Link>
+            <br />
+            {album.is_various_artists ? (
+              <span className="color-fg-secondary">Various Artists</span>
+            ) : (
+              <div>
+                <ArtistLinks
+                  artists={
+                    album.artists
+                      ? [album.artists[0]]
+                      : [{ id: 0, name: "Unknown Artist" }]
+                  }
+                />
+              </div>
+            )}
+            <div className="color-fg-secondary">{album.listen_count} plays</div>
+          </div>
+        </div>
+      );
+    }
+    case "track": {
+      const track = item as Track;
+      return (
+        <div style={{ fontSize: 12 }} className={itemClasses}>
+          {ranked && <div className="w-7 text-end">{rank}</div>}
+          <Link to={`/track/${track.id}`}>
+            <img
+              loading="lazy"
+              src={imageUrl(track.image, "small")}
+              alt={track.title}
+              className="min-w-[48px]"
+            />
+          </Link>
+          <div>
+            <Link
+              to={`/track/${track.id}`}
+              className="hover:text-(--color-fg-secondary)"
+            >
+              <span style={{ fontSize: 14 }}>{track.title}</span>
+            </Link>
+            <br />
+            <div>
+              <ArtistLinks
+                artists={track.artists || [{ id: 0, Name: "Unknown Artist" }]}
+              />
+            </div>
+            <div className="color-fg-secondary">{track.listen_count} plays</div>
+          </div>
+        </div>
+      );
+    }
+    case "artist": {
+      const artist = item as Artist;
+      return (
+        <div style={{ fontSize: 12 }} className={itemClasses}>
+          {ranked && <div className="w-7 text-end">{rank}</div>}
+          <Link
+            className={
+              itemClasses + " mt-1 mb-[6px] hover:text-(--color-fg-secondary)"
+            }
+            to={`/artist/${artist.id}`}
+          >
+            <img
+              loading="lazy"
+              src={imageUrl(artist.image, "small")}
+              alt={artist.name}
+              className="min-w-[48px]"
+            />
+            <div>
+              <span style={{ fontSize: 14 }}>{artist.name}</span>
+              <div className="color-fg-secondary">
+                {artist.listen_count} plays
+              </div>
+            </div>
+          </Link>
+        </div>
+      );
+    }
+  }
+}
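
The refactor above threads a rank through each list entry via a `Ranked<T>` wrapper. The exact shape exported by `api/api` is not shown in the diff; the sketch below assumes the minimal form the component actually uses (`rank` plus `item`), along with a hypothetical `withRanks` helper illustrating how a sorted, paginated chart could attach 1-based ranks with a page offset.

```typescript
// Assumed shape of the Ranked<T> wrapper: the diff only shows that
// each entry exposes `item` and `rank`.
type Ranked<T> = {
  rank: number;
  item: T;
};

// Hypothetical helper: attach 1-based ranks to an already-sorted
// list, offset by where the current page starts.
function withRanks<T>(items: T[], offset = 0): Ranked<T>[] {
  return items.map((item, i) => ({ rank: offset + i + 1, item }));
}

const albums = [{ title: "A" }, { title: "B" }];
const page2 = withRanks(albums, 100);
// page2[0] → { rank: 101, item: { title: "A" } }
```

Keeping the rank on the wrapper rather than the item itself means `Album`, `Track`, and `Artist` stay unchanged and the same payload can be rendered ranked or unranked.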


@@ -28,17 +28,19 @@ const TopTracks = (props: Props) => {
     queryFn: ({ queryKey }) => getTopTracks(queryKey[1] as getItemsArgs),
   });

+  const header = "Top tracks";
+
   if (isPending) {
     return (
       <div className="w-[300px]">
-        <h3>Top Tracks</h3>
+        <h3>{header}</h3>
         <p>Loading...</p>
       </div>
     );
   } else if (isError) {
     return (
       <div className="w-[300px]">
-        <h3>Top Tracks</h3>
+        <h3>{header}</h3>
         <p className="error">Error: {error.message}</p>
       </div>
     );
@@ -53,7 +55,7 @@ const TopTracks = (props: Props) => {
     <div>
       <h3 className="hover:underline">
         <Link to={`/chart/top-tracks?period=${props.period}${params}`}>
-          Top Tracks
+          {header}
         </Link>
       </h3>
       <div className="max-w-[300px]">


@@ -0,0 +1,23 @@
interface Props {
size: number;
hover?: boolean;
}
export default function MbzIcon({ size, hover }: Props) {
let classNames = "";
if (hover) {
classNames += "icon-hover-fill";
}
return (
<div className={classNames}>
<svg
width={`${size}px`}
height={`${size}px`}
viewBox="0 0 24 24"
fill="var(--color-fg)"
xmlns="http://www.w3.org/2000/svg"
>
<path d="M11.582 0L1.418 5.832v12.336L11.582 24V10.01L7.1 12.668v3.664c.01.111.01.225 0 .336-.103.435-.54.804-1 1.111-.802.537-1.752.509-2.166-.111-.413-.62-.141-1.631.666-2.168.384-.28.863-.399 1.334-.332V6.619c0-.154.134-.252.226-.308L11.582 3zm.836 0v6.162c.574.03 1.14.16 1.668.387a2.225 2.225 0 0 0 1.656-.717 1.02 1.02 0 1 1 1.832-.803l.004.006a1.022 1.022 0 0 1-1.295 1.197c-.34.403-.792.698-1.297.85.34.263.641.576.891.928a1.04 1.04 0 0 1 .777.125c.768.486.568 1.657-.318 1.857-.886.2-1.574-.77-1.09-1.539.02-.03.042-.06.065-.09a3.598 3.598 0 0 0-1.436-1.166 4.142 4.142 0 0 0-1.457-.369v4.01c.855.06 1.256.493 1.555.834.227.256.356.39.578.402.323.018.568.008.806 0a5.44 5.44 0 0 1 .895.022c.94-.017 1.272-.226 1.605-.446a2.533 2.533 0 0 1 1.131-.463 1.027 1.027 0 0 1 .12-.263 1.04 1.04 0 0 1 .105-.137c.023-.025.047-.044.07-.066a4.775 4.775 0 0 1 0-2.405l-.012-.01a1.02 1.02 0 1 1 .692.272h-.057a4.288 4.288 0 0 0 0 1.877h.063a1.02 1.02 0 1 1-.545 1.883l-.047-.033a1 1 0 0 1-.352-.442 1.885 1.885 0 0 0-.814.354 3.03 3.03 0 0 1-.703.365c.757.555 1.772 1.6 2.199 2.299a1.03 1.03 0 0 1 .256-.033 1.02 1.02 0 1 1-.545 1.88l-.047-.03a1.017 1.017 0 0 1-.27-1.376.72.72 0 0 1 .051-.072c-.445-.775-2.026-2.28-2.46-2.387a4.037 4.037 0 0 0-1.31-.117c-.24.008-.513.018-.866 0-.515-.027-.783-.333-1.043-.629-.26-.296-.51-.56-1.055-.611V18.5a1.877 1.877 0 0 0 .426-.135.333.333 0 0 1 .058-.027c.56-.267 1.421-.91 2.096-2.447a1.02 1.02 0 0 1-.27-1.344 1.02 1.02 0 1 1 .915 1.54 6.273 6.273 0 0 1-1.432 2.136 1.785 1.785 0 0 1 .691.306.667.667 0 0 0 .37.168 3.31 3.31 0 0 0 .888-.222 1.02 1.02 0 0 1 1.787-.79v-.005a1.02 1.02 0 0 1-.773 1.683 1.022 1.022 0 0 1-.719-.287 3.935 3.935 0 0 1-1.168.287h-.05a1.313 1.313 0 0 1-.71-.275c-.262-.177-.51-.345-1.402-.12a2.098 2.098 0 0 1-.707.2V24l10.164-5.832V5.832zm4.154 4.904a.352.352 0 0 0-.197.639l.018.01c.163.1.378.053.484-.108v-.002a.352.352 0 0 0-.303-.539zm-4.99 1.928L7.082 9.5v2l4.5-2.668zm8.385.38a.352.352 0 0 0-.295.165v.002a.35.35 0 0 0 
.096.473l.013.01a.357.357 0 0 0 .487-.108.352.352 0 0 0-.301-.541zM16.09 8.647a.352.352 0 0 0-.277.163.355.355 0 0 0 .296.54c.482 0 .463-.73-.02-.703zm3.877 2.477a.352.352 0 0 0-.295.164.35.35 0 0 0 .094.475l.015.01a.357.357 0 0 0 .485-.11.352.352 0 0 0-.3-.539zm-4.375 3.594a.352.352 0 0 0-.291.172.35.35 0 0 0-.04.265.352.352 0 1 0 .33-.437zm4.375.789a.352.352 0 0 0-.295.164v.002a.352.352 0 0 0 .094.473l.015.01a.357.357 0 0 0 .485-.108.352.352 0 0 0-.3-.54zm-2.803 2.488v.002a.347.347 0 0 0-.223.084.352.352 0 0 0 .23.62.347.347 0 0 0 .23-.085.348.348 0 0 0 .12-.24.353.353 0 0 0-.35-.38.347.347 0 0 0-.007 0Z"></path>
</svg>
</div>
);
}


@@ -20,7 +20,7 @@ export default function DeleteModal({ open, setOpen, title, id, type }: Props) {
     setLoading(true);
     deleteItem(type.toLowerCase(), id).then((r) => {
       if (r.ok) {
-        navigate("/");
+        navigate(-1);
       } else {
         console.log(r);
       }


@@ -4,6 +4,7 @@ import {
   deleteAlias,
   getAliases,
   setPrimaryAlias,
+  updateMbzId,
   type Alias,
 } from "api/api";
 import { Modal } from "../Modal";
@@ -12,6 +13,7 @@ import { useEffect, useState } from "react";
 import { Trash } from "lucide-react";
 import SetVariousArtists from "./SetVariousArtist";
 import SetPrimaryArtist from "./SetPrimaryArtist";
+import UpdateMbzID from "./UpdateMbzID";

 interface Props {
   type: string;
@@ -69,7 +71,7 @@ export default function EditModal({ open, setOpen, type, id }: Props) {
   const handleNewAlias = () => {
     setError(undefined);
     if (input === "") {
-      setError("alias must be provided");
+      setError("no input");
       return;
     }
     setLoading(true);
@@ -156,6 +158,7 @@ export default function EditModal({ open, setOpen, type, id }: Props) {
         {type.toLowerCase() === "track" && (
           <SetPrimaryArtist id={id} type="track" />
         )}
+        <UpdateMbzID type={type} id={id} />
       </div>
     </Modal>
   );


@@ -0,0 +1,53 @@
import { updateMbzId } from "api/api";
import { useState } from "react";
import { AsyncButton } from "~/components/AsyncButton";
interface Props {
type: string;
id: number;
}
export default function UpdateMbzID({ type, id }: Props) {
const [err, setError] = useState<string | undefined>();
const [input, setInput] = useState("");
const [loading, setLoading] = useState(false);
const [mbzid, setMbzid] = useState<"">();
const [success, setSuccess] = useState("");
const handleUpdateMbzID = () => {
setError(undefined);
if (input === "") {
setError("no input");
return;
}
setLoading(true);
updateMbzId(type, id, input).then((r) => {
if (r.ok) {
setSuccess("successfully updated MusicBrainz ID");
} else {
r.json().then((r) => setError(r.error));
}
});
setLoading(false);
};
return (
<div className="w-full">
<h3>Update MusicBrainz ID</h3>
<div className="flex gap-2 w-3/5">
<input
type="text"
placeholder="Update MusicBrainz ID"
className="mx-auto fg bg rounded-md p-3 flex-grow"
value={input}
onChange={(e) => setInput(e.target.value)}
/>
<AsyncButton loading={loading} onClick={handleUpdateMbzID}>
Submit
</AsyncButton>
</div>
{err && <p className="error">{err}</p>}
{success && <p className="success">{success}</p>}
</div>
);
}
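
In the new handler above, `setLoading(false)` runs synchronously, before the `updateMbzId` promise settles, so the button leaves its loading state while the request is still in flight. A common way to sequence this is to clear the flag in `finally`. This is a generic sketch, not the project's code: `request` and `setLoading` are stand-in parameters.

```typescript
// Generic sketch: keep a loading flag true for the full lifetime of
// an async request, clearing it only once the request settles,
// whether it resolved or threw.
async function submitWithLoading(
  request: () => Promise<void>,
  setLoading: (v: boolean) => void
): Promise<void> {
  setLoading(true);
  try {
    await request();
  } finally {
    setLoading(false); // runs after the request settles, success or failure
  }
}
```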


@@ -54,7 +54,7 @@ export default function LoginForm() {
           className="w-full mx-auto fg bg rounded p-2"
           onChange={(e) => setPassword(e.target.value)}
         />
-        <div className="flex gap-2">
+        <div className="flex gap-2 items-center">
           <input
             type="checkbox"
             name="koito-remember"


@@ -19,7 +19,7 @@ interface Props {
 }

 export default function MergeModal(props: Props) {
-  const [query, setQuery] = useState("");
+  const [query, setQuery] = useState(props.currentTitle);
   const [data, setData] = useState<SearchResponse>();
   const [debouncedQuery, setDebouncedQuery] = useState(query);
   const [mergeTarget, setMergeTarget] = useState<{ title: string; id: number }>(
@@ -101,11 +101,12 @@ export default function MergeModal(props: Props) {
           <input
             type="text"
             autoFocus
+            defaultValue={props.currentTitle}
             // i find my stupid a(n) logic to be a little silly so im leaving it in even if its not optimal
-            placeholder={`Search for a${
-              props.type.toLowerCase()[0] === "a" ? "n" : ""
-            } ${props.type.toLowerCase()} to be merged into the current ${props.type.toLowerCase()}`}
+            placeholder={`Search for a${props.type.toLowerCase()[0] === "a" ? "n" : ""
+              } ${props.type.toLowerCase()} to be merged into the current ${props.type.toLowerCase()}`}
             className="w-full mx-auto fg bg rounded p-2"
+            onFocus={(e) => { setQuery(e.target.value); e.target.select()}}
             onChange={(e) => setQuery(e.target.value)}
           />
           <SearchResults selectorMode data={data} onSelect={toggleSelect} />
@@ -128,7 +129,7 @@ export default function MergeModal(props: Props) {
           >
             Merge Items
           </button>
-          <div className="flex gap-2 mt-3">
+          <div className="flex items-center gap-2 mt-3">
            <input
              type="checkbox"
              name="reverse-merge-order"
@@ -139,7 +140,7 @@ export default function MergeModal(props: Props) {
           </div>
           {(props.type.toLowerCase() === "album" ||
             props.type.toLowerCase() === "artist") && (
-            <div className="flex gap-2 mt-3">
+            <div className="flex items-center gap-2 mt-3">
               <input
                 type="checkbox"
                 name="replace-image"
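
The placeholder above prefixes "an" when the item type starts with "a" ("album", "artist") and "a" otherwise ("track") — the "a(n) logic" the inline comment jokes about. MergeModal inlines the expression; extracting it as a helper makes the rule easier to see. The function name here is illustrative, not from the diff.

```typescript
// Mirrors MergeModal's inline article choice: "an" before a type
// whose lowercase name starts with "a", "a" otherwise. It only
// special-cases "a", which happens to cover every type the modal
// handles ("album", "artist", "track").
function indefiniteArticle(noun: string): string {
  return noun.toLowerCase()[0] === "a" ? "an" : "a";
}

indefiniteArticle("Album"); // → "an"
indefiniteArticle("track"); // → "a"
```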


@@ -32,10 +32,34 @@ export function Modal({
     }
   }, [isOpen, shouldRender]);

-  // Close on Escape key
+  // Handle keyboard events
   useEffect(() => {
     const handleKeyDown = (e: KeyboardEvent) => {
-      if (e.key === 'Escape') onClose();
+      // Close on Escape key
+      if (e.key === 'Escape') {
+        onClose()
+      // Trap tab navigation to the modal
+      } else if (e.key === 'Tab') {
+        if (modalRef.current) {
+          const focusableEls = modalRef.current.querySelectorAll<HTMLElement>(
+            'button:not(:disabled), [href], input:not(:disabled), select:not(:disabled), textarea:not(:disabled), [tabindex]:not([tabindex="-1"])'
+          );
+          const firstEl = focusableEls[0];
+          const lastEl = focusableEls[focusableEls.length - 1];
+          const activeEl = document.activeElement
+          if (e.shiftKey && activeEl === firstEl) {
+            e.preventDefault();
+            lastEl.focus();
+          } else if (!e.shiftKey && activeEl === lastEl) {
+            e.preventDefault();
+            firstEl.focus();
+          } else if (!Array.from(focusableEls).find(node => node.isEqualNode(activeEl))) {
+            e.preventDefault();
+            firstEl.focus();
+          }
+        }
+      }
     };
     if (isOpen) document.addEventListener('keydown', handleKeyDown);
     return () => document.removeEventListener('keydown', handleKeyDown);
@@ -70,13 +94,13 @@ export function Modal({
         }`}
         style={{ maxWidth: maxW ?? 600, height: h ?? '' }}
       >
-        {children}
         <button
           onClick={onClose}
           className="absolute top-2 right-2 color-fg-tertiary hover:cursor-pointer"
         >
           🞪
         </button>
+        {children}
       </div>
     </div>,
     document.body
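
The Tab-trapping branch added to the Modal handler above reduces to a small decision: given how many focusable elements the modal contains, which one (if any) should receive focus when Tab is pressed. The sketch below models that decision as a pure function over indices so it can be reasoned about without a DOM; the name and signature are illustrative, not from the diff.

```typescript
// Pure model of the modal's Tab trap: return the index that should
// receive focus, or null to let the browser's normal tab order run.
// activeIndex is the focused element's position among the modal's
// focusable elements, or -1 if focus has escaped the modal.
function trapTabTarget(
  count: number,
  activeIndex: number,
  shiftKey: boolean
): number | null {
  if (count === 0) return null;
  const first = 0;
  const last = count - 1;
  if (shiftKey && activeIndex === first) return last; // wrap backwards
  if (!shiftKey && activeIndex === last) return first; // wrap forwards
  if (activeIndex === -1) return first; // recapture escaped focus
  return null; // browser handles the rest
}
```

In the real handler, a non-null result corresponds to calling `e.preventDefault()` and focusing the chosen element.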


@@ -8,9 +8,16 @@ interface Props {
 }

 export default function Rewind(props: Props) {
-  const artistimg = props.stats.top_artists[0].image;
-  const albumimg = props.stats.top_albums[0].image;
-  const trackimg = props.stats.top_tracks[0].image;
+  const artistimg = props.stats.top_artists[0]?.item.image;
+  const albumimg = props.stats.top_albums[0]?.item.image;
+  const trackimg = props.stats.top_tracks[0]?.item.image;
+
+  if (
+    !props.stats.top_artists[0] ||
+    !props.stats.top_albums[0] ||
+    !props.stats.top_tracks[0]
+  ) {
+    return <p>Not enough data exists to create a Rewind for this period :(</p>;
+  }

   return (
     <div className="flex flex-col gap-7">
       <h2>{props.stats.title}</h2>


@@ -1,7 +1,9 @@
+import type { Ranked } from "api/api";
+
 type TopItemProps<T> = {
   title: string;
   imageSrc: string;
-  items: T[];
+  items: Ranked<T>[];
   getLabel: (item: T) => string;
   includeTime?: boolean;
 };
@@ -28,23 +30,23 @@ export function RewindTopItem<
       <div className="flex items-center gap-2">
         <div className="flex flex-col items-start mb-2">
-          <h2>{getLabel(top)}</h2>
+          <h2>{getLabel(top.item)}</h2>
           <span className="text-(--color-fg-tertiary) -mt-3 text-sm">
-            {`${top.listen_count} plays`}
+            {`${top.item.listen_count} plays`}
             {includeTime
-              ? ` (${Math.floor(top.time_listened / 60)} minutes)`
+              ? ` (${Math.floor(top.item.time_listened / 60)} minutes)`
               : ``}
           </span>
         </div>
       </div>
       {rest.map((e) => (
-        <div key={e.id} className="text-sm">
-          {getLabel(e)}
+        <div key={e.item.id} className="text-sm">
+          {getLabel(e.item)}
           <span className="text-(--color-fg-tertiary)">
-            {` - ${e.listen_count} plays`}
+            {` - ${e.item.listen_count} plays`}
             {includeTime
-              ? ` (${Math.floor(e.time_listened / 60)} minutes)`
+              ? ` (${Math.floor(e.item.time_listened / 60)} minutes)`
               : ``}
           </span>
         </div>


@@ -2,7 +2,7 @@ import { ExternalLink, History, Home, Info } from "lucide-react";
 import SidebarSearch from "./SidebarSearch";
 import SidebarItem from "./SidebarItem";
 import SidebarSettings from "./SidebarSettings";
-import { getRewindYear } from "~/utils/utils";
+import { getRewindParams, getRewindYear } from "~/utils/utils";

 export default function Sidebar() {
   const iconSize = 20;
@@ -45,7 +45,7 @@ export default function Sidebar() {
       <SidebarSearch size={iconSize} />
       <SidebarItem
         space={10}
-        to={`/rewind?year=${getRewindYear()}`}
+        to="/rewind"
         name="Rewind"
         onClick={() => {}}
         modal={<></>}


@@ -1,23 +1,43 @@
 import type { Theme } from "~/styles/themes.css";

 interface Props {
-  theme: Theme
-  themeName: string
-  setTheme: Function
-}
+  theme: Theme;
+  themeName: string;
+  setTheme: Function;
+}

 export default function ThemeOption({ theme, themeName, setTheme }: Props) {
-
-  const capitalizeFirstLetter = (s: string) => {
-    return s.charAt(0).toUpperCase() + s.slice(1);
-  }
+  const capitalizeFirstLetter = (s: string) => {
+    return s.charAt(0).toUpperCase() + s.slice(1);
+  };

-  return (
-    <div onClick={() => setTheme(themeName)} className="rounded-md p-3 sm:p-5 hover:cursor-pointer flex gap-4 items-center border-2" style={{background: theme.bg, color: theme.fg, borderColor: theme.bgSecondary}}>
-      <div className="text-xs sm:text-sm">{capitalizeFirstLetter(themeName)}</div>
-      <div className="w-[50px] h-[30px] rounded-md" style={{background: theme.bgSecondary}}></div>
-      <div className="w-[50px] h-[30px] rounded-md" style={{background: theme.fgSecondary}}></div>
-      <div className="w-[50px] h-[30px] rounded-md" style={{background: theme.primary}}></div>
-    </div>
-  )
-}
+  return (
+    <div
+      onClick={() => setTheme(themeName)}
+      className="rounded-md p-3 sm:p-5 hover:cursor-pointer flex gap-3 items-center border-2 justify-between"
+      style={{
+        background: theme.bg,
+        color: theme.fg,
+        borderColor: theme.bgSecondary,
+      }}
+    >
+      <div className="text-xs sm:text-sm">
+        {capitalizeFirstLetter(themeName)}
+      </div>
+      <div className="flex gap-2 w-full">
+        <div
+          className="w-2/7 max-w-[50px] h-[30px] rounded-md"
+          style={{ background: theme.bgSecondary }}
+        ></div>
+        <div
+          className="w-2/7 max-w-[50px] h-[30px] rounded-md"
+          style={{ background: theme.fgSecondary }}
+        ></div>
+        <div
+          className="w-2/7 max-w-[50px] h-[30px] rounded-md"
+          style={{ background: theme.primary }}
+        ></div>
+      </div>
+    </div>
+  );
+}


@@ -49,7 +49,7 @@ export function ThemeSwitcher() {
         <AsyncButton onClick={resetTheme}>Reset</AsyncButton>
       </div>
     </div>
-    <div className="grid grid-cols-2 items-center gap-2">
+    <div className="grid grid-cols-1 sm:grid-cols-2 items-center gap-2">
       {Object.entries(themes).map(([name, themeData]) => (
         <ThemeOption
           setTheme={setTheme}


@@ -9,16 +9,19 @@ import {
 } from "react-router";

 import type { Route } from "./+types/root";
-import './themes.css'
+import "./themes.css";
 import "./app.css";
 import { QueryClient, QueryClientProvider } from "@tanstack/react-query";
-import { ThemeProvider } from './providers/ThemeProvider';
+import { ThemeProvider } from "./providers/ThemeProvider";
 import Sidebar from "./components/sidebar/Sidebar";
 import Footer from "./components/Footer";
 import { AppProvider } from "./providers/AppProvider";
+import { initTimezoneCookie } from "./tz";
+
+initTimezoneCookie();

 // Create a client
-const queryClient = new QueryClient()
+const queryClient = new QueryClient();

 export const links: Route.LinksFunction = () => [
   { rel: "preconnect", href: "https://fonts.googleapis.com" },
@@ -35,14 +38,23 @@ export const links: Route.LinksFunction = () => [
 export function Layout({ children }: { children: React.ReactNode }) {
   return (
-    <html lang="en" style={{backgroundColor: 'black'}}>
+    <html lang="en" style={{ backgroundColor: "black" }}>
       <head>
         <meta charSet="utf-8" />
         <meta name="viewport" content="width=device-width, initial-scale=1" />
-        <link rel="icon" type="image/png" href="/favicon-96x96.png" sizes="96x96" />
+        <link
+          rel="icon"
+          type="image/png"
+          href="/favicon-96x96.png"
+          sizes="96x96"
+        />
         <link rel="icon" type="image/svg+xml" href="/favicon.svg" />
         <link rel="shortcut icon" href="/favicon.ico" />
-        <link rel="apple-touch-icon" sizes="180x180" href="/apple-touch-icon.png" />
+        <link
+          rel="apple-touch-icon"
+          sizes="180x180"
+          href="/apple-touch-icon.png"
+        />
         <meta name="apple-mobile-web-app-title" content="Koito" />
         <link rel="manifest" href="/site.webmanifest" />
         <Meta />
@@ -60,71 +72,71 @@ export function Layout({ children }: { children: React.ReactNode }) {
 export default function App() {
   return (
     <>
       <AppProvider>
         <ThemeProvider>
           <QueryClientProvider client={queryClient}>
             <div className="flex-col flex sm:flex-row">
               <Sidebar />
               <div className="flex flex-col items-center mx-auto w-full ml-0 sm:ml-[40px]">
                 <Outlet />
                 <Footer />
               </div>
             </div>
           </QueryClientProvider>
         </ThemeProvider>
       </AppProvider>
     </>
   );
 }

 export function HydrateFallback() {
-  return null
+  return null;
 }

 export function ErrorBoundary() {
   const error = useRouteError();
   let message = "Oops!";
   let details = "An unexpected error occurred.";
   let stack: string | undefined;

   if (isRouteErrorResponse(error)) {
     message = error.status === 404 ? "404" : "Error";
-    details = error.status === 404
+    details =
+      error.status === 404
         ? "The requested page could not be found."
         : error.statusText || details;
   } else if (import.meta.env.DEV && error instanceof Error) {
     details = error.message;
     stack = error.stack;
   }

-  const title = `${message} - Koito`
-
-  return (
-    <AppProvider>
-      <ThemeProvider>
-        <title>{title}</title>
-        <div className="flex">
-          <Sidebar />
-          <div className="w-full flex flex-col">
-            <main className="pt-16 p-4 container mx-auto flex-grow">
-              <div className="flex gap-4 items-end">
-                <img className="w-[200px] rounded" src="../yuu.jpg" />
-                <div>
-                  <h1>{message}</h1>
-                  <p>{details}</p>
-                </div>
-              </div>
-              {stack && (
-                <pre className="w-full p-4 overflow-x-auto">
-                  <code>{stack}</code>
-                </pre>
-              )}
-            </main>
-            <Footer />
-          </div>
-        </div>
-      </ThemeProvider>
-    </AppProvider>
-  );
+  const title = `${message} - Koito`;
+  return (
+    <AppProvider>
+      <ThemeProvider>
+        <title>{title}</title>
+        <Sidebar />
+        <div className="flex">
+          <div className="w-full flex flex-col">
+            <main className="pt-16 p-4 mx-auto flex-grow">
+              <div className="md:flex gap-4">
+                <img className="w-[200px] rounded mb-3" src="../yuu.jpg" />
+                <div>
+                  <h1>{message}</h1>
+                  <p>{details}</p>
+                </div>
+              </div>
+              {stack && (
+                <pre className="w-full p-4 overflow-x-auto">
+                  <code>{stack}</code>
+                </pre>
+              )}
+            </main>
+            <Footer />
+          </div>
+        </div>
+      </ThemeProvider>
+    </AppProvider>
+  );
 }


@ -1,12 +1,12 @@
 import TopItemList from "~/components/TopItemList";
 import ChartLayout from "./ChartLayout";
 import { useLoaderData, type LoaderFunctionArgs } from "react-router";
-import { type Album, type PaginatedResponse } from "api/api";
+import { type Album, type PaginatedResponse, type Ranked } from "api/api";

 export async function clientLoader({ request }: LoaderFunctionArgs) {
   const url = new URL(request.url);
   const page = url.searchParams.get("page") || "0";
-  url.searchParams.set('page', page)
+  url.searchParams.set("page", page);
   const res = await fetch(
     `/apis/web/v1/top-albums?${url.searchParams.toString()}`
@@ -20,7 +20,9 @@ export async function clientLoader({ request }: LoaderFunctionArgs) {
 }

 export default function AlbumChart() {
-  const { top_albums: initialData } = useLoaderData<{ top_albums: PaginatedResponse<Album> }>();
+  const { top_albums: initialData } = useLoaderData<{
+    top_albums: PaginatedResponse<Ranked<Album>>;
+  }>();

   return (
     <ChartLayout
@@ -28,26 +30,35 @@ export default function AlbumChart() {
       initialData={initialData}
       endpoint="chart/top-albums"
       render={({ data, page, onNext, onPrev }) => (
-        <div className="flex flex-col gap-5">
+        <div className="flex flex-col gap-5 w-full">
           <div className="flex gap-15 mx-auto">
             <button className="default" onClick={onPrev} disabled={page <= 1}>
               Prev
             </button>
-            <button className="default" onClick={onNext} disabled={!data.has_next_page}>
-              Next
-            </button>
+            <button
+              className="default"
+              onClick={onNext}
+              disabled={!data.has_next_page}
+            >
+              Next
+            </button>
           </div>
           <TopItemList
+            ranked
             separators
             data={data}
-            className="w-[400px] sm:w-[600px]"
+            className="w-11/12 sm:w-[600px]"
             type="album"
           />
           <div className="flex gap-15 mx-auto">
             <button className="default" onClick={onPrev} disabled={page === 0}>
               Prev
             </button>
-            <button className="default" onClick={onNext} disabled={!data.has_next_page}>
-              Next
-            </button>
+            <button
+              className="default"
+              onClick={onNext}
+              disabled={!data.has_next_page}
+            >
+              Next
+            </button>
           </div>
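The chart pages above now expect each paginated item wrapped in a ranking envelope: the loaders type their payloads as `PaginatedResponse<Ranked<Album>>`, and `ChartLayout` reaches the artwork through `(data.items[0] as any)?.item?.image`. The real `Ranked` type lives in `api/api` and is not shown in this diff; a minimal sketch of the assumed shape:

```typescript
// Hypothetical shapes; the real definitions live in api/api.
interface Ranked<T> {
  rank: number; // 1-based chart position
  item: T;
}

interface PaginatedResponse<T> {
  items: T[];
  has_next_page: boolean;
}

// Mirrors how ChartLayout reads the first item's artwork after the change.
function firstImage<T extends { image: string }>(
  page: PaginatedResponse<Ranked<T>>
): string | undefined {
  return page.items[0]?.item?.image;
}

const examplePage: PaginatedResponse<Ranked<{ image: string }>> = {
  items: [{ rank: 1, item: { image: "abc123" } }],
  has_next_page: false,
};
```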


@@ -1,12 +1,12 @@
 import TopItemList from "~/components/TopItemList";
 import ChartLayout from "./ChartLayout";
 import { useLoaderData, type LoaderFunctionArgs } from "react-router";
-import { type Album, type PaginatedResponse } from "api/api";
+import { type Album, type PaginatedResponse, type Ranked } from "api/api";

 export async function clientLoader({ request }: LoaderFunctionArgs) {
   const url = new URL(request.url);
   const page = url.searchParams.get("page") || "0";
-  url.searchParams.set('page', page)
+  url.searchParams.set("page", page);
   const res = await fetch(
     `/apis/web/v1/top-artists?${url.searchParams.toString()}`
@@ -20,7 +20,9 @@ export async function clientLoader({ request }: LoaderFunctionArgs) {
 }

 export default function Artist() {
-  const { top_artists: initialData } = useLoaderData<{ top_artists: PaginatedResponse<Album> }>();
+  const { top_artists: initialData } = useLoaderData<{
+    top_artists: PaginatedResponse<Ranked<Album>>;
+  }>();

   return (
     <ChartLayout
@@ -28,26 +30,35 @@ export default function Artist() {
       initialData={initialData}
       endpoint="chart/top-artists"
       render={({ data, page, onNext, onPrev }) => (
-        <div className="flex flex-col gap-5">
+        <div className="flex flex-col gap-5 w-full">
           <div className="flex gap-15 mx-auto">
             <button className="default" onClick={onPrev} disabled={page <= 1}>
               Prev
             </button>
-            <button className="default" onClick={onNext} disabled={!data.has_next_page}>
-              Next
-            </button>
+            <button
+              className="default"
+              onClick={onNext}
+              disabled={!data.has_next_page}
+            >
+              Next
+            </button>
           </div>
           <TopItemList
+            ranked
             separators
             data={data}
-            className="w-[400px] sm:w-[600px]"
+            className="w-11/12 sm:w-[600px]"
             type="artist"
           />
           <div className="flex gap-15 mx-auto">
             <button className="default" onClick={onPrev} disabled={page <= 1}>
               Prev
             </button>
-            <button className="default" onClick={onNext} disabled={!data.has_next_page}>
-              Next
-            </button>
+            <button
+              className="default"
+              onClick={onNext}
+              disabled={!data.has_next_page}
+            >
+              Next
+            </button>
           </div>


@@ -1,264 +1,272 @@
-import {
-    useFetcher,
-    useLocation,
-    useNavigate,
-} from "react-router"
-import { useEffect, useState } from "react"
-import { average } from "color.js"
-import { imageUrl, type PaginatedResponse } from "api/api"
-import PeriodSelector from "~/components/PeriodSelector"
-
-interface ChartLayoutProps<T> {
-    title: "Top Albums" | "Top Tracks" | "Top Artists" | "Last Played"
-    initialData: PaginatedResponse<T>
-    endpoint: string
-    render: (opts: {
-        data: PaginatedResponse<T>
-        page: number
-        onNext: () => void
-        onPrev: () => void
-    }) => React.ReactNode
-}
-
-export default function ChartLayout<T>({
-    title,
-    initialData,
-    endpoint,
-    render,
-}: ChartLayoutProps<T>) {
-    const pgTitle = `${title} - Koito`
-    const fetcher = useFetcher()
-    const location = useLocation()
-    const navigate = useNavigate()
-    const currentParams = new URLSearchParams(location.search)
-    const currentPage = parseInt(currentParams.get("page") || "1", 10)
-    const data: PaginatedResponse<T> = fetcher.data?.[endpoint]
-        ? fetcher.data[endpoint]
-        : initialData
-    const [bgColor, setBgColor] = useState<string>("(--color-bg)")
-
-    useEffect(() => {
-        if ((data?.items?.length ?? 0) === 0) return
-        const img = (data.items[0] as any)?.image
-        if (!img) return
-        average(imageUrl(img, "small"), { amount: 1 }).then((color) => {
-            setBgColor(`rgba(${color[0]},${color[1]},${color[2]},0.4)`)
-        })
-    }, [data])
-
-    const period = currentParams.get("period") ?? "day"
-    const year = currentParams.get("year")
-    const month = currentParams.get("month")
-    const week = currentParams.get("week")
-
-    const updateParams = (params: Record<string, string | null>) => {
-        const nextParams = new URLSearchParams(location.search)
-        for (const key in params) {
-            const val = params[key]
-            if (val !== null) {
-                nextParams.set(key, val)
-            } else {
-                nextParams.delete(key)
-            }
-        }
-        const url = `/${endpoint}?${nextParams.toString()}`
-        navigate(url, { replace: false })
-    }
-
-    const handleSetPeriod = (p: string) => {
-        updateParams({
-            period: p,
-            page: "1",
-            year: null,
-            month: null,
-            week: null,
-        })
-    }
-
-    const handleSetYear = (val: string) => {
-        if (val == "") {
-            updateParams({
-                period: period,
-                page: "1",
-                year: null,
-                month: null,
-                week: null
-            })
-            return
-        }
-        updateParams({
-            period: null,
-            page: "1",
-            year: val,
-        })
-    }
-
-    const handleSetMonth = (val: string) => {
-        updateParams({
-            period: null,
-            page: "1",
-            year: year ?? new Date().getFullYear().toString(),
-            month: val,
-        })
-    }
-
-    const handleSetWeek = (val: string) => {
-        updateParams({
-            period: null,
-            page: "1",
-            year: year ?? new Date().getFullYear().toString(),
-            month: null,
-            week: val,
-        })
-    }
-
-    useEffect(() => {
-        fetcher.load(`/${endpoint}?${currentParams.toString()}`)
-    }, [location.search])
-
-    const setPage = (nextPage: number) => {
-        const nextParams = new URLSearchParams(location.search)
-        nextParams.set("page", String(nextPage))
-        const url = `/${endpoint}?${nextParams.toString()}`
-        fetcher.load(url)
-        navigate(url, { replace: false })
-    }
-
-    const handleNextPage = () => setPage(currentPage + 1)
-    const handlePrevPage = () => setPage(currentPage - 1)
-
-    const yearOptions = Array.from({ length: 10 }, (_, i) => `${new Date().getFullYear() - i}`)
-    const monthOptions = Array.from({ length: 12 }, (_, i) => `${i + 1}`)
-    const weekOptions = Array.from({ length: 53 }, (_, i) => `${i + 1}`)
-
-    const getDateRange = (): string => {
-        let from: Date
-        let to: Date
-
-        const now = new Date()
-        const currentYear = now.getFullYear()
-        const currentMonth = now.getMonth() // 0-indexed
-        const currentDate = now.getDate()
-
-        if (year && month) {
-            from = new Date(parseInt(year), parseInt(month) - 1, 1)
-            to = new Date(from)
-            to.setMonth(from.getMonth() + 1)
-            to.setDate(0)
-        } else if (year && week) {
-            const base = new Date(parseInt(year), 0, 1) // Jan 1 of the year
-            const weekNumber = parseInt(week)
-            from = new Date(base)
-            from.setDate(base.getDate() + (weekNumber - 1) * 7)
-            to = new Date(from)
-            to.setDate(from.getDate() + 6)
-        } else if (year) {
-            from = new Date(parseInt(year), 0, 1)
-            to = new Date(parseInt(year), 11, 31)
-        } else {
-            switch (period) {
-                case "day":
-                    from = new Date(now)
-                    to = new Date(now)
-                    break
-                case "week":
-                    to = new Date(now)
-                    from = new Date(now)
-                    from.setDate(to.getDate() - 6)
-                    break
-                case "month":
-                    to = new Date(now)
-                    from = new Date(now)
-                    if (currentMonth === 0) {
-                        from = new Date(currentYear - 1, 11, currentDate)
-                    } else {
-                        from = new Date(currentYear, currentMonth - 1, currentDate)
-                    }
-                    break
-                case "year":
-                    to = new Date(now)
-                    from = new Date(currentYear - 1, currentMonth, currentDate)
-                    break
-                case "all_time":
-                    return "All Time"
-                default:
-                    return ""
-            }
-        }
-
-        const formatter = new Intl.DateTimeFormat(undefined, {
-            year: "numeric",
-            month: "long",
-            day: "numeric",
-        })
-        return `${formatter.format(from)} - ${formatter.format(to)}`
-    }
-
-    return (
-        <div
-            className="w-full min-h-screen"
-            style={{
-                background: `linear-gradient(to bottom, ${bgColor}, var(--color-bg) 500px)`,
-                transition: "1000",
-            }}
-        >
-            <title>{pgTitle}</title>
-            <meta property="og:title" content={pgTitle} />
-            <meta name="description" content={pgTitle} />
-            <div className="w-19/20 sm:17/20 mx-auto pt-6 sm:pt-12">
-                <h1>{title}</h1>
-                <div className="flex flex-col items-start md:flex-row sm:items-center gap-4">
-                    <PeriodSelector current={period} setter={handleSetPeriod} disableCache />
-                    <div className="flex gap-5">
-                        <select
-                            value={year ?? ""}
-                            onChange={(e) => handleSetYear(e.target.value)}
-                            className="px-2 py-1 rounded border border-gray-400"
-                        >
-                            <option value="">Year</option>
-                            {yearOptions.map((y) => (
-                                <option key={y} value={y}>{y}</option>
-                            ))}
-                        </select>
-                        <select
-                            value={month ?? ""}
-                            onChange={(e) => handleSetMonth(e.target.value)}
-                            className="px-2 py-1 rounded border border-gray-400"
-                        >
-                            <option value="">Month</option>
-                            {monthOptions.map((m) => (
-                                <option key={m} value={m}>{m}</option>
-                            ))}
-                        </select>
-                        <select
-                            value={week ?? ""}
-                            onChange={(e) => handleSetWeek(e.target.value)}
-                            className="px-2 py-1 rounded border border-gray-400"
-                        >
-                            <option value="">Week</option>
-                            {weekOptions.map((w) => (
-                                <option key={w} value={w}>{w}</option>
-                            ))}
-                        </select>
-                    </div>
-                </div>
-                <p className="mt-2 text-sm text-color-fg-secondary">{getDateRange()}</p>
-                <div className="mt-10 sm:mt-20 flex mx-auto justify-between">
-                    {render({
-                        data,
-                        page: currentPage,
-                        onNext: handleNextPage,
-                        onPrev: handlePrevPage,
-                    })}
-                </div>
-            </div>
-        </div>
-    )
-}
+import { useFetcher, useLocation, useNavigate } from "react-router";
+import { useEffect, useState } from "react";
+import { average } from "color.js";
+import { imageUrl, type PaginatedResponse } from "api/api";
+import PeriodSelector from "~/components/PeriodSelector";
+
+interface ChartLayoutProps<T> {
+  title: "Top Albums" | "Top Tracks" | "Top Artists" | "Last Played";
+  initialData: PaginatedResponse<T>;
+  endpoint: string;
+  render: (opts: {
+    data: PaginatedResponse<T>;
+    page: number;
+    onNext: () => void;
+    onPrev: () => void;
+  }) => React.ReactNode;
+}
+
+export default function ChartLayout<T>({
+  title,
+  initialData,
+  endpoint,
+  render,
+}: ChartLayoutProps<T>) {
+  const pgTitle = `${title} - Koito`;
+  const fetcher = useFetcher();
+  const location = useLocation();
+  const navigate = useNavigate();
+  const currentParams = new URLSearchParams(location.search);
+  const currentPage = parseInt(currentParams.get("page") || "1", 10);
+  const data: PaginatedResponse<T> = fetcher.data?.[endpoint]
+    ? fetcher.data[endpoint]
+    : initialData;
+  const [bgColor, setBgColor] = useState<string>("(--color-bg)");
+
+  useEffect(() => {
+    if ((data?.items?.length ?? 0) === 0) return;
+    const img = (data.items[0] as any)?.item?.image;
+    if (!img) return;
+    average(imageUrl(img, "small"), { amount: 1 }).then((color) => {
+      setBgColor(`rgba(${color[0]},${color[1]},${color[2]},0.4)`);
+    });
+  }, [data]);
+
+  const period = currentParams.get("period") ?? "day";
+  const year = currentParams.get("year");
+  const month = currentParams.get("month");
+  const week = currentParams.get("week");
+
+  const updateParams = (params: Record<string, string | null>) => {
+    const nextParams = new URLSearchParams(location.search);
+    for (const key in params) {
+      const val = params[key];
+      if (val !== null) {
+        nextParams.set(key, val);
+      } else {
+        nextParams.delete(key);
+      }
+    }
+    const url = `/${endpoint}?${nextParams.toString()}`;
+    navigate(url, { replace: false });
+  };
+
+  const handleSetPeriod = (p: string) => {
+    updateParams({
+      period: p,
+      page: "1",
+      year: null,
+      month: null,
+      week: null,
+    });
+  };
+
+  const handleSetYear = (val: string) => {
+    if (val == "") {
+      updateParams({
+        period: period,
+        page: "1",
+        year: null,
+        month: null,
+        week: null,
+      });
+      return;
+    }
+    updateParams({
+      period: null,
+      page: "1",
+      year: val,
+    });
+  };
+
+  const handleSetMonth = (val: string) => {
+    updateParams({
+      period: null,
+      page: "1",
+      year: year ?? new Date().getFullYear().toString(),
+      month: val,
+    });
+  };
+
+  const handleSetWeek = (val: string) => {
+    updateParams({
+      period: null,
+      page: "1",
+      year: year ?? new Date().getFullYear().toString(),
+      month: null,
+      week: val,
+    });
+  };
+
+  useEffect(() => {
+    fetcher.load(`/${endpoint}?${currentParams.toString()}`);
+  }, [location.search]);
+
+  const setPage = (nextPage: number) => {
+    const nextParams = new URLSearchParams(location.search);
+    nextParams.set("page", String(nextPage));
+    const url = `/${endpoint}?${nextParams.toString()}`;
+    fetcher.load(url);
+    navigate(url, { replace: false });
+  };
+
+  const handleNextPage = () => setPage(currentPage + 1);
+  const handlePrevPage = () => setPage(currentPage - 1);
+
+  const yearOptions = Array.from(
+    { length: 10 },
+    (_, i) => `${new Date().getFullYear() - i}`
+  );
+  const monthOptions = Array.from({ length: 12 }, (_, i) => `${i + 1}`);
+  const weekOptions = Array.from({ length: 53 }, (_, i) => `${i + 1}`);
+
+  const getDateRange = (): string => {
+    let from: Date;
+    let to: Date;
+
+    const now = new Date();
+    const currentYear = now.getFullYear();
+    const currentMonth = now.getMonth(); // 0-indexed
+    const currentDate = now.getDate();
+
+    if (year && month) {
+      from = new Date(parseInt(year), parseInt(month) - 1, 1);
+      to = new Date(from);
+      to.setMonth(from.getMonth() + 1);
+      to.setDate(0);
+    } else if (year && week) {
+      const base = new Date(parseInt(year), 0, 1); // Jan 1 of the year
+      const weekNumber = parseInt(week);
+      from = new Date(base);
+      from.setDate(base.getDate() + (weekNumber - 1) * 7);
+      to = new Date(from);
+      to.setDate(from.getDate() + 6);
+    } else if (year) {
+      from = new Date(parseInt(year), 0, 1);
+      to = new Date(parseInt(year), 11, 31);
+    } else {
+      switch (period) {
+        case "day":
+          from = new Date(now);
+          to = new Date(now);
+          break;
+        case "week":
+          to = new Date(now);
+          from = new Date(now);
+          from.setDate(to.getDate() - 6);
+          break;
+        case "month":
+          to = new Date(now);
+          from = new Date(now);
+          if (currentMonth === 0) {
+            from = new Date(currentYear - 1, 11, currentDate);
+          } else {
+            from = new Date(currentYear, currentMonth - 1, currentDate);
+          }
+          break;
+        case "year":
+          to = new Date(now);
+          from = new Date(currentYear - 1, currentMonth, currentDate);
+          break;
+        case "all_time":
+          return "All Time";
+        default:
+          return "";
+      }
+    }
+
+    const formatter = new Intl.DateTimeFormat(undefined, {
+      year: "numeric",
+      month: "long",
+      day: "numeric",
+    });
+    return `${formatter.format(from)} - ${formatter.format(to)}`;
+  };
+
+  return (
+    <div
+      className="w-full min-h-screen"
+      style={{
+        background: `linear-gradient(to bottom, ${bgColor}, var(--color-bg) 500px)`,
+        transition: "1000",
+      }}
+    >
+      <title>{pgTitle}</title>
+      <meta property="og:title" content={pgTitle} />
+      <meta name="description" content={pgTitle} />
+      <div className="w-19/20 sm:17/20 mx-auto pt-6 sm:pt-12">
+        <h1>{title}</h1>
+        <div className="flex flex-col items-start md:flex-row sm:items-center gap-4">
+          <PeriodSelector
+            current={period}
+            setter={handleSetPeriod}
+            disableCache
+          />
+          <div className="flex gap-5">
+            <select
+              value={year ?? ""}
+              onChange={(e) => handleSetYear(e.target.value)}
+              className="px-2 py-1 rounded border border-gray-400"
+            >
+              <option value="">Year</option>
+              {yearOptions.map((y) => (
+                <option key={y} value={y}>
+                  {y}
+                </option>
+              ))}
+            </select>
+            <select
+              value={month ?? ""}
+              onChange={(e) => handleSetMonth(e.target.value)}
+              className="px-2 py-1 rounded border border-gray-400"
+            >
+              <option value="">Month</option>
+              {monthOptions.map((m) => (
+                <option key={m} value={m}>
+                  {m}
+                </option>
+              ))}
+            </select>
+            <select
+              value={week ?? ""}
+              onChange={(e) => handleSetWeek(e.target.value)}
+              className="px-2 py-1 rounded border border-gray-400"
+            >
+              <option value="">Week</option>
+              {weekOptions.map((w) => (
+                <option key={w} value={w}>
+                  {w}
+                </option>
+              ))}
+            </select>
+          </div>
+        </div>
+        <p className="mt-2 text-sm text-color-fg-secondary">{getDateRange()}</p>
+        <div className="mt-10 sm:mt-20 flex mx-auto justify-between">
+          {render({
+            data,
+            page: currentPage,
+            onNext: handleNextPage,
+            onPrev: handlePrevPage,
+          })}
+        </div>
+      </div>
+    </div>
+  );
+}
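The `year && week` branch of `getDateRange` in `ChartLayout` steps in plain 7-day increments from January 1 rather than using ISO week boundaries. The arithmetic, isolated as a pure function (the helper name is illustrative, not from the codebase):

```typescript
// Week N of a year spans Jan 1 + (N - 1) * 7 days through the 6 days after,
// matching the year+week branch of getDateRange.
function weekRange(year: number, week: number): [Date, Date] {
  const from = new Date(year, 0, 1); // Jan 1 of the year
  from.setDate(from.getDate() + (week - 1) * 7);
  const to = new Date(from);
  to.setDate(from.getDate() + 6);
  return [from, to];
}
```

`Date.prototype.setDate` rolls over month and year boundaries automatically, which is why week 53 needs no special casing here.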


@@ -1,12 +1,12 @@
 import TopItemList from "~/components/TopItemList";
 import ChartLayout from "./ChartLayout";
 import { useLoaderData, type LoaderFunctionArgs } from "react-router";
-import { type Album, type PaginatedResponse } from "api/api";
+import { type Track, type PaginatedResponse, type Ranked } from "api/api";

 export async function clientLoader({ request }: LoaderFunctionArgs) {
   const url = new URL(request.url);
   const page = url.searchParams.get("page") || "0";
-  url.searchParams.set('page', page)
+  url.searchParams.set("page", page);
   const res = await fetch(
     `/apis/web/v1/top-tracks?${url.searchParams.toString()}`
@@ -15,12 +15,14 @@ export async function clientLoader({ request }: LoaderFunctionArgs) {
     throw new Response("Failed to load top tracks", { status: 500 });
   }

-  const top_tracks: PaginatedResponse<Album> = await res.json();
+  const top_tracks: PaginatedResponse<Track> = await res.json();
   return { top_tracks };
 }

 export default function TrackChart() {
-  const { top_tracks: initialData } = useLoaderData<{ top_tracks: PaginatedResponse<Album> }>();
+  const { top_tracks: initialData } = useLoaderData<{
+    top_tracks: PaginatedResponse<Ranked<Track>>;
+  }>();

   return (
     <ChartLayout
@@ -28,26 +30,35 @@ export default function TrackChart() {
       initialData={initialData}
       endpoint="chart/top-tracks"
       render={({ data, page, onNext, onPrev }) => (
-        <div className="flex flex-col gap-5">
+        <div className="flex flex-col gap-5 w-full">
           <div className="flex gap-15 mx-auto">
             <button className="default" onClick={onPrev} disabled={page <= 1}>
               Prev
             </button>
-            <button className="default" onClick={onNext} disabled={!data.has_next_page}>
-              Next
-            </button>
+            <button
+              className="default"
+              onClick={onNext}
+              disabled={!data.has_next_page}
+            >
+              Next
+            </button>
           </div>
           <TopItemList
+            ranked
             separators
             data={data}
-            className="w-[400px] sm:w-[600px]"
+            className="w-11/12 sm:w-[600px]"
             type="track"
           />
           <div className="flex gap-15 mx-auto">
             <button className="default" onClick={onPrev} disabled={page === 0}>
               Prev
             </button>
-            <button className="default" onClick={onNext} disabled={!data.has_next_page}>
-              Next
-            </button>
+            <button
+              className="default"
+              onClick={onNext}
+              disabled={!data.has_next_page}
+            >
+              Next
+            </button>
           </div>
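The three chart pages wire pagination the same way: `ChartLayout` treats pages as 1-indexed (it defaults `?page` to `"1"`), Prev is disabled on the first page, and Next keys off the server-reported `has_next_page` flag (the bottom Prev buttons in the album and track pages still compare against 0, which never disables them on 1-indexed pages). The guards as standalone predicates, hypothetical helpers rather than code from the diff:

```typescript
// Pages are 1-indexed; page 1 is the first page, so Prev is disabled there.
function prevDisabled(page: number): boolean {
  return page <= 1;
}

// Next availability comes straight from the API's has_next_page flag.
function nextDisabled(hasNextPage: boolean): boolean {
  return !hasNextPage;
}
```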


@@ -10,20 +10,17 @@ import PeriodSelector from "~/components/PeriodSelector";
 import { useAppContext } from "~/providers/AppProvider";

 export function meta({}: Route.MetaArgs) {
-  return [
-    { title: "Koito" },
-    { name: "description", content: "Koito" },
-  ];
+  return [{ title: "Koito" }, { name: "description", content: "Koito" }];
 }

 export default function Home() {
-  const [period, setPeriod] = useState('week')
+  const [period, setPeriod] = useState("week");
   const { homeItems } = useAppContext();

   return (
-    <main className="flex flex-grow justify-center pb-4">
-      <div className="flex-1 flex flex-col items-center gap-16 min-h-0 mt-20">
+    <main className="flex flex-grow justify-center pb-4 w-full bg-linear-to-b to-(--color-bg) from-(--color-bg-secondary) to-60%">
+      <div className="flex-1 flex flex-col items-center gap-16 min-h-0 sm:mt-20 mt-10">
         <div className="flex flex-col md:flex-row gap-10 md:gap-20">
           <AllTimeStats />
           <ActivityGrid configurable />
@@ -33,7 +30,10 @@ export default function Home() {
           <TopArtists period={period} limit={homeItems} />
           <TopAlbums period={period} limit={homeItems} />
           <TopTracks period={period} limit={homeItems} />
-          <LastPlays showNowPlaying={true} limit={Math.floor(homeItems * 2.7)} />
+          <LastPlays
+            showNowPlaying={true}
+            limit={Math.floor(homeItems * 2.7)}
+          />
         </div>
       </div>
     </main>


@@ -7,6 +7,7 @@ import PeriodSelector from "~/components/PeriodSelector";
 import MediaLayout from "./MediaLayout";
 import ActivityGrid from "~/components/ActivityGrid";
 import { timeListenedString } from "~/utils/utils";
+import InterestGraph from "~/components/InterestGraph";

 export async function clientLoader({ params }: LoaderFunctionArgs) {
   const res = await fetch(`/apis/web/v1/album?id=${params.id}`);
@@ -29,6 +30,7 @@ export default function Album() {
       title={album.title}
       img={album.image}
       id={album.id}
+      rank={album.all_time_rank}
       musicbrainzId={album.musicbrainz_id}
       imgItemId={album.id}
       mergeFunc={mergeAlbums}
@@ -44,22 +46,22 @@ export default function Album() {
       }}
       subContent={
         <div className="flex flex-col gap-2 items-start">
-          {album.listen_count && (
+          {album.listen_count !== 0 && (
             <p>
               {album.listen_count} play{album.listen_count > 1 ? "s" : ""}
             </p>
           )}
-          {
+          {album.time_listened !== 0 && (
             <p title={Math.floor(album.time_listened / 60 / 60) + " hours"}>
               {timeListenedString(album.time_listened)}
             </p>
-          }
+          )}
-          {
+          {album.first_listen > 0 && (
             <p title={new Date(album.first_listen * 1000).toLocaleString()}>
               Listening since{" "}
               {new Date(album.first_listen * 1000).toLocaleDateString()}
             </p>
-          }
+          )}
         </div>
       }
     >
@@ -69,7 +71,10 @@ export default function Album() {
       <div className="flex flex-wrap gap-20 mt-10">
         <LastPlays limit={30} albumId={album.id} />
         <TopTracks limit={12} period={period} albumId={album.id} />
-        <ActivityGrid configurable albumId={album.id} />
+        <div className="flex flex-col items-start gap-4">
+          <ActivityGrid configurable albumId={album.id} />
+          <InterestGraph albumId={album.id} />
+        </div>
       </div>
     </MediaLayout>
   );
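The change from `{album.listen_count && (...)}` to `{album.listen_count !== 0 && (...)}` on the album page fixes a JSX pitfall: when the left operand of `&&` is the number `0`, React renders the literal `0` instead of nothing. The guard plus the pluralized label, extracted as a plain function for illustration (not a helper in the codebase):

```typescript
// Returns the plays label, or null when the count is zero and nothing
// should render — the same effect as the listen_count !== 0 guard in JSX.
function playsLabel(listenCount: number): string | null {
  if (listenCount === 0) return null;
  return `${listenCount} play${listenCount > 1 ? "s" : ""}`;
}
```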


@@ -8,6 +8,7 @@ import MediaLayout from "./MediaLayout";
 import ArtistAlbums from "~/components/ArtistAlbums";
 import ActivityGrid from "~/components/ActivityGrid";
 import { timeListenedString } from "~/utils/utils";
+import InterestGraph from "~/components/InterestGraph";

 export async function clientLoader({ params }: LoaderFunctionArgs) {
   const res = await fetch(`/apis/web/v1/artist?id=${params.id}`);
@@ -35,6 +36,7 @@ export default function Artist() {
       title={artist.name}
       img={artist.image}
       id={artist.id}
+      rank={artist.all_time_rank}
       musicbrainzId={artist.musicbrainz_id}
       imgItemId={artist.id}
       mergeFunc={mergeArtists}
@@ -55,17 +57,17 @@ export default function Artist() {
           {artist.listen_count} play{artist.listen_count > 1 ? "s" : ""}
         </p>
       )}
-      {
+      {artist.time_listened !== 0 && (
         <p title={Math.floor(artist.time_listened / 60 / 60) + " hours"}>
           {timeListenedString(artist.time_listened)}
         </p>
-      }
+      )}
-      {
+      {artist.first_listen > 0 && (
         <p title={new Date(artist.first_listen * 1000).toLocaleString()}>
           Listening since{" "}
           {new Date(artist.first_listen * 1000).toLocaleDateString()}
         </p>
-      }
+      )}
     </div>
   }
 >
@@ -76,7 +78,10 @@ export default function Artist() {
   <div className="flex gap-15 mt-10 flex-wrap">
     <LastPlays limit={20} artistId={artist.id} />
     <TopTracks limit={8} period={period} artistId={artist.id} />
-    <ActivityGrid configurable artistId={artist.id} />
+    <div className="flex flex-col items-start gap-4">
+      <ActivityGrid configurable artistId={artist.id} />
+      <InterestGraph artistId={artist.id} />
+    </div>
   </div>
   <ArtistAlbums period={period} artistId={artist.id} name={artist.name} />
 </div>


@@ -10,97 +10,200 @@ import DeleteModal from "~/components/modals/DeleteModal";
 import RenameModal from "~/components/modals/EditModal/EditModal";
 import EditModal from "~/components/modals/EditModal/EditModal";
 import AddListenModal from "~/components/modals/AddListenModal";
+import MbzIcon from "~/components/icons/MbzIcon";
+import { Link } from "react-router";

-export type MergeFunc = (from: number, to: number, replaceImage: boolean) => Promise<Response>
-export type MergeSearchCleanerFunc = (r: SearchResponse, id: number) => SearchResponse
+export type MergeFunc = (
+  from: number,
+  to: number,
+  replaceImage: boolean
+) => Promise<Response>;
+export type MergeSearchCleanerFunc = (
+  r: SearchResponse,
+  id: number
+) => SearchResponse;

-interface Props {
-    type: "Track" | "Album" | "Artist"
-    title: string
-    img: string
-    id: number
-    musicbrainzId: string
-    imgItemId: number
-    mergeFunc: MergeFunc
-    mergeCleanerFunc: MergeSearchCleanerFunc
-    children: React.ReactNode
-    subContent: React.ReactNode
-}
+interface Props {
+  type: "Track" | "Album" | "Artist";
+  title: string;
+  img: string;
+  id: number;
+  rank: number;
+  musicbrainzId: string;
+  imgItemId: number;
+  mergeFunc: MergeFunc;
+  mergeCleanerFunc: MergeSearchCleanerFunc;
+  children: React.ReactNode;
+  subContent: React.ReactNode;
+}

-export default function MediaLayout(props: Props) {
-    const [bgColor, setBgColor] = useState<string>("(--color-bg)");
-    const [mergeModalOpen, setMergeModalOpen] = useState(false);
-    const [deleteModalOpen, setDeleteModalOpen] = useState(false);
-    const [imageModalOpen, setImageModalOpen] = useState(false);
-    const [renameModalOpen, setRenameModalOpen] = useState(false);
-    const [addListenModalOpen, setAddListenModalOpen] = useState(false);
-    const { user } = useAppContext();
-
-    useEffect(() => {
-        average(imageUrl(props.img, 'small'), { amount: 1 }).then((color) => {
-            setBgColor(`rgba(${color[0]},${color[1]},${color[2]},0.4)`);
-        });
-    }, [props.img]);
-
-    const replaceImageCallback = () => {
-        window.location.reload()
-    }
-
-    const title = `${props.title} - Koito`
-
-    const mobileIconSize = 22
-    const normalIconSize = 30
-    let vw = Math.max(document.documentElement.clientWidth || 0, window.innerWidth || 0)
-    let iconSize = vw > 768 ? normalIconSize : mobileIconSize
-
-    return (
-        <main
-            className="w-full flex flex-col flex-grow"
-            style={{
-                background: `linear-gradient(to bottom, ${bgColor}, var(--color-bg) 700px)`,
-                transition: '1000',
-            }}
-        >
-            <ImageDropHandler itemType={props.type.toLowerCase() === 'artist' ? 'artist' : 'album'} onComplete={replaceImageCallback} />
-            <title>{title}</title>
-            <meta property="og:title" content={title} />
-            <meta
-                name="description"
-                content={title}
-            />
-            <div className="w-19/20 mx-auto pt-12">
-                <div className="flex gap-8 flex-wrap md:flex-nowrap relative">
-                    <div className="flex flex-col justify-around">
-                        <img style={{zIndex: 5}} src={imageUrl(props.img, "large")} alt={props.title} className="md:min-w-[385px] w-[220px] h-auto shadow-(--color-shadow) shadow-lg" />
-                    </div>
-                    <div className="flex flex-col items-start">
-                        <h3>{props.type}</h3>
-                        <h1>{props.title}</h1>
-                        {props.subContent}
-                    </div>
-                    { user &&
-                        <div className="absolute left-1 sm:right-1 sm:left-auto -top-9 sm:top-1 flex gap-3 items-center">
-                            { props.type === "Track" &&
-                                <>
-                                    <button title="Add Listen" className="hover:cursor-pointer" onClick={() => setAddListenModalOpen(true)}><Plus size={iconSize} /></button>
-                                    <AddListenModal open={addListenModalOpen} setOpen={setAddListenModalOpen} trackid={props.id} />
-                                </>
-                            }
-                            <button title="Edit Item" className="hover:cursor-pointer" onClick={() => setRenameModalOpen(true)}><Edit size={iconSize} /></button>
-                            <button title="Replace Image" className="hover:cursor-pointer" onClick={() => setImageModalOpen(true)}><ImageIcon size={iconSize} /></button>
-                            <button title="Merge Items" className="hover:cursor-pointer" onClick={() => setMergeModalOpen(true)}><Merge size={iconSize} /></button>
-                            <button title="Delete Item" className="hover:cursor-pointer" onClick={() => setDeleteModalOpen(true)}><Trash size={iconSize} /></button>
-                            <EditModal open={renameModalOpen} setOpen={setRenameModalOpen} type={props.type.toLowerCase()} id={props.id}/>
-                            <ImageReplaceModal open={imageModalOpen} setOpen={setImageModalOpen} id={props.imgItemId} musicbrainzId={props.musicbrainzId} type={props.type === "Track" ? "Album" : props.type} />
-                            <MergeModal currentTitle={props.title} mergeFunc={props.mergeFunc} mergeCleanerFunc={props.mergeCleanerFunc} type={props.type} currentId={props.id} open={mergeModalOpen} setOpen={setMergeModalOpen} />
-                            <DeleteModal open={deleteModalOpen} setOpen={setDeleteModalOpen} title={props.title} id={props.id} type={props.type} />
-                        </div>
-                    }
-                </div>
-                {props.children}
-            </div>
-        </main>
-    );
+export default function MediaLayout(props: Props) {
+  const [bgColor, setBgColor] = useState<string>("(--color-bg)");
+  const [mergeModalOpen, setMergeModalOpen] = useState(false);
+  const [deleteModalOpen, setDeleteModalOpen] = useState(false);
+  const [imageModalOpen, setImageModalOpen] = useState(false);
+  const [renameModalOpen, setRenameModalOpen] = useState(false);
+  const [addListenModalOpen, setAddListenModalOpen] = useState(false);
+  const { user } = useAppContext();
+
+  useEffect(() => {
+    average(imageUrl(props.img, "small"), { amount: 1 }).then((color) => {
+      setBgColor(`rgba(${color[0]},${color[1]},${color[2]},0.4)`);
+    });
+  }, [props.img]);
+
+  const replaceImageCallback = () => {
+    window.location.reload();
+  };
+
+  const title = `${props.title} - Koito`;
+
+  const mobileIconSize = 22;
+  const normalIconSize = 30;
+  let vw = Math.max(
+    document.documentElement.clientWidth || 0,
+    window.innerWidth || 0
+  );
+  let iconSize = vw > 768 ? normalIconSize : mobileIconSize;
+
+  console.log("MBZ:", props.musicbrainzId);
+
+  return (
+    <main
+      className="w-full flex flex-col flex-grow"
+      style={{
+        background: `linear-gradient(to bottom, ${bgColor}, var(--color-bg) 700px)`,
+        transition: "1000",
+      }}
+    >
+      <ImageDropHandler
+        itemType={props.type.toLowerCase() === "artist" ? "artist" : "album"}
+        onComplete={replaceImageCallback}
+      />
+      <title>{title}</title>
+      <meta property="og:title" content={title} />
+      <meta name="description" content={title} />
+      <div className="w-19/20 mx-auto pt-12">
+        <div className="flex gap-8 flex-wrap md:flex-nowrap relative">
+          <div className="flex flex-col justify-around">
+            <img
+              style={{ zIndex: 5 }}
+              src={imageUrl(props.img, "large")}
+              alt={props.title}
+              className="md:min-w-[385px] w-[220px] h-auto shadow-(--color-shadow) shadow-lg"
+            />
+          </div>
+          <div className="flex flex-col items-start">
+            <h3>{props.type}</h3>
+            <div className="flex">
+              <h1>
+                {props.title}
+                <span className="text-xl font-medium text-(--color-fg-secondary)">
+                  {" "}
+                  #{props.rank}
+                </span>
+              </h1>
+            </div>
+            {props.subContent}
+          </div>
+          <div className="absolute left-1 sm:right-1 sm:left-auto -top-9 sm:top-1 flex gap-3 items-center">
+            {props.musicbrainzId && (
+              <Link
+                title="View on MusicBrainz"
+                target="_blank"
+                to={`https://musicbrainz.org/${props.type.toLowerCase()}/${
+                  props.musicbrainzId
+                }`}
+              >
+                <MbzIcon size={iconSize} hover />
+              </Link>
+            )}
+            {user && (
+              <>
+                {props.type === "Track" && (
+                  <>
+                    <button
+                      title="Add Listen"
+                      className="hover:cursor-pointer"
+                      onClick={() => setAddListenModalOpen(true)}
+                    >
+                      <Plus size={iconSize} />
+                    </button>
+                    <AddListenModal
+                      open={addListenModalOpen}
+                      setOpen={setAddListenModalOpen}
+                      trackid={props.id}
+                    />
+                  </>
+                )}
+                <button
+                  title="Edit Item"
+                  className="hover:cursor-pointer"
+                  onClick={() => setRenameModalOpen(true)}
+                >
+                  <Edit size={iconSize} />
+                </button>
+                {props.type !== "Track" && (
+                  <button
+                    title="Replace Image"
+                    className="hover:cursor-pointer"
+                    onClick={() => setImageModalOpen(true)}
+                  >
+                    <ImageIcon size={iconSize} />
+                  </button>
+                )}
+                <button
+                  title="Merge Items"
+                  className="hover:cursor-pointer"
+                  onClick={() => setMergeModalOpen(true)}
+                >
+                  <Merge size={iconSize} />
+                </button>
+                <button
+                  title="Delete Item"
+                  className="hover:cursor-pointer"
+                  onClick={() => setDeleteModalOpen(true)}
+                >
+                  <Trash size={iconSize} />
</button>
<EditModal
open={renameModalOpen}
setOpen={setRenameModalOpen}
type={props.type.toLowerCase()}
id={props.id}
/>
<ImageReplaceModal
open={imageModalOpen}
setOpen={setImageModalOpen}
id={props.imgItemId}
musicbrainzId={props.musicbrainzId}
type={props.type === "Track" ? "Album" : props.type}
/>
<MergeModal
currentTitle={props.title}
mergeFunc={props.mergeFunc}
mergeCleanerFunc={props.mergeCleanerFunc}
type={props.type}
currentId={props.id}
open={mergeModalOpen}
setOpen={setMergeModalOpen}
/>
<DeleteModal
open={deleteModalOpen}
setOpen={setDeleteModalOpen}
title={props.title}
id={props.id}
type={props.type}
/>
</>
)}
</div>
</div>
{props.children}
</div>
</main>
);
}


@ -6,6 +6,7 @@ import PeriodSelector from "~/components/PeriodSelector";
import MediaLayout from "./MediaLayout";
import ActivityGrid from "~/components/ActivityGrid";
import { timeListenedString } from "~/utils/utils";
import InterestGraph from "~/components/InterestGraph";

export async function clientLoader({ params }: LoaderFunctionArgs) {
let res = await fetch(`/apis/web/v1/track?id=${params.id}`);
@ -33,7 +34,8 @@ export default function Track() {
title={track.title}
img={track.image}
id={track.id}
rank={track.all_time_rank}
musicbrainzId={track.musicbrainz_id}
imgItemId={track.album_id}
mergeFunc={mergeTracks}
mergeCleanerFunc={(r, id) => {
@ -48,23 +50,28 @@ export default function Track() {
}}
subContent={
<div className="flex flex-col gap-2 items-start">
<p>
Appears on{" "}
<Link className="hover:underline" to={`/album/${track.album_id}`}>
{album.title}
</Link>
</p>
{track.listen_count !== 0 && (
<p>
{track.listen_count} play{track.listen_count > 1 ? "s" : ""}
</p>
)}
{track.time_listened !== 0 && (
<p title={Math.floor(track.time_listened / 60 / 60) + " hours"}>
{timeListenedString(track.time_listened)}
</p>
)}
{track.first_listen > 0 && (
<p title={new Date(track.first_listen * 1000).toLocaleString()}>
Listening since{" "}
{new Date(track.first_listen * 1000).toLocaleDateString()}
</p>
)}
</div>
}
>
@ -73,7 +80,10 @@ export default function Track() {
</div>
<div className="flex flex-wrap gap-20 mt-10">
<LastPlays limit={20} trackId={track.id} />
<div className="flex flex-col items-start gap-4">
<ActivityGrid configurable trackId={track.id} />
<InterestGraph trackId={track.id} />
</div>
</div>
</MediaLayout>
);


@ -1,52 +1,213 @@
import Rewind from "~/components/rewind/Rewind";
import type { Route } from "./+types/Home";
import { imageUrl, type RewindStats } from "api/api";
import { useEffect, useState } from "react";
import type { LoaderFunctionArgs } from "react-router";
import { useLoaderData } from "react-router";
import { getRewindParams, getRewindYear } from "~/utils/utils";
import { useNavigate } from "react-router";
import { average } from "color.js";
import { ChevronLeft, ChevronRight } from "lucide-react";
// TODO: Bind year and month selectors to what data actually exists
const months = [
"Full Year",
"January",
"February",
"March",
"April",
"May",
"June",
"July",
"August",
"September",
"October",
"November",
"December",
];
export async function clientLoader({ request }: LoaderFunctionArgs) {
const url = new URL(request.url);
const year = parseInt(
url.searchParams.get("year") || getRewindParams().year.toString()
);
const month = parseInt(
url.searchParams.get("month") || getRewindParams().month.toString()
);
const res = await fetch(`/apis/web/v1/summary?year=${year}&month=${month}`);
if (!res.ok) {
throw new Response("Failed to load summary", { status: 500 });
}
const stats: RewindStats = await res.json();
stats.title = `Your ${month === 0 ? "" : months[month]} ${year} Rewind`;
return { stats };
}
export function meta({}: Route.MetaArgs) {
return [
{ title: `Rewind - Koito` },
{ name: "description", content: "Rewind - Koito" },
];
}
export default function RewindPage() {
const currentParams = new URLSearchParams(location.search);
let year = parseInt(
currentParams.get("year") || getRewindParams().year.toString()
);
let month = parseInt(
currentParams.get("month") || getRewindParams().month.toString()
);
const navigate = useNavigate();
const [showTime, setShowTime] = useState(false);
const { stats: stats } = useLoaderData<{ stats: RewindStats }>();
const [bgColor, setBgColor] = useState<string>("(--color-bg)");
useEffect(() => {
if (!stats.top_artists[0]) return;
const img = (stats.top_artists[0] as any)?.item.image;
if (!img) return;
average(imageUrl(img, "small"), { amount: 1 }).then((color) => {
setBgColor(`rgba(${color[0]},${color[1]},${color[2]},0.4)`);
});
}, [stats]);
const updateParams = (params: Record<string, string | null>) => {
const nextParams = new URLSearchParams(location.search);
for (const key in params) {
const val = params[key];
if (val !== null) {
nextParams.set(key, val);
}
}
const url = `/rewind?${nextParams.toString()}`;
navigate(url, { replace: false });
};
const navigateMonth = (direction: "prev" | "next") => {
if (direction === "next") {
if (month === 12) {
month = 0;
} else {
month += 1;
}
} else {
if (month === 0) {
month = 12;
} else {
month -= 1;
}
}
console.log(`Month: ${month}`);
updateParams({
year: year.toString(),
month: month.toString(),
});
};
const navigateYear = (direction: "prev" | "next") => {
if (direction === "next") {
year += 1;
} else {
year -= 1;
}
updateParams({
year: year.toString(),
month: month.toString(),
});
};
const pgTitle = `${stats.title} - Koito`;
return (
<div
className="w-full min-h-screen"
style={{
background: `linear-gradient(to bottom, ${bgColor}, var(--color-bg) 500px)`,
transition: "1000",
}}
>
<div className="flex flex-col items-start sm:items-center gap-4">
<title>{pgTitle}</title>
<meta property="og:title" content={pgTitle} />
<meta name="description" content={pgTitle} />
<div className="flex flex-col lg:flex-row items-start lg:mt-15 mt-5 gap-10 w-19/20 px-5 md:px-20">
<div className="flex flex-col items-start gap-4">
<div className="flex flex-col items-start gap-4 py-8">
<div className="flex items-center gap-6 justify-around">
<button
onClick={() => navigateMonth("prev")}
className="p-2 disabled:text-(--color-fg-tertiary)"
disabled={
// Previous month is in the future OR
new Date(year, month - 2) > new Date() ||
// We are looking at current year and prev would take us to full year
(new Date().getFullYear() === year && month === 1)
}
>
<ChevronLeft size={20} />
</button>
<p className="font-medium text-xl text-center w-30">
{months[month]}
</p>
<button
onClick={() => navigateMonth("next")}
className="p-2 disabled:text-(--color-fg-tertiary)"
disabled={
// next month is current or future month and
month >= new Date().getMonth() &&
// we are looking at current (or future) year
year >= new Date().getFullYear()
}
>
<ChevronRight size={20} />
</button>
</div>
<div className="flex items-center gap-6 justify-around">
<button
onClick={() => navigateYear("prev")}
className="p-2 disabled:text-(--color-fg-tertiary)"
disabled={new Date(year - 1, month) > new Date()}
>
<ChevronLeft size={20} />
</button>
<p className="font-medium text-xl text-center w-30">{year}</p>
<button
onClick={() => navigateYear("next")}
className="p-2 disabled:text-(--color-fg-tertiary)"
disabled={
// Next year date is in the future OR
new Date(year + 1, month - 1) > new Date() ||
// Next year date is current full year OR
(month == 0 && new Date().getFullYear() === year + 1) ||
// Next year date is current month
(new Date().getMonth() === month - 1 &&
new Date().getFullYear() === year + 1)
}
>
<ChevronRight size={20} />
</button>
</div>
</div>
<div className="flex items-center gap-3">
<label htmlFor="show-time-checkbox">Show time listened?</label>
<input
type="checkbox"
name="show-time-checkbox"
checked={showTime}
onChange={(e) => setShowTime(!showTime)}
></input>
</div>
</div>
{stats !== undefined && (
<Rewind stats={stats} includeTime={showTime} />
)}
</div>
</div>
</div>
);
}
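The month selector above runs over a 13-entry scale: index 0 is the "Full Year" view and 1–12 map to January–December, with `navigateMonth` wrapping at both ends. A minimal sketch of that wraparound (standalone helpers for illustration, not the component code):

```typescript
// Month index 0 is the "Full Year" view; 1–12 map to January–December.
// These helpers mirror the wraparound in navigateMonth (sketch only).
function nextMonth(month: number): number {
  return month === 12 ? 0 : month + 1;
}

function prevMonth(month: number): number {
  return month === 0 ? 12 : month - 1;
}

console.log(nextMonth(12)); // December wraps forward to the Full Year view: 0
console.log(prevMonth(0)); // Full Year wraps back to December: 12
```

Note the component additionally disables the buttons when the wrapped target would land in the future.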


@ -92,7 +92,7 @@ export const themes: Record<string, Theme> = {
fg: "#fef9f3",
fgSecondary: "#dbc6b0",
fgTertiary: "#a3917a",
primary: "#F0850A",
primaryDim: "#b45309",
accent: "#8c4c28",
accentDim: "#6b3b1f",

client/app/tz.ts (new file)

@ -0,0 +1,10 @@
export function initTimezoneCookie() {
if (typeof window === "undefined") return;
if (document.cookie.includes("tz=")) return;
const tz = Intl.DateTimeFormat().resolvedOptions().timeZone;
if (!tz) return;
document.cookie = `tz=${tz}; Path=/; Max-Age=31536000; SameSite=Lax`;
}
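The cookie written above is a plain `tz=<IANA zone name>` pair, so the backend can recover the zone from the request's `Cookie` header. A sketch of that read path (`parseTzCookie` is a hypothetical helper, not part of this diff; Koito's actual backend is Go):

```typescript
// Hypothetical server-side counterpart to initTimezoneCookie:
// pull the IANA time zone name back out of a raw Cookie header.
function parseTzCookie(cookieHeader: string): string | null {
  const match = cookieHeader.match(/(?:^|;\s*)tz=([^;]+)/);
  return match ? decodeURIComponent(match[1]) : null;
}

console.log(parseTzCookie("session=abc; tz=America/New_York")); // "America/New_York"
console.log(parseTzCookie("session=abc")); // null
```

Per the #176 commit above, a `KOITO_FORCE_TZ` config option can override whatever the cookie reports.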


@ -16,12 +16,15 @@ const timeframeToInterval = (timeframe: Timeframe): string => {
};

const getRewindYear = (): number => {
return new Date().getFullYear() - 1;
};

const getRewindParams = (): { month: number; year: number } => {
const today = new Date();
if (today.getMonth() == 0) {
return { month: 0, year: today.getFullYear() - 1 };
} else {
return { month: today.getMonth(), year: today.getFullYear() };
}
};
@ -114,5 +117,5 @@ const timeListenedString = (seconds: number) => {
return `${minutes} minutes listened`;
};

export { hexToHSL, timeListenedString, getRewindYear, getRewindParams };
export type { hsl };
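The new `getRewindParams` changes the default Rewind view: in January it falls back to last year's full-year rewind (month 0), and otherwise it points at the previous calendar month of the current year, since `Date.getMonth()` is 0-based while the month selector is 1-based. A standalone restatement, parameterized on a date for illustration (the real function always uses `new Date()`):

```typescript
// Sketch of the new getRewindParams behavior; month 0 = "Full Year",
// 1–12 = January–December. Dates below are assumed examples.
function rewindParamsFor(today: Date): { month: number; year: number } {
  if (today.getMonth() === 0) {
    // In January, default to last year's full-year rewind.
    return { month: 0, year: today.getFullYear() - 1 };
  }
  // getMonth() is 0-based, the selector is 1-based, so this
  // index names the month *before* the current one.
  return { month: today.getMonth(), year: today.getFullYear() };
}

console.log(rewindParamsFor(new Date(2026, 0, 15))); // { month: 0, year: 2025 }
console.log(rewindParamsFor(new Date(2026, 3, 10))); // { month: 3, year: 2026 }, i.e. March 2026
```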


@ -13,6 +13,7 @@
"@radix-ui/react-tabs": "^1.1.12", "@radix-ui/react-tabs": "^1.1.12",
"@react-router/node": "^7.5.3", "@react-router/node": "^7.5.3",
"@react-router/serve": "^7.5.3", "@react-router/serve": "^7.5.3",
"@recharts/devtools": "^0.0.7",
"@tanstack/react-query": "^5.80.6", "@tanstack/react-query": "^5.80.6",
"@vanilla-extract/css": "^1.17.4", "@vanilla-extract/css": "^1.17.4",
"color.js": "^1.2.0", "color.js": "^1.2.0",
@ -20,7 +21,9 @@
"lucide-react": "^0.513.0", "lucide-react": "^0.513.0",
"react": "^19.1.0", "react": "^19.1.0",
"react-dom": "^19.1.0", "react-dom": "^19.1.0",
"react-router": "^7.5.3" "react-is": "^19.2.3",
"react-router": "^7.5.3",
"recharts": "^3.6.0"
}, },
"devDependencies": { "devDependencies": {
"@react-router/dev": "^7.5.3", "@react-router/dev": "^7.5.3",


@ -689,6 +689,23 @@
morgan "^1.10.0" morgan "^1.10.0"
source-map-support "^0.5.21" source-map-support "^0.5.21"
"@recharts/devtools@^0.0.7":
version "0.0.7"
resolved "https://registry.yarnpkg.com/@recharts/devtools/-/devtools-0.0.7.tgz#a909d102efd76fc45bc2b7a150e67a02da04b4c1"
integrity sha512-ud66rUf3FYf1yQLGSCowI50EQyC/rcZblvDgNvfUIVaEXyQtr5K2DFgwegziqbVclsVBQLTxyntVViJN5H4oWQ==
"@reduxjs/toolkit@1.x.x || 2.x.x":
version "2.11.2"
resolved "https://registry.yarnpkg.com/@reduxjs/toolkit/-/toolkit-2.11.2.tgz#582225acea567329ca6848583e7dd72580d38e82"
integrity sha512-Kd6kAHTA6/nUpp8mySPqj3en3dm0tdMIgbttnQ1xFMVpufoj+ADi8pXLBsd4xzTRHQa7t/Jv8W5UnCuW4kuWMQ==
dependencies:
"@standard-schema/spec" "^1.0.0"
"@standard-schema/utils" "^0.3.0"
immer "^11.0.0"
redux "^5.0.1"
redux-thunk "^3.1.0"
reselect "^5.1.0"
"@rollup/rollup-android-arm-eabi@4.42.0": "@rollup/rollup-android-arm-eabi@4.42.0":
version "4.42.0" version "4.42.0"
resolved "https://registry.yarnpkg.com/@rollup/rollup-android-arm-eabi/-/rollup-android-arm-eabi-4.42.0.tgz#8baae15a6a27f18b7c5be420e00ab08c7d3dd6f4" resolved "https://registry.yarnpkg.com/@rollup/rollup-android-arm-eabi/-/rollup-android-arm-eabi-4.42.0.tgz#8baae15a6a27f18b7c5be420e00ab08c7d3dd6f4"
@ -789,6 +806,16 @@
resolved "https://registry.yarnpkg.com/@rollup/rollup-win32-x64-msvc/-/rollup-win32-x64-msvc-4.42.0.tgz#516c6770ba15fe6aef369d217a9747492c01e8b7" resolved "https://registry.yarnpkg.com/@rollup/rollup-win32-x64-msvc/-/rollup-win32-x64-msvc-4.42.0.tgz#516c6770ba15fe6aef369d217a9747492c01e8b7"
integrity sha512-LpHiJRwkaVz/LqjHjK8LCi8osq7elmpwujwbXKNW88bM8eeGxavJIKKjkjpMHAh/2xfnrt1ZSnhTv41WYUHYmA== integrity sha512-LpHiJRwkaVz/LqjHjK8LCi8osq7elmpwujwbXKNW88bM8eeGxavJIKKjkjpMHAh/2xfnrt1ZSnhTv41WYUHYmA==
"@standard-schema/spec@^1.0.0":
version "1.1.0"
resolved "https://registry.yarnpkg.com/@standard-schema/spec/-/spec-1.1.0.tgz#a79b55dbaf8604812f52d140b2c9ab41bc150bb8"
integrity sha512-l2aFy5jALhniG5HgqrD6jXLi/rUWrKvqN/qJx6yoJsgKhblVd+iqqU4RCXavm/jPityDo5TCvKMnpjKnOriy0w==
"@standard-schema/utils@^0.3.0":
version "0.3.0"
resolved "https://registry.yarnpkg.com/@standard-schema/utils/-/utils-0.3.0.tgz#3d5e608f16c2390c10528e98e59aef6bf73cae7b"
integrity sha512-e7Mew686owMaPJVNNLs55PUvgz371nKgwsc4vxE49zsODpJEnxgxRo2y/OKrqueavXgZNMDVj3DdHFlaSAeU8g==
"@tailwindcss/node@4.1.8": "@tailwindcss/node@4.1.8":
version "4.1.8" version "4.1.8"
resolved "https://registry.yarnpkg.com/@tailwindcss/node/-/node-4.1.8.tgz#e29187abec6194ce1e9f072208c62116a79a129b" resolved "https://registry.yarnpkg.com/@tailwindcss/node/-/node-4.1.8.tgz#e29187abec6194ce1e9f072208c62116a79a129b"
@ -918,6 +945,57 @@
dependencies:
tslib "^2.4.0"
"@types/d3-array@^3.0.3":
version "3.2.2"
resolved "https://registry.yarnpkg.com/@types/d3-array/-/d3-array-3.2.2.tgz#e02151464d02d4a1b44646d0fcdb93faf88fde8c"
integrity sha512-hOLWVbm7uRza0BYXpIIW5pxfrKe0W+D5lrFiAEYR+pb6w3N2SwSMaJbXdUfSEv+dT4MfHBLtn5js0LAWaO6otw==
"@types/d3-color@*":
version "3.1.3"
resolved "https://registry.yarnpkg.com/@types/d3-color/-/d3-color-3.1.3.tgz#368c961a18de721da8200e80bf3943fb53136af2"
integrity sha512-iO90scth9WAbmgv7ogoq57O9YpKmFBbmoEoCHDB2xMBY0+/KVrqAaCDyCE16dUspeOvIxFFRI+0sEtqDqy2b4A==
"@types/d3-ease@^3.0.0":
version "3.0.2"
resolved "https://registry.yarnpkg.com/@types/d3-ease/-/d3-ease-3.0.2.tgz#e28db1bfbfa617076f7770dd1d9a48eaa3b6c51b"
integrity sha512-NcV1JjO5oDzoK26oMzbILE6HW7uVXOHLQvHshBUW4UMdZGfiY6v5BeQwh9a9tCzv+CeefZQHJt5SRgK154RtiA==
"@types/d3-interpolate@^3.0.1":
version "3.0.4"
resolved "https://registry.yarnpkg.com/@types/d3-interpolate/-/d3-interpolate-3.0.4.tgz#412b90e84870285f2ff8a846c6eb60344f12a41c"
integrity sha512-mgLPETlrpVV1YRJIglr4Ez47g7Yxjl1lj7YKsiMCb27VJH9W8NVM6Bb9d8kkpG/uAQS5AmbA48q2IAolKKo1MA==
dependencies:
"@types/d3-color" "*"
"@types/d3-path@*":
version "3.1.1"
resolved "https://registry.yarnpkg.com/@types/d3-path/-/d3-path-3.1.1.tgz#f632b380c3aca1dba8e34aa049bcd6a4af23df8a"
integrity sha512-VMZBYyQvbGmWyWVea0EHs/BwLgxc+MKi1zLDCONksozI4YJMcTt8ZEuIR4Sb1MMTE8MMW49v0IwI5+b7RmfWlg==
"@types/d3-scale@^4.0.2":
version "4.0.9"
resolved "https://registry.yarnpkg.com/@types/d3-scale/-/d3-scale-4.0.9.tgz#57a2f707242e6fe1de81ad7bfcccaaf606179afb"
integrity sha512-dLmtwB8zkAeO/juAMfnV+sItKjlsw2lKdZVVy6LRr0cBmegxSABiLEpGVmSJJ8O08i4+sGR6qQtb6WtuwJdvVw==
dependencies:
"@types/d3-time" "*"
"@types/d3-shape@^3.1.0":
version "3.1.8"
resolved "https://registry.yarnpkg.com/@types/d3-shape/-/d3-shape-3.1.8.tgz#d1516cc508753be06852cd06758e3bb54a22b0e3"
integrity sha512-lae0iWfcDeR7qt7rA88BNiqdvPS5pFVPpo5OfjElwNaT2yyekbM0C9vK+yqBqEmHr6lDkRnYNoTBYlAgJa7a4w==
dependencies:
"@types/d3-path" "*"
"@types/d3-time@*", "@types/d3-time@^3.0.0":
version "3.0.4"
resolved "https://registry.yarnpkg.com/@types/d3-time/-/d3-time-3.0.4.tgz#8472feecd639691450dd8000eb33edd444e1323f"
integrity sha512-yuzZug1nkAAaBlBBikKZTgzCeA+k1uy4ZFwWANOfKw5z5LRhV0gNA7gNkKm7HoK+HRN0wX3EkxGk0fpbWhmB7g==
"@types/d3-timer@^3.0.0":
version "3.0.2"
resolved "https://registry.yarnpkg.com/@types/d3-timer/-/d3-timer-3.0.2.tgz#70bbda77dc23aa727413e22e214afa3f0e852f70"
integrity sha512-Ps3T8E8dZDam6fUyNiMkekK3XUsaUEik+idO9/YjPtfj2qruF8tFBXS7XhtE4iIXBLxhmLjP3SXpLhVf21I9Lw==
"@types/estree@1.0.7": "@types/estree@1.0.7":
version "1.0.7" version "1.0.7"
resolved "https://registry.yarnpkg.com/@types/estree/-/estree-1.0.7.tgz#4158d3105276773d5b7695cd4834b1722e4f37a8" resolved "https://registry.yarnpkg.com/@types/estree/-/estree-1.0.7.tgz#4158d3105276773d5b7695cd4834b1722e4f37a8"
@ -949,6 +1027,11 @@
dependencies:
csstype "^3.0.2"
"@types/use-sync-external-store@^0.0.6":
version "0.0.6"
resolved "https://registry.yarnpkg.com/@types/use-sync-external-store/-/use-sync-external-store-0.0.6.tgz#60be8d21baab8c305132eb9cb912ed497852aadc"
integrity sha512-zFDAD+tlpf2r4asuHEj0XH6pY6i0g5NeAHPn+15wk3BV6JA69eERFXC1gyGThDkVa1zCyKr5jox1+2LbV/AMLg==
"@vanilla-extract/babel-plugin-debug-ids@^1.2.2": "@vanilla-extract/babel-plugin-debug-ids@^1.2.2":
version "1.2.2" version "1.2.2"
resolved "https://registry.yarnpkg.com/@vanilla-extract/babel-plugin-debug-ids/-/babel-plugin-debug-ids-1.2.2.tgz#0bcb26614d8c6c4c0d95f8f583d838ce71294633" resolved "https://registry.yarnpkg.com/@vanilla-extract/babel-plugin-debug-ids/-/babel-plugin-debug-ids-1.2.2.tgz#0bcb26614d8c6c4c0d95f8f583d838ce71294633"
@ -1163,6 +1246,11 @@ chownr@^3.0.0:
resolved "https://registry.yarnpkg.com/chownr/-/chownr-3.0.0.tgz#9855e64ecd240a9cc4267ce8a4aa5d24a1da15e4" resolved "https://registry.yarnpkg.com/chownr/-/chownr-3.0.0.tgz#9855e64ecd240a9cc4267ce8a4aa5d24a1da15e4"
integrity sha512-+IxzY9BZOQd/XuYPRmrvEVjF/nqj5kgT4kEq7VofrDoM1MxoRjEWkrCC3EtLi59TVawxTAn+orJwFQcrqEN1+g== integrity sha512-+IxzY9BZOQd/XuYPRmrvEVjF/nqj5kgT4kEq7VofrDoM1MxoRjEWkrCC3EtLi59TVawxTAn+orJwFQcrqEN1+g==
clsx@^2.1.1:
version "2.1.1"
resolved "https://registry.yarnpkg.com/clsx/-/clsx-2.1.1.tgz#eed397c9fd8bd882bfb18deab7102049a2f32999"
integrity sha512-eYm0QWBtUrBWZWG0d386OGAw16Z995PiOVo2B7bjWSbHedGl5e0ZWaq65kOGgUSNesEIDkB9ISbTg/JK9dhCZA==
color-convert@^2.0.1:
version "2.0.1"
resolved "https://registry.yarnpkg.com/color-convert/-/color-convert-2.0.1.tgz#72d3a68d598c9bdb3af2ad1e84f21d896abd4de3"
@ -1261,6 +1349,77 @@ csstype@^3.0.2, csstype@^3.0.7:
resolved "https://registry.yarnpkg.com/csstype/-/csstype-3.1.3.tgz#d80ff294d114fb0e6ac500fbf85b60137d7eff81" resolved "https://registry.yarnpkg.com/csstype/-/csstype-3.1.3.tgz#d80ff294d114fb0e6ac500fbf85b60137d7eff81"
integrity sha512-M1uQkMl8rQK/szD0LNhtqxIPLpimGm8sOBwU7lLnCpSbTyY3yeU1Vc7l4KT5zT4s/yOxHH5O7tIuuLOCnLADRw== integrity sha512-M1uQkMl8rQK/szD0LNhtqxIPLpimGm8sOBwU7lLnCpSbTyY3yeU1Vc7l4KT5zT4s/yOxHH5O7tIuuLOCnLADRw==
"d3-array@2 - 3", "d3-array@2.10.0 - 3", d3-array@^3.1.6:
version "3.2.4"
resolved "https://registry.yarnpkg.com/d3-array/-/d3-array-3.2.4.tgz#15fec33b237f97ac5d7c986dc77da273a8ed0bb5"
integrity sha512-tdQAmyA18i4J7wprpYq8ClcxZy3SC31QMeByyCFyRt7BVHdREQZ5lpzoe5mFEYZUWe+oq8HBvk9JjpibyEV4Jg==
dependencies:
internmap "1 - 2"
"d3-color@1 - 3":
version "3.1.0"
resolved "https://registry.yarnpkg.com/d3-color/-/d3-color-3.1.0.tgz#395b2833dfac71507f12ac2f7af23bf819de24e2"
integrity sha512-zg/chbXyeBtMQ1LbD/WSoW2DpC3I0mpmPdW+ynRTj/x2DAWYrIY7qeZIHidozwV24m4iavr15lNwIwLxRmOxhA==
d3-ease@^3.0.1:
version "3.0.1"
resolved "https://registry.yarnpkg.com/d3-ease/-/d3-ease-3.0.1.tgz#9658ac38a2140d59d346160f1f6c30fda0bd12f4"
integrity sha512-wR/XK3D3XcLIZwpbvQwQ5fK+8Ykds1ip7A2Txe0yxncXSdq1L9skcG7blcedkOX+ZcgxGAmLX1FrRGbADwzi0w==
"d3-format@1 - 3":
version "3.1.0"
resolved "https://registry.yarnpkg.com/d3-format/-/d3-format-3.1.0.tgz#9260e23a28ea5cb109e93b21a06e24e2ebd55641"
integrity sha512-YyUI6AEuY/Wpt8KWLgZHsIU86atmikuoOmCfommt0LYHiQSPjvX2AcFc38PX0CBpr2RCyZhjex+NS/LPOv6YqA==
"d3-interpolate@1.2.0 - 3", d3-interpolate@^3.0.1:
version "3.0.1"
resolved "https://registry.yarnpkg.com/d3-interpolate/-/d3-interpolate-3.0.1.tgz#3c47aa5b32c5b3dfb56ef3fd4342078a632b400d"
integrity sha512-3bYs1rOD33uo8aqJfKP3JWPAibgw8Zm2+L9vBKEHJ2Rg+viTR7o5Mmv5mZcieN+FRYaAOWX5SJATX6k1PWz72g==
dependencies:
d3-color "1 - 3"
d3-path@^3.1.0:
version "3.1.0"
resolved "https://registry.yarnpkg.com/d3-path/-/d3-path-3.1.0.tgz#22df939032fb5a71ae8b1800d61ddb7851c42526"
integrity sha512-p3KP5HCf/bvjBSSKuXid6Zqijx7wIfNW+J/maPs+iwR35at5JCbLUT0LzF1cnjbCHWhqzQTIN2Jpe8pRebIEFQ==
d3-scale@^4.0.2:
version "4.0.2"
resolved "https://registry.yarnpkg.com/d3-scale/-/d3-scale-4.0.2.tgz#82b38e8e8ff7080764f8dcec77bd4be393689396"
integrity sha512-GZW464g1SH7ag3Y7hXjf8RoUuAFIqklOAq3MRl4OaWabTFJY9PN/E1YklhXLh+OQ3fM9yS2nOkCoS+WLZ6kvxQ==
dependencies:
d3-array "2.10.0 - 3"
d3-format "1 - 3"
d3-interpolate "1.2.0 - 3"
d3-time "2.1.1 - 3"
d3-time-format "2 - 4"
d3-shape@^3.1.0:
version "3.2.0"
resolved "https://registry.yarnpkg.com/d3-shape/-/d3-shape-3.2.0.tgz#a1a839cbd9ba45f28674c69d7f855bcf91dfc6a5"
integrity sha512-SaLBuwGm3MOViRq2ABk3eLoxwZELpH6zhl3FbAoJ7Vm1gofKx6El1Ib5z23NUEhF9AsGl7y+dzLe5Cw2AArGTA==
dependencies:
d3-path "^3.1.0"
"d3-time-format@2 - 4":
version "4.1.0"
resolved "https://registry.yarnpkg.com/d3-time-format/-/d3-time-format-4.1.0.tgz#7ab5257a5041d11ecb4fe70a5c7d16a195bb408a"
integrity sha512-dJxPBlzC7NugB2PDLwo9Q8JiTR3M3e4/XANkreKSUxF8vvXKqm1Yfq4Q5dl8budlunRVlUUaDUgFt7eA8D6NLg==
dependencies:
d3-time "1 - 3"
"d3-time@1 - 3", "d3-time@2.1.1 - 3", d3-time@^3.0.0:
version "3.1.0"
resolved "https://registry.yarnpkg.com/d3-time/-/d3-time-3.1.0.tgz#9310db56e992e3c0175e1ef385e545e48a9bb5c7"
integrity sha512-VqKjzBLejbSMT4IgbmVgDjpkYrNWUYJnbCGo874u7MMKIWsILRX+OpX/gTk8MqjpT1A/c6HY2dCA77ZN0lkQ2Q==
dependencies:
d3-array "2 - 3"
d3-timer@^3.0.1:
version "3.0.1"
resolved "https://registry.yarnpkg.com/d3-timer/-/d3-timer-3.0.1.tgz#6284d2a2708285b1abb7e201eda4380af35e63b0"
integrity sha512-ndfJ/JxxMd3nw31uyKoY2naivF+r29V+Lc0svZxe1JvvIRmi8hUsrMvdOwgS1o6uBHmiz91geQ0ylPP0aj1VUA==
debug@2.6.9:
version "2.6.9"
resolved "https://registry.yarnpkg.com/debug/-/debug-2.6.9.tgz#5d128515df134ff327e90a4c93f4e077a536341f"
@ -1275,6 +1434,11 @@ debug@^4.1.0, debug@^4.1.1, debug@^4.3.1, debug@^4.4.1:
dependencies:
ms "^2.1.3"
decimal.js-light@^2.5.1:
version "2.5.1"
resolved "https://registry.yarnpkg.com/decimal.js-light/-/decimal.js-light-2.5.1.tgz#134fd32508f19e208f4fb2f8dac0d2626a867934"
integrity sha512-qIMFpTMZmny+MMIitAB6D7iVPEorVw6YQRWkvarTkT4tBeSLLiHzcwj6q0MmYSFCiVpiqPJTJEYIrpcPzVEIvg==
dedent@^1.5.3:
version "1.6.0"
resolved "https://registry.yarnpkg.com/dedent/-/dedent-1.6.0.tgz#79d52d6389b1ffa67d2bcef59ba51847a9d503b2"
@ -1384,6 +1548,11 @@ es-object-atoms@^1.0.0, es-object-atoms@^1.1.1:
dependencies:
es-errors "^1.3.0"
es-toolkit@^1.39.3:
version "1.43.0"
resolved "https://registry.yarnpkg.com/es-toolkit/-/es-toolkit-1.43.0.tgz#2c278d55ffeb30421e6e73a009738ed37b10ef61"
integrity sha512-SKCT8AsWvYzBBuUqMk4NPwFlSdqLpJwmy6AP322ERn8W2YLIB6JBXnwMI2Qsh2gfphT3q7EKAxKb23cvFHFwKA==
esbuild@^0.25.0, "esbuild@npm:esbuild@>=0.17.6 <0.26.0":
version "0.25.5"
resolved "https://registry.yarnpkg.com/esbuild/-/esbuild-0.25.5.tgz#71075054993fdfae76c66586f9b9c1f8d7edd430"
@ -1438,6 +1607,11 @@ eval@0.1.8:
"@types/node" "*" "@types/node" "*"
require-like ">= 0.1.1" require-like ">= 0.1.1"
eventemitter3@^5.0.1:
version "5.0.1"
resolved "https://registry.yarnpkg.com/eventemitter3/-/eventemitter3-5.0.1.tgz#53f5ffd0a492ac800721bb42c66b841de96423c4"
integrity sha512-GWkBvjiSZK87ELrYOSESUYeVIc9mvLLf/nXalMOS5dYrgZq9o5OVkbZAVM06CVxYsCwH9BDZFPlQTlPA1j4ahA==
exit-hook@2.2.1:
version "2.2.1"
resolved "https://registry.yarnpkg.com/exit-hook/-/exit-hook-2.2.1.tgz#007b2d92c6428eda2b76e7016a34351586934593"
@ -1646,11 +1820,26 @@ iconv-lite@0.4.24:
dependencies:
safer-buffer ">= 2.1.2 < 3"
immer@^10.1.1:
version "10.2.0"
resolved "https://registry.yarnpkg.com/immer/-/immer-10.2.0.tgz#88a4ce06a1af64172d254b70f7cb04df51c871b1"
integrity sha512-d/+XTN3zfODyjr89gM3mPq1WNX2B8pYsu7eORitdwyA2sBubnTl3laYlBk4sXY5FUa5qTZGBDPJICVbvqzjlbw==
immer@^11.0.0:
version "11.1.3"
resolved "https://registry.yarnpkg.com/immer/-/immer-11.1.3.tgz#78681e1deb6cec39753acf04eb16d7576c04f4d6"
integrity sha512-6jQTc5z0KJFtr1UgFpIL3N9XSC3saRaI9PwWtzM2pSqkNGtiNkYY2OSwkOGDK2XcTRcLb1pi/aNkKZz0nxVH4Q==
inherits@2.0.4:
version "2.0.4"
resolved "https://registry.yarnpkg.com/inherits/-/inherits-2.0.4.tgz#0fa2c64f932917c3433a0ded55363aae37416b7c"
integrity sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==
"internmap@1 - 2":
version "2.0.3"
resolved "https://registry.yarnpkg.com/internmap/-/internmap-2.0.3.tgz#6685f23755e43c524e251d29cbc97248e3061009"
integrity sha512-5Hh7Y1wQbvY5ooGgPbDaL5iYLAPzMTUrjMulskHLH6wnv/A+1q5rgEaiuqEjB+oxGXIVZs1FF+R/KPN3ZSQYYg==
ipaddr.js@1.9.1:
version "1.9.1"
resolved "https://registry.yarnpkg.com/ipaddr.js/-/ipaddr.js-1.9.1.tgz#bff38543eeb8984825079ff3a2a8e6cbd46781b3"
@ -2180,6 +2369,19 @@ react-dom@^19.1.0:
dependencies:
scheduler "^0.26.0"
react-is@^19.2.3:
version "19.2.3"
resolved "https://registry.yarnpkg.com/react-is/-/react-is-19.2.3.tgz#eec2feb69c7fb31f77d0b5c08c10ae1c88886b29"
integrity sha512-qJNJfu81ByyabuG7hPFEbXqNcWSU3+eVus+KJs+0ncpGfMyYdvSmxiJxbWR65lYi1I+/0HBcliO029gc4F+PnA==
"react-redux@8.x.x || 9.x.x":
version "9.2.0"
resolved "https://registry.yarnpkg.com/react-redux/-/react-redux-9.2.0.tgz#96c3ab23fb9a3af2cb4654be4b51c989e32366f5"
integrity sha512-ROY9fvHhwOD9ySfrF0wmvu//bKCQ6AeZZq1nJNtbDC+kk5DuSuNX/n6YWYF/SYy7bSba4D4FSz8DJeKY/S/r+g==
dependencies:
"@types/use-sync-external-store" "^0.0.6"
use-sync-external-store "^1.4.0"
react-refresh@^0.14.0:
version "0.14.2"
resolved "https://registry.yarnpkg.com/react-refresh/-/react-refresh-0.14.2.tgz#3833da01ce32da470f1f936b9d477da5c7028bf9"
@ -2203,11 +2405,43 @@ readdirp@^4.0.1:
resolved "https://registry.yarnpkg.com/readdirp/-/readdirp-4.1.2.tgz#eb85801435fbf2a7ee58f19e0921b068fc69948d" resolved "https://registry.yarnpkg.com/readdirp/-/readdirp-4.1.2.tgz#eb85801435fbf2a7ee58f19e0921b068fc69948d"
integrity sha512-GDhwkLfywWL2s6vEjyhri+eXmfH6j1L7JE27WhqLeYzoh/A3DBaYGEj2H/HFZCn/kMfim73FXxEJTw06WtxQwg== integrity sha512-GDhwkLfywWL2s6vEjyhri+eXmfH6j1L7JE27WhqLeYzoh/A3DBaYGEj2H/HFZCn/kMfim73FXxEJTw06WtxQwg==
recharts@^3.6.0:
version "3.6.0"
resolved "https://registry.yarnpkg.com/recharts/-/recharts-3.6.0.tgz#403f0606581153601857e46733277d1411633df3"
integrity sha512-L5bjxvQRAe26RlToBAziKUB7whaGKEwD3znoM6fz3DrTowCIC/FnJYnuq1GEzB8Zv2kdTfaxQfi5GoH0tBinyg==
dependencies:
"@reduxjs/toolkit" "1.x.x || 2.x.x"
clsx "^2.1.1"
decimal.js-light "^2.5.1"
es-toolkit "^1.39.3"
eventemitter3 "^5.0.1"
immer "^10.1.1"
react-redux "8.x.x || 9.x.x"
reselect "5.1.1"
tiny-invariant "^1.3.3"
use-sync-external-store "^1.2.2"
victory-vendor "^37.0.2"
redux-thunk@^3.1.0:
version "3.1.0"
resolved "https://registry.yarnpkg.com/redux-thunk/-/redux-thunk-3.1.0.tgz#94aa6e04977c30e14e892eae84978c1af6058ff3"
integrity sha512-NW2r5T6ksUKXCabzhL9z+h206HQw/NJkcLm1GPImRQ8IzfXwRGqjVhKJGauHirT0DAuyy6hjdnMZaRoAcy0Klw==
redux@^5.0.1:
version "5.0.1"
resolved "https://registry.yarnpkg.com/redux/-/redux-5.0.1.tgz#97fa26881ce5746500125585d5642c77b6e9447b"
integrity sha512-M9/ELqF6fy8FwmkpnF0S3YKOqMyoWJ4+CS5Efg2ct3oY9daQvd/Pc71FpGZsVsbl3Cpb+IIcjBDUnnyBdQbq4w==
"require-like@>= 0.1.1":
version "0.1.2"
resolved "https://registry.yarnpkg.com/require-like/-/require-like-0.1.2.tgz#ad6f30c13becd797010c468afa775c0c0a6b47fa"
integrity sha512-oyrU88skkMtDdauHDuKVrgR+zuItqr6/c//FXzvmxRGMexSDc6hNvJInGW3LL46n+8b50RykrvwSUIIQH2LQ5A==
reselect@5.1.1, reselect@^5.1.0:
version "5.1.1"
resolved "https://registry.yarnpkg.com/reselect/-/reselect-5.1.1.tgz#c766b1eb5d558291e5e550298adb0becc24bb72e"
integrity sha512-K/BG6eIky/SBpzfHZv/dd+9JBFiS4SWV7FIujVyJRux6e45+73RaUHXLmIR1f7WOMaQ0U1km6qwklRQxpJJY0w==
retry@^0.12.0:
version "0.12.0"
resolved "https://registry.yarnpkg.com/retry/-/retry-0.12.0.tgz#1b42a6266a21f07421d1b0b54b7dc167b01c013b"
@@ -2492,6 +2726,11 @@ tar@^7.4.3:
mkdirp "^3.0.1"
yallist "^5.0.0"
tiny-invariant@^1.3.3:
version "1.3.3"
resolved "https://registry.yarnpkg.com/tiny-invariant/-/tiny-invariant-1.3.3.tgz#46680b7a873a0d5d10005995eb90a70d74d60127"
integrity sha512-+FbBPE1o9QAYvviau/qC5SE3caw21q3xkvWKBtja5vgqOWIHHJ3ioaq1VPfn/Szqctz2bU/oYeKd9/z5BL+PVg==
tinyglobby@^0.2.13:
version "0.2.14"
resolved "https://registry.yarnpkg.com/tinyglobby/-/tinyglobby-0.2.14.tgz#5280b0cf3f972b050e74ae88406c0a6a58f4079d"
@@ -2566,6 +2805,11 @@ update-browserslist-db@^1.1.3:
escalade "^3.2.0"
picocolors "^1.1.1"
use-sync-external-store@^1.2.2, use-sync-external-store@^1.4.0:
version "1.6.0"
resolved "https://registry.yarnpkg.com/use-sync-external-store/-/use-sync-external-store-1.6.0.tgz#b174bfa65cb2b526732d9f2ac0a408027876f32d"
integrity sha512-Pp6GSwGP/NrPIrxVFAIkOQeyw8lFenOHijQWkUTrDvrF4ALqylP2C/KCkeS9dpUM3KvYRQhna5vt7IL95+ZQ9w==
utils-merge@1.0.1:
version "1.0.1"
resolved "https://registry.yarnpkg.com/utils-merge/-/utils-merge-1.0.1.tgz#9f95710f50a267947b2ccc124741c1028427e713"
@@ -2594,6 +2838,26 @@ vary@~1.1.2:
resolved "https://registry.yarnpkg.com/vary/-/vary-1.1.2.tgz#2299f02c6ded30d4a5961b0b9f74524a18f634fc"
integrity sha512-BNGbWLfd0eUPabhkXUVm0j8uuvREyTh5ovRa/dyow/BqAbZJyC+5fU+IzQOzmAKzYqYRAISoRhdQr3eIZ/PXqg==
victory-vendor@^37.0.2:
version "37.3.6"
resolved "https://registry.yarnpkg.com/victory-vendor/-/victory-vendor-37.3.6.tgz#401ac4b029a0b3d33e0cba8e8a1d765c487254da"
integrity sha512-SbPDPdDBYp+5MJHhBCAyI7wKM3d5ivekigc2Dk2s7pgbZ9wIgIBYGVw4zGHBml/qTFbexrofXW6Gu4noGxrOwQ==
dependencies:
"@types/d3-array" "^3.0.3"
"@types/d3-ease" "^3.0.0"
"@types/d3-interpolate" "^3.0.1"
"@types/d3-scale" "^4.0.2"
"@types/d3-shape" "^3.1.0"
"@types/d3-time" "^3.0.0"
"@types/d3-timer" "^3.0.0"
d3-array "^3.1.6"
d3-ease "^3.0.1"
d3-interpolate "^3.0.1"
d3-scale "^4.0.2"
d3-shape "^3.1.0"
d3-time "^3.0.0"
d3-timer "^3.0.1"
vite-node@^3.1.4, vite-node@^3.2.2:
version "3.2.3"
resolved "https://registry.yarnpkg.com/vite-node/-/vite-node-3.2.3.tgz#1c5a2282fe100114c26fd221daf506e69d392a36"


@@ -0,0 +1,9 @@
-- +goose Up
DELETE FROM artist_releases ar
WHERE NOT EXISTS (
SELECT 1
FROM artist_tracks at
JOIN tracks t ON at.track_id = t.id
WHERE at.artist_id = ar.artist_id
AND t.release_id = ar.release_id
);


@@ -56,22 +56,60 @@ LEFT JOIN artist_aliases aa ON a.id = aa.artist_id
WHERE a.musicbrainz_id = $1
GROUP BY a.id, a.musicbrainz_id, a.image, a.image_source, a.name;
-- name: GetArtistsWithoutImages :many
SELECT
*
FROM artists_with_name
WHERE image IS NULL
AND id > $2
ORDER BY id ASC
LIMIT $1;
-- name: GetTopArtistsPaginated :many
SELECT
+x.id,
+x.name,
+x.musicbrainz_id,
+x.image,
+x.listen_count,
+RANK() OVER (ORDER BY x.listen_count DESC) AS rank
+FROM (
+SELECT
a.id,
a.name,
a.musicbrainz_id,
a.image,
COUNT(*) AS listen_count
FROM listens l
JOIN tracks t ON l.track_id = t.id
JOIN artist_tracks at ON at.track_id = t.id
JOIN artists_with_name a ON a.id = at.artist_id
WHERE l.listened_at BETWEEN $1 AND $2
-GROUP BY a.id, a.name, a.musicbrainz_id, a.image, a.image_source, a.name
-ORDER BY listen_count DESC, a.id
+GROUP BY a.id, a.name, a.musicbrainz_id, a.image
+) x
+ORDER BY x.listen_count DESC, x.id
LIMIT $3 OFFSET $4;
-- name: GetArtistAllTimeRank :one
SELECT
artist_id,
rank
FROM (
SELECT
x.artist_id,
RANK() OVER (ORDER BY x.listen_count DESC) AS rank
FROM (
SELECT
at.artist_id,
COUNT(*) AS listen_count
FROM listens l
JOIN tracks t ON l.track_id = t.id
JOIN artist_tracks at ON t.id = at.track_id
GROUP BY at.artist_id
) x
) y
WHERE artist_id = $1;
-- name: CountTopArtists :one
SELECT COUNT(DISTINCT at.artist_id) AS total_count
FROM listens l


@@ -3,7 +3,13 @@ DO $$
BEGIN
DELETE FROM tracks WHERE id NOT IN (SELECT l.track_id FROM listens l);
DELETE FROM releases WHERE id NOT IN (SELECT t.release_id FROM tracks t);
--- DELETE FROM releases WHERE release_group_id NOT IN (SELECT t.release_group_id FROM tracks t);
--- DELETE FROM releases WHERE release_group_id NOT IN (SELECT rg.id FROM release_groups rg);
DELETE FROM artists WHERE id NOT IN (SELECT at.artist_id FROM artist_tracks at);
DELETE FROM artist_releases ar
WHERE NOT EXISTS (
SELECT 1
FROM artist_tracks at
JOIN tracks t ON at.track_id = t.id
WHERE at.artist_id = ar.artist_id
AND t.release_id = ar.release_id
);
END $$;

db/queries/interest.sql (new file, 139 lines)

@@ -0,0 +1,139 @@
-- name: GetGroupedListensFromArtist :many
WITH bounds AS (
SELECT
MIN(l.listened_at) AS start_time,
NOW() AS end_time
FROM listens l
JOIN tracks t ON t.id = l.track_id
JOIN artist_tracks at ON at.track_id = t.id
WHERE at.artist_id = $1
),
stats AS (
SELECT
start_time,
end_time,
EXTRACT(EPOCH FROM (end_time - start_time)) AS total_seconds,
((end_time - start_time) / sqlc.arg(bucket_count)::int) AS bucket_interval
FROM bounds
),
bucket_series AS (
SELECT generate_series(0, sqlc.arg(bucket_count)::int - 1) AS idx
),
listen_indices AS (
SELECT
LEAST(
sqlc.arg(bucket_count)::int - 1,
FLOOR(
(EXTRACT(EPOCH FROM (l.listened_at - s.start_time)) / NULLIF(s.total_seconds, 0))
* sqlc.arg(bucket_count)::int
)::int
) AS bucket_idx
FROM listens l
JOIN tracks t ON t.id = l.track_id
JOIN artist_tracks at ON at.track_id = t.id
CROSS JOIN stats s
WHERE at.artist_id = $1
AND s.start_time IS NOT NULL
)
SELECT
(s.start_time + (s.bucket_interval * bs.idx))::timestamptz AS bucket_start,
(s.start_time + (s.bucket_interval * (bs.idx + 1)))::timestamptz AS bucket_end,
COUNT(li.bucket_idx) AS listen_count
FROM bucket_series bs
CROSS JOIN stats s
LEFT JOIN listen_indices li ON bs.idx = li.bucket_idx
WHERE s.start_time IS NOT NULL
GROUP BY bs.idx, s.start_time, s.bucket_interval
ORDER BY bs.idx;
-- name: GetGroupedListensFromRelease :many
WITH bounds AS (
SELECT
MIN(l.listened_at) AS start_time,
NOW() AS end_time
FROM listens l
JOIN tracks t ON t.id = l.track_id
WHERE t.release_id = $1
),
stats AS (
SELECT
start_time,
end_time,
EXTRACT(EPOCH FROM (end_time - start_time)) AS total_seconds,
((end_time - start_time) / sqlc.arg(bucket_count)::int) AS bucket_interval
FROM bounds
),
bucket_series AS (
SELECT generate_series(0, sqlc.arg(bucket_count)::int - 1) AS idx
),
listen_indices AS (
SELECT
LEAST(
sqlc.arg(bucket_count)::int - 1,
FLOOR(
(EXTRACT(EPOCH FROM (l.listened_at - s.start_time)) / NULLIF(s.total_seconds, 0))
* sqlc.arg(bucket_count)::int
)::int
) AS bucket_idx
FROM listens l
JOIN tracks t ON t.id = l.track_id
CROSS JOIN stats s
WHERE t.release_id = $1
AND s.start_time IS NOT NULL
)
SELECT
(s.start_time + (s.bucket_interval * bs.idx))::timestamptz AS bucket_start,
(s.start_time + (s.bucket_interval * (bs.idx + 1)))::timestamptz AS bucket_end,
COUNT(li.bucket_idx) AS listen_count
FROM bucket_series bs
CROSS JOIN stats s
LEFT JOIN listen_indices li ON bs.idx = li.bucket_idx
WHERE s.start_time IS NOT NULL
GROUP BY bs.idx, s.start_time, s.bucket_interval
ORDER BY bs.idx;
-- name: GetGroupedListensFromTrack :many
WITH bounds AS (
SELECT
MIN(l.listened_at) AS start_time,
NOW() AS end_time
FROM listens l
JOIN tracks t ON t.id = l.track_id
WHERE t.id = $1
),
stats AS (
SELECT
start_time,
end_time,
EXTRACT(EPOCH FROM (end_time - start_time)) AS total_seconds,
((end_time - start_time) / sqlc.arg(bucket_count)::int) AS bucket_interval
FROM bounds
),
bucket_series AS (
SELECT generate_series(0, sqlc.arg(bucket_count)::int - 1) AS idx
),
listen_indices AS (
SELECT
LEAST(
sqlc.arg(bucket_count)::int - 1,
FLOOR(
(EXTRACT(EPOCH FROM (l.listened_at - s.start_time)) / NULLIF(s.total_seconds, 0))
* sqlc.arg(bucket_count)::int
)::int
) AS bucket_idx
FROM listens l
JOIN tracks t ON t.id = l.track_id
CROSS JOIN stats s
WHERE t.id = $1
AND s.start_time IS NOT NULL
)
SELECT
(s.start_time + (s.bucket_interval * bs.idx))::timestamptz AS bucket_start,
(s.start_time + (s.bucket_interval * (bs.idx + 1)))::timestamptz AS bucket_end,
COUNT(li.bucket_idx) AS listen_count
FROM bucket_series bs
CROSS JOIN stats s
LEFT JOIN listen_indices li ON bs.idx = li.bucket_idx
WHERE s.start_time IS NOT NULL
GROUP BY bs.idx, s.start_time, s.bucket_interval
ORDER BY bs.idx;
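The three bucketing queries above share the same index arithmetic: `LEAST(bucket_count - 1, FLOOR(elapsed / total * bucket_count))` maps each listen onto one of `bucket_count` equal time buckets, clamping the final instant into the last bucket. A standalone Go sketch of that formula (the function and names are mine, not from this PR):

```go
package main

import "fmt"

// bucketIndex mirrors the SQL expression
// LEAST(n-1, FLOOR(elapsed/total * n)): it maps a listen's offset
// within [0, total] seconds onto one of n equal buckets, clamping
// the boundary case (elapsed == total) into the last bucket.
func bucketIndex(elapsed, total float64, n int) int {
	if total <= 0 { // mirrors NULLIF(total_seconds, 0): no time range yet
		return 0
	}
	idx := int(elapsed / total * float64(n)) // truncation == FLOOR for non-negative values
	if idx > n-1 {
		idx = n - 1
	}
	return idx
}

func main() {
	// 100 seconds of listen history split into 10 buckets.
	fmt.Println(bucketIndex(0, 100, 10))   // first listen -> bucket 0
	fmt.Println(bucketIndex(55, 100, 10))  // mid-range -> bucket 5
	fmt.Println(bucketIndex(100, 100, 10)) // boundary clamps -> bucket 9
}
```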


@@ -4,7 +4,7 @@ VALUES ($1, $2, $3, $4)
ON CONFLICT DO NOTHING;
-- name: GetLastListensPaginated :many
SELECT
l.*,
t.title AS track_title,
t.release_id AS release_id,
@@ -16,31 +16,31 @@ ORDER BY l.listened_at DESC
LIMIT $3 OFFSET $4;
-- name: GetLastListensFromArtistPaginated :many
SELECT
l.*,
t.title AS track_title,
t.release_id AS release_id,
get_artists_for_track(t.id) AS artists
FROM listens l
JOIN tracks_with_title t ON l.track_id = t.id
JOIN artist_tracks at ON t.id = at.track_id
WHERE at.artist_id = $5
AND l.listened_at BETWEEN $1 AND $2
ORDER BY l.listened_at DESC
LIMIT $3 OFFSET $4;
-- name: GetFirstListenFromArtist :one
SELECT
l.*
FROM listens l
JOIN tracks_with_title t ON l.track_id = t.id
JOIN artist_tracks at ON t.id = at.track_id
WHERE at.artist_id = $1
ORDER BY l.listened_at ASC
LIMIT 1;
-- name: GetLastListensFromReleasePaginated :many
SELECT
l.*,
t.title AS track_title,
t.release_id AS release_id,
@@ -53,7 +53,7 @@ ORDER BY l.listened_at DESC
LIMIT $3 OFFSET $4;
-- name: GetFirstListenFromRelease :one
SELECT
l.*
FROM listens l
JOIN tracks t ON l.track_id = t.id
@@ -62,7 +62,7 @@ ORDER BY l.listened_at ASC
LIMIT 1;
-- name: GetLastListensFromTrackPaginated :many
SELECT
l.*,
t.title AS track_title,
t.release_id AS release_id,
@@ -75,7 +75,7 @@ ORDER BY l.listened_at DESC
LIMIT $3 OFFSET $4;
-- name: GetFirstListenFromTrack :one
SELECT
l.*
FROM listens l
JOIN tracks t ON l.track_id = t.id
@@ -83,6 +83,13 @@ WHERE t.id = $1
ORDER BY l.listened_at ASC
LIMIT 1;
-- name: GetFirstListen :one
SELECT
*
FROM listens
ORDER BY listened_at ASC
LIMIT 1;
-- name: CountListens :one
SELECT COUNT(*) AS total_count
FROM listens l
@@ -137,90 +144,51 @@ WHERE l.listened_at BETWEEN $1 AND $2
AND t.id = $3;
-- name: ListenActivity :many
-WITH buckets AS (
-SELECT generate_series($1::timestamptz, $2::timestamptz, $3::interval) AS bucket_start
-),
-bucketed_listens AS (
-SELECT
-b.bucket_start,
-COUNT(l.listened_at) AS listen_count
-FROM buckets b
-LEFT JOIN listens l
-ON l.listened_at >= b.bucket_start
-AND l.listened_at < b.bucket_start + $3::interval
-GROUP BY b.bucket_start
-ORDER BY b.bucket_start
-)
-SELECT * FROM bucketed_listens;
+SELECT
+(listened_at AT TIME ZONE $1::text)::date as day,
+COUNT(*) AS listen_count
+FROM listens
+WHERE listened_at >= $2
+AND listened_at < $3
+GROUP BY day
+ORDER BY day;
-- name: ListenActivityForArtist :many
-WITH buckets AS (
-SELECT generate_series($1::timestamptz, $2::timestamptz, $3::interval) AS bucket_start
-),
-filtered_listens AS (
-SELECT l.*
-FROM listens l
-JOIN artist_tracks t ON l.track_id = t.track_id
-WHERE t.artist_id = $4
-),
-bucketed_listens AS (
-SELECT
-b.bucket_start,
-COUNT(l.listened_at) AS listen_count
-FROM buckets b
-LEFT JOIN filtered_listens l
-ON l.listened_at >= b.bucket_start
-AND l.listened_at < b.bucket_start + $3::interval
-GROUP BY b.bucket_start
-ORDER BY b.bucket_start
-)
-SELECT * FROM bucketed_listens;
+SELECT
+(listened_at AT TIME ZONE $1::text)::date as day,
+COUNT(*) AS listen_count
+FROM listens l
+JOIN tracks t ON l.track_id = t.id
+JOIN artist_tracks at ON t.id = at.track_id
+WHERE l.listened_at >= $2
+AND l.listened_at < $3
+AND at.artist_id = $4
+GROUP BY day
+ORDER BY day;
-- name: ListenActivityForRelease :many
-WITH buckets AS (
-SELECT generate_series($1::timestamptz, $2::timestamptz, $3::interval) AS bucket_start
-),
-filtered_listens AS (
-SELECT l.*
-FROM listens l
-JOIN tracks t ON l.track_id = t.id
-WHERE t.release_id = $4
-),
-bucketed_listens AS (
-SELECT
-b.bucket_start,
-COUNT(l.listened_at) AS listen_count
-FROM buckets b
-LEFT JOIN filtered_listens l
-ON l.listened_at >= b.bucket_start
-AND l.listened_at < b.bucket_start + $3::interval
-GROUP BY b.bucket_start
-ORDER BY b.bucket_start
-)
-SELECT * FROM bucketed_listens;
+SELECT
+(listened_at AT TIME ZONE $1::text)::date as day,
+COUNT(*) AS listen_count
+FROM listens l
+JOIN tracks t ON l.track_id = t.id
+WHERE l.listened_at >= $2
+AND l.listened_at < $3
+AND t.release_id = $4
+GROUP BY day
+ORDER BY day;
-- name: ListenActivityForTrack :many
-WITH buckets AS (
-SELECT generate_series($1::timestamptz, $2::timestamptz, $3::interval) AS bucket_start
-),
-filtered_listens AS (
-SELECT l.*
-FROM listens l
-JOIN tracks t ON l.track_id = t.id
-WHERE t.id = $4
-),
-bucketed_listens AS (
-SELECT
-b.bucket_start,
-COUNT(l.listened_at) AS listen_count
-FROM buckets b
-LEFT JOIN filtered_listens l
-ON l.listened_at >= b.bucket_start
-AND l.listened_at < b.bucket_start + $3::interval
-GROUP BY b.bucket_start
-ORDER BY b.bucket_start
-)
-SELECT * FROM bucketed_listens;
+SELECT
+(listened_at AT TIME ZONE $1::text)::date as day,
+COUNT(*) AS listen_count
+FROM listens l
+JOIN tracks t ON l.track_id = t.id
+WHERE l.listened_at >= $2
+AND l.listened_at < $3
+AND t.id = $4
+GROUP BY day
+ORDER BY day;
-- name: UpdateTrackIdForListens :exec
UPDATE listens SET track_id = $2


@@ -32,34 +32,76 @@ JOIN artist_releases ar ON r.id = ar.release_id
WHERE r.title = ANY ($1::TEXT[]) AND ar.artist_id = $2
LIMIT 1;
-- name: GetReleaseByArtistAndTitlesNoMbzID :one
SELECT r.*
FROM releases_with_title r
JOIN artist_releases ar ON r.id = ar.release_id
WHERE r.title = ANY ($1::TEXT[])
AND ar.artist_id = $2
AND EXISTS (
SELECT 1
FROM releases r2
WHERE r2.id = r.id
AND r2.musicbrainz_id IS NULL
);
-- name: GetTopReleasesFromArtist :many
SELECT
-r.*,
-COUNT(*) AS listen_count,
-get_artists_for_release(r.id) AS artists
-FROM listens l
-JOIN tracks t ON l.track_id = t.id
-JOIN releases_with_title r ON t.release_id = r.id
-JOIN artist_releases ar ON r.id = ar.release_id
-WHERE ar.artist_id = $5
-AND l.listened_at BETWEEN $1 AND $2
-GROUP BY r.id, r.title, r.musicbrainz_id, r.various_artists, r.image, r.image_source
-ORDER BY listen_count DESC, r.id
+x.*,
+get_artists_for_release(x.id) AS artists,
+RANK() OVER (ORDER BY x.listen_count DESC) AS rank
+FROM (
+SELECT
+r.*,
+COUNT(*) AS listen_count
+FROM listens l
+JOIN tracks t ON l.track_id = t.id
+JOIN releases_with_title r ON t.release_id = r.id
+JOIN artist_releases ar ON r.id = ar.release_id
+WHERE ar.artist_id = $5
+AND l.listened_at BETWEEN $1 AND $2
+GROUP BY r.id, r.title, r.musicbrainz_id, r.various_artists, r.image, r.image_source
+) x
+ORDER BY listen_count DESC, x.id
LIMIT $3 OFFSET $4;
-- name: GetTopReleasesPaginated :many
SELECT
-r.*,
-COUNT(*) AS listen_count,
-get_artists_for_release(r.id) AS artists
-FROM listens l
-JOIN tracks t ON l.track_id = t.id
-JOIN releases_with_title r ON t.release_id = r.id
-WHERE l.listened_at BETWEEN $1 AND $2
-GROUP BY r.id, r.title, r.musicbrainz_id, r.various_artists, r.image, r.image_source
-ORDER BY listen_count DESC, r.id
+x.*,
+get_artists_for_release(x.id) AS artists,
+RANK() OVER (ORDER BY x.listen_count DESC) AS rank
+FROM (
+SELECT
+r.*,
+COUNT(*) AS listen_count
+FROM listens l
+JOIN tracks t ON l.track_id = t.id
+JOIN releases_with_title r ON t.release_id = r.id
+WHERE l.listened_at BETWEEN $1 AND $2
+GROUP BY r.id, r.title, r.musicbrainz_id, r.various_artists, r.image, r.image_source
+) x
+ORDER BY listen_count DESC, x.id
LIMIT $3 OFFSET $4;
-- name: GetReleaseAllTimeRank :one
SELECT
release_id,
rank
FROM (
SELECT
x.release_id,
RANK() OVER (ORDER BY x.listen_count DESC) AS rank
FROM (
SELECT
t.release_id,
COUNT(*) AS listen_count
FROM listens l
JOIN tracks t ON l.track_id = t.id
GROUP BY t.release_id
) x
) y
WHERE release_id = $1;
-- name: CountTopReleases :one
SELECT COUNT(DISTINCT r.id) AS total_count
FROM listens l


@@ -27,68 +27,112 @@ FROM tracks_with_title t
JOIN artist_tracks at ON t.id = at.track_id
WHERE at.artist_id = $1;
--- name: GetTrackByTitleAndArtists :one
+-- name: GetTrackByTrackInfo :one
SELECT t.*
FROM tracks_with_title t
JOIN artist_tracks at ON at.track_id = t.id
WHERE t.title = $1
-AND at.artist_id = ANY($2::int[])
+AND at.artist_id = ANY($3::int[])
+AND t.release_id = $2
GROUP BY t.id, t.title, t.musicbrainz_id, t.duration, t.release_id
-HAVING COUNT(DISTINCT at.artist_id) = cardinality($2::int[]);
+HAVING COUNT(DISTINCT at.artist_id) = cardinality($3::int[]);
-- name: GetTopTracksPaginated :many
SELECT
-t.id,
+x.track_id AS id,
t.title,
t.musicbrainz_id,
t.release_id,
r.image,
-COUNT(*) AS listen_count,
-get_artists_for_track(t.id) AS artists
-FROM listens l
-JOIN tracks_with_title t ON l.track_id = t.id
+x.listen_count,
+get_artists_for_track(x.track_id) AS artists,
+x.rank
+FROM (
+SELECT
+track_id,
+COUNT(*) AS listen_count,
+RANK() OVER (ORDER BY COUNT(*) DESC) as rank
+FROM listens
+WHERE listened_at BETWEEN $1 AND $2
+GROUP BY track_id
+ORDER BY listen_count DESC
+LIMIT $3 OFFSET $4
+) x
+JOIN tracks_with_title t ON x.track_id = t.id
JOIN releases r ON t.release_id = r.id
-WHERE l.listened_at BETWEEN $1 AND $2
-GROUP BY t.id, t.title, t.musicbrainz_id, t.release_id, r.image
-ORDER BY listen_count DESC, t.id
-LIMIT $3 OFFSET $4;
+ORDER BY x.listen_count DESC, x.track_id;
-- name: GetTopTracksByArtistPaginated :many
SELECT
-t.id,
+x.track_id AS id,
t.title,
t.musicbrainz_id,
t.release_id,
r.image,
-COUNT(*) AS listen_count,
-get_artists_for_track(t.id) AS artists
-FROM listens l
-JOIN tracks_with_title t ON l.track_id = t.id
+x.listen_count,
+get_artists_for_track(x.track_id) AS artists,
+x.rank
+FROM (
+SELECT
+l.track_id,
+COUNT(*) AS listen_count,
+RANK() OVER (ORDER BY COUNT(*) DESC) as rank
+FROM listens l
+JOIN artist_tracks at ON l.track_id = at.track_id
+WHERE l.listened_at BETWEEN $1 AND $2
+AND at.artist_id = $5
+GROUP BY l.track_id
+ORDER BY listen_count DESC
+LIMIT $3 OFFSET $4
+) x
+JOIN tracks_with_title t ON x.track_id = t.id
JOIN releases r ON t.release_id = r.id
-JOIN artist_tracks at ON at.track_id = t.id
-WHERE l.listened_at BETWEEN $1 AND $2
-AND at.artist_id = $5
-GROUP BY t.id, t.title, t.musicbrainz_id, t.release_id, r.image
-ORDER BY listen_count DESC, t.id
-LIMIT $3 OFFSET $4;
+ORDER BY x.listen_count DESC, x.track_id;
-- name: GetTopTracksInReleasePaginated :many
SELECT
-t.id,
+x.track_id AS id,
t.title,
t.musicbrainz_id,
t.release_id,
r.image,
-COUNT(*) AS listen_count,
-get_artists_for_track(t.id) AS artists
-FROM listens l
-JOIN tracks_with_title t ON l.track_id = t.id
+x.listen_count,
+get_artists_for_track(x.track_id) AS artists,
+x.rank
+FROM (
+SELECT
+l.track_id,
+COUNT(*) AS listen_count,
+RANK() OVER (ORDER BY COUNT(*) DESC) as rank
+FROM listens l
+JOIN tracks t ON l.track_id = t.id
+WHERE l.listened_at BETWEEN $1 AND $2
+AND t.release_id = $5
+GROUP BY l.track_id
+ORDER BY listen_count DESC
+LIMIT $3 OFFSET $4
+) x
+JOIN tracks_with_title t ON x.track_id = t.id
JOIN releases r ON t.release_id = r.id
-WHERE l.listened_at BETWEEN $1 AND $2
-AND t.release_id = $5
-GROUP BY t.id, t.title, t.musicbrainz_id, t.release_id, r.image
-ORDER BY listen_count DESC, t.id
-LIMIT $3 OFFSET $4;
+ORDER BY x.listen_count DESC, x.track_id;
-- name: GetTrackAllTimeRank :one
SELECT
id,
rank
FROM (
SELECT
x.id,
RANK() OVER (ORDER BY x.listen_count DESC) AS rank
FROM (
SELECT
t.id,
COUNT(*) AS listen_count
FROM listens l
JOIN tracks_with_title t ON l.track_id = t.id
GROUP BY t.id) x
) y
WHERE id = $1;
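The new rank queries all rely on `RANK() OVER (ORDER BY listen_count DESC)`: tied listen counts share a rank, and the rank after a tie skips ahead by the number of tied rows. A standalone Go sketch of that tie behavior (illustration only, not code from this PR):

```go
package main

import (
	"fmt"
	"sort"
)

// sqlRank reproduces SQL's RANK() for a set of listen counts:
// counts are sorted descending, equal counts share a rank, and the
// next distinct count's rank counts every row ahead of it, so ties
// leave gaps (1, 2, 2, 4, ...).
func sqlRank(counts []int) []int {
	sorted := append([]int(nil), counts...)
	sort.Sort(sort.Reverse(sort.IntSlice(sorted)))
	ranks := make([]int, len(sorted))
	for i := range sorted {
		if i > 0 && sorted[i] == sorted[i-1] {
			ranks[i] = ranks[i-1] // tie: share the previous rank
		} else {
			ranks[i] = i + 1 // rank = 1 + number of rows strictly ahead
		}
	}
	return ranks
}

func main() {
	// Listen counts 40, 30, 30, 10 produce ranks 1, 2, 2, 4.
	fmt.Println(sqlRank([]int{40, 30, 30, 10}))
}
```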
-- name: CountTopTracks :one
SELECT COUNT(DISTINCT l.track_id) AS total_count
@@ -136,3 +180,13 @@ WHERE artist_id = $1 AND track_id = $2;
-- name: DeleteTrack :exec
DELETE FROM tracks WHERE id = $1;
-- name: GetTracksWithNoDurationButHaveMbzID :many
SELECT
*
FROM tracks_with_title
WHERE duration = 0
AND musicbrainz_id IS NOT NULL
AND id > $2
ORDER BY id ASC
LIMIT $1;


@@ -1,57 +1,69 @@
// @ts-check
import { defineConfig } from "astro/config";
import starlight from "@astrojs/starlight";
import tailwindcss from "@tailwindcss/vite";

// https://astro.build/config
export default defineConfig({
  integrations: [
    starlight({
      head: [
        {
          tag: "script",
          attrs: {
            src: "https://static.cloudflareinsights.com/beacon.min.js",
            "data-cf-beacon": '{"token": "1948caaaba10463fa1d310ee02b0951c"}',
            defer: true,
          },
        },
      ],
      title: "Koito",
      logo: {
        src: "./src/assets/logo_text.png",
        replacesTitle: true,
      },
      social: [
        {
          icon: "github",
          label: "GitHub",
          href: "https://github.com/gabehf/koito",
        },
      ],
      sidebar: [
        {
          label: "Guides",
          items: [
            // Each item here is one entry in the navigation menu.
            { label: "Installation", slug: "guides/installation" },
            { label: "Importing Data", slug: "guides/importing" },
            { label: "Setting up the Scrobbler", slug: "guides/scrobbler" },
            { label: "Editing Data", slug: "guides/editing" },
          ],
        },
        {
          label: "Quickstart",
          items: [
            { label: "Setup with Navidrome", slug: "quickstart/navidrome" },
          ],
        },
        {
          label: "Reference",
          items: [
            { label: "Configuration Options", slug: "reference/configuration" },
          ],
        },
      ],
      customCss: [
        // Path to your Tailwind base styles:
        "./src/styles/global.css",
      ],
    }),
  ],
  site: "https://koito.io",
  vite: {
    plugins: [tailwindcss()],
  },
});

Binary file not shown (new image, 178 KiB).


@@ -28,7 +28,7 @@ import { Card, CardGrid } from '@astrojs/starlight/components';
Koito can be connected to any music server or client that allows for custom ListenBrainz URLs.
</Card>
<Card title="Scrobbler relay" icon="rocket">
-Automatically relay listens submitted to your Koito instance to other ListenBrainz compatble servers.
+Automatically relay listens submitted to your Koito instance to other ListenBrainz compatible servers.
</Card>
<Card title="Automatic data fetching" icon="download">
Koito automatically fetches data from MusicBrainz and images from Deezer and Cover Art Archive to compliment what is provided by your music server.


@@ -0,0 +1,68 @@
---
title: Navidrome Quickstart
description: How to set up Koito to work with your Navidrome instance.
---
## Configure Koito
This quickstart assumes you are using Docker Compose. Below is an example file, adapted from the one I use personally.
```yaml title="compose.yaml"
services:
koito:
image: gabehf/koito:latest
container_name: koito
depends_on:
- db
user: 1000:1000
environment:
- KOITO_DATABASE_URL=postgres://postgres:<a_super_random_string>@db:5432/koitodb
- KOITO_ALLOWED_HOSTS=koito.mydomain.com,192.168.1.100
- KOITO_SUBSONIC_URL=https://navidrome.mydomain.com # the url to your navidrome instance
- KOITO_SUBSONIC_PARAMS=u=<navidrome_username>&t=<navidrome_token>&s=<navidrome_salt>
- KOITO_DEFAULT_THEME=black # i like this theme, use whatever you want
ports:
- "4110:4110"
volumes:
- ./koito-data:/etc/koito
restart: unless-stopped
db:
user: 1000:1000
image: postgres:16
container_name: psql
restart: unless-stopped
environment:
POSTGRES_DB: koitodb
POSTGRES_USER: postgres
POSTGRES_PASSWORD: <a_super_random_string>
volumes:
- ./db-data:/var/lib/postgresql/data
```
### How do I get the Subsonic params?
The easiest way to get your Subsonic parameters is to open your browser and sign in to Navidrome, then press F12 to open
the developer tools and navigate to the **Network** tab. Find a `getCoverArt` request (there should be a lot on the home
page) and look for the part of the URL that looks like `u=<username>&t=<random_string>&s=<small_random_string>`. This
is what you need to copy and provide to Koito.
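Given a copied `getCoverArt` URL, pulling out and reassembling the three parameters can be sketched in Go like this (the URL and helper function are hypothetical examples, not part of Koito):

```go
package main

import (
	"fmt"
	"net/url"
)

// extractSubsonicParams pulls the u, t, and s values out of a
// getCoverArt URL copied from the browser's Network tab and
// reassembles them in the format KOITO_SUBSONIC_PARAMS expects.
func extractSubsonicParams(raw string) (string, error) {
	u, err := url.Parse(raw)
	if err != nil {
		return "", err
	}
	q := u.Query()
	for _, k := range []string{"u", "t", "s"} {
		if q.Get(k) == "" {
			return "", fmt.Errorf("missing %q parameter", k)
		}
	}
	return fmt.Sprintf("u=%s&t=%s&s=%s", q.Get("u"), q.Get("t"), q.Get("s")), nil
}

func main() {
	// Example URL with made-up credentials.
	raw := "https://navidrome.mydomain.com/rest/getCoverArt?u=alice&t=abc123&s=xyz&id=42"
	params, _ := extractSubsonicParams(raw)
	fmt.Println(params) // u=alice&t=abc123&s=xyz
}
```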
:::note
If you don't want to use Navidrome to provide images to Koito, you can skip the `KOITO_SUBSONIC_URL` and `KOITO_SUBSONIC_PARAMS`
variables entirely.
:::
## Configure Navidrome
You have to provide Navidrome with the environment variables `ND_LISTENBRAINZ_ENABLED=true` and
`ND_LISTENBRAINZ_BASEURL=<your_koito_url>/apis/listenbrainz/1`. The place where you edit these environment variables will change
depending on how you have chosen to deploy Navidrome.
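Under Docker Compose, those two variables might be set on the Navidrome service like this (the service name, image tag, and Koito URL are placeholders):

```yaml
services:
  navidrome:
    image: deluan/navidrome:latest
    environment:
      # Send scrobbles to Koito's ListenBrainz-compatible endpoint.
      ND_LISTENBRAINZ_ENABLED: "true"
      ND_LISTENBRAINZ_BASEURL: "https://koito.mydomain.com/apis/listenbrainz/1"
```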
## Enable ListenBrainz in Navidrome
In Navidrome, click on **Settings** in the top right, then click **Personal**.
Here, you will see that **Scrobble to ListenBrainz** is turned off. Flip that switch on.
![navidrome listenbrainz switch screenshot](../../../assets/navidrome_lbz_switch.png)
When you flip it on, Navidrome will prompt you for a ListenBrainz token. To get this token, open your Koito page and sign in.
Press the settings button (or hit `\`) and go to the **API Keys** tab. Copy the autogenerated API key by either clicking the
copy button, or clicking on the key itself and copying with ctrl+c.
After hitting **Save** in Navidrome, your listen activity will start being sent to Koito as you listen to tracks.
Happy scrobbling!


@@ -64,6 +64,8 @@ If the environment variable is defined without **and** with the suffix at the sa
##### KOITO_CONFIG_DIR ##### KOITO_CONFIG_DIR
- Default: `/etc/koito` - Default: `/etc/koito`
- Description: The location where import folders and image caches are stored. - Description: The location where import folders and image caches are stored.
##### KOITO_FORCE_TZ
- Description: A canonical IANA database time zone name (https://en.wikipedia.org/wiki/List_of_tz_database_time_zones) that Koito will use to serve all clients. Overrides any timezones requested via a `tz` cookie or `tz` query parameter. Koito will fail to start if this value is invalid.
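A name is valid when Go's standard time package can load it; a minimal sketch of that kind of startup check (my illustration, not Koito's actual code):

```go
package main

import (
	"fmt"
	"time"

	// Embed the IANA tz database so the check works even without
	// system zoneinfo files (e.g. in minimal containers).
	_ "time/tzdata"
)

// validateTZ reports whether name is a loadable IANA time zone,
// which is roughly the check implied by "Koito will fail to start
// if this value is invalid".
func validateTZ(name string) error {
	_, err := time.LoadLocation(name)
	return err
}

func main() {
	fmt.Println(validateTZ("America/New_York") == nil) // a canonical zone loads
	fmt.Println(validateTZ("Not/AZone") == nil)        // an invalid name errors
}
```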
##### KOITO_DISABLE_DEEZER ##### KOITO_DISABLE_DEEZER
- Default: `false` - Default: `false`
- Description: Disables Deezer as a source for finding artist and album images. - Description: Disables Deezer as a source for finding artist and album images.
@ -78,6 +80,13 @@ If the environment variable is defined without **and** with the suffix at the sa
##### KOITO_SUBSONIC_PARAMS ##### KOITO_SUBSONIC_PARAMS
- Required: `true` if KOITO_SUBSONIC_URL is set - Required: `true` if KOITO_SUBSONIC_URL is set
- Description: The `u`, `t`, and `s` authentication parameters to use for authenticated requests to your subsonic server, in the format `u=XXX&t=XXX&s=XXX`. An easy way to find them is to open the network tab in the developer tools of your browser of choice and copy them from a request. - Description: The `u`, `t`, and `s` authentication parameters to use for authenticated requests to your subsonic server, in the format `u=XXX&t=XXX&s=XXX`. An easy way to find them is to open the network tab in the developer tools of your browser of choice and copy them from a request.
:::caution
If Koito is unable to validate your Subsonic configuration, it will fail to start. If you notice your container isn't running after
changing these parameters, check the logs!
:::
##### KOITO_LASTFM_API_KEY
- Required: `false`
- Description: Your LastFM API key, which will be used for fetching images if provided. You can get an API key [here](https://www.last.fm/api/authentication).
##### KOITO_SKIP_IMPORT ##### KOITO_SKIP_IMPORT
- Default: `false` - Default: `false`
- Description: Skips running the importer on startup. - Description: Skips running the importer on startup.
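Since an invalid `KOITO_FORCE_TZ` value prevents startup, it can be handy to check a candidate value before deploying. A minimal sketch using Go's `time.LoadLocation` (the `validateForceTZ` helper is illustrative, not Koito's actual startup code):

```go
package main

import (
	"fmt"
	"time"
	_ "time/tzdata" // embed the tz database so lookups work in minimal containers
)

// validateForceTZ checks a KOITO_FORCE_TZ-style value the way a server
// might at startup: the name must resolve in the IANA time zone database.
func validateForceTZ(tz string) (*time.Location, error) {
	loc, err := time.LoadLocation(tz)
	if err != nil {
		return nil, fmt.Errorf("invalid time zone %q: %w", tz, err)
	}
	return loc, nil
}

func main() {
	if loc, err := validateForceTZ("America/New_York"); err == nil {
		fmt.Println("forcing time zone:", loc)
	}
}
```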

View file

@ -2,6 +2,7 @@ package engine
import ( import (
"context" "context"
"encoding/json"
"fmt" "fmt"
"io" "io"
"net/http" "net/http"
@ -95,6 +96,10 @@ func Run(
defer store.Close(ctx) defer store.Close(ctx)
l.Info().Msg("Engine: Database connection established") l.Info().Msg("Engine: Database connection established")
if cfg.ForceTZ() != nil {
l.Debug().Msgf("Engine: Forcing the use of timezone '%s'", cfg.ForceTZ().String())
}
l.Debug().Msg("Engine: Initializing MusicBrainz client") l.Debug().Msg("Engine: Initializing MusicBrainz client")
var mbzC mbz.MusicBrainzCaller var mbzC mbz.MusicBrainzCaller
if !cfg.MusicBrainzDisabled() { if !cfg.MusicBrainzDisabled() {
@ -105,12 +110,39 @@ func Run(
l.Warn().Msg("Engine: MusicBrainz client disabled") l.Warn().Msg("Engine: MusicBrainz client disabled")
} }
if cfg.SubsonicEnabled() {
l.Debug().Msg("Engine: Checking Subsonic configuration")
pingURL := cfg.SubsonicUrl() + "/rest/ping.view?" + cfg.SubsonicParams() + "&f=json&v=1&c=koito"
resp, err := http.Get(pingURL)
if err != nil {
l.Fatal().Err(err).Msg("Engine: Failed to contact Subsonic server! Ensure the provided URL is correct")
} else {
defer resp.Body.Close()
var result struct {
Response struct {
Status string `json:"status"`
} `json:"subsonic-response"`
}
if err := json.NewDecoder(resp.Body).Decode(&result); err != nil {
l.Fatal().Err(err).Msg("Engine: Failed to parse Subsonic response")
} else if result.Response.Status != "ok" {
l.Fatal().Msg("Engine: Provided Subsonic credentials are invalid")
} else {
l.Info().Msg("Engine: Subsonic credentials validated successfully")
}
}
}
l.Debug().Msg("Engine: Initializing image sources") l.Debug().Msg("Engine: Initializing image sources")
images.Initialize(images.ImageSourceOpts{ images.Initialize(images.ImageSourceOpts{
UserAgent: cfg.UserAgent(), UserAgent: cfg.UserAgent(),
EnableCAA: !cfg.CoverArtArchiveDisabled(), EnableCAA: !cfg.CoverArtArchiveDisabled(),
EnableDeezer: !cfg.DeezerDisabled(), EnableDeezer: !cfg.DeezerDisabled(),
EnableSubsonic: cfg.SubsonicEnabled(), EnableSubsonic: cfg.SubsonicEnabled(),
EnableLastFM: cfg.LastFMApiKey() != "",
}) })
l.Info().Msg("Engine: Image sources initialized") l.Info().Msg("Engine: Image sources initialized")
@ -184,6 +216,8 @@ func Run(
} }
}() }()
l.Info().Msg("Engine: Beginning startup tasks...")
l.Debug().Msg("Engine: Checking import configuration") l.Debug().Msg("Engine: Checking import configuration")
if !cfg.SkipImport() { if !cfg.SkipImport() {
go func() { go func() {
@ -191,16 +225,14 @@ func Run(
}() }()
} }
// l.Info().Msg("Creating test export file")
// go func() {
// err := export.ExportData(ctx, "koito", store)
// if err != nil {
// l.Err(err).Msg("Failed to generate export file")
// }
// }()
l.Info().Msg("Engine: Pruning orphaned images") l.Info().Msg("Engine: Pruning orphaned images")
go catalog.PruneOrphanedImages(logger.NewContext(l), store) go catalog.PruneOrphanedImages(logger.NewContext(l), store)
l.Info().Msg("Engine: Running duration backfill task")
go catalog.BackfillTrackDurationsFromMusicBrainz(ctx, store, mbzC)
l.Info().Msg("Engine: Attempting to fetch missing artist images")
go catalog.FetchMissingArtistImages(ctx, store)
l.Info().Msg("Engine: Attempting to fetch missing album images")
go catalog.FetchMissingAlbumImages(ctx, store)
l.Info().Msg("Engine: Initialization finished") l.Info().Msg("Engine: Initialization finished")
quit := make(chan os.Signal, 1) quit := make(chan os.Signal, 1)
@ -221,19 +253,19 @@ func Run(
} }
func RunImporter(l *zerolog.Logger, store db.DB, mbzc mbz.MusicBrainzCaller) { func RunImporter(l *zerolog.Logger, store db.DB, mbzc mbz.MusicBrainzCaller) {
l.Debug().Msg("Checking for import files...") l.Debug().Msg("Importer: Checking for import files...")
files, err := os.ReadDir(path.Join(cfg.ConfigDir(), "import")) files, err := os.ReadDir(path.Join(cfg.ConfigDir(), "import"))
if err != nil { if err != nil {
l.Err(err).Msg("Failed to read files from import dir") l.Err(err).Msg("Importer: Failed to read files from import dir")
} }
if len(files) > 0 { if len(files) > 0 {
l.Info().Msg("Files found in import directory. Attempting to import...") l.Info().Msg("Importer: Files found in import directory. Attempting to import...")
} else { } else {
return return
} }
defer func() { defer func() {
if r := recover(); r != nil { if r := recover(); r != nil {
l.Error().Interface("recover", r).Msg("Panic when importing files") l.Error().Interface("recover", r).Msg("Importer: Panic when importing files")
} }
}() }()
for _, file := range files { for _, file := range files {
@ -241,37 +273,37 @@ func RunImporter(l *zerolog.Logger, store db.DB, mbzc mbz.MusicBrainzCaller) {
continue continue
} }
if strings.Contains(file.Name(), "Streaming_History_Audio") { if strings.Contains(file.Name(), "Streaming_History_Audio") {
l.Info().Msgf("Import file %s detected as Spotify export", file.Name()) l.Info().Msgf("Importer: Import file %s detected as Spotify export", file.Name())
err := importer.ImportSpotifyFile(logger.NewContext(l), store, file.Name()) err := importer.ImportSpotifyFile(logger.NewContext(l), store, file.Name())
if err != nil { if err != nil {
l.Err(err).Msgf("Failed to import file: %s", file.Name()) l.Err(err).Msgf("Importer: Failed to import file: %s", file.Name())
} }
} else if strings.Contains(file.Name(), "maloja") { } else if strings.Contains(file.Name(), "maloja") {
l.Info().Msgf("Import file %s detected as Maloja export", file.Name()) l.Info().Msgf("Importer: Import file %s detected as Maloja export", file.Name())
err := importer.ImportMalojaFile(logger.NewContext(l), store, file.Name()) err := importer.ImportMalojaFile(logger.NewContext(l), store, file.Name())
if err != nil { if err != nil {
l.Err(err).Msgf("Failed to import file: %s", file.Name()) l.Err(err).Msgf("Importer: Failed to import file: %s", file.Name())
} }
} else if strings.Contains(file.Name(), "recenttracks") { } else if strings.Contains(file.Name(), "recenttracks") {
l.Info().Msgf("Import file %s detected as ghan.nl LastFM export", file.Name()) l.Info().Msgf("Importer: Import file %s detected as ghan.nl LastFM export", file.Name())
err := importer.ImportLastFMFile(logger.NewContext(l), store, mbzc, file.Name()) err := importer.ImportLastFMFile(logger.NewContext(l), store, mbzc, file.Name())
if err != nil { if err != nil {
l.Err(err).Msgf("Failed to import file: %s", file.Name()) l.Err(err).Msgf("Importer: Failed to import file: %s", file.Name())
} }
} else if strings.Contains(file.Name(), "listenbrainz") { } else if strings.Contains(file.Name(), "listenbrainz") {
l.Info().Msgf("Import file %s detected as ListenBrainz export", file.Name()) l.Info().Msgf("Importer: Import file %s detected as ListenBrainz export", file.Name())
err := importer.ImportListenBrainzExport(logger.NewContext(l), store, mbzc, file.Name()) err := importer.ImportListenBrainzExport(logger.NewContext(l), store, mbzc, file.Name())
if err != nil { if err != nil {
l.Err(err).Msgf("Failed to import file: %s", file.Name()) l.Err(err).Msgf("Importer: Failed to import file: %s", file.Name())
} }
} else if strings.Contains(file.Name(), "koito") { } else if strings.Contains(file.Name(), "koito") {
l.Info().Msgf("Import file %s detected as Koito export", file.Name()) l.Info().Msgf("Importer: Import file %s detected as Koito export", file.Name())
err := importer.ImportKoitoFile(logger.NewContext(l), store, file.Name()) err := importer.ImportKoitoFile(logger.NewContext(l), store, file.Name())
if err != nil { if err != nil {
l.Err(err).Msgf("Failed to import file: %s", file.Name()) l.Err(err).Msgf("Importer: Failed to import file: %s", file.Name())
} }
} else { } else {
l.Warn().Msgf("File %s not recognized as a valid import file; make sure it is valid and named correctly", file.Name()) l.Warn().Msgf("Importer: File %s not recognized as a valid import file; make sure it is valid and named correctly", file.Name())
} }
} }
} }
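The importer above dispatches on file-name substrings to choose a parser. A self-contained sketch of that detection order (the `detectImportKind` helper is illustrative; Koito's importer calls the real parsers instead of returning a label):

```go
package main

import (
	"fmt"
	"strings"
)

// detectImportKind mirrors the substring-based detection the importer uses:
// the file name alone decides which parser handles it, checked in order.
func detectImportKind(name string) string {
	switch {
	case strings.Contains(name, "Streaming_History_Audio"):
		return "spotify"
	case strings.Contains(name, "maloja"):
		return "maloja"
	case strings.Contains(name, "recenttracks"):
		return "lastfm"
	case strings.Contains(name, "listenbrainz"):
		return "listenbrainz"
	case strings.Contains(name, "koito"):
		return "koito"
	default:
		return "unknown"
	}
}

func main() {
	fmt.Println(detectImportKind("Streaming_History_Audio_2023_4.json"))
}
```

This is why misnamed export files land in the final warning branch: nothing in their contents is inspected.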

View file

@ -4,6 +4,7 @@ import (
"net/http" "net/http"
"strconv" "strconv"
"strings" "strings"
"time"
"github.com/gabehf/koito/internal/db" "github.com/gabehf/koito/internal/db"
"github.com/gabehf/koito/internal/logger" "github.com/gabehf/koito/internal/logger"
@ -19,7 +20,7 @@ func GetListenActivityHandler(store db.DB) func(w http.ResponseWriter, r *http.R
rangeStr := r.URL.Query().Get("range") rangeStr := r.URL.Query().Get("range")
_range, err := strconv.Atoi(rangeStr) _range, err := strconv.Atoi(rangeStr)
if err != nil { if err != nil && rangeStr != "" {
l.Debug().AnErr("error", err).Msg("GetListenActivityHandler: Invalid range parameter") l.Debug().AnErr("error", err).Msg("GetListenActivityHandler: Invalid range parameter")
utils.WriteError(w, "invalid range parameter", http.StatusBadRequest) utils.WriteError(w, "invalid range parameter", http.StatusBadRequest)
return return
@ -27,7 +28,7 @@ func GetListenActivityHandler(store db.DB) func(w http.ResponseWriter, r *http.R
monthStr := r.URL.Query().Get("month") monthStr := r.URL.Query().Get("month")
month, err := strconv.Atoi(monthStr) month, err := strconv.Atoi(monthStr)
if err != nil { if err != nil && monthStr != "" {
l.Debug().AnErr("error", err).Msg("GetListenActivityHandler: Invalid month parameter") l.Debug().AnErr("error", err).Msg("GetListenActivityHandler: Invalid month parameter")
utils.WriteError(w, "invalid month parameter", http.StatusBadRequest) utils.WriteError(w, "invalid month parameter", http.StatusBadRequest)
return return
@ -35,7 +36,7 @@ func GetListenActivityHandler(store db.DB) func(w http.ResponseWriter, r *http.R
yearStr := r.URL.Query().Get("year") yearStr := r.URL.Query().Get("year")
year, err := strconv.Atoi(yearStr) year, err := strconv.Atoi(yearStr)
if err != nil { if err != nil && yearStr != "" {
l.Debug().AnErr("error", err).Msg("GetListenActivityHandler: Invalid year parameter") l.Debug().AnErr("error", err).Msg("GetListenActivityHandler: Invalid year parameter")
utils.WriteError(w, "invalid year parameter", http.StatusBadRequest) utils.WriteError(w, "invalid year parameter", http.StatusBadRequest)
return return
@ -43,7 +44,7 @@ func GetListenActivityHandler(store db.DB) func(w http.ResponseWriter, r *http.R
artistIdStr := r.URL.Query().Get("artist_id") artistIdStr := r.URL.Query().Get("artist_id")
artistId, err := strconv.Atoi(artistIdStr) artistId, err := strconv.Atoi(artistIdStr)
if err != nil { if err != nil && artistIdStr != "" {
l.Debug().AnErr("error", err).Msg("GetListenActivityHandler: Invalid artist ID parameter") l.Debug().AnErr("error", err).Msg("GetListenActivityHandler: Invalid artist ID parameter")
utils.WriteError(w, "invalid artist ID parameter", http.StatusBadRequest) utils.WriteError(w, "invalid artist ID parameter", http.StatusBadRequest)
return return
@ -51,7 +52,7 @@ func GetListenActivityHandler(store db.DB) func(w http.ResponseWriter, r *http.R
albumIdStr := r.URL.Query().Get("album_id") albumIdStr := r.URL.Query().Get("album_id")
albumId, err := strconv.Atoi(albumIdStr) albumId, err := strconv.Atoi(albumIdStr)
if err != nil { if err != nil && albumIdStr != "" {
l.Debug().AnErr("error", err).Msg("GetListenActivityHandler: Invalid album ID parameter") l.Debug().AnErr("error", err).Msg("GetListenActivityHandler: Invalid album ID parameter")
utils.WriteError(w, "invalid album ID parameter", http.StatusBadRequest) utils.WriteError(w, "invalid album ID parameter", http.StatusBadRequest)
return return
@ -59,7 +60,7 @@ func GetListenActivityHandler(store db.DB) func(w http.ResponseWriter, r *http.R
trackIdStr := r.URL.Query().Get("track_id") trackIdStr := r.URL.Query().Get("track_id")
trackId, err := strconv.Atoi(trackIdStr) trackId, err := strconv.Atoi(trackIdStr)
if err != nil { if err != nil && trackIdStr != "" {
l.Debug().AnErr("error", err).Msg("GetListenActivityHandler: Invalid track ID parameter") l.Debug().AnErr("error", err).Msg("GetListenActivityHandler: Invalid track ID parameter")
utils.WriteError(w, "invalid track ID parameter", http.StatusBadRequest) utils.WriteError(w, "invalid track ID parameter", http.StatusBadRequest)
return return
@ -85,11 +86,17 @@ func GetListenActivityHandler(store db.DB) func(w http.ResponseWriter, r *http.R
Range: _range, Range: _range,
Month: month, Month: month,
Year: year, Year: year,
Timezone: parseTZ(r),
AlbumID: int32(albumId), AlbumID: int32(albumId),
ArtistID: int32(artistId), ArtistID: int32(artistId),
TrackID: int32(trackId), TrackID: int32(trackId),
} }
if strings.ToLower(opts.Timezone.String()) == "local" {
opts.Timezone, _ = time.LoadLocation("UTC")
l.Warn().Msg("GetListenActivityHandler: Timezone is unset, using UTC")
}
l.Debug().Msgf("GetListenActivityHandler: Retrieving listen activity with options: %+v", opts) l.Debug().Msgf("GetListenActivityHandler: Retrieving listen activity with options: %+v", opts)
activity, err := store.GetListenActivity(ctx, opts) activity, err := store.GetListenActivity(ctx, opts)
@ -99,7 +106,72 @@ func GetListenActivityHandler(store db.DB) func(w http.ResponseWriter, r *http.R
return return
} }
activity = processActivity(activity, opts)
l.Debug().Msg("GetListenActivityHandler: Successfully retrieved listen activity") l.Debug().Msg("GetListenActivityHandler: Successfully retrieved listen activity")
utils.WriteJSON(w, http.StatusOK, activity) utils.WriteJSON(w, http.StatusOK, activity)
} }
} }
// processActivity re-buckets the raw rows and fills the gaps so every step in
// the requested range gets an entry, even when it has zero listens.
func processActivity(
items []db.ListenActivityItem,
opts db.ListenActivityOpts,
) []db.ListenActivityItem {
from, to := db.ListenActivityOptsToTimes(opts)
buckets := make(map[string]int64)
for _, item := range items {
bucketStart := normalizeToStep(item.Start, opts.Step)
key := bucketStart.Format("2006-01-02")
buckets[key] += item.Listens
}
var result []db.ListenActivityItem
for t := normalizeToStep(from, opts.Step); t.Before(to); t = addStep(t, opts.Step) {
key := t.Format("2006-01-02")
result = append(result, db.ListenActivityItem{
Start: t,
Listens: buckets[key],
})
}
return result
}
func normalizeToStep(t time.Time, step db.StepInterval) time.Time {
switch step {
case db.StepDay:
return time.Date(t.Year(), t.Month(), t.Day(), 0, 0, 0, 0, t.Location())
case db.StepWeek:
weekday := int(t.Weekday())
if weekday == 0 {
weekday = 7
}
start := t.AddDate(0, 0, -(weekday - 1))
return time.Date(start.Year(), start.Month(), start.Day(), 0, 0, 0, 0, t.Location())
case db.StepMonth:
return time.Date(t.Year(), t.Month(), 1, 0, 0, 0, 0, t.Location())
default:
return t
}
}
func addStep(t time.Time, step db.StepInterval) time.Time {
switch step {
case db.StepDay:
return t.AddDate(0, 0, 1)
case db.StepWeek:
return t.AddDate(0, 0, 7)
case db.StepMonth:
return t.AddDate(0, 1, 0)
default:
return t.AddDate(0, 0, 1)
}
}

View file

@ -13,7 +13,7 @@ func SummaryHandler(store db.DB) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) { return func(w http.ResponseWriter, r *http.Request) {
ctx := r.Context() ctx := r.Context()
l := logger.FromContext(ctx) l := logger.FromContext(ctx)
l.Debug().Msg("GetTopAlbumsHandler: Received request to retrieve top albums") l.Debug().Msg("SummaryHandler: Received request to retrieve summary")
timeframe := TimeframeFromRequest(r) timeframe := TimeframeFromRequest(r)
summary, err := summary.GenerateSummary(ctx, store, 1, timeframe, "") summary, err := summary.GenerateSummary(ctx, store, 1, timeframe, "")

View file

@ -6,7 +6,9 @@ import (
"strconv" "strconv"
"strings" "strings"
"time" "time"
_ "time/tzdata"
"github.com/gabehf/koito/internal/cfg"
"github.com/gabehf/koito/internal/db" "github.com/gabehf/koito/internal/db"
"github.com/gabehf/koito/internal/logger" "github.com/gabehf/koito/internal/logger"
) )
@ -37,17 +39,6 @@ func OptsFromRequest(r *http.Request) db.GetItemsOpts {
page = 1 page = 1
} }
weekStr := r.URL.Query().Get("week")
week, _ := strconv.Atoi(weekStr)
monthStr := r.URL.Query().Get("month")
month, _ := strconv.Atoi(monthStr)
yearStr := r.URL.Query().Get("year")
year, _ := strconv.Atoi(yearStr)
fromStr := r.URL.Query().Get("from")
from, _ := strconv.Atoi(fromStr)
toStr := r.URL.Query().Get("to")
to, _ := strconv.Atoi(toStr)
artistIdStr := r.URL.Query().Get("artist_id") artistIdStr := r.URL.Query().Get("artist_id")
artistId, _ := strconv.Atoi(artistIdStr) artistId, _ := strconv.Atoi(artistIdStr)
albumIdStr := r.URL.Query().Get("album_id") albumIdStr := r.URL.Query().Get("album_id")
@ -55,6 +46,8 @@ func OptsFromRequest(r *http.Request) db.GetItemsOpts {
trackIdStr := r.URL.Query().Get("track_id") trackIdStr := r.URL.Query().Get("track_id")
trackId, _ := strconv.Atoi(trackIdStr) trackId, _ := strconv.Atoi(trackIdStr)
tf := TimeframeFromRequest(r)
var period db.Period var period db.Period
switch strings.ToLower(r.URL.Query().Get("period")) { switch strings.ToLower(r.URL.Query().Get("period")) {
case "day": case "day":
@ -67,108 +60,195 @@ func OptsFromRequest(r *http.Request) db.GetItemsOpts {
period = db.PeriodYear period = db.PeriodYear
case "all_time": case "all_time":
period = db.PeriodAllTime period = db.PeriodAllTime
default:
l.Debug().Msgf("OptsFromRequest: Using default value '%s' for period", db.PeriodDay)
period = db.PeriodDay
} }
l.Debug().Msgf("OptsFromRequest: Parsed options: limit=%d, page=%d, week=%d, month=%d, year=%d, from=%d, to=%d, artist_id=%d, album_id=%d, track_id=%d, period=%s", l.Debug().Msgf("OptsFromRequest: Parsed options: limit=%d, page=%d, week=%d, month=%d, year=%d, from=%d, to=%d, artist_id=%d, album_id=%d, track_id=%d, period=%s",
limit, page, week, month, year, from, to, artistId, albumId, trackId, period) limit, page, tf.Week, tf.Month, tf.Year, tf.FromUnix, tf.ToUnix, artistId, albumId, trackId, period)
return db.GetItemsOpts{ return db.GetItemsOpts{
Limit: limit, Limit: limit,
Period: period, Page: page,
Page: page, Timeframe: tf,
Week: week, ArtistID: artistId,
Month: month, AlbumID: albumId,
Year: year, TrackID: trackId,
From: int64(from),
To: int64(to),
ArtistID: artistId,
AlbumID: albumId,
TrackID: trackId,
} }
} }
// Takes a request and returns a db.Timeframe representing the week, month, year, period, or unix
// time range specified by the request parameters
func TimeframeFromRequest(r *http.Request) db.Timeframe { func TimeframeFromRequest(r *http.Request) db.Timeframe {
opts := OptsFromRequest(r) q := r.URL.Query()
now := time.Now()
loc := now.Location()
// if 'from' is set, but 'to' is not set, assume 'to' should be now parseInt := func(key string) int {
if opts.From != 0 && opts.To == 0 { v := q.Get(key)
opts.To = now.Unix() if v == "" {
} return 0
// YEAR
if opts.Year != 0 && opts.Month == 0 && opts.Week == 0 {
start := time.Date(opts.Year, 1, 1, 0, 0, 0, 0, loc)
end := time.Date(opts.Year+1, 1, 1, 0, 0, 0, 0, loc).Add(-time.Second)
opts.From = start.Unix()
opts.To = end.Unix()
}
// MONTH (+ optional year)
if opts.Month != 0 {
year := opts.Year
if year == 0 {
year = now.Year()
if int(now.Month()) < opts.Month {
year--
}
} }
i, _ := strconv.Atoi(v)
start := time.Date(year, time.Month(opts.Month), 1, 0, 0, 0, 0, loc) return i
end := endOfMonth(year, time.Month(opts.Month), loc)
opts.From = start.Unix()
opts.To = end.Unix()
} }
// WEEK (+ optional year) parseInt64 := func(key string) int64 {
if opts.Week != 0 { v := q.Get(key)
year := opts.Year if v == "" {
if year == 0 { return 0
year = now.Year()
_, currentWeek := now.ISOWeek()
if currentWeek < opts.Week {
year--
}
} }
i, _ := strconv.ParseInt(v, 10, 64)
// ISO week 1 is defined as the week with Jan 4 in it return i
jan4 := time.Date(year, 1, 4, 0, 0, 0, 0, loc)
week1Start := startOfWeek(jan4)
start := week1Start.AddDate(0, 0, (opts.Week-1)*7)
end := endOfWeek(start)
opts.From = start.Unix()
opts.To = end.Unix()
} }
return db.Timeframe{ return db.Timeframe{
Period: opts.Period, Period: db.Period(q.Get("period")),
T1u: opts.From, Year: parseInt("year"),
T2u: opts.To, Month: parseInt("month"),
Week: parseInt("week"),
FromUnix: parseInt64("from"),
ToUnix: parseInt64("to"),
Timezone: parseTZ(r),
} }
} }
func startOfWeek(t time.Time) time.Time {
// ISO week: Monday = 1 func parseTZ(r *http.Request) *time.Location {
weekday := int(t.Weekday())
if weekday == 0 { // Sunday // this map is obviously AI.
weekday = 7 // i manually referenced as many links as I could and couldn't find any
// incorrect entries here so hopefully it is all correct.
overrides := map[string]string{
// --- North America ---
"America/Indianapolis": "America/Indiana/Indianapolis",
"America/Knoxville": "America/Indiana/Knox",
"America/Louisville": "America/Kentucky/Louisville",
"America/Montreal": "America/Toronto",
"America/Shiprock": "America/Denver",
"America/Fort_Wayne": "America/Indiana/Indianapolis",
"America/Virgin": "America/Port_of_Spain",
"America/Santa_Isabel": "America/Tijuana",
"America/Ensenada": "America/Tijuana",
"America/Rosario": "America/Argentina/Cordoba",
"America/Jujuy": "America/Argentina/Jujuy",
"America/Mendoza": "America/Argentina/Mendoza",
"America/Catamarca": "America/Argentina/Catamarca",
"America/Cordoba": "America/Argentina/Cordoba",
"America/Buenos_Aires": "America/Argentina/Buenos_Aires",
"America/Coral_Harbour": "America/Atikokan",
"America/Atka": "America/Adak",
"US/Alaska": "America/Anchorage",
"US/Aleutian": "America/Adak",
"US/Arizona": "America/Phoenix",
"US/Central": "America/Chicago",
"US/Eastern": "America/New_York",
"US/East-Indiana": "America/Indiana/Indianapolis",
"US/Hawaii": "Pacific/Honolulu",
"US/Indiana-Starke": "America/Indiana/Knox",
"US/Michigan": "America/Detroit",
"US/Mountain": "America/Denver",
"US/Pacific": "America/Los_Angeles",
"US/Samoa": "Pacific/Pago_Pago",
"Canada/Atlantic": "America/Halifax",
"Canada/Central": "America/Winnipeg",
"Canada/Eastern": "America/Toronto",
"Canada/Mountain": "America/Edmonton",
"Canada/Newfoundland": "America/St_Johns",
"Canada/Pacific": "America/Vancouver",
// --- Asia ---
"Asia/Calcutta": "Asia/Kolkata",
"Asia/Saigon": "Asia/Ho_Chi_Minh",
"Asia/Katmandu": "Asia/Kathmandu",
"Asia/Rangoon": "Asia/Yangon",
"Asia/Ulan_Bator": "Asia/Ulaanbaatar",
"Asia/Macao": "Asia/Macau",
"Asia/Tel_Aviv": "Asia/Jerusalem",
"Asia/Ashkhabad": "Asia/Ashgabat",
"Asia/Chungking": "Asia/Chongqing",
"Asia/Dacca": "Asia/Dhaka",
"Asia/Istanbul": "Europe/Istanbul",
"Asia/Kashgar": "Asia/Urumqi",
"Asia/Thimbu": "Asia/Thimphu",
"Asia/Ujung_Pandang": "Asia/Makassar",
"ROC": "Asia/Taipei",
"Iran": "Asia/Tehran",
"Israel": "Asia/Jerusalem",
"Japan": "Asia/Tokyo",
"Singapore": "Asia/Singapore",
"Hongkong": "Asia/Hong_Kong",
// --- Europe ---
"Europe/Kiev": "Europe/Kyiv",
"Europe/Belfast": "Europe/London",
"Europe/Tiraspol": "Europe/Chisinau",
"Europe/Nicosia": "Asia/Nicosia",
"Europe/Moscow": "Europe/Moscow",
"W-SU": "Europe/Moscow",
"GB": "Europe/London",
"GB-Eire": "Europe/London",
"Eire": "Europe/Dublin",
"Poland": "Europe/Warsaw",
"Portugal": "Europe/Lisbon",
"Turkey": "Europe/Istanbul",
// --- Australia / Pacific ---
"Australia/ACT": "Australia/Sydney",
"Australia/Canberra": "Australia/Sydney",
"Australia/LHI": "Australia/Lord_Howe",
"Australia/North": "Australia/Darwin",
"Australia/NSW": "Australia/Sydney",
"Australia/Queensland": "Australia/Brisbane",
"Australia/South": "Australia/Adelaide",
"Australia/Tasmania": "Australia/Hobart",
"Australia/Victoria": "Australia/Melbourne",
"Australia/West": "Australia/Perth",
"Australia/Yancowinna": "Australia/Broken_Hill",
"Pacific/Samoa": "Pacific/Pago_Pago",
"Pacific/Yap": "Pacific/Chuuk",
"Pacific/Truk": "Pacific/Chuuk",
"Pacific/Ponape": "Pacific/Pohnpei",
"NZ": "Pacific/Auckland",
"NZ-CHAT": "Pacific/Chatham",
// --- Africa ---
"Africa/Asmera": "Africa/Asmara",
"Africa/Timbuktu": "Africa/Bamako",
"Egypt": "Africa/Cairo",
"Libya": "Africa/Tripoli",
// --- Atlantic ---
"Atlantic/Faeroe": "Atlantic/Faroe",
"Atlantic/Jan_Mayen": "Europe/Oslo",
"Iceland": "Atlantic/Reykjavik",
// --- Etc / Misc ---
"UTC": "UTC",
"Etc/UTC": "UTC",
"Etc/GMT": "UTC",
"GMT": "UTC",
"Zulu": "UTC",
"Universal": "UTC",
} }
return time.Date(t.Year(), t.Month(), t.Day()-weekday+1, 0, 0, 0, 0, t.Location())
} if cfg.ForceTZ() != nil {
func endOfWeek(t time.Time) time.Time { return cfg.ForceTZ()
return startOfWeek(t).AddDate(0, 0, 7).Add(-time.Second) }
}
func endOfMonth(year int, month time.Month, loc *time.Location) time.Time { if tz := r.URL.Query().Get("tz"); tz != "" {
startNextMonth := time.Date(year, month+1, 1, 0, 0, 0, 0, loc) if fixedTz, exists := overrides[tz]; exists {
return startNextMonth.Add(-time.Second) tz = fixedTz
}
if loc, err := time.LoadLocation(tz); err == nil {
return loc
}
}
if c, err := r.Cookie("tz"); err == nil {
var tz string
if fixedTz, exists := overrides[c.Value]; exists {
tz = fixedTz
} else {
tz = c.Value
}
if loc, err := time.LoadLocation(tz); err == nil {
return loc
}
}
return time.Now().Location()
} }

View file

@ -0,0 +1,47 @@
package handlers
import (
"net/http"
"strconv"
"github.com/gabehf/koito/internal/db"
"github.com/gabehf/koito/internal/logger"
"github.com/gabehf/koito/internal/utils"
)
func GetInterestHandler(store db.DB) func(w http.ResponseWriter, r *http.Request) {
return func(w http.ResponseWriter, r *http.Request) {
ctx := r.Context()
l := logger.FromContext(ctx)
l.Debug().Msg("GetInterestHandler: Received request to retrieve interest")
// Reuse OptsFromRequest here to parse the artist/album/track id parameters.
parsed := OptsFromRequest(r)
bucketCountStr := r.URL.Query().Get("buckets")
var buckets = 0
var err error
if buckets, err = strconv.Atoi(bucketCountStr); err != nil {
l.Debug().Msg("GetInterestHandler: Buckets is not an integer")
utils.WriteError(w, "parameter 'buckets' must be an integer", http.StatusBadRequest)
return
}
opts := db.GetInterestOpts{
Buckets: buckets,
AlbumID: int32(parsed.AlbumID),
ArtistID: int32(parsed.ArtistID),
TrackID: int32(parsed.TrackID),
}
interest, err := store.GetInterest(ctx, opts)
if err != nil {
l.Err(err).Msg("GetInterestHandler: Failed to query interest")
utils.WriteError(w, "Failed to retrieve interest: "+err.Error(), http.StatusInternalServerError)
return
}
utils.WriteJSON(w, http.StatusOK, interest)
}
}

View file

@ -90,6 +90,11 @@ func LbzSubmitListenHandler(store db.DB, mbzc mbz.MusicBrainzCaller) func(w http
utils.WriteError(w, "failed to read request body", http.StatusBadRequest) utils.WriteError(w, "failed to read request body", http.StatusBadRequest)
return return
} }
if cfg.LbzRelayEnabled() {
go doLbzRelay(requestBytes, l)
}
if err := json.NewDecoder(bytes.NewBuffer(requestBytes)).Decode(&req); err != nil { if err := json.NewDecoder(bytes.NewBuffer(requestBytes)).Decode(&req); err != nil {
l.Err(err).Msg("LbzSubmitListenHandler: Failed to decode request") l.Err(err).Msg("LbzSubmitListenHandler: Failed to decode request")
utils.WriteError(w, "failed to decode request", http.StatusBadRequest) utils.WriteError(w, "failed to decode request", http.StatusBadRequest)
@ -103,7 +108,7 @@ func LbzSubmitListenHandler(store db.DB, mbzc mbz.MusicBrainzCaller) func(w http
return return
} }
l.Debug().Any("request_body", req).Msg("LbzSubmitListenHandler: Parsed request body") l.Info().Any("request_body", req).Msg("LbzSubmitListenHandler: Parsed request body")
if len(req.Payload) < 1 { if len(req.Payload) < 1 {
l.Debug().Msg("LbzSubmitListenHandler: Payload is empty") l.Debug().Msg("LbzSubmitListenHandler: Payload is empty")
@ -234,10 +239,6 @@ func LbzSubmitListenHandler(store db.DB, mbzc mbz.MusicBrainzCaller) func(w http
w.WriteHeader(http.StatusOK) w.WriteHeader(http.StatusOK)
w.Header().Set("Content-Type", "application/json") w.Header().Set("Content-Type", "application/json")
w.Write([]byte("{\"status\": \"ok\"}")) w.Write([]byte("{\"status\": \"ok\"}"))
if cfg.LbzRelayEnabled() {
go doLbzRelay(requestBytes, l)
}
} }
} }

105
engine/handlers/mbzid.go Normal file
View file

@ -0,0 +1,105 @@
package handlers
import (
"net/http"
"strconv"
"github.com/gabehf/koito/internal/db"
"github.com/gabehf/koito/internal/logger"
"github.com/gabehf/koito/internal/utils"
"github.com/google/uuid"
)
func UpdateMbzIdHandler(store db.DB) func(w http.ResponseWriter, r *http.Request) {
return func(w http.ResponseWriter, r *http.Request) {
ctx := r.Context()
l := logger.FromContext(ctx)
l.Debug().Msg("UpdateMbzIdHandler: Received request to update MusicBrainz ID")
err := r.ParseForm()
if err != nil {
l.Debug().Msg("UpdateMbzIdHandler: Failed to parse form")
utils.WriteError(w, "form is invalid", http.StatusBadRequest)
return
}
// Parse query parameters
artistIDStr := r.FormValue("artist_id")
albumIDStr := r.FormValue("album_id")
trackIDStr := r.FormValue("track_id")
mbzidStr := r.FormValue("mbz_id")
if mbzidStr == "" || (artistIDStr == "" && albumIDStr == "" && trackIDStr == "") {
l.Debug().Msg("UpdateMbzIdHandler: Request is missing required parameters")
utils.WriteError(w, "mbz_id and one of artist_id, album_id, or track_id must be provided", http.StatusBadRequest)
return
}
if utils.MoreThanOneString(artistIDStr, albumIDStr, trackIDStr) {
l.Debug().Msg("UpdateMbzIdHandler: Request has more than one of artist_id, album_id, and track_id")
utils.WriteError(w, "only one of artist_id, album_id, or track_id can be provided at a time", http.StatusBadRequest)
return
}
var mbzid uuid.UUID
if mbzid, err = uuid.Parse(mbzidStr); err != nil {
l.Debug().Msg("UpdateMbzIdHandler: Provided MusicBrainz ID is invalid")
utils.WriteError(w, "provided musicbrainz id is invalid", http.StatusBadRequest)
return
}
if artistIDStr != "" {
var artistID int
artistID, err = strconv.Atoi(artistIDStr)
if err != nil {
l.Debug().AnErr("error", err).Msg("UpdateMbzIdHandler: Invalid artist id")
utils.WriteError(w, "invalid artist_id", http.StatusBadRequest)
return
}
err = store.UpdateArtist(ctx, db.UpdateArtistOpts{
ID: int32(artistID),
MusicBrainzID: mbzid,
})
if err != nil {
l.Error().Err(err).Msg("UpdateMbzIdHandler: Failed to update musicbrainz id")
utils.WriteError(w, "failed to update musicbrainz id", http.StatusInternalServerError)
return
}
} else if albumIDStr != "" {
var albumID int
albumID, err = strconv.Atoi(albumIDStr)
if err != nil {
l.Debug().AnErr("error", err).Msg("UpdateMbzIdHandler: Invalid album id")
utils.WriteError(w, "invalid album_id", http.StatusBadRequest)
return
}
err = store.UpdateAlbum(ctx, db.UpdateAlbumOpts{
ID: int32(albumID),
MusicBrainzID: mbzid,
})
if err != nil {
l.Error().Err(err).Msg("UpdateMbzIdHandler: Failed to update musicbrainz id")
utils.WriteError(w, "failed to update musicbrainz id", http.StatusInternalServerError)
return
}
} else if trackIDStr != "" {
var trackID int
trackID, err = strconv.Atoi(trackIDStr)
if err != nil {
l.Debug().AnErr("error", err).Msg("UpdateMbzIdHandler: Invalid track id")
utils.WriteError(w, "invalid track_id", http.StatusBadRequest)
return
}
err = store.UpdateTrack(ctx, db.UpdateTrackOpts{
ID: int32(trackID),
MusicBrainzID: mbzid,
})
if err != nil {
l.Error().Err(err).Msg("UpdateMbzIdHandler: Failed to update musicbrainz id")
utils.WriteError(w, "failed to update musicbrainz id", http.StatusInternalServerError)
return
}
}
w.WriteHeader(http.StatusNoContent)
}
}
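The mutual-exclusion check above relies on `utils.MoreThanOneString`, whose implementation is not part of this diff. A minimal sketch of what such a helper might look like, assuming it reports whether more than one of its arguments is non-empty (the lowercase name here is hypothetical):

```go
package main

import "fmt"

// moreThanOneString reports whether more than one of the given strings
// is non-empty. The handler uses a helper like this to enforce that at
// most one of artist_id, album_id, or track_id is set per request.
func moreThanOneString(vals ...string) bool {
	count := 0
	for _, v := range vals {
		if v != "" {
			count++
		}
	}
	return count > 1
}

func main() {
	fmt.Println(moreThanOneString("1", "", ""))  // false
	fmt.Println(moreThanOneString("1", "2", "")) // true
}
```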


@ -9,6 +9,7 @@ import (
     "github.com/gabehf/koito/internal/catalog"
     "github.com/gabehf/koito/internal/cfg"
     "github.com/gabehf/koito/internal/db"
+    "github.com/gabehf/koito/internal/images"
     "github.com/gabehf/koito/internal/logger"
     "github.com/gabehf/koito/internal/utils"
     "github.com/google/uuid"
@ -75,7 +76,7 @@ func ReplaceImageHandler(store db.DB) http.HandlerFunc {
     fileUrl := r.FormValue("image_url")
     if fileUrl != "" {
         l.Debug().Msg("ReplaceImageHandler: Image identified as remote file")
-        err = catalog.ValidateImageURL(fileUrl)
+        err = images.ValidateImageURL(fileUrl)
         if err != nil {
             l.Debug().AnErr("error", err).Msg("ReplaceImageHandler: Invalid image URL")
             utils.WriteError(w, "url is invalid or not an image file", http.StatusBadRequest)


@ -2,7 +2,6 @@ package handlers
 import (
     "net/http"
-    "strings"
     "github.com/gabehf/koito/internal/db"
     "github.com/gabehf/koito/internal/logger"
@ -23,54 +22,39 @@ func StatsHandler(store db.DB) http.HandlerFunc {
     l.Debug().Msg("StatsHandler: Received request to retrieve statistics")
-    var period db.Period
-    switch strings.ToLower(r.URL.Query().Get("period")) {
-    case "day":
-        period = db.PeriodDay
-    case "week":
-        period = db.PeriodWeek
-    case "month":
-        period = db.PeriodMonth
-    case "year":
-        period = db.PeriodYear
-    case "all_time":
-        period = db.PeriodAllTime
-    default:
-        l.Debug().Msgf("StatsHandler: Using default value '%s' for period", db.PeriodDay)
-        period = db.PeriodDay
-    }
+    tf := TimeframeFromRequest(r)
-    l.Debug().Msgf("StatsHandler: Fetching statistics for period '%s'", period)
+    l.Debug().Msg("StatsHandler: Fetching statistics")
-    listens, err := store.CountListens(r.Context(), db.Timeframe{Period: period})
+    listens, err := store.CountListens(r.Context(), tf)
     if err != nil {
         l.Err(err).Msg("StatsHandler: Failed to fetch listen count")
         utils.WriteError(w, "failed to get listens: "+err.Error(), http.StatusInternalServerError)
         return
     }
-    tracks, err := store.CountTracks(r.Context(), db.Timeframe{Period: period})
+    tracks, err := store.CountTracks(r.Context(), tf)
     if err != nil {
         l.Err(err).Msg("StatsHandler: Failed to fetch track count")
         utils.WriteError(w, "failed to get tracks: "+err.Error(), http.StatusInternalServerError)
         return
     }
-    albums, err := store.CountAlbums(r.Context(), db.Timeframe{Period: period})
+    albums, err := store.CountAlbums(r.Context(), tf)
     if err != nil {
         l.Err(err).Msg("StatsHandler: Failed to fetch album count")
         utils.WriteError(w, "failed to get albums: "+err.Error(), http.StatusInternalServerError)
         return
     }
-    artists, err := store.CountArtists(r.Context(), db.Timeframe{Period: period})
+    artists, err := store.CountArtists(r.Context(), tf)
     if err != nil {
         l.Err(err).Msg("StatsHandler: Failed to fetch artist count")
         utils.WriteError(w, "failed to get artists: "+err.Error(), http.StatusInternalServerError)
         return
     }
-    timeListenedS, err := store.CountTimeListened(r.Context(), db.Timeframe{Period: period})
+    timeListenedS, err := store.CountTimeListened(r.Context(), tf)
     if err != nil {
         l.Err(err).Msg("StatsHandler: Failed to fetch time listened")
         utils.WriteError(w, "failed to get time listened: "+err.Error(), http.StatusInternalServerError)


@ -61,7 +61,9 @@ func TestImportSpotify(t *testing.T) {
     a, err := store.GetArtist(context.Background(), db.GetArtistOpts{Name: "The Story So Far"})
     require.NoError(t, err)
-    track, err := store.GetTrack(context.Background(), db.GetTrackOpts{Title: "Clairvoyant", ArtistIDs: []int32{a.ID}})
+    r, err := store.GetAlbum(context.Background(), db.GetAlbumOpts{ArtistID: a.ID, Title: "The Story So Far / Stick To Your Guns Split"})
+    require.NoError(t, err)
+    track, err := store.GetTrack(context.Background(), db.GetTrackOpts{Title: "Clairvoyant", ReleaseID: r.ID, ArtistIDs: []int32{a.ID}})
     require.NoError(t, err)
     t.Log(track)
     assert.Equal(t, "Clairvoyant", track.Title)
@ -107,15 +109,15 @@ func TestImportLastFM(t *testing.T) {
     artist, err := store.GetArtist(context.Background(), db.GetArtistOpts{MusicBrainzID: uuid.MustParse("4b00640f-3be6-43f8-9b34-ff81bd89320a")})
     require.NoError(t, err)
     assert.Equal(t, "OurR", artist.Name)
-    artist, err = store.GetArtist(context.Background(), db.GetArtistOpts{Name: "CHUU"})
+    artist, err = store.GetArtist(context.Background(), db.GetArtistOpts{Name: "Necry Talkie"})
     require.NoError(t, err)
-    track, err := store.GetTrack(context.Background(), db.GetTrackOpts{Title: "because I'm stupid?", ArtistIDs: []int32{artist.ID}})
+    track, err := store.GetTrack(context.Background(), db.GetTrackOpts{Title: "放課後の記憶", ReleaseID: album.ID, ArtistIDs: []int32{artist.ID}})
     require.NoError(t, err)
     t.Log(track)
-    listens, err := store.GetListensPaginated(context.Background(), db.GetItemsOpts{TrackID: int(track.ID), Period: db.PeriodAllTime})
+    listens, err := store.GetListensPaginated(context.Background(), db.GetItemsOpts{TrackID: int(track.ID), Timeframe: db.Timeframe{Period: db.PeriodAllTime}})
     require.NoError(t, err)
     require.Len(t, listens.Items, 1)
-    assert.WithinDuration(t, time.Unix(1749776100, 0), listens.Items[0].Time, 1*time.Second)
+    assert.WithinDuration(t, time.Unix(1749774900, 0), listens.Items[0].Time, 1*time.Second)
     truncateTestData(t)
 }
@ -141,15 +143,15 @@ func TestImportLastFM_MbzDisabled(t *testing.T) {
     artist, err := store.GetArtist(context.Background(), db.GetArtistOpts{MusicBrainzID: uuid.MustParse("4b00640f-3be6-43f8-9b34-ff81bd89320a")})
     require.NoError(t, err)
     assert.Equal(t, "OurR", artist.Name)
-    artist, err = store.GetArtist(context.Background(), db.GetArtistOpts{Name: "CHUU"})
+    artist, err = store.GetArtist(context.Background(), db.GetArtistOpts{Name: "Necry Talkie"})
     require.NoError(t, err)
-    track, err := store.GetTrack(context.Background(), db.GetTrackOpts{Title: "because I'm stupid?", ArtistIDs: []int32{artist.ID}})
+    track, err := store.GetTrack(context.Background(), db.GetTrackOpts{Title: "放課後の記憶", ReleaseID: album.ID, ArtistIDs: []int32{artist.ID}})
     require.NoError(t, err)
     t.Log(track)
-    listens, err := store.GetListensPaginated(context.Background(), db.GetItemsOpts{TrackID: int(track.ID), Period: db.PeriodAllTime})
+    listens, err := store.GetListensPaginated(context.Background(), db.GetItemsOpts{TrackID: int(track.ID), Timeframe: db.Timeframe{Period: db.PeriodAllTime}})
     require.NoError(t, err)
     require.Len(t, listens.Items, 1)
-    assert.WithinDuration(t, time.Unix(1749776100, 0), listens.Items[0].Time, 1*time.Second)
+    assert.WithinDuration(t, time.Unix(1749774900, 0), listens.Items[0].Time, 1*time.Second)
     truncateTestData(t)
 }
@ -216,7 +218,7 @@ func TestImportListenBrainz(t *testing.T) {
     track, err := store.GetTrack(context.Background(), db.GetTrackOpts{MusicBrainzID: uuid.MustParse("08e8f55b-f1a4-46b8-b2d1-fab4c592165c")})
     require.NoError(t, err)
     assert.Equal(t, "Desert", track.Title)
-    listens, err := store.GetListensPaginated(context.Background(), db.GetItemsOpts{TrackID: int(track.ID), Period: db.PeriodAllTime})
+    listens, err := store.GetListensPaginated(context.Background(), db.GetItemsOpts{TrackID: int(track.ID), Timeframe: db.Timeframe{Period: db.PeriodAllTime}})
     require.NoError(t, err)
     assert.Len(t, listens.Items, 1)
     assert.WithinDuration(t, time.Unix(1749780612, 0), listens.Items[0].Time, 1*time.Second)
@ -254,7 +256,7 @@ func TestImportListenBrainz_MbzDisabled(t *testing.T) {
     track, err := store.GetTrack(context.Background(), db.GetTrackOpts{MusicBrainzID: uuid.MustParse("08e8f55b-f1a4-46b8-b2d1-fab4c592165c")})
     require.NoError(t, err)
     assert.Equal(t, "Desert", track.Title)
-    listens, err := store.GetListensPaginated(context.Background(), db.GetItemsOpts{TrackID: int(track.ID), Period: db.PeriodAllTime})
+    listens, err := store.GetListensPaginated(context.Background(), db.GetItemsOpts{TrackID: int(track.ID), Timeframe: db.Timeframe{Period: db.PeriodAllTime}})
     require.NoError(t, err)
     assert.Len(t, listens.Items, 1)
     assert.WithinDuration(t, time.Unix(1749780612, 0), listens.Items[0].Time, 1*time.Second)
@ -262,6 +264,34 @@ func TestImportListenBrainz_MbzDisabled(t *testing.T) {
     truncateTestData(t)
 }
+
+func TestImportListenBrainz_MBIDMapping(t *testing.T) {
+    src := path.Join("..", "test_assets", "listenbrainz_shoko1_123456789.zip")
+    destDir := filepath.Join(cfg.ConfigDir(), "import")
+    dest := filepath.Join(destDir, "listenbrainz_shoko1_123456789.zip")
+    // not going to make the dest dir because engine should make it already
+    input, err := os.ReadFile(src)
+    require.NoError(t, err)
+    require.NoError(t, os.WriteFile(dest, input, os.ModePerm))
+    engine.RunImporter(logger.Get(), store, &mbz.MbzErrorCaller{})
+    album, err := store.GetAlbum(context.Background(), db.GetAlbumOpts{MusicBrainzID: uuid.MustParse("177ebc28-0115-3897-8eb3-ebf74ce23790")})
+    require.NoError(t, err)
+    assert.Equal(t, "Zombie", album.Title)
+    artist, err := store.GetArtist(context.Background(), db.GetArtistOpts{MusicBrainzID: uuid.MustParse("c98d40fd-f6cf-4b26-883e-eaa515ee2851")})
+    require.NoError(t, err)
+    assert.Equal(t, "The Cranberries", artist.Name)
+    track, err := store.GetTrack(context.Background(), db.GetTrackOpts{MusicBrainzID: uuid.MustParse("3bbeb4e3-ab6d-460d-bfc5-de49e4251061")})
+    require.NoError(t, err)
+    assert.Equal(t, "Zombie", track.Title)
+    truncateTestData(t)
+}
 func TestImportKoito(t *testing.T) {
     src := path.Join("..", "test_assets", "koito_export_test.json")
@ -274,6 +304,7 @@ func TestImportKoito(t *testing.T) {
     giriReleaseMBID := uuid.MustParse("ac1f8da0-21d7-426e-83b0-befff06f0871")
     suzukiMBID := uuid.MustParse("30f851bb-dba3-4e9b-811c-5f27f595c86a")
     nijinoTrackMBID := uuid.MustParse("a4f26836-3894-46c1-acac-227808308687")
+    lp3MBID := uuid.MustParse("d0ec30bd-7cdc-417c-979d-5a0631b8a161")
     input, err := os.ReadFile(src)
     require.NoError(t, err)
@ -284,11 +315,11 @@ func TestImportKoito(t *testing.T) {
     // ensure all artists are saved
     _, err = store.GetArtist(ctx, db.GetArtistOpts{Name: "American Football"})
-    require.NoError(t, err)
+    assert.NoError(t, err)
     _, err = store.GetArtist(ctx, db.GetArtistOpts{Name: "Rachel Goswell"})
-    require.NoError(t, err)
+    assert.NoError(t, err)
     _, err = store.GetArtist(ctx, db.GetArtistOpts{Name: "Elizabeth Powell"})
-    require.NoError(t, err)
+    assert.NoError(t, err)
     // ensure artist aliases are saved
     artist, err := store.GetArtist(ctx, db.GetArtistOpts{MusicBrainzID: suzukiMBID})
@ -310,6 +341,12 @@ func TestImportKoito(t *testing.T) {
     aliases, err := store.GetAllAlbumAliases(ctx, album.ID)
     require.NoError(t, err)
     assert.Contains(t, utils.FlattenAliases(aliases), "Nijinoiroyo Azayakadeare (NELKE ver.)")
+    // ensure album associations are saved
+    album, err = store.GetAlbum(ctx, db.GetAlbumOpts{MusicBrainzID: lp3MBID})
+    require.NoError(t, err)
+    assert.Contains(t, utils.FlattenSimpleArtistNames(album.Artists), "Elizabeth Powell")
+    assert.Contains(t, utils.FlattenSimpleArtistNames(album.Artists), "Rachel Goswell")
+    assert.Contains(t, utils.FlattenSimpleArtistNames(album.Artists), "American Football")
     // ensure all tracks are saved
     track, err := store.GetTrack(ctx, db.GetTrackOpts{MusicBrainzID: nijinoTrackMBID})
@ -323,7 +360,9 @@ func TestImportKoito(t *testing.T) {
     artist, err = store.GetArtist(ctx, db.GetArtistOpts{MusicBrainzID: suzukiMBID})
     require.NoError(t, err)
-    _, err = store.GetTrack(ctx, db.GetTrackOpts{Title: "GIRI GIRI", ArtistIDs: []int32{artist.ID}})
+    album, err = store.GetAlbum(ctx, db.GetAlbumOpts{ArtistID: artist.ID, Title: "GIRI GIRI"})
+    require.NoError(t, err)
+    _, err = store.GetTrack(ctx, db.GetTrackOpts{Title: "GIRI GIRI", ReleaseID: album.ID, ArtistIDs: []int32{artist.ID}})
     require.NoError(t, err)
     count, err := store.CountTracks(ctx, db.Timeframe{Period: db.PeriodAllTime})


@ -74,15 +74,15 @@ func getApiKey(t *testing.T, session string) {
 func truncateTestData(t *testing.T) {
     err := store.Exec(context.Background(),
         `TRUNCATE
        artists,
        artist_aliases,
        tracks,
        artist_tracks,
        releases,
        artist_releases,
        release_aliases,
        listens
        RESTART IDENTITY CASCADE`)
     require.NoError(t, err)
 }
@ -211,7 +211,7 @@ func TestGetters(t *testing.T) {
     assert.Equal(t, "花の塔", track.Title)
     // Listen was saved
-    resp, err = http.DefaultClient.Get(host() + "/apis/web/v1/listens")
+    resp, err = http.DefaultClient.Get(host() + "/apis/web/v1/listens?period=all_time")
     assert.NoError(t, err)
     var listens db.PaginatedResponse[models.Listen]
     err = json.NewDecoder(resp.Body).Decode(&listens)
@ -220,21 +220,21 @@ func TestGetters(t *testing.T) {
     assert.EqualValues(t, 2, listens.Items[0].Track.ID)
     assert.Equal(t, "Where Our Blue Is", listens.Items[0].Track.Title)
-    resp, err = http.DefaultClient.Get(host() + "/apis/web/v1/top-artists")
+    resp, err = http.DefaultClient.Get(host() + "/apis/web/v1/top-artists?period=all_time")
     assert.NoError(t, err)
     var artists db.PaginatedResponse[models.Artist]
     err = json.NewDecoder(resp.Body).Decode(&artists)
     require.NoError(t, err)
     require.Len(t, artists.Items, 3)
-    resp, err = http.DefaultClient.Get(host() + "/apis/web/v1/top-albums")
+    resp, err = http.DefaultClient.Get(host() + "/apis/web/v1/top-albums?period=all_time")
     assert.NoError(t, err)
     var albums db.PaginatedResponse[models.Album]
     err = json.NewDecoder(resp.Body).Decode(&albums)
     require.NoError(t, err)
     require.Len(t, albums.Items, 3)
-    resp, err = http.DefaultClient.Get(host() + "/apis/web/v1/top-tracks")
+    resp, err = http.DefaultClient.Get(host() + "/apis/web/v1/top-tracks?period=all_time")
     assert.NoError(t, err)
     var tracks db.PaginatedResponse[models.Track]
     err = json.NewDecoder(resp.Body).Decode(&tracks)
@ -356,6 +356,51 @@ func TestDelete(t *testing.T) {
     truncateTestData(t)
 }
+
+func TestLoginGate(t *testing.T) {
+    t.Run("Submit Listens", doSubmitListens)
+    req, err := http.NewRequest("DELETE", host()+"/apis/web/v1/artist?id=1", nil)
+    require.NoError(t, err)
+    req.Header.Add("Authorization", "Token "+apikey)
+    resp, err := http.DefaultClient.Do(req)
+    assert.NoError(t, err)
+    assert.Equal(t, 204, resp.StatusCode)
+    req, err = http.NewRequest("GET", host()+"/apis/web/v1/artist?id=3", nil)
+    require.NoError(t, err)
+    resp, err = http.DefaultClient.Do(req)
+    assert.NoError(t, err)
+    assert.Equal(t, 200, resp.StatusCode)
+    var artist models.Artist
+    err = json.NewDecoder(resp.Body).Decode(&artist)
+    require.NoError(t, err)
+    assert.Equal(t, "ネクライトーキー", artist.Name)
+    cfg.SetLoginGate(true)
+    req, err = http.NewRequest("GET", host()+"/apis/web/v1/artist?id=3", nil)
+    require.NoError(t, err)
+    // req.Header.Add("Authorization", "Token "+apikey)
+    resp, err = http.DefaultClient.Do(req)
+    assert.NoError(t, err)
+    assert.Equal(t, 401, resp.StatusCode)
+    req, err = http.NewRequest("GET", host()+"/apis/web/v1/artist?id=3", nil)
+    require.NoError(t, err)
+    req.Header.Add("Authorization", "Token "+apikey)
+    resp, err = http.DefaultClient.Do(req)
+    assert.NoError(t, err)
+    assert.Equal(t, 200, resp.StatusCode)
+    err = json.NewDecoder(resp.Body).Decode(&artist)
+    require.NoError(t, err)
+    assert.Equal(t, "ネクライトーキー", artist.Name)
+    cfg.SetLoginGate(false)
+    truncateTestData(t)
+}
 func TestAliasesAndSearch(t *testing.T) {
     t.Run("Submit Listens", doSubmitListens)
@ -439,7 +484,7 @@ func TestStats(t *testing.T) {
     t.Run("Submit Listens", doSubmitListens)
-    resp, err = http.DefaultClient.Get(host() + "/apis/web/v1/stats")
+    resp, err = http.DefaultClient.Get(host() + "/apis/web/v1/stats?period=all_time")
     t.Log(resp)
     require.NoError(t, err)
     var actual handlers.StatsResponse


@ -0,0 +1,166 @@
package middleware
import (
"context"
"errors"
"fmt"
"net/http"
"strings"
"time"
"github.com/gabehf/koito/internal/cfg"
"github.com/gabehf/koito/internal/db"
"github.com/gabehf/koito/internal/logger"
"github.com/gabehf/koito/internal/models"
"github.com/gabehf/koito/internal/utils"
"github.com/google/uuid"
)
type MiddlwareContextKey string
const (
UserContextKey MiddlwareContextKey = "user"
apikeyContextKey MiddlwareContextKey = "apikeyID"
)
type AuthMode int
const (
AuthModeSessionCookie AuthMode = iota
AuthModeAPIKey
AuthModeSessionOrAPIKey
AuthModeLoginGate
)
func Authenticate(store db.DB, mode AuthMode) func(http.Handler) http.Handler {
return func(next http.Handler) http.Handler {
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
ctx := r.Context()
l := logger.FromContext(ctx)
var user *models.User
var err error
switch mode {
case AuthModeSessionCookie:
user, err = validateSession(ctx, store, r)
case AuthModeAPIKey:
user, err = validateAPIKey(ctx, store, r)
case AuthModeSessionOrAPIKey:
user, err = validateSession(ctx, store, r)
if err != nil || user == nil {
user, err = validateAPIKey(ctx, store, r)
}
case AuthModeLoginGate:
if cfg.LoginGate() {
user, err = validateSession(ctx, store, r)
if err != nil || user == nil {
user, err = validateAPIKey(ctx, store, r)
}
} else {
next.ServeHTTP(w, r)
return
}
}
if err != nil {
l.Err(err).Msg("authentication failed")
utils.WriteError(w, "unauthorized", http.StatusUnauthorized)
return
}
if user == nil {
utils.WriteError(w, "unauthorized", http.StatusUnauthorized)
return
}
ctx = context.WithValue(ctx, UserContextKey, user)
r = r.WithContext(ctx)
next.ServeHTTP(w, r)
})
}
}
func validateSession(ctx context.Context, store db.DB, r *http.Request) (*models.User, error) {
l := logger.FromContext(r.Context())
l.Debug().Msgf("ValidateSession: Checking user authentication via session cookie")
cookie, err := r.Cookie("koito_session")
var sid uuid.UUID
if err == nil {
sid, err = uuid.Parse(cookie.Value)
if err != nil {
l.Err(err).Msg("ValidateSession: Could not parse UUID from session cookie")
return nil, errors.New("session cookie is invalid")
}
} else {
l.Debug().Msg("ValidateSession: No session cookie found")
return nil, errors.New("session cookie is missing")
}
l.Debug().Msg("ValidateSession: Retrieved login cookie from request")
u, err := store.GetUserBySession(r.Context(), sid)
if err != nil {
l.Err(fmt.Errorf("ValidateSession: %w", err)).Msg("Error accessing database")
return nil, errors.New("internal server error")
}
if u == nil {
l.Debug().Msg("ValidateSession: No user with session id found")
return nil, errors.New("no user with session id found")
}
ctx = context.WithValue(r.Context(), UserContextKey, u)
r = r.WithContext(ctx)
l.Debug().Msgf("ValidateSession: Refreshing session for user '%s'", u.Username)
store.RefreshSession(r.Context(), sid, time.Now().Add(30*24*time.Hour))
l.Debug().Msgf("ValidateSession: Refreshed session for user '%s'", u.Username)
return u, nil
}
func validateAPIKey(ctx context.Context, store db.DB, r *http.Request) (*models.User, error) {
l := logger.FromContext(ctx)
l.Debug().Msg("ValidateApiKey: Checking if user is already authenticated")
authH := r.Header.Get("Authorization")
var token string
if strings.HasPrefix(strings.ToLower(authH), "token ") {
token = strings.TrimSpace(authH[6:]) // strip "Token "
} else {
l.Error().Msg("ValidateApiKey: Authorization header must be formatted 'Token {token}'")
return nil, errors.New("authorization header is invalid")
}
u, err := store.GetUserByApiKey(ctx, token)
if err != nil {
l.Err(err).Msg("ValidateApiKey: Failed to get user from database using api key")
return nil, errors.New("internal server error")
}
if u == nil {
l.Debug().Msg("ValidateApiKey: API key does not exist")
return nil, errors.New("authorization token is invalid")
}
ctx = context.WithValue(r.Context(), UserContextKey, u)
r = r.WithContext(ctx)
return u, nil
}
func GetUserFromContext(ctx context.Context) *models.User {
user, ok := ctx.Value(UserContextKey).(*models.User)
if !ok {
return nil
}
return user
}
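validateAPIKey expects the header `Authorization: Token {token}`, matched case-insensitively on the scheme. That parsing can be isolated as below; `tokenFromAuthHeader` is a hypothetical name used only for illustration:

```go
package main

import (
	"errors"
	"fmt"
	"strings"
)

// tokenFromAuthHeader mirrors the header parsing in validateAPIKey:
// the "Token " scheme is matched case-insensitively, and everything
// after it (whitespace-trimmed) is treated as the API key.
func tokenFromAuthHeader(h string) (string, error) {
	if !strings.HasPrefix(strings.ToLower(h), "token ") {
		return "", errors.New("authorization header is invalid")
	}
	return strings.TrimSpace(h[6:]), nil // strip "Token "
}

func main() {
	tok, err := tokenFromAuthHeader("Token abc123")
	fmt.Println(tok, err) // abc123 <nil>
}
```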


@ -1,125 +0,0 @@
package middleware
import (
"context"
"fmt"
"net/http"
"strings"
"time"
"github.com/gabehf/koito/internal/db"
"github.com/gabehf/koito/internal/logger"
"github.com/gabehf/koito/internal/models"
"github.com/gabehf/koito/internal/utils"
"github.com/google/uuid"
)
type MiddlwareContextKey string
const (
UserContextKey MiddlwareContextKey = "user"
apikeyContextKey MiddlwareContextKey = "apikeyID"
)
func ValidateSession(store db.DB) func(next http.Handler) http.Handler {
return func(next http.Handler) http.Handler {
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
l := logger.FromContext(r.Context())
l.Debug().Msgf("ValidateSession: Checking user authentication via session cookie")
cookie, err := r.Cookie("koito_session")
var sid uuid.UUID
if err == nil {
sid, err = uuid.Parse(cookie.Value)
if err != nil {
l.Err(err).Msg("ValidateSession: Could not parse UUID from session cookie")
utils.WriteError(w, "session cookie is invalid", http.StatusUnauthorized)
return
}
} else {
l.Debug().Msgf("ValidateSession: No session cookie found; attempting API key authentication")
utils.WriteError(w, "session cookie is missing", http.StatusUnauthorized)
return
}
l.Debug().Msg("ValidateSession: Retrieved login cookie from request")
u, err := store.GetUserBySession(r.Context(), sid)
if err != nil {
l.Err(fmt.Errorf("ValidateSession: %w", err)).Msg("Error accessing database")
utils.WriteError(w, "internal server error", http.StatusInternalServerError)
return
}
if u == nil {
l.Debug().Msg("ValidateSession: No user with session id found")
utils.WriteError(w, "unauthorized", http.StatusUnauthorized)
return
}
ctx := context.WithValue(r.Context(), UserContextKey, u)
r = r.WithContext(ctx)
l.Debug().Msgf("ValidateSession: Refreshing session for user '%s'", u.Username)
store.RefreshSession(r.Context(), sid, time.Now().Add(30*24*time.Hour))
l.Debug().Msgf("ValidateSession: Refreshed session for user '%s'", u.Username)
next.ServeHTTP(w, r)
})
}
}
func ValidateApiKey(store db.DB) func(next http.Handler) http.Handler {
return func(next http.Handler) http.Handler {
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
ctx := r.Context()
l := logger.FromContext(ctx)
l.Debug().Msg("ValidateApiKey: Checking if user is already authenticated")
u := GetUserFromContext(ctx)
if u != nil {
l.Debug().Msg("ValidateApiKey: User is already authenticated; skipping API key authentication")
next.ServeHTTP(w, r)
return
}
authh := r.Header.Get("Authorization")
var token string
if strings.HasPrefix(strings.ToLower(authh), "token ") {
token = strings.TrimSpace(authh[6:]) // strip "Token "
} else {
l.Error().Msg("ValidateApiKey: Authorization header must be formatted 'Token {token}'")
utils.WriteError(w, "unauthorized", http.StatusUnauthorized)
return
}
u, err := store.GetUserByApiKey(ctx, token)
if err != nil {
l.Err(err).Msg("Failed to get user from database using api key")
utils.WriteError(w, "internal server error", http.StatusInternalServerError)
return
}
if u == nil {
l.Debug().Msg("Api key does not exist")
utils.WriteError(w, "unauthorized", http.StatusUnauthorized)
return
}
ctx = context.WithValue(r.Context(), UserContextKey, u)
r = r.WithContext(ctx)
next.ServeHTTP(w, r)
})
}
}
func GetUserFromContext(ctx context.Context) *models.User {
user, ok := ctx.Value(UserContextKey).(*models.User)
if !ok {
return nil
}
return user
}


@ -38,9 +38,7 @@ func bindRoutes(
     r.Get("/config", handlers.GetCfgHandler())
     r.Group(func(r chi.Router) {
-        if cfg.LoginGate() {
-            r.Use(middleware.ValidateSession(db))
-        }
+        r.Use(middleware.Authenticate(db, middleware.AuthModeLoginGate))
         r.Get("/artist", handlers.GetArtistHandler(db))
         r.Get("/artists", handlers.GetArtistsForItemHandler(db))
         r.Get("/album", handlers.GetAlbumHandler(db))
@ -55,6 +53,7 @@ func bindRoutes(
         r.Get("/search", handlers.SearchHandler(db))
         r.Get("/aliases", handlers.GetAliasesHandler(db))
         r.Get("/summary", handlers.SummaryHandler(db))
+        r.Get("/interest", handlers.GetInterestHandler(db))
     })
     r.Post("/logout", handlers.LogoutHandler(db))
     if !cfg.RateLimitDisabled() {
@ -78,7 +77,7 @@ func bindRoutes(
     })
     r.Group(func(r chi.Router) {
-        r.Use(middleware.ValidateSession(db))
+        r.Use(middleware.Authenticate(db, middleware.AuthModeSessionOrAPIKey))
         r.Get("/export", handlers.ExportHandler(db))
         r.Post("/replace-image", handlers.ReplaceImageHandler(db))
         r.Patch("/album", handlers.UpdateAlbumHandler(db))
@ -94,6 +93,7 @@ func bindRoutes(
         r.Post("/aliases", handlers.CreateAliasHandler(db))
         r.Post("/aliases/delete", handlers.DeleteAliasHandler(db))
         r.Post("/aliases/primary", handlers.SetPrimaryAliasHandler(db))
+        r.Patch("/mbzid", handlers.UpdateMbzIdHandler(db))
         r.Get("/user/apikeys", handlers.GetApiKeysHandler(db))
         r.Post("/user/apikeys", handlers.GenerateApiKeyHandler(db))
         r.Patch("/user/apikeys", handlers.UpdateApiKeyLabelHandler(db))
@ -109,8 +109,10 @@ func bindRoutes(
         AllowedHeaders: []string{"Content-Type", "Authorization"},
     }))
-    r.With(middleware.ValidateApiKey(db)).Post("/submit-listens", handlers.LbzSubmitListenHandler(db, mbz))
-    r.With(middleware.ValidateApiKey(db)).Get("/validate-token", handlers.LbzValidateTokenHandler(db))
+    r.With(middleware.Authenticate(db, middleware.AuthModeAPIKey)).
+        Post("/submit-listens", handlers.LbzSubmitListenHandler(db, mbz))
+    r.With(middleware.Authenticate(db, middleware.AuthModeAPIKey)).
+        Get("/validate-token", handlers.LbzValidateTokenHandler(db))
 })
 // serve react client


@ -82,11 +82,8 @@ func createOrUpdateAlbumWithMbzReleaseID(ctx context.Context, d db.DB, opts Asso
     titles := []string{release.Title, opts.ReleaseName}
     utils.Unique(&titles)
-    l.Debug().Msgf("Searching for albums '%v' from artist id %d in DB", titles, opts.Artists[0].ID)
+    l.Debug().Msgf("Searching for albums '%v' from artist id %d and no associated MusicBrainz ID in DB", titles, opts.Artists[0].ID)
-    album, err = d.GetAlbum(ctx, db.GetAlbumOpts{
-        ArtistID: opts.Artists[0].ID,
-        Titles:   titles,
-    })
+    album, err = d.GetAlbumWithNoMbzIDByTitles(ctx, opts.Artists[0].ID, titles)
     if err == nil {
         l.Debug().Msgf("Found album %s, updating with MusicBrainz Release ID...", album.Title)
         err := d.UpdateAlbum(ctx, db.UpdateAlbumOpts{


@@ -96,6 +96,19 @@ func matchArtistsByMBIDMappings(ctx context.Context, d db.DB, opts AssociateArti
}) })
if err == nil { if err == nil {
l.Debug().Msgf("Artist '%s' found by Name", a.Artist) l.Debug().Msgf("Artist '%s' found by Name", a.Artist)
if artist.MbzID == nil {
err := d.UpdateArtist(ctx, db.UpdateArtistOpts{
ID: artist.ID,
MusicBrainzID: a.Mbid,
})
if err != nil {
l.Err(err).Msg("matchArtistsByMBIDMappings: failed to update artist with MusicBrainz ID")
return nil, fmt.Errorf("matchArtistsByMBIDMappings: %w", err)
}
l.Debug().Msgf("Updated artist '%s' with MusicBrainz ID", artist.Name)
} else {
l.Warn().Msgf("Attempted to update artist %s with MusicBrainz ID, but an existing ID was already found", artist.Name)
}
err = d.UpdateArtist(ctx, db.UpdateArtistOpts{ID: artist.ID, MusicBrainzID: a.Mbid}) err = d.UpdateArtist(ctx, db.UpdateArtistOpts{ID: artist.ID, MusicBrainzID: a.Mbid})
if err != nil { if err != nil {
l.Err(err).Msgf("matchArtistsByMBIDMappings: Failed to associate artist '%s' with MusicBrainz ID", artist.Name) l.Err(err).Msgf("matchArtistsByMBIDMappings: Failed to associate artist '%s' with MusicBrainz ID", artist.Name)


@@ -39,7 +39,7 @@ func AssociateTrack(ctx context.Context, d db.DB, opts AssociateTrackOpts) (*mod
return matchTrackByMbzID(ctx, d, opts) return matchTrackByMbzID(ctx, d, opts)
} else { } else {
l.Debug().Msgf("Associating track '%s' by title and artist", opts.TrackName) l.Debug().Msgf("Associating track '%s' by title and artist", opts.TrackName)
return matchTrackByTitleAndArtist(ctx, d, opts) return matchTrackByTrackInfo(ctx, d, opts)
} }
} }
@@ -56,45 +56,53 @@ func matchTrackByMbzID(ctx context.Context, d db.DB, opts AssociateTrackOpts) (*
return nil, fmt.Errorf("matchTrackByMbzID: %w", err) return nil, fmt.Errorf("matchTrackByMbzID: %w", err)
} else { } else {
l.Debug().Msgf("Track '%s' could not be found by MusicBrainz ID", opts.TrackName) l.Debug().Msgf("Track '%s' could not be found by MusicBrainz ID", opts.TrackName)
track, err := matchTrackByTitleAndArtist(ctx, d, opts) track, err := matchTrackByTrackInfo(ctx, d, opts)
if err != nil { if err != nil {
return nil, fmt.Errorf("matchTrackByMbzID: %w", err) return nil, fmt.Errorf("matchTrackByMbzID: %w", err)
} }
l.Debug().Msgf("Updating track '%s' with MusicBrainz ID %s", opts.TrackName, opts.TrackMbzID) l.Debug().Msgf("Updating track '%s' with MusicBrainz ID %s", opts.TrackName, opts.TrackMbzID)
err = d.UpdateTrack(ctx, db.UpdateTrackOpts{ if track.MbzID == nil || *track.MbzID == uuid.Nil {
ID: track.ID, err := d.UpdateTrack(ctx, db.UpdateTrackOpts{
MusicBrainzID: opts.TrackMbzID, ID: track.ID,
}) MusicBrainzID: opts.TrackMbzID,
if err != nil { })
return nil, fmt.Errorf("matchTrackByMbzID: %w", err) if err != nil {
l.Err(err).Msg("matchTrackByMbzID: failed to update track with MusicBrainz ID")
return nil, fmt.Errorf("matchTrackByMbzID: %w", err)
}
l.Debug().Msgf("Updated track '%s' with MusicBrainz ID", track.Title)
} else {
l.Warn().Msgf("Attempted to update track %s with MusicBrainz ID, but an existing ID was already found", track.Title)
} }
track.MbzID = &opts.TrackMbzID track.MbzID = &opts.TrackMbzID
return track, nil return track, nil
} }
} }
func matchTrackByTitleAndArtist(ctx context.Context, d db.DB, opts AssociateTrackOpts) (*models.Track, error) { func matchTrackByTrackInfo(ctx context.Context, d db.DB, opts AssociateTrackOpts) (*models.Track, error) {
l := logger.FromContext(ctx) l := logger.FromContext(ctx)
// try provided track title // try provided track title
track, err := d.GetTrack(ctx, db.GetTrackOpts{ track, err := d.GetTrack(ctx, db.GetTrackOpts{
Title: opts.TrackName, Title: opts.TrackName,
ReleaseID: opts.AlbumID,
ArtistIDs: opts.ArtistIDs, ArtistIDs: opts.ArtistIDs,
}) })
if err == nil { if err == nil {
l.Debug().Msgf("Track '%s' found by title and artist match", track.Title) l.Debug().Msgf("Track '%s' found by title, release and artist match", track.Title)
return track, nil return track, nil
} else if !errors.Is(err, pgx.ErrNoRows) { } else if !errors.Is(err, pgx.ErrNoRows) {
return nil, fmt.Errorf("matchTrackByTitleAndArtist: %w", err) return nil, fmt.Errorf("matchTrackByTrackInfo: %w", err)
} else { } else {
if opts.TrackMbzID != uuid.Nil { if opts.TrackMbzID != uuid.Nil {
mbzTrack, err := opts.Mbzc.GetTrack(ctx, opts.TrackMbzID) mbzTrack, err := opts.Mbzc.GetTrack(ctx, opts.TrackMbzID)
if err == nil { if err == nil {
track, err := d.GetTrack(ctx, db.GetTrackOpts{ track, err := d.GetTrack(ctx, db.GetTrackOpts{
Title: mbzTrack.Title, Title: mbzTrack.Title,
ReleaseID: opts.AlbumID,
ArtistIDs: opts.ArtistIDs, ArtistIDs: opts.ArtistIDs,
}) })
if err == nil { if err == nil {
l.Debug().Msgf("Track '%s' found by MusicBrainz title and artist match", opts.TrackName) l.Debug().Msgf("Track '%s' found by MusicBrainz title, release and artist match", opts.TrackName)
return track, nil return track, nil
} }
} }
@@ -108,7 +116,7 @@ func matchTrackByTitleAndArtist(ctx context.Context, d db.DB, opts AssociateTrac
Duration: opts.Duration, Duration: opts.Duration,
}) })
if err != nil { if err != nil {
return nil, fmt.Errorf("matchTrackByTitleAndArtist: %w", err) return nil, fmt.Errorf("matchTrackByTrackInfo: %w", err)
} }
if opts.TrackMbzID == uuid.Nil { if opts.TrackMbzID == uuid.Nil {
l.Info().Msgf("Created track '%s' with title and artist", opts.TrackName) l.Info().Msgf("Created track '%s' with title and artist", opts.TrackName)
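The association changes above share one invariant: a MusicBrainz ID is written only when the row has none yet, otherwise a warning is logged instead of overwriting. A standalone sketch of that guard, using plain strings where the real code uses `*uuid.UUID` and `uuid.Nil`:

```go
package main

import "fmt"

// shouldBackfillMbzID reports whether an incoming MusicBrainz ID may be
// written to a row. It models the guard in the diff: never clobber an
// existing ID, and never write an empty one.
func shouldBackfillMbzID(existing *string, incoming string) bool {
	if incoming == "" {
		return false // nothing to write
	}
	// only write when no ID is stored yet
	return existing == nil || *existing == ""
}

func main() {
	have := "10000000-0000-0000-0000-000000000000"
	fmt.Println(shouldBackfillMbzID(nil, have))   // true: empty slot, fill it
	fmt.Println(shouldBackfillMbzID(&have, have)) // false: already set, warn instead
}
```

Keeping the check in one place is what makes the `DoNotOverwriteMbzIDs` test later in this diff pass for artists, releases, and tracks alike.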


@@ -0,0 +1,85 @@
package catalog
import (
"context"
"fmt"
"github.com/gabehf/koito/internal/db"
"github.com/gabehf/koito/internal/logger"
"github.com/gabehf/koito/internal/mbz"
"github.com/google/uuid"
)
func BackfillTrackDurationsFromMusicBrainz(
ctx context.Context,
store db.DB,
mbzCaller mbz.MusicBrainzCaller,
) error {
l := logger.FromContext(ctx)
l.Info().Msg("BackfillTrackDurationsFromMusicBrainz: Starting backfill of track durations from MusicBrainz")
var from int32 = 0
for {
l.Debug().Int32("ID", from).Msg("Fetching tracks to backfill from ID")
tracks, err := store.GetTracksWithNoDurationButHaveMbzID(ctx, from)
if err != nil {
return fmt.Errorf("BackfillTrackDurationsFromMusicBrainz: failed to fetch tracks for duration backfill: %w", err)
}
// an empty result set means there are no more rows to process
if len(tracks) == 0 {
if from == 0 {
l.Info().Msg("BackfillTrackDurationsFromMusicBrainz: No tracks need updating. Skipping backfill...")
} else {
l.Info().Msg("BackfillTrackDurationsFromMusicBrainz: Backfill complete")
}
return nil
}
for _, track := range tracks {
from = track.ID
if track.MbzID == nil || *track.MbzID == uuid.Nil {
continue
}
l.Debug().
Str("title", track.Title).
Str("mbz_id", track.MbzID.String()).
Msg("BackfillTrackDurationsFromMusicBrainz: Backfilling duration from MusicBrainz")
mbzTrack, err := mbzCaller.GetTrack(ctx, *track.MbzID)
if err != nil {
l.Err(err).
Str("title", track.Title).
Msg("BackfillTrackDurationsFromMusicBrainz: Failed to fetch track from MusicBrainz")
continue
}
if mbzTrack.LengthMs <= 0 {
l.Debug().
Str("title", track.Title).
Msg("BackfillTrackDurationsFromMusicBrainz: MusicBrainz track has no duration")
continue
}
durationSeconds := int32(mbzTrack.LengthMs / 1000)
err = store.UpdateTrack(ctx, db.UpdateTrackOpts{
ID: track.ID,
Duration: durationSeconds,
})
if err != nil {
l.Err(err).
Str("title", track.Title).
Msg("BackfillTrackDurationsFromMusicBrainz: Failed to update track duration")
} else {
l.Info().
Str("title", track.Title).
Int32("duration_seconds", durationSeconds).
Msg("BackfillTrackDurationsFromMusicBrainz: Track duration backfilled successfully")
}
}
}
}
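`BackfillTrackDurationsFromMusicBrainz` pages through candidates with a keyset cursor: each fetch returns rows with ID greater than `from`, the loop advances `from` to the last ID seen, and an empty page terminates. The same shape reappears in the image-backfill functions later in this diff. A self-contained model of the loop, with `fetchPage` standing in for `GetTracksWithNoDurationButHaveMbzID` (assumed to return rows in ascending ID order):

```go
package main

import "fmt"

type row struct{ ID int32 }

// fetchPage returns up to limit rows with ID > from, mimicking a
// keyset-paginated query over an ID-ordered table.
func fetchPage(all []row, from int32, limit int) []row {
	var page []row
	for _, r := range all {
		if r.ID > from && len(page) < limit {
			page = append(page, r)
		}
	}
	return page
}

// processAll walks every row exactly once using the cursor loop.
func processAll(all []row) int {
	var from int32 = 0
	seen := 0
	for {
		page := fetchPage(all, from, 2)
		if len(page) == 0 {
			return seen // empty page terminates the loop
		}
		for _, r := range page {
			from = r.ID // advance cursor so the next page starts after this row
			seen++
		}
	}
}

func main() {
	rows := []row{{1}, {3}, {7}, {8}, {12}}
	fmt.Println(processAll(rows)) // 5
}
```

Unlike OFFSET pagination, the cursor stays correct even when rows processed earlier drop out of the result set (here, once a track gains a duration it no longer matches the query).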


@@ -0,0 +1,36 @@
package catalog_test
import (
"context"
"testing"
"github.com/gabehf/koito/internal/catalog"
"github.com/gabehf/koito/internal/mbz"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
)
func TestBackfillDuration(t *testing.T) {
setupTestDataWithMbzIDs(t)
ctx := context.Background()
mbzc := &mbz.MbzMockCaller{
Artists: mbzArtistData,
Releases: mbzReleaseData,
Tracks: mbzTrackData,
}
var err error
err = catalog.BackfillTrackDurationsFromMusicBrainz(context.Background(), store, &mbz.MbzErrorCaller{})
assert.NoError(t, err)
err = catalog.BackfillTrackDurationsFromMusicBrainz(ctx, store, mbzc)
assert.NoError(t, err)
count, err := store.Count(ctx, `
SELECT COUNT(*) FROM tracks_with_title WHERE title = $1 AND duration > 0
`, "Tokyo Calling")
require.NoError(t, err)
assert.Equal(t, 1, count, "track was not updated with duration")
}


@@ -13,7 +13,9 @@ import (
"github.com/gabehf/koito/internal/cfg" "github.com/gabehf/koito/internal/cfg"
"github.com/gabehf/koito/internal/db" "github.com/gabehf/koito/internal/db"
"github.com/gabehf/koito/internal/images"
"github.com/gabehf/koito/internal/logger" "github.com/gabehf/koito/internal/logger"
"github.com/gabehf/koito/internal/utils"
"github.com/google/uuid" "github.com/google/uuid"
"github.com/h2non/bimg" "github.com/h2non/bimg"
) )
@@ -78,30 +80,10 @@ func SourceImageDir() string {
} }
} }
// ValidateImageURL checks if the URL points to a valid image by performing a HEAD request.
func ValidateImageURL(url string) error {
resp, err := http.Head(url)
if err != nil {
return fmt.Errorf("ValidateImageURL: http.Head: %w", err)
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
return fmt.Errorf("ValidateImageURL: HEAD request failed, status code: %d", resp.StatusCode)
}
contentType := resp.Header.Get("Content-Type")
if !strings.HasPrefix(contentType, "image/") {
return fmt.Errorf("ValidateImageURL: URL does not point to an image, content type: %s", contentType)
}
return nil
}
// DownloadAndCacheImage downloads an image from the given URL, then calls CompressAndSaveImage. // DownloadAndCacheImage downloads an image from the given URL, then calls CompressAndSaveImage.
func DownloadAndCacheImage(ctx context.Context, id uuid.UUID, url string, size ImageSize) error { func DownloadAndCacheImage(ctx context.Context, id uuid.UUID, url string, size ImageSize) error {
l := logger.FromContext(ctx) l := logger.FromContext(ctx)
err := ValidateImageURL(url) err := images.ValidateImageURL(url)
if err != nil { if err != nil {
return fmt.Errorf("DownloadAndCacheImage: %w", err) return fmt.Errorf("DownloadAndCacheImage: %w", err)
} }
@@ -285,3 +267,127 @@ func pruneDirImgs(ctx context.Context, store db.DB, path string, memo map[string
} }
return count, nil return count, nil
} }
func FetchMissingArtistImages(ctx context.Context, store db.DB) error {
l := logger.FromContext(ctx)
l.Info().Msg("FetchMissingArtistImages: Starting backfill of missing artist images")
var from int32 = 0
for {
l.Debug().Int32("ID", from).Msg("Fetching artist images to backfill from ID")
artists, err := store.ArtistsWithoutImages(ctx, from)
if err != nil {
return fmt.Errorf("FetchMissingArtistImages: failed to fetch artists for image backfill: %w", err)
}
if len(artists) == 0 {
if from == 0 {
l.Info().Msg("FetchMissingArtistImages: No artists with missing images found")
} else {
l.Info().Msg("FetchMissingArtistImages: Finished fetching missing artist images")
}
return nil
}
for _, artist := range artists {
from = artist.ID
l.Debug().
Str("title", artist.Name).
Msg("FetchMissingArtistImages: Attempting to fetch missing artist image")
var aliases []string
if aliasrow, err := store.GetAllArtistAliases(ctx, artist.ID); err == nil {
aliases = utils.FlattenAliases(aliasrow)
} else {
aliases = []string{artist.Name}
}
var imgid uuid.UUID
imgUrl, imgErr := images.GetArtistImage(ctx, images.ArtistImageOpts{
Aliases: aliases,
})
if imgErr == nil && imgUrl != "" {
imgid = uuid.New()
err = store.UpdateArtist(ctx, db.UpdateArtistOpts{
ID: artist.ID,
Image: imgid,
ImageSrc: imgUrl,
})
if err != nil {
l.Err(err).
Str("title", artist.Name).
Msg("FetchMissingArtistImages: Failed to update artist with image in database")
continue
}
l.Info().
Str("name", artist.Name).
Msg("FetchMissingArtistImages: Successfully fetched missing artist image")
} else {
l.Err(imgErr).
Str("name", artist.Name).
Msg("FetchMissingArtistImages: Failed to fetch artist image")
}
}
}
}
func FetchMissingAlbumImages(ctx context.Context, store db.DB) error {
l := logger.FromContext(ctx)
l.Info().Msg("FetchMissingAlbumImages: Starting backfill of missing album images")
var from int32 = 0
for {
l.Debug().Int32("ID", from).Msg("Fetching album images to backfill from ID")
albums, err := store.AlbumsWithoutImages(ctx, from)
if err != nil {
return fmt.Errorf("FetchMissingAlbumImages: failed to fetch albums for image backfill: %w", err)
}
if len(albums) == 0 {
if from == 0 {
l.Info().Msg("FetchMissingAlbumImages: No albums with missing images found")
} else {
l.Info().Msg("FetchMissingAlbumImages: Finished fetching missing album images")
}
return nil
}
for _, album := range albums {
from = album.ID
l.Debug().
Str("title", album.Title).
Msg("FetchMissingAlbumImages: Attempting to fetch missing album image")
var imgid uuid.UUID
imgUrl, imgErr := images.GetAlbumImage(ctx, images.AlbumImageOpts{
Artists: utils.FlattenSimpleArtistNames(album.Artists),
Album: album.Title,
ReleaseMbzID: album.MbzID,
})
if imgErr == nil && imgUrl != "" {
imgid = uuid.New()
err = store.UpdateAlbum(ctx, db.UpdateAlbumOpts{
ID: album.ID,
Image: imgid,
ImageSrc: imgUrl,
})
if err != nil {
l.Err(err).
Str("title", album.Title).
Msg("FetchMissingAlbumImages: Failed to update album with image in database")
continue
}
l.Info().
Str("name", album.Title).
Msg("FetchMissingAlbumImages: Successfully fetched missing album image")
} else {
l.Err(imgErr).
Str("name", album.Title).
Msg("FetchMissingAlbumImages: Failed to fetch album image")
}
}
}
}


@@ -63,7 +63,7 @@ func TestSubmitListen_CreateAllMbzIDs(t *testing.T) {
assert.True(t, exists, "expected listen row to exist") assert.True(t, exists, "expected listen row to exist")
// Verify that listen time is correct // Verify that listen time is correct
p, err := store.GetListensPaginated(ctx, db.GetItemsOpts{Limit: 1, Page: 1}) p, err := store.GetListensPaginated(ctx, db.GetItemsOpts{Limit: 1, Page: 1, Timeframe: db.Timeframe{Period: db.PeriodAllTime}})
require.NoError(t, err) require.NoError(t, err)
require.Len(t, p.Items, 1) require.Len(t, p.Items, 1)
l := p.Items[0] l := p.Items[0]
@@ -282,6 +282,73 @@ func TestSubmitListen_MatchAllMbzIDs(t *testing.T) {
assert.Equal(t, 1, count, "duplicate artist created") assert.Equal(t, 1, count, "duplicate artist created")
} }
func TestSubmitListen_DoNotOverwriteMbzIDs(t *testing.T) {
setupTestDataWithMbzIDs(t)
// artist gets matched with musicbrainz id
// release gets matched with mbz id
// track gets matched with mbz id
ctx := context.Background()
mbzc := &mbz.MbzMockCaller{
Artists: mbzArtistData,
Releases: mbzReleaseData,
Tracks: mbzTrackData,
}
artistMbzID := uuid.MustParse("10000000-0000-0000-0000-000000000000")
releaseMbzID := uuid.MustParse("01000000-0000-0000-0000-000000000000")
existingReleaseMbzID := uuid.MustParse("00000000-0000-0000-0000-000000000101")
trackMbzID := uuid.MustParse("00100000-0000-0000-0000-000000000000")
opts := catalog.SubmitListenOpts{
MbzCaller: mbzc,
ArtistNames: []string{"ATARASHII GAKKO!"},
Artist: "ATARASHII GAKKO!",
ArtistMbzIDs: []uuid.UUID{
artistMbzID,
},
TrackTitle: "Tokyo Calling",
RecordingMbzID: trackMbzID,
ReleaseTitle: "AG! Calling",
ReleaseMbzID: releaseMbzID,
Time: time.Now(),
UserID: 1,
}
err := catalog.SubmitListen(ctx, store, opts)
require.NoError(t, err)
// Verify that the listen was saved
exists, err := store.RowExists(ctx, `
SELECT EXISTS (
SELECT 1 FROM listens
WHERE track_id = $1
)`, 1)
require.NoError(t, err)
assert.True(t, exists, "expected listen row to exist")
// verify that track, release group, and artist are existing ones and not duplicates
count, err := store.Count(ctx, `
SELECT COUNT(*) FROM tracks_with_title WHERE musicbrainz_id = $1
`, trackMbzID)
require.NoError(t, err)
assert.Equal(t, 0, count, "duplicate track created")
count, err = store.Count(ctx, `
SELECT COUNT(*) FROM releases_with_title WHERE musicbrainz_id = $1
`, releaseMbzID)
require.NoError(t, err)
assert.Equal(t, 0, count, "duplicate release group created")
count, err = store.Count(ctx, `
SELECT COUNT(*) FROM releases_with_title WHERE musicbrainz_id = $1
`, existingReleaseMbzID)
require.NoError(t, err)
assert.Equal(t, 1, count, "existing release group should not be overwritten")
count, err = store.Count(ctx, `
SELECT COUNT(*) FROM artists_with_name WHERE musicbrainz_id = $1
`, artistMbzID)
require.NoError(t, err)
assert.Equal(t, 0, count, "duplicate artist created")
}
func TestSubmitListen_MatchTrackFromMbzTitle(t *testing.T) { func TestSubmitListen_MatchTrackFromMbzTitle(t *testing.T) {
setupTestDataSansMbzIDs(t) setupTestDataSansMbzIDs(t)


@@ -38,6 +38,7 @@ const (
DISABLE_MUSICBRAINZ_ENV = "KOITO_DISABLE_MUSICBRAINZ" DISABLE_MUSICBRAINZ_ENV = "KOITO_DISABLE_MUSICBRAINZ"
SUBSONIC_URL_ENV = "KOITO_SUBSONIC_URL" SUBSONIC_URL_ENV = "KOITO_SUBSONIC_URL"
SUBSONIC_PARAMS_ENV = "KOITO_SUBSONIC_PARAMS" SUBSONIC_PARAMS_ENV = "KOITO_SUBSONIC_PARAMS"
LASTFM_API_KEY_ENV = "KOITO_LASTFM_API_KEY"
SKIP_IMPORT_ENV = "KOITO_SKIP_IMPORT" SKIP_IMPORT_ENV = "KOITO_SKIP_IMPORT"
ALLOWED_HOSTS_ENV = "KOITO_ALLOWED_HOSTS" ALLOWED_HOSTS_ENV = "KOITO_ALLOWED_HOSTS"
CORS_ORIGINS_ENV = "KOITO_CORS_ALLOWED_ORIGINS" CORS_ORIGINS_ENV = "KOITO_CORS_ALLOWED_ORIGINS"
@@ -48,6 +49,7 @@ const (
FETCH_IMAGES_DURING_IMPORT_ENV = "KOITO_FETCH_IMAGES_DURING_IMPORT" FETCH_IMAGES_DURING_IMPORT_ENV = "KOITO_FETCH_IMAGES_DURING_IMPORT"
ARTIST_SEPARATORS_ENV = "KOITO_ARTIST_SEPARATORS_REGEX" ARTIST_SEPARATORS_ENV = "KOITO_ARTIST_SEPARATORS_REGEX"
LOGIN_GATE_ENV = "KOITO_LOGIN_GATE" LOGIN_GATE_ENV = "KOITO_LOGIN_GATE"
FORCE_TZ = "KOITO_FORCE_TZ"
) )
type config struct { type config struct {
@@ -72,6 +74,7 @@ type config struct {
disableMusicBrainz bool disableMusicBrainz bool
subsonicUrl string subsonicUrl string
subsonicParams string subsonicParams string
lastfmApiKey string
subsonicEnabled bool subsonicEnabled bool
skipImport bool skipImport bool
fetchImageDuringImport bool fetchImageDuringImport bool
@@ -85,6 +88,7 @@ type config struct {
importAfter time.Time importAfter time.Time
artistSeparators []*regexp.Regexp artistSeparators []*regexp.Regexp
loginGate bool loginGate bool
forceTZ *time.Location
} }
var ( var (
@@ -165,6 +169,7 @@ func loadConfig(getenv func(string) string, version string) (*config, error) {
if cfg.subsonicEnabled && (cfg.subsonicUrl == "" || cfg.subsonicParams == "") { if cfg.subsonicEnabled && (cfg.subsonicUrl == "" || cfg.subsonicParams == "") {
return nil, fmt.Errorf("loadConfig: invalid configuration: both %s and %s must be set in order to use subsonic image fetching", SUBSONIC_URL_ENV, SUBSONIC_PARAMS_ENV) return nil, fmt.Errorf("loadConfig: invalid configuration: both %s and %s must be set in order to use subsonic image fetching", SUBSONIC_URL_ENV, SUBSONIC_PARAMS_ENV)
} }
cfg.lastfmApiKey = getenv(LASTFM_API_KEY_ENV)
cfg.skipImport = parseBool(getenv(SKIP_IMPORT_ENV)) cfg.skipImport = parseBool(getenv(SKIP_IMPORT_ENV))
cfg.userAgent = fmt.Sprintf("Koito %s (contact@koito.io)", version) cfg.userAgent = fmt.Sprintf("Koito %s (contact@koito.io)", version)
@@ -210,6 +215,13 @@ func loadConfig(getenv func(string) string, version string) (*config, error) {
cfg.loginGate = true cfg.loginGate = true
} }
if getenv(FORCE_TZ) != "" {
cfg.forceTZ, err = time.LoadLocation(getenv(FORCE_TZ))
if err != nil {
return nil, fmt.Errorf("forced timezone '%s' is not a valid timezone", getenv(FORCE_TZ))
}
}
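The `KOITO_FORCE_TZ` block above validates the zone name eagerly with `time.LoadLocation`, so a bad value fails at startup instead of at query time. A runnable sketch of that shape (the helper name `resolveForceTZ` is mine, not Koito's):

```go
package main

import (
	"fmt"
	"time"
)

// resolveForceTZ turns an optional env value into a *time.Location.
// An unset variable yields (nil, nil), meaning "no forced zone".
func resolveForceTZ(value string) (*time.Location, error) {
	if value == "" {
		return nil, nil // option not set; fall back to per-request zones
	}
	loc, err := time.LoadLocation(value)
	if err != nil {
		return nil, fmt.Errorf("forced timezone '%s' is not a valid timezone", value)
	}
	return loc, nil
}

func main() {
	loc, err := resolveForceTZ("UTC")
	fmt.Println(loc, err) // UTC <nil>
	_, err = resolveForceTZ("Not/AZone")
	fmt.Println(err != nil) // true
}
```

Valid inputs are IANA zone names such as `America/New_York`; `time.LoadLocation` consults the system tzdata (or the embedded `time/tzdata` package) to resolve them.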
switch strings.ToLower(getenv(LOG_LEVEL_ENV)) { switch strings.ToLower(getenv(LOG_LEVEL_ENV)) {
case "debug": case "debug":
cfg.logLevel = 0 cfg.logLevel = 0
@@ -232,192 +244,3 @@ func parseBool(s string) bool {
return false return false
} }
} }
// Global accessors for configuration values
func UserAgent() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.userAgent
}
func ListenAddr() string {
lock.RLock()
defer lock.RUnlock()
return fmt.Sprintf("%s:%d", globalConfig.bindAddr, globalConfig.listenPort)
}
func ConfigDir() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.configDir
}
func DatabaseUrl() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.databaseUrl
}
func MusicBrainzUrl() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.musicBrainzUrl
}
func MusicBrainzRateLimit() int {
lock.RLock()
defer lock.RUnlock()
return globalConfig.musicBrainzRateLimit
}
func LogLevel() int {
lock.RLock()
defer lock.RUnlock()
return globalConfig.logLevel
}
func StructuredLogging() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.structuredLogging
}
func LbzRelayEnabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.lbzRelayEnabled
}
func LbzRelayUrl() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.lbzRelayUrl
}
func LbzRelayToken() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.lbzRelayToken
}
func DefaultPassword() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.defaultPw
}
func DefaultUsername() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.defaultUsername
}
func DefaultTheme() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.defaultTheme
}
func FullImageCacheEnabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.enableFullImageCache
}
func DeezerDisabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.disableDeezer
}
func CoverArtArchiveDisabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.disableCAA
}
func MusicBrainzDisabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.disableMusicBrainz
}
func SubsonicEnabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.subsonicEnabled
}
func SubsonicUrl() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.subsonicUrl
}
func SubsonicParams() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.subsonicParams
}
func SkipImport() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.skipImport
}
func AllowedHosts() []string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.allowedHosts
}
func AllowAllHosts() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.allowAllHosts
}
func AllowedOrigins() []string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.allowedOrigins
}
func RateLimitDisabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.disableRateLimit
}
func ThrottleImportMs() int {
lock.RLock()
defer lock.RUnlock()
return globalConfig.importThrottleMs
}
// returns the before, after times, in that order
func ImportWindow() (time.Time, time.Time) {
lock.RLock()
defer lock.RUnlock()
return globalConfig.importBefore, globalConfig.importAfter
}
func FetchImagesDuringImport() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.fetchImageDuringImport
}
func ArtistSeparators() []*regexp.Regexp {
lock.RLock()
defer lock.RUnlock()
return globalConfig.artistSeparators
}
func LoginGate() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.loginGate
}

internal/cfg/getters.go Normal file

@@ -0,0 +1,206 @@
package cfg
import (
"fmt"
"regexp"
"time"
)
func UserAgent() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.userAgent
}
func ListenAddr() string {
lock.RLock()
defer lock.RUnlock()
return fmt.Sprintf("%s:%d", globalConfig.bindAddr, globalConfig.listenPort)
}
func ConfigDir() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.configDir
}
func DatabaseUrl() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.databaseUrl
}
func MusicBrainzUrl() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.musicBrainzUrl
}
func MusicBrainzRateLimit() int {
lock.RLock()
defer lock.RUnlock()
return globalConfig.musicBrainzRateLimit
}
func LogLevel() int {
lock.RLock()
defer lock.RUnlock()
return globalConfig.logLevel
}
func StructuredLogging() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.structuredLogging
}
func LbzRelayEnabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.lbzRelayEnabled
}
func LbzRelayUrl() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.lbzRelayUrl
}
func LbzRelayToken() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.lbzRelayToken
}
func DefaultPassword() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.defaultPw
}
func DefaultUsername() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.defaultUsername
}
func DefaultTheme() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.defaultTheme
}
func FullImageCacheEnabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.enableFullImageCache
}
func DeezerDisabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.disableDeezer
}
func CoverArtArchiveDisabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.disableCAA
}
func MusicBrainzDisabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.disableMusicBrainz
}
func SubsonicEnabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.subsonicEnabled
}
func SubsonicUrl() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.subsonicUrl
}
func SubsonicParams() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.subsonicParams
}
func LastFMApiKey() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.lastfmApiKey
}
func SkipImport() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.skipImport
}
func AllowedHosts() []string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.allowedHosts
}
func AllowAllHosts() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.allowAllHosts
}
func AllowedOrigins() []string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.allowedOrigins
}
func RateLimitDisabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.disableRateLimit
}
func ThrottleImportMs() int {
lock.RLock()
defer lock.RUnlock()
return globalConfig.importThrottleMs
}
// returns the before, after times, in that order
func ImportWindow() (time.Time, time.Time) {
lock.RLock()
defer lock.RUnlock()
return globalConfig.importBefore, globalConfig.importAfter
}
func FetchImagesDuringImport() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.fetchImageDuringImport
}
func ArtistSeparators() []*regexp.Regexp {
lock.RLock()
defer lock.RUnlock()
return globalConfig.artistSeparators
}
func LoginGate() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.loginGate
}
func ForceTZ() *time.Location {
lock.RLock()
defer lock.RUnlock()
return globalConfig.forceTZ
}
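Every accessor moved into getters.go takes the read lock before touching the global config, while setters (see setters.go below) take the write lock, so many readers can run concurrently but a write is exclusive. A minimal, self-contained model of the pattern using the same names the diff uses:

```go
package main

import (
	"fmt"
	"sync"
)

type config struct{ loginGate bool }

var (
	lock         sync.RWMutex
	globalConfig config
)

// LoginGate is a read-locked accessor: cheap and safe to call from
// many goroutines at once.
func LoginGate() bool {
	lock.RLock()
	defer lock.RUnlock()
	return globalConfig.loginGate
}

// SetLoginGate takes the write lock, excluding all readers while the
// field is updated.
func SetLoginGate(val bool) {
	lock.Lock()
	defer lock.Unlock()
	globalConfig.loginGate = val
}

func main() {
	SetLoginGate(true)
	// Concurrent readers can hold the read lock simultaneously.
	var wg sync.WaitGroup
	for i := 0; i < 4; i++ {
		wg.Add(1)
		go func() { defer wg.Done(); _ = LoginGate() }()
	}
	wg.Wait()
	fmt.Println(LoginGate()) // true
}
```

Splitting the accessors into their own file also explains the `SetLoginGate` setter added below: it lets the login-gate handler flip the flag at runtime without exposing `globalConfig` directly.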

internal/cfg/setters.go Normal file

@@ -0,0 +1,7 @@
package cfg
func SetLoginGate(val bool) {
lock.Lock()
defer lock.Unlock()
globalConfig.loginGate = val
}


@@ -14,12 +14,14 @@ type DB interface {
GetArtist(ctx context.Context, opts GetArtistOpts) (*models.Artist, error) GetArtist(ctx context.Context, opts GetArtistOpts) (*models.Artist, error)
GetAlbum(ctx context.Context, opts GetAlbumOpts) (*models.Album, error) GetAlbum(ctx context.Context, opts GetAlbumOpts) (*models.Album, error)
GetAlbumWithNoMbzIDByTitles(ctx context.Context, artistId int32, titles []string) (*models.Album, error)
GetTrack(ctx context.Context, opts GetTrackOpts) (*models.Track, error) GetTrack(ctx context.Context, opts GetTrackOpts) (*models.Track, error)
GetTracksWithNoDurationButHaveMbzID(ctx context.Context, from int32) ([]*models.Track, error)
GetArtistsForAlbum(ctx context.Context, id int32) ([]*models.Artist, error) GetArtistsForAlbum(ctx context.Context, id int32) ([]*models.Artist, error)
GetArtistsForTrack(ctx context.Context, id int32) ([]*models.Artist, error) GetArtistsForTrack(ctx context.Context, id int32) ([]*models.Artist, error)
GetTopTracksPaginated(ctx context.Context, opts GetItemsOpts) (*PaginatedResponse[*models.Track], error) GetTopTracksPaginated(ctx context.Context, opts GetItemsOpts) (*PaginatedResponse[RankedItem[*models.Track]], error)
GetTopArtistsPaginated(ctx context.Context, opts GetItemsOpts) (*PaginatedResponse[*models.Artist], error) GetTopArtistsPaginated(ctx context.Context, opts GetItemsOpts) (*PaginatedResponse[RankedItem[*models.Artist]], error)
GetTopAlbumsPaginated(ctx context.Context, opts GetItemsOpts) (*PaginatedResponse[*models.Album], error) GetTopAlbumsPaginated(ctx context.Context, opts GetItemsOpts) (*PaginatedResponse[RankedItem[*models.Album]], error)
GetListensPaginated(ctx context.Context, opts GetItemsOpts) (*PaginatedResponse[*models.Listen], error) GetListensPaginated(ctx context.Context, opts GetItemsOpts) (*PaginatedResponse[*models.Listen], error)
GetListenActivity(ctx context.Context, opts ListenActivityOpts) ([]ListenActivityItem, error) GetListenActivity(ctx context.Context, opts ListenActivityOpts) ([]ListenActivityItem, error)
GetAllArtistAliases(ctx context.Context, id int32) ([]models.Alias, error) GetAllArtistAliases(ctx context.Context, id int32) ([]models.Alias, error)
@@ -29,6 +31,7 @@ type DB interface {
GetUserBySession(ctx context.Context, sessionId uuid.UUID) (*models.User, error) GetUserBySession(ctx context.Context, sessionId uuid.UUID) (*models.User, error)
GetUserByUsername(ctx context.Context, username string) (*models.User, error) GetUserByUsername(ctx context.Context, username string) (*models.User, error)
GetUserByApiKey(ctx context.Context, key string) (*models.User, error) GetUserByApiKey(ctx context.Context, key string) (*models.User, error)
GetInterest(ctx context.Context, opts GetInterestOpts) ([]InterestBucket, error)
// Save // Save
@@ -85,6 +88,7 @@ type DB interface {
// in seconds // in seconds
CountTimeListenedToItem(ctx context.Context, opts TimeListenedOpts) (int64, error) CountTimeListenedToItem(ctx context.Context, opts TimeListenedOpts) (int64, error)
CountUsers(ctx context.Context) (int64, error) CountUsers(ctx context.Context) (int64, error)
// Search // Search
SearchArtists(ctx context.Context, q string) ([]*models.Artist, error) SearchArtists(ctx context.Context, q string) ([]*models.Artist, error)
@@ -102,6 +106,7 @@ type DB interface {
ImageHasAssociation(ctx context.Context, image uuid.UUID) (bool, error) ImageHasAssociation(ctx context.Context, image uuid.UUID) (bool, error)
GetImageSource(ctx context.Context, image uuid.UUID) (string, error) GetImageSource(ctx context.Context, image uuid.UUID) (string, error)
AlbumsWithoutImages(ctx context.Context, from int32) ([]*models.Album, error) AlbumsWithoutImages(ctx context.Context, from int32) ([]*models.Album, error)
ArtistsWithoutImages(ctx context.Context, from int32) ([]*models.Artist, error)
GetExportPage(ctx context.Context, opts GetExportPageOpts) ([]*ExportItem, error) GetExportPage(ctx context.Context, opts GetExportPageOpts) ([]*ExportItem, error)
Ping(ctx context.Context) error Ping(ctx context.Context) error
Close(ctx context.Context) Close(ctx context.Context)


@@ -27,6 +27,7 @@ type GetTrackOpts struct {
ID int32 ID int32
MusicBrainzID uuid.UUID MusicBrainzID uuid.UUID
Title string Title string
ReleaseID int32
ArtistIDs []int32 ArtistIDs []int32
} }
@@ -116,14 +117,9 @@ type AddArtistsToAlbumOpts struct {
} }
type GetItemsOpts struct { type GetItemsOpts struct {
Limit int Limit int
Period Period Page int
Page int Timeframe Timeframe
Week int // 1-52
Month int // 1-12
Year int
From int64 // unix timestamp
To int64 // unix timestamp
// Used only for getting top tracks // Used only for getting top tracks
ArtistID int ArtistID int
@ -138,6 +134,7 @@ type ListenActivityOpts struct {
Range int Range int
Month int Month int
Year int Year int
Timezone *time.Location
AlbumID int32 AlbumID int32
ArtistID int32 ArtistID int32
TrackID int32 TrackID int32
@ -156,3 +153,10 @@ type GetExportPageOpts struct {
TrackID int32 TrackID int32
Limit int32 Limit int32
} }
type GetInterestOpts struct {
Buckets int
AlbumID int32
ArtistID int32
TrackID int32
}


@@ -6,23 +6,6 @@ import (
// should this be in db package ???
-type Timeframe struct {
-   Period Period
-   T1u int64
-   T2u int64
-}
-
-func TimeframeToTimeRange(timeframe Timeframe) (t1, t2 time.Time) {
-   if timeframe.T1u == 0 && timeframe.T2u == 0 {
-       t2 = time.Now()
-       t1 = StartTimeFromPeriod(timeframe.Period)
-   } else {
-       t1 = time.Unix(timeframe.T1u, 0)
-       t2 = time.Unix(timeframe.T2u, 0)
-   }
-   return
-}
-
type Period string

const (
@@ -31,9 +14,12 @@ const (
    PeriodMonth Period = "month"
    PeriodYear Period = "year"
    PeriodAllTime Period = "all_time"
+   PeriodDefault Period = "day"
)

+func (p Period) IsZero() bool {
+   return p == ""
+}
+
func StartTimeFromPeriod(p Period) time.Time {
    now := time.Now()
    switch p {
@@ -71,17 +57,21 @@ const (
// and end will be 23:59:59 on Saturday at the end of the current week.
// If opts.Year (or opts.Year + opts.Month) is provided, start and end will simply be the start and end times of that year/month.
func ListenActivityOptsToTimes(opts ListenActivityOpts) (start, end time.Time) {
-   now := time.Now()
+   loc := opts.Timezone
+   if loc == nil {
+       loc, _ = time.LoadLocation("UTC")
+   }
+   now := time.Now().In(loc)
    // If Year (and optionally Month) are specified, use calendar boundaries
    if opts.Year != 0 {
        if opts.Month != 0 {
            // Specific month of a specific year
-           start = time.Date(opts.Year, time.Month(opts.Month), 1, 0, 0, 0, 0, now.Location())
+           start = time.Date(opts.Year, time.Month(opts.Month), 1, 0, 0, 0, 0, loc)
            end = start.AddDate(0, 1, 0).Add(-time.Nanosecond)
        } else {
            // Whole year
-           start = time.Date(opts.Year, 1, 1, 0, 0, 0, 0, now.Location())
+           start = time.Date(opts.Year, 1, 1, 0, 0, 0, 0, loc)
            end = start.AddDate(1, 0, 0).Add(-time.Nanosecond)
        }
        return start, end
@@ -93,30 +83,32 @@ func ListenActivityOptsToTimes(opts ListenActivityOpts) (start, end time.Time) {
    // Determine step and align accordingly
    switch opts.Step {
    case StepDay:
-       today := time.Date(now.Year(), now.Month(), now.Day(), 0, 0, 0, 0, now.Location())
+       today := time.Date(now.Year(), now.Month(), now.Day(), 0, 0, 0, 0, loc)
        start = today.AddDate(0, 0, -opts.Range)
        end = today.AddDate(0, 0, 1).Add(-time.Nanosecond)
    case StepWeek:
        // Align to most recent Sunday
        weekday := int(now.Weekday()) // Sunday = 0
-       startOfThisWeek := time.Date(now.Year(), now.Month(), now.Day()-weekday, 0, 0, 0, 0, now.Location())
+       startOfThisWeek := time.Date(now.Year(), now.Month(), now.Day()-weekday, 0, 0, 0, 0, loc)
-       start = startOfThisWeek.AddDate(0, 0, -7*opts.Range)
+       // need to subtract 1 from range for week because we are going back from the beginning of this
+       // week, so we sort of already went back a week
+       start = startOfThisWeek.AddDate(0, 0, -7*(opts.Range-1))
        end = startOfThisWeek.AddDate(0, 0, 7).Add(-time.Nanosecond)
    case StepMonth:
-       firstOfThisMonth := time.Date(now.Year(), now.Month(), 1, 0, 0, 0, 0, now.Location())
+       firstOfThisMonth := time.Date(now.Year(), now.Month(), 1, 0, 0, 0, 0, loc)
        start = firstOfThisMonth.AddDate(0, -opts.Range, 0)
        end = firstOfThisMonth.AddDate(0, 1, 0).Add(-time.Nanosecond)
    case StepYear:
-       firstOfThisYear := time.Date(now.Year(), 1, 1, 0, 0, 0, 0, now.Location())
+       firstOfThisYear := time.Date(now.Year(), 1, 1, 0, 0, 0, 0, loc)
        start = firstOfThisYear.AddDate(-opts.Range, 0, 0)
        end = firstOfThisYear.AddDate(1, 0, 0).Add(-time.Nanosecond)
    default:
        // Default to daily
-       today := time.Date(now.Year(), now.Month(), now.Day(), 0, 0, 0, 0, now.Location())
+       today := time.Date(now.Year(), now.Month(), now.Day(), 0, 0, 0, 0, loc)
        start = today.AddDate(0, 0, -opts.Range)
        end = today.AddDate(0, 0, 1).Add(-time.Nanosecond)
    }


@@ -3,6 +3,9 @@ package db_test
import (
    "testing"
    "time"

+   "github.com/gabehf/koito/internal/db"
+   "github.com/stretchr/testify/require"
)

func TestListenActivityOptsToTimes(t *testing.T) {
@@ -21,6 +24,11 @@ func eod(t time.Time) time.Time {
    return time.Date(year, month, day, 23, 59, 59, 0, loc)
}

+func TestPeriodUnset(t *testing.T) {
+   var p db.Period
+   require.True(t, p.IsZero())
+}
+
func bod(t time.Time) time.Time {
    year, month, day := t.Date()
    loc := t.Location()
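The new IsZero method lets callers distinguish "period not set" from an explicit period. A minimal sketch of that caller-side defaulting (the periodOrDefault helper is hypothetical; Period and the constants mirror the types in this diff):

```go
package main

import "fmt"

// Period mirrors the db.Period string type from the diff.
type Period string

const (
	PeriodWeek    Period = "week"
	PeriodDefault Period = "day"
)

// IsZero reports whether the period was left unset, matching the
// method added in this change.
func (p Period) IsZero() bool { return p == "" }

// periodOrDefault is a hypothetical caller-side helper: an unset
// period falls back to PeriodDefault.
func periodOrDefault(p Period) Period {
	if p.IsZero() {
		return PeriodDefault
	}
	return p
}

func main() {
	var unset Period
	fmt.Println(periodOrDefault(unset))      // day
	fmt.Println(periodOrDefault(PeriodWeek)) // week
}
```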


@@ -23,32 +23,13 @@ func (d *Psql) GetAlbum(ctx context.Context, opts db.GetAlbumOpts) (*models.Albu
    var err error
    var ret = new(models.Album)

-   if opts.ID != 0 {
-       l.Debug().Msgf("Fetching album from DB with id %d", opts.ID)
-       row, err := d.q.GetRelease(ctx, opts.ID)
-       if err != nil {
-           return nil, fmt.Errorf("GetAlbum: %w", err)
-       }
-       ret.ID = row.ID
-       ret.MbzID = row.MusicBrainzID
-       ret.Title = row.Title
-       ret.Image = row.Image
-       ret.VariousArtists = row.VariousArtists
-       err = json.Unmarshal(row.Artists, &ret.Artists)
-       if err != nil {
-           return nil, fmt.Errorf("GetAlbum: json.Unmarshal: %w", err)
-       }
-   } else if opts.MusicBrainzID != uuid.Nil {
+   if opts.MusicBrainzID != uuid.Nil {
        l.Debug().Msgf("Fetching album from DB with MusicBrainz Release ID %s", opts.MusicBrainzID)
        row, err := d.q.GetReleaseByMbzID(ctx, &opts.MusicBrainzID)
        if err != nil {
            return nil, fmt.Errorf("GetAlbum: %w", err)
        }
-       ret.ID = row.ID
-       ret.MbzID = row.MusicBrainzID
-       ret.Title = row.Title
-       ret.Image = row.Image
-       ret.VariousArtists = row.VariousArtists
+       opts.ID = row.ID
    } else if opts.ArtistID != 0 && opts.Title != "" {
        l.Debug().Msgf("Fetching album from DB with artist_id %d and title %s", opts.ArtistID, opts.Title)
        row, err := d.q.GetReleaseByArtistAndTitle(ctx, repository.GetReleaseByArtistAndTitleParams{
@@ -58,11 +39,7 @@ func (d *Psql) GetAlbum(ctx context.Context, opts db.GetAlbumOpts) (*models.Albu
        if err != nil {
            return nil, fmt.Errorf("GetAlbum: %w", err)
        }
-       ret.ID = row.ID
-       ret.MbzID = row.MusicBrainzID
-       ret.Title = row.Title
-       ret.Image = row.Image
-       ret.VariousArtists = row.VariousArtists
+       opts.ID = row.ID
    } else if opts.ArtistID != 0 && len(opts.Titles) > 0 {
        l.Debug().Msgf("Fetching release group from DB with artist_id %d and titles %v", opts.ArtistID, opts.Titles)
        row, err := d.q.GetReleaseByArtistAndTitles(ctx, repository.GetReleaseByArtistAndTitlesParams{
@@ -72,22 +49,87 @@ func (d *Psql) GetAlbum(ctx context.Context, opts db.GetAlbumOpts) (*models.Albu
        if err != nil {
            return nil, fmt.Errorf("GetAlbum: %w", err)
        }
+       opts.ID = row.ID
+   }
+
+   l.Debug().Msgf("Fetching album from DB with id %d", opts.ID)
+   row, err := d.q.GetRelease(ctx, opts.ID)
+   if err != nil {
+       return nil, fmt.Errorf("GetAlbum: %w", err)
+   }
+   count, err := d.q.CountListensFromRelease(ctx, repository.CountListensFromReleaseParams{
+       ListenedAt: time.Unix(0, 0),
+       ListenedAt_2: time.Now(),
+       ReleaseID: opts.ID,
+   })
+   if err != nil {
+       return nil, fmt.Errorf("GetAlbum: CountListensFromRelease: %w", err)
+   }
+   seconds, err := d.CountTimeListenedToItem(ctx, db.TimeListenedOpts{
+       Timeframe: db.Timeframe{Period: db.PeriodAllTime},
+       AlbumID: opts.ID,
+   })
+   if err != nil {
+       return nil, fmt.Errorf("GetAlbum: CountTimeListenedToItem: %w", err)
+   }
+   firstListen, err := d.q.GetFirstListenFromRelease(ctx, opts.ID)
+   if err != nil && !errors.Is(err, pgx.ErrNoRows) {
+       return nil, fmt.Errorf("GetAlbum: GetFirstListenFromRelease: %w", err)
+   }
+   rank, err := d.q.GetReleaseAllTimeRank(ctx, opts.ID)
+   if err != nil && !errors.Is(err, pgx.ErrNoRows) {
+       return nil, fmt.Errorf("GetAlbum: GetReleaseAllTimeRank: %w", err)
+   }
+   ret.ID = row.ID
+   ret.MbzID = row.MusicBrainzID
+   ret.Title = row.Title
+   ret.Image = row.Image
+   ret.VariousArtists = row.VariousArtists
+   err = json.Unmarshal(row.Artists, &ret.Artists)
+   if err != nil {
+       return nil, fmt.Errorf("GetAlbum: json.Unmarshal: %w", err)
+   }
+   ret.AllTimeRank = rank.Rank
+   ret.ListenCount = count
+   ret.TimeListened = seconds
+   ret.FirstListen = firstListen.ListenedAt.Unix()
+   return ret, nil
+}
+
+func (d *Psql) GetAlbumWithNoMbzIDByTitles(ctx context.Context, artistId int32, titles []string) (*models.Album, error) {
+   l := logger.FromContext(ctx)
+   ret := new(models.Album)
+   if artistId != 0 && len(titles) > 0 {
+       l.Debug().Msgf("GetAlbumWithNoMbzIDByTitles: Fetching release group from DB with artist_id %d and titles %v and no associated MusicBrainz ID", artistId, titles)
+       row, err := d.q.GetReleaseByArtistAndTitlesNoMbzID(ctx, repository.GetReleaseByArtistAndTitlesNoMbzIDParams{
+           ArtistID: artistId,
+           Column1: titles,
+       })
+       if err != nil {
+           return nil, fmt.Errorf("GetAlbum: %w", err)
+       }
        ret.ID = row.ID
        ret.MbzID = row.MusicBrainzID
        ret.Title = row.Title
        ret.Image = row.Image
        ret.VariousArtists = row.VariousArtists
    } else {
-       return nil, errors.New("GetAlbum: insufficient information to get album")
+       return nil, errors.New("GetAlbumWithNoMbzIDByTitles: insufficient information to get album")
    }
    count, err := d.q.CountListensFromRelease(ctx, repository.CountListensFromReleaseParams{
        ListenedAt: time.Unix(0, 0),
        ListenedAt_2: time.Now(),
        ReleaseID: ret.ID,
    })
    if err != nil {
-       return nil, fmt.Errorf("GetAlbum: CountListensFromRelease: %w", err)
+       return nil, fmt.Errorf("GetAlbumWithNoMbzIDByTitles: CountListensFromRelease: %w", err)
    }
    seconds, err := d.CountTimeListenedToItem(ctx, db.TimeListenedOpts{
@@ -95,12 +137,12 @@ func (d *Psql) GetAlbum(ctx context.Context, opts db.GetAlbumOpts) (*models.Albu
        AlbumID: ret.ID,
    })
    if err != nil {
-       return nil, fmt.Errorf("GetAlbum: CountTimeListenedToItem: %w", err)
+       return nil, fmt.Errorf("GetAlbumWithNoMbzIDByTitles: CountTimeListenedToItem: %w", err)
    }
    firstListen, err := d.q.GetFirstListenFromRelease(ctx, ret.ID)
    if err != nil && !errors.Is(err, pgx.ErrNoRows) {
-       return nil, fmt.Errorf("GetAlbum: GetFirstListenFromRelease: %w", err)
+       return nil, fmt.Errorf("GetAlbumWithNoMbzIDByTitles: GetFirstListenFromRelease: %w", err)
    }
    ret.ListenCount = count
@@ -232,6 +274,9 @@ func (d *Psql) UpdateAlbum(ctx context.Context, opts db.UpdateAlbumOpts) error {
        }
    }
    if opts.Image != uuid.Nil {
+       if opts.ImageSrc == "" {
+           return fmt.Errorf("UpdateAlbum: image source must be provided when updating an image")
+       }
        l.Debug().Msgf("Updating release with ID %d with image %s", opts.ID, opts.Image)
        err := qtx.UpdateReleaseImage(ctx, repository.UpdateReleaseImageParams{
            ID: opts.ID,


@@ -20,114 +20,60 @@ import (
// this function sucks because sqlc keeps making new types for rows that are the same
func (d *Psql) GetArtist(ctx context.Context, opts db.GetArtistOpts) (*models.Artist, error) {
    l := logger.FromContext(ctx)
-   if opts.ID != 0 {
-       l.Debug().Msgf("Fetching artist from DB with id %d", opts.ID)
-       row, err := d.q.GetArtist(ctx, opts.ID)
-       if err != nil {
-           return nil, fmt.Errorf("GetArtist: GetArtist by ID: %w", err)
-       }
-       count, err := d.q.CountListensFromArtist(ctx, repository.CountListensFromArtistParams{
-           ListenedAt: time.Unix(0, 0),
-           ListenedAt_2: time.Now(),
-           ArtistID: row.ID,
-       })
-       if err != nil {
-           return nil, fmt.Errorf("GetArtist: CountListensFromArtist: %w", err)
-       }
-       seconds, err := d.CountTimeListenedToItem(ctx, db.TimeListenedOpts{
-           Timeframe: db.Timeframe{Period: db.PeriodAllTime},
-           ArtistID: row.ID,
-       })
-       if err != nil {
-           return nil, fmt.Errorf("GetArtist: CountTimeListenedToItem: %w", err)
-       }
-       firstListen, err := d.q.GetFirstListenFromArtist(ctx, row.ID)
-       if err != nil && !errors.Is(err, pgx.ErrNoRows) {
-           return nil, fmt.Errorf("GetAlbum: GetFirstListenFromArtist: %w", err)
-       }
-       return &models.Artist{
-           ID: row.ID,
-           MbzID: row.MusicBrainzID,
-           Name: row.Name,
-           Aliases: row.Aliases,
-           Image: row.Image,
-           ListenCount: count,
-           TimeListened: seconds,
-           FirstListen: firstListen.ListenedAt.Unix(),
-       }, nil
-   } else if opts.MusicBrainzID != uuid.Nil {
+   if opts.MusicBrainzID != uuid.Nil {
        l.Debug().Msgf("Fetching artist from DB with MusicBrainz ID %s", opts.MusicBrainzID)
        row, err := d.q.GetArtistByMbzID(ctx, &opts.MusicBrainzID)
        if err != nil {
            return nil, fmt.Errorf("GetArtist: GetArtistByMbzID: %w", err)
        }
-       count, err := d.q.CountListensFromArtist(ctx, repository.CountListensFromArtistParams{
-           ListenedAt: time.Unix(0, 0),
-           ListenedAt_2: time.Now(),
-           ArtistID: row.ID,
-       })
-       if err != nil {
-           return nil, fmt.Errorf("GetArtist: CountListensFromArtist: %w", err)
-       }
-       seconds, err := d.CountTimeListenedToItem(ctx, db.TimeListenedOpts{
-           Timeframe: db.Timeframe{Period: db.PeriodAllTime},
-           ArtistID: row.ID,
-       })
-       if err != nil {
-           return nil, fmt.Errorf("GetArtist: CountTimeListenedToItem: %w", err)
-       }
-       firstListen, err := d.q.GetFirstListenFromArtist(ctx, row.ID)
-       if err != nil && !errors.Is(err, pgx.ErrNoRows) {
-           return nil, fmt.Errorf("GetAlbum: GetFirstListenFromArtist: %w", err)
-       }
-       return &models.Artist{
-           ID: row.ID,
-           MbzID: row.MusicBrainzID,
-           Name: row.Name,
-           Aliases: row.Aliases,
-           Image: row.Image,
-           ListenCount: count,
-           TimeListened: seconds,
-           FirstListen: firstListen.ListenedAt.Unix(),
-       }, nil
+       opts.ID = row.ID
    } else if opts.Name != "" {
        l.Debug().Msgf("Fetching artist from DB with name '%s'", opts.Name)
        row, err := d.q.GetArtistByName(ctx, opts.Name)
        if err != nil {
            return nil, fmt.Errorf("GetArtist: GetArtistByName: %w", err)
        }
-       count, err := d.q.CountListensFromArtist(ctx, repository.CountListensFromArtistParams{
-           ListenedAt: time.Unix(0, 0),
-           ListenedAt_2: time.Now(),
-           ArtistID: row.ID,
-       })
-       if err != nil {
-           return nil, fmt.Errorf("GetArtist: CountListensFromArtist: %w", err)
-       }
-       seconds, err := d.CountTimeListenedToItem(ctx, db.TimeListenedOpts{
-           Timeframe: db.Timeframe{Period: db.PeriodAllTime},
-           ArtistID: row.ID,
-       })
-       if err != nil {
-           return nil, fmt.Errorf("GetArtist: CountTimeListenedToItem: %w", err)
-       }
-       firstListen, err := d.q.GetFirstListenFromArtist(ctx, row.ID)
-       if err != nil && !errors.Is(err, pgx.ErrNoRows) {
-           return nil, fmt.Errorf("GetAlbum: GetFirstListenFromArtist: %w", err)
-       }
-       return &models.Artist{
-           ID: row.ID,
-           MbzID: row.MusicBrainzID,
-           Name: row.Name,
-           Aliases: row.Aliases,
-           Image: row.Image,
-           ListenCount: count,
-           TimeListened: seconds,
-           FirstListen: firstListen.ListenedAt.Unix(),
-       }, nil
-   } else {
-       return nil, errors.New("insufficient information to get artist")
+       opts.ID = row.ID
    }
+   l.Debug().Msgf("Fetching artist from DB with id %d", opts.ID)
+   row, err := d.q.GetArtist(ctx, opts.ID)
+   if err != nil {
+       return nil, fmt.Errorf("GetArtist: GetArtist by ID: %w", err)
+   }
+   count, err := d.q.CountListensFromArtist(ctx, repository.CountListensFromArtistParams{
+       ListenedAt: time.Unix(0, 0),
+       ListenedAt_2: time.Now(),
+       ArtistID: row.ID,
+   })
+   if err != nil {
+       return nil, fmt.Errorf("GetArtist: CountListensFromArtist: %w", err)
+   }
+   seconds, err := d.CountTimeListenedToItem(ctx, db.TimeListenedOpts{
+       Timeframe: db.Timeframe{Period: db.PeriodAllTime},
+       ArtistID: row.ID,
+   })
+   if err != nil {
+       return nil, fmt.Errorf("GetArtist: CountTimeListenedToItem: %w", err)
+   }
+   firstListen, err := d.q.GetFirstListenFromArtist(ctx, row.ID)
+   if err != nil && !errors.Is(err, pgx.ErrNoRows) {
+       return nil, fmt.Errorf("GetArtist: GetFirstListenFromArtist: %w", err)
+   }
+   rank, err := d.q.GetArtistAllTimeRank(ctx, opts.ID)
+   if err != nil && !errors.Is(err, pgx.ErrNoRows) {
+       return nil, fmt.Errorf("GetArtist: GetArtistAllTimeRank: %w", err)
+   }
+   return &models.Artist{
+       ID: row.ID,
+       MbzID: row.MusicBrainzID,
+       Name: row.Name,
+       Aliases: row.Aliases,
+       Image: row.Image,
+       ListenCount: count,
+       TimeListened: seconds,
+       AllTimeRank: rank.Rank,
+       FirstListen: firstListen.ListenedAt.Unix(),
+   }, nil
}

// Inserts all unique aliases into the DB with specified source
@@ -264,6 +210,9 @@ func (d *Psql) UpdateArtist(ctx context.Context, opts db.UpdateArtistOpts) error
        }
    }
    if opts.Image != uuid.Nil {
+       if opts.ImageSrc == "" {
+           return fmt.Errorf("UpdateArtist: image source must be provided when updating an image")
+       }
        l.Debug().Msgf("Updating artist with id %d with image %s", opts.ID, opts.Image)
        err = qtx.UpdateArtistImage(ctx, repository.UpdateArtistImageParams{
            ID: opts.ID,


@@ -46,7 +46,7 @@ func TestCountNewTracks(t *testing.T) {
    t1u := t1.Unix()
    t2, _ := time.Parse(time.DateOnly, "2025-12-31")
    t2u := t2.Unix()
-   count, err := store.CountNewTracks(ctx, db.Timeframe{T1u: t1u, T2u: t2u})
+   count, err := store.CountNewTracks(ctx, db.Timeframe{FromUnix: t1u, ToUnix: t2u})
    require.NoError(t, err)
    assert.Equal(t, int64(1), count, "expected tracks count to match inserted data")
@@ -76,7 +76,7 @@ func TestCountNewAlbums(t *testing.T) {
    t1u := t1.Unix()
    t2, _ := time.Parse(time.DateOnly, "2025-12-31")
    t2u := t2.Unix()
-   count, err := store.CountNewAlbums(ctx, db.Timeframe{T1u: t1u, T2u: t2u})
+   count, err := store.CountNewAlbums(ctx, db.Timeframe{FromUnix: t1u, ToUnix: t2u})
    require.NoError(t, err)
    assert.Equal(t, int64(1), count, "expected albums count to match inserted data")
@@ -106,7 +106,7 @@ func TestCountNewArtists(t *testing.T) {
    t1u := t1.Unix()
    t2, _ := time.Parse(time.DateOnly, "2025-12-31")
    t2u := t2.Unix()
-   count, err := store.CountNewArtists(ctx, db.Timeframe{T1u: t1u, T2u: t2u})
+   count, err := store.CountNewArtists(ctx, db.Timeframe{FromUnix: t1u, ToUnix: t2u})
    require.NoError(t, err)
    assert.Equal(t, int64(1), count, "expected artists count to match inserted data")


@@ -72,3 +72,26 @@ func (d *Psql) AlbumsWithoutImages(ctx context.Context, from int32) ([]*models.A
    }
    return albums, nil
}

+// returns nil, nil on no results
+func (d *Psql) ArtistsWithoutImages(ctx context.Context, from int32) ([]*models.Artist, error) {
+   rows, err := d.q.GetArtistsWithoutImages(ctx, repository.GetArtistsWithoutImagesParams{
+       Limit: 20,
+       ID: from,
+   })
+   if errors.Is(err, pgx.ErrNoRows) {
+       return nil, nil
+   } else if err != nil {
+       return nil, fmt.Errorf("ArtistsWithoutImages: %w", err)
+   }
+   ret := make([]*models.Artist, len(rows))
+   for i, row := range rows {
+       ret[i] = &models.Artist{
+           ID: row.ID,
+           Name: row.Name,
+           MbzID: row.MusicBrainzID,
+       }
+   }
+   return ret, nil
+}
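ArtistsWithoutImages takes a `from` cursor and returns at most 20 rows, which suggests keyset (cursor) pagination: the caller passes the last ID it saw to fetch the next page. A minimal in-memory sketch of that caller-side loop, assuming the query returns rows with ID greater than the cursor in ascending order (fetchPage and drain are illustrative stand-ins, not functions from the diff):

```go
package main

import "fmt"

// Artist is a trimmed stand-in for models.Artist.
type Artist struct {
	ID   int32
	Name string
}

// fetchPage stands in for ArtistsWithoutImages: up to limit artists
// with ID > from, in ascending ID order.
func fetchPage(all []Artist, from int32, limit int) []Artist {
	var page []Artist
	for _, a := range all {
		if a.ID > from {
			page = append(page, a)
			if len(page) == limit {
				break
			}
		}
	}
	return page
}

// drain shows the cursor loop: remember the last ID seen and pass it
// back as `from` until an empty page comes back.
func drain(all []Artist, limit int) []Artist {
	var out []Artist
	var from int32
	for {
		page := fetchPage(all, from, limit)
		if len(page) == 0 {
			return out
		}
		out = append(out, page...)
		from = page[len(page)-1].ID
	}
}

func main() {
	all := []Artist{{ID: 1, Name: "a"}, {ID: 2, Name: "b"}, {ID: 3, Name: "c"}, {ID: 4, Name: "d"}, {ID: 5, Name: "e"}}
	fmt.Println(len(drain(all, 2))) // 5
}
```

Unlike OFFSET pagination, an ID cursor stays stable when rows are inserted or deleted between pages.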


@@ -0,0 +1,70 @@
+package psql
+
+import (
+    "context"
+    "errors"
+    "fmt"
+
+    "github.com/gabehf/koito/internal/db"
+    "github.com/gabehf/koito/internal/repository"
+)
+
+func (d *Psql) GetInterest(ctx context.Context, opts db.GetInterestOpts) ([]db.InterestBucket, error) {
+    if opts.Buckets == 0 {
+        return nil, errors.New("GetInterest: bucket count must be provided")
+    }
+    ret := make([]db.InterestBucket, 0)
+    if opts.ArtistID != 0 {
+        resp, err := d.q.GetGroupedListensFromArtist(ctx, repository.GetGroupedListensFromArtistParams{
+            ArtistID: opts.ArtistID,
+            BucketCount: int32(opts.Buckets),
+        })
+        if err != nil {
+            return nil, fmt.Errorf("GetInterest: GetGroupedListensFromArtist: %w", err)
+        }
+        for _, v := range resp {
+            ret = append(ret, db.InterestBucket{
+                BucketStart: v.BucketStart,
+                BucketEnd: v.BucketEnd,
+                ListenCount: v.ListenCount,
+            })
+        }
+        return ret, nil
+    } else if opts.AlbumID != 0 {
+        resp, err := d.q.GetGroupedListensFromRelease(ctx, repository.GetGroupedListensFromReleaseParams{
+            ReleaseID: opts.AlbumID,
+            BucketCount: int32(opts.Buckets),
+        })
+        if err != nil {
+            return nil, fmt.Errorf("GetInterest: GetGroupedListensFromRelease: %w", err)
+        }
+        for _, v := range resp {
+            ret = append(ret, db.InterestBucket{
+                BucketStart: v.BucketStart,
+                BucketEnd: v.BucketEnd,
+                ListenCount: v.ListenCount,
+            })
+        }
+        return ret, nil
+    } else if opts.TrackID != 0 {
+        resp, err := d.q.GetGroupedListensFromTrack(ctx, repository.GetGroupedListensFromTrackParams{
+            ID: opts.TrackID,
+            BucketCount: int32(opts.Buckets),
+        })
+        if err != nil {
+            return nil, fmt.Errorf("GetInterest: GetGroupedListensFromTrack: %w", err)
+        }
+        for _, v := range resp {
+            ret = append(ret, db.InterestBucket{
+                BucketStart: v.BucketStart,
+                BucketEnd: v.BucketEnd,
+                ListenCount: v.ListenCount,
+            })
+        }
+        return ret, nil
+    } else {
+        return nil, errors.New("GetInterest: artist id, album id, or track id must be provided")
+    }
+}


@@ -0,0 +1,112 @@
+package psql_test
+
+import (
+    "context"
+    "testing"
+
+    "github.com/gabehf/koito/internal/db"
+    "github.com/stretchr/testify/assert"
+    "github.com/stretchr/testify/require"
+)
+
+// an llm wrote this because i didn't feel like it. it looks like it works, although
+// it could stand to be more thorough
+func TestGetInterest(t *testing.T) {
+    truncateTestData(t)
+    ctx := context.Background()
+
+    // --- Setup Data ---
+
+    // Insert Artists
+    err := store.Exec(ctx, `
+        INSERT INTO artists (musicbrainz_id)
+        VALUES ('00000000-0000-0000-0000-000000000001'),
+               ('00000000-0000-0000-0000-000000000002')`)
+    require.NoError(t, err)
+
+    // Insert Releases (Albums)
+    err = store.Exec(ctx, `
+        INSERT INTO releases (musicbrainz_id)
+        VALUES ('00000000-0000-0000-0000-000000000011')`)
+    require.NoError(t, err)
+
+    // Insert Tracks (Both on Release 1)
+    err = store.Exec(ctx, `
+        INSERT INTO tracks (musicbrainz_id, release_id)
+        VALUES ('11111111-1111-1111-1111-111111111111', 1),
+               ('22222222-2222-2222-2222-222222222222', 1)`)
+    require.NoError(t, err)
+
+    // Link Artists to Tracks
+    // Artist 1 -> Track 1
+    // Artist 2 -> Track 2
+    err = store.Exec(ctx, `
+        INSERT INTO artist_tracks (artist_id, track_id)
+        VALUES (1, 1), (2, 2)`)
+    require.NoError(t, err)
+
+    // Insert Listens
+    // Track 1 (Artist 1, Release 1): 3 Listens
+    // Track 2 (Artist 2, Release 1): 2 Listens
+    err = store.Exec(ctx, `
+        INSERT INTO listens (user_id, track_id, listened_at) VALUES
+        (1, 1, NOW() - INTERVAL '1 hour'),
+        (1, 1, NOW() - INTERVAL '2 hours'),
+        (1, 1, NOW() - INTERVAL '3 hours'),
+        (1, 2, NOW() - INTERVAL '1 hour'),
+        (1, 2, NOW() - INTERVAL '2 hours')
+    `)
+    require.NoError(t, err)
+
+    // --- Test Validation ---
+    t.Run("Validation", func(t *testing.T) {
+        // Error: Missing Buckets
+        _, err := store.GetInterest(ctx, db.GetInterestOpts{ArtistID: 1})
+        assert.Error(t, err)
+        assert.Contains(t, err.Error(), "bucket count must be provided")
+
+        // Error: Missing ID
+        _, err = store.GetInterest(ctx, db.GetInterestOpts{Buckets: 10})
+        assert.Error(t, err)
+        assert.Contains(t, err.Error(), "must be provided")
+    })
+
+    // --- Test Data Retrieval ---
+    // Note: We use Buckets: 1 to ensure all listens are aggregated into a single result
+    // for easier assertion, avoiding complex date/time math in the test.
+
+    t.Run("Artist Interest", func(t *testing.T) {
+        // Artist 1 should have 3 listens (from Track 1)
+        buckets, err := store.GetInterest(ctx, db.GetInterestOpts{
+            ArtistID: 1,
+            Buckets: 1,
+        })
+        require.NoError(t, err)
+        require.Len(t, buckets, 1)
+        assert.EqualValues(t, 3, buckets[0].ListenCount, "Artist 1 should have 3 listens")
+    })
+
+    t.Run("Album Interest", func(t *testing.T) {
+        // Album 1 contains Track 1 (3 listens) and Track 2 (2 listens) = 5 Total
+        buckets, err := store.GetInterest(ctx, db.GetInterestOpts{
+            AlbumID: 1,
+            Buckets: 1,
+        })
+        require.NoError(t, err)
+        require.Len(t, buckets, 1)
+        assert.EqualValues(t, 5, buckets[0].ListenCount, "Album 1 should have 5 listens total")
+    })
+
+    t.Run("Track Interest", func(t *testing.T) {
+        // Track 2 should have 2 listens
+        buckets, err := store.GetInterest(ctx, db.GetInterestOpts{
+            TrackID: 2,
+            Buckets: 1,
+        })
+        require.NoError(t, err)
+        require.Len(t, buckets, 1)
+        assert.EqualValues(t, 2, buckets[0].ListenCount, "Track 2 should have 2 listens")
+    })
+}


@@ -11,38 +11,20 @@ import (
    "github.com/gabehf/koito/internal/logger"
    "github.com/gabehf/koito/internal/models"
    "github.com/gabehf/koito/internal/repository"
-   "github.com/gabehf/koito/internal/utils"
)

func (d *Psql) GetListensPaginated(ctx context.Context, opts db.GetItemsOpts) (*db.PaginatedResponse[*models.Listen], error) {
    l := logger.FromContext(ctx)
    offset := (opts.Page - 1) * opts.Limit
-   var t1 time.Time
-   var t2 time.Time
-   if opts.From != 0 && opts.To != 0 {
-       t1 = time.Unix(int64(opts.From), 0)
-       t2 = time.Unix(int64(opts.To), 0)
-   } else {
-       t1R, t2R, err := utils.DateRange(opts.Week, opts.Month, opts.Year)
-       if err != nil {
-           return nil, fmt.Errorf("GetListensPaginated: %w", err)
-       }
-       t1 = t1R
-       t2 = t2R
-       if opts.Month == 0 && opts.Year == 0 {
-           // use period, not date range
-           t2 = time.Now()
-           t1 = db.StartTimeFromPeriod(opts.Period)
-       }
-   }
+   t1, t2 := db.TimeframeToTimeRange(opts.Timeframe)
    if opts.Limit == 0 {
        opts.Limit = DefaultItemsPerPage
    }
    var listens []*models.Listen
    var count int64
    if opts.TrackID > 0 {
-       l.Debug().Msgf("Fetching %d listens with period %s on page %d from range %v to %v",
-           opts.Limit, opts.Period, opts.Page, t1.Format("Jan 02, 2006"), t2.Format("Jan 02, 2006"))
+       l.Debug().Msgf("Fetching %d listens on page %d from range %v to %v",
+           opts.Limit, opts.Page, t1.Format("Jan 02, 2006"), t2.Format("Jan 02, 2006"))
        rows, err := d.q.GetLastListensFromTrackPaginated(ctx, repository.GetLastListensFromTrackPaginatedParams{
            ListenedAt: t1,
            ListenedAt_2: t2,
@@ -77,8 +59,8 @@ func (d *Psql) GetListensPaginated(ctx context.Context, opts db.GetItemsOpts) (*
            return nil, fmt.Errorf("GetListensPaginated: CountListensFromTrack: %w", err)
        }
    } else if opts.AlbumID > 0 {
-       l.Debug().Msgf("Fetching %d listens with period %s on page %d from range %v to %v",
-           opts.Limit, opts.Period, opts.Page, t1.Format("Jan 02, 2006"), t2.Format("Jan 02, 2006"))
+       l.Debug().Msgf("Fetching %d listens on page %d from range %v to %v",
+           opts.Limit, opts.Page, t1.Format("Jan 02, 2006"), t2.Format("Jan 02, 2006"))
        rows, err := d.q.GetLastListensFromReleasePaginated(ctx, repository.GetLastListensFromReleasePaginatedParams{
            ListenedAt: t1,
            ListenedAt_2: t2,
@@ -113,8 +95,8 @@ func (d *Psql) GetListensPaginated(ctx context.Context, opts db.GetItemsOpts) (*
            return nil, fmt.Errorf("GetListensPaginated: CountListensFromRelease: %w", err)
        }
    } else if opts.ArtistID > 0 {
-       l.Debug().Msgf("Fetching %d listens with period %s on page %d from range %v to %v",
-           opts.Limit, opts.Period, opts.Page, t1.Format("Jan 02, 2006"), t2.Format("Jan 02, 2006"))
+       l.Debug().Msgf("Fetching %d listens on page %d from range %v to %v",
+           opts.Limit, opts.Page, t1.Format("Jan 02, 2006"), t2.Format("Jan 02, 2006"))
        rows, err := d.q.GetLastListensFromArtistPaginated(ctx, repository.GetLastListensFromArtistPaginatedParams{
            ListenedAt: t1,
            ListenedAt_2: t2,
@@ -149,8 +131,8 @@ func (d *Psql) GetListensPaginated(ctx context.Context, opts db.GetItemsOpts) (*
            return nil, fmt.Errorf("GetListensPaginated: CountListensFromArtist: %w", err)
        }
    } else {
-       l.Debug().Msgf("Fetching %d listens with period %s on page %d from range %v to %v",
-           opts.Limit, opts.Period, opts.Page, t1.Format("Jan 02, 2006"), t2.Format("Jan 02, 2006"))
+       l.Debug().Msgf("Fetching %d listens on page %d from range %v to %v",
+           opts.Limit, opts.Page, t1.Format("Jan 02, 2006"), t2.Format("Jan 02, 2006"))
        rows, err := d.q.GetLastListensPaginated(ctx, repository.GetLastListensPaginatedParams{
            ListenedAt: t1,
            ListenedAt_2: t2,


@@ -23,12 +23,12 @@ func (d *Psql) GetListenActivity(ctx context.Context, opts db.ListenActivityOpts
var listenActivity []db.ListenActivityItem
if opts.AlbumID > 0 {
l.Debug().Msgf("Fetching listen activity for %d %s(s) from %v to %v for release group %d",
- opts.Range, opts.Step, t1.Format("Jan 02, 2006 15:04:05"), t2.Format("Jan 02, 2006 15:04:05"), opts.AlbumID)
+ opts.Range, opts.Step, t1.Format("Jan 02, 2006 15:04:05 MST"), t2.Format("Jan 02, 2006 15:04:05 MST"), opts.AlbumID)
rows, err := d.q.ListenActivityForRelease(ctx, repository.ListenActivityForReleaseParams{
- Column1: t1,
- Column2: t2,
- Column3: stepToInterval(opts.Step),
+ Column1: opts.Timezone.String(),
+ ListenedAt: t1,
+ ListenedAt_2: t2,
ReleaseID: opts.AlbumID,
})
if err != nil {
return nil, fmt.Errorf("GetListenActivity: ListenActivityForRelease: %w", err)
@@ -36,7 +36,7 @@ func (d *Psql) GetListenActivity(ctx context.Context, opts db.ListenActivityOpts
listenActivity = make([]db.ListenActivityItem, len(rows))
for i, row := range rows {
t := db.ListenActivityItem{
- Start: row.BucketStart,
+ Start: row.Day.Time,
Listens: row.ListenCount,
}
listenActivity[i] = t
@@ -44,12 +44,12 @@ func (d *Psql) GetListenActivity(ctx context.Context, opts db.ListenActivityOpts
l.Debug().Msgf("Database responded with %d steps", len(rows))
} else if opts.ArtistID > 0 {
l.Debug().Msgf("Fetching listen activity for %d %s(s) from %v to %v for artist %d",
- opts.Range, opts.Step, t1.Format("Jan 02, 2006 15:04:05"), t2.Format("Jan 02, 2006 15:04:05"), opts.ArtistID)
+ opts.Range, opts.Step, t1.Format("Jan 02, 2006 15:04:05 MST"), t2.Format("Jan 02, 2006 15:04:05 MST"), opts.ArtistID)
rows, err := d.q.ListenActivityForArtist(ctx, repository.ListenActivityForArtistParams{
- Column1: t1,
- Column2: t2,
- Column3: stepToInterval(opts.Step),
+ Column1: opts.Timezone.String(),
+ ListenedAt: t1,
+ ListenedAt_2: t2,
ArtistID: opts.ArtistID,
})
if err != nil {
return nil, fmt.Errorf("GetListenActivity: ListenActivityForArtist: %w", err)
@@ -57,7 +57,7 @@ func (d *Psql) GetListenActivity(ctx context.Context, opts db.ListenActivityOpts
listenActivity = make([]db.ListenActivityItem, len(rows))
for i, row := range rows {
t := db.ListenActivityItem{
- Start: row.BucketStart,
+ Start: row.Day.Time,
Listens: row.ListenCount,
}
listenActivity[i] = t
@@ -65,12 +65,12 @@ func (d *Psql) GetListenActivity(ctx context.Context, opts db.ListenActivityOpts
l.Debug().Msgf("Database responded with %d steps", len(rows))
} else if opts.TrackID > 0 {
l.Debug().Msgf("Fetching listen activity for %d %s(s) from %v to %v for track %d",
- opts.Range, opts.Step, t1.Format("Jan 02, 2006 15:04:05"), t2.Format("Jan 02, 2006 15:04:05"), opts.TrackID)
+ opts.Range, opts.Step, t1.Format("Jan 02, 2006 15:04:05 MST"), t2.Format("Jan 02, 2006 15:04:05 MST"), opts.TrackID)
rows, err := d.q.ListenActivityForTrack(ctx, repository.ListenActivityForTrackParams{
- Column1: t1,
- Column2: t2,
- Column3: stepToInterval(opts.Step),
+ Column1: opts.Timezone.String(),
+ ListenedAt: t1,
+ ListenedAt_2: t2,
ID: opts.TrackID,
})
if err != nil {
return nil, fmt.Errorf("GetListenActivity: ListenActivityForTrack: %w", err)
@@ -78,7 +78,7 @@ func (d *Psql) GetListenActivity(ctx context.Context, opts db.ListenActivityOpts
listenActivity = make([]db.ListenActivityItem, len(rows))
for i, row := range rows {
t := db.ListenActivityItem{
- Start: row.BucketStart,
+ Start: row.Day.Time,
Listens: row.ListenCount,
}
listenActivity[i] = t
@@ -86,11 +86,11 @@ func (d *Psql) GetListenActivity(ctx context.Context, opts db.ListenActivityOpts
l.Debug().Msgf("Database responded with %d steps", len(rows))
} else {
l.Debug().Msgf("Fetching listen activity for %d %s(s) from %v to %v",
- opts.Range, opts.Step, t1.Format("Jan 02, 2006 15:04:05"), t2.Format("Jan 02, 2006 15:04:05"))
+ opts.Range, opts.Step, t1.Format("Jan 02, 2006 15:04:05 MST"), t2.Format("Jan 02, 2006 15:04:05 MST"))
rows, err := d.q.ListenActivity(ctx, repository.ListenActivityParams{
- Column1: t1,
- Column2: t2,
- Column3: stepToInterval(opts.Step),
+ Column1: opts.Timezone.String(),
+ ListenedAt: t1,
+ ListenedAt_2: t2,
})
if err != nil {
return nil, fmt.Errorf("GetListenActivity: ListenActivity: %w", err)
@@ -98,7 +98,7 @@ func (d *Psql) GetListenActivity(ctx context.Context, opts db.ListenActivityOpts
listenActivity = make([]db.ListenActivityItem, len(rows))
for i, row := range rows {
t := db.ListenActivityItem{
- Start: row.BucketStart,
+ Start: row.Day.Time,
Listens: row.ListenCount,
}
listenActivity[i] = t


@@ -22,55 +22,55 @@ func TestListenActivity(t *testing.T) {
truncateTestData(t)
err := store.Exec(context.Background(),
`INSERT INTO artists (musicbrainz_id)
VALUES ('00000000-0000-0000-0000-000000000001'),
('00000000-0000-0000-0000-000000000002')`)
require.NoError(t, err)
// Move artist names into artist_aliases
err = store.Exec(context.Background(),
`INSERT INTO artist_aliases (artist_id, alias, source, is_primary)
VALUES (1, 'Artist One', 'Testing', true),
(2, 'Artist Two', 'Testing', true)`)
require.NoError(t, err)
// Insert release groups
err = store.Exec(context.Background(),
`INSERT INTO releases (musicbrainz_id)
VALUES ('00000000-0000-0000-0000-000000000011'),
('00000000-0000-0000-0000-000000000022')`)
require.NoError(t, err)
// Move release titles into release_aliases
err = store.Exec(context.Background(),
`INSERT INTO release_aliases (release_id, alias, source, is_primary)
VALUES (1, 'Release One', 'Testing', true),
(2, 'Release Two', 'Testing', true)`)
require.NoError(t, err)
// Insert tracks
err = store.Exec(context.Background(),
`INSERT INTO tracks (musicbrainz_id, release_id)
VALUES ('11111111-1111-1111-1111-111111111111', 1),
('22222222-2222-2222-2222-222222222222', 2)`)
require.NoError(t, err)
// Move track titles into track_aliases
err = store.Exec(context.Background(),
`INSERT INTO track_aliases (track_id, alias, source, is_primary)
VALUES (1, 'Track One', 'Testing', true),
(2, 'Track Two', 'Testing', true)`)
require.NoError(t, err)
// Associate tracks with artists
err = store.Exec(context.Background(),
`INSERT INTO artist_tracks (artist_id, track_id)
VALUES (1, 1), (2, 2)`)
require.NoError(t, err)
// Insert listens
err = store.Exec(context.Background(),
`INSERT INTO listens (user_id, track_id, listened_at)
VALUES (1, 1, NOW() - INTERVAL '1 day'),
(1, 1, NOW() - INTERVAL '2 days'),
(1, 1, NOW() - INTERVAL '1 week 1 day'),
@@ -88,33 +88,35 @@ func TestListenActivity(t *testing.T) {
// Test for opts.Step = db.StepDay
activity, err := store.GetListenActivity(ctx, db.ListenActivityOpts{Step: db.StepDay})
require.NoError(t, err)
- require.Len(t, activity, db.DefaultRange)
- assert.Equal(t, []int64{0, 0, 0, 2, 0, 0, 0, 0, 0, 2, 2, 0}, flattenListenCounts(activity))
+ require.Len(t, activity, 3)
+ assert.Equal(t, []int64{2, 2, 2}, flattenListenCounts(activity))
// Truncate listens table and insert specific dates for testing opts.Step = db.StepMonth
err = store.Exec(context.Background(), `TRUNCATE TABLE listens`)
require.NoError(t, err)
err = store.Exec(context.Background(),
`INSERT INTO listens (user_id, track_id, listened_at)
- VALUES (1, 1, NOW() - INTERVAL '1 month'),
- (1, 1, NOW() - INTERVAL '2 months'),
- (1, 1, NOW() - INTERVAL '3 months'),
- (1, 2, NOW() - INTERVAL '1 month'),
- (1, 2, NOW() - INTERVAL '2 months')`)
+ VALUES (1, 1, NOW() - INTERVAL '1 month 1 day'),
+ (1, 1, NOW() - INTERVAL '2 months 1 day'),
+ (1, 1, NOW() - INTERVAL '3 months 1 day'),
+ (1, 2, NOW() - INTERVAL '1 month 1 day'),
+ (1, 2, NOW() - INTERVAL '1 second'),
+ (1, 2, NOW() - INTERVAL '2 seconds'),
+ (1, 2, NOW() - INTERVAL '2 months 1 day')`)
require.NoError(t, err)
activity, err = store.GetListenActivity(ctx, db.ListenActivityOpts{Step: db.StepMonth, Range: 8})
require.NoError(t, err)
- require.Len(t, activity, 8)
- assert.Equal(t, []int64{0, 0, 0, 0, 1, 2, 2, 0}, flattenListenCounts(activity))
+ require.Len(t, activity, 4)
+ assert.Equal(t, []int64{1, 2, 2, 2}, flattenListenCounts(activity))
// Truncate listens table and insert specific dates for testing opts.Step = db.StepYear
err = store.Exec(context.Background(), `TRUNCATE TABLE listens RESTART IDENTITY`)
require.NoError(t, err)
err = store.Exec(context.Background(),
`INSERT INTO listens (user_id, track_id, listened_at)
VALUES (1, 1, NOW() - INTERVAL '1 year'),
(1, 1, NOW() - INTERVAL '2 years'),
(1, 2, NOW() - INTERVAL '1 year'),
@@ -123,8 +125,8 @@ func TestListenActivity(t *testing.T) {
activity, err = store.GetListenActivity(ctx, db.ListenActivityOpts{Step: db.StepYear})
require.NoError(t, err)
- require.Len(t, activity, db.DefaultRange)
- assert.Equal(t, []int64{0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 2, 0}, flattenListenCounts(activity))
+ require.Len(t, activity, 3)
+ assert.Equal(t, []int64{1, 1, 2}, flattenListenCounts(activity))
// Truncate and insert data for a specific month/year
err = store.Exec(context.Background(), `TRUNCATE TABLE listens RESTART IDENTITY`)
require.NoError(t, err)
@@ -141,10 +143,10 @@ func TestListenActivity(t *testing.T) {
Year: 2024,
})
require.NoError(t, err)
- require.Len(t, activity, 31) // number of days in march
+ require.Len(t, activity, 2) // number of days in march
t.Log(activity)
- assert.EqualValues(t, 1, activity[9].Listens)
- assert.EqualValues(t, 1, activity[19].Listens)
+ assert.EqualValues(t, 1, activity[0].Listens)
+ assert.EqualValues(t, 1, activity[1].Listens)
// Truncate and insert listens associated with two different albums
err = store.Exec(context.Background(), `TRUNCATE TABLE listens RESTART IDENTITY`)
@@ -161,53 +163,29 @@ func TestListenActivity(t *testing.T) {
AlbumID: 1, // Track 1 only
})
require.NoError(t, err)
- require.Len(t, activity, db.DefaultRange)
- assert.Equal(t, []int64{0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0}, flattenListenCounts(activity))
+ require.Len(t, activity, 2)
+ assert.Equal(t, []int64{1, 1}, flattenListenCounts(activity))
activity, err = store.GetListenActivity(ctx, db.ListenActivityOpts{
Step: db.StepDay,
TrackID: 1, // Track 1 only
})
require.NoError(t, err)
- require.Len(t, activity, db.DefaultRange)
- assert.Equal(t, []int64{0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0}, flattenListenCounts(activity))
+ require.Len(t, activity, 2)
+ assert.Equal(t, []int64{1, 1}, flattenListenCounts(activity))
activity, err = store.GetListenActivity(ctx, db.ListenActivityOpts{
Step: db.StepDay,
ArtistID: 2, // Should only include listens to Track 2
})
require.NoError(t, err)
- require.Len(t, activity, db.DefaultRange)
- assert.Equal(t, []int64{0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0}, flattenListenCounts(activity))
+ require.Len(t, activity, 1)
+ assert.Equal(t, []int64{1}, flattenListenCounts(activity))
// month without year is disallowed
_, err = store.GetListenActivity(ctx, db.ListenActivityOpts{
Step: db.StepDay,
Month: 5,
})
- require.Error(t, err)
+ assert.Error(t, err)
- // invalid options
- _, err = store.GetListenActivity(ctx, db.ListenActivityOpts{
- Year: -10,
- })
- require.Error(t, err)
- _, err = store.GetListenActivity(ctx, db.ListenActivityOpts{
- Year: 2025,
- Month: -10,
- })
- require.Error(t, err)
- _, err = store.GetListenActivity(ctx, db.ListenActivityOpts{
- Range: -1,
- })
- require.Error(t, err)
- _, err = store.GetListenActivity(ctx, db.ListenActivityOpts{
- AlbumID: -1,
- })
- require.Error(t, err)
- _, err = store.GetListenActivity(ctx, db.ListenActivityOpts{
- ArtistID: -1,
- })
- require.Error(t, err)
}


@@ -14,49 +14,49 @@ func testDataForListens(t *testing.T) {
truncateTestData(t)
// Insert artists
err := store.Exec(context.Background(),
`INSERT INTO artists (musicbrainz_id)
VALUES ('00000000-0000-0000-0000-000000000001'),
('00000000-0000-0000-0000-000000000002')`)
require.NoError(t, err)
// Insert artist aliases
err = store.Exec(context.Background(),
`INSERT INTO artist_aliases (artist_id, alias, source, is_primary)
VALUES (1, 'Artist One', 'Testing', true),
(2, 'Artist Two', 'Testing', true)`)
require.NoError(t, err)
// Insert release groups
err = store.Exec(context.Background(),
`INSERT INTO releases (musicbrainz_id)
VALUES ('00000000-0000-0000-0000-000000000011'),
('00000000-0000-0000-0000-000000000022')`)
require.NoError(t, err)
// Insert release aliases
err = store.Exec(context.Background(),
`INSERT INTO release_aliases (release_id, alias, source, is_primary)
VALUES (1, 'Release One', 'Testing', true),
(2, 'Release Two', 'Testing', true)`)
require.NoError(t, err)
// Insert tracks
err = store.Exec(context.Background(),
`INSERT INTO tracks (musicbrainz_id, release_id)
VALUES ('11111111-1111-1111-1111-111111111111', 1),
('22222222-2222-2222-2222-222222222222', 2)`)
require.NoError(t, err)
// Insert track aliases
err = store.Exec(context.Background(),
`INSERT INTO track_aliases (track_id, alias, source, is_primary)
VALUES (1, 'Track One', 'Testing', true),
(2, 'Track Two', 'Testing', true)`)
require.NoError(t, err)
// Insert artist track associations
err = store.Exec(context.Background(),
`INSERT INTO artist_tracks (track_id, artist_id)
VALUES (1, 1),
(2, 2)`)
require.NoError(t, err)
@@ -67,7 +67,7 @@ func TestGetListens(t *testing.T) {
ctx := context.Background()
// Test valid
- resp, err := store.GetListensPaginated(ctx, db.GetItemsOpts{Period: db.PeriodAllTime})
+ resp, err := store.GetListensPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodAllTime}})
require.NoError(t, err)
require.Len(t, resp.Items, 10)
assert.Equal(t, int64(10), resp.TotalCount)
@@ -78,7 +78,7 @@ func TestGetListens(t *testing.T) {
assert.Equal(t, "Artist Three", resp.Items[1].Track.Artists[0].Name)
// Test pagination
- resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Limit: 1, Page: 2, Period: db.PeriodAllTime})
+ resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Limit: 1, Page: 2, Timeframe: db.Timeframe{Period: db.PeriodAllTime}})
require.NoError(t, err)
require.Len(t, resp.Items, 1)
require.Len(t, resp.Items[0].Track.Artists, 1)
@@ -89,7 +89,7 @@ func TestGetListens(t *testing.T) {
assert.Equal(t, "Artist Three", resp.Items[0].Track.Artists[0].Name)
// Test page out of range
- resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Limit: 10, Page: 10, Period: db.PeriodAllTime})
+ resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Limit: 10, Page: 10, Timeframe: db.Timeframe{Period: db.PeriodAllTime}})
require.NoError(t, err)
assert.Empty(t, resp.Items)
assert.False(t, resp.HasNextPage)
@@ -102,7 +102,7 @@ func TestGetListens(t *testing.T) {
assert.Error(t, err)
// Test specify period
- resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Period: db.PeriodDay})
+ resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodDay}})
require.NoError(t, err)
require.Len(t, resp.Items, 0) // empty
assert.Equal(t, int64(0), resp.TotalCount)
@@ -112,38 +112,38 @@ func TestGetListens(t *testing.T) {
require.Len(t, resp.Items, 0) // empty
assert.Equal(t, int64(0), resp.TotalCount)
- resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Period: db.PeriodWeek})
+ resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodWeek}})
require.NoError(t, err)
require.Len(t, resp.Items, 1)
assert.Equal(t, int64(1), resp.TotalCount)
- resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Period: db.PeriodMonth})
+ resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodMonth}})
require.NoError(t, err)
require.Len(t, resp.Items, 3)
assert.Equal(t, int64(3), resp.TotalCount)
- resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Period: db.PeriodYear})
+ resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodYear}})
require.NoError(t, err)
require.Len(t, resp.Items, 6)
assert.Equal(t, int64(6), resp.TotalCount)
// Test filter by artists, releases, and tracks
- resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Period: db.PeriodAllTime, ArtistID: 1})
+ resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodAllTime}, ArtistID: 1})
require.NoError(t, err)
require.Len(t, resp.Items, 4)
assert.Equal(t, int64(4), resp.TotalCount)
- resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Period: db.PeriodAllTime, AlbumID: 2})
+ resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodAllTime}, AlbumID: 2})
require.NoError(t, err)
require.Len(t, resp.Items, 3)
assert.Equal(t, int64(3), resp.TotalCount)
- resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Period: db.PeriodAllTime, TrackID: 3})
+ resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodAllTime}, TrackID: 3})
require.NoError(t, err)
require.Len(t, resp.Items, 2)
assert.Equal(t, int64(2), resp.TotalCount)
// when both artistID and albumID are specified, artist id is ignored
- resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Period: db.PeriodAllTime, AlbumID: 2, ArtistID: 1})
+ resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodAllTime}, AlbumID: 2, ArtistID: 1})
require.NoError(t, err)
require.Len(t, resp.Items, 3)
assert.Equal(t, int64(3), resp.TotalCount)
@@ -152,20 +152,16 @@ func TestGetListens(t *testing.T) {
testDataAbsoluteListenTimes(t)
- resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Year: 2023})
+ resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Year: 2023}})
require.NoError(t, err)
require.Len(t, resp.Items, 4)
assert.Equal(t, int64(4), resp.TotalCount)
- resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Month: 6, Year: 2024})
+ resp, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Month: 6, Year: 2024}})
require.NoError(t, err)
require.Len(t, resp.Items, 3)
assert.Equal(t, int64(3), resp.TotalCount)
- // invalid, year required with month
- _, err = store.GetListensPaginated(ctx, db.GetItemsOpts{Month: 10})
- require.Error(t, err)
}
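The mechanical change running through these tests is that the `Period`/`Week`/`Month`/`Year` fields moved off `GetItemsOpts` into a nested `db.Timeframe` value. A hypothetical mirror of that pattern, including the "month requires year" rule the tests exercise (all type and method names here are illustrative, not the project's actual definitions):

```go
package main

import "fmt"

// Timeframe groups the time-window fields that previously sat directly
// on the options struct (field names mirror the diff, but this is a sketch).
type Timeframe struct {
	Period string
	Week   int
	Month  int
	Year   int
}

// Validate enforces the rule the tests exercise: a month filter is
// ambiguous without a year.
func (tf Timeframe) Validate() error {
	if tf.Month != 0 && tf.Year == 0 {
		return fmt.Errorf("timeframe: month requires year")
	}
	return nil
}

// GetItemsOpts embeds the window instead of flattening its fields,
// so every paginated query shares one validation path.
type GetItemsOpts struct {
	Timeframe Timeframe
	Limit     int
	Page      int
}

func main() {
	opts := GetItemsOpts{Timeframe: Timeframe{Month: 6, Year: 2024}, Limit: 10, Page: 1}
	fmt.Println(opts.Timeframe.Validate())       // <nil>
	fmt.Println(Timeframe{Month: 10}.Validate()) // error: month requires year
}
```

Centralizing the window in one struct is what lets both the listens and top-items queries reuse a single `TimeframeToTimeRange`-style conversion.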
func TestSaveListen(t *testing.T) {


@@ -52,7 +52,7 @@ func (d *Psql) MergeTracks(ctx context.Context, fromId, toId int32) error {
}
err = qtx.CleanOrphanedEntries(ctx)
if err != nil {
- l.Err(err).Msg("Failed to clean orphaned entries")
+ l.Err(err).Msg("MergeTracks: Failed to clean orphaned entries")
return err
}
return tx.Commit(ctx)


@@ -12,27 +12,27 @@ func setupTestDataForMerge(t *testing.T) {
truncateTestData(t)
// Insert artists
err := store.Exec(context.Background(),
`INSERT INTO artists (musicbrainz_id, image, image_source)
VALUES ('00000000-0000-0000-0000-000000000001', '10000000-0000-0000-0000-000000000000', 'source.com'),
('00000000-0000-0000-0000-000000000002', NULL, NULL)`)
require.NoError(t, err)
err = store.Exec(context.Background(),
`INSERT INTO artist_aliases (artist_id, alias, source, is_primary)
VALUES (1, 'Artist One', 'Testing', true),
(2, 'Artist Two', 'Testing', true)`)
require.NoError(t, err)
// Insert albums
err = store.Exec(context.Background(),
`INSERT INTO releases (musicbrainz_id, image, image_source)
VALUES ('11111111-1111-1111-1111-111111111111', '20000000-0000-0000-0000-000000000000', 'source.com'),
('22222222-2222-2222-2222-222222222222', NULL, NULL),
(NULL, NULL, NULL)`)
require.NoError(t, err)
err = store.Exec(context.Background(),
`INSERT INTO release_aliases (release_id, alias, source, is_primary)
VALUES (1, 'Album One', 'Testing', true),
(2, 'Album Two', 'Testing', true),
(3, 'Album Three', 'Testing', true)`)
@@ -40,7 +40,7 @@ func setupTestDataForMerge(t *testing.T) {
// Insert tracks
err = store.Exec(context.Background(),
`INSERT INTO tracks (musicbrainz_id, release_id)
VALUES ('33333333-3333-3333-3333-333333333333', 1),
('44444444-4444-4444-4444-444444444444', 2),
('55555555-5555-5555-5555-555555555555', 1),
@@ -48,7 +48,7 @@ func setupTestDataForMerge(t *testing.T) {
require.NoError(t, err)
err = store.Exec(context.Background(),
`INSERT INTO track_aliases (track_id, alias, source, is_primary)
VALUES (1, 'Track One', 'Testing', true),
(2, 'Track Two', 'Testing', true),
(3, 'Track Three', 'Testing', true),
@@ -57,18 +57,18 @@ func setupTestDataForMerge(t *testing.T) {
// Associate artists with albums and tracks
err = store.Exec(context.Background(),
`INSERT INTO artist_releases (artist_id, release_id)
VALUES (1, 1), (2, 2), (1, 3)`)
require.NoError(t, err)
err = store.Exec(context.Background(),
`INSERT INTO artist_tracks (artist_id, track_id)
VALUES (1, 1), (2, 2), (1, 3), (1, 4)`)
require.NoError(t, err)
// Insert listens
err = store.Exec(context.Background(),
`INSERT INTO listens (user_id, track_id, listened_at)
VALUES (1, 1, NOW() - INTERVAL '1 day'),
(1, 2, NOW() - INTERVAL '2 days'),
(1, 3, NOW() - INTERVAL '3 days'),
@@ -90,14 +90,14 @@ func TestMergeTracks(t *testing.T) {
require.NoError(t, err)
assert.Equal(t, 2, count, "expected all listens to be merged into Track 2")
- // Verify artist is associated with album
+ // Verify old artist is not associated with album
exists, err := store.RowExists(ctx, `
SELECT EXISTS (
SELECT 1 FROM artist_releases
WHERE release_id = $1 AND artist_id = $2
)`, 2, 1)
require.NoError(t, err)
- assert.True(t, exists, "expected old artist to be associated with album")
+ assert.False(t, exists)
truncateTestData(t)
}

```diff
@@ -4,41 +4,27 @@ import (
 	"context"
 	"encoding/json"
 	"fmt"
-	"time"

 	"github.com/gabehf/koito/internal/db"
 	"github.com/gabehf/koito/internal/logger"
 	"github.com/gabehf/koito/internal/models"
 	"github.com/gabehf/koito/internal/repository"
-	"github.com/gabehf/koito/internal/utils"
 )

-func (d *Psql) GetTopAlbumsPaginated(ctx context.Context, opts db.GetItemsOpts) (*db.PaginatedResponse[*models.Album], error) {
+func (d *Psql) GetTopAlbumsPaginated(ctx context.Context, opts db.GetItemsOpts) (*db.PaginatedResponse[db.RankedItem[*models.Album]], error) {
 	l := logger.FromContext(ctx)
 	offset := (opts.Page - 1) * opts.Limit
-	t1, t2, err := utils.DateRange(opts.Week, opts.Month, opts.Year)
-	if err != nil {
-		return nil, fmt.Errorf("GetTopAlbumsPaginated: %w", err)
-	}
-	if opts.Month == 0 && opts.Year == 0 {
-		// use period, not date range
-		t2 = time.Now()
-		t1 = db.StartTimeFromPeriod(opts.Period)
-	}
-	if opts.From != 0 || opts.To != 0 {
-		t1 = time.Unix(opts.From, 0)
-		t2 = time.Unix(opts.To, 0)
-	}
+	t1, t2 := db.TimeframeToTimeRange(opts.Timeframe)
 	if opts.Limit == 0 {
 		opts.Limit = DefaultItemsPerPage
 	}
-	var rgs []*models.Album
+	var rgs []db.RankedItem[*models.Album]
 	var count int64
 	if opts.ArtistID != 0 {
-		l.Debug().Msgf("Fetching top %d albums from artist id %d with period %s on page %d from range %v to %v",
-			opts.Limit, opts.ArtistID, opts.Period, opts.Page, t1.Format("Jan 02, 2006"), t2.Format("Jan 02, 2006"))
+		l.Debug().Msgf("Fetching top %d albums from artist id %d on page %d from range %v to %v",
+			opts.Limit, opts.ArtistID, opts.Page, t1.Format("Jan 02, 2006"), t2.Format("Jan 02, 2006"))
 		rows, err := d.q.GetTopReleasesFromArtist(ctx, repository.GetTopReleasesFromArtistParams{
 			ArtistID: int32(opts.ArtistID),
@@ -50,7 +36,7 @@ func (d *Psql) GetTopAlbumsPaginated(ctx context.Context, opts db.GetItemsOpts)
 		if err != nil {
 			return nil, fmt.Errorf("GetTopAlbumsPaginated: GetTopReleasesFromArtist: %w", err)
 		}
-		rgs = make([]*models.Album, len(rows))
+		rgs = make([]db.RankedItem[*models.Album], len(rows))
 		l.Debug().Msgf("Database responded with %d items", len(rows))
 		for i, v := range rows {
 			artists := make([]models.SimpleArtist, 0)
@@ -59,7 +45,7 @@ func (d *Psql) GetTopAlbumsPaginated(ctx context.Context, opts db.GetItemsOpts)
 				l.Err(err).Msgf("Error unmarshalling artists for release group with id %d", v.ID)
 				return nil, fmt.Errorf("GetTopAlbumsPaginated: Unmarshal: %w", err)
 			}
-			rgs[i] = &models.Album{
+			rgs[i].Item = &models.Album{
 				ID:    v.ID,
 				MbzID: v.MusicBrainzID,
 				Title: v.Title,
@@ -68,14 +54,15 @@ func (d *Psql) GetTopAlbumsPaginated(ctx context.Context, opts db.GetItemsOpts)
 				VariousArtists: v.VariousArtists,
 				ListenCount:    v.ListenCount,
 			}
+			rgs[i].Rank = v.Rank
 		}
 		count, err = d.q.CountReleasesFromArtist(ctx, int32(opts.ArtistID))
 		if err != nil {
 			return nil, fmt.Errorf("GetTopAlbumsPaginated: CountReleasesFromArtist: %w", err)
 		}
 	} else {
-		l.Debug().Msgf("Fetching top %d albums with period %s on page %d from range %v to %v",
-			opts.Limit, opts.Period, opts.Page, t1.Format("Jan 02, 2006"), t2.Format("Jan 02, 2006"))
+		l.Debug().Msgf("Fetching top %d albums on page %d from range %v to %v",
+			opts.Limit, opts.Page, t1.Format("Jan 02, 2006"), t2.Format("Jan 02, 2006"))
 		rows, err := d.q.GetTopReleasesPaginated(ctx, repository.GetTopReleasesPaginatedParams{
 			ListenedAt:   t1,
 			ListenedAt_2: t2,
@@ -85,7 +72,7 @@ func (d *Psql) GetTopAlbumsPaginated(ctx context.Context, opts db.GetItemsOpts)
 		if err != nil {
 			return nil, fmt.Errorf("GetTopAlbumsPaginated: GetTopReleasesPaginated: %w", err)
 		}
-		rgs = make([]*models.Album, len(rows))
+		rgs = make([]db.RankedItem[*models.Album], len(rows))
 		l.Debug().Msgf("Database responded with %d items", len(rows))
 		for i, row := range rows {
 			artists := make([]models.SimpleArtist, 0)
@@ -94,16 +81,16 @@ func (d *Psql) GetTopAlbumsPaginated(ctx context.Context, opts db.GetItemsOpts)
 				l.Err(err).Msgf("Error unmarshalling artists for release group with id %d", row.ID)
 				return nil, fmt.Errorf("GetTopAlbumsPaginated: Unmarshal: %w", err)
 			}
-			t := &models.Album{
-				Title: row.Title,
-				MbzID: row.MusicBrainzID,
+			rgs[i].Item = &models.Album{
 				ID:    row.ID,
+				MbzID: row.MusicBrainzID,
+				Title: row.Title,
 				Image:          row.Image,
 				Artists:        artists,
 				VariousArtists: row.VariousArtists,
 				ListenCount:    row.ListenCount,
 			}
-			rgs[i] = t
+			rgs[i].Rank = row.Rank
 		}
 		count, err = d.q.CountTopReleases(ctx, repository.CountTopReleasesParams{
 			ListenedAt: t1,
@@ -114,7 +101,7 @@ func (d *Psql) GetTopAlbumsPaginated(ctx context.Context, opts db.GetItemsOpts)
 		}
 		l.Debug().Msgf("Database responded with %d albums out of a total %d", len(rows), count)
 	}
-	return &db.PaginatedResponse[*models.Album]{
+	return &db.PaginatedResponse[db.RankedItem[*models.Album]]{
 		Items:        rgs,
 		TotalCount:   count,
 		ItemsPerPage: int32(opts.Limit),
```
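The change above swaps the return type from `*db.PaginatedResponse[*models.Album]` to `*db.PaginatedResponse[db.RankedItem[*models.Album]]`, so each page entry carries its chart rank alongside the album. As a minimal sketch of how such a generic wrapper composes — the field sets here are inferred from the diff, not copied from the koito source:

```go
package main

import (
	"fmt"
	"strings"
)

// RankedItem pairs an item with its chart rank. The field names
// (Item, Rank) are inferred from the diff's usage; the real
// db.RankedItem may differ.
type RankedItem[T any] struct {
	Item T
	Rank int64
}

// PaginatedResponse mirrors the wrapper returned above; the real
// db.PaginatedResponse may carry more metadata.
type PaginatedResponse[T any] struct {
	Items        []T
	TotalCount   int64
	ItemsPerPage int32
	HasNextPage  bool
}

// Album is a stand-in for models.Album with only the fields used here.
type Album struct {
	ID    int32
	Title string
}

// render formats a page of ranked albums, one "#rank title" line each.
func render(resp PaginatedResponse[RankedItem[*Album]]) string {
	var b strings.Builder
	for _, ri := range resp.Items {
		fmt.Fprintf(&b, "#%d %s\n", ri.Rank, ri.Item.Title)
	}
	return b.String()
}

func main() {
	resp := PaginatedResponse[RankedItem[*Album]]{
		Items: []RankedItem[*Album]{
			{Item: &Album{ID: 1, Title: "Release One"}, Rank: 1},
			{Item: &Album{ID: 2, Title: "Release Two"}, Rank: 2},
		},
		TotalCount:   2,
		ItemsPerPage: 25,
	}
	fmt.Print(render(resp)) // prints "#1 Release One" then "#2 Release Two"
}
```

Nesting the generics this way lets callers read `resp.Items[i].Item.Title` and `resp.Items[i].Rank` without a second lookup, which is exactly the access pattern the updated tests use.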


```diff
@@ -14,23 +14,23 @@ func TestGetTopAlbumsPaginated(t *testing.T) {
 	ctx := context.Background()
 	// Test valid
-	resp, err := store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Period: db.PeriodAllTime})
+	resp, err := store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodAllTime}})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 4)
 	assert.Equal(t, int64(4), resp.TotalCount)
-	assert.Equal(t, "Release One", resp.Items[0].Title)
-	assert.Equal(t, "Release Two", resp.Items[1].Title)
-	assert.Equal(t, "Release Three", resp.Items[2].Title)
-	assert.Equal(t, "Release Four", resp.Items[3].Title)
+	assert.Equal(t, "Release One", resp.Items[0].Item.Title)
+	assert.Equal(t, "Release Two", resp.Items[1].Item.Title)
+	assert.Equal(t, "Release Three", resp.Items[2].Item.Title)
+	assert.Equal(t, "Release Four", resp.Items[3].Item.Title)
 	// Test pagination
-	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Limit: 1, Page: 2, Period: db.PeriodAllTime})
+	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Limit: 1, Page: 2, Timeframe: db.Timeframe{Period: db.PeriodAllTime}})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
-	assert.Equal(t, "Release Two", resp.Items[0].Title)
+	assert.Equal(t, "Release Two", resp.Items[0].Item.Title)
 	// Test page out of range
-	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Limit: 1, Page: 10, Period: db.PeriodAllTime})
+	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Limit: 1, Page: 10, Timeframe: db.Timeframe{Period: db.PeriodAllTime}})
 	require.NoError(t, err)
 	require.Empty(t, resp.Items)
 	assert.False(t, resp.HasNextPage)
@@ -43,7 +43,7 @@ func TestGetTopAlbumsPaginated(t *testing.T) {
 	assert.Error(t, err)
 	// Test specify period
-	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Period: db.PeriodDay})
+	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodDay}})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 0) // empty
 	assert.Equal(t, int64(0), resp.TotalCount)
@@ -53,51 +53,47 @@ func TestGetTopAlbumsPaginated(t *testing.T) {
 	require.Len(t, resp.Items, 0) // empty
 	assert.Equal(t, int64(0), resp.TotalCount)
-	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Period: db.PeriodWeek})
+	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodWeek}})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
 	assert.Equal(t, int64(1), resp.TotalCount)
-	assert.Equal(t, "Release Four", resp.Items[0].Title)
+	assert.Equal(t, "Release Four", resp.Items[0].Item.Title)
-	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Period: db.PeriodMonth})
+	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodMonth}})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 2)
 	assert.Equal(t, int64(2), resp.TotalCount)
-	assert.Equal(t, "Release Three", resp.Items[0].Title)
-	assert.Equal(t, "Release Four", resp.Items[1].Title)
+	assert.Equal(t, "Release Three", resp.Items[0].Item.Title)
+	assert.Equal(t, "Release Four", resp.Items[1].Item.Title)
-	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Period: db.PeriodYear})
+	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodYear}})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 3)
 	assert.Equal(t, int64(3), resp.TotalCount)
-	assert.Equal(t, "Release Two", resp.Items[0].Title)
-	assert.Equal(t, "Release Three", resp.Items[1].Title)
-	assert.Equal(t, "Release Four", resp.Items[2].Title)
+	assert.Equal(t, "Release Two", resp.Items[0].Item.Title)
+	assert.Equal(t, "Release Three", resp.Items[1].Item.Title)
+	assert.Equal(t, "Release Four", resp.Items[2].Item.Title)
 	// test specific artist
-	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Period: db.PeriodYear, ArtistID: 2})
+	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodYear}, ArtistID: 2})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
 	assert.Equal(t, int64(1), resp.TotalCount)
-	assert.Equal(t, "Release Two", resp.Items[0].Title)
+	assert.Equal(t, "Release Two", resp.Items[0].Item.Title)
 	// Test specify dates
 	testDataAbsoluteListenTimes(t)
-	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Year: 2023})
+	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Year: 2023}})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
 	assert.Equal(t, int64(1), resp.TotalCount)
-	assert.Equal(t, "Release One", resp.Items[0].Title)
+	assert.Equal(t, "Release One", resp.Items[0].Item.Title)
-	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Month: 6, Year: 2024})
+	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Month: 6, Year: 2024}})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
 	assert.Equal(t, int64(1), resp.TotalCount)
-	assert.Equal(t, "Release Two", resp.Items[0].Title)
+	assert.Equal(t, "Release Two", resp.Items[0].Item.Title)
-	// invalid, year required with month
-	_, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Month: 10})
-	require.Error(t, err)
 }
```
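The tests now express time filters through a single `db.Timeframe` value (`Period`, or explicit `Year`/`Month`) which `db.TimeframeToTimeRange` converts into a `[t1, t2)` range. A hypothetical sketch of the year/month half of that conversion — `Timeframe` here holds only the two fields the tests exercise, and `toTimeRange` is an assumed stand-in, not the koito implementation:

```go
package main

import (
	"fmt"
	"time"
)

// Timeframe mirrors the explicit-date options used in the tests above.
// The real db.Timeframe also carries a Period; omitted here.
type Timeframe struct {
	Year  int
	Month int
}

// toRange converts a Timeframe into a half-open [t1, t2) range:
// the whole month if Month is set, otherwise the whole year.
func toRange(tf Timeframe) (time.Time, time.Time) {
	if tf.Month != 0 {
		t1 := time.Date(tf.Year, time.Month(tf.Month), 1, 0, 0, 0, 0, time.UTC)
		return t1, t1.AddDate(0, 1, 0) // first day of the next month
	}
	t1 := time.Date(tf.Year, 1, 1, 0, 0, 0, 0, time.UTC)
	return t1, t1.AddDate(1, 0, 0) // first day of the next year
}

func main() {
	t1, t2 := toRange(Timeframe{Month: 6, Year: 2024})
	fmt.Println(t1.Format("2006-01-02"), t2.Format("2006-01-02")) // 2024-06-01 2024-07-01
}
```

Note the removed test case (`Month` without `Year` returning an error) is consistent with this design: once the range math lives in one conversion function, invalid combinations can be rejected or normalized before the query layer ever sees them.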
