Compare commits


37 commits
v0.1.1 ... main

Author SHA1 Message Date
Gabe Farrell
0ec7b458cc
ui: tweaks and fixes (#194)
* reduce min width of top chart on mobile

* adjust error page style

* adjust h1 line height
2026-02-04 13:41:12 -05:00
Gabe Farrell
531c72899c
fix: add null check for top charts bg gradient (#193) 2026-02-03 11:23:30 -05:00
Gabe Farrell
b06685c1af
fix: rewind navigation (#191) 2026-02-02 15:06:13 -05:00
Gabe Farrell
64236c99c9
fix: invalid json response when login gate is disabled (#184) 2026-01-26 14:49:30 -05:00
Gabe Farrell
42b32c7920
feat: add api key auth to web api (#183) 2026-01-26 13:48:43 -05:00
PythonGermany
bf1c03e9fd
docs: fix typo in index.mdx (#182) 2026-01-26 13:43:01 -05:00
Gabe Farrell
35e104c97e
fix: gradient background on top charts (#181) 2026-01-26 13:03:27 -05:00
Gabe Farrell
c8a11ef018
fix: ensure mbids in mbidmapping are discovered (#180) 2026-01-25 15:51:07 -05:00
Gabe Farrell
937f9062b5
fix: include time zone name overrides and add KOITO_FORCE_TZ cfg option (#176)
* timezone overrides and force_tz option

* docs for force_tz

* add link to time zone names in docs
2026-01-24 13:19:04 -05:00
Gabe Farrell
1ed055d098
fix: ui tweaks and fixes (#170)
* add subtle gradient to home page

* tweak autumn theme primary color

* reduce home page top margin on mobile

* use focus-active instead of focus for outline

* fix gradient on rewind page

* align checkbox on login form

* i forgot what the pseudo class was called
2026-01-22 21:31:14 -05:00
Gabe Farrell
08fc9eed86
fix: correct interest bucket queries (#169) 2026-01-22 17:01:46 -05:00
Gabe Farrell
cb4d177875
fix: release associations and add cleanup migration (#168)
* fix: release associations and add cleanup migration

* fix: incorrect test
2026-01-22 15:33:38 -05:00
Gabe Farrell
16cee8cfca
fix: speedup top-artists and top-albums queries (#167) 2026-01-21 17:30:59 -05:00
onespaceman
c59c6c3baa
QOL changes to client (#165) 2026-01-21 16:03:27 -05:00
Gabe Farrell
e7ba34710c
feat: lastfm image support (#166)
* feat: lastfm image support

* docs
2026-01-21 16:03:05 -05:00
Gabe Farrell
56ac73d12b
fix: improve subsonic image searching (#164) 2026-01-21 14:54:52 -05:00
Gabe Farrell
1a8099e902
feat: refetch missing images on startup (#160)
* artist image refetching

* album image refetching

* remove unused var
2026-01-20 12:10:54 -05:00
Gabe Farrell
5e294b839c
feat: all time rank display (#149)
* add all time rank to item pages

* fix artist albums component

* add no rows check

* fix rewind page
2026-01-16 01:03:23 -05:00
d08e05220f docs: add disclaimer about subsonic config 2026-01-15 22:01:25 -05:00
c0de721a7c chore: ignore README for docker workflow 2026-01-15 21:27:59 -05:00
Gabe Farrell
d2d6924e05
fix: use sql rank (#148) 2026-01-15 21:08:30 -05:00
Gabe Farrell
aa7fddd518
fix: a couple ui fixes (#147)
* fix: reduce loading component width

* improve theme selector for mobile

* match interest graph width to activity grid
2026-01-15 20:21:05 -05:00
Gabe Farrell
1eb1cd0fd5
chore: call relay early to prevent missed relays (#145)
* chore: call relay early to prevent missed relays

* fix: get current time in tz for listen activity (#146)

* fix: get current time in tz for listen activity

* fix: adjust test to prevent timezone errors
2026-01-15 19:40:38 -05:00
Gabe Farrell
92648167f0
fix: get current time in tz for listen activity (#146)
* fix: get current time in tz for listen activity

* fix: adjust test to prevent timezone errors
2026-01-15 19:36:48 -05:00
Gabe Farrell
9dbdfe5e41
update README 2026-01-15 18:21:51 -05:00
Gabe Farrell
94108953ec
fix: conditional rendering on artist and album pages (#140) 2026-01-14 22:12:57 -05:00
Gabe Farrell
d87ed2eb97
fix: ensure listen activity correctly sums listen activity in step (#139)
* remove impossible nil check

* fix listen activity not correctly aggregating step

* remove stray log

* fix test
2026-01-14 21:35:01 -05:00
Gabe Farrell
3305ad269e
Add Star History section to README
Added Star History section with visualization.
2026-01-14 17:21:52 -05:00
Gabe Farrell
20bbf62254
update README
Added logo and Ko-Fi badge to README.
2026-01-14 14:47:21 -05:00
Gabe Farrell
a94584da23
create FUNDING.yml 2026-01-14 14:06:14 -05:00
Gabe Farrell
8223a29be6
fix: correctly cycle tracks in backfill (#138) 2026-01-14 12:46:17 -05:00
231e751be3 docs: add navidrome quickstart guide 2026-01-14 01:26:01 -05:00
feef66da12 fix: add required parameters for subsonic request 2026-01-14 01:09:17 -05:00
Gabe Farrell
25d7bb41c1
Revise README for project status and update screenshots
Updated project status to reflect active development and instability. Added new images to the screenshots section and made minor text adjustments.

Also since when does AI write GitHub default commit messages...
2026-01-14 00:24:19 -05:00
Gabe Farrell
df59605418
feat: backfill duration from musicbrainz (#135)
* feat: backfill durations from musicbrainz

* chore: make request body dump info level
2026-01-14 00:08:05 -05:00
Gabe Farrell
288d04d714
fix: ui tweaks and fixes (#134) 2026-01-13 23:25:31 -05:00
Gabe Farrell
c2a0987946
fix: improved mobile ui for rewind (#133) 2026-01-13 11:13:54 -05:00
100 changed files with 2939 additions and 1384 deletions

.env.example (new file)

@@ -0,0 +1,5 @@
+KOITO_ALLOWED_HOSTS=*
+KOITO_LOG_LEVEL=debug
+KOITO_CONFIG_DIR=test_config_dir
+KOITO_DATABASE_URL=postgres://postgres:secret@localhost:5432?sslmode=disable
+TZ=Etc/UTC

.github/FUNDING.yml (new file)

@@ -0,0 +1,3 @@
+# These are supported funding model platforms
+ko_fi: gabehf

@@ -17,6 +17,7 @@ on:
       - main
     paths-ignore:
       - "docs/**"
+      - "README.md"
   workflow_dispatch:

.gitignore

@@ -1 +1,2 @@
 test_config_dir
+.env

@@ -1,3 +1,8 @@
+ifneq (,$(wildcard ./.env))
+include .env
+export
+endif
+
 .PHONY: all test clean client

 postgres.schemadump:
@@ -28,10 +33,10 @@ postgres.remove-scratch:
 	docker stop koito-scratch && docker rm koito-scratch

 api.debug: postgres.start
-	KOITO_ALLOWED_HOSTS=* KOITO_LOG_LEVEL=debug KOITO_CONFIG_DIR=test_config_dir KOITO_DATABASE_URL=postgres://postgres:secret@localhost:5432?sslmode=disable go run cmd/api/main.go
+	go run cmd/api/main.go

 api.scratch: postgres.run-scratch
-	KOITO_ALLOWED_HOSTS=* KOITO_LOG_LEVEL=debug KOITO_CONFIG_DIR=test_config_dir/scratch KOITO_DATABASE_URL=postgres://postgres:secret@localhost:5433?sslmode=disable go run cmd/api/main.go
+	KOITO_DATABASE_URL=postgres://postgres:secret@localhost:5433?sslmode=disable go run cmd/api/main.go

 api.test:
 	go test ./... -timeout 60s

@@ -1,9 +1,21 @@
-# Koito
+<div align="center">
+
+![Koito logo](https://github.com/user-attachments/assets/bd69a050-b40f-4da7-8ff1-4607554bfd6d)
+
+*Koito (小糸) is a Japanese surname. It is also homophonous with the words 恋と (koi to), meaning "and/with love".*
+
+</div>
+
+<div align="center">
+
+[![Ko-Fi](https://img.shields.io/badge/Ko--fi-F16061?style=for-the-badge&logo=ko-fi&logoColor=white)](https://ko-fi.com/gabehf)
+
+</div>

 Koito is a modern, themeable ListenBrainz-compatible scrobbler for self-hosters who want control over their data and insights into their listening habits.
 It supports relaying to other compatible scrobblers, so you can try it safely without replacing your current setup.

-> This project is currently pre-release, and therefore you can expect rapid development and some bugs. If you don't want to replace your current scrobbler
+> This project is under active development and still considered "unstable", and therefore you can expect some bugs. If you don't want to replace your current scrobbler
 with Koito quite yet, you can [set up a relay](https://koito.io/guides/scrobbler/#set-up-a-relay) from Koito to another ListenBrainz-compatible
 scrobbler. This is what I've been doing for the entire development of this app and it hasn't failed me once. Or, you can always use something
 like [multi-scrobbler](https://github.com/FoxxMD/multi-scrobbler).
@@ -23,8 +35,9 @@ You can view my public instance with my listening data at https://koito.mnrva.de
 ## Screenshots

 ![screenshot one](assets/screenshot1.png)
-![screenshot two](assets/screenshot2.png)
-![screenshot three](assets/screenshot3.png)
+<img width="2021" height="1330" alt="image" src="https://github.com/user-attachments/assets/956748ff-f61f-4102-94b2-50783d9ee72b" />
+<img width="1505" height="1018" alt="image" src="https://github.com/user-attachments/assets/5f7e1162-f723-4e4b-a528-06cf26d1d870" />

 ## Installation
@@ -75,6 +88,16 @@ There are currently some known issues that I am actively working on, in addition
 If you have any feature ideas, open a GitHub issue to let me know. I'm sorting through ideas to decide which data visualizations and customization options to add next.

+## Star History
+
+<a href="https://www.star-history.com/#gabehf/koito&type=date&legend=top-left">
+ <picture>
+   <source media="(prefers-color-scheme: dark)" srcset="https://api.star-history.com/svg?repos=gabehf/koito&type=date&theme=dark&legend=top-left" />
+   <source media="(prefers-color-scheme: light)" srcset="https://api.star-history.com/svg?repos=gabehf/koito&type=date&legend=top-left" />
+   <img alt="Star History Chart" src="https://api.star-history.com/svg?repos=gabehf/koito&type=date&legend=top-left" />
+ </picture>
+</a>
+
 ## Albums that fueled development + notes

 More relevant here than any of my other projects...
@@ -84,5 +107,4 @@ Not just during development, you can see my complete listening data on my [live
 #### Random notes

 - I find it a little annoying when READMEs use emoji but everyone else is doing it so I felt like I had to...
-- It's funny how you can see the days in my listening history when I was just working on this project because they have way more listens than other days.
 - About 50% of the reason I built this was minor/not-so-minor greivances with Maloja. Could I have just contributed to Maloja? Maybe, but I like building stuff and I like Koito's UI a lot more anyways.

@@ -48,32 +48,32 @@ async function getLastListens(
 async function getTopTracks(
   args: getItemsArgs
-): Promise<PaginatedResponse<Track>> {
+): Promise<PaginatedResponse<Ranked<Track>>> {
   let url = `/apis/web/v1/top-tracks?period=${args.period}&limit=${args.limit}&page=${args.page}`;
   if (args.artist_id) url += `&artist_id=${args.artist_id}`;
   else if (args.album_id) url += `&album_id=${args.album_id}`;
   const r = await fetch(url);
-  return handleJson<PaginatedResponse<Track>>(r);
+  return handleJson<PaginatedResponse<Ranked<Track>>>(r);
 }

 async function getTopAlbums(
   args: getItemsArgs
-): Promise<PaginatedResponse<Album>> {
+): Promise<PaginatedResponse<Ranked<Album>>> {
   let url = `/apis/web/v1/top-albums?period=${args.period}&limit=${args.limit}&page=${args.page}`;
   if (args.artist_id) url += `&artist_id=${args.artist_id}`;
   const r = await fetch(url);
-  return handleJson<PaginatedResponse<Album>>(r);
+  return handleJson<PaginatedResponse<Ranked<Album>>>(r);
 }

 async function getTopArtists(
   args: getItemsArgs
-): Promise<PaginatedResponse<Artist>> {
+): Promise<PaginatedResponse<Ranked<Artist>>> {
   const url = `/apis/web/v1/top-artists?period=${args.period}&limit=${args.limit}&page=${args.page}`;
   const r = await fetch(url);
-  return handleJson<PaginatedResponse<Artist>>(r);
+  return handleJson<PaginatedResponse<Ranked<Artist>>>(r);
 }

 async function getActivity(
@@ -367,6 +367,7 @@ type Track = {
   musicbrainz_id: string;
   time_listened: number;
   first_listen: number;
+  all_time_rank: number;
 };

 type Artist = {
   id: number;
@@ -378,6 +379,7 @@ type Artist = {
   time_listened: number;
   first_listen: number;
   is_primary: boolean;
+  all_time_rank: number;
 };

 type Album = {
   id: number;
@@ -389,6 +391,7 @@ type Album = {
   musicbrainz_id: string;
   time_listened: number;
   first_listen: number;
+  all_time_rank: number;
 };

 type Alias = {
   id: number;
@@ -407,6 +410,10 @@ type PaginatedResponse<T> = {
   current_page: number;
   items_per_page: number;
 };
+
+type Ranked<T> = {
+  item: T;
+  rank: number;
+};

 type ListenActivityItem = {
   start_time: Date;
   listens: number;
@@ -455,9 +462,9 @@ type NowPlaying = {
 };

 type RewindStats = {
   title: string;
-  top_artists: Artist[];
-  top_albums: Album[];
-  top_tracks: Track[];
+  top_artists: Ranked<Artist>[];
+  top_albums: Ranked<Album>[];
+  top_tracks: Ranked<Track>[];
   minutes_listened: number;
   avg_minutes_listened_per_day: number;
   plays: number;
@@ -480,6 +487,7 @@ export type {
   Listen,
   SearchResponse,
   PaginatedResponse,
+  Ranked,
   ListenActivityItem,
   InterestBucket,
   User,
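The top-chart endpoints above now return `Ranked<T>` envelopes instead of bare items, so consumers read `item.item` and `item.rank` rather than recomputing ranks. A minimal sketch of unwrapping a ranked page (the trimmed `Artist` type and the sample data are illustrative, not the full API types):

```typescript
// Shapes taken from the api.ts diff above (fields trimmed for brevity).
type Ranked<T> = { item: T; rank: number };
type Artist = { id: number; name: string; listen_count: number };

// Unwrap a page of ranked artists into display rows, keeping the
// server-computed rank instead of deriving it from list position.
function toRows(items: Ranked<Artist>[]): { rank: number; name: string }[] {
  return items.map(({ item, rank }) => ({ rank, name: item.name }));
}

// Ties share a rank, and the next distinct count skips ahead (1, 1, 3, ...).
const page: Ranked<Artist>[] = [
  { rank: 1, item: { id: 1, name: "A", listen_count: 120 } },
  { rank: 1, item: { id: 2, name: "B", listen_count: 120 } },
  { rank: 3, item: { id: 3, name: "C", listen_count: 90 } },
];
```

Carrying the rank alongside each item means pagination cannot desynchronize ranks from rows, which is what the client-side calculation risked.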

@@ -58,6 +58,7 @@
   --header-sm: 16px;
   --header-xl-weight: 600;
   --header-weight: 600;
+  --header-line-height: 3rem;
 }

 @media (min-width: 60rem) {
@@ -68,6 +69,7 @@
   --header-sm: 16px;
   --header-xl-weight: 600;
   --header-weight: 600;
+  --header-line-height: 1.3em;
   }
 }
@@ -98,6 +100,7 @@
 h1 {
   font-family: "League Spartan";
   font-weight: var(--header-weight);
   font-size: var(--header-xl);
+  line-height: var(--header-line-height);
 }

 h2 {
   font-family: "League Spartan";
@@ -130,30 +133,21 @@ h4 {
   text-decoration: underline;
 }

-input[type="text"] {
-  border: 1px solid var(--color-bg);
-}
-input[type="text"]:focus {
-  outline: none;
-  border: 1px solid var(--color-fg-tertiary);
-}
-textarea {
-  border: 1px solid var(--color-bg);
-}
-textarea:focus {
-  outline: none;
-  border: 1px solid var(--color-fg-tertiary);
-}
-input[type="password"] {
-  border: 1px solid var(--color-bg);
-}
-input[type="password"]:focus {
-  outline: none;
-  border: 1px solid var(--color-fg-tertiary);
-}
-input[type="checkbox"]:focus {
-  outline: none;
-  border: 1px solid var(--color-fg-tertiary);
-}
+input[type="text"],
+input[type="password"],
+textarea {
+  border: 1px solid var(--color-bg);
+}
+input[type="checkbox"] {
+  height: fit-content;
+}
+input:focus-visible,
+button:focus-visible,
+a:focus-visible,
+select:focus-visible,
+textarea:focus-visible {
+  border-color: transparent;
+  outline: 2px solid var(--color-fg-tertiary);
+}

 button:hover {

@@ -68,14 +68,14 @@ export default function ActivityGrid({
   if (isPending) {
     return (
-      <div className="w-[500px]">
+      <div className="w-[350px]">
         <h3>Activity</h3>
         <p>Loading...</p>
       </div>
     );
   } else if (isError) {
     return (
-      <div className="w-[500px]">
+      <div className="w-[350px]">
         <h3>Activity</h3>
         <p className="error">Error: {error.message}</p>
       </div>

@@ -7,10 +7,12 @@ export default function AllTimeStats() {
     queryFn: ({ queryKey }) => getStats(queryKey[1]),
   });

+  const header = "All time stats";
+
   if (isPending) {
     return (
-      <div className="w-[200px]">
-        <h3>All Time Stats</h3>
+      <div>
+        <h3>{header}</h3>
         <p>Loading...</p>
       </div>
     );
@@ -18,7 +20,7 @@ export default function AllTimeStats() {
     return (
       <>
         <div>
-          <h3>All Time Stats</h3>
+          <h3>{header}</h3>
           <p className="error">Error: {error.message}</p>
         </div>
       </>
@@ -29,7 +31,7 @@ export default function AllTimeStats() {
   return (
     <div>
-      <h3>All Time Stats</h3>
+      <h3>{header}</h3>
       <div>
         <span
           className={numberClasses}

@@ -8,11 +8,11 @@ interface Props {
   period: string;
 }

-export default function ArtistAlbums({ artistId, name, period }: Props) {
+export default function ArtistAlbums({ artistId, name }: Props) {
   const { isPending, isError, data, error } = useQuery({
     queryKey: [
       "top-albums",
-      { limit: 99, period: "all_time", artist_id: artistId, page: 0 },
+      { limit: 99, period: "all_time", artist_id: artistId },
     ],
     queryFn: ({ queryKey }) => getTopAlbums(queryKey[1] as getItemsArgs),
   });
@@ -39,16 +39,20 @@ export default function ArtistAlbums({ artistId, name }: Props) {
       <h3>Albums featuring {name}</h3>
       <div className="flex flex-wrap gap-8">
         {data.items.map((item) => (
-          <Link to={`/album/${item.id}`} className="flex gap-2 items-start">
+          <Link
+            to={`/album/${item.item.id}`}
+            className="flex gap-2 items-start"
+          >
             <img
-              src={imageUrl(item.image, "medium")}
-              alt={item.title}
+              src={imageUrl(item.item.image, "medium")}
+              alt={item.item.title}
               style={{ width: 130 }}
             />
             <div className="w-[180px] flex flex-col items-start gap-1">
-              <p>{item.title}</p>
+              <p>{item.item.title}</p>
               <p className="text-sm color-fg-secondary">
-                {item.listen_count} play{item.listen_count > 1 ? "s" : ""}
+                {item.item.listen_count} play
+                {item.item.listen_count > 1 ? "s" : ""}
               </p>
             </div>
           </Link>

@@ -48,14 +48,14 @@ export default function InterestGraph({
   if (isPending) {
     return (
-      <div className="w-[500px]">
+      <div className="w-[350px] sm:w-[500px]">
         <h3>Interest over time</h3>
         <p>Loading...</p>
       </div>
     );
   } else if (isError) {
     return (
-      <div className="w-[500px]">
+      <div className="w-[350px] sm:w-[500px]">
         <h3>Interest over time</h3>
         <p className="error">Error: {error.message}</p>
       </div>
@@ -67,7 +67,7 @@
   // so I think I just have to remove it for now.
   return (
-    <div className="flex flex-col items-start w-full max-w-[500px]">
+    <div className="flex flex-col items-start w-full max-w-[335px] sm:max-w-[500px]">
       <h3>Interest over time</h3>
       <AreaChart
         style={{

@@ -42,6 +42,8 @@ export default function LastPlays(props: Props) {
     queryFn: () => getNowPlaying(),
   });

+  const header = "Last played";
+
   const [items, setItems] = useState<Listen[] | null>(null);

   const handleDelete = async (listen: Listen) => {
@@ -63,14 +65,14 @@
   if (isPending) {
     return (
       <div className="w-[300px] sm:w-[500px]">
-        <h3>Last Played</h3>
+        <h3>{header}</h3>
         <p>Loading...</p>
       </div>
     );
   } else if (isError) {
     return (
       <div className="w-[300px] sm:w-[500px]">
-        <h3>Last Played</h3>
+        <h3>{header}</h3>
         <p className="error">Error: {error.message}</p>
       </div>
     );
@@ -86,7 +88,7 @@
   return (
     <div className="text-sm sm:text-[16px]">
       <h3 className="hover:underline">
-        <Link to={`/listens?period=all_time${params}`}>Last Played</Link>
+        <Link to={`/listens?period=all_time${params}`}>{header}</Link>
       </h3>
       <table className="-ml-4">
         <tbody>

@@ -30,17 +30,19 @@ export default function TopAlbums(props: Props) {
     queryFn: ({ queryKey }) => getTopAlbums(queryKey[1] as getItemsArgs),
   });

+  const header = "Top albums";
+
   if (isPending) {
     return (
       <div className="w-[300px]">
-        <h3>Top Albums</h3>
+        <h3>{header}</h3>
         <p>Loading...</p>
       </div>
     );
   } else if (isError) {
     return (
       <div className="w-[300px]">
-        <h3>Top Albums</h3>
+        <h3>{header}</h3>
         <p className="error">Error: {error.message}</p>
       </div>
     );
@@ -54,7 +56,7 @@
           props.artistId ? `&artist_id=${props.artistId}` : ""
         }`}
       >
-        Top Albums
+        {header}
       </Link>
     </h3>
     <div className="max-w-[300px]">

@@ -21,17 +21,19 @@ export default function TopArtists(props: Props) {
     queryFn: ({ queryKey }) => getTopArtists(queryKey[1] as getItemsArgs),
   });

+  const header = "Top artists";
+
   if (isPending) {
     return (
       <div className="w-[300px]">
-        <h3>Top Artists</h3>
+        <h3>{header}</h3>
         <p>Loading...</p>
       </div>
     );
   } else if (isError) {
     return (
       <div className="w-[300px]">
-        <h3>Top Artists</h3>
+        <h3>{header}</h3>
         <p className="error">Error: {error.message}</p>
       </div>
     );
@@ -40,9 +42,7 @@
   return (
     <div>
       <h3 className="hover:underline">
-        <Link to={`/chart/top-artists?period=${props.period}`}>
-          Top Artists
-        </Link>
+        <Link to={`/chart/top-artists?period=${props.period}`}>{header}</Link>
       </h3>
       <div className="max-w-[300px]">
         <TopItemList type="artist" data={data} />

@@ -6,11 +6,12 @@ import {
   type Artist,
   type Track,
   type PaginatedResponse,
+  type Ranked,
 } from "api/api";

 type Item = Album | Track | Artist;

-interface Props<T extends Item> {
+interface Props<T extends Ranked<Item>> {
   data: PaginatedResponse<T>;
   separators?: ConstrainBoolean;
   ranked?: boolean;
@@ -18,33 +19,17 @@
   className?: string;
 }

-export default function TopItemList<T extends Item>({
+export default function TopItemList<T extends Ranked<Item>>({
   data,
   separators,
   type,
   className,
   ranked,
 }: Props<T>) {
-  const currentParams = new URLSearchParams(location.search);
-  const page = Math.max(parseInt(currentParams.get("page") || "1"), 1);
-
-  let lastRank = 0;
-  const calculateRank = (data: Item[], page: number, index: number): number => {
-    if (
-      index === 0 ||
-      data[index] == undefined ||
-      !(data[index].listen_count === data[index - 1].listen_count)
-    ) {
-      lastRank = index + 1 + (page - 1) * 100;
-    }
-    return lastRank;
-  };
-
   return (
     <div className={`flex flex-col gap-1 ${className} min-w-[200px]`}>
       {data.items.map((item, index) => {
-        const key = `${type}-${item.id}`;
+        const key = `${type}-${item.item.id}`;
         return (
           <div
             key={key}
@@ -57,10 +42,10 @@
           >
             <ItemCard
               ranked={ranked}
-              rank={calculateRank(data.items, page, index)}
-              item={item}
+              rank={item.rank}
+              item={item.item}
               type={type}
-              key={type + item.id}
+              key={type + item.item.id}
             />
           </div>
         );
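The deleted `calculateRank` helper computed standard competition ranking on the client, presumably superseded by the server-side rank (see the `fix: use sql rank (#148)` commit above). A standalone sketch of the same idea, for reference; the `perPage` default of 100 mirrors the removed code, and the function name is illustrative:

```typescript
// Standard competition ranking: items tied on listen_count share a rank,
// and the next distinct count skips ahead (1, 1, 3, ...). Page offsets
// shift ranks by (page - 1) * perPage, as the removed helper did.
function competitionRanks(counts: number[], page = 1, perPage = 100): number[] {
  let lastRank = 0;
  return counts.map((count, i) => {
    if (i === 0 || count !== counts[i - 1]) {
      lastRank = i + 1 + (page - 1) * perPage;
    }
    return lastRank;
  });
}
```

Note the client-side flaw this approach carries: the first item of any page always resets the rank (`i === 0`), so a tie spanning a page boundary gets a wrong rank, which the server-computed rank avoids.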

@@ -28,17 +28,19 @@ const TopTracks = (props: Props) => {
     queryFn: ({ queryKey }) => getTopTracks(queryKey[1] as getItemsArgs),
   });

+  const header = "Top tracks";
+
   if (isPending) {
     return (
       <div className="w-[300px]">
-        <h3>Top Tracks</h3>
+        <h3>{header}</h3>
         <p>Loading...</p>
       </div>
     );
   } else if (isError) {
     return (
       <div className="w-[300px]">
-        <h3>Top Tracks</h3>
+        <h3>{header}</h3>
         <p className="error">Error: {error.message}</p>
       </div>
     );
@@ -53,7 +55,7 @@
   <div>
     <h3 className="hover:underline">
       <Link to={`/chart/top-tracks?period=${props.period}${params}`}>
-        Top Tracks
+        {header}
       </Link>
     </h3>
     <div className="max-w-[300px]">

@@ -20,7 +20,7 @@ export default function DeleteModal({ open, setOpen, title, id, type }: Props) {
     setLoading(true);
     deleteItem(type.toLowerCase(), id).then((r) => {
       if (r.ok) {
-        navigate("/");
+        navigate(-1);
       } else {
         console.log(r);
       }

@@ -54,7 +54,7 @@ export default function LoginForm() {
         className="w-full mx-auto fg bg rounded p-2"
         onChange={(e) => setPassword(e.target.value)}
       />
-      <div className="flex gap-2">
+      <div className="flex gap-2 items-center">
         <input
           type="checkbox"
           name="koito-remember"

@@ -19,7 +19,7 @@ interface Props {
 }

 export default function MergeModal(props: Props) {
-  const [query, setQuery] = useState("");
+  const [query, setQuery] = useState(props.currentTitle);
   const [data, setData] = useState<SearchResponse>();
   const [debouncedQuery, setDebouncedQuery] = useState(query);
   const [mergeTarget, setMergeTarget] = useState<{ title: string; id: number }>(
@@ -101,11 +101,12 @@
       <input
         type="text"
         autoFocus
+        defaultValue={props.currentTitle}
         // i find my stupid a(n) logic to be a little silly so im leaving it in even if its not optimal
-        placeholder={`Search for a${
-          props.type.toLowerCase()[0] === "a" ? "n" : ""
+        placeholder={`Search for a${props.type.toLowerCase()[0] === "a" ? "n" : ""
         } ${props.type.toLowerCase()} to be merged into the current ${props.type.toLowerCase()}`}
         className="w-full mx-auto fg bg rounded p-2"
+        onFocus={(e) => { setQuery(e.target.value); e.target.select()}}
         onChange={(e) => setQuery(e.target.value)}
       />
       <SearchResults selectorMode data={data} onSelect={toggleSelect} />
@@ -128,7 +129,7 @@
       >
         Merge Items
       </button>
-      <div className="flex gap-2 mt-3">
+      <div className="flex items-center gap-2 mt-3">
        <input
          type="checkbox"
          name="reverse-merge-order"
@@ -139,7 +140,7 @@
      </div>
      {(props.type.toLowerCase() === "album" ||
        props.type.toLowerCase() === "artist") && (
-        <div className="flex gap-2 mt-3">
+        <div className="flex items-center gap-2 mt-3">
          <input
            type="checkbox"
            name="replace-image"

View file

@@ -32,10 +32,34 @@ export function Modal({
 }
 }, [isOpen, shouldRender]);
-// Close on Escape key
+// Handle keyboard events
 useEffect(() => {
 const handleKeyDown = (e: KeyboardEvent) => {
-if (e.key === 'Escape') onClose();
+// Close on Escape key
+if (e.key === 'Escape') {
+onClose()
+// Trap tab navigation to the modal
+} else if (e.key === 'Tab') {
+if (modalRef.current) {
+const focusableEls = modalRef.current.querySelectorAll<HTMLElement>(
+'button:not(:disabled), [href], input:not(:disabled), select:not(:disabled), textarea:not(:disabled), [tabindex]:not([tabindex="-1"])'
+);
+const firstEl = focusableEls[0];
+const lastEl = focusableEls[focusableEls.length - 1];
+const activeEl = document.activeElement
+if (e.shiftKey && activeEl === firstEl) {
+e.preventDefault();
+lastEl.focus();
+} else if (!e.shiftKey && activeEl === lastEl) {
+e.preventDefault();
+firstEl.focus();
+} else if (!Array.from(focusableEls).find(node => node.isEqualNode(activeEl))) {
+e.preventDefault();
+firstEl.focus();
+}
+}
+};
 };
 if (isOpen) document.addEventListener('keydown', handleKeyDown);
 return () => document.removeEventListener('keydown', handleKeyDown);
@ -70,13 +94,13 @@ export function Modal({
}`} }`}
style={{ maxWidth: maxW ?? 600, height: h ?? '' }} style={{ maxWidth: maxW ?? 600, height: h ?? '' }}
> >
{children}
<button <button
onClick={onClose} onClick={onClose}
className="absolute top-2 right-2 color-fg-tertiary hover:cursor-pointer" className="absolute top-2 right-2 color-fg-tertiary hover:cursor-pointer"
> >
🞪 🞪
</button> </button>
{children}
</div> </div>
</div>, </div>,
document.body document.body
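The Tab branch added in the hunk above traps keyboard focus inside the modal: Shift+Tab on the first focusable element wraps to the last, Tab on the last wraps to the first, and focus that has escaped the dialog is pulled back in. That decision can be isolated as a pure function; `trapTarget` below is an illustrative sketch (not part of the commit, names are hypothetical):

```typescript
// Returns the index of the element to force-focus, or null to let the
// browser handle the Tab press normally.
function trapTarget(
  activeIdx: number | null, // index of document.activeElement in the focusable list
  count: number,            // number of focusable elements in the modal
  shiftKey: boolean
): number | null {
  if (count === 0) return null;
  if (activeIdx === null) return 0;                   // focus left the modal: pull it back
  if (shiftKey && activeIdx === 0) return count - 1;  // wrap backwards from the first element
  if (!shiftKey && activeIdx === count - 1) return 0; // wrap forwards from the last element
  return null;                                        // normal tab order inside the modal
}
```

A pure helper like this is straightforward to unit-test, unlike the DOM-bound `keydown` handler itself.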


@@ -8,9 +8,9 @@ interface Props {
 }
 export default function Rewind(props: Props) {
-const artistimg = props.stats.top_artists[0]?.image;
-const albumimg = props.stats.top_albums[0]?.image;
-const trackimg = props.stats.top_tracks[0]?.image;
+const artistimg = props.stats.top_artists[0]?.item.image;
+const albumimg = props.stats.top_albums[0]?.item.image;
+const trackimg = props.stats.top_tracks[0]?.item.image;
 if (
 !props.stats.top_artists[0] ||
 !props.stats.top_albums[0] ||

@@ -1,7 +1,9 @@
+import type { Ranked } from "api/api";
 type TopItemProps<T> = {
 title: string;
 imageSrc: string;
-items: T[];
+items: Ranked<T>[];
 getLabel: (item: T) => string;
 includeTime?: boolean;
 };
@@ -28,23 +30,23 @@ export function RewindTopItem<
 <div className="flex items-center gap-2">
 <div className="flex flex-col items-start mb-2">
-<h2>{getLabel(top)}</h2>
+<h2>{getLabel(top.item)}</h2>
 <span className="text-(--color-fg-tertiary) -mt-3 text-sm">
-{`${top.listen_count} plays`}
+{`${top.item.listen_count} plays`}
 {includeTime
-? ` (${Math.floor(top.time_listened / 60)} minutes)`
+? ` (${Math.floor(top.item.time_listened / 60)} minutes)`
 : ``}
 </span>
 </div>
 </div>
 {rest.map((e) => (
-<div key={e.id} className="text-sm">
-{getLabel(e)}
+<div key={e.item.id} className="text-sm">
+{getLabel(e.item)}
 <span className="text-(--color-fg-tertiary)">
-{` - ${e.listen_count} plays`}
+{` - ${e.item.listen_count} plays`}
 {includeTime
-? ` (${Math.floor(e.time_listened / 60)} minutes)`
+? ` (${Math.floor(e.item.time_listened / 60)} minutes)`
 : ``}
 </span>
 </div>
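`Ranked<T>` is imported from `api/api`, but its definition is not shown in these hunks. From the usage (`e.item.id`, `e.item.listen_count`, and the `rank` column the backend queries now emit), a plausible minimal shape is the sketch below; the exact field names are an assumption:

```typescript
// Hypothetical minimal definition consistent with how the components unwrap it.
type Ranked<T> = {
  rank: number; // position assigned by the backend's RANK() window function
  item: T;      // the wrapped artist / album / track payload
};

// Example: unwrapping labels the way RewindTopItem does after this change.
const tracks: Ranked<{ title: string; listen_count: number }>[] = [
  { rank: 1, item: { title: "A", listen_count: 10 } },
  { rank: 2, item: { title: "B", listen_count: 7 } },
];
const labels = tracks.map((e) => `${e.item.title} - ${e.item.listen_count} plays`);
```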


@@ -1,23 +1,43 @@
 import type { Theme } from "~/styles/themes.css";
 interface Props {
-theme: Theme
-themeName: string
-setTheme: Function
+theme: Theme;
+themeName: string;
+setTheme: Function;
 }
 export default function ThemeOption({ theme, themeName, setTheme }: Props) {
 const capitalizeFirstLetter = (s: string) => {
 return s.charAt(0).toUpperCase() + s.slice(1);
-}
+};
 return (
-<div onClick={() => setTheme(themeName)} className="rounded-md p-3 sm:p-5 hover:cursor-pointer flex gap-4 items-center border-2" style={{background: theme.bg, color: theme.fg, borderColor: theme.bgSecondary}}>
-<div className="text-xs sm:text-sm">{capitalizeFirstLetter(themeName)}</div>
-<div className="w-[50px] h-[30px] rounded-md" style={{background: theme.bgSecondary}}></div>
-<div className="w-[50px] h-[30px] rounded-md" style={{background: theme.fgSecondary}}></div>
-<div className="w-[50px] h-[30px] rounded-md" style={{background: theme.primary}}></div>
-</div>
-)
+<div
+onClick={() => setTheme(themeName)}
+className="rounded-md p-3 sm:p-5 hover:cursor-pointer flex gap-3 items-center border-2 justify-between"
+style={{
+background: theme.bg,
+color: theme.fg,
+borderColor: theme.bgSecondary,
+}}
+>
+<div className="text-xs sm:text-sm">
+{capitalizeFirstLetter(themeName)}
+</div>
+<div className="flex gap-2 w-full">
+<div
+className="w-2/7 max-w-[50px] h-[30px] rounded-md"
+style={{ background: theme.bgSecondary }}
+></div>
+<div
+className="w-2/7 max-w-[50px] h-[30px] rounded-md"
+style={{ background: theme.fgSecondary }}
+></div>
+<div
+className="w-2/7 max-w-[50px] h-[30px] rounded-md"
+style={{ background: theme.primary }}
+></div>
+</div>
+</div>
+);
 }


@@ -49,7 +49,7 @@ export function ThemeSwitcher() {
 <AsyncButton onClick={resetTheme}>Reset</AsyncButton>
 </div>
 </div>
-<div className="grid grid-cols-2 items-center gap-2">
+<div className="grid grid-cols-1 sm:grid-cols-2 items-center gap-2">
 {Object.entries(themes).map(([name, themeData]) => (
 <ThemeOption
 setTheme={setTheme}


@@ -116,12 +116,12 @@ export function ErrorBoundary() {
 <AppProvider>
 <ThemeProvider>
 <title>{title}</title>
-<div className="flex">
 <Sidebar />
+<div className="flex">
 <div className="w-full flex flex-col">
-<main className="pt-16 p-4 container mx-auto flex-grow">
-<div className="flex gap-4 items-end">
-<img className="w-[200px] rounded" src="../yuu.jpg" />
+<main className="pt-16 p-4 mx-auto flex-grow">
+<div className="md:flex gap-4">
+<img className="w-[200px] rounded mb-3" src="../yuu.jpg" />
 <div>
 <h1>{message}</h1>
 <p>{details}</p>


@ -1,7 +1,7 @@
import TopItemList from "~/components/TopItemList"; import TopItemList from "~/components/TopItemList";
import ChartLayout from "./ChartLayout"; import ChartLayout from "./ChartLayout";
import { useLoaderData, type LoaderFunctionArgs } from "react-router"; import { useLoaderData, type LoaderFunctionArgs } from "react-router";
import { type Album, type PaginatedResponse } from "api/api"; import { type Album, type PaginatedResponse, type Ranked } from "api/api";
export async function clientLoader({ request }: LoaderFunctionArgs) { export async function clientLoader({ request }: LoaderFunctionArgs) {
const url = new URL(request.url); const url = new URL(request.url);
@ -21,7 +21,7 @@ export async function clientLoader({ request }: LoaderFunctionArgs) {
export default function AlbumChart() { export default function AlbumChart() {
const { top_albums: initialData } = useLoaderData<{ const { top_albums: initialData } = useLoaderData<{
top_albums: PaginatedResponse<Album>; top_albums: PaginatedResponse<Ranked<Album>>;
}>(); }>();
return ( return (
@ -30,7 +30,7 @@ export default function AlbumChart() {
initialData={initialData} initialData={initialData}
endpoint="chart/top-albums" endpoint="chart/top-albums"
render={({ data, page, onNext, onPrev }) => ( render={({ data, page, onNext, onPrev }) => (
<div className="flex flex-col gap-5"> <div className="flex flex-col gap-5 w-full">
<div className="flex gap-15 mx-auto"> <div className="flex gap-15 mx-auto">
<button className="default" onClick={onPrev} disabled={page <= 1}> <button className="default" onClick={onPrev} disabled={page <= 1}>
Prev Prev
@ -47,7 +47,7 @@ export default function AlbumChart() {
ranked ranked
separators separators
data={data} data={data}
className="w-[400px] sm:w-[600px]" className="w-11/12 sm:w-[600px]"
type="album" type="album"
/> />
<div className="flex gap-15 mx-auto"> <div className="flex gap-15 mx-auto">


@ -1,7 +1,7 @@
import TopItemList from "~/components/TopItemList"; import TopItemList from "~/components/TopItemList";
import ChartLayout from "./ChartLayout"; import ChartLayout from "./ChartLayout";
import { useLoaderData, type LoaderFunctionArgs } from "react-router"; import { useLoaderData, type LoaderFunctionArgs } from "react-router";
import { type Album, type PaginatedResponse } from "api/api"; import { type Album, type PaginatedResponse, type Ranked } from "api/api";
export async function clientLoader({ request }: LoaderFunctionArgs) { export async function clientLoader({ request }: LoaderFunctionArgs) {
const url = new URL(request.url); const url = new URL(request.url);
@ -21,7 +21,7 @@ export async function clientLoader({ request }: LoaderFunctionArgs) {
export default function Artist() { export default function Artist() {
const { top_artists: initialData } = useLoaderData<{ const { top_artists: initialData } = useLoaderData<{
top_artists: PaginatedResponse<Album>; top_artists: PaginatedResponse<Ranked<Album>>;
}>(); }>();
return ( return (
@ -30,7 +30,7 @@ export default function Artist() {
initialData={initialData} initialData={initialData}
endpoint="chart/top-artists" endpoint="chart/top-artists"
render={({ data, page, onNext, onPrev }) => ( render={({ data, page, onNext, onPrev }) => (
<div className="flex flex-col gap-5"> <div className="flex flex-col gap-5 w-full">
<div className="flex gap-15 mx-auto"> <div className="flex gap-15 mx-auto">
<button className="default" onClick={onPrev} disabled={page <= 1}> <button className="default" onClick={onPrev} disabled={page <= 1}>
Prev Prev
@ -47,7 +47,7 @@ export default function Artist() {
ranked ranked
separators separators
data={data} data={data}
className="w-[400px] sm:w-[600px]" className="w-11/12 sm:w-[600px]"
type="artist" type="artist"
/> />
<div className="flex gap-15 mx-auto"> <div className="flex gap-15 mx-auto">


@@ -40,7 +40,7 @@ export default function ChartLayout<T>({
 useEffect(() => {
 if ((data?.items?.length ?? 0) === 0) return;
-const img = (data.items[0] as any)?.image;
+const img = (data.items[0] as any)?.item?.image;
 if (!img) return;
 average(imageUrl(img, "small"), { amount: 1 }).then((color) => {


@ -1,7 +1,7 @@
import TopItemList from "~/components/TopItemList"; import TopItemList from "~/components/TopItemList";
import ChartLayout from "./ChartLayout"; import ChartLayout from "./ChartLayout";
import { useLoaderData, type LoaderFunctionArgs } from "react-router"; import { useLoaderData, type LoaderFunctionArgs } from "react-router";
import { type Album, type PaginatedResponse } from "api/api"; import { type Track, type PaginatedResponse, type Ranked } from "api/api";
export async function clientLoader({ request }: LoaderFunctionArgs) { export async function clientLoader({ request }: LoaderFunctionArgs) {
const url = new URL(request.url); const url = new URL(request.url);
@ -15,13 +15,13 @@ export async function clientLoader({ request }: LoaderFunctionArgs) {
throw new Response("Failed to load top tracks", { status: 500 }); throw new Response("Failed to load top tracks", { status: 500 });
} }
const top_tracks: PaginatedResponse<Album> = await res.json(); const top_tracks: PaginatedResponse<Track> = await res.json();
return { top_tracks }; return { top_tracks };
} }
export default function TrackChart() { export default function TrackChart() {
const { top_tracks: initialData } = useLoaderData<{ const { top_tracks: initialData } = useLoaderData<{
top_tracks: PaginatedResponse<Album>; top_tracks: PaginatedResponse<Ranked<Track>>;
}>(); }>();
return ( return (
@ -30,7 +30,7 @@ export default function TrackChart() {
initialData={initialData} initialData={initialData}
endpoint="chart/top-tracks" endpoint="chart/top-tracks"
render={({ data, page, onNext, onPrev }) => ( render={({ data, page, onNext, onPrev }) => (
<div className="flex flex-col gap-5"> <div className="flex flex-col gap-5 w-full">
<div className="flex gap-15 mx-auto"> <div className="flex gap-15 mx-auto">
<button className="default" onClick={onPrev} disabled={page <= 1}> <button className="default" onClick={onPrev} disabled={page <= 1}>
Prev Prev
@ -47,7 +47,7 @@ export default function TrackChart() {
ranked ranked
separators separators
data={data} data={data}
className="w-[400px] sm:w-[600px]" className="w-11/12 sm:w-[600px]"
type="track" type="track"
/> />
<div className="flex gap-15 mx-auto"> <div className="flex gap-15 mx-auto">


@@ -10,20 +10,17 @@ import PeriodSelector from "~/components/PeriodSelector";
 import { useAppContext } from "~/providers/AppProvider";
 export function meta({}: Route.MetaArgs) {
-return [
-{ title: "Koito" },
-{ name: "description", content: "Koito" },
-];
+return [{ title: "Koito" }, { name: "description", content: "Koito" }];
 }
 export default function Home() {
-const [period, setPeriod] = useState('week')
+const [period, setPeriod] = useState("week");
 const { homeItems } = useAppContext();
 return (
-<main className="flex flex-grow justify-center pb-4">
-<div className="flex-1 flex flex-col items-center gap-16 min-h-0 mt-20">
+<main className="flex flex-grow justify-center pb-4 w-full bg-linear-to-b to-(--color-bg) from-(--color-bg-secondary) to-60%">
+<div className="flex-1 flex flex-col items-center gap-16 min-h-0 sm:mt-20 mt-10">
 <div className="flex flex-col md:flex-row gap-10 md:gap-20">
 <AllTimeStats />
 <ActivityGrid configurable />
@@ -33,7 +30,10 @@ export default function Home() {
 <TopArtists period={period} limit={homeItems} />
 <TopAlbums period={period} limit={homeItems} />
 <TopTracks period={period} limit={homeItems} />
-<LastPlays showNowPlaying={true} limit={Math.floor(homeItems * 2.7)} />
+<LastPlays
+showNowPlaying={true}
+limit={Math.floor(homeItems * 2.7)}
+/>
 </div>
 </div>
 </main>


@@ -30,6 +30,7 @@ export default function Album() {
 title={album.title}
 img={album.image}
 id={album.id}
+rank={album.all_time_rank}
 musicbrainzId={album.musicbrainz_id}
 imgItemId={album.id}
 mergeFunc={mergeAlbums}
@@ -45,22 +46,22 @@ export default function Album() {
 }}
 subContent={
 <div className="flex flex-col gap-2 items-start">
-{album.listen_count && (
+{album.listen_count !== 0 && (
 <p>
 {album.listen_count} play{album.listen_count > 1 ? "s" : ""}
 </p>
 )}
-{
+{album.time_listened !== 0 && (
 <p title={Math.floor(album.time_listened / 60 / 60) + " hours"}>
 {timeListenedString(album.time_listened)}
 </p>
-}
+)}
-{
+{album.first_listen > 0 && (
 <p title={new Date(album.first_listen * 1000).toLocaleString()}>
 Listening since{" "}
 {new Date(album.first_listen * 1000).toLocaleDateString()}
 </p>
-}
+)}
 </div>
 }
 >


@@ -36,6 +36,7 @@ export default function Artist() {
 title={artist.name}
 img={artist.image}
 id={artist.id}
+rank={artist.all_time_rank}
 musicbrainzId={artist.musicbrainz_id}
 imgItemId={artist.id}
 mergeFunc={mergeArtists}
@@ -56,17 +57,17 @@ export default function Artist() {
 {artist.listen_count} play{artist.listen_count > 1 ? "s" : ""}
 </p>
 )}
-{
+{artist.time_listened !== 0 && (
 <p title={Math.floor(artist.time_listened / 60 / 60) + " hours"}>
 {timeListenedString(artist.time_listened)}
 </p>
-}
+)}
-{
+{artist.first_listen > 0 && (
 <p title={new Date(artist.first_listen * 1000).toLocaleString()}>
 Listening since{" "}
 {new Date(artist.first_listen * 1000).toLocaleDateString()}
 </p>
-}
+)}
 </div>
 }
 >


@ -28,6 +28,7 @@ interface Props {
title: string; title: string;
img: string; img: string;
id: number; id: number;
rank: number;
musicbrainzId: string; musicbrainzId: string;
imgItemId: number; imgItemId: number;
mergeFunc: MergeFunc; mergeFunc: MergeFunc;
@ -96,7 +97,15 @@ export default function MediaLayout(props: Props) {
</div> </div>
<div className="flex flex-col items-start"> <div className="flex flex-col items-start">
<h3>{props.type}</h3> <h3>{props.type}</h3>
<h1>{props.title}</h1> <div className="flex">
<h1>
{props.title}
<span className="text-xl font-medium text-(--color-fg-secondary)">
{" "}
#{props.rank}
</span>
</h1>
</div>
{props.subContent} {props.subContent}
</div> </div>
<div className="absolute left-1 sm:right-1 sm:left-auto -top-9 sm:top-1 flex gap-3 items-center"> <div className="absolute left-1 sm:right-1 sm:left-auto -top-9 sm:top-1 flex gap-3 items-center">


@@ -34,6 +34,7 @@ export default function Track() {
 title={track.title}
 img={track.image}
 id={track.id}
+rank={track.all_time_rank}
 musicbrainzId={track.musicbrainz_id}
 imgItemId={track.album_id}
 mergeFunc={mergeTracks}
@@ -49,23 +50,28 @@ export default function Track() {
 }}
 subContent={
 <div className="flex flex-col gap-2 items-start">
-<Link to={`/album/${track.album_id}`}>appears on {album.title}</Link>
-{track.listen_count && (
+<p>
+Appears on{" "}
+<Link className="hover:underline" to={`/album/${track.album_id}`}>
+{album.title}
+</Link>
+</p>
+{track.listen_count !== 0 && (
 <p>
 {track.listen_count} play{track.listen_count > 1 ? "s" : ""}
 </p>
 )}
-{
+{track.time_listened !== 0 && (
 <p title={Math.floor(track.time_listened / 60 / 60) + " hours"}>
 {timeListenedString(track.time_listened)}
 </p>
-}
+)}
-{
+{track.first_listen > 0 && (
 <p title={new Date(track.first_listen * 1000).toLocaleString()}>
 Listening since{" "}
 {new Date(track.first_listen * 1000).toLocaleDateString()}
 </p>
-}
+)}
 </div>
 }
 >


@@ -29,10 +29,12 @@ const months = [
 export async function clientLoader({ request }: LoaderFunctionArgs) {
 const url = new URL(request.url);
-const year =
-parseInt(url.searchParams.get("year") || "0") || getRewindParams().year;
-const month =
-parseInt(url.searchParams.get("month") || "0") || getRewindParams().month;
+const year = parseInt(
+url.searchParams.get("year") || getRewindParams().year.toString()
+);
+const month = parseInt(
+url.searchParams.get("month") || getRewindParams().month.toString()
+);
 const res = await fetch(`/apis/web/v1/summary?year=${year}&month=${month}`);
 if (!res.ok) {
@@ -46,10 +48,12 @@ export async function clientLoader({ request }: LoaderFunctionArgs) {
 export default function RewindPage() {
 const currentParams = new URLSearchParams(location.search);
-let year =
-parseInt(currentParams.get("year") || "0") || getRewindParams().year;
-let month =
-parseInt(currentParams.get("month") || "0") || getRewindParams().month;
+let year = parseInt(
+currentParams.get("year") || getRewindParams().year.toString()
+);
+let month = parseInt(
+currentParams.get("month") || getRewindParams().month.toString()
+);
 const navigate = useNavigate();
 const [showTime, setShowTime] = useState(false);
 const { stats: stats } = useLoaderData<{ stats: RewindStats }>();
@@ -59,7 +63,7 @@ export default function RewindPage() {
 useEffect(() => {
 if (!stats.top_artists[0]) return;
-const img = (stats.top_artists[0] as any)?.image;
+const img = (stats.top_artists[0] as any)?.item.image;
 if (!img) return;
 average(imageUrl(img, "small"), { amount: 1 }).then((color) => {
@@ -73,10 +77,8 @@ export default function RewindPage() {
 for (const key in params) {
 const val = params[key];
-if (val !== null && val !== "0") {
+if (val !== null) {
 nextParams.set(key, val);
-} else {
-nextParams.delete(key);
 }
 }
@@ -99,6 +101,7 @@ export default function RewindPage() {
 month -= 1;
 }
 }
+console.log(`Month: ${month}`);
 updateParams({
 year: year.toString(),
@@ -128,15 +131,13 @@ export default function RewindPage() {
 transition: "1000",
 }}
 >
-<div className="flex flex-col items-start md:flex-row sm:items-center gap-4">
+<div className="flex flex-col items-start sm:items-center gap-4">
 <title>{pgTitle}</title>
 <meta property="og:title" content={pgTitle} />
 <meta name="description" content={pgTitle} />
-<div className="flex flex-col items-start mt-20 gap-10 w-19/20 px-20">
-{stats !== undefined && (
-<Rewind stats={stats} includeTime={showTime} />
-)}
-<div className="flex flex-col items-center gap-4 py-8">
+<div className="flex flex-col lg:flex-row items-start lg:mt-15 mt-5 gap-10 w-19/20 px-5 md:px-20">
+<div className="flex flex-col items-start gap-4">
+<div className="flex flex-col items-start gap-4 py-8">
 <div className="flex items-center gap-6 justify-around">
 <button
 onClick={() => navigateMonth("prev")}
@@ -156,7 +157,12 @@ export default function RewindPage() {
 <button
 onClick={() => navigateMonth("next")}
 className="p-2 disabled:text-(--color-fg-tertiary)"
-disabled={new Date(year, month) > new Date()}
+disabled={
+// next month is current or future month and
+month >= new Date().getMonth() &&
+// we are looking at current (or future) year
+year >= new Date().getFullYear()
+}
 >
 <ChevronRight size={20} />
 </button>
@@ -197,6 +203,10 @@ export default function RewindPage() {
 ></input>
 </div>
 </div>
+{stats !== undefined && (
+<Rewind stats={stats} includeTime={showTime} />
+)}
+</div>
 </div>
 </div>
 );
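The loader change above swaps `parseInt(x || "0") || fallback` for `parseInt(x || fallback.toString())`. The behavioral difference is narrow but real: the old form also collapsed an explicit `?month=0` or an unparseable value to the fallback, while the new form only falls back when the parameter is absent. A sketch contrasting the two (with a stubbed fallback value, not repo code):

```typescript
const fallback = 6; // stand-in for getRewindParams().month

// Old style: any value that parses to 0 or NaN silently becomes the fallback.
const oldStyle = (raw: string | null) => parseInt(raw || "0") || fallback;

// New style: only a missing parameter falls back; a malformed one yields NaN.
const newStyle = (raw: string | null) => parseInt(raw || fallback.toString());
```

Both agree on well-formed input; they diverge only on malformed or zero values.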


@@ -92,7 +92,7 @@ export const themes: Record<string, Theme> = {
 fg: "#fef9f3",
 fgSecondary: "#dbc6b0",
 fgTertiary: "#a3917a",
-primary: "#d97706",
+primary: "#F0850A",
 primaryDim: "#b45309",
 accent: "#8c4c28",
 accentDim: "#6b3b1f",


@ -0,0 +1,9 @@
-- +goose Up
DELETE FROM artist_releases ar
WHERE NOT EXISTS (
SELECT 1
FROM artist_tracks at
JOIN tracks t ON at.track_id = t.id
WHERE at.artist_id = ar.artist_id
AND t.release_id = ar.release_id
);
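The cleanup migration deletes `artist_releases` rows whose artist has no remaining track on that release. Modelled on toy data in TypeScript (hypothetical ids, purely illustrative of the SQL's keep-condition):

```typescript
type ArtistRelease = { artistId: number; releaseId: number };

// An (artist, release) association survives only if that artist has at
// least one track belonging to that release — the NOT EXISTS predicate.
function surviving(
  ars: ArtistRelease[],
  artistTracks: { artistId: number; trackId: number }[],
  tracks: { id: number; releaseId: number }[]
): ArtistRelease[] {
  const releaseOf = new Map(
    tracks.map((t) => [t.id, t.releaseId] as [number, number])
  );
  return ars.filter((ar) =>
    artistTracks.some(
      (at) =>
        at.artistId === ar.artistId && releaseOf.get(at.trackId) === ar.releaseId
    )
  );
}

// Artist 2 is linked to release 10 but has no track on it, so its row is dropped.
const kept = surviving(
  [{ artistId: 1, releaseId: 10 }, { artistId: 2, releaseId: 10 }],
  [{ artistId: 1, trackId: 100 }],
  [{ id: 100, releaseId: 10 }]
);
```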


@@ -56,22 +56,60 @@ LEFT JOIN artist_aliases aa ON a.id = aa.artist_id
 WHERE a.musicbrainz_id = $1
 GROUP BY a.id, a.musicbrainz_id, a.image, a.image_source, a.name;
+-- name: GetArtistsWithoutImages :many
+SELECT
+*
+FROM artists_with_name
+WHERE image IS NULL
+AND id > $2
+ORDER BY id ASC
+LIMIT $1;
 -- name: GetTopArtistsPaginated :many
 SELECT
+x.id,
+x.name,
+x.musicbrainz_id,
+x.image,
+x.listen_count,
+RANK() OVER (ORDER BY x.listen_count DESC) AS rank
+FROM (
+SELECT
 a.id,
 a.name,
 a.musicbrainz_id,
 a.image,
 COUNT(*) AS listen_count
 FROM listens l
 JOIN tracks t ON l.track_id = t.id
 JOIN artist_tracks at ON at.track_id = t.id
 JOIN artists_with_name a ON a.id = at.artist_id
 WHERE l.listened_at BETWEEN $1 AND $2
-GROUP BY a.id, a.name, a.musicbrainz_id, a.image, a.image_source, a.name
-ORDER BY listen_count DESC, a.id
+GROUP BY a.id, a.name, a.musicbrainz_id, a.image
+) x
+ORDER BY x.listen_count DESC, x.id
 LIMIT $3 OFFSET $4;
+-- name: GetArtistAllTimeRank :one
+SELECT
+artist_id,
+rank
+FROM (
+SELECT
+x.artist_id,
+RANK() OVER (ORDER BY x.listen_count DESC) AS rank
+FROM (
+SELECT
+at.artist_id,
+COUNT(*) AS listen_count
+FROM listens l
+JOIN tracks t ON l.track_id = t.id
+JOIN artist_tracks at ON t.id = at.track_id
+GROUP BY at.artist_id
+) x
+)
+WHERE artist_id = $1;
 -- name: CountTopArtists :one
 SELECT COUNT(DISTINCT at.artist_id) AS total_count
 FROM listens l
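Both new queries rely on `RANK() OVER (ORDER BY listen_count DESC)`: artists with tied listen counts share a rank, and the next distinct count skips ahead by the number of ties. A small TypeScript sketch of that window-function semantics (illustrative only, not repo code):

```typescript
// Mimics SQL RANK() over a list of listen counts already sorted descending:
// ties share a rank, and gaps follow tie groups (unlike DENSE_RANK).
function sqlRank(countsDesc: number[]): number[] {
  const ranks: number[] = [];
  for (let i = 0; i < countsDesc.length; i++) {
    if (i > 0 && countsDesc[i] === countsDesc[i - 1]) {
      ranks.push(ranks[i - 1]); // tie: reuse the previous rank
    } else {
      ranks.push(i + 1); // 1-based position, leaving a gap after ties
    }
  }
  return ranks;
}
```

For counts `[10, 10, 7]` this yields ranks `[1, 1, 3]`: the artist with 7 plays is third, not second.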


@@ -3,7 +3,13 @@ DO $$
 BEGIN
 DELETE FROM tracks WHERE id NOT IN (SELECT l.track_id FROM listens l);
 DELETE FROM releases WHERE id NOT IN (SELECT t.release_id FROM tracks t);
--- DELETE FROM releases WHERE release_group_id NOT IN (SELECT t.release_group_id FROM tracks t);
--- DELETE FROM releases WHERE release_group_id NOT IN (SELECT rg.id FROM release_groups rg);
 DELETE FROM artists WHERE id NOT IN (SELECT at.artist_id FROM artist_tracks at);
+DELETE FROM artist_releases ar
+WHERE NOT EXISTS (
+SELECT 1
+FROM artist_tracks at
+JOIN tracks t ON at.track_id = t.id
+WHERE at.artist_id = ar.artist_id
+AND t.release_id = ar.release_id
+);
 END $$;


@@ -1,162 +1,139 @@
 -- name: GetGroupedListensFromArtist :many
-WITH artist_listens AS (
+WITH bounds AS (
 SELECT
-l.listened_at
+MIN(l.listened_at) AS start_time,
+NOW() AS end_time
 FROM listens l
 JOIN tracks t ON t.id = l.track_id
 JOIN artist_tracks at ON at.track_id = t.id
 WHERE at.artist_id = $1
 ),
-bounds AS (
+stats AS (
 SELECT
-MIN(listened_at) AS start_time,
-MAX(listened_at) AS end_time
-FROM artist_listens
+start_time,
+end_time,
+EXTRACT(EPOCH FROM (end_time - start_time)) AS total_seconds,
+((end_time - start_time) / sqlc.arg(bucket_count)::int) AS bucket_interval
+FROM bounds
 ),
-bucketed AS (
+bucket_series AS (
+SELECT generate_series(0, sqlc.arg(bucket_count)::int - 1) AS idx
+),
+listen_indices AS (
 SELECT
 LEAST(
-sqlc.arg(bucket_count) - 1,
+sqlc.arg(bucket_count)::int - 1,
 FLOOR(
-(
-EXTRACT(EPOCH FROM (al.listened_at - b.start_time))
-/
-NULLIF(EXTRACT(EPOCH FROM (b.end_time - b.start_time)), 0)
-) * sqlc.arg(bucket_count)
+(EXTRACT(EPOCH FROM (l.listened_at - s.start_time)) / NULLIF(s.total_seconds, 0))
+* sqlc.arg(bucket_count)::int
 )::int
-) AS bucket_idx,
-b.start_time,
-b.end_time
-FROM artist_listens al
-CROSS JOIN bounds b
-),
-aggregated AS (
-SELECT
-start_time
-+ (
-bucket_idx * (end_time - start_time)
-/ sqlc.arg(bucket_count)
-) AS bucket_start,
-start_time
-+ (
-(bucket_idx + 1) * (end_time - start_time)
-/ sqlc.arg(bucket_count)
-) AS bucket_end,
-COUNT(*) AS listen_count
-FROM bucketed
-GROUP BY bucket_idx, start_time, end_time
+) AS bucket_idx
+FROM listens l
+JOIN tracks t ON t.id = l.track_id
+JOIN artist_tracks at ON at.track_id = t.id
+CROSS JOIN stats s
+WHERE at.artist_id = $1
+AND s.start_time IS NOT NULL
 )
 SELECT
-bucket_start::timestamptz,
-bucket_end::timestamptz,
-listen_count
-FROM aggregated
-ORDER BY bucket_start;
+(s.start_time + (s.bucket_interval * bs.idx))::timestamptz AS bucket_start,
+(s.start_time + (s.bucket_interval * (bs.idx + 1)))::timestamptz AS bucket_end,
+COUNT(li.bucket_idx) AS listen_count
+FROM bucket_series bs
+CROSS JOIN stats s
+LEFT JOIN listen_indices li ON bs.idx = li.bucket_idx
+WHERE s.start_time IS NOT NULL
+GROUP BY bs.idx, s.start_time, s.bucket_interval
+ORDER BY bs.idx;
 -- name: GetGroupedListensFromRelease :many
-WITH artist_listens AS (
+WITH bounds AS (
 SELECT
-l.listened_at
+MIN(l.listened_at) AS start_time,
+NOW() AS end_time
 FROM listens l
 JOIN tracks t ON t.id = l.track_id
 WHERE t.release_id = $1
 ),
-bounds AS (
+stats AS (
 SELECT
-MIN(listened_at) AS start_time,
-MAX(listened_at) AS end_time
-FROM artist_listens
+start_time,
+end_time,
+EXTRACT(EPOCH FROM (end_time - start_time)) AS total_seconds,
+((end_time - start_time) / sqlc.arg(bucket_count)::int) AS bucket_interval
+FROM bounds
 ),
-bucketed AS (
+bucket_series AS (
+SELECT generate_series(0, sqlc.arg(bucket_count)::int - 1) AS idx
+),
+listen_indices AS (
 SELECT
 LEAST(
-sqlc.arg(bucket_count) - 1,
+sqlc.arg(bucket_count)::int - 1,
 FLOOR(
-(
-EXTRACT(EPOCH FROM (al.listened_at - b.start_time))
-/
-NULLIF(EXTRACT(EPOCH FROM (b.end_time - b.start_time)), 0)
-) * sqlc.arg(bucket_count)
+(EXTRACT(EPOCH FROM (l.listened_at - s.start_time)) / NULLIF(s.total_seconds, 0))
+* sqlc.arg(bucket_count)::int
 )::int
-) AS bucket_idx,
-b.start_time,
-b.end_time
-FROM artist_listens al
-CROSS JOIN bounds b
-),
-aggregated AS (
-SELECT
-start_time
-+ (
-bucket_idx * (end_time - start_time)
-/ sqlc.arg(bucket_count)
-) AS bucket_start,
-start_time
-+ (
-(bucket_idx + 1) * (end_time - start_time)
-/ sqlc.arg(bucket_count)
-) AS bucket_end,
-COUNT(*) AS listen_count
-FROM bucketed
-GROUP BY bucket_idx, start_time, end_time
+) AS bucket_idx
+FROM listens l
+JOIN tracks t ON t.id = l.track_id
+CROSS JOIN stats s
+WHERE t.release_id = $1
+AND s.start_time IS NOT NULL
 )
 SELECT
-bucket_start::timestamptz,
-bucket_end::timestamptz,
-listen_count
-FROM aggregated
-ORDER BY bucket_start;
+(s.start_time + (s.bucket_interval * bs.idx))::timestamptz AS bucket_start,
+(s.start_time + (s.bucket_interval * (bs.idx + 1)))::timestamptz AS bucket_end,
+COUNT(li.bucket_idx) AS listen_count
+FROM bucket_series bs
+CROSS JOIN stats s
+LEFT JOIN listen_indices li ON bs.idx = li.bucket_idx
+WHERE s.start_time IS NOT NULL
+GROUP BY bs.idx, s.start_time, s.bucket_interval
+ORDER BY bs.idx;
 -- name: GetGroupedListensFromTrack :many
-WITH artist_listens AS (
+WITH bounds AS (
 SELECT
-l.listened_at
+MIN(l.listened_at) AS start_time,
+NOW() AS end_time
 FROM listens l
 JOIN tracks t ON t.id = l.track_id
 WHERE t.id = $1
 ),
-bounds AS (
+stats AS (
 SELECT
-MIN(listened_at) AS start_time,
-MAX(listened_at) AS end_time
-FROM artist_listens
+start_time,
+end_time,
+EXTRACT(EPOCH FROM (end_time - start_time)) AS total_seconds,
+((end_time - start_time) / sqlc.arg(bucket_count)::int) AS bucket_interval
+FROM bounds
 ),
-bucketed AS (
+bucket_series AS (
+SELECT generate_series(0, sqlc.arg(bucket_count)::int - 1) AS idx
+),
+listen_indices AS (
 SELECT
 LEAST(
-sqlc.arg(bucket_count) - 1,
+sqlc.arg(bucket_count)::int - 1,
 FLOOR(
-(
-EXTRACT(EPOCH FROM (al.listened_at - b.start_time))
-/
-NULLIF(EXTRACT(EPOCH FROM (b.end_time - b.start_time)), 0)
-) * sqlc.arg(bucket_count)
+(EXTRACT(EPOCH FROM (l.listened_at - s.start_time)) / NULLIF(s.total_seconds, 0))
+* sqlc.arg(bucket_count)::int
 )::int
-) AS bucket_idx,
-b.start_time,
-b.end_time
-FROM artist_listens al
-CROSS JOIN bounds b
-),
-aggregated AS (
-SELECT
-start_time
-+ (
-bucket_idx * (end_time - start_time)
-/ sqlc.arg(bucket_count)
-) AS bucket_start,
-start_time
-+ (
-(bucket_idx + 1) * (end_time - start_time)
-/ sqlc.arg(bucket_count)
-) AS bucket_end,
-COUNT(*) AS listen_count
-FROM bucketed
-GROUP BY bucket_idx, start_time, end_time
+) AS bucket_idx
+FROM listens l
+JOIN tracks t ON t.id = l.track_id
+CROSS JOIN stats s
+WHERE t.id = $1
+AND s.start_time IS NOT NULL
 )
 SELECT
-bucket_start::timestamptz,
-bucket_end::timestamptz,
-listen_count
-FROM aggregated
-ORDER BY bucket_start;
+(s.start_time + (s.bucket_interval * bs.idx))::timestamptz AS bucket_start,
+(s.start_time + (s.bucket_interval * (bs.idx + 1)))::timestamptz AS bucket_end,
+COUNT(li.bucket_idx) AS listen_count
+FROM bucket_series bs
+CROSS JOIN stats s
+LEFT JOIN listen_indices li ON bs.idx = li.bucket_idx
+WHERE s.start_time IS NOT NULL
+GROUP BY bs.idx, s.start_time, s.bucket_interval
+ORDER BY bs.idx;


@ -47,32 +47,61 @@ WHERE r.title = ANY ($1::TEXT[])
-- name: GetTopReleasesFromArtist :many
SELECT
    x.*,
    get_artists_for_release(x.id) AS artists,
    RANK() OVER (ORDER BY x.listen_count DESC) AS rank
FROM (
    SELECT
        r.*,
        COUNT(*) AS listen_count
    FROM listens l
    JOIN tracks t ON l.track_id = t.id
    JOIN releases_with_title r ON t.release_id = r.id
    JOIN artist_releases ar ON r.id = ar.release_id
    WHERE ar.artist_id = $5
        AND l.listened_at BETWEEN $1 AND $2
    GROUP BY r.id, r.title, r.musicbrainz_id, r.various_artists, r.image, r.image_source
) x
ORDER BY listen_count DESC, x.id
LIMIT $3 OFFSET $4;
-- name: GetTopReleasesPaginated :many
SELECT
    x.*,
    get_artists_for_release(x.id) AS artists,
    RANK() OVER (ORDER BY x.listen_count DESC) AS rank
FROM (
    SELECT
        r.*,
        COUNT(*) AS listen_count
    FROM listens l
    JOIN tracks t ON l.track_id = t.id
    JOIN releases_with_title r ON t.release_id = r.id
    WHERE l.listened_at BETWEEN $1 AND $2
    GROUP BY r.id, r.title, r.musicbrainz_id, r.various_artists, r.image, r.image_source
) x
ORDER BY listen_count DESC, x.id
LIMIT $3 OFFSET $4;
-- name: GetReleaseAllTimeRank :one
SELECT
release_id,
rank
FROM (
SELECT
x.release_id,
RANK() OVER (ORDER BY x.listen_count DESC) AS rank
FROM (
SELECT
t.release_id,
COUNT(*) AS listen_count
FROM listens l
JOIN tracks t ON l.track_id = t.id
GROUP BY t.release_id
) x
)
WHERE release_id = $1;
-- name: CountTopReleases :one
SELECT COUNT(DISTINCT r.id) AS total_count
FROM listens l


@ -39,57 +39,100 @@ HAVING COUNT(DISTINCT at.artist_id) = cardinality($3::int[]);
-- name: GetTopTracksPaginated :many
SELECT
    x.track_id AS id,
    t.title,
    t.musicbrainz_id,
    t.release_id,
    r.image,
    x.listen_count,
    get_artists_for_track(x.track_id) AS artists,
    x.rank
FROM (
    SELECT
        track_id,
        COUNT(*) AS listen_count,
        RANK() OVER (ORDER BY COUNT(*) DESC) AS rank
    FROM listens
    WHERE listened_at BETWEEN $1 AND $2
    GROUP BY track_id
    ORDER BY listen_count DESC
    LIMIT $3 OFFSET $4
) x
JOIN tracks_with_title t ON x.track_id = t.id
JOIN releases r ON t.release_id = r.id
ORDER BY x.listen_count DESC, x.track_id;
-- name: GetTopTracksByArtistPaginated :many
SELECT
    x.track_id AS id,
    t.title,
    t.musicbrainz_id,
    t.release_id,
    r.image,
    x.listen_count,
    get_artists_for_track(x.track_id) AS artists,
    x.rank
FROM (
    SELECT
        l.track_id,
        COUNT(*) AS listen_count,
        RANK() OVER (ORDER BY COUNT(*) DESC) AS rank
    FROM listens l
    JOIN artist_tracks at ON l.track_id = at.track_id
    WHERE l.listened_at BETWEEN $1 AND $2
        AND at.artist_id = $5
    GROUP BY l.track_id
    ORDER BY listen_count DESC
    LIMIT $3 OFFSET $4
) x
JOIN tracks_with_title t ON x.track_id = t.id
JOIN releases r ON t.release_id = r.id
ORDER BY x.listen_count DESC, x.track_id;
-- name: GetTopTracksInReleasePaginated :many
SELECT
    x.track_id AS id,
    t.title,
    t.musicbrainz_id,
    t.release_id,
    r.image,
    x.listen_count,
    get_artists_for_track(x.track_id) AS artists,
    x.rank
FROM (
    SELECT
        l.track_id,
        COUNT(*) AS listen_count,
        RANK() OVER (ORDER BY COUNT(*) DESC) AS rank
    FROM listens l
    JOIN tracks t ON l.track_id = t.id
    WHERE l.listened_at BETWEEN $1 AND $2
        AND t.release_id = $5
    GROUP BY l.track_id
    ORDER BY listen_count DESC
    LIMIT $3 OFFSET $4
) x
JOIN tracks_with_title t ON x.track_id = t.id
JOIN releases r ON t.release_id = r.id
ORDER BY x.listen_count DESC, x.track_id;
-- name: GetTrackAllTimeRank :one
SELECT
id,
rank
FROM (
SELECT
x.id,
RANK() OVER (ORDER BY x.listen_count DESC) AS rank
FROM (
SELECT
t.id,
COUNT(*) AS listen_count
FROM listens l
JOIN tracks_with_title t ON l.track_id = t.id
GROUP BY t.id
) x
) y
WHERE id = $1;
-- name: CountTopTracks :one
SELECT COUNT(DISTINCT l.track_id) AS total_count
@ -137,3 +180,13 @@ WHERE artist_id = $1 AND track_id = $2;
-- name: DeleteTrack :exec
DELETE FROM tracks WHERE id = $1;
-- name: GetTracksWithNoDurationButHaveMbzID :many
SELECT
*
FROM tracks_with_title
WHERE duration = 0
AND musicbrainz_id IS NOT NULL
AND id > $2
ORDER BY id ASC
LIMIT $1;


@ -1,8 +1,8 @@
// @ts-check
import { defineConfig } from "astro/config";
import starlight from "@astrojs/starlight";
import tailwindcss from "@tailwindcss/vite";
// https://astro.build/config // https://astro.build/config
export default defineConfig({ export default defineConfig({
@ -10,41 +10,53 @@ export default defineConfig({
    starlight({
      head: [
        {
          tag: "script",
          attrs: {
            src: "https://static.cloudflareinsights.com/beacon.min.js",
            "data-cf-beacon": '{"token": "1948caaaba10463fa1d310ee02b0951c"}',
            defer: true,
          },
        },
      ],
      title: "Koito",
      logo: {
        src: "./src/assets/logo_text.png",
        replacesTitle: true,
      },
      social: [
        {
          icon: "github",
          label: "GitHub",
          href: "https://github.com/gabehf/koito",
        },
      ],
      sidebar: [
        {
          label: "Guides",
          items: [
            // Each item here is one entry in the navigation menu.
            { label: "Installation", slug: "guides/installation" },
            { label: "Importing Data", slug: "guides/importing" },
            { label: "Setting up the Scrobbler", slug: "guides/scrobbler" },
            { label: "Editing Data", slug: "guides/editing" },
          ],
        },
        {
          label: "Quickstart",
          items: [
            { label: "Setup with Navidrome", slug: "quickstart/navidrome" },
          ],
        },
        {
          label: "Reference",
          items: [
            { label: "Configuration Options", slug: "reference/configuration" },
          ],
        },
      ],
      customCss: [
        // Path to your Tailwind base styles:
        "./src/styles/global.css",
      ],
    }),
  ],

Binary file not shown (new image, 178 KiB).


@ -28,7 +28,7 @@ import { Card, CardGrid } from '@astrojs/starlight/components';
Koito can be connected to any music server or client that allows for custom ListenBrainz URLs.
</Card>
<Card title="Scrobbler relay" icon="rocket">
Automatically relay listens submitted to your Koito instance to other ListenBrainz compatible servers.
</Card>
<Card title="Automatic data fetching" icon="download">
Koito automatically fetches data from MusicBrainz and images from Deezer and Cover Art Archive to complement what is provided by your music server.


@ -0,0 +1,68 @@
---
title: Navidrome Quickstart
description: How to set up Koito to work with your Navidrome instance.
---
## Configure Koito
This quickstart assumes you are using Docker Compose. Below is an example file, adapted from the one I use personally.
```yaml title="compose.yaml"
services:
koito:
image: gabehf/koito:latest
container_name: koito
depends_on:
- db
user: 1000:1000
environment:
- KOITO_DATABASE_URL=postgres://postgres:<a_super_random_string>@db:5432/koitodb
- KOITO_ALLOWED_HOSTS=koito.mydomain.com,192.168.1.100
- KOITO_SUBSONIC_URL=https://navidrome.mydomain.com # the url to your navidrome instance
- KOITO_SUBSONIC_PARAMS=u=<navidrome_username>&t=<navidrome_token>&s=<navidrome_salt>
- KOITO_DEFAULT_THEME=black # i like this theme, use whatever you want
ports:
- "4110:4110"
volumes:
- ./koito-data:/etc/koito
restart: unless-stopped
db:
user: 1000:1000
image: postgres:16
container_name: psql
restart: unless-stopped
environment:
POSTGRES_DB: koitodb
POSTGRES_USER: postgres
POSTGRES_PASSWORD: <a_super_random_string>
volumes:
- ./db-data:/var/lib/postgresql/data
```
### How do I get the Subsonic params?
The easiest way to get your Subsonic parameters is to open your browser and sign into Navidrome, then press F12 to open
the developer tools and navigate to the **Network** tab. Find a `getCoverArt` request (there should be a lot on the home
page) and look for the part of the URL that looks like `u=<username>&t=<random_string>&s=<small_random_string>`. This
is what you need to copy and provide to Koito.
:::note
If you don't want to use Navidrome to provide images to Koito, you can skip the `KOITO_SUBSONIC_URL` and `KOITO_SUBSONIC_PARAMS`
variables entirely.
:::
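Koito validates these parameters itself at startup by calling the Subsonic ping endpoint. If you want to sanity-check them first, you can build the same request yourself; a small sketch of how that URL is assembled (`pingURL` and the placeholder credentials are illustrative, not part of Koito):

```go
package main

import "fmt"

// pingURL assembles the Subsonic ping request Koito issues at startup to
// validate KOITO_SUBSONIC_URL and KOITO_SUBSONIC_PARAMS.
func pingURL(base, params string) string {
	return base + "/rest/ping.view?" + params + "&f=json&v=1&c=koito"
}

func main() {
	// placeholder values; substitute your server URL and the u/t/s values you copied
	fmt.Println(pingURL("https://navidrome.mydomain.com", "u=alice&t=abc123&s=xyz"))
}
```

Opening the printed URL in a browser should return a JSON body whose `subsonic-response.status` field is `ok` when the credentials are valid.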
## Configure Navidrome
You have to provide Navidrome with the environment variables `ND_LISTENBRAINZ_ENABLED=true` and
`ND_LISTENBRAINZ_BASEURL=<your_koito_url>/apis/listenbrainz/1`. The place where you edit these environment variables will change
depending on how you have chosen to deploy Navidrome.
## Enable ListenBrainz in Navidrome
In Navidrome, click on **Settings** in the top right, then click **Personal**.
Here, you will see that **Scrobble to ListenBrainz** is turned off. Flip that switch on.
![navidrome listenbrainz switch screenshot](../../../assets/navidrome_lbz_switch.png)
When you flip it on, Navidrome will prompt you for a ListenBrainz token. To get this token, open your Koito page and sign in.
Press the settings button (or hit `\`) and go to the **API Keys** tab. Copy the autogenerated API key by either clicking the
copy button, or clicking on the key itself and copying with ctrl+c.
After hitting **Save** in Navidrome, your listen activity will start being sent to Koito as you listen to tracks.
Happy scrobbling!

View file

@ -64,6 +64,8 @@ If the environment variable is defined without **and** with the suffix at the sa
##### KOITO_CONFIG_DIR ##### KOITO_CONFIG_DIR
- Default: `/etc/koito` - Default: `/etc/koito`
- Description: The location where import folders and image caches are stored. - Description: The location where import folders and image caches are stored.
##### KOITO_FORCE_TZ
- Description: A canonical [IANA time zone database](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones) name that Koito will use to serve all clients. Overrides any time zone requested via a `tz` cookie or `tz` query parameter. Koito will fail to start if this value is invalid.
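A quick way to check whether a value is acceptable is to see whether Go's `time.LoadLocation` can resolve it, which is essentially the validation described above (a sketch, not Koito's actual startup code; `validTZ` is an illustrative name):

```go
package main

import (
	"fmt"
	"time"
	_ "time/tzdata" // embed the IANA database so lookups work in minimal containers
)

// validTZ reports whether name is a loadable IANA time zone name, the same
// kind of check that makes Koito refuse to start on a bad KOITO_FORCE_TZ.
func validTZ(name string) bool {
	_, err := time.LoadLocation(name)
	return err == nil
}

func main() {
	fmt.Println(validTZ("America/New_York"))  // true
	fmt.Println(validTZ("Mars/Olympus_Mons")) // false
}
```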
##### KOITO_DISABLE_DEEZER ##### KOITO_DISABLE_DEEZER
- Default: `false` - Default: `false`
- Description: Disables Deezer as a source for finding artist and album images. - Description: Disables Deezer as a source for finding artist and album images.
@ -78,6 +80,13 @@ If the environment variable is defined without **and** with the suffix at the sa
##### KOITO_SUBSONIC_PARAMS ##### KOITO_SUBSONIC_PARAMS
- Required: `true` if KOITO_SUBSONIC_URL is set - Required: `true` if KOITO_SUBSONIC_URL is set
- Description: The `u`, `t`, and `s` authentication parameters to use for authenticated requests to your subsonic server, in the format `u=XXX&t=XXX&s=XXX`. An easy way to find them is to open the network tab in the developer tools of your browser of choice and copy them from a request. - Description: The `u`, `t`, and `s` authentication parameters to use for authenticated requests to your subsonic server, in the format `u=XXX&t=XXX&s=XXX`. An easy way to find them is to open the network tab in the developer tools of your browser of choice and copy them from a request.
:::caution
If Koito is unable to validate your Subsonic configuration, it will fail to start. If you notice your container isn't running after
changing these parameters, check the logs!
:::
##### KOITO_LASTFM_API_KEY
- Required: `false`
- Description: Your LastFM API key, which will be used for fetching images if provided. You can get an API key [here](https://www.last.fm/api/authentication).
##### KOITO_SKIP_IMPORT ##### KOITO_SKIP_IMPORT
- Default: `false` - Default: `false`
- Description: Skips running the importer on startup. - Description: Skips running the importer on startup.


@ -2,6 +2,7 @@ package engine
import (
	"context"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
@ -95,6 +96,10 @@ func Run(
	defer store.Close(ctx)
	l.Info().Msg("Engine: Database connection established")

	if cfg.ForceTZ() != nil {
		l.Debug().Msgf("Engine: Forcing the use of timezone '%s'", cfg.ForceTZ().String())
	}

	l.Debug().Msg("Engine: Initializing MusicBrainz client")
	var mbzC mbz.MusicBrainzCaller
	if !cfg.MusicBrainzDisabled() {
@ -105,12 +110,39 @@ func Run(
		l.Warn().Msg("Engine: MusicBrainz client disabled")
	}

	if cfg.SubsonicEnabled() {
		l.Debug().Msg("Engine: Checking Subsonic configuration")
		pingURL := cfg.SubsonicUrl() + "/rest/ping.view?" + cfg.SubsonicParams() + "&f=json&v=1&c=koito"
		resp, err := http.Get(pingURL)
		if err != nil {
			l.Fatal().Err(err).Msg("Engine: Failed to contact Subsonic server! Ensure the provided URL is correct")
		} else {
			defer resp.Body.Close()
			var result struct {
				Response struct {
					Status string `json:"status"`
				} `json:"subsonic-response"`
			}
			if err := json.NewDecoder(resp.Body).Decode(&result); err != nil {
				l.Fatal().Err(err).Msg("Engine: Failed to parse Subsonic response")
			} else if result.Response.Status != "ok" {
				l.Fatal().Msg("Engine: Provided Subsonic credentials are invalid")
			} else {
				l.Info().Msg("Engine: Subsonic credentials validated successfully")
			}
		}
	}

	l.Debug().Msg("Engine: Initializing image sources")
	images.Initialize(images.ImageSourceOpts{
		UserAgent:      cfg.UserAgent(),
		EnableCAA:      !cfg.CoverArtArchiveDisabled(),
		EnableDeezer:   !cfg.DeezerDisabled(),
		EnableSubsonic: cfg.SubsonicEnabled(),
		EnableLastFM:   cfg.LastFMApiKey() != "",
	})
	l.Info().Msg("Engine: Image sources initialized")
@ -184,6 +216,8 @@ func Run(
		}
	}()

	l.Info().Msg("Engine: Beginning startup tasks...")

	l.Debug().Msg("Engine: Checking import configuration")
	if !cfg.SkipImport() {
		go func() {
@ -191,16 +225,14 @@ func Run(
		}()
	}

	l.Info().Msg("Engine: Pruning orphaned images")
	go catalog.PruneOrphanedImages(logger.NewContext(l), store)

	l.Info().Msg("Engine: Running duration backfill task")
	go catalog.BackfillTrackDurationsFromMusicBrainz(ctx, store, mbzC)

	l.Info().Msg("Engine: Attempting to fetch missing artist images")
	go catalog.FetchMissingArtistImages(ctx, store)

	l.Info().Msg("Engine: Attempting to fetch missing album images")
	go catalog.FetchMissingAlbumImages(ctx, store)

	l.Info().Msg("Engine: Initialization finished")

	quit := make(chan os.Signal, 1)
@ -221,19 +253,19 @@ func Run(
	}
}

func RunImporter(l *zerolog.Logger, store db.DB, mbzc mbz.MusicBrainzCaller) {
	l.Debug().Msg("Importer: Checking for import files...")
	files, err := os.ReadDir(path.Join(cfg.ConfigDir(), "import"))
	if err != nil {
		l.Err(err).Msg("Importer: Failed to read files from import dir")
	}
	if len(files) > 0 {
		l.Info().Msg("Importer: Files found in import directory. Attempting to import...")
	} else {
		return
	}
	defer func() {
		if r := recover(); r != nil {
			l.Error().Interface("recover", r).Msg("Importer: Panic when importing files")
		}
	}()
	for _, file := range files {
@ -241,37 +273,37 @@ func RunImporter(l *zerolog.Logger, store db.DB, mbzc mbz.MusicBrainzCaller) {
			continue
		}
		if strings.Contains(file.Name(), "Streaming_History_Audio") {
			l.Info().Msgf("Importer: Import file %s detecting as being Spotify export", file.Name())
			err := importer.ImportSpotifyFile(logger.NewContext(l), store, file.Name())
			if err != nil {
				l.Err(err).Msgf("Importer: Failed to import file: %s", file.Name())
			}
		} else if strings.Contains(file.Name(), "maloja") {
			l.Info().Msgf("Importer: Import file %s detecting as being Maloja export", file.Name())
			err := importer.ImportMalojaFile(logger.NewContext(l), store, file.Name())
			if err != nil {
				l.Err(err).Msgf("Importer: Failed to import file: %s", file.Name())
			}
		} else if strings.Contains(file.Name(), "recenttracks") {
			l.Info().Msgf("Importer: Import file %s detecting as being ghan.nl LastFM export", file.Name())
			err := importer.ImportLastFMFile(logger.NewContext(l), store, mbzc, file.Name())
			if err != nil {
				l.Err(err).Msgf("Importer: Failed to import file: %s", file.Name())
			}
		} else if strings.Contains(file.Name(), "listenbrainz") {
			l.Info().Msgf("Importer: Import file %s detecting as being ListenBrainz export", file.Name())
			err := importer.ImportListenBrainzExport(logger.NewContext(l), store, mbzc, file.Name())
			if err != nil {
				l.Err(err).Msgf("Importer: Failed to import file: %s", file.Name())
			}
		} else if strings.Contains(file.Name(), "koito") {
			l.Info().Msgf("Importer: Import file %s detecting as being Koito export", file.Name())
			err := importer.ImportKoitoFile(logger.NewContext(l), store, file.Name())
			if err != nil {
				l.Err(err).Msgf("Importer: Failed to import file: %s", file.Name())
			}
		} else {
			l.Warn().Msgf("Importer: File %s not recognized as a valid import file; make sure it is valid and named correctly", file.Name())
		}
	}
}


@ -106,7 +106,7 @@ func GetListenActivityHandler(store db.DB) func(w http.ResponseWriter, r *http.R
		return
	}

	activity = processActivity(activity, opts)
	l.Debug().Msg("GetListenActivityHandler: Successfully retrieved listen activity")
	utils.WriteJSON(w, http.StatusOK, activity)
@ -114,34 +114,55 @@ func GetListenActivityHandler(store db.DB) func(w http.ResponseWriter, r *http.R
}

// ngl i hate this
func processActivity(
	items []db.ListenActivityItem,
	opts db.ListenActivityOpts,
) []db.ListenActivityItem {
	from, to := db.ListenActivityOptsToTimes(opts)

	buckets := make(map[string]int64)
	for _, item := range items {
		bucketStart := normalizeToStep(item.Start, opts.Step)
		key := bucketStart.Format("2006-01-02")
		buckets[key] += item.Listens
	}

	var result []db.ListenActivityItem
	for t := normalizeToStep(from, opts.Step); t.Before(to); t = addStep(t, opts.Step) {
		key := t.Format("2006-01-02")
		result = append(result, db.ListenActivityItem{
			Start:   t,
			Listens: buckets[key],
		})
	}
	return result
}
func normalizeToStep(t time.Time, step db.StepInterval) time.Time {
switch step {
case db.StepDay:
return time.Date(t.Year(), t.Month(), t.Day(), 0, 0, 0, 0, t.Location())
case db.StepWeek:
weekday := int(t.Weekday())
if weekday == 0 {
weekday = 7
}
start := t.AddDate(0, 0, -(weekday - 1))
return time.Date(start.Year(), start.Month(), start.Day(), 0, 0, 0, 0, t.Location())
case db.StepMonth:
return time.Date(t.Year(), t.Month(), 1, 0, 0, 0, 0, t.Location())
default:
return t
}
}
func addStep(t time.Time, step db.StepInterval) time.Time {
	switch step {
	case db.StepDay:


@ -6,7 +6,9 @@ import (
"strconv" "strconv"
"strings" "strings"
"time" "time"
_ "time/tzdata"
"github.com/gabehf/koito/internal/cfg"
"github.com/gabehf/koito/internal/db" "github.com/gabehf/koito/internal/db"
"github.com/gabehf/koito/internal/logger" "github.com/gabehf/koito/internal/logger"
) )
@ -107,14 +109,143 @@ func TimeframeFromRequest(r *http.Request) db.Timeframe {
func parseTZ(r *http.Request) *time.Location {
// this map is obviously AI.
// i manually referenced as many links as I could and couldn't find any
// incorrect entries here so hopefully it is all correct.
overrides := map[string]string{
// --- North America ---
"America/Indianapolis": "America/Indiana/Indianapolis",
"America/Knoxville": "America/Indiana/Knoxville",
"America/Louisville": "America/Kentucky/Louisville",
"America/Montreal": "America/Toronto",
"America/Shiprock": "America/Denver",
"America/Fort_Wayne": "America/Indiana/Indianapolis",
"America/Virgin": "America/Port_of_Spain",
"America/Santa_Isabel": "America/Tijuana",
"America/Ensenada": "America/Tijuana",
"America/Rosario": "America/Argentina/Cordoba",
"America/Jujuy": "America/Argentina/Jujuy",
"America/Mendoza": "America/Argentina/Mendoza",
"America/Catamarca": "America/Argentina/Catamarca",
"America/Cordoba": "America/Argentina/Cordoba",
"America/Buenos_Aires": "America/Argentina/Buenos_Aires",
"America/Coral_Harbour": "America/Atikokan",
"America/Atka": "America/Adak",
"US/Alaska": "America/Anchorage",
"US/Aleutian": "America/Adak",
"US/Arizona": "America/Phoenix",
"US/Central": "America/Chicago",
"US/Eastern": "America/New_York",
"US/East-Indiana": "America/Indiana/Indianapolis",
"US/Hawaii": "Pacific/Honolulu",
"US/Indiana-Starke": "America/Indiana/Knoxville",
"US/Michigan": "America/Detroit",
"US/Mountain": "America/Denver",
"US/Pacific": "America/Los_Angeles",
"US/Samoa": "Pacific/Pago_Pago",
"Canada/Atlantic": "America/Halifax",
"Canada/Central": "America/Winnipeg",
"Canada/Eastern": "America/Toronto",
"Canada/Mountain": "America/Edmonton",
"Canada/Newfoundland": "America/St_Johns",
"Canada/Pacific": "America/Vancouver",
// --- Asia ---
"Asia/Calcutta": "Asia/Kolkata",
"Asia/Saigon": "Asia/Ho_Chi_Minh",
"Asia/Katmandu": "Asia/Kathmandu",
"Asia/Rangoon": "Asia/Yangon",
"Asia/Ulan_Bator": "Asia/Ulaanbaatar",
"Asia/Macao": "Asia/Macau",
"Asia/Tel_Aviv": "Asia/Jerusalem",
"Asia/Ashkhabad": "Asia/Ashgabat",
"Asia/Chungking": "Asia/Chongqing",
"Asia/Dacca": "Asia/Dhaka",
"Asia/Istanbul": "Europe/Istanbul",
"Asia/Kashgar": "Asia/Urumqi",
"Asia/Thimbu": "Asia/Thimphu",
"Asia/Ujung_Pandang": "Asia/Makassar",
"ROC": "Asia/Taipei",
"Iran": "Asia/Tehran",
"Israel": "Asia/Jerusalem",
"Japan": "Asia/Tokyo",
"Singapore": "Asia/Singapore",
"Hongkong": "Asia/Hong_Kong",
// --- Europe ---
"Europe/Kiev": "Europe/Kyiv",
"Europe/Belfast": "Europe/London",
"Europe/Tiraspol": "Europe/Chisinau",
"Europe/Nicosia": "Asia/Nicosia",
"Europe/Moscow": "Europe/Moscow",
"W-SU": "Europe/Moscow",
"GB": "Europe/London",
"GB-Eire": "Europe/London",
"Eire": "Europe/Dublin",
"Poland": "Europe/Warsaw",
"Portugal": "Europe/Lisbon",
"Turkey": "Europe/Istanbul",
// --- Australia / Pacific ---
"Australia/ACT": "Australia/Sydney",
"Australia/Canberra": "Australia/Sydney",
"Australia/LHI": "Australia/Lord_Howe",
"Australia/North": "Australia/Darwin",
"Australia/NSW": "Australia/Sydney",
"Australia/Queensland": "Australia/Brisbane",
"Australia/South": "Australia/Adelaide",
"Australia/Tasmania": "Australia/Hobart",
"Australia/Victoria": "Australia/Melbourne",
"Australia/West": "Australia/Perth",
"Australia/Yancowinna": "Australia/Broken_Hill",
"Pacific/Samoa": "Pacific/Pago_Pago",
"Pacific/Yap": "Pacific/Chuuk",
"Pacific/Truk": "Pacific/Chuuk",
"Pacific/Ponape": "Pacific/Pohnpei",
"NZ": "Pacific/Auckland",
"NZ-CHAT": "Pacific/Chatham",
// --- Africa ---
"Africa/Asmera": "Africa/Asmara",
"Africa/Timbuktu": "Africa/Bamako",
"Egypt": "Africa/Cairo",
"Libya": "Africa/Tripoli",
// --- Atlantic ---
"Atlantic/Faeroe": "Atlantic/Faroe",
"Atlantic/Jan_Mayen": "Europe/Oslo",
"Iceland": "Atlantic/Reykjavik",
// --- Etc / Misc ---
"UTC": "UTC",
"Etc/UTC": "UTC",
"Etc/GMT": "UTC",
"GMT": "UTC",
"Zulu": "UTC",
"Universal": "UTC",
}
if cfg.ForceTZ() != nil {
return cfg.ForceTZ()
}
	if tz := r.URL.Query().Get("tz"); tz != "" {
		if fixedTz, exists := overrides[tz]; exists {
			tz = fixedTz
		}
		if loc, err := time.LoadLocation(tz); err == nil {
			return loc
		}
	}
	if c, err := r.Cookie("tz"); err == nil {
		var tz string
		if fixedTz, exists := overrides[c.Value]; exists {
			tz = fixedTz
		} else {
			tz = c.Value
		}
		if loc, err := time.LoadLocation(tz); err == nil {
			return loc
		}
	}


@@ -90,6 +90,11 @@ func LbzSubmitListenHandler(store db.DB, mbzc mbz.MusicBrainzCaller) func(w http
		utils.WriteError(w, "failed to read request body", http.StatusBadRequest)
		return
	}
	if cfg.LbzRelayEnabled() {
		go doLbzRelay(requestBytes, l)
	}
	if err := json.NewDecoder(bytes.NewBuffer(requestBytes)).Decode(&req); err != nil {
		l.Err(err).Msg("LbzSubmitListenHandler: Failed to decode request")
		utils.WriteError(w, "failed to decode request", http.StatusBadRequest)
@@ -103,7 +108,7 @@ func LbzSubmitListenHandler(store db.DB, mbzc mbz.MusicBrainzCaller) func(w http
		return
	}
	l.Info().Any("request_body", req).Msg("LbzSubmitListenHandler: Parsed request body")
	if len(req.Payload) < 1 {
		l.Debug().Msg("LbzSubmitListenHandler: Payload is empty")
@@ -234,10 +239,6 @@ func LbzSubmitListenHandler(store db.DB, mbzc mbz.MusicBrainzCaller) func(w http
		w.WriteHeader(http.StatusOK)
		w.Header().Set("Content-Type", "application/json")
		w.Write([]byte("{\"status\": \"ok\"}"))
	}
}


@@ -9,6 +9,7 @@ import (
	"github.com/gabehf/koito/internal/catalog"
	"github.com/gabehf/koito/internal/cfg"
	"github.com/gabehf/koito/internal/db"
	"github.com/gabehf/koito/internal/images"
	"github.com/gabehf/koito/internal/logger"
	"github.com/gabehf/koito/internal/utils"
	"github.com/google/uuid"
@@ -75,7 +76,7 @@ func ReplaceImageHandler(store db.DB) http.HandlerFunc {
	fileUrl := r.FormValue("image_url")
	if fileUrl != "" {
		l.Debug().Msg("ReplaceImageHandler: Image identified as remote file")
		err = images.ValidateImageURL(fileUrl)
		if err != nil {
			l.Debug().AnErr("error", err).Msg("ReplaceImageHandler: Invalid image URL")
			utils.WriteError(w, "url is invalid or not an image file", http.StatusBadRequest)


@@ -264,6 +264,34 @@ func TestImportListenBrainz_MbzDisabled(t *testing.T) {
	truncateTestData(t)
}
func TestImportListenBrainz_MBIDMapping(t *testing.T) {
src := path.Join("..", "test_assets", "listenbrainz_shoko1_123456789.zip")
destDir := filepath.Join(cfg.ConfigDir(), "import")
dest := filepath.Join(destDir, "listenbrainz_shoko1_123456789.zip")
// not going to make the dest dir because engine should make it already
input, err := os.ReadFile(src)
require.NoError(t, err)
require.NoError(t, os.WriteFile(dest, input, os.ModePerm))
engine.RunImporter(logger.Get(), store, &mbz.MbzErrorCaller{})
album, err := store.GetAlbum(context.Background(), db.GetAlbumOpts{MusicBrainzID: uuid.MustParse("177ebc28-0115-3897-8eb3-ebf74ce23790")})
require.NoError(t, err)
assert.Equal(t, "Zombie", album.Title)
artist, err := store.GetArtist(context.Background(), db.GetArtistOpts{MusicBrainzID: uuid.MustParse("c98d40fd-f6cf-4b26-883e-eaa515ee2851")})
require.NoError(t, err)
assert.Equal(t, "The Cranberries", artist.Name)
track, err := store.GetTrack(context.Background(), db.GetTrackOpts{MusicBrainzID: uuid.MustParse("3bbeb4e3-ab6d-460d-bfc5-de49e4251061")})
require.NoError(t, err)
assert.Equal(t, "Zombie", track.Title)
truncateTestData(t)
}
func TestImportKoito(t *testing.T) {
	src := path.Join("..", "test_assets", "koito_export_test.json")
@@ -276,6 +304,7 @@ func TestImportKoito(t *testing.T) {
	giriReleaseMBID := uuid.MustParse("ac1f8da0-21d7-426e-83b0-befff06f0871")
	suzukiMBID := uuid.MustParse("30f851bb-dba3-4e9b-811c-5f27f595c86a")
	nijinoTrackMBID := uuid.MustParse("a4f26836-3894-46c1-acac-227808308687")
	lp3MBID := uuid.MustParse("d0ec30bd-7cdc-417c-979d-5a0631b8a161")
	input, err := os.ReadFile(src)
	require.NoError(t, err)
@@ -312,6 +341,12 @@ func TestImportKoito(t *testing.T) {
	aliases, err := store.GetAllAlbumAliases(ctx, album.ID)
	require.NoError(t, err)
	assert.Contains(t, utils.FlattenAliases(aliases), "Nijinoiroyo Azayakadeare (NELKE ver.)")
	// ensure album associations are saved
	album, err = store.GetAlbum(ctx, db.GetAlbumOpts{MusicBrainzID: lp3MBID})
	require.NoError(t, err)
	assert.Contains(t, utils.FlattenSimpleArtistNames(album.Artists), "Elizabeth Powell")
	assert.Contains(t, utils.FlattenSimpleArtistNames(album.Artists), "Rachel Goswell")
	assert.Contains(t, utils.FlattenSimpleArtistNames(album.Artists), "American Football")
	// ensure all tracks are saved
	track, err := store.GetTrack(ctx, db.GetTrackOpts{MusicBrainzID: nijinoTrackMBID})


@@ -356,6 +356,51 @@ func TestDelete(t *testing.T) {
	truncateTestData(t)
}
func TestLoginGate(t *testing.T) {
t.Run("Submit Listens", doSubmitListens)
req, err := http.NewRequest("DELETE", host()+"/apis/web/v1/artist?id=1", nil)
require.NoError(t, err)
req.Header.Add("Authorization", "Token "+apikey)
resp, err := http.DefaultClient.Do(req)
assert.NoError(t, err)
assert.Equal(t, 204, resp.StatusCode)
req, err = http.NewRequest("GET", host()+"/apis/web/v1/artist?id=3", nil)
require.NoError(t, err)
resp, err = http.DefaultClient.Do(req)
assert.NoError(t, err)
assert.Equal(t, 200, resp.StatusCode)
var artist models.Artist
err = json.NewDecoder(resp.Body).Decode(&artist)
require.NoError(t, err)
assert.Equal(t, "ネクライトーキー", artist.Name)
cfg.SetLoginGate(true)
req, err = http.NewRequest("GET", host()+"/apis/web/v1/artist?id=3", nil)
require.NoError(t, err)
// req.Header.Add("Authorization", "Token "+apikey)
resp, err = http.DefaultClient.Do(req)
assert.NoError(t, err)
assert.Equal(t, 401, resp.StatusCode)
req, err = http.NewRequest("GET", host()+"/apis/web/v1/artist?id=3", nil)
require.NoError(t, err)
req.Header.Add("Authorization", "Token "+apikey)
resp, err = http.DefaultClient.Do(req)
assert.NoError(t, err)
assert.Equal(t, 200, resp.StatusCode)
err = json.NewDecoder(resp.Body).Decode(&artist)
require.NoError(t, err)
assert.Equal(t, "ネクライトーキー", artist.Name)
cfg.SetLoginGate(false)
truncateTestData(t)
}
func TestAliasesAndSearch(t *testing.T) {
	t.Run("Submit Listens", doSubmitListens)


@@ -0,0 +1,166 @@
package middleware
import (
"context"
"errors"
"fmt"
"net/http"
"strings"
"time"
"github.com/gabehf/koito/internal/cfg"
"github.com/gabehf/koito/internal/db"
"github.com/gabehf/koito/internal/logger"
"github.com/gabehf/koito/internal/models"
"github.com/gabehf/koito/internal/utils"
"github.com/google/uuid"
)
type MiddlwareContextKey string
const (
UserContextKey MiddlwareContextKey = "user"
apikeyContextKey MiddlwareContextKey = "apikeyID"
)
type AuthMode int
const (
AuthModeSessionCookie AuthMode = iota
AuthModeAPIKey
AuthModeSessionOrAPIKey
AuthModeLoginGate
)
func Authenticate(store db.DB, mode AuthMode) func(http.Handler) http.Handler {
return func(next http.Handler) http.Handler {
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
ctx := r.Context()
l := logger.FromContext(ctx)
var user *models.User
var err error
switch mode {
case AuthModeSessionCookie:
user, err = validateSession(ctx, store, r)
case AuthModeAPIKey:
user, err = validateAPIKey(ctx, store, r)
case AuthModeSessionOrAPIKey:
user, err = validateSession(ctx, store, r)
if err != nil || user == nil {
user, err = validateAPIKey(ctx, store, r)
}
case AuthModeLoginGate:
if cfg.LoginGate() {
user, err = validateSession(ctx, store, r)
if err != nil || user == nil {
user, err = validateAPIKey(ctx, store, r)
}
} else {
next.ServeHTTP(w, r)
return
}
}
if err != nil {
l.Err(err).Msg("authentication failed")
utils.WriteError(w, "unauthorized", http.StatusUnauthorized)
return
}
if user == nil {
utils.WriteError(w, "unauthorized", http.StatusUnauthorized)
return
}
ctx = context.WithValue(ctx, UserContextKey, user)
r = r.WithContext(ctx)
next.ServeHTTP(w, r)
})
}
}
func validateSession(ctx context.Context, store db.DB, r *http.Request) (*models.User, error) {
l := logger.FromContext(r.Context())
l.Debug().Msgf("ValidateSession: Checking user authentication via session cookie")
cookie, err := r.Cookie("koito_session")
var sid uuid.UUID
if err == nil {
sid, err = uuid.Parse(cookie.Value)
if err != nil {
l.Err(err).Msg("ValidateSession: Could not parse UUID from session cookie")
return nil, errors.New("session cookie is invalid")
}
} else {
l.Debug().Msgf("ValidateSession: No session cookie found; attempting API key authentication")
return nil, errors.New("session cookie is missing")
}
l.Debug().Msg("ValidateSession: Retrieved login cookie from request")
u, err := store.GetUserBySession(r.Context(), sid)
if err != nil {
l.Err(fmt.Errorf("ValidateSession: %w", err)).Msg("Error accessing database")
return nil, errors.New("internal server error")
}
if u == nil {
l.Debug().Msg("ValidateSession: No user with session id found")
return nil, errors.New("no user with session id found")
}
ctx = context.WithValue(r.Context(), UserContextKey, u)
r = r.WithContext(ctx)
l.Debug().Msgf("ValidateSession: Refreshing session for user '%s'", u.Username)
store.RefreshSession(r.Context(), sid, time.Now().Add(30*24*time.Hour))
l.Debug().Msgf("ValidateSession: Refreshed session for user '%s'", u.Username)
return u, nil
}
func validateAPIKey(ctx context.Context, store db.DB, r *http.Request) (*models.User, error) {
l := logger.FromContext(ctx)
l.Debug().Msg("ValidateApiKey: Checking if user is already authenticated")
authH := r.Header.Get("Authorization")
var token string
if strings.HasPrefix(strings.ToLower(authH), "token ") {
token = strings.TrimSpace(authH[6:]) // strip "Token "
} else {
l.Error().Msg("ValidateApiKey: Authorization header must be formatted 'Token {token}'")
return nil, errors.New("authorization header is invalid")
}
u, err := store.GetUserByApiKey(ctx, token)
if err != nil {
l.Err(err).Msg("ValidateApiKey: Failed to get user from database using api key")
return nil, errors.New("internal server error")
}
if u == nil {
l.Debug().Msg("ValidateApiKey: API key does not exist")
return nil, errors.New("authorization token is invalid")
}
ctx = context.WithValue(r.Context(), UserContextKey, u)
r = r.WithContext(ctx)
return u, nil
}
func GetUserFromContext(ctx context.Context) *models.User {
user, ok := ctx.Value(UserContextKey).(*models.User)
if !ok {
return nil
}
return user
}


@@ -1,125 +0,0 @@
package middleware
import (
"context"
"fmt"
"net/http"
"strings"
"time"
"github.com/gabehf/koito/internal/db"
"github.com/gabehf/koito/internal/logger"
"github.com/gabehf/koito/internal/models"
"github.com/gabehf/koito/internal/utils"
"github.com/google/uuid"
)
type MiddlwareContextKey string
const (
UserContextKey MiddlwareContextKey = "user"
apikeyContextKey MiddlwareContextKey = "apikeyID"
)
func ValidateSession(store db.DB) func(next http.Handler) http.Handler {
return func(next http.Handler) http.Handler {
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
l := logger.FromContext(r.Context())
l.Debug().Msgf("ValidateSession: Checking user authentication via session cookie")
cookie, err := r.Cookie("koito_session")
var sid uuid.UUID
if err == nil {
sid, err = uuid.Parse(cookie.Value)
if err != nil {
l.Err(err).Msg("ValidateSession: Could not parse UUID from session cookie")
utils.WriteError(w, "session cookie is invalid", http.StatusUnauthorized)
return
}
} else {
l.Debug().Msgf("ValidateSession: No session cookie found; attempting API key authentication")
utils.WriteError(w, "session cookie is missing", http.StatusUnauthorized)
return
}
l.Debug().Msg("ValidateSession: Retrieved login cookie from request")
u, err := store.GetUserBySession(r.Context(), sid)
if err != nil {
l.Err(fmt.Errorf("ValidateSession: %w", err)).Msg("Error accessing database")
utils.WriteError(w, "internal server error", http.StatusInternalServerError)
return
}
if u == nil {
l.Debug().Msg("ValidateSession: No user with session id found")
utils.WriteError(w, "unauthorized", http.StatusUnauthorized)
return
}
ctx := context.WithValue(r.Context(), UserContextKey, u)
r = r.WithContext(ctx)
l.Debug().Msgf("ValidateSession: Refreshing session for user '%s'", u.Username)
store.RefreshSession(r.Context(), sid, time.Now().Add(30*24*time.Hour))
l.Debug().Msgf("ValidateSession: Refreshed session for user '%s'", u.Username)
next.ServeHTTP(w, r)
})
}
}
func ValidateApiKey(store db.DB) func(next http.Handler) http.Handler {
return func(next http.Handler) http.Handler {
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
ctx := r.Context()
l := logger.FromContext(ctx)
l.Debug().Msg("ValidateApiKey: Checking if user is already authenticated")
u := GetUserFromContext(ctx)
if u != nil {
l.Debug().Msg("ValidateApiKey: User is already authenticated; skipping API key authentication")
next.ServeHTTP(w, r)
return
}
authh := r.Header.Get("Authorization")
var token string
if strings.HasPrefix(strings.ToLower(authh), "token ") {
token = strings.TrimSpace(authh[6:]) // strip "Token "
} else {
l.Error().Msg("ValidateApiKey: Authorization header must be formatted 'Token {token}'")
utils.WriteError(w, "unauthorized", http.StatusUnauthorized)
return
}
u, err := store.GetUserByApiKey(ctx, token)
if err != nil {
l.Err(err).Msg("Failed to get user from database using api key")
utils.WriteError(w, "internal server error", http.StatusInternalServerError)
return
}
if u == nil {
l.Debug().Msg("Api key does not exist")
utils.WriteError(w, "unauthorized", http.StatusUnauthorized)
return
}
ctx = context.WithValue(r.Context(), UserContextKey, u)
r = r.WithContext(ctx)
next.ServeHTTP(w, r)
})
}
}
func GetUserFromContext(ctx context.Context) *models.User {
user, ok := ctx.Value(UserContextKey).(*models.User)
if !ok {
return nil
}
return user
}


@@ -38,9 +38,7 @@ func bindRoutes(
	r.Get("/config", handlers.GetCfgHandler())
	r.Group(func(r chi.Router) {
		r.Use(middleware.Authenticate(db, middleware.AuthModeLoginGate))
		r.Get("/artist", handlers.GetArtistHandler(db))
		r.Get("/artists", handlers.GetArtistsForItemHandler(db))
		r.Get("/album", handlers.GetAlbumHandler(db))
@@ -79,7 +77,7 @@ func bindRoutes(
	})
	r.Group(func(r chi.Router) {
		r.Use(middleware.Authenticate(db, middleware.AuthModeSessionOrAPIKey))
		r.Get("/export", handlers.ExportHandler(db))
		r.Post("/replace-image", handlers.ReplaceImageHandler(db))
		r.Patch("/album", handlers.UpdateAlbumHandler(db))
@@ -111,8 +109,10 @@ func bindRoutes(
		AllowedHeaders: []string{"Content-Type", "Authorization"},
	}))
	r.With(middleware.Authenticate(db, middleware.AuthModeAPIKey)).
		Post("/submit-listens", handlers.LbzSubmitListenHandler(db, mbz))
	r.With(middleware.Authenticate(db, middleware.AuthModeAPIKey)).
		Get("/validate-token", handlers.LbzValidateTokenHandler(db))
})
// serve react client


@@ -74,9 +74,6 @@ func matchTrackByMbzID(ctx context.Context, d db.DB, opts AssociateTrackOpts) (*
	} else {
		l.Warn().Msgf("Attempted to update track %s with MusicBrainz ID, but an existing ID was already found", track.Title)
	}
	if err != nil {
		return nil, fmt.Errorf("matchTrackByMbzID: %w", err)
	}
	track.MbzID = &opts.TrackMbzID
	return track, nil
}


@@ -0,0 +1,85 @@
package catalog
import (
"context"
"fmt"
"github.com/gabehf/koito/internal/db"
"github.com/gabehf/koito/internal/logger"
"github.com/gabehf/koito/internal/mbz"
"github.com/google/uuid"
)
func BackfillTrackDurationsFromMusicBrainz(
ctx context.Context,
store db.DB,
mbzCaller mbz.MusicBrainzCaller,
) error {
l := logger.FromContext(ctx)
l.Info().Msg("BackfillTrackDurationsFromMusicBrainz: Starting backfill of track durations from MusicBrainz")
var from int32 = 0
for {
l.Debug().Int32("ID", from).Msg("Fetching tracks to backfill from ID")
tracks, err := store.GetTracksWithNoDurationButHaveMbzID(ctx, from)
if err != nil {
return fmt.Errorf("BackfillTrackDurationsFromMusicBrainz: failed to fetch tracks for duration backfill: %w", err)
}
// an empty result set means there are no more rows to process
if len(tracks) == 0 {
if from == 0 {
l.Info().Msg("BackfillTrackDurationsFromMusicBrainz: No tracks need updating. Skipping backfill...")
} else {
l.Info().Msg("BackfillTrackDurationsFromMusicBrainz: Backfill complete")
}
return nil
}
for _, track := range tracks {
from = track.ID
if track.MbzID == nil || *track.MbzID == uuid.Nil {
continue
}
l.Debug().
Str("title", track.Title).
Str("mbz_id", track.MbzID.String()).
Msg("BackfillTrackDurationsFromMusicBrainz: Backfilling duration from MusicBrainz")
mbzTrack, err := mbzCaller.GetTrack(ctx, *track.MbzID)
if err != nil {
l.Err(err).
Str("title", track.Title).
Msg("BackfillTrackDurationsFromMusicBrainz: Failed to fetch track from MusicBrainz")
continue
}
if mbzTrack.LengthMs <= 0 {
l.Debug().
Str("title", track.Title).
Msg("BackfillTrackDurationsFromMusicBrainz: MusicBrainz track has no duration")
continue
}
durationSeconds := int32(mbzTrack.LengthMs / 1000)
err = store.UpdateTrack(ctx, db.UpdateTrackOpts{
ID: track.ID,
Duration: durationSeconds,
})
if err != nil {
l.Err(err).
Str("title", track.Title).
Msg("BackfillTrackDurationsFromMusicBrainz: Failed to update track duration")
} else {
l.Info().
Str("title", track.Title).
Int32("duration_seconds", durationSeconds).
Msg("BackfillTrackDurationsFromMusicBrainz: Track duration backfilled successfully")
}
}
}
}


@@ -0,0 +1,36 @@
package catalog_test
import (
"context"
"testing"
"github.com/gabehf/koito/internal/catalog"
"github.com/gabehf/koito/internal/mbz"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
)
func TestBackfillDuration(t *testing.T) {
setupTestDataWithMbzIDs(t)
ctx := context.Background()
mbzc := &mbz.MbzMockCaller{
Artists: mbzArtistData,
Releases: mbzReleaseData,
Tracks: mbzTrackData,
}
var err error
err = catalog.BackfillTrackDurationsFromMusicBrainz(context.Background(), store, &mbz.MbzErrorCaller{})
assert.NoError(t, err)
err = catalog.BackfillTrackDurationsFromMusicBrainz(ctx, store, mbzc)
assert.NoError(t, err)
count, err := store.Count(ctx, `
SELECT COUNT(*) FROM tracks_with_title WHERE title = $1 AND duration > 0
`, "Tokyo Calling")
require.NoError(t, err)
assert.Equal(t, 1, count, "track was not updated with duration")
}


@@ -13,7 +13,9 @@ import (
	"github.com/gabehf/koito/internal/cfg"
	"github.com/gabehf/koito/internal/db"
	"github.com/gabehf/koito/internal/images"
	"github.com/gabehf/koito/internal/logger"
	"github.com/gabehf/koito/internal/utils"
	"github.com/google/uuid"
	"github.com/h2non/bimg"
)
@@ -78,30 +80,10 @@ func SourceImageDir() string {
	}
}
// ValidateImageURL checks if the URL points to a valid image by performing a HEAD request.
func ValidateImageURL(url string) error {
resp, err := http.Head(url)
if err != nil {
return fmt.Errorf("ValidateImageURL: http.Head: %w", err)
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
return fmt.Errorf("ValidateImageURL: HEAD request failed, status code: %d", resp.StatusCode)
}
contentType := resp.Header.Get("Content-Type")
if !strings.HasPrefix(contentType, "image/") {
return fmt.Errorf("ValidateImageURL: URL does not point to an image, content type: %s", contentType)
}
return nil
}
// DownloadAndCacheImage downloads an image from the given URL, then calls CompressAndSaveImage.
func DownloadAndCacheImage(ctx context.Context, id uuid.UUID, url string, size ImageSize) error {
	l := logger.FromContext(ctx)
	err := images.ValidateImageURL(url)
	if err != nil {
		return fmt.Errorf("DownloadAndCacheImage: %w", err)
	}
@@ -285,3 +267,127 @@ func pruneDirImgs(ctx context.Context, store db.DB, path string, memo map[string
	}
	return count, nil
}
func FetchMissingArtistImages(ctx context.Context, store db.DB) error {
l := logger.FromContext(ctx)
l.Info().Msg("FetchMissingArtistImages: Starting backfill of missing artist images")
var from int32 = 0
for {
l.Debug().Int32("ID", from).Msg("Fetching artist images to backfill from ID")
artists, err := store.ArtistsWithoutImages(ctx, from)
if err != nil {
return fmt.Errorf("FetchMissingArtistImages: failed to fetch artists for image backfill: %w", err)
}
if len(artists) == 0 {
if from == 0 {
l.Info().Msg("FetchMissingArtistImages: No artists with missing images found")
} else {
l.Info().Msg("FetchMissingArtistImages: Finished fetching missing artist images")
}
return nil
}
for _, artist := range artists {
from = artist.ID
l.Debug().
Str("title", artist.Name).
Msg("FetchMissingArtistImages: Attempting to fetch missing artist image")
var aliases []string
if aliasrow, err := store.GetAllArtistAliases(ctx, artist.ID); err == nil {
	aliases = utils.FlattenAliases(aliasrow)
} else {
	aliases = []string{artist.Name}
}
var imgid uuid.UUID
imgUrl, imgErr := images.GetArtistImage(ctx, images.ArtistImageOpts{
Aliases: aliases,
})
if imgErr == nil && imgUrl != "" {
imgid = uuid.New()
err = store.UpdateArtist(ctx, db.UpdateArtistOpts{
ID: artist.ID,
Image: imgid,
ImageSrc: imgUrl,
})
if err != nil {
l.Err(err).
Str("title", artist.Name).
Msg("FetchMissingArtistImages: Failed to update artist with image in database")
continue
}
l.Info().
Str("name", artist.Name).
Msg("FetchMissingArtistImages: Successfully fetched missing artist image")
} else {
l.Err(imgErr).
Str("name", artist.Name).
Msg("FetchMissingArtistImages: Failed to fetch artist image")
}
}
}
}
func FetchMissingAlbumImages(ctx context.Context, store db.DB) error {
l := logger.FromContext(ctx)
l.Info().Msg("FetchMissingAlbumImages: Starting backfill of missing album images")
var from int32 = 0
for {
l.Debug().Int32("ID", from).Msg("Fetching album images to backfill from ID")
albums, err := store.AlbumsWithoutImages(ctx, from)
if err != nil {
return fmt.Errorf("FetchMissingAlbumImages: failed to fetch albums for image backfill: %w", err)
}
if len(albums) == 0 {
if from == 0 {
l.Info().Msg("FetchMissingAlbumImages: No albums with missing images found")
} else {
l.Info().Msg("FetchMissingAlbumImages: Finished fetching missing album images")
}
return nil
}
for _, album := range albums {
from = album.ID
l.Debug().
Str("title", album.Title).
Msg("FetchMissingAlbumImages: Attempting to fetch missing album image")
var imgid uuid.UUID
imgUrl, imgErr := images.GetAlbumImage(ctx, images.AlbumImageOpts{
Artists: utils.FlattenSimpleArtistNames(album.Artists),
Album: album.Title,
ReleaseMbzID: album.MbzID,
})
if imgErr == nil && imgUrl != "" {
imgid = uuid.New()
err = store.UpdateAlbum(ctx, db.UpdateAlbumOpts{
ID: album.ID,
Image: imgid,
ImageSrc: imgUrl,
})
if err != nil {
l.Err(err).
Str("title", album.Title).
Msg("FetchMissingAlbumImages: Failed to update album with image in database")
continue
}
l.Info().
Str("name", album.Title).
Msg("FetchMissingAlbumImages: Successfully fetched missing album image")
} else {
l.Err(imgErr).
Str("name", album.Title).
Msg("FetchMissingAlbumImages: Failed to fetch album image")
}
}
}
}


@@ -38,6 +38,7 @@ const (
	DISABLE_MUSICBRAINZ_ENV = "KOITO_DISABLE_MUSICBRAINZ"
	SUBSONIC_URL_ENV = "KOITO_SUBSONIC_URL"
	SUBSONIC_PARAMS_ENV = "KOITO_SUBSONIC_PARAMS"
	LASTFM_API_KEY_ENV = "KOITO_LASTFM_API_KEY"
	SKIP_IMPORT_ENV = "KOITO_SKIP_IMPORT"
	ALLOWED_HOSTS_ENV = "KOITO_ALLOWED_HOSTS"
	CORS_ORIGINS_ENV = "KOITO_CORS_ALLOWED_ORIGINS"
@@ -48,6 +49,7 @@ const (
	FETCH_IMAGES_DURING_IMPORT_ENV = "KOITO_FETCH_IMAGES_DURING_IMPORT"
	ARTIST_SEPARATORS_ENV = "KOITO_ARTIST_SEPARATORS_REGEX"
	LOGIN_GATE_ENV = "KOITO_LOGIN_GATE"
	FORCE_TZ = "KOITO_FORCE_TZ"
)
type config struct { type config struct {
@@ -72,6 +74,7 @@ type config struct {
	disableMusicBrainz bool
	subsonicUrl string
	subsonicParams string
	lastfmApiKey string
	subsonicEnabled bool
	skipImport bool
	fetchImageDuringImport bool
@@ -85,6 +88,7 @@ type config struct {
	importAfter time.Time
	artistSeparators []*regexp.Regexp
	loginGate bool
	forceTZ *time.Location
}
var (
@@ -165,6 +169,7 @@ func loadConfig(getenv func(string) string, version string) (*config, error) {
	if cfg.subsonicEnabled && (cfg.subsonicUrl == "" || cfg.subsonicParams == "") {
		return nil, fmt.Errorf("loadConfig: invalid configuration: both %s and %s must be set in order to use subsonic image fetching", SUBSONIC_URL_ENV, SUBSONIC_PARAMS_ENV)
	}
	cfg.lastfmApiKey = getenv(LASTFM_API_KEY_ENV)
	cfg.skipImport = parseBool(getenv(SKIP_IMPORT_ENV))
	cfg.userAgent = fmt.Sprintf("Koito %s (contact@koito.io)", version)
@@ -210,6 +215,13 @@ func loadConfig(getenv func(string) string, version string) (*config, error) {
	cfg.loginGate = true
}
if getenv(FORCE_TZ) != "" {
cfg.forceTZ, err = time.LoadLocation(getenv(FORCE_TZ))
if err != nil {
return nil, fmt.Errorf("forced timezone '%s' is not a valid timezone", getenv(FORCE_TZ))
}
}
switch strings.ToLower(getenv(LOG_LEVEL_ENV)) {
case "debug":
	cfg.logLevel = 0
@@ -232,192 +244,3 @@ func parseBool(s string) bool {
		return false
	}
}
// Global accessors for configuration values
func UserAgent() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.userAgent
}
func ListenAddr() string {
lock.RLock()
defer lock.RUnlock()
return fmt.Sprintf("%s:%d", globalConfig.bindAddr, globalConfig.listenPort)
}
func ConfigDir() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.configDir
}
func DatabaseUrl() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.databaseUrl
}
func MusicBrainzUrl() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.musicBrainzUrl
}
func MusicBrainzRateLimit() int {
lock.RLock()
defer lock.RUnlock()
return globalConfig.musicBrainzRateLimit
}
func LogLevel() int {
lock.RLock()
defer lock.RUnlock()
return globalConfig.logLevel
}
func StructuredLogging() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.structuredLogging
}
func LbzRelayEnabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.lbzRelayEnabled
}
func LbzRelayUrl() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.lbzRelayUrl
}
func LbzRelayToken() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.lbzRelayToken
}
func DefaultPassword() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.defaultPw
}
func DefaultUsername() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.defaultUsername
}
func DefaultTheme() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.defaultTheme
}
func FullImageCacheEnabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.enableFullImageCache
}
func DeezerDisabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.disableDeezer
}
func CoverArtArchiveDisabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.disableCAA
}
func MusicBrainzDisabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.disableMusicBrainz
}
func SubsonicEnabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.subsonicEnabled
}
func SubsonicUrl() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.subsonicUrl
}
func SubsonicParams() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.subsonicParams
}
func SkipImport() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.skipImport
}
func AllowedHosts() []string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.allowedHosts
}
func AllowAllHosts() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.allowAllHosts
}
func AllowedOrigins() []string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.allowedOrigins
}
func RateLimitDisabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.disableRateLimit
}
func ThrottleImportMs() int {
lock.RLock()
defer lock.RUnlock()
return globalConfig.importThrottleMs
}
// returns the before, after times, in that order
func ImportWindow() (time.Time, time.Time) {
lock.RLock()
defer lock.RUnlock()
return globalConfig.importBefore, globalConfig.importAfter
}
func FetchImagesDuringImport() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.fetchImageDuringImport
}
func ArtistSeparators() []*regexp.Regexp {
lock.RLock()
defer lock.RUnlock()
return globalConfig.artistSeparators
}
func LoginGate() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.loginGate
}

internal/cfg/getters.go (new file, 206 lines)
package cfg
import (
"fmt"
"regexp"
"time"
)
func UserAgent() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.userAgent
}
func ListenAddr() string {
lock.RLock()
defer lock.RUnlock()
return fmt.Sprintf("%s:%d", globalConfig.bindAddr, globalConfig.listenPort)
}
func ConfigDir() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.configDir
}
func DatabaseUrl() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.databaseUrl
}
func MusicBrainzUrl() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.musicBrainzUrl
}
func MusicBrainzRateLimit() int {
lock.RLock()
defer lock.RUnlock()
return globalConfig.musicBrainzRateLimit
}
func LogLevel() int {
lock.RLock()
defer lock.RUnlock()
return globalConfig.logLevel
}
func StructuredLogging() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.structuredLogging
}
func LbzRelayEnabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.lbzRelayEnabled
}
func LbzRelayUrl() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.lbzRelayUrl
}
func LbzRelayToken() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.lbzRelayToken
}
func DefaultPassword() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.defaultPw
}
func DefaultUsername() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.defaultUsername
}
func DefaultTheme() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.defaultTheme
}
func FullImageCacheEnabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.enableFullImageCache
}
func DeezerDisabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.disableDeezer
}
func CoverArtArchiveDisabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.disableCAA
}
func MusicBrainzDisabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.disableMusicBrainz
}
func SubsonicEnabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.subsonicEnabled
}
func SubsonicUrl() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.subsonicUrl
}
func SubsonicParams() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.subsonicParams
}
func LastFMApiKey() string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.lastfmApiKey
}
func SkipImport() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.skipImport
}
func AllowedHosts() []string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.allowedHosts
}
func AllowAllHosts() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.allowAllHosts
}
func AllowedOrigins() []string {
lock.RLock()
defer lock.RUnlock()
return globalConfig.allowedOrigins
}
func RateLimitDisabled() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.disableRateLimit
}
func ThrottleImportMs() int {
lock.RLock()
defer lock.RUnlock()
return globalConfig.importThrottleMs
}
// returns the before, after times, in that order
func ImportWindow() (time.Time, time.Time) {
lock.RLock()
defer lock.RUnlock()
return globalConfig.importBefore, globalConfig.importAfter
}
func FetchImagesDuringImport() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.fetchImageDuringImport
}
func ArtistSeparators() []*regexp.Regexp {
lock.RLock()
defer lock.RUnlock()
return globalConfig.artistSeparators
}
func LoginGate() bool {
lock.RLock()
defer lock.RUnlock()
return globalConfig.loginGate
}
func ForceTZ() *time.Location {
lock.RLock()
defer lock.RUnlock()
return globalConfig.forceTZ
}

internal/cfg/setters.go (new file, 7 lines)
package cfg
func SetLoginGate(val bool) {
lock.Lock()
defer lock.Unlock()
globalConfig.loginGate = val
}

@@ -16,11 +16,12 @@ type DB interface {
 	GetAlbum(ctx context.Context, opts GetAlbumOpts) (*models.Album, error)
 	GetAlbumWithNoMbzIDByTitles(ctx context.Context, artistId int32, titles []string) (*models.Album, error)
 	GetTrack(ctx context.Context, opts GetTrackOpts) (*models.Track, error)
+	GetTracksWithNoDurationButHaveMbzID(ctx context.Context, from int32) ([]*models.Track, error)
 	GetArtistsForAlbum(ctx context.Context, id int32) ([]*models.Artist, error)
 	GetArtistsForTrack(ctx context.Context, id int32) ([]*models.Artist, error)
-	GetTopTracksPaginated(ctx context.Context, opts GetItemsOpts) (*PaginatedResponse[*models.Track], error)
-	GetTopArtistsPaginated(ctx context.Context, opts GetItemsOpts) (*PaginatedResponse[*models.Artist], error)
-	GetTopAlbumsPaginated(ctx context.Context, opts GetItemsOpts) (*PaginatedResponse[*models.Album], error)
+	GetTopTracksPaginated(ctx context.Context, opts GetItemsOpts) (*PaginatedResponse[RankedItem[*models.Track]], error)
+	GetTopArtistsPaginated(ctx context.Context, opts GetItemsOpts) (*PaginatedResponse[RankedItem[*models.Artist]], error)
+	GetTopAlbumsPaginated(ctx context.Context, opts GetItemsOpts) (*PaginatedResponse[RankedItem[*models.Album]], error)
 	GetListensPaginated(ctx context.Context, opts GetItemsOpts) (*PaginatedResponse[*models.Listen], error)
 	GetListenActivity(ctx context.Context, opts ListenActivityOpts) ([]ListenActivityItem, error)
 	GetAllArtistAliases(ctx context.Context, id int32) ([]models.Alias, error)
@@ -87,6 +88,7 @@ type DB interface {
 	// in seconds
 	CountTimeListenedToItem(ctx context.Context, opts TimeListenedOpts) (int64, error)
 	CountUsers(ctx context.Context) (int64, error)

 	// Search
 	SearchArtists(ctx context.Context, q string) ([]*models.Artist, error)
@@ -104,6 +106,7 @@ type DB interface {
 	ImageHasAssociation(ctx context.Context, image uuid.UUID) (bool, error)
 	GetImageSource(ctx context.Context, image uuid.UUID) (string, error)
 	AlbumsWithoutImages(ctx context.Context, from int32) ([]*models.Album, error)
+	ArtistsWithoutImages(ctx context.Context, from int32) ([]*models.Artist, error)
 	GetExportPage(ctx context.Context, opts GetExportPageOpts) ([]*ExportItem, error)
 	Ping(ctx context.Context) error
 	Close(ctx context.Context)

@@ -57,11 +57,11 @@
 // and end will be 23:59:59 on Saturday at the end of the current week.
 // If opts.Year (or opts.Year + opts.Month) is provided, start and end will simply be the start and end times of that year/month.
 func ListenActivityOptsToTimes(opts ListenActivityOpts) (start, end time.Time) {
-	now := time.Now()
 	loc := opts.Timezone
 	if loc == nil {
 		loc, _ = time.LoadLocation("UTC")
 	}
+	now := time.Now().In(loc)

 	// If Year (and optionally Month) are specified, use calendar boundaries
 	if opts.Year != 0 {
@@ -91,7 +91,9 @@ func ListenActivityOptsToTimes(opts ListenActivityOpts) (start, end time.Time) {
 		// Align to most recent Sunday
 		weekday := int(now.Weekday()) // Sunday = 0
 		startOfThisWeek := time.Date(now.Year(), now.Month(), now.Day()-weekday, 0, 0, 0, 0, loc)
-		start = startOfThisWeek.AddDate(0, 0, -7*opts.Range)
+		// need to subtract 1 from range for week because we are going back from the beginning of this
+		// week, so we sort of already went back a week
+		start = startOfThisWeek.AddDate(0, 0, -7*(opts.Range-1))
 		end = startOfThisWeek.AddDate(0, 0, 7).Add(-time.Nanosecond)
 	case StepMonth:

@@ -23,32 +23,13 @@ func (d *Psql) GetAlbum(ctx context.Context, opts db.GetAlbumOpts) (*models.Albu
 	var err error
 	var ret = new(models.Album)

-	if opts.ID != 0 {
-		l.Debug().Msgf("Fetching album from DB with id %d", opts.ID)
-		row, err := d.q.GetRelease(ctx, opts.ID)
-		if err != nil {
-			return nil, fmt.Errorf("GetAlbum: %w", err)
-		}
-		ret.ID = row.ID
-		ret.MbzID = row.MusicBrainzID
-		ret.Title = row.Title
-		ret.Image = row.Image
-		ret.VariousArtists = row.VariousArtists
-		err = json.Unmarshal(row.Artists, &ret.Artists)
-		if err != nil {
-			return nil, fmt.Errorf("GetAlbum: json.Unmarshal: %w", err)
-		}
-	} else if opts.MusicBrainzID != uuid.Nil {
+	if opts.MusicBrainzID != uuid.Nil {
 		l.Debug().Msgf("Fetching album from DB with MusicBrainz Release ID %s", opts.MusicBrainzID)
 		row, err := d.q.GetReleaseByMbzID(ctx, &opts.MusicBrainzID)
 		if err != nil {
 			return nil, fmt.Errorf("GetAlbum: %w", err)
 		}
-		ret.ID = row.ID
-		ret.MbzID = row.MusicBrainzID
-		ret.Title = row.Title
-		ret.Image = row.Image
-		ret.VariousArtists = row.VariousArtists
+		opts.ID = row.ID
 	} else if opts.ArtistID != 0 && opts.Title != "" {
 		l.Debug().Msgf("Fetching album from DB with artist_id %d and title %s", opts.ArtistID, opts.Title)
 		row, err := d.q.GetReleaseByArtistAndTitle(ctx, repository.GetReleaseByArtistAndTitleParams{
@@ -58,11 +39,7 @@ func (d *Psql) GetAlbum(ctx context.Context, opts db.GetAlbumOpts) (*models.Albu
 		if err != nil {
 			return nil, fmt.Errorf("GetAlbum: %w", err)
 		}
-		ret.ID = row.ID
-		ret.MbzID = row.MusicBrainzID
-		ret.Title = row.Title
-		ret.Image = row.Image
-		ret.VariousArtists = row.VariousArtists
+		opts.ID = row.ID
 	} else if opts.ArtistID != 0 && len(opts.Titles) > 0 {
 		l.Debug().Msgf("Fetching release group from DB with artist_id %d and titles %v", opts.ArtistID, opts.Titles)
 		row, err := d.q.GetReleaseByArtistAndTitles(ctx, repository.GetReleaseByArtistAndTitlesParams{
@@ -72,19 +49,19 @@ func (d *Psql) GetAlbum(ctx context.Context, opts db.GetAlbumOpts) (*models.Albu
 		if err != nil {
 			return nil, fmt.Errorf("GetAlbum: %w", err)
 		}
-		ret.ID = row.ID
-		ret.MbzID = row.MusicBrainzID
-		ret.Title = row.Title
-		ret.Image = row.Image
-		ret.VariousArtists = row.VariousArtists
-	} else {
-		return nil, errors.New("GetAlbum: insufficient information to get album")
+		opts.ID = row.ID
 	}
+
+	l.Debug().Msgf("Fetching album from DB with id %d", opts.ID)
+	row, err := d.q.GetRelease(ctx, opts.ID)
+	if err != nil {
+		return nil, fmt.Errorf("GetAlbum: %w", err)
+	}

 	count, err := d.q.CountListensFromRelease(ctx, repository.CountListensFromReleaseParams{
 		ListenedAt:   time.Unix(0, 0),
 		ListenedAt_2: time.Now(),
-		ReleaseID:    ret.ID,
+		ReleaseID:    opts.ID,
 	})
 	if err != nil {
 		return nil, fmt.Errorf("GetAlbum: CountListensFromRelease: %w", err)
@@ -92,17 +69,32 @@ func (d *Psql) GetAlbum(ctx context.Context, opts db.GetAlbumOpts) (*models.Albu
 	seconds, err := d.CountTimeListenedToItem(ctx, db.TimeListenedOpts{
 		Timeframe: db.Timeframe{Period: db.PeriodAllTime},
-		AlbumID:   ret.ID,
+		AlbumID:   opts.ID,
 	})
 	if err != nil {
 		return nil, fmt.Errorf("GetAlbum: CountTimeListenedToItem: %w", err)
 	}
-	firstListen, err := d.q.GetFirstListenFromRelease(ctx, ret.ID)
+	firstListen, err := d.q.GetFirstListenFromRelease(ctx, opts.ID)
 	if err != nil && !errors.Is(err, pgx.ErrNoRows) {
 		return nil, fmt.Errorf("GetAlbum: GetFirstListenFromRelease: %w", err)
 	}
+	rank, err := d.q.GetReleaseAllTimeRank(ctx, opts.ID)
+	if err != nil && !errors.Is(err, pgx.ErrNoRows) {
+		return nil, fmt.Errorf("GetAlbum: GetReleaseAllTimeRank: %w", err)
+	}
+
+	ret.ID = row.ID
+	ret.MbzID = row.MusicBrainzID
+	ret.Title = row.Title
+	ret.Image = row.Image
+	ret.VariousArtists = row.VariousArtists
+	err = json.Unmarshal(row.Artists, &ret.Artists)
+	if err != nil {
+		return nil, fmt.Errorf("GetAlbum: json.Unmarshal: %w", err)
+	}
+	ret.AllTimeRank = rank.Rank
 	ret.ListenCount = count
 	ret.TimeListened = seconds
 	ret.FirstListen = firstListen.ListenedAt.Unix()
@@ -282,6 +274,9 @@ func (d *Psql) UpdateAlbum(ctx context.Context, opts db.UpdateAlbumOpts) error {
 		}
 	}
 	if opts.Image != uuid.Nil {
+		if opts.ImageSrc == "" {
+			return fmt.Errorf("UpdateAlbum: image source must be provided when updating an image")
+		}
 		l.Debug().Msgf("Updating release with ID %d with image %s", opts.ID, opts.Image)
 		err := qtx.UpdateReleaseImage(ctx, repository.UpdateReleaseImageParams{
 			ID: opts.ID,

@@ -20,7 +20,21 @@ import (
 // this function sucks because sqlc keeps making new types for rows that are the same
 func (d *Psql) GetArtist(ctx context.Context, opts db.GetArtistOpts) (*models.Artist, error) {
 	l := logger.FromContext(ctx)
-	if opts.ID != 0 {
+	if opts.MusicBrainzID != uuid.Nil {
+		l.Debug().Msgf("Fetching artist from DB with MusicBrainz ID %s", opts.MusicBrainzID)
+		row, err := d.q.GetArtistByMbzID(ctx, &opts.MusicBrainzID)
+		if err != nil {
+			return nil, fmt.Errorf("GetArtist: GetArtistByMbzID: %w", err)
+		}
+		opts.ID = row.ID
+	} else if opts.Name != "" {
+		l.Debug().Msgf("Fetching artist from DB with name '%s'", opts.Name)
+		row, err := d.q.GetArtistByName(ctx, opts.Name)
+		if err != nil {
+			return nil, fmt.Errorf("GetArtist: GetArtistByName: %w", err)
+		}
+		opts.ID = row.ID
+	}
 	l.Debug().Msgf("Fetching artist from DB with id %d", opts.ID)
 	row, err := d.q.GetArtist(ctx, opts.ID)
 	if err != nil {
@@ -45,40 +59,9 @@ func (d *Psql) GetArtist(ctx context.Context, opts db.GetArtistOpts) (*models.Ar
 	if err != nil && !errors.Is(err, pgx.ErrNoRows) {
 		return nil, fmt.Errorf("GetAlbum: GetFirstListenFromArtist: %w", err)
 	}
-	return &models.Artist{
-		ID:           row.ID,
-		MbzID:        row.MusicBrainzID,
-		Name:         row.Name,
-		Aliases:      row.Aliases,
-		Image:        row.Image,
-		ListenCount:  count,
-		TimeListened: seconds,
-		FirstListen:  firstListen.ListenedAt.Unix(),
-	}, nil
-	} else if opts.MusicBrainzID != uuid.Nil {
-		l.Debug().Msgf("Fetching artist from DB with MusicBrainz ID %s", opts.MusicBrainzID)
-		row, err := d.q.GetArtistByMbzID(ctx, &opts.MusicBrainzID)
-		if err != nil {
-			return nil, fmt.Errorf("GetArtist: GetArtistByMbzID: %w", err)
-		}
-		count, err := d.q.CountListensFromArtist(ctx, repository.CountListensFromArtistParams{
-			ListenedAt:   time.Unix(0, 0),
-			ListenedAt_2: time.Now(),
-			ArtistID:     row.ID,
-		})
-		if err != nil {
-			return nil, fmt.Errorf("GetArtist: CountListensFromArtist: %w", err)
-		}
-		seconds, err := d.CountTimeListenedToItem(ctx, db.TimeListenedOpts{
-			Timeframe: db.Timeframe{Period: db.PeriodAllTime},
-			ArtistID:  row.ID,
-		})
-		if err != nil {
-			return nil, fmt.Errorf("GetArtist: CountTimeListenedToItem: %w", err)
-		}
-		firstListen, err := d.q.GetFirstListenFromArtist(ctx, row.ID)
+	rank, err := d.q.GetArtistAllTimeRank(ctx, opts.ID)
 	if err != nil && !errors.Is(err, pgx.ErrNoRows) {
-		return nil, fmt.Errorf("GetAlbum: GetFirstListenFromArtist: %w", err)
+		return nil, fmt.Errorf("GetArtist: GetArtistAllTimeRank: %w", err)
 	}
 	return &models.Artist{
 		ID: row.ID,
@@ -88,46 +71,9 @@ func (d *Psql) GetArtist(ctx context.Context, opts db.GetArtistOpts) (*models.Ar
 		Image:        row.Image,
 		ListenCount:  count,
 		TimeListened: seconds,
+		AllTimeRank:  rank.Rank,
 		FirstListen:  firstListen.ListenedAt.Unix(),
 	}, nil
-	} else if opts.Name != "" {
-		l.Debug().Msgf("Fetching artist from DB with name '%s'", opts.Name)
-		row, err := d.q.GetArtistByName(ctx, opts.Name)
-		if err != nil {
-			return nil, fmt.Errorf("GetArtist: GetArtistByName: %w", err)
-		}
-		count, err := d.q.CountListensFromArtist(ctx, repository.CountListensFromArtistParams{
-			ListenedAt:   time.Unix(0, 0),
-			ListenedAt_2: time.Now(),
-			ArtistID:     row.ID,
-		})
-		if err != nil {
-			return nil, fmt.Errorf("GetArtist: CountListensFromArtist: %w", err)
-		}
-		seconds, err := d.CountTimeListenedToItem(ctx, db.TimeListenedOpts{
-			Timeframe: db.Timeframe{Period: db.PeriodAllTime},
-			ArtistID:  row.ID,
-		})
-		if err != nil {
-			return nil, fmt.Errorf("GetArtist: CountTimeListenedToItem: %w", err)
-		}
-		firstListen, err := d.q.GetFirstListenFromArtist(ctx, row.ID)
-		if err != nil && !errors.Is(err, pgx.ErrNoRows) {
-			return nil, fmt.Errorf("GetAlbum: GetFirstListenFromArtist: %w", err)
-		}
-		return &models.Artist{
-			ID:           row.ID,
-			MbzID:        row.MusicBrainzID,
-			Name:         row.Name,
-			Aliases:      row.Aliases,
-			Image:        row.Image,
-			ListenCount:  count,
-			TimeListened: seconds,
-			FirstListen:  firstListen.ListenedAt.Unix(),
-		}, nil
-	} else {
-		return nil, errors.New("insufficient information to get artist")
-	}
 }

 // Inserts all unique aliases into the DB with specified source
@@ -264,6 +210,9 @@ func (d *Psql) UpdateArtist(ctx context.Context, opts db.UpdateArtistOpts) error
 		}
 	}
 	if opts.Image != uuid.Nil {
+		if opts.ImageSrc == "" {
+			return fmt.Errorf("UpdateArtist: image source must be provided when updating an image")
+		}
 		l.Debug().Msgf("Updating artist with id %d with image %s", opts.ID, opts.Image)
 		err = qtx.UpdateArtistImage(ctx, repository.UpdateArtistImageParams{
 			ID: opts.ID,

@@ -72,3 +72,26 @@ func (d *Psql) AlbumsWithoutImages(ctx context.Context, from int32) ([]*models.A
 	}
 	return albums, nil
 }
+
+// returns nil, nil on no results
+func (d *Psql) ArtistsWithoutImages(ctx context.Context, from int32) ([]*models.Artist, error) {
+	rows, err := d.q.GetArtistsWithoutImages(ctx, repository.GetArtistsWithoutImagesParams{
+		Limit: 20,
+		ID:    from,
+	})
+	if errors.Is(err, pgx.ErrNoRows) {
+		return nil, nil
+	} else if err != nil {
+		return nil, fmt.Errorf("ArtistsWithoutImages: %w", err)
+	}
+
+	ret := make([]*models.Artist, len(rows))
+	for i, row := range rows {
+		ret[i] = &models.Artist{
+			ID:    row.ID,
+			Name:  row.Name,
+			MbzID: row.MusicBrainzID,
+		}
+	}
+	return ret, nil
+}

@@ -14,54 +14,54 @@ func (d *Psql) GetInterest(ctx context.Context, opts db.GetInterestOpts) ([]db.I
 		return nil, errors.New("GetInterest: bucket count must be provided")
 	}

-	ret := make([]db.InterestBucket, opts.Buckets)
+	ret := make([]db.InterestBucket, 0)
 	if opts.ArtistID != 0 {
 		resp, err := d.q.GetGroupedListensFromArtist(ctx, repository.GetGroupedListensFromArtistParams{
 			ArtistID:    opts.ArtistID,
-			BucketCount: opts.Buckets,
+			BucketCount: int32(opts.Buckets),
 		})
 		if err != nil {
 			return nil, fmt.Errorf("GetInterest: GetGroupedListensFromArtist: %w", err)
 		}
-		for i, v := range resp {
-			ret[i] = db.InterestBucket{
+		for _, v := range resp {
+			ret = append(ret, db.InterestBucket{
 				BucketStart: v.BucketStart,
 				BucketEnd:   v.BucketEnd,
 				ListenCount: v.ListenCount,
-			}
+			})
 		}
 		return ret, nil
 	} else if opts.AlbumID != 0 {
 		resp, err := d.q.GetGroupedListensFromRelease(ctx, repository.GetGroupedListensFromReleaseParams{
 			ReleaseID:   opts.AlbumID,
-			BucketCount: opts.Buckets,
+			BucketCount: int32(opts.Buckets),
 		})
 		if err != nil {
 			return nil, fmt.Errorf("GetInterest: GetGroupedListensFromRelease: %w", err)
 		}
-		for i, v := range resp {
-			ret[i] = db.InterestBucket{
+		for _, v := range resp {
+			ret = append(ret, db.InterestBucket{
 				BucketStart: v.BucketStart,
 				BucketEnd:   v.BucketEnd,
 				ListenCount: v.ListenCount,
-			}
+			})
 		}
 		return ret, nil
 	} else if opts.TrackID != 0 {
 		resp, err := d.q.GetGroupedListensFromTrack(ctx, repository.GetGroupedListensFromTrackParams{
 			ID:          opts.TrackID,
-			BucketCount: opts.Buckets,
+			BucketCount: int32(opts.Buckets),
 		})
 		if err != nil {
 			return nil, fmt.Errorf("GetInterest: GetGroupedListensFromTrack: %w", err)
 		}
-		for i, v := range resp {
-			ret[i] = db.InterestBucket{
+		for _, v := range resp {
+			ret = append(ret, db.InterestBucket{
 				BucketStart: v.BucketStart,
 				BucketEnd:   v.BucketEnd,
 				ListenCount: v.ListenCount,
-			}
+			})
 		}
 		return ret, nil
 	} else {

@@ -23,7 +23,7 @@ func (d *Psql) GetListenActivity(ctx context.Context, opts db.ListenActivityOpts
 	var listenActivity []db.ListenActivityItem
 	if opts.AlbumID > 0 {
 		l.Debug().Msgf("Fetching listen activity for %d %s(s) from %v to %v for release group %d",
-			opts.Range, opts.Step, t1.Format("Jan 02, 2006 15:04:05"), t2.Format("Jan 02, 2006 15:04:05"), opts.AlbumID)
+			opts.Range, opts.Step, t1.Format("Jan 02, 2006 15:04:05 MST"), t2.Format("Jan 02, 2006 15:04:05 MST"), opts.AlbumID)
 		rows, err := d.q.ListenActivityForRelease(ctx, repository.ListenActivityForReleaseParams{
 			Column1:    opts.Timezone.String(),
 			ListenedAt: t1,
@@ -44,7 +44,7 @@ func (d *Psql) GetListenActivity(ctx context.Context, opts db.ListenActivityOpts
 		l.Debug().Msgf("Database responded with %d steps", len(rows))
 	} else if opts.ArtistID > 0 {
 		l.Debug().Msgf("Fetching listen activity for %d %s(s) from %v to %v for artist %d",
-			opts.Range, opts.Step, t1.Format("Jan 02, 2006 15:04:05"), t2.Format("Jan 02, 2006 15:04:05"), opts.ArtistID)
+			opts.Range, opts.Step, t1.Format("Jan 02, 2006 15:04:05 MST"), t2.Format("Jan 02, 2006 15:04:05 MST"), opts.ArtistID)
 		rows, err := d.q.ListenActivityForArtist(ctx, repository.ListenActivityForArtistParams{
 			Column1:    opts.Timezone.String(),
 			ListenedAt: t1,
@@ -65,7 +65,7 @@ func (d *Psql) GetListenActivity(ctx context.Context, opts db.ListenActivityOpts
 		l.Debug().Msgf("Database responded with %d steps", len(rows))
 	} else if opts.TrackID > 0 {
 		l.Debug().Msgf("Fetching listen activity for %d %s(s) from %v to %v for track %d",
-			opts.Range, opts.Step, t1.Format("Jan 02, 2006 15:04:05"), t2.Format("Jan 02, 2006 15:04:05"), opts.TrackID)
+			opts.Range, opts.Step, t1.Format("Jan 02, 2006 15:04:05 MST"), t2.Format("Jan 02, 2006 15:04:05 MST"), opts.TrackID)
 		rows, err := d.q.ListenActivityForTrack(ctx, repository.ListenActivityForTrackParams{
 			Column1:    opts.Timezone.String(),
 			ListenedAt: t1,
@@ -86,7 +86,7 @@ func (d *Psql) GetListenActivity(ctx context.Context, opts db.ListenActivityOpts
 		l.Debug().Msgf("Database responded with %d steps", len(rows))
 	} else {
 		l.Debug().Msgf("Fetching listen activity for %d %s(s) from %v to %v",
-			opts.Range, opts.Step, t1.Format("Jan 02, 2006 15:04:05"), t2.Format("Jan 02, 2006 15:04:05"))
+			opts.Range, opts.Step, t1.Format("Jan 02, 2006 15:04:05 MST"), t2.Format("Jan 02, 2006 15:04:05 MST"))
 		rows, err := d.q.ListenActivity(ctx, repository.ListenActivityParams{
 			Column1:    opts.Timezone.String(),
 			ListenedAt: t1,

@@ -97,20 +97,19 @@ func TestListenActivity(t *testing.T) {
 	err = store.Exec(context.Background(),
 		`INSERT INTO listens (user_id, track_id, listened_at)
-		VALUES (1, 1, NOW() - INTERVAL '1 month'),
-		(1, 1, NOW() - INTERVAL '2 months'),
-		(1, 1, NOW() - INTERVAL '3 months'),
-		(1, 2, NOW() - INTERVAL '1 month'),
-		(1, 2, NOW() - INTERVAL '2 months')`)
+		VALUES (1, 1, NOW() - INTERVAL '1 month 1 day'),
+		(1, 1, NOW() - INTERVAL '2 months 1 day'),
+		(1, 1, NOW() - INTERVAL '3 months 1 day'),
+		(1, 2, NOW() - INTERVAL '1 month 1 day'),
+		(1, 2, NOW() - INTERVAL '1 second'),
+		(1, 2, NOW() - INTERVAL '2 seconds'),
+		(1, 2, NOW() - INTERVAL '2 months 1 day')`)
 	require.NoError(t, err)

-	// This test is bad, and I think it's because of daylight savings.
-	// I need to find a better test.
 	activity, err = store.GetListenActivity(ctx, db.ListenActivityOpts{Step: db.StepMonth, Range: 8})
 	require.NoError(t, err)
-	// require.Len(t, activity, 8)
-	// assert.Equal(t, []int64{0, 0, 0, 0, 1, 2, 2, 0}, flattenListenCounts(activity))
+	require.Len(t, activity, 4)
+	assert.Equal(t, []int64{1, 2, 2, 2}, flattenListenCounts(activity))

 	// Truncate listens table and insert specific dates for testing opts.Step = db.StepYear
 	err = store.Exec(context.Background(), `TRUNCATE TABLE listens RESTART IDENTITY`)

@@ -52,7 +52,7 @@ func (d *Psql) MergeTracks(ctx context.Context, fromId, toId int32) error {
 	}
 	err = qtx.CleanOrphanedEntries(ctx)
 	if err != nil {
-		l.Err(err).Msg("Failed to clean orphaned entries")
+		l.Err(err).Msg("MergeTracks: Failed to clean orphaned entries")
 		return err
 	}
 	return tx.Commit(ctx)

@@ -90,14 +90,14 @@ func TestMergeTracks(t *testing.T) {
 	require.NoError(t, err)
 	assert.Equal(t, 2, count, "expected all listens to be merged into Track 2")

-	// Verify artist is associated with album
+	// Verify old artist is not associated with album
 	exists, err := store.RowExists(ctx, `
 		SELECT EXISTS (
 			SELECT 1 FROM artist_releases
 			WHERE release_id = $1 AND artist_id = $2
 		)`, 2, 1)
 	require.NoError(t, err)
-	assert.True(t, exists, "expected old artist to be associated with album")
+	assert.False(t, exists)

 	truncateTestData(t)
 }

View file

@@ -11,7 +11,7 @@ import (
 	"github.com/gabehf/koito/internal/repository"
 )
-func (d *Psql) GetTopAlbumsPaginated(ctx context.Context, opts db.GetItemsOpts) (*db.PaginatedResponse[*models.Album], error) {
+func (d *Psql) GetTopAlbumsPaginated(ctx context.Context, opts db.GetItemsOpts) (*db.PaginatedResponse[db.RankedItem[*models.Album]], error) {
 	l := logger.FromContext(ctx)
 	offset := (opts.Page - 1) * opts.Limit
 	t1, t2 := db.TimeframeToTimeRange(opts.Timeframe)
@@ -19,7 +19,7 @@ func (d *Psql) GetTopAlbumsPaginated(ctx context.Context, opts db.GetItemsOpts)
 		opts.Limit = DefaultItemsPerPage
 	}
-	var rgs []*models.Album
+	var rgs []db.RankedItem[*models.Album]
 	var count int64
 	if opts.ArtistID != 0 {
@@ -36,7 +36,7 @@ func (d *Psql) GetTopAlbumsPaginated(ctx context.Context, opts db.GetItemsOpts)
 		if err != nil {
 			return nil, fmt.Errorf("GetTopAlbumsPaginated: GetTopReleasesFromArtist: %w", err)
 		}
-		rgs = make([]*models.Album, len(rows))
+		rgs = make([]db.RankedItem[*models.Album], len(rows))
 		l.Debug().Msgf("Database responded with %d items", len(rows))
 		for i, v := range rows {
 			artists := make([]models.SimpleArtist, 0)
@@ -45,7 +45,7 @@ func (d *Psql) GetTopAlbumsPaginated(ctx context.Context, opts db.GetItemsOpts)
 				l.Err(err).Msgf("Error unmarshalling artists for release group with id %d", v.ID)
 				return nil, fmt.Errorf("GetTopAlbumsPaginated: Unmarshal: %w", err)
 			}
-			rgs[i] = &models.Album{
+			rgs[i].Item = &models.Album{
 				ID:             v.ID,
 				MbzID:          v.MusicBrainzID,
 				Title:          v.Title,
@@ -54,6 +54,7 @@ func (d *Psql) GetTopAlbumsPaginated(ctx context.Context, opts db.GetItemsOpts)
 				VariousArtists: v.VariousArtists,
 				ListenCount:    v.ListenCount,
 			}
+			rgs[i].Rank = v.Rank
 		}
 		count, err = d.q.CountReleasesFromArtist(ctx, int32(opts.ArtistID))
 		if err != nil {
@@ -71,7 +72,7 @@ func (d *Psql) GetTopAlbumsPaginated(ctx context.Context, opts db.GetItemsOpts)
 		if err != nil {
 			return nil, fmt.Errorf("GetTopAlbumsPaginated: GetTopReleasesPaginated: %w", err)
 		}
-		rgs = make([]*models.Album, len(rows))
+		rgs = make([]db.RankedItem[*models.Album], len(rows))
 		l.Debug().Msgf("Database responded with %d items", len(rows))
 		for i, row := range rows {
 			artists := make([]models.SimpleArtist, 0)
@@ -80,16 +81,16 @@ func (d *Psql) GetTopAlbumsPaginated(ctx context.Context, opts db.GetItemsOpts)
 				l.Err(err).Msgf("Error unmarshalling artists for release group with id %d", row.ID)
 				return nil, fmt.Errorf("GetTopAlbumsPaginated: Unmarshal: %w", err)
 			}
-			t := &models.Album{
-				Title:          row.Title,
-				MbzID:          row.MusicBrainzID,
+			rgs[i].Item = &models.Album{
 				ID:             row.ID,
+				MbzID:          row.MusicBrainzID,
+				Title:          row.Title,
 				Image:          row.Image,
 				Artists:        artists,
 				VariousArtists: row.VariousArtists,
 				ListenCount:    row.ListenCount,
 			}
-			rgs[i] = t
+			rgs[i].Rank = row.Rank
 		}
 		count, err = d.q.CountTopReleases(ctx, repository.CountTopReleasesParams{
 			ListenedAt: t1,
@@ -100,7 +101,7 @@ func (d *Psql) GetTopAlbumsPaginated(ctx context.Context, opts db.GetItemsOpts)
 		}
 		l.Debug().Msgf("Database responded with %d albums out of a total %d", len(rows), count)
 	}
-	return &db.PaginatedResponse[*models.Album]{
+	return &db.PaginatedResponse[db.RankedItem[*models.Album]]{
 		Items:        rgs,
 		TotalCount:   count,
 		ItemsPerPage: int32(opts.Limit),
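The diff above threads a chart position through the paginated responses via a generic wrapper. The actual `db.RankedItem` and `db.PaginatedResponse` definitions are not shown in this compare view; the following is a minimal sketch inferred from their usage (`rgs[i].Item`, `rgs[i].Rank`, `resp.Items`), with the rank assigned by position only for illustration, whereas in the diff it comes from a rank column in the SQL query:

```go
package main

import "fmt"

// RankedItem and PaginatedResponse are assumed shapes inferred from the diff;
// the real definitions live in the koito db package and may differ.
type RankedItem[T any] struct {
	Item T
	Rank int64
}

type PaginatedResponse[T any] struct {
	Items        []T
	TotalCount   int64
	ItemsPerPage int32
}

type Album struct{ Title string }

// rankAlbums wraps each album with a 1-based chart position, mirroring how
// the queries now return a rank alongside each row.
func rankAlbums(albums []*Album) *PaginatedResponse[RankedItem[*Album]] {
	items := make([]RankedItem[*Album], len(albums))
	for i, a := range albums {
		items[i].Item = a
		items[i].Rank = int64(i + 1)
	}
	return &PaginatedResponse[RankedItem[*Album]]{
		Items:        items,
		TotalCount:   int64(len(albums)),
		ItemsPerPage: int32(len(albums)),
	}
}

func main() {
	resp := rankAlbums([]*Album{{Title: "Release One"}, {Title: "Release Two"}})
	// Callers now read resp.Items[i].Item.Title instead of resp.Items[i].Title.
	fmt.Println(resp.Items[1].Item.Title, resp.Items[1].Rank)
}
```

Wrapping rather than adding a `Rank` field to every model keeps the rank tied to a particular query result instead of the entity itself, which is why the test diffs below all change `resp.Items[0].Title` to `resp.Items[0].Item.Title`.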


@@ -18,16 +18,16 @@ func TestGetTopAlbumsPaginated(t *testing.T) {
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 4)
 	assert.Equal(t, int64(4), resp.TotalCount)
-	assert.Equal(t, "Release One", resp.Items[0].Title)
-	assert.Equal(t, "Release Two", resp.Items[1].Title)
-	assert.Equal(t, "Release Three", resp.Items[2].Title)
-	assert.Equal(t, "Release Four", resp.Items[3].Title)
+	assert.Equal(t, "Release One", resp.Items[0].Item.Title)
+	assert.Equal(t, "Release Two", resp.Items[1].Item.Title)
+	assert.Equal(t, "Release Three", resp.Items[2].Item.Title)
+	assert.Equal(t, "Release Four", resp.Items[3].Item.Title)
 	// Test pagination
 	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Limit: 1, Page: 2, Timeframe: db.Timeframe{Period: db.PeriodAllTime}})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
-	assert.Equal(t, "Release Two", resp.Items[0].Title)
+	assert.Equal(t, "Release Two", resp.Items[0].Item.Title)
 	// Test page out of range
 	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Limit: 1, Page: 10, Timeframe: db.Timeframe{Period: db.PeriodAllTime}})
@@ -57,29 +57,29 @@ func TestGetTopAlbumsPaginated(t *testing.T) {
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
 	assert.Equal(t, int64(1), resp.TotalCount)
-	assert.Equal(t, "Release Four", resp.Items[0].Title)
+	assert.Equal(t, "Release Four", resp.Items[0].Item.Title)
 	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodMonth}})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 2)
 	assert.Equal(t, int64(2), resp.TotalCount)
-	assert.Equal(t, "Release Three", resp.Items[0].Title)
-	assert.Equal(t, "Release Four", resp.Items[1].Title)
+	assert.Equal(t, "Release Three", resp.Items[0].Item.Title)
+	assert.Equal(t, "Release Four", resp.Items[1].Item.Title)
 	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodYear}})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 3)
 	assert.Equal(t, int64(3), resp.TotalCount)
-	assert.Equal(t, "Release Two", resp.Items[0].Title)
-	assert.Equal(t, "Release Three", resp.Items[1].Title)
-	assert.Equal(t, "Release Four", resp.Items[2].Title)
+	assert.Equal(t, "Release Two", resp.Items[0].Item.Title)
+	assert.Equal(t, "Release Three", resp.Items[1].Item.Title)
+	assert.Equal(t, "Release Four", resp.Items[2].Item.Title)
 	// test specific artist
 	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodYear}, ArtistID: 2})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
 	assert.Equal(t, int64(1), resp.TotalCount)
-	assert.Equal(t, "Release Two", resp.Items[0].Title)
+	assert.Equal(t, "Release Two", resp.Items[0].Item.Title)
 	// Test specify dates
@@ -89,11 +89,11 @@ func TestGetTopAlbumsPaginated(t *testing.T) {
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
 	assert.Equal(t, int64(1), resp.TotalCount)
-	assert.Equal(t, "Release One", resp.Items[0].Title)
+	assert.Equal(t, "Release One", resp.Items[0].Item.Title)
 	resp, err = store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Month: 6, Year: 2024}})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
 	assert.Equal(t, int64(1), resp.TotalCount)
-	assert.Equal(t, "Release Two", resp.Items[0].Title)
+	assert.Equal(t, "Release Two", resp.Items[0].Item.Title)
 }


@@ -10,7 +10,7 @@ import (
 	"github.com/gabehf/koito/internal/repository"
 )
-func (d *Psql) GetTopArtistsPaginated(ctx context.Context, opts db.GetItemsOpts) (*db.PaginatedResponse[*models.Artist], error) {
+func (d *Psql) GetTopArtistsPaginated(ctx context.Context, opts db.GetItemsOpts) (*db.PaginatedResponse[db.RankedItem[*models.Artist]], error) {
 	l := logger.FromContext(ctx)
 	offset := (opts.Page - 1) * opts.Limit
 	t1, t2 := db.TimeframeToTimeRange(opts.Timeframe)
@@ -28,7 +28,7 @@ func (d *Psql) GetTopArtistsPaginated(ctx context.Context, opts db.GetItemsOpts)
 	if err != nil {
 		return nil, fmt.Errorf("GetTopArtistsPaginated: GetTopArtistsPaginated: %w", err)
 	}
-	rgs := make([]*models.Artist, len(rows))
+	rgs := make([]db.RankedItem[*models.Artist], len(rows))
 	for i, row := range rows {
 		t := &models.Artist{
 			Name:        row.Name,
@@ -37,7 +37,8 @@ func (d *Psql) GetTopArtistsPaginated(ctx context.Context, opts db.GetItemsOpts)
 			Image:       row.Image,
 			ListenCount: row.ListenCount,
 		}
-		rgs[i] = t
+		rgs[i].Item = t
+		rgs[i].Rank = row.Rank
 	}
 	count, err := d.q.CountTopArtists(ctx, repository.CountTopArtistsParams{
 		ListenedAt: t1,
@@ -48,7 +49,7 @@ func (d *Psql) GetTopArtistsPaginated(ctx context.Context, opts db.GetItemsOpts)
 	}
 	l.Debug().Msgf("Database responded with %d artists out of a total %d", len(rows), count)
-	return &db.PaginatedResponse[*models.Artist]{
+	return &db.PaginatedResponse[db.RankedItem[*models.Artist]]{
 		Items:        rgs,
 		TotalCount:   count,
 		ItemsPerPage: int32(opts.Limit),


@@ -18,16 +18,16 @@ func TestGetTopArtistsPaginated(t *testing.T) {
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 4)
 	assert.Equal(t, int64(4), resp.TotalCount)
-	assert.Equal(t, "Artist One", resp.Items[0].Name)
-	assert.Equal(t, "Artist Two", resp.Items[1].Name)
-	assert.Equal(t, "Artist Three", resp.Items[2].Name)
-	assert.Equal(t, "Artist Four", resp.Items[3].Name)
+	assert.Equal(t, "Artist One", resp.Items[0].Item.Name)
+	assert.Equal(t, "Artist Two", resp.Items[1].Item.Name)
+	assert.Equal(t, "Artist Three", resp.Items[2].Item.Name)
+	assert.Equal(t, "Artist Four", resp.Items[3].Item.Name)
 	// Test pagination
 	resp, err = store.GetTopArtistsPaginated(ctx, db.GetItemsOpts{Limit: 1, Page: 2, Timeframe: db.Timeframe{Period: db.PeriodAllTime}})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
-	assert.Equal(t, "Artist Two", resp.Items[0].Name)
+	assert.Equal(t, "Artist Two", resp.Items[0].Item.Name)
 	// Test page out of range
 	resp, err = store.GetTopArtistsPaginated(ctx, db.GetItemsOpts{Limit: 1, Page: 10, Timeframe: db.Timeframe{Period: db.PeriodAllTime}})
@@ -57,22 +57,22 @@ func TestGetTopArtistsPaginated(t *testing.T) {
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
 	assert.Equal(t, int64(1), resp.TotalCount)
-	assert.Equal(t, "Artist Four", resp.Items[0].Name)
+	assert.Equal(t, "Artist Four", resp.Items[0].Item.Name)
 	resp, err = store.GetTopArtistsPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodMonth}})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 2)
 	assert.Equal(t, int64(2), resp.TotalCount)
-	assert.Equal(t, "Artist Three", resp.Items[0].Name)
-	assert.Equal(t, "Artist Four", resp.Items[1].Name)
+	assert.Equal(t, "Artist Three", resp.Items[0].Item.Name)
+	assert.Equal(t, "Artist Four", resp.Items[1].Item.Name)
 	resp, err = store.GetTopArtistsPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodYear}})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 3)
 	assert.Equal(t, int64(3), resp.TotalCount)
-	assert.Equal(t, "Artist Two", resp.Items[0].Name)
-	assert.Equal(t, "Artist Three", resp.Items[1].Name)
-	assert.Equal(t, "Artist Four", resp.Items[2].Name)
+	assert.Equal(t, "Artist Two", resp.Items[0].Item.Name)
+	assert.Equal(t, "Artist Three", resp.Items[1].Item.Name)
+	assert.Equal(t, "Artist Four", resp.Items[2].Item.Name)
 	// Test specify dates
@@ -82,11 +82,11 @@ func TestGetTopArtistsPaginated(t *testing.T) {
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
 	assert.Equal(t, int64(1), resp.TotalCount)
-	assert.Equal(t, "Artist One", resp.Items[0].Name)
+	assert.Equal(t, "Artist One", resp.Items[0].Item.Name)
 	resp, err = store.GetTopArtistsPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Month: 6, Year: 2024}})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
 	assert.Equal(t, int64(1), resp.TotalCount)
-	assert.Equal(t, "Artist Two", resp.Items[0].Name)
+	assert.Equal(t, "Artist Two", resp.Items[0].Item.Name)
 }


@@ -11,14 +11,14 @@ import (
 	"github.com/gabehf/koito/internal/repository"
 )
-func (d *Psql) GetTopTracksPaginated(ctx context.Context, opts db.GetItemsOpts) (*db.PaginatedResponse[*models.Track], error) {
+func (d *Psql) GetTopTracksPaginated(ctx context.Context, opts db.GetItemsOpts) (*db.PaginatedResponse[db.RankedItem[*models.Track]], error) {
 	l := logger.FromContext(ctx)
 	offset := (opts.Page - 1) * opts.Limit
 	t1, t2 := db.TimeframeToTimeRange(opts.Timeframe)
 	if opts.Limit == 0 {
 		opts.Limit = DefaultItemsPerPage
 	}
-	var tracks []*models.Track
+	var tracks []db.RankedItem[*models.Track]
 	var count int64
 	if opts.AlbumID > 0 {
 		l.Debug().Msgf("Fetching top %d tracks on page %d from range %v to %v",
@@ -33,7 +33,7 @@ func (d *Psql) GetTopTracksPaginated(ctx context.Context, opts db.GetItemsOpts)
 		if err != nil {
 			return nil, fmt.Errorf("GetTopTracksPaginated: GetTopTracksInReleasePaginated: %w", err)
 		}
-		tracks = make([]*models.Track, len(rows))
+		tracks = make([]db.RankedItem[*models.Track], len(rows))
 		for i, row := range rows {
 			artists := make([]models.SimpleArtist, 0)
 			err = json.Unmarshal(row.Artists, &artists)
@@ -50,7 +50,8 @@ func (d *Psql) GetTopTracksPaginated(ctx context.Context, opts db.GetItemsOpts)
 				AlbumID: row.ReleaseID,
 				Artists: artists,
 			}
-			tracks[i] = t
+			tracks[i].Item = t
+			tracks[i].Rank = row.Rank
 		}
 		count, err = d.q.CountTopTracksByRelease(ctx, repository.CountTopTracksByReleaseParams{
 			ListenedAt: t1,
@@ -73,7 +74,7 @@ func (d *Psql) GetTopTracksPaginated(ctx context.Context, opts db.GetItemsOpts)
 		if err != nil {
 			return nil, fmt.Errorf("GetTopTracksPaginated: GetTopTracksByArtistPaginated: %w", err)
 		}
-		tracks = make([]*models.Track, len(rows))
+		tracks = make([]db.RankedItem[*models.Track], len(rows))
 		for i, row := range rows {
 			artists := make([]models.SimpleArtist, 0)
 			err = json.Unmarshal(row.Artists, &artists)
@@ -90,7 +91,8 @@ func (d *Psql) GetTopTracksPaginated(ctx context.Context, opts db.GetItemsOpts)
 				AlbumID: row.ReleaseID,
 				Artists: artists,
 			}
-			tracks[i] = t
+			tracks[i].Item = t
+			tracks[i].Rank = row.Rank
 		}
 		count, err = d.q.CountTopTracksByArtist(ctx, repository.CountTopTracksByArtistParams{
 			ListenedAt: t1,
@@ -112,7 +114,7 @@ func (d *Psql) GetTopTracksPaginated(ctx context.Context, opts db.GetItemsOpts)
 		if err != nil {
 			return nil, fmt.Errorf("GetTopTracksPaginated: GetTopTracksPaginated: %w", err)
 		}
-		tracks = make([]*models.Track, len(rows))
+		tracks = make([]db.RankedItem[*models.Track], len(rows))
 		for i, row := range rows {
 			artists := make([]models.SimpleArtist, 0)
 			err = json.Unmarshal(row.Artists, &artists)
@@ -129,7 +131,8 @@ func (d *Psql) GetTopTracksPaginated(ctx context.Context, opts db.GetItemsOpts)
 				AlbumID: row.ReleaseID,
 				Artists: artists,
 			}
-			tracks[i] = t
+			tracks[i].Item = t
+			tracks[i].Rank = row.Rank
 		}
 		count, err = d.q.CountTopTracks(ctx, repository.CountTopTracksParams{
 			ListenedAt: t1,
@@ -141,7 +144,7 @@ func (d *Psql) GetTopTracksPaginated(ctx context.Context, opts db.GetItemsOpts)
 		l.Debug().Msgf("Database responded with %d tracks out of a total %d", len(rows), count)
 	}
-	return &db.PaginatedResponse[*models.Track]{
+	return &db.PaginatedResponse[db.RankedItem[*models.Track]]{
 		Items:        tracks,
 		TotalCount:   count,
 		ItemsPerPage: int32(opts.Limit),


@@ -18,19 +18,19 @@ func TestGetTopTracksPaginated(t *testing.T) {
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 4)
 	assert.Equal(t, int64(4), resp.TotalCount)
-	assert.Equal(t, "Track One", resp.Items[0].Title)
-	assert.Equal(t, "Track Two", resp.Items[1].Title)
-	assert.Equal(t, "Track Three", resp.Items[2].Title)
-	assert.Equal(t, "Track Four", resp.Items[3].Title)
+	assert.Equal(t, "Track One", resp.Items[0].Item.Title)
+	assert.Equal(t, "Track Two", resp.Items[1].Item.Title)
+	assert.Equal(t, "Track Three", resp.Items[2].Item.Title)
+	assert.Equal(t, "Track Four", resp.Items[3].Item.Title)
 	// ensure artists are included
-	require.Len(t, resp.Items[0].Artists, 1)
-	assert.Equal(t, "Artist One", resp.Items[0].Artists[0].Name)
+	require.Len(t, resp.Items[0].Item.Artists, 1)
+	assert.Equal(t, "Artist One", resp.Items[0].Item.Artists[0].Name)
 	// Test pagination
 	resp, err = store.GetTopTracksPaginated(ctx, db.GetItemsOpts{Limit: 1, Page: 2, Timeframe: db.Timeframe{Period: db.PeriodAllTime}})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
-	assert.Equal(t, "Track Two", resp.Items[0].Title)
+	assert.Equal(t, "Track Two", resp.Items[0].Item.Title)
 	// Test page out of range
 	resp, err = store.GetTopTracksPaginated(ctx, db.GetItemsOpts{Limit: 1, Page: 10, Timeframe: db.Timeframe{Period: db.PeriodAllTime}})
@@ -60,41 +60,41 @@ func TestGetTopTracksPaginated(t *testing.T) {
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
 	assert.Equal(t, int64(1), resp.TotalCount)
-	assert.Equal(t, "Track Four", resp.Items[0].Title)
+	assert.Equal(t, "Track Four", resp.Items[0].Item.Title)
 	resp, err = store.GetTopTracksPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodMonth}})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 2)
 	assert.Equal(t, int64(2), resp.TotalCount)
-	assert.Equal(t, "Track Three", resp.Items[0].Title)
-	assert.Equal(t, "Track Four", resp.Items[1].Title)
+	assert.Equal(t, "Track Three", resp.Items[0].Item.Title)
+	assert.Equal(t, "Track Four", resp.Items[1].Item.Title)
 	resp, err = store.GetTopTracksPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodYear}})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 3)
 	assert.Equal(t, int64(3), resp.TotalCount)
-	assert.Equal(t, "Track Two", resp.Items[0].Title)
-	assert.Equal(t, "Track Three", resp.Items[1].Title)
-	assert.Equal(t, "Track Four", resp.Items[2].Title)
+	assert.Equal(t, "Track Two", resp.Items[0].Item.Title)
+	assert.Equal(t, "Track Three", resp.Items[1].Item.Title)
+	assert.Equal(t, "Track Four", resp.Items[2].Item.Title)
 	// Test filter by artists and releases
 	resp, err = store.GetTopTracksPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodAllTime}, ArtistID: 1})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
 	assert.Equal(t, int64(1), resp.TotalCount)
-	assert.Equal(t, "Track One", resp.Items[0].Title)
+	assert.Equal(t, "Track One", resp.Items[0].Item.Title)
 	resp, err = store.GetTopTracksPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodAllTime}, AlbumID: 2})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
 	assert.Equal(t, int64(1), resp.TotalCount)
-	assert.Equal(t, "Track Two", resp.Items[0].Title)
+	assert.Equal(t, "Track Two", resp.Items[0].Item.Title)
 	// when both artistID and albumID are specified, artist id is ignored
 	resp, err = store.GetTopTracksPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Period: db.PeriodAllTime}, AlbumID: 2, ArtistID: 1})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
 	assert.Equal(t, int64(1), resp.TotalCount)
-	assert.Equal(t, "Track Two", resp.Items[0].Title)
+	assert.Equal(t, "Track Two", resp.Items[0].Item.Title)
 	// Test specify dates
@@ -104,11 +104,11 @@ func TestGetTopTracksPaginated(t *testing.T) {
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
 	assert.Equal(t, int64(1), resp.TotalCount)
-	assert.Equal(t, "Track One", resp.Items[0].Title)
+	assert.Equal(t, "Track One", resp.Items[0].Item.Title)
 	resp, err = store.GetTopTracksPaginated(ctx, db.GetItemsOpts{Timeframe: db.Timeframe{Month: 6, Year: 2024}})
 	require.NoError(t, err)
 	require.Len(t, resp.Items, 1)
 	assert.Equal(t, int64(1), resp.TotalCount)
-	assert.Equal(t, "Track Two", resp.Items[0].Title)
+	assert.Equal(t, "Track Two", resp.Items[0].Item.Title)
 }


@@ -21,37 +21,13 @@ func (d *Psql) GetTrack(ctx context.Context, opts db.GetTrackOpts) (*models.Trac
 	l := logger.FromContext(ctx)
 	var track models.Track
-	if opts.ID != 0 {
-		l.Debug().Msgf("Fetching track from DB with id %d", opts.ID)
-		t, err := d.q.GetTrack(ctx, opts.ID)
-		if err != nil {
-			return nil, fmt.Errorf("GetTrack: GetTrack By ID: %w", err)
-		}
-		track = models.Track{
-			ID:       t.ID,
-			MbzID:    t.MusicBrainzID,
-			Title:    t.Title,
-			AlbumID:  t.ReleaseID,
-			Image:    t.Image,
-			Duration: t.Duration,
-		}
-		err = json.Unmarshal(t.Artists, &track.Artists)
-		if err != nil {
-			return nil, fmt.Errorf("GetTrack: json.Unmarshal: %w", err)
-		}
-	} else if opts.MusicBrainzID != uuid.Nil {
+	if opts.MusicBrainzID != uuid.Nil {
 		l.Debug().Msgf("Fetching track from DB with MusicBrainz ID %s", opts.MusicBrainzID)
 		t, err := d.q.GetTrackByMbzID(ctx, &opts.MusicBrainzID)
 		if err != nil {
 			return nil, fmt.Errorf("GetTrack: GetTrackByMbzID: %w", err)
 		}
-		track = models.Track{
-			ID:       t.ID,
-			MbzID:    t.MusicBrainzID,
-			Title:    t.Title,
-			AlbumID:  t.ReleaseID,
-			Duration: t.Duration,
-		}
+		opts.ID = t.ID
 	} else if len(opts.ArtistIDs) > 0 && opts.ReleaseID != 0 {
 		l.Debug().Msgf("Fetching track from DB from release id %d with title '%s' and artist id(s) '%v'", opts.ReleaseID, opts.Title, opts.ArtistIDs)
 		t, err := d.q.GetTrackByTrackInfo(ctx, repository.GetTrackByTrackInfoParams{
@@ -62,21 +38,19 @@ func (d *Psql) GetTrack(ctx context.Context, opts db.GetTrackOpts) (*models.Trac
 		if err != nil {
 			return nil, fmt.Errorf("GetTrack: GetTrackByTrackInfo: %w", err)
 		}
-		track = models.Track{
-			ID:       t.ID,
-			MbzID:    t.MusicBrainzID,
-			Title:    t.Title,
-			AlbumID:  t.ReleaseID,
-			Duration: t.Duration,
-		}
-	} else {
-		return nil, errors.New("GetTrack: insufficient information to get track")
+		opts.ID = t.ID
 	}
+	l.Debug().Msgf("Fetching track from DB with id %d", opts.ID)
+	t, err := d.q.GetTrack(ctx, opts.ID)
+	if err != nil {
+		return nil, fmt.Errorf("GetTrack: GetTrack By ID: %w", err)
+	}
 	count, err := d.q.CountListensFromTrack(ctx, repository.CountListensFromTrackParams{
 		ListenedAt:   time.Unix(0, 0),
 		ListenedAt_2: time.Now(),
-		TrackID:      track.ID,
+		TrackID:      opts.ID,
 	})
 	if err != nil {
 		return nil, fmt.Errorf("GetTrack: CountListensFromTrack: %w", err)
@@ -84,20 +58,37 @@ func (d *Psql) GetTrack(ctx context.Context, opts db.GetTrackOpts) (*models.Trac
 	seconds, err := d.CountTimeListenedToItem(ctx, db.TimeListenedOpts{
 		Timeframe: db.Timeframe{Period: db.PeriodAllTime},
-		TrackID:   track.ID,
+		TrackID:   opts.ID,
 	})
 	if err != nil {
 		return nil, fmt.Errorf("GetTrack: CountTimeListenedToItem: %w", err)
 	}
-	firstListen, err := d.q.GetFirstListenFromTrack(ctx, track.ID)
+	firstListen, err := d.q.GetFirstListenFromTrack(ctx, opts.ID)
 	if err != nil && !errors.Is(err, pgx.ErrNoRows) {
 		return nil, fmt.Errorf("GetAlbum: GetFirstListenFromRelease: %w", err)
 	}
+	rank, err := d.q.GetTrackAllTimeRank(ctx, opts.ID)
+	if err != nil && !errors.Is(err, pgx.ErrNoRows) {
+		return nil, fmt.Errorf("GetAlbum: GetTrackAllTimeRank: %w", err)
+	}
-	track.ListenCount = count
-	track.TimeListened = seconds
-	track.FirstListen = firstListen.ListenedAt.Unix()
+	track = models.Track{
+		ID:           t.ID,
+		MbzID:        t.MusicBrainzID,
+		Title:        t.Title,
+		AlbumID:      t.ReleaseID,
+		Image:        t.Image,
+		Duration:     t.Duration,
+		AllTimeRank:  rank.Rank,
+		ListenCount:  count,
+		TimeListened: seconds,
+		FirstListen:  firstListen.ListenedAt.Unix(),
+	}
+	err = json.Unmarshal(t.Artists, &track.Artists)
+	if err != nil {
+		return nil, fmt.Errorf("GetTrack: json.Unmarshal: %w", err)
+	}
 	return &track, nil
 }
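The `GetTrack` hunks above collapse three near-identical hydration paths into one: the MusicBrainz-ID and track-info branches now only resolve a canonical `opts.ID`, and a single fetch at the end builds the `models.Track`. A standalone sketch of that resolve-then-fetch shape, using maps as a stand-in for the database and hypothetical names (`getOpts`, `resolveID`, `getTrack`) rather than the koito API:

```go
package main

import (
	"errors"
	"fmt"
)

// getOpts mirrors the lookup options in the diff: a row ID, or an external
// MusicBrainz-style ID, either of which may identify the track.
type getOpts struct {
	ID    int32
	MbzID string
}

// Toy lookup tables standing in for the database.
var byMbz = map[string]int32{"mbz-123": 7}
var byID = map[int32]string{7: "Track Seven"}

// resolveID narrows whatever identifying info was given down to a row ID,
// so the hydration step below needs only one code path.
func resolveID(opts getOpts) (int32, error) {
	if opts.MbzID != "" {
		id, ok := byMbz[opts.MbzID]
		if !ok {
			return 0, errors.New("unknown MusicBrainz ID")
		}
		return id, nil
	}
	if opts.ID != 0 {
		return opts.ID, nil
	}
	return 0, errors.New("insufficient information to get track")
}

// getTrack resolves first, then fetches once, as in the refactored GetTrack.
func getTrack(opts getOpts) (string, error) {
	id, err := resolveID(opts)
	if err != nil {
		return "", err
	}
	title, ok := byID[id]
	if !ok {
		return "", fmt.Errorf("no track with id %d", id)
	}
	return title, nil
}

func main() {
	title, _ := getTrack(getOpts{MbzID: "mbz-123"})
	fmt.Println(title)
}
```

The payoff is visible in the diff: the duplicated `models.Track{...}` literals disappear from the branches, and new per-track data (the all-time rank) only has to be wired into one place.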
@@ -146,6 +137,13 @@ func (d *Psql) SaveTrack(ctx context.Context, opts db.SaveTrackOpts) (*models.Tr
 		if err != nil {
 			return nil, fmt.Errorf("SaveTrack: AssociateArtistToTrack: %w", err)
 		}
+		err = qtx.AssociateArtistToRelease(ctx, repository.AssociateArtistToReleaseParams{
+			ArtistID:  aid,
+			ReleaseID: trackRow.ReleaseID,
+		})
+		if err != nil {
+			return nil, fmt.Errorf("SaveTrack: AssociateArtistToTrack: %w", err)
+		}
 	}
 	// insert primary alias
 	err = qtx.InsertTrackAlias(ctx, repository.InsertTrackAliasParams{
@@ -242,7 +240,28 @@ func (d *Psql) SaveTrackAliases(ctx context.Context, id int32, aliases []string,
 }
 func (d *Psql) DeleteTrack(ctx context.Context, id int32) error {
-	return d.q.DeleteTrack(ctx, id)
+	l := logger.FromContext(ctx)
+	tx, err := d.conn.BeginTx(ctx, pgx.TxOptions{})
+	if err != nil {
+		l.Err(err).Msg("Failed to begin transaction")
+		return fmt.Errorf("DeleteTrack: %w", err)
+	}
+	defer tx.Rollback(ctx)
+	qtx := d.q.WithTx(tx)
+	err = qtx.DeleteTrack(ctx, id)
+	if err != nil {
+		return fmt.Errorf("DeleteTrack: DeleteTrack: %w", err)
+	}
+	// also clean orphaned entries to ensure artists are disassociated with releases where
+	// they no longer have any tracks on the release
+	err = qtx.CleanOrphanedEntries(ctx)
+	if err != nil {
+		return fmt.Errorf("DeleteTrack: CleanOrphanedEntries: %w", err)
+	}
+	return tx.Commit(ctx)
 }
 func (d *Psql) DeleteTrackAlias(ctx context.Context, id int32, alias string) error {
@@ -375,3 +394,29 @@ func (d *Psql) SetPrimaryTrackArtist(ctx context.Context, id int32, artistId int
 	}
 	return tx.Commit(ctx)
 }
+// returns nil, nil when no results
+func (d *Psql) GetTracksWithNoDurationButHaveMbzID(ctx context.Context, from int32) ([]*models.Track, error) {
+	results, err := d.q.GetTracksWithNoDurationButHaveMbzID(ctx, repository.GetTracksWithNoDurationButHaveMbzIDParams{
+		Limit: 20,
+		ID:    from,
+	})
+	if errors.Is(err, pgx.ErrNoRows) {
+		return nil, nil
+	} else if err != nil {
+		return nil, fmt.Errorf("GetTracksWithNoDurationButHaveMbzID: %w", err)
+	}
+	ret := make([]*models.Track, 0)
+	for _, v := range results {
+		ret = append(ret, &models.Track{
+			ID:       v.ID,
+			Duration: v.Duration,
+			MbzID:    v.MusicBrainzID,
+			Title:    v.Title,
+		})
+	}
+	return ret, nil
+}


@@ -62,7 +62,7 @@ func testDataForTracks(t *testing.T) {
VALUES (1, 1), (2, 2)`)
require.NoError(t, err)
// Insert listens
err = store.Exec(context.Background(),
`INSERT INTO listens (user_id, track_id, listened_at)
VALUES (1, 1, NOW()), (1, 2, NOW())`)
@@ -228,3 +228,27 @@ func TestDeleteTrack(t *testing.T) {
_, err = store.Count(ctx, `SELECT * FROM tracks WHERE id = 2`)
require.ErrorIs(t, err, pgx.ErrNoRows) // no rows error
}
func TestReleaseAssociations(t *testing.T) {
testDataForTracks(t)
ctx := context.Background()
track, err := store.SaveTrack(ctx, db.SaveTrackOpts{
Title: "Track Three",
AlbumID: 2,
ArtistIDs: []int32{2, 1}, // Artist Two feat. Artist One
Duration: 100,
})
require.NoError(t, err)
count, err := store.Count(ctx, `SELECT COUNT(*) FROM artist_releases WHERE release_id = 2`)
require.NoError(t, err)
require.Equal(t, 2, count, "expected release to be associated with artist from inserted track")
err = store.DeleteTrack(ctx, track.ID)
require.NoError(t, err)
count, err = store.Count(ctx, `SELECT COUNT(*) FROM artist_releases WHERE release_id = 2`)
require.NoError(t, err)
require.Equal(t, 1, count, "expected artist no longer on release to be disassociated from release")
}


@@ -28,6 +28,11 @@ type PaginatedResponse[T any] struct {
CurrentPage int32 `json:"current_page"`
}
type RankedItem[T any] struct {
Item T `json:"item"`
Rank int64 `json:"rank"`
}
type ExportItem struct {
ListenedAt time.Time
UserID int32


@@ -110,6 +110,9 @@ func (c *DeezerClient) getEntity(ctx context.Context, endpoint string, result an
return nil
}
// Deezer serves a default placeholder image when it can't find one for an artist, so
// this function may download that default image thinking it is an actual artist image.
// I don't know how to fix this yet.
func (c *DeezerClient) GetArtistImages(ctx context.Context, aliases []string) (string, error) {
l := logger.FromContext(ctx)
resp := new(DeezerArtistResponse)


@@ -5,6 +5,7 @@ import (
"context"
"fmt"
"net/http"
"strings"
"sync"
"github.com/gabehf/koito/internal/logger"
@@ -16,6 +17,8 @@ type ImageSource struct {
deezerC *DeezerClient
subsonicEnabled bool
subsonicC *SubsonicClient
lastfmEnabled bool
lastfmC *LastFMClient
caaEnabled bool
}
type ImageSourceOpts struct {
@@ -23,6 +26,7 @@ type ImageSourceOpts struct {
EnableCAA bool
EnableDeezer bool
EnableSubsonic bool
EnableLastFM bool
}
var once sync.Once
@@ -30,6 +34,7 @@ var imgsrc ImageSource
type ArtistImageOpts struct {
Aliases []string
MBID *uuid.UUID
}
type AlbumImageOpts struct {
@@ -55,6 +60,10 @@ func Initialize(opts ImageSourceOpts) {
imgsrc.subsonicEnabled = true
imgsrc.subsonicC = NewSubsonicClient()
}
if opts.EnableLastFM {
imgsrc.lastfmEnabled = true
imgsrc.lastfmC = NewLastFMClient()
}
})
}
@@ -65,31 +74,46 @@ func Shutdown() {
func GetArtistImage(ctx context.Context, opts ArtistImageOpts) (string, error) {
l := logger.FromContext(ctx)
if imgsrc.subsonicEnabled {
img, err := imgsrc.subsonicC.GetArtistImage(ctx, opts.MBID, opts.Aliases[0])
if err != nil {
l.Debug().Err(err).Msg("GetArtistImage: Could not find artist image from Subsonic")
} else if img != "" {
return img, nil
}
} else {
l.Debug().Msg("GetArtistImage: Subsonic image fetching is disabled")
}
if imgsrc.lastfmEnabled {
img, err := imgsrc.lastfmC.GetArtistImage(ctx, opts.MBID, opts.Aliases[0])
if err != nil {
l.Debug().Err(err).Msg("GetArtistImage: Could not find artist image from LastFM")
} else if img != "" {
return img, nil
}
} else {
l.Debug().Msg("GetArtistImage: LastFM image fetching is disabled")
}
if imgsrc.deezerEnabled {
img, err := imgsrc.deezerC.GetArtistImages(ctx, opts.Aliases)
if err != nil {
l.Debug().Err(err).Msg("GetArtistImage: Could not find artist image from Deezer")
} else if img != "" {
return img, nil
}
} else {
l.Debug().Msg("GetArtistImage: Deezer image fetching is disabled")
}
l.Warn().Msg("GetArtistImage: No image providers are enabled")
return "", nil
}
func GetAlbumImage(ctx context.Context, opts AlbumImageOpts) (string, error) {
l := logger.FromContext(ctx)
if imgsrc.subsonicEnabled {
img, err := imgsrc.subsonicC.GetAlbumImage(ctx, opts.ReleaseMbzID, opts.Artists[0], opts.Album)
if err != nil {
l.Debug().Err(err).Msg("GetAlbumImage: Could not find album image from Subsonic")
}
if img != "" {
return img, nil
@@ -102,29 +126,41 @@ func GetAlbumImage(ctx context.Context, opts AlbumImageOpts) (string, error) {
url := fmt.Sprintf(caaBaseUrl+"/release/%s/front", opts.ReleaseMbzID.String())
resp, err := http.DefaultClient.Head(url)
if err != nil {
l.Debug().Err(err).Msg("GetAlbumImage: Could not find album cover from CoverArtArchive with Release MBID")
} else {
if resp.StatusCode == 200 {
return url, nil
} else {
l.Debug().Int("status", resp.StatusCode).Msg("GetAlbumImage: Got non-OK response from CoverArtArchive")
}
}
}
if opts.ReleaseGroupMbzID != nil && *opts.ReleaseGroupMbzID != uuid.Nil {
url := fmt.Sprintf(caaBaseUrl+"/release-group/%s/front", opts.ReleaseGroupMbzID.String())
resp, err := http.DefaultClient.Head(url)
if err != nil {
l.Debug().Err(err).Msg("GetAlbumImage: Could not find album cover from CoverArtArchive with Release Group MBID")
}
if resp.StatusCode == 200 {
return url, nil
}
}
}
if imgsrc.lastfmEnabled {
img, err := imgsrc.lastfmC.GetAlbumImage(ctx, opts.ReleaseMbzID, opts.Artists[0], opts.Album)
if err != nil {
l.Debug().Err(err).Msg("GetAlbumImage: Could not find album cover from LastFM")
}
if img != "" {
return img, nil
}
l.Debug().Msg("Could not find album cover from LastFM")
}
if imgsrc.deezerEnabled {
l.Debug().Msg("Attempting to find album image from Deezer")
img, err := imgsrc.deezerC.GetAlbumImages(ctx, opts.Artists, opts.Album)
if err != nil {
l.Debug().Err(err).Msg("GetAlbumImage: Could not find album image from Deezer")
}
return img, nil
@@ -132,3 +168,23 @@ func GetAlbumImage(ctx context.Context, opts AlbumImageOpts) (string, error) {
l.Warn().Msg("GetAlbumImage: No image providers are enabled")
return "", nil
}
// ValidateImageURL checks if the URL points to a valid image by performing a HEAD request.
func ValidateImageURL(url string) error {
resp, err := http.Head(url)
if err != nil {
return fmt.Errorf("ValidateImageURL: http.Head: %w", err)
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
return fmt.Errorf("ValidateImageURL: HEAD request failed, status code: %d", resp.StatusCode)
}
contentType := resp.Header.Get("Content-Type")
if !strings.HasPrefix(contentType, "image/") {
return fmt.Errorf("ValidateImageURL: URL does not point to an image, content type: %s", contentType)
}
return nil
}

internal/images/lastfm.go Normal file

@@ -0,0 +1,298 @@
package images
import (
"context"
"encoding/json"
"fmt"
"io"
"net/http"
"net/url"
"strings"
"github.com/gabehf/koito/internal/cfg"
"github.com/gabehf/koito/internal/logger"
"github.com/gabehf/koito/queue"
"github.com/google/uuid"
)
// i told gemini to write this cuz i figured it would be simple enough and
// it looks like it just works? maybe ai is actually worth one quintillion gallons of water
type LastFMClient struct {
apiKey string
baseUrl string
userAgent string
requestQueue *queue.RequestQueue
}
// LastFM JSON structures use "#text" for the value of XML-mapped fields
type lastFMImage struct {
URL string `json:"#text"`
Size string `json:"size"`
}
type lastFMAlbumResponse struct {
Album struct {
Name string `json:"name"`
Image []lastFMImage `json:"image"`
} `json:"album"`
Error int `json:"error"`
Message string `json:"message"`
}
type lastFMArtistResponse struct {
Artist struct {
Name string `json:"name"`
Image []lastFMImage `json:"image"`
} `json:"artist"`
Error int `json:"error"`
Message string `json:"message"`
}
const (
lastFMApiBaseUrl = "http://ws.audioscrobbler.com/2.0/"
)
func NewLastFMClient() *LastFMClient {
ret := new(LastFMClient)
ret.apiKey = cfg.LastFMApiKey()
ret.baseUrl = lastFMApiBaseUrl
ret.userAgent = cfg.UserAgent()
ret.requestQueue = queue.NewRequestQueue(5, 5)
return ret
}
func (c *LastFMClient) queue(ctx context.Context, req *http.Request) ([]byte, error) {
l := logger.FromContext(ctx)
req.Header.Set("User-Agent", c.userAgent)
req.Header.Set("Accept", "application/json")
resultChan := c.requestQueue.Enqueue(func(client *http.Client, done chan<- queue.RequestResult) {
resp, err := client.Do(req)
if err != nil {
l.Debug().Err(err).Str("url", req.URL.String()).Msg("Failed to contact LastFM")
done <- queue.RequestResult{Err: err}
return
}
defer resp.Body.Close()
// LastFM might return 200 OK even for API errors (like "Artist not found"),
// so we rely on parsing the JSON body for logic errors later,
// but we still check for HTTP protocol failures here.
if resp.StatusCode >= 500 {
err = fmt.Errorf("received server error from LastFM: %s", resp.Status)
done <- queue.RequestResult{Body: nil, Err: err}
return
}
body, err := io.ReadAll(resp.Body)
done <- queue.RequestResult{Body: body, Err: err}
})
result := <-resultChan
return result.Body, result.Err
}
func (c *LastFMClient) getEntity(ctx context.Context, params url.Values, result any) error {
l := logger.FromContext(ctx)
// Add standard parameters
params.Set("api_key", c.apiKey)
params.Set("format", "json")
// Construct URL
reqUrl, _ := url.Parse(c.baseUrl)
reqUrl.RawQuery = params.Encode()
l.Debug().Msgf("Sending request to LastFM: GET %s", reqUrl.String())
req, err := http.NewRequest("GET", reqUrl.String(), nil)
if err != nil {
return fmt.Errorf("getEntity: %w", err)
}
l.Debug().Msg("Adding LastFM request to queue")
body, err := c.queue(ctx, req)
if err != nil {
l.Err(err).Msg("LastFM request failed")
return fmt.Errorf("getEntity: %w", err)
}
err = json.Unmarshal(body, result)
if err != nil {
l.Err(err).Msg("Failed to unmarshal LastFM response")
return fmt.Errorf("getEntity: %w", err)
}
return nil
}
// selectBestImage picks the largest available image from the LastFM slice
func (c *LastFMClient) selectBestImage(images []lastFMImage) string {
// Rank preference: mega > extralarge > large > medium > small
// Since LastFM usually returns them in order of size, we could take the last one,
// but a map lookup is safer against API changes.
imgMap := make(map[string]string)
for _, img := range images {
if img.URL != "" {
imgMap[img.Size] = img.URL
}
}
if url, ok := imgMap["mega"]; ok {
if err := ValidateImageURL(overrideImgSize(url)); err == nil {
return overrideImgSize(url)
} else {
return url
}
}
if url, ok := imgMap["extralarge"]; ok {
if err := ValidateImageURL(overrideImgSize(url)); err == nil {
return overrideImgSize(url)
} else {
return url
}
}
if url, ok := imgMap["large"]; ok {
if err := ValidateImageURL(overrideImgSize(url)); err == nil {
return overrideImgSize(url)
} else {
return url
}
}
if url, ok := imgMap["medium"]; ok {
return url
}
if url, ok := imgMap["small"]; ok {
return url
}
return ""
}
// lastfm seems to only return a 300x300 image even for "mega" and "extralarge" images, so I'm cheating
func overrideImgSize(url string) string {
return strings.Replace(url, "300x300", "600x600", 1)
}
func (c *LastFMClient) GetAlbumImage(ctx context.Context, mbid *uuid.UUID, artist, album string) (string, error) {
l := logger.FromContext(ctx)
resp := new(lastFMAlbumResponse)
l.Debug().Msgf("Finding album image for %s from artist %s", album, artist)
// Helper to run the fetch
fetch := func(query paramsBuilder) error {
params := url.Values{}
params.Set("method", "album.getInfo")
query(params)
return c.getEntity(ctx, params, resp)
}
// 1. Try MBID search first
if mbid != nil {
l.Debug().Str("mbid", mbid.String()).Msg("Searching album image by MBID")
err := fetch(func(p url.Values) {
p.Set("mbid", mbid.String())
})
// If success and no API error code
if err == nil && resp.Error == 0 && len(resp.Album.Image) > 0 {
best := c.selectBestImage(resp.Album.Image)
if best != "" {
return best, nil
}
} else if resp.Error != 0 {
l.Debug().Int("api_error", resp.Error).Msg("LastFM MBID lookup failed, falling back to name")
}
}
// 2. Fallback to Artist + Album name match
l.Debug().Str("title", album).Str("artist", artist).Msg("Searching album image by title and artist")
// Clear previous response structure just in case
resp = new(lastFMAlbumResponse)
err := fetch(func(p url.Values) {
p.Set("artist", artist)
p.Set("album", album)
// Auto-correct spelling is useful for name lookups
p.Set("autocorrect", "1")
})
if err != nil {
return "", fmt.Errorf("GetAlbumImage: %v", err)
}
if resp.Error != 0 {
return "", fmt.Errorf("GetAlbumImage: LastFM API error %d: %s", resp.Error, resp.Message)
}
best := c.selectBestImage(resp.Album.Image)
if best == "" {
return "", fmt.Errorf("GetAlbumImage: no suitable image found")
}
return best, nil
}
func (c *LastFMClient) GetArtistImage(ctx context.Context, mbid *uuid.UUID, artist string) (string, error) {
l := logger.FromContext(ctx)
resp := new(lastFMArtistResponse)
l.Debug().Msgf("Finding artist image for %s", artist)
fetch := func(query paramsBuilder) error {
params := url.Values{}
params.Set("method", "artist.getInfo")
query(params)
return c.getEntity(ctx, params, resp)
}
// 1. Try MBID search
if mbid != nil {
l.Debug().Str("mbid", mbid.String()).Msg("Searching artist image by MBID")
err := fetch(func(p url.Values) {
p.Set("mbid", mbid.String())
})
if err == nil && resp.Error == 0 && len(resp.Artist.Image) > 0 {
best := c.selectBestImage(resp.Artist.Image)
if best != "" {
// Validate to match Subsonic implementation behavior
if err := ValidateImageURL(best); err == nil {
return best, nil
}
}
}
}
// 2. Fallback to Artist name
l.Debug().Str("artist", artist).Msg("Searching artist image by name")
resp = new(lastFMArtistResponse)
err := fetch(func(p url.Values) {
p.Set("artist", artist)
p.Set("autocorrect", "1")
})
if err != nil {
return "", fmt.Errorf("GetArtistImage: %v", err)
}
if resp.Error != 0 {
return "", fmt.Errorf("GetArtistImage: LastFM API error %d: %s", resp.Error, resp.Message)
}
best := c.selectBestImage(resp.Artist.Image)
if best == "" {
return "", fmt.Errorf("GetArtistImage: no suitable image found")
}
if err := ValidateImageURL(best); err != nil {
return "", fmt.Errorf("GetArtistImage: failed to validate image url")
}
return best, nil
}
type paramsBuilder func(url.Values)


@@ -11,6 +11,7 @@ import (
"github.com/gabehf/koito/internal/cfg"
"github.com/gabehf/koito/internal/logger"
"github.com/gabehf/koito/queue"
"github.com/google/uuid"
)
type SubsonicClient struct {
@@ -26,6 +27,8 @@ type SubsonicAlbumResponse struct {
SearchResult3 struct {
Album []struct {
CoverArt string `json:"coverArt"`
Artist string `json:"artist"`
MBID string `json:"musicBrainzId"`
} `json:"album"`
} `json:"searchResult3"`
} `json:"subsonic-response"`
@@ -43,7 +46,7 @@ type SubsonicArtistResponse struct {
}
const (
subsonicAlbumSearchFmtStr = "/rest/search3?%s&f=json&query=%s&v=1.13.0&c=koito&artistCount=0&songCount=0&albumCount=10"
subsonicArtistSearchFmtStr = "/rest/search3?%s&f=json&query=%s&v=1.13.0&c=koito&artistCount=1&songCount=0&albumCount=0"
subsonicCoverArtFmtStr = "/rest/getCoverArt?%s&id=%s&v=1.13.0&c=koito"
)
@@ -106,32 +109,72 @@ func (c *SubsonicClient) getEntity(ctx context.Context, endpoint string, result
return nil
}
func (c *SubsonicClient) GetAlbumImage(ctx context.Context, mbid *uuid.UUID, artist, album string) (string, error) {
l := logger.FromContext(ctx)
resp := new(SubsonicAlbumResponse)
l.Debug().Msgf("Finding album image for %s from artist %s", album, artist)
// first try mbid search
if mbid != nil {
l.Debug().Str("mbid", mbid.String()).Msg("Searching album image by MBID")
err := c.getEntity(ctx, fmt.Sprintf(subsonicAlbumSearchFmtStr, c.authParams, url.QueryEscape(mbid.String())), resp)
if err != nil {
return "", fmt.Errorf("GetAlbumImage: %v", err)
}
l.Debug().Any("subsonic_response", resp).Msg("")
if len(resp.SubsonicResponse.SearchResult3.Album) >= 1 {
return cfg.SubsonicUrl() + fmt.Sprintf(subsonicCoverArtFmtStr, c.authParams, url.QueryEscape(resp.SubsonicResponse.SearchResult3.Album[0].CoverArt)), nil
}
}
// else do artist match
l.Debug().Str("title", album).Str("artist", artist).Msg("Searching album image by title and artist")
err := c.getEntity(ctx, fmt.Sprintf(subsonicAlbumSearchFmtStr, c.authParams, url.QueryEscape(album)), resp)
if err != nil {
return "", fmt.Errorf("GetAlbumImage: %v", err)
}
l.Debug().Any("subsonic_response", resp).Msg("")
if len(resp.SubsonicResponse.SearchResult3.Album) < 1 {
return "", fmt.Errorf("GetAlbumImage: failed to get album art from subsonic")
}
for _, album := range resp.SubsonicResponse.SearchResult3.Album {
if album.Artist == artist {
return cfg.SubsonicUrl() + fmt.Sprintf(subsonicCoverArtFmtStr, c.authParams, url.QueryEscape(album.CoverArt)), nil
}
}
return "", fmt.Errorf("GetAlbumImage: failed to get album art from subsonic")
}
func (c *SubsonicClient) GetArtistImage(ctx context.Context, mbid *uuid.UUID, artist string) (string, error) {
l := logger.FromContext(ctx)
resp := new(SubsonicArtistResponse)
l.Debug().Msgf("Finding artist image for %s", artist)
// first try mbid search
if mbid != nil {
l.Debug().Str("mbid", mbid.String()).Msg("Searching artist image by MBID")
err := c.getEntity(ctx, fmt.Sprintf(subsonicArtistSearchFmtStr, c.authParams, url.QueryEscape(mbid.String())), resp)
if err != nil {
return "", fmt.Errorf("GetArtistImage: %v", err)
}
l.Debug().Any("subsonic_response", resp).Msg("")
if len(resp.SubsonicResponse.SearchResult3.Artist) < 1 || resp.SubsonicResponse.SearchResult3.Artist[0].ArtistImageUrl == "" {
return "", fmt.Errorf("GetArtistImage: failed to get artist art")
}
// Subsonic seems to have a tendency to return an artist image even though the url is a 404
if err = ValidateImageURL(resp.SubsonicResponse.SearchResult3.Artist[0].ArtistImageUrl); err != nil {
return "", fmt.Errorf("GetArtistImage: failed to validate image url")
}
return resp.SubsonicResponse.SearchResult3.Artist[0].ArtistImageUrl, nil
}
l.Debug().Str("artist", artist).Msg("Searching artist image by name")
err := c.getEntity(ctx, fmt.Sprintf(subsonicArtistSearchFmtStr, c.authParams, url.QueryEscape(artist)), resp)
if err != nil {
return "", fmt.Errorf("GetArtistImage: %v", err)
}
l.Debug().Any("subsonic_response", resp).Msg("")
if len(resp.SubsonicResponse.SearchResult3.Artist) < 1 || resp.SubsonicResponse.SearchResult3.Artist[0].ArtistImageUrl == "" {
return "", fmt.Errorf("GetArtistImage: failed to get artist art")
}
// Subsonic seems to have a tendency to return an artist image even though the url is a 404
if err = ValidateImageURL(resp.SubsonicResponse.SearchResult3.Artist[0].ArtistImageUrl); err != nil {
return "", fmt.Errorf("GetArtistImage: failed to validate image url")
}
return resp.SubsonicResponse.SearchResult3.Artist[0].ArtistImageUrl, nil
}


@@ -85,20 +85,33 @@ func ImportListenBrainzFile(ctx context.Context, store db.DB, mbzc mbz.MusicBrai
}
artistMbzIDs, err := utils.ParseUUIDSlice(payload.TrackMeta.AdditionalInfo.ArtistMBIDs)
if err != nil {
l.Debug().AnErr("error", err).Msg("ImportListenBrainzFile: Failed to parse one or more UUIDs")
}
if len(artistMbzIDs) < 1 {
l.Debug().Msg("ImportListenBrainzFile: Attempting to parse artist UUIDs from mbid_mapping")
artistMbzIDs, err = utils.ParseUUIDSlice(payload.TrackMeta.MBIDMapping.ArtistMBIDs)
if err != nil {
l.Debug().AnErr("error", err).Msg("ImportListenBrainzFile: Failed to parse one or more UUIDs")
}
}
rgMbzID, err := uuid.Parse(payload.TrackMeta.AdditionalInfo.ReleaseGroupMBID)
if err != nil {
rgMbzID = uuid.Nil
}
releaseMbzID, err := uuid.Parse(payload.TrackMeta.AdditionalInfo.ReleaseMBID)
if err != nil {
releaseMbzID, err = uuid.Parse(payload.TrackMeta.MBIDMapping.ReleaseMBID)
if err != nil {
releaseMbzID = uuid.Nil
}
}
recordingMbzID, err := uuid.Parse(payload.TrackMeta.AdditionalInfo.RecordingMBID)
if err != nil {
recordingMbzID, err = uuid.Parse(payload.TrackMeta.MBIDMapping.RecordingMBID)
if err != nil {
recordingMbzID = uuid.Nil
}
}
var client string
if payload.TrackMeta.AdditionalInfo.MediaPlayer != "" {


@@ -12,11 +12,5 @@ type Album struct {
ListenCount int64 `json:"listen_count"`
TimeListened int64 `json:"time_listened"`
FirstListen int64 `json:"first_listen"`
AllTimeRank int64 `json:"all_time_rank"`
}


@@ -12,6 +12,7 @@ type Artist struct {
TimeListened int64 `json:"time_listened"`
FirstListen int64 `json:"first_listen"`
IsPrimary bool `json:"is_primary,omitempty"`
AllTimeRank int64 `json:"all_time_rank"`
}
type SimpleArtist struct {


@@ -13,4 +13,5 @@ type Track struct {
AlbumID int32 `json:"album_id"`
TimeListened int64 `json:"time_listened"`
FirstListen int64 `json:"first_listen"`
AllTimeRank int64 `json:"all_time_rank"`
}


@@ -134,6 +134,39 @@ func (q *Queries) GetArtist(ctx context.Context, id int32) (GetArtistRow, error)
return i, err
}
const getArtistAllTimeRank = `-- name: GetArtistAllTimeRank :one
SELECT
artist_id,
rank
FROM (
SELECT
x.artist_id,
RANK() OVER (ORDER BY x.listen_count DESC) AS rank
FROM (
SELECT
at.artist_id,
COUNT(*) AS listen_count
FROM listens l
JOIN tracks t ON l.track_id = t.id
JOIN artist_tracks at ON t.id = at.track_id
GROUP BY at.artist_id
) x
) ranked -- Postgres requires an alias on a derived table
WHERE artist_id = $1
`
type GetArtistAllTimeRankRow struct {
ArtistID int32
Rank int64
}
func (q *Queries) GetArtistAllTimeRank(ctx context.Context, artistID int32) (GetArtistAllTimeRankRow, error) {
row := q.db.QueryRow(ctx, getArtistAllTimeRank, artistID)
var i GetArtistAllTimeRankRow
err := row.Scan(&i.ArtistID, &i.Rank)
return i, err
}
const getArtistByImage = `-- name: GetArtistByImage :one
SELECT id, musicbrainz_id, image, image_source FROM artists WHERE image = $1 LIMIT 1
`
@@ -221,6 +254,47 @@ func (q *Queries) GetArtistByName(ctx context.Context, alias string) (GetArtistB
return i, err
}
const getArtistsWithoutImages = `-- name: GetArtistsWithoutImages :many
SELECT
id, musicbrainz_id, image, image_source, name
FROM artists_with_name
WHERE image IS NULL
AND id > $2
ORDER BY id ASC
LIMIT $1
`
type GetArtistsWithoutImagesParams struct {
Limit int32
ID int32
}
func (q *Queries) GetArtistsWithoutImages(ctx context.Context, arg GetArtistsWithoutImagesParams) ([]ArtistsWithName, error) {
rows, err := q.db.Query(ctx, getArtistsWithoutImages, arg.Limit, arg.ID)
if err != nil {
return nil, err
}
defer rows.Close()
var items []ArtistsWithName
for rows.Next() {
var i ArtistsWithName
if err := rows.Scan(
&i.ID,
&i.MusicBrainzID,
&i.Image,
&i.ImageSource,
&i.Name,
); err != nil {
return nil, err
}
items = append(items, i)
}
if err := rows.Err(); err != nil {
return nil, err
}
return items, nil
}
const getReleaseArtists = `-- name: GetReleaseArtists :many
SELECT
a.id, a.musicbrainz_id, a.image, a.image_source, a.name,
@@ -269,18 +343,27 @@ func (q *Queries) GetReleaseArtists(ctx context.Context, releaseID int32) ([]Get
const getTopArtistsPaginated = `-- name: GetTopArtistsPaginated :many
SELECT
x.id,
x.name,
x.musicbrainz_id,
x.image,
x.listen_count,
RANK() OVER (ORDER BY x.listen_count DESC) AS rank
FROM (
SELECT
a.id,
a.name,
a.musicbrainz_id,
a.image,
COUNT(*) AS listen_count
FROM listens l
JOIN tracks t ON l.track_id = t.id
JOIN artist_tracks at ON at.track_id = t.id
JOIN artists_with_name a ON a.id = at.artist_id
WHERE l.listened_at BETWEEN $1 AND $2
GROUP BY a.id, a.name, a.musicbrainz_id, a.image
) x
ORDER BY x.listen_count DESC, x.id
LIMIT $3 OFFSET $4
`
@@ -297,6 +380,7 @@ type GetTopArtistsPaginatedRow struct {
MusicBrainzID *uuid.UUID
Image *uuid.UUID
ListenCount int64
Rank int64
}
func (q *Queries) GetTopArtistsPaginated(ctx context.Context, arg GetTopArtistsPaginatedParams) ([]GetTopArtistsPaginatedRow, error) {
@@ -319,6 +403,7 @@ func (q *Queries) GetTopArtistsPaginated(ctx context.Context, arg GetTopArtistsP
&i.MusicBrainzID,
&i.Image,
&i.ListenCount,
&i.Rank,
); err != nil {
return nil, err
}


@@ -15,11 +15,17 @@
DELETE FROM tracks WHERE id NOT IN (SELECT l.track_id FROM listens l);
DELETE FROM releases WHERE id NOT IN (SELECT t.release_id FROM tracks t);
DELETE FROM artists WHERE id NOT IN (SELECT at.artist_id FROM artist_tracks at);
DELETE FROM artist_releases ar
WHERE NOT EXISTS (
SELECT 1
FROM artist_tracks at
JOIN tracks t ON at.track_id = t.id
WHERE at.artist_id = ar.artist_id
AND t.release_id = ar.release_id
);
END $$
`
func (q *Queries) CleanOrphanedEntries(ctx context.Context) error {
_, err := q.db.Exec(ctx, cleanOrphanedEntries)
return err


@@ -11,64 +11,57 @@ import (
)

const getGroupedListensFromArtist = `-- name: GetGroupedListensFromArtist :many
-WITH artist_listens AS (
+WITH bounds AS (
    SELECT
-        l.listened_at
+        MIN(l.listened_at) AS start_time,
+        NOW() AS end_time
    FROM listens l
    JOIN tracks t ON t.id = l.track_id
    JOIN artist_tracks at ON at.track_id = t.id
    WHERE at.artist_id = $1
),
-bounds AS (
+stats AS (
    SELECT
-        MIN(listened_at) AS start_time,
-        MAX(listened_at) AS end_time
-    FROM artist_listens
+        start_time,
+        end_time,
+        EXTRACT(EPOCH FROM (end_time - start_time)) AS total_seconds,
+        ((end_time - start_time) / $2::int) AS bucket_interval
+    FROM bounds
),
-bucketed AS (
+bucket_series AS (
+    SELECT generate_series(0, $2::int - 1) AS idx
+),
+listen_indices AS (
    SELECT
        LEAST(
-            $2 - 1,
+            $2::int - 1,
            FLOOR(
-                (
-                    EXTRACT(EPOCH FROM (al.listened_at - b.start_time))
-                    /
-                    NULLIF(EXTRACT(EPOCH FROM (b.end_time - b.start_time)), 0)
-                ) * $2
+                (EXTRACT(EPOCH FROM (l.listened_at - s.start_time)) / NULLIF(s.total_seconds, 0))
+                * $2::int
            )::int
-        ) AS bucket_idx,
-        b.start_time,
-        b.end_time
-    FROM artist_listens al
-    CROSS JOIN bounds b
-),
-aggregated AS (
-    SELECT
-        start_time
-        + (
-            bucket_idx * (end_time - start_time)
-            / $2
-        ) AS bucket_start,
-        start_time
-        + (
-            (bucket_idx + 1) * (end_time - start_time)
-            / $2
-        ) AS bucket_end,
-        COUNT(*) AS listen_count
-    FROM bucketed
-    GROUP BY bucket_idx, start_time, end_time
+        ) AS bucket_idx
+    FROM listens l
+    JOIN tracks t ON t.id = l.track_id
+    JOIN artist_tracks at ON at.track_id = t.id
+    CROSS JOIN stats s
+    WHERE at.artist_id = $1
+      AND s.start_time IS NOT NULL
)
SELECT
-    bucket_start::timestamptz,
-    bucket_end::timestamptz,
-    listen_count
-FROM aggregated
-ORDER BY bucket_start
+    (s.start_time + (s.bucket_interval * bs.idx))::timestamptz AS bucket_start,
+    (s.start_time + (s.bucket_interval * (bs.idx + 1)))::timestamptz AS bucket_end,
+    COUNT(li.bucket_idx) AS listen_count
+FROM bucket_series bs
+CROSS JOIN stats s
+LEFT JOIN listen_indices li ON bs.idx = li.bucket_idx
+WHERE s.start_time IS NOT NULL
+GROUP BY bs.idx, s.start_time, s.bucket_interval
+ORDER BY bs.idx
`

type GetGroupedListensFromArtistParams struct {
	ArtistID    int32
-	BucketCount interface{}
+	BucketCount int32
}

type GetGroupedListensFromArtistRow struct {
@@ -98,63 +91,55 @@ func (q *Queries) GetGroupedListensFromArtist(ctx context.Context, arg GetGroupe
}

const getGroupedListensFromRelease = `-- name: GetGroupedListensFromRelease :many
-WITH artist_listens AS (
+WITH bounds AS (
    SELECT
-        l.listened_at
+        MIN(l.listened_at) AS start_time,
+        NOW() AS end_time
    FROM listens l
    JOIN tracks t ON t.id = l.track_id
    WHERE t.release_id = $1
),
-bounds AS (
+stats AS (
    SELECT
-        MIN(listened_at) AS start_time,
-        MAX(listened_at) AS end_time
-    FROM artist_listens
+        start_time,
+        end_time,
+        EXTRACT(EPOCH FROM (end_time - start_time)) AS total_seconds,
+        ((end_time - start_time) / $2::int) AS bucket_interval
+    FROM bounds
),
-bucketed AS (
+bucket_series AS (
+    SELECT generate_series(0, $2::int - 1) AS idx
+),
+listen_indices AS (
    SELECT
        LEAST(
-            $2 - 1,
+            $2::int - 1,
            FLOOR(
-                (
-                    EXTRACT(EPOCH FROM (al.listened_at - b.start_time))
-                    /
-                    NULLIF(EXTRACT(EPOCH FROM (b.end_time - b.start_time)), 0)
-                ) * $2
+                (EXTRACT(EPOCH FROM (l.listened_at - s.start_time)) / NULLIF(s.total_seconds, 0))
+                * $2::int
            )::int
-        ) AS bucket_idx,
-        b.start_time,
-        b.end_time
-    FROM artist_listens al
-    CROSS JOIN bounds b
-),
-aggregated AS (
-    SELECT
-        start_time
-        + (
-            bucket_idx * (end_time - start_time)
-            / $2
-        ) AS bucket_start,
-        start_time
-        + (
-            (bucket_idx + 1) * (end_time - start_time)
-            / $2
-        ) AS bucket_end,
-        COUNT(*) AS listen_count
-    FROM bucketed
-    GROUP BY bucket_idx, start_time, end_time
+        ) AS bucket_idx
+    FROM listens l
+    JOIN tracks t ON t.id = l.track_id
+    CROSS JOIN stats s
+    WHERE t.release_id = $1
+      AND s.start_time IS NOT NULL
)
SELECT
-    bucket_start::timestamptz,
-    bucket_end::timestamptz,
-    listen_count
-FROM aggregated
-ORDER BY bucket_start
+    (s.start_time + (s.bucket_interval * bs.idx))::timestamptz AS bucket_start,
+    (s.start_time + (s.bucket_interval * (bs.idx + 1)))::timestamptz AS bucket_end,
+    COUNT(li.bucket_idx) AS listen_count
+FROM bucket_series bs
+CROSS JOIN stats s
+LEFT JOIN listen_indices li ON bs.idx = li.bucket_idx
+WHERE s.start_time IS NOT NULL
+GROUP BY bs.idx, s.start_time, s.bucket_interval
+ORDER BY bs.idx
`

type GetGroupedListensFromReleaseParams struct {
	ReleaseID   int32
-	BucketCount interface{}
+	BucketCount int32
}

type GetGroupedListensFromReleaseRow struct {
@@ -184,63 +169,55 @@ func (q *Queries) GetGroupedListensFromRelease(ctx context.Context, arg GetGroup
}

const getGroupedListensFromTrack = `-- name: GetGroupedListensFromTrack :many
-WITH artist_listens AS (
+WITH bounds AS (
    SELECT
-        l.listened_at
+        MIN(l.listened_at) AS start_time,
+        NOW() AS end_time
    FROM listens l
    JOIN tracks t ON t.id = l.track_id
    WHERE t.id = $1
),
-bounds AS (
+stats AS (
    SELECT
-        MIN(listened_at) AS start_time,
-        MAX(listened_at) AS end_time
-    FROM artist_listens
+        start_time,
+        end_time,
+        EXTRACT(EPOCH FROM (end_time - start_time)) AS total_seconds,
+        ((end_time - start_time) / $2::int) AS bucket_interval
+    FROM bounds
),
-bucketed AS (
+bucket_series AS (
+    SELECT generate_series(0, $2::int - 1) AS idx
+),
+listen_indices AS (
    SELECT
        LEAST(
-            $2 - 1,
+            $2::int - 1,
            FLOOR(
-                (
-                    EXTRACT(EPOCH FROM (al.listened_at - b.start_time))
-                    /
-                    NULLIF(EXTRACT(EPOCH FROM (b.end_time - b.start_time)), 0)
-                ) * $2
+                (EXTRACT(EPOCH FROM (l.listened_at - s.start_time)) / NULLIF(s.total_seconds, 0))
+                * $2::int
            )::int
-        ) AS bucket_idx,
-        b.start_time,
-        b.end_time
-    FROM artist_listens al
-    CROSS JOIN bounds b
-),
-aggregated AS (
-    SELECT
-        start_time
-        + (
-            bucket_idx * (end_time - start_time)
-            / $2
-        ) AS bucket_start,
-        start_time
-        + (
-            (bucket_idx + 1) * (end_time - start_time)
-            / $2
-        ) AS bucket_end,
-        COUNT(*) AS listen_count
-    FROM bucketed
-    GROUP BY bucket_idx, start_time, end_time
+        ) AS bucket_idx
+    FROM listens l
+    JOIN tracks t ON t.id = l.track_id
+    CROSS JOIN stats s
+    WHERE t.id = $1
+      AND s.start_time IS NOT NULL
)
SELECT
-    bucket_start::timestamptz,
-    bucket_end::timestamptz,
-    listen_count
-FROM aggregated
-ORDER BY bucket_start
+    (s.start_time + (s.bucket_interval * bs.idx))::timestamptz AS bucket_start,
+    (s.start_time + (s.bucket_interval * (bs.idx + 1)))::timestamptz AS bucket_end,
+    COUNT(li.bucket_idx) AS listen_count
+FROM bucket_series bs
+CROSS JOIN stats s
+LEFT JOIN listen_indices li ON bs.idx = li.bucket_idx
+WHERE s.start_time IS NOT NULL
+GROUP BY bs.idx, s.start_time, s.bucket_interval
+ORDER BY bs.idx
`

type GetGroupedListensFromTrackParams struct {
	ID          int32
-	BucketCount interface{}
+	BucketCount int32
}

type GetGroupedListensFromTrackRow struct {


@@ -141,6 +141,38 @@ func (q *Queries) GetRelease(ctx context.Context, id int32) (GetReleaseRow, erro
	return i, err
}
const getReleaseAllTimeRank = `-- name: GetReleaseAllTimeRank :one
SELECT
release_id,
rank
FROM (
SELECT
x.release_id,
RANK() OVER (ORDER BY x.listen_count DESC) AS rank
FROM (
SELECT
t.release_id,
COUNT(*) AS listen_count
FROM listens l
JOIN tracks t ON l.track_id = t.id
GROUP BY t.release_id
) x
) y
WHERE release_id = $1
`
type GetReleaseAllTimeRankRow struct {
ReleaseID int32
Rank int64
}
func (q *Queries) GetReleaseAllTimeRank(ctx context.Context, releaseID int32) (GetReleaseAllTimeRankRow, error) {
row := q.db.QueryRow(ctx, getReleaseAllTimeRank, releaseID)
var i GetReleaseAllTimeRankRow
err := row.Scan(&i.ReleaseID, &i.Rank)
return i, err
}
const getReleaseByArtistAndTitle = `-- name: GetReleaseByArtistAndTitle :one
SELECT r.id, r.musicbrainz_id, r.image, r.various_artists, r.image_source, r.title
FROM releases_with_title r
@@ -321,17 +353,22 @@ func (q *Queries) GetReleasesWithoutImages(ctx context.Context, arg GetReleasesW
const getTopReleasesFromArtist = `-- name: GetTopReleasesFromArtist :many
SELECT
+    x.id, x.musicbrainz_id, x.image, x.various_artists, x.image_source, x.title, x.listen_count,
+    get_artists_for_release(x.id) AS artists,
+    RANK() OVER (ORDER BY x.listen_count DESC) AS rank
+FROM (
+    SELECT
        r.id, r.musicbrainz_id, r.image, r.various_artists, r.image_source, r.title,
-       COUNT(*) AS listen_count,
-       get_artists_for_release(r.id) AS artists
+       COUNT(*) AS listen_count
    FROM listens l
    JOIN tracks t ON l.track_id = t.id
    JOIN releases_with_title r ON t.release_id = r.id
    JOIN artist_releases ar ON r.id = ar.release_id
    WHERE ar.artist_id = $5
        AND l.listened_at BETWEEN $1 AND $2
    GROUP BY r.id, r.title, r.musicbrainz_id, r.various_artists, r.image, r.image_source
-   ORDER BY listen_count DESC, r.id
+) x
+ORDER BY listen_count DESC, x.id
LIMIT $3 OFFSET $4
`
@@ -352,6 +389,7 @@ type GetTopReleasesFromArtistRow struct {
	Title       string
	ListenCount int64
	Artists     []byte
+	Rank        int64
}

func (q *Queries) GetTopReleasesFromArtist(ctx context.Context, arg GetTopReleasesFromArtistParams) ([]GetTopReleasesFromArtistRow, error) {
@@ -378,6 +416,7 @@ func (q *Queries) GetTopReleasesFromArtist(ctx context.Context, arg GetTopReleas
		&i.Title,
		&i.ListenCount,
		&i.Artists,
+		&i.Rank,
	); err != nil {
		return nil, err
	}
@@ -391,15 +430,20 @@ func (q *Queries) GetTopReleasesFromArtist(ctx context.Context, arg GetTopReleas
const getTopReleasesPaginated = `-- name: GetTopReleasesPaginated :many
SELECT
+    x.id, x.musicbrainz_id, x.image, x.various_artists, x.image_source, x.title, x.listen_count,
+    get_artists_for_release(x.id) AS artists,
+    RANK() OVER (ORDER BY x.listen_count DESC) AS rank
+FROM (
+    SELECT
        r.id, r.musicbrainz_id, r.image, r.various_artists, r.image_source, r.title,
-       COUNT(*) AS listen_count,
-       get_artists_for_release(r.id) AS artists
+       COUNT(*) AS listen_count
    FROM listens l
    JOIN tracks t ON l.track_id = t.id
    JOIN releases_with_title r ON t.release_id = r.id
    WHERE l.listened_at BETWEEN $1 AND $2
    GROUP BY r.id, r.title, r.musicbrainz_id, r.various_artists, r.image, r.image_source
-   ORDER BY listen_count DESC, r.id
+) x
+ORDER BY listen_count DESC, x.id
LIMIT $3 OFFSET $4
`
@@ -419,6 +463,7 @@ type GetTopReleasesPaginatedRow struct {
	Title       string
	ListenCount int64
	Artists     []byte
+	Rank        int64
}

func (q *Queries) GetTopReleasesPaginated(ctx context.Context, arg GetTopReleasesPaginatedParams) ([]GetTopReleasesPaginatedRow, error) {
@@ -444,6 +489,7 @@ func (q *Queries) GetTopReleasesPaginated(ctx context.Context, arg GetTopRelease
		&i.Title,
		&i.ListenCount,
		&i.Artists,
+		&i.Rank,
	); err != nil {
		return nil, err
	}


@@ -155,22 +155,30 @@ func (q *Queries) GetAllTracksFromArtist(ctx context.Context, artistID int32) ([
const getTopTracksByArtistPaginated = `-- name: GetTopTracksByArtistPaginated :many
SELECT
-    t.id,
+    x.track_id AS id,
    t.title,
    t.musicbrainz_id,
    t.release_id,
    r.image,
+    x.listen_count,
+    get_artists_for_track(x.track_id) AS artists,
+    x.rank
+FROM (
+    SELECT
+        l.track_id,
        COUNT(*) AS listen_count,
-       get_artists_for_track(t.id) AS artists
+        RANK() OVER (ORDER BY COUNT(*) DESC) as rank
    FROM listens l
-   JOIN tracks_with_title t ON l.track_id = t.id
-   JOIN releases r ON t.release_id = r.id
-   JOIN artist_tracks at ON at.track_id = t.id
+   JOIN artist_tracks at ON l.track_id = at.track_id
    WHERE l.listened_at BETWEEN $1 AND $2
        AND at.artist_id = $5
-   GROUP BY t.id, t.title, t.musicbrainz_id, t.release_id, r.image
-   ORDER BY listen_count DESC, t.id
+   GROUP BY l.track_id
+   ORDER BY listen_count DESC
    LIMIT $3 OFFSET $4
+) x
+JOIN tracks_with_title t ON x.track_id = t.id
+JOIN releases r ON t.release_id = r.id
+ORDER BY x.listen_count DESC, x.track_id
`
type GetTopTracksByArtistPaginatedParams struct {
@@ -189,6 +197,7 @@ type GetTopTracksByArtistPaginatedRow struct {
	Image       *uuid.UUID
	ListenCount int64
	Artists     []byte
+	Rank        int64
}

func (q *Queries) GetTopTracksByArtistPaginated(ctx context.Context, arg GetTopTracksByArtistPaginatedParams) ([]GetTopTracksByArtistPaginatedRow, error) {
@@ -214,6 +223,7 @@ func (q *Queries) GetTopTracksByArtistPaginated(ctx context.Context, arg GetTopT
		&i.Image,
		&i.ListenCount,
		&i.Artists,
+		&i.Rank,
	); err != nil {
		return nil, err
	}
@@ -227,21 +237,30 @@ func (q *Queries) GetTopTracksByArtistPaginated(ctx context.Context, arg GetTopT
const getTopTracksInReleasePaginated = `-- name: GetTopTracksInReleasePaginated :many
SELECT
-    t.id,
+    x.track_id AS id,
    t.title,
    t.musicbrainz_id,
    t.release_id,
    r.image,
+    x.listen_count,
+    get_artists_for_track(x.track_id) AS artists,
+    x.rank
+FROM (
+    SELECT
+        l.track_id,
        COUNT(*) AS listen_count,
-       get_artists_for_track(t.id) AS artists
+        RANK() OVER (ORDER BY COUNT(*) DESC) as rank
    FROM listens l
-   JOIN tracks_with_title t ON l.track_id = t.id
-   JOIN releases r ON t.release_id = r.id
+   JOIN tracks t ON l.track_id = t.id
    WHERE l.listened_at BETWEEN $1 AND $2
        AND t.release_id = $5
-   GROUP BY t.id, t.title, t.musicbrainz_id, t.release_id, r.image
-   ORDER BY listen_count DESC, t.id
+   GROUP BY l.track_id
+   ORDER BY listen_count DESC
    LIMIT $3 OFFSET $4
+) x
+JOIN tracks_with_title t ON x.track_id = t.id
+JOIN releases r ON t.release_id = r.id
+ORDER BY x.listen_count DESC, x.track_id
`
type GetTopTracksInReleasePaginatedParams struct {
@@ -260,6 +279,7 @@ type GetTopTracksInReleasePaginatedRow struct {
	Image       *uuid.UUID
	ListenCount int64
	Artists     []byte
+	Rank        int64
}

func (q *Queries) GetTopTracksInReleasePaginated(ctx context.Context, arg GetTopTracksInReleasePaginatedParams) ([]GetTopTracksInReleasePaginatedRow, error) {
@@ -285,6 +305,7 @@ func (q *Queries) GetTopTracksInReleasePaginated(ctx context.Context, arg GetTop
		&i.Image,
		&i.ListenCount,
		&i.Artists,
+		&i.Rank,
	); err != nil {
		return nil, err
	}
@@ -298,20 +319,28 @@ func (q *Queries) GetTopTracksInReleasePaginated(ctx context.Context, arg GetTop
const getTopTracksPaginated = `-- name: GetTopTracksPaginated :many
SELECT
-    t.id,
+    x.track_id AS id,
    t.title,
    t.musicbrainz_id,
    t.release_id,
    r.image,
+    x.listen_count,
+    get_artists_for_track(x.track_id) AS artists,
+    x.rank
+FROM (
+    SELECT
+        track_id,
        COUNT(*) AS listen_count,
-       get_artists_for_track(t.id) AS artists
-FROM listens l
-JOIN tracks_with_title t ON l.track_id = t.id
+        RANK() OVER (ORDER BY COUNT(*) DESC) as rank
+    FROM listens
+    WHERE listened_at BETWEEN $1 AND $2
+    GROUP BY track_id
+    ORDER BY listen_count DESC
+    LIMIT $3 OFFSET $4
+) x
+JOIN tracks_with_title t ON x.track_id = t.id
JOIN releases r ON t.release_id = r.id
-WHERE l.listened_at BETWEEN $1 AND $2
-GROUP BY t.id, t.title, t.musicbrainz_id, t.release_id, r.image
-ORDER BY listen_count DESC, t.id
-LIMIT $3 OFFSET $4
+ORDER BY x.listen_count DESC, x.track_id
`
type GetTopTracksPaginatedParams struct {
@@ -329,6 +358,7 @@ type GetTopTracksPaginatedRow struct {
	Image       *uuid.UUID
	ListenCount int64
	Artists     []byte
+	Rank        int64
}

func (q *Queries) GetTopTracksPaginated(ctx context.Context, arg GetTopTracksPaginatedParams) ([]GetTopTracksPaginatedRow, error) {
@@ -353,6 +383,7 @@ func (q *Queries) GetTopTracksPaginated(ctx context.Context, arg GetTopTracksPag
		&i.Image,
		&i.ListenCount,
		&i.Artists,
+		&i.Rank,
	); err != nil {
		return nil, err
	}
@@ -399,6 +430,37 @@ func (q *Queries) GetTrack(ctx context.Context, id int32) (GetTrackRow, error) {
	return i, err
}
const getTrackAllTimeRank = `-- name: GetTrackAllTimeRank :one
SELECT
id,
rank
FROM (
SELECT
x.id,
RANK() OVER (ORDER BY x.listen_count DESC) AS rank
FROM (
SELECT
t.id,
COUNT(*) AS listen_count
FROM listens l
JOIN tracks_with_title t ON l.track_id = t.id
GROUP BY t.id) x
) y
WHERE id = $1
`
type GetTrackAllTimeRankRow struct {
ID int32
Rank int64
}
func (q *Queries) GetTrackAllTimeRank(ctx context.Context, id int32) (GetTrackAllTimeRankRow, error) {
row := q.db.QueryRow(ctx, getTrackAllTimeRank, id)
var i GetTrackAllTimeRankRow
err := row.Scan(&i.ID, &i.Rank)
return i, err
}
const getTrackByMbzID = `-- name: GetTrackByMbzID :one
SELECT id, musicbrainz_id, duration, release_id, title FROM tracks_with_title
WHERE musicbrainz_id = $1 LIMIT 1
@@ -447,6 +509,48 @@ func (q *Queries) GetTrackByTrackInfo(ctx context.Context, arg GetTrackByTrackIn
	return i, err
}
const getTracksWithNoDurationButHaveMbzID = `-- name: GetTracksWithNoDurationButHaveMbzID :many
SELECT
id, musicbrainz_id, duration, release_id, title
FROM tracks_with_title
WHERE duration = 0
AND musicbrainz_id IS NOT NULL
AND id > $2
ORDER BY id ASC
LIMIT $1
`
type GetTracksWithNoDurationButHaveMbzIDParams struct {
Limit int32
ID int32
}
func (q *Queries) GetTracksWithNoDurationButHaveMbzID(ctx context.Context, arg GetTracksWithNoDurationButHaveMbzIDParams) ([]TracksWithTitle, error) {
rows, err := q.db.Query(ctx, getTracksWithNoDurationButHaveMbzID, arg.Limit, arg.ID)
if err != nil {
return nil, err
}
defer rows.Close()
var items []TracksWithTitle
for rows.Next() {
var i TracksWithTitle
if err := rows.Scan(
&i.ID,
&i.MusicBrainzID,
&i.Duration,
&i.ReleaseID,
&i.Title,
); err != nil {
return nil, err
}
items = append(items, i)
}
if err := rows.Err(); err != nil {
return nil, err
}
return items, nil
}
const insertTrack = `-- name: InsertTrack :one
INSERT INTO tracks (musicbrainz_id, release_id, duration)
VALUES ($1, $2, $3)


@@ -10,9 +10,9 @@ import (
type Summary struct {
	Title      string                          `json:"title,omitempty"`
-	TopArtists []*models.Artist                `json:"top_artists"` // ListenCount and TimeListened are overriden with stats from timeframe
-	TopAlbums  []*models.Album                 `json:"top_albums"`  // ListenCount and TimeListened are overriden with stats from timeframe
-	TopTracks  []*models.Track                 `json:"top_tracks"`  // ListenCount and TimeListened are overriden with stats from timeframe
+	TopArtists []db.RankedItem[*models.Artist] `json:"top_artists"` // ListenCount and TimeListened are overriden with stats from timeframe
+	TopAlbums  []db.RankedItem[*models.Album]  `json:"top_albums"`  // ListenCount and TimeListened are overriden with stats from timeframe
+	TopTracks  []db.RankedItem[*models.Track]  `json:"top_tracks"`  // ListenCount and TimeListened are overriden with stats from timeframe
	MinutesListened  int `json:"minutes_listened"`
	AvgMinutesPerDay int `json:"avg_minutes_listened_per_day"`
	Plays            int `json:"plays"`
@@ -37,16 +37,16 @@ func GenerateSummary(ctx context.Context, store db.DB, userId int32, timeframe d
	summary.TopArtists = topArtists.Items
	// replace ListenCount and TimeListened with stats from timeframe
	for i, artist := range summary.TopArtists {
-		timelistened, err := store.CountTimeListenedToItem(ctx, db.TimeListenedOpts{ArtistID: artist.ID, Timeframe: timeframe})
+		timelistened, err := store.CountTimeListenedToItem(ctx, db.TimeListenedOpts{ArtistID: artist.Item.ID, Timeframe: timeframe})
		if err != nil {
			return nil, fmt.Errorf("GenerateSummary: %w", err)
		}
-		listens, err := store.CountListensToItem(ctx, db.TimeListenedOpts{ArtistID: artist.ID, Timeframe: timeframe})
+		listens, err := store.CountListensToItem(ctx, db.TimeListenedOpts{ArtistID: artist.Item.ID, Timeframe: timeframe})
		if err != nil {
			return nil, fmt.Errorf("GenerateSummary: %w", err)
		}
-		summary.TopArtists[i].TimeListened = timelistened
-		summary.TopArtists[i].ListenCount = listens
+		summary.TopArtists[i].Item.TimeListened = timelistened
+		summary.TopArtists[i].Item.ListenCount = listens
	}
	topAlbums, err := store.GetTopAlbumsPaginated(ctx, db.GetItemsOpts{Page: 1, Limit: 5, Timeframe: timeframe})
@@ -56,16 +56,16 @@ func GenerateSummary(ctx context.Context, store db.DB, userId int32, timeframe d
	summary.TopAlbums = topAlbums.Items
	// replace ListenCount and TimeListened with stats from timeframe
	for i, album := range summary.TopAlbums {
-		timelistened, err := store.CountTimeListenedToItem(ctx, db.TimeListenedOpts{AlbumID: album.ID, Timeframe: timeframe})
+		timelistened, err := store.CountTimeListenedToItem(ctx, db.TimeListenedOpts{AlbumID: album.Item.ID, Timeframe: timeframe})
		if err != nil {
			return nil, fmt.Errorf("GenerateSummary: %w", err)
		}
-		listens, err := store.CountListensToItem(ctx, db.TimeListenedOpts{AlbumID: album.ID, Timeframe: timeframe})
+		listens, err := store.CountListensToItem(ctx, db.TimeListenedOpts{AlbumID: album.Item.ID, Timeframe: timeframe})
		if err != nil {
			return nil, fmt.Errorf("GenerateSummary: %w", err)
		}
-		summary.TopAlbums[i].TimeListened = timelistened
-		summary.TopAlbums[i].ListenCount = listens
+		summary.TopAlbums[i].Item.TimeListened = timelistened
+		summary.TopAlbums[i].Item.ListenCount = listens
	}
	topTracks, err := store.GetTopTracksPaginated(ctx, db.GetItemsOpts{Page: 1, Limit: 5, Timeframe: timeframe})
@@ -75,16 +75,16 @@ func GenerateSummary(ctx context.Context, store db.DB, userId int32, timeframe d
	summary.TopTracks = topTracks.Items
	// replace ListenCount and TimeListened with stats from timeframe
	for i, track := range summary.TopTracks {
-		timelistened, err := store.CountTimeListenedToItem(ctx, db.TimeListenedOpts{TrackID: track.ID, Timeframe: timeframe})
+		timelistened, err := store.CountTimeListenedToItem(ctx, db.TimeListenedOpts{TrackID: track.Item.ID, Timeframe: timeframe})
		if err != nil {
			return nil, fmt.Errorf("GenerateSummary: %w", err)
		}
-		listens, err := store.CountListensToItem(ctx, db.TimeListenedOpts{TrackID: track.ID, Timeframe: timeframe})
+		listens, err := store.CountListensToItem(ctx, db.TimeListenedOpts{TrackID: track.Item.ID, Timeframe: timeframe})
		if err != nil {
			return nil, fmt.Errorf("GenerateSummary: %w", err)
		}
-		summary.TopTracks[i].TimeListened = timelistened
-		summary.TopTracks[i].ListenCount = listens
+		summary.TopTracks[i].Item.TimeListened = timelistened
+		summary.TopTracks[i].Item.ListenCount = listens
	}
t1, t2 := db.TimeframeToTimeRange(timeframe) t1, t2 := db.TimeframeToTimeRange(timeframe)
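The Summary hunks above wrap each top item in `db.RankedItem[...]`, so per-timeframe stats are now written through `.Item` while the rank travels alongside. The repository's `db.RankedItem` definition is not shown in this diff; the sketch below assumes a plausible shape (field names and JSON tags are guesses):

```go
package main

import "fmt"

// Assumed shape of a generic ranked wrapper like db.RankedItem;
// the real struct's fields and JSON tags may differ.
type RankedItem[T any] struct {
	Item T     `json:"item"`
	Rank int64 `json:"rank"`
}

// Stand-in for models.Artist with only the fields this sketch touches.
type Artist struct {
	ID          int32
	ListenCount int64
}

func main() {
	top := []RankedItem[*Artist]{{Item: &Artist{ID: 7}, Rank: 1}}
	// Timeframe stats are written through Item, as in the updated
	// GenerateSummary loops, leaving the rank untouched.
	top[0].Item.ListenCount = 42
	fmt.Println(top[0].Rank, top[0].Item.ListenCount) // 1 42
}
```

Keeping the rank outside the model avoids widening every model struct with a field that is only meaningful in paginated "top" listings.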


@@ -18,7 +18,7 @@
},
"album": {
"image_url": "https://cdn-images.dzcdn.net/images/cover/1f54d600d0ce5c88a6b2fd75659ec796/1000x1000-000000-80-0-0.jpg",
-"mbid": null,
+"mbid": "d0ec30bd-7cdc-417c-979d-5a0631b8a161",
"aliases": [
{
"alias": "American Football (LP3)",
@@ -70,7 +70,7 @@
},
"album": {
"image_url": "https://cdn-images.dzcdn.net/images/cover/1f54d600d0ce5c88a6b2fd75659ec796/1000x1000-000000-80-0-0.jpg",
-"mbid": null,
+"mbid": "d0ec30bd-7cdc-417c-979d-5a0631b8a161",
"aliases": [
{
"alias": "American Football (LP3)",
@@ -122,7 +122,7 @@
},
"album": {
"image_url": "https://cdn-images.dzcdn.net/images/cover/1f54d600d0ce5c88a6b2fd75659ec796/1000x1000-000000-80-0-0.jpg",
-"mbid": null,
+"mbid": "d0ec30bd-7cdc-417c-979d-5a0631b8a161",
"aliases": [
{
"alias": "American Football (LP3)",
@@ -174,7 +174,7 @@
},
"album": {
"image_url": "https://cdn-images.dzcdn.net/images/cover/1f54d600d0ce5c88a6b2fd75659ec796/1000x1000-000000-80-0-0.jpg",
-"mbid": null,
+"mbid": "d0ec30bd-7cdc-417c-979d-5a0631b8a161",
"aliases": [
{
"alias": "American Football (LP3)",
@@ -226,7 +226,7 @@
},
"album": {
"image_url": "https://cdn-images.dzcdn.net/images/cover/1f54d600d0ce5c88a6b2fd75659ec796/1000x1000-000000-80-0-0.jpg",
-"mbid": null,
+"mbid": "d0ec30bd-7cdc-417c-979d-5a0631b8a161",
"aliases": [
{
"alias": "American Football (LP3)",
@@ -278,7 +278,7 @@
},
"album": {
"image_url": "https://cdn-images.dzcdn.net/images/cover/1f54d600d0ce5c88a6b2fd75659ec796/1000x1000-000000-80-0-0.jpg",
-"mbid": null,
+"mbid": "d0ec30bd-7cdc-417c-979d-5a0631b8a161",
"aliases": [
{
"alias": "American Football (LP3)",
@@ -330,7 +330,7 @@
},
"album": {
"image_url": "https://cdn-images.dzcdn.net/images/cover/1f54d600d0ce5c88a6b2fd75659ec796/1000x1000-000000-80-0-0.jpg",
-"mbid": null,
+"mbid": "d0ec30bd-7cdc-417c-979d-5a0631b8a161",
"aliases": [
{
"alias": "American Football (LP3)",

Binary file not shown.