Koito/internal/importer/spotify.go
safierinx-a 8ce6ec494d Add bulk import optimization: track_lookup cache, batch inserts, BulkSubmitter
Adopts ListenBrainz-inspired patterns to speed up imports from ~24h to
under 30 minutes for 49k scrobbles.

Phase 1 - track_lookup cache table:
- New migration (000006) adds persistent entity lookup cache
- Maps normalized (artist, track, album) → (artist_id, album_id, track_id)
- SubmitListen fast path: a cache hit cuts entity resolution from 18 DB queries to 2 (sketched after this list)
- Cache populated after entity resolution, invalidated on merge/delete
- Benefits both live scrobbles and imports
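
A minimal sketch of the fast path, assuming a track_lookup table keyed on lowercased (artist, track, album) strings; LookupIDs, normalizeKey, and the column names here are illustrative, not the migration's actual schema:

package lookup

import (
	"context"
	"errors"
	"strings"

	"github.com/jackc/pgx/v5"
	"github.com/jackc/pgx/v5/pgxpool"
)

func normalizeKey(s string) string {
	return strings.ToLower(strings.TrimSpace(s))
}

// LookupIDs returns the cached entity IDs for a normalized
// (artist, track, album) triple. ok=false means a cache miss, so the caller
// falls back to full entity resolution and then populates the cache.
func LookupIDs(ctx context.Context, pool *pgxpool.Pool, artist, track, album string) (artistID, albumID, trackID int32, ok bool, err error) {
	err = pool.QueryRow(ctx,
		`SELECT artist_id, album_id, track_id
		   FROM track_lookup
		  WHERE artist_key = $1 AND track_key = $2 AND album_key = $3`,
		normalizeKey(artist), normalizeKey(track), normalizeKey(album),
	).Scan(&artistID, &albumID, &trackID)
	if errors.Is(err, pgx.ErrNoRows) {
		return 0, 0, 0, false, nil
	}
	if err != nil {
		return 0, 0, 0, false, err
	}
	return artistID, albumID, trackID, true, nil
}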

Phase 2 - SaveListensBatch:
- New batch listen insert: pgx CopyFrom into a temp table, then INSERT ... ON CONFLICT into listens (shape sketched below)
- Thousands of inserts per second instead of one row at a time
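
Roughly this shape with pgx; the temp table, listenRow type, column list, and conflict handling below are assumptions rather than the real SaveListensBatch:

package batch

import (
	"context"
	"time"

	"github.com/jackc/pgx/v5"
	"github.com/jackc/pgx/v5/pgxpool"
)

type listenRow struct {
	TrackID    int32
	UserID     int32
	ListenedAt time.Time
	Client     string
}

// saveListensBatch streams rows into a temp table with COPY, then moves them
// into listens in one statement so duplicates are dropped by ON CONFLICT.
func saveListensBatch(ctx context.Context, pool *pgxpool.Pool, rows []listenRow) error {
	tx, err := pool.Begin(ctx)
	if err != nil {
		return err
	}
	defer tx.Rollback(ctx)

	if _, err := tx.Exec(ctx,
		`CREATE TEMP TABLE tmp_listens (LIKE listens INCLUDING DEFAULTS) ON COMMIT DROP`); err != nil {
		return err
	}

	_, err = tx.CopyFrom(ctx, pgx.Identifier{"tmp_listens"},
		[]string{"track_id", "user_id", "listened_at", "client"},
		pgx.CopyFromSlice(len(rows), func(i int) ([]any, error) {
			r := rows[i]
			return []any{r.TrackID, r.UserID, r.ListenedAt, r.Client}, nil
		}))
	if err != nil {
		return err
	}

	if _, err := tx.Exec(ctx,
		`INSERT INTO listens (track_id, user_id, listened_at, client)
		 SELECT track_id, user_id, listened_at, client FROM tmp_listens
		 ON CONFLICT DO NOTHING`); err != nil {
		return err
	}
	return tx.Commit(ctx)
}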

Phase 3 - BulkSubmitter:
- Reusable import accelerator for all importers
- Pre-deduplicates scrobbles by (artist, track, album) in memory
- Worker pool (4 goroutines) creates entities in parallel on cache misses (see the sketch after this list)
- Batch listen insertion via SaveListensBatch
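
A rough sketch of the dedup-then-fan-out core; dedupKey and resolveUnique are hypothetical helpers, and the real BulkSubmitter also carries the pending listens it later hands to SaveListensBatch:

package importer

import (
	"context"
	"strings"
	"sync"
)

// dedupKey collapses scrobbles that map to the same entities, so each unique
// (artist, track, album) triple is resolved against the DB only once.
func dedupKey(artist, track, album string) string {
	return strings.ToLower(artist) + "\x00" + strings.ToLower(track) + "\x00" + strings.ToLower(album)
}

// resolveUnique fans unique keys out to a fixed pool of workers; resolve is
// whatever creates or loads the entities on a track_lookup cache miss.
func resolveUnique(ctx context.Context, keys []string, resolve func(context.Context, string) error) error {
	const workers = 4
	jobs := make(chan string)
	errs := make(chan error, 1)
	var wg sync.WaitGroup

	for i := 0; i < workers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for k := range jobs {
				if err := resolve(ctx, k); err != nil {
					select {
					case errs <- err: // keep only the first error
					default:
					}
				}
			}
		}()
	}
	for _, k := range keys {
		jobs <- k
	}
	close(jobs)
	wg.Wait()

	select {
	case err := <-errs:
		return err
	default:
		return nil
	}
}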

Phase 4 - Migrate importers:
- Maloja, Spotify, LastFM, ListenBrainz importers use BulkSubmitter
- Koito importer left as-is (already fast with pre-resolved IDs)

Phase 5 - Skip image lookups during import:
- GetArtistImage/GetAlbumImage calls are skipped entirely when SkipCacheImage=true (guard sketched below)
- Background tasks (FetchMissingArtistImages/FetchMissingAlbumImages) backfill the images afterward
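
In effect the resolution path gains a guard like the one below; maybeFetchArtistImage and its callback are illustrative stand-ins, not the actual functions:

package catalog

import "context"

// maybeFetchArtistImage shows where SkipCacheImage short-circuits external
// image lookups during bulk imports; the FetchMissingArtistImages background
// task backfills the images later.
func maybeFetchArtistImage(ctx context.Context, skipCacheImage bool, artistID int32,
	fetchArtistImage func(context.Context, int32) error) error {
	if skipCacheImage {
		return nil
	}
	return fetchArtistImage(ctx, artistID)
}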

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 04:17:50 +05:30


package importer

import (
	"context"
	"encoding/json"
	"fmt"
	"os"
	"path"
	"time"

	"github.com/gabehf/koito/internal/catalog"
	"github.com/gabehf/koito/internal/cfg"
	"github.com/gabehf/koito/internal/db"
	"github.com/gabehf/koito/internal/logger"
	"github.com/gabehf/koito/internal/mbz"
)

// SpotifyExportItem is a single entry from Spotify's extended streaming
// history export.
type SpotifyExportItem struct {
	Timestamp  time.Time `json:"ts"`
	TrackName  string    `json:"master_metadata_track_name"`
	ArtistName string    `json:"master_metadata_album_artist_name"`
	AlbumName  string    `json:"master_metadata_album_album_name"`
	ReasonEnd  string    `json:"reason_end"`
	MsPlayed   int32     `json:"ms_played"`
}

// ImportSpotifyFile reads a Spotify extended streaming history file from the
// import directory and feeds completed plays into a BulkSubmitter, which
// deduplicates entities and batch-inserts the listens.
func ImportSpotifyFile(ctx context.Context, store db.DB, mbzc mbz.MusicBrainzCaller, filename string) error {
	l := logger.FromContext(ctx)
	l.Info().Msgf("Beginning spotify import on file: %s", filename)

	file, err := os.Open(path.Join(cfg.ConfigDir(), "import", filename))
	if err != nil {
		l.Err(err).Msgf("Failed to read import file: %s", filename)
		return fmt.Errorf("ImportSpotifyFile: %w", err)
	}
	defer file.Close()

	export := make([]SpotifyExportItem, 0)
	err = json.NewDecoder(file).Decode(&export)
	if err != nil {
		return fmt.Errorf("ImportSpotifyFile: %w", err)
	}

	bs := NewBulkSubmitter(ctx, BulkSubmitterOpts{
		Store: store,
		Mbzc:  mbzc,
	})

	for _, item := range export {
		// Only count tracks that played through to completion.
		if item.ReasonEnd != "trackdone" {
			continue
		}
		if !inImportTimeWindow(item.Timestamp) {
			continue
		}
		// Skip entries without track/artist metadata (e.g. podcast episodes).
		if item.TrackName == "" || item.ArtistName == "" {
			l.Debug().Msg("Skipping non-track item")
			continue
		}
		bs.Accept(catalog.SubmitListenOpts{
			MbzCaller:      mbzc,
			Artist:         item.ArtistName,
			TrackTitle:     item.TrackName,
			ReleaseTitle:   item.AlbumName,
			Duration:       item.MsPlayed / 1000, // ms -> seconds
			Time:           item.Timestamp,
			Client:         "spotify",
			UserID:         1,
			SkipCacheImage: true, // images are backfilled by background tasks
		})
	}

	count, err := bs.Flush()
	if err != nil {
		return fmt.Errorf("ImportSpotifyFile: %w", err)
	}
	return finishImport(ctx, filename, count)
}