[3.2][Audio] Part 6 (Last? maybe?) (#3244)

* Removes `MAX_BALANCE` from bank; use `bank.get_max_balance()` now.
`[p]bankset maxbal` can be used to set the maximum bank balance.

Signed-off-by: Guy <guyreis96@gmail.com>
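
A minimal sketch of how calling code can consume this change; the cog and command names here (`BalanceInfo`, `maxbal`) are illustrative, only `redbot.core.bank.get_max_balance()` and `[p]bankset maxbal` come from the commit above.

```python
from redbot.core import bank, commands


class BalanceInfo(commands.Cog):
    """Illustrative cog, not part of this PR."""

    @commands.command()
    async def maxbal(self, ctx: commands.Context):
        # The hard-coded MAX_BALANCE constant is gone; the limit is now
        # whatever an admin last set with `[p]bankset maxbal`.
        limit = await bank.get_max_balance(ctx.guild)
        await ctx.send(f"Maximum balance here: {limit}")
```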

* Initial Commit

Signed-off-by: Draper <27962761+Drapersniper@users.noreply.github.com>

* I need to make sure I keep aika on her toes.

Signed-off-by: Draper <27962761+Drapersniper@users.noreply.github.com>

* Fixes a few missing kwargs and case consistency

Signed-off-by: Draper <27962761+Drapersniper@users.noreply.github.com>

* Fixes a few missing kwargs and case consistency v2 and typos

Signed-off-by: Draper <27962761+Drapersniper@users.noreply.github.com>

* Reset cooldowns + add changelogs

Signed-off-by: Draper <27962761+Drapersniper@users.noreply.github.com>

* Add 3 extra file formats.

Signed-off-by: Draper <27962761+Drapersniper@users.noreply.github.com>

* IRDUMB - fix capitalization.

Signed-off-by: Draper <27962761+Drapersniper@users.noreply.github.com>

* Fix a silent error, and some incorrect messages.

Signed-off-by: guyre <27962761+drapersniper@users.noreply.github.com>

* Remove unnecessary emojis from the queue

Signed-off-by: guyre <27962761+drapersniper@users.noreply.github.com>

* Remove duplicated call in `[p]playlist update`

Signed-off-by: guyre <27962761+drapersniper@users.noreply.github.com>

* Remove duplicated call in `[p]playlist update`

Signed-off-by: guyre <27962761+drapersniper@users.noreply.github.com>

* Resolve conflicts

Signed-off-by: Draper <27962761+Drapersniper@users.noreply.github.com>

* Bring all files up to date + Black

Signed-off-by: Draper <27962761+Drapersniper@users.noreply.github.com>

* Facepalm

Signed-off-by: Draper <27962761+Drapersniper@users.noreply.github.com>

* *Sigh*

Signed-off-by: Draper <27962761+Drapersniper@users.noreply.github.com>

* *Sigh* 2.0

Signed-off-by: Draper <27962761+Drapersniper@users.noreply.github.com>

* Merge branch 'V3/develop' of https://github.com/Cog-Creators/Red-DiscordBot into audio-misc-pt1

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

# Resolve Conflicts:
#	redbot/cogs/audio/audio.py
#	redbot/cogs/audio/utils.py

* Import missing Typecheck

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Fix Broken docstrings

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Sort Local Tracks

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* 🤦

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Reorder the sorting of local tracks:
`alphanumerical lower, then alphanumerical upper`
(`a` comes before `A`, but `B` comes after `A`)

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Black formatting

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Make the local file sorting case insensitive

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>
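
A standalone illustration of the sorting behaviour described in the two commits above; the cog itself sorts its `LocalPath`/`Query` objects via their user-facing string (see the `key=lambda x: x.to_string_user().lower()` calls in the diff further down), this snippet only shows the key-function idea.

```python
tracks = ["Zebra.mp3", "apple.mp3", "Banana.mp3", "cherry.mp3"]

# A plain sort would place every uppercase name before every lowercase one
# ("Banana.mp3", "Zebra.mp3", "apple.mp3", "cherry.mp3").  Sorting on the
# lower-cased value interleaves them alphabetically instead.
tracks.sort(key=str.lower)
print(tracks)  # ['apple.mp3', 'Banana.mp3', 'cherry.mp3', 'Zebra.mp3']
```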

* Add global blacklist/whitelist + fix some issues with the original server-based whitelist/blacklist

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>
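
A hedged sketch of how a layered global + per-server whitelist/blacklist check is commonly structured; the function name and list parameters are illustrative, not the cog's actual config schema.

```python
from typing import Sequence


def is_query_allowed(
    query: str,
    global_whitelist: Sequence[str],
    global_blacklist: Sequence[str],
    guild_whitelist: Sequence[str],
    guild_blacklist: Sequence[str],
) -> bool:
    query = query.lower()
    # A non-empty whitelist is exclusive: only listed terms may pass.
    if global_whitelist:
        return any(term.lower() in query for term in global_whitelist)
    if any(term.lower() in query for term in global_blacklist):
        return False
    if guild_whitelist:
        return any(term.lower() in query for term in guild_whitelist)
    return not any(term.lower() in query for term in guild_blacklist)
```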

* Remove the pre-commit yaml

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Nottin to see

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Further improvement to the blacklists

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Further improvement to the blacklists

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Fix the `__str__` method on the LocalTracks object

* Rename LocalTracks.to_string_hidden() to LocalTracks.to_string_user() to keep it in line with the Query object

* Remove encoding pragmas + a few typo fixes

* Update some typehints + fix some typos

* Remove this duplicate call

* Black

* fix capitalization

* Address preda's review

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Remove the API from the audio cog

 - It is in direct conflict with goals stated in #2804
 - Features this was intended to enable can be enabled in other, more appropriate ways later on

* changelog

* Address Aika's review

* Black

* *sigh* don't use the GitHub web UI

* Fuck Windows, long live Linux... *sigh* no, let's ensure Windows users can still use local tracks

* Merge branch 'V3/develop' of https://github.com/Cog-Creators/Red-DiscordBot into refactoring

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

# Conflicts:
#	redbot/cogs/audio/audio.py

* 👀 + chore

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* facepalm

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* facepalm... again y u h8 me bruh

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* fuk this fuk u tube fuck python fuck all

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* awyehfqwajefhnqeffawefqa eqewarfqaesf qwef qaf qwfr

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* fuck everything

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* oh lord saviour resus i love you just make this work

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Change logic to require no errors within the last 10 seconds... this should be a valid workaround for Discord ratelimits caused by the spam
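
The commit does not show the implementation; as a rough sketch, a "no errors within the last 10 seconds" guard usually amounts to remembering when the last failure happened. The names below are illustrative, not the cog's real attributes.

```python
import time

ERROR_WINDOW = 10  # seconds of quiet required before reporting again
_last_error_at = 0.0


def should_report_error() -> bool:
    """Report an error only if none was recorded in the last ERROR_WINDOW seconds."""
    global _last_error_at
    now = time.monotonic()
    quiet_long_enough = (now - _last_error_at) > ERROR_WINDOW
    _last_error_at = now
    return quiet_long_enough
```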

* Remove auto deletion

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* See I did a ting

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* irdumb

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* black

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Add an is_url attribute to Query objects

* chore

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Black

* Address Aikas review

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Hyperlink Playlist names

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Make shit bold

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* why was this here

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* why was this here

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Initial commit

* Workinnng

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Improve SQL statements + migrate from SQLAlchemy + Databases to APSW

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>
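
For context, APSW is a thin synchronous wrapper around SQLite, so the cog can now issue plain SQL directly instead of going through SQLAlchemy/``Databases``. A minimal usage sketch follows; the table and column names are simplified, not the cog's exact schema.

```python
import apsw

connection = apsw.Connection("Audio.db")
cursor = connection.cursor()

cursor.execute(
    "CREATE TABLE IF NOT EXISTS lavalink "
    "(query TEXT PRIMARY KEY, data TEXT, last_fetched INTEGER);"
)
# APSW accepts named bindings passed as a dict.
cursor.execute(
    "INSERT OR REPLACE INTO lavalink (query, data, last_fetched) "
    "VALUES (:query, :data, :last_fetched);",
    {"query": "ytsearch:example", "data": "{}", "last_fetched": 0},
)
row = cursor.execute(
    "SELECT data FROM lavalink WHERE query = :query;", {"query": "ytsearch:example"}
).fetchone()
```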

* apsw tested and working

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* chore

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Migrate Playlist to DB 3. TODO:
  1. Migrate Config to Schema 3 without playlists and update get_playlist methods

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Revert "Migrate Playlist to DB 3 TODO 1 Migrate Config to Schema 3 without playlists and update get_playlist methods"

This reverts commit 4af33cff

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Implement schema migration
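
A hedged sketch of what a SQLite schema migration of this shape commonly looks like using ``PRAGMA user_version``; the cog's real migration logic is not reproduced in full here and may differ in detail. The version number matches the "keep in line with Config's schema versioning" intent noted later.

```python
import apsw

SCHEMA_VERSION = 3  # illustrative; aligned with Config's schema versioning


def maybe_migrate(connection: apsw.Connection) -> None:
    cursor = connection.cursor()
    (current,) = cursor.execute("PRAGMA user_version;").fetchone()
    if current >= SCHEMA_VERSION:
        return
    # Cached query results are cheap to rebuild, so dropping and recreating
    # the table is an acceptable "migration" for this kind of data.
    cursor.execute("DROP TABLE IF EXISTS lavalink;")
    cursor.execute(
        "CREATE TABLE IF NOT EXISTS lavalink "
        "(query TEXT PRIMARY KEY, data TEXT, last_fetched INTEGER);"
    )
    cursor.execute(f"PRAGMA user_version = {SCHEMA_VERSION};")
```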

* Let's not touch the deps since #3192 is already adding them

* chore

* *sigh* Black

* Follow the existing logic and always default Playlist to guild scope

* wghqjegqf black

* Update usage of last_fetched and last_updated to be ints... however, the column migration is still pending

* Some bug fixes

* Update usage of last_fetched and last_updated to be ints... however, the column migration is still pending

* working

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* partial match

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* better partial match

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* black

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* I thought I'd done this before

* Delete 3195.misc.1.rst

Wrong PR

* Thanks Sinbad

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Thanks Sinbad

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Log errors in init ...

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Update error logs.

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Create index

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* :Drapersweat:

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Chore

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Revert "Chore"

This reverts commit edcc9a9f

UGHHHH

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Allow removing tracks from queue by URL

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>
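
Mechanically this is "filter the live queue by the track's URI"; a hedged sketch follows, treating ``player.queue`` as a mutable list of ``lavalink.Track`` objects with a ``uri`` attribute (an assumption for illustration, not something shown in this diff).

```python
def remove_tracks_by_url(player, url: str) -> int:
    """Drop every queued track whose URI matches ``url`` and return how many were removed."""
    before = len(player.queue)
    # Assumes player.queue is a mutable list of lavalink.Track objects
    # exposing a `.uri` attribute.
    player.queue = [track for track in player.queue if track.uri != url]
    return before - len(player.queue)
```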

* Words matter

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* *sigh*

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* chore

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* arghhh CONFLICTS

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Review Sinbad's latest comment

TODO: nuke existing playlists - check version and set version

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Migrate the DB schema to v3 (to keep in line with the schema versioning of Config)

* Add a Todo

* *sigh* conflicts and black

* *sigh* black

* Passively delete playlist deletion mechanism

* Delete Old entries on startup

* Since we are dropping the table, might as well make these into JSON for future-proofing

* Don't dump strings in the JSON field? :think:

* Move some things around to make it easier to use one connection to the Audio DB

* Move some things around to make it easier to use one connection to the Audio DB

* *sigh*

* Clean up api

* *sigh* black

* Red + reorder some variables

* 🤦

* how could i forget this .......

* Black

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Black

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Black

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* #automagically

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* FINAFUCKINGLY

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* FINAFUCKINGLY

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Remove unused config default

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Remove the API from the audio Cog (Properly)

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Missed these changes

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* ARGHHH

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Some fixes I've noticed while running through the code line by line

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Translation + UX (show the playlist author's ID if the user can't be found)

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* *sigh* missed this one

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* this is no longer needed ....

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* 🤦

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* fix new lines in error messages

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Black

* Sinbad's review

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Sinbad's review

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* *sigh* copy paste

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Improve backups

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Im a fucking idiot

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Fix #3238

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* chore

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* humans

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* humans

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* add play alias to playlists

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Im dumb ...

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Im dumb ...

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* fix new line

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* fix new line

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* show playlist count on playlist picker

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* DJ/Vote system fixes

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* DJ/Vote system fixes

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* *sigh* fix currency check

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* show playlist count on playlist picker

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* DJ/Vote system fixes

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* DJ/Vote system fixes

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* *sigh* fix currency check

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Fix duplicate messages on timeout

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* fix SQL Statement logic

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* fix SQL Statement logic

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Markdown escape

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Markdown escape

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Markdown escape fix

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Markdown escape fix

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>
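
These commits correspond to the "Escape track descriptions so that they do not break markdown" changelog entry further down. A minimal sketch of the pattern using discord.py's helper; the function and parameter names here are illustrative.

```python
import discord


def track_description(title: str, url: str) -> str:
    # Escaping the title stops characters such as *, _, ~ and ` in track
    # names from being rendered as Discord markdown inside an embed.
    safe_title = discord.utils.escape_markdown(title)
    return f"[{safe_title}]({url})"
```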

* clean up local cache more frequently

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* clean up db more frequently

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Await in hell

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* *sigh* im dumb

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* *sigh* im dumb

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Black cuz I hate red

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Black cuz I hate red

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* StringIO to BytesIO

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* StringIO to BytesIO

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>
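
Context for the two commits above: ``discord.File`` reads a binary file-like object, so text assembled in memory has to be encoded into a ``BytesIO`` before upload. A hedged sketch; the helper name and filename are illustrative.

```python
from io import BytesIO

import discord


async def send_text_attachment(channel: discord.abc.Messageable, payload: str) -> None:
    # Encode the in-memory text to bytes; this is why StringIO was swapped
    # for BytesIO in these commits.
    buffer = BytesIO(payload.encode("utf-8"))
    await channel.send(file=discord.File(buffer, filename="playlists.txt"))
```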

* *sigh* im dumb

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* :Facepalm: the whole purpose of this is so it's offline, so it can be backed up without blocking

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Run write queries on ThreadPoolExecutor

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>
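
A sketch of the general pattern for keeping synchronous APSW writes off the event loop; the executor size, function names, and signatures are illustrative rather than the cog's actual ones.

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor
from typing import Mapping

import apsw

_executor = ThreadPoolExecutor(max_workers=1)  # a single writer thread avoids lock contention


def _write(connection: apsw.Connection, sql: str, values: Mapping) -> None:
    connection.cursor().execute(sql, values)


async def run_write(connection: apsw.Connection, sql: str, values: Mapping) -> None:
    loop = asyncio.get_event_loop()
    # The blocking SQLite write runs on the executor thread, keeping the
    # bot's event loop responsive.
    await loop.run_in_executor(_executor, _write, connection, sql, values)
```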

* Backup Audio.db

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* *sigh* im dumb

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* blaaaack

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* *sigh*

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* formatting

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* Remove a duplicated block of code

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

* ffs awaits

Signed-off-by: Drapersniper <27962761+drapersniper@users.noreply.github.com>

Co-authored-by: Michael H <michael@michaelhall.tech>
Authored by Draper on 2020-01-04 01:36:09 +00:00, committed by Michael H
parent 1d2dd19244, commit 95e8d60729
43 changed files with 4128 additions and 1938 deletions

.github/CODEOWNERS (2 changes)

@ -30,7 +30,7 @@ redbot/core/utils/dbtools.py @mikeshardmind
# Cogs
redbot/cogs/admin/* @tekulvw
redbot/cogs/alias/* @tekulvw
redbot/cogs/audio/* @aikaterna
redbot/cogs/audio/* @aikaterna @Drapersniper
redbot/cogs/bank/* @tekulvw
redbot/cogs/cleanup/* @palmtree5
redbot/cogs/customcom/* @palmtree5

.gitignore (3 changes)

@ -137,3 +137,6 @@ ENV/
# pytest
.pytest_cache/
# Pre-commit hooks
/.pre-commit-config.yaml


@ -0,0 +1 @@
Escape track descriptions so that they do not break markdown.


@ -0,0 +1 @@
Two changes: removed the ``Databases`` dependency and migrated it over to APSW.


@ -1 +1 @@
New dependency: ``databases[sqlite]`` .
New dependency: ``databases[sqlite]``.


@ -0,0 +1,5 @@
When playing a localtrack, ``[p]play`` and ``[p]bumpplay`` no longer require the "localtracks\\" prefix.
Before: ``[p]bumpplay localtracks\\ENM\\501 - Inside The Machine.mp3``
Now: ``[p]bumpplay ENM\\501 - Inside The Machine.mp3``
Nested folders now also work: ``[p]bumpplay Parent Folder\\Nested Folder\\track.mp3``


@ -0,0 +1 @@
Fix track index being off by 1 on the ``[p]search`` command.


@ -0,0 +1 @@
Expanded local track support to all file formats (m3u, m4a, mp4, etc).


@ -0,0 +1 @@
Reset the cooldown when a command that has a cooldown timer fails.


@ -0,0 +1 @@
``[p]bumpplay`` command has been added.


@ -0,0 +1 @@
``[p]shuffle`` command has an additional argument to tell the bot whether it should shuffle bumped tracks.


@ -0,0 +1 @@
DJ_ENABLED and DJ_ROLE settings are now stored in memory after the first fetch, to reduce duplicated calls.


@ -0,0 +1 @@
Fix an issue where updating your Spotify and YouTube Data API tokens did not refresh them.


@ -0,0 +1 @@
Fix an issue where the blacklist was not being applied correctly.


@ -0,0 +1 @@
Fix an issue in ``[p]audioset restrictions blacklist list`` where it would call the list a `Whitelist`.


@ -0,0 +1 @@
Add global whitelist/blacklist commands.


@ -0,0 +1 @@
Add `cache.db` to the list of items not included in a backup.


@ -0,0 +1 @@
Removed an undocumented API from the Audio cog.


@ -0,0 +1 @@
Fixed an error that was thrown when running ``[p]audioset dj``.


@ -0,0 +1 @@
Better error handling when the player is unable to play multiple tracks in sequence.


@ -0,0 +1 @@
Fixed an attribute error raised in :meth:`event_handler`.


@ -0,0 +1 @@
Migrate playlists to their own dedicated playlist table and remove them from the Config driver.


@ -0,0 +1 @@
The ``[p]remove`` command now accepts a URL or index; if a URL is used, all tracks in the queue with that URL are removed.


@ -0,0 +1 @@
Fixed a crash that could happen when the bot can't connect to the Lavalink node.


@ -4,25 +4,10 @@ import contextlib
import datetime
import json
import logging
import os
import random
import time
import traceback
from collections import namedtuple
from typing import Callable, Dict, List, Mapping, Optional, Tuple, Union
try:
from sqlite3 import Error as SQLError
from databases import Database
HAS_SQL = True
_ERROR = None
except ImportError as err:
_ERROR = "".join(traceback.format_exception_only(type(err), err)).strip()
HAS_SQL = False
SQLError = err.__class__
Database = None
from typing import Callable, List, MutableMapping, Optional, TYPE_CHECKING, Tuple, Union, NoReturn
import aiohttp
import discord
@ -32,129 +17,38 @@ from lavalink.rest_api import LoadResult
from redbot.core import Config, commands
from redbot.core.bot import Red
from redbot.core.i18n import Translator, cog_i18n
from . import audio_dataclasses
from .errors import InvalidTableError, SpotifyFetchError, YouTubeApiError, DatabaseError
from .databases import CacheInterface, SQLError
from .errors import DatabaseError, SpotifyFetchError, YouTubeApiError, TrackEnqueueError
from .playlists import get_playlist
from .utils import CacheLevel, Notifier, is_allowed, queue_duration, track_limit
log = logging.getLogger("red.audio.cache")
_ = Translator("Audio", __file__)
_DROP_YOUTUBE_TABLE = "DROP TABLE youtube;"
_CREATE_YOUTUBE_TABLE = """
CREATE TABLE IF NOT EXISTS youtube(
id INTEGER PRIMARY KEY AUTOINCREMENT,
track_info TEXT,
youtube_url TEXT,
last_updated TEXT,
last_fetched TEXT
);
"""
_CREATE_UNIQUE_INDEX_YOUTUBE_TABLE = (
"CREATE UNIQUE INDEX IF NOT EXISTS idx_youtube_url ON youtube (track_info, youtube_url);"
)
_INSERT_YOUTUBE_TABLE = """
INSERT OR REPLACE INTO
youtube(track_info, youtube_url, last_updated, last_fetched)
VALUES (:track_info, :track_url, :last_updated, :last_fetched);
"""
_QUERY_YOUTUBE_TABLE = "SELECT * FROM youtube WHERE track_info=:track;"
_UPDATE_YOUTUBE_TABLE = """UPDATE youtube
SET last_fetched=:last_fetched
WHERE track_info=:track;"""
_DROP_SPOTIFY_TABLE = "DROP TABLE spotify;"
_CREATE_UNIQUE_INDEX_SPOTIFY_TABLE = (
"CREATE UNIQUE INDEX IF NOT EXISTS idx_spotify_uri ON spotify (id, type, uri);"
)
_CREATE_SPOTIFY_TABLE = """
CREATE TABLE IF NOT EXISTS spotify(
id TEXT,
type TEXT,
uri TEXT,
track_name TEXT,
artist_name TEXT,
song_url TEXT,
track_info TEXT,
last_updated TEXT,
last_fetched TEXT
);
"""
_INSERT_SPOTIFY_TABLE = """
INSERT OR REPLACE INTO
spotify(id, type, uri, track_name, artist_name,
song_url, track_info, last_updated, last_fetched)
VALUES (:id, :type, :uri, :track_name, :artist_name,
:song_url, :track_info, :last_updated, :last_fetched);
"""
_QUERY_SPOTIFY_TABLE = "SELECT * FROM spotify WHERE uri=:uri;"
_UPDATE_SPOTIFY_TABLE = """UPDATE spotify
SET last_fetched=:last_fetched
WHERE uri=:uri;"""
_DROP_LAVALINK_TABLE = "DROP TABLE lavalink;"
_CREATE_LAVALINK_TABLE = """
CREATE TABLE IF NOT EXISTS lavalink(
query TEXT,
data BLOB,
last_updated TEXT,
last_fetched TEXT
);
"""
_CREATE_UNIQUE_INDEX_LAVALINK_TABLE = (
"CREATE UNIQUE INDEX IF NOT EXISTS idx_lavalink_query ON lavalink (query);"
)
_INSERT_LAVALINK_TABLE = """
INSERT OR REPLACE INTO
lavalink(query, data, last_updated, last_fetched)
VALUES (:query, :data, :last_updated, :last_fetched);
"""
_QUERY_LAVALINK_TABLE = "SELECT * FROM lavalink WHERE query=:query;"
_QUERY_LAST_FETCHED_LAVALINK_TABLE = (
"SELECT * FROM lavalink "
"WHERE last_fetched LIKE :day1"
" OR last_fetched LIKE :day2"
" OR last_fetched LIKE :day3"
" OR last_fetched LIKE :day4"
" OR last_fetched LIKE :day5"
" OR last_fetched LIKE :day6"
" OR last_fetched LIKE :day7;"
)
_UPDATE_LAVALINK_TABLE = """UPDATE lavalink
SET last_fetched=:last_fetched
WHERE query=:query;"""
_PARSER = {
"youtube": {
"insert": _INSERT_YOUTUBE_TABLE,
"youtube_url": {"query": _QUERY_YOUTUBE_TABLE},
"update": _UPDATE_YOUTUBE_TABLE,
},
"spotify": {
"insert": _INSERT_SPOTIFY_TABLE,
"track_info": {"query": _QUERY_SPOTIFY_TABLE},
"update": _UPDATE_SPOTIFY_TABLE,
},
"lavalink": {
"insert": _INSERT_LAVALINK_TABLE,
"data": {"query": _QUERY_LAVALINK_TABLE, "played": _QUERY_LAST_FETCHED_LAVALINK_TABLE},
"update": _UPDATE_LAVALINK_TABLE,
},
}
_TOP_100_GLOBALS = "https://www.youtube.com/playlist?list=PL4fGSI1pDJn6puJdseH2Rt9sMvt9E2M4i"
_TOP_100_US = "https://www.youtube.com/playlist?list=PL4fGSI1pDJn5rWitrRWFKdm-ulaFiIyoK"
if TYPE_CHECKING:
_database: CacheInterface
_bot: Red
_config: Config
else:
_database = None
_bot = None
_config = None
def _pass_config_to_apis(config: Config, bot: Red):
global _database, _config, _bot
if _config is None:
_config = config
if _bot is None:
_bot = bot
if _database is None:
_database = CacheInterface()
class SpotifyAPI:
"""Wrapper for the Spotify API."""
@ -162,17 +56,19 @@ class SpotifyAPI:
def __init__(self, bot: Red, session: aiohttp.ClientSession):
self.bot = bot
self.session = session
self.spotify_token = None
self.spotify_token: Optional[MutableMapping[str, Union[str, int]]] = None
self.client_id = None
self.client_secret = None
@staticmethod
async def _check_token(token: dict):
async def _check_token(token: MutableMapping):
now = int(time.time())
return token["expires_at"] - now < 60
@staticmethod
def _make_token_auth(client_id: Optional[str], client_secret: Optional[str]) -> dict:
def _make_token_auth(
client_id: Optional[str], client_secret: Optional[str]
) -> MutableMapping[str, Union[str, int]]:
if client_id is None:
client_id = ""
if client_secret is None:
@ -181,7 +77,9 @@ class SpotifyAPI:
auth_header = base64.b64encode((client_id + ":" + client_secret).encode("ascii"))
return {"Authorization": "Basic %s" % auth_header.decode("ascii")}
async def _make_get(self, url: str, headers: dict = None, params: dict = None) -> dict:
async def _make_get(
self, url: str, headers: MutableMapping = None, params: MutableMapping = None
) -> MutableMapping[str, str]:
if params is None:
params = {}
async with self.session.request("GET", url, params=params, headers=headers) as r:
@ -193,13 +91,12 @@ class SpotifyAPI:
)
return await r.json()
async def _get_auth(self):
if self.client_id is None or self.client_secret is None:
tokens = await self.bot.get_shared_api_tokens("spotify")
self.client_id = tokens.get("client_id", "")
self.client_secret = tokens.get("client_secret", "")
async def _get_auth(self) -> NoReturn:
tokens = await self.bot.get_shared_api_tokens("spotify")
self.client_id = tokens.get("client_id", "")
self.client_secret = tokens.get("client_secret", "")
async def _request_token(self) -> dict:
async def _request_token(self) -> MutableMapping[str, Union[str, int]]:
await self._get_auth()
payload = {"grant_type": "client_credentials"}
@ -223,7 +120,9 @@ class SpotifyAPI:
log.debug("Created a new access token for Spotify: {0}".format(token))
return self.spotify_token["access_token"]
async def post_call(self, url: str, payload: dict, headers: dict = None) -> dict:
async def post_call(
self, url: str, payload: MutableMapping, headers: MutableMapping = None
) -> MutableMapping[str, Union[str, int]]:
async with self.session.post(url, data=payload, headers=headers) as r:
if r.status != 200:
log.debug(
@ -233,13 +132,15 @@ class SpotifyAPI:
)
return await r.json()
async def get_call(self, url: str, params: dict) -> dict:
async def get_call(
self, url: str, params: MutableMapping
) -> MutableMapping[str, Union[str, int]]:
token = await self._get_spotify_token()
return await self._make_get(
url, params=params, headers={"Authorization": "Bearer {0}".format(token)}
)
async def get_categories(self) -> List[Dict[str, str]]:
async def get_categories(self) -> List[MutableMapping]:
url = "https://api.spotify.com/v1/browse/categories"
params = {}
result = await self.get_call(url, params=params)
@ -278,10 +179,9 @@ class YouTubeAPI:
self.session = session
self.api_key = None
async def _get_api_key(self,) -> Optional[str]:
if self.api_key is None:
tokens = await self.bot.get_shared_api_tokens("youtube")
self.api_key = tokens.get("api_key", "")
async def _get_api_key(self,) -> str:
tokens = await self.bot.get_shared_api_tokens("youtube")
self.api_key = tokens.get("api_key", "")
return self.api_key
async def get_call(self, query: str) -> Optional[str]:
@ -310,122 +210,39 @@ class YouTubeAPI:
@cog_i18n(_)
class MusicCache:
"""
Handles music queries to the Spotify and Youtube Data API.
"""Handles music queries to the Spotify and Youtube Data API.
Always tries the Cache first.
"""
def __init__(self, bot: Red, session: aiohttp.ClientSession, path: str):
def __init__(self, bot: Red, session: aiohttp.ClientSession):
self.bot = bot
self.spotify_api: SpotifyAPI = SpotifyAPI(bot, session)
self.youtube_api: YouTubeAPI = YouTubeAPI(bot, session)
self._session: aiohttp.ClientSession = session
if HAS_SQL:
self.database: Database = Database(
f'sqlite:///{os.path.abspath(str(os.path.join(path, "cache.db")))}'
)
else:
self.database = None
self.database = _database
self._tasks: dict = {}
self._tasks: MutableMapping = {}
self._lock: asyncio.Lock = asyncio.Lock()
self.config: Optional[Config] = None
async def initialize(self, config: Config):
if HAS_SQL:
await self.database.connect()
await self.database.execute(query="PRAGMA temp_store = 2;")
await self.database.execute(query="PRAGMA journal_mode = wal;")
await self.database.execute(query="PRAGMA wal_autocheckpoint;")
await self.database.execute(query="PRAGMA read_uncommitted = 1;")
await self.database.execute(query=_CREATE_LAVALINK_TABLE)
await self.database.execute(query=_CREATE_UNIQUE_INDEX_LAVALINK_TABLE)
await self.database.execute(query=_CREATE_YOUTUBE_TABLE)
await self.database.execute(query=_CREATE_UNIQUE_INDEX_YOUTUBE_TABLE)
await self.database.execute(query=_CREATE_SPOTIFY_TABLE)
await self.database.execute(query=_CREATE_UNIQUE_INDEX_SPOTIFY_TABLE)
self.config = config
async def close(self):
if HAS_SQL:
await self.database.execute(query="PRAGMA optimize;")
await self.database.disconnect()
async def insert(self, table: str, values: List[dict]):
# if table == "spotify":
# return
if HAS_SQL:
query = _PARSER.get(table, {}).get("insert")
if query is None:
raise InvalidTableError(f"{table} is not a valid table in the database.")
await self.database.execute_many(query=query, values=values)
async def update(self, table: str, values: Dict[str, str]):
# if table == "spotify":
# return
if HAS_SQL:
table = _PARSER.get(table, {})
sql_query = table.get("update")
time_now = str(datetime.datetime.now(datetime.timezone.utc))
values["last_fetched"] = time_now
if not table:
raise InvalidTableError(f"{table} is not a valid table in the database.")
await self.database.fetch_one(query=sql_query, values=values)
async def fetch_one(
self, table: str, query: str, values: Dict[str, str]
) -> Tuple[Optional[str], bool]:
table = _PARSER.get(table, {})
sql_query = table.get(query, {}).get("query")
if HAS_SQL:
if not table:
raise InvalidTableError(f"{table} is not a valid table in the database.")
row = await self.database.fetch_one(query=sql_query, values=values)
last_updated = getattr(row, "last_updated", None)
need_update = True
with contextlib.suppress(TypeError):
if last_updated:
last_update = datetime.datetime.fromisoformat(
last_updated
) + datetime.timedelta(days=await self.config.cache_age())
last_update.replace(tzinfo=datetime.timezone.utc)
need_update = last_update < datetime.datetime.now(datetime.timezone.utc)
return getattr(row, query, None), need_update if table != "spotify" else True
else:
return None, True
# TODO: Create a task to remove entries
# from DB that haven't been fetched in x days ... customizable by Owner
async def fetch_all(self, table: str, query: str, values: Dict[str, str]) -> List[Mapping]:
if HAS_SQL:
table = _PARSER.get(table, {})
sql_query = table.get(query, {}).get("played")
if not table:
raise InvalidTableError(f"{table} is not a valid table in the database.")
return await self.database.fetch_all(query=sql_query, values=values)
return []
await _database.init()
@staticmethod
def _spotify_format_call(qtype: str, key: str) -> Tuple[str, dict]:
def _spotify_format_call(qtype: str, key: str) -> Tuple[str, MutableMapping]:
params = {}
if qtype == "album":
query = "https://api.spotify.com/v1/albums/{0}/tracks".format(key)
query = f"https://api.spotify.com/v1/albums/{key}/tracks"
elif qtype == "track":
query = "https://api.spotify.com/v1/tracks/{0}".format(key)
query = f"https://api.spotify.com/v1/tracks/{key}"
else:
query = "https://api.spotify.com/v1/playlists/{0}/tracks".format(key)
query = f"https://api.spotify.com/v1/playlists/{key}/tracks"
return query, params
@staticmethod
def _get_spotify_track_info(track_data: dict) -> Tuple[str, ...]:
def _get_spotify_track_info(track_data: MutableMapping) -> Tuple[str, ...]:
artist_name = track_data["artists"][0]["name"]
track_name = track_data["name"]
track_info = f"{track_name} {artist_name}"
@ -451,7 +268,7 @@ class MusicCache:
total_tracks = len(tracks)
database_entries = []
track_count = 0
time_now = str(datetime.datetime.now(datetime.timezone.utc))
time_now = int(datetime.datetime.now(datetime.timezone.utc).timestamp())
youtube_cache = CacheLevel.set_youtube().is_subset(current_cache_level)
for track in tracks:
if track.get("error", {}).get("message") == "invalid id":
@ -484,7 +301,7 @@ class MusicCache:
if youtube_cache:
update = True
with contextlib.suppress(SQLError):
val, update = await self.fetch_one(
(val, update) = await self.database.fetch_one(
"youtube", "youtube_url", {"track": track_info}
)
if update:
@ -517,7 +334,7 @@ class MusicCache:
) -> str:
track_url = await self.youtube_api.get_call(track_info)
if CacheLevel.set_youtube().is_subset(current_cache_level) and track_url:
time_now = str(datetime.datetime.now(datetime.timezone.utc))
time_now = int(datetime.datetime.now(datetime.timezone.utc).timestamp())
task = (
"insert",
(
@ -540,12 +357,12 @@ class MusicCache:
query_type: str,
uri: str,
recursive: Union[str, bool] = False,
params=None,
params: MutableMapping = None,
notifier: Optional[Notifier] = None,
) -> Union[dict, List[str]]:
) -> Union[MutableMapping, List[str]]:
if recursive is False:
call, params = self._spotify_format_call(query_type, uri)
(call, params) = self._spotify_format_call(query_type, uri)
results = await self.spotify_api.get_call(call, params)
else:
results = await self.spotify_api.get_call(recursive, params)
@ -608,8 +425,7 @@ class MusicCache:
skip_youtube: bool = False,
notifier: Optional[Notifier] = None,
) -> List[str]:
"""
Queries the Database then falls back to Spotify and YouTube APIs.
"""Queries the Database then falls back to Spotify and YouTube APIs.
Parameters
----------
@ -628,14 +444,12 @@ class MusicCache:
List[str]
List of Youtube URLs.
"""
current_cache_level = (
CacheLevel(await self.config.cache_level()) if HAS_SQL else CacheLevel.none()
)
current_cache_level = CacheLevel(await self.config.cache_level())
cache_enabled = CacheLevel.set_spotify().is_subset(current_cache_level)
if query_type == "track" and cache_enabled:
update = True
with contextlib.suppress(SQLError):
val, update = await self.fetch_one(
(val, update) = await self.database.fetch_one(
"spotify", "track_info", {"uri": f"spotify:track:{uri}"}
)
if update:
@ -673,9 +487,7 @@ class MusicCache:
track_list = []
has_not_allowed = False
try:
current_cache_level = (
CacheLevel(await self.config.cache_level()) if HAS_SQL else CacheLevel.none()
)
current_cache_level = CacheLevel(await self.config.cache_level())
guild_data = await self.config.guild(ctx.guild).all()
# now = int(time.time())
@ -698,7 +510,7 @@ class MusicCache:
return track_list
database_entries = []
time_now = str(datetime.datetime.now(datetime.timezone.utc))
time_now = int(datetime.datetime.now(datetime.timezone.utc).timestamp())
youtube_cache = CacheLevel.set_youtube().is_subset(current_cache_level)
spotify_cache = CacheLevel.set_spotify().is_subset(current_cache_level)
@ -730,7 +542,7 @@ class MusicCache:
if youtube_cache:
update = True
with contextlib.suppress(SQLError):
val, update = await self.fetch_one(
(val, update) = await self.database.fetch_one(
"youtube", "youtube_url", {"track": track_info}
)
if update:
@ -745,7 +557,7 @@ class MusicCache:
if val:
try:
result, called_api = await self.lavalink_query(
(result, called_api) = await self.lavalink_query(
ctx, player, audio_dataclasses.Query.process_input(val)
)
except (RuntimeError, aiohttp.ServerDisconnectedError):
@ -760,7 +572,7 @@ class MusicCache:
lock(ctx, False)
error_embed = discord.Embed(
colour=await ctx.embed_colour(),
title=_("Player timedout, skipping remaning tracks."),
title=_("Player timeout, skipping remaining tracks."),
)
await notifier.update_embed(error_embed)
break
@ -771,16 +583,6 @@ class MusicCache:
key = "lavalink"
seconds = "???"
second_key = None
# if track_count == 2:
# five_time = int(time.time()) - now
# if track_count >= 2:
# remain_tracks = total_tracks - track_count
# time_remain = (remain_tracks / 2) * five_time
# if track_count < total_tracks:
# seconds = dynamic_time(int(time_remain))
# if track_count == total_tracks:
# seconds = "0s"
# second_key = "lavalink_time"
await notifier.notify_user(
current=track_count,
total=total_tracks,
@ -837,16 +639,14 @@ class MusicCache:
await player.play()
if len(track_list) == 0:
if not has_not_allowed:
embed3 = discord.Embed(
colour=await ctx.embed_colour(),
title=_(
raise SpotifyFetchError(
message=_(
"Nothing found.\nThe YouTube API key may be invalid "
"or you may be rate limited on YouTube's search service.\n"
"Check the YouTube API key again and follow the instructions "
"at `{prefix}audioset youtubeapi`."
).format(prefix=ctx.prefix),
).format(prefix=ctx.prefix)
)
await ctx.send(embed=embed3)
player.maybe_shuffle()
if enqueue and tracks_from_spotify:
if total_tracks > enqueued_tracks:
@ -885,15 +685,15 @@ class MusicCache:
return track_list
async def youtube_query(self, ctx: commands.Context, track_info: str) -> str:
current_cache_level = (
CacheLevel(await self.config.cache_level()) if HAS_SQL else CacheLevel.none()
)
current_cache_level = CacheLevel(await self.config.cache_level())
cache_enabled = CacheLevel.set_youtube().is_subset(current_cache_level)
val = None
if cache_enabled:
update = True
with contextlib.suppress(SQLError):
val, update = await self.fetch_one("youtube", "youtube_url", {"track": track_info})
(val, update) = await self.database.fetch_one(
"youtube", "youtube_url", {"track": track_info}
)
if update:
val = None
if val is None:
@ -914,10 +714,8 @@ class MusicCache:
query: audio_dataclasses.Query,
forced: bool = False,
) -> Tuple[LoadResult, bool]:
"""
A replacement for :code:`lavalink.Player.load_tracks`.
This will try to get a valid cached entry first if not found or if in valid
it will then call the lavalink API.
"""A replacement for :code:`lavalink.Player.load_tracks`. This will try to get a valid
cached entry first if not found or if in valid it will then call the lavalink API.
Parameters
----------
@ -934,9 +732,7 @@ class MusicCache:
Tuple[lavalink.LoadResult, bool]
Tuple with the Load result and whether or not the API was called.
"""
current_cache_level = (
CacheLevel(await self.config.cache_level()) if HAS_SQL else CacheLevel.none()
)
current_cache_level = CacheLevel(await self.config.cache_level())
cache_enabled = CacheLevel.set_lavalink().is_subset(current_cache_level)
val = None
_raw_query = audio_dataclasses.Query.process_input(query)
@ -944,14 +740,15 @@ class MusicCache:
if cache_enabled and not forced and not _raw_query.is_local:
update = True
with contextlib.suppress(SQLError):
val, update = await self.fetch_one("lavalink", "data", {"query": query})
(val, update) = await self.database.fetch_one("lavalink", "data", {"query": query})
if update:
val = None
if val:
if val and not isinstance(val, str):
log.debug(f"Querying Local Database for {query}")
task = ("update", ("lavalink", {"query": query}))
self.append_task(ctx, *task)
if val and not forced:
data = json.loads(val)
data = val
data["query"] = query
results = LoadResult(data)
called_api = False
@ -965,6 +762,8 @@ class MusicCache:
results = await player.load_tracks(query)
except KeyError:
results = None
except RuntimeError:
raise TrackEnqueueError
if results is None:
results = LoadResult({"loadType": "LOAD_FAILED", "playlistInfo": {}, "tracks": []})
if (
@ -975,7 +774,7 @@ class MusicCache:
and results.tracks
):
with contextlib.suppress(SQLError):
time_now = str(datetime.datetime.now(datetime.timezone.utc))
time_now = int(datetime.datetime.now(datetime.timezone.utc).timestamp())
task = (
"insert",
(
@ -1003,10 +802,12 @@ class MusicCache:
tasks = self._tasks[ctx.message.id]
del self._tasks[ctx.message.id]
await asyncio.gather(
*[self.insert(*a) for a in tasks["insert"]], return_exceptions=True
*[self.database.insert(*a) for a in tasks["insert"]],
return_exceptions=True,
)
await asyncio.gather(
*[self.update(*a) for a in tasks["update"]], return_exceptions=True
*[self.database.update(*a) for a in tasks["update"]],
return_exceptions=True,
)
log.debug(f"Completed database writes for {lock_id} " f"({lock_author})")
@ -1015,16 +816,16 @@ class MusicCache:
log.debug("Running pending writes to database")
with contextlib.suppress(Exception):
tasks = {"update": [], "insert": []}
for k, task in self._tasks.items():
for (k, task) in self._tasks.items():
for t, args in task.items():
tasks[t].append(args)
self._tasks = {}
await asyncio.gather(
*[self.insert(*a) for a in tasks["insert"]], return_exceptions=True
*[self.database.insert(*a) for a in tasks["insert"]], return_exceptions=True
)
await asyncio.gather(
*[self.update(*a) for a in tasks["update"]], return_exceptions=True
*[self.database.update(*a) for a in tasks["update"]], return_exceptions=True
)
log.debug("Completed pending writes to database have finished")
@ -1034,29 +835,26 @@ class MusicCache:
self._tasks[lock_id] = {"update": [], "insert": []}
self._tasks[lock_id][event].append(task)
async def play_random(self):
async def get_random_from_db(self):
tracks = []
try:
query_data = {}
for i in range(1, 8):
date = (
"%"
+ str(
(
datetime.datetime.now(datetime.timezone.utc)
- datetime.timedelta(days=i)
).date()
)
+ "%"
)
query_data[f"day{i}"] = date
date = datetime.datetime.now(datetime.timezone.utc) - datetime.timedelta(days=7)
date = int(date.timestamp())
query_data["day"] = date
max_age = await self.config.cache_age()
maxage = datetime.datetime.now(tz=datetime.timezone.utc) - datetime.timedelta(
days=max_age
)
maxage_int = int(time.mktime(maxage.timetuple()))
query_data["maxage"] = maxage_int
vals = await self.fetch_all("lavalink", "data", query_data)
recently_played = [r.data for r in vals if r]
vals = await self.database.fetch_all("lavalink", "data", query_data)
recently_played = [r.tracks for r in vals if r]
if recently_played:
track = random.choice(recently_played)
results = LoadResult(json.loads(track))
results = LoadResult(track)
tracks = list(results.tracks)
except Exception:
tracks = []
@ -1065,9 +863,7 @@ class MusicCache:
async def autoplay(self, player: lavalink.Player):
autoplaylist = await self.config.guild(player.channel.guild).autoplaylist()
current_cache_level = (
CacheLevel(await self.config.cache_level()) if HAS_SQL else CacheLevel.none()
)
current_cache_level = CacheLevel(await self.config.cache_level())
cache_enabled = CacheLevel.set_lavalink().is_subset(current_cache_level)
playlist = None
tracks = None
@ -1084,10 +880,10 @@ class MusicCache:
if not tracks or not getattr(playlist, "tracks", None):
if cache_enabled:
tracks = await self.play_random()
tracks = await self.get_random_from_db()
if not tracks:
ctx = namedtuple("Context", "message")
results, called_api = await self.lavalink_query(
(results, called_api) = await self.lavalink_query(
ctx(player.channel.guild),
player,
audio_dataclasses.Query.process_input(_TOP_100_US),
@ -1124,7 +920,7 @@ class MusicCache:
continue
valid = True
track.extras = {"autoplay": True}
track.extras["autoplay"] = True
player.add(player.channel.guild.me, track)
self.bot.dispatch(
"red_audio_track_auto_play", player.channel.guild, track, player.channel.guild.me

File diff suppressed because it is too large.


@ -1,7 +1,9 @@
import ntpath
import os
import posixpath
import re
from pathlib import Path, PosixPath, WindowsPath
from typing import List, Optional, Union
from typing import List, Optional, Union, MutableMapping
from urllib.parse import urlparse
import lavalink
@ -14,13 +16,57 @@ _config: Optional[Config] = None
_bot: Optional[Red] = None
_localtrack_folder: Optional[str] = None
_ = Translator("Audio", __file__)
_remove_start = re.compile(r"^(sc|list) ")
_re_youtube_timestamp = re.compile(r"&t=(\d+)s?")
_re_youtube_index = re.compile(r"&index=(\d+)")
_re_spotify_url = re.compile(r"(http[s]?://)?(open.spotify.com)/")
_re_spotify_timestamp = re.compile(r"#(\d+):(\d+)")
_re_soundcloud_timestamp = re.compile(r"#t=(\d+):(\d+)s?")
_re_twitch_timestamp = re.compile(r"\?t=(\d+)h(\d+)m(\d+)s")
_RE_REMOVE_START = re.compile(r"^(sc|list) ")
_RE_YOUTUBE_TIMESTAMP = re.compile(r"&t=(\d+)s?")
_RE_YOUTUBE_INDEX = re.compile(r"&index=(\d+)")
_RE_SPOTIFY_URL = re.compile(r"(http[s]?://)?(open.spotify.com)/")
_RE_SPOTIFY_TIMESTAMP = re.compile(r"#(\d+):(\d+)")
_RE_SOUNDCLOUD_TIMESTAMP = re.compile(r"#t=(\d+):(\d+)s?")
_RE_TWITCH_TIMESTAMP = re.compile(r"\?t=(\d+)h(\d+)m(\d+)s")
_PATH_SEPS = [posixpath.sep, ntpath.sep]
_FULLY_SUPPORTED_MUSIC_EXT = (".mp3", ".flac", ".ogg")
_PARTIALLY_SUPPORTED_MUSIC_EXT = (
".m3u",
".m4a",
".aac",
".ra",
".wav",
".opus",
".wma",
".ts",
".au",
# These do not work
# ".mid",
# ".mka",
# ".amr",
# ".aiff",
# ".ac3",
# ".voc",
# ".dsf",
)
_PARTIALLY_SUPPORTED_VIDEO_EXT = (
".mp4",
".mov",
".flv",
".webm",
".mkv",
".wmv",
".3gp",
".m4v",
".mk3d", # https://github.com/Devoxin/lavaplayer
".mka", # https://github.com/Devoxin/lavaplayer
".mks", # https://github.com/Devoxin/lavaplayer
# These do not work
# ".vob",
# ".mts",
# ".avi",
# ".mpg",
# ".mpeg",
# ".swf",
)
_PARTIALLY_SUPPORTED_MUSIC_EXT += _PARTIALLY_SUPPORTED_VIDEO_EXT
def _pass_config_to_dataclasses(config: Config, bot: Red, folder: str):
@ -32,36 +78,14 @@ def _pass_config_to_dataclasses(config: Config, bot: Red, folder: str):
_localtrack_folder = folder
class ChdirClean(object):
def __init__(self, directory):
self.old_dir = os.getcwd()
self.new_dir = directory
self.cwd = None
class LocalPath:
"""Local tracks class.
def __enter__(self):
return self
def __exit__(self, _type, value, traceback):
self.chdir_out()
return isinstance(value, OSError)
def chdir_in(self):
self.cwd = Path(self.new_dir)
os.chdir(self.new_dir)
def chdir_out(self):
self.cwd = Path(self.old_dir)
os.chdir(self.old_dir)
class LocalPath(ChdirClean):
"""
Local tracks class.
Used to handle system dir trees in a cross system manner.
The only use of this class is for `localtracks`.
Used to handle system dir trees in a cross system manner. The only use of this class is for
`localtracks`.
"""
_supported_music_ext = (".mp3", ".flac", ".ogg")
_all_music_ext = _FULLY_SUPPORTED_MUSIC_EXT + _PARTIALLY_SUPPORTED_MUSIC_EXT
def __init__(self, path, **kwargs):
self._path = path
@ -89,10 +113,11 @@ class LocalPath(ChdirClean):
_path.relative_to(self.localtrack_folder)
self.path = _path
except (ValueError, TypeError):
if path and path.startswith("localtracks//"):
path = path.replace("localtracks//", "", 1)
elif path and path.startswith("localtracks/"):
path = path.replace("localtracks/", "", 1)
for sep in _PATH_SEPS:
if path and path.startswith(f"localtracks{sep}{sep}"):
path = path.replace(f"localtracks{sep}{sep}", "", 1)
elif path and path.startswith(f"localtracks{sep}"):
path = path.replace(f"localtracks{sep}", "", 1)
self.path = self.localtrack_folder.joinpath(path) if path else self.localtrack_folder
try:
@ -100,18 +125,18 @@ class LocalPath(ChdirClean):
parent = self.path.parent
else:
parent = self.path
super().__init__(str(parent.absolute()))
self.parent = Path(parent)
except OSError:
self.parent = None
self.cwd = Path.cwd()
@property
def name(self):
return str(self.path.name)
@property
def suffix(self):
return str(self.path.suffix)
def is_dir(self):
try:
return self.path.is_dir()
@ -159,11 +184,11 @@ class LocalPath(ChdirClean):
def _filtered(self, paths: List[Path]):
for p in paths:
if p.suffix in self._supported_music_ext:
if p.suffix in self._all_music_ext:
yield p
def __str__(self):
return str(self.path.absolute())
return self.to_string()
def to_string(self):
try:
@ -171,7 +196,7 @@ class LocalPath(ChdirClean):
except OSError:
return str(self._path)
def to_string_hidden(self, arg: str = None):
def to_string_user(self, arg: str = None):
string = str(self.absolute()).replace(
(str(self.localtrack_folder.absolute()) + os.sep) if arg is None else arg, ""
)
@ -186,13 +211,13 @@ class LocalPath(ChdirClean):
def tracks_in_tree(self):
tracks = []
for track in self.multirglob(*[f"*{ext}" for ext in self._supported_music_ext]):
for track in self.multirglob(*[f"*{ext}" for ext in self._all_music_ext]):
if track.exists() and track.is_file() and track.parent != self.localtrack_folder:
tracks.append(Query.process_input(LocalPath(str(track.absolute()))))
return tracks
return sorted(tracks, key=lambda x: x.to_string_user().lower())
def subfolders_in_tree(self):
files = list(self.multirglob(*[f"*{ext}" for ext in self._supported_music_ext]))
files = list(self.multirglob(*[f"*{ext}" for ext in self._all_music_ext]))
folders = []
for f in files:
if f.exists() and f.parent not in folders and f.parent != self.localtrack_folder:
@ -201,17 +226,17 @@ class LocalPath(ChdirClean):
for folder in folders:
if folder.exists() and folder.is_dir():
return_folders.append(LocalPath(str(folder.absolute())))
return return_folders
return sorted(return_folders, key=lambda x: x.to_string_user().lower())
def tracks_in_folder(self):
tracks = []
for track in self.multiglob(*[f"*{ext}" for ext in self._supported_music_ext]):
for track in self.multiglob(*[f"*{ext}" for ext in self._all_music_ext]):
if track.exists() and track.is_file() and track.parent != self.localtrack_folder:
tracks.append(Query.process_input(LocalPath(str(track.absolute()))))
return tracks
return sorted(tracks, key=lambda x: x.to_string_user().lower())
def subfolders(self):
files = list(self.multiglob(*[f"*{ext}" for ext in self._supported_music_ext]))
files = list(self.multiglob(*[f"*{ext}" for ext in self._all_music_ext]))
folders = []
for f in files:
if f.exists() and f.parent not in folders and f.parent != self.localtrack_folder:
@ -220,12 +245,44 @@ class LocalPath(ChdirClean):
for folder in folders:
if folder.exists() and folder.is_dir():
return_folders.append(LocalPath(str(folder.absolute())))
return return_folders
return sorted(return_folders, key=lambda x: x.to_string_user().lower())
def __eq__(self, other):
if not isinstance(other, LocalPath):
return NotImplemented
return self.path._cparts == other.path._cparts
def __hash__(self):
try:
return self._hash
except AttributeError:
self._hash = hash(tuple(self.path._cparts))
return self._hash
def __lt__(self, other):
if not isinstance(other, LocalPath):
return NotImplemented
return self.path._cparts < other.path._cparts
def __le__(self, other):
if not isinstance(other, LocalPath):
return NotImplemented
return self.path._cparts <= other.path._cparts
def __gt__(self, other):
if not isinstance(other, LocalPath):
return NotImplemented
return self.path._cparts > other.path._cparts
def __ge__(self, other):
if not isinstance(other, LocalPath):
return NotImplemented
return self.path._cparts >= other.path._cparts
class Query:
"""
Query data class.
"""Query data class.
Use: Query.process_input(query) to generate the Query object.
"""
@ -259,6 +316,8 @@ class Query:
self.local_name: Optional[str] = kwargs.get("name", None)
self.search_subfolders: bool = kwargs.get("search_subfolders", False)
self.spotify_uri: Optional[str] = kwargs.get("uri", None)
self.uri: Optional[str] = kwargs.get("url", None)
self.is_url: bool = kwargs.get("is_url", False)
self.start_time: int = kwargs.get("start_time", 0)
self.track_index: Optional[int] = kwargs.get("track_index", None)
@ -271,16 +330,38 @@ class Query:
if self.is_playlist or self.is_album:
self.single_track = False
self._hash = hash(
(
self.valid,
self.is_local,
self.is_spotify,
self.is_youtube,
self.is_soundcloud,
self.is_bandcamp,
self.is_vimeo,
self.is_mixer,
self.is_twitch,
self.is_other,
self.is_playlist,
self.is_album,
self.is_search,
self.is_stream,
self.single_track,
self.id,
self.spotify_uri,
self.start_time,
self.track_index,
self.uri,
)
)
def __str__(self):
return str(self.lavalink_query)
@classmethod
def process_input(cls, query: Union[LocalPath, lavalink.Track, "Query", str], **kwargs):
"""
A replacement for :code:`lavalink.Player.load_tracks`.
This will try to get a valid cached entry first if not found or if in valid
it will then call the lavalink API.
"""A replacement for :code:`lavalink.Player.load_tracks`. This will try to get a valid
cached entry first if not found or if in valid it will then call the lavalink API.
Parameters
----------
@ -293,7 +374,7 @@ class Query:
"""
if not query:
query = "InvalidQueryPlaceHolderName"
possible_values = dict()
possible_values = {}
if isinstance(query, str):
query = query.strip("<>")
@ -311,7 +392,7 @@ class Query:
return cls(query, **possible_values)
@staticmethod
def _parse(track, **kwargs):
def _parse(track, **kwargs) -> MutableMapping:
returning = {}
if (
type(track) == type(LocalPath)
@ -338,7 +419,7 @@ class Query:
_id = _id.split("?")[0]
returning["id"] = _id
if "#" in _id:
match = re.search(_re_spotify_timestamp, track)
match = re.search(_RE_SPOTIFY_TIMESTAMP, track)
if match:
returning["start_time"] = (int(match.group(1)) * 60) + int(match.group(2))
returning["uri"] = track
@ -349,7 +430,7 @@ class Query:
returning["soundcloud"] = True
elif track.startswith("list "):
returning["invoked_from"] = "search list"
track = _remove_start.sub("", track, 1)
track = _RE_REMOVE_START.sub("", track, 1)
returning["queryforced"] = track
_localtrack = LocalPath(track)
@ -367,6 +448,8 @@ class Query:
try:
query_url = urlparse(track)
if all([query_url.scheme, query_url.netloc, query_url.path]):
returning["url"] = track
returning["is_url"] = True
url_domain = ".".join(query_url.netloc.split(".")[-2:])
if not query_url.netloc:
url_domain = ".".join(query_url.path.split("/")[0].split(".")[-2:])
@ -374,11 +457,11 @@ class Query:
returning["youtube"] = True
_has_index = "&index=" in track
if "&t=" in track:
match = re.search(_re_youtube_timestamp, track)
match = re.search(_RE_YOUTUBE_TIMESTAMP, track)
if match:
returning["start_time"] = int(match.group(1))
if _has_index:
match = re.search(_re_youtube_index, track)
match = re.search(_RE_YOUTUBE_INDEX, track)
if match:
returning["track_index"] = int(match.group(1)) - 1
if all(k in track for k in ["&list=", "watch?"]):
@ -402,7 +485,7 @@ class Query:
returning["album"] = True
elif "/track/" in track:
returning["single"] = True
val = re.sub(_re_spotify_url, "", track).replace("/", ":")
val = re.sub(_RE_SPOTIFY_URL, "", track).replace("/", ":")
if "user:" in val:
val = val.split(":", 2)[-1]
_id = val.split(":", 1)[-1]
@ -410,7 +493,7 @@ class Query:
if "#" in _id:
_id = _id.split("#")[0]
match = re.search(_re_spotify_timestamp, track)
match = re.search(_RE_SPOTIFY_TIMESTAMP, track)
if match:
returning["start_time"] = (int(match.group(1)) * 60) + int(
match.group(2)
@ -421,7 +504,7 @@ class Query:
elif url_domain == "soundcloud.com":
returning["soundcloud"] = True
if "#t=" in track:
match = re.search(_re_soundcloud_timestamp, track)
match = re.search(_RE_SOUNDCLOUD_TIMESTAMP, track)
if match:
returning["start_time"] = (int(match.group(1)) * 60) + int(
match.group(2)
@ -446,7 +529,7 @@ class Query:
elif url_domain == "twitch.tv":
returning["twitch"] = True
if "?t=" in track:
match = re.search(_re_twitch_timestamp, track)
match = re.search(_RE_TWITCH_TIMESTAMP, track)
if match:
returning["start_time"] = (
(int(match.group(1)) * 60 * 60)
@ -485,5 +568,66 @@ class Query:
def to_string_user(self):
if self.is_local:
return str(self.track.to_string_hidden())
return str(self.track.to_string_user())
return str(self._raw)
@property
def suffix(self):
if self.is_local:
return self.track.suffix
return None
def __eq__(self, other):
if not isinstance(other, Query):
return NotImplemented
return self.to_string_user() == other.to_string_user()
def __hash__(self):
try:
return self._hash
except AttributeError:
self._hash = hash(
(
self.valid,
self.is_local,
self.is_spotify,
self.is_youtube,
self.is_soundcloud,
self.is_bandcamp,
self.is_vimeo,
self.is_mixer,
self.is_twitch,
self.is_other,
self.is_playlist,
self.is_album,
self.is_search,
self.is_stream,
self.single_track,
self.id,
self.spotify_uri,
self.start_time,
self.track_index,
self.uri,
)
)
return self._hash
def __lt__(self, other):
if not isinstance(other, Query):
return NotImplemented
return self.to_string_user() < other.to_string_user()
def __le__(self, other):
if not isinstance(other, Query):
return NotImplemented
return self.to_string_user() <= other.to_string_user()
def __gt__(self, other):
if not isinstance(other, Query):
return NotImplemented
return self.to_string_user() > other.to_string_user()
def __ge__(self, other):
if not isinstance(other, Query):
return NotImplemented
return self.to_string_user() >= other.to_string_user()
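Equality and ordering above all key off to_string_user(), which is what lets callers deduplicate and sort Query objects. A minimal sketch of the pattern (with a simplified __hash__), using an illustrative FakeQuery stand-in rather than the real Query, which needs the cog's config wired in:

class FakeQuery:
    """Illustrative stand-in: equality, ordering and hashing share one canonical string."""

    def __init__(self, raw: str):
        self._raw = raw

    def to_string_user(self) -> str:
        return self._raw

    def __eq__(self, other):
        if not isinstance(other, FakeQuery):
            return NotImplemented
        return self.to_string_user() == other.to_string_user()

    def __hash__(self):
        # Simplified: hashing the same key as __eq__ keeps the two consistent.
        return hash(self.to_string_user())

    def __lt__(self, other):
        if not isinstance(other, FakeQuery):
            return NotImplemented
        return self.to_string_user() < other.to_string_user()


queries = [FakeQuery("b.mp3"), FakeQuery("a.mp3"), FakeQuery("a.mp3")]
# The set drops the duplicate via __eq__/__hash__, sorted() uses __lt__.
print([q.to_string_user() for q in sorted(set(queries))])  # ['a.mp3', 'b.mp3']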

View File

@ -1,8 +1,11 @@
from typing import TYPE_CHECKING
from redbot.core import Config, commands
from .apis import HAS_SQL
_config = None
if TYPE_CHECKING:
_config: Config
else:
_config = None
def _pass_config_to_checks(config: Config):
@ -26,12 +29,3 @@ def roomlocked():
return False
return commands.check(predicate)
def can_have_caching():
"""Check to disable Caching commands if SQLite is not avaliable."""
async def predicate(ctx: commands.Context):
return HAS_SQL
return commands.check(predicate)
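For context, commands.check predicates like roomlocked() and can_have_caching() are meant to be stacked on commands as decorators. A hedged sketch of that usage; the CachingDemo cog and cachestats command are illustrative, and the predicate below is a stand-in that replaces the real HAS_SQL flag:

from redbot.core import commands


def can_have_caching():
    """Disable caching commands when SQL support is unavailable."""

    async def predicate(ctx: commands.Context) -> bool:
        return True  # stand-in for the real HAS_SQL flag imported from .apis

    return commands.check(predicate)


class CachingDemo(commands.Cog):
    @can_have_caching()
    @commands.command()
    async def cachestats(self, ctx: commands.Context):
        # Only reachable when the predicate above returns True.
        await ctx.send("Caching backend is available.")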

View File

@ -0,0 +1,18 @@
from redbot.core import Config
from redbot.core.bot import Red
from .apis import _pass_config_to_apis
from .audio_dataclasses import _pass_config_to_dataclasses
from .converters import _pass_config_to_converters
from .databases import _pass_config_to_databases
from .playlists import _pass_config_to_playlist
from .utils import _pass_config_to_utils
def pass_config_to_dependencies(config: Config, bot: Red, localtracks_folder: str):
_pass_config_to_databases(config, bot)
_pass_config_to_utils(config, bot)
_pass_config_to_dataclasses(config, bot, localtracks_folder)
_pass_config_to_apis(config, bot)
_pass_config_to_playlist(config, bot)
_pass_config_to_converters(config, bot)
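A hedged sketch of the expected call site: the Audio cog builds its Config once and fans it out to every submodule through this helper. The cog __init__ shown here, the config identifier, and the import path are assumptions for illustration, not lines from this PR.

from redbot.core import Config, commands
from redbot.core.bot import Red
from redbot.core.data_manager import cog_data_path

from .cog_utils import pass_config_to_dependencies  # module path assumed


class Audio(commands.Cog):
    def __init__(self, bot: Red):
        self.bot = bot
        # Placeholder identifier; the real cog uses its own registered value.
        self.config = Config.get_conf(self, identifier=1234567890, force_registration=True)
        localtracks = str(cog_data_path(raw_name="Audio") / "localtracks")
        # One call wires config/bot into databases, utils, dataclasses, apis,
        # playlists and converters, replacing the scattered per-module calls.
        pass_config_to_dependencies(self.config, self.bot, localtracks)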

View File

@ -1,16 +1,17 @@
import argparse
import functools
import re
from typing import Optional, Tuple, Union
from typing import Optional, Tuple, Union, MutableMapping, TYPE_CHECKING
import discord
from redbot.cogs.audio.errors import TooManyMatches, NoMatchesFound
from redbot.core import Config, commands
from redbot.core.bot import Red
from redbot.core.i18n import Translator
from .playlists import PlaylistScope, standardize_scope
from .errors import NoMatchesFound, TooManyMatches
from .playlists import get_all_playlist_converter, standardize_scope
from .utils import PlaylistScope
_ = Translator("Audio", __file__)
@ -24,8 +25,12 @@ __all__ = [
"get_playlist_converter",
]
_config = None
_bot = None
if TYPE_CHECKING:
_bot: Red
_config: Config
else:
_bot = None
_config = None
_SCOPE_HELP = """
Scope must be a valid version of one of the following:
@ -137,30 +142,18 @@ async def global_unique_user_finder(
class PlaylistConverter(commands.Converter):
async def convert(self, ctx: commands.Context, arg: str) -> dict:
global_scope = await _config.custom(PlaylistScope.GLOBAL.value).all()
guild_scope = await _config.custom(PlaylistScope.GUILD.value).all()
user_scope = await _config.custom(PlaylistScope.USER.value).all()
user_matches = [
(uid, pid, pdata)
for uid, data in user_scope.items()
for pid, pdata in data.items()
if arg == pid or arg.lower() in pdata.get("name", "").lower()
]
guild_matches = [
(gid, pid, pdata)
for gid, data in guild_scope.items()
for pid, pdata in data.items()
if arg == pid or arg.lower() in pdata.get("name", "").lower()
]
global_matches = [
(None, pid, pdata)
for pid, pdata in global_scope.items()
if arg == pid or arg.lower() in pdata.get("name", "").lower()
]
async def convert(self, ctx: commands.Context, arg: str) -> MutableMapping:
global_matches = await get_all_playlist_converter(
PlaylistScope.GLOBAL.value, _bot, arg, guild=ctx.guild, author=ctx.author
)
guild_matches = await get_all_playlist_converter(
PlaylistScope.GUILD.value, _bot, arg, guild=ctx.guild, author=ctx.author
)
user_matches = await get_all_playlist_converter(
PlaylistScope.USER.value, _bot, arg, guild=ctx.guild, author=ctx.author
)
if not user_matches and not guild_matches and not global_matches:
raise commands.BadArgument(_("Could not match '{}' to a playlist.").format(arg))
return {
PlaylistScope.GLOBAL.value: global_matches,
PlaylistScope.GUILD.value: guild_matches,
@ -498,9 +491,7 @@ class LazyGreedyConverter(commands.Converter):
def get_lazy_converter(splitter: str) -> type:
"""
Returns a typechecking safe `LazyGreedyConverter` suitable for use with discord.py.
"""
"""Returns a typechecking safe `LazyGreedyConverter` suitable for use with discord.py."""
class PartialMeta(type(LazyGreedyConverter)):
__call__ = functools.partialmethod(type(LazyGreedyConverter).__call__, splitter)
@ -512,9 +503,7 @@ def get_lazy_converter(splitter: str) -> type:
def get_playlist_converter() -> type:
"""
Returns a typechecking safe `PlaylistConverter` suitable for use with discord.py.
"""
"""Returns a typechecking safe `PlaylistConverter` suitable for use with discord.py."""
class PartialMeta(type(PlaylistConverter)):
__call__ = functools.partialmethod(type(PlaylistConverter).__call__)
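The metaclass partialmethod trick in both factories is what makes these converters usable directly as command annotations even though they need constructor arguments. A hedged sketch with a simplified LazyGreedyConverter; the play command and the "--" splitter are illustrative only:

import functools

from redbot.core import commands


class LazyGreedyConverter(commands.Converter):
    def __init__(self, splitter: str):
        self.splitter = splitter

    async def convert(self, ctx: commands.Context, argument: str) -> str:
        # Simplified: keep everything before the first occurrence of the splitter.
        return argument.split(self.splitter, 1)[0].strip()


def get_lazy_converter(splitter: str) -> type:
    # The metaclass pre-binds `splitter`, so discord.py can instantiate the
    # returned class with no arguments when resolving the annotation.
    class PartialMeta(type(LazyGreedyConverter)):
        __call__ = functools.partialmethod(type(LazyGreedyConverter).__call__, splitter)

    class ValidatedConverter(LazyGreedyConverter, metaclass=PartialMeta):
        pass

    return ValidatedConverter


PlayQuery = get_lazy_converter("--")


class ConverterDemo(commands.Cog):
    @commands.command()
    async def play(self, ctx: commands.Context, *, query: PlayQuery):
        await ctx.send(f"Searching for: {query}")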

View File

@ -0,0 +1,372 @@
import asyncio
import concurrent.futures
import contextlib
import datetime
import json
import logging
import time
from dataclasses import dataclass, field
from typing import Dict, List, Optional, TYPE_CHECKING, Tuple, Union, MutableMapping, Mapping
import apsw
from redbot.core import Config
from redbot.core.bot import Red
from redbot.core.data_manager import cog_data_path
from .errors import InvalidTableError
from .sql_statements import *
from .utils import PlaylistScope
log = logging.getLogger("red.audio.database")
if TYPE_CHECKING:
database_connection: apsw.Connection
_bot: Red
_config: Config
else:
_config = None
_bot = None
database_connection = None
SCHEMA_VERSION = 3
SQLError = apsw.ExecutionCompleteError
_PARSER: Mapping = {
"youtube": {
"insert": YOUTUBE_UPSERT,
"youtube_url": {"query": YOUTUBE_QUERY},
"update": YOUTUBE_UPDATE,
},
"spotify": {
"insert": SPOTIFY_UPSERT,
"track_info": {"query": SPOTIFY_QUERY},
"update": SPOTIFY_UPDATE,
},
"lavalink": {
"insert": LAVALINK_UPSERT,
"data": {"query": LAVALINK_QUERY, "played": LAVALINK_QUERY_LAST_FETCHED_RANDOM},
"update": LAVALINK_UPDATE,
},
}
def _pass_config_to_databases(config: Config, bot: Red):
global _config, _bot, database_connection
if _config is None:
_config = config
if _bot is None:
_bot = bot
if database_connection is None:
database_connection = apsw.Connection(
str(cog_data_path(_bot.get_cog("Audio")) / "Audio.db")
)
@dataclass
class PlaylistFetchResult:
playlist_id: int
playlist_name: str
scope_id: int
author_id: int
playlist_url: Optional[str] = None
tracks: List[MutableMapping] = field(default_factory=lambda: [])
def __post_init__(self):
if isinstance(self.tracks, str):
self.tracks = json.loads(self.tracks)
@dataclass
class CacheFetchResult:
query: Optional[Union[str, MutableMapping]]
last_updated: int
def __post_init__(self):
if isinstance(self.last_updated, int):
self.updated_on: datetime.datetime = datetime.datetime.fromtimestamp(self.last_updated)
if isinstance(self.query, str) and all(
k in self.query for k in ["loadType", "playlistInfo", "isSeekable", "isStream"]
):
self.query = json.loads(self.query)
@dataclass
class CacheLastFetchResult:
tracks: List[MutableMapping] = field(default_factory=lambda: [])
def __post_init__(self):
if isinstance(self.tracks, str):
self.tracks = json.loads(self.tracks)
@dataclass
class CacheGetAllLavalink:
query: str
data: List[MutableMapping] = field(default_factory=lambda: [])
def __post_init__(self):
if isinstance(self.data, str):
self.data = json.loads(self.data)
class CacheInterface:
def __init__(self):
self.database = database_connection.cursor()
@staticmethod
def close():
with contextlib.suppress(Exception):
database_connection.close()
async def init(self):
self.database.execute(PRAGMA_SET_temp_store)
self.database.execute(PRAGMA_SET_journal_mode)
self.database.execute(PRAGMA_SET_read_uncommitted)
self.maybe_migrate()
self.database.execute(LAVALINK_CREATE_TABLE)
self.database.execute(LAVALINK_CREATE_INDEX)
self.database.execute(YOUTUBE_CREATE_TABLE)
self.database.execute(YOUTUBE_CREATE_INDEX)
self.database.execute(SPOTIFY_CREATE_TABLE)
self.database.execute(SPOTIFY_CREATE_INDEX)
await self.clean_up_old_entries()
async def clean_up_old_entries(self):
max_age = await _config.cache_age()
maxage = datetime.datetime.now(tz=datetime.timezone.utc) - datetime.timedelta(days=max_age)
maxage_int = int(time.mktime(maxage.timetuple()))
values = {"maxage": maxage_int}
with concurrent.futures.ThreadPoolExecutor(max_workers=3) as executor:
executor.submit(self.database.execute, LAVALINK_DELETE_OLD_ENTRIES, values)
executor.submit(self.database.execute, YOUTUBE_DELETE_OLD_ENTRIES, values)
executor.submit(self.database.execute, SPOTIFY_DELETE_OLD_ENTRIES, values)
def maybe_migrate(self):
current_version = self.database.execute(PRAGMA_FETCH_user_version).fetchone()
if isinstance(current_version, tuple):
current_version = current_version[0]
if current_version == SCHEMA_VERSION:
return
self.database.execute(PRAGMA_SET_user_version, {"version": SCHEMA_VERSION})
async def insert(self, table: str, values: List[MutableMapping]):
try:
query = _PARSER.get(table, {}).get("insert")
if query is None:
raise InvalidTableError(f"{table} is not a valid table in the database.")
self.database.execute("BEGIN;")
self.database.executemany(query, values)
self.database.execute("COMMIT;")
except Exception as err:
log.debug("Error during audio db insert", exc_info=err)
async def update(self, table: str, values: Dict[str, Union[str, int]]):
try:
table = _PARSER.get(table, {})
sql_query = table.get("update")
time_now = int(datetime.datetime.now(datetime.timezone.utc).timestamp())
values["last_fetched"] = time_now
if not table:
raise InvalidTableError(f"{table} is not a valid table in the database.")
with concurrent.futures.ThreadPoolExecutor(max_workers=1) as executor:
executor.submit(self.database.execute, sql_query, values)
except Exception as err:
log.debug("Error during audio db update", exc_info=err)
async def fetch_one(
self, table: str, query: str, values: Dict[str, Union[str, int]]
) -> Tuple[Optional[str], bool]:
table = _PARSER.get(table, {})
sql_query = table.get(query, {}).get("query")
if not table:
raise InvalidTableError(f"{table} is not a valid table in the database.")
max_age = await _config.cache_age()
maxage = datetime.datetime.now(tz=datetime.timezone.utc) - datetime.timedelta(days=max_age)
maxage_int = int(time.mktime(maxage.timetuple()))
values.update({"maxage": maxage_int})
output = self.database.execute(sql_query, values).fetchone() or (None, 0)
result = CacheFetchResult(*output)
return result.query, False
async def fetch_all(
self, table: str, query: str, values: Dict[str, Union[str, int]]
) -> List[CacheLastFetchResult]:
table = _PARSER.get(table, {})
sql_query = table.get(query, {}).get("played")
if not table:
raise InvalidTableError(f"{table} is not a valid table in the database.")
output = []
for index, row in enumerate(self.database.execute(sql_query, values), start=1):
if index % 50 == 0:
await asyncio.sleep(0.01)
output.append(CacheLastFetchResult(*row))
return output
async def fetch_random(
self, table: str, query: str, values: Dict[str, Union[str, int]]
) -> CacheLastFetchResult:
table = _PARSER.get(table, {})
sql_query = table.get(query, {}).get("played")
if not table:
raise InvalidTableError(f"{table} is not a valid table in the database.")
with concurrent.futures.ThreadPoolExecutor(max_workers=1) as executor:
for future in concurrent.futures.as_completed(
[executor.submit(self.database.execute, sql_query, values)]
):
try:
row = future.result()
row = row.fetchone()
except Exception as exc:
log.debug(f"Failed to completed random fetch from database", exc_info=exc)
return CacheLastFetchResult(*row)
class PlaylistInterface:
def __init__(self):
self.cursor = database_connection.cursor()
self.cursor.execute(PRAGMA_SET_temp_store)
self.cursor.execute(PRAGMA_SET_journal_mode)
self.cursor.execute(PRAGMA_SET_read_uncommitted)
self.cursor.execute(PLAYLIST_CREATE_TABLE)
self.cursor.execute(PLAYLIST_CREATE_INDEX)
@staticmethod
def close():
with contextlib.suppress(Exception):
database_connection.close()
@staticmethod
def get_scope_type(scope: str) -> int:
if scope == PlaylistScope.GLOBAL.value:
table = 1
elif scope == PlaylistScope.USER.value:
table = 3
else:
table = 2
return table
def fetch(self, scope: str, playlist_id: int, scope_id: int) -> PlaylistFetchResult:
scope_type = self.get_scope_type(scope)
row = (
self.cursor.execute(
PLAYLIST_FETCH,
({"playlist_id": playlist_id, "scope_id": scope_id, "scope_type": scope_type}),
).fetchone()
or []
)
return PlaylistFetchResult(*row) if row else None
async def fetch_all(
self, scope: str, scope_id: int, author_id=None
) -> List[PlaylistFetchResult]:
scope_type = self.get_scope_type(scope)
if author_id is not None:
output = []
for index, row in enumerate(
self.cursor.execute(
PLAYLIST_FETCH_ALL_WITH_FILTER,
({"scope_type": scope_type, "scope_id": scope_id, "author_id": author_id}),
),
start=1,
):
if index % 50 == 0:
await asyncio.sleep(0.01)
output.append(row)
else:
output = []
for index, row in enumerate(
self.cursor.execute(
PLAYLIST_FETCH_ALL, ({"scope_type": scope_type, "scope_id": scope_id})
),
start=1,
):
if index % 50 == 0:
await asyncio.sleep(0.01)
output.append(row)
return [PlaylistFetchResult(*row) for row in output] if output else []
async def fetch_all_converter(
self, scope: str, playlist_name, playlist_id
) -> List[PlaylistFetchResult]:
scope_type = self.get_scope_type(scope)
try:
playlist_id = int(playlist_id)
except Exception:
playlist_id = -1
output = []
for index, row in enumerate(
self.cursor.execute(
PLAYLIST_FETCH_ALL_CONVERTER,
(
{
"scope_type": scope_type,
"playlist_name": playlist_name,
"playlist_id": playlist_id,
}
),
),
start=1,
):
if index % 50 == 0:
await asyncio.sleep(0.01)
output.append(row)
return [PlaylistFetchResult(*row) for row in output] if output else []
def delete(self, scope: str, playlist_id: int, scope_id: int):
scope_type = self.get_scope_type(scope)
with concurrent.futures.ThreadPoolExecutor(max_workers=1) as executor:
executor.submit(
self.cursor.execute,
PLAYLIST_DELETE,
({"playlist_id": playlist_id, "scope_id": scope_id, "scope_type": scope_type}),
)
def delete_scheduled(self):
with concurrent.futures.ThreadPoolExecutor(max_workers=1) as executor:
executor.submit(self.cursor.execute, PLAYLIST_DELETE_SCHEDULED)
def drop(self, scope: str):
scope_type = self.get_scope_type(scope)
with concurrent.futures.ThreadPoolExecutor(max_workers=1) as executor:
executor.submit(
self.cursor.execute, PLAYLIST_DELETE_SCOPE, ({"scope_type": scope_type})
)
def create_table(self, scope: str):
scope_type = self.get_scope_type(scope)
return self.cursor.execute(PLAYLIST_CREATE_TABLE, ({"scope_type": scope_type}))
def upsert(
self,
scope: str,
playlist_id: int,
playlist_name: str,
scope_id: int,
author_id: int,
playlist_url: Optional[str],
tracks: List[MutableMapping],
):
scope_type = self.get_scope_type(scope)
with concurrent.futures.ThreadPoolExecutor(max_workers=1) as executor:
executor.submit(
self.cursor.execute,
PLAYLIST_UPSERT,
{
"scope_type": str(scope_type),
"playlist_id": int(playlist_id),
"playlist_name": str(playlist_name),
"scope_id": int(scope_id),
"author_id": int(author_id),
"playlist_url": playlist_url,
"tracks": json.dumps(tracks),
},
)
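The __post_init__ hooks above exist because SQLite hands the JSON columns back as plain strings; decoding them in the dataclass means every caller sees real Python objects. A small self-contained sketch of that round trip (the sample row values are illustrative):

import json
from dataclasses import dataclass, field
from typing import List, MutableMapping, Optional


@dataclass
class PlaylistFetchResult:
    playlist_id: int
    playlist_name: str
    scope_id: int
    author_id: int
    playlist_url: Optional[str] = None
    tracks: List[MutableMapping] = field(default_factory=list)

    def __post_init__(self):
        # The tracks column is stored as JSON text; decode it on construction.
        if isinstance(self.tracks, str):
            self.tracks = json.loads(self.tracks)


# A raw database row unpacks straight into the dataclass, JSON column included.
row = (1000, "Chill", 111111111111111111, 222222222222222222, None, '[{"track": "abc"}]')
result = PlaylistFetchResult(*row)
print(result.tracks[0]["track"])  # "abc"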

View File

@ -5,7 +5,7 @@
class Equalizer:
def __init__(self):
self._band_count = 15
self.bands = [0.0 for _ in range(self._band_count)]
self.bands = [0.0 for _loop_counter in range(self._band_count)]
def set_gain(self, band: int, gain: float):
if band < 0 or band >= self._band_count:

View File

@ -14,7 +14,6 @@ class LavalinkDownloadFailed(AudioError, RuntimeError):
The response from the server to the failed GET request.
should_retry : bool
Whether or not the Audio cog should retry downloading the jar.
"""
def __init__(self, *args, response: aiohttp.ClientResponse, should_retry: bool = False):
@ -33,6 +32,18 @@ class LavalinkDownloadFailed(AudioError, RuntimeError):
return f"[{self.response.status} {self.response.reason}]"
class QueryUnauthorized(AudioError):
"""Provided an unauthorized query to audio."""
def __init__(self, message, *args):
self.message = message
super().__init__(*args)
class TrackEnqueueError(AudioError):
"""Unable to play track."""
class PlayListError(AudioError):
"""Base exception for errors related to playlists."""

View File

@ -15,24 +15,28 @@ import aiohttp
from tqdm import tqdm
from redbot.core import data_manager
from .errors import LavalinkDownloadFailed
log = logging.getLogger("red.audio.manager")
JAR_VERSION = "3.2.1"
JAR_BUILD = 846
LAVALINK_DOWNLOAD_URL = (
f"https://github.com/Cog-Creators/Lavalink-Jars/releases/download/{JAR_VERSION}_{JAR_BUILD}/"
f"Lavalink.jar"
"Lavalink.jar"
)
LAVALINK_DOWNLOAD_DIR = data_manager.cog_data_path(raw_name="Audio")
LAVALINK_JAR_FILE = LAVALINK_DOWNLOAD_DIR / "Lavalink.jar"
BUNDLED_APP_YML = pathlib.Path(__file__).parent / "data" / "application.yml"
LAVALINK_APP_YML = LAVALINK_DOWNLOAD_DIR / "application.yml"
READY_LINE_RE = re.compile(rb"Started Launcher in \S+ seconds")
BUILD_LINE_RE = re.compile(rb"Build:\s+(?P<build>\d+)")
log = logging.getLogger("red.audio.manager")
_RE_READY_LINE = re.compile(rb"Started Launcher in \S+ seconds")
_FAILED_TO_START = re.compile(rb"Web server failed to start. (.*)")
_RE_BUILD_LINE = re.compile(rb"Build:\s+(?P<build>\d+)")
_RE_JAVA_VERSION_LINE = re.compile(
r'version "(?P<major>\d+).(?P<minor>\d+).\d+(?:_\d+)?(?:-[A-Za-z0-9]+)?"'
)
_RE_JAVA_SHORT_VERSION = re.compile(r'version "(?P<major>\d+)"')
class ServerManager:
@ -40,10 +44,10 @@ class ServerManager:
_java_available: ClassVar[Optional[bool]] = None
_java_version: ClassVar[Optional[Tuple[int, int]]] = None
_up_to_date: ClassVar[Optional[bool]] = None
_blacklisted_archs = []
_blacklisted_archs: List[str] = []
def __init__(self) -> None:
self.ready = asyncio.Event()
self.ready: asyncio.Event = asyncio.Event()
self._proc: Optional[asyncio.subprocess.Process] = None # pylint:disable=no-member
self._monitor_task: Optional[asyncio.Task] = None
@ -88,7 +92,7 @@ class ServerManager:
@classmethod
async def _get_jar_args(cls) -> List[str]:
java_available, java_version = await cls._has_java()
(java_available, java_version) = await cls._has_java()
if not java_available:
raise RuntimeError("You must install Java 1.8+ for Lavalink to run.")
@ -117,9 +121,7 @@ class ServerManager:
@staticmethod
async def _get_java_version() -> Tuple[int, int]:
"""
This assumes we've already checked that java exists.
"""
"""This assumes we've already checked that java exists."""
_proc: asyncio.subprocess.Process = await asyncio.create_subprocess_exec( # pylint:disable=no-member
"java", "-version", stdout=asyncio.subprocess.PIPE, stderr=asyncio.subprocess.PIPE
)
@ -133,15 +135,11 @@ class ServerManager:
# ... version "MAJOR.MINOR.PATCH[_BUILD]" ...
# ...
# We only care about the major and minor parts though.
version_line_re = re.compile(
r'version "(?P<major>\d+).(?P<minor>\d+).\d+(?:_\d+)?(?:-[A-Za-z0-9]+)?"'
)
short_version_re = re.compile(r'version "(?P<major>\d+)"')
lines = version_info.splitlines()
for line in lines:
match = version_line_re.search(line)
short_match = short_version_re.search(line)
match = _RE_JAVA_VERSION_LINE.search(line)
short_match = _RE_JAVA_SHORT_VERSION.search(line)
if match:
return int(match["major"]), int(match["minor"])
elif short_match:
@ -157,9 +155,11 @@ class ServerManager:
lastmessage = 0
for i in itertools.cycle(range(50)):
line = await self._proc.stdout.readline()
if READY_LINE_RE.search(line):
if _RE_READY_LINE.search(line):
self.ready.set()
break
if _FAILED_TO_START.search(line):
raise RuntimeError(f"Lavalink failed to start: {line.decode().strip()}")
if self._proc.returncode is not None and lastmessage + 2 < time.time():
# Avoid console spam; only print once every 2 seconds
lastmessage = time.time()
@ -259,7 +259,7 @@ class ServerManager:
stderr=asyncio.subprocess.STDOUT,
)
stdout = (await _proc.communicate())[0]
match = BUILD_LINE_RE.search(stdout)
match = _RE_BUILD_LINE.search(stdout)
if not match:
# Output is unexpected, suspect corrupted jarfile
return False
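For reference, the two module-level java regexes cover both the legacy "1.8.0_xxx" banner and the short "11"/"13" form. A standalone sketch of the parsing logic they feed; the sample banners are typical `java -version` output, not captured from this PR.

import re
from typing import Tuple

_RE_JAVA_VERSION_LINE = re.compile(
    r'version "(?P<major>\d+).(?P<minor>\d+).\d+(?:_\d+)?(?:-[A-Za-z0-9]+)?"'
)
_RE_JAVA_SHORT_VERSION = re.compile(r'version "(?P<major>\d+)"')


def parse_java_version(banner: str) -> Tuple[int, int]:
    # Mirrors the loop in _get_java_version(): try the full form first,
    # then the short form used by java 9+ GA builds.
    for line in banner.splitlines():
        match = _RE_JAVA_VERSION_LINE.search(line)
        short_match = _RE_JAVA_SHORT_VERSION.search(line)
        if match:
            return int(match["major"]), int(match["minor"])
        if short_match:
            return int(short_match["major"]), 0
    raise RuntimeError("Unexpected java -version output")


print(parse_java_version('openjdk version "1.8.0_232"'))          # (1, 8)
print(parse_java_version('openjdk version "11.0.5" 2019-10-15'))  # (11, 0)
print(parse_java_version('openjdk version "13" 2019-09-17'))      # (13, 0)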

View File

@ -1,6 +1,5 @@
from collections import namedtuple
from enum import Enum, unique
from typing import List, Optional, Union
from typing import List, MutableMapping, Optional, Union, TYPE_CHECKING
import discord
import lavalink
@ -9,22 +8,33 @@ from redbot.core import Config, commands
from redbot.core.bot import Red
from redbot.core.i18n import Translator
from redbot.core.utils.chat_formatting import humanize_list
from .errors import InvalidPlaylistScope, MissingAuthor, MissingGuild, NotAllowed
_config = None
_bot = None
from .databases import PlaylistFetchResult, PlaylistInterface
from .errors import InvalidPlaylistScope, MissingAuthor, MissingGuild, NotAllowed
from .utils import PlaylistScope
if TYPE_CHECKING:
database: PlaylistInterface
_bot: Red
_config: Config
else:
database = None
_bot = None
_config = None
__all__ = [
"Playlist",
"PlaylistScope",
"get_playlist",
"get_all_playlist",
"create_playlist",
"reset_playlist",
"delete_playlist",
"humanize_scope",
"standardize_scope",
"FakePlaylist",
"get_all_playlist_for_migration23",
"database",
"get_all_playlist_converter",
"get_playlist_database",
]
FakePlaylist = namedtuple("Playlist", "author scope")
@ -32,29 +42,22 @@ FakePlaylist = namedtuple("Playlist", "author scope")
_ = Translator("Audio", __file__)
@unique
class PlaylistScope(Enum):
GLOBAL = "GLOBALPLAYLIST"
GUILD = "GUILDPLAYLIST"
USER = "USERPLAYLIST"
def __str__(self):
return "{0}".format(self.value)
@staticmethod
def list():
return list(map(lambda c: c.value, PlaylistScope))
def _pass_config_to_playlist(config: Config, bot: Red):
global _config, _bot
global _config, _bot, database
if _config is None:
_config = config
if _bot is None:
_bot = bot
if database is None:
database = PlaylistInterface()
def standardize_scope(scope) -> str:
def get_playlist_database() -> Optional[PlaylistInterface]:
global database
return database
def standardize_scope(scope: str) -> str:
scope = scope.upper()
valid_scopes = ["GLOBAL", "GUILD", "AUTHOR", "USER", "SERVER", "MEMBER", "BOT"]
@ -76,17 +79,25 @@ def standardize_scope(scope) -> str:
return scope
def humanize_scope(scope, ctx=None, the=None):
def _prepare_config_scope(
scope, author: Union[discord.abc.User, int] = None, guild: Union[discord.Guild, int] = None
):
scope = standardize_scope(scope)
if scope == PlaylistScope.GLOBAL.value:
return ctx or _("the ") if the else "" + "Global"
elif scope == PlaylistScope.GUILD.value:
return ctx.name if ctx else _("the ") if the else "" + "Server"
config_scope = [PlaylistScope.GLOBAL.value, _bot.user.id]
elif scope == PlaylistScope.USER.value:
return str(ctx) if ctx else _("the ") if the else "" + "User"
if author is None:
raise MissingAuthor("Invalid author for user scope.")
config_scope = [PlaylistScope.USER.value, int(getattr(author, "id", author))]
else:
if guild is None:
raise MissingGuild("Invalid guild for guild scope.")
config_scope = [PlaylistScope.GUILD.value, int(getattr(guild, "id", guild))]
return config_scope
def _prepare_config_scope(
def _prepare_config_scope_for_migration23( # TODO: remove me in a future version ?
scope, author: Union[discord.abc.User, int] = None, guild: discord.Guild = None
):
scope = standardize_scope(scope)
@ -104,6 +115,146 @@ def _prepare_config_scope(
return config_scope
class PlaylistMigration23: # TODO: remove me in a future version ?
"""A single playlist."""
def __init__(
self,
scope: str,
author: int,
playlist_id: int,
name: str,
playlist_url: Optional[str] = None,
tracks: Optional[List[MutableMapping]] = None,
guild: Union[discord.Guild, int, None] = None,
):
self.guild = guild
self.scope = standardize_scope(scope)
self.author = author
self.id = playlist_id
self.name = name
self.url = playlist_url
self.tracks = tracks or []
@classmethod
async def from_json(
cls, scope: str, playlist_number: int, data: MutableMapping, **kwargs
) -> "PlaylistMigration23":
"""Get a Playlist object from the provided information.
Parameters
----------
scope: str
The custom config scope. One of 'GLOBALPLAYLIST', 'GUILDPLAYLIST' or 'USERPLAYLIST'.
playlist_number: int
The playlist's number.
data: dict
The JSON representation of the playlist to be loaded.
**kwargs
Extra attributes for the Playlist instance which override values
in the data dict. These should be complete objects and not
IDs, where possible.
Returns
-------
Playlist
The playlist object for the requested playlist.
Raises
------
`InvalidPlaylistScope`
Passing a scope that is not supported.
`MissingGuild`
Trying to access the Guild scope without a guild.
`MissingAuthor`
Trying to access the User scope without a user id.
"""
guild = data.get("guild") or kwargs.get("guild")
author: int = data.get("author") or 0
playlist_id = data.get("id") or playlist_number
name = data.get("name", "Unnamed")
playlist_url = data.get("playlist_url", None)
tracks = data.get("tracks", [])
return cls(
guild=guild,
scope=scope,
author=author,
playlist_id=playlist_id,
name=name,
playlist_url=playlist_url,
tracks=tracks,
)
async def save(self):
"""Saves a Playlist to SQL."""
scope, scope_id = _prepare_config_scope(self.scope, self.author, self.guild)
database.upsert(
scope,
playlist_id=int(self.id),
playlist_name=self.name,
scope_id=scope_id,
author_id=self.author,
playlist_url=self.url,
tracks=self.tracks,
)
async def get_all_playlist_for_migration23( # TODO: remove me in a future version ?
scope: str, guild: Union[discord.Guild, int] = None
) -> List[PlaylistMigration23]:
"""
Gets all playlists for the specified scope.
Parameters
----------
scope: str
The custom config scope. One of 'GLOBALPLAYLIST', 'GUILDPLAYLIST' or 'USERPLAYLIST'.
guild: discord.Guild
The guild to get the playlist from if scope is GUILDPLAYLIST.
Returns
-------
list
A list of all playlists for the specified scope
Raises
------
`InvalidPlaylistScope`
Passing a scope that is not supported.
`MissingGuild`
Trying to access the Guild scope without a guild.
`MissingAuthor`
Trying to access the User scope without a user id.
"""
playlists = await _config.custom(scope).all()
if scope == PlaylistScope.GLOBAL.value:
return [
await PlaylistMigration23.from_json(
scope,
playlist_number,
playlist_data,
guild=guild,
author=int(playlist_data.get("author", 0)),
)
for playlist_number, playlist_data in playlists.items()
]
elif scope == PlaylistScope.USER.value:
return [
await PlaylistMigration23.from_json(
scope, playlist_number, playlist_data, guild=guild, author=int(user_id)
)
for user_id, scopedata in playlists.items()
for playlist_number, playlist_data in scopedata.items()
]
else:
return [
await PlaylistMigration23.from_json(
scope,
playlist_number,
playlist_data,
guild=int(guild_id),
author=int(playlist_data.get("author", 0)),
)
for guild_id, scopedata in playlists.items()
for playlist_number, playlist_data in scopedata.items()
]
class Playlist:
"""A single playlist."""
@ -115,14 +266,16 @@ class Playlist:
playlist_id: int,
name: str,
playlist_url: Optional[str] = None,
tracks: Optional[List[dict]] = None,
tracks: Optional[List[MutableMapping]] = None,
guild: Union[discord.Guild, int, None] = None,
):
self.bot = bot
self.guild = guild
self.scope = standardize_scope(scope)
self.config_scope = _prepare_config_scope(self.scope, author, guild)
self.scope_id = self.config_scope[-1]
self.author = author
self.author_id = getattr(self.author, "id", self.author)
self.guild_id = (
getattr(guild, "id", guild) if self.scope == PlaylistScope.GLOBAL.value else None
)
@ -132,7 +285,14 @@ class Playlist:
self.tracks = tracks or []
self.tracks_obj = [lavalink.Track(data=track) for track in self.tracks]
async def edit(self, data: dict):
def __repr__(self):
return (
f"Playlist(name={self.name}, id={self.id}, scope={self.scope}, "
f"scope_id={self.scope_id}, author={self.author_id}, "
f"tracks={len(self.tracks)}, url={self.url})"
)
async def edit(self, data: MutableMapping):
"""
Edits a Playlist.
Parameters
@ -146,10 +306,23 @@ class Playlist:
for item in list(data.keys()):
setattr(self, item, data[item])
await self.save()
return self
await _config.custom(*self.config_scope, str(self.id)).set(self.to_json())
async def save(self):
"""Saves a Playlist."""
scope, scope_id = self.config_scope
database.upsert(
scope,
playlist_id=int(self.id),
playlist_name=self.name,
scope_id=scope_id,
author_id=self.author_id,
playlist_url=self.url,
tracks=self.tracks,
)
def to_json(self) -> dict:
def to_json(self) -> MutableMapping:
"""Transform the object to a dict.
Returns
-------
@ -158,7 +331,7 @@ class Playlist:
"""
data = dict(
id=self.id,
author=self.author,
author=self.author_id,
guild=self.guild_id,
name=self.name,
playlist_url=self.url,
@ -168,7 +341,9 @@ class Playlist:
return data
@classmethod
async def from_json(cls, bot: Red, scope: str, playlist_number: int, data: dict, **kwargs):
async def from_json(
cls, bot: Red, scope: str, playlist_number: int, data: PlaylistFetchResult, **kwargs
) -> "Playlist":
"""Get a Playlist object from the provided information.
Parameters
----------
@ -197,12 +372,12 @@ class Playlist:
`MissingAuthor`
Trying to access the User scope without a user id.
"""
guild = data.get("guild") or kwargs.get("guild")
author = data.get("author")
playlist_id = data.get("id") or playlist_number
name = data.get("name", "Unnamed")
playlist_url = data.get("playlist_url", None)
tracks = data.get("tracks", [])
guild = data.scope_id if scope == PlaylistScope.GUILD.value else kwargs.get("guild")
author = data.author_id
playlist_id = data.playlist_id or playlist_number
name = data.playlist_name
playlist_url = data.playlist_url
tracks = data.tracks
return cls(
bot=bot,
@ -252,13 +427,13 @@ async def get_playlist(
`MissingAuthor`
Trying to access the User scope without a user id.
"""
playlist_data = await _config.custom(
*_prepare_config_scope(scope, author, guild), str(playlist_number)
).all()
if not playlist_data["id"]:
scope_standard, scope_id = _prepare_config_scope(scope, author, guild)
playlist_data = database.fetch(scope_standard, playlist_number, scope_id)
if not (playlist_data and playlist_data.playlist_id):
raise RuntimeError(f"That playlist does not exist for the following scope: {scope}")
return await Playlist.from_json(
bot, scope, playlist_number, playlist_data, guild=guild, author=author
bot, scope_standard, playlist_number, playlist_data, guild=guild, author=author
)
@ -296,23 +471,65 @@ async def get_all_playlist(
`MissingAuthor`
Trying to access the User scope without a user id.
"""
playlists = await _config.custom(*_prepare_config_scope(scope, author, guild)).all()
scope_standard, scope_id = _prepare_config_scope(scope, author, guild)
if specified_user:
user_id = getattr(author, "id", author)
return [
await Playlist.from_json(
bot, scope, playlist_number, playlist_data, guild=guild, author=author
)
for playlist_number, playlist_data in playlists.items()
if user_id == playlist_data.get("author")
]
playlists = await database.fetch_all(scope_standard, scope_id, author_id=user_id)
else:
return [
await Playlist.from_json(
bot, scope, playlist_number, playlist_data, guild=guild, author=author
)
for playlist_number, playlist_data in playlists.items()
]
playlists = await database.fetch_all(scope_standard, scope_id)
return [
await Playlist.from_json(
bot, scope, playlist.playlist_id, playlist, guild=guild, author=author
)
for playlist in playlists
]
async def get_all_playlist_converter(
scope: str,
bot: Red,
arg: str,
guild: Union[discord.Guild, int] = None,
author: Union[discord.abc.User, int] = None,
) -> List[Playlist]:
"""
Gets all playlists for the specified scope.
Parameters
----------
scope: str
The custom config scope. One of 'GLOBALPLAYLIST', 'GUILDPLAYLIST' or 'USERPLAYLIST'.
guild: discord.Guild
The guild to get the playlist from if scope is GUILDPLAYLIST.
author: int
The ID of the user to get the playlist from if scope is USERPLAYLIST.
bot: Red
The bot instance.
arg: str
The value to look up.
Returns
-------
list
A list of all playlists for the specified scope
Raises
------
`InvalidPlaylistScope`
Passing a scope that is not supported.
`MissingGuild`
Trying to access the Guild scope without a guild.
`MissingAuthor`
Trying to access the User scope without a user id.
"""
scope_standard, scope_id = _prepare_config_scope(scope, author, guild)
playlists = await database.fetch_all_converter(
scope_standard, playlist_name=arg, playlist_id=arg
)
return [
await Playlist.from_json(
bot, scope, playlist.playlist_id, playlist, guild=guild, author=author
)
for playlist in playlists
]
async def create_playlist(
@ -320,12 +537,11 @@ async def create_playlist(
scope: str,
playlist_name: str,
playlist_url: Optional[str] = None,
tracks: Optional[List[dict]] = None,
tracks: Optional[List[MutableMapping]] = None,
author: Optional[discord.User] = None,
guild: Optional[discord.Guild] = None,
) -> Optional[Playlist]:
"""
Creates a new Playlist.
"""Creates a new Playlist.
Parameters
----------
@ -337,7 +553,7 @@ async def create_playlist(
The name of the new playlist.
playlist_url: str
The URL of the new playlist.
tracks: List[dict]
tracks: List[MutableMapping]
A list of tracks to add to the playlist.
author: discord.User
The Author of the playlist.
@ -358,12 +574,16 @@ async def create_playlist(
"""
playlist = Playlist(
ctx.bot, scope, author.id, ctx.message.id, playlist_name, playlist_url, tracks, ctx.guild
)
await _config.custom(*_prepare_config_scope(scope, author, guild), str(ctx.message.id)).set(
playlist.to_json()
ctx.bot,
scope,
author.id if author else None,
ctx.message.id,
playlist_name,
playlist_url,
tracks,
guild or ctx.guild,
)
await playlist.save()
return playlist
@ -372,8 +592,7 @@ async def reset_playlist(
guild: Union[discord.Guild, int] = None,
author: Union[discord.abc.User, int] = None,
) -> None:
"""
Wipes all playlists for the specified scope.
"""Wipes all playlists for the specified scope.
Parameters
----------
@ -393,7 +612,9 @@ async def reset_playlist(
`MissingAuthor`
Trying to access the User scope without a user id.
"""
await _config.custom(*_prepare_config_scope(scope, author, guild)).clear()
scope, scope_id = _prepare_config_scope(scope, author, guild)
database.drop(scope)
database.create_table(scope)
async def delete_playlist(
@ -402,27 +623,27 @@ async def delete_playlist(
guild: discord.Guild,
author: Union[discord.abc.User, int] = None,
) -> None:
"""Deletes the specified playlist.
Parameters
----------
scope: str
The custom config scope. One of 'GLOBALPLAYLIST', 'GUILDPLAYLIST' or 'USERPLAYLIST'.
playlist_id: Union[str, int]
The ID of the playlist.
guild: discord.Guild
The guild to get the playlist from if scope is GUILDPLAYLIST.
author: int
The ID of the user to get the playlist from if scope is USERPLAYLIST.
Raises
------
`InvalidPlaylistScope`
Passing a scope that is not supported.
`MissingGuild`
Trying to access the Guild scope without a guild.
`MissingAuthor`
Trying to access the User scope without a user id.
"""
Deletes the specified playlist.
Parameters
----------
scope: str
The custom config scope. One of 'GLOBALPLAYLIST', 'GUILDPLAYLIST' or 'USERPLAYLIST'.
playlist_id: Union[str, int]
The ID of the playlist.
guild: discord.Guild
The guild to get the playlist from if scope is GUILDPLAYLIST.
author: int
The ID of the user to get the playlist from if scope is USERPLAYLIST.
Raises
------
`InvalidPlaylistScope`
Passing a scope that is not supported.
`MissingGuild`
Trying to access the Guild scope without a guild.
`MissingAuthor`
Trying to access the User scope without a user id.
"""
await _config.custom(*_prepare_config_scope(scope, author, guild), str(playlist_id)).clear()
scope, scope_id = _prepare_config_scope(scope, author, guild)
database.delete(scope, int(playlist_id), scope_id)
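A hedged sketch of how a command would use the SQL-backed helpers after this change; the PlaylistDemo cog, the command name, and the import path are illustrative, and the helpers assume pass_config_to_dependencies has already wired config and the database in.

from redbot.core import commands

from redbot.cogs.audio.playlists import PlaylistScope, create_playlist  # import path assumed


class PlaylistDemo(commands.Cog):
    @commands.command()
    async def saveplaylist(self, ctx: commands.Context, *, name: str):
        # create_playlist builds the Playlist and calls playlist.save(), which
        # now upserts into the playlists table instead of writing to Config.
        playlist = await create_playlist(
            ctx,
            PlaylistScope.GUILD.value,
            playlist_name=name,
            playlist_url=None,
            tracks=[],
            author=ctx.author,
            guild=ctx.guild,
        )
        await ctx.send(f"Created playlist {playlist.name} ({playlist.id}).")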

View File

@ -0,0 +1,397 @@
# TODO: https://github.com/Cog-Creators/Red-DiscordBot/pull/3195#issuecomment-567821701
# Thanks a lot Sinbad!
__all__ = [
# PRAGMA Statements
"PRAGMA_SET_temp_store",
"PRAGMA_SET_journal_mode",
"PRAGMA_SET_read_uncommitted",
"PRAGMA_FETCH_user_version",
"PRAGMA_SET_user_version",
# Playlist table statements
"PLAYLIST_CREATE_TABLE",
"PLAYLIST_DELETE",
"PLAYLIST_DELETE_SCOPE",
"PLAYLIST_DELETE_SCHEDULED",
"PLAYLIST_FETCH_ALL",
"PLAYLIST_FETCH_ALL_WITH_FILTER",
"PLAYLIST_FETCH_ALL_CONVERTER",
"PLAYLIST_FETCH",
"PLAYLIST_UPSERT",
"PLAYLIST_CREATE_INDEX",
# YouTube table statements
"YOUTUBE_DROP_TABLE",
"YOUTUBE_CREATE_TABLE",
"YOUTUBE_CREATE_INDEX",
"YOUTUBE_UPSERT",
"YOUTUBE_UPDATE",
"YOUTUBE_QUERY",
"YOUTUBE_DELETE_OLD_ENTRIES",
# Spotify table statements
"SPOTIFY_DROP_TABLE",
"SPOTIFY_CREATE_INDEX",
"SPOTIFY_CREATE_TABLE",
"SPOTIFY_UPSERT",
"SPOTIFY_QUERY",
"SPOTIFY_UPDATE",
"SPOTIFY_DELETE_OLD_ENTRIES",
# Lavalink table statements
"LAVALINK_DROP_TABLE",
"LAVALINK_CREATE_TABLE",
"LAVALINK_CREATE_INDEX",
"LAVALINK_UPSERT",
"LAVALINK_UPDATE",
"LAVALINK_QUERY",
"LAVALINK_QUERY_LAST_FETCHED_RANDOM",
"LAVALINK_DELETE_OLD_ENTRIES",
"LAVALINK_FETCH_ALL_ENTRIES_GLOBAL",
]
# PRAGMA Statements
PRAGMA_SET_temp_store = """
PRAGMA temp_store = 2;
"""
PRAGMA_SET_journal_mode = """
PRAGMA journal_mode = wal;
"""
PRAGMA_SET_read_uncommitted = """
PRAGMA read_uncommitted = 1;
"""
PRAGMA_FETCH_user_version = """
pragma user_version;
"""
PRAGMA_SET_user_version = """
pragma user_version=3;
"""
# Playlist table statements
PLAYLIST_CREATE_TABLE = """
CREATE TABLE IF NOT EXISTS playlists (
scope_type INTEGER NOT NULL,
playlist_id INTEGER NOT NULL,
playlist_name TEXT NOT NULL,
scope_id INTEGER NOT NULL,
author_id INTEGER NOT NULL,
deleted BOOLEAN DEFAULT false,
playlist_url TEXT,
tracks JSON,
PRIMARY KEY (playlist_id, scope_id, scope_type)
);
"""
PLAYLIST_DELETE = """
UPDATE playlists
SET
deleted = true
WHERE
(
scope_type = :scope_type
AND playlist_id = :playlist_id
AND scope_id = :scope_id
)
;
"""
PLAYLIST_DELETE_SCOPE = """
DELETE
FROM
playlists
WHERE
scope_type = :scope_type ;
"""
PLAYLIST_DELETE_SCHEDULED = """
DELETE
FROM
playlists
WHERE
deleted = true;
"""
PLAYLIST_FETCH_ALL = """
SELECT
playlist_id,
playlist_name,
scope_id,
author_id,
playlist_url,
tracks
FROM
playlists
WHERE
scope_type = :scope_type
AND scope_id = :scope_id
AND deleted = false
;
"""
PLAYLIST_FETCH_ALL_WITH_FILTER = """
SELECT
playlist_id,
playlist_name,
scope_id,
author_id,
playlist_url,
tracks
FROM
playlists
WHERE
(
scope_type = :scope_type
AND scope_id = :scope_id
AND author_id = :author_id
AND deleted = false
)
;
"""
PLAYLIST_FETCH_ALL_CONVERTER = """
SELECT
playlist_id,
playlist_name,
scope_id,
author_id,
playlist_url,
tracks
FROM
playlists
WHERE
(
scope_type = :scope_type
AND
(
playlist_id = :playlist_id
OR
LOWER(playlist_name) LIKE "%" || COALESCE(LOWER(:playlist_name), "") || "%"
)
AND deleted = false
)
;
"""
PLAYLIST_FETCH = """
SELECT
playlist_id,
playlist_name,
scope_id,
author_id,
playlist_url,
tracks
FROM
playlists
WHERE
(
scope_type = :scope_type
AND playlist_id = :playlist_id
AND scope_id = :scope_id
AND deleted = false
)
"""
PLAYLIST_UPSERT = """
INSERT INTO
playlists ( scope_type, playlist_id, playlist_name, scope_id, author_id, playlist_url, tracks )
VALUES
(
:scope_type, :playlist_id, :playlist_name, :scope_id, :author_id, :playlist_url, :tracks
)
ON CONFLICT (scope_type, playlist_id, scope_id) DO
UPDATE
SET
playlist_name = excluded.playlist_name,
playlist_url = excluded.playlist_url,
tracks = excluded.tracks;
"""
PLAYLIST_CREATE_INDEX = """
CREATE INDEX IF NOT EXISTS name_index ON playlists (scope_type, playlist_id, playlist_name, scope_id);
"""
# YouTube table statements
YOUTUBE_DROP_TABLE = """
DROP TABLE IF EXISTS youtube;
"""
YOUTUBE_CREATE_TABLE = """
CREATE TABLE IF NOT EXISTS youtube(
id INTEGER PRIMARY KEY AUTOINCREMENT,
track_info TEXT,
youtube_url TEXT,
last_updated INTEGER,
last_fetched INTEGER
);
"""
YOUTUBE_CREATE_INDEX = """
CREATE UNIQUE INDEX IF NOT EXISTS idx_youtube_url
ON youtube (track_info, youtube_url);
"""
YOUTUBE_UPSERT = """INSERT INTO
youtube
(
track_info,
youtube_url,
last_updated,
last_fetched
)
VALUES
(
:track_info,
:track_url,
:last_updated,
:last_fetched
)
ON CONFLICT
(
track_info,
youtube_url
)
DO UPDATE
SET
track_info = excluded.track_info,
last_updated = excluded.last_updated
"""
YOUTUBE_UPDATE = """
UPDATE youtube
SET last_fetched=:last_fetched
WHERE track_info=:track;
"""
YOUTUBE_QUERY = """
SELECT youtube_url, last_updated
FROM youtube
WHERE
track_info=:track
AND last_updated > :maxage
;
"""
YOUTUBE_DELETE_OLD_ENTRIES = """
DELETE FROM youtube
WHERE
last_updated < :maxage;
"""
# Spotify table statements
SPOTIFY_DROP_TABLE = """
DROP TABLE IF EXISTS spotify;
"""
SPOTIFY_CREATE_TABLE = """
CREATE TABLE IF NOT EXISTS spotify(
id TEXT,
type TEXT,
uri TEXT,
track_name TEXT,
artist_name TEXT,
song_url TEXT,
track_info TEXT,
last_updated INTEGER,
last_fetched INTEGER
);
"""
SPOTIFY_CREATE_INDEX = """
CREATE UNIQUE INDEX IF NOT EXISTS idx_spotify_uri
ON spotify (id, type, uri);
"""
SPOTIFY_UPSERT = """INSERT INTO
spotify
(
id, type, uri, track_name, artist_name,
song_url, track_info, last_updated, last_fetched
)
VALUES
(
:id, :type, :uri, :track_name, :artist_name,
:song_url, :track_info, :last_updated, :last_fetched
)
ON CONFLICT
(
id,
type,
uri
)
DO UPDATE
SET
track_name = excluded.track_name,
artist_name = excluded.artist_name,
song_url = excluded.song_url,
track_info = excluded.track_info,
last_updated = excluded.last_updated;
"""
SPOTIFY_UPDATE = """
UPDATE spotify
SET last_fetched=:last_fetched
WHERE uri=:uri;
"""
SPOTIFY_QUERY = """
SELECT track_info, last_updated
FROM spotify
WHERE
uri=:uri
AND last_updated > :maxage;
"""
SPOTIFY_DELETE_OLD_ENTRIES = """
DELETE FROM spotify
WHERE
last_updated < :maxage;
"""
# Lavalink table statements
LAVALINK_DROP_TABLE = """
DROP TABLE IF EXISTS lavalink ;
"""
LAVALINK_CREATE_TABLE = """
CREATE TABLE IF NOT EXISTS lavalink(
query TEXT,
data JSON,
last_updated INTEGER,
last_fetched INTEGER
);
"""
LAVALINK_CREATE_INDEX = """
CREATE UNIQUE INDEX IF NOT EXISTS idx_lavalink_query
ON lavalink (query);
"""
LAVALINK_UPSERT = """INSERT INTO
lavalink
(
query,
data,
last_updated,
last_fetched
)
VALUES
(
:query,
:data,
:last_updated,
:last_fetched
)
ON CONFLICT
(
query
)
DO UPDATE
SET
data = excluded.data,
last_updated = excluded.last_updated;
"""
LAVALINK_UPDATE = """
UPDATE lavalink
SET last_fetched=:last_fetched
WHERE query=:query;
"""
LAVALINK_QUERY = """
SELECT data, last_updated
FROM lavalink
WHERE
query=:query
AND last_updated > :maxage;
"""
LAVALINK_QUERY_LAST_FETCHED_RANDOM = """
SELECT data
FROM lavalink
WHERE
last_fetched > :day
AND last_updated > :maxage
ORDER BY RANDOM()
LIMIT 10
;
"""
LAVALINK_DELETE_OLD_ENTRIES = """
DELETE FROM lavalink
WHERE
last_updated < :maxage;
"""
LAVALINK_FETCH_ALL_ENTRIES_GLOBAL = """
SELECT query, data
FROM lavalink
"""

View File

@ -1,9 +1,10 @@
import asyncio
import contextlib
import functools
import os
import re
import time
from enum import Enum, unique
from typing import MutableMapping, Optional, TYPE_CHECKING
from urllib.parse import urlparse
import discord
@ -11,13 +12,14 @@ import lavalink
from redbot.core import Config, commands
from redbot.core.bot import Red
from redbot.core.i18n import Translator
from redbot.core.utils.chat_formatting import bold, box
from discord.utils import escape_markdown as escape
from . import audio_dataclasses
from .converters import _pass_config_to_converters
from .playlists import _pass_config_to_playlist
from .audio_dataclasses import Query
__all__ = [
"pass_config_to_dependencies",
"_pass_config_to_utils",
"track_limit",
"queue_duration",
"draw_time",
@ -26,35 +28,45 @@ __all__ = [
"clear_react",
"match_yt_playlist",
"remove_react",
"get_description",
"get_track_description",
"track_creator",
"time_convert",
"url_check",
"userlimit",
"is_allowed",
"track_to_json",
"rgetattr",
"humanize_scope",
"CacheLevel",
"format_playlist_picker_data",
"get_track_description_unformatted",
"Notifier",
"PlaylistScope",
]
_re_time_converter = re.compile(r"(?:(\d+):)?([0-5]?[0-9]):([0-5][0-9])")
re_yt_list_playlist = re.compile(
_RE_TIME_CONVERTER = re.compile(r"(?:(\d+):)?([0-5]?[0-9]):([0-5][0-9])")
_RE_YT_LIST_PLAYLIST = re.compile(
r"^(https?://)?(www\.)?(youtube\.com|youtu\.?be)(/playlist\?).*(list=)(.*)(&|$)"
)
_config = None
_bot = None
if TYPE_CHECKING:
_config: Config
_bot: Red
else:
_config = None
_bot = None
_ = Translator("Audio", __file__)
def pass_config_to_dependencies(config: Config, bot: Red, localtracks_folder: str):
global _bot, _config
_bot = bot
_config = config
_pass_config_to_playlist(config, bot)
_pass_config_to_converters(config, bot)
audio_dataclasses._pass_config_to_dataclasses(config, bot, localtracks_folder)
def _pass_config_to_utils(config: Config, bot: Red) -> None:
global _config, _bot
if _config is None:
_config = config
if _bot is None:
_bot = bot
def track_limit(track, maxlength):
def track_limit(track, maxlength) -> bool:
try:
length = round(track.length / 1000)
except AttributeError:
@ -65,16 +77,33 @@ def track_limit(track, maxlength):
return True
async def is_allowed(guild: discord.Guild, query: str):
async def is_allowed(guild: discord.Guild, query: str, query_obj: Query = None) -> bool:
query = query.lower().strip()
whitelist = set(await _config.guild(guild).url_keyword_whitelist())
if whitelist:
return any(i in query for i in whitelist)
blacklist = set(await _config.guild(guild).url_keyword_blacklist())
return not any(i in query for i in blacklist)
if query_obj is not None:
query = query_obj.lavalink_query.replace("ytsearch:", "youtubesearch").replace(
"scsearch:", "soundcloudsearch"
)
global_whitelist = set(await _config.url_keyword_whitelist())
global_whitelist = [i.lower() for i in global_whitelist]
if global_whitelist:
return any(i in query for i in global_whitelist)
global_blacklist = set(await _config.url_keyword_blacklist())
global_blacklist = [i.lower() for i in global_blacklist]
if any(i in query for i in global_blacklist):
return False
if guild is not None:
whitelist = set(await _config.guild(guild).url_keyword_whitelist())
whitelist = [i.lower() for i in whitelist]
if whitelist:
return any(i in query for i in whitelist)
blacklist = set(await _config.guild(guild).url_keyword_blacklist())
blacklist = [i.lower() for i in blacklist]
return not any(i in query for i in blacklist)
return True
async def queue_duration(ctx):
async def queue_duration(ctx) -> int:
player = lavalink.get_player(ctx.guild.id)
duration = []
for i in range(len(player.queue)):
@ -94,7 +123,7 @@ async def queue_duration(ctx):
return queue_total_duration
async def draw_time(ctx):
async def draw_time(ctx) -> str:
player = lavalink.get_player(ctx.guild.id)
paused = player.paused
pos = player.position
@ -115,7 +144,7 @@ async def draw_time(ctx):
return msg
def dynamic_time(seconds):
def dynamic_time(seconds) -> str:
m, s = divmod(seconds, 60)
h, m = divmod(m, 60)
d, h = divmod(h, 24)
@ -133,7 +162,19 @@ def dynamic_time(seconds):
return msg.format(d, h, m, s)
def match_url(url):
def format_playlist_picker_data(pid, pname, ptracks, pauthor, scope) -> str:
author = _bot.get_user(pauthor) or pauthor or _("Unknown")
line = _(
" - Name: <{pname}>\n"
" - Scope: < {scope} >\n"
" - ID: < {pid} >\n"
" - Tracks: < {ptracks} >\n"
" - Author: < {author} >\n\n"
).format(pname=pname, scope=humanize_scope(scope), pid=pid, ptracks=ptracks, author=author)
return box(line, lang="md")
def match_url(url) -> bool:
try:
query_url = urlparse(url)
return all([query_url.scheme, query_url.netloc, query_url.path])
@ -141,18 +182,18 @@ def match_url(url):
return False
def match_yt_playlist(url):
if re_yt_list_playlist.match(url):
def match_yt_playlist(url) -> bool:
if _RE_YT_LIST_PLAYLIST.match(url):
return True
return False
async def remove_react(message, react_emoji, react_user):
async def remove_react(message, react_emoji, react_user) -> None:
with contextlib.suppress(discord.HTTPException):
await message.remove_reaction(react_emoji, react_user)
async def clear_react(bot: Red, message: discord.Message, emoji: dict = None):
async def clear_react(bot: Red, message: discord.Message, emoji: MutableMapping = None) -> None:
try:
await message.clear_reactions()
except discord.Forbidden:
@ -166,29 +207,50 @@ async def clear_react(bot: Red, message: discord.Message, emoji: dict = None):
return
async def get_description(track):
if any(x in track.uri for x in [f"{os.sep}localtracks", f"localtracks{os.sep}"]):
local_track = audio_dataclasses.LocalPath(track.uri)
if track.title != "Unknown title":
return "**{} - {}**\n{}".format(
track.author, track.title, local_track.to_string_hidden()
)
def get_track_description(track) -> Optional[str]:
if track and getattr(track, "uri", None):
query = Query.process_input(track.uri)
if query.is_local:
if track.title != "Unknown title":
return f'**{escape(f"{track.author} - {track.title}")}**' + escape(
f"\n{query.to_string_user()} "
)
else:
return escape(query.to_string_user())
else:
return local_track.to_string_hidden()
else:
return "**[{}]({})**".format(track.title, track.uri)
return f'**{escape(f"[{track.title}]({track.uri}) ")}**'
elif hasattr(track, "to_string_user") and track.is_local:
return escape(track.to_string_user() + " ")
def track_creator(player, position=None, other_track=None):
def get_track_description_unformatted(track) -> Optional[str]:
if track and hasattr(track, "uri"):
query = Query.process_input(track.uri)
if query.is_local:
if track.title != "Unknown title":
return escape(f"{track.author} - {track.title}")
else:
return escape(query.to_string_user())
else:
return escape(f"{track.title}")
elif hasattr(track, "to_string_user") and track.is_local:
return escape(track.to_string_user() + " ")
def track_creator(player, position=None, other_track=None) -> MutableMapping:
if position == "np":
queued_track = player.current
elif position is None:
queued_track = other_track
else:
queued_track = player.queue[position]
track_keys = queued_track._info.keys()
track_values = queued_track._info.values()
track_id = queued_track.track_identifier
return track_to_json(queued_track)
def track_to_json(track: lavalink.Track) -> MutableMapping:
track_keys = track._info.keys()
track_values = track._info.values()
track_id = track.track_identifier
track_info = {}
for k, v in zip(track_keys, track_values):
track_info[k] = v
@ -200,8 +262,8 @@ def track_creator(player, position=None, other_track=None):
return track_obj
def time_convert(length):
match = re.compile(_re_time_converter).match(length)
def time_convert(length) -> int:
match = _RE_TIME_CONVERTER.match(length)
if match is not None:
hr = int(match.group(1)) if match.group(1) else 0
mn = int(match.group(2)) if match.group(2) else 0
@ -215,7 +277,7 @@ def time_convert(length):
return 0
def url_check(url):
def url_check(url) -> bool:
valid_tld = [
"youtube.com",
"youtu.be",
@ -235,7 +297,7 @@ def url_check(url):
return True if url_domain in valid_tld else False
def userlimit(channel):
def userlimit(channel) -> bool:
if channel.user_limit == 0 or channel.user_limit > len(channel.members) + 1:
return False
return True
@ -386,7 +448,9 @@ class CacheLevel:
class Notifier:
def __init__(self, ctx: commands.Context, message: discord.Message, updates: dict, **kwargs):
def __init__(
self, ctx: commands.Context, message: discord.Message, updates: MutableMapping, **kwargs
):
self.context = ctx
self.message = message
self.updates = updates
@ -402,8 +466,8 @@ class Notifier:
seconds_key: str = None,
seconds: str = None,
):
"""
This updates an existing message.
"""This updates an existing message.
Based on the message found in :variable:`Notifier.updates` as per the `key` param
"""
if self.last_msg_time + self.cooldown > time.time() and not current == total:
@ -435,3 +499,27 @@ class Notifier:
self.last_msg_time = time.time()
except discord.errors.NotFound:
pass
@unique
class PlaylistScope(Enum):
GLOBAL = "GLOBALPLAYLIST"
GUILD = "GUILDPLAYLIST"
USER = "USERPLAYLIST"
def __str__(self):
return "{0}".format(self.value)
@staticmethod
def list():
return list(map(lambda c: c.value, PlaylistScope))
def humanize_scope(scope, ctx=None, the=None):
if scope == PlaylistScope.GLOBAL.value:
return _("the ") if the else "" + _("Global")
elif scope == PlaylistScope.GUILD.value:
return ctx.name if ctx else _("the ") if the else "" + _("Server")
elif scope == PlaylistScope.USER.value:
return str(ctx) if ctx else _("the ") if the else "" + _("User")
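The rewritten is_allowed() gives the new global keyword lists precedence over the per-guild ones. A hedged sketch of just that decision order, with plain lists standing in for the Config lookups and the Query handling omitted:

def is_allowed_sketch(query, global_whitelist, global_blacklist, whitelist, blacklist) -> bool:
    query = query.lower().strip()
    # A non-empty global whitelist short-circuits everything else.
    global_whitelist = [i.lower() for i in global_whitelist]
    if global_whitelist:
        return any(i in query for i in global_whitelist)
    # The global blacklist is checked before any guild-level list.
    global_blacklist = [i.lower() for i in global_blacklist]
    if any(i in query for i in global_blacklist):
        return False
    # Then the per-guild whitelist, and finally the per-guild blacklist.
    whitelist = [i.lower() for i in whitelist]
    if whitelist:
        return any(i in query for i in whitelist)
    blacklist = [i.lower() for i in blacklist]
    return not any(i in query for i in blacklist)


print(is_allowed_sketch("https://youtube.com/watch?v=x", [], ["rickroll"], [], []))    # True
print(is_allowed_sketch("ytsearch: rickroll", [], ["rickroll"], ["youtube.com"], []))  # False
print(is_allowed_sketch("https://example.com/a", ["youtube.com"], [], [], []))         # False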

View File

@ -678,7 +678,7 @@ class Permissions(commands.Cog):
@staticmethod
def _get_updated_schema(
old_config: _OldConfigSchema
old_config: _OldConfigSchema,
) -> Tuple[_NewConfigSchema, _NewConfigSchema]:
# Prior to 1.0.0, the schema was in this form for both global
# and guild-based rules:

View File

@ -22,7 +22,7 @@ DROP_DDL_SCRIPT_PATH = _PKG_PATH / "drop_ddl.sql"
def encode_identifier_data(
id_data: IdentifierData
id_data: IdentifierData,
) -> Tuple[str, str, str, List[str], List[str], int, bool]:
return (
id_data.cog_name,

View File

@ -38,7 +38,6 @@ install_requires =
Click==7.0
colorama==0.4.1
contextlib2==0.5.5
databases[sqlite]==0.2.5
discord.py==1.2.5
distro==1.4.0; sys_platform == "linux"
fuzzywuzzy==0.17.0

View File

@ -14,7 +14,6 @@ install_requires =
babel
click
colorama
databases[sqlite]
discord.py
distro; sys_platform == "linux"
fuzzywuzzy