PostgreSQL driver, tests against DB backends, and general drivers cleanup (#2723)

* PostgreSQL driver and general drivers cleanup

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>

* Make tests pass

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>

* Add black --target-version flag in make.bat

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>

* Rewrite postgres driver

Most of the logic is now in PL/pgSQL.

This completely avoids the use of Python f-strings to format identifiers into queries. Although an SQL-injection attack would have been impossible anyway (only the owner would have ever had the ability to do that), using PostgreSQL's format() is more reliable for unusual identifiers. Performance-wise, I'm not sure whether this is an improvement, but I highly doubt that it's worse.

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>
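
(A minimal sketch of the idea above, assuming asyncpg as the client library; red_drop_table and the surrounding names are illustrative, not the driver's actual schema. The point is that the identifier travels to the server as an ordinary query parameter and is quoted there with format('%I'), so no identifier is ever formatted into SQL text in Python.)

    import asyncpg

    CREATE_DROP_FN = """
    CREATE OR REPLACE FUNCTION red_drop_table(table_name text) RETURNS void AS $$
    BEGIN
        -- format('%I') quotes the identifier on the server side.
        EXECUTE format('DROP TABLE IF EXISTS %I', table_name);
    END
    $$ LANGUAGE plpgsql;
    """

    async def drop_cog_table(conn: asyncpg.Connection, table_name: str) -> None:
        await conn.execute(CREATE_DROP_FN)
        # The identifier is passed as a bind parameter, never via f-strings.
        await conn.execute("SELECT red_drop_table($1)", table_name)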

* Reformat

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>

* Fix PostgresDriver.delete_all_data()

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>

* Clean up PL/pgSQL code

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>

* More PL/pgSQL cleanup

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>

* PL/pgSQL function optimisations

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>

* Ensure compatibility with PostgreSQL 10 and below

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>

* More/better docstrings for PG functions

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>

* Fix typo in docstring

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>

* Return correct value on toggle()

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>

* Use composite type for PG function parameters

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>
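
(An illustrative sketch of the composite-type approach, again assuming asyncpg; red_config_id and red_describe are invented for the example and are not the driver's real definitions. A single composite parameter replaces a long positional argument list, and asyncpg can encode a plain tuple as the composite value.)

    import asyncpg

    SETUP = """
    CREATE TYPE red_config_id AS (
        cog_name text,
        category text,
        pkeys    text[]
    );

    CREATE OR REPLACE FUNCTION red_describe(id red_config_id) RETURNS text AS $$
    BEGIN
        -- Fields of the composite parameter are read with dot notation.
        RETURN format('%s/%s/%s', id.cog_name, id.category, array_to_string(id.pkeys, '.'));
    END
    $$ LANGUAGE plpgsql;
    """

    async def describe(conn: asyncpg.Connection, cog: str, category: str, pkeys: list) -> str:
        await conn.execute(SETUP)
        # asyncpg encodes a plain tuple as the composite value, in field order.
        return await conn.fetchval("SELECT red_describe($1)", (cog, category, pkeys))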

* Fix JSON driver's Config.clear_all()

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>

* Correct description for Mongo tox recipe

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>

* Fix linting errors

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>

* Update dep specification after merging bumpdeps

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>

* Add towncrier entries

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>

* Update from merge

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>

* Mention [postgres] extra in install docs

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>

* Support more connection options and use better defaults

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>
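
(A rough sketch of the kind of defaults meant here, assuming asyncpg and the standard libpq environment variables; the exact option names the driver reads are not shown in this diff.)

    import os
    import asyncpg

    async def create_pool() -> asyncpg.pool.Pool:
        return await asyncpg.create_pool(
            host=os.environ.get("PGHOST", "localhost"),
            port=int(os.environ.get("PGPORT", "5432")),
            user=os.environ.get("PGUSER", "postgres"),
            password=os.environ.get("PGPASSWORD"),
            database=os.environ.get("PGDATABASE", "postgres"),
        )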

* Actually pass PG env vars in tox

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>

* Replace event trigger with manual DELETE queries

Signed-off-by: Toby Harradine <tobyharradine@gmail.com>
Toby Harradine
2019-08-27 12:02:26 +10:00
committed by Michael H
parent 57fa29dd64
commit d1a46acc9a
34 changed files with 2282 additions and 843 deletions

View File

@@ -1,7 +1,9 @@
import asyncio
import json
import logging
import os
import shutil
import tarfile
from asyncio import AbstractEventLoop, as_completed, Semaphore
from asyncio.futures import isfuture
from itertools import chain
@@ -24,8 +26,10 @@ from typing import (
)
import discord
from datetime import datetime
from fuzzywuzzy import fuzz, process
from .. import commands, data_manager
from .chat_formatting import box
if TYPE_CHECKING:
@@ -37,6 +41,7 @@ __all__ = [
"fuzzy_command_search",
"format_fuzzy_results",
"deduplicate_iterables",
"create_backup",
]
_T = TypeVar("_T")
@@ -397,3 +402,45 @@ def bounded_gather(
    tasks = (_sem_wrapper(semaphore, task) for task in coros_or_futures)
    return asyncio.gather(*tasks, loop=loop, return_exceptions=return_exceptions)

async def create_backup(dest: Path = Path.home()) -> Optional[Path]:
    data_path = Path(data_manager.core_data_path().parent)
    if not data_path.exists():
        return
    dest.mkdir(parents=True, exist_ok=True)
    timestr = datetime.utcnow().isoformat(timespec="minutes")
    backup_fpath = dest / f"redv3_{data_manager.instance_name}_{timestr}.tar.gz"

    to_backup = []
    # Paths containing any of these fragments are left out of the archive.
    exclusions = [
        "__pycache__",
        "Lavalink.jar",
        os.path.join("Downloader", "lib"),
        os.path.join("CogManager", "cogs"),
        os.path.join("RepoManager", "repos"),
    ]

    # Avoiding circular imports
    from ...cogs.downloader.repo_manager import RepoManager

    repo_mgr = RepoManager()
    await repo_mgr.initialize()
    repo_output = []
    for _, repo in repo_mgr._repos:
        repo_output.append({"url": repo.url, "name": repo.name, "branch": repo.branch})
    # Record installed repos and the instance config so they can be restored later.
    repos_file = data_path / "cogs" / "RepoManager" / "repos.json"
    with repos_file.open("w") as fs:
        json.dump(repo_output, fs, indent=4)
    instance_file = data_path / "instance.json"
    with instance_file.open("w") as fs:
        json.dump({data_manager.instance_name: data_manager.basic_config}, fs, indent=4)

    for f in data_path.glob("**/*"):
        if not any(ex in str(f) for ex in exclusions) and f.is_file():
            to_backup.append(f)

    # Archive everything else, keeping paths relative to the data directory.
    with tarfile.open(str(backup_fpath), "w:gz") as tar:
        for f in to_backup:
            tar.add(str(f), arcname=f.relative_to(data_path), recursive=False)
    return backup_fpath
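
(For reference, a possible call site for the new create_backup helper; the destination path is illustrative.)

    backup_path = await create_backup(Path.home() / "red_backups")
    if backup_path is not None:
        print(f"Backup written to {backup_path}")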

View File

@@ -89,7 +89,7 @@ class Tunnel(metaclass=TunnelMeta):
        destination: discord.abc.Messageable,
        content: str = None,
        embed=None,
        files: Optional[List[discord.File]] = None
        files: Optional[List[discord.File]] = None,
    ) -> List[discord.Message]:
        """
        This does the actual sending, use this instead of a full tunnel