From e2c8b1100870511aaa575ce807f9009e5a5dba99 Mon Sep 17 00:00:00 2001
From: jack1142 <6032823+jack1142@users.noreply.github.com>
Date: Fri, 8 Nov 2019 02:36:16 +0100
Subject: [PATCH] [V3 Downloader] Revision tracking (#2571)

* feat(downloader): Install cog from specific commit in repo (initial commit)

- Repo and Installable now have a commit property
- New class inheriting from Installable - InstalledCog (old one from converters.py removed)
- New Repo.checkout() method, which is also an async ctx manager

ref #2527

* fix(downloader): Keep information about repo's branch in config

- This is needed to make sure that the repo can go back from a detached state in some rare unexpected cases
- current branch is determined by `git symbolic-ref` now, as this command errors for a detached HEAD

* feat(downloader): Update repo without cogs, update single cog

The most important part of issue #2527 has been added here:

- `[p]repo update` command added
- new conf format - nested dictionary repo_name->cog_name->cog_json; installed libraries are now kept in conf too
- `InstalledCog` renamed to `InstalledModule` - installed libraries use this class
- `Downloader.installed_libraries()` and `Downloader.installed_modules()` added
- `Downloader._add_to_installed()` and `Downloader._remove_from_installed()` now accept a list of modules, of both cogs and libraries
- `[p]cog install` reports failures when copying cogs and installing shared libraries
- `[p]cog update` will truly update only the chosen cogs (if provided) or cogs that need an update - pinned cogs aren't checked
- before an update, repos are updated - `Repo.get_modified_modules()` is used to determine if an update is needed
- `[p]cog pin` and `[p]cog unpin` commands for pinning/unpinning cogs added
- `Repo.checkout()` allows choosing the ctx manager exit's checkout revision
- `Repo.install_cog()` returns `InstalledModule` now and raises CopyingError (maybe breaking?)
- `Repo.install_libraries()` returns a 2-tuple of installed and failed libraries (maybe breaking?)
- `RepoManager.get_all_cogs()` added, which returns cogs from all repos
- `RepoManager.repos` property added, which contains a tuple of `Repo`

* test(downloader): Repo.current_branch() throws an exception when the branch can't be determined

* style(downloader): rename _add_to_installed to _save_to_installed

This method is used for both adding and updating existing modules in Config

* refactor(downloader): add ctx.typing() for a few commands

`[p]cog install` is nested hell, can't wait to move the install logic to a separate method

* fix(downloader): refactor and fix `set` usage

* perf(downloader): update commits for ALL checked modules to omit diffs next time

This will also disable running git diff for cogs that have the same commit as the latest one

* style(downloader): a few style improvements

- use of a mutable object in a method definition
- make Repo._get_full_sha1() a public method
- overly long line
- don't use len to check if a sequence is empty

* feat(downloader): add `[p]cog updateallfromrepos` and `[p]cog updatetoversion` commands

- moved cog update logic into `Downloader._cog_update_logic()` (for lack of a better name)
- split the whole cog update process into smaller methods - might still need some improvements
- added new methods to `Repo` class:
  - `is_on_branch()` to check if the repo is currently checked out to a branch
  - `is_ancestor()` to check if one commit is an ancestor of the other
- fix for `Downloader._available_updates()` behaviour broken by commit 5755ab08ba67556b3863e907c6f44d80f4f13d88

* feat(downloader): try to find last commit where module is still present

Enhancements:

- `Installable` now has a `repo` attribute containing the repo object, or `None` if the repo is missing
- `Downloader._install_cogs()` and `Downloader._reinstall_libraries()` are able to install modules from different commits of a repo
- `Repo.checkout()` as ctx manager will now exit to the commit which was active before checking out
- unification of `rev` and `hash` terms: all function parameters are explicitly called `hash` if they can only be a commit's full SHA-1 hash, or `rev` if they can be anything that names a commit object, see [link](https://git-scm.com/docs/git-rev-parse#_specifying_revisions)
- new `Repo.get_last_module_occurence()` method, which gets a module's Installable from the last commit in which it still occurs

* docs(downloader): Add basic description for `InstalledModule`

* fix(downloader): cog ignored during updates if its commit was missing

After the config format update, the commit string is empty until an update; when such a cog was checked and it wasn't available in the repo anymore, it was ignored

* refactor(downloader): Installing cogs from specific rev will pin them

* perf(downloader): Don't checkout when current commit equals target hash

- changes to `Repo.checkout()`:
  - `exit_to_rev` is now a keyword-only argument
  - added `force_checkout` to force checkout even if the `Repo.commit` value is the same as the target hash

* refactor(downloader): Repo._run() stderr is redirected to debug log now

- added two keyword arguments:
  - `valid_exit_codes`, which specifies valid exit codes, used to determine if stderr should be sent at debug or error level in logging
  - `debug_only`, which specifies if stderr can only be sent at debug level in logging

* style(downloader): stop using `set` as arg name in `_load_repos()`

* feat(downloader): pass multiple cogs to `[p]cog (un)pin`

* refactor(downloader): accept module name instead of instance, fix spelling

* style(downloader): few small style changes

* fix(downloader): add type annotations + fixes based on them

- fix wrong type annotations and add a lot of new ones
- add checks for `Installable.repo` being `None`
- fix wrong return type in `Downloader._install_requirements`
- show repo names correctly when updating all repos
- fix error when some requirement fails to install

BREAKING CHANGE:
- type of `Repo.available_modules` is now consistent (always `tuple`)

* tests: use same event loop policy as in Red's code

* enhance(downloader): fully handle ambiguous revisions

* build(deps): add pytest-mock dependency to tests extra

* fix(downloader): minor fixes

* feat(downloader): add tool for editing Downloader's test repo

This script aims to help update the human-readable version of the repo used for git integration tests in ``redbot/tests/downloader_testrepo.export`` by exporting/importing it to/from a provided directory.

Note
----
Editing the `downloader_git_test_repo.export` file manually is strongly discouraged, especially editing any part of the commit directives, as that changes the commit's hash. Another problem devs could encounter when trying to manually edit that file is editors that use CRLF instead of LF for the new line character(s) and therefore break it. I also used `.gitattributes` to prevent autocrlf from breaking the test repo. Also, if Git ever changes the currently used SHA-1 to SHA-256, we will have to update old hashes with new ones. But that's a small drawback when we can have a human-readable version of the repo.

Known limitations
-----------------
``git fast-export`` exports commits without GPG signs, so this script disables signing in the repo's config. This also means devs shouldn't use the ``--gpg-sign`` flag in ``git commit`` within the test repo.

* tests(downloader): add git tests and test repo for them

Also added a Markdown file that is even clearer than the export file about what the test repo contains. This is manually created but can be automated at a later date.

* test(downloader): add more tests related to RepoManager

These tests use expected output that is already guaranteed by git tests.

* chore(CODEOWNERS): add jack1142 to Downloader's folders

I know this doesn't actually give any benefit to people who don't have write permission to the repo, but I saw other big fella devs doing this, so I think this might be advisable.
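The conf-format change referred to above (a flat ``installed`` list becoming a nested repo_name->cog_name->cog_json dictionary) implies a one-time migration. A minimal sketch of that transformation, with plain dicts standing in for Red's Config and an illustrative function name (the real migration is `Downloader._schema_0_to_1()` in the diff that follows):

```python
# Sketch of the list -> nested-dict config migration, using plain dicts
# in place of Red's Config; `schema_0_to_1` is an illustrative name.
def schema_0_to_1(old_installed: list) -> dict:
    new_cog_conf: dict = {}
    for cog_json in old_installed:
        repo_name = cog_json["repo_name"]
        module_name = cog_json["cog_name"]
        # The commit is unknown for pre-migration installs; it stays empty
        # until an update fills it in.
        new_cog_conf.setdefault(repo_name, {})[module_name] = {
            "repo_name": repo_name,
            "module_name": module_name,
            "commit": "",
            "pinned": False,
        }
    return new_cog_conf
```

The nested shape makes per-repo and per-cog lookups direct dictionary accesses instead of list scans.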
* enhance(downloader): allow easy schema updates in future

* enhance(downloader): more typing fixes, add comments for clarity

* feat(downloader): add python and bot version check to update process

follow-up on #2605, this commit fully fixes #1866

* chore(changelog): add towncrier entries

* fix(downloader): hot-reload issue

- `InstallableType` now inherits from `IntEnum`

There's a desync of `InstallableType` class types due to hot-reload, and `IntEnum` allows equality checks between different types

* fix(downloader): use `*args` instead of `commands.Greedy`

* enhance(downloader): ensure there's no cog with same name installed

should fix #2927

* fix(downloader): last few changes before marking as ready for review
---
 .github/CODEOWNERS                        |    5 +-
 changelog.d/2571.misc.rst                 |    1 +
 changelog.d/downloader/1866.enhance.rst   |    1 +
 changelog.d/downloader/2527.docs.rst      |    1 +
 changelog.d/downloader/2527.enhance.1.rst |    1 +
 changelog.d/downloader/2527.enhance.2.rst |    1 +
 changelog.d/downloader/2527.feature.1.rst |    1 +
 changelog.d/downloader/2527.feature.2.rst |    1 +
 changelog.d/downloader/2527.feature.3.rst |    1 +
 changelog.d/downloader/2527.feature.4.rst |    1 +
 changelog.d/downloader/2527.feature.5.rst |    1 +
 changelog.d/downloader/2527.feature.6.rst |    1 +
 changelog.d/downloader/2527.misc.1.rst    |    4 +
 changelog.d/downloader/2571.bugfix.1.rst  |    1 +
 changelog.d/downloader/2571.bugfix.2.rst  |    1 +
 changelog.d/downloader/2571.dep.rst       |    1 +
 changelog.d/downloader/2571.enhance.rst   |    1 +
 changelog.d/downloader/2571.misc.rst      |    1 +
 changelog.d/downloader/2927.bugfix.rst    |    1 +
 docs/framework_downloader.rst             |    6 +
 redbot/__init__.py                        |   23 +-
 redbot/__main__.py                        |   14 +-
 redbot/cogs/downloader/checks.py          |    2 +-
 redbot/cogs/downloader/converters.py      |    6 +-
 redbot/cogs/downloader/downloader.py      | 1114 ++++++++++++++++-----
 redbot/cogs/downloader/errors.py          |   38 +
 redbot/cogs/downloader/installable.py     |  107 +-
 redbot/cogs/downloader/json_mixins.py     |   13 +-
 redbot/cogs/downloader/repo_manager.py    |  679 +++++++++++--
 redbot/pytest/.gitattributes              |    1 +
 redbot/pytest/downloader.py               |  136 ++-
 redbot/pytest/downloader_testrepo.export  |  134 +++
 redbot/pytest/downloader_testrepo.md      |  102 ++
 setup.cfg                                 |    2 +
 tests/cogs/downloader/test_downloader.py  |  335 ++++++-
 tests/cogs/downloader/test_git.py         |  452 +++++++++
 tests/cogs/downloader/test_installable.py |    8 +-
 tests/conftest.py                         |    3 +
 tools/edit_testrepo.py                    |  172 ++++
 tools/primary_deps.ini                    |    1 +
 40 files changed, 2922 insertions(+), 452 deletions(-)
 create mode 100644 changelog.d/2571.misc.rst
 create mode 100644 changelog.d/downloader/1866.enhance.rst
 create mode 100644 changelog.d/downloader/2527.docs.rst
 create mode 100644 changelog.d/downloader/2527.enhance.1.rst
 create mode 100644 changelog.d/downloader/2527.enhance.2.rst
 create mode 100644 changelog.d/downloader/2527.feature.1.rst
 create mode 100644 changelog.d/downloader/2527.feature.2.rst
 create mode 100644 changelog.d/downloader/2527.feature.3.rst
 create mode 100644 changelog.d/downloader/2527.feature.4.rst
 create mode 100644 changelog.d/downloader/2527.feature.5.rst
 create mode 100644 changelog.d/downloader/2527.feature.6.rst
 create mode 100644 changelog.d/downloader/2527.misc.1.rst
 create mode 100644 changelog.d/downloader/2571.bugfix.1.rst
 create mode 100644 changelog.d/downloader/2571.bugfix.2.rst
 create mode 100644 changelog.d/downloader/2571.dep.rst
 create mode 100644 changelog.d/downloader/2571.enhance.rst
 create mode 100644 changelog.d/downloader/2571.misc.rst
 create mode 100644 changelog.d/downloader/2927.bugfix.rst
 create mode 100644 redbot/pytest/.gitattributes
 create mode 100644 redbot/pytest/downloader_testrepo.export
 create mode 100644 redbot/pytest/downloader_testrepo.md
 create mode 100644 tests/cogs/downloader/test_git.py
 create mode 100644 tools/edit_testrepo.py

diff --git a/.github/CODEOWNERS b/.github/CODEOWNERS
index ef2d440ba..7dac8bd23 100644
--- a/.github/CODEOWNERS
+++ b/.github/CODEOWNERS
@@ -33,7 +33,7 @@ redbot/cogs/audio/* @aikaterna
 redbot/cogs/bank/* @tekulvw
 redbot/cogs/cleanup/* @palmtree5
 redbot/cogs/customcom/* @palmtree5
-redbot/cogs/downloader/* @tekulvw
+redbot/cogs/downloader/* @tekulvw @jack1142
 redbot/cogs/economy/* @palmtree5
 redbot/cogs/filter/* @palmtree5
 redbot/cogs/general/* @palmtree5
@@ -49,6 +49,9 @@ redbot/cogs/warnings/* @palmtree5
 # Docs
 docs/* @tekulvw @palmtree5
 
+# Tests
+tests/cogs/downloader/* @jack1142
+
 # Setup, instance setup, and running the bot
 setup.py @tekulvw
 redbot/__init__.py @tekulvw
diff --git a/changelog.d/2571.misc.rst b/changelog.d/2571.misc.rst
new file mode 100644
index 000000000..f071ffc3f
--- /dev/null
+++ b/changelog.d/2571.misc.rst
@@ -0,0 +1 @@
+Tests now use same event loop policy as Red's code.
\ No newline at end of file
diff --git a/changelog.d/downloader/1866.enhance.rst b/changelog.d/downloader/1866.enhance.rst
new file mode 100644
index 000000000..3c75d90ca
--- /dev/null
+++ b/changelog.d/downloader/1866.enhance.rst
@@ -0,0 +1 @@
+Downloader will now check if Python and bot version match requirements in ``info.json`` during update.
\ No newline at end of file
diff --git a/changelog.d/downloader/2527.docs.rst b/changelog.d/downloader/2527.docs.rst
new file mode 100644
index 000000000..3b4b47f61
--- /dev/null
+++ b/changelog.d/downloader/2527.docs.rst
@@ -0,0 +1 @@
+Added :func:`redbot.cogs.downloader.repo_manager.InstalledModule` to Downloader's framework docs.
\ No newline at end of file
diff --git a/changelog.d/downloader/2527.enhance.1.rst b/changelog.d/downloader/2527.enhance.1.rst
new file mode 100644
index 000000000..5666bd733
--- /dev/null
+++ b/changelog.d/downloader/2527.enhance.1.rst
@@ -0,0 +1 @@
+User can now pass multiple cog names to ``[p]cog install``.
\ No newline at end of file
diff --git a/changelog.d/downloader/2527.enhance.2.rst b/changelog.d/downloader/2527.enhance.2.rst
new file mode 100644
index 000000000..23b990d6d
--- /dev/null
+++ b/changelog.d/downloader/2527.enhance.2.rst
@@ -0,0 +1 @@
+When passing cogs to ``[p]cog update`` command, it will now only update those cogs, not all cogs from the repo these cogs are from.
\ No newline at end of file
diff --git a/changelog.d/downloader/2527.feature.1.rst b/changelog.d/downloader/2527.feature.1.rst
new file mode 100644
index 000000000..342b8910e
--- /dev/null
+++ b/changelog.d/downloader/2527.feature.1.rst
@@ -0,0 +1 @@
+Added ``[p]repo update [repos]`` command that allows you to update repos without updating cogs from them.
\ No newline at end of file
diff --git a/changelog.d/downloader/2527.feature.2.rst b/changelog.d/downloader/2527.feature.2.rst
new file mode 100644
index 000000000..116d40435
--- /dev/null
+++ b/changelog.d/downloader/2527.feature.2.rst
@@ -0,0 +1 @@
+Added ``[p]cog installversion `` command that allows you to install cogs from specified revision (commit, tag, branch) of given repo. When using this command, cog will automatically be pinned.
\ No newline at end of file
diff --git a/changelog.d/downloader/2527.feature.3.rst b/changelog.d/downloader/2527.feature.3.rst
new file mode 100644
index 000000000..b9567b967
--- /dev/null
+++ b/changelog.d/downloader/2527.feature.3.rst
@@ -0,0 +1 @@
+Added ``[p]cog pin `` and ``[p]cog unpin `` for pinning cogs. Cogs that are pinned will not be updated when using update commands.
\ No newline at end of file
diff --git a/changelog.d/downloader/2527.feature.4.rst b/changelog.d/downloader/2527.feature.4.rst
new file mode 100644
index 000000000..c50b44623
--- /dev/null
+++ b/changelog.d/downloader/2527.feature.4.rst
@@ -0,0 +1 @@
+Added ``[p]cog checkforupdates`` command that will tell which cogs can be updated (including pinned cog) without updating them.
\ No newline at end of file
diff --git a/changelog.d/downloader/2527.feature.5.rst b/changelog.d/downloader/2527.feature.5.rst
new file mode 100644
index 000000000..dc08620a0
--- /dev/null
+++ b/changelog.d/downloader/2527.feature.5.rst
@@ -0,0 +1 @@
+Added ``[p]cog updateallfromrepos `` command that will update all cogs from given repos.
\ No newline at end of file
diff --git a/changelog.d/downloader/2527.feature.6.rst b/changelog.d/downloader/2527.feature.6.rst
new file mode 100644
index 000000000..d5eb16a8e
--- /dev/null
+++ b/changelog.d/downloader/2527.feature.6.rst
@@ -0,0 +1 @@
+Added ``[p]cog updatetoversion [cogs]`` command that updates all cogs or ones of user's choosing to chosen revision of given repo.
\ No newline at end of file
diff --git a/changelog.d/downloader/2527.misc.1.rst b/changelog.d/downloader/2527.misc.1.rst
new file mode 100644
index 000000000..8b1d6db9e
--- /dev/null
+++ b/changelog.d/downloader/2527.misc.1.rst
@@ -0,0 +1,4 @@
+Added :func:`redbot.cogs.downloader.installable.InstalledModule` which is used instead of :func:`redbot.cogs.downloader.installable.Installable` when we refer to installed cog or shared library.
+Therefore:
+ - ``to_json`` and ``from_json`` methods were moved from :func:`redbot.cogs.downloader.installable.Installable` to :func:`redbot.cogs.downloader.installable.InstalledModule`
+ - return types changed for :func:`redbot.cogs.converter.InstalledCog.convert`, :func:`redbot.cogs.downloader.Downloader.installed_cogs`, :func:`redbot.cogs.downloader.Repo.install_cog` to use :func:`redbot.cogs.downloader.installable.InstalledModule`.
\ No newline at end of file
diff --git a/changelog.d/downloader/2571.bugfix.1.rst b/changelog.d/downloader/2571.bugfix.1.rst
new file mode 100644
index 000000000..342f97e42
--- /dev/null
+++ b/changelog.d/downloader/2571.bugfix.1.rst
@@ -0,0 +1 @@
+Made regex for repo names use raw string to stop ``DeprecationWarning`` about invalid escape sequence.
\ No newline at end of file
diff --git a/changelog.d/downloader/2571.bugfix.2.rst b/changelog.d/downloader/2571.bugfix.2.rst
new file mode 100644
index 000000000..74bf663c0
--- /dev/null
+++ b/changelog.d/downloader/2571.bugfix.2.rst
@@ -0,0 +1 @@
+Downloader will no longer allow to install cog that is already installed.
\ No newline at end of file
diff --git a/changelog.d/downloader/2571.dep.rst b/changelog.d/downloader/2571.dep.rst
new file mode 100644
index 000000000..34ab16ae6
--- /dev/null
+++ b/changelog.d/downloader/2571.dep.rst
@@ -0,0 +1 @@
+Added ``pytest-mock`` requirement to ``tests`` extra.
\ No newline at end of file
diff --git a/changelog.d/downloader/2571.enhance.rst b/changelog.d/downloader/2571.enhance.rst
new file mode 100644
index 000000000..bc65a7a84
--- /dev/null
+++ b/changelog.d/downloader/2571.enhance.rst
@@ -0,0 +1 @@
+Added error messages for failures during installing/reinstalling requirements and copying cogs and shared libraries.
\ No newline at end of file
diff --git a/changelog.d/downloader/2571.misc.rst b/changelog.d/downloader/2571.misc.rst
new file mode 100644
index 000000000..0acd4199f
--- /dev/null
+++ b/changelog.d/downloader/2571.misc.rst
@@ -0,0 +1 @@
+Added more Downloader tests for Repo logic and git integration. New git tests use a test repo file that can be generated using new tool at ``tools/edit_testrepo.py``.
\ No newline at end of file
diff --git a/changelog.d/downloader/2927.bugfix.rst b/changelog.d/downloader/2927.bugfix.rst
new file mode 100644
index 000000000..c79b243e3
--- /dev/null
+++ b/changelog.d/downloader/2927.bugfix.rst
@@ -0,0 +1 @@
+Downloader will no longer allow to install cog with same name as other that is installed.
\ No newline at end of file
diff --git a/docs/framework_downloader.rst b/docs/framework_downloader.rst
index 6d4a5a45b..e7b3de700 100644
--- a/docs/framework_downloader.rst
+++ b/docs/framework_downloader.rst
@@ -68,6 +68,12 @@ Installable
 .. autoclass:: Installable
     :members:
 
+InstalledModule
+^^^^^^^^^^^^^^^
+
+.. autoclass:: InstalledModule
+    :members:
+
 .. automodule:: redbot.cogs.downloader.repo_manager
 
 Repo
diff --git a/redbot/__init__.py b/redbot/__init__.py
index 2f62106d6..a7982024c 100644
--- a/redbot/__init__.py
+++ b/redbot/__init__.py
@@ -1,3 +1,4 @@
+import asyncio as _asyncio
 import re as _re
 import sys as _sys
 import warnings as _warnings
@@ -15,8 +16,13 @@ from typing import (
 
 MIN_PYTHON_VERSION = (3, 7, 0)
 
-__all__ = ["MIN_PYTHON_VERSION", "__version__", "version_info", "VersionInfo"]
-
+__all__ = [
+    "MIN_PYTHON_VERSION",
+    "__version__",
+    "version_info",
+    "VersionInfo",
+    "_update_event_loop_policy",
+]
 if _sys.version_info < MIN_PYTHON_VERSION:
     print(
         f"Python {'.'.join(map(str, MIN_PYTHON_VERSION))} is required to run Red, but you have "
@@ -173,6 +179,19 @@ class VersionInfo:
         )
 
 
+def _update_event_loop_policy():
+    if _sys.platform == "win32":
+        _asyncio.set_event_loop_policy(_asyncio.WindowsProactorEventLoopPolicy())
+    elif _sys.implementation.name == "cpython":
+        # Let's not force this dependency, uvloop is much faster on cpython
+        try:
+            import uvloop as _uvloop
+        except ImportError:
+            pass
+        else:
+            _asyncio.set_event_loop_policy(_uvloop.EventLoopPolicy())
+
+
 __version__ = "3.1.7"
 
 version_info = VersionInfo.from_str(__version__)
diff --git a/redbot/__main__.py b/redbot/__main__.py
index a195e26a4..2947a9edc 100644
--- a/redbot/__main__.py
+++ b/redbot/__main__.py
@@ -13,17 +13,9 @@ import discord
 # Set the event loop policies here so any subsequent `get_event_loop()`
 # calls, in particular those as a result of the following imports,
 # return the correct loop object.
-if sys.platform == "win32":
-    asyncio.set_event_loop_policy(asyncio.WindowsProactorEventLoopPolicy())
-elif sys.implementation.name == "cpython":
-    # Let's not force this dependency, uvloop is much faster on cpython
-    try:
-        import uvloop
-    except ImportError:
-        uvloop = None
-        pass
-    else:
-        asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
+from redbot import _update_event_loop_policy
+
+_update_event_loop_policy()
 
 import redbot.logging
 from redbot.core.bot import Red, ExitCodes
diff --git a/redbot/cogs/downloader/checks.py b/redbot/cogs/downloader/checks.py
index cb86a8d4e..55b7327f0 100644
--- a/redbot/cogs/downloader/checks.py
+++ b/redbot/cogs/downloader/checks.py
@@ -21,7 +21,7 @@ REPO_INSTALL_MSG = _(
 _ = T_
 
 
-async def do_install_agreement(ctx: commands.Context):
+async def do_install_agreement(ctx: commands.Context) -> bool:
     downloader = ctx.cog
     if downloader is None or downloader.already_agreed:
         return True
diff --git a/redbot/cogs/downloader/converters.py b/redbot/cogs/downloader/converters.py
index 54f7522cd..5b7357579 100644
--- a/redbot/cogs/downloader/converters.py
+++ b/redbot/cogs/downloader/converters.py
@@ -1,14 +1,14 @@
 import discord
 from redbot.core import commands
 from redbot.core.i18n import Translator
-from .installable import Installable
+from .installable import InstalledModule
 
 _ = Translator("Koala", __file__)
 
 
-class InstalledCog(Installable):
+class InstalledCog(InstalledModule):
     @classmethod
-    async def convert(cls, ctx: commands.Context, arg: str) -> Installable:
+    async def convert(cls, ctx: commands.Context, arg: str) -> InstalledModule:
         downloader = ctx.bot.get_cog("Downloader")
         if downloader is None:
             raise commands.CommandError(_("No Downloader cog found."))
diff --git a/redbot/cogs/downloader/downloader.py b/redbot/cogs/downloader/downloader.py
index d8a21d00d..be682431a 100644
--- a/redbot/cogs/downloader/downloader.py
+++ b/redbot/cogs/downloader/downloader.py
@@ -5,8 +5,8 @@ import re
 import shutil
 import sys
 from pathlib import Path
-from sys import path as syspath
-from typing import Tuple, Union, Iterable
+from typing import Tuple, Union, Iterable, Optional, Dict, Set, List, cast
+from collections import defaultdict
 
 import discord
 from redbot.core import checks, commands, Config, version_info as red_version_info
@@ -20,7 +20,7 @@ from redbot.core.utils.predicates import MessagePredicate, ReactionPredicate
 from . import errors
 from .checks import do_install_agreement
 from .converters import InstalledCog
-from .installable import Installable
+from .installable import InstallableType, Installable, InstalledModule
 from .log import log
 from .repo_manager import RepoManager, Repo
@@ -35,7 +35,7 @@ class Downloader(commands.Cog):
 
         self.conf = Config.get_conf(self, identifier=998240343, force_registration=True)
 
-        self.conf.register_global(installed=[])
+        self.conf.register_global(schema_version=0, installed_cogs={}, installed_libraries={})
 
         self.already_agreed = False
@@ -50,10 +50,43 @@ class Downloader(commands.Cog):
 
         self._repo_manager = RepoManager()
 
-    async def initialize(self):
+    async def initialize(self) -> None:
         await self._repo_manager.initialize()
+        await self._maybe_update_config()
 
-    async def cog_install_path(self):
+    async def _maybe_update_config(self) -> None:
+        schema_version = await self.conf.schema_version()
+
+        if schema_version == 0:
+            await self._schema_0_to_1()
+            schema_version += 1
+            await self.conf.schema_version.set(schema_version)
+
+    async def _schema_0_to_1(self):
+        """
+        This contains migration to allow saving state
+        of both installed cogs and shared libraries.
+        """
+        old_conf = await self.conf.get_raw("installed", default=[])
+        if not old_conf:
+            return
+        async with self.conf.installed_cogs() as new_cog_conf:
+            for cog_json in old_conf:
+                repo_name = cog_json["repo_name"]
+                module_name = cog_json["cog_name"]
+                if repo_name not in new_cog_conf:
+                    new_cog_conf[repo_name] = {}
+                new_cog_conf[repo_name][module_name] = {
+                    "repo_name": repo_name,
+                    "module_name": module_name,
+                    "commit": "",
+                    "pinned": False,
+                }
+        await self.conf.clear_raw("installed")
+        # no reliable way to get installed libraries (i.a. missing repo name)
+        # but it only helps `[p]cog update` run faster so it's not an issue
+
+    async def cog_install_path(self) -> Path:
         """Get the current cog install path.
 
         Returns
@@ -64,100 +97,273 @@ class Downloader(commands.Cog):
         """
         return await self.bot._cog_mgr.install_path()
 
-    async def installed_cogs(self) -> Tuple[Installable]:
+    async def installed_cogs(self) -> Tuple[InstalledModule, ...]:
         """Get info on installed cogs.
 
         Returns
         -------
-        `tuple` of `Installable`
-            All installed cogs / shared lib directories.
+        `tuple` of `InstalledModule`
+            All installed cogs.
 
         """
-        installed = await self.conf.installed()
+        installed = await self.conf.installed_cogs()
         # noinspection PyTypeChecker
-        return tuple(Installable.from_json(v, self._repo_manager) for v in installed)
+        return tuple(
+            InstalledModule.from_json(cog_json, self._repo_manager)
+            for repo_json in installed.values()
+            for cog_json in repo_json.values()
+        )
 
-    async def _add_to_installed(self, cog: Installable):
-        """Mark a cog as installed.
+    async def installed_libraries(self) -> Tuple[InstalledModule, ...]:
+        """Get info on installed shared libraries.
+
+        Returns
+        -------
+        `tuple` of `InstalledModule`
+            All installed shared libraries.
+
+        """
+        installed = await self.conf.installed_libraries()
+        # noinspection PyTypeChecker
+        return tuple(
+            InstalledModule.from_json(lib_json, self._repo_manager)
+            for repo_json in installed.values()
+            for lib_json in repo_json.values()
+        )
+
+    async def installed_modules(self) -> Tuple[InstalledModule, ...]:
+        """Get info on installed cogs and shared libraries.
+
+        Returns
+        -------
+        `tuple` of `InstalledModule`
+            All installed cogs and shared libraries.
+
+        """
+        return await self.installed_cogs() + await self.installed_libraries()
+
+    async def _save_to_installed(self, modules: Iterable[InstalledModule]) -> None:
+        """Mark modules as installed or updates their json in Config.
 
         Parameters
         ----------
-        cog : Installable
-            The cog to check off.
+        modules : `list` of `InstalledModule`
+            The modules to check off.
 
         """
-        installed = await self.conf.installed()
-        cog_json = cog.to_json()
+        installed_cogs = await self.conf.installed_cogs()
+        installed_libraries = await self.conf.installed_libraries()
+        for module in modules:
+            if module.type == InstallableType.COG:
+                installed = installed_cogs
+            elif module.type == InstallableType.SHARED_LIBRARY:
+                installed = installed_libraries
+            else:
+                continue
+            module_json = module.to_json()
+            repo_json = installed.setdefault(module.repo_name, {})
+            repo_json[module.name] = module_json
 
-        if cog_json not in installed:
-            installed.append(cog_json)
-            await self.conf.installed.set(installed)
+        await self.conf.installed_cogs.set(installed_cogs)
+        await self.conf.installed_libraries.set(installed_libraries)
 
-    async def _remove_from_installed(self, cog: Installable):
-        """Remove a cog from the saved list of installed cogs.
+    async def _remove_from_installed(self, modules: Iterable[InstalledModule]) -> None:
+        """Remove modules from the saved list
+        of installed modules (corresponding to type of module).
 
         Parameters
         ----------
-        cog : Installable
-            The cog to remove.
+        modules : `list` of `InstalledModule`
+            The modules to remove.
 
         """
-        installed = await self.conf.installed()
-        cog_json = cog.to_json()
+        installed_cogs = await self.conf.installed_cogs()
+        installed_libraries = await self.conf.installed_libraries()
+        for module in modules:
+            if module.type == InstallableType.COG:
+                installed = installed_cogs
+            elif module.type == InstallableType.SHARED_LIBRARY:
+                installed = installed_libraries
+            else:
+                continue
+            with contextlib.suppress(KeyError):
+                installed[module._json_repo_name].pop(module.name)
 
-        if cog_json in installed:
-            installed.remove(cog_json)
-            await self.conf.installed.set(installed)
+        await self.conf.installed_cogs.set(installed_cogs)
+        await self.conf.installed_libraries.set(installed_libraries)
 
-    async def _reinstall_cogs(self, cogs: Iterable[Installable]) -> Tuple[Installable]:
+    async def _available_updates(
+        self, cogs: Iterable[InstalledModule]
+    ) -> Tuple[Tuple[Installable, ...], Tuple[Installable, ...]]:
         """
-        Installs a list of cogs, used when updating.
-        :param cogs:
-        :return: Any cogs that failed to copy
+        Get cogs and libraries which can be updated.
+
+        Parameters
+        ----------
+        cogs : `list` of `InstalledModule`
+            List of cogs, which should be checked against the updates.
+
+        Returns
+        -------
+        tuple
+            2-tuple of cogs and libraries which can be updated.
+
         """
-        failed = []
-        for cog in cogs:
-            if not await cog.copy_to(await self.cog_install_path()):
-                failed.append(cog)
-
-        # noinspection PyTypeChecker
-        return tuple(failed)
-
-    async def _reinstall_libraries(self, cogs: Iterable[Installable]) -> Tuple[Installable]:
-        """
-        Reinstalls any shared libraries from the repos of cogs that
-        were updated.
-        :param cogs:
-        :return: Any libraries that failed to copy
-        """
-        repo_names = set(cog.repo_name for cog in cogs)
-        unfiltered_repos = (self._repo_manager.get_repo(r) for r in repo_names)
-        repos = filter(lambda r: r is not None, unfiltered_repos)
-
-        failed = []
+        repos = {cog.repo for cog in cogs if cog.repo is not None}
+        installed_libraries = await self.installed_libraries()
+        modules: Set[InstalledModule] = set()
+        cogs_to_update: Set[Installable] = set()
+        libraries_to_update: Set[Installable] = set()
+        # split libraries and cogs into 2 categories:
+        # 1. `cogs_to_update`, `libraries_to_update` - module needs update, skip diffs
+        # 2. `modules` - module MAY need update, check diffs
         for repo in repos:
-            if not await repo.install_libraries(
-                target_dir=self.SHAREDLIB_PATH, req_target_dir=self.LIB_PATH
+            for lib in repo.available_libraries:
+                try:
+                    index = installed_libraries.index(lib)
+                except ValueError:
+                    libraries_to_update.add(lib)
+                else:
+                    modules.add(installed_libraries[index])
+        for cog in cogs:
+            if cog.repo is None:
+                # cog had its repo removed, can't check for updates
+                continue
+            if cog.commit:
+                modules.add(cog)
+                continue
+            # marking cog for update if there's no commit data saved (back-compat, see GH-2571)
+            last_cog_occurrence = await cog.repo.get_last_module_occurrence(cog.name)
+            if last_cog_occurrence is not None:
+                cogs_to_update.add(last_cog_occurrence)
+
+        # Reduces diff requests to a single dict with no repeats
+        hashes: Dict[Tuple[Repo, str], Set[InstalledModule]] = defaultdict(set)
+        for module in modules:
+            module.repo = cast(Repo, module.repo)
+            if module.repo.commit != module.commit and await module.repo.is_ancestor(
+                module.commit, module.repo.commit
+            ):
+                hashes[(module.repo, module.commit)].add(module)
+
+        update_commits = []
+        for (repo, old_hash), modules_to_check in hashes.items():
+            modified = await repo.get_modified_modules(old_hash, repo.commit)
+            for module in modules_to_check:
+                try:
+                    index = modified.index(module)
+                except ValueError:
+                    # module wasn't modified - we just need to update its commit
+                    module.commit = repo.commit
+                    update_commits.append(module)
+                else:
+                    modified_module = modified[index]
+                    if modified_module.type == InstallableType.COG:
+                        cogs_to_update.add(modified_module)
+                    elif modified_module.type == InstallableType.SHARED_LIBRARY:
+                        libraries_to_update.add(modified_module)
+
+        await self._save_to_installed(update_commits)
+
+        return (tuple(cogs_to_update), tuple(libraries_to_update))
+
+    async def _install_cogs(
+        self, cogs: Iterable[Installable]
+    ) -> Tuple[Tuple[InstalledModule, ...], Tuple[Installable, ...]]:
+        """Installs a list of cogs.
+
+        Parameters
+        ----------
+        cogs : `list` of `Installable`
+            Cogs to install. ``repo`` property of those objects can't be `None`
+
+        Returns
+        -------
+        tuple
+            2-tuple of installed and failed cogs.
+        """
+        repos: Dict[str, Tuple[Repo, Dict[str, List[Installable]]]] = {}
+        for cog in cogs:
+            try:
+                repo_by_commit = repos[cog.repo_name]
+            except KeyError:
+                cog.repo = cast(Repo, cog.repo)  # docstring specifies this already
+                repo_by_commit = repos[cog.repo_name] = (cog.repo, defaultdict(list))
+            cogs_by_commit = repo_by_commit[1]
+            cogs_by_commit[cog.commit].append(cog)
+        installed = []
+        failed = []
+        for repo, cogs_by_commit in repos.values():
+            exit_to_commit = repo.commit
+            for commit, cogs_to_install in cogs_by_commit.items():
+                await repo.checkout(commit)
+                for cog in cogs_to_install:
+                    if await cog.copy_to(await self.cog_install_path()):
+                        installed.append(InstalledModule.from_installable(cog))
+                    else:
+                        failed.append(cog)
+            await repo.checkout(exit_to_commit)
         # noinspection PyTypeChecker
-        return tuple(failed)
+        return (tuple(installed), tuple(failed))
 
-    async def _reinstall_requirements(self, cogs: Iterable[Installable]) -> bool:
+    async def _reinstall_libraries(
+        self, libraries: Iterable[Installable]
+    ) -> Tuple[Tuple[InstalledModule, ...], Tuple[Installable, ...]]:
+ """Installs a list of shared libraries, used when updating. + + Parameters + ---------- + libraries : `list` of `Installable` + Libraries to reinstall. ``repo`` property of those objects can't be `None` + Returns + ------- + tuple + 2-tuple of installed and failed libraries. """ - Reinstalls requirements for given cogs that have been updated. - Returns a bool that indicates if all requirement installations - were successful. - :param cogs: - :return: + repos: Dict[str, Tuple[Repo, Dict[str, Set[Installable]]]] = {} + for lib in libraries: + try: + repo_by_commit = repos[lib.repo_name] + except KeyError: + lib.repo = cast(Repo, lib.repo) # docstring specifies this already + repo_by_commit = repos[lib.repo_name] = (lib.repo, defaultdict(set)) + libs_by_commit = repo_by_commit[1] + libs_by_commit[lib.commit].add(lib) + + all_installed: List[InstalledModule] = [] + all_failed: List[Installable] = [] + for repo, libs_by_commit in repos.values(): + exit_to_commit = repo.commit + for commit, libs in libs_by_commit.items(): + await repo.checkout(commit) + installed, failed = await repo.install_libraries( + target_dir=self.SHAREDLIB_PATH, req_target_dir=self.LIB_PATH, libraries=libs + ) + all_installed += installed + all_failed += failed + await repo.checkout(exit_to_commit) + + # noinspection PyTypeChecker + return (tuple(all_installed), tuple(all_failed)) + + async def _install_requirements(self, cogs: Iterable[Installable]) -> Tuple[str, ...]: + """ + Installs requirements for given cogs. + + Parameters + ---------- + cogs : `list` of `Installable` + Cogs whose requirements should be installed. + Returns + ------- + tuple + Tuple of failed requirements. 
""" # Reduces requirements to a single list with no repeats - requirements = set(r for c in cogs for r in c.requirements) - repo_names = self._repo_manager.get_all_repo_names() - repos = [(self._repo_manager.get_repo(rn), []) for rn in repo_names] + requirements = {requirement for cog in cogs for requirement in cog.requirements} + repos: List[Tuple[Repo, List[str]]] = [(repo, []) for repo in self._repo_manager.repos] # This for loop distributes the requirements across all repos # which will allow us to concurrently install requirements @@ -167,15 +373,15 @@ class Downloader(commands.Cog): has_reqs = list(filter(lambda item: len(item[1]) > 0, repos)) - ret = True + failed_reqs = [] for repo, reqs in has_reqs: for req in reqs: - # noinspection PyTypeChecker - ret = ret and await repo.install_raw_requirements([req], self.LIB_PATH) - return ret + if not await repo.install_raw_requirements([req], self.LIB_PATH): + failed_reqs.append(req) + return tuple(failed_reqs) @staticmethod - async def _delete_cog(target: Path): + async def _delete_cog(target: Path) -> None: """ Removes an (installed) cog. 
:param target: Path pointing to an existing file or directory @@ -191,11 +397,12 @@ class Downloader(commands.Cog): @commands.command() @checks.is_owner() - async def pipinstall(self, ctx, *deps: str): + async def pipinstall(self, ctx: commands.Context, *deps: str) -> None: """Install a group of dependencies using pip.""" if not deps: - return await ctx.send_help() - repo = Repo("", "", "", Path.cwd(), loop=ctx.bot.loop) + await ctx.send_help() + return + repo = Repo("", "", "", "", Path.cwd(), loop=ctx.bot.loop) async with ctx.typing(): success = await repo.install_raw_requirements(deps, self.LIB_PATH) @@ -211,12 +418,14 @@ class Downloader(commands.Cog): @commands.group() @checks.is_owner() - async def repo(self, ctx): + async def repo(self, ctx: commands.Context) -> None: """Repo management commands.""" pass @repo.command(name="add") - async def _repo_add(self, ctx, name: str, repo_url: str, branch: str = None): + async def _repo_add( + self, ctx: commands.Context, name: str, repo_url: str, branch: str = None + ) -> None: """Add a new repo. Repo names can only contain characters A-z, numbers, underscores, and hyphens. 
@@ -225,14 +434,15 @@ class Downloader(commands.Cog): agreed = await do_install_agreement(ctx) if not agreed: return - if re.match("^[a-zA-Z0-9_\-]*$", name) is None: + if re.match(r"^[a-zA-Z0-9_\-]*$", name) is None: await ctx.send( _("Repo names can only contain characters A-z, numbers, underscores, and hyphens.") ) return try: - # noinspection PyTypeChecker - repo = await self._repo_manager.add_repo(name=name, url=repo_url, branch=branch) + async with ctx.typing(): + # noinspection PyTypeChecker + repo = await self._repo_manager.add_repo(name=name, url=repo_url, branch=branch) except errors.ExistingGitRepo: await ctx.send(_("That git repo has already been added under another name.")) except errors.CloningError as err: @@ -246,7 +456,8 @@ class Downloader(commands.Cog): except OSError: await ctx.send( _( - "Something went wrong trying to add that repo. Your repo name might have an invalid character." + "Something went wrong trying to add that repo." + " Your repo name might have an invalid character." 
                )
            )
        else:
@@ -255,7 +466,7 @@ class Downloader(commands.Cog):
             await ctx.send(repo.install_msg.replace("[p]", ctx.prefix))
 
     @repo.command(name="delete", aliases=["remove", "del"], usage="")
-    async def _repo_del(self, ctx, repo: Repo):
+    async def _repo_del(self, ctx: commands.Context, repo: Repo) -> None:
         """Remove a repo and its files."""
         await self._repo_manager.delete_repo(repo.name)
 
@@ -264,107 +475,168 @@ class Downloader(commands.Cog):
         )
 
     @repo.command(name="list")
-    async def _repo_list(self, ctx):
+    async def _repo_list(self, ctx: commands.Context) -> None:
         """List all installed repos."""
-        repos = self._repo_manager.get_all_repo_names()
-        repos = sorted(repos, key=str.lower)
+        repos = self._repo_manager.repos
+        sorted_repos = sorted(repos, key=lambda r: str.lower(r.name))
         joined = _("Installed Repos:\n\n")
-        for repo_name in repos:
-            repo = self._repo_manager.get_repo(repo_name)
+        for repo in sorted_repos:
             joined += "+ {}: {}\n".format(repo.name, repo.short or "")
 
         for page in pagify(joined, ["\n"], shorten_by=16):
             await ctx.send(box(page.lstrip(" "), lang="diff"))
 
     @repo.command(name="info", usage="")
-    async def _repo_info(self, ctx, repo: Repo):
+    async def _repo_info(self, ctx: commands.Context, repo: Repo) -> None:
         """Show information about a repo."""
-        if repo is None:
-            await ctx.send(_("Repo `{repo.name}` not found.").format(repo=repo))
-            return
-
         msg = _("Information on {repo.name}:\n{description}").format(
             repo=repo, description=repo.description or ""
         )
         await ctx.send(box(msg))
 
+    @repo.command(name="update")
+    async def _repo_update(self, ctx: commands.Context, *repos: Repo) -> None:
+        """Update all repos, or ones of your choosing."""
+        async with ctx.typing():
+            updated: Set[str]
+            if not repos:
+                updated = {repo.name for repo in await self._repo_manager.update_all_repos()}
+            else:
+                updated = set()
+                for repo in repos:
+                    old, new = await repo.update()
+                    if old != new:
+                        updated.add(repo.name)
+
+            if updated:
+                message = _("Repo update completed successfully.")
+                message += _("\nUpdated: ") + humanize_list(tuple(map(inline, updated)))
+            elif not repos:
+                await ctx.send(_("All installed repos are already up to date."))
+                return
+            else:
+                await ctx.send(_("These repos are already up to date."))
+                return
        await ctx.send(message)
 
     @commands.group()
     @checks.is_owner()
-    async def cog(self, ctx):
+    async def cog(self, ctx: commands.Context) -> None:
         """Cog installation management commands."""
         pass
 
-    @cog.command(name="install", usage=" ")
-    async def _cog_install(self, ctx, repo: Repo, cog_name: str):
+    @cog.command(name="install", usage=" ")
+    async def _cog_install(self, ctx: commands.Context, repo: Repo, *cog_names: str) -> None:
         """Install a cog from the given repo."""
-        cog: Installable = discord.utils.get(repo.available_cogs, name=cog_name)
-        if cog is None:
-            await ctx.send(
-                _(
-                    "Error: there is no cog by the name of `{cog_name}` in the `{repo.name}` repo."
-                ).format(cog_name=cog_name, repo=repo)
-            )
+        await self._cog_installrev(ctx, repo, None, cog_names)
+
+    @cog.command(name="installversion", usage="  ")
+    async def _cog_installversion(
+        self, ctx: commands.Context, repo: Repo, rev: str, *cog_names: str
+    ) -> None:
+        """Install a cog from the specified revision of given repo."""
+        await self._cog_installrev(ctx, repo, rev, cog_names)
+
+    async def _cog_installrev(
+        self, ctx: commands.Context, repo: Repo, rev: Optional[str], cog_names: Iterable[str]
+    ) -> None:
+        if not cog_names:
+            await ctx.send_help()
             return
-        elif cog.min_python_version > sys.version_info:
-            await ctx.send(
-                _("This cog requires at least python version {version}, aborting install.").format(
-                    version=".".join([str(n) for n in cog.min_python_version])
-                )
+        commit = None
+        async with ctx.typing():
+            if rev is not None:
+                try:
+                    commit = await repo.get_full_sha1(rev)
+                except errors.AmbiguousRevision as e:
+                    msg = _(
+                        "Error: short sha1 `{rev}` is ambiguous. Possible candidates:\n"
+                    ).format(rev=rev)
+                    for candidate in e.candidates:
+                        msg += (
+                            f"**{candidate.object_type} {candidate.rev}**"
+                            f" - {candidate.description}\n"
+                        )
+                    for page in pagify(msg):
+                        await ctx.send(page)
+                    return
+                except errors.UnknownRevision:
+                    await ctx.send(
+                        _("Error: there is no revision `{rev}` in repo `{repo.name}`").format(
+                            rev=rev, repo=repo
+                        )
+                    )
+                    return
+            cog_names = set(cog_names)
+
+            async with repo.checkout(commit, exit_to_rev=repo.branch):
+                cogs, message = await self._filter_incorrect_cogs_by_names(repo, cog_names)
+                if not cogs:
+                    await ctx.send(message)
+                    return
+                failed_reqs = await self._install_requirements(cogs)
+                if failed_reqs:
+                    message += _("\nFailed to install requirements: ") + humanize_list(
+                        tuple(map(inline, failed_reqs))
+                    )
+                    await ctx.send(message)
+                    return
+
+                installed_cogs, failed_cogs = await self._install_cogs(cogs)
+
+            installed_libs, failed_libs = await repo.install_libraries(
+                target_dir=self.SHAREDLIB_PATH, req_target_dir=self.LIB_PATH
             )
-            return
-        ignore_max = cog.min_bot_version > cog.max_bot_version
-        if (
-            cog.min_bot_version > red_version_info
-            or not ignore_max
-            and cog.max_bot_version < red_version_info
-        ):
-            await ctx.send(
-                _("This cog requires at least Red version {min_version}").format(
-                    min_version=cog.min_bot_version
+            if rev is not None:
+                for cog in installed_cogs:
+                    cog.pinned = True
+            await self._save_to_installed(installed_cogs + installed_libs)
+            if failed_libs:
+                libnames = [inline(lib.name) for lib in failed_libs]
+                message = (
+                    _("\nFailed to install shared libraries for `{repo.name}` repo: ").format(
+                        repo=repo
+                    )
+                    + humanize_list(libnames)
+                    + message
                 )
-                + (
-                    ""
-                    if ignore_max
-                    else _(" and at most {max_version}").format(max_version=cog.max_bot_version)
+            if failed_cogs:
+                cognames = [inline(cog.name) for cog in failed_cogs]
+                message = _("\nFailed to install cogs: ") + humanize_list(cognames) + message
+            if installed_cogs:
+                cognames = [inline(cog.name) for cog in installed_cogs]
+                message = (
+                    _("Successfully installed cogs: ")
+                    + humanize_list(cognames)
+                    + (
+                        _(
+                            "\nThese cogs are now pinned and won't get updated automatically."
+                            " To change this, use `{prefix}cog unpin `"
+                        ).format(prefix=ctx.prefix)
+                        if rev is not None
+                        else ""
+                    )
+                    + _("\nYou can load them using `{prefix}load `").format(
+                        prefix=ctx.prefix
+                    )
+                    + message
                 )
-                + _(", but you have {current_version}, aborting install.").format(
-                    current_version=red_version_info
-                )
-            )
-            return
-
-        if not await repo.install_requirements(cog, self.LIB_PATH):
-            libraries = humanize_list(tuple(map(inline, cog.requirements)))
-            await ctx.send(
-                _("Failed to install the required libraries for `{cog_name}`: {libraries}").format(
-                    cog_name=cog.name, libraries=libraries
-                )
-            )
-            return
-
-        await repo.install_cog(cog, await self.cog_install_path())
-
-        await self._add_to_installed(cog)
-
-        await repo.install_libraries(target_dir=self.SHAREDLIB_PATH, req_target_dir=self.LIB_PATH)
-
-        await ctx.send(
-            _(
-                "Cog `{cog_name}` successfully installed. You can load it with `{prefix}load {cog_name}`"
-            ).format(cog_name=cog_name, prefix=ctx.prefix)
-        )
-        if cog.install_msg:
-            await ctx.send(cog.install_msg.replace("[p]", ctx.prefix))
+            # "---" added to separate cog install messages from Downloader's message
+            await ctx.send(f"{message}\n---")
+            for cog in installed_cogs:
+                if cog.install_msg:
+                    await ctx.send(cog.install_msg.replace("[p]", ctx.prefix))
 
     @cog.command(name="uninstall", usage="")
-    async def _cog_uninstall(self, ctx, cogs: commands.Greedy[InstalledCog]):
+    async def _cog_uninstall(self, ctx: commands.Context, *cogs: InstalledCog) -> None:
         """Uninstall cogs.
 
         You may only uninstall cogs which were previously installed
         by Downloader.
""" if not cogs: - return await ctx.send_help() + await ctx.send_help() + return async with ctx.typing(): uninstalled_cogs = [] failed_cogs = [] @@ -379,7 +651,7 @@ class Downloader(commands.Cog): uninstalled_cogs.append(inline(real_name)) else: failed_cogs.append(real_name) - await self._remove_from_installed(cog) + await self._remove_from_installed(cogs) message = "" if uninstalled_cogs: @@ -395,43 +667,432 @@ class Downloader(commands.Cog): ) await ctx.send(message) - @cog.command(name="update") - async def _cog_update(self, ctx, cog_name: InstalledCog = None): - """Update all cogs, or one of your choosing.""" - installed_cogs = set(await self.installed_cogs()) - - async with ctx.typing(): - if cog_name is None: - updated = await self._repo_manager.update_all_repos() - - else: - try: - updated = await self._repo_manager.update_repo(cog_name.repo_name) - except KeyError: - # Thrown if the repo no longer exists - updated = {} - - updated_cogs = set(cog for repo in updated for cog in repo.available_cogs) - installed_and_updated = updated_cogs & installed_cogs - - if installed_and_updated: - await self._reinstall_requirements(installed_and_updated) - await self._reinstall_cogs(installed_and_updated) - await self._reinstall_libraries(installed_and_updated) - message = _("Cog update completed successfully.") - - cognames = {c.name for c in installed_and_updated} - message += _("\nUpdated: ") + humanize_list(tuple(map(inline, cognames))) - else: - await ctx.send(_("All installed cogs are already up to date.")) - return + @cog.command(name="pin", usage="") + async def _cog_pin(self, ctx: commands.Context, *cogs: InstalledCog) -> None: + """Pin cogs - this will lock cogs on their current version.""" + if not cogs: + await ctx.send_help() + return + already_pinned = [] + pinned = [] + for cog in set(cogs): + if cog.pinned: + already_pinned.append(inline(cog.name)) + continue + cog.pinned = True + pinned.append(cog) + message = "" + if pinned: + await 
self._save_to_installed(pinned) + cognames = [inline(cog.name) for cog in pinned] + message += _("Pinned cogs: ") + humanize_list(cognames) + if already_pinned: + message += _("\nThese cogs were already pinned: ") + humanize_list(already_pinned) await ctx.send(message) - cognames &= set(ctx.bot.extensions.keys()) # only reload loaded cogs - if not cognames: - return await ctx.send( - _("None of the updated cogs were previously loaded. Update complete.") + @cog.command(name="unpin", usage="") + async def _cog_unpin(self, ctx: commands.Context, *cogs: InstalledCog) -> None: + """Unpin cogs - this will remove update lock from cogs.""" + if not cogs: + await ctx.send_help() + return + not_pinned = [] + unpinned = [] + for cog in set(cogs): + if not cog.pinned: + not_pinned.append(inline(cog.name)) + continue + cog.pinned = False + unpinned.append(cog) + message = "" + if unpinned: + await self._save_to_installed(unpinned) + cognames = [inline(cog.name) for cog in unpinned] + message += _("Unpinned cogs: ") + humanize_list(cognames) + if not_pinned: + message += _("\nThese cogs weren't pinned: ") + humanize_list(not_pinned) + await ctx.send(message) + + @cog.command(name="checkforupdates") + async def _cog_checkforupdates(self, ctx: commands.Context) -> None: + """ + Check for available cog updates (including pinned cogs). + + This command doesn't update cogs, it only checks for updates. + Use `[p]cog update` to update cogs. 
+ """ + async with ctx.typing(): + cogs_to_check = await self._get_cogs_to_check() + cogs_to_update, libs_to_update = await self._available_updates(cogs_to_check) + message = "" + if cogs_to_update: + cognames = [cog.name for cog in cogs_to_update] + message += _("These cogs can be updated: ") + humanize_list( + tuple(map(inline, cognames)) + ) + if libs_to_update: + libnames = [cog.name for cog in libs_to_update] + message += _("\nThese shared libraries can be updated: ") + humanize_list( + tuple(map(inline, libnames)) + ) + if message: + await ctx.send(message) + else: + await ctx.send(_("All installed cogs are up to date.")) + + @cog.command(name="update") + async def _cog_update(self, ctx: commands.Context, *cogs: InstalledCog) -> None: + """Update all cogs, or ones of your choosing.""" + await self._cog_update_logic(ctx, cogs=cogs) + + @cog.command(name="updateallfromrepos", usage="") + async def _cog_updateallfromrepos(self, ctx: commands.Context, *repos: Repo) -> None: + """Update all cogs from repos of your choosing.""" + if not repos: + await ctx.send_help() + return + await self._cog_update_logic(ctx, repos=repos) + + @cog.command(name="updatetoversion", usage=" [cogs]") + async def _cog_updatetoversion( + self, ctx: commands.Context, repo: Repo, rev: str, *cogs: InstalledCog + ) -> None: + """Update all cogs, or ones of your choosing to chosen revision of one repo. + + Note that update doesn't mean downgrade and therefore revision + has to be newer than the one that cog currently has. If you want to + downgrade the cog, uninstall and install it again. 
+ """ + await self._cog_update_logic(ctx, repo=repo, rev=rev, cogs=cogs) + + async def _cog_update_logic( + self, + ctx: commands.Context, + *, + repo: Optional[Repo] = None, + repos: Optional[List[Repo]] = None, + rev: Optional[str] = None, + cogs: Optional[List[InstalledModule]] = None, + ) -> None: + async with ctx.typing(): + # this is enough to be sure that `rev` is not None (based on calls to this method) + if repo is not None: + rev = cast(str, rev) + await repo.update() + try: + commit = await repo.get_full_sha1(rev) + except errors.AmbiguousRevision as e: + msg = _( + "Error: short sha1 `{rev}` is ambiguous. Possible candidates:\n" + ).format(rev=rev) + for candidate in e.candidates: + msg += ( + f"**{candidate.object_type} {candidate.rev}**" + f" - {candidate.description}\n" + ) + for page in pagify(msg): + await ctx.send(msg) + return + except errors.UnknownRevision: + await ctx.send( + _("Error: there is no revision `{rev}` in repo `{repo.name}`").format( + rev=rev, repo=repo + ) + ) + return + await repo.checkout(commit) + cogs_to_check = await self._get_cogs_to_check(repos=[repo], cogs=cogs) + else: + cogs_to_check = await self._get_cogs_to_check(repos=repos, cogs=cogs) + + pinned_cogs = {cog for cog in cogs_to_check if cog.pinned} + cogs_to_check -= pinned_cogs + if not cogs_to_check: + message = _("There were no cogs to check.") + if pinned_cogs: + cognames = [cog.name for cog in pinned_cogs] + message += _( + "\nThese cogs are pinned and therefore weren't checked: " + ) + humanize_list(tuple(map(inline, cognames))) + await ctx.send(message) + return + cogs_to_update, libs_to_update = await self._available_updates(cogs_to_check) + + updates_available = cogs_to_update or libs_to_update + cogs_to_update, filter_message = self._filter_incorrect_cogs(cogs_to_update) + message = "" + if updates_available: + updated_cognames, message = await self._update_cogs_and_libs( + cogs_to_update, libs_to_update + ) + else: + if repos: + message = _("Cogs from 
provided repos are already up to date.") + elif repo: + if cogs: + message = _("Provided cogs are already up to date with this revision.") + else: + message = _( + "Cogs from provided repo are already up to date with this revision." + ) + else: + if cogs: + message = _("Provided cogs are already up to date.") + else: + message = _("All installed cogs are already up to date.") + if repo is not None: + await repo.checkout(repo.branch) + if pinned_cogs: + cognames = [cog.name for cog in pinned_cogs] + message += _( + "\nThese cogs are pinned and therefore weren't checked: " + ) + humanize_list(tuple(map(inline, cognames))) + message += filter_message + await ctx.send(message) + if updates_available and updated_cognames: + await self._ask_for_cog_reload(ctx, updated_cognames) + + @cog.command(name="list", usage="") + async def _cog_list(self, ctx: commands.Context, repo: Repo) -> None: + """List all available cogs from a single repo.""" + installed = await self.installed_cogs() + installed_str = "" + if installed: + installed_str = _("Installed Cogs:\n") + "\n".join( + [ + "- {}{}".format(i.name, ": {}".format(i.short) if i.short else "") + for i in installed + if i.repo_name == repo.name + ] ) + cogs = _("Available Cogs:\n") + "\n".join( + [ + "+ {}: {}".format(cog.name, cog.short or "") + for cog in repo.available_cogs + if not (cog.hidden or cog in installed) + ] + ) + cogs = cogs + "\n\n" + installed_str + for page in pagify(cogs, ["\n"], shorten_by=16): + await ctx.send(box(page.lstrip(" "), lang="diff")) + + @cog.command(name="info", usage=" ") + async def _cog_info(self, ctx: commands.Context, repo: Repo, cog_name: str) -> None: + """List information about a single cog.""" + cog = discord.utils.get(repo.available_cogs, name=cog_name) + if cog is None: + await ctx.send( + _("There is no cog `{cog_name}` in the repo `{repo.name}`").format( + cog_name=cog_name, repo=repo + ) + ) + return + + msg = _( + "Information on {cog_name}:\n{description}\n\nRequirements: 
{requirements}" + ).format( + cog_name=cog.name, + description=cog.description or "", + requirements=", ".join(cog.requirements) or "None", + ) + await ctx.send(box(msg)) + + async def is_installed( + self, cog_name: str + ) -> Union[Tuple[bool, InstalledModule], Tuple[bool, None]]: + """Check to see if a cog has been installed through Downloader. + + Parameters + ---------- + cog_name : str + The name of the cog to check for. + + Returns + ------- + `tuple` of (`bool`, `InstalledModule`) + :code:`(True, InstalledModule)` if the cog is installed, else + :code:`(False, None)`. + + """ + for installed_cog in await self.installed_cogs(): + if installed_cog.name == cog_name: + return True, installed_cog + return False, None + + async def _filter_incorrect_cogs_by_names( + self, repo: Repo, cog_names: Iterable[str] + ) -> Tuple[Tuple[Installable, ...], str]: + """Filter out incorrect cogs from list. + + Parameters + ---------- + repo : `Repo` + Repo which should be searched for `cog_names` + cog_names : `list` of `str` + Cog names to search for in repo. + Returns + ------- + tuple + 2-tuple of cogs to install and error message for incorrect cogs. 
+ """ + installed_cogs = await self.installed_cogs() + cogs: List[Installable] = [] + unavailable_cogs: List[str] = [] + already_installed: List[str] = [] + name_already_used: List[str] = [] + + for cog_name in cog_names: + cog: Optional[Installable] = discord.utils.get(repo.available_cogs, name=cog_name) + if cog is None: + unavailable_cogs.append(inline(cog_name)) + continue + if cog in installed_cogs: + already_installed.append(inline(cog_name)) + continue + if discord.utils.get(installed_cogs, name=cog.name): + name_already_used.append(inline(cog_name)) + continue + cogs.append(cog) + + message = "" + + if unavailable_cogs: + message += _("\nCouldn't find these cogs in {repo.name}: ").format( + repo=repo + ) + humanize_list(unavailable_cogs) + if already_installed: + message += _("\nThese cogs were already installed: ") + humanize_list( + already_installed + ) + if name_already_used: + message += _( + "\nSome cogs with these names are already installed from different repos: " + ) + humanize_list(already_installed) + correct_cogs, add_to_message = self._filter_incorrect_cogs(cogs) + if add_to_message: + return correct_cogs, f"{message}{add_to_message}" + return correct_cogs, message + + def _filter_incorrect_cogs( + self, cogs: Iterable[Installable] + ) -> Tuple[Tuple[Installable, ...], str]: + correct_cogs: List[Installable] = [] + outdated_python_version: List[str] = [] + outdated_bot_version: List[str] = [] + for cog in cogs: + if cog.min_python_version > sys.version_info: + outdated_python_version.append( + inline(cog.name) + + _(" (Minimum: {min_version})").format( + min_version=".".join([str(n) for n in cog.min_python_version]) + ) + ) + continue + ignore_max = cog.min_bot_version > cog.max_bot_version + if ( + cog.min_bot_version > red_version_info + or not ignore_max + and cog.max_bot_version < red_version_info + ): + outdated_bot_version.append( + inline(cog.name) + + _(" (Minimum: {min_version}").format(min_version=cog.min_bot_version) + + ( + "" + if 
ignore_max + else _(", at most: {max_version}").format(max_version=cog.max_bot_version) + ) + + ")" + ) + continue + correct_cogs.append(cog) + message = "" + if outdated_python_version: + message += _( + "\nThese cogs require higher python version than you have: " + ) + humanize_list(outdated_python_version) + if outdated_bot_version: + message += _( + "\nThese cogs require different Red version" + " than you currently have ({current_version}): " + ).format(current_version=red_version_info) + humanize_list(outdated_bot_version) + + return tuple(correct_cogs), message + + async def _get_cogs_to_check( + self, + *, + repos: Optional[Iterable[Repo]] = None, + cogs: Optional[Iterable[InstalledModule]] = None, + ) -> Set[InstalledModule]: + if not (cogs or repos): + await self._repo_manager.update_all_repos() + cogs_to_check = {cog for cog in await self.installed_cogs() if cog.repo is not None} + else: + # this is enough to be sure that `cogs` is not None (based on if above) + if not repos: + cogs = cast(Iterable[InstalledModule], cogs) + repos = {cog.repo for cog in cogs if cog.repo is not None} + + for repo in repos: + if await repo.is_on_branch(): + exit_to_commit = None + else: + exit_to_commit = repo.commit + await repo.update() + await repo.checkout(exit_to_commit) + if cogs: + cogs_to_check = {cog for cog in cogs if cog.repo is not None and cog.repo in repos} + else: + cogs_to_check = { + cog + for cog in await self.installed_cogs() + if cog.repo is not None and cog.repo in repos + } + + return cogs_to_check + + async def _update_cogs_and_libs( + self, cogs_to_update: Iterable[Installable], libs_to_update: Iterable[Installable] + ) -> Tuple[Set[str], str]: + failed_reqs = await self._install_requirements(cogs_to_update) + if failed_reqs: + return ( + set(), + _("Failed to install requirements: ") + + humanize_list(tuple(map(inline, failed_reqs))), + ) + installed_cogs, failed_cogs = await self._install_cogs(cogs_to_update) + installed_libs, failed_libs = await 
self._reinstall_libraries(libs_to_update) + await self._save_to_installed(installed_cogs + installed_libs) + message = _("Cog update completed successfully.") + + updated_cognames: Set[str] = set() + if installed_cogs: + updated_cognames = {cog.name for cog in installed_cogs} + message += _("\nUpdated: ") + humanize_list(tuple(map(inline, updated_cognames))) + if failed_cogs: + cognames = [cog.name for cog in failed_cogs] + message += _("\nFailed to update cogs: ") + humanize_list(tuple(map(inline, cognames))) + if not cogs_to_update: + message = _("No cogs were updated.") + if installed_libs: + message += _( + "\nSome shared libraries were updated, you should restart the bot " + "to bring the changes into effect." + ) + if failed_libs: + libnames = [lib.name for lib in failed_libs] + message += _("\nFailed to install shared libraries: ") + humanize_list( + tuple(map(inline, libnames)) + ) + return (updated_cognames, message) + + async def _ask_for_cog_reload(self, ctx: commands.Context, updated_cognames: Set[str]) -> None: + updated_cognames &= ctx.bot.extensions.keys() # only reload loaded cogs + if not updated_cognames: + await ctx.send(_("None of the updated cogs were previously loaded. 
Update complete.")) + return if not ctx.assume_yes: message = _("Would you like to reload the updated cogs?") @@ -464,75 +1125,7 @@ class Downloader(commands.Cog): with contextlib.suppress(discord.Forbidden): await query.clear_reactions() - await ctx.invoke(ctx.bot.get_cog("Core").reload, *cognames) - - @cog.command(name="list", usage="") - async def _cog_list(self, ctx, repo: Repo): - """List all available cogs from a single repo.""" - installed = await self.installed_cogs() - installed_str = "" - if installed: - installed_str = _("Installed Cogs:\n") + "\n".join( - [ - "- {}{}".format(i.name, ": {}".format(i.short) if i.short else "") - for i in installed - if i.repo_name == repo.name - ] - ) - cogs = repo.available_cogs - cogs = _("Available Cogs:\n") + "\n".join( - [ - "+ {}: {}".format(c.name, c.short or "") - for c in cogs - if not (c.hidden or c in installed) - ] - ) - cogs = cogs + "\n\n" + installed_str - for page in pagify(cogs, ["\n"], shorten_by=16): - await ctx.send(box(page.lstrip(" "), lang="diff")) - - @cog.command(name="info", usage=" ") - async def _cog_info(self, ctx, repo: Repo, cog_name: str): - """List information about a single cog.""" - cog = discord.utils.get(repo.available_cogs, name=cog_name) - if cog is None: - await ctx.send( - _("There is no cog `{cog_name}` in the repo `{repo.name}`").format( - cog_name=cog_name, repo=repo - ) - ) - return - - msg = _( - "Information on {cog_name}:\n{description}\n\nRequirements: {requirements}" - ).format( - cog_name=cog.name, - description=cog.description or "", - requirements=", ".join(cog.requirements) or "None", - ) - await ctx.send(box(msg)) - - async def is_installed( - self, cog_name: str - ) -> Union[Tuple[bool, Installable], Tuple[bool, None]]: - """Check to see if a cog has been installed through Downloader. - - Parameters - ---------- - cog_name : str - The name of the cog to check for. 
- - Returns - ------- - `tuple` of (`bool`, `Installable`) - :code:`(True, Installable)` if the cog is installed, else - :code:`(False, None)`. - - """ - for installable in await self.installed_cogs(): - if installable.name == cog_name: - return True, installable - return False, None + await ctx.invoke(ctx.bot.get_cog("Core").reload, *updated_cognames) def format_findcog_info( self, command_name: str, cog_installable: Union[Installable, object] = None @@ -554,17 +1147,20 @@ class Downloader(commands.Cog): """ if isinstance(cog_installable, Installable): made_by = ", ".join(cog_installable.author) or _("Missing from info.json") - repo = self._repo_manager.get_repo(cog_installable.repo_name) - repo_url = _("Missing from installed repos") if repo is None else repo.url + repo_url = ( + _("Missing from installed repos") + if cog_installable.repo is None + else cog_installable.repo.url + ) cog_name = cog_installable.name else: made_by = "26 & co." repo_url = "https://github.com/Cog-Creators/Red-DiscordBot" cog_name = cog_installable.__class__.__name__ - msg = _("Command: {command}\nMade by: {author}\nRepo: {repo}\nCog name: {cog}") + msg = _("Command: {command}\nMade by: {author}\nRepo: {repo_url}\nCog name: {cog}") - return msg.format(command=command_name, author=made_by, repo=repo_url, cog=cog_name) + return msg.format(command=command_name, author=made_by, repo_url=repo_url, cog=cog_name) def cog_name_from_instance(self, instance: object) -> str: """Determines the cog name that Downloader knows from the cog instance. @@ -586,7 +1182,7 @@ class Downloader(commands.Cog): return splitted[-2] @commands.command() - async def findcog(self, ctx: commands.Context, command_name: str): + async def findcog(self, ctx: commands.Context, command_name: str) -> None: """Find which cog a command comes from. This will only work with loaded cogs. 
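Both `_install_cogs` and `_reinstall_libraries` in the hunks above batch git checkouts by first grouping modules per repo and per commit, so each distinct commit is checked out only once. Here is a self-contained sketch of just that grouping step; `Module` and `group_by_repo_and_commit` are simplified stand-ins for illustration, not the actual Downloader classes:

```python
from collections import defaultdict
from typing import Dict, List, NamedTuple


class Module(NamedTuple):
    # Hypothetical stand-in for downloader's Installable/InstalledModule.
    repo_name: str
    name: str
    commit: str


def group_by_repo_and_commit(modules: List[Module]) -> Dict[str, Dict[str, List[Module]]]:
    """Group modules as repo_name -> commit -> [modules].

    With this shape, the installer can check out each commit of each repo
    exactly once, install everything pinned to that commit, then move on.
    """
    repos: Dict[str, Dict[str, List[Module]]] = {}
    for module in modules:
        by_commit = repos.setdefault(module.repo_name, defaultdict(list))
        by_commit[module.commit].append(module)
    return repos


mods = [
    Module("repoA", "cog1", "abc"),
    Module("repoA", "cog2", "abc"),
    Module("repoA", "cog3", "def"),
    Module("repoB", "cog4", "abc"),
]
grouped = group_by_repo_and_commit(mods)
# repoA needs two checkouts ("abc" and "def"); repoB needs one
```

The real methods additionally remember `repo.commit` before the loop and check the repo back out to it afterwards, so the working tree is left where it started.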
diff --git a/redbot/cogs/downloader/errors.py b/redbot/cogs/downloader/errors.py index cd8f7405b..efd31bedd 100644 --- a/redbot/cogs/downloader/errors.py +++ b/redbot/cogs/downloader/errors.py @@ -1,7 +1,16 @@ +from __future__ import annotations + +from typing import List, TYPE_CHECKING + +if TYPE_CHECKING: + from .repo_manager import Candidate + + __all__ = [ "DownloaderException", "GitException", "InvalidRepoName", + "CopyingError", "ExistingGitRepo", "MissingGitRepo", "CloningError", @@ -10,6 +19,8 @@ __all__ = [ "UpdateError", "GitDiffError", "NoRemoteURL", + "UnknownRevision", + "AmbiguousRevision", "PipError", ] @@ -37,6 +48,15 @@ class InvalidRepoName(DownloaderException): pass +class CopyingError(DownloaderException): + """ + Thrown when there was an issue + during copying of module's files. + """ + + pass + + class ExistingGitRepo(DownloaderException): """ Thrown when trying to clone into a folder where a @@ -105,6 +125,24 @@ class NoRemoteURL(GitException): pass +class UnknownRevision(GitException): + """ + Thrown when specified revision cannot be found. + """ + + pass + + +class AmbiguousRevision(GitException): + """ + Thrown when specified revision is ambiguous. + """ + + def __init__(self, message: str, candidates: List[Candidate]) -> None: + super().__init__(message) + self.candidates = candidates + + class PipError(DownloaderException): """ Thrown when pip returns a non-zero return code.
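The new `AmbiguousRevision` error carries the list of candidate objects so a caller can show the user what a short hash could refer to. A runnable sketch of that pattern, with minimal stand-in classes (the real ones live in `redbot.cogs.downloader`; the sample candidates are fabricated):

```python
from typing import List, NamedTuple

# Minimal stand-ins mirroring the hierarchy in the diff above.
class Candidate(NamedTuple):
    rev: str
    object_type: str
    description: str

class GitException(Exception):
    pass

class AmbiguousRevision(GitException):
    # Carries the candidate commits/tags so the caller can present choices.
    def __init__(self, message: str, candidates: List[Candidate]) -> None:
        super().__init__(message)
        self.candidates = candidates

def resolve(rev: str) -> str:
    # Pretend the short SHA1 matched both a commit and a tag.
    raise AmbiguousRevision(
        f"Short SHA1 {rev} is ambiguous.",
        [
            Candidate("a1b2c3d", "commit", "fix typo"),
            Candidate("a1b2c4e", "tag", "v1.0"),
        ],
    )

try:
    resolve("a1b2c")
except AmbiguousRevision as exc:
    # e.g. format the choices for a prompt shown to the user
    options = [f"{c.rev} ({c.object_type})" for c in exc.candidates]
```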
diff --git a/redbot/cogs/downloader/installable.py b/redbot/cogs/downloader/installable.py index 772aec4a9..327d571ad 100644 --- a/redbot/cogs/downloader/installable.py +++ b/redbot/cogs/downloader/installable.py @@ -1,9 +1,11 @@ +from __future__ import annotations + import json import distutils.dir_util import shutil -from enum import Enum +from enum import IntEnum from pathlib import Path -from typing import MutableMapping, Any, TYPE_CHECKING +from typing import MutableMapping, Any, TYPE_CHECKING, Optional, Dict, Union, Callable, Tuple, cast from .log import log from .json_mixins import RepoJSONMixin @@ -11,10 +13,11 @@ from .json_mixins import RepoJSONMixin from redbot.core import __version__, version_info as red_version_info, VersionInfo if TYPE_CHECKING: - from .repo_manager import RepoManager + from .repo_manager import RepoManager, Repo -class InstallableType(Enum): +class InstallableType(IntEnum): + # using IntEnum, because hot-reload breaks its identity UNKNOWN = 0 COG = 1 SHARED_LIBRARY = 2 @@ -34,6 +37,10 @@ class Installable(RepoJSONMixin): ---------- repo_name : `str` Name of the repository which this package belongs to. + repo : Repo, optional + Repo object of the Installable, if repo is missing this will be `None` + commit : `str`, optional + Installable's commit. This is not the same as ``repo.commit`` author : `tuple` of `str`, optional Name(s) of the author(s). bot_version : `tuple` of `int` @@ -58,30 +65,36 @@ class Installable(RepoJSONMixin): """ - def __init__(self, location: Path): + def __init__(self, location: Path, repo: Optional[Repo] = None, commit: str = ""): """Base installable initializer. Parameters ---------- location : pathlib.Path Location (file or folder) to the installable. + repo : Repo, optional + Repo object of the Installable, if repo is missing this will be `None` + commit : str + Installable's commit. 
This is not the same as ``repo.commit`` """ super().__init__(location) self._location = location + self.repo = repo self.repo_name = self._location.parent.stem + self.commit = commit - self.author = () + self.author: Tuple[str, ...] = () self.min_bot_version = red_version_info self.max_bot_version = red_version_info self.min_python_version = (3, 5, 1) self.hidden = False self.disabled = False - self.required_cogs = {} # Cog name -> repo URL - self.requirements = () - self.tags = () + self.required_cogs: Dict[str, str] = {} # Cog name -> repo URL + self.requirements: Tuple[str, ...] = () + self.tags: Tuple[str, ...] = () self.type = InstallableType.UNKNOWN if self._info_file.exists(): @@ -90,15 +103,15 @@ class Installable(RepoJSONMixin): if self._info == {}: self.type = InstallableType.COG - def __eq__(self, other): + def __eq__(self, other: Any) -> bool: # noinspection PyProtectedMember return self._location == other._location - def __hash__(self): + def __hash__(self) -> int: return hash(self._location) @property - def name(self): + def name(self) -> str: """`str` : The name of this package.""" return self._location.stem @@ -111,6 +124,7 @@ class Installable(RepoJSONMixin): :return: Status of installation :rtype: bool """ + copy_func: Callable[..., Any] if self._location.is_file(): copy_func = shutil.copy2 else: @@ -121,18 +135,20 @@ class Installable(RepoJSONMixin): # noinspection PyBroadException try: copy_func(src=str(self._location), dst=str(target_dir / self._location.stem)) - except: + except: # noqa: E722 log.exception("Error occurred when copying path: {}".format(self._location)) return False return True - def _read_info_file(self): + def _read_info_file(self) -> None: super()._read_info_file() if self._info_file.exists(): self._process_info_file() - def _process_info_file(self, info_file_path: Path = None) -> MutableMapping[str, Any]: + def _process_info_file( + self, info_file_path: Optional[Path] = None + ) -> MutableMapping[str, Any]: """ Processes an 
information file. Loads dependencies among other information into this object. @@ -145,7 +161,7 @@ class Installable(RepoJSONMixin): if info_file_path is None or not info_file_path.is_file(): raise ValueError("No valid information file path was found.") - info = {} + info: Dict[str, Any] = {} with info_file_path.open(encoding="utf-8") as f: try: info = json.load(f) @@ -174,7 +190,7 @@ class Installable(RepoJSONMixin): self.max_bot_version = max_bot_version try: - min_python_version = tuple(info.get("min_python_version", [3, 5, 1])) + min_python_version = tuple(info.get("min_python_version", (3, 5, 1))) except ValueError: min_python_version = self.min_python_version self.min_python_version = min_python_version @@ -212,14 +228,51 @@ class Installable(RepoJSONMixin): return info - def to_json(self): - return {"repo_name": self.repo_name, "cog_name": self.name} + +class InstalledModule(Installable): + """Base class for installed modules, + this is basically instance of installed `Installable` + used by Downloader. + + Attributes + ---------- + pinned : `bool` + Whether or not this cog is pinned, always `False` if module is not a cog. 
+ """ + + def __init__( + self, + location: Path, + repo: Optional[Repo] = None, + commit: str = "", + pinned: bool = False, + json_repo_name: str = "", + ): + super().__init__(location=location, repo=repo, commit=commit) + self.pinned: bool = pinned if self.type == InstallableType.COG else False + # this is here so that Downloader could use real repo name instead of "MISSING_REPO" + self._json_repo_name = json_repo_name + + def to_json(self) -> Dict[str, Union[str, bool]]: + module_json: Dict[str, Union[str, bool]] = { + "repo_name": self.repo_name, + "module_name": self.name, + "commit": self.commit, + } + if self.type == InstallableType.COG: + module_json["pinned"] = self.pinned + return module_json @classmethod - def from_json(cls, data: dict, repo_mgr: "RepoManager"): - repo_name = data["repo_name"] - cog_name = data["cog_name"] + def from_json( + cls, data: Dict[str, Union[str, bool]], repo_mgr: RepoManager + ) -> InstalledModule: + repo_name = cast(str, data["repo_name"]) + cog_name = cast(str, data["module_name"]) + commit = cast(str, data.get("commit", "")) + pinned = cast(bool, data.get("pinned", False)) + # TypedDict, where are you :/ repo = repo_mgr.get_repo(repo_name) if repo is not None: repo_folder = repo.folder_path @@ -228,4 +281,12 @@ class Installable(RepoJSONMixin): location = repo_folder / cog_name - return cls(location=location) + return cls( + location=location, repo=repo, commit=commit, pinned=pinned, json_repo_name=repo_name + ) + + @classmethod + def from_installable(cls, module: Installable, *, pinned: bool = False) -> InstalledModule: + return cls( + location=module._location, repo=module.repo, commit=module.commit, pinned=pinned + ) diff --git a/redbot/cogs/downloader/json_mixins.py b/redbot/cogs/downloader/json_mixins.py index b989e91ef..6c8a18282 100644 --- a/redbot/cogs/downloader/json_mixins.py +++ b/redbot/cogs/downloader/json_mixins.py @@ -1,5 +1,6 @@ import json from pathlib import Path +from typing import Optional, Tuple, Dict, 
Any class RepoJSONMixin: @@ -8,18 +9,18 @@ class RepoJSONMixin: def __init__(self, repo_folder: Path): self._repo_folder = repo_folder - self.author = None - self.install_msg = None - self.short = None - self.description = None + self.author: Optional[Tuple[str, ...]] = None + self.install_msg: Optional[str] = None + self.short: Optional[str] = None + self.description: Optional[str] = None self._info_file = repo_folder / self.INFO_FILE_NAME if self._info_file.exists(): self._read_info_file() - self._info = {} + self._info: Dict[str, Any] = {} - def _read_info_file(self): + def _read_info_file(self) -> None: if not (self._info_file.exists() or self._info_file.is_file()): return diff --git a/redbot/cogs/downloader/repo_manager.py b/redbot/cogs/downloader/repo_manager.py index 6e29032bc..39ee18dce 100644 --- a/redbot/cogs/downloader/repo_manager.py +++ b/redbot/cogs/downloader/repo_manager.py @@ -1,3 +1,5 @@ +from __future__ import annotations + import asyncio import functools import os @@ -7,23 +9,71 @@ import shutil import re from concurrent.futures import ThreadPoolExecutor from pathlib import Path -from subprocess import run as sp_run, PIPE +from subprocess import run as sp_run, PIPE, CompletedProcess from string import Formatter from sys import executable -from typing import List, Tuple, Iterable, MutableMapping, Union, Optional +from typing import ( + Any, + AsyncContextManager, + Awaitable, + Dict, + Generator, + Iterable, + List, + NamedTuple, + Optional, + Tuple, +) -from redbot.core import data_manager, commands +import discord +from redbot.core import data_manager, commands, Config from redbot.core.utils import safe_delete from redbot.core.i18n import Translator from . 
import errors -from .installable import Installable, InstallableType +from .installable import Installable, InstallableType, InstalledModule from .json_mixins import RepoJSONMixin from .log import log _ = Translator("RepoManager", __file__) +class Candidate(NamedTuple): + rev: str + object_type: str + description: str + + +class _RepoCheckoutCtxManager( + Awaitable[None], AsyncContextManager[None] +): # pylint: disable=duplicate-bases + def __init__( + self, + repo: Repo, + rev: Optional[str], + exit_to_rev: Optional[str] = None, + force_checkout: bool = False, + ): + self.repo = repo + self.rev = rev + if exit_to_rev is None: + self.exit_to_rev = self.repo.commit + else: + self.exit_to_rev = exit_to_rev + self.force_checkout = force_checkout + self.coro = repo._checkout(self.rev, force_checkout=self.force_checkout) + + def __await__(self) -> Generator[Any, None, None]: + return self.coro.__await__() + + async def __aenter__(self) -> None: + await self + + async def __aexit__(self, exc_type, exc, tb) -> None: + if self.rev is not None: + await self.repo._checkout(self.exit_to_rev, force_checkout=self.force_checkout) + + class ProcessFormatter(Formatter): def vformat(self, format_string, args, kwargs): return shlex.split(super().vformat(format_string, args, kwargs)) @@ -38,27 +88,49 @@ class ProcessFormatter(Formatter): class Repo(RepoJSONMixin): GIT_CLONE = "git clone --recurse-submodules -b {branch} {url} {folder}" GIT_CLONE_NO_BRANCH = "git clone --recurse-submodules {url} {folder}" - GIT_CURRENT_BRANCH = "git -C {path} rev-parse --abbrev-ref HEAD" + GIT_CURRENT_BRANCH = "git -C {path} symbolic-ref --short HEAD" + GIT_CURRENT_COMMIT = "git -C {path} rev-parse HEAD" GIT_LATEST_COMMIT = "git -C {path} rev-parse {branch}" GIT_HARD_RESET = "git -C {path} reset --hard origin/{branch} -q" GIT_PULL = "git -C {path} pull --recurse-submodules -q --ff-only" - GIT_DIFF_FILE_STATUS = "git -C {path} diff --no-commit-id --name-status {old_hash} {new_hash}" - GIT_LOG = "git -C 
{path} log --relative-date --reverse {old_hash}.. {relative_file_path}" + GIT_DIFF_FILE_STATUS = ( + "git -C {path} diff-tree --no-commit-id --name-status" + " -r -z --line-prefix='\t' {old_rev} {new_rev}" + ) + GIT_LOG = "git -C {path} log --relative-date --reverse {old_rev}.. {relative_file_path}" GIT_DISCOVER_REMOTE_URL = "git -C {path} config --get remote.origin.url" + GIT_CHECKOUT = "git -C {path} checkout {rev}" + GIT_GET_FULL_SHA1 = "git -C {path} rev-parse --verify {rev}^{{commit}}" + GIT_IS_ANCESTOR = ( + "git -C {path} merge-base --is-ancestor {maybe_ancestor_rev} {descendant_rev}" + ) + GIT_CHECK_IF_MODULE_EXISTS = "git -C {path} cat-file -e {rev}:{module_name}/__init__.py" + # ↓ this gives a commit after last occurrence + GIT_GET_LAST_MODULE_OCCURRENCE_COMMIT = ( + "git -C {path} log --diff-filter=D --pretty=format:%H -n 1 {descendant_rev}" + " -- {module_name}/__init__.py" + ) PIP_INSTALL = "{python} -m pip install -U -t {target_dir} {reqs}" + MODULE_FOLDER_REGEX = re.compile(r"(\w+)\/") + AMBIGUOUS_ERROR_REGEX = re.compile( + r"^hint: {3}(?P<rev>[A-Za-z0-9]+) (?P<type>commit|tag) (?P<desc>.+)$", re.MULTILINE + ) + def __init__( self, name: str, url: str, - branch: str, + branch: Optional[str], + commit: str, folder_path: Path, - available_modules: Tuple[Installable] = (), - loop: asyncio.AbstractEventLoop = None, + available_modules: Tuple[Installable, ...]
= (), + loop: Optional[asyncio.AbstractEventLoop] = None, ): self.url = url self.branch = branch + self.commit = commit self.name = name @@ -73,12 +145,10 @@ class Repo(RepoJSONMixin): self._repo_lock = asyncio.Lock() - self._loop = loop - if self._loop is None: - self._loop = asyncio.get_event_loop() + self._loop = loop if loop is not None else asyncio.get_event_loop() @classmethod - async def convert(cls, ctx: commands.Context, argument: str): + async def convert(cls, ctx: commands.Context, argument: str) -> Repo: downloader_cog = ctx.bot.get_cog("Downloader") if downloader_cog is None: raise commands.CommandError(_("No Downloader cog found.")) @@ -92,26 +162,82 @@ class Repo(RepoJSONMixin): ) return poss_repo - def _existing_git_repo(self) -> (bool, Path): + def _existing_git_repo(self) -> Tuple[bool, Path]: git_path = self.folder_path / ".git" return git_path.exists(), git_path - async def _get_file_update_statuses( - self, old_hash: str, new_hash: str - ) -> MutableMapping[str, str]: + async def is_ancestor(self, maybe_ancestor_rev: str, descendant_rev: str) -> bool: """ - Gets the file update status letters for each changed file between - the two hashes. - :param old_hash: Pre-update - :param new_hash: Post-update - :return: Mapping of filename -> status_letter + Check if the first is an ancestor of the second. 
+ + Parameters + ---------- + maybe_ancestor_rev : `str` + Revision to check if it is ancestor of :code:`descendant_rev` + descendant_rev : `str` + Descendant revision + + Returns + ------- + bool + `True` if :code:`maybe_ancestor_rev` is + ancestor of :code:`descendant_rev` or `False` otherwise + """ + valid_exit_codes = (0, 1) p = await self._run( ProcessFormatter().format( - self.GIT_DIFF_FILE_STATUS, + self.GIT_IS_ANCESTOR, path=self.folder_path, - old_hash=old_hash, - new_hash=new_hash, + maybe_ancestor_rev=maybe_ancestor_rev, + descendant_rev=descendant_rev, + ), + valid_exit_codes=valid_exit_codes, + ) + + if p.returncode in valid_exit_codes: + return not bool(p.returncode) + raise errors.GitException( + f"Git failed to determine if commit {maybe_ancestor_rev}" + f" is ancestor of {descendant_rev} for repo at path: {self.folder_path}" + ) + + async def is_on_branch(self) -> bool: + """ + Check if repo is currently on branch. + + Returns + ------- + bool + `True` if repo is on branch or `False` otherwise + + """ + return await self.latest_commit() == self.commit + + async def _get_file_update_statuses( + self, old_rev: str, new_rev: Optional[str] = None + ) -> Dict[str, str]: + """ + Gets the file update status letters for each changed file between the two revisions. 
+ + Parameters + ---------- + old_rev : `str` + Pre-update revision + new_rev : `str`, optional + Post-update revision, defaults to repo's branch if not given + + Returns + ------- + Dict[str, str] + Mapping of filename -> status_letter + + """ + if new_rev is None: + new_rev = self.branch + p = await self._run( + ProcessFormatter().format( + self.GIT_DIFF_FILE_STATUS, path=self.folder_path, old_rev=old_rev, new_rev=new_rev ) ) @@ -120,21 +246,156 @@ class Repo(RepoJSONMixin): "Git diff failed for repo at path: {}".format(self.folder_path) ) - stdout = p.stdout.strip().decode().split("\n") - + stdout = p.stdout.strip(b"\t\n\x00 ").decode().split("\x00\t") ret = {} for filename in stdout: - # TODO: filter these filenames by ones in self.available_modules - status, _, filepath = filename.partition("\t") + status, __, filepath = filename.partition("\x00") # NUL character ret[filepath] = status return ret - async def _get_commit_notes(self, old_commit_hash: str, relative_file_path: str) -> str: + async def get_last_module_occurrence( + self, module_name: str, descendant_rev: Optional[str] = None + ) -> Optional[Installable]: + """ + Gets module's `Installable` from last commit in which it still occurs. + + Parameters + ---------- + module_name : str + Name of module to get. + descendant_rev : `str`, optional + Revision from which the module's commit must be + reachable (i.e. descendant commit), + defaults to repo's branch if not given. + + Returns + ------- + `Installable` + Module from last commit in which it still occurs + or `None` if it couldn't be found. 
+ + """ + if descendant_rev is None: + descendant_rev = self.branch + p = await self._run( + ProcessFormatter().format( + self.GIT_CHECK_IF_MODULE_EXISTS, + path=self.folder_path, + rev=descendant_rev, + module_name=module_name, + ), + debug_only=True, + ) + if p.returncode == 0: + async with self.checkout(descendant_rev): + return discord.utils.get(self.available_modules, name=module_name) + + p = await self._run( + ProcessFormatter().format( + self.GIT_GET_LAST_MODULE_OCCURRENCE_COMMIT, + path=self.folder_path, + descendant_rev=descendant_rev, + module_name=module_name, + ) + ) + + if p.returncode != 0: + raise errors.GitException( + "Git log failed for repo at path: {}".format(self.folder_path) + ) + + commit = p.stdout.decode().strip() + if commit: + async with self.checkout(f"{commit}~"): + return discord.utils.get(self.available_modules, name=module_name) + return None + + async def _is_module_modified(self, module: Installable, other_hash: str) -> bool: + """ + Checks if given module was different in :code:`other_hash`. + + Parameters + ---------- + module : `Installable` + Module to check. + other_hash : `str` + Hash to compare module to. + + Returns + ------- + bool + `True` if module was different, `False` otherwise. + + """ + if module.commit == other_hash: + return False + + for status in await self._get_file_update_statuses(other_hash, module.commit): + match = self.MODULE_FOLDER_REGEX.match(status) + if match is not None and match.group(1) == module.name: + return True + + return False + + async def get_modified_modules( + self, old_rev: str, new_rev: Optional[str] = None + ) -> Tuple[Installable, ...]: + """ + Gets modified modules between the two revisions. 
+ For every module that doesn't exist in :code:`new_rev`, + it will try to find last commit, where it still existed + + Parameters + ---------- + old_rev : `str` + Pre-update revision, ancestor of :code:`new_rev` + new_rev : `str`, optional + Post-update revision, defaults to repo's branch if not given + + Returns + ------- + `tuple` of `Installable` + List of changed modules between the two revisions. + + """ + if new_rev is None: + new_rev = self.branch + modified_modules = set() + # check differences + for status in await self._get_file_update_statuses(old_rev, new_rev): + match = self.MODULE_FOLDER_REGEX.match(status) + if match is not None: + modified_modules.add(match.group(1)) + + async with self.checkout(old_rev): + # save old modules + old_hash = self.commit + old_modules = self.available_modules + # save new modules + await self.checkout(new_rev) + modules = [] + new_modules = self.available_modules + for old_module in old_modules: + if old_module.name not in modified_modules: + continue + try: + index = new_modules.index(old_module) + except ValueError: + # module doesn't exist in this revision, try finding previous occurrence + module = await self.get_last_module_occurrence(old_module.name, new_rev) + if module is not None and await self._is_module_modified(module, old_hash): + modules.append(module) + else: + modules.append(new_modules[index]) + + return tuple(modules) + + async def _get_commit_notes(self, old_rev: str, relative_file_path: str) -> str: """ Gets the commit notes from git log. - :param old_commit_hash: Point in time to start getting messages + :param old_rev: Point in time to start getting messages :param relative_file_path: Path relative to the repo folder of the file to get messages for. 
:return: Git commit note log @@ -143,7 +404,7 @@ class Repo(RepoJSONMixin): ProcessFormatter().format( self.GIT_LOG, path=self.folder_path, - old_hash=old_commit_hash, + old_rev=old_rev, relative_file_path=relative_file_path, ) ) @@ -156,7 +417,47 @@ class Repo(RepoJSONMixin): return p.stdout.decode().strip() - def _update_available_modules(self) -> Tuple[str]: + async def get_full_sha1(self, rev: str) -> str: + """ + Gets full sha1 object name. + + Parameters + ---------- + rev : str + Revision to search for full sha1 object name. + + Raises + ------ + .UnknownRevision + When git cannot find provided revision. + .AmbiguousRevision + When git cannot resolve provided short sha1 to one commit. + + Returns + ------- + `str` + Full sha1 object name for provided revision. + + """ + p = await self._run( + ProcessFormatter().format(self.GIT_GET_FULL_SHA1, path=self.folder_path, rev=rev) + ) + + if p.returncode != 0: + stderr = p.stderr.decode().strip() + ambiguous_error = f"error: short SHA1 {rev} is ambiguous\nhint: The candidates are:\n" + if not stderr.startswith(ambiguous_error): + raise errors.UnknownRevision(f"Revision {rev} cannot be found.") + candidates = [] + for match in self.AMBIGUOUS_ERROR_REGEX.finditer(stderr, len(ambiguous_error)): + candidates.append(Candidate(match["rev"], match["type"], match["desc"])) + if candidates: + raise errors.AmbiguousRevision(f"Short SHA1 {rev} is ambiguous.", candidates) + raise errors.UnknownRevision(f"Revision {rev} cannot be found.") + + return p.stdout.decode().strip() + + def _update_available_modules(self) -> Tuple[Installable, ...]: """ Updates the available modules attribute for this repo. :return: List of available modules. 
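`get_full_sha1` above resolves a short hash by parsing git's stderr with `AMBIGUOUS_ERROR_REGEX`. A small demonstration of that parsing step against a fabricated sample of git's ambiguous-SHA1 hint output (the hashes and messages are invented):

```python
import re

# Same pattern as Repo.AMBIGUOUS_ERROR_REGEX: "hint:" plus exactly three
# spaces, then the candidate's hash, object type, and description.
AMBIGUOUS_ERROR_REGEX = re.compile(
    r"^hint: {3}(?P<rev>[A-Za-z0-9]+) (?P<type>commit|tag) (?P<desc>.+)$",
    re.MULTILINE,
)

stderr = (
    "error: short SHA1 c0ffee is ambiguous\n"
    "hint: The candidates are:\n"
    "hint:   c0ffee1 commit 2019-11-08 - Initial commit\n"
    "hint:   c0ffee2 tag v3.2.0"
)

# The "hint: The candidates are:" header has only one space after the
# colon, so it never matches; only the candidate lines are captured.
candidates = [
    (m["rev"], m["type"], m["desc"]) for m in AMBIGUOUS_ERROR_REGEX.finditer(stderr)
]
```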
@@ -175,22 +476,114 @@ class Repo(RepoJSONMixin): """ for file_finder, name, is_pkg in pkgutil.iter_modules(path=[str(self.folder_path)]): if is_pkg: - curr_modules.append(Installable(location=self.folder_path / name)) - self.available_modules = curr_modules + curr_modules.append( + Installable(location=self.folder_path / name, repo=self, commit=self.commit) + ) + self.available_modules = tuple(curr_modules) - # noinspection PyTypeChecker - return tuple(self.available_modules) + return self.available_modules - async def _run(self, *args, **kwargs): + async def _run( + self, + *args: Any, + valid_exit_codes: Tuple[int, ...] = (0,), + debug_only: bool = False, + **kwargs: Any, + ) -> CompletedProcess: + """ + Parameters + ---------- + valid_exit_codes : tuple + Specifies valid exit codes, used to determine + if stderr should be sent as debug or error level in logging. + When not provided, defaults to :code:`(0,)` + debug_only : bool + Specifies if stderr can be sent only as debug level in logging. 
+ When not provided, defaults to `False` + """ env = os.environ.copy() env["GIT_TERMINAL_PROMPT"] = "0" kwargs["env"] = env async with self._repo_lock: - return await self._loop.run_in_executor( - self._executor, functools.partial(sp_run, *args, stdout=PIPE, **kwargs) + p: CompletedProcess = await self._loop.run_in_executor( + self._executor, + functools.partial(sp_run, *args, stdout=PIPE, stderr=PIPE, **kwargs), + ) + stderr = p.stderr.decode().strip() + if stderr: + if debug_only or p.returncode in valid_exit_codes: + log.debug(stderr) + else: + log.error(stderr) + return p + + async def _setup_repo(self) -> None: + self.commit = await self.current_commit() + self._read_info_file() + self._update_available_modules() + + async def _checkout(self, rev: Optional[str] = None, force_checkout: bool = False) -> None: + if rev is None: + return + if not force_checkout and self.commit == rev: + return + exists, __ = self._existing_git_repo() + if not exists: + raise errors.MissingGitRepo( + "A git repo does not exist at path: {}".format(self.folder_path) ) - async def clone(self) -> Tuple[str]: + p = await self._run( + ProcessFormatter().format(self.GIT_CHECKOUT, path=self.folder_path, rev=rev) + ) + + if p.returncode != 0: + raise errors.UnknownRevision( + "Could not checkout to {}. This revision may not exist".format(rev) + ) + + await self._setup_repo() + + def checkout( + self, + rev: Optional[str] = None, + *, + exit_to_rev: Optional[str] = None, + force_checkout: bool = False, + ) -> _RepoCheckoutCtxManager: + """ + Checks out repository to provided revision. + + The return value of this method can also be used as an asynchronous + context manager, i.e. with :code:`async with` syntax. This will + checkout repository to :code:`exit_to_rev` on exit of the context manager. 
+ + Parameters + ---------- + rev : str, optional + Revision to checkout to, when not provided, method won't do anything + exit_to_rev : str, optional + Revision to checkout to after exiting context manager, + when not provided, defaults to current commit + This will be ignored, when used with :code:`await` or when :code:`rev` is `None`. + force_checkout : bool + When `True` checkout will be done even + if :code:`self.commit` is the same as target hash + (applies to exiting context manager as well) + If provided revision isn't full sha1 hash, + checkout will be done no matter to this parameter. + Defaults to `False`. + + Raises + ------ + .UnknownRevision + When git cannot checkout to provided revision. + + """ + + return _RepoCheckoutCtxManager(self, rev, exit_to_rev, force_checkout) + + async def clone(self) -> Tuple[Installable, ...]: """Clone a new repo. Returns @@ -224,9 +617,9 @@ class Repo(RepoJSONMixin): if self.branch is None: self.branch = await self.current_branch() - self._read_info_file() + await self._setup_repo() - return self._update_available_modules() + return self.available_modules async def current_branch(self) -> str: """Determine the current branch using git commands. @@ -237,7 +630,7 @@ class Repo(RepoJSONMixin): The current branch name. """ - exists, _ = self._existing_git_repo() + exists, __ = self._existing_git_repo() if not exists: raise errors.MissingGitRepo( "A git repo does not exist at path: {}".format(self.folder_path) @@ -254,9 +647,33 @@ class Repo(RepoJSONMixin): return p.stdout.decode().strip() - async def current_commit(self, branch: str = None) -> str: + async def current_commit(self) -> str: """Determine the current commit hash of the repo. + Returns + ------- + str + The requested commit hash. 
+ + """ + exists, __ = self._existing_git_repo() + if not exists: + raise errors.MissingGitRepo( + "A git repo does not exist at path: {}".format(self.folder_path) + ) + + p = await self._run( + ProcessFormatter().format(self.GIT_CURRENT_COMMIT, path=self.folder_path) + ) + + if p.returncode != 0: + raise errors.CurrentHashError("Unable to determine commit hash.") + + return p.stdout.decode().strip() + + async def latest_commit(self, branch: Optional[str] = None) -> str: + """Determine the latest commit hash of the repo. + Parameters ---------- branch : `str`, optional @@ -271,7 +688,7 @@ class Repo(RepoJSONMixin): if branch is None: branch = self.branch - exists, _ = self._existing_git_repo() + exists, __ = self._existing_git_repo() if not exists: raise errors.MissingGitRepo( "A git repo does not exist at path: {}".format(self.folder_path) @@ -282,11 +699,11 @@ class Repo(RepoJSONMixin): ) if p.returncode != 0: - raise errors.CurrentHashError("Unable to determine old commit hash.") + raise errors.CurrentHashError("Unable to determine latest commit hash.") return p.stdout.decode().strip() - async def current_url(self, folder: Path = None) -> str: + async def current_url(self, folder: Optional[Path] = None) -> str: """ Discovers the FETCH URL for a Git repo. @@ -316,7 +733,7 @@ class Repo(RepoJSONMixin): return p.stdout.decode().strip() - async def hard_reset(self, branch: str = None) -> None: + async def hard_reset(self, branch: Optional[str] = None) -> None: """Perform a hard reset on the current repo. 
Parameters @@ -328,7 +745,8 @@ class Repo(RepoJSONMixin): if branch is None: branch = self.branch - exists, _ = self._existing_git_repo() + await self.checkout(branch) + exists, __ = self._existing_git_repo() if not exists: raise errors.MissingGitRepo( "A git repo does not exist at path: {}".format(self.folder_path) @@ -345,7 +763,7 @@ class Repo(RepoJSONMixin): " the following path: {}".format(self.folder_path) ) - async def update(self) -> (str, str): + async def update(self) -> Tuple[str, str]: """Update the current branch of this repo. Returns @@ -354,10 +772,9 @@ class Repo(RepoJSONMixin): :py:code`(old commit hash, new commit hash)` """ - curr_branch = await self.current_branch() - old_commit = await self.current_commit(branch=curr_branch) + old_commit = await self.latest_commit() - await self.hard_reset(branch=curr_branch) + await self.hard_reset() p = await self._run(ProcessFormatter().format(self.GIT_PULL, path=self.folder_path)) @@ -367,14 +784,11 @@ class Repo(RepoJSONMixin): " for the repo located at path: {}".format(self.folder_path) ) - new_commit = await self.current_commit(branch=curr_branch) + await self._setup_repo() - self._update_available_modules() - self._read_info_file() + return old_commit, self.commit - return old_commit, new_commit - - async def install_cog(self, cog: Installable, target_dir: Path) -> bool: + async def install_cog(self, cog: Installable, target_dir: Path) -> InstalledModule: """Install a cog to the target directory. Parameters @@ -386,8 +800,13 @@ class Repo(RepoJSONMixin): Returns ------- - bool - The success of the installation. + `InstalledModule` + Cog instance. + + Raises + ------ + .CopyingError + When cog couldn't be copied. 
""" if cog not in self.available_cogs: @@ -399,11 +818,14 @@ class Repo(RepoJSONMixin): if not target_dir.exists(): raise ValueError("That target directory does not exist.") - return await cog.copy_to(target_dir=target_dir) + if not await cog.copy_to(target_dir=target_dir): + raise errors.CopyingError("There was an issue during copying of cog's files") + + return InstalledModule.from_installable(cog) async def install_libraries( - self, target_dir: Path, req_target_dir: Path, libraries: Tuple[Installable] = () - ) -> bool: + self, target_dir: Path, req_target_dir: Path, libraries: Iterable[Installable] = () + ) -> Tuple[Tuple[InstalledModule, ...], Tuple[Installable, ...]]: """Install shared libraries to the target directory. If :code:`libraries` is not specified, all shared libraries in the repo @@ -420,26 +842,30 @@ class Repo(RepoJSONMixin): Returns ------- - bool - The success of the installation. + tuple + 2-tuple of installed and failed libraries. """ - if len(libraries) > 0: + + if libraries: if not all([i in self.available_libraries for i in libraries]): raise ValueError("Some given libraries are not available in this repo.") else: libraries = self.available_libraries - if len(libraries) > 0: - ret = True + if libraries: + installed = [] + failed = [] for lib in libraries: - ret = ( - ret - and await self.install_requirements(cog=lib, target_dir=req_target_dir) + if not ( + await self.install_requirements(cog=lib, target_dir=req_target_dir) and await lib.copy_to(target_dir=target_dir) - ) - return ret - return True + ): + failed.append(lib) + else: + installed.append(InstalledModule.from_installable(lib)) + return (tuple(installed), tuple(failed)) + return ((), ()) async def install_requirements(self, cog: Installable, target_dir: Path) -> bool: """Install a cog's requirements. 
@@ -466,7 +892,9 @@ class Repo(RepoJSONMixin): return await self.install_raw_requirements(cog.requirements, target_dir) - async def install_raw_requirements(self, requirements: Tuple[str], target_dir: Path) -> bool: + async def install_raw_requirements( + self, requirements: Iterable[str], target_dir: Path + ) -> bool: """Install a list of requirements using pip. Parameters @@ -482,7 +910,7 @@ class Repo(RepoJSONMixin): Success of the installation """ - if len(requirements) == 0: + if not requirements: return True # TODO: Check and see if any of these modules are already available @@ -503,7 +931,7 @@ class Repo(RepoJSONMixin): return True @property - def available_cogs(self) -> Tuple[Installable]: + def available_cogs(self) -> Tuple[Installable, ...]: """`tuple` of `installable` : All available cogs in this Repo. This excludes hidden or shared packages. @@ -514,7 +942,7 @@ class Repo(RepoJSONMixin): ) @property - def available_libraries(self) -> Tuple[Installable]: + def available_libraries(self) -> Tuple[Installable, ...]: """`tuple` of `installable` : All available shared libraries in this Repo. 
""" @@ -524,11 +952,14 @@ class Repo(RepoJSONMixin): ) @classmethod - async def from_folder(cls, folder: Path): - repo = cls(name=folder.stem, branch="", url="", folder_path=folder) - repo.branch = await repo.current_branch() + async def from_folder(cls, folder: Path, branch: str = "") -> Repo: + repo = cls(name=folder.stem, url="", branch=branch, commit="", folder_path=folder) repo.url = await repo.current_url() - repo._update_available_modules() + if branch == "": + repo.branch = await repo.current_branch() + repo._update_available_modules() + else: + await repo.checkout(repo.branch, force_checkout=True) return repo @@ -537,11 +968,13 @@ class RepoManager: GITHUB_OR_GITLAB_RE = re.compile(r"https?://git(?:hub)|(?:lab)\.com/") TREE_URL_RE = re.compile(r"(?P/tree)/(?P\S+)$") - def __init__(self): - self._repos = {} + def __init__(self) -> None: + self._repos: Dict[str, Repo] = {} + self.conf = Config.get_conf(self, identifier=170708480, force_registration=True) + self.conf.register_global(repos={}) - async def initialize(self): - await self._load_repos(set=True) + async def initialize(self) -> None: + await self._load_repos(set_repos=True) @property def repos_folder(self) -> Path: @@ -583,14 +1016,17 @@ class RepoManager: url, branch = self._parse_url(url, branch) # noinspection PyTypeChecker - r = Repo(url=url, name=name, branch=branch, folder_path=self.repos_folder / name) + r = Repo( + url=url, name=name, branch=branch, commit="", folder_path=self.repos_folder / name + ) await r.clone() + await self.conf.repos.set_raw(name, value=r.branch) self._repos[name] = r return r - def get_repo(self, name: str) -> Union[Repo, None]: + def get_repo(self, name: str) -> Optional[Repo]: """Get a Repo object for a repository. 
Parameters @@ -606,7 +1042,11 @@ class RepoManager: """ return self._repos.get(name, None) - def get_all_repo_names(self) -> Tuple[str]: + @property + def repos(self) -> Tuple[Repo, ...]: + return tuple(self._repos.values()) + + def get_all_repo_names(self) -> Tuple[str, ...]: """Get all repo names. Returns @@ -617,7 +1057,20 @@ # noinspection PyTypeChecker return tuple(self._repos.keys()) - async def delete_repo(self, name: str): + def get_all_cogs(self) -> Tuple[Installable, ...]: + """Get all cogs. + + Returns + ------- + `tuple` of `Installable` + + """ + all_cogs: List[Installable] = [] + for repo in self._repos.values(): + all_cogs += repo.available_cogs + return tuple(all_cogs) + + async def delete_repo(self, name: str) -> None: """Delete a repository and its folders. Parameters @@ -637,41 +1090,59 @@ safe_delete(repo.folder_path) + await self.conf.repos.clear_raw(repo.name) try: del self._repos[name] except KeyError: pass - async def update_repo(self, repo_name: str) -> MutableMapping[Repo, Tuple[str, str]]: + async def update_repo(self, repo_name: str) -> Tuple[Repo, Tuple[str, str]]: + """Update repo with provided name. + + Parameters + ---------- + repo_name : str + The name of the repository to update. + + Returns + ------- + Tuple[Repo, Tuple[str, str]] + A 2-`tuple` with the Repo object and a 2-`tuple` of `str` + containing old and new commit hashes. + + """ + repo = self._repos[repo_name] old, new = await repo.update() - return {repo: (old, new)} + return (repo, (old, new)) - async def update_all_repos(self) -> MutableMapping[Repo, Tuple[str, str]]: + async def update_all_repos(self) -> Dict[Repo, Tuple[str, str]]: """Call `Repo.update` on all repositories. Returns ------- - dict - A mapping of `Repo` objects that received new commits to a `tuple` - of `str` containing old and new commit hashes.
+ Dict[Repo, Tuple[str, str]] + A mapping of `Repo` objects that received new commits to + a 2-`tuple` of `str` containing old and new commit hashes. """ ret = {} - for repo_name, _ in self._repos.items(): - repo, (old, new) = (await self.update_repo(repo_name)).popitem() + for repo_name, __ in self._repos.items(): + repo, (old, new) = await self.update_repo(repo_name) if old != new: ret[repo] = (old, new) return ret - async def _load_repos(self, set=False) -> MutableMapping[str, Repo]: + async def _load_repos(self, set_repos: bool = False) -> Dict[str, Repo]: ret = {} self.repos_folder.mkdir(parents=True, exist_ok=True) for folder in self.repos_folder.iterdir(): if not folder.is_dir(): continue try: - ret[folder.stem] = await Repo.from_folder(folder) + branch = await self.conf.repos.get_raw(folder.stem, default="") + ret[folder.stem] = await Repo.from_folder(folder, branch) + if branch == "": + await self.conf.repos.set_raw(folder.stem, value=ret[folder.stem].branch) except errors.NoRemoteURL: log.warning("A remote URL does not exist for repo %s", folder.stem) except errors.DownloaderException as err: @@ -683,7 +1154,7 @@ class RepoManager: ), ) - if set: + if set_repos: self._repos = ret return ret diff --git a/redbot/pytest/.gitattributes b/redbot/pytest/.gitattributes new file mode 100644 index 000000000..31b7e1c61 --- /dev/null +++ b/redbot/pytest/.gitattributes @@ -0,0 +1 @@ +downloader_testrepo.export -text \ No newline at end of file diff --git a/redbot/pytest/downloader.py b/redbot/pytest/downloader.py index 4c6e7d342..0ac043791 100644 --- a/redbot/pytest/downloader.py +++ b/redbot/pytest/downloader.py @@ -1,39 +1,43 @@ from collections import namedtuple from pathlib import Path import json +import subprocess as sp +import shutil import pytest -from redbot.cogs.downloader.repo_manager import RepoManager, Repo -from redbot.cogs.downloader.installable import Installable +from redbot.cogs.downloader.repo_manager import RepoManager, Repo, ProcessFormatter 
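`update_all_repos()` now returns a plain `Dict[Repo, Tuple[str, str]]` and only keeps repos whose commit hash actually moved. That filtering step can be sketched in isolation; plain strings stand in for `Repo` objects and commit hashes here:

```python
# Standalone sketch of the old != new filter inside update_all_repos();
# strings stand in for Repo objects and commit hashes.
from typing import Dict, Tuple


def filter_changed(results: Dict[str, Tuple[str, str]]) -> Dict[str, Tuple[str, str]]:
    # Keep only entries whose commit hash changed during the update.
    return {repo: (old, new) for repo, (old, new) in results.items() if old != new}


updates = filter_changed(
    {"squid": ("c950fc0", "fb99eb7"), "unchanged": ("a0ccc23", "a0ccc23")}
)
print(updates)  # only the repo that received new commits survives
```

This is what lets `[p]cog update` skip repos that did not change rather than diffing every cog in every repo.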
+from redbot.cogs.downloader.installable import Installable, InstalledModule __all__ = [ "patch_relative_to", "repo_manager", "repo", - "repo_norun", "bot_repo", "INFO_JSON", "LIBRARY_INFO_JSON", "installable", + "installed_cog", "library_installable", "fake_run_noprint", + "fake_current_commit", + "_session_git_repo", + "git_repo", + "cloned_git_repo", + "git_repo_with_remote", ] -async def fake_run(*args, **kwargs): - fake_result_tuple = namedtuple("fake_result", "returncode result") - res = fake_result_tuple(0, (args, kwargs)) - print(args[0]) - return res - - async def fake_run_noprint(*args, **kwargs): fake_result_tuple = namedtuple("fake_result", "returncode result") res = fake_result_tuple(0, (args, kwargs)) return res +async def fake_current_commit(*args, **kwargs): + return "fake_result" + + @pytest.fixture(scope="module", autouse=True) def patch_relative_to(monkeysession): def fake_relative_to(self, some_path: Path): @@ -50,30 +54,26 @@ def repo_manager(tmpdir_factory): @pytest.fixture -def repo(tmpdir): - repo_folder = Path(str(tmpdir)) / "repos" / "squid" +def repo(tmp_path): + repo_folder = tmp_path / "repos" / "squid" repo_folder.mkdir(parents=True, exist_ok=True) return Repo( url="https://github.com/tekulvw/Squid-Plugins", name="squid", branch="rewrite_cogs", + commit="6acb5decbb717932e5dc0cda7fca0eff452c47dd", folder_path=repo_folder, ) -@pytest.fixture -def repo_norun(repo): - repo._run = fake_run - return repo - - @pytest.fixture def bot_repo(event_loop): cwd = Path.cwd() return Repo( name="Red-DiscordBot", branch="WRONG", + commit="", url="https://empty.com/something.git", folder_path=cwd, loop=event_loop, @@ -120,6 +120,16 @@ def installable(tmpdir): return cog_info +@pytest.fixture +def installed_cog(tmpdir): + cog_path = tmpdir.mkdir("test_repo").mkdir("test_installed_cog") + info_path = cog_path.join("info.json") + info_path.write_text(json.dumps(INFO_JSON), "utf-8") + + cog_info = InstalledModule(Path(str(cog_path))) + return cog_info + + 
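The `fake_run_noprint` and `fake_current_commit` fakes in this fixtures module are defined as coroutines because the patched call sites `await` them; a plain function returning a value would fail there. A standalone demonstration of that pattern, where `FakeRepo` is an illustrative stand-in rather than part of Downloader:

```python
# Why async methods must be faked with coroutines: the caller awaits the
# result. FakeRepo is an illustrative stand-in for Repo.
import asyncio


class FakeRepo:
    async def current_commit(self):
        return "real result"


async def fake_current_commit(*args, **kwargs):
    return "fake_result"


repo = FakeRepo()
# monkeypatch-style replacement on the instance; *args absorbs any call shape
repo.current_commit = fake_current_commit
print(asyncio.run(repo.current_commit()))  # fake_result
```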
@pytest.fixture def library_installable(tmpdir): lib_path = tmpdir.mkdir("test_repo").mkdir("test_lib") @@ -128,3 +138,93 @@ cog_info = Installable(Path(str(lib_path))) return cog_info + + +# Git +TEST_REPO_EXPORT_PTH: Path = Path(__file__).parent / "downloader_testrepo.export" + + +def _init_test_repo(destination: Path): + # copied from tools/edit_testrepo.py + git_dirparams = ("git", "-C", str(destination)) + init_commands = ( + (*git_dirparams, "init"), + (*git_dirparams, "config", "--local", "user.name", "Cog-Creators"), + (*git_dirparams, "config", "--local", "user.email", "cog-creators@example.org"), + (*git_dirparams, "config", "--local", "commit.gpgSign", "false"), + ) + + for args in init_commands: + sp.run(args, check=True) + return git_dirparams + + +@pytest.fixture(scope="session") +async def _session_git_repo(tmp_path_factory, event_loop): + # we will import the repo only once per session and duplicate the repo folder + repo_path = tmp_path_factory.mktemp("session_git_repo") + repo = Repo( + name="redbot-testrepo", + url="", + branch="master", + commit="", + folder_path=repo_path, + loop=event_loop, + ) + git_dirparams = _init_test_repo(repo_path) + fast_import = sp.Popen((*git_dirparams, "fast-import", "--quiet"), stdin=sp.PIPE) + with TEST_REPO_EXPORT_PTH.open(mode="rb") as f: + fast_import.communicate(f.read()) + return_code = fast_import.wait() + if return_code: + raise Exception(f"git fast-import failed with code {return_code}") + sp.run((*git_dirparams, "reset", "--hard")) + return repo + + +@pytest.fixture +async def git_repo(_session_git_repo, tmp_path, event_loop): + # this fixture only copies the repo that was imported in _session_git_repo + repo_path = tmp_path / "redbot-testrepo" + shutil.copytree(_session_git_repo.folder_path, repo_path) + repo = Repo( + name="redbot-testrepo", + url=_session_git_repo.url, + branch=_session_git_repo.branch, + commit=_session_git_repo.commit, + folder_path=repo_path,
loop=event_loop, + ) + return repo + + +@pytest.fixture +async def cloned_git_repo(_session_git_repo, tmp_path, event_loop): + # don't use this if you want to edit origin repo + repo_path = tmp_path / "redbot-cloned_testrepo" + repo = Repo( + name="redbot-testrepo", + url=str(_session_git_repo.folder_path), + branch=_session_git_repo.branch, + commit=_session_git_repo.commit, + folder_path=repo_path, + loop=event_loop, + ) + sp.run(("git", "clone", str(_session_git_repo.folder_path), str(repo_path)), check=True) + return repo + + +@pytest.fixture +async def git_repo_with_remote(git_repo, tmp_path, event_loop): + # this can safely be used when you want to do changes to origin repo + repo_path = tmp_path / "redbot-testrepo_with_remote" + repo = Repo( + name="redbot-testrepo", + url=str(git_repo.folder_path), + branch=git_repo.branch, + commit=git_repo.commit, + folder_path=repo_path, + loop=event_loop, + ) + sp.run(("git", "clone", str(git_repo.folder_path), str(repo_path)), check=True) + return repo diff --git a/redbot/pytest/downloader_testrepo.export b/redbot/pytest/downloader_testrepo.export new file mode 100644 index 000000000..5e27ae041 --- /dev/null +++ b/redbot/pytest/downloader_testrepo.export @@ -0,0 +1,134 @@ +# THIS FILE SHOULDN'T BE EDITED MANUALLY. USE `edit_testrepo.py` TOOL TO UPDATE THE REPO. +blob +mark :1 +original-oid cfd75093008a560c1f2a09e5068e0dd1517eaa1c +data 14 +Sample file 1. +reset refs/heads/ambiguous_with_tag +commit refs/heads/ambiguous_with_tag +mark :2 +original-oid c6f0e5ec04d99bdf8c6c78ff20d66d286eecb3ea +author Cog-Creators 1571921830 +0200 +committer Cog-Creators 1571919491 +0200 +data 27 +Commit ambiguous with tag. 
+M 100644 :1 sample_file1.txt + +reset refs/heads/ambiguous_1 +commit refs/heads/ambiguous_1 +mark :3 +original-oid 95da0b576271cb5bee5f3e075074c03ee05fed05 +author Cog-Creators 1571777704 +0200 +committer Cog-Creators 1571777704 +0200 +data 23 +Ambiguous commit 16955 +M 100644 :1 sample_file1.txt + +reset refs/heads/ambiguous_2 +commit refs/heads/ambiguous_2 +mark :4 +original-oid 95da0b57a416d9c8ce950554228d1fc195c30b43 +author Cog-Creators 1571777704 +0200 +committer Cog-Creators 1571777704 +0200 +data 23 +Ambiguous commit 44414 +M 100644 :1 sample_file1.txt + +blob +mark :5 +original-oid f1a18139c84a82addbded8a7b5738c36fb02fce1 +data 22 +print("Hello world!") + +blob +mark :6 +original-oid 1abb7a2470722faee2175980ee202717b4158057 +data 14 +Sample file 2. +reset refs/tags/lightweight +commit refs/tags/lightweight +mark :7 +original-oid c950fc05a540dd76b944719c2a3302da2e2f3090 +author Cog-Creators 1571776887 +0200 +committer Cog-Creators 1571777047 +0200 +data 31 +Initial commit, prepare files. +M 100644 :5 mycog/__init__.py +M 100644 :1 sample_file1.txt +M 100644 :6 sample_file2.txt + +blob +mark :8 +original-oid 10ec5813415b6d7c902eee95cc13dc38c6f50917 +data 11 +Added file. +blob +mark :9 +original-oid 5ed17bf7914989db85f2e66045e62b35eed10f3b +data 42 +def setup(bot): + print("Hello world!") + +commit refs/tags/lightweight +mark :10 +original-oid fb99eb7d2d5bed514efc98fe6686b368f8425745 +author Cog-Creators 1571777140 +0200 +committer Cog-Creators 1571777140 +0200 +data 39 +Add, modify, rename and remove a file. +from :7 +M 100644 :8 added_file.txt +M 100644 :9 mycog/__init__.py +D sample_file1.txt +D sample_file2.txt +M 100644 :6 sample_file3.txt + +commit refs/tags/annotated +mark :11 +original-oid a7120330cc179396914e0d6af80cfa282adc124b +author Cog-Creators 1571777209 +0200 +committer Cog-Creators 1571777209 +0200 +data 14 +Remove mycog. 
+from :10 +D mycog/__init__.py + +blob +mark :12 +original-oid 1ba9a868ae2f65571c75681ec47d40595bea4882 +data 14 +Sample file 4. +commit refs/heads/master +mark :13 +original-oid 2db662c1d341b1db7d225ccc1af4019ba5228c70 +author Cog-Creators 1571777704 +0200 +committer Cog-Creators 1571777704 +0200 +data 32 +One commit after mycog removal. +from :11 +M 100644 :12 sample_file4.txt + +reset refs/heads/dont_add_commits +commit refs/heads/dont_add_commits +mark :14 +original-oid a0ccc2390883c85a361f5a90c72e1b07958939fa +author Cog-Creators 1571777548 +0200 +committer Cog-Creators 1571777548 +0200 +data 103 +Don't edit this, this is used for tests for current commit, latest commit, full sha1 from branch name. +M 100644 :1 sample_file1.txt + +tag annotated +from :11 +original-oid 41f6cf3b58e774d2b3414ced3ee9f2541f1c682f +tagger Cog-Creators 1571777367 +0200 +data 15 +Annotated tag. + +tag ambiguous_tag_66387 +from :2 +original-oid c6f028f843389c850e2c20d8dd1f5fa498252764 +tagger Cog-Creators 1571919491 +0200 +data 37 +Annotated tag ambiguous with commit. + diff --git a/redbot/pytest/downloader_testrepo.md b/redbot/pytest/downloader_testrepo.md new file mode 100644 index 000000000..77571a4d6 --- /dev/null +++ b/redbot/pytest/downloader_testrepo.md @@ -0,0 +1,102 @@ +# Downloader's test repo reference + +This file can be used as a reference for what the repo contains +if a dev wants to add more tests in the future. + +Branch master +--- + +**Commit:** c950fc05a540dd76b944719c2a3302da2e2f3090 +**Commit message:** Initial commit, prepare files. +**Tree status:** +``` +downloader_testrepo/ + ├── mycog +A │ ├── __init__.py +A ├── sample_file1.txt +A └── sample_file2.txt +``` +--- +**Commit:** fb99eb7d2d5bed514efc98fe6686b368f8425745 +**Tag:** lightweight +**Commit message:** Add, modify, rename and remove a file.
+**Tree status:** +``` +downloader_testrepo/ + ├── mycog/ +M │ ├── __init__.py +A ├── added_file.txt +D ├── sample_file1.txt +R └── sample_file2.txt -> sample_file3.txt +``` +--- +**Commit:** a7120330cc179396914e0d6af80cfa282adc124b +**Tag:** annotated (sha1: 41f6cf3b58e774d2b3414ced3ee9f2541f1c682f) +**Commit message:** Remove mycog. +**Tree status:** +``` +downloader_testrepo/ +D ├── mycog/ +D │ ├── __init__.py + ├── added_file.txt + └── sample_file3.txt +``` +--- +**Commit:** 2db662c1d341b1db7d225ccc1af4019ba5228c70 +**Commit message:** One commit after mycog removal. +**Tree status:** +``` +downloader_testrepo/ + ├── added_file.txt + ├── sample_file3.txt +A └── sample_file4.txt +``` + +Branch with persistent HEAD +--- + +**Commit:** a0ccc2390883c85a361f5a90c72e1b07958939fa +**Branch:** dont_add_commits +**Commit message:** Don't edit this, this is used for tests for current commit, latest commit, full sha1 from branch name. +**Tree status:** +``` +downloader_testrepo/ +A └── sample_file1.txt +``` + +Branches with ambiguous commits (95da0b57) +--- + +**Commit:** 95da0b576271cb5bee5f3e075074c03ee05fed05 +**Branch:** ambiguous_1 +**Commit message:** Ambiguous commit 16955 +**Tree status:** +``` +downloader_testrepo/ +A └── sample_file1.txt +``` + + +**Commit:** 95da0b57a416d9c8ce950554228d1fc195c30b43 +**Branch:** ambiguous_2 +**Commit message:** Ambiguous commit 44414 +**Tree status:** +``` +downloader_testrepo/ +A └── sample_file1.txt +``` + + +Branch with ambiguous tag (c6f0) +--- + +**Commit:** c6f0e5ec04d99bdf8c6c78ff20d66d286eecb3ea +**Branch:** ambiguous_with_tag +**Tag:** ambiguous_tag_66387 (sha1: c6f028f843389c850e2c20d8dd1f5fa498252764) +**Commit message:** Commit ambiguous with tag. 
+**Tree status:** + +``` +downloader_testrepo/ +A └── sample_file1.txt +``` \ No newline at end of file diff --git a/setup.cfg b/setup.cfg index 929144bab..cec4be2b5 100644 --- a/setup.cfg +++ b/setup.cfg @@ -106,6 +106,7 @@ test = pyparsing==2.4.2 pytest==5.1.2 pytest-asyncio==0.10.0 + pytest-mock==1.11.2 six==1.12.0 typed-ast==1.4.0 wcwidth==0.1.7 @@ -131,5 +132,6 @@ include = **/locales/*.po data/* data/**/* + *.export redbot.core.drivers.postgres = *.sql diff --git a/tests/cogs/downloader/test_downloader.py b/tests/cogs/downloader/test_downloader.py index 57186baaa..29d49a7e7 100644 --- a/tests/cogs/downloader/test_downloader.py +++ b/tests/cogs/downloader/test_downloader.py @@ -1,41 +1,339 @@ +import asyncio import pathlib from collections import namedtuple +from typing import Any, NamedTuple from pathlib import Path import pytest -from unittest.mock import MagicMock +from pytest_mock import MockFixture from redbot.pytest.downloader import * -from redbot.cogs.downloader.repo_manager import RepoManager, Repo -from redbot.cogs.downloader.errors import ExistingGitRepo +from redbot.cogs.downloader.repo_manager import Installable +from redbot.cogs.downloader.repo_manager import Candidate, ProcessFormatter, RepoManager, Repo +from redbot.cogs.downloader.errors import ( + AmbiguousRevision, + ExistingGitRepo, + GitException, + UnknownRevision, +) -def test_existing_git_repo(tmpdir): - repo_folder = Path(str(tmpdir)) / "repos" / "squid" / ".git" +class FakeCompletedProcess(NamedTuple): + returncode: int + stdout: bytes = b"" + stderr: bytes = b"" + + +async def async_return(ret: Any): + return ret + + +def _mock_run( + mocker: MockFixture, repo: Repo, returncode: int, stdout: bytes = b"", stderr: bytes = b"" +): + return mocker.patch.object( + repo, + "_run", + autospec=True, + return_value=async_return(FakeCompletedProcess(returncode, stdout, stderr)), + ) + + +def _mock_setup_repo(mocker: MockFixture, repo: Repo, commit: str): + def update_commit(*args, **kwargs): + 
repo.commit = commit + return mocker.DEFAULT + + return mocker.patch.object( + repo, + "_setup_repo", + autospec=True, + side_effect=update_commit, + return_value=async_return(None), + ) + + +def test_existing_git_repo(tmp_path): + repo_folder = tmp_path / "repos" / "squid" / ".git" repo_folder.mkdir(parents=True, exist_ok=True) r = Repo( url="https://github.com/tekulvw/Squid-Plugins", name="squid", branch="rewrite_cogs", + commit="6acb5decbb717932e5dc0cda7fca0eff452c47dd", folder_path=repo_folder.parent, ) - exists, _ = r._existing_git_repo() + exists, git_path = r._existing_git_repo() assert exists is True + assert git_path == repo_folder + + +ancestor_rev = "c950fc05a540dd76b944719c2a3302da2e2f3090" +descendant_rev = "fb99eb7d2d5bed514efc98fe6686b368f8425745" + + +@pytest.mark.asyncio +@pytest.mark.parametrize( + "maybe_ancestor_rev,descendant_rev,returncode,expected", + [(ancestor_rev, descendant_rev, 0, True), (descendant_rev, ancestor_rev, 1, False)], +) +async def test_is_ancestor(mocker, repo, maybe_ancestor_rev, descendant_rev, returncode, expected): + m = _mock_run(mocker, repo, returncode) + ret = await repo.is_ancestor(maybe_ancestor_rev, descendant_rev) + m.assert_called_once_with( + ProcessFormatter().format( + repo.GIT_IS_ANCESTOR, + path=repo.folder_path, + maybe_ancestor_rev=maybe_ancestor_rev, + descendant_rev=descendant_rev, + ), + valid_exit_codes=(0, 1), + ) + assert ret is expected + + +@pytest.mark.asyncio +async def test_is_ancestor_raise(mocker, repo): + m = _mock_run(mocker, repo, 128) + with pytest.raises(GitException): + await repo.is_ancestor("invalid1", "invalid2") + + m.assert_called_once_with( + ProcessFormatter().format( + repo.GIT_IS_ANCESTOR, + path=repo.folder_path, + maybe_ancestor_rev="invalid1", + descendant_rev="invalid2", + ), + valid_exit_codes=(0, 1), + ) + + +@pytest.mark.asyncio +async def test_get_file_update_statuses(mocker, repo): + old_rev = "c950fc05a540dd76b944719c2a3302da2e2f3090" + new_rev = 
"fb99eb7d2d5bed514efc98fe6686b368f8425745" + m = _mock_run( + mocker, + repo, + 0, + b"A\x00added_file.txt\x00\t" + b"M\x00mycog/__init__.py\x00\t" + b"D\x00sample_file1.txt\x00\t" + b"D\x00sample_file2.txt\x00\t" + b"A\x00sample_file3.txt", + ) + ret = await repo._get_file_update_statuses(old_rev, new_rev) + m.assert_called_once_with( + ProcessFormatter().format( + repo.GIT_DIFF_FILE_STATUS, path=repo.folder_path, old_rev=old_rev, new_rev=new_rev + ) + ) + + assert ret == { + "added_file.txt": "A", + "mycog/__init__.py": "M", + "sample_file1.txt": "D", + "sample_file2.txt": "D", + "sample_file3.txt": "A", + } + + +@pytest.mark.asyncio +async def test_is_module_modified(mocker, repo): + old_rev = "c950fc05a540dd76b944719c2a3302da2e2f3090" + new_rev = "fb99eb7d2d5bed514efc98fe6686b368f8425745" + FakeInstallable = namedtuple("Installable", "name commit") + module = FakeInstallable("mycog", new_rev) + m = mocker.patch.object( + repo, + "_get_file_update_statuses", + autospec=True, + return_value=async_return( + { + "added_file.txt": "A", + "mycog/__init__.py": "M", + "sample_file1.txt": "D", + "sample_file2.txt": "D", + "sample_file3.txt": "A", + } + ), + ) + ret = await repo._is_module_modified(module, old_rev) + m.assert_called_once_with(old_rev, new_rev) + + assert ret is True + + +@pytest.mark.asyncio +async def test_get_full_sha1_success(mocker, repo): + commit = "c950fc05a540dd76b944719c2a3302da2e2f3090" + m = _mock_run(mocker, repo, 0, commit.encode()) + ret = await repo.get_full_sha1(commit) + m.assert_called_once_with( + ProcessFormatter().format(repo.GIT_GET_FULL_SHA1, path=repo.folder_path, rev=commit) + ) + + assert ret == commit + + +@pytest.mark.asyncio +async def test_get_full_sha1_notfound(mocker, repo): + m = _mock_run(mocker, repo, 128, b"", b"fatal: Needed a single revision") + with pytest.raises(UnknownRevision): + await repo.get_full_sha1("invalid") + m.assert_called_once_with( + ProcessFormatter().format(repo.GIT_GET_FULL_SHA1, 
path=repo.folder_path, rev="invalid") + ) + + +@pytest.mark.asyncio +async def test_get_full_sha1_ambiguous(mocker, repo): + m = _mock_run( + mocker, + repo, + 128, + b"", + b"error: short SHA1 c6f0 is ambiguous\n" + b"hint: The candidates are:\n" + b"hint: c6f028f tag ambiguous_tag_66387\n" + b"hint: c6f0e5e commit 2019-10-24 - Commit ambiguous with tag.\n" + b"fatal: Needed a single revision", + ) + with pytest.raises(AmbiguousRevision) as exc_info: + await repo.get_full_sha1("c6f0") + m.assert_called_once_with( + ProcessFormatter().format(repo.GIT_GET_FULL_SHA1, path=repo.folder_path, rev="c6f0") + ) + + assert exc_info.value.candidates == [ + Candidate("c6f028f", "tag", "ambiguous_tag_66387"), + Candidate("c6f0e5e", "commit", "2019-10-24 - Commit ambiguous with tag."), + ] + + +def test_update_available_modules(repo): + module = repo.folder_path / "mycog" / "__init__.py" + submodule = module.parent / "submodule" / "__init__.py" + module.parent.mkdir(parents=True) + module.touch() + submodule.parent.mkdir() + submodule.touch() + ret = repo._update_available_modules() + assert ( + ret + == repo.available_modules + == (Installable(location=module.parent, repo=repo, commit=repo.commit),) + ) + + +@pytest.mark.asyncio +async def test_checkout(mocker, repo): + commit = "c950fc05a540dd76b944719c2a3302da2e2f3090" + m = _mock_run(mocker, repo, 0) + _mock_setup_repo(mocker, repo, commit) + git_path = repo.folder_path / ".git" + git_path.mkdir() + await repo._checkout(commit) + + assert repo.commit == commit + m.assert_called_once_with( + ProcessFormatter().format(repo.GIT_CHECKOUT, path=repo.folder_path, rev=commit) + ) + + +@pytest.mark.asyncio +async def test_checkout_ctx_manager(mocker, repo): + commit = "c950fc05a540dd76b944719c2a3302da2e2f3090" + m = mocker.patch.object(repo, "_checkout", autospec=True, return_value=async_return(None)) + old_commit = repo.commit + async with repo.checkout(commit): + m.assert_called_with(commit, force_checkout=False) + m.return_value 
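git reports an ambiguous short SHA1 through `hint:` lines on stderr, as mocked in `test_get_full_sha1_ambiguous`. A hedged sketch of pulling candidates out of that format; `Candidate` here is a local namedtuple standing in for the one imported from `repo_manager`, whose real field names may differ:

```python
# Illustrative parser for git's "short SHA1 ... is ambiguous" stderr hints;
# Candidate is a local stand-in, not Downloader's actual class.
from collections import namedtuple

Candidate = namedtuple("Candidate", "rev object_type description")


def parse_ambiguous_hints(stderr: bytes):
    candidates = []
    for line in stderr.decode().splitlines():
        parts = line.split(maxsplit=3)
        # candidate hints look like: "hint: <short sha> <type> <description>"
        if parts[:1] == ["hint:"] and len(parts) == 4 and parts[1] != "The":
            candidates.append(Candidate(parts[1], parts[2], parts[3]))
    return candidates


stderr = (
    b"error: short SHA1 c6f0 is ambiguous\n"
    b"hint: The candidates are:\n"
    b"hint: c6f028f tag ambiguous_tag_66387\n"
    b"fatal: Needed a single revision"
)
print(parse_ambiguous_hints(stderr))
```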
= async_return(None) + + m.assert_called_with(old_commit, force_checkout=False) + + +@pytest.mark.asyncio +async def test_checkout_await(mocker, repo): + commit = "c950fc05a540dd76b944719c2a3302da2e2f3090" + m = mocker.patch.object(repo, "_checkout", autospec=True, return_value=async_return(None)) + await repo.checkout(commit) + + m.assert_called_once_with(commit, force_checkout=False) + + +@pytest.mark.asyncio +async def test_clone_with_branch(mocker, repo): + branch = repo.branch = "dont_add_commits" + commit = "a0ccc2390883c85a361f5a90c72e1b07958939fa" + repo.commit = "" + m = _mock_run(mocker, repo, 0) + _mock_setup_repo(mocker, repo, commit) + + await repo.clone() + + assert repo.commit == commit + m.assert_called_once_with( + ProcessFormatter().format( + repo.GIT_CLONE, branch=branch, url=repo.url, folder=repo.folder_path + ) + ) + + +@pytest.mark.asyncio +async def test_clone_without_branch(mocker, repo): + branch = "dont_add_commits" + commit = "a0ccc2390883c85a361f5a90c72e1b07958939fa" + repo.branch = None + repo.commit = "" + m = _mock_run(mocker, repo, 0) + _mock_setup_repo(mocker, repo, commit) + mocker.patch.object(repo, "current_branch", autospec=True, return_value=async_return(branch)) + + await repo.clone() + + assert repo.commit == commit + m.assert_called_once_with( + ProcessFormatter().format(repo.GIT_CLONE_NO_BRANCH, url=repo.url, folder=repo.folder_path) + ) + + +@pytest.mark.asyncio +async def test_update(mocker, repo): + old_commit = repo.commit + new_commit = "a0ccc2390883c85a361f5a90c72e1b07958939fa" + m = _mock_run(mocker, repo, 0) + _mock_setup_repo(mocker, repo, new_commit) + mocker.patch.object( + repo, "latest_commit", autospec=True, return_value=async_return(old_commit) + ) + mocker.patch.object(repo, "hard_reset", autospec=True, return_value=async_return(None)) + ret = await repo.update() + + assert ret == (old_commit, new_commit) + m.assert_called_once_with(ProcessFormatter().format(repo.GIT_PULL, path=repo.folder_path)) + + +# old 
tests @pytest.mark.asyncio async def test_add_repo(monkeypatch, repo_manager): monkeypatch.setattr("redbot.cogs.downloader.repo_manager.Repo._run", fake_run_noprint) + monkeypatch.setattr( + "redbot.cogs.downloader.repo_manager.Repo.current_commit", fake_current_commit + ) squid = await repo_manager.add_repo( url="https://github.com/tekulvw/Squid-Plugins", name="squid", branch="rewrite_cogs" ) - assert squid.available_modules == [] + assert squid.available_modules == () @pytest.mark.asyncio @@ -49,14 +347,20 @@ async def test_lib_install_requirements(monkeypatch, library_installable, repo, sharedlib_path = lib_path / "cog_shared" sharedlib_path.mkdir(parents=True, exist_ok=True) - result = await repo.install_libraries(target_dir=sharedlib_path, req_target_dir=lib_path) + installed, failed = await repo.install_libraries( + target_dir=sharedlib_path, req_target_dir=lib_path + ) - assert result is True + assert len(installed) == 1 + assert len(failed) == 0 @pytest.mark.asyncio async def test_remove_repo(monkeypatch, repo_manager): monkeypatch.setattr("redbot.cogs.downloader.repo_manager.Repo._run", fake_run_noprint) + monkeypatch.setattr( + "redbot.cogs.downloader.repo_manager.Repo.current_commit", fake_current_commit + ) await repo_manager.add_repo( url="https://github.com/tekulvw/Squid-Plugins", name="squid", branch="rewrite_cogs" @@ -67,17 +371,8 @@ async def test_remove_repo(monkeypatch, repo_manager): @pytest.mark.asyncio -async def test_current_branch(bot_repo): - branch = await bot_repo.current_branch() - - # So this does work, just not sure how to fully automate the test - - assert branch not in ("WRONG", "") - - -@pytest.mark.asyncio -async def test_existing_repo(repo_manager): - repo_manager.does_repo_exist = MagicMock(return_value=True) +async def test_existing_repo(mocker, repo_manager): + repo_manager.does_repo_exist = mocker.MagicMock(return_value=True) with pytest.raises(ExistingGitRepo): await repo_manager.add_repo("http://test.com", "test") diff --git 
a/tests/cogs/downloader/test_git.py b/tests/cogs/downloader/test_git.py new file mode 100644 index 000000000..075b6e783 --- /dev/null +++ b/tests/cogs/downloader/test_git.py @@ -0,0 +1,452 @@ +from pathlib import Path +import subprocess as sp + +import pytest + +from redbot.cogs.downloader.repo_manager import ProcessFormatter, Repo +from redbot.pytest.downloader import ( + cloned_git_repo, + git_repo, + git_repo_with_remote, + _session_git_repo, +) + + +@pytest.mark.asyncio +async def test_git_clone_nobranch(git_repo, tmp_path): + p = await git_repo._run( + ProcessFormatter().format( + git_repo.GIT_CLONE_NO_BRANCH, + url=git_repo.folder_path, + folder=tmp_path / "cloned_repo_test", + ) + ) + assert p.returncode == 0 + + +@pytest.mark.asyncio +async def test_git_clone_branch(git_repo, tmp_path): + p = await git_repo._run( + ProcessFormatter().format( + git_repo.GIT_CLONE, + branch="master", + url=git_repo.folder_path, + folder=tmp_path / "cloned_repo_test", + ) + ) + assert p.returncode == 0 + + +@pytest.mark.asyncio +async def test_git_clone_non_existent_branch(git_repo, tmp_path): + p = await git_repo._run( + ProcessFormatter().format( + git_repo.GIT_CLONE, + branch="non-existent-branch", + url=git_repo.folder_path, + folder=tmp_path / "cloned_repo_test", + ) + ) + assert p.returncode == 128 + + +@pytest.mark.asyncio +async def test_git_clone_notgit_repo(git_repo, tmp_path): + notgit_repo = tmp_path / "test_clone_folder" + p = await git_repo._run( + ProcessFormatter().format( + git_repo.GIT_CLONE, branch=None, url=notgit_repo, folder=tmp_path / "cloned_repo_test" + ) + ) + assert p.returncode == 128 + + +@pytest.mark.asyncio +async def test_git_current_branch_master(git_repo): + p = await git_repo._run( + ProcessFormatter().format(git_repo.GIT_CURRENT_BRANCH, path=git_repo.folder_path) + ) + assert p.returncode == 0 + assert p.stdout.decode().strip() == "master" + + +@pytest.mark.asyncio +async def test_git_current_branch_detached(git_repo): + await git_repo._run( 
+ ProcessFormatter().format( + git_repo.GIT_CHECKOUT, + path=git_repo.folder_path, + rev="c950fc05a540dd76b944719c2a3302da2e2f3090", + ) + ) + p = await git_repo._run( + ProcessFormatter().format(git_repo.GIT_CURRENT_BRANCH, path=git_repo.folder_path) + ) + assert p.returncode == 128 + assert p.stderr.decode().strip() == "fatal: ref HEAD is not a symbolic ref" + + +@pytest.mark.asyncio +async def test_git_current_commit_on_branch(git_repo): + # HEAD on dont_add_commits (a0ccc2390883c85a361f5a90c72e1b07958939fa) + # setup + p = await git_repo._run( + ProcessFormatter().format( + git_repo.GIT_CHECKOUT, path=git_repo.folder_path, rev="dont_add_commits" + ) + ) + assert p.returncode == 0 + + p = await git_repo._run( + ProcessFormatter().format(git_repo.GIT_CURRENT_COMMIT, path=git_repo.folder_path) + ) + assert p.returncode == 0 + assert p.stdout.decode().strip() == "a0ccc2390883c85a361f5a90c72e1b07958939fa" + + +@pytest.mark.asyncio +async def test_git_current_commit_detached(git_repo): + # detached HEAD state (c950fc05a540dd76b944719c2a3302da2e2f3090) + await git_repo._run( + ProcessFormatter().format( + git_repo.GIT_CHECKOUT, + path=git_repo.folder_path, + rev="c950fc05a540dd76b944719c2a3302da2e2f3090", + ) + ) + p = await git_repo._run( + ProcessFormatter().format(git_repo.GIT_CURRENT_COMMIT, path=git_repo.folder_path) + ) + assert p.returncode == 0 + assert p.stdout.decode().strip() == "c950fc05a540dd76b944719c2a3302da2e2f3090" + + +@pytest.mark.asyncio +async def test_git_latest_commit(git_repo): + # HEAD on dont_add_commits (a0ccc2390883c85a361f5a90c72e1b07958939fa) + p = await git_repo._run( + ProcessFormatter().format( + git_repo.GIT_LATEST_COMMIT, path=git_repo.folder_path, branch="dont_add_commits" + ) + ) + assert p.returncode == 0 + assert p.stdout.decode().strip() == "a0ccc2390883c85a361f5a90c72e1b07958939fa" + + +@pytest.mark.asyncio +async def test_git_hard_reset(cloned_git_repo, tmp_path): + staged_file = cloned_git_repo.folder_path / "staged_file.txt" 
+    staged_file.touch()
+    git_dirparams = ("git", "-C", str(cloned_git_repo.folder_path))
+    sp.run((*git_dirparams, "add", "staged_file.txt"), check=True)
+    assert staged_file.exists() is True
+    p = await cloned_git_repo._run(
+        ProcessFormatter().format(
+            cloned_git_repo.GIT_HARD_RESET, path=cloned_git_repo.folder_path, branch="master"
+        )
+    )
+    assert p.returncode == 0
+    assert staged_file.exists() is False
+
+
+@pytest.mark.asyncio
+async def test_git_pull(git_repo_with_remote, tmp_path):
+    # setup
+    staged_file = Path(git_repo_with_remote.url) / "staged_file.txt"
+    staged_file.touch()
+    git_dirparams = ("git", "-C", git_repo_with_remote.url)
+    sp.run((*git_dirparams, "add", "staged_file.txt"), check=True)
+    sp.run(
+        (*git_dirparams, "commit", "-m", "test commit", "--no-gpg-sign", "--no-verify"), check=True
+    )
+    assert not (git_repo_with_remote.folder_path / "staged_file.txt").exists()
+
+    p = await git_repo_with_remote._run(
+        ProcessFormatter().format(
+            git_repo_with_remote.GIT_PULL, path=git_repo_with_remote.folder_path
+        )
+    )
+    assert p.returncode == 0
+    assert (git_repo_with_remote.folder_path / "staged_file.txt").exists()
+
+
+@pytest.mark.asyncio
+async def test_git_diff_file_status(git_repo):
+    p = await git_repo._run(
+        ProcessFormatter().format(
+            git_repo.GIT_DIFF_FILE_STATUS,
+            path=git_repo.folder_path,
+            old_rev="c950fc05a540dd76b944719c2a3302da2e2f3090",
+            new_rev="fb99eb7d2d5bed514efc98fe6686b368f8425745",
+        )
+    )
+    assert p.returncode == 0
+    stdout = p.stdout.strip(b"\t\n\x00 ").decode()
+    assert stdout == (
+        "A\x00added_file.txt\x00\t"
+        "M\x00mycog/__init__.py\x00\t"
+        "D\x00sample_file1.txt\x00\t"
+        "D\x00sample_file2.txt\x00\t"
+        "A\x00sample_file3.txt"
+    )
+
+
+# might need to add a test for test_git_log, but it's currently an unused method
+
+
+@pytest.mark.asyncio
+async def test_git_discover_remote_url(cloned_git_repo, tmp_path):
+    p = await cloned_git_repo._run(
+        ProcessFormatter().format(
+            cloned_git_repo.GIT_DISCOVER_REMOTE_URL, path=cloned_git_repo.folder_path
+        )
+    )
+    assert p.returncode == 0
+    assert p.stdout.decode().strip() == cloned_git_repo.url
+
+
+@pytest.mark.asyncio
+async def test_git_checkout_detached_head(git_repo):
+    p = await git_repo._run(
+        ProcessFormatter().format(
+            git_repo.GIT_CHECKOUT,
+            path=git_repo.folder_path,
+            rev="c950fc05a540dd76b944719c2a3302da2e2f3090",
+        )
+    )
+    assert p.returncode == 0
+
+
+@pytest.mark.asyncio
+async def test_git_checkout_branch(git_repo):
+    p = await git_repo._run(
+        ProcessFormatter().format(
+            git_repo.GIT_CHECKOUT, path=git_repo.folder_path, rev="dont_add_commits"
+        )
+    )
+    assert p.returncode == 0
+
+
+@pytest.mark.asyncio
+async def test_git_checkout_non_existent_branch(git_repo):
+    p = await git_repo._run(
+        ProcessFormatter().format(
+            git_repo.GIT_CHECKOUT, path=git_repo.folder_path, rev="non-existent-branch"
+        )
+    )
+    assert p.returncode == 1
+
+
+@pytest.mark.asyncio
+async def test_git_get_full_sha1_from_branch_name(git_repo):
+    p = await git_repo._run(
+        ProcessFormatter().format(
+            git_repo.GIT_GET_FULL_SHA1, path=git_repo.folder_path, rev="dont_add_commits"
+        )
+    )
+    assert p.returncode == 0
+    assert p.stdout.decode().strip() == "a0ccc2390883c85a361f5a90c72e1b07958939fa"
+
+
+@pytest.mark.asyncio
+async def test_git_get_full_sha1_from_full_hash(git_repo):
+    p = await git_repo._run(
+        ProcessFormatter().format(
+            git_repo.GIT_GET_FULL_SHA1,
+            path=git_repo.folder_path,
+            rev="c950fc05a540dd76b944719c2a3302da2e2f3090",
+        )
+    )
+    assert p.returncode == 0
+    assert p.stdout.decode().strip() == "c950fc05a540dd76b944719c2a3302da2e2f3090"
+
+
+@pytest.mark.asyncio
+async def test_git_get_full_sha1_from_short_hash(git_repo):
+    p = await git_repo._run(
+        ProcessFormatter().format(
+            git_repo.GIT_GET_FULL_SHA1, path=git_repo.folder_path, rev="c950"
+        )
+    )
+    assert p.returncode == 0
+    assert p.stdout.decode().strip() == "c950fc05a540dd76b944719c2a3302da2e2f3090"
+
+
+@pytest.mark.asyncio
+async def test_git_get_full_sha1_from_too_short_hash(git_repo):
+    p = await git_repo._run(
+        ProcessFormatter().format(git_repo.GIT_GET_FULL_SHA1, path=git_repo.folder_path, rev="c95")
+    )
+    assert p.returncode == 128
+    assert p.stderr.decode().strip() == "fatal: Needed a single revision"
+
+
+@pytest.mark.asyncio
+async def test_git_get_full_sha1_from_lightweight_tag(git_repo):
+    p = await git_repo._run(
+        ProcessFormatter().format(
+            git_repo.GIT_GET_FULL_SHA1, path=git_repo.folder_path, rev="lightweight"
+        )
+    )
+    assert p.returncode == 0
+    assert p.stdout.decode().strip() == "fb99eb7d2d5bed514efc98fe6686b368f8425745"
+
+
+@pytest.mark.asyncio
+async def test_git_get_full_sha1_from_annotated_tag(git_repo):
+    p = await git_repo._run(
+        ProcessFormatter().format(
+            git_repo.GIT_GET_FULL_SHA1, path=git_repo.folder_path, rev="annotated"
+        )
+    )
+    assert p.returncode == 0
+    assert p.stdout.decode().strip() == "a7120330cc179396914e0d6af80cfa282adc124b"
+
+
+@pytest.mark.asyncio
+async def test_git_get_full_sha1_from_invalid_ref(git_repo):
+    p = await git_repo._run(
+        ProcessFormatter().format(
+            git_repo.GIT_GET_FULL_SHA1, path=git_repo.folder_path, rev="invalid"
+        )
+    )
+    assert p.returncode == 128
+    assert p.stderr.decode().strip() == "fatal: Needed a single revision"
+
+
+@pytest.mark.asyncio
+async def test_git_get_full_sha1_from_ambiguous_commits(git_repo):
+    # 2 ambiguous refs:
+    # branch ambiguous_1 - 95da0b576271cb5bee5f3e075074c03ee05fed05
+    # branch ambiguous_2 - 95da0b57a416d9c8ce950554228d1fc195c30b43
+    p = await git_repo._run(
+        ProcessFormatter().format(
+            git_repo.GIT_GET_FULL_SHA1, path=git_repo.folder_path, rev="95da0b57"
+        )
+    )
+    assert p.returncode == 128
+    assert p.stderr.decode().strip() == (
+        "error: short SHA1 95da0b57 is ambiguous\n"
+        "hint: The candidates are:\n"
+        "hint:   95da0b576 commit 2019-10-22 - Ambiguous commit 16955\n"
+        "hint:   95da0b57a commit 2019-10-22 - Ambiguous commit 44414\n"
+        "fatal: Needed a single revision"
+    )
+
+
+@pytest.mark.asyncio
+async def test_git_get_full_sha1_from_ambiguous_tag_and_commit(git_repo):
+    # 2 ambiguous refs:
+    # branch ambiguous_with_tag - c6f0e5ec04d99bdf8c6c78ff20d66d286eecb3ea
+    # tag ambiguous_tag_66387 - c6f0e5ec04d99bdf8c6c78ff20d66d286eecb3ea
+    p = await git_repo._run(
+        ProcessFormatter().format(
+            git_repo.GIT_GET_FULL_SHA1, path=git_repo.folder_path, rev="c6f0"
+        )
+    )
+    assert p.returncode == 128
+    assert p.stderr.decode().strip() == (
+        "error: short SHA1 c6f0 is ambiguous\n"
+        "hint: The candidates are:\n"
+        "hint:   c6f028f tag ambiguous_tag_66387\n"
+        "hint:   c6f0e5e commit 2019-10-24 - Commit ambiguous with tag.\n"
+        "fatal: Needed a single revision"
+    )
+
+
+@pytest.mark.asyncio
+async def test_git_is_ancestor_true(git_repo):
+    p = await git_repo._run(
+        ProcessFormatter().format(
+            git_repo.GIT_IS_ANCESTOR,
+            path=git_repo.folder_path,
+            maybe_ancestor_rev="c950fc05a540dd76b944719c2a3302da2e2f3090",
+            descendant_rev="fb99eb7d2d5bed514efc98fe6686b368f8425745",
+        )
+    )
+    assert p.returncode == 0
+
+
+@pytest.mark.asyncio
+async def test_git_is_ancestor_false(git_repo):
+    p = await git_repo._run(
+        ProcessFormatter().format(
+            git_repo.GIT_IS_ANCESTOR,
+            path=git_repo.folder_path,
+            maybe_ancestor_rev="fb99eb7d2d5bed514efc98fe6686b368f8425745",
+            descendant_rev="c950fc05a540dd76b944719c2a3302da2e2f3090",
+        )
+    )
+    assert p.returncode == 1
+
+
+@pytest.mark.asyncio
+async def test_git_is_ancestor_invalid_ref(git_repo):
+    p = await git_repo._run(
+        ProcessFormatter().format(
+            git_repo.GIT_IS_ANCESTOR,
+            path=git_repo.folder_path,
+            maybe_ancestor_rev="invalid1",
+            descendant_rev="invalid2",
+        )
+    )
+    assert p.returncode == 128
+    assert p.stderr.decode().strip() == "fatal: Not a valid object name invalid1"
+
+
+@pytest.mark.asyncio
+async def test_git_check_if_module_exists_true(git_repo):
+    p = await git_repo._run(
+        ProcessFormatter().format(
+            git_repo.GIT_CHECK_IF_MODULE_EXISTS,
+            path=git_repo.folder_path,
+            rev="fb99eb7d2d5bed514efc98fe6686b368f8425745",
+            module_name="mycog",
+        )
+    )
+    assert p.returncode == 0
+
+
+@pytest.mark.asyncio
+async def test_git_check_if_module_exists_false(git_repo):
+    p = await git_repo._run(
+        ProcessFormatter().format(
+            git_repo.GIT_CHECK_IF_MODULE_EXISTS,
+            path=git_repo.folder_path,
+            rev="a7120330cc179396914e0d6af80cfa282adc124b",
+            module_name="mycog",
+        )
+    )
+    assert p.returncode == 128
+    assert p.stderr.decode().strip() == (
+        "fatal: Not a valid object name a7120330cc179396914e0d6af80cfa282adc124b:mycog/__init__.py"
+    )
+
+
+@pytest.mark.asyncio
+async def test_git_find_last_occurrence_existent(git_repo):
+    p = await git_repo._run(
+        ProcessFormatter().format(
+            git_repo.GIT_GET_LAST_MODULE_OCCURRENCE_COMMIT,
+            path=git_repo.folder_path,
+            descendant_rev="2db662c1d341b1db7d225ccc1af4019ba5228c70",
+            module_name="mycog",
+        )
+    )
+    assert p.returncode == 0
+    # the command gives the commit after the last occurrence
+    assert p.stdout.decode().strip() == "a7120330cc179396914e0d6af80cfa282adc124b"
+
+
+@pytest.mark.asyncio
+async def test_git_find_last_occurrence_non_existent(git_repo):
+    p = await git_repo._run(
+        ProcessFormatter().format(
+            git_repo.GIT_GET_LAST_MODULE_OCCURRENCE_COMMIT,
+            path=git_repo.folder_path,
+            descendant_rev="c950fc05a540dd76b944719c2a3302da2e2f3090",
+            module_name="mycog",
+        )
+    )
+    assert p.returncode == 0
+    assert p.stdout.decode().strip() == ""
diff --git a/tests/cogs/downloader/test_installable.py b/tests/cogs/downloader/test_installable.py
index 730ae9bcb..825945baf 100644
--- a/tests/cogs/downloader/test_installable.py
+++ b/tests/cogs/downloader/test_installable.py
@@ -51,8 +51,8 @@ def test_repo_name(installable):
     assert installable.repo_name == "test_repo"
 
 
-def test_serialization(installable):
-    data = installable.to_json()
-    cog_name = data["cog_name"]
+def test_serialization(installed_cog):
+    data = installed_cog.to_json()
+    cog_name = data["module_name"]
 
-    assert cog_name == "test_cog"
+    assert cog_name == "test_installed_cog"
diff --git a/tests/conftest.py b/tests/conftest.py
index 11d03fb88..7810e142f 100644
--- a/tests/conftest.py
+++ b/tests/conftest.py
@@ -3,8 +3,11 @@
 import os
 
 import pytest
+from redbot import _update_event_loop_policy
 from redbot.core import drivers, data_manager
 
+_update_event_loop_policy()
+
 
 @pytest.fixture(scope="session")
 def event_loop(request):
diff --git a/tools/edit_testrepo.py b/tools/edit_testrepo.py
new file mode 100644
index 000000000..1befef09f
--- /dev/null
+++ b/tools/edit_testrepo.py
@@ -0,0 +1,172 @@
+#!/usr/bin/env python3.7
+"""Script to edit the test repo used by Downloader's git integration tests.
+
+This script helps update the human-readable version of the repo
+used for git integration tests in ``redbot/pytest/downloader_testrepo.export``
+by exporting/importing it to/from a provided directory.
+
+What this script does
+---------------------
+edit_testrepo.py import:
+    Inits a test repo in the provided directory, sets up committer data in git config,
+    imports the repo from ``redbot/pytest/downloader_testrepo.export`` using
+    git's fast-import command and updates the repo's working tree.
+edit_testrepo.py export:
+    Exports the repo from the provided directory into ``redbot/pytest/downloader_testrepo.export``
+    using git's fast-export. To make the file more useful for developers, it's called with
+    the ``--show-original-ids`` option, which adds an extra ``original-oid`` directive
+    that, while ignored by import, can ease writing tests without importing the repo.
+
+Note
+----
+Editing the ``downloader_testrepo.export`` file manually is strongly discouraged,
+especially editing any part of the commit directives, as that changes the commit's hash.
+Another problem devs could encounter when trying to edit that file manually
+is editors that use CRLF instead of LF for newlines and therefore break it.
+
+Also, if Git ever switches from the currently used SHA-1 to SHA-256, we will have to
+update the old hashes with new ones. But that's a small drawback
+for having a human-readable version of the repo.
+
+Known limitations
+-----------------
+``git fast-export`` exports commits without GPG signatures, so this script disables signing in the repo's config.
+This also means devs shouldn't use the ``--gpg-sign`` flag with ``git commit`` inside the test repo.
+"""
+import shlex
+import subprocess as sp
+from pathlib import Path
+from typing import Tuple
+
+import click
+
+
+MAIN_DIRECTORY = Path(__file__).absolute().parent.parent
+TEST_REPO_EXPORT_PTH: Path = MAIN_DIRECTORY / "redbot" / "pytest" / "downloader_testrepo.export"
+
+
+class ClickCustomPath(click.Path):
+    """Similar to `click.Path` but returns a `Path` object instead."""
+
+    def convert(self, value, param, ctx):
+        path_string = super().convert(value, param, ctx)
+        return Path(path_string)
+
+
+class EmptyDirectory(ClickCustomPath):
+    """Similar to `ClickCustomPath`, but only allows empty or non-existent directories.
+    Unlike `ClickCustomPath`, this type doesn't accept
+    'file_okay', 'dir_okay' and 'readable' keyword arguments.
+    """
+
+    def __init__(self, **kwargs):
+        super().__init__(readable=True, dir_okay=True, file_okay=False, **kwargs)
+
+    def convert(self, value, param, ctx):
+        path = super().convert(value, param, ctx)
+        if path.exists() and next(path.glob("*"), None) is not None:
+            self.fail(f'Directory "{str(path)}" is not empty!')
+        return path
+
+
+class GitRepoDirectory(ClickCustomPath):
+    """Similar to `ClickCustomPath`, but only allows git repo directories.
+    Unlike `ClickCustomPath`, this type doesn't accept
+    'file_okay', 'dir_okay' and 'readable' keyword arguments.
+ """ + + def __init__(self, **kwargs): + super().__init__(readable=True, dir_okay=True, file_okay=False, **kwargs) + + def convert(self, value, param, ctx): + path = super().convert(value, param, ctx) + git_path = path / ".git" + if not git_path.exists(): + self.fail(f"A git repo does not exist at path: {str(path)}") + return path + + +@click.group() +def cli(): + """Downloader test repo commands.""" + + +@cli.command(name="init", short_help="Init a new test repo in chosen directory.") +@click.argument("destination", type=EmptyDirectory(writable=True, resolve_path=True)) +def git_init(destination: Path): + """Init a new test repo in chosen directory. This might be useful + if someone will ever want to make a completely new test repo without importing it.""" + init_test_repo(destination) + click.echo(f'New test repo successfully initialized at "{str(destination)}".') + + +@cli.command(name="import", short_help="Import test repo into chosen directory.") +@click.argument("destination", type=EmptyDirectory(writable=True, resolve_path=True)) +def git_import(destination: Path): + """Import test repo into chosen directory.""" + if not TEST_REPO_EXPORT_PTH.is_file(): + raise click.ClickException(f'File "{str(TEST_REPO_EXPORT_PTH)}" can\'t be found.') + git_dirparams = init_test_repo(destination) + + fast_import = sp.Popen((*git_dirparams, "fast-import", "--quiet"), stdin=sp.PIPE) + with TEST_REPO_EXPORT_PTH.open(mode="rb") as f: + fast_import.communicate(f.read()) + return_code = fast_import.wait() + if return_code: + raise click.ClickException(f"git fast-import failed with code {return_code}") + + _run((*git_dirparams, "reset", "--hard")) + click.echo( + f'Test repo successfully imported at "{str(destination)}"\n' + 'When you\'ll update it, use "edit_testrepo.py export" to update test repo file.' 
+    )
+
+
+@cli.command(name="export", short_help="Export repo to test repo file.")
+@click.argument("source", type=GitRepoDirectory(resolve_path=True))
+@click.option("--yes", is_flag=True)
+def git_export(source: Path, yes: bool):
+    if not yes and TEST_REPO_EXPORT_PTH.is_file():
+        click.confirm(
+            f"Test repo file ({str(TEST_REPO_EXPORT_PTH)}) already exists, "
+            "are you sure you want to replace it?",
+            abort=True,
+        )
+    p = _run(
+        ("git", "-C", str(source), "fast-export", "--all", "--show-original-ids"), stdout=sp.PIPE
+    )
+    with TEST_REPO_EXPORT_PTH.open(mode="wb") as f:
+        f.write(
+            b"# THIS FILE SHOULDN'T BE EDITED MANUALLY. "
+            b"USE `edit_testrepo.py` TOOL TO UPDATE THE REPO.\n" + p.stdout
+        )
+    click.echo("Test repo successfully exported.")
+
+
+def init_test_repo(destination: Path):
+    destination.mkdir(exist_ok=True)
+    git_dirparams = ("git", "-C", str(destination))
+    init_commands: Tuple[Tuple[str, ...], ...] = (
+        (*git_dirparams, "init"),
+        (*git_dirparams, "config", "--local", "user.name", "Cog-Creators"),
+        (*git_dirparams, "config", "--local", "user.email", "cog-creators@example.org"),
+        (*git_dirparams, "config", "--local", "commit.gpgSign", "false"),
+    )
+
+    for args in init_commands:
+        _run(args)
+    return git_dirparams
+
+
+def _run(args, stderr=None, stdout=sp.DEVNULL) -> sp.CompletedProcess:
+    try:
+        return sp.run(args, stderr=stderr, stdout=stdout, check=True)
+    except sp.CalledProcessError as exc:
+        cmd = " ".join(map(lambda c: shlex.quote(str(c)), exc.cmd))
+        raise click.ClickException(
+            f"The following command failed with code {exc.returncode}:\n  {cmd}"
+        )
+
+
+if __name__ == "__main__":
+    cli()
diff --git a/tools/primary_deps.ini b/tools/primary_deps.ini
index a91aa2188..0405d74c3 100644
--- a/tools/primary_deps.ini
+++ b/tools/primary_deps.ini
@@ -45,3 +45,4 @@ test =
     pylint
     pytest
     pytest-asyncio
+    pytest-mock