Mirror of https://github.com/josegonzalez/python-github-backup.git, synced 2025-12-05 16:18:02 +01:00.
Compare commits
48 Commits
| SHA1 |
|---|
| 196acd0aca |
| 679ac841f6 |
| 498d9eba32 |
| 0f82b1717c |
| 4d5126f303 |
| b864218b44 |
| 98919c82c9 |
| 045eacbf18 |
| 7a234ba7ed |
| e8a255b450 |
| 81a2f762da |
| cb0293cbe5 |
| 252c25461f |
| e8ed03fd06 |
| 38010d7c39 |
| 71b4288e6b |
| ba4fa9fa2d |
| 869f761c90 |
| 195e700128 |
| 27441b71b6 |
| cfeaee7309 |
| fac8e4274f |
| 17fee66f31 |
| a56d27dd8b |
| e57873b6dd |
| 2658b039a1 |
| fd684a71fb |
| bacd77030b |
| b73079daf2 |
| eca8a70666 |
| e74765ba7f |
| 6db5bd731b |
| 7305871c20 |
| baf7b1a9b4 |
| 121fa68294 |
| 44dfc79edc |
| 89f59cc7a2 |
| ad8c5b8768 |
| 921aab3729 |
| ea4c3d0f6f |
| 9b6400932d |
| de0c3f46c6 |
| 73b069f872 |
| 3d3f512074 |
| 1c3078992d |
| 4b40ae94d7 |
| a18fda9faf |
| 41130fc8b0 |
.gitignore (vendored, 9 lines changed)

```diff
@@ -25,3 +25,12 @@ doc/_build
 
 # Generated man page
 doc/aws_hostname.1
+
+# Annoying macOS files
+.DS_Store
+._*
+
+# IDE configuration files
+.vscode
+.atom
+
```
CHANGES.rst (78 lines changed)

```diff
@@ -1,9 +1,85 @@
 Changelog
 =========
 
-0.23.0 (2019-06-04)
--------------------
+0.28.0 (2020-02-03)
+-------------------
+
+- Remove deprecated (and removed) git lfs flags. [smiley]
+
+  "--tags" and "--force" were removed at some point from "git lfs fetch".
+  This broke our backup script.
+
+
+0.27.0 (2020-01-22)
+-------------------
+
+- Fixed script fails if not installed from pip. [Ben Baron]
+
+  At the top of the script, the line ``from github_backup import __version__``
+  gets the script's version number to use if the script is called with the
+  -v or --version flags. The problem is that if the script hasn't been
+  installed via pip (for example, when the repo is cloned directly to a
+  backup server), the script will fail due to an import exception.
+
+  Also, it will presumably always use the version number from pip even when
+  running a modified version from git or a fork. This does not fix that, as
+  there is no obvious way to check whether the pip-installed version is
+  running, but at least the script now works when cloned from git or simply
+  copied to another machine.
+
+  closes https://github.com/josegonzalez/python-github-backup/issues/141
+
+- Fixed macOS keychain access when using Python 3. [Ben Baron]
+
+  Python 3 returns bytes rather than a string, so the string concatenation
+  used to build the auth variable was throwing an exception, which the
+  script interpreted as the password not being found. Converting to a string
+  first fixed the issue.
+
+- Public repos no longer include the auth token. [Ben Baron]
+
+  When backing up repositories over https with an auth token, the GitHub
+  personal access token was leaked into each backed-up repository: it was
+  embedded in the URL of each repository's git remote. For public
+  repositories this is unnecessary, since they can be accessed without the
+  token, and it can cause problems later if the token is ever changed. The
+  token should only be "leaked" like this out of necessity, e.g. for a
+  private repository when --prefer-ssh was not chosen, so https with the
+  auth token is required to perform the clone.
+
+- Fixed comment typo. [Ben Baron]
+
+- Switched log_info to log_warning in download_file. [Ben Baron]
+
+- Crash when a release asset doesn't exist. [Ben Baron]
+
+  Previously the script crashed whenever a release asset could not be
+  downloaded (for example, on a 404 response). This change logs the failure
+  instead and allows the script to continue. No retry logic is added, but at
+  least the crash is prevented and the backup can complete. Retry logic can
+  be implemented later if wanted.
+
+  closes https://github.com/josegonzalez/python-github-backup/issues/129
+
+- Moved asset downloading loop inside the if block. [Ben Baron]
+
+- Separate release assets and skip re-downloading. [Ben Baron]
+
+  Previously the script put all release assets into one folder called
+  ``releases``, so whenever two release files had the same name, only the
+  last one downloaded was actually saved. A particularly bad example is
+  MacDownApp/macdown, where every release asset is named
+  ``MacDown.app.zip``: all 36 releases were downloaded, but only the last
+  one was saved. With this change, each release's assets are stored in a
+  subfolder inside ``releases`` named after the release. There could still
+  be edge cases if two releases share a name, but this is much safer than
+  the previous behavior. This change also checks whether an asset file
+  already exists on disk and skips downloading it, which drastically speeds
+  up subsequent syncs: only new releases are downloaded, which is the
+  expected behavior.
+
+  closes https://github.com/josegonzalez/python-github-backup/issues/126
+
+- Added newline to end of file. [Ben Baron]
+
+- Improved gitignore, macOS files and IDE configs. [Ben Baron]
+
+  Ignores the annoying hidden macOS files .DS_Store and ._* as well as the
+  IDE configuration folders for contributors using the popular Visual Studio
+  Code and Atom IDEs (more can be added later as needed).
+
+
+0.26.0 (2019-09-23)
+-------------------
+
+- Workaround gist clone in ``--prefer-ssh`` mode. [Vladislav Yarmak]
+
+- Create PULL_REQUEST.md. [Jose Diaz-Gonzalez]
+
+- Create ISSUE_TEMPLATE.md. [Jose Diaz-Gonzalez]
+
+
+0.25.0 (2019-07-03)
+-------------------
+
+- Issue 119: Change retrieve_data to be a generator. [2a]
+
+  See issue #119.
+
+
+0.24.0 (2019-06-27)
+-------------------
+
+- QKT-45: include assets - update readme. [Ethan Timm]
+
+  Update the readme with flag information for including assets alongside
+  their respective releases.
+
+- Make assets its own flag. [Harrison Wright]
+
+- Fix super call for python2. [Harrison Wright]
+
+- Fix redirect to s3. [Harrison Wright]
+
+- WIP: download assets. [Harrison Wright]
+
+- QKT-42: releases - add readme info. [ethan]
+
+- QKT-42 update: shorter command flag. [ethan]
+
+- QKT-42: support saving release information. [ethan]
+
+- Fix pull details. [Harrison Wright]
+
+
+0.23.0 (2019-06-04)
+-------------------
+
 - Avoid to crash in case of HTTP 502 error. [Gael de Chalendar]
 
   Survive also on socket.error connections like on HTTPError or URLError.
```
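The Python 3 keychain fix in 0.27.0 comes down to decoding the bytes that `subprocess.check_output` returns before concatenating. A minimal sketch of the failure and the fix, with a made-up token standing in for the keychain output:

```python
# Under Python 3, subprocess.check_output returns bytes, so concatenating
# with str raises TypeError. The token below is an invented stand-in.
token = b'ghp_exampletoken'
try:
    auth = token + ':' + 'x-oauth-basic'  # the old code path; fails on Python 3
except TypeError:
    token = token.decode('utf-8')         # the fix: convert to str first
    auth = token + ':' + 'x-oauth-basic'

print(auth)  # ghp_exampletoken:x-oauth-basic
```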
|||||||
13
ISSUE_TEMPLATE.md
Normal file
13
ISSUE_TEMPLATE.md
Normal file
@@ -0,0 +1,13 @@
|
|||||||
|
# Important notice regarding filed issues
|
||||||
|
|
||||||
|
This project already fills my needs, and as such I have no real reason to continue it's development. This project is otherwise provided as is, and no support is given.
|
||||||
|
|
||||||
|
If pull requests implementing bug fixes or enhancements are pushed, I am happy to review and merge them (time permitting).
|
||||||
|
|
||||||
|
If you wish to have a bug fixed, you have a few options:
|
||||||
|
|
||||||
|
- Fix it yourself and file a pull request.
|
||||||
|
- File a bug and hope someone else fixes it for you.
|
||||||
|
- Pay me to fix it (my rate is $200 an hour, minimum 1 hour, contact me via my [github email address](https://github.com/josegonzalez) if you want to go this route).
|
||||||
|
|
||||||
|
In all cases, feel free to file an issue, they may be of help to others in the future.
|
||||||
PULL_REQUEST.md (new file, 7 lines)

```markdown
# Important notice regarding filed pull requests

This project already fills my needs, and as such I have no real reason to continue its development. The project is otherwise provided as-is, and no support is given.

I will attempt to review pull requests at _my_ earliest convenience. If I am unable to get to your pull request in a timely fashion, it is what it is. This repository does not pay any bills, and I am not required to merge any pull request from any individual.

If you wish to jump my personal priority queue, you may pay me for my time to review. My rate is $200 an hour (minimum 1 hour); feel free to contact me via my github email address if you want to go this route.
```
```diff
@@ -4,6 +4,8 @@ github-backup
 
 |PyPI| |Python Versions|
 
+This project is considered feature complete for the primary maintainer. If you would like a bugfix or enhancement and cannot sponsor the work, pull requests are welcome. Feel free to contact the maintainer for consulting estimates if desired.
+
 backup a github user or organization
 
 Requirements
@@ -32,8 +34,9 @@ CLI Usage is as follows::
                        [--watched] [--followers] [--following] [--all]
                        [--issues] [--issue-comments] [--issue-events] [--pulls]
                        [--pull-comments] [--pull-commits] [--labels] [--hooks]
-                       [--milestones] [--repositories] [--bare] [--lfs]
-                       [--wikis] [--gists] [--starred-gists] [--skip-existing]
+                       [--milestones] [--repositories] [--releases] [--assets]
+                       [--bare] [--lfs] [--wikis] [--gists] [--starred-gists]
+                       [--skip-existing]
                        [-L [LANGUAGES [LANGUAGES ...]]] [-N NAME_REGEX]
                        [-H GITHUB_HOST] [-O] [-R REPOSITORY] [-P] [-F]
                        [--prefer-ssh] [-v]
@@ -76,6 +79,8 @@ CLI Usage is as follows::
                         authenticated)
   --milestones          include milestones in backup
   --repositories        include repository clone in backup
+  --releases            include repository releases' information without assets or binaries
+  --assets              include assets alongside release information; only applies if including releases
   --bare                clone bare repositories
   --lfs                 clone LFS repositories (requires Git LFS to be
                         installed, https://git-lfs.github.com)
```
```diff
@@ -18,6 +18,7 @@ import subprocess
 import sys
 import time
 import platform
+PY2 = False
 try:
     # python 3
     from urllib.parse import urlparse
@@ -26,16 +27,25 @@ try:
     from urllib.error import HTTPError, URLError
     from urllib.request import urlopen
     from urllib.request import Request
+    from urllib.request import HTTPRedirectHandler
+    from urllib.request import build_opener
 except ImportError:
     # python 2
+    PY2 = True
     from urlparse import urlparse
     from urllib import quote as urlquote
     from urllib import urlencode
     from urllib2 import HTTPError, URLError
     from urllib2 import urlopen
     from urllib2 import Request
+    from urllib2 import HTTPRedirectHandler
+    from urllib2 import build_opener
 
-from github_backup import __version__
+try:
+    from github_backup import __version__
+    VERSION = __version__
+except ImportError:
+    VERSION = 'unknown'
 
 FNULL = open(os.devnull, 'w')
```
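Run on its own, the guarded version import from the hunk above behaves like this: when the `github_backup` package is not importable (for example, when running from a plain git clone rather than a pip install), the version falls back to the placeholder string.

```python
# Guarded version lookup, as in the diff above: fall back to a placeholder
# when the github_backup package is not installed.
try:
    from github_backup import __version__
    VERSION = __version__
except ImportError:
    VERSION = 'unknown'

print(VERSION)
```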
```diff
@@ -296,13 +306,22 @@ def parse_args():
                         help='Clone repositories using SSH instead of HTTPS')
     parser.add_argument('-v', '--version',
                         action='version',
-                        version='%(prog)s ' + __version__)
+                        version='%(prog)s ' + VERSION)
     parser.add_argument('--keychain-name',
                         dest='osx_keychain_item_name',
                         help='OSX ONLY: name field of password item in OSX keychain that holds the personal access or OAuth token')
     parser.add_argument('--keychain-account',
                         dest='osx_keychain_item_account',
                         help='OSX ONLY: account field of password item in OSX keychain that holds the personal access or OAuth token')
+    parser.add_argument('--releases',
+                        action='store_true',
+                        dest='include_releases',
+                        help='include release information, not including assets or binaries'
+                        )
+    parser.add_argument('--assets',
+                        action='store_true',
+                        dest='include_assets',
+                        help='include assets alongside release information; only applies if including releases')
     return parser.parse_args()
```
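The two new flags are plain `store_true` booleans, so both are opt-in and default to `False`; `--assets` only has an effect when `--releases` is also given. A minimal standalone reproduction:

```python
import argparse

# Minimal reproduction of the two new flags: store_true makes both opt-in
# booleans that default to False.
parser = argparse.ArgumentParser(prog='github-backup')
parser.add_argument('--releases', action='store_true', dest='include_releases',
                    help='include release information, not including assets or binaries')
parser.add_argument('--assets', action='store_true', dest='include_assets',
                    help='include assets alongside release information; only applies if including releases')

args = parser.parse_args(['--releases'])
print(args.include_releases, args.include_assets)  # True False
```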
```diff
@@ -322,6 +341,8 @@ def get_auth(args, encode=True):
                 '-s', args.osx_keychain_item_name,
                 '-a', args.osx_keychain_item_account,
                 '-w'], stderr=devnull).strip())
+            if not PY2:
+                token = token.decode('utf-8')
             auth = token + ':' + 'x-oauth-basic'
         except:
             log_error('No password item matching the provided name and account could be found in the osx keychain.')
```
```diff
@@ -372,14 +393,14 @@ def get_github_host(args):
 
 
 def get_github_repo_url(args, repository):
-    if args.prefer_ssh:
-        return repository['ssh_url']
-
     if repository.get('is_gist'):
         return repository['git_pull_url']
 
+    if args.prefer_ssh:
+        return repository['ssh_url']
+
     auth = get_auth(args, False)
-    if auth:
+    if auth and repository['private'] == True:
         repo_url = 'https://{0}@{1}/{2}/{3}.git'.format(
             auth,
             get_github_host(args),
```
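The `private` guard above is what stops the token from leaking into public clones: only private repositories get an auth-embedded https URL. A standalone sketch of the selection logic (the function name, sample dicts, and URL shapes below are illustrative, not the script's actual code):

```python
# Illustrative model of the clone-URL selection after this change: gists and
# SSH preferences short-circuit, and the token is only embedded for private
# repositories cloned over https.
def repo_clone_url(repository, auth, prefer_ssh=False, host='github.com'):
    if repository.get('is_gist'):
        return repository['git_pull_url']
    if prefer_ssh:
        return repository['ssh_url']
    if auth and repository['private']:
        return 'https://{0}@{1}/{2}.git'.format(auth, host, repository['full_name'])
    return 'https://{0}/{1}.git'.format(host, repository['full_name'])

public = {'private': False, 'full_name': 'octocat/hello',
          'ssh_url': 'git@github.com:octocat/hello.git'}
private = {'private': True, 'full_name': 'octocat/secret',
           'ssh_url': 'git@github.com:octocat/secret.git'}

print(repo_clone_url(public, 'token:x-oauth-basic'))   # https://github.com/octocat/hello.git
print(repo_clone_url(private, 'token:x-oauth-basic'))  # https://token:x-oauth-basic@github.com/octocat/secret.git
```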
```diff
@@ -391,12 +412,11 @@ def get_github_repo_url(args, repository):
     return repo_url
 
 
-def retrieve_data(args, template, query_args=None, single_request=False):
+def retrieve_data_gen(args, template, query_args=None, single_request=False):
     auth = get_auth(args)
     query_args = get_query_args(query_args)
     per_page = 100
     page = 0
-    data = []
 
     while True:
         page = page + 1
@@ -423,11 +443,12 @@ def retrieve_data(args, template, query_args=None, single_request=False):
         response = json.loads(r.read().decode('utf-8'))
         if len(errors) == 0:
             if type(response) == list:
-                data.extend(response)
+                for resp in response:
+                    yield resp
                 if len(response) < per_page:
                     break
             elif type(response) == dict and single_request:
-                data.append(response)
+                yield response
 
         if len(errors) > 0:
             log_error(errors)
@@ -435,8 +456,8 @@ def retrieve_data(args, template, query_args=None, single_request=False):
         if single_request:
             break
 
-    return data
+def retrieve_data(args, template, query_args=None, single_request=False):
+    return list(retrieve_data_gen(args, template, query_args, single_request))
 
 def get_query_args(query_args=None):
     if not query_args:
```
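The refactor above turns the paginated fetch into a generator, with the old list-returning `retrieve_data` kept as a thin wrapper, so callers can consume results page by page and stop early. A toy model of the same shape (the page data is invented):

```python
# Toy model of the retrieve_data -> generator refactor (issue #119): items
# are yielded as they arrive instead of being collected into one big list.
def retrieve_pages(pages, per_page=2):
    for page in pages:
        for item in page:
            yield item
        if len(page) < per_page:
            break  # a short page means there are no more results

def retrieve_all(pages, per_page=2):
    # list-returning wrapper, like retrieve_data over retrieve_data_gen
    return list(retrieve_pages(pages, per_page))

gen = retrieve_pages([[1, 2], [3, 4], [5]])
first = next(gen)  # later pages have not been touched yet
print(first, retrieve_all([[1, 2], [3, 4], [5]]))  # 1 [1, 2, 3, 4, 5]
```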
```diff
@@ -532,6 +553,55 @@ def _request_url_error(template, retry_timeout):
     return False
 
 
+class S3HTTPRedirectHandler(HTTPRedirectHandler):
+    """
+    A subclassed redirect handler for downloading Github assets from S3.
+
+    urllib will add the Authorization header to the redirected request to S3,
+    which will result in a 400, so we should remove said header on redirect.
+    """
+    def redirect_request(self, req, fp, code, msg, headers, newurl):
+        if PY2:
+            # HTTPRedirectHandler is an old style class
+            request = HTTPRedirectHandler.redirect_request(self, req, fp, code, msg, headers, newurl)
+        else:
+            request = super(S3HTTPRedirectHandler, self).redirect_request(req, fp, code, msg, headers, newurl)
+        del request.headers['Authorization']
+        return request
+
+
+def download_file(url, path, auth):
+    # Skip downloading release assets if they already exist on disk so we
+    # don't redownload on every sync
+    if os.path.exists(path):
+        return
+
+    request = Request(url)
+    request.add_header('Accept', 'application/octet-stream')
+    request.add_header('Authorization', 'Basic '.encode('ascii') + auth)
+    opener = build_opener(S3HTTPRedirectHandler)
+
+    try:
+        response = opener.open(request)
+
+        chunk_size = 16 * 1024
+        with open(path, 'wb') as f:
+            while True:
+                chunk = response.read(chunk_size)
+                if not chunk:
+                    break
+                f.write(chunk)
+    except HTTPError as exc:
+        # Gracefully handle 404 responses (and others) when downloading from S3
+        log_warning('Skipping download of asset {0} due to HTTPError: {1}'.format(url, exc.reason))
+    except URLError as e:
+        # Gracefully handle other URL errors
+        log_warning('Skipping download of asset {0} due to URLError: {1}'.format(url, e.reason))
+    except socket.error as e:
+        # Gracefully handle socket errors
+        # TODO: Implement retry logic
+        log_warning('Skipping download of asset {0} due to socket error: {1}'.format(url, e.strerror))
+
+
 def get_authenticated_user(args):
     template = 'https://{0}/user'.format(get_github_api_host(args))
     data = retrieve_data(args, template, single_request=True)
```
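The redirect handler above exists because GitHub redirects asset downloads to S3, and S3 rejects requests that still carry the GitHub `Authorization` header. A Python 3-only sketch showing the header being stripped on redirect (the URLs and credential are invented; the real class also handles the Python 2 old-style-class case):

```python
from urllib.request import HTTPRedirectHandler, Request

# Python 3-only sketch of the S3 redirect handler from this diff: drop the
# Authorization header when urllib follows the redirect to S3.
class S3HTTPRedirectHandler(HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        request = super().redirect_request(req, fp, code, msg, headers, newurl)
        del request.headers['Authorization']
        return request

req = Request('https://api.github.com/repos/example/assets/1')  # illustrative URL
req.add_header('Accept', 'application/octet-stream')
req.add_header('Authorization', 'Basic ZHVtbXk=')

handler = S3HTTPRedirectHandler()
redirected = handler.redirect_request(req, None, 302, 'Found', {},
                                      'https://s3.amazonaws.com/bucket/asset')
print(sorted(redirected.headers))  # ['Accept'] (Authorization stripped)
```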
```diff
@@ -699,6 +769,10 @@ def backup_repositories(args, output_directory, repositories):
         if args.include_hooks or args.include_everything:
             backup_hooks(args, repo_cwd, repository, repos_template)
 
+        if args.include_releases or args.include_everything:
+            backup_releases(args, repo_cwd, repository, repos_template,
+                            include_assets=args.include_assets or args.include_everything)
+
     if args.incremental:
         open(last_update_path, 'w').write(last_update)
```
```diff
@@ -784,24 +858,27 @@ def backup_pulls(args, repo_cwd, repository, repos_template):
         pull_states = ['open', 'closed']
         for pull_state in pull_states:
             query_args['state'] = pull_state
-            # It'd be nice to be able to apply the args.since filter here...
-            _pulls = retrieve_data(args,
-                                   _pulls_template,
-                                   query_args=query_args)
+            _pulls = retrieve_data_gen(args,
+                                       _pulls_template,
+                                       query_args=query_args)
             for pull in _pulls:
+                if args.since and pull['updated_at'] < args.since:
+                    break
                 if not args.since or pull['updated_at'] >= args.since:
                     pulls[pull['number']] = pull
     else:
-        _pulls = retrieve_data(args,
-                               _pulls_template,
-                               query_args=query_args)
+        _pulls = retrieve_data_gen(args,
+                                   _pulls_template,
+                                   query_args=query_args)
         for pull in _pulls:
+            if args.since and pull['updated_at'] < args.since:
+                break
            if not args.since or pull['updated_at'] >= args.since:
                 pulls[pull['number']] = retrieve_data(
                     args,
                     _pulls_template + '/{}'.format(pull['number']),
                     single_request=True
-                )
+                )[0]
 
     log_info('Saving {0} pull requests to disk'.format(
         len(list(pulls.keys()))))
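The early `break` added above only skips work safely if the listing arrives newest-first by `updated_at`; the existing `>=` check is kept as a second guard. A toy version of the filter with invented data:

```python
# Toy version of the since-filter with early exit: once one pull is older
# than `since`, the rest of a newest-first listing can be skipped entirely.
def filter_since(pulls, since):
    kept = {}
    for pull in pulls:
        if since and pull['updated_at'] < since:
            break
        if not since or pull['updated_at'] >= since:
            kept[pull['number']] = pull
    return kept

pulls = [
    {'number': 3, 'updated_at': '2020-01-20'},
    {'number': 2, 'updated_at': '2020-01-10'},
    {'number': 1, 'updated_at': '2019-12-01'},
]
print(sorted(filter_since(pulls, '2020-01-01')))  # [2, 3]
```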
```diff
@@ -880,6 +957,37 @@ def backup_hooks(args, repo_cwd, repository, repos_template):
         log_info("Unable to read hooks, skipping")
 
 
+def backup_releases(args, repo_cwd, repository, repos_template, include_assets=False):
+    repository_fullname = repository['full_name']
+
+    # give release files somewhere to live & log intent
+    release_cwd = os.path.join(repo_cwd, 'releases')
+    log_info('Retrieving {0} releases'.format(repository_fullname))
+    mkdir_p(repo_cwd, release_cwd)
+
+    query_args = {}
+
+    release_template = '{0}/{1}/releases'.format(repos_template, repository_fullname)
+    releases = retrieve_data(args, release_template, query_args=query_args)
+
+    # for each release, store it
+    log_info('Saving {0} releases to disk'.format(len(releases)))
+    for release in releases:
+        release_name = release['tag_name']
+        output_filepath = os.path.join(release_cwd, '{0}.json'.format(release_name))
+        with codecs.open(output_filepath, 'w+', encoding='utf-8') as f:
+            json_dump(release, f)
+
+        if include_assets:
+            assets = retrieve_data(args, release['assets_url'])
+            if len(assets) > 0:
+                # give release asset files somewhere to live & download them
+                # (not including source archives)
+                release_assets_cwd = os.path.join(release_cwd, release_name)
+                mkdir_p(release_assets_cwd)
+                for asset in assets:
+                    download_file(asset['url'], os.path.join(release_assets_cwd, asset['name']), get_auth(args))
+
+
 def fetch_repository(name,
                      remote_url,
                      local_dir,
```
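The layout change in `backup_releases` is what fixes the same-name collisions from the changelog: assets now live under `releases/<tag>/<asset>` instead of all in `releases/`. A small sketch of the path construction (paths and tag names are illustrative):

```python
import os

# Sketch of the new release-asset layout: assets are keyed by release tag,
# so two releases that ship an asset with the same file name no longer
# overwrite each other. All names below are illustrative.
def asset_path(release_cwd, tag_name, asset_name):
    return os.path.join(release_cwd, tag_name, asset_name)

old = os.path.join('releases', 'MacDown.app.zip')          # old layout: every release collides
new = asset_path('releases', 'v0.7.1', 'MacDown.app.zip')  # new layout: unique per tag
print(new)
```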
```diff
@@ -928,7 +1036,7 @@ def fetch_repository(name,
         logging_subprocess(git_command, None, cwd=local_dir)
 
     if lfs_clone:
-        git_command = ['git', 'lfs', 'fetch', '--all', '--force', '--tags', '--prune']
+        git_command = ['git', 'lfs', 'fetch', '--all', '--prune']
     else:
         git_command = ['git', 'fetch', '--all', '--force', '--tags', '--prune']
     logging_subprocess(git_command, None, cwd=local_dir)
```
```diff
@@ -1 +1 @@
-__version__ = '0.23.0'
+__version__ = '0.28.0'
```