Mirror of https://github.com/josegonzalez/python-github-backup.git (synced 2025-12-05 16:18:02 +01:00)
Compare commits (105 commits)
CHANGES.rst (new file, 300 lines)

Changelog
=========

0.13.1 (2017-04-11)
-------------------

- Fix error when repository has no updated_at value. [Nicolai Ehemann]

0.13.0 (2017-04-05)
-------------------

- Add OS check for OSX-specific keychain args. [Martin O'Reilly]

  Keychain arguments are only supported on Mac OSX. Added a check for
  the operating system so we give a "Keychain arguments are only
  supported on Mac OSX" error message rather than a "No password item
  matching the provided name and account could be found in the osx
  keychain" error message.

- Add support for storing PAT in OSX keychain. [Martin O'Reilly]

  Added additional optional arguments and README guidance for storing
  and accessing a Github personal access token (PAT) in the OSX
  keychain.

0.12.1 (2017-03-27)
-------------------

- Avoid remote branch name churn. [Chris Adams]

  This avoids the backup output having lots of "[new branch]" messages
  because removing the old remote name removed all of the existing
  branch references.

- Fix detection of bare git directories. [Andrzej Maczuga]

0.12.0 (2016-11-22)
-------------------

Fix
~~~

- Properly import version from github_backup package. [Jose Diaz-Gonzalez]

- Support alternate git status output. [Jose Diaz-Gonzalez]

Other
~~~~~

- Pep8: E501 line too long (83 > 79 characters). [Jose Diaz-Gonzalez]

- Pep8: E128 continuation line under-indented for visual indent.
  [Jose Diaz-Gonzalez]

- Support archival using bare git clones. [Andrzej Maczuga]

- Fix typo, 3x. [Terrell Russell]

0.11.0 (2016-10-26)
-------------------

- Support --token file:///home/user/token.txt (fixes gh-51).
  [Björn Dahlgren]

- Fix some linting. [Albert Wang]

- Fix byte/string conversion for python 3. [Albert Wang]

- Support python 3. [Albert Wang]

- Encode special characters in password. [Remi Rampin]

- Don't pretend program name is "Github Backup". [Remi Rampin]

- Don't install over insecure connection. [Remi Rampin]

  The git:// protocol is unauthenticated and unencrypted, and no longer
  advertised by GitHub. Using HTTPS shouldn't impact performance.

0.10.3 (2016-08-21)
-------------------

- Fixes #29. [Jonas Michel]

  Reporting an error when the user's rate limit is exceeded causes the
  script to terminate after resuming execution from a rate limit sleep.
  Instead of generating an explicit error, we just want to inform the
  user that the script is going to sleep until their rate limit count
  resets.

- Fixes #29. [Jonas Michel]

  The errors list was not being cleared out after resuming a backup
  from a rate limit sleep. When the backup was resumed, the non-empty
  errors list caused the backup to quit after the next `retrieve_data`
  request.

0.10.2 (2016-08-21)
-------------------

- Add a note regarding git version requirement. [Jose Diaz-Gonzalez]

  Closes #37

0.10.0 (2016-08-18)
-------------------

- Implement incremental updates. [Robert Bradshaw]

  Guarded with an --incremental flag.

  Stores the time of the last update and only downloads issue and pull
  request data since this time. All other data is relatively small
  (likely fetched with a single request) and so is simply re-populated
  from scratch as before.

0.9.0 (2016-03-29)
------------------

- Fix cloning private repos with basic auth or token. [Kazuki Suda]

0.8.0 (2016-02-14)
------------------

- Don't store issues which are actually pull requests. [Enrico Tröger]

  This prevents storing pull requests twice, since the Github API also
  returns pull requests as issues. Those issues will be skipped, but
  only if retrieving pull requests is requested as well.
  Closes #23.

0.7.0 (2016-02-02)
------------------

- Softly fail if not able to read hooks. [Albert Wang]

- Add note about 2-factor auth. [Albert Wang]

- Make user repository search go through endpoint capable of reading
  private repositories. [Albert Wang]

- Prompt for password if only username given. [Alex Hall]

0.6.0 (2015-11-10)
------------------

- Force proper remote url. [Jose Diaz-Gonzalez]

- Improve error handling in case of HTTP errors. [Enrico Tröger]

  In case of a HTTP status code 404, the returned 'r' was never
  assigned. In case of URL errors which are not timeouts, we probably
  should bail out.

- Add --hooks to also include web hooks into the backup. [Enrico Tröger]

- Create the user-specified output directory if it does not exist.
  [Enrico Tröger]

  Fixes #17.

- Add missing auth argument to _get_response(). [Enrico Tröger]

  When running unauthenticated and Github starts rate-limiting the
  client, github-backup crashes because the auth variable used in
  _get_response() was not available. This change should fix it.

- Add repository URL to error message for non-existing repositories.
  [Enrico Tröger]

  This makes it easier for the user to identify which repository does
  not exist or is not initialised, i.e. whether it is the main
  repository or the wiki repository, and which clone URL was used to
  check.

0.5.0 (2015-10-10)
------------------

- Add release script. [Jose Diaz-Gonzalez]

- Refactor to both simplify the codepath and follow PEP8 standards.
  [Jose Diaz-Gonzalez]

- Retry 3 times when the connection times out. [Mathijs Jonker]

- Made unicode output default. [Kirill Grushetsky]

- Imports alphabetised. [Kirill Grushetsky]

- Preserve Unicode characters in the output file. [Kirill Grushetsky]

  Added option to preserve Unicode characters in the output file.

- josegonzalez/python-github-backup#12: Added backup of labels and
  milestones. [aensley]

- Fixed indent. [Mathijs Jonker]

- Skip uninitialized repos. [mjonker-embed]

  These gave me errors which caused mails from crontab.

- Added prefer-ssh. [mjonker-embed]

  Was needed for my backup setup; the code includes this but the README
  wasn't updated.

- Retry API requests which failed due to rate-limiting. [Chris Adams]

  This allows operation to continue, albeit at a slower pace, if you
  have enough data to trigger the API rate limits.

- logging_subprocess: always log when a command fails. [Chris Adams]

  Previously git clones could fail without any indication unless you
  edited the source to change `logger=None` to use a configured logger.

  Now a non-zero return code will always output a message to stderr and
  will display the executed command so it can be rerun for
  troubleshooting.

- Switch to using ssh_url. [Chris Adams]

  The previous commit used the wrong URL for a private repo. This was
  masked by the lack of error logging in logging_subprocess (which will
  be fixed in a separate branch).

- Add an option to prefer checkouts over SSH. [Chris Adams]

  This is really useful with private repos to avoid being nagged for
  credentials for every repository.

- Add pull request support. [Kevin Laude]

  Back up repository pull requests by passing the --include-pulls
  argument. Pull requests are saved to
  repositories/<repository name>/pulls/<pull request number>.json.
  Include the --pull-request-comments argument to add review comments
  to the pull request backup, and pass the --pull-request-commits
  argument to add commits to the pull request backup.

  Pull requests are automatically backed up when the --all argument is
  used.

- Add GitHub Enterprise support. [Kevin Laude]

  Pass the -H or --github-host argument with a GitHub Enterprise
  hostname to back up from that GitHub Enterprise host. If no argument
  is passed, then back up from github.com.

0.2.0 (2014-09-22)
------------------

- Add support for retrieving repositories. Closes #1. [Jose Diaz-Gonzalez]

- Fix PEP8 violations. [Jose Diaz-Gonzalez]

- Add authorization to header only if specified by user. [Ioannis Filippidis]

- Fill out readme more. [Jose Diaz-Gonzalez]

- Fix import. [Jose Diaz-Gonzalez]

- Properly name readme. [Jose Diaz-Gonzalez]

- Create MANIFEST.in. [Jose Diaz-Gonzalez]

- Create .gitignore. [Jose Diaz-Gonzalez]

- Create setup.py. [Jose Diaz-Gonzalez]

- Create requirements.txt. [Jose Diaz-Gonzalez]

- Create __init__.py. [Jose Diaz-Gonzalez]

- Create LICENSE.txt. [Jose Diaz-Gonzalez]

- Create README.md. [Jose Diaz-Gonzalez]

- Create github-backup. [Jose Diaz-Gonzalez]
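The 0.11.0 entry ("Support --token file:///home/user/token.txt") describes reading the token from a file when the --token value is a file:// URL. A minimal sketch of how such resolution can work; `resolve_token` is a hypothetical helper name for illustration, not the script's actual function:

```python
import os
import tempfile
from urllib.parse import urlparse


def resolve_token(token):
    # Hypothetical helper: if the --token value is a file:// URL, read the
    # token from that file; otherwise return the value unchanged.
    if token and token.startswith('file://'):
        path = urlparse(token).path
        with open(path) as f:
            return f.read().strip()
    return token


# Usage: a token stored in a file is read transparently.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, 'w') as f:
    f.write('ghp_exampletoken\n')
print(resolve_token('file://' + path))  # ghp_exampletoken
print(resolve_token('plain-token'))     # plain-token
os.remove(path)
```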
README.rst (72 lines changed)

@@ -4,6 +4,11 @@ github-backup
 
 backup a github user or organization
 
+Requirements
+============
+
+- GIT 1.9+
+
 Installation
 ============
 
@@ -13,22 +18,27 @@ Using PIP via PyPI::
 
 Using PIP via Github::
 
-    pip install git+git://github.com/josegonzalez/python-github-backup.git#egg=github-backup
+    pip install git+https://github.com/josegonzalez/python-github-backup.git#egg=github-backup
 
 Usage
 =====
 
 CLI Usage is as follows::
 
-    Github Backup [-h] [-u USERNAME] [-p PASSWORD] [-t TOKEN]
-                  [-o OUTPUT_DIRECTORY] [--starred] [--watched] [--all]
-                  [--issues] [--issue-comments] [--issue-events]
-                  [--repositories] [--wikis] [--skip-existing]
-                  [-L [LANGUAGES [LANGUAGES ...]]] [-N NAME_REGEX] [-O]
-                  [-R REPOSITORY] [-P] [-F] [-v]
-                  USER
+    github-backup [-h] [-u USERNAME] [-p PASSWORD] [-t TOKEN]
+                  [-o OUTPUT_DIRECTORY] [-i] [--starred] [--watched]
+                  [--all] [--issues] [--issue-comments] [--issue-events]
+                  [--pulls] [--pull-comments] [--pull-commits] [--labels]
+                  [--hooks] [--milestones] [--repositories] [--bare]
+                  [--wikis] [--skip-existing]
+                  [-L [LANGUAGES [LANGUAGES ...]]] [-N NAME_REGEX]
+                  [-H GITHUB_HOST] [-O] [-R REPOSITORY] [-P] [-F]
+                  [--prefer-ssh] [-v]
+                  [--keychain-name OSX_KEYCHAIN_ITEM_NAME]
+                  [--keychain-account OSX_KEYCHAIN_ITEM_ACCOUNT]
+                  USER
 
-    Backup a github users account
+    Backup a github account
 
     positional arguments:
      USER                  github username
@@ -38,29 +48,69 @@ CLI Usage is as follows::
      -u USERNAME, --username USERNAME
                            username for basic auth
      -p PASSWORD, --password PASSWORD
-                           password for basic auth
+                           password for basic auth. If a username is given but
+                           not a password, the password will be prompted for.
      -t TOKEN, --token TOKEN
                            personal access or OAuth token
      -o OUTPUT_DIRECTORY, --output-directory OUTPUT_DIRECTORY
                            directory at which to backup the repositories
+     -i, --incremental     incremental backup
      --starred             include starred repositories in backup
      --watched             include watched repositories in backup
      --all                 include everything in backup
      --issues              include issues in backup
      --issue-comments      include issue comments in backup
      --issue-events        include issue events in backup
+     --pulls               include pull requests in backup
+     --pull-comments       include pull request review comments in backup
+     --pull-commits        include pull request commits in backup
+     --labels              include labels in backup
+     --hooks               include hooks in backup (works only when
+                           authenticated)
+     --milestones          include milestones in backup
      --repositories        include repository clone in backup
+     --bare                clone bare repositories
      --wikis               include wiki clone in backup
      --skip-existing       skip project if a backup directory exists
      -L [LANGUAGES [LANGUAGES ...]], --languages [LANGUAGES [LANGUAGES ...]]
                            only allow these languages
      -N NAME_REGEX, --name-regex NAME_REGEX
                            python regex to match names against
-     -O, --organization    whether or not this is a query for an organization
+     -H GITHUB_HOST, --github-host GITHUB_HOST
+                           GitHub Enterprise hostname
+     -O, --organization    whether or not this is an organization user
      -R REPOSITORY, --repository REPOSITORY
                            name of repository to limit backup to
      -P, --private         include private repositories
      -F, --fork            include forked repositories
+     --prefer-ssh          Clone repositories using SSH instead of HTTPS
      -v, --version         show program's version number and exit
+     --keychain-name OSX_KEYCHAIN_ITEM_NAME
+                           OSX ONLY: name field of password item in OSX
+                           keychain that holds the personal access or OAuth
+                           token
+     --keychain-account OSX_KEYCHAIN_ITEM_ACCOUNT
+                           OSX ONLY: account field of password item in OSX
+                           keychain that holds the personal access or OAuth
+                           token
 
 The package can be used to backup an *entire* organization or repository, including issues and wikis in the most appropriate format (clones for wikis, json files for issues).
+
+Authentication
+==============
+
+Note: Password-based authentication will fail if you have two-factor authentication enabled.
+
+Using the Keychain on Mac OSX
+=============================
+
+Note: On Mac OSX the token can be stored securely in the user's keychain. To do this:
+
+1. Open Keychain from "Applications -> Utilities -> Keychain Access"
+2. Add a new password item using "File -> New Password Item"
+3. Enter a name in the "Keychain Item Name" box. You must provide this name to github-backup using the --keychain-name argument.
+4. Enter an account name in the "Account Name" box; enter your Github username as set above. You must provide this name to github-backup using the --keychain-account argument.
+5. Enter your Github personal access token in the "Password" box
+
+Note: When you run github-backup, you will be asked whether you want to allow "security" to use your confidential information stored in your keychain. You have two options:
+
+1. **Allow:** In this case you will need to click "Allow" each time you run `github-backup`
+2. **Always Allow:** In this case, you will not be asked for permission when you run `github-backup` in future. This is less secure, but is required if you want to schedule `github-backup` to run automatically
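The -i/--incremental option only re-fetches issue and pull request data changed since the previous run (per the 0.10.0 changelog entry, which says the time of the last update is stored and passed along). A sketch of the bookkeeping that implies; the helper names and one-file storage layout here are assumptions for illustration, not the script's actual implementation:

```python
import os
import re
import tempfile
import time


def read_last_run(path):
    # Hypothetical helper: return the ISO-8601 timestamp persisted by the
    # previous backup, or None on the first run.
    if not os.path.exists(path):
        return None
    with open(path) as f:
        return f.read().strip()


def write_last_run(path, when=None):
    # Persist the current UTC time in the ISO-8601 form the GitHub API
    # accepts for its `since` query parameter.
    when = when if when is not None else time.gmtime()
    with open(path, 'w') as f:
        f.write(time.strftime('%Y-%m-%dT%H:%M:%SZ', when))


# Usage: on later runs, pass since=read_last_run(...) when listing issues
# and pull requests, so only items updated since the last backup download.
path = os.path.join(tempfile.mkdtemp(), 'last_update')
print(read_last_run(path))  # None
write_last_run(path)
print(re.match(r'\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z', read_last_run(path)) is not None)  # True
```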
812
bin/github-backup
Normal file → Executable file
812
bin/github-backup
Normal file → Executable file
@@ -1,8 +1,13 @@
|
|||||||
#!/usr/bin/env python
|
#!/usr/bin/env python
|
||||||
|
|
||||||
|
from __future__ import print_function
|
||||||
|
|
||||||
import argparse
|
import argparse
|
||||||
import base64
|
import base64
|
||||||
|
import calendar
|
||||||
|
import codecs
|
||||||
import errno
|
import errno
|
||||||
|
import getpass
|
||||||
import json
|
import json
|
||||||
import logging
|
import logging
|
||||||
import os
|
import os
|
||||||
@@ -10,11 +15,29 @@ import re
|
|||||||
import select
|
import select
|
||||||
import subprocess
|
import subprocess
|
||||||
import sys
|
import sys
|
||||||
import urllib
|
import time
|
||||||
import urllib2
|
import platform
|
||||||
|
try:
|
||||||
|
# python 3
|
||||||
|
from urllib.parse import urlparse
|
||||||
|
from urllib.parse import quote as urlquote
|
||||||
|
from urllib.parse import urlencode
|
||||||
|
from urllib.error import HTTPError, URLError
|
||||||
|
from urllib.request import urlopen
|
||||||
|
from urllib.request import Request
|
||||||
|
except ImportError:
|
||||||
|
# python 2
|
||||||
|
from urlparse import urlparse
|
||||||
|
from urllib import quote as urlquote
|
||||||
|
from urllib import urlencode
|
||||||
|
from urllib2 import HTTPError, URLError
|
||||||
|
from urllib2 import urlopen
|
||||||
|
from urllib2 import Request
|
||||||
|
|
||||||
from github_backup import __version__
|
from github_backup import __version__
|
||||||
|
|
||||||
|
FNULL = open(os.devnull, 'w')
|
||||||
|
|
||||||
|
|
||||||
def log_error(message):
|
def log_error(message):
|
||||||
if type(message) == str:
|
if type(message) == str:
|
||||||
@@ -34,7 +57,11 @@ def log_info(message):
|
|||||||
sys.stdout.write("{0}\n".format(msg))
|
sys.stdout.write("{0}\n".format(msg))
|
||||||
|
|
||||||
|
|
||||||
def logging_subprocess(popenargs, logger, stdout_log_level=logging.DEBUG, stderr_log_level=logging.ERROR, **kwargs):
|
def logging_subprocess(popenargs,
|
||||||
|
logger,
|
||||||
|
stdout_log_level=logging.DEBUG,
|
||||||
|
stderr_log_level=logging.ERROR,
|
||||||
|
**kwargs):
|
||||||
"""
|
"""
|
||||||
Variant of subprocess.call that accepts a logger instead of stdout/stderr,
|
Variant of subprocess.call that accepts a logger instead of stdout/stderr,
|
||||||
and logs stdout messages via logger.debug and stderr messages via
|
and logs stdout messages via logger.debug and stderr messages via
|
||||||
@@ -47,7 +74,10 @@ def logging_subprocess(popenargs, logger, stdout_log_level=logging.DEBUG, stderr
|
|||||||
child.stderr: stderr_log_level}
|
child.stderr: stderr_log_level}
|
||||||
|
|
||||||
def check_io():
|
def check_io():
|
||||||
ready_to_read = select.select([child.stdout, child.stderr], [], [], 1000)[0]
|
ready_to_read = select.select([child.stdout, child.stderr],
|
||||||
|
[],
|
||||||
|
[],
|
||||||
|
1000)[0]
|
||||||
for io in ready_to_read:
|
for io in ready_to_read:
|
||||||
line = io.readline()
|
line = io.readline()
|
||||||
if not logger:
|
if not logger:
|
||||||
@@ -61,7 +91,13 @@ def logging_subprocess(popenargs, logger, stdout_log_level=logging.DEBUG, stderr
|
|||||||
|
|
||||||
check_io() # check again to catch anything after the process exits
|
check_io() # check again to catch anything after the process exits
|
||||||
|
|
||||||
return child.wait()
|
rc = child.wait()
|
||||||
|
|
||||||
|
if rc != 0:
|
||||||
|
print('{} returned {}:'.format(popenargs[0], rc), file=sys.stderr)
|
||||||
|
print('\t', ' '.join(popenargs), file=sys.stderr)
|
||||||
|
|
||||||
|
return rc
|
||||||
|
|
||||||
|
|
||||||
def mkdir_p(*args):
|
def mkdir_p(*args):
|
||||||
@@ -75,81 +111,258 @@ def mkdir_p(*args):
|
|||||||
raise
|
raise
|
||||||
|
|
||||||
|
|
||||||
|
def mask_password(url, secret='*****'):
|
||||||
|
parsed = urlparse(url)
|
||||||
|
|
||||||
|
if not parsed.password:
|
||||||
|
return url
|
||||||
|
elif parsed.password == 'x-oauth-basic':
|
||||||
|
return url.replace(parsed.username, secret)
|
||||||
|
|
||||||
|
return url.replace(parsed.password, secret)
|
||||||
|
|
||||||
|
|
||||||
def parse_args():
|
def parse_args():
|
||||||
parser = argparse.ArgumentParser(description='Backup a github users account', prog='Github Backup')
|
parser = argparse.ArgumentParser(description='Backup a github account')
|
||||||
parser.add_argument('user', metavar='USER', type=str, help='github username')
|
parser.add_argument('user',
|
||||||
parser.add_argument('-u', '--username', dest='username', help='username for basic auth')
|
metavar='USER',
|
||||||
parser.add_argument('-p', '--password', dest='password', help='password for basic auth')
|
type=str,
|
||||||
parser.add_argument('-t', '--token', dest='token', help='personal access or OAuth token')
|
help='github username')
|
||||||
parser.add_argument('-o', '--output-directory', default='.', dest='output_directory', help='directory at which to backup the repositories')
|
parser.add_argument('-u',
|
||||||
parser.add_argument('--starred', action='store_true', dest='include_starred', help='include starred repositories in backup')
|
'--username',
|
||||||
parser.add_argument('--watched', action='store_true', dest='include_watched', help='include watched repositories in backup')
|
dest='username',
|
||||||
parser.add_argument('--all', action='store_true', dest='include_everything', help='include everything in backup')
|
help='username for basic auth')
|
||||||
parser.add_argument('--issues', action='store_true', dest='include_issues', help='include issues in backup')
|
parser.add_argument('-p',
|
||||||
parser.add_argument('--issue-comments', action='store_true', dest='include_issue_comments', help='include issue comments in backup')
|
'--password',
|
||||||
parser.add_argument('--issue-events', action='store_true', dest='include_issue_events', help='include issue events in backup')
|
dest='password',
|
||||||
parser.add_argument('--repositories', action='store_true', dest='include_repository', help='include repository clone in backup')
|
help='password for basic auth. '
|
||||||
parser.add_argument('--wikis', action='store_true', dest='include_wiki', help='include wiki clone in backup')
|
'If a username is given but not a password, the '
|
||||||
parser.add_argument('--skip-existing', action='store_true', dest='skip_existing', help='skip project if a backup directory exists')
|
'password will be prompted for.')
|
||||||
parser.add_argument('-L', '--languages', dest='languages', help='only allow these languages', nargs='*')
|
parser.add_argument('-t',
|
||||||
parser.add_argument('-N', '--name-regex', dest='name_regex', help='python regex to match names against')
|
'--token',
|
||||||
parser.add_argument('-O', '--organization', action='store_true', dest='organization', help='whether or not this is a query for an organization')
|
dest='token',
|
||||||
parser.add_argument('-R', '--repository', dest='repository', help='name of repository to limit backup to')
|
help='personal access or OAuth token, or path to token (file://...)') # noqa
|
||||||
parser.add_argument('-P', '--private', action='store_true', dest='private', help='include private repositories')
|
parser.add_argument('-o',
|
||||||
parser.add_argument('-F', '--fork', action='store_true', dest='fork', help='include forked repositories')
|
'--output-directory',
|
||||||
parser.add_argument('-v', '--version', action='version', version='%(prog)s ' + __version__)
|
default='.',
|
||||||
|
dest='output_directory',
|
||||||
|
help='directory at which to backup the repositories')
|
||||||
|
parser.add_argument('-i',
|
||||||
|
'--incremental',
|
||||||
|
action='store_true',
|
||||||
|
dest='incremental',
|
||||||
|
help='incremental backup')
|
||||||
|
parser.add_argument('--starred',
|
||||||
|
action='store_true',
|
||||||
|
dest='include_starred',
|
||||||
|
help='include starred repositories in backup')
|
||||||
|
parser.add_argument('--watched',
|
||||||
|
action='store_true',
|
||||||
|
dest='include_watched',
|
||||||
|
help='include watched repositories in backup')
|
||||||
|
parser.add_argument('--all',
|
||||||
|
action='store_true',
|
||||||
|
dest='include_everything',
|
||||||
|
help='include everything in backup')
|
||||||
|
parser.add_argument('--issues',
|
||||||
|
action='store_true',
|
||||||
|
dest='include_issues',
|
||||||
|
help='include issues in backup')
|
||||||
|
parser.add_argument('--issue-comments',
|
||||||
|
action='store_true',
|
||||||
|
dest='include_issue_comments',
|
||||||
|
help='include issue comments in backup')
|
||||||
|
parser.add_argument('--issue-events',
|
||||||
|
action='store_true',
|
||||||
|
dest='include_issue_events',
|
||||||
|
help='include issue events in backup')
|
||||||
|
parser.add_argument('--pulls',
|
||||||
|
action='store_true',
|
||||||
|
dest='include_pulls',
|
||||||
|
help='include pull requests in backup')
|
||||||
|
parser.add_argument('--pull-comments',
+                        action='store_true',
+                        dest='include_pull_comments',
+                        help='include pull request review comments in backup')
+    parser.add_argument('--pull-commits',
+                        action='store_true',
+                        dest='include_pull_commits',
+                        help='include pull request commits in backup')
+    parser.add_argument('--labels',
+                        action='store_true',
+                        dest='include_labels',
+                        help='include labels in backup')
+    parser.add_argument('--hooks',
+                        action='store_true',
+                        dest='include_hooks',
+                        help='include hooks in backup (works only when authenticated)')  # noqa
+    parser.add_argument('--milestones',
+                        action='store_true',
+                        dest='include_milestones',
+                        help='include milestones in backup')
+    parser.add_argument('--repositories',
+                        action='store_true',
+                        dest='include_repository',
+                        help='include repository clone in backup')
+    parser.add_argument('--bare',
+                        action='store_true',
+                        dest='bare_clone',
+                        help='clone bare repositories')
+    parser.add_argument('--wikis',
+                        action='store_true',
+                        dest='include_wiki',
+                        help='include wiki clone in backup')
+    parser.add_argument('--skip-existing',
+                        action='store_true',
+                        dest='skip_existing',
+                        help='skip project if a backup directory exists')
+    parser.add_argument('-L',
+                        '--languages',
+                        dest='languages',
+                        help='only allow these languages',
+                        nargs='*')
+    parser.add_argument('-N',
+                        '--name-regex',
+                        dest='name_regex',
+                        help='python regex to match names against')
+    parser.add_argument('-H',
+                        '--github-host',
+                        dest='github_host',
+                        help='GitHub Enterprise hostname')
+    parser.add_argument('-O',
+                        '--organization',
+                        action='store_true',
+                        dest='organization',
+                        help='whether or not this is an organization user')
+    parser.add_argument('-R',
+                        '--repository',
+                        dest='repository',
+                        help='name of repository to limit backup to')
+    parser.add_argument('-P', '--private',
+                        action='store_true',
+                        dest='private',
+                        help='include private repositories')
+    parser.add_argument('-F', '--fork',
+                        action='store_true',
+                        dest='fork',
+                        help='include forked repositories')
+    parser.add_argument('--prefer-ssh',
+                        action='store_true',
+                        help='Clone repositories using SSH instead of HTTPS')
+    parser.add_argument('-v', '--version',
+                        action='version',
+                        version='%(prog)s ' + __version__)
+    parser.add_argument('--keychain-name',
+                        dest='osx_keychain_item_name',
+                        help='OSX ONLY: name field of password item in OSX keychain that holds the personal access or OAuth token')  # noqa
+    parser.add_argument('--keychain-account',
+                        dest='osx_keychain_item_account',
+                        help='OSX ONLY: account field of password item in OSX keychain that holds the personal access or OAuth token')  # noqa
     return parser.parse_args()


-def get_auth(args):
+def get_auth(args, encode=True):
     auth = None
-    if args.token:
-        auth = base64.b64encode(args.token + ':' + 'x-oauth-basic')
-    elif args.username and args.password:
-        auth = base64.b64encode(args.username + ':' + args.password)
-    elif args.username and not args.password:
-        log_error('You must specify a password for basic auth when specifying a username')
-    elif args.password and not args.username:
-        log_error('You must specify a username for basic auth when specifying a password')
-
-    return auth
+    if args.osx_keychain_item_name:
+        if not args.osx_keychain_item_account:
+            log_error('You must specify both name and account fields for osx keychain password items')  # noqa
+        else:
+            if platform.system() != 'Darwin':
+                log_error("Keychain arguments are only supported on Mac OSX")
+            try:
+                with open(os.devnull, 'w') as devnull:
+                    token = (subprocess.check_output([
+                        'security', 'find-generic-password',
+                        '-s', args.osx_keychain_item_name,
+                        '-a', args.osx_keychain_item_account,
+                        '-w'], stderr=devnull).strip())
+                auth = token + ':' + 'x-oauth-basic'
+            except:
+                log_error('No password item matching the provided name and account could be found in the osx keychain.')  # noqa
+    elif args.osx_keychain_item_account:
+        log_error('You must specify both name and account fields for osx keychain password items')  # noqa
+    elif args.token:
+        _path_specifier = 'file://'
+        if args.token.startswith(_path_specifier):
+            args.token = open(args.token[len(_path_specifier):],
+                              'rt').readline().strip()
+        auth = args.token + ':' + 'x-oauth-basic'
+    elif args.username:
+        if not args.password:
+            args.password = getpass.getpass()
+        if encode:
+            password = args.password
+        else:
+            password = urlquote(args.password)
+        auth = args.username + ':' + password
+    elif args.password:
+        log_error('You must specify a username for basic auth')
+
+    if not auth:
+        return None
+
+    if not encode:
+        return auth
+
+    return base64.b64encode(auth.encode('ascii'))
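get_auth ultimately produces the value for a Basic `Authorization` header. The final encoding step can be sketched in isolation (the helper name and sample token below are illustrative, not from the repository):

```python
import base64

def basic_auth_value(token):
    # Token auth uses the token as the username and 'x-oauth-basic'
    # as a dummy password, then base64-encodes the 'user:pass' pair
    auth = token + ':' + 'x-oauth-basic'
    return base64.b64encode(auth.encode('ascii'))

header = b'Basic ' + basic_auth_value('abc123')
```

Encoding to ASCII bytes before `b64encode` is what the `auth.encode('ascii')` call in the new `get_auth` does; under Python 3, `b64encode` rejects `str` input.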
+
+
+def get_github_api_host(args):
+    if args.github_host:
+        host = args.github_host + '/api/v3'
+    else:
+        host = 'api.github.com'
+
+    return host
+
+
+def get_github_host(args):
+    if args.github_host:
+        host = args.github_host
+    else:
+        host = 'github.com'
+
+    return host
+
+
+def get_github_repo_url(args, repository):
+    if args.prefer_ssh:
+        return repository['ssh_url']
+
+    auth = get_auth(args, False)
+    if auth:
+        repo_url = 'https://{0}@{1}/{2}/{3}.git'.format(
+            auth,
+            get_github_host(args),
+            args.user,
+            repository['name'])
+    else:
+        repo_url = repository['clone_url']
+
+    return repo_url
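get_github_repo_url embeds the unencoded auth string (`get_auth(args, False)`) directly in the HTTPS remote, which is why `get_auth` URL-quotes the password in that mode. A standalone sketch of the construction (names and credentials below are made up):

```python
from urllib.parse import quote

def build_clone_url(user, repo_name, auth=None, host='github.com'):
    # auth is 'username:password' with reserved characters already
    # percent-encoded, mirroring the urlquote() call in get_auth
    if auth:
        return 'https://{0}@{1}/{2}/{3}.git'.format(auth, host, user, repo_name)
    return 'https://{0}/{1}/{2}.git'.format(host, user, repo_name)

password = quote('p@ss:word', safe='')  # raw '@' and ':' would break the URL
url = build_clone_url('octocat', 'hello', 'octocat:' + password)
```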
 def retrieve_data(args, template, query_args=None, single_request=False):
     auth = get_auth(args)
+    query_args = get_query_args(query_args)
     per_page = 100
     page = 0
     data = []
-    if not query_args:
-        query_args = {}
-
     while True:
         page = page + 1
-        querystring = urllib.urlencode(dict({
-            'per_page': per_page,
-            'page': page
-        }.items() + query_args.items()))
-
-        request = urllib2.Request(template + '?' + querystring)
-        if auth is not None:
-            request.add_header('Authorization', 'Basic ' + auth)
-        r = urllib2.urlopen(request)
-
-        errors = []
-        if int(r.getcode()) != 200:
-            errors.append('Bad response from api')
-
-        if 'X-RateLimit-Limit' in r.headers and int(r.headers['X-RateLimit-Limit']) == 0:
-            ratelimit_error = 'No more requests remaining'
-            if auth is None:
-                ratelimit_error = ratelimit_error + ', specify username/password or token to raise your github ratelimit'
-
-            errors.append(ratelimit_error)
-
-        if int(r.getcode()) != 200:
+        request = _construct_request(per_page, page, query_args, template, auth)  # noqa
+        r, errors = _get_response(request, auth, template)
+
+        status_code = int(r.getcode())
+
+        if status_code != 200:
+            template = 'API request returned HTTP {0}: {1}'
+            errors.append(template.format(status_code, r.reason))
             log_error(errors)

-        response = json.loads(r.read())
+        response = json.loads(r.read().decode('utf-8'))
         if len(errors) == 0:
             if type(response) == list:
                 data.extend(response)

@@ -167,22 +380,121 @@ def retrieve_data(args, template, query_args=None, single_request=False):
         return data
+
+
+def get_query_args(query_args=None):
+    if not query_args:
+        query_args = {}
+    return query_args
+
+
+def _get_response(request, auth, template):
+    retry_timeout = 3
+    errors = []
+    # We'll make requests in a loop so we can
+    # delay and retry in the case of rate-limiting
+    while True:
+        should_continue = False
+        try:
+            r = urlopen(request)
+        except HTTPError as exc:
+            errors, should_continue = _request_http_error(exc, auth, errors)  # noqa
+            r = exc
+        except URLError:
+            should_continue = _request_url_error(template, retry_timeout)
+            if not should_continue:
+                raise
+
+        if should_continue:
+            continue
+
+        break
+    return r, errors
+
+
+def _construct_request(per_page, page, query_args, template, auth):
+    querystring = urlencode(dict(list({
+        'per_page': per_page,
+        'page': page
+    }.items()) + list(query_args.items())))
+
+    request = Request(template + '?' + querystring)
+    if auth is not None:
+        request.add_header('Authorization', 'Basic '.encode('ascii') + auth)
+    return request
+
+
+def _request_http_error(exc, auth, errors):
+    # HTTPError behaves like a Response so we can
+    # check the status code and headers to see exactly
+    # what failed.
+
+    should_continue = False
+    headers = exc.headers
+    limit_remaining = int(headers.get('x-ratelimit-remaining', 0))
+
+    if exc.code == 403 and limit_remaining < 1:
+        # The X-RateLimit-Reset header includes a
+        # timestamp telling us when the limit will reset
+        # so we can calculate how long to wait rather
+        # than inefficiently polling:
+        gm_now = calendar.timegm(time.gmtime())
+        reset = int(headers.get('x-ratelimit-reset', 0)) or gm_now
+        # We'll never sleep for less than 10 seconds:
+        delta = max(10, reset - gm_now)
+
+        limit = headers.get('x-ratelimit-limit')
+        print('Exceeded rate limit of {} requests; waiting {} seconds to reset'.format(limit, delta),  # noqa
+              file=sys.stderr)
+
+        if auth is None:
+            print('Hint: Authenticate to raise your GitHub rate limit',
+                  file=sys.stderr)
+
+        time.sleep(delta)
+        should_continue = True
+    return errors, should_continue
+
+
+def _request_url_error(template, retry_timeout):
+    # In case the connection times out, we can retry a few times,
+    # but we won't crash and skip backing up the rest.
+    log_info('{} timed out'.format(template))
+    retry_timeout -= 1
+
+    if retry_timeout >= 0:
+        return True
+
+    log_error('{} timed out too many times, skipping!'.format(template))
+    return False
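_request_http_error sleeps until the epoch timestamp in `X-RateLimit-Reset`. The wait computation can be exercised on its own (header values below are invented for illustration):

```python
import calendar
import time

def ratelimit_delay(headers, now=None):
    # Sleep until the epoch timestamp in x-ratelimit-reset,
    # but never for less than 10 seconds (mirrors _request_http_error)
    gm_now = now if now is not None else calendar.timegm(time.gmtime())
    reset = int(headers.get('x-ratelimit-reset', 0)) or gm_now
    return max(10, reset - gm_now)
```

Falling back to `gm_now` when the header is missing means the function degrades to the 10-second floor instead of sleeping for a huge negative-turned-zero interval.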
 def retrieve_repositories(args):
     log_info('Retrieving repositories')
     single_request = False
-    template = 'https://api.github.com/users/{0}/repos'.format(args.user)
+    template = 'https://{0}/user/repos'.format(
+        get_github_api_host(args))
     if args.organization:
-        template = 'https://api.github.com/orgs/{0}/repos'.format(args.user)
+        template = 'https://{0}/orgs/{1}/repos'.format(
+            get_github_api_host(args),
+            args.user)

     if args.repository:
         single_request = True
-        template = 'https://api.github.com/repos/{0}/{1}'.format(args.user, args.repository)
+        template = 'https://{0}/repos/{1}/{2}'.format(
+            get_github_api_host(args),
+            args.user,
+            args.repository)

     return retrieve_data(args, template, single_request=single_request)


-def filter_repositories(args, repositories):
+def filter_repositories(args, unfiltered_repositories):
     log_info('Filtering repositories')

+    repositories = []
+    for r in unfiltered_repositories:
+        if r['owner']['login'] == args.user:
+            repositories.append(r)
+
     name_regex = None
     if args.name_regex:
         name_regex = re.compile(args.name_regex)

@@ -196,7 +508,7 @@ def filter_repositories(args, repositories):
     if not args.private:
         repositories = [r for r in repositories if not r['private']]
     if languages:
-        repositories = [r for r in repositories if r['language'] and r['language'].lower() in languages]
+        repositories = [r for r in repositories if r['language'] and r['language'].lower() in languages]  # noqa
     if name_regex:
         repositories = [r for r in repositories if name_regex.match(r['name'])]

@@ -205,101 +517,326 @@ def filter_repositories(args, repositories):
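The new owner check plus the existing comprehensions compose into a small filtering pipeline. A self-contained sketch with invented repository dicts (the function name and sample data are illustrative):

```python
import re

def filter_repos(repos, user, private=False, languages=None, name_regex=None):
    # keep only repositories owned by the requested user
    repos = [r for r in repos if r['owner']['login'] == user]
    if not private:
        repos = [r for r in repos if not r['private']]
    if languages:
        repos = [r for r in repos
                 if r['language'] and r['language'].lower() in languages]
    if name_regex:
        regex = re.compile(name_regex)
        repos = [r for r in repos if regex.match(r['name'])]
    return repos

repos = [
    {'name': 'tool', 'owner': {'login': 'me'}, 'private': False, 'language': 'Python'},
    {'name': 'secret', 'owner': {'login': 'me'}, 'private': True, 'language': 'Go'},
]
names = [r['name'] for r in filter_repos(repos, 'me', languages=['python'])]
```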
 def backup_repositories(args, output_directory, repositories):
     log_info('Backing up repositories')
-    issue_template = "https://api.github.com/repos"
-    wiki_template = "git@github.com:{0}.wiki.git"
-    issue_states = ['open', 'closed']
+    repos_template = 'https://{0}/repos'.format(get_github_api_host(args))
+
+    if args.incremental:
+        last_update = max(list(repository['updated_at'] for repository in repositories) or [time.strftime('%Y-%m-%dT%H:%M:%SZ', time.localtime())])  # noqa
+        last_update_path = os.path.join(output_directory, 'last_update')
+        if os.path.exists(last_update_path):
+            args.since = open(last_update_path).read().strip()
+        else:
+            args.since = None
+    else:
+        args.since = None
+
     for repository in repositories:
         backup_cwd = os.path.join(output_directory, 'repositories')
         repo_cwd = os.path.join(backup_cwd, repository['name'])
+        repo_dir = os.path.join(repo_cwd, 'repository')
+        repo_url = get_github_repo_url(args, repository)

         if args.include_repository or args.include_everything:
-            mkdir_p(backup_cwd, repo_cwd)
-            exists = os.path.isdir('{0}/repository/.git'.format(repo_cwd))
-            if args.skip_existing and exists:
-                continue
-
-            if exists:
-                log_info('Updating {0} repository'.format(repository['full_name']))
-                git_command = ["git", "pull", 'origin', 'master']
-                logging_subprocess(git_command, logger=None, cwd=os.path.join(repo_cwd, 'repository'))
-            else:
-                log_info('Cloning {0} repository'.format(repository['full_name']))
-                git_command = ["git", "clone", repository['clone_url'], 'repository']
-                logging_subprocess(git_command, logger=None, cwd=repo_cwd)
-
-        if repository['has_wiki'] and (args.include_wiki or args.include_everything):
-            mkdir_p(backup_cwd, repo_cwd)
-            exists = os.path.isdir('{0}/wiki/.git'.format(repo_cwd))
-            if args.skip_existing and exists:
-                continue
-
-            if exists:
-                log_info('Updating {0} wiki'.format(repository['full_name']))
-                git_command = ["git", "pull", 'origin', 'master']
-                logging_subprocess(git_command, logger=None, cwd=os.path.join(repo_cwd, 'wiki'))
-            else:
-                log_info('Cloning {0} wiki'.format(repository['full_name']))
-                git_command = ["git", "clone", wiki_template.format(repository['full_name']), 'wiki']
-                logging_subprocess(git_command, logger=None, cwd=repo_cwd)
+            fetch_repository(repository['name'],
+                             repo_url,
+                             repo_dir,
+                             skip_existing=args.skip_existing,
+                             bare_clone=args.bare_clone)
+
+        download_wiki = (args.include_wiki or args.include_everything)
+        if repository['has_wiki'] and download_wiki:
+            fetch_repository(repository['name'],
+                             repo_url.replace('.git', '.wiki.git'),
+                             os.path.join(repo_cwd, 'wiki'),
+                             skip_existing=args.skip_existing,
+                             bare_clone=args.bare_clone)

         if args.include_issues or args.include_everything:
-            if args.skip_existing and os.path.isdir('{0}/issues/.git'.format(repo_cwd)):
+            backup_issues(args, repo_cwd, repository, repos_template)
+
+        if args.include_pulls or args.include_everything:
+            backup_pulls(args, repo_cwd, repository, repos_template)
+
+        if args.include_milestones or args.include_everything:
+            backup_milestones(args, repo_cwd, repository, repos_template)
+
+        if args.include_labels or args.include_everything:
+            backup_labels(args, repo_cwd, repository, repos_template)
+
+        if args.include_hooks or args.include_everything:
+            backup_hooks(args, repo_cwd, repository, repos_template)
+
+    if args.incremental:
+        open(last_update_path, 'w').write(last_update)
+
+
+def backup_issues(args, repo_cwd, repository, repos_template):
+    has_issues_dir = os.path.isdir('{0}/issues/.git'.format(repo_cwd))
+    if args.skip_existing and has_issues_dir:
+        return
+
+    log_info('Retrieving {0} issues'.format(repository['full_name']))
+    issue_cwd = os.path.join(repo_cwd, 'issues')
+    mkdir_p(repo_cwd, issue_cwd)
+
+    issues = {}
+    issues_skipped = 0
+    issues_skipped_message = ''
+    _issue_template = '{0}/{1}/issues'.format(repos_template,
+                                              repository['full_name'])
+
+    should_include_pulls = args.include_pulls or args.include_everything
+    issue_states = ['open', 'closed']
+    for issue_state in issue_states:
+        query_args = {
+            'filter': 'all',
+            'state': issue_state
+        }
+        if args.since:
+            query_args['since'] = args.since
+
+        _issues = retrieve_data(args,
+                                _issue_template,
+                                query_args=query_args)
+        for issue in _issues:
+            # skip pull requests which are also returned as issues
+            # if retrieving pull requests is requested as well
+            if 'pull_request' in issue and should_include_pulls:
+                issues_skipped += 1
                 continue

-            log_info('Retrieving {0} issues'.format(repository['full_name']))
-            issue_cwd = os.path.join(repo_cwd, 'issues')
-            mkdir_p(backup_cwd, repo_cwd, issue_cwd)
-
-            issues = {}
-            _issue_template = '{0}/{1}/issues'.format(issue_template, repository['full_name'])
-
-            for issue_state in issue_states:
-                query_args = {
-                    'filter': 'all',
-                    'state': issue_state
-                }
-
-                _issues = retrieve_data(args, _issue_template, query_args=query_args)
-                for issue in _issues:
-                    issues[issue['number']] = issue
-
-            log_info('Saving {0} issues to disk'.format(len(issues.keys())))
-            for number, issue in issues.iteritems():
-                comments_template = _issue_template + '/{0}/comments'
-                events_template = _issue_template + '/{0}/events'
-                if args.include_issue_comments or args.include_everything:
-                    issues[number]['comment_data'] = retrieve_data(args, comments_template.format(number))
-                if args.include_issue_events or args.include_everything:
-                    issues[number]['event_data'] = retrieve_data(args, events_template.format(number))
-
-                with open('{0}/{1}.json'.format(issue_cwd, number), 'w') as issue_file:
-                    json.dump(issue, issue_file, sort_keys=True, indent=4, separators=(',', ': '))
+            issues[issue['number']] = issue
+
+    if issues_skipped:
+        issues_skipped_message = ' (skipped {0} pull requests)'.format(
+            issues_skipped)
+
+    log_info('Saving {0} issues to disk{1}'.format(
+        len(list(issues.keys())), issues_skipped_message))
+    comments_template = _issue_template + '/{0}/comments'
+    events_template = _issue_template + '/{0}/events'
+    for number, issue in list(issues.items()):
+        if args.include_issue_comments or args.include_everything:
+            template = comments_template.format(number)
+            issues[number]['comment_data'] = retrieve_data(args, template)
+        if args.include_issue_events or args.include_everything:
+            template = events_template.format(number)
+            issues[number]['event_data'] = retrieve_data(args, template)
+
+        issue_file = '{0}/{1}.json'.format(issue_cwd, number)
+        with codecs.open(issue_file, 'w', encoding='utf-8') as f:
+            json_dump(issue, f)
+
+
+def backup_pulls(args, repo_cwd, repository, repos_template):
+    has_pulls_dir = os.path.isdir('{0}/pulls/.git'.format(repo_cwd))
+    if args.skip_existing and has_pulls_dir:
+        return
+
+    log_info('Retrieving {0} pull requests'.format(repository['full_name']))  # noqa
+    pulls_cwd = os.path.join(repo_cwd, 'pulls')
+    mkdir_p(repo_cwd, pulls_cwd)
+
+    pulls = {}
+    _pulls_template = '{0}/{1}/pulls'.format(repos_template,
+                                             repository['full_name'])
+
+    pull_states = ['open', 'closed']
+    for pull_state in pull_states:
+        query_args = {
+            'filter': 'all',
+            'state': pull_state,
+            'sort': 'updated',
+            'direction': 'desc',
+        }
+
+        # It'd be nice to be able to apply the args.since filter here...
+        _pulls = retrieve_data(args,
+                               _pulls_template,
+                               query_args=query_args)
+        for pull in _pulls:
+            if not args.since or pull['updated_at'] >= args.since:
+                pulls[pull['number']] = pull
+
+    log_info('Saving {0} pull requests to disk'.format(
+        len(list(pulls.keys()))))
+    comments_template = _pulls_template + '/{0}/comments'
+    commits_template = _pulls_template + '/{0}/commits'
+    for number, pull in list(pulls.items()):
+        if args.include_pull_comments or args.include_everything:
+            template = comments_template.format(number)
+            pulls[number]['comment_data'] = retrieve_data(args, template)
+        if args.include_pull_commits or args.include_everything:
+            template = commits_template.format(number)
+            pulls[number]['commit_data'] = retrieve_data(args, template)
+
+        pull_file = '{0}/{1}.json'.format(pulls_cwd, number)
+        with codecs.open(pull_file, 'w', encoding='utf-8') as f:
+            json_dump(pull, f)
+
+
+def backup_milestones(args, repo_cwd, repository, repos_template):
+    milestone_cwd = os.path.join(repo_cwd, 'milestones')
+    if args.skip_existing and os.path.isdir(milestone_cwd):
+        return
+
+    log_info('Retrieving {0} milestones'.format(repository['full_name']))
+    mkdir_p(repo_cwd, milestone_cwd)
+
+    template = '{0}/{1}/milestones'.format(repos_template,
+                                           repository['full_name'])
+
+    query_args = {
+        'state': 'all'
+    }
+
+    _milestones = retrieve_data(args, template, query_args=query_args)
+
+    milestones = {}
+    for milestone in _milestones:
+        milestones[milestone['number']] = milestone
+
+    log_info('Saving {0} milestones to disk'.format(
+        len(list(milestones.keys()))))
+    for number, milestone in list(milestones.items()):
+        milestone_file = '{0}/{1}.json'.format(milestone_cwd, number)
+        with codecs.open(milestone_file, 'w', encoding='utf-8') as f:
+            json_dump(milestone, f)
+
+
+def backup_labels(args, repo_cwd, repository, repos_template):
+    label_cwd = os.path.join(repo_cwd, 'labels')
+    output_file = '{0}/labels.json'.format(label_cwd)
+    template = '{0}/{1}/labels'.format(repos_template,
+                                       repository['full_name'])
+    _backup_data(args,
+                 'labels',
+                 template,
+                 output_file,
+                 label_cwd)
+
+
+def backup_hooks(args, repo_cwd, repository, repos_template):
+    auth = get_auth(args)
+    if not auth:
+        log_info("Skipping hooks since no authentication provided")
+        return
+    hook_cwd = os.path.join(repo_cwd, 'hooks')
+    output_file = '{0}/hooks.json'.format(hook_cwd)
+    template = '{0}/{1}/hooks'.format(repos_template,
+                                      repository['full_name'])
+    try:
+        _backup_data(args,
+                     'hooks',
+                     template,
+                     output_file,
+                     hook_cwd)
+    except SystemExit:
+        log_info("Unable to read hooks, skipping")
+
+
+def fetch_repository(name,
+                     remote_url,
+                     local_dir,
+                     skip_existing=False,
+                     bare_clone=False):
+    if bare_clone:
+        if os.path.exists(local_dir):
+            clone_exists = subprocess.check_output(['git',
+                                                    'rev-parse',
+                                                    '--is-bare-repository'],
+                                                   cwd=local_dir) == "true\n"
+        else:
+            clone_exists = False
+    else:
+        clone_exists = os.path.exists(os.path.join(local_dir, '.git'))
+
+    if clone_exists and skip_existing:
+        return
+
+    masked_remote_url = mask_password(remote_url)
+
+    initialized = subprocess.call('git ls-remote ' + remote_url,
+                                  stdout=FNULL,
+                                  stderr=FNULL,
+                                  shell=True)
+    if initialized == 128:
+        log_info("Skipping {0} ({1}) since it's not initialized".format(
+            name, masked_remote_url))
+        return
+
+    if clone_exists:
+        log_info('Updating {0} in {1}'.format(name, local_dir))
+
+        remotes = subprocess.check_output(['git', 'remote', 'show'],
+                                          cwd=local_dir)
+        remotes = [i.strip() for i in remotes.decode('utf-8')]
+
+        if 'origin' not in remotes:
+            git_command = ['git', 'remote', 'rm', 'origin']
+            logging_subprocess(git_command, None, cwd=local_dir)
+            git_command = ['git', 'remote', 'add', 'origin', remote_url]
+            logging_subprocess(git_command, None, cwd=local_dir)
+        else:
+            git_command = ['git', 'remote', 'set-url', 'origin', remote_url]
+            logging_subprocess(git_command, None, cwd=local_dir)
+
+        git_command = ['git', 'fetch', '--all', '--force', '--tags', '--prune']
+        logging_subprocess(git_command, None, cwd=local_dir)
+    else:
+        log_info('Cloning {0} repository from {1} to {2}'.format(
+            name,
+            masked_remote_url,
+            local_dir))
+        if bare_clone:
+            git_command = ['git', 'clone', '--mirror', remote_url, local_dir]
+        else:
+            git_command = ['git', 'clone', remote_url, local_dir]
+        logging_subprocess(git_command, None)
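fetch_repository decides between fetching and cloning by probing the target directory first. The detection logic in isolation (the helper name is illustrative; the bare-repository branch shells out to git as above, but only the non-bare path is exercised here):

```python
import os
import subprocess

def clone_exists(local_dir, bare_clone=False):
    # A bare clone has no '.git' subdirectory, so ask git directly
    if bare_clone:
        if not os.path.exists(local_dir):
            return False
        out = subprocess.check_output(
            ['git', 'rev-parse', '--is-bare-repository'], cwd=local_dir)
        return out.decode('utf-8').strip() == 'true'
    # A normal clone is recognised by its '.git' directory
    return os.path.exists(os.path.join(local_dir, '.git'))
```

Comparing the stripped output avoids the exact-`"true\n"` byte comparison in the function above, which would fail under Python 3 where `check_output` returns bytes.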
 def backup_account(args, output_directory):
     account_cwd = os.path.join(output_directory, 'account')
-    if args.include_starred or args.include_everything:
-        if not args.skip_existing or not os.path.exists('{0}/starred.json'.format(account_cwd)):
-            log_info('Retrieving {0} starred repositories'.format(args.user))
-            mkdir_p(account_cwd)
-
-            starred_template = "https://api.github.com/users/{0}/starred"
-            starred = retrieve_data(args, starred_template.format(args.user))
-            log_info('Writing {0} starred repositories'.format(len(starred)))
-            with open('{0}/starred.json'.format(account_cwd), 'w') as starred_file:
-                json.dump(starred, starred_file, sort_keys=True, indent=4, separators=(',', ': '))
+
+    if args.include_starred or args.include_everything:
+        output_file = '{0}/starred.json'.format(account_cwd)
+        template = "https://{0}/users/{1}/starred"
+        template = template.format(get_github_api_host(args), args.user)
+        _backup_data(args,
+                     'starred repositories',
+                     template,
+                     output_file,
+                     account_cwd)

     if args.include_watched or args.include_everything:
-        if not args.skip_existing or not os.path.exists('{0}/watched.json'.format(account_cwd)):
-            log_info('Retrieving {0} watched repositories'.format(args.user))
-            mkdir_p(account_cwd)
-
-            watched_template = "https://api.github.com/users/{0}/subscriptions"
-            watched = retrieve_data(args, watched_template.format(args.user))
-            log_info('Writing {0} watched repositories'.format(len(watched)))
-            with open('{0}/watched.json'.format(account_cwd), 'w') as watched_file:
-                json.dump(watched, watched_file, sort_keys=True, indent=4, separators=(',', ': '))
+        output_file = '{0}/watched.json'.format(account_cwd)
+        template = "https://{0}/users/{1}/subscriptions"
+        template = template.format(get_github_api_host(args), args.user)
+        _backup_data(args,
+                     'watched repositories',
+                     template,
+                     output_file,
+                     account_cwd)
+
+
+def _backup_data(args, name, template, output_file, output_directory):
+    skip_existing = args.skip_existing
+    if not skip_existing or not os.path.exists(output_file):
+        log_info('Retrieving {0} {1}'.format(args.user, name))
+        mkdir_p(output_directory)
+        data = retrieve_data(args, template)
+
+        log_info('Writing {0} {1} to disk'.format(len(data), name))
+        with codecs.open(output_file, 'w', encoding='utf-8') as f:
+            json_dump(data, f)
+
+
+def json_dump(data, output_file):
+    json.dump(data,
+              output_file,
+              ensure_ascii=False,
+              sort_keys=True,
+              indent=4,
+              separators=(',', ': '))
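json_dump standardises every JSON file the tool writes. A quick illustration of the formatting it produces (sorted keys, 4-space indent, unescaped UTF-8); the sample data is invented:

```python
import io
import json

def json_dump(data, output_file):
    # identical settings to the json_dump helper above
    json.dump(data,
              output_file,
              ensure_ascii=False,
              sort_keys=True,
              indent=4,
              separators=(',', ': '))

buf = io.StringIO()
json_dump({'name': 'café', 'id': 1}, buf)
```

`ensure_ascii=False` keeps non-ASCII characters readable in the backup files, and `sort_keys=True` makes successive backups diff cleanly.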
 def main():

@@ -307,7 +844,8 @@ def main():
     output_directory = os.path.realpath(args.output_directory)
     if not os.path.isdir(output_directory):
-        log_error('Specified output directory is not a directory: {0}'.format(output_directory))
+        log_info('Create output directory {0}'.format(output_directory))
+        mkdir_p(output_directory)

     log_info('Backing up user {0} to {1}'.format(args.user, output_directory))

@@ -1 +1 @@
-__version__ = '0.2.0'
+__version__ = '0.13.1'
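The repository's release script bumps this version string with awk arithmetic; the same major/minor/patch logic in Python, for reference (a sketch, not code from the repo):

```python
def bump(version, part):
    # Split 'major.minor.patch' and bump the requested component,
    # zeroing the components below it (semver-style)
    major, minor, patch = (int(x) for x in version.split('.'))
    if part == 'major':
        major, minor, patch = major + 1, 0, 0
    elif part == 'minor':
        minor, patch = minor + 1, 0
    elif part == 'patch':
        patch += 1
    else:
        raise ValueError("must specify 'major', 'minor', or 'patch'")
    return '{0}.{1}.{2}'.format(major, minor, patch)
```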
|||||||
127
release
Executable file
127
release
Executable file
@@ -0,0 +1,127 @@
#!/usr/bin/env bash
set -eo pipefail; [[ $RELEASE_TRACE ]] && set -x

PACKAGE_NAME='github-backup'
INIT_PACKAGE_NAME='github_backup'
PUBLIC="true"

# Colors
COLOR_OFF="\033[0m"   # unsets color to term fg color
RED="\033[0;31m"      # red
GREEN="\033[0;32m"    # green
YELLOW="\033[0;33m"   # yellow
MAGENTA="\033[0;35m"  # magenta
CYAN="\033[0;36m"     # cyan

# ensure wheel is available
pip install wheel > /dev/null

command -v gitchangelog >/dev/null 2>&1 || {
  echo -e "${RED}WARNING: Missing gitchangelog binary, please run: pip install gitchangelog==2.2.0${COLOR_OFF}\n"
  exit 1
}

command -v rst-lint > /dev/null || {
  echo -e "${RED}WARNING: Missing rst-lint binary, please run: pip install restructuredtext_lint${COLOR_OFF}\n"
  exit 1
}

if [[ "$@" != "major" ]] && [[ "$@" != "minor" ]] && [[ "$@" != "patch" ]]; then
  echo -e "${RED}WARNING: Invalid release type, must specify 'major', 'minor', or 'patch'${COLOR_OFF}\n"
  exit 1
fi

echo -e "\n${GREEN}STARTING RELEASE PROCESS${COLOR_OFF}\n"

set +e;
git status | grep -Eo "working (directory|tree) clean" &> /dev/null
if [ ! $? -eq 0 ]; then # working directory is NOT clean
  echo -e "${RED}WARNING: You have uncommitted changes, you may have forgotten something${COLOR_OFF}\n"
  exit 1
fi
set -e;

echo -e "${YELLOW}--->${COLOR_OFF} Updating local copy"
git pull -q origin master

echo -e "${YELLOW}--->${COLOR_OFF} Retrieving release versions"

current_version=$(cat ${INIT_PACKAGE_NAME}/__init__.py | grep '__version__ =' | sed 's/[^0-9.]//g')
major=$(echo $current_version | awk '{split($0,a,"."); print a[1]}')
minor=$(echo $current_version | awk '{split($0,a,"."); print a[2]}')
patch=$(echo $current_version | awk '{split($0,a,"."); print a[3]}')

if [[ "$@" == "major" ]]; then
  major=$(($major + 1));
  minor="0"
  patch="0"
elif [[ "$@" == "minor" ]]; then
  minor=$(($minor + 1));
  patch="0"
elif [[ "$@" == "patch" ]]; then
  patch=$(($patch + 1));
fi

next_version="${major}.${minor}.${patch}"

echo -e "${YELLOW}   >${COLOR_OFF} ${MAGENTA}${current_version}${COLOR_OFF} -> ${MAGENTA}${next_version}${COLOR_OFF}"

echo -e "${YELLOW}--->${COLOR_OFF} Ensuring readme passes lint checks (if this fails, run rst-lint)"
rst-lint README.rst > /dev/null

echo -e "${YELLOW}--->${COLOR_OFF} Creating necessary temp file"
tempfoo=$(basename $0)
TMPFILE=$(mktemp /tmp/${tempfoo}.XXXXXX) || {
  echo -e "${RED}WARNING: Cannot create temp file using mktemp in /tmp dir${COLOR_OFF}\n"
  exit 1
}

find_this="__version__ = '$current_version'"
replace_with="__version__ = '$next_version'"

echo -e "${YELLOW}--->${COLOR_OFF} Updating ${INIT_PACKAGE_NAME}/__init__.py"
sed "s/$find_this/$replace_with/" ${INIT_PACKAGE_NAME}/__init__.py > $TMPFILE && mv $TMPFILE ${INIT_PACKAGE_NAME}/__init__.py

find_this="${PACKAGE_NAME}.git@$current_version"
replace_with="${PACKAGE_NAME}.git@$next_version"

echo -e "${YELLOW}--->${COLOR_OFF} Updating README.rst"
sed "s/$find_this/$replace_with/" README.rst > $TMPFILE && mv $TMPFILE README.rst

if [ -f docs/conf.py ]; then
  echo -e "${YELLOW}--->${COLOR_OFF} Updating docs"
  find_this="version = '${current_version}'"
  replace_with="version = '${next_version}'"
  sed "s/$find_this/$replace_with/" docs/conf.py > $TMPFILE && mv $TMPFILE docs/conf.py

  find_this="release = '${current_version}'"
  replace_with="release = '${next_version}'"
  sed "s/$find_this/$replace_with/" docs/conf.py > $TMPFILE && mv $TMPFILE docs/conf.py
fi

echo -e "${YELLOW}--->${COLOR_OFF} Updating CHANGES.rst for new release"
version_header="$next_version ($(date +%F))"
set +e; dashes=$(yes '-' | head -n ${#version_header} | tr -d '\n'); set -e
gitchangelog | sed "4s/.*/$version_header/" | sed "5s/.*/$dashes/" > $TMPFILE && mv $TMPFILE CHANGES.rst

echo -e "${YELLOW}--->${COLOR_OFF} Adding changed files to git"
git add CHANGES.rst README.rst ${INIT_PACKAGE_NAME}/__init__.py
if [ -f docs/conf.py ]; then git add docs/conf.py; fi

echo -e "${YELLOW}--->${COLOR_OFF} Creating release"
git commit -q -m "Release version $next_version"

echo -e "${YELLOW}--->${COLOR_OFF} Tagging release"
git tag -a $next_version -m "Release version $next_version"

echo -e "${YELLOW}--->${COLOR_OFF} Pushing release and tags to github"
git push -q origin master && git push -q --tags

if [[ "$PUBLIC" == "true" ]]; then
  echo -e "${YELLOW}--->${COLOR_OFF} Creating python release"
  cp README.rst README
  python setup.py sdist bdist_wheel upload > /dev/null
  rm README
fi

echo -e "\n${CYAN}RELEASED VERSION ${next_version}!${COLOR_OFF}\n"
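The version arithmetic in the release script (split on dots with awk, bump one component, zero the ones below it) can be mirrored in a few lines of Python. A hedged sketch, with illustrative names not taken from the repository:

```python
def bump(version, release_type):
    """Bump a MAJOR.MINOR.PATCH version string the way the script's
    awk/arithmetic does: a major bump resets minor and patch, a minor
    bump resets patch, a patch bump touches only the last component."""
    major, minor, patch = (int(part) for part in version.split('.'))
    if release_type == 'major':
        return '{0}.0.0'.format(major + 1)
    if release_type == 'minor':
        return '{0}.{1}.0'.format(major, minor + 1)
    if release_type == 'patch':
        return '{0}.{1}.{2}'.format(major, minor, patch + 1)
    raise ValueError("release type must be 'major', 'minor', or 'patch'")


print(bump('0.13.1', 'patch'))  # -> 0.13.2
```

Resetting the lower components on a major or minor bump is what keeps the sequence semver-shaped; the script encodes the same rule with its `minor="0"` / `patch="0"` assignments.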