mirror of
https://github.com/josegonzalez/python-github-backup.git
synced 2025-12-05 16:18:02 +01:00
Compare commits
164 Commits
[Commit table omitted: the mirror captured only the SHA1 column for the 164 commits (newest 498d9eba32 through oldest 9b74aff20b); the Author and Date columns were empty in the scraped page.]
.gitignore (vendored, 9 lines changed)
@@ -25,3 +25,12 @@ doc/_build
 # Generated man page
 doc/aws_hostname.1
+
+# Annoying macOS files
+.DS_Store
+._*
+
+# IDE configuration files
+.vscode
+.atom
+
CHANGES.rst (347 lines changed)
@@ -1,14 +1,309 @@
Changelog
=========

0.27.0 (2020-01-21)
-------------------
- Fixed script fails if not installed from pip. [Ben Baron]

  At the top of the script, the line from github_backup import __version__ gets the script's version number to use if the script is called with the -v or --version flags. The problem is that if the script hasn't been installed via pip (for example, when the repo is cloned directly to a backup server), the script will fail with an import exception.

  Also, it will presumably always use the version number from pip even when running a modified version from git or a fork; this change does not fix that, as there is no easy way to check whether the pip-installed version is running. But at least the script now works when cloned from git or simply copied to another machine.

  closes https://github.com/josegonzalez/python-github-backup/issues/141

- Fixed macOS keychain access when using Python 3. [Ben Baron]

  Python 3 returns bytes rather than a string, so the string concatenation used to build the auth variable was throwing an exception, which the script interpreted to mean it couldn't find the password. Converting to a string first fixed the issue.

- Public repos no longer include the auth token. [Ben Baron]

  When backing up repositories using an auth token and https, the GitHub personal access token was leaked into each backed-up repository: it was included in the URL of each repository's git remote.

  This is not needed for public repositories, which can be accessed without the token, and it can cause issues later if the token is ever changed, so the token is no longer stored in each repo backup. The token should only be "leaked" like this out of necessity, e.g. for a private repository when the --prefer-ssh option was not chosen, so https with the auth token is required to perform the clone.

- Fixed comment typo. [Ben Baron]
- Switched log_info to log_warning in download_file. [Ben Baron]
- Crash when a release asset doesn't exist. [Ben Baron]

  Previously, the script crashed whenever a release asset failed to download (for example, on a 404 response). This change instead logs the failure and allows the script to continue. No retry logic is added, but at least the crash is prevented and the backup can complete. Retry logic can be implemented later if wanted.

  closes https://github.com/josegonzalez/python-github-backup/issues/129

- Moved asset downloading loop inside the if block. [Ben Baron]
- Separate release assets and skip re-downloading. [Ben Baron]

  Previously the script put all release assets into the same folder called `releases`, so whenever two release files had the same name, only the last one downloaded was actually saved. A particularly bad example is MacDownApp/macdown, where every release is named `MacDown.app.zip`: even though all 36 releases were downloaded, only the last one was saved.

  With this change, each release's assets are stored in a subfolder inside `releases` named after the release. There could still be edge cases if two releases have the same name, but this is much safer than the previous behavior.

  This change also checks whether an asset file already exists on disk and skips downloading it. This drastically speeds up subsequent syncs, since every release is no longer downloaded every time; only new releases are fetched, which is the expected behavior.

  closes https://github.com/josegonzalez/python-github-backup/issues/126

- Added newline to end of file. [Ben Baron]
- Improved gitignore, macOS files and IDE configs. [Ben Baron]

  Ignores the annoying hidden macOS files .DS_Store and ._* as well as the IDE configuration folders for contributors using the popular Visual Studio Code and Atom IDEs (more can be added later as needed).
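The three 0.27.0 fixes above (keeping the token out of public clone URLs, decoding keychain bytes on Python 3, and per-release asset folders with skip-if-present) can be sketched roughly as follows. The function names and the `repo` dict shape are illustrative, not the project's actual API:

```python
import os


def clone_url(repo, token=None):
    # Embed the personal access token only for private repositories;
    # public clones work anonymously, so the token never lands in the
    # backed-up repo's remote URL. `repo` mimics a GitHub API payload.
    if token and repo.get("private"):
        return "https://{0}@github.com/{1}.git".format(token, repo["full_name"])
    return "https://github.com/{0}.git".format(repo["full_name"])


def keychain_password(raw):
    # Keychain lookups return bytes on Python 3; decode before
    # concatenating the result into the basic-auth string.
    if isinstance(raw, bytes):
        raw = raw.decode("utf-8")
    return raw.strip()


def asset_path(output_dir, release_name, asset_name):
    # One subfolder per release, so same-named assets from different
    # releases (e.g. MacDown.app.zip) no longer overwrite each other.
    return os.path.join(output_dir, "releases", release_name, asset_name)


def should_download(path):
    # Skip assets already on disk so re-runs only fetch new releases.
    return not os.path.exists(path)
```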

0.26.0 (2019-09-23)
-------------------
- Workaround gist clone in `--prefer-ssh` mode. [Vladislav Yarmak]
- Create PULL_REQUEST.md. [Jose Diaz-Gonzalez]
- Create ISSUE_TEMPLATE.md. [Jose Diaz-Gonzalez]


0.25.0 (2019-07-03)
-------------------
- Issue 119: Change retrieve_data to be a generator. [2a]

  See issue #119.
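The generator change can be pictured with a minimal sketch; the `fetch_page` callback is hypothetical, standing in for the paginated API requests the real code issues:

```python
def retrieve_data(fetch_page):
    # Yield items one page at a time instead of accumulating one huge
    # list, keeping memory flat for issue-heavy repositories.
    # `fetch_page` returns a list per page, empty when exhausted.
    page = 0
    while True:
        items = fetch_page(page)
        if not items:
            return
        for item in items:
            yield item
        page += 1
```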

0.24.0 (2019-06-27)
-------------------
- QKT-45: include assets - update readme. [Ethan Timm]

  Update the readme with flag information for including assets alongside their respective releases.

- Make assets its own flag. [Harrison Wright]
- Fix super call for python2. [Harrison Wright]
- Fix redirect to s3. [Harrison Wright]
- WIP: download assets. [Harrison Wright]
- QKT-42: releases - add readme info. [ethan]
- QKT-42 update: shorter command flag. [ethan]
- QKT-42: support saving release information. [ethan]
- Fix pull details. [Harrison Wright]


0.23.0 (2019-06-04)
-------------------
- Avoid crash in case of HTTP 502 error. [Gael de Chalendar]

  Also survive socket.error, as is already done for HTTPError and URLError.

  This should solve issue #110.
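Surviving transient failures (502s surfacing as HTTPError, plus raw socket.error) instead of crashing might look roughly like this; the wrapper name and retry count are illustrative, not the project's actual code:

```python
import socket


def with_retries(do_request, attempts=3):
    # Retry transient network failures, re-raising only after the final
    # attempt, so a flaky 502 or a dropped socket does not kill the
    # whole backup run. socket.error is an alias of OSError on Python 3.
    last_error = None
    for _ in range(attempts):
        try:
            return do_request()
        except socket.error as exc:
            last_error = exc
    raise last_error
```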

0.22.2 (2019-02-21)
-------------------

Fix
~~~
- Warn instead of error. [Jose Diaz-Gonzalez]

  Refs #106


0.22.1 (2019-02-21)
-------------------
- Log URL error https://github.com/josegonzalez/python-github-backup/issues/105. [JOHN STETIC]


0.22.0 (2019-02-01)
-------------------
- Remove unnecessary sys.exit call. [W. Harrison Wright]
- Add org check to avoid incorrect log output. [W. Harrison Wright]
- Fix accidental system exit with better logging strategy. [W. Harrison Wright]


0.21.1 (2018-12-25)
-------------------
- Mark options which are not included in --all. [Bernd]

  As discussed in Issue #100.


0.21.0 (2018-11-28)
-------------------
- Correctly download repos when user arg != authenticated user. [W. Harrison Wright]


0.20.1 (2018-09-29)
-------------------
- Clone the specified user's gists, not the authenticated user's. [W. Harrison Wright]
- Clone the specified user's starred repos, not the authenticated user's. [W. Harrison Wright]


0.20.0 (2018-03-24)
-------------------
- Chore: drop Python 2.6. [Jose Diaz-Gonzalez]
- Feat: simplify release script. [Jose Diaz-Gonzalez]


0.19.2 (2018-03-24)
-------------------

Fix
~~~
- Cleanup pep8 violations. [Jose Diaz-Gonzalez]


0.19.0 (2018-03-24)
-------------------
- Add additional output for the current request. [Robin Gloster]

  This is useful to have some progress indication for huge repositories.

- Add option to backup additional PR details. [Robin Gloster]

  Some payload is only included when requesting a single pull request.

- Mark string as binary in comparison for skip_existing. [Johannes Bornhold]

  Found out that the "--skip-existing" flag did not work as expected on Python 3.6. Tracked it down to the comparison, which has to be against a byte string in Python 3.


0.18.0 (2018-02-22)
-------------------
- Add option to fetch followers/following JSON data. [Stephen Greene]


0.17.0 (2018-02-20)
-------------------
- Short circuit gists backup process. [W. Harrison Wright]
- Formatting. [W. Harrison Wright]
- Add ability to backup gists. [W. Harrison Wright]


0.16.0 (2018-01-22)
-------------------
- Change option to --all-starred. [W. Harrison Wright]
- JK don't update documentation. [W. Harrison Wright]
- Put starred clone repositories under a new option. [W. Harrison Wright]
- Add comment. [W. Harrison Wright]
- Add ability to clone starred repos. [W. Harrison Wright]


0.14.1 (2017-10-11)
-------------------
- Fix arg not defined error. [Edward Pfremmer]


0.14.0 (2017-10-11)
-------------------
- Added a check to see if git-lfs is installed when doing an LFS clone. [pieterclaerhout]
- Added support for LFS clones. [pieterclaerhout]
- Add pypi info to readme. [Albert Wang]
- Explicitly support python 3 in package description. [Albert Wang]
- Add a couple of examples to help new users. [Yusuf Tran]


0.13.2 (2017-05-06)
-------------------
- Fix remotes while updating repository. [Dima Gerasimov]


0.13.1 (2017-04-11)
-------------------
- Fix error when repository has no updated_at value. [Nicolai Ehemann]


0.13.0 (2017-04-05)
-------------------
- Add OS check for OSX specific keychain args. [Martin O'Reilly]

  Keychain arguments are only supported on Mac OSX. Added a check for the operating system so we give a "Keychain arguments are only supported on Mac OSX" error message rather than a "No password item matching the provided name and account could be found in the osx keychain" error message.

- Add support for storing PAT in OSX keychain. [Martin O'Reilly]

  Added additional optional arguments and README guidance for storing and accessing a Github personal access token (PAT) in the OSX keychain.
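Reading a PAT out of the macOS keychain goes through the system `security` command-line tool; a hedged sketch of the guard and the lookup (helper name and argument handling simplified):

```python
import platform
import subprocess


def keychain_token(item_name, account):
    # The keychain is a macOS-only facility, hence the platform guard
    # that produces the friendlier error message described above.
    if platform.system() != "Darwin":
        raise SystemExit("Keychain arguments are only supported on Mac OSX")
    # `security find-generic-password -w` prints only the secret.
    output = subprocess.check_output(
        ["security", "find-generic-password",
         "-s", item_name, "-a", account, "-w"]
    )
    return output.decode("utf-8").strip()
```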

0.12.1 (2017-03-27)
-------------------
- Avoid remote branch name churn. [Chris Adams]

  This avoids the backup output having lots of "[new branch]" messages because removing the old remote name removed all of the existing branch references.

- Fix detection of bare git directories. [Andrzej Maczuga]


0.12.0 (2016-11-22)
-------------------

Fix
~~~
- Properly import version from github_backup package. [Jose Diaz-Gonzalez]
- Support alternate git status output. [Jose Diaz-Gonzalez]

Other
~~~~~
- Pep8: E501 line too long (83 > 79 characters) [Jose Diaz-Gonzalez]
- Pep8: E128 continuation line under-indented for visual indent. [Jose Diaz-Gonzalez]
- Support archiving using bare git clones. [Andrzej Maczuga]
- Fix typo, 3x. [Terrell Russell]


0.11.0 (2016-10-26)
-------------------
- Support --token file:///home/user/token.txt (fixes gh-51) [Björn Dahlgren]
- Fix some linting. [Albert Wang]
- Fix byte/string conversion for python 3. [Albert Wang]
- Support python 3. [Albert Wang]
- Encode special characters in password. [Remi Rampin]
- Don't pretend program name is "Github Backup" [Remi Rampin]
- Don't install over insecure connection. [Remi Rampin]

  The git:// protocol is unauthenticated and unencrypted, and no longer advertised by GitHub. Using HTTPS shouldn't impact performance.
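The `file://` token support boils down to treating such a value as a path and reading the secret from disk, which keeps it out of shell history and cron files. A minimal sketch with a hypothetical helper name:

```python
def resolve_token(token):
    # A --token value of the form file:///home/user/token.txt is read
    # from the named file; anything else is used as the token verbatim.
    prefix = "file://"
    if token and token.startswith(prefix):
        with open(token[len(prefix):]) as handle:
            return handle.read().strip()
    return token
```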

0.10.3 (2016-08-21)
-------------------
- Fixes #29. [Jonas Michel]

  Reporting an error when the user's rate limit is exceeded causes the script to terminate after resuming execution from a rate limit sleep. Instead of generating an explicit error we just want to inform the user that the script is going to sleep until their rate limit count resets.

- Fixes #29. [Jonas Michel]

  The errors list was not being cleared out after resuming a backup from a rate limit sleep. When the backup was resumed, the non-empty errors list caused the backup to quit after the next `retrieve_data` request.
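The sleep-until-reset behaviour reduces to arithmetic on GitHub's X-RateLimit-Reset response header, a unix timestamp. A sketch of the computation (the helper name is illustrative):

```python
def seconds_until_reset(reset_epoch, now):
    # X-RateLimit-Reset is a unix timestamp; clamp at zero because the
    # reset moment may already have passed by the time the rate-limited
    # response is processed, and a negative sleep would raise.
    return max(0, int(reset_epoch) - int(now))
```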

0.10.2 (2016-08-21)
-------------------
- Add a note regarding git version requirement. [Jose Diaz-Gonzalez]

  Closes #37


0.10.0 (2016-08-18)
-------------------
- Implement incremental updates. [Robert Bradshaw]

  Guarded with an --incremental flag.

  Stores the time of the last update and only downloads issue and pull request data since this time. All other data is relatively small (likely fetched with a single request) and so is simply re-populated from scratch as before.
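Incremental mode amounts to persisting a timestamp between runs and passing it along as the `since` query parameter on the issues and pulls endpoints. A minimal sketch with hypothetical file and function names:

```python
import os


def read_last_update(output_dir):
    # Timestamp recorded by the previous successful run; None triggers
    # a full (non-incremental) backup.
    path = os.path.join(output_dir, "last_update")
    if not os.path.exists(path):
        return None
    with open(path) as handle:
        return handle.read().strip()


def issue_query(last_update):
    # Only issue and pull request listings take `since`; everything
    # else is small enough to re-fetch from scratch each run.
    query = {"state": "all"}
    if last_update:
        query["since"] = last_update
    return query
```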

0.9.0 (2016-03-29)
------------------
- Fix cloning private repos with basic auth or token. [Kazuki Suda]


0.8.0 (2016-02-14)
------------------
- Don't store issues which are actually pull requests. [Enrico Tröger]

  This prevents storing pull requests twice since the Github API returns
@@ -19,43 +314,31 @@ Changelog

0.7.0 (2016-02-02)
------------------
- Softly fail if not able to read hooks. [Albert Wang]
- Add note about 2-factor auth. [Albert Wang]
- Make user repository search go through endpoint capable of reading private repositories. [Albert Wang]
- Prompt for password if only username given. [Alex Hall]


0.6.0 (2015-11-10)
------------------
- Force proper remote url. [Jose Diaz-Gonzalez]
- Improve error handling in case of HTTP errors. [Enrico Tröger]

  In case of a HTTP status code 404, the returned 'r' was never assigned. In case of URL errors which are not timeouts, we probably should bail out.

- Add --hooks to also include web hooks into the backup. [Enrico Tröger]
- Create the user specified output directory if it does not exist. [Enrico Tröger]

  Fixes #17.

- Add missing auth argument to _get_response() [Enrico Tröger]

  When running unauthenticated and Github starts rate-limiting the client, github-backup crashes because the auth variable used in _get_response() was not available. This change should fix it.

- Add repository URL to error message for non-existing repositories. [Enrico Tröger]
@@ -66,40 +349,28 @@ Changelog

0.5.0 (2015-10-10)
------------------
- Add release script. [Jose Diaz-Gonzalez]
- Refactor to both simplify the codepath and follow PEP8 standards. [Jose Diaz-Gonzalez]
- Retry 3 times when the connection times out. [Mathijs Jonker]
- Made unicode output default. [Kirill Grushetsky]
- Import alphabetised. [Kirill Grushetsky]
- Preserve Unicode characters in the output file. [Kirill Grushetsky]

  Added option to preserve Unicode characters in the output file.

- Josegonzales/python-github-backup#12 Added backup of labels and milestones. [aensley]
- Fixed indent. [Mathijs Jonker]
- Skip uninitialized repos. [mjonker-embed]

  These gave me errors which caused mails from crontab.

- Added prefer-ssh. [mjonker-embed]

  Was needed for my back-up setup; the code includes this but the readme wasn't updated.

- Retry API requests which failed due to rate-limiting. [Chris Adams]

  This allows operation to continue, albeit at a slower pace, if you have enough data to trigger the API rate limits.

- Logging_subprocess: always log when a command fails. [Chris Adams]

  Previously git clones could fail without any indication
@@ -109,21 +380,15 @@ Changelog
  Now a non-zero return code will always output a message to stderr and will display the executed command so it can be rerun for troubleshooting.

- Switch to using ssh_url. [Chris Adams]

  The previous commit used the wrong URL for a private repo. This was masked by the lack of error logging in logging_subprocess (which will be in a separate branch).

- Add an option to prefer checkouts over SSH. [Chris Adams]

  This is really useful with private repos to avoid being nagged for credentials for every repository.

- Add pull request support. [Kevin Laude]

  Back up repository pull requests by passing the --include-pulls
@@ -135,8 +400,6 @@ Changelog
  Pull requests are automatically backed up when the --all argument is used.

- Add GitHub Enterprise support. [Kevin Laude]

  Pass the -H or --github-host argument with a GitHub Enterprise hostname
@@ -146,35 +409,21 @@ Changelog

0.2.0 (2014-09-22)
------------------
- Add support for retrieving repositories. Closes #1. [Jose Diaz-Gonzalez]
- Fix PEP8 violations. [Jose Diaz-Gonzalez]
- Add authorization to header only if specified by user. [Ioannis Filippidis]
- Fill out readme more. [Jose Diaz-Gonzalez]
- Fix import. [Jose Diaz-Gonzalez]
- Properly name readme. [Jose Diaz-Gonzalez]
- Create MANIFEST.in. [Jose Diaz-Gonzalez]
- Create .gitignore. [Jose Diaz-Gonzalez]
- Create setup.py. [Jose Diaz-Gonzalez]
- Create requirements.txt. [Jose Diaz-Gonzalez]
- Create __init__.py. [Jose Diaz-Gonzalez]
- Create LICENSE.txt. [Jose Diaz-Gonzalez]
- Create README.md. [Jose Diaz-Gonzalez]
- Create github-backup. [Jose Diaz-Gonzalez]
ISSUE_TEMPLATE.md (new file, 13 lines)
@@ -0,0 +1,13 @@

# Important notice regarding filed issues

This project already fills my needs, and as such I have no real reason to continue its development. This project is otherwise provided as is, and no support is given.

If pull requests implementing bug fixes or enhancements are pushed, I am happy to review and merge them (time permitting).

If you wish to have a bug fixed, you have a few options:

- Fix it yourself and file a pull request.
- File a bug and hope someone else fixes it for you.
- Pay me to fix it (my rate is $200 an hour, minimum 1 hour; contact me via my [github email address](https://github.com/josegonzalez) if you want to go this route).

In all cases, feel free to file an issue; it may be of help to others in the future.
PULL_REQUEST.md (new file, 7 lines)
@@ -0,0 +1,7 @@

# Important notice regarding filed pull requests

This project already fills my needs, and as such I have no real reason to continue its development. This project is otherwise provided as is, and no support is given.

I will attempt to review pull requests at _my_ earliest convenience. If I am unable to get to your pull request in a timely fashion, it is what it is. This repository does not pay any bills, and I am not required to merge any pull request from any individual.

If you wish to jump my personal priority queue, you may pay me for my time to review. My rate is $200 an hour (minimum 1 hour); feel free to contact me via my github email address if you want to go this route.
README.rst (93 lines changed)
@@ -2,8 +2,17 @@
 github-backup
 =============
 
+|PyPI| |Python Versions|
+
+This project is considered feature complete for the primary maintainer. If you would like a bugfix or enhancement and cannot sponsor the work, pull requests are welcome. Feel free to contact the maintainer for consulting estimates if desired.
+
 backup a github user or organization
 
+Requirements
+============
+
+- GIT 1.9+
+
 Installation
 ============
 
@@ -13,21 +22,26 @@ Using PIP via PyPI::
 
 Using PIP via Github::
 
-    pip install git+git://github.com/josegonzalez/python-github-backup.git#egg=github-backup
+    pip install git+https://github.com/josegonzalez/python-github-backup.git#egg=github-backup
 
 Usage
 =====
 
 CLI Usage is as follows::
 
-    Github Backup [-h] [-u USERNAME] [-p PASSWORD] [-t TOKEN]
-                  [-o OUTPUT_DIRECTORY] [--starred] [--watched] [--all]
-                  [--issues] [--issue-comments] [--issue-events] [--pulls]
-                  [--pull-comments] [--pull-commits] [--labels] [--hooks]
-                  [--milestones] [--repositories] [--wikis]
-                  [--skip-existing] [-L [LANGUAGES [LANGUAGES ...]]]
-                  [-N NAME_REGEX] [-H GITHUB_HOST] [-O] [-R REPOSITORY]
-                  [-P] [-F] [--prefer-ssh] [-v]
+    github-backup [-h] [-u USERNAME] [-p PASSWORD] [-t TOKEN]
+                  [-o OUTPUT_DIRECTORY] [-i] [--starred] [--all-starred]
+                  [--watched] [--followers] [--following] [--all]
+                  [--issues] [--issue-comments] [--issue-events] [--pulls]
+                  [--pull-comments] [--pull-commits] [--labels] [--hooks]
+                  [--milestones] [--repositories] [--releases] [--assets]
+                  [--bare] [--lfs] [--wikis] [--gists] [--starred-gists]
+                  [--skip-existing]
+                  [-L [LANGUAGES [LANGUAGES ...]]] [-N NAME_REGEX]
+                  [-H GITHUB_HOST] [-O] [-R REPOSITORY] [-P] [-F]
+                  [--prefer-ssh] [-v]
+                  [--keychain-name OSX_KEYCHAIN_ITEM_NAME]
+                  [--keychain-account OSX_KEYCHAIN_ITEM_ACCOUNT]
                   USER
 
 Backup a github account
@@ -43,11 +57,16 @@ CLI Usage is as follows::
                         password for basic auth. If a username is given but
                         not a password, the password will be prompted for.
   -t TOKEN, --token TOKEN
-                        personal access or OAuth token
+                        personal access or OAuth token, or path to token
+                        (file://...)
   -o OUTPUT_DIRECTORY, --output-directory OUTPUT_DIRECTORY
                         directory at which to backup the repositories
-  --starred             include starred repositories in backup
+  -i, --incremental     incremental backup
+  --starred             include JSON output of starred repositories in backup
+  --all-starred         include starred repositories in backup
   --watched             include watched repositories in backup
+  --followers           include JSON output of followers in backup
+  --following           include JSON output of following users in backup
   --all                 include everything in backup
   --issues              include issues in backup
   --issue-comments      include issue comments in backup
@@ -60,7 +79,14 @@ CLI Usage is as follows::
                         authenticated)
   --milestones          include milestones in backup
   --repositories        include repository clone in backup
|
--repositories include repository clone in backup
|
||||||
|
--releases include repository releases' information without assets or binaries
|
||||||
|
--assets include assets alongside release information; only applies if including releases
|
||||||
|
--bare clone bare repositories
|
||||||
|
--lfs clone LFS repositories (requires Git LFS to be
|
||||||
|
installed, https://git-lfs.github.com)
|
||||||
--wikis include wiki clone in backup
|
--wikis include wiki clone in backup
|
||||||
|
--gists include gists in backup
|
||||||
|
--starred-gists include starred gists in backup
|
||||||
--skip-existing skip project if a backup directory exists
|
--skip-existing skip project if a backup directory exists
|
||||||
-L [LANGUAGES [LANGUAGES ...]], --languages [LANGUAGES [LANGUAGES ...]]
|
-L [LANGUAGES [LANGUAGES ...]], --languages [LANGUAGES [LANGUAGES ...]]
|
||||||
only allow these languages
|
only allow these languages
|
||||||
@@ -75,6 +101,12 @@ CLI Usage is as follows::
|
|||||||
-F, --fork include forked repositories
|
-F, --fork include forked repositories
|
||||||
--prefer-ssh Clone repositories using SSH instead of HTTPS
|
--prefer-ssh Clone repositories using SSH instead of HTTPS
|
||||||
-v, --version show program's version number and exit
|
-v, --version show program's version number and exit
|
||||||
|
--keychain-name OSX_KEYCHAIN_ITEM_NAME
|
||||||
|
OSX ONLY: name field of password item in OSX keychain
|
||||||
|
that holds the personal access or OAuth token
|
||||||
|
--keychain-account OSX_KEYCHAIN_ITEM_ACCOUNT
|
||||||
|
OSX ONLY: account field of password item in OSX
|
||||||
|
keychain that holds the personal access or OAuth token
|
||||||
|
|
||||||
|
|
||||||
The package can be used to backup an *entire* organization or repository, including issues and wikis in the most appropriate format (clones for wikis, json files for issues).
|
The package can be used to backup an *entire* organization or repository, including issues and wikis in the most appropriate format (clones for wikis, json files for issues).
|
||||||
@@ -83,3 +115,46 @@ Authentication
|
|||||||
==============
|
==============
|
||||||
|
|
||||||
Note: Password-based authentication will fail if you have two-factor authentication enabled.
|
Note: Password-based authentication will fail if you have two-factor authentication enabled.
|
||||||
|
|
||||||
|
Using the Keychain on Mac OSX
|
||||||
|
=============================
|
||||||
|
Note: On Mac OSX the token can be stored securely in the user's keychain. To do this:
|
||||||
|
|
||||||
|
1. Open Keychain from "Applications -> Utilities -> Keychain Access"
|
||||||
|
2. Add a new password item using "File -> New Password Item"
|
||||||
|
3. Enter a name in the "Keychain Item Name" box. You must provide this name to github-backup using the --keychain-name argument.
|
||||||
|
4. Enter an account name in the "Account Name" box, enter your Github username as set above. You must provide this name to github-backup using the --keychain-account argument.
|
||||||
|
5. Enter your Github personal access token in the "Password" box
|
||||||
|
|
||||||
|
Note: When you run github-backup, you will be asked whether you want to allow "security" to use your confidential information stored in your keychain. You have two options:
|
||||||
|
|
||||||
|
1. **Allow:** In this case you will need to click "Allow" each time you run `github-backup`
|
||||||
|
2. **Always Allow:** In this case, you will not be asked for permission when you run `github-backup` in future. This is less secure, but is required if you want to schedule `github-backup` to run automatically
|
||||||
|
|
||||||
|
About Git LFS
|
||||||
|
=============
|
||||||
|
|
||||||
|
When you use the "--lfs" option, you will need to make sure you have Git LFS installed.
|
||||||
|
|
||||||
|
Instructions on how to do this can be found on https://git-lfs.github.com.
|
||||||
|
|
||||||
|
Examples
|
||||||
|
========
|
||||||
|
|
||||||
|
Backup all repositories::
|
||||||
|
|
||||||
|
export ACCESS_TOKEN=SOME-GITHUB-TOKEN
|
||||||
|
github-backup WhiteHouse --token $ACCESS_TOKEN --organization --output-directory /tmp/white-house --repositories
|
||||||
|
|
||||||
|
Backup a single organization repository with everything else (wiki, pull requests, comments, issues etc)::
|
||||||
|
|
||||||
|
export ACCESS_TOKEN=SOME-GITHUB-TOKEN
|
||||||
|
ORGANIZATION=docker
|
||||||
|
REPO=cli
|
||||||
|
# e.g. git@github.com:docker/cli.git
|
||||||
|
github-backup $ORGANIZATION -P -t $ACCESS_TOKEN -o . --all -O -R $REPO
|
||||||
|
|
||||||
|
.. |PyPI| image:: https://img.shields.io/pypi/v/github-backup.svg
|
||||||
|
:target: https://pypi.python.org/pypi/github-backup/
|
||||||
|
.. |Python Versions| image:: https://img.shields.io/pypi/pyversions/github-backup.svg
|
||||||
|
:target: https://github.com/albertyw/github-backup
|
||||||
|
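The `--token file://...` form documented above keeps the token itself out of shell history and process listings by reading it from a file. A minimal sketch of that resolution step (the helper name `resolve_token` is mine, not the project's):

```python
import os
import tempfile


def resolve_token(token):
    """Return the token itself, or the first line of the file it points to."""
    prefix = 'file://'
    if token.startswith(prefix):
        # Read only the first line so a trailing newline is ignored
        with open(token[len(prefix):], 'rt') as f:
            return f.readline().strip()
    return token


# Demonstrate with a throwaway file standing in for a saved token
fd, token_path = tempfile.mkstemp()
with os.fdopen(fd, 'w') as f:
    f.write('sekrit\n')
resolved = resolve_token('file://' + token_path)
os.unlink(token_path)
```

The script's `get_auth` performs the same prefix check before building the `token:x-oauth-basic` pair.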
@@ -1,6 +1,7 @@
 #!/usr/bin/env python
 
 from __future__ import print_function
+import socket
 
 import argparse
 import base64
@@ -16,26 +17,51 @@ import select
 import subprocess
 import sys
 import time
-import urlparse
-import urllib
-import urllib2
+import platform
+PY2 = False
+try:
+    # python 3
+    from urllib.parse import urlparse
+    from urllib.parse import quote as urlquote
+    from urllib.parse import urlencode
+    from urllib.error import HTTPError, URLError
+    from urllib.request import urlopen
+    from urllib.request import Request
+    from urllib.request import HTTPRedirectHandler
+    from urllib.request import build_opener
+except ImportError:
+    # python 2
+    PY2 = True
+    from urlparse import urlparse
+    from urllib import quote as urlquote
+    from urllib import urlencode
+    from urllib2 import HTTPError, URLError
+    from urllib2 import urlopen
+    from urllib2 import Request
+    from urllib2 import HTTPRedirectHandler
+    from urllib2 import build_opener
 
+try:
     from github_backup import __version__
+    VERSION = __version__
+except ImportError:
+    VERSION = 'unknown'
 
 FNULL = open(os.devnull, 'w')
 
 
 def log_error(message):
-    if type(message) == str:
-        message = [message]
-    for msg in message:
-        sys.stderr.write("{0}\n".format(msg))
+    """
+    Log message (str) or messages (List[str]) to stderr and exit with status 1
+    """
+    log_warning(message)
 
     sys.exit(1)
 
 
 def log_info(message):
+    """
+    Log message (str) or messages (List[str]) to stdout
+    """
     if type(message) == str:
         message = [message]
 
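The import shim in the hunk above is the conventional try/except pattern for supporting Python 2 and 3 from a single file: attempt the Python 3 locations first and fall back on `ImportError`. A condensed, self-contained sketch covering only a subset of the names the script aliases:

```python
PY2 = False
try:
    # Python 3 locations
    from urllib.parse import urlparse, urlencode, quote as urlquote
    from urllib.error import HTTPError, URLError
    from urllib.request import urlopen, Request
except ImportError:
    # Python 2 fallback: the same names lived in urlparse/urllib/urllib2
    PY2 = True
    from urlparse import urlparse
    from urllib import urlencode, quote as urlquote
    from urllib2 import HTTPError, URLError, urlopen, Request

# The names are now usable regardless of interpreter version
parsed = urlparse('https://github.com/josegonzalez/python-github-backup')
```

Setting the `PY2` flag inside the fallback branch lets later code (byte/str handling, old-style classes) branch on interpreter version without re-probing.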
@@ -43,6 +69,17 @@ def log_info(message):
         sys.stdout.write("{0}\n".format(msg))
 
 
+def log_warning(message):
+    """
+    Log message (str) or messages (List[str]) to stderr
+    """
+    if type(message) == str:
+        message = [message]
+
+    for msg in message:
+        sys.stderr.write("{0}\n".format(msg))
+
+
 def logging_subprocess(popenargs,
                        logger,
                        stdout_log_level=logging.DEBUG,
@@ -55,11 +92,15 @@ def logging_subprocess(popenargs,
     """
     child = subprocess.Popen(popenargs, stdout=subprocess.PIPE,
                              stderr=subprocess.PIPE, **kwargs)
+    if sys.platform == 'win32':
+        log_info("Windows operating system detected - no subprocess logging will be returned")
+
     log_level = {child.stdout: stdout_log_level,
                  child.stderr: stderr_log_level}
 
     def check_io():
+        if sys.platform == 'win32':
+            return
         ready_to_read = select.select([child.stdout, child.stderr],
                                       [],
                                       [],
@@ -80,8 +121,8 @@ def logging_subprocess(popenargs,
     rc = child.wait()
 
     if rc != 0:
-        print(u'{} returned {}:'.format(popenargs[0], rc), file=sys.stderr)
-        print('\t', u' '.join(popenargs), file=sys.stderr)
+        print('{} returned {}:'.format(popenargs[0], rc), file=sys.stderr)
+        print('\t', ' '.join(popenargs), file=sys.stderr)
 
     return rc
 
@@ -96,8 +137,9 @@ def mkdir_p(*args):
     else:
         raise
 
 
 def mask_password(url, secret='*****'):
-    parsed = urlparse.urlparse(url)
+    parsed = urlparse(url)
 
     if not parsed.password:
         return url
@@ -106,9 +148,9 @@ def mask_password(url, secret='*****'):
 
     return url.replace(parsed.password, secret)
 
 
 def parse_args():
-    parser = argparse.ArgumentParser(description='Backup a github account',
-                                     prog='Github Backup')
+    parser = argparse.ArgumentParser(description='Backup a github account')
     parser.add_argument('user',
                         metavar='USER',
                         type=str,
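With `urlparse` now imported version-agnostically, `mask_password` stays a three-line function. A standalone sketch of the same logic (Python 3 import shown):

```python
from urllib.parse import urlparse


def mask_password(url, secret='*****'):
    """Replace the password component of a URL, if any, with a placeholder."""
    parsed = urlparse(url)

    if not parsed.password:
        return url

    return parsed and url.replace(parsed.password, secret)
```

This keeps credentials out of log output when clone URLs embed basic-auth secrets.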
@@ -126,24 +168,41 @@ def parse_args():
     parser.add_argument('-t',
                         '--token',
                         dest='token',
-                        help='personal access or OAuth token')
+                        help='personal access or OAuth token, or path to token (file://...)')  # noqa
     parser.add_argument('-o',
                         '--output-directory',
                         default='.',
                         dest='output_directory',
                         help='directory at which to backup the repositories')
+    parser.add_argument('-i',
+                        '--incremental',
+                        action='store_true',
+                        dest='incremental',
+                        help='incremental backup')
     parser.add_argument('--starred',
                         action='store_true',
                         dest='include_starred',
-                        help='include starred repositories in backup')
+                        help='include JSON output of starred repositories in backup')
+    parser.add_argument('--all-starred',
+                        action='store_true',
+                        dest='all_starred',
+                        help='include starred repositories in backup [*]')
     parser.add_argument('--watched',
                         action='store_true',
                         dest='include_watched',
-                        help='include watched repositories in backup')
+                        help='include JSON output of watched repositories in backup')
+    parser.add_argument('--followers',
+                        action='store_true',
+                        dest='include_followers',
+                        help='include JSON output of followers in backup')
+    parser.add_argument('--following',
+                        action='store_true',
+                        dest='include_following',
+                        help='include JSON output of following users in backup')
     parser.add_argument('--all',
                         action='store_true',
                         dest='include_everything',
-                        help='include everything in backup')
+                        help='include everything in backup (not including [*])')
     parser.add_argument('--issues',
                         action='store_true',
                         dest='include_issues',
@@ -168,6 +227,10 @@ def parse_args():
                         action='store_true',
                         dest='include_pull_commits',
                         help='include pull request commits in backup')
+    parser.add_argument('--pull-details',
+                        action='store_true',
+                        dest='include_pull_details',
+                        help='include more pull request details in backup [*]')
     parser.add_argument('--labels',
                         action='store_true',
                         dest='include_labels',
@@ -175,7 +238,7 @@ def parse_args():
     parser.add_argument('--hooks',
                         action='store_true',
                         dest='include_hooks',
-                        help='include hooks in backup (works only when authenticated)')
+                        help='include hooks in backup (works only when authenticated)')  # noqa
     parser.add_argument('--milestones',
                         action='store_true',
                         dest='include_milestones',
@@ -184,10 +247,26 @@ def parse_args():
                         action='store_true',
                         dest='include_repository',
                         help='include repository clone in backup')
+    parser.add_argument('--bare',
+                        action='store_true',
+                        dest='bare_clone',
+                        help='clone bare repositories')
+    parser.add_argument('--lfs',
+                        action='store_true',
+                        dest='lfs_clone',
+                        help='clone LFS repositories (requires Git LFS to be installed, https://git-lfs.github.com) [*]')
     parser.add_argument('--wikis',
                         action='store_true',
                         dest='include_wiki',
                         help='include wiki clone in backup')
+    parser.add_argument('--gists',
+                        action='store_true',
+                        dest='include_gists',
+                        help='include gists in backup [*]')
+    parser.add_argument('--starred-gists',
+                        action='store_true',
+                        dest='include_starred_gists',
+                        help='include starred gists in backup [*]')
     parser.add_argument('--skip-existing',
                         action='store_true',
                         dest='skip_existing',
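The flag split above (`--starred` for JSON metadata versus `--all-starred` for cloning the starred repositories themselves) is plain `argparse` with `store_true` actions. A reduced sketch with just those two options:

```python
import argparse

parser = argparse.ArgumentParser(description='Backup a github account')
parser.add_argument('--starred',
                    action='store_true',
                    dest='include_starred',
                    help='include JSON output of starred repositories in backup')
parser.add_argument('--all-starred',
                    action='store_true',
                    dest='all_starred',
                    help='include starred repositories in backup')

# Parse a sample command line instead of sys.argv
args = parser.parse_args(['--all-starred'])
```

Each `store_true` flag defaults to `False`, so downstream code can test `args.all_starred` directly.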
@@ -217,39 +296,82 @@ def parse_args():
     parser.add_argument('-P', '--private',
                         action='store_true',
                         dest='private',
-                        help='include private repositories')
+                        help='include private repositories [*]')
     parser.add_argument('-F', '--fork',
                         action='store_true',
                         dest='fork',
-                        help='include forked repositories')
+                        help='include forked repositories [*]')
     parser.add_argument('--prefer-ssh',
                         action='store_true',
                         help='Clone repositories using SSH instead of HTTPS')
     parser.add_argument('-v', '--version',
                         action='version',
-                        version='%(prog)s ' + __version__)
+                        version='%(prog)s ' + VERSION)
+    parser.add_argument('--keychain-name',
+                        dest='osx_keychain_item_name',
+                        help='OSX ONLY: name field of password item in OSX keychain that holds the personal access or OAuth token')
+    parser.add_argument('--keychain-account',
+                        dest='osx_keychain_item_account',
+                        help='OSX ONLY: account field of password item in OSX keychain that holds the personal access or OAuth token')
+    parser.add_argument('--releases',
+                        action='store_true',
+                        dest='include_releases',
+                        help='include release information, not including assets or binaries'
+                        )
+    parser.add_argument('--assets',
+                        action='store_true',
+                        dest='include_assets',
+                        help='include assets alongside release information; only applies if including releases')
     return parser.parse_args()
 
 
 def get_auth(args, encode=True):
     auth = None
 
-    if args.token:
+    if args.osx_keychain_item_name:
+        if not args.osx_keychain_item_account:
+            log_error('You must specify both name and account fields for osx keychain password items')
+        else:
+            if platform.system() != 'Darwin':
+                log_error("Keychain arguments are only supported on Mac OSX")
+            try:
+                with open(os.devnull, 'w') as devnull:
+                    token = (subprocess.check_output([
+                        'security', 'find-generic-password',
+                        '-s', args.osx_keychain_item_name,
+                        '-a', args.osx_keychain_item_account,
+                        '-w'], stderr=devnull).strip())
+                if not PY2:
+                    token = token.decode('utf-8')
+                auth = token + ':' + 'x-oauth-basic'
+            except:
+                log_error('No password item matching the provided name and account could be found in the osx keychain.')
+    elif args.osx_keychain_item_account:
+        log_error('You must specify both name and account fields for osx keychain password items')
+    elif args.token:
+        _path_specifier = 'file://'
+        if args.token.startswith(_path_specifier):
+            args.token = open(args.token[len(_path_specifier):],
+                              'rt').readline().strip()
         auth = args.token + ':' + 'x-oauth-basic'
     elif args.username:
         if not args.password:
             args.password = getpass.getpass()
-        auth = args.username + ':' + args.password
+        if encode:
+            password = args.password
+        else:
+            password = urlquote(args.password)
+        auth = args.username + ':' + password
     elif args.password:
         log_error('You must specify a username for basic auth')
 
     if not auth:
         return None
 
-    if encode == False:
+    if not encode:
         return auth
 
-    return base64.b64encode(auth)
+    return base64.b64encode(auth.encode('ascii'))
 
 
 def get_github_api_host(args):
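`get_auth` above ends by base64-encoding a `token:x-oauth-basic` pair; the added `.encode('ascii')` is what makes this work on Python 3, where `base64.b64encode` requires bytes. A sketch of just that encoding step (the helper name `encode_basic_auth` is mine):

```python
import base64


def encode_basic_auth(token):
    """Build the value used in an HTTP 'Authorization: Basic ...' header."""
    auth = token + ':' + 'x-oauth-basic'
    # b64encode takes and returns bytes on Python 3
    return base64.b64encode(auth.encode('ascii'))


header_value = b'Basic ' + encode_basic_auth('mytoken')
```

The `x-oauth-basic` placeholder password is GitHub's documented convention for using a token over basic auth.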
@@ -269,28 +391,32 @@ def get_github_host(args):
 
     return host
 
 
 def get_github_repo_url(args, repository):
+    if repository.get('is_gist'):
+        return repository['git_pull_url']
+
     if args.prefer_ssh:
         return repository['ssh_url']
 
     auth = get_auth(args, False)
-    if auth:
+    if auth and repository['private'] == True:
         repo_url = 'https://{0}@{1}/{2}/{3}.git'.format(
             auth,
             get_github_host(args),
-            args.user,
+            repository['owner']['login'],
             repository['name'])
     else:
         repo_url = repository['clone_url']
 
     return repo_url
 
-def retrieve_data(args, template, query_args=None, single_request=False):
+
+def retrieve_data_gen(args, template, query_args=None, single_request=False):
     auth = get_auth(args)
     query_args = get_query_args(query_args)
     per_page = 100
     page = 0
-    data = []
 
     while True:
         page = page + 1
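`retrieve_data_gen` above turns the old list-accumulating loop into a generator that yields items page by page and stops when the API returns a short page. The pagination skeleton can be sketched with a stubbed fetcher instead of real HTTP calls:

```python
def paginate(fetch_page, per_page=100):
    """Yield items from successive pages until a short page signals the end."""
    page = 0
    while True:
        page += 1
        response = fetch_page(page, per_page)
        for item in response:
            yield item
        if len(response) < per_page:
            break


# Stub fetcher: 250 items served in pages of 100 (hypothetical data)
items = list(range(250))


def fetch_page(page, per_page):
    start = (page - 1) * per_page
    return items[start:start + per_page]


result = list(paginate(fetch_page))
```

Yielding instead of accumulating means callers can start cloning while later pages are still being fetched; the wrapper `retrieve_data` restores the old list-returning behavior.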
@@ -299,19 +425,30 @@ def retrieve_data(args, template, query_args=None, single_request=False):
 
         status_code = int(r.getcode())
 
+        retries = 0
+        while retries < 3 and status_code == 502:
+            print('API request returned HTTP 502: Bad Gateway. Retrying in 5 seconds')
+            retries += 1
+            time.sleep(5)
+            request = _construct_request(per_page, page, query_args, template, auth)  # noqa
+            r, errors = _get_response(request, auth, template)
+
+            status_code = int(r.getcode())
+
         if status_code != 200:
             template = 'API request returned HTTP {0}: {1}'
             errors.append(template.format(status_code, r.reason))
             log_error(errors)
 
-        response = json.loads(r.read())
+        response = json.loads(r.read().decode('utf-8'))
         if len(errors) == 0:
             if type(response) == list:
-                data.extend(response)
+                for resp in response:
+                    yield resp
                 if len(response) < per_page:
                     break
            elif type(response) == dict and single_request:
-                data.append(response)
+                yield response
 
         if len(errors) > 0:
             log_error(errors)
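The hunk above adds a bounded retry on HTTP 502 (Bad Gateway). The control flow can be sketched with a stub in place of the real request, omitting the 5-second sleep the actual code performs between attempts:

```python
def fetch_with_retry(do_request, retries_allowed=3):
    """Re-issue a request while it returns HTTP 502, up to a retry budget."""
    status = do_request()
    retries = 0
    while retries < retries_allowed and status == 502:
        # The real code sleeps 5 seconds between attempts
        retries += 1
        status = do_request()
    return status


# Stub: first two attempts hit a bad gateway, the third succeeds
responses = [502, 502, 200]
attempt = {'n': 0}


def do_request():
    code = responses[attempt['n']]
    attempt['n'] += 1
    return code


final = fetch_with_retry(do_request)
```

If all retries return 502, the loop falls through to the ordinary non-200 error path.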
@@ -319,8 +456,8 @@ def retrieve_data(args, template, query_args=None, single_request=False):
         if single_request:
             break
 
-    return data
+def retrieve_data(args, template, query_args=None, single_request=False):
+    return list(retrieve_data_gen(args, template, query_args, single_request))
 
 def get_query_args(query_args=None):
     if not query_args:
@@ -336,11 +473,17 @@ def _get_response(request, auth, template):
     while True:
         should_continue = False
         try:
-            r = urllib2.urlopen(request)
-        except urllib2.HTTPError as exc:
+            r = urlopen(request)
+        except HTTPError as exc:
             errors, should_continue = _request_http_error(exc, auth, errors)  # noqa
             r = exc
-        except urllib2.URLError:
+        except URLError as e:
+            log_warning(e.reason)
+            should_continue = _request_url_error(template, retry_timeout)
+            if not should_continue:
+                raise
+        except socket.error as e:
+            log_warning(e.strerror)
             should_continue = _request_url_error(template, retry_timeout)
             if not should_continue:
                 raise
@@ -353,14 +496,15 @@ def _get_response(request, auth, template):
 
 
 def _construct_request(per_page, page, query_args, template, auth):
-    querystring = urllib.urlencode(dict({
+    querystring = urlencode(dict(list({
         'per_page': per_page,
         'page': page
-    }.items() + query_args.items()))
+    }.items()) + list(query_args.items())))
 
-    request = urllib2.Request(template + '?' + querystring)
+    request = Request(template + '?' + querystring)
     if auth is not None:
-        request.add_header('Authorization', 'Basic ' + auth)
+        request.add_header('Authorization', 'Basic '.encode('ascii') + auth)
+    log_info('Requesting {}?{}'.format(template, querystring))
     return request
 
 
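`_construct_request` above wraps both `.items()` calls in `list()` because Python 3 dict views cannot be concatenated with `+`. A standalone sketch of the querystring merge:

```python
from urllib.parse import parse_qs, urlencode


def build_querystring(per_page, page, query_args):
    # list(...) is needed on Python 3, where dict.items() returns a view
    merged = dict(list({'per_page': per_page, 'page': page}.items())
                  + list(query_args.items()))
    return urlencode(merged)


qs = build_querystring(100, 1, {'type': 'owner'})
params = parse_qs(qs)
```

Because the caller's `query_args` come second, they would override the pagination keys on a collision, matching ordinary `dict` construction semantics.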
@@ -387,10 +531,9 @@ def _request_http_error(exc, auth, errors):
         print('Exceeded rate limit of {} requests; waiting {} seconds to reset'.format(limit, delta),  # noqa
               file=sys.stderr)
 
-        ratelimit_error = 'No more requests remaining'
         if auth is None:
-            ratelimit_error += '; authenticate to raise your GitHub rate limit'  # noqa
-        errors.append(ratelimit_error)
+            print('Hint: Authenticate to raise your GitHub rate limit',
+                  file=sys.stderr)
 
         time.sleep(delta)
         should_continue = True
@@ -410,11 +553,81 @@ def _request_url_error(template, retry_timeout):
         return False
 
 
-def retrieve_repositories(args):
+class S3HTTPRedirectHandler(HTTPRedirectHandler):
+    """
+    A subclassed redirect handler for downloading Github assets from S3.
+
+    urllib will add the Authorization header to the redirected request to S3, which will result in a 400,
+    so we should remove said header on redirect.
+    """
+    def redirect_request(self, req, fp, code, msg, headers, newurl):
+        if PY2:
+            # HTTPRedirectHandler is an old style class
+            request = HTTPRedirectHandler.redirect_request(self, req, fp, code, msg, headers, newurl)
+        else:
+            request = super(S3HTTPRedirectHandler, self).redirect_request(req, fp, code, msg, headers, newurl)
+        del request.headers['Authorization']
+        return request
+
+
+def download_file(url, path, auth):
+    # Skip downloading release assets if they already exist on disk so we don't redownload on every sync
+    if os.path.exists(path):
+        return
+
+    request = Request(url)
+    request.add_header('Accept', 'application/octet-stream')
+    request.add_header('Authorization', 'Basic '.encode('ascii') + auth)
+    opener = build_opener(S3HTTPRedirectHandler)
+
+    try:
+        response = opener.open(request)
+
+        chunk_size = 16 * 1024
+        with open(path, 'wb') as f:
+            while True:
+                chunk = response.read(chunk_size)
+                if not chunk:
+                    break
+                f.write(chunk)
+    except HTTPError as exc:
+        # Gracefully handle 404 responses (and others) when downloading from S3
+        log_warning('Skipping download of asset {0} due to HTTPError: {1}'.format(url, exc.reason))
+    except URLError as e:
+        # Gracefully handle other URL errors
+        log_warning('Skipping download of asset {0} due to URLError: {1}'.format(url, e.reason))
+    except socket.error as e:
+        # Gracefully handle socket errors
+        # TODO: Implement retry logic
+        log_warning('Skipping download of asset {0} due to socket error: {1}'.format(url, e.strerror))
+
+
+def get_authenticated_user(args):
+    template = 'https://{0}/user'.format(get_github_api_host(args))
+    data = retrieve_data(args, template, single_request=True)
+    return data[0]
+
+
+def check_git_lfs_install():
+    exit_code = subprocess.call(['git', 'lfs', 'version'])
+    if exit_code != 0:
+        log_error('The argument --lfs requires you to have Git LFS installed.\nYou can get it from https://git-lfs.github.com.')
+
+
+def retrieve_repositories(args, authenticated_user):
     log_info('Retrieving repositories')
     single_request = False
+    if args.user == authenticated_user['login']:
+        # we must use the /user/repos API to be able to access private repos
         template = 'https://{0}/user/repos'.format(
             get_github_api_host(args))
+    else:
+        if args.private and not args.organization:
+            log_warning('Authenticated user is different from user being backed up, thus private repositories cannot be accessed')
+        template = 'https://{0}/users/{1}/repos'.format(
+            get_github_api_host(args),
+            args.user)
 
     if args.organization:
         template = 'https://{0}/orgs/{1}/repos'.format(
             get_github_api_host(args),
@@ -427,13 +640,44 @@ def retrieve_repositories(args):
|
|||||||
args.user,
|
args.user,
|
||||||
args.repository)
|
args.repository)
|
||||||
|
|
||||||
return retrieve_data(args, template, single_request=single_request)
|
repos = retrieve_data(args, template, single_request=single_request)
|
||||||
|
|
||||||
|
if args.all_starred:
|
||||||
|
starred_template = 'https://{0}/users/{1}/starred'.format(get_github_api_host(args), args.user)
|
||||||
|
starred_repos = retrieve_data(args, starred_template, single_request=False)
|
||||||
|
# flag each repo as starred for downstream processing
|
||||||
|
for item in starred_repos:
|
||||||
|
item.update({'is_starred': True})
|
||||||
|
repos.extend(starred_repos)
|
||||||
|
|
||||||
|
if args.include_gists:
|
||||||
|
gists_template = 'https://{0}/users/{1}/gists'.format(get_github_api_host(args), args.user)
|
||||||
|
gists = retrieve_data(args, gists_template, single_request=False)
|
||||||
|
# flag each repo as a gist for downstream processing
|
||||||
|
for item in gists:
|
||||||
|
item.update({'is_gist': True})
|
||||||
|
repos.extend(gists)
|
||||||
|
|
||||||
|
if args.include_starred_gists:
|
||||||
|
starred_gists_template = 'https://{0}/gists/starred'.format(get_github_api_host(args))
|
||||||
|
starred_gists = retrieve_data(args, starred_gists_template, single_request=False)
|
||||||
|
# flag each repo as a starred gist for downstream processing
|
||||||
|
for item in starred_gists:
|
||||||
|
item.update({'is_gist': True,
|
||||||
|
'is_starred': True})
|
||||||
|
repos.extend(starred_gists)
|
||||||
|
|
||||||
|
return repos
|
||||||
|
|
||||||
|
|
||||||
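The three `item.update(...)` loops above share one shape: fetch a list, tag every element, extend `repos`. A condensed sketch of that pattern (the `tag_all` helper is hypothetical, not part of the diff):

```python
def tag_all(items, **flags):
    # Annotate each API result dict in place, as retrieve_repositories does
    # with 'is_starred'/'is_gist', so downstream code can route each item.
    for item in items:
        item.update(flags)
    return items


repos = [{'name': 'backup-tool'}]
repos.extend(tag_all([{'id': 'abc123'}], is_gist=True))
repos.extend(tag_all([{'name': 'linux'}], is_starred=True))
```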
-def filter_repositories(args, repositories):
+def filter_repositories(args, unfiltered_repositories):
     log_info('Filtering repositories')
 
-    repositories = [r for r in repositories if r['owner']['login'] == args.user]
+    repositories = []
+    for r in unfiltered_repositories:
+        # gists can be anonymous, so need to safely check owner
+        if r.get('owner', {}).get('login') == args.user or r.get('is_starred'):
+            repositories.append(r)
 
     name_regex = None
     if args.name_regex:
@@ -444,11 +688,11 @@ def filter_repositories(args, repositories):
         languages = [x.lower() for x in args.languages]
 
     if not args.fork:
-        repositories = [r for r in repositories if not r['fork']]
+        repositories = [r for r in repositories if not r.get('fork')]
     if not args.private:
-        repositories = [r for r in repositories if not r['private']]
+        repositories = [r for r in repositories if not r.get('private') or r.get('public')]
     if languages:
-        repositories = [r for r in repositories if r['language'] and r['language'].lower() in languages]  # noqa
+        repositories = [r for r in repositories if r.get('language') and r.get('language').lower() in languages]  # noqa
     if name_regex:
         repositories = [r for r in repositories if name_regex.match(r['name'])]
 
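The rewritten owner check above relies on chained `dict.get` so an anonymous gist (no `owner` key) cannot raise `KeyError`. A standalone illustration of the same predicate (`owned_or_starred` is a hypothetical name):

```python
def owned_or_starred(repo, user):
    # repo.get('owner', {}) returns an empty dict when the key is absent,
    # so .get('login') then yields None instead of raising KeyError.
    return repo.get('owner', {}).get('login') == user or bool(repo.get('is_starred'))


repos = [
    {'owner': {'login': 'alice'}, 'name': 'mine'},
    {'name': 'anonymous-gist'},              # no owner key at all
    {'owner': {'login': 'bob'}, 'is_starred': True},
]
kept = [r for r in repos if owned_or_starred(r, 'alice')]
```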
@@ -459,24 +703,56 @@ def backup_repositories(args, output_directory, repositories):
     log_info('Backing up repositories')
     repos_template = 'https://{0}/repos'.format(get_github_api_host(args))
+
+    if args.incremental:
+        last_update = max(list(repository['updated_at'] for repository in repositories) or [time.strftime('%Y-%m-%dT%H:%M:%SZ', time.localtime())])  # noqa
+        last_update_path = os.path.join(output_directory, 'last_update')
+        if os.path.exists(last_update_path):
+            args.since = open(last_update_path).read().strip()
+        else:
+            args.since = None
+    else:
+        args.since = None
+
     for repository in repositories:
-        backup_cwd = os.path.join(output_directory, 'repositories')
-        repo_cwd = os.path.join(backup_cwd, repository['name'])
+        if repository.get('is_gist'):
+            repo_cwd = os.path.join(output_directory, 'gists', repository['id'])
+        elif repository.get('is_starred'):
+            # put starred repos in -o/starred/${owner}/${repo} to prevent collision of
+            # any repositories with the same name
+            repo_cwd = os.path.join(output_directory, 'starred', repository['owner']['login'], repository['name'])
+        else:
+            repo_cwd = os.path.join(output_directory, 'repositories', repository['name'])
+
         repo_dir = os.path.join(repo_cwd, 'repository')
         repo_url = get_github_repo_url(args, repository)
 
-        if args.include_repository or args.include_everything:
-            fetch_repository(repository['name'],
+        include_gists = (args.include_gists or args.include_starred_gists)
+        if (args.include_repository or args.include_everything) \
+                or (include_gists and repository.get('is_gist')):
+            repo_name = repository.get('name') if not repository.get('is_gist') else repository.get('id')
+            fetch_repository(repo_name,
                              repo_url,
                              repo_dir,
-                             skip_existing=args.skip_existing)
+                             skip_existing=args.skip_existing,
+                             bare_clone=args.bare_clone,
+                             lfs_clone=args.lfs_clone)
+
+            if repository.get('is_gist'):
+                # dump gist information to a file as well
+                output_file = '{0}/gist.json'.format(repo_cwd)
+                with codecs.open(output_file, 'w', encoding='utf-8') as f:
+                    json_dump(repository, f)
+
+                continue  # don't try to back up anything else for a gist; it doesn't exist
+
         download_wiki = (args.include_wiki or args.include_everything)
         if repository['has_wiki'] and download_wiki:
             fetch_repository(repository['name'],
                              repo_url.replace('.git', '.wiki.git'),
                              os.path.join(repo_cwd, 'wiki'),
-                             skip_existing=args.skip_existing)
+                             skip_existing=args.skip_existing,
+                             bare_clone=args.bare_clone,
+                             lfs_clone=args.lfs_clone)
+
         if args.include_issues or args.include_everything:
             backup_issues(args, repo_cwd, repository, repos_template)
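The `--incremental` logic above persists the newest `updated_at` value to a `last_update` file and feeds it back as `args.since` on the next run; because the timestamps are ISO-8601 strings, plain string comparison and `max()` order them correctly. A sketch of that round trip in isolation (paths and helper name are placeholders):

```python
import os
import tempfile
import time


def compute_last_update(repositories):
    # Newest updated_at wins; fall back to "now" when the list is empty.
    return max([r['updated_at'] for r in repositories]
               or [time.strftime('%Y-%m-%dT%H:%M:%SZ', time.localtime())])


workdir = tempfile.mkdtemp()
last_update_path = os.path.join(workdir, 'last_update')

repos = [{'updated_at': '2016-01-01T00:00:00Z'},
         {'updated_at': '2017-06-01T12:00:00Z'}]

# first run: nothing stored yet, so since stays None
since = open(last_update_path).read().strip() if os.path.exists(last_update_path) else None
assert since is None

# after a successful run the high-water mark is written out
with open(last_update_path, 'w') as f:
    f.write(compute_last_update(repos))

# the second run picks it up as the new since value
since = open(last_update_path).read().strip()
```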
@@ -493,6 +769,13 @@ def backup_repositories(args, output_directory, repositories):
         if args.include_hooks or args.include_everything:
             backup_hooks(args, repo_cwd, repository, repos_template)
+
+        if args.include_releases or args.include_everything:
+            backup_releases(args, repo_cwd, repository, repos_template,
+                            include_assets=args.include_assets or args.include_everything)
+
+    if args.incremental:
+        open(last_update_path, 'w').write(last_update)
 
 
 def backup_issues(args, repo_cwd, repository, repos_template):
     has_issues_dir = os.path.isdir('{0}/issues/.git'.format(repo_cwd))
@@ -509,12 +792,15 @@ def backup_issues(args, repo_cwd, repository, repos_template):
     _issue_template = '{0}/{1}/issues'.format(repos_template,
                                               repository['full_name'])
+
+    should_include_pulls = args.include_pulls or args.include_everything
     issue_states = ['open', 'closed']
     for issue_state in issue_states:
         query_args = {
             'filter': 'all',
             'state': issue_state
         }
+        if args.since:
+            query_args['since'] = args.since
+
         _issues = retrieve_data(args,
                                 _issue_template,
@@ -522,18 +808,21 @@ def backup_issues(args, repo_cwd, repository, repos_template):
         for issue in _issues:
             # skip pull requests which are also returned as issues
             # if retrieving pull requests is requested as well
-            if 'pull_request' in issue and (args.include_pulls or args.include_everything):
+            if 'pull_request' in issue and should_include_pulls:
                 issues_skipped += 1
                 continue
 
             issues[issue['number']] = issue
 
     if issues_skipped:
-        issues_skipped_message = ' (skipped {0} pull requests)'.format(issues_skipped)
+        issues_skipped_message = ' (skipped {0} pull requests)'.format(
+            issues_skipped)
 
-    log_info('Saving {0} issues to disk{1}'.format(len(issues.keys()), issues_skipped_message))
+    log_info('Saving {0} issues to disk{1}'.format(
+        len(list(issues.keys())), issues_skipped_message))
     comments_template = _issue_template + '/{0}/comments'
     events_template = _issue_template + '/{0}/events'
-    for number, issue in issues.iteritems():
+    for number, issue in list(issues.items()):
         if args.include_issue_comments or args.include_everything:
             template = comments_template.format(number)
             issues[number]['comment_data'] = retrieve_data(args, template)
@@ -558,24 +847,44 @@ def backup_pulls(args, repo_cwd, repository, repos_template):
     pulls = {}
     _pulls_template = '{0}/{1}/pulls'.format(repos_template,
                                              repository['full_name'])
 
-    pull_states = ['open', 'closed']
-    for pull_state in pull_states:
     query_args = {
         'filter': 'all',
-        'state': pull_state
+        'state': 'all',
+        'sort': 'updated',
+        'direction': 'desc',
     }
 
-        _pulls = retrieve_data(args,
+    if not args.include_pull_details:
+        pull_states = ['open', 'closed']
+        for pull_state in pull_states:
+            query_args['state'] = pull_state
+            _pulls = retrieve_data_gen(args,
+                                       _pulls_template,
+                                       query_args=query_args)
+            for pull in _pulls:
+                if args.since and pull['updated_at'] < args.since:
+                    break
+                if not args.since or pull['updated_at'] >= args.since:
+                    pulls[pull['number']] = pull
+    else:
+        _pulls = retrieve_data_gen(args,
+                                   _pulls_template,
+                                   query_args=query_args)
+        for pull in _pulls:
+            if args.since and pull['updated_at'] < args.since:
+                break
+            if not args.since or pull['updated_at'] >= args.since:
+                pulls[pull['number']] = retrieve_data(
+                    args,
+                    _pulls_template + '/{}'.format(pull['number']),
+                    single_request=True
+                )[0]
 
-    log_info('Saving {0} pull requests to disk'.format(len(pulls.keys())))
+    log_info('Saving {0} pull requests to disk'.format(
+        len(list(pulls.keys()))))
     comments_template = _pulls_template + '/{0}/comments'
     commits_template = _pulls_template + '/{0}/commits'
-    for number, pull in pulls.iteritems():
+    for number, pull in list(pulls.items()):
         if args.include_pull_comments or args.include_everything:
             template = comments_template.format(number)
             pulls[number]['comment_data'] = retrieve_data(args, template)
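Because the pulls query above asks for `sort=updated` with `direction=desc`, the loop can `break` at the first pull older than `args.since` instead of paging through the rest. The same idea over a plain list (`pulls_since` is an illustrative name):

```python
def pulls_since(pulls, since):
    # pulls must be sorted by updated_at descending, as the API query
    # guarantees with sort=updated&direction=desc.
    selected = {}
    for pull in pulls:
        if since and pull['updated_at'] < since:
            break  # everything after this point is older still
        selected[pull['number']] = pull
    return selected


pulls = [
    {'number': 3, 'updated_at': '2017-03-01T00:00:00Z'},
    {'number': 2, 'updated_at': '2017-02-01T00:00:00Z'},
    {'number': 1, 'updated_at': '2017-01-01T00:00:00Z'},
]
```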
@@ -609,8 +918,9 @@ def backup_milestones(args, repo_cwd, repository, repos_template):
     for milestone in _milestones:
         milestones[milestone['number']] = milestone
 
-    log_info('Saving {0} milestones to disk'.format(len(milestones.keys())))
-    for number, milestone in milestones.iteritems():
+    log_info('Saving {0} milestones to disk'.format(
+        len(list(milestones.keys()))))
+    for number, milestone in list(milestones.items()):
         milestone_file = '{0}/{1}.json'.format(milestone_cwd, number)
         with codecs.open(milestone_file, 'w', encoding='utf-8') as f:
             json_dump(milestone, f)
@@ -647,7 +957,52 @@ def backup_hooks(args, repo_cwd, repository, repos_template):
         log_info("Unable to read hooks, skipping")
 
 
-def fetch_repository(name, remote_url, local_dir, skip_existing=False):
+def backup_releases(args, repo_cwd, repository, repos_template, include_assets=False):
+    repository_fullname = repository['full_name']
+
+    # give release files somewhere to live & log intent
+    release_cwd = os.path.join(repo_cwd, 'releases')
+    log_info('Retrieving {0} releases'.format(repository_fullname))
+    mkdir_p(repo_cwd, release_cwd)
+
+    query_args = {}
+
+    release_template = '{0}/{1}/releases'.format(repos_template, repository_fullname)
+    releases = retrieve_data(args, release_template, query_args=query_args)
+
+    # for each release, store it
+    log_info('Saving {0} releases to disk'.format(len(releases)))
+    for release in releases:
+        release_name = release['tag_name']
+        output_filepath = os.path.join(release_cwd, '{0}.json'.format(release_name))
+        with codecs.open(output_filepath, 'w+', encoding='utf-8') as f:
+            json_dump(release, f)
+
+        if include_assets:
+            assets = retrieve_data(args, release['assets_url'])
+            if len(assets) > 0:
+                # give release asset files somewhere to live & download them (not including source archives)
+                release_assets_cwd = os.path.join(release_cwd, release_name)
+                mkdir_p(release_assets_cwd)
+                for asset in assets:
+                    download_file(asset['url'], os.path.join(release_assets_cwd, asset['name']), get_auth(args))
+
+
+def fetch_repository(name,
+                     remote_url,
+                     local_dir,
+                     skip_existing=False,
+                     bare_clone=False,
+                     lfs_clone=False):
+    if bare_clone:
+        if os.path.exists(local_dir):
+            clone_exists = subprocess.check_output(['git',
+                                                    'rev-parse',
+                                                    '--is-bare-repository'],
+                                                   cwd=local_dir) == b"true\n"
+        else:
+            clone_exists = False
+    else:
-    clone_exists = os.path.exists(os.path.join(local_dir, '.git'))
+        clone_exists = os.path.exists(os.path.join(local_dir, '.git'))
 
     if clone_exists and skip_existing:
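The `--is-bare-repository` probe above is needed because a mirror clone has no `.git` subdirectory, so the old `os.path.exists` check alone cannot recognise it. A self-contained sketch of the probe; the `run` parameter is a hypothetical injection point so the logic can be exercised without a real checkout:

```python
import subprocess


def is_bare_repository(local_dir, run=subprocess.check_output):
    # `git rev-parse --is-bare-repository` prints "true" or "false";
    # pass a fake `run` callable to test without git on PATH.
    out = run(['git', 'rev-parse', '--is-bare-repository'], cwd=local_dir)
    return out == b"true\n"
```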
@@ -655,26 +1010,50 @@ def fetch_repository(name, remote_url, local_dir, skip_existing=False):
 
     masked_remote_url = mask_password(remote_url)
 
-    initalized = subprocess.call('git ls-remote ' + remote_url,
+    initialized = subprocess.call('git ls-remote ' + remote_url,
                                  stdout=FNULL,
                                  stderr=FNULL,
                                  shell=True)
-    if initalized == 128:
-        log_info("Skipping {0} ({1}) since it's not initalized".format(name, masked_remote_url))
+    if initialized == 128:
+        log_info("Skipping {0} ({1}) since it's not initialized".format(
+            name, masked_remote_url))
         return
 
     if clone_exists:
         log_info('Updating {0} in {1}'.format(name, local_dir))
+
+        remotes = subprocess.check_output(['git', 'remote', 'show'],
+                                          cwd=local_dir)
+        remotes = [i.strip() for i in remotes.decode('utf-8').splitlines()]
+
+        if 'origin' not in remotes:
+            git_command = ['git', 'remote', 'rm', 'origin']
+            logging_subprocess(git_command, None, cwd=local_dir)
+            git_command = ['git', 'remote', 'add', 'origin', remote_url]
+            logging_subprocess(git_command, None, cwd=local_dir)
-        git_command = ['git', 'fetch', '--all', '--tags', '--prune']
+        else:
+            git_command = ['git', 'remote', 'set-url', 'origin', remote_url]
+            logging_subprocess(git_command, None, cwd=local_dir)
+
+        if lfs_clone:
+            git_command = ['git', 'lfs', 'fetch', '--all', '--force', '--tags', '--prune']
+        else:
+            git_command = ['git', 'fetch', '--all', '--force', '--tags', '--prune']
         logging_subprocess(git_command, None, cwd=local_dir)
     else:
-        log_info('Cloning {0} repository from {1} to {2}'.format(name,
+        log_info('Cloning {0} repository from {1} to {2}'.format(
+            name,
             masked_remote_url,
             local_dir))
+        if bare_clone:
+            if lfs_clone:
+                git_command = ['git', 'lfs', 'clone', '--mirror', remote_url, local_dir]
+            else:
+                git_command = ['git', 'clone', '--mirror', remote_url, local_dir]
+        else:
+            if lfs_clone:
+                git_command = ['git', 'lfs', 'clone', remote_url, local_dir]
+            else:
-        git_command = ['git', 'clone', remote_url, local_dir]
+                git_command = ['git', 'clone', remote_url, local_dir]
         logging_subprocess(git_command, None)
 
@@ -683,21 +1062,37 @@ def backup_account(args, output_directory):
     account_cwd = os.path.join(output_directory, 'account')
 
     if args.include_starred or args.include_everything:
-        output_file = '{0}/starred.json'.format(account_cwd)
-        template = "https://{0}/users/{1}/starred"
-        template = template.format(get_github_api_host(args), args.user)
+        output_file = "{0}/starred.json".format(account_cwd)
+        template = "https://{0}/users/{1}/starred".format(get_github_api_host(args), args.user)
         _backup_data(args,
-                     'starred repositories',
+                     "starred repositories",
                      template,
                      output_file,
                      account_cwd)
 
     if args.include_watched or args.include_everything:
-        output_file = '{0}/watched.json'.format(account_cwd)
-        template = "https://{0}/users/{1}/subscriptions"
-        template = template.format(get_github_api_host(args), args.user)
+        output_file = "{0}/watched.json".format(account_cwd)
+        template = "https://{0}/users/{1}/subscriptions".format(get_github_api_host(args), args.user)
         _backup_data(args,
-                     'watched repositories',
+                     "watched repositories",
+                     template,
+                     output_file,
+                     account_cwd)
+
+    if args.include_followers or args.include_everything:
+        output_file = "{0}/followers.json".format(account_cwd)
+        template = "https://{0}/users/{1}/followers".format(get_github_api_host(args), args.user)
+        _backup_data(args,
+                     "followers",
+                     template,
+                     output_file,
+                     account_cwd)
+
+    if args.include_following or args.include_everything:
+        output_file = "{0}/following.json".format(account_cwd)
+        template = "https://{0}/users/{1}/following".format(get_github_api_host(args), args.user)
+        _backup_data(args,
+                     "following",
                      template,
                      output_file,
                      account_cwd)
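The four account blocks above differ only in endpoint and label; the repeated pattern can be tabulated as below. The host and user values are placeholders standing in for `get_github_api_host(args)` and `args.user`:

```python
GITHUB_API_HOST = 'api.github.com'  # stand-in for get_github_api_host(args)
USER = 'octocat'                    # stand-in for args.user

ACCOUNT_ENDPOINTS = {
    'starred repositories': 'https://{0}/users/{1}/starred',
    'watched repositories': 'https://{0}/users/{1}/subscriptions',
    'followers': 'https://{0}/users/{1}/followers',
    'following': 'https://{0}/users/{1}/following',
}

# one URL per backed-up account facet
urls = {label: template.format(GITHUB_API_HOST, USER)
        for label, template in ACCOUNT_ENDPOINTS.items()}
```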
@@ -732,9 +1127,13 @@ def main():
     log_info('Create output directory {0}'.format(output_directory))
     mkdir_p(output_directory)
+
+    if args.lfs_clone:
+        check_git_lfs_install()
+
     log_info('Backing up user {0} to {1}'.format(args.user, output_directory))
 
-    repositories = retrieve_repositories(args)
+    authenticated_user = get_authenticated_user(args)
+    repositories = retrieve_repositories(args, authenticated_user)
     repositories = filter_repositories(args, repositories)
     backup_repositories(args, output_directory, repositories)
     backup_account(args, output_directory)
@@ -1 +1 @@
-__version__ = '0.9.0'
+__version__ = '0.27.0'

release
@@ -1,8 +1,13 @@
 #!/usr/bin/env bash
 set -eo pipefail; [[ $RELEASE_TRACE ]] && set -x
 
-PACKAGE_NAME='github-backup'
-INIT_PACKAGE_NAME='github_backup'
+if [[ ! -f setup.py ]]; then
+    echo -e "${RED}WARNING: Missing setup.py${COLOR_OFF}\n"
+    exit 1
+fi
+
+PACKAGE_NAME="$(cat setup.py | grep "name='" | head | cut -d "'" -f2)"
+INIT_PACKAGE_NAME="$(echo "${PACKAGE_NAME//-/_}")"
 PUBLIC="true"
 
 # Colors
@@ -34,7 +39,7 @@ fi
 echo -e "\n${GREEN}STARTING RELEASE PROCESS${COLOR_OFF}\n"
 
 set +e;
-git status | grep "working directory clean" &> /dev/null
+git status | grep -Eo "working (directory|tree) clean" &> /dev/null
 if [ ! $? -eq 0 ]; then  # working directory is NOT clean
     echo -e "${RED}WARNING: You have uncomitted changes, you may have forgotten something${COLOR_OFF}\n"
     exit 1
setup.py
@@ -37,8 +37,9 @@ setup(
         'Development Status :: 5 - Production/Stable',
         'Topic :: System :: Archiving :: Backup',
         'License :: OSI Approved :: MIT License',
-        'Programming Language :: Python :: 2.6',
         'Programming Language :: Python :: 2.7',
+        'Programming Language :: Python :: 3.5',
+        'Programming Language :: Python :: 3.6',
     ],
     description='backup a github user or organization',
     long_description=open_file('README.rst').read(),