Mirror of https://github.com/josegonzalez/python-github-backup.git
Synced 2025-12-05 08:08:02 +01:00

Compare commits — 76 commits
| Author | SHA1 | Date |
|---|---|---|
| | 8b7512c8d8 | |
| | 995b7ede6c | |
| | 7840528fe2 | |
| | 6fb0d86977 | |
| | 9f6b401171 | |
| | bf638f7aea | |
| | c3855a94f1 | |
| | c3f4bfde0d | |
| | d3edef0622 | |
| | 9ef496efad | |
| | 42bfe6f79d | |
| | 5af522a348 | |
| | 6dfba7a783 | |
| | 7551829677 | |
| | 72d35a9b94 | |
| | 3eae9d78ed | |
| | 90ba839c7d | |
| | 1ec0820936 | |
| | ca463e5cd4 | |
| | 1750d0eff1 | |
| | e4d1c78993 | |
| | 7a9455db88 | |
| | a98ff7f23d | |
| | 7b78f06a68 | |
| | 56db3ff0e8 | |
| | 5c9c20f6ee | |
| | c8c585cbb5 | |
| | e7880bb056 | |
| | 18e3bd574a | |
| | 1ed3d66777 | |
| | a194fa48ce | |
| | 8f859be355 | |
| | 80e00d31d9 | |
| | 32202656ba | |
| | 875e31819a | |
| | 73dc75ab95 | |
| | cd23dd1a16 | |
| | d244de1952 | |
| | 4dae43c58e | |
| | b018a91fb4 | |
| | 759ec58beb | |
| | b43c998b65 | |
| | 38b4a2c106 | |
| | 6210ec3845 | |
| | 90396d2bdf | |
| | aa35e883b0 | |
| | 963ed3e6f6 | |
| | b710547fdc | |
| | 64b5667a16 | |
| | b0c8cfe059 | |
| | 5bedaf825f | |
| | 9d28d9c2b0 | |
| | eb756d665c | |
| | 3d5f61aa22 | |
| | d6bf031bf7 | |
| | 85ab54e514 | |
| | df4d751be2 | |
| | 03c660724d | |
| | 39848e650c | |
| | 12ac519e9c | |
| | 9e25473151 | |
| | d3079bfb74 | |
| | 3b9ff1ac14 | |
| | 268a989b09 | |
| | 45a3b87892 | |
| | 1c465f4d35 | |
| | 3ad9b02b26 | |
| | 8bfad9b5b7 | |
| | 985d79c1bc | |
| | 7d1b7f20ef | |
| | d3b67f884a | |
| | 65749bfde4 | |
| | aeeb0eb9d7 | |
| | f027760ac5 | |
| | a9e48f8c4e | |
| | 338d5a956b | |
`.dockerignore` — new file, 75 lines (`@@ -0,0 +1,75 @@`)

```
# Docker ignore file to reduce build context size

# Temp files
*~
~*
.*~
\#*
.#*
*#
dist

# Build files
build
dist
pkg
*.egg
*.egg-info

# Debian Files
debian/files
debian/python-github-backup*

# Sphinx build
doc/_build

# Generated man page
doc/github_backup.1

# Annoying macOS files
.DS_Store
._*

# IDE configuration files
.vscode
.atom
.idea
*.code-workspace

# RSA
id_rsa
id_rsa.pub

# Virtual env
venv
.venv

# Git
.git
.gitignore
.gitchangelog.rc
.github

# Documentation
*.md
!README.md

# Environment variables files
.env
.env.*
!.env.example
*.log

# Cache files
**/__pycache__/
*.py[cod]

# Docker files
docker-compose.yml
Dockerfile*

# Other files
release
*.tar
*.zip
*.gzip
```
`.github/ISSUE_TEMPLATE/bug.yaml` (vendored) — new file, 28 lines (`@@ -0,0 +1,28 @@`)

```yaml
---
name: Bug Report
description: File a bug report.
body:
  - type: markdown
    attributes:
      value: |
        # Important notice regarding filed issues

        This project already fills my needs, and as such I have no real reason to continue its development. This project is otherwise provided as is, and no support is given.

        If pull requests implementing bug fixes or enhancements are pushed, I am happy to review and merge them (time permitting).

        If you wish to have a bug fixed, you have a few options:

        - Fix it yourself and file a pull request.
        - File a bug and hope someone else fixes it for you.
        - Pay me to fix it (my rate is $200 an hour, minimum 1 hour, contact me via my [github email address](https://github.com/josegonzalez) if you want to go this route).

        In all cases, feel free to file an issue, they may be of help to others in the future.
  - type: textarea
    id: what-happened
    attributes:
      label: What happened?
      description: Also tell us, what did you expect to happen?
      placeholder: Tell us what you see!
    validations:
      required: true
```
`.github/ISSUE_TEMPLATE/feature.yaml` (vendored) — new file, 27 lines (`@@ -0,0 +1,27 @@`)

```yaml
---
name: Feature Request
description: File a feature request.
body:
  - type: markdown
    attributes:
      value: |
        # Important notice regarding filed issues

        This project already fills my needs, and as such I have no real reason to continue its development. This project is otherwise provided as is, and no support is given.

        If pull requests implementing bug fixes or enhancements are pushed, I am happy to review and merge them (time permitting).

        If you wish to have a feature implemented, you have a few options:

        - Implement it yourself and file a pull request.
        - File an issue and hope someone else implements it for you.
        - Pay me to implement it (my rate is $200 an hour, minimum 1 hour, contact me via my [github email address](https://github.com/josegonzalez) if you want to go this route).

        In all cases, feel free to file an issue, they may be of help to others in the future.
  - type: textarea
    id: what-would-you-like-to-happen
    attributes:
      label: What would you like to happen?
      description: Please describe in detail how the new functionality should work as well as any issues with existing functionality.
    validations:
      required: true
```
`.github/workflows/automatic-release.yml` (vendored) — 4 lines changed

```diff
@@ -18,7 +18,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: Checkout repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6
         with:
           fetch-depth: 0
           ssh-key: ${{ secrets.DEPLOY_PRIVATE_KEY }}
@@ -27,7 +27,7 @@ jobs:
           git config --local user.email "action@github.com"
           git config --local user.name "GitHub Action"
       - name: Setup Python
-        uses: actions/setup-python@v5
+        uses: actions/setup-python@v6
         with:
           python-version: '3.12'
       - name: Install prerequisites
```
`.github/workflows/docker.yml` (vendored) — 2 lines changed

```diff
@@ -38,7 +38,7 @@ jobs:

     steps:
       - name: Checkout repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6
         with:
           persist-credentials: false
```
`.github/workflows/lint.yml` (vendored) — 9 lines changed

```diff
@@ -15,16 +15,19 @@ jobs:
   lint:
     name: lint
     runs-on: ubuntu-24.04
+    strategy:
+      matrix:
+        python-version: ["3.10", "3.11", "3.12", "3.13", "3.14"]
+
     steps:
       - name: Checkout repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6
         with:
           fetch-depth: 0
       - name: Setup Python
-        uses: actions/setup-python@v5
+        uses: actions/setup-python@v6
         with:
-          python-version: "3.12"
+          python-version: ${{ matrix.python-version }}
           cache: "pip"
       - run: pip install -r release-requirements.txt && pip install wheel
       - run: flake8 --ignore=E501,E203,W503
```
`.github/workflows/test.yml` (vendored) — new file, 33 lines (`@@ -0,0 +1,33 @@`)

```yaml
---
name: "test"

# yamllint disable-line rule:truthy
on:
  pull_request:
    branches:
      - "*"
  push:
    branches:
      - "main"
      - "master"

jobs:
  test:
    name: test
    runs-on: ubuntu-24.04
    strategy:
      matrix:
        python-version: ["3.10", "3.11", "3.12", "3.13", "3.14"]

    steps:
      - name: Checkout repository
        uses: actions/checkout@v6
        with:
          fetch-depth: 0
      - name: Setup Python
        uses: actions/setup-python@v6
        with:
          python-version: ${{ matrix.python-version }}
          cache: "pip"
      - run: pip install -r release-requirements.txt
      - run: pytest tests/ -v
```
`.gitignore` (vendored) — 4 lines changed

```diff
@@ -1,4 +1,4 @@
-*.py[oc]
+*.py[cod]

 # Temp files
 *~
@@ -33,6 +33,7 @@ doc/github_backup.1
 # IDE configuration files
 .vscode
 .atom
+.idea

 README

@@ -42,3 +43,4 @@ id_rsa.pub

 # Virtual env
 venv
+.venv
```
`CHANGES.rst` — 570 lines changed (`@@ -1,9 +1,577 @@`; new entries below replace the old top section, which began at 0.50.3 (2025-08-08))

Changelog
=========


0.52.0 (2025-11-28)
-------------------
- Skip DMCA'd repos which return a 451 response. [Rodos]

  Log a warning and the link to the DMCA notice. Continue backing up
  other repositories instead of crashing.

  Closes #163
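The 451 handling described in the entry above can be sketched roughly as follows; `fetch_status` and the function name are illustrative stand-ins for this sketch, not the project's actual API:

```python
import logging

# HTTP 451 "Unavailable For Legal Reasons" is what GitHub returns for
# repositories taken down by a DMCA notice.
HTTP_UNAVAILABLE_FOR_LEGAL_REASONS = 451


def filter_dmca_repositories(repositories, fetch_status):
    """Return the repositories that are retrievable, logging a warning
    (with the takedown-notice link) for any repo that answers 451,
    instead of letting one blocked repo crash the whole backup run.

    fetch_status(repo) -> (http_status, dmca_notice_url_or_None)
    """
    available = []
    for repo in repositories:
        status, notice_url = fetch_status(repo)
        if status == HTTP_UNAVAILABLE_FOR_LEGAL_REASONS:
            logging.warning(
                "Skipping %s: repository blocked by DMCA takedown (notice: %s)",
                repo,
                notice_url,
            )
            continue
        available.append(repo)
    return available
```

The point of the fix is the `continue`: a legal block on one repository downgrades from a fatal error to a logged skip.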
- Chore(deps): bump restructuredtext-lint in the python-packages group.
  [dependabot[bot]]

  Bumps the python-packages group with 1 update: [restructuredtext-lint](https://github.com/twolfson/restructuredtext-lint).

  Updates `restructuredtext-lint` from 1.4.0 to 2.0.2
  - [Changelog](https://github.com/twolfson/restructuredtext-lint/blob/master/CHANGELOG.rst)
  - [Commits](https://github.com/twolfson/restructuredtext-lint/compare/1.4.0...2.0.2)

  ---
  updated-dependencies:
  - dependency-name: restructuredtext-lint
    dependency-version: 2.0.2
    dependency-type: direct:production
    update-type: version-update:semver-major
    dependency-group: python-packages
  ...
- Chore(deps): bump actions/checkout from 5 to 6. [dependabot[bot]]

  Bumps [actions/checkout](https://github.com/actions/checkout) from 5 to 6.
  - [Release notes](https://github.com/actions/checkout/releases)
  - [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
  - [Commits](https://github.com/actions/checkout/compare/v5...v6)

  ---
  updated-dependencies:
  - dependency-name: actions/checkout
    dependency-version: '6'
    dependency-type: direct:production
    update-type: version-update:semver-major
  ...
- Chore(deps): bump the python-packages group with 3 updates.
  [dependabot[bot]]

  Bumps the python-packages group with 3 updates: [click](https://github.com/pallets/click), [pytest](https://github.com/pytest-dev/pytest) and [keyring](https://github.com/jaraco/keyring).

  Updates `click` from 8.3.0 to 8.3.1
  - [Release notes](https://github.com/pallets/click/releases)
  - [Changelog](https://github.com/pallets/click/blob/main/CHANGES.rst)
  - [Commits](https://github.com/pallets/click/compare/8.3.0...8.3.1)

  Updates `pytest` from 8.3.3 to 9.0.1
  - [Release notes](https://github.com/pytest-dev/pytest/releases)
  - [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
  - [Commits](https://github.com/pytest-dev/pytest/compare/8.3.3...9.0.1)

  Updates `keyring` from 25.6.0 to 25.7.0
  - [Release notes](https://github.com/jaraco/keyring/releases)
  - [Changelog](https://github.com/jaraco/keyring/blob/main/NEWS.rst)
  - [Commits](https://github.com/jaraco/keyring/compare/v25.6.0...v25.7.0)

  ---
  updated-dependencies:
  - dependency-name: click
    dependency-version: 8.3.1
    dependency-type: direct:production
    update-type: version-update:semver-patch
    dependency-group: python-packages
  - dependency-name: pytest
    dependency-version: 9.0.1
    dependency-type: direct:production
    update-type: version-update:semver-major
    dependency-group: python-packages
  - dependency-name: keyring
    dependency-version: 25.7.0
    dependency-type: direct:production
    update-type: version-update:semver-minor
    dependency-group: python-packages
  ...
0.51.3 (2025-11-18)
-------------------
- Test: Add pagination tests for cursor and page-based Link headers.
  [Rodos]
- Use cursor based pagination. [Helio Machado]
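The switch to cursor-based pagination works because a client that simply follows the `rel="next"` URL from the `Link` response header does not care whether that URL carries a page number or a cursor. A minimal sketch of such a Link-header parser (illustrative, not the project's actual implementation):

```python
import re


def next_page_url(link_header):
    """Return the rel="next" URL from a GitHub-style Link header, or None.

    Works unchanged for page-based (?page=2) and cursor-based
    (?after=Y3Vyc29y) pagination, because the caller just follows the
    URL the server hands back. Splitting on "," is a simplification
    that assumes the URLs in the header contain no commas.
    """
    if not link_header:
        return None
    for part in link_header.split(","):
        match = re.search(r'<([^>]+)>\s*;\s*rel="next"', part)
        if match:
            return match.group(1)
    return None
```

A backup loop would call this on each response and stop when it returns `None`.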
0.51.2 (2025-11-16)
-------------------

Fix
~~~
- Improve CA certificate detection with fallback chain. [Rodos]

  The previous implementation incorrectly assumed empty get_ca_certs()
  meant broken SSL, causing false failures in GitHub Codespaces and other
  directory-based cert systems where certificates exist but aren't pre-loaded.
  It would then attempt to import certifi as a workaround, but certifi wasn't
  listed in requirements.txt, causing the fallback to fail with ImportError
  even though the system certificates would have worked fine.

  This commit replaces the naive check with a layered fallback approach that
  checks multiple certificate sources. First it checks for pre-loaded system
  certs (file-based systems). Then it verifies system cert paths exist
  (directory-based systems like Ubuntu/Debian/Codespaces). Finally it attempts
  to use certifi as an optional fallback only if needed.

  This approach eliminates hard dependencies (certifi is now optional), works
  in GitHub Codespaces without any setup, and fails gracefully with clear hints
  for resolution when SSL is actually broken rather than failing with
  ModuleNotFoundError.

  Fixes #444
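The layered fallback chain described above can be sketched like this; a minimal reconstruction from the changelog text, where `build_ssl_context` is a hypothetical name rather than the project's actual function:

```python
import os
import ssl


def build_ssl_context():
    """Detect usable CA certificates via a layered fallback chain:
    1. pre-loaded system certs (file-based systems),
    2. existing system cert paths (directory-based systems such as
       Ubuntu/Debian or GitHub Codespaces),
    3. certifi, as an optional last resort.
    """
    context = ssl.create_default_context()

    # Layer 1: certs already loaded from a CA bundle file.
    if context.get_ca_certs():
        return context

    # Layer 2: a cert file or hashed cert directory exists on disk,
    # even if nothing is pre-loaded into the context yet.
    paths = ssl.get_default_verify_paths()
    for candidate in (paths.cafile, paths.capath):
        if candidate and os.path.exists(candidate):
            return context

    # Layer 3: fall back to certifi only if it happens to be installed.
    try:
        import certifi
    except ImportError:
        raise RuntimeError(
            "No CA certificates found; install the ca-certificates "
            "package or `pip install certifi`"
        )
    context.load_verify_locations(cafile=certifi.where())
    return context
```

The ordering is the point: the hard dependency on certifi disappears because it is only consulted after both system-level checks have failed, and a genuine SSL problem surfaces as a clear error message instead of a ModuleNotFoundError.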
0.51.1 (2025-11-16)
-------------------

Fix
~~~
- Prevent duplicate attachment downloads. [Rodos]

  Fixes bug where attachments were downloaded multiple times with
  incremented filenames (file.mov, file_1.mov, file_2.mov) when
  running backups without --skip-existing flag.

  I should not have used the --skip-existing flag for attachments,
  it did not do what I thought it did.

  The correct approach is to always use the manifest to guide what
  has already been downloaded and what now needs to be done.
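A manifest-guided check of this kind might look like the following sketch; the function names and the manifest layout are illustrative assumptions, not the project's actual schema:

```python
import json
import os


def should_download(manifest_path, attachment_url):
    """True if the attachment URL is not yet recorded in the manifest,
    so repeated backup runs skip files they already have instead of
    saving them again as file_1.mov, file_2.mov, ...
    """
    if not os.path.exists(manifest_path):
        return True
    with open(manifest_path) as handle:
        manifest = json.load(handle)
    return attachment_url not in manifest.get("downloaded", [])


def record_download(manifest_path, attachment_url):
    """Record a completed download, writing the manifest via a temp
    file plus rename so a crash never leaves it half-written."""
    manifest = {"downloaded": []}
    if os.path.exists(manifest_path):
        with open(manifest_path) as handle:
            manifest = json.load(handle)
    manifest.setdefault("downloaded", []).append(attachment_url)
    tmp_path = manifest_path + ".tmp"
    with open(tmp_path, "w") as handle:
        json.dump(manifest, handle)
    os.replace(tmp_path, manifest_path)  # atomic rename on POSIX
```

Keying on the source URL rather than the local filename is what makes this robust: the filename-collision counter can still run, but only for genuinely new attachments.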
Other
~~~~~
- Chore(deps): bump certifi in the python-packages group.
  [dependabot[bot]]

  Bumps the python-packages group with 1 update: [certifi](https://github.com/certifi/python-certifi).

  Updates `certifi` from 2025.10.5 to 2025.11.12
  - [Commits](https://github.com/certifi/python-certifi/compare/2025.10.05...2025.11.12)

  ---
  updated-dependencies:
  - dependency-name: certifi
    dependency-version: 2025.11.12
    dependency-type: direct:production
    update-type: version-update:semver-minor
    dependency-group: python-packages
  ...
- Test: Add pytest infrastructure and attachment tests. [Rodos]

  In making my last fix to attachments, I found it challenging not
  having tests to ensure there was no regression.

  Added pytest with minimal setup and isolated configuration. Created
  a separate test workflow to keep tests isolated from linting.

  Tests cover the key elements of the attachment logic:
  - URL extraction from issue bodies
  - Filename extraction from different URL types
  - Filename collision resolution
  - Manifest duplicate prevention
- Chore(deps): bump black in the python-packages group.
  [dependabot[bot]]

  Bumps the python-packages group with 1 update: [black](https://github.com/psf/black).

  Updates `black` from 25.9.0 to 25.11.0
  - [Release notes](https://github.com/psf/black/releases)
  - [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
  - [Commits](https://github.com/psf/black/compare/25.9.0...25.11.0)

  ---
  updated-dependencies:
  - dependency-name: black
    dependency-version: 25.11.0
    dependency-type: direct:production
    update-type: version-update:semver-minor
    dependency-group: python-packages
  ...
- Chore(deps): bump docutils in the python-packages group.
  [dependabot[bot]]

  Bumps the python-packages group with 1 update: [docutils](https://github.com/rtfd/recommonmark).

  Updates `docutils` from 0.22.2 to 0.22.3
  - [Changelog](https://github.com/readthedocs/recommonmark/blob/master/CHANGELOG.md)
  - [Commits](https://github.com/rtfd/recommonmark/commits)

  ---
  updated-dependencies:
  - dependency-name: docutils
    dependency-version: 0.22.3
    dependency-type: direct:production
    update-type: version-update:semver-patch
    dependency-group: python-packages
  ...
0.51.0 (2025-11-06)
-------------------

Fix
~~~
- Remove Python 3.8 and 3.9 from CI matrix. [Rodos]

  3.8 and 3.9 are failing because the pinned dependencies don't support them:
  - autopep8==2.3.2 needs Python 3.9+
  - bleach==6.3.0 needs Python 3.10+

  Both are EOL now anyway (3.8 in Oct 2024, 3.9 in Oct 2025).

  Just fixing CI to test 3.10-3.14 for now. Will do a separate PR to formally
  drop 3.8/3.9 support with python_requires and README updates.

Other
~~~~~
- Refactor: Add atomic writes for attachment files and manifests.
  [Rodos]
- Feat: Add attachment download support for issues and pull requests.
  [Rodos]

  Adds new --attachments flag that downloads user-uploaded files from
  issue and PR bodies and comments. Key features:

  - Determines attachment URLs
  - Tracks downloads in manifest.json with metadata
  - Supports --skip-existing to avoid re-downloading
  - Handles filename collisions with counter suffix
  - Smart retry logic for transient vs permanent failures
  - Uses Content-Disposition for correct file extensions
- Feat: Drop support for Python 3.8 and 3.9 (EOL) [Rodos]

  Both Python 3.8 and 3.9 have reached end-of-life:
  - Python 3.8: EOL October 7, 2024
  - Python 3.9: EOL October 31, 2025

  Changes:
  - Add python_requires=">=3.10" to setup.py
  - Remove Python 3.8 and 3.9 from classifiers
  - Add Python 3.13 and 3.14 to classifiers
  - Update README to document Python 3.10+ requirement
- Feat: Enforce Python 3.8+ requirement and add multi-version CI
  testing. [Rodos]

  - Add python_requires=">=3.8" to setup.py to enforce minimum version at install time
  - Update README to explicitly document Python 3.8+ requirement
  - Add CI matrix to test lint/build on Python 3.8-3.14 (7 versions)
  - Aligns with actual usage patterns (~99% of downloads on Python 3.8+)
  - Prevents future PRs from inadvertently using incompatible syntax

  This change protects users by preventing installation on unsupported Python
  versions and ensures contributors can see version requirements clearly.
- Chore(deps): bump bleach in the python-packages group.
  [dependabot[bot]]

  Bumps the python-packages group with 1 update: [bleach](https://github.com/mozilla/bleach).

  Updates `bleach` from 6.2.0 to 6.3.0
  - [Changelog](https://github.com/mozilla/bleach/blob/main/CHANGES)
  - [Commits](https://github.com/mozilla/bleach/compare/v6.2.0...v6.3.0)

  ---
  updated-dependencies:
  - dependency-name: bleach
    dependency-version: 6.3.0
    dependency-type: direct:production
    update-type: version-update:semver-minor
    dependency-group: python-packages
  ...
- Chore(deps): bump charset-normalizer in the python-packages group.
  [dependabot[bot]]

  Bumps the python-packages group with 1 update: [charset-normalizer](https://github.com/jawah/charset_normalizer).

  Updates `charset-normalizer` from 3.4.3 to 3.4.4
  - [Release notes](https://github.com/jawah/charset_normalizer/releases)
  - [Changelog](https://github.com/jawah/charset_normalizer/blob/master/CHANGELOG.md)
  - [Commits](https://github.com/jawah/charset_normalizer/compare/3.4.3...3.4.4)

  ---
  updated-dependencies:
  - dependency-name: charset-normalizer
    dependency-version: 3.4.4
    dependency-type: direct:production
    update-type: version-update:semver-patch
    dependency-group: python-packages
  ...
- Chore(deps): bump idna from 3.10 to 3.11 in the python-packages group.
  [dependabot[bot]]

  Bumps the python-packages group with 1 update: [idna](https://github.com/kjd/idna).

  Updates `idna` from 3.10 to 3.11
  - [Release notes](https://github.com/kjd/idna/releases)
  - [Changelog](https://github.com/kjd/idna/blob/master/HISTORY.rst)
  - [Commits](https://github.com/kjd/idna/compare/v3.10...v3.11)

  ---
  updated-dependencies:
  - dependency-name: idna
    dependency-version: '3.11'
    dependency-type: direct:production
    update-type: version-update:semver-minor
    dependency-group: python-packages
  ...
- Chore(deps): bump the python-packages group across 1 directory with 2
  updates. [dependabot[bot]]

  Bumps the python-packages group with 2 updates in the / directory: [platformdirs](https://github.com/tox-dev/platformdirs) and [rich](https://github.com/Textualize/rich).

  Updates `platformdirs` from 4.4.0 to 4.5.0
  - [Release notes](https://github.com/tox-dev/platformdirs/releases)
  - [Changelog](https://github.com/tox-dev/platformdirs/blob/main/CHANGES.rst)
  - [Commits](https://github.com/tox-dev/platformdirs/compare/4.4.0...4.5.0)

  Updates `rich` from 14.1.0 to 14.2.0
  - [Release notes](https://github.com/Textualize/rich/releases)
  - [Changelog](https://github.com/Textualize/rich/blob/master/CHANGELOG.md)
  - [Commits](https://github.com/Textualize/rich/compare/v14.1.0...v14.2.0)

  ---
  updated-dependencies:
  - dependency-name: platformdirs
    dependency-version: 4.5.0
    dependency-type: direct:production
    update-type: version-update:semver-minor
    dependency-group: python-packages
  - dependency-name: rich
    dependency-version: 14.2.0
    dependency-type: direct:production
    update-type: version-update:semver-minor
    dependency-group: python-packages
  ...
- Chore(deps): bump the python-packages group with 3 updates.
  [dependabot[bot]]

  Bumps the python-packages group with 3 updates: [certifi](https://github.com/certifi/python-certifi), [click](https://github.com/pallets/click) and [markdown-it-py](https://github.com/executablebooks/markdown-it-py).

  Updates `certifi` from 2025.8.3 to 2025.10.5
  - [Commits](https://github.com/certifi/python-certifi/compare/2025.08.03...2025.10.05)

  Updates `click` from 8.1.8 to 8.3.0
  - [Release notes](https://github.com/pallets/click/releases)
  - [Changelog](https://github.com/pallets/click/blob/main/CHANGES.rst)
  - [Commits](https://github.com/pallets/click/compare/8.1.8...8.3.0)

  Updates `markdown-it-py` from 3.0.0 to 4.0.0
  - [Release notes](https://github.com/executablebooks/markdown-it-py/releases)
  - [Changelog](https://github.com/executablebooks/markdown-it-py/blob/master/CHANGELOG.md)
  - [Commits](https://github.com/executablebooks/markdown-it-py/compare/v3.0.0...v4.0.0)

  ---
  updated-dependencies:
  - dependency-name: certifi
    dependency-version: 2025.10.5
    dependency-type: direct:production
    update-type: version-update:semver-minor
    dependency-group: python-packages
  - dependency-name: click
    dependency-version: 8.3.0
    dependency-type: direct:production
    update-type: version-update:semver-minor
    dependency-group: python-packages
  - dependency-name: markdown-it-py
    dependency-version: 4.0.0
    dependency-type: direct:production
    update-type: version-update:semver-major
    dependency-group: python-packages
  ...
- Chore(deps): bump docutils in the python-packages group.
  [dependabot[bot]]

  Bumps the python-packages group with 1 update: [docutils](https://github.com/rtfd/recommonmark).

  Updates `docutils` from 0.22.1 to 0.22.2
  - [Changelog](https://github.com/readthedocs/recommonmark/blob/master/CHANGELOG.md)
  - [Commits](https://github.com/rtfd/recommonmark/commits)

  ---
  updated-dependencies:
  - dependency-name: docutils
    dependency-version: 0.22.2
    dependency-type: direct:production
    update-type: version-update:semver-patch
    dependency-group: python-packages
  ...
- Chore(deps): bump the python-packages group across 1 directory with 2
  updates. [dependabot[bot]]

  Bumps the python-packages group with 2 updates in the / directory: [black](https://github.com/psf/black) and [docutils](https://github.com/rtfd/recommonmark).

  Updates `black` from 25.1.0 to 25.9.0
  - [Release notes](https://github.com/psf/black/releases)
  - [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
  - [Commits](https://github.com/psf/black/compare/25.1.0...25.9.0)

  Updates `docutils` from 0.22 to 0.22.1
  - [Changelog](https://github.com/readthedocs/recommonmark/blob/master/CHANGELOG.md)
  - [Commits](https://github.com/rtfd/recommonmark/commits)

  ---
  updated-dependencies:
  - dependency-name: black
    dependency-version: 25.9.0
    dependency-type: direct:production
    update-type: version-update:semver-minor
    dependency-group: python-packages
  - dependency-name: docutils
    dependency-version: 0.22.1
    dependency-type: direct:production
    update-type: version-update:semver-patch
    dependency-group: python-packages
  ...
- Delete .github/ISSUE_TEMPLATE.md. [Jose Diaz-Gonzalez]
- Create feature.yaml. [Jose Diaz-Gonzalez]
- Delete .github/ISSUE_TEMPLATE/bug_report.md. [Jose Diaz-Gonzalez]
- Rename bug.md to bug.yaml. [Jose Diaz-Gonzalez]
- Chore: create bug template. [Jose Diaz-Gonzalez]
- Chore: Rename PULL_REQUEST.md to .github/PULL_REQUEST.md. [Jose Diaz-Gonzalez]
- Chore: Rename ISSUE_TEMPLATE.md to .github/ISSUE_TEMPLATE.md. [Jose Diaz-Gonzalez]
- Chore(deps): bump actions/setup-python from 5 to 6. [dependabot[bot]]

  Bumps [actions/setup-python](https://github.com/actions/setup-python) from 5 to 6.
  - [Release notes](https://github.com/actions/setup-python/releases)
  - [Commits](https://github.com/actions/setup-python/compare/v5...v6)

  ---
  updated-dependencies:
  - dependency-name: actions/setup-python
    dependency-version: '6'
    dependency-type: direct:production
    update-type: version-update:semver-major
  ...
- Chore(deps): bump twine from 6.1.0 to 6.2.0 in the python-packages
  group. [dependabot[bot]]

  Bumps the python-packages group with 1 update: [twine](https://github.com/pypa/twine).

  Updates `twine` from 6.1.0 to 6.2.0
  - [Release notes](https://github.com/pypa/twine/releases)
  - [Changelog](https://github.com/pypa/twine/blob/main/docs/changelog.rst)
  - [Commits](https://github.com/pypa/twine/compare/6.1.0...6.2.0)

  ---
  updated-dependencies:
  - dependency-name: twine
    dependency-version: 6.2.0
    dependency-type: direct:production
    update-type: version-update:semver-minor
    dependency-group: python-packages
  ...
- Chore(deps): bump more-itertools in the python-packages group.
  [dependabot[bot]]

  Bumps the python-packages group with 1 update: [more-itertools](https://github.com/more-itertools/more-itertools).

  Updates `more-itertools` from 10.7.0 to 10.8.0
  - [Release notes](https://github.com/more-itertools/more-itertools/releases)
  - [Commits](https://github.com/more-itertools/more-itertools/compare/v10.7.0...v10.8.0)

  ---
  updated-dependencies:
  - dependency-name: more-itertools
    dependency-version: 10.8.0
    dependency-type: direct:production
    update-type: version-update:semver-minor
    dependency-group: python-packages
  ...
- Chore(deps): bump platformdirs in the python-packages group.
  [dependabot[bot]]

  Bumps the python-packages group with 1 update: [platformdirs](https://github.com/tox-dev/platformdirs).

  Updates `platformdirs` from 4.3.8 to 4.4.0
  - [Release notes](https://github.com/tox-dev/platformdirs/releases)
  - [Changelog](https://github.com/tox-dev/platformdirs/blob/main/CHANGES.rst)
  - [Commits](https://github.com/tox-dev/platformdirs/compare/4.3.8...4.4.0)

  ---
  updated-dependencies:
  - dependency-name: platformdirs
    dependency-version: 4.4.0
    dependency-type: direct:production
    update-type: version-update:semver-minor
    dependency-group: python-packages
  ...
- Chore(deps): bump actions/checkout from 4 to 5. [dependabot[bot]]

  Bumps [actions/checkout](https://github.com/actions/checkout) from 4 to 5.
  - [Release notes](https://github.com/actions/checkout/releases)
  - [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md
|
||||||
|
- [Commits](https://github.com/actions/checkout/compare/v4...v5)
|
||||||
|
|
||||||
|
---
|
||||||
|
updated-dependencies:
|
||||||
|
- dependency-name: actions/checkout
|
||||||
|
dependency-version: '5'
|
||||||
|
dependency-type: direct:production
|
||||||
|
update-type: version-update:semver-major
|
||||||
|
...
|
||||||
|
- Chore(deps): bump requests in the python-packages group.
|
||||||
|
[dependabot[bot]]
|
||||||
|
|
||||||
|
Bumps the python-packages group with 1 update: [requests](https://github.com/psf/requests).
|
||||||
|
|
||||||
|
|
||||||
|
Updates `requests` from 2.32.4 to 2.32.5
|
||||||
|
- [Release notes](https://github.com/psf/requests/releases)
|
||||||
|
- [Changelog](https://github.com/psf/requests/blob/main/HISTORY.md)
|
||||||
|
- [Commits](https://github.com/psf/requests/compare/v2.32.4...v2.32.5)
|
||||||
|
|
||||||
|
---
|
||||||
|
updated-dependencies:
|
||||||
|
- dependency-name: requests
|
||||||
|
dependency-version: 2.32.5
|
||||||
|
dependency-type: direct:production
|
||||||
|
update-type: version-update:semver-patch
|
||||||
|
dependency-group: python-packages
|
||||||
|
...
|
||||||
|
- Chore: update Dockerfile to use Python 3.12 and improve dependency
|
||||||
|
installation. [Mateusz Hajder]
|
||||||
|
- Chore(deps): bump the python-packages group with 2 updates.
|
||||||
|
[dependabot[bot]]
|
||||||
|
|
||||||
|
Bumps the python-packages group with 2 updates: [certifi](https://github.com/certifi/python-certifi) and [charset-normalizer](https://github.com/jawah/charset_normalizer).
|
||||||
|
|
||||||
|
|
||||||
|
Updates `certifi` from 2025.7.14 to 2025.8.3
|
||||||
|
- [Commits](https://github.com/certifi/python-certifi/compare/2025.07.14...2025.08.03)
|
||||||
|
|
||||||
|
Updates `charset-normalizer` from 3.4.2 to 3.4.3
|
||||||
|
- [Release notes](https://github.com/jawah/charset_normalizer/releases)
|
||||||
|
- [Changelog](https://github.com/jawah/charset_normalizer/blob/master/CHANGELOG.md)
|
||||||
|
- [Commits](https://github.com/jawah/charset_normalizer/compare/3.4.2...3.4.3)
|
||||||
|
|
||||||
|
---
|
||||||
|
updated-dependencies:
|
||||||
|
- dependency-name: certifi
|
||||||
|
dependency-version: 2025.8.3
|
||||||
|
dependency-type: direct:production
|
||||||
|
update-type: version-update:semver-minor
|
||||||
|
dependency-group: python-packages
|
||||||
|
- dependency-name: charset-normalizer
|
||||||
|
dependency-version: 3.4.3
|
||||||
|
dependency-type: direct:production
|
||||||
|
update-type: version-update:semver-patch
|
||||||
|
dependency-group: python-packages
|
||||||
|
...
|
||||||
|
|
||||||
|
|
||||||
|
0.50.3 (2025-08-08)
-------------------
- Revert "Add conditional check for git checkout in development path"
  [Eric Wheeler]
Dockerfile (42 lines changed)
@@ -1,16 +1,38 @@
-FROM python:3.9.18-slim
+FROM python:3.12-alpine3.22 AS builder
 
-RUN --mount=type=cache,target=/var/cache/apt \
-    apt-get update && apt-get install -y git git-lfs
+RUN pip install --no-cache-dir --upgrade pip \
+    && pip install --no-cache-dir uv
 
-WORKDIR /usr/src/app
+WORKDIR /app
 
-COPY release-requirements.txt .
-RUN --mount=type=cache,target=/root/.cache/pip \
-    pip install -r release-requirements.txt
+RUN --mount=type=cache,target=/root/.cache/uv \
+    --mount=type=bind,source=requirements.txt,target=requirements.txt \
+    --mount=type=bind,source=release-requirements.txt,target=release-requirements.txt \
+    uv venv \
+    && uv pip install -r release-requirements.txt
 
 COPY . .
-RUN --mount=type=cache,target=/root/.cache/pip \
-    pip install .
 
-ENTRYPOINT [ "github-backup" ]
+RUN --mount=type=cache,target=/root/.cache/uv \
+    uv pip install .
+
+FROM python:3.12-alpine3.22
+ENV PYTHONUNBUFFERED=1
+
+RUN apk add --no-cache \
+    ca-certificates \
+    git \
+    git-lfs \
+    && addgroup -g 1000 appuser \
+    && adduser -D -u 1000 -G appuser appuser
+
+COPY --from=builder --chown=appuser:appuser /app /app
+
+WORKDIR /app
+
+USER appuser
+
+ENV PATH="/app/.venv/bin:$PATH"
+
+ENTRYPOINT ["github-backup"]
@@ -1,13 +0,0 @@
-# Important notice regarding filed issues
-
-This project already fills my needs, and as such I have no real reason to continue it's development. This project is otherwise provided as is, and no support is given.
-
-If pull requests implementing bug fixes or enhancements are pushed, I am happy to review and merge them (time permitting).
-
-If you wish to have a bug fixed, you have a few options:
-
-- Fix it yourself and file a pull request.
-- File a bug and hope someone else fixes it for you.
-- Pay me to fix it (my rate is $200 an hour, minimum 1 hour, contact me via my [github email address](https://github.com/josegonzalez) if you want to go this route).
-
-In all cases, feel free to file an issue, they may be of help to others in the future.
README.rst (32 lines changed)
@@ -9,8 +9,8 @@ The package can be used to backup an *entire* `Github <https://github.com/>`_ or
 Requirements
 ============
 
+- Python 3.10 or higher
 - GIT 1.9+
-- Python
 
 Installation
 ============
@@ -50,7 +50,7 @@ CLI Help output::
        [--keychain-name OSX_KEYCHAIN_ITEM_NAME]
        [--keychain-account OSX_KEYCHAIN_ITEM_ACCOUNT]
        [--releases] [--latest-releases NUMBER_OF_LATEST_RELEASES]
-       [--skip-prerelease] [--assets]
+       [--skip-prerelease] [--assets] [--attachments]
        [--exclude [REPOSITORY [REPOSITORY ...]]
        [--throttle-limit THROTTLE_LIMIT] [--throttle-pause THROTTLE_PAUSE]
        USER
@@ -133,6 +133,9 @@ CLI Help output::
   --skip-prerelease     skip prerelease and draft versions; only applies if including releases
   --assets              include assets alongside release information; only
                         applies if including releases
+  --attachments         download user-attachments from issues and pull requests
+                        to issues/attachments/{issue_number}/ and
+                        pulls/attachments/{pull_number}/ directories
   --exclude [REPOSITORY [REPOSITORY ...]]
                         names of repositories to exclude from backup.
   --throttle-limit THROTTLE_LIMIT
@@ -213,6 +216,29 @@ When you use the ``--lfs`` option, you will need to make sure you have Git LFS i
 Instructions on how to do this can be found on https://git-lfs.github.com.
 
 
+About Attachments
+-----------------
+
+When you use the ``--attachments`` option with ``--issues`` or ``--pulls``, the tool will download user-uploaded attachments (images, videos, documents, etc.) from issue and pull request descriptions and comments. In some circumstances attachments contain valuable data related to the topic, and without their backup important information or context might be lost inadvertently.
+
+Attachments are saved to ``issues/attachments/{issue_number}/`` and ``pulls/attachments/{pull_number}/`` directories, where ``{issue_number}`` is the GitHub issue number (e.g., issue #123 saves to ``issues/attachments/123/``). Each attachment directory contains:
+
+- The downloaded attachment files (named by their GitHub identifier with appropriate file extensions)
+- If multiple attachments have the same filename, conflicts are resolved with numeric suffixes (e.g., ``report.pdf``, ``report_1.pdf``, ``report_2.pdf``)
+- A ``manifest.json`` file documenting all downloads, including URLs, file metadata, and download status
+
+The tool automatically extracts file extensions from HTTP headers to ensure files can be more easily opened by your operating system.
+
+**Supported URL formats:**
+
+- Modern: ``github.com/user-attachments/{assets,files}/*``
+- Legacy: ``user-images.githubusercontent.com/*`` and ``private-user-images.githubusercontent.com/*``
+- Repo files: ``github.com/{owner}/{repo}/files/*`` (filtered to current repository)
+- Repo assets: ``github.com/{owner}/{repo}/assets/*`` (filtered to current repository)
+
+**Repository filtering** for repo files/assets handles renamed and transferred repositories gracefully. URLs are included if they either match the current repository name directly, or redirect to it (e.g., ``willmcgugan/rich`` redirects to ``Textualize/rich`` after transfer).
+
+
 Run in Docker container
 -----------------------
 
@@ -303,7 +329,7 @@ Quietly and incrementally backup useful Github user data (public and private rep
     export FINE_ACCESS_TOKEN=SOME-GITHUB-TOKEN
     GH_USER=YOUR-GITHUB-USER
 
-    github-backup -f $FINE_ACCESS_TOKEN --prefer-ssh -o ~/github-backup/ -l error -P -i --all-starred --starred --watched --followers --following --issues --issue-comments --issue-events --pulls --pull-comments --pull-commits --labels --milestones --repositories --wikis --releases --assets --pull-details --gists --starred-gists $GH_USER
+    github-backup -f $FINE_ACCESS_TOKEN --prefer-ssh -o ~/github-backup/ -l error -P -i --all-starred --starred --watched --followers --following --issues --issue-comments --issue-events --pulls --pull-comments --pull-commits --labels --milestones --repositories --wikis --releases --assets --attachments --pull-details --gists --starred-gists $GH_USER
 
 Debug an error/block or incomplete backup into a temporary directory. Omit "incremental" to fill a previous incomplete backup. ::
@@ -1 +1 @@
-__version__ = "0.50.3"
+__version__ = "0.52.0"
@@ -37,22 +37,42 @@ FNULL = open(os.devnull, "w")
 FILE_URI_PREFIX = "file://"
 logger = logging.getLogger(__name__)
 
+
+class RepositoryUnavailableError(Exception):
+    """Raised when a repository is unavailable due to legal reasons (e.g., DMCA takedown)."""
+
+    def __init__(self, message, dmca_url=None):
+        super().__init__(message)
+        self.dmca_url = dmca_url
+
+
+# Setup SSL context with fallback chain
 https_ctx = ssl.create_default_context()
-if not https_ctx.get_ca_certs():
-    import warnings
-
-    warnings.warn(
-        "\n\nYOUR DEFAULT CA CERTS ARE EMPTY.\n"
-        + "PLEASE POPULATE ANY OF:"
-        + "".join(
-            ["\n - " + x for x in ssl.get_default_verify_paths() if type(x) is str]
-        )
-        + "\n",
-        stacklevel=2,
-    )
-    import certifi
-
-    https_ctx = ssl.create_default_context(cafile=certifi.where())
+if https_ctx.get_ca_certs():
+    # Layer 1: Certificates pre-loaded from system (file-based)
+    pass
+else:
+    paths = ssl.get_default_verify_paths()
+    if (paths.cafile and os.path.exists(paths.cafile)) or (
+        paths.capath and os.path.exists(paths.capath)
+    ):
+        # Layer 2: Cert paths exist, will be lazy-loaded on first use (directory-based)
+        pass
+    else:
+        # Layer 3: Try certifi package as optional fallback
+        try:
+            import certifi
+
+            https_ctx = ssl.create_default_context(cafile=certifi.where())
+        except ImportError:
+            # All layers failed - no certificates available anywhere
+            sys.exit(
+                "\nERROR: No CA certificates found. Cannot connect to GitHub over SSL.\n\n"
+                "Solutions you can explore:\n"
+                " 1. pip install certifi\n"
+                " 2. Alpine: apk add ca-certificates\n"
+                " 3. Debian/Ubuntu: apt-get install ca-certificates\n\n"
+            )
 
 
 def logging_subprocess(
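The replacement logic layers three certificate sources. The same decision chain reads naturally as a standalone function; this is a sketch of the diff's logic, not project API (``build_https_context`` is an illustrative name, and the sketch raises instead of calling ``sys.exit``):

```python
import os
import ssl


def build_https_context():
    """Return an SSLContext, falling back from system certs to certifi."""
    ctx = ssl.create_default_context()
    if ctx.get_ca_certs():
        return ctx  # Layer 1: certs already loaded from a CA file
    paths = ssl.get_default_verify_paths()
    if (paths.cafile and os.path.exists(paths.cafile)) or (
        paths.capath and os.path.exists(paths.capath)
    ):
        return ctx  # Layer 2: cert paths exist and are loaded lazily on first use
    try:
        import certifi  # Layer 3: optional bundled CA store

        return ssl.create_default_context(cafile=certifi.where())
    except ImportError:
        raise RuntimeError("No CA certificates found")
```

The ordering matters: ``get_ca_certs()`` only reports certificates loaded from a CA *file*, so a capath-only system (common on Debian-family distros) would wrongly look empty without the second check.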
@@ -420,6 +440,12 @@ def parse_args(args=None):
         dest="include_assets",
         help="include assets alongside release information; only applies if including releases",
     )
+    parser.add_argument(
+        "--attachments",
+        action="store_true",
+        dest="include_attachments",
+        help="download user-attachments from issues and pull requests",
+    )
     parser.add_argument(
         "--throttle-limit",
         dest="throttle_limit",
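The new option is a plain ``store_true`` flag, defaulting to ``False`` when omitted. A self-contained sketch of the same registration and how it parses:

```python
import argparse

# Minimal parser mirroring the flag added in the hunk above
parser = argparse.ArgumentParser(prog="github-backup")
parser.add_argument(
    "--attachments",
    action="store_true",
    dest="include_attachments",
    help="download user-attachments from issues and pull requests",
)

with_flag = parser.parse_args(["--attachments"])
without_flag = parser.parse_args([])
```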
@@ -575,27 +601,39 @@ def retrieve_data_gen(args, template, query_args=None, single_request=False):
     auth = get_auth(args, encode=not args.as_app)
     query_args = get_query_args(query_args)
     per_page = 100
-    page = 0
+    next_url = None
 
     while True:
         if single_request:
-            request_page, request_per_page = None, None
+            request_per_page = None
         else:
-            page = page + 1
-            request_page, request_per_page = page, per_page
+            request_per_page = per_page
 
         request = _construct_request(
             request_per_page,
-            request_page,
             query_args,
-            template,
+            next_url or template,
             auth,
             as_app=args.as_app,
             fine=True if args.token_fine is not None else False,
         )  # noqa
-        r, errors = _get_response(request, auth, template)
+        r, errors = _get_response(request, auth, next_url or template)
 
         status_code = int(r.getcode())
 
+        # Handle DMCA takedown (HTTP 451) - raise exception to skip entire repository
+        if status_code == 451:
+            dmca_url = None
+            try:
+                response_data = json.loads(r.read().decode("utf-8"))
+                dmca_url = response_data.get("block", {}).get("html_url")
+            except Exception:
+                pass
+            raise RepositoryUnavailableError(
+                "Repository unavailable due to legal reasons (HTTP 451)",
+                dmca_url=dmca_url
+            )
+
         # Check if we got correct data
         try:
             response = json.loads(r.read().decode("utf-8"))
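The 451 branch pulls the DMCA notice URL out of the error body before raising. That extraction on its own, fed a hypothetical sample payload shaped like the nested ``block.html_url`` structure the code reads:

```python
import json


def dmca_url_from_body(body):
    """Return block.html_url from a 451 response body, or None on bad input."""
    try:
        return json.loads(body).get("block", {}).get("html_url")
    except Exception:
        return None


# Hypothetical 451 body with the nested "block" object the code expects
sample = json.dumps(
    {
        "message": "Repository access blocked",
        "block": {"reason": "dmca", "html_url": "https://github.com/github/dmca"},
    }
)
```

Because the helper swallows parse errors, a malformed or empty body simply yields ``None`` and the exception is still raised with ``dmca_url=None``.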
@@ -627,15 +665,14 @@ def retrieve_data_gen(args, template, query_args=None, single_request=False):
             retries += 1
             time.sleep(5)
             request = _construct_request(
-                per_page,
-                page,
+                request_per_page,
                 query_args,
-                template,
+                next_url or template,
                 auth,
                 as_app=args.as_app,
                 fine=True if args.token_fine is not None else False,
             )  # noqa
-            r, errors = _get_response(request, auth, template)
+            r, errors = _get_response(request, auth, next_url or template)
 
             status_code = int(r.getcode())
             try:
@@ -665,7 +702,16 @@ def retrieve_data_gen(args, template, query_args=None, single_request=False):
         if type(response) is list:
             for resp in response:
                 yield resp
-            if len(response) < per_page:
+            # Parse Link header for next page URL (cursor-based pagination)
+            link_header = r.headers.get("Link", "")
+            next_url = None
+            if link_header:
+                # Parse Link header: <https://api.github.com/...?per_page=100&after=cursor>; rel="next"
+                for link in link_header.split(","):
+                    if 'rel="next"' in link:
+                        next_url = link[link.find("<") + 1:link.find(">")]
+                        break
+            if not next_url:
                 break
         elif type(response) is dict and single_request:
             yield response
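The slice-based ``rel="next"`` extraction above can be checked against a representative header; the sample header value below is illustrative, but follows the ``<url>; rel="..."`` shape GitHub sends:

```python
def next_page_url(link_header):
    """Return the rel="next" target from an RFC 8288 Link header, or None."""
    for link in link_header.split(","):
        if 'rel="next"' in link:
            return link[link.find("<") + 1:link.find(">")]
    return None


header = (
    '<https://api.github.com/repositories/1/issues?per_page=100&page=2>; rel="next", '
    '<https://api.github.com/repositories/1/issues?per_page=100&page=10>; rel="last"'
)
```

Following the server-provided URL instead of counting pages is what lets the generator work with GitHub's cursor-based (``after=``) pagination, where page numbers do not exist.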
@@ -718,22 +764,27 @@ def _get_response(request, auth, template):
 
 
 def _construct_request(
-    per_page, page, query_args, template, auth, as_app=None, fine=False
+    per_page, query_args, template, auth, as_app=None, fine=False
 ):
-    all_query_args = {}
-    if per_page:
-        all_query_args["per_page"] = per_page
-    if page:
-        all_query_args["page"] = page
-    if query_args:
-        all_query_args.update(query_args)
-
-    request_url = template
-    if all_query_args:
-        querystring = urlencode(all_query_args)
-        request_url = template + "?" + querystring
+    # If template is already a full URL with query params (from Link header), use it directly
+    if "?" in template and template.startswith("http"):
+        request_url = template
+        # Extract query string for logging
+        querystring = template.split("?", 1)[1]
     else:
-        querystring = ""
+        # Build URL with query parameters
+        all_query_args = {}
+        if per_page:
+            all_query_args["per_page"] = per_page
+        if query_args:
+            all_query_args.update(query_args)
+
+        request_url = template
+        if all_query_args:
+            querystring = urlencode(all_query_args)
+            request_url = template + "?" + querystring
+        else:
+            querystring = ""
 
     request = Request(request_url)
     if auth is not None:
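The rewritten function now has two URL paths: pass a Link-header URL through untouched, or build a query string for a bare endpoint. Condensed into a sketch (``build_request_url`` is an illustrative name, not the project's):

```python
from urllib.parse import urlencode


def build_request_url(template, per_page=None, query_args=None):
    """Pass full URLs through; otherwise append per_page and extra query args."""
    if "?" in template and template.startswith("http"):
        # Already a complete URL from a Link header - do not re-encode it
        return template
    params = {}
    if per_page:
        params["per_page"] = per_page
    if query_args:
        params.update(query_args)
    return template + "?" + urlencode(params) if params else template
```

The pass-through branch is the important one: re-encoding a cursor URL would drop or mangle the opaque ``after=`` token the API handed back.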
@@ -749,7 +800,7 @@ def _construct_request(
             "Accept", "application/vnd.github.machine-man-preview+json"
         )
 
-    log_url = template
+    log_url = template if "?" not in template else template.split("?")[0]
     if querystring:
         log_url += "?" + querystring
     logger.info("Requesting {}".format(log_url))
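The adjusted ``log_url`` strips a query that is already embedded in the template, so it is not printed twice when ``querystring`` is re-appended. Reduced to a helper for illustration (the name is made up):

```python
def loggable_url(template, querystring=""):
    """Drop any query embedded in template, then append the known querystring."""
    url = template if "?" not in template else template.split("?")[0]
    if querystring:
        url += "?" + querystring
    return url
```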
@@ -814,7 +865,9 @@ class S3HTTPRedirectHandler(HTTPRedirectHandler):
         request = super(S3HTTPRedirectHandler, self).redirect_request(
            req, fp, code, msg, headers, newurl
         )
-        del request.headers["Authorization"]
+        # Only delete Authorization header if it exists (attachments may not have it)
+        if "Authorization" in request.headers:
+            del request.headers["Authorization"]
         return request
 
 
@@ -824,8 +877,7 @@ def download_file(url, path, auth, as_app=False, fine=False):
         return
 
     request = _construct_request(
-        per_page=100,
-        page=1,
+        per_page=None,
         query_args={},
         template=url,
         auth=auth,
@@ -867,6 +919,585 @@ def download_file(url, path, auth, as_app=False, fine=False):
|
|||||||
)
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def download_attachment_file(url, path, auth, as_app=False, fine=False):
|
||||||
|
"""Download attachment file directly (not via GitHub API).
|
||||||
|
|
||||||
|
Similar to download_file() but for direct file URLs, not API endpoints.
|
||||||
|
Attachment URLs (user-images, user-attachments) are direct downloads,
|
||||||
|
not API endpoints, so we skip _construct_request() which adds API params.
|
||||||
|
|
||||||
|
URL Format Support & Authentication Requirements:
|
||||||
|
|
||||||
|
| URL Format | Auth Required | Notes |
|
||||||
|
|----------------------------------------------|---------------|--------------------------|
|
||||||
|
| github.com/user-attachments/assets/* | Private only | Modern format (2024+) |
|
||||||
|
| github.com/user-attachments/files/* | Private only | Modern format (2024+) |
|
||||||
|
| user-images.githubusercontent.com/* | No (public) | Legacy CDN, all eras |
|
||||||
|
| private-user-images.githubusercontent.com/* | JWT in URL | Legacy private (5min) |
|
||||||
|
| github.com/{owner}/{repo}/files/* | Repo filter | Old repo files |
|
||||||
|
|
||||||
|
- Modern user-attachments: Requires GitHub token auth for private repos
|
||||||
|
- Legacy public CDN: No auth needed/accepted (returns 400 with auth header)
|
||||||
|
- Legacy private CDN: Uses JWT token embedded in URL, no GitHub token needed
|
||||||
|
- Repo files: Filtered to current repository only during extraction
|
||||||
|
|
||||||
|
Returns dict with metadata:
|
||||||
|
- success: bool
|
||||||
|
- http_status: int (200, 404, etc.)
|
||||||
|
- content_type: str or None
|
||||||
|
- original_filename: str or None (from Content-Disposition)
|
||||||
|
- size_bytes: int or None
|
||||||
|
- error: str or None
|
||||||
|
"""
|
||||||
|
import re
|
||||||
|
from datetime import datetime, timezone
|
||||||
|
|
||||||
|
metadata = {
|
||||||
|
"url": url,
|
||||||
|
"success": False,
|
||||||
|
"http_status": None,
|
||||||
|
"content_type": None,
|
||||||
|
"original_filename": None,
|
||||||
|
"size_bytes": None,
|
||||||
|
"downloaded_at": datetime.now(timezone.utc).isoformat(),
|
||||||
|
"error": None,
|
||||||
|
}
|
||||||
|
|
||||||
|
# Create simple request (no API query params)
|
||||||
|
request = Request(url)
|
||||||
|
request.add_header("Accept", "application/octet-stream")
|
||||||
|
|
||||||
|
# Add authentication header only for modern github.com/user-attachments URLs
|
||||||
|
# Legacy CDN URLs (user-images.githubusercontent.com) are public and don't need/accept auth
|
||||||
|
# Private CDN URLs (private-user-images) use JWT tokens embedded in the URL
|
||||||
|
if auth is not None and "github.com/user-attachments/" in url:
|
||||||
|
if not as_app:
|
||||||
|
if fine:
|
||||||
|
# Fine-grained token: plain token with "token " prefix
|
||||||
|
request.add_header("Authorization", "token " + auth)
|
||||||
|
else:
|
||||||
|
# Classic token: base64-encoded with "Basic " prefix
|
||||||
|
request.add_header("Authorization", "Basic ".encode("ascii") + auth)
|
||||||
|
else:
|
||||||
|
# App authentication
|
||||||
|
auth = auth.encode("ascii")
|
||||||
|
request.add_header("Authorization", "token ".encode("ascii") + auth)
|
||||||
|
|
||||||
|
# Reuse S3HTTPRedirectHandler from download_file()
|
||||||
|
opener = build_opener(S3HTTPRedirectHandler)
|
||||||
|
|
||||||
|
temp_path = path + ".temp"
|
||||||
|
|
||||||
|
try:
|
||||||
|
response = opener.open(request)
|
||||||
|
metadata["http_status"] = response.getcode()
|
||||||
|
|
||||||
|
# Extract Content-Type
|
||||||
|
content_type = response.headers.get("Content-Type", "").split(";")[0].strip()
|
||||||
|
if content_type:
|
||||||
|
metadata["content_type"] = content_type
|
||||||
|
|
||||||
|
# Extract original filename from Content-Disposition header
|
||||||
|
# Format: attachment; filename=example.mov or attachment;filename="example.mov"
|
||||||
|
content_disposition = response.headers.get("Content-Disposition", "")
|
||||||
|
if content_disposition:
|
||||||
|
# Match: filename=something or filename="something" or filename*=UTF-8''something
|
||||||
|
match = re.search(r'filename\*?=["\']?([^"\';\r\n]+)', content_disposition)
|
||||||
|
if match:
|
||||||
|
original_filename = match.group(1).strip()
|
||||||
|
# Handle RFC 5987 encoding: filename*=UTF-8''example.mov
|
||||||
|
if "UTF-8''" in original_filename:
|
||||||
|
original_filename = original_filename.split("UTF-8''")[1]
|
||||||
|
metadata["original_filename"] = original_filename
|
||||||
|
|
||||||
|
# Fallback: Extract filename from final URL after redirects
|
||||||
|
# This handles user-attachments/assets URLs which redirect to S3 with filename.ext
|
||||||
|
if not metadata["original_filename"]:
|
||||||
|
from urllib.parse import urlparse, unquote
|
||||||
|
|
||||||
|
final_url = response.geturl()
|
||||||
|
parsed = urlparse(final_url)
|
||||||
|
# Get filename from path (last component before query string)
|
||||||
|
path_parts = parsed.path.split("/")
|
||||||
|
if path_parts:
|
||||||
|
# URL might be encoded, decode it
|
||||||
|
filename_from_url = unquote(path_parts[-1])
|
||||||
|
# Only use if it has an extension
|
||||||
|
if "." in filename_from_url:
|
||||||
|
metadata["original_filename"] = filename_from_url
|
||||||
|
|
||||||
|
# Download file to temporary location
|
||||||
|
chunk_size = 16 * 1024
|
||||||
|
bytes_downloaded = 0
|
||||||
|
with open(temp_path, "wb") as f:
|
||||||
|
while True:
|
||||||
|
chunk = response.read(chunk_size)
|
||||||
|
if not chunk:
|
||||||
|
break
|
||||||
|
f.write(chunk)
|
||||||
|
bytes_downloaded += len(chunk)
|
||||||
|
|
||||||
|
# Atomic rename to final location
|
||||||
|
os.rename(temp_path, path)
|
||||||
|
|
||||||
|
metadata["size_bytes"] = bytes_downloaded
|
||||||
|
metadata["success"] = True
|
||||||
|
|
||||||
|
except HTTPError as exc:
|
||||||
|
metadata["http_status"] = exc.code
|
||||||
|
metadata["error"] = str(exc.reason)
|
||||||
|
logger.warning(
|
||||||
|
"Skipping download of attachment {0} due to HTTPError: {1}".format(
|
||||||
|
url, exc.reason
|
||||||
|
)
|
||||||
|
)
|
||||||
|
except URLError as e:
|
||||||
|
metadata["error"] = str(e.reason)
|
||||||
|
logger.warning(
|
||||||
|
"Skipping download of attachment {0} due to URLError: {1}".format(
|
||||||
|
url, e.reason
|
||||||
|
)
|
||||||
|
)
|
||||||
|
except socket.error as e:
|
||||||
|
metadata["error"] = str(e.strerror) if hasattr(e, "strerror") else str(e)
|
||||||
|
logger.warning(
|
||||||
|
"Skipping download of attachment {0} due to socket error: {1}".format(
|
||||||
|
url, e.strerror if hasattr(e, "strerror") else str(e)
|
||||||
|
)
|
||||||
|
)
|
||||||
|
except Exception as e:
|
||||||
|
metadata["error"] = str(e)
|
||||||
|
logger.warning(
|
||||||
|
"Skipping download of attachment {0} due to error: {1}".format(url, str(e))
|
||||||
|
)
|
||||||
|
# Clean up temp file if it was partially created
|
||||||
|
if os.path.exists(temp_path):
|
||||||
|
try:
|
||||||
|
os.remove(temp_path)
|
||||||
|
except Exception:
|
||||||
|
pass
|
||||||
|
|
||||||
|
return metadata
|
||||||
|
|
||||||
|
|
||||||
|
def extract_attachment_urls(item_data, issue_number=None, repository_full_name=None):
    """Extract GitHub-hosted attachment URLs from issue/PR body and comments.

    What qualifies as an attachment?
    There is no "attachment" concept in the GitHub API - it's a user behavior pattern
    we've identified through analysis of real-world repositories. We define attachments as:

    - User-uploaded files hosted on GitHub's CDN domains
    - Found outside of code blocks (not examples/documentation)
    - Matching known GitHub attachment URL patterns

    This intentionally captures bare URLs pasted by users, not just markdown/HTML syntax.
    Some false positives (example URLs in documentation) may occur - these fail gracefully
    with HTTP 404 and are logged in the manifest.

    Supported URL formats:
    - Modern: github.com/user-attachments/{assets,files}/*
    - Legacy: user-images.githubusercontent.com/* (including private-user-images)
    - Repo files: github.com/{owner}/{repo}/files/* (filtered to current repo)
    - Repo assets: github.com/{owner}/{repo}/assets/* (filtered to current repo)

    Repository filtering (repo files/assets only):
    - Direct match: URL is for current repository → included
    - Redirect match: URL redirects to current repository → included (handles renames/transfers)
    - Different repo: URL is for different repository → excluded

    Code block filtering:
    - Removes fenced code blocks (```) and inline code (`) before extraction
    - Prevents extracting URLs from code examples and documentation snippets

    Args:
        item_data: Issue or PR data dict
        issue_number: Issue/PR number for logging
        repository_full_name: Full repository name (owner/repo) for filtering repo-scoped URLs
    """
    import re

    urls = []

    # Define all GitHub attachment patterns
    # Stop at markdown punctuation: whitespace, ), `, ", >, <
    # Trailing sentence punctuation (. ! ? , ; : ' ") is stripped in post-processing
    patterns = [
        r'https://github\.com/user-attachments/(?:assets|files)/[^\s\)`"<>]+',  # Modern
        r'https://(?:private-)?user-images\.githubusercontent\.com/[^\s\)`"<>]+',  # Legacy CDN
    ]

    # Add repo-scoped patterns (will be filtered by repository later)
    # These patterns match ANY repo, then we filter to current repo with redirect checking
    repo_files_pattern = r'https://github\.com/[^/]+/[^/]+/files/\d+/[^\s\)`"<>]+'
    repo_assets_pattern = r'https://github\.com/[^/]+/[^/]+/assets/\d+/[^\s\)`"<>]+'
    patterns.append(repo_files_pattern)
    patterns.append(repo_assets_pattern)

    def clean_url(url):
        """Remove trailing sentence and markdown punctuation that's not part of the URL."""
        return url.rstrip(".!?,;:'\")")

    def remove_code_blocks(text):
        """Remove markdown code blocks (fenced and inline) from text.

        This prevents extracting URLs from code examples like:
        - Fenced code blocks: ```code```
        - Inline code: `code`
        """
        # Remove fenced code blocks first (```...```)
        # DOTALL flag makes . match newlines
        text = re.sub(r"```.*?```", "", text, flags=re.DOTALL)

        # Remove inline code (`...`)
        # Non-greedy match between backticks
        text = re.sub(r"`[^`]*`", "", text)

        return text

    def is_repo_scoped_url(url):
        """Check if URL is a repo-scoped attachment (files or assets)."""
        return bool(
            re.match(r"https://github\.com/[^/]+/[^/]+/(?:files|assets)/\d+/", url)
        )

    def check_redirect_to_current_repo(url, current_repo):
        """Check if URL redirects to current repository.

        Returns True if:
        - URL is already for current repo
        - URL redirects (301/302) to current repo (handles renames/transfers)

        Returns False otherwise (URL is for a different repo).
        """
        # Extract owner/repo from URL
        match = re.match(r"https://github\.com/([^/]+)/([^/]+)/", url)
        if not match:
            return False

        url_owner, url_repo = match.groups()
        url_repo_full = f"{url_owner}/{url_repo}"

        # Direct match - no need to check redirect
        if url_repo_full.lower() == current_repo.lower():
            return True

        # Different repo - check if it redirects to current repo
        # This handles repository transfers and renames
        try:
            import urllib.request
            import urllib.error

            # Make HEAD request with redirect following disabled
            # We need to manually handle redirects to see the Location header
            request = urllib.request.Request(url, method="HEAD")
            request.add_header("User-Agent", "python-github-backup")

            # Create opener that does NOT follow redirects
            class NoRedirectHandler(urllib.request.HTTPRedirectHandler):
                def redirect_request(self, req, fp, code, msg, headers, newurl):
                    return None  # Don't follow redirects

            opener = urllib.request.build_opener(NoRedirectHandler)

            try:
                _ = opener.open(request, timeout=10)
                # Got 200 - URL works as-is but for different repo
                return False
            except urllib.error.HTTPError as e:
                # Check if it's a redirect (301, 302, 307, 308)
                if e.code in (301, 302, 307, 308):
                    location = e.headers.get("Location", "")
                    # Check if redirect points to current repo
                    if location:
                        redirect_match = re.match(
                            r"https://github\.com/([^/]+)/([^/]+)/", location
                        )
                        if redirect_match:
                            redirect_owner, redirect_repo = redirect_match.groups()
                            redirect_repo_full = f"{redirect_owner}/{redirect_repo}"
                            return redirect_repo_full.lower() == current_repo.lower()
                return False
        except Exception:
            # On any error (timeout, network issue, etc.), be conservative
            # and exclude the URL to avoid downloading from wrong repos
            return False

    # Extract from body
    body = item_data.get("body") or ""
    # Remove code blocks before searching for URLs
    body_cleaned = remove_code_blocks(body)
    for pattern in patterns:
        found_urls = re.findall(pattern, body_cleaned)
        urls.extend([clean_url(url) for url in found_urls])

    # Extract from issue comments
    if "comment_data" in item_data:
        for comment in item_data["comment_data"]:
            comment_body = comment.get("body") or ""
            # Remove code blocks before searching for URLs
            comment_cleaned = remove_code_blocks(comment_body)
            for pattern in patterns:
                found_urls = re.findall(pattern, comment_cleaned)
                urls.extend([clean_url(url) for url in found_urls])

    # Extract from PR regular comments
    if "comment_regular_data" in item_data:
        for comment in item_data["comment_regular_data"]:
            comment_body = comment.get("body") or ""
            # Remove code blocks before searching for URLs
            comment_cleaned = remove_code_blocks(comment_body)
            for pattern in patterns:
                found_urls = re.findall(pattern, comment_cleaned)
                urls.extend([clean_url(url) for url in found_urls])

    regex_urls = list(set(urls))  # dedupe

    # Filter repo-scoped URLs to current repository only
    # This handles repository transfers/renames via redirect checking
    if repository_full_name:
        filtered_urls = []
        for url in regex_urls:
            if is_repo_scoped_url(url):
                # Check if URL belongs to current repo (or redirects to it)
                if check_redirect_to_current_repo(url, repository_full_name):
                    filtered_urls.append(url)
                # else: skip URLs from other repositories
            else:
                # Non-repo-scoped URLs (user-attachments, CDN) - always include
                filtered_urls.append(url)
        regex_urls = filtered_urls

    return regex_urls

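The extraction pipeline above (strip code spans, scan with the URL patterns, trim trailing punctuation) can be sketched standalone. The pattern and helpers below restate the ones defined in the function so the snippet runs on its own; only one of the four patterns is shown.

```python
import re

# One of the patterns from extract_attachment_urls, restated here
PATTERN = r'https://github\.com/user-attachments/(?:assets|files)/[^\s\)`"<>]+'


def strip_code(text):
    # Drop fenced blocks first, then inline code, as in remove_code_blocks
    text = re.sub(r"```.*?```", "", text, flags=re.DOTALL)
    return re.sub(r"`[^`]*`", "", text)


def extract(body):
    cleaned = strip_code(body)
    return [u.rstrip(".!?,;:'\")") for u in re.findall(PATTERN, cleaned)]


body = (
    "See https://github.com/user-attachments/files/222/report.pdf.\n"
    "```\ncurl https://github.com/user-attachments/assets/example999\n```\n"
)
print(extract(body))
# The fenced example URL is ignored; the trailing "." is stripped.
```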
def get_attachment_filename(url):
    """Get filename from attachment URL, handling all GitHub formats.

    Formats:
    - github.com/user-attachments/assets/{uuid} → uuid (add extension later)
    - github.com/user-attachments/files/{id}/{filename} → filename
    - github.com/{owner}/{repo}/files/{id}/{filename} → filename
    - user-images.githubusercontent.com/{user}/{hash}.{ext} → hash.ext
    - private-user-images.githubusercontent.com/...?jwt=... → extract from path
    """
    from urllib.parse import urlparse

    parsed = urlparse(url)
    path_parts = parsed.path.split("/")

    # Modern: /user-attachments/files/{id}/{filename}
    if "user-attachments/files" in parsed.path:
        return path_parts[-1]

    # Modern: /user-attachments/assets/{uuid}
    elif "user-attachments/assets" in parsed.path:
        return path_parts[-1]  # extension added later via detect_and_add_extension

    # Repo files: /{owner}/{repo}/files/{id}/{filename}
    elif "/files/" in parsed.path and len(path_parts) >= 2:
        return path_parts[-1]

    # Legacy: user-images.githubusercontent.com/{user}/{hash-with-ext}
    elif "githubusercontent.com" in parsed.netloc:
        return path_parts[-1]  # Already has extension usually

    # Fallback: use last path component
    return path_parts[-1] if path_parts[-1] else "unknown_attachment"

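Since every supported format resolves to the last path component, the rules in the docstring above can be exercised with `urlparse` alone. This is a simplified stand-in, not the project's function: it collapses the per-format branches into one expression and shows that the query string (`?jwt=...`) never leaks into the name.

```python
from urllib.parse import urlparse


def filename_for(url):
    # urlparse separates the query string, so ?jwt=... is dropped
    last = urlparse(url).path.split("/")[-1]
    return last or "unknown_attachment"


print(filename_for("https://github.com/user-attachments/files/12345/report.pdf"))
# report.pdf
print(filename_for(
    "https://private-user-images.githubusercontent.com/98765/secret.png?jwt=token123"
))
# secret.png
```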
def resolve_filename_collision(filepath):
    """Resolve filename collisions using counter suffix pattern.

    If filepath exists, returns a new filepath with counter suffix.
    Pattern: report.pdf → report_1.pdf → report_2.pdf

    Also protects against manifest.json collisions by treating it as reserved.

    Args:
        filepath: Full path to file that might exist

    Returns:
        filepath that doesn't collide (may be same as input if no collision)
    """
    directory = os.path.dirname(filepath)
    filename = os.path.basename(filepath)

    # Protect manifest.json - it's a reserved filename
    if filename == "manifest.json":
        name, ext = os.path.splitext(filename)
        counter = 1
        while True:
            new_filename = f"{name}_{counter}{ext}"
            new_filepath = os.path.join(directory, new_filename)
            if not os.path.exists(new_filepath):
                return new_filepath
            counter += 1

    if not os.path.exists(filepath):
        return filepath

    name, ext = os.path.splitext(filename)

    counter = 1
    while True:
        new_filename = f"{name}_{counter}{ext}"
        new_filepath = os.path.join(directory, new_filename)
        if not os.path.exists(new_filepath):
            return new_filepath
        counter += 1

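The counter-suffix scheme above is easiest to see against real files. A condensed sketch (hypothetical `next_free` helper, without the `manifest.json` special case):

```python
import os
import tempfile


def next_free(path):
    # Counter-suffix scheme: report.pdf -> report_1.pdf -> report_2.pdf
    if not os.path.exists(path):
        return path
    directory = os.path.dirname(path)
    name, ext = os.path.splitext(os.path.basename(path))
    counter = 1
    while True:
        candidate = os.path.join(directory, f"{name}_{counter}{ext}")
        if not os.path.exists(candidate):
            return candidate
        counter += 1


d = tempfile.mkdtemp()
for existing in ("report.pdf", "report_1.pdf"):
    open(os.path.join(d, existing), "w").close()
print(os.path.basename(next_free(os.path.join(d, "report.pdf"))))
# report_2.pdf
```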
def download_attachments(
    args, item_cwd, item_data, number, repository, item_type="issue"
):
    """Download user-attachments from issue/PR body and comments with manifest.

    Args:
        args: Command line arguments
        item_cwd: Working directory (issue_cwd or pulls_cwd)
        item_data: Issue or PR data dict
        number: Issue or PR number
        repository: Repository dict
        item_type: "issue" or "pull" for logging/manifest
    """
    import json
    from datetime import datetime, timezone

    item_type_display = "issue" if item_type == "issue" else "pull request"

    urls = extract_attachment_urls(
        item_data, issue_number=number, repository_full_name=repository["full_name"]
    )
    if not urls:
        return

    attachments_dir = os.path.join(item_cwd, "attachments", str(number))
    manifest_path = os.path.join(attachments_dir, "manifest.json")

    # Load existing manifest to prevent duplicate downloads
    existing_urls = set()
    existing_metadata = []
    if os.path.exists(manifest_path):
        try:
            with open(manifest_path, "r") as f:
                existing_manifest = json.load(f)
            all_metadata = existing_manifest.get("attachments", [])
            # Only skip URLs that were successfully downloaded OR failed with permanent errors
            # Retry transient failures (5xx, timeouts, network errors)
            for item in all_metadata:
                if item.get("success"):
                    existing_urls.add(item["url"])
                else:
                    # Check if this is a permanent failure (don't retry) or transient (retry)
                    http_status = item.get("http_status")
                    if http_status in [404, 410, 451]:
                        # Permanent failures - don't retry
                        existing_urls.add(item["url"])
                    # Transient failures (5xx, auth errors, timeouts) will be retried
            existing_metadata = all_metadata
        except (json.JSONDecodeError, IOError):
            # If manifest is corrupted, re-download everything
            logger.warning(
                "Corrupted manifest for {0} #{1}, will re-download".format(
                    item_type_display, number
                )
            )
            existing_urls = set()
            existing_metadata = []

    # Filter to only new URLs
    new_urls = [url for url in urls if url not in existing_urls]

    if not new_urls and existing_urls:
        logger.debug(
            "Skipping attachments for {0} #{1} (all {2} already downloaded)".format(
                item_type_display, number, len(urls)
            )
        )
        return

    if new_urls:
        logger.info(
            "Downloading {0} new attachment(s) for {1} #{2}".format(
                len(new_urls), item_type_display, number
            )
        )

    mkdir_p(item_cwd, attachments_dir)

    # Collect metadata for manifest (start with existing)
    attachment_metadata_list = existing_metadata[:]

    for url in new_urls:
        filename = get_attachment_filename(url)
        filepath = os.path.join(attachments_dir, filename)

        # Download and get metadata
        metadata = download_attachment_file(
            url,
            filepath,
            get_auth(args, encode=not args.as_app),
            as_app=args.as_app,
            fine=args.token_fine is not None,
        )

        # If download succeeded but we got an extension from Content-Disposition,
        # we may need to rename the file to add the extension
        if metadata["success"] and metadata.get("original_filename"):
            original_ext = os.path.splitext(metadata["original_filename"])[1]
            current_ext = os.path.splitext(filepath)[1]

            # Add extension if not present
            if original_ext and current_ext != original_ext:
                final_filepath = filepath + original_ext
                # Check for collision again with new extension
                final_filepath = resolve_filename_collision(final_filepath)
                logger.debug(
                    "Adding extension {0} to {1}".format(original_ext, filepath)
                )

                # Rename to add extension (already atomic from download)
                try:
                    os.rename(filepath, final_filepath)
                    metadata["saved_as"] = os.path.basename(final_filepath)
                except Exception as e:
                    logger.warning(
                        "Could not add extension to {0}: {1}".format(filepath, str(e))
                    )
                    metadata["saved_as"] = os.path.basename(filepath)
            else:
                metadata["saved_as"] = os.path.basename(filepath)
        elif metadata["success"]:
            metadata["saved_as"] = os.path.basename(filepath)
        else:
            metadata["saved_as"] = None

        attachment_metadata_list.append(metadata)

    # Write manifest
    if attachment_metadata_list:
        manifest = {
            "issue_number": number,
            "issue_type": item_type,
            "repository": f"{args.user}/{args.repository}"
            if hasattr(args, "repository") and args.repository
            else args.user,
            "manifest_updated_at": datetime.now(timezone.utc).isoformat(),
            "attachments": attachment_metadata_list,
        }

        manifest_path = os.path.join(attachments_dir, "manifest.json")
        with open(manifest_path + ".temp", "w") as f:
            json.dump(manifest, f, indent=2)
        os.rename(manifest_path + ".temp", manifest_path)  # Atomic write
        logger.debug(
            "Wrote manifest for {0} #{1}: {2} attachments".format(
                item_type_display, number, len(attachment_metadata_list)
            )
        )

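The retry policy above, on re-reading a manifest, skips entries that succeeded or failed with a permanent HTTP status, and retries everything else. Condensed into a standalone sketch (hypothetical helper name, fabricated example entries):

```python
# Permanent failures are never retried; anything else is attempted again.
PERMANENT = {404, 410, 451}


def urls_to_skip(manifest_entries):
    skip = set()
    for item in manifest_entries:
        if item.get("success") or item.get("http_status") in PERMANENT:
            skip.add(item["url"])
    return skip


entries = [
    {"url": "https://example.invalid/a", "success": True},
    {"url": "https://example.invalid/b", "success": False, "http_status": 404},
    {"url": "https://example.invalid/c", "success": False, "http_status": 503},
]
print(sorted(urls_to_skip(entries)))
# ['https://example.invalid/a', 'https://example.invalid/b']
```

The 503 entry stays out of the skip set, so that URL is fetched again on the next run.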
 def get_authenticated_user(args):
     template = "https://{0}/user".format(get_github_api_host(args))
     data = retrieve_data(args, template, single_request=True)

@@ -1059,40 +1690,47 @@ def backup_repositories(args, output_directory, repositories):
             continue  # don't try to back up anything else for a gist; it doesn't exist
 
-        download_wiki = args.include_wiki or args.include_everything
-        if repository["has_wiki"] and download_wiki:
-            fetch_repository(
-                repository["name"],
-                repo_url.replace(".git", ".wiki.git"),
-                os.path.join(repo_cwd, "wiki"),
-                skip_existing=args.skip_existing,
-                bare_clone=args.bare_clone,
-                lfs_clone=args.lfs_clone,
-                no_prune=args.no_prune,
-            )
-        if args.include_issues or args.include_everything:
-            backup_issues(args, repo_cwd, repository, repos_template)
-        if args.include_pulls or args.include_everything:
-            backup_pulls(args, repo_cwd, repository, repos_template)
-        if args.include_milestones or args.include_everything:
-            backup_milestones(args, repo_cwd, repository, repos_template)
-        if args.include_labels or args.include_everything:
-            backup_labels(args, repo_cwd, repository, repos_template)
-        if args.include_hooks or args.include_everything:
-            backup_hooks(args, repo_cwd, repository, repos_template)
-        if args.include_releases or args.include_everything:
-            backup_releases(
-                args,
-                repo_cwd,
-                repository,
-                repos_template,
-                include_assets=args.include_assets or args.include_everything,
-            )
+        try:
+            download_wiki = args.include_wiki or args.include_everything
+            if repository["has_wiki"] and download_wiki:
+                fetch_repository(
+                    repository["name"],
+                    repo_url.replace(".git", ".wiki.git"),
+                    os.path.join(repo_cwd, "wiki"),
+                    skip_existing=args.skip_existing,
+                    bare_clone=args.bare_clone,
+                    lfs_clone=args.lfs_clone,
+                    no_prune=args.no_prune,
+                )
+            if args.include_issues or args.include_everything:
+                backup_issues(args, repo_cwd, repository, repos_template)
+            if args.include_pulls or args.include_everything:
+                backup_pulls(args, repo_cwd, repository, repos_template)
+            if args.include_milestones or args.include_everything:
+                backup_milestones(args, repo_cwd, repository, repos_template)
+            if args.include_labels or args.include_everything:
+                backup_labels(args, repo_cwd, repository, repos_template)
+            if args.include_hooks or args.include_everything:
+                backup_hooks(args, repo_cwd, repository, repos_template)
+            if args.include_releases or args.include_everything:
+                backup_releases(
+                    args,
+                    repo_cwd,
+                    repository,
+                    repos_template,
+                    include_assets=args.include_assets or args.include_everything,
+                )
+        except RepositoryUnavailableError as e:
+            logger.warning(f"Repository {repository['full_name']} is unavailable (HTTP 451)")
+            if e.dmca_url:
+                logger.warning(f"DMCA notice: {e.dmca_url}")
+            logger.info(f"Skipping remaining resources for {repository['full_name']}")
+            continue
 
         if args.incremental:
             if last_update == "0000-00-00T00:00:00Z":

@@ -1157,6 +1795,10 @@ def backup_issues(args, repo_cwd, repository, repos_template):
         if args.include_issue_events or args.include_everything:
             template = events_template.format(number)
             issues[number]["event_data"] = retrieve_data(args, template)
+        if args.include_attachments:
+            download_attachments(
+                args, issue_cwd, issues[number], number, repository, item_type="issue"
+            )
 
     with codecs.open(issue_file + ".temp", "w", encoding="utf-8") as f:
         json_dump(issue, f)

@@ -1228,6 +1870,10 @@ def backup_pulls(args, repo_cwd, repository, repos_template):
         if args.include_pull_commits or args.include_everything:
             template = commits_template.format(number)
             pulls[number]["commit_data"] = retrieve_data(args, template)
+        if args.include_attachments:
+            download_attachments(
+                args, pulls_cwd, pulls[number], number, repository, item_type="pull"
+            )
 
     with codecs.open(pull_file + ".temp", "w", encoding="utf-8") as f:
         json_dump(pull, f)
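The first hunk above isolates each repository behind a `try`/`except ... continue`, so a single DMCA-blocked repository (HTTP 451) skips only its own remaining resources instead of aborting the whole run. A minimal sketch of that control flow, with a stand-in exception class (the project's real `RepositoryUnavailableError` lives elsewhere in the module):

```python
class RepositoryUnavailableError(Exception):
    """Stand-in for the project's exception (hypothetical reimplementation)."""

    def __init__(self, dmca_url=None):
        super().__init__("repository unavailable")
        self.dmca_url = dmca_url


def backup_all(repos, backup_one):
    done, skipped = [], []
    for repo in repos:
        try:
            backup_one(repo)
        except RepositoryUnavailableError:
            skipped.append(repo)
            continue  # one blocked repo must not abort the run
        done.append(repo)
    return done, skipped


def fake_backup(repo):
    if repo == "dmca/blocked":
        raise RepositoryUnavailableError(dmca_url="https://example.invalid/notice")


print(backup_all(["a/x", "dmca/blocked", "b/y"], fake_backup))
# (['a/x', 'b/y'], ['dmca/blocked'])
```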
pytest.ini (new file, 6 lines)

@@ -0,0 +1,6 @@
+[pytest]
+testpaths = tests
+python_files = test_*.py
+python_classes = Test*
+python_functions = test_*
+addopts = -v
@@ -1,39 +1,40 @@
 autopep8==2.3.2
-black==25.1.0
-bleach==6.2.0
-certifi==2025.7.14
-charset-normalizer==3.4.2
-click==8.1.8
+black==25.11.0
+bleach==6.3.0
+certifi==2025.11.12
+charset-normalizer==3.4.4
+click==8.3.1
 colorama==0.4.6
-docutils==0.22
+docutils==0.22.3
 flake8==7.3.0
 gitchangelog==3.0.4
-idna==3.10
+pytest==9.0.1
+idna==3.11
 importlib-metadata==8.7.0
 jaraco.classes==3.4.0
-keyring==25.6.0
-markdown-it-py==3.0.0
+keyring==25.7.0
+markdown-it-py==4.0.0
 mccabe==0.7.0
 mdurl==0.1.2
-more-itertools==10.7.0
+more-itertools==10.8.0
 mypy-extensions==1.1.0
 packaging==25.0
 pathspec==0.12.1
 pkginfo==1.12.1.2
-platformdirs==4.3.8
+platformdirs==4.5.0
 pycodestyle==2.14.0
 pyflakes==3.4.0
 Pygments==2.19.2
 readme-renderer==44.0
-requests==2.32.4
+requests==2.32.5
 requests-toolbelt==1.0.0
-restructuredtext-lint==1.4.0
+restructuredtext-lint==2.0.2
 rfc3986==2.0.0
-rich==14.1.0
+rich==14.2.0
 setuptools==80.9.0
 six==1.17.0
 tqdm==4.67.1
-twine==6.1.0
+twine==6.2.0
 urllib3==2.5.0
 webencodings==0.5.1
 zipp==3.23.0

@@ -1 +0,0 @@
-
setup.py

@@ -40,15 +40,16 @@ setup(
         "Development Status :: 5 - Production/Stable",
         "Topic :: System :: Archiving :: Backup",
         "License :: OSI Approved :: MIT License",
-        "Programming Language :: Python :: 3.8",
-        "Programming Language :: Python :: 3.9",
         "Programming Language :: Python :: 3.10",
         "Programming Language :: Python :: 3.11",
         "Programming Language :: Python :: 3.12",
+        "Programming Language :: Python :: 3.13",
+        "Programming Language :: Python :: 3.14",
     ],
     description="backup a github user or organization",
     long_description=open_file("README.rst").read(),
     long_description_content_type="text/x-rst",
     install_requires=open_file("requirements.txt").readlines(),
+    python_requires=">=3.10",
     zip_safe=True,
 )
tests/__init__.py (new file, 1 line)

@@ -0,0 +1 @@
+"""Tests for python-github-backup."""
tests/test_attachments.py (new file, 353 lines)

"""Behavioral tests for attachment functionality."""

import json
import os
import tempfile
from pathlib import Path
from unittest.mock import Mock

import pytest

from github_backup import github_backup


@pytest.fixture
def attachment_test_setup(tmp_path):
    """Fixture providing setup and helper for attachment download tests."""
    from unittest.mock import patch

    issue_cwd = tmp_path / "issues"
    issue_cwd.mkdir()

    # Mock args
    args = Mock()
    args.as_app = False
    args.token_fine = None
    args.token_classic = None
    args.username = None
    args.password = None
    args.osx_keychain_item_name = None
    args.osx_keychain_item_account = None
    args.user = "testuser"
    args.repository = "testrepo"

    repository = {"full_name": "testuser/testrepo"}

    def call_download(issue_data, issue_number=123):
        """Call download_attachments with mocked HTTP downloads.

        Returns list of URLs that were actually downloaded.
        """
        downloaded_urls = []

        def mock_download(url, path, auth, as_app, fine):
            downloaded_urls.append(url)
            return {
                "success": True,
                "saved_as": os.path.basename(path),
                "url": url,
            }

        with patch(
            "github_backup.github_backup.download_attachment_file",
            side_effect=mock_download,
        ):
            github_backup.download_attachments(
                args, str(issue_cwd), issue_data, issue_number, repository
            )

        return downloaded_urls

    return {
        "issue_cwd": str(issue_cwd),
        "args": args,
        "repository": repository,
        "call_download": call_download,
    }


class TestURLExtraction:
    """Test URL extraction with realistic issue content."""

    def test_mixed_urls(self):
        issue_data = {
            "body": """
## Bug Report

When uploading files, I see this error. Here's a screenshot:
https://github.com/user-attachments/assets/abc123def456

The logs show: https://github.com/user-attachments/files/789/error-log.txt

This is similar to https://github.com/someorg/somerepo/issues/42 but different.

You can also see the video at https://user-images.githubusercontent.com/12345/video-demo.mov

Here's how to reproduce:
```bash
# Don't extract this example URL:
curl https://github.com/user-attachments/assets/example999
```

More info at https://docs.example.com/guide

Also see this inline code `https://github.com/user-attachments/files/111/inline.pdf` should not extract.

Final attachment: https://github.com/user-attachments/files/222/report.pdf.
""",
            "comment_data": [
                {
                    "body": "Here's another attachment: https://private-user-images.githubusercontent.com/98765/secret.png?jwt=token123"
                },
                {
                    "body": """
Example code:
```python
url = "https://github.com/user-attachments/assets/code-example"
```
But this is real: https://github.com/user-attachments/files/333/actual.zip
"""
                },
            ],
        }

        # Extract URLs
        urls = github_backup.extract_attachment_urls(issue_data)

        expected_urls = [
            "https://github.com/user-attachments/assets/abc123def456",
            "https://github.com/user-attachments/files/789/error-log.txt",
            "https://user-images.githubusercontent.com/12345/video-demo.mov",
            "https://github.com/user-attachments/files/222/report.pdf",
            "https://private-user-images.githubusercontent.com/98765/secret.png?jwt=token123",
            "https://github.com/user-attachments/files/333/actual.zip",
        ]

        assert set(urls) == set(expected_urls)

    def test_trailing_punctuation_stripped(self):
        """URLs with trailing punctuation should have punctuation stripped."""
        issue_data = {
            "body": """
See this file: https://github.com/user-attachments/files/1/doc.pdf.
And this one (https://github.com/user-attachments/files/2/image.png).
Check it out! https://github.com/user-attachments/files/3/data.csv!
"""
        }

        urls = github_backup.extract_attachment_urls(issue_data)

        expected = [
            "https://github.com/user-attachments/files/1/doc.pdf",
            "https://github.com/user-attachments/files/2/image.png",
            "https://github.com/user-attachments/files/3/data.csv",
        ]
        assert set(urls) == set(expected)

    def test_deduplication_across_body_and_comments(self):
        """Same URL in body and comments should only appear once."""
        duplicate_url = "https://github.com/user-attachments/assets/abc123"

        issue_data = {
            "body": f"First mention: {duplicate_url}",
            "comment_data": [
                {"body": f"Second mention: {duplicate_url}"},
                {"body": f"Third mention: {duplicate_url}"},
            ],
        }

        urls = github_backup.extract_attachment_urls(issue_data)

        assert set(urls) == {duplicate_url}


class TestFilenameExtraction:
    """Test filename extraction from different URL types."""

    def test_modern_assets_url(self):
        """Modern assets URL returns UUID."""
        url = "https://github.com/user-attachments/assets/abc123def456"
        filename = github_backup.get_attachment_filename(url)
        assert filename == "abc123def456"

    def test_modern_files_url(self):
        """Modern files URL returns filename."""
        url = "https://github.com/user-attachments/files/12345/report.pdf"
        filename = github_backup.get_attachment_filename(url)
        assert filename == "report.pdf"

    def test_legacy_cdn_url(self):
        """Legacy CDN URL returns filename with extension."""
        url = "https://user-images.githubusercontent.com/123456/abc-def.png"
        filename = github_backup.get_attachment_filename(url)
        assert filename == "abc-def.png"

    def test_private_cdn_url(self):
        """Private CDN URL returns filename."""
        url = "https://private-user-images.githubusercontent.com/98765/secret.png?jwt=token123"
        filename = github_backup.get_attachment_filename(url)
        assert filename == "secret.png"

    def test_repo_files_url(self):
        """Repo-scoped files URL returns filename."""
        url = "https://github.com/owner/repo/files/789/document.txt"
        filename = github_backup.get_attachment_filename(url)
        assert filename == "document.txt"


class TestFilenameCollision:
    """Test filename collision resolution."""

    def test_collision_behavior(self):
        """Test filename collision resolution with real files."""
        with tempfile.TemporaryDirectory() as tmpdir:
            # No collision - file doesn't exist
            result = github_backup.resolve_filename_collision(
                os.path.join(tmpdir, "report.pdf")
            )
            assert result == os.path.join(tmpdir, "report.pdf")

            # Create the file, now collision exists
            Path(os.path.join(tmpdir, "report.pdf")).touch()
            result = github_backup.resolve_filename_collision(
                os.path.join(tmpdir, "report.pdf")
|
||||||
|
)
|
||||||
|
assert result == os.path.join(tmpdir, "report_1.pdf")
|
||||||
|
|
||||||
|
# Create report_1.pdf too
|
||||||
|
Path(os.path.join(tmpdir, "report_1.pdf")).touch()
|
||||||
|
result = github_backup.resolve_filename_collision(
|
||||||
|
os.path.join(tmpdir, "report.pdf")
|
||||||
|
)
|
||||||
|
assert result == os.path.join(tmpdir, "report_2.pdf")
|
||||||
|
|
||||||
|
def test_manifest_reserved(self):
|
||||||
|
"""manifest.json is always treated as reserved."""
|
||||||
|
with tempfile.TemporaryDirectory() as tmpdir:
|
||||||
|
# Even if manifest.json doesn't exist, should get manifest_1.json
|
||||||
|
result = github_backup.resolve_filename_collision(
|
||||||
|
os.path.join(tmpdir, "manifest.json")
|
||||||
|
)
|
||||||
|
assert result == os.path.join(tmpdir, "manifest_1.json")
|
||||||
|
|
||||||
|
|
||||||
|
class TestManifestDuplicatePrevention:
|
||||||
|
"""Test that manifest prevents duplicate downloads (the bug fix)."""
|
||||||
|
|
||||||
|
def test_manifest_filters_existing_urls(self, attachment_test_setup):
|
||||||
|
"""URLs in manifest are not re-downloaded."""
|
||||||
|
setup = attachment_test_setup
|
||||||
|
|
||||||
|
# Create manifest with existing URLs
|
||||||
|
attachments_dir = os.path.join(setup["issue_cwd"], "attachments", "123")
|
||||||
|
os.makedirs(attachments_dir)
|
||||||
|
manifest_path = os.path.join(attachments_dir, "manifest.json")
|
||||||
|
|
||||||
|
manifest = {
|
||||||
|
"attachments": [
|
||||||
|
{
|
||||||
|
"url": "https://github.com/user-attachments/assets/old1",
|
||||||
|
"success": True,
|
||||||
|
"saved_as": "old1.pdf",
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"url": "https://github.com/user-attachments/assets/old2",
|
||||||
|
"success": True,
|
||||||
|
"saved_as": "old2.pdf",
|
||||||
|
},
|
||||||
|
]
|
||||||
|
}
|
||||||
|
with open(manifest_path, "w") as f:
|
||||||
|
json.dump(manifest, f)
|
||||||
|
|
||||||
|
# Issue data with 2 old URLs and 1 new URL
|
||||||
|
issue_data = {
|
||||||
|
"body": """
|
||||||
|
Old: https://github.com/user-attachments/assets/old1
|
||||||
|
Old: https://github.com/user-attachments/assets/old2
|
||||||
|
New: https://github.com/user-attachments/assets/new1
|
||||||
|
"""
|
||||||
|
}
|
||||||
|
|
||||||
|
downloaded_urls = setup["call_download"](issue_data)
|
||||||
|
|
||||||
|
# Should only download the NEW URL (old ones filtered by manifest)
|
||||||
|
assert len(downloaded_urls) == 1
|
||||||
|
assert downloaded_urls[0] == "https://github.com/user-attachments/assets/new1"
|
||||||
|
|
||||||
|
def test_no_manifest_downloads_all(self, attachment_test_setup):
|
||||||
|
"""Without manifest, all URLs should be downloaded."""
|
||||||
|
setup = attachment_test_setup
|
||||||
|
|
||||||
|
# Issue data with 2 URLs
|
||||||
|
issue_data = {
|
||||||
|
"body": """
|
||||||
|
https://github.com/user-attachments/assets/url1
|
||||||
|
https://github.com/user-attachments/assets/url2
|
||||||
|
"""
|
||||||
|
}
|
||||||
|
|
||||||
|
downloaded_urls = setup["call_download"](issue_data)
|
||||||
|
|
||||||
|
# Should download ALL URLs (no manifest to filter)
|
||||||
|
assert len(downloaded_urls) == 2
|
||||||
|
assert set(downloaded_urls) == {
|
||||||
|
"https://github.com/user-attachments/assets/url1",
|
||||||
|
"https://github.com/user-attachments/assets/url2",
|
||||||
|
}
|
||||||
|
|
||||||
|
def test_manifest_skips_permanent_failures(self, attachment_test_setup):
|
||||||
|
"""Manifest skips permanent failures (404, 410) but retries transient (503)."""
|
||||||
|
setup = attachment_test_setup
|
||||||
|
|
||||||
|
# Create manifest with different failure types
|
||||||
|
attachments_dir = os.path.join(setup["issue_cwd"], "attachments", "123")
|
||||||
|
os.makedirs(attachments_dir)
|
||||||
|
manifest_path = os.path.join(attachments_dir, "manifest.json")
|
||||||
|
|
||||||
|
manifest = {
|
||||||
|
"attachments": [
|
||||||
|
{
|
||||||
|
"url": "https://github.com/user-attachments/assets/success",
|
||||||
|
"success": True,
|
||||||
|
"saved_as": "success.pdf",
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"url": "https://github.com/user-attachments/assets/notfound",
|
||||||
|
"success": False,
|
||||||
|
"http_status": 404,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"url": "https://github.com/user-attachments/assets/gone",
|
||||||
|
"success": False,
|
||||||
|
"http_status": 410,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"url": "https://github.com/user-attachments/assets/unavailable",
|
||||||
|
"success": False,
|
||||||
|
"http_status": 503,
|
||||||
|
},
|
||||||
|
]
|
||||||
|
}
|
||||||
|
with open(manifest_path, "w") as f:
|
||||||
|
json.dump(manifest, f)
|
||||||
|
|
||||||
|
# Issue data has all 4 URLs
|
||||||
|
issue_data = {
|
||||||
|
"body": """
|
||||||
|
https://github.com/user-attachments/assets/success
|
||||||
|
https://github.com/user-attachments/assets/notfound
|
||||||
|
https://github.com/user-attachments/assets/gone
|
||||||
|
https://github.com/user-attachments/assets/unavailable
|
||||||
|
"""
|
||||||
|
}
|
||||||
|
|
||||||
|
downloaded_urls = setup["call_download"](issue_data)
|
||||||
|
|
||||||
|
# Should only retry 503 (transient failure)
|
||||||
|
# Success, 404, and 410 should be skipped
|
||||||
|
assert len(downloaded_urls) == 1
|
||||||
|
assert (
|
||||||
|
downloaded_urls[0]
|
||||||
|
== "https://github.com/user-attachments/assets/unavailable"
|
||||||
|
)
|
||||||
143
tests/test_http_451.py
Normal file
@@ -0,0 +1,143 @@
"""Tests for HTTP 451 (DMCA takedown) handling."""

import json
from unittest.mock import Mock, patch

import pytest

from github_backup import github_backup


class TestHTTP451Exception:
    """Test suite for HTTP 451 DMCA takedown exception handling."""

    def test_repository_unavailable_error_raised(self):
        """HTTP 451 should raise RepositoryUnavailableError with DMCA URL."""
        # Create mock args
        args = Mock()
        args.as_app = False
        args.token_fine = None
        args.token_classic = None
        args.username = None
        args.password = None
        args.osx_keychain_item_name = None
        args.osx_keychain_item_account = None
        args.throttle_limit = None
        args.throttle_pause = 0

        # Mock HTTPError 451 response
        mock_response = Mock()
        mock_response.getcode.return_value = 451

        dmca_data = {
            "message": "Repository access blocked",
            "block": {
                "reason": "dmca",
                "created_at": "2024-11-12T14:38:04Z",
                "html_url": "https://github.com/github/dmca/blob/master/2024/11/2024-11-04-source-code.md",
            },
        }
        mock_response.read.return_value = json.dumps(dmca_data).encode("utf-8")
        mock_response.headers = {"x-ratelimit-remaining": "5000"}
        mock_response.reason = "Unavailable For Legal Reasons"

        def mock_get_response(request, auth, template):
            return mock_response, []

        with patch("github_backup.github_backup._get_response", side_effect=mock_get_response):
            with pytest.raises(github_backup.RepositoryUnavailableError) as exc_info:
                list(github_backup.retrieve_data_gen(args, "https://api.github.com/repos/test/dmca/issues"))

        # Check exception has DMCA URL
        assert exc_info.value.dmca_url == "https://github.com/github/dmca/blob/master/2024/11/2024-11-04-source-code.md"
        assert "451" in str(exc_info.value)

    def test_repository_unavailable_error_without_dmca_url(self):
        """HTTP 451 without DMCA details should still raise exception."""
        args = Mock()
        args.as_app = False
        args.token_fine = None
        args.token_classic = None
        args.username = None
        args.password = None
        args.osx_keychain_item_name = None
        args.osx_keychain_item_account = None
        args.throttle_limit = None
        args.throttle_pause = 0

        mock_response = Mock()
        mock_response.getcode.return_value = 451
        mock_response.read.return_value = b'{"message": "Blocked"}'
        mock_response.headers = {"x-ratelimit-remaining": "5000"}
        mock_response.reason = "Unavailable For Legal Reasons"

        def mock_get_response(request, auth, template):
            return mock_response, []

        with patch("github_backup.github_backup._get_response", side_effect=mock_get_response):
            with pytest.raises(github_backup.RepositoryUnavailableError) as exc_info:
                list(github_backup.retrieve_data_gen(args, "https://api.github.com/repos/test/dmca/issues"))

        # Exception raised even without DMCA URL
        assert exc_info.value.dmca_url is None
        assert "451" in str(exc_info.value)

    def test_repository_unavailable_error_with_malformed_json(self):
        """HTTP 451 with malformed JSON should still raise exception."""
        args = Mock()
        args.as_app = False
        args.token_fine = None
        args.token_classic = None
        args.username = None
        args.password = None
        args.osx_keychain_item_name = None
        args.osx_keychain_item_account = None
        args.throttle_limit = None
        args.throttle_pause = 0

        mock_response = Mock()
        mock_response.getcode.return_value = 451
        mock_response.read.return_value = b"invalid json {"
        mock_response.headers = {"x-ratelimit-remaining": "5000"}
        mock_response.reason = "Unavailable For Legal Reasons"

        def mock_get_response(request, auth, template):
            return mock_response, []

        with patch("github_backup.github_backup._get_response", side_effect=mock_get_response):
            with pytest.raises(github_backup.RepositoryUnavailableError):
                list(github_backup.retrieve_data_gen(args, "https://api.github.com/repos/test/dmca/issues"))

    def test_other_http_errors_unchanged(self):
        """Other HTTP errors should still raise generic Exception."""
        args = Mock()
        args.as_app = False
        args.token_fine = None
        args.token_classic = None
        args.username = None
        args.password = None
        args.osx_keychain_item_name = None
        args.osx_keychain_item_account = None
        args.throttle_limit = None
        args.throttle_pause = 0

        mock_response = Mock()
        mock_response.getcode.return_value = 404
        mock_response.read.return_value = b'{"message": "Not Found"}'
        mock_response.headers = {"x-ratelimit-remaining": "5000"}
        mock_response.reason = "Not Found"

        def mock_get_response(request, auth, template):
            return mock_response, []

        with patch("github_backup.github_backup._get_response", side_effect=mock_get_response):
            # Should raise generic Exception, not RepositoryUnavailableError
            with pytest.raises(Exception) as exc_info:
                list(github_backup.retrieve_data_gen(args, "https://api.github.com/repos/test/notfound/issues"))

        assert not isinstance(exc_info.value, github_backup.RepositoryUnavailableError)
        assert "404" in str(exc_info.value)


if __name__ == "__main__":
    pytest.main([__file__, "-v"])
153
tests/test_pagination.py
Normal file
@@ -0,0 +1,153 @@
"""Tests for Link header pagination handling."""

import json
from unittest.mock import Mock, patch

import pytest

from github_backup import github_backup


class MockHTTPResponse:
    """Mock HTTP response for paginated API calls."""

    def __init__(self, data, link_header=None):
        self._content = json.dumps(data).encode("utf-8")
        self._link_header = link_header
        self._read = False
        self.reason = "OK"

    def getcode(self):
        return 200

    def read(self):
        if self._read:
            return b""
        self._read = True
        return self._content

    def get_header(self, name, default=None):
        """Mock method for headers.get()."""
        return self.headers.get(name, default)

    @property
    def headers(self):
        headers = {"x-ratelimit-remaining": "5000"}
        if self._link_header:
            headers["Link"] = self._link_header
        return headers


@pytest.fixture
def mock_args():
    """Mock args for retrieve_data_gen."""
    args = Mock()
    args.as_app = False
    args.token_fine = None
    args.token_classic = "fake_token"
    args.username = None
    args.password = None
    args.osx_keychain_item_name = None
    args.osx_keychain_item_account = None
    args.throttle_limit = None
    args.throttle_pause = 0
    return args


def test_cursor_based_pagination(mock_args):
    """Link header with 'after' cursor parameter works correctly."""

    # Simulate issues endpoint behavior: returns cursor in Link header
    responses = [
        # Issues endpoint returns 'after' cursor parameter (not 'page')
        MockHTTPResponse(
            data=[{"issue": i} for i in range(1, 101)],  # Page 1 contents
            link_header='<https://api.github.com/repos/owner/repo/issues?per_page=100&after=ABC123&page=2>; rel="next"',
        ),
        MockHTTPResponse(
            data=[{"issue": i} for i in range(101, 151)],  # Page 2 contents
            link_header=None,  # No Link header - signals end of pagination
        ),
    ]
    requests_made = []

    def mock_urlopen(request, *args, **kwargs):
        url = request.get_full_url()
        requests_made.append(url)
        return responses[len(requests_made) - 1]

    with patch("github_backup.github_backup.urlopen", side_effect=mock_urlopen):
        results = list(
            github_backup.retrieve_data_gen(
                mock_args, "https://api.github.com/repos/owner/repo/issues"
            )
        )

    # Verify all items retrieved and cursor was used in second request
    assert len(results) == 150
    assert len(requests_made) == 2
    assert "after=ABC123" in requests_made[1]


def test_page_based_pagination(mock_args):
    """Link header with 'page' parameter works correctly."""

    # Simulate pulls/repos endpoint behavior: returns page numbers in Link header
    responses = [
        # Pulls endpoint uses traditional 'page' parameter (not cursor)
        MockHTTPResponse(
            data=[{"pull": i} for i in range(1, 101)],  # Page 1 contents
            link_header='<https://api.github.com/repos/owner/repo/pulls?per_page=100&page=2>; rel="next"',
        ),
        MockHTTPResponse(
            data=[{"pull": i} for i in range(101, 181)],  # Page 2 contents
            link_header=None,  # No Link header - signals end of pagination
        ),
    ]
    requests_made = []

    def mock_urlopen(request, *args, **kwargs):
        url = request.get_full_url()
        requests_made.append(url)
        return responses[len(requests_made) - 1]

    with patch("github_backup.github_backup.urlopen", side_effect=mock_urlopen):
        results = list(
            github_backup.retrieve_data_gen(
                mock_args, "https://api.github.com/repos/owner/repo/pulls"
            )
        )

    # Verify all items retrieved and page parameter was used (not cursor)
    assert len(results) == 180
    assert len(requests_made) == 2
    assert "page=2" in requests_made[1]
    assert "after" not in requests_made[1]


def test_no_link_header_stops_pagination(mock_args):
    """Pagination stops when Link header is absent."""

    # Simulate endpoint with results that fit in a single page
    responses = [
        MockHTTPResponse(
            data=[{"label": i} for i in range(1, 51)],  # Page contents
            link_header=None,  # No Link header - signals end of pagination
        )
    ]
    requests_made = []

    def mock_urlopen(request, *args, **kwargs):
        requests_made.append(request.get_full_url())
        return responses[len(requests_made) - 1]

    with patch("github_backup.github_backup.urlopen", side_effect=mock_urlopen):
        results = list(
            github_backup.retrieve_data_gen(
                mock_args, "https://api.github.com/repos/owner/repo/labels"
            )
        )

    # Verify pagination stopped after first request
    assert len(results) == 50
    assert len(requests_made) == 1
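The pagination tests above all key off the `rel="next"` target in the RFC 5988 `Link` header: cursor-based and page-based endpoints differ only in the query parameters of that URL, and its absence ends the loop. As a minimal sketch of that parsing step (the helper name `parse_next_link` is hypothetical, not the actual function in `github_backup`):

```python
import re


def parse_next_link(link_header):
    # Hypothetical helper: return the rel="next" target from an
    # RFC 5988 Link header value, or None when pagination is done.
    if not link_header:
        return None
    for part in link_header.split(","):
        match = re.search(r'<([^>]+)>\s*;\s*rel="next"', part)
        if match:
            return match.group(1)
    return None


header = (
    '<https://api.github.com/repos/owner/repo/issues?per_page=100&after=ABC123>; rel="next", '
    '<https://api.github.com/repos/owner/repo/issues?per_page=100>; rel="first"'
)
print(parse_next_link(header))
# → https://api.github.com/repos/owner/repo/issues?per_page=100&after=ABC123
```

Whether the next URL carries `after=` or `page=`, the client simply follows it verbatim, which is why both `test_cursor_based_pagination` and `test_page_based_pagination` only assert on the URL of the second request.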