Mirror of https://github.com/community-scripts/ProxmoxVE.git (synced 2025-07-01 19:47:38 +00:00)
Compare commits: 242 commits, 2025-02-13 ... 2025-02-28
(Commit list: 242 rows, abbreviated SHA1 hashes only; the author, date, and message columns were not captured in this view.)
(documentation file, name not captured in this view)
@@ -40,27 +40,27 @@ Before contributing, please ensure that you have the following setup:

 - [Shell Format](https://marketplace.visualstudio.com/items?itemName=foxundermoon.shell-format)

 ### Important Notes

-- Use [AppName.sh](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_GUIDE/ct/AppName.sh) and [AppName-install.sh](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_GUIDE/install/AppName-install.sh) as templates when creating new scripts.
+- Use [AppName.sh](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_AND_GUIDES/ct/AppName.sh) and [AppName-install.sh](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_AND_GUIDES/install/AppName-install.sh) as templates when creating new scripts.

 ---

 # 🚀 The Application Script (ct/AppName.sh)

-- You can find all coding standards, as well as the structure for this file [here](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_GUIDE/ct/AppName.md).
+- You can find all coding standards, as well as the structure for this file [here](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_AND_GUIDES/ct/AppName.md).
 - These scripts are responsible for container creation, setting the necessary variables and handling the update of the application once installed.

 ---

 # 🛠 The Installation Script (install/AppName-install.sh)

-- You can find all coding standards, as well as the structure for this file [here](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_GUIDE/install/AppName-install.md).
+- You can find all coding standards, as well as the structure for this file [here](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_AND_GUIDES/install/AppName-install.md).
 - These scripts are responsible for the installation of the application.

 ---

 ## 🚀 Building Your Own Scripts

-Start with the [template script](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_GUIDE/install/AppName-install.sh)
+Start with the [template script](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_AND_GUIDES/install/AppName-install.sh)

 ---

@@ -99,8 +99,8 @@ Open a Pull Request from your feature branch to the main repository branch. You

 ## 📚 Pages

-- [CT Template: AppName.sh](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_GUIDE/ct/AppName.sh)
+- [CT Template: AppName.sh](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_AND_GUIDES/ct/AppName.sh)
-- [Install Template: AppName-install.sh](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_GUIDE/install/AppName-install.sh)
+- [Install Template: AppName-install.sh](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_AND_GUIDES/install/AppName-install.sh)
-- [JSON Template: AppName.json](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_GUIDE/json/AppName.json)
+- [JSON Template: AppName.json](https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_AND_GUIDES/json/AppName.json)
(documentation file, name not captured in this view)
@@ -95,7 +95,7 @@ Example:
 >| Variable | Description | Notes |
 >|----------|-------------|-------|
 >| `APP` | Application name | Must match ct\AppName.sh |
->| `TAGS` | Proxmox display tags without Spaces, only ; | Limit the number |
+>| `var_tags` | Proxmox display tags without Spaces, only ; | Limit the number |
 >| `var_cpu` | CPU cores | Number of cores |
 >| `var_ram` | RAM | In MB |
 >| `var_disk` | Disk capacity | In GB |
@@ -193,13 +193,13 @@ wget -q
 unzip -q
 ```

-- If a command does not come with this functionality use `&>/dev/null` to suppress it's output.
+- If a command does not come with this functionality use `$STD` to suppress it's output.

 Example:

 ```bash
-php artisan migrate --force &>/dev/null
+$STD php artisan migrate --force
-php artisan config:clear &>/dev/null
+$STD php artisan config:clear
 ```

 ### 3.5 **Backups**

@@ -247,7 +247,7 @@ function update_script() {
     msg_error "No ${APP} Installation Found!"
     exit
   fi
-  msg_error "There is currently no automatic update function for ${APP}."
+  msg_error "Currently we don't provide an update function for this ${APP}."
   exit
 }
 ```
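The `$STD` helper referenced in this hunk comes from the project's shared functions; its definition is not part of this diff. A minimal sketch of how such an output-suppression variable can be wired, assuming a `VERBOSE` toggle and a `silent()` wrapper (names here are illustrative, not the project's actual implementation):

```bash
#!/usr/bin/env bash
# Illustrative sketch only; the real helper lives in the project's shared *.func files.

VERBOSE="${VERBOSE:-no}"

silent() {
  # Run the given command while hiding stdout and stderr.
  "$@" >/dev/null 2>&1
}

if [ "$VERBOSE" = "yes" ]; then
  STD=""        # verbose mode: run commands normally
else
  STD="silent"  # quiet mode: route commands through silent()
fi

# Usage mirrors the example in the diff: output is suppressed unless VERBOSE=yes.
$STD uname -a
```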
(script file, name not captured in this view)
@@ -8,7 +8,7 @@ source <(curl -s https://raw.githubusercontent.com/community-scripts/ProxmoxVE/m
 # App Default Values
 APP="[APP_NAME]"
 # Name of the app (e.g. Google, Adventurelog, Apache-Guacamole"
-TAGS="[TAGS]"
+var_tags="[TAGS]"
 # Tags for Proxmox VE, maximum 2 pcs., no spaces allowed, separated by a semicolon ; (e.g. database | adblock;dhcp)
 var_cpu="[CPU]"
 # Number of cores (1-X) (e.g. 4) - default are 2
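For illustration, a hypothetical header with the renamed `var_tags` variable filled in (all values are made up; the MB/GB units follow the variable table shown earlier):

```bash
APP="Adventurelog"        # Name of the app
var_tags="media;tracker"  # Max. 2 tags, no spaces, separated by ;
var_cpu="2"               # Number of cores
var_ram="2048"            # RAM in MB
var_disk="8"              # Disk capacity in GB
```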
.github/ISSUE_TEMPLATE/bug_report.yml (vendored, 2 changes)
@@ -1,6 +1,6 @@
 name: "🐞 Script Issue Report"
 description: Report a specific issue with a script. For other inquiries, please use the Discussions section.
+labels: ["bug"]
 body:
   - type: markdown
     attributes:
.github/ISSUE_TEMPLATE/config.yml (vendored, 2 changes)
@@ -3,7 +3,7 @@ contact_links:
   - name: 🤔 Questions and Help
     url: https://github.com/community-scripts/ProxmoxVE/discussions
     about: For suggestions or questions, please use the Discussions section.
-  - name: 🌟 Feature request
+  - name: 🌟 new Script request
     url: https://github.com/community-scripts/ProxmoxVE/discussions/new?category=request-script
     about: For feature/script requests, please use the Discussions section.
   - name: 💻 Discord
.github/ISSUE_TEMPLATE/feature_request.yml (vendored, new file, 33 lines)
name: "✨ Feature Request"
description: "Suggest a new feature or enhancement."
labels: ["enhancement"]
body:
  - type: markdown
    attributes:
      value: |
        # ✨ **Feature Request**
        Have an idea for a new feature? Share your thoughts below!

  - type: input
    id: feature_summary
    attributes:
      label: "🌟 Briefly describe the feature"
      placeholder: "e.g., Add support for XYZ"
    validations:
      required: true

  - type: textarea
    id: feature_description
    attributes:
      label: "📝 Detailed description"
      placeholder: "Explain the feature in detail"
    validations:
      required: true

  - type: textarea
    id: use_case
    attributes:
      label: "💡 Why is this useful?"
      placeholder: "Describe the benefit of this feature"
    validations:
      required: true
.github/ISSUE_TEMPLATE/task.yml (vendored, new file, 25 lines)
name: "🛠️ Task / General Request"
description: "Request a general task, improvement, or refactor."
labels: ["task"]
body:
  - type: markdown
    attributes:
      value: |
        # 🛠️ **Task / General Request**
        Request a task that isn't a bug or feature request.

  - type: input
    id: task_summary
    attributes:
      label: "📌 Task summary"
      placeholder: "e.g., Refactor XYZ"
    validations:
      required: true

  - type: textarea
    id: task_details
    attributes:
      label: "📋 Task details"
      placeholder: "Explain what needs to be done"
    validations:
      required: true
.github/autolabeler-config.json (vendored, 53 changes)
@@ -1,17 +1,5 @@
 {
-  "breaking change": [
-    {
-      "fileStatus": "renamed",
-      "includeGlobs": ["ct/**", "install/**", "misc/**", "turnkey/**", "vm/**"],
-      "excludeGlobs": []
-    },
-    {
-      "fileStatus": "removed",
-      "includeGlobs": ["ct/**", "install/**", "misc/**", "turnkey/**", "vm/**"],
-      "excludeGlobs": []
-    }
-  ],
   "new script": [
     {
       "fileStatus": "added",
@@ -33,10 +21,17 @@
       "excludeGlobs": []
     }
   ],
-  "rename script": [
+  "maintenance": [
     {
-      "fileStatus": "renamed",
-      "includeGlobs": ["ct/**", "install/**", "misc/**", "turnkey/**", "vm/**"],
+      "fileStatus": null,
+      "includeGlobs": ["*.md", ".github/**", "misc/*.func", "ct/create_lxc.sh", "api/**"],
+      "excludeGlobs": []
+    }
+  ],
+  "core": [
+    {
+      "fileStatus": null,
+      "includeGlobs": ["misc/*.func", "ct/create_lxc.sh"],
       "excludeGlobs": []
     }
   ],
@@ -47,20 +42,28 @@
       "excludeGlobs": []
     }
   ],
-  "maintenance": [
-    {
-      "fileStatus": null,
-      "includeGlobs": ["*.md", ".github/**", "misc/*.func", "ct/create_lxc.sh"],
-      "excludeGlobs": ["misc/api.func"]
-    }
-  ],
   "api": [
     {
       "fileStatus": null,
       "includeGlobs": ["api/**", "misc/api.func"],
       "excludeGlobs": []
     }
   ],
+  "github": [
+    {
+      "fileStatus": null,
+      "includeGlobs": [".github/**"],
+      "excludeGlobs": []
+    }
+  ],
+  "json": [
+    {
+      "fileStatus": "modified",
+      "includeGlobs": ["json/**"],
+      "excludeGlobs": []
+    }
+  ],
+
   "high risk": [
     {
       "fileStatus": null,
@@ -68,4 +71,6 @@
       "excludeGlobs": []
     }
   ]
-}
+
+
+}
.github/changelog-pr-config.json (vendored, 103 changes)
@@ -1,34 +1,97 @@
 [
   {
-    "title": "💥 Breaking Changes",
-    "labels": ["breaking change"]
+    "title": "🆕 New Scripts",
+    "labels": ["new script"]
   },
   {
-    "title": "✨ New Scripts",
-    "labels": ["new script"]
+    "title": "🚀 Updated Scripts",
+    "labels": ["update script"],
+    "subCategories": [
+      { "title": "🐞 Bug Fixes", "labels": ["bugfix"], "notes": [] },
+      { "title": "✨ New Features", "labels": ["feature"], "notes": [] },
+      { "title": "💥 Breaking Changes", "labels": ["breaking change"], "notes": [] }
+    ]
   },
   {
-    "title": "🚀 Updated Scripts",
-    "labels": ["update script"]
+    "title": "🧰 Maintenance",
+    "labels": ["maintenance"],
+    "subCategories": [
+      { "title": "🐞 Bug Fixes", "labels": ["bugfix"], "notes": [] },
+      { "title": "✨ New Features", "labels": ["feature"], "notes": [] },
+      { "title": "💥 Breaking Changes", "labels": ["breaking change"], "notes": [] },
+      { "title": "📡 API", "labels": ["api"], "notes": [] },
+      { "title": "💾 Core", "labels": ["core"], "notes": [] },
+      { "title": "📂 Github", "labels": ["github"], "notes": [] }
+    ]
   },
   {
     "title": "🌐 Website",
-    "labels": ["website"]
+    "labels": ["website"],
+    "subCategories": [
+      { "title": "🐞 Bug Fixes", "labels": ["bugfix"], "notes": [] },
+      { "title": "✨ New Features", "labels": ["feature"], "notes": [] },
+      { "title": "💥 Breaking Changes", "labels": ["breaking change"], "notes": [] },
+      { "title": "📝 Script Information", "labels": ["json"], "notes": [] }
+    ]
   },
   {
-    "title": "🐞 Bug Fixes",
-    "labels": ["bug fix"]
+    "title": "❔ Unlabelled",
+    "labels": []
   },
   {
-    "title": "🧰 Maintenance",
-    "labels": ["maintenance"]
-  },
-  {
-    "title": "📡 API",
-    "labels": ["api"]
-  },
-  {
-    "title": "❔ Unlabelled",
-    "labels": []
+    "title": "💥 Breaking Changes",
+    "labels": ["breaking change"]
   }
 ]
.github/pull_request_template.md (vendored, 18 changes)
@@ -1,25 +1,25 @@
 ## ✍️ Description
 <!-- Provide a clear and concise description of your changes. -->

 ## 🔗 Related PR / Discussion / Issue

 Link: #

 ## ✅ Prerequisites

 Before this PR can be reviewed, the following must be completed:

 - [] **Self-review performed** – Code follows established patterns and conventions.
 - [] **Testing performed** – Changes have been thoroughly tested and verified.

 ## 🛠️ Type of Change
-Select all that apply:
-- [] 🐞 **Bug fix** – Resolves an issue without breaking functionality.
-- [] ✨ **New feature** – Adds new, non-breaking functionality.
-- [] 💥 **Breaking change** – Alters existing functionality in a way that may require updates.
-- [] 🆕 **New script** – A fully functional and tested script or script set.
+Select all that apply:
+
+- [] 🆕 **New script** – A fully functional and tested script or script set.
+- [] 🐞 **Bug fix** – Resolves an issue without breaking functionality.
+- [] ✨ **New feature** – Adds new, non-breaking functionality.
+- [] 💥 **Breaking change** – Alters existing functionality in a way that may require updates.

 ## 📋 Additional Information (optional)
 <!-- Provide extra context, screenshots, or references if needed. -->
.github/runner/docker/gh-runner-self.dockerfile (vendored, new file, 68 lines)
FROM mcr.microsoft.com/dotnet/runtime-deps:8.0-jammy as build

ARG TARGETOS
ARG TARGETARCH
ARG DOCKER_VERSION=27.5.1
ARG BUILDX_VERSION=0.20.1
ARG RUNNER_ARCH="x64"

RUN apt update -y && apt install sudo curl unzip -y

WORKDIR /actions-runner

RUN RUNNER_VERSION=$(curl -s https://api.github.com/repos/actions/runner/releases/latest | grep "tag_name" | head -n 1 | awk '{print substr($2, 3, length($2)-4)}') \
    && curl -f -L -o runner.tar.gz https://github.com/actions/runner/releases/download/v${RUNNER_VERSION}/actions-runner-linux-${RUNNER_ARCH}-${RUNNER_VERSION}.tar.gz \
    && tar xzf ./runner.tar.gz \
    && rm runner.tar.gz

RUN RUNNER_CONTAINER_HOOKS_VERSION=$(curl -s https://api.github.com/repos/actions/runner-container-hooks/releases/latest | grep "tag_name" | head -n 1 | awk '{print substr($2, 3, length($2)-4)}') \
    && curl -f -L -o runner-container-hooks.zip https://github.com/actions/runner-container-hooks/releases/download/v${RUNNER_CONTAINER_HOOKS_VERSION}/actions-runner-hooks-k8s-${RUNNER_CONTAINER_HOOKS_VERSION}.zip \
    && unzip ./runner-container-hooks.zip -d ./k8s \
    && rm runner-container-hooks.zip

RUN export RUNNER_ARCH=${TARGETARCH} \
    && if [ "$RUNNER_ARCH" = "amd64" ]; then export DOCKER_ARCH=x86_64 ; fi \
    && if [ "$RUNNER_ARCH" = "arm64" ]; then export DOCKER_ARCH=aarch64 ; fi \
    && curl -fLo docker.tgz https://download.docker.com/${TARGETOS}/static/stable/${DOCKER_ARCH}/docker-${DOCKER_VERSION}.tgz \
    && tar zxvf docker.tgz \
    && rm -rf docker.tgz \
    && mkdir -p /usr/local/lib/docker/cli-plugins \
    && curl -fLo /usr/local/lib/docker/cli-plugins/docker-buildx \
       "https://github.com/docker/buildx/releases/download/v${BUILDX_VERSION}/buildx-v${BUILDX_VERSION}.linux-${TARGETARCH}" \
    && chmod +x /usr/local/lib/docker/cli-plugins/docker-buildx

FROM mcr.microsoft.com/dotnet/runtime-deps:8.0-jammy

ENV DEBIAN_FRONTEND=noninteractive
ENV RUNNER_MANUALLY_TRAP_SIG=1
ENV ACTIONS_RUNNER_PRINT_LOG_TO_STDOUT=1
ENV ImageOS=ubuntu22

RUN apt update -y \
    && apt install -y --no-install-recommends sudo lsb-release gpg-agent software-properties-common curl jq unzip \
    && rm -rf /var/lib/apt/lists/*

RUN add-apt-repository ppa:git-core/ppa \
    && apt update -y \
    && apt install -y git \
    && rm -rf /var/lib/apt/lists/*

RUN adduser --disabled-password --gecos "" --uid 1001 runner \
    && groupadd docker --gid 123 \
    && usermod -aG sudo runner \
    && usermod -aG docker runner \
    && echo "%sudo ALL=(ALL:ALL) NOPASSWD:ALL" > /etc/sudoers \
    && echo "Defaults env_keep += \"DEBIAN_FRONTEND\"" >> /etc/sudoers

# Install own dependencies in final image
RUN curl -fsSL https://deb.nodesource.com/setup_22.x | bash - \
    && apt-get install -y nodejs \
    && apt-get install -y gh jq git

WORKDIR /home/runner

COPY --chown=runner:docker --from=build /actions-runner .
COPY --from=build /usr/local/lib/docker/cli-plugins/docker-buildx /usr/local/lib/docker/cli-plugins/docker-buildx
RUN install -o root -g root -m 755 docker/* /usr/bin/ && rm -rf docker

USER runner
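As a usage sketch, the image can be built locally from the repository root with the same Dockerfile path; the tag below is illustrative, while the CI workflow further down in this diff builds and pushes the image to GHCR:

```bash
# Build the self-hosted runner image from the repository root (local tag is an example).
docker build \
  -t gh-runner-self:local \
  -f .github/runner/docker/gh-runner-self.dockerfile .
```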
(workflow file, name not captured in this view)
@@ -10,7 +10,7 @@ on:

 jobs:
   update-app-files:
-    runs-on: ubuntu-latest
+    runs-on: runner-cluster-htl-set

     permissions:
       contents: write
.github/workflows/autolabeler.yml (vendored, 50 changes)
@@ -7,7 +7,7 @@ on:

 jobs:
   autolabeler:
-    runs-on: ubuntu-latest
+    runs-on: runner-cluster-htl-set
     permissions:
       pull-requests: write
     env:
@@ -19,7 +19,7 @@ jobs:
       - name: Install minimatch
         run: npm install minimatch

-      - name: Label PR based on config rules
+      - name: Label PR based on file changes and PR template
         uses: actions/github-script@v7
         with:
           script: |
@@ -30,33 +30,61 @@ jobs:
             const configPath = path.resolve(process.env.CONFIG_PATH);
             const fileContent = await fs.readFile(configPath, 'utf-8');
             const autolabelerConfig = JSON.parse(fileContent);

             const prNumber = context.payload.pull_request.number;
+            const prBody = context.payload.pull_request.body.toLowerCase();
+
+            let labelsToAdd = new Set();

             const prListFilesResponse = await github.rest.pulls.listFiles({
               owner: context.repo.owner,
               repo: context.repo.repo,
               pull_number: prNumber,
             });
             const prFiles = prListFilesResponse.data;

+            // Apply labels based on file changes
             for (const [label, rules] of Object.entries(autolabelerConfig)) {
               const shouldAddLabel = prFiles.some((prFile) => {
                 return rules.some((rule) => {
                   const isFileStatusMatch = rule.fileStatus ? rule.fileStatus === prFile.status : true;
                   const isIncludeGlobMatch = rule.includeGlobs.some((glob) => minimatch(prFile.filename, glob));
                   const isExcludeGlobMatch = rule.excludeGlobs.some((glob) => minimatch(prFile.filename, glob));

                   return isFileStatusMatch && isIncludeGlobMatch && !isExcludeGlobMatch;
                 });
               });

               if (shouldAddLabel) {
-                console.log(`Adding label ${label} to PR ${prNumber}`);
-                await github.rest.issues.addLabels({
-                  owner: context.repo.owner,
-                  repo: context.repo.repo,
-                  issue_number: prNumber,
-                  labels: [label],
-                });
+                labelsToAdd.add(label);
               }
             }

+            const templateLabelMappings = {
+              "🐞 **Bug fix**": "bugfix",
+              "✨ **New feature**": "feature",
+              "💥 **Breaking change**": "breaking change",
+            };
+
+            for (const [checkbox, label] of Object.entries(templateLabelMappings)) {
+              const escapedCheckbox = checkbox.replace(/([.*+?^=!:${}()|\[\]\/\\])/g, "\\$1");
+              const regex = new RegExp(`- \\[(x|X)\\]\\s*.*${escapedCheckbox}`, "i");
+              const match = prBody.match(regex);
+              if (match) {
+                console.log(`Match: ${match}`);
+                labelsToAdd.add(label);
+              }
+            }
+
+            console.log(`Labels to add: ${Array.from(labelsToAdd).join(", ")}`);
+
+            if (labelsToAdd.size > 0) {
+              console.log(`Adding labels ${Array.from(labelsToAdd).join(", ")} to PR ${prNumber}`);
+              await github.rest.issues.addLabels({
+                owner: context.repo.owner,
+                repo: context.repo.repo,
+                issue_number: prNumber,
+                labels: Array.from(labelsToAdd),
+              });
+            }
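The new template-based labeling matches ticked checkboxes in the PR body against `templateLabelMappings`. A rough bash equivalent of that check, handy for a quick local sanity test (illustrative only; the workflow itself does this in JavaScript via actions/github-script):

```bash
#!/usr/bin/env bash
# Illustrative check: does a PR body contain a ticked "Bug fix" checkbox?
pr_body='- [x] 🐞 **Bug fix** – Resolves an issue without breaking functionality.'

if echo "$pr_body" | grep -qiE '^- \[(x|X)\].*🐞 \*\*Bug fix\*\*'; then
  echo "would add label: bugfix"
fi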
.github/workflows/changelog-pr.yml (vendored, 96 changes)
@@ -7,7 +7,7 @@ on:

 jobs:
   update-changelog-pull-request:
-    runs-on: ubuntu-latest
+    runs-on: runner-cluster-htl-set
     env:
       CONFIG_PATH: .github/changelog-pr-config.json
       BRANCH_NAME: github-action-update-changelog
@@ -30,7 +30,6 @@ jobs:

       - name: Get latest dates in changelog
         run: |
-          # Extract the two most recent dates from the changelog
           DATES=$(grep -E '^## [0-9]{4}-[0-9]{2}-[0-9]{2}' CHANGELOG.md | head -n 2 | awk '{print $2}')

           LATEST_DATE=$(echo "$DATES" | sed -n '1p')
@@ -55,7 +54,31 @@ jobs:
             const configPath = path.resolve(process.env.CONFIG_PATH);
             const fileContent = await fs.readFile(configPath, 'utf-8');
             const changelogConfig = JSON.parse(fileContent);
-            const categorizedPRs = changelogConfig.map(obj => ({ ...obj, notes: [] }));
+
+            const categorizedPRs = changelogConfig.map(obj => ({
+              ...obj,
+              notes: [],
+              subCategories: obj.subCategories ?? (
+                obj.labels.includes("update script") ? [
+                  { title: "🐞 Bug Fixes", labels: ["bugfix"], notes: [] },
+                  { title: "✨ New Features", labels: ["feature"], notes: [] },
+                  { title: "💥 Breaking Changes", labels: ["breaking change"], notes: [] }
+                ] :
+                obj.labels.includes("maintenance") ? [
+                  { title: "🐞 Bug Fixes", labels: ["bugfix"], notes: [] },
+                  { title: "✨ New Features", labels: ["feature"], notes: [] },
+                  { title: "💥 Breaking Changes", labels: ["breaking change"], notes: [] },
+                  { title: "📡 API", labels: ["api"], notes: [] },
+                  { title: "Github", labels: ["github"], notes: [] }
+                ] :
+                obj.labels.includes("website") ? [
+                  { title: "🐞 Bug Fixes", labels: ["bugfix"], notes: [] },
+                  { title: "✨ New Features", labels: ["feature"], notes: [] },
+                  { title: "💥 Breaking Changes", labels: ["breaking change"], notes: [] },
+                  { title: "Script Information", labels: ["json"], notes: [] }
+                ] : []
+              )
+            }));

             const latestDateInChangelog = new Date(process.env.LATEST_DATE);
             latestDateInChangelog.setUTCHours(23, 59, 59, 999);
@@ -70,24 +93,40 @@ jobs:
               per_page: 100,
             });

             pulls.filter(pr =>
               pr.merged_at &&
               new Date(pr.merged_at) > latestDateInChangelog &&
-              !pr.labels.some(label => ["invalid", "wontdo", process.env.AUTOMATED_PR_LABEL].includes(label.name.toLowerCase()))
+              !pr.labels.some(label =>
+                ["invalid", "wontdo", process.env.AUTOMATED_PR_LABEL].includes(label.name.toLowerCase())
+              )
             ).forEach(pr => {
               const prLabels = pr.labels.map(label => label.name.toLowerCase());
               const prNote = `- ${pr.title} [@${pr.user.login}](https://github.com/${pr.user.login}) ([#${pr.number}](${pr.html_url}))`;

-              for (const { labels, notes } of categorizedPRs) {
-                if (labels.length === 0 || labels.some(label => prLabels.includes(label))) {
-                  notes.push(prNote);
-                  break;
+              const updateScriptsCategory = categorizedPRs.find(category =>
+                category.labels.some(label => prLabels.includes(label))
+              );
+
+              if (updateScriptsCategory) {
+                const subCategory = updateScriptsCategory.subCategories.find(sub =>
+                  sub.labels.some(label => prLabels.includes(label))
+                );
+
+                if (subCategory) {
+                  subCategory.notes.push(prNote);
+                } else {
+                  updateScriptsCategory.notes.push(prNote);
                 }
               }
             });

+            console.log(JSON.stringify(categorizedPRs, null, 2));
+
             return categorizedPRs;

       - name: Update CHANGELOG.md
         uses: actions/github-script@v7
         with:
@@ -100,17 +139,36 @@ jobs:
             const changelogPath = path.resolve('CHANGELOG.md');
             const categorizedPRs = ${{ steps.get-categorized-prs.outputs.result }};

-            let newReleaseNotes = `## ${today}\n\n### Changes\n\n`;
-            for (const { title, notes } of categorizedPRs) {
-              if (notes.length > 0) {
-                newReleaseNotes += `### ${title}\n\n${notes.join("\n")}\n\n`;
-              }
-            }
+            console.log(JSON.stringify(categorizedPRs, null, 2));
+
+            let newReleaseNotes = `## ${today}\n\n`;
+            for (const { title, notes, subCategories } of categorizedPRs) {
+              const hasSubcategories = subCategories && subCategories.length > 0;
+              const hasMainNotes = notes.length > 0;
+              const hasSubNotes = hasSubcategories && subCategories.some(sub => sub.notes && sub.notes.length > 0);
+
+              if (hasMainNotes || hasSubNotes) {
+                newReleaseNotes += `### ${title}\n\n`;
+              }
+
+              if (hasMainNotes) {
+                newReleaseNotes += `  ${notes.join("\n")}\n\n`;
+              }
+              if (hasSubcategories) {
+                for (const { title: subTitle, notes: subNotes } of subCategories) {
+                  if (subNotes && subNotes.length > 0) {
+                    newReleaseNotes += `  - #### ${subTitle}\n\n`;
+                    newReleaseNotes += `    ${subNotes.join("\n    ")}\n\n`;
+                  }
+                }
+              }
+            }

             const changelogContent = await fs.readFile(changelogPath, 'utf-8');
             const changelogIncludesTodaysReleaseNotes = changelogContent.includes(`\n## ${today}`);

-            // Replace or insert the release notes
             const regex = changelogIncludesTodaysReleaseNotes
               ? new RegExp(`## ${today}.*(?=## ${latestDateInChangelog})`, "gs")
               : new RegExp(`(?=## ${latestDateInChangelog})`, "gs");
@@ -165,4 +223,4 @@ jobs:
           PR_NUMBER=$(gh pr list --head "${BRANCH_NAME}" --json number --jq '.[].number')
           if [ -n "$PR_NUMBER" ]; then
             gh pr review $PR_NUMBER --approve
           fi
.github/workflows/close-discussion.yml (vendored, new file, 122 lines)
name: Close Discussion on PR Merge

on:
  pull_request:
    types: [closed]

jobs:
  close-discussion:
    runs-on: runner-cluster-htl-set

    steps:
      - name: Checkout Repository
        uses: actions/checkout@v4

      - name: Set Up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: "20"
      - name: Install Dependencies
        run: npm install zx @octokit/graphql

      - name: Close Discussion
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          PR_BODY: ${{ github.event.pull_request.body }}
          PR_NUMBER: ${{ github.event.pull_request.number }}
          REPO_OWNER: ${{ github.repository_owner }}
          REPO_NAME: ${{ github.event.repository.name }}
        run: |
          npx zx << 'EOF'
          import { graphql } from "@octokit/graphql";
          (async function() {
            try {
              const token = process.env.GITHUB_TOKEN;
              const prBody = process.env.PR_BODY;
              const prNumber = process.env.PR_NUMBER;
              const owner = process.env.REPO_OWNER;
              const repo = process.env.REPO_NAME;

              if (!token || !prBody || !prNumber || !owner || !repo) {
                console.log("Missing required environment variables.");
                process.exit(1);
              }

              const match = prBody.match(/#(\d+)/);
              if (!match) {
                console.log("No discussion ID found in PR body.");
                return;
              }
              const discussionNumber = match[1];

              console.log(`Extracted Discussion Number: ${discussionNumber}`);
              console.log(`PR Number: ${prNumber}`);
              console.log(`Repository: ${owner}/${repo}`);

              const graphqlWithAuth = graphql.defaults({
                headers: { authorization: `Bearer ${token}` },
              });

              const discussionQuery = `
                query($owner: String!, $repo: String!, $number: Int!) {
                  repository(owner: $owner, name: $repo) {
                    discussion(number: $number) {
                      id
                    }
                  }
                }
              `;

              const discussionResponse = await graphqlWithAuth(discussionQuery, {
                owner,
                repo,
                number: parseInt(discussionNumber, 10),
              });

              const discussionQLId = discussionResponse.repository.discussion.id;
              if (!discussionQLId) {
                console.log("Failed to fetch discussion GraphQL ID.");
                return;
              }

              console.log(`GraphQL Discussion ID: ${discussionQLId}`);

              const commentMutation = `
                mutation($discussionId: ID!, $body: String!) {
                  addDiscussionComment(input: { discussionId: $discussionId, body: $body }) {
                    comment { id body }
                  }
                }
              `;

              const commentResponse = await graphqlWithAuth(commentMutation, {
                discussionId: discussionQLId,
                body: `Merged with PR #${prNumber}`,
              });

              const commentId = commentResponse.addDiscussionComment.comment.id;
              if (!commentId) {
                console.log("Failed to post the comment.");
                return;
              }

              console.log(`Comment Posted Successfully! Comment ID: ${commentId}`);

              const markAnswerMutation = `
                mutation($id: ID!) {
                  markDiscussionCommentAsAnswer(input: { id: $id }) {
                    discussion { id title }
                  }
                }
              `;

              await graphqlWithAuth(markAnswerMutation, { id: commentId });

              console.log("Comment marked as answer successfully!");

            } catch (error) {
              console.error("Error:", error);
              return;
            }
          })();
          EOF
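Before any GraphQL call, the step pulls the discussion number out of the PR body with a `/#(\d+)/` match. A bash rendering of the same extraction against a made-up PR body, for illustration:

```bash
pr_body='Fixes the install script. Closes discussion #1234.'

# First "#<digits>" occurrence, mirroring the /#(\d+)/ match in the workflow.
discussion_number=$(echo "$pr_body" | grep -oE '#[0-9]+' | head -n 1 | tr -d '#')
echo "Discussion number: ${discussion_number:-none found}"
```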
.github/workflows/create-docker-for-runner.yml (vendored, new file, 37 lines)
name: Build and Publish Docker Image

on:
  push:
    branches:
      - main
    paths:
      - '.github/runner/docker/**'
  schedule:
    - cron: '0 0 * * *'

jobs:
  build:
    runs-on: ubuntu-latest #To ensure it always builds we use the github runner with all the right tooling

    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Log in to GHCR
        uses: docker/login-action@v2
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Build Docker image
        run: |
          repo_name=${{ github.repository }} # Get repository name
          repo_name_lower=$(echo $repo_name | tr '[:upper:]' '[:lower:]') # Convert to lowercase
          docker build -t ghcr.io/$repo_name_lower/gh-runner-self:latest -f .github/runner/docker/gh-runner-self.dockerfile .

      - name: Push Docker image to GHCR
        run: |
          repo_name=${{ github.repository }} # Get repository name
          repo_name_lower=$(echo $repo_name | tr '[:upper:]' '[:lower:]') # Convert to lowercase
          docker push ghcr.io/$repo_name_lower/gh-runner-self:latest
.github/workflows/delete-json-branch.yml (vendored, 2 changes)
@@ -9,7 +9,7 @@ on:

 jobs:
   delete_branch:
-    runs-on: ubuntu-latest
+    runs-on: runner-cluster-htl-set
     steps:
       - name: Checkout the code
         uses: actions/checkout@v3
.github/workflows/frontend-cicd.yml (vendored, 2 changes)
@@ -27,7 +27,7 @@ concurrency:

 jobs:
   build:
-    runs-on: ubuntu-latest
+    runs-on: runner-cluster-htl-set
     defaults:
       run:
         working-directory: frontend # Set default working directory for all run steps
.github/workflows/github-release.yml (vendored, 29 changes)
@@ -2,24 +2,39 @@ name: Create new release

 on:
   schedule:
-    # Runs "At 00:01 every night" (UTC)
-    - cron: '1 0 * * *'
+    - cron: '1 0 * * *' # Runs nightly
+  workflow_dispatch:

 jobs:
   create-new-release:
-    runs-on: ubuntu-latest
+    runs-on: runner-cluster-htl-set
     permissions:
       contents: write
     steps:
       - name: Checkout code
         uses: actions/checkout@v4

       - name: Parse CHANGELOG.md for yesterday's entries and create a new release
         env:
           GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
         run: |
           YESTERDAY=$(date -u --date="yesterday" +%Y-%m-%d)
-          YESTERDAY_CHANGELOG_NOTES=$(awk '/^## '"$YESTERDAY"'/ {f=1; next} f && /^## [0-9]{4}-[0-9]{2}-[0-9]{2}/ {f=0} f && !/^## / {print}' CHANGELOG.md)
+          awk '/^## '"$YESTERDAY"'/ {f=1; next} f && /^## [0-9]{4}-[0-9]{2}-[0-9]{2}/ {f=0} f && !/^## / {print}' CHANGELOG.md > changelog_tmp.md

-          if [ -n "$YESTERDAY_CHANGELOG_NOTES" ]; then
-            gh release create "$YESTERDAY" -t "$YESTERDAY" -n "$YESTERDAY_CHANGELOG_NOTES" --latest
+          if [ ! -s changelog_tmp.md ]; then
+            echo "No changes found for $YESTERDAY, skipping release."
+            exit 0
           fi
+
+          CHANGELOG_SIZE=$(wc -c < changelog_tmp.md)
+          echo "Changelog size: $CHANGELOG_SIZE bytes"
+
+          # Crop to last 10,000 bytes if too large
+          if [ "$CHANGELOG_SIZE" -gt 10000 ]; then
+            echo "WARNING: Changelog too large, cropping to last 10,000 bytes..."
+            tail -c 10000 changelog_tmp.md > changelog_cropped.md
+            mv changelog_cropped.md changelog_tmp.md
+          fi
+
+          echo "Creating GitHub release for $YESTERDAY..."
+          gh release create "$YESTERDAY" -t "$YESTERDAY" -F changelog_tmp.md
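The awk one-liner above is what selects yesterday's section from CHANGELOG.md. Run against a small made-up changelog in a scratch directory, it shows what ends up in changelog_tmp.md:

```bash
# Demo only: run in a scratch directory so the sample CHANGELOG.md does not clobber a real one.
YESTERDAY="2025-02-27"   # example date; the workflow uses $(date -u --date="yesterday" +%Y-%m-%d)

cat > CHANGELOG.md <<'EOF'
## 2025-02-28
- newer than yesterday, excluded
## 2025-02-27
### 🚀 Updated Scripts
- example entry included in the release notes
## 2025-02-26
- older entry, excluded
EOF

awk '/^## '"$YESTERDAY"'/ {f=1; next} f && /^## [0-9]{4}-[0-9]{2}-[0-9]{2}/ {f=0} f && !/^## / {print}' CHANGELOG.md > changelog_tmp.md
cat changelog_tmp.md   # prints only the 2025-02-27 lines that are not "## " date headings
```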
152
.github/workflows/script-test.yml
vendored
152
.github/workflows/script-test.yml
vendored
@ -13,7 +13,7 @@ jobs:
|
|||||||
run-install-script:
|
run-install-script:
|
||||||
runs-on: pvenode
|
runs-on: pvenode
|
||||||
steps:
|
steps:
|
||||||
- name: Checkout PR branch (supports forks)
|
- name: Checkout PR branch
|
||||||
uses: actions/checkout@v4
|
uses: actions/checkout@v4
|
||||||
with:
|
with:
|
||||||
ref: ${{ github.event.pull_request.head.ref }}
|
ref: ${{ github.event.pull_request.head.ref }}
|
||||||
@ -37,7 +37,8 @@ jobs:
|
|||||||
echo "Changed files: $CHANGED_FILES"
|
echo "Changed files: $CHANGED_FILES"
|
||||||
echo "SCRIPT=$CHANGED_FILES" >> $GITHUB_ENV
|
echo "SCRIPT=$CHANGED_FILES" >> $GITHUB_ENV
|
||||||
env:
|
env:
|
||||||
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||||
|
|
||||||
|
|
||||||
- name: Get scripts
|
- name: Get scripts
|
||||||
id: check-install-script
|
id: check-install-script
|
||||||
@ -61,53 +62,71 @@ jobs:
|
|||||||
id: run-install
|
id: run-install
|
||||||
continue-on-error: true
|
continue-on-error: true
|
||||||
run: |
|
run: |
|
||||||
set +e
|
set +e
|
||||||
#run for each files in /ct
|
#run for each files in /ct
|
||||||
for FILE in ${{ env.ALL_FILES }}; do
|
for FILE in ${{ env.ALL_FILES }}; do
|
||||||
STRIPPED_NAME=$(basename "$FILE" | sed 's/-install//' | sed 's/\.sh$//')
|
STRIPPED_NAME=$(basename "$FILE" | sed 's/-install//' | sed 's/\.sh$//')
|
||||||
echo "Running Test for: $STRIPPED_NAME"
|
echo "Running Test for: $STRIPPED_NAME"
|
||||||
if [[ $FILE =~ ^install/.*-install\.sh$ ]]; then
|
if grep -E -q 'read\s+-r\s+-p\s+".*"\s+\w+' "$FILE"; then
|
||||||
CT_SCRIPT="ct/$STRIPPED_NAME.sh"
|
echo "The script contains an interactive prompt. Skipping execution."
|
||||||
if [[ ! -f $CT_SCRIPT ]]; then
|
|
||||||
echo "No CT script found for $STRIPPED_NAME"
|
|
||||||
ERROR_MSG="No CT script found for $FILE"
|
|
||||||
echo "$ERROR_MSG" > result_$STRIPPED_NAME.log
|
|
||||||
continue
|
continue
|
||||||
fi
|
fi
|
||||||
echo "Found CT script for $STRIPPED_NAME"
|
if [[ $FILE =~ ^install/.*-install\.sh$ ]]; then
|
||||||
chmod +x "$CT_SCRIPT"
|
CT_SCRIPT="ct/$STRIPPED_NAME.sh"
|
||||||
RUNNING_FILE=$CT_SCRIPT
|
if [[ ! -f $CT_SCRIPT ]]; then
|
||||||
elif [[ $FILE =~ ^ct/.*\.sh$ ]]; then
|
echo "No CT script found for $STRIPPED_NAME"
|
||||||
INSTALL_SCRIPT="install/$STRIPPED_NAME-install.sh"
|
ERROR_MSG="No CT script found for $FILE"
|
||||||
if [[ ! -f $INSTALL_SCRIPT ]]; then
|
echo "$ERROR_MSG" > result_$STRIPPED_NAME.log
|
||||||
echo "No install script found for $STRIPPED_NAME"
|
continue
|
||||||
ERROR_MSG="No install script found for $FILE"
|
fi
|
||||||
|
if grep -E -q 'read\s+-r\s+-p\s+".*"\s+\w+' "install/$STRIPPED_NAME-install.sh"; then
|
||||||
|
echo "The script contains an interactive prompt. Skipping execution."
|
||||||
|
continue
|
||||||
|
fi
|
||||||
|
echo "Found CT script for $STRIPPED_NAME"
|
||||||
|
chmod +x "$CT_SCRIPT"
|
||||||
|
RUNNING_FILE=$CT_SCRIPT
|
||||||
|
elif [[ $FILE =~ ^ct/.*\.sh$ ]]; then
|
||||||
|
INSTALL_SCRIPT="install/$STRIPPED_NAME-install.sh"
|
||||||
|
if [[ ! -f $INSTALL_SCRIPT ]]; then
|
||||||
|
echo "No install script found for $STRIPPED_NAME"
|
||||||
|
ERROR_MSG="No install script found for $FILE"
|
||||||
|
echo "$ERROR_MSG" > result_$STRIPPED_NAME.log
|
||||||
|
continue
|
||||||
|
fi
|
||||||
|
echo "Found install script for $STRIPPED_NAME"
|
||||||
|
chmod +x "$INSTALL_SCRIPT"
|
||||||
|
RUNNING_FILE=$FILE
|
||||||
|
if grep -E -q 'read\s+-r\s+-p\s+".*"\s+\w+' "ct/$STRIPPED_NAME.sh"; then
|
||||||
|
echo "The script contains an interactive prompt. Skipping execution."
|
||||||
|
continue
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
git remote add community-scripts https://github.com/community-scripts/ProxmoxVE.git
|
||||||
|
git fetch community-scripts
|
||||||
|
rm -f .github/workflows/scripts/app-test/pr-build.func || true
|
||||||
|
rm -f .github/workflows/scripts/app-test/pr-install.func || true
|
||||||
|
rm -f .github/workflows/scripts/app-test/pr-alpine-install.func || true
|
||||||
|
rm -f .github/workflows/scripts/app-test/pr-create-lxc.sh || true
|
||||||
|
git checkout community-scripts/main -- .github/workflows/scripts/app-test/pr-build.func
|
||||||
|
git checkout community-scripts/main -- .github/workflows/scripts/app-test/pr-install.func
|
||||||
|
git checkout community-scripts/main -- .github/workflows/scripts/app-test/pr-alpine-install.func
|
||||||
|
git checkout community-scripts/main -- .github/workflows/scripts/app-test/pr-create-lxc.sh
|
||||||
|
chmod +x $RUNNING_FILE
|
||||||
|
chmod +x .github/workflows/scripts/app-test/pr-create-lxc.sh
|
||||||
|
chmod +x .github/workflows/scripts/app-test/pr-install.func
|
||||||
|
chmod +x .github/workflows/scripts/app-test/pr-alpine-install.func
|
||||||
|
chmod +x .github/workflows/scripts/app-test/pr-build.func
|
||||||
|
sed -i 's|source <(curl -s https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)|source .github/workflows/scripts/app-test/pr-build.func|g' "$RUNNING_FILE"
|
||||||
|
echo "Executing $RUNNING_FILE"
|
||||||
|
ERROR_MSG=$(./$RUNNING_FILE 2>&1 > /dev/null)
|
||||||
|
echo "Finished running $FILE"
|
||||||
|
if [ -n "$ERROR_MSG" ]; then
|
||||||
|
echo "ERROR in $STRIPPED_NAME: $ERROR_MSG"
|
||||||
echo "$ERROR_MSG" > result_$STRIPPED_NAME.log
|
echo "$ERROR_MSG" > result_$STRIPPED_NAME.log
|
||||||
continue
|
fi
|
||||||
fi
|
done
|
||||||
echo "Found install script for $STRIPPED_NAME"
|
set -e # Restore exit-on-error
|
||||||
chmod +x "$INSTALL_SCRIPT"
|
|
||||||
RUNNING_FILE=$FILE
|
|
||||||
fi
|
|
||||||
git checkout origin/main .github/workflows/scripts/app-test/pr-build.func
|
|
||||||
git checkout origin/main .github/workflows/scripts/app-test/pr-install.func
|
|
||||||
git checkout origin/main .github/workflows/scripts/app-test/pr-alpine-install.func
|
|
||||||
git checkout origin/main .github/workflows/scripts/app-test/pr-create-lxc.sh
|
|
||||||
chmod +x $RUNNING_FILE
|
|
||||||
chmod +x .github/workflows/scripts/app-test/pr-create-lxc.sh
|
|
||||||
chmod +x .github/workflows/scripts/app-test/pr-install.func
|
|
||||||
chmod +x .github/workflows/scripts/app-test/pr-alpine-install.func
|
|
||||||
chmod +x .github/workflows/scripts/app-test/pr-build.func
|
|
||||||
sed -i 's|source <(curl -s https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)|source .github/workflows/scripts/app-test/pr-build.func|g' "$RUNNING_FILE"
|
|
||||||
echo "Executing $RUNNING_FILE"
|
|
||||||
ERROR_MSG=$(./$RUNNING_FILE 2>&1 > /dev/null)
|
|
||||||
echo "Finished running $FILE"
|
|
||||||
if [ -n "$ERROR_MSG" ]; then
|
|
||||||
echo "ERROR in $STRIPPED_NAME: $ERROR_MSG"
|
|
||||||
echo "$ERROR_MSG" > result_$STRIPPED_NAME.log
|
|
||||||
fi
|
|
||||||
done
|
|
||||||
set -e # Restore exit-on-error
|
|
||||||
|
|
||||||
- name: Cleanup PVE Node
|
- name: Cleanup PVE Node
|
||||||
run: |
|
run: |
|
||||||
@ -119,35 +138,40 @@ jobs:
|
|||||||
pct stop $container_id
|
pct stop $container_id
|
||||||
pct destroy $container_id
|
pct destroy $container_id
|
||||||
fi
|
fi
|
||||||
done
|
done
|
||||||
|
|
||||||
- name: Post error comments
|
- name: Post error comments
|
||||||
run: |
|
run: |
|
||||||
ERROR="false"
|
ERROR="false"
|
||||||
SEARCH_LINE=".github/workflows/scripts/app-test/pr-build.func: line 253:"
|
SEARCH_LINE=".github/workflows/scripts/app-test/pr-build.func: line 255:"
|
||||||
|
|
||||||
|
# Get all existing comments on the PR
|
||||||
|
EXISTING_COMMENTS=$(gh pr view ${{ github.event.pull_request.number }} --repo ${{ github.repository }} --json comments --jq '.comments[].body')
|
||||||
|
|
||||||
for FILE in ${{ env.ALL_FILES }}; do
|
for FILE in ${{ env.ALL_FILES }}; do
|
||||||
STRIPPED_NAME=$(basename "$FILE" | sed 's/-install//' | sed 's/\.sh$//')
|
STRIPPED_NAME=$(basename "$FILE" | sed 's/-install//' | sed 's/\.sh$//')
|
||||||
if [[ ! -f result_$STRIPPED_NAME.log ]]; then
|
if [[ ! -f result_$STRIPPED_NAME.log ]]; then
|
||||||
continue
|
continue
|
||||||
fi
|
fi
|
||||||
ERROR_MSG=$(cat result_$STRIPPED_NAME.log)
|
ERROR_MSG=$(cat result_$STRIPPED_NAME.log)
|
||||||
|
|
||||||
if [ -n "$ERROR_MSG" ]; then
|
if [ -n "$ERROR_MSG" ]; then
|
||||||
CLEANED_ERROR_MSG=$(echo "$ERROR_MSG" | sed "s|$SEARCH_LINE.*||")
|
CLEANED_ERROR_MSG=$(echo "$ERROR_MSG" | sed "s|$SEARCH_LINE.*||")
|
||||||
echo "Posting error message for $FILE"
|
COMMENT_BODY=":warning: The script _**$FILE**_ failed with the following message: <br> <div><strong>${CLEANED_ERROR_MSG}</strong></div>"
|
||||||
echo ${CLEANED_ERROR_MSG}
|
|
||||||
gh pr comment ${{ github.event.pull_request.number }} \
|
# Check if the comment already exists
|
||||||
--repo ${{ github.repository }} \
|
if echo "$EXISTING_COMMENTS" | grep -qF "$COMMENT_BODY"; then
|
||||||
--body ":warning: The script _**$FILE**_ failed with the following message: <br> <div><strong>${CLEANED_ERROR_MSG}</strong></div>"
|
echo "Skipping duplicate comment for $FILE"
|
||||||
|
else
|
||||||
|
echo "Posting error message for $FILE"
|
||||||
ERROR="true"
|
gh pr comment ${{ github.event.pull_request.number }} \
|
||||||
|
--repo ${{ github.repository }} \
|
||||||
|
--body "$COMMENT_BODY"
|
||||||
|
ERROR="true"
|
||||||
|
fi
|
||||||
fi
|
fi
|
||||||
done
|
done
|
||||||
|
|
||||||
echo "ERROR=$ERROR" >> $GITHUB_ENV
|
echo "ERROR=$ERROR" >> $GITHUB_ENV
|
||||||
env:
|
|
||||||
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
|
||||||
|
|
||||||
- name: Fail if error
|
|
||||||
if: ${{ env.ERROR == 'true' }}
|
|
||||||
run: exit 1
|
|
||||||
|
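The step above names result logs result_<app>.log and skips a PR comment when an identical body has already been posted. The same guard can be reproduced from a shell with the exact gh calls the workflow uses; in this sketch the PR number, repository script name and error text are placeholders:

  EXISTING_COMMENTS=$(gh pr view 1234 --repo community-scripts/ProxmoxVE --json comments --jq '.comments[].body')
  COMMENT_BODY=":warning: The script _**ct/example.sh**_ failed with the following message: <br> <div><strong>example error</strong></div>"
  if echo "$EXISTING_COMMENTS" | grep -qF "$COMMENT_BODY"; then
    echo "Skipping duplicate comment"          # identical comment already on the PR
  else
    gh pr comment 1234 --repo community-scripts/ProxmoxVE --body "$COMMENT_BODY"
  fi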
.github/workflows/script_format.yml (vendored, new file, 243 lines)
@@ -0,0 +1,243 @@
name: Script Format Check

permissions:
  pull-requests: write

on:
  pull_request_target:
    branches:
      - main
    paths:
      - 'install/*.sh'
      - 'ct/*.sh'

jobs:
  run-install-script:
    runs-on: pvenode
    steps:
      - name: Checkout PR branch (supports forks)
        uses: actions/checkout@v4
        with:
          ref: ${{ github.event.pull_request.head.ref }}
          repository: ${{ github.event.pull_request.head.repo.full_name }}
          fetch-depth: 0

      - name: Add Git safe directory
        run: |
          git config --global --add safe.directory /__w/ProxmoxVE/ProxmoxVE

      - name: Set up GH_TOKEN
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          echo "GH_TOKEN=${GH_TOKEN}" >> $GITHUB_ENV

      - name: Get Changed Files
        run: |
          CHANGED_FILES=$(gh pr diff ${{ github.event.pull_request.number }} --repo ${{ github.repository }} --name-only)
          CHANGED_FILES=$(echo "$CHANGED_FILES" | tr '\n' ' ')
          echo "Changed files: $CHANGED_FILES"
          echo "SCRIPT=$CHANGED_FILES" >> $GITHUB_ENV
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}

      - name: Check scripts
        id: run-install
        continue-on-error: true
        run: |
          for FILE in ${{ env.SCRIPT }}; do
            STRIPPED_NAME=$(basename "$FILE" | sed 's/-install//' | sed 's/\.sh$//')
            echo "Running Test for: $STRIPPED_NAME"
            FILE_STRIPPED="${FILE##*/}"
            LOG_FILE="result_$FILE_STRIPPED.log"

            if [[ $FILE =~ ^ct/.*\.sh$ ]]; then

              FIRST_LINE=$(sed -n '1p' "$FILE")
              [[ "$FIRST_LINE" != "#!/usr/bin/env bash" ]] && echo "Line 1 was $FIRST_LINE | Should be: #!/usr/bin/env bash" >> "$LOG_FILE"
              SECOND_LINE=$(sed -n '2p' "$FILE")
              [[ "$SECOND_LINE" != "source <(curl -s https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)" ]] &&
                echo "Line 2 was $SECOND_LINE | Should be: source <(curl -s https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)" >> "$LOG_FILE"
              THIRD_LINE=$(sed -n '3p' "$FILE")
              if ! [[ "$THIRD_LINE" =~ ^#\ Copyright\ \(c\)\ [0-9]{4}-[0-9]{4}\ community-scripts\ ORG$ || "$THIRD_LINE" =~ ^Copyright\ \(c\)\ [0-9]{4}-[0-9]{4}\ tteck$ ]]; then
                echo "Line 3 was $THIRD_LINE | Should be: # Copyright (c) 2021-2025 community-scripts ORG" >> "$LOG_FILE"
              fi

              EXPECTED_AUTHOR="# Author:"
              EXPECTED_LICENSE="# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE"
              EXPECTED_SOURCE="# Source:"
              EXPECTED_EMPTY=""

              for i in {4..7}; do
                LINE=$(sed -n "${i}p" "$FILE")

                case $i in
                  4)
                    [[ $LINE == $EXPECTED_AUTHOR* ]] || printf "Line %d was: '%s' | Should start with: '%s'\n" "$i" "$LINE" "$EXPECTED_AUTHOR" >> $LOG_FILE
                    ;;
                  5)
                    [[ "$LINE" == "$EXPECTED_LICENSE" ]] || printf "Line %d was: '%s' | Should be: '%s'\n" "$i" "$LINE" "$EXPECTED_LICENSE" >> $LOG_FILE
                    ;;
                  6)
                    [[ $LINE == $EXPECTED_SOURCE* ]] || printf "Line %d was: '%s' | Should start with: '%s'\n" "$i" "$LINE" "$EXPECTED_SOURCE" >> $LOG_FILE
                    ;;
                  7)
                    [[ -z $LINE ]] || printf "Line %d was: '%s' | Should be empty\n" "$i" "$LINE" >> $LOG_FILE
                    ;;
                esac
              done

              EXPECTED_PREFIXES=(
                "APP="
                "var_tags="
                "var_cpu=" # Must be a number
                "var_ram=" # Must be a number
                "var_disk=" # Must be a number
                "var_os=" # Must be debian, alpine, or ubuntu
                "var_version="
                "var_unprivileged=" # Must be 0 or 1
              )

              for i in {8..15}; do
                LINE=$(sed -n "${i}p" "$FILE")
                INDEX=$((i - 8))

                case $INDEX in
                  2|3|4) # var_cpu, var_ram, var_disk (must be numbers)
                    if [[ "$LINE" =~ ^${EXPECTED_PREFIXES[$INDEX]}([0-9]+)$ ]]; then
                      continue # Valid
                    else
                      echo "Line $i was '$LINE' | Should be: '${EXPECTED_PREFIXES[$INDEX]}<NUMBER>'" >> "$LOG_FILE"
                    fi
                    ;;
                  5) # var_os (must be debian, alpine, or ubuntu)
                    if [[ "$LINE" =~ ^var_os=(debian|alpine|ubuntu)$ ]]; then
                      continue # Valid
                    else
                      echo "Line $i was '$LINE' | Should be: 'var_os=[debian|alpine|ubuntu]'" >> "$LOG_FILE"
                    fi
                    ;;
                  7) # var_unprivileged (must be 0 or 1)
                    if [[ "$LINE" =~ ^var_unprivileged=[01]$ ]]; then
                      continue # Valid
                    else
                      echo "Line $i was '$LINE' | Should be: 'var_unprivileged=[0|1]'" >> "$LOG_FILE"
                    fi
                    ;;
                  *) # Other lines (must start with expected prefix)
                    if [[ "$LINE" == ${EXPECTED_PREFIXES[$INDEX]}* ]]; then
                      continue # Valid
                    else
                      echo "Line $i was '$LINE' | Should start with '${EXPECTED_PREFIXES[$INDEX]}'" >> "$LOG_FILE"
                    fi
                    ;;
                esac
              done

              for i in {16..20}; do
                LINE=$(sed -n "${i}p" "$FILE")
                EXPECTED=(
                  "header_info \"$APP\""
                  "variables"
                  "color"
                  "catch_errors"
                  "function update_script() {"
                )
                [[ "$LINE" != "${EXPECTED[$((i-16))]}" ]] && echo "Line $i was $LINE | Should be: ${EXPECTED[$((i-16))]}" >> "$LOG_FILE"
              done
              cat "$LOG_FILE"

            elif [[ $FILE =~ ^install/.*-install\.sh$ ]]; then

              FIRST_LINE=$(sed -n '1p' "$FILE")
              [[ "$FIRST_LINE" != "#!/usr/bin/env bash" ]] && echo "Line 1 was $FIRST_LINE | Should be: #!/usr/bin/env bash" >> "$LOG_FILE"

              SECOND_LINE=$(sed -n '2p' "$FILE")
              [[ -n "$SECOND_LINE" ]] && echo "Line 2 should be empty" >> "$LOG_FILE"

              THIRD_LINE=$(sed -n '3p' "$FILE")
              if ! [[ "$THIRD_LINE" =~ ^#\ Copyright\ \(c\)\ [0-9]{4}-[0-9]{4}\ community-scripts\ ORG$ || "$THIRD_LINE" =~ ^Copyright\ \(c\)\ [0-9]{4}-[0-9]{4}\ tteck$ ]]; then
                echo "Line 3 was $THIRD_LINE | Should be: # Copyright (c) 2021-2025 community-scripts ORG" >> "$LOG_FILE"
              fi

              EXPECTED_AUTHOR="# Author:"
              EXPECTED_LICENSE="# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE"
              EXPECTED_SOURCE="# Source:"
              EXPECTED_EMPTY=""

              for i in {4..7}; do
                LINE=$(sed -n "${i}p" "$FILE")

                case $i in
                  4)
                    [[ $LINE == $EXPECTED_AUTHOR* ]] || printf "Line %d was: '%s' | Should start with: '%s'\n" "$i" "$LINE" "$EXPECTED_AUTHOR" >> $LOG_FILE
                    ;;
                  5)
                    [[ "$LINE" == "$EXPECTED_LICENSE" ]] || printf "Line %d was: '%s' | Should be: '%s'\n" "$i" "$LINE" "$EXPECTED_LICENSE" >> $LOG_FILE
                    ;;
                  6)
                    [[ $LINE == $EXPECTED_SOURCE* ]] || printf "Line %d was: '%s' | Should start with: '%s'\n" "$i" "$LINE" "$EXPECTED_SOURCE" >> $LOG_FILE
                    ;;
                  7)
                    [[ -z $LINE ]] || printf "Line %d was: '%s' | Should be empty\n" "$i" "$LINE" >> $LOG_FILE
                    ;;
                esac
              done

              [[ "$(sed -n '8p' "$FILE")" != 'source /dev/stdin <<< "$FUNCTIONS_FILE_PATH"' ]] && echo 'Line 8 should be: source /dev/stdin <<< "$FUNCTIONS_FILE_PATH"' >> "$LOG_FILE"

              for i in {9..14}; do
                LINE=$(sed -n "${i}p" "$FILE")
                EXPECTED=(
                  "color"
                  "verb_ip6"
                  "catch_errors"
                  "setting_up_container"
                  "network_check"
                  "update_os"
                )
                [[ "$LINE" != "${EXPECTED[$((i-9))]}" ]] && echo "Line $i was $LINE | Should be: ${EXPECTED[$((i-9))]}" >> "$LOG_FILE"
              done

              [[ -n "$(sed -n '15p' "$FILE")" ]] && echo "Line 15 should be empty" >> "$LOG_FILE"
              [[ "$(sed -n '16p' "$FILE")" != 'msg_info "Installing Dependencies"' ]] && echo 'Line 16 should be: msg_info "Installing Dependencies"' >> "$LOG_FILE"

              LAST_3_LINES=$(tail -n 3 "$FILE")
              [[ "$LAST_3_LINES" != *"$STD apt-get -y autoremove"* ]] && echo 'Third to last line should be: $STD apt-get -y autoremove' >> "$LOG_FILE"
              [[ "$LAST_3_LINES" != *"$STD apt-get -y autoclean"* ]] && echo 'Second to last line should be: $STD apt-get -y clean' >> "$LOG_FILE"
              [[ "$LAST_3_LINES" != *'msg_ok "Cleaned"'* ]] && echo 'Last line should be: msg_ok "Cleaned"' >> "$LOG_FILE"
              cat "$LOG_FILE"
            fi

          done

      - name: Post error comments
        run: |
          ERROR="false"
          for FILE in ${{ env.SCRIPT }}; do
            FILE_STRIPPED="${FILE##*/}"
            LOG_FILE="result_$FILE_STRIPPED.log"
            echo $LOG_FILE
            if [[ ! -f $LOG_FILE ]]; then
              continue
            fi
            ERROR_MSG=$(cat $LOG_FILE)

            if [ -n "$ERROR_MSG" ]; then
              echo "Posting error message for $FILE"
              echo ${ERROR_MSG}
              gh pr comment ${{ github.event.pull_request.number }} \
                --repo ${{ github.repository }} \
                --body ":warning: The script _**$FILE**_ has the following formatting errors: <br> <div><strong>${ERROR_MSG}</strong></div>"

              ERROR="true"
            fi
          done
          echo "ERROR=$ERROR" >> $GITHUB_ENV
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}

      - name: Fail if error
        if: ${{ env.ERROR == 'true' }}
        run: exit 1
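To make the expectations above concrete, here is a sketch of a ct/ script header that these checks would accept; the application name, author, source URL and resource values are invented placeholders, and only the first 20 lines that the workflow inspects are shown:

  #!/usr/bin/env bash
  source <(curl -s https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)
  # Copyright (c) 2021-2025 community-scripts ORG
  # Author: jane-doe (placeholder)
  # License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
  # Source: https://example.com/exampleapp

  APP="ExampleApp"
  var_tags="example"
  var_cpu=1
  var_ram=512
  var_disk=4
  var_os=debian
  var_version=12
  var_unprivileged=1

  header_info "$APP"
  variables
  color
  catch_errors
  function update_script() {

Note that var_cpu, var_ram, var_disk, var_os and var_unprivileged must match the regexes exactly, so quotes or trailing comments on those lines would be reported as formatting errors.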
@@ -1,6 +1,6 @@
 #!/usr/bin/env bash
 # Copyright (c) 2021-2025 community-scripts ORG
-# Author: michelroegl-brunner
+# Author: Michel Roegl-Brunner (michelroegl-brunner)
 # License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE

 color() {
@@ -11,7 +11,7 @@ catch_errors() {
   trap 'error_handler $LINENO "$BASH_COMMAND"' ERR
 }

+# This function handles errors
 error_handler() {
   local line_number="$1"
   local command="$2"
@@ -21,8 +21,8 @@ error_handler() {
   exit 0
 }
 verb_ip6() {
   STD=""
   return
 }

 msg_info() {
@@ -30,13 +30,13 @@ msg_info() {
   echo -ne "${msg}\n"
 }

 msg_ok() {
   local msg="$1"
   echo -e "${msg}\n"
 }

 msg_error() {
   local msg="$1"
   echo -e "${msg}\n"
 }
@@ -71,7 +71,7 @@ network_check() {
 }

 update_os() {
   msg_info "Updating Container OS"
   apk update
   apk upgrade
   msg_ok "Updated Container OS"
@@ -82,7 +82,5 @@ motd_ssh() {
 }

 customize() {
   return
 }
.github/workflows/scripts/app-test/pr-build.func (vendored, 36 lines changed)
@@ -6,12 +6,13 @@
 variables() {
   NSAPP=$(echo ${APP,,} | tr -d ' ') # This function sets the NSAPP variable by converting the value of the APP variable to lowercase and removing any spaces.
   var_install="${NSAPP}-install"     # sets the var_install variable by appending "-install" to the value of NSAPP.

 }

 NEXTID=$(pvesh get /cluster/nextid)
 timezone=$(cat /etc/timezone)
-header_info(){
+header_info() {
   return
 }

 base_settings() {
@@ -20,10 +21,10 @@ base_settings() {
   DISK_SIZE="4"
   CORE_COUNT="1"
   RAM_SIZE="1024"
-  VERBOSE="${1:-no}"
+  VERBOSE="no"
   PW=""
   CT_ID=$NEXTID
-  HN="Testing"
+  HN=$NSAPP
   BRG="vmbr0"
   NET="dhcp"
   GATE=""
@@ -106,7 +107,7 @@ catch_errors() {
 }

 # This function handles errors
 error_handler() {
   local line_number="$1"
   local command="$2"
   SCRIPT_NAME=$(basename "$0")
@@ -120,17 +121,17 @@ msg_info() {
   echo -ne "${msg}\n"
 }

 msg_ok() {
   local msg="$1"
   echo -e "${msg}\n"
 }

 msg_error() {
   local msg="$1"
   echo -e "${msg}\n"
 }
-start(){
+start() {
   base_settings
   return
 }
@@ -146,9 +147,9 @@ build_container() {
   TEMP_DIR=$(mktemp -d)
   pushd $TEMP_DIR >/dev/null
   if [ "$var_os" == "alpine" ]; then
-    export FUNCTIONS_FILE_PATH="$(cat /root/actions-runner/_work/ProxmoxVE/ProxmoxVE/.github/workflows/scripts/app-test/pr-alpine-install.func)"
+    export FUNCTIONS_FILE_PATH="$(curl -s https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/.github/workflows/scripts/app-test/pr-alpine-install.func)"
   else
-    export FUNCTIONS_FILE_PATH="$(cat /root/actions-runner/_work/ProxmoxVE/ProxmoxVE/.github/workflows/scripts/app-test/pr-install.func)"
+    export FUNCTIONS_FILE_PATH="$(curl -s https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/.github/workflows/scripts/app-test/pr-install.func)"
   fi

   export CACHER="$APT_CACHER"
@@ -182,9 +183,8 @@ build_container() {
   "
   echo "Container ID: $CTID"

   # This executes create_lxc.sh and creates the container and .conf file
-  bash /root/actions-runner/_work/ProxmoxVE/ProxmoxVE/.github/workflows/scripts/app-test/pr-create-lxc.sh
+  bash -c "$(wget -qLO - https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/.github/workflows/scripts/app-test/pr-create-lxc.sh)"

   LXC_CONFIG=/etc/pve/lxc/${CTID}.conf
   if [ "$CT_TYPE" == "0" ]; then
@@ -233,6 +233,7 @@ EOF
   fi
   fi
   fi

   # This starts the container and executes <app>-install.sh
   msg_info "Starting LXC Container"
   pct start "$CTID"
@@ -242,7 +243,7 @@ EOF
     msg_error "No install script found for $APP"
     exit 1
   fi
   if [ "$var_os" == "alpine" ]; then
     sleep 3
     pct exec "$CTID" -- /bin/sh -c 'cat <<EOF >/etc/apk/repositories
 http://dl-cdn.alpinelinux.org/alpine/latest-stable/main
@@ -250,11 +251,10 @@ http://dl-cdn.alpinelinux.org/alpine/latest-stable/community
 EOF'
     pct exec "$CTID" -- ash -c "apk add bash >/dev/null"
   fi
-  lxc-attach -n "$CTID" -- bash -c "$(cat /root/actions-runner/_work/ProxmoxVE/ProxmoxVE/install/$var_install.sh)" $var_install.sh
+  lxc-attach -n "$CTID" -- bash -c "$(cat /root/actions-runner/_work/ProxmoxVE/ProxmoxVE/install/$var_install.sh)"

 }

-description(){
+description() {
   IP=$(pct exec "$CTID" ip a s dev eth0 | awk '/inet / {print $2}' | cut -d/ -f1)
-  return
 }
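In short, pr-build.func now fetches the stub function file over HTTPS and hands it to the install script through FUNCTIONS_FILE_PATH. A rough sketch of that handshake, assuming the non-Alpine path (the list of function calls mirrors what the format check expects on lines 8-14 of an install script):

  # On the PVE node, inside pr-build.func:
  export FUNCTIONS_FILE_PATH="$(curl -s https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/.github/workflows/scripts/app-test/pr-install.func)"

  # Inside the container, <app>-install.sh then loads the test stubs instead of install.func:
  source /dev/stdin <<< "$FUNCTIONS_FILE_PATH"
  color
  verb_ip6
  catch_errors
  setting_up_container
  network_check
  update_os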
@@ -11,8 +11,9 @@ catch_errors() {
   trap 'error_handler $LINENO "$BASH_COMMAND"' ERR
 }

+# This function handles errors
 error_handler() {
   local exit_code="$?"
   local line_number="$1"
   local command="$2"
   local error_message="Failure in line $line_number: exit code $exit_code: while executing command $command"
@@ -20,7 +21,7 @@ error_handler() {
   exit 100
 }
 verb_ip6() {
   return
 }

 msg_info() {
@@ -28,18 +29,17 @@ msg_info() {
   echo -ne "${msg}\n"
 }

 msg_ok() {
   local msg="$1"
   echo -e "${msg}\n"
 }

 msg_error() {
   local msg="$1"
   echo -e "${msg}\n"
 }

 VALIDCT=$(pvesm status -content rootdir | awk 'NR>1')
 if [ -z "$VALIDCT" ]; then
   msg_error "Unable to detect a valid Container Storage location."
@@ -64,9 +64,12 @@ function select_storage() {
     CONTENT='vztmpl'
     CONTENT_LABEL='Container template'
     ;;
-  *) false || { msg_error "Invalid storage class."; exit 201; };;
+  *) false || {
+    msg_error "Invalid storage class."
+    exit 201
+  } ;;
   esac

   # This Queries all storage locations
   local -a MENU
   while read -r line; do
@@ -80,23 +83,32 @@ function select_storage() {
     fi
     MENU+=("$TAG" "$ITEM" "OFF")
   done < <(pvesm status -content $CONTENT | awk 'NR>1')

   # Select storage location
-  if [ $((${#MENU[@]}/3)) -eq 1 ]; then
+  if [ $((${#MENU[@]} / 3)) -eq 1 ]; then
     printf ${MENU[0]}
   else
     msg_error "STORAGE ISSUES!"
     exit 202
   fi
 }

-[[ "${CTID:-}" ]] || { msg_error "You need to set 'CTID' variable."; exit 203; }
-[[ "${PCT_OSTYPE:-}" ]] || { msg_error "You need to set 'PCT_OSTYPE' variable."; exit 204; }
-[ "$CTID" -ge "100" ] || { msg_error "ID cannot be less than 100."; exit 205; }
+[[ "${CTID:-}" ]] || {
+  msg_error "You need to set 'CTID' variable."
+  exit 203
+}
+[[ "${PCT_OSTYPE:-}" ]] || {
+  msg_error "You need to set 'PCT_OSTYPE' variable."
+  exit 204
+}
+
+# Test if ID is valid
+[ "$CTID" -ge "100" ] || {
+  msg_error "ID cannot be less than 100."
+  exit 205
+}
+
+# Test if ID is in use
 if pct status $CTID &>/dev/null; then
   echo -e "ID '$CTID' is already in use."
   unset CTID
@@ -110,10 +122,12 @@ CONTAINER_STORAGE=$(select_storage container) || exit

 pveam update >/dev/null

 TEMPLATE_SEARCH=${PCT_OSTYPE}-${PCT_OSVERSION:-}
 mapfile -t TEMPLATES < <(pveam available -section system | sed -n "s/.*\($TEMPLATE_SEARCH.*\)/\1/p" | sort -t - -k 2 -V)
-[ ${#TEMPLATES[@]} -gt 0 ] || { msg_error "Unable to find a template when searching for '$TEMPLATE_SEARCH'."; exit 207; }
+[ ${#TEMPLATES[@]} -gt 0 ] || {
+  msg_error "Unable to find a template when searching for '$TEMPLATE_SEARCH'."
+  exit 207
+}
 TEMPLATE="${TEMPLATES[-1]}"

 TEMPLATE_PATH="/var/lib/vz/template/cache/$TEMPLATE"
@@ -121,28 +135,29 @@ TEMPLATE_PATH="/var/lib/vz/template/cache/$TEMPLATE"
 if ! pveam list "$TEMPLATE_STORAGE" | grep -q "$TEMPLATE"; then
   [[ -f "$TEMPLATE_PATH" ]] && rm -f "$TEMPLATE_PATH"
   pveam download "$TEMPLATE_STORAGE" "$TEMPLATE" >/dev/null ||
-    { msg_error "A problem occurred while downloading the LXC template."; exit 208; }
+    {
+      msg_error "A problem occurred while downloading the LXC template."
+      exit 208
+    }
 fi

-grep -q "root:100000:65536" /etc/subuid || echo "root:100000:65536" >> /etc/subuid
-grep -q "root:100000:65536" /etc/subgid || echo "root:100000:65536" >> /etc/subgid
+grep -q "root:100000:65536" /etc/subuid || echo "root:100000:65536" >>/etc/subuid
+grep -q "root:100000:65536" /etc/subgid || echo "root:100000:65536" >>/etc/subgid

 PCT_OPTIONS=(${PCT_OPTIONS[@]:-${DEFAULT_PCT_OPTIONS[@]}})
 [[ " ${PCT_OPTIONS[@]} " =~ " -rootfs " ]] || PCT_OPTIONS+=(-rootfs "$CONTAINER_STORAGE:${PCT_DISK_SIZE:-8}")

 if ! pct create "$CTID" "${TEMPLATE_STORAGE}:vztmpl/${TEMPLATE}" "${PCT_OPTIONS[@]}" &>/dev/null; then
   [[ -f "$TEMPLATE_PATH" ]] && rm -f "$TEMPLATE_PATH"

   pveam download "$TEMPLATE_STORAGE" "$TEMPLATE" >/dev/null ||
-    { msg_error "A problem occurred while re-downloading the LXC template."; exit 208; }
+    {
+      msg_error "A problem occurred while re-downloading the LXC template."
+      exit 208
+    }

   if ! pct create "$CTID" "${TEMPLATE_STORAGE}:vztmpl/${TEMPLATE}" "${PCT_OPTIONS[@]}" &>/dev/null; then
     msg_error "A problem occurred while trying to create container after re-downloading template."
     exit 200
   fi
 fi
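pr-create-lxc.sh is driven entirely by environment variables; a hypothetical manual invocation on a test node could look like the sketch below. The values are placeholders, and in CI the variables are set by pr-build.func before the script is fetched and run:

  export CTID=200            # must be set (exit 203), >= 100 (exit 205) and not already in use
  export PCT_OSTYPE=debian   # the template search pattern is ${PCT_OSTYPE}-${PCT_OSVERSION:-}
  export PCT_OSVERSION=12
  export PCT_DISK_SIZE=8     # used for the default -rootfs option when PCT_OPTIONS sets none
  bash .github/workflows/scripts/app-test/pr-create-lxc.sh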
@@ -1,31 +1,31 @@
 #!/usr/bin/env bash
 # Copyright (c) 2021-2025 community-scripts ORG
-# Author: michelroegl-brunner
+# Author: Michel Roegl-Brunner (michelroegl-brunner)
 # License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE

 color() {
   return
 }

-SCRIPT_NAME="${BASH_SOURCE[0]:-unknown_script}"
 catch_errors() {
-  set -Euoe pipefail
+  set -Euo pipefail
   trap 'error_handler $LINENO "$BASH_COMMAND"' ERR
 }

 error_handler() {
   local line_number="$1"
   local command="$2"
-  local error_message="$SCRIPT_NAME: Failure in line $line_number while executing command '$command'"
+  local error_message="Failure in line $line_number while executing command '$command'"
-  echo -e "\n$error_message"
+  echo -e "\n$error_message\n" >&2
-  exit 300
+  exit 1
 }

 verb_ip6() {
   STD="silent"
   silent() {
     "$@" >/dev/null 2>&1 || error_handler "${BASH_LINENO[0]}" "$*"
   }
+  return
 }

 msg_info() {
@@ -33,19 +33,21 @@ msg_info() {
   echo -ne "${msg}\n"
 }

 msg_ok() {
   local msg="$1"
   echo -e "${msg}\n"
 }

 msg_error() {
   local msg="$1"
   echo -e "${msg}\n"
 }

 RETRY_NUM=10
 RETRY_EVERY=3
 setting_up_container() {
   sed -i "/$LANG/ s/\(^# \)//" /etc/locale.gen
   locale_line=$(grep -v '^#' /etc/locale.gen | grep -E '^[a-zA-Z]' | awk '{print $1}' | head -n 1)
   echo "LANG=${locale_line}" >/etc/default/locale
@@ -53,12 +55,11 @@ setting_up_container() {
   export LANG=${locale_line}
   echo $tz >/etc/timezone
   ln -sf /usr/share/zoneinfo/$tz /etc/localtime

   for ((i = RETRY_NUM; i > 0; i--)); do
     if [ "$(hostname -I)" != "" ]; then
       break
     fi
-    echo 1>&2 -en "No Network! "
     sleep $RETRY_EVERY
   done
   if [ "$(hostname -I)" = "" ]; then
@@ -68,8 +69,6 @@ setting_up_container() {
   fi
   rm -rf /usr/lib/python3.*/EXTERNALLY-MANAGED
   systemctl disable -q --now systemd-networkd-wait-online.service
-  msg_ok "Set up Container OS"
-  msg_ok "Network Connected: $(hostname -I)"
 }

 network_check() {
@@ -79,11 +78,10 @@ network_check() {
 }

 update_os() {
-  msg_info "Updating Container OS"
+  export DEBIAN_FRONTEND=noninteractive
-  apt-get update
+  apt-get update >/dev/null 2>&1
-  apt-get -o Dpkg::Options::="--force-confold" -y dist-upgrade
+  apt-get -o Dpkg::Options::="--force-confold" -y dist-upgrade >/dev/null
   rm -rf /usr/lib/python3.*/EXTERNALLY-MANAGED
-  msg_ok "Updated Container OS"
 }

 motd_ssh() {
@@ -91,5 +89,5 @@ motd_ssh() {
 }

 customize() {
   return
 }
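The silent()/STD pair defined above is what lets install scripts run quietly under test. Assuming these helpers have been sourced into an <app>-install.sh, a typical usage block follows the pattern the format checker expects; the package name here is a placeholder:

  msg_info "Installing Dependencies"
  $STD apt-get install -y curl   # expands to: silent apt-get install -y curl
  $STD apt-get -y autoremove
  $STD apt-get -y autoclean
  msg_ok "Cleaned"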
.github/workflows/update-json-date.yml (vendored, 2 lines changed)
@@ -10,7 +10,7 @@ on:

 jobs:
   update-app-files:
-    runs-on: ubuntu-latest
+    runs-on: runner-cluster-htl-set

     permissions:
       contents: write
.github/workflows/validate-filenames.yml (vendored, 2 lines changed)
@@ -10,7 +10,7 @@ on:
 jobs:
   check-files:
     name: Check changed files
-    runs-on: ubuntu-latest
+    runs-on: runner-cluster-htl-set
     permissions:
       pull-requests: write
.editorconfig → .vscode/.editorconfig (vendored, renamed, 0 lines changed)
CHANGELOG.md (306 lines changed)
@@ -17,6 +17,312 @@ All LXC instances created using this repository come pre-installed with Midnight
|
|||||||
Do not break established syntax in this file, as it is automatically updated by a Github Workflow
|
Do not break established syntax in this file, as it is automatically updated by a Github Workflow
|
||||||
|
|
||||||
|
|
||||||
|
## 2025-02-28
|
||||||
|
|
||||||
|
### 🧰 Maintenance
|
||||||
|
|
||||||
|
- #### ✨ New Features
|
||||||
|
|
||||||
|
- Shell Format Workflow [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#2400](https://github.com/community-scripts/ProxmoxVE/pull/2400))
|
||||||
|
|
||||||
|
- #### 📂 Github
|
||||||
|
|
||||||
|
- Update all Action to new selfhosted Runner Cluster [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#2739](https://github.com/community-scripts/ProxmoxVE/pull/2739))
|
||||||
|
- Update Script Test Workflow [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#2741](https://github.com/community-scripts/ProxmoxVE/pull/2741))
|
||||||
|
|
||||||
|
## 2025-02-27
|
||||||
|
|
||||||
|
### 🆕 New Scripts
|
||||||
|
|
||||||
|
- web-check [@CrazyWolf13](https://github.com/CrazyWolf13) ([#2662](https://github.com/community-scripts/ProxmoxVE/pull/2662))
|
||||||
|
- Pelican Panel [@bvdberg01](https://github.com/bvdberg01) ([#2678](https://github.com/community-scripts/ProxmoxVE/pull/2678))
|
||||||
|
- Pelican Wings [@bvdberg01](https://github.com/bvdberg01) ([#2677](https://github.com/community-scripts/ProxmoxVE/pull/2677))
|
||||||
|
- ByteStash [@tremor021](https://github.com/tremor021) ([#2680](https://github.com/community-scripts/ProxmoxVE/pull/2680))
|
||||||
|
|
||||||
|
### 🚀 Updated Scripts
|
||||||
|
|
||||||
|
- ByteStash: Removed sed, app supports Node v22 now [@tremor021](https://github.com/tremor021) ([#2728](https://github.com/community-scripts/ProxmoxVE/pull/2728))
|
||||||
|
- Keycloak: Update installation script [@tremor021](https://github.com/tremor021) ([#2714](https://github.com/community-scripts/ProxmoxVE/pull/2714))
|
||||||
|
- ByteStash: Fix Node 22 compatibility (thanks t2lc) [@tremor021](https://github.com/tremor021) ([#2705](https://github.com/community-scripts/ProxmoxVE/pull/2705))
|
||||||
|
|
||||||
|
- #### 🐞 Bug Fixes
|
||||||
|
|
||||||
|
- EOF not detected [@CrazyWolf13](https://github.com/CrazyWolf13) ([#2726](https://github.com/community-scripts/ProxmoxVE/pull/2726))
|
||||||
|
- Zitadel-install.sh: Remove one version file and update to our standard [@bvdberg01](https://github.com/bvdberg01) ([#2710](https://github.com/community-scripts/ProxmoxVE/pull/2710))
|
||||||
|
- Outline: Change key to hex32 [@tremor021](https://github.com/tremor021) ([#2709](https://github.com/community-scripts/ProxmoxVE/pull/2709))
|
||||||
|
- Typo in update scripts [@bvdberg01](https://github.com/bvdberg01) ([#2707](https://github.com/community-scripts/ProxmoxVE/pull/2707))
|
||||||
|
- SFTPGo Remove unneeded RELEASE variable [@MickLesk](https://github.com/MickLesk) ([#2683](https://github.com/community-scripts/ProxmoxVE/pull/2683))
|
||||||
|
|
||||||
|
### 🧰 Maintenance
|
||||||
|
|
||||||
|
- #### 🐞 Bug Fixes
|
||||||
|
|
||||||
|
- Update install.func: Change Line Number for Error message. [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#2690](https://github.com/community-scripts/ProxmoxVE/pull/2690))
|
||||||
|
|
||||||
|
- #### 📂 Github
|
||||||
|
|
||||||
|
- New Workflow to close Script Request Discussions on PR merge [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#2688](https://github.com/community-scripts/ProxmoxVE/pull/2688))
|
||||||
|
- Improve Script-Test Workflow [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#2712](https://github.com/community-scripts/ProxmoxVE/pull/2712))
|
||||||
|
- Switch all actions to self-hosted Runners [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#2711](https://github.com/community-scripts/ProxmoxVE/pull/2711))
|
||||||
|
|
||||||
|
### 🌐 Website
|
||||||
|
|
||||||
|
- #### ✨ New Features
|
||||||
|
|
||||||
|
- Use HTML button element for copying to clipboard [@scallaway](https://github.com/scallaway) ([#2720](https://github.com/community-scripts/ProxmoxVE/pull/2720))
|
||||||
|
- Add basic pagination to Data Viewer [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#2715](https://github.com/community-scripts/ProxmoxVE/pull/2715))
|
||||||
|
|
||||||
|
- #### 📝 Script Information
|
||||||
|
|
||||||
|
- wger - Add HTTPS instructions to the website [@tremor021](https://github.com/tremor021) ([#2695](https://github.com/community-scripts/ProxmoxVE/pull/2695))
|
||||||
|
|
||||||
|
## 2025-02-26
|
||||||
|
|
||||||
|
### 🆕 New Scripts
|
||||||
|
|
||||||
|
- New Script: Outline [@tremor021](https://github.com/tremor021) ([#2653](https://github.com/community-scripts/ProxmoxVE/pull/2653))
|
||||||
|
|
||||||
|
### 🚀 Updated Scripts
|
||||||
|
|
||||||
|
- Fix: SABnzbd - Removed few artefacts in the code preventing the update [@tremor021](https://github.com/tremor021) ([#2670](https://github.com/community-scripts/ProxmoxVE/pull/2670))
|
||||||
|
|
||||||
|
- #### 🐞 Bug Fixes
|
||||||
|
|
||||||
|
- Fix: Homarr - Manually correct db-migration wrong-folder [@CrazyWolf13](https://github.com/CrazyWolf13) ([#2676](https://github.com/community-scripts/ProxmoxVE/pull/2676))
|
||||||
|
- Kimai: add local.yaml & fix path permissions [@MickLesk](https://github.com/MickLesk) ([#2646](https://github.com/community-scripts/ProxmoxVE/pull/2646))
|
||||||
|
- PiHole: Fix Unbound sed for DNS [@MickLesk](https://github.com/MickLesk) ([#2647](https://github.com/community-scripts/ProxmoxVE/pull/2647))
|
||||||
|
- Alpine IT-Tools fix typo "unexpected EOF while looking for matching `"' [@MickLesk](https://github.com/MickLesk) ([#2644](https://github.com/community-scripts/ProxmoxVE/pull/2644))
|
||||||
|
|
||||||
|
### 🧰 Maintenance
|
||||||
|
|
||||||
|
- #### 📂 Github
|
||||||
|
|
||||||
|
- [gh] Furhter Impove Changelog Workflow [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#2655](https://github.com/community-scripts/ProxmoxVE/pull/2655))
|
||||||
|
|
||||||
|
### 🌐 Website
|
||||||
|
|
||||||
|
- #### 🐞 Bug Fixes
|
||||||
|
|
||||||
|
- Website: PocketID Change of website and documentation links [@schneider-de-com](https://github.com/schneider-de-com) ([#2643](https://github.com/community-scripts/ProxmoxVE/pull/2643))
|
||||||
|
|
||||||
|
- #### 📝 Script Information
|
||||||
|
|
||||||
|
- Fix: Graylog - Improve application description for website [@tremor021](https://github.com/tremor021) ([#2658](https://github.com/community-scripts/ProxmoxVE/pull/2658))
|
||||||
|
|
||||||
|
## 2025-02-25
|
||||||
|
|
||||||
|
### Changes
|
||||||
|
|
||||||
|
### ✨ New Features
|
||||||
|
|
||||||
|
- Update Tailscale: Add Tag when installation is finished [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#2633](https://github.com/community-scripts/ProxmoxVE/pull/2633))
|
||||||
|
|
||||||
|
### 🚀 Updated Scripts
|
||||||
|
|
||||||
|
#### 🐞 Bug Fixes
|
||||||
|
|
||||||
|
- Fix Omada installer [@JcMinarro](https://github.com/JcMinarro) ([#2625](https://github.com/community-scripts/ProxmoxVE/pull/2625))
|
||||||
|
|
||||||
|
### 🌐 Website
|
||||||
|
|
||||||
|
- Update Tailscale-lxc Json: Add message for Supported OS [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#2629](https://github.com/community-scripts/ProxmoxVE/pull/2629))
|
||||||
|
|
||||||
|
### 🧰 Maintenance
|
||||||
|
|
||||||
|
- [gh] Updated Changelog Workflow [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#2632](https://github.com/community-scripts/ProxmoxVE/pull/2632))
|
||||||
|
|
||||||
|
## 2025-02-24
|
||||||
|
|
||||||
|
### Changes
|
||||||
|
|
||||||
|
### 🆕 New Scripts
|
||||||
|
|
||||||
|
- New Script: wger [@tremor021](https://github.com/tremor021) ([#2574](https://github.com/community-scripts/ProxmoxVE/pull/2574))
|
||||||
|
- New Script: VictoriaMetrics [@tremor021](https://github.com/tremor021) ([#2565](https://github.com/community-scripts/ProxmoxVE/pull/2565))
|
||||||
|
- New Script: Authelia [@thost96](https://github.com/thost96) ([#2060](https://github.com/community-scripts/ProxmoxVE/pull/2060))
|
||||||
|
- New Script: Jupyter Notebook [@Dave-code-creater](https://github.com/Dave-code-creater) ([#2561](https://github.com/community-scripts/ProxmoxVE/pull/2561))
|
||||||
|
|
||||||
|
### 🐞 Bug Fixes
|
||||||
|
|
||||||
|
- Fix Docmost: default upload size and saving data when updating [@bvdberg01](https://github.com/bvdberg01) ([#2598](https://github.com/community-scripts/ProxmoxVE/pull/2598))
|
||||||
|
- Fix: homarr db migration [@CrazyWolf13](https://github.com/CrazyWolf13) ([#2575](https://github.com/community-scripts/ProxmoxVE/pull/2575))
|
||||||
|
- Fix: Wireguard - Restart wgdashboard automatically after update [@LostALice](https://github.com/LostALice) ([#2587](https://github.com/community-scripts/ProxmoxVE/pull/2587))
|
||||||
|
- Fix: Authelia Unbound Variable Argon2id [@MickLesk](https://github.com/MickLesk) ([#2604](https://github.com/community-scripts/ProxmoxVE/pull/2604))
|
||||||
|
- Fix: Omada check for AVX Support and use the correct MongoDB Version [@MickLesk](https://github.com/MickLesk) ([#2600](https://github.com/community-scripts/ProxmoxVE/pull/2600))
|
||||||
|
- Fix: Update-Script Firefly III based on their docs [@MickLesk](https://github.com/MickLesk) ([#2534](https://github.com/community-scripts/ProxmoxVE/pull/2534))
|
||||||
|
|
||||||
|
### ✨ New Features
|
||||||
|
|
||||||
|
- Feature: Template-Check, Better Handling of Downloads, Better Network… [@MickLesk](https://github.com/MickLesk) ([#2592](https://github.com/community-scripts/ProxmoxVE/pull/2592))
|
||||||
|
- Feature: Possibility to perform updates in silent / verbose (+ logging) [@MickLesk](https://github.com/MickLesk) ([#2583](https://github.com/community-scripts/ProxmoxVE/pull/2583))
|
||||||
|
- Feature: Use Verbose Mode for all Scripts (removed &>/dev/null) [@MickLesk](https://github.com/MickLesk) ([#2596](https://github.com/community-scripts/ProxmoxVE/pull/2596))
|
||||||
|
|
||||||
|
### 🌐 Website
|
||||||
|
|
||||||
|
- Fix: Authelia - Make user enter their domain manually [@tremor021](https://github.com/tremor021) ([#2618](https://github.com/community-scripts/ProxmoxVE/pull/2618))
|
||||||
|
- Website: Change Info for PiHole Password [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#2602](https://github.com/community-scripts/ProxmoxVE/pull/2602))
|
||||||
|
- Fix: Jupyter Json (missing logo & improve name on website) [@MickLesk](https://github.com/MickLesk) ([#2584](https://github.com/community-scripts/ProxmoxVE/pull/2584))
|
||||||
|
|
||||||
|
### 🧰 Maintenance
|
||||||
|
|
||||||
|
- [gh] Update Script Test Workflow [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#2599](https://github.com/community-scripts/ProxmoxVE/pull/2599))
|
||||||
|
- [gh] Contributor-Guide: Update AppName.md & AppName.sh [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#2603](https://github.com/community-scripts/ProxmoxVE/pull/2603))
|
||||||
|
|
||||||
|
## 2025-02-23
|
||||||
|
|
||||||
|
### Changes
|
||||||
|
|
||||||
|
### 🆕 New Scripts
|
||||||
|
|
||||||
|
- New Script: Hev socks5 server [@miviro](https://github.com/miviro) ([#2454](https://github.com/community-scripts/ProxmoxVE/pull/2454))
|
||||||
|
- New Script: bolt.diy [@tremor021](https://github.com/tremor021) ([#2528](https://github.com/community-scripts/ProxmoxVE/pull/2528))
|
||||||
|
|
||||||
|
### 🚀 Updated Scripts
|
||||||
|
|
||||||
|
- Fix: Wireguard - Remove setting NAT as its already in PostUp/Down [@tremor021](https://github.com/tremor021) ([#2510](https://github.com/community-scripts/ProxmoxVE/pull/2510))
|
||||||
|
|
||||||
|
### 🌐 Website
|
||||||
|
|
||||||
|
- Fix: Home Assistant Core - fixed wrong text in application description on website [@TMigue](https://github.com/TMigue) ([#2576](https://github.com/community-scripts/ProxmoxVE/pull/2576))
|
||||||
|
|
||||||
|
## 2025-02-22
|
||||||
|
|
||||||
|
### Changes
|
||||||
|
|
||||||
|
### 🌐 Website
|
||||||
|
|
||||||
|
- Fix a few broken icon links [@Snarkenfaugister](https://github.com/Snarkenfaugister) ([#2548](https://github.com/community-scripts/ProxmoxVE/pull/2548))
|
||||||
|
|
||||||
|
### 🧰 Maintenance
|
||||||
|
|
||||||
|
- Fix: URL's in CONTRIBUTING.md [@bvdberg01](https://github.com/bvdberg01) ([#2552](https://github.com/community-scripts/ProxmoxVE/pull/2552))
|
||||||
|
|
||||||
|
## 2025-02-21
|
||||||
|
|
||||||
|
### Changes
|
||||||
|
|
||||||
|
### 🚀 Updated Scripts
|
||||||
|
|
||||||
|
- Add ZFS to Podman. Now it works on ZFS! [@jaminmc](https://github.com/jaminmc) ([#2526](https://github.com/community-scripts/ProxmoxVE/pull/2526))
|
||||||
|
- Fix: Tianji - Downgrade Node [@MickLesk](https://github.com/MickLesk) ([#2530](https://github.com/community-scripts/ProxmoxVE/pull/2530))
|
||||||
|
|
||||||
|
### 🧰 Maintenance
|
||||||
|
|
||||||
|
- [gh] General Cleanup & Moving Files / Folders [@MickLesk](https://github.com/MickLesk) ([#2532](https://github.com/community-scripts/ProxmoxVE/pull/2532))
|
||||||
|
|
||||||
|
## 2025-02-20
|
||||||
|
|
||||||
|
### Changes
|
||||||
|
|
||||||
|
### 💥 Breaking Changes
|
||||||
|
|
||||||
|
- Breaking: Actual Budget Script (HTTPS / DB Migration / New Structure) - Read Description [@MickLesk](https://github.com/MickLesk) ([#2496](https://github.com/community-scripts/ProxmoxVE/pull/2496))
|
||||||
|
- Pihole & Unbound: Installation for Pihole V6 (read description) [@MickLesk](https://github.com/MickLesk) ([#2505](https://github.com/community-scripts/ProxmoxVE/pull/2505))
|
||||||
|
|
||||||
|
### ✨ New Scripts
|
||||||
|
|
||||||
|
- New Script: Dolibarr [@tremor021](https://github.com/tremor021) ([#2502](https://github.com/community-scripts/ProxmoxVE/pull/2502))
|
||||||
|
|
||||||
|
### 🚀 Updated Scripts
|
||||||
|
|
||||||
|
- Fix: Pingvin Share - Update not copying to correct directory [@tremor021](https://github.com/tremor021) ([#2521](https://github.com/community-scripts/ProxmoxVE/pull/2521))
|
||||||
|
- WikiJS: Prepare for Using PostgreSQL [@MickLesk](https://github.com/MickLesk) ([#2516](https://github.com/community-scripts/ProxmoxVE/pull/2516))
|
||||||
|
|
||||||
|
### 🧰 Maintenance
|
||||||
|
|
||||||
|
- [gh] better handling of labels [@MickLesk](https://github.com/MickLesk) ([#2517](https://github.com/community-scripts/ProxmoxVE/pull/2517))
|
||||||
|
|
||||||
|
## 2025-02-19
|
||||||
|
|
||||||
|
### Changes
|
||||||
|
|
||||||
|
### 🚀 Updated Scripts
|
||||||
|
|
||||||
|
- Fix: file replacement in Watcharr Update Script [@Clusters](https://github.com/Clusters) ([#2498](https://github.com/community-scripts/ProxmoxVE/pull/2498))
|
||||||
|
- Fix: Kometa - fixed successful setup message and added info to json [@tremor021](https://github.com/tremor021) ([#2495](https://github.com/community-scripts/ProxmoxVE/pull/2495))
|
||||||
|
- Fix: Actual Budget, add missing .env when updating [@MickLesk](https://github.com/MickLesk) ([#2494](https://github.com/community-scripts/ProxmoxVE/pull/2494))
|
||||||
|
|
||||||
|
## 2025-02-18
|
||||||
|
|
||||||
|
### Changes
|
||||||
|
|
||||||
|
### ✨ New Scripts
|
||||||
|
|
||||||
|
- New Script: Docmost [@MickLesk](https://github.com/MickLesk) ([#2472](https://github.com/community-scripts/ProxmoxVE/pull/2472))
|
||||||
|
|
||||||
|
### 🚀 Updated Scripts
|
||||||
|
|
||||||
|
- Fix: SQL Server 2022 | GPG & Install [@MickLesk](https://github.com/MickLesk) ([#2476](https://github.com/community-scripts/ProxmoxVE/pull/2476))
|
||||||
|
- Feature: PBS Bare Metal Installation - Allow Microcode [@MickLesk](https://github.com/MickLesk) ([#2477](https://github.com/community-scripts/ProxmoxVE/pull/2477))
|
||||||
|
- Fix: MagicMirror force Node version and fix backups [@tremor021](https://github.com/tremor021) ([#2468](https://github.com/community-scripts/ProxmoxVE/pull/2468))
|
||||||
|
- Update BunkerWeb scripts to latest NGINX and specs [@TheophileDiot](https://github.com/TheophileDiot) ([#2466](https://github.com/community-scripts/ProxmoxVE/pull/2466))
|
||||||
|
|
||||||
|
## 2025-02-17
|
||||||
|
|
||||||
|
### Changes
|
||||||
|
|
||||||
|
### 💥 Breaking Changes
|
||||||
|
|
||||||
|
- Zipline: Prepare for Version 4.0.0 [@MickLesk](https://github.com/MickLesk) ([#2455](https://github.com/community-scripts/ProxmoxVE/pull/2455))
|
||||||
|
|
||||||
|
### 🚀 Updated Scripts
|
||||||
|
|
||||||
|
- Fix: Zipline increase SECRET to 42 chars [@V1d1o7](https://github.com/V1d1o7) ([#2444](https://github.com/community-scripts/ProxmoxVE/pull/2444))
|
||||||
|
|
||||||
|
## 2025-02-16
|
||||||
|
|
||||||
|
### Changes
|
||||||
|
|
||||||
|
### 🚀 Updated Scripts
|
||||||
|
|
||||||
|
- Fix: Typo in Ubuntu 24.10 VM Script [@PhoenixEmik](https://github.com/PhoenixEmik) ([#2430](https://github.com/community-scripts/ProxmoxVE/pull/2430))
|
||||||
|
- Fix: Grist update no longer removes previous user data [@cfurrow](https://github.com/cfurrow) ([#2428](https://github.com/community-scripts/ProxmoxVE/pull/2428))
|
||||||
|
|
||||||
|
### 🌐 Website
|
||||||
|
|
||||||
|
- Debian icon update [@bannert1337](https://github.com/bannert1337) ([#2433](https://github.com/community-scripts/ProxmoxVE/pull/2433))
|
||||||
|
- Update Graylog icon [@bannert1337](https://github.com/bannert1337) ([#2434](https://github.com/community-scripts/ProxmoxVE/pull/2434))
|
||||||
|
|
||||||
|
## 2025-02-15
|
||||||
|
|
||||||
|
### Changes
|
||||||
|
|
||||||
|
### 🚀 Updated Scripts
|
||||||
|
|
||||||
|
- Setup cron in install/freshrss-install.sh [@zimmra](https://github.com/zimmra) ([#2412](https://github.com/community-scripts/ProxmoxVE/pull/2412))
|
||||||
|
- Fix: Homarr update service files [@CrazyWolf13](https://github.com/CrazyWolf13) ([#2416](https://github.com/community-scripts/ProxmoxVE/pull/2416))
|
||||||
|
- Update MagicMirror install and update scripts [@tremor021](https://github.com/tremor021) ([#2409](https://github.com/community-scripts/ProxmoxVE/pull/2409))
|
||||||
|
|
||||||
|
### 🌐 Website
|
||||||
|
|
||||||
|
- Fix RustDesk slug in json [@tremor021](https://github.com/tremor021) ([#2411](https://github.com/community-scripts/ProxmoxVE/pull/2411))
|
||||||
|
|
||||||
|
### 🧰 Maintenance
|
||||||
|
|
||||||
|
- [GH] Update script-test Workflow [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#2415](https://github.com/community-scripts/ProxmoxVE/pull/2415))
|
||||||
|
|
||||||
|
## 2025-02-14
|
||||||
|
|
||||||
|
### Changes
|
||||||
|
|
||||||
|
### 🚀 Updated Scripts
|
||||||
|
|
||||||
|
- Fix homarr [@CrazyWolf13](https://github.com/CrazyWolf13) ([#2369](https://github.com/community-scripts/ProxmoxVE/pull/2369))
|
||||||
|
|
||||||
|
### 🌐 Website
|
||||||
|
|
||||||
|
- RustDesk Server - Added configuration guide to json [@tremor021](https://github.com/tremor021) ([#2389](https://github.com/community-scripts/ProxmoxVE/pull/2389))
|
||||||
|
|
||||||
|
### 🧰 Maintenance
|
||||||
|
|
||||||
|
- [gh] Update script-test.yml [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#2399](https://github.com/community-scripts/ProxmoxVE/pull/2399))
|
||||||
|
- [gh] Introducing new Issue Github Template Feature (Bug, Feature, Task) [@MickLesk](https://github.com/MickLesk) ([#2394](https://github.com/community-scripts/ProxmoxVE/pull/2394))
|
||||||
|
|
||||||
|
### 📡 API
|
||||||
|
|
||||||
|
- [API]Add more enpoints to API [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#2390](https://github.com/community-scripts/ProxmoxVE/pull/2390))
|
||||||
|
- [API] Update api.func: Remove unwanted file creation [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#2378](https://github.com/community-scripts/ProxmoxVE/pull/2378))
|
||||||
|
|
||||||
## 2025-02-13
|
## 2025-02-13
|
||||||
|
|
||||||
### Changes
|
### Changes
|
||||||
|
@@ -19,10 +19,10 @@
 <a href="https://ko-fi.com/community_scripts">
   <img src="https://img.shields.io/badge/Support-FF5F5F?style=for-the-badge&logo=ko-fi&logoColor=white" alt="Donate" />
 </a>
-<a href="https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTING.md">
+<a href="https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_AND_GUIDES/CONTRIBUTING.md">
   <img src="https://img.shields.io/badge/Contribute-ff4785?style=for-the-badge&logo=git&logoColor=white" alt="Contribute" />
 </a>
-<a href="https://github.com/community-scripts/ProxmoxVE/blob/main/USER_SUBMITTED_GUIDES.md">
+<a href="https://github.com/community-scripts/ProxmoxVE/blob/main/.github/CONTRIBUTOR_AND_GUIDES/USER_SUBMITTED_GUIDES.md">
   <img src="https://img.shields.io/badge/Guides-0077b5?style=for-the-badge&logo=read-the-docs&logoColor=white" alt="Guides" />
 </a>
 <a href="https://github.com/community-scripts/ProxmoxVE/blob/main/CHANGELOG.md">
api/main.go (283 lines changed)
@@ -11,6 +11,7 @@ import (
 	"log"
 	"net/http"
 	"os"
+	"strconv"
 	"time"

 	"github.com/gorilla/mux"
@@ -31,6 +32,7 @@ func loadEnv() {
 	}
 }

+// DataModel represents a single document in MongoDB
 type DataModel struct {
 	ID      primitive.ObjectID `json:"id" bson:"_id,omitempty"`
 	CT_TYPE uint               `json:"ct_type" bson:"ct_type"`
@@ -56,6 +58,13 @@ type StatusModel struct {
 	STATUS string `json:"status" bson:"status"`
 }

+type CountResponse struct {
+	TotalEntries int64            `json:"total_entries"`
+	StatusCount  map[string]int64 `json:"status_count"`
+	NSAPPCount   map[string]int64 `json:"nsapp_count"`
+}
+
+// ConnectDatabase initializes the MongoDB connection
 func ConnectDatabase() {
 	loadEnv()

@@ -78,6 +87,7 @@ func ConnectDatabase() {
 	fmt.Println("Connected to MongoDB on 10.10.10.18")
 }

+// UploadJSON handles API requests and stores data as a document in MongoDB
 func UploadJSON(w http.ResponseWriter, r *http.Request) {
 	var input DataModel

@@ -98,6 +108,7 @@ func UploadJSON(w http.ResponseWriter, r *http.Request) {
 	json.NewEncoder(w).Encode(map[string]string{"message": "Data saved successfully"})
 }

+// UpdateStatus updates the status of a record based on RANDOM_ID
 func UpdateStatus(w http.ResponseWriter, r *http.Request) {
 	var input StatusModel

@@ -120,6 +131,7 @@ func UpdateStatus(w http.ResponseWriter, r *http.Request) {
 	json.NewEncoder(w).Encode(map[string]string{"message": "Record updated successfully"})
 }

+// GetDataJSON fetches all data from MongoDB
 func GetDataJSON(w http.ResponseWriter, r *http.Request) {
 	var records []DataModel
 	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
@@ -144,6 +156,270 @@ func GetDataJSON(w http.ResponseWriter, r *http.Request) {
 	w.Header().Set("Content-Type", "application/json")
 	json.NewEncoder(w).Encode(records)
 }
|
func GetPaginatedData(w http.ResponseWriter, r *http.Request) {
|
||||||
|
page, _ := strconv.Atoi(r.URL.Query().Get("page"))
|
||||||
|
limit, _ := strconv.Atoi(r.URL.Query().Get("limit"))
|
||||||
|
if page < 1 {
|
||||||
|
page = 1
|
||||||
|
}
|
||||||
|
if limit < 1 {
|
||||||
|
limit = 10
|
||||||
|
}
|
||||||
|
skip := (page - 1) * limit
|
||||||
|
var records []DataModel
|
||||||
|
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
|
||||||
|
defer cancel()
|
||||||
|
|
||||||
|
options := options.Find().SetSkip(int64(skip)).SetLimit(int64(limit))
|
||||||
|
cursor, err := collection.Find(ctx, bson.M{}, options)
|
||||||
|
if err != nil {
|
||||||
|
http.Error(w, err.Error(), http.StatusInternalServerError)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
defer cursor.Close(ctx)
|
||||||
|
|
||||||
|
for cursor.Next(ctx) {
|
||||||
|
var record DataModel
|
||||||
|
if err := cursor.Decode(&record); err != nil {
|
||||||
|
http.Error(w, err.Error(), http.StatusInternalServerError)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
records = append(records, record)
|
||||||
|
}
|
||||||
|
|
||||||
|
w.Header().Set("Content-Type", "application/json")
|
||||||
|
json.NewEncoder(w).Encode(records)
|
||||||
|
}
|
||||||
|
|
||||||
|
func GetSummary(w http.ResponseWriter, r *http.Request) {
|
||||||
|
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
|
||||||
|
defer cancel()
|
||||||
|
|
||||||
|
totalCount, err := collection.CountDocuments(ctx, bson.M{})
|
||||||
|
if err != nil {
|
||||||
|
http.Error(w, err.Error(), http.StatusInternalServerError)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
statusCount := make(map[string]int64)
|
||||||
|
nsappCount := make(map[string]int64)
|
||||||
|
|
||||||
|
pipeline := []bson.M{
|
||||||
|
{"$group": bson.M{"_id": "$status", "count": bson.M{"$sum": 1}}},
|
||||||
|
}
|
||||||
|
cursor, err := collection.Aggregate(ctx, pipeline)
|
||||||
|
if err == nil {
|
||||||
|
for cursor.Next(ctx) {
|
||||||
|
var result struct {
|
||||||
|
ID string `bson:"_id"`
|
||||||
|
Count int64 `bson:"count"`
|
||||||
|
}
|
||||||
|
if err := cursor.Decode(&result); err == nil {
|
||||||
|
statusCount[result.ID] = result.Count
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
pipeline = []bson.M{
|
||||||
|
{"$group": bson.M{"_id": "$nsapp", "count": bson.M{"$sum": 1}}},
|
||||||
|
}
|
||||||
|
cursor, err = collection.Aggregate(ctx, pipeline)
|
||||||
|
if err == nil {
|
||||||
|
for cursor.Next(ctx) {
|
||||||
|
var result struct {
|
||||||
|
ID string `bson:"_id"`
|
||||||
|
Count int64 `bson:"count"`
|
||||||
|
}
|
||||||
|
if err := cursor.Decode(&result); err == nil {
|
||||||
|
nsappCount[result.ID] = result.Count
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
response := CountResponse{
|
||||||
|
TotalEntries: totalCount,
|
||||||
|
StatusCount: statusCount,
|
||||||
|
NSAPPCount: nsappCount,
|
||||||
|
}
|
||||||
|
|
||||||
|
w.Header().Set("Content-Type", "application/json")
|
||||||
|
json.NewEncoder(w).Encode(response)
|
||||||
|
}
|
||||||
|
|
||||||
|
func GetByNsapp(w http.ResponseWriter, r *http.Request) {
|
||||||
|
nsapp := r.URL.Query().Get("nsapp")
|
||||||
|
var records []DataModel
|
||||||
|
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
|
||||||
|
defer cancel()
|
||||||
|
|
||||||
|
cursor, err := collection.Find(ctx, bson.M{"nsapp": nsapp})
|
||||||
|
if err != nil {
|
||||||
|
http.Error(w, err.Error(), http.StatusInternalServerError)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
defer cursor.Close(ctx)
|
||||||
|
|
||||||
|
for cursor.Next(ctx) {
|
||||||
|
var record DataModel
|
||||||
|
if err := cursor.Decode(&record); err != nil {
|
||||||
|
http.Error(w, err.Error(), http.StatusInternalServerError)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
records = append(records, record)
|
||||||
|
}
|
||||||
|
|
||||||
|
w.Header().Set("Content-Type", "application/json")
|
||||||
|
json.NewEncoder(w).Encode(records)
|
||||||
|
}
|
||||||
|
|
||||||
|
func GetByDateRange(w http.ResponseWriter, r *http.Request) {
|
||||||
|
|
||||||
|
startDate := r.URL.Query().Get("start_date")
|
||||||
|
endDate := r.URL.Query().Get("end_date")
|
||||||
|
|
||||||
|
if startDate == "" || endDate == "" {
|
||||||
|
http.Error(w, "Both start_date and end_date are required", http.StatusBadRequest)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
start, err := time.Parse("2006-01-02T15:04:05.999999+00:00", startDate+"T00:00:00+00:00")
|
||||||
|
if err != nil {
|
||||||
|
http.Error(w, "Invalid start_date format", http.StatusBadRequest)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
end, err := time.Parse("2006-01-02T15:04:05.999999+00:00", endDate+"T23:59:59+00:00")
|
||||||
|
if err != nil {
|
||||||
|
http.Error(w, "Invalid end_date format", http.StatusBadRequest)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
var records []DataModel
|
||||||
|
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
|
||||||
|
defer cancel()
|
||||||
|
|
||||||
|
cursor, err := collection.Find(ctx, bson.M{
|
||||||
|
"created_at": bson.M{
|
||||||
|
"$gte": start,
|
||||||
|
"$lte": end,
|
||||||
|
},
|
||||||
|
})
|
||||||
|
if err != nil {
|
||||||
|
http.Error(w, err.Error(), http.StatusInternalServerError)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
defer cursor.Close(ctx)
|
||||||
|
|
||||||
|
for cursor.Next(ctx) {
|
||||||
|
var record DataModel
|
||||||
|
if err := cursor.Decode(&record); err != nil {
|
||||||
|
http.Error(w, err.Error(), http.StatusInternalServerError)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
records = append(records, record)
|
||||||
|
}
|
||||||
|
|
||||||
|
w.Header().Set("Content-Type", "application/json")
|
||||||
|
json.NewEncoder(w).Encode(records)
|
||||||
|
}
|
||||||
|
func GetByStatus(w http.ResponseWriter, r *http.Request) {
|
||||||
|
status := r.URL.Query().Get("status")
|
||||||
|
var records []DataModel
|
||||||
|
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
|
||||||
|
defer cancel()
|
||||||
|
|
||||||
|
cursor, err := collection.Find(ctx, bson.M{"status": status})
|
||||||
|
if err != nil {
|
||||||
|
http.Error(w, err.Error(), http.StatusInternalServerError)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
defer cursor.Close(ctx)
|
||||||
|
|
||||||
|
for cursor.Next(ctx) {
|
||||||
|
var record DataModel
|
||||||
|
if err := cursor.Decode(&record); err != nil {
|
||||||
|
http.Error(w, err.Error(), http.StatusInternalServerError)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
records = append(records, record)
|
||||||
|
}
|
||||||
|
|
||||||
|
w.Header().Set("Content-Type", "application/json")
|
||||||
|
json.NewEncoder(w).Encode(records)
|
||||||
|
}
|
||||||
|
|
||||||
|
func GetByOS(w http.ResponseWriter, r *http.Request) {
|
||||||
|
osType := r.URL.Query().Get("os_type")
|
||||||
|
osVersion := r.URL.Query().Get("os_version")
|
||||||
|
var records []DataModel
|
||||||
|
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
|
||||||
|
defer cancel()
|
||||||
|
|
||||||
|
cursor, err := collection.Find(ctx, bson.M{"os_type": osType, "os_version": osVersion})
|
||||||
|
if err != nil {
|
||||||
|
http.Error(w, err.Error(), http.StatusInternalServerError)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
defer cursor.Close(ctx)
|
||||||
|
|
||||||
|
for cursor.Next(ctx) {
|
||||||
|
var record DataModel
|
||||||
|
if err := cursor.Decode(&record); err != nil {
|
||||||
|
http.Error(w, err.Error(), http.StatusInternalServerError)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
records = append(records, record)
|
||||||
|
}
|
||||||
|
|
||||||
|
w.Header().Set("Content-Type", "application/json")
|
||||||
|
json.NewEncoder(w).Encode(records)
|
||||||
|
}
|
||||||
|
|
||||||
|
func GetErrors(w http.ResponseWriter, r *http.Request) {
|
||||||
|
errorCount := make(map[string]int)
|
||||||
|
|
||||||
|
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
|
||||||
|
defer cancel()
|
||||||
|
|
||||||
|
cursor, err := collection.Find(ctx, bson.M{"error": bson.M{"$ne": ""}})
|
||||||
|
if err != nil {
|
||||||
|
http.Error(w, err.Error(), http.StatusInternalServerError)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
defer cursor.Close(ctx)
|
||||||
|
|
||||||
|
for cursor.Next(ctx) {
|
||||||
|
var record DataModel
|
||||||
|
if err := cursor.Decode(&record); err != nil {
|
||||||
|
http.Error(w, err.Error(), http.StatusInternalServerError)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
if record.ERROR != "" {
|
||||||
|
errorCount[record.ERROR]++
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
type ErrorCountResponse struct {
|
||||||
|
Error string `json:"error"`
|
||||||
|
Count int `json:"count"`
|
||||||
|
}
|
||||||
|
|
||||||
|
var errorCounts []ErrorCountResponse
|
||||||
|
for err, count := range errorCount {
|
||||||
|
errorCounts = append(errorCounts, ErrorCountResponse{
|
||||||
|
Error: err,
|
||||||
|
Count: count,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
w.Header().Set("Content-Type", "application/json")
|
||||||
|
json.NewEncoder(w).Encode(struct {
|
||||||
|
ErrorCounts []ErrorCountResponse `json:"error_counts"`
|
||||||
|
}{
|
||||||
|
ErrorCounts: errorCounts,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
func main() {
|
func main() {
|
||||||
ConnectDatabase()
|
ConnectDatabase()
|
||||||
@ -152,6 +428,13 @@ func main() {
|
|||||||
router.HandleFunc("/upload", UploadJSON).Methods("POST")
|
router.HandleFunc("/upload", UploadJSON).Methods("POST")
|
||||||
router.HandleFunc("/upload/updatestatus", UpdateStatus).Methods("POST")
|
router.HandleFunc("/upload/updatestatus", UpdateStatus).Methods("POST")
|
||||||
router.HandleFunc("/data/json", GetDataJSON).Methods("GET")
|
router.HandleFunc("/data/json", GetDataJSON).Methods("GET")
|
||||||
|
router.HandleFunc("/data/paginated", GetPaginatedData).Methods("GET")
|
||||||
|
router.HandleFunc("/data/summary", GetSummary).Methods("GET")
|
||||||
|
router.HandleFunc("/data/nsapp", GetByNsapp).Methods("GET")
|
||||||
|
router.HandleFunc("/data/date", GetByDateRange).Methods("GET")
|
||||||
|
router.HandleFunc("/data/status", GetByStatus).Methods("GET")
|
||||||
|
router.HandleFunc("/data/os", GetByOS).Methods("GET")
|
||||||
|
router.HandleFunc("/data/errors", GetErrors).Methods("GET")
|
||||||
|
|
||||||
c := cors.New(cors.Options{
|
c := cors.New(cors.Options{
|
||||||
AllowedOrigins: []string{"*"},
|
AllowedOrigins: []string{"*"},
|
||||||
|
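
The handlers and routes added above expose several new read endpoints on the API. The requests below show how they can be exercised from a shell once the service is running; the listen address and the example query values are placeholders, not part of the diff:

# Illustrative only: replace <api-host>:<port> with the address the API listens on.
curl "http://<api-host>:<port>/data/paginated?page=2&limit=25"    # page defaults to 1, limit to 10
curl "http://<api-host>:<port>/data/summary"                      # total_entries, status_count, nsapp_count
curl "http://<api-host>:<port>/data/nsapp?nsapp=<name>"
curl "http://<api-host>:<port>/data/date?start_date=2025-02-13&end_date=2025-02-28"
curl "http://<api-host>:<port>/data/status?status=<status>"
curl "http://<api-host>:<port>/data/os?os_type=debian&os_version=12"
curl "http://<api-host>:<port>/data/errors"                       # per-error counts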
@@ -35,8 +35,8 @@ function update_script() {
 if [[ "${RELEASE}" != "$(cat /opt/2fauth_version.txt)" ]] || [[ ! -f /opt/2fauth_version.txt ]]; then
 msg_info "Updating $APP to ${RELEASE}"

-apt-get update &>/dev/null
-apt-get -y upgrade &>/dev/null
+$STD apt-get update
+$STD apt-get -y upgrade

 # Creating Backup
 msg_info "Creating Backup"
@@ -55,7 +55,7 @@ function update_script() {
 chmod -R 755 "/opt/2fauth"

 export COMPOSER_ALLOW_SUPERUSER=1
-composer install --no-dev --prefer-source &>/dev/null
+$STD composer install --no-dev --prefer-source

 php artisan 2fauth:install

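The "&>/dev/null" redirections above are being swapped for the $STD prefix that these ct scripts source from the shared helper functions. A minimal sketch of how such a wrapper usually works, inferred only from how $STD is used here (an assumption for illustration; the authoritative definition lives in the repo's shared functions):

# Sketch of the $STD pattern (assumed for illustration, not the repo's exact definition).
silent() { "$@" >/dev/null 2>&1; }   # run a command, discarding stdout and stderr
if [ "${VERBOSE:-no}" = "yes" ]; then
  STD=""                             # verbose mode: $STD cmd runs cmd normally
else
  STD="silent"                       # quiet mode: $STD cmd expands to silent cmd
fi

$STD apt-get update                  # quiet by default, full output when VERBOSE=yes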
@@ -26,43 +26,91 @@ function update_script() {

 if [[ ! -d /opt/actualbudget ]]; then
 msg_error "No ${APP} Installation Found!"
-exit
+exit 1
 fi

-RELEASE=$(curl -s https://api.github.com/repos/actualbudget/actual/releases/latest | grep "tag_name" | awk '{print substr($2, 3, length($2)-4) }')
+RELEASE=$(curl -s https://api.github.com/repos/actualbudget/actual/releases/latest | \
+grep "tag_name" | awk -F '"' '{print substr($4, 2)}')

 if [[ ! -f /opt/actualbudget_version.txt ]] || [[ "${RELEASE}" != "$(cat /opt/actualbudget_version.txt)" ]]; then
 msg_info "Stopping ${APP}"
 systemctl stop actualbudget
 msg_ok "${APP} Stopped"

 msg_info "Updating ${APP} to ${RELEASE}"
 cd /tmp
-wget -q https://github.com/actualbudget/actual-server/archive/refs/tags/v${RELEASE}.tar.gz
+wget -q "https://github.com/actualbudget/actual-server/archive/refs/tags/v${RELEASE}.tar.gz"

 mv /opt/actualbudget /opt/actualbudget_bak
-tar -xzf v${RELEASE}.tar.gz >/dev/null 2>&1
+$STD tar -xzf "v${RELEASE}.tar.gz"
 mv *ctual-server-* /opt/actualbudget
-rm -rf /opt/actualbudget/.env
-mv /opt/actualbudget_bak/.env /opt/actualbudget
-mv /opt/actualbudget_bak/.migrate /opt/actualbudget
-mv /opt/actualbudget_bak/server-files /opt/actualbudget/server-files
+mkdir -p /opt/actualbudget-data/{server-files,upload,migrate,user-files,migrations,config}
+for dir in server-files .migrate user-files migrations; do
+if [[ -d /opt/actualbudget_bak/$dir ]]; then
+mv /opt/actualbudget_bak/$dir/* /opt/actualbudget-data/$dir/ || true
+fi
+done
+if [[ -f /opt/actualbudget-data/migrate/.migrations ]]; then
+sed -i 's/null/1732656575219/g' /opt/actualbudget-data/migrate/.migrations
+sed -i 's/null/1732656575220/g' /opt/actualbudget-data/migrate/.migrations
+fi
+if [[ -f /opt/actualbudget/server-files/account.sqlite ]] && [[ ! -f /opt/actualbudget-data/server-files/account.sqlite ]]; then
+mv /opt/actualbudget/server-files/account.sqlite /opt/actualbudget-data/server-files/account.sqlite
+fi
+
+if [[ -f /opt/actualbudget_bak/.env ]]; then
+mv /opt/actualbudget_bak/.env /opt/actualbudget-data/.env
+else
+cat <<EOF > /opt/actualbudget-data/.env
+ACTUAL_UPLOAD_DIR=/opt/actualbudget-data/upload
+ACTUAL_DATA_DIR=/opt/actualbudget-data
+ACTUAL_SERVER_FILES_DIR=/opt/actualbudget-data/server-files
+ACTUAL_USER_FILES=/opt/actualbudget-data/user-files
+PORT=5006
+ACTUAL_TRUSTED_PROXIES="10.0.0.0/8,172.16.0.0/12,192.168.0.0/16,127.0.0.1/32,::1/128,fc00::/7"
+ACTUAL_HTTPS_KEY=/opt/actualbudget/selfhost.key
+ACTUAL_HTTPS_CERT=/opt/actualbudget/selfhost.crt
+EOF
+fi
 cd /opt/actualbudget
-yarn install &>/dev/null
-echo "${RELEASE}" >/opt/actualbudget_version.txt
+$STD yarn install
+echo "${RELEASE}" > /opt/actualbudget_version.txt
 msg_ok "Updated ${APP}"

 msg_info "Starting ${APP}"
+cat <<EOF > /etc/systemd/system/actualbudget.service
+[Unit]
+Description=Actual Budget Service
+After=network.target
+
+[Service]
+Type=simple
+User=root
+Group=root
+WorkingDirectory=/opt/actualbudget
+EnvironmentFile=/opt/actualbudget-data/.env
+ExecStart=/usr/bin/yarn start
+Restart=always
+RestartSec=10
+
+[Install]
+WantedBy=multi-user.target
+EOF
+
+systemctl daemon-reload
 systemctl start actualbudget
 msg_ok "Started ${APP}"

 msg_info "Cleaning Up"
 rm -rf /opt/actualbudget_bak
-rm -rf /tmp/v${RELEASE}.tar.gz
+rm -rf "/tmp/v${RELEASE}.tar.gz"
 msg_ok "Cleaned"
 msg_ok "Updated Successfully"
 else
 msg_ok "No update required. ${APP} is already at ${RELEASE}"
 fi
-exit
+exit 0
 }

 start
@@ -72,4 +120,4 @@ description
 msg_ok "Completed Successfully!\n"
 echo -e "${CREATING}${GN}${APP} setup has been successfully initialized!${CL}"
 echo -e "${INFO}${YW} Access it using the following URL:${CL}"
-echo -e "${TAB}${GATEWAY}${BGN}http://${IP}:5006${CL}"
+echo -e "${TAB}${GATEWAY}${BGN}https://${IP}:5006${CL}"
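
Because this update path moves state into /opt/actualbudget-data, writes an HTTPS-enabled .env, and regenerates the systemd unit, a quick sanity check after the update can confirm the migration took. The commands below are illustrative and are not part of the script:

# Illustrative post-update checks inside the container.
ls -a /opt/actualbudget-data                   # expect server-files, upload, migrate, user-files, migrations, config, .env
systemctl is-active actualbudget               # should print "active"
curl -ksI https://localhost:5006 | head -n 1   # -k because selfhost.crt is self-signed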
@@ -43,15 +43,15 @@ function update_script() {
 mv /opt/adventurelog-backup/backend/server/.env /opt/adventurelog/backend/server/.env
 mv /opt/adventurelog-backup/backend/server/media /opt/adventurelog/backend/server/media
 cd /opt/adventurelog/backend/server
-pip install --upgrade pip &>/dev/null
-pip install -r requirements.txt &>/dev/null
-python3 manage.py collectstatic --noinput &>/dev/null
-python3 manage.py migrate &>/dev/null
+$STD pip install --upgrade pip
+$STD pip install -r requirements.txt
+$STD python3 manage.py collectstatic --noinput
+$STD python3 manage.py migrate

 mv /opt/adventurelog-backup/frontend/.env /opt/adventurelog/frontend/.env
 cd /opt/adventurelog/frontend
-pnpm install &>/dev/null
-pnpm run build &>/dev/null
+$STD pnpm install
+$STD pnpm run build
 echo "${RELEASE}" >/opt/${APP}_version.txt
 msg_ok "Updated ${APP}"

@@ -27,7 +27,7 @@ function update_script() {
 msg_error "No ${APP} Installation Found!"
 exit
 fi
-msg_error "There is currently no update path available."
+msg_error "Currently we don't provide an update function for this ${APP}."
 exit
 }

@@ -30,7 +30,7 @@ if [ ! -d /usr/share/nginx/html ]; then
 fi

 RELEASE=$(curl -s https://api.github.com/repos/CorentinTh/it-tools/releases/latest | grep '"tag_name":' | cut -d '"' -f4)
-if [ "${RELEASE}" != "$(cat /opt/${APP}_version.txt 2>/dev/null)" ] || [ ! -f /opt/${APP}_version.txt ]; then
+if [ "${RELEASE}" != "$(cat /opt/${APP}_version.txt)" ] || [ ! -f /opt/${APP}_version.txt ]; then
 DOWNLOAD_URL="https://github.com/CorentinTh/it-tools/releases/download/${RELEASE}/it-tools-${RELEASE#v}.zip"
 msg_info "Updating ${APP} LXC"
 curl -fsSL -o it-tools.zip "$DOWNLOAD_URL"
@@ -28,7 +28,7 @@ function update_script() {
 msg_error "No ${APP} Installation Found!"
 exit
 fi
-msg_error "There is currently no update path available."
+msg_error "Currently we don't provide an update function for this ${APP}."
 exit
 }

@@ -27,7 +27,7 @@ function update_script() {
 msg_error "No ${APP} Installation Found!"
 exit
 fi
-msg_error "There is currently no update path available."
+msg_error "Currently we don't provide an update function for this ${APP}."
 exit
 }

@@ -27,7 +27,7 @@ function update_script() {
 msg_error "No ${APP} Installation Found!"
 exit
 fi
-msg_error "Ther is currently no automatic update function for ${APP}."
+msg_error "Currently we don't provide an update function for this ${APP}."
 exit
 }

@@ -28,8 +28,8 @@ function update_script() {
 exit
 fi
 msg_info "Updating $APP LXC"
-apt-get update &>/dev/null
-apt-get -y upgrade &>/dev/null
+$STD apt-get update
+$STD apt-get -y upgrade
 msg_ok "Updated $APP LXC"
 exit
 }
@@ -28,8 +28,8 @@ function update_script() {
 exit
 fi
 msg_info "Updating $APP LXC"
-apt-get update &>/dev/null
-apt-get -y upgrade &>/dev/null
+$STD apt-get update
+$STD apt-get -y upgrade
 msg_ok "Updated $APP LXC"
 exit
 }
ct/authelia.sh (55, new file)

@@ -0,0 +1,55 @@
+#!/usr/bin/env bash
+source <(curl -s https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)
+# Copyright (c) 2021-2025 community-scripts ORG
+# Author: thost96 (thost96)
+# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
+# Source: https://www.authelia.com/
+
+APP="Authelia"
+TAGS=""
+var_cpu="1"
+var_ram="512"
+var_disk="2"
+var_os="debian"
+var_version="12"
+var_unprivileged="1"
+
+header_info "$APP"
+base_settings
+
+variables
+color
+catch_errors
+
+function update_script() {
+header_info
+check_container_storage
+check_container_resources
+if [[ ! -d "/etc/authelia/" ]]; then msg_error "No ${APP} Installation Found!"; exit; fi
+RELEASE=$(curl -s https://api.github.com/repos/authelia/authelia/releases/latest | grep "tag_name" | awk '{print substr($2, 2, length($2)-3) }')
+if [[ "${RELEASE}" != "$(/usr/bin/authelia -v | awk '{print substr($3, 2, length($2)) }' )" ]]; then
+msg_info "Updating $APP to ${RELEASE}"
+$STD apt-get update
+$STD apt-get -y upgrade
+wget -q "https://github.com/authelia/authelia/releases/download/${RELEASE}/authelia_${RELEASE}_amd64.deb"
+$STD dpkg -i "authelia_${RELEASE}_amd64.deb"
+msg_info "Cleaning Up"
+rm -f "authelia_${RELEASE}_amd64.deb"
+$STD apt-get -y autoremove
+$STD apt-get -y autoclean
+msg_ok "Cleanup Completed"
+msg_ok "Updated $APP to ${RELEASE}"
+else
+msg_ok "No update required. ${APP} is already at ${RELEASE}"
+fi
+exit
+}
+
+start
+build_container
+description
+
+msg_ok "Completed Successfully!\n"
+echo -e "${CREATING}${GN}${APP} setup has been successfully initialized!${CL}"
+echo -e "${INFO}${YW} Access it using the following URL:${CL}"
+echo -e "${TAB}${GATEWAY}${BGN}http://${IP}:9091${CL}"
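
As with the repo's other ct scripts, this new one is intended to be run on the Proxmox VE host, where it builds the container and installs Authelia. The usual invocation pattern looks roughly like this (the URL is assumed from the raw-GitHub base the script itself sources; adjust if the file lives elsewhere):

# Assumed invocation pattern, run from the Proxmox VE host shell.
bash -c "$(wget -qLO - https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/ct/authelia.sh)"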
@@ -40,11 +40,11 @@ function update_script() {
 tar -xzf authentik.tar.gz -C /opt/authentik --strip-components 1 --overwrite
 rm -rf authentik.tar.gz
 cd /opt/authentik/website
-npm install &>/dev/null
-npm run build-bundled &>/dev/null
+$STD npm install
+$STD npm run build-bundled
 cd /opt/authentik/web
-npm install &>/dev/null
-npm run build &>/dev/null
+$STD npm install
+$STD npm run build
 msg_ok "Built ${APP} website"

 msg_info "Building ${APP} server"
@@ -56,15 +56,15 @@ function update_script() {

 msg_info "Installing Python Dependencies"
 cd /opt/authentik
-poetry install --only=main --no-ansi --no-interaction --no-root &>/dev/null
-poetry export --without-hashes --without-urls -f requirements.txt --output requirements.txt &>/dev/null
-pip install --no-cache-dir -r requirements.txt &>/dev/null
-pip install . &>/dev/null
+$STD poetry install --only=main --no-ansi --no-interaction --no-root
+$STD poetry export --without-hashes --without-urls -f requirements.txt --output requirements.txt
+$STD pip install --no-cache-dir -r requirements.txt
+$STD pip install .
 msg_ok "Installed Python Dependencies"

 msg_info "Updating ${APP} to v${RELEASE} (Patience)"
 cp -r /opt/authentik/authentik/blueprints /opt/authentik/blueprints
-bash /opt/authentik/lifecycle/ak migrate &>/dev/null
+$STD bash /opt/authentik/lifecycle/ak migrate
 echo "${RELEASE}" >/opt/${APP}_version.txt
 msg_ok "Updated ${APP} to v${RELEASE}"

@@ -25,8 +25,8 @@ function update_script() {
 check_container_resources
 if [[ ! -d /var/lib/bazarr/ ]]; then msg_error "No ${APP} Installation Found!"; exit; fi
 msg_info "Updating $APP LXC"
-apt-get update &>/dev/null
-apt-get -y upgrade &>/dev/null
+$STD apt-get update
+$STD apt-get -y upgrade
 msg_ok "Updated $APP LXC"
 exit
 }
@@ -28,7 +28,7 @@ function update_script() {
 exit
 fi
 /opt/beszel/beszel update
-msg_error "Ther is currently no automatic update function for ${APP}."
+msg_error "Currently we don't provide an update function for this ${APP}."
 exit
 }

@@ -25,8 +25,8 @@ function update_script() {
 check_container_resources
 if [[ ! -d /var ]]; then msg_error "No ${APP} Installation Found!"; exit; fi
 msg_info "Updating $APP LXC"
-apt-get update &>/dev/null
-apt-get -y upgrade &>/dev/null
+$STD apt-get update
+$STD apt-get -y upgrade
 msg_ok "Updated $APP LXC"
 exit
 }
ct/boltdiy.sh (71, new file)

@@ -0,0 +1,71 @@
+#!/usr/bin/env bash
+source <(curl -s https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)
+# Copyright (c) 2021-2025 community-scripts ORG
+# Author: Slaviša Arežina (tremor021)
+# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
+# Source: https://github.com/stackblitz-labs/bolt.diy/
+
+APP="boltdiy"
+TAGS="code;ai"
+var_cpu="2"
+var_ram="3072"
+var_disk="6"
+var_os="debian"
+var_version="12"
+var_unprivileged="1"
+
+header_info "$APP"
+variables
+color
+catch_errors
+
+function update_script() {
+header_info
+check_container_storage
+check_container_resources
+if [[ ! -d /opt/bolt.diy ]]; then
+msg_error "No ${APP} Installation Found!"
+exit
+fi
+RELEASE=$(curl -s https://api.github.com/repos/stackblitz-labs/bolt.diy/releases/latest | grep "tag_name" | awk '{print substr($2, 3, length($2)-4) }')
+if [[ "${RELEASE}" != "$(cat /opt/boltdiy_version.txt)" ]] || [[ ! -f /opt/boltdiy_version.txt ]]; then
+msg_info "Stopping $APP"
+systemctl stop boltdiy
+msg_ok "Stopped $APP"
+
+msg_info "Updating $APP to v${RELEASE}"
+temp_dir=$(mktemp -d)
+temp_file=$(mktemp)
+cd $temp_dir
+wget -q "https://github.com/stackblitz-labs/bolt.diy/archive/refs/tags/v${RELEASE}.tar.gz" -O $temp_file
+tar xzf $temp_file
+cp -rf bolt.diy-${RELEASE}/* /opt/bolt.diy
+cd /opt/bolt.diy
+$STD pnpm install
+msg_ok "Updated $APP to v${RELEASE}"
+
+msg_info "Starting $APP"
+systemctl start boltdiy
+msg_ok "Started $APP"
+
+msg_info "Cleaning Up"
+rm -rf $temp_file
+rm -rf $temp_dir
+msg_ok "Cleanup Completed"
+
+echo "${RELEASE}" >/opt/boltdiy_version.txt
+msg_ok "Update Successful"
+else
+msg_ok "No update required. ${APP} is already at v${RELEASE}"
+fi
+exit
+}
+
+start
+build_container
+description
+
+msg_ok "Completed Successfully!\n"
+echo -e "${CREATING}${GN}${APP} setup has been successfully initialized!${CL}"
+echo -e "${INFO}${YW} Access it using the following URL:${CL}"
+echo -e "${TAB}${GATEWAY}${BGN}http://${IP}:5173${CL}"
@@ -39,12 +39,13 @@ function update_script() {
 unzip -q /opt/v${RELEASE}.zip -d /opt
 mv /opt/BookStack-${RELEASE} /opt/bookstack
 cp /opt/bookstack-backup/.env /opt/bookstack/.env
-cp -r /opt/bookstack-backup/public/uploads/* /opt/bookstack/public/uploads/ 2>/dev/null || true
-cp -r /opt/bookstack-backup/storage/uploads/* /opt/bookstack/storage/uploads/ 2>/dev/null || true
-cp -r /opt/bookstack-backup/themes/* /opt/bookstack/themes/ 2>/dev/null || true
+cp -r /opt/bookstack-backup/public/uploads/* /opt/bookstack/public/uploads/ || true
+cp -r /opt/bookstack-backup/storage/uploads/* /opt/bookstack/storage/uploads/ || true
+cp -r /opt/bookstack-backup/themes/* /opt/bookstack/themes/ || true
 cd /opt/bookstack
-COMPOSER_ALLOW_SUPERUSER=1 composer install --no-dev &>/dev/null
-php artisan migrate --force &>/dev/null
+export COMPOSER_ALLOW_SUPERUSER=1
+$STD composer install --no-dev
+$STD php artisan migrate --force
 chown www-data:www-data -R /opt/bookstack /opt/bookstack/bootstrap/cache /opt/bookstack/public/uploads /opt/bookstack/storage
 chmod -R 755 /opt/bookstack /opt/bookstack/bootstrap/cache /opt/bookstack/public/uploads /opt/bookstack/storage
 chmod -R 775 /opt/bookstack/storage /opt/bookstack/bootstrap/cache /opt/bookstack/public/uploads
@@ -8,7 +8,7 @@ source <(curl -s https://raw.githubusercontent.com/community-scripts/ProxmoxVE/m
 APP="BunkerWeb"
 var_tags="webserver"
 var_cpu="2"
-var_ram="1024"
+var_ram="4096"
 var_disk="4"
 var_os="debian"
 var_version="12"
@@ -34,7 +34,7 @@ Pin: version ${RELEASE}
 Pin-Priority: 1001
 EOF
 apt-get update
-apt-get install -y nginx=1.26.2*
+apt-get install -y nginx=1.26.3*
 apt-get install -y bunkerweb=${RELEASE}
 echo "${RELEASE}" >/opt/${APP}_version.txt
 msg_ok "Updated ${APP} to ${RELEASE}"
ct/bytestash.sh (73, new file)

@@ -0,0 +1,73 @@
+#!/usr/bin/env bash
+source <(curl -s https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)
+# Copyright (c) 2021-2025 community-scripts ORG
+# Author: Slaviša Arežina (tremor021)
+# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
+# Source: https://github.com/jordan-dalby/ByteStash
+
+APP="ByteStash"
+var_tags="code"
+var_disk="4"
+var_cpu="1"
+var_ram="1024"
+var_os="debian"
+var_version="12"
+var_unprivileged="1"
+
+header_info "$APP"
+variables
+color
+catch_errors
+
+function update_script() {
+header_info
+check_container_storage
+check_container_resources
+if [[ ! -d /opt/bytestash ]]; then
+msg_error "No ${APP} Installation Found!"
+exit
+fi
+RELEASE=$(curl -s https://api.github.com/repos/jordan-dalby/ByteStash/releases/latest | grep "tag_name" | awk '{print substr($2, 3, length($2)-4) }')
+if [[ ! -f /opt/${APP}_version.txt ]] || [[ "${RELEASE}" != "$(cat /opt/${APP}_version.txt)" ]]; then
+msg_info "Stopping Services"
+systemctl stop bytestash-backend
+systemctl stop bytestash-frontend
+msg_ok "Services Stopped"
+
+msg_info "Updating ${APP} to ${RELEASE}"
+temp_file=$(mktemp)
+wget -q "https://github.com/jordan-dalby/ByteStash/archive/refs/tags/v${RELEASE}.tar.gz" -O $temp_file
+tar zxf $temp_file
+rm -rf /opt/bytestash/server/node_modules
+rm -rf /opt/bytestash/client/node_modules
+cp -rf ByteStash-${RELEASE}/* /opt/bytestash
+cd /opt/bytestash/server
+$STD npm install
+cd /opt/bytestash/client
+$STD npm install
+echo "${RELEASE}" >/opt/${APP}_version.txt
+msg_ok "Updated ${APP}"
+
+msg_info "Starting Services"
+systemctl start bytestash-backend
+systemctl start bytestash-frontend
+msg_ok "Started Services"
+
+msg_info "Cleaning Up"
+rm -f $temp_file
+msg_ok "Cleaned"
+msg_ok "Updated Successfully"
+else
+msg_ok "No update required. ${APP} is already at ${RELEASE}"
+fi
+exit
+}
+
+start
+build_container
+description
+
+msg_ok "Completed Successfully!\n"
+echo -e "${CREATING}${GN}${APP} setup has been successfully initialized!${CL}"
+echo -e "${INFO}${YW} Access it using the following URL:${CL}"
+echo -e "${TAB}${GATEWAY}${BGN}http://${IP}:3000${CL}"
@@ -28,8 +28,8 @@ function update_script() {
 exit
 fi
 msg_info "Updating $APP LXC"
-apt-get update &>/dev/null
-apt-get -y upgrade &>/dev/null
+$STD apt-get update
+$STD apt-get -y upgrade
 msg_ok "Updated $APP LXC"
 exit
 }
@@ -115,10 +115,10 @@ function update_script() {
 echo "${options[*]}"
 )
 echo $cps_options >/opt/calibre-web/options.txt
-pip install --upgrade calibreweb[$cps_options] &>/dev/null
+$STD pip install --upgrade calibreweb[$cps_options]
 else
 rm -rf /opt/calibre-web/options.txt
-pip install --upgrade calibreweb &>/dev/null
+$STD pip install --upgrade calibreweb
 fi

 msg_info "Starting ${APP}"
@@ -28,8 +28,8 @@ function update_script() {
 exit
 fi
 msg_info "Updating ${APP} LXC"
-apt-get update &>/dev/null
-apt-get -y upgrade &>/dev/null
+$STD apt-get update
+$STD apt-get -y upgrade
 msg_ok "Updated ${APP} LXC"
 exit
 }
@@ -31,31 +31,31 @@ function update_script() {

 if ! dpkg -s libjpeg-dev >/dev/null 2>&1; then
 msg_info "Installing Dependencies"
-apt-get update
-apt-get install -y libjpeg-dev
+$STD apt-get update
+$STD apt-get install -y libjpeg-dev
 msg_ok "Updated Dependencies"
 fi

 msg_info "Updating ${APP}"
-pip3 install changedetection.io --upgrade &>/dev/null
+$STD pip3 install changedetection.io --upgrade
 msg_ok "Updated ${APP}"

 msg_info "Updating Playwright"
-pip3 install playwright --upgrade &>/dev/null
+$STD pip3 install playwright --upgrade
 msg_ok "Updated Playwright"

 if [[ -f /etc/systemd/system/browserless.service ]]; then
 msg_info "Updating Browserless (Patience)"
-git -C /opt/browserless/ fetch --all &>/dev/null
-git -C /opt/browserless/ reset --hard origin/main &>/dev/null
-npm update --prefix /opt/browserless &>/dev/null
-/opt/browserless/node_modules/playwright-core/cli.js install --with-deps &>/dev/null
+$STD git -C /opt/browserless/ fetch --all
+$STD git -C /opt/browserless/ reset --hard origin/main
+$STD npm update --prefix /opt/browserless
+$STD /opt/browserless/node_modules/playwright-core/cli.js install --with-deps
 # Update Chrome separately, as it has to be done with the force option. Otherwise the installation of other browsers will not be done if Chrome is already installed.
-/opt/browserless/node_modules/playwright-core/cli.js install --force chrome &>/dev/null
-/opt/browserless/node_modules/playwright-core/cli.js install chromium firefox webkit &>/dev/null
-npm run build --prefix /opt/browserless &>/dev/null
-npm run build:function --prefix /opt/browserless &>/dev/null
-npm prune production --prefix /opt/browserless &>/dev/null
+$STD /opt/browserless/node_modules/playwright-core/cli.js install --force chrome
+$STD /opt/browserless/node_modules/playwright-core/cli.js install chromium firefox webkit
+$STD npm run build --prefix /opt/browserless
+$STD npm run build:function --prefix /opt/browserless
+$STD npm prune production --prefix /opt/browserless
 systemctl restart browserless
 msg_ok "Updated Browserless"
 else
@@ -27,7 +27,7 @@ function update_script() {
 msg_error "No ${APP} Installation Found!"
 exit
 fi
-msg_error "There is currently no update path available."
+msg_error "Currently we don't provide an update function for this ${APP}."
 exit
 }

@@ -30,14 +30,14 @@ function update_script() {
 RELEASE=$(curl -fsSL https://api.github.com/repos/checkmk/checkmk/tags | grep "name" | awk '{print substr($2, 3, length($2)-4) }' | tr ' ' '\n' | grep -v '\-rc' | sort -V | tail -n 1)
 if [[ ! -f /opt/${APP}_version.txt ]] || [[ "${RELEASE}" != "$(cat /opt/${APP}_version.txt)" ]]; then
 msg_info "Updating ${APP} to v${RELEASE}"
-omd stop monitoring &>/dev/null
-omd cp monitoring monitoringbackup &>/dev/null
+$STD omd stop monitoring
+$STD omd cp monitoring monitoringbackup
 wget -q https://download.checkmk.com/checkmk/${RELEASE}/check-mk-raw-${RELEASE}_0.bookworm_amd64.deb -O /opt/checkmk.deb
-apt-get install -y /opt/checkmk.deb &>/dev/null
-omd --force -V ${RELEASE}.cre update --conflict=install monitoring &>/dev/null
-omd start monitoring &>/dev/null
-omd -f rm monitoringbackup &>/dev/null
-omd cleanup &>/dev/null
+$STD apt-get install -y /opt/checkmk.deb
+$STD omd --force -V ${RELEASE}.cre update --conflict=install monitoring
+$STD omd start monitoring
+$STD omd -f rm monitoringbackup
+$STD omd cleanup
 rm -rf /opt/checkmk.deb
 msg_ok "Updated ${APP} to v${RELEASE}"
 else
@@ -28,8 +28,8 @@ function update_script() {
 exit
 fi
 msg_info "Updating $APP LXC"
-apt-get update &>/dev/null
-apt-get -y upgrade &>/dev/null
+$STD apt-get update
+$STD apt-get -y upgrade
 msg_ok "Updated $APP LXC"
 exit
 }
@@ -36,48 +36,48 @@ function update_script() {

 if [ "$UPD" == "1" ]; then
 msg_info "Updating ${APP} LXC"
-apt-get update &>/dev/null
-apt-get -y upgrade &>/dev/null
+$STD apt-get update
+$STD apt-get -y upgrade
 msg_ok "Updated ${APP} LXC"
 exit
 fi
 if [ "$UPD" == "2" ]; then
 msg_info "Installing dependencies (patience)"
-apt-get install -y attr &>/dev/null
-apt-get install -y nfs-kernel-server &>/dev/null
-apt-get install -y samba &>/dev/null
-apt-get install -y samba-common-bin &>/dev/null
-apt-get install -y winbind &>/dev/null
-apt-get install -y gawk &>/dev/null
+$STD apt-get install -y attr
+$STD apt-get install -y nfs-kernel-server
+$STD apt-get install -y samba
+$STD apt-get install -y samba-common-bin
+$STD apt-get install -y winbind
+$STD apt-get install -y gawk
 msg_ok "Installed dependencies"
 msg_info "Installing Cockpit file sharing"
 wget -q $(curl -s https://api.github.com/repos/45Drives/cockpit-file-sharing/releases/latest | grep download | grep focal_all.deb | cut -d\" -f4)
-dpkg -i cockpit-file-sharing_*focal_all.deb &>/dev/null
+$STD dpkg -i cockpit-file-sharing_*focal_all.deb
 rm cockpit-file-sharing_*focal_all.deb
 msg_ok "Installed Cockpit file sharing"
 exit
 fi
 if [ "$UPD" == "3" ]; then
 msg_info "Installing dependencies (patience)"
-apt-get install -y psmisc &>/dev/null
-apt-get install -y samba &>/dev/null
-apt-get install -y samba-common-bin &>/dev/null
+$STD apt-get install -y psmisc
+$STD apt-get install -y samba
+$STD apt-get install -y samba-common-bin
 msg_ok "Installed dependencies"
 msg_info "Installing Cockpit identities"
 wget -q $(curl -s https://api.github.com/repos/45Drives/cockpit-identities/releases/latest | grep download | grep focal_all.deb | cut -d\" -f4)
-dpkg -i cockpit-identities_*focal_all.deb &>/dev/null
+$STD dpkg -i cockpit-identities_*focal_all.deb
 rm cockpit-identities_*focal_all.deb
 msg_ok "Installed Cockpit identities"
 exit
 fi
 if [ "$UPD" == "4" ]; then
 msg_info "Installing dependencies"
-apt-get install -y rsync &>/dev/null
-apt-get install -y zip &>/dev/null
+$STD apt-get install -y rsync
+$STD apt-get install -y zip
 msg_ok "Installed dependencies"
 msg_info "Installing Cockpit navigator"
 wget -q $(curl -s https://api.github.com/repos/45Drives/cockpit-navigator/releases/latest | grep download | grep focal_all.deb | cut -d\" -f4)
-dpkg -i cockpit-navigator_*focal_all.deb &>/dev/null
+$STD dpkg -i cockpit-navigator_*focal_all.deb
 rm cockpit-navigator_*focal_all.deb
 msg_ok "Installed Cockpit navigator"
 exit
133
ct/create_lxc.sh
133
ct/create_lxc.sh
@ -36,7 +36,7 @@ trap 'error_handler $LINENO "$BASH_COMMAND"' ERR
|
|||||||
|
|
||||||
# This function handles errors
|
# This function handles errors
|
||||||
function error_handler() {
|
function error_handler() {
|
||||||
if [ -n "$SPINNER_PID" ] && ps -p $SPINNER_PID > /dev/null; then kill $SPINNER_PID > /dev/null; fi
|
if [ -n "$SPINNER_PID" ] && ps -p $SPINNER_PID >/dev/null; then kill $SPINNER_PID >/dev/null; fi
|
||||||
printf "\e[?25h"
|
printf "\e[?25h"
|
||||||
local exit_code="$?"
|
local exit_code="$?"
|
||||||
local line_number="$1"
|
local line_number="$1"
|
||||||
@ -51,13 +51,13 @@ function spinner() {
|
|||||||
local frames=('⠋' '⠙' '⠹' '⠸' '⠼' '⠴' '⠦' '⠧' '⠇' '⠏')
|
local frames=('⠋' '⠙' '⠹' '⠸' '⠼' '⠴' '⠦' '⠧' '⠇' '⠏')
|
||||||
local spin_i=0
|
local spin_i=0
|
||||||
local interval=0.1
|
local interval=0.1
|
||||||
printf "\e[?25l"
|
printf "\e[?25l"
|
||||||
|
|
||||||
local color="${YWB}"
|
local color="${YWB}"
|
||||||
|
|
||||||
while true; do
|
while true; do
|
||||||
printf "\r ${color}%s${CL}" "${frames[spin_i]}"
|
printf "\r ${color}%s${CL}" "${frames[spin_i]}"
|
||||||
spin_i=$(( (spin_i + 1) % ${#frames[@]} ))
|
spin_i=$(((spin_i + 1) % ${#frames[@]}))
|
||||||
sleep "$interval"
|
sleep "$interval"
|
||||||
done
|
done
|
||||||
}
|
}
|
||||||
@ -70,9 +70,16 @@ function msg_info() {
|
|||||||
SPINNER_PID=$!
|
SPINNER_PID=$!
|
||||||
}
|
}
|
||||||
|
|
||||||
|
function msg_warn() {
|
||||||
|
if [ -n "$SPINNER_PID" ] && ps -p $SPINNER_PID >/dev/null; then kill $SPINNER_PID >/dev/null; fi
|
||||||
|
printf "\e[?25h"
|
||||||
|
local msg="$1"
|
||||||
|
echo -e "${BFR}${INFO}${YWB}${msg}${CL}"
|
||||||
|
}
|
||||||
|
|
||||||
# This function displays a success message with a green color.
|
# This function displays a success message with a green color.
|
||||||
function msg_ok() {
|
function msg_ok() {
|
||||||
if [ -n "$SPINNER_PID" ] && ps -p $SPINNER_PID > /dev/null; then kill $SPINNER_PID > /dev/null; fi
|
if [ -n "$SPINNER_PID" ] && ps -p $SPINNER_PID >/dev/null; then kill $SPINNER_PID >/dev/null; fi
|
||||||
printf "\e[?25h"
|
printf "\e[?25h"
|
||||||
local msg="$1"
|
local msg="$1"
|
||||||
echo -e "${BFR}${CM}${GN}${msg}${CL}"
|
echo -e "${BFR}${CM}${GN}${msg}${CL}"
|
||||||
@ -80,7 +87,7 @@ function msg_ok() {
|
|||||||
|
|
||||||
# This function displays a error message with a red color.
|
# This function displays a error message with a red color.
|
||||||
function msg_error() {
|
function msg_error() {
|
||||||
if [ -n "$SPINNER_PID" ] && ps -p $SPINNER_PID > /dev/null; then kill $SPINNER_PID > /dev/null; fi
|
if [ -n "$SPINNER_PID" ] && ps -p $SPINNER_PID >/dev/null; then kill $SPINNER_PID >/dev/null; fi
|
||||||
printf "\e[?25h"
|
printf "\e[?25h"
|
||||||
local msg="$1"
|
local msg="$1"
|
||||||
echo -e "${BFR}${CROSS}${RD}${msg}${CL}"
|
echo -e "${BFR}${CROSS}${RD}${msg}${CL}"
|
||||||
@ -113,9 +120,12 @@ function select_storage() {
|
|||||||
CONTENT='vztmpl'
|
CONTENT='vztmpl'
|
||||||
CONTENT_LABEL='Container template'
|
CONTENT_LABEL='Container template'
|
||||||
;;
|
;;
|
||||||
*) false || { msg_error "Invalid storage class."; exit 201; };
|
*) false || {
|
||||||
|
msg_error "Invalid storage class."
|
||||||
|
exit 201
|
||||||
|
} ;;
|
||||||
esac
|
esac
|
||||||
|
|
||||||
# This Queries all storage locations
|
# This Queries all storage locations
|
||||||
local -a MENU
|
local -a MENU
|
||||||
while read -r line; do
|
while read -r line; do
|
||||||
@ -129,34 +139,60 @@ function select_storage() {
|
|||||||
fi
|
fi
|
||||||
MENU+=("$TAG" "$ITEM" "OFF")
|
MENU+=("$TAG" "$ITEM" "OFF")
|
||||||
done < <(pvesm status -content $CONTENT | awk 'NR>1')
|
done < <(pvesm status -content $CONTENT | awk 'NR>1')
|
||||||
|
|
||||||
# Select storage location
|
# Select storage location
|
||||||
if [ $((${#MENU[@]}/3)) -eq 1 ]; then
|
if [ $((${#MENU[@]} / 3)) -eq 1 ]; then
|
||||||
printf ${MENU[0]}
|
printf ${MENU[0]}
|
||||||
else
|
else
|
||||||
local STORAGE
|
local STORAGE
|
||||||
while [ -z "${STORAGE:+x}" ]; do
|
while [ -z "${STORAGE:+x}" ]; do
|
||||||
STORAGE=$(whiptail --backtitle "Proxmox VE Helper Scripts" --title "Storage Pools" --radiolist \
|
STORAGE=$(whiptail --backtitle "Proxmox VE Helper Scripts" --title "Storage Pools" --radiolist \
|
||||||
"Which storage pool you would like to use for the ${CONTENT_LABEL,,}?\nTo make a selection, use the Spacebar.\n" \
|
"Which storage pool you would like to use for the ${CONTENT_LABEL,,}?\nTo make a selection, use the Spacebar.\n" \
|
||||||
16 $(($MSG_MAX_LENGTH + 23)) 6 \
|
16 $(($MSG_MAX_LENGTH + 23)) 6 \
|
||||||
"${MENU[@]}" 3>&1 1>&2 2>&3) || { msg_error "Menu aborted."; exit 202; }
|
"${MENU[@]}" 3>&1 1>&2 2>&3) || {
|
||||||
|
msg_error "Menu aborted."
|
||||||
|
exit 202
|
||||||
|
}
|
||||||
if [ $? -ne 0 ]; then
|
if [ $? -ne 0 ]; then
|
||||||
echo -e "${CROSS}${RD} Menu aborted by user.${CL}"
|
echo -e "${CROSS}${RD} Menu aborted by user.${CL}"
|
||||||
exit 0
|
exit 0
|
||||||
fi
|
fi
|
||||||
done
|
done
|
||||||
printf "%s" "$STORAGE"
|
printf "%s" "$STORAGE"
|
||||||
fi
|
fi
|
||||||
}
|
}
|
||||||
# Test if required variables are set
|
# Test if required variables are set
|
||||||
[[ "${CTID:-}" ]] || { msg_error "You need to set 'CTID' variable."; exit 203; }
|
[[ "${CTID:-}" ]] || {
|
||||||
[[ "${PCT_OSTYPE:-}" ]] || { msg_error "You need to set 'PCT_OSTYPE' variable."; exit 204; }
|
msg_error "You need to set 'CTID' variable."
|
||||||
|
exit 203
|
||||||
|
}
|
||||||
|
[[ "${PCT_OSTYPE:-}" ]] || {
|
||||||
|
msg_error "You need to set 'PCT_OSTYPE' variable."
|
||||||
|
exit 204
|
||||||
|
}
|
||||||
|
|
||||||
# Test if ID is valid
|
# Test if ID is valid
|
||||||
[ "$CTID" -ge "100" ] || { msg_error "ID cannot be less than 100."; exit 205; }
|
[ "$CTID" -ge "100" ] || {
|
||||||
|
msg_error "ID cannot be less than 100."
|
||||||
|
exit 205
|
||||||
|
}
|
||||||
|
|
||||||
|
# Check for network connectivity (IPv4 & IPv6)
|
||||||
|
#function check_network() {
|
||||||
|
# local CHECK_URLS=("8.8.8.8" "1.1.1.1" "9.9.9.9" "2606:4700:4700::1111" "2001:4860:4860::8888" "2620:fe::fe")
|
||||||
|
#
|
||||||
|
# for url in "${CHECK_URLS[@]}"; do
|
||||||
|
# if ping -c 1 -W 2 "$url" &>/dev/null; then
|
||||||
|
# return 0 # Success: At least one connection works
|
||||||
|
# fi
|
||||||
|
# done
|
||||||
|
#
|
||||||
|
# msg_error "No network connection detected. Check your internet connection."
|
||||||
|
# exit 101
|
||||||
|
#}
|
||||||
|
|
||||||
# Test if ID is in use
|
# Test if ID is in use
|
||||||
if pct status $CTID &>/dev/null; then
|
if qm status "$CTID" &>/dev/null || pct status "$CTID" &>/dev/null; then
|
||||||
echo -e "ID '$CTID' is already in use."
|
echo -e "ID '$CTID' is already in use."
|
||||||
unset CTID
|
unset CTID
|
||||||
msg_error "Cannot use ID that is already in use."
|
msg_error "Cannot use ID that is already in use."
|
||||||
@@ -173,46 +209,73 @@ msg_ok "Using ${BL}$CONTAINER_STORAGE${CL} ${GN}for Container Storage."
 
 # Update LXC template list
 msg_info "Updating LXC Template List"
+#check_network
 pveam update >/dev/null
 msg_ok "Updated LXC Template List"
 
 # Get LXC template string
 TEMPLATE_SEARCH=${PCT_OSTYPE}-${PCT_OSVERSION:-}
 mapfile -t TEMPLATES < <(pveam available -section system | sed -n "s/.*\($TEMPLATE_SEARCH.*\)/\1/p" | sort -t - -k 2 -V)
-[ ${#TEMPLATES[@]} -gt 0 ] || { msg_error "Unable to find a template when searching for '$TEMPLATE_SEARCH'."; exit 207; }
+[ ${#TEMPLATES[@]} -gt 0 ] || {
+  msg_error "Unable to find a template when searching for '$TEMPLATE_SEARCH'."
+  exit 207
+}
 TEMPLATE="${TEMPLATES[-1]}"
 
 TEMPLATE_PATH="/var/lib/vz/template/cache/$TEMPLATE"
 # Check if template exists, if corrupt remove and redownload
-if ! pveam list "$TEMPLATE_STORAGE" | grep -q "$TEMPLATE"; then
+if ! pveam list "$TEMPLATE_STORAGE" | grep -q "$TEMPLATE" || ! zstdcat "$TEMPLATE_PATH" | tar -tf - >/dev/null 2>&1; then
+  msg_warn "Template $TEMPLATE not found in storage or seems to be corrupted. Redownloading."
   [[ -f "$TEMPLATE_PATH" ]] && rm -f "$TEMPLATE_PATH"
-  msg_info "Downloading LXC Template"
-  pveam download "$TEMPLATE_STORAGE" "$TEMPLATE" >/dev/null ||
-    { msg_error "A problem occurred while downloading the LXC template."; exit 208; }
-  msg_ok "Downloaded LXC Template"
+  # Download with 3 attempts
+  for attempt in {1..3}; do
+    msg_info "Attempt $attempt: Downloading LXC template..."
+
+    if timeout 120 pveam download "$TEMPLATE_STORAGE" "$TEMPLATE" >/dev/null; then
+      msg_ok "Template download successful."
+      break
+    fi
+
+    if [ $attempt -eq 3 ]; then
+      msg_error "Three failed attempts. Aborting."
+      exit 208
+    fi
+
+    sleep $((attempt * 5))
+  done
 fi
+msg_ok "LXC Template is ready to use."
 
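The new download loop pairs a hard 120-second cap on each attempt with a linearly growing pause between failures. A generic sketch of the same retry pattern, for reuse outside this hunk (the `retry_cmd` name is illustrative, not part of build.func):

```bash
# Sketch only: retry an arbitrary command up to N times, capping each attempt at
# 120 seconds and sleeping a little longer after every failure (5s, 10s, 15s, ...).
retry_cmd() {
  local tries="$1"; shift
  local attempt
  for attempt in $(seq 1 "$tries"); do
    if timeout 120 "$@"; then
      return 0
    fi
    [ "$attempt" -eq "$tries" ] && return 1
    sleep $((attempt * 5))
  done
}

# Example, mirroring the template download above:
# retry_cmd 3 pveam download "$TEMPLATE_STORAGE" "$TEMPLATE" >/dev/null || exit 208
```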
 # Check and fix subuid/subgid
-grep -q "root:100000:65536" /etc/subuid || echo "root:100000:65536" >> /etc/subuid
+grep -q "root:100000:65536" /etc/subuid || echo "root:100000:65536" >>/etc/subuid
-grep -q "root:100000:65536" /etc/subgid || echo "root:100000:65536" >> /etc/subgid
+grep -q "root:100000:65536" /etc/subgid || echo "root:100000:65536" >>/etc/subgid
 
 # Combine all options
 PCT_OPTIONS=(${PCT_OPTIONS[@]:-${DEFAULT_PCT_OPTIONS[@]}})
 [[ " ${PCT_OPTIONS[@]} " =~ " -rootfs " ]] || PCT_OPTIONS+=(-rootfs "$CONTAINER_STORAGE:${PCT_DISK_SIZE:-8}")
 
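The subuid/subgid lines rely on an append-if-missing idiom so rerunning build.func never duplicates the mapping. A generic sketch of that idiom (the `ensure_line` helper is illustrative; the script itself matches a substring with plain `grep -q`, whereas `-qxF` below insists on an exact literal line):

```bash
# Sketch: append a line to a file only when an identical line is not already present.
ensure_line() {
  local line="$1" file="$2"
  grep -qxF "$line" "$file" 2>/dev/null || echo "$line" >>"$file"
}

# Example, mirroring the diff above:
# ensure_line "root:100000:65536" /etc/subuid
# ensure_line "root:100000:65536" /etc/subgid
```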
-# Create container with template integrity check
 msg_info "Creating LXC Container"
 if ! pct create "$CTID" "${TEMPLATE_STORAGE}:vztmpl/${TEMPLATE}" "${PCT_OPTIONS[@]}" &>/dev/null; then
-  [[ -f "$TEMPLATE_PATH" ]] && rm -f "$TEMPLATE_PATH"
-  msg_ok "Template integrity check completed"
-  pveam download "$TEMPLATE_STORAGE" "$TEMPLATE" >/dev/null ||
-    { msg_error "A problem occurred while re-downloading the LXC template."; exit 208; }
+  msg_error "Container creation failed. Checking if template is corrupted."
+
+  if ! zstdcat "$TEMPLATE_PATH" | tar -tf - >/dev/null 2>&1; then
+    msg_error "Template appears to be corrupted. Removing and re-downloading."
+    rm -f "$TEMPLATE_PATH"
+
+    if ! timeout 120 pveam download "$TEMPLATE_STORAGE" "$TEMPLATE" >/dev/null; then
+      msg_error "Failed to re-download template."
+      exit 208
+    fi
+
     msg_ok "Re-downloaded LXC Template"
 
     if ! pct create "$CTID" "${TEMPLATE_STORAGE}:vztmpl/${TEMPLATE}" "${PCT_OPTIONS[@]}" &>/dev/null; then
-      msg_error "A problem occurred while trying to create container after re-downloading template."
+      msg_error "Container creation failed after re-downloading template."
       exit 200
     fi
+  else
+    msg_error "Container creation failed, but template is not corrupted."
+    exit 209
   fi
+fi
 msg_ok "LXC Container ${BL}$CTID${CL} ${GN}was successfully created."
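Both the pre-download check and this failure path probe the cached template the same way: stream it through zstd and ask tar to list the archive, so a truncated or corrupt download is caught without extracting anything. A standalone sketch of that probe (the `verify_template` name is illustrative; Proxmox system templates are normally `.tar.zst`, which is the case this handles):

```bash
# Sketch: succeed only if the cached LXC template is a listable zstd-compressed tarball.
verify_template() {
  local path="$1"
  [ -f "$path" ] || return 1
  zstdcat "$path" | tar -tf - >/dev/null 2>&1
}

# Example:
# verify_template "$TEMPLATE_PATH" || { rm -f "$TEMPLATE_PATH"; echo "re-download needed"; }
```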
@@ -36,12 +36,12 @@ function update_script() {
   if [[ "$(node -v | cut -d 'v' -f 2)" == "18."* ]]; then
     if ! command -v npm >/dev/null 2>&1; then
       echo "Installing NPM..."
-      apt-get install -y npm >/dev/null 2>&1
+      $STD apt-get install -y npm
       echo "Installed NPM..."
     fi
   fi
   msg_info "Updating ${APP}"
-  /opt/cronicle/bin/control.sh upgrade &>/dev/null
+  $STD /opt/cronicle/bin/control.sh upgrade
   msg_ok "Updated ${APP}"
   exit
 fi
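This hunk and most of those that follow swap ad-hoc `&>/dev/null` redirections for the repo's `$STD` prefix, which silences a command in normal runs but lets its output through when verbose mode is enabled. The exact definition lives in the project's shared function files; a rough sketch of how such a wrapper is commonly wired up (the `silent` and `VERBOSE` names here are illustrative, not quoted from the repo):

```bash
# Sketch only: expand to a silencing wrapper unless the user asked for verbose output.
silent() { "$@" >/dev/null 2>&1; }

if [ "${VERBOSE:-no}" = "yes" ]; then
  STD=""         # $STD expands to nothing: commands run normally and print output
else
  STD="silent"   # $STD prepends the wrapper: stdout and stderr are discarded
fi

# Usage, as in the hunks above:
# $STD apt-get install -y npm
```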
@@ -49,7 +49,7 @@ function update_script() {
   if [[ "$(node -v | cut -d 'v' -f 2)" == "18."* ]]; then
     if ! command -v npm >/dev/null 2>&1; then
       echo "Installing NPM..."
-      apt-get install -y npm >/dev/null 2>&1
+      $STD apt-get install -y npm
       echo "Installed NPM..."
     fi
   fi
@@ -57,12 +57,12 @@ function update_script() {
   IP=$(hostname -I | awk '{print $1}')
   msg_info "Installing Dependencies"
-  apt-get install -y git &>/dev/null
-  apt-get install -y make &>/dev/null
-  apt-get install -y g++ &>/dev/null
-  apt-get install -y gcc &>/dev/null
-  apt-get install -y ca-certificates &>/dev/null
-  apt-get install -y gnupg &>/dev/null
+  $STD apt-get install -y git
+  $STD apt-get install -y make
+  $STD apt-get install -y g++
+  $STD apt-get install -y gcc
+  $STD apt-get install -y ca-certificates
+  $STD apt-get install -y gnupg
   msg_ok "Installed Dependencies"

   msg_info "Setting up Node.js Repository"
@@ -72,21 +72,21 @@ function update_script() {
   msg_ok "Set up Node.js Repository"

   msg_info "Installing Node.js"
-  apt-get update &>/dev/null
-  apt-get install -y nodejs &>/dev/null
+  $STD apt-get update
+  $STD apt-get install -y nodejs
   msg_ok "Installed Node.js"

   msg_info "Installing Cronicle Worker"
   mkdir -p /opt/cronicle
   cd /opt/cronicle
-  tar zxvf <(curl -fsSL https://github.com/jhuckaby/Cronicle/archive/${LATEST}.tar.gz) --strip-components 1 &>/dev/null
-  npm install &>/dev/null
-  node bin/build.js dist &>/dev/null
+  $STD tar zxvf <(curl -fsSL https://github.com/jhuckaby/Cronicle/archive/${LATEST}.tar.gz) --strip-components 1
+  $STD npm install
+  $STD node bin/build.js dist
   sed -i "s/localhost:3012/${IP}:3012/g" /opt/cronicle/conf/config.json
-  /opt/cronicle/bin/control.sh start &>/dev/null
-  cp /opt/cronicle/bin/cronicled.init /etc/init.d/cronicled &>/dev/null
+  $STD /opt/cronicle/bin/control.sh start
+  $STD cp /opt/cronicle/bin/cronicled.init /etc/init.d/cronicled
   chmod 775 /etc/init.d/cronicled
-  update-rc.d cronicled defaults &>/dev/null
+  $STD update-rc.d cronicled defaults
   msg_ok "Installed Cronicle Worker"
   echo -e "\n Add Masters secret key to /opt/cronicle/conf/config.json \n"
   exit
@@ -29,7 +29,7 @@ function update_script() {
   latest_version=$(npm show cross-seed version)
   if [ "$current_version" != "$latest_version" ]; then
     msg_info "Updating ${APP} from version v${current_version} to v${latest_version}"
-    npm install -g cross-seed@latest &> /dev/null
+    $STD npm install -g cross-seed@latest
     systemctl restart cross-seed
     msg_ok "Updated Successfully"
   else
@@ -28,8 +28,8 @@ function update_script() {
     exit
   fi
   msg_info "Updating $APP LXC"
-  apt-get update &>/dev/null
-  apt-get -y upgrade &>/dev/null
+  $STD apt-get update
+  $STD apt-get -y upgrade
   msg_ok "Updated $APP LXC"
   exit
 }
@@ -28,8 +28,8 @@ function update_script() {
     exit
   fi
   msg_info "Updating $APP LXC"
-  apt-get update &>/dev/null
-  apt-get -y upgrade &>/dev/null
+  $STD apt-get update
+  $STD apt-get -y upgrade
   msg_ok "Updated $APP LXC"
   exit
 }
@@ -28,8 +28,8 @@ function update_script() {
     exit
   fi
   msg_info "Updating $APP LXC"
-  apt-get update &>/dev/null
-  apt-get -y upgrade &>/dev/null
+  $STD apt-get update
+  $STD apt-get -y upgrade
   msg_ok "Updated $APP LXC"
   exit
 }
@@ -28,7 +28,7 @@ function update_script() {
     exit
   fi
   msg_info "Updating $APP LXC"
-  apt-get update &>/dev/null
+  $STD apt-get update
   pip3 install deluge[all] --upgrade
   msg_ok "Updated $APP LXC"
   exit
@@ -28,8 +28,8 @@ function update_script() {
     exit
   fi
   msg_info "Updating ${APP} LXC"
-  apt-get update &>/dev/null
-  apt-get -y upgrade &>/dev/null
+  $STD apt-get update
+  $STD apt-get -y upgrade
   msg_ok "Updated ${APP} LXC"
   exit
 }
ct/docmost.sh (new file, 72 lines)
@@ -0,0 +1,72 @@
#!/usr/bin/env bash
source <(curl -s https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)
# Copyright (c) 2021-2025 community-scripts ORG
# Author: MickLesk (CanbiZ)
# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
# Source: https://docmost.com/

APP="Docmost"
var_tags="documents"
var_cpu="3"
var_ram="3072"
var_disk="7"
var_os="debian"
var_version="12"

header_info "$APP"
variables
color
catch_errors

function update_script() {
  header_info
  check_container_storage
  check_container_resources
  if [[ ! -d /opt/docmost ]]; then
    msg_error "No ${APP} Installation Found!"
    exit
  fi
  RELEASE=$(curl -s https://api.github.com/repos/docmost/docmost/releases/latest | grep "tag_name" | awk '{print substr($2, 3, length($2)-4) }')
  if [[ ! -f /opt/${APP}_version.txt ]] || [[ "${RELEASE}" != "$(cat /opt/${APP}_version.txt)" ]]; then
    msg_info "Stopping ${APP}"
    systemctl stop docmost
    msg_ok "${APP} Stopped"

    msg_info "Updating ${APP} to v${RELEASE}"
    cp /opt/docmost/.env /opt/
    cp -r /opt/docmost/data /opt/
    rm -rf /opt/docmost
    temp_file=$(mktemp)
    wget -q "https://github.com/docmost/docmost/archive/refs/tags/v${RELEASE}.tar.gz" -O "$temp_file"
    tar -xzf "$temp_file"
    mv docmost-${RELEASE} /opt/docmost
    cd /opt/docmost
    mv /opt/.env /opt/docmost/.env
    mv /opt/data /opt/docmost/data
    $STD pnpm install --force
    $STD pnpm build
    echo "${RELEASE}" >/opt/${APP}_version.txt
    msg_ok "Updated ${APP}"

    msg_info "Starting ${APP}"
    systemctl start docmost
    msg_ok "Started ${APP}"

    msg_info "Cleaning Up"
    rm -f ${temp_file}
    msg_ok "Cleaned"
    msg_ok "Updated Successfully"
  else
    msg_ok "No update required. ${APP} is already at ${RELEASE}"
  fi
  exit
}

start
build_container
description

msg_ok "Completed Successfully!\n"
echo -e "${CREATING}${GN}${APP} setup has been successfully initialized!${CL}"
echo -e "${INFO}${YW} Access it using the following URL:${CL}"
echo -e "${TAB}${GATEWAY}${BGN}http://${IP}:3000${CL}"
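Like most ct/*.sh updaters, the script above keys everything off a version stamp under /opt: fetch the latest GitHub tag, compare it with the recorded value, and only then rebuild. A stripped-down sketch of that pattern (the `fetch_latest_tag` helper and the echo placeholders are illustrative; the real script parses the tag with awk and performs the full stop/swap/build/start cycle):

```bash
# Sketch: act only when the latest published tag differs from the locally recorded one.
fetch_latest_tag() {
  curl -s "https://api.github.com/repos/$1/releases/latest" |
    grep '"tag_name"' | sed -E 's/.*"v?([^"]+)".*/\1/'
}

RELEASE=$(fetch_latest_tag "docmost/docmost")
if [[ ! -f /opt/Docmost_version.txt ]] || [[ "${RELEASE}" != "$(cat /opt/Docmost_version.txt)" ]]; then
  echo "would update to ${RELEASE}"        # placeholder for stop/replace/build/start
  echo "${RELEASE}" >/opt/Docmost_version.txt
else
  echo "No update required. Already at ${RELEASE}"
fi
```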
ct/dolibarr.sh (new file, 41 lines)
@@ -0,0 +1,41 @@
#!/usr/bin/env bash
source <(curl -s https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)
# Copyright (c) 2021-2025 community-scripts ORG
# Author: Slaviša Arežina (tremor021)
# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
# Source: https://github.com/Dolibarr/dolibarr/

APP="Dolibarr"
var_tags="erp;accounting"
var_cpu="1"
var_ram="2048"
var_disk="6"
var_os="debian"
var_version="12"
var_unprivileged="1"

header_info "$APP"
variables
color
catch_errors

function update_script() {
  header_info
  check_container_storage
  check_container_resources
  if [[ ! -d /usr/share/dolibarr ]]; then
    msg_error "No ${APP} Installation Found!"
    exit
  fi
  msg_error "To update ${APP}, use the applications web interface."
  exit
}

start
build_container
description

msg_ok "Completed Successfully!\n"
echo -e "${CREATING}${GN}${APP} setup has been successfully initialized!${CL}"
echo -e "${INFO}${YW} Access it using the following URL:${CL}"
echo -e "${TAB}${GATEWAY}${BGN}http://${IP}/dolibarr/install${CL}"
@@ -28,8 +28,8 @@ function update_script() {
     exit
   fi
   msg_info "Updating ${APP} LXC"
-  apt-get update &>/dev/null
-  apt-get -y upgrade &>/dev/null
+  $STD apt-get update
+  $STD apt-get -y upgrade
   msg_ok "Updated Successfully"
   exit
 }
@@ -28,8 +28,8 @@ function update_script() {
     exit
   fi
   msg_info "Updating $APP LXC"
-  apt-get update &>/dev/null
-  apt-get -y upgrade &>/dev/null
+  $STD apt-get update
+  $STD apt-get -y upgrade
   msg_ok "Updated $APP LXC"
   exit
 }
@@ -33,8 +33,8 @@ function update_script() {
   msg_ok "Stopped ${APP}"

   msg_info "Updating ${APP}"
-  wget https://github.com/MediaBrowser/Emby.Releases/releases/download/${LATEST}/emby-server-deb_${LATEST}_amd64.deb &>/dev/null
-  dpkg -i emby-server-deb_${LATEST}_amd64.deb &>/dev/null
+  $STD wget https://github.com/MediaBrowser/Emby.Releases/releases/download/${LATEST}/emby-server-deb_${LATEST}_amd64.deb
+  $STD dpkg -i emby-server-deb_${LATEST}_amd64.deb
   rm emby-server-deb_${LATEST}_amd64.deb
   msg_ok "Updated ${APP}"
@@ -28,8 +28,8 @@ function update_script() {
     exit
   fi
   msg_info "Updating $APP LXC"
-  apt-get update &>/dev/null
-  apt-get -y upgrade &>/dev/null
+  $STD apt-get update
+  $STD apt-get -y upgrade
   msg_ok "Updated $APP LXC"
   exit
 }
@@ -33,9 +33,9 @@ function update_script() {

   msg_info "Updating ESPHome"
   if [[ -d /srv/esphome ]]; then
-    source /srv/esphome/bin/activate &>/dev/null
+    $STD source /srv/esphome/bin/activate
   fi
-  pip3 install -U esphome &>/dev/null
+  $STD pip3 install -U esphome
   msg_ok "Updated ESPHome"

   msg_info "Starting ESPHome"
@@ -28,8 +28,8 @@ function update_script() {
     exit
   fi
   msg_info "Updating evcc LXC"
-  apt update &>/dev/null
-  apt --only-upgrade install -y evcc &>/dev/null
+  $STD apt update
+  $STD apt --only-upgrade install -y evcc
   msg_ok "Updated Successfully"
   exit
 }
@@ -43,7 +43,7 @@ function update_script() {
   rm -rf /opt/excalidraw
   mv excalidraw-${RELEASE} /opt/excalidraw
   cd /opt/excalidraw
-  yarn &> /dev/null
+  $STD yarn
   msg_ok "Updated $APP to v${RELEASE}"

   msg_info "Starting $APP"
@@ -27,7 +27,7 @@ function update_script() {
     msg_error "No ${APP} Installation Found!"
     exit
   fi
-  msg_error "There is currently no update path available."
+  msg_error "Currently we don't provide an update function for this ${APP}."
   exit
   msg_info "Updating ${APP}"
   systemctl stop ${APP}
@@ -28,8 +28,8 @@ function update_script() {
     exit
   fi
   msg_info "Updating ${APP} LXC"
-  apt-get update &>/dev/null
-  apt-get -y upgrade &>/dev/null
+  $STD apt-get update
+  $STD apt-get -y upgrade
   msg_ok "Updated Successfully"
   exit
 }
@@ -37,21 +37,19 @@ check_container_resources
   msg_info "Updating ${APP} to v${RELEASE}"
   cp /opt/firefly/.env /opt/.env
   cp -r /opt/firefly/storage /opt/storage
-  rm -rf /opt/firefly/*
   cd /opt
   wget -q "https://github.com/firefly-iii/firefly-iii/releases/download/v${RELEASE}/FireflyIII-v${RELEASE}.tar.gz"
   tar -xzf FireflyIII-v${RELEASE}.tar.gz -C /opt/firefly --exclude='storage'
+  cp /opt/.env /opt/firefly/.env
+  cp -r /opt/storage /opt/firefly/storage
   cd /opt/firefly
-  composer install --no-dev --no-interaction &>/dev/null
-  php artisan migrate --seed --force &>/dev/null
-  php artisan firefly:decrypt-all &>/dev/null
-  php artisan cache:clear &>/dev/null
-  php artisan view:clear &>/dev/null
-  php artisan firefly:upgrade-database &>/dev/null
-  php artisan firefly:laravel-passport-keys &>/dev/null
   chown -R www-data:www-data /opt/firefly
   chmod -R 775 /opt/firefly/storage
+  $STD php artisan migrate --seed --force
+  $STD php artisan cache:clear
+  $STD php artisan view:clear
+  $STD php artisan firefly-iii:upgrade-database
+  $STD php artisan firefly-iii:laravel-passport-keys
   echo "${RELEASE}" >"/opt/${APP}_version.txt"
   msg_ok "Updated ${APP} to v${RELEASE}"

@@ -76,4 +74,4 @@ description
 msg_ok "Completed Successfully!\n"
 echo -e "${CREATING}${GN}${APP} setup has been successfully initialized!${CL}"
 echo -e "${INFO}${YW} Access it using the following URL:${CL}"
 echo -e "${TAB}${GATEWAY}${BGN}http://${IP}${CL}"
@@ -30,7 +30,7 @@ function update_script() {
   latest_version=$(npm show ghost-cli version)
   if [ "$current_version" != "$latest_version" ]; then
     msg_info "Updating ${APP} from version v${current_version} to v${latest_version}"
-    npm install -g ghost-cli@latest &> /dev/null
+    $STD npm install -g ghost-cli@latest
     msg_ok "Updated Successfully"
   else
     msg_ok "${APP} is already at v${current_version}"
@@ -30,7 +30,7 @@ function update_script() {
   fi
   RELEASE=$(curl -s https://api.github.com/repos/glpi-project/glpi/releases/latest | grep '"tag_name"' | sed -E 's/.*"tag_name": "([^"]+)".*/\1/')
   if [[ ! -f /opt/${APP}_version.txt ]] || [[ "${RELEASE}" != "$(cat /opt/${APP}_version.txt)" ]]; then
-    msg_error "Ther is currently no automatic update function for ${APP}."
+    msg_error "Currently we don't provide an update function for this ${APP}."
   else
     msg_ok "No update required. ${APP} is already at v${RELEASE}."
   fi
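The GLPI check pulls the release tag straight out of the GitHub API response with grep and sed, which avoids adding dependencies to the container. Where jq happens to be available, the same lookup is often written as below (an illustrative alternative, not what the script uses):

```bash
# Sketch: equivalent tag lookup using jq instead of grep/sed.
RELEASE=$(curl -s https://api.github.com/repos/glpi-project/glpi/releases/latest | jq -r '.tag_name')
echo "latest GLPI release: ${RELEASE}"
```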
@@ -27,7 +27,7 @@ function update_script() {
     msg_error "No ${APP} Installation Found!"
     exit
   fi
-  msg_error "There is currently no update path available."
+  msg_error "Currently we don't provide an update function for this ${APP}."
   exit
 }

@@ -28,8 +28,8 @@ function update_script() {
     exit
   fi
   msg_info "Updating ${APP}"
-  apt-get update &>/dev/null
-  apt-get -y upgrade &>/dev/null
+  $STD apt-get update
+  $STD apt-get -y upgrade
   msg_ok "Updated Successfully"
   exit
 }
@@ -34,8 +34,8 @@ function update_script() {
   msg_ok "Stopped $APP"

   msg_info "Updating $APP"
-  apt-get update &> /dev/null
-  apt-get upgrade -y &> /dev/null
+  $STD apt-get update
+  $STD apt-get upgrade -y
   msg_ok "Updated $APP"

   msg_info "Starting $APP"
ct/grist.sh (26 lines changed)
@@ -34,18 +34,36 @@ function update_script() {
   msg_ok "Stopped ${APP} Service"

   msg_info "Updating ${APP} to v${RELEASE}"

   cd /opt
   rm -rf grist_bak
   mv grist grist_bak
   wget -q https://github.com/gristlabs/grist-core/archive/refs/tags/v${RELEASE}.zip
   unzip -q v$RELEASE.zip
   mv grist-core-${RELEASE} grist
-  cp -n /opt/grist_bak/.env /opt/grist/.env
+  mkdir -p grist/docs
+
+  cp -n grist_bak/.env grist/.env || true
+  cp -r grist_bak/docs/* grist/docs/ || true
+  cp grist_bak/grist-sessions.db grist/grist-sessions.db || true
+  cp grist_bak/landing.db grist/landing.db || true
+
   cd grist
-  yarn install >/dev/null 2>&1
-  yarn run build:prod >/dev/null 2>&1
-  yarn run install:python >/dev/null 2>&1
+  msg_info "Installing Dependencies"
+  $STD yarn install
+  msg_ok "Installed Dependencies"
+
+  msg_info "Building"
+  $STD yarn run build:prod
+  msg_ok "Done building"
+
+  msg_info "Installing Python"
+  $STD yarn run install:python
+  msg_ok "Installed Python"
+
   echo "${RELEASE}" >/opt/${APP}_version.txt

   msg_ok "Updated ${APP} to v${RELEASE}"

   msg_info "Starting ${APP} Service"
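The rewritten Grist update keeps user data across the /opt/grist swap by copying it back out of the grist_bak directory, with `|| true` so a missing file (for example a landing.db that was never created) does not trip the script's error trap. A condensed sketch of that restore step (the `restore_from_backup` name is illustrative; the file names come from the hunk above):

```bash
# Sketch: restore user state from the previous install, tolerating missing pieces.
restore_from_backup() {
  local bak="/opt/grist_bak" new="/opt/grist"
  mkdir -p "$new/docs"
  cp -n "$bak/.env" "$new/.env" || true                        # keep an existing .env if one is already there
  cp -r "$bak"/docs/* "$new/docs/" || true                     # user documents
  cp "$bak/grist-sessions.db" "$new/grist-sessions.db" || true
  cp "$bak/landing.db" "$new/landing.db" || true
}
```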
ct/headers/authelia (new file, 6 lines): figlet-style ASCII banner reading "Authelia"
ct/headers/boltdiy (new file, 6 lines): figlet-style ASCII banner reading "boltdiy"
ct/headers/bytestash (new file, 6 lines): figlet-style ASCII banner reading "ByteStash"
ct/headers/docmost (new file, 6 lines): figlet-style ASCII banner reading "Docmost"
ct/headers/dolibarr (new file, 6 lines): figlet-style ASCII banner reading "Dolibarr"
ct/headers/hev-socks5-server (new file, 6 lines): figlet-style ASCII banner reading "hev-socks5-server"
ct/headers/jupyternotebook (new file, 6 lines): figlet-style ASCII banner reading "JupyterNotebook"
ct/headers/outline (new file, 6 lines): figlet-style ASCII banner reading "Outline"
Some files were not shown because too many files have changed in this diff.