Compare commits

...

38 Commits

Author SHA1 Message Date
Nikola Jokic
d37dfa565e Merge branch 'main' into nikola-jokic/rm-dockerd 2024-02-08 20:07:36 +01:00
Nikola Jokic
7cb4e05e53 Remove dockerd but leave docker cli in the image. Bump docker version 2024-02-08 20:03:57 +01:00
Luke Tomlinson
31318d81ba Prepare v2.313.0 Release (#3137)
* update runnerversion

* update releaseNote.md

* update-releasenote
2024-02-07 15:15:32 -05:00
Nikola Jokic
1d47bfa6c7 Bump hook version to 0.5.1 (#3129)
Co-authored-by: Tingluo Huang <tingluohuang@github.com>
2024-02-07 19:55:33 +00:00
Luke Tomlinson
651ea42e00 Handle ForceTokenRefresh message (#3133)
* Handle ForceTokenRefresh message

* move to constants

* format
2024-02-07 11:24:40 -05:00
Lukas Hauser
bcc665a7a1 Fix CVE-2024-21626 (#3123)
Co-authored-by: Ferenc Hammerl <31069338+fhammerl@users.noreply.github.com>
2024-02-07 17:06:57 +01:00
Patrick Ellis
cd812f0395 Add sshd to .devcontainer.json (#3079)
This is required in order to ssh into a codespace for the repo:

> error getting ssh server details: failed to start SSH server:
> Please check if an SSH server is installed in the container.
> If the docker image being used for the codespace does not have an SSH server,
> install it in your Dockerfile or, for codespaces that use Debian-based images,
> you can add the following to your devcontainer.json:
> 
> "features": {
>     "ghcr.io/devcontainers/features/sshd:1": {
>         "version": "latest"
>     }
> }
2024-02-06 15:06:07 +00:00
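With the sshd feature in place (see the devcontainer.json diff further down), the usual way to reach the codespace over SSH is through the GitHub CLI; a minimal sketch, assuming `gh` is installed and authenticated:

```
# List available codespaces, then open an SSH session into one.
# <codespace-name> is a placeholder; this relies on the sshd feature added above.
gh codespace list
gh codespace ssh --codespace <codespace-name>
```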
Josh Soref
fa874cf314 Improve error report for invalid action.yml (#3106) 2024-02-06 13:40:53 +00:00
Bethany
bf0e76631b Specify Content-Type for BlockBlob upload (#3119)
* add content-type to block blob upload

* Add content-type for sdk path

* fix spacing

* merge headers and only when file extension is .txt

* add conditions

* tweak conditions and path matching

* pass in headers

* add content-type for appendblob
2024-02-05 18:19:02 +00:00
Yang Cao
1d82031a2c Make sure to drain the upload queue before clean temp directory (#3125) 2024-02-02 20:37:45 +00:00
Victor Sollerhed
d1a619ff09 Upgrade docker from 24.0.8 to 24.0.9 (#3126)
Release notes:
- https://docs.docker.com/engine/release-notes/24.0/#2409
2024-02-02 09:46:12 -05:00
Jonathan Tamsut
11680fc78f Upload the diagnostic logs to the Results Service (#3114)
* upload diagnostic logs to results

* correct casing

* correct typo

* update contract

* update constant and define private method

* fix access

* fix reference to logs url

* update

* use results func

* fix method naming

* rename to match rpc

* change API contract

* fix lint issue

* refactor

* remove unused method

* create new log type
2024-02-01 14:27:06 -05:00
Victor Sollerhed
3e5433ec86 Upgrade docker from 24.0.7 to 24.0.8 (#3124)
This (among other things) fixes the high-severity `CVE-2024-21626`:
- https://github.com/advisories/GHSA-xr7r-f8xq-vfvv

Release notes:
- https://docs.docker.com/engine/release-notes/24.0/#2408
2024-02-01 11:10:46 -05:00
Tingluo Huang
b647b890c5 Only keep 1 older version runner around after self-upgrade. (#3122) 2024-01-31 10:03:49 -05:00
Luke Tomlinson
894c50073a Implement Broker Redirects for Session and Messages (#3103) 2024-01-30 20:57:49 +00:00
Tingluo Huang
5268d74ade Fix JobDispatcher crash during force cancellation. (#3118) 2024-01-30 15:24:22 -05:00
Tingluo Huang
7414e08fbd Add user-agent to all http clients using RawClientHttpRequestSettings. (#3115) 2024-01-30 13:39:41 -05:00
Tingluo Huang
dcb790f780 Fix release workflow. (#3102) 2024-01-30 13:25:58 -05:00
Tingluo Huang
b7ab810945 Make embedded timeline record has same order as its parent record. (#3113) 2024-01-26 17:26:30 -05:00
Tingluo Huang
7310ba0a08 Revert "Bump container hook version to 0.5.0 in runner image (#3003)" (#3101)
This reverts commit c7d65c42d6.
2024-01-18 11:50:36 -05:00
Diogo Torres
e842959e3e Bump docker and buildx to the latest version (#3100) 2024-01-18 10:42:44 -05:00
Tingluo Huang
9f19310b5b Prepare 2.312.0 runner release. (#3092) 2024-01-16 16:44:30 -05:00
Thomas Boop
84220a21d1 Patch Curl to no longer use -k (#3091)
* Update externals.sh

* Update externals.sh
2024-01-16 13:39:21 -05:00
github-actions[bot]
8e0cd36cd8 Upgrade dotnet sdk to v6.0.418 (#3085)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2024-01-10 12:48:40 -05:00
Tingluo Huang
f1f18f67e1 Remove runner trimmed packages. (#3074) 2024-01-10 12:32:17 -05:00
Yang Cao
ac39c4bd0a Use Azure SDK to upload files to Azure Blob (#3033) 2024-01-08 16:13:06 -05:00
Tingluo Huang
3f3d9b0d99 Extend --check to check Results-Receiver service. (#3078)
* Extend --check to check Results-Receiver service.

* Update docs/checks/actions.md

Co-authored-by: Christopher Schleiden <cschleiden@live.de>

---------

Co-authored-by: Christopher Schleiden <cschleiden@live.de>
2024-01-05 14:18:59 -05:00
adjn
af485fb660 Update envlinux.md (#3040)
Matching supported distros to https://docs.github.com/en/actions/hosting-your-own-runners/managing-self-hosted-runners/about-self-hosted-runners
2024-01-02 11:09:10 -05:00
Stefan Ruvceski
9e3e57ff90 close reason update (#3027)
* Update close-bugs-bot.yml

* Update close-features-bot.yml
2023-12-06 17:12:55 +01:00
Tingluo Huang
ac89b31d2f Add codeload to the list of service we check during '--check'. (#3011) 2023-11-28 13:39:39 -05:00
Tingluo Huang
65201ff6be Include whether http proxy configured as part of UserAgent. (#3009) 2023-11-28 09:43:13 -05:00
Tingluo Huang
661b261959 Mark job as failed on worker crash. (#3006) 2023-11-27 16:43:36 -05:00
Hidetake Iwata
8a25302ba3 Set ImageOS environment variable in runner images (#2878)
Co-authored-by: Nikola Jokic <jokicnikola07@gmail.com>
2023-11-23 15:04:54 +01:00
Nikola Jokic
c7d65c42d6 Bump container hook version to 0.5.0 in runner image (#3003) 2023-11-22 14:52:08 +01:00
Luke Tomlinson
a9bae6f37a Handle SelfUpdate Flow when Package is provided in Message (#2926) 2023-11-13 16:44:07 -05:00
Luke Tomlinson
3136ce3a71 Send disableUpdate as query parameter (#2970) 2023-11-13 10:19:42 -05:00
Stefan Ruvceski
a4c57f2747 Create close-features and close-bugs bot for runner issues (#2909)
* Create close-bugs-bot.yml

close bugs bot

* Update close-bugs-bot.yml

* Update close-bugs-bot.yml

* Update close-bugs-bot.yml

* Create close-features-bot.yml

* Update close-bugs-bot.yml

* Update close-features-bot.yml

* Update close-bugs-bot.yml

---------

Co-authored-by: Thomas Boop <52323235+thboop@users.noreply.github.com>
2023-10-27 10:06:10 +00:00
AJ Schmidt
ce4e62c849 fix buildx installation (#2952) 2023-10-25 13:59:39 -04:00
62 changed files with 2133 additions and 2624 deletions


@@ -4,10 +4,13 @@
"features": { "features": {
"ghcr.io/devcontainers/features/docker-in-docker:1": {}, "ghcr.io/devcontainers/features/docker-in-docker:1": {},
"ghcr.io/devcontainers/features/dotnet": { "ghcr.io/devcontainers/features/dotnet": {
"version": "6.0.415" "version": "6.0.418"
}, },
"ghcr.io/devcontainers/features/node:1": { "ghcr.io/devcontainers/features/node:1": {
"version": "16" "version": "16"
},
"ghcr.io/devcontainers/features/sshd:1": {
"version": "latest"
} }
}, },
"customizations": { "customizations": {


@@ -58,29 +58,6 @@ jobs:
         ${{ matrix.devScript }} layout Release ${{ matrix.runtime }}
       working-directory: src
-    # Check runtime/externals hash
-    - name: Compute/Compare runtime and externals Hash
-      shell: bash
-      run: |
-        echo "Current dotnet runtime hash result: $DOTNET_RUNTIME_HASH"
-        echo "Current Externals hash result: $EXTERNALS_HASH"
-        NeedUpdate=0
-        if [ "$EXTERNALS_HASH" != "$(cat ./src/Misc/contentHash/externals/${{ matrix.runtime }})" ] ;then
-          echo Hash mismatch, Update ./src/Misc/contentHash/externals/${{ matrix.runtime }} to $EXTERNALS_HASH
-          NeedUpdate=1
-        fi
-        if [ "$DOTNET_RUNTIME_HASH" != "$(cat ./src/Misc/contentHash/dotnetRuntime/${{ matrix.runtime }})" ] ;then
-          echo Hash mismatch, Update ./src/Misc/contentHash/dotnetRuntime/${{ matrix.runtime }} to $DOTNET_RUNTIME_HASH
-          NeedUpdate=1
-        fi
-        exit $NeedUpdate
-      env:
-        DOTNET_RUNTIME_HASH: ${{hashFiles('**/_layout_trims/runtime/**/*')}}
-        EXTERNALS_HASH: ${{hashFiles('**/_layout_trims/externals/**/*')}}
     # Run tests
     - name: L0
       run: |
@@ -103,6 +80,3 @@ jobs:
           name: runner-package-${{ matrix.runtime }}
           path: |
             _package
-            _package_trims/trim_externals
-            _package_trims/trim_runtime
-            _package_trims/trim_runtime_externals

.github/workflows/close-bugs-bot.yml (new file)

@@ -0,0 +1,17 @@
name: Close Bugs Bot
on:
workflow_dispatch:
schedule:
- cron: '0 0 * * *' # every day at midnight
jobs:
stale:
runs-on: ubuntu-latest
steps:
- uses: actions/stale@v8
with:
close-issue-message: "This issue does not seem to be a problem with the runner application, it concerns the GitHub actions platform more generally. Could you please post your feedback on the [GitHub Community Support Forum](https://github.com/orgs/community/discussions/categories/actions) which is actively monitored. Using the forum ensures that we route your problem to the correct team. 😃"
exempt-issue-labels: "keep"
stale-issue-label: "actions-bug"
only-labels: "actions-bug"
days-before-stale: 0
days-before-close: 1


@@ -0,0 +1,17 @@
name: Close Features Bot
on:
workflow_dispatch:
schedule:
- cron: '0 0 * * *' # every day at midnight
jobs:
stale:
runs-on: ubuntu-latest
steps:
- uses: actions/stale@v8
with:
close-issue-message: "Thank you for your interest in the runner application and taking the time to provide your valuable feedback. We kindly ask you to redirect this feedback to the [GitHub Community Support Forum](https://github.com/orgs/community/discussions/categories/actions-and-packages) which our team actively monitors and would be a better place to start a discussion for new feature requests in GitHub Actions. For more information on this policy please [read our contribution guidelines](https://github.com/actions/runner#contribute). 😃"
exempt-issue-labels: "keep"
stale-issue-label: "actions-feature"
only-labels: "actions-feature"
days-before-stale: 0
days-before-close: 1
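
Both stale-bot workflows above run on the nightly cron but also declare `workflow_dispatch`, so they can be exercised on demand; a minimal sketch using the GitHub CLI (assuming `gh` is authenticated against a repo containing these files):

```
# Manually trigger either bot instead of waiting for the midnight schedule.
# The repo slug is only an example.
gh workflow run close-bugs-bot.yml --repo actions/runner
gh workflow run close-features-bot.yml --repo actions/runner

# Inspect the most recent runs of one of the bots.
gh run list --workflow=close-bugs-bot.yml --repo actions/runner --limit 5
```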


@@ -84,221 +84,20 @@ jobs:
git commit -a -m "Upgrade dotnet sdk to v${{ steps.fetch_latest_version.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}"
git push --set-upstream origin $branch_name
-build-hashes:
+create-pr:
-if: ${{ needs.dotnet-update.outputs.SHOULD_UPDATE == 1 && needs.dotnet-update.outputs.BRANCH_EXISTS == 0 }}
needs: [dotnet-update]
-outputs:
+if: ${{ needs.dotnet-update.outputs.SHOULD_UPDATE == 1 && needs.dotnet-update.outputs.BRANCH_EXISTS == 0 }}
-# pass outputs from this job to create-pr for use
+runs-on: ubuntu-latest
DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION: ${{ needs.dotnet-update.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}
DOTNET_CURRENT_MAJOR_MINOR_VERSION: ${{ needs.dotnet-update.outputs.DOTNET_CURRENT_MAJOR_MINOR_VERSION }}
NEEDS_HASH_UPDATE: ${{ steps.compute-hash.outputs.NEED_UPDATE }}
strategy:
fail-fast: false
matrix:
runtime: [ linux-x64, linux-arm64, linux-arm, win-x64, win-arm64, osx-x64, osx-arm64 ]
include:
- runtime: linux-x64
os: ubuntu-latest
devScript: ./dev.sh
- runtime: linux-arm64
os: ubuntu-latest
devScript: ./dev.sh
- runtime: linux-arm
os: ubuntu-latest
devScript: ./dev.sh
- runtime: osx-x64
os: macOS-latest
devScript: ./dev.sh
- runtime: osx-arm64
os: macOS-latest
devScript: ./dev.sh
- runtime: win-x64
os: windows-2019
devScript: ./dev
- runtime: win-arm64
os: windows-latest
devScript: ./dev
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v3
with:
ref: feature/dotnetsdk-upgrade/${{ needs.dotnet-update.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}
# Build runner layout
- name: Build & Layout Release
run: |
${{ matrix.devScript }} layout Release ${{ matrix.runtime }}
working-directory: src
# Check runtime/externals hash
- name: Compute/Compare runtime and externals Hash
id: compute-hash
continue-on-error: true
shell: bash
run: |
echo "Current dotnet runtime hash result: $DOTNET_RUNTIME_HASH"
echo "Current Externals hash result: $EXTERNALS_HASH"
NeedUpdate=0
if [ "$EXTERNALS_HASH" != "$(cat ./src/Misc/contentHash/externals/${{ matrix.runtime }})" ] ;then
echo Hash mismatch, Update ./src/Misc/contentHash/externals/${{ matrix.runtime }} to $EXTERNALS_HASH
echo "EXTERNAL_HASH=$EXTERNALS_HASH" >> $GITHUB_OUTPUT
NeedUpdate=1
fi
if [ "$DOTNET_RUNTIME_HASH" != "$(cat ./src/Misc/contentHash/dotnetRuntime/${{ matrix.runtime }})" ] ;then
echo Hash mismatch, Update ./src/Misc/contentHash/dotnetRuntime/${{ matrix.runtime }} to $DOTNET_RUNTIME_HASH
echo "DOTNET_RUNTIME_HASH=$DOTNET_RUNTIME_HASH" >> $GITHUB_OUTPUT
NeedUpdate=1
fi
echo "NEED_UPDATE=$NeedUpdate" >> $GITHUB_OUTPUT
env:
DOTNET_RUNTIME_HASH: ${{hashFiles('**/_layout_trims/runtime/**/*')}}
EXTERNALS_HASH: ${{hashFiles('**/_layout_trims/externals/**/*')}}
- name: update hash
if: ${{ steps.compute-hash.outputs.NEED_UPDATE == 1 }}
shell: bash
run: |
ExternalHash=${{ steps.compute-hash.outputs.EXTERNAL_HASH }}
DotNetRuntimeHash=${{ steps.compute-hash.outputs.DOTNET_RUNTIME_HASH }}
if [ -n "$ExternalHash" ]; then
echo "$ExternalHash" > ./src/Misc/contentHash/externals/${{ matrix.runtime }}
fi
if [ -n "$DotNetRuntimeHash" ]; then
echo "$DotNetRuntimeHash" > ./src/Misc/contentHash/dotnetRuntime/${{ matrix.runtime }}
fi
- name: cache updated hashes
if: ${{ steps.compute-hash.outputs.NEED_UPDATE == 1 }}
uses: actions/cache/save@v3
with:
enableCrossOsArchive: true
path: |
./src/Misc/contentHash/externals/${{ matrix.runtime }}
./src/Misc/contentHash/dotnetRuntime/${{ matrix.runtime }}
key: compute-hashes-${{ matrix.runtime }}-${{ github.run_id }}-${{ github.run_number }}-${{ github.run_attempt }}
hash-update:
needs: [build-hashes]
if: ${{ needs.build-hashes.outputs.NEEDS_HASH_UPDATE == 1 }}
outputs:
# pass outputs from this job to create-pr for use
DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION: ${{ needs.build-hashes.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}
DOTNET_CURRENT_MAJOR_MINOR_VERSION: ${{ needs.build-hashes.outputs.DOTNET_CURRENT_MAJOR_MINOR_VERSION }}
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
with:
ref: feature/dotnetsdk-upgrade/${{ needs.build-hashes.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}
- name: Restore cached hashes - linux-x64
id: cache-restore-linux-x64
uses: actions/cache/restore@v3
with:
enableCrossOsArchive: true
path: |
./src/Misc/contentHash/externals/linux-x64
./src/Misc/contentHash/dotnetRuntime/linux-x64
key: compute-hashes-linux-x64-${{ github.run_id }}-${{ github.run_number }}-${{ github.run_attempt }}
- name: Restore cached hashes - linux-arm64
id: cache-restore-linux-arm64
uses: actions/cache/restore@v3
with:
enableCrossOsArchive: true
path: |
./src/Misc/contentHash/externals/linux-arm64
./src/Misc/contentHash/dotnetRuntime/linux-arm64
key: compute-hashes-linux-arm64-${{ github.run_id }}-${{ github.run_number }}-${{ github.run_attempt }}
- name: Restore cached hashes - linux-arm
id: cache-restore-linux-arm
uses: actions/cache/restore@v3
with:
enableCrossOsArchive: true
path: |
./src/Misc/contentHash/externals/linux-arm
./src/Misc/contentHash/dotnetRuntime/linux-arm
key: compute-hashes-linux-arm-${{ github.run_id }}-${{ github.run_number }}-${{ github.run_attempt }}
- name: Restore cached hashes - osx-x64
id: cache-restore-osx-x64
uses: actions/cache/restore@v3
with:
enableCrossOsArchive: true
path: |
./src/Misc/contentHash/externals/osx-x64
./src/Misc/contentHash/dotnetRuntime/osx-x64
key: compute-hashes-osx-x64-${{ github.run_id }}-${{ github.run_number }}-${{ github.run_attempt }}
- name: Restore cached hashes - osx-arm64
id: cache-restore-osx-arm64
uses: actions/cache/restore@v3
with:
enableCrossOsArchive: true
path: |
./src/Misc/contentHash/externals/osx-arm64
./src/Misc/contentHash/dotnetRuntime/osx-arm64
key: compute-hashes-osx-arm64-${{ github.run_id }}-${{ github.run_number }}-${{ github.run_attempt }}
- name: Restore cached hashes - win-x64
id: cache-restore-win-x64
uses: actions/cache/restore@v3
with:
enableCrossOsArchive: true
path: |
./src/Misc/contentHash/externals/win-x64
./src/Misc/contentHash/dotnetRuntime/win-x64
key: compute-hashes-win-x64-${{ github.run_id }}-${{ github.run_number }}-${{ github.run_attempt }}
- name: Restore cached hashes - win-arm64
id: cache-restore-win-arm64
uses: actions/cache/restore@v3
with:
enableCrossOsArchive: true
path: |
./src/Misc/contentHash/externals/win-arm64
./src/Misc/contentHash/dotnetRuntime/win-arm64
key: compute-hashes-win-arm64-${{ github.run_id }}-${{ github.run_number }}-${{ github.run_attempt }}
- name: Fetch cached computed hashes
if: steps.cache-restore-linux-x64.outputs.cache-hit == 'true' ||
steps.cache-restore-linux-arm64.outputs.cache-hit == 'true' ||
steps.cache-restore-linux-arm.outputs.cache-hit == 'true' ||
steps.cache-restore-win-x64.outputs.cache-hit == 'true' ||
steps.cache-restore-win-arm64.outputs.cache-hit == 'true' ||
steps.cache-restore-osx-x64.outputs.cache-hit == 'true' ||
steps.cache-restore-osx-arm64.outputs.cache-hit == 'true'
shell: bash
run: |
Environments=( "linux-x64" "linux-arm64" "linux-arm" "win-x64" "win-arm64" "osx-x64" "osx-arm64" )
git config --global user.name "github-actions[bot]"
git config --global user.email "41898282+github-actions[bot]@users.noreply.github.com"
git commit -a -m "Update computed hashes"
git push --set-upstream origin feature/dotnetsdk-upgrade/${{ needs.build-hashes.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}
create-pr:
needs: [hash-update]
outputs:
# pass outputs from this job to run-tests for use
DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION: ${{ needs.hash-update.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}
DOTNET_CURRENT_MAJOR_MINOR_VERSION: ${{ needs.hash-update.outputs.DOTNET_CURRENT_MAJOR_MINOR_VERSION }}
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
with:
ref: feature/dotnetsdk-upgrade/${{ needs.hash-update.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}
- name: Create Pull Request
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
-gh pr create -B main -H feature/dotnetsdk-upgrade/${{ needs.hash-update.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }} --title "Update dotnet sdk to latest version @${{ needs.hash-update.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}" --body "
+gh pr create -B main -H feature/dotnetsdk-upgrade/${{ needs.dotnet-update.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }} --title "Update dotnet sdk to latest version @${{ needs.dotnet-update.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}" --body "
-https://dotnetcli.blob.core.windows.net/dotnet/Sdk/${{ needs.hash-update.outputs.DOTNET_CURRENT_MAJOR_MINOR_VERSION }}/latest.version
+https://dotnetcli.blob.core.windows.net/dotnet/Sdk/${{ needs.dotnet-update.outputs.DOTNET_CURRENT_MAJOR_MINOR_VERSION }}/latest.version
---


@@ -53,27 +53,6 @@ jobs:
win-arm64-sha: ${{ steps.sha.outputs.win-arm64-sha256 }}
osx-x64-sha: ${{ steps.sha.outputs.osx-x64-sha256 }}
osx-arm64-sha: ${{ steps.sha.outputs.osx-arm64-sha256 }}
linux-x64-sha-noexternals: ${{ steps.sha_noexternals.outputs.linux-x64-sha256 }}
linux-arm64-sha-noexternals: ${{ steps.sha_noexternals.outputs.linux-arm64-sha256 }}
linux-arm-sha-noexternals: ${{ steps.sha_noexternals.outputs.linux-arm-sha256 }}
win-x64-sha-noexternals: ${{ steps.sha_noexternals.outputs.win-x64-sha256 }}
win-arm64-sha-noexternals: ${{ steps.sha_noexternals.outputs.win-arm64-sha256 }}
osx-x64-sha-noexternals: ${{ steps.sha_noexternals.outputs.osx-x64-sha256 }}
osx-arm64-sha-noexternals: ${{ steps.sha_noexternals.outputs.osx-arm64-sha256 }}
linux-x64-sha-noruntime: ${{ steps.sha_noruntime.outputs.linux-x64-sha256 }}
linux-arm64-sha-noruntime: ${{ steps.sha_noruntime.outputs.linux-arm64-sha256 }}
linux-arm-sha-noruntime: ${{ steps.sha_noruntime.outputs.linux-arm-sha256 }}
win-x64-sha-noruntime: ${{ steps.sha_noruntime.outputs.win-x64-sha256 }}
win-arm64-sha-noruntime: ${{ steps.sha_noruntime.outputs.win-arm64-sha256 }}
osx-x64-sha-noruntime: ${{ steps.sha_noruntime.outputs.osx-x64-sha256 }}
osx-arm64-sha-noruntime: ${{ steps.sha_noruntime.outputs.osx-arm64-sha256 }}
linux-x64-sha-noruntime-noexternals: ${{ steps.sha_noruntime_noexternals.outputs.linux-x64-sha256 }}
linux-arm64-sha-noruntime-noexternals: ${{ steps.sha_noruntime_noexternals.outputs.linux-arm64-sha256 }}
linux-arm-sha-noruntime-noexternals: ${{ steps.sha_noruntime_noexternals.outputs.linux-arm-sha256 }}
win-x64-sha-noruntime-noexternals: ${{ steps.sha_noruntime_noexternals.outputs.win-x64-sha256 }}
win-arm64-sha-noruntime-noexternals: ${{ steps.sha_noruntime_noexternals.outputs.win-arm64-sha256 }}
osx-x64-sha-noruntime-noexternals: ${{ steps.sha_noruntime_noexternals.outputs.osx-x64-sha256 }}
osx-arm64-sha-noruntime-noexternals: ${{ steps.sha_noruntime_noexternals.outputs.osx-arm64-sha256 }}
strategy:
matrix:
runtime: [ linux-x64, linux-arm64, linux-arm, win-x64, osx-x64, osx-arm64, win-arm64 ]
@@ -136,76 +115,6 @@ jobs:
id: sha
name: Compute SHA256
working-directory: _package
- run: |
file=$(ls)
sha=$(sha256sum $file | awk '{ print $1 }')
echo "Computed sha256: $sha for $file"
echo "${{matrix.runtime}}-sha256=$sha" >> $GITHUB_OUTPUT
echo "sha256=$sha" >> $GITHUB_OUTPUT
shell: bash
id: sha_noexternals
name: Compute SHA256
working-directory: _package_trims/trim_externals
- run: |
file=$(ls)
sha=$(sha256sum $file | awk '{ print $1 }')
echo "Computed sha256: $sha for $file"
echo "${{matrix.runtime}}-sha256=$sha" >> $GITHUB_OUTPUT
echo "sha256=$sha" >> $GITHUB_OUTPUT
shell: bash
id: sha_noruntime
name: Compute SHA256
working-directory: _package_trims/trim_runtime
- run: |
file=$(ls)
sha=$(sha256sum $file | awk '{ print $1 }')
echo "Computed sha256: $sha for $file"
echo "${{matrix.runtime}}-sha256=$sha" >> $GITHUB_OUTPUT
echo "sha256=$sha" >> $GITHUB_OUTPUT
shell: bash
id: sha_noruntime_noexternals
name: Compute SHA256
working-directory: _package_trims/trim_runtime_externals
- name: Create trimmedpackages.json for ${{ matrix.runtime }}
if: matrix.runtime == 'win-x64' || matrix.runtime == 'win-arm64'
uses: actions/github-script@0.3.0
with:
github-token: ${{secrets.GITHUB_TOKEN}}
script: |
const core = require('@actions/core')
const fs = require('fs');
const runnerVersion = fs.readFileSync('src/runnerversion', 'utf8').replace(/\n$/g, '')
var trimmedPackages = fs.readFileSync('src/Misc/trimmedpackages_zip.json', 'utf8').replace(/<RUNNER_VERSION>/g, runnerVersion).replace(/<RUNNER_PLATFORM>/g, '${{ matrix.runtime }}')
trimmedPackages = trimmedPackages.replace(/<RUNTIME_HASH>/g, '${{hashFiles('**/_layout_trims/runtime/**/*')}}')
trimmedPackages = trimmedPackages.replace(/<EXTERNALS_HASH>/g, '${{hashFiles('**/_layout_trims/externals/**/*')}}')
trimmedPackages = trimmedPackages.replace(/<NO_RUNTIME_EXTERNALS_HASH>/g, '${{steps.sha_noruntime_noexternals.outputs.sha256}}')
trimmedPackages = trimmedPackages.replace(/<NO_RUNTIME_HASH>/g, '${{steps.sha_noruntime.outputs.sha256}}')
trimmedPackages = trimmedPackages.replace(/<NO_EXTERNALS_HASH>/g, '${{steps.sha_noexternals.outputs.sha256}}')
console.log(trimmedPackages)
fs.writeFileSync('${{ matrix.runtime }}-trimmedpackages.json', trimmedPackages)
- name: Create trimmedpackages.json for ${{ matrix.runtime }}
if: matrix.runtime != 'win-x64' && matrix.runtime != 'win-arm64'
uses: actions/github-script@0.3.0
with:
github-token: ${{secrets.GITHUB_TOKEN}}
script: |
const core = require('@actions/core')
const fs = require('fs');
const runnerVersion = fs.readFileSync('src/runnerversion', 'utf8').replace(/\n$/g, '')
var trimmedPackages = fs.readFileSync('src/Misc/trimmedpackages_targz.json', 'utf8').replace(/<RUNNER_VERSION>/g, runnerVersion).replace(/<RUNNER_PLATFORM>/g, '${{ matrix.runtime }}')
trimmedPackages = trimmedPackages.replace(/<RUNTIME_HASH>/g, '${{hashFiles('**/_layout_trims/runtime/**/*')}}')
trimmedPackages = trimmedPackages.replace(/<EXTERNALS_HASH>/g, '${{hashFiles('**/_layout_trims/externals/**/*')}}')
trimmedPackages = trimmedPackages.replace(/<NO_RUNTIME_EXTERNALS_HASH>/g, '${{steps.sha_noruntime_noexternals.outputs.sha256}}')
trimmedPackages = trimmedPackages.replace(/<NO_RUNTIME_HASH>/g, '${{steps.sha_noruntime.outputs.sha256}}')
trimmedPackages = trimmedPackages.replace(/<NO_EXTERNALS_HASH>/g, '${{steps.sha_noexternals.outputs.sha256}}')
console.log(trimmedPackages)
fs.writeFileSync('${{ matrix.runtime }}-trimmedpackages.json', trimmedPackages)
# Upload runner package tar.gz/zip as artifact.
# Since each package name is unique, so we don't need to put ${{matrix}} info into artifact name
@@ -216,10 +125,6 @@ jobs:
name: runner-packages
path: |
_package
_package_trims/trim_externals
_package_trims/trim_runtime
_package_trims/trim_runtime_externals
${{ matrix.runtime }}-trimmedpackages.json
release:
needs: build
@@ -253,33 +158,11 @@ jobs:
releaseNote = releaseNote.replace(/<LINUX_X64_SHA>/g, '${{needs.build.outputs.linux-x64-sha}}')
releaseNote = releaseNote.replace(/<LINUX_ARM_SHA>/g, '${{needs.build.outputs.linux-arm-sha}}')
releaseNote = releaseNote.replace(/<LINUX_ARM64_SHA>/g, '${{needs.build.outputs.linux-arm64-sha}}')
releaseNote = releaseNote.replace(/<WIN_X64_SHA_NOEXTERNALS>/g, '${{needs.build.outputs.win-x64-sha-noexternals}}')
releaseNote = releaseNote.replace(/<WIN_ARM64_SHA_NOEXTERNALS>/g, '${{needs.build.outputs.win-arm64-sha-noexternals}}')
releaseNote = releaseNote.replace(/<OSX_X64_SHA_NOEXTERNALS>/g, '${{needs.build.outputs.osx-x64-sha-noexternals}}')
releaseNote = releaseNote.replace(/<OSX_ARM64_SHA_NOEXTERNALS>/g, '${{needs.build.outputs.osx-arm64-sha-noexternals}}')
releaseNote = releaseNote.replace(/<LINUX_X64_SHA_NOEXTERNALS>/g, '${{needs.build.outputs.linux-x64-sha-noexternals}}')
releaseNote = releaseNote.replace(/<LINUX_ARM_SHA_NOEXTERNALS>/g, '${{needs.build.outputs.linux-arm-sha-noexternals}}')
releaseNote = releaseNote.replace(/<LINUX_ARM64_SHA_NOEXTERNALS>/g, '${{needs.build.outputs.linux-arm64-sha-noexternals}}')
releaseNote = releaseNote.replace(/<WIN_X64_SHA_NORUNTIME>/g, '${{needs.build.outputs.win-x64-sha-noruntime}}')
releaseNote = releaseNote.replace(/<WIN_ARM64_SHA_NORUNTIME>/g, '${{needs.build.outputs.win-arm64-sha-noruntime}}')
releaseNote = releaseNote.replace(/<OSX_X64_SHA_NORUNTIME>/g, '${{needs.build.outputs.osx-x64-sha-noruntime}}')
releaseNote = releaseNote.replace(/<OSX_ARM64_SHA_NORUNTIME>/g, '${{needs.build.outputs.osx-arm64-sha-noruntime}}')
releaseNote = releaseNote.replace(/<LINUX_X64_SHA_NORUNTIME>/g, '${{needs.build.outputs.linux-x64-sha-noruntime}}')
releaseNote = releaseNote.replace(/<LINUX_ARM_SHA_NORUNTIME>/g, '${{needs.build.outputs.linux-arm-sha-noruntime}}')
releaseNote = releaseNote.replace(/<LINUX_ARM64_SHA_NORUNTIME>/g, '${{needs.build.outputs.linux-arm64-sha-noruntime}}')
releaseNote = releaseNote.replace(/<WIN_X64_SHA_NORUNTIME_NOEXTERNALS>/g, '${{needs.build.outputs.win-x64-sha-noruntime-noexternals}}')
releaseNote = releaseNote.replace(/<WIN_ARM64_SHA_NORUNTIME_NOEXTERNALS>/g, '${{needs.build.outputs.win-arm64-sha-noruntime-noexternals}}')
releaseNote = releaseNote.replace(/<OSX_X64_SHA_NORUNTIME_NOEXTERNALS>/g, '${{needs.build.outputs.osx-x64-sha-noruntime-noexternals}}')
releaseNote = releaseNote.replace(/<OSX_ARM64_SHA_NORUNTIME_NOEXTERNALS>/g, '${{needs.build.outputs.osx-arm64-sha-noruntime-noexternals}}')
releaseNote = releaseNote.replace(/<LINUX_X64_SHA_NORUNTIME_NOEXTERNALS>/g, '${{needs.build.outputs.linux-x64-sha-noruntime-noexternals}}')
releaseNote = releaseNote.replace(/<LINUX_ARM_SHA_NORUNTIME_NOEXTERNALS>/g, '${{needs.build.outputs.linux-arm-sha-noruntime-noexternals}}')
releaseNote = releaseNote.replace(/<LINUX_ARM64_SHA_NORUNTIME_NOEXTERNALS>/g, '${{needs.build.outputs.linux-arm64-sha-noruntime-noexternals}}')
console.log(releaseNote)
core.setOutput('version', runnerVersion);
core.setOutput('note', releaseNote);
- name: Validate Packages HASH
working-directory: _package
run: |
ls -l
echo "${{needs.build.outputs.win-x64-sha}} actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}.zip" | shasum -a 256 -c
@@ -309,7 +192,7 @@ jobs:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
-asset_path: ${{ github.workspace }}/_package/actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}.zip
+asset_path: ${{ github.workspace }}/actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}.zip
asset_name: actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}.zip
asset_content_type: application/octet-stream
@@ -319,7 +202,7 @@ jobs:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
-asset_path: ${{ github.workspace }}/_package/actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}.zip
+asset_path: ${{ github.workspace }}/actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}.zip
asset_name: actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}.zip
asset_content_type: application/octet-stream
@@ -329,7 +212,7 @@ jobs:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
-asset_path: ${{ github.workspace }}/_package/actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}.tar.gz
+asset_path: ${{ github.workspace }}/actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}.tar.gz
asset_name: actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}.tar.gz
asset_content_type: application/octet-stream
@@ -339,7 +222,7 @@ jobs:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
-asset_path: ${{ github.workspace }}/_package/actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}.tar.gz
+asset_path: ${{ github.workspace }}/actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}.tar.gz
asset_name: actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}.tar.gz
asset_content_type: application/octet-stream
@@ -349,7 +232,7 @@ jobs:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
-asset_path: ${{ github.workspace }}/_package/actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}.tar.gz
+asset_path: ${{ github.workspace }}/actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}.tar.gz
asset_name: actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}.tar.gz
asset_content_type: application/octet-stream
@@ -359,7 +242,7 @@ jobs:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
-asset_path: ${{ github.workspace }}/_package/actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}.tar.gz
+asset_path: ${{ github.workspace }}/actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}.tar.gz
asset_name: actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}.tar.gz
asset_content_type: application/octet-stream
@@ -369,298 +252,10 @@ jobs:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
-asset_path: ${{ github.workspace }}/_package/actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}.tar.gz
+asset_path: ${{ github.workspace }}/actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}.tar.gz
asset_name: actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}.tar.gz
asset_content_type: application/octet-stream
# Upload release assets (trim externals)
- name: Upload Release Asset (win-x64-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_externals/actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}-noexternals.zip
asset_name: actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}-noexternals.zip
asset_content_type: application/octet-stream
# Upload release assets (trim externals)
- name: Upload Release Asset (win-arm64-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_externals/actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}-noexternals.zip
asset_name: actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}-noexternals.zip
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-x64-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_externals/actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}-noexternals.tar.gz
asset_name: actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}-noexternals.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (osx-x64-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_externals/actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}-noexternals.tar.gz
asset_name: actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}-noexternals.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (osx-arm64-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_externals/actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}-noexternals.tar.gz
asset_name: actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}-noexternals.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-arm-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_externals/actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}-noexternals.tar.gz
asset_name: actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}-noexternals.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-arm64-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_externals/actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}-noexternals.tar.gz
asset_name: actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}-noexternals.tar.gz
asset_content_type: application/octet-stream
# Upload release assets (trim runtime)
- name: Upload Release Asset (win-x64-noruntime)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime/actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}-noruntime.zip
asset_name: actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}-noruntime.zip
asset_content_type: application/octet-stream
# Upload release assets (trim runtime)
- name: Upload Release Asset (win-arm64-noruntime)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime/actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}-noruntime.zip
asset_name: actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}-noruntime.zip
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-x64-noruntime)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime/actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}-noruntime.tar.gz
asset_name: actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}-noruntime.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (osx-x64-noruntime)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime/actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}-noruntime.tar.gz
asset_name: actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}-noruntime.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (osx-arm64-noruntime)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime/actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}-noruntime.tar.gz
asset_name: actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}-noruntime.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-arm-noruntime)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime/actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}-noruntime.tar.gz
asset_name: actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}-noruntime.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-arm64-noruntime)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime/actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}-noruntime.tar.gz
asset_name: actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}-noruntime.tar.gz
asset_content_type: application/octet-stream
# Upload release assets (trim runtime and externals)
- name: Upload Release Asset (win-x64-noruntime-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime_externals/actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.zip
asset_name: actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.zip
asset_content_type: application/octet-stream
# Upload release assets (trim runtime and externals)
- name: Upload Release Asset (win-arm64-noruntime-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime_externals/actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.zip
asset_name: actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.zip
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-x64-noruntime-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime_externals/actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.tar.gz
asset_name: actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (osx-x64-noruntime-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime_externals/actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.tar.gz
asset_name: actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (osx-arm64-noruntime-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime_externals/actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.tar.gz
asset_name: actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-arm-noruntime-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime_externals/actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.tar.gz
asset_name: actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-arm64-noruntime-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime_externals/actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.tar.gz
asset_name: actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.tar.gz
asset_content_type: application/octet-stream
# Upload release assets (trimmedpackages.json)
- name: Upload Release Asset (win-x64-trimmedpackages.json)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/win-x64-trimmedpackages.json
asset_name: actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}-trimmedpackages.json
asset_content_type: application/octet-stream
# Upload release assets (trimmedpackages.json)
- name: Upload Release Asset (win-arm64-trimmedpackages.json)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/win-arm64-trimmedpackages.json
asset_name: actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}-trimmedpackages.json
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-x64-trimmedpackages.json)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/linux-x64-trimmedpackages.json
asset_name: actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}-trimmedpackages.json
asset_content_type: application/octet-stream
- name: Upload Release Asset (osx-x64-trimmedpackages.json)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/osx-x64-trimmedpackages.json
asset_name: actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}-trimmedpackages.json
asset_content_type: application/octet-stream
- name: Upload Release Asset (osx-arm64-trimmedpackages.json)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/osx-arm64-trimmedpackages.json
asset_name: actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}-trimmedpackages.json
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-arm-trimmedpackages.json)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/linux-arm-trimmedpackages.json
asset_name: actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}-trimmedpackages.json
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-arm64-trimmedpackages.json)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/linux-arm64-trimmedpackages.json
asset_name: actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}-trimmedpackages.json
asset_content_type: application/octet-stream
publish-image:
needs: release
runs-on: ubuntu-latest


@@ -7,8 +7,10 @@ Make sure the runner has access to actions service for GitHub.com or GitHub Ente
 - For GitHub.com
   - The runner needs to access `https://api.github.com` for downloading actions.
+  - The runner needs to access `https://codeload.github.com` for downloading actions tar.gz/zip.
   - The runner needs to access `https://vstoken.actions.githubusercontent.com/_apis/.../` for requesting an access token.
   - The runner needs to access `https://pipelines.actions.githubusercontent.com/_apis/.../` for receiving workflow jobs.
+  - The runner needs to access `https://results-receiver.actions.githubusercontent.com/.../` for reporting progress and uploading logs during a workflow job execution.
 ---
 **NOTE:** for the full list of domains that are required to be in the firewall allow list refer to the [GitHub self-hosted runners requirements documentation](https://docs.github.com/en/actions/hosting-your-own-runners/managing-self-hosted-runners/about-self-hosted-runners#communication-between-self-hosted-runners-and-github).
@@ -16,12 +18,15 @@ Make sure the runner has access to actions service for GitHub.com or GitHub Ente
 ```
 curl -v https://api.github.com/zen
+curl -v https://codeload.github.com/_ping
 curl -v https://vstoken.actions.githubusercontent.com/_apis/health
 curl -v https://pipelines.actions.githubusercontent.com/_apis/health
+curl -v https://results-receiver.actions.githubusercontent.com/health
 ```
 - For GitHub Enterprise Server
   - The runner needs to access `https://[hostname]/api/v3` for downloading actions.
+  - The runner needs to access `https://codeload.[hostname]/_ping` for downloading actions tar.gz/zip.
   - The runner needs to access `https://[hostname]/_services/vstoken/_apis/.../` for requesting an access token.
   - The runner needs to access `https://[hostname]/_services/pipelines/_apis/.../` for receiving workflow jobs.
@@ -29,6 +34,7 @@ Make sure the runner has access to actions service for GitHub.com or GitHub Ente
 ```
 curl -v https://[hostname]/api/v3/zen
+curl -v https://codeload.[hostname]/_ping
 curl -v https://[hostname]/_services/vstoken/_apis/health
 curl -v https://[hostname]/_services/pipelines/_apis/health
 ```
@@ -44,6 +50,10 @@ Make sure the runner has access to actions service for GitHub.com or GitHub Ente
 - Ping api.github.com or myGHES.com using dotnet
 - Make HTTP GET to https://api.github.com or https://myGHES.com/api/v3 using dotnet, check response headers contains `X-GitHub-Request-Id`
 ---
+- DNS lookup for codeload.github.com or codeload.myGHES.com using dotnet
+- Ping codeload.github.com or codeload.myGHES.com using dotnet
+- Make HTTP GET to https://codeload.github.com/_ping or https://codeload.myGHES.com/_ping using dotnet, check response headers contains `X-GitHub-Request-Id`
+---
 - DNS lookup for vstoken.actions.githubusercontent.com using dotnet
 - Ping vstoken.actions.githubusercontent.com using dotnet
 - Make HTTP GET to https://vstoken.actions.githubusercontent.com/_apis/health or https://myGHES.com/_services/vstoken/_apis/health using dotnet, check response headers contains `x-vss-e2eid`
@@ -52,6 +62,10 @@ Make sure the runner has access to actions service for GitHub.com or GitHub Ente
 - Ping pipelines.actions.githubusercontent.com using dotnet
 - Make HTTP GET to https://pipelines.actions.githubusercontent.com/_apis/health or https://myGHES.com/_services/pipelines/_apis/health using dotnet, check response headers contains `x-vss-e2eid`
 - Make HTTP POST to https://pipelines.actions.githubusercontent.com/_apis/health or https://myGHES.com/_services/pipelines/_apis/health using dotnet, check response headers contains `x-vss-e2eid`
+---
+- DNS lookup for results-receiver.actions.githubusercontent.com using dotnet
+- Ping results-receiver.actions.githubusercontent.com using dotnet
+- Make HTTP GET to https://results-receiver.actions.githubusercontent.com/health using dotnet, check response headers contains `X-GitHub-Request-Id`
 ## How to fix the issue?
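
The curl probes and dotnet checks listed above are the same ones the runner's built-in connectivity check walks through (including the Results-Receiver endpoint added here); a minimal sketch of invoking it, with placeholder URL and token, assuming the standard `config.sh` flags:

```
# Run the network diagnostics without registering the runner.
# --url and --pat values are placeholders; see docs/checks/ in the repo for details.
./config.sh --check --url https://github.com/your-org/your-repo --pat ghp_yourToken
```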


@@ -42,6 +42,7 @@ If you are having trouble connecting, try these steps:
 - https://api.github.com/
 - https://vstoken.actions.githubusercontent.com/_apis/health
 - https://pipelines.actions.githubusercontent.com/_apis/health
+- https://results-receiver.actions.githubusercontent.com/health
 - For GHES/GHAE
 - https://myGHES.com/_services/vstoken/_apis/health
 - https://myGHES.com/_services/pipelines/_apis/health


@@ -5,9 +5,9 @@
 ## Supported Distributions and Versions
 x64
-- Red Hat Enterprise Linux 7
-- CentOS 7
-- Oracle Linux 7
+- Red Hat Enterprise Linux 7+
+- CentOS 7+
+- Oracle Linux 7+
 - Fedora 29+
 - Debian 9+
 - Ubuntu 16.04+


@@ -4,9 +4,9 @@ FROM mcr.microsoft.com/dotnet/runtime-deps:6.0-jammy as build
ARG TARGETOS ARG TARGETOS
ARG TARGETARCH ARG TARGETARCH
ARG RUNNER_VERSION ARG RUNNER_VERSION
ARG RUNNER_CONTAINER_HOOKS_VERSION=0.4.0 ARG RUNNER_CONTAINER_HOOKS_VERSION=0.5.1
ARG DOCKER_VERSION=24.0.6 ARG DOCKER_VERSION=25.0.3
ARG BUILDX_VERSION=0.11.2 ARG BUILDX_VERSION=0.12.1
RUN apt update -y && apt install curl unzip -y RUN apt update -y && apt install curl unzip -y
@@ -25,7 +25,7 @@ RUN export RUNNER_ARCH=${TARGETARCH} \
&& if [ "$RUNNER_ARCH" = "amd64" ]; then export DOCKER_ARCH=x86_64 ; fi \ && if [ "$RUNNER_ARCH" = "amd64" ]; then export DOCKER_ARCH=x86_64 ; fi \
&& if [ "$RUNNER_ARCH" = "arm64" ]; then export DOCKER_ARCH=aarch64 ; fi \ && if [ "$RUNNER_ARCH" = "arm64" ]; then export DOCKER_ARCH=aarch64 ; fi \
&& curl -fLo docker.tgz https://download.docker.com/${TARGETOS}/static/stable/${DOCKER_ARCH}/docker-${DOCKER_VERSION}.tgz \ && curl -fLo docker.tgz https://download.docker.com/${TARGETOS}/static/stable/${DOCKER_ARCH}/docker-${DOCKER_VERSION}.tgz \
&& tar zxvf docker.tgz \ && tar zxvf docker.tgz --strip 1 -C . docker/docker \
&& rm -rf docker.tgz \ && rm -rf docker.tgz \
&& mkdir -p /usr/local/lib/docker/cli-plugins \ && mkdir -p /usr/local/lib/docker/cli-plugins \
&& curl -fLo /usr/local/lib/docker/cli-plugins/docker-buildx \ && curl -fLo /usr/local/lib/docker/cli-plugins/docker-buildx \
@@ -37,6 +37,7 @@ FROM mcr.microsoft.com/dotnet/runtime-deps:6.0-jammy
ENV DEBIAN_FRONTEND=noninteractive
ENV RUNNER_MANUALLY_TRAP_SIG=1
ENV ACTIONS_RUNNER_PRINT_LOG_TO_STDOUT=1
+ENV ImageOS=ubuntu22
RUN apt-get update -y \
&& apt-get install -y --no-install-recommends \
@@ -54,7 +55,8 @@ RUN adduser --disabled-password --gecos "" --uid 1001 runner \
WORKDIR /home/runner
COPY --chown=runner:docker --from=build /actions-runner .
+COPY --from=build /usr/local/lib/docker/cli-plugins/docker-buildx /usr/local/lib/docker/cli-plugins/docker-buildx
-RUN install -o root -g root -m 755 docker/* /usr/bin/ && rm -rf docker
+RUN install -o root -g root -m 755 ./docker /usr/bin/ && rm -rf docker
USER runner

View File

@@ -1,23 +1,33 @@
## What's Changed
-* Trim whitespace in `./Misc/contentHash/dotnetRuntime/*` by @TingluoHuang in https://github.com/actions/runner/pull/2915
-* Send os and arch during long poll by @luketomlinson in https://github.com/actions/runner/pull/2913
-* Revert "Update default version to node20 (#2844)" by @takost in https://github.com/actions/runner/pull/2918
-* Fix telemetry publish from JobServerQueue. by @TingluoHuang in https://github.com/actions/runner/pull/2919
-* Use block blob instead of append blob by @yacaovsnc in https://github.com/actions/runner/pull/2924
-* Provide detail info on untar failures. by @TingluoHuang in https://github.com/actions/runner/pull/2939
-* Bump node.js to 20.8.1 by @TingluoHuang in https://github.com/actions/runner/pull/2945
-* Update dotnet sdk to latest version @6.0.415 by @github-actions in https://github.com/actions/runner/pull/2929
-* Fix typo in log strings by @rajbos in https://github.com/actions/runner/pull/2695
-* feat: add support of arm64 arch runners in service creation script by @tuxity in https://github.com/actions/runner/pull/2606
-* Add `buildx` to images by @ajschmidt8 in https://github.com/actions/runner/pull/2901
+* Bump docker and buildx to the latest version by @diogotorres97 in https://github.com/actions/runner/pull/3100
+* Revert "Bump container hook version to 0.5.0 in runner image (#3003)" by @TingluoHuang in https://github.com/actions/runner/pull/3101
+* Make embedded timeline record has same order as its parent record. by @TingluoHuang in https://github.com/actions/runner/pull/3113
+* Fix release workflow. by @TingluoHuang in https://github.com/actions/runner/pull/3102
+* Add user-agent to all http clients using RawClientHttpRequestSettings. by @TingluoHuang in https://github.com/actions/runner/pull/3115
+* Fix JobDispatcher crash during force cancellation. by @TingluoHuang in https://github.com/actions/runner/pull/3118
+* Implement Broker Redirects for Session and Messages by @luketomlinson in https://github.com/actions/runner/pull/3103
+* Only keep 1 older version runner around after self-upgrade. by @TingluoHuang in https://github.com/actions/runner/pull/3122
+* Upgrade `docker` from `24.0.7` to `24.0.8` by @MPV in https://github.com/actions/runner/pull/3124
+* Upload the diagnostic logs to the Results Service by @jtamsut in https://github.com/actions/runner/pull/3114
+* Upgrade `docker` from `24.0.8` to `24.0.9` by @MPV in https://github.com/actions/runner/pull/3126
+* Make sure to drain the upload queue before clean temp directory by @yacaovsnc in https://github.com/actions/runner/pull/3125
+* Specify `Content-Type` for BlockBlob upload by @bethanyj28 in https://github.com/actions/runner/pull/3119
+* Improve error report for invalid action.yml by @jsoref in https://github.com/actions/runner/pull/3106
+* Add sshd to .devcontainer.json by @pje in https://github.com/actions/runner/pull/3079
+* Resolve CVE-2024-21626 by @luka5 in https://github.com/actions/runner/pull/3123
+* Handle ForceTokenRefresh message by @luketomlinson in https://github.com/actions/runner/pull/3133
+* Bump hook version to 0.5.1 by @nikola-jokic in https://github.com/actions/runner/pull/3129
## New Contributors
-* @tuxity made their first contribution in https://github.com/actions/runner/pull/2606
+* @diogotorres97 made their first contribution in https://github.com/actions/runner/pull/3100
+* @MPV made their first contribution in https://github.com/actions/runner/pull/3124
+* @jtamsut made their first contribution in https://github.com/actions/runner/pull/3114
+* @luka5 made their first contribution in https://github.com/actions/runner/pull/3123
-**Full Changelog**: https://github.com/actions/runner/compare/v2.310.2...v2.311.0
+**Full Changelog**: https://github.com/actions/runner/compare/v2.312.0...v2.313.0
_Note: Actions Runner follows a progressive release policy, so the latest release might not be available to your enterprise, organization, or repository yet.
To confirm which version of the Actions Runner you should expect, please view the download instructions for your enterprise, organization, or repository.
See https://docs.github.com/en/enterprise-cloud@latest/actions/hosting-your-own-runners/adding-self-hosted-runners_
## Windows x64
@@ -119,27 +129,3 @@ The SHA-256 checksums for the packages included in this build are shown below:
- actions-runner-linux-x64-<RUNNER_VERSION>.tar.gz <!-- BEGIN SHA linux-x64 --><LINUX_X64_SHA><!-- END SHA linux-x64 -->
- actions-runner-linux-arm64-<RUNNER_VERSION>.tar.gz <!-- BEGIN SHA linux-arm64 --><LINUX_ARM64_SHA><!-- END SHA linux-arm64 -->
- actions-runner-linux-arm-<RUNNER_VERSION>.tar.gz <!-- BEGIN SHA linux-arm --><LINUX_ARM_SHA><!-- END SHA linux-arm -->
- actions-runner-win-x64-<RUNNER_VERSION>-noexternals.zip <!-- BEGIN SHA win-x64_noexternals --><WIN_X64_SHA_NOEXTERNALS><!-- END SHA win-x64_noexternals -->
- actions-runner-win-arm64-<RUNNER_VERSION>-noexternals.zip <!-- BEGIN SHA win-arm64_noexternals --><WIN_ARM64_SHA_NOEXTERNALS><!-- END SHA win-arm64_noexternals -->
- actions-runner-osx-x64-<RUNNER_VERSION>-noexternals.tar.gz <!-- BEGIN SHA osx-x64_noexternals --><OSX_X64_SHA_NOEXTERNALS><!-- END SHA osx-x64_noexternals -->
- actions-runner-osx-arm64-<RUNNER_VERSION>-noexternals.tar.gz <!-- BEGIN SHA osx-arm64_noexternals --><OSX_ARM64_SHA_NOEXTERNALS><!-- END SHA osx-arm64_noexternals -->
- actions-runner-linux-x64-<RUNNER_VERSION>-noexternals.tar.gz <!-- BEGIN SHA linux-x64_noexternals --><LINUX_X64_SHA_NOEXTERNALS><!-- END SHA linux-x64_noexternals -->
- actions-runner-linux-arm64-<RUNNER_VERSION>-noexternals.tar.gz <!-- BEGIN SHA linux-arm64_noexternals --><LINUX_ARM64_SHA_NOEXTERNALS><!-- END SHA linux-arm64_noexternals -->
- actions-runner-linux-arm-<RUNNER_VERSION>-noexternals.tar.gz <!-- BEGIN SHA linux-arm_noexternals --><LINUX_ARM_SHA_NOEXTERNALS><!-- END SHA linux-arm_noexternals -->
- actions-runner-win-x64-<RUNNER_VERSION>-noruntime.zip <!-- BEGIN SHA win-x64_noruntime --><WIN_X64_SHA_NORUNTIME><!-- END SHA win-x64_noruntime -->
- actions-runner-win-arm64-<RUNNER_VERSION>-noruntime.zip <!-- BEGIN SHA win-arm64_noruntime --><WIN_ARM64_SHA_NORUNTIME><!-- END SHA win-arm64_noruntime -->
- actions-runner-osx-x64-<RUNNER_VERSION>-noruntime.tar.gz <!-- BEGIN SHA osx-x64_noruntime --><OSX_X64_SHA_NORUNTIME><!-- END SHA osx-x64_noruntime -->
- actions-runner-osx-arm64-<RUNNER_VERSION>-noruntime.tar.gz <!-- BEGIN SHA osx-arm64_noruntime --><OSX_ARM64_SHA_NORUNTIME><!-- END SHA osx-arm64_noruntime -->
- actions-runner-linux-x64-<RUNNER_VERSION>-noruntime.tar.gz <!-- BEGIN SHA linux-x64_noruntime --><LINUX_X64_SHA_NORUNTIME><!-- END SHA linux-x64_noruntime -->
- actions-runner-linux-arm64-<RUNNER_VERSION>-noruntime.tar.gz <!-- BEGIN SHA linux-arm64_noruntime --><LINUX_ARM64_SHA_NORUNTIME><!-- END SHA linux-arm64_noruntime -->
- actions-runner-linux-arm-<RUNNER_VERSION>-noruntime.tar.gz <!-- BEGIN SHA linux-arm_noruntime --><LINUX_ARM_SHA_NORUNTIME><!-- END SHA linux-arm_noruntime -->
- actions-runner-win-x64-<RUNNER_VERSION>-noruntime-noexternals.zip <!-- BEGIN SHA win-x64_noruntime_noexternals --><WIN_X64_SHA_NORUNTIME_NOEXTERNALS><!-- END SHA win-x64_noruntime_noexternals -->
- actions-runner-win-arm64-<RUNNER_VERSION>-noruntime-noexternals.zip <!-- BEGIN SHA win-arm64_noruntime_noexternals --><WIN_ARM64_SHA_NORUNTIME_NOEXTERNALS><!-- END SHA win-arm64_noruntime_noexternals -->
- actions-runner-osx-x64-<RUNNER_VERSION>-noruntime-noexternals.tar.gz <!-- BEGIN SHA osx-x64_noruntime_noexternals --><OSX_X64_SHA_NORUNTIME_NOEXTERNALS><!-- END SHA osx-x64_noruntime_noexternals -->
- actions-runner-osx-arm64-<RUNNER_VERSION>-noruntime-noexternals.tar.gz <!-- BEGIN SHA osx-arm64_noruntime_noexternals --><OSX_ARM64_SHA_NORUNTIME_NOEXTERNALS><!-- END SHA osx-arm64_noruntime_noexternals -->
- actions-runner-linux-x64-<RUNNER_VERSION>-noruntime-noexternals.tar.gz <!-- BEGIN SHA linux-x64_noruntime_noexternals --><LINUX_X64_SHA_NORUNTIME_NOEXTERNALS><!-- END SHA linux-x64_noruntime_noexternals -->
- actions-runner-linux-arm64-<RUNNER_VERSION>-noruntime-noexternals.tar.gz <!-- BEGIN SHA linux-arm64_noruntime_noexternals --><LINUX_ARM64_SHA_NORUNTIME_NOEXTERNALS><!-- END SHA linux-arm64_noruntime_noexternals -->
- actions-runner-linux-arm-<RUNNER_VERSION>-noruntime-noexternals.tar.gz <!-- BEGIN SHA linux-arm_noruntime_noexternals --><LINUX_ARM_SHA_NORUNTIME_NOEXTERNALS><!-- END SHA linux-arm_noruntime_noexternals -->

View File

@@ -1 +1 @@
-531b31914e525ecb12cc5526415bc70a112ebc818f877347af1a231011f539c5
+54d95a44d118dba852395991224a6b9c1abe916858c87138656f80c619e85331

View File

@@ -1 +1 @@
-722dd5fa5ecc207fcccf67f6e502d689f2119d8117beff2041618fba17dc66a4
+68015af17f06a824fa478e62ae7393766ce627fd5599ab916432a14656a19a52

View File

@@ -1 +1 @@
-8ca75c76e15ab9dc7fe49a66c5c74e171e7fabd5d26546fda8931bd11bff30f9
+a2628119ca419cb54e279103ffae7986cdbd0814d57c73ff0dc74c38be08b9ae

View File

@@ -1 +1 @@
-70496eb1c99b39b3373b5088c95a35ebbaac1098e6c47c8aab94771f3ffbf501
+de71ca09ead807e1a2ce9df0a5b23eb7690cb71fff51169a77e4c3992be53dda

View File

@@ -1 +1 @@
-4f8d48727d535daabcaec814e0dafb271c10625366c78e7e022ca7477a73023f
+d009e05e6b26d614d65be736a15d1bd151932121c16a9ff1b986deadecc982b9

View File

@@ -1 +1 @@
-d54d7428f2b9200a0030365a6a4e174e30a1b29b922f8254dffb2924bd09549d
+f730db39c2305800b4653795360ba9c10c68f384a46b85d808f1f9f0ed3c42e4

View File

@@ -1 +1 @@
-eaa939c45307f46b7003902255b3a2a09287215d710984107667e03ac493eb26
+a35b5722375490e9473cdcccb5e18b41eba3dbf4344fe31abc9821e21f18ea5a

View File

@@ -63,17 +63,16 @@ function acquireExternalTool() {
echo "Curl version: $CURL_VERSION" echo "Curl version: $CURL_VERSION"
# curl -f Fail silently (no output at all) on HTTP errors (H) # curl -f Fail silently (no output at all) on HTTP errors (H)
# -k Allow connections to SSL sites without certs (H)
# -S Show error. With -s, make curl show errors when they occur # -S Show error. With -s, make curl show errors when they occur
# -L Follow redirects (H) # -L Follow redirects (H)
# -o FILE Write to FILE instead of stdout # -o FILE Write to FILE instead of stdout
# --retry 3 Retries transient errors 3 times (timeouts, 5xx) # --retry 3 Retries transient errors 3 times (timeouts, 5xx)
if [[ "$(printf '%s\n' "7.71.0" "$CURL_VERSION" | sort -V | head -n1)" != "7.71.0" ]]; then if [[ "$(printf '%s\n' "7.71.0" "$CURL_VERSION" | sort -V | head -n1)" != "7.71.0" ]]; then
# Curl version is less than or equal to 7.71.0, skipping retry-all-errors flag # Curl version is less than or equal to 7.71.0, skipping retry-all-errors flag
curl -fkSL --retry 3 -o "$partial_target" "$download_source" 2>"${download_target}_download.log" || checkRC 'curl' curl -fSL --retry 3 -o "$partial_target" "$download_source" 2>"${download_target}_download.log" || checkRC 'curl'
else else
# Curl version is greater than 7.71.0, running curl with --retry-all-errors flag # Curl version is greater than 7.71.0, running curl with --retry-all-errors flag
curl -fkSL --retry 3 --retry-all-errors -o "$partial_target" "$download_source" 2>"${download_target}_download.log" || checkRC 'curl' curl -fSL --retry 3 --retry-all-errors -o "$partial_target" "$download_source" 2>"${download_target}_download.log" || checkRC 'curl'
fi fi
# Move the partial file to the download target. # Move the partial file to the download target.

View File

@@ -1,57 +0,0 @@
actions.runner.plist.template
actions.runner.service.template
checkScripts/downloadCert.js
checkScripts/makeWebRequest.js
darwin.svc.sh.template
hashFiles/index.js
installdependencies.sh
macos-run-invoker.js
Microsoft.IdentityModel.Logging.dll
Microsoft.IdentityModel.Tokens.dll
Minimatch.dll
Newtonsoft.Json.Bson.dll
Newtonsoft.Json.dll
Runner.Common.deps.json
Runner.Common.dll
Runner.Common.pdb
Runner.Listener
Runner.Listener.deps.json
Runner.Listener.dll
Runner.Listener.exe
Runner.Listener.pdb
Runner.Listener.runtimeconfig.json
Runner.PluginHost
Runner.PluginHost.deps.json
Runner.PluginHost.dll
Runner.PluginHost.exe
Runner.PluginHost.pdb
Runner.PluginHost.runtimeconfig.json
Runner.Plugins.deps.json
Runner.Plugins.dll
Runner.Plugins.pdb
Runner.Sdk.deps.json
Runner.Sdk.dll
Runner.Sdk.pdb
Runner.Worker
Runner.Worker.deps.json
Runner.Worker.dll
Runner.Worker.exe
Runner.Worker.pdb
Runner.Worker.runtimeconfig.json
RunnerService.exe
RunnerService.exe.config
RunnerService.js
RunnerService.pdb
runsvc.sh
Sdk.deps.json
Sdk.dll
Sdk.pdb
System.IdentityModel.Tokens.Jwt.dll
System.Net.Http.Formatting.dll
System.Security.Cryptography.Pkcs.dll
System.Security.Cryptography.ProtectedData.dll
System.ServiceProcess.ServiceController.dll
systemd.svc.sh.template
update.cmd.template
update.sh.template
YamlDotNet.dll

View File

@@ -1,270 +0,0 @@
api-ms-win-core-console-l1-1-0.dll
api-ms-win-core-console-l1-2-0.dll
api-ms-win-core-datetime-l1-1-0.dll
api-ms-win-core-debug-l1-1-0.dll
api-ms-win-core-errorhandling-l1-1-0.dll
api-ms-win-core-fibers-l1-1-0.dll
api-ms-win-core-file-l1-1-0.dll
api-ms-win-core-file-l1-2-0.dll
api-ms-win-core-file-l2-1-0.dll
api-ms-win-core-handle-l1-1-0.dll
api-ms-win-core-heap-l1-1-0.dll
api-ms-win-core-interlocked-l1-1-0.dll
api-ms-win-core-libraryloader-l1-1-0.dll
api-ms-win-core-localization-l1-2-0.dll
api-ms-win-core-memory-l1-1-0.dll
api-ms-win-core-namedpipe-l1-1-0.dll
api-ms-win-core-processenvironment-l1-1-0.dll
api-ms-win-core-processthreads-l1-1-0.dll
api-ms-win-core-processthreads-l1-1-1.dll
api-ms-win-core-profile-l1-1-0.dll
api-ms-win-core-rtlsupport-l1-1-0.dll
api-ms-win-core-string-l1-1-0.dll
api-ms-win-core-synch-l1-1-0.dll
api-ms-win-core-synch-l1-2-0.dll
api-ms-win-core-sysinfo-l1-1-0.dll
api-ms-win-core-timezone-l1-1-0.dll
api-ms-win-core-util-l1-1-0.dll
api-ms-win-crt-conio-l1-1-0.dll
api-ms-win-crt-convert-l1-1-0.dll
api-ms-win-crt-environment-l1-1-0.dll
api-ms-win-crt-filesystem-l1-1-0.dll
api-ms-win-crt-heap-l1-1-0.dll
api-ms-win-crt-locale-l1-1-0.dll
api-ms-win-crt-math-l1-1-0.dll
api-ms-win-crt-multibyte-l1-1-0.dll
api-ms-win-crt-private-l1-1-0.dll
api-ms-win-crt-process-l1-1-0.dll
api-ms-win-crt-runtime-l1-1-0.dll
api-ms-win-crt-stdio-l1-1-0.dll
api-ms-win-crt-string-l1-1-0.dll
api-ms-win-crt-time-l1-1-0.dll
api-ms-win-crt-utility-l1-1-0.dll
clrcompression.dll
clretwrc.dll
clrjit.dll
coreclr.dll
createdump
createdump.exe
dbgshim.dll
hostfxr.dll
hostpolicy.dll
libclrjit.dylib
libclrjit.so
libcoreclr.dylib
libcoreclr.so
libcoreclrtraceptprovider.so
libdbgshim.dylib
libdbgshim.so
libhostfxr.dylib
libhostfxr.so
libhostpolicy.dylib
libhostpolicy.so
libmscordaccore.dylib
libmscordaccore.so
libmscordbi.dylib
libmscordbi.so
Microsoft.CSharp.dll
Microsoft.DiaSymReader.Native.amd64.dll
Microsoft.DiaSymReader.Native.arm64.dll
Microsoft.VisualBasic.Core.dll
Microsoft.VisualBasic.dll
Microsoft.Win32.Primitives.dll
Microsoft.Win32.Registry.dll
mscordaccore.dll
mscordaccore_amd64_amd64_6.0.522.21309.dll
mscordaccore_arm64_arm64_6.0.522.21309.dll
mscordaccore_amd64_amd64_6.0.1322.58009.dll
mscordaccore_amd64_amd64_6.0.2023.32017.dll
mscordaccore_amd64_amd64_6.0.2223.42425.dll
mscordaccore_amd64_amd64_6.0.2323.48002.dll
mscordbi.dll
mscorlib.dll
mscorrc.debug.dll
mscorrc.dll
msquic.dll
netstandard.dll
SOS_README.md
System.AppContext.dll
System.Buffers.dll
System.Collections.Concurrent.dll
System.Collections.dll
System.Collections.Immutable.dll
System.Collections.NonGeneric.dll
System.Collections.Specialized.dll
System.ComponentModel.Annotations.dll
System.ComponentModel.DataAnnotations.dll
System.ComponentModel.dll
System.ComponentModel.EventBasedAsync.dll
System.ComponentModel.Primitives.dll
System.ComponentModel.TypeConverter.dll
System.Configuration.dll
System.Console.dll
System.Core.dll
System.Data.Common.dll
System.Data.DataSetExtensions.dll
System.Data.dll
System.Diagnostics.Contracts.dll
System.Diagnostics.Debug.dll
System.Diagnostics.DiagnosticSource.dll
System.Diagnostics.FileVersionInfo.dll
System.Diagnostics.Process.dll
System.Diagnostics.StackTrace.dll
System.Diagnostics.TextWriterTraceListener.dll
System.Diagnostics.Tools.dll
System.Diagnostics.TraceSource.dll
System.Diagnostics.Tracing.dll
System.dll
System.Drawing.dll
System.Drawing.Primitives.dll
System.Dynamic.Runtime.dll
System.Formats.Asn1.dll
System.Globalization.Calendars.dll
System.Globalization.dll
System.Globalization.Extensions.dll
System.Globalization.Native.dylib
System.Globalization.Native.so
System.IO.Compression.Brotli.dll
System.IO.Compression.dll
System.IO.Compression.FileSystem.dll
System.IO.Compression.Native.a
System.IO.Compression.Native.dll
System.IO.Compression.Native.dylib
System.IO.Compression.Native.so
System.IO.Compression.ZipFile.dll
System.IO.dll
System.IO.FileSystem.AccessControl.dll
System.IO.FileSystem.dll
System.IO.FileSystem.DriveInfo.dll
System.IO.FileSystem.Primitives.dll
System.IO.FileSystem.Watcher.dll
System.IO.IsolatedStorage.dll
System.IO.MemoryMappedFiles.dll
System.IO.Pipes.AccessControl.dll
System.IO.Pipes.dll
System.IO.UnmanagedMemoryStream.dll
System.Linq.dll
System.Linq.Expressions.dll
System.Linq.Parallel.dll
System.Linq.Queryable.dll
System.Memory.dll
System.Native.a
System.Native.dylib
System.Native.so
System.Net.dll
System.Net.Http.dll
System.Net.Http.Json.dll
System.Net.Http.Native.a
System.Net.Http.Native.dylib
System.Net.Http.Native.so
System.Net.HttpListener.dll
System.Net.Mail.dll
System.Net.NameResolution.dll
System.Net.NetworkInformation.dll
System.Net.Ping.dll
System.Net.Primitives.dll
System.Net.Quic.dll
System.Net.Requests.dll
System.Net.Security.dll
System.Net.Security.Native.a
System.Net.Security.Native.dylib
System.Net.Security.Native.so
System.Net.ServicePoint.dll
System.Net.Sockets.dll
System.Net.WebClient.dll
System.Net.WebHeaderCollection.dll
System.Net.WebProxy.dll
System.Net.WebSockets.Client.dll
System.Net.WebSockets.dll
System.Numerics.dll
System.Numerics.Vectors.dll
System.ObjectModel.dll
System.Private.CoreLib.dll
System.Private.DataContractSerialization.dll
System.Private.Uri.dll
System.Private.Xml.dll
System.Private.Xml.Linq.dll
System.Reflection.DispatchProxy.dll
System.Reflection.dll
System.Reflection.Emit.dll
System.Reflection.Emit.ILGeneration.dll
System.Reflection.Emit.Lightweight.dll
System.Reflection.Extensions.dll
System.Reflection.Metadata.dll
System.Reflection.Primitives.dll
System.Reflection.TypeExtensions.dll
System.Resources.Reader.dll
System.Resources.ResourceManager.dll
System.Resources.Writer.dll
System.Runtime.CompilerServices.Unsafe.dll
System.Runtime.CompilerServices.VisualC.dll
System.Runtime.dll
System.Runtime.Extensions.dll
System.Runtime.Handles.dll
System.Runtime.InteropServices.dll
System.Runtime.InteropServices.RuntimeInformation.dll
System.Runtime.InteropServices.WindowsRuntime.dll
System.Runtime.Intrinsics.dll
System.Runtime.Loader.dll
System.Runtime.Numerics.dll
System.Runtime.Serialization.dll
System.Runtime.Serialization.Formatters.dll
System.Runtime.Serialization.Json.dll
System.Runtime.Serialization.Primitives.dll
System.Runtime.Serialization.Xml.dll
System.Runtime.WindowsRuntime.dll
System.Runtime.WindowsRuntime.UI.Xaml.dll
System.Security.AccessControl.dll
System.Security.Claims.dll
System.Security.Cryptography.Algorithms.dll
System.Security.Cryptography.Cng.dll
System.Security.Cryptography.Csp.dll
System.Security.Cryptography.Encoding.dll
System.Security.Cryptography.Native.Apple.a
System.Security.Cryptography.Native.Apple.dylib
System.Security.Cryptography.Native.OpenSsl.a
System.Security.Cryptography.Native.OpenSsl.dylib
System.Security.Cryptography.Native.OpenSsl.so
System.Security.Cryptography.OpenSsl.dll
System.Security.Cryptography.Primitives.dll
System.Security.Cryptography.X509Certificates.dll
System.Security.Cryptography.XCertificates.dll
System.Security.dll
System.Security.Principal.dll
System.Security.Principal.Windows.dll
System.Security.SecureString.dll
System.ServiceModel.Web.dll
System.ServiceProcess.dll
System.Text.Encoding.CodePages.dll
System.Text.Encoding.dll
System.Text.Encoding.Extensions.dll
System.Text.Encodings.Web.dll
System.Text.Json.dll
System.Text.RegularExpressions.dll
System.Threading.Channels.dll
System.Threading.dll
System.Threading.Overlapped.dll
System.Threading.Tasks.Dataflow.dll
System.Threading.Tasks.dll
System.Threading.Tasks.Extensions.dll
System.Threading.Tasks.Parallel.dll
System.Threading.Thread.dll
System.Threading.ThreadPool.dll
System.Threading.Timer.dll
System.Transactions.dll
System.Transactions.Local.dll
System.ValueTuple.dll
System.Web.dll
System.Web.HttpUtility.dll
System.Windows.dll
System.Xml.dll
System.Xml.Linq.dll
System.Xml.ReaderWriter.dll
System.Xml.Serialization.dll
System.Xml.XDocument.dll
System.Xml.XmlDocument.dll
System.Xml.XmlSerializer.dll
System.Xml.XPath.dll
System.Xml.XPath.XDocument.dll
ucrtbase.dll
WindowsBase.dll

View File

@@ -1,24 +0,0 @@
[
{
"HashValue": "<NO_RUNTIME_EXTERNALS_HASH>",
"DownloadUrl": "https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-<RUNNER_PLATFORM>-<RUNNER_VERSION>-noruntime-noexternals.tar.gz",
"TrimmedContents": {
"dotnetRuntime": "<RUNTIME_HASH>",
"externals": "<EXTERNALS_HASH>"
}
},
{
"HashValue": "<NO_RUNTIME_HASH>",
"DownloadUrl": "https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-<RUNNER_PLATFORM>-<RUNNER_VERSION>-noruntime.tar.gz",
"TrimmedContents": {
"dotnetRuntime": "<RUNTIME_HASH>"
}
},
{
"HashValue": "<NO_EXTERNALS_HASH>",
"DownloadUrl": "https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-<RUNNER_PLATFORM>-<RUNNER_VERSION>-noexternals.tar.gz",
"TrimmedContents": {
"externals": "<EXTERNALS_HASH>"
}
}
]

View File

@@ -1,24 +0,0 @@
[
{
"HashValue": "<NO_RUNTIME_EXTERNALS_HASH>",
"DownloadUrl": "https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-<RUNNER_PLATFORM>-<RUNNER_VERSION>-noruntime-noexternals.zip",
"TrimmedContents": {
"dotnetRuntime": "<RUNTIME_HASH>",
"externals": "<EXTERNALS_HASH>"
}
},
{
"HashValue": "<NO_RUNTIME_HASH>",
"DownloadUrl": "https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-<RUNNER_PLATFORM>-<RUNNER_VERSION>-noruntime.zip",
"TrimmedContents": {
"dotnetRuntime": "<RUNTIME_HASH>"
}
},
{
"HashValue": "<NO_EXTERNALS_HASH>",
"DownloadUrl": "https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-<RUNNER_PLATFORM>-<RUNNER_VERSION>-noexternals.zip",
"TrimmedContents": {
"externals": "<EXTERNALS_HASH>"
}
}
]

View File

@@ -17,7 +17,10 @@ namespace GitHub.Runner.Common
{
Task ConnectAsync(Uri serverUrl, VssCredentials credentials);
-Task<TaskAgentMessage> GetRunnerMessageAsync(CancellationToken token, TaskAgentStatus status, string version, string os, string architecture);
+Task<TaskAgentSession> CreateSessionAsync(TaskAgentSession session, CancellationToken cancellationToken);
+Task DeleteSessionAsync(CancellationToken cancellationToken);
+Task<TaskAgentMessage> GetRunnerMessageAsync(Guid? sessionId, TaskAgentStatus status, string version, string os, string architecture, bool disableUpdate, CancellationToken token);
}
public sealed class BrokerServer : RunnerService, IBrokerServer
@@ -44,13 +47,27 @@ namespace GitHub.Runner.Common
}
}
-public Task<TaskAgentMessage> GetRunnerMessageAsync(CancellationToken cancellationToken, TaskAgentStatus status, string version, string os, string architecture)
+public async Task<TaskAgentSession> CreateSessionAsync(TaskAgentSession session, CancellationToken cancellationToken)
{
CheckConnection();
-var jobMessage = RetryRequest<TaskAgentMessage>(
-async () => await _brokerHttpClient.GetRunnerMessageAsync(version, status, os, architecture, cancellationToken), cancellationToken);
+var jobMessage = await _brokerHttpClient.CreateSessionAsync(session, cancellationToken);
return jobMessage;
}
+public Task<TaskAgentMessage> GetRunnerMessageAsync(Guid? sessionId, TaskAgentStatus status, string version, string os, string architecture, bool disableUpdate, CancellationToken cancellationToken)
+{
+CheckConnection();
+var brokerSession = RetryRequest<TaskAgentMessage>(
+async () => await _brokerHttpClient.GetRunnerMessageAsync(sessionId, version, status, os, architecture, disableUpdate, cancellationToken), cancellationToken);
+return brokerSession;
+}
+public async Task DeleteSessionAsync(CancellationToken cancellationToken)
+{
+CheckConnection();
+await _brokerHttpClient.DeleteSessionAsync(cancellationToken);
+}
}
}
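Taken together, the members added to `IBrokerServer` above describe a session-oriented polling loop: connect, create a session, poll with the session id, and delete the session on shutdown. The snippet below is only an illustrative sketch of a caller against that interface, not code from this change; it borrows names that appear elsewhere in this compare (`BuildConstants.RunnerPackage.Version`, `VarUtil.OS`, `VarUtil.OSArchitecture`, `TaskAgentStatus.Online`) and assumes the caller already holds credentials and a `TaskAgentSession`. The actual flow lives in the `BrokerMessageListener` changes further down.

```csharp
// Hypothetical caller of the IBrokerServer surface declared above.
async Task RunListenerLoopAsync(IBrokerServer broker, Uri brokerUrl, VssCredentials credentials,
    TaskAgentSession session, CancellationToken token)
{
    await broker.ConnectAsync(brokerUrl, credentials);
    var activeSession = await broker.CreateSessionAsync(session, token);
    try
    {
        while (!token.IsCancellationRequested)
        {
            // disableUpdate: false lets the service offer runner updates alongside job messages.
            var message = await broker.GetRunnerMessageAsync(activeSession.SessionId, TaskAgentStatus.Online,
                BuildConstants.RunnerPackage.Version, VarUtil.OS, VarUtil.OSArchitecture,
                disableUpdate: false, token);
            if (message != null)
            {
                // Hand the message to the dispatcher / auto-update path here.
            }
        }
    }
    finally
    {
        // Best effort: tear the session down so the service can reassign work promptly.
        await broker.DeleteSessionAsync(CancellationToken.None);
    }
}
```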

View File

@@ -200,6 +200,10 @@ namespace GitHub.Runner.Common
{
_trace.Info($"No proxy settings were found based on environmental variables (http_proxy/https_proxy/HTTP_PROXY/HTTPS_PROXY)");
}
+else
+{
+_userAgents.Add(new ProductInfoHeaderValue("HttpProxyConfigured", bool.TrueString));
+}
if (StringUtil.ConvertToBoolean(Environment.GetEnvironmentVariable("GITHUB_ACTIONS_RUNNER_TLS_NO_VERIFY")))
{

View File

@@ -134,8 +134,8 @@ namespace GitHub.Runner.Common
{
liveConsoleFeedUrl = feedStreamUrl;
}
+jobRequest.Variables.TryGetValue("system.github.results_upload_with_sdk", out VariableValue resultsUseSdkVariable);
-_resultsServer.InitializeResultsClient(new Uri(resultsReceiverEndpoint), liveConsoleFeedUrl, accessToken);
+_resultsServer.InitializeResultsClient(new Uri(resultsReceiverEndpoint), liveConsoleFeedUrl, accessToken, StringUtil.ConvertToBoolean(resultsUseSdkVariable?.Value));
_resultsClientInitiated = true;
}
@@ -551,6 +551,10 @@ namespace GitHub.Runner.Common
{
await UploadSummaryFile(file);
}
+if (string.Equals(file.Type, CoreAttachmentType.ResultsDiagnosticLog, StringComparison.OrdinalIgnoreCase))
+{
+await UploadResultsDiagnosticLogsFile(file);
+}
else if (String.Equals(file.Type, CoreAttachmentType.ResultsLog, StringComparison.OrdinalIgnoreCase))
{
if (file.RecordId != _jobTimelineRecordId)
@@ -922,6 +926,17 @@ namespace GitHub.Runner.Common
await UploadResultsFile(file, summaryHandler);
}
+private async Task UploadResultsDiagnosticLogsFile(ResultsUploadFileInfo file)
+{
+Trace.Info($"Starting to upload diagnostic logs file to results service {file.Name}, {file.Path}");
+ResultsFileUploadHandler diagnosticLogsHandler = async (file) =>
+{
+await _resultsServer.CreateResultsDiagnosticLogsAsync(file.PlanId, file.JobId, file.Path, CancellationToken.None);
+};
+await UploadResultsFile(file, diagnosticLogsHandler);
+}
private async Task UploadResultsStepLogFile(ResultsUploadFileInfo file)
{
Trace.Info($"Starting upload of step log file to results service {file.Name}, {file.Path}");

View File

@@ -19,7 +19,7 @@ namespace GitHub.Runner.Common
[ServiceLocator(Default = typeof(ResultServer))]
public interface IResultsServer : IRunnerService, IAsyncDisposable
{
-void InitializeResultsClient(Uri uri, string liveConsoleFeedUrl, string token);
+void InitializeResultsClient(Uri uri, string liveConsoleFeedUrl, string token, bool useSdk);
Task<bool> AppendLiveConsoleFeedAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, Guid stepId, IList<string> lines, long? startLine, CancellationToken cancellationToken);
@@ -35,6 +35,8 @@ namespace GitHub.Runner.Common
Task UpdateResultsWorkflowStepsAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Task UpdateResultsWorkflowStepsAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId,
IEnumerable<TimelineRecord> records, CancellationToken cancellationToken); IEnumerable<TimelineRecord> records, CancellationToken cancellationToken);
Task CreateResultsDiagnosticLogsAsync(string planId, string jobId, string file, CancellationToken cancellationToken);
} }
public sealed class ResultServer : RunnerService, IResultsServer public sealed class ResultServer : RunnerService, IResultsServer
@@ -51,9 +53,9 @@ namespace GitHub.Runner.Common
private String _liveConsoleFeedUrl; private String _liveConsoleFeedUrl;
private string _token; private string _token;
public void InitializeResultsClient(Uri uri, string liveConsoleFeedUrl, string token) public void InitializeResultsClient(Uri uri, string liveConsoleFeedUrl, string token, bool useSdk)
{ {
this._resultsClient = CreateHttpClient(uri, token); this._resultsClient = CreateHttpClient(uri, token, useSdk);
_token = token; _token = token;
if (!string.IsNullOrEmpty(liveConsoleFeedUrl)) if (!string.IsNullOrEmpty(liveConsoleFeedUrl))
@@ -63,7 +65,7 @@ namespace GitHub.Runner.Common
} }
} }
public ResultsHttpClient CreateHttpClient(Uri uri, string token) public ResultsHttpClient CreateHttpClient(Uri uri, string token, bool useSdk)
{ {
// Using default 100 timeout // Using default 100 timeout
RawClientHttpRequestSettings settings = VssUtil.GetHttpRequestSettings(null); RawClientHttpRequestSettings settings = VssUtil.GetHttpRequestSettings(null);
@@ -80,7 +82,7 @@ namespace GitHub.Runner.Common
var pipeline = HttpClientFactory.CreatePipeline(httpMessageHandler, delegatingHandlers); var pipeline = HttpClientFactory.CreatePipeline(httpMessageHandler, delegatingHandlers);
return new ResultsHttpClient(uri, pipeline, token, disposeHandler: true); return new ResultsHttpClient(uri, pipeline, token, disposeHandler: true, useSdk: useSdk);
} }
public Task CreateResultsStepSummaryAsync(string planId, string jobId, Guid stepId, string file, public Task CreateResultsStepSummaryAsync(string planId, string jobId, Guid stepId, string file,
@@ -141,6 +143,18 @@ namespace GitHub.Runner.Common
throw new InvalidOperationException("Results client is not initialized."); throw new InvalidOperationException("Results client is not initialized.");
} }
public Task CreateResultsDiagnosticLogsAsync(string planId, string jobId, string file,
CancellationToken cancellationToken)
{
if (_resultsClient != null)
{
return _resultsClient.UploadResultsDiagnosticLogsAsync(planId, jobId, file,
cancellationToken: cancellationToken);
}
throw new InvalidOperationException("Results client is not initialized.");
}
public ValueTask DisposeAsync() public ValueTask DisposeAsync()
{ {
CloseWebSocket(WebSocketCloseStatus.NormalClosure, CancellationToken.None); CloseWebSocket(WebSocketCloseStatus.NormalClosure, CancellationToken.None);

View File

@@ -38,7 +38,7 @@ namespace GitHub.Runner.Common
Task<TaskAgentSession> CreateAgentSessionAsync(Int32 poolId, TaskAgentSession session, CancellationToken cancellationToken);
Task DeleteAgentMessageAsync(Int32 poolId, Int64 messageId, Guid sessionId, CancellationToken cancellationToken);
Task DeleteAgentSessionAsync(Int32 poolId, Guid sessionId, CancellationToken cancellationToken);
-Task<TaskAgentMessage> GetAgentMessageAsync(Int32 poolId, Guid sessionId, Int64? lastMessageId, TaskAgentStatus status, string runnerVersion, string os, string architecture, CancellationToken cancellationToken);
+Task<TaskAgentMessage> GetAgentMessageAsync(Int32 poolId, Guid sessionId, Int64? lastMessageId, TaskAgentStatus status, string runnerVersion, string os, string architecture, bool disableUpdate, CancellationToken cancellationToken);
// job request
Task<TaskAgentJobRequest> GetAgentRequestAsync(int poolId, long requestId, CancellationToken cancellationToken);
@@ -272,10 +272,10 @@ namespace GitHub.Runner.Common
return _messageTaskAgentClient.DeleteAgentSessionAsync(poolId, sessionId, cancellationToken: cancellationToken);
}
-public Task<TaskAgentMessage> GetAgentMessageAsync(Int32 poolId, Guid sessionId, Int64? lastMessageId, TaskAgentStatus status, string runnerVersion, string os, string architecture, CancellationToken cancellationToken)
+public Task<TaskAgentMessage> GetAgentMessageAsync(Int32 poolId, Guid sessionId, Int64? lastMessageId, TaskAgentStatus status, string runnerVersion, string os, string architecture, bool disableUpdate, CancellationToken cancellationToken)
{
CheckConnection(RunnerConnectionType.MessageQueue);
-return _messageTaskAgentClient.GetMessageAsync(poolId, sessionId, lastMessageId, status, runnerVersion, os, architecture, cancellationToken: cancellationToken);
+return _messageTaskAgentClient.GetMessageAsync(poolId, sessionId, lastMessageId, status, runnerVersion, os, architecture, disableUpdate, cancellationToken: cancellationToken);
}
//----------------------------------------------------------------- //-----------------------------------------------------------------

View File

@@ -24,7 +24,15 @@ namespace GitHub.Runner.Listener
private TimeSpan _getNextMessageRetryInterval; private TimeSpan _getNextMessageRetryInterval;
private TaskAgentStatus runnerStatus = TaskAgentStatus.Online; private TaskAgentStatus runnerStatus = TaskAgentStatus.Online;
private CancellationTokenSource _getMessagesTokenSource; private CancellationTokenSource _getMessagesTokenSource;
private VssCredentials _creds;
private TaskAgentSession _session;
private IBrokerServer _brokerServer; private IBrokerServer _brokerServer;
private readonly Dictionary<string, int> _sessionCreationExceptionTracker = new();
private bool _accessTokenRevoked = false;
private readonly TimeSpan _sessionCreationRetryInterval = TimeSpan.FromSeconds(30);
private readonly TimeSpan _sessionConflictRetryLimit = TimeSpan.FromMinutes(4);
private readonly TimeSpan _clockSkewRetryLimit = TimeSpan.FromMinutes(30);
public override void Initialize(IHostContext hostContext) public override void Initialize(IHostContext hostContext)
{ {
@@ -36,13 +44,134 @@ namespace GitHub.Runner.Listener
public async Task<Boolean> CreateSessionAsync(CancellationToken token) public async Task<Boolean> CreateSessionAsync(CancellationToken token)
{ {
await RefreshBrokerConnection(); Trace.Entering();
return await Task.FromResult(true);
// Settings
var configManager = HostContext.GetService<IConfigurationManager>();
_settings = configManager.LoadSettings();
var serverUrl = _settings.ServerUrlV2;
Trace.Info(_settings);
if (string.IsNullOrEmpty(_settings.ServerUrlV2))
{
throw new InvalidOperationException("ServerUrlV2 is not set");
}
// Create connection.
Trace.Info("Loading Credentials");
var credMgr = HostContext.GetService<ICredentialManager>();
_creds = credMgr.LoadCredentials();
var agent = new TaskAgentReference
{
Id = _settings.AgentId,
Name = _settings.AgentName,
Version = BuildConstants.RunnerPackage.Version,
OSDescription = RuntimeInformation.OSDescription,
};
string sessionName = $"{Environment.MachineName ?? "RUNNER"}";
var taskAgentSession = new TaskAgentSession(sessionName, agent);
string errorMessage = string.Empty;
bool encounteringError = false;
while (true)
{
token.ThrowIfCancellationRequested();
Trace.Info($"Attempt to create session.");
try
{
Trace.Info("Connecting to the Broker Server...");
await _brokerServer.ConnectAsync(new Uri(serverUrl), _creds);
Trace.Info("VssConnection created");
_term.WriteLine();
_term.WriteSuccessMessage("Connected to GitHub");
_term.WriteLine();
_session = await _brokerServer.CreateSessionAsync(taskAgentSession, token);
Trace.Info($"Session created.");
if (encounteringError)
{
_term.WriteLine($"{DateTime.UtcNow:u}: Runner reconnected.");
_sessionCreationExceptionTracker.Clear();
encounteringError = false;
}
return true;
}
catch (OperationCanceledException) when (token.IsCancellationRequested)
{
Trace.Info("Session creation has been cancelled.");
throw;
}
catch (TaskAgentAccessTokenExpiredException)
{
Trace.Info("Runner OAuth token has been revoked. Session creation failed.");
_accessTokenRevoked = true;
throw;
}
catch (Exception ex)
{
Trace.Error("Catch exception during create session.");
Trace.Error(ex);
if (ex is VssOAuthTokenRequestException vssOAuthEx && _creds.Federated is VssOAuthCredential vssOAuthCred)
{
// "invalid_client" means the runner registration has been deleted from the server.
if (string.Equals(vssOAuthEx.Error, "invalid_client", StringComparison.OrdinalIgnoreCase))
{
_term.WriteError("Failed to create a session. The runner registration has been deleted from the server, please re-configure. Runner registrations are automatically deleted for runners that have not connected to the service recently.");
return false;
}
// Check whether we get 401 because the runner registration already removed by the service.
// If the runner registration get deleted, we can't exchange oauth token.
Trace.Error("Test oauth app registration.");
var oauthTokenProvider = new VssOAuthTokenProvider(vssOAuthCred, new Uri(serverUrl));
var authError = await oauthTokenProvider.ValidateCredentialAsync(token);
if (string.Equals(authError, "invalid_client", StringComparison.OrdinalIgnoreCase))
{
_term.WriteError("Failed to create a session. The runner registration has been deleted from the server, please re-configure. Runner registrations are automatically deleted for runners that have not connected to the service recently.");
return false;
}
}
if (!IsSessionCreationExceptionRetriable(ex))
{
_term.WriteError($"Failed to create session. {ex.Message}");
return false;
}
if (!encounteringError) //print the message only on the first error
{
_term.WriteError($"{DateTime.UtcNow:u}: Runner connect error: {ex.Message}. Retrying until reconnected.");
encounteringError = true;
}
Trace.Info("Sleeping for {0} seconds before retrying.", _sessionCreationRetryInterval.TotalSeconds);
await HostContext.Delay(_sessionCreationRetryInterval, token);
}
}
} }
public async Task DeleteSessionAsync() public async Task DeleteSessionAsync()
{ {
await Task.CompletedTask; if (_session != null && _session.SessionId != Guid.Empty)
{
if (!_accessTokenRevoked)
{
using (var ts = new CancellationTokenSource(TimeSpan.FromSeconds(30)))
{
await _brokerServer.DeleteSessionAsync(ts.Token);
}
}
else
{
Trace.Warning("Runner OAuth token has been revoked. Skip deleting session.");
}
}
} }
public void OnJobStatus(object sender, JobStatusEventArgs e) public void OnJobStatus(object sender, JobStatusEventArgs e)
@@ -73,7 +202,13 @@ namespace GitHub.Runner.Listener
_getMessagesTokenSource = CancellationTokenSource.CreateLinkedTokenSource(token); _getMessagesTokenSource = CancellationTokenSource.CreateLinkedTokenSource(token);
try try
{ {
message = await _brokerServer.GetRunnerMessageAsync(_getMessagesTokenSource.Token, runnerStatus, BuildConstants.RunnerPackage.Version, VarUtil.OS, VarUtil.OSArchitecture); message = await _brokerServer.GetRunnerMessageAsync(_session.SessionId,
runnerStatus,
BuildConstants.RunnerPackage.Version,
VarUtil.OS,
VarUtil.OSArchitecture,
_settings.DisableUpdate,
_getMessagesTokenSource.Token);
if (message == null) if (message == null)
{ {
@@ -138,7 +273,7 @@ namespace GitHub.Runner.Listener
} }
// re-create VssConnection before next retry // re-create VssConnection before next retry
await RefreshBrokerConnection(); await RefreshBrokerConnectionAsync();
Trace.Info("Sleeping for {0} seconds before retrying.", _getNextMessageRetryInterval.TotalSeconds); Trace.Info("Sleeping for {0} seconds before retrying.", _getNextMessageRetryInterval.TotalSeconds);
await HostContext.Delay(_getNextMessageRetryInterval, token); await HostContext.Delay(_getNextMessageRetryInterval, token);
@@ -168,6 +303,11 @@ namespace GitHub.Runner.Listener
} }
} }
public async Task RefreshListenerTokenAsync(CancellationToken cancellationToken)
{
await RefreshBrokerConnectionAsync();
}
public async Task DeleteMessageAsync(TaskAgentMessage message) public async Task DeleteMessageAsync(TaskAgentMessage message)
{ {
await Task.CompletedTask; await Task.CompletedTask;
@@ -191,12 +331,84 @@ namespace GitHub.Runner.Listener
} }
} }
private async Task RefreshBrokerConnection() private bool IsSessionCreationExceptionRetriable(Exception ex)
{
if (ex is TaskAgentNotFoundException)
{
Trace.Info("The runner no longer exists on the server. Stopping the runner.");
_term.WriteError("The runner no longer exists on the server. Please reconfigure the runner.");
return false;
}
else if (ex is TaskAgentSessionConflictException)
{
Trace.Info("The session for this runner already exists.");
_term.WriteError("A session for this runner already exists.");
if (_sessionCreationExceptionTracker.ContainsKey(nameof(TaskAgentSessionConflictException)))
{
_sessionCreationExceptionTracker[nameof(TaskAgentSessionConflictException)]++;
if (_sessionCreationExceptionTracker[nameof(TaskAgentSessionConflictException)] * _sessionCreationRetryInterval.TotalSeconds >= _sessionConflictRetryLimit.TotalSeconds)
{
Trace.Info("The session conflict exception have reached retry limit.");
_term.WriteError($"Stop retry on SessionConflictException after retried for {_sessionConflictRetryLimit.TotalSeconds} seconds.");
return false;
}
}
else
{
_sessionCreationExceptionTracker[nameof(TaskAgentSessionConflictException)] = 1;
}
Trace.Info("The session conflict exception haven't reached retry limit.");
return true;
}
else if (ex is VssOAuthTokenRequestException && ex.Message.Contains("Current server time is"))
{
Trace.Info("Local clock might be skewed.");
_term.WriteError("The local machine's clock may be out of sync with the server time by more than five minutes. Please sync your clock with your domain or internet time and try again.");
if (_sessionCreationExceptionTracker.ContainsKey(nameof(VssOAuthTokenRequestException)))
{
_sessionCreationExceptionTracker[nameof(VssOAuthTokenRequestException)]++;
if (_sessionCreationExceptionTracker[nameof(VssOAuthTokenRequestException)] * _sessionCreationRetryInterval.TotalSeconds >= _clockSkewRetryLimit.TotalSeconds)
{
Trace.Info("The OAuth token request exception have reached retry limit.");
_term.WriteError($"Stopped retrying OAuth token request exception after {_clockSkewRetryLimit.TotalSeconds} seconds.");
return false;
}
}
else
{
_sessionCreationExceptionTracker[nameof(VssOAuthTokenRequestException)] = 1;
}
Trace.Info("The OAuth token request exception haven't reached retry limit.");
return true;
}
else if (ex is TaskAgentPoolNotFoundException ||
ex is AccessDeniedException ||
ex is VssUnauthorizedException)
{
Trace.Info($"Non-retriable exception: {ex.Message}");
return false;
}
else if (ex is InvalidOperationException)
{
Trace.Info($"Non-retriable exception: {ex.Message}");
return false;
}
else
{
Trace.Info($"Retriable exception: {ex.Message}");
return true;
}
}
private async Task RefreshBrokerConnectionAsync()
{ {
var configManager = HostContext.GetService<IConfigurationManager>(); var configManager = HostContext.GetService<IConfigurationManager>();
_settings = configManager.LoadSettings(); _settings = configManager.LoadSettings();
if (_settings.ServerUrlV2 == null) if (string.IsNullOrEmpty(_settings.ServerUrlV2))
{ {
throw new InvalidOperationException("ServerUrlV2 is not set"); throw new InvalidOperationException("ServerUrlV2 is not set");
} }

View File

@@ -39,6 +39,7 @@ namespace GitHub.Runner.Listener.Check
string githubApiUrl = null; string githubApiUrl = null;
string actionsTokenServiceUrl = null; string actionsTokenServiceUrl = null;
string actionsPipelinesServiceUrl = null; string actionsPipelinesServiceUrl = null;
string resultsReceiverServiceUrl = null;
var urlBuilder = new UriBuilder(url); var urlBuilder = new UriBuilder(url);
if (UrlUtil.IsHostedServer(urlBuilder)) if (UrlUtil.IsHostedServer(urlBuilder))
{ {
@@ -47,6 +48,7 @@ namespace GitHub.Runner.Listener.Check
githubApiUrl = urlBuilder.Uri.AbsoluteUri; githubApiUrl = urlBuilder.Uri.AbsoluteUri;
actionsTokenServiceUrl = "https://vstoken.actions.githubusercontent.com/_apis/health"; actionsTokenServiceUrl = "https://vstoken.actions.githubusercontent.com/_apis/health";
actionsPipelinesServiceUrl = "https://pipelines.actions.githubusercontent.com/_apis/health"; actionsPipelinesServiceUrl = "https://pipelines.actions.githubusercontent.com/_apis/health";
resultsReceiverServiceUrl = "https://results-receiver.actions.githubusercontent.com/health";
} }
else else
{ {
@@ -56,13 +58,31 @@ namespace GitHub.Runner.Listener.Check
actionsTokenServiceUrl = urlBuilder.Uri.AbsoluteUri; actionsTokenServiceUrl = urlBuilder.Uri.AbsoluteUri;
urlBuilder.Path = "_services/pipelines/_apis/health"; urlBuilder.Path = "_services/pipelines/_apis/health";
actionsPipelinesServiceUrl = urlBuilder.Uri.AbsoluteUri; actionsPipelinesServiceUrl = urlBuilder.Uri.AbsoluteUri;
resultsReceiverServiceUrl = string.Empty; // we don't have Results service in GHES yet.
} }
var codeLoadUrlBuilder = new UriBuilder(url);
codeLoadUrlBuilder.Host = $"codeload.{codeLoadUrlBuilder.Host}";
codeLoadUrlBuilder.Path = "_ping";
// check github api // check github api
checkTasks.Add(CheckUtil.CheckDns(githubApiUrl)); checkTasks.Add(CheckUtil.CheckDns(githubApiUrl));
checkTasks.Add(CheckUtil.CheckPing(githubApiUrl)); checkTasks.Add(CheckUtil.CheckPing(githubApiUrl));
checkTasks.Add(HostContext.CheckHttpsGetRequests(githubApiUrl, pat, expectedHeader: "X-GitHub-Request-Id")); checkTasks.Add(HostContext.CheckHttpsGetRequests(githubApiUrl, pat, expectedHeader: "X-GitHub-Request-Id"));
// check github codeload
checkTasks.Add(CheckUtil.CheckDns(codeLoadUrlBuilder.Uri.AbsoluteUri));
checkTasks.Add(CheckUtil.CheckPing(codeLoadUrlBuilder.Uri.AbsoluteUri));
checkTasks.Add(HostContext.CheckHttpsGetRequests(codeLoadUrlBuilder.Uri.AbsoluteUri, pat, expectedHeader: "X-GitHub-Request-Id"));
// check results-receiver service
if (!string.IsNullOrEmpty(resultsReceiverServiceUrl))
{
checkTasks.Add(CheckUtil.CheckDns(resultsReceiverServiceUrl));
checkTasks.Add(CheckUtil.CheckPing(resultsReceiverServiceUrl));
checkTasks.Add(HostContext.CheckHttpsGetRequests(resultsReceiverServiceUrl, pat, expectedHeader: "X-GitHub-Request-Id"));
}
// check actions token service // check actions token service
checkTasks.Add(CheckUtil.CheckDns(actionsTokenServiceUrl)); checkTasks.Add(CheckUtil.CheckDns(actionsTokenServiceUrl));
checkTasks.Add(CheckUtil.CheckPing(actionsTokenServiceUrl)); checkTasks.Add(CheckUtil.CheckPing(actionsTokenServiceUrl));

View File

@@ -629,6 +629,20 @@ namespace GitHub.Runner.Listener
Trace.Info("worker process has been killed."); Trace.Info("worker process has been killed.");
} }
} }
catch (Exception ex)
{
// message send failed, this might indicate worker process is already exited or stuck.
Trace.Info($"Job cancel message sending for job {message.JobId} failed, kill running worker. {ex}");
workerProcessCancelTokenSource.Cancel();
try
{
await workerProcessTask;
}
catch (OperationCanceledException)
{
Trace.Info("worker process has been killed.");
}
}
// wait worker to exit // wait worker to exit
// if worker doesn't exit within timeout, then kill worker. // if worker doesn't exit within timeout, then kill worker.
@@ -1134,6 +1148,15 @@ namespace GitHub.Runner.Listener
jobRecord.ErrorCount++; jobRecord.ErrorCount++;
jobRecord.Issues.Add(unhandledExceptionIssue); jobRecord.Issues.Add(unhandledExceptionIssue);
if (message.Variables.TryGetValue("DistributedTask.MarkJobAsFailedOnWorkerCrash", out var markJobAsFailedOnWorkerCrash) &&
StringUtil.ConvertToBoolean(markJobAsFailedOnWorkerCrash?.Value))
{
Trace.Info("Mark the job as failed since the worker crashed");
jobRecord.Result = TaskResult.Failed;
// mark the job as completed so service will pickup the result
jobRecord.State = TimelineRecordState.Completed;
}
await jobServer.UpdateTimelineRecordsAsync(message.Plan.ScopeIdentifier, message.Plan.PlanType, message.Plan.PlanId, message.Timeline.Id, new TimelineRecord[] { jobRecord }, CancellationToken.None); await jobServer.UpdateTimelineRecordsAsync(message.Plan.ScopeIdentifier, message.Plan.PlanType, message.Plan.PlanId, message.Timeline.Id, new TimelineRecord[] { jobRecord }, CancellationToken.None);
} }
catch (Exception ex) catch (Exception ex)

View File

@@ -14,6 +14,7 @@ using GitHub.Runner.Listener.Configuration;
using GitHub.Runner.Sdk; using GitHub.Runner.Sdk;
using GitHub.Services.Common; using GitHub.Services.Common;
using GitHub.Services.OAuth; using GitHub.Services.OAuth;
using GitHub.Services.WebApi;
namespace GitHub.Runner.Listener namespace GitHub.Runner.Listener
{ {
@@ -24,6 +25,8 @@ namespace GitHub.Runner.Listener
Task DeleteSessionAsync(); Task DeleteSessionAsync();
Task<TaskAgentMessage> GetNextMessageAsync(CancellationToken token); Task<TaskAgentMessage> GetNextMessageAsync(CancellationToken token);
Task DeleteMessageAsync(TaskAgentMessage message); Task DeleteMessageAsync(TaskAgentMessage message);
Task RefreshListenerTokenAsync(CancellationToken token);
void OnJobStatus(object sender, JobStatusEventArgs e); void OnJobStatus(object sender, JobStatusEventArgs e);
} }
@@ -33,6 +36,7 @@ namespace GitHub.Runner.Listener
private RunnerSettings _settings; private RunnerSettings _settings;
private ITerminal _term; private ITerminal _term;
private IRunnerServer _runnerServer; private IRunnerServer _runnerServer;
private IBrokerServer _brokerServer;
private TaskAgentSession _session; private TaskAgentSession _session;
private TimeSpan _getNextMessageRetryInterval; private TimeSpan _getNextMessageRetryInterval;
private bool _accessTokenRevoked = false; private bool _accessTokenRevoked = false;
@@ -42,6 +46,9 @@ namespace GitHub.Runner.Listener
private readonly Dictionary<string, int> _sessionCreationExceptionTracker = new(); private readonly Dictionary<string, int> _sessionCreationExceptionTracker = new();
private TaskAgentStatus runnerStatus = TaskAgentStatus.Online; private TaskAgentStatus runnerStatus = TaskAgentStatus.Online;
private CancellationTokenSource _getMessagesTokenSource; private CancellationTokenSource _getMessagesTokenSource;
private VssCredentials _creds;
private bool _isBrokerSession = false;
public override void Initialize(IHostContext hostContext) public override void Initialize(IHostContext hostContext)
{ {
@@ -49,6 +56,7 @@ namespace GitHub.Runner.Listener
_term = HostContext.GetService<ITerminal>(); _term = HostContext.GetService<ITerminal>();
_runnerServer = HostContext.GetService<IRunnerServer>(); _runnerServer = HostContext.GetService<IRunnerServer>();
_brokerServer = hostContext.GetService<IBrokerServer>();
} }
public async Task<Boolean> CreateSessionAsync(CancellationToken token) public async Task<Boolean> CreateSessionAsync(CancellationToken token)
@@ -64,7 +72,7 @@ namespace GitHub.Runner.Listener
// Create connection. // Create connection.
Trace.Info("Loading Credentials"); Trace.Info("Loading Credentials");
var credMgr = HostContext.GetService<ICredentialManager>(); var credMgr = HostContext.GetService<ICredentialManager>();
VssCredentials creds = credMgr.LoadCredentials(); _creds = credMgr.LoadCredentials();
var agent = new TaskAgentReference var agent = new TaskAgentReference
{ {
@@ -86,7 +94,7 @@ namespace GitHub.Runner.Listener
try try
{ {
Trace.Info("Connecting to the Runner Server..."); Trace.Info("Connecting to the Runner Server...");
await _runnerServer.ConnectAsync(new Uri(serverUrl), creds); await _runnerServer.ConnectAsync(new Uri(serverUrl), _creds);
Trace.Info("VssConnection created"); Trace.Info("VssConnection created");
_term.WriteLine(); _term.WriteLine();
@@ -98,6 +106,14 @@ namespace GitHub.Runner.Listener
taskAgentSession, taskAgentSession,
token); token);
if (_session.BrokerMigrationMessage != null)
{
Trace.Info("Runner session is in migration mode: Creating Broker session with BrokerBaseUrl: {0}", _session.BrokerMigrationMessage.BrokerBaseUrl);
await _brokerServer.ConnectAsync(_session.BrokerMigrationMessage.BrokerBaseUrl, _creds);
_session = await _brokerServer.CreateSessionAsync(taskAgentSession, token);
_isBrokerSession = true;
}
Trace.Info($"Session created."); Trace.Info($"Session created.");
if (encounteringError) if (encounteringError)
{ {
@@ -124,7 +140,7 @@ namespace GitHub.Runner.Listener
Trace.Error("Catch exception during create session."); Trace.Error("Catch exception during create session.");
Trace.Error(ex); Trace.Error(ex);
if (ex is VssOAuthTokenRequestException vssOAuthEx && creds.Federated is VssOAuthCredential vssOAuthCred) if (ex is VssOAuthTokenRequestException vssOAuthEx && _creds.Federated is VssOAuthCredential vssOAuthCred)
{ {
// "invalid_client" means the runner registration has been deleted from the server. // "invalid_client" means the runner registration has been deleted from the server.
if (string.Equals(vssOAuthEx.Error, "invalid_client", StringComparison.OrdinalIgnoreCase)) if (string.Equals(vssOAuthEx.Error, "invalid_client", StringComparison.OrdinalIgnoreCase))
@@ -171,6 +187,11 @@ namespace GitHub.Runner.Listener
{ {
using (var ts = new CancellationTokenSource(TimeSpan.FromSeconds(30))) using (var ts = new CancellationTokenSource(TimeSpan.FromSeconds(30)))
{ {
if (_isBrokerSession)
{
await _brokerServer.DeleteSessionAsync(ts.Token);
return;
}
await _runnerServer.DeleteAgentSessionAsync(_settings.PoolId, _session.SessionId, ts.Token); await _runnerServer.DeleteAgentSessionAsync(_settings.PoolId, _session.SessionId, ts.Token);
} }
} }
@@ -222,11 +243,29 @@ namespace GitHub.Runner.Listener
BuildConstants.RunnerPackage.Version, BuildConstants.RunnerPackage.Version,
VarUtil.OS, VarUtil.OS,
VarUtil.OSArchitecture, VarUtil.OSArchitecture,
_settings.DisableUpdate,
_getMessagesTokenSource.Token); _getMessagesTokenSource.Token);
// Decrypt the message body if the session is using encryption // Decrypt the message body if the session is using encryption
message = DecryptMessage(message); message = DecryptMessage(message);
if (message != null && message.MessageType == BrokerMigrationMessage.MessageType)
{
Trace.Info("BrokerMigration message received. Polling Broker for messages...");
var migrationMessage = JsonUtility.FromString<BrokerMigrationMessage>(message.Body);
await _brokerServer.ConnectAsync(migrationMessage.BrokerBaseUrl, _creds);
message = await _brokerServer.GetRunnerMessageAsync(_session.SessionId,
runnerStatus,
BuildConstants.RunnerPackage.Version,
VarUtil.OS,
VarUtil.OSArchitecture,
_settings.DisableUpdate,
token);
}
if (message != null)
{
_lastMessageId = message.MessageId;
@@ -342,6 +381,11 @@ namespace GitHub.Runner.Listener
}
}
public async Task RefreshListenerTokenAsync(CancellationToken cancellationToken)
{
await _runnerServer.RefreshConnectionAsync(RunnerConnectionType.MessageQueue, TimeSpan.FromSeconds(60));
}
private TaskAgentMessage DecryptMessage(TaskAgentMessage message)
{
if (_session.EncryptionKey == null ||

View File

@@ -25,12 +25,6 @@
<PackageReference Include="System.ServiceProcess.ServiceController" Version="4.4.0" /> <PackageReference Include="System.ServiceProcess.ServiceController" Version="4.4.0" />
</ItemGroup> </ItemGroup>
<ItemGroup>
<EmbeddedResource Include="..\Misc\runnercoreassets">
<LogicalName>GitHub.Runner.Listener.runnercoreassets</LogicalName>
</EmbeddedResource>
</ItemGroup>
<PropertyGroup Condition=" '$(Configuration)' == 'Debug' "> <PropertyGroup Condition=" '$(Configuration)' == 'Debug' ">
<DebugType>portable</DebugType> <DebugType>portable</DebugType>
</PropertyGroup> </PropertyGroup>

View File

@@ -457,22 +457,13 @@ namespace GitHub.Runner.Listener
message = await getNextMessage; //get next message
HostContext.WritePerfCounter($"MessageReceived_{message.MessageType}");
if (string.Equals(message.MessageType, AgentRefreshMessage.MessageType, StringComparison.OrdinalIgnoreCase) ||
string.Equals(message.MessageType, RunnerRefreshMessage.MessageType, StringComparison.OrdinalIgnoreCase))
if (string.Equals(message.MessageType, AgentRefreshMessage.MessageType, StringComparison.OrdinalIgnoreCase))
{
if (autoUpdateInProgress == false)
{
autoUpdateInProgress = true;
AgentRefreshMessage runnerUpdateMessage = null;
AgentRefreshMessage runnerUpdateMessage = JsonUtility.FromString<AgentRefreshMessage>(message.Body);
if (string.Equals(message.MessageType, AgentRefreshMessage.MessageType, StringComparison.OrdinalIgnoreCase))
{
runnerUpdateMessage = JsonUtility.FromString<AgentRefreshMessage>(message.Body);
}
else
{
var brokerRunnerUpdateMessage = JsonUtility.FromString<RunnerRefreshMessage>(message.Body);
runnerUpdateMessage = new AgentRefreshMessage(brokerRunnerUpdateMessage.RunnerId, brokerRunnerUpdateMessage.TargetVersion, TimeSpan.FromSeconds(brokerRunnerUpdateMessage.TimeoutInSeconds));
}
#if DEBUG
// Can mock the update for testing
if (StringUtil.ConvertToBoolean(Environment.GetEnvironmentVariable("GITHUB_ACTIONS_RUNNER_IS_MOCK_UPDATE")))
@@ -503,6 +494,22 @@ namespace GitHub.Runner.Listener
Trace.Info("Refresh message received, skip autoupdate since a previous autoupdate is already running."); Trace.Info("Refresh message received, skip autoupdate since a previous autoupdate is already running.");
} }
} }
else if (string.Equals(message.MessageType, RunnerRefreshMessage.MessageType, StringComparison.OrdinalIgnoreCase))
{
if (autoUpdateInProgress == false)
{
autoUpdateInProgress = true;
RunnerRefreshMessage brokerRunnerUpdateMessage = JsonUtility.FromString<RunnerRefreshMessage>(message.Body);
var selfUpdater = HostContext.GetService<ISelfUpdaterV2>();
selfUpdateTask = selfUpdater.SelfUpdate(brokerRunnerUpdateMessage, jobDispatcher, false, HostContext.RunnerShutdownToken);
Trace.Info("Refresh message received, kick-off selfupdate background process.");
}
else
{
Trace.Info("Refresh message received, skip autoupdate since a previous autoupdate is already running.");
}
}
else if (string.Equals(message.MessageType, JobRequestMessageTypes.PipelineAgentJobRequest, StringComparison.OrdinalIgnoreCase))
{
if (autoUpdateInProgress || runOnceJobReceived)
@@ -589,6 +596,10 @@ namespace GitHub.Runner.Listener
Trace.Info($"Service requests the hosted runner to shutdown. Reason: '{HostedRunnerShutdownMessage.Reason}'."); Trace.Info($"Service requests the hosted runner to shutdown. Reason: '{HostedRunnerShutdownMessage.Reason}'.");
return Constants.Runner.ReturnCode.Success; return Constants.Runner.ReturnCode.Success;
} }
else if (string.Equals(message.MessageType, TaskAgentMessageTypes.ForceTokenRefresh))
{
await _listener.RefreshListenerTokenAsync(messageQueueLoopTokenSource.Token);
}
else
{
Trace.Error($"Received message {message.MessageId} with unsupported message type {message.MessageType}.");
@@ -627,6 +638,7 @@ namespace GitHub.Runner.Listener
{
try
{
Trace.Info("Deleting Runner Session...");
await _listener.DeleteSessionAsync();
}
catch (Exception ex) when (runOnce)

View File

@@ -6,13 +6,11 @@ using System.IO;
using System.IO.Compression;
using System.Linq;
using System.Net.Http;
using System.Reflection;
using System.Security.Cryptography;
using System.Threading;
using System.Threading.Tasks;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using GitHub.Services.Common;
using GitHub.Services.WebApi;
@@ -30,20 +28,14 @@ namespace GitHub.Runner.Listener
{
private static string _packageType = "agent";
private static string _platform = BuildConstants.RunnerPackage.PackageName;
private static string _dotnetRuntime = "dotnetRuntime";
private static string _externals = "externals";
private readonly Dictionary<string, string> _contentHashes = new();
private PackageMetadata _targetPackage;
private ITerminal _terminal;
private IRunnerServer _runnerServer;
private int _poolId;
private ulong _agentId;
private const int _numberOfOldVersionsToKeep = 1;
private readonly ConcurrentQueue<string> _updateTrace = new();
private Task _cloneAndCalculateContentHashTask;
private string _dotnetRuntimeCloneDirectory;
private string _externalsCloneDirectory;
public bool Busy { get; private set; }
public override void Initialize(IHostContext hostContext)
@@ -56,8 +48,6 @@ namespace GitHub.Runner.Listener
var settings = configStore.GetSettings();
_poolId = settings.PoolId;
_agentId = settings.AgentId;
_dotnetRuntimeCloneDirectory = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Work), "__dotnet_runtime__");
_externalsCloneDirectory = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Work), "__externals__");
}
public async Task<bool> SelfUpdate(AgentRefreshMessage updateMessage, IJobDispatcher jobDispatcher, bool restartInteractiveRunner, CancellationToken token)
@@ -67,13 +57,6 @@ namespace GitHub.Runner.Listener
{
var totalUpdateTime = Stopwatch.StartNew();
// Copy dotnet runtime and externals of current runner to a temp folder
// So we can re-use them with trimmed runner package, if possible.
// This process is best effort, if we can't use trimmed runner package,
// we will just go with the full package.
var linkedTokenSource = CancellationTokenSource.CreateLinkedTokenSource(token);
_cloneAndCalculateContentHashTask = CloneAndCalculateAssetsHash(_dotnetRuntimeCloneDirectory, _externalsCloneDirectory, linkedTokenSource.Token);
if (!await UpdateNeeded(updateMessage.TargetVersion, token))
{
Trace.Info($"Can't find available update package.");
@@ -87,24 +70,6 @@ namespace GitHub.Runner.Listener
await UpdateRunnerUpdateStateAsync("Runner update in progress, do not shutdown runner."); await UpdateRunnerUpdateStateAsync("Runner update in progress, do not shutdown runner.");
await UpdateRunnerUpdateStateAsync($"Downloading {_targetPackage.Version} runner"); await UpdateRunnerUpdateStateAsync($"Downloading {_targetPackage.Version} runner");
if (_targetPackage.TrimmedPackages?.Count > 0)
{
// wait for cloning assets task to finish only if we have trimmed packages
await _cloneAndCalculateContentHashTask;
}
else
{
linkedTokenSource.Cancel();
try
{
await _cloneAndCalculateContentHashTask;
}
catch (Exception ex)
{
Trace.Info($"Ingore errors after cancelling cloning assets task: {ex}");
}
}
await DownloadLatestRunner(token, updateMessage.TargetVersion);
Trace.Info($"Download latest runner and unzip into runner root.");
@@ -218,54 +183,8 @@ namespace GitHub.Runner.Listener
string archiveFile = null;
var packageDownloadUrl = _targetPackage.DownloadUrl;
var packageHashValue = _targetPackage.HashValue;
var runtimeTrimmed = false;
var externalsTrimmed = false;
var fallbackToFullPackage = false;
// Only try trimmed package if sever sends them and we have calculated hash value of the current runtime/externals.
if (_contentHashes.Count == 2 &&
_contentHashes.ContainsKey(_dotnetRuntime) &&
_contentHashes.ContainsKey(_externals) &&
_targetPackage.TrimmedPackages?.Count > 0)
{
Trace.Info($"Current runner content hash: {StringUtil.ConvertToJson(_contentHashes)}");
Trace.Info($"Trimmed packages info from service: {StringUtil.ConvertToJson(_targetPackage.TrimmedPackages)}");
// Try to see whether we can use any size trimmed down package to speed up runner updates.
foreach (var trimmedPackage in _targetPackage.TrimmedPackages)
{
if (trimmedPackage.TrimmedContents.Count == 2 &&
trimmedPackage.TrimmedContents.TryGetValue(_dotnetRuntime, out var trimmedRuntimeHash) &&
trimmedRuntimeHash == _contentHashes[_dotnetRuntime] &&
trimmedPackage.TrimmedContents.TryGetValue(_externals, out var trimmedExternalsHash) &&
trimmedExternalsHash == _contentHashes[_externals])
{
Trace.Info($"Use trimmed (runtime+externals) package '{trimmedPackage.DownloadUrl}' to update runner.");
packageDownloadUrl = trimmedPackage.DownloadUrl;
packageHashValue = trimmedPackage.HashValue;
runtimeTrimmed = true;
externalsTrimmed = true;
break;
}
else if (trimmedPackage.TrimmedContents.Count == 1 &&
trimmedPackage.TrimmedContents.TryGetValue(_externals, out trimmedExternalsHash) &&
trimmedExternalsHash == _contentHashes[_externals])
{
Trace.Info($"Use trimmed (externals) package '{trimmedPackage.DownloadUrl}' to update runner.");
packageDownloadUrl = trimmedPackage.DownloadUrl;
packageHashValue = trimmedPackage.HashValue;
externalsTrimmed = true;
break;
}
else
{
Trace.Info($"Can't use trimmed package from '{trimmedPackage.DownloadUrl}' since the current runner does not carry those trimmed content (Hash mismatch).");
}
}
}
_updateTrace.Enqueue($"DownloadUrl: {packageDownloadUrl}"); _updateTrace.Enqueue($"DownloadUrl: {packageDownloadUrl}");
_updateTrace.Enqueue($"RuntimeTrimmed: {runtimeTrimmed}");
_updateTrace.Enqueue($"ExternalsTrimmed: {externalsTrimmed}");
try
{
@@ -323,12 +242,6 @@ namespace GitHub.Runner.Listener
await ExtractRunnerPackage(archiveFile, latestRunnerDirectory, token);
}
catch (Exception ex) when (runtimeTrimmed || externalsTrimmed)
{
// if anything failed when we use trimmed package (download/validatehase/extract), try again with the full runner package.
Trace.Error($"Fail to download latest runner using trimmed package: {ex}");
fallbackToFullPackage = true;
}
finally
{
try
@@ -347,74 +260,6 @@ namespace GitHub.Runner.Listener
}
}
var trimmedPackageRestoreTasks = new List<Task<bool>>();
if (!fallbackToFullPackage)
{
// Skip restoring externals and runtime if we are going to fullback to the full package.
if (externalsTrimmed)
{
trimmedPackageRestoreTasks.Add(RestoreTrimmedExternals(latestRunnerDirectory, token));
}
if (runtimeTrimmed)
{
trimmedPackageRestoreTasks.Add(RestoreTrimmedDotnetRuntime(latestRunnerDirectory, token));
}
}
if (trimmedPackageRestoreTasks.Count > 0)
{
var restoreResults = await Task.WhenAll(trimmedPackageRestoreTasks);
if (restoreResults.Any(x => x == false))
{
// if any of the restore failed, fallback to full package.
fallbackToFullPackage = true;
}
}
if (fallbackToFullPackage)
{
Trace.Error("Something wrong with the trimmed runner package, failback to use the full package for runner updates.");
_updateTrace.Enqueue($"FallbackToFullPackage: {fallbackToFullPackage}");
IOUtil.DeleteDirectory(latestRunnerDirectory, token);
Directory.CreateDirectory(latestRunnerDirectory);
packageDownloadUrl = _targetPackage.DownloadUrl;
packageHashValue = _targetPackage.HashValue;
_updateTrace.Enqueue($"DownloadUrl: {packageDownloadUrl}");
try
{
archiveFile = await DownLoadRunner(latestRunnerDirectory, packageDownloadUrl, packageHashValue, token);
if (string.IsNullOrEmpty(archiveFile))
{
throw new TaskCanceledException($"Runner package '{packageDownloadUrl}' failed after {Constants.RunnerDownloadRetryMaxAttempts} download attempts");
}
await ValidateRunnerHash(archiveFile, packageHashValue);
await ExtractRunnerPackage(archiveFile, latestRunnerDirectory, token);
}
finally
{
try
{
// delete .zip file
if (!string.IsNullOrEmpty(archiveFile) && File.Exists(archiveFile))
{
Trace.Verbose("Deleting latest runner package zip: {0}", archiveFile);
IOUtil.DeleteFile(archiveFile);
}
}
catch (Exception ex)
{
//it is not critical if we fail to delete the .zip file
Trace.Warning("Failed to delete runner package zip '{0}'. Exception: {1}", archiveFile, ex);
}
}
}
await CopyLatestRunnerToRoot(latestRunnerDirectory, token);
}
@@ -665,9 +510,9 @@ namespace GitHub.Runner.Listener
// delete old bin.2.99.0 folder, only leave the current version and the latest download version
var allBinDirs = Directory.GetDirectories(HostContext.GetDirectory(WellKnownDirectory.Root), "bin.*");
if (allBinDirs.Length > 2)
if (allBinDirs.Length > _numberOfOldVersionsToKeep)
{
// there are more than 2 bin.version folder.
// there are more than one bin.version folder.
// delete older bin.version folders.
foreach (var oldBinDir in allBinDirs)
{
@@ -694,9 +539,9 @@ namespace GitHub.Runner.Listener
// delete old externals.2.99.0 folder, only leave the current version and the latest download version
var allExternalsDirs = Directory.GetDirectories(HostContext.GetDirectory(WellKnownDirectory.Root), "externals.*");
if (allExternalsDirs.Length > 2)
if (allExternalsDirs.Length > _numberOfOldVersionsToKeep)
{
// there are more than 2 externals.version folder.
// there are more than one externals.version folder.
// delete older externals.version folders.
foreach (var oldExternalDir in allExternalsDirs)
{
@@ -795,330 +640,5 @@ namespace GitHub.Runner.Listener
Trace.Info($"Catch exception during report update state, ignore this error and continue auto-update."); Trace.Info($"Catch exception during report update state, ignore this error and continue auto-update.");
} }
} }
private async Task<bool> RestoreTrimmedExternals(string downloadDirectory, CancellationToken token)
{
// Copy the current runner's externals if we are using a externals trimmed package
// Execute the node.js to make sure the copied externals is working.
var stopWatch = Stopwatch.StartNew();
try
{
Trace.Info($"Copy {_externalsCloneDirectory} to {Path.Combine(downloadDirectory, Constants.Path.ExternalsDirectory)}.");
IOUtil.CopyDirectory(_externalsCloneDirectory, Path.Combine(downloadDirectory, Constants.Path.ExternalsDirectory), token);
// try run node.js to see if current node.js works fine after copy over to new location.
var nodeVersions = NodeUtil.BuiltInNodeVersions;
foreach (var nodeVersion in nodeVersions)
{
var newNodeBinary = Path.Combine(downloadDirectory, Constants.Path.ExternalsDirectory, nodeVersion, "bin", $"node{IOUtil.ExeExtension}");
if (File.Exists(newNodeBinary))
{
using (var p = HostContext.CreateService<IProcessInvoker>())
{
var outputs = "";
p.ErrorDataReceived += (_, data) =>
{
if (!string.IsNullOrEmpty(data.Data))
{
Trace.Error(data.Data);
}
};
p.OutputDataReceived += (_, data) =>
{
if (!string.IsNullOrEmpty(data.Data))
{
Trace.Info(data.Data);
outputs = data.Data;
}
};
var exitCode = await p.ExecuteAsync(HostContext.GetDirectory(WellKnownDirectory.Root), newNodeBinary, $"-e \"console.log('{nameof(RestoreTrimmedExternals)}')\"", null, token);
if (exitCode != 0)
{
Trace.Error($"{newNodeBinary} -e \"console.log()\" failed with exit code {exitCode}");
return false;
}
if (!string.Equals(outputs, nameof(RestoreTrimmedExternals), StringComparison.OrdinalIgnoreCase))
{
Trace.Error($"{newNodeBinary} -e \"console.log()\" did not output expected content.");
return false;
}
}
}
}
return true;
}
catch (Exception ex)
{
Trace.Error($"Fail to restore externals for trimmed package: {ex}");
return false;
}
finally
{
stopWatch.Stop();
_updateTrace.Enqueue($"{nameof(RestoreTrimmedExternals)}Time: {stopWatch.ElapsedMilliseconds}ms");
}
}
private async Task<bool> RestoreTrimmedDotnetRuntime(string downloadDirectory, CancellationToken token)
{
// Copy the current runner's dotnet runtime if we are using a dotnet runtime trimmed package
// Execute the runner.listener to make sure the copied runtime is working.
var stopWatch = Stopwatch.StartNew();
try
{
Trace.Info($"Copy {_dotnetRuntimeCloneDirectory} to {Path.Combine(downloadDirectory, Constants.Path.BinDirectory)}.");
IOUtil.CopyDirectory(_dotnetRuntimeCloneDirectory, Path.Combine(downloadDirectory, Constants.Path.BinDirectory), token);
// try run the runner executable to see if current dotnet runtime + future runner binary works fine.
var newRunnerBinary = Path.Combine(downloadDirectory, Constants.Path.BinDirectory, "Runner.Listener");
using (var p = HostContext.CreateService<IProcessInvoker>())
{
p.ErrorDataReceived += (_, data) =>
{
if (!string.IsNullOrEmpty(data.Data))
{
Trace.Error(data.Data);
}
};
p.OutputDataReceived += (_, data) =>
{
if (!string.IsNullOrEmpty(data.Data))
{
Trace.Info(data.Data);
}
};
var exitCode = await p.ExecuteAsync(HostContext.GetDirectory(WellKnownDirectory.Root), newRunnerBinary, "--version", null, token);
if (exitCode != 0)
{
Trace.Error($"{newRunnerBinary} --version failed with exit code {exitCode}");
return false;
}
else
{
return true;
}
}
}
catch (Exception ex)
{
Trace.Error($"Fail to restore dotnet runtime for trimmed package: {ex}");
return false;
}
finally
{
stopWatch.Stop();
_updateTrace.Enqueue($"{nameof(RestoreTrimmedDotnetRuntime)}Time: {stopWatch.ElapsedMilliseconds}ms");
}
}
private async Task CloneAndCalculateAssetsHash(string dotnetRuntimeCloneDirectory, string externalsCloneDirectory, CancellationToken token)
{
var runtimeCloneTask = CloneDotnetRuntime(dotnetRuntimeCloneDirectory, token);
var externalsCloneTask = CloneExternals(externalsCloneDirectory, token);
var waitingTasks = new Dictionary<string, Task>()
{
{nameof(CloneDotnetRuntime), runtimeCloneTask},
{nameof(CloneExternals),externalsCloneTask}
};
while (waitingTasks.Count > 0)
{
Trace.Info($"Waiting for {waitingTasks.Count} tasks to complete.");
var complatedTask = await Task.WhenAny(waitingTasks.Values);
if (waitingTasks.ContainsKey(nameof(CloneExternals)) &&
complatedTask == waitingTasks[nameof(CloneExternals)])
{
Trace.Info($"Externals clone finished.");
waitingTasks.Remove(nameof(CloneExternals));
try
{
if (await externalsCloneTask && !token.IsCancellationRequested)
{
var externalsHash = await HashFiles(externalsCloneDirectory, token);
Trace.Info($"Externals content hash: {externalsHash}");
_contentHashes[_externals] = externalsHash;
_updateTrace.Enqueue($"ExternalsHash: {_contentHashes[_externals]}");
}
else
{
Trace.Error($"Skip compute hash since clone externals failed/cancelled.");
}
}
catch (Exception ex)
{
Trace.Error($"Fail to hash externals content: {ex}");
}
}
else if (waitingTasks.ContainsKey(nameof(CloneDotnetRuntime)) &&
complatedTask == waitingTasks[nameof(CloneDotnetRuntime)])
{
Trace.Info($"Dotnet runtime clone finished.");
waitingTasks.Remove(nameof(CloneDotnetRuntime));
try
{
if (await runtimeCloneTask && !token.IsCancellationRequested)
{
var runtimeHash = await HashFiles(dotnetRuntimeCloneDirectory, token);
Trace.Info($"Runtime content hash: {runtimeHash}");
_contentHashes[_dotnetRuntime] = runtimeHash;
_updateTrace.Enqueue($"DotnetRuntimeHash: {_contentHashes[_dotnetRuntime]}");
}
else
{
Trace.Error($"Skip compute hash since clone dotnet runtime failed/cancelled.");
}
}
catch (Exception ex)
{
Trace.Error($"Fail to hash runtime content: {ex}");
}
}
Trace.Info($"Still waiting for {waitingTasks.Count} tasks to complete.");
}
}
private async Task<bool> CloneDotnetRuntime(string runtimeDir, CancellationToken token)
{
var stopWatch = Stopwatch.StartNew();
try
{
Trace.Info($"Cloning dotnet runtime to {runtimeDir}");
IOUtil.DeleteDirectory(runtimeDir, CancellationToken.None);
Directory.CreateDirectory(runtimeDir);
var assembly = Assembly.GetExecutingAssembly();
var assetsContent = default(string);
using (var stream = assembly.GetManifestResourceStream("GitHub.Runner.Listener.runnercoreassets"))
using (var streamReader = new StreamReader(stream))
{
assetsContent = await streamReader.ReadToEndAsync();
}
if (!string.IsNullOrEmpty(assetsContent))
{
var runnerCoreAssets = assetsContent.Split(new[] { "\n", "\r\n" }, StringSplitOptions.RemoveEmptyEntries);
if (runnerCoreAssets.Length > 0)
{
var binDir = HostContext.GetDirectory(WellKnownDirectory.Bin);
IOUtil.CopyDirectory(binDir, runtimeDir, token);
var clonedFile = 0;
foreach (var file in Directory.EnumerateFiles(runtimeDir, "*", SearchOption.AllDirectories))
{
token.ThrowIfCancellationRequested();
if (runnerCoreAssets.Any(x => file.Replace(Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar).EndsWith(x.Trim())))
{
Trace.Verbose($"{file} is part of the runner core, delete from cloned runtime directory.");
IOUtil.DeleteFile(file);
}
else
{
clonedFile++;
}
}
Trace.Info($"Successfully cloned dotnet runtime to {runtimeDir}. Total files: {clonedFile}");
return true;
}
}
}
catch (Exception ex)
{
Trace.Error($"Fail to clone dotnet runtime to {runtimeDir}");
Trace.Error(ex);
}
finally
{
stopWatch.Stop();
_updateTrace.Enqueue($"{nameof(CloneDotnetRuntime)}Time: {stopWatch.ElapsedMilliseconds}ms");
}
return false;
}
private Task<bool> CloneExternals(string externalsDir, CancellationToken token)
{
var stopWatch = Stopwatch.StartNew();
try
{
Trace.Info($"Cloning externals to {externalsDir}");
IOUtil.DeleteDirectory(externalsDir, CancellationToken.None);
Directory.CreateDirectory(externalsDir);
IOUtil.CopyDirectory(HostContext.GetDirectory(WellKnownDirectory.Externals), externalsDir, token);
Trace.Info($"Successfully cloned externals to {externalsDir}.");
return Task.FromResult(true);
}
catch (Exception ex)
{
Trace.Error($"Fail to clone externals to {externalsDir}");
Trace.Error(ex);
}
finally
{
stopWatch.Stop();
_updateTrace.Enqueue($"{nameof(CloneExternals)}Time: {stopWatch.ElapsedMilliseconds}ms");
}
return Task.FromResult(false);
}
private async Task<string> HashFiles(string fileFolder, CancellationToken token)
{
Trace.Info($"Calculating hash for {fileFolder}");
var stopWatch = Stopwatch.StartNew();
string binDir = HostContext.GetDirectory(WellKnownDirectory.Bin);
string node = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Externals), NodeUtil.GetInternalNodeVersion(), "bin", $"node{IOUtil.ExeExtension}");
string hashFilesScript = Path.Combine(binDir, "hashFiles");
var hashResult = string.Empty;
using (var processInvoker = HostContext.CreateService<IProcessInvoker>())
{
processInvoker.ErrorDataReceived += (_, data) =>
{
if (!string.IsNullOrEmpty(data.Data) && data.Data.StartsWith("__OUTPUT__") && data.Data.EndsWith("__OUTPUT__"))
{
hashResult = data.Data.Substring(10, data.Data.Length - 20);
Trace.Info($"Hash result: '{hashResult}'");
}
else
{
Trace.Info(data.Data);
}
};
processInvoker.OutputDataReceived += (_, data) =>
{
Trace.Verbose(data.Data);
};
var env = new Dictionary<string, string>
{
["patterns"] = "**"
};
int exitCode = await processInvoker.ExecuteAsync(workingDirectory: fileFolder,
fileName: node,
arguments: $"\"{hashFilesScript.Replace("\"", "\\\"")}\"",
environment: env,
requireExitCodeZero: false,
outputEncoding: null,
killProcessOnCancel: true,
cancellationToken: token);
if (exitCode != 0)
{
Trace.Error($"hashFiles returns '{exitCode}' failed. Fail to hash files under directory '{fileFolder}'");
}
stopWatch.Stop();
_updateTrace.Enqueue($"{nameof(HashFiles)}{Path.GetFileName(fileFolder)}Time: {stopWatch.ElapsedMilliseconds}ms");
return hashResult;
}
}
}
}
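
The retention change above (allBinDirs.Length > 2 becoming allBinDirs.Length > _numberOfOldVersionsToKeep, with the new constant set to 1) is easiest to see with a worked example. The following C# sketch is illustrative only and is not part of this PR; the version numbers are hypothetical. It mirrors the keep/delete rule the diff applies to bin.* folders: the plain bin folder, the currently running version, and the freshly downloaded target version are always skipped, and any other bin.<version> folder is deleted once the count passes the new threshold.

using System;
using System.Linq;

// Illustrative sketch of the bin.* retention rule after this change; not part of the PR.
class BinRetentionSketch
{
    // mirrors the new _numberOfOldVersionsToKeep constant
    const int NumberOfOldVersionsToKeep = 1;

    static void Main()
    {
        var currentVersion = "2.312.0";   // hypothetical: version currently running
        var targetVersion = "2.313.0";    // hypothetical: version being downloaded
        var allBinDirs = new[] { "bin.2.311.0", $"bin.{currentVersion}", $"bin.{targetVersion}" };

        if (allBinDirs.Length > NumberOfOldVersionsToKeep)
        {
            // the diff keeps "bin", the current version, and the target version; older folders go
            var deleted = allBinDirs.Where(d =>
                d != "bin" &&
                d != $"bin.{currentVersion}" &&
                d != $"bin.{targetVersion}");
            Console.WriteLine(string.Join(", ", deleted));   // prints: bin.2.311.0
        }
    }
}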

View File

@@ -0,0 +1,568 @@
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.IO.Compression;
using System.Linq;
using System.Net.Http;
using System.Reflection;
using System.Security.Cryptography;
using System.Threading;
using System.Threading.Tasks;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using GitHub.Services.Common;
using GitHub.Services.WebApi;
namespace GitHub.Runner.Listener
{
// This class is a fork of SelfUpdater.cs and is intended to only be used for the
// new self-update flow where the PackageMetadata is sent in the message directly.
// Forking the class prevents us from accidentally breaking the old flow while it's still in production
[ServiceLocator(Default = typeof(SelfUpdaterV2))]
public interface ISelfUpdaterV2 : IRunnerService
{
bool Busy { get; }
Task<bool> SelfUpdate(RunnerRefreshMessage updateMessage, IJobDispatcher jobDispatcher, bool restartInteractiveRunner, CancellationToken token);
}
public class SelfUpdaterV2 : RunnerService, ISelfUpdaterV2
{
private static string _platform = BuildConstants.RunnerPackage.PackageName;
private ITerminal _terminal;
private IRunnerServer _runnerServer;
private int _poolId;
private ulong _agentId;
private const int _numberOfOldVersionsToKeep = 1;
private readonly ConcurrentQueue<string> _updateTrace = new();
public bool Busy { get; private set; }
public override void Initialize(IHostContext hostContext)
{
base.Initialize(hostContext);
_terminal = hostContext.GetService<ITerminal>();
_runnerServer = HostContext.GetService<IRunnerServer>();
var configStore = HostContext.GetService<IConfigurationStore>();
var settings = configStore.GetSettings();
_poolId = settings.PoolId;
_agentId = settings.AgentId;
}
public async Task<bool> SelfUpdate(RunnerRefreshMessage updateMessage, IJobDispatcher jobDispatcher, bool restartInteractiveRunner, CancellationToken token)
{
Busy = true;
try
{
var totalUpdateTime = Stopwatch.StartNew();
Trace.Info($"An update is available.");
_updateTrace.Enqueue($"RunnerPlatform: {updateMessage.OS}");
// Print a console line that warns the user not to shut down the runner.
_terminal.WriteLine("Runner update in progress, do not shutdown runner.");
_terminal.WriteLine($"Downloading {updateMessage.TargetVersion} runner");
await DownloadLatestRunner(token, updateMessage.TargetVersion, updateMessage.DownloadUrl, updateMessage.SHA256Checksum, updateMessage.OS);
Trace.Info($"Download latest runner and unzip into runner root.");
// wait till all running job finish
_terminal.WriteLine("Waiting for current job finish running.");
await jobDispatcher.WaitAsync(token);
Trace.Info($"All running job has exited.");
// We need to keep runner backup around for macOS until we fixed https://github.com/actions/runner/issues/743
// delete runner backup
var stopWatch = Stopwatch.StartNew();
DeletePreviousVersionRunnerBackup(token, updateMessage.TargetVersion);
Trace.Info($"Delete old version runner backup.");
stopWatch.Stop();
// generate update script from template
_updateTrace.Enqueue($"DeleteRunnerBackupTime: {stopWatch.ElapsedMilliseconds}ms");
_terminal.WriteLine("Generate and execute update script.");
string updateScript = GenerateUpdateScript(restartInteractiveRunner, updateMessage.TargetVersion);
Trace.Info($"Generate update script into: {updateScript}");
#if DEBUG
// For L0, we will skip execute update script.
if (string.IsNullOrEmpty(Environment.GetEnvironmentVariable("_GITHUB_ACTION_EXECUTE_UPDATE_SCRIPT")))
#endif
{
string flagFile = "update.finished";
IOUtil.DeleteFile(flagFile);
// kick off update script
Process invokeScript = new();
#if OS_WINDOWS
invokeScript.StartInfo.FileName = WhichUtil.Which("cmd.exe", trace: Trace);
invokeScript.StartInfo.Arguments = $"/c \"{updateScript}\"";
#elif (OS_OSX || OS_LINUX)
invokeScript.StartInfo.FileName = WhichUtil.Which("bash", trace: Trace);
invokeScript.StartInfo.Arguments = $"\"{updateScript}\"";
#endif
invokeScript.Start();
Trace.Info($"Update script start running");
}
totalUpdateTime.Stop();
_updateTrace.Enqueue($"TotalUpdateTime: {totalUpdateTime.ElapsedMilliseconds}ms");
_terminal.WriteLine("Runner will exit shortly for update, should be back online within 10 seconds.");
return true;
}
catch (Exception ex)
{
_updateTrace.Enqueue(ex.ToString());
throw;
}
finally
{
_terminal.WriteLine("Runner update process finished.");
Busy = false;
}
}
/// <summary>
/// _work
/// \_update
/// \bin
/// \externals
/// \run.sh
/// \run.cmd
/// \package.zip //temp download .zip/.tar.gz
/// </summary>
/// <param name="token"></param>
/// <returns></returns>
private async Task DownloadLatestRunner(CancellationToken token, string targetVersion, string packageDownloadUrl, string packageHashValue, string targetPlatform)
{
string latestRunnerDirectory = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Work), Constants.Path.UpdateDirectory);
IOUtil.DeleteDirectory(latestRunnerDirectory, token);
Directory.CreateDirectory(latestRunnerDirectory);
string archiveFile = null;
_updateTrace.Enqueue($"DownloadUrl: {packageDownloadUrl}");
try
{
#if DEBUG
// Much of the update process (targetVersion, archive) is server-side, this is a way to control it from here for testing specific update scenarios
// Add files like 'runner2.281.2.tar.gz' or 'runner2.283.0.zip' (depending on your platform) to your runner root folder
// Note that runners still need to be older than the server's runner version in order to receive an 'AgentRefreshMessage' and trigger this update
// Wrapped in #if DEBUG as this should not be in the RELEASE build
if (StringUtil.ConvertToBoolean(Environment.GetEnvironmentVariable("GITHUB_ACTIONS_RUNNER_IS_MOCK_UPDATE")))
{
var waitForDebugger = StringUtil.ConvertToBoolean(Environment.GetEnvironmentVariable("GITHUB_ACTIONS_RUNNER_IS_MOCK_UPDATE_WAIT_FOR_DEBUGGER"));
if (waitForDebugger)
{
int waitInSeconds = 20;
while (!Debugger.IsAttached && waitInSeconds-- > 0)
{
await Task.Delay(1000);
}
Debugger.Break();
}
if (targetPlatform.StartsWith("win"))
{
archiveFile = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Root), $"runner{targetVersion}.zip");
}
else
{
archiveFile = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Root), $"runner{targetVersion}.tar.gz");
}
if (File.Exists(archiveFile))
{
_updateTrace.Enqueue($"Mocking update with file: '{archiveFile}' and targetVersion: '{targetVersion}', nothing is downloaded");
_terminal.WriteLine($"Mocking update with file: '{archiveFile}' and targetVersion: '{targetVersion}', nothing is downloaded");
}
else
{
_terminal.WriteLine($"Mock runner archive not found at {archiveFile} for target version {targetVersion}, proceeding with download instead");
_updateTrace.Enqueue($"Mock runner archive not found at {archiveFile} for target version {targetVersion}, proceeding with download instead");
archiveFile = null;
}
}
#endif
// archiveFile is not null only if we mocked it above
if (string.IsNullOrEmpty(archiveFile))
{
archiveFile = await DownLoadRunner(latestRunnerDirectory, packageDownloadUrl, packageHashValue, targetPlatform, token);
if (string.IsNullOrEmpty(archiveFile))
{
throw new TaskCanceledException($"Runner package '{packageDownloadUrl}' failed after {Constants.RunnerDownloadRetryMaxAttempts} download attempts");
}
await ValidateRunnerHash(archiveFile, packageHashValue);
}
await ExtractRunnerPackage(archiveFile, latestRunnerDirectory, token);
}
finally
{
try
{
// delete .zip file
if (!string.IsNullOrEmpty(archiveFile) && File.Exists(archiveFile))
{
Trace.Verbose("Deleting latest runner package zip: {0}", archiveFile);
IOUtil.DeleteFile(archiveFile);
}
}
catch (Exception ex)
{
//it is not critical if we fail to delete the .zip file
Trace.Warning("Failed to delete runner package zip '{0}'. Exception: {1}", archiveFile, ex);
}
}
await CopyLatestRunnerToRoot(latestRunnerDirectory, targetVersion, token);
}
private async Task<string> DownLoadRunner(string downloadDirectory, string packageDownloadUrl, string packageHashValue, string packagePlatform, CancellationToken token)
{
var stopWatch = Stopwatch.StartNew();
int runnerSuffix = 1;
string archiveFile = null;
bool downloadSucceeded = false;
// Download the runner, using multiple attempts in order to be resilient against any networking/CDN issues
for (int attempt = 1; attempt <= Constants.RunnerDownloadRetryMaxAttempts; attempt++)
{
// Generate an available package name, and do our best effort to clean up stale local zip files
while (true)
{
if (packagePlatform.StartsWith("win"))
{
archiveFile = Path.Combine(downloadDirectory, $"runner{runnerSuffix}.zip");
}
else
{
archiveFile = Path.Combine(downloadDirectory, $"runner{runnerSuffix}.tar.gz");
}
try
{
// delete .zip file
if (!string.IsNullOrEmpty(archiveFile) && File.Exists(archiveFile))
{
Trace.Verbose("Deleting latest runner package zip '{0}'", archiveFile);
IOUtil.DeleteFile(archiveFile);
}
break;
}
catch (Exception ex)
{
// couldn't delete the file for whatever reason, so generate another name
Trace.Warning("Failed to delete runner package zip '{0}'. Exception: {1}", archiveFile, ex);
runnerSuffix++;
}
}
// Allow a 15-minute package download timeout, which is good enough to update the runner from a 1 Mbit/s ADSL connection.
if (!int.TryParse(Environment.GetEnvironmentVariable("GITHUB_ACTIONS_RUNNER_DOWNLOAD_TIMEOUT") ?? string.Empty, out int timeoutSeconds))
{
timeoutSeconds = 15 * 60;
}
Trace.Info($"Attempt {attempt}: save latest runner into {archiveFile}.");
using (var downloadTimeout = new CancellationTokenSource(TimeSpan.FromSeconds(timeoutSeconds)))
using (var downloadCts = CancellationTokenSource.CreateLinkedTokenSource(downloadTimeout.Token, token))
{
try
{
Trace.Info($"Download runner: begin download");
long downloadSize = 0;
//open zip stream in async mode
using (HttpClient httpClient = new(HostContext.CreateHttpClientHandler()))
{
Trace.Info($"Downloading {packageDownloadUrl}");
using (FileStream fs = new(archiveFile, FileMode.Create, FileAccess.Write, FileShare.None, bufferSize: 4096, useAsync: true))
using (Stream result = await httpClient.GetStreamAsync(packageDownloadUrl))
{
//81920 is the default used by System.IO.Stream.CopyTo and is under the large object heap threshold (85k).
await result.CopyToAsync(fs, 81920, downloadCts.Token);
await fs.FlushAsync(downloadCts.Token);
downloadSize = fs.Length;
}
}
Trace.Info($"Download runner: finished download");
downloadSucceeded = true;
stopWatch.Stop();
_updateTrace.Enqueue($"PackageDownloadTime: {stopWatch.ElapsedMilliseconds}ms");
_updateTrace.Enqueue($"Attempts: {attempt}");
_updateTrace.Enqueue($"PackageSize: {downloadSize / 1024 / 1024}MB");
break;
}
catch (OperationCanceledException) when (token.IsCancellationRequested)
{
Trace.Info($"Runner download has been cancelled.");
throw;
}
catch (Exception ex)
{
if (downloadCts.Token.IsCancellationRequested)
{
Trace.Warning($"Runner download has timed out after {timeoutSeconds} seconds");
}
Trace.Warning($"Failed to get package '{archiveFile}' from '{packageDownloadUrl}'. Exception {ex}");
}
}
}
if (downloadSucceeded)
{
return archiveFile;
}
else
{
return null;
}
}
private async Task ValidateRunnerHash(string archiveFile, string packageHashValue)
{
var stopWatch = Stopwatch.StartNew();
// Validate Hash Matches if it is provided
using (FileStream stream = File.OpenRead(archiveFile))
{
if (!string.IsNullOrEmpty(packageHashValue))
{
using (SHA256 sha256 = SHA256.Create())
{
byte[] srcHashBytes = await sha256.ComputeHashAsync(stream);
var hash = PrimitiveExtensions.ConvertToHexString(srcHashBytes);
if (hash != packageHashValue)
{
// Hash did not match, we can't recover from this, just throw
throw new Exception($"Computed runner hash {hash} did not match expected Runner Hash {packageHashValue} for {archiveFile}");
}
stopWatch.Stop();
Trace.Info($"Validated Runner Hash matches {archiveFile} : {packageHashValue}");
_updateTrace.Enqueue($"ValidateHashTime: {stopWatch.ElapsedMilliseconds}ms");
}
}
}
}
private async Task ExtractRunnerPackage(string archiveFile, string extractDirectory, CancellationToken token)
{
var stopWatch = Stopwatch.StartNew();
if (archiveFile.EndsWith(".zip", StringComparison.OrdinalIgnoreCase))
{
ZipFile.ExtractToDirectory(archiveFile, extractDirectory);
}
else if (archiveFile.EndsWith(".tar.gz", StringComparison.OrdinalIgnoreCase))
{
string tar = WhichUtil.Which("tar", trace: Trace);
if (string.IsNullOrEmpty(tar))
{
throw new NotSupportedException($"tar -xzf");
}
// tar -xzf
using (var processInvoker = HostContext.CreateService<IProcessInvoker>())
{
processInvoker.OutputDataReceived += new EventHandler<ProcessDataReceivedEventArgs>((sender, args) =>
{
if (!string.IsNullOrEmpty(args.Data))
{
Trace.Info(args.Data);
}
});
processInvoker.ErrorDataReceived += new EventHandler<ProcessDataReceivedEventArgs>((sender, args) =>
{
if (!string.IsNullOrEmpty(args.Data))
{
Trace.Error(args.Data);
}
});
int exitCode = await processInvoker.ExecuteAsync(extractDirectory, tar, $"-xzf \"{archiveFile}\"", null, token);
if (exitCode != 0)
{
throw new NotSupportedException($"Can't use 'tar -xzf' to extract archive file: {archiveFile}. return code: {exitCode}.");
}
}
}
else
{
throw new NotSupportedException($"{archiveFile}");
}
stopWatch.Stop();
Trace.Info($"Finished getting latest runner package at: {extractDirectory}.");
_updateTrace.Enqueue($"PackageExtractTime: {stopWatch.ElapsedMilliseconds}ms");
}
private Task CopyLatestRunnerToRoot(string latestRunnerDirectory, string targetVersion, CancellationToken token)
{
var stopWatch = Stopwatch.StartNew();
// copy latest runner into runner root folder
// copy bin from _work/_update -> bin.version under root
string binVersionDir = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Root), $"{Constants.Path.BinDirectory}.{targetVersion}");
Directory.CreateDirectory(binVersionDir);
Trace.Info($"Copy {Path.Combine(latestRunnerDirectory, Constants.Path.BinDirectory)} to {binVersionDir}.");
IOUtil.CopyDirectory(Path.Combine(latestRunnerDirectory, Constants.Path.BinDirectory), binVersionDir, token);
// copy externals from _work/_update -> externals.version under root
string externalsVersionDir = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Root), $"{Constants.Path.ExternalsDirectory}.{targetVersion}");
Directory.CreateDirectory(externalsVersionDir);
Trace.Info($"Copy {Path.Combine(latestRunnerDirectory, Constants.Path.ExternalsDirectory)} to {externalsVersionDir}.");
IOUtil.CopyDirectory(Path.Combine(latestRunnerDirectory, Constants.Path.ExternalsDirectory), externalsVersionDir, token);
// copy and replace all .sh/.cmd files
Trace.Info($"Copy any remaining .sh/.cmd files into runner root.");
foreach (FileInfo file in new DirectoryInfo(latestRunnerDirectory).GetFiles() ?? new FileInfo[0])
{
string destination = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Root), file.Name);
// Removing the file instead of just trying to overwrite it works around permissions issues on linux.
// https://github.com/actions/runner/issues/981
Trace.Info($"Copy {file.FullName} to {destination}");
IOUtil.DeleteFile(destination);
file.CopyTo(destination, true);
}
stopWatch.Stop();
_updateTrace.Enqueue($"CopyRunnerToRootTime: {stopWatch.ElapsedMilliseconds}ms");
return Task.CompletedTask;
}
private void DeletePreviousVersionRunnerBackup(CancellationToken token, string targetVersion)
{
// delete previous runner backup (back compat, can be removed after several sprints)
// bin.bak.2.99.0
// externals.bak.2.99.0
foreach (string existBackUp in Directory.GetDirectories(HostContext.GetDirectory(WellKnownDirectory.Root), "*.bak.*"))
{
Trace.Info($"Delete existing runner backup at {existBackUp}.");
try
{
IOUtil.DeleteDirectory(existBackUp, token);
}
catch (Exception ex) when (!(ex is OperationCanceledException))
{
Trace.Error(ex);
Trace.Info($"Catch exception during delete backup folder {existBackUp}, ignore this error try delete the backup folder on next auto-update.");
}
}
// delete old bin.2.99.0 folder, only leave the current version and the latest download version
var allBinDirs = Directory.GetDirectories(HostContext.GetDirectory(WellKnownDirectory.Root), "bin.*");
if (allBinDirs.Length > _numberOfOldVersionsToKeep)
{
// there are more than {_numberOfOldVersionsToKeep} bin.version folder.
// delete older bin.version folders.
foreach (var oldBinDir in allBinDirs)
{
if (string.Equals(oldBinDir, Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Root), $"bin"), StringComparison.OrdinalIgnoreCase) ||
string.Equals(oldBinDir, Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Root), $"bin.{BuildConstants.RunnerPackage.Version}"), StringComparison.OrdinalIgnoreCase) ||
string.Equals(oldBinDir, Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Root), $"bin.{targetVersion}"), StringComparison.OrdinalIgnoreCase))
{
// skip for current runner version
continue;
}
Trace.Info($"Delete runner bin folder's backup at {oldBinDir}.");
try
{
IOUtil.DeleteDirectory(oldBinDir, token);
}
catch (Exception ex) when (!(ex is OperationCanceledException))
{
Trace.Error(ex);
Trace.Info($"Catch exception during delete backup folder {oldBinDir}, ignore this error try delete the backup folder on next auto-update.");
}
}
}
// delete old externals.2.99.0 folder, only leave the current version and the latest download version
var allExternalsDirs = Directory.GetDirectories(HostContext.GetDirectory(WellKnownDirectory.Root), "externals.*");
if (allExternalsDirs.Length > _numberOfOldVersionsToKeep)
{
// there are more than {_numberOfOldVersionsToKeep} externals.version folder.
// delete older externals.version folders.
foreach (var oldExternalDir in allExternalsDirs)
{
if (string.Equals(oldExternalDir, Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Root), $"externals"), StringComparison.OrdinalIgnoreCase) ||
string.Equals(oldExternalDir, Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Root), $"externals.{BuildConstants.RunnerPackage.Version}"), StringComparison.OrdinalIgnoreCase) ||
string.Equals(oldExternalDir, Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Root), $"externals.{targetVersion}"), StringComparison.OrdinalIgnoreCase))
{
// skip for current runner version
continue;
}
Trace.Info($"Delete runner externals folder's backup at {oldExternalDir}.");
try
{
IOUtil.DeleteDirectory(oldExternalDir, token);
}
catch (Exception ex) when (!(ex is OperationCanceledException))
{
Trace.Error(ex);
Trace.Info($"Catch exception during delete backup folder {oldExternalDir}, ignore this error try delete the backup folder on next auto-update.");
}
}
}
}
private string GenerateUpdateScript(bool restartInteractiveRunner, string targetVersion)
{
int processId = Process.GetCurrentProcess().Id;
string updateLog = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Diag), $"SelfUpdate-{DateTime.UtcNow.ToString("yyyyMMdd-HHmmss")}.log");
string runnerRoot = HostContext.GetDirectory(WellKnownDirectory.Root);
#if OS_WINDOWS
string templateName = "update.cmd.template";
#else
string templateName = "update.sh.template";
#endif
string templatePath = Path.Combine(runnerRoot, $"bin.{targetVersion}", templateName);
string template = File.ReadAllText(templatePath);
template = template.Replace("_PROCESS_ID_", processId.ToString());
template = template.Replace("_RUNNER_PROCESS_NAME_", $"Runner.Listener{IOUtil.ExeExtension}");
template = template.Replace("_ROOT_FOLDER_", runnerRoot);
template = template.Replace("_EXIST_RUNNER_VERSION_", BuildConstants.RunnerPackage.Version);
template = template.Replace("_DOWNLOAD_RUNNER_VERSION_", targetVersion);
template = template.Replace("_UPDATE_LOG_", updateLog);
template = template.Replace("_RESTART_INTERACTIVE_RUNNER_", restartInteractiveRunner ? "1" : "0");
#if OS_WINDOWS
string scriptName = "_update.cmd";
#else
string scriptName = "_update.sh";
#endif
string updateScript = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Work), scriptName);
if (File.Exists(updateScript))
{
IOUtil.DeleteFile(updateScript);
}
File.WriteAllText(updateScript, template);
return updateScript;
}
}
}
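
ValidateRunnerHash above re-computes a SHA-256 digest of the downloaded archive and compares it with the checksum carried in the update message. As a rough standalone illustration of the same check, without the runner's PrimitiveExtensions helper, a minimal C# sketch could look like the following; the file name and expected checksum are placeholders, and the case-insensitive comparison is an assumption rather than something this PR specifies.

using System;
using System.IO;
using System.Security.Cryptography;

// Minimal sketch: verify a downloaded archive against an expected SHA-256 checksum.
// The path and expected value are placeholders, not values taken from this PR.
class HashCheckSketch
{
    static void Main()
    {
        string archiveFile = "runner-package.tar.gz";                  // placeholder
        string expected = "<sha256_checksum from the update message>"; // placeholder

        using FileStream stream = File.OpenRead(archiveFile);
        using SHA256 sha256 = SHA256.Create();
        byte[] digest = sha256.ComputeHash(stream);
        string actual = Convert.ToHexString(digest);   // upper-case hex on .NET 5+

        // Assumption: compare case-insensitively, since checksum casing can vary by source.
        if (!string.Equals(actual, expected, StringComparison.OrdinalIgnoreCase))
        {
            throw new Exception($"Computed hash {actual} did not match expected hash {expected} for {archiveFile}");
        }
        Console.WriteLine("Checksum OK");
    }
}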

View File

@@ -23,7 +23,13 @@ namespace GitHub.Runner.Sdk
if (VssClientHttpRequestSettings.Default.UserAgent != null && VssClientHttpRequestSettings.Default.UserAgent.Count > 0)
{
headerValues.AddRange(VssClientHttpRequestSettings.Default.UserAgent);
foreach (var headerVal in VssClientHttpRequestSettings.Default.UserAgent)
{
if (!headerValues.Contains(headerVal))
{
headerValues.Add(headerVal);
}
}
}
VssClientHttpRequestSettings.Default.UserAgent = headerValues;
@@ -33,6 +39,23 @@ namespace GitHub.Runner.Sdk
{
VssClientHttpRequestSettings.Default.ServerCertificateValidationCallback = HttpClientHandler.DangerousAcceptAnyServerCertificateValidator;
}
var rawHeaderValues = new List<ProductInfoHeaderValue>();
rawHeaderValues.AddRange(additionalUserAgents);
rawHeaderValues.Add(new ProductInfoHeaderValue($"({StringUtil.SanitizeUserAgentHeader(RuntimeInformation.OSDescription)})"));
if (RawClientHttpRequestSettings.Default.UserAgent != null && RawClientHttpRequestSettings.Default.UserAgent.Count > 0)
{
foreach (var headerVal in RawClientHttpRequestSettings.Default.UserAgent)
{
if (!rawHeaderValues.Contains(headerVal))
{
rawHeaderValues.Add(headerVal);
}
}
}
RawClientHttpRequestSettings.Default.UserAgent = rawHeaderValues;
}
public static VssConnection CreateConnection(

View File

@@ -144,7 +144,7 @@ namespace GitHub.Runner.Worker
executionContext.Error(error.Message);
}
throw new ArgumentException($"Fail to load {fileRelativePath}");
throw new ArgumentException($"Failed to load {fileRelativePath}");
}
if (actionDefinition.Execution == null)

View File

@@ -108,6 +108,8 @@ namespace GitHub.Runner.Worker
parentContext.QueueAttachFile(type: CoreAttachmentType.DiagnosticLog, name: diagnosticsZipFileName, filePath: diagnosticsZipFilePath);
parentContext.QueueDiagnosticLogFile(name: diagnosticsZipFileName, filePath: diagnosticsZipFilePath);
executionContext.Debug("Diagnostic file upload complete."); executionContext.Debug("Diagnostic file upload complete.");
} }

View File

@@ -90,6 +90,7 @@ namespace GitHub.Runner.Worker
long Write(string tag, string message);
void QueueAttachFile(string type, string name, string filePath);
void QueueSummaryFile(string name, string filePath, Guid stepRecordId);
void QueueDiagnosticLogFile(string name, string filePath);
// timeline record update methods
void Start(string currentOperation = null);
@@ -397,11 +398,11 @@ namespace GitHub.Runner.Worker
if (recordOrder != null)
{
child.InitializeTimelineRecord(_mainTimelineId, recordId, _record.Id, ExecutionContextType.Task, displayName, refName, recordOrder);
child.InitializeTimelineRecord(_mainTimelineId, recordId, _record.Id, ExecutionContextType.Task, displayName, refName, recordOrder, embedded: isEmbedded);
}
else
{
child.InitializeTimelineRecord(_mainTimelineId, recordId, _record.Id, ExecutionContextType.Task, displayName, refName, ++_childTimelineRecordOrder);
child.InitializeTimelineRecord(_mainTimelineId, recordId, _record.Id, ExecutionContextType.Task, displayName, refName, ++_childTimelineRecordOrder, embedded: isEmbedded);
}
if (logger != null)
{
@@ -432,7 +433,7 @@ namespace GitHub.Runner.Worker
Dictionary<string, string> intraActionState = null, Dictionary<string, string> intraActionState = null,
string siblingScopeName = null) string siblingScopeName = null)
{ {
return Root.CreateChild(_record.Id, _record.Name, _record.Id.ToString("N"), scopeName, contextName, stage, logger: _logger, isEmbedded: true, cancellationTokenSource: null, intraActionState: intraActionState, embeddedId: embeddedId, siblingScopeName: siblingScopeName, timeout: GetRemainingTimeout()); return Root.CreateChild(_record.Id, _record.Name, _record.Id.ToString("N"), scopeName, contextName, stage, logger: _logger, isEmbedded: true, cancellationTokenSource: null, intraActionState: intraActionState, embeddedId: embeddedId, siblingScopeName: siblingScopeName, timeout: GetRemainingTimeout(), recordOrder: _record.Order);
} }
public void Start(string currentOperation = null) public void Start(string currentOperation = null)
@@ -982,6 +983,18 @@ namespace GitHub.Runner.Worker
_jobServerQueue.QueueResultsUpload(stepRecordId, name, filePath, ChecksAttachmentType.StepSummary, deleteSource: false, finalize: true, firstBlock: true, totalLines: 0);
}
public void QueueDiagnosticLogFile(string name, string filePath)
{
ArgUtil.NotNullOrEmpty(name, nameof(name));
ArgUtil.NotNullOrEmpty(filePath, nameof(filePath));
if (!File.Exists(filePath))
{
throw new FileNotFoundException($"Can't upload diagnostic log file: {filePath}. File does not exist.");
}
_jobServerQueue.QueueResultsUpload(_record.Id, name, filePath, CoreAttachmentType.ResultsDiagnosticLog, deleteSource: false, finalize: true, firstBlock: true, totalLines: 0);
}
// Add OnMatcherChanged
public void Add(OnMatcherChanged handler)
{
@@ -1160,7 +1173,7 @@ namespace GitHub.Runner.Worker
}
}
private void InitializeTimelineRecord(Guid timelineId, Guid timelineRecordId, Guid? parentTimelineRecordId, string recordType, string displayName, string refName, int? order)
private void InitializeTimelineRecord(Guid timelineId, Guid timelineRecordId, Guid? parentTimelineRecordId, string recordType, string displayName, string refName, int? order, bool embedded = false)
{
_mainTimelineId = timelineId;
_record.Id = timelineRecordId;
@@ -1186,7 +1199,11 @@ namespace GitHub.Runner.Worker
var configuration = HostContext.GetService<IConfigurationStore>();
_record.WorkerName = configuration.GetSettings().AgentName;
_jobServerQueue.QueueTimelineRecordUpdate(_mainTimelineId, _record);
// We don't want to update the timeline record for embedded steps since they are not really represented in the UI.
if (!embedded)
{
_jobServerQueue.QueueTimelineRecordUpdate(_mainTimelineId, _record);
}
}
private void JobServerQueueThrottling_EventReceived(object sender, ThrottlingEventArgs data)

View File

@@ -49,6 +49,9 @@ namespace GitHub.Runner.Worker
!string.IsNullOrEmpty(orchestrationId.Value))
{
HostContext.UserAgents.Add(new ProductInfoHeaderValue("OrchestrationId", orchestrationId.Value));
// make sure orchestration id is in the user-agent header.
VssUtil.InitializeVssClientSettings(HostContext.UserAgents, HostContext.WebProxy);
}
var jobServerQueueTelemetry = false;
@@ -295,6 +298,8 @@ namespace GitHub.Runner.Worker
jobContext.Warning(string.Format(Constants.Runner.EnforcedNode12DetectedAfterEndOfLife, actions));
}
await ShutdownQueue(throwOnFailure: false);
// Make sure to clean temp after file upload, since pending file uploads may still use the TEMP dir.
_tempDirectoryManager?.CleanupTempDirectory();

View File

@@ -1,12 +1,11 @@
using System;
using System.ComponentModel;
using System.Diagnostics.CodeAnalysis;
namespace GitHub.Services.Common.Internal
{
[EditorBrowsable(EditorBrowsableState.Never)]
public static class RawHttpHeaders
{
public const String SessionHeader = "X-Runner-Session";
public const String SessionHeader = "X-Actions-Session";
}
}

View File

@@ -138,6 +138,8 @@ namespace GitHub.Services.Common
response.Dispose();
}
this.Settings.ApplyTo(request);
// Let's start with sending a token
IssuedToken token = null;
if (m_tokenProvider != null)

View File

@@ -463,6 +463,7 @@ namespace GitHub.DistributedTask.WebApi
string runnerVersion = null,
string os = null,
string architecture = null,
bool? disableUpdate = null,
object userState = null,
CancellationToken cancellationToken = default)
{
@@ -495,6 +496,11 @@ namespace GitHub.DistributedTask.WebApi
queryParams.Add("architecture", architecture); queryParams.Add("architecture", architecture);
} }
if (disableUpdate != null)
{
queryParams.Add("disableUpdate", disableUpdate.Value.ToString().ToLower());
}
return SendAsync<TaskAgentMessage>( return SendAsync<TaskAgentMessage>(
httpMethod, httpMethod,
locationId, locationId,

View File

@@ -0,0 +1,38 @@
using System;
using System.Runtime.Serialization;
namespace GitHub.DistributedTask.WebApi
{
/// <summary>
/// Message that tells the runner to redirect itself to BrokerListener for messages.
/// (Note that we use a special Message instead of a simple 302. This is because
/// the runner will need to apply the runner's token to the request, and it is
/// a security best practice to *not* blindly add sensitive data to redirects
/// 302s.)
/// </summary>
[DataContract]
public class BrokerMigrationMessage
{
public static readonly string MessageType = "BrokerMigration";
public BrokerMigrationMessage()
{
}
public BrokerMigrationMessage(
Uri brokerUrl)
{
this.BrokerBaseUrl = brokerUrl;
}
/// <summary>
/// The base url for the broker listener
/// </summary>
[DataMember]
public Uri BrokerBaseUrl
{
get;
internal set;
}
}
}
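
A hedged sketch of how a message loop can follow this message (not the listener's actual code): the body is deserialized and the runner then polls the advertised broker base URL with its own credentials instead of following a redirect. JsonUtility.FromString is assumed to be the SDK JSON helper paired with the JsonUtility.ToString used in the tests further down; pollBroker is a hypothetical callback.

using System;
using System.Threading.Tasks;
using GitHub.DistributedTask.WebApi;
using GitHub.Services.WebApi; // assumption: JsonUtility lives here

static class MigrationSketch
{
    public static async Task<TaskAgentMessage> FollowAsync(
        TaskAgentMessage message,
        Func<Uri, Task<TaskAgentMessage>> pollBroker) // hypothetical broker poll
    {
        if (string.Equals(message.MessageType, BrokerMigrationMessage.MessageType, StringComparison.OrdinalIgnoreCase))
        {
            var migration = JsonUtility.FromString<BrokerMigrationMessage>(message.Body);
            // Re-point at the broker and fetch the real message from there.
            return await pollBroker(migration.BrokerBaseUrl);
        }
        return message;
    }
}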

View File

@@ -1,5 +1,6 @@
using Newtonsoft.Json;
using System;
using System.Collections.Generic;
using System.Runtime.Serialization;
@@ -15,35 +16,32 @@ namespace GitHub.DistributedTask.WebApi
{
}
public RunnerRefreshMessage(
    ulong runnerId,
    String targetVersion,
    int? timeoutInSeconds = null)
{
    this.RunnerId = runnerId;
    this.TimeoutInSeconds = timeoutInSeconds ?? TimeSpan.FromMinutes(60).Seconds;
    this.TargetVersion = targetVersion;
}
[DataMember]
public ulong RunnerId
{
    get;
    private set;
}
[DataMember]
public int TimeoutInSeconds
{
    get;
    private set;
}
[DataMember]
public String TargetVersion
{
    get;
    private set;
}
[DataMember(Name = "target_version")]
public String TargetVersion
{
    get;
    set;
}
[DataMember(Name = "download_url")]
public string DownloadUrl
{
    get;
    set;
}
[DataMember(Name = "sha256_checksum")]
public string SHA256Checksum
{
    get;
    set;
}
[DataMember(Name = "os")]
public string OS
{
    get;
    set;
}
}
}

View File

@@ -0,0 +1,10 @@
using System;
using System.Runtime.Serialization;
namespace GitHub.DistributedTask.WebApi
{
public sealed class TaskAgentMessageTypes
{
public static readonly string ForceTokenRefresh = "ForceTokenRefresh";
}
}
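
A sketch of how a listener might react to the new constant (illustrative only): ForceTokenRefresh is static readonly, so it cannot appear in a switch-case label, and refreshCredentials is a hypothetical stand-in for whatever credential refresh the listener actually performs.

using System;
using System.Threading;
using System.Threading.Tasks;
using GitHub.DistributedTask.WebApi;

static class TokenRefreshSketch
{
    public static Task HandleAsync(TaskAgentMessage message, Func<CancellationToken, Task> refreshCredentials, CancellationToken token)
    {
        if (string.Equals(message.MessageType, TaskAgentMessageTypes.ForceTokenRefresh, StringComparison.OrdinalIgnoreCase))
        {
            // Re-acquire the runner's OAuth token before continuing to poll for messages.
            return refreshCredentials(token);
        }
        return Task.CompletedTask;
    }
}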

View File

@@ -75,5 +75,12 @@ namespace GitHub.DistributedTask.WebApi
    get;
    set;
}
[DataMember(EmitDefaultValue = false, IsRequired = false)]
public BrokerMigrationMessage BrokerMigrationMessage
{
    get;
    set;
}
}
}
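
A sketch of what the new session property enables on the listener side (both callbacks are hypothetical stand-ins for the runner-server and broker session calls; only the BrokerMigrationMessage check mirrors the intended flow):

using System;
using System.Threading.Tasks;
using GitHub.DistributedTask.WebApi;

static class SessionMigrationSketch
{
    public static async Task<TaskAgentSession> CreateAsync(
        Func<TaskAgentSession, Task<TaskAgentSession>> createLegacySession,   // e.g. IRunnerServer.CreateAgentSessionAsync
        Func<Uri, TaskAgentSession, Task<TaskAgentSession>> createBrokerSession) // e.g. the broker client's CreateSessionAsync
    {
        var requested = new TaskAgentSession();
        var session = await createLegacySession(requested);
        if (session.BrokerMigrationMessage != null)
        {
            // The legacy service answered with a migration hint: re-create the session at the broker URL.
            return await createBrokerSession(session.BrokerMigrationMessage.BrokerBaseUrl, requested);
        }
        return session;
    }
}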

View File

@@ -101,6 +101,7 @@ namespace GitHub.DistributedTask.WebApi
public static readonly String FileAttachment = "DistributedTask.Core.FileAttachment";
public static readonly String DiagnosticLog = "DistributedTask.Core.DiagnosticLog";
public static readonly String ResultsLog = "Results.Core.Log";
public static readonly String ResultsDiagnosticLog = "Results.Core.DiagnosticLog";
}
[GenerateAllConstants]

View File

@@ -13,6 +13,7 @@
</PropertyGroup>
<ItemGroup>
    <PackageReference Include="Azure.Storage.Blobs" Version="12.19.1" />
    <PackageReference Include="Microsoft.Win32.Registry" Version="4.4.0" />
    <PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
    <PackageReference Include="Microsoft.AspNet.WebApi.Client" Version="5.2.9" />

View File

@@ -57,10 +57,12 @@ namespace GitHub.Actions.RunService.WebApi
}
public async Task<TaskAgentMessage> GetRunnerMessageAsync(
    Guid? sessionId,
    string runnerVersion,
    TaskAgentStatus? status,
    string os = null,
    string architecture = null,
    bool? disableUpdate = null,
    CancellationToken cancellationToken = default
)
{
@@ -68,6 +70,11 @@ namespace GitHub.Actions.RunService.WebApi
    List<KeyValuePair<string, string>> queryParams = new List<KeyValuePair<string, string>>();
    if (sessionId != null)
    {
        queryParams.Add("sessionId", sessionId.Value.ToString());
    }
    if (status != null)
    {
        queryParams.Add("status", status.Value.ToString());
@@ -87,6 +94,11 @@ namespace GitHub.Actions.RunService.WebApi
        queryParams.Add("architecture", architecture);
    }
    if (disableUpdate != null)
    {
        queryParams.Add("disableUpdate", disableUpdate.Value.ToString().ToLower());
    }
    var result = await SendAsync<TaskAgentMessage>(
        new HttpMethod("GET"),
        requestUri: requestUri,
@@ -105,5 +117,55 @@ namespace GitHub.Actions.RunService.WebApi
    throw new Exception($"Failed to get job message: {result.Error}");
}
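
A one-line aside on the ToLower() above (illustrative only, not part of the client):

using System;

class QueryValueDemo
{
    static void Main()
    {
        bool? disableUpdate = true;
        // bool.ToString() yields "True"/"False"; the query string wants lower case.
        Console.WriteLine(disableUpdate.Value.ToString().ToLower()); // prints: true
    }
}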
public async Task<TaskAgentSession> CreateSessionAsync(
TaskAgentSession session,
CancellationToken cancellationToken = default)
{
var requestUri = new Uri(Client.BaseAddress, "session");
var requestContent = new ObjectContent<TaskAgentSession>(session, new VssJsonMediaTypeFormatter(true));
var result = await SendAsync<TaskAgentSession>(
new HttpMethod("POST"),
requestUri: requestUri,
content: requestContent,
cancellationToken: cancellationToken);
if (result.IsSuccess)
{
return result.Value;
}
if (result.StatusCode == HttpStatusCode.Forbidden)
{
throw new AccessDeniedException(result.Error);
}
if (result.StatusCode == HttpStatusCode.Conflict)
{
throw new TaskAgentSessionConflictException(result.Error);
}
throw new Exception($"Failed to create broker session: {result.Error}");
}
public async Task DeleteSessionAsync(
CancellationToken cancellationToken = default)
{
var requestUri = new Uri(Client.BaseAddress, $"session");
var result = await SendAsync<object>(
new HttpMethod("DELETE"),
requestUri: requestUri,
cancellationToken: cancellationToken);
if (result.IsSuccess)
{
return;
}
throw new Exception($"Failed to delete broker session: {result.Error}");
}
} }
} }
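
Taken together, the session APIs above imply a create, poll, delete lifecycle against the broker. A hedged usage sketch, assuming the class this file defines is named BrokerHttpClient (the compare view does not show the type name) and using placeholder argument values:

using System.Threading;
using System.Threading.Tasks;
using GitHub.Actions.RunService.WebApi;
using GitHub.DistributedTask.WebApi;

static class BrokerSessionSketch
{
    public static async Task PollOnceAsync(BrokerHttpClient brokerClient, CancellationToken cancellationToken)
    {
        var session = await brokerClient.CreateSessionAsync(new TaskAgentSession(), cancellationToken);
        try
        {
            var message = await brokerClient.GetRunnerMessageAsync(
                session.SessionId,
                runnerVersion: "2.313.0",   // placeholder values
                status: TaskAgentStatus.Online,
                os: "Linux",
                architecture: "X64",
                disableUpdate: false,
                cancellationToken: cancellationToken);
            // ... hand the message to the dispatcher ...
        }
        finally
        {
            await brokerClient.DeleteSessionAsync(cancellationToken);
        }
    }
}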

View File

@@ -89,6 +89,26 @@ namespace GitHub.Services.Results.Contracts
public long SoftSizeLimit; public long SoftSizeLimit;
} }
[DataContract]
[JsonObject(NamingStrategyType = typeof(SnakeCaseNamingStrategy))]
public class GetSignedDiagnosticLogsURLRequest
{
[DataMember]
public string WorkflowJobRunBackendId;
[DataMember]
public string WorkflowRunBackendId;
}
[DataContract]
[JsonObject(NamingStrategyType = typeof(SnakeCaseNamingStrategy))]
public class GetSignedDiagnosticLogsURLResponse
{
[DataMember]
public string DiagLogsURL;
[DataMember]
public string BlobStorageType;
}
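
Because of SnakeCaseNamingStrategy, these contracts go over the wire with snake_cased member names. A quick, illustrative check (values are placeholders, and the exact member order in the output may differ):

using Newtonsoft.Json;
using GitHub.Services.Results.Contracts;

class WirePayloadDemo
{
    static void Main()
    {
        var request = new GetSignedDiagnosticLogsURLRequest
        {
            WorkflowJobRunBackendId = "job-backend-id",
            WorkflowRunBackendId = "run-backend-id",
        };
        System.Console.WriteLine(JsonConvert.SerializeObject(request));
        // Expected (roughly): {"workflow_job_run_backend_id":"job-backend-id","workflow_run_backend_id":"run-backend-id"}
    }
}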
[DataContract] [DataContract]
[JsonObject(NamingStrategyType = typeof(SnakeCaseNamingStrategy))] [JsonObject(NamingStrategyType = typeof(SnakeCaseNamingStrategy))]
public class JobLogsMetadataCreate public class JobLogsMetadataCreate

View File

@@ -1,6 +1,5 @@
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Net.Http;
@@ -8,8 +7,11 @@ using System.Net.Http.Headers;
using System.Threading;
using System.Threading.Tasks;
using System.Net.Http.Formatting;
using Azure;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Azure.Storage.Blobs.Specialized;
using GitHub.DistributedTask.WebApi;
using GitHub.Services.Common;
using GitHub.Services.Results.Contracts;
using Sdk.WebApi.WebApi;
@@ -21,13 +23,15 @@ namespace GitHub.Services.Results.Client
Uri baseUrl,
HttpMessageHandler pipeline,
string token,
bool disposeHandler)
bool disposeHandler,
bool useSdk)
: base(baseUrl, pipeline, disposeHandler)
{
    m_token = token;
    m_resultsServiceUrl = baseUrl;
    m_formatter = new JsonMediaTypeFormatter();
    m_changeIdCounter = 1;
    m_useSdk = useSdk;
}
// Get Sas URL calls // Get Sas URL calls
@@ -77,6 +81,19 @@ namespace GitHub.Services.Results.Client
return await GetResultsSignedURLResponse<GetSignedStepLogsURLRequest, GetSignedStepLogsURLResponse>(getStepLogsSignedBlobURLEndpoint, cancellationToken, request); return await GetResultsSignedURLResponse<GetSignedStepLogsURLRequest, GetSignedStepLogsURLResponse>(getStepLogsSignedBlobURLEndpoint, cancellationToken, request);
} }
private async Task<GetSignedDiagnosticLogsURLResponse> GetDiagnosticLogsUploadUrlAsync(string planId, string jobId, CancellationToken cancellationToken)
{
var request = new GetSignedDiagnosticLogsURLRequest()
{
WorkflowJobRunBackendId = jobId,
WorkflowRunBackendId = planId,
};
var getDiagnosticLogsSignedBlobURLEndpoint = new Uri(m_resultsServiceUrl, Constants.GetJobDiagLogsSignedBlobURL);
return await GetResultsSignedURLResponse<GetSignedDiagnosticLogsURLRequest, GetSignedDiagnosticLogsURLResponse>(getDiagnosticLogsSignedBlobURLEndpoint, cancellationToken, request);
}
private async Task<GetSignedJobLogsURLResponse> GetJobLogUploadUrlAsync(string planId, string jobId, CancellationToken cancellationToken) private async Task<GetSignedJobLogsURLResponse> GetJobLogUploadUrlAsync(string planId, string jobId, CancellationToken cancellationToken)
{ {
var request = new GetSignedJobLogsURLRequest() var request = new GetSignedJobLogsURLRequest()
@@ -91,7 +108,6 @@ namespace GitHub.Services.Results.Client
} }
// Create metadata calls // Create metadata calls
private async Task SendRequest<R>(Uri uri, CancellationToken cancellationToken, R request, string timestamp) private async Task SendRequest<R>(Uri uri, CancellationToken cancellationToken, R request, string timestamp)
{ {
using (HttpRequestMessage requestMessage = new HttpRequestMessage(HttpMethod.Post, uri)) using (HttpRequestMessage requestMessage = new HttpRequestMessage(HttpMethod.Post, uri))
@@ -161,73 +177,219 @@ namespace GitHub.Services.Results.Client
await SendRequest<JobLogsMetadataCreate>(createJobLogsMetadataEndpoint, cancellationToken, request, timestamp); await SendRequest<JobLogsMetadataCreate>(createJobLogsMetadataEndpoint, cancellationToken, request, timestamp);
} }
private async Task<HttpResponseMessage> UploadBlockFileAsync(string url, string blobStorageType, FileStream file, CancellationToken cancellationToken)
{
    // Upload the file to the url
    var request = new HttpRequestMessage(HttpMethod.Put, url)
    {
        Content = new StreamContent(file)
    };
    if (blobStorageType == BlobStorageTypes.AzureBlobStorage)
    {
        request.Content.Headers.Add(Constants.AzureBlobTypeHeader, Constants.AzureBlockBlob);
    }
    using (var response = await SendAsync(request, HttpCompletionOption.ResponseHeadersRead, userState: null, cancellationToken))
    {
        if (!response.IsSuccessStatusCode)
        {
            throw new Exception($"Failed to upload file, status code: {response.StatusCode}, reason: {response.ReasonPhrase}");
        }
        return response;
    }
}
private (Uri path, string sas) ParseSasToken(string url)
{
    if (String.IsNullOrEmpty(url))
    {
        throw new Exception($"SAS url is empty");
    }
    var blobUri = new UriBuilder(url);
    var sasUrl = blobUri.Query.Substring(1); //remove starting "?"
    blobUri.Query = null; // remove query params
    return (blobUri.Uri, sasUrl);
}
private BlobClient GetBlobClient(string url)
{
    var blobUri = ParseSasToken(url);
    var opts = new BlobClientOptions
    {
        Retry =
        {
            MaxRetries = Constants.DefaultBlobUploadRetries,
            NetworkTimeout = TimeSpan.FromSeconds(Constants.DefaultNetworkTimeoutInSeconds)
        }
    };
    return new BlobClient(blobUri.path, new AzureSasCredential(blobUri.sas), opts);
}
private AppendBlobClient GetAppendBlobClient(string url)
{
    var blobUri = ParseSasToken(url);
    var opts = new BlobClientOptions
    {
        Retry =
        {
            MaxRetries = Constants.DefaultBlobUploadRetries,
            NetworkTimeout = TimeSpan.FromSeconds(Constants.DefaultNetworkTimeoutInSeconds)
        }
    };
    return new AppendBlobClient(blobUri.path, new AzureSasCredential(blobUri.sas), opts);
}
private async Task UploadBlockFileAsync(string url, string blobStorageType, FileStream file, CancellationToken cancellationToken, Dictionary<string, string> customHeaders = null)
{
    if (m_useSdk && blobStorageType == BlobStorageTypes.AzureBlobStorage)
    {
        var blobClient = GetBlobClient(url);
        var httpHeaders = new BlobHttpHeaders();
        if (customHeaders != null)
        {
            foreach (var header in customHeaders)
            {
                switch (header.Key)
                {
                    case Constants.ContentTypeHeader:
                        httpHeaders.ContentType = header.Value;
                        break;
                }
            }
        }
        try
        {
            await blobClient.UploadAsync(file, new BlobUploadOptions()
            {
                HttpHeaders = httpHeaders,
                Conditions = new BlobRequestConditions
                {
                    IfNoneMatch = new ETag("*")
                }
            }, cancellationToken);
        }
        catch (RequestFailedException e)
        {
            throw new Exception($"Failed to upload block to Azure blob: {e.Message}");
        }
    }
    else
    {
        // Upload the file to the url
        var request = new HttpRequestMessage(HttpMethod.Put, url)
        {
            Content = new StreamContent(file)
        };
        if (blobStorageType == BlobStorageTypes.AzureBlobStorage)
        {
            request.Content.Headers.Add(Constants.AzureBlobTypeHeader, Constants.AzureBlockBlob);
        }
        if (customHeaders != null)
        {
            foreach (var header in customHeaders)
            {
                request.Content.Headers.Add(header.Key, header.Value);
            }
        };
        using (var response = await SendAsync(request, HttpCompletionOption.ResponseHeadersRead, userState: null, cancellationToken))
        {
            if (!response.IsSuccessStatusCode)
            {
                throw new Exception($"Failed to upload file, status code: {response.StatusCode}, reason: {response.ReasonPhrase}");
            }
        }
    }
}
private async Task<HttpResponseMessage> CreateAppendFileAsync(string url, string blobStorageType, CancellationToken cancellationToken)
{
    var request = new HttpRequestMessage(HttpMethod.Put, url)
    {
        Content = new StringContent("")
    };
    if (blobStorageType == BlobStorageTypes.AzureBlobStorage)
    {
        request.Content.Headers.Add(Constants.AzureBlobTypeHeader, Constants.AzureAppendBlob);
        request.Content.Headers.Add("Content-Length", "0");
    }
    using (var response = await SendAsync(request, HttpCompletionOption.ResponseHeadersRead, userState: null, cancellationToken))
    {
        if (!response.IsSuccessStatusCode)
        {
            throw new Exception($"Failed to create append file, status code: {response.StatusCode}, reason: {response.ReasonPhrase}");
        }
        return response;
    }
}
private async Task CreateAppendFileAsync(string url, string blobStorageType, CancellationToken cancellationToken, Dictionary<string, string> customHeaders = null)
{
    if (m_useSdk && blobStorageType == BlobStorageTypes.AzureBlobStorage)
    {
        var appendBlobClient = GetAppendBlobClient(url);
        var httpHeaders = new BlobHttpHeaders();
        if (customHeaders != null)
        {
            foreach (var header in customHeaders)
            {
                switch (header.Key)
                {
                    case Constants.ContentTypeHeader:
                        httpHeaders.ContentType = header.Value;
                        break;
                }
            }
        }
        try
        {
            await appendBlobClient.CreateAsync(new AppendBlobCreateOptions()
            {
                HttpHeaders = httpHeaders,
                Conditions = new AppendBlobRequestConditions
                {
                    IfNoneMatch = new ETag("*")
                }
            }, cancellationToken: cancellationToken);
        }
        catch (RequestFailedException e)
        {
            throw new Exception($"Failed to create append blob in Azure blob: {e.Message}");
        }
    }
    else
    {
        var request = new HttpRequestMessage(HttpMethod.Put, url)
        {
            Content = new StringContent("")
        };
        if (blobStorageType == BlobStorageTypes.AzureBlobStorage)
        {
            request.Content.Headers.Add(Constants.AzureBlobTypeHeader, Constants.AzureAppendBlob);
            request.Content.Headers.Add("Content-Length", "0");
        }
        if (customHeaders != null)
        {
            foreach (var header in customHeaders)
            {
                request.Content.Headers.Add(header.Key, header.Value);
            }
        };
        using (var response = await SendAsync(request, HttpCompletionOption.ResponseHeadersRead, userState: null, cancellationToken))
        {
            if (!response.IsSuccessStatusCode)
            {
                throw new Exception($"Failed to create append file, status code: {response.StatusCode}, reason: {response.ReasonPhrase}");
            }
        }
    }
}
private async Task<HttpResponseMessage> UploadAppendFileAsync(string url, string blobStorageType, FileStream file, bool finalize, long fileSize, CancellationToken cancellationToken)
{
    var comp = finalize ? "&comp=appendblock&seal=true" : "&comp=appendblock";
    // Upload the file to the url
    var request = new HttpRequestMessage(HttpMethod.Put, url + comp)
    {
        Content = new StreamContent(file)
    };
    if (blobStorageType == BlobStorageTypes.AzureBlobStorage)
    {
        request.Content.Headers.Add("Content-Length", fileSize.ToString());
        request.Content.Headers.Add(Constants.AzureBlobSealedHeader, finalize.ToString());
    }
    using (var response = await SendAsync(request, HttpCompletionOption.ResponseHeadersRead, userState: null, cancellationToken))
    {
        if (!response.IsSuccessStatusCode)
        {
            throw new Exception($"Failed to upload append file, status code: {response.StatusCode}, reason: {response.ReasonPhrase}, object: {response}, fileSize: {fileSize}");
        }
        return response;
    }
}
private async Task UploadAppendFileAsync(string url, string blobStorageType, FileStream file, bool finalize, long fileSize, CancellationToken cancellationToken)
{
    if (m_useSdk && blobStorageType == BlobStorageTypes.AzureBlobStorage)
    {
        var appendBlobClient = GetAppendBlobClient(url);
        try
        {
            await appendBlobClient.AppendBlockAsync(file, cancellationToken: cancellationToken);
            if (finalize)
            {
                await appendBlobClient.SealAsync(cancellationToken: cancellationToken);
            }
        }
        catch (RequestFailedException e)
        {
            throw new Exception($"Failed to upload append block in Azure blob: {e.Message}");
        }
    }
    else
    {
        var comp = finalize ? "&comp=appendblock&seal=true" : "&comp=appendblock";
        // Upload the file to the url
        var request = new HttpRequestMessage(HttpMethod.Put, url + comp)
        {
            Content = new StreamContent(file)
        };
        if (blobStorageType == BlobStorageTypes.AzureBlobStorage)
        {
            request.Content.Headers.Add("Content-Length", fileSize.ToString());
            request.Content.Headers.Add(Constants.AzureBlobSealedHeader, finalize.ToString());
        }
        using (var response = await SendAsync(request, HttpCompletionOption.ResponseHeadersRead, userState: null, cancellationToken))
        {
            if (!response.IsSuccessStatusCode)
            {
                throw new Exception($"Failed to upload append file, status code: {response.StatusCode}, reason: {response.ReasonPhrase}, object: {response}, fileSize: {fileSize}");
            }
        }
    }
}
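
The SAS handling above splits the signed URL into a bare blob URI plus the SAS query string and hands them to the Azure SDK. A standalone illustration with a made-up URL:

using System;
using Azure;
using Azure.Storage.Blobs;

class SasSplitDemo
{
    static void Main()
    {
        // Sample signed URL (made up) of the kind the Results service hands back.
        var signed = "https://account.blob.core.windows.net/container/steplog.txt?sv=2021-08-06&se=2024-02-08T00%3A00%3A00Z&sig=abc123";
        var builder = new UriBuilder(signed);
        var sas = builder.Query.TrimStart('?'); // "sv=...&se=...&sig=abc123"
        builder.Query = null;                   // strip the query, keeping only the blob URI
        var blob = new BlobClient(builder.Uri, new AzureSasCredential(sas));
        Console.WriteLine(blob.Uri);            // https://account.blob.core.windows.net/container/steplog.txt
    }
}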
@@ -251,23 +413,22 @@ namespace GitHub.Services.Results.Client
// Upload the file
using (var fileStream = new FileStream(file, FileMode.Open, FileAccess.Read, FileShare.Read, 4096, true))
{
    var response = await UploadBlockFileAsync(uploadUrlResponse.SummaryUrl, uploadUrlResponse.BlobStorageType, fileStream, cancellationToken);
    await UploadBlockFileAsync(uploadUrlResponse.SummaryUrl, uploadUrlResponse.BlobStorageType, fileStream, cancellationToken);
}
// Send step summary upload complete message
await StepSummaryUploadCompleteAsync(planId, jobId, stepId, fileSize, cancellationToken);
}
private async Task<HttpResponseMessage> UploadLogFile(string file, bool finalize, bool firstBlock, string sasUrl, string blobStorageType,
    CancellationToken cancellationToken)
private async Task UploadLogFile(string file, bool finalize, bool firstBlock, string sasUrl, string blobStorageType,
    CancellationToken cancellationToken, Dictionary<string, string> customHeaders = null)
{
    HttpResponseMessage response;
    if (firstBlock && finalize)
    {
        // This is the one and only block, just use a block blob
        using (var fileStream = new FileStream(file, FileMode.Open, FileAccess.Read, FileShare.Read, 4096, true))
        {
            response = await UploadBlockFileAsync(sasUrl, blobStorageType, fileStream, cancellationToken);
            await UploadBlockFileAsync(sasUrl, blobStorageType, fileStream, cancellationToken, customHeaders);
        }
    }
    else
@@ -276,18 +437,16 @@ namespace GitHub.Services.Results.Client
        // Create the Append blob
        if (firstBlock)
        {
            await CreateAppendFileAsync(sasUrl, blobStorageType, cancellationToken);
            await CreateAppendFileAsync(sasUrl, blobStorageType, cancellationToken, customHeaders);
        }
        // Upload content
        var fileSize = new FileInfo(file).Length;
        using (var fileStream = new FileStream(file, FileMode.Open, FileAccess.Read, FileShare.Read, 4096, true))
        {
            response = await UploadAppendFileAsync(sasUrl, blobStorageType, fileStream, finalize, fileSize, cancellationToken);
            await UploadAppendFileAsync(sasUrl, blobStorageType, fileStream, finalize, fileSize, cancellationToken);
        }
    }
    return response;
}
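
The firstBlock and finalize flags above pick between a one-shot block blob and an append-blob sequence. A compact restatement of that decision (sketch only, not the client's code):

static class LogUploadPlan
{
    public static string Describe(bool firstBlock, bool finalize)
    {
        if (firstBlock && finalize)
        {
            return "single block blob PUT";                 // one-and-only chunk
        }
        if (firstBlock)
        {
            return "create append blob, then append block"; // first of several chunks
        }
        return finalize
            ? "append final block, then seal"               // last chunk
            : "append block";                               // middle chunk
    }
}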
// Handle file upload for step log // Handle file upload for step log
@@ -300,7 +459,12 @@ namespace GitHub.Services.Results.Client
throw new Exception("Failed to get step log upload url"); throw new Exception("Failed to get step log upload url");
} }
await UploadLogFile(file, finalize, firstBlock, uploadUrlResponse.LogsUrl, uploadUrlResponse.BlobStorageType, cancellationToken);
var customHeaders = new Dictionary<string, string>
{
{ Constants.ContentTypeHeader, Constants.TextPlainContentType }
};
await UploadLogFile(file, finalize, firstBlock, uploadUrlResponse.LogsUrl, uploadUrlResponse.BlobStorageType, cancellationToken, customHeaders);
// Update metadata // Update metadata
if (finalize) if (finalize)
@@ -320,7 +484,12 @@ namespace GitHub.Services.Results.Client
throw new Exception("Failed to get job log upload url"); throw new Exception("Failed to get job log upload url");
} }
await UploadLogFile(file, finalize, firstBlock, uploadUrlResponse.LogsUrl, uploadUrlResponse.BlobStorageType, cancellationToken);
var customHeaders = new Dictionary<string, string>
{
{ Constants.ContentTypeHeader, Constants.TextPlainContentType }
};
await UploadLogFile(file, finalize, firstBlock, uploadUrlResponse.LogsUrl, uploadUrlResponse.BlobStorageType, cancellationToken, customHeaders);
// Update metadata // Update metadata
if (finalize) if (finalize)
@@ -330,6 +499,18 @@ namespace GitHub.Services.Results.Client
} }
} }
public async Task UploadResultsDiagnosticLogsAsync(string planId, string jobId, string file, CancellationToken cancellationToken)
{
// Get the upload url
var uploadUrlResponse = await GetDiagnosticLogsUploadUrlAsync(planId, jobId, cancellationToken);
if (uploadUrlResponse == null || uploadUrlResponse.DiagLogsURL == null)
{
throw new Exception("Failed to get diagnostic logs upload url");
}
await UploadLogFile(file, true, true, uploadUrlResponse.DiagLogsURL, uploadUrlResponse.BlobStorageType, cancellationToken);
}
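
A hypothetical call site for the new method (the type name ResultsHttpClient is assumed from the namespace, and the ids and file path are placeholders the worker would supply when it finishes a job):

using System.Threading;
using System.Threading.Tasks;
using GitHub.Services.Results.Client;

static class DiagLogUploadSketch
{
    public static Task UploadAsync(ResultsHttpClient results, string workflowRunBackendId, string workflowJobRunBackendId, string diagLogFile, CancellationToken token)
    {
        // First argument maps to WorkflowRunBackendId, second to WorkflowJobRunBackendId (see the request contract above).
        return results.UploadResultsDiagnosticLogsAsync(workflowRunBackendId, workflowJobRunBackendId, diagLogFile, token);
    }
}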
private Step ConvertTimelineRecordToStep(TimelineRecord r) private Step ConvertTimelineRecordToStep(TimelineRecord r)
{ {
return new Step() return new Step()
@@ -405,6 +586,7 @@ namespace GitHub.Services.Results.Client
private Uri m_resultsServiceUrl;
private string m_token;
private int m_changeIdCounter;
private bool m_useSdk;
}
// Constants specific to results
@@ -419,13 +601,20 @@ namespace GitHub.Services.Results.Client
public static readonly string CreateStepLogsMetadata = ResultsReceiverTwirpEndpoint + "CreateStepLogsMetadata";
public static readonly string GetJobLogsSignedBlobURL = ResultsReceiverTwirpEndpoint + "GetJobLogsSignedBlobURL";
public static readonly string CreateJobLogsMetadata = ResultsReceiverTwirpEndpoint + "CreateJobLogsMetadata";
public static readonly string GetJobDiagLogsSignedBlobURL = ResultsReceiverTwirpEndpoint + "GetJobDiagLogsSignedBlobURL";
public static readonly string ResultsProtoApiV1Endpoint = "twirp/github.actions.results.api.v1.WorkflowStepUpdateService/";
public static readonly string WorkflowStepsUpdate = ResultsProtoApiV1Endpoint + "WorkflowStepsUpdate";
public static readonly int DefaultNetworkTimeoutInSeconds = 30;
public static readonly int DefaultBlobUploadRetries = 3;
public static readonly string AzureBlobSealedHeader = "x-ms-blob-sealed";
public static readonly string AzureBlobTypeHeader = "x-ms-blob-type";
public static readonly string AzureBlockBlob = "BlockBlob";
public static readonly string AzureAppendBlob = "AppendBlob";
public const string ContentTypeHeader = "Content-Type";
public const string TextPlainContentType = "text/plain";
}
}

View File

@@ -0,0 +1,81 @@
using System;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Tasks;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Listener;
using GitHub.Runner.Listener.Configuration;
using GitHub.Services.Common;
using Moq;
using Xunit;
namespace GitHub.Runner.Common.Tests.Listener
{
public sealed class BrokerMessageListenerL0
{
private readonly RunnerSettings _settings;
private readonly Mock<IConfigurationManager> _config;
private readonly Mock<IBrokerServer> _brokerServer;
private readonly Mock<ICredentialManager> _credMgr;
private Mock<IConfigurationStore> _store;
public BrokerMessageListenerL0()
{
_settings = new RunnerSettings { AgentId = 1, AgentName = "myagent", PoolId = 123, PoolName = "default", ServerUrl = "http://myserver", WorkFolder = "_work", ServerUrlV2 = "http://myserverv2" };
_config = new Mock<IConfigurationManager>();
_config.Setup(x => x.LoadSettings()).Returns(_settings);
_credMgr = new Mock<ICredentialManager>();
_store = new Mock<IConfigurationStore>();
_brokerServer = new Mock<IBrokerServer>();
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void CreatesSession()
{
using (TestHostContext tc = CreateTestContext())
using (var tokenSource = new CancellationTokenSource())
{
Tracing trace = tc.GetTrace();
// Arrange.
var expectedSession = new TaskAgentSession();
_brokerServer
.Setup(x => x.CreateSessionAsync(
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token))
.Returns(Task.FromResult(expectedSession));
_credMgr.Setup(x => x.LoadCredentials()).Returns(new VssCredentials());
_store.Setup(x => x.GetCredentials()).Returns(new CredentialData() { Scheme = Constants.Configuration.OAuthAccessToken });
_store.Setup(x => x.GetMigratedCredentials()).Returns(default(CredentialData));
// Act.
BrokerMessageListener listener = new();
listener.Initialize(tc);
bool result = await listener.CreateSessionAsync(tokenSource.Token);
trace.Info("result: {0}", result);
// Assert.
Assert.True(result);
_brokerServer
.Verify(x => x.CreateSessionAsync(
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token), Times.Once());
}
}
private TestHostContext CreateTestContext([CallerMemberName] String testName = "")
{
TestHostContext tc = new(this, testName);
tc.SetSingleton<IConfigurationManager>(_config.Object);
tc.SetSingleton<ICredentialManager>(_credMgr.Object);
tc.SetSingleton<IConfigurationStore>(_store.Object);
tc.SetSingleton<IBrokerServer>(_brokerServer.Object);
return tc;
}
}
}

View File

@@ -24,6 +24,8 @@ namespace GitHub.Runner.Common.Tests.Listener
private Mock<ICredentialManager> _credMgr; private Mock<ICredentialManager> _credMgr;
private Mock<IConfigurationStore> _store; private Mock<IConfigurationStore> _store;
private Mock<IBrokerServer> _brokerServer;
public MessageListenerL0() public MessageListenerL0()
{ {
_settings = new RunnerSettings { AgentId = 1, AgentName = "myagent", PoolId = 123, PoolName = "default", ServerUrl = "http://myserver", WorkFolder = "_work" }; _settings = new RunnerSettings { AgentId = 1, AgentName = "myagent", PoolId = 123, PoolName = "default", ServerUrl = "http://myserver", WorkFolder = "_work" };
@@ -32,6 +34,7 @@ namespace GitHub.Runner.Common.Tests.Listener
_runnerServer = new Mock<IRunnerServer>(); _runnerServer = new Mock<IRunnerServer>();
_credMgr = new Mock<ICredentialManager>(); _credMgr = new Mock<ICredentialManager>();
_store = new Mock<IConfigurationStore>(); _store = new Mock<IConfigurationStore>();
_brokerServer = new Mock<IBrokerServer>();
} }
private TestHostContext CreateTestContext([CallerMemberName] String testName = "") private TestHostContext CreateTestContext([CallerMemberName] String testName = "")
@@ -41,6 +44,7 @@ namespace GitHub.Runner.Common.Tests.Listener
tc.SetSingleton<IRunnerServer>(_runnerServer.Object); tc.SetSingleton<IRunnerServer>(_runnerServer.Object);
tc.SetSingleton<ICredentialManager>(_credMgr.Object); tc.SetSingleton<ICredentialManager>(_credMgr.Object);
tc.SetSingleton<IConfigurationStore>(_store.Object); tc.SetSingleton<IConfigurationStore>(_store.Object);
tc.SetSingleton<IBrokerServer>(_brokerServer.Object);
return tc; return tc;
} }
@@ -81,6 +85,72 @@ namespace GitHub.Runner.Common.Tests.Listener
_settings.PoolId, _settings.PoolId,
It.Is<TaskAgentSession>(y => y != null), It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token), Times.Once()); tokenSource.Token), Times.Once());
_brokerServer
.Verify(x => x.CreateSessionAsync(
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token), Times.Never());
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void CreatesSessionWithBrokerMigration()
{
using (TestHostContext tc = CreateTestContext())
using (var tokenSource = new CancellationTokenSource())
{
Tracing trace = tc.GetTrace();
// Arrange.
var expectedSession = new TaskAgentSession()
{
OwnerName = "legacy",
BrokerMigrationMessage = new BrokerMigrationMessage(new Uri("https://broker.actions.github.com"))
};
var expectedBrokerSession = new TaskAgentSession()
{
OwnerName = "broker"
};
_runnerServer
.Setup(x => x.CreateAgentSessionAsync(
_settings.PoolId,
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token))
.Returns(Task.FromResult(expectedSession));
_brokerServer
.Setup(x => x.CreateSessionAsync(
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token))
.Returns(Task.FromResult(expectedBrokerSession));
_credMgr.Setup(x => x.LoadCredentials()).Returns(new VssCredentials());
_store.Setup(x => x.GetCredentials()).Returns(new CredentialData() { Scheme = Constants.Configuration.OAuthAccessToken });
_store.Setup(x => x.GetMigratedCredentials()).Returns(default(CredentialData));
// Act.
MessageListener listener = new();
listener.Initialize(tc);
bool result = await listener.CreateSessionAsync(tokenSource.Token);
trace.Info("result: {0}", result);
// Assert.
Assert.True(result);
_runnerServer
.Verify(x => x.CreateAgentSessionAsync(
_settings.PoolId,
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token), Times.Once());
_brokerServer
.Verify(x => x.CreateSessionAsync(
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token), Times.Once());
} }
} }
@@ -131,6 +201,83 @@ namespace GitHub.Runner.Common.Tests.Listener
} }
} }
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void DeleteSessionWithBrokerMigration()
{
using (TestHostContext tc = CreateTestContext())
using (var tokenSource = new CancellationTokenSource())
{
Tracing trace = tc.GetTrace();
// Arrange.
var expectedSession = new TaskAgentSession()
{
OwnerName = "legacy",
BrokerMigrationMessage = new BrokerMigrationMessage(new Uri("https://broker.actions.github.com"))
};
var expectedBrokerSession = new TaskAgentSession()
{
SessionId = Guid.NewGuid(),
OwnerName = "broker"
};
_runnerServer
.Setup(x => x.CreateAgentSessionAsync(
_settings.PoolId,
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token))
.Returns(Task.FromResult(expectedSession));
_brokerServer
.Setup(x => x.CreateSessionAsync(
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token))
.Returns(Task.FromResult(expectedBrokerSession));
_credMgr.Setup(x => x.LoadCredentials()).Returns(new VssCredentials());
_store.Setup(x => x.GetCredentials()).Returns(new CredentialData() { Scheme = Constants.Configuration.OAuthAccessToken });
_store.Setup(x => x.GetMigratedCredentials()).Returns(default(CredentialData));
// Act.
MessageListener listener = new();
listener.Initialize(tc);
bool result = await listener.CreateSessionAsync(tokenSource.Token);
trace.Info("result: {0}", result);
Assert.True(result);
_runnerServer
.Verify(x => x.CreateAgentSessionAsync(
_settings.PoolId,
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token), Times.Once());
_brokerServer
.Verify(x => x.CreateSessionAsync(
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token), Times.Once());
_brokerServer
.Setup(x => x.DeleteSessionAsync(It.IsAny<CancellationToken>()))
.Returns(Task.CompletedTask);
// Act.
await listener.DeleteSessionAsync();
//Assert
_runnerServer
.Verify(x => x.DeleteAgentSessionAsync(
_settings.PoolId, expectedSession.SessionId, It.IsAny<CancellationToken>()), Times.Never());
_brokerServer
.Verify(x => x.DeleteSessionAsync(It.IsAny<CancellationToken>()), Times.Once());
}
}
[Fact] [Fact]
[Trait("Level", "L0")] [Trait("Level", "L0")]
[Trait("Category", "Runner")] [Trait("Category", "Runner")]
@@ -192,8 +339,8 @@ namespace GitHub.Runner.Common.Tests.Listener
_runnerServer _runnerServer
.Setup(x => x.GetAgentMessageAsync( .Setup(x => x.GetAgentMessageAsync(
    _settings.PoolId, expectedSession.SessionId, It.IsAny<long?>(), TaskAgentStatus.Online, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<CancellationToken>()))
    _settings.PoolId, expectedSession.SessionId, It.IsAny<long?>(), TaskAgentStatus.Online, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<bool>(), It.IsAny<CancellationToken>()))
.Returns(async (Int32 poolId, Guid sessionId, Int64? lastMessageId, TaskAgentStatus status, string runnerVersion, string os, string architecture, CancellationToken cancellationToken) =>
.Returns(async (Int32 poolId, Guid sessionId, Int64? lastMessageId, TaskAgentStatus status, string runnerVersion, string os, string architecture, bool disableUpdate, CancellationToken cancellationToken) =>
{ {
await Task.Yield(); await Task.Yield();
return messages.Dequeue(); return messages.Dequeue();
@@ -208,7 +355,113 @@ namespace GitHub.Runner.Common.Tests.Listener
//Assert //Assert
_runnerServer _runnerServer
.Verify(x => x.GetAgentMessageAsync( .Verify(x => x.GetAgentMessageAsync(
    _settings.PoolId, expectedSession.SessionId, It.IsAny<long?>(), TaskAgentStatus.Online, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<CancellationToken>()), Times.Exactly(arMessages.Length));
    _settings.PoolId, expectedSession.SessionId, It.IsAny<long?>(), TaskAgentStatus.Online, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<bool>(), It.IsAny<CancellationToken>()), Times.Exactly(arMessages.Length));
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void GetNextMessageWithBrokerMigration()
{
using (TestHostContext tc = CreateTestContext())
using (var tokenSource = new CancellationTokenSource())
{
Tracing trace = tc.GetTrace();
// Arrange.
var expectedSession = new TaskAgentSession();
PropertyInfo sessionIdProperty = expectedSession.GetType().GetProperty("SessionId", BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.Public);
Assert.NotNull(sessionIdProperty);
sessionIdProperty.SetValue(expectedSession, Guid.NewGuid());
_runnerServer
.Setup(x => x.CreateAgentSessionAsync(
_settings.PoolId,
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token))
.Returns(Task.FromResult(expectedSession));
_credMgr.Setup(x => x.LoadCredentials()).Returns(new VssCredentials());
_store.Setup(x => x.GetCredentials()).Returns(new CredentialData() { Scheme = Constants.Configuration.OAuthAccessToken });
_store.Setup(x => x.GetMigratedCredentials()).Returns(default(CredentialData));
// Act.
MessageListener listener = new();
listener.Initialize(tc);
bool result = await listener.CreateSessionAsync(tokenSource.Token);
Assert.True(result);
var brokerMigrationMesage = new BrokerMigrationMessage(new Uri("https://actions.broker.com"));
var arMessages = new TaskAgentMessage[]
{
new TaskAgentMessage
{
Body = JsonUtility.ToString(brokerMigrationMesage),
MessageType = BrokerMigrationMessage.MessageType
},
};
var brokerMessages = new TaskAgentMessage[]
{
new TaskAgentMessage
{
Body = "somebody1",
MessageId = 4234,
MessageType = JobRequestMessageTypes.PipelineAgentJobRequest
},
new TaskAgentMessage
{
Body = "somebody2",
MessageId = 4235,
MessageType = JobCancelMessage.MessageType
},
null, //should be skipped by GetNextMessageAsync implementation
null,
new TaskAgentMessage
{
Body = "somebody3",
MessageId = 4236,
MessageType = JobRequestMessageTypes.PipelineAgentJobRequest
}
};
var brokerMessageQueue = new Queue<TaskAgentMessage>(brokerMessages);
_runnerServer
.Setup(x => x.GetAgentMessageAsync(
_settings.PoolId, expectedSession.SessionId, It.IsAny<long?>(), TaskAgentStatus.Online, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<bool>(), It.IsAny<CancellationToken>()))
.Returns(async (Int32 poolId, Guid sessionId, Int64? lastMessageId, TaskAgentStatus status, string runnerVersion, string os, string architecture, bool disableUpdate, CancellationToken cancellationToken) =>
{
await Task.Yield();
return arMessages[0]; // always send migration message
});
_brokerServer
.Setup(x => x.GetRunnerMessageAsync(
expectedSession.SessionId, TaskAgentStatus.Online, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<bool>(), It.IsAny<CancellationToken>()))
.Returns(async (Guid sessionId, TaskAgentStatus status, string runnerVersion, string os, string architecture, bool disableUpdate, CancellationToken cancellationToken) =>
{
await Task.Yield();
return brokerMessageQueue.Dequeue();
});
TaskAgentMessage message1 = await listener.GetNextMessageAsync(tokenSource.Token);
TaskAgentMessage message2 = await listener.GetNextMessageAsync(tokenSource.Token);
TaskAgentMessage message3 = await listener.GetNextMessageAsync(tokenSource.Token);
Assert.Equal(brokerMessages[0], message1);
Assert.Equal(brokerMessages[1], message2);
Assert.Equal(brokerMessages[4], message3);
//Assert
_runnerServer
.Verify(x => x.GetAgentMessageAsync(
_settings.PoolId, expectedSession.SessionId, It.IsAny<long?>(), TaskAgentStatus.Online, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<bool>(), It.IsAny<CancellationToken>()), Times.Exactly(brokerMessages.Length));
_brokerServer
.Verify(x => x.GetRunnerMessageAsync(
expectedSession.SessionId, TaskAgentStatus.Online, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<bool>(), It.IsAny<CancellationToken>()), Times.Exactly(brokerMessages.Length));
} }
} }
@@ -293,7 +546,7 @@ namespace GitHub.Runner.Common.Tests.Listener
_runnerServer _runnerServer
.Setup(x => x.GetAgentMessageAsync( .Setup(x => x.GetAgentMessageAsync(
    _settings.PoolId, expectedSession.SessionId, It.IsAny<long?>(), TaskAgentStatus.Online, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<CancellationToken>()))
    _settings.PoolId, expectedSession.SessionId, It.IsAny<long?>(), TaskAgentStatus.Online, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<bool>(), It.IsAny<CancellationToken>()))
.Throws(new TaskAgentAccessTokenExpiredException("test")); .Throws(new TaskAgentAccessTokenExpiredException("test"));
try try
{ {
@@ -311,7 +564,7 @@ namespace GitHub.Runner.Common.Tests.Listener
//Assert //Assert
_runnerServer _runnerServer
.Verify(x => x.GetAgentMessageAsync( .Verify(x => x.GetAgentMessageAsync(
    _settings.PoolId, expectedSession.SessionId, It.IsAny<long?>(), TaskAgentStatus.Online, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<CancellationToken>()), Times.Once);
    _settings.PoolId, expectedSession.SessionId, It.IsAny<long?>(), TaskAgentStatus.Online, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<bool>(), It.IsAny<CancellationToken>()), Times.Once);
_runnerServer _runnerServer
.Verify(x => x.DeleteAgentSessionAsync( .Verify(x => x.DeleteAgentSessionAsync(

View File

@@ -23,7 +23,6 @@ namespace GitHub.Runner.Common.Tests.Listener
private Mock<IConfigurationStore> _configStore; private Mock<IConfigurationStore> _configStore;
private Mock<IJobDispatcher> _jobDispatcher; private Mock<IJobDispatcher> _jobDispatcher;
private AgentRefreshMessage _refreshMessage = new(1, "2.999.0"); private AgentRefreshMessage _refreshMessage = new(1, "2.999.0");
private List<TrimmedPackageMetadata> _trimmedPackages = new();
#if !OS_WINDOWS #if !OS_WINDOWS
private string _packageUrl = null; private string _packageUrl = null;
@@ -71,12 +70,6 @@ namespace GitHub.Runner.Common.Tests.Listener
} }
} }
using (var client = new HttpClient())
{
var json = await client.GetStringAsync($"https://github.com/actions/runner/releases/download/v{latestVersion}/actions-runner-{BuildConstants.RunnerPackage.PackageName}-{latestVersion}-trimmedpackages.json");
_trimmedPackages = StringUtil.ConvertFromJson<List<TrimmedPackageMetadata>>(json);
}
_runnerServer.Setup(x => x.GetPackageAsync("agent", BuildConstants.RunnerPackage.PackageName, "2.999.0", true, It.IsAny<CancellationToken>())) _runnerServer.Setup(x => x.GetPackageAsync("agent", BuildConstants.RunnerPackage.PackageName, "2.999.0", true, It.IsAny<CancellationToken>()))
.Returns(Task.FromResult(new PackageMetadata() { Platform = BuildConstants.RunnerPackage.PackageName, Version = new PackageVersion("2.999.0"), DownloadUrl = _packageUrl })); .Returns(Task.FromResult(new PackageMetadata() { Platform = BuildConstants.RunnerPackage.PackageName, Version = new PackageVersion("2.999.0"), DownloadUrl = _packageUrl }));
@@ -91,12 +84,10 @@ namespace GitHub.Runner.Common.Tests.Listener
{ {
await FetchLatestRunner(); await FetchLatestRunner();
Assert.NotNull(_packageUrl); Assert.NotNull(_packageUrl);
Assert.NotNull(_trimmedPackages);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin"))); Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this)) using (var hc = new TestHostContext(this))
{ {
hc.GetTrace().Info(_packageUrl); hc.GetTrace().Info(_packageUrl);
hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));
//Arrange //Arrange
var updater = new Runner.Listener.SelfUpdater(); var updater = new Runner.Listener.SelfUpdater();
@@ -152,12 +143,10 @@ namespace GitHub.Runner.Common.Tests.Listener
{ {
await FetchLatestRunner(); await FetchLatestRunner();
Assert.NotNull(_packageUrl); Assert.NotNull(_packageUrl);
Assert.NotNull(_trimmedPackages);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin"))); Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this)) using (var hc = new TestHostContext(this))
{ {
hc.GetTrace().Info(_packageUrl); hc.GetTrace().Info(_packageUrl);
hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));
//Arrange //Arrange
var updater = new Runner.Listener.SelfUpdater(); var updater = new Runner.Listener.SelfUpdater();
@@ -205,12 +194,10 @@ namespace GitHub.Runner.Common.Tests.Listener
{ {
await FetchLatestRunner(); await FetchLatestRunner();
Assert.NotNull(_packageUrl); Assert.NotNull(_packageUrl);
Assert.NotNull(_trimmedPackages);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin"))); Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this)) using (var hc = new TestHostContext(this))
{ {
hc.GetTrace().Info(_packageUrl); hc.GetTrace().Info(_packageUrl);
hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));
//Arrange //Arrange
var updater = new Runner.Listener.SelfUpdater(); var updater = new Runner.Listener.SelfUpdater();
@@ -260,12 +247,10 @@ namespace GitHub.Runner.Common.Tests.Listener
{ {
await FetchLatestRunner(); await FetchLatestRunner();
Assert.NotNull(_packageUrl); Assert.NotNull(_packageUrl);
Assert.NotNull(_trimmedPackages);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin"))); Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this)) using (var hc = new TestHostContext(this))
{ {
hc.GetTrace().Info(_packageUrl); hc.GetTrace().Info(_packageUrl);
hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));
//Arrange //Arrange
var updater = new Runner.Listener.SelfUpdater(); var updater = new Runner.Listener.SelfUpdater();
@@ -305,495 +290,6 @@ namespace GitHub.Runner.Common.Tests.Listener
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", null); Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", null);
} }
} }
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void TestSelfUpdateAsync_CloneHash_RuntimeAndExternals()
{
try
{
await FetchLatestRunner();
Assert.NotNull(_packageUrl);
Assert.NotNull(_trimmedPackages);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this))
{
hc.GetTrace().Info(_packageUrl);
hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));
//Arrange
var updater = new Runner.Listener.SelfUpdater();
hc.SetSingleton<ITerminal>(_term.Object);
hc.SetSingleton<IRunnerServer>(_runnerServer.Object);
hc.SetSingleton<IConfigurationStore>(_configStore.Object);
hc.SetSingleton<IHttpClientHandlerFactory>(new HttpClientHandlerFactory());
var p1 = new ProcessInvokerWrapper();
p1.Initialize(hc);
var p2 = new ProcessInvokerWrapper();
p2.Initialize(hc);
var p3 = new ProcessInvokerWrapper();
p3.Initialize(hc);
hc.EnqueueInstance<IProcessInvoker>(p1);
hc.EnqueueInstance<IProcessInvoker>(p2);
hc.EnqueueInstance<IProcessInvoker>(p3);
updater.Initialize(hc);
_runnerServer.Setup(x => x.GetPackageAsync("agent", BuildConstants.RunnerPackage.PackageName, "2.999.0", true, It.IsAny<CancellationToken>()))
.Returns(Task.FromResult(new PackageMetadata() { Platform = BuildConstants.RunnerPackage.PackageName, Version = new PackageVersion("2.999.0"), DownloadUrl = _packageUrl, TrimmedPackages = new List<TrimmedPackageMetadata>() { new TrimmedPackageMetadata() } }));
_runnerServer.Setup(x => x.UpdateAgentUpdateStateAsync(1, 1, It.IsAny<string>(), It.IsAny<string>()))
.Callback((int p, ulong a, string s, string t) =>
{
hc.GetTrace().Info(t);
})
.Returns(Task.FromResult(new TaskAgent()));
try
{
var result = await updater.SelfUpdate(_refreshMessage, _jobDispatcher.Object, true, hc.RunnerShutdownToken);
Assert.True(result);
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0")));
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0")));
FieldInfo contentHashesProperty = updater.GetType().GetField("_contentHashes", BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.Public);
Assert.NotNull(contentHashesProperty);
Dictionary<string, string> contentHashes = (Dictionary<string, string>)contentHashesProperty.GetValue(updater);
hc.GetTrace().Info(StringUtil.ConvertToJson(contentHashes));
var dotnetRuntimeHashFile = Path.Combine(TestUtil.GetSrcPath(), $"Misc/contentHash/dotnetRuntime/{BuildConstants.RunnerPackage.PackageName}");
var externalsHashFile = Path.Combine(TestUtil.GetSrcPath(), $"Misc/contentHash/externals/{BuildConstants.RunnerPackage.PackageName}");
Assert.Equal(File.ReadAllText(dotnetRuntimeHashFile).Trim(), contentHashes["dotnetRuntime"]);
Assert.Equal(File.ReadAllText(externalsHashFile).Trim(), contentHashes["externals"]);
}
finally
{
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0"), CancellationToken.None);
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0"), CancellationToken.None);
}
}
}
finally
{
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", null);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void TestSelfUpdateAsync_Cancel_CloneHashTask_WhenNotNeeded()
{
try
{
await FetchLatestRunner();
Assert.NotNull(_packageUrl);
Assert.NotNull(_trimmedPackages);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this))
{
hc.GetTrace().Info(_packageUrl);
hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));
//Arrange
var updater = new Runner.Listener.SelfUpdater();
hc.SetSingleton<ITerminal>(_term.Object);
hc.SetSingleton<IRunnerServer>(_runnerServer.Object);
hc.SetSingleton<IConfigurationStore>(_configStore.Object);
hc.SetSingleton<IHttpClientHandlerFactory>(new Mock<IHttpClientHandlerFactory>().Object);
var p1 = new ProcessInvokerWrapper();
p1.Initialize(hc);
var p2 = new ProcessInvokerWrapper();
p2.Initialize(hc);
var p3 = new ProcessInvokerWrapper();
p3.Initialize(hc);
hc.EnqueueInstance<IProcessInvoker>(p1);
hc.EnqueueInstance<IProcessInvoker>(p2);
hc.EnqueueInstance<IProcessInvoker>(p3);
updater.Initialize(hc);
_runnerServer.Setup(x => x.UpdateAgentUpdateStateAsync(1, 1, It.IsAny<string>(), It.IsAny<string>()))
.Callback((int p, ulong a, string s, string t) =>
{
hc.GetTrace().Info(t);
})
.Returns(Task.FromResult(new TaskAgent()));
try
{
var result = await updater.SelfUpdate(_refreshMessage, _jobDispatcher.Object, true, hc.RunnerShutdownToken);
FieldInfo contentHashesProperty = updater.GetType().GetField("_contentHashes", BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.Public);
Assert.NotNull(contentHashesProperty);
Dictionary<string, string> contentHashes = (Dictionary<string, string>)contentHashesProperty.GetValue(updater);
hc.GetTrace().Info(StringUtil.ConvertToJson(contentHashes));
Assert.NotEqual(2, contentHashes.Count);
}
catch (Exception ex)
{
hc.GetTrace().Error(ex);
}
}
}
finally
{
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", null);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void TestSelfUpdateAsync_UseExternalsTrimmedPackage()
{
try
{
await FetchLatestRunner();
Assert.NotNull(_packageUrl);
Assert.NotNull(_trimmedPackages);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this))
{
hc.GetTrace().Info(_packageUrl);
hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));
//Arrange
var updater = new Runner.Listener.SelfUpdater();
hc.SetSingleton<ITerminal>(_term.Object);
hc.SetSingleton<IRunnerServer>(_runnerServer.Object);
hc.SetSingleton<IConfigurationStore>(_configStore.Object);
hc.SetSingleton<IHttpClientHandlerFactory>(new HttpClientHandlerFactory());
var p1 = new ProcessInvokerWrapper(); // hashfiles
p1.Initialize(hc);
var p2 = new ProcessInvokerWrapper(); // hashfiles
p2.Initialize(hc);
var p3 = new ProcessInvokerWrapper(); // un-tar
p3.Initialize(hc);
var p4 = new ProcessInvokerWrapper(); // node -v
p4.Initialize(hc);
var p5 = new ProcessInvokerWrapper(); // node -v
p5.Initialize(hc);
hc.EnqueueInstance<IProcessInvoker>(p1);
hc.EnqueueInstance<IProcessInvoker>(p2);
hc.EnqueueInstance<IProcessInvoker>(p3);
hc.EnqueueInstance<IProcessInvoker>(p4);
hc.EnqueueInstance<IProcessInvoker>(p5);
updater.Initialize(hc);
var trim = _trimmedPackages.Where(x => !x.TrimmedContents.ContainsKey("dotnetRuntime")).ToList();
_runnerServer.Setup(x => x.GetPackageAsync("agent", BuildConstants.RunnerPackage.PackageName, "2.999.0", true, It.IsAny<CancellationToken>()))
.Returns(Task.FromResult(new PackageMetadata() { Platform = BuildConstants.RunnerPackage.PackageName, Version = new PackageVersion("2.999.0"), DownloadUrl = _packageUrl, TrimmedPackages = trim }));
_runnerServer.Setup(x => x.UpdateAgentUpdateStateAsync(1, 1, It.IsAny<string>(), It.IsAny<string>()))
.Callback((int p, ulong a, string s, string t) =>
{
hc.GetTrace().Info(t);
})
.Returns(Task.FromResult(new TaskAgent()));
try
{
var result = await updater.SelfUpdate(_refreshMessage, _jobDispatcher.Object, true, hc.RunnerShutdownToken);
Assert.True(result);
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0")));
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0")));
}
finally
{
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0"), CancellationToken.None);
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0"), CancellationToken.None);
}
var traceFile = Path.GetTempFileName();
File.Copy(hc.TraceFileName, traceFile, true);
var externalsHashFile = Path.Combine(TestUtil.GetSrcPath(), $"Misc/contentHash/externals/{BuildConstants.RunnerPackage.PackageName}");
var externalsHash = await File.ReadAllTextAsync(externalsHashFile);
if (externalsHash == trim[0].TrimmedContents["externals"])
{
Assert.Contains("Use trimmed (externals) package", File.ReadAllText(traceFile));
}
else
{
Assert.Contains("the current runner does not carry those trimmed content (Hash mismatch)", File.ReadAllText(traceFile));
}
}
}
finally
{
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", null);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void TestSelfUpdateAsync_UseExternalsRuntimeTrimmedPackage()
{
try
{
await FetchLatestRunner();
Assert.NotNull(_packageUrl);
Assert.NotNull(_trimmedPackages);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this))
{
hc.GetTrace().Info(_packageUrl);
hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));
//Arrange
var updater = new Runner.Listener.SelfUpdater();
hc.SetSingleton<ITerminal>(_term.Object);
hc.SetSingleton<IRunnerServer>(_runnerServer.Object);
hc.SetSingleton<IConfigurationStore>(_configStore.Object);
hc.SetSingleton<IHttpClientHandlerFactory>(new HttpClientHandlerFactory());
var p1 = new ProcessInvokerWrapper(); // hashfiles
p1.Initialize(hc);
var p2 = new ProcessInvokerWrapper(); // hashfiles
p2.Initialize(hc);
var p3 = new ProcessInvokerWrapper(); // un-tar
p3.Initialize(hc);
var p4 = new ProcessInvokerWrapper(); // node -v
p4.Initialize(hc);
var p5 = new ProcessInvokerWrapper(); // node -v
p5.Initialize(hc);
var p6 = new ProcessInvokerWrapper(); // runner -v
p6.Initialize(hc);
hc.EnqueueInstance<IProcessInvoker>(p1);
hc.EnqueueInstance<IProcessInvoker>(p2);
hc.EnqueueInstance<IProcessInvoker>(p3);
hc.EnqueueInstance<IProcessInvoker>(p4);
hc.EnqueueInstance<IProcessInvoker>(p5);
hc.EnqueueInstance<IProcessInvoker>(p6);
updater.Initialize(hc);
var trim = _trimmedPackages.Where(x => x.TrimmedContents.ContainsKey("dotnetRuntime") && x.TrimmedContents.ContainsKey("externals")).ToList();
_runnerServer.Setup(x => x.GetPackageAsync("agent", BuildConstants.RunnerPackage.PackageName, "2.999.0", true, It.IsAny<CancellationToken>()))
.Returns(Task.FromResult(new PackageMetadata() { Platform = BuildConstants.RunnerPackage.PackageName, Version = new PackageVersion("2.999.0"), DownloadUrl = _packageUrl, TrimmedPackages = trim }));
_runnerServer.Setup(x => x.UpdateAgentUpdateStateAsync(1, 1, It.IsAny<string>(), It.IsAny<string>()))
.Callback((int p, ulong a, string s, string t) =>
{
hc.GetTrace().Info(t);
})
.Returns(Task.FromResult(new TaskAgent()));
try
{
var result = await updater.SelfUpdate(_refreshMessage, _jobDispatcher.Object, true, hc.RunnerShutdownToken);
Assert.True(result);
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0")));
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0")));
}
finally
{
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0"), CancellationToken.None);
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0"), CancellationToken.None);
}
var traceFile = Path.GetTempFileName();
File.Copy(hc.TraceFileName, traceFile, true);
var externalsHashFile = Path.Combine(TestUtil.GetSrcPath(), $"Misc/contentHash/externals/{BuildConstants.RunnerPackage.PackageName}");
var externalsHash = await File.ReadAllTextAsync(externalsHashFile);
var runtimeHashFile = Path.Combine(TestUtil.GetSrcPath(), $"Misc/contentHash/dotnetRuntime/{BuildConstants.RunnerPackage.PackageName}");
var runtimeHash = await File.ReadAllTextAsync(runtimeHashFile);
if (externalsHash == trim[0].TrimmedContents["externals"] &&
runtimeHash == trim[0].TrimmedContents["dotnetRuntime"])
{
Assert.Contains("Use trimmed (runtime+externals) package", File.ReadAllText(traceFile));
}
else
{
Assert.Contains("the current runner does not carry those trimmed content (Hash mismatch)", File.ReadAllText(traceFile));
}
}
}
finally
{
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", null);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void TestSelfUpdateAsync_NotUseExternalsRuntimeTrimmedPackageOnHashMismatch()
{
try
{
await FetchLatestRunner();
Assert.NotNull(_packageUrl);
Assert.NotNull(_trimmedPackages);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this))
{
hc.GetTrace().Info(_packageUrl);
hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));
//Arrange
var updater = new Runner.Listener.SelfUpdater();
hc.SetSingleton<ITerminal>(_term.Object);
hc.SetSingleton<IRunnerServer>(_runnerServer.Object);
hc.SetSingleton<IConfigurationStore>(_configStore.Object);
hc.SetSingleton<IHttpClientHandlerFactory>(new HttpClientHandlerFactory());
var p1 = new ProcessInvokerWrapper(); // hashfiles
p1.Initialize(hc);
var p2 = new ProcessInvokerWrapper(); // hashfiles
p2.Initialize(hc);
var p3 = new ProcessInvokerWrapper(); // un-tar
p3.Initialize(hc);
var p4 = new ProcessInvokerWrapper(); // node -v
p4.Initialize(hc);
var p5 = new ProcessInvokerWrapper(); // node -v
p5.Initialize(hc);
var p6 = new ProcessInvokerWrapper(); // runner -v
p6.Initialize(hc);
hc.EnqueueInstance<IProcessInvoker>(p1);
hc.EnqueueInstance<IProcessInvoker>(p2);
hc.EnqueueInstance<IProcessInvoker>(p3);
hc.EnqueueInstance<IProcessInvoker>(p4);
hc.EnqueueInstance<IProcessInvoker>(p5);
hc.EnqueueInstance<IProcessInvoker>(p6);
updater.Initialize(hc);
var trim = _trimmedPackages.ToList();
foreach (var package in trim)
{
foreach (var hash in package.TrimmedContents.Keys)
{
package.TrimmedContents[hash] = "mismatch";
}
}
_runnerServer.Setup(x => x.GetPackageAsync("agent", BuildConstants.RunnerPackage.PackageName, "2.999.0", true, It.IsAny<CancellationToken>()))
.Returns(Task.FromResult(new PackageMetadata() { Platform = BuildConstants.RunnerPackage.PackageName, Version = new PackageVersion("2.999.0"), DownloadUrl = _packageUrl, TrimmedPackages = trim }));
_runnerServer.Setup(x => x.UpdateAgentUpdateStateAsync(1, 1, It.IsAny<string>(), It.IsAny<string>()))
.Callback((int p, ulong a, string s, string t) =>
{
hc.GetTrace().Info(t);
})
.Returns(Task.FromResult(new TaskAgent()));
try
{
var result = await updater.SelfUpdate(_refreshMessage, _jobDispatcher.Object, true, hc.RunnerShutdownToken);
Assert.True(result);
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0")));
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0")));
}
finally
{
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0"), CancellationToken.None);
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0"), CancellationToken.None);
}
var traceFile = Path.GetTempFileName();
File.Copy(hc.TraceFileName, traceFile, true);
Assert.Contains("the current runner does not carry those trimmed content (Hash mismatch)", File.ReadAllText(traceFile));
}
}
finally
{
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", null);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void TestSelfUpdateAsync_FallbackToFullPackage()
{
try
{
await FetchLatestRunner();
Assert.NotNull(_packageUrl);
Assert.NotNull(_trimmedPackages);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this))
{
hc.GetTrace().Info(_packageUrl);
hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));
//Arrange
var updater = new Runner.Listener.SelfUpdater();
hc.SetSingleton<ITerminal>(_term.Object);
hc.SetSingleton<IRunnerServer>(_runnerServer.Object);
hc.SetSingleton<IConfigurationStore>(_configStore.Object);
hc.SetSingleton<IHttpClientHandlerFactory>(new HttpClientHandlerFactory());
var p1 = new ProcessInvokerWrapper(); // hashfiles
p1.Initialize(hc);
var p2 = new ProcessInvokerWrapper(); // hashfiles
p2.Initialize(hc);
var p3 = new ProcessInvokerWrapper(); // un-tar trim
p3.Initialize(hc);
var p4 = new ProcessInvokerWrapper(); // un-tar full
p4.Initialize(hc);
hc.EnqueueInstance<IProcessInvoker>(p1);
hc.EnqueueInstance<IProcessInvoker>(p2);
hc.EnqueueInstance<IProcessInvoker>(p3);
hc.EnqueueInstance<IProcessInvoker>(p4);
updater.Initialize(hc);
var trim = _trimmedPackages.ToList();
foreach (var package in trim)
{
package.HashValue = "mismatch";
}
_runnerServer.Setup(x => x.GetPackageAsync("agent", BuildConstants.RunnerPackage.PackageName, "2.999.0", true, It.IsAny<CancellationToken>()))
.Returns(Task.FromResult(new PackageMetadata() { Platform = BuildConstants.RunnerPackage.PackageName, Version = new PackageVersion("2.999.0"), DownloadUrl = _packageUrl, TrimmedPackages = trim }));
_runnerServer.Setup(x => x.UpdateAgentUpdateStateAsync(1, 1, It.IsAny<string>(), It.IsAny<string>()))
.Callback((int p, ulong a, string s, string t) =>
{
hc.GetTrace().Info(t);
})
.Returns(Task.FromResult(new TaskAgent()));
try
{
var result = await updater.SelfUpdate(_refreshMessage, _jobDispatcher.Object, true, hc.RunnerShutdownToken);
Assert.True(result);
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0")));
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0")));
}
finally
{
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0"), CancellationToken.None);
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0"), CancellationToken.None);
}
var traceFile = Path.GetTempFileName();
File.Copy(hc.TraceFileName, traceFile, true);
if (File.ReadAllText(traceFile).Contains("Use trimmed (runtime+externals) package"))
{
Assert.Contains("Something wrong with the trimmed runner package, failback to use the full package for runner updates", File.ReadAllText(traceFile));
}
else
{
hc.GetTrace().Warning("Skipping the 'TestSelfUpdateAsync_FallbackToFullPackage' test, as the `externals` or `runtime` hashes have been updated");
}
}
}
finally
{
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", null);
}
}
}
}
#endif

View File

@@ -0,0 +1,234 @@
#if !(OS_WINDOWS && ARM64)
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Reflection;
using System.Text.RegularExpressions;
using System.Threading;
using System.Threading.Tasks;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Listener;
using GitHub.Runner.Sdk;
using Moq;
using Xunit;
namespace GitHub.Runner.Common.Tests.Listener
{
public sealed class SelfUpdaterV2L0
{
private Mock<IRunnerServer> _runnerServer;
private Mock<ITerminal> _term;
private Mock<IConfigurationStore> _configStore;
private Mock<IJobDispatcher> _jobDispatcher;
private AgentRefreshMessage _refreshMessage = new(1, "2.999.0");
#if !OS_WINDOWS
private string _packageUrl = null;
#else
private string _packageUrl = null;
#endif
public SelfUpdaterV2L0()
{
_runnerServer = new Mock<IRunnerServer>();
_term = new Mock<ITerminal>();
_configStore = new Mock<IConfigurationStore>();
_jobDispatcher = new Mock<IJobDispatcher>();
_configStore.Setup(x => x.GetSettings()).Returns(new RunnerSettings() { PoolId = 1, AgentId = 1 });
Environment.SetEnvironmentVariable("_GITHUB_ACTION_EXECUTE_UPDATE_SCRIPT", "1");
}
private async Task FetchLatestRunner()
{
var latestVersion = "";
var httpClientHandler = new HttpClientHandler();
httpClientHandler.AllowAutoRedirect = false;
using (var client = new HttpClient(httpClientHandler))
{
var response = await client.SendAsync(new HttpRequestMessage(HttpMethod.Get, "https://github.com/actions/runner/releases/latest"));
if (response.StatusCode == System.Net.HttpStatusCode.Redirect)
{
var redirectUrl = response.Headers.Location.ToString();
Regex regex = new(@"/runner/releases/tag/v(?<version>\d+\.\d+\.\d+)");
var match = regex.Match(redirectUrl);
if (match.Success)
{
latestVersion = match.Groups["version"].Value;
#if !OS_WINDOWS
_packageUrl = $"https://github.com/actions/runner/releases/download/v{latestVersion}/actions-runner-{BuildConstants.RunnerPackage.PackageName}-{latestVersion}.tar.gz";
#else
_packageUrl = $"https://github.com/actions/runner/releases/download/v{latestVersion}/actions-runner-{BuildConstants.RunnerPackage.PackageName}-{latestVersion}.zip";
#endif
}
else
{
throw new Exception("The latest runner version could not be determined so a download URL could not be generated for it. Please check the location header of the redirect response of 'https://github.com/actions/runner/releases/latest'");
}
}
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void TestSelfUpdateAsync()
{
try
{
await FetchLatestRunner();
Assert.NotNull(_packageUrl);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this))
{
hc.GetTrace().Info(_packageUrl);
//Arrange
var updater = new Runner.Listener.SelfUpdaterV2();
hc.SetSingleton<ITerminal>(_term.Object);
hc.SetSingleton<IRunnerServer>(_runnerServer.Object);
hc.SetSingleton<IConfigurationStore>(_configStore.Object);
hc.SetSingleton<IHttpClientHandlerFactory>(new HttpClientHandlerFactory());
var p1 = new ProcessInvokerWrapper();
p1.Initialize(hc);
var p2 = new ProcessInvokerWrapper();
p2.Initialize(hc);
var p3 = new ProcessInvokerWrapper();
p3.Initialize(hc);
hc.EnqueueInstance<IProcessInvoker>(p1);
hc.EnqueueInstance<IProcessInvoker>(p2);
hc.EnqueueInstance<IProcessInvoker>(p3);
updater.Initialize(hc);
try
{
var message = new RunnerRefreshMessage()
{
TargetVersion = "2.999.0",
OS = BuildConstants.RunnerPackage.PackageName,
DownloadUrl = _packageUrl
};
var result = await updater.SelfUpdate(message, _jobDispatcher.Object, true, hc.RunnerShutdownToken);
Assert.True(result);
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0")));
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0")));
}
finally
{
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0"), CancellationToken.None);
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0"), CancellationToken.None);
}
}
}
finally
{
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", null);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void TestSelfUpdateAsync_DownloadRetry()
{
try
{
await FetchLatestRunner();
Assert.NotNull(_packageUrl);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this))
{
hc.GetTrace().Info(_packageUrl);
//Arrange
var updater = new Runner.Listener.SelfUpdaterV2();
hc.SetSingleton<ITerminal>(_term.Object);
hc.SetSingleton<IRunnerServer>(_runnerServer.Object);
hc.SetSingleton<IConfigurationStore>(_configStore.Object);
hc.SetSingleton<IHttpClientHandlerFactory>(new HttpClientHandlerFactory());
var p1 = new ProcessInvokerWrapper();
p1.Initialize(hc);
var p2 = new ProcessInvokerWrapper();
p2.Initialize(hc);
var p3 = new ProcessInvokerWrapper();
p3.Initialize(hc);
hc.EnqueueInstance<IProcessInvoker>(p1);
hc.EnqueueInstance<IProcessInvoker>(p2);
hc.EnqueueInstance<IProcessInvoker>(p3);
updater.Initialize(hc);
var message = new RunnerRefreshMessage()
{
TargetVersion = "2.999.0",
OS = BuildConstants.RunnerPackage.PackageName,
DownloadUrl = "https://github.com/actions/runner/notexists"
};
var ex = await Assert.ThrowsAsync<TaskCanceledException>(() => updater.SelfUpdate(message, _jobDispatcher.Object, true, hc.RunnerShutdownToken));
Assert.Contains($"failed after {Constants.RunnerDownloadRetryMaxAttempts} download attempts", ex.Message);
}
}
finally
{
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", null);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void TestSelfUpdateAsync_ValidateHash()
{
try
{
await FetchLatestRunner();
Assert.NotNull(_packageUrl);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this))
{
hc.GetTrace().Info(_packageUrl);
//Arrange
var updater = new Runner.Listener.SelfUpdaterV2();
hc.SetSingleton<ITerminal>(_term.Object);
hc.SetSingleton<IRunnerServer>(_runnerServer.Object);
hc.SetSingleton<IConfigurationStore>(_configStore.Object);
hc.SetSingleton<IHttpClientHandlerFactory>(new HttpClientHandlerFactory());
var p1 = new ProcessInvokerWrapper();
p1.Initialize(hc);
var p2 = new ProcessInvokerWrapper();
p2.Initialize(hc);
var p3 = new ProcessInvokerWrapper();
p3.Initialize(hc);
hc.EnqueueInstance<IProcessInvoker>(p1);
hc.EnqueueInstance<IProcessInvoker>(p2);
hc.EnqueueInstance<IProcessInvoker>(p3);
updater.Initialize(hc);
var message = new RunnerRefreshMessage()
{
TargetVersion = "2.999.0",
OS = BuildConstants.RunnerPackage.PackageName,
DownloadUrl = _packageUrl,
SHA256Checksum = "badhash"
};
var ex = await Assert.ThrowsAsync<Exception>(() => updater.SelfUpdate(message, _jobDispatcher.Object, true, hc.RunnerShutdownToken));
Assert.Contains("did not match expected Runner Hash", ex.Message);
}
}
finally
{
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", null);
}
}
}
}
#endif

View File

@@ -1,301 +0,0 @@
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using Xunit;
namespace GitHub.Runner.Common.Tests
{
public sealed class PackagesTrimL0
{
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public async Task RunnerLayoutParts_NewFilesCrossAll()
{
using (TestHostContext hc = new(this))
{
Tracing trace = hc.GetTrace();
var runnerCoreAssetsFile = Path.Combine(TestUtil.GetSrcPath(), @"Misc/runnercoreassets");
var runnerDotnetRuntimeFile = Path.Combine(TestUtil.GetSrcPath(), @"Misc/runnerdotnetruntimeassets");
string layoutBin = Path.Combine(TestUtil.GetSrcPath(), @"../_layout/bin");
var newFiles = new List<string>();
if (Directory.Exists(layoutBin))
{
var coreAssets = await File.ReadAllLinesAsync(runnerCoreAssetsFile);
var runtimeAssets = await File.ReadAllLinesAsync(runnerDotnetRuntimeFile);
foreach (var file in Directory.GetFiles(layoutBin, "*", SearchOption.AllDirectories))
{
if (!coreAssets.Any(x => file.Replace(Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar).EndsWith(x)) &&
!runtimeAssets.Any(x => file.Replace(Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar).EndsWith(x)))
{
newFiles.Add(file);
}
}
if (newFiles.Count > 0)
{
Assert.True(false, $"Found new files '{string.Join('\n', newFiles)}'. These will break runner update using trimmed packages.");
}
}
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public async Task RunnerLayoutParts_OverlapFiles()
{
using (TestHostContext hc = new(this))
{
Tracing trace = hc.GetTrace();
var runnerCoreAssetsFile = Path.Combine(TestUtil.GetSrcPath(), @"Misc/runnercoreassets");
var runnerDotnetRuntimeFile = Path.Combine(TestUtil.GetSrcPath(), @"Misc/runnerdotnetruntimeassets");
var coreAssets = await File.ReadAllLinesAsync(runnerCoreAssetsFile);
var runtimeAssets = await File.ReadAllLinesAsync(runnerDotnetRuntimeFile);
foreach (var line in coreAssets)
{
if (runtimeAssets.Contains(line, StringComparer.OrdinalIgnoreCase))
{
Assert.True(false, $"'Misc/runnercoreassets' and 'Misc/runnerdotnetruntimeassets' should not overlap.");
}
}
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public async Task RunnerLayoutParts_NewRunnerCoreAssets()
{
using (TestHostContext hc = new(this))
{
Tracing trace = hc.GetTrace();
var runnerCoreAssetsFile = Path.Combine(TestUtil.GetSrcPath(), @"Misc/runnercoreassets");
var coreAssets = await File.ReadAllLinesAsync(runnerCoreAssetsFile);
string layoutBin = Path.Combine(TestUtil.GetSrcPath(), @"../_layout/bin");
var newFiles = new List<string>();
if (Directory.Exists(layoutBin))
{
var binDirs = Directory.GetDirectories(TestUtil.GetSrcPath(), "net6.0", SearchOption.AllDirectories);
foreach (var binDir in binDirs)
{
if (binDir.Contains("Test") || binDir.Contains("obj"))
{
continue;
}
Directory.GetFiles(binDir, "*", SearchOption.TopDirectoryOnly).ToList().ForEach(x =>
{
if (!x.Contains("runtimeconfig.dev.json"))
{
if (!coreAssets.Any(y => x.Replace(Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar).EndsWith(y)))
{
newFiles.Add(x);
}
}
});
}
if (newFiles.Count > 0)
{
Assert.True(false, $"Found new files '{string.Join('\n', newFiles)}'. These will break runner update using trimmed packages. You might need to update `Misc/runnercoreassets`.");
}
}
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public async Task RunnerLayoutParts_NewDotnetRuntimeAssets()
{
using (TestHostContext hc = new(this))
{
Tracing trace = hc.GetTrace();
var runnerDotnetRuntimeFile = Path.Combine(TestUtil.GetSrcPath(), @"Misc/runnerdotnetruntimeassets");
var runtimeAssets = await File.ReadAllLinesAsync(runnerDotnetRuntimeFile);
string layoutTrimsRuntimeAssets = Path.Combine(TestUtil.GetSrcPath(), @"../_layout_trims/runnerdotnetruntimeassets");
var newFiles = new List<string>();
if (File.Exists(layoutTrimsRuntimeAssets))
{
var runtimeAssetsCurrent = await File.ReadAllLinesAsync(layoutTrimsRuntimeAssets);
foreach (var runtimeFile in runtimeAssetsCurrent)
{
if (runtimeAssets.Any(x => runtimeFile.EndsWith(x, StringComparison.OrdinalIgnoreCase)))
{
continue;
}
else
{
newFiles.Add(runtimeFile);
}
}
if (newFiles.Count > 0)
{
Assert.True(false, $"Found new dotnet runtime files '{string.Join('\n', newFiles)}'. These will break runner update using trimmed packages. You might need to update `Misc/runnerdotnetruntimeassets`.");
}
}
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public async Task RunnerLayoutParts_CheckDotnetRuntimeHash()
{
using (TestHostContext hc = new(this))
{
Tracing trace = hc.GetTrace();
var dotnetRuntimeHashFile = Path.Combine(TestUtil.GetSrcPath(), $"Misc/contentHash/dotnetRuntime/{BuildConstants.RunnerPackage.PackageName}");
trace.Info($"Current hash: {File.ReadAllText(dotnetRuntimeHashFile)}");
string layoutTrimsRuntimeAssets = Path.Combine(TestUtil.GetSrcPath(), @"../_layout_trims/runtime");
string binDir = Path.Combine(TestUtil.GetSrcPath(), @"../_layout/bin");
#if OS_WINDOWS
string node = Path.Combine(TestUtil.GetSrcPath(), @"..\_layout\externals\node16\bin\node");
#else
string node = Path.Combine(TestUtil.GetSrcPath(), @"../_layout/externals/node16/bin/node");
#endif
string hashFilesScript = Path.Combine(binDir, "hashFiles");
var hashResult = string.Empty;
var p1 = new ProcessInvokerWrapper();
p1.Initialize(hc);
p1.ErrorDataReceived += (_, data) =>
{
if (!string.IsNullOrEmpty(data.Data) && data.Data.StartsWith("__OUTPUT__") && data.Data.EndsWith("__OUTPUT__"))
{
hashResult = data.Data.Substring(10, data.Data.Length - 20);
trace.Info($"Hash result: '{hashResult}'");
}
else
{
trace.Info(data.Data);
}
};
p1.OutputDataReceived += (_, data) =>
{
trace.Info(data.Data);
};
var env = new Dictionary<string, string>
{
["patterns"] = "**"
};
int exitCode = await p1.ExecuteAsync(workingDirectory: layoutTrimsRuntimeAssets,
fileName: node,
arguments: $"\"{hashFilesScript.Replace("\"", "\\\"")}\"",
environment: env,
requireExitCodeZero: true,
outputEncoding: null,
killProcessOnCancel: true,
cancellationToken: CancellationToken.None);
Assert.True(string.Equals(hashResult, File.ReadAllText(dotnetRuntimeHashFile).Trim()), $"Hash mismatch for dotnet runtime. You might need to update `Misc/contentHash/dotnetRuntime/{BuildConstants.RunnerPackage.PackageName}` or check if `hashFiles.ts` ever changed recently.");
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public async Task RunnerLayoutParts_CheckExternalsHash()
{
using (TestHostContext hc = new(this))
{
Tracing trace = hc.GetTrace();
var externalsHashFile = Path.Combine(TestUtil.GetSrcPath(), $"Misc/contentHash/externals/{BuildConstants.RunnerPackage.PackageName}");
trace.Info($"Current hash: {File.ReadAllText(externalsHashFile)}");
string layoutTrimsExternalsAssets = Path.Combine(TestUtil.GetSrcPath(), @"../_layout_trims/externals");
string binDir = Path.Combine(TestUtil.GetSrcPath(), @"../_layout/bin");
#if OS_WINDOWS
string node = Path.Combine(TestUtil.GetSrcPath(), @"..\_layout\externals\node16\bin\node");
#else
string node = Path.Combine(TestUtil.GetSrcPath(), @"../_layout/externals/node16/bin/node");
#endif
string hashFilesScript = Path.Combine(binDir, "hashFiles");
var hashResult = string.Empty;
var p1 = new ProcessInvokerWrapper();
p1.Initialize(hc);
p1.ErrorDataReceived += (_, data) =>
{
if (!string.IsNullOrEmpty(data.Data) && data.Data.StartsWith("__OUTPUT__") && data.Data.EndsWith("__OUTPUT__"))
{
hashResult = data.Data.Substring(10, data.Data.Length - 20);
trace.Info($"Hash result: '{hashResult}'");
}
else
{
trace.Info(data.Data);
}
};
p1.OutputDataReceived += (_, data) =>
{
trace.Info(data.Data);
};
var env = new Dictionary<string, string>
{
["patterns"] = "**"
};
int exitCode = await p1.ExecuteAsync(workingDirectory: layoutTrimsExternalsAssets,
fileName: node,
arguments: $"\"{hashFilesScript.Replace("\"", "\\\"")}\"",
environment: env,
requireExitCodeZero: true,
outputEncoding: null,
killProcessOnCancel: true,
cancellationToken: CancellationToken.None);
Assert.True(string.Equals(hashResult, File.ReadAllText(externalsHashFile).Trim()), $"Hash mismatch for externals. You might need to update `Misc/contentHash/externals/{BuildConstants.RunnerPackage.PackageName}` or check if `hashFiles.ts` ever changed recently.");
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public Task RunnerLayoutParts_ContentHashFilesNoNewline()
{
using (TestHostContext hc = new(this))
{
Tracing trace = hc.GetTrace();
var dotnetRuntimeHashFile = Path.Combine(TestUtil.GetSrcPath(), $"Misc/contentHash/dotnetRuntime/{BuildConstants.RunnerPackage.PackageName}");
var dotnetRuntimeHash = File.ReadAllText(dotnetRuntimeHashFile);
trace.Info($"Current hash: {dotnetRuntimeHash}");
var externalsHashFile = Path.Combine(TestUtil.GetSrcPath(), $"Misc/contentHash/externals/{BuildConstants.RunnerPackage.PackageName}");
var externalsHash = File.ReadAllText(externalsHashFile);
trace.Info($"Current hash: {externalsHash}");
Assert.False(externalsHash.Any(x => char.IsWhiteSpace(x)), $"Found whitespace in externals hash file.");
Assert.False(dotnetRuntimeHash.Any(x => char.IsWhiteSpace(x)), $"Found whitespace in dotnet runtime hash file.");
return Task.CompletedTask;
}
}
}
}

View File

@@ -757,7 +757,7 @@ namespace GitHub.Runner.Common.Tests.Worker
//Assert
var err = Assert.Throws<ArgumentException>(() => actionManifest.Load(_ec.Object, action_path));
Assert.Contains($"Fail to load {action_path}", err.Message);
Assert.Contains($"Failed to load {action_path}", err.Message);
_ec.Verify(x => x.AddIssue(It.Is<Issue>(s => s.Message.Contains("Missing 'using' value. 'using' requires 'composite', 'docker', 'node12', 'node16' or 'node20'.")), It.IsAny<ExecutionContextLogOptions>()), Times.Once);
}
finally

View File

@@ -14,15 +14,10 @@ DEV_TARGET_RUNTIME=$3
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
LAYOUT_DIR="$SCRIPT_DIR/../_layout"
LAYOUT_TRIMS_DIR="$SCRIPT_DIR/../_layout_trims"
LAYOUT_TRIM_EXTERNALS_DIR="$LAYOUT_TRIMS_DIR/trim_externals"
LAYOUT_TRIM_RUNTIME_DIR="$LAYOUT_TRIMS_DIR/trim_runtime"
LAYOUT_TRIM_RUNTIME_EXTERNALS_DIR="$LAYOUT_TRIMS_DIR/trim_runtime_externals"
DOWNLOAD_DIR="$SCRIPT_DIR/../_downloads/netcore2x"
PACKAGE_DIR="$SCRIPT_DIR/../_package"
PACKAGE_TRIMS_DIR="$SCRIPT_DIR/../_package_trims"
DOTNETSDK_ROOT="$SCRIPT_DIR/../_dotnetsdk"
DOTNETSDK_VERSION="6.0.415"
DOTNETSDK_VERSION="6.0.418"
DOTNETSDK_INSTALLDIR="$DOTNETSDK_ROOT/$DOTNETSDK_VERSION"
RUNNER_VERSION=$(cat runnerversion)
@@ -148,48 +143,6 @@ function layout ()
heading "Setup externals folder for $RUNTIME_ID runner's layout" heading "Setup externals folder for $RUNTIME_ID runner's layout"
bash ./Misc/externals.sh $RUNTIME_ID || checkRC externals.sh bash ./Misc/externals.sh $RUNTIME_ID || checkRC externals.sh
heading "Create layout (Trimmed) ..."
rm -Rf "$LAYOUT_TRIMS_DIR"
mkdir -p "$LAYOUT_TRIMS_DIR"
mkdir -p "$LAYOUT_TRIMS_DIR/runtime"
cp -r "$LAYOUT_DIR/bin/." "$LAYOUT_TRIMS_DIR/runtime"
mkdir -p "$LAYOUT_TRIMS_DIR/externals"
cp -r "$LAYOUT_DIR/externals/." "$LAYOUT_TRIMS_DIR/externals"
pushd "$LAYOUT_TRIMS_DIR/runtime" > /dev/null
if [[ ("$CURRENT_PLATFORM" == "windows") ]]; then
sed -i 's/\n$/\r\n/' "$SCRIPT_DIR/Misc/runnercoreassets"
fi
cat "$SCRIPT_DIR/Misc/runnercoreassets" | xargs rm -f
find . -empty -type d -delete
find . -type f > "$LAYOUT_TRIMS_DIR/runnerdotnetruntimeassets"
popd > /dev/null
heading "Create layout with externals trimmed ..."
mkdir -p "$LAYOUT_TRIM_EXTERNALS_DIR"
cp -r "$LAYOUT_DIR/." "$LAYOUT_TRIM_EXTERNALS_DIR/"
rm -Rf "$LAYOUT_TRIM_EXTERNALS_DIR/externals"
echo "Created... $LAYOUT_TRIM_EXTERNALS_DIR"
heading "Create layout with dotnet runtime trimmed ..."
mkdir -p "$LAYOUT_TRIM_RUNTIME_DIR"
cp -r "$LAYOUT_DIR/." "$LAYOUT_TRIM_RUNTIME_DIR/"
pushd "$LAYOUT_TRIM_RUNTIME_DIR/bin" > /dev/null
cat "$LAYOUT_TRIMS_DIR/runnerdotnetruntimeassets" | xargs rm -f
echo "Created... $LAYOUT_TRIM_RUNTIME_DIR"
popd > /dev/null
heading "Create layout with externals and dotnet runtime trimmed ..."
mkdir -p "$LAYOUT_TRIM_RUNTIME_EXTERNALS_DIR"
cp -r "$LAYOUT_DIR/." "$LAYOUT_TRIM_RUNTIME_EXTERNALS_DIR/"
rm -Rf "$LAYOUT_TRIM_RUNTIME_EXTERNALS_DIR/externals"
pushd "$LAYOUT_TRIM_RUNTIME_EXTERNALS_DIR/bin" > /dev/null
cat "$LAYOUT_TRIMS_DIR/runnerdotnetruntimeassets" | xargs rm -f
echo "Created... $LAYOUT_TRIM_RUNTIME_EXTERNALS_DIR"
popd > /dev/null
}
function runtest ()
@@ -226,9 +179,7 @@ function package ()
find "${LAYOUT_DIR}/bin" -type f -name '*.pdb' -delete find "${LAYOUT_DIR}/bin" -type f -name '*.pdb' -delete
mkdir -p "$PACKAGE_DIR" mkdir -p "$PACKAGE_DIR"
mkdir -p "$PACKAGE_TRIMS_DIR"
rm -Rf "${PACKAGE_DIR:?}"/* rm -Rf "${PACKAGE_DIR:?}"/*
rm -Rf "${PACKAGE_TRIMS_DIR:?}"/*
pushd "$PACKAGE_DIR" > /dev/null pushd "$PACKAGE_DIR" > /dev/null
@@ -246,66 +197,6 @@ function package ()
fi
popd > /dev/null
runner_trim_externals_pkg_name="actions-runner-${RUNTIME_ID}-${runner_ver}-noexternals"
heading "Packaging ${runner_trim_externals_pkg_name} (Trimmed)"
PACKAGE_TRIM_EXTERNALS_DIR="$PACKAGE_TRIMS_DIR/trim_externals"
mkdir -p "$PACKAGE_TRIM_EXTERNALS_DIR"
pushd "$PACKAGE_TRIM_EXTERNALS_DIR" > /dev/null
if [[ ("$CURRENT_PLATFORM" == "linux") || ("$CURRENT_PLATFORM" == "darwin") ]]; then
tar_name="${runner_trim_externals_pkg_name}.tar.gz"
echo "Creating $tar_name in ${LAYOUT_TRIM_EXTERNALS_DIR}"
tar -czf "${tar_name}" -C "${LAYOUT_TRIM_EXTERNALS_DIR}" .
elif [[ ("$CURRENT_PLATFORM" == "windows") ]]; then
zip_name="${runner_trim_externals_pkg_name}.zip"
echo "Convert ${LAYOUT_TRIM_EXTERNALS_DIR} to Windows style path"
window_path=${LAYOUT_TRIM_EXTERNALS_DIR:1}
window_path=${window_path:0:1}:${window_path:1}
echo "Creating $zip_name in ${window_path}"
$POWERSHELL -NoLogo -Sta -NoProfile -NonInteractive -ExecutionPolicy Unrestricted -Command "Add-Type -Assembly \"System.IO.Compression.FileSystem\"; [System.IO.Compression.ZipFile]::CreateFromDirectory(\"${window_path}\", \"${zip_name}\")"
fi
popd > /dev/null
runner_trim_runtime_pkg_name="actions-runner-${RUNTIME_ID}-${runner_ver}-noruntime"
heading "Packaging ${runner_trim_runtime_pkg_name} (Trimmed)"
PACKAGE_TRIM_RUNTIME_DIR="$PACKAGE_TRIMS_DIR/trim_runtime"
mkdir -p "$PACKAGE_TRIM_RUNTIME_DIR"
pushd "$PACKAGE_TRIM_RUNTIME_DIR" > /dev/null
if [[ ("$CURRENT_PLATFORM" == "linux") || ("$CURRENT_PLATFORM" == "darwin") ]]; then
tar_name="${runner_trim_runtime_pkg_name}.tar.gz"
echo "Creating $tar_name in ${LAYOUT_TRIM_RUNTIME_DIR}"
tar -czf "${tar_name}" -C "${LAYOUT_TRIM_RUNTIME_DIR}" .
elif [[ ("$CURRENT_PLATFORM" == "windows") ]]; then
zip_name="${runner_trim_runtime_pkg_name}.zip"
echo "Convert ${LAYOUT_TRIM_RUNTIME_DIR} to Windows style path"
window_path=${LAYOUT_TRIM_RUNTIME_DIR:1}
window_path=${window_path:0:1}:${window_path:1}
echo "Creating $zip_name in ${window_path}"
$POWERSHELL -NoLogo -Sta -NoProfile -NonInteractive -ExecutionPolicy Unrestricted -Command "Add-Type -Assembly \"System.IO.Compression.FileSystem\"; [System.IO.Compression.ZipFile]::CreateFromDirectory(\"${window_path}\", \"${zip_name}\")"
fi
popd > /dev/null
runner_trim_runtime_externals_pkg_name="actions-runner-${RUNTIME_ID}-${runner_ver}-noruntime-noexternals"
heading "Packaging ${runner_trim_runtime_externals_pkg_name} (Trimmed)"
PACKAGE_TRIM_RUNTIME_EXTERNALS_DIR="$PACKAGE_TRIMS_DIR/trim_runtime_externals"
mkdir -p "$PACKAGE_TRIM_RUNTIME_EXTERNALS_DIR"
pushd "$PACKAGE_TRIM_RUNTIME_EXTERNALS_DIR" > /dev/null
if [[ ("$CURRENT_PLATFORM" == "linux") || ("$CURRENT_PLATFORM" == "darwin") ]]; then
tar_name="${runner_trim_runtime_externals_pkg_name}.tar.gz"
echo "Creating $tar_name in ${LAYOUT_TRIM_RUNTIME_EXTERNALS_DIR}"
tar -czf "${tar_name}" -C "${LAYOUT_TRIM_RUNTIME_EXTERNALS_DIR}" .
elif [[ ("$CURRENT_PLATFORM" == "windows") ]]; then
zip_name="${runner_trim_runtime_externals_pkg_name}.zip"
echo "Convert ${LAYOUT_TRIM_RUNTIME_EXTERNALS_DIR} to Windows style path"
window_path=${LAYOUT_TRIM_RUNTIME_EXTERNALS_DIR:1}
window_path=${window_path:0:1}:${window_path:1}
echo "Creating $zip_name in ${window_path}"
$POWERSHELL -NoLogo -Sta -NoProfile -NonInteractive -ExecutionPolicy Unrestricted -Command "Add-Type -Assembly \"System.IO.Compression.FileSystem\"; [System.IO.Compression.ZipFile]::CreateFromDirectory(\"${window_path}\", \"${zip_name}\")"
fi
popd > /dev/null
}
if [[ (! -d "${DOTNETSDK_INSTALLDIR}") || (! -e "${DOTNETSDK_INSTALLDIR}/.${DOTNETSDK_VERSION}") || (! -e "${DOTNETSDK_INSTALLDIR}/dotnet") ]]; then

View File

@@ -1,5 +1,5 @@
{
"sdk": {
"version": "6.0.415"
"version": "6.0.418"
}
}

View File

@@ -1 +1 @@
2.311.0
2.313.0