Compare commits


64 Commits

Author SHA1 Message Date
eric sciple
18803bdff6 Preserve dates when deserializing job message from Run Service (#3269)
* Preserve dates when deserializing job message from Run Service

* Preserve dates when deserializing job message from "Actions Run Service"
2024-05-02 10:44:57 -04:00
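The entry above concerns keeping date values intact when a job message is round-tripped through JSON. As a hedged illustration only (the property name and the approach are assumptions, not necessarily the runner's actual fix), Newtonsoft.Json can rewrite date strings during parsing unless automatic date parsing is disabled:

```csharp
// Hedged sketch: DateParseHandling.None keeps date strings exactly as received,
// instead of converting them to DateTime (which can lose offset/precision).
// "startTime" is an illustrative property name, not taken from the job message contract.
using System;
using System.IO;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

class PreserveDatesExample
{
    static void Main()
    {
        var json = "{\"startTime\":\"2024-05-02T10:44:57.1234567-04:00\"}";

        using var reader = new JsonTextReader(new StringReader(json))
        {
            DateParseHandling = DateParseHandling.None // do not interpret date strings while parsing
        };
        var message = JToken.ReadFrom(reader);

        // Prints the original string with its offset and precision preserved.
        Console.WriteLine(message["startTime"]);
    }
}
```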
Francesco Renzi
04b07b6675 Prepare v2.316.0 release (#3252) 2024-04-23 16:46:17 +01:00
eric sciple
dd9fcfc5b2 Replace invalid file name chars in diag log name (#3249) 2024-04-20 11:37:25 -05:00
Tingluo Huang
5107c5efb2 Cleanup enabled feature flags. (#3248) 2024-04-19 15:31:44 -04:00
Yang Cao
1b61d78c07 Relax the condition to stop uploading to Results (#3230) 2024-04-17 13:55:03 +00:00
Tingluo Huang
2e0eb2c11f Cleanup enabled feature flags. (#3246)
* Cleanup FF 'DistributedTask.UseWhich2'.

* more ff.
2024-04-16 16:52:10 -04:00
github-actions[bot]
2d83e1d88f Upgrade dotnet sdk to v6.0.421 (#3244)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2024-04-14 22:47:26 -04:00
Aiqiao Yan
4a1e38095b backoff if we retried polling for more than 50 times in less than 30m… (#3232)
* backoff if we retried polling more than 50 times in less than 30 minutes

* run dotnet format

* move delay to after no message trace

* run dotnet format
2024-04-09 14:13:07 -04:00
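A minimal sketch of the backoff condition described in the entry above, with illustrative names and delay values (not the listener's actual implementation): if more than 50 polls have happened within the last 30 minutes, wait before polling again.

```csharp
// Illustrative sketch only; the 30-minute window and 50-poll threshold come from the
// commit message, but the class, method name, and delay duration are assumptions.
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

class PollingBackoff
{
    private readonly Queue<DateTime> _recentPolls = new();

    public async Task DelayIfNeededAsync()
    {
        var now = DateTime.UtcNow;
        _recentPolls.Enqueue(now);

        // Drop polls that fall outside the 30-minute window.
        while (_recentPolls.Count > 0 && now - _recentPolls.Peek() > TimeSpan.FromMinutes(30))
        {
            _recentPolls.Dequeue();
        }

        // More than 50 polls inside the window: back off before the next attempt.
        if (_recentPolls.Count > 50)
        {
            await Task.Delay(TimeSpan.FromSeconds(30));
        }
    }
}
```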
eeSquared
f467e9e125 Add new SessionConflict return code (#3215)
* Add new SessionConflict return code

* formatting

* Change return type of CreateSessionAsync to new enum

* Update entry scripts to handle new exit code

* Move enum
2024-03-27 18:49:58 +00:00
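To illustrate the pattern described in the entry above (the member names and exit-code values here are assumptions, not the runner's actual ones): session creation returns an enum, and the listener maps it to a distinct exit code that the entry scripts can react to.

```csharp
// Illustrative sketch only; enum members and exit codes are assumptions.
using System;
using System.Threading.Tasks;

enum CreateSessionResult
{
    Success,
    Failure,
    SessionConflict, // another runner already holds a session with the same name
}

static class ListenerExit
{
    public static async Task<int> RunAsync(Func<Task<CreateSessionResult>> createSessionAsync)
    {
        var result = await createSessionAsync();
        return result switch
        {
            CreateSessionResult.Success => 0,
            CreateSessionResult.SessionConflict => 5, // entry script can treat this code specially
            _ => 1,
        };
    }
}
```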
Tingluo Huang
77e0bfbb8a Load '_runnerSettings' in the early point of JobRunner.cs (#3218) 2024-03-23 23:12:39 -04:00
Luke Tomlinson
a52c53955c Prepare v2.315.0 release (#3216) 2024-03-22 11:38:53 -04:00
Luke Tomlinson
8ebf298bcd Always Delete Actions Service Session (#3214)
* Delete Actions Service session always

* update tests
2024-03-21 16:30:34 -04:00
Jacob Wallraff
4b85145661 Handle new non-retryable exception type (#3191)
* Handle new non-retryable exception type

* Update ActionManager.cs
2024-03-21 18:50:45 +00:00
Nikola Jokic
bc8b6e0152 Bump docker version and docker buildx version (#3208) 2024-03-20 11:16:41 -04:00
github-actions[bot]
82e01c6173 Upgrade dotnet sdk to v6.0.420 (#3211)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2024-03-17 22:42:44 -04:00
Nikola Jokic
93bc1cd918 Bump hook version to 0.6.0 (#3203) 2024-03-15 11:13:29 -04:00
Tatyana Kostromskaya
692d910868 Add ability to enforce actions to run on node20 (#3192)
Add options to enforce that actions execute on node20
2024-03-14 14:12:08 +01:00
Patrick Carnahan
2c8c941622 consume new pipelines service url in handlers (#3185)
* consume pipelines service url if present

Updates how the `ACTIONS_RUNTIME_URL` variable is set to use a new value, `PipelinesServiceUrl`, if present in the endpoint. If this value is not present, the existing system connection endpoint is used to retain backward compatibility.

* consume pipelines url
2024-03-05 11:13:16 -05:00
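A hedged sketch of the fallback described in the entry above; the endpoint shape and the `PipelinesServiceUrl` lookup below are assumptions about where the value lives, not the runner's actual code.

```csharp
// Illustrative only: prefer PipelinesServiceUrl from the endpoint data when present,
// otherwise fall back to the system connection URL for backward compatibility.
using System.Collections.Generic;

static class ActionsRuntimeUrl
{
    public static string Resolve(IDictionary<string, string> endpointData, string systemConnectionUrl)
    {
        if (endpointData != null &&
            endpointData.TryGetValue("PipelinesServiceUrl", out var pipelinesUrl) &&
            !string.IsNullOrEmpty(pipelinesUrl))
        {
            return pipelinesUrl;
        }

        return systemConnectionUrl;
    }
}
```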
Nikola Jokic
86d6211c75 Remove -f flag in wait when manually trap signal (#3182)
* Remove -f flag in wait when manually trap signal

* Remove extra empty line
2024-03-04 11:32:21 +01:00
Yashwanth Anantharaju
aa90563cae don't crash listener on getting job exceptions (#3177) 2024-02-29 15:39:29 +00:00
Tingluo Huang
4cb3cb2962 Bump runner version to match the latest patch release (#3175) 2024-02-28 20:08:31 +00:00
Ryan Troost
d7777fd632 fix summaries for actions results (#3174)
* fix summaries for actions results

* remove negative
2024-02-27 15:22:26 -05:00
Luke Tomlinson
d8bce88c4f Prepare v2.314.0 release (#3172)
* Prepare v2.314.0 release

* update releaseNoteMd
2024-02-26 13:19:04 -05:00
Tingluo Huang
601d3de3f3 Better step timeout message. (#3166) 2024-02-26 15:37:59 +00:00
Luke Tomlinson
034c51cd0b Refresh Token for BrokerServer (#3167) 2024-02-26 10:05:41 -05:00
Luke Tomlinson
d296014f99 Remove USE_BROKER_FLOW (#3162) 2024-02-21 22:39:26 +00:00
Luke Tomlinson
3449d5fa52 Broker fixes for token refreshes and AccessDeniedException (#3161) 2024-02-21 16:43:01 -05:00
Enes Çakır
6603bfb74c Add a retry logic to docker login operation (#3089)
While there's an existing retry mechanism for the `docker pull` command
[^1], it's missing for `docker login`.

Similar to the `docker pull` scenario, the container registry could
potentially be briefly unavailable or inaccessible, leading to failed
`docker login` attempts and subsequent workflow run failures.

Since it's a container-based workflow, there is no way to retry on the
customer's side. The runner should retry itself.

It also aligns with community feedback [^2].

[^1]: 8e0cd36cd8/src/Runner.Worker/ContainerOperationProvider.cs (L201)
[^2]: https://github.com/orgs/community/discussions/73069

Co-authored-by: Thomas Boop <52323235+thboop@users.noreply.github.com>
2024-02-21 17:32:37 +00:00
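For illustration, a hedged sketch of the retry behaviour the entry above describes; the attempt count, delay, and process invocation are assumptions, not the runner's real values.

```csharp
// Illustrative sketch: retry `docker login` a few times in case the registry is briefly unavailable.
using System;
using System.Diagnostics;
using System.Threading.Tasks;

static class DockerLoginRetry
{
    public static async Task<bool> LoginWithRetryAsync(string registry, string user, string password, int maxAttempts = 3)
    {
        for (int attempt = 1; attempt <= maxAttempts; attempt++)
        {
            var psi = new ProcessStartInfo("docker", $"login {registry} -u {user} --password-stdin")
            {
                RedirectStandardInput = true
            };
            using var process = Process.Start(psi);
            await process.StandardInput.WriteLineAsync(password);
            process.StandardInput.Close();
            process.WaitForExit();

            if (process.ExitCode == 0)
            {
                return true; // login succeeded
            }

            // Registry may be briefly unavailable; wait before the next attempt.
            if (attempt < maxAttempts)
            {
                await Task.Delay(TimeSpan.FromSeconds(10));
            }
        }

        return false; // all attempts failed
    }
}
```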
Yashwanth Anantharaju
b19b9462d8 handle broker run service exception handling (#3163)
* handle run service exception handling

* force fail always

* format

* format
2024-02-21 17:04:13 +00:00
github-actions[bot]
3db5c90cc4 Upgrade dotnet sdk to v6.0.419 (#3158)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2024-02-19 00:34:02 -05:00
David Omid
927b26a364 Process snapshot tokens (#3135)
* Added Snapshot TemplateToken to AgentJobRequestMessage

* WIP for processing the snapshot token

* Changed snapshot post job step condition to Success, added comments

* Refactored snapshot post-job step

* Added evaluation of snapshot token to retrieve image name

* Added snapshot to workflow schema

* Fixed linter error

* Migrated snapshot logic to new SnapshotOperationProvider

* Fixed linter error

* Fixed linter errors

* Fixed linter error

* Fixed linter errors

* Updated L0 tests

* Fixed linter errors

* Added new JobExtensionL0 tests for snapshot post-job step

* Added JobExtensionL0 test case for snapshot mappings

* Added SnapshotOperationProviderL0 tests

* Enabled nullable types for SnapshotOperationProvider and its tests

* Added more assertions to SnapshotOperationProviderL0 tests

* Fixed linter errors

* Made sure TestHostContexts are disposed of properly in SnapshotOperationProviderL0 tests

* Resolved PR comments

* Fixed formatting

* Removed redundant reference

* Addressed PR comments
2024-02-14 21:34:52 -05:00
Tingluo Huang
72559572f6 Pass RunnerOS during job acquire. (#3140) 2024-02-09 15:19:52 -05:00
Luke Tomlinson
31318d81ba Prepare v2.313.0 Release (#3137)
* update runnerversion

* update releaseNote.md

* update-releasenote
2024-02-07 15:15:32 -05:00
Nikola Jokic
1d47bfa6c7 Bump hook version to 0.5.1 (#3129)
Co-authored-by: Tingluo Huang <tingluohuang@github.com>
2024-02-07 19:55:33 +00:00
Luke Tomlinson
651ea42e00 Handle ForceTokenRefresh message (#3133)
* Handle ForceTokenRefresh message

* move to constants

* format
2024-02-07 11:24:40 -05:00
Lukas Hauser
bcc665a7a1 Fix CVE-2024-21626 (#3123)
Co-authored-by: Ferenc Hammerl <31069338+fhammerl@users.noreply.github.com>
2024-02-07 17:06:57 +01:00
Patrick Ellis
cd812f0395 Add sshd to .devcontainer.json (#3079)
This is required in order to ssh into a codespace for the repo:

> error getting ssh server details: failed to start SSH server:
> Please check if an SSH server is installed in the container.
> If the docker image being used for the codespace does not have an SSH server,
> install it in your Dockerfile or, for codespaces that use Debian-based images,
> you can add the following to your devcontainer.json:
> 
> "features": {
>     "ghcr.io/devcontainers/features/sshd:1": {
>         "version": "latest"
>     }
> }
2024-02-06 15:06:07 +00:00
Josh Soref
fa874cf314 Improve error report for invalid action.yml (#3106) 2024-02-06 13:40:53 +00:00
Bethany
bf0e76631b Specify Content-Type for BlockBlob upload (#3119)
* add content-type to block blob upload

* Add content-type for sdk path

* fix spacing

* merge headers and only when file extension is .txt

* add conditions

* tweak conditions and path matching

* pass in headers

* add content-type for appendblob
2024-02-05 18:19:02 +00:00
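A hedged sketch of the behaviour the entry above describes, using the Azure Blob SDK (`Azure.Storage.Blobs`); the `.txt`-only condition comes from the commit message, while the method shape is an assumption.

```csharp
// Illustrative only: attach a text/plain Content-Type header when uploading .txt files.
using System;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

static class BlobUploadWithContentType
{
    public static async Task UploadAsync(BlobClient blob, string filePath)
    {
        var options = new BlobUploadOptions();

        // Only .txt files get an explicit Content-Type; other files keep the default.
        if (string.Equals(Path.GetExtension(filePath), ".txt", StringComparison.OrdinalIgnoreCase))
        {
            options.HttpHeaders = new BlobHttpHeaders { ContentType = "text/plain" };
        }

        await using var stream = File.OpenRead(filePath);
        await blob.UploadAsync(stream, options);
    }
}
```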
Yang Cao
1d82031a2c Make sure to drain the upload queue before clean temp directory (#3125) 2024-02-02 20:37:45 +00:00
Victor Sollerhed
d1a619ff09 Upgrade docker from 24.0.8 to 24.0.9 (#3126)
Release notes:
- https://docs.docker.com/engine/release-notes/24.0/#2409
2024-02-02 09:46:12 -05:00
Jonathan Tamsut
11680fc78f Upload the diagnostic logs to the Results Service (#3114)
* upload diagnostic logs to results

* correct casing

* correct typo

* update contract

* update constant and define private method

* fix access

* fix reference to logs url

* update

* use results func

* fix method naming

* rename to match rpc

* change API contract

* fix lint issue

* refactor

* remove unused method

* create new log type
2024-02-01 14:27:06 -05:00
Victor Sollerhed
3e5433ec86 Upgrade docker from 24.0.7 to 24.0.8 (#3124)
Among other things, this fixes the high-severity `CVE-2024-21626`:
- https://github.com/advisories/GHSA-xr7r-f8xq-vfvv

Release notes:
- https://docs.docker.com/engine/release-notes/24.0/#2408
2024-02-01 11:10:46 -05:00
Tingluo Huang
b647b890c5 Only keep 1 older version runner around after self-upgrade. (#3122) 2024-01-31 10:03:49 -05:00
Luke Tomlinson
894c50073a Implement Broker Redirects for Session and Messages (#3103) 2024-01-30 20:57:49 +00:00
Tingluo Huang
5268d74ade Fix JobDispatcher crash during force cancellation. (#3118) 2024-01-30 15:24:22 -05:00
Tingluo Huang
7414e08fbd Add user-agent to all http clients using RawClientHttpRequestSettings. (#3115) 2024-01-30 13:39:41 -05:00
Tingluo Huang
dcb790f780 Fix release workflow. (#3102) 2024-01-30 13:25:58 -05:00
Tingluo Huang
b7ab810945 Make embedded timeline record have the same order as its parent record. (#3113) 2024-01-26 17:26:30 -05:00
Tingluo Huang
7310ba0a08 Revert "Bump container hook version to 0.5.0 in runner image (#3003)" (#3101)
This reverts commit c7d65c42d6.
2024-01-18 11:50:36 -05:00
Diogo Torres
e842959e3e Bump docker and buildx to the latest version (#3100) 2024-01-18 10:42:44 -05:00
Tingluo Huang
9f19310b5b Prepare 2.312.0 runner release. (#3092) 2024-01-16 16:44:30 -05:00
Thomas Boop
84220a21d1 Patch Curl to no longer use -k (#3091)
* Update externals.sh

* Update externals.sh
2024-01-16 13:39:21 -05:00
github-actions[bot]
8e0cd36cd8 Upgrade dotnet sdk to v6.0.418 (#3085)
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2024-01-10 12:48:40 -05:00
Tingluo Huang
f1f18f67e1 Remove runner trimmed packages. (#3074) 2024-01-10 12:32:17 -05:00
Yang Cao
ac39c4bd0a Use Azure SDK to upload files to Azure Blob (#3033) 2024-01-08 16:13:06 -05:00
Tingluo Huang
3f3d9b0d99 Extend --check to check Results-Receiver service. (#3078)
* Extend --check to check Results-Receiver service.

* Update docs/checks/actions.md

Co-authored-by: Christopher Schleiden <cschleiden@live.de>

---------

Co-authored-by: Christopher Schleiden <cschleiden@live.de>
2024-01-05 14:18:59 -05:00
adjn
af485fb660 Update envlinux.md (#3040)
Matching supported distros to https://docs.github.com/en/actions/hosting-your-own-runners/managing-self-hosted-runners/about-self-hosted-runners
2024-01-02 11:09:10 -05:00
Stefan Ruvceski
9e3e57ff90 close reason update (#3027)
* Update close-bugs-bot.yml

* Update close-features-bot.yml
2023-12-06 17:12:55 +01:00
Tingluo Huang
ac89b31d2f Add codeload to the list of service we check during '--check'. (#3011) 2023-11-28 13:39:39 -05:00
Tingluo Huang
65201ff6be Include whether http proxy configured as part of UserAgent. (#3009) 2023-11-28 09:43:13 -05:00
Tingluo Huang
661b261959 Mark job as failed on worker crash. (#3006) 2023-11-27 16:43:36 -05:00
Hidetake Iwata
8a25302ba3 Set ImageOS environment variable in runner images (#2878)
Co-authored-by: Nikola Jokic <jokicnikola07@gmail.com>
2023-11-23 15:04:54 +01:00
Nikola Jokic
c7d65c42d6 Bump container hook version to 0.5.0 in runner image (#3003) 2023-11-22 14:52:08 +01:00
105 changed files with 2090 additions and 3295 deletions

View File

@@ -4,10 +4,13 @@
   "features": {
     "ghcr.io/devcontainers/features/docker-in-docker:1": {},
     "ghcr.io/devcontainers/features/dotnet": {
-      "version": "6.0.415"
+      "version": "6.0.421"
     },
     "ghcr.io/devcontainers/features/node:1": {
       "version": "16"
+    },
+    "ghcr.io/devcontainers/features/sshd:1": {
+      "version": "latest"
     }
   },
   "customizations": {

View File

@@ -58,29 +58,6 @@ jobs:
         ${{ matrix.devScript }} layout Release ${{ matrix.runtime }}
       working-directory: src
-    # Check runtime/externals hash
-    - name: Compute/Compare runtime and externals Hash
-      shell: bash
-      run: |
-        echo "Current dotnet runtime hash result: $DOTNET_RUNTIME_HASH"
-        echo "Current Externals hash result: $EXTERNALS_HASH"
-        NeedUpdate=0
-        if [ "$EXTERNALS_HASH" != "$(cat ./src/Misc/contentHash/externals/${{ matrix.runtime }})" ] ;then
-          echo Hash mismatch, Update ./src/Misc/contentHash/externals/${{ matrix.runtime }} to $EXTERNALS_HASH
-          NeedUpdate=1
-        fi
-        if [ "$DOTNET_RUNTIME_HASH" != "$(cat ./src/Misc/contentHash/dotnetRuntime/${{ matrix.runtime }})" ] ;then
-          echo Hash mismatch, Update ./src/Misc/contentHash/dotnetRuntime/${{ matrix.runtime }} to $DOTNET_RUNTIME_HASH
-          NeedUpdate=1
-        fi
-        exit $NeedUpdate
-      env:
-        DOTNET_RUNTIME_HASH: ${{hashFiles('**/_layout_trims/runtime/**/*')}}
-        EXTERNALS_HASH: ${{hashFiles('**/_layout_trims/externals/**/*')}}
     # Run tests
     - name: L0
       run: |
@@ -103,6 +80,3 @@ jobs:
           name: runner-package-${{ matrix.runtime }}
           path: |
             _package
-            _package_trims/trim_externals
-            _package_trims/trim_runtime
-            _package_trims/trim_runtime_externals

View File

@@ -15,4 +15,3 @@ jobs:
         only-labels: "actions-bug"
         days-before-stale: 0
         days-before-close: 1
-        close-issue-reason: "completed"

View File

@@ -15,4 +15,3 @@ jobs:
         only-labels: "actions-feature"
         days-before-stale: 0
         days-before-close: 1
-        close-issue-reason: "completed"

View File

@@ -84,221 +84,20 @@ jobs:
git commit -a -m "Upgrade dotnet sdk to v${{ steps.fetch_latest_version.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}" git commit -a -m "Upgrade dotnet sdk to v${{ steps.fetch_latest_version.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}"
git push --set-upstream origin $branch_name git push --set-upstream origin $branch_name
build-hashes: create-pr:
if: ${{ needs.dotnet-update.outputs.SHOULD_UPDATE == 1 && needs.dotnet-update.outputs.BRANCH_EXISTS == 0 }}
needs: [dotnet-update] needs: [dotnet-update]
outputs: if: ${{ needs.dotnet-update.outputs.SHOULD_UPDATE == 1 && needs.dotnet-update.outputs.BRANCH_EXISTS == 0 }}
# pass outputs from this job to create-pr for use runs-on: ubuntu-latest
DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION: ${{ needs.dotnet-update.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}
DOTNET_CURRENT_MAJOR_MINOR_VERSION: ${{ needs.dotnet-update.outputs.DOTNET_CURRENT_MAJOR_MINOR_VERSION }}
NEEDS_HASH_UPDATE: ${{ steps.compute-hash.outputs.NEED_UPDATE }}
strategy:
fail-fast: false
matrix:
runtime: [ linux-x64, linux-arm64, linux-arm, win-x64, win-arm64, osx-x64, osx-arm64 ]
include:
- runtime: linux-x64
os: ubuntu-latest
devScript: ./dev.sh
- runtime: linux-arm64
os: ubuntu-latest
devScript: ./dev.sh
- runtime: linux-arm
os: ubuntu-latest
devScript: ./dev.sh
- runtime: osx-x64
os: macOS-latest
devScript: ./dev.sh
- runtime: osx-arm64
os: macOS-latest
devScript: ./dev.sh
- runtime: win-x64
os: windows-2019
devScript: ./dev
- runtime: win-arm64
os: windows-latest
devScript: ./dev
runs-on: ${{ matrix.os }}
steps: steps:
- uses: actions/checkout@v3 - uses: actions/checkout@v3
with: with:
ref: feature/dotnetsdk-upgrade/${{ needs.dotnet-update.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }} ref: feature/dotnetsdk-upgrade/${{ needs.dotnet-update.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}
# Build runner layout
- name: Build & Layout Release
run: |
${{ matrix.devScript }} layout Release ${{ matrix.runtime }}
working-directory: src
# Check runtime/externals hash
- name: Compute/Compare runtime and externals Hash
id: compute-hash
continue-on-error: true
shell: bash
run: |
echo "Current dotnet runtime hash result: $DOTNET_RUNTIME_HASH"
echo "Current Externals hash result: $EXTERNALS_HASH"
NeedUpdate=0
if [ "$EXTERNALS_HASH" != "$(cat ./src/Misc/contentHash/externals/${{ matrix.runtime }})" ] ;then
echo Hash mismatch, Update ./src/Misc/contentHash/externals/${{ matrix.runtime }} to $EXTERNALS_HASH
echo "EXTERNAL_HASH=$EXTERNALS_HASH" >> $GITHUB_OUTPUT
NeedUpdate=1
fi
if [ "$DOTNET_RUNTIME_HASH" != "$(cat ./src/Misc/contentHash/dotnetRuntime/${{ matrix.runtime }})" ] ;then
echo Hash mismatch, Update ./src/Misc/contentHash/dotnetRuntime/${{ matrix.runtime }} to $DOTNET_RUNTIME_HASH
echo "DOTNET_RUNTIME_HASH=$DOTNET_RUNTIME_HASH" >> $GITHUB_OUTPUT
NeedUpdate=1
fi
echo "NEED_UPDATE=$NeedUpdate" >> $GITHUB_OUTPUT
env:
DOTNET_RUNTIME_HASH: ${{hashFiles('**/_layout_trims/runtime/**/*')}}
EXTERNALS_HASH: ${{hashFiles('**/_layout_trims/externals/**/*')}}
- name: update hash
if: ${{ steps.compute-hash.outputs.NEED_UPDATE == 1 }}
shell: bash
run: |
ExternalHash=${{ steps.compute-hash.outputs.EXTERNAL_HASH }}
DotNetRuntimeHash=${{ steps.compute-hash.outputs.DOTNET_RUNTIME_HASH }}
if [ -n "$ExternalHash" ]; then
echo "$ExternalHash" > ./src/Misc/contentHash/externals/${{ matrix.runtime }}
fi
if [ -n "$DotNetRuntimeHash" ]; then
echo "$DotNetRuntimeHash" > ./src/Misc/contentHash/dotnetRuntime/${{ matrix.runtime }}
fi
- name: cache updated hashes
if: ${{ steps.compute-hash.outputs.NEED_UPDATE == 1 }}
uses: actions/cache/save@v3
with:
enableCrossOsArchive: true
path: |
./src/Misc/contentHash/externals/${{ matrix.runtime }}
./src/Misc/contentHash/dotnetRuntime/${{ matrix.runtime }}
key: compute-hashes-${{ matrix.runtime }}-${{ github.run_id }}-${{ github.run_number }}-${{ github.run_attempt }}
hash-update:
needs: [build-hashes]
if: ${{ needs.build-hashes.outputs.NEEDS_HASH_UPDATE == 1 }}
outputs:
# pass outputs from this job to create-pr for use
DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION: ${{ needs.build-hashes.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}
DOTNET_CURRENT_MAJOR_MINOR_VERSION: ${{ needs.build-hashes.outputs.DOTNET_CURRENT_MAJOR_MINOR_VERSION }}
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
with:
ref: feature/dotnetsdk-upgrade/${{ needs.build-hashes.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}
- name: Restore cached hashes - linux-x64
id: cache-restore-linux-x64
uses: actions/cache/restore@v3
with:
enableCrossOsArchive: true
path: |
./src/Misc/contentHash/externals/linux-x64
./src/Misc/contentHash/dotnetRuntime/linux-x64
key: compute-hashes-linux-x64-${{ github.run_id }}-${{ github.run_number }}-${{ github.run_attempt }}
- name: Restore cached hashes - linux-arm64
id: cache-restore-linux-arm64
uses: actions/cache/restore@v3
with:
enableCrossOsArchive: true
path: |
./src/Misc/contentHash/externals/linux-arm64
./src/Misc/contentHash/dotnetRuntime/linux-arm64
key: compute-hashes-linux-arm64-${{ github.run_id }}-${{ github.run_number }}-${{ github.run_attempt }}
- name: Restore cached hashes - linux-arm
id: cache-restore-linux-arm
uses: actions/cache/restore@v3
with:
enableCrossOsArchive: true
path: |
./src/Misc/contentHash/externals/linux-arm
./src/Misc/contentHash/dotnetRuntime/linux-arm
key: compute-hashes-linux-arm-${{ github.run_id }}-${{ github.run_number }}-${{ github.run_attempt }}
- name: Restore cached hashes - osx-x64
id: cache-restore-osx-x64
uses: actions/cache/restore@v3
with:
enableCrossOsArchive: true
path: |
./src/Misc/contentHash/externals/osx-x64
./src/Misc/contentHash/dotnetRuntime/osx-x64
key: compute-hashes-osx-x64-${{ github.run_id }}-${{ github.run_number }}-${{ github.run_attempt }}
- name: Restore cached hashes - osx-arm64
id: cache-restore-osx-arm64
uses: actions/cache/restore@v3
with:
enableCrossOsArchive: true
path: |
./src/Misc/contentHash/externals/osx-arm64
./src/Misc/contentHash/dotnetRuntime/osx-arm64
key: compute-hashes-osx-arm64-${{ github.run_id }}-${{ github.run_number }}-${{ github.run_attempt }}
- name: Restore cached hashes - win-x64
id: cache-restore-win-x64
uses: actions/cache/restore@v3
with:
enableCrossOsArchive: true
path: |
./src/Misc/contentHash/externals/win-x64
./src/Misc/contentHash/dotnetRuntime/win-x64
key: compute-hashes-win-x64-${{ github.run_id }}-${{ github.run_number }}-${{ github.run_attempt }}
- name: Restore cached hashes - win-arm64
id: cache-restore-win-arm64
uses: actions/cache/restore@v3
with:
enableCrossOsArchive: true
path: |
./src/Misc/contentHash/externals/win-arm64
./src/Misc/contentHash/dotnetRuntime/win-arm64
key: compute-hashes-win-arm64-${{ github.run_id }}-${{ github.run_number }}-${{ github.run_attempt }}
- name: Fetch cached computed hashes
if: steps.cache-restore-linux-x64.outputs.cache-hit == 'true' ||
steps.cache-restore-linux-arm64.outputs.cache-hit == 'true' ||
steps.cache-restore-linux-arm.outputs.cache-hit == 'true' ||
steps.cache-restore-win-x64.outputs.cache-hit == 'true' ||
steps.cache-restore-win-arm64.outputs.cache-hit == 'true' ||
steps.cache-restore-osx-x64.outputs.cache-hit == 'true' ||
steps.cache-restore-osx-arm64.outputs.cache-hit == 'true'
shell: bash
run: |
Environments=( "linux-x64" "linux-arm64" "linux-arm" "win-x64" "win-arm64" "osx-x64" "osx-arm64" )
git config --global user.name "github-actions[bot]"
git config --global user.email "41898282+github-actions[bot]@users.noreply.github.com"
git commit -a -m "Update computed hashes"
git push --set-upstream origin feature/dotnetsdk-upgrade/${{ needs.build-hashes.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}
create-pr:
needs: [hash-update]
outputs:
# pass outputs from this job to run-tests for use
DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION: ${{ needs.hash-update.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}
DOTNET_CURRENT_MAJOR_MINOR_VERSION: ${{ needs.hash-update.outputs.DOTNET_CURRENT_MAJOR_MINOR_VERSION }}
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
with:
ref: feature/dotnetsdk-upgrade/${{ needs.hash-update.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}
- name: Create Pull Request - name: Create Pull Request
env: env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: | run: |
gh pr create -B main -H feature/dotnetsdk-upgrade/${{ needs.hash-update.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }} --title "Update dotnet sdk to latest version @${{ needs.hash-update.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}" --body " gh pr create -B main -H feature/dotnetsdk-upgrade/${{ needs.dotnet-update.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }} --title "Update dotnet sdk to latest version @${{ needs.dotnet-update.outputs.DOTNET_LATEST_MAJOR_MINOR_PATCH_VERSION }}" --body "
https://dotnetcli.blob.core.windows.net/dotnet/Sdk/${{ needs.hash-update.outputs.DOTNET_CURRENT_MAJOR_MINOR_VERSION }}/latest.version https://dotnetcli.blob.core.windows.net/dotnet/Sdk/${{ needs.dotnet-update.outputs.DOTNET_CURRENT_MAJOR_MINOR_VERSION }}/latest.version
--- ---

View File

@@ -53,27 +53,6 @@ jobs:
win-arm64-sha: ${{ steps.sha.outputs.win-arm64-sha256 }} win-arm64-sha: ${{ steps.sha.outputs.win-arm64-sha256 }}
osx-x64-sha: ${{ steps.sha.outputs.osx-x64-sha256 }} osx-x64-sha: ${{ steps.sha.outputs.osx-x64-sha256 }}
osx-arm64-sha: ${{ steps.sha.outputs.osx-arm64-sha256 }} osx-arm64-sha: ${{ steps.sha.outputs.osx-arm64-sha256 }}
linux-x64-sha-noexternals: ${{ steps.sha_noexternals.outputs.linux-x64-sha256 }}
linux-arm64-sha-noexternals: ${{ steps.sha_noexternals.outputs.linux-arm64-sha256 }}
linux-arm-sha-noexternals: ${{ steps.sha_noexternals.outputs.linux-arm-sha256 }}
win-x64-sha-noexternals: ${{ steps.sha_noexternals.outputs.win-x64-sha256 }}
win-arm64-sha-noexternals: ${{ steps.sha_noexternals.outputs.win-arm64-sha256 }}
osx-x64-sha-noexternals: ${{ steps.sha_noexternals.outputs.osx-x64-sha256 }}
osx-arm64-sha-noexternals: ${{ steps.sha_noexternals.outputs.osx-arm64-sha256 }}
linux-x64-sha-noruntime: ${{ steps.sha_noruntime.outputs.linux-x64-sha256 }}
linux-arm64-sha-noruntime: ${{ steps.sha_noruntime.outputs.linux-arm64-sha256 }}
linux-arm-sha-noruntime: ${{ steps.sha_noruntime.outputs.linux-arm-sha256 }}
win-x64-sha-noruntime: ${{ steps.sha_noruntime.outputs.win-x64-sha256 }}
win-arm64-sha-noruntime: ${{ steps.sha_noruntime.outputs.win-arm64-sha256 }}
osx-x64-sha-noruntime: ${{ steps.sha_noruntime.outputs.osx-x64-sha256 }}
osx-arm64-sha-noruntime: ${{ steps.sha_noruntime.outputs.osx-arm64-sha256 }}
linux-x64-sha-noruntime-noexternals: ${{ steps.sha_noruntime_noexternals.outputs.linux-x64-sha256 }}
linux-arm64-sha-noruntime-noexternals: ${{ steps.sha_noruntime_noexternals.outputs.linux-arm64-sha256 }}
linux-arm-sha-noruntime-noexternals: ${{ steps.sha_noruntime_noexternals.outputs.linux-arm-sha256 }}
win-x64-sha-noruntime-noexternals: ${{ steps.sha_noruntime_noexternals.outputs.win-x64-sha256 }}
win-arm64-sha-noruntime-noexternals: ${{ steps.sha_noruntime_noexternals.outputs.win-arm64-sha256 }}
osx-x64-sha-noruntime-noexternals: ${{ steps.sha_noruntime_noexternals.outputs.osx-x64-sha256 }}
osx-arm64-sha-noruntime-noexternals: ${{ steps.sha_noruntime_noexternals.outputs.osx-arm64-sha256 }}
strategy: strategy:
matrix: matrix:
runtime: [ linux-x64, linux-arm64, linux-arm, win-x64, osx-x64, osx-arm64, win-arm64 ] runtime: [ linux-x64, linux-arm64, linux-arm, win-x64, osx-x64, osx-arm64, win-arm64 ]
@@ -136,76 +115,6 @@ jobs:
id: sha id: sha
name: Compute SHA256 name: Compute SHA256
working-directory: _package working-directory: _package
- run: |
file=$(ls)
sha=$(sha256sum $file | awk '{ print $1 }')
echo "Computed sha256: $sha for $file"
echo "${{matrix.runtime}}-sha256=$sha" >> $GITHUB_OUTPUT
echo "sha256=$sha" >> $GITHUB_OUTPUT
shell: bash
id: sha_noexternals
name: Compute SHA256
working-directory: _package_trims/trim_externals
- run: |
file=$(ls)
sha=$(sha256sum $file | awk '{ print $1 }')
echo "Computed sha256: $sha for $file"
echo "${{matrix.runtime}}-sha256=$sha" >> $GITHUB_OUTPUT
echo "sha256=$sha" >> $GITHUB_OUTPUT
shell: bash
id: sha_noruntime
name: Compute SHA256
working-directory: _package_trims/trim_runtime
- run: |
file=$(ls)
sha=$(sha256sum $file | awk '{ print $1 }')
echo "Computed sha256: $sha for $file"
echo "${{matrix.runtime}}-sha256=$sha" >> $GITHUB_OUTPUT
echo "sha256=$sha" >> $GITHUB_OUTPUT
shell: bash
id: sha_noruntime_noexternals
name: Compute SHA256
working-directory: _package_trims/trim_runtime_externals
- name: Create trimmedpackages.json for ${{ matrix.runtime }}
if: matrix.runtime == 'win-x64' || matrix.runtime == 'win-arm64'
uses: actions/github-script@0.3.0
with:
github-token: ${{secrets.GITHUB_TOKEN}}
script: |
const core = require('@actions/core')
const fs = require('fs');
const runnerVersion = fs.readFileSync('src/runnerversion', 'utf8').replace(/\n$/g, '')
var trimmedPackages = fs.readFileSync('src/Misc/trimmedpackages_zip.json', 'utf8').replace(/<RUNNER_VERSION>/g, runnerVersion).replace(/<RUNNER_PLATFORM>/g, '${{ matrix.runtime }}')
trimmedPackages = trimmedPackages.replace(/<RUNTIME_HASH>/g, '${{hashFiles('**/_layout_trims/runtime/**/*')}}')
trimmedPackages = trimmedPackages.replace(/<EXTERNALS_HASH>/g, '${{hashFiles('**/_layout_trims/externals/**/*')}}')
trimmedPackages = trimmedPackages.replace(/<NO_RUNTIME_EXTERNALS_HASH>/g, '${{steps.sha_noruntime_noexternals.outputs.sha256}}')
trimmedPackages = trimmedPackages.replace(/<NO_RUNTIME_HASH>/g, '${{steps.sha_noruntime.outputs.sha256}}')
trimmedPackages = trimmedPackages.replace(/<NO_EXTERNALS_HASH>/g, '${{steps.sha_noexternals.outputs.sha256}}')
console.log(trimmedPackages)
fs.writeFileSync('${{ matrix.runtime }}-trimmedpackages.json', trimmedPackages)
- name: Create trimmedpackages.json for ${{ matrix.runtime }}
if: matrix.runtime != 'win-x64' && matrix.runtime != 'win-arm64'
uses: actions/github-script@0.3.0
with:
github-token: ${{secrets.GITHUB_TOKEN}}
script: |
const core = require('@actions/core')
const fs = require('fs');
const runnerVersion = fs.readFileSync('src/runnerversion', 'utf8').replace(/\n$/g, '')
var trimmedPackages = fs.readFileSync('src/Misc/trimmedpackages_targz.json', 'utf8').replace(/<RUNNER_VERSION>/g, runnerVersion).replace(/<RUNNER_PLATFORM>/g, '${{ matrix.runtime }}')
trimmedPackages = trimmedPackages.replace(/<RUNTIME_HASH>/g, '${{hashFiles('**/_layout_trims/runtime/**/*')}}')
trimmedPackages = trimmedPackages.replace(/<EXTERNALS_HASH>/g, '${{hashFiles('**/_layout_trims/externals/**/*')}}')
trimmedPackages = trimmedPackages.replace(/<NO_RUNTIME_EXTERNALS_HASH>/g, '${{steps.sha_noruntime_noexternals.outputs.sha256}}')
trimmedPackages = trimmedPackages.replace(/<NO_RUNTIME_HASH>/g, '${{steps.sha_noruntime.outputs.sha256}}')
trimmedPackages = trimmedPackages.replace(/<NO_EXTERNALS_HASH>/g, '${{steps.sha_noexternals.outputs.sha256}}')
console.log(trimmedPackages)
fs.writeFileSync('${{ matrix.runtime }}-trimmedpackages.json', trimmedPackages)
# Upload runner package tar.gz/zip as artifact. # Upload runner package tar.gz/zip as artifact.
# Since each package name is unique, so we don't need to put ${{matrix}} info into artifact name # Since each package name is unique, so we don't need to put ${{matrix}} info into artifact name
@@ -216,10 +125,6 @@ jobs:
name: runner-packages name: runner-packages
path: | path: |
_package _package
_package_trims/trim_externals
_package_trims/trim_runtime
_package_trims/trim_runtime_externals
${{ matrix.runtime }}-trimmedpackages.json
release: release:
needs: build needs: build
@@ -253,33 +158,11 @@ jobs:
releaseNote = releaseNote.replace(/<LINUX_X64_SHA>/g, '${{needs.build.outputs.linux-x64-sha}}') releaseNote = releaseNote.replace(/<LINUX_X64_SHA>/g, '${{needs.build.outputs.linux-x64-sha}}')
releaseNote = releaseNote.replace(/<LINUX_ARM_SHA>/g, '${{needs.build.outputs.linux-arm-sha}}') releaseNote = releaseNote.replace(/<LINUX_ARM_SHA>/g, '${{needs.build.outputs.linux-arm-sha}}')
releaseNote = releaseNote.replace(/<LINUX_ARM64_SHA>/g, '${{needs.build.outputs.linux-arm64-sha}}') releaseNote = releaseNote.replace(/<LINUX_ARM64_SHA>/g, '${{needs.build.outputs.linux-arm64-sha}}')
releaseNote = releaseNote.replace(/<WIN_X64_SHA_NOEXTERNALS>/g, '${{needs.build.outputs.win-x64-sha-noexternals}}')
releaseNote = releaseNote.replace(/<WIN_ARM64_SHA_NOEXTERNALS>/g, '${{needs.build.outputs.win-arm64-sha-noexternals}}')
releaseNote = releaseNote.replace(/<OSX_X64_SHA_NOEXTERNALS>/g, '${{needs.build.outputs.osx-x64-sha-noexternals}}')
releaseNote = releaseNote.replace(/<OSX_ARM64_SHA_NOEXTERNALS>/g, '${{needs.build.outputs.osx-arm64-sha-noexternals}}')
releaseNote = releaseNote.replace(/<LINUX_X64_SHA_NOEXTERNALS>/g, '${{needs.build.outputs.linux-x64-sha-noexternals}}')
releaseNote = releaseNote.replace(/<LINUX_ARM_SHA_NOEXTERNALS>/g, '${{needs.build.outputs.linux-arm-sha-noexternals}}')
releaseNote = releaseNote.replace(/<LINUX_ARM64_SHA_NOEXTERNALS>/g, '${{needs.build.outputs.linux-arm64-sha-noexternals}}')
releaseNote = releaseNote.replace(/<WIN_X64_SHA_NORUNTIME>/g, '${{needs.build.outputs.win-x64-sha-noruntime}}')
releaseNote = releaseNote.replace(/<WIN_ARM64_SHA_NORUNTIME>/g, '${{needs.build.outputs.win-arm64-sha-noruntime}}')
releaseNote = releaseNote.replace(/<OSX_X64_SHA_NORUNTIME>/g, '${{needs.build.outputs.osx-x64-sha-noruntime}}')
releaseNote = releaseNote.replace(/<OSX_ARM64_SHA_NORUNTIME>/g, '${{needs.build.outputs.osx-arm64-sha-noruntime}}')
releaseNote = releaseNote.replace(/<LINUX_X64_SHA_NORUNTIME>/g, '${{needs.build.outputs.linux-x64-sha-noruntime}}')
releaseNote = releaseNote.replace(/<LINUX_ARM_SHA_NORUNTIME>/g, '${{needs.build.outputs.linux-arm-sha-noruntime}}')
releaseNote = releaseNote.replace(/<LINUX_ARM64_SHA_NORUNTIME>/g, '${{needs.build.outputs.linux-arm64-sha-noruntime}}')
releaseNote = releaseNote.replace(/<WIN_X64_SHA_NORUNTIME_NOEXTERNALS>/g, '${{needs.build.outputs.win-x64-sha-noruntime-noexternals}}')
releaseNote = releaseNote.replace(/<WIN_ARM64_SHA_NORUNTIME_NOEXTERNALS>/g, '${{needs.build.outputs.win-arm64-sha-noruntime-noexternals}}')
releaseNote = releaseNote.replace(/<OSX_X64_SHA_NORUNTIME_NOEXTERNALS>/g, '${{needs.build.outputs.osx-x64-sha-noruntime-noexternals}}')
releaseNote = releaseNote.replace(/<OSX_ARM64_SHA_NORUNTIME_NOEXTERNALS>/g, '${{needs.build.outputs.osx-arm64-sha-noruntime-noexternals}}')
releaseNote = releaseNote.replace(/<LINUX_X64_SHA_NORUNTIME_NOEXTERNALS>/g, '${{needs.build.outputs.linux-x64-sha-noruntime-noexternals}}')
releaseNote = releaseNote.replace(/<LINUX_ARM_SHA_NORUNTIME_NOEXTERNALS>/g, '${{needs.build.outputs.linux-arm-sha-noruntime-noexternals}}')
releaseNote = releaseNote.replace(/<LINUX_ARM64_SHA_NORUNTIME_NOEXTERNALS>/g, '${{needs.build.outputs.linux-arm64-sha-noruntime-noexternals}}')
console.log(releaseNote) console.log(releaseNote)
core.setOutput('version', runnerVersion); core.setOutput('version', runnerVersion);
core.setOutput('note', releaseNote); core.setOutput('note', releaseNote);
- name: Validate Packages HASH - name: Validate Packages HASH
working-directory: _package
run: | run: |
ls -l ls -l
echo "${{needs.build.outputs.win-x64-sha}} actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}.zip" | shasum -a 256 -c echo "${{needs.build.outputs.win-x64-sha}} actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}.zip" | shasum -a 256 -c
@@ -309,7 +192,7 @@ jobs:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with: with:
upload_url: ${{ steps.createRelease.outputs.upload_url }} upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package/actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}.zip asset_path: ${{ github.workspace }}/actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}.zip
asset_name: actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}.zip asset_name: actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}.zip
asset_content_type: application/octet-stream asset_content_type: application/octet-stream
@@ -319,7 +202,7 @@ jobs:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with: with:
upload_url: ${{ steps.createRelease.outputs.upload_url }} upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package/actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}.zip asset_path: ${{ github.workspace }}/actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}.zip
asset_name: actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}.zip asset_name: actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}.zip
asset_content_type: application/octet-stream asset_content_type: application/octet-stream
@@ -329,7 +212,7 @@ jobs:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with: with:
upload_url: ${{ steps.createRelease.outputs.upload_url }} upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package/actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}.tar.gz asset_path: ${{ github.workspace }}/actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}.tar.gz
asset_name: actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}.tar.gz asset_name: actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}.tar.gz
asset_content_type: application/octet-stream asset_content_type: application/octet-stream
@@ -339,7 +222,7 @@ jobs:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with: with:
upload_url: ${{ steps.createRelease.outputs.upload_url }} upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package/actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}.tar.gz asset_path: ${{ github.workspace }}/actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}.tar.gz
asset_name: actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}.tar.gz asset_name: actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}.tar.gz
asset_content_type: application/octet-stream asset_content_type: application/octet-stream
@@ -349,7 +232,7 @@ jobs:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with: with:
upload_url: ${{ steps.createRelease.outputs.upload_url }} upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package/actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}.tar.gz asset_path: ${{ github.workspace }}/actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}.tar.gz
asset_name: actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}.tar.gz asset_name: actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}.tar.gz
asset_content_type: application/octet-stream asset_content_type: application/octet-stream
@@ -359,7 +242,7 @@ jobs:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with: with:
upload_url: ${{ steps.createRelease.outputs.upload_url }} upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package/actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}.tar.gz asset_path: ${{ github.workspace }}/actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}.tar.gz
asset_name: actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}.tar.gz asset_name: actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}.tar.gz
asset_content_type: application/octet-stream asset_content_type: application/octet-stream
@@ -369,298 +252,10 @@ jobs:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with: with:
upload_url: ${{ steps.createRelease.outputs.upload_url }} upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package/actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}.tar.gz asset_path: ${{ github.workspace }}/actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}.tar.gz
asset_name: actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}.tar.gz asset_name: actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}.tar.gz
asset_content_type: application/octet-stream asset_content_type: application/octet-stream
# Upload release assets (trim externals)
- name: Upload Release Asset (win-x64-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_externals/actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}-noexternals.zip
asset_name: actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}-noexternals.zip
asset_content_type: application/octet-stream
# Upload release assets (trim externals)
- name: Upload Release Asset (win-arm64-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_externals/actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}-noexternals.zip
asset_name: actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}-noexternals.zip
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-x64-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_externals/actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}-noexternals.tar.gz
asset_name: actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}-noexternals.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (osx-x64-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_externals/actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}-noexternals.tar.gz
asset_name: actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}-noexternals.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (osx-arm64-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_externals/actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}-noexternals.tar.gz
asset_name: actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}-noexternals.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-arm-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_externals/actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}-noexternals.tar.gz
asset_name: actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}-noexternals.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-arm64-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_externals/actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}-noexternals.tar.gz
asset_name: actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}-noexternals.tar.gz
asset_content_type: application/octet-stream
# Upload release assets (trim runtime)
- name: Upload Release Asset (win-x64-noruntime)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime/actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}-noruntime.zip
asset_name: actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}-noruntime.zip
asset_content_type: application/octet-stream
# Upload release assets (trim runtime)
- name: Upload Release Asset (win-arm64-noruntime)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime/actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}-noruntime.zip
asset_name: actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}-noruntime.zip
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-x64-noruntime)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime/actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}-noruntime.tar.gz
asset_name: actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}-noruntime.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (osx-x64-noruntime)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime/actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}-noruntime.tar.gz
asset_name: actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}-noruntime.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (osx-arm64-noruntime)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime/actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}-noruntime.tar.gz
asset_name: actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}-noruntime.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-arm-noruntime)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime/actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}-noruntime.tar.gz
asset_name: actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}-noruntime.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-arm64-noruntime)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime/actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}-noruntime.tar.gz
asset_name: actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}-noruntime.tar.gz
asset_content_type: application/octet-stream
# Upload release assets (trim runtime and externals)
- name: Upload Release Asset (win-x64-noruntime-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime_externals/actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.zip
asset_name: actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.zip
asset_content_type: application/octet-stream
# Upload release assets (trim runtime and externals)
- name: Upload Release Asset (win-arm64-noruntime-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime_externals/actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.zip
asset_name: actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.zip
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-x64-noruntime-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime_externals/actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.tar.gz
asset_name: actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (osx-x64-noruntime-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime_externals/actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.tar.gz
asset_name: actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (osx-arm64-noruntime-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime_externals/actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.tar.gz
asset_name: actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-arm-noruntime-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime_externals/actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.tar.gz
asset_name: actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.tar.gz
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-arm64-noruntime-noexternals)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/_package_trims/trim_runtime_externals/actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.tar.gz
asset_name: actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}-noruntime-noexternals.tar.gz
asset_content_type: application/octet-stream
# Upload release assets (trimmedpackages.json)
- name: Upload Release Asset (win-x64-trimmedpackages.json)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/win-x64-trimmedpackages.json
asset_name: actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}-trimmedpackages.json
asset_content_type: application/octet-stream
# Upload release assets (trimmedpackages.json)
- name: Upload Release Asset (win-arm64-trimmedpackages.json)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/win-arm64-trimmedpackages.json
asset_name: actions-runner-win-arm64-${{ steps.releaseNote.outputs.version }}-trimmedpackages.json
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-x64-trimmedpackages.json)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/linux-x64-trimmedpackages.json
asset_name: actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}-trimmedpackages.json
asset_content_type: application/octet-stream
- name: Upload Release Asset (osx-x64-trimmedpackages.json)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/osx-x64-trimmedpackages.json
asset_name: actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}-trimmedpackages.json
asset_content_type: application/octet-stream
- name: Upload Release Asset (osx-arm64-trimmedpackages.json)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/osx-arm64-trimmedpackages.json
asset_name: actions-runner-osx-arm64-${{ steps.releaseNote.outputs.version }}-trimmedpackages.json
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-arm-trimmedpackages.json)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/linux-arm-trimmedpackages.json
asset_name: actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}-trimmedpackages.json
asset_content_type: application/octet-stream
- name: Upload Release Asset (linux-arm64-trimmedpackages.json)
uses: actions/upload-release-asset@v1.0.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.createRelease.outputs.upload_url }}
asset_path: ${{ github.workspace }}/linux-arm64-trimmedpackages.json
asset_name: actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}-trimmedpackages.json
asset_content_type: application/octet-stream
publish-image: publish-image:
needs: release needs: release
runs-on: ubuntu-latest runs-on: ubuntu-latest

View File

@@ -7,8 +7,10 @@ Make sure the runner has access to actions service for GitHub.com or GitHub Ente
- For GitHub.com - For GitHub.com
- The runner needs to access `https://api.github.com` for downloading actions. - The runner needs to access `https://api.github.com` for downloading actions.
- The runner needs to access `https://codeload.github.com` for downloading actions tar.gz/zip.
- The runner needs to access `https://vstoken.actions.githubusercontent.com/_apis/.../` for requesting an access token. - The runner needs to access `https://vstoken.actions.githubusercontent.com/_apis/.../` for requesting an access token.
- The runner needs to access `https://pipelines.actions.githubusercontent.com/_apis/.../` for receiving workflow jobs. - The runner needs to access `https://pipelines.actions.githubusercontent.com/_apis/.../` for receiving workflow jobs.
- The runner needs to access `https://results-receiver.actions.githubusercontent.com/.../` for reporting progress and uploading logs during a workflow job execution.
--- ---
**NOTE:** for the full list of domains that are required to be in the firewall allow list refer to the [GitHub self-hosted runners requirements documentation](https://docs.github.com/en/actions/hosting-your-own-runners/managing-self-hosted-runners/about-self-hosted-runners#communication-between-self-hosted-runners-and-github). **NOTE:** for the full list of domains that are required to be in the firewall allow list refer to the [GitHub self-hosted runners requirements documentation](https://docs.github.com/en/actions/hosting-your-own-runners/managing-self-hosted-runners/about-self-hosted-runners#communication-between-self-hosted-runners-and-github).
@@ -16,12 +18,15 @@ Make sure the runner has access to actions service for GitHub.com or GitHub Ente
``` ```
curl -v https://api.github.com/zen curl -v https://api.github.com/zen
curl -v https://codeload.github.com/_ping
curl -v https://vstoken.actions.githubusercontent.com/_apis/health curl -v https://vstoken.actions.githubusercontent.com/_apis/health
curl -v https://pipelines.actions.githubusercontent.com/_apis/health curl -v https://pipelines.actions.githubusercontent.com/_apis/health
curl -v https://results-receiver.actions.githubusercontent.com/health
``` ```
- For GitHub Enterprise Server - For GitHub Enterprise Server
- The runner needs to access `https://[hostname]/api/v3` for downloading actions. - The runner needs to access `https://[hostname]/api/v3` for downloading actions.
- The runner needs to access `https://codeload.[hostname]/_ping` for downloading actions tar.gz/zip.
- The runner needs to access `https://[hostname]/_services/vstoken/_apis/.../` for requesting an access token. - The runner needs to access `https://[hostname]/_services/vstoken/_apis/.../` for requesting an access token.
- The runner needs to access `https://[hostname]/_services/pipelines/_apis/.../` for receiving workflow jobs. - The runner needs to access `https://[hostname]/_services/pipelines/_apis/.../` for receiving workflow jobs.
@@ -29,6 +34,7 @@ Make sure the runner has access to actions service for GitHub.com or GitHub Ente
``` ```
curl -v https://[hostname]/api/v3/zen curl -v https://[hostname]/api/v3/zen
curl -v https://codeload.[hostname]/_ping
curl -v https://[hostname]/_services/vstoken/_apis/health curl -v https://[hostname]/_services/vstoken/_apis/health
curl -v https://[hostname]/_services/pipelines/_apis/health curl -v https://[hostname]/_services/pipelines/_apis/health
``` ```
@@ -44,6 +50,10 @@ Make sure the runner has access to actions service for GitHub.com or GitHub Ente
- Ping api.github.com or myGHES.com using dotnet
- Make HTTP GET to https://api.github.com or https://myGHES.com/api/v3 using dotnet, check that the response headers contain `X-GitHub-Request-Id`
---
- DNS lookup for codeload.github.com or codeload.myGHES.com using dotnet
- Ping codeload.github.com or codeload.myGHES.com using dotnet
- Make HTTP GET to https://codeload.github.com/_ping or https://codeload.myGHES.com/_ping using dotnet, check that the response headers contain `X-GitHub-Request-Id`
---
- DNS lookup for vstoken.actions.githubusercontent.com using dotnet
- Ping vstoken.actions.githubusercontent.com using dotnet
- Make HTTP GET to https://vstoken.actions.githubusercontent.com/_apis/health or https://myGHES.com/_services/vstoken/_apis/health using dotnet, check that the response headers contain `x-vss-e2eid`
@@ -52,6 +62,10 @@ Make sure the runner has access to actions service for GitHub.com or GitHub Ente
- Ping pipelines.actions.githubusercontent.com using dotnet
- Make HTTP GET to https://pipelines.actions.githubusercontent.com/_apis/health or https://myGHES.com/_services/pipelines/_apis/health using dotnet, check that the response headers contain `x-vss-e2eid`
- Make HTTP POST to https://pipelines.actions.githubusercontent.com/_apis/health or https://myGHES.com/_services/pipelines/_apis/health using dotnet, check that the response headers contain `x-vss-e2eid`
---
- DNS lookup for results-receiver.actions.githubusercontent.com using dotnet
- Ping results-receiver.actions.githubusercontent.com using dotnet
- Make HTTP GET to https://results-receiver.actions.githubusercontent.com/health using dotnet, check that the response headers contain `X-GitHub-Request-Id`
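The runner performs the checks listed above with dotnet. For manual debugging without the runner, the following is a rough shell approximation of the same three steps (DNS lookup, ping, HTTPS GET with a request-id header); the host and path are just one entry from the list above and can be swapped for any of the others. Note that ICMP ping may be blocked on some networks even when HTTPS works.
```
#!/usr/bin/env bash
# Manual approximation of the dotnet-based checks above; not part of the runner.
host="results-receiver.actions.githubusercontent.com"   # any host from the list above
path="/health"

echo "== DNS lookup =="
nslookup "$host" || echo "DNS lookup failed"

echo "== Ping =="
ping -c 3 "$host" || echo "Ping failed (may be blocked by the network)"

echo "== HTTPS GET (status line and request-id header) =="
curl -sS -o /dev/null -D - "https://${host}${path}" \
  | grep -iE '^(HTTP/|x-github-request-id:|x-vss-e2eid:)' \
  || echo "HTTPS GET failed"
```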
## How to fix the issue?

View File

@@ -42,6 +42,7 @@ If you are having trouble connecting, try these steps:
- https://api.github.com/
- https://vstoken.actions.githubusercontent.com/_apis/health
- https://pipelines.actions.githubusercontent.com/_apis/health
- https://results-receiver.actions.githubusercontent.com/health
- For GHES/GHAE
- https://myGHES.com/_services/vstoken/_apis/health
- https://myGHES.com/_services/pipelines/_apis/health

View File

@@ -5,9 +5,9 @@
## Supported Distributions and Versions
x64
- Red Hat Enterprise Linux 7+
- CentOS 7+
- Oracle Linux 7+
- Fedora 29+
- Debian 9+
- Ubuntu 16.04+

View File

@@ -4,9 +4,9 @@ FROM mcr.microsoft.com/dotnet/runtime-deps:6.0-jammy as build
ARG TARGETOS
ARG TARGETARCH
ARG RUNNER_VERSION
ARG RUNNER_CONTAINER_HOOKS_VERSION=0.6.0
ARG DOCKER_VERSION=25.0.4
ARG BUILDX_VERSION=0.13.1
RUN apt update -y && apt install curl unzip -y
@@ -37,6 +37,7 @@ FROM mcr.microsoft.com/dotnet/runtime-deps:6.0-jammy
ENV DEBIAN_FRONTEND=noninteractive
ENV RUNNER_MANUALLY_TRAP_SIG=1
ENV ACTIONS_RUNNER_PRINT_LOG_TO_STDOUT=1
ENV ImageOS=ubuntu22
RUN apt-get update -y \
&& apt-get install -y --no-install-recommends \
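For reference, a build invocation for this Dockerfile might look like the sketch below, assuming Docker with Buildx is available. The image name, tag, platform, and context path are illustrative placeholders; the `--build-arg` values simply override the defaults shown above (RUNNER_VERSION has no default in the Dockerfile, so it is passed explicitly here).
```
# Illustrative build command; image name/tag and context path are placeholders.
# TARGETOS/TARGETARCH are populated automatically by buildx from --platform.
docker buildx build \
  --platform linux/amd64 \
  --build-arg RUNNER_VERSION=2.316.0 \
  --build-arg RUNNER_CONTAINER_HOOKS_VERSION=0.6.0 \
  --build-arg DOCKER_VERSION=25.0.4 \
  --build-arg BUILDX_VERSION=0.13.1 \
  -t my-org/actions-runner:2.316.0 \
  -f Dockerfile .
```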

View File

@@ -1,23 +1,21 @@
## What's Changed
* Load '_runnerSettings' in the early point of JobRunner.cs by @TingluoHuang in https://github.com/actions/runner/pull/3218
* Add new SessionConflict return code by @eeSquared in https://github.com/actions/runner/pull/3215
* backoff if we retried polling for more than 50 times in less than 30minutes by @aiqiaoy in https://github.com/actions/runner/pull/3232
* Update dotnet sdk to latest version @6.0.421 by @github-actions in https://github.com/actions/runner/pull/3244
* Cleanup enabled feature flags. by @TingluoHuang in https://github.com/actions/runner/pull/3246
* Relax the condition to stop uploading to Results by @yacaovsnc in https://github.com/actions/runner/pull/3230
* Cleanup enabled feature flags. by @TingluoHuang in https://github.com/actions/runner/pull/3248
* Replace invalid file name chars in diag log name by @ericsciple in https://github.com/actions/runner/pull/3249
## New Contributors
* @eeSquared made their first contribution in https://github.com/actions/runner/pull/3215
* @aiqiaoy made their first contribution in https://github.com/actions/runner/pull/3232
**Full Changelog**: https://github.com/actions/runner/compare/v2.315.0...v2.316.0
_Note: Actions Runner follows a progressive release policy, so the latest release might not be available to your enterprise, organization, or repository yet.
To confirm which version of the Actions Runner you should expect, please view the download instructions for your enterprise, organization, or repository.
See https://docs.github.com/en/enterprise-cloud@latest/actions/hosting-your-own-runners/adding-self-hosted-runners_
## Windows x64
@@ -119,27 +117,3 @@ The SHA-256 checksums for the packages included in this build are shown below:
- actions-runner-linux-x64-<RUNNER_VERSION>.tar.gz <!-- BEGIN SHA linux-x64 --><LINUX_X64_SHA><!-- END SHA linux-x64 -->
- actions-runner-linux-arm64-<RUNNER_VERSION>.tar.gz <!-- BEGIN SHA linux-arm64 --><LINUX_ARM64_SHA><!-- END SHA linux-arm64 -->
- actions-runner-linux-arm-<RUNNER_VERSION>.tar.gz <!-- BEGIN SHA linux-arm --><LINUX_ARM_SHA><!-- END SHA linux-arm -->
- actions-runner-win-x64-<RUNNER_VERSION>-noexternals.zip <!-- BEGIN SHA win-x64_noexternals --><WIN_X64_SHA_NOEXTERNALS><!-- END SHA win-x64_noexternals -->
- actions-runner-win-arm64-<RUNNER_VERSION>-noexternals.zip <!-- BEGIN SHA win-arm64_noexternals --><WIN_ARM64_SHA_NOEXTERNALS><!-- END SHA win-arm64_noexternals -->
- actions-runner-osx-x64-<RUNNER_VERSION>-noexternals.tar.gz <!-- BEGIN SHA osx-x64_noexternals --><OSX_X64_SHA_NOEXTERNALS><!-- END SHA osx-x64_noexternals -->
- actions-runner-osx-arm64-<RUNNER_VERSION>-noexternals.tar.gz <!-- BEGIN SHA osx-arm64_noexternals --><OSX_ARM64_SHA_NOEXTERNALS><!-- END SHA osx-arm64_noexternals -->
- actions-runner-linux-x64-<RUNNER_VERSION>-noexternals.tar.gz <!-- BEGIN SHA linux-x64_noexternals --><LINUX_X64_SHA_NOEXTERNALS><!-- END SHA linux-x64_noexternals -->
- actions-runner-linux-arm64-<RUNNER_VERSION>-noexternals.tar.gz <!-- BEGIN SHA linux-arm64_noexternals --><LINUX_ARM64_SHA_NOEXTERNALS><!-- END SHA linux-arm64_noexternals -->
- actions-runner-linux-arm-<RUNNER_VERSION>-noexternals.tar.gz <!-- BEGIN SHA linux-arm_noexternals --><LINUX_ARM_SHA_NOEXTERNALS><!-- END SHA linux-arm_noexternals -->
- actions-runner-win-x64-<RUNNER_VERSION>-noruntime.zip <!-- BEGIN SHA win-x64_noruntime --><WIN_X64_SHA_NORUNTIME><!-- END SHA win-x64_noruntime -->
- actions-runner-win-arm64-<RUNNER_VERSION>-noruntime.zip <!-- BEGIN SHA win-arm64_noruntime --><WIN_ARM64_SHA_NORUNTIME><!-- END SHA win-arm64_noruntime -->
- actions-runner-osx-x64-<RUNNER_VERSION>-noruntime.tar.gz <!-- BEGIN SHA osx-x64_noruntime --><OSX_X64_SHA_NORUNTIME><!-- END SHA osx-x64_noruntime -->
- actions-runner-osx-arm64-<RUNNER_VERSION>-noruntime.tar.gz <!-- BEGIN SHA osx-arm64_noruntime --><OSX_ARM64_SHA_NORUNTIME><!-- END SHA osx-arm64_noruntime -->
- actions-runner-linux-x64-<RUNNER_VERSION>-noruntime.tar.gz <!-- BEGIN SHA linux-x64_noruntime --><LINUX_X64_SHA_NORUNTIME><!-- END SHA linux-x64_noruntime -->
- actions-runner-linux-arm64-<RUNNER_VERSION>-noruntime.tar.gz <!-- BEGIN SHA linux-arm64_noruntime --><LINUX_ARM64_SHA_NORUNTIME><!-- END SHA linux-arm64_noruntime -->
- actions-runner-linux-arm-<RUNNER_VERSION>-noruntime.tar.gz <!-- BEGIN SHA linux-arm_noruntime --><LINUX_ARM_SHA_NORUNTIME><!-- END SHA linux-arm_noruntime -->
- actions-runner-win-x64-<RUNNER_VERSION>-noruntime-noexternals.zip <!-- BEGIN SHA win-x64_noruntime_noexternals --><WIN_X64_SHA_NORUNTIME_NOEXTERNALS><!-- END SHA win-x64_noruntime_noexternals -->
- actions-runner-win-arm64-<RUNNER_VERSION>-noruntime-noexternals.zip <!-- BEGIN SHA win-arm64_noruntime_noexternals --><WIN_ARM64_SHA_NORUNTIME_NOEXTERNALS><!-- END SHA win-arm64_noruntime_noexternals -->
- actions-runner-osx-x64-<RUNNER_VERSION>-noruntime-noexternals.tar.gz <!-- BEGIN SHA osx-x64_noruntime_noexternals --><OSX_X64_SHA_NORUNTIME_NOEXTERNALS><!-- END SHA osx-x64_noruntime_noexternals -->
- actions-runner-osx-arm64-<RUNNER_VERSION>-noruntime-noexternals.tar.gz <!-- BEGIN SHA osx-arm64_noruntime_noexternals --><OSX_ARM64_SHA_NORUNTIME_NOEXTERNALS><!-- END SHA osx-arm64_noruntime_noexternals -->
- actions-runner-linux-x64-<RUNNER_VERSION>-noruntime-noexternals.tar.gz <!-- BEGIN SHA linux-x64_noruntime_noexternals --><LINUX_X64_SHA_NORUNTIME_NOEXTERNALS><!-- END SHA linux-x64_noruntime_noexternals -->
- actions-runner-linux-arm64-<RUNNER_VERSION>-noruntime-noexternals.tar.gz <!-- BEGIN SHA linux-arm64_noruntime_noexternals --><LINUX_ARM64_SHA_NORUNTIME_NOEXTERNALS><!-- END SHA linux-arm64_noruntime_noexternals -->
- actions-runner-linux-arm-<RUNNER_VERSION>-noruntime-noexternals.tar.gz <!-- BEGIN SHA linux-arm_noruntime_noexternals --><LINUX_ARM_SHA_NORUNTIME_NOEXTERNALS><!-- END SHA linux-arm_noruntime_noexternals -->

View File

@@ -1 +1 @@
531b31914e525ecb12cc5526415bc70a112ebc818f877347af1a231011f539c5 54d95a44d118dba852395991224a6b9c1abe916858c87138656f80c619e85331

View File

@@ -1 +1 @@
722dd5fa5ecc207fcccf67f6e502d689f2119d8117beff2041618fba17dc66a4 68015af17f06a824fa478e62ae7393766ce627fd5599ab916432a14656a19a52

View File

@@ -1 +1 @@
8ca75c76e15ab9dc7fe49a66c5c74e171e7fabd5d26546fda8931bd11bff30f9 a2628119ca419cb54e279103ffae7986cdbd0814d57c73ff0dc74c38be08b9ae

View File

@@ -1 +1 @@
70496eb1c99b39b3373b5088c95a35ebbaac1098e6c47c8aab94771f3ffbf501 de71ca09ead807e1a2ce9df0a5b23eb7690cb71fff51169a77e4c3992be53dda

View File

@@ -1 +1 @@
4f8d48727d535daabcaec814e0dafb271c10625366c78e7e022ca7477a73023f d009e05e6b26d614d65be736a15d1bd151932121c16a9ff1b986deadecc982b9

View File

@@ -1 +1 @@
d54d7428f2b9200a0030365a6a4e174e30a1b29b922f8254dffb2924bd09549d f730db39c2305800b4653795360ba9c10c68f384a46b85d808f1f9f0ed3c42e4

View File

@@ -1 +1 @@
eaa939c45307f46b7003902255b3a2a09287215d710984107667e03ac493eb26 a35b5722375490e9473cdcccb5e18b41eba3dbf4344fe31abc9821e21f18ea5a

View File

@@ -63,17 +63,16 @@ function acquireExternalTool() {
echo "Curl version: $CURL_VERSION" echo "Curl version: $CURL_VERSION"
# curl -f Fail silently (no output at all) on HTTP errors (H) # curl -f Fail silently (no output at all) on HTTP errors (H)
# -k Allow connections to SSL sites without certs (H)
# -S Show error. With -s, make curl show errors when they occur # -S Show error. With -s, make curl show errors when they occur
# -L Follow redirects (H) # -L Follow redirects (H)
# -o FILE Write to FILE instead of stdout # -o FILE Write to FILE instead of stdout
# --retry 3 Retries transient errors 3 times (timeouts, 5xx) # --retry 3 Retries transient errors 3 times (timeouts, 5xx)
if [[ "$(printf '%s\n' "7.71.0" "$CURL_VERSION" | sort -V | head -n1)" != "7.71.0" ]]; then if [[ "$(printf '%s\n' "7.71.0" "$CURL_VERSION" | sort -V | head -n1)" != "7.71.0" ]]; then
# Curl version is less than or equal to 7.71.0, skipping retry-all-errors flag # Curl version is less than or equal to 7.71.0, skipping retry-all-errors flag
curl -fkSL --retry 3 -o "$partial_target" "$download_source" 2>"${download_target}_download.log" || checkRC 'curl' curl -fSL --retry 3 -o "$partial_target" "$download_source" 2>"${download_target}_download.log" || checkRC 'curl'
else else
# Curl version is greater than 7.71.0, running curl with --retry-all-errors flag # Curl version is greater than 7.71.0, running curl with --retry-all-errors flag
curl -fkSL --retry 3 --retry-all-errors -o "$partial_target" "$download_source" 2>"${download_target}_download.log" || checkRC 'curl' curl -fSL --retry 3 --retry-all-errors -o "$partial_target" "$download_source" 2>"${download_target}_download.log" || checkRC 'curl'
fi fi
# Move the partial file to the download target. # Move the partial file to the download target.

View File

@@ -114,6 +114,11 @@ var runService = function () {
);
stopping = true;
}
} else if (code === 5) {
console.log(
"Runner listener exit with Session Conflict error, stop the service, no retry needed."
);
stopping = true;
} else {
var messagePrefix = "Runner listener exit with undefined return code";
unknownFailureRetryCount++;

View File

@@ -49,5 +49,10 @@ if %ERRORLEVEL% EQU 4 (
exit /b 1
)
if %ERRORLEVEL% EQU 5 (
echo "Runner listener exit with Session Conflict error, stop the service, no retry needed."
exit /b 0
)
echo "Exiting after unknown error code: %ERRORLEVEL%"
exit /b 0

View File

@@ -70,6 +70,9 @@ elif [[ $returnCode == 4 ]]; then
"$DIR"/safe_sleep.sh 1 "$DIR"/safe_sleep.sh 1
done done
exit 2 exit 2
elif [[ $returnCode == 5 ]]; then
echo "Runner listener exit with Session Conflict error, stop the service, no retry needed."
exit 0
else else
echo "Exiting with unknown error code: ${returnCode}" echo "Exiting with unknown error code: ${returnCode}"
exit 0 exit 0

View File

@@ -38,7 +38,7 @@ runWithManualTrap() {
cp -f "$DIR"/run-helper.sh.template "$DIR"/run-helper.sh
"$DIR"/run-helper.sh $* &
PID=$!
wait $PID
returnCode=$?
if [[ $returnCode -eq 2 ]]; then
echo "Restarting runner..."
@@ -84,4 +84,4 @@ if [[ -z "$RUNNER_MANUALLY_TRAP_SIG" ]]; then
run $*
else
runWithManualTrap $*
fi

View File

@@ -1,57 +0,0 @@
actions.runner.plist.template
actions.runner.service.template
checkScripts/downloadCert.js
checkScripts/makeWebRequest.js
darwin.svc.sh.template
hashFiles/index.js
installdependencies.sh
macos-run-invoker.js
Microsoft.IdentityModel.Logging.dll
Microsoft.IdentityModel.Tokens.dll
Minimatch.dll
Newtonsoft.Json.Bson.dll
Newtonsoft.Json.dll
Runner.Common.deps.json
Runner.Common.dll
Runner.Common.pdb
Runner.Listener
Runner.Listener.deps.json
Runner.Listener.dll
Runner.Listener.exe
Runner.Listener.pdb
Runner.Listener.runtimeconfig.json
Runner.PluginHost
Runner.PluginHost.deps.json
Runner.PluginHost.dll
Runner.PluginHost.exe
Runner.PluginHost.pdb
Runner.PluginHost.runtimeconfig.json
Runner.Plugins.deps.json
Runner.Plugins.dll
Runner.Plugins.pdb
Runner.Sdk.deps.json
Runner.Sdk.dll
Runner.Sdk.pdb
Runner.Worker
Runner.Worker.deps.json
Runner.Worker.dll
Runner.Worker.exe
Runner.Worker.pdb
Runner.Worker.runtimeconfig.json
RunnerService.exe
RunnerService.exe.config
RunnerService.js
RunnerService.pdb
runsvc.sh
Sdk.deps.json
Sdk.dll
Sdk.pdb
System.IdentityModel.Tokens.Jwt.dll
System.Net.Http.Formatting.dll
System.Security.Cryptography.Pkcs.dll
System.Security.Cryptography.ProtectedData.dll
System.ServiceProcess.ServiceController.dll
systemd.svc.sh.template
update.cmd.template
update.sh.template
YamlDotNet.dll

View File

@@ -1,270 +0,0 @@
api-ms-win-core-console-l1-1-0.dll
api-ms-win-core-console-l1-2-0.dll
api-ms-win-core-datetime-l1-1-0.dll
api-ms-win-core-debug-l1-1-0.dll
api-ms-win-core-errorhandling-l1-1-0.dll
api-ms-win-core-fibers-l1-1-0.dll
api-ms-win-core-file-l1-1-0.dll
api-ms-win-core-file-l1-2-0.dll
api-ms-win-core-file-l2-1-0.dll
api-ms-win-core-handle-l1-1-0.dll
api-ms-win-core-heap-l1-1-0.dll
api-ms-win-core-interlocked-l1-1-0.dll
api-ms-win-core-libraryloader-l1-1-0.dll
api-ms-win-core-localization-l1-2-0.dll
api-ms-win-core-memory-l1-1-0.dll
api-ms-win-core-namedpipe-l1-1-0.dll
api-ms-win-core-processenvironment-l1-1-0.dll
api-ms-win-core-processthreads-l1-1-0.dll
api-ms-win-core-processthreads-l1-1-1.dll
api-ms-win-core-profile-l1-1-0.dll
api-ms-win-core-rtlsupport-l1-1-0.dll
api-ms-win-core-string-l1-1-0.dll
api-ms-win-core-synch-l1-1-0.dll
api-ms-win-core-synch-l1-2-0.dll
api-ms-win-core-sysinfo-l1-1-0.dll
api-ms-win-core-timezone-l1-1-0.dll
api-ms-win-core-util-l1-1-0.dll
api-ms-win-crt-conio-l1-1-0.dll
api-ms-win-crt-convert-l1-1-0.dll
api-ms-win-crt-environment-l1-1-0.dll
api-ms-win-crt-filesystem-l1-1-0.dll
api-ms-win-crt-heap-l1-1-0.dll
api-ms-win-crt-locale-l1-1-0.dll
api-ms-win-crt-math-l1-1-0.dll
api-ms-win-crt-multibyte-l1-1-0.dll
api-ms-win-crt-private-l1-1-0.dll
api-ms-win-crt-process-l1-1-0.dll
api-ms-win-crt-runtime-l1-1-0.dll
api-ms-win-crt-stdio-l1-1-0.dll
api-ms-win-crt-string-l1-1-0.dll
api-ms-win-crt-time-l1-1-0.dll
api-ms-win-crt-utility-l1-1-0.dll
clrcompression.dll
clretwrc.dll
clrjit.dll
coreclr.dll
createdump
createdump.exe
dbgshim.dll
hostfxr.dll
hostpolicy.dll
libclrjit.dylib
libclrjit.so
libcoreclr.dylib
libcoreclr.so
libcoreclrtraceptprovider.so
libdbgshim.dylib
libdbgshim.so
libhostfxr.dylib
libhostfxr.so
libhostpolicy.dylib
libhostpolicy.so
libmscordaccore.dylib
libmscordaccore.so
libmscordbi.dylib
libmscordbi.so
Microsoft.CSharp.dll
Microsoft.DiaSymReader.Native.amd64.dll
Microsoft.DiaSymReader.Native.arm64.dll
Microsoft.VisualBasic.Core.dll
Microsoft.VisualBasic.dll
Microsoft.Win32.Primitives.dll
Microsoft.Win32.Registry.dll
mscordaccore.dll
mscordaccore_amd64_amd64_6.0.522.21309.dll
mscordaccore_arm64_arm64_6.0.522.21309.dll
mscordaccore_amd64_amd64_6.0.1322.58009.dll
mscordaccore_amd64_amd64_6.0.2023.32017.dll
mscordaccore_amd64_amd64_6.0.2223.42425.dll
mscordaccore_amd64_amd64_6.0.2323.48002.dll
mscordbi.dll
mscorlib.dll
mscorrc.debug.dll
mscorrc.dll
msquic.dll
netstandard.dll
SOS_README.md
System.AppContext.dll
System.Buffers.dll
System.Collections.Concurrent.dll
System.Collections.dll
System.Collections.Immutable.dll
System.Collections.NonGeneric.dll
System.Collections.Specialized.dll
System.ComponentModel.Annotations.dll
System.ComponentModel.DataAnnotations.dll
System.ComponentModel.dll
System.ComponentModel.EventBasedAsync.dll
System.ComponentModel.Primitives.dll
System.ComponentModel.TypeConverter.dll
System.Configuration.dll
System.Console.dll
System.Core.dll
System.Data.Common.dll
System.Data.DataSetExtensions.dll
System.Data.dll
System.Diagnostics.Contracts.dll
System.Diagnostics.Debug.dll
System.Diagnostics.DiagnosticSource.dll
System.Diagnostics.FileVersionInfo.dll
System.Diagnostics.Process.dll
System.Diagnostics.StackTrace.dll
System.Diagnostics.TextWriterTraceListener.dll
System.Diagnostics.Tools.dll
System.Diagnostics.TraceSource.dll
System.Diagnostics.Tracing.dll
System.dll
System.Drawing.dll
System.Drawing.Primitives.dll
System.Dynamic.Runtime.dll
System.Formats.Asn1.dll
System.Globalization.Calendars.dll
System.Globalization.dll
System.Globalization.Extensions.dll
System.Globalization.Native.dylib
System.Globalization.Native.so
System.IO.Compression.Brotli.dll
System.IO.Compression.dll
System.IO.Compression.FileSystem.dll
System.IO.Compression.Native.a
System.IO.Compression.Native.dll
System.IO.Compression.Native.dylib
System.IO.Compression.Native.so
System.IO.Compression.ZipFile.dll
System.IO.dll
System.IO.FileSystem.AccessControl.dll
System.IO.FileSystem.dll
System.IO.FileSystem.DriveInfo.dll
System.IO.FileSystem.Primitives.dll
System.IO.FileSystem.Watcher.dll
System.IO.IsolatedStorage.dll
System.IO.MemoryMappedFiles.dll
System.IO.Pipes.AccessControl.dll
System.IO.Pipes.dll
System.IO.UnmanagedMemoryStream.dll
System.Linq.dll
System.Linq.Expressions.dll
System.Linq.Parallel.dll
System.Linq.Queryable.dll
System.Memory.dll
System.Native.a
System.Native.dylib
System.Native.so
System.Net.dll
System.Net.Http.dll
System.Net.Http.Json.dll
System.Net.Http.Native.a
System.Net.Http.Native.dylib
System.Net.Http.Native.so
System.Net.HttpListener.dll
System.Net.Mail.dll
System.Net.NameResolution.dll
System.Net.NetworkInformation.dll
System.Net.Ping.dll
System.Net.Primitives.dll
System.Net.Quic.dll
System.Net.Requests.dll
System.Net.Security.dll
System.Net.Security.Native.a
System.Net.Security.Native.dylib
System.Net.Security.Native.so
System.Net.ServicePoint.dll
System.Net.Sockets.dll
System.Net.WebClient.dll
System.Net.WebHeaderCollection.dll
System.Net.WebProxy.dll
System.Net.WebSockets.Client.dll
System.Net.WebSockets.dll
System.Numerics.dll
System.Numerics.Vectors.dll
System.ObjectModel.dll
System.Private.CoreLib.dll
System.Private.DataContractSerialization.dll
System.Private.Uri.dll
System.Private.Xml.dll
System.Private.Xml.Linq.dll
System.Reflection.DispatchProxy.dll
System.Reflection.dll
System.Reflection.Emit.dll
System.Reflection.Emit.ILGeneration.dll
System.Reflection.Emit.Lightweight.dll
System.Reflection.Extensions.dll
System.Reflection.Metadata.dll
System.Reflection.Primitives.dll
System.Reflection.TypeExtensions.dll
System.Resources.Reader.dll
System.Resources.ResourceManager.dll
System.Resources.Writer.dll
System.Runtime.CompilerServices.Unsafe.dll
System.Runtime.CompilerServices.VisualC.dll
System.Runtime.dll
System.Runtime.Extensions.dll
System.Runtime.Handles.dll
System.Runtime.InteropServices.dll
System.Runtime.InteropServices.RuntimeInformation.dll
System.Runtime.InteropServices.WindowsRuntime.dll
System.Runtime.Intrinsics.dll
System.Runtime.Loader.dll
System.Runtime.Numerics.dll
System.Runtime.Serialization.dll
System.Runtime.Serialization.Formatters.dll
System.Runtime.Serialization.Json.dll
System.Runtime.Serialization.Primitives.dll
System.Runtime.Serialization.Xml.dll
System.Runtime.WindowsRuntime.dll
System.Runtime.WindowsRuntime.UI.Xaml.dll
System.Security.AccessControl.dll
System.Security.Claims.dll
System.Security.Cryptography.Algorithms.dll
System.Security.Cryptography.Cng.dll
System.Security.Cryptography.Csp.dll
System.Security.Cryptography.Encoding.dll
System.Security.Cryptography.Native.Apple.a
System.Security.Cryptography.Native.Apple.dylib
System.Security.Cryptography.Native.OpenSsl.a
System.Security.Cryptography.Native.OpenSsl.dylib
System.Security.Cryptography.Native.OpenSsl.so
System.Security.Cryptography.OpenSsl.dll
System.Security.Cryptography.Primitives.dll
System.Security.Cryptography.X509Certificates.dll
System.Security.Cryptography.XCertificates.dll
System.Security.dll
System.Security.Principal.dll
System.Security.Principal.Windows.dll
System.Security.SecureString.dll
System.ServiceModel.Web.dll
System.ServiceProcess.dll
System.Text.Encoding.CodePages.dll
System.Text.Encoding.dll
System.Text.Encoding.Extensions.dll
System.Text.Encodings.Web.dll
System.Text.Json.dll
System.Text.RegularExpressions.dll
System.Threading.Channels.dll
System.Threading.dll
System.Threading.Overlapped.dll
System.Threading.Tasks.Dataflow.dll
System.Threading.Tasks.dll
System.Threading.Tasks.Extensions.dll
System.Threading.Tasks.Parallel.dll
System.Threading.Thread.dll
System.Threading.ThreadPool.dll
System.Threading.Timer.dll
System.Transactions.dll
System.Transactions.Local.dll
System.ValueTuple.dll
System.Web.dll
System.Web.HttpUtility.dll
System.Windows.dll
System.Xml.dll
System.Xml.Linq.dll
System.Xml.ReaderWriter.dll
System.Xml.Serialization.dll
System.Xml.XDocument.dll
System.Xml.XmlDocument.dll
System.Xml.XmlSerializer.dll
System.Xml.XPath.dll
System.Xml.XPath.XDocument.dll
ucrtbase.dll
WindowsBase.dll

View File

@@ -1,24 +0,0 @@
[
{
"HashValue": "<NO_RUNTIME_EXTERNALS_HASH>",
"DownloadUrl": "https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-<RUNNER_PLATFORM>-<RUNNER_VERSION>-noruntime-noexternals.tar.gz",
"TrimmedContents": {
"dotnetRuntime": "<RUNTIME_HASH>",
"externals": "<EXTERNALS_HASH>"
}
},
{
"HashValue": "<NO_RUNTIME_HASH>",
"DownloadUrl": "https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-<RUNNER_PLATFORM>-<RUNNER_VERSION>-noruntime.tar.gz",
"TrimmedContents": {
"dotnetRuntime": "<RUNTIME_HASH>"
}
},
{
"HashValue": "<NO_EXTERNALS_HASH>",
"DownloadUrl": "https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-<RUNNER_PLATFORM>-<RUNNER_VERSION>-noexternals.tar.gz",
"TrimmedContents": {
"externals": "<EXTERNALS_HASH>"
}
}
]

View File

@@ -1,24 +0,0 @@
[
{
"HashValue": "<NO_RUNTIME_EXTERNALS_HASH>",
"DownloadUrl": "https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-<RUNNER_PLATFORM>-<RUNNER_VERSION>-noruntime-noexternals.zip",
"TrimmedContents": {
"dotnetRuntime": "<RUNTIME_HASH>",
"externals": "<EXTERNALS_HASH>"
}
},
{
"HashValue": "<NO_RUNTIME_HASH>",
"DownloadUrl": "https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-<RUNNER_PLATFORM>-<RUNNER_VERSION>-noruntime.zip",
"TrimmedContents": {
"dotnetRuntime": "<RUNTIME_HASH>"
}
},
{
"HashValue": "<NO_EXTERNALS_HASH>",
"DownloadUrl": "https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-<RUNNER_PLATFORM>-<RUNNER_VERSION>-noexternals.zip",
"TrimmedContents": {
"externals": "<EXTERNALS_HASH>"
}
}
]

View File

@@ -20,12 +20,12 @@ namespace GitHub.Runner.Common
{
private bool _hasConnection;
private VssConnection _connection;
private ActionsRunServerHttpClient _actionsRunServerClient;
public async Task ConnectAsync(Uri serverUrl, VssCredentials credentials)
{
_connection = await EstablishVssConnection(serverUrl, credentials, TimeSpan.FromSeconds(100));
_actionsRunServerClient = _connection.GetClient<ActionsRunServerHttpClient>();
_hasConnection = true;
}
@@ -42,7 +42,7 @@ namespace GitHub.Runner.Common
CheckConnection();
var jobMessage = RetryRequest<AgentJobRequestMessage>(async () =>
{
return await _actionsRunServerClient.GetJobMessageAsync(id, cancellationToken);
}, cancellationToken);
return jobMessage;

View File

@@ -17,7 +17,14 @@ namespace GitHub.Runner.Common
{
Task ConnectAsync(Uri serverUrl, VssCredentials credentials);
Task<TaskAgentSession> CreateSessionAsync(TaskAgentSession session, CancellationToken cancellationToken);
Task DeleteSessionAsync(CancellationToken cancellationToken);
Task<TaskAgentMessage> GetRunnerMessageAsync(Guid? sessionId, TaskAgentStatus status, string version, string os, string architecture, bool disableUpdate, CancellationToken token);
Task UpdateConnectionIfNeeded(Uri serverUri, VssCredentials credentials);
Task ForceRefreshConnection(VssCredentials credentials);
}
public sealed class BrokerServer : RunnerService, IBrokerServer
@@ -44,13 +51,53 @@ namespace GitHub.Runner.Common
}
}
public async Task<TaskAgentSession> CreateSessionAsync(TaskAgentSession session, CancellationToken cancellationToken)
{
CheckConnection();
var jobMessage = await _brokerHttpClient.CreateSessionAsync(session, cancellationToken);
return jobMessage;
}
public Task<TaskAgentMessage> GetRunnerMessageAsync(Guid? sessionId, TaskAgentStatus status, string version, string os, string architecture, bool disableUpdate, CancellationToken cancellationToken)
{
CheckConnection();
var brokerSession = RetryRequest<TaskAgentMessage>(
async () => await _brokerHttpClient.GetRunnerMessageAsync(sessionId, version, status, os, architecture, disableUpdate, cancellationToken), cancellationToken, shouldRetry: ShouldRetryException);
return brokerSession;
}
public async Task DeleteSessionAsync(CancellationToken cancellationToken)
{
CheckConnection();
await _brokerHttpClient.DeleteSessionAsync(cancellationToken);
}
public Task UpdateConnectionIfNeeded(Uri serverUri, VssCredentials credentials)
{
if (_brokerUri != serverUri || !_hasConnection)
{
return ConnectAsync(serverUri, credentials);
}
return Task.CompletedTask;
}
public Task ForceRefreshConnection(VssCredentials credentials)
{
return ConnectAsync(_brokerUri, credentials);
}
public bool ShouldRetryException(Exception ex)
{
if (ex is AccessDeniedException ade && ade.ErrorCode == 1)
{
return false;
}
return true;
}
}
}

View File

@@ -153,6 +153,7 @@ namespace GitHub.Runner.Common
public const int RetryableError = 2;
public const int RunnerUpdating = 3;
public const int RunOnceRunnerUpdating = 4;
public const int SessionConflict = 5;
}
public static class Features
@@ -180,6 +181,9 @@ namespace GitHub.Runner.Common
public static readonly string DeprecatedNodeVersion = "node16";
public static readonly string EnforcedNode12DetectedAfterEndOfLife = "The following actions uses node12 which is deprecated and will be forced to run on node16: {0}. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/";
public static readonly string EnforcedNode12DetectedAfterEndOfLifeEnvVariable = "Node16ForceActionsWarnings";
public static readonly string EnforcedNode16DetectedAfterEndOfLife = "The following actions uses Node.js version which is deprecated and will be forced to run on node20: {0}. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/";
public static readonly string EnforcedNode16DetectedAfterEndOfLifeEnvVariable = "Node20ForceActionsWarnings";
}
public static class RunnerEvent
@@ -251,6 +255,7 @@ namespace GitHub.Runner.Common
public static readonly string RunnerDebug = "ACTIONS_RUNNER_DEBUG";
public static readonly string StepDebug = "ACTIONS_STEP_DEBUG";
public static readonly string AllowActionsUseUnsecureNodeVersion = "ACTIONS_ALLOW_USE_UNSECURE_NODE_VERSION";
public static readonly string ManualForceActionsToNode20 = "FORCE_JAVASCRIPT_ACTIONS_TO_NODE20";
}
public static class Agent
@@ -262,6 +267,7 @@ namespace GitHub.Runner.Common
public static readonly string ForcedActionsNodeVersion = "ACTIONS_RUNNER_FORCE_ACTIONS_NODE_VERSION";
public static readonly string PrintLogToStdout = "ACTIONS_RUNNER_PRINT_LOG_TO_STDOUT";
public static readonly string ActionArchiveCacheDirectory = "ACTIONS_RUNNER_ACTION_ARCHIVE_CACHE";
public static readonly string ManualForceActionsToNode20 = "FORCE_JAVASCRIPT_ACTIONS_TO_NODE20";
}
public static class System

View File

@@ -200,6 +200,10 @@ namespace GitHub.Runner.Common
{
_trace.Info($"No proxy settings were found based on environmental variables (http_proxy/https_proxy/HTTP_PROXY/HTTPS_PROXY)");
}
else
{
_userAgents.Add(new ProductInfoHeaderValue("HttpProxyConfigured", bool.TrueString));
}
if (StringUtil.ConvertToBoolean(Environment.GetEnvironmentVariable("GITHUB_ACTIONS_RUNNER_TLS_NO_VERIFY")))
{

View File

@@ -19,7 +19,7 @@ namespace GitHub.Runner.Common
TaskCompletionSource<int> JobRecordUpdated { get; }
event EventHandler<ThrottlingEventArgs> JobServerQueueThrottling;
Task ShutdownAsync();
void Start(Pipelines.AgentJobRequestMessage jobRequest, bool resultsServiceOnly = false);
void QueueWebConsoleLine(Guid stepRecordId, string line, long? lineNumber = null);
void QueueFileUpload(Guid timelineId, Guid timelineRecordId, string type, string name, string path, bool deleteSource);
void QueueResultsUpload(Guid timelineRecordId, string name, string path, string type, bool deleteSource, bool finalize, bool firstBlock, long totalLines);
@@ -74,6 +74,7 @@ namespace GitHub.Runner.Common
private readonly List<JobTelemetry> _jobTelemetries = new();
private bool _queueInProcess = false;
private bool _resultsServiceOnly = false;
private int _resultsServiceExceptionsCount = 0;
private Stopwatch _resultsUploadTimer = new();
private Stopwatch _actionsUploadTimer = new();
@@ -104,11 +105,10 @@ namespace GitHub.Runner.Common
_resultsServer = hostContext.GetService<IResultsServer>();
}
public void Start(Pipelines.AgentJobRequestMessage jobRequest, bool resultsServiceOnly = false)
{
Trace.Entering();
_resultsServiceOnly = resultsServiceOnly;
var serviceEndPoint = jobRequest.Resources.Endpoints.Single(x => string.Equals(x.Name, WellKnownServiceEndpointNames.SystemVssConnection, StringComparison.OrdinalIgnoreCase));
@@ -134,11 +134,17 @@ namespace GitHub.Runner.Common
{
liveConsoleFeedUrl = feedStreamUrl;
}
jobRequest.Variables.TryGetValue("system.github.results_upload_with_sdk", out VariableValue resultsUseSdkVariable);
_resultsServer.InitializeResultsClient(new Uri(resultsReceiverEndpoint), liveConsoleFeedUrl, accessToken, StringUtil.ConvertToBoolean(resultsUseSdkVariable?.Value));
_resultsClientInitiated = true;
}
// Enable telemetry if we have both results service and actions service
if (_resultsClientInitiated && !_resultsServiceOnly)
{
_enableTelemetry = true;
}
if (_queueInProcess)
{
Trace.Info("No-opt, all queue process tasks are running.");
@@ -551,6 +557,10 @@ namespace GitHub.Runner.Common
{
await UploadSummaryFile(file);
}
if (string.Equals(file.Type, CoreAttachmentType.ResultsDiagnosticLog, StringComparison.OrdinalIgnoreCase))
{
await UploadResultsDiagnosticLogsFile(file);
}
else if (String.Equals(file.Type, CoreAttachmentType.ResultsLog, StringComparison.OrdinalIgnoreCase))
{
if (file.RecordId != _jobTimelineRecordId)
@@ -570,9 +580,9 @@ namespace GitHub.Runner.Common
Trace.Info("Catch exception during file upload to results, keep going since the process is best effort."); Trace.Info("Catch exception during file upload to results, keep going since the process is best effort.");
Trace.Error(ex); Trace.Error(ex);
errorCount++; errorCount++;
_resultsServiceExceptionsCount++;
// If we hit any exceptions uploading to Results, let's skip any additional uploads to Results unless Results is serving logs // If we hit any exceptions uploading to Results, let's skip any additional uploads to Results unless Results is serving logs
if (!_resultsServiceOnly) if (!_resultsServiceOnly && _resultsServiceExceptionsCount > 3)
{ {
_resultsClientInitiated = false; _resultsClientInitiated = false;
SendResultsTelemetry(ex); SendResultsTelemetry(ex);
@@ -922,6 +932,17 @@ namespace GitHub.Runner.Common
await UploadResultsFile(file, summaryHandler);
}
private async Task UploadResultsDiagnosticLogsFile(ResultsUploadFileInfo file)
{
Trace.Info($"Starting to upload diagnostic logs file to results service {file.Name}, {file.Path}");
ResultsFileUploadHandler diagnosticLogsHandler = async (file) =>
{
await _resultsServer.CreateResultsDiagnosticLogsAsync(file.PlanId, file.JobId, file.Path, CancellationToken.None);
};
await UploadResultsFile(file, diagnosticLogsHandler);
}
private async Task UploadResultsStepLogFile(ResultsUploadFileInfo file)
{
Trace.Info($"Starting upload of step log file to results service {file.Name}, {file.Path}");

View File

@@ -19,7 +19,7 @@ namespace GitHub.Runner.Common
[ServiceLocator(Default = typeof(ResultServer))]
public interface IResultsServer : IRunnerService, IAsyncDisposable
{
void InitializeResultsClient(Uri uri, string liveConsoleFeedUrl, string token, bool useSdk);
Task<bool> AppendLiveConsoleFeedAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, Guid stepId, IList<string> lines, long? startLine, CancellationToken cancellationToken);
@@ -35,6 +35,8 @@ namespace GitHub.Runner.Common
Task UpdateResultsWorkflowStepsAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId,
IEnumerable<TimelineRecord> records, CancellationToken cancellationToken);
Task CreateResultsDiagnosticLogsAsync(string planId, string jobId, string file, CancellationToken cancellationToken);
}
public sealed class ResultServer : RunnerService, IResultsServer
@@ -51,9 +53,9 @@ namespace GitHub.Runner.Common
private String _liveConsoleFeedUrl;
private string _token;
public void InitializeResultsClient(Uri uri, string liveConsoleFeedUrl, string token, bool useSdk)
{
this._resultsClient = CreateHttpClient(uri, token, useSdk);
_token = token;
if (!string.IsNullOrEmpty(liveConsoleFeedUrl))
@@ -63,7 +65,7 @@ namespace GitHub.Runner.Common
}
}
public ResultsHttpClient CreateHttpClient(Uri uri, string token, bool useSdk)
{
// Using default 100 timeout
RawClientHttpRequestSettings settings = VssUtil.GetHttpRequestSettings(null);
@@ -80,7 +82,7 @@ namespace GitHub.Runner.Common
var pipeline = HttpClientFactory.CreatePipeline(httpMessageHandler, delegatingHandlers);
return new ResultsHttpClient(uri, pipeline, token, disposeHandler: true, useSdk: useSdk);
}
public Task CreateResultsStepSummaryAsync(string planId, string jobId, Guid stepId, string file,
@@ -141,6 +143,18 @@ namespace GitHub.Runner.Common
throw new InvalidOperationException("Results client is not initialized.");
}
public Task CreateResultsDiagnosticLogsAsync(string planId, string jobId, string file,
CancellationToken cancellationToken)
{
if (_resultsClient != null)
{
return _resultsClient.UploadResultsDiagnosticLogsAsync(planId, jobId, file,
cancellationToken: cancellationToken);
}
throw new InvalidOperationException("Results client is not initialized.");
}
public ValueTask DisposeAsync()
{
CloseWebSocket(WebSocketCloseStatus.NormalClosure, CancellationToken.None);

View File

@@ -5,6 +5,7 @@ using System.Threading.Tasks;
using GitHub.Actions.RunService.WebApi;
using GitHub.DistributedTask.Pipelines;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using GitHub.Services.Common;
using Sdk.RSWebApi.Contracts;
@@ -60,7 +61,7 @@ namespace GitHub.Runner.Common
{
CheckConnection();
return RetryRequest<AgentJobRequestMessage>(
async () => await _runServiceHttpClient.GetJobMessageAsync(requestUri, id, VarUtil.OS, cancellationToken), cancellationToken,
shouldRetry: ex => ex is not TaskOrchestrationJobAlreadyAcquiredException);
}

View File

@@ -19,13 +19,6 @@ namespace GitHub.Runner.Common
{
DefaultTraceLevel = TraceLevel.Verbose;
}
else if (int.TryParse(Environment.GetEnvironmentVariable("ACTIONS_RUNNER_TRACE_LEVEL"), out var traceLevel))
{
// force user's TraceLevel to comply with runner TraceLevel enums
traceLevel = Math.Clamp(traceLevel, 0, 5);
DefaultTraceLevel = (TraceLevel)traceLevel;
}
}
[DataMember(EmitDefaultValue = false)]

View File

@@ -24,7 +24,15 @@ namespace GitHub.Runner.Listener
private TimeSpan _getNextMessageRetryInterval;
private TaskAgentStatus runnerStatus = TaskAgentStatus.Online;
private CancellationTokenSource _getMessagesTokenSource;
private VssCredentials _creds;
private TaskAgentSession _session;
private IBrokerServer _brokerServer;
private readonly Dictionary<string, int> _sessionCreationExceptionTracker = new();
private bool _accessTokenRevoked = false;
private readonly TimeSpan _sessionCreationRetryInterval = TimeSpan.FromSeconds(30);
private readonly TimeSpan _sessionConflictRetryLimit = TimeSpan.FromMinutes(4);
private readonly TimeSpan _clockSkewRetryLimit = TimeSpan.FromMinutes(30);
public override void Initialize(IHostContext hostContext)
{
@@ -34,15 +42,140 @@ namespace GitHub.Runner.Listener
_brokerServer = HostContext.GetService<IBrokerServer>();
}
public async Task<CreateSessionResult> CreateSessionAsync(CancellationToken token)
{
Trace.Entering();
// Settings
var configManager = HostContext.GetService<IConfigurationManager>();
_settings = configManager.LoadSettings();
var serverUrl = _settings.ServerUrlV2;
Trace.Info(_settings);
if (string.IsNullOrEmpty(_settings.ServerUrlV2))
{
throw new InvalidOperationException("ServerUrlV2 is not set");
}
// Create connection.
Trace.Info("Loading Credentials");
var credMgr = HostContext.GetService<ICredentialManager>();
_creds = credMgr.LoadCredentials();
var agent = new TaskAgentReference
{
Id = _settings.AgentId,
Name = _settings.AgentName,
Version = BuildConstants.RunnerPackage.Version,
OSDescription = RuntimeInformation.OSDescription,
};
string sessionName = $"{Environment.MachineName ?? "RUNNER"}";
var taskAgentSession = new TaskAgentSession(sessionName, agent);
string errorMessage = string.Empty;
bool encounteringError = false;
while (true)
{
token.ThrowIfCancellationRequested();
Trace.Info($"Attempt to create session.");
try
{
Trace.Info("Connecting to the Broker Server...");
await _brokerServer.ConnectAsync(new Uri(serverUrl), _creds);
Trace.Info("VssConnection created");
_term.WriteLine();
_term.WriteSuccessMessage("Connected to GitHub");
_term.WriteLine();
_session = await _brokerServer.CreateSessionAsync(taskAgentSession, token);
Trace.Info($"Session created.");
if (encounteringError)
{
_term.WriteLine($"{DateTime.UtcNow:u}: Runner reconnected.");
_sessionCreationExceptionTracker.Clear();
encounteringError = false;
}
return CreateSessionResult.Success;
}
catch (OperationCanceledException) when (token.IsCancellationRequested)
{
Trace.Info("Session creation has been cancelled.");
throw;
}
catch (TaskAgentAccessTokenExpiredException)
{
Trace.Info("Runner OAuth token has been revoked. Session creation failed.");
_accessTokenRevoked = true;
throw;
}
catch (Exception ex)
{
Trace.Error("Catch exception during create session.");
Trace.Error(ex);
if (ex is VssOAuthTokenRequestException vssOAuthEx && _creds.Federated is VssOAuthCredential vssOAuthCred)
{
// "invalid_client" means the runner registration has been deleted from the server.
if (string.Equals(vssOAuthEx.Error, "invalid_client", StringComparison.OrdinalIgnoreCase))
{
_term.WriteError("Failed to create a session. The runner registration has been deleted from the server, please re-configure. Runner registrations are automatically deleted for runners that have not connected to the service recently.");
return CreateSessionResult.Failure;
}
// Check whether we get 401 because the runner registration already removed by the service.
// If the runner registration get deleted, we can't exchange oauth token.
Trace.Error("Test oauth app registration.");
var oauthTokenProvider = new VssOAuthTokenProvider(vssOAuthCred, new Uri(serverUrl));
var authError = await oauthTokenProvider.ValidateCredentialAsync(token);
if (string.Equals(authError, "invalid_client", StringComparison.OrdinalIgnoreCase))
{
_term.WriteError("Failed to create a session. The runner registration has been deleted from the server, please re-configure. Runner registrations are automatically deleted for runners that have not connected to the service recently.");
return CreateSessionResult.Failure;
}
}
if (!IsSessionCreationExceptionRetriable(ex))
{
_term.WriteError($"Failed to create session. {ex.Message}");
if (ex is TaskAgentSessionConflictException)
{
return CreateSessionResult.SessionConflict;
}
return CreateSessionResult.Failure;
}
if (!encounteringError) //print the message only on the first error
{
_term.WriteError($"{DateTime.UtcNow:u}: Runner connect error: {ex.Message}. Retrying until reconnected.");
encounteringError = true;
}
Trace.Info("Sleeping for {0} seconds before retrying.", _sessionCreationRetryInterval.TotalSeconds);
await HostContext.Delay(_sessionCreationRetryInterval, token);
}
}
}
public async Task DeleteSessionAsync()
{
if (_session != null && _session.SessionId != Guid.Empty)
{
if (!_accessTokenRevoked)
{
using (var ts = new CancellationTokenSource(TimeSpan.FromSeconds(30)))
{
await _brokerServer.DeleteSessionAsync(ts.Token);
}
}
else
{
Trace.Warning("Runner OAuth token has been revoked. Skip deleting session.");
}
}
}
public void OnJobStatus(object sender, JobStatusEventArgs e)
@@ -73,12 +206,13 @@ namespace GitHub.Runner.Listener
_getMessagesTokenSource = CancellationTokenSource.CreateLinkedTokenSource(token);
try
{
message = await _brokerServer.GetRunnerMessageAsync(_session.SessionId,
runnerStatus,
BuildConstants.RunnerPackage.Version,
VarUtil.OS,
VarUtil.OSArchitecture,
_settings.DisableUpdate,
_getMessagesTokenSource.Token);
if (message == null)
{
@@ -143,7 +277,7 @@ namespace GitHub.Runner.Listener
}
// re-create VssConnection before next retry
await RefreshBrokerConnectionAsync();
Trace.Info("Sleeping for {0} seconds before retrying.", _getNextMessageRetryInterval.TotalSeconds);
await HostContext.Delay(_getNextMessageRetryInterval, token);
@@ -173,6 +307,11 @@ namespace GitHub.Runner.Listener
}
}
public async Task RefreshListenerTokenAsync(CancellationToken cancellationToken)
{
await RefreshBrokerConnectionAsync();
}
public async Task DeleteMessageAsync(TaskAgentMessage message)
{
await Task.CompletedTask;
@@ -196,12 +335,84 @@ namespace GitHub.Runner.Listener
}
}
private bool IsSessionCreationExceptionRetriable(Exception ex)
{
if (ex is TaskAgentNotFoundException)
{
Trace.Info("The runner no longer exists on the server. Stopping the runner.");
_term.WriteError("The runner no longer exists on the server. Please reconfigure the runner.");
return false;
}
else if (ex is TaskAgentSessionConflictException)
{
Trace.Info("The session for this runner already exists.");
_term.WriteError("A session for this runner already exists.");
if (_sessionCreationExceptionTracker.ContainsKey(nameof(TaskAgentSessionConflictException)))
{
_sessionCreationExceptionTracker[nameof(TaskAgentSessionConflictException)]++;
if (_sessionCreationExceptionTracker[nameof(TaskAgentSessionConflictException)] * _sessionCreationRetryInterval.TotalSeconds >= _sessionConflictRetryLimit.TotalSeconds)
{
Trace.Info("The session conflict exception have reached retry limit.");
_term.WriteError($"Stop retry on SessionConflictException after retried for {_sessionConflictRetryLimit.TotalSeconds} seconds.");
return false;
}
}
else
{
_sessionCreationExceptionTracker[nameof(TaskAgentSessionConflictException)] = 1;
}
Trace.Info("The session conflict exception haven't reached retry limit.");
return true;
}
else if (ex is VssOAuthTokenRequestException && ex.Message.Contains("Current server time is"))
{
Trace.Info("Local clock might be skewed.");
_term.WriteError("The local machine's clock may be out of sync with the server time by more than five minutes. Please sync your clock with your domain or internet time and try again.");
if (_sessionCreationExceptionTracker.ContainsKey(nameof(VssOAuthTokenRequestException)))
{
_sessionCreationExceptionTracker[nameof(VssOAuthTokenRequestException)]++;
if (_sessionCreationExceptionTracker[nameof(VssOAuthTokenRequestException)] * _sessionCreationRetryInterval.TotalSeconds >= _clockSkewRetryLimit.TotalSeconds)
{
Trace.Info("The OAuth token request exception have reached retry limit.");
_term.WriteError($"Stopped retrying OAuth token request exception after {_clockSkewRetryLimit.TotalSeconds} seconds.");
return false;
}
}
else
{
_sessionCreationExceptionTracker[nameof(VssOAuthTokenRequestException)] = 1;
}
Trace.Info("The OAuth token request exception haven't reached retry limit.");
return true;
}
else if (ex is TaskAgentPoolNotFoundException ||
ex is AccessDeniedException ||
ex is VssUnauthorizedException)
{
Trace.Info($"Non-retriable exception: {ex.Message}");
return false;
}
else if (ex is InvalidOperationException)
{
Trace.Info($"Non-retriable exception: {ex.Message}");
return false;
}
else
{
Trace.Info($"Retriable exception: {ex.Message}");
return true;
}
}
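The retry-limit check above multiplies the number of failed attempts by the session-creation retry interval and stops retrying once that product crosses the configured limit. Below is a minimal standalone sketch of that bookkeeping; the 30-second interval and 4-hour cap are illustrative assumptions, not values taken from this diff.

using System;
using System.Collections.Generic;

// Sketch of the retry-budget bookkeeping used by IsSessionCreationExceptionRetriable.
// The interval and limit values here are assumptions for the demo.
class RetryBudgetDemo
{
    static readonly Dictionary<string, int> _tracker = new();
    static readonly TimeSpan _retryInterval = TimeSpan.FromSeconds(30);
    static readonly TimeSpan _retryLimit = TimeSpan.FromHours(4);

    // Returns true while the accumulated retry time stays under the limit.
    static bool ShouldKeepRetrying(string exceptionName)
    {
        _tracker[exceptionName] = _tracker.TryGetValue(exceptionName, out var count) ? count + 1 : 1;
        return _tracker[exceptionName] * _retryInterval.TotalSeconds < _retryLimit.TotalSeconds;
    }

    static void Main()
    {
        for (var attempt = 1; attempt <= 3; attempt++)
        {
            Console.WriteLine($"Attempt {attempt}: keep retrying = {ShouldKeepRetrying("SessionConflict")}");
        }
    }
}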
private async Task RefreshBrokerConnectionAsync()
{
var configManager = HostContext.GetService<IConfigurationManager>();
_settings = configManager.LoadSettings();
- if (_settings.ServerUrlV2 == null)
+ if (string.IsNullOrEmpty(_settings.ServerUrlV2))
{
throw new InvalidOperationException("ServerUrlV2 is not set");
}


@@ -39,6 +39,7 @@ namespace GitHub.Runner.Listener.Check
string githubApiUrl = null;
string actionsTokenServiceUrl = null;
string actionsPipelinesServiceUrl = null;
string resultsReceiverServiceUrl = null;
var urlBuilder = new UriBuilder(url);
if (UrlUtil.IsHostedServer(urlBuilder))
{
@@ -47,6 +48,7 @@ namespace GitHub.Runner.Listener.Check
githubApiUrl = urlBuilder.Uri.AbsoluteUri;
actionsTokenServiceUrl = "https://vstoken.actions.githubusercontent.com/_apis/health";
actionsPipelinesServiceUrl = "https://pipelines.actions.githubusercontent.com/_apis/health";
resultsReceiverServiceUrl = "https://results-receiver.actions.githubusercontent.com/health";
} }
else else
{ {
@@ -56,13 +58,31 @@ namespace GitHub.Runner.Listener.Check
actionsTokenServiceUrl = urlBuilder.Uri.AbsoluteUri;
urlBuilder.Path = "_services/pipelines/_apis/health";
actionsPipelinesServiceUrl = urlBuilder.Uri.AbsoluteUri;
resultsReceiverServiceUrl = string.Empty; // we don't have Results service in GHES yet.
} }
var codeLoadUrlBuilder = new UriBuilder(url);
codeLoadUrlBuilder.Host = $"codeload.{codeLoadUrlBuilder.Host}";
codeLoadUrlBuilder.Path = "_ping";
// check github api
checkTasks.Add(CheckUtil.CheckDns(githubApiUrl));
checkTasks.Add(CheckUtil.CheckPing(githubApiUrl));
checkTasks.Add(HostContext.CheckHttpsGetRequests(githubApiUrl, pat, expectedHeader: "X-GitHub-Request-Id"));
// check github codeload
checkTasks.Add(CheckUtil.CheckDns(codeLoadUrlBuilder.Uri.AbsoluteUri));
checkTasks.Add(CheckUtil.CheckPing(codeLoadUrlBuilder.Uri.AbsoluteUri));
checkTasks.Add(HostContext.CheckHttpsGetRequests(codeLoadUrlBuilder.Uri.AbsoluteUri, pat, expectedHeader: "X-GitHub-Request-Id"));
// check results-receiver service
if (!string.IsNullOrEmpty(resultsReceiverServiceUrl))
{
checkTasks.Add(CheckUtil.CheckDns(resultsReceiverServiceUrl));
checkTasks.Add(CheckUtil.CheckPing(resultsReceiverServiceUrl));
checkTasks.Add(HostContext.CheckHttpsGetRequests(resultsReceiverServiceUrl, pat, expectedHeader: "X-GitHub-Request-Id"));
}
// check actions token service
checkTasks.Add(CheckUtil.CheckDns(actionsTokenServiceUrl));
checkTasks.Add(CheckUtil.CheckPing(actionsTokenServiceUrl));


@@ -35,7 +35,7 @@ namespace GitHub.Runner.Listener
// This implementation of IJobDispatcher is not thread safe.
// It is based on the fact that the current design of the runner is a dequeue
// and processes one message from the message queue at a time.
// In addition, it only executes one job every time,
// and the server will not send another job while this one is still running.
public sealed class JobDispatcher : RunnerService, IJobDispatcher
{
@@ -546,13 +546,27 @@ namespace GitHub.Runner.Listener
Trace.Info($"Return code {returnCode} indicate worker encounter an unhandled exception or app crash, attach worker stdout/stderr to JobRequest result."); Trace.Info($"Return code {returnCode} indicate worker encounter an unhandled exception or app crash, attach worker stdout/stderr to JobRequest result.");
var jobServer = await InitializeJobServerAsync(systemConnection); var jobServer = await InitializeJobServerAsync(systemConnection);
await LogWorkerProcessUnhandledException(jobServer, message, detailInfo); var unhandledExceptionIssue = new Issue() { Type = IssueType.Error, Message = detailInfo };
unhandledExceptionIssue.Data[Constants.Runner.InternalTelemetryIssueDataKey] = Constants.Runner.WorkerCrash;
// Go ahead to finish the job with result 'Failed' if the STDERR from worker is System.IO.IOException, since it typically means we are running out of disk space. switch (jobServer)
if (detailInfo.Contains(typeof(System.IO.IOException).ToString(), StringComparison.OrdinalIgnoreCase))
{ {
Trace.Info($"Finish job with result 'Failed' due to IOException."); case IJobServer js:
await ForceFailJob(jobServer, message, detailInfo); {
await LogWorkerProcessUnhandledException(js, message, unhandledExceptionIssue);
// Go ahead to finish the job with result 'Failed' if the STDERR from worker is System.IO.IOException, since it typically means we are running out of disk space.
if (detailInfo.Contains(typeof(System.IO.IOException).ToString(), StringComparison.OrdinalIgnoreCase))
{
Trace.Info($"Finish job with result 'Failed' due to IOException.");
await ForceFailJob(js, message);
}
break;
}
case IRunServer rs:
await ForceFailJob(rs, message, unhandledExceptionIssue);
break;
default:
throw new NotSupportedException($"JobServer type '{jobServer.GetType().Name}' is not supported.");
} }
} }
@@ -629,8 +643,22 @@ namespace GitHub.Runner.Listener
Trace.Info("worker process has been killed."); Trace.Info("worker process has been killed.");
} }
} }
catch (Exception ex)
{
// message send failed, this might indicate worker process is already exited or stuck.
Trace.Info($"Job cancel message sending for job {message.JobId} failed, kill running worker. {ex}");
workerProcessCancelTokenSource.Cancel();
try
{
await workerProcessTask;
}
catch (OperationCanceledException)
{
Trace.Info("worker process has been killed.");
}
}
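The added catch block treats a failure to deliver the cancel message as a sign the worker is already gone or stuck, and falls back to cancelling the worker process itself. Below is a minimal sketch of that "ask nicely, then kill" pattern; the names are illustrative and a Task.Delay stands in for the worker process task.

using System;
using System.Threading;
using System.Threading.Tasks;

// Sketch only: if sending the cancel message fails, cancel the worker's own token
// so the process task unwinds, then observe the expected cancellation.
class CancelThenKillDemo
{
    static async Task Main()
    {
        using var workerCts = new CancellationTokenSource();
        var workerTask = Task.Delay(Timeout.Infinite, workerCts.Token); // stands in for the worker process

        try
        {
            throw new InvalidOperationException("simulated failure sending the cancel message");
        }
        catch (Exception ex)
        {
            Console.WriteLine($"Cancel message failed ({ex.Message}); killing worker.");
            workerCts.Cancel();
            try { await workerTask; }
            catch (OperationCanceledException) { Console.WriteLine("worker process has been killed."); }
        }
    }
}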
// wait worker to exit
// if worker doesn't exit within timeout, then kill worker.
completedTask = await Task.WhenAny(workerProcessTask, Task.Delay(-1, workerCancelTimeoutKillToken));
@@ -1117,77 +1145,65 @@ namespace GitHub.Runner.Listener
}
// log an error issue to job level timeline record
- private async Task LogWorkerProcessUnhandledException(IRunnerService server, Pipelines.AgentJobRequestMessage message, string detailInfo)
- {
-     if (server is IJobServer jobServer)
-     {
-         try
-         {
-             var timeline = await jobServer.GetTimelineAsync(message.Plan.ScopeIdentifier, message.Plan.PlanType, message.Plan.PlanId, message.Timeline.Id, CancellationToken.None);
-             ArgUtil.NotNull(timeline, nameof(timeline));
-             TimelineRecord jobRecord = timeline.Records.FirstOrDefault(x => x.Id == message.JobId && x.RecordType == "Job");
-             ArgUtil.NotNull(jobRecord, nameof(jobRecord));
-             var unhandledExceptionIssue = new Issue() { Type = IssueType.Error, Message = detailInfo };
-             unhandledExceptionIssue.Data[Constants.Runner.InternalTelemetryIssueDataKey] = Constants.Runner.WorkerCrash;
-             jobRecord.ErrorCount++;
-             jobRecord.Issues.Add(unhandledExceptionIssue);
-             await jobServer.UpdateTimelineRecordsAsync(message.Plan.ScopeIdentifier, message.Plan.PlanType, message.Plan.PlanId, message.Timeline.Id, new TimelineRecord[] { jobRecord }, CancellationToken.None);
-         }
-         catch (Exception ex)
-         {
-             Trace.Error("Fail to report unhandled exception from Runner.Worker process");
-             Trace.Error(ex);
-         }
-     }
-     else
-     {
-         Trace.Info("Job server does not support handling unhandled exception yet, error message: {0}", detailInfo);
-         return;
-     }
- }
+ private async Task LogWorkerProcessUnhandledException(IJobServer jobServer, Pipelines.AgentJobRequestMessage message, Issue issue)
+ {
+     try
+     {
+         var timeline = await jobServer.GetTimelineAsync(message.Plan.ScopeIdentifier, message.Plan.PlanType, message.Plan.PlanId, message.Timeline.Id, CancellationToken.None);
+         ArgUtil.NotNull(timeline, nameof(timeline));
+         TimelineRecord jobRecord = timeline.Records.FirstOrDefault(x => x.Id == message.JobId && x.RecordType == "Job");
+         ArgUtil.NotNull(jobRecord, nameof(jobRecord));
+         jobRecord.ErrorCount++;
+         jobRecord.Issues.Add(issue);
+         Trace.Info("Mark the job as failed since the worker crashed");
+         jobRecord.Result = TaskResult.Failed;
+         // mark the job as completed so service will pickup the result
+         jobRecord.State = TimelineRecordState.Completed;
+         await jobServer.UpdateTimelineRecordsAsync(message.Plan.ScopeIdentifier, message.Plan.PlanType, message.Plan.PlanId, message.Timeline.Id, new TimelineRecord[] { jobRecord }, CancellationToken.None);
+     }
+     catch (Exception ex)
+     {
+         Trace.Error("Fail to report unhandled exception from Runner.Worker process");
+         Trace.Error(ex);
+     }
+ }
// raise job completed event to fail the job.
- private async Task ForceFailJob(IRunnerService server, Pipelines.AgentJobRequestMessage message, string detailInfo)
- {
-     if (server is IJobServer jobServer)
-     {
-         try
-         {
-             var jobCompletedEvent = new JobCompletedEvent(message.RequestId, message.JobId, TaskResult.Failed);
-             await jobServer.RaisePlanEventAsync<JobCompletedEvent>(message.Plan.ScopeIdentifier, message.Plan.PlanType, message.Plan.PlanId, jobCompletedEvent, CancellationToken.None);
-         }
-         catch (Exception ex)
-         {
-             Trace.Error("Fail to raise JobCompletedEvent back to service.");
-             Trace.Error(ex);
-         }
-     }
-     else if (server is IRunServer runServer)
-     {
-         try
-         {
-             var unhandledExceptionIssue = new Issue() { Type = IssueType.Error, Message = detailInfo };
-             var unhandledAnnotation = unhandledExceptionIssue.ToAnnotation();
-             var jobAnnotations = new List<Annotation>();
-             if (unhandledAnnotation.HasValue)
-             {
-                 jobAnnotations.Add(unhandledAnnotation.Value);
-             }
-             await runServer.CompleteJobAsync(message.Plan.PlanId, message.JobId, TaskResult.Failed, outputs: null, stepResults: null, jobAnnotations: jobAnnotations, environmentUrl: null, CancellationToken.None);
-         }
-         catch (Exception ex)
-         {
-             Trace.Error("Fail to raise job completion back to service.");
-             Trace.Error(ex);
-         }
-     }
-     else
-     {
-         throw new NotSupportedException($"Server type {server.GetType().FullName} is not supported.");
-     }
- }
+ private async Task ForceFailJob(IJobServer jobServer, Pipelines.AgentJobRequestMessage message)
+ {
+     try
+     {
+         var jobCompletedEvent = new JobCompletedEvent(message.RequestId, message.JobId, TaskResult.Failed);
+         await jobServer.RaisePlanEventAsync<JobCompletedEvent>(message.Plan.ScopeIdentifier, message.Plan.PlanType, message.Plan.PlanId, jobCompletedEvent, CancellationToken.None);
+     }
+     catch (Exception ex)
+     {
+         Trace.Error("Fail to raise JobCompletedEvent back to service.");
+         Trace.Error(ex);
+     }
+ }
+ private async Task ForceFailJob(IRunServer runServer, Pipelines.AgentJobRequestMessage message, Issue issue)
+ {
+     try
+     {
+         var annotation = issue.ToAnnotation();
+         var jobAnnotations = new List<Annotation>();
+         if (annotation.HasValue)
+         {
+             jobAnnotations.Add(annotation.Value);
+         }
+         await runServer.CompleteJobAsync(message.Plan.PlanId, message.JobId, TaskResult.Failed, outputs: null, stepResults: null, jobAnnotations: jobAnnotations, environmentUrl: null, CancellationToken.None);
+     }
+     catch (Exception ex)
+     {
+         Trace.Error("Fail to raise job completion back to service.");
+         Trace.Error(ex);
+     }
+ }


@@ -14,16 +14,26 @@ using GitHub.Runner.Listener.Configuration;
using GitHub.Runner.Sdk;
using GitHub.Services.Common;
using GitHub.Services.OAuth;
using GitHub.Services.WebApi;
namespace GitHub.Runner.Listener namespace GitHub.Runner.Listener
{ {
public enum CreateSessionResult
{
Success,
Failure,
SessionConflict
}
[ServiceLocator(Default = typeof(MessageListener))] [ServiceLocator(Default = typeof(MessageListener))]
public interface IMessageListener : IRunnerService public interface IMessageListener : IRunnerService
{ {
- Task<Boolean> CreateSessionAsync(CancellationToken token);
+ Task<CreateSessionResult> CreateSessionAsync(CancellationToken token);
Task DeleteSessionAsync();
Task<TaskAgentMessage> GetNextMessageAsync(CancellationToken token);
Task DeleteMessageAsync(TaskAgentMessage message);
Task RefreshListenerTokenAsync(CancellationToken token);
void OnJobStatus(object sender, JobStatusEventArgs e);
} }
@@ -33,6 +43,7 @@ namespace GitHub.Runner.Listener
private RunnerSettings _settings;
private ITerminal _term;
private IRunnerServer _runnerServer;
private IBrokerServer _brokerServer;
private TaskAgentSession _session;
private TimeSpan _getNextMessageRetryInterval;
private bool _accessTokenRevoked = false;
@@ -42,6 +53,9 @@ namespace GitHub.Runner.Listener
private readonly Dictionary<string, int> _sessionCreationExceptionTracker = new();
private TaskAgentStatus runnerStatus = TaskAgentStatus.Online;
private CancellationTokenSource _getMessagesTokenSource;
private VssCredentials _creds;
private bool _isBrokerSession = false;
public override void Initialize(IHostContext hostContext)
{ {
@@ -49,9 +63,10 @@ namespace GitHub.Runner.Listener
_term = HostContext.GetService<ITerminal>();
_runnerServer = HostContext.GetService<IRunnerServer>();
_brokerServer = hostContext.GetService<IBrokerServer>();
} }
- public async Task<Boolean> CreateSessionAsync(CancellationToken token)
+ public async Task<CreateSessionResult> CreateSessionAsync(CancellationToken token)
{ {
Trace.Entering(); Trace.Entering();
@@ -64,7 +79,7 @@ namespace GitHub.Runner.Listener
// Create connection.
Trace.Info("Loading Credentials");
var credMgr = HostContext.GetService<ICredentialManager>();
- VssCredentials creds = credMgr.LoadCredentials();
+ _creds = credMgr.LoadCredentials();
var agent = new TaskAgentReference var agent = new TaskAgentReference
{ {
@@ -86,7 +101,7 @@ namespace GitHub.Runner.Listener
try
{
Trace.Info("Connecting to the Runner Server...");
- await _runnerServer.ConnectAsync(new Uri(serverUrl), creds);
+ await _runnerServer.ConnectAsync(new Uri(serverUrl), _creds);
Trace.Info("VssConnection created"); Trace.Info("VssConnection created");
_term.WriteLine(); _term.WriteLine();
@@ -98,6 +113,15 @@ namespace GitHub.Runner.Listener
taskAgentSession, taskAgentSession,
token); token);
if (_session.BrokerMigrationMessage != null)
{
Trace.Info("Runner session is in migration mode: Creating Broker session with BrokerBaseUrl: {0}", _session.BrokerMigrationMessage.BrokerBaseUrl);
await _brokerServer.UpdateConnectionIfNeeded(_session.BrokerMigrationMessage.BrokerBaseUrl, _creds);
_session = await _brokerServer.CreateSessionAsync(taskAgentSession, token);
_isBrokerSession = true;
}
Trace.Info($"Session created."); Trace.Info($"Session created.");
if (encounteringError) if (encounteringError)
{ {
@@ -106,7 +130,7 @@ namespace GitHub.Runner.Listener
encounteringError = false;
}
- return true;
+ return CreateSessionResult.Success;
} }
catch (OperationCanceledException) when (token.IsCancellationRequested) catch (OperationCanceledException) when (token.IsCancellationRequested)
{ {
@@ -124,13 +148,13 @@ namespace GitHub.Runner.Listener
Trace.Error("Catch exception during create session."); Trace.Error("Catch exception during create session.");
Trace.Error(ex); Trace.Error(ex);
if (ex is VssOAuthTokenRequestException vssOAuthEx && creds.Federated is VssOAuthCredential vssOAuthCred) if (ex is VssOAuthTokenRequestException vssOAuthEx && _creds.Federated is VssOAuthCredential vssOAuthCred)
{ {
// "invalid_client" means the runner registration has been deleted from the server. // "invalid_client" means the runner registration has been deleted from the server.
if (string.Equals(vssOAuthEx.Error, "invalid_client", StringComparison.OrdinalIgnoreCase)) if (string.Equals(vssOAuthEx.Error, "invalid_client", StringComparison.OrdinalIgnoreCase))
{ {
_term.WriteError("Failed to create a session. The runner registration has been deleted from the server, please re-configure. Runner registrations are automatically deleted for runners that have not connected to the service recently."); _term.WriteError("Failed to create a session. The runner registration has been deleted from the server, please re-configure. Runner registrations are automatically deleted for runners that have not connected to the service recently.");
return false; return CreateSessionResult.Failure;
} }
// Check whether we get 401 because the runner registration already removed by the service. // Check whether we get 401 because the runner registration already removed by the service.
@@ -141,14 +165,18 @@ namespace GitHub.Runner.Listener
if (string.Equals(authError, "invalid_client", StringComparison.OrdinalIgnoreCase)) if (string.Equals(authError, "invalid_client", StringComparison.OrdinalIgnoreCase))
{ {
_term.WriteError("Failed to create a session. The runner registration has been deleted from the server, please re-configure. Runner registrations are automatically deleted for runners that have not connected to the service recently."); _term.WriteError("Failed to create a session. The runner registration has been deleted from the server, please re-configure. Runner registrations are automatically deleted for runners that have not connected to the service recently.");
return false; return CreateSessionResult.Failure;
} }
} }
if (!IsSessionCreationExceptionRetriable(ex))
{
_term.WriteError($"Failed to create session. {ex.Message}");
- return false;
+ if (ex is TaskAgentSessionConflictException)
{
return CreateSessionResult.SessionConflict;
}
return CreateSessionResult.Failure;
} }
if (!encounteringError) //print the message only on the first error
@@ -172,6 +200,11 @@ namespace GitHub.Runner.Listener
using (var ts = new CancellationTokenSource(TimeSpan.FromSeconds(30)))
{
await _runnerServer.DeleteAgentSessionAsync(_settings.PoolId, _session.SessionId, ts.Token);
if (_isBrokerSession)
{
await _brokerServer.DeleteSessionAsync(ts.Token);
}
} }
} }
else else
@@ -183,19 +216,17 @@ namespace GitHub.Runner.Listener
public void OnJobStatus(object sender, JobStatusEventArgs e)
{
-     if (StringUtil.ConvertToBoolean(Environment.GetEnvironmentVariable("USE_BROKER_FLOW")))
-     {
-         Trace.Info("Received job status event. JobState: {0}", e.Status);
-         runnerStatus = e.Status;
-         try
-         {
-             _getMessagesTokenSource?.Cancel();
-         }
-         catch (ObjectDisposedException)
-         {
-             Trace.Info("_getMessagesTokenSource is already disposed.");
-         }
-     }
+     Trace.Info("Received job status event. JobState: {0}", e.Status);
+     runnerStatus = e.Status;
+     try
+     {
+         _getMessagesTokenSource?.Cancel();
+     }
+     catch (ObjectDisposedException)
+     {
+         Trace.Info("_getMessagesTokenSource is already disposed.");
+     }
}
public async Task<TaskAgentMessage> GetNextMessageAsync(CancellationToken token) public async Task<TaskAgentMessage> GetNextMessageAsync(CancellationToken token)
@@ -205,6 +236,7 @@ namespace GitHub.Runner.Listener
ArgUtil.NotNull(_settings, nameof(_settings)); ArgUtil.NotNull(_settings, nameof(_settings));
bool encounteringError = false; bool encounteringError = false;
int continuousError = 0; int continuousError = 0;
int continuousEmptyMessage = 0;
string errorMessage = string.Empty; string errorMessage = string.Empty;
Stopwatch heartbeat = new(); Stopwatch heartbeat = new();
heartbeat.Restart(); heartbeat.Restart();
@@ -228,6 +260,23 @@ namespace GitHub.Runner.Listener
// Decrypt the message body if the session is using encryption // Decrypt the message body if the session is using encryption
message = DecryptMessage(message); message = DecryptMessage(message);
if (message != null && message.MessageType == BrokerMigrationMessage.MessageType)
{
Trace.Info("BrokerMigration message received. Polling Broker for messages...");
var migrationMessage = JsonUtility.FromString<BrokerMigrationMessage>(message.Body);
await _brokerServer.UpdateConnectionIfNeeded(migrationMessage.BrokerBaseUrl, _creds);
message = await _brokerServer.GetRunnerMessageAsync(_session.SessionId,
runnerStatus,
BuildConstants.RunnerPackage.Version,
VarUtil.OS,
VarUtil.OSArchitecture,
_settings.DisableUpdate,
token);
}
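The block above handles the handoff case: the message queue answers with a BrokerMigration message whose body points at the broker, and the runner immediately re-polls the broker for the real message. The sketch below shows the same flow with simplified, assumed types; the JSON shape and method names are illustrative, not the runner's actual BrokerMigrationMessage contract.

using System;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;

// Sketch only: when a migration message arrives, parse the broker base URL from its body
// and fetch the real message from the broker instead.
class BrokerMigrationDemo
{
    record RunnerMessage(string MessageType, string Body);
    record BrokerMigrationBody(Uri BrokerBaseUrl);

    static async Task<RunnerMessage> GetMessageAsync(RunnerMessage fromQueue, CancellationToken token)
    {
        if (fromQueue?.MessageType == "BrokerMigration")
        {
            var migration = JsonSerializer.Deserialize<BrokerMigrationBody>(fromQueue.Body);
            Console.WriteLine($"Re-polling broker at {migration.BrokerBaseUrl}");
            return await PollBrokerAsync(migration.BrokerBaseUrl, token);
        }
        return fromQueue;
    }

    // Stand-in for the broker call; the real runner goes through IBrokerServer.
    static Task<RunnerMessage> PollBrokerAsync(Uri brokerBaseUrl, CancellationToken token)
        => Task.FromResult(new RunnerMessage("PipelineAgentJobRequest", "{}"));

    static async Task Main()
    {
        var migration = new RunnerMessage("BrokerMigration", "{\"BrokerBaseUrl\":\"https://broker.example\"}");
        var real = await GetMessageAsync(migration, CancellationToken.None);
        Console.WriteLine(real.MessageType);
    }
}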
if (message != null)
{
_lastMessageId = message.MessageId;
@@ -266,7 +315,7 @@ namespace GitHub.Runner.Listener
Trace.Error(ex);
// don't retry if SkipSessionRecover = true, DT service will delete agent session to stop agent from taking more jobs.
- if (ex is TaskAgentSessionExpiredException && !_settings.SkipSessionRecover && await CreateSessionAsync(token))
+ if (ex is TaskAgentSessionExpiredException && !_settings.SkipSessionRecover && (await CreateSessionAsync(token) == CreateSessionResult.Success))
{
Trace.Info($"{nameof(TaskAgentSessionExpiredException)} received, recovered by recreate session.");
}
@@ -311,16 +360,27 @@ namespace GitHub.Runner.Listener
if (message == null)
{
continuousEmptyMessage++;
if (heartbeat.Elapsed > TimeSpan.FromMinutes(30))
{
Trace.Info($"No message retrieved from session '{_session.SessionId}' within last 30 minutes.");
heartbeat.Restart();
continuousEmptyMessage = 0;
}
else
{
Trace.Verbose($"No message retrieved from session '{_session.SessionId}'.");
}
if (continuousEmptyMessage > 50)
{
// Retried more than 50 times in less than 30 minutes and still getting empty messages;
// something is not right on the service side, so back off for 15-30 seconds before retrying.
_getNextMessageRetryInterval = BackoffTimerHelper.GetRandomBackoff(TimeSpan.FromSeconds(15), TimeSpan.FromSeconds(30), _getNextMessageRetryInterval);
Trace.Info("Sleeping for {0} seconds before retrying.", _getNextMessageRetryInterval.TotalSeconds);
await HostContext.Delay(_getNextMessageRetryInterval, token);
}
continue;
}
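The backoff above only says "pick a random delay between 15 and 30 seconds"; a plausible shape for a helper like BackoffTimerHelper.GetRandomBackoff is sketched below. Only the min/max behaviour is implied by the calling code; the internals (and dropping the previous-delay parameter) are assumptions.

using System;

// Illustrative sketch: a random backoff spreads idle runners out so they do not retry in lockstep.
class RandomBackoffDemo
{
    static readonly Random _random = new();

    static TimeSpan GetRandomBackoff(TimeSpan min, TimeSpan max)
    {
        var ms = _random.NextDouble() * (max.TotalMilliseconds - min.TotalMilliseconds) + min.TotalMilliseconds;
        return TimeSpan.FromMilliseconds(ms);
    }

    static void Main()
    {
        // after >50 empty polls inside 30 minutes, sleep somewhere between 15 and 30 seconds
        var delay = GetRandomBackoff(TimeSpan.FromSeconds(15), TimeSpan.FromSeconds(30));
        Console.WriteLine($"Backing off for {delay.TotalSeconds:F1}s before polling again");
    }
}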
@@ -343,6 +403,12 @@ namespace GitHub.Runner.Listener
} }
} }
public async Task RefreshListenerTokenAsync(CancellationToken cancellationToken)
{
await _runnerServer.RefreshConnectionAsync(RunnerConnectionType.MessageQueue, TimeSpan.FromSeconds(60));
await _brokerServer.ForceRefreshConnection(_creds);
}
private TaskAgentMessage DecryptMessage(TaskAgentMessage message) private TaskAgentMessage DecryptMessage(TaskAgentMessage message)
{ {
if (_session.EncryptionKey == null || if (_session.EncryptionKey == null ||


@@ -25,12 +25,6 @@
<PackageReference Include="System.ServiceProcess.ServiceController" Version="4.4.0" /> <PackageReference Include="System.ServiceProcess.ServiceController" Version="4.4.0" />
</ItemGroup> </ItemGroup>
<ItemGroup>
<EmbeddedResource Include="..\Misc\runnercoreassets">
<LogicalName>GitHub.Runner.Listener.runnercoreassets</LogicalName>
</EmbeddedResource>
</ItemGroup>
<PropertyGroup Condition=" '$(Configuration)' == 'Debug' "> <PropertyGroup Condition=" '$(Configuration)' == 'Debug' ">
<DebugType>portable</DebugType> <DebugType>portable</DebugType>
</PropertyGroup> </PropertyGroup>


@@ -359,7 +359,12 @@ namespace GitHub.Runner.Listener
{
Trace.Info(nameof(RunAsync));
_listener = GetMesageListener(settings);
- if (!await _listener.CreateSessionAsync(HostContext.RunnerShutdownToken))
+ CreateSessionResult createSessionResult = await _listener.CreateSessionAsync(HostContext.RunnerShutdownToken);
if (createSessionResult == CreateSessionResult.SessionConflict)
{
return Constants.Runner.ReturnCode.SessionConflict;
}
else if (createSessionResult == CreateSessionResult.Failure)
{
return Constants.Runner.ReturnCode.TerminatedError;
}
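With CreateSessionResult in place, RunAsync can surface a dedicated exit code for a session conflict so the entry scripts can react differently than they do to a terminal failure. The sketch below shows one way that mapping could look; the numeric values are assumptions for the demo, not the runner's actual Constants.Runner.ReturnCode values.

using System;

// Illustrative mapping only; real exit codes live in Constants.Runner.ReturnCode.
class SessionResultExitCodeDemo
{
    enum CreateSessionResult { Success, Failure, SessionConflict }

    static int ToExitCode(CreateSessionResult result) => result switch
    {
        CreateSessionResult.Success => 0,
        CreateSessionResult.SessionConflict => 5, // assumed value
        _ => 1,                                   // assumed value for terminal failure
    };

    static void Main()
    {
        Console.WriteLine(ToExitCode(CreateSessionResult.SessionConflict));
    }
}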
@@ -567,6 +572,11 @@ namespace GitHub.Runner.Listener
Trace.Info("Job is already acquired, skip this message."); Trace.Info("Job is already acquired, skip this message.");
continue; continue;
} }
catch (Exception ex)
{
Trace.Error($"Caught exception from acquiring job message: {ex}");
continue;
}
} }
jobDispatcher.Run(jobRequestMessage, runOnce); jobDispatcher.Run(jobRequestMessage, runOnce);
@@ -596,6 +606,11 @@ namespace GitHub.Runner.Listener
Trace.Info($"Service requests the hosted runner to shutdown. Reason: '{HostedRunnerShutdownMessage.Reason}'."); Trace.Info($"Service requests the hosted runner to shutdown. Reason: '{HostedRunnerShutdownMessage.Reason}'.");
return Constants.Runner.ReturnCode.Success; return Constants.Runner.ReturnCode.Success;
} }
else if (string.Equals(message.MessageType, TaskAgentMessageTypes.ForceTokenRefresh))
{
Trace.Info("Received ForceTokenRefreshMessage");
await _listener.RefreshListenerTokenAsync(messageQueueLoopTokenSource.Token);
}
else else
{ {
Trace.Error($"Received message {message.MessageId} with unsupported message type {message.MessageType}."); Trace.Error($"Received message {message.MessageId} with unsupported message type {message.MessageType}.");
@@ -634,6 +649,7 @@ namespace GitHub.Runner.Listener
{ {
try try
{ {
Trace.Info("Deleting Runner Session...");
await _listener.DeleteSessionAsync(); await _listener.DeleteSessionAsync();
} }
catch (Exception ex) when (runOnce) catch (Exception ex) when (runOnce)


@@ -6,13 +6,11 @@ using System.IO;
using System.IO.Compression;
using System.Linq;
using System.Net.Http;
using System.Reflection;
using System.Security.Cryptography;
using System.Threading;
using System.Threading.Tasks;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using GitHub.Services.Common;
using GitHub.Services.WebApi;
@@ -30,20 +28,14 @@ namespace GitHub.Runner.Listener
{ {
private static string _packageType = "agent"; private static string _packageType = "agent";
private static string _platform = BuildConstants.RunnerPackage.PackageName; private static string _platform = BuildConstants.RunnerPackage.PackageName;
private static string _dotnetRuntime = "dotnetRuntime";
private static string _externals = "externals";
private readonly Dictionary<string, string> _contentHashes = new();
private PackageMetadata _targetPackage;
private ITerminal _terminal;
private IRunnerServer _runnerServer;
private int _poolId;
private ulong _agentId;
private const int _numberOfOldVersionsToKeep = 1;
private readonly ConcurrentQueue<string> _updateTrace = new();
private Task _cloneAndCalculateContentHashTask;
private string _dotnetRuntimeCloneDirectory;
private string _externalsCloneDirectory;
public bool Busy { get; private set; }
public override void Initialize(IHostContext hostContext)
@@ -56,8 +48,6 @@ namespace GitHub.Runner.Listener
var settings = configStore.GetSettings();
_poolId = settings.PoolId;
_agentId = settings.AgentId;
_dotnetRuntimeCloneDirectory = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Work), "__dotnet_runtime__");
_externalsCloneDirectory = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Work), "__externals__");
} }
public async Task<bool> SelfUpdate(AgentRefreshMessage updateMessage, IJobDispatcher jobDispatcher, bool restartInteractiveRunner, CancellationToken token)
@@ -67,13 +57,6 @@ namespace GitHub.Runner.Listener
{ {
var totalUpdateTime = Stopwatch.StartNew();
// Copy dotnet runtime and externals of current runner to a temp folder
// So we can re-use them with trimmed runner package, if possible.
// This process is best effort, if we can't use trimmed runner package,
// we will just go with the full package.
var linkedTokenSource = CancellationTokenSource.CreateLinkedTokenSource(token);
_cloneAndCalculateContentHashTask = CloneAndCalculateAssetsHash(_dotnetRuntimeCloneDirectory, _externalsCloneDirectory, linkedTokenSource.Token);
if (!await UpdateNeeded(updateMessage.TargetVersion, token))
{
Trace.Info($"Can't find available update package.");
@@ -87,24 +70,6 @@ namespace GitHub.Runner.Listener
await UpdateRunnerUpdateStateAsync("Runner update in progress, do not shutdown runner."); await UpdateRunnerUpdateStateAsync("Runner update in progress, do not shutdown runner.");
await UpdateRunnerUpdateStateAsync($"Downloading {_targetPackage.Version} runner"); await UpdateRunnerUpdateStateAsync($"Downloading {_targetPackage.Version} runner");
if (_targetPackage.TrimmedPackages?.Count > 0)
{
// wait for cloning assets task to finish only if we have trimmed packages
await _cloneAndCalculateContentHashTask;
}
else
{
linkedTokenSource.Cancel();
try
{
await _cloneAndCalculateContentHashTask;
}
catch (Exception ex)
{
Trace.Info($"Ingore errors after cancelling cloning assets task: {ex}");
}
}
await DownloadLatestRunner(token, updateMessage.TargetVersion);
Trace.Info($"Download latest runner and unzip into runner root.");
@@ -218,54 +183,8 @@ namespace GitHub.Runner.Listener
string archiveFile = null;
var packageDownloadUrl = _targetPackage.DownloadUrl;
var packageHashValue = _targetPackage.HashValue;
var runtimeTrimmed = false;
var externalsTrimmed = false;
var fallbackToFullPackage = false;
// Only try trimmed package if server sends them and we have calculated hash value of the current runtime/externals.
if (_contentHashes.Count == 2 &&
_contentHashes.ContainsKey(_dotnetRuntime) &&
_contentHashes.ContainsKey(_externals) &&
_targetPackage.TrimmedPackages?.Count > 0)
{
Trace.Info($"Current runner content hash: {StringUtil.ConvertToJson(_contentHashes)}");
Trace.Info($"Trimmed packages info from service: {StringUtil.ConvertToJson(_targetPackage.TrimmedPackages)}");
// Try to see whether we can use any size trimmed down package to speed up runner updates.
foreach (var trimmedPackage in _targetPackage.TrimmedPackages)
{
if (trimmedPackage.TrimmedContents.Count == 2 &&
trimmedPackage.TrimmedContents.TryGetValue(_dotnetRuntime, out var trimmedRuntimeHash) &&
trimmedRuntimeHash == _contentHashes[_dotnetRuntime] &&
trimmedPackage.TrimmedContents.TryGetValue(_externals, out var trimmedExternalsHash) &&
trimmedExternalsHash == _contentHashes[_externals])
{
Trace.Info($"Use trimmed (runtime+externals) package '{trimmedPackage.DownloadUrl}' to update runner.");
packageDownloadUrl = trimmedPackage.DownloadUrl;
packageHashValue = trimmedPackage.HashValue;
runtimeTrimmed = true;
externalsTrimmed = true;
break;
}
else if (trimmedPackage.TrimmedContents.Count == 1 &&
trimmedPackage.TrimmedContents.TryGetValue(_externals, out trimmedExternalsHash) &&
trimmedExternalsHash == _contentHashes[_externals])
{
Trace.Info($"Use trimmed (externals) package '{trimmedPackage.DownloadUrl}' to update runner.");
packageDownloadUrl = trimmedPackage.DownloadUrl;
packageHashValue = trimmedPackage.HashValue;
externalsTrimmed = true;
break;
}
else
{
Trace.Info($"Can't use trimmed package from '{trimmedPackage.DownloadUrl}' since the current runner does not carry those trimmed content (Hash mismatch).");
}
}
}
_updateTrace.Enqueue($"DownloadUrl: {packageDownloadUrl}"); _updateTrace.Enqueue($"DownloadUrl: {packageDownloadUrl}");
_updateTrace.Enqueue($"RuntimeTrimmed: {runtimeTrimmed}");
_updateTrace.Enqueue($"ExternalsTrimmed: {externalsTrimmed}");
try try
{ {
@@ -323,12 +242,6 @@ namespace GitHub.Runner.Listener
await ExtractRunnerPackage(archiveFile, latestRunnerDirectory, token);
}
catch (Exception ex) when (runtimeTrimmed || externalsTrimmed)
{
// if anything failed when we use trimmed package (download/validate hash/extract), try again with the full runner package.
Trace.Error($"Fail to download latest runner using trimmed package: {ex}");
fallbackToFullPackage = true;
}
finally finally
{ {
try try
@@ -347,74 +260,6 @@ namespace GitHub.Runner.Listener
} }
} }
var trimmedPackageRestoreTasks = new List<Task<bool>>();
if (!fallbackToFullPackage)
{
// Skip restoring externals and runtime if we are going to fullback to the full package.
if (externalsTrimmed)
{
trimmedPackageRestoreTasks.Add(RestoreTrimmedExternals(latestRunnerDirectory, token));
}
if (runtimeTrimmed)
{
trimmedPackageRestoreTasks.Add(RestoreTrimmedDotnetRuntime(latestRunnerDirectory, token));
}
}
if (trimmedPackageRestoreTasks.Count > 0)
{
var restoreResults = await Task.WhenAll(trimmedPackageRestoreTasks);
if (restoreResults.Any(x => x == false))
{
// if any of the restore failed, fallback to full package.
fallbackToFullPackage = true;
}
}
if (fallbackToFullPackage)
{
Trace.Error("Something wrong with the trimmed runner package, failback to use the full package for runner updates.");
_updateTrace.Enqueue($"FallbackToFullPackage: {fallbackToFullPackage}");
IOUtil.DeleteDirectory(latestRunnerDirectory, token);
Directory.CreateDirectory(latestRunnerDirectory);
packageDownloadUrl = _targetPackage.DownloadUrl;
packageHashValue = _targetPackage.HashValue;
_updateTrace.Enqueue($"DownloadUrl: {packageDownloadUrl}");
try
{
archiveFile = await DownLoadRunner(latestRunnerDirectory, packageDownloadUrl, packageHashValue, token);
if (string.IsNullOrEmpty(archiveFile))
{
throw new TaskCanceledException($"Runner package '{packageDownloadUrl}' failed after {Constants.RunnerDownloadRetryMaxAttempts} download attempts");
}
await ValidateRunnerHash(archiveFile, packageHashValue);
await ExtractRunnerPackage(archiveFile, latestRunnerDirectory, token);
}
finally
{
try
{
// delete .zip file
if (!string.IsNullOrEmpty(archiveFile) && File.Exists(archiveFile))
{
Trace.Verbose("Deleting latest runner package zip: {0}", archiveFile);
IOUtil.DeleteFile(archiveFile);
}
}
catch (Exception ex)
{
//it is not critical if we fail to delete the .zip file
Trace.Warning("Failed to delete runner package zip '{0}'. Exception: {1}", archiveFile, ex);
}
}
}
await CopyLatestRunnerToRoot(latestRunnerDirectory, token);
}
@@ -665,9 +510,9 @@ namespace GitHub.Runner.Listener
// delete old bin.2.99.0 folder, only leave the current version and the latest download version
var allBinDirs = Directory.GetDirectories(HostContext.GetDirectory(WellKnownDirectory.Root), "bin.*");
- if (allBinDirs.Length > 2)
+ if (allBinDirs.Length > _numberOfOldVersionsToKeep)
{
- // there are more than 2 bin.version folders.
+ // there is more than one bin.version folder.
// delete older bin.version folders.
foreach (var oldBinDir in allBinDirs)
{
@@ -694,9 +539,9 @@ namespace GitHub.Runner.Listener
// delete old externals.2.99.0 folder, only leave the current version and the latest download version
var allExternalsDirs = Directory.GetDirectories(HostContext.GetDirectory(WellKnownDirectory.Root), "externals.*");
- if (allExternalsDirs.Length > 2)
+ if (allExternalsDirs.Length > _numberOfOldVersionsToKeep)
{
- // there are more than 2 externals.version folders.
+ // there is more than one externals.version folder.
// delete older externals.version folders.
foreach (var oldExternalDir in allExternalsDirs)
{
@@ -795,330 +640,5 @@ namespace GitHub.Runner.Listener
Trace.Info($"Catch exception during report update state, ignore this error and continue auto-update."); Trace.Info($"Catch exception during report update state, ignore this error and continue auto-update.");
} }
} }
private async Task<bool> RestoreTrimmedExternals(string downloadDirectory, CancellationToken token)
{
// Copy the current runner's externals if we are using a externals trimmed package
// Execute the node.js to make sure the copied externals is working.
var stopWatch = Stopwatch.StartNew();
try
{
Trace.Info($"Copy {_externalsCloneDirectory} to {Path.Combine(downloadDirectory, Constants.Path.ExternalsDirectory)}.");
IOUtil.CopyDirectory(_externalsCloneDirectory, Path.Combine(downloadDirectory, Constants.Path.ExternalsDirectory), token);
// try run node.js to see if current node.js works fine after copy over to new location.
var nodeVersions = NodeUtil.BuiltInNodeVersions;
foreach (var nodeVersion in nodeVersions)
{
var newNodeBinary = Path.Combine(downloadDirectory, Constants.Path.ExternalsDirectory, nodeVersion, "bin", $"node{IOUtil.ExeExtension}");
if (File.Exists(newNodeBinary))
{
using (var p = HostContext.CreateService<IProcessInvoker>())
{
var outputs = "";
p.ErrorDataReceived += (_, data) =>
{
if (!string.IsNullOrEmpty(data.Data))
{
Trace.Error(data.Data);
}
};
p.OutputDataReceived += (_, data) =>
{
if (!string.IsNullOrEmpty(data.Data))
{
Trace.Info(data.Data);
outputs = data.Data;
}
};
var exitCode = await p.ExecuteAsync(HostContext.GetDirectory(WellKnownDirectory.Root), newNodeBinary, $"-e \"console.log('{nameof(RestoreTrimmedExternals)}')\"", null, token);
if (exitCode != 0)
{
Trace.Error($"{newNodeBinary} -e \"console.log()\" failed with exit code {exitCode}");
return false;
}
if (!string.Equals(outputs, nameof(RestoreTrimmedExternals), StringComparison.OrdinalIgnoreCase))
{
Trace.Error($"{newNodeBinary} -e \"console.log()\" did not output expected content.");
return false;
}
}
}
}
return true;
}
catch (Exception ex)
{
Trace.Error($"Fail to restore externals for trimmed package: {ex}");
return false;
}
finally
{
stopWatch.Stop();
_updateTrace.Enqueue($"{nameof(RestoreTrimmedExternals)}Time: {stopWatch.ElapsedMilliseconds}ms");
}
}
private async Task<bool> RestoreTrimmedDotnetRuntime(string downloadDirectory, CancellationToken token)
{
// Copy the current runner's dotnet runtime if we are using a dotnet runtime trimmed package
// Execute the runner.listener to make sure the copied runtime is working.
var stopWatch = Stopwatch.StartNew();
try
{
Trace.Info($"Copy {_dotnetRuntimeCloneDirectory} to {Path.Combine(downloadDirectory, Constants.Path.BinDirectory)}.");
IOUtil.CopyDirectory(_dotnetRuntimeCloneDirectory, Path.Combine(downloadDirectory, Constants.Path.BinDirectory), token);
// try run the runner executable to see if current dotnet runtime + future runner binary works fine.
var newRunnerBinary = Path.Combine(downloadDirectory, Constants.Path.BinDirectory, "Runner.Listener");
using (var p = HostContext.CreateService<IProcessInvoker>())
{
p.ErrorDataReceived += (_, data) =>
{
if (!string.IsNullOrEmpty(data.Data))
{
Trace.Error(data.Data);
}
};
p.OutputDataReceived += (_, data) =>
{
if (!string.IsNullOrEmpty(data.Data))
{
Trace.Info(data.Data);
}
};
var exitCode = await p.ExecuteAsync(HostContext.GetDirectory(WellKnownDirectory.Root), newRunnerBinary, "--version", null, token);
if (exitCode != 0)
{
Trace.Error($"{newRunnerBinary} --version failed with exit code {exitCode}");
return false;
}
else
{
return true;
}
}
}
catch (Exception ex)
{
Trace.Error($"Fail to restore dotnet runtime for trimmed package: {ex}");
return false;
}
finally
{
stopWatch.Stop();
_updateTrace.Enqueue($"{nameof(RestoreTrimmedDotnetRuntime)}Time: {stopWatch.ElapsedMilliseconds}ms");
}
}
private async Task CloneAndCalculateAssetsHash(string dotnetRuntimeCloneDirectory, string externalsCloneDirectory, CancellationToken token)
{
var runtimeCloneTask = CloneDotnetRuntime(dotnetRuntimeCloneDirectory, token);
var externalsCloneTask = CloneExternals(externalsCloneDirectory, token);
var waitingTasks = new Dictionary<string, Task>()
{
{nameof(CloneDotnetRuntime), runtimeCloneTask},
{nameof(CloneExternals),externalsCloneTask}
};
while (waitingTasks.Count > 0)
{
Trace.Info($"Waiting for {waitingTasks.Count} tasks to complete.");
var completedTask = await Task.WhenAny(waitingTasks.Values);
if (waitingTasks.ContainsKey(nameof(CloneExternals)) &&
completedTask == waitingTasks[nameof(CloneExternals)])
{
Trace.Info($"Externals clone finished.");
waitingTasks.Remove(nameof(CloneExternals));
try
{
if (await externalsCloneTask && !token.IsCancellationRequested)
{
var externalsHash = await HashFiles(externalsCloneDirectory, token);
Trace.Info($"Externals content hash: {externalsHash}");
_contentHashes[_externals] = externalsHash;
_updateTrace.Enqueue($"ExternalsHash: {_contentHashes[_externals]}");
}
else
{
Trace.Error($"Skip compute hash since clone externals failed/cancelled.");
}
}
catch (Exception ex)
{
Trace.Error($"Fail to hash externals content: {ex}");
}
}
else if (waitingTasks.ContainsKey(nameof(CloneDotnetRuntime)) &&
completedTask == waitingTasks[nameof(CloneDotnetRuntime)])
{
Trace.Info($"Dotnet runtime clone finished.");
waitingTasks.Remove(nameof(CloneDotnetRuntime));
try
{
if (await runtimeCloneTask && !token.IsCancellationRequested)
{
var runtimeHash = await HashFiles(dotnetRuntimeCloneDirectory, token);
Trace.Info($"Runtime content hash: {runtimeHash}");
_contentHashes[_dotnetRuntime] = runtimeHash;
_updateTrace.Enqueue($"DotnetRuntimeHash: {_contentHashes[_dotnetRuntime]}");
}
else
{
Trace.Error($"Skip compute hash since clone dotnet runtime failed/cancelled.");
}
}
catch (Exception ex)
{
Trace.Error($"Fail to hash runtime content: {ex}");
}
}
Trace.Info($"Still waiting for {waitingTasks.Count} tasks to complete.");
}
}
private async Task<bool> CloneDotnetRuntime(string runtimeDir, CancellationToken token)
{
var stopWatch = Stopwatch.StartNew();
try
{
Trace.Info($"Cloning dotnet runtime to {runtimeDir}");
IOUtil.DeleteDirectory(runtimeDir, CancellationToken.None);
Directory.CreateDirectory(runtimeDir);
var assembly = Assembly.GetExecutingAssembly();
var assetsContent = default(string);
using (var stream = assembly.GetManifestResourceStream("GitHub.Runner.Listener.runnercoreassets"))
using (var streamReader = new StreamReader(stream))
{
assetsContent = await streamReader.ReadToEndAsync();
}
if (!string.IsNullOrEmpty(assetsContent))
{
var runnerCoreAssets = assetsContent.Split(new[] { "\n", "\r\n" }, StringSplitOptions.RemoveEmptyEntries);
if (runnerCoreAssets.Length > 0)
{
var binDir = HostContext.GetDirectory(WellKnownDirectory.Bin);
IOUtil.CopyDirectory(binDir, runtimeDir, token);
var clonedFile = 0;
foreach (var file in Directory.EnumerateFiles(runtimeDir, "*", SearchOption.AllDirectories))
{
token.ThrowIfCancellationRequested();
if (runnerCoreAssets.Any(x => file.Replace(Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar).EndsWith(x.Trim())))
{
Trace.Verbose($"{file} is part of the runner core, delete from cloned runtime directory.");
IOUtil.DeleteFile(file);
}
else
{
clonedFile++;
}
}
Trace.Info($"Successfully cloned dotnet runtime to {runtimeDir}. Total files: {clonedFile}");
return true;
}
}
}
catch (Exception ex)
{
Trace.Error($"Fail to clone dotnet runtime to {runtimeDir}");
Trace.Error(ex);
}
finally
{
stopWatch.Stop();
_updateTrace.Enqueue($"{nameof(CloneDotnetRuntime)}Time: {stopWatch.ElapsedMilliseconds}ms");
}
return false;
}
private Task<bool> CloneExternals(string externalsDir, CancellationToken token)
{
var stopWatch = Stopwatch.StartNew();
try
{
Trace.Info($"Cloning externals to {externalsDir}");
IOUtil.DeleteDirectory(externalsDir, CancellationToken.None);
Directory.CreateDirectory(externalsDir);
IOUtil.CopyDirectory(HostContext.GetDirectory(WellKnownDirectory.Externals), externalsDir, token);
Trace.Info($"Successfully cloned externals to {externalsDir}.");
return Task.FromResult(true);
}
catch (Exception ex)
{
Trace.Error($"Fail to clone externals to {externalsDir}");
Trace.Error(ex);
}
finally
{
stopWatch.Stop();
_updateTrace.Enqueue($"{nameof(CloneExternals)}Time: {stopWatch.ElapsedMilliseconds}ms");
}
return Task.FromResult(false);
}
private async Task<string> HashFiles(string fileFolder, CancellationToken token)
{
Trace.Info($"Calculating hash for {fileFolder}");
var stopWatch = Stopwatch.StartNew();
string binDir = HostContext.GetDirectory(WellKnownDirectory.Bin);
string node = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Externals), NodeUtil.GetInternalNodeVersion(), "bin", $"node{IOUtil.ExeExtension}");
string hashFilesScript = Path.Combine(binDir, "hashFiles");
var hashResult = string.Empty;
using (var processInvoker = HostContext.CreateService<IProcessInvoker>())
{
processInvoker.ErrorDataReceived += (_, data) =>
{
if (!string.IsNullOrEmpty(data.Data) && data.Data.StartsWith("__OUTPUT__") && data.Data.EndsWith("__OUTPUT__"))
{
hashResult = data.Data.Substring(10, data.Data.Length - 20);
Trace.Info($"Hash result: '{hashResult}'");
}
else
{
Trace.Info(data.Data);
}
};
processInvoker.OutputDataReceived += (_, data) =>
{
Trace.Verbose(data.Data);
};
var env = new Dictionary<string, string>
{
["patterns"] = "**"
};
int exitCode = await processInvoker.ExecuteAsync(workingDirectory: fileFolder,
fileName: node,
arguments: $"\"{hashFilesScript.Replace("\"", "\\\"")}\"",
environment: env,
requireExitCodeZero: false,
outputEncoding: null,
killProcessOnCancel: true,
cancellationToken: token);
if (exitCode != 0)
{
Trace.Error($"hashFiles returns '{exitCode}' failed. Fail to hash files under directory '{fileFolder}'");
}
stopWatch.Stop();
_updateTrace.Enqueue($"{nameof(HashFiles)}{Path.GetFileName(fileFolder)}Time: {stopWatch.ElapsedMilliseconds}ms");
return hashResult;
}
}
} }
} }


@@ -149,7 +149,6 @@ namespace GitHub.Runner.Listener
string archiveFile = null;
// Only try trimmed package if server sends them and we have calculated hash value of the current runtime/externals.
_updateTrace.Enqueue($"DownloadUrl: {packageDownloadUrl}");
try


@@ -459,6 +459,34 @@ namespace GitHub.Runner.Sdk
File.WriteAllText(path, null); File.WriteAllText(path, null);
} }
/// <summary>
/// Replaces invalid file name characters with '_'
/// </summary>
public static string ReplaceInvalidFileNameChars(string fileName)
{
var result = new StringBuilder();
var invalidChars = Path.GetInvalidFileNameChars();
var current = 0; // Current index
while (current < fileName?.Length)
{
var next = fileName.IndexOfAny(invalidChars, current);
if (next >= 0)
{
result.Append(fileName.Substring(current, next - current));
result.Append('_');
current = next + 1;
}
else
{
result.Append(fileName.Substring(current));
break;
}
}
return result.ToString();
}
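A quick usage sketch for the helper above: sanitize a job-derived name before using it in a diag log file name. The compact Sanitize below is an equivalent character-by-character rewrite of the same idea, and the Worker_<name>.log pattern is an illustrative assumption rather than the runner's exact diag format.

using System;
using System.IO;
using System.Linq;

// Demo only: replace any character that is invalid in a file name with '_'.
class ReplaceInvalidFileNameCharsDemo
{
    static string Sanitize(string name) =>
        string.Concat(name.Select(c => Path.GetInvalidFileNameChars().Contains(c) ? '_' : c));

    static void Main()
    {
        var jobName = "deploy: prod/us-east";
        Console.WriteLine(Path.Combine("_diag", $"Worker_{Sanitize(jobName)}.log"));
    }
}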
/// <summary>
/// Recursively enumerates a directory without following directory reparse points.
/// </summary>


@@ -23,7 +23,13 @@ namespace GitHub.Runner.Sdk
if (VssClientHttpRequestSettings.Default.UserAgent != null && VssClientHttpRequestSettings.Default.UserAgent.Count > 0)
{
- headerValues.AddRange(VssClientHttpRequestSettings.Default.UserAgent);
+ foreach (var headerVal in VssClientHttpRequestSettings.Default.UserAgent)
+ {
+     if (!headerValues.Contains(headerVal))
+     {
+         headerValues.Add(headerVal);
+     }
+ }
}
VssClientHttpRequestSettings.Default.UserAgent = headerValues;
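The change above swaps AddRange for an explicit contains-check so the default user-agent list is merged without duplicates. The self-contained sketch below shows the same merge with made-up header values; ProductInfoHeaderValue equality is what makes the Contains check work.

using System;
using System.Collections.Generic;
using System.Net.Http.Headers;

// Demo of the duplicate-free merge; the product names and versions are illustrative.
class UserAgentMergeDemo
{
    static void Main()
    {
        var merged = new List<ProductInfoHeaderValue>
        {
            new ProductInfoHeaderValue("GitHubActionsRunner", "2.316.0"),
        };
        var existing = new List<ProductInfoHeaderValue>
        {
            new ProductInfoHeaderValue("VSServices", "19.0"),
            new ProductInfoHeaderValue("GitHubActionsRunner", "2.316.0"), // duplicate, should not be added twice
        };
        foreach (var value in existing)
        {
            if (!merged.Contains(value))
            {
                merged.Add(value);
            }
        }
        Console.WriteLine(string.Join(" ", merged));
    }
}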
@@ -33,6 +39,23 @@ namespace GitHub.Runner.Sdk
{
VssClientHttpRequestSettings.Default.ServerCertificateValidationCallback = HttpClientHandler.DangerousAcceptAnyServerCertificateValidator;
}
var rawHeaderValues = new List<ProductInfoHeaderValue>();
rawHeaderValues.AddRange(additionalUserAgents);
rawHeaderValues.Add(new ProductInfoHeaderValue($"({StringUtil.SanitizeUserAgentHeader(RuntimeInformation.OSDescription)})"));
if (RawClientHttpRequestSettings.Default.UserAgent != null && RawClientHttpRequestSettings.Default.UserAgent.Count > 0)
{
foreach (var headerVal in RawClientHttpRequestSettings.Default.UserAgent)
{
if (!rawHeaderValues.Contains(headerVal))
{
rawHeaderValues.Add(headerVal);
}
}
}
RawClientHttpRequestSettings.Default.UserAgent = rawHeaderValues;
} }
public static VssConnection CreateConnection(
@@ -62,11 +85,6 @@ namespace GitHub.Runner.Sdk
settings.SendTimeout = TimeSpan.FromSeconds(Math.Min(Math.Max(httpRequestTimeoutSeconds, 100), 1200));
}
if (StringUtil.ConvertToBoolean(Environment.GetEnvironmentVariable("USE_BROKER_FLOW")))
{
settings.AllowAutoRedirectForBroker = true;
}
// Remove Invariant from the list of accepted languages.
//
// The constructor of VssHttpRequestSettings (base class of VssClientHttpRequestSettings) adds the current


@@ -7,129 +7,6 @@ namespace GitHub.Runner.Sdk
public static class WhichUtil
{
public static string Which(string command, bool require = false, ITraceWriter trace = null, string prependPath = null)
{
ArgUtil.NotNullOrEmpty(command, nameof(command));
trace?.Info($"Which: '{command}'");
if (Path.IsPathFullyQualified(command) && File.Exists(command))
{
trace?.Info($"Fully qualified path: '{command}'");
return command;
}
string path = Environment.GetEnvironmentVariable(PathUtil.PathVariable);
if (string.IsNullOrEmpty(path))
{
trace?.Info("PATH environment variable not defined.");
path = path ?? string.Empty;
}
if (!string.IsNullOrEmpty(prependPath))
{
path = PathUtil.PrependPath(prependPath, path);
}
string[] pathSegments = path.Split(new Char[] { Path.PathSeparator }, StringSplitOptions.RemoveEmptyEntries);
for (int i = 0; i < pathSegments.Length; i++)
{
pathSegments[i] = Environment.ExpandEnvironmentVariables(pathSegments[i]);
}
foreach (string pathSegment in pathSegments)
{
if (!string.IsNullOrEmpty(pathSegment) && Directory.Exists(pathSegment))
{
string[] matches = null;
#if OS_WINDOWS
string pathExt = Environment.GetEnvironmentVariable("PATHEXT");
if (string.IsNullOrEmpty(pathExt))
{
// XP's system default value for PATHEXT system variable
pathExt = ".com;.exe;.bat;.cmd;.vbs;.vbe;.js;.jse;.wsf;.wsh";
}
string[] pathExtSegments = pathExt.Split(new string[] { ";" }, StringSplitOptions.RemoveEmptyEntries);
// if command already has an extension.
if (pathExtSegments.Any(ext => command.EndsWith(ext, StringComparison.OrdinalIgnoreCase)))
{
try
{
matches = Directory.GetFiles(pathSegment, command);
}
catch (UnauthorizedAccessException ex)
{
trace?.Info("Ignore UnauthorizedAccess exception during Which.");
trace?.Verbose(ex.ToString());
}
if (matches != null && matches.Length > 0 && IsPathValid(matches.First(), trace))
{
trace?.Info($"Location: '{matches.First()}'");
return matches.First();
}
}
else
{
string searchPattern;
searchPattern = StringUtil.Format($"{command}.*");
try
{
matches = Directory.GetFiles(pathSegment, searchPattern);
}
catch (UnauthorizedAccessException ex)
{
trace?.Info("Ignore UnauthorizedAccess exception during Which.");
trace?.Verbose(ex.ToString());
}
if (matches != null && matches.Length > 0)
{
// add extension.
for (int i = 0; i < pathExtSegments.Length; i++)
{
string fullPath = Path.Combine(pathSegment, $"{command}{pathExtSegments[i]}");
if (matches.Any(p => p.Equals(fullPath, StringComparison.OrdinalIgnoreCase)) && IsPathValid(fullPath, trace))
{
trace?.Info($"Location: '{fullPath}'");
return fullPath;
}
}
}
}
#else
try
{
matches = Directory.GetFiles(pathSegment, command);
}
catch (UnauthorizedAccessException ex)
{
trace?.Info("Ignore UnauthorizedAccess exception during Which.");
trace?.Verbose(ex.ToString());
}
if (matches != null && matches.Length > 0 && IsPathValid(matches.First(), trace))
{
trace?.Info($"Location: '{matches.First()}'");
return matches.First();
}
#endif
}
}
#if OS_WINDOWS
trace?.Info($"{command}: command not found. Make sure '{command}' is installed and its location included in the 'Path' environment variable.");
#else
trace?.Info($"{command}: command not found. Make sure '{command}' is installed and its location included in the 'PATH' environment variable.");
#endif
if (require)
{
throw new FileNotFoundException(
message: $"{command}: command not found",
fileName: command);
}
return null;
}
public static string Which2(string command, bool require = false, ITraceWriter trace = null, string prependPath = null)
{
ArgUtil.NotNullOrEmpty(command, nameof(command));
trace?.Info($"Which2: '{command}'");

View File

@@ -483,10 +483,6 @@ namespace GitHub.Runner.Worker
{
// Load stored Ids for later load actions
compositeAction.Steps[i].Id = _cachedEmbeddedStepIds[action.Id][i];
if (string.IsNullOrEmpty(executionContext.Global.Variables.Get("DistributedTask.EnableCompositeActions")) && compositeAction.Steps[i].Reference.Type != Pipelines.ActionSourceType.Script)
{
throw new Exception("`uses:` keyword is not currently supported.");
}
}
}
else
@@ -703,11 +699,12 @@ namespace GitHub.Runner.Worker
catch (Exception ex) when (!executionContext.CancellationToken.IsCancellationRequested) // Do not retry if the run is cancelled.
{
// UnresolvableActionDownloadInfoException is a 422 client error, don't retry
// NonRetryableActionDownloadInfoException is a non-retryable exception from Actions
// Some possible cases are:
// * Repo is rate limited
// * Repo or tag doesn't exist, or isn't public
// * Policy validation failed
if (attempt < 3 && !(ex is WebApi.UnresolvableActionDownloadInfoException))
if (attempt < 3 && !(ex is WebApi.UnresolvableActionDownloadInfoException) && !(ex is WebApi.NonRetryableActionDownloadInfoException))
{
executionContext.Output($"Failed to resolve action download info. Error: {ex.Message}");
executionContext.Debug(ex.ToString());
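For context, a minimal standalone sketch of the retry shape used above: up to three attempts, giving up immediately when the exception type marks the failure as permanent. The exception and method names below are illustrative, not the runner's actual types.

using System;
using System.Threading.Tasks;

// Illustrative only: stands in for the non-retryable download-info exceptions above.
class NonRetryableException : Exception
{
    public NonRetryableException(string message) : base(message) { }
}

static class RetrySketch
{
    // Run an operation up to maxAttempts times; rethrow immediately for non-retryable failures.
    public static async Task<T> RunWithRetryAsync<T>(Func<Task<T>> operation, int maxAttempts = 3)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                return await operation();
            }
            catch (Exception ex) when (attempt < maxAttempts && !(ex is NonRetryableException))
            {
                Console.WriteLine($"Attempt {attempt} failed: {ex.Message}. Retrying...");
                await Task.Delay(TimeSpan.FromSeconds(attempt));
            }
        }
    }
}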
@@ -796,43 +793,40 @@ namespace GitHub.Runner.Worker
try
{
    var useActionArchiveCache = false;
-   if (executionContext.Global.Variables.GetBoolean("DistributedTask.UseActionArchiveCache") == true)
-   {
    var hasActionArchiveCache = false;
    var actionArchiveCacheDir = Environment.GetEnvironmentVariable(Constants.Variables.Agent.ActionArchiveCacheDirectory);
    if (!string.IsNullOrEmpty(actionArchiveCacheDir) &&
        Directory.Exists(actionArchiveCacheDir))
    {
        hasActionArchiveCache = true;
        Trace.Info($"Check if action archive '{downloadInfo.ResolvedNameWithOwner}@{downloadInfo.ResolvedSha}' already exists in cache directory '{actionArchiveCacheDir}'");
#if OS_WINDOWS
        var cacheArchiveFile = Path.Combine(actionArchiveCacheDir, downloadInfo.ResolvedNameWithOwner.Replace(Path.DirectorySeparatorChar, '_').Replace(Path.AltDirectorySeparatorChar, '_'), $"{downloadInfo.ResolvedSha}.zip");
#else
        var cacheArchiveFile = Path.Combine(actionArchiveCacheDir, downloadInfo.ResolvedNameWithOwner.Replace(Path.DirectorySeparatorChar, '_').Replace(Path.AltDirectorySeparatorChar, '_'), $"{downloadInfo.ResolvedSha}.tar.gz");
#endif
        if (File.Exists(cacheArchiveFile))
        {
            try
            {
                Trace.Info($"Found action archive '{cacheArchiveFile}' in cache directory '{actionArchiveCacheDir}'");
                File.Copy(cacheArchiveFile, archiveFile);
                useActionArchiveCache = true;
                executionContext.Debug($"Copied action archive '{cacheArchiveFile}' to '{archiveFile}'");
            }
            catch (Exception ex)
            {
                Trace.Error($"Failed to copy action archive '{cacheArchiveFile}' to '{archiveFile}'. Error: {ex}");
            }
        }
    }
    executionContext.Global.JobTelemetry.Add(new JobTelemetry()
    {
        Type = JobTelemetryType.General,
        Message = $"Action archive cache usage: {downloadInfo.ResolvedNameWithOwner}@{downloadInfo.ResolvedSha} use cache {useActionArchiveCache} has cache {hasActionArchiveCache}"
    });
-   }
    if (!useActionArchiveCache)
    {
        await DownloadRepositoryArchive(executionContext, link, downloadInfo.Authentication?.Token, archiveFile);
@@ -878,16 +872,9 @@ namespace GitHub.Runner.Worker
int exitCode = await processInvoker.ExecuteAsync(stagingDirectory, tar, $"-xzf \"{archiveFile}\"", null, executionContext.CancellationToken);
if (exitCode != 0)
{
-   if (executionContext.Global.Variables.GetBoolean("DistributedTask.DetailUntarFailure") == true)
-   {
    var fileInfo = new FileInfo(archiveFile);
    var sha256hash = await IOUtil.GetFileContentSha256HashAsync(archiveFile);
    throw new InvalidActionArchiveException($"Can't use 'tar -xzf' extract archive file: {archiveFile} (SHA256 '{sha256hash}', size '{fileInfo.Length}' bytes, tar outputs '{string.Join(' ', tarOutputs)}'). Action being checked out: {downloadInfo.NameWithOwner}@{downloadInfo.Ref}. return code: {exitCode}.");
-   }
-   else
-   {
-       throw new InvalidActionArchiveException($"Can't use 'tar -xzf' extract archive file: {archiveFile}. Action being checked out: {downloadInfo.NameWithOwner}@{downloadInfo.Ref}. return code: {exitCode}.");
-   }
}
#endif
@@ -1031,13 +1018,6 @@ namespace GitHub.Runner.Worker
}
}
foreach (var step in compositeAction.Steps)
{
if (string.IsNullOrEmpty(executionContext.Global.Variables.Get("DistributedTask.EnableCompositeActions")) && step.Reference.Type != Pipelines.ActionSourceType.Script)
{
throw new Exception("`uses:` keyword is not currently supported.");
}
}
return setupInfo;
}
else

View File

@@ -144,7 +144,7 @@ namespace GitHub.Runner.Worker
executionContext.Error(error.Message);
}
throw new ArgumentException($"Fail to load {fileRelativePath}");
throw new ArgumentException($"Failed to load {fileRelativePath}");
}
if (actionDefinition.Execution == null)

View File

@@ -466,17 +466,39 @@ namespace GitHub.Runner.Worker
{
throw new InvalidOperationException($"Failed to create directory to store registry client credentials: {e.Message}");
}
var loginExitCode = await _dockerManager.DockerLogin(
executionContext,
configLocation,
container.RegistryServer,
container.RegistryAuthUsername,
container.RegistryAuthPassword);
if (loginExitCode != 0)
// Login docker with retry up to 3 times
int retryCount = 0;
int loginExitCode = 0;
while (retryCount < 3)
{
loginExitCode = await _dockerManager.DockerLogin(
executionContext,
configLocation,
container.RegistryServer,
container.RegistryAuthUsername,
container.RegistryAuthPassword);
if (loginExitCode == 0)
{
break;
}
else
{
retryCount++;
if (retryCount < 3)
{
var backOff = BackoffTimerHelper.GetRandomBackoff(TimeSpan.FromSeconds(1), TimeSpan.FromSeconds(10));
executionContext.Warning($"Docker login for '{container.RegistryServer}' failed with exit code {loginExitCode}, back off {backOff.TotalSeconds} seconds before retry.");
await Task.Delay(backOff);
}
}
}
if (retryCount == 3 && loginExitCode != 0)
{
throw new InvalidOperationException($"Docker login for '{container.RegistryServer}' failed with exit code {loginExitCode}");
}
return configLocation;
}
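For reference, a self-contained sketch of the same login-retry-with-random-backoff shape. Random stands in for the runner's BackoffTimerHelper.GetRandomBackoff, and the login delegate is a placeholder rather than the real docker manager call.

using System;
using System.Threading.Tasks;

static class DockerLoginRetrySketch
{
    private static readonly Random s_random = new Random();

    // Try the login delegate up to three times, sleeping a random 1-10 seconds between failed attempts.
    public static async Task LoginWithRetryAsync(Func<Task<int>> login)
    {
        int exitCode = 0;
        for (int attempt = 1; attempt <= 3; attempt++)
        {
            exitCode = await login();
            if (exitCode == 0)
            {
                return;
            }

            if (attempt < 3)
            {
                var backoff = TimeSpan.FromSeconds(1 + (s_random.NextDouble() * 9));
                Console.WriteLine($"Docker login failed with exit code {exitCode}, back off {backoff.TotalSeconds:F1} seconds before retry.");
                await Task.Delay(backoff);
            }
        }

        throw new InvalidOperationException($"Docker login failed with exit code {exitCode}");
    }
}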

View File

@@ -91,13 +91,13 @@ namespace GitHub.Runner.Worker
string phaseName = executionContext.Global.Variables.System_PhaseDisplayName ?? "UnknownPhaseName";
// zip the files
string diagnosticsZipFileName = $"{buildName}-{phaseName}.zip";
string diagnosticsZipFileName = $"{buildName}-{IOUtil.ReplaceInvalidFileNameChars(phaseName)}.zip";
string diagnosticsZipFilePath = Path.Combine(supportRootFolder, diagnosticsZipFileName);
ZipFile.CreateFromDirectory(supportFilesFolder, diagnosticsZipFilePath);
// upload the json metadata file
executionContext.Debug("Uploading diagnostic metadata file.");
string metadataFileName = $"diagnostics-{buildName}-{phaseName}.json";
string metadataFileName = $"diagnostics-{buildName}-{IOUtil.ReplaceInvalidFileNameChars(phaseName)}.json";
string metadataFilePath = Path.Combine(supportFilesFolder, metadataFileName);
string phaseResult = GetTaskResultAsString(executionContext.Result);
@@ -108,6 +108,8 @@ namespace GitHub.Runner.Worker
parentContext.QueueAttachFile(type: CoreAttachmentType.DiagnosticLog, name: diagnosticsZipFileName, filePath: diagnosticsZipFilePath);
parentContext.QueueDiagnosticLogFile(name: diagnosticsZipFileName, filePath: diagnosticsZipFilePath);
executionContext.Debug("Diagnostic file upload complete.");
}
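A rough sketch of what a ReplaceInvalidFileNameChars-style helper does; the runner's IOUtil implementation may differ, but the effect is to make a display name such as the phase name above safe to use inside a file name.

using System;
using System.IO;
using System.Linq;

static class FileNameSketch
{
    // Substitute '_' for any character the current OS forbids in file names.
    public static string ReplaceInvalidFileNameChars(string name)
    {
        if (string.IsNullOrEmpty(name))
        {
            return name;
        }

        var invalid = Path.GetInvalidFileNameChars();
        return new string(name.Select(c => invalid.Contains(c) ? '_' : c).ToArray());
    }
}

// e.g. a phase display name like "build / test" becomes "build _ test" before it is used in the zip file name.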

View File

@@ -90,6 +90,7 @@ namespace GitHub.Runner.Worker
long Write(string tag, string message);
void QueueAttachFile(string type, string name, string filePath);
void QueueSummaryFile(string name, string filePath, Guid stepRecordId);
void QueueDiagnosticLogFile(string name, string filePath);
// timeline record update methods
void Start(string currentOperation = null);
@@ -397,11 +398,11 @@ namespace GitHub.Runner.Worker
if (recordOrder != null)
{
child.InitializeTimelineRecord(_mainTimelineId, recordId, _record.Id, ExecutionContextType.Task, displayName, refName, recordOrder);
child.InitializeTimelineRecord(_mainTimelineId, recordId, _record.Id, ExecutionContextType.Task, displayName, refName, recordOrder, embedded: isEmbedded);
}
else
{
child.InitializeTimelineRecord(_mainTimelineId, recordId, _record.Id, ExecutionContextType.Task, displayName, refName, ++_childTimelineRecordOrder);
child.InitializeTimelineRecord(_mainTimelineId, recordId, _record.Id, ExecutionContextType.Task, displayName, refName, ++_childTimelineRecordOrder, embedded: isEmbedded);
}
if (logger != null)
{
@@ -432,7 +433,7 @@ namespace GitHub.Runner.Worker
Dictionary<string, string> intraActionState = null,
string siblingScopeName = null)
{
return Root.CreateChild(_record.Id, _record.Name, _record.Id.ToString("N"), scopeName, contextName, stage, logger: _logger, isEmbedded: true, cancellationTokenSource: null, intraActionState: intraActionState, embeddedId: embeddedId, siblingScopeName: siblingScopeName, timeout: GetRemainingTimeout());
return Root.CreateChild(_record.Id, _record.Name, _record.Id.ToString("N"), scopeName, contextName, stage, logger: _logger, isEmbedded: true, cancellationTokenSource: null, intraActionState: intraActionState, embeddedId: embeddedId, siblingScopeName: siblingScopeName, timeout: GetRemainingTimeout(), recordOrder: _record.Order);
}
public void Start(string currentOperation = null)
@@ -982,6 +983,18 @@ namespace GitHub.Runner.Worker
_jobServerQueue.QueueResultsUpload(stepRecordId, name, filePath, ChecksAttachmentType.StepSummary, deleteSource: false, finalize: true, firstBlock: true, totalLines: 0);
}
public void QueueDiagnosticLogFile(string name, string filePath)
{
ArgUtil.NotNullOrEmpty(name, nameof(name));
ArgUtil.NotNullOrEmpty(filePath, nameof(filePath));
if (!File.Exists(filePath))
{
throw new FileNotFoundException($"Can't upload diagnostic log file: {filePath}. File does not exist.");
}
_jobServerQueue.QueueResultsUpload(_record.Id, name, filePath, CoreAttachmentType.ResultsDiagnosticLog, deleteSource: false, finalize: true, firstBlock: true, totalLines: 0);
}
// Add OnMatcherChanged
public void Add(OnMatcherChanged handler)
{
@@ -1160,7 +1173,7 @@ namespace GitHub.Runner.Worker
}
}
private void InitializeTimelineRecord(Guid timelineId, Guid timelineRecordId, Guid? parentTimelineRecordId, string recordType, string displayName, string refName, int? order)
private void InitializeTimelineRecord(Guid timelineId, Guid timelineRecordId, Guid? parentTimelineRecordId, string recordType, string displayName, string refName, int? order, bool embedded = false)
{
_mainTimelineId = timelineId;
_record.Id = timelineRecordId;
@@ -1186,7 +1199,11 @@ namespace GitHub.Runner.Worker
var configuration = HostContext.GetService<IConfigurationStore>();
_record.WorkerName = configuration.GetSettings().AgentName;
_jobServerQueue.QueueTimelineRecordUpdate(_mainTimelineId, _record);
// We don't want to update the timeline record for embedded steps since they are not really represented in the UI.
if (!embedded)
{
_jobServerQueue.QueueTimelineRecordUpdate(_mainTimelineId, _record);
}
}
private void JobServerQueueThrottling_EventReceived(object sender, ThrottlingEventArgs data)

View File

@@ -244,7 +244,7 @@ namespace GitHub.Runner.Worker
if (resultsReceiverEndpoint != null)
{
Trace.Info($"Queueing results file ({filePath}) for attachment upload ({attachmentName})");
var stepId = context.Id;
var stepId = context.IsEmbedded ? context.EmbeddedId : context.Id;
// Attachments must be added to the parent context (job), not the current context (step)
context.Root.QueueSummaryFile(attachmentName, scrubbedFilePath, stepId);
}

View File

@@ -223,6 +223,10 @@ namespace GitHub.Runner.Worker.Handlers
{
Environment["ACTIONS_CACHE_URL"] = cacheUrl;
}
if (systemConnection.Data.TryGetValue("PipelinesServiceUrl", out var pipelinesServiceUrl) && !string.IsNullOrEmpty(pipelinesServiceUrl))
{
Environment["ACTIONS_RUNTIME_URL"] = pipelinesServiceUrl;
}
if (systemConnection.Data.TryGetValue("GenerateIdTokenUrl", out var generateIdTokenUrl) && !string.IsNullOrEmpty(generateIdTokenUrl)) if (systemConnection.Data.TryGetValue("GenerateIdTokenUrl", out var generateIdTokenUrl) && !string.IsNullOrEmpty(generateIdTokenUrl))
{ {
Environment["ACTIONS_ID_TOKEN_REQUEST_URL"] = generateIdTokenUrl; Environment["ACTIONS_ID_TOKEN_REQUEST_URL"] = generateIdTokenUrl;

View File

@@ -84,6 +84,45 @@ namespace GitHub.Runner.Worker.Handlers
}
nodeData.NodeVersion = "node16";
}
var localForceActionsToNode20 = StringUtil.ConvertToBoolean(Environment.GetEnvironmentVariable(Constants.Variables.Agent.ManualForceActionsToNode20));
executionContext.Global.EnvironmentVariables.TryGetValue(Constants.Variables.Actions.ManualForceActionsToNode20, out var workflowForceActionsToNode20);
var enforceNode20Locally = !string.IsNullOrWhiteSpace(workflowForceActionsToNode20) ? StringUtil.ConvertToBoolean(workflowForceActionsToNode20) : localForceActionsToNode20;
if (string.Equals(nodeData.NodeVersion, "node16")
&& ((executionContext.Global.Variables.GetBoolean("DistributedTask.ForceGithubJavascriptActionsToNode20") ?? false) || enforceNode20Locally))
{
executionContext.Global.EnvironmentVariables.TryGetValue(Constants.Variables.Actions.AllowActionsUseUnsecureNodeVersion, out var workflowOptOut);
var isWorkflowOptOutSet = !string.IsNullOrWhiteSpace(workflowOptOut);
var isLocalOptOut = StringUtil.ConvertToBoolean(Environment.GetEnvironmentVariable(Constants.Variables.Actions.AllowActionsUseUnsecureNodeVersion));
bool isOptOut = isWorkflowOptOutSet ? StringUtil.ConvertToBoolean(workflowOptOut) : isLocalOptOut;
if (!isOptOut)
{
var repoAction = action as Pipelines.RepositoryPathReference;
if (repoAction != null)
{
var warningActions = new HashSet<string>();
if (executionContext.Global.Variables.TryGetValue(Constants.Runner.EnforcedNode16DetectedAfterEndOfLifeEnvVariable, out var node20ForceWarnings))
{
warningActions = StringUtil.ConvertFromJson<HashSet<string>>(node20ForceWarnings);
}
string repoActionFullName;
if (string.IsNullOrEmpty(repoAction.Name))
{
repoActionFullName = repoAction.Path; // local actions don't have a 'Name'
}
else
{
repoActionFullName = $"{repoAction.Name}/{repoAction.Path ?? string.Empty}".TrimEnd('/') + $"@{repoAction.Ref}";
}
warningActions.Add(repoActionFullName);
executionContext.Global.Variables.Set(Constants.Runner.EnforcedNode16DetectedAfterEndOfLifeEnvVariable, StringUtil.ConvertToJson(warningActions));
}
nodeData.NodeVersion = "node20";
}
}
(handler as INodeScriptActionHandler).Data = nodeData;
}
else if (data.ExecutionType == ActionExecutionType.Script)
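The precedence in this hunk (a value set in the workflow's env block wins over the same variable set on the runner machine) can be summarized in a small sketch; the variable names are illustrative rather than the exact constants used above.

using System;
using System.Collections.Generic;

static class EnvPrecedenceSketch
{
    // Returns the boolean value of `name`, preferring the workflow's env block over the machine environment.
    public static bool ReadFlag(IReadOnlyDictionary<string, string> workflowEnv, string name)
    {
        if (workflowEnv.TryGetValue(name, out var workflowValue) && !string.IsNullOrWhiteSpace(workflowValue))
        {
            return string.Equals(workflowValue, "true", StringComparison.OrdinalIgnoreCase);
        }

        var machineValue = Environment.GetEnvironmentVariable(name);
        return string.Equals(machineValue, "true", StringComparison.OrdinalIgnoreCase);
    }
}

// Forcing node16 actions to node20 then amounts to: force when the force flag resolves to true
// and the opt-out flag resolves to false, each resolved by ReadFlag(...) as above.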

View File

@@ -58,6 +58,10 @@ namespace GitHub.Runner.Worker.Handlers
{
Environment["ACTIONS_CACHE_URL"] = cacheUrl;
}
if (systemConnection.Data.TryGetValue("PipelinesServiceUrl", out var pipelinesServiceUrl) && !string.IsNullOrEmpty(pipelinesServiceUrl))
{
Environment["ACTIONS_RUNTIME_URL"] = pipelinesServiceUrl;
}
if (systemConnection.Data.TryGetValue("GenerateIdTokenUrl", out var generateIdTokenUrl) && !string.IsNullOrEmpty(generateIdTokenUrl)) if (systemConnection.Data.TryGetValue("GenerateIdTokenUrl", out var generateIdTokenUrl) && !string.IsNullOrEmpty(generateIdTokenUrl))
{ {
Environment["ACTIONS_ID_TOKEN_REQUEST_URL"] = generateIdTokenUrl; Environment["ACTIONS_ID_TOKEN_REQUEST_URL"] = generateIdTokenUrl;
@@ -114,6 +118,11 @@ namespace GitHub.Runner.Worker.Handlers
{
Data.NodeVersion = "node16";
}
if (forcedNodeVersion == "node20" && Data.NodeVersion != "node20")
{
Data.NodeVersion = "node20";
}
var nodeRuntimeVersion = await StepHost.DetermineNodeRuntimeVersion(ExecutionContext, Data.NodeVersion);
string file = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Externals), nodeRuntimeVersion, "bin", $"node{IOUtil.ExeExtension}");

View File

@@ -83,40 +83,19 @@ namespace GitHub.Runner.Worker.Handlers
shellCommand = "pwsh"; shellCommand = "pwsh";
if (validateShellOnHost) if (validateShellOnHost)
{ {
if (ExecutionContext.Global.Variables.GetBoolean("DistributedTask.UseWhich2") == true) shellCommandPath = WhichUtil.Which(shellCommand, require: false, Trace, prependPath);
{
shellCommandPath = WhichUtil.Which2(shellCommand, require: false, Trace, prependPath);
}
else
{
shellCommandPath = WhichUtil.Which(shellCommand, require: false, Trace, prependPath);
}
if (string.IsNullOrEmpty(shellCommandPath))
{
shellCommand = "powershell";
Trace.Info($"Defaulting to {shellCommand}");
if (ExecutionContext.Global.Variables.GetBoolean("DistributedTask.UseWhich2") == true)
shellCommandPath = WhichUtil.Which(shellCommand, require: true, Trace, prependPath);
{
shellCommandPath = WhichUtil.Which2(shellCommand, require: true, Trace, prependPath);
}
else
{
shellCommandPath = WhichUtil.Which(shellCommand, require: true, Trace, prependPath);
}
}
}
#else
shellCommand = "sh";
if (validateShellOnHost)
{
if (ExecutionContext.Global.Variables.GetBoolean("DistributedTask.UseWhich2") == true)
shellCommandPath = WhichUtil.Which("bash", false, Trace, prependPath) ?? WhichUtil.Which("sh", true, Trace, prependPath);
{
shellCommandPath = WhichUtil.Which2("bash", false, Trace, prependPath) ?? WhichUtil.Which2("sh", true, Trace, prependPath);
}
else
{
shellCommandPath = WhichUtil.Which("bash", false, Trace, prependPath) ?? WhichUtil.Which("sh", true, Trace, prependPath);
}
}
#endif
argFormat = ScriptHandlerHelpers.GetScriptArgumentsFormat(shellCommand);
@@ -127,14 +106,7 @@ namespace GitHub.Runner.Worker.Handlers
shellCommand = parsed.shellCommand;
if (validateShellOnHost)
{
if (ExecutionContext.Global.Variables.GetBoolean("DistributedTask.UseWhich2") == true)
shellCommandPath = WhichUtil.Which(parsed.shellCommand, true, Trace, prependPath);
{
shellCommandPath = WhichUtil.Which2(parsed.shellCommand, true, Trace, prependPath);
}
else
{
shellCommandPath = WhichUtil.Which(parsed.shellCommand, true, Trace, prependPath);
}
}
argFormat = $"{parsed.shellArgs}".TrimStart();
@@ -216,38 +188,17 @@ namespace GitHub.Runner.Worker.Handlers
{
#if OS_WINDOWS
shellCommand = "pwsh";
if (ExecutionContext.Global.Variables.GetBoolean("DistributedTask.UseWhich2") == true)
commandPath = WhichUtil.Which(shellCommand, require: false, Trace, prependPath);
{
commandPath = WhichUtil.Which2(shellCommand, require: false, Trace, prependPath);
}
else
{
commandPath = WhichUtil.Which(shellCommand, require: false, Trace, prependPath);
}
if (string.IsNullOrEmpty(commandPath))
{
shellCommand = "powershell";
Trace.Info($"Defaulting to {shellCommand}");
if (ExecutionContext.Global.Variables.GetBoolean("DistributedTask.UseWhich2") == true)
commandPath = WhichUtil.Which(shellCommand, require: true, Trace, prependPath);
{
commandPath = WhichUtil.Which2(shellCommand, require: true, Trace, prependPath);
}
else
{
commandPath = WhichUtil.Which(shellCommand, require: true, Trace, prependPath);
}
}
ArgUtil.NotNullOrEmpty(commandPath, "Default Shell");
#else
shellCommand = "sh";
if (ExecutionContext.Global.Variables.GetBoolean("DistributedTask.UseWhich2") == true)
commandPath = WhichUtil.Which("bash", false, Trace, prependPath) ?? WhichUtil.Which("sh", true, Trace, prependPath);
{
commandPath = WhichUtil.Which2("bash", false, Trace, prependPath) ?? WhichUtil.Which2("sh", true, Trace, prependPath);
}
else
{
commandPath = WhichUtil.Which("bash", false, Trace, prependPath) ?? WhichUtil.Which("sh", true, Trace, prependPath);
}
#endif
argFormat = ScriptHandlerHelpers.GetScriptArgumentsFormat(shellCommand);
}
@@ -258,14 +209,7 @@ namespace GitHub.Runner.Worker.Handlers
if (!IsActionStep && systemShells.Contains(shell))
{
shellCommand = shell;
if (ExecutionContext.Global.Variables.GetBoolean("DistributedTask.UseWhich2") == true)
commandPath = WhichUtil.Which(shell, !isContainerStepHost, Trace, prependPath);
{
commandPath = WhichUtil.Which2(shell, !isContainerStepHost, Trace, prependPath);
}
else
{
commandPath = WhichUtil.Which(shell, !isContainerStepHost, Trace, prependPath);
}
if (shell == "bash") if (shell == "bash")
{ {
argFormat = ScriptHandlerHelpers.GetScriptArgumentsFormat("sh"); argFormat = ScriptHandlerHelpers.GetScriptArgumentsFormat("sh");
@@ -280,14 +224,7 @@ namespace GitHub.Runner.Worker.Handlers
var parsed = ScriptHandlerHelpers.ParseShellOptionString(shell);
shellCommand = parsed.shellCommand;
// For non-ContainerStepHost, the command must be located on the host by Which
if (ExecutionContext.Global.Variables.GetBoolean("DistributedTask.UseWhich2") == true)
commandPath = WhichUtil.Which(parsed.shellCommand, !isContainerStepHost, Trace, prependPath);
{
commandPath = WhichUtil.Which2(parsed.shellCommand, !isContainerStepHost, Trace, prependPath);
}
else
{
commandPath = WhichUtil.Which(parsed.shellCommand, !isContainerStepHost, Trace, prependPath);
}
argFormat = $"{parsed.shellArgs}".TrimStart(); argFormat = $"{parsed.shellArgs}".TrimStart();
if (string.IsNullOrEmpty(argFormat)) if (string.IsNullOrEmpty(argFormat))
{ {

View File

@@ -392,6 +392,18 @@ namespace GitHub.Runner.Worker
}
}
// Register custom image creation post-job step if the "snapshot" token is present in the message.
var snapshotRequest = templateEvaluator.EvaluateJobSnapshotRequest(message.Snapshot, jobContext.ExpressionValues, jobContext.ExpressionFunctions);
if (snapshotRequest != null)
{
var snapshotOperationProvider = HostContext.GetService<ISnapshotOperationProvider>();
jobContext.RegisterPostJobStep(new JobExtensionRunner(
runAsync: (executionContext, _) => snapshotOperationProvider.CreateSnapshotRequestAsync(executionContext, snapshotRequest),
condition: $"{PipelineTemplateConstants.Success}()",
displayName: $"Create custom image",
data: null));
}
// Register Job Completed hook if the variable is set
var completedHookPath = Environment.GetEnvironmentVariable("ACTIONS_RUNNER_HOOK_JOB_COMPLETED");
if (!string.IsNullOrEmpty(completedHookPath))

View File

@@ -42,6 +42,7 @@ namespace GitHub.Runner.Worker
Trace.Info("Job ID {0}", message.JobId); Trace.Info("Job ID {0}", message.JobId);
DateTime jobStartTimeUtc = DateTime.UtcNow; DateTime jobStartTimeUtc = DateTime.UtcNow;
_runnerSettings = HostContext.GetService<IConfigurationStore>().GetSettings();
IRunnerService server = null;
// add orchestration id to useragent for better correlation.
@@ -49,13 +50,9 @@ namespace GitHub.Runner.Worker
!string.IsNullOrEmpty(orchestrationId.Value))
{
HostContext.UserAgents.Add(new ProductInfoHeaderValue("OrchestrationId", orchestrationId.Value));
// make sure orchestration id is in the user-agent header.
VssUtil.InitializeVssClientSettings(HostContext.UserAgents, HostContext.WebProxy);
}
var jobServerQueueTelemetry = false;
if (message.Variables.TryGetValue("DistributedTask.EnableJobServerQueueTelemetry", out VariableValue enableJobServerQueueTelemetry) &&
!string.IsNullOrEmpty(enableJobServerQueueTelemetry?.Value))
{
jobServerQueueTelemetry = StringUtil.ConvertToBoolean(enableJobServerQueueTelemetry.Value);
}
ServiceEndpoint systemConnection = message.Resources.Endpoints.Single(x => string.Equals(x.Name, WellKnownServiceEndpointNames.SystemVssConnection, StringComparison.OrdinalIgnoreCase));
@@ -79,7 +76,7 @@ namespace GitHub.Runner.Worker
launchServer.InitializeLaunchClient(new Uri(launchReceiverEndpoint), accessToken);
}
_jobServerQueue = HostContext.GetService<IJobServerQueue>();
_jobServerQueue.Start(message, resultsServiceOnly: true, enableTelemetry: jobServerQueueTelemetry);
_jobServerQueue.Start(message, resultsServiceOnly: true);
}
else
{
@@ -101,7 +98,7 @@ namespace GitHub.Runner.Worker
VssConnection jobConnection = VssUtil.CreateConnection(jobServerUrl, jobServerCredential, delegatingHandlers);
await jobServer.ConnectAsync(jobConnection);
_jobServerQueue.Start(message, enableTelemetry: jobServerQueueTelemetry);
_jobServerQueue.Start(message);
server = jobServer;
}
@@ -161,8 +158,6 @@ namespace GitHub.Runner.Worker
jobContext.SetRunnerContext("os", VarUtil.OS); jobContext.SetRunnerContext("os", VarUtil.OS);
jobContext.SetRunnerContext("arch", VarUtil.OSArchitecture); jobContext.SetRunnerContext("arch", VarUtil.OSArchitecture);
_runnerSettings = HostContext.GetService<IConfigurationStore>().GetSettings();
jobContext.SetRunnerContext("name", _runnerSettings.AgentName); jobContext.SetRunnerContext("name", _runnerSettings.AgentName);
if (jobContext.Global.Variables.TryGetValue(WellKnownDistributedTaskVariables.RunnerEnvironment, out var runnerEnvironment)) if (jobContext.Global.Variables.TryGetValue(WellKnownDistributedTaskVariables.RunnerEnvironment, out var runnerEnvironment))
@@ -295,6 +290,14 @@ namespace GitHub.Runner.Worker
jobContext.Warning(string.Format(Constants.Runner.EnforcedNode12DetectedAfterEndOfLife, actions));
}
if (jobContext.Global.Variables.TryGetValue(Constants.Runner.EnforcedNode16DetectedAfterEndOfLifeEnvVariable, out var node20ForceWarnings) && (jobContext.Global.Variables.GetBoolean("DistributedTask.ForceGithubJavascriptActionsToNode20") ?? false))
{
var actions = string.Join(", ", StringUtil.ConvertFromJson<HashSet<string>>(node20ForceWarnings));
jobContext.Warning(string.Format(Constants.Runner.EnforcedNode16DetectedAfterEndOfLife, actions));
}
await ShutdownQueue(throwOnFailure: false);
// Make sure to clean temp after file upload since there may be pending file uploads that still use the TEMP dir.
_tempDirectoryManager?.CleanupTempDirectory();
@@ -400,6 +403,12 @@ namespace GitHub.Runner.Worker
jobContext.Warning(string.Format(Constants.Runner.EnforcedNode12DetectedAfterEndOfLife, actions));
}
if (jobContext.Global.Variables.TryGetValue(Constants.Runner.EnforcedNode16DetectedAfterEndOfLifeEnvVariable, out var node20ForceWarnings))
{
var actions = string.Join(", ", StringUtil.ConvertFromJson<HashSet<string>>(node20ForceWarnings));
jobContext.Warning(string.Format(Constants.Runner.EnforcedNode16DetectedAfterEndOfLife, actions));
}
try
{
var jobQueueTelemetry = await ShutdownQueue(throwOnFailure: true);

View File

@@ -0,0 +1,32 @@
#nullable enable
using System.IO;
using System.Threading.Tasks;
using GitHub.DistributedTask.Pipelines;
using GitHub.Runner.Common;
using GitHub.Runner.Sdk;
namespace GitHub.Runner.Worker;
[ServiceLocator(Default = typeof(SnapshotOperationProvider))]
public interface ISnapshotOperationProvider : IRunnerService
{
Task CreateSnapshotRequestAsync(IExecutionContext executionContext, Snapshot snapshotRequest);
}
public class SnapshotOperationProvider : RunnerService, ISnapshotOperationProvider
{
public Task CreateSnapshotRequestAsync(IExecutionContext executionContext, Snapshot snapshotRequest)
{
var snapshotRequestFilePath = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Root), ".snapshot", "request.json");
var snapshotRequestDirectoryPath = Path.GetDirectoryName(snapshotRequestFilePath);
if (snapshotRequestDirectoryPath != null)
{
Directory.CreateDirectory(snapshotRequestDirectoryPath);
}
IOUtil.SaveObject(snapshotRequest, snapshotRequestFilePath);
executionContext.Output($"Request written to: {snapshotRequestFilePath}");
executionContext.Output("This request will be processed after the job completes. You will not receive any feedback on the snapshot process within the workflow logs of this job.");
executionContext.Output("If the snapshot process is successful, you should see a new image with the requested name in the list of available custom images when creating a new GitHub-hosted Runner.");
return Task.CompletedTask;
}
}

View File

@@ -295,7 +295,7 @@ namespace GitHub.Runner.Worker
!jobCancellationToken.IsCancellationRequested)
{
Trace.Error($"Caught timeout exception from step: {ex.Message}");
step.ExecutionContext.Error("The action has timed out.");
step.ExecutionContext.Error($"The action '{step.DisplayName}' has timed out after {timeoutMinutes} minutes.");
step.ExecutionContext.Result = TaskResult.Failed;
}
else

View File

@@ -1,12 +1,11 @@
using System;
using System.ComponentModel;
using System.Diagnostics.CodeAnalysis;
namespace GitHub.Services.Common.Internal
{
[EditorBrowsable(EditorBrowsableState.Never)]
public static class RawHttpHeaders
{
public const String SessionHeader = "X-Runner-Session";
public const String SessionHeader = "X-Actions-Session";
}
}

View File

@@ -138,6 +138,8 @@ namespace GitHub.Services.Common
response.Dispose();
}
this.Settings.ApplyTo(request);
// Let's start with sending a token
IssuedToken token = null;
if (m_tokenProvider != null)

View File

@@ -214,25 +214,7 @@ namespace GitHub.Services.Common
// ConfigureAwait(false) enables the continuation to be run outside any captured
// SyncronizationContext (such as ASP.NET's) which keeps things from deadlocking...
var tmpResponse = await m_messageInvoker.SendAsync(request, tokenSource.Token).ConfigureAwait(false);
response = await m_messageInvoker.SendAsync(request, tokenSource.Token).ConfigureAwait(false);
if (Settings.AllowAutoRedirectForBroker && tmpResponse.StatusCode == HttpStatusCode.Redirect)
{
//Dispose of the previous response
tmpResponse?.Dispose();
var location = tmpResponse.Headers.Location;
request = new HttpRequestMessage(HttpMethod.Get, location);
// Reapply the token to new redirected request
ApplyToken(request, token, applyICredentialsToWebProxy: lastResponseDemandedProxyAuth);
// Resend the request
response = await m_messageInvoker.SendAsync(request, tokenSource.Token).ConfigureAwait(false);
}
else
{
response = tmpResponse;
}
traceInfo?.TraceRequestSendTime();

View File

@@ -110,16 +110,6 @@ namespace GitHub.Services.Common
set;
}
/// <summary>
/// Gets or sets a value indicating whether or not HttpClientHandler should follow redirect on outgoing broker requests
/// This is special since this also sends token in the request, where as default AllowAutoRedirect does not
/// </summary>
public Boolean AllowAutoRedirectForBroker
{
get;
set;
}
/// <summary>
/// Gets or sets a value indicating whether or not compression should be used on outgoing requests.
/// The default value is true.

View File

@@ -43,6 +43,7 @@ namespace GitHub.DistributedTask.Pipelines
TemplateToken jobOutputs,
IList<TemplateToken> defaults,
ActionsEnvironmentReference actionsEnvironment,
TemplateToken snapshot,
String messageType = JobRequestMessageTypes.PipelineAgentJobRequest)
{
this.MessageType = messageType;
@@ -57,6 +58,7 @@ namespace GitHub.DistributedTask.Pipelines
this.Workspace = workspaceOptions;
this.JobOutputs = jobOutputs;
this.ActionsEnvironment = actionsEnvironment;
this.Snapshot = snapshot;
m_variables = new Dictionary<String, VariableValue>(variables, StringComparer.OrdinalIgnoreCase);
m_maskHints = new List<MaskHint>(maskHints);
m_steps = new List<JobStep>(steps);
@@ -237,6 +239,13 @@ namespace GitHub.DistributedTask.Pipelines
set;
}
[DataMember(EmitDefaultValue = false)]
public TemplateToken Snapshot
{
get;
set;
}
/// <summary>
/// Gets the collection of variables associated with the current context.
/// </summary>

View File

@@ -29,6 +29,7 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
public const String Id = "id"; public const String Id = "id";
public const String If = "if"; public const String If = "if";
public const String Image = "image"; public const String Image = "image";
public const String ImageName = "image-name";
public const String Include = "include";
public const String Inputs = "inputs";
public const String Job = "job";
@@ -60,6 +61,7 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
public const String Services = "services";
public const String Shell = "shell";
public const String Skipped = "skipped";
public const String Snapshot = "snapshot";
public const String StepEnv = "step-env";
public const String StepIfResult = "step-if-result";
public const String StepWith = "step-with";

View File

@@ -346,6 +346,39 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
return result;
}
internal static Snapshot ConvertToJobSnapshotRequest(TemplateContext context, TemplateToken token)
{
string imageName = null;
if (token is StringToken snapshotStringLiteral)
{
imageName = snapshotStringLiteral.Value;
}
else
{
var snapshotMapping = token.AssertMapping($"{PipelineTemplateConstants.Snapshot}");
foreach (var snapshotPropertyPair in snapshotMapping)
{
var propertyName = snapshotPropertyPair.Key.AssertString($"{PipelineTemplateConstants.Snapshot} key");
switch (propertyName.Value)
{
case PipelineTemplateConstants.ImageName:
imageName = snapshotPropertyPair.Value.AssertString($"{PipelineTemplateConstants.Snapshot} {propertyName}").Value;
break;
default:
propertyName.AssertUnexpectedValue($"{PipelineTemplateConstants.Snapshot} key");
break;
}
}
}
if (String.IsNullOrEmpty(imageName))
{
return null;
}
return new Snapshot(imageName);
}
private static ActionStep ConvertToStep(
TemplateContext context,
TemplateToken stepsItem,

View File

@@ -370,6 +370,32 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
return result;
}
public Snapshot EvaluateJobSnapshotRequest(TemplateToken token,
DictionaryContextData contextData,
IList<IFunctionInfo> expressionFunctions)
{
var result = default(Snapshot);
if (token != null && token.Type != TokenType.Null)
{
var context = CreateContext(contextData, expressionFunctions);
try
{
token = TemplateEvaluator.Evaluate(context, PipelineTemplateConstants.Snapshot, token, 0, null, omitHeader: true);
context.Errors.Check();
result = PipelineTemplateConverter.ConvertToJobSnapshotRequest(context, token);
}
catch (Exception ex) when (!(ex is TemplateValidationException))
{
context.Errors.Add(ex);
}
context.Errors.Check();
}
return result;
}
private TemplateContext CreateContext(
DictionaryContextData contextData,
IList<IFunctionInfo> expressionFunctions,

View File

@@ -0,0 +1,17 @@
using System;
using System.Runtime.Serialization;
namespace GitHub.DistributedTask.Pipelines
{
[DataContract]
public class Snapshot
{
public Snapshot(string imageName)
{
ImageName = imageName;
}
[DataMember(EmitDefaultValue = false)]
public String ImageName { get; set; }
}
}

View File

@@ -71,7 +71,8 @@
"env": "job-env", "env": "job-env",
"outputs": "job-outputs", "outputs": "job-outputs",
"defaults": "job-defaults", "defaults": "job-defaults",
"steps": "steps" "steps": "steps",
"snapshot": "snapshot"
}
}
},
@@ -155,6 +156,24 @@
}
},
"snapshot": {
"one-of": [
"non-empty-string",
"snapshot-mapping"
]
},
"snapshot-mapping": {
"mapping": {
"properties": {
"image-name": {
"type": "non-empty-string",
"required": true
}
}
}
},
"runs-on": { "runs-on": {
"context": [ "context": [
"github", "github",

View File

@@ -0,0 +1,95 @@
using System;
using System.Collections.Generic;
using System.Globalization;
using System.IO;
using System.IO.Compression;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading;
using System.Threading.Tasks;
using GitHub.Services.Common;
using GitHub.Services.Common.Diagnostics;
using GitHub.Services.WebApi;
using Newtonsoft.Json;
namespace GitHub.DistributedTask.WebApi
{
[ResourceArea(TaskResourceIds.AreaId)]
public class ActionsRunServerHttpClient : TaskAgentHttpClient
{
private static readonly JsonSerializerSettings s_serializerSettings;
static ActionsRunServerHttpClient()
{
s_serializerSettings = new VssJsonMediaTypeFormatter().SerializerSettings;
s_serializerSettings.DateParseHandling = DateParseHandling.None;
s_serializerSettings.FloatParseHandling = FloatParseHandling.Double;
}
public ActionsRunServerHttpClient(
Uri baseUrl,
VssCredentials credentials)
: base(baseUrl, credentials)
{
}
public ActionsRunServerHttpClient(
Uri baseUrl,
VssCredentials credentials,
VssHttpRequestSettings settings)
: base(baseUrl, credentials, settings)
{
}
public ActionsRunServerHttpClient(
Uri baseUrl,
VssCredentials credentials,
params DelegatingHandler[] handlers)
: base(baseUrl, credentials, handlers)
{
}
public ActionsRunServerHttpClient(
Uri baseUrl,
VssCredentials credentials,
VssHttpRequestSettings settings,
params DelegatingHandler[] handlers)
: base(baseUrl, credentials, settings, handlers)
{
}
public ActionsRunServerHttpClient(
Uri baseUrl,
HttpMessageHandler pipeline,
Boolean disposeHandler)
: base(baseUrl, pipeline, disposeHandler)
{
}
public Task<Pipelines.AgentJobRequestMessage> GetJobMessageAsync(
string messageId,
object userState = null,
CancellationToken cancellationToken = default)
{
HttpMethod httpMethod = new HttpMethod("GET");
Guid locationId = new Guid("25adab70-1379-4186-be8e-b643061ebe3a");
object routeValues = new { messageId = messageId };
return SendAsync<Pipelines.AgentJobRequestMessage>(
httpMethod,
locationId,
routeValues: routeValues,
version: new ApiResourceVersion(6.0, 1),
userState: userState,
cancellationToken: cancellationToken);
}
protected override async Task<T> ReadJsonContentAsync<T>(HttpResponseMessage response, CancellationToken cancellationToken = default(CancellationToken))
{
var json = await response.Content.ReadAsStringAsync(cancellationToken).ConfigureAwait(false);
return JsonConvert.DeserializeObject<T>(json, s_serializerSettings);
}
}
}

View File

@@ -0,0 +1,38 @@
using System;
using System.Runtime.Serialization;
namespace GitHub.DistributedTask.WebApi
{
/// <summary>
/// Message that tells the runner to redirect itself to BrokerListener for messages.
/// (Note that we use a special Message instead of a simple 302. This is because
/// the runner will need to apply the runner's token to the request, and it is
/// a security best practice to *not* blindly add sensitive data to redirects
/// 302s.)
/// </summary>
[DataContract]
public class BrokerMigrationMessage
{
public static readonly string MessageType = "BrokerMigration";
public BrokerMigrationMessage()
{
}
public BrokerMigrationMessage(
Uri brokerUrl)
{
this.BrokerBaseUrl = brokerUrl;
}
/// <summary>
/// The base url for the broker listener
/// </summary>
[DataMember]
public Uri BrokerBaseUrl
{
get;
internal set;
}
}
}

View File

@@ -2498,6 +2498,25 @@ namespace GitHub.DistributedTask.WebApi
}
}
[Serializable]
public class NonRetryableActionDownloadInfoException : DistributedTaskException
{
public NonRetryableActionDownloadInfoException(String message)
: base(message)
{
}
public NonRetryableActionDownloadInfoException(String message, Exception innerException)
: base(message, innerException)
{
}
protected NonRetryableActionDownloadInfoException(SerializationInfo info, StreamingContext context)
: base(info, context)
{
}
}
[Serializable]
public sealed class FailedToResolveActionDownloadInfoException : DistributedTaskException
{

View File

@@ -141,24 +141,6 @@ namespace GitHub.DistributedTask.WebApi
return ReplaceAgentAsync(poolId, agent.Id, agent, userState, cancellationToken);
}
public Task<Pipelines.AgentJobRequestMessage> GetJobMessageAsync(
string messageId,
object userState = null,
CancellationToken cancellationToken = default)
{
HttpMethod httpMethod = new HttpMethod("GET");
Guid locationId = new Guid("25adab70-1379-4186-be8e-b643061ebe3a");
object routeValues = new { messageId = messageId };
return SendAsync<Pipelines.AgentJobRequestMessage>(
httpMethod,
locationId,
routeValues: routeValues,
version: new ApiResourceVersion(6.0, 1),
userState: userState,
cancellationToken: cancellationToken);
}
protected Task<T> SendAsync<T>(
HttpMethod method,
Guid locationId,

View File

@@ -0,0 +1,10 @@
using System;
using System.Runtime.Serialization;
namespace GitHub.DistributedTask.WebApi
{
public sealed class TaskAgentMessageTypes
{
public static readonly string ForceTokenRefresh = "ForceTokenRefresh";
}
}

View File

@@ -75,5 +75,12 @@ namespace GitHub.DistributedTask.WebApi
get;
set;
}
[DataMember(EmitDefaultValue = false, IsRequired = false)]
public BrokerMigrationMessage BrokerMigrationMessage
{
get;
set;
}
}
}

View File

@@ -101,6 +101,7 @@ namespace GitHub.DistributedTask.WebApi
public static readonly String FileAttachment = "DistributedTask.Core.FileAttachment";
public static readonly String DiagnosticLog = "DistributedTask.Core.DiagnosticLog";
public static readonly String ResultsLog = "Results.Core.Log";
public static readonly String ResultsDiagnosticLog = "Results.Core.DiagnosticLog";
}
[GenerateAllConstants]

View File

@@ -7,5 +7,8 @@ namespace GitHub.Actions.RunService.WebApi
{
[DataMember(Name = "jobMessageId", EmitDefaultValue = false)]
public string JobMessageId { get; set; }
[DataMember(Name = "runnerOS", EmitDefaultValue = false)]
public string RunnerOS { get; set; }
}
}

View File

@@ -9,6 +9,7 @@ using GitHub.DistributedTask.WebApi;
using GitHub.Services.Common;
using GitHub.Services.OAuth;
using GitHub.Services.WebApi;
using Newtonsoft.Json;
using Sdk.RSWebApi.Contracts;
using Sdk.WebApi.WebApi;
@@ -16,6 +17,15 @@ namespace GitHub.Actions.RunService.WebApi
{
public class RunServiceHttpClient : RawHttpClientBase
{
private static readonly JsonSerializerSettings s_serializerSettings;
static RunServiceHttpClient()
{
s_serializerSettings = new VssJsonMediaTypeFormatter().SerializerSettings;
s_serializerSettings.DateParseHandling = DateParseHandling.None;
s_serializerSettings.FloatParseHandling = FloatParseHandling.Double;
}
public RunServiceHttpClient(
Uri baseUrl,
VssOAuthCredential credentials)
@@ -59,12 +69,14 @@ namespace GitHub.Actions.RunService.WebApi
public async Task<AgentJobRequestMessage> GetJobMessageAsync(
Uri requestUri,
string messageId,
string runnerOS,
CancellationToken cancellationToken = default)
{
HttpMethod httpMethod = new HttpMethod("POST");
var payload = new AcquireJobRequest
{
JobMessageId = messageId,
RunnerOS = runnerOS
};
requestUri = new Uri(requestUri, "acquirejob");
@@ -172,5 +184,11 @@ namespace GitHub.Actions.RunService.WebApi
throw new Exception($"Failed to renew job: {result.Error}"); throw new Exception($"Failed to renew job: {result.Error}");
} }
} }
protected override async Task<T> ReadJsonContentAsync<T>(HttpResponseMessage response, CancellationToken cancellationToken = default(CancellationToken))
{
var json = await response.Content.ReadAsStringAsync(cancellationToken).ConfigureAwait(false);
return JsonConvert.DeserializeObject<T>(json, s_serializerSettings);
}
}
}
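A minimal illustration of why the serializer above turns off date parsing: with Json.NET defaults, ISO-8601-looking strings are converted to DateTime while reading, which can reformat or shift the timestamps carried in the job message; with DateParseHandling.None the original text survives. The property name in this demo is made up.

using System;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

static class DatePreservationSketch
{
    public static void Main()
    {
        const string json = "{\"startTime\":\"2024-05-02T10:44:57.1234567Z\"}";

        // Default behavior: the token is parsed into a DateTime and the original text is lost.
        var parsedDefault = JsonConvert.DeserializeObject<JObject>(json);
        Console.WriteLine(parsedDefault["startTime"].Type);   // Date

        // DateParseHandling.None: the token stays a plain string, exactly as it was sent.
        var settings = new JsonSerializerSettings { DateParseHandling = DateParseHandling.None };
        var parsedPreserved = JsonConvert.DeserializeObject<JObject>(json, settings);
        Console.WriteLine(parsedPreserved["startTime"].Type); // String
        Console.WriteLine(parsedPreserved["startTime"]);      // 2024-05-02T10:44:57.1234567Z
    }
}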

View File

@@ -13,6 +13,7 @@
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Azure.Storage.Blobs" Version="12.19.1" />
<PackageReference Include="Microsoft.Win32.Registry" Version="4.4.0" /> <PackageReference Include="Microsoft.Win32.Registry" Version="4.4.0" />
<PackageReference Include="Newtonsoft.Json" Version="13.0.3" /> <PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
<PackageReference Include="Microsoft.AspNet.WebApi.Client" Version="5.2.9" /> <PackageReference Include="Microsoft.AspNet.WebApi.Client" Version="5.2.9" />

View File

@@ -57,6 +57,7 @@ namespace GitHub.Actions.RunService.WebApi
}
public async Task<TaskAgentMessage> GetRunnerMessageAsync(
Guid? sessionId,
string runnerVersion,
TaskAgentStatus? status,
string os = null,
@@ -69,6 +70,11 @@ namespace GitHub.Actions.RunService.WebApi
List<KeyValuePair<string, string>> queryParams = new List<KeyValuePair<string, string>>();
if (sessionId != null)
{
queryParams.Add("sessionId", sessionId.Value.ToString());
}
if (status != null)
{
queryParams.Add("status", status.Value.ToString());
@@ -104,12 +110,67 @@ namespace GitHub.Actions.RunService.WebApi
return result.Value;
}
// the only time we throw a `Forbidden` exception from Listener /messages is when the runner is
// disable_update and is too old to poll
if (result.StatusCode == HttpStatusCode.Forbidden)
{
throw new AccessDeniedException($"{result.Error} Runner version v{runnerVersion} is deprecated and cannot receive messages.")
{
ErrorCode = 1
};
}
throw new Exception($"Failed to get job message: {result.Error}");
}
public async Task<TaskAgentSession> CreateSessionAsync(
TaskAgentSession session,
CancellationToken cancellationToken = default)
{
var requestUri = new Uri(Client.BaseAddress, "session");
var requestContent = new ObjectContent<TaskAgentSession>(session, new VssJsonMediaTypeFormatter(true));
var result = await SendAsync<TaskAgentSession>(
new HttpMethod("POST"),
requestUri: requestUri,
content: requestContent,
cancellationToken: cancellationToken);
if (result.IsSuccess)
{
return result.Value;
}
if (result.StatusCode == HttpStatusCode.Forbidden)
{
throw new AccessDeniedException(result.Error);
}
throw new Exception($"Failed to get job message: {result.Error}");
if (result.StatusCode == HttpStatusCode.Conflict)
{
throw new TaskAgentSessionConflictException(result.Error);
}
throw new Exception($"Failed to create broker session: {result.Error}");
}
public async Task DeleteSessionAsync(
CancellationToken cancellationToken = default)
{
var requestUri = new Uri(Client.BaseAddress, $"session");
var result = await SendAsync<object>(
new HttpMethod("DELETE"),
requestUri: requestUri,
cancellationToken: cancellationToken);
if (result.IsSuccess)
{
return;
}
throw new Exception($"Failed to delete broker session: {result.Error}");
} }
} }
} }
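
For orientation, the listener tests later in this compare view consume these outcomes through a CreateSessionResult enum. A minimal sketch of how a caller might fold the exceptions thrown above into that enum follows; it relies on types defined elsewhere in this repository (IBrokerServer, TaskAgentSession, CreateSessionResult), and treating a session conflict as a plain Failure is an assumption made for illustration, not necessarily the runner's actual policy.

// Illustration only: mapping broker session-creation outcomes to CreateSessionResult.
// Uses repository types; the conflict-handling policy shown here is assumed.
static async Task<CreateSessionResult> TryCreateBrokerSessionAsync(
    IBrokerServer brokerServer, TaskAgentSession session, CancellationToken token)
{
    try
    {
        await brokerServer.CreateSessionAsync(session, token);
        return CreateSessionResult.Success;
    }
    catch (TaskAgentSessionConflictException)
    {
        // Another live runner already owns a session with the same identity.
        return CreateSessionResult.Failure;
    }
    catch (AccessDeniedException)
    {
        // Credentials rejected; not retryable without reconfiguration.
        return CreateSessionResult.Failure;
    }
}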

View File

@@ -89,6 +89,26 @@ namespace GitHub.Services.Results.Contracts
        public long SoftSizeLimit;
    }

    [DataContract]
    [JsonObject(NamingStrategyType = typeof(SnakeCaseNamingStrategy))]
    public class GetSignedDiagnosticLogsURLRequest
    {
        [DataMember]
        public string WorkflowJobRunBackendId;

        [DataMember]
        public string WorkflowRunBackendId;
    }

    [DataContract]
    [JsonObject(NamingStrategyType = typeof(SnakeCaseNamingStrategy))]
    public class GetSignedDiagnosticLogsURLResponse
    {
        [DataMember]
        public string DiagLogsURL;

        [DataMember]
        public string BlobStorageType;
    }

    [DataContract]
    [JsonObject(NamingStrategyType = typeof(SnakeCaseNamingStrategy))]
    public class JobLogsMetadataCreate
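
As a quick illustration of what the SnakeCaseNamingStrategy attribute implies for these new contracts on the wire, the sketch below serializes the request with Newtonsoft.Json. The identifier values are placeholders, and the exact snake_case spelling in the output is an expectation drawn from the naming strategy, not taken from the service itself.

// Illustration only: expected request shape for GetSignedDiagnosticLogsURL.
using System;
using Newtonsoft.Json;
using GitHub.Services.Results.Contracts;

class DiagLogsRequestShapeExample
{
    static void Main()
    {
        var request = new GetSignedDiagnosticLogsURLRequest
        {
            WorkflowRunBackendId = "run-backend-id",       // placeholder value
            WorkflowJobRunBackendId = "job-backend-id",    // placeholder value
        };

        // Expected to print something close to:
        // {"workflow_job_run_backend_id":"job-backend-id","workflow_run_backend_id":"run-backend-id"}
        Console.WriteLine(JsonConvert.SerializeObject(request));
    }
}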

View File

@@ -1,6 +1,5 @@
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Net.Http;
@@ -8,8 +7,11 @@ using System.Net.Http.Headers;
using System.Threading;
using System.Threading.Tasks;
using System.Net.Http.Formatting;
using Azure;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Azure.Storage.Blobs.Specialized;
using GitHub.DistributedTask.WebApi;
using GitHub.Services.Common;
using GitHub.Services.Results.Contracts;
using Sdk.WebApi.WebApi;
@@ -21,13 +23,15 @@ namespace GitHub.Services.Results.Client
            Uri baseUrl,
            HttpMessageHandler pipeline,
            string token,
            bool disposeHandler,
            bool useSdk)
            : base(baseUrl, pipeline, disposeHandler)
        {
            m_token = token;
            m_resultsServiceUrl = baseUrl;
            m_formatter = new JsonMediaTypeFormatter();
            m_changeIdCounter = 1;
            m_useSdk = useSdk;
        }
        // Get Sas URL calls
@@ -77,6 +81,19 @@ namespace GitHub.Services.Results.Client
            return await GetResultsSignedURLResponse<GetSignedStepLogsURLRequest, GetSignedStepLogsURLResponse>(getStepLogsSignedBlobURLEndpoint, cancellationToken, request);
        }

        private async Task<GetSignedDiagnosticLogsURLResponse> GetDiagnosticLogsUploadUrlAsync(string planId, string jobId, CancellationToken cancellationToken)
        {
            var request = new GetSignedDiagnosticLogsURLRequest()
            {
                WorkflowJobRunBackendId = jobId,
                WorkflowRunBackendId = planId,
            };

            var getDiagnosticLogsSignedBlobURLEndpoint = new Uri(m_resultsServiceUrl, Constants.GetJobDiagLogsSignedBlobURL);
            return await GetResultsSignedURLResponse<GetSignedDiagnosticLogsURLRequest, GetSignedDiagnosticLogsURLResponse>(getDiagnosticLogsSignedBlobURLEndpoint, cancellationToken, request);
        }

        private async Task<GetSignedJobLogsURLResponse> GetJobLogUploadUrlAsync(string planId, string jobId, CancellationToken cancellationToken)
        {
            var request = new GetSignedJobLogsURLRequest()
@@ -91,7 +108,6 @@ namespace GitHub.Services.Results.Client
        }

        // Create metadata calls
        private async Task SendRequest<R>(Uri uri, CancellationToken cancellationToken, R request, string timestamp)
        {
            using (HttpRequestMessage requestMessage = new HttpRequestMessage(HttpMethod.Post, uri))
@@ -161,73 +177,219 @@ namespace GitHub.Services.Results.Client
            await SendRequest<JobLogsMetadataCreate>(createJobLogsMetadataEndpoint, cancellationToken, request, timestamp);
        }
        private (Uri path, string sas) ParseSasToken(string url)
        {
            if (String.IsNullOrEmpty(url))
            {
                throw new Exception($"SAS url is empty");
            }

            var blobUri = new UriBuilder(url);
            var sasUrl = blobUri.Query.Substring(1); //remove starting "?"
            blobUri.Query = null; // remove query params

            return (blobUri.Uri, sasUrl);
        }

        private BlobClient GetBlobClient(string url)
        {
            var blobUri = ParseSasToken(url);
            var opts = new BlobClientOptions
            {
                Retry =
                {
                    MaxRetries = Constants.DefaultBlobUploadRetries,
                    NetworkTimeout = TimeSpan.FromSeconds(Constants.DefaultNetworkTimeoutInSeconds)
                }
            };
            return new BlobClient(blobUri.path, new AzureSasCredential(blobUri.sas), opts);
        }

        private AppendBlobClient GetAppendBlobClient(string url)
        {
            var blobUri = ParseSasToken(url);
            var opts = new BlobClientOptions
            {
                Retry =
                {
                    MaxRetries = Constants.DefaultBlobUploadRetries,
                    NetworkTimeout = TimeSpan.FromSeconds(Constants.DefaultNetworkTimeoutInSeconds)
                }
            };
            return new AppendBlobClient(blobUri.path, new AzureSasCredential(blobUri.sas), opts);
        }

        private async Task UploadBlockFileAsync(string url, string blobStorageType, FileStream file, CancellationToken cancellationToken, Dictionary<string, string> customHeaders = null)
        {
            if (m_useSdk && blobStorageType == BlobStorageTypes.AzureBlobStorage)
            {
                var blobClient = GetBlobClient(url);
                var httpHeaders = new BlobHttpHeaders();
                if (customHeaders != null)
                {
                    foreach (var header in customHeaders)
                    {
                        switch (header.Key)
                        {
                            case Constants.ContentTypeHeader:
                                httpHeaders.ContentType = header.Value;
                                break;
                        }
                    }
                }

                try
                {
                    await blobClient.UploadAsync(file, new BlobUploadOptions()
                    {
                        HttpHeaders = httpHeaders,
                        Conditions = new BlobRequestConditions
                        {
                            IfNoneMatch = new ETag("*")
                        }
                    }, cancellationToken);
                }
                catch (RequestFailedException e)
                {
                    throw new Exception($"Failed to upload block to Azure blob: {e.Message}");
                }
            }
            else
            {
                // Upload the file to the url
                var request = new HttpRequestMessage(HttpMethod.Put, url)
                {
                    Content = new StreamContent(file)
                };

                if (blobStorageType == BlobStorageTypes.AzureBlobStorage)
                {
                    request.Content.Headers.Add(Constants.AzureBlobTypeHeader, Constants.AzureBlockBlob);
                }

                if (customHeaders != null)
                {
                    foreach (var header in customHeaders)
                    {
                        request.Content.Headers.Add(header.Key, header.Value);
                    }
                };

                using (var response = await SendAsync(request, HttpCompletionOption.ResponseHeadersRead, userState: null, cancellationToken))
                {
                    if (!response.IsSuccessStatusCode)
                    {
                        throw new Exception($"Failed to upload file, status code: {response.StatusCode}, reason: {response.ReasonPhrase}");
                    }
                }
            }
        }

        private async Task CreateAppendFileAsync(string url, string blobStorageType, CancellationToken cancellationToken, Dictionary<string, string> customHeaders = null)
        {
            if (m_useSdk && blobStorageType == BlobStorageTypes.AzureBlobStorage)
            {
                var appendBlobClient = GetAppendBlobClient(url);
                var httpHeaders = new BlobHttpHeaders();
                if (customHeaders != null)
                {
                    foreach (var header in customHeaders)
                    {
                        switch (header.Key)
                        {
                            case Constants.ContentTypeHeader:
                                httpHeaders.ContentType = header.Value;
                                break;
                        }
                    }
                }

                try
                {
                    await appendBlobClient.CreateAsync(new AppendBlobCreateOptions()
                    {
                        HttpHeaders = httpHeaders,
                        Conditions = new AppendBlobRequestConditions
                        {
                            IfNoneMatch = new ETag("*")
                        }
                    }, cancellationToken: cancellationToken);
                }
                catch (RequestFailedException e)
                {
                    throw new Exception($"Failed to create append blob in Azure blob: {e.Message}");
                }
            }
            else
            {
                var request = new HttpRequestMessage(HttpMethod.Put, url)
                {
                    Content = new StringContent("")
                };

                if (blobStorageType == BlobStorageTypes.AzureBlobStorage)
                {
                    request.Content.Headers.Add(Constants.AzureBlobTypeHeader, Constants.AzureAppendBlob);
                    request.Content.Headers.Add("Content-Length", "0");
                }

                if (customHeaders != null)
                {
                    foreach (var header in customHeaders)
                    {
                        request.Content.Headers.Add(header.Key, header.Value);
                    }
                };

                using (var response = await SendAsync(request, HttpCompletionOption.ResponseHeadersRead, userState: null, cancellationToken))
                {
                    if (!response.IsSuccessStatusCode)
                    {
                        throw new Exception($"Failed to create append file, status code: {response.StatusCode}, reason: {response.ReasonPhrase}");
                    }
                }
            }
        }

        private async Task UploadAppendFileAsync(string url, string blobStorageType, FileStream file, bool finalize, long fileSize, CancellationToken cancellationToken)
        {
            if (m_useSdk && blobStorageType == BlobStorageTypes.AzureBlobStorage)
            {
                var appendBlobClient = GetAppendBlobClient(url);
                try
                {
                    await appendBlobClient.AppendBlockAsync(file, cancellationToken: cancellationToken);
                    if (finalize)
                    {
                        await appendBlobClient.SealAsync(cancellationToken: cancellationToken);
                    }
                }
                catch (RequestFailedException e)
                {
                    throw new Exception($"Failed to upload append block in Azure blob: {e.Message}");
                }
            }
            else
            {
                var comp = finalize ? "&comp=appendblock&seal=true" : "&comp=appendblock";
                // Upload the file to the url
                var request = new HttpRequestMessage(HttpMethod.Put, url + comp)
                {
                    Content = new StreamContent(file)
                };

                if (blobStorageType == BlobStorageTypes.AzureBlobStorage)
                {
                    request.Content.Headers.Add("Content-Length", fileSize.ToString());
                    request.Content.Headers.Add(Constants.AzureBlobSealedHeader, finalize.ToString());
                }

                using (var response = await SendAsync(request, HttpCompletionOption.ResponseHeadersRead, userState: null, cancellationToken))
                {
                    if (!response.IsSuccessStatusCode)
                    {
                        throw new Exception($"Failed to upload append file, status code: {response.StatusCode}, reason: {response.ReasonPhrase}, object: {response}, fileSize: {fileSize}");
                    }
                }
            }
        }
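
For readers unfamiliar with the SAS handling above, ParseSasToken only splits a signed blob URL into the bare blob path (handed to BlobClient/AppendBlobClient) and the query string (handed to AzureSasCredential). A standalone sketch of that split, using a made-up URL, follows; the URL and token values are placeholders, not real endpoints or credentials.

// Illustration only: how a SAS-signed URL is split before being handed to the Azure SDK.
using System;

class SasSplitExample
{
    static void Main()
    {
        // Made-up placeholder URL; not a real endpoint or credential.
        var signed = "https://example.blob.core.windows.net/results/job.log?sv=2021-08-06&sig=PLACEHOLDER";

        var builder = new UriBuilder(signed);
        var sas = builder.Query.TrimStart('?');  // credential portion, e.g. "sv=...&sig=..."
        builder.Query = null;                    // strip the query, leaving the bare blob path

        Console.WriteLine(builder.Uri);          // bare blob URL
        Console.WriteLine(sas);                  // SAS token passed to AzureSasCredential
    }
}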
@@ -251,23 +413,22 @@ namespace GitHub.Services.Results.Client
            // Upload the file
            using (var fileStream = new FileStream(file, FileMode.Open, FileAccess.Read, FileShare.Read, 4096, true))
            {
                await UploadBlockFileAsync(uploadUrlResponse.SummaryUrl, uploadUrlResponse.BlobStorageType, fileStream, cancellationToken);
            }

            // Send step summary upload complete message
            await StepSummaryUploadCompleteAsync(planId, jobId, stepId, fileSize, cancellationToken);
        }
        private async Task UploadLogFile(string file, bool finalize, bool firstBlock, string sasUrl, string blobStorageType,
            CancellationToken cancellationToken, Dictionary<string, string> customHeaders = null)
        {
            if (firstBlock && finalize)
            {
                // This is the one and only block, just use a block blob
                using (var fileStream = new FileStream(file, FileMode.Open, FileAccess.Read, FileShare.Read, 4096, true))
                {
                    await UploadBlockFileAsync(sasUrl, blobStorageType, fileStream, cancellationToken, customHeaders);
                }
            }
            else
@@ -276,18 +437,16 @@ namespace GitHub.Services.Results.Client
            {
                // Create the Append blob
                if (firstBlock)
                {
                    await CreateAppendFileAsync(sasUrl, blobStorageType, cancellationToken, customHeaders);
                }

                // Upload content
                var fileSize = new FileInfo(file).Length;
                using (var fileStream = new FileStream(file, FileMode.Open, FileAccess.Read, FileShare.Read, 4096, true))
                {
                    await UploadAppendFileAsync(sasUrl, blobStorageType, fileStream, finalize, fileSize, cancellationToken);
                }
            }
        }
        // Handle file upload for step log
@@ -300,7 +459,12 @@ namespace GitHub.Services.Results.Client
throw new Exception("Failed to get step log upload url"); throw new Exception("Failed to get step log upload url");
} }
await UploadLogFile(file, finalize, firstBlock, uploadUrlResponse.LogsUrl, uploadUrlResponse.BlobStorageType, cancellationToken); var customHeaders = new Dictionary<string, string>
{
{ Constants.ContentTypeHeader, Constants.TextPlainContentType }
};
await UploadLogFile(file, finalize, firstBlock, uploadUrlResponse.LogsUrl, uploadUrlResponse.BlobStorageType, cancellationToken, customHeaders);
// Update metadata // Update metadata
if (finalize) if (finalize)
@@ -320,7 +484,12 @@ namespace GitHub.Services.Results.Client
throw new Exception("Failed to get job log upload url"); throw new Exception("Failed to get job log upload url");
} }
await UploadLogFile(file, finalize, firstBlock, uploadUrlResponse.LogsUrl, uploadUrlResponse.BlobStorageType, cancellationToken); var customHeaders = new Dictionary<string, string>
{
{ Constants.ContentTypeHeader, Constants.TextPlainContentType }
};
await UploadLogFile(file, finalize, firstBlock, uploadUrlResponse.LogsUrl, uploadUrlResponse.BlobStorageType, cancellationToken, customHeaders);
// Update metadata // Update metadata
if (finalize) if (finalize)
@@ -330,6 +499,18 @@ namespace GitHub.Services.Results.Client
            }
        }
public async Task UploadResultsDiagnosticLogsAsync(string planId, string jobId, string file, CancellationToken cancellationToken)
{
// Get the upload url
var uploadUrlResponse = await GetDiagnosticLogsUploadUrlAsync(planId, jobId, cancellationToken);
if (uploadUrlResponse == null || uploadUrlResponse.DiagLogsURL == null)
{
throw new Exception("Failed to get diagnostic logs upload url");
}
await UploadLogFile(file, true, true, uploadUrlResponse.DiagLogsURL, uploadUrlResponse.BlobStorageType, cancellationToken);
}
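
A minimal calling sketch for the new diagnostic-log upload entry point follows; all argument values are placeholders, and the sketch is written as if it lived inside the same client class so that UploadResultsDiagnosticLogsAsync can be called directly.

// Illustration only: a call site for the new API, written as if from within the same client class.
// All argument values below are placeholders.
public async Task UploadDiagLogsExampleAsync(string planId, string jobId, CancellationToken token)
{
    await UploadResultsDiagnosticLogsAsync(planId, jobId, "/path/to/Worker_xxxx.log", token);
}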
        private Step ConvertTimelineRecordToStep(TimelineRecord r)
        {
            return new Step()
@@ -405,6 +586,7 @@ namespace GitHub.Services.Results.Client
        private Uri m_resultsServiceUrl;
        private string m_token;
        private int m_changeIdCounter;
        private bool m_useSdk;
    }

    // Constants specific to results
@@ -419,13 +601,20 @@ namespace GitHub.Services.Results.Client
        public static readonly string CreateStepLogsMetadata = ResultsReceiverTwirpEndpoint + "CreateStepLogsMetadata";
        public static readonly string GetJobLogsSignedBlobURL = ResultsReceiverTwirpEndpoint + "GetJobLogsSignedBlobURL";
        public static readonly string CreateJobLogsMetadata = ResultsReceiverTwirpEndpoint + "CreateJobLogsMetadata";
        public static readonly string GetJobDiagLogsSignedBlobURL = ResultsReceiverTwirpEndpoint + "GetJobDiagLogsSignedBlobURL";

        public static readonly string ResultsProtoApiV1Endpoint = "twirp/github.actions.results.api.v1.WorkflowStepUpdateService/";
        public static readonly string WorkflowStepsUpdate = ResultsProtoApiV1Endpoint + "WorkflowStepsUpdate";

        public static readonly int DefaultNetworkTimeoutInSeconds = 30;
        public static readonly int DefaultBlobUploadRetries = 3;

        public static readonly string AzureBlobSealedHeader = "x-ms-blob-sealed";
        public static readonly string AzureBlobTypeHeader = "x-ms-blob-type";
        public static readonly string AzureBlockBlob = "BlockBlob";
        public static readonly string AzureAppendBlob = "AppendBlob";

        public const string ContentTypeHeader = "Content-Type";
        public const string TextPlainContentType = "text/plain";
    }
}

View File

@@ -172,43 +172,6 @@ namespace GitHub.Runner.Common.Tests
            }
        }
[Theory]
[InlineData("randomString", "0", "VERB", "FALSELEVEL")]
[InlineData("", "-1", "", "INFO")]
[InlineData("", "6", "VERB", "FALSELEVEL")]
[InlineData("", "99", "VERB", "FALSELEVEL")]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public void TraceLevel(string trace, string traceLevel, string mustContainTraceLevel, string mustNotContainTraceLevel)
{
try
{
Environment.SetEnvironmentVariable("GITHUB_ACTIONS_RUNNER_TRACE", trace);
Environment.SetEnvironmentVariable("ACTIONS_RUNNER_TRACE_LEVEL", traceLevel);
// Arrange.
Setup();
_hc.SetDefaultCulture("hu-HU"); // logs [VERB] if traceLevel allows it
// Assert.
var logFile = Path.Combine(Path.GetDirectoryName(Assembly.GetEntryAssembly().Location), $"trace_{nameof(HostContextL0)}_{nameof(TraceLevel)}.log");
var tempFile = Path.GetTempFileName();
File.Delete(tempFile);
File.Copy(logFile, tempFile);
var content = File.ReadAllText(tempFile);
Assert.Contains($"{mustContainTraceLevel}", content);
Assert.DoesNotContain($"{mustNotContainTraceLevel}", content);
}
finally
{
Environment.SetEnvironmentVariable("GITHUB_ACTIONS_RUNNER_TRACE", null);
Environment.SetEnvironmentVariable("ACTIONS_RUNNER_TRACE_LEVEL", null);
// Cleanup.
Teardown();
}
}
        private void Setup([CallerMemberName] string testName = "")
        {
            _tokenSource = new CancellationTokenSource();

View File

@@ -0,0 +1,81 @@
using System;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Tasks;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Listener;
using GitHub.Runner.Listener.Configuration;
using GitHub.Services.Common;
using Moq;
using Xunit;
namespace GitHub.Runner.Common.Tests.Listener
{
public sealed class BrokerMessageListenerL0
{
private readonly RunnerSettings _settings;
private readonly Mock<IConfigurationManager> _config;
private readonly Mock<IBrokerServer> _brokerServer;
private readonly Mock<ICredentialManager> _credMgr;
private Mock<IConfigurationStore> _store;
public BrokerMessageListenerL0()
{
_settings = new RunnerSettings { AgentId = 1, AgentName = "myagent", PoolId = 123, PoolName = "default", ServerUrl = "http://myserver", WorkFolder = "_work", ServerUrlV2 = "http://myserverv2" };
_config = new Mock<IConfigurationManager>();
_config.Setup(x => x.LoadSettings()).Returns(_settings);
_credMgr = new Mock<ICredentialManager>();
_store = new Mock<IConfigurationStore>();
_brokerServer = new Mock<IBrokerServer>();
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void CreatesSession()
{
using (TestHostContext tc = CreateTestContext())
using (var tokenSource = new CancellationTokenSource())
{
Tracing trace = tc.GetTrace();
// Arrange.
var expectedSession = new TaskAgentSession();
_brokerServer
.Setup(x => x.CreateSessionAsync(
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token))
.Returns(Task.FromResult(expectedSession));
_credMgr.Setup(x => x.LoadCredentials()).Returns(new VssCredentials());
_store.Setup(x => x.GetCredentials()).Returns(new CredentialData() { Scheme = Constants.Configuration.OAuthAccessToken });
_store.Setup(x => x.GetMigratedCredentials()).Returns(default(CredentialData));
// Act.
BrokerMessageListener listener = new();
listener.Initialize(tc);
CreateSessionResult result = await listener.CreateSessionAsync(tokenSource.Token);
trace.Info("result: {0}", result);
// Assert.
Assert.Equal(CreateSessionResult.Success, result);
_brokerServer
.Verify(x => x.CreateSessionAsync(
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token), Times.Once());
}
}
private TestHostContext CreateTestContext([CallerMemberName] String testName = "")
{
TestHostContext tc = new(this, testName);
tc.SetSingleton<IConfigurationManager>(_config.Object);
tc.SetSingleton<ICredentialManager>(_credMgr.Object);
tc.SetSingleton<IConfigurationStore>(_store.Object);
tc.SetSingleton<IBrokerServer>(_brokerServer.Object);
return tc;
}
}
}

View File

@@ -41,7 +41,7 @@ namespace GitHub.Runner.Common.Tests.Listener
            TaskOrchestrationPlanReference plan = new();
            TimelineReference timeline = null;
            Guid jobId = Guid.NewGuid();
            var result = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, "someJob", "someJob", null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null, null);
            result.ContextData["github"] = new Pipelines.ContextData.DictionaryContextData();
            return result;
        }
@@ -806,7 +806,8 @@ namespace GitHub.Runner.Common.Tests.Listener
                },
                null,
                new List<TemplateToken>(),
                new ActionsEnvironmentReference("env"),
                null
            );
            return message;
        }

View File

@@ -24,6 +24,8 @@ namespace GitHub.Runner.Common.Tests.Listener
        private Mock<ICredentialManager> _credMgr;
        private Mock<IConfigurationStore> _store;
        private Mock<IBrokerServer> _brokerServer;

        public MessageListenerL0()
        {
            _settings = new RunnerSettings { AgentId = 1, AgentName = "myagent", PoolId = 123, PoolName = "default", ServerUrl = "http://myserver", WorkFolder = "_work" };
@@ -32,6 +34,7 @@ namespace GitHub.Runner.Common.Tests.Listener
            _runnerServer = new Mock<IRunnerServer>();
            _credMgr = new Mock<ICredentialManager>();
            _store = new Mock<IConfigurationStore>();
            _brokerServer = new Mock<IBrokerServer>();
        }

        private TestHostContext CreateTestContext([CallerMemberName] String testName = "")
@@ -41,6 +44,7 @@ namespace GitHub.Runner.Common.Tests.Listener
            tc.SetSingleton<IRunnerServer>(_runnerServer.Object);
            tc.SetSingleton<ICredentialManager>(_credMgr.Object);
            tc.SetSingleton<IConfigurationStore>(_store.Object);
            tc.SetSingleton<IBrokerServer>(_brokerServer.Object);
            return tc;
        }
@@ -71,16 +75,82 @@ namespace GitHub.Runner.Common.Tests.Listener
                MessageListener listener = new();
                listener.Initialize(tc);

                CreateSessionResult result = await listener.CreateSessionAsync(tokenSource.Token);
                trace.Info("result: {0}", result);

                // Assert.
                Assert.Equal(CreateSessionResult.Success, result);
                _runnerServer
                    .Verify(x => x.CreateAgentSessionAsync(
                        _settings.PoolId,
                        It.Is<TaskAgentSession>(y => y != null),
                        tokenSource.Token), Times.Once());
_brokerServer
.Verify(x => x.CreateSessionAsync(
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token), Times.Never());
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void CreatesSessionWithBrokerMigration()
{
using (TestHostContext tc = CreateTestContext())
using (var tokenSource = new CancellationTokenSource())
{
Tracing trace = tc.GetTrace();
// Arrange.
var expectedSession = new TaskAgentSession()
{
OwnerName = "legacy",
BrokerMigrationMessage = new BrokerMigrationMessage(new Uri("https://broker.actions.github.com"))
};
var expectedBrokerSession = new TaskAgentSession()
{
OwnerName = "broker"
};
_runnerServer
.Setup(x => x.CreateAgentSessionAsync(
_settings.PoolId,
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token))
.Returns(Task.FromResult(expectedSession));
_brokerServer
.Setup(x => x.CreateSessionAsync(
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token))
.Returns(Task.FromResult(expectedBrokerSession));
_credMgr.Setup(x => x.LoadCredentials()).Returns(new VssCredentials());
_store.Setup(x => x.GetCredentials()).Returns(new CredentialData() { Scheme = Constants.Configuration.OAuthAccessToken });
_store.Setup(x => x.GetMigratedCredentials()).Returns(default(CredentialData));
// Act.
MessageListener listener = new();
listener.Initialize(tc);
CreateSessionResult result = await listener.CreateSessionAsync(tokenSource.Token);
trace.Info("result: {0}", result);
// Assert.
Assert.Equal(CreateSessionResult.Success, result);
_runnerServer
.Verify(x => x.CreateAgentSessionAsync(
_settings.PoolId,
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token), Times.Once());
_brokerServer
.Verify(x => x.CreateSessionAsync(
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token), Times.Once());
            }
        }
@@ -115,8 +185,8 @@ namespace GitHub.Runner.Common.Tests.Listener
                MessageListener listener = new();
                listener.Initialize(tc);

                CreateSessionResult result = await listener.CreateSessionAsync(tokenSource.Token);
                Assert.Equal(CreateSessionResult.Success, result);

                _runnerServer
                    .Setup(x => x.DeleteAgentSessionAsync(
@@ -131,6 +201,83 @@ namespace GitHub.Runner.Common.Tests.Listener
            }
        }
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void DeleteSessionWithBrokerMigration()
{
using (TestHostContext tc = CreateTestContext())
using (var tokenSource = new CancellationTokenSource())
{
Tracing trace = tc.GetTrace();
// Arrange.
var expectedSession = new TaskAgentSession()
{
OwnerName = "legacy",
BrokerMigrationMessage = new BrokerMigrationMessage(new Uri("https://broker.actions.github.com"))
};
var expectedBrokerSession = new TaskAgentSession()
{
SessionId = Guid.NewGuid(),
OwnerName = "broker"
};
_runnerServer
.Setup(x => x.CreateAgentSessionAsync(
_settings.PoolId,
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token))
.Returns(Task.FromResult(expectedSession));
_brokerServer
.Setup(x => x.CreateSessionAsync(
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token))
.Returns(Task.FromResult(expectedBrokerSession));
_credMgr.Setup(x => x.LoadCredentials()).Returns(new VssCredentials());
_store.Setup(x => x.GetCredentials()).Returns(new CredentialData() { Scheme = Constants.Configuration.OAuthAccessToken });
_store.Setup(x => x.GetMigratedCredentials()).Returns(default(CredentialData));
// Act.
MessageListener listener = new();
listener.Initialize(tc);
CreateSessionResult result = await listener.CreateSessionAsync(tokenSource.Token);
trace.Info("result: {0}", result);
Assert.Equal(CreateSessionResult.Success, result);
_runnerServer
.Verify(x => x.CreateAgentSessionAsync(
_settings.PoolId,
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token), Times.Once());
_brokerServer
.Verify(x => x.CreateSessionAsync(
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token), Times.Once());
_brokerServer
.Setup(x => x.DeleteSessionAsync(It.IsAny<CancellationToken>()))
.Returns(Task.CompletedTask);
// Act.
await listener.DeleteSessionAsync();
//Assert
_runnerServer
.Verify(x => x.DeleteAgentSessionAsync(
_settings.PoolId, expectedBrokerSession.SessionId, It.IsAny<CancellationToken>()), Times.Once());
_brokerServer
.Verify(x => x.DeleteSessionAsync(It.IsAny<CancellationToken>()), Times.Once());
}
}
        [Fact]
        [Trait("Level", "L0")]
        [Trait("Category", "Runner")]
@@ -162,8 +309,8 @@ namespace GitHub.Runner.Common.Tests.Listener
                MessageListener listener = new();
                listener.Initialize(tc);

                CreateSessionResult result = await listener.CreateSessionAsync(tokenSource.Token);
                Assert.Equal(CreateSessionResult.Success, result);

                var arMessages = new TaskAgentMessage[]
                {
@@ -212,6 +359,112 @@ namespace GitHub.Runner.Common.Tests.Listener
            }
        }
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void GetNextMessageWithBrokerMigration()
{
using (TestHostContext tc = CreateTestContext())
using (var tokenSource = new CancellationTokenSource())
{
Tracing trace = tc.GetTrace();
// Arrange.
var expectedSession = new TaskAgentSession();
PropertyInfo sessionIdProperty = expectedSession.GetType().GetProperty("SessionId", BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.Public);
Assert.NotNull(sessionIdProperty);
sessionIdProperty.SetValue(expectedSession, Guid.NewGuid());
_runnerServer
.Setup(x => x.CreateAgentSessionAsync(
_settings.PoolId,
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token))
.Returns(Task.FromResult(expectedSession));
_credMgr.Setup(x => x.LoadCredentials()).Returns(new VssCredentials());
_store.Setup(x => x.GetCredentials()).Returns(new CredentialData() { Scheme = Constants.Configuration.OAuthAccessToken });
_store.Setup(x => x.GetMigratedCredentials()).Returns(default(CredentialData));
// Act.
MessageListener listener = new();
listener.Initialize(tc);
CreateSessionResult result = await listener.CreateSessionAsync(tokenSource.Token);
Assert.Equal(CreateSessionResult.Success, result);
var brokerMigrationMesage = new BrokerMigrationMessage(new Uri("https://actions.broker.com"));
var arMessages = new TaskAgentMessage[]
{
new TaskAgentMessage
{
Body = JsonUtility.ToString(brokerMigrationMesage),
MessageType = BrokerMigrationMessage.MessageType
},
};
var brokerMessages = new TaskAgentMessage[]
{
new TaskAgentMessage
{
Body = "somebody1",
MessageId = 4234,
MessageType = JobRequestMessageTypes.PipelineAgentJobRequest
},
new TaskAgentMessage
{
Body = "somebody2",
MessageId = 4235,
MessageType = JobCancelMessage.MessageType
},
null, //should be skipped by GetNextMessageAsync implementation
null,
new TaskAgentMessage
{
Body = "somebody3",
MessageId = 4236,
MessageType = JobRequestMessageTypes.PipelineAgentJobRequest
}
};
var brokerMessageQueue = new Queue<TaskAgentMessage>(brokerMessages);
_runnerServer
.Setup(x => x.GetAgentMessageAsync(
_settings.PoolId, expectedSession.SessionId, It.IsAny<long?>(), TaskAgentStatus.Online, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<bool>(), It.IsAny<CancellationToken>()))
.Returns(async (Int32 poolId, Guid sessionId, Int64? lastMessageId, TaskAgentStatus status, string runnerVersion, string os, string architecture, bool disableUpdate, CancellationToken cancellationToken) =>
{
await Task.Yield();
return arMessages[0]; // always send migration message
});
_brokerServer
.Setup(x => x.GetRunnerMessageAsync(
expectedSession.SessionId, TaskAgentStatus.Online, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<bool>(), It.IsAny<CancellationToken>()))
.Returns(async (Guid sessionId, TaskAgentStatus status, string runnerVersion, string os, string architecture, bool disableUpdate, CancellationToken cancellationToken) =>
{
await Task.Yield();
return brokerMessageQueue.Dequeue();
});
TaskAgentMessage message1 = await listener.GetNextMessageAsync(tokenSource.Token);
TaskAgentMessage message2 = await listener.GetNextMessageAsync(tokenSource.Token);
TaskAgentMessage message3 = await listener.GetNextMessageAsync(tokenSource.Token);
Assert.Equal(brokerMessages[0], message1);
Assert.Equal(brokerMessages[1], message2);
Assert.Equal(brokerMessages[4], message3);
//Assert
_runnerServer
.Verify(x => x.GetAgentMessageAsync(
_settings.PoolId, expectedSession.SessionId, It.IsAny<long?>(), TaskAgentStatus.Online, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<bool>(), It.IsAny<CancellationToken>()), Times.Exactly(brokerMessages.Length));
_brokerServer
.Verify(x => x.GetRunnerMessageAsync(
expectedSession.SessionId, TaskAgentStatus.Online, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<bool>(), It.IsAny<CancellationToken>()), Times.Exactly(brokerMessages.Length));
}
}
        [Fact]
        [Trait("Level", "L0")]
        [Trait("Category", "Runner")]
@@ -244,11 +497,11 @@ namespace GitHub.Runner.Common.Tests.Listener
                MessageListener listener = new();
                listener.Initialize(tc);

                CreateSessionResult result = await listener.CreateSessionAsync(tokenSource.Token);
                trace.Info("result: {0}", result);

                // Assert.
                Assert.Equal(CreateSessionResult.Success, result);
                _runnerServer
                    .Verify(x => x.CreateAgentSessionAsync(
                        _settings.PoolId,
@@ -288,8 +541,8 @@ namespace GitHub.Runner.Common.Tests.Listener
                MessageListener listener = new();
                listener.Initialize(tc);

                CreateSessionResult result = await listener.CreateSessionAsync(tokenSource.Token);
                Assert.Equal(CreateSessionResult.Success, result);

                _runnerServer
                    .Setup(x => x.GetAgentMessageAsync(

View File

@@ -42,7 +42,7 @@ namespace GitHub.Runner.Common.Tests.Listener
            TaskOrchestrationPlanReference plan = new();
            TimelineReference timeline = null;
            Guid jobId = Guid.NewGuid();
            return new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, "test", "test", null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null, null);
        }

        private JobCancelMessage CreateJobCancelMessage()
@@ -88,7 +88,7 @@ namespace GitHub.Runner.Common.Tests.Listener
            _configurationManager.Setup(x => x.IsConfigured())
                .Returns(true);
            _messageListener.Setup(x => x.CreateSessionAsync(It.IsAny<CancellationToken>()))
                .Returns(Task.FromResult<CreateSessionResult>(CreateSessionResult.Success));
            _messageListener.Setup(x => x.GetNextMessageAsync(It.IsAny<CancellationToken>()))
                .Returns(async () =>
                {
@@ -184,7 +184,7 @@ namespace GitHub.Runner.Common.Tests.Listener
            _configStore.Setup(x => x.IsServiceConfigured()).Returns(configureAsService);
            _messageListener.Setup(x => x.CreateSessionAsync(It.IsAny<CancellationToken>()))
                .Returns(Task.FromResult<CreateSessionResult>(CreateSessionResult.Failure));

            var runner = new Runner.Listener.Runner();
            runner.Initialize(hc);
@@ -217,7 +217,7 @@ namespace GitHub.Runner.Common.Tests.Listener
                .Returns(false);
            _messageListener.Setup(x => x.CreateSessionAsync(It.IsAny<CancellationToken>()))
                .Returns(Task.FromResult<CreateSessionResult>(CreateSessionResult.Failure));

            var runner = new Runner.Listener.Runner();
            runner.Initialize(hc);
@@ -263,7 +263,7 @@ namespace GitHub.Runner.Common.Tests.Listener
            _configurationManager.Setup(x => x.IsConfigured())
                .Returns(true);
            _messageListener.Setup(x => x.CreateSessionAsync(It.IsAny<CancellationToken>()))
                .Returns(Task.FromResult<CreateSessionResult>(CreateSessionResult.Success));
            _messageListener.Setup(x => x.GetNextMessageAsync(It.IsAny<CancellationToken>()))
                .Returns(async () =>
                {
@@ -363,7 +363,7 @@ namespace GitHub.Runner.Common.Tests.Listener
            _configurationManager.Setup(x => x.IsConfigured())
                .Returns(true);
            _messageListener.Setup(x => x.CreateSessionAsync(It.IsAny<CancellationToken>()))
                .Returns(Task.FromResult<CreateSessionResult>(CreateSessionResult.Success));
            _messageListener.Setup(x => x.GetNextMessageAsync(It.IsAny<CancellationToken>()))
                .Returns(async () =>
                {
@@ -458,7 +458,7 @@ namespace GitHub.Runner.Common.Tests.Listener
            _configurationManager.Setup(x => x.IsConfigured())
                .Returns(true);
            _messageListener.Setup(x => x.CreateSessionAsync(It.IsAny<CancellationToken>()))
                .Returns(Task.FromResult<CreateSessionResult>(CreateSessionResult.Success));
            _messageListener.Setup(x => x.GetNextMessageAsync(It.IsAny<CancellationToken>()))
                .Returns(async () =>
                {

View File

@@ -23,7 +23,6 @@ namespace GitHub.Runner.Common.Tests.Listener
        private Mock<IConfigurationStore> _configStore;
        private Mock<IJobDispatcher> _jobDispatcher;
        private AgentRefreshMessage _refreshMessage = new(1, "2.999.0");
        private List<TrimmedPackageMetadata> _trimmedPackages = new();

#if !OS_WINDOWS
        private string _packageUrl = null;
@@ -71,12 +70,6 @@ namespace GitHub.Runner.Common.Tests.Listener
            }
        }
using (var client = new HttpClient())
{
var json = await client.GetStringAsync($"https://github.com/actions/runner/releases/download/v{latestVersion}/actions-runner-{BuildConstants.RunnerPackage.PackageName}-{latestVersion}-trimmedpackages.json");
_trimmedPackages = StringUtil.ConvertFromJson<List<TrimmedPackageMetadata>>(json);
}
            _runnerServer.Setup(x => x.GetPackageAsync("agent", BuildConstants.RunnerPackage.PackageName, "2.999.0", true, It.IsAny<CancellationToken>()))
                .Returns(Task.FromResult(new PackageMetadata() { Platform = BuildConstants.RunnerPackage.PackageName, Version = new PackageVersion("2.999.0"), DownloadUrl = _packageUrl }));
@@ -91,12 +84,10 @@ namespace GitHub.Runner.Common.Tests.Listener
        {
            await FetchLatestRunner();
            Assert.NotNull(_packageUrl);
            Assert.NotNull(_trimmedPackages);
            Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
            using (var hc = new TestHostContext(this))
            {
                hc.GetTrace().Info(_packageUrl);
                hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));

                //Arrange
                var updater = new Runner.Listener.SelfUpdater();
@@ -152,12 +143,10 @@ namespace GitHub.Runner.Common.Tests.Listener
        {
            await FetchLatestRunner();
            Assert.NotNull(_packageUrl);
            Assert.NotNull(_trimmedPackages);
            Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
            using (var hc = new TestHostContext(this))
            {
                hc.GetTrace().Info(_packageUrl);
                hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));

                //Arrange
                var updater = new Runner.Listener.SelfUpdater();
@@ -205,12 +194,10 @@ namespace GitHub.Runner.Common.Tests.Listener
        {
            await FetchLatestRunner();
            Assert.NotNull(_packageUrl);
            Assert.NotNull(_trimmedPackages);
            Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
            using (var hc = new TestHostContext(this))
            {
                hc.GetTrace().Info(_packageUrl);
                hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));

                //Arrange
                var updater = new Runner.Listener.SelfUpdater();
@@ -260,12 +247,10 @@ namespace GitHub.Runner.Common.Tests.Listener
        {
            await FetchLatestRunner();
            Assert.NotNull(_packageUrl);
            Assert.NotNull(_trimmedPackages);
            Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
            using (var hc = new TestHostContext(this))
            {
                hc.GetTrace().Info(_packageUrl);
                hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));

                //Arrange
                var updater = new Runner.Listener.SelfUpdater();
@@ -305,495 +290,6 @@ namespace GitHub.Runner.Common.Tests.Listener
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", null); Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", null);
} }
} }
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void TestSelfUpdateAsync_CloneHash_RuntimeAndExternals()
{
try
{
await FetchLatestRunner();
Assert.NotNull(_packageUrl);
Assert.NotNull(_trimmedPackages);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this))
{
hc.GetTrace().Info(_packageUrl);
hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));
//Arrange
var updater = new Runner.Listener.SelfUpdater();
hc.SetSingleton<ITerminal>(_term.Object);
hc.SetSingleton<IRunnerServer>(_runnerServer.Object);
hc.SetSingleton<IConfigurationStore>(_configStore.Object);
hc.SetSingleton<IHttpClientHandlerFactory>(new HttpClientHandlerFactory());
var p1 = new ProcessInvokerWrapper();
p1.Initialize(hc);
var p2 = new ProcessInvokerWrapper();
p2.Initialize(hc);
var p3 = new ProcessInvokerWrapper();
p3.Initialize(hc);
hc.EnqueueInstance<IProcessInvoker>(p1);
hc.EnqueueInstance<IProcessInvoker>(p2);
hc.EnqueueInstance<IProcessInvoker>(p3);
updater.Initialize(hc);
_runnerServer.Setup(x => x.GetPackageAsync("agent", BuildConstants.RunnerPackage.PackageName, "2.999.0", true, It.IsAny<CancellationToken>()))
.Returns(Task.FromResult(new PackageMetadata() { Platform = BuildConstants.RunnerPackage.PackageName, Version = new PackageVersion("2.999.0"), DownloadUrl = _packageUrl, TrimmedPackages = new List<TrimmedPackageMetadata>() { new TrimmedPackageMetadata() } }));
_runnerServer.Setup(x => x.UpdateAgentUpdateStateAsync(1, 1, It.IsAny<string>(), It.IsAny<string>()))
.Callback((int p, ulong a, string s, string t) =>
{
hc.GetTrace().Info(t);
})
.Returns(Task.FromResult(new TaskAgent()));
try
{
var result = await updater.SelfUpdate(_refreshMessage, _jobDispatcher.Object, true, hc.RunnerShutdownToken);
Assert.True(result);
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0")));
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0")));
FieldInfo contentHashesProperty = updater.GetType().GetField("_contentHashes", BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.Public);
Assert.NotNull(contentHashesProperty);
Dictionary<string, string> contentHashes = (Dictionary<string, string>)contentHashesProperty.GetValue(updater);
hc.GetTrace().Info(StringUtil.ConvertToJson(contentHashes));
var dotnetRuntimeHashFile = Path.Combine(TestUtil.GetSrcPath(), $"Misc/contentHash/dotnetRuntime/{BuildConstants.RunnerPackage.PackageName}");
var externalsHashFile = Path.Combine(TestUtil.GetSrcPath(), $"Misc/contentHash/externals/{BuildConstants.RunnerPackage.PackageName}");
Assert.Equal(File.ReadAllText(dotnetRuntimeHashFile).Trim(), contentHashes["dotnetRuntime"]);
Assert.Equal(File.ReadAllText(externalsHashFile).Trim(), contentHashes["externals"]);
}
finally
{
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0"), CancellationToken.None);
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0"), CancellationToken.None);
}
}
}
finally
{
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", null);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void TestSelfUpdateAsync_Cancel_CloneHashTask_WhenNotNeeded()
{
try
{
await FetchLatestRunner();
Assert.NotNull(_packageUrl);
Assert.NotNull(_trimmedPackages);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this))
{
hc.GetTrace().Info(_packageUrl);
hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));
//Arrange
var updater = new Runner.Listener.SelfUpdater();
hc.SetSingleton<ITerminal>(_term.Object);
hc.SetSingleton<IRunnerServer>(_runnerServer.Object);
hc.SetSingleton<IConfigurationStore>(_configStore.Object);
hc.SetSingleton<IHttpClientHandlerFactory>(new Mock<IHttpClientHandlerFactory>().Object);
var p1 = new ProcessInvokerWrapper();
p1.Initialize(hc);
var p2 = new ProcessInvokerWrapper();
p2.Initialize(hc);
var p3 = new ProcessInvokerWrapper();
p3.Initialize(hc);
hc.EnqueueInstance<IProcessInvoker>(p1);
hc.EnqueueInstance<IProcessInvoker>(p2);
hc.EnqueueInstance<IProcessInvoker>(p3);
updater.Initialize(hc);
_runnerServer.Setup(x => x.UpdateAgentUpdateStateAsync(1, 1, It.IsAny<string>(), It.IsAny<string>()))
.Callback((int p, ulong a, string s, string t) =>
{
hc.GetTrace().Info(t);
})
.Returns(Task.FromResult(new TaskAgent()));
try
{
var result = await updater.SelfUpdate(_refreshMessage, _jobDispatcher.Object, true, hc.RunnerShutdownToken);
FieldInfo contentHashesProperty = updater.GetType().GetField("_contentHashes", BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.Public);
Assert.NotNull(contentHashesProperty);
Dictionary<string, string> contentHashes = (Dictionary<string, string>)contentHashesProperty.GetValue(updater);
hc.GetTrace().Info(StringUtil.ConvertToJson(contentHashes));
Assert.NotEqual(2, contentHashes.Count);
}
catch (Exception ex)
{
hc.GetTrace().Error(ex);
}
}
}
finally
{
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", null);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void TestSelfUpdateAsync_UseExternalsTrimmedPackage()
{
try
{
await FetchLatestRunner();
Assert.NotNull(_packageUrl);
Assert.NotNull(_trimmedPackages);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this))
{
hc.GetTrace().Info(_packageUrl);
hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));
//Arrange
var updater = new Runner.Listener.SelfUpdater();
hc.SetSingleton<ITerminal>(_term.Object);
hc.SetSingleton<IRunnerServer>(_runnerServer.Object);
hc.SetSingleton<IConfigurationStore>(_configStore.Object);
hc.SetSingleton<IHttpClientHandlerFactory>(new HttpClientHandlerFactory());
var p1 = new ProcessInvokerWrapper(); // hashfiles
p1.Initialize(hc);
var p2 = new ProcessInvokerWrapper(); // hashfiles
p2.Initialize(hc);
var p3 = new ProcessInvokerWrapper(); // un-tar
p3.Initialize(hc);
var p4 = new ProcessInvokerWrapper(); // node -v
p4.Initialize(hc);
var p5 = new ProcessInvokerWrapper(); // node -v
p5.Initialize(hc);
hc.EnqueueInstance<IProcessInvoker>(p1);
hc.EnqueueInstance<IProcessInvoker>(p2);
hc.EnqueueInstance<IProcessInvoker>(p3);
hc.EnqueueInstance<IProcessInvoker>(p4);
hc.EnqueueInstance<IProcessInvoker>(p5);
updater.Initialize(hc);
var trim = _trimmedPackages.Where(x => !x.TrimmedContents.ContainsKey("dotnetRuntime")).ToList();
_runnerServer.Setup(x => x.GetPackageAsync("agent", BuildConstants.RunnerPackage.PackageName, "2.999.0", true, It.IsAny<CancellationToken>()))
.Returns(Task.FromResult(new PackageMetadata() { Platform = BuildConstants.RunnerPackage.PackageName, Version = new PackageVersion("2.999.0"), DownloadUrl = _packageUrl, TrimmedPackages = trim }));
_runnerServer.Setup(x => x.UpdateAgentUpdateStateAsync(1, 1, It.IsAny<string>(), It.IsAny<string>()))
.Callback((int p, ulong a, string s, string t) =>
{
hc.GetTrace().Info(t);
})
.Returns(Task.FromResult(new TaskAgent()));
try
{
var result = await updater.SelfUpdate(_refreshMessage, _jobDispatcher.Object, true, hc.RunnerShutdownToken);
Assert.True(result);
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0")));
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0")));
}
finally
{
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0"), CancellationToken.None);
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0"), CancellationToken.None);
}
var traceFile = Path.GetTempFileName();
File.Copy(hc.TraceFileName, traceFile, true);
var externalsHashFile = Path.Combine(TestUtil.GetSrcPath(), $"Misc/contentHash/externals/{BuildConstants.RunnerPackage.PackageName}");
var externalsHash = await File.ReadAllTextAsync(externalsHashFile);
if (externalsHash == trim[0].TrimmedContents["externals"])
{
Assert.Contains("Use trimmed (externals) package", File.ReadAllText(traceFile));
}
else
{
Assert.Contains("the current runner does not carry those trimmed content (Hash mismatch)", File.ReadAllText(traceFile));
}
}
}
finally
{
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", null);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void TestSelfUpdateAsync_UseExternalsRuntimeTrimmedPackage()
{
try
{
await FetchLatestRunner();
Assert.NotNull(_packageUrl);
Assert.NotNull(_trimmedPackages);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this))
{
hc.GetTrace().Info(_packageUrl);
hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));
//Arrange
var updater = new Runner.Listener.SelfUpdater();
hc.SetSingleton<ITerminal>(_term.Object);
hc.SetSingleton<IRunnerServer>(_runnerServer.Object);
hc.SetSingleton<IConfigurationStore>(_configStore.Object);
hc.SetSingleton<IHttpClientHandlerFactory>(new HttpClientHandlerFactory());
var p1 = new ProcessInvokerWrapper(); // hashfiles
p1.Initialize(hc);
var p2 = new ProcessInvokerWrapper(); // hashfiles
p2.Initialize(hc);
var p3 = new ProcessInvokerWrapper(); // un-tar
p3.Initialize(hc);
var p4 = new ProcessInvokerWrapper(); // node -v
p4.Initialize(hc);
var p5 = new ProcessInvokerWrapper(); // node -v
p5.Initialize(hc);
var p6 = new ProcessInvokerWrapper(); // runner -v
p6.Initialize(hc);
hc.EnqueueInstance<IProcessInvoker>(p1);
hc.EnqueueInstance<IProcessInvoker>(p2);
hc.EnqueueInstance<IProcessInvoker>(p3);
hc.EnqueueInstance<IProcessInvoker>(p4);
hc.EnqueueInstance<IProcessInvoker>(p5);
hc.EnqueueInstance<IProcessInvoker>(p6);
updater.Initialize(hc);
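// Select trimmed packages whose trimmed contents include both the dotnet runtime and the externals.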
var trim = _trimmedPackages.Where(x => x.TrimmedContents.ContainsKey("dotnetRuntime") && x.TrimmedContents.ContainsKey("externals")).ToList();
_runnerServer.Setup(x => x.GetPackageAsync("agent", BuildConstants.RunnerPackage.PackageName, "2.999.0", true, It.IsAny<CancellationToken>()))
.Returns(Task.FromResult(new PackageMetadata() { Platform = BuildConstants.RunnerPackage.PackageName, Version = new PackageVersion("2.999.0"), DownloadUrl = _packageUrl, TrimmedPackages = trim }));
_runnerServer.Setup(x => x.UpdateAgentUpdateStateAsync(1, 1, It.IsAny<string>(), It.IsAny<string>()))
.Callback((int p, ulong a, string s, string t) =>
{
hc.GetTrace().Info(t);
})
.Returns(Task.FromResult(new TaskAgent()));
try
{
var result = await updater.SelfUpdate(_refreshMessage, _jobDispatcher.Object, true, hc.RunnerShutdownToken);
Assert.True(result);
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0")));
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0")));
}
finally
{
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0"), CancellationToken.None);
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0"), CancellationToken.None);
}
var traceFile = Path.GetTempFileName();
File.Copy(hc.TraceFileName, traceFile, true);
var externalsHashFile = Path.Combine(TestUtil.GetSrcPath(), $"Misc/contentHash/externals/{BuildConstants.RunnerPackage.PackageName}");
var externalsHash = await File.ReadAllTextAsync(externalsHashFile);
var runtimeHashFile = Path.Combine(TestUtil.GetSrcPath(), $"Misc/contentHash/dotnetRuntime/{BuildConstants.RunnerPackage.PackageName}");
var runtimeHash = await File.ReadAllTextAsync(runtimeHashFile);
if (externalsHash == trim[0].TrimmedContents["externals"] &&
runtimeHash == trim[0].TrimmedContents["dotnetRuntime"])
{
Assert.Contains("Use trimmed (runtime+externals) package", File.ReadAllText(traceFile));
}
else
{
Assert.Contains("the current runner does not carry those trimmed content (Hash mismatch)", File.ReadAllText(traceFile));
}
}
}
finally
{
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", null);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void TestSelfUpdateAsync_NotUseExternalsRuntimeTrimmedPackageOnHashMismatch()
{
try
{
await FetchLatestRunner();
Assert.NotNull(_packageUrl);
Assert.NotNull(_trimmedPackages);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this))
{
hc.GetTrace().Info(_packageUrl);
hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));
//Arrange
var updater = new Runner.Listener.SelfUpdater();
hc.SetSingleton<ITerminal>(_term.Object);
hc.SetSingleton<IRunnerServer>(_runnerServer.Object);
hc.SetSingleton<IConfigurationStore>(_configStore.Object);
hc.SetSingleton<IHttpClientHandlerFactory>(new HttpClientHandlerFactory());
var p1 = new ProcessInvokerWrapper(); // hashfiles
p1.Initialize(hc);
var p2 = new ProcessInvokerWrapper(); // hashfiles
p2.Initialize(hc);
var p3 = new ProcessInvokerWrapper(); // un-tar
p3.Initialize(hc);
var p4 = new ProcessInvokerWrapper(); // node -v
p4.Initialize(hc);
var p5 = new ProcessInvokerWrapper(); // node -v
p5.Initialize(hc);
var p6 = new ProcessInvokerWrapper(); // runner -v
p6.Initialize(hc);
hc.EnqueueInstance<IProcessInvoker>(p1);
hc.EnqueueInstance<IProcessInvoker>(p2);
hc.EnqueueInstance<IProcessInvoker>(p3);
hc.EnqueueInstance<IProcessInvoker>(p4);
hc.EnqueueInstance<IProcessInvoker>(p5);
hc.EnqueueInstance<IProcessInvoker>(p6);
updater.Initialize(hc);
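// Overwrite every trimmed-content hash so the updater detects a mismatch with the local layout and does not use the trimmed package.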
var trim = _trimmedPackages.ToList();
foreach (var package in trim)
{
foreach (var hash in package.TrimmedContents.Keys)
{
package.TrimmedContents[hash] = "mismatch";
}
}
_runnerServer.Setup(x => x.GetPackageAsync("agent", BuildConstants.RunnerPackage.PackageName, "2.999.0", true, It.IsAny<CancellationToken>()))
.Returns(Task.FromResult(new PackageMetadata() { Platform = BuildConstants.RunnerPackage.PackageName, Version = new PackageVersion("2.999.0"), DownloadUrl = _packageUrl, TrimmedPackages = trim }));
_runnerServer.Setup(x => x.UpdateAgentUpdateStateAsync(1, 1, It.IsAny<string>(), It.IsAny<string>()))
.Callback((int p, ulong a, string s, string t) =>
{
hc.GetTrace().Info(t);
})
.Returns(Task.FromResult(new TaskAgent()));
try
{
var result = await updater.SelfUpdate(_refreshMessage, _jobDispatcher.Object, true, hc.RunnerShutdownToken);
Assert.True(result);
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0")));
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0")));
}
finally
{
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0"), CancellationToken.None);
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0"), CancellationToken.None);
}
var traceFile = Path.GetTempFileName();
File.Copy(hc.TraceFileName, traceFile, true);
Assert.Contains("the current runner does not carry those trimmed content (Hash mismatch)", File.ReadAllText(traceFile));
}
}
finally
{
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", null);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void TestSelfUpdateAsync_FallbackToFullPackage()
{
try
{
await FetchLatestRunner();
Assert.NotNull(_packageUrl);
Assert.NotNull(_trimmedPackages);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this))
{
hc.GetTrace().Info(_packageUrl);
hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));
//Arrange
var updater = new Runner.Listener.SelfUpdater();
hc.SetSingleton<ITerminal>(_term.Object);
hc.SetSingleton<IRunnerServer>(_runnerServer.Object);
hc.SetSingleton<IConfigurationStore>(_configStore.Object);
hc.SetSingleton<IHttpClientHandlerFactory>(new HttpClientHandlerFactory());
var p1 = new ProcessInvokerWrapper(); // hashfiles
p1.Initialize(hc);
var p2 = new ProcessInvokerWrapper(); // hashfiles
p2.Initialize(hc);
var p3 = new ProcessInvokerWrapper(); // un-tar trim
p3.Initialize(hc);
var p4 = new ProcessInvokerWrapper(); // un-tar full
p4.Initialize(hc);
hc.EnqueueInstance<IProcessInvoker>(p1);
hc.EnqueueInstance<IProcessInvoker>(p2);
hc.EnqueueInstance<IProcessInvoker>(p3);
hc.EnqueueInstance<IProcessInvoker>(p4);
updater.Initialize(hc);
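// Corrupt the archive hash of every trimmed package so validation fails and the updater falls back to the full package.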
var trim = _trimmedPackages.ToList();
foreach (var package in trim)
{
package.HashValue = "mismatch";
}
_runnerServer.Setup(x => x.GetPackageAsync("agent", BuildConstants.RunnerPackage.PackageName, "2.999.0", true, It.IsAny<CancellationToken>()))
.Returns(Task.FromResult(new PackageMetadata() { Platform = BuildConstants.RunnerPackage.PackageName, Version = new PackageVersion("2.999.0"), DownloadUrl = _packageUrl, TrimmedPackages = trim }));
_runnerServer.Setup(x => x.UpdateAgentUpdateStateAsync(1, 1, It.IsAny<string>(), It.IsAny<string>()))
.Callback((int p, ulong a, string s, string t) =>
{
hc.GetTrace().Info(t);
})
.Returns(Task.FromResult(new TaskAgent()));
try
{
var result = await updater.SelfUpdate(_refreshMessage, _jobDispatcher.Object, true, hc.RunnerShutdownToken);
Assert.True(result);
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0")));
Assert.True(Directory.Exists(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0")));
}
finally
{
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "bin.2.999.0"), CancellationToken.None);
IOUtil.DeleteDirectory(Path.Combine(hc.GetDirectory(WellKnownDirectory.Root), "externals.2.999.0"), CancellationToken.None);
}
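// The fallback assertion only applies when the trimmed package was actually selected; otherwise the test logs a warning and is effectively skipped.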
var traceFile = Path.GetTempFileName();
File.Copy(hc.TraceFileName, traceFile, true);
if (File.ReadAllText(traceFile).Contains("Use trimmed (runtime+externals) package"))
{
Assert.Contains("Something wrong with the trimmed runner package, failback to use the full package for runner updates", File.ReadAllText(traceFile));
}
else
{
hc.GetTrace().Warning("Skipping the 'TestSelfUpdateAsync_FallbackToFullPackage' test, as the `externals` or `runtime` hashes have been updated");
}
}
}
finally
{
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", null);
}
}
}
}
#endif

View File

@@ -23,7 +23,6 @@ namespace GitHub.Runner.Common.Tests.Listener
private Mock<IConfigurationStore> _configStore;
private Mock<IJobDispatcher> _jobDispatcher;
private AgentRefreshMessage _refreshMessage = new(1, "2.999.0");
private List<TrimmedPackageMetadata> _trimmedPackages = new();
#if !OS_WINDOWS
private string _packageUrl = null;
@@ -81,12 +80,10 @@ namespace GitHub.Runner.Common.Tests.Listener
{
await FetchLatestRunner();
Assert.NotNull(_packageUrl);
Assert.NotNull(_trimmedPackages);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this))
{
hc.GetTrace().Info(_packageUrl);
hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));
//Arrange
var updater = new Runner.Listener.SelfUpdaterV2();
@@ -143,12 +140,10 @@ namespace GitHub.Runner.Common.Tests.Listener
{
await FetchLatestRunner();
Assert.NotNull(_packageUrl);
Assert.NotNull(_trimmedPackages);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this))
{
hc.GetTrace().Info(_packageUrl);
hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));
//Arrange
var updater = new Runner.Listener.SelfUpdaterV2();
@@ -194,12 +189,10 @@ namespace GitHub.Runner.Common.Tests.Listener
{
await FetchLatestRunner();
Assert.NotNull(_packageUrl);
Assert.NotNull(_trimmedPackages);
Environment.SetEnvironmentVariable("RUNNER_L0_OVERRIDEBINDIR", Path.GetFullPath(Path.Combine(TestUtil.GetSrcPath(), "..", "_layout", "bin")));
using (var hc = new TestHostContext(this))
{
hc.GetTrace().Info(_packageUrl);
hc.GetTrace().Info(StringUtil.ConvertToJson(_trimmedPackages));
//Arrange
var updater = new Runner.Listener.SelfUpdaterV2();

View File

@@ -1,301 +0,0 @@
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using Xunit;
namespace GitHub.Runner.Common.Tests
{
public sealed class PackagesTrimL0
{
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public async Task RunnerLayoutParts_NewFilesCrossAll()
{
using (TestHostContext hc = new(this))
{
Tracing trace = hc.GetTrace();
var runnerCoreAssetsFile = Path.Combine(TestUtil.GetSrcPath(), @"Misc/runnercoreassets");
var runnerDotnetRuntimeFile = Path.Combine(TestUtil.GetSrcPath(), @"Misc/runnerdotnetruntimeassets");
string layoutBin = Path.Combine(TestUtil.GetSrcPath(), @"../_layout/bin");
var newFiles = new List<string>();
if (Directory.Exists(layoutBin))
{
var coreAssets = await File.ReadAllLinesAsync(runnerCoreAssetsFile);
var runtimeAssets = await File.ReadAllLinesAsync(runnerDotnetRuntimeFile);
foreach (var file in Directory.GetFiles(layoutBin, "*", SearchOption.AllDirectories))
{
if (!coreAssets.Any(x => file.Replace(Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar).EndsWith(x)) &&
!runtimeAssets.Any(x => file.Replace(Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar).EndsWith(x)))
{
newFiles.Add(file);
}
}
if (newFiles.Count > 0)
{
Assert.True(false, $"Found new files '{string.Join('\n', newFiles)}'. These will break runner update using trimmed packages.");
}
}
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public async Task RunnerLayoutParts_OverlapFiles()
{
using (TestHostContext hc = new(this))
{
Tracing trace = hc.GetTrace();
var runnerCoreAssetsFile = Path.Combine(TestUtil.GetSrcPath(), @"Misc/runnercoreassets");
var runnerDotnetRuntimeFile = Path.Combine(TestUtil.GetSrcPath(), @"Misc/runnerdotnetruntimeassets");
var coreAssets = await File.ReadAllLinesAsync(runnerCoreAssetsFile);
var runtimeAssets = await File.ReadAllLinesAsync(runnerDotnetRuntimeFile);
foreach (var line in coreAssets)
{
if (runtimeAssets.Contains(line, StringComparer.OrdinalIgnoreCase))
{
Assert.True(false, $"'Misc/runnercoreassets' and 'Misc/runnerdotnetruntimeassets' should not overlap.");
}
}
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public async Task RunnerLayoutParts_NewRunnerCoreAssets()
{
using (TestHostContext hc = new(this))
{
Tracing trace = hc.GetTrace();
var runnerCoreAssetsFile = Path.Combine(TestUtil.GetSrcPath(), @"Misc/runnercoreassets");
var coreAssets = await File.ReadAllLinesAsync(runnerCoreAssetsFile);
string layoutBin = Path.Combine(TestUtil.GetSrcPath(), @"../_layout/bin");
var newFiles = new List<string>();
if (Directory.Exists(layoutBin))
{
var binDirs = Directory.GetDirectories(TestUtil.GetSrcPath(), "net6.0", SearchOption.AllDirectories);
foreach (var binDir in binDirs)
{
if (binDir.Contains("Test") || binDir.Contains("obj"))
{
continue;
}
Directory.GetFiles(binDir, "*", SearchOption.TopDirectoryOnly).ToList().ForEach(x =>
{
if (!x.Contains("runtimeconfig.dev.json"))
{
if (!coreAssets.Any(y => x.Replace(Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar).EndsWith(y)))
{
newFiles.Add(x);
}
}
});
}
if (newFiles.Count > 0)
{
Assert.True(false, $"Found new files '{string.Join('\n', newFiles)}'. These will break runner update using trimmed packages. You might need to update `Misc/runnercoreassets`.");
}
}
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public async Task RunnerLayoutParts_NewDotnetRuntimeAssets()
{
using (TestHostContext hc = new(this))
{
Tracing trace = hc.GetTrace();
var runnerDotnetRuntimeFile = Path.Combine(TestUtil.GetSrcPath(), @"Misc/runnerdotnetruntimeassets");
var runtimeAssets = await File.ReadAllLinesAsync(runnerDotnetRuntimeFile);
string layoutTrimsRuntimeAssets = Path.Combine(TestUtil.GetSrcPath(), @"../_layout_trims/runnerdotnetruntimeassets");
var newFiles = new List<string>();
if (File.Exists(layoutTrimsRuntimeAssets))
{
var runtimeAssetsCurrent = await File.ReadAllLinesAsync(layoutTrimsRuntimeAssets);
foreach (var runtimeFile in runtimeAssetsCurrent)
{
if (runtimeAssets.Any(x => runtimeFile.EndsWith(x, StringComparison.OrdinalIgnoreCase)))
{
continue;
}
else
{
newFiles.Add(runtimeFile);
}
}
if (newFiles.Count > 0)
{
Assert.True(false, $"Found new dotnet runtime files '{string.Join('\n', newFiles)}'. These will break runner update using trimmed packages. You might need to update `Misc/runnerdotnetruntimeassets`.");
}
}
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public async Task RunnerLayoutParts_CheckDotnetRuntimeHash()
{
using (TestHostContext hc = new(this))
{
Tracing trace = hc.GetTrace();
var dotnetRuntimeHashFile = Path.Combine(TestUtil.GetSrcPath(), $"Misc/contentHash/dotnetRuntime/{BuildConstants.RunnerPackage.PackageName}");
trace.Info($"Current hash: {File.ReadAllText(dotnetRuntimeHashFile)}");
string layoutTrimsRuntimeAssets = Path.Combine(TestUtil.GetSrcPath(), @"../_layout_trims/runtime");
string binDir = Path.Combine(TestUtil.GetSrcPath(), @"../_layout/bin");
#if OS_WINDOWS
string node = Path.Combine(TestUtil.GetSrcPath(), @"..\_layout\externals\node16\bin\node");
#else
string node = Path.Combine(TestUtil.GetSrcPath(), @"../_layout/externals/node16/bin/node");
#endif
string hashFilesScript = Path.Combine(binDir, "hashFiles");
var hashResult = string.Empty;
var p1 = new ProcessInvokerWrapper();
p1.Initialize(hc);
p1.ErrorDataReceived += (_, data) =>
{
if (!string.IsNullOrEmpty(data.Data) && data.Data.StartsWith("__OUTPUT__") && data.Data.EndsWith("__OUTPUT__"))
{
hashResult = data.Data.Substring(10, data.Data.Length - 20);
trace.Info($"Hash result: '{hashResult}'");
}
else
{
trace.Info(data.Data);
}
};
p1.OutputDataReceived += (_, data) =>
{
trace.Info(data.Data);
};
var env = new Dictionary<string, string>
{
["patterns"] = "**"
};
int exitCode = await p1.ExecuteAsync(workingDirectory: layoutTrimsRuntimeAssets,
fileName: node,
arguments: $"\"{hashFilesScript.Replace("\"", "\\\"")}\"",
environment: env,
requireExitCodeZero: true,
outputEncoding: null,
killProcessOnCancel: true,
cancellationToken: CancellationToken.None);
Assert.True(string.Equals(hashResult, File.ReadAllText(dotnetRuntimeHashFile).Trim()), $"Hash mismatch for dotnet runtime. You might need to update `Misc/contentHash/dotnetRuntime/{BuildConstants.RunnerPackage.PackageName}` or check if `hashFiles.ts` ever changed recently.");
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public async Task RunnerLayoutParts_CheckExternalsHash()
{
using (TestHostContext hc = new(this))
{
Tracing trace = hc.GetTrace();
var externalsHashFile = Path.Combine(TestUtil.GetSrcPath(), $"Misc/contentHash/externals/{BuildConstants.RunnerPackage.PackageName}");
trace.Info($"Current hash: {File.ReadAllText(externalsHashFile)}");
string layoutTrimsExternalsAssets = Path.Combine(TestUtil.GetSrcPath(), @"../_layout_trims/externals");
string binDir = Path.Combine(TestUtil.GetSrcPath(), @"../_layout/bin");
#if OS_WINDOWS
string node = Path.Combine(TestUtil.GetSrcPath(), @"..\_layout\externals\node16\bin\node");
#else
string node = Path.Combine(TestUtil.GetSrcPath(), @"../_layout/externals/node16/bin/node");
#endif
string hashFilesScript = Path.Combine(binDir, "hashFiles");
var hashResult = string.Empty;
var p1 = new ProcessInvokerWrapper();
p1.Initialize(hc);
p1.ErrorDataReceived += (_, data) =>
{
if (!string.IsNullOrEmpty(data.Data) && data.Data.StartsWith("__OUTPUT__") && data.Data.EndsWith("__OUTPUT__"))
{
hashResult = data.Data.Substring(10, data.Data.Length - 20);
trace.Info($"Hash result: '{hashResult}'");
}
else
{
trace.Info(data.Data);
}
};
p1.OutputDataReceived += (_, data) =>
{
trace.Info(data.Data);
};
var env = new Dictionary<string, string>
{
["patterns"] = "**"
};
int exitCode = await p1.ExecuteAsync(workingDirectory: layoutTrimsExternalsAssets,
fileName: node,
arguments: $"\"{hashFilesScript.Replace("\"", "\\\"")}\"",
environment: env,
requireExitCodeZero: true,
outputEncoding: null,
killProcessOnCancel: true,
cancellationToken: CancellationToken.None);
Assert.True(string.Equals(hashResult, File.ReadAllText(externalsHashFile).Trim()), $"Hash mismatch for externals. You might need to update `Misc/contentHash/externals/{BuildConstants.RunnerPackage.PackageName}` or check if `hashFiles.ts` ever changed recently.");
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public Task RunnerLayoutParts_ContentHashFilesNoNewline()
{
using (TestHostContext hc = new(this))
{
Tracing trace = hc.GetTrace();
var dotnetRuntimeHashFile = Path.Combine(TestUtil.GetSrcPath(), $"Misc/contentHash/dotnetRuntime/{BuildConstants.RunnerPackage.PackageName}");
var dotnetRuntimeHash = File.ReadAllText(dotnetRuntimeHashFile);
trace.Info($"Current hash: {dotnetRuntimeHash}");
var externalsHashFile = Path.Combine(TestUtil.GetSrcPath(), $"Misc/contentHash/externals/{BuildConstants.RunnerPackage.PackageName}");
var externalsHash = File.ReadAllText(externalsHashFile);
trace.Info($"Current hash: {externalsHash}");
Assert.False(externalsHash.Any(x => char.IsWhiteSpace(x)), $"Found whitespace in externals hash file.");
Assert.False(dotnetRuntimeHash.Any(x => char.IsWhiteSpace(x)), $"Found whitespace in dotnet runtime hash file.");
return Task.CompletedTask;
}
}
}
}

View File

@@ -960,6 +960,33 @@ namespace GitHub.Runner.Common.Tests.Util
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public void ReplaceInvalidFileNameChars()
{
Assert.Equal(string.Empty, IOUtil.ReplaceInvalidFileNameChars(null));
Assert.Equal(string.Empty, IOUtil.ReplaceInvalidFileNameChars(string.Empty));
Assert.Equal("hello.txt", IOUtil.ReplaceInvalidFileNameChars("hello.txt"));
#if OS_WINDOWS
// Refer https://github.com/dotnet/runtime/blob/ce84f1d8a3f12711bad678a33efbc37b461f684f/src/libraries/System.Private.CoreLib/src/System/IO/Path.Windows.cs#L15
Assert.Equal(
"1_ 2_ 3_ 4_ 5_ 6_ 7_ 8_ 9_ 10_ 11_ 12_ 13_ 14_ 15_ 16_ 17_ 18_ 19_ 20_ 21_ 22_ 23_ 24_ 25_ 26_ 27_ 28_ 29_ 30_ 31_ 32_ 33_ 34_ 35_ 36_ 37_ 38_ 39_ 40_ 41_",
IOUtil.ReplaceInvalidFileNameChars($"1\" 2< 3> 4| 5\0 6{(char)1} 7{(char)2} 8{(char)3} 9{(char)4} 10{(char)5} 11{(char)6} 12{(char)7} 13{(char)8} 14{(char)9} 15{(char)10} 16{(char)11} 17{(char)12} 18{(char)13} 19{(char)14} 20{(char)15} 21{(char)16} 22{(char)17} 23{(char)18} 24{(char)19} 25{(char)20} 26{(char)21} 27{(char)22} 28{(char)23} 29{(char)24} 30{(char)25} 31{(char)26} 32{(char)27} 33{(char)28} 34{(char)29} 35{(char)30} 36{(char)31} 37: 38* 39? 40\\ 41/"));
#else
// Refer https://github.com/dotnet/runtime/blob/ce84f1d8a3f12711bad678a33efbc37b461f684f/src/libraries/System.Private.CoreLib/src/System/IO/Path.Unix.cs#L12
Assert.Equal("1_ 2_", IOUtil.ReplaceInvalidFileNameChars("1\0 2/"));
#endif
Assert.Equal("_leading", IOUtil.ReplaceInvalidFileNameChars("/leading"));
Assert.Equal("__consecutive leading", IOUtil.ReplaceInvalidFileNameChars("//consecutive leading"));
Assert.Equal("trailing_", IOUtil.ReplaceInvalidFileNameChars("trailing/"));
Assert.Equal("consecutive trailing__", IOUtil.ReplaceInvalidFileNameChars("consecutive trailing//"));
Assert.Equal("middle_middle", IOUtil.ReplaceInvalidFileNameChars("middle/middle"));
Assert.Equal("consecutive middle__consecutive middle", IOUtil.ReplaceInvalidFileNameChars("consecutive middle//consecutive middle"));
Assert.Equal("_leading_middle_trailing_", IOUtil.ReplaceInvalidFileNameChars("/leading/middle/trailing/"));
Assert.Equal("__consecutive leading__consecutive middle__consecutive trailing__", IOUtil.ReplaceInvalidFileNameChars("//consecutive leading//consecutive middle//consecutive trailing//"));
}
private static async Task CreateDirectoryReparsePoint(IHostContext context, string link, string target)
{
#if OS_WINDOWS

View File

@@ -212,210 +212,5 @@ namespace GitHub.Runner.Common.Tests.Util
File.Delete(brokenSymlink);
Environment.SetEnvironmentVariable(PathUtil.PathVariable, oldValue);
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public void UseWhich2FindGit()
{
using (TestHostContext hc = new(this))
{
//Arrange
Tracing trace = hc.GetTrace();
// Act.
string gitPath = WhichUtil.Which2("git", trace: trace);
trace.Info($"Which(\"git\") returns: {gitPath ?? string.Empty}");
// Assert.
Assert.True(!string.IsNullOrEmpty(gitPath) && File.Exists(gitPath), $"Unable to find Git through: {nameof(WhichUtil.Which)}");
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public void Which2ReturnsNullWhenNotFound()
{
using (TestHostContext hc = new(this))
{
//Arrange
Tracing trace = hc.GetTrace();
// Act.
string nosuch = WhichUtil.Which2("no-such-file-cf7e351f", trace: trace);
trace.Info($"result: {nosuch ?? string.Empty}");
// Assert.
Assert.True(string.IsNullOrEmpty(nosuch), "Path should not be resolved");
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public void Which2ThrowsWhenRequireAndNotFound()
{
using (TestHostContext hc = new(this))
{
//Arrange
Tracing trace = hc.GetTrace();
// Act.
try
{
WhichUtil.Which2("no-such-file-cf7e351f", require: true, trace: trace);
throw new Exception("which should have thrown");
}
catch (FileNotFoundException ex)
{
Assert.Equal("no-such-file-cf7e351f", ex.FileName);
}
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public void Which2HandleFullyQualifiedPath()
{
using (TestHostContext hc = new(this))
{
//Arrange
Tracing trace = hc.GetTrace();
// Act.
var gitPath = WhichUtil.Which2("git", require: true, trace: trace);
var gitPath2 = WhichUtil.Which2(gitPath, require: true, trace: trace);
// Assert.
Assert.Equal(gitPath, gitPath2);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public void Which2HandlesSymlinkToTargetFullPath()
{
// Arrange
using TestHostContext hc = new TestHostContext(this);
Tracing trace = hc.GetTrace();
string oldValue = Environment.GetEnvironmentVariable(PathUtil.PathVariable);
#if OS_WINDOWS
string newValue = oldValue + @$";{Path.GetTempPath()}";
string symlinkName = $"symlink-{Guid.NewGuid()}";
string symlink = Path.GetTempPath() + $"{symlinkName}.exe";
string target = Path.GetTempPath() + $"target-{Guid.NewGuid()}.exe";
#else
string newValue = oldValue + @$":{Path.GetTempPath()}";
string symlinkName = $"symlink-{Guid.NewGuid()}";
string symlink = Path.GetTempPath() + $"{symlinkName}";
string target = Path.GetTempPath() + $"target-{Guid.NewGuid()}";
#endif
Environment.SetEnvironmentVariable(PathUtil.PathVariable, newValue);
using (File.Create(target))
{
File.CreateSymbolicLink(symlink, target);
// Act.
var result = WhichUtil.Which2(symlinkName, require: true, trace: trace);
// Assert
Assert.True(!string.IsNullOrEmpty(result) && File.Exists(result), $"Unable to find symlink through: {nameof(WhichUtil.Which)}");
}
// Cleanup
File.Delete(symlink);
File.Delete(target);
Environment.SetEnvironmentVariable(PathUtil.PathVariable, oldValue);
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public void Which2HandlesSymlinkToTargetRelativePath()
{
// Arrange
using TestHostContext hc = new TestHostContext(this);
Tracing trace = hc.GetTrace();
string oldValue = Environment.GetEnvironmentVariable(PathUtil.PathVariable);
#if OS_WINDOWS
string newValue = oldValue + @$";{Path.GetTempPath()}";
string symlinkName = $"symlink-{Guid.NewGuid()}";
string symlink = Path.GetTempPath() + $"{symlinkName}.exe";
string targetName = $"target-{Guid.NewGuid()}.exe";
string target = Path.GetTempPath() + targetName;
#else
string newValue = oldValue + @$":{Path.GetTempPath()}";
string symlinkName = $"symlink-{Guid.NewGuid()}";
string symlink = Path.GetTempPath() + $"{symlinkName}";
string targetName = $"target-{Guid.NewGuid()}";
string target = Path.GetTempPath() + targetName;
#endif
Environment.SetEnvironmentVariable(PathUtil.PathVariable, newValue);
using (File.Create(target))
{
File.CreateSymbolicLink(symlink, targetName);
// Act.
var result = WhichUtil.Which2(symlinkName, require: true, trace: trace);
// Assert
Assert.True(!string.IsNullOrEmpty(result) && File.Exists(result), $"Unable to find {symlinkName} through: {nameof(WhichUtil.Which)}");
}
// Cleanup
File.Delete(symlink);
File.Delete(target);
Environment.SetEnvironmentVariable(PathUtil.PathVariable, oldValue);
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public void Which2ThrowsWhenSymlinkBroken()
{
// Arrange
using TestHostContext hc = new TestHostContext(this);
Tracing trace = hc.GetTrace();
string oldValue = Environment.GetEnvironmentVariable(PathUtil.PathVariable);
#if OS_WINDOWS
string newValue = oldValue + @$";{Path.GetTempPath()}";
string brokenSymlinkName = $"broken-symlink-{Guid.NewGuid()}";
string brokenSymlink = Path.GetTempPath() + $"{brokenSymlinkName}.exe";
#else
string newValue = oldValue + @$":{Path.GetTempPath()}";
string brokenSymlinkName = $"broken-symlink-{Guid.NewGuid()}";
string brokenSymlink = Path.GetTempPath() + $"{brokenSymlinkName}";
#endif
string target = "no-such-file-cf7e351f";
Environment.SetEnvironmentVariable(PathUtil.PathVariable, newValue);
File.CreateSymbolicLink(brokenSymlink, target);
// Act.
var exception = Assert.Throws<FileNotFoundException>(() => WhichUtil.Which2(brokenSymlinkName, require: true, trace: trace));
// Assert
Assert.Equal(brokenSymlinkName, exception.FileName);
// Cleanup
File.Delete(brokenSymlink);
Environment.SetEnvironmentVariable(PathUtil.PathVariable, oldValue);
}
}
}

View File

@@ -232,7 +232,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,

View File

@@ -382,8 +382,6 @@ runs:
}
};
_ec.Object.Global.Variables.Set("DistributedTask.UseActionArchiveCache", bool.TrueString);
//Act
await _actionManager.PrepareActionsAsync(_ec.Object, actions);
@@ -2375,10 +2373,6 @@ runs:
_ec.Setup(x => x.CancellationToken).Returns(_ecTokenSource.Token);
_ec.Setup(x => x.Root).Returns(new GitHub.Runner.Worker.ExecutionContext());
var variables = new Dictionary<string, VariableValue>();
if (enableComposite)
{
variables["DistributedTask.EnableCompositeActions"] = "true";
}
_ec.Object.Global.Variables = new Variables(_hc, variables);
_ec.Setup(x => x.ExpressionValues).Returns(new DictionaryContextData());
_ec.Setup(x => x.ExpressionFunctions).Returns(new List<IFunctionInfo>());

View File

@@ -757,7 +757,7 @@ namespace GitHub.Runner.Common.Tests.Worker
//Assert
var err = Assert.Throws<ArgumentException>(() => actionManifest.Load(_ec.Object, action_path));
Assert.Contains($"Fail to load {action_path}", err.Message);
Assert.Contains($"Failed to load {action_path}", err.Message);
_ec.Verify(x => x.AddIssue(It.Is<Issue>(s => s.Message.Contains("Missing 'using' value. 'using' requires 'composite', 'docker', 'node12', 'node16' or 'node20'.")), It.IsAny<ExecutionContextLogOptions>()), Times.Once);
}
finally

View File

@@ -193,7 +193,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new();
Guid jobId = Guid.NewGuid();
string jobName = "Summary Job";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,

View File

@@ -29,7 +29,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -106,7 +106,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -162,7 +162,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -216,7 +216,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -271,7 +271,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -322,7 +322,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -373,7 +373,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -471,7 +471,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -555,7 +555,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -610,7 +610,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -653,7 +653,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -717,7 +717,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -781,7 +781,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -969,7 +969,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -1014,7 +1014,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new TimelineReference();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -1057,7 +1057,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new TimelineReference();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,

View File

@@ -4,6 +4,8 @@ using System.Linq;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Tasks;
using GitHub.DistributedTask.ObjectTemplating.Tokens;
using GitHub.DistributedTask.Pipelines.ObjectTemplating;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Worker;
using Moq;
@@ -25,6 +27,9 @@ namespace GitHub.Runner.Common.Tests.Worker
private Mock<IContainerOperationProvider> _containerProvider;
private Mock<IDiagnosticLogManager> _diagnosticLogManager;
private Mock<IJobHookProvider> _jobHookProvider;
private Mock<ISnapshotOperationProvider> _snapshotOperationProvider;
private Pipelines.Snapshot _requestedSnapshot;
private CancellationTokenSource _tokenSource;
private TestHostContext CreateTestContext([CallerMemberName] String testName = "")
@@ -41,7 +46,16 @@ namespace GitHub.Runner.Common.Tests.Worker
_directoryManager.Setup(x => x.PrepareDirectory(It.IsAny<IExecutionContext>(), It.IsAny<Pipelines.WorkspaceOptions>()))
.Returns(new TrackingConfig() { PipelineDirectory = "runner", WorkspaceDirectory = "runner/runner" });
_jobHookProvider = new Mock<IJobHookProvider>();
_snapshotOperationProvider = new Mock<ISnapshotOperationProvider>();
_requestedSnapshot = null;
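// Capture whatever snapshot request reaches the provider so the tests below can assert on it.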
_snapshotOperationProvider
.Setup(p => p.CreateSnapshotRequestAsync(It.IsAny<IExecutionContext>(), It.IsAny<Pipelines.Snapshot>()))
.Returns((IExecutionContext _, object data) =>
{
_requestedSnapshot = data as Pipelines.Snapshot;
return Task.CompletedTask;
});
IActionRunner step1 = new ActionRunner();
IActionRunner step2 = new ActionRunner();
IActionRunner step3 = new ActionRunner();
@@ -100,7 +114,7 @@ namespace GitHub.Runner.Common.Tests.Worker
};
Guid jobId = Guid.NewGuid();
-_message = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, "test", "test", null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), steps, null, null, null, null);
+_message = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, "test", "test", null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), steps, null, null, null, null, null);
GitHubContext github = new();
github["repository"] = new Pipelines.ContextData.StringContextData("actions/runner");
github["secret_source"] = new Pipelines.ContextData.StringContextData("Actions");
@@ -125,6 +139,7 @@ namespace GitHub.Runner.Common.Tests.Worker
hc.SetSingleton(_directoryManager.Object);
hc.SetSingleton(_diagnosticLogManager.Object);
hc.SetSingleton(_jobHookProvider.Object);
hc.SetSingleton(_snapshotOperationProvider.Object);
hc.EnqueueInstance<IPagingLogger>(_logger.Object); // JobExecutionContext
hc.EnqueueInstance<IPagingLogger>(_logger.Object); // job start hook
hc.EnqueueInstance<IPagingLogger>(_logger.Object); // Initial Job
@@ -443,5 +458,80 @@ namespace GitHub.Runner.Common.Tests.Worker
Assert.Equal(0, _jobEc.PostJobSteps.Count);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public async Task EnsureNoSnapshotPostJobStep()
{
using (TestHostContext hc = CreateTestContext())
{
var jobExtension = new JobExtension();
jobExtension.Initialize(hc);
_actionManager.Setup(x => x.PrepareActionsAsync(It.IsAny<IExecutionContext>(), It.IsAny<IEnumerable<Pipelines.JobStep>>(), It.IsAny<Guid>()))
.Returns(Task.FromResult(new PrepareResult(new List<JobExtensionRunner>(), new Dictionary<Guid, IActionRunner>())));
_message.Snapshot = null;
await jobExtension.InitializeJob(_jobEc, _message);
var postJobSteps = _jobEc.PostJobSteps;
Assert.Equal(0, postJobSteps.Count);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public Task EnsureSnapshotPostJobStepForStringToken()
{
var snapshot = new Pipelines.Snapshot("TestImageNameFromStringToken");
var imageNameValueStringToken = new StringToken(null, null, null, snapshot.ImageName);
return EnsureSnapshotPostJobStepForToken(imageNameValueStringToken, snapshot);
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public Task EnsureSnapshotPostJobStepForMappingToken()
{
var snapshot = new Pipelines.Snapshot("TestImageNameFromMappingToken");
var imageNameValueStringToken = new StringToken(null, null, null, snapshot.ImageName);
var mappingToken = new MappingToken(null, null, null)
{
{ new StringToken(null,null,null, PipelineTemplateConstants.ImageName), imageNameValueStringToken }
};
return EnsureSnapshotPostJobStepForToken(mappingToken, snapshot);
}
private async Task EnsureSnapshotPostJobStepForToken(TemplateToken snapshotToken, Pipelines.Snapshot expectedSnapshot)
{
using (TestHostContext hc = CreateTestContext())
{
var jobExtension = new JobExtension();
jobExtension.Initialize(hc);
_actionManager.Setup(x => x.PrepareActionsAsync(It.IsAny<IExecutionContext>(), It.IsAny<IEnumerable<Pipelines.JobStep>>(), It.IsAny<Guid>()))
.Returns(Task.FromResult(new PrepareResult(new List<JobExtensionRunner>(), new Dictionary<Guid, IActionRunner>())));
_message.Snapshot = snapshotToken;
await jobExtension.InitializeJob(_jobEc, _message);
var postJobSteps = _jobEc.PostJobSteps;
Assert.Equal(1, postJobSteps.Count);
var snapshotStep = postJobSteps.First();
Assert.Equal("Create custom image", snapshotStep.DisplayName);
Assert.Equal($"{PipelineTemplateConstants.Success}()", snapshotStep.Condition);
// Run the mock snapshot step, so we can verify it was executed with the expected snapshot object.
await snapshotStep.RunAsync();
Assert.NotNull(_requestedSnapshot);
Assert.Equal(expectedSnapshot.ImageName, _requestedSnapshot.ImageName);
}
}
}
}
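The tests added above exercise the snapshot flow end to end: when _message.Snapshot carries either a bare StringToken image name or a MappingToken with an ImageName entry, InitializeJob registers a "Create custom image" post-job step with a success() condition, and running that step passes the parsed Pipelines.Snapshot to ISnapshotOperationProvider.CreateSnapshotRequestAsync. As a rough illustration only, a hand-rolled capturing fake equivalent to the Moq setup in CreateTestContext might look like the sketch below; it assumes the interface exposes just the member exercised in this diff (if the real ISnapshotOperationProvider inherits additional runner-service members, those would need stubs too).

    // Illustrative test double; mirrors the Moq callback used in CreateTestContext above.
    // Assumes ISnapshotOperationProvider declares only the method shown in this diff.
    public sealed class CapturingSnapshotOperationProvider : ISnapshotOperationProvider
    {
        // Last snapshot handed to CreateSnapshotRequestAsync; null if the step never ran.
        public Pipelines.Snapshot RequestedSnapshot { get; private set; }

        public Task CreateSnapshotRequestAsync(IExecutionContext context, Pipelines.Snapshot snapshot)
        {
            RequestedSnapshot = snapshot;
            return Task.CompletedTask;
        }
    }

A test could then assert against RequestedSnapshot exactly as the _requestedSnapshot field is used in the assertions above.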


@@ -101,6 +101,7 @@ namespace GitHub.Runner.Common.Tests.Worker
testName,
testName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null,
new ActionsEnvironmentReference("staging"),
null,
messageType: messageType);
message.Variables[Constants.Variables.System.Culture] = "en-US";
message.Resources.Endpoints.Add(new ServiceEndpoint()
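This last hunk, from another test fixture, only threads an extra trailing argument through an existing Pipelines.AgentJobRequestMessage constructor call; given the rest of the diff, that argument is presumably the new snapshot token, passed as null where the test does not need it. Purely as a hypothetical sketch, a test that did want the snapshot post-job step could pass a token there instead, mirroring EnsureSnapshotPostJobStepForMappingToken; the image name string and the assumption about what the new parameter means are illustrative, not taken from this diff.

    // Hypothetical variant of the constructor call shown above; "my-custom-image" is
    // illustrative, and the new trailing positional argument is assumed to be the snapshot token.
    var snapshotToken = new MappingToken(null, null, null)
    {
        { new StringToken(null, null, null, PipelineTemplateConstants.ImageName),
          new StringToken(null, null, null, "my-custom-image") }
    };
    var message = new Pipelines.AgentJobRequestMessage(
        plan, timeline, jobId, testName, testName, null, null, null,
        new Dictionary<string, VariableValue>(), new List<MaskHint>(),
        new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(),
        new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(),
        null, null, null,
        new ActionsEnvironmentReference("staging"),
        snapshotToken,
        messageType: messageType);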
